• 0 Posts
  • 85 Comments
Joined 1 year ago
Cake day: April 3rd, 2024

  • My most used features so far are vertical splitters, vertical nudging, and the new placement modes for conveyors and pipes. With an honorable mention going to conveyor wall holes, which also free up a lot of design options.

    Honestly, though, just about everything in this update has been a godsend. Priority splitters are the only thing I haven’t really used yet. Even the elevators rock; being able to zoop up to 200 meters up or down in one go can make them useful even as a temporary yardstick for tall structures. (Also, I did end up needing to go 150 meters straight down to get at some resources and can confirm that elevators handle their intended purpose very well.)




  • Jesus_666@lemmy.world to Memes@sopuli.xyz · The Faculty, any day
    7 points · 13 days ago

    Das Millionenspiel.

    It’s The Running Man except twelve years earlier and a media satire instead of an action movie. It comments on TV phenomena that wouldn’t exist in Germany until two decades later (like scripted “reality” TV). Also, it has early appearances of one of Germany’s most famous TV hosts (as the show’s host, fittingly) and one of Germany’s most famous comedians of the 70s to 90s (in a completely serious role, unfittingly). And unlike the Schwarzenegger movie it doesn’t construct a dystopian future to introduce public bloodsports but merely gives a terse reference to a “law on active recreation” dated three years after the movie first aired.

    To make it even more odd, it’s actually a good movie despite being from Germany and made for TV.



  • AI isn’t taking off now because it already took off in the 60s. Heck, they were even working on neural nets back then. Same as in the 90s, when they actually got them to be useful in a production environment.

    We got a deep learning craze in the 2010s and then bolted that onto neural nets to get the current wave of “transformers/diffusion models will solve all problems”. They’re really just today’s LISP machines; expected to take over everything but unlikely to actually succeed.

    Notably, deep learning assumes that better results come from a bigger dataset, but we already trained our existing models on the sum total of all of humanity’s writings. Worse, current training is hampered by a substantial amount of all new content already being AI-generated.

    Despite how much the current approach is hyped by the tech companies, I can’t see it delivering further substantial improvements by just throwing more data (which doesn’t exist) or processing power at the problem.

    We need a systemically different approach, and while it seems like there’s all the money in the world to fund the necessary research, the same seemed true in the 50s, the 60s, the 80s, the 90s, the 10s… In the end, a new AI winter will come as people realize that the current approach won’t live up to their unrealistic expectations. Ten to fifteen years later, some new approach will come out of underfunded basic research.

    And it’s all just a little bit of history repeating.







  • Nuclear power has some nice properties (and a whole bunch of terrible ones), is technologically interesting, and has been the premier low-CO₂ energy source for a while. That gets it some brownie points although I agree that it shouldn’t be sacrosanct.

    I personally am mainly interested in using breeder reactors to breed high-level waste that needs to be kept safe for 100,000 years into even higher-level waste that only needs to be kept safe for 200 years. That’s expensive and dangerous, but it means we don’t have to rely on unknown future technology to keep something safe for an order of magnitude longer than recorded history.

    There’s a whole bunch of very good questions you can ask about that approach (such as how to handle the proliferation risk) but the idea of turning nuclear waste disposal into a feasibly solvable problem just appeals to me.

    Of course I expect an extreme amount of oversight and no tolerance for fucking up. That may be crazy expensive but we’re talking about large-scale breeder deployment. It’s justified.


  • Jesus_666@lemmy.world to xkcd@lemmy.world · xkcd #3089: Modern
    5 points · edited · 1 month ago

    Wasn’t there a Dark Age after Bronze? The one where everyone was scowling the whole time and the stories were so tryhard edgy you could use a typical Youngblood issue as a letter opener?

    (Basically the “pouches” era the sibling comments talk about. Rob Liefeld’s contributions to fashion will never be forgotten.)



  • Of course you wouldn’t use an existing database engine as the foundation of a new database engine. But you would use an existing database engine as the foundation of an ERP system, which is a vastly different use case even if the software does spend a lot of time dealing with data.

    If I want to build an application, I don’t want to reimplement everything; that’s what middleware is for. The use case of my application is most likely not to speak a certain protocol; the protocol is just a means to whatever I actually want to do. There’s no reason for me to roll my own implementation from scratch and keep up with current developments unless I’m unhappy with every existing implementation of that protocol.

    Of course one can overdo it with middleware (the JS world is rife with this), but implementing a communication protocol is one of the classic cases where it makes sense, as the sketch below illustrates.
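
    As a minimal illustration (assuming Python and SQLite via the standard-library sqlite3 module; the table and function names are made up for the example), the application below reuses an existing database engine for storage, transactions, and querying and only writes the code for its own use case:

    # Illustrative sketch only: reuse an existing database engine (SQLite via
    # Python's standard-library sqlite3 module) as the data layer instead of
    # writing a storage engine from scratch.
    import sqlite3

    def record_order(db_path, customer, amount):
        # Storage format, transactions, and SQL parsing all come from the engine.
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS orders (customer TEXT, amount REAL)"
            )
            conn.execute(
                "INSERT INTO orders (customer, amount) VALUES (?, ?)",
                (customer, amount),
            )

    def total_for(db_path, customer):
        # Querying goes through the engine's SQL layer, not a hand-rolled index.
        with sqlite3.connect(db_path) as conn:
            row = conn.execute(
                "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer = ?",
                (customer,),
            ).fetchone()
        return row[0]

    record_order("erp.db", "ACME", 199.99)
    print(total_for("erp.db", "ACME"))  # prints the running total for ACME

    Everything below the SQL statements (file format, indexing, transaction handling) is the engine’s problem, which is exactly the trade-off described above.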