• 4 Posts
  • 367 Comments
Joined 6 months ago
Cake day: September 22nd, 2025


  • This article is trash.

    Baier-Lentz went as far as to say that developers are “demonizing” a “marvelous new technology,” speculating that a big reason people dislike AI is because they’re worried about their jobs,

    They quote this “investor” and then just run with his theory that the distaste is about AI replacing people’s jobs, without even challenging it. Maybe developers are eschewing AI because it’s garbotrash dogshit? I guess we’ll never know, because the “journalist” who wrote this article didn’t bother to engage their brain for even a second before tapping their fat little fingers on their keyboard.


  • The preceding paragraph is this:

    These are massive, legacy-heavy codebases where much of the code predates modern C++ practices. Code written with raw new/delete, C-style arrays, char* string manipulation, manual buffer management — the full catalogue of pre-C++11 antipatterns. And there’s a conflation problem: the studies report “C/C++” as a single category. The Windows kernel is largely C, not C++. Android’s HAL and native layer are heavily C. Modern C++ with RAII, smart pointers, and containers is a fundamentally different beast than malloc/free C code, but the statistic treats them as one.

    And then, right after the paragraph you quote, the author just blasts along to the conclusion that the code should be rewritten in memory-safe languages like Rust, Kotlin and Java, without even touching on their own observation that modern C++ facilities do make a difference to the prevalence of bugs.

    What is the reduction in bug rate when rewriting legacy C++ with raw memory management vs. modern C++ with RAII and reference-counted pointers? We don’t know, and they don’t want to ask, because it would challenge their main thesis. (I’ve put a quick sketch of the two styles at the end of this comment for the curious.)

    I don’t disagree that modern C++ safety still relies on the programmer making the right choices, whereas with a truly memory-safe language the compiler makes those decisions for you. But sidestepping the question completely is disingenuous, and it makes those of us who actually care about the specifics (C++ programmers, the very people they’re trying to convince to retrain) incredibly suspicious of the whole argument.
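    Here’s that contrast in code, a minimal sketch of my own rather than anything from the article or the studies:

    ```cpp
    #include <cstdio>
    #include <cstdlib>
    #include <cstring>
    #include <string>

    // Pre-C++11 style: manual buffer management, the kind of code the
    // "C/C++" statistics are actually counting.
    char* legacy_join(const char* a, const char* b) {
        // Forget the +1 for the terminator and you have a heap overflow.
        char* out = static_cast<char*>(std::malloc(std::strlen(a) + std::strlen(b) + 1));
        std::strcpy(out, a);
        std::strcat(out, b);
        return out; // caller must free(); forget it and you leak, free twice and you crash
    }

    // Modern style: std::string owns its memory via RAII, so there is no
    // manual cleanup path to get wrong.
    std::string modern_join(const std::string& a, const std::string& b) {
        return a + b;
    }

    int main() {
        char* c = legacy_join("foo", "bar");
        std::printf("%s\n", c);
        std::free(c); // every early return between the malloc and here is a potential leak

        std::string s = modern_join("foo", "bar");
        std::printf("%s\n", s.c_str()); // released automatically when s goes out of scope
        return 0;
    }
    ```

    The second style removes whole bug classes by construction, which is exactly the variable the studies lump together. To be fair to the other side: modern C++ will still happily compile a dangling reference, which is where a borrow checker would step in.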


  • HORSES has proven to be a highly controversial title, with both Epic Games and Steam banning its release. However, GOG, Humble Store and itch.io have all decided to keep it.

    “HORSES is a 3-hour first-person horror adventure set over fourteen days on a rural farm, where you work as a summer hand under a cryptic farmer and follow “a few rules” that unravel into increasingly surreal, unsettling tasks. As the sun sets and the facade of tranquility crumbles, you decide whether to keep to the safe path or venture into the farm’s hidden depths. The game blends interactive scenes with live-action intermissions, monochrome visuals, and silent-cinema title cards, with unique gameplay events each day. It’s a game about the burden of familial trauma and puritan values, the dynamics of totalitarian power, and the ethics of personal responsibility.”

    I watched the trailer and this shit is fucked up. Not surprised it was banned, but on reflection I’m not sure what my line in the sand would be to determine what counts as obscene, so I’m also not surprised that stores were split on the decision.


  • Not surprising if you’ve been in computing since before the dot-com boom.

    In the 80s it was the finance industry that attracted all the yuppies, because that’s where the money was. Do you think any of those stock traders actually cared about economics, supply chains or business development outside of it being a vehicle for them to make wads of cash? Of course not.

    As soon as the internet demonstrated its financial potential, all the money-chasers looking for a career path became “web developers” and moved to Silicon Valley. Once smartphone app stores appeared, they all went into “app development.”

    These people never cared about the technology; they just care about funding their retirement.

    Actually caring about coding? That’s only for the real nerds.


  • There’s a lot more to semiconductor manufacturing than just the lithography, and there’s a whole supply chain required to support it. To replicate the kind of process nodes that TSMC is manufacturing, you would need deep institutional expertise in:

    • crystal growth (or sourcing)
    • wafer manufacture (or sourcing)
    • silicon doping
    • process design
    • mask design
    • mask manufacture
    • photoresist manufacture (or sourcing)
    • photoresist application
    • etching
    • sputtering
    • testing and validation
    • wire bonding
    • packaging

    That’s just off the top of my head, as someone who watches the industry but isn’t part of it. There’s probably a lot more you’d need to perfect in order to produce cutting-edge silicon.

    If this were done in the EU, it would likely involve multiple companies each specialising in one area, as we don’t have an Intel or AMD with the budget to pour into developing an in-house foundry. I can’t imagine how much investment it would take, though. China has been working on this for decades, and its own silicon processes are still at least a decade behind.