• 0 Posts
  • 217 Comments
Joined 9 months ago
Cake day: August 26, 2024


  • My perspective is that EA, like the upper-class philanthropy it inherits from, is consumerist: a system that rests on top of colonialism. It’s basically selling spiritual consumer goods, much like the medieval Catholic Church selling indulgences (and look what that provoked!). Once we get beyond the public health interventions, into longtermist EA’s “trillions of simulated minds in our future lightcone” bullshit, it’s clearly selling an unhealthily narcissistic spirituality, though its adherents would never call it that. The product, in this case, is the warm fuzzy self-aggrandizing feeling that one can extend one’s (over)privileged position in our relatively fragile 21st-century society into influence over sci-fi-scale expanses of time and space.


  • As I noted on the YouTube video, this is doubly heinous because a lot of CA community college instructors are “freeway flyers”: working at multiple campuses, sometimes almost 100 miles apart, just to cobble together a full-time work schedule for themselves. Online, self-paced, forum-based class formats were already becoming popular even before the pandemic, and I’ve been in such classes where the professor indicated that I was one of maybe 3 or 4 students who bothered to show up to in-person office hours. I have to wonder if that format will end up being a hard requirement at some point. The bottom rung of the higher-education ladder is already the most vulnerable, and this just makes it worse.


  • I have to agree. There are already at least two notable and high-profile failure stories with consequences that are going to stick around for years.

    1. The Israeli military’s use of “AI” targeting systems as an accountability sink in service of a predetermined policy of ethnic cleansing.
    2. The DOGE creeps wanting to rewrite bedrock federal payment systems with AI assistance.

    And sadly more to come. The first story is likely to continue to get a hands-off treatment in most US media for a few more years yet, but the second one is almost certainly going to generate Tacoma Narrows Bridge-level legends of failure and necessary restructuring once professionals are back in command. The kind of thing that is put into college engineering textbooks as a dire warning of what not to do.

    Of course, it’s up to us to keep these failures in the public spotlight and framed appropriately. The appropriate question is not, “how did the AI fail?” The appropriate question is, “how did someone abusively misapply stochastic algorithms?”