I write about technology at theluddite.org

  • 2 Posts
  • 80 Comments
Joined 1 year ago
Cake day: June 7th, 2023

  • Jesus yeah that’s a great point re:Musk/Twitter. I’m not sure that it’s true as you wrote it quite yet, but I would definitely agree that it’s, at the very least, an excellent prediction. It might very well be functionally true already as a matter of political economy, but it hasn’t been tested yet by a sufficiently big movement or financial crisis or whatever.

    +1 to everything that you said about organizing. It seems that we’re coming to the same realization that many 19th century socialists already had. There are no shortcuts to building power, and that includes going viral on Twitter.

    I’ve told this story on the fediverse before, but I have this memory from Occupy of when a large news network interviewed my friend, an economist, used only a few seconds of that interview, but aired the entirety of an interview with a guy who was obviously unwell and probably homeless. Like you, it took me a while after Occupy to really unpack in my head what had happened, and I often think of that moment as an important microcosm. Not only was it grossly exploitative, but it is actually good that the Occupy camps welcomed and fed people like him. That is how our society ought to work. To have it used as a cudgel to delegitimize the entire camp was cynical beyond my comprehension at the time. To this day, I think about that moment to sorta calibrate the cynicism of the reaction, even to such a frankly ineffectual and disorganized threat as Occupy. A meaningful challenge to power had better be ready for one hell of a reaction.


  • Same, and thanks! We’re probably a similar age. My own political awakening was Occupy, and I got interested in theory as I participated in more and more protest movements that just sorta fizzled.

    I 100% agree re:Twitter. I am so tired of people pointing out that it has lost 80% of its value or whatever. Once you have a few billion, there’s nothing that more money can do to your material circumstances. Don’t get me wrong, Musk is a dumbass, but, in this specific case, I actually think that he came out on top. That says more about what you can do with infinite money than anything about his tactical genius, because it doesn’t exactly take the biggest brain to decide that you should buy something that seems important.


  • I know that this kind of actually critical perspective isn’t the point of this article, but software always reflects the ideology of the power structure in which it was built. I actually covered something very similar in my most recent post, where I applied Philip Agre’s analysis of the so-called Internet Revolution to the AI hype, but you can find many similar analyses all over the STS literature, or throughout Agre’s work, which really ought to be required reading for anyone in software.

    edit to add some recommendations: If you think of yourself as a tech person, and don’t necessarily get or enjoy the humanities (for lack of a better word), I recommend starting here, where Agre discusses his own “critical awakening.”

    As an AI practitioner already well immersed in the literature, I had incorporated the field’s taste for technical formalization so thoroughly into my own cognitive style that I literally could not read the literatures of nontechnical fields at anything beyond a popular level. The problem was not exactly that I could not understand the vocabulary, but that I insisted on trying to read everything as a narration of the workings of a mechanism. By that time much philosophy and psychology had adopted intellectual styles similar to that of AI, and so it was possible to read much that was congenial – except that it reproduced the same technical schemata as the AI literature. I believe that this problem was not simply my own – that it is characteristic of AI in general (and, no doubt, other technical fields as well).


  • I’ve now read several of these from wheresyoured.at, and I find them well-researched, well-written, and very dramatic (if a little ranty), but they ultimately stop short of any structural or theoretical insight. It’s right and good to document the shady people inside these shady companies ruining things, but those people are symptoms. They are exploiting structural problems, not the root cause of our problems. The site’s perspective feels like that of someone who had a good career in tech that started before, say, 2014, and is angry at the people who are taking it too far, killing the party for everyone. I’m not saying that there’s anything inherently wrong with that perspective, but it’s certainly a very specific one, and one that I don’t particularly care for.

    Even “the rot economy,” which seems to be their big theoretical underpinning, has this problem. It puts at its center the agency of bad actors in venture capital becoming overly obsessed with growth. I agree with the discussion about the fallout from that, but it’s just lacking a theory beyond “there are some shitty people being shitty.”


  • Your comment perfectly encapsulates one of the central contradictions in modern journalism. You explain the style guide, and the need to communicate information in a consistent way, but then explain that the style guide is itself guided by business interests, not by some search for truth, clarity, or meaning.

    I’ve been a longtime reader of FAIR.org, and I highly recommend them to anyone in this thread who can tell that something is up with journalism but has never done a deep dive into what exactly it is. Modern journalism has a very clear ideology (in the sorta Žižek sense; I’m not claiming that the journalists do it nefariously). Once you learn to see it, it’s everywhere.


  • I cannot handle the fucking irony of that article being in Nature, one of the organizations most responsible for fucking things up in the first place. Nature is a peer-reviewed journal that charges authors thousands upon thousands of dollars to publish (that’s right, charges, not pays), asks peer reviewers to volunteer their time, and then charges the very institutions that produced the knowledge exorbitant rents to access it. It’s all upside. Because they’re the most prestigious journal (or maybe one of two or three), they can charge rent on that prestige, then leverage it to buy and start other subsidiary journals. Now they have this beast of an academic publishing empire that is a complete fucking mess.



  • My two cents, but the problem here isn’t that the images are too woke. It’s that the images are a perfect metaphor for corporate DEI initiatives in general. Corporations like Google are literally unjust power structures, and when they do DEI, they update the aesthetics of the corporation such that they can get credit for being inclusive but without addressing the problem itself. Why would they when, in a very real way, they themselves are the problem?

    These models are trained on past data and will therefore replicate its injustices. This is a core structural problem. Google is trying to profit off generative AI while not getting blamed for these baked-in problems by updating the aesthetics. The results are predictably fucking stupid.