

Thank you for fleshing out your world model and theory. I think this model falls short on sourcing (and contradicts some other AI-pessimistic economics predictions, namely a crash in computing costs and in crypto), but it could be developed into something I'd find compelling.
Let me brainstorm aloud about what I think this world model predicts that we might have data on…
Did we see a crash in ISP prices, home and industry internet use, domain hosting, or other computing services in the dotcom bubble? That situation seems extremely analogous, but my vibe is that several of these did not drop (ISP prices, I suspect, were stable), and some saw a dip but stayed well above early-internet rates (domain hosting). I feel like there's a good analogy here, but I'm struggling to operationalize it.
I mentioned a use for compute that your reply didn't cover: crypto mining. Do we have evidence that the price floor on crypto is well below datacenter operating costs (across exploitative coins as well)? I vaguely remember a headline to this effect. Another use case I don't see drying up: cheating on essay assignments.
More broadly, this model predicts that every avenue for compute pays off much less than datacenter operating costs. I think I'd need to see that checked against an exhaustive list of HPC applications. I know that weather forecasting uses about as much compute as AI on some supercomputing clusters.
Governments have already issued rather large grants to AI-driven academic projects. I suspect many of these are orders of magnitude larger than all of academic AI was six years ago. (I'll also quickly note that libraries have always been better than Google Search for finding true facts, yet Google Search has remained above library use throughout its existence.)

Ah, I’m bad at reading. Editing. Thanks!