I always think it’s unfair to compare things to video games. Video games are so inefficient they had to invent a separate processor with hundreds of cores just to run them. Of course they end up running well.
If cheap phones had a 128-core JavaScript Processing Unit, websites would probably run fast too.
I’m a web dev and yes, they could. It’s annoying that web devs get blamed for it though; the reason for all the JavaScript is mostly business decisions outside our control.
Mainly the tracking scripts, which the marketing department adds against our will. But it’s also a lot cheaper to have a client-rendered web app than a traditional website: with client-side rendering you can shut off all your web servers and keep just the API servers (our server-side processing went down 90% in the switchover). And it’s more efficient for the company to have one team working in one programming language and one framework that can run the backend and the frontend, so the frontend ends up being a web app even if it’s not really necessary.
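To make that concrete, here’s a minimal sketch of what a client-rendered page boils down to (the endpoint and data shape are made up for illustration, not our actual stack): the HTML is a static file you can host on a CDN, and the only live servers left are the ones answering the API call.

```typescript
// Hypothetical client-rendered page: the HTML/JS bundle is a static file,
// so the only remaining server-side work is answering the API request.

type Product = { id: number; name: string; price: number };

async function render(): Promise<void> {
  // api.example.com is a placeholder for the API fleet that stays up.
  const res = await fetch("https://api.example.com/products");
  const products: Product[] = await res.json();

  document.getElementById("app")!.innerHTML = products
    .map((p) => `<li>${p.name}: $${p.price}</li>`)
    .join("");
}

render();
```

A traditional server-rendered site would build that HTML on every request, which is exactly the web-server fleet that goes away in the switchover.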
I bet a lot more people know what 0°C feels like than 0°F. One is the freezing point of water; the other is a completely arbitrary temperature that only gets called “the lowest you’ll experience” as a post hoc rationalisation of Fahrenheit. Most people will never experience anything that cold, and some people experience colder.
I even bet more people know what 100°C feels like than 100°F. One is accidentally getting scalded by boiling water; the other is a completely arbitrary temperature that’s quite hot but not even the hottest you’ll experience in America.
Yeah, like who needs to tell quickly whether road conditions will be icy? It’s much more useful to know how much warmer it is than the arbitrary temperature Americans say is the lowest you can survive.
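For anyone keeping score, the conversion itself shows how arbitrary those reference points are; a quick sketch:

```typescript
// Fahrenheit to Celsius: C = (F - 32) * 5/9
const fToC = (f: number): number => ((f - 32) * 5) / 9;

console.log(fToC(0).toFixed(1));   // "-17.8": 0°F is well below freezing
console.log(fToC(32).toFixed(1));  // "0.0": freezing point, where roads start to ice
console.log(fToC(100).toFixed(1)); // "37.8": hot, but plenty of places get hotter
```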
Yeah that’s another difference. When something breaks on Windows people will do anything to fix it, including reinstalling Windows or buying another machine.
When something goes wrong on Linux they decide Linux doesn’t work and reinstall Windows.
I’ve had Windows installs slow down until they took 15 minutes to start. I once clicked the wrong button in Visual Studio and the computer became some kind of remote driver-debugging target, permanently. Half the settings broke, and on every startup it would auto-login as a debug user.
If anything like that happens on Linux it’s proof Linux is too complicated, but on Windows it’s just one of those things.
Also, since companies are adding AI to everything, sometimes when you think you’re just doing a digital zoom you’re actually getting AI upscaling.
There was a court case not long ago where the prosecution wasn’t allowed to pinch-to-zoom evidence photos on an iPad for the jury, because the defence argued the zoom algorithm could create new information that wasn’t there.
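You don’t even need ML to see the underlying issue. Here’s a toy sketch (nothing to do with Apple’s actual implementation): nearest-neighbour zoom only repeats pixel values the camera captured, while any interpolating or AI upscaler computes values that were never in the sensor data.

```typescript
// Why "zoom" can add information, shown on a 1-D row of pixels.

const original = [10, 200, 40]; // three captured pixel values

// Nearest neighbour at 2x: every output value already exists in the input.
const nearest = [10, 10, 200, 200, 40, 40];

// Linear interpolation at 2x: the in-between values are computed, not captured.
function upscaleLinear(row: number[]): number[] {
  const out: number[] = [];
  for (let i = 0; i < row.length - 1; i++) {
    out.push(row[i], (row[i] + row[i + 1]) / 2);
  }
  out.push(row[row.length - 1]);
  return out;
}

console.log(nearest);                 // [10, 10, 200, 200, 40, 40]
console.log(upscaleLinear(original)); // [10, 105, 200, 120, 40]: 105 and 120 are new
```

An ML upscaler goes further and fills in whole textures and edges from its training data, which is the kind of objection that came up in the courtroom.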