

Back in my day, computer was a job, and all you had was an abacus. We liked it that way. None of this newfangled al-gebra nonsense.


It’s also cheaper, if they can offload a portion to the user’s computer.


So what happens if the artist is dead?
Freddie Mercury would find it difficult to maintain an active social media presence to prove he’s human, being rather indisposed at present.


This doesn’t seem like a totalitarianism issue, though. The High or Supreme courts (other courts are available) could rule that replacement with AI is not a valid reason for termination of employment, and the result would be much the same.
It’d be a different sort of exercise, but it would be interesting as a means of learning how to control a forklift.
It probably doesn’t help that they may have an outdated image of autism. Their child does not have high support needs, so it can’t be that. The doctor must be mistaken.


Who doesn’t like their phone charging them by the word?


To be fair, using React for it was just an odd decision to begin with.
I would be surprised if it was something they trained themselves, rather than an off-the-shelf model hooked up to a search.


Is this anything new at all?
Even back in the day, you had people wanting to live in the recent past, because the past usually gets romanticised.
So people in the 1960s might have had a rosy view of the turn of the century, and wanted to go back to the 1930s days of Art Deco and balls; those today might want to return to what they believe were the glory days of the 1960s. Even if it isn’t actually realistic to how people lived in the past: the average citizen in 1930 was not attending balls at a swanky music lounge.
Give it a few decades, and we might have people in 2050 pining for the 2020s, believing them to be just like the advertisements, where we all live in the penthouse level of a skyscraper, overlooking a vast cityscape.
foamed lactase.
Isn’t that the thing that digests milk, but not milk itself? You can buy little lactase pills at the pharmacy.
To be fair, that is a similar problem, just with a q instead of a k.
You’d need to know the name of the calculator to access it, if you don’t have a dedicated button to load it, or a menu to find it in.
If it’s your first time on Linux, you might well think it doesn’t have a calculator at all.


That instance needs a login to show the post.


If you don’t have a Kobo, the file conversion is also a lifesaver.
I have one of the old Kindle e-readers, and it doesn’t support epub, for example. It does support pdf, in theory, but the age of the hardware means any decently large/complicated pdf bogs it down something fierce.
Being able to use calibre to convert my books to a format it does support is nice.
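Calibre also ships a command-line converter, ebook-convert, alongside the GUI, which infers the output format from the destination file’s extension. A minimal sketch (the filenames and the wrapper function are my own; ebook-convert itself is real):

```python
import shutil
import subprocess

def convert_book(src: str, dst: str) -> bool:
    """Convert an ebook with calibre's bundled ebook-convert CLI.

    The output format is inferred from dst's extension (e.g. .mobi or
    .azw3 for older Kindles). Returns False if calibre isn't installed.
    Filenames here are placeholders.
    """
    exe = shutil.which("ebook-convert")
    if exe is None:
        return False
    subprocess.run([exe, src, dst], check=True)
    return True

# e.g. convert_book("mybook.epub", "mybook.azw3")
```

On the command line it’s just `ebook-convert mybook.epub mybook.azw3`.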


Do kind of wish that they had less silly names, though.
It’s hard to recommend them without sounding like you’re just babbling nonsense.
If you get Libby and Hoopla for your Kobo, you don’t need Ploob, no matter how much Ploob has it for you.
Mechanical Windows
As opposed to what, wireless windows?


Sort of? Apple’s reputation is traditionally that they make middle-of-the-road hardware, but make up for the shortcomings with software.
On paper, you can buy a Windows computer with better specs for cheaper, but the Apple computer still holds its own because the software is well-made, at least on the OS side of things. Even if the rest of their software was rubbish, you could get rid of it and still have a good foundation to work from. Hence why the Hackintosh was all the rage some years back. In theory, you could eke out the best of both worlds.


I think that’s why we haven’t seen Apple Silicon advertised that heavily lately.
There’s also not much point in advertising it now. The M-series chips have been around for a good while and are used in a bunch of their products. They’ve basically become the status quo, so there’s no need to advertise them, particularly as the improvements seem to be mostly incremental for the time being.
That’s basically model routing, and it has existed for a while. OpenAI’s GPT-5 and llama-swap do that, for example: if the task is simple, it uses a smaller, less intensive model, and only uses the slower, larger one if the task is more complex.
Though most tend to operate with models on the same device/service, rather than a model run elsewhere.
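A toy sketch of the routing idea, with made-up model names and a crude keyword/length heuristic standing in for the real complexity classifier:

```python
def route(prompt: str) -> str:
    """Pick a model for a prompt. The model names and the heuristic are
    illustrative only; production routers use trained classifiers."""
    hard_markers = ("prove", "derive", "refactor", "step by step")
    text = prompt.lower()
    is_complex = len(prompt.split()) > 50 or any(m in text for m in hard_markers)
    # Cheap local model for simple asks, big (possibly remote) one otherwise.
    return "large-remote-model" if is_complex else "small-local-model"

print(route("What's the capital of France?"))                    # small-local-model
print(route("Prove that the sum of two even numbers is even."))  # large-remote-model
```

The same shape works whether both models sit on one device (llama-swap style) or the large one lives behind an API.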