Your country going to war, as the belligerent, should be more than just “news.”
We are in a hot war with Iran.
I get it, I can see how it’s missed, but I think that shows just how screwed up the system is here.


Or Lemmy.
Information hygiene in the news subs is terrible, at least here on .world.


We certainly did. We learned Chicago/APA style, types of sources, and how to make citations in reports.
And that Wikipedia is not appropriate as a source to cite.


That’s not true, unfortunately. It’s not economical to transplant RAM ICs once they’re packaged and soldered onto something.
And if they’re produced as, say, HBM modules, they absolutely cannot be repurposed for, say, DDR5 or LPDDR5 CPUs, or GDDR GPUs. There’s no reworking, the memory buses on processors simply do not support them electrically, and altering those processors would have a massive development cost with years of lead time.
Some of the RAM (like the LPDDR5X for the Nvidia Grace Hopper ARM CPUs) can be re-used, but it seems most is being made as HBM.


That’s not true, from what I’ve read:
https://www.trendforce.com/presscenter/news/20251113-12780.html
> despite higher ASPs boosting profitability across the memory industry, capital spending on DRAM and NAND Flash is only anticipated to increase modestly in 2026. This limited investment growth is unlikely to significantly affect bit output.
Memory makers seem skeptical, hence aren’t planning to spend on more capacity in 2026.


Too late, I’m afraid. The supply for DRAM basically can’t adjust, and already-manufactured chips can’t be repurposed, so even if Oracle, Meta, and OpenAI went bankrupt tomorrow, it would take some time to build up inventory again.


The issue with AI is “now.”
Can they power the data centers with solar? Nuclear? Hell, even a natural gas plant? Nope; they need the power right this second, so they get gas turbines on site. Same with cooling; evaporative is just the quickest and cheapest to set up.
Same with model architecture. There’s no time to fix temperature/sampling issues, no time to try bitnet or any of a bazillion interesting papers that came out. A shippable product (model) is needed yesterday; just scale up what we have. “Fail” a single experiment? Your team is fired, which is exactly what happened at Meta.
Everything has to happen right now because of corporate FOMO. So, while this is an interesting musing and maybe Intel or someone will play with it, the actual AI labs could not care less because they can’t get it immediately.


“Dismissing Wikipedia” is my political litmus test.
To be clear, it’s never been a reliable source; we learned that in middle school. You take everything written on it with a grain of salt.
…But it’s still an oasis in a desert.
When some of my family started questioning its utility because of its “liberal bias,” like post-grad-educated family saying this as Fox News blares in the background, I knew things had gotten bad.
I haven’t seen anyone on the extreme left question it IRL, but I’m afraid that’s coming too, given how skeptical some tankie, terminally online corners of Reddit already are of it.


Yeah. Wine/Proton is an incredible achievement. DirectX->Vulkan translation is a miracle by itself.
EDIT: Also, stripping Windows is not daunting. It comes down to:

1. Install it fresh.
2. Don’t install anything unless something absolutely doesn’t work without it.
3. Delete apps you don’t need, like (say) Xbox.
4. Tweak the power profile to 0% minimum / 100% maximum CPU, if it isn’t already.
5. Run a Windows debloating script.
6. Disable realtime AV.
7. (Optional) Auto-undervolt your GPU with MSI Afterburner’s curve editor.

…And that’s about it, really (a few of these steps are even scriptable; see the sketch below). There’s tons of other Windows performance mysticism, but it’s (mostly) either very situational, or straight-up nonsense.
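A minimal sketch of the scriptable steps, assuming an elevated prompt on Windows. The cmdlets (`Remove-AppxPackage`, `powercfg`, `Set-MpPreference`) are standard Windows tooling, but treat this as illustrative rather than a turnkey debloater:

```python
# Illustrative only: automates steps 3, 4, and 6 above via PowerShell.
# Run elevated; disabling realtime scanning also requires Tamper
# Protection to be turned off in Windows Security first.
import subprocess

def ps(cmd: str) -> None:
    subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=True)

# Step 3: remove a preinstalled app you don't need (Xbox, for example)
ps("Get-AppxPackage *Xbox* | Remove-AppxPackage")

# Step 4: pin the active power plan to 0% min / 100% max CPU
ps("powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 0")
ps("powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100")
ps("powercfg /setactive SCHEME_CURRENT")

# Step 6: disable Defender realtime scanning
ps("Set-MpPreference -DisableRealtimeMonitoring $true")
```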


You can run DXVK (DirectX -> Vulkan) in Windows, too.
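In practice that just means dropping the translation DLLs from a DXVK release next to the game’s executable. A tiny sketch, with placeholder paths (point them at your own download and game folder):

```python
# Copies DXVK's DX11 translation DLLs into a game folder.
# Both paths below are placeholders, not real defaults.
import shutil
from pathlib import Path

DXVK_X64 = Path(r"C:\Downloads\dxvk-2.4\x64")  # assumed extract location
GAME_DIR = Path(r"C:\Games\SomeGame")          # folder containing the game .exe

# d3d11.dll + dxgi.dll covers most DX11 titles; DX9 games want d3d9.dll instead.
for dll in ("d3d11.dll", "dxgi.dll"):
    shutil.copy2(DXVK_X64 / dll, GAME_DIR / dll)
```

Deleting the copied DLLs reverts the game to the native DirectX path.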
Antivirus (even Windows Defender with defaults) can massively slow down disk IO in some games. As an example, my Rimworld loading times were over 2X as long with Defender realtime active, and it caused all sorts of hitching.
I’m not trying to dunk on Linux here; sometimes it really is Linux itself that provides the massive boost.
…But sometimes it’s just a matter of good default configuration, which Linux gaming OSes provide out of the box. Windows can be like this too, once it’s stripped down.
Again, not trying to dunk on or tout either OS; I use both, though mostly Linux. But I think attribution is important, and the assertion that Linux provides a big performance boost is not always true; I’m still stuck on Windows for several games because, in spite of my best tweaking/modding efforts, they still perform better on Windows in A/B tests.


You ran it locally? With what, DrawThings?


I love how there’s a ton of comments and upvotes here, yet OP’s article is paywalled behind a subscription. Did anyone here actually read it?
It reminds me of a post I just saw elsewhere, where the linked article was total nonsense. Since it was already upvoted, the moderator left it up as an experiment: it got a boatload of upvotes and comments. No one cared, even with someone pointing this out in a comment. It was just a bunch of the same comments affirming what people already believed.
…That about sums up the internet for me now. People don’t actually care where information came from; they just want to drive by, then keep scrolling :(


The “upgradability” part in a small laptop is questionable to me, anyway.
The GPU is really compromised in that chassis: putting it in a slot hurts cooling big time and limits how much power it can draw. And while I love upgradable RAM for the CPU… it’d be better if they used faster CAMM modules. Many other brands have upgradable SSDs/WiFi.
Swappable ports are awesome, no question.
…But honestly, I’d rather have a smaller chassis, a bigger GPU, and better cooling right off the bat, like a Zephyrus chassis. Make it repairable, and standardize the whole motherboard so it’s swappable, but don’t compromise the chassis so severely by making it modular.


If you’re wondering about Fedora vs CachyOS, it comes down to what you do on your PC. And what you’re used to.
If you want better “preconfiguration” for graphics stuff, CachyOS is the way to go. With Fedora, you’ll end up researching and maintaining a whole lot more yourself, while the CachyOS maintainers basically do all that maintenance and config optimization for you.
But Fedora might be better for a less GPU-focused “workstation” type system.
Generally, I’d look at the “style” and interests of the distro maintainers. CachyOS is built by a collective of Linux gaming/compute enthusiasts that snowballed into popularity, though it does inherit all the work from Arch. Fedora is a long-standing workstation/server workhorse, a “pre-release” for Red Hat Enterprise Linux.


To be blunt, I dunno if that’s coming? Apple’s designs are pretty conservative these days; I doubt they’d make a big folding iPhone.
iPhones do go on fire sale at some carriers, sometimes even below cost.


Well, with how things are going with Windows and laptop OEMs, it’s still a better deal than those.
A lot of users just need an iPad with a touchpad and keyboard, which is precisely what this is.


> So it’s not technically Chromium anymore? It’s a fork of Chromium?
It’s not a fork, IMO. It’s just a set of patches to upstream Chromium.
Mmm, its versioning keeps up with Chromium, and the source just looks like a bunch of patches to me:
https://github.com/imputnet/helium/tree/main/patches/helium/core
Not a fork, I don’t think.
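That’s also roughly how these patch-set browsers get built: check out upstream Chromium at the matching version, then lay the patch series on top. A hypothetical sketch (the paths and flow are my assumptions, not Helium’s actual build tooling):

```python
# Hypothetical: shows the general "patches over upstream" shape.
# The upstream Chromium checkout stays pristine; the browser's whole
# identity lives in an ordered series of patch files.
import subprocess
from pathlib import Path

CHROMIUM_SRC = Path("chromium/src")             # upstream checkout (assumed)
PATCH_DIR = Path("helium/patches/helium/core")  # the patch dir linked above

for patch in sorted(PATCH_DIR.glob("*.patch")):
    # 'git apply' lays each delta over upstream; nothing is forked.
    subprocess.run(
        ["git", "-C", str(CHROMIUM_SRC), "apply", str(patch)],
        check=True,
    )
```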
My distro (CachyOS) has it packaged, so maybe you can request it for yours as well?


Human brains just aren’t wired for citations. Especially outside academia, I guess.
I think it would help if people were more “LLM literate,” though, e.g. if they took a lesson in school on how these models work at a low level. Folks would be horrified that they ever put so much trust in them.
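For a sense of what “low level” means here, this toy sketch is the core loop of every LLM: score every token in the vocabulary, soften the scores with temperature, sample one, repeat. The vocabulary and scores below are made up; the mechanism is the real one:

```python
# Toy model of one LLM decoding step: fake vocabulary and scores,
# real mechanism (softmax with temperature, then weighted random choice).
import math
import random

vocab = ["the", "cat", "sat", "mat", "quantum"]

def next_token(logits: list[float], temperature: float = 0.8) -> str:
    # Lower temperature sharpens the distribution; higher flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numeric stability
    weights = [math.exp(s - m) for s in scaled]
    # The model never "decides" -- it rolls weighted dice over tokens.
    return random.choices(vocab, weights=weights, k=1)[0]

# Pretend these are the model's scores after seeing "the cat"
print(next_token([1.2, 0.1, 2.5, 1.8, -3.0]))
```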