I think more and more browsers are spoofing their UA to pretend that you’re using windows, for fingerprinting resistance
well, the point of flatpak is to have bundled dependencies so they run predictably no matter the distro
if one of your software’s dependencies gets updated, and your software isn’t, you may run into issues - like a function from the library you’re using getting removed, or its behaviour changing slightly. and some distros also apply patches to their libraries that can break stuff too!
often, with complex libraries, even when you check the version number, you may have behavioural differences between distros depending on the compile flags used (i.e. some features being disabled, etc.)
so, while in theory portable builds work, for them to be practical they most often are statically linked (all the dependencies get built into the executable - no relying on system libraries). and that comes with a huge size penalty, even when compared to flatpaks, as those do share some dependencies between flatpaks! you can for example request to depend on a specific version of the freedesktop SDK, which will provide you with a bunch of standard linux tools, and that’ll only get installed once for every package you have that uses it
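as a rough sketch of how that sharing looks from the CLI (the 23.08 branch here is just an example, apps target whatever branch they were built against):

```shell
# runtimes are installed once and shared between all apps that target them
flatpak list --runtime

# installing a specific freedesktop Platform branch explicitly
# (normally flatpak pulls this in automatically as a dependency)
flatpak install flathub org.freedesktop.Platform//23.08
```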
“AI” today mostly refers to LLMs, and whichever LLM you’re using, you’ll likely face the same issues (wrong answers creeping in, tending towards mediocrity in its answers, etc.) - those seem to be things you have to live with if you want to use LLMs. if you know you can’t deal with it, another rebrand won’t help anything
it sure seems like it though
i mean, they’ll never replace system package manager, but for desktop applications, flatpak is honestly quite good
woe is me, i am le surprised
to be fair, some people deserve to be called assholes
according to the github readme, you can just run sudo pro config set apt_news=false
to disable those
if you have things set up the way you like on xubuntu, it’s maybe worth it to just do that rather than start fresh
iirc, postgresql sets its process title (which is what htop shows) to display its current status and which database it’s operating on
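you can see the same thing with plain ps - the output below is just an illustration of the typical shape, not from a real system:

```shell
# each postgres process rewrites its argv, so ps/htop show what it's doing
ps -u postgres -o pid,args

# typical titles look something like:
#   postgres: checkpointer
#   postgres: walwriter
#   postgres: alice mydb 127.0.0.1(52341) idle
```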
the average person also isn’t as convincing as a bot we’re told is the peak of computer intelligence
there are tons of webrings still going these days!
well, i just tried it, and its answer is meh –
i asked it to transcribe “zenquistificationed” (made up word) in IPA, it gave me /ˌzɛŋˌkwɪstɪfɪˈkeɪʃənd/, which i agree with, that’s likely how a native english speaker would read that word.
i then asked it to transcribe that into japanese katakana, it gave me “ゼンクィスティフィカションエッド” (zenkwisuthifikashon’eddo), which is not a great transcription at all - based on its earlier IPA transcription, カション (kashon’) should be ケーシュン (kēshun’), and the エッド (eddo) part at the end should just, not be there imo, or be shortened to just ド (do)
it is absolutely capable of coming up with its own logical stuff
interesting, in my experience, it’s only been good at repeating things, and failing on unexpected inputs - it’s able to answer pretty accurately if a small number is even or odd, but not if it’s a large number, which indicates it’s not reasoning but parroting answers to me
do you have example prompts where it showed clear logical reasoning?
huh, i kinda assumed it was a term made up/taken by journalists mostly, are there actual research papers on this using that term?
because it’s a text generation machine…? i mean, i wouldn’t say i can prove it, but i don’t think anyone can prove it’s capable of thinking, much less of reasoning
like, it can string together a coherent sentence thanks to well crafted equations, sure, but i wouldn’t qualify that as “thinking”, though i guess the definition of “thinking” is debatable
New response just dropped
for it to “hallucinate” things, it would have to believe in what it’s saying. ai is unable to think - so it cannot hallucinate
A/B testing moment
you probably got a kernel panic, which froze the system. it’s like a BSOD on windows, except on linux, there isn’t a proper stack to handle them when they happen while you have a graphical session running, so it kinda just freezes
i don’t think reisub would do anything, because the kernel was probably already dead
you don’t risk corrupting much data by hard-resetting your pc on linux – journaling filesystems, like ext4 or btrfs, are built to be resilient to sudden power loss (or kernel crashes). if a program was writing a file at the time the kernel crashed, that one file may be corrupted, because the program would get killed before it finished writing the file, but all in all, it’s pretty unlikely. outside of fs bugs, which are thankfully few and far between on time-tested filesystems like ext4, you shouldn’t have to worry much about sudden power loss!
unfortunately, figuring out the cause of these issues can be challenging – i’ve had many such occurrences, and you have no logs to go off of (because the system doesn’t have time to save them), so you’d most likely need to figure out a way to send your kernel logs onto another system to record them
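one way to do that is the kernel’s netconsole module, which streams kernel messages over UDP to another machine - the addresses and interface name below are placeholders, swap in your own:

```shell
# on the machine receiving the logs (192.168.1.20 in this example):
nc -lu 6666

# on the crashing machine: stream kernel messages over UDP
# format: <src-port>@<src-ip>/<interface>,<dst-port>@<dst-ip>/
sudo modprobe netconsole netconsole=6666@192.168.1.10/eth0,6666@192.168.1.20/
```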
as general mitigation steps, you should try monitoring your cpu temperature a bit closer - it could be high temperature tripping the safeties of your motherboard/cpu to avoid physical damage to them - in which case, try installing a daemon to control your cpu frequency, like auto-cpufreq, or something like thermald specifically made to throttle your cpu if it gets too hot (though i think that one is intel specific)
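for the temperature monitoring part, lm-sensors is the usual starting point on most distros (package name shown is the debian/ubuntu one):

```shell
# install and probe for available temperature sensors
sudo apt install lm-sensors
sudo sensors-detect --auto

# refresh the readings every 2 seconds while you load the machine
watch -n 2 sensors
```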
my main question is: how much csam was fed into the model for training so that it could recreate more
i think it’d be worth investigating the training data used for the model
Most usb-c ports with DP alt mode support up to 1 monitor at 4k@60Hz, or 2×1080p@60Hz, and I believe 2×1440p@30Hz. It comes down to bandwidth, so I think that as long as you’re fine with one monitor running at a slower refresh-rate or lower resolution, you can have your primary screen displaying in high-res.
Of course, you also have to take GPU performance into consideration: running higher resolutions will usually degrade performance!
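The bandwidth point can be sanity-checked with some napkin math - these are raw uncompressed rates, ignoring blanking intervals and link-level encoding overhead, so the real link needs somewhat more:

```shell
#!/bin/sh
# rough uncompressed video bandwidth: width * height * refresh * 24 bpp
bw() { echo "$(( $1 * $2 * $3 * 24 / 1000000 )) Mbit/s"; }

bw 3840 2160 60      # one 4k@60 panel
bw 1920 1080 60      # per 1080p@60 panel (double it for two screens)
bw 2560 1440 30      # per 1440p@30 panel (double it for two screens)
```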