• 1 Post
  • 12 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • Maybe more apt for me would be, “We don’t need to teach math, because we have calculators.” Like…yeah, maybe a lot of people won’t need the vast amount of domain knowledge that exists in programming, but all this stuff originates from human knowledge. If it breaks, what do you do then?

    I think someone else in the thread said good programming is about the architecture (maintainable, scalable, robust, secure). Many LLMs are legit black boxes, and it takes humans to understand what's coming out, why, and whether it's valid.

    Even if we have a fancy calculator doing things, there still needs to be people who do math and can check. I’ve worked more with analytics than LLMs, and more times than I can count, the data was bad. You have to validate before everything else, otherwise garbage in, garbage out.

    It sounds like a poignant quote, but it also feels superficial. Like, something a smart person would say to a crowd to get an "Ahh!", but it doesn't hold water for long.
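    The "validate before everything else" point can be sketched in a few lines. This is a hypothetical example (the field names and rules are invented for illustration): hypothetical analytics rows are screened before any aggregation, so bad records get rejected instead of silently skewing the result.

    ```python
    def validate(rows):
        """Split hypothetical analytics rows into (clean, rejected).

        Rows with a missing/non-numeric 'value', or a value that is
        impossible for this metric (negative), are rejected rather
        than averaged in -- garbage in, garbage out.
        """
        clean, rejected = [], []
        for row in rows:
            value = row.get("value")
            if not isinstance(value, (int, float)):
                rejected.append(row)   # missing or non-numeric
            elif value < 0:
                rejected.append(row)   # impossible for this metric
            else:
                clean.append(row)
        return clean, rejected

    rows = [{"value": 10}, {"value": None}, {"value": -3}, {"value": 20}]
    clean, rejected = validate(rows)

    # Averaging only the validated rows: (10 + 20) / 2 = 15.0.
    # Naively including the impossible -3 would have dragged it down.
    avg = sum(r["value"] for r in clean) / len(clean)
    ```

    The point isn't the specific rules; it's that the checking step exists at all, and that a human decided what "impossible" means for this data.
    
    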

  • There was a similar study reported the other day about using fMRI imaging and AI to reconstruct the "thought content" of someone's brain. It required training the AI on that specific person's brain, plus some other training. These techniques do seem to work with models tuned per person, but yeah, it doesn't seem like just hooking someone's brain up to this would produce a movie of their mind or anything.

    I think the more dangerous part is that this is "step 0" of a technology that would have seemed impossible 10 years ago. Very strange times.