You can run your own open source large language models at home about as well as you can run Bethesda’s Starfield on a similarly specced PC
…
Yes, you can download an executable of a chatbot, lol.
That’s different from running something remotely, like even OpenAI does.
The more it has to reference, the more the system scales up. Not just storage, but everything else.
Like, in your example of video games, it would be more like stripping a PS5 game of all its assets, then playing it on an NES at one frame per five minutes.
You’re not only wildly overestimating a chatbot’s abilities, you’re doing that while drastically underestimating the resources needed.
Edit:
I think you literally don’t know what people are talking about…
Do you think people are talking about AI image generators?
No one else is…
I think you’re confusing training it with running it. After it’s trained, you can run it on much weaker hardware.
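The distinction above can be sketched as back-of-envelope arithmetic (the parameter count and precisions below are my own illustrative assumptions, not figures from the thread): training a 13B-parameter model takes clusters of GPUs, but merely *running* one mostly needs enough memory to hold the weights.

```python
# Rough memory footprint for *running* (not training) an LLM.
# Assumes weights dominate; ignores activations and the KV cache.
# 13e9 parameters (a Vicuna-13B-sized model) is an illustrative choice.

def inference_memory_gb(num_params: float, bits_per_weight: int) -> float:
    """Approximate gigabytes needed just to hold the model weights."""
    return num_params * bits_per_weight / 8 / 1e9

PARAMS_13B = 13e9

fp16_gb = inference_memory_gb(PARAMS_13B, 16)  # full half-precision
q4_gb = inference_memory_gb(PARAMS_13B, 4)     # 4-bit quantized

print(f"13B weights, fp16:  ~{fp16_gb:.0f} GB")  # ~26 GB
print(f"13B weights, 4-bit: ~{q4_gb:.1f} GB")    # ~6.5 GB
```

This is why quantized local models fit on a single consumer GPU or even in ordinary system RAM, while training the same model from scratch is out of reach for home hardware.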
I am talking about generative AI; be it text or image, both face challenges with copyrighted material.
Are you referring to my joke?
I am far from overestimating capacity. Starfield runs poorly on a modern gaming system compared to other games. The Vicuna 13B LLM runs poorly on the same system compared with GPT-3.5. To date there is no local model I would trust for professional use, and ChatGPT 3.5 doesn’t hit that level either.
But it remains a very interesting, rapidly evolving technology that I hope receives as much future open source support as possible.
I presume you must believe the following Lemmy community and resources were written by a group of children; either that, or you’re just naive.
https://lemmy.world/c/fosai
https://www.fosai.xyz/
https://github.com/huggingface/transformers
https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
https://huggingface.co/microsoft/phi-2 & https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/
https://www.theguardian.com/technology/2023/may/05/google-engineer-open-source-technology-ai-openai-chatgpt
Or…
I could just block some of the people who are really, really into chatbots but don’t understand them in the slightest.
I think that might be more productive than reading a bunch of stuff from other people who don’t understand it.
HOT TAKE: Hugging Face is run by people who are really into chatbots but don’t understand them in the slightest.
I have been patient and friendly so far, but your tone has been nothing but dismissive.
You cannot have a nuanced conversation about AI while excluding the entire open source field within it. That’s simply unreasonable, and I implore you to ask others, because I know you won’t take my word for it.
Farewell