Grok, Elon Musk’s AI chatbot, has exposed hundreds of thousands of private user conversations through Google search indexing. When users click the “share” button to create a URL for sharing their chat, the conversation becomes publicly searchable - often without users realizing it[1][2].

Google has indexed over 370,000 Grok conversations, including sensitive content like medical questions, personal information, and at least one password[2]. Unlike OpenAI’s ChatGPT, which quickly removed a similar feature after backlash, Grok’s share function does not include any warning that conversations will become public[3].
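Indexing at this scale suggests the share pages are ordinary public URLs with nothing telling crawlers to keep out. As a rough sketch (using Python’s standard library; the “/share/example” path is a placeholder, not a real conversation), you can check a site’s robots.txt rules yourself:

```python
# Rough check: does robots.txt allow Googlebot to crawl a share URL?
# "/share/example" is a hypothetical placeholder, not a real conversation.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://grok.com/robots.txt")  # rules may have changed since the reports
rp.read()

print(rp.can_fetch("Googlebot", "https://grok.com/share/example"))
```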

According to Forbes, some marketers are already exploiting this feature by intentionally creating Grok conversations to manipulate search engine rankings for their businesses[2].


  1. TechCrunch - Thousands of Grok chats are now searchable on Google

  2. Forbes - Elon Musk’s xAI Published Hundreds Of Thousands Of Grok Chatbot Conversations

  3. Fortune - Thousands of private user conversations with Elon Musk’s Grok AI chatbot have been exposed on Google Search

    • Zerush@lemmy.ml (OP) · 2 days ago

      Really? The OpenAI and DeepSeek ones people use come from Google, which means that even if it’s installed locally it still acts online, same as other searches. You can’t run a complex LLM purely locally on a crappy PC; what you’re running is a desktop client for the LLM, don’t confuse the two.

      • HumanPerson@sh.itjust.works · 20 hours ago

        So, that is incorrect. I’m not even going to make a counterargument. Learn what you’re talking about before you post stuff.

      • Barbarian@sh.itjust.works · edited · 2 days ago

        You can’t run an LLM on a crappy PC, that’s true; you need at least a decent CPU. But if you’re running an LLM locally, there are no calls to the outside world. I have a very mid computer, and unfortunately I need to work with LLMs for my job. A call to my local LLM might take ~2 minutes where an online platform might take ~30 seconds, but I think that’s a reasonable trade.

        If you have a gaming PC, you have a platform that can run a local LLM.
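        To make “no calls to the outside world” concrete, here’s a minimal sketch assuming a local Ollama server on its default port (the model name “llama3” is just an example; use whatever you’ve pulled). Everything stays on localhost:

        ```python
        # Query a locally hosted LLM; the request never leaves the machine.
        # Assumes an Ollama server on its default port with a model already pulled.
        import json
        import urllib.request

        def ask_local_llm(prompt: str, model: str = "llama3") -> str:
            """Send a prompt to the local Ollama HTTP API and return the reply text."""
            payload = json.dumps({
                "model": model,
                "prompt": prompt,
                "stream": False,  # wait for the full answer instead of streaming tokens
            }).encode("utf-8")
            req = urllib.request.Request(
                "http://localhost:11434/api/generate",  # loopback only
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())["response"]

        print(ask_local_llm("Say hi in five words."))
        ```

        Since the endpoint is loopback-only, you can feed it anything sensitive and nothing is sent to a third party.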