• Electricd@lemmybefree.net · 4 days ago

    That’s known. Siri data is kept to improve the models through human labeling. It’s not like it was hidden; just read the damn privacy policy.

    If that’s your magical source as an insider, I’m sorry, but you’re bullshitting. That didn’t prove anything you said, either.

    It’s not spying, since that wasn’t their goal. It sure is shit, but you can’t compare it to the stuff Microslop and Google do.

    • toad@sh.itjust.works · 4 days ago

      That’s not the problem. Because of false positives, they were hearing people during everyday interactions. I remember my colleague being bothered by the fact that they were hearing people having sex or talking about drugs, all while personal information was visible on screen.

      Do you want some guy at Apple headquarters hearing some random snippet of your life because you said the word “Shiny” and the model messed up?

      • endlesseden@pyfedi.deep-rose.org · 8 hours ago

        What bothers me more is them constantly scanning files and storing summaries/metadata for “law enforcement”.

        Could they be any more like Google… they’ve already got the “pretending to do no evil, then going back on it” phase down pat.

      • Electricd@lemmybefree.net · 4 days ago

        I disabled that voice activation feature for this exact reason, but yeah, what’s shitty is that people were never clearly informed at all.