
Apple and its Siri recordings, now deletable?

Discussion in 'Hardware' started by izlik, Oct 13, 2019.

  1. #1
    Apple has now started rolling out the feature that allows users to decide for themselves whether Siri recordings should be forwarded to Apple for improvement or not.

    People who have the beta for iPadOS 13.2, iOS 13.2, Apple tvOS 13.2, WatchOS 6.1 or MacOS 10.15.1 should also be able to delete their Siri and dictation history. Doing so will simply erase all Siri data Apple has on its servers.

    These new features can be found among the phone's settings.

    The English shortcuts look like this:
    * Settings > Privacy > Analytics & Improvements > Improve Siri & Dictation
    * Settings > Siri & Search > Siri & Dictation History > Delete Siri & Dictation History

    Apple, as is well known, found itself in hot water towards the end of the summer when it became known that users' recordings were being sent to third-party contractors, who could then, among other things, listen in on people having sex or selling drugs. Apple has since changed its routines so that these recordings are only handled internally by Apple.

    My question to all of you who have read this far: did you even know that Siri can record conversations, just like Alexa or Google Home?

    Also, how OK are you with the fact that Apple only changed this after it became public? Can you really trust your iPhone?

     
    izlik, Oct 13, 2019 IP
  2. sarahk

    sarahk iTamer Staff

    Messages:
    28,494
    Likes Received:
    4,457
    Best Answers:
    123
    Trophy Points:
    665
    #2
    I have conversations with Siri that go like this:
    • Send a text message to Charlotte mobile
    • Meet me at 204 Great North Road
    • Meet me at 2 hundred and 4 Great North Road
    • Meet me at 2 oh 4 Great North Road
    • Fuck Siri, you're hopeless
    and
    • Send a text message to Mary mobile
    • Have to go out, left the stuff you wanted on the deck
    • Have to go out, left the stuff on the duck
    • Have to go out, left the stuff on the deck
    • Have to go out, left the stuff on the deck
    • Have to go out, left the stuff on the deck
    • Have to go out, left the stuff on the deck
    • Send
    • I give up and my lesbian sister in law now thinks I have a "dick"
    and
    • Send a text message to Charlotte mobile
    • on my way
    • Send
    • Siri, send as text message not imessage
    • Fuck Siri, you're hopeless
    So I hope like hell they ARE recording and analysing the messages and will, at some stage, prevent these ridiculous "conversations" from happening. If the user tries, say, 4 times to send a single message then some bot should be able to randomly select it for human assessment to see if a human can understand the request and work out why Siri can't.
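    The escalation heuristic described above could be sketched roughly like this; the function name, threshold, and sampling rate are all hypothetical, just to make the idea concrete:

    ```python
    import random

    RETRY_THRESHOLD = 4   # attempts before a transcript is eligible for review
    SAMPLE_RATE = 0.25    # fraction of eligible transcripts sent to a human

    def maybe_flag_for_review(attempts, transcripts):
        """Randomly select repeatedly-failed dictation attempts for human review.

        attempts    -- number of times the user retried the same message
        transcripts -- the text the assistant produced on each attempt
        Returns the transcripts to queue for assessment, or None.
        """
        if attempts >= RETRY_THRESHOLD and random.random() < SAMPLE_RATE:
            return transcripts  # a human checks why recognition kept failing
        return None

    # Example: four attempts at the same address
    tries = ["Meet me at 204 Great North Road",
             "Meet me at 2 hundred and 4 Great North Road",
             "Meet me at 2 oh 4 Great North Road",
             "Meet me at 204 Great North Road"]
    flagged = maybe_flag_for_review(len(tries), tries)
    ```

    The point of the random sample is that reviewers only ever see a small slice of the failures, which keeps the human workload bounded while still surfacing the cases recognition keeps getting wrong.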
    I have the privilege of living in a politically stable country, and my private actions are consistent with my public actions, so I couldn't give a damn.

    That question is better asked of someone living with the fear of "discovery".
     
    sarahk, Oct 13, 2019 IP
    malky66 and jrbiz like this.
  3. jrbiz

    jrbiz Acclaimed Member

    Messages:
    6,030
    Likes Received:
    2,610
    Best Answers:
    2
    Trophy Points:
    570
    #3
    LOL, I just had an interesting voice recognition experience with a credit card company. My credit card's magnetic stripe had worn out somehow, so I needed a new card. The card was not compromised; I just needed a replacement. So, I dialed the telephone number on the card and got an automated attendant that asked me to enter my cc number and provide additional information to confirm my identity. Once it was confirmed to the computer's satisfaction that I was the right person, it started spouting off a long list of information about my account (how much owed, last payment, total credit line, etc., etc.). As it was talking and talking, I kept saying, "I want to speak to an agent... speak to an agent... speak to a person," etc. Well, the voice kept prattling on with information about my account that I had no interest in. Finally, in complete frustration, I yelled out, "WTF?!?!" (I said the actual words.) The voice instantly said, "I will connect you with a person," and within one second, I was talking to a human.

    Apparently, an astute programmer put in a rule that if a certain word was said, immediately connect the caller to a human. :D

    Getting back on topic: once your data (of any form) is provided to any device that has Internet connectivity, you MUST assume that it is available to anyone with the capability or authority to access it. Nothing is ever truly erased or eliminated from the public domain (i.e., the Internet) once it has been placed there. Like @sarahk I live in a democracy that protects free speech and, more important, I do or say nothing that could get me in legal trouble, so I have no real worries about authorities or others accessing my private communications. But, if you live in a different part of the world where what you say or think can have horrible consequences, let me repeat: you MUST assume that your data on the Internet is accessible to anyone, whether or not you are told that it is encrypted, deleted, etc.
     
    Last edited by a moderator: Oct 25, 2019
    jrbiz, Oct 25, 2019 IP
    sarahk likes this.
  4. sarahk

    sarahk iTamer Staff

    Messages:
    28,494
    Likes Received:
    4,457
    Best Answers:
    123
    Trophy Points:
    665
    #4
    Back in the day when I was working in big companies it was commonly accepted that if you said "fuck" instead of a more appropriate response you'd go straight through to an operator.

    At a presentation at a Google Conference a company was talking about how they handle their quality control at a call centre. The calls are recorded, converted to text using Google, and then split back out into a conversation. They can then run an AI over it to test for negative emotions and pull out those calls that seemed to be going south.

    In another example chat bots would control a conversation until they detected a negative response and then flick the chat to a real person.

    So the happy customer saying how great the call has gone will never be heard, while the unhappy customer's call, where they express discontent, is more likely to be followed up.
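    The pipeline described above (record, transcribe, split into turns, score sentiment, escalate) might look roughly like this; the word-list "sentiment model" is a crude stand-in for whatever real model the call centre used, and all names here are made up for illustration:

    ```python
    NEGATIVE_WORDS = {"hopeless", "useless", "cancel", "angry", "wtf"}

    def sentiment_score(utterance):
        """Crude stand-in for a real sentiment model: count negative words."""
        words = utterance.lower().split()
        return -sum(1 for w in words if w.strip("!?.,'") in NEGATIVE_WORDS)

    def flag_calls(conversations, threshold=-1):
        """Return the IDs of calls whose customer turns look negative.

        conversations -- list of (call_id, [(speaker, utterance), ...])
        A call is flagged when any customer utterance scores at or below
        the threshold, mirroring 'detect discontent, escalate to a person'.
        """
        flagged = []
        for call_id, turns in conversations:
            score = min((sentiment_score(u) for s, u in turns
                         if s == "customer"), default=0)
            if score <= threshold:
                flagged.append(call_id)
        return flagged

    calls = [
        ("call-1", [("customer", "thanks, that was great")]),
        ("call-2", [("customer", "this is hopeless, I want to cancel")]),
    ]
    print(flag_calls(calls))  # only the unhappy call is pulled for follow-up
    ```

    The same shape works for live chat: run the scorer on each incoming message and hand the session to a human the moment the score crosses the threshold.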
     
    sarahk, Oct 25, 2019 IP
    jrbiz and mmerlinn like this.