Leh Meriwether:[Music]. Todd, Alexa told me that you are still not staying away from Krispy Kreme, even though you said at the beginning of the year you were giving up Krispy Kreme.
Todd Orston:Alexa ... I can't trust that woman. I, uh-
Leh Meriwether:Not only that, but she actually ordered some for you on the firm card.
Todd Orston:I like her. I'm liking her more and more.
Cabot Howell:Does she have a sister?
Todd Orston:My belly.
Cabot Howell:I guess they're distant cousins.
Leh Meriwether:Welcome, everyone. I'm Leh Meriwether and with me is Todd Orston. Todd and I are partners at the law firm of Meriwether and Tharp. And you're listening to Meriwether and Tharp radio on the new Talk 106.7. Here you will learn about divorce, family law, tips on how to save your marriage if it's in the middle of a crisis, and from time to time, even tips on how to take your marriage to the next level.
Leh Meriwether:If you want to learn more about us, you can always call or visit us online at atlantadivorceteam.com. Well, as promised, today we're diving into the ways that we give up our private information voluntarily without even knowing it. And so, I called the title today, "Is Anything Private Anymore?" And uh, I joked with Todd about Alexa, but as we're gonna learn today, these devices like Siri and Alexa and Google ... They are taking our private conversations and making them public in many ways.
Leh Meriwether:And so, of course, Todd and I are not the experts in this area. So we had Cabot Howell from Digital Agent come back on the show just so we didn't make fools of ourselves.
Todd Orston:That's gonna happen anyway, so.
Leh Meriwether:Yeah. Cabot is currently the vice president of Digital Agent. Digital Agent, among other things, provides data storage and security services to a wide variety of businesses. Cabot himself brings with him 20-plus years of experience in the field of IT support. Cabot takes a proactive approach to IT management by anticipating what customers need before they need it. He's versed in each customer's existing infrastructure, and gives direction on the most efficient and cost-effective ways to reach their business goals.
Leh Meriwether:He is the chief technology officer for Digital Agent's IT customers, including our law firm.
Leh Meriwether:Well Cabot, thanks so much for coming back on the show.
Cabot Howell:Thank you Leh, I appreciate coming back and the opportunity to share some of the ways we can compute safely in this world.
Leh Meriwether:Yeah. I'm glad we didn't scare you away last time.
Todd Orston:So, Cabot, let me ask you this. It used to be that the only thing in terms of your secrets and things that you said in your house, the only thing you really had to be concerned about was a well-trained parrot that might be able to repeat what you were saying. But, it sounds like more and more with all the new technologies that are coming out, we are literally filling our homes with open doors, with pathways for information to unknowingly be transmitted and stolen by people.
Cabot Howell:Yes. And it's increasing with the IoT, the Internet of Things. Nearly all devices as they're coming out now, the new devices, have a virtual assistant built into them. So that leaves just about every device ... Your smoke detector, your thermostat ... Everything that was ... A coffee pot. Your toaster. Everything that's got the Internet of Things is going to record everything you do. [crosstalk 00:03:45].
Todd Orston:Yeah, you were talking about at the recent-
Cabot Howell:Consumer Electronic Show.
Todd Orston:That's right. And how it's like everything, nowadays, is coming out where it has that technology built in so more and more and more, you're gonna be literally in a home where you might have 10, 15, 20 devices or things like refrigerators and stoves, that all have that technology built in.
Cabot Howell:Yeah. It's estimated that within two years, there will be over 50 billion devices with IoT capability that have the virtual assistants in them, that are keeping track of you. The idea is convenience. You're wanting to see how much milk you have in the refrigerator. You want it to warn you that you only have a quart left. Now, that's great, that's convenient, but now you're letting everyone else know that you only have a quart of milk.
Cabot Howell:That may not be a problem. But then you've got five different milk producers that are inundating you with calls, with ads, with everything else. Because they've signed an affiliation with the company that built that IoT device in your refrigerator. So, now you're being inundated with milk ads. I want to choose when I want to buy or when I wanna be marketed to. So, that's the funny side of it, or the annoying side. But again, with divorce attorneys, we've got another side to it. We've got a scary side to it.
Leh Meriwether:Yeah. You sent me a list of just some of them. So we've got the Amazon Echo or Alexa, Google Home, Apple Siri, Moto Voice-
Cabot Howell:Which is my choice.
Leh Meriwether:Moto Voice is your choice?
Cabot Howell:I like that on my phone, yes.
Leh Meriwether:Oh, okay. Google Assistant Now, Microsoft Cortana ... I forgot about Cortana. Samsung Bixby, S Voice, and Clova.
Cabot Howell:A lot of devices out there that are listening to you.
Leh Meriwether:Wow. I didn't even know about those other ones.
Cabot Howell:That's just a small sampling. There are many, many devices out there that are being proprietarily developed for the IoT, the Internet of Things.
Leh Meriwether:So, you know, I understand that one of the problems with Amazon is that somehow, you could be coming home one day and there's a box sitting on your porch and you're like, "I don't remember ordering anything." But, Alexa does. So, I understand that there were some issues last year with it just randomly ordering things to people's houses. Or, not randomly, but basically after Alexa listened to the children it started ordering things.
Cabot Howell:Yeah. There is no password protection, unless you password-protect every purchase. Again, these are just security and privacy settings everyone needs to go in and review. But without that, anyone who comes in your house can order some cookies or a dollhouse, as was done on a news program back last year in California. The news anchor said, "Alexa, order a dollhouse." Isn't that cute? All of a sudden there were 10,000 dollhouses ordered.
Leh Meriwether:Oh, so all the devices-
Cabot Howell:All the devices heard the news anchor.
Todd Orston:Yeah, there was a ... My understanding of the article and the story was that someone on TV or whatever it was, said something like, "Alexa, order something." And basically, thousands of Alexas then started ordering them and Amazon had to come out and say, "Don't worry. Everybody that got one of these ordered, we're not gonna charge you. We're gonna reverse the purchases." But, again, that just goes right to the heart of the issue which is, these devices are always listening. And in a perfect world, they're not doing anything nefarious, right? They're not doing anything improper. But, a lot of what we're talking about is how those very tools can be used to gather information that we don't want to share and that can be used against us.
Leh Meriwether:Yeah, and I know we're gonna continue talking about Alexa, but you had told me about an interesting situation involving the devices. Now these have been in our cars for a while now, the OnStar-type things ... There have actually been cases where OnStar was used against people in criminal cases. Can you tell us about that one?
Cabot Howell:Well, the reason it was allowed is because it was determined that there was no spying going on. No one from the OnStar company or ATX Technologies, nobody was actually actively listening. What had happened ... The accident initiated the recording. And so, what was done during this murder was recorded through the OnStar, and it was admitted into court because they said that the FBI or OnStar did not initiate that call. It was the passenger that initiated the call. So that's how it got through the courts. But yes, it's always listening. And we're talking 20 years ago. So, this is nothing new.
Cabot Howell:But now, instead of just OnStar, just as Todd said earlier, just those one or two devices or the parrot sitting by, now it is so many devices everywhere, listening all the time.
Todd Orston:So the lesson there is, before you commit any felonies, get out of the car. Is that ... Maybe I'm missing the point.
Cabot Howell:Well, that's one ... I'm glad you're [inaudible 00:08:46] at least one thing from what I'm trying to relate to you.
Todd Orston:As always, it's the wrong thing, but whatever.
Leh Meriwether:I think you had mentioned another situation where I guess the person driving a car didn't realize OnStar was in there, accidentally turned on the OnStar emergency service, then turned it off. But the OnStar person followed protocol and contacted them back, or contacted 911, because they thought maybe there was an accident, and wound up catching the people in a drug deal.
Cabot Howell:Right, because the person had bought the car from someone else who had the OnStar activated. So, the activation was still in place. They didn't know that. Their argument was, "I didn't do it. I didn't pay for OnStar." But, it was still an active subscription.
Leh Meriwether:And, look. These are rare circumstances. But, the point that we are making ... This show is not really about OnStar. It's not about, you know, don't sell drugs with OnStar in the car because it may result in prosecution. It's that these devices are always on, always listening. And OnStar actually has to be turned on, but things like Alexa ... And I'm not just picking on Alexa, but these types of devices.
Todd Orston:The virtual assistants.
Leh Meriwether:They are always listening, that's the big concern now. I mean, that's what I've been reading about, that it's constantly listening. That's really where my concern lies.
Cabot Howell:Right, that's what we need to delve into in this next segment, is the security implications associated with these virtual assistants.
Leh Meriwether:And so, I understand that the virtual assistant can't understand context. And so, there could be a situation where you're at home, or maybe the kids ... You know, they're playing around, they're playing with the Nerf guns, and one brother shoots the other brother, and the other one says, "That's it! I'm gonna kill you!" And he starts firing. Well, Alexa's recording this, and may not realize that this is two kids having fun. And I know there are situations where it can call 911 in certain situations. And it might take something out of context and contact the police. So, [music] this is something we need to be aware of. Again, as we like to do, we like to make people aware of what's going on. So you don't wanna miss what's coming up next, 'cause we're gonna dive into this even deeper.
Leh Meriwether:Welcome back, I'm Leh Meriwether and with me is Todd Orston. Todd and I are partners at the law firm of Meriwether and Tharp, and you're listening to Meriwether and Tharp radio on the new Talk 106.7. If you wanna learn more about us, you can always call or visit us online at atlantadivorceteam.com.
Leh Meriwether:Well, this show ... We're actually diving into all these conveniences that we have like Alexa and Siri and those type of things. And I know we keep focusing on those, only just because there's so much advertising about it. But, what we're going into is how these devices are actually causing us to give away our private information without even thinking about it. So, we wanted to make people aware of what could happen with these devices in your house if you're trying to have a very confidential conversation with someone and that sort of thing.
Leh Meriwether:At the end of the last segment, I had talked about context. So, you know, the problems of context ... These weren't from me, this was from you, Cabot. You had sent me an email about this. Can you explain a little about the context problem of artificial intelligence?
Cabot Howell:Well, the artificial intelligence, it doesn't know the context of what you're speaking. You made a joke earlier about two kids playing and one of them shoots the other one in the eye with a soft pellet, and the other one screams, "I'm gonna kill you!" All of a sudden, Siri or Alexa has a recording that somebody's gonna kill somebody. Well, that's two seven-year-olds playing. They didn't really mean it and they didn't know the gravity of the word "kill". They just heard it and they just blurted it out. So the context is important, and that's what we're missing here, is that context.
Cabot Howell:So, what we're doing is ... Say a spouse is talking to a relative, talking about their case. Terminology may come out that was not intended. It was just, "Oh, I would just like to punch him in the face." Well, they're not really wanting to punch them in the face, it was just an expression they said. But then, that terminology, again, combined with a pattern of other activity, can be skewed by a lawyer.
Cabot Howell:I know that's unheard of, of course. But in a certain view, it can be looked at as negative, when all it was, was a joke or an expression for the moment. And the problem is ... What we wanna take away from this is that these devices are always listening.
Leh Meriwether:And from what I understand some of them retain the information.
Cabot Howell:Right. Siri even lets you know that it retains the recording with the device ID for six months, and it continues to hold onto that recording for a year and a half. So, for a long time ... And they say it's because they want to improve the dictation, improve the voice recognition, and so it's going through their number crunching to help with that area. But, regardless of what the purpose is, are you comfortable with every recording or every conversation you have being retained by a company and its affiliates for a year and a half?
Leh Meriwether:So are you saying when I use Siri, that she's actually ... Like if I say, "Hey Siri, will you do this?" Does it actually keep that information?
Cabot Howell:For a year and a half.
Cabot Howell:And, Siri's affiliates, Apple's affiliates.
Leh Meriwether:I shouldn't have said, "Hey Siri, get Todd in trouble."
Todd Orston:In my house, that's Alexa.
Cabot Howell:The Siri license agreement reads as this: "By using Siri or Dictation, you agree and consent to Apple's and its subsidiaries' and agents' transmission, collection, maintenance, processing, and use of this information, including your voice input and user data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services." What that says in legal speak is, you just gave them all rights to everything you say.
Leh Meriwether:Wow. So, maybe we should ... Can we subpoena the information, I wonder?
Cabot Howell:Well there have been ... The FBI has asked to subpoena things from Amazon Echo and Apple. They've asked, and they have been rejected, but what to take away from this is, they know that everything has been recorded. So if the FBI and Homeland Security know it's been recorded, well, us old country boys here in Atlanta probably know it as well.
Leh Meriwether:Yeah. So what is this ... People have been talking about voice data and how it's being collected from us. Can you talk about the voice data and how the AI is using it ... How it's being shared with other companies?
Cabot Howell:The main thing is, someone wants to make some money. That's what it's all about, right? Making money.
Leh Meriwether:Mm-hmm (affirmative).
Cabot Howell:And how they use this ... One of the primary reasons that I am averse to this is marketing. They wanna know what you buy, what you're planning on buying. Let's say you have a conversation about insect repellent. You're just talking about it with your wife. Next thing you know, you're on the computer, and you have Ace Exterminating coming up. You have all these manufacturers' marketing attempts aimed at you.
Cabot Howell:As I said earlier, I'd rather choose when I wanna be marketed to as opposed to, every time I drive somewhere, it pops up on my phone, "Hey, Arby's is here," just because I went to Arby's twice. And all of a sudden, "Hey, Arby's is here," is popping up on my phone. I'm looking for directions and it has a pop-up of Arby's.
Cabot Howell:Of course, I've turned all that off on my phone, but this is why it's important to look at your security and your privacy settings on all devices, your Siri, your Alexa ... Every device you have, look at the privacy and security settings.
Leh Meriwether:So there are things that you can do? You can uncheck things so it's not pulling that data out and sharing it with others.
Cabot Howell:That's correct. But unfortunately, at this juncture, not the voice that is recorded. And the agreement, as I just read ... Siri says, "We want this, and if you're gonna use this, you're gonna give it to us."
Leh Meriwether:Okay. You know, and it's interesting you say that because just this last Sunday ... And we have an Amazon Echo in our kitchen. We had pulled out a pot, and there happened to be