Published Date 2/23/2018
We made fun of Siri, but then we learned to love her once we found it easier (and safer) to talk into our iPhones instead of typing as we drove. But Siri was just the beginning for voice-activated personal assistants. When Siri first came out, we felt like mini-James Bonds speaking to our robotic butlers, sometimes asking inappropriate questions just to see what the artificially intelligent being would say in response. We felt, after all, that it would go no further than our smartphones.
Now there are millions of digital assistants hidden in pockets or gracing kitchen counters — Microsoft’s Cortana, Amazon’s Alexa, and Google’s Assistant, now advertised everywhere. They analyze what we say or type and, in return, offer us useful information. More recently, they have learned to anticipate what we want — such as notifying us of traffic jams at just the right time. And it’s great to find an address without having to touch our phones, or to search Wikipedia for information on an old movie we’re watching on TV.
The only problem? Convenience comes at a price. Privacy and security concerns expose users to more than they bargained for.
That intimate relationship you have with the voice that shuffles music and plays Billy Joel songs isn’t so intimate. Your innocent query gets sent to a giant server somewhere, where it’s analyzed, answered, and returned to your device. And in Hansel and Gretel fashion, your requests leave a trail of breadcrumbs behind — just like everything else you do online. So what about the not-so-innocent questions? Your queries and commands are kept on that server for months, the audio portion even longer. It’s all in the fine print you may avoid reading when you hit the “agree” button for any high-tech device.
Now figure that when your question is location-based, your human-sounding phone or countertop assistant can keep track of your habits, travels, and preferences. When’s the last time your iPhone flashed a message that told you (without prompting) how long it would take to get home? Keep in mind that these assistants wouldn’t be able to return useful results without being able to read your email and access your search history as well. The resulting data portraits are available to law enforcement officers and can also end up in the hands of hackers who gain access to those servers — something you simply don’t want to think about because — hey — you have nothing to hide, right?
Amazon’s Echo is more popular than ever, dominating three-quarters of the smart speaker market. But critics are still skeptical. The device behind Alexa’s non-threatening female voice is always listening for its wake word, and once it hears it, recording and storing what we say. Owners worry that strangers can hear their daily activities.
If you already own an Echo, know that it records more than you ever thought possible. If you are considering purchasing an Echo, however, take heart. It’s possible to adjust your security settings to limit what is saved by the device and its server minions. For a complete picture of what you can do, search YouTube for a few instructional videos. For one, you can turn off Echo’s microphone, its most vulnerable part. The mic picks up all the sound in the room, potentially compromising private conversations. Muting it lessens the fun, however, since it renders the device useless as a personal assistant unless you get into the habit of turning the mic on and off each time you need it. (Being able to yell from the kitchen table for it to turn down the music when a phone call comes in is exactly what makes you want to keep it on all the time.)
There are a host of other things you can do to safeguard your privacy, such as turning off voice purchasing and setting up PIN codes. Telling Alexa to buy more dog food is pretty cool, but a single security breach could cost you dearly.
Many of us figure our lives are already an open book, since being online or on our smartphones most of the day exposes us to many of the same dangers. But these voice-activated assistants take us one step closer — perhaps — to Big Brother (whoever that may be) doing the watching.
Source: TBWS
All information furnished has been forwarded to you and is provided by thetbwsgroup only for informational purposes. Forecasting shall be considered as events which may be expected but not guaranteed. Neither the forwarding party and/or company nor thetbwsgroup assume any responsibility to any person who relies on information or forecasting contained in this report and disclaims all liability in respect to decisions or actions, or lack thereof, based on any or all of the contents of this report.
©2015 Finance of America Mortgage LLC | Equal Housing Lender | NMLS 1071 | Complaints@financeofamerica.com
NMLS: 1543335
Finance of America Mortgage
6900 S McCarran Blvd #2020, Reno NV
Company NMLS: 1071
Office: 775-332-6629
Cell: 775-742-9128
Email: twerbeckes@financeofamerica.com
Web: http://www.financeofamerica.com/locations/branch-profile?id=c33827bb-71f8-6483-85d2-ff00007a9d7f