The walls (and the vacuums) have eyes! How Big Tech watches us


Amazon’s acquisition last month of iRobot, the manufacturer of Roomba smart vacuums, for $1.65 billion has some sounding the alarm. Experts are concerned both by how the company is further monopolizing multiple industries and by what the acquisition could mean for individual privacy. 

But it’s not just Roomba spying that has many concerned. Bluetooth devices that track your whereabouts, kids’ toys that record your child’s voice and even data collected from your smart vacuum or smart toilet could expose you to hackers, invite overreach from law enforcement agencies or hand insurance companies otherwise private data.

Living through the modern advent of so many smart devices, apps, and technologies may come with many perks and conveniences, but with so much data now being collected and banked by the same big tech organizations, it could also mean the end of personal privacy as we know it — including and especially within the safety of our own homes. 


“We use smart devices of all sorts every day,” said Sophia Maalsen, lecturer in Urbanism at the University of Sydney. “In doing so, we are always generating personal data about ourselves which is sent to the provider and third parties that collect this data and monetize it in different ways.” 

That collection and monetization of data is sometimes done by tech companies that monitor consumers through smartphones, Bluetooth trackers, tablets, home security systems, smart TVs and virtual assistant technologies such as Amazon’s Echo and Google Home. 

“I don’t think any one device is more dangerous than another,” said Torrey Trust, associate professor of learning technology at the University of Massachusetts Amherst. “I think the danger lies in having multiple devices that are collecting lots of types of data and selling or sharing this data with data brokers that create very detailed, specific profiles of consumers for targeted advertising or other uses,” she said. 

“Any app that can track your location on your phone all the time, likely knows when you are home, when you’re at work, who you sleep next to, etc.”

What happens, for instance, when a smart vacuum such as Roomba is connected to the internet and starts sending data back to Amazon on the content it’s vacuuming up (think pet fur or Goldfish crackers)? That’s powerful information to have. Or what might the implications be of a smart vacuum that notifies a health insurance company of dangerous objects in the home? Consider that last month Amazon also purchased the primary care organization One Medical for nearly $4 billion and has expressed interest in the past in starting its own insurance company.

“Roomba is not just interested in collecting floor plans,” Sadowski said. “They are gathering detailed, updated, machine-readable maps of your past and present (home) layouts with information about your furniture, habits, living conditions, other devices, all of that stuff.” 

Sadowski warned that people who shrug off concerns about corporations obtaining such vast amounts of information are “thinking of the data one-dimensionally,” and should instead consider “how that data will be compiled in other sources and streams” once it’s combined with everything else corporate juggernauts already know about each user. 
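To picture what Sadowski means, consider a purely hypothetical sketch in Python (the devices, field names and numbers below are invented, not any company’s actual schema or pipeline) of how separate data streams might be merged into a single household profile:

```python
# Hypothetical illustration only: invented devices, field names and values,
# not any vendor's real data format or pipeline.

vacuum_map = {"home_id": 42, "rooms": 5, "floor_area_m2": 130,
              "pet_hair_detected": True}
speaker_log = {"home_id": 42, "wake_word_events_per_day": 31,
               "music_service": "streaming"}
fitness_band = {"home_id": 42, "avg_sleep_hours": 5.9,
                "resting_heart_rate": 78}

def build_profile(*streams):
    """Merge records that share a home_id into one combined profile."""
    profile = {}
    for record in streams:
        profile.update(record)
    return profile

# Individually each record looks mundane; combined, the profile starts to
# describe household size, habits and health.
print(build_profile(vacuum_map, speaker_log, fitness_band))
```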

Sadowski acknowledged that information such as a home’s architectural blueprint is likely already available at the county records office, but said such records are harder to access than most people realize and are not available en masse. “It’s not as if you can just go to your county recorder’s office and say, ‘give me all of your floor plans, please,’” Sadowski noted. “Restrictions are there to prevent this kind of blanket, mass collection (of data) for any purpose (a company) wants to use it for.”


Of course, Big Tech’s mass surveillance capabilities go way beyond robot vacuums. In addition to all the smart devices that people traditionally think of regarding a company’s mass surveillance capability (think smartphones, smart TVs, tablets, smart speakers etc.) there are many lesser-known ways that companies collect and compile information. 

Joseph Steinberg, a renowned New York-based cybersecurity expert and author of multiple books including “Cybersecurity For Dummies,” told me Fitbits and smartwatches, for instance, “keep track of your oxygen levels, heart issues and sleep patterns.” He also noted that something as relatively benign as a coffee maker collects data such as “what time people wake up and at what times in the morning people are tired and not fully sharp or even coherent.” 

Another example Trust offered was the recent uproar from parents when they learned that their children’s smart toys were recording audio and collecting data on them. The apps on smartphones are also tracking users in ways people may not realize. “Any app that can track your location on your phone all the time, likely knows when you are home, when you’re at work, who you sleep next to, etc.,” Steinberg said. 
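As a rough illustration of Steinberg’s point, here is a minimal sketch (with invented sample data and deliberately simplified logic) of how an app could guess “home” and “work” from nothing more than timestamped location pings:

```python
from collections import Counter
from datetime import datetime

# Invented sample data: (timestamp, rounded lat/lon "cell") pings an app might log.
pings = [
    (datetime(2022, 9, 1, 2, 0), (40.71, -74.00)),   # overnight
    (datetime(2022, 9, 1, 3, 0), (40.71, -74.00)),
    (datetime(2022, 9, 1, 10, 0), (40.75, -73.98)),  # working hours
    (datetime(2022, 9, 1, 14, 0), (40.75, -73.98)),
    (datetime(2022, 9, 2, 2, 30), (40.71, -74.00)),
    (datetime(2022, 9, 2, 11, 0), (40.75, -73.98)),
]

def most_common_cell(pings, hours):
    """Return the location cell seen most often during the given hours."""
    cells = Counter(cell for ts, cell in pings if ts.hour in hours)
    return cells.most_common(1)[0][0] if cells else None

home = most_common_cell(pings, hours=range(0, 6))    # where the phone sleeps
work = most_common_cell(pings, hours=range(9, 18))   # where it spends weekdays

print("likely home:", home)
print("likely work:", work)
```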

Mark Andrejevic, professor of communications and media studies at Monash University, told me that even some smart light bulbs can keep track of the heart rates of a home’s occupants. “Basically, anything with a sensor and a network connection can track your data and behaviors,” he said. 


Most of our data is collected so that advertisers can ascertain our interests and get us to spend more money. But there may be a darker side to such surveillance that users aren’t aware of. 

Echoing Sadowski’s argument about complacency toward the data collected by Roomba vacuums, Maalsen warned that the data collected from so many combined sources “could harm us further down the line.” She pointed to smartwatches and smart appliances as examples. “If your smart fridge is keeping contents of your grocery list or your Fitbit tracks your activity levels, and this is fed back to your health insurer, anything not considered healthy by an algorithm could impact your premiums,” she warned.
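A toy version of the kind of scoring Maalsen describes might look like the sketch below; the fields, weights and thresholds are entirely made up and do not reflect any insurer’s actual model:

```python
# Entirely invented example: made-up fields, weights and thresholds,
# not any insurer's real scoring model.

household_data = {
    "daily_steps": 3200,           # from a fitness tracker
    "soda_purchases_per_week": 6,  # from a smart fridge / grocery list
    "avg_sleep_hours": 5.5,        # from a smartwatch
}

def premium_adjustment(data):
    """Return a hypothetical premium multiplier derived from behavioral data."""
    multiplier = 1.0
    if data["daily_steps"] < 5000:
        multiplier += 0.10
    if data["soda_purchases_per_week"] > 3:
        multiplier += 0.05
    if data["avg_sleep_hours"] < 6:
        multiplier += 0.05
    return multiplier

print(f"premium multiplier: {premium_adjustment(household_data):.2f}")
```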

Such behavioral data could also be provided to life insurance companies to help them determine who can and cannot qualify for their policies. Andrejevic said that information could be abused in other ways, too. “Data can be used not just to manipulate consumers, but also to affect their access to health care, employment opportunities and more.”

He added that “data from these devices can and has been subpoenaed as part of legal actions,” as well. If law enforcement agencies get their hands on behavioral data, they could use it to infer intent or guilt, or to establish probable cause for a warrant. Amazon has said it will only provide recordings from its Echo listening devices inside people’s homes if “a valid and binding legal demand (is) properly served on us.”

“Basically, anything with a sensor and a network connection can track your data and behaviors.”

On that front, Trust said she is also concerned about “online exam proctoring tools, which are quite dangerous in how much data they collect on students,” such as biometric data, audio, screen recordings and keystrokes, especially because “oftentimes, students have no say in the matter because their professors require the use of these tools to pass a course.” She added that online monitoring tools such as GoGuardian may also be misused by schools or shared with local authorities without a student’s knowledge, sometimes with concerning results. 

And though most people tend to give the companies collecting their data the benefit of the doubt, assuming they do so with the user’s best experience in mind, Andrejevic said there is “no guarantee that the data collected will be used to benefit consumers. It may even be used to exclude them from opportunities and services.” 

Beyond the potentially nefarious intentions of companies and government agencies, smart devices are also easy targets for hackers. “Nearly anything can be hacked,” Trust said. Two weeks before Christmas 2019, for example, a man hacked into an 8-year-old girl’s bedroom security camera and taunted her, saying he was Santa Claus and wanted her to be his “best friend.” 

Reports abound of hackers getting into Ring camera systems and other smart home devices, and even unlocking smart door locks remotely. Research from 2020 also shows that smart speakers such as Google Home and Amazon’s Echo can be compromised, with hackers “listening to everything” or installing malicious “skills” that can be accessed later to steal data or personal information. 

Maalsen said that home automation systems such as smart home devices, smart switches and smart lights have also proven to be susceptible to hacker attacks. In such cases, “the hacker takes control of the devices remotely,” she said. 

And hacking isn’t always done by strangers. Maalsen pointed me to 2021 research that found jilted ex-lovers hacking into smart devices they previously had login access to, such as Bluetooth trackers, thermostats, security cameras and entertainment systems, and using them to spy on or terrorize their former partners. She said there have even been cases where exes have locked a former partner out of, or inside, their own home. “Smart home devices used in cases of domestic violence is something we rarely consider in conversations around hacking and the misuse of such devices,” she said. 

Andrejevic echoed similar concerns: “There are numerous instances of smart devices being hacked or abused for tracking and stalking purposes,” he said. 


Despite the potential downsides, companies like Amazon do allow users to adjust various data settings, and smart devices have many benefits that make life easier and more manageable: thermostats and home security systems that users can arm, monitor and update remotely or put on set schedules; music and media that can be activated by voice alone from across the room; cleaning devices that handle mundane chores; and apps and smartwatches that remind users when it’s time to get moving again.

Steinberg pointed out that technologies such as a smart sprinkler system can even help users conserve water by detecting a forecasted rainstorm later in the day and not turning on as a result. “If deployed properly, smart devices can provide great benefits to those who use them both in terms of convenience and cost savings, and can also help the environment,” he said. 
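The sprinkler logic Steinberg describes is straightforward to picture. Here is a minimal sketch, with a made-up forecast record standing in for a real weather service query:

```python
# Minimal sketch of a "skip watering if rain is forecast" rule.
# The forecast dict is made up; a real system would query a weather service.

forecast = {"rain_probability": 0.8, "expected_rain_mm": 12}

def should_water(forecast, prob_threshold=0.5, mm_threshold=5):
    """Water only if significant rain is unlikely later today."""
    if (forecast["rain_probability"] >= prob_threshold
            and forecast["expected_rain_mm"] >= mm_threshold):
        return False  # let the storm do the watering
    return True

print("run sprinklers today:", should_water(forecast))
```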

“We have handed more control than is wise to companies whose priorities do not necessarily align with our own.”

Andrejevic similarly noted the “huge benefits — some perceived and some actual,” of such devices. “At the same time,” he warned, “these devices reconfigure power relations when corporations know more and more about us, and we know so little about how they put this information to use.” He added that smart devices also increasingly shape our information environment and our social relations. “We have handed more control than is wise to companies whose priorities do not necessarily align with our own,” he said.

As disconcerting as such notions may be, the experts stressed that there are simple steps users can take to become better protected — or at the very least, informed. 

“Become more tech savvy,” Trust advised. She recommends reading user manuals and learning about the products one uses, limiting the amount of data one shares (“I don’t allow apps ‘to share my location,’” she said) and, when possible, declining or rejecting tracking and cookies online. 

Maalsen suggested keeping one’s devices up to date, choosing strong passwords (“that should never be shared”) and avoiding the use of unsecured networks. “Being aware of your digital data footprint and choosing to turn off your smart devices when you can is helpful, too,” she said. 

Sadowski stressed the importance of investigating default settings on smart devices and “opting out” of any feature that could compromise privacy, especially now that many products no longer ask users to opt in but instead require them to opt out on their own. “Defaults are sticky,” he said, because many companies “cynically interpret (a user not opting out) as consent.” He explained that because many users don’t know better, companies have learned that if their default settings require users “to opt out vs opt in, the vast majority of people are going to stay opted in.” 
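Sadowski’s point about “sticky” defaults can be pictured in a few lines: if tracking is on unless a user actively switches it off, silence effectively counts as consent. The setting names below are invented, not any real product’s configuration:

```python
# Invented setting names; not any real product's configuration.

DEFAULT_SETTINGS = {
    "share_usage_data": True,          # on unless the user opts out
    "personalized_ads": True,
    "retain_voice_recordings": True,
}

def effective_settings(user_choices=None):
    """Whatever the user never touched keeps its default value."""
    settings = dict(DEFAULT_SETTINGS)
    settings.update(user_choices or {})
    return settings

# Most users change nothing, so the defaults decide:
print(effective_settings())                              # everything stays on
print(effective_settings({"personalized_ads": False}))   # only one opt-out
```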

“Strategies such as purchasing products supported by Western manufacturers, turning on privacy settings and working on closed network connections can make a world of difference,” Steinberg advised. 

Andrejevic similarly recommended taking such precautions, but said he is, “reluctant to place the burden on users.” He said concerned citizens need to appeal to elected officials to “build robust regulatory systems that make it safe for us to use these devices,” and to request that “data collected for one purpose — such as mapping a house to vacuum it — should be restricted to that purpose and not used to make inferences about one’s lifestyle, income level or health.” 

At the very least, “read the privacy policies,” Trust advised. “If you feel uncomfortable with the privacy policy, then either don’t use that product or see if there is a similar alternative product out there.” Whatever you do, she said, stop clicking “I agree” or quickly scrolling through a privacy policy without reading it. “This is your life you are giving away because you don’t want to spend a few moments taking these privacy policies seriously.”


