The Cambridge Analytica scandal left EU regulators scrambling to safeguard consumers' privacy online. Mike Scott reports that rules requiring greater transparency won't be enough to address the trust deficit

One of the most annoying things about signing up for a new service online is the plethora of contracts, license agreements, terms of service, privacy policies and other documents that we are required to confirm we have read and agree to, even though we probably haven’t.

“No one reads the paperwork,” says Megan Bell, chief privacy officer at Human API, a company that runs a health data exchange allowing US consumers to access and use their health data to get better healthcare. “They’re called clickwraps in the industry, because people just click them without reading them.”

However, people are starting to become aware that these documents are there for a reason, one that is not necessarily to protect consumers, and that our data is not confined just to the websites that we use.

Every day, internet users interact with technologies that are explicitly designed to undermine their privacy

This was most starkly illustrated by the recent Cambridge Analytica case, in which the company used information obtained from Facebook to target voters in the 2016 US presidential election.

In the same way that Apple’s iPod was basically a hard drive dressed up as a music player, Facebook and other platforms are essentially data-mining and advertising businesses disguised as social media, whose success hinges on being able to target users with particular ads based on what they know about each user’s likes, dislikes, activities and preferences.

When this is done to sell us trainers or holidays, perhaps the harm is not that great, but when the information is being used to influence the democratic process, the stakes are much higher. And platforms’ ability to target individual users will only improve as artificial intelligence and machine learning spread.

Cambridge Analytica was a warning signal to many users. (Credit: AlexandraPopova/Shutterstock)

“How companies use data is so advanced and sophisticated that even if consumers had the time to read exactly what is happening to their data and know what their legal rights are, they probably would not understand all the nuances,” says Jack Carvel, general counsel at Qubit, a website personalization company.

Meanwhile, buying products online – everything from groceries to clothes to holidays – may seem like simply a more convenient alternative to going into a shop, but is in fact a completely different transaction. If you buy a shirt in a shop and pay cash, all the shop knows is that it has one less shirt and a bit more money. An online transaction gives the website your bank details, identity, address, date of birth and, as you buy more goods online, a comprehensive picture of your spending habits – information that is valuable not just to the site you have used but to plenty of others as well.

Every day, internet users interact with technologies that are explicitly designed to undermine their privacy, says Woodrow Hartzog, a professor at Northeastern University School of Law and College of Computer and Information Science. In his book Privacy’s Blueprint, he argues that “social media apps, surveillance technologies, and the internet of things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is up to users to protect themselves – even when the odds are deliberately stacked against them.”

Our lives are now governed by our online presence. More regulation has become appropriate and it is a case of regulators catching up

Nor are these agreements likely to go away, according to Bell. “Why do we have all these complex third-party agreements? Because they’re legal agreements, and legal agreements have always been complex.

“At the back end they have to be complex because they are about complying with the law. I would assume that is not going to change, but there are moves at the front end to make the whole process consumable and give consumers more capacity to make decisions and take control of their data,” she adds. “The question is how you take these complex constructs of US or UK tort law and boil them down to something that the everyday user can understand.”

In the early days of the internet, issues such as data breaches were not seen as that important, says Carvel. “The view was: ‘Why should my behaviour on a website be regulated in the same way as my taking medication is?’

Most people click online agreements without reading them. (Credit: Georgejmclittle/Shutterstock)

“But now … our lives are governed by our online presence. As it became more important, more regulation has become appropriate and it is a case of regulators catching up.”

The most high-profile regulation has been the European Union’s GDPR (General Data Protection Regulation), which spells out EU citizens’ digital rights, giving them new powers to access and control their data online. As is the way of things with EU legislation, the law was seven years in the making, and when the process started, many questioned the need for it. By the time it came into force, amid the fallout from Cambridge Analytica, the point of it was abundantly clear.

Even though it applies only to EU citizens, the regulation has already become a de facto global standard for data protection, in part because the US has no national privacy law. California has recently introduced its own Consumer Privacy Act (CCPA), which is similar to GDPR but with some caveats, according to Bell. “It’s still a piecemeal approach,” she adds. “But at some point, the patchwork fixes will amount to a comprehensive approach.”

It’s not a question of putting terms in place and saying ‘you agreed to them’. The key point with online agreements should be fairness

However, while GDPR emerged after a long period of consultation, CCPA “is almost entirely a response to Cambridge Analytica. It was rushed and that’s not a sensible way to regulate,” Carvel says.

Underpinning GDPR is the idea that trust is essential to the growth of the digital economy and that trust has been forfeited by a lack of transparency on the part of the online platforms.

Transparency is important, agrees Carvel, but it is not enough. “It’s not just a question of putting terms in place and saying ‘you agreed to them’. The key points with these online agreements should be fairness and using the technology in an ethical way.”

Online transactions give websites a lot of information about users. (Credit: Rawpixel.com/Shutterstock)

Recent research by the UK’s Information Commissioner’s Office (ICO) showed that barely one third of people “have trust and confidence in companies and organizations storing and using their personal information”. For social media companies, the figure is 15%.

“Across the world people have woken up to the importance of personal data and how it’s used,” says Elizabeth Denham, the UK's Information Commissioner. “Personal data has become the currency by which society does business, but advances in technology should not mean organizations racing ahead of people’s rights. Individuals should be the ones in control and organizations must demonstrate their accountability to the public.”

What would also be helpful, Carvel adds, would be the development of standard, agreed templates that are held up as best practice. “It doesn’t really happen, but it would be great if there was a standard we could all use. We wouldn’t have to negotiate agreements for months. It would make things a lot easier for smaller companies, which wouldn’t have to invest so much in legal resources, while consumers would see something they were familiar with.”

Data protection by design is about considering data protection and privacy issues upfront in everything you do

Nonetheless, there has been some progress. Companies such as Microsoft and Google have a good reputation for simplifying their terms and conditions, Bell says, while one of the concepts championed in GDPR, “privacy by design”, is gaining traction even where it is not required. “In essence,” says the ICO, “this means you have to integrate, or ‘bake in’, data protection into your processing activities and business practices, from the design stage right through the lifecycle.

“Data protection by design is about considering data protection and privacy issues upfront in everything you do. It can help you ensure that you comply with the GDPR’s fundamental principles and requirements, and forms part of the focus on accountability,” it adds.
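
To make the ICO’s principle concrete, here is a minimal Node-style TypeScript sketch of what “baking in” data protection at the design stage can look like: a hypothetical signup handler that keeps only the fields the service needs and pseudonymizes the email address before anything is stored. The form fields, salt handling and function names are illustrative assumptions, not the ICO’s guidance or any company’s actual code.

```typescript
// Hypothetical sketch of "data protection by design" at the point of
// collection. All names here are illustrative assumptions.
import { createHash, randomBytes } from "node:crypto";

// What the signup form submits.
interface SignupForm {
  email: string;
  name: string;
  dateOfBirth: string; // captured by the form, but not needed by the service
}

// What the service actually stores: the email is replaced by a salted
// hash, and the date of birth is dropped entirely (data minimization).
interface StoredUser {
  emailHash: string;
  name: string;
}

// A per-deployment salt; in practice this would live in a secrets store.
const SALT = randomBytes(16).toString("hex");

function pseudonymise(email: string): string {
  return createHash("sha256")
    .update(SALT + email.trim().toLowerCase())
    .digest("hex");
}

// The privacy decision is made here, at the design stage, rather than
// bolted on later: nothing downstream ever sees the raw email or DOB.
function minimise(form: SignupForm): StoredUser {
  return { emailHash: pseudonymise(form.email), name: form.name };
}

console.log(minimise({ email: "Ada@example.com", name: "Ada", dateOfBirth: "1990-01-01" }));
```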

The emphasis on privacy and consumers’ rights is leading not just to changes in behaviour, but the emergence of companies hoping to profit from helping people manage their online security. OneTrust, for example, dubs itself a provider of “privacy management software” that helps companies ensure compliance with GDPR and manage visitor consent and preferences. BigID and ID Experts also help businesses keep customers’ data private in sectors such as healthcare, financial services, retail and government.

Google has simplified its terms and conditions. (Credit: achinthamb/Shutterstock)

There is also a new caution in the online community after the years when companies did things just because they could without thinking about whether they should. “There are things that you just don’t do any more,” Carvel points out. “For example, websites can know what other tabs you have open in your browser. Virtually everyone agrees you can’t do that now. It’s the same with ‘zombie cookies’ that stick around even if you delete them.”
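
As an illustration of the respawning trick Carvel describes, here is a minimal browser-side TypeScript sketch of how a “zombie cookie” works: the tracking identifier is mirrored into localStorage, so clearing cookies alone quietly brings it back. The cookie name, storage key and logic are hypothetical, shown purely to explain why the practice is now considered off limits.

```typescript
// Hypothetical sketch of cookie "respawning". Names are illustrative.
const COOKIE_NAME = "tracking_id";
const BACKUP_KEY = "tracking_id_backup";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  // One-year lifetime; the user can delete it, but respawn() undoes that.
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=31536000; path=/`;
}

// Runs on every page load: if the cookie is gone but the localStorage
// copy survives (or vice versa), the old identifier is silently restored.
function respawn(): string {
  let id = readCookie(COOKIE_NAME) ?? localStorage.getItem(BACKUP_KEY);
  if (!id) {
    id = crypto.randomUUID(); // fresh ID only on a genuinely first visit
  }
  writeCookie(COOKIE_NAME, id);
  localStorage.setItem(BACKUP_KEY, id);
  return id;
}

respawn();
```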

Firms are also highlighting their ethics as a business differentiator. Fleksy, a keyboard maker, says that while keyboards can collect and send data just as websites can, “Fleksy does not collect personal data and it gives the user access to the analytical data that is collected to improve the keyboard, and which can be erased. It’s not a matter of legal protection or compliance, but a business decision.”

Qubit benefits from the fact that it is based in the EU, Carvel says. “We are the only company in our field based in the EU. We have had to do this for a long time and over the last three years it has really started paying dividends. We have won a lot of business based on the fact that we can offer better security practices.”

Ultimately, what will restore trust and put power over data back in the hands of consumers is when companies realize that it is good business to do so.

Mike Scott is a former Financial Times journalist who is now a freelance writer specializing in business and sustainability. He has written for The Guardian, the Daily Telegraph, The Times, Forbes, Fortune and Bloomberg.

Main picture credit: wk1003mike/Shutterstock

This article is part of the in-depth Ethics of Digitization briefing. See also:

Holding the human rights frontier in a borderless internet

Taking the heat out of the digital revolution

‘We need everyone in the tech industry to sign up to 100% renewables’

‘Europe’s data privacy revolution is far from over’

Will privacy concerns throw up roadblocks for self-driving cars?
