Personal data has fuelled the growth of some of today’s leading companies. Risks and ethical dilemmas are increasing, threatening to derail this trend. Paul Bryant reports.
Clive Humby, architect of Tesco’s Clubcard and chief data scientist at Starcount, a consumer insights consultancy, coined the phrase ‘data is the new oil’ in 2006. He said: “It’s valuable, but if unrefined it cannot really be used … data must be broken down, analysed for it to have value.”
Some, and in particular technology giants such as Amazon, Google and Facebook, have mastered this data refining process and created enormous value. (See boxout)
But alongside the commercial opportunities presented by personal data come risks, including prescriptive new regulations, and ethical questions, such as selling products that cause harm.
Companies using personal data now need to deal with new data collection and usage constraints imposed by the General Data Protection Regulation (GDPR), as well as emerging consumer activism such as the #DeleteFacebook movement. GDPR imposes strict rules that dictate how personal data should be gathered, managed and protected, with onerous consequences for not complying.
On the question of ethics, social media platforms and smartphone companies have been accused of designing addictive products and causing mental health issues.
Investors who saw the commercial potential of personal data, and were able to pick the winners, have been richly rewarded. Since listing on public markets, investors in Amazon have enjoyed compound annual returns of 41% (over 20 years), Google 25% (over 15 years) and Facebook 32% (over six years).
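Return figures like these follow the standard compound annual growth rate (CAGR) formula. A minimal sketch, using hypothetical values rather than the actual share prices:

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: the constant yearly return that
    turns start_value into end_value over the given period."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative only: a holding that grows roughly 1,000-fold over
# 20 years compounds at about 41% a year.
print(round(cagr(1.0, 1000.0, 20) * 100, 1))  # prints 41.3
```

The same function, run in reverse against a quoted CAGR and holding period, shows how quickly compounding dwarfs the headline annual percentage.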
Financial services companies are also ramping up the sophistication of their use of personal data. In early 2018, Legal & General (L&G) launched a new ‘Customer risk and opportunity management’ data programme to improve the customer insights it provides to intermediaries selling life insurance products.
L&G now combines and analyses its own data with data from external partners, such as customer segmentation providers. Traditional life insurance metrics about policy lapses or claims are analysed alongside ‘lifestyle data’, such as digital channel preferences, financial attitudes or life stage.
One early insight was that the birth of a first child was not triggering a life insurance purchase by parents to the extent expected. But, far more frequently, the birth of a second child was. This tendency could then be conveyed to intermediaries to make first-time parents aware of a common oversight.
Rob Gaunt, head of commercial management and distribution quality management at L&G, says that it’s too early to quantify the benefits of the programme. But his goal is clear: “What we want is to very quickly start producing a list of consumer insights that, when presented to a group of intermediaries, 90% of them will say: ‘I wouldn’t have expected that’.”
Investors are also using big data, including personal data, to inform their investment strategies. In its 2017 Global hedge fund and investor survey – of 106 hedge funds and 55 institutional investors – EY finds that 78% of hedge funds currently use, or expect to use in the next 6–12 months, ‘non-traditional’ data, such as social media data, credit card data and search trends. In the 2016 survey, the proportion was less than 50%.
Institutional investors are buying the idea. They told EY that 24% of the hedge funds in which they invest currently use non-traditional data and they expect that proportion to rise to 38% within three years.
The risks of using personal data are significant, and rising, but some firms and investors have been apathetic.
In July 2017, Equifax, the consumer credit scoring agency, discovered that it had been the victim of a hacking attack that compromised sensitive personal data of 143 million US consumers. It took more than a month for Equifax to publicly disclose the breach.
A week later, its share price had dropped 37%. The CEO, chief information officer and chief security officer stepped down. Equifax later confirmed it was party to more than 240 consumer class action lawsuits, as well as financial institution and shareholder class action lawsuits. And as of 30 June 2018, US$314m of expenses had been recorded related to the incident.
Equifax and its investors had been warned. MSCI, an agency that rates companies on their environmental, social and governance (ESG) performance and risk exposures, downgraded Equifax to its lowest possible ESG rating in August 2016. It reported that Equifax was “vulnerable to data theft and security breaches”.
Investors largely ignored the warnings. The Equifax share price gained over 25% between December 2016 and September 2017, when the breach was announced. Following the plummet in the immediate aftermath of the incident, the share price has regained some ground but, as of 2 August 2018, still trades at around 12% lower than the level before the incident was announced.
This is not an isolated case. In 2015, UK telecommunications company TalkTalk had its systems breached, compromising the personal data of 157,000 customers. In its 2016 annual report, TalkTalk reported that, due to the breach, it had incurred £42m of exceptional expenses and lost 95,000 (just under 2.5%) of its broadband customers. The Information Commissioner’s Office imposed a record fine of £400,000, with Information Commissioner Elizabeth Denham saying: “TalkTalk’s failure to implement the most basic cyber security measures allowed hackers to penetrate TalkTalk’s systems with ease.”
Today, in a ‘live’ GDPR environment, for companies handling the personal data of EU consumers, these consequences could be even more severe. The fines associated with this legislation – up to €20m or 4% of annual worldwide revenue, whichever is greater – have been well publicised and have the potential to dwarf previous fines imposed for failing to adequately protect personal data. According to cyber security firm NCC Group – which created a model to determine what tier of fine would have applied to TalkTalk under GDPR legislation – the £400,000 fine might have been £59m.
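The upper-tier fine cap is simple arithmetic: the greater of €20m or 4% of worldwide annual revenue. A short sketch of that ceiling (NCC Group’s £59m figure comes from its own tiering model, which is not reconstructed here):

```python
def gdpr_max_fine(annual_revenue_eur: float) -> float:
    """Upper-tier GDPR fine ceiling: up to EUR 20m or 4% of total
    worldwide annual revenue, whichever is greater."""
    return max(20_000_000.0, 0.04 * annual_revenue_eur)

# A firm with EUR 100m revenue faces a EUR 20m ceiling (the floor applies);
# a firm with EUR 2bn revenue faces a EUR 80m ceiling.
print(gdpr_max_fine(100e6), gdpr_max_fine(2e9))
```

Because the 4% term scales with revenue while the €20m floor does not, the ceiling grows linearly for large firms, which is why the exposure dwarfs pre-GDPR penalties.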
MSCI says the risks are particularly severe for the financial sector. Writing on the MSCI website, Matt Moscardi, executive director, ESG Research, says: “Of the companies with revenues in the EU that MSCI ESG Research identifies as particularly at risk of privacy and data security issues, 40% were in the financial sector.”
The Equifax and TalkTalk events illustrate some of the harsh financial, legal and reputational consequences of inadequate personal data management. But the reactions of consumers are not consistent. While TalkTalk experienced a quantifiable consumer exodus, in the aftermath of the Cambridge Analytica saga, Facebook didn’t. Despite a relatively high-profile #DeleteFacebook campaign and extensive press coverage about consumer outrage, daily active users (DAUs) continued to increase during the first half of 2018 (5% more DAUs compared to 31 December 2017). In Europe, DAUs did decline by 1% in Q2 2018, but this change to Facebook’s European growth trend (DAUs were previously growing by around 1% per quarter) has been attributed to the introduction of GDPR as well as reputational and privacy concerns.
In a study of users’ reactions to the Cambridge Analytica event, research company AppOptix concluded: “It is clear that declines in short-term Facebook app use were non-existent. That is, consumers shrugged off the incident and have maintained their usage patterns.”
Ethical pressures mounting
Companies and investors also have to grapple with ethical questions. Speaking in November 2017, Sean Parker, an early Facebook investor, said: “The thought process that went into building these applications, Facebook being the first of them, was all about: ‘How do we consume as much of your time and conscious attention as possible?’ God only knows what it’s doing to our children’s brains.”
Facebook itself has acknowledged the dangers. A December 2017 Facebook Newsroom article – ‘Hard questions: is spending time on social media bad for us?’ – states: “In general, when people spend a lot of time passively consuming information – reading, but not interacting with people – they report feeling worse afterward … A study from UC San Diego and Yale finds that people who click on about four times as many links as the average person, or who like twice as many posts, report worse mental health than average in a survey.”
Big data algorithms have also been accused of reinforcing discrimination. In her 2016 book, Weapons of Math Destruction, data scientist and former hedge fund quant Cathy O’Neil argues that big data algorithms trap the poor in a “death spiral of modelling”.
For example, poor people are more likely to have bad credit and to live in high-crime neighbourhoods. Recruitment algorithms therefore score them as high-risk and block them from jobs. Other algorithms access the same information and increase the cost of their credit and insurance, making their financial position even more precarious and driving their credit scores down further.
The upside for consumers
But it’s not all downside for consumers, who benefit from having their personal data used to create and refine products. In wealth management, moneyinfo provides financial advisers with a personal financial management (PFM) and client portal tool to offer to clients. The app links to and analyses data from credit cards, bank accounts, mortgages, investments, pensions, and even insurance policies. Clients see a complete, up-to-date picture of their finances; can access tools such as spending pattern trackers; and receive a more sophisticated service from their advisers.
Although consumers may think that their adviser has simply presented them with a new set of digital tools (the moneyinfo app will be branded by the adviser), multiple parties are required to make these modern tools function. And consumers now need to be made aware of how their data moves around and where it is stored – for example, it is common for companies to store their data ‘in the cloud’ at an outsourced data centre. Consumers can take some solace from knowing that these data centres, if storing the data of EU consumers, must also be GDPR compliant.
The personal data landscape of 2018 typifies the extremes of the digital economy. Enormous commercial opportunities exist alongside potentially destructive risks and disturbing ethical dilemmas. One misstep under the tough new GDPR regulations could leave a firm reeling from the financial and reputational damage of mishandling personal data. Investment decisions – such as picking winners, losers and ethically sound investments – are only going to become more complicated.
This article was originally published in the Q3 2018 edition of the CISI members’ magazine, The Review. Republished with permission.
The value of personal data in 2018: shifting sands
Personal data is sometimes bought by organisations in a straightforward commercial transaction with an explicit value. And sometimes it is collected in exchange for a service, where the value exchange is not as clear. Both of these models are likely to undergo change.
Clive Humby, chief data scientist at consumer insights consultancy Starcount, says traditional ‘data brokers’, which sell lists of consumer data for marketing purposes, have been disorientated by GDPR. Some of their previous data collection methods are no longer legal. Nor can they store some of the data they did in the past.
He says: “Even pre-GDPR, at the very top end, lists very rarely cost more than 50p per name, perhaps £1–£2 per name with a pre-qualification process, such as knowing the consumer is looking to buy a new house. Post-GDPR, the quality of this data is likely to drop, and so will its value.”
Clive thinks organisations need to focus more on collecting their own proprietary data. A mortgage broker, for example, might create useful content for potential customers about moving house. Those interested can sign up, grant consent, and their browsing or download behaviour can be used to offer targeted, useful messages.
When customers receive a ‘free’ service in exchange for allowing an organisation to use their data, the value exchange can be opaque. Clive says that LinkedIn is a case in point.
A financial adviser might build a marketing list from the connections of a wealthy client. Many on this list are likely to be in the adviser’s target demographic. But the client has received no compensation.
End of the ‘Faustian pact’
For its 2018 Trends report, Mindshare Futures, a media and technology research agency, surveyed 6,000 UK consumers and found that two-thirds knew their data had value but did not know how to use it to their advantage.
Jason Smith, co-founder and former CEO of social media data analysis firm Blurrt, thinks the ‘pay-with-your data’ model is on its way out. “I can’t see this ‘Faustian pact’ continuing where a consumer gives a company their data in return for a service and the company gets to do what it wants with the data. There is going to come a point, and the younger generation are closer to this point, when people say ‘this is not fair’.”
This view is in line with the Mindshare Futures research, which finds that 69% of millennials view their own personal information as ‘bargaining chips’ to enhance their lives.