‘Cambridge Analytica – The Alternative View’ by ThoughtRiver

The recent furore over the activities of Cambridge Analytica focuses on issues of consent and the ethics of data profiling. But for corporations preparing for the General Data Protection Regulation (GDPR), which comes into force in May this year, there is a potentially more lethal danger lurking in the story: one that comes from simply not understanding and tracking data flows, writes Tim Pullan, Founder and CEO of legal AI company ThoughtRiver.

For readers not familiar with what happened, Cambridge Analytica – a UK/US data analytics consultancy – allegedly used data gleaned from millions of Facebook accounts to develop profiles that were then used by the 2016 Trump election campaign. A Channel 4 exposé secretly filmed the CEO, Alexander Nix, and fellow executives talking about the power of their data analysis, as well as describing entrapment and other techniques they could apparently deploy to help political clients.

The ferocity of the news coverage is at least partly down to the human elements of the story. The narrative threads together an Old Etonian (Mr Nix), a Cambridge University academic (Dr Kogan), mystical technological powers of influence, and international figures including President Trump and Facebook’s CEO Mark Zuckerberg, the latter having to publicly apologise and see his company’s share price dip.

But serious commentators have focused on how Cambridge Analytica and Dr Kogan were apparently able to download millions of Facebook profiles via a Facebook app Dr Kogan had created for the purpose, and the impact of subsequent profiling activities on key political events such as the US presidential election and the Brexit referendum.

On the serious issues, the real surprise is that people have been surprised at all. Anyone with a mobile phone and a computer will be aware of the modern profiling techniques companies and political parties use to get people’s attention, and the apparent ineffectiveness of consent regimes.


If there is a qualitative difference in Cambridge Analytica’s work, it is that they managed to associate thousands of data points with each of millions of unique, identifiable individuals – each record anchored to what data analysts call a ‘person key’ – providing a particularly deep read on every one of them.

But despite the headlines, there remains (at the time of writing) considerable uncertainty as to whether the activity was illegal under existing data protection laws or in breach of Facebook’s policies. The response from the Government is that under GDPR all this is now going to change: consent is going to be the big compliance issue for corporates – and the big protection for individuals.

I was recently asked to give evidence to a House of Lords committee on data privacy and AI, where I disagreed with this prevailing view. The point is that people focus too heavily on the consent requirement per se.

But the underlying risk, so often overlooked, lies in whether corporates can track, manage and maintain the vast pools of data sloshing around their third-party relationships. This reality has now hit Facebook: its developer relationships, whilst clearly valuable, also present potent risks.

The truth is that in its analysis activities Cambridge Analytica is no different from thousands of other insight companies. Large corporations will often have hundreds of such vendors and partners, with whom personal data is exchanged, analysed, copied and disclosed daily. Under GDPR, only one of these partners needs to be rotten to expose the corporation to huge regulatory fines as well as serious reputational damage. And these are only the obviously risky relationships.

There will be many others that pose a less obvious compliance risk, given their activities. And then there are the sub-contractors, sub-sub-contractors and sub-sub-sub-contractors, all of whom may be involved in a data pipeline that exposes the end corporation to compliance risks.

Those risks are primarily twofold: the legitimacy of each actual processing activity; and whether each contract contains terms specifically required by GDPR’s Article 28 (amongst others).

The logical answer is that we will need much more rigour. All these relationships, their contracts and the associated personal data processing need to be mapped, assessed and tracked.

In practice this means reviewing every single third-party relationship, because it is not possible to know whether a relationship is potentially relevant to data protection without looking at it. And any one of them, no matter how small in commercial value, could be fatal.
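To make this concrete, here is a deliberately simplified Python sketch of why the review has to reach the whole chain. The registry structure and company names are invented for illustration; a real data map would be far larger and messier, but the shape of the problem – walking every sub-processing chain to its end – is the same.

```python
from collections import deque

# A toy registry of processing relationships: each party maps to the
# sub-contractors it passes personal data to. All names are invented.
sub_processors = {
    "OurCorp":     ["InsightCo", "MailerX"],
    "InsightCo":   ["CloudHost", "PanelVendor"],
    "PanelVendor": ["SurveyApp"],
    "MailerX":     [],
    "CloudHost":   [],
    "SurveyApp":   [],
}

def data_pipeline_parties(root: str) -> list[str]:
    """Breadth-first walk of the sub-processing chain from one controller.

    Every party returned sits somewhere in the data pipeline, and so its
    contract and processing activities need reviewing."""
    seen, queue, order = {root}, deque([root]), []
    while queue:
        for sub in sub_processors.get(queue.popleft(), []):
            if sub not in seen:
                seen.add(sub)
                order.append(sub)
                queue.append(sub)
    return order

print(data_pipeline_parties("OurCorp"))
# ['InsightCo', 'MailerX', 'CloudHost', 'PanelVendor', 'SurveyApp']
```

Even in this toy example, two direct vendors fan out into five reviewable relationships; in real supply chains that fan-out is precisely what makes manual review intractable.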

This is where automation and AI can help. That’s why we at ThoughtRiver have developed a GDPR solution that allows companies to automatically assess every single contractual relationship they have, identify the ones most likely to involve personal data processing and report on Article 28 compliance.
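To illustrate the shape of such an automated check – and this is a toy sketch only, not a description of ThoughtRiver’s product – one can screen contract text for the clause topics that Article 28(3) requires in any controller-processor contract. The keyword lists below are illustrative assumptions; real contract analysis needs proper legal NLP and human review.

```python
from dataclasses import dataclass, field

# Clause topics required by Article 28(3) GDPR in a processor contract
# (paraphrased). The keyword lists are illustrative assumptions only.
ARTICLE_28_TOPICS = {
    "documented instructions":  ["documented instructions"],
    "confidentiality":          ["confidentiality", "duty of confidence"],
    "security measures":        ["security measures", "article 32"],
    "sub-processor conditions": ["sub-processor", "prior authorisation"],
    "data subject assistance":  ["data subject"],
    "deletion or return":       ["delete or return", "deletion", "return of"],
    "audits and inspections":   ["audit", "inspection"],
}

@dataclass
class Contract:
    counterparty: str
    text: str
    missing_topics: list[str] = field(default_factory=list)

def screen_contract(contract: Contract) -> Contract:
    """Flag Article 28 clause topics never evidenced in the contract text.

    A crude keyword screen: anything flagged goes to a legal review
    queue rather than being treated as a definitive finding."""
    lowered = contract.text.lower()
    contract.missing_topics = [
        topic for topic, keywords in ARTICLE_28_TOPICS.items()
        if not any(k in lowered for k in keywords)
    ]
    return contract

# Usage with a fragment of invented contract text.
sample = Contract(
    counterparty="Acme Analytics Ltd",
    text="The Processor shall act only on documented instructions "
         "and shall ensure confidentiality of the personal data.",
)
print(screen_contract(sample).missing_topics)
# -> flags the five topics the fragment never mentions
```

The value of automation here is triage: thousands of contracts can be reduced to a short queue of relationships that genuinely need a lawyer’s eye.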

Since the predecessor of GDPR (Directive 95/46) was first agreed back in 1995, the world of communications and commerce has become dizzyingly more global, complex and fragmented. GDPR adds further legal rigour and penalties to those 1995 rules. Considered in the context of modern data flows, its already harsh requirements become harsher still.


This is a Sponsored Thought Leadership article by Tim Pullan, Founder and CEO of legal AI company ThoughtRiver. Contact: info@thoughtriver.com
