It’s the hot story right now in Europe…
…no, we’re not talking about the news that France just dumped neighbours Belgium out of the World Series with a 1-0 victory. [Surely you mean the World Cup? – Ed.]
Facebook, along with Cambridge Analytica, has been the focus of an investigation by the UK's Information Commissioner's Office (ICO) since February, when evidence emerged that an app had been used to harvest the data of 50 million Facebook users across the world – a figure since revised upwards to 87 million.
The ICO’s investigation concluded that Facebook contravened the law by failing to safeguard people’s information. It also found that the company failed to be transparent about how people’s data was harvested by others.
Cambridge Analytica (CA) – in case you missed the saga as it uncoiled itself earlier this year – was a web analytics company started by a group of researchers with connections to Cambridge University in the UK.
Put web analytics together with the word Cambridge and you get the cool-sounding name Cambridge Analytica.
What seems to have started as some sort of academic research project soon morphed into a commercial enterprise that allowed participants to take psychometric tests via a Facebook app.
(Facebook apps are essentially plugins for the Facebook platform rather than applications in the traditional sense.)
The sneaky? bait-and-switchy? sleight-of-hand? devious? obvious-with-hindsight? why was anyone surprised? [delete as inappropriate] trick employed in the CA app was that the app explicitly asked you to give it access to account data that wasn’t available by default.
Notably, Cambridge Analytica acquired access to your profile, including a list of all your friends.
That means the app not only learned a lot about you, but could also associate your own “psychometric profile” with your friends – even if they disapproved of psychometric tests; even if they’d never have agreed themselves; indeed, even if they’d never heard of Cambridge Analytica.
As we explained back in March 2018:
You might well question how 270,000 people signing up for a Facebook personality quiz blossomed into a potential data breach affecting 50 million users [now 87 million users] – nearly 25% of potential US voters.
[…] The app scraped not just test-takers’ private profile data, but also that of their friends. Facebook didn’t disallow such behavior from apps at the time, but such data harvesting was allowed only to improve user experience in the app, not to be sold or used for advertising.
Facebook ultimately kicked CA off its platform, but not before a global brouhaha had erupted over whether the social networking giant ought to have done more to make sure that app developers stuck to both the letter and the spirit of Facebook’s own rules.
The ICO certainly seems to think Facebook could have, and should have, done more to stop Cambridge Analytica getting away with its industrial-scale data harvesting – thus the fine.
Would the ICO have hit Facebook harder if it could?
The ICO’s own announcement makes a point of mentioning that, even though current GDPR rules could in theory have led to a very much bigger fine, “due to the timing of certain incidents in this investigation, civil monetary penalties have to be issued under the previous legislation, the Data Protection Act 1998.”
The maximum financial penalty in civil cases under pre-GDPR laws is £500,000 – and that’s the amount the ICO chose.
Will Facebook pay up?
The ICO’s “fine” is currently only a Notice of Intent, so Facebook still has the right of reply.
Will we all be more careful with apps and plugins in future?
Let’s hope so – remember our simple rule: IF IN DOUBT, DON’T GIVE IT OUT.
ARE YOU COLLECTING TOO MUCH? LEARN MORE: LISTEN TO OUR GDPR PODCAST