
Facebook to pay £500k ICO fine

by Mark Rowe

Facebook has agreed to pay a £500,000 fine handed down by the UK data protection regulator, the Information Commissioner’s Office (ICO). To recap, in October 2018 the ICO issued a monetary penalty notice under section 55A of the Data Protection Act 1998 against Facebook Inc and Facebook Ireland Limited. The £500,000 was the maximum allowed under the 1998 Act, which applied because the regulator began its investigation into the misuse of personal data in political campaigns in 2017, before the Data Protection Act 2018 came into force. Under the 2018 Act, penalties can be far larger.

Facebook and the ICO have agreed to withdraw their respective appeals. Facebook has agreed to pay the £500,000 but has made no admission of liability in relation to the penalty notice. The fine is not kept by the ICO but is paid to HM Treasury. Facebook can retain documents disclosed by the ICO during the appeal for other purposes, including furthering its own investigation into Cambridge Analytica. Parts of this investigation had previously been put on hold at the ICO’s direction and can now resume, the regulator says.

Harry Kinmonth, Director and Associate General Counsel at Facebook, said that the social media firm looks forward ‘to continuing to cooperate with the ICO’s wider and ongoing investigation into the use of data analytics for political purposes’.

ICO deputy commissioner James Dipple-Johnstone said: “The ICO’s main concern was that UK citizen data was exposed to a serious risk of harm. Protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals, but also as we now know, for the preservation of a strong democracy. We are pleased to hear that Facebook has taken, and will continue to take, significant steps to comply with the fundamental principles of data protection.”

Comments

Javvad Malik, security awareness advocate at KnowBe4, says: “Companies such as Facebook hold an immense amount of data on individuals. This data can be used for many purposes, and targeted political ad campaigns using this information are akin to a large-scale tailored spear-phishing attack. That is to say, personal information of individuals is being used to coerce them into acting on information which may or may not be in their best interest. Therefore, it’s encouraging to see the ICO recognising the impact of social engineering – albeit not in those terms – and the devastating impact it can have on individuals.”

Jason Tooley, Chief Revenue Officer at Veridium, says: “There is increasing concern in the community that regulators such as the ICO will take too heavy-handed an approach to regulating the technology, and we must absolutely ensure innovation is not being stifled or stopped. It’s in the public interest for police forces to have access to innovative technology such as biometrics to deliver better services and safeguard our streets.”

“Police forces are under increasing cost pressures, with direct government funding falling 30pc in the last eight years, and as a result biometrics are making their way into government policy to improve the quality and efficiency of policing whilst reducing costs. The use of biometrics can support identity verification on demand and at scale, as has been seen abroad, where officers leverage widely adopted consumer technology.”

“However, it is imperative that police forces take a strategic approach as they trial biometric technologies, without giving precedence to a single biometric approach. A strategic approach that draws on biometric techniques with greater public acceptance, such as digital fingerprinting, will secure a higher level of public consent because of fingerprinting’s maturity as an identity verification technique. Given the rapid rate of innovation in the field, an open approach that enables the police to use the right biometric technique for the right scenario, taking varying levels of maturity into account, will accelerate the benefits associated with digital policing.”

“If the police adopt a transparent policy on how biometric data is interpreted, stored and used, the public’s data privacy concerns can be greatly alleviated, which will in turn trigger consent and wider acceptance. Managing expectations around biometrics and how the technology will be used is crucial, especially in surveillance use cases. Concerns over data privacy can also be eliminated if sensitive biometric data is stored in the correct way, using sophisticated encryption methods such as sharding or visual cryptography, which renders the sensitive data unusable to a hacker.”
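On the sharding point in that last comment, the underlying idea can be illustrated with a simple secret-splitting sketch: the stored biometric template is divided into shares, none of which is usable on its own. The minimal Python sketch below assumes a two-share XOR split; the function names and the two-share scheme are illustrative only and do not describe any particular vendor’s implementation (production systems would more likely use threshold secret sharing or visual cryptography proper).

import os

def split_into_shares(template: bytes) -> tuple[bytes, bytes]:
    # Split a biometric template into two shares using XOR secret splitting.
    # Each share on its own is indistinguishable from random noise;
    # both shares are needed to reconstruct the original template.
    pad = os.urandom(len(template))                        # random share, same length as the template
    masked = bytes(t ^ p for t, p in zip(template, pad))   # template XOR pad
    return pad, masked

def reconstruct(share_a: bytes, share_b: bytes) -> bytes:
    # XOR the two shares back together to recover the template.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

if __name__ == "__main__":
    template = b"example-fingerprint-template"      # stand-in for real biometric data
    share_a, share_b = split_into_shares(template)  # store these on separate systems
    assert reconstruct(share_a, share_b) == template

Held on separate systems, a single compromised share yields nothing useful to an attacker, which is the property the comment relies on.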
