I'm very pleased that Legal Cheek & BARBI selected this essay for the 2nd place prize in their 2018 Privacy Law Essay Competition.
Introduction
When a major scandal emerges, the
trope of ‘something must be done’ usually arrives shortly after. Thankfully,
something is already being done. The
Data Protection Bill is currently making its way through the legislative
process and, at the time of writing, is about to enter the report stage in the House of Commons on the 9th May 2018. However, the Bill is not without issue.
The Cambridge Analytica scandal
In 2014, Cambridge University
researcher Aleksandr Kogan, in collaboration with Cambridge Analytica, used an app called ‘ThisIsYourDigitalLife’ to pay around 270,000 Facebook users to take
a personality test. With their consent, this harvested the data of those users.
However, as well as harvesting the data of the paid users, the app harvested
the data of their friends too, resulting in the collection of up to 87 million
individuals’ data. Cambridge Analytica was later hired by the pro-Brexit campaign
group Leave.EU in 2015 and by Trump’s presidential campaign in 2016. It is
currently unclear to what extent this data was used in these campaigns, but a
whistle-blower has attested that it was used extensively.
Can Cambridge Analytica’s behaviour be repeated?
Prior to 2015, Facebook allowed
app developers to collect user data from Facebook using their own apps, which
permitted the collection of friends’ data to improve user experience in the
app, but not for it to be used for advertising or to be sold on. Facebook
changed these rules after 2015, removing this ability. Cambridge Analytica
broke the user agreement by misusing the data it collected for advertising. That is not to say that what was done cannot be repeated, only that it cannot be repeated in the same way.
Data can still be harvested via an app in largely the same manner, though the collection can no longer be so vast from such a small pool of participants. Data can be bought, but again, it is unlikely to be of such a scale. Even so, stricter data protection laws are
imminent with the advent of the European General Data Protection Regulation (GDPR)
and the Data Protection Bill.
The legislative framework
The Data Protection Bill will implement the vast majority of the GDPR. The Bill contains a plethora of changes, focused on providing greater access to the data that companies hold, as well as giving the Information Commissioner’s Office greater powers to uphold information rights.
Some notable changes include the requirement
for businesses to obtain a ‘positive opt-in’, making clear that consent is being given, whenever they intend to rely on consent to lawfully use a person’s information. The £10 charge for a Subject Access Request is abolished, making such requests free, and there will be greater powers to request the erasure of data. The GDPR also increases
the severity of fines for organisations that mishandle an individual’s data: the maximum fine rises from £500,000 to up to €20 million or 4% of a firm’s global annual turnover (whichever is greater).
In general, the stricter
regulation of data protection from the GDPR can be seen as a victory for the
privacy-conscious social media user. What is troubling at present, though, is not the prospect of what might enter the Bill, but what it already contains. Some parts of the Data Protection Bill have caused concern, most notably clause 8(e).
Data Protection Bill: The problem with clause 8(e)
This subtly concerning clause emerged in the draft Bill after an amendment was agreed. Clause 8(e) states that an ‘activity that supports or promotes democratic engagement’ is an example of processing of personal data that may be treated as necessary, and therefore lawful, on public interest grounds. The clause gives good reason for concern.
First, the provision’s extremely wide scope of activity is capable of covering the campaigning tactics of Cambridge Analytica. This has the effect of legitimising such behaviour, and potentially turns the clause into an enabling provision.
It would not be outrageous to suggest that the amendment appears to have been inserted out of concern that politicians would otherwise hamstring themselves when processing data during campaigns. The ease with which data can be used tactically during campaigns will no doubt be of great concern to politicians. It is easy to see this as a politically self-serving amendment, rather than one in the spirit of the GDPR.
Furthermore, Recital 45 of the GDPR does not afford the extremely wide ambit of all democratic activities that clause 8(e) does. Bundling clause 8(e) with clauses 8(a)–(d) lends it the enormous scope those deliberately wide clauses are intended to cater for. Clause 8(e), however, is far more susceptible to abuse, which could well arise in the form of Cambridge Analytica’s strategies.
Second, as clause 8(e) applies to any data controller, the amendment largely conflicts with the regime for processing political opinions, a special category of data under Article 9 of the GDPR, which the Bill reserves for registered political parties as opposed to any data controller. The clause consequently places a much looser constraint on the processing of such data.
More troublesome still is that, whilst clause 8(e) of the Data Protection Bill was inserted on 13th March 2018 and the exposé of Cambridge Analytica was published a matter of days later on 17th March, successive hearings have shown no indication that clause 8(e) will be amended.
Concluding remarks
Whilst the Data Protection Bill
introduces a largely welcome arsenal of tools to ensure data is handled properly, clause 8(e) stands as a worrying exception. Its primary issue is its
scope, and the potential for it to become a form of protection for the tactics
of Cambridge Analytica. The scope should be dramatically reduced, and with
ample opportunity for this change to be implemented, there is little excuse for
it to be left in its current form.