ICO tells data broker Experian to seek users' permission before selling their personal data
27th October 2020
In a landmark decision that shines a light on widespread data protection failings by the entire data broker industry, the UK data protection censor, the ICO, has taken enforcement action against Experian, based in part on a complaint made by Privacy International in 2018.
Privacy International (PI) welcomes the report from the UK Information Commissioner's Office (ICO) into three credit reference agencies (CRAs) which also operate as data brokers for direct marketing purposes. As a result, the
ICO has ordered the credit reference agency Experian to make fundamental changes to how it handles people's personal data within its offline direct marketing services. Experian now has until July 2021 to inform people that it holds their personal
data and how it intends to use it for marketing purposes. The ICO also requires Experian to stop using personal data derived from the credit referencing side of its business by January 2021.
The ICO investigation found widespread and systemic data protection failings across the sector, significant failures at each company, and significant invisible processing likely affecting millions of individuals in the UK. As the report underlines, between the CRAs, the data of almost every adult in the UK was, in some way, screened, traded, profiled, enriched, or enhanced to provide direct marketing services.
Moreover, the report notes that all three of the credit referencing agencies investigated were also using
profiling to generate new or previously unknown information about people. This can be extremely invasive and can also have discriminatory effects for individuals. Experian has said it intends to appeal the ICO decision, saying:
We believe the ICO's view goes beyond the legal requirements. This interpretation (of General Data Protection Regulation) also risks damaging the services that help consumers, thousands of small businesses and charities, particularly
as they try to recover from the COVID-19 crisis.
Data censor consults on its fines and sanctions regime for use after the Brexit transition period
4th October 2020
See proposed fines and sanctions [pdf].
This consultation closes on 12 November 2020.
ICO consultation on the draft Statutory guidance
We are running a consultation about an updated version of the Statutory guidance on how the ICO will exercise its data protection regulatory functions of information
notices, assessment notices, enforcement notices and penalty notices.
This guidance is a requirement of the Data Protection Act 2018 and only covers data protection law under that Act. Our other regulatory activity and the other
laws we regulate are covered in our Regulatory action policy (which is currently under review).
We welcome written responses from all interested parties, including members of the public, data controllers and those who represent them. Please answer the questions in the survey and also tell us whether you are responding on behalf of an organisation or in a personal capacity.
We will use your responses to this survey to help us understand the areas where
organisations and members of the public are seeking further clarity about information notices, assessment notices, enforcement notices and penalty notices. We will only use this information to inform the final version of this guidance and not to consider
any regulatory action.
We will publish this guidance after the UK has left the EU and we have therefore drafted it accordingly.
15th September 2020
A good summary of some of the unexpected consequences of internet censorship that will arise from ICO's Age Appropriate Design Code.
article from parentzone.org.uk
The ICO publishes its impossible-to-comply-with, business-suffocating Age Appropriate Design Code, with a 12-month implementation period running until 2nd September 2021
press release from ico.org.uk
Age Appropriate Design [pdf] from ico.org.uk
The ICO issued the code on 12 August 2020 and it will come into force on 2 September 2020 with a 12 month transition period.
Information Commissioner Elizabeth Denham writes:
Data sits at the heart of the digital services
children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered. Who's using the service? How are they using it? How frequently? Where from? On what device?
That information may then inform techniques used to persuade young people to spend more time using services, to shape the content they are encouraged to engage with, and to tailor the advertisements they see.
For all the benefits the
digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play.
This statutory code of practice looks to change that, not by seeking to protect children from the digital world,
but by protecting them within it.
This code is necessary.
This code will lead to changes that will help empower both adults and children.
One in five UK internet users are
children, but they are using an internet that was not designed for them. In our own research conducted to inform the direction of the code, we heard children describing data practices as nosy, rude and a bit freaky.
A national survey into people's biggest data protection concerns ranked children's privacy second only to cyber security. This mirrors similar sentiments in research by Ofcom and the London School of Economics.
This code will lead
to changes in practices that other countries are considering too.
It is rooted in the United Nations Convention on the Rights of the Child (UNCRC) that recognises the special safeguards children need in all aspects of their life.
Data protection law at the European level reflects this and provides its own additional safeguards for children.
The code is the first of its kind, but it reflects the global direction of travel with similar reform being
considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).
This code will lead to changes that UK Parliament wants.
Parliament and government ensured UK
data protection laws will truly transform the way we look after children online by requiring my office to introduce this statutory code of practice.
The code delivers on that mandate and requires information society services to
put the best interests of the child first when they are designing and developing apps, games, connected toys and websites that are likely to be accessed by them.
This code is achievable.
The code is
not a new law but it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services. It follows a thorough consultation process that included speaking with parents, children, schools,
children's campaign groups, developers, tech and gaming companies and online service providers.
Such conversations helped shape our code into effective, proportionate and achievable provisions.
Organisations should conform to the code and demonstrate that their services use children's data fairly and in compliance with data protection law.
The code is a set of 15 flexible standards -- they do not ban or specifically prescribe -- that provides built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child
are the primary consideration when designing and developing online services.
Settings must be high privacy by default (unless there's a compelling reason not to); only the minimum amount of personal data should be collected and
retained; children's data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data, or to weaken or turn off their privacy settings. The
code also addresses issues of parental control and profiling.
This code will make a difference.
Developers and those in the digital sector must act. We have allowed the maximum transition period of
12 months and will continue working with the industry.
We want coders, UX designers and system engineers to engage with these standards in their day-to-day work, and we're setting up a package of support to help.
But the next step must be a period of action and preparation. I believe companies will want to conform with the standards because they will want to demonstrate their commitment to always acting in the best interests of the child.
Those companies that do not make the required changes risk regulatory action.
What's more, they risk being left behind by those organisations that are keen to conform.
A generation from now, I believe we
will look back and find it peculiar that online services weren't always designed with children in mind.
When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second
nature as the need to ensure they eat healthily, get a good education or buckle up in the back of a car.
And while our code will never replace parental control and guidance, it will help people have greater confidence that their
children can safely learn, explore and play online.
There is no doubt that change is needed. The code is an important and significant part of that change.
The ICO's onerous internet censorship measure starts its parliamentary approval stage
|12th June 2020
ICO statement in response to the Government laying the Age Appropriate Design Code, also known as the Children's Code, before Parliament.
We welcome the news that Government has laid the Age Appropriate Design Code before
Parliament. It's a huge step towards protecting children online especially given the increased reliance on online services at home during COVID-19.
The code sets out 15 standards that relevant online services should meet to
protect children's privacy and is the result of wide-ranging consultation and engagement with stakeholders including the tech industry, campaigners, trade bodies and organisations.
We are now pulling together our existing work on
the benefits and the costs of the code to assess its impact. This will inform the discussions we have with businesses to help us develop a package of support to help them implement the code during the transition year.
The government seems a bit cagey about the timetable for introducing the internet censorship measures contained in ICO's Age Appropriate Design rules
19th May 2020
See Parliamentary transcript from hansard.parliament.uk
The Age Appropriate Design Code has been written by the Information Commissioner's Office (ICO) to inform websites what they must do to keep ICO internet censors at bay with regard to the government's interpretation of GDPR provisions, perhaps in the same way that the Crown Prosecution Service provides prosecution guidance as to how it interprets criminal law.
The Age Appropriate Design Code dictates how websites, and in particular social media, must make sure that they are not exploiting children's personal data. Perhaps the most immediate effect is that social media will have to allow a level of usage that simply does not require children to hand over personal data. Collecting more extensive personal data, say in the way that Facebook does, will require users to provide 'age assurance' that they are old enough to take such decisions wisely.
However adult users may not be so willing to age verify, and may in fact also appreciate an option to use such websites without handing over data
into the exploitative hands of social media companies.
So one suspects that US internet social media giants may not see Age Appropriate Design and the government's Online Harms model for internet censorship as being in their best commercial interests. And one suspects that US internet industry pushback may be exerting pressure on UK negotiators seeking a free trade agreement with the US.
Pure conjecture of course, but the government does seem very cagey
about its timetable for both the Age Appropriate Design Code and the Online Harms bill. Here is the latest parliamentary debate in the House of Lords very much on the subject of the government's timetable.
House of Lords
Hansard: Age-appropriate Design Code, 18 May 2020
Lord Stevenson of Balmacara:
To ask Her Majesty's Government when they intend to lay the regulation giving effect to the age-appropriate design code required under section 123 of the Data Protection Act 2018 before Parliament.
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Baroness Barran) (Con)
The age-appropriate design code will play an important role in protecting children's personal data online. The Government notified the final draft of the age-appropriate design code to the European Commission as part of
our obligations under the technical standards and regulations directive. The standstill period required under the directive has concluded. The Data Protection Act requires that the code is laid in Parliament as soon as is practicably possible.
Lord Stevenson of Balmacara:
I am delighted to hear that, my Lords, although no date has been given. The Government have a bit of ground to make up here, so perhaps it will not be
delayed too long. Does the Minister agree that the Covid-19 pandemic is a perfect storm for children and for young people's digital experience? More children are online for more time and are more reliant on digital technology. In light of that, more
action needs to be taken. Can she give us some information about when the Government will publish their final response to the consultation on the online harms White Paper, for example, and a date for when we are likely to see the draft Bill?
Baroness Barran (Con)
I spent some time this morning with a group of young people, in part discussing their experience online. The noble Lord is right that the
pandemic presents significant challenges, and they were clear that they wanted a safe space online as well as physical safe spaces. The Government share that aspiration. We expect to publish our response to the online harms consultation this autumn and
to introduce the legislation this Session.
Lord Clement-Jones (LD)
My Lords, I was very disappointed to see in the final version of the code that the section dealing with
age-appropriate application has been watered down to leave out reference to age-verification mechanisms. Is this because the age-verification provisions of the Digital Economy Act have been kicked into the long grass at the behest of the pornography
industry so that we will not have officially sanctioned age-verification tools available any time soon?
Baroness Barran (Con)
There is no intention to water down the code. Its content is
the responsibility of the Information Commissioner, who has engaged widely to develop the code, with a call for evidence and a full public consultation.
Lord Moynihan (Con)
My Lords, is my noble friend the Minister able to tell the House the results of the consultation process with the industry on possible ways to implement age verification online?
Baroness Barran (Con)
We believe that our online harms proposals will deliver a much higher level of protection for children, as is absolutely appropriate. We expect companies to use a proportionate range of tools, including age-assurance and
age-verification technologies, to prevent children accessing inappropriate content, whether that be via a website or social media.
The Earl of Erroll (CB)
May I too push the
Government to use the design code to cover the content of publicly accessible parts of pornographic websites, since the Government are not implementing Part 3 of the Digital Economy Act to protect children? Any online harms Act will be a long time in
becoming effective, and such sites are highly attractive to young teenagers.
Baroness Barran (Con)
We agree absolutely about the importance of protecting young children online and that is
why we are aiming to have the most ambitious online harms legislation in the world. My right honourable friend the Secretary of State and the Minister for Digital and Culture meet representatives of the industry regularly to urge them to improve their
actions in this area.
Lord Holmes of Richmond (Con)
My Lords, does my noble friend agree that the code represents a negotiation vis-à-vis the tech companies and thus there is
no reason for any delay in laying it before Parliament? Does she further agree that it should be laid before Parliament before 10 June to enable it to pass before the summer break? This would enable the Government to deliver on the claim that the UK is
the safest place on the planet to be online.
Baroness Barran (Con)
The negotiation is not just with the tech companies. We have ambitions to be not only a commercially attractive place for tech companies but a very safe place to be online, while ensuring that freedom of speech is upheld. The timing
of the laying of the code is dependent on discussions with the House authorities. As my noble friend is aware, there is a backlog of work which needs to be processed because of the impact of Covid-19.
19th May 2020
Information Commissioner's Office has effectively downed tools as a result of the pandemic, raising concerns about outstanding cases and ongoing privacy issues
article from wired.co.uk
The Data censor ICO has suspended its action against adtech citing coronavirus effects
8th May 2020
See article from ico.org.uk
The Information Commissioner's Office (ICO) has announced:
The ICO recently set out its regulatory approach during the COVID-19 pandemic, where we spoke about reassessing our priorities and resources.
Taking this into account we have made the decision to pause our investigation into real time bidding and the Adtech industry.
It is not our intention to put undue pressure on any industry at this time but our concerns about Adtech remain and we aim to restart our work in the coming months, when the time is right.
Newspapers realise that the ICO default child protection policy may be very popular with adults too, and so it may prove tough to get them to age verify as required for monetisation
24th January 2020
See article from pressgazette.co.uk
See ICO's FAQ discussing the code's applicability to news websites [pdf] from ico.org.uk
News websites will have to ask readers to verify their age or comply with a new 15-point code from the Information Commissioner's Office (ICO) designed to protect children's online data, ICO has confirmed.
Press campaign groups were hoping news
websites would be exempt from the new Age Appropriate Design Code so protecting their vital digital advertising revenues which are currently enhanced by extensive profiled advertising.
Applying the code as standard will mean websites putting
privacy settings to high and turning off default data profiling. If they want to continue enjoying revenues from behavioural advertising they will need to get adult readers to verify their age. In its 2019 draft the ICO had previously said such measures must be robust and that simply asking readers to declare their age would not be enough. But it has now confirmed to Press Gazette that for news websites that adhere to an editorial code, such self-declaration measures are likely to be sufficient.
This could mean news websites asking readers to enter their date of birth or tick a box confirming they are over 18. An ICO spokesperson said sites using these methods might also want to consider some low-level technical measures to discourage false declarations of age, but anything more privacy-intrusive is unlikely to be appropriate.
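As a purely illustrative sketch of how little is involved in such a self-declaration check (the function name, the date-of-birth field and the 18+ threshold here are assumptions for the example, not anything prescribed by the ICO):

```python
# Minimal sketch of a self-declared age check of the kind described above.
# Nothing here is mandated by the code; names and thresholds are illustrative.
from datetime import date

def is_self_declared_adult(date_of_birth: date, threshold_years: int = 18) -> bool:
    """Return True if the declared date of birth implies the user is an adult."""
    today = date.today()
    # Subtract a year if this year's birthday hasn't happened yet.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= threshold_years

# A site could then default to high-privacy settings unless this returns True,
# which is exactly why a simple tick box offers no real assurance at all.
```

As the ICO itself concedes, nothing stops a reader typing in a false date of birth, which is why the code describes such measures as suitable only where the risk is low.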
But Society of Editors executive director Ian Murray predicted the new demands may prove unpopular even at the simplest level. Asking visitors to confirm
their age [and hence submit to snooping and profiling] -- even a simple yes or no tick box -- could be a barrier to readers. The ICO has said it will work with the news media industry over a 12-month transition period to enable proportionate and
practical measures to be put in place for either scenario.
In fact ICO produced a separate document alongside the code to explain how it could impact news media, which it said would be allowed to apply the code in a risk-based and proportionate way.
Met Police to make facial recognition cameras a fully operational feature of its arsenal
See article from bbc.com
The Metropolitan Police has announced it will use live facial recognition cameras operationally for the first time on London streets.
Following earlier pilots in London and deployments by South Wales police, the cameras are due to be put into action
within a month. Cameras will be clearly signposted, covering a small, targeted area, and police officers will hand out leaflets about the facial recognition scanning, the Met said. Trials of the cameras have already taken place on 10 occasions in
locations such as Stratford's Westfield shopping centre and the West End of London. The Met said in these trials, 70% of wanted suspects in the system who walked past the cameras were identified, while only one in 1,000 people generated a false alert.
But an independent review of six of these deployments found that only eight out of 42 matches were verifiably correct.
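The Met's one-in-1,000 figure and the review's eight-of-42 figure measure different things, and both can hold at once. A quick calculation makes the distinction clear (the crowd size below is an assumption for illustration, not a figure from the Met or the review):

```python
# Illustrative only: the crowd size is an assumption; the alert counts are
# those quoted from the independent review of six deployments.
passers_by = 42_000      # assumed number of people scanned across deployments
alerts = 42              # matches flagged by the system
correct_alerts = 8       # matches found verifiably correct by the review

false_alerts = alerts - correct_alerts
false_alert_rate = false_alerts / passers_by  # the Met's preferred framing
precision = correct_alerts / alerts           # the review's framing

print(f"False alert rate: {false_alert_rate:.3%} (about 1 in {passers_by // false_alerts:,})")
print(f"Precision of alerts: {precision:.0%}")
```

A system can therefore truthfully claim a tiny false alert rate across a large crowd while most of the alerts it actually raises turn out to be wrong.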
Over the past four years, as the Met has trialled facial recognition, opposition to its use has intensified, led in the UK by
campaign groups Liberty and Big Brother Watch.
The force also believes a recent High Court judgment, which said South Wales Police did not breach the rights of a man whose face had been scanned by a camera, gives it some legal cover. The case is
heading for the Court of Appeal. But the Met is pressing on, convinced that the public at large will support its efforts to use facial recognition to track down serious offenders.
Last year, the Met admitted it supplied images for a database
carrying out facial recognition scans on a privately owned estate in King's Cross, after initially denying involvement.
Update: Censored whilst claiming to be uncensored
24th January 2020. See
article from ico.org.uk
It seems to be the normal response of the Information Commissioner's Office to turn a blind eye to the actual serious exploitation of people's personal data whilst focusing heavily on generating excessive quantities of red tape rules requiring small players to be ultra-protective of personal data to the point of strangling their businesses and livelihoods. And, just like for unconsented website tracking and profiling by the online advertising industry, the ICO will monitor and observe and comment again later in the year:
In October 2019 we concluded our investigation into how police use live facial recognition technology (LFR) in public places. Our investigation found there was public support for police use of LFR but also that there needed to be improvements in how
police authorised and deployed the technology if it was to retain public confidence and address privacy concerns. We set out our views in a formal Opinion for police forces.
The Metropolitan Police Service (MPS) has incorporated
the advice from our Opinion into its planning and preparation for future LFR use. Our Opinion acknowledges that an appropriately governed, targeted and intelligence-led deployment of LFR may meet the threshold of strict necessity for law enforcement
purposes. We have received assurances from the MPS that it is considering the impact of this technology and is taking steps to reduce intrusion and comply with the requirements of data protection legislation. We expect to receive further information from
the MPS regarding this matter in forthcoming days. The MPS has committed to us that it will review each deployment, and the ICO will continue to observe and monitor the arrangements for, and effectiveness of, its use.
This is an
important new technology with potentially significant privacy implications for UK citizens. We reiterate our call for Government to introduce a statutory and binding code of practice for LFR as a matter of priority. The code will ensure consistency in
how police forces use this technology and improve clarity and foreseeability in its use for the public and police officers alike. We believe it's important for government to work with regulators, law enforcement, technology providers and communities
to support the code.
Facial recognition remains a high priority for the ICO and the public. We have several ongoing investigations. We will be publishing more about its use by the private sector later this year.
Update: Big Brother Watch Petition
24th January 2020. Sign the petition.
To: Priti Patel, Home Secretary and Cressida Dick, Commissioner of the Metropolitan Police
Urgently stop the Metropolitan Police using live facial recognition surveillance.
Why is this important?
The Metropolitan Police has announced it will use live facial recognition across London, despite an independent review finding its previous trials likely unlawful and over 80% inaccurate. The Met is the largest police force in the
democratic world to roll out this dangerously authoritarian surveillance. This represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK - and it sets a dangerous precedent worldwide. We urge the Home
Secretary and Met Commissioner to stop it now.
ICO backs off a little from an age-gated internet but imposes masses of red tape for any website that is likely to be accessed by under-18s
23rd January 2020
22nd January 2020. See
press release from ico.org.uk
Age Appropriate Design [pdf] from ico.org.uk
The Information Commissioner's Office (ICO) has just published its Age Appropriate Design Code:
The draft was published last year and was opened to a public consultation which came down heavily against ICO's demands that website users should be
age verified so that the websites could tailor data protection to the age of the user.
Well in this final release ICO has backed off from requiring age verification for everything, and instead suggested something less onerous called age
'assurance'. The idea seems to be that age can be ascertained from behaviour, eg if a YouTube user watches Peppa Pig all day then one can assume that they are of primary school age.
However this does seem to lead to loads of contradictions, eg age can be assessed by profiling users' behaviour on the site, but the site isn't allowed to profile people until they are old enough to agree to this. The ICO recognises this contradiction but doesn't really help much with a solution in practice.
ICO defines the code as only applying to sites likely to be accessed by children (ie websites appealing to all ages are considered caught by the code even though they are not specifically for children). On a wider point, the code will be very challenging for the monetisation methods of general websites. The code requires websites to default to no profiling, no geo-location, no in-game sales etc. It assumes that adults will identify themselves and so enable all these things to happen. However it may well be that adults will quite like this default setting and end up not opting for more, leaving the websites without income.
Note that these rules are in the UK's implementation of GDPR law and are not actually in the European regulation. So they are covered by statute, but only in the UK. European competitors have no equivalent requirements.
The ICO press release reads:
Today the Information Commissioner's Office has published its final Age Appropriate Design Code
-- a set of 15 standards that online services should meet to protect children's privacy.
The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected
toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data.
The code will require digital services to automatically
provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website.
That means privacy settings should be set to high by default and nudge techniques should not be used to
encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up
targeted content should be switched off by default too.
Elizabeth Denham, Information Commissioner, said:
"Personal data often drives the content that our children are exposed to -- what
they like, what they search for, when they log on and off and even how they are feeling.
"In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing
online services do so with the best interests of children in mind. Children's privacy must not be traded in the chase for profit."
The code says that the best interests of the child should be a primary
consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.
"One in five internet users in the UK is a child, but they are using an internet that was not designed for them.
"There are laws to protect children in the real world -- film ratings, car seats, age
restrictions on drinking and smoking. We need our laws to protect children in the digital world too.
"In a generation from now, we will look back and find it astonishing that online services weren't always designed with
children in mind."
The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of
State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn 2021.
This version of the code is the result of wide-ranging consultation and engagement.
The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings
with individual organisations, trade bodies, industry and sector representatives, and campaigners.
As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.
The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).
Update: The legals
23rd January 2020. See article.
Schedule
The code now has to be laid before parliament for approval for a period of 40 sitting days -- with the ICO saying it will come into force 21 days after that, assuming no objections. Then there's a further 12
month transition period after it comes into force.
Obligation or codes of practice?
Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal explained:
This is not, and will not be, 'law'. It
is just a code of practice. It shows the direction of the ICO's thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action but it's not something with which an organisation needs to comply as such. They need to
comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.
Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other
applicable laws. The obligation to comply with those laws does not change because of today's code of practice. Rather, the code of practice shows the ICO's thinking on what compliance might look like (and, possibly, gold-plates some of the requirements of the law too).
Comment: ICO pushes ahead with age gates
23rd January 2020. See
article from openrightsgroup.org
The ICO's Age Appropriate Design Code released today includes changes which lessen the risk of widespread age gates, but retains strong incentives towards greater age gating of content.
Over 280 ORG supporters wrote to the ICO
about the previous draft code, to express concerns with compulsory age checks for websites, which could lead to restrictions on content.
Under the code, companies must establish the age of users, or restrict their use of data. ORG is concerned that this will mean that adults will only be able to access websites once age verified, creating severe restrictions on access to information.
The ICO's changes to the Code in response to ORG's concerns suggest that different
strategies to establish age may be used, attempting to reduce the risk of forcing compulsory age verification of users.
However, the ICO has not published any assessment to understand whether these strategies are practical or what
their actual impact would be.
The Code could easily lead to Age Verification through the backdoor as it creates the threat of fines if sites have not established the age of their users.
While the Code has
many useful ideas and important protections for children, this should not come at the cost of pushing all websites to undergo age verification of users. Age Verification could extend through social media, games and news publications.
Jim Killock, Executive Director of Open Rights Group said:
The ICO has made some useful changes to their code, which make it clear that age verification is not the only method to determine age.
However, the ICO don't know how their code will change adults' access to content in practice. The new code published today does not include an Impact Assessment. Parliament must produce one and assess the implications for free expression before agreeing to the code.
Age Verification demands could become a barrier to adults reaching legal content, including news, opinion and social media. This would severely impact free expression.
The public and Parliament deserve a thorough discussion of the implications, rather than sneaking in a change via parliamentary rubber-stamping with potentially huge implications for the way we access Internet content.
ICO takes no immediate action against the most blatant examples of people's most personal data being exploited without consent, i.e. profiled advertising
|23rd January 2020 |
17th January 2020. See
article from ico.org.uk
Blatant abuse of people's private data has become firmly entrenched in the economic model of the free internet ever since Google recognised the value of analysing what people are searching for.
Now vast swathes of the internet are handsomely
funded by the exploitation of people's personal data. But that deep entrenchment clearly makes the issue a bit difficult to put right without bankrupting half of the internet that has come to rely on the process.
The EU hasn't helped with its
ludicrous idea of focusing its laws on companies having to obtain people's consent to have their data exploited. A more practical lawmaker would have simply banned the abuse of personal data without bothering with the silly consent games. But the EU
seems prone to being lobbied and does not often come up with the most obvious solution.
Anyway, enforcement of the EU's law is certainly causing issues for the internet censors at the UK's ICO.
The ICO warned the adtech industry six months ago that its approach is illegal, but has now announced that it will not be taking any action against the data abuse yet, as the industry has made a few noises about improving a bit over the coming months. Simon McDougall, ICO Executive Director of
Technology and Innovation has written:
The adtech real time bidding (RTB) industry is complex, involving thousands of companies in the UK alone. Many different actors and service providers sit between the advertisers
buying online advertising space, and the publishers selling it.
There is a significant lack of transparency due to the nature of the supply chain and the role different actors play. Our June 2019 report identified a range of
issues. We are confident that any organisation that has not properly addressed these issues risks operating in breach of data protection law.
This is a systemic problem that requires organisations to take ownership for their own
data processing, and for industry to collectively reform RTB. We gave industry six months to work on the points we raised, and offered to continue to engage with stakeholders. Two key organisations in the industry are starting to make the changes needed.
The Internet Advertising Bureau (IAB) UK has agreed a range of principles that align with our concerns, and is developing its own guidance for organisations on security, data minimisation, and data retention, as well as UK-focused
guidance on the content taxonomy. It will also educate the industry on special category data and cookie requirements, and continue work on some specific areas of detail. We will continue to engage with IAB UK to ensure these proposals are executed in a timely manner.
Separately, Google will remove content categories, and improve its process for auditing counterparties. It has also recently proposed improvements to its Chrome browser, including phasing out support for third party
cookies within the next two years. We are encouraged by this, and will continue to look at the changes Google has proposed.
Finally, we have also received commitments from other UK advertising trade bodies to produce guidance for their members.
If these measures are fully implemented they will result in real improvements to the handling of personal data within the adtech industry. We will continue to engage with industry where we think engagement will
deliver the most effective outcome for data subjects.
Comment: Data regulator ICO fails to enforce the law
18th January 2020. See
article from openrightsgroup.org
Responding to ICO's announcement today that the regulator is taking minimal steps to enforce the law against massive data breaches taking place in the online ad industry through Real-Time Bidding, complainants Jim Killock and Michael Veale have
called on the regulator to enforce the law.
The complainants are considering taking legal action against the regulator. Legal action could be taken against the ICO for failure to enforce, or against the companies themselves for
their breaches of Data Protection law.
The Real-Time Bidding data breach at the heart of the RTB market exposes every person in the UK to mass profiling, and the attendant risks of manipulation and discrimination.
As the evidence submitted by the complainants notes, the real-time bidding systems designed by Google and the IAB broadcast what virtually all Internet users read, watch, and listen to online to thousands of companies, without
protection of the data once broadcast. Now, sixteen months after the initial complaint, the ICO has failed to act.
Jim Killock, Executive Director of the Open Rights Group said:
The ICO is a
regulator, so needs to enforce the law. It appears to be accepting that unlawful and dangerous sharing of personal data can continue, so long as 'improvements' are gradually made, with no actual date for compliance.
Last year the
ICO gave a deadline for an industry response to our complaints. Now the ICO is falling into the trap set by industry, of accepting incremental but minimal changes that fail to deliver individuals the control of their personal data that they are legally entitled to.
The ICO must take enforcement action against IAB members.
We are considering our position, including whether to take legal action against the regulator for failing to act, or individual
companies for their breach of data protection law.
Dr Michael Veale said:
When an industry is premised on and profits from clear and entrenched illegality that breaches individuals'
fundamental rights, engagement is not a suitable remedy. The ICO cannot continue to look back at its past precedents for enforcement action, because it is exactly that timid approach that has led us to where we are now.
Ravi Naik, solicitor acting for the complainants, said:
There is no dispute about the underlying illegality at the heart of RTB that our clients have complained about. The ICO have agreed with those concerns, yet the companies have not taken adequate steps to address those concerns. Nevertheless, the ICO has failed to take the direct enforcement action needed to remedy these breaches.
Regulatory ambivalence cannot continue.
The ICO is not a silo but is subject to judicial oversight. Indeed, the ICO's failure to act raises a question about the adequacy of the UK Data Protection Act. Is there proper judicial oversight of the ICO? This is a critical question after Brexit, when
the UK needs to agree data transfer arrangements with the EU that cover all industries.
Dr. Johnny Ryan of Brave said:
The RTB system broadcasts what everyone is reading and
watching online, hundreds of billions of times a day, to thousands of companies. It is by far the largest data breach ever recorded. The risks are profound. Brave will support ORG to ensure that the ICO discharges its responsibilities.
Jim Killock and Michael Veale complained about the Adtech industry and Real Time Bidding to the UK's ICO in September 2018. Johnny Ryan of Brave submitted a parallel complaint against Google about their Adtech system to the Irish Data Protection Commission.
Update: Advertising industry will introduce a 'gold standard 2.0' for privacy towards the end of 2020
23rd January 2020. See
article from campaignlive.co.uk
The Internet Advertising Bureau UK has launched a new version of what it calls its Gold Standard certification process that will be independently audited by a third party. In a move to address ongoing privacy concerns with the digital supply chain,
the IAB's Gold Standard 2.0 will incorporate the Transparency and Consent Framework, a widely promoted industry standard for online advertising.
The new process will be introduced in the fourth quarter after an industry consultation to agree on the
compliance criteria for incorporating the TCF.