But it can't possibly let you read them... because of data protection, y'know
23rd November 2019
See article from ico.org.uk
The Information Commissioner's Office (ICO) earlier in the year presented draft internet censorship rules targeted at the commendable aim of protecting the personal data of younger website users. These rules are legally enforceable under the EU GDPR and are collectively known as the Age Appropriate Design Code.

The ICO originally proposed that website designers should consider several age ranges of their users. The youngest users should be presented with no opportunity to reveal their personal data, and the websites could then relent a little on the strictness of the rules as users get older. It all sounds good at first read... until one considers exactly how websites are to know how old their users are. And of course the ICO proposed age verification (AV) to prove that people are old enough for the tier of data protection being applied.

The ICO did not think very hard about the bizarre contradiction that AV requires people to hand over enough data to give identity thieves an orgasm. So the ICO was going to ask people to hand over their most sensitive ID to any website that asks... in the name of the better protection of the data that they have just handed over anyway. The draft rules were ridiculous, requiring even a small innocent site with a shopping trolley to implement AV before allowing people to type in their details.

Well, the internet industry strongly pointed out the impracticality of the ICO's nonsense ideas. And indeed the ICO released a blog post and made a few comments suggesting that it would be scaling back on its universal AV requirements. The final rules were delivered to the government on schedule on 23rd November 2019. The industry is surely very keen to know whether the ICO has retreated from its stance, but the ICO has now announced that publication will be delayed until the next government is in place. It sounds as if its ideas may still be a little controversial, and it needs to hide behind a government minister before announcing the new rules.
21st November 2019
The AdTech showdown is coming, but will the ICO bite? See article from openrightsgroup.org
ICO reports on adtech snooping on and profiling internet users without their consent
25th June 2019
See article from ico.org.uk
See report [pdf] from ico.org.uk
In recent months we've been reviewing how personal data is used in real time bidding (RTB) in programmatic advertising, engaging with key stakeholders directly and via our fact-finding forum event to understand the views and concerns of those involved. We're publishing our Update report into adtech and real time bidding, which summarises our findings so far.

We have prioritised two areas: the processing of special category data, and issues caused by relying solely on contracts for data sharing across the supply chain. Under data protection law, using people's sensitive personal data to serve adverts requires their explicit consent, which is not happening right now. Sharing people's data with potentially hundreds of companies, without properly assessing and addressing the risk of these counterparties, raises questions around the security and retention of this data.

We recognise the importance of advertising to participants in this commercially sensitive ecosystem, and have purposely adopted a measured and iterative approach to our review of the industry as a whole so that we can observe the market's reaction and adapt our thinking. However, we want to see change in how things are done. We'll be spending the next six months continuing to engage with the sector, which will give the industry the chance to start making changes based on the conclusions we've come to so far.

Open Rights Group responds, 25th June 2019. See article from openrightsgroup.org

The ICO has responded to a complaint brought by Jim Killock and Dr Michael Veale about Europe's 12 billion euro real-time bidding adtech industry. Killock and Veale are now calling on the ICO to take action against companies that are processing data unlawfully.
The ICO has agreed in substance with the complainants' points about the insecurity of adtech data sharing. In particular, the ICO states that:

- Processing of non-special category data is taking place unlawfully at the point of collection
- [The ICO has] little confidence that the risks associated with RTB have been fully assessed and mitigated
- Individuals have no guarantees about the security of their personal data within the ecosystem
However, the ICO is proceeding very cautiously and slowly, and not insisting on immediate changes, despite the massive scale of the data breach.

Jim Killock said: The ICO's conclusions are strong and very welcome, but we are worried about the slow pace of action and investigation. The ICO has confirmed massive illegality on the part of the adtech industry. They should be insisting on remedies, and fast.

Dr Michael Veale said: The ICO has clearly indicated that the sector operates outside the law, and that there is no evidence the industry will correct itself voluntarily. As long as it continues to do so, it undermines the operation and the credibility of the GDPR in all other sectors. Action, not words, will make a difference -- and the ICO needs to act now.
The ICO concludes:
Overall, in the ICO's view the adtech industry appears immature in its understanding of data protection requirements. Whilst the automated delivery of ad impressions is here to stay, we have general, systemic concerns around the level of compliance of RTB:

- Processing of non-special category data is taking place unlawfully at the point of collection due to the perception that legitimate interests can be used for placing and/or reading a cookie or other technology (rather than obtaining the consent PECR requires).
- Any processing of special category data is taking place unlawfully as explicit consent is not being collected (and no other condition applies). In general, processing such data requires more protection as it brings an increased potential for harm to individuals.
- Even if an argument could be made for reliance on legitimate interests, participants within the ecosystem are unable to demonstrate that they have properly carried out the legitimate interests tests and implemented appropriate safeguards.
- There appears to be a lack of understanding of, and potentially compliance with, the DPIA requirements of data protection law more broadly (and specifically as regards the ICO's Article 35(4) list). We therefore have little confidence that the risks associated with RTB have been fully assessed and mitigated.
- Privacy information provided to individuals lacks clarity whilst also being overly complex. The TCF and Authorized Buyers frameworks are insufficient to ensure transparency and fair processing of the personal data in question and therefore also insufficient to provide for free and informed consent, with attendant implications for PECR compliance.
- The profiles created about individuals are extremely detailed and are repeatedly shared among hundreds of organisations for any one bid request, all without the individuals' knowledge.
- Thousands of organisations are processing billions of bid requests in the UK each week with (at best) inconsistent application of adequate technical and organisational measures to secure the data in transit and at rest, and with little or no consideration as to the requirements of data protection law about international transfers of personal data.
- There are similar inconsistencies about the application of data minimisation and retention controls.
- Individuals have no guarantees about the security of their personal data within the ecosystem.
6th June 2019
Foreign websites will block UK users altogether rather than be compelled to invest time and money into a nigh-impossible compliance process. By Heather Burns. See article from webdevlaw.uk
Internet companies slam the data censor's disgraceful proposal to require age verification for large swathes of the internet
5th June 2019
From the Financial Times
The Information Commissioner's Office has for some bizarre reason been given immense powers to censor the internet. And in an early opportunity to exert its power it has proposed a 'regulation' that would require strict age verification for nearly all mainstream websites that may have a few child readers and some material that may be deemed harmful for very young children, e.g. news websites that may have glamour articles or perhaps violent news images. In a mockery of 'data protection', such websites would have to implement strict age verification requiring people to hand over identity data to most of the websites in the world.

Unsurprisingly much of the internet content industry is unimpressed. A six week consultation on the new censorship rules has just closed, and according to the Financial Times:

Companies and industry groups have loudly pushed back on the plans, cautioning that they could unintentionally quash start-ups and endanger people's personal data. Google and Facebook are also expected to submit critical responses to the consultation.

Tim Scott, head of policy and public affairs at Ukie, the games industry body, said it was an inherent contradiction that the ICO would require individuals to give away their personal data to every digital service.

Dom Hallas, executive director at the Coalition for a Digital Economy (Coadec), which represents digital start-ups in the UK, said the proposals would result in a withdrawal of online services for under-18s by smaller companies:

The code is seen as especially onerous because it would require companies to provide up to six different versions of their websites to serve different age groups of children under 18. This means an internet for kids largely designed by tech giants who can afford to build two completely different products. A child could access YouTube Kids, but not a start-up competitor.

Stephen Woodford, chief executive of the Advertising Association -- which represents companies including Amazon, Sky, Twitter and Microsoft -- said the ICO needed to conduct a full technical and economic impact study, as well as a feasibility study. He said the changes would have a wide and unintended negative impact on the online advertising ecosystem, reducing spend from advertisers and so revenue for many areas of the UK media.

An ICO spokesperson said: We are aware of various industry concerns about the code. We'll be considering all the responses we've had, as well as engaging further where necessary, once the consultation has finished.
Pointing out that it is crazy for the data protection police to require internet users to hand over their private identity data to all and sundry (all in the name of child protection of course)
31st May 2019
See article from indexoncensorship.org
Elizabeth Denham, Information Commissioner
Information Commissioner's Office

Dear Commissioner Denham,

Re: The Draft Age Appropriate Design Code for Online Services

We write to you as civil society organisations who work to promote human rights, both offline and online. As such, we are taking a keen interest in the ICO's Age Appropriate Design Code. We are also engaging with the Government in its White Paper on Online Harms, and note the connection between these initiatives.

Whilst we recognise and support the ICO's aims of protecting and upholding children's rights online, we have severe concerns that as currently drafted the Code will not achieve these objectives. There is a real risk that implementation of the Code will result in widespread age verification across websites, apps and other online services, which will lead to increased data profiling of both children and adults, and restrictions on their freedom of expression and access to information.

The ICO contends that age verification is not a silver bullet for compliance with the Code, but it is difficult to conceive how online service providers could realistically fulfil the requirement to be age-appropriate without implementing some form of onboarding age verification process. The practical impact of the Code as it stands is that either all users will have to access online services via a sorting age-gate or adult users will have to access the lowest common denominator version of services with an option to age-gate up. This creates a de facto compulsory requirement for age verification, which in turn puts in place a de facto restriction for both children and adults on access to online content.

Requiring all adults to verify they are over 18 in order to access everyday online services is a disproportionate response to the aim of protecting children online and violates fundamental rights. It carries significant risks of tracking, data breach and fraud. It creates digital exclusion for individuals unable to meet requirements to show formal identification documents. Where age-gating also applies to under-18s, this violation and exclusion is magnified. It will put an onerous burden on small-to-medium enterprises, which will ultimately entrench the market dominance of large tech companies and lessen choice and agency for both children and adults -- this outcome would be the antithesis of encouraging diversity and innovation.

In its response to the June 2018 Call for Views on the Code, the ICO recognised that there are complexities surrounding age verification, yet the draft Code text fails to engage with any of these. It would be a poor outcome for fundamental rights, and a poor message to children about the intrinsic value of these rights for all, if children's safeguarding was to come at the expense of free expression and equal privacy protection for adults, including adults in vulnerable positions for whom such protections have particular importance.

Mass age-gating will not solve the issues the ICO wishes to address with the Code and will instead create further problems. We urge you to drop this dangerous idea.

Yours sincerely,

Open Rights Group
Index on Censorship
Article19
Big Brother Watch
Global Partners Digital
A new proposal forcing people to brainlessly hand over identity data to any Tom, Dick or Harry website that asks. Open Rights Group suggests we take a stand
30th May 2019
From action.openrightsgroup.org
See ICO's Age-Appropriate Design: Code of Practice for Online Services
New proposals to safeguard children will require everyone to prove they are over 18 before accessing online content. These proposals -- from the Information Commissioner's Office (ICO) -- aim at protecting children's privacy, but look likely to sacrifice the free expression of adults and children alike. But they are just plans: we believe and hope you can help the ICO strike the right balance, and abandon compulsory age gates, by making your voice heard.

The rules cover websites (including social media and search engines), apps, connected toys and other online products and services. The ICO is requesting public feedback on its proposals until Friday 31 May 2019. Please urgently write to the consultation to tell them their plan goes too far! You can use these bullet points to help construct your own unique message:

- In its current form, the Code is likely to result in widespread age verification across everyday websites, apps and online services for children and adults alike. Age checks for everyone are a step too far.
- Age checks for everyone could result in online content being removed or services withdrawn.
- Data protection regulators should stick to privacy. It's not the Information Commissioner's job to restrict adults' or children's access to content.
- With no scheme to certify which providers can be trusted, third-party age verification technologies will lead to fakes and scams, putting people's personal data at risk.
- Large age verification providers will seek to offer single sign-on across a wide variety of online services, which could lead to intrusive commercial tracking of children and adults, with devastating personal impacts in the event of a data breach.