US free speech website blocks UK users so as to avoid onerous and suffocating internet censorship by Ofcom
17th April 2025
See uk.gab.com
The US right-leaning forum website Gab has blocked internet users located in Britain. UK users now see only a landing page explaining that UK internet censorship laws are unacceptable to the free-speech-loving forum. The website explains its actions as follows:

ATTENTION: UK Visitor Detected

The following notice applies specifically to users accessing from the United Kingdom.

Access Restricted by Provider

After receiving yet another demand from the UK's speech police, Ofcom, Gab has made the decision to block the entire United Kingdom from accessing our website. This latest email from Ofcom ordered us to disclose information about our users and operations. We know where this leads: compelled censorship and British citizens thrown in jail for hate speech. We refuse to comply with this tyranny.

Gab is an American company with zero presence in the UK. Ofcom's demands have no legal force here. To enforce anything in the United States, they'd need to go through a Mutual Legal Assistance Treaty request or letters rogatory. No U.S. court is going to enforce a foreign censorship regime. The First Amendment forbids it.

Ofcom will likely try to make an example of us anyway. That's because the UK's Online Safety Act isn't about protecting children. It's about suppressing dissent. They're welcome to try. The idea that a British regulator can pressure a U.S. company that's IP-blocking the entire UK is as farcical as it is futile. If anything, it proves our point: censorship doesn't work. It only reveals the truth about the censors.

We proudly join platforms like BitChute in boycotting the United Kingdom. American companies should follow suit. The power of the UK's parliament ends where the First Amendment begins. The only way to vote against the tyranny of the UK's present regime is to walk away from it, refuse to comply, and take refuge under the impervious shelter of the First Amendment. The UK's rulers want their people kept in the dark. Let them see how long the public tolerates it as their Internet vanishes, one website at a time.
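The mechanics of a country-wide block like Gab's are simple: the server resolves each visitor's IP address to a country and, for blocked countries, serves the landing page instead of the site. A minimal sketch, assuming the ISO country code has already been looked up in a GeoIP database (e.g. a MaxMind-style lookup); the function names are illustrative, not Gab's actual code:

```python
# Hypothetical sketch of country-level IP blocking. The country code is
# assumed to come from a prior GeoIP lookup on the client's IP address.

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom

LANDING_PAGE = "ATTENTION: UK Visitor Detected - Access Restricted by Provider"


def should_block(country_code: str) -> bool:
    """Return True if visitors from this country are geoblocked."""
    return country_code.upper() in BLOCKED_COUNTRIES


def handle_request(country_code: str, page_body: str) -> tuple[int, str]:
    """Serve the landing page to blocked countries, the real page otherwise."""
    if should_block(country_code):
        # 451 Unavailable For Legal Reasons is the conventional status code,
        # though a site could equally serve the notice with a 200.
        return 451, LANDING_PAGE
    return 200, page_body
```

As the notice itself concedes, such blocks operate purely on apparent IP location, which is why they are easy to route around with a VPN.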
US officials challenge Ofcom over online safety laws' impact on free speech
6th April 2025
See article from theguardian.com
US state department officials have challenged Britain's internet censor over the impact on freedom of expression created by new online censorship laws, the Guardian understands.

A group of officials from the state department's Bureau of Democracy, Human Rights, and Labor (DRL) recently met Ofcom in London. It is understood that they raised the issue of the new Online Safety Act and how it risked infringing free speech. The state department body later said the meeting was part of its initiative to affirm the US commitment to defending freedom of expression, both in Europe and around the world.

During the meeting, Ofcom officials claimed the new rules were only in place to deal with explicitly illegal content and material that could be harmful to children.

A state department spokesperson said: As Vice-President Vance has said, we are concerned about freedom of expression in the United Kingdom. It is important that the UK respect and protect freedom of expression.

Details of the meeting emerged after Jonathan Reynolds, the business secretary, denied that concerns over free speech had featured in tariff negotiations with the US. In February, the US vice-president, JD Vance, complained of infringements on free speech in the UK. Elon Musk, one of Trump's closest allies, has repeatedly claimed that some prison sentences handed down to people who incited the riots in posts on X were a breach of free speech.

Free speech advocates say that the UK censorship law is going to bring about a culture of 'if in doubt, cut it out' as platforms seek to avoid being subject to Ofcom's enforcement powers.
Ofcom initiates bounteous times for hackers, scammers, phishers and identity thieves
17th January 2025
See press release from ofcom.org.uk
Children will be prevented from encountering online pornography and protected from other types of harmful content under Ofcom's new industry guidance, which sets out how we expect sites and apps to introduce highly effective age assurance.

Today's decisions are the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children. It follows tough industry standards, announced last month, to tackle illegal content online, and comes ahead of broader protection of children measures which will launch in the spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce 'age assurance' to ensure that children are not normally able to encounter it.[1] Age assurance methods -- which include age verification, age estimation or a combination of both -- must be 'highly effective' at correctly determining whether a particular user is a child.

We have today published industry guidance on how we expect age assurance to be implemented in practice for it to be considered highly effective. Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader.[2] We expect the approach to be applied consistently across all parts of the online safety regime over time.

While providing strong protections to children, our approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80%) are broadly supportive of age assurance measures to prevent children from encountering online pornography.[3]

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action we expect all of them to take starts from today:
- Requirement to carry out a children's access assessment. All user-to-user and search services -- defined as 'Part 3' services[4] -- in scope of the Act must carry out a children's access assessment to establish if their service -- or part of their service -- is likely to be accessed by children. From today, these services have three months to complete their children's access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children's risk assessment duties and the children's safety duties.[5]
- Measures to protect children on social media and other user-to-user services. We will publish our Protection of Children Codes and children's risk assessment guidance in April 2025. This means that services that are likely to be accessed by children will need to conduct a children's risk assessment by July 2025 -- that is, within three months. Following this, they will need to implement measures to protect children on their services, in line with our Protection of Children Codes, to address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under 18 and protect them from harmful content.
- Services that allow pornography must introduce processes to check the age of users. All services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content (defined as 'Part 5' services[6]), including certain generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance. Services that allow user-generated pornographic content -- which fall under 'Part 3' services -- must have fully implemented age checks by July.
What does highly effective age assurance mean?

Our approach to highly effective age assurance, and how we expect it to be implemented in practice, applies consistently across three pieces of industry guidance published today.[5] Our final position, in summary:
- confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
- sets out a non-exhaustive list of methods that we consider are capable of being highly effective. They include: open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
- confirms that methods including self-declaration of age and online payments which don't require a person to be 18 are not highly effective;
- stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check, nor should services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
- sets expectations that sites and apps consider the interests of all users when implementing age assurance -- affording strong protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.
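The rule that content must not be visible before or during an age check amounts to a default-deny gate: anything other than a positive adult result gets the gate page. A minimal sketch, assuming the verification itself (photo ID matching, facial age estimation, etc.) is handled by a third-party provider whose result arrives as a simple status; names here are illustrative, not any provider's actual API:

```python
# Hypothetical server-side age gate. The verification result is assumed to
# come from an external age assurance provider; only a confirmed adult
# result unlocks the restricted content, so content stays hidden before
# and during the check, as the guidance stipulates.

from enum import Enum


class AgeResult(Enum):
    ADULT = "adult"
    CHILD = "child"
    UNKNOWN = "unknown"  # check not started, still in progress, or failed


def serve(result: AgeResult, restricted_content: str) -> str:
    """Serve restricted content only on a positive adult result."""
    if result is AgeResult.ADULT:
        return restricted_content
    # Default-deny: children, pending checks and failures all see the gate.
    return "Age check required before this content can be shown."
```

Note that self-declaration ("click here if you are over 18") would simply map every visitor to ADULT, which is exactly why the guidance rules it out as not highly effective.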
We consider this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force. While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards, and independent research.

Opening a new enforcement programme

We expect all services to take a proactive approach to compliance and meet their respective implementation deadlines. Today Ofcom is opening an age assurance enforcement programme, focusing our attention first on Part 5 services that display or publish their own pornographic content. We will contact a range of adult services -- large and small -- to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.

For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don't ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they're adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.

As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services -- including social media -- which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest. We'll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.

Notes
- Research shows that children are being exposed to online pornography from an early age. Of those who have seen online pornography, the average age at which they first encounter it is 13 -- although more than a quarter come across it by age 11 (27%), and one in ten as young as 9 (10%). Source: 'A lot of it is actually just abuse' -- Young people and pornography, Children's Commissioner for England.
- Research from the UK Government indicates that UK firms account for an estimated one in four (23%) of the global safety tech workforce. 28% of safety tech companies are based in the UK, according to recent research by Paladin Capital and PUBLIC.
- Source: Yonder Consulting -- Adult Users' Attitudes to Age Verification on Adult Sites.
- 'Part 3' services include those that host user-generated content, such as social media, tube sites, cam sites, and fan platforms.
- Services that conclude they are not likely to be accessed by children -- including where this is because they are using highly effective age assurance -- must record the outcome of their assessment and must repeat the children's access assessment at least annually.
- 'Part 5' services are those that publish their own pornographic content, such as studios or pay sites, where operators control the material available.
The suffocating mountain of red tape titled the Online Safety Act kills its first British business
24th December 2024
See article from lfgss.com
The owner of the popular cycling forum LFGSS has decided to close his business due to the enormous risks and expenses now inherent in running a British website under the misleadingly named Online Safety Act. He explains:

Reading Ofcom's tome of censorship rules and we're done... we fall firmly into scope, and I have no way to dodge it. The act is too broad, and it doesn't matter that there's never been an instance of any of the proclaimed things that this act protects adults, children and vulnerable people from... the very broad language and the fact that I'm based in the UK means we're covered.

The act simply does not care that this site and platform is run by an individual, and that I do so philanthropically without any profit motive (typically losing money), nor that the site exists to reduce social loneliness, reduce suicide rates, and help build meaningful communities that enrich life. The act only cares that it is "linked to the UK" (by me being involved as a UK native and resident, and by you being a UK-based user), and that users can talk to other users... that's it, that's the scope.

I can't afford what is likely tens of thousands to go through all the legal hoops here over a prolonged period of time; the site itself barely gets a few hundred in donations each month and costs a little more to run... this is not a venture that can afford compliance costs... and even if we did, what remains is a disproportionately high personal liability for me, and one that could easily be weaponised by disgruntled people who are banned for their egregious behaviour... I do not see an alternative to shuttering it.

The conclusion I have to make is that we're done... Microcosm, LFGSS, the many other communities running on this platform... the risk to me personally is too high, and so I will need to shutter them all. On Sunday 16th March 2025 (the last day prior to the Act taking effect) I will delete the virtual servers hosting LFGSS and other communities, and immediately end the approximately 300 small communities that I run, and the few large communities such as LFGSS.