2025-08-15 18:18:33
It will fail to protect children because it misdirects attention and focus away from the real problem […the] actual failures to investigate, charge, prosecute and convict those involved in creating, selling, and sharing child sex abuse material where the supposed big, bad guys in the room – the tech companies – have actually alerted the authorities and given them the information they need to arrest abusers and child pornographers.
This is true; when I was at Facebook I heard stories of hundreds of thousands (if not more) of reports being sent to the relevant authorities, who just dropped them on the floor because they had either no capacity or no interest.
The tech companies do their best to report this stuff and increase the reportage, but telling that story does not serve the interests of the state or of the child protection activists who want clicks and money.
The author has solid credentials in the space too:
Maureen Flatley is the special adviser to Stop Child Predators, a US-based child protection organisation. In 2006, she was a principal architect of Masha’s Law, a statute that tripled the civil penalties for downloading child sexual abuse material.
Like:
2025-08-15 17:58:02
I have no idea why; possibly just an editorial burp, or maybe something more strategic about not wanting to risk being seen as hosting circumvention discussions:
2025-08-15 16:44:06
Some folk, including Thomas Pearson with 115k followers on TikTok, are calling for big global platforms to shun the UK in order to “…make the UK Government lose money, and fast”.
In this instance it is the wrong approach; it will cause harm, it won’t work, and here I shall explain why:
First up: my credentials. You can check my “about” page for details, but for this post most relevantly I’m a software engineer, have worked in the digital rights space around encryption and security since ~1988 with 20 years in “Big Tech” industries — including 3 years at Facebook working on security infrastructure, “darkweb” and end-to-end encryption — and ~9 years volunteering on the Board of Directors of the Open Rights Group (the “British EFF”). I also helped put the BBC, Guardian and Reddit onto the darkweb / tor network to assist their audience reach into repressive regimes.
I am now a semi-retired occasional consultant and full-time parent with no financial axes to grind. I have a kid rapidly approaching school and I don’t consider the Online Safety Act a fit approach to my kid’s welfare.
The UK Online Safety Act requires all big global platforms to age-verify everyone who uses those sites “from the UK”, ostensibly in order to prevent British children “stumbling upon” pornography.
Implementations have proven embarrassingly simple to bypass, while advocates for the general principle (often with a financial interest) are calling for greater and more invasive restrictions with global impact, all to address a problem that the UK has caused for itself.
Further, subsequent reporting has demonstrated that the Online Safety Act’s intended purpose is not to protect children as advertised, but to bring all significant communications platforms under UK regulatory control. The approach can only be described as YOLO-FAFO, with Ofcom literally calling for laws to be enacted so that regulations can be issued, permitting it to learn enough to work out how to regulate properly and achieve what the law intended, eventually:
Ofcom informed officials that it had [little] chance of understanding how to refine its approach … and that in order to work out how to craft a regulation that wouldn’t accidentally cover the wrong websites, it would need to wait for the law to come into effect so that it could legally force the websites in question to hand over data about how they operate, and then it could rewrite the rules to exclude them. The minister was given the bad news, along with the helpful “suggestion” that “the secretary of state should not ask for more information about this”.
The Government can’t say that it wasn’t warned, because we told them that this would happen back in 2016, including the stuff about VPNs.
Some people are saying that “big platforms” should “ban the UK” in order to somehow hurt the government, and that this will somehow bring about an end to, or the overthrow of, the Online Safety Act. This will not work, in the same way that biting a dog will not break up a dogfight.
But why not? After all, Apple have made a huge dent in the political power of UK surveillance by withdrawing Advanced Data Protection from the UK, and there is a very real risk that Signal will ban UK users if client-side scanning obligations are imposed upon it. Both of those approaches “work”, both combat state overreach, and I agree with them politically and strategically.
But we/you/they/platforms should not ban UK users from the Internet.
Various world governments have spent the past 10-15 years participating in the techlash: an anti-tech, and especially anti-big-tech, populist movement designed to stop people thinking critically about the erosion of their online freedoms, to make them instead simply and rather mindlessly hate “billionaires”, and to have citizens call for aid from the government to stop all the badness and harms coming from unregulated connectivity and speech.
But then the government decided that the citizens are the problem, and that the solution is to require them to present ID before they are permitted to look at stuff — tits, dicks, arses, LGBTQ discussion, alcoholism recovery, domestic abuse experiences, wartime atrocities — because of course all the content on the Internet is individually labelled with some sort of taxonomy of meaning and intention, right? (hint: nope)
But if you’re calling for the big platforms to ban Britons until the UK Government unwinds its stupidity, then you’re making at least three big mistakes:
So: you would literally be calling for the platforms to get good at doing the thing that you are trying to get them to not do; and in what universe does that make sense?
In politics this is called accelerationism, and it does not work, plus it tends to cause a huge amount of harm in the meantime.
There is already a good historical precedent for what needs to happen: the disastrous and subsequently repealed 1920s Prohibition on Alcohol in the USA. Go check out that article, look at the graph of alcohol consumption over time, and compare it to recent reportage about a supposed sudden decline in UK-based porn access.
There is no quick fix. The UK needs to lead the way in being pointed-at and laughed-at for doing such a boneheaded thing as attempting to coerce global censorship. People need to circumvent it, trivially. The Government is attempting to force its influence globally, and that simply will not happen. By the time any effective censorship controls actually sediment into the technology stack, the users — the kids, the people ostensibly being protected — will have migrated to some other application or network stack where they don’t have to deal with age verification and identity bullshit.
The actual victims will be the British people (and the citizens of any other country stupid enough to follow suit): ordinary citizens who will have to spend their time dealing with being doxed and having their identities stolen in order to circumvent controls that would otherwise not exist.
Just as most abuse is domestic, in the newly created market for fake ID it will be kids stealing their parents’ IDs and swapping them: what will that do to society?
So, yeah: don’t call for the UK to be banned from big platforms. If you do, you will be playing into the Government’s hands. Instead, we have to sweat it out, suffer pain, and be laughed at.
It will probably take a few years. My guess: minimum 3, maximum 15.
Here’s Thomas.
Sorry, mate, but no.
2025-08-15 11:35:06
[the…] secretary of state … expressed “concern” that the legislation might whack sites such as Amazon instead of Pornhub. In response, officials explained that the regulation in question was “not primarily aimed at … the protection of children”, but was about regulating “services that have a significant influence over public discourse”, a phrase that rather gives away the political thinking behind the act.
https://www.thetimes.com/comment/columnists/article/online-safety-act-botched-2xk8xwlps or: https://archive.ph/3pave
It’s a fine piece until we get to the last paragraph or two, because the actual solution is to engage with the children and work with them rather than try to find the appropriate way to put them in a padded cell. “Regulating their devices” simply means that they will get unregulated devices, in the same way that teenagers have always hidden illicit stuff from their parents.
2025-08-14 01:38:29
The kids are alright, it’s the politicians who are a mess:
Chatting to son: “…so do you think your youtube account will pass AI age verification?”
His reply: “I think I’m safe with all the Hearts of Iron 4 guides, but I’m going to watch a six hour video of the best fishing spots in East Anglia to make sure”
2025-08-13 20:47:51
…on the grounds that everyone is a potential predator.
Tom’s Hardware:
The UK government has instructed citizens to delete old emails and pictures to help conserve water, following the announcement of a “nationally significant” water shortage. However, the advice isn’t up to snuff, as deleting emails and pictures should have no significant effect on water consumption, and might even make it worse for data centers that use certain types of evaporative cooling.