Since launching The Prompt in May, I’ve been asked to write stories or reports based on themes raised in the newsletter for different audiences, and I’ve been very happy to oblige.
The women behind the brilliant Internet Exchange newsletter asked me to go deeper on a recent post, which they thought would appeal to their tech policy and internet governance audience. This version discusses regulation in more depth and suggests ways forward.
You can read the original post here and the version for Internet Exchange here, which I’ve cross-posted below.
If you would like me to write for your publication or to discuss a project, get in touch!

In 2014, a group of security researchers set up a free Wi-Fi hotspot in and around busy London stations to conduct a novel experiment. Before connecting to the hotspot, members of the public had to accept the Terms and Conditions (T&Cs) for using the service. In return for free Wi-Fi, the user agreed to assign their first born child to the Wi-Fi provider “for the duration of eternity”. Referred to in the T&Cs as the “Herod Clause”, it was a stunt to highlight the lack of awareness of public Wi-Fi security issues, and the fact that nobody reads the small print.
Also known as Terms of Service, Terms of Use, User Agreements or Service Agreements, it is true that most are a case of clicked-accept-but-didn’t-read. According to a 2024 Ofcom survey, between half and two-thirds of users reported signing up to online platforms without reading the T&Cs. No wonder, with Microsoft’s combined T&Cs and privacy policy in Europe clocking in at a hefty 22,360 words, estimated to take at least two hours to read.
Not only are they lengthy, they’re also complex. In 2018, the BBC undertook a readability test of the T&Cs and privacy policies of 15 popular websites and found they were written at a university reading level and more complicated than Charles Dickens’ A Tale of Two Cities. This is a problem, especially given that some of these websites allow users as young as 13.
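As a rough illustration of what such a test involves (this is not the BBC’s exact methodology, and the 180-words-per-minute pace and the sample clause below are my own assumptions), here is how a reading-time estimate and a Flesch-Kincaid grade level can be computed:

```python
# Sketch: estimate reading time for a policy of a given word count, and a
# Flesch-Kincaid grade level for a sample clause. The 180 wpm pace (slower
# than average, for dense legal text) and the sample text are assumptions.

def reading_time_minutes(word_count, words_per_minute=180):
    """Estimated minutes to read a document at the given pace."""
    return word_count / words_per_minute

def count_syllables(word):
    """Crude vowel-group syllable counter (good enough for a sketch)."""
    word = word.lower().strip(".,;:!?\"'()")
    groups, prev_vowel = 0, False
    for ch in word:
        is_vowel = ch in "aeiouy"
        if is_vowel and not prev_vowel:
            groups += 1
        prev_vowel = is_vowel
    return max(groups, 1)

def flesch_kincaid_grade(text):
    """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = text.split()
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

minutes = reading_time_minutes(22_360)
print(f"22,360 words ≈ {minutes:.0f} minutes ({minutes / 60:.1f} hours) to read")

sample = ("The licensee hereby irrevocably grants a perpetual, worldwide, "
          "royalty-free licence to reproduce the aforementioned materials.")
print(f"Sample clause grade level: {flesch_kincaid_grade(sample):.1f}")
```

A grade above 12 indicates university-level reading; running a score like this over a modern privacy policy versus a Dickens novel is what produces the “harder than A Tale of Two Cities” comparisons.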
An Ethical AND a Legal Mess
This alone could be a breach of data protection rules, which require a clear explanation of how companies are using data. Article 12 of GDPR says that communications to individuals about their data must be presented in a "concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child". Expectations are further outlined by the UK’s Information Commissioner’s Office (ICO) Age Appropriate Design Code, likely to be updated in light of increased protections for children under the Data (Use and Access) Act 2025, which came into force on June 19th.
There have been attempts to address overly complex T&Cs and contracts generally in order to protect consumers from signing up to something they don’t understand. The UK Consumer Rights Act (2015) includes a transparency requirement, mandating that the terms of consumer contracts are expressed in plain language.
On the government side, the US Plain Writing Act of 2010 required federal agencies to communicate in clear language and mandated all staff undergo “plain language” training. The goal was to ensure government documents are easy to read, enhancing public understanding and trust in institutions.
So Who Does Read the T&Cs?
The most eagle-eyed readers of T&Cs tend to be digital rights advocates and journalists, mainly looking out for what data is being collected and who it is shared with, which has led to some memorable discoveries.
When the first smart TVs with voice recognition went on sale in 2015, Samsung’s policies warned users "If your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party." This was interpreted by many to read “IT’S OUT OF CONTROL, RUN FOR YOUR LIVES.” Warning people not to have sensitive conversations in front of the TV gave the impression that Samsung was not fully in control of their own creation.
A good catch by journalist Samantha Cole in 2022 forced the period tracking app Stardust to change their privacy policy. At a time of heightened tension and increased reproductive surveillance just after Roe v. Wade was overturned, when women feared their menstrual data could be used to investigate or prosecute abortions, the company seemed happy to hand over data on periods to law enforcement without a warrant. Being tone deaf to your customer base is a theme we will return to.
And in 2023, the Mozilla Foundation found in the small print of some connected-car policies that manufacturers had gone full voyeur, collecting information about the driver’s sex life and taking all the fun out of back seat canoodling.
In the AI Era, T&Cs Are Receiving Renewed Scrutiny
Now, with the majority of tech companies figuring out how to capitalise on the vast amount of data collected over many years and pivot their business model to cash in on the AI goldrush, updates to T&Cs reveal clues about future business plans and test the limits of what users are willing to accept.
On July 1st, WeTransfer notified customers about an update to their T&Cs regarding licensing which included utilizing user content for possible new technologies such as “machine learning models that enhance our content moderation process”. This was widely interpreted to mean that WeTransfer could capture any content transferred over the WeTransfer platform and use it to train their AI models. For free, as the clause also specified there would be no compensation for this.
WeTransfer is a platform that allows very large digital files to be sent quickly and is used by the creative industries to send design/illustration files, photos and videos. The creative industries are VERY sensitive to their work being used to train AI models, and this update came across as tone deaf to the concerns of their core customer base. The backlash was instant once the offending clause was shared widely on social media, with users making it clear this was not OK and that they would be shuttering their accounts.
WeTransfer apologized and amended their policy to be clearer, saying:
“Such a feature [machine learning for content moderation] hasn’t been built or used in practice, but it was under consideration for the future. To avoid confusion, we’ve removed this reference.”
Online conferencing platform Zoom tried something similar back in 2023 when it updated its T&Cs to allow the company to use customer data to train AI models “with no opt out” and rolled it back after intense customer backlash.
Misunderstandings arising from T&Cs and privacy policies written in language inaccessible to the average user are one thing, but are companies even legally allowed to use the data they collect to train AI models?
The implications for copyright are currently being played out in court in both the US and UK. Once again, data protection is having to do some of the heavy lifting, with the use of data for training AI hinging on consent. Recent examples suggest users are not clear what they are consenting to, and they don’t like what they see.
The flood of AI tools coming to market, particularly agents, will demand access to a wide range of personal information and data points, and users need to be crystal clear about what this means. The ICO already had a word with Microsoft about the “Recall” tool on its Copilot+ PCs after the “privacy nightmare” of it taking screenshots of a user’s desktop every few seconds.
Tick the Box to Do Better
Does it have to be like this? Volunteer projects like “Terms of Service; Didn't Read” (ToS;DR) provide legal analysis of T&Cs and privacy policies and assign each a score. Ranking Digital Rights publishes an annual index analysing the transparency and trends of the world's largest platforms.
But it is on the companies themselves to commit to using plain language and to stop hiding controversial elements of the business in the T&Cs on the presumption that it provides legal cover and no one will notice anyway. Legal protection against a wide range of liabilities across a wide range of services does not lend itself to simplicity in online platforms’ T&Cs. But lengthy and complex T&Cs lead to confusion and misunderstanding among users, which is ultimately bad for business. Investing in co-creation of terms of service, where users play a role from beginning to end and those most impacted are centred, could go some way toward improving understanding and trust.
We are still in a space of resistance figuring out what AI means in our lives—what it will give and what it will take. Giving users a raw deal might work in the short term, but could backfire in the longer term as competition increases. Time to tick that box to accept we can do better.