A new analysis of digital platform policies suggests that the terms and conditions users accept online are becoming harder to read and are increasingly restricting access to legal remedies, including the ability to sue companies in court.
The findings come from Harvard University’s Transparency Hub, a research tool that compiles more than 20,000 legal documents covering over 300 digital platforms, including major social media services such as TikTok and Instagram. The project aims to increase public understanding of how user data is collected and how rights are defined in digital environments.
Researchers report that privacy policies have grown significantly more complex over time. Using the Flesch-Kincaid readability test to measure difficulty, the study examined documents from 2016 to 2025 and found that around 86 percent now require college-level reading ability to understand fully. According to the researchers, this trend makes it increasingly unlikely that average users can grasp what they are agreeing to when signing up for online services.
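The Flesch-Kincaid Grade Level used in such studies is a simple formula over sentence and word length: 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59, where the result roughly corresponds to a U.S. school grade. The sketch below is illustrative only, not the Transparency Hub's methodology; the syllable counter is a naive vowel-group heuristic, whereas production tools use pronunciation dictionaries.

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels, subtracting a
    # trailing silent 'e'. Real readability tools use dictionaries.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid Grade Level:
    #   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

simple = "The cat sat on the mat. It was happy."
legalese = ("Notwithstanding any provision herein, arbitration of disputes "
            "shall constitute the exclusive remedy available to the parties.")
print(flesch_kincaid_grade(simple) < flesch_kincaid_grade(legalese))  # True
```

A grade above roughly 13 indicates college-level difficulty, which is the threshold the 86 percent figure refers to.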
The analysis also highlights a shift in how companies handle legal disputes. Many platforms now direct users away from traditional court proceedings and toward arbitration, a private dispute-resolution process where a neutral third party makes binding decisions. In most cases, companies select or influence the choice of arbitrators, which reduces the user’s ability to challenge decisions in a public legal setting.
Researchers note that this approach often includes clauses that prevent users from participating in class action lawsuits. Instead, individuals must bring claims separately, which can make legal action more difficult and costly. Some artificial intelligence platforms, including Anthropic and Perplexity, also include arbitration requirements in their terms of service, along with restrictions on collective legal claims.
In some cases, users may opt out of arbitration clauses, though this typically requires sending a formal notice within a limited timeframe after first using the service. Awareness of these opt-out options, however, remains low.
The Transparency Hub’s director, Jonathan Zittrain, has emphasized the importance of making digital agreements more accessible so users can better understand where their data goes and what rights they retain.
The findings come at a time when several European countries are considering stricter rules on social media use, particularly in relation to protecting children online. Governments in France, Spain, Portugal, and Denmark are reviewing potential safeguards as concerns grow over digital platform practices.
Euronews Next contacted Anthropic and Perplexity for comment on their arbitration policies but did not receive an immediate response. It remains unclear whether the terms differ between U.S. and European users of these services.