Social media accountability: executive liability for algorithms

The Sylvans gathered to debate a motion that strikes at the heart of social media accountability: ‘social media executives should be personally liable for their algorithms’.

As concerns over mental health and digital addiction rise, the call for greater social media accountability has grown louder. The debate explored whether the corporate veil should be pierced to hold individual leaders responsible for the code that governs our lives.

Here is an account of the arguments presented during this lively session.

The proposition: Big Tech and social media accountability

The first speaker opened with a powerful historical parallel: the tobacco industry. They reminded the audience of advertisements from the 1940s and 50s in which doctors promoted cigarettes. Society eventually shifted from acceptance to litigation, driven by the revelation that tobacco companies knew their products were addictive.

The speaker argued that we are on a similar journey with social media. They referenced an ongoing 2026 legal case in California, in which a 19-year-old is suing major tech platforms for causing addiction and health issues.

The core of the argument rested on the definition of an algorithm. The speaker described it not as a tool for user satisfaction, but as a giant prediction machine engineered to guess what a user will be attracted to next. This mechanism triggers dopamine spikes, creating a cycle of craving. Crucially, the speaker noted the difference between wanting and liking. The algorithm creates a want, even when the user no longer derives pleasure from the activity.
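The ‘giant prediction machine’ the speaker described can be pictured with a minimal sketch. This is purely illustrative: every name here (`Post`, `predict_engagement`, `rank_feed`) is hypothetical and not taken from any real platform’s code. The point it makes is the speaker’s: the feed is ordered by what the user is predicted to engage with, not by what serves them.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    topic: str

def predict_engagement(user_history: list[str], post: Post) -> float:
    """Toy model: predicted engagement rises with how often the user
    has previously interacted with the post's topic."""
    return user_history.count(post.topic) / (len(user_history) or 1)

def rank_feed(user_history: list[str], candidates: list[Post]) -> list[Post]:
    # Order posts by predicted engagement, highest first: the feed
    # optimises for what the user is most likely to click next,
    # regardless of whether they would say they value it.
    return sorted(candidates,
                  key=lambda p: predict_engagement(user_history, p),
                  reverse=True)

history = ["cats", "cats", "politics"]
feed = rank_feed(history, [Post(1, "politics"), Post(2, "cats"), Post(3, "cooking")])
print([p.topic for p in feed])  # the most-reinforced topic comes first
```

Even in this toy form, the loop the speaker criticised is visible: whatever the user engaged with before is what the system serves up next, regardless of whether it is good for them.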

The proposition did not call for a witch hunt against success. Instead, they argued for the application of existing legal principles – gross negligence and wilful misconduct. If executives knowingly engineer addiction that harms users, they should face personal liability, much like the tobacco executives of the past.

The opposition: the media is the real bogeyman

The second speaker rejected the motion, suggesting that social media is merely a convenient scapegoat. They argued that the real issue is not the delivery mechanism (the algorithm), but the content itself. In their view, legacy media outlets create the division and misinformation that social media simply reflects.

The opposition placed the responsibility firmly on parents and individuals. Sharing personal anecdotes, the speaker claimed that addiction is a failure of discipline, not technology. They described controlling their own children’s phone usage through strict parenting, arguing that if they can do it, others can too.

Furthermore, the speaker warned against lawfare. They argued that stripping executives of protection would invite endless, frivolous lawsuits that could stifle business. They also questioned why social media is singled out when gaming, gambling and even language learning apps like Duolingo use similar engagement mechanics. The speaker concluded that we should not punish Mark Zuckerberg simply because he is successful.

Voices from the floor on social media accountability

  • Privilege and profit: One audience member challenged the opposition’s stance on parenting, noting that not all families have the resources or time to monitor screen usage constantly. They pointed to Project Mercury, an internal investigation at Meta, alleging that the company prioritised profit over user well-being despite knowing the harms.
  • The science of harm: Another speaker highlighted the biological impact of these platforms. They referenced Jonathan Haidt’s work, The Anxious Generation, noting that the brain’s prefrontal cortex is not fully developed until around age 25. This makes young people uniquely vulnerable to algorithms designed to maximise engagement.
  • Sovereignty and values: A contributor raised the issue of technological sovereignty. They argued that because Europe relies heavily on American tech, it lacks the control to enforce its values. They suggested that while the proposition is morally right, enforcing it against foreign giants is politically difficult.
  • The teenage magazine analogy: In a striking visual argument, a speaker asked the room to imagine a physical magazine for teenagers. If pages 1-4 were fashion, but the rest contained pornography, gambling and suicide content, it would be banned immediately. They asked why digital platforms are allowed to operate without the same scrutiny.
  • Legal precedents: Several speakers discussed the legal mechanisms. One noted that the UK’s Online Safety Act is already forcing changes, proving that regulation works. Another pointed out that in the financial services industry, senior managers are already held personally liable for misconduct, which successfully focuses the mind on compliance.
  • Defence of the corporate veil: Conversely, a speaker defended the principle of separate legal personality. They argued that English law protects directors to encourage entrepreneurship. In their view, proving an intent to harm is impossible because the primary goal of these platforms is connection and profit, not injury.

The closing arguments

The opposition closing

Returning to the podium, the second speaker reiterated that addiction is a universal human trait. They admitted to their own addiction to language learning apps but argued this does not necessitate suing executives. They warned that if social media vanished, children would simply find something else to binge on.

The speaker maintained that existing laws are sufficient. If a company is sued and the share price drops, the executives suffer financial consequences naturally. They concluded with a metaphor: if a joyrider crashes a Ford at 150mph, we do not sue the executives of Ford. We blame the driver.

The proposition closing

The first speaker closed by referencing the film Thank You for Smoking, illustrating how industries often hide behind the guise of personal responsibility to defend harmful products. They agreed that personal responsibility is vital but emphasised that we live in a society based on a social contract.

The speaker clarified that they are not arguing for immediate conviction, but for liability. They argued that executives must operate within a framework that protects society. If executives are found guilty of gross negligence – turning a blind eye to the harm their products cause – they must face enforceable consequences. The speaker concluded that liability ensures that innovation does not come at the cost of public safety.

The verdict on social media accountability

The debate highlighted deep concerns about the power of big tech and the vulnerability of users. While valid points were raised about the practicality of enforcement and the risk of over-regulation, the mood of the room leaned heavily toward greater social media accountability.

When the final vote was cast, the motion carried by a significant margin. The Sylvans resolved that social media executives should indeed be personally liable for their algorithms.

Further reading

A detailed summary and analysis of the debate can be viewed here.

Please see summaries of earlier Sylvan debates here.

For more information about how our meetings run, see meeting info.