
Standard of Care

Unlike companies in every other major industry, social platforms are not currently compelled to act with a duty of care towards their users. This is intentional: the platforms understood they would not be held accountable as long as users could not see the corners being cut to maximize profits.

Overview

We can fundamentally change how these platforms operate by building a shared context of accountability. Together, we can reach beyond the screen and truly hold social platforms accountable.


We believe the debate on how to move forward with social media has struggled to gain momentum because the people who understand the social problems caused by social media are often not the same people who understand the spectrum of what is possible with technology. As a result, conversations on how to move forward can short-circuit when advocates identify solutions that social platforms can easily dismiss as technically infeasible or that come with significant baggage.


We believe the way to help more people come to consensus is by developing maps of the harms of social media, the levers to prevent or mitigate each harm, and the different strategies for pulling each lever.

Case Study

One of the most effective levers for preventing many harms to children is keeping users under 13 off the platforms. Many children’s advocates push to accomplish this by requiring a government ID check before anyone can use a social platform. Big Tech companies, aware of how strongly the public resists ID checks, tried to derail the passage of California’s Age-Appropriate Design Code by claiming that checking IDs is the only way to keep under-13s off a platform.


In reality, if you sat down with a technologist and asked, “How can we find users under 13?” they would give you ten or fifteen different strategies for finding underage users. For just one example, children often self-disclose their true age in their posts or bios, even if they claimed to be older when they signed up. We are more likely to find a reasonable compromise if we can sit around the table together and review a diverse list of possible paths forward.
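To make that point concrete, here is a minimal, hypothetical sketch of one such strategy: scanning bios and posts for self-disclosed ages with simple pattern matching. The function names and patterns below are illustrative assumptions, not any platform’s actual detection system; a real deployment would combine many richer signals (classifiers, friend graphs, birthday posts) and route matches to human review.

```python
import re

# Hypothetical patterns for self-disclosed age in bios or posts.
AGE_PATTERNS = [
    re.compile(r"\bi[' ]?a?m (\d{1,2})(?: years? old)?\b", re.IGNORECASE),   # "I'm 12", "i am 12 years old"
    re.compile(r"\b(\d{1,2}) ?(?:yo|y/o|yrs? old)\b", re.IGNORECASE),        # "12 yo", "12 yrs old"
    re.compile(r"\bturn(?:ing|s)? (\d{1,2})\b", re.IGNORECASE),              # "turning 13"
]

def self_disclosed_ages(text: str) -> list[int]:
    """Return plausible self-disclosed ages found in a snippet of text."""
    ages = []
    for pattern in AGE_PATTERNS:
        for match in pattern.finditer(text):
            age = int(match.group(1))
            if 5 <= age <= 17:  # ignore implausible or clearly adult values
                ages.append(age)
    return ages

def flag_possible_underage(bio: str, posts: list[str], claimed_age: int) -> bool:
    """Flag an account whose self-disclosed age contradicts the age claimed at sign-up."""
    disclosed = self_disclosed_ages(bio)
    for post in posts:
        disclosed.extend(self_disclosed_ages(post))
    return claimed_age >= 13 and any(age < 13 for age in disclosed)

# Example: an account that claimed to be 18 at sign-up but says otherwise.
print(flag_possible_underage(
    bio="12 yo, love drawing and Roblox",
    posts=["can't wait until I'm turning 13 next year!!"],
    claimed_age=18,
))  # True
```

Even a heuristic this crude shows that “check everyone’s ID” and “do nothing” are not the only two options on the table.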


Based on extensive consultations with investors, litigators, concerned citizens, and government agencies, we believe our collaborative map of online harms, and of the solutions that prevent or mitigate them, will be used by the ecosystem of accountability to compel social platforms to act with a Duty of Care towards their users.

Initial Areas of Focus

Over the next year, we are funded to test the feasibility of our development methodology by building out a harms-and-solutions map for three core areas:


  • Harms to children & families

  • Harms to national security

  • Harms related to content moderation

Project Goals

We believe that Duty of Care can empower an ecosystem of accountability to hold social platforms responsible for their product choices. Possible outcomes include:


Product Safety Litigation
Articulating the available safety levers and the strategies for pulling them creates a yardstick that objectively shows how little social platforms have done to prevent harm. The United States has a history of industries that externalize their costs (e.g., tobacco and industrial pollution) being brought closer in line with the common good through class action litigation. This project will enable litigators to build smarter cases that bring social platforms in line with the common good.


Investment Standards
If litigators act as the stick within the ecosystem of accountability, investment standards are the carrot. In every major industry, investors play a critical role in pulling corporations towards long-term profitability over short-sighted profit-seeking. Investors cannot play this role today because they lack critical context: how to properly measure the externalities of social media, and whether companies are adequately dealing with those externalities. The Duty of Care project will give investors that framework and create specific standards for understanding what “good” should look like for social platforms.

Help us build solutions for the common good.
