Social platforms intentionally conceal how their products are designed and function, creating a chasm of knowledge between platforms and everyone else.
Today, we lack an ecosystem of accountability for social platforms.
When Frances Haugen testified before the Senate, perhaps a few hundred people in the world deeply understood how the algorithms that drive our personal information environments function, or how individual experiences are shaped by product design choices.
For fifteen years, social platforms have rejected researchers' requests for access to basic data, starving the ecosystem of the information needed to corroborate the companies' claims and strategically allowing those companies to define both which problems exist and which solutions are possible. We lack an ecosystem of accountability because social media companies have been allowed to operate in the dark.
Social platforms leverage this information asymmetry to avoid accountability. They have created a communications environment of uncertainty and doubt in order to prevent change. We have been trapped in the red-herring rhetoric social platforms have developed to avoid real safety-by-design solutions.
In other industries, activists can advocate for change, litigators can hold companies accountable for cutting corners, and legislators have sufficient information to write meaningful laws. These groups, and many others, can act as an ecosystem of accountability.
Industries operate in the public interest when all types of stakeholders, from concerned parents to investors, have the power to pull companies in directions aligned with stakeholders' best interests. Beyond the Screen is bringing more people into the conversation about how we can move forward to build social media that is good for us.
We lack a reasonable level of transparency.
Many governments, investors, and litigators are pushing for transparency, but we lack consensus on which metrics and other data we must require from social platforms if we want them to be accountable. We are working with these stakeholders to establish what the floor of transparency looks like.
Our Minimum Viable Queries project addresses this gap.
Stakeholders with a wide variety of backgrounds lack shared context on existing challenges and possible solutions.
No single person today understands the full scope of social media's challenges. Many specialists understand slices of the problem, such as child exploitation, human trafficking, or information operations.
But there is no comprehensive directory of the harms of social media or possible paths towards addressing those harms. This lack of shared context prevents us from engaging in conversation about how to move forward. It’s hard to mobilize people to act when they don’t know what they can demand.
That's why we are beginning our work with our Standard of Care project.
Informed citizens and specialists don't have the data they need to build effective social platforms.
Most industries have an array of educational programs to train specialists and inform the general public. Many schools in the United States have a school newspaper. Even though most students do not pursue journalism, we have student newspapers because we believe that in a democracy the average person should understand and value the journalistic process, and that we should all have some experience telling stories and telling the truth. This benefits the general public.
Architects are specialists who engage with the general public. They attend architecture schools and are required to pass a licensing test before they're allowed to build our physical spaces. We don't expect the average person to know the trade-offs and potentially harmful consequences involved in building a safe, appropriate public space; we know that specialization requires certification and governance from within the field, and accountability from people outside of it.
When it comes to our online social spaces, we do not have even the most basic tools for teaching about the dynamics of social platforms, which are our new town squares. Nor do we have professional associations or licensing tests that require proficiency in core principles before someone is allowed to shape online spaces.
Expanding awareness of how social platforms work isn’t just critical for democratic accountability — it will also directly help platforms build safer products. Today, every major social platform struggles to hire enough people to keep its platform safe and to train staff internally to sustain critical safety systems. No other major industry is as dependent as social media on training its talent internally, without a standardized professional association to balance individual company interests.