Haugen confirmed that she was not "anti-tech" or "anti-social media", but called for a conversation about the "current process by which conflicts of interests are resolved". She warned that, allowed to "operate in the dark", Facebook over the past fifteen years had "kept siding on the side of their own profits, over decisions that could have advanced public safety".
I think the way we got here was that a giant multinational tech company made decisions in isolation - Frances Haugen
Haugen discussed the problem of expertise and access to data. She explained that universities are not currently teaching students "how these systems work at the level of granularity that we need in order to talk about what is the path forward", and proposed a new "laboratory bench" model to simulate the multidimensional trade-offs and complexity of platforms. Haugen argued that Facebook needs to make representative data available to researchers.
"Transparency is how we begin changing the feedback cycle. Where we can begin exploring these other avenues forward. But until we have more people at the table who understand what is going on – more people who can have opinions – I don’t think we are going to have products that reflect society or reflect the diverse needs that these products serve."
Haugen stressed the importance of shifting the conversation away from content. She explained that censorship-based solutions need to be rewritten for every language and dialect. Haugen argued that it is product choices that are critical to protecting people in linguistically diverse places with peripheral languages, where Facebook often is the internet. She recognised that "pretty profound cultural changes" would also be needed to allow the company to "reorientate itself towards maximising for more kinds of stakeholders."
People aren’t aware of how many solutions that exist that aren’t about content. They are about friction. - Frances Haugen
Discussing the mass killing of Rohingya in Myanmar – officially recognised as a genocide by the US government a day earlier – and the evolving conflict in Ethiopia, Haugen conveyed the urgency of the problem. Haugen warned that without fundamental change, the consequences of Facebook’s underinvestment in basic safety systems will escalate over the next five to ten years.
I want to put you in the shoes of your future self. How you are going to feel five years from now, ten years from now, when there are ethnic violence incidents that are not hundreds of thousand … but we’re talking a million, multiple millions of people have died … What are we doing now to make sure we never see that world? - Frances Haugen
Haugen described Ireland as a "tech superpower" with the opportunity to implement legislation influencing the actions of the large platforms located in the country, and with the ability to "change world history". She explained the "concentric circles of agency and power that we all have" and urged Irish people "to stand in solidarity" with those who do not have the choice to opt out of Facebook.
The Schuler Democracy Forum applies Trinity's research in the Arts and Humanities to questions relating to democracy and the media. Based in the Trinity Long Room Hub, the Forum draws on the unique strengths and expertise of the Arts and Humanities: interpretative skills, nuance, long-term perspectives, critical analysis, empathy, and imagination. It puts questions related to identities, societies and cultures at the centre of its work, responds to the importance of lived experience, and recognises the ‘human factor’ in technology, media communication, and political administration.