Views expressed in opinion columns are the author's own.
Facebook is a mess. A recent Wired cover showed Mark Zuckerberg covered in scrapes and bruises, a metaphor for the site's last two years. In that time, Facebook has been repeatedly exposed for failing to notice, or ignoring, malicious actors.
In 2016, ProPublica found that Facebook allowed housing advertisers to target white users and exclude others. Facebook's community standards are supposed to prevent hate speech, but another ProPublica story from 2017 found they frequently make the wrong call. Facebook's algorithms also allowed advertisers to reach "Jew haters" with a variety of anti-Semitic keywords.
Most notably, the Wired story detailed how suspected Russian operatives made fake accounts to influence the 2016 election. Facebook has, for too long, used the phrase "open platform" to shirk its responsibility to monitor hate and lies. The scope of Facebook's influence, coupled with its failure to act, demands smarter regulation.
[Read more: We don't need Facebook to lead a global community]
It was only after the 2016 election that Facebook seemed to notice the scope of its problem with fake accounts and internet bots. It discovered that a Russian group called the Internet Research Agency funded a number of accounts to sow discord in American political discourse.
The accounts ranged in content, from a pro-Texan secession page called Heart of Texas to an anti-police brutality page called Blacktivist. But as the 2016 election neared, the accounts' purposes became clear. The Heart of Texas page began playing on racist fear while the Blacktivist page urged voters to support Jill Stein. Russian groups were taking different approaches to the same goal — electing Donald Trump. In total, posts from six of these groups were shared more than 340 million times.
According to Facebook insider Roger McNamee, Russian use of Facebook during the 2016 election was nothing unusual: "They find 100 or 1,000 people who are angry and afraid and then use Facebook's tools to advertise to get people into groups."
Ribbonfarm editor-at-large Renee DiResta compared the current problems at Facebook to those of the finance sector in the recent past. She believes that high-frequency trading used technology "to distort information flows and access in much the same way [Facebook] is now being used to distort and game the marketplace of ideas."
She explains that bots "create the illusion of a mass groundswell of grassroots activity." These same groups also circulated manufactured stories to users who believed the fake headlines.
Purveyors of fake news saw that the most clicks came on stories that were pro-Trump; one particularly egregious example was a viral story that claimed Pope Francis had endorsed Trump's presidential bid.
By the time of the election, many of Facebook's top news stories, in terms of engagement, were fake. The combination of Russian-bought ads, Russian-run accounts and pro-Trump fake news had an untold influence on the election, but was largely ignored by Facebook.
Part of the problem is that public opinion has not caught up to reality. Some argue that Facebook is still a tool of the people — young people, in fact. They'll often point to the use of social media during the Arab Spring as proof that the internet serves a populist, even revolutionary, function.
But in so many other cases, Facebook is bending to existing power structures. Manipulation of Facebook aided the election of Trump and continues to allow hate speech. Its insistence on prioritizing the notion of an "open platform" gave foreign governments and fake accounts greater influence. Today, Facebook does not belong to the people — it belongs to those who know how to best exploit its flaws.
As DiResta put it, "The downstream cost of serving users disinformation, conspiracies, and radicalized propaganda became clear in the elections of 2016 and 2017. […] And in the meantime, the marketplace of ideas is growing increasingly inefficient as unchecked manipulation influences our most important conversations."
We cannot wait for Facebook and other social media companies to self-correct. It's not just elections that matter: Facebook is failing to monitor hate speech and discriminatory advertising. It's comforting to pretend that a free platform plus free speech will produce a good website, but it's not true. From hate speech prevention to campaign spending rules, we need more regulation.
Jack Lewis is a senior government and politics major. He can be reached at email@example.com