On March 3, the New York Times reported that YouTube had launched a large-scale crackdown on misleading and inflammatory content, with thousands of conspiracy and far-right videos being removed from the website. Dealing with deceptive content has become a pressing issue for companies like Facebook and Google, whose services have been widely used as platforms for spreading misinformation and organizing hate groups. Should tech companies take steps to curb malicious content on their platforms, or should free speech remain paramount?

R Matthews ’19

I'm definitely a proponent of an open internet and opposed to censorship, but I think a key part of making that possible is educating people about the news sources they rely on. I've mentioned this in a previous opinion piece for the Justice: most news is biased in some way, and I don't think people do enough research on the news they receive or the sources it comes from, regardless of the platform (online, social media, television, etc.). I believe the sifting of information should be the responsibility of the user rather than the platform, but platforms should be more upfront about their biases. Tech companies don't need to take on this role, but should they choose to, I'd want proper disclosure about how they do it.

R Matthews ’19 is a Brandeis University Posse Scholar and is majoring in Computer Science and African & Afro-American Studies.

Amanda Kahn ’20

The crackdown on misleading and inflammatory information on platforms like YouTube is an important step toward halting the spread of misinformation. Fake news has become a serious problem, particularly on Facebook, where it had a significant impact on the 2016 presidential election, and it continues to spread across many internet platforms. I fully support this decision by YouTube, particularly because the main cause for the crackdown was the conspiracy theories circulating about the shooting in Parkland, Florida. As a student from Newtown, Connecticut, I support this effort to stop these conspiracy theories because they can be extremely damaging to the victims and their families. This occurred prominently after 12/14/12, and it was really difficult to see. It affected a lot of people in my town in a very damaging way, and it even resulted in some people calling parents of the victims and harassing them. I am very glad that YouTube is making an effort to curb these conspiracy theorists, and I hope other platforms follow its example.

Amanda Kahn ’20 is majoring in Biology with a minor in History. 

Roland Blanding ’21

If there is an ethical responsibility that these corporations have outside of their profit incentives, then I think that they ought to uphold freedom of civil discourse, lest they only propagate the illiberal institution of censorship, which forces people into violent discourse because they feel unheard. When someone makes a post on the internet, they open a dialogue with not only their proponents but also their opponents, and it is necessary to inform both sides for fruitful debate to take place. The exceptions are threats against the lives of individuals or damage to property, which should be dealt with immediately and not protected as speech. I think that a better world is one where we let people choose what voices they want to listen to, and where we understand that viewpoints are not pernicious because they engender pain within us, but because they are principally wrong.

Roland Blanding ’21 is a member of Brandeis Academic Debate and Speech Society and the Men of Color Alliance.

Julianna Scionti ’20

The paradigm set up in the question is legally irrelevant. The First Amendment reads, “Congress shall make no law… abridging the freedom of speech.” The Fourteenth Amendment extends this standard to the states. As much as people may want the First Amendment to apply to corporations like YouTube, it does not, because they are private organizations. Even if the First Amendment did apply to private organizations, the content YouTube will be regulating would largely not be protected under it: inflammatory language that incites lawless action is already unprotected, and false content can be actionable under libel law. Free speech is not all-encompassing.

Julianna Scionti '20 is co-founder and vice president of the Brandeis Drawing Club and is majoring in Politics. She is also an illustrator for the Justice.

Nyomi White ’20

I find it really interesting that YouTube specifically is taking these steps, because I don’t really believe that tech companies should be policing these things. I think that it disrupts the free flow of ideas. That being said, the way YouTube’s algorithms work means that if you’re looking at these videos, those are the videos that are going to keep popping up in your “suggested” section. It’s cyclical. The people making and watching these kinds of videos are just going to continue to delve deeper into that sort of ideology. I think that maybe instead of fully taking the videos down, there should be a way for your “suggested” videos or “related content” to include something completely unrelated. If you’re watching right-leaning video after right-leaning video, you should also have a left-leaning video come up with an opposing view, to promote looking at a situation in a manner that is not your own.

Nyomi White ’20 is an American Studies major.

Lilly Hecht ’17

The social media platforms through which Russia successfully influenced our election — a basic tenet of our democracy — should, within reason and with moderation, be empowered to prevent such democracy-threatening content in the future. Though foundational to the viability of our democracy, our constitutionally enshrined rights have always been subject to balance with the common welfare, especially in extreme times. Though one might argue that we face no explicit imminent threat as a country at the moment, we would do well to remember that there are outer limits to our exercise of free speech. Historically, epidemics have enabled states to legitimately employ their police power to protect the larger population via quarantine and mandatory vaccination, despite our individual right to bodily autonomy. Under Oliver Wendell Holmes’ “clear and present danger” legal standard, dissenting speech during wartime could be censored by the courts when believed to directly threaten national security. Legally, our rights — free speech among them — are not limitless to the point of allowing us to imperil others.

Lilly Hecht ’17 works at the Schuster Institute for Investigative Journalism. She was also a Legal Studies Undergraduate Departmental Representative.