Social media has long been a hotbed for violent extremism and hate speech, and heated arguments between users can quickly escalate into cyber-bullying. Some platforms, however, refuse to tolerate it. Instagram’s image-centered format allows people to post pictures that carry violent connotations or promote hate speech. In response, the platform laid out strict Community Guidelines in order to eliminate as much hate speech as possible. This helps Instagram uphold its reputation as a “safe” medium amid the scandals that Facebook, Twitter, and Google have fallen into.
Hate speech on Instagram has ranged from conspiracy theories and extremism (i.e. racism, political extremism, sexism) to cyber-bullying and suicide. For example, a comic created by Adam Ford, the editor of “The Christian Daily Reporter”, was taken down from Instagram for violating the Community Guidelines on hate speech. The comic drew a parallel between “abortion-on-demand and the forced enslavement of human beings”, similarities that, according to Ford, cannot be unseen. Instagram, however, viewed the comic as promoting slavery and condemning those who are pro-choice.
Personally, I have tested the strictness and efficiency of Instagram’s limits on viewing violent or gory content. For no personal reasons, I used Instagram’s search engine to look for posts regarding “suicide”. I expected to be taken straight to a page full of posts tagged with that word. Instead, a pop-up appeared stating that the images shown are of a sensitive nature, and that if I suffer from depression or any other mental health issue I should click the “Get Support” button. The button guided me to a page where Instagram offered several options to help my mental state: talking to a friend, talking to a mental health expert provided by Instagram, or a list of suicide prevention hotlines. With roughly 57 million of Instagram’s users between the ages of 13 and 17, the safety options provided are a great way to steer them away from such a topic. As someone who suffered from obsessive thoughts about suicide as a teenager and had easy access to posts on that topic, I was ultimately glad that today’s youth are offered help through an immensely popular social media platform like Instagram.
All in all, Instagram is clearly attempting to rid its platform of anything that could inflict danger or pain on its users. In protecting its users, Instagram is also protecting its own reputation and identity as a social media platform that is safe to use regardless of a person’s gender, age, race, or political and religious beliefs. But will it be able to please all of its users through its Community Guidelines? And if so, for how long will they hold up?