In a move to curb the spread of misinformation, Facebook-owned Instagram has announced that it will allow US users to report content they consider false.
Content flagged as false will then be removed from search results, preventing more people from being exposed to the misinformation.
The company also said it may expand the feature to other countries, but for now it is launching only in the United States.
According to Facebook spokeswoman Stephanie Otway, “This is an initial step as we work towards a more comprehensive approach to tackling misinformation.”
While social media platforms like Instagram help everyday people improve their lives and even make money doing things like selling feet pics online, they can also spread misinformation that makes things worse for everyone.
That’s why this announcement is good news for a lot of people.
The move is also clearly an attempt to reduce, or even stop, some of the massive misinformation that affected not only the last US presidential election but many others around the world.
Given how readily people believe what they see on social media these days, this is certainly a step in the right direction.
Some see it as too little, too late, though, considering the damage that has already been done, and continues to be done, by the spread of fake news.
The problem of misinformation on these social platforms is not only political but also medical.
One of the biggest medical misinformation concerns worrying experts remains content that dissuades parents from vaccinating their children.
The staggering number of people who believe and act on such misinformation continues to stun experts, who are calling on social media giants like Facebook to do more.
Hopefully this is only the first of many such steps.