Facebook, the world's most popular social media app, has aggressively removed 2.60 crore (26 million) posts from terrorist organizations on its platform over the past two years. Most of these posts came from ISIS and Al Qaeda. Facebook has discovered groups belonging to these organizations operating on its platform on a large scale. The company has also banned more than 200 violent organizations so far. In addition, the accounts of terrorist groups have been banned.
In its statement, Facebook said that it has deleted terrorist posts with the help of artificial intelligence and human expertise. The company also removed some posts only after the attack in Christchurch, New Zealand. Facebook further said that these terrorist organizations had tried to spread bigotry through its platforms. In addition, Facebook implemented new rules in November to ban violent posts.
For reference, Facebook is preparing a nine-point industry plan in collaboration with the world's leading tech companies Google, Amazon, Microsoft and Twitter. Through this plan, the companies will gather information about how terror-related content is shared across their platforms. Facebook has also said that it must constantly update its policies so that violent posts can be stopped, and that it has to take tough measures against posts that spread hate. The video of the terrorist attack in New Zealand did not trip Facebook's security systems, because the platform had not encountered such violent content before, and even Facebook's machine-learning systems could not stop it. Facebook is working with the United States and Britain to curb such content.