London: An internal system that has exempted prominent users, such as former US President Donald Trump, from some or all of its content moderation rules needs a significant overhaul, according to the semi-independent oversight board of Facebook parent company Meta.

The oversight board's report, which took more than a year to complete, said the system was "flawed in critical areas that the company must address." Meta asked the board to review the system after The Wall Street Journal revealed last year that many elite users were abusing it, posting content, such as harassment and incitement to violence, that would draw penalties for ordinary users.

According to the Journal, at least 5.8 million users were covered by the system as of 2020. Some VIP users were exempted from Facebook's rules altogether, while other rule-breaking posts were queued for reviews that never took place.

The system, known as "XCheck" or "cross-check," came to light in Facebook documents released by Frances Haugen, a former product manager turned whistleblower who had worked on online safety at the social media company. Her allegations that the company put profits ahead of safety made headlines around the world and prompted regulators to take action on hate speech and misinformation.

"Meta requested the review of the system so we can continue our work to improve the program," tweeted Nick Clegg, Meta's president of global affairs. "We have agreed to respond within 90 days to fully address the board's recommendations," he added.

Cross-check, which applies to Facebook and Instagram, was created to prevent "overpolicing," or the mistaken removal of content that does not actually violate the platform's rules, according to the company. But the oversight board's report found that the system treated users unequally and delayed the removal of rule-breaking content, which could pass through as many as five separate checks; decisions generally took more than five days.

The average decision time was 12 days for content posted by US users and 17 days for content from Afghanistan and Syria. Some decisions took far longer: one piece of content waited 222 days, more than seven months, for a decision, the report said without giving further detail.

Among the board's 32 recommendations, it said Meta "should give priority to expression that is important to human rights, including expression of particular public importance." Users placed on cross-check lists because of their importance to human rights, such as human rights defenders, advocates from marginalized communities, public servants and journalists, should be given higher priority than those included for business reasons, such as major corporations, political parties, musicians, celebrities and artists, the report said.

"Users should no longer receive special protection if they repeatedly post content that violates the rules, including those included because of their commercial importance," the board said.

The board also urged Meta to remove or hide content while it is under review, and recommended that the company "increase transparency around cross-check and how it operates," including by establishing "clear, public criteria" for who is included on the list.
Last year the board upheld Facebook's decision to ban Trump over concerns that he had incited violence leading to the riot at the U.S. Capitol. However, it said the company had declined to mention the cross-check system in its request for that ruling. The company is now weighing whether to reinstate Trump, with a decision due by January 7.

In a blog post, Clegg said Meta has already improved cross-check, including standardizing it to "run in a more consistent manner" across content from all 3 billion Facebook users and conducting an annual review to confirm its list of elite users and institutions.

Facebook established the oversight board to serve as the final arbiter of difficult content disputes after widespread criticism that it failed to act promptly and effectively against misinformation, hate speech and harmful influence campaigns. Its members include a former Danish prime minister, a former editor-in-chief of the British newspaper The Guardian, and scholars of law and experts on human rights.

Critics have questioned the board's independence and argued that its rulings on individual pieces of content serve to obscure larger problems at Facebook and to deflect concerns about government regulation.