On Tuesday, Meta's (META) company-appointed oversight committee released a report criticizing the special treatment given to politicians, celebrities, and business partners on Facebook and Instagram, saying it had found "several shortcomings."

The Oversight Board's criticisms center on Meta's "cross-check" program, first exposed by The Wall Street Journal in 2021.

While the company has since introduced changes to the program, the board's report reveals that the system still treats posts that violate company policies differently if the poster has a high enough follower count. Offensive posts from everyday users are quickly removed, but posts from VIPs are often allowed to stay up on the platforms.

"The board is concerned about how Meta has prioritized business interests in content moderation," the report reads, adding that the company "provided extra protection for the expression of certain users."

Meta created the Oversight Board in 2020 to weigh in on difficult moderation questions involving free speech and human rights, including high-profile cases that followed the January 6 attack on the Capitol. The group, made up of lawyers, human rights experts, and academics, can recommend changes to Meta, but the company isn't obligated to adopt them.

In its report, the board made more than two dozen recommendations to Meta. It said the company should be "radically" transparent about which users are given special treatment through the cross-check process. The board is also calling on Meta to hide potentially rule-breaking posts while they're under review, whether or not the post was made by a popular user.

The board also acknowledged Meta's responsibility to prioritize speech that is "of special public importance." It suggested that the company split its content moderation system into two tracks: one meant to meet Meta's "human rights responsibilities" and another meant to protect users the company considers a "business priority."

"To fully address the number of recommendations, we've agreed with the board to review and respond within 90 days," Meta wrote in a statement following the release of the board's report.

For most users, content moderation decisions are outsourced to contractors or algorithms, but the cross-check system funnels posts made by VIPs through a different process. For these "entitled entities," decisions about potentially problematic content are made by Meta employees and contractors with some amount of "language and regional expertise," rather than by the outsourced moderators who handle content from general users.

"We built the cross-check system to prevent potential over-enforcement (when we take action on content or accounts that don't actually violate our policies) and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe," the company wrote in a statement.