Facebook has announced new steps to make groups safer.
Facebook already bars users who manage groups from creating new groups similar to ones the company has removed for violating site rules. Now the company says it will prevent administrators and moderators of removed groups from creating any group at all, not just ones on similar topics, for 30 days.
Facebook users who have violated any of the company's rules will also need new posts approved by an administrator or moderator for 30 days before they appear in a group. If administrators or moderators approve posts that break Facebook's rules, the company says it will remove the group.
Groups that don't have an active administrator to monitor the online space will also be archived on the social network, which means users will still be able to view the content but won't be able to post anything new to the group. Facebook will also stop recommending health groups to users, though people can still search for them on the social network. Health misinformation, including about vaccines, has become a bigger concern amid the novel coronavirus outbreak.
Facebook has been promoting groups as more users turn to more private online spaces to meet new people or keep in touch with family and friends. While Facebook's rules apply to groups, the feature has at times made moderation harder for the company. Some anti-vaccination Facebook groups, for example, have a higher level of privacy in which members must be approved before joining, The Guardian reported last year. That can make it harder for others to report posts that violate Facebook's rules. Taking down content can also be a game of whack-a-mole for social networks.
In June, Facebook said it had removed 220 Facebook accounts, 95 Facebook-owned Instagram accounts, 28 pages and 106 groups tied to the far-right boogaloo movement. Two members of the boogaloo movement allegedly conspired in a Facebook group to kill federal security officers in Oakland, California, according to the FBI. CNET also reported on a private group, "Justice for George Floyd," that was filled with racist content. Facebook didn't remove the group after it was brought to the company's attention, but it is no longer visible to the public as of Wednesday.
In August, Facebook removed 790 groups, 100 pages and 1,500 ads tied to the far-right conspiracy theory QAnon, which falsely claims there is a "deep state" plot against President Donald Trump and his supporters.
For the first time, Facebook also disclosed how much hate speech content it has removed from groups. The company relies on a mix of technology and user reports to find banned content. Over the past year, Facebook removed 12 million pieces of hate speech content in groups, 87% of which was flagged before any user reported it. The company said it also took down about 1.5 million pieces of content in groups for organized hate, 91% of which was found proactively. Facebook said it removed more than 1 million groups for violating those policies.
Facebook defines hate speech as "a direct attack on people based on what we call protected characteristics: race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability." Users can't compare Black people to apes, for example, refer to women as objects or use the word "it" to describe transgender or non-binary people, according to the site's community standards.
The amount of content Facebook has removed represents a fraction of the posts in groups. More than 1.4 billion people use Facebook Groups every month, and there are more than 10 million groups on the social network.