Ville de Bitche is a French town which really does exist. Unfortunately for it, Facebook did not believe that it actually did exist. So it censored the town’s Facebook page. Now the company knows better.
Facebook has been under a lot of pressure these days to do a better job with monitoring the content posted by its users. This is largely due to the proliferation of hate speech and fake news on the world’s largest social network. But sometimes Facebook goes too far in the other direction.
Many users have found themselves in a sort of Facebook jail, suspended by the website because they posted a comment that someone objected to as racist or shared a link to some sort of fake news. Others have had their posts flagged or simply deleted by Facebook for one reason or another.
The company has also had to deal with countless fake accounts. Many of these are created by hackers attempting to phish for people’s private information. Others are used simply to spread fake news. This may explain why Facebook flagged a page created for a town called Bitche, whose name it apparently mistook for a profanity. But many wonder why Facebook did not first check whether the town actually existed.
Ville de Bitche is located in eastern France on the border with Germany. The town’s mayor, Benoît Kieffer, said in a statement that, “On 19 March, Facebook informed us that our page, Ville de Bitche, was no longer online, on the basis that it was ‘in violation of conditions applying to Facebook pages.’”
“The name of our town seemed to suffer from a bad interpretation,” he added.
“What has happened to the town of Bitche demonstrates the insufficiency and the limits of moderation tools, which only human judgment can properly assess,” he said. “At first, you wonder whether there was a technical problem. But as time went on, it came to look like real censorship.”
“We can be happy that social networks take responsibility and remove illegal, problematic content,” added Kieffer. “But the other side of the problem is making sure that human judgment keeps the upper hand over artificial intelligence.”
This may be a humorous mistake, but Facebook critics are sure to use this as another example of why the company should be regulated.