
Automated Filters Gone Wrong: The Small-Business Social Media Conundrum


An interesting news story recently emerged from Plymouth, England. Facebook users found their posts and comments removed, were banned from groups and were accused of bullying by the social media giant, all for seemingly talking about their city. How did this happen? It turns out that each person affected had at some point mentioned one of the city's historic landmarks, a large grassy hill that overlooks Plymouth Sound. The name of this place? Plymouth Hoe.

“Hoe” comes from the Anglo-Saxon word “hoh,” meaning “sloped ridge.” Facebook’s algorithm apparently saw the word and, believing it to be the misogynistic slur, slammed the ban button. And while the name may seem very strange to anyone not familiar with the city, to a local it seems perfectly normal. This is not an isolated incident either: The same thing happened to Facebook users discussing Devil’s Dyke in Sussex, England. Again, the algorithm misunderstood how the term was being used; in this case, “dyke” refers to a large ditch, or valley.

For businesses, these situations raise concerns about how automated systems are used to monitor and police what is posted on social media sites. If they cannot tell a user naming a place from one hurling an insult, what else will they struggle with?

While they are constantly tweaked and adjusted, the automated systems used by Facebook and other social media sites have been around for a while. They are designed to judge everything that is posted and stop harmful content, along with anything else that breaches the terms of service. The problem is that the net has been cast so wide, especially by Facebook, that it catches many users who have not broken any rules.
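To see how a filter can trip over a place name, consider a deliberately simplified sketch. This is not Facebook's actual system, whose internals are not public; the blocklist and place names here are illustrative assumptions. A filter that scans every word against a blocklist flags "Plymouth Hoe" just as readily as a genuine insult, while even a crude allowlist of known place names avoids the false positive:

```python
import re

# Hypothetical blocklist, for illustration only.
BLOCKLIST = {"hoe", "dyke"}

# Place names that legitimately contain blocked words.
PLACE_NAMES = ["plymouth hoe", "devil's dyke"]

def naive_filter(post: str) -> bool:
    """Flag a post if any blocklisted word appears, ignoring all context."""
    words = re.findall(r"[a-z']+", post.lower())
    return any(word in BLOCKLIST for word in words)

def context_aware_filter(post: str) -> bool:
    """Strip known place names first, then run the same keyword scan."""
    text = post.lower()
    for place in PLACE_NAMES:
        text = text.replace(place, " ")
    words = re.findall(r"[a-z']+", text)
    return any(word in BLOCKLIST for word in words)

print(naive_filter("Lovely evening walk on Plymouth Hoe"))          # True: false positive
print(context_aware_filter("Lovely evening walk on Plymouth Hoe"))  # False
```

Real moderation systems are far more sophisticated than this, but the failure mode is the same: a match on a surface pattern with no understanding of what the word means in context.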

For businesses that advertise on Facebook, the algorithms can become more than just an annoyance. If the automated system deems you guilty of breaking the rules, you can be locked out until your appeal succeeds. And until you regain access, any ad campaigns may continue to run without oversight, which can be expensive. This is not a hypothetical situation; it has happened to many small businesses across the world, from New Jersey to New Zealand, and it can be detrimental.

We have dealt with this ourselves. One client had their account locked for posting content of a “sexual nature.” The item in question was a short video ad that had someone dressed as Marilyn Monroe. One look would tell you that no human being made this decision. The automated system incorrectly believed they had broken the rules and locked the account, causing the client to lose three days of revenue as they struggled to regain access through, you guessed it, another automated system.

But how do you tackle this problem? If you do fall foul of the algorithms accidentally, what can you do? Aside from appealing it, the best approach is to diversify. While Facebook may be the biggest platform out there, plenty of new social media platforms have emerged, with more than a few beginning to make big waves.

Clubhouse, for example, has topped 8 million downloads, while the secure messaging app Signal has also grown quickly in popularity. Alongside the video-sharing app TikTok, which was the third-fastest growing brand of 2020, there are more than a few challengers to pick from. Maintaining multiple platforms to fall back on, rather than relying on just one, greatly lessens the impact of an algorithmic ban. It also gives people alternate ways to interact with you and allows you to reach a far larger audience.

Facebook is not alone in this. We've seen similar situations with YouTube and Twitter, and all three platforms are stuck in a tough position: they face pressure from major companies to do more to police what is said on their sites. Last September, the three companies agreed to a deal with the World Federation of Advertisers to do more to tackle hate speech and to give advertisers more tools to determine where their advertisements appear. This came after more than 1,000 companies, including Coca-Cola and Unilever among other prominent brands, boycotted Facebook ads, causing the company's stock to drop 8% in a single day.

With Facebook running full-page advertisements attacking Apple’s data collection policy change and styling itself as an ally for all small businesses, it is clear the company wants to attract more businesses to its site. But from what I’ve seen, Facebook has struggled to support those that have grown reliant on it. Automated customer service and content systems fall short—they’re often unable to properly tackle the problems they were put in place for. Facebook is taking steps to address these issues: Its new Oversight Board checks disputed posts that may have been incorrectly flagged, and it’s developing tools to allow advertisers to prevent their ads from running alongside certain types of content. But it seems that Facebook and other social media sites need to do more to properly deal with the problems and help small businesses.

Source : https://www.forbes.com/sites/forbesagencycouncil/2021/03/16/automated-filters-gone-wrong-the-small-business-social-media-conundrum/
