When former White House adviser-turned-podcaster Steve Bannon called for the beheading of Dr. Anthony Fauci and FBI director Christopher Wray, the tech platforms reacted. Twitter, YouTube, and Spotify all banned him and his podcast relatively quickly, cutting off access to their millions of users. Apple Podcasts, however, took a different stance. The most popular podcast app let his show stay live in its directory so that, months later, when Bannon encouraged his listeners to converge on the Capitol to protest election results, people still had an easy way to access his thoughts. His show, even this week, ranks among Apple Podcasts’ top 20 news podcasts.
A story from ProPublica in January pointed out the dangers of not moderating someone like Bannon. It’s reasonable to want Apple to not benefit from clearly harmful voices, but the incident speaks to how unprepared the podcast industry is to moderate: companies face huge challenges in even finding offending content, and there’s little to no transparency from the big players in how they monitor the listings in their apps. Plus, people in the space have real, philosophical concerns about the extent to which podcasting’s open ecosystem should be policed.
A disparate network of companies makes up the podcasting world, including apps, hosting services, sales teams, and networks. Moderation will need to happen across these companies to be effective, and right now, that effort doesn’t work the way it does at tech monoliths like Facebook, Twitter, or YouTube, which can remove someone with the push of a button. Put simply, podcasting isn’t ready for full-scale, widespread moderation — if that’s even what the industry wants.
“There’s no podcasting company that has the scale, or the reach, or the resources, to be able to do anything like [that],” says Owen Grover, the former CEO of Pocket Casts, when asked whether he thinks the podcasting ecosystem could monitor shows like Facebook does the posts, images, and videos on its platform. “If the podcasting industry cares about this stuff … it’s going to require multiple organizations that exist across the industry value chain.”
Moderation isn’t a simple task, and even platforms like Facebook and Twitter routinely get it wrong. Audio presents an even tougher challenge. For one, new content streams into the space rapidly: a report published this month by podcast marketing company Chartable says 17,000 shows launch weekly. Moderating them would mean scanning audio, whether with actual human ears, transcripts, or software, and then discerning whether a show crosses the line. That also assumes the companies in the space even care to moderate.
“It’s quite hard to do it at scale,” says Mike Kadin, founder and CEO of the podcast hosting platform RedCircle. “We would have to transcribe everything, maybe, and apply some automated filters to look at everything. A: that’s expensive, and B: even if we could get everything in text, I don’t think a computer can understand the nuance of some of these issues, so it’s super challenging, and we do the best we can.”
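For a sense of what Kadin is describing, here is a minimal sketch of a transcript-and-keyword filter in Python. The flagged-terms list is a made-up placeholder, not any company’s real policy, and, as he notes, simple string matching captures none of the nuance a human reviewer would:

```python
# Illustrative sketch only: a crude transcript-and-keyword filter.
# FLAGGED_TERMS is a hypothetical placeholder list, not a real policy.

FLAGGED_TERMS = {"example banned phrase", "another flagged term"}


def flag_transcript(transcript: str) -> list[str]:
    """Return any flagged terms that appear in an episode transcript."""
    text = transcript.lower()
    return [term for term in sorted(FLAGGED_TERMS) if term in text]


if __name__ == "__main__":
    # The transcript itself would come from a speech-to-text step, which is
    # the expensive part Kadin is pointing to.
    sample = "a sample episode transcript produced by some speech-to-text service"
    hits = flag_transcript(sample)
    # Any hits would go to a human reviewer rather than trigger automatic removal.
    print(hits or "no flagged terms found")
```

Even this toy version shows the economics: every episode has to be transcribed before the cheap part, the text scan, can run at all.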
Even in high-profile moments, the industry has been slow and inconsistent about moderation. It should have been easy to ban shows from the notorious conspiracy theorist Alex Jones in 2018, for example, but it took weeks to build out even an incomplete blockade across the industry. Spotify started out by removing specific episodes, with Apple Podcasts removing his shows a week later. After that point, a constellation of smaller podcast apps made their own decisions on whether Jones deserved a ban.
These efforts didn’t even remove the podcasts entirely. The Alex Jones Show is still available today on Google Podcasts and smaller apps like Castbox, and the open nature of RSS means you can still listen to his shows inside Apple Podcasts and other apps where they’re banned, if you seek them out.
All of which is to say, one of the most high-profile podcast deplatforming incidents wasn’t even wholly effective, which doesn’t bode well for a future of podcast moderation in which people want apps to take a heavier hand. Now, QAnon podcasts are flourishing on at least one hosting platform, Podbean, which also hosts Bannon’s podcast, and outright fraud has occurred on Apple’s podcast charts. Copycat podcasts have also sprung up on Anchor, Spotify’s podcast creation software. The industry isn’t catching every show that passes through its systems, meaning the problematic programming lives on until someone points it out, forcing the companies to respond. In other cases, the apps and hosting providers either struggle to find these programs or don’t care enough to bother with them.
This speaks to the core of podcasting’s moderation issues, and the industry’s selling point for many: its open nature. Podcasts are distributed through RSS feeds, which are essentially public files listing a show’s episodes that any app can link to. Most apps (apart from Spotify, Audible, and Amazon Music) effectively serve as search engines for these feeds. As long as a show is hosted online somewhere, it can generally show up in these apps when someone searches for it. Apple, in particular, plays an integral role in the space because it gives smaller podcast apps the ability to incorporate its catalog, meaning Apple’s moderation decisions ripple throughout the industry.
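To make that concrete, here is a minimal sketch, in Python, of how any app can turn a feed URL into a list of episodes; the URL below is a placeholder, not a real show:

```python
# A minimal sketch of what an RSS feed is in practice: a public XML file any
# app can fetch and read episodes out of. The feed URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # placeholder, not a real show


def list_episodes(feed_url: str) -> list[dict]:
    """Fetch an RSS feed and return basic metadata for each episode."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    episodes = []
    for item in root.iter("item"):  # each <item> element is one episode
        enclosure = item.find("enclosure")  # points at the audio file itself
        episodes.append({
            "title": item.findtext("title"),
            "published": item.findtext("pubDate"),
            "audio_url": enclosure.get("url") if enclosure is not None else None,
        })
    return episodes
```

Nothing in that flow passes through a central gatekeeper, which is why removing a show from an app’s directory hides it from search but doesn’t take the audio offline.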
But because the ecosystem is diffuse and multiple podcast indexes exist, most companies end up having to make moderation decisions themselves. The teams’ jobs become easier if a particular program gains the mainstream media’s attention — as Jones’ did — because the team then knows what it’s looking for. But doing their own, preemptive moderation work is tough, if not nearly impossible, because day-to-day operations often involve small groups with limited resources.
One podcast app creator, Xavier Guillemane, who made the popular Android podcast player Podcast Addict, says he fills his catalog with shows both from Apple Podcasts and The Podcast Index, a podcast search engine. He relies on user reports for moderation, and if he receives a report, he first checks Apple Podcasts and Google Podcasts to see if the show is listed there.
“If it is then it means that the content does not violate their content policies,” he says over email. “If not, then I make sure that this podcast isn’t visible in any popular / suggested lists. That’s all I can do for moderation as I’m developing this app alone. With more than 2 million podcasts available, and with podcasts available in every language, there’s nothing more I can do.”
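The first step Guillemane describes, checking whether a reported show is still listed by Apple, is roughly what a query against Apple’s public iTunes Search API looks like. This sketch matches by title, which is a simplification; a real check would compare feed URLs or directory IDs:

```python
# A rough sketch of Guillemane's first check: is the reported show still listed
# in Apple's public directory? Uses the iTunes Search API; the title is a placeholder.
import json
import urllib.parse
import urllib.request


def listed_on_apple_podcasts(show_title: str) -> bool:
    """Return True if a podcast search for this title returns any results."""
    query = urllib.parse.urlencode({"media": "podcast", "term": show_title})
    url = f"https://itunes.apple.com/search?{query}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data.get("resultCount", 0) > 0


if __name__ == "__main__":
    # A show missing from Apple's directory is a signal, not proof, that it broke
    # a major platform's rules and should at least stay out of curated lists.
    print(listed_on_apple_podcasts("some reported show title"))
```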
Grover echoed this idea, saying user reports were mainly how Pocket Casts policed its catalog. Those reports weren’t always reliable, however. “Signals from listeners are not always a good way to go because I will tell you that the whole notion of libertarian, do not censor — these things are powerful currents inside of podcasting,” he says, adding that many users saw the Jones removal as censorship.
Apple and Spotify, the two largest podcast players, each have their own set of community guidelines. Neither platform allows content that encourages violence, for example, or shows that infringe on copyright. Spotify specifically prohibits programs that promote pyramid schemes, while Apple doesn’t allow Nazi propaganda “as restricted by local law.” As with most terms of service, though, it’s hard to grasp how moderation works in practice, and both companies are cagey about how exactly they moderate.
Spotify, which also owns one of the biggest podcast hosting platforms, Megaphone, issued a statement for this story saying Spotify uses a “variety of algorithmic and human detection measures to ensure that content on our platform is in keeping with our long-standing policies.” Apple Podcasts spokesperson Zach Kahn declined to comment.
Beyond the listening apps, podcast hosting platforms, like Podbean, also play a key role in moderation. While they don’t necessarily control distribution, they’re the ones keeping podcasts online and available. Hosting services haven’t historically been at the center of the moderation debate, but when Amazon Web Services booted Parler, a chat app known for far-right material, off its servers, it underscored the critical role these hosts play. Podcast hosting platforms have a particular incentive to moderate when they help shows make money, or sell ads for them, because brands generally don’t want to advertise on a controversial show. Otherwise, the hosting platforms don’t have much reason to rein in their own customers.
At RedCircle, the team needs to moderate because it monetizes its users’ shows, but because the company only employs 11 people, Kadin says they can’t listen to or check out every program that joins the hosting service. Instead, the team reviews the shows that are the most popular each week to make sure they’re within the company’s content guidelines, including copyright, and also to ensure they’re receiving proper account support from RedCircle. Meanwhile, Spreaker, a company that’s now owned by iHeartMedia, uses algorithms and a 10-person team to review shows, says Andrea De Marsi, the company’s COO. They mostly focus on the shows that Spreaker monetizes through its advertiser marketplace and try to avoid taking sides on political rhetoric, so long as a podcaster doesn’t say or do anything illegal.
RedCircle says it’s caught some issues itself, like a neo-Nazi podcast that used obvious imagery, while Spreaker has removed dangerous propaganda creators, like ISIS, based on reports it received from law enforcement agencies.
Even Podiant, a podcast hosting platform that prominently advertises itself as a team of “compassionate liberals,” doesn’t have the bandwidth to screen new customers and mostly monitors shows based on user reports. “It’s a really tricky task, especially at the hosting level,” says Podiant founder Mark Steadman.
Acast, another major hosting provider, says it’ll soon be publishing community guidelines for its service.
“This topic is something Acast takes very seriously, and we know we have a responsibility to constantly learn and work on new ways to support podcasters, listeners and advertisers alike,” says Susie Warhurst, SVP of content at Acast, in an emailed statement.
Ultimately, it’s the bigger companies that will have the most say in how moderation happens in podcasting. But because of the system’s open nature, there’s only so far the biggest of them, Apple, can go in policing its platform. Asking it to remove a show from its directory is like asking it to make a specific webpage inaccessible in Safari. Is that something people want? Podcasting has, so far, avoided crowning one platform as king, meaning anyone, on both the creator and business side, can enter the space and possibly find success in it. That’s what makes podcasting great, even if it leaves the industry without clear answers on moderation.