How to Regulate Facebook Like a Broadcaster
By Jeff John Roberts Sep 25, 2017
Russian agents paid to promote thousands of Facebook accounts in a bid to poison the U.S. political system with propaganda and fake news. It's no surprise, then, that some in Congress are calling for Facebook to be subject to television's "Stand by Your Ad" rules, which require political ads to identify who is behind them.
The proposal is promising but just scrapes the surface of a larger question: How should the government apply TV-era rules to Facebook, the most powerful media platform the world has ever known?
For years, the idea of such regulation was a non-starter—and for two good reasons. First, unlike the airwaves that carry broadcast TV, the Internet is a big, open place with room for everyone to express themselves.
Second, the government is notoriously bad at technology (recall the roll-out of Healthcare.gov). This means attempts to regulate Facebook and other online platforms might fail, while also stifling the dynamism that has made the U.S. tech sector the envy of the world.
These are persuasive arguments and ones the tech industry has employed with great success for years. Indeed, until the political fortunes of the tech industry began to shift in 2017, Silicon Valley fixers helped to make the phrase "light touch regulation" a popular mantra on Capitol Hill.
Things are different now. The staggering power of Facebook makes even erstwhile giants like NBC or Fox look like pipsqueaks, and the lack of any oversight is producing some ugly outcomes.
The most obvious of these is l'affaire Russia. As the Daily Beast explained, the Kremlin was able to weaponize Facebook's automated ad system to spread fake stories about Pope Francis endorsing Donald Trump, and foment agitation about racial issues or Texas secession.
The service has also spun out of control in other ways: censoring pictures of alleged ethnic cleansing posted by human rights activists; offering to sell ads based on keywords like "Jew hater"; live-streaming murders and suicides; and so on.
These problems don't necessarily reflect deliberate decisions by Facebook executives, but this doesn't matter. Just imagine if CBS inadvertently sold secret political ads to the Chinese or broadcast a gang rape—the FCC, which punished the network over a Super Bowl nipple incident, would come down like a ton of bricks.
Regulating Facebook is harder, though, in part because TV's traditional carrot-and-stick approach—i.e., "you can use the airwaves if you abide by these license conditions"—doesn't really apply when it comes to the Internet.
There's also the question of whether it's even possible for Facebook to address some of its problems. After all, it's much easier for TV networks to supervise a series of 30-minute sitcoms than it is for Facebook to police billions of pieces of user-submitted content—a point the social network likes to make when something goes wrong. Nonetheless, as my former colleague Erin Griffith has pointed out in a tweet referring to two clickbait-based news companies, Facebook has shown itself remarkably capable of reining in bad behavior in the past.
The fundamental problem, then, may not be that Facebook can't fix its problems but that it won't. As influential social media sociologist Zeynep Tufekci explained in an article titled "Facebook's Ad Scandal Isn't a 'Fail,' It's a Feature," the company's series of PR problems stem from employing a highly scalable and amoral business model:
Human employees are expensive, and algorithms are cheap. Facebook directly employs only about 20,658 people — roughly one employee per 100,000 users. With so little human oversight and so much automation, public relations crises like the one that surrounded the ads for hate groups are inevitable.
Tufekci also notes it's impractical to expect Facebook users to simply go elsewhere because of network effects. In other words, the site is the only place where people can find everyone they know, so it's the only game in town.
This last point is important because it undercuts the notion that Internet companies must be treated differently than TV networks. Sure, television regulation was written to curb the big broadcasters' monopoly on airwaves but, today, Facebook enjoys an even more powerful monopoly (even if the monopoly is not built on physical scarcity).
This leaves the question of whether the government is tech-savvy or competent enough to regulate Facebook in the first place. It's a fair concern. The current TV laws—many of which were written in the era of antennas—would serve as clumsy tools to oversee an Internet company like Facebook, and could cause bureaucrats to run amok.
The good news is Congress and regulators have pulled off something like this before. In 1998, lawmakers wrote two landmark laws to address copyright and free speech concerns on the Internet, while helping to ensure web-based companies didn't get smothered with lawsuits about user behavior. The laws are widely regarded as a success.
If Congress is to regulate Facebook, which did not immediately respond to a request for comment, it would require the same sort of forward-looking thinking. For it to work, lawmakers would have to tinker not just with Federal Communications Commission rules, but also those of the Federal Election Commission, while also tweaking antitrust statutes. It's a big lift, but now that Facebook has become the equivalent of a single TV channel showing a slew of violence and propaganda, the time may have come to treat it as the broadcaster it is.