Maybe It’s Time To Treat Facebook Like A Public Utility
As more killings get uploaded to Facebook and remain on
the platform for hours, what will it take for the company to take
responsibility?
BY CALE GUTHRIE WEISSMAN 05.01.17 | 10:45 AM
Here’s a thought experiment: What would happen if every
few weeks an anchor on Bloomberg TV or Cheddar pulled out a gun and shot and killed
someone live on air? Those companies would shut down the program as fast as you
can say “over-the-top television service.” And if such acts of violence
happened that often on your local TV news broadcast, that station would
probably be taken off the air and face hefty fines.
Such questions inevitably arise every time a horrific act
of violence is streamed on Facebook Live or uploaded and shared in a post on
the platform. In just the last week and a half, the social network
made headlines when a disturbed man in Thailand streamed on Facebook Live the
murder of his young daughter in front of his wife, and when Steve Stephens
filmed his murder of an elderly man in Cleveland and uploaded the video to
Facebook. In both cases, the horrific videos were slow to come down: the
Cleveland video stayed up for roughly 24 hours, and the video of the infanticide
was removed only after a Thai government official and the BBC alerted Facebook.
And those are just the
offending videos that we know about. Who knows what other content is being uploaded
as we speak?
Facebook admits this is a problem, and says it’s doing
everything it can via artificial intelligence and human moderators to filter
out such content. Yet it keeps happening, and with 1.8 billion monthly users
and a strong focus on video, it’s bound to become a perennial problem for the
company.
That’s because Facebook–and other social media giants
like Twitter and YouTube–treat their content somewhat duplicitously. On the one
hand, these juggernauts rake in billions of ad dollars against the content
people upload. At the same time, Facebook doesn’t consider itself responsible
for this material and argues that it’s just a platform for the distribution of
content by its users. (This is why, in the recent case, the Thai government is
unable to sue Facebook.) Were Facebook legally deemed a publisher rather than a
platform, it would have to dramatically rethink how it approaches uploaded
content.
But given that Facebook makes advertising revenue off of
all the content published on it, why shouldn’t we begin treating Facebook more
like a publisher? Or, if you want to get even crazier, why not look at Facebook
videos as something akin to broadcast TV, thus under the regulation of the FCC?
This government agency has guidelines about what can and cannot be
broadcast—and it’s clear that murders and acts of torture wouldn’t make the cut
(at least not outside of a news-gathering capacity).
This, of course, is a farfetched idea. It’s highly
unlikely–especially in our current anti-regulatory political climate–that any
lawmaker would seriously consider classifying Facebook as a broadcaster, to say
nothing of rewriting digital content laws. In fact, the Trump administration is
already targeting net neutrality rules, arguing that high-speed internet service
should no longer be treated like a public utility.
The laws protecting these platforms from being liable for
their content were written in the 1990s, and digital culture has dramatically
changed since then. These once-tiny startups tinkering with the idea of “online
content” are now leading the business world. They make billions of dollars off
of uploaded material while eschewing any responsibility for it. In the wake of
near-weekly social media failures, perhaps it’s time to force all of these
companies–Facebook, YouTube, Twitter–to take responsibility for their content,
just like any other publisher.
Jonathan Taplin, author of the new book Move Fast and
Break Things, who researches and writes about the monopolistic tendencies of
Silicon Valley companies, says the legal questions stretch back decades. He cites the
Digital Millennium Copyright Act of 1998, a copyright law that limits liability
for websites that host infringing content. As Taplin describes it to me, this
act essentially “gave these companies what’s known as a safe harbor [so that]
no one can sue them for anything that’s on their platform.” In short, this
perennial problem of Facebook avoiding responsibility for murder videos
uploaded to its platform stems in part from the DMCA.
Similarly, Section 230 of the 1996 Communications Decency
Act is what grants platforms legal immunity from liability for the content their
users post. But, as Ars Technica points out, it is increasingly unclear whether
what Facebook does with uploaded material should still entitle it to that
protection. The more control the company exerts over how it delivers content–all
in the name of maximizing ad revenue–the more it acts like a publisher rather
than a platform, even as Facebook continues to claim it is unable to control
what’s uploaded.
So maybe it’s time to rethink how the law views
Facebook’s content. Just like other services that have become vital parts of
Americans’ everyday lives—such as electricity, water, and increasingly
broadband internet services—maybe Facebook itself is too important to be left
unregulated, growing in size and impact with every passing day.
As Taplin puts it, the idea that these companies have no
control over what the users put on their platform is “actually a fiction.”
Websites like Facebook and YouTube have quickly figured out how to filter out
things like pornography. The reason, he posits, is that raunchy content is
unseemly for advertisers–they would pull their funding immediately if it were
associated with porn. The only posts that Facebook and other companies seem
to feel responsible for–and act swiftly on–are those that interrupt the flow of
ad dollars. The way to stop this, says Taplin, is to remove the safe harbor so
these companies become legally responsible for what’s posted.
This is precisely why shocking and abhorrent content
continues to appear on the site–Facebook has no business imperative to
filter it out. Other online broadcasters do; they are legally and
financially responsible for the material they put up. And until companies like
Facebook are forced to confront the consequences of their users’ most
disturbing instincts, the issue will surely persist.