Tumblr, Porn, and Internet Intermediaries

In the hubbub surrounding this week’s acquisition of the blogging platform Tumblr by born-again internet hub Yahoo!, I thought one of the most interesting observations concerned the regulation of pornography. It led, by a winding path, to a topic near and dear to the Concurring Opinions gang: Section 230 of the Communications Decency Act, which generally immunizes online intermediaries from liability for user-generated content. (Just a few examples of many ConOp discussions of Section 230: this old post by Dan Solove and a January 2013 series of posts by Danielle Citron on Section 230 and revenge porn here, here, and here.)

Apparently Tumblr has a very large amount of NSFW material compared to other sites with user-generated content. By one estimate, over 11% of the site’s 200,000 most popular blogs are “adult.” By my math that’s well over 20,000 of the site’s power users.

Predictably, much of the ensuing discussion focused on the implications of all that smut for business and branding. But Peter Kafka explains on All Things D that the structure of Tumblr prevents advertisements for family-friendly brands from showing up next to pornographic content. His reassuring tone almost lets you hear the “whew” from Yahoo! investors (as if harm to brands were the only relevant consideration about porn — which, for many tech journalists and entrepreneurs, it is).

There is another potential porn problem besides bad PR, and it is legal. Lux Alptraum, writing in Fast Company, addressed it. (The author is, according to her bio, “a writer, sex educator, and CEO of Fleshbot, the web’s foremost blog about sexuality and adult entertainment.”) She somewhat conflates two different issues — understandably, since they are related — but that’s part of what I think is interesting. A lot of that user-posted porn violates copyright law, or regulations meant to protect minors from exploitation, or both. To what extent might Tumblr be on the hook for those violations?

First, some background for those less obsessed with intermediary liability. Generally, every site built on massive quantities of user-generated content faces a similar problem: potential liability for illegal things its users do. U.S. law addresses these issues differently depending on the underlying legal infraction. For most of them, Section 230 effectively immunizes the intermediary from any liability. If a user-generated post on Tumblr contains defamation, or an invasion of privacy, or harassment — as many porn blogs surely do — the company cannot be sued. Some intermediaries may take down such material voluntarily, but not under legal compulsion. (It is different in Europe, of course).

Section 230 does not protect intermediaries from liability for intellectual property infringement. Instead, that is the function of the notice-and-takedown regime of the Digital Millennium Copyright Act. Under the best and dominant interpretation of the law, Tumblr still has no obligation to police its user-generated content for unauthorized copies. Rather, intermediaries must provide a means for copyright owners to contact them and complain of unlicensed uses. Once notified, generally they must remove infringing content. Not surprisingly, Alptraum’s post suggests that most of the porn on Tumblr is infringing someone’s copyright. Yet these IP issues are quite familiar and not at all limited to sexually explicit material. As long as Yahoo!/Tumblr continues to comply with the DMCA, this shouldn’t present too serious a legal obstacle, no matter how little clothing people wear in user-posted photos.

(By the way, Eric Goldman has a very interesting ongoing project comparing these two safe harbors, and others, to divine their ideal form.)

What’s unique about porn is another layer of regulation that Alptraum also discusses:

By federal law, anyone who creates adult content must maintain a detailed database of records proving that everyone appearing in the project is above the age of 18 … . But the law doesn’t just apply to people who shoot adult content (also referred to as “primary producers”), it also applies to people who upload and distribute adult content (known as “secondary producers”), who are expected to maintain their own records–or, at the very least, maintain a list with the location of the records that apply to the content they’re showcasing. Not maintaining these records is considered a federal offense–even if everyone in the content you’re distributing is above the age of 18.

Even though child pornography sometimes gets used as a justification for overbroad censorship, it still deserves serious legal regulation, as Derek Bambauer has spelled out. These recordkeeping requirements would seem to extend to every Tumblr user, and potential penalties include imprisonment and fines. More important for Yahoo!, they are anchored in federal criminal law (18 U.S.C. Section 2257) and the DOJ regulations interpreting it. Criminal offenses are not shielded by Section 230. In other words: users’ failure to obey this law could be attributable to Yahoo!/Tumblr.

So, why isn’t this a pretty big problem for the acquisition, and for every other platform through which users can disseminate pornographic images? After all, the familiar rationale for immunities like Section 230 and the DMCA safe harbor — that it would be impossible for massive online fora to review all user contributions for potential liability — applies equally to Section 2257 liability. It is no easier for Tumblr to sift through millions of images to identify which ones are pornographic and then which of those link to required age records (as presumably almost none of them do). Yet no one at Yahoo! or in the press seems the least bit bothered by Section 2257 — I found no one other than Alptraum even raising the issue.

Which brings us to another recent Danielle Citron post on immunity. As she says there, criminal sanctions only matter if they are enforced. And, according to the remarkably detailed Wikipedia entry on Section 2257, these rules seldom are. Even if DOJ went on a big campaign to crack down on violators, one might assume that it would use its prosecutorial discretion to focus on the producers of pornography most directly responsible for potential exploitation. The upshot is that Yahoo! probably doesn’t need to worry about either of Alptraum’s potential legal issues arising from Tumblr porn: the DMCA protects the company from copyright problems, and minimal enforcement effectively protects it from Section 2257 liability.

The lack of a safe harbor against Section 2257 did not prevent the explosive growth and innovation of Tumblr, or its sale for $1.1 billion. Put another way, a safe harbor is not necessarily the only way to protect intermediaries. Often it will be the best way. But this story answers the allergic reaction of some observers when people like Citron propose tweaks to safe harbors to address serious social issues like revenge porn. If the exceptions are narrow and careful, they need not destroy the value of the immunity.

3 Responses

  1. Bruce Boyden says:

    Bill, interesting post. It sort of suggests that the perception of the immunity is more important than the actual immunity.

  2. PrometheeFeu says:

    Of course, relying upon prosecutorial discretion presents very serious problems. It basically means that the government can blackmail those who have come to rely upon that discretion. Some day, the government will want something from Yahoo, and when Yahoo says no, the government will be able to point to Tumblr and ask managers, employees, or execs if they’re ready to go to jail.

  3. William McGeveran says:

    Clearly, prosecutorial discretion is not reliable protection. Look no further than CFAA prosecutions like the Aaron Swartz case. But lawyers rely on it all the time when assessing realistic risk, and that appears to be Yahoo!’s approach here.