Mechanical Turk, Research Ethics, and Research Assistants
A recent faculty workshop by my witty and brilliant colleague Jonathan Zittrain on “ubiquitous human computing” (this youtube video captures in a different form what he was talking about) prompted me to think about some of the interesting ways in which platforms like Amazon’s Mechanical Turk interface with university research and research ethics.
For those unfamiliar, Mechanical Turk allows you to farm out a variety of small tasks (label this image, enter the data from this .pdf into a spreadsheet, take a photo of yourself with the sign “will turk for food,” etc.) at a price per unit you set. Millions of anonymous users can then do the task for you and collect the bounty, a form of microwork.
As Jonathan detailed, this raises a host of fascinating issues, but I want to focus on two that are closer to bioethics.
First, I have begun to see some legal academics recruiting populations for experimental work using Mechanical Turk, and there is an emerging literature on the pros and cons of subject recruitment from these populations. Are Mechanical Turkers “research subjects” within the legal sense of the term (primarily the Common Rule, if one receives federal funding) or the broader ethical sense? Should they be? Take as a tangible example the implicit bias research of the kind Mahzarin R. Banaji has made famous, and imagine it was done over something like Mechanical Turk. How (if at all) should the anonymity of the subject, the lack of any subject-experimenter relationship, the piecemeal nature of the task, etc., change the way an institutional review board reviews the research? It is a mantra in the research ethics community that informed consent is supposed to be a “process,” not a document, but how can that process take place in this anonymous, static cyberspace environment?
Second, consider research assistance.
I often have my R.A.’s read over papers before I send them out to hunt for typos (alongside the more substantive tasks I give them). Imagine that tomorrow, facing a shrinking research budget in times of fiscal austerity, I decided to farm the typo hunt out to Mechanical Turk because I could get results faster and at one tenth the price, since there are individuals in destitute circumstances willing to do the work at a rate far below what I pay my (wonderful, in case they are reading) R.A.’s. Even if each individual Turker were less accurate, it seems plausible that having four of them pore over each page might be better, and still cheaper, than using R.A.’s. Lest you think this only an interesting hypothetical, consider Samasource, whose mission statement suggests it “enables marginalized people, from refugees in Kenya to women in rural Pakistan, to receive life-changing work opportunities via the Internet” in just this way.
Would I have violated any rules at your university? Have I done something wrong? Perhaps I have deprived Harvard students of the opportunity to work closely with a faculty member (although on typo hunting?). Am I problematically circumventing Harvard’s minimum wage for R.A. work? Am I exploiting these Kenyan refugees or rural Pakistani women, or instead giving them “life-changing work opportunities via the Internet”?
I’d be curious to hear the thoughts of any readers, as well as any reports on whether your institution has a policy on this subject.