Category: Technology

Is Eviction-as-a-Service the Hottest New #LegalTech Trend?

Some legal technology startups are struggling nowadays, as venture capitalists pull back from a saturated market. The complexity of the regulatory landscape is hard to capture in a Silicon Valley slide deck. Still, there is hope for legal tech’s “idealists.” A growing firm may bring eviction technology to struggling neighborhoods around the country:

Click Notices . . . integrates its product with property management software, letting landlords set rules for when to begin evictions. For instance, a landlord could decide to file against every tenant that owes $25 or more on the 10th of the month. Once the process starts, the Click Notices software, which charges landlords flat fees depending on local court costs, sends employees or subcontractors to represent the landlord in court (attorneys aren’t compulsory in many eviction cases).
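The filing rule in the quote above amounts to a simple threshold check run on a calendar trigger. A minimal sketch of that logic (the field names and structure are my assumptions for illustration, not Click Notices' actual code):

```python
from datetime import date

# Hypothetical reconstruction of the rule described above: file against
# every tenant who owes $25 or more on the 10th of the month.
FILING_DAY = 10
BALANCE_THRESHOLD = 25.00

def tenants_to_file(tenants, today):
    """Return the tenants an eviction filing would target today."""
    if today.day != FILING_DAY:
        return []
    return [t for t in tenants if t["balance_due"] >= BALANCE_THRESHOLD]
```

The point of the sketch is how little judgment the trigger involves: a $336 dispute and a $25 one are handled identically once the threshold is crossed.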

I can think of few better examples of Richard Susskind’s vision for the future of law. As one Baltimore tenant observes, the automation of legal proceedings can lead to near-insurmountable advantages for landlords:

[Click Notices helped a firm that] tried to evict Dinickyo Brown over $336 in unpaid rent. Brown, who pays $650 a month for a two-bedroom apartment in Northeast Baltimore, fought back, arguing the charges arose after she complained of mold. The landlord dropped the case, only to file a fresh eviction action—this time for $290. “They drag you back and forth to rent court, and even if you win, it goes onto your record,” says Brown, who explains that mold triggers her epilepsy. “If you try to rent other properties or buy a home, they look at your records and say: You’ve been to rent court.”

And here’s what’s truly exciting for #legaltech innovators: the digital reputation economy can interact synergistically with the new eviction-as-a-service approach. Tenant blacklists can ensure that merely trying to fight an eviction leads to devastating consequences in the future. Imagine the investment returns for a firm that owned both the leading eviction-as-a-service platform in a city and the leading tenant blacklist. Capture about 20 of the US’s top MSAs, and we may well be talking unicorn territory.

As we learned during the housing crisis, the best place to implement legal process outsourcing is against people who have a really hard time fighting back. That may trouble old-school lawyers who worry about ever-faster legal processes generating errors, deprivations of due process, or worse. But the legal tech community tends to think about these matters in financialized terms, not fusty old concepts like social justice or autonomy. I sense they will celebrate eviction-as-a-service as one more extension of technologized ordering of human affairs into a profession whose “conservatism” they assume to be self-indicting.

Still, even for them, some caution is in order. Brett Scott’s skepticism about fintech comes to mind:

[I]f you ever watch people around automated self-service systems, they often adopt a stance of submissive rule-abiding. The system might appear to be ‘helpful’, and yet it clearly only allows behaviour that agrees to its own terms. If you fail to interact exactly correctly, you will not make it through the digital gatekeeper, which – unlike the human gatekeeper – has no ability or desire to empathise or make a plan. It just says ‘ERROR’. . . . This is the world of algorithmic regulation, the subtle unaccountable violence of systems that feel no solidarity with the people who have to use it, the foundation for the perfect scaled bureaucracy.

John Danaher has even warned of the possible rise of “algocracy.” And Judy Wajcman argues that “Futuristic visions based on how technology can speed up the world tend to be inherently conservative.” As new legal technology threatens to further entrench power imbalances between creditors and debtors, landlords and tenants, the types of feudalism Bruce Schneier sees in the security landscape threaten to overtake far more than the digital world.

(And one final note. Perhaps even old-school lawyers can join Paul Gowder’s praise for a “parking ticket fighting” app, as a way of democratizing advocacy. It reminds me a bit of TurboTax, which democratized access to tax preparation. But we should also be very aware of exactly how TurboTax used its profits when proposals to truly simplify the experience of tax prep emerged.)

Hat Tip: To Sarah T. Roberts, for alerting me to the eviction story.

The Emerging Law of Algorithms, Robots, and Predictive Analytics

In 1897, Holmes famously pronounced, “For the rational study of the law the blackletter man may be the man of the present, but the man of the future is the man of statistics and the master of economics.” He could scarcely envision at the time the rise of cost-benefit analysis, and comparative devaluation of legal process and non-economic values, in the administrative state. Nor could he have foreseen the surveillance-driven tools of today’s predictive policing and homeland security apparatus. Nevertheless, I think Holmes’s empiricism and pragmatism still animate dominant legal responses to new technologies. Three conferences this Spring show the importance of “statistics and economics” in future tools of social order, and the fundamental public values that must constrain those tools.

Tyranny of the Algorithm? Predictive Analytics and Human Rights

As the conference call states:

Advances in information and communications technology and the “datafication” of broadening fields of human endeavor are generating unparalleled quantities and kinds of data about individual and group behavior, much of which is now being deployed to assess risk by governments worldwide. For example, law enforcement personnel are expected to prevent terrorism through data-informed policing aimed at curbing extremism before it expresses itself as violence. And police are deployed to predicted “hot spots” based on data related to past crime. Judges are turning to data-driven metrics to help them assess the risk that an individual will act violently and should be detained before trial. 

Where some analysts celebrate these developments as advancing “evidence-based” policing and objective decision-making, others decry the discriminatory impact of reliance on data sets tainted by disproportionate policing in communities of color. Still others insist on a bright line between policing for community safety in countries with democratic traditions and credible institutions, and policing for social control in authoritarian settings. The 2016 annual conference will . . . consider the human rights implications of the varied uses of predictive analytics by state actors. As a core part of this endeavor, the conference will examine—and seek to advance—the capacity of human rights practitioners to access, evaluate, and challenge risk assessments made through predictive analytics by governments worldwide. 

This focus on the violence targeted and legitimated by algorithmic tools is a welcome chance to discuss the future of law enforcement. As Dan McQuillan has argued, these “crime-fighting” tools are both logical extensions of extant technologies of ranking, sorting, and evaluating, and raise fundamental challenges to the rule of law: 

According to Agamben, the signature of a state of exception is ‘force-of’; actions that have the force of law even when not of the law. Software is being used to predict which people on parole or probation are most likely to commit murder or other crimes. The algorithms developed by university researchers uses a dataset of 60,000 crimes and some dozens of variables about the individuals to help determine how much supervision the parolees should have. While having discriminatory potential, this algorithm is being invoked within a legal context. 

[T]he steep rise in the rate of drone attacks during the Obama administration has been ascribed to the algorithmic identification of ‘risky subjects’ via the disposition matrix. According to interviews with US national security officials the disposition matrix contains the names of terrorism suspects arrayed against other factors derived from data in ‘a single, continually evolving database in which biographies, locations, known associates and affiliated organizations are all catalogued.’ Seen through the lens of states of exception, we cannot assume that the impact of algorithmic force-of will be constrained because we do not live in a dictatorship. . . .What we need to be alert for, according to Agamben, is not a confusion of legislative and executive powers but separation of law and force of law. . . [P]redictive algorithms increasingly manifest as a force-of which cannot be restrained by invoking privacy or data protection. 

The ultimate logic of the algorithmic state of exception may be a homeland of “smart cities,” and force projection against an external world divided into “kill boxes.” 

We Robot 2016: Conference on Legal and Policy Issues Relating to Robotics

As the “kill box” example above suggests, software is not just an important tool for humans planning interventions. It is also animating features of our environment, ranging from drones to vending machines. Ryan Calo has argued that the increasing role of robotics in our lives merits “systematic changes to law, institutions, and the legal academy,” and has proposed a Federal Robotics Commission. (I hope it gets further than proposals for a Federal Search Commission have so far!)

Calo, Michael Froomkin, and other luminaries of robotics law will be at We Robot 2016 this April at the University of Miami. Panels like “Will #BlackLivesMatter to RoboCop?” and “How to Engage the Public on the Ethics and Governance of Lethal Autonomous Weapons” raise fascinating, difficult issues for the future management of violence, power, and force.

Unlocking the Black Box: The Promise and Limits of Algorithmic Accountability in the Professions

Finally, I want to highlight a conference I am co-organizing with Valerie Belair-Gagnon and Caitlin Petre at the Yale ISP. As Jack Balkin observed in his response to Calo’s “Robotics and the Lessons of Cyberlaw,” technology concerns not only “the relationship of persons to things but rather the social relationships between people that are mediated by things.” Social relationships are also mediated by professionals: doctors and nurses in the medical field, journalists in the media, attorneys in disputes and transactions.

For many techno-utopians, the professions are quaint, an organizational form to be flattened by the rapid advance of software. But if there is anything the examples above (and my book) illustrate, it is the repeated, even disastrous failures of many computational systems to respect basic norms of due process, anti-discrimination, transparency, and accountability. These systems need professional guidance as much as professionals need these systems. We will explore how professionals–both within and outside the technology sector–can contribute to a community of inquiry devoted to accountability as a principle of research, investigation, and action. 

Some may claim that software-driven business and government practices are too complex to regulate. Others will question the value of the professions in responding to this technological change. I hope that the three conferences discussed above will help assuage those concerns, continuing the dialogue started at NYU in 2013 about “accountable algorithms,” and building new communities of inquiry. 

And one final reflection on Holmes: the repetition of “man” in his quote above should not go unremarked. Nicole Dewandre has observed the following regarding modern concerns about life online: 

To some extent, the fears of men in a hyperconnected era reflect all-too-familiar experiences of women. Being objects of surveillance and control, exhausting laboring without rewards and being lost through the holes of the meritocracy net, being constrained in a specular posture of other’s deeds: all these stances have been the fate of women’s lives for centuries, if not millennia. What men fear from the State or from “Big (br)Other”, they have experienced with men. So, welcome to world of women….

Dewandre’s voice complements that of US scholars (like Danielle Citron and Mary Ann Franks) on systematic disadvantages to women posed by opaque or distant technological infrastructure. I think one of the many valuable goals of the conferences above will be to promote truly inclusive technologies, permeable to input from all of society, not just top investors and managers.

X-Posted: Balkinization.


Beatles in the Ether or Streaming

By now many may know that The Beatles catalog (or most of it) is available for streaming on the major services. I happen to love The Beatles and easily recommend Cirque du Soleil’s Love in Las Vegas. But the streaming option presents some questions to which I have not seen answers. First, did the services offer anything extra or special to get the rights (I can’t recall the state of streaming license law as far as flat or baseline rates to stream once the rights are granted)? Second, will the rights holders (I can’t recall where those have ended up) track the money from streaming versus selling the tracks and albums? If they do, what will they find? Work on P2P music sharing and its effect on music, and a study on the effect of free options for film, may shed light on the future of Beatles revenues. The film study offered:

Together our results suggest that creative artists can use product differentiation and market segmentation strategies to compete with freely available copies of their content. Specifically, the post-broadcast increase in DVD sales suggests that giving away content in one channel can stimulate sales in a paid channel if the free content is sufficiently differentiated from its paid counterpart. Likewise, our finding that the presence of pirated content does not cannibalize sales for the movies in our sample suggests that if free and paid products appeal to separate customer segments, the presence of free products need not harm paid sales.

If music works in a way similar to film, The Beatles rights holders may expand their pie, not reduce it.

Either way I am happy to enjoy the streaming options while they last.


Cyberpunk Because You Forgot to Get Someone a Gift

OK, cyberpunk can be great for a range of reasons, but I saw this repost from io9 on The Essential Cyberpunk reading list and thought, “A great list with some books I have not read. Wait! It’s a list for folks who need to send a just-in-time Christmas gift (assuming they are available as eBooks, which I know some are).” I easily recommend Neuromancer, Snow Crash, and Mirrorshades. I look forward to reading the rest (Accelerando did not work for me, but I may try it again). Plus, this genre really does a great job of positing worlds and issues that are pressing in the tech-law space right now, so that is another reason to jump in.


More Science Cheer – Microscopes for Everyone!

The New Yorker has a nice piece about Manu Prakash and his work on the Foldscope, a portable, paper-based microscope that costs about one dollar. As the author points out, the whole thing can be put into “a nine-by-twelve-inch envelope.” Here are the details:

The paper is printed with botanical illustrations and perforated with several shapes, which can be punched out and, with a series of origami-style folds, woven together into a single unit. The end result is about the size of a bookmark. The lens—a speck of plastic, situated in the center—provides a hundred and forty times magnification. The kit includes a second lens, of higher magnification, and a set of stick-on magnets, which can be used to attach the Foldscope to a smartphone, allowing for easy recording of a sample with the phone’s camera. I put my kit together in fifteen minutes, and when I popped the lens into place it was with the satisfaction of spreading the wings of a paper crane.

The Foldscope performs most of the functions of a high-school lab microscope, but its parts cost less than a dollar.

So what? Prakash and his colleagues are trying to deploy the device around the world to expand how people gather and share data to understand the world. Users of the device can also go to Foldscope Explore, a Web site where recipients of the kits can share photos, videos, and commentary: “A plant pathologist in Rwanda uses the Foldscope to study fungi afflicting banana crops. Maasai children in Tanzania examine bovine dung for parasites. An entomologist in the Peruvian Amazon has happened upon an unidentified species of mite. One man catalogues pollen; another tracks his dog’s menstrual cycle.”

These seemingly far ranging interests thus connect to what Brett Frischmann, Mike Madison, and Kathy Strandburg have been studying: a knowledge commons. Just within Prakash’s interest in “biomimicry—understanding how and why certain organisms work so well, and using that knowledge to build new tools,” the project increases the ability to know about “Plants, insects, tiny bugs under the sink, bacteria,” that do amazing things. New species can be identified, and so the project creates thousands of eyes not only for Prakash’s work but others in the field.

As I read the article and the details of low-cost tech being used around the world for a variety of problems that locals identified, I thought of the way FabLabs and the work of Neil Gershenfeld have approached and supported the maker-movement. And as I went on, I found out that Prakash did his work with Gershenfeld’s Center for Bits and Atoms at MIT. Can you say school of thought?

Prakash’s group is looking for ways to aid in early detection of disease and water contamination using low-cost technology. At the same time, the world may be re-experiencing the wonder of the first tools that pushed our ability to understand the world. As the article describes, Prakash and Jim Cybulski (then Prakash’s student, now chief collaborator on the project) were in Nigeria studying malaria. They met with young students, caught a mosquito “that was feeding on one of the children and mounted it on a paper slide, which they inserted into the Foldscope.” The student looked at the slide and

“For the first time, he realized this was his blood, and this little proboscis is how it feeds on his blood,” Prakash said. “To make that connection—that literally this is where disease passes on, with this blood, his blood—was an absolutely astounding moment.” The exercise had its intended effect. The boy said, “I really should sleep under a bed net.”

Scale-and-change-the-world technology can be small, simple, and accessible. Folks who press the practical and tee up the skills and tools to learn and dream of bigger things are part of an ongoing season of giving that I dig. Happy holidays to all.


Authentic Brands

What is authentic? The question seems to pop up in many areas. If a company or corporation claims authenticity, I am sure several folks I know would have a reflexive reaction that such a claim is absurd. Nonetheless, the Economist notes that “Authenticity” is being peddled as a cure for drooping brands. One part of the article notes that, despite the ongoing difficulties in valuing brands, “when brands are sold as part of corporate takeovers, what price do investors put on them? They found that these prices, as a percentage of deals’ total value, have dropped since 2003. So, at least for those firms being taken over, the strength of their brands is becoming a smaller share of their overall worth.” That is interesting insofar as it suggests that 1) brand value (and goodwill in that sense) can be measured, and 2) that it has gone down.

What is driving the change? A key thing I have tried to show is that information or search costs are not as high as they used to be, and that change brings into question many aspects of trademark law and policy. The Economist seems to agree and puts it this way:

It is not hard to see why the old marketing magic is fading, in an age in which people can instantly learn truths (and indeed untruths) about the things they are contemplating buying. Online reviews and friends’ comments on social media help consumers see a product’s underlying merits and demerits, not the image that its makers are trying to build around it. The ease of accessing information makes consumers more likely to abandon their habitual brands because they have heard about something new, or learned that retailers’ own-label products are much the same, except cheaper. Depending on your perspective, people are either increasingly fickle or ever more impermeable to marketing bullshit. For brands that lack any truly distinguishing features, that is bad news.

Better information and new sources of it change the legal and brand landscape. Plus an old problem–trying to sell essentially the same goods–has returned. As Spencer Waller and I noted, “From the end of the nineteenth century to the middle of the twentieth century to today, companies have had to find ways to compete over selling essentially the same goods and manage excess production capacity.” So it is not surprising that the sectors most hit by the change The Economist discusses are consumer goods and imported goods that no longer offer difference from other, lower-cost options of the same or close to same quality.

So can a corporation be authentic? If a corporation is slinging its authenticity with Keebler Elves and Santa Claus in Coke Red, that is a harder sell. Those plays claim authenticity based on cultural history, and may be a done deal in that sense (as Spencer and I discussed, the history of firms using events and education to build a sense of community and identity is old). But insofar as craft brewing, locally made goods, and customized offerings claim authenticity, those may fit the claim, as long as the claim is that the item is not from a firm of a certain size, or somehow to be distrusted because of size; after all, Scalia was correct in Citizens United that firms of many sizes can be corporations. Even assuming small and personal is a sort of authenticity, where I am not sure The Economist is correct is in its example of Apple. The newspaper offers:

for those firms that get the product right and have a genuine story to tell, the rewards can still be huge. The textbook example of this is Apple, whose devices’ superior design and ease of use make it a powerful brand in a commoditised market. Last year it had only 6% of the revenues in the personal-computer market, but 28% of the profits. That’s real authenticity.

If getting the product “right” is the key, then the competition is about old-school “my goods and services are better quality than yours.” If the story is also key, then we have to start asking whether Apple’s claims are accurate or myth-making “bullshit,” as the Economist might say. I like Apple products, as they fit my needs. I buy them despite the over-claimed “genius: we are all tech saviors” rubbish they sling. It is authentic, as long as authentic here means 100% Silicon Valley hubris. So pure it …


MLAT – Not a Muscle Group Nonetheless Potentially Powerful

MLAT. I encountered this somewhat obscure thing (the Mutual Legal Assistance Treaty process) when I was in practice and needed to serve someone in Europe. I recall that it was a cumbersome process, and that I was happy we did not seem to need it often (in fact, just that once). Today, however, as my colleagues Peter Swire and Justin Hemmings argue in their paper, Stakeholders in Reform of the Global System for Mutual Legal Assistance, the MLAT process is quite important.

In simplest terms, if a criminal investigation in, say, France needs an email that is stored in the U.S.A., the French authorities ask their U.S. counterparts for aid. If the U.S. agency that processes the request agrees there is a legal basis for it, it and other groups seek a court order. If that is granted, the order is presented to the company. Once records are obtained, there is further review to ensure compliance with U.S. law. Then the records go to France. As Swire and Hemmings note, the process averages ten months. For a civil case that is long; for criminal cases it is not workable. And as the authors put it, “the once-unusual need for an MLAT request becomes routine for records that are stored in the cloud and are encrypted in transit.”
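As a rough sketch, the chain Swire and Hemmings describe can be modeled as a series of sequential gates, any one of which can stop the request. The stage names below paraphrase the article and are my assumptions, not official terminology:

```python
# Hypothetical model of the MLAT request flow described above.
STAGES = [
    "foreign authority submits MLAT request",
    "U.S. agency reviews the legal basis",
    "court order sought and granted",
    "order presented to the company",
    "records reviewed for compliance with U.S. law",
    "records transferred to the requesting country",
]

def run_mlat(approvals):
    """Walk the stages in order; stop at the first one not approved.

    `approvals` maps stage name -> bool.
    Returns (stages completed, outcome).
    """
    done = []
    for stage in STAGES:
        if not approvals.get(stage, False):
            return done, "halted at: " + stage
        done.append(stage)
    return done, "records delivered"
```

Because every stage is serial, and several involve different institutions with their own queues, the ten-month average the paper reports is unsurprising.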

Believe it or not, this issue touches on major Internet governance issues. The slowness and the new needs are fueling calls for having the ITU govern the Internet and access to evidence issues (a model according to the paper favored by Russia and others). Simpler but important ideas such as increased calls for data localization also flow from the difficulties the paper identifies. As the paper details, the players–non-U.S. governments, the U.S. government, tech companies, and civil society groups–each have goals and perspectives on the issue.

So for those interested in Internet governance, privacy, law enforcement, and multi-stakeholder processes, the MLAT process and this paper on it offer a great high-level view of the many factors at play in those issues for both a specific topic and larger, related ones as well.


Not Found, Forbidden, or Censored? New Error Code 451 May Help Figure It Out

When UK ISPs blocked access to the Pirate Bay following a court order, the standard 403 error code for “Forbidden” appeared, but a new standard will let users know that a site is inaccessible for legal reasons. According to the Verge, Tim Bray proposed the idea more than three years ago. The number may ring a bell: it is a nod to Bradbury’s Fahrenheit 451. There are some “process bits” to go through before full approval, but developers can start to implement it now. As the Verge explains, the code is voluntary. Nonetheless

If implemented widely, Bray’s new code should help prevent the confusion around blocked sites, but it’s only optional and requires web developers to adopt it. “It is imaginable that certain legal authorities may wish to avoid transparency, and not only forbid access to certain resources, but also disclosure that the restriction exists,” explains Bray.

It might be interesting to track how often the code is used and the reactions to it.

Here is the text of how the code is supposed to work:

This status code indicates that the server is denying access to the
resource as a consequence of a legal demand.

The server in question might not be an origin server. This type of
legal demand typically most directly affects the operations of ISPs
and search engines.

Responses using this status code SHOULD include an explanation, in
the response body, of the details of the legal demand: the party
making it, the applicable legislation or regulation, and what classes
of person and resource it applies to. For example:

HTTP/1.1 451 Unavailable For Legal Reasons
Link: ; rel="blocked-by"
Content-Type: text/html

<html>
<head><title>Unavailable For Legal Reasons</title></head>
<body>
<h1>Unavailable For Legal Reasons</h1>
<p>This request may not be serviced in the Roman Province
of Judea due to the Lex Julia Majestatis, which disallows
access to resources hosted on servers deemed to be
operated by the People's Front of Judea.</p>
</body>
</html>
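For a client, the practical value of the new code is that censorship becomes machine-distinguishable from an ordinary refusal. A minimal sketch of how a client might sort responses (the mapping is mine, for illustration, not part of the standard):

```python
# Map the status codes discussed above to the distinctions users care about:
# 404 = not found, 403 = forbidden, 451 = blocked for legal reasons.
def classify(status, link_header=None):
    """Classify an HTTP error response per the draft's intent."""
    if status == 404:
        return "not found"
    if status == 403:
        return "forbidden"
    if status == 451:
        # The draft says responses SHOULD explain the legal demand, and a
        # "blocked-by" Link header can identify who imposed the block.
        who = link_header or "unidentified authority"
        return "blocked for legal reasons by " + who
    return "other error"
```

A monitoring project tracking adoption of the code could use exactly this kind of classification to separate censorship events from routine errors.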

Complicating the Narrative of Legal Automation

Richard Susskind has been predicting “the end of lawyers” for years, and has doubled down in a recent book coauthored with his son (The Future of the Professions). That book is so sweeping in its claims—that all professions are on a path to near-complete automation—that it should actually come as a bit of a relief for lawyers. If everyone’s doomed to redundancy, law can’t be a particularly bad career choice. To paraphrase Monty Python: nobody expects the singularity.

On the other hand, experts on the professions are offering some cautions about the Susskinds’ approach. Howard Gardner led off an excellent issue of Daedalus on the professions about ten years ago. He offers this verdict on the Susskinds’ perfunctory response to objections to their position:

In a section of their book called “Objections,” they list the principal reasons why others might take issue with their analyses, predictions, and celebratory mood. This list of counter-arguments to their critique includes the trustworthiness of professionals; the moral limits of unregulated markets; the value of craft; the importance of empathy and personal interactions; and the pleasure and pride derived from carrying out what they term ‘good work.’ With respect to each objection, the Susskinds give a crisp response.

I was disappointed with this list of objections, each followed by refutation. For example, countering the claim that one needs extensive training to become an expert, the Susskinds call for the reinstatement of apprentices, who can learn ‘on the job.’ But from multiple studies in cognitive science, we know that it takes approximately a decade to become an expert in any domain—and presumably that decade includes plenty of field expertise. Apprentices cannot magically replace well-trained experts. In another section, countering the claim that we need to work with human beings whom we can trust, they cite the example of the teaching done online via Khan Academy. But Khan Academy is the brainchild of a very gifted educator who in fact has earned the trust of many students and indeed of many teachers; it remains to be seen whether online learning à la Khan suffices to help individuals—either professionals or their clients—make ‘complex technical and ethical decisions under conditions of uncertainty.’ The Susskinds recognize that the makers and purveyors of apps may have selfish or even illegal goals in mind. But as they state, “We recognize that there are many online resources that promote and enable a wide range of offenses. We do not underestimate their impact of threat, but they stand beyond the reach of this book” (p. 233).

Whether or not one goes along with specific objections and refutations, another feature of the Susskinds’ presentation should give one pause. The future that they limn seems almost entirely an exercise in rational deduction and accordingly devoid of historical and cultural considerations.

Experts with a bit more historical perspective differ on the real likelihood of pervasive legal automation. Some put the risk to lawyers at under 4%. Even the highly cited study by Carl Frey and Michael Osborne (The Future of Employment: How Susceptible Are Jobs to Automation) placed attorneys in the “low risk” category when it comes to replacement by software and robots. They suggest paralegals are in much more danger.

But empirical research by economist James Bessen has complicated even that assumption: “Since the late 1990s, electronic document discovery software for legal proceedings has grown into a billion dollar business doing work done by paralegals, but the number of paralegals has grown robustly.” Like MIT’s David Autor, Bessen calls automation a job creator, not a job destroyer. “The idea that automation kills jobs isn’t true historically,” Steve Lohr reports, and it is still dubious today. The real question is whether we reinforce policies designed to promote software and robotization that complement current workers’ skills, or slip into a regime of deskilling and substitution.

A Review of The Black Box Society

I just learned of this very insightful and generous review of my book, by Raizel Liebler:

The Black Box Society: The Secret Algorithms that Control Money and Information (Harvard University Press 2015) is an important book, not only for those interested in privacy and data, but also anyone with larger concerns about the growing tension between transparency and trade secrets, and the deceptiveness of pulling information from the ostensibly objective “Big Data.” . . .

One of the most important aspects of The Black Box Society builds on the work of Siva Vaidhyanathan and others to examine how reliance on the algorithms of search impacts people’s lives. Through our inability to see how Google, Facebook, Twitter, and other companies display information, it can seem like these displays are in some way “objective.” But they are not. Between various stories about blocking pictures of breastfeeding moms, blocking links to competing sites, obscuring sources, and not creating tools to prevent harassment, companies are making choices. As Pasquale puts it: “at what point does a platform have to start taking responsibility for what its algorithms do, and how their results are used? These new technologies affect not only how we are understood, but also how we understand. Shouldn’t we know when they’re working for us, against us, or for unseen interests with undisclosed motives?”

I was honored to be mentioned on the TLF blog–a highly recommended venue! Here’s a list of some other reviews in English (I have yet to compile the ones in other languages, but was very happy to see the French edition get some attention earlier this Fall). And here’s an interesting take on one of those oft-black-boxed systems: Google Maps.