Why Free Beer Isn’t So Good if Your Data Are Getting Drunk: How Free-as-in-Freedom Businesses Help Prevent the Ultimate Privacy Catastrophe (MySQL Conference Keynote)

(Eben Moglen’s speech at the 2007 MySQL Conference)

Clint Smith

Good morning. I’m delighted to announce our first speaker this morning because he’s someone I deeply admire. He’s someone who has really defined the law of Free and Open Source Software and has been a great promoter and defender of it.

His first paid programming job came when he was 13 years old. After college, he was a programmer at IBM in the 1980s, and then decided to go to law school. He was a clerk for Supreme Court Justice Thurgood Marshall, and then joined the faculty at Columbia University in New York City, where he is now a full professor of law and legal history.

He is also chairman of the Software Freedom Law Center, and in that role, he has a broad agenda to promote Free and Open Source Software, to protect us against software patents and the harms that they pose to us, and to help developers and programmers around the world. But unlike most high-powered New York lawyers, he doesn’t charge by the hour. He doesn’t charge us in ten-minute increments. The Software Freedom Law Center does its work on a pro bono basis, helping people. But under the MySQL concept of quid pro quo, I wanted to think about ways that we could pay Eben for the billable hour that he is spending with us this morning, and really for the time he spends with us at the conference over the next couple of days. If I were Guy Kawasaki, I’d offer you a top-ten list of ways to pay Eben Moglen for his billable hours, but I’ll offer just four:

The first is to do something each week to convince the corporate IT departments that GPL software is not scary or dangerous or risky; to make sure that they understand that it’s actually more scary, risky and dangerous to click “I accept” to a one-sided EULA for a proprietary piece of code.

The second thing I’d recommend: when your bright niece or nephew decides to go to the dark side and go to law school, don’t give up hope; just insist that rather than working for a large law firm each summer, they spend one summer as an intern at the Software Freedom Law Center.

Third, when it’s time to sell your company to Google for 1.6 billion dollars, be generous: give 10% of the proceeds to the Software Freedom Law Center so that Eben and his colleagues have the resources to provide legal advice to the next generation of developers and entrepreneurs.

And fourth, when Eben Moglen comes up to the platform, please give him a warm round of applause for being with us today and for everything he does for Free and Open Source Software.

[applause]

Eben Moglen

Thank you. Good morning. It’s conventional to thank the introducer for the kind and generous introduction. That tradition of thanks is awkward here because I really want to thank Clint for the nicest and most effective introduction I’ve ever received with actual action items in it. I know I’m grateful for that. It’s what everybody dreams of.

I want to talk this morning for the first time — in I can’t remember how long — not about GPL 3. And although I am going to take questions, I hope that we will manage to preserve the “not about GPL 3” quality all the way through — a luxury I’ve thirsted for for longer than I can tell you, but certainly longer than you’ve been aware of what an amazingly complicated and socially aggressive process GPL 3 is for those of us who have been listening to everybody on earth, it seems some days, about what ought to be done with the license.

I want to talk instead about social policy and technology and freedom: all of them subjects that, admittedly, I find myself addressing in the context of software licensing all the time, but subjects with other bearings, and bearings which are particularly important to what we do here.

Let me back up a step and say, as the kind of broad, general, professorial statement that goes along with teaching legal history (a thing I’ll get back to with joy, for the first time in two years, this coming fall), that you could make a good case that the history of social life is about the history of the technology of memory; that social order and control, the structure of governance, and social cohesion in states or organizations larger than face-to-face society depend upon the nature of the technology of memory: both how it works and what it remembers.

Those historians who work on epigraphy, that is, deciphering inscriptions chiseled into stone or baked clay, are aware of a commonplace fact: almost everything written by human beings, from the invention of writing until the day before yesterday, was a list of cows, or of assets in the treasury, or of taxpayers and what they owed. The greatest work of medieval English public administration was undertaken one generation after the Norman Conquest, in 1086. It’s the project we came to know as the Domesday Book. It’s a memory book about English real estate. It asks of every square meter of terrain governed by the Norman kings of England, “of whom is this land holden?” And it memorizes the responses.

In short, what societies value is what they memorize. And how they memorize it and who has access to its memorized form determines the structure of power that the society represents and acts from.

For almost the whole history of human beings, what was memorized was public fact: facts about who holds which land of whom, or how many cows, or who owes what taxes. It requires a substantial change in the technology of memory to begin to be a society based on the memory of private facts, about individuals in their private capacity, not as land owners, cattle owners or taxpayers, but as dreamers, wishers and thinkers. Early modern Europeans, that is, Europeans of the 14th, 15th and 16th centuries, as the great historian of China Jonathan Spence pointed out in a classic book, had a technology of memory that we have lost. Living largely in a world of expensive written material, and seeking to build a private database of things experienced and learned, early modern Europeans built in their minds what they called memory palaces: imaginary rooms furnished with complex bric-a-brac and interior decoration, in which each of the items in each room was a thing to be remembered. By eidetic association, this jar means the Ptolemaic theory of the nature of the solar system; this window represents my experience in Paris; and so on. By walking through the rooms of the memory palace in their minds, early modern Europeans iconically remembered both the things they needed to know (facts and figures) and the things they needed to retain to keep their personal identities whole: their experiences, the events, the places, and the times.

The memory palace of the early modern European bears a very substantial relationship to the idea of the photograph, not, however, to the technology of photography. It is with the introduction of photographic technology into European life that one can fundamentally talk about the memory of the private entering the technological stream of social control. With photography comes the ability to capture personal experience, factual and emotional, and to hold it for long periods of time in a form which can be both possessed, that is to say, from which others can be excluded, and shared, that is to say, which others can be shown. So unlike the memory palace of Jonathan Spence’s Jesuit missionary to China, Matteo Ricci, living far from the European world but possessing a vast repertoire of European learning locked up in an imagined palace by the waterside in Venice, the possessor of a photographic collection of experiences of life can possess both vast quantities of useful information (I first saw a student take a picture of the blackboard in order to capture mathematics under construction when I was in seventh grade, I think, when I was twelve) and also an emotional relationship: things wished for, things achieved, things tragically experienced, captured in photographs. The last day of my father’s life may be indelibly printed on my memory, but the photographs taken on that day contain the experience in some sense even more immediately than my own analog, biological, carbon-based, decaying memory of my own life [laughter].

With that photographic memory, that form of the interpenetration of the public and the private, goes the possibility of the evolution and amplification of social control. Others can take pictures of you. And as people in stone-age cultures have been trying to tell Europeans for some generations now, taking a picture of you can steal your soul. Surveillance, spying, the control of the population through the pictures of who they are and what they do, was a fact of twentieth-century life, already growing rather irksome, sometimes burdensome, occasionally brutal. And there are now, I read last month, 1.6 CCTV cameras in Britain per unit of population. Some societies, in other words, at the beginning of the twenty-first century, have more surveilling powers than they have people to surveil. And if we took a look around the room at how many people are themselves carrying cameras, we would probably find the same is true: more cameras than people here. If I rummage through my briefcase, that will surely be the case, and maybe in yours too.

Of course all of this stuff is stored personally, and therefore the structures of social control and power that arise from it are still under personal control, right?

Wrong.

If the photograph is the beginning of the technology of memory of private life in publishable form, then Flickr is the beginning of a third stage, because of course the private photograph isn’t private anymore. The shared memory of where we’ve been, and who we were with, and what we were doing, with a nice little indication of time and date and a GPS coordinate stamp on the photograph, is now a basic table row in a database about you being kept by someone else.

Of course, the reason that they’re keeping it for you is that you gave it to them. You asked them to hold it for you. You admired their technology of memory, because once the digital revolution occurs, the technology of memory scales. There was no way to merge the memory palace of Matteo Ricci with the memory palace of anybody else he ran into in the course of his life. They could share a fact; they could share a story; they could build another artifact in each other’s palace; but they couldn’t link them together. There was no way to acquire wholesale the memories of one in the mind of another. Of course we do that all the time now by Bluetooth, and we don’t even think about it.

Human beings have heuristics about memory, and those heuristics now begin to fail in light of current technological realities. People assume (my students do it all the time) that when we talk about privacy, we’re talking about the one big secret you have that you wouldn’t want other people to know. That’s what privacy first comes to mean. Identity theft means someone who knows four facts about you: your birthday, your social security number, your mother’s maiden name and your current address. That’s identity theft in their world. A fixed number of secrets or semi-secrets, or in fact even not-so-secret details, which when aggregated can somehow, like a photograph, steal your soul. But even this understanding of the relation between memory and power is fatally defective, and we, the privileged, highly digitized, wealthy few, are also the first to experience what the loss of privacy really means, because our lives are largely recoverable through the technology of memory, based on the information we voluntarily give away. And so far I’m only talking about individuals. I haven’t yet talked about the ways in which the organizations of which individuals are members give away data for others to hold, remember and process. And I’ve talked about this so far as though it were just a matter of people looking at your photographs. That in itself is part of the bad heuristic about memory which governs people’s first and sometimes even second impressions about privacy.

This isn’t really about people looking at your photographs; this is really about people inferring, from the data of your life, what you are thinking and what you will do next. So let us ask ourselves: who will be the greatest and most powerful intelligence services in the twenty-first century? Governments? Secret police? Defense establishments? Or private market data miners, operating on the basis of extraordinarily deep repertoires of information you have voluntarily offered them, about what you think, and what you intend, and what you mean to do, and what you’ve just accomplished, and where you are in your plans? Wouldn’t you like to know all the websites you’ve recently visited, and how to get back to them quickly in case you need something? That used to be a feature of a browser: a browser should remember where you were, and store locally for you a memory of the virtual places you have recently been. Now it’s a service. Wouldn’t you like us to keep track of the history of everywhere you’ve been on the web recently, so we can do pretty amazing new stuff about making it convenient for you to figure out where you’ve been recently? Oh, and by the way, any inferences we may draw from that are our property, for our commercial use; check here if you agree [laughter].

Consider for a moment just how much inferring, from the data you voluntarily give up, you permit at the moment of the assignment of custody of the information. The law of my stuff in your hands is deep and old; the Roman law goes on at great length about it, as does the English law. We call it in the English legal tradition “bailment.” My object in your care; my car in your garage; my television in your shop for repair; my goods in your safekeeping. But the bailee’s ability to think about the thing given to him? That doesn’t enter the story. You bring your musical instrument to the pawn shop, and you pawn it for the means to continue life between gigs. Well, of course the pawn shop owner can look at the reeds of your saxophone; the assumption is he doesn’t care. The assumption is he doesn’t want to know. And even if he did want to know how you trim your reed, what difference would it make? There’s nobody he can give that information to who matters.

But in the world of digital information, every bailee gets to think about everything you hand him to be handed back to you later, and the thinking that the bailee does is crucial to the bailee’s success, and exercises social control over you. To build a working model of a human being, permitting at least theoretically effective prediction of future behavior, even if it’s just good for a point or two or three or four of additional leverage over your future commercial behavior, based upon information you have voluntarily supplied, is one of the most exciting business models of the twenty-first century. It promises to replace commercial broadcasting, for example, with well-targeted, highly useful, deeply appropriate advertising interleaved with your media stream in a way that you approve of, because it brings you information that you need at the moment that you want it most, and therefore biases your choices and controls your conduct at a level of efficiency that twentieth-century mass-market advertising only dreamed about: structures of social prediction based upon your click stream, your payment habits, your stored contact lists, your photographic libraries, your shared video preferences, your Amazon wish lists, and all the rest. Structures of social prediction and control based upon mining that data offer opportunities for government or private market use, for deployment in the control of human beings, that are very exciting indeed.

Oh, but it’s not real control; it’s only control over what I eat or wear or smell like. It’s only control over how I do my dating, and whether I have this or that or the other automobile in the parking lot, and so on. In other words, it’s only really about the superficialities of my life, right? Ask yourself how deeply the political parties in the electoral democracies of the West are involving themselves in the same data mining. And ask what the consequences are of that kind of data mining applied to the actual movement of elections, through better targeting of effort and resource, in a fashion which we can think of as entrepreneurial democracy on the march, or for kinds of vote suppression and discouragement of voters, interferences with the effective use of the franchise, which we would have no difficulty characterizing as anti-democratic and largely despotic in intention.

All of those possibilities arise from the new technology of shared memory and the voluntary commitment of your private data to service providers who offer you pretty amazing services and keep for themselves the right to infer. But of course the right to infer is a right. I spend a great deal of my life worrying about technological civil liberties, about the relation between old and established freedoms (to think, to believe, to speak, to learn) and new technological environments. And I would be, I hope, the last person on earth to stand here and suggest to you that we should control what people think. If you give them the data about you, they have a right to think about it. And they have a right to teach it to other people, unless you have previously constrained them, either by general-purpose law or by contractual provision. They have a right to collect, they have a right to evaluate, and they have a right to express. And those are powerful and important rights that must not be interfered with, or else we shall be interfering with our own rights drastically.

So the central moment in the construction of this form of social control is the moment of voluntary participation. Not the moment at which somebody else thinks constructively about what we voluntarily gave him, and not the moment at which he turns that information over to somebody who values it highly in return for payment. Both of those are legitimate and appropriate activities, which we enable when we choose to collectivize the memory of our private lives.

But there is of course another possibility, which is why I’m choosing to talk about it here. You could put it briefly this way: store it yourself, right? [laughter] The fundamental question, after all, is oddly only a question of custody. Because let’s face it, the pretty amazing new services are based on free software. You can data mine yourself just as effectively as they can data mine you. The way to do it, the code to do it, the technical capacity to do it, the ways to store and evaluate and consider information within voluntary collectives of people who want to share, those ways are the same whether they are being operated on the largest computer in the world by them, or on the smallest computer in the world by you. The technology of memory isn’t the problem; the technology of memory is the solution. It’s how we use it. It’s where we put it. It’s where we decide about how to share and what to keep to ourselves. It’s what we mean by privacy, not what we do in technology, that creates our difficulty.

And (this may be the most difficult proposition of all to deal with) it’s an ecological problem. That is, the aggregation of individual decisions has nonlinear social consequences. Let us return to those great figures of email communication from the early 1990s: Bob and Carol and Ted and Alice. I spent a lot of time with them once upon a time. If Bob and Carol are using an email service which affords them pretty amazing new, endless amounts of storage, searchable in a flash, to keep their mail in, in return for data mining the content of that mail, then Ted and Alice are being data mined because they correspond with Bob and Carol, not because they’ve chosen anything at all. Bob and Carol have chosen to open their email to data mining; Ted and Alice have chosen to expose their click streams; all four are therefore involved in the voluntary turning over, for custodial purposes, of email in two cases and click streams in two cases; and everybody’s got a credit card. We might as well be talking about a movie filmed in a fishbowl. And that’s a consequence not of the individual decisions made about who stores what, but of the aggregate decision to store a lot of things with people who will mine them and give you pretty amazing new services back.

The model (pretty amazing services in return for data aggregation and the right to mine) is a legitimate and appropriate model. I have no difficulty with it; I use such services too. But we need to be thoughtful, not only about the first-order consequences to ourselves, but about the second-order consequences to the community at large, of each of those individual decisions about whether to store it yourself or give it to a bailee who is pricing the storage cheap in order to retain the right to inspect and ponder and muse and express inferences on the basis of what he is storing.

There’s no way to deal with this by setting a hard-and-fast, bright-line rule. I have spent a good deal of my professional life with people who had the enormously refreshing characteristic that they knew the difference between good and evil, and were very clear about expressing which was which and what was what and where the line fell; and I can’t do that here. Maybe I shall, in the course of my perambulations, run into another Stallman with a clear and comprehensive and overarching theory of how to differentiate the good of this from the evil of it, and will find a way to create some elegant and clear and operable… (but I said I wasn’t going to talk about GPL 3) [laughter].

Instead, after all, we’re living in a world where law is not the answer to the problem. Social life is more complicated than the rules, and social good is more complicated than the question of “are you following the rules or breaking them?” The fundamental issue is: what are we building together, and why do we want it? Are we building an environment in which the process of knowing our insides is the commercial asset in highest demand for the next 100 years? Or are we building a culture in which the technology that we love enforces for us, and helps us to preserve, the privacy that we care for more deeply than we are aware of from moment to moment? Where privacy is defined not as our ability to keep the biggest of our secrets, but the smallest. Not the capacity to resist the intrusion of the state in its most brutal and overwhelming forms, but the capacity to resist the intrusions of the market in their most subtle and simplest forms, the ones most difficult even to grasp once they have started to happen to you.

The spam I get is getting better all the time. It imitates things I might be interested in better and better, as they are Bayesian and I am Bayesian and everybody’s busy trying to figure out which words will work their way in through the screens. The relevance of that isn’t that I am getting the spam, but that the spam is learning about what I think about far more successfully than it is encouraging to have to recognize. The spam is still the dandruff of the digital society; it’s still just cruft. But the fact that even the dandruff manufacturers are gaining more insight into who I am, rather than just broadcasting penny-stock deals or genital enlargements at me, is a sign that something important has happened to the ability to know us as well as we know ourselves. And beyond that, beyond the mere crudity of the spammers, are the people whose goal it is to know us better than we know ourselves. And they are making progress just as fast.

So could it actually be a social obligation to store it yourself? Could we really come to the conclusion that, for ecological reasons, we ought to have a pack-it-in, pack-it-out view? That letting other people carry your stuff, that hiring the Sherpas of the digital society to get you to the top of Mount Everest, might actually be to flatten the mountain in a way you don’t even notice? The engines, thanks to our colleagues, are both cheap and free. The applications, thanks to all of you, are cheap and free too. The ability to store, to imagine, to consider, to refine, to guess, to improve the accuracy of what we do is in our hands also. We can’t offer the excuse, for failing to build around our privacy, that we lack the facilities; we made the facilities, and we shared them among ourselves, and we can all have them. And those of us who understand that fact can help other people to have those facilities too.

The tendency to scale memory towards common platforms raises social questions that we have to force into visibility. We have to do it in part as a matter of social responsibility to other people who are going to live in the world that we make. I hear a lot of complaining from grown-ups, that is, gray-haired alte kockers like myself, about some supposed absence of concern for privacy among teenagers at MySpace and Facebook. This puzzles me very much. I hear complaints about teenage driving, too, but complaints about teenage driving are always accompanied by a recognition that the kids are inexperienced, and that as they grow up, they should become better drivers. But the fact is that the adults I hear complaining about teenage disregard for privacy on MySpace and Facebook are the very people who are bringing about the primary privacy problem that I’m trying to talk about here. They’re not becoming better drivers; they’re just becoming better ignorers of the problem as time goes by. And as we begin longitudinally to study what young people do at MySpace and Facebook, it turns out they’re not all that unconscious about privacy after all. This may yet turn out to be primarily an old person’s problem.

But whatever it is, we have to deal with it. We have to be aware that it’s there; we have to think about its consequences; we have to imagine mixed social strategies for confronting it. Surely those strategies don’t involve prohibiting memorizing things for other people, or thinking about what you know, or telling other people about what you’ve learned. We have to be respectful of the rights to learn, to know, to think, and to express. Surely the answer isn’t to prohibit business models or get down on Google or be terrified at the empire of memory that is coming. Surely the answer is in our hands. That’s why free software is so important: because the answer is in our hands, too.

Collectively we make this technology. Collectively we decide how it functions. Collectively we decide whether it saves our sense of internality or destroys it. We decide how much private power will know about us for the rest of the twenty-first century, and not just us, but everybody coming along behind. We are the people who decide what memory does in the next phase of human society, and how the technology of memory affects the imposition of social power. That’s our work. We may not think of it as our work, but it is. And so I say, if we care about the freedom of the software, if we care about the liberty of people to know, to think and to express, we have to take a look also at the question of the safety of the data. Is it wearing a seat belt, or is it going to be thrown out the window of the car in the accident and lie there bleeding on the highway?

Free beer is not so good, I said, if your data is getting drunk. Who’s drinking it, and what’s the hangover likely to be like when it’s all over? I don’t have answers. I know that there’s a questioning here that collectively we need to think about. I recognize the simplicity of not thinking and just clicking, and I think we have to be aware that that simplicity is a trap, not a trap put forward for us by the evil. I concede that “do no evil” may be the motto of those who present the option to us, and they may be right: they may not be doing evil. It wasn’t evil to put carbon dioxide in the atmosphere. It’s evil not to notice that it has consequences, and not to consider them, before we drown our kids’ backyards. So let us think about this. Let us bring all of our good collective intellect to bear; let us see how this can be dealt with; let us imagine our ways out of the problem in several different directions; let’s do what we usually do. Let’s have proof of concept; let’s have running code; let’s let a hundred flowers bloom. Let’s sort it out, so that when the kids go through the snapshots and the scrapbooks after we are gone, they don’t find themselves saying this: “This is the snapshot of the day when privacy died,” and the bitterness of that is in the photograph they took of themselves as they did it. Thank you very much.