The last lighthouse: free software in dark times.

Edward Snowden Keynote to LibrePlanet 2016 - Prerelease video

Edward Snowden
Free Software Foundation
LibrePlanet


[John Sullivan]: We’re good? Well, welcome to LibrePlanet 2016! You made it! You’re here! Well, not everybody made it, so we are streaming this event live. “Hello” to everybody watching along at home, too. Thank you for bearing with us as we get things started here this morning. There are a lot of moving parts happening in this opening keynote, and we are doing it with all free software [audience cheers]. We’re really pushing the envelope here, and so there’s inevitably going to be some hang-ups, but we’ve been improving this process year after year, and documenting it, so that other conferences that are themed around free software and computer user freedom can hopefully use the same systems that we are [audience cheers] and practice what we want to preach. So, my name’s John Sullivan. I’m the Executive Director at the Free Software Foundation. This is always one of my favourite moments of the year, to start this conference off, but I’m especially excited about this year. We’ve had a [JS makes air quotes] “Yuge” year, starting with our thirtieth anniversary in October, and continuing on to what is obviously our largest LibrePlanet ever, and our biggest bang to start off the event, for sure. Let’s see, how many FSF members are here? Awesome! That’s amazing! Thank you, and I hope that the rest of you will consider becoming members by the end of the conference. You can join at [URL]. Members and individual donors fund over eighty percent of the FSF’s work, including putting on this event, as well as our advocacy for computer user freedom, and development of free software. I’m really happy to have this event at MIT again, where so much of the free software movement started, and I want to thank our partners at the Student Information Processing Board - SIPB - for partnering with us to make this happen. It’s really nice to see free software values continuing to be a strong part of the MIT community. [Applause] Yes, thank you. 
I have a few important announcements and reminders about the rest of the conference. First thing is, we have a safe space policy; that’s on page three of your program. Please read it and help us make this event a welcoming environment for absolutely everybody. If there are any issues that come up, please feel free to find me, or Georgia Young. The Information Desk will always know where we are, and Georgia has her hand up in the back. Second of all, there is a party tonight at Elephant and Castle near the Downtown Crossing subway station. I hope you will join us there. We will provide some complimentary refreshments and continue conversations that get started during the conference today. We are streaming, as I mentioned, with all free software. The party, though, will not be streamed. [Laughter] We have a few program changes to announce [provides details of changes to conference program] [03:41] After the conference is over tomorrow, there will be a rally, at which people will try to convince the W3C not to make a terrible mistake by endorsing DRM as a recommended extension to HTML5. That will be happening outside the Stata Center at 6:45 tomorrow night. Zak Rogoff at the Information Desk will have information for people who want to participate. That’s after the conference is concluded. Finally, please join us in the #libreplanet channel on Freenode IRC, both to communicate with people who are watching from home, and also just to have some back-channel conversation about everything that’s happening. So, we have an amazing start to this year’s conference, with Daniel Kahn Gillmor and Edward Snowden. Daniel is a technologist with the ACLU’s Speech, Privacy, and Technology Project and a free software developer. He’s a Free Software Foundation member, thank you, a member of Debian, and a contributor to many free software programs, especially in the security layer many of us rely on. 
He participates in standards organizations like the IETF, with an eye to preserving and improving civil liberties and civil rights through our shared infrastructure. Edward Snowden is a former intelligence officer who served in the CIA, NSA, and DIA for nearly a decade as a subject-matter expert on technology and cyber security. In 2013, he revealed that the NSA was unconstitutionally seizing the private records of billions of individuals who had not been suspected of any wrongdoing, resulting in the largest debate about reforms of US surveillance policies since 1978. And I want to take the chance to say “thank you” for also inspiring us, at the Free Software Foundation, to redouble our efforts to promote user security and privacy through the use of free software programs like GnuPG. If you’ve seen our guide at [URL], it was inspired by the actions that Snowden took and the conversation that that started. I would love to say more about how all this relates to free software, but I think I will leave that to our speakers this morning, as they have a conversation entitled “The last lighthouse: free software in dark times.” We started a little bit late, so we are cancelling the break after this; the next session will begin in this room immediately after this one concludes. So, we should have the full amount of time. Thank you, everybody. [JS gestures to DKG]

[06:47] DKG: So, I’m going to go ahead and bring Ed in, hopefully. Let’s see. [Snowden appears 06:54] Ed, can you hear us? [Extended applause and cheers]

[07:00] Edward Snowden: Hello, Boston! Thank you. Wow!

DKG: Ed, you can’t see it, but people seem to be standing, right now.

ES: Thank you. Thank you. Wow! Thank you so much. Please, if I could say one thing. When we were introduced, the thing that always surprises me is that people say, you know, “thank you” to me. But this is an extraordinary event for me personally, because I get to say “thank you” to you. So many people forget – maybe people haven’t seen Citizenfour, for example, the documentary where they actually had the camera in the room when the NSA revelations were happening – but if you watch closely, in the credits they thank a number of FOSS projects, including Debian, Tails, Tor, GnuPG, and so on and so forth. And that’s because what happened in 2013 would not have been possible without free software. I did not use Windows machines when I was in my operational phase, because I couldn’t trust them – not because I knew that there was a particular backdoor or anything like that – but because I couldn’t be sure. Now, this ambiguity - this fear - this risk - this atmosphere of uncertainty that surrounds all of us - is one of the central problems we face in the security space – in the software space in general – in the connected space of the Internet – in the way that we relate to one another, whether it’s in politics, or law, or technology. It is this thing that is really difficult to dispel. You don’t know if it’s true. You don’t know whether it’s fact or not. Some critics of, sort of, the revelations and what happened - they say, “Yeah, ah, we all knew that. Everyone knew that was happening. We figured that out.” And the difference is, many of us suspected – technologists suspected – specialists suspected – but we didn’t know. We knew it was possible. We didn’t know it was actually happening. Now, we know. And now we can start to make changes. We can integrate these threats into our threat model. 
We can adjust not just the way that we vote – not just the way we think about the issues – but the way that we develop, direct, and steer the software and systems that we all rely upon, every day, that surround us invisibly in every space. Even people whose lives don’t touch the Internet - people who still have to go to the hospital - people who may still have a record of purchasing something at this location or that - somebody who spends money through banks - people who purchase something in a store - all of these things touch systems upon which we must all rely, but increasingly cannot trust - because we have that same Windows problem. Now, since 2013, I think everyone in the audience – this isn’t going to be controversial for you – would agree that Windows isn’t exactly moving in the right direction. They may be putting forth, sort of, new exploit mitigations, making things a little more difficult for buffer overflows and things like that, including ASLR, and everything like that, which is great, but at the same time they’re putting out an operating system like Windows 10, that is so contrary to user interests, where rather than the operating system working for you, you work for the operating system, you work for the manufacturer. This is not something that benefits society, this is not something that benefits the user; this is something that benefits the corporation. Now, that’s not to say “all corporations are evil.” That’s not to say “I’m against private enterprise,” or that you should be. 
We need to have systems of business, to be able to develop things, to sell things, to trade and engage with each other, to connect and [inaudible] – and sometimes corporations are on our side; sometimes corporations do stand up for the public interest, as right now Apple is challenging the FBI, which is asking it to basically smother the security of every American device, service, and product that’s developed here, and ultimately around the world, while it’s still in its crib. But we should not have to rely on them. And this talk today, I hope, is about where we’re at in the world, and thinking - for everyone in the audience - not about what people say, not, you know, this fact or this authority, but about what you believe, what you think, is the right way to move forward.

[12:26] DKG: So, I wanted to touch on that, on the questions around the security of free software and the security of non-free software as well. The Apple case is an interesting one, because it is a chance for us, I think, to continue to move the conversation forward about what protections are actually offered to users. There are a lot of situations here where people are saying, “Well, the Apple phones are more secure because they’ve got this lock-down.” And I’d be curious to hear your take on how we respond to that. What are the trade-offs here, between the lock-down on Apple devices and the other possibilities - on hardware that’s maybe not so locked down?

[13:15] ES: A lot of people have difficulty distinguishing between related concepts – one of which is security, the other of which is control. Now, a lot of politicians have [inaudible] those issues, and have said this is a conversation about where we draw the line between our rights and our security, or between liberty and surveillance, or whatever. But that really misses the point. And this is the central issue in that sort of Apple walled-garden approach. Apple does produce some pretty reliable exploit protections. Does that mean it’s a secure device? Well, they can push updates at any time, that they sign in an arbitrary way, that can completely change the functionality of the device. Now, we trust them currently – or many trust them – not to abuse that, and we’ve got at least some indication that they haven’t, which is a positive thing. But the question is, is the device truly secure when you have no idea what it’s doing? And this is the problem with proprietary software. This is the problem with closed-source ecosystems, which are increasingly popular today. This is also the problem even with some open systems - or the more open systems, like the Android space - where security updates are just a complete, comprehensive, fractured disaster. [ES and audience laugh] I don’t mean to go too far, but I’m sure you guys have heard this stuff. So nobody’s going to stand up with a question and then read me a speech about why this is wonderful. Um. But the challenge here is, are there alternatives, right? And we know, from the Free Software Movement, that there are. You will notice in this talk, as the moderator introduced, that there’s no Google logo up here [ES points over his shoulder], for, like, the first time. Not the first time ever - I have done many talks on FOSS stuff - but never with a full FOSS stack. Right now this is a complete stack that’s completely free and open-source. 
And this is important, because what we do in our spaces, where we are a little more technical - we are a little more specialist - we can put up with more inconvenience - is develop the platforms, the capabilities, the strategies, that can then be ported over to benefit the entire class of communities that are less technical and, in many cases, simply cannot afford or access proprietary software in the traditional market-driven ways. Now, this is critical, because some of the most brilliant people I know, particularly Linux contributors, and so on and so forth, got their start - not because they necessarily believed in the ideology - but because they couldn’t afford licenses for all this different software, and they hadn’t yet developed, sort of, the technical sophistication to realize that they could just pirate everything. Now, this [audience and Snowden laugh] … this is actually a beneficial thing, and something I want everyone in the room to watch out for, right. Look for these people. This community that we have, that we’re building, that does so much for some people, has to grow, because we can’t compete with Apple, we can’t compete with Google directly, in the field of resources. What we can eventually compete on is head-count and heart-count. We can compete on the ground of ideology, because ours is better [audience and Snowden laugh; audience applauds] … but we also have to focus on recruitment, on bringing people in, and helping them learn, right. Everybody got started somewhere. I did not start on Debian. I did not start on Linux. I was an MCSE, right, I was a Microsoft guy, ‘til eventually I saw the light. This doesn’t mean that you cast off … this doesn’t mean that you can’t use any proprietary software. I know Richard Stallman’s probably at the back and he’s waving his finger. [audience and Snowden laugh] But we’ve got to recognize that it’s a radical position to say that you can’t engage with proprietary software at all. 
That’s not to say it’s without merit. The world needs radicals. We need lessons. We need leaders. We need figures who can pull us in the direction of trying new things, of expanding things, and of recognizing that we are in a world where, increasingly, you have no visibility into the operation of your devices – whether it’s a washing machine, a fridge, or the phone in your pocket – no idea what is going on, and, even if you want to, no control over it, short of exploiting it, trying to get /root, and then doing it on your own. That’s a fundamentally dangerous thing. And that’s why I call it the last lighthouse, right. The people in this room – whether you’re more on the radical side or more on the mainstream side – you’re blazing a trail, you’re recognizing solutions, and going, “Look, we can deal with the software problem. We can do our best, but we recognize it’s a challenge.” But there are more problems that are coming, and we’re going to need more people who are going to solve them. Everybody’s talking about the difficulties of software trust, but we really need to start thinking about hardware trust, right. There are distributions and projects like the Qubes project, researchers like Invisible Things Lab, Joanna Rutkowska, and others who are really focusing on these things, as well as many people in the Free Software Foundation. And we need to think about a world where – alright – maybe the FBI didn’t get a backdoor in the iPhone. But maybe it doesn’t matter, because they got the chip fabs. Maybe they already do. We need to think about a world where the infrastructure is something that we will never control. We will never be able to put commercial pressure on telecommunications providers to make them resist the government, whom they have to beg for regulatory licenses to actually operate the business. But what we can do is layer our own systems on top of their infrastructure. Just think about things like the Tor Project. 
Tor’s incredible. I use Tor every day. I rely on Tor. I used it during the actual NSA work I did, as well. And so many people around the world do. But Tor is showing its age, right. No project lasts forever. And we have to constantly be focused, we have to constantly be refreshing ourselves, and we need to look at where the opportunities are, and where the risks are. I should pass it back to Dan, because I’ve just rambled for, like, twenty minutes. [audience laughs]

[19:55] DKG: Well, I think what you’re saying about how we bring more people to the movement is really important. I mean, I’ll say I came to free software for the technical excellence and I stayed for the freedom, right. [audience and Snowden laugh] I came to free software at a time when Debian was an operating system that you could just install and automatically update, and it worked. That didn’t exist elsewhere. I used to have Windows systems where I was wiping the machine and re-installing every two months [Snowden, audience and DKG laugh] and, I think, a couple of people raised their hands; people have been there. So, you know, come for the technical excellence, and as I learned about the control that I ended up actually having, and understood what was going on in the systems … that became the reason that I stayed. It didn’t matter as the other systems sort of caught up and realized, “Oh, well, we can actually do automated updates. Microsoft has a system update thing that they do.” So, I’m wondering if you have other ideas about what are ways that we can expand our community, and what are ways we can sustain our community as we grow. I think maybe that’s a question for everyone in this room, but I’d be curious to know if you have any particular ideas or suggestions. Not everyone who comes to the community is going to be geeky enough to want to know what code is running on their refrigerator. But in ten years everybody’s refrigerator is going to be running code, and so how do we make sure that that message gets out? That people can be proud to have a free software fridge [audience and Snowden laugh] without being a free software hacker? What are ways that we can expand the community?

[21:42] ES: Well, one of the main ways is, we’ve got to be better, right. If you have a free software fridge, it’s got to be better, it’s got to be more interesting, it’s got to be more capable, it’s got to be more fun than the proprietary equivalent. And the fact that, in many cases, it’s free is a big selling point. But beyond that – beyond the actual competitive strategy – we need to think about, as you said, the community strategy. And I don’t like [inaudible] for authority - especially from big talking heads on the wall – but I would say that everybody in the room should take a minute to think about their part in it, what they believe in, what they value, and how you can protect that, and how you can pass that on to the people who come after you, right. ‘Cause you can’t wait until your death bed, you know, like at eighty, to make this happen. It’s something that has to be a life-long practice, particularly in the context of organizing, particularly in the context of growing a group, particularly a group of belief. I would say everybody in the room should set a task for themselves: this year, bring five people into the free software community. Now, that seems really difficult. But then you think, you know, well, alright, at any level - whether they just sign up for a membership or donate, whether they do a basic commit on some Git repository somewhere, even if it’s just changing something cosmetic, making something a little bit more user-friendly. Even if it’s just a pull request or a fork or branch that they’re using only for themselves, …

[23:10] DKG: Or a bug report.

ES: … or a bug report, even better. It’s important, because what we’re trying to do is expose people to the language of empowerment, right. And that’s what this is really about. This gets back to the whole thing from before, whether it’s, like, privacy versus security, or security versus privacy. It’s not about privacy versus security, because when you’re more secure, you have more privacy; when you have more privacy, you’re a lot more secure as well. This is really about power, right. When we look at how these programs have actually been used in a surveillance context, it’s not just against terrorists, right. The GCHQ was using NSA systems to intercept the emails of journalists. They spied on Amnesty International. They spied on other human rights NGOs. In the United States, we used our capabilities to spy on UNICEF, the children’s fund, right, for the UN. And this was not the only time. When we looked at their actual statistics, we saw they abused their powers or broke the law 2,776 times in a single calendar year. Now, this is a problem for a lot of reasons, not least of which is the fact that no one was ever charged, right; no one was prosecuted, because they didn’t want to reveal the fact that these programs existed. But when we talk about what this means for people, ultimately it gets into that world of: Are you controlling your life? Are you controlling the things around you? Do they work for you? Or do they work for someone else? And this language of empowerment is something, I think, that underlies everything that your organization has been doing, not just in the defense-of-liberty sense, or the “free as in kittens” sense, [audience and Snowden laugh] but in the idea that, look, right, we’re no longer passive in our relationship with our devices.

[25:03] DKG: Yeah, so when I think about the devices that we need to have some level of control over – I mentioned the refrigerator earlier, but, you know, increasingly we’re dealing with things like cars that have proprietary systems with over-the-air updates. [Snowden laughs] More and more of our lives, our intimate conversations, are mediated through these devices, and so it’s interesting for me to think about how we approach this ecosystem. Maybe we actually now do have fully free computers, thanks to people in this room; we actually have, you know, laptops that are free from pretty much the BIOS upwards, including coreboot. But as more things become computerized, how do we make sure that people’s cars don’t themselves become surveillance devices? How do we make sure the little pocket computers that everyone carries around actually aren’t surveillance devices for people? And so I think one of the things that points to is that, as a community that cares about user empowerment – which is freedom zero, right, the freedom to use these tools the way you want to use them – we have to, I think, make outreach also to communities with shared values. And you mentioned open hardware communities, people who are building tools that maybe we can have some level of control over, in the face of a bunch of other pieces of hardware that are under someone else’s control. But there are additional communities that we need, I think, to also reach out to, to make sure that this message – that surveillance is a power dynamic, and that we’re hoping your control over your devices will actually provide people with some level of autonomy – gets through. And that means that we need to have more outreach, and to think about what’s going on in the network stack itself. I mean, this is something I’ve focused on. 
If the protocols that we use are implemented in free software, but the network protocols are very leaky, that doesn’t actually provide people with what they want. It’s not very easy for people to come along and change a protocol if it’s a communications protocol. So I think we need to look at the network standards, and we need to look at regulatory standards. So I’m hoping there are lawyers in the room; I suspect there are – well, there’s a couple of people raising both hands. [Snowden and audience laugh] So, that kind of outreach - can we have regulatory guidance that says, “If you’re going to put a vehicle on the road, it needs to be running free software”? I mean, that’s a super-radical position today. Can we make that not a radical position? How can we make that outreach into the communities of non-geeks, to make sure that these messages about power and control – which are central to our lives in a heavily technologically-mediated society – actually get addressed in all of the places where they need to be addressed? I don’t know if you have other particular places where you can imagine outreach, Ed – a community to ally with?

[28:25] ES: You hit a big point with the network problem. That gets back into the fact that we can’t control the telecom providers, you know; we’re very vulnerable to them. If you wanted to compress the story of 2013 – leaving politics aside, right, leaving aside the big democratic question of the fact that politicians were telling us this wasn’t happening, that intelligence officials were giving sworn testimony saying this wasn’t happening, when it obviously was – and we focus just on the technical impact, and we want to compress it to a central point, it would be that the provider is hostile. The network path is hostile. And we need to think about mitigations for that. Now, we need to think about also where all the providers are, what they are, and how they can be replaced. Now, open hardware is one of the big challenges here. We’ve seen some adverts like the Lenovo laptop. We’ve seen some other things like Purism, and many others that I haven’t named directly. But there’s a large question here: if we can’t control the telecommunications provider, if we can’t control the chip fabs, right, how can we assert things? Well, the first solution was, encrypt everything. And this is an important first step, right. It doesn’t solve the metadata problem, but it’s an important first step. The next step is, tunnel everything, right. And then the step beyond that is, mix everything, so we can munge the metadata and it’s hard to tell where things went. Now, there are still theoretical problems with a global passive adversary, timing attacks, and whatnot, but you make this more expensive and less practical with each step we go beyond, and then there’s somebody in this room who likely has the idea that none of the rest of us have on how to really solve this. And this is what we need – also in the hardware space. 
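[Editor's note: the encrypt/tunnel/mix ladder described above can be illustrated with a toy layered cipher, in the spirit of onion routing. This is a sketch only: the hash-based XOR "keystream" is not real cryptography, and the relay keys are invented for illustration; real mix networks such as Tor negotiate per-hop keys and use vetted ciphers.]

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from repeated hashing -- NOT real cryptography."""
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply (or remove -- XOR is its own inverse) one encryption layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def wrap(message: bytes, hop_keys) -> bytes:
    """Encrypt in layers, innermost first, like an onion route."""
    for key in reversed(hop_keys):
        message = xor_layer(message, key)
    return message

def peel(message: bytes, hop_keys) -> bytes:
    """Each relay removes exactly one layer with its own key."""
    for key in hop_keys:
        message = xor_layer(message, key)
    return message

hops = [b"relay-1", b"relay-2", b"relay-3"]  # hypothetical relay keys
sealed = wrap(b"hello", hops)
assert sealed != b"hello"           # wrapped message is not the plaintext
assert peel(sealed, hops) == b"hello"  # all three layers together recover it
```

No single relay key recovers the plaintext on its own; the metadata-hiding "mix" step would additionally reorder and delay messages between hops, which this sketch omits.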
Is it possible that, rather than getting these very specialized chips - that’s exactly this, I do exactly that, I have exactly this instruction set, and I’m inflexible - we realize that, because we’re sort of bumping up against the limits of physical die shrinks at this point, we could reach a point where maybe we start changing our architecture a little more radically? We have flexible chips, things that are like FPGAs for everything. And instead of getting a hyper-specialized chip, we get a hyper-capable chip that can simply be used in any arbitrary manner, and this community shares masks of our own design - logical masks, rather than physical masks - for designing and directing how they work. [pauses] There’s another question here that I actually don’t know a lot about, but I think, Daniel, you’ve done some research on this: when we get into the actual toolchains, right, how do we build and program devices and things like that? For myself, I’m not a developer full-time. That was never my focus. And there’s this question - we’ve seen sort of attacks, including in, like, the NSA documents, the XcodeGhost type of thing, where an adversary, an arbitrary adversary, will target a developer, right, and rather than poison a specific binary, rather than trying to steal their signing key or something like that, or in addition to stealing their signing key, they’ll actually go after the compiler. They’ll actually go after their toolchains. Or, on the network, they’ll start tracking people, and the activities of developers, even if they start working pseudonymously, because they’ll look at their toolchains; they’ll look at: Is there some cruft? Is there some [inaudible]? Is there some artefact? Is there some string that constantly repeats in their work? Is there some variable that’s unique to them and their work, that identifies them, even if they’re under [inaudible]? How do we stave this off?
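[Editor's note: the artefact-hunting described here - recurring strings that tie pseudonymous builds back to one developer - can be sketched as a `strings`-style scan over build outputs. The sample "binaries" and the names in them (`tmp_buf_xyz9`, the `jdoe` path) are invented for illustration; they are not from any real toolchain.]

```python
import re
from collections import Counter

def printable_strings(data: bytes, min_len: int = 6):
    """Extract runs of printable ASCII, like the Unix `strings` tool."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[ -~]{%d,}" % min_len, data)]

def recurring_artifacts(binaries, threshold: int = 2):
    """Strings recurring across several artifacts -- candidate fingerprints."""
    counts = Counter()
    for blob in binaries:
        counts.update(set(printable_strings(blob)))
    return {s for s, n in counts.items() if n >= threshold}

# Two hypothetical build outputs from the same (pseudonymous) developer:
builds = [
    b"\x7fELF\x00tmp_buf_xyz9\x00C:/Users/jdoe/dev/proj\x00",
    b"\x7fELF\x00helper_code_v2\x00tmp_buf_xyz9\x00",
]
print(recurring_artifacts(builds))  # {'tmp_buf_xyz9'} -- recurs in both
```

The same idea is why reproducible-build tooling scrubs embedded paths, usernames, and locales: any string that survives across a developer's artifacts is a linkable fingerprint.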

[32:08] DKG: Right, this is, like, one level past the “I Hunt Sysadmins” slide; this is the “I Hunt Developers” slide, and I would hope that the free software developers in this room care about that issue, right. I mean, I certainly do, and I know that, as free software developers, lots of people take their responsibilities seriously. You don’t want to release bad code – sometimes, occasionally, some people maybe make some mistakes, some bugs [Snowden laughs] – but we take the responsibility seriously. We want to fix the bugs that we make. But what if your toolchain is corrupted? What if you do get targeted? If you’re maintaining a piece of core infrastructure, like many people in this room probably are, how do you ensure that a targeted attack on you doesn’t become an attack against all of your user base? What’s great is we actually have people working on this problem. I know there’s a talk later today – or tomorrow, rather – about reproducible builds. I’m not going to give the talk in five minutes here [Snowden laughs]; I’m just going to give an outline. You should definitely check it out. But the goal is that you can go from your source code, through your toolchain, and get a reproducible, byte-for-byte identical artifact. That way, as a software developer, you know that an attack against your toolchain doesn’t matter: as long as you’re publishing the source code that you mean to publish, your users can rely on tools that are built by many different builders, which will all produce the same result, and they can verify that they’re getting the same thing from each party. 
We’re not there yet, because our tools are still kind of crufty; they build in some arbitrary things. But we’re making great strides towards making non-reproducibility itself something we can detect, and stamp out as a new class of bugs that we can rule out. And that gives us a leg up against the proprietary software community, because they simply can’t do that: if they don’t even have the source code visible, they have no way of saying, “Look, this is the human-readable intent, and here are the binaries that come out, which other people can reproduce.” So reproducible builds are one path to that kind of trust, and I think there are probably others, and I hope people are actively thinking about that. The other way that I’ve heard this framed is, “Do you want to be the person who gets made an offer that you can’t refuse?” [Snowden laughs] If you’re a free software developer and you’re publishing your source code, people can see what you publish, and they can say, “Hey, did you really mean to do this?” But if you’re just distributing binaries, or you’re distributing your source code next to binaries, and your binaries are compromised, anybody who looks at your source, at your disk, will say, “Well, the disks all look clean,” and yet your binaries could be compromised. So, personally, as a free software developer, I don’t want to be in that position. I don’t want to be giving anybody any binaries. I want to be able to give them changes that are human-readable. So, we’re running a little bit low on time, and I want to make sure that if people have questions, they get a chance to ask them. There are a couple of other things that I’d love to talk with you about, but if you have a question, I’m going to ask that you come down and line up here at the mic. A couple of people are starting. 
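[Editor's note: the byte-for-byte goal can be sketched with a toy "compiler" whose only source of non-determinism is a build timestamp. `SOURCE_DATE_EPOCH` is the real convention the Reproducible Builds project uses to pin such timestamps; everything else here is invented for illustration.]

```python
import hashlib
import time

def build(source: str, timestamp: float) -> bytes:
    """Toy 'compiler': the artifact embeds the source plus a build timestamp."""
    return f"{source}|built-at:{timestamp}".encode()

src = "int main(void) { return 0; }"

# Naive builds embed wall-clock time, so two honest builders disagree:
a = build(src, time.time())
b = build(src, time.time() + 1)
assert hashlib.sha256(a).hexdigest() != hashlib.sha256(b).hexdigest()

# Pinning the timestamp to an agreed, source-controlled value (as
# SOURCE_DATE_EPOCH does for real toolchains) makes independent builds
# byte-for-byte identical, so anyone can cross-check the published binary:
epoch = 1458432000
a = build(src, epoch)
b = build(src, epoch)
assert hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest()
```

The same pattern applies to the other sources of build cruft (file ordering, locales, embedded paths): fix each input, and any remaining difference between two builders becomes evidence of a compromised toolchain rather than noise.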
But before the questions start, I’m just going to lead off with one. You mentioned Android’s lack of security updates – do you have any ideas or suggestions for how we address the stability versus legacy-compatibility versus security-updates quandary? [Snowden smiles]

[35:44] ES: So, this is, like, [Snowden laughs]

DKG: In one minute … it’s easy.

ES: If I could solve this, I’d have an easier time getting the Nobel Prize. [audience and Snowden laugh] The challenge here is that there’s a real cost to supporting legacy. Everybody knows this. But what users don’t accept so well is that there’s a gigantic security impact – that it’s actually unethical to support really out-of-date versions, to keep people anchored back in that support chain. Because it’s not just a question of versioning, it’s not just a question of stability. Stability is important, but increasingly we’re finding that the pace of adversary offensive research is so fast that if our update cycles aren’t at least keeping pace with that attack speed, we’re actually endangering people. And by being one of the users who’s out there on a mailing list saying [Snowden gestures mockingly] “Oh, this breaks functionality blah-blah-blah for my floppy-disk driver in my virtual machine” – it’s like, “Yo! Stop using a floppy disk in your virtual machine!” [audience laughs]

[37:03] DKG: So, we’ve got a queue of questions. I want to make sure that we get to them. I might need to repeat the mic, repeat the questions. I’m not sure whether you’ll hear them. Go ahead.

[37:13] Question #1: Hi, my name is Curtis Glavin. Thanks for taking my question. I was wondering: should a priority for the free software community be developing a kind of killer app for privacy, security, and a lot of these ideas that we all care about – one that could gain widespread adoption and transform mainstream public opinion on these issues? And, if so, what could such a killer app be, and how could the community build it? Thanks.

[37:54] ES: Absolutely. I mean, we start to see some of these things happening, particularly in the [inaudible] space, on ecosystems where we have less control, where they’re starting to put apps. Now, can we create a competing stack? And, more importantly, as you say, the first thing is capability, because that’s what people care about – that’s what the user cares about. We see things like Signal that are starting to tackle this in the messaging space. But Signal’s not perfect. Signal has weaknesses. It leaks metadata, like telephonic contacts, your address list, and things like that. And we have to figure out: are there ways that we can change collaboration? Now, here’s the big one. Living in exile kind of gives you an interesting perspective on different ways that people interact. One of the big ones is the fact that, look, there’s a warrant against me. If I were trying to speak with you at MIT in person, there’d be an FBI raid and paddy wagons outside. But because of technology, here I am – and it’s all FOSS. But that’s only the beginning, because there are other alternatives out there. We’re trying to compete. We’re trying to replicate. We’re trying to distinguish. Can we get there first? Now, one of the big disruptive technologies coming out this year is obviously the VR stuff that’s starting to take off. We’ve got the Oculus Rift, we’ve got the HTC Vive, and, of course, there will be many different versions of these. Can we take the hardware and create our own applications for addressing the remote-work problem? Can you create a virtual workspace, a virtual meeting space, a virtual social space, that’s arbitrary, where you and your peers can engage? Where they can look over your shoulder, as if you were sitting in the same office, and see your terminal visibly in front of you in this virtual space, without regard to your physical location?
Now, I’m sure there are commercial providers, proprietary actors, out there trying to create this. You know, Facebook would be completely negligent if they weren’t trying to do it. But if we can get there first, and we can do it securely, we can do something that Facebook simply can’t. Their business model does not permit them to provide privacy. We can. We can do the same thing. We can do it better. We can do it faster. And if we do, it will change the world, and it will change the politics of every person in every country, because now you’ll have a safe space – everywhere, anywhere, always. [applause]

[40:37] Question #2: Hi. My name is Sasha Costanza-Chock. Thank you so much. You mentioned a couple of times in your comments, sort of nodded to the idea, that we abandon the infrastructure space and build on top of existing infrastructure. And I wonder if you could make a couple of comments about the communities that are trying to do DIY and community-controlled infrastructure? There are projects like OpenBTS. There are community organizations like Rhizomatica, which is building free phone and internet networks in rural Mexico. There are projects like the Red Hook Initiative, which is training people to build community-controlled wireless infrastructure in Brooklyn. There are projects like Detroit’s Digital Stewards that are doing the same thing in Detroit. And all over, there are people bubbling up around the edges to do community infrastructure. And I wonder if you could comment a little bit more on: yes, these things are long shots, but maybe we shouldn’t abandon this space – the imaginary space of the possible liberatory future where we do own our infrastructure as well?

[41:40] ES: I agree – and actually, if you could stay at the mic there for just one second – because that’s a powerful idea. Now, I have less familiarity with that. I’m not going to try to BS anybody; nobody’s an expert in everything. I’m not as familiar with community infrastructure projects. When I think about that, I think about OpenWrt, DD-WRT, and so on. But that level, where we’re actually talking about knitting together meshnets, or small-scale cell networks – that’s awesome, and we should do more of it. I think we will have the most success, personally, where we’re leap-frogging over technologies, being more mobile, more agile, where we don’t have the same kind of sunk infrastructure costs – because, ultimately, infrastructure is what can be targeted by the adversary, whether it’s a criminal group or a government. If we have things invested in boxes and spaces, those are things that a police car can drive up to. That’s not to say it can’t work. But if I could just ask you briefly to comment on that, since you do have more familiarity, and maybe everybody in the room could benefit from it: what do you see as the way forward in the next space of communications fabric? [another questioner comes to the mic] I was actually asking him a follow-up. But that’s fine, let’s just have the next question.

[43:04] Question 3: Hi, this one may be a partial regurgitation of the last one. Daniel Gnoutcheff, sysadmin, Software Freedom Law [Center]. Oh, my goodness, sorry. Moving on. So, one of the responses I’ve seen to revelations of global surveillance is the rise of self-hosting projects, such as FreedomBox, that are trying to give people the tools to move their data out of the cloud, so to speak, and onto personal devices sitting in their own homes. Do you believe that these sorts of tools, such as FreedomBox, provide a reasonable defense against global surveillance? And what would your advice be to FreedomBox and similar projects?

[43:52] ES: Yeah, absolutely. This is one of the cases where community infrastructure – that kind of open infrastructure – can actually be really valuable, even if it’s not global; in fact, especially if it’s not global. In my experience – so I worked at the NSA, actually with the tools of mass surveillance. I had XKEYSCORE at my desk every morning I went in. I had scheduled tasks that were just pumping out all of my different targets, all of their activities around the global internet, including general activities on subnets that I found interesting. If I wanted to see anybody in an entire subnet who had just sent a certain pattern of ping, I could get that; it would be just there waiting for me. It’s easy to do. If you can write RegEx, you can use XKEYSCORE. And you don’t even need to do that, though more advanced people do. Everybody else was just typing in “Bad guy at” [audience laughs]. But the idea here is that even mass surveillance has limits – and that’s the size and granularity of their sensor mesh. They have to compromise or co-opt a telecommunications network. They have to hack a router, implant it, and then put a targeting interdiction on it, asking, “Are any of my interesting selectors – IP addresses, emails, classes of information, fingerprints of activity, anything like that – passing this? Then I’ll add it to my bucket that will come back as results.” And what this means is that for ordinary people, for dissidents, for activists, for people who want to maintain their privacy, the fewer hops you cross, and the more local your network – particularly if you’re outside of these big telecommunications spaces – the safer you are, because you can live in those gaps between the sensor networks and never be seen.
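[Editor’s note: the claim “if you can write RegEx, you can use XKEYSCORE” describes matching selector patterns – email addresses, IP ranges, fingerprints – against captured traffic. The following toy Python sketch illustrates only that general idea; the selector patterns, function names, and plain-text matching are invented for illustration and are not XKEYSCORE’s actual syntax or interface.]

```python
import re

# Hypothetical selectors for illustration only; real systems match far
# richer fingerprints than plain regular expressions over text.
SELECTORS = [
    re.compile(r"badguy@example\.org"),        # an email-address selector
    re.compile(r"\b203\.0\.113\.\d{1,3}\b"),   # an IP-range selector (TEST-NET-3)
]

def matches_any_selector(captured_text):
    """Return True if any selector pattern appears in the captured text."""
    return any(pattern.search(captured_text) for pattern in SELECTORS)
```

The point of the passage stands independently of the sketch: such matching only works where traffic crosses an instrumented link, which is why staying on small local networks keeps you between the sensors.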

[45:48] DKG: So, unfortunately we’re running low on time here. We’ve got less than five minutes left. So maybe we can take one last question.

ES: Sure.

DKG: Sorry, I know there are people in the queue, but …

[46:00] Question 4 [a young man]: Hello. I wanted to ask. What is someone my age able to do, who is like in middle school or high school, to kind of help out?

[46:10] ES: First thing is: care. If you care, you’ll learn. If you learn … [applause] It’s not meant to be pat. A lot of people don’t care. And it’s not that they don’t care because they’ve looked at it, understood it, and decided it doesn’t matter. It’s because everybody’s only got so many minutes in the day. There’s a contest for attention. There’s a contest for mind-share. And we can only be a specialist or an expert in so many things. If this is already interesting to you, you’re already on the right track. And you can do a lot of good. You can develop tools that will change lives, and maybe even save them. The key is to learn. The key is to develop capability, and to actually apply it. It’s not enough to simply care about something – that’s the start. It’s not enough to simply believe in something – that’s the next step. You actually have to stand for something. You have to invest in something. And you have to be willing to risk something to make change happen. [applause]

[47:35] DKG: So … sorry. We’ve got a bunch of other talks lined up today. And we don’t want to end up blocking them. But Ed, thank you for joining us. We really appreciate it.

ES: It’s my pleasure. Thank you so much. Enjoy the conference.


Edward Snowden, Free Software Foundation, and Libre Planet, “The last lighthouse: free software in dark times,” I/Oterror, accessed November 18, 2018.
