
Presentation of the FreedomBox by James Vasile

  • 0:08 - 0:11
    I am very proud to welcome a guest from the United States here at the
  • 0:11 - 0:14
    Elevate: it is James Vasile of the FreedomBox Foundation.
  • 0:14 - 0:20
    James Vasile works on several projects,
  • 0:20 - 0:23
    e.g. Apache, I think also Joomla, and many others. He is also a lawyer,
  • 0:23 - 0:31
    and works for the FreedomBox Foundation and the Free Software Foundation.
  • 0:31 - 0:37
    He will now present what is, in my opinion, one of the most visionary projects I have seen in years:
  • 0:37 - 0:43
    as we can see here, a little box, the FreedomBox.
  • 0:43 - 0:48
    So, James will give a presentation and then we will
  • 0:48 - 0:50
    answer questions in a discussion round.
  • 0:50 - 0:53
    So James, the stage is yours.
  • 0:53 - 0:56
    Thank you, Daniel.
  • 0:56 - 1:03
    I have been at the Elevate Festival for a few days now.
  • 1:03 - 1:10
    I attended some lectures, saw films and listened to music,
  • 1:10 - 1:15
    and it is a great place where all these ideas come together.
  • 1:15 - 1:21
    I would like to thank Daniel for the organisation,
  • 1:21 - 1:23
    and of course Joseph as well.
  • 1:23 - 1:30
    Especially Daniel, who persuaded me to come here
  • 1:30 - 1:33
    and who is a really great host.
  • 1:33 - 1:36
    Many thanks once again.
  • 1:36 - 1:42
    APPLAUSE
  • 1:42 - 1:52
    A long time ago, in the early days of the internet,
  • 1:52 - 1:56
    when we started using the internet to talk to each other,
  • 1:56 - 2:00
    we mostly talked directly to people, right?
  • 2:00 - 2:05
    Think about how email works, on a technical level
  • 2:05 - 2:10
    You take a message, you hand it off to your mail transport agent
  • 2:10 - 2:14
    It sends it through a network, directly to the recipient.
  • 2:14 - 2:16
    It hops through some other computers, but fundamentally
  • 2:16 - 2:21
    you use the network to talk directly to your other computer
  • 2:21 - 2:26
    the other computer where the recipient gets his or her mail
  • 2:26 - 2:30
    It was a direct communication medium.
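
As a rough sketch of that direct-delivery model, here is what handing a message to the recipient's machine looks like at the SMTP level (hostnames and addresses are hypothetical):

```python
# Direct delivery: the sender's machine speaks SMTP straight to the
# host where the recipient reads mail -- no third-party service.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.net"
msg["Subject"] = "hello"
msg.set_content("A message passed host to host.")

with smtplib.SMTP("mail.example.net") as mta:  # the recipient's own host
    mta.send_message(msg)
```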
  • 2:30 - 2:33
    If you're old enough to remember a program called 'talk'
  • 2:33 - 2:37
    Talk was the first, sort of, interactive you type, they see it, they type, you see it
  • 2:37 - 2:40
    instant message application.
  • 2:40 - 2:43
    This again, was direct.
  • 2:43 - 2:48
    You would put their name and address into your program,
  • 2:48 - 2:51
    they would put yours into theirs, and you would just talk directly to each other
  • 2:51 - 2:57
    You didn't send this message through servers, through that centralised technology.
  • 2:57 - 3:02
    From there, from those beginnings of talking directly to each other
  • 3:02 - 3:07
    we started to build communities, emailing directly to people.
  • 3:07 - 3:10
    But that was relatively inefficient.
  • 3:10 - 3:17
    Talking directly to people, one-to-one, works very well for one-to-one conversations.
  • 3:17 - 3:19
    But as soon as you want a group conversation
  • 3:19 - 3:21
    as soon as you want to find people reliably who you haven't
  • 3:21 - 3:26
    already set up contacts for, exchanged email addresses and such
  • 3:26 - 3:28
    you run into friction, you run into problems
  • 3:28 - 3:34
    So the solution to that, was to create more centralised structures
  • 3:34 - 3:37
    and we did this with IRC
  • 3:37 - 3:41
    IRC is a place where instead of talking directly to the people we're trying to reach
  • 3:41 - 3:45
    we take a message, and we send it to an IRC server
  • 3:45 - 3:46
    a third party
  • 3:46 - 3:48
    and the IRC server then copies that message
  • 3:48 - 3:51
    to all the people who we might want to talk to.
  • 3:51 - 3:54
    We developed mailing lists, listservs
  • 3:54 - 3:58
    And again, this was a way where we would take our message
  • 3:58 - 3:59
    and hand it to a third party
  • 3:59 - 4:03
    A mail server, that is not us and not the person we're trying to talk to
  • 4:03 - 4:05
    and that mail server would then echo our communication to
  • 4:05 - 4:07
    all the people we want to talk to
  • 4:07 - 4:10
    and this was great, because you didn't have to know the
  • 4:10 - 4:12
    addresses of all the people you wanted to talk to
  • 4:12 - 4:15
    You could just all 'meet' in a common place
  • 4:15 - 4:19
    We all meet in an IRC chatroom, we all meet on a listserv
  • 4:19 - 4:23
    And there were a lot of IRC channels, and a lot of IRC servers
  • 4:23 - 4:25
    and a lot of mail servers
  • 4:25 - 4:27
    all across the internet
  • 4:27 - 4:28
    A lot of places to do this communication.
  • 4:28 - 4:32
    And if you didn't like the policies or the structures or the technology
  • 4:32 - 4:34
    of any one of these service providers
  • 4:34 - 4:36
    these IRC servers, or these list servers
  • 4:36 - 4:38
    you could just switch, you could choose to run your own.
  • 4:38 - 4:40
    It was very simple.
  • 4:40 - 4:46
    This infrastructure is not hard to create, it's not hard to run, it's not hard to install.
  • 4:46 - 4:49
    And so a lot of people did run, create and install it.
  • 4:49 - 4:53
    There were a bunch of IRC servers, there were a bunch of different listserv packages
  • 4:53 - 4:57
    But as we've moved forward in time,
  • 4:57 - 5:01
    we've started to centralise even more.
  • 5:01 - 5:05
    And, you can fast-forward to today
  • 5:05 - 5:07
    where we're channeling our communication
  • 5:07 - 5:10
    through fewer and fewer places.
  • 5:10 - 5:13
    And we are making structures that are more and more central
  • 5:13 - 5:15
    and more and more over-arching
  • 5:15 - 5:20
    So, from the, the IRC way of talking to each other
  • 5:20 - 5:25
    we moved to instant messaging applications.
  • 5:25 - 5:28
    AOL Instant Messenger, ICQ,
  • 5:28 - 5:31
    those were the early ways to do it
  • 5:31 - 5:33
    and there were only a few of them
  • 5:33 - 5:36
    MSN had its messaging system, Yahoo had its messaging system
  • 5:36 - 5:39
    and when people wanted to talk to each other now,
  • 5:39 - 5:41
    they were using third-parties again.
  • 5:41 - 5:43
    But they were only using a few third parties.
  • 5:43 - 5:46
    And if you wanted to switch providers,
  • 5:46 - 5:49
    you would leave almost everyone you knew behind,
  • 5:49 - 5:51
    your entire community behind.
  • 5:51 - 5:53
    And so it becomes harder to switch.
  • 5:53 - 5:54
    There are fewer options
  • 5:54 - 5:58
    and the cost of switching leaves more and more people behind
  • 5:58 - 6:00
    So you started to have lock-in.
  • 6:00 - 6:05
    You started to have people who were chained to their methods of communication
  • 6:05 - 6:07
    because the cost of losing your community is too high.
  • 6:07 - 6:10
    And so if you don't like the technology, or you don't like the policy
  • 6:10 - 6:12
    or you don't like the politics
  • 6:12 - 6:13
    or if they're trying to filter you
  • 6:13 - 6:14
    or censor you
  • 6:14 - 6:16
    you don't have a lot of options.
  • 6:16 - 6:18
    The cost of leaving is so high that you might stay.
  • 6:18 - 6:21
    People do stay. And they accept it.
  • 6:21 - 6:25
    And we went from that small basket of providers of this kind
  • 6:25 - 6:27
    of communication technology
  • 6:27 - 6:29
    to an even more centralised structure
  • 6:29 - 6:33
    where there is effectively only one way to reach all our friends,
  • 6:33 - 6:36
    in each mode of communication:
  • 6:36 - 6:37
    Facebook.
  • 6:37 - 6:38
    And Twitter.
  • 6:38 - 6:41
    These two services rule everything.
  • 6:41 - 6:43
    And I'm not going to stand here and say Facebook is evil
  • 6:43 - 6:45
    and that Twitter is evil
  • 6:45 - 6:49
    What I want to say is that having one place
  • 6:49 - 6:50
    where we do all our communication
  • 6:50 - 6:53
    leaves us at the mercy of the policies of the people
  • 6:53 - 6:55
    that control the infrastructure that we are chained to,
  • 6:55 - 6:57
    that we are stuck using, that we are locked into.
  • 6:57 - 7:02
    You can't leave Facebook without leaving everybody you know
  • 7:02 - 7:05
    because everybody you know is on Facebook.
  • 7:05 - 7:09
    I was not a Facebook user.
  • 7:09 - 7:11
    I was against Facebook.
  • 7:11 - 7:14
    I thought it was bad to centralise all our communication in one place.
  • 7:14 - 7:15
    I didn't like the privacy implications,
  • 7:15 - 7:18
    I didn't like Facebook's censorship
  • 7:18 - 7:21
    of things like pictures of nursing mothers.
  • 7:21 - 7:22
    I don't think that kind of thing is obscene,
  • 7:22 - 7:25
    and I don't think Facebook should have the ability to tell us
  • 7:25 - 7:27
    what we can share with our friends.
  • 7:27 - 7:29
    So I thought those were bad policies,
  • 7:29 - 7:32
    and I reacted to that by not joining Facebook. For years.
  • 7:32 - 7:35
    All my friends were on Facebook.
  • 7:35 - 7:41
    I joined Facebook late last year. November.
  • 7:41 - 7:48
    Because in November, a friend of mine passed away.
  • 7:48 - 7:50
    His name was Chuck. He was a brilliant man.
  • 7:50 - 7:55
    And he lived a lot of his life online.
  • 7:55 - 7:58
    He was on Facebook, and he shared things with friends on Facebook.
  • 7:58 - 8:01
    When he passed away I realised I hadn't communicated with him in a while,
  • 8:01 - 8:02
    I hadn't really talked to him in a while.
  • 8:02 - 8:05
    And the reason I hadn't was because I wasn't
  • 8:05 - 8:08
    communicating with him in the place he communicates.
  • 8:08 - 8:10
    I wasn't meeting him where he was, I wasn't on Facebook.
  • 8:10 - 8:12
    I was missing out on something huge.
  • 8:12 - 8:15
    That's the cost of not being there.
  • 8:15 - 8:17
    And so I joined.
  • 8:17 - 8:19
    Because I decided that as strong as my beliefs were,
  • 8:19 - 8:21
    it was more important to me to be there with my friends and
  • 8:21 - 8:23
    to talk to my friends.
  • 8:23 - 8:24
    That's the power of lock-in.
  • 8:24 - 8:27
    Me, a person who cares, as much as I do,
  • 8:27 - 8:31
    who cares enough about these issues that I do something like this
  • 8:31 - 8:32
    I got locked into Facebook. I'm there now.
  • 8:32 - 8:35
    That's how I talk to a lot of my friends, whether I like it or not
  • 8:35 - 8:38
    I am locked into Facebook.
  • 8:38 - 8:42
    You know, I'm also on Diaspora. But my friends aren't on Diaspora.
  • 8:42 - 8:46
    This sort of lock-in creates a sort of situation where
  • 8:46 - 8:51
    we have one arbiter of what is acceptable speech,
  • 8:51 - 8:53
    whether we like it or not.
  • 8:53 - 8:55
    If they're free, we're free to the extent,
  • 8:55 - 8:56
    only to the extent,
  • 8:56 - 8:57
    that they give us freedom.
  • 8:57 - 8:59
    And that to me isn't freedom.
  • 8:59 - 9:01
    That to me is accepting what you're given.
  • 9:01 - 9:04
    It's the exact opposite of making your own choices.
  • 9:04 - 9:08
    The exact opposite of self-determination.
  • 9:08 - 9:13
    All of our problems in communication can be traced
  • 9:13 - 9:16
    to centralized communications infrastructure.
  • 9:16 - 9:22
    Now, I've sort of told this story at the social level,
  • 9:22 - 9:25
    in the way that we're talking about how to talk to your peers
  • 9:25 - 9:28
    and your friends on the internet.
  • 9:28 - 9:33
    But this story also exists when we think about relying on the pipes,
  • 9:33 - 9:38
    relying on the hardware, the technical infrastructure behind the software.
  • 9:38 - 9:43
    We rely on internet backbones,
  • 9:43 - 9:45
    we rely on centralized cellphone networks,
  • 9:45 - 9:47
    we rely on centralized telephone networks.
  • 9:47 - 9:52
    The people that control these networks have the ability
  • 9:52 - 9:54
    to tell us what we're allowed to say,
  • 9:54 - 9:56
    when we're allowed to say it.
  • 9:56 - 9:59
    They have the ability to filter us, to censor us, to influence us.
  • 9:59 - 10:02
    Sometimes they use that ability, and sometimes they don't,
  • 10:02 - 10:04
    and sometimes by law they're not allowed to.
  • 10:04 - 10:06
    But at the end of the day
  • 10:06 - 10:09
    the power doesn't rest in our hands.
  • 10:09 - 10:11
    The power, from a technological perspective,
  • 10:11 - 10:13
    rests in the hands of the people that operate the
  • 10:13 - 10:15
    networks.
  • 10:15 - 10:20
    Centralization doesn't just allow this sort of filtering and censorship.
  • 10:20 - 10:23
    There's another big problem with centralization.
  • 10:23 - 10:26
    The other big problem with centralization is that by
  • 10:26 - 10:30
    gathering all of our data in one place
  • 10:30 - 10:33
    it becomes easy
  • 10:33 - 10:36
    to spy on us.
  • 10:36 - 10:39
    So every time you go to a website
  • 10:39 - 10:41
    pretty much
  • 10:41 - 10:45
    the website includes, at the bottom of the page
  • 10:45 - 10:49
    a little graphic or invisible Javascript thing
  • 10:49 - 10:53
    that tells Google that you came to visit the page.
  • 10:53 - 10:56
    Eva goes to a website, and the website says
  • 10:56 - 10:59
    "Hey Google! Eva just came to my website!"
  • 10:59 - 11:01
    Every time she goes to a website, that happens.
  • 11:01 - 11:04
    And so Google effectively sits next to her and watches,
  • 11:04 - 11:06
    while she uses the internet.
  • 11:06 - 11:07
    Watches everything she does,
  • 11:07 - 11:09
    and everything she enters,
  • 11:09 - 11:11
    everything she looks at and knows.
  • 11:11 - 11:15
    It's not just her search data, it's not just her Gmail.
  • 11:15 - 11:19
    It's the entire picture of her digital life.
  • 11:19 - 11:22
    In one place.
  • 11:22 - 11:23
    That's a pretty complete profile.
  • 11:23 - 11:24
    If you were able...
  • 11:24 - 11:27
    ...imagine if somebody could sit next to you and watch
  • 11:27 - 11:29
    everything you did online,
  • 11:29 - 11:31
    imagine how much they would know about you.
  • 11:31 - 11:33
    That's how much Google knows about you.
  • 11:33 - 11:36
    Google knows more about you than you know about yourself,
  • 11:36 - 11:39
    because Google never forgets.
  • 11:39 - 11:42
    Google knows more about you than your parents,
  • 11:42 - 11:43
    than your partner,
  • 11:43 - 11:46
    Google knows your secrets, your worst secrets,
  • 11:46 - 11:48
    Google knows if you're cheating on your spouse
  • 11:48 - 11:49
    because they saw you do the Google search for the
  • 11:49 - 11:54
    sexually-transmitted disease.
  • 11:54 - 11:56
    Google knows your hopes and your dreams.
  • 11:56 - 11:58
    Because the things we hope and dream about,
  • 11:58 - 11:59
    we look for more information about.
  • 11:59 - 12:00
    We're natural information seekers.
  • 12:00 - 12:02
    We think about something, it fascinates us,
  • 12:02 - 12:05
    we go and look it up online. We search around.
  • 12:05 - 12:06
    We look around the internet, and we think about it.
  • 12:06 - 12:11
    And Google is right there. Following our thought process,
  • 12:11 - 12:15
    the thought process in our click trail.
  • 12:15 - 12:19
    That is an intimate relationship.
  • 12:19 - 12:21
    Right? Do you want an intimate relationship with Google?
  • 12:21 - 12:21
    Maybe you do.
  • 12:21 - 12:25
    I personally, don't.
  • 12:25 - 12:28
    But that's it, Google sits next to us and watches us use
  • 12:28 - 12:30
    our computers.
  • 12:30 - 12:34
    And if anyone actually did... if you had a friend who wanted
  • 12:34 - 12:37
    to sit next to you, or a stranger said I want to sit next to you
  • 12:37 - 12:39
    and just watch you use your computer all day,
  • 12:39 - 12:41
    you would use that computer very differently to the way you do now.
  • 12:41 - 12:44
    But because Google doesn't physically sit next to you,
  • 12:44 - 12:49
    Google sits invisibly in the box, you don't know Google is there.
  • 12:49 - 12:51
    But you do know, right?
  • 12:51 - 12:52
    We're all aware of this. I'm not saying any of you don't know,
  • 12:52 - 12:55
    especially in a room like this.
  • 12:55 - 12:57
    But we don't think about it.
  • 12:57 - 12:58
    We try not to think about it.
  • 12:58 - 13:01
    We are locked in, to the internet.
  • 13:01 - 13:03
    We can't stop using it.
  • 13:03 - 13:05
    And the structures that exist,
  • 13:05 - 13:06
    the infrastructure that exists,
  • 13:06 - 13:09
    that has been slowly turned from
  • 13:09 - 13:12
    a means to allow us to communicate with each other
  • 13:12 - 13:16
    to a means of allowing us to access web services
  • 13:16 - 13:19
    in return for all our personal information so we can be bought and sold
  • 13:19 - 13:21
    like products.
  • 13:21 - 13:24
    That is the problem. That is the problem of centralization, of having one structure.
  • 13:24 - 13:27
    As soon as we put all that information in one place
  • 13:27 - 13:32
    we get complete profiles of us, you get complete pictures of you.
  • 13:32 - 13:33
    And that is a lot of information.
  • 13:33 - 13:34
    It's valuable information.
  • 13:34 - 13:39
    It's information that is used, right now, mostly to sell you things.
  • 13:39 - 13:42
    And that, you might find objectionable.
  • 13:42 - 13:43
    Maybe you don't.
  • 13:43 - 13:46
    Maybe you don't believe the studies that say you can't ignore advertising.
  • 13:46 - 13:51
    Maybe you think that you are smart and special, and advertising doesn't affect you.
  • 13:51 - 13:53
    You're wrong.
  • 13:53 - 13:56
    But maybe you believe that.
  • 13:56 - 14:02
    But that information, that same infrastructure, that same technology that allows them
  • 14:02 - 14:05
    to know you well enough to sell you soap
  • 14:05 - 14:12
    allows them to know you well enough to decide how much of a credit risk you are,
  • 14:12 - 14:14
    how much of a health risk you are,
  • 14:14 - 14:16
    and what your insurance premiums should look like.
  • 14:16 - 14:18
    In America we have a big problem right now.
  • 14:18 - 14:23
    Insurance costs are out of control. Health insurance. We're having a lot of difficulty paying for it.
  • 14:23 - 14:28
    Insurance companies would like to respond to this problem
  • 14:28 - 14:31
    by knowing better who's a good risk and who's a bad risk
  • 14:31 - 14:35
    so they can lower prices for the good risk and raise prices for the bad risk.
  • 14:35 - 14:41
    Essentially they want to make people who are going to get sick, uninsurable.
  • 14:41 - 14:45
    And if you could know enough about a person to know what their risk factors are based on
  • 14:45 - 14:49
    what their digital life is, if you can get just a little bit of information about them,
  • 14:49 - 14:53
    maybe you can figure out who their parents are and what hereditary diseases they might be subject to,
  • 14:53 - 14:55
    you can start to understand these things.
  • 14:55 - 14:58
    You can start to figure out who's a good risk and who's a bad risk.
  • 14:58 - 15:04
    You can use this information for ends that seem reasonable if you're a health insurance
  • 15:04 - 15:07
    company, but probably don't seem reasonable if you're
  • 15:07 - 15:10
    the kind of person sitting in this room, the kind of person that I talk to.
  • 15:10 - 15:17
    And that's the problem. The innocuous use. The use that seems kind of icky, but not truly evil,
  • 15:17 - 15:19
    which is advertising.
  • 15:19 - 15:25
    It's the same mechanism, the same data, that then gets used for other purposes.
  • 15:25 - 15:32
    It's the same data that then gets turned over to a government who wants to oppress you
  • 15:32 - 15:36
    because you are supporting wikileaks.
  • 15:36 - 15:39
    And that's not a fantasy, that's what happened.
  • 15:39 - 15:49
    It's the same information that anybody who wants to know something about you for an evil end would use.
  • 15:49 - 15:56
    We have a saying in the world of information, that if the data exists, you can't decide what it gets
  • 15:56 - 15:58
    used for.
  • 15:58 - 16:03
    Once data exists, especially data in the hands of the government, of officials,
  • 16:03 - 16:05
    once that data exists, it's a resource.
  • 16:05 - 16:10
    And the use of that resource has its own energy, its own logic.
  • 16:10 - 16:15
    Once a resource is there begging to be used, it's very hard to stop it from being used.
  • 16:15 - 16:22
    Because it's so attractive, it's so efficient, it would solve so many problems to use the data.
  • 16:22 - 16:28
    And so once you collect the data, once the data exists in one centralized place,
  • 16:28 - 16:35
    for anybody to come and get it with a warrant, or maybe no warrant, or maybe some money...
  • 16:35 - 16:41
    somebody is going to come with a warrant, or no warrant, and they are going to get that data.
  • 16:41 - 16:42
    And they will use it for whatever they want to use it for.
  • 16:42 - 16:47
    Once it's out of the hands of the first person who collected it, who maybe you trust,
  • 16:47 - 16:52
    who maybe has good privacy policies, who maybe has no intention to do anything with your data
  • 16:52 - 16:58
    other than use it for diagnostic purposes, once it's out of that person's hands it's gone.
  • 16:58 - 17:00
    You never know where it goes after that.
  • 17:00 - 17:02
    It is completely uncontrolled and unchecked
  • 17:02 - 17:05
    and there is no ability to restrain what happens to that data.
  • 17:05 - 17:14
    So all of this is my attempt to convince you that privacy is a real value in our society,
  • 17:14 - 17:18
    and that the danger of losing privacy is a real problem.
  • 17:18 - 17:20
    It's not just the censorship, it's not just the filtering,
  • 17:20 - 17:26
    it's not just the propaganda, the influencing of opinion, that's one aspect of it,
  • 17:26 - 17:35
    it's not just the free speech. It's also the privacy, because privacy goes to the heart of our autonomy.
  • 17:35 - 17:43
    About a year and a half ago to two years ago at the Software Freedom Law Center
  • 17:43 - 17:47
    a man named Ian Sullivan who's a co-worker of mine,
  • 17:47 - 17:49
    he bought a bunch of plug servers,
  • 17:49 - 17:54
    because he was really excited at the thought of using them as print servers, and media servers,
  • 17:54 - 17:59
    and he started tinkering with them in our office.
  • 17:59 - 18:02
    My boss Eben Moglen who is a long-time activist in the Free Software movement,
  • 18:02 - 18:15
    fought very hard for Phil Zimmermann and PGP when that was a big issue,
  • 18:15 - 18:23
    he looked at this technology and he immediately realised that several streams had come together in one
  • 18:23 - 18:24
    place.
  • 18:24 - 18:27
    There's a lot of really good technology to protect your privacy right now.
  • 18:27 - 18:31
    In fact that's the stuff we're putting on the Freedom Box.
  • 18:31 - 18:33
    We're not writing new software.
  • 18:33 - 18:36
    We are gathering stuff, and putting it in one place.
  • 18:36 - 18:40
    Stuff that other people did because there are people who are better at writing software, and security,
  • 18:40 - 18:43
    than we are. We're software integrators.
  • 18:43 - 18:46
    And he realised there was all this software out there, and suddenly there was a box to put it on.
  • 18:46 - 18:53
    You could put all that software in one place, make it easy, and give it to people in one neat package.
  • 18:53 - 18:56
    Pre-installed, pre-configured, or as close to it as we can get.
  • 18:56 - 19:02
    And that, was the vision for the FreedomBox.
  • 19:02 - 19:08
    The FreedomBox is a tiny computer. Look at this.
  • 19:08 - 19:10
    That's small, it's unobtrusive.
  • 19:10 - 19:11
    So it's a small computer.
  • 19:11 - 19:16
    And we don't just mean small in size... it doesn't take a lot of energy.
  • 19:16 - 19:22
    I could be running this box on a couple of AA batteries for the life of this presentation.
  • 19:22 - 19:24
    You could run it on a solar panel.
  • 19:24 - 19:27
    It's very lightweight infrastructure.
  • 19:27 - 19:33
    You plug it into your home network, and when I say home network,
  • 19:33 - 19:35
    (I'm going to pass this around)
  • 19:35 - 19:38
    When I say home network, I mean home network.
  • 19:38 - 19:42
    This is technology we are designing for individuals to use to talk to their friends.
  • 19:42 - 19:47
    Our use-case, the thing we're trying to protect is you guys, as individuals in your communities.
  • 19:47 - 19:51
    This isn't a small-business appliance, it's not a large corporate appliance, this is a thing
  • 19:51 - 19:58
    that we are truly aiming at the home market, and people who care about privacy on an individual level.
  • 19:58 - 20:05
    You plug it into your home network to protect your privacy, your freedom, your anonymity and your security.
  • 20:05 - 20:09
    That is our mission statement, I guess. Unofficially.
  • 20:09 - 20:17
    That is what we believe we are trying to do with this device.
  • 20:17 - 20:22
    So, what privacy means in this context, the way we're going to go about trying to protect your privacy
  • 20:22 - 20:27
    is to connect you directly with other people and take everything you do and try to encrypt it
  • 20:27 - 20:31
    so that only you and the person you are talking to can see it. This is not a new idea.
  • 20:31 - 20:35
    We can do encrypted messaging, and we can do encrypted browsing.
  • 20:35 - 20:43
    Now there are problems with encrypted browsing. Right now if you want to have secure browsing you generally
  • 20:43 - 20:45
    use something called SSL.
  • 20:45 - 20:57
    SSL is a system of certificates that allows a web server to say to you "we can talk privately".
  • 20:57 - 21:01
    That's the first guarantee, a secure cryptographic connection (A).
  • 21:01 - 21:05
    and (B) I can authenticate to you that I am who I say I am.
  • 21:05 - 21:11
    So not only can nobody listen, but you know who you're talking to.
  • 21:11 - 21:18
    You're not secretly talking to the government when you believe you're talking to me.
  • 21:18 - 21:23
    The problem with SSL, the big problem with SSL, is that the system for signing certificates relies
  • 21:23 - 21:28
    on a trust hierarchy that goes back to a cartel of companies who have the server certificates,
  • 21:28 - 21:35
    who have the ability to do this "guarantee". So when the website says to you "I guarantee I am who I
  • 21:35 - 21:42
    am", you say "I don't know you, I don't trust you". And they say "Oh, but this other company, I paid
  • 21:42 - 21:47
    them money, and so they'll guarantee that I am me."
  • 21:47 - 21:52
    Which is a really interesting idea - because I also don't know this company, why would I trust that company?
  • 21:52 - 21:57
    I mean, the company is just old enough and influential enough that they could actually get their
  • 21:57 - 22:03
    authority into my browser. So really my browser is willing to accept at face-value that this website
  • 22:03 - 22:07
    is who it says it is, but I don't necessarily accept that.
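
That face-value acceptance is easy to see in code; a minimal sketch, assuming a stock TLS stack with its bundled CA list:

```python
# The client accepts any certificate chain that terminates in one of
# the authorities shipped with the system -- no questions asked.
import socket
import ssl

ctx = ssl.create_default_context()   # loads the built-in CA bundle
with socket.create_connection(("example.org", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.org") as tls:
        # Trusted solely because some CA in the bundle vouched for it.
        print(tls.getpeercert()["issuer"])
```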
  • 22:07 - 22:13
    And then, we have the problem of self-signed certificates. Where they say: none of those authorities
  • 22:13 - 22:17
    in your browser trust me, I trust myself and look, I've signed a piece of paper -
  • 22:17 - 22:20
    I swear I am who I say I am.
  • 22:20 - 22:24
    And that, is not trustworthy at all, right?
  • 22:24 - 22:27
    That's just him saying again "No, really! I'm me!".
  • 22:27 - 22:33
    So this is a problem, because the FreedomBoxes are not going to trust the SSL cartel,
  • 22:33 - 22:36
    and they are not going to trust each other, so they can't just sort of swear to each other that
  • 22:36 - 22:39
    they are who they are.
  • 22:39 - 22:45
    So we think we've solved this. I'm not going to say we've solved it, because we're just starting to tell
  • 22:45 - 22:52
    people about this idea, and I'm sure people will have reasons why the idea can be improved.
  • 22:52 - 22:58
    But there is a technology called Monkeysphere, that allows you to take an SSH key and wrap it around a
  • 22:58 - 23:03
    PGP key, and use a PGP key to authenticate SSH connections.
  • 23:03 - 23:10
    It's really neat technology that allows you to replace SSH trust with PGP trust.
  • 23:10 - 23:14
    And we looked at that, and we thought, why can't we do that with SSL?
  • 23:14 - 23:21
    So one thing we're going do with browsing is take an SSL certificate, an X.509 certificate,
  • 23:21 - 23:25
    and wrap it around a PGP key and send it through the normal SSL layer mechanisms
  • 23:25 - 23:32
    but when it gets to the other end, smart servers and smart browsers will open it up and use PGP mechanisms
  • 23:32 - 23:39
    to figure out how to trust people, to verify the connections, to sign the authentication of the identity
  • 23:39 - 23:42
    of the browser, of the server.
  • 23:42 - 23:48
    This allows us to replace the SSL cartel with the web of trust, the keyservers.
  • 23:48 - 23:57
    We're replacing a tiny group of companies that control everything with keyservers, community infrastructure.
  • 23:57 - 24:01
    Anyone can set up a keyserver, and you can decide which one you want to trust.
  • 24:01 - 24:02
    They share information.
  • 24:02 - 24:06
    The web of trust is built on people, telling each other that they trust each other.
  • 24:06 - 24:09
    Again, you can decide who to trust and how much you want to trust them.
  • 24:09 - 24:16
    This is emblematic of our approach. We've identified structures that are unreliable because
  • 24:16 - 24:20
    they are centralized, because they are controlled by interests that are not the same interests
  • 24:20 - 24:22
    as our interests.
  • 24:22 - 24:29
    And we've decided to replace them wherever we can with structures that rely on people,
  • 24:29 - 24:37
    that rely on human relationships, that rely less on the notion that you can buy trust, and more on the
  • 24:37 - 24:42
    notion that you earn trust, by being trustworthy, by having people vouch for you over time.
  • 24:42 - 24:50
    So that's our approach to encrypted browsing. It's also our approach to encrypted messaging.
  • 24:50 - 24:58
    We're doing Jabber for a lot of message passing, XMPP, and we're securing that again with PGP.
  • 24:58 - 25:02
    Everywhere we can we're going to try to use the PGP network, because it already exists...
  • 25:02 - 25:04
    as I said, we're not trying to invent anything new.
  • 25:04 - 25:10
    PGP already exists and it does a really good job. So we're taking the PGP trust system and we're
  • 25:10 - 25:16
    going to apply it to things like XMPP and make sure that we can do message passing in a way
  • 25:16 - 25:18
    that we can trust.
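
A hedged sketch of that idea, assuming the third-party python-gnupg package and a hypothetical key id: encrypt the payload to the friend's PGP key first, then hand the armored text to any XMPP library as an ordinary message body.

```python
import gnupg  # third-party python-gnupg, assumed installed

gpg = gnupg.GPG()
# Encrypt to the friend's key before the text touches the transport.
enc = gpg.encrypt("see you at the keysigning?", recipients=["BOB_KEY_ID"])
assert enc.ok
ciphertext = str(enc)  # ASCII-armored; send over XMPP like any message
```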
  • 25:18 - 25:26
    Once we have XMPP we have a way to send text, a way to send audio, sure...
  • 25:26 - 25:28
    but also you can send structured data.
  • 25:28 - 25:33
    Through that same channel. And you can send that data to buddy lists.
  • 25:33 - 25:39
    So the system starts to look like a way to pass data in a social way. And we think this is the
  • 25:39 - 25:42
    beginning of the social layer of the box.
  • 25:42 - 25:46
    At the bottom of the box we have a belief that the technology should be social
  • 25:46 - 25:48
    from the ground up.
  • 25:48 - 25:50
    And so we're building structures that allow it to be social,
  • 25:50 - 25:55
    that assume you want to connect with friends in a network of freedom,
  • 25:55 - 26:01
    perhaps FreedomBoxes, perhaps other kinds of software, other kinds of technology.
  • 26:01 - 26:04
    And we're designing with that in mind.
  • 26:04 - 26:08
    With that in mind, we think we get certain benefits technologically which I'll get into later.
  • 26:08 - 26:13
    We think we can simplify things like key management, through methods like this.
  • 26:13 - 26:19
    By privacy I also mean that we can install a proxy server, privoxy,
  • 26:19 - 26:21
    we think the answer is privoxy here,
  • 26:21 - 26:26
    privoxy on the box, so you can point your browser at the box, surf the web on the box,
  • 26:26 - 26:33
    and strip ads, strip cookies, stop Google from tracking you from website to website to website,
  • 26:33 - 26:43
    to remove the constant person sitting at your side, spying, recording, listening to everything you do.
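
In practice "point your browser at the box" is just proxy configuration; a sketch, using Privoxy's default port 8118 and a hypothetical hostname for the box:

```python
import requests

# All client traffic goes through the box; Privoxy strips ads,
# cookies and trackers before the request leaves the network.
proxies = {
    "http": "http://freedombox.local:8118",
    "https": "http://freedombox.local:8118",
}
page = requests.get("http://example.org/", proxies=proxies)
```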
  • 26:43 - 26:46
    In that vein, we don't just want to block ads and reject cookies,
  • 26:46 - 26:50
    we want to do something new, relatively new.
  • 26:50 - 27:02
    We think we want to munge your browser fingerprint, that unique pattern of data that is captured by your
  • 27:02 - 27:03
    user-agent string and what plugins you have, and all that stuff
  • 27:03 - 27:07
    that forms a unique profile of you that allows people to track your browser, companies to track your
  • 27:07 - 27:09
    browser as you hop along the web, even if they don't know anything about you.
  • 27:09 - 27:13
    It can sort of tie you to the browser, make profiles about your browser.
  • 27:13 - 27:16
    And that turns out to be a very effective way of figuring out who you are.
  • 27:16 - 27:23
    So even without a cookie, even without serving you with an ad, once they're talking to you they can
  • 27:23 - 27:26
    uniquely identify you, or relatively uniquely.
  • 27:26 - 27:32
    But it's relatively early in the browser fingerprint arms race.
  • 27:32 - 27:37
    We think that with a very little bit of changing, we can foil the recording.
  • 27:37 - 27:40
    and win this round at least.
  • 27:40 - 27:46
    And instead of having one profile where they gather all of your data, you will present to services
  • 27:46 - 27:51
    as a different person every time you use the service. So they cannot build profiles of you over time.
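
A toy version of that munging, covering just the User-Agent header (real fingerprints span plugins, fonts and much more; the strings below are made up):

```python
import random

import requests

AGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64)",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)",
]

def fetch(url):
    # Each call presents as a different browser, so visits are
    # harder to link into one long-lived profile.
    return requests.get(url, headers={"User-Agent": random.choice(AGENTS)})
```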
  • 27:51 - 27:53
    That's what privacy looks like in our context. We're looking for cheap ways to foil the tracking.
  • 27:55 - 28:02
    We're looking for easy things we can do, because we believe there's a lot of low-hanging fruit.
  • 28:02 - 28:05
    And we'll talk about that more in a minute.
  • 28:05 - 28:09
    Freedom is our value, freedom is the thing we are aiming for,
  • 28:09 - 28:13
    freedom from centralized structures like the pipes.
  • 28:13 - 28:19
    Now mesh networking, I have mesh networking in my slides. That is a lie.
  • 28:19 - 28:21
    We are not doing mesh networking.
  • 28:21 - 28:26
    The reason we are not doing mesh networking is because I do not know anything about mesh networking
  • 28:26 - 28:31
    and one of the reasons I came here was to meet people who know a lot about mesh networking
  • 28:31 - 28:34
    and I see people in this audience who know a lot about mesh networking.
  • 28:34 - 28:41
    If you want to turn that lie into the truth, the way you do that
  • 28:41 - 28:43
    is by continuing on your projects, making mesh networking awesome,
  • 28:43 - 28:46
    to the point where I can say yes, we're going to put that in this box.
  • 28:46 - 28:49
    Then eventually, by the time this box is ready to do real
  • 28:49 - 28:52
    things for real people, we're really hoping that the mesh story
  • 28:52 - 28:56
    coheres, where we've identified the protocol and the technology and the people who are going to help
  • 28:56 - 29:00
    us. If you think you might be one of those people, we want to talk to you.
  • 29:00 - 29:02
    So yes, we are going to do mesh networking,
  • 29:02 - 29:05
    and that might be a lie
  • 29:05 - 29:08
    but I hope not.
  • 29:08 - 29:10
    We want you to have the freedom to own your data
  • 29:10 - 29:16
    that means data portability, that means that your data sits on your box and never goes to a third party.
  • 29:16 - 29:18
    It only goes to the people you want it to go to.
  • 29:18 - 29:23
    Fine-grained access control. Your data, your structures, you decide where it goes.
  • 29:23 - 29:25
    That's a user-interface problem,
  • 29:25 - 29:27
    that's a user permission problem,
  • 29:27 - 29:29
    an access control problem.
  • 29:29 - 29:33
    Access control is a solved problem.
  • 29:33 - 29:37
    Doing it through a convenient user-interface, that's not solved... so that's work to be done.
  • 29:37 - 29:42
    That's a big chunk of our todo list.
  • 29:42 - 29:43
    We want you to own your social network
  • 29:43 - 29:50
    Before Facebook there was a thing called MySpace, which was... I'm not even sure it exists anymore.
  • 29:50 - 29:54
    Before MySpace there was Tribe.
  • 29:54 - 29:56
    Before Tribe there was Friendster.
  • 29:56 - 29:59
    Friendster is now like a... "gaming network".
  • 29:59 - 30:02
    I don't know what it is but they still send me email
  • 30:02 - 30:06
    Which is the only reason I know they're still alive.
  • 30:06 - 30:11
    Before Friendster was the original social network.
  • 30:11 - 30:15
    We called this social network "the internet".
  • 30:15 - 30:17
    We talked directly to each other,
  • 30:17 - 30:21
    we used email, an instant messenger and IRC.
  • 30:21 - 30:23
    We talked to people using the structures that were out there.
  • 30:23 - 30:27
    It wasn't centralized in one service, we had a lot of ways of meeting each other
  • 30:27 - 30:29
    and passing messages.
  • 30:29 - 30:31
    What we lacked was a centralized interface.
  • 30:31 - 30:35
    So when we say "own your social network" we mean use the services of the internet,
  • 30:35 - 30:37
    own the pieces that talk to each other.
  • 30:37 - 30:41
    Hopefully we'll provide you with a convenient interface to do that.
  • 30:41 - 30:44
    But the actual structures, the places where your data live,
  • 30:44 - 30:48
    that is just the same pieces that we know how to use already.
  • 30:48 - 30:51
    We are not going to try to reinvent how you talk to people,
  • 30:51 - 30:56
    we're just going to make it so that the pipes are secure.
  • 30:56 - 30:59
    A big part of freedom, a big part of privacy,
  • 30:59 - 31:02
    is anonymity.
  • 31:02 - 31:06
    Tor can provide anonymity.
  • 31:06 - 31:08
    But we don't have to go all the way to Tor.
  • 31:08 - 31:12
    Tor is expensive, in terms of latency.
  • 31:12 - 31:16
    Tor is difficult to manage...
  • 31:16 - 31:21
    I don't know how many people have tried to use Tor, to run all their traffic through Tor.
  • 31:21 - 31:23
    It's hard. For two reasons.
  • 31:23 - 31:26
    For one, the latency... it takes a very long time to load a web page.
  • 31:26 - 31:32
    And two, you look like a criminal. To every website that you go to.
  • 31:32 - 31:38
    My bank shut down my account when I used Tor.
  • 31:38 - 31:44
    Because suddenly, I was coming from an IP address in Germany from which they had, in the past, detected
  • 31:44 - 31:48
    efforts to hack them.
  • 31:48 - 31:52
    So they closed my account, well I had to talk to them about it,
  • 31:52 - 31:53
    it did all get solved in the end.
  • 31:53 - 31:57
    PayPal as well closed my account down.
  • 31:57 - 31:59
    So that was the end of my ability to use Tor.
  • 31:59 - 32:01
    So we can't just run all our traffic through Tor.
  • 32:01 - 32:07
    It's too slow, and the network has weird properties in terms of how you present to websites,
  • 32:07 - 32:08
    that frankly, are scary.
  • 32:08 - 32:16
    Because if I look like a criminal to the bank, I don't want to imagine what I look like to my own government.
  • 32:16 - 32:19
    But we can do privacy in other ways.
  • 32:19 - 32:25
    If you are a web user, in China, and you want to surf the internet,
  • 32:25 - 32:30
    with full access to every website you might go to, and with privacy from your government,
  • 32:30 - 32:34
    so that you don't get a knock on your door from visiting those websites,
  • 32:34 - 32:36
    we can do that without Tor.
  • 32:36 - 32:39
    We don't need Tor to do that. We can do that cheaply.
  • 32:39 - 32:45
    Because all you need to do in that situation is get your connection out of China.
  • 32:45 - 32:54
    Send your request for a web page through an encrypted connection to a FreedomBox in...
  • 32:54 - 32:58
    Austria, America, who knows?
  • 32:58 - 33:05
    Just get the request away from the people who physically have the power to control you.
  • 33:05 - 33:08
    And we can do that cheaply, that's just SSH port forwarding.
  • 33:08 - 33:14
    That's just a little bit of tunneling, that's just a little bit of VPN.
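
That "little bit of tunneling" can be as small as one SSH invocation; a sketch with a hypothetical host, equivalent to running `ssh -N -D 1080 user@box` by hand:

```python
import subprocess

subprocess.run([
    "ssh", "-N",                          # tunnel only, run no command
    "-D", "1080",                         # local SOCKS5 proxy on :1080
    "user@freedombox-abroad.example.org",
])
# Point the browser at socks5://localhost:1080; requests now exit
# from the box's network, away from the local censor.
```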
  • 33:14 - 33:16
    There's a lot of ways to do that sort of thing,
  • 33:16 - 33:20
    to give you anonymity and privacy in your specific context
  • 33:20 - 33:22
    without going all the way into something like Tor.
  • 33:22 - 33:25
    Now there are people who are going to need Tor.
  • 33:25 - 33:27
    They will need it for their use case.
  • 33:27 - 33:32
    But not every use case has to defend against that level of attack.
  • 33:32 - 33:37
    And so one of the things we're trying to do is figure out how much privacy and anonymity you need,
  • 33:37 - 33:40
    and from whom you need it.
  • 33:40 - 33:43
    If we can do that effectively we can give people solutions
  • 33:43 - 33:45
    that actually work for them. Because if we just tell people
  • 33:45 - 33:49
    to use Tor, we're going to have a problem.
  • 33:49 - 33:52
    They're not going to use it, and they won't get any privacy at all.
  • 33:52 - 33:55
    And that's bad.
  • 33:55 - 33:57
    So we want to allow people to do anonymous publishing,
  • 33:57 - 33:59
    and file-sharing, and web-browsing and email.
  • 33:59 - 34:01
    All the communications you want to do.
  • 34:01 - 34:03
    The technology to do that already exists,
  • 34:03 - 34:05
    we could do all of that with Tor.
  • 34:05 - 34:09
    The next piece of our challenge is to figure out how to do it without Tor.
  • 34:09 - 34:12
    To figure out what pieces we need Tor for, and to figure out
  • 34:12 - 34:17
    what pieces we can do a little bit more cheaply.
  • 34:17 - 34:19
    Security.
  • 34:19 - 34:23
    Without security, you don't have freedom and privacy and anonymity.
  • 34:23 - 34:25
    If the box isn't secure,
  • 34:25 - 34:27
    you lose.
  • 34:27 - 34:32
    We're going to encrypt everything.
  • 34:32 - 34:36
    We're going to do something that's called social key management, which I'm going to talk about.
  • 34:36 - 34:39
    I do want to talk about the Debian-based bit.
  • 34:39 - 34:42
    We are based on a distribution of Linux called Debian,
  • 34:42 - 34:46
    because it is a community-based distribution.
  • 34:46 - 34:48
    It is made by people who care a lot about your
  • 34:48 - 34:51
    freedom, your privacy, and your ability to speak anonymously.
  • 34:51 - 34:55
    And we really believe that the best way to distribute this
  • 34:55 - 34:58
    software is to hand it to the Debian mirror network and let
  • 34:58 - 35:00
    them distribute it. Because they have mechanisms
  • 35:00 - 35:02
    to make sure that nobody changes it.
  • 35:02 - 35:05
    If we were to distribute the software to you directly, we
  • 35:05 - 35:09
    would become a target. People would want to change the
  • 35:09 - 35:11
    software as we distribute it on our website.
  • 35:11 - 35:13
    They would want to crack our website and distribute their
  • 35:13 - 35:15
    version of the package.
  • 35:15 - 35:18
    We don't want to be a target, so we're not going to give you software.
  • 35:18 - 35:21
    We're going to give it to Debian, and let them give you the software.
  • 35:21 - 35:26
    And at the same time you get all of the Debian guarantees about freedom.
  • 35:26 - 35:28
    The Debian Free Software Guidelines.
  • 35:28 - 35:32
    They're not going to give you software unless it comes
  • 35:32 - 35:37
    with all of the social guarantees that are required to participate in the Debian community.
  • 35:37 - 35:39
    So we're very proud to be using Debian in this manner,
  • 35:39 - 35:41
    and working with Debian in this manner.
  • 35:41 - 35:44
    And we think that's the most effective way we can guarantee that we're going to live up to
  • 35:44 - 35:51
    our promises to you, because it provides a mechanism whereby if we fail to live up to our promises,
  • 35:51 - 35:56
    we cannot give you something that is broken. Because Debian won't let us,
  • 35:56 - 35:59
    they just won't distribute it.
  • 35:59 - 36:02
    There are problems with security.
  • 36:02 - 36:04
    There are things we can't solve.
  • 36:04 - 36:05
    One...
  • 36:05 - 36:08
    Physical security of the box.
  • 36:08 - 36:13
    We haven't really talked much internally about whether we can encrypt the filesystem on this box.
  • 36:13 - 36:16
    I don't quite see a way to do it.
  • 36:16 - 36:20
    It doesn't have an interface for you to enter a password effectively.
  • 36:20 - 36:23
    By the time you've brought an interface up you'd be running untrusted code.
  • 36:23 - 36:25
    I don't know a way to do it.
  • 36:25 - 36:29
    If anyone can think of a way that we can effectively encrypt the filesystem, I'd love to hear it.
  • 36:29 - 36:35
    But, on top of that, if we do encrypt the filesystem,
  • 36:35 - 36:38
    then the thing cannot be rebooted remotely, which is a downside.
  • 36:38 - 36:40
    So there are trade-offs at every step of the way.
  • 36:40 - 36:45
    If we can figure out some of these security issues, then we can be ahead of the game.
  • 36:45 - 36:50
    But I think encrypting the filesystem is the only way to guarantee the box is secure, even if it's
  • 36:50 - 36:52
    not physically secure.
  • 36:52 - 36:53
    So I think that's a big one.
  • 36:53 - 36:58
    If you have ideas about that, please come and talk to me after the talk.
  • 36:58 - 37:01
    I promised I would talk about social key management, and here it is.
  • 37:01 - 37:06
    So we're building the idea of knowing who your friends are
  • 37:06 - 37:08
    into the box at a somewhat low level.
  • 37:08 - 37:12
    To the point where things that are on the box can assume it is there,
  • 37:12 - 37:17
    or ask you if it's there, or rely on it as a matter of course in some cases.
  • 37:17 - 37:21
    So we can do things with keys that make your keys unlosable.
  • 37:21 - 37:25
    Right now a PGP key is a hard thing to manage.
  • 37:25 - 37:26
    Key management is terrible.
  • 37:26 - 37:30
    Do you guys like PGP? PGP is good.
  • 37:30 - 37:34
    Does anyone here like key management?
  • 37:34 - 37:36
    We have one guy who likes key management.
  • 37:36 - 37:39
    LAUGHTER
  • 37:39 - 37:41
    He's going to do it for all of you!
  • 37:41 - 37:43
    So, none of us like key management.
  • 37:43 - 37:46
    Key management doesn't work, especially if your use-case is home users, naive end-users.
  • 37:46 - 37:48
    Nobody wants to do key management.
  • 37:48 - 37:51
    Writing their key down and putting it in a safety deposit box is ludicrous.
  • 37:51 - 37:54
    It's a very difficult thing to actually convince people to do.
  • 37:54 - 38:00
    Sticking it on a USB key, putting it in a zip-lock bag and burying it in your backyard is paranoid.
  • 38:00 - 38:03
    I can't believe I just told you what I do with my key.
  • 38:03 - 38:04
    LAUGHTER
  • 38:04 - 38:06
    No, you can't ask people to do that.
  • 38:06 - 38:08
    They won't do it.
  • 38:08 - 38:09
    You can't protect keys in this manner.
  • 38:09 - 38:13
    You have to have a system that allows them to sort of, not ever know they have a key.
  • 38:13 - 38:16
    To not think about their key unless they really want to.
  • 38:16 - 38:19
    We think we've come up with something that might work.
  • 38:19 - 38:20
    You take the key,
  • 38:20 - 38:22
    or a subkey,
  • 38:22 - 38:24
    you chop it into little bits
  • 38:24 - 38:25
    and you give that key...
  • 38:25 - 38:31
    and we're talking about a key of a very long length, so there's a giant attack space
  • 38:31 - 38:36
    and you can chop it into bits and hand it to people without reducing the search space for a key.
  • 38:36 - 38:39
    You chop it into bits and hand all the bits to your friends.
  • 38:39 - 38:42
    Now all your friends have your key, as a group.
  • 38:42 - 38:44
    Individually, none of them can attack you.
  • 38:44 - 38:47
    Individually, none of them has the power to come root your box,
  • 38:47 - 38:50
    to access your services and pretend to be you.
  • 38:50 - 38:53
    As a group, they can do this.
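
One way to realise that chop-and-distribute idea is an n-of-n XOR split, where every friend's share alone is random noise; a sketch (the actual FreedomBox scheme may differ, e.g. a threshold scheme like Shamir's):

```python
import secrets

def split(key: bytes, n: int) -> list[bytes]:
    """Split key into n shares; all n are needed to recover it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:                      # last = key XOR all others
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
assert combine(split(key, 5)) == key      # five friends, acting together
```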
  • 38:53 - 39:04
    We trust our friends, as a group, more than we trust them as individuals.
  • 39:04 - 39:08
    Any single one of your friends, if you gave them the key to your financial data and your private online
  • 39:08 - 39:10
    life that would make you very nervous.
  • 39:10 - 39:14
    You would worry that they would succumb to temptation to peek,
  • 39:14 - 39:17
    fall on hard times and want to attack you in some way,
  • 39:17 - 39:19
    fall out with you, get mad at you.
  • 39:19 - 39:23
    As an individual, people are sort of fallible in this sense.
  • 39:23 - 39:25
    But as a group of friends who would have to get together
  • 39:25 - 39:30
    and affirmatively make a decision to attack you,
  • 39:30 - 39:32
    we think that's extremely unlikely.
  • 39:32 - 39:38
    It's so unlikely that there are only a few scenarios where we think it might happen.
  • 39:38 - 39:39
    One...
  • 39:39 - 39:42
    if you are ill, and unable to access your box
  • 39:42 - 39:44
    or you're in jail
  • 39:44 - 39:45
    or you've passed away
  • 39:45 - 39:49
    or you've disappeared.
  • 39:49 - 39:52
    Or... you've gone crazy.
  • 39:52 - 39:57
    We call this type of event, where all your friends get together and help you,
  • 39:57 - 39:59
    even if you don't ask them for help,
  • 39:59 - 40:02
    we call that an intervention.
  • 40:02 - 40:05
    When your friends sit you down and say,
  • 40:05 - 40:09
    "you need our help, you can't ask us for it because you're not in a position to ask us for it",
  • 40:09 - 40:10
    that's an intervention.
  • 40:10 - 40:16
    If you have a moment in your life, a crisis in your life that is an intervention level event,
  • 40:16 - 40:18
    that's when you can go to your friends.
  • 40:18 - 40:22
    If your house burns down, you lose your key and all your data
  • 40:22 - 40:25
    You go to your friends, and you say "can I have part of my key back?"
  • 40:25 - 40:29
    "Oh, and give me that data that you have in a cryptographically-sealed box that you can't read."
  • 40:29 - 40:31
    To all your friends...
  • 40:31 - 40:32
    "My data please, my key please, ..."
  • 40:32 - 40:32
    "My data please, my key please, ..."
  • 40:32 - 40:34
    "My data please, my key please, ..."
  • 40:34 - 40:39
    You take all those pieces, you get a new box,
  • 40:39 - 40:42
    you load it all onto your box.
  • 40:42 - 40:47
    You have the key, you have your entire key, and now you can read your data.
  • 40:47 - 40:49
    And you haven't lost your digital life.
  • 40:49 - 40:54
    You have a key that is now unlosable.
  • 40:54 - 40:58
    Even if you never wrote it down, even if you never buried it in the backyard.
  • 40:58 - 41:00
    This is a hard problem in key management.
  • 41:00 - 41:04
    People lose their keys and their passwords to services all the time.
  • 41:04 - 41:09
    The only way we can think of to make that impossible, is this mechanism.
  • 41:09 - 41:10
    And of course it's optional.
  • 41:10 - 41:13
    If you're a person who doesn't trust your friends, even as a group,
  • 41:13 - 41:17
    or if you're a person who just doesn't have a lot of friends
  • 41:17 - 41:20
    (let me finish!)
  • 41:20 - 41:25
    ...who doesn't have a lot of friends with FreedomBoxes who can be the backend for this,
  • 41:25 - 41:27
    you don't have to trust this mechanism.
  • 41:27 - 41:30
    You can do something else to make your key unforgettable.
  • 41:30 - 41:32
    But for a lot of naive end-users,
  • 41:32 - 41:34
    this is the mechanism.
  • 41:34 - 41:36
    This is the way they are going to never
  • 41:36 - 41:37
    lose their keys
  • 41:37 - 41:41
    Because the first time a user gets irretrievably locked out of his FreedomBox,
  • 41:41 - 41:43
    we lose that user forever.
  • 41:43 - 41:45
    And we lose all his friends forever.
  • 41:45 - 41:52
    Because it would scare you to lose such an important group of information.
  • 41:52 - 41:53
    Social key management.
  • 41:53 - 41:58
    This is the benefit of building social, of building knowledge
  • 41:58 - 42:03
    of who your friends are, into the box, at a deep level.
  • 42:03 - 42:05
    We have never done that before, with a technology
  • 42:05 - 42:08
    as a community project.
  • 42:08 - 42:11
    And it opens up new possibilities. This is just one.
  • 42:11 - 42:13
    There are others.
  • 42:13 - 42:15
    But it's a field we haven't really thought a lot about.
  • 42:15 - 42:19
    I think once we get out there and we start doing this kind of
  • 42:19 - 42:25
    construction, a lot of new uses are going to be found for this architecture.
  • 42:25 - 42:28
    I encourage you all to think about what changes,
  • 42:28 - 42:34
    when you can assume that the box has people you can trust, just a little bit,
  • 42:34 - 42:38
    because right now we live in a world where we are asked
  • 42:38 - 42:42
    to trust third party services like Facebook with all our photos,
  • 42:42 - 42:46
    or Flickr with all our photos, or Gmail with all our email.
  • 42:46 - 42:47
    We are asked to trust them.
  • 42:47 - 42:50
    We have no reason to trust them.
  • 42:50 - 42:54
    I mean, we expect that they'll act all right, because they have no reason to destroy us.
  • 42:54 - 42:56
    But we don't know what's going to happen.
  • 42:56 - 43:01
    We're effectively giving all our information to people we don't trust at all right now.
  • 43:01 - 43:04
    How does a network of people we trust, just a little bit,
  • 43:04 - 43:06
    change the landscape?
  • 43:06 - 43:09
    I think that's a really interesting question.
  • 43:09 - 43:10
    This box explores that question,
  • 43:10 - 43:16
    this box creates new solutions to old problems that previously seemed intractable.
  • 43:16 - 43:19
    So, I encourage everybody to think about how that might
  • 43:19 - 43:27
    change the solution to a problem they have with a technological architecture as it exists today.
  • 43:27 - 43:31
    Here's another problem...
  • 43:31 - 43:34
    Boxes that know who you are, and know who your friends are,
  • 43:34 - 43:37
    and know how your friends normally act,
  • 43:37 - 43:41
    can also know when your friends are acting weird.
  • 43:41 - 43:49
    If you have a friend who sends you one email a year, who suddenly sends you ten emails in a day,
  • 43:49 - 43:51
    that look like spam,
  • 43:51 - 43:53
    you know that box is rooted.
  • 43:53 - 43:55
    You know that box is weird.
  • 43:55 - 43:59
    Or if you are using the FreedomBox as your gateway to the internet,
  • 43:59 - 44:05
    and a box it is serving downstream, starts sending a bunch of spam through it, it knows.
  • 44:05 - 44:08
    It can say "Oh no! You're acting like a zombie."
  • 44:08 - 44:10
    "You should get a check-up."
  • 44:10 - 44:15
    It can shut off mail service to that box, and not let the messages out.
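
A toy version of that decision, with invented thresholds: compare today's mail volume from a friend's box against its historical rate and hold the mail when it spikes.

```python
from collections import defaultdict

history = defaultdict(list)                 # sender -> past daily counts

def looks_rooted(sender: str, todays_count: int) -> bool:
    past = history[sender]
    baseline = max(sum(past) / len(past), 1.0) if past else 1.0
    return todays_count >= 10 * baseline    # 10x the usual rate: suspicious

history["alice@example.org"] = [0, 1, 0, 0]   # roughly one mail a year
print(looks_rooted("alice@example.org", 10))  # True -> hold the mail
```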
  • 44:15 - 44:21
    It can make that decision to protect the wider internet to make you a better citizen in the world.
  • 44:21 - 44:27
    If suddenly your computer starts saying "Hey, I'm in Scotland and I need $5000"...
  • 44:27 - 44:30
    but we know you're not in Scotland
  • 44:30 - 44:33
    Maybe this box, because it has contact information,
  • 44:33 - 44:35
    maybe this box sends you an SMS.
  • 44:35 - 44:40
    And says "Dude, you've been hacked, go do something about your box."
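    As a rough sketch of the heuristic being described, the code below tracks how often each friend normally sends mail, flags a sudden burst, and fires a notification. The history store and the send_sms hook are hypothetical stand-ins, not actual FreedomBox code.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hedged sketch of per-friend anomaly detection: compare a friend's
# recent send rate to their long-term average. The history store and
# the SMS hook are hypothetical stand-ins for real FreedomBox plumbing.

history = defaultdict(list)  # friend -> timestamps of received messages

def record_message(friend, when=None):
    history[friend].append(when or datetime.utcnow())

def looks_weird(friend, window=timedelta(days=1), factor=10):
    """True if the friend's send rate in `window` is `factor` times
    their long-term daily average (e.g. one mail a year, ten today)."""
    msgs = sorted(history[friend])
    if len(msgs) < 2:
        return False
    span_days = max((msgs[-1] - msgs[0]).days, 1)
    daily_avg = len(msgs) / span_days
    recent = sum(1 for t in msgs if t >= msgs[-1] - window)
    return recent > factor * max(daily_avg, 1)

def send_sms(number, text):
    print("SMS to %s: %s" % (number, text))  # hypothetical gateway hook

def check_friend(friend, phone_number):
    if looks_weird(friend):
        send_sms(phone_number,
                 "Dude, you've been hacked, go do something about your box.")
```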
  • 44:40 - 44:43
    So the types of things we can do once we assume we have
  • 44:43 - 44:49
    close relations as opposed to arms-length relations,
  • 44:49 - 44:51
    the types of things we can do when we trust each other a little bit
  • 44:51 - 44:54
    and we trust our boxes a little bit, goes way up.
  • 44:54 - 44:55
    Way up.
  • 44:55 - 44:58
    And by bringing that infrastructure closer to us,
  • 44:58 - 45:03
    I mean Gmail is too far away to play that role from a network perspective.
  • 45:03 - 45:08
    But if the box is on our LAN, we can do that.
  • 45:08 - 45:11
    These boxes will only work if they are convenient.
  • 45:11 - 45:14
    There's an old punk-rock slogan, from the Dead Kennedys,
  • 45:14 - 45:18
    "Give me convenience, or give me death."
  • 45:18 - 45:24
    We laugh at that, but that's a belief users have,
  • 45:24 - 45:26
    and I deduce that based on their behaviour,
  • 45:26 - 45:29
    because every time there is a convenient web service,
  • 45:29 - 45:31
    people use it.
  • 45:31 - 45:34
    Even if it's not very good with privacy, a lot of people are going to use it.
  • 45:34 - 45:41
    And conversely, whenever we have web services that are very good at privacy, but aren't very convenient,
  • 45:41 - 45:44
    comparatively fewer people use them.
  • 45:44 - 45:47
    We don't think this box works without convenience.
  • 45:47 - 45:51
    If we don't get the user-interface right then this project
  • 45:51 - 45:53
    will probably fall over.
  • 45:53 - 45:56
    It will never gain any sort of critical mass.
  • 45:56 - 45:57
    So we need a simple interface,
  • 45:57 - 46:00
    we need a way for users to interact with this box in a minimal way.
  • 46:00 - 46:03
    They should think about it as little as possible.
  • 46:03 - 46:06
    That's the hardest problem we face.
  • 46:06 - 46:07
    Quite frankly.
  • 46:07 - 46:10
    The technology to do private communication, that exists.
  • 46:10 - 46:14
    A lot of the people in this room helped to build that infrastructure and technology.
  • 46:14 - 46:16
    We can put it on the box.
  • 46:16 - 46:21
    Making it easy and accessible for users, that's hard.
  • 46:21 - 46:23
    And right now we're trying to figure out what that looks like,
  • 46:23 - 46:25
    who the designers are going to be.
  • 46:25 - 46:30
    If you have user interface or user experience design skills that you want to bring to a project like this,
  • 46:30 - 46:33
    please, please, come find me.
  • 46:33 - 46:38
    In order to have convenience, we need to have the thing provide services that are not just
  • 46:38 - 46:44
    freedom-oriented, we need to use its position in your network as a trusted device
  • 46:44 - 46:48
    to do things for you that aren't just about privacy.
  • 46:48 - 46:50
    It needs to do backups.
  • 46:50 - 46:52
    This is important.
  • 46:52 - 46:56
    Right now the way people back up their photos is by giving them to Flickr.
  • 46:56 - 47:00
    The way they back up their email is by giving it to Gmail.
  • 47:00 - 47:06
    If we don't provide backups, we can never be an effective replacement
  • 47:06 - 47:09
    for the services that store your data somewhere else.
  • 47:09 - 47:14
    Even though they're storing it out there in the cloud for their purposes, you get a benefit from it.
  • 47:14 - 47:16
    We have to replicate that benefit.
  • 47:16 - 47:19
    So things that we don't think of as privacy features have to
  • 47:19 - 47:21
    be in the box.
  • 47:21 - 47:25
    The backups, the passwords, and the keys, you can't forget them.
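    As a minimal sketch of that backup duty, assuming friends' boxes show up as mounted paths, the code below bundles a directory and drops copies onto each friend's box. Every path is illustrative, and a real box would encrypt the archive before it leaves the device.

```python
import tarfile
from datetime import date
from pathlib import Path

# Hedged sketch: archive a directory and copy it to trusted friends'
# boxes. Paths are illustrative; a real FreedomBox would encrypt the
# archive and push it over an authenticated channel, not a local mount.

def make_backup(source_dir, friend_mounts):
    archive_name = "backup-%s.tar.gz" % date.today().isoformat()
    tmp = Path("/tmp") / archive_name
    with tarfile.open(str(tmp), "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)
    for mount in friend_mounts:
        (Path(mount) / archive_name).write_bytes(tmp.read_bytes())

# Illustrative usage:
# make_backup("/home/user/photos", ["/mnt/friend-alice", "/mnt/friend-bob"])
```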
  • 47:25 - 47:29
    We would like it to be a music, a video, a photo server,
  • 47:29 - 47:33
    all the kinds of things you might expect from a convenient box on your network.
  • 47:33 - 47:37
    All the things that you want to share with other people, this box has to do those things.
  • 47:37 - 47:44
    And these aren't privacy features, but without them we won't be able to give people privacy.
  • 47:44 - 47:49
    Our first feature, the thing we are working towards
  • 47:49 - 47:50
    is Jabber.
  • 47:50 - 47:53
    It's secure encrypted chat, point-to-point.
  • 47:53 - 47:57
    That's the thing we are working on right now.
  • 47:57 - 48:02
    But in order to do that we need to solve this Monkeysphere-ish SSL problem that I described.
  • 48:02 - 48:06
    We have code, it needs to get packaged and all that.
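    To give a sense of what that first feature involves, here is a minimal sketch of sending one message over Jabber/XMPP with SleekXMPP, a Python library contemporary with this talk. The accounts and servers are placeholders, and the Prosody-side setup and the SSL fix just mentioned are out of scope.

```python
import sleekxmpp

# Hedged sketch: one Jabber/XMPP chat message via SleekXMPP. The JIDs
# are placeholders; a FreedomBox would run the server side (Prosody)
# and layer encryption and key handling on top of this.

class SendOnce(sleekxmpp.ClientXMPP):
    def __init__(self, jid, password, recipient, body):
        super(SendOnce, self).__init__(jid, password)
        self.recipient, self.body = recipient, body
        self.add_event_handler("session_start", self.start)

    def start(self, event):
        self.send_presence()
        self.get_roster()
        self.send_message(mto=self.recipient, mbody=self.body)
        self.disconnect(wait=True)  # flush the send queue, then go

if __name__ == "__main__":
    bot = SendOnce("me@mybox.example", "secret",
                   "friend@theirbox.example", "Hello from my FreedomBox")
    if bot.connect():
        bot.process(block=True)
```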
  • 48:06 - 48:10
    Our development strategy, the way we are going to do all the things we said,
  • 48:10 - 48:15
    because the list of things I have said we're going to do...
  • 48:15 - 48:19
    I can't believe you're not throwing things at me.
  • 48:19 - 48:21
    Because it's ludicrous to believe that we can actually do all these things by ourselves.
  • 48:21 - 48:23
    And we're not.
  • 48:23 - 48:25
    We're going to let other people make the software.
  • 48:25 - 48:28
    As much as possible we're going to encourage other people
  • 48:28 - 48:31
    to build stuff. We're going to use stuff that already exists.
  • 48:31 - 48:35
    We're going to use Privoxy, we're going to use Prosody, we're going to use Apache.
  • 48:35 - 48:38
    We're not going to reinvent the web server, we're not going to reinvent protocols.
  • 48:38 - 48:45
    I really hope that by the time this project is mature, we haven't invented any new protocols.
  • 48:45 - 48:48
    Maybe we'll use new protocols, but I don't want to be
  • 48:48 - 48:53
    generating new things that haven't been tested, and then putting them in FreedomBox.
  • 48:53 - 48:58
    I want to see things in the real world, tested, gain credibility and take them.
  • 48:58 - 49:01
    The less we invent, the better.
  • 49:01 - 49:07
    As far as timelines go, by the time we have it ready, you'll know why you need it.
  • 49:07 - 49:10
    People right now are figuring out that privacy is important.
  • 49:10 - 49:12
    They're seeing it over and over again.
  • 49:12 - 49:18
    In Egypt, at the start of the Arab Spring, one of the things the government did to try to
  • 49:18 - 49:22
    tamp down the organisation was to convince companies to shut off cell networks,
  • 49:22 - 49:25
    to prevent people from talking to each other.
  • 49:25 - 49:28
    In America they did the same thing in San Francisco I hear.
  • 49:28 - 49:36
    Turned off the cell towers to prevent people from organising to meet for a protest.
  • 49:36 - 49:42
    With Occupy Wall Street, you're starting to see infiltration,
  • 49:42 - 49:45
    you're starting to see people going and getting information
  • 49:45 - 49:48
    that Occupy Wall Street is talking about and turning it over
  • 49:48 - 49:51
    to the authorities, the police, the FBI.
  • 49:51 - 49:59
    So, as we enter a new age of increased activism, we hope,
  • 49:59 - 50:01
    of increased social activity,
  • 50:01 - 50:06
    I think the need for a lot of this privacy stuff is going to become clear.
  • 50:06 - 50:11
    As the technology for invading your privacy improves,
  • 50:11 - 50:18
    the need for technology to protect your privacy will become stark and clear.
  • 50:18 - 50:22
    Our two big challenges as I said are user experience,
  • 50:22 - 50:27
    and the one I didn't say was paying for developers, paying for designers.
  • 50:27 - 50:31
    Those are the hard parts that we're working on.
  • 50:31 - 50:35
    And if we fail, we think that's where we fail.
  • 50:35 - 50:40
    Software isn't on that list, as I said software is already out there.
  • 50:40 - 50:42
    So you can have a FreedomBox.
  • 50:42 - 50:46
    If you like that box that we've been passing around the audience, you can buy one from Globalscale.
  • 50:46 - 50:51
    If you don't want the box, it's just Debian, it's just Linux, it's just packages.
  • 50:51 - 50:56
    Throw Debian on a box, we will have packages available through the normal Debian mechanisms.
  • 50:56 - 50:58
    You don't even have to use our repository.
  • 50:58 - 51:01
    In fact, I don't think we're going to have a repository.
  • 51:01 - 51:06
    You're just going to download it and install it the same way you normally do it if you're technologically
  • 51:06 - 51:08
    capable of doing that.
  • 51:08 - 51:10
    I grabbed a bunch of photos from Flickr,
  • 51:10 - 51:14
    my colleague Ian Sullivan took that awesome picture of the FreedomBox.
  • 51:14 - 51:17
    And that's how you reach me.
  • 51:18 - 51:31
    APPLAUSE
  • 51:39 - 51:44
    Thanks James, please sit down.
  • 51:44 - 51:49
    We are up for questions from the audience for James.
  • 51:49 - 52:03
    Please raise your hand if you have any questions about the FreedomBox.
  • 52:03 - 52:05
    Hello, thanks that was a very interesting presentation.
  • 52:05 - 52:06
    Thank you.
  • 52:06 - 52:10
    Your boss Eben Moglen, he has given a speech at a committee of the US congress
  • 52:10 - 52:13
    I believe, which has received a lot of attention
  • 52:13 - 52:18
    and in Iran during the green movement the US state department
  • 52:18 - 52:24
    I believe has told Twitter to reschedule maintenance so that
  • 52:24 - 52:29
    the opposition could keep using Twitter during the attempted revolution
  • 52:29 - 52:33
    and Hillary Clinton has given a very popular speech about
  • 52:33 - 52:36
    how America would support the promotion of internet freedom
  • 52:36 - 52:40
    and I think things such as the New America Foundation are
  • 52:40 - 52:46
    funding and supporting projects such as the Commotion mesh networking project
  • 52:46 - 52:49
    that we've already heard about before.
  • 52:49 - 52:52
    So in other words there's a link between politics and technology sometimes,
  • 52:52 - 52:57
    and in the past I believe certain influential Americans such as
  • 52:57 - 53:03
    Rupert Murdoch or George W. Bush have viewed modern communication technologies as a way to
  • 53:03 - 53:09
    promote U.S. foreign policy and to spread democracy and freedom in the world.
  • 53:09 - 53:14
    So my question is, what is your relationship with your government?
  • 53:14 - 53:16
    That's a really good question.
  • 53:16 - 53:21
    So one of the things that we sort of figured out from the beginning was that
  • 53:21 - 53:25
    if we had close relationships with the U.S. government,
  • 53:25 - 53:29
    people outside of the U.S. might have difficulty trusting us,
  • 53:29 - 53:34
    because nobody wants to tell all their secrets to the American government.
  • 53:34 - 53:42
    So we were thinking about what that really looks like in the context of a box that could be used globally.
  • 53:42 - 53:48
    We are working very hard to engineer a device that does not require you to trust us.
  • 53:48 - 53:50
    I'm not asking for your trust.
  • 53:50 - 53:55
    I'm not asking for your trust, I'm asking for your help.
  • 53:55 - 53:59
    All the code we write you'll be able to see it, you'll be able to
  • 53:59 - 54:02
    audit it, you'll be able to make your own decisions about what it does,
  • 54:02 - 54:05
    you'll be able to test whether it is trustworthy or not,
  • 54:05 - 54:10
    and if you decide that it is not, you can tell everyone,
  • 54:10 - 54:11
    and they won't use it.
  • 54:11 - 54:16
    So from a trust perspective, it doesn't matter what our relationship is with anybody.
  • 54:16 - 54:18
    So that's the first thing.
  • 54:18 - 54:23
    The second thing is that right now we don't have much of a relationship with the U.S. government.
  • 54:23 - 54:33
    Jacob Appelbaum is somewhat famous for his work with Julian Assange on WikiLeaks,
  • 54:33 - 54:36
    and his work on Tor, and security in general,
  • 54:36 - 54:39
    his efforts to provide you with freedom and privacy.
  • 54:39 - 54:45
    He is a guy who, the Wall Street Journal recently revealed, the U.S. government has been spying
  • 54:45 - 54:51
    on. And he is on our team, he's on our technical advisory committee.
  • 54:51 - 54:56
    He's one of the people we go to for help when we need to understand security on the box.
  • 54:56 - 55:02
    So right now our position with the American government is that we're not really related except in
  • 55:02 - 55:05
    so far as we are a bunch of people who really care about these issues,
  • 55:05 - 55:12
    which maybe occasionally makes us targets. Which gives us a reason to use a box like this.
  • 55:12 - 55:21
    Coupled with that, there is a program in America - you were talking about Hillary Clinton saying
  • 55:21 - 55:26
    she was going to encourage technologies that will spread democracy.
  • 55:26 - 55:30
    So the way America encourages things is by spending money on it.
  • 55:30 - 55:34
    That's our typical way to support programs. We fund different things.
  • 55:34 - 55:40
    We don't generally have feel-good campaigns, we just pay people to make good work, or try to.
  • 55:40 - 55:46
    So the U.S. state department has a program to provide funding for projects like the FreedomBox.
  • 55:46 - 55:48
    We have not applied for that funding.
  • 55:48 - 55:50
    I don't know if we will.
  • 55:50 - 55:56
    However I do know that they have given funding to some very good and genuine projects that are
  • 55:56 - 56:00
    run by people I trust, so I try not to be cynical about that.
  • 56:00 - 56:06
    I imagine at some point that through a direct grant or a sub-grant or something,
  • 56:06 - 56:11
    some state department money might support some aspect of work that is related to us.
  • 56:11 - 56:15
    I mean, we might take work from a project that is state department funded,
  • 56:15 - 56:17
    just because it's quick work.
  • 56:17 - 56:20
    Have I answered your question?
  • 56:20 - 56:21
    Yes, thanks.
  • 56:32 - 56:37
    Hi, well you always have tension if you talk about privacy
  • 56:37 - 56:41
    since 9/11 you know, I heard this in America very often,
  • 56:41 - 56:44
    "we have to be careful", everybody is suspicious and stuff.
  • 56:44 - 56:48
    So how do you react when people like the government say well,
  • 56:48 - 56:55
    you are creating a way to support terrorism, whatever.
  • 56:55 - 57:00
    That's a good question, and it's a common question.
  • 57:00 - 57:04
    Frankly every time I do this talk, it's one of the first questions that come up.
  • 57:04 - 57:06
    The answer is really simple.
  • 57:06 - 57:11
    The fact is, this box doesn't create any new privacy technology.
  • 57:11 - 57:15
    It just makes it easier to use and easier to access.
  • 57:15 - 57:21
    People who are committed to terrorism or criminal activity, they have sufficient motivation that they
  • 57:21 - 57:23
    can use the technology that exists. Terrorists are already using PGP.
  • 57:23 - 57:27
    They're already using Tor.
  • 57:27 - 57:30
    They're already using stuff to hide their data.
  • 57:30 - 57:33
    At best we are helping stupid terrorists.
  • 57:33 - 57:35
    LAUGHTER
  • 57:35 - 57:42
    Granted, I'm not excited about that, but I don't think that's a sufficient reason to deny common people
  • 57:42 - 57:44
    access to these technologies.
  • 57:44 - 57:49
    And more importantly than the fact that terrorists and criminals have access to this technology,
  • 57:49 - 57:52
    governments have access to this technology.
  • 57:52 - 57:54
    The largest corporations have access to this technology.
  • 57:54 - 58:00
    Every bank has access to it. The same encryption methods that we are using are the stuff that protects the trillions of dollars
  • 58:00 - 58:05
    in value that banks trade every day.
  • 58:05 - 58:12
    This is technology that is currently being used by everyone except us.
  • 58:12 - 58:15
    All we're doing is levelling the playing field.
  • 58:15 - 58:22
    The same technology that hides data from us, that causes a complete lack of transparency in a downward
  • 58:22 - 58:27
    direction, we can use to level the playing field a little bit.
  • 58:27 - 58:39
    More questions?
  • 58:39 - 58:43
    Thank you for your presentation.
  • 58:43 - 58:51
    Could we add to the challenges that maybe we could produce it somewhere that is not a communist dictatorship?
  • 58:51 - 58:54
    Because I saw the label "Made in China", so I think it is just
  • 58:54 - 59:00
    a paradox to produce something like the FreedomBox in this country, and I would also like to be independent
  • 59:00 - 59:07
    from producing in China. So that's just something for a challenge I think.
  • 59:07 - 59:10
    That's a really good question and important point.
  • 59:10 - 59:16
    So, we're not a hardware project. Hardware is really really hard to do right and do well.
  • 59:16 - 59:19
    We have some hardware hackers on our project.
  • 59:19 - 59:25
    Our tech lead Bdale Garbee does amazing work with satellites and model rockets and altimeters,
  • 59:25 - 59:28
    and he's brilliant. But this is not a hardware project.
  • 59:28 - 59:31
    All we can do is use hardware that already exists.
  • 59:31 - 59:37
    When the world makes hardware in places other than China, we will use that hardware.
  • 59:37 - 59:41
    Right now, we don't have a lot of options.
  • 59:41 - 59:46
    And we're not going to deny everybody privacy because we don't have a lot of hardware options.
  • 59:46 - 59:48
    When we have those options we'll take them.
  • 59:48 - 59:51
    In the meantime, if you are a person who really cares about this issue,
  • 59:51 - 59:55
    don't buy a FreedomBox.
  • 59:55 - 59:58
    Take the software, go find a computer that isn't made in China,
  • 59:58 - 60:02
    LAUGHTER
  • 60:02 - 60:05
    and go put the software on that box.
  • 60:05 - 60:11
    If you want a solution that is run on computers that don't exist, I can't help you with that.
  • 60:11 - 60:15
    If you want a solution that runs, I might be able to help you with that.
  • 60:15 - 60:20
    But yes, I agree that that is a real issue, and we are thinking about that.
  • 60:20 - 60:25
    We believe that there is an open hardware project story here.
  • 60:25 - 60:28
    And one thing we've been doing is working with the manufacturer of the box,
  • 60:28 - 60:32
    to get the code free, to make sure we know what's in it,
  • 60:32 - 60:35
    so that there are no binary blobs in the box,
  • 60:35 - 60:38
    so we have some assurances that we actually do have freedom.
  • 60:38 - 60:45
    At some point though, we do believe that somebody will solve the open hardware problem for us.
  • 60:45 - 60:50
    We're not going to be the hardware project, but there are people trying to do this in an open way.
  • 60:50 - 60:54
    Raspberry Pi, for example. They're not quite right for our use-case, but those kinds of projects
  • 60:54 - 60:58
    are starting to exist, and they're starting to be really good.
  • 60:58 - 61:01
    In a few years, maybe that will be the thing we move onto.
  • 61:01 - 61:09
    Now, I'm guessing that even an open hardware project like Raspberry Pi does its manufacturing in
  • 61:09 - 61:14
    a place like China. And that's a big problem.
  • 61:14 - 61:19
    When the world is ready with a solution to that, we will be ready to accept that solution and adopt it
  • 61:19 - 61:22
    of course.
  • 61:22 - 61:30
    Any more questions for James? or statements?
  • 61:33 - 61:37
    This is more of a statement than a question I guess,
  • 61:37 - 61:42
    but should the FreedomBox start being made in China there will be a lot more of them coming out of
  • 61:42 - 61:46
    the back door and enabling privacy for people that don't get
  • 61:46 - 61:51
    it, but also as soon as it starts getting manufactured I'd imagine you may,
  • 61:51 - 61:54
    because you're not in it for the money as you told me last night,
  • 61:54 - 61:59
    you may be looking forward to how easy it will be to copy,
  • 61:59 - 62:05
    and with things like MakerBot, making a case, making a box is easy,
  • 62:05 - 62:08
    you can do it in your bedroom now with 3D printers.
  • 62:08 - 62:15
    So there will be a bag of components, a board, made by some online place that is really into this,
  • 62:15 - 62:18
    and you can assemble these at home.
  • 62:18 - 62:22
    So you've just got to get it out there first I think, and lead the way.
  • 62:22 - 62:29
    Yeah, I think that's quite right in that we are not the only place to get a box like this.
  • 62:29 - 62:34
    I mean, we're putting it on a specific box to make it easy, but there will be lots of places that make
  • 62:34 - 62:40
    boxes, and hopefully there will be places where working conditions are acceptable to everybody.
  • 62:40 - 62:43
    And at that point you can make your own boxes,
  • 62:43 - 62:44
    you can put them on any box you can find.
  • 62:44 - 62:46
    The point of Free Software is not to lock you into a service,
  • 62:46 - 62:53
    a technology, a piece of software, a structure, or a box.
  • 62:53 - 62:53
    We're not going to lock you into anything, that's one thing we're extremely clear about.
  • 62:53 - 63:00
    If you manage to make a box like this at home, I would really love to hear about it.
  • 63:00 - 63:06
    If you can spin up a MakerBot to make a case,
  • 63:06 - 63:08
    and you have a friend who can etch boards,
  • 63:08 - 63:10
    and you make a box like this at home,
  • 63:10 - 63:14
    that would be big news and a lot of people would want to know about it.
  • 63:14 - 63:22
    More statements or questions? Yes...
  • 63:22 - 63:31
    So, if you lose your box and get a new one, how is it going to reauthenticate to the boxes of your friends?
  • 63:31 - 63:34
    I think I didn't get that one.
  • 63:34 - 63:39
    Yeah, so, the good thing about friends is that they don't actually know you by your PGP key.
  • 63:39 - 63:48
    Sorry, I didn't specify it: if you want strong security and you want distribution to more than 12 friends,
  • 63:48 - 63:54
    so let's say a hundred, and they're like, all over the world.
  • 63:54 - 63:59
    You are probably going to reach them through the internet to get your key parts back,
  • 63:59 - 64:05
    and you are probably not going to be able to use the FreedomBox to get a new one because
  • 64:05 - 64:06
    it has to be authenticated.
  • 64:06 - 64:09
    So how do you do it?
  • 64:09 - 64:10
    Well, you at that point...
  • 64:10 - 64:14
    if you don't have a FreedomBox, the FreedomBox can't provide you with a solution to that problem.
  • 64:14 - 64:16
    What you're going to have to do,
  • 64:16 - 64:19
    is perhaps call your friends.
  • 64:19 - 64:20
    Have a conversation with them,
  • 64:20 - 64:23
    convince them that you are the person you say you are.
  • 64:23 - 64:27
    Reference your shared experiences, maybe they know your voice,
  • 64:27 - 64:33
    maybe they just know who you are by the way that you act and the way that you talk.
  • 64:33 - 64:37
    There's not going to be any one way that we get our keys back.
  • 64:37 - 64:41
    If you lose your key, yeah, we're not saying that's never going to be a problem.
  • 64:41 - 64:43
    And I wouldn't recommend splitting your key up among a hundred people,
  • 64:43 - 64:48
    because that's a lot of people to ask for your key back.
  • 64:48 - 64:53
    The mechanism I have in mind is not that you get a little bit of your key from
  • 64:53 - 64:56
    everyone you know, it's that you spread out the key among
  • 64:56 - 65:00
    a lot of people, and you need a certain number of those people.
  • 65:00 - 65:02
    So maybe it's five of seven of your friends.
  • 65:02 - 65:06
    So you give seven people the key, but any five of them could give you a whole key.
  • 65:06 - 65:09
    So in case you can't reach somebody you can still manage to do it.
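    The k-of-n scheme being described here matches Shamir's secret sharing. Below is a toy sketch, splitting a number into seven shares of which any five recover it; a real key-escrow feature would use a vetted implementation and handle real key material, so treat this purely as a demo of the idea.

```python
import random

# Toy sketch of 5-of-7 threshold sharing (Shamir's scheme) over a
# prime field. Demo only: real key escrow needs a vetted library and
# proper encoding of key material into field elements.

PRIME = 2**127 - 1  # a Mersenne prime, comfortably larger than the demo secret

def split(secret, n, k):
    """Return n shares; any k of them recover `secret`."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split(123456789, n=7, k=5)
assert recover(shares[:5]) == 123456789  # any five of the seven suffice
```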
  • 65:09 - 65:12
    And we can make that access control as fine-grained as we want,
  • 65:12 - 65:15
    but a hundred would be overwhelming.
  • 65:15 - 65:20
    We wouldn't do that. Sure, you could do it if you wanted,
  • 65:20 - 65:23
    but I don't think you'll have a hundred friends you could trust that much.
  • 65:23 - 65:26
    Maybe you do, I don't.
  • 65:26 - 65:33
    More questions, statements?
  • 65:33 - 65:39
    Yes?
  • 65:39 - 65:47
    Erm, it's just a wish... but have you thought about the idea of using the FreedomBox to create
  • 65:47 - 65:51
    a community where you can exchange not only data but like
  • 65:51 - 65:58
    products or services, so that would maybe like, change the system?
  • 65:58 - 66:04
    One of the things we want to do with the FreedomBox is
  • 66:04 - 66:10
    create a thing that looks a lot like your current social networking,
  • 66:10 - 66:12
    minus the advertising and the spying.
  • 66:12 - 66:16
    A way to talk to all your friends at once.
  • 66:16 - 66:20
    Once you have a place, a platform, where you can communicate
  • 66:20 - 66:23
    with your friends, you can build on that platform
  • 66:23 - 66:25
    and you can create structures like that.
  • 66:25 - 66:29
    If we make a thing that has programmable interfaces, so
  • 66:29 - 66:32
    you can make apps for it, you can make an app like that,
  • 66:32 - 66:34
    if that's important to you.
  • 66:34 - 66:38
    What people do with the communication once they have it,
  • 66:38 - 66:40
    we don't have any opinions about.
  • 66:40 - 66:43
    We want them to do everything that's important to them.
  • 66:43 - 66:45
    And I think something like that could be important,
  • 66:45 - 67:03
    and yeah, that would be amazing if that were to emerge.
  • 67:03 - 67:08
    Some things I believe are easier to do in a centralized architecture than a decentralized one,
  • 67:08 - 67:12
    for example search, or services that require a lot of bandwidth.
  • 67:12 - 67:16
    I don't see how you can run something like YouTube on the FreedomBox.
  • 67:16 - 67:18
    So is your utopian vision one where everything is decentralized,
  • 67:18 - 67:23
    or is it ok to have some centralized pieces in a future network?
  • 67:23 - 67:28
    Look, if you're going to grant me my utopia then of course everything is decentralized.
  • 67:28 - 67:31
    But we don't live in a utopia, I don't have magic.
  • 67:31 - 67:38
    We actually have in our flowchart a box labeled "magic routing",
  • 67:38 - 67:41
    because routing is hard to do in a decentralized way...
  • 67:41 - 67:44
    You need someone to tell you where the IPs are.
  • 67:44 - 67:47
    And that's hard to do in a decentralized way.
  • 67:47 - 67:52
    We haven't solved it, and we don't think we're going to fully solve it.
  • 67:52 - 67:54
    We hope someone else solves it first of all.
  • 67:54 - 67:56
    But second of all, we don't know where the compromises are.
  • 67:56 - 67:59
    Some things are not possible to decentralize.
  • 67:59 - 68:01
    We're going to decentralize as much as we can,
  • 68:01 - 68:04
    but we're not committing to doing anything impossible.
  • 68:04 - 68:06
    If you can't run YouTube off this box,
  • 68:06 - 68:08
    which I disagree with by the way,
  • 68:08 - 68:10
    then you won't, because it's impossible.
  • 68:10 - 68:12
    If you want to run YouTube on this box you turn all your
  • 68:12 - 68:14
    friends into your content delivery network,
  • 68:14 - 68:16
    and all your friends parallelize the distribution of the content,
  • 68:16 - 68:18
    you share the bandwidth.
  • 68:18 - 68:20
    It's ad-hoc, BitTorrent-like functionality.
  • 68:20 - 68:24
    Yes, that technology doesn't exist yet, I just made all that up,
  • 68:24 - 68:27
    but we can do it.
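    The speaker flags this as improvised, so the sketch below is equally improvised: split a file into chunks, assign each chunk to a friend's box round-robin, and keep hashes so a viewer can verify what it fetches. Every name here is hypothetical.

```python
import hashlib

# Improvised sketch of friends-as-CDN: chunk a file, assign chunks
# round-robin across friends' boxes, record hashes for verification.
# Hostnames and the transport are hypothetical.

CHUNK = 256 * 1024  # 256 KiB per chunk

def chunk_plan(path, friends):
    """Return (chunk_index, friend_host, sha256_hex) for each chunk."""
    plan = []
    with open(path, "rb") as f:
        index = 0
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            plan.append((index, friends[index % len(friends)],
                         hashlib.sha256(data).hexdigest()))
            index += 1
    return plan

# plan = chunk_plan("talk.webm", ["alice.box", "bob.box", "carol.box"])
# A viewer fetches chunk i from plan[i][1] and checks it against plan[i][2].
```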
  • 68:27 - 68:32
    The parts that are hard though, the things like the routing,
  • 68:32 - 68:35
    there will be real compromises.
  • 68:35 - 68:36
    There will be real trade-offs.
  • 68:36 - 68:39
    There will be places where we'll say, you know what, we have
  • 68:39 - 68:41
    to rely on the DNS system.
  • 68:41 - 68:44
    Everybody in this room knows that the DNS system has some
  • 68:44 - 68:48
    security problems, some architectural problems that make it
  • 68:48 - 68:51
    a thing we would ideally not have to rely on.
  • 68:51 - 68:55
    But you know what? This project is not going to be able to replace DNS.
  • 68:55 - 68:59
    There are plenty of alternate DNS proposals out there, but we are not going to
  • 68:59 - 69:02
    just chuck the old DNS system, because we want people
  • 69:02 - 69:05
    to be able to get to the box, even if they don't have a box.
  • 69:05 - 69:09
    We want you to be able to serve services to the public.
  • 69:09 - 69:13
    We are going to use a lot of structures that are less than ideal.
  • 69:13 - 69:16
    We're assuming that TCP/IP is there...
  • 69:16 - 69:19
    in the normal use case you're using the internet backbone
  • 69:19 - 69:22
    to do your communication.
  • 69:22 - 69:25
    The mesh routing story we talked about is not how you do
  • 69:25 - 69:30
    your normal use. That's an emergency mode if there's a crisis, a political instability, a tsunami,
  • 69:30 - 69:35
    if you can't get to your regular internet because it has failed you in some way because
  • 69:35 - 69:38
    it has become oppressive or inaccessible.
  • 69:38 - 69:40
    Then you would use something like the mesh network.
  • 69:40 - 69:44
    But in the normal course of business, you are using
  • 69:44 - 69:47
    a thing that is less than ideal, and that's a trade-off.
  • 69:47 - 69:49
    We can't as a project protect you from everything.
  • 69:49 - 69:51
    We are going to look for the places where we can make
  • 69:51 - 69:54
    effective protection. We are going to try to make clear
  • 69:54 - 69:57
    the limits of that protection. And we're going to give you
  • 69:57 - 69:59
    everything we can.
  • 69:59 - 70:05
    And then, as we move forward, when opportunities to solve new problems present themselves,
  • 70:05 - 70:08
    we'll take them.
  • 70:08 - 70:16
    Well, I have to add that the talk we had before was unfortunately in German, so you couldn't
  • 70:16 - 70:19
    understand a lot.
  • 70:19 - 70:22
    I didn't understand it but I could tell that it was occurring at a very high level of technical competence
  • 70:22 - 70:25
    and that there was a lot of good information there.
  • 70:25 - 70:28
    And I'm really hoping that you'll take the video of it and put it up on universalsubtitles.org, or some
  • 70:28 - 70:33
    other service where people can subtitle it. And hopefully there'll be an English version and I'll get
  • 70:33 - 70:35
    to see it. I think there was a lot of really good information in there.
  • 70:35 - 70:38
    What's universalsubtitles.org?
  • 70:38 - 70:46
    Universalsubtitles.org is a great website. It's kind of like, you put a video up, and anyone can
  • 70:46 - 70:49
    add subtitles to as much or as little as they want.
  • 70:49 - 70:53
    And then other people can change the subtitles, and you can do it in as many languages as you want.
  • 70:53 - 70:59
    So you don't have to ask someone for a favour, "hey, will you subtitle my video?"
  • 70:59 - 71:03
    that's 20 minutes long or an hour long. You tell a community of people "we need help subtitling",
  • 71:03 - 71:08
    and everyone goes and subtitles 3 minutes in their favourite languages.
  • 71:08 - 71:15
    It's a very effective way to crowdsource subtitling, and it's a very effective way to just share information.
  • 71:15 - 71:20
    We have a lot of videos with good information that are locked into languages that not everyone speaks.
  • 71:20 - 71:22
    So this is a way to get around that.
  • 71:22 - 71:25
    At FreedomBox, we use that project.
  • 71:25 - 71:28
    And I believe, if I'm not mistaken, I haven't looked in a while,
  • 71:28 - 71:33
    that it's all Free software that they are using. So you can download it and start your own if you want.
  • 71:33 - 71:41
    So back to my previous question - in the talk in the afternoon we heard about mesh networking
  • 71:41 - 71:44
    we talked about that, and it's actually not just being used in
  • 71:44 - 71:46
    emergency situations but people are really using it.
  • 71:46 - 71:52
    And especially, the philosophy that everyone becomes part of the net not just as a consumer
  • 71:52 - 71:58
    but as a provider of part of the net, means that they
  • 71:58 - 72:01
    can share data among each other, they don't necessarily need
  • 72:01 - 72:03
    to go out to the internet.
  • 72:03 - 72:07
    So, I would imagine the FreedomBox, with mesh networking,
  • 72:07 - 72:10
    we could essentially create a large network of many many
  • 72:10 - 72:12
    people using it.
  • 72:12 - 72:17
    We also talked about the mesh networking like FunkFeuer in Graz or Vienna
  • 72:17 - 72:21
    but it would be interesting to get them on mobile devices,
  • 72:21 - 72:23
    so that you could walk through the street,
  • 72:23 - 72:30
    theoretically people have these devices, and you could walk
  • 72:30 - 72:32
    through and it would automatically mesh and connect you.
  • 72:32 - 72:37
    So FreedomBox if applied to that, you told me this interesting example, you could screw them to
  • 72:37 - 72:41
    light posts on the street, so maybe elaborate on that,
  • 72:41 - 72:44
    maybe it could have an effect and give a lot of coverage.
  • 72:44 - 72:48
    The reason why we currently envision mesh,
  • 72:48 - 72:50
    and no decisions have been made, right,
  • 72:50 - 72:54
    but just in the way we think about it when we talk to each other,
  • 72:54 - 72:58
    and the reason why we think mesh networking is not your daily
  • 72:58 - 73:03
    mode of use is that the performance degradation is not acceptable to most end-users.
  • 73:03 - 73:06
    If mesh networking reaches the point where it is acceptable
  • 73:06 - 73:09
    if you're in a place where there's enough nodes, and you
  • 73:09 - 73:13
    have a density that you can move around then sure, that
  • 73:13 - 73:15
    can make a lot of sense. But for a lot of people who
  • 73:15 - 73:19
    aren't near a lot of FreedomBoxes, they're
  • 73:19 - 73:21
    going to need the regular internet.
  • 73:21 - 73:26
    So yeah, we think mesh will be great where you have that
  • 73:26 - 73:29
    density, when the mesh technology is mature.
  • 73:29 - 73:33
    When that happens, we could have the easiest access
  • 73:33 - 73:38
    to municipal wifi by using the power in all the street
  • 73:38 - 73:43
    lights. Put a FreedomBox up in the top of every street lamp.
  • 73:43 - 73:47
    Unscrew the light bulb, screw in the FreedomBox, and screw the light bulb back on top.
  • 73:47 - 73:51
    So you still get light, we're not going to plunge you into darkness.
  • 73:51 - 73:56
    You still get light, but then you have a mesh node. Right there.
  • 73:56 - 74:00
    And you could do every 3rd or 4th street light downtown, and you could cover
  • 74:00 - 74:02
    an area rather effectively.
  • 74:02 - 74:07
    It is a way to get simple municipal wifi without running
  • 74:07 - 74:10
    any fibre. And every time you have fibre you can link to it.
  • 74:10 - 74:13
    Like any time you're near fibre you can link to it and you'll
  • 74:13 - 74:18
    get your information out of that little mesh and into the regular network.
  • 74:18 - 74:23
    We could have municipal wifi with much lower infrastructure costs than most people currently think of
  • 74:23 - 74:28
    when they think of municipal wifi. And we can do it through mesh nodes.
  • 74:28 - 74:33
    And if we did it through mesh nodes we would be providing that service not only to people who have
  • 74:33 - 74:38
    FreedomBoxes; to everyone else it just looks like wifi, it just looks like a regular connection.
  • 74:38 - 74:45
    You might need to do some fancy hopping, but it's not...
  • 74:45 - 74:51
    the mesh boxes themselves will do the fancy hopping, your phone itself won't have to do it.
  • 74:51 - 74:54
    While we are talking about phones,
  • 74:54 - 74:59
    I want to say that I'm not sure how phones fit into the FreedomBox.
  • 74:59 - 75:02
    I'm pretty sure there is a way that phones fit into FreedomBoxes,
  • 75:02 - 75:05
    but you can't trust your phone.
  • 75:05 - 75:09
    With the so-called smartphones it's not a phone actually but a little computer, no?
  • 75:09 - 75:12
    Yes, your phone, a smartphone is a little computer but
  • 75:12 - 75:16
    it's not a computer that you can trust, because
  • 75:16 - 75:20
    even if you replace the software on your phone,
  • 75:20 - 75:26
    with Free software, it's almost impossible to actually replace all the binary drivers,
  • 75:26 - 75:29
    it's almost impossible to go all the way down to the metal.
  • 75:29 - 75:31
    It's very hard to get a phone that is completely trustworthy
  • 75:31 - 75:35
    all the way down to the bottom of the stack.
  • 75:35 - 75:37
    So that's a problem we haven't quite figured out how to solve.
  • 75:37 - 75:42
    And pretty soon it's going to be impossible to put Free software on phones.
  • 75:42 - 75:47
    The days of jailbreaking your iPhone and rooting your Android phone might
  • 75:47 - 75:55
    very well come to an end. There is a proposal right now called UEFI.
  • 75:55 - 76:01
    It's a standard. We currently use EFI, this would be UEFI.
  • 76:01 - 76:03
    I don't know what it stands for, it's a new thing.
  • 76:03 - 76:08
    And what this proposal is, is that before your computer,
  • 76:08 - 76:14
    before the BIOS will load a bootloader on your computer
  • 76:14 - 76:17
    that BIOS has to authenticate, sorry, that bootloader has
  • 76:17 - 76:20
    to authenticate to the BIOS. It has to be signed by someone
  • 76:20 - 76:23
    the BIOS trusts, someone the BIOS manufacturer trusts.
  • 76:23 - 76:25
    And the person who puts the BIOS in your phone can decide who it trusts,
  • 76:25 - 76:29
    and they can decide they don't trust anyone except themselves.
  • 76:29 - 76:36
    If Apple sells you an iPhone with a BIOS that requires a
  • 76:36 - 76:39
    signed operating system, it might be very hard for you to
  • 76:39 - 76:43
    get another version of the operating system on there.
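    Schematically, the check being described looks like the sketch below, with the Python cryptography library standing in for what the firmware does in hardware: refuse to boot any image that is not signed by a key the vendor baked in. The file names and single-key policy are illustrative assumptions.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Hedged sketch of a signed-boot check: the firmware verifies the
# bootloader image against a vendor-installed public key and refuses
# to run anything else. Names and key policy are illustrative.

def firmware_will_boot(image_path, sig_path, vendor_key_path):
    with open(vendor_key_path, "rb") as f:
        vendor_key = serialization.load_pem_public_key(f.read())
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        vendor_key.verify(signature, image,
                          padding.PKCS1v15(), hashes.SHA256())
        return True   # signed by a trusted key: boot proceeds
    except InvalidSignature:
        return False  # self-built or unsigned image: boot refused
```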
  • 76:43 - 76:49
    The proposals for this stuff are really in the realm of laptops and computers, that's where it's starting,
  • 76:49 - 76:53
    but believe me, technology spreads.
  • 76:53 - 76:58
    And if you want to be able to put Linux on a computer that you buy, on a laptop you buy,
  • 76:58 - 77:03
    very soon you might have a very difficult time doing that.
  • 77:03 - 77:05
    The standard is there, the companies paying attention to it
  • 77:05 - 77:08
    are not paying attention to it for our purposes.
  • 77:08 - 77:12
    They want to make sure that they can control what is on your computer.
  • 77:12 - 77:17
    So this is, you know, another political fight that we're going to engage in,
  • 77:17 - 77:20
    not the FreedomBox, but the community.
  • 77:20 - 77:25
    We're going to have to have this fight. UEFI. Look it up.
  • 77:25 - 77:32
    Start thinking about it. This is going to be a big piece of the puzzle for freedom in computing over
  • 77:32 - 77:34
    the next few years.
  • 77:34 - 77:38
    We're going to have some problems and we're going to have to find some solutions.
  • 77:38 - 77:44
    But wouldn't such an initiative, wouldn't that create a good market for companies who actually
  • 77:44 - 77:49
    would supply Linux on such devices, in the phone and laptop markets?
  • 77:49 - 77:53
    I'm sure there are companies supplying that.
  • 77:53 - 77:54
    Absolutely.
  • 77:54 - 77:58
    And if the market in freedom were good enough to support
  • 77:58 - 78:02
    large-scale manufacturing and all that other stuff then we might get that.
  • 78:02 - 78:05
    And we might get that anyway.
  • 78:05 - 78:07
    I mean, the standard will include as many keys as you want,
  • 78:07 - 78:08
    so we might get the freedom.
  • 78:08 - 78:12
    But the manufacturers will have a really convenient way to turn the freedom off.
  • 78:12 - 78:16
    I think there will be a lot of boxes where you will have freedom.
  • 78:16 - 78:21
    But there will also be a lot where right now we think we can get Free software onto it,
  • 78:21 - 78:24
    where we won't be able to anymore.
  • 78:24 - 78:25
    It's going to be a narrowing of the market.
  • 78:25 - 78:28
    I don't think our freedom is going to completely disappear from devices.
  • 78:28 - 78:33
    But a lot of devices, if you buy the device without thinking about freedom, assuming you can have it,
  • 78:33 - 78:37
    you might get it home and discover that you can't.
  • 78:37 - 78:45
    Ok, we want to give the floor again to the audience for more questions or statements.
  • 78:45 - 78:52
    Ok, there in the back, one more.
  • 78:52 - 78:54
    Yeah, one more time, so...
  • 78:54 - 79:01
    Nowadays, when you can hardly really secure your PC, laptop, whatever, against malware...
  • 79:01 - 79:16
    Isn't it really, a red carpet for hackers to, if you have social networks and circles of friends,
  • 79:16 - 79:21
    one gets some malware on his PC, mobile device, whatever,
  • 79:21 - 79:26
    has a FreedomBox, authenticates to his friends, and the state is assumed secure,
  • 79:26 - 79:32
    wouldn't that open doors?
  • 79:32 - 79:37
    Sure, well, the human error is not one we can control for.
  • 79:37 - 79:45
    But someone who has a key that you trust is not necessarily someone who you let run arbitrary code
  • 79:45 - 79:48
    on your FreedomBox.
  • 79:48 - 79:52
    You might trust them to the point of having message passing with them, and trusting who they are
  • 79:52 - 79:56
    and what they say, but you don't necessarily trust the technology that they have and the
  • 79:56 - 79:58
    code that they have to be free of malware.
  • 79:58 - 80:00
    You'll still have to do all the things you currently do.
  • 80:00 - 80:04
    Right now if somebody sends you a file, it could have malware in it.
  • 80:04 - 80:08
    We're not making that easier, or better, or more likely to happen.
  • 80:08 - 80:15
    I think what we are doing is completely orthogonal to that problem.
  • 80:15 - 80:19
    At the same time, if we were to have email services on the box,
  • 80:19 - 80:23
    and you know we're not quite sure what the email story of a box like this looks like,
  • 80:23 - 80:26
    we probably would want to include some sort of virus scanning or spam catching,
  • 80:26 - 80:31
    all the usual filtering tools to give you whatever measure of protection might currently exist.
  • 80:31 - 80:35
    But the fact someone has a key and you know who they are
  • 80:35 - 80:39
    I don't think that will ever be the security hole.
  • 80:39 - 80:42
    Or at least we really hope we can make it so it's not.
  • 80:42 - 80:48
    If we fail in that then we've missed a trick.
  • 80:48 - 80:53
    Ok, any more statements or questions?
  • 80:53 - 80:56
    Ok, so, James, my last question would be...
  • 80:56 - 80:59
    You can actually buy the box right now?
  • 80:59 - 81:00
    Yes.
  • 81:00 - 81:01
    From a company?
  • 81:01 - 81:02
    Yes.
  • 81:02 - 81:05
    Maybe you can supply that information. But the software is being developed?
  • 81:05 - 81:07
    Yes.
  • 81:07 - 81:11
    Can you give an estimation about the timeline of your project, or the next milestones?
  • 81:11 - 81:13
    Sure.
  • 81:13 - 81:16
    So, the boxes are manufactured by a company called Globalscale,
  • 81:16 - 81:18
    they're about $140 US.
  • 81:18 - 81:24
    There is a slightly older model called the SheevaPlug that is about $90.
  • 81:24 - 81:28
    It does pretty much everything the DreamPlug does.
  • 81:28 - 81:31
    It has some heat sinking issues, but it's a pretty good box as well,
  • 81:31 - 81:38
    so if the price point matters to you, you can get last year's model and it'll serve you just fine.
  • 81:38 - 81:43
    The software, right now we have a bare Linux distribution.
  • 81:43 - 81:45
    We spent a lot of time getting the binary blobs out of the kernel
  • 81:45 - 81:50
    and making it installable onto this hardware target.
  • 81:50 - 81:54
    We have a Jabber server, Prosody, that we are modifying to suit our needs.
  • 81:54 - 82:00
    And that should be ready, time-frame, weeks.
  • 82:00 - 82:03
    Some short number of weeks.
  • 82:03 - 82:09
    The Privoxy server, the SSH forwarding, some short number of months.
  • 82:09 - 82:16
    That's our roadmap for the short-term future: Jabber, SSH forwarding, browser proxying.
  • 82:16 - 82:22
    We also are working on the interface, so we're going to have an interface that you can actually
  • 82:22 - 82:24
    control some of these services with.
  • 82:24 - 82:28
    And the first thing we're doing with that interface is probably allowing you to
  • 82:28 - 82:30
    configure this box as a wireless router.
  • 82:30 - 82:35
    So it can become your wireless access point if you want it to be.
  • 82:35 - 82:38
    And your gateway of course.
  • 82:38 - 82:39
    So user interface in one vertical,
  • 82:39 - 82:44
    SSH forwarding, browser proxying a little bit out there,
  • 82:44 - 82:47
    a little bit closer: Jabber, XMPP secure chat.
  • 82:47 - 82:52
    And once we have that stack, we believe that we're going to build upwards from XMPP towards
  • 82:52 - 82:55
    perhaps something like BuddyCloud.
  • 82:55 - 82:58
    We're seriously looking at BuddyCloud and seeing what problems it solves for us
  • 82:58 - 83:05
    in terms of actually letting users group themselves in ways that they can then do access control
  • 83:05 - 83:08
    and channels and things of that nature.
  • 83:08 - 83:13
    And are you actually in contact with the hardware company producing the servers?
  • 83:13 - 83:19
    Yeah, we've had a number of conversations with them.
  • 83:19 - 83:22
    They've agreed that when our code is ready this is something
  • 83:22 - 83:24
    they are very interested in distributing.
  • 83:24 - 83:26
    More importantly we've had a lot of conversations with
  • 83:26 - 83:28
    them about freedom.
  • 83:28 - 83:31
    About why we do what we do, the way we do.
  • 83:31 - 83:35
    And how they need to act if they want to distribute code for
  • 83:35 - 83:37
    us and work with our community.
  • 83:37 - 83:39
    And what that means is we're teaching them how to comply
  • 83:39 - 83:41
    with the GPL, and we're teaching them how to remove the binary drivers,
  • 83:41 - 83:45
    and in fact we're doing some of that for them.
  • 83:45 - 83:47
    But they're Chinese, right?
  • 83:47 - 83:49
    No. No, Globalscale is not a Chinese company.
  • 83:49 - 83:53
    Their manufacturing is in China, but they're not a Chinese company.
  • 83:53 - 83:58
    And we're also talking to Marvell. Marvell makes the system-on-a-chip that goes onto the boards
  • 83:58 - 84:00
    that Globalscale is integrating into their boxes.
  • 84:00 - 84:05
    But we're also talking to Marvell about what they can do to better serve the needs of our community.
  • 84:05 - 84:13
    So a large part of our efforts is to try to convince manufacturers to make
  • 84:13 - 84:14
    hardware that suits our needs.
  • 84:14 - 84:16
    This box is a thing that they developed, they invented,
  • 84:16 - 84:18
    before they ever met us, before they ever heard of us.
  • 84:18 - 84:23
    And if we can get them enough business,
  • 84:23 - 84:27
    if by making FreedomBoxes and by putting our software on the box,
  • 84:27 - 84:30
    that enables them to sell more boxes they will be very happy
  • 84:30 - 84:34
    and when they design the next generation,
  • 84:34 - 84:39
    not the next generation of the DreamPlug, but the next generation after whatever they're designing now,
  • 84:39 - 84:41
    so we're talking a couple of years from now.
  • 84:41 - 84:44
    We can say to them, look, you're selling a lot of boxes
  • 84:44 - 84:48
    because you're making a thing that serves the free world very well.
  • 84:48 - 84:52
    Remove the eighth-inch audio jack because our people don't need it.
  • 84:52 - 84:55
    Add a second wifi radio. Put antenna ports on it.
  • 84:55 - 85:00
    This box can go from something that looks really good for our purpose to
  • 85:00 - 85:02
    being something that looks amazingly good for our purpose.
  • 85:02 - 85:05
    And that will require scale.
  • 85:05 - 85:07
    And what that means is that the FreedomBox becomes a wedge for
  • 85:07 - 85:13
    making better hardware for everyone.
  • 85:13 - 85:16
    But it's not just the FreedomBox. The Tor router project is
  • 85:16 - 85:21
    also focused on the DreamPlug. They've also decided this is a good box for their purpose.
  • 85:21 - 85:26
    If you are making a box that is kind of like a FreedomBox but isn't the FreedomBox because
  • 85:26 - 85:30
    it's more specialised to what you want it for, think about
  • 85:30 - 85:35
    the DreamPlug as a hardware target. And let us know,
  • 85:35 - 85:38
    so that when we go to the company, we can say look,
  • 85:38 - 85:42
    look at all the business you are getting by being people that serve the Free world.
  • 85:42 - 85:52
    And then, hopefully, we can convince them to make boxes that better serve the Free world.
  • 85:52 - 85:55
    And that's not a fantasy. We are having those conversations with them,
  • 85:55 - 85:57
    and they are very receptive.
  • 85:57 - 86:00
    So I am pretty happy about that aspect of what we do.
  • 86:00 - 86:02
    And my last question would be...
  • 86:02 - 86:05
    since everything is turning mobile now,
  • 86:05 - 86:07
    it's like we have these computers with an extra phone...
  • 86:07 - 86:08
    the phone is a small application on these devices.
  • 86:08 - 86:13
    Is there any plan or any idea or any project to say like, have
  • 86:13 - 86:18
    a FreedomPhone or Free mobile device?
  • 86:18 - 86:23
    So the way you connect to this box is kind of how you connect to your router,
  • 86:23 - 86:24
    port 80, browser.
  • 86:24 - 86:28
    But another way you could do it would be an app on your cellphone that bluetooths to the box.
  • 86:28 - 86:33
    I don't actually think the box has bluetooth, but you know,
  • 86:33 - 86:36
    an app on your cellphone that talks to the box over the network, say.
  • 86:36 - 86:38
    That's possible, we're thinking about that.
  • 86:38 - 86:41
    We're thinking about what that looks like for the large population
  • 86:41 - 86:43
    that exists out there that doesn't have computers.
  • 86:43 - 86:46
    There's an awful lot of people that only have cellphones, they don't have computers.
  • 86:46 - 86:49
    And we want them to have freedom too.
  • 86:49 - 86:50
    So figuring out how we can use a cellphone to talk to the box is a future problem.
  • 86:50 - 86:51
    We're not working on it right now, but we're certainly talking
  • 86:51 - 86:57
    about where it fits into the roadmap.
  • 86:57 - 87:01
    And that's why we are concerned about whether or not you
  • 87:01 - 87:05
    can trust your phone.
  • 87:05 - 87:07
    Because if you can trust your FreedomBox, but not the
  • 87:07 - 87:09
    thing you use to access it then you don't really have the privacy you think you have.
  • 87:09 - 87:12
    So, figuring out, can you trust your cellphone? Is a big part of the puzzle.
  • 87:12 - 87:17
    It's a big thing that we don't know how to do yet.
  • 87:17 - 87:21
    So let me make a little advertisement for another interesting project,
  • 87:21 - 87:24
    there is a Spanish development, I think it is also produced in China,
  • 87:24 - 87:26
    but it's called Geeksphone.
  • 87:26 - 87:30
    And they have a compatible Android installation by default,
  • 87:30 - 87:34
    and they are probably having a similar philosophy to keep the hardware open.
  • 87:34 - 87:36
    So maybe there is a new cooperation on the horizon.
  • 87:36 - 87:40
    Oh yeah, we love projects like that.
  • 87:40 - 87:41
    I don't know a lot about their project, but I have heard of it
  • 87:41 - 87:44
    and it is on my list of things to look into.
  • 87:44 - 87:47
    I would love to see that succeed, that would be excellent.
  • 87:47 - 87:50
    Well James, thank you for your presentation.
  • 87:50 - 87:54
    I think it was really interesting. And thank you for coming.
  • 87:54 - 87:57
    James will be back on this stage at 7pm when we have our final discussion on the 20 years of
  • 87:57 - 88:03
    the world wide web.
  • 88:03 - 88:05
    Thank you James for coming.
  • 88:05 - 88:12
    APPLAUSE
Title:
Presentation of the FreedomBox from James Vasile
Description:

"Freedom, Out of the Box!" -- A Presentation of the FreedomBox -- Elevate 2011 -- Forum Stadtpark, 24.10.2011 14:30, with James Vasile (FreedomBox Foundation / US)