Richard Stallman, A Free Digital Society (CC BY-ND)

  • 0:10 - 0:14
    Thank you. Congratulations to the students
    [of the Digital Freedoms association]
  • 0:14 - 0:16
    who organized this meeting.
  • 0:16 - 0:20
    They are right to do so, not only
    because Richard is famous
  • 0:20 - 0:22
    but also because the subject
    is very interesting to us.
  • 0:22 - 0:25
    It is especially interesting
    here at Sciences Po
  • 0:25 - 0:29
    since first, we try to study
    controversies;
  • 0:29 - 0:31
    some of you here have studied
    controversies
  • 0:31 - 0:34
    and Richard himself
    is a controversial character.
  • 0:34 - 0:37
    I have found nothing on the Web
    that shows any consensus
  • 0:37 - 0:45
    about what he does, what he says, or
    the words he uses, so he's an ideal case study
  • 0:45 - 0:49
    for those of us who study the cartography
    of controversies in this institution.
  • 0:49 - 0:52
    In a way, we have a controversial character
  • 0:52 - 0:54
    on subjects that are important to us.
  • 0:54 - 0:57
    Second, obviously, there is the subject itself:
  • 0:57 - 1:01
    the questions of freedom and control
  • 1:05 - 1:08
    that are at the core of all these
    digital innovations
  • 1:08 - 1:11
    are of direct interest to political science,
  • 1:11 - 1:14
    sociology, and law as well.
  • 1:14 - 1:18
    All questions that interest us
    in this institution.
  • 1:18 - 1:22
    It's by the way interesting that
    Richard comes here
  • 1:22 - 1:28
    a few days after Steve Jobs died, a death that
    he celebrated in his own way, if I dare say so,
  • 1:28 - 1:32
    explaining that this loss was not
    a huge disaster
  • 1:32 - 1:36
    for all these questions of
    digital freedom,
  • 1:36 - 1:40
    because, even though Steve Jobs
    could be esteemed,
  • 1:40 - 1:43
    the control he had over software
  • 1:43 - 1:47
    and its dominance leave us
    free only in a certain way,
  • 1:47 - 1:51
    in the form, as he has said
    in several opinion pieces,
  • 1:51 - 1:54
    of a "jail made cool",
    a jail that I like myself,
  • 1:54 - 1:57
    since I am myself
    a devotee of the Macintosh
  • 1:57 - 1:59
    but this is obviously not
    the kind of character
  • 1:59 - 2:02
    or the kind of freedom that Richard
    is thinking about.
  • 2:02 - 2:04
    The third reason is that, obviously,
  • 2:04 - 2:07
    among the many questions
    that directly interest us
  • 2:07 - 2:09
    in this institution, in the
    scientific humanities,
  • 2:09 - 2:12
    it is the link between technical innovation
  • 2:12 - 2:15
    and political arrangements
    that interests us directly.
  • 2:15 - 2:18
    So we have, and I as scientific
    director certainly have,
  • 2:18 - 2:23
    three reasons to welcome
    the initiative of the student group
  • 2:23 - 2:25
    that organized this meeting,
  • 2:25 - 2:28
    and I am happy to let
    Richard Stallman speak
  • 2:28 - 2:31
    and please applaud him first.
  • 2:38 - 2:43
    Projects with the goal of digital
    inclusion are making a big assumption.
  • 2:43 - 2:49
    They are assuming that participating
    in a digital society is good;
  • 2:49 - 2:51
    but that’s not necessarily true.
  • 2:51 - 2:55
    Being in a digital society can be good or bad
  • 2:55 - 3:01
    depending on whether that
    digital society is just or unjust.
  • 3:02 - 3:09
    There are many ways in which our freedom
    is being attacked by digital technology.
  • 3:09 - 3:17
    Digital technology can make things worse,
    and it will, unless we fight to prevent it.
  • 3:21 - 3:26
    Therefore, if we have an unjust digital society
  • 3:26 - 3:30
    we should cancel these projects
    for digital inclusion
  • 3:30 - 3:34
    and launch projects for digital extraction.
  • 3:34 - 3:41
    We have to extract people from digital society
    if it doesn’t respect their freedom;
  • 3:42 - 3:45
    or we have to make it respect their freedom.
  • 3:45 - 3:50
    So, what are the threats? First, surveillance.
  • 3:50 - 3:54
    Computers are Stalin’s dream.
  • 3:54 - 3:58
    They are ideal tools for surveillance
  • 3:58 - 4:03
    because anything we do with computers,
    the computers can record.
  • 4:03 - 4:05
    They can record the information
  • 4:05 - 4:11
    in a perfectly indexed
    searchable form in a central database
  • 4:11 - 4:17
    ideal for any tyrant
    who wants to crush opposition.
  • 4:20 - 4:24
    Surveillance is sometimes done
    with our own computers.
  • 4:25 - 4:30
    For instance, if you have a computer
    that’s running Microsoft Windows
  • 4:30 - 4:33
    that system is doing surveillance.
  • 4:33 - 4:38
    There are features in Windows
    that send data to some server.
  • 4:38 - 4:41
    Data about the use of the computer.
  • 4:41 - 4:46
    A surveillance feature was discovered
    in the iPhone a few months ago
  • 4:46 - 4:49
    and people started calling it
    the “spy-phone”.
  • 4:51 - 4:55
    Flash player has a surveillance feature too
  • 4:55 - 4:58
    and so does the Amazon “Swindle”.
  • 4:59 - 5:04
    They call it the Kindle,
    but I call it the Swindle, l’escroc
  • 5:04 - 5:08
    because it’s meant to swindle
    users out of their freedom.
  • 5:09 - 5:14
    It makes people identify themselves
    whenever they buy a book
  • 5:14 - 5:21
    and that means Amazon has a giant list
    of all the books each user has read.
  • 5:21 - 5:25
    And such a list must not exist anywhere.
  • 5:27 - 5:32
    Most portable phones
    will transmit their location
  • 5:32 - 5:38
    computed using GPS, on remote command.
  • 5:39 - 5:46
    And the phone company is accumulating
    a giant list of places that the user has been.
  • 5:46 - 5:51
    A German from the Green Party
    asked the phone company
  • 5:51 - 5:56
    to give him the data it had
    about where he was.
  • 5:56 - 6:00
    He had to sue, he had to go
    to court to get this information.
  • 6:00 - 6:06
    And when he got it, he received
    forty-four thousand location points
  • 6:06 - 6:08
    for a period of six months.
  • 6:08 - 6:12
    That’s more than two hundred per day.
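A quick check of the arithmetic above (an editorial sketch, not part of the talk): 44,000 points over roughly six months does come to more than two hundred per day.

```python
# Rough check of the figures quoted in the talk: 44,000 location
# points accumulated over about six months (~183 days).
points = 44_000
days = 183  # approximate length of six months

per_day = points / days
print(per_day > 200)   # True: well over two hundred points per day
print(round(per_day))  # roughly 240
```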
  • 6:13 - 6:20
    So what that means is someone could form
    a very good picture of his activities
  • 6:20 - 6:24
    just by looking at that data.
  • 6:28 - 6:33
    We can stop our own computers
    from doing surveillance on us
  • 6:33 - 6:37
    if we have control
    of the software that they run.
  • 6:38 - 6:43
    But the people running that software
    don’t have control over it.
  • 6:43 - 6:49
    It’s non-free software and that’s why
    it has malicious features, such as surveillance.
  • 6:50 - 6:55
    However, the surveillance is not
    always done with our own computers
  • 6:55 - 6:58
    it’s also done at one remove.
  • 6:58 - 7:02
    For instance ISPs in Europe are required
  • 7:02 - 7:09
    to keep data about the user’s
    internet communications for a long time
  • 7:09 - 7:16
    in case the State decides to investigate
    that person later
  • 7:16 - 7:19
    for whatever imaginable reason.
  • 7:22 - 7:27
    And with a portable phone,
    even if you can stop the phone
  • 7:27 - 7:30
    from transmitting your GPS location
  • 7:30 - 7:34
    the system can determine
    the phone’s location approximately
  • 7:34 - 7:41
    by comparing the time when
    the signals arrive at different towers.
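The tower-timing idea can be sketched in one dimension (an illustration only; the setup and function here are hypothetical, and real networks use more towers and solve in two or three dimensions):

```python
# Illustrative sketch of locating a phone from signal arrival times
# (time difference of arrival). Hypothetical 1D setup: two towers on
# a straight line, phone somewhere between them.
C = 299_792_458.0  # propagation speed of radio signals, m/s

def locate_between(tower_a: float, tower_b: float, dt: float) -> float:
    """Position x of a phone between two towers on a line.

    dt = (arrival time at tower_a) - (arrival time at tower_b), so
    (x - tower_a) - (tower_b - x) = C * dt, giving the formula below.
    Equal arrival times (dt = 0) put the phone at the midpoint.
    """
    return (C * dt + tower_a + tower_b) / 2.0

print(locate_between(0.0, 10_000.0, 0.0))  # 5000.0, the midpoint
```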
  • 7:41 - 7:45
    So the phone system can do surveillance
  • 7:45 - 7:49
    even without special cooperation
    from the phone itself.
  • 7:54 - 8:00
    Likewise, the bicycles
    that people rent in Paris.
  • 8:00 - 8:03
    Of course the system knows
    where you get the bicycle
  • 8:03 - 8:06
    and it knows where you return the bicycle
  • 8:06 - 8:10
    and I’ve heard reports
    that it tracks the bicycles
  • 8:11 - 8:14
    as they are moving around as well.
  • 8:15 - 8:23
    So they are not something we can really trust.
  • 8:23 - 8:31
    But there are also systems that have nothing
    to do with us that exist only for tracking.
  • 8:31 - 8:38
    For instance, in the UK
    all car travel is monitored.
  • 8:38 - 8:44
    Every car’s movements
    are being recorded in real time
  • 8:44 - 8:47
    and can be tracked by the State in real time.
  • 8:47 - 8:52
    This is done with cameras on the side of the road.
  • 8:52 - 8:57
    Now, the only way we can prevent surveillance
  • 8:57 - 9:01
    that’s done at one remove
    or by unrelated systems
  • 9:01 - 9:07
    is through political action
    against increased government power
  • 9:07 - 9:11
    to track and monitor everyone.
  • 9:12 - 9:17
    Which means of course we have to reject
    whatever excuse they come up with.
  • 9:18 - 9:25
    No excuse for such systems is valid;
    none justifies monitoring everyone.
  • 9:26 - 9:32
    In a free society, when you go out
    in public you are not guaranteed anonymity.
  • 9:32 - 9:36
    It’s possible for someone
    to recognize you and remember.
  • 9:37 - 9:42
    And later that person could say
    that he saw you at a certain place.
  • 9:42 - 9:45
    But that information is diffuse.
  • 9:45 - 9:52
    It’s not conveniently assembled to track
    everybody and investigate what they did.
  • 9:52 - 9:56
    To collect that information is a lot of work
  • 9:56 - 10:01
    so it’s only done in special cases
    when it’s necessary.
  • 10:03 - 10:06
    But computerized surveillance
    makes it possible
  • 10:06 - 10:10
    to centralize and index all this information
  • 10:10 - 10:17
    so that an unjust regime can find it all
    and find out all about everyone.
  • 10:17 - 10:21
    If a dictator takes power,
    which could happen anywhere
  • 10:21 - 10:26
    people realize this and they recognize
  • 10:26 - 10:32
    that they should not
    communicate with other dissidents
  • 10:32 - 10:36
    in a way that the State could find out about.
  • 10:36 - 10:40
    But if the dictator
    has several years of stored records
  • 10:40 - 10:46
    of who talks with whom, it’s too late
    to take any precautions then.
  • 10:46 - 10:51
    Because he already has
    everything he needs to realize
  • 10:51 - 10:56
    “OK this guy is a dissident and he spoke
    with him. Maybe he is a dissident too.”
  • 10:56 - 11:00
    “Maybe we should grab him and torture him.”
  • 11:02 - 11:12
    So we need to campaign to put
    an end to digital surveillance now.
  • 11:12 - 11:17
    You can’t wait until there is a dictator,
    when it would really matter; by then it is too late.
  • 11:18 - 11:26
    And besides, it doesn’t take an outright
    dictatorship to start attacking human rights.
  • 11:27 - 11:32
    I wouldn’t quite call
    the government of the UK a dictatorship.
  • 11:32 - 11:39
    It’s not very democratic and one way
    it crushes democracy is using surveillance.
  • 11:39 - 11:45
    A few years ago, people believed to be
    on their way to a protest
  • 11:45 - 11:47
    (they were going to protest)
  • 11:47 - 11:50
    were arrested before
    they could get there
  • 11:50 - 11:58
    because their car was tracked through
    this universal car tracking system.
  • 12:04 - 12:07
    The second threat is censorship.
  • 12:07 - 12:12
    Censorship is not new,
    it existed long before computers.
  • 12:12 - 12:18
    But 15 years ago, we thought that
    the Internet would protect us from censorship
  • 12:18 - 12:21
    that it would defeat censorship.
  • 12:21 - 12:25
    Then, China and some other obvious tyrannies
  • 12:25 - 12:31
    went to great lengths to impose
    censorship on the Internet, and we said:
  • 12:31 - 12:35
    “well that’s not surprising, what else
    would governments like that do?”
  • 12:35 - 12:39
    But today we see censorship
    imposed in countries
  • 12:39 - 12:45
    that are not normally
    thought of as dictatorships
  • 12:45 - 12:53
    such as the UK, France,
    Spain, Italy, Denmark…
  • 12:54 - 13:01
    They all have systems of
    blocking access to some websites.
  • 13:01 - 13:04
    Denmark established a system
  • 13:04 - 13:11
    that blocks access to a long list
    of webpages, which was secret.
  • 13:11 - 13:16
    The citizens were not supposed to know
    how the government was censoring them
  • 13:16 - 13:20
    but the list was leaked,
    and posted on WikiLeaks.
  • 13:20 - 13:26
    At that point, Denmark added
    the WikiLeaks page to its censorship list.
  • 13:28 - 13:30
    So, the whole rest of the world can find out
  • 13:30 - 13:35
    how Danes are being censored,
    but Danes are not supposed to know.
  • 13:36 - 13:44
    A few months ago, Turkey,
    which claims to respect some human rights
  • 13:44 - 13:47
    announced that every Internet user
  • 13:47 - 13:51
    would have to choose between
    censorship and more censorship.
  • 13:52 - 13:55
    They get to choose among four
    different levels of censorship!
  • 13:55 - 14:02
    But freedom is not one of the options.
  • 14:04 - 14:09
    Australia wanted to impose filtering
    on the Internet but that was blocked.
  • 14:09 - 14:13
    However Australia has a
    different kind of censorship.
  • 14:13 - 14:16
    It has censorship of links.
  • 14:16 - 14:20
    That is, if a website in Australia has a link
  • 14:21 - 14:24
    to some censored site outside Australia
  • 14:24 - 14:27
    the one in Australia can be punished.
  • 14:27 - 14:29
    Electronic Frontier Australia
  • 14:29 - 14:33
    which is an organization
    that defends human rights
  • 14:33 - 14:37
    in the digital domain in Australia
  • 14:37 - 14:41
    posted a link to
    a foreign political website.
  • 14:42 - 14:49
    It was ordered to delete the link
    or face a penalty of $11,000 a day.
  • 14:50 - 14:52
    So they deleted it,
    what else could they do?
  • 14:52 - 14:56
    This is a very harsh system of censorship.
  • 14:56 - 15:02
    In Spain, the censorship
    that was adopted earlier this year
  • 15:02 - 15:13
    allows officials to arbitrarily
    shut down an Internet site in Spain
  • 15:13 - 15:17
    or impose filtering to block access
    to a site outside of Spain.
  • 15:18 - 15:22
    And they can do this
    without any kind of trial.
  • 15:22 - 15:27
    This was one of the motivations
    for the Indignados
  • 15:29 - 15:33
    who have been protesting in the street.
  • 15:34 - 15:39
    There were protests in the street
    in Turkey as well after that announcement
  • 15:39 - 15:43
    but the government refused to change its policy.
  • 15:43 - 15:47
    We must recognize that a country
  • 15:47 - 15:51
    that imposes censorship on the Internet
    is not a free country.
  • 15:51 - 15:57
    And its government is not legitimate either.
  • 16:06 - 16:13
    The next threat to our freedom comes
    from data formats that restrict the users.
  • 16:14 - 16:17
    Sometimes it’s because the format is secret.
  • 16:17 - 16:20
    There are many application programs
  • 16:20 - 16:25
    that save the user’s data in a secret format
  • 16:26 - 16:29
    which is meant to prevent the user
  • 16:29 - 16:33
    from taking that data and using it
    with some other program.
  • 16:33 - 16:38
    The goal is to prevent interoperability.
  • 16:39 - 16:45
    Now evidently, if the program
    implements a secret format
  • 16:45 - 16:49
    that’s because the program
    is not free software.
  • 16:49 - 16:52
    So this is another kind of malicious feature.
  • 16:52 - 16:55
    Surveillance is one kind of malicious feature
  • 16:55 - 16:59
    that you find in some non-free programs;
  • 16:59 - 17:02
    using secret formats to restrict the users
  • 17:02 - 17:05
    is another kind of malicious feature
  • 17:05 - 17:09
    that you also find in some non-free programs.
  • 17:10 - 17:15
    But if you have a free program
    that handles a certain format
  • 17:16 - 17:19
    ipso facto that format is not secret.
  • 17:19 - 17:26
    This kind of malicious feature
    can only exist in a non-free program.
  • 17:26 - 17:31
    Surveillance features could theoretically
    exist in a free program
  • 17:31 - 17:35
    but in practice you don’t find them there.
  • 17:36 - 17:38
    Because the users would fix it.
  • 17:38 - 17:42
    The users wouldn’t like this,
    so they would fix it.
  • 17:45 - 17:54
    In any case, we also find secret data
    formats in use for publication of works.
  • 17:54 - 17:59
    You find secret data formats
    in use for audio
  • 17:59 - 18:03
    such as music, for video, for books…
  • 18:03 - 18:11
    And these secret formats are known as
    Digital Restrictions Management, or DRM
  • 18:12 - 18:16
    or digital handcuffs (les menottes numériques).
  • 18:19 - 18:23
    So, the works are published in secret formats
  • 18:23 - 18:28
    so that only proprietary
    programs can play them
  • 18:28 - 18:36
    so that these proprietary programs can have
    the malicious feature of restricting the users
  • 18:36 - 18:40
    stopping them from doing something
    that would be natural to do.
  • 18:45 - 18:50
    And this is used even by public entities
    to communicate with the people.
  • 18:50 - 18:56
    For instance Italian public television
    makes its programs available on the net
  • 18:56 - 18:59
    in a format called VC-1
  • 18:59 - 19:05
    which is supposedly a standard,
    but it’s a secret standard.
  • 19:06 - 19:12
    Now I can’t imagine how any
    publicly supported entity
  • 19:12 - 19:18
    could justify using a secret format
    to communicate with the public.
  • 19:18 - 19:21
    This should be illegal.
  • 19:21 - 19:25
    In fact I think all use of Digital
    Restrictions Management should be illegal.
  • 19:25 - 19:29
    No company should be allowed to do this.
  • 19:32 - 19:36
    There are also formats that are not secret
  • 19:36 - 19:41
    but might almost as well be
    secret, for instance Flash.
  • 19:42 - 19:49
    Flash is not actually secret but Adobe keeps
    making new versions, which are different
  • 19:49 - 19:57
    faster than anyone can keep up
    and make free software to play those files.
  • 19:57 - 20:01
    So it has almost the same effect as being secret.
  • 20:03 - 20:11
    Then there are the patented formats,
    such as MP3 for audio.
  • 20:11 - 20:16
    It’s bad to distribute audio in MP3 format!
  • 20:17 - 20:22
    There is free software to handle MP3 format,
    to play it and to generate it
  • 20:22 - 20:26
    but because it’s patented in many countries
  • 20:26 - 20:32
    many distributors of free software
    don’t dare include those programs.
  • 20:32 - 20:36
    So if they distribute the GNU+Linux system
  • 20:36 - 20:40
    their system doesn’t include a player for MP3.
  • 20:40 - 20:48
    As a result if anyone
    distributes some music in MP3
  • 20:48 - 20:55
    that’s putting pressure on people
    not to use GNU/Linux.
  • 20:55 - 20:58
    Sure, if you’re an expert you can find
    free software and install it
  • 20:59 - 21:01
    but there are lots of non-experts
  • 21:01 - 21:05
    and they might see that
    they installed a version of GNU/Linux
  • 21:05 - 21:10
    which doesn’t have that software
    and it won’t play MP3 files
  • 21:10 - 21:12
    and they think it’s the system’s fault.
  • 21:13 - 21:19
    They don’t realize it’s MP3’s fault,
    but that is the fact.
  • 21:19 - 21:25
    Therefore, if you want to support freedom,
    don’t distribute MP3 files.
  • 21:25 - 21:30
    That’s why I say if you’re recording my speech
    and you want to distribute copies
  • 21:30 - 21:39
    don’t do it in a patented format
    such as MPEG-2, or MPEG-4, or MP3.
  • 21:39 - 21:49
    Use a format friendly to free software,
    such as the Ogg format or WebM.
  • 21:50 - 21:54
    And by the way, if you are going
    to distribute copies of the recording
  • 21:54 - 22:01
    please put on it the
    Creative Commons No Derivatives license.
  • 22:01 - 22:04
    This is a statement of my personal views.
  • 22:04 - 22:09
    If it were a lecture for a course,
    if it were didactic
  • 22:09 - 22:14
    then it ought to be free,
    but statements of opinion are different.
  • 22:24 - 22:27
    Now this leads me to the next threat
  • 22:27 - 22:32
    which comes from software
    that the users don’t have control over.
  • 22:33 - 22:38
    In other words: software
    that isn’t free, that is not “libre”.
  • 22:38 - 22:42
    On this particular point
    French is clearer than English.
  • 22:42 - 22:46
    The English word free
    means ‘libre’ and ‘gratuit’
  • 22:46 - 22:53
    but what I mean when I say free software
    is ‘logiciel libre‘. I don’t mean ‘gratuit’.
  • 22:53 - 22:55
    I’m not talking about price.
  • 22:55 - 23:01
    Price is a side issue, just a detail,
    because it doesn't matter ethically.
  • 23:01 - 23:04
    You know if I have a copy of a program
  • 23:04 - 23:10
    and I sell it to you for one euro
    or a hundred euros, who cares?
  • 23:10 - 23:14
    Why should anyone think
    that that’s good or bad?
  • 23:14 - 23:21
    Or suppose I gave it to you ‘gratuitement’…
    still, who cares?
  • 23:21 - 23:27
    But whether this program respects
    your freedom, that’s important!
  • 23:27 - 23:32
    So free software is software
    that respects users’ freedom.
  • 23:32 - 23:34
    What does this mean?
  • 23:34 - 23:38
    Ultimately there are just
    two possibilities with software:
  • 23:38 - 23:44
    either the users control the program
    or the program controls the users.
  • 23:44 - 23:48
    If the users have certain essential freedoms
  • 23:48 - 23:51
    then they control the program
  • 23:51 - 23:57
    and those freedoms are
    the criterion for free software.
  • 23:58 - 24:02
    But if the users don’t fully
    have the essential freedoms
  • 24:02 - 24:05
    then the program controls the users.
  • 24:05 - 24:12
    But somebody controls that program
    and, through it, has power over the users.
  • 24:12 - 24:17
    So, a non-free program is an instrument
  • 24:17 - 24:22
    to give somebody power
    over a lot of other people
  • 24:22 - 24:26
    and this is unjust power
    that nobody should ever have.
  • 24:26 - 24:37
    This is why non-free software, les logiciels
    privateurs, qui privent de la liberté
    (software that deprives users of liberty),
  • 24:37 - 24:42
    why proprietary software is
    an injustice and should not exist:
  • 24:43 - 24:47
    Because it leaves the users without freedom.
  • 24:47 - 24:52
    Now, the developer who has
    control of the program
  • 24:52 - 24:57
    often feels tempted
    to introduce malicious features
  • 24:57 - 25:02
    to further exploit or abuse those users.
  • 25:02 - 25:05
    He feels a temptation because
    he knows he can get away with it:
  • 25:06 - 25:09
    because his program controls the users
  • 25:09 - 25:12
    and the users do not have
    control of the program.
  • 25:12 - 25:17
    If he puts in a malicious feature,
    the users can’t fix it;
  • 25:17 - 25:20
    they can’t remove the malicious feature.
  • 25:20 - 25:23
    I’ve already told you about
    two kinds of malicious features:
  • 25:23 - 25:27
    surveillance features,
    such as are found in Windows
  • 25:27 - 25:32
    and the iPhone and Flash player
    and the “Swindle”.
  • 25:35 - 25:40
    And there are also
    features to restrict users
  • 25:40 - 25:43
    which work with secret data formats
  • 25:43 - 25:50
    and those are found in Windows,
    Macintosh, the iPhone, Flash player
  • 25:50 - 25:58
    the Amazon “Swindle”, the Playstation 3
    and lots and lots of other programs.
  • 26:00 - 26:04
    The other kind of malicious
    feature is the backdoor.
  • 26:04 - 26:07
    That means something in that program
  • 26:07 - 26:12
    is listening for remote
    commands and obeying them
  • 26:12 - 26:16
    and those commands can mistreat the user.
  • 26:16 - 26:26
    We know of backdoors in Windows,
    in the iPhone, in the Amazon “Swindle”.
  • 26:27 - 26:34
    The Amazon “Swindle” has a backdoor
    that can remotely delete books.
  • 26:34 - 26:39
    We know this by observation,
    because Amazon did it:
  • 26:39 - 26:47
    in 2009 Amazon remotely deleted
    thousands of copies of a particular book.
  • 26:47 - 26:53
    Those were authorized copies; people
    had obtained them directly from Amazon
  • 26:53 - 26:58
    and thus Amazon knew exactly where they were.
  • 26:59 - 27:04
    Which is how Amazon knew where
    to send the commands to delete those books.
  • 27:04 - 27:12
    You know which book Amazon deleted?
  • 27:12 - 27:18
    It’s a book everyone should read
    because it discusses a totalitarian state
  • 27:18 - 27:22
    that did things like
    delete books it didn’t like.
  • 27:23 - 27:28
    Everybody should read it,
    but not on the Amazon “Swindle”.
  • 27:31 - 27:40
    Anyway, malicious features are present
    in the most widely used non-free programs
  • 27:40 - 27:47
    but they are rare in free software, because
    with free software the users have control:
  • 27:47 - 27:50
    they can read the source code
    and they can change it.
  • 27:50 - 27:54
    So, if there were a malicious feature
  • 27:54 - 27:58
    somebody would sooner or later
    spot it and fix it.
  • 27:58 - 28:02
    This means that somebody who is considering
  • 28:02 - 28:06
    introducing a malicious feature
    does not find it so tempting
  • 28:06 - 28:10
    because he knows he might
    get away with it for a while
  • 28:10 - 28:13
    but somebody will spot it, will fix it
  • 28:13 - 28:18
    and everybody will lose
    trust in the perpetrator.
  • 28:19 - 28:23
    It’s not so tempting when
    you know you’re going to fail.
  • 28:24 - 28:29
    And that’s why we find that malicious
    features are rare in free software
  • 28:29 - 28:34
    and common in proprietary software.
  • 28:35 - 28:40
    Now the essential freedoms are four.
  • 28:40 - 28:44
    Freedom 0 is the freedom
    to run the program as you wish.
  • 28:44 - 28:50
    Freedom 1 is the freedom to study
    the source code and change it
  • 28:50 - 28:54
    so the program does
    your computing the way you wish.
  • 28:55 - 28:58
    Freedom 2 is the freedom to help others.
  • 28:58 - 29:04
    That’s the freedom to make exact copies
    and redistribute them when you wish.
  • 29:04 - 29:09
    Freedom 3 is the freedom
    to contribute to your community.
  • 29:09 - 29:14
    That’s the freedom to make copies
    of your modified versions
  • 29:14 - 29:21
    if you have made any, and then
    distribute them to others when you wish.
  • 29:21 - 29:28
    These freedoms, in order to be adequate,
    must apply to all activities of life.
  • 29:28 - 29:34
    For instance if it says: “This is free
    for academic use”, it’s not free.
  • 29:34 - 29:38
    Because that’s too limited.
    It doesn’t apply to all areas of life.
  • 29:38 - 29:42
    In particular, if a program is free
  • 29:42 - 29:48
    that means it can be modified
    and distributed commercially
  • 29:48 - 29:53
    because commerce is an area of life,
    an activity in life.
  • 29:53 - 29:57
    And this freedom has to apply to all activities.
  • 29:57 - 30:04
    Now however, it’s not obligatory
    to do any of these things.
  • 30:04 - 30:08
    The point is you’re free to do them
    if you wish, when you wish.
  • 30:08 - 30:13
    But you never have to do them.
    You don’t have to do any of them.
  • 30:13 - 30:15
    You don’t have to run the program.
  • 30:15 - 30:17
    You don’t have to study
    or change the source code.
  • 30:17 - 30:20
    You don’t have to make any copies.
  • 30:20 - 30:23
    You don’t have to distribute
    your modified versions.
  • 30:23 - 30:28
    The point is you should be free
    to do those things if you wish.
  • 30:29 - 30:34
    Now, freedom number 1, the freedom to study
    and change the source code
  • 30:34 - 30:38
    to make the program do
    your computing as you wish
  • 30:38 - 30:42
    includes something
    that might not be obvious at first.
  • 30:42 - 30:46
    If the program comes in a product
  • 30:47 - 30:52
    and the developer can provide
    an upgrade that will run in it,
  • 30:52 - 30:57
    then you have to be able to make
    your version run in that product.
  • 30:57 - 31:01
    If the product would only run
    the developer’s versions
  • 31:01 - 31:07
    and refuses to run yours, the executable
    in that product is not free software.
  • 31:07 - 31:11
    Even if it was compiled from free source code
  • 31:11 - 31:14
    it’s not free because
    you don’t have the freedom
  • 31:14 - 31:18
    to make the program do your computing
    the way you wish.
  • 31:18 - 31:23
    So, freedom 1 has to be real,
    not just theoretical.
  • 31:23 - 31:27
    It has to include the freedom
    to use your version
  • 31:27 - 31:32
    not just the freedom to make
    some source code that won’t run.
  • 31:33 - 31:37
    I launched the free software movement in 1983
  • 31:38 - 31:40
    when I announced the plan to develop
  • 31:40 - 31:44
    a free software operating system
    whose name is GNU.
  • 31:45 - 31:50
    Now GNU, the name GNU, is a joke.
  • 31:52 - 31:54
    Because part of the hacker’s spirit
  • 31:54 - 31:59
    is to have fun even when
    you’re doing something very serious.
  • 31:59 - 32:07
    Now I can’t think of anything more seriously
    important than defending freedom.
  • 32:12 - 32:16
    But that didn’t mean I couldn’t give
    my system a name that’s a joke.
  • 32:20 - 32:24
    So GNU is a joke because
    it’s a recursive acronym
  • 32:24 - 32:31
    it stands for “GNU’s Not Unix”,
    so G.N.U.: GNU’s Not Unix.
  • 32:31 - 32:35
    So the G in GNU stands for GNU.
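The recursion in the name can be played with in a few lines (a toy illustration added here, not part of the talk):

```python
def expand(name: str, times: int) -> str:
    """Expand the recursive acronym: each pass rewrites the first
    'GNU' as "GNU's Not Unix", so the G keeps standing for GNU."""
    for _ in range(times):
        name = name.replace("GNU", "GNU's Not Unix", 1)
    return name

print(expand("GNU", 1))  # GNU's Not Unix
print(expand("GNU", 2))  # GNU's Not Unix's Not Unix
```

Each expansion leaves another GNU to expand, which is the joke: the acronym never bottoms out.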
  • 32:36 - 32:39
    Now in fact this was
    a tradition at the time.
  • 32:39 - 32:44
    The tradition was:
    if there was an existing program
  • 32:44 - 32:48
    and you wrote something
    similar to it, inspired by it
  • 32:49 - 32:52
    you could give credit
    by giving your program a name
  • 32:53 - 32:57
    that’s a recursive acronym
    saying it’s not the other one.
  • 32:58 - 33:03
    So I gave credit to Unix for
    the technical ideas of Unix
  • 33:03 - 33:10
    but with the name GNU, because I decided
    to make GNU a Unix-like system
  • 33:10 - 33:16
    with the same commands, the same system calls
    so that it would be compatible
  • 33:16 - 33:20
    so that people who used Unix
    could switch over easily.
  • 33:20 - 33:24
    But the reason for developing GNU,
    that was unique.
  • 33:24 - 33:29
    GNU is the only operating system,
    as far as I know
  • 33:29 - 33:33
    ever developed for the purpose of freedom.
  • 33:33 - 33:38
    Not for technical motivations,
    not for commercial motivations.
  • 33:38 - 33:41
    GNU was written for your freedom.
  • 33:41 - 33:45
    Because without a free operating system
  • 33:45 - 33:49
    it’s impossible to have
    freedom and use a computer.
  • 33:50 - 33:54
    And there were none, and
    I wanted people to have freedom
  • 33:54 - 33:57
    so it was up to me to write one.
  • 33:57 - 34:01
    Nowadays there are millions of users
    of the GNU operating system
  • 34:01 - 34:06
    and most of them don’t know
    they are using the GNU operating system
  • 34:06 - 34:10
    because there is a widespread
    practice which is not nice.
  • 34:10 - 34:14
    People call the system “Linux”.
  • 34:14 - 34:19
    Many do, but some people don’t
    and I hope you’ll be one of them.
  • 34:19 - 34:22
    Please, since we started this
  • 34:22 - 34:25
    since we wrote the biggest piece of the code
  • 34:25 - 34:28
    please give us equal mention
  • 34:28 - 34:37
    please call the system GNU+Linux
    or GNU/Linux. It’s not much to ask!
  • 34:37 - 34:40
    But there is another reason to do this.
  • 34:40 - 34:45
    It turns out that the person who wrote Linux
  • 34:45 - 34:49
    which is one component of the
    system as we use it today
  • 34:49 - 34:52
    he doesn’t agree with
    the free software movement.
  • 34:53 - 34:57
    And so if you call the whole system Linux
  • 34:57 - 35:04
    in effect you’re steering people towards
    his ideas and away from our ideas.
  • 35:04 - 35:08
    Because he’s not gonna say to them
    that they deserve freedom.
  • 35:08 - 35:15
    He’s going to say to them that he likes
    convenient, reliable, powerful software.
  • 35:15 - 35:19
    He’s going to tell people that
    those are the important values.
  • 35:19 - 35:27
    But if you tell them the system is GNU+Linux,
    the GNU operating system plus Linux the kernel
  • 35:27 - 35:32
    then they’ll know about us
    and then they might listen to what we say.
  • 35:33 - 35:42
    You deserve freedom, and since freedom
    will be lost if we don’t defend it —
  • 35:42 - 35:46
    there’s always going to be
    a Sarkozy to take it away —
  • 35:48 - 35:53
    we need above all to teach
    people to demand freedom
  • 35:53 - 35:56
    to be ready to stand up for their freedom
  • 35:57 - 36:03
    the next time someone threatens to take it away.
  • 36:10 - 36:16
    Nowadays, you can tell who doesn’t want
    to discuss these ideas of freedom
  • 36:16 - 36:19
    because they don’t say “logiciel libre”.
  • 36:19 - 36:23
    They don’t say “libre”,
    they say “open source”.
  • 36:23 - 36:27
    That term was coined by
    people like Mr Torvalds
  • 36:27 - 36:32
    who would prefer that these
    ethical issues don’t get raised.
  • 36:32 - 36:37
    And so the way you can help us
    raise them is by saying libre.
  • 36:37 - 36:42
    You know, it’s up to you where you stand
    you’re free to say what you think.
  • 36:42 - 36:45
    If you agree with them,
    you can say open source.
  • 36:45 - 36:49
    If you agree with us, show it: say libre!
  • 36:52 - 36:56
    Now the most important
    point about free software
  • 36:56 - 37:02
    is that schools must teach
    exclusively free software.
  • 37:02 - 37:06
    All levels of schools from
    kindergarten to university
  • 37:06 - 37:14
    it’s their moral responsibility to teach
    only free software in their education
  • 37:14 - 37:17
    and all other educational activities as well
  • 37:17 - 37:23
    including those that say that
    they’re spreading digital literacy.
  • 37:23 - 37:30
    A lot of those activities teach Windows,
    which means they’re teaching dependence.
  • 37:30 - 37:35
    To teach people the use of proprietary
    software is to teach dependence
  • 37:36 - 37:43
    and educational activities must never do that
    because it’s the opposite of their mission.
  • 37:44 - 37:50
    Educational activities have
    a social mission to educate good citizens
  • 37:50 - 37:58
    of a strong, capable, cooperating,
    independent and free society.
  • 37:58 - 38:03
    And in the area of computing,
    that means: teach free software.
  • 38:04 - 38:11
    Never teach a proprietary program
    because that’s inculcating dependence.
  • 38:11 - 38:18
    Why do you think some proprietary developers
    offer gratis copies to schools?
  • 38:18 - 38:23
    They want the schools
    to make the children dependent.
  • 38:23 - 38:28
    And then, when they graduate,
    they’re still dependent
  • 38:28 - 38:32
    and you know the company is not going
    to offer them gratis copies.
  • 38:33 - 38:38
    And some of them get jobs
    and go to work for companies.
  • 38:38 - 38:42
    Not many of them anymore,
    but some of them.
  • 38:42 - 38:47
    And those companies are not going
    to be offered gratis copies.
  • 38:47 - 38:51
    Oh no! The idea is if the school
    directs the students
  • 38:51 - 38:54
    down the path of permanent dependence
  • 38:54 - 38:58
    they can drag the rest of society
    with them into dependence.
  • 38:58 - 39:04
    That’s the plan! It’s just like
    giving the school gratis needles
  • 39:04 - 39:07
    full of addicting drugs, saying
  • 39:07 - 39:11
    “inject this into your students,
    the first dose is gratis.”
  • 39:12 - 39:15
    Once you’re dependent,
    then you have to pay.
  • 39:15 - 39:18
    Well, the school would reject the drugs
  • 39:18 - 39:22
    because it isn’t right to teach
    the students to use addictive drugs
  • 39:22 - 39:27
    and it’s got to reject
    the proprietary software also.
  • 39:27 - 39:34
    Some people say “let’s have the school teach
    both proprietary software and free software”
  • 39:34 - 39:38
    “so the students become familiar with both.”
  • 39:38 - 39:44
    That’s like saying “for lunch
    let’s give the kids spinach and tobacco”
  • 39:44 - 39:48
    “so that they become accustomed to both.”
  • 39:48 - 39:54
    No! The schools are only supposed
    to teach good habits, not bad ones!
  • 39:55 - 40:00
    So there should be no Windows in a school
  • 40:00 - 40:05
    no Macintosh, nothing
    proprietary in the education.
  • 40:06 - 40:10
    But also, for the sake
    of educating good programmers.
  • 40:10 - 40:15
    You see, some people have
    a talent for programming.
  • 40:15 - 40:20
    At ten to thirteen years old,
    typically, they’re fascinated
  • 40:20 - 40:24
    and if they use a program, they want
    to know “how does it do this?”
  • 40:24 - 40:29
    But when they ask the teacher,
    if it’s proprietary, the teacher has to say
  • 40:29 - 40:32
    “I’m sorry, it’s a secret, we can’t find out.”
  • 40:32 - 40:35
    Which means education is forbidden.
  • 40:36 - 40:40
    A proprietary program is
    the enemy of the spirit of education.
  • 40:40 - 40:46
    It’s knowledge withheld, so
    it should not be tolerated in a school
  • 40:46 - 40:50
    even though there may be
    plenty of people in the school
  • 40:50 - 40:53
    who don’t care about programming,
    don’t want to learn this.
  • 40:53 - 40:58
    Still, because it’s the enemy
    of the spirit of education
  • 40:58 - 41:01
    it shouldn’t be there in the school.
  • 41:01 - 41:05
    But if the program is free,
    the teacher can explain what he knows
  • 41:05 - 41:08
    and then give out copies
    of the source code, saying:
  • 41:08 - 41:11
    “read it and you’ll understand everything.”
  • 41:11 - 41:14
    And those who are really
    fascinated, they will read it!
  • 41:14 - 41:20
    And this gives them an opportunity
    to start to learn how to be good programmers.
  • 41:20 - 41:24
    To learn to be a good programmer,
    you’ll need to recognize
  • 41:24 - 41:30
    that certain ways of writing code, even if
    they make sense to you and they are correct
  • 41:30 - 41:35
    they’re not good because other people
    will have trouble understanding them.
  • 41:35 - 41:40
    Good code is clear code that others
    will have an easy time working on
  • 41:40 - 41:43
    when they need to make further changes.
  • 41:43 - 41:46
    How do you learn to write good clear code?
  • 41:46 - 41:50
    You do it by reading lots of code,
    and writing lots of code.
  • 41:50 - 41:53
    And only free software offers the chance
  • 41:53 - 41:58
    to read the code of large
    programs that we really use.
  • 41:59 - 42:03
    And then you have to write lots of code
  • 42:03 - 42:06
    which means you have
    to write changes in large programs.
  • 42:06 - 42:10
    How do you learn to write
    good code for large programs?
  • 42:10 - 42:16
    You have to start small, which
    does not mean small programs, oh no!
  • 42:16 - 42:23
    The challenges of the code for large programs
    don’t even begin to appear in small programs.
  • 42:23 - 42:27
    So the way you start small at
    writing code for large programs
  • 42:27 - 42:31
    is by writing small
    changes in large programs.
  • 42:32 - 42:35
    And only free software
    gives you the chance to do that!
  • 42:35 - 42:44
    So, if a school wants to offer the possibility
    of learning to be a good programmer
  • 42:44 - 42:47
    it needs to be a free software school.
  • 42:47 - 42:49
    But there is an even deeper reason
  • 42:49 - 42:53
    and that is for the sake of moral education
  • 42:53 - 42:56
    education in citizenship.
  • 42:56 - 43:00
    It’s not enough for a school
    to teach facts and skills
  • 43:00 - 43:07
    it has to teach the spirit of goodwill,
    the habit of helping others.
  • 43:07 - 43:10
    Therefore, every class should have this rule:
  • 43:10 - 43:15
    “Students, if you bring software to class,
    you may not keep it for yourself”
  • 43:15 - 43:19
    “you must share copies
    with the rest of the class”
  • 43:19 - 43:24
    “including the source code
    in case anyone here wants to learn!”
  • 43:25 - 43:28
    “Because this class is a place
    where we share our knowledge.”
  • 43:28 - 43:34
    “Therefore, bringing a proprietary program
    to class is not permitted.”
  • 43:35 - 43:40
    The school must follow
    its own rule to set a good example.
  • 43:40 - 43:44
    Therefore, the school must bring
    only free software to class
  • 43:44 - 43:50
    and share copies, including the source code
    with anyone in the class that wants copies.
  • 43:50 - 43:54
    Those of you who have
    a connection with a school
  • 43:54 - 44:01
    it’s your duty to campaign and pressure
    that school to move to free software.
  • 44:02 - 44:04
    And you have to be firm.
  • 44:04 - 44:11
    It may take years, but you can succeed
    as long as you never give up.
  • 44:11 - 44:18
    Keep seeking more allies among the students,
    the faculty, the staff, the parents, anyone!
  • 44:19 - 44:23
    And always bring it up as an ethical issue.
  • 44:23 - 44:26
    If someone else wants
    to sidetrack the discussion
  • 44:26 - 44:31
    into this practical advantage
    and this practical disadvantage
  • 44:31 - 44:36
    which means they’re ignoring the most
    important question, then you have to say:
  • 44:36 - 44:42
    “this is not about how to do
    the best job of educating”
  • 44:42 - 44:48
    “this is about how to do a good
    education instead of an evil one.”
  • 44:48 - 44:54
    “It’s how to do education right
    instead of wrong”
  • 44:54 - 44:59
    “not just how to make it
    a little more effective or less.”
  • 44:59 - 45:06
    So don’t get distracted with those secondary
    issues and ignore what really matters!
  • 45:07 - 45:12
    So, moving on to the next menace.
  • 45:12 - 45:21
    There are two issues that arise
    from the use of internet services.
  • 45:21 - 45:27
    One of them is that the server
    could abuse your data
  • 45:27 - 45:32
    and another is that
    it could take control of your computing.
  • 45:32 - 45:36
    The first issue, people already know about.
  • 45:36 - 45:42
    They are aware that, if you
    upload data to an internet service
  • 45:42 - 45:45
    there is a question of what
    it will do with that data.
  • 45:45 - 45:48
    It might do things that mistreat you.
  • 45:48 - 45:54
    What could it do? It could lose the data,
    it could change the data
  • 45:54 - 45:58
    it could refuse to let you get the data back.
  • 45:58 - 46:04
    And it could also show the data to
    someone else you don’t want to show it to.
  • 46:04 - 46:07
    Four different possible things.
  • 46:07 - 46:13
    Now, here, I’m talking about the data
    that you knowingly gave to that site.
  • 46:13 - 46:19
    Of course, many of those
    services do surveillance as well.
  • 46:19 - 46:22
    For instance, consider Facebook.
  • 46:22 - 46:28
    Users send lots of data to Facebook,
    and one of the bad things about Facebook
  • 46:28 - 46:33
    is that it shows a lot of
    that data to lots of other people
  • 46:33 - 46:41
    and even if it offers them a setting
    to say “no!”, that may not really work.
  • 46:41 - 46:46
    After all, if you say “some other people
    can see this piece of information,”
  • 46:46 - 46:48
    one of them might publish it.
  • 46:48 - 46:50
    Now, that’s not Facebook’s fault
  • 46:50 - 46:54
    there is nothing they could do to
    prevent that but it ought to warn people.
  • 46:54 - 47:00
    Instead of saying “mark this as only
    to your so-called friends”
  • 47:00 - 47:06
    it should say “keep in mind that your
    so-called friends are not really your friends”
  • 47:06 - 47:11
    “and if they want to make trouble
    for you, they could publish this.”
  • 47:11 - 47:17
    Every time, it should say that, if
    they want to deal with people ethically.
  • 47:22 - 47:26
    As well as all the data users of Facebook
    voluntarily give to Facebook
  • 47:26 - 47:33
    Facebook is collecting data
    about people’s activities on the net
  • 47:33 - 47:39
    through various methods of surveillance.
    But that was the first menace.
  • 47:39 - 47:45
    For now I am talking about the data
    that people know they are giving to these sites.
  • 47:47 - 47:54
    Losing data is something that
    could always happen by accident.
  • 47:54 - 47:59
    That possibility is always there,
    no matter how careful someone is.
  • 47:59 - 48:04
    Therefore, you need to keep
    multiple copies of data that matters.
  • 48:04 - 48:12
    If you do that, then, even if someone
    decided to delete your data intentionally
  • 48:12 - 48:16
    it wouldn’t hurt you that much,
    because you’d have other copies of it.
  • 48:16 - 48:20
    So, as long as you are
    maintaining multiple copies
  • 48:20 - 48:25
    you don’t have to worry too much
    about someone’s losing your data.
  • 48:26 - 48:30
    What about whether you can get it back?
  • 48:30 - 48:36
    Well, some services make it possible to get
    back all the data that you sent, and some don’t.
  • 48:36 - 48:42
    Google services will let the user
    get back the data the user has put into them.
  • 48:42 - 48:45
    Facebook, famously, does not.
  • 48:46 - 48:52
    Of course in the case of Google, this only
    applies to the data the user knows Google has.
  • 48:52 - 48:57
    Google does lots of surveillance, too
    and that data is not included.
  • 48:59 - 49:03
    But in any case, if you
    can get the data back
  • 49:03 - 49:08
    then you could track
    whether they have altered it.
  • 49:08 - 49:13
    And they are not very likely to start
    altering people’s data if the people can tell.
  • 49:13 - 49:19
    So maybe we can keep track
    of that particular kind of abuse.
  • 49:19 - 49:24
    But the abuse of showing the data to someone
    you don’t want it to be shown to
  • 49:24 - 49:29
    is very common and almost
    impossible for you to prevent
  • 49:30 - 49:33
    especially if it’s a US company.
  • 49:33 - 49:40
    You see, the most hypocritically
    named law in US history
  • 49:40 - 49:50
    the so-called USA Patriot Act says
    that Big Brother’s police
  • 49:50 - 49:56
    can collect just about all the data
    that companies maintain about individuals.
  • 49:56 - 50:02
    Not just companies, but other
    organizations too, like public libraries.
  • 50:02 - 50:08
    The police can get this massively,
    without even going to court.
  • 50:08 - 50:13
    Now, in a country that was
    founded on an idea of freedom
  • 50:13 - 50:19
    there is nothing more unpatriotic
    than this. But this is what they did.
  • 50:19 - 50:25
    So you mustn’t ever trust any
    of your data to a US company.
  • 50:25 - 50:32
    And they say that foreign subsidiaries
    of US companies are subject to this as well
  • 50:32 - 50:36
    so the company you are directly
    dealing with may be in Europe
  • 50:36 - 50:43
    but if it’s owned by a US company,
    you’ve got the same problem to deal with.
  • 50:49 - 50:53
    However, this is mainly a concern
  • 50:53 - 51:00
    when the data you are sending
    to the service is not for publication.
  • 51:00 - 51:03
    There are some services
    where you publish things.
  • 51:03 - 51:08
    Of course, if you publish something,
    you know everybody is gonna be able to see it.
  • 51:08 - 51:10
    So, there is no way they can hurt you
  • 51:10 - 51:13
    by showing it to somebody
    who wasn’t supposed to see it.
  • 51:13 - 51:18
    There is nobody who wasn’t supposed
    to see it if you published it.
  • 51:18 - 51:22
    So in that case the problem doesn’t exist.
  • 51:24 - 51:32
    So these are four sub-issues
    of this one threat of abusing our data.
  • 51:32 - 51:39
    The idea of the Freedom Box project is
    you have your own server in your own home
  • 51:39 - 51:43
    and when you want to do something remotely
    you do it with your own server
  • 51:43 - 51:49
    and the police have to get a court order
    in order to search your server.
  • 51:49 - 51:56
    So you have the same rights this way that you
    would have traditionally in the physical world.
  • 51:56 - 52:02
    The point here and in
    so many other issues is:
  • 52:02 - 52:06
    as we start doing things
    digitally instead of physically
  • 52:06 - 52:15
    we shouldn’t lose any of our rights, because
    the general tendency is that we do lose rights.
  • 52:18 - 52:25
    Basically, Stallman’s law says
  • 52:25 - 52:32
    that in an epoch when governments
    work for the mega-corporations
  • 52:32 - 52:36
    instead of reporting to their citizens
  • 52:36 - 52:45
    every technological change can be taken
    advantage of to reduce our freedom.
  • 52:47 - 52:51
    Because reducing our freedom is what
    these governments want to do.
  • 52:52 - 52:55
    So the question is: when
    do they get an opportunity?
  • 52:55 - 53:02
    Well, any change that happens for some
    other reason is a possible opportunity
  • 53:02 - 53:07
    and they will take advantage of it
    if that’s their general desire.
  • 53:10 - 53:12
    But the other issue
    with internet services
  • 53:12 - 53:16
    is that they can take
    control of your computing
  • 53:16 - 53:22
    and that’s not so commonly known.
    But it’s becoming more common.
  • 53:22 - 53:30
    There are services that offer to do
    computing for you on data supplied by you
  • 53:30 - 53:35
    things that you should do
    in your own computer
  • 53:35 - 53:41
    but they invite you to let somebody else’s
    computer do that computing work for you.
  • 53:41 - 53:49
    And the result is you lose control over it.
    It’s just as if you used a non-free program.
  • 53:50 - 53:56
    Two different scenarios
    but they lead to the same problem.
  • 53:56 - 54:01
    If you do your computing
    with a non-free program
  • 54:01 - 54:05
    well, the users don’t control
    the non-free program
  • 54:05 - 54:08
    it controls the users,
    which would include you.
  • 54:08 - 54:12
    So you’ve lost control of
    the computing that’s being done.
  • 54:12 - 54:17
    But if you do your computing in his server
  • 54:17 - 54:23
    well, the programs that are doing it
    are the ones he chose.
  • 54:23 - 54:28
    You can’t touch them or see them,
    so you have no control over them.
  • 54:28 - 54:31
    He has control over them, maybe.
  • 54:31 - 54:36
    If they are free software and he installs
    them then he has control over them.
  • 54:36 - 54:39
    But even he might not have control.
  • 54:39 - 54:42
    He might be running a proprietary
    program in his server
  • 54:42 - 54:49
    in which case it’s somebody else who has control
    of the computing being done in his server.
  • 54:50 - 54:52
    He doesn’t control it and you don’t.
  • 54:52 - 54:56
    But suppose he installs a free program
  • 54:56 - 55:02
    then he has control over the computing
    being done in his computer, but you don’t.
  • 55:02 - 55:05
    So, either way, you don’t!
  • 55:05 - 55:08
    So the only way to have
    control over your computing
  • 55:08 - 55:15
    is to do it with
    your copy of a free program.
  • 55:16 - 55:20
    This practice is called
    “Software as a Service”.
  • 55:20 - 55:28
    It means doing your computing with
    your data in somebody else’s server.
  • 55:28 - 55:32
    And I don’t know of anything
    that can make this acceptable.
  • 55:32 - 55:37
    It’s always something
    that takes away your freedom
  • 55:37 - 55:41
    and the only solution
    I know of is to refuse.
  • 55:42 - 55:48
    For instance, there are servers that
    will do translation or voice recognition
  • 55:48 - 55:54
    and you are letting them have
    control over this computing activity
  • 55:54 - 55:56
    which we shouldn’t ever do.
  • 55:56 - 55:59
    Of course, we are also
    giving them data about ourselves
  • 55:59 - 56:02
    which they shouldn’t have.
  • 56:02 - 56:06
    Imagine if you had
    a conversation with somebody
  • 56:06 - 56:13
    through a voice-recognition translation
    system that was Software as a Service
  • 56:13 - 56:18
    and it’s really running
    on a server belonging to some company.
  • 56:18 - 56:24
    That company also gets to know what
    was said in the conversation
  • 56:24 - 56:32
    and if it’s a US company that means
    Big Brother also gets to know. This is no good.
  • 56:36 - 56:44
    The next threat to our freedom in a digital
    society is using computers for voting.
  • 56:46 - 56:51
    You can’t trust computers for voting.
  • 56:51 - 56:54
    Whoever controls the software
    in those computers
  • 56:54 - 57:00
    has the power to commit
    undetectable fraud.
  • 57:02 - 57:11
    Elections are special. Because there’s
    nobody involved that we dare trust fully.
  • 57:11 - 57:15
    Everybody has to be checked,
    crosschecked by others
  • 57:15 - 57:22
    so that nobody is in the position
    to falsify the results by himself.
  • 57:22 - 57:26
    Because if anybody is in a position
    to do that he might do it!
  • 57:26 - 57:31
    So our traditional systems
    for voting were designed
  • 57:31 - 57:37
    so that nobody was fully trusted,
    everybody was being checked by others.
  • 57:37 - 57:42
    So that nobody could easily commit fraud.
  • 57:42 - 57:48
    But once you introduce a program,
    this is impossible!
  • 57:48 - 57:54
    How can you tell if a voting machine
    would honestly count the votes?
  • 57:54 - 57:59
    You’d have to study the program that’s
    running in it during the election
  • 57:59 - 58:06
    which of course nobody can do, and most
    people wouldn’t even know how to do.
  • 58:06 - 58:11
    But even the experts who might theoretically
    be capable of studying the program
  • 58:11 - 58:15
    they can’t do it while people are voting.
  • 58:16 - 58:18
    They’d have to do it in advance
  • 58:18 - 58:22
    and then how do they know
    that the program they studied
  • 58:22 - 58:27
    is the one that’s running while
    people vote? Maybe it’s been changed.
  • 58:27 - 58:34
    Now, if this program is proprietary,
    that means some company controls it.
  • 58:34 - 58:38
    The election authority can’t even
    tell what that program is doing.
  • 58:38 - 58:42
    Well, this company then
    could rig the election.
  • 58:42 - 58:47
    There are accusations that this was done
    in the US within the past ten years
  • 58:47 - 58:51
    that election results were falsified this way.
  • 58:51 - 58:54
    But what if the program is free software?
  • 58:54 - 58:57
    That means the election authority
  • 58:57 - 59:01
    who owns this voting machine
    has control over the software in it
  • 59:02 - 59:04
    so the election authority
    could rig the election.
  • 59:04 - 59:07
    You can’t trust them either.
  • 59:07 - 59:11
    You don’t dare trust anybody in voting
  • 59:11 - 59:17
    and the reason is, there’s no way
    that the voters can verify for themselves
  • 59:17 - 59:25
    that their votes were correctly counted,
    nor that false votes were not added.
  • 59:25 - 59:33
    In other activities of life, you can usually
    tell if somebody is trying to cheat you.
  • 59:33 - 59:37
    Consider for instance buying
    something from a store.
  • 59:37 - 59:40
    You order something, maybe
    you give a credit card number.
  • 59:40 - 59:46
    If the product doesn’t come,
    you can complain
  • 59:46 - 59:48
    and of course, if you’ve got
    a good enough memory,
  • 59:48 - 59:51
    you will notice if that product doesn’t come.
  • 59:52 - 59:59
    You’re not just giving total blind trust
    to the store, because you can check.
  • 59:59 - 60:02
    But in elections you can’t check.
  • 60:03 - 60:06
    I once saw a paper where someone described
  • 60:06 - 60:10
    a theoretical system for voting
  • 60:11 - 60:15
    which uses some sophisticated mathematics
  • 60:15 - 60:20
    so that people could check
    that their votes had been counted
  • 60:20 - 60:23
    even though everybody’s vote was secret
  • 60:23 - 60:27
    and they could also verify
    that false votes hadn’t been added.
  • 60:27 - 60:31
    It was very exciting, powerful mathematics;
  • 60:31 - 60:34
    but even if that mathematics is correct
  • 60:34 - 60:38
    that doesn’t mean the system
    would be acceptable to use in practice
  • 60:38 - 60:46
    because the vulnerabilities of a real
    system might be outside of that mathematics.
  • 60:46 - 60:50
    For instance, suppose you’re
    voting over the Internet
  • 60:50 - 60:54
    and suppose you’re using
    a machine that’s a zombie.
  • 60:55 - 60:59
    It might tell you that
    the vote was sent for A
  • 60:59 - 61:05
    while actually sending a vote for B.
    Who knows whether you’d ever find out?
  • 61:05 - 61:12
    In practice, the only way to see
    if these systems work and are honest
  • 61:12 - 61:23
    is through years, in fact decades, of trying
    them and checking in other ways what happened.
  • 61:26 - 61:33
    I wouldn’t want my country
    to be the pioneer in this.
  • 61:34 - 61:43
    So, use paper for voting. Make sure
    there are ballots that can be recounted.
  • 61:46 - 61:54
    The next threat to our freedom in a digital
    society comes from the war on sharing.
  • 61:55 - 62:00
    One of the tremendous benefits
    of digital technology
  • 62:00 - 62:08
    is that it is easy to copy published works
    and share these copies with others.
  • 62:09 - 62:15
    Sharing is good, and with
    digital technology, sharing is easy.
  • 62:16 - 62:19
    So, millions of people share.
  • 62:19 - 62:23
    Those who profit by having power
  • 62:23 - 62:28
    over the distribution of these
    works don’t want us to share.
  • 62:29 - 62:35
    And since they are businesses,
    governments which have betrayed their people
  • 62:35 - 62:41
    and work for the empire of mega-corporations
    try to serve those businesses
  • 62:41 - 62:46
    they are against their own people, they are
    for the businesses, for the publishers.
  • 62:46 - 62:53
    Well, that’s not good.
    And with the help of these governments
  • 62:53 - 62:58
    the companies have been waging war on sharing
  • 62:58 - 63:06
    and they’ve proposed a series
    of cruel draconian measures.
  • 63:06 - 63:09
    Why do they propose cruel draconian measures?
  • 63:09 - 63:14
    Because nothing less has a chance of success:
  • 63:14 - 63:19
    when something is good
    and easy, people do it.
  • 63:20 - 63:25
    And the only way to stop them
    is by being very nasty.
  • 63:25 - 63:31
    So of course, what they propose is nasty,
    nasty, and the next one is nastier.
  • 63:33 - 63:41
    So they tried suing teenagers for hundreds
    of thousands of dollars — that was pretty nasty.
  • 63:42 - 63:46
    And they tried turning
    our technology against us
  • 63:46 - 63:52
    Digital Restrictions Management,
    that means digital handcuffs.
  • 63:52 - 63:58
    But among the people
    there were clever programmers too
  • 63:58 - 64:01
    and they found ways to break the handcuffs.
  • 64:01 - 64:09
    For instance, DVDs were designed to have
    encrypted movies in a secret encryption format
  • 64:09 - 64:15
    and the idea was that all
    the programs to decrypt the video
  • 64:15 - 64:19
    would be proprietary with digital handcuffs.
  • 64:19 - 64:22
    They would all be designed
    to restrict the users.
  • 64:22 - 64:25
    And their scheme worked okay for a while.
  • 64:25 - 64:29
    But some people in Europe
    figured out the encryption
  • 64:29 - 64:37
    and they released a free program that
    could actually play the video on a DVD.
  • 64:40 - 64:44
    Well, the movie companies
    didn’t leave it there.
  • 64:44 - 64:52
    They went to the US congress and bought
    a law making that software illegal.
  • 64:52 - 64:59
    The United States invented
    censorship of software in 1998
  • 64:59 - 65:02
    with the Digital Millennium
    Copyright Act [DMCA].
  • 65:02 - 65:08
    So the distribution of that free program
    was forbidden in the United States.
  • 65:08 - 65:12
    Unfortunately it didn’t stop
    with the United States.
  • 65:12 - 65:21
    The European Union adopted a directive
    in 2003, I believe, requiring such laws.
  • 65:21 - 65:28
    The directive only says that
    commercial distribution has to be banned
  • 65:28 - 65:34
    but just about every country in
    the European Union has adopted a nastier law.
  • 65:35 - 65:40
    In France, the mere possession
    of a copy of that program
  • 65:40 - 65:46
    is an offense punished by
    imprisonment, thanks to Sarkozy.
  • 65:48 - 65:52
    I believe that was done by the law DADVSI.
  • 65:53 - 65:56
    I guess he hoped that with
    an unpronounceable name
  • 65:56 - 66:00
    people wouldn’t be able to criticize it.
  • 66:02 - 66:08
    So, elections are coming.
    Ask the candidates in the parties:
  • 66:08 - 66:14
    will you repeal the DADVSI?
    And if not, don’t support them.
  • 66:15 - 66:22
    You mustn’t give up lost
    moral territory forever.
  • 66:22 - 66:26
    You’ve got to fight to win it back.
  • 66:26 - 66:32
    So, we still are fighting
    against digital handcuffs.
  • 66:32 - 66:36
    The Amazon “Swindle” has digital handcuffs
  • 66:36 - 66:41
    to take away the traditional freedoms
    of readers to do things such as:
  • 66:41 - 66:46
    give a book to someone else,
    or lend a book to someone else.
  • 66:46 - 66:49
    That’s a vitally important social act.
  • 66:49 - 66:55
    That is what builds society
    among people who read: lending books.
  • 66:56 - 66:59
    Amazon doesn’t want to let
    people lend books freely.
  • 66:59 - 67:08
    And then there is also selling a book, perhaps
    to a used bookstore. You can’t do that either.
  • 67:12 - 67:18
    It looked for a while as if
    DRM had disappeared on music
  • 67:18 - 67:26
    but now they’re bringing it back
    with streaming services such as Spotify.
  • 67:26 - 67:31
    These services all require
    proprietary client software
  • 67:31 - 67:37
    and the reason is so they can put
    digital handcuffs on the users.
  • 67:38 - 67:45
    So, reject them! They already
    showed quite openly
  • 67:45 - 67:48
    that you can’t trust them,
    because first they said:
  • 67:48 - 67:51
    “you can listen as much as you like”,
    and then they said:
  • 67:51 - 67:55
    “Oh, no! You can only listen
    a certain number of hours a month.”
  • 67:58 - 68:07
    The issue is not whether that particular
    change was good or bad, just or unjust;
  • 68:07 - 68:11
    the point is, they have the power
    to impose any change in policies.
  • 68:11 - 68:14
    So don’t let them have that power.
  • 68:15 - 68:21
    You should have your own copy
    of any music you want to listen to.
  • 68:22 - 68:34
    And then came the next assault on our freedom:
    HADOPI, basically punishment on accusation.
  • 68:35 - 68:41
    It was started in France but it’s been
    exported to many other countries.
  • 68:42 - 68:50
    The United States now demands such unjust
    policies in its free exploitation treaties.
  • 68:50 - 69:00
    A few months ago, Colombia adopted such a law
    under orders from its masters in Washington.
  • 69:01 - 69:05
    Of course, the ones in Washington
    are not the real masters
  • 69:05 - 69:10
    they’re just the ones who control the
    United States on behalf of the Empire.
  • 69:11 - 69:19
    But they’re the ones who also dictate
    to Colombia on behalf of the Empire.
  • 69:22 - 69:28
    In France, since the
    Constitutional Council
  • 69:28 - 69:33
    objected to explicitly giving
    people punishment without trial
  • 69:33 - 69:37
    they invented a kind of trial
    which is not a real trial
  • 69:37 - 69:39
    which is just a form of a trial
  • 69:39 - 69:44
    so they can pretend that people
    have a trial before they’re punished.
  • 69:44 - 69:47
    But in other countries
    they don’t bother with that
  • 69:47 - 69:52
    it’s explicit punishment on accusation only.
  • 69:53 - 69:57
    Which means that for the sake
    of their war on sharing
  • 69:57 - 70:01
    they’re prepared to abolish
    the basic principles of justice.
  • 70:02 - 70:09
    It shows how thoroughly
    anti-freedom, anti-justice they are.
  • 70:09 - 70:12
    These are not legitimate governments.
  • 70:16 - 70:19
    And I’m sure they’ll
    come up with more nasty ideas
  • 70:19 - 70:27
    because they’re paid to defeat
    the people no matter what it takes.
  • 70:29 - 70:36
    Now, when they do this, they always say
    that it’s for the sake of the artists
  • 70:36 - 70:42
    that they have to “protect” the “creators”.
  • 70:42 - 70:45
    Now those are both propaganda terms.
  • 70:45 - 70:48
    I’m convinced that the reason
    they love the word “creators”
  • 70:48 - 70:53
    is because it is a comparison with a deity.
  • 70:54 - 70:57
    They want us to think of artists as super-human
  • 70:57 - 71:03
    and thus deserving special
    privileges and power over us
  • 71:04 - 71:07
    which is something I disagree with.
  • 71:07 - 71:15
    In fact though, the only artists that benefit
    very much from this system are the big stars.
  • 71:15 - 71:20
    The other artists are getting
    crushed into the ground
  • 71:20 - 71:24
    by the heels of these same companies.
  • 71:26 - 71:30
    But they treat the stars very well,
    because the stars have a lot of clout.
  • 71:30 - 71:34
    If a star threatens to move to
    another company, the company says:
  • 71:34 - 71:36
    “oh, we’ll give you what you want.”
  • 71:37 - 71:44
    But for any other artist they say: “you don’t
    matter, we can treat you any way we like.”
  • 71:47 - 71:54
    So the superstars have been corrupted by the
    millions of dollars or euros that they get
  • 71:54 - 72:01
    to the point where they’ll do
    almost anything for more money.
  • 72:02 - 72:06
    For instance, J. K. Rowling is a good example.
  • 72:08 - 72:13
    J. K. Rowling, a few years ago,
    went to court in Canada
  • 72:13 - 72:21
    and obtained an order that people who
    had bought her books must not read them.
  • 72:22 - 72:27
    She got an order telling
    people not to read her books.
  • 72:28 - 72:35
    Here’s what happened. A bookstore put
    the books on display for sale too early
  • 72:35 - 72:39
    before the day they
    were supposed to go on sale.
  • 72:39 - 72:41
    And people came into the store and said:
  • 72:41 - 72:45
    “oh, I want that!” and they bought it
    and took away their copies.
  • 72:45 - 72:51
    Then, they discovered the mistake
    so they took the copies off of display.
  • 72:52 - 73:02
    But Rowling wanted to crush any circulation
    of any information from those books
  • 73:02 - 73:05
    so she went to court, and the court ordered
  • 73:05 - 73:10
    those people not to read
    the books that they now owned.
  • 73:12 - 73:19
    In response, I call for a total
    boycott of Harry Potter.
  • 73:20 - 73:25
    But I don’t say you shouldn’t read
    those books or watch the movies
  • 73:25 - 73:30
    I only say you shouldn’t buy
    the books or pay for the movies.
  • 73:30 - 73:36
    I leave it to Rowling to tell
    people not to read the books.
  • 73:38 - 73:44
    As far as I’m concerned, if you borrow
    the book and read it, that’s okay.
  • 73:45 - 73:53
    Just don’t give her any money!
    But this happened with paper books.
  • 73:53 - 73:57
    The court could make this order
    but it couldn’t get the books back
  • 73:57 - 74:00
    from the people who had bought them.
  • 74:00 - 74:06
    Imagine if they were ebooks. Imagine if
    they were ebooks on the “Swindle”.
  • 74:06 - 74:12
    Amazon could send commands to erase them.
  • 74:16 - 74:25
    So, I don’t have much respect for stars
    who will go to such lengths for more money.
  • 74:26 - 74:31
    But most artists aren’t like that, they
    never got enough money to be corrupted.
  • 74:32 - 74:39
    Because the current system of copyright
    supports most artists very badly.
  • 74:39 - 74:47
    And so, when these companies demand
    to expand the war on sharing
  • 74:47 - 74:50
    supposedly for the sake of the artists
  • 74:50 - 74:54
    I’m against what they want but I would
    like to support the artists better.
  • 74:54 - 75:01
    I appreciate their work and I realize if we
    want them to do more work we should support them.
  • 75:01 - 75:05
    I have two proposals
    for how to support artists
  • 75:05 - 75:09
    methods that are compatible with sharing.
  • 75:09 - 75:15
    They would allow us to end the war
    on sharing and still support artists.
  • 75:15 - 75:18
    One method uses tax money.
  • 75:18 - 75:25
    We get a certain amount of public
    funds to distribute among artists.
  • 75:26 - 75:32
    But, how much should each artist get?
    We have to measure popularity.
  • 75:32 - 75:38
    The current system supposedly supports
    artists based on their popularity.
  • 75:38 - 75:45
    So I’m saying let’s keep that, let’s
    continue on this system based on popularity.
  • 75:45 - 75:48
    We can measure the popularity
    of all the artists
  • 75:48 - 75:54
    with some kind of polling or sampling,
    so that we don’t have to do surveillance.
  • 75:54 - 75:57
    We can respect people’s anonymity.
  • 75:57 - 76:01
    We get a raw popularity figure for each artist.
  • 76:01 - 76:05
    How do we convert that into an amount of money?
  • 76:05 - 76:12
    The obvious way is: distribute
    the money in proportion to popularity.
  • 76:13 - 76:21
    So if A is a thousand times as popular as B
    A will get a thousand times as much money as B.
  • 76:21 - 76:25
    That’s not efficient distribution of the money.
  • 76:25 - 76:28
    It’s not putting the money to good use.
  • 76:28 - 76:36
    It’s easy for a star A to be a thousand times
    as popular as a fairly successful artist B.
  • 76:36 - 76:44
    If we use linear proportion, we’ll give A
    a thousand times as much money as we give B.
  • 76:44 - 76:50
    And that means that, either we have
    to make A tremendously rich
  • 76:50 - 76:54
    or we are not supporting B enough.
  • 76:55 - 76:59
    The money we use to make
    A tremendously rich
  • 76:59 - 77:06
    is failing to do an effective job of
    supporting the arts; so, it’s inefficient.
  • 77:06 - 77:12
    Therefore I say: let’s use the cube root.
    Cube root looks sort of like this.
  • 77:12 - 77:18
    The point is: if A is a thousand
    times as popular as B
  • 77:18 - 77:22
    with the cube root A
    will get ten times as much as B
  • 77:23 - 77:26
    not a thousand times as much,
    just ten times as much.
  • 77:26 - 77:30
    The use of the cube root
    shifts a lot of the money
  • 77:30 - 77:36
    from the stars to the artists
    of moderate popularity.
  • 77:37 - 77:44
    And that means, with less money we can adequately
    support a much larger number of artists.
  • 77:44 - 77:49
    There are two reasons why this system
    would use less money than we pay now.
  • 77:49 - 77:53
    First of all because it would be
    supporting artists but not companies.
  • 77:53 - 78:02
    Second because it would shift the money from
    the stars to the artists of moderate popularity.
  • 78:02 - 78:09
    Now, it would remain the case that the more
    popular you are, the more money you get.
  • 78:10 - 78:17
    So the star A would still get more
    than B, but not astronomically more.
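The cube-root rule described above can be written out in a few lines. This is an illustrative sketch only: the popularity figures and the budget are hypothetical, standing in for the anonymous polling and public funds the talk proposes.

```python
# Sketch of the cube-root funding rule from the talk.
# Popularity numbers and budget are made up for illustration;
# in the proposal they would come from anonymous sampling and tax money.

def cube_root_shares(popularity, budget):
    """Split a fixed budget among artists in proportion to the
    cube root of each artist's measured popularity."""
    weights = {name: p ** (1 / 3) for name, p in popularity.items()}
    total = sum(weights.values())
    return {name: budget * w / total for name, w in weights.items()}

# Star A is a thousand times as popular as B, but receives only
# ten times as much, since the cube root of 1000 is 10.
shares = cube_root_shares({"A": 1000.0, "B": 1.0}, budget=110.0)
```

For comparison, a linear split of the same 110-euro budget would give A about 109.89 and B about 0.11; the cube root shifts most of that difference toward the less popular artist.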
  • 78:20 - 78:24
    That’s one method, and because
    it won’t be so much money
  • 78:24 - 78:27
    it doesn’t matter so much
    how we get the money.
  • 78:27 - 78:31
    It could be from a special tax
    on Internet connectivity
  • 78:31 - 78:36
    it could just be some of the general budget
    that gets allocated to this purpose.
  • 78:36 - 78:40
    We won’t care because
    it won’t be so much money;
  • 78:40 - 78:43
    much less than we’re paying now.
  • 78:44 - 78:48
    The other method I’ve proposed
    is voluntary payments.
  • 78:48 - 78:55
    Suppose each player had a button
    you could use to send one euro.
  • 78:56 - 79:01
    A lot of people would send it,
    after all it’s not that much money.
  • 79:01 - 79:06
    I think a lot of you might
    push that button every day
  • 79:06 - 79:12
    to give one euro to some artist
    who had made a work that you liked.
  • 79:13 - 79:15
    But nothing would demand this
  • 79:15 - 79:19
    you wouldn’t be required or ordered
    or pressured to send the money;
  • 79:19 - 79:22
    you would do it because you felt like it.
  • 79:22 - 79:25
    But there are some people
    who wouldn’t do it
  • 79:25 - 79:29
    because they’re poor and they
    can’t afford to give one euro.
  • 79:29 - 79:32
    And it’s good that they won’t give it.
  • 79:32 - 79:36
    We don’t have to squeeze money
    out of poor people to support the artists.
  • 79:36 - 79:41
    There are enough non-poor people
    who’ll be happy to do it.
  • 79:41 - 79:48
    Why wouldn’t you give one euro to some
    artists today, if you appreciated their work?
  • 79:48 - 79:56
    It’s too inconvenient to give it to them.
    So my proposal is to remove the inconvenience.
  • 79:56 - 80:02
    If the only reason not to give that euro is
    that you would have one euro less
  • 80:02 - 80:06
    you would do it fairly often.
  • 80:08 - 80:13
    So these are my two proposals
    for how to support artists
  • 80:13 - 80:19
    while encouraging sharing
    because sharing is good.
  • 80:19 - 80:25
    Let’s put an end to the war on sharing,
    laws like DADVSI and HADOPI.
  • 80:25 - 80:29
    It’s not just the methods
    that they propose that are evil
  • 80:29 - 80:31
    their purpose is evil.
  • 80:31 - 80:36
    That’s why they propose
    cruel and draconian measures.
  • 80:36 - 80:41
    They’re trying to do something
    that’s nasty by nature.
  • 80:41 - 80:45
    So let’s support artists in other ways.
  • 80:46 - 80:52
    The last threat to our freedom
    in digital society is the fact
  • 80:52 - 81:01
    that we don’t have a firm right
    to do the things we do, in cyberspace.
  • 81:02 - 81:06
    In the physical world,
    if you have certain views
  • 81:06 - 81:11
    and you want to give people copies
    of a text that defends those views
  • 81:12 - 81:16
    you’re free to do so. You could
    even buy a printer to print them
  • 81:16 - 81:21
    and you’re free to hand them out on the street
  • 81:21 - 81:25
    or you’re free to rent
    a store and hand them out there.
  • 81:25 - 81:29
    If you want to collect
    money to support your cause
  • 81:29 - 81:34
    you can just have a can and people
    could put money into the can.
  • 81:34 - 81:42
    You don’t need to get somebody else’s
    approval or cooperation to do these things.
  • 81:42 - 81:47
    But, on the Internet, you do need that.
  • 81:47 - 81:51
    For instance if you want
    to distribute a text on the Internet
  • 81:51 - 81:58
    you need companies to help you do it.
    You can’t do it by yourself.
  • 81:58 - 82:06
    So if you want to have a website, you need
    the support of an ISP or a hosting company
  • 82:06 - 82:09
    and you need a domain name registrar.
  • 82:09 - 82:14
    You need them to continue
    to let you do what you’re doing.
  • 82:14 - 82:21
    So you’re doing it effectively
    on sufferance, not by right.
  • 82:21 - 82:26
    And if you want to receive money,
    you can’t just hold out a can.
  • 82:26 - 82:33
    You need the cooperation
    of a payment company.
  • 82:38 - 82:46
    And we saw that this makes all of our
    digital activities vulnerable to suppression.
  • 82:46 - 82:49
    We learned this when the United States government
  • 82:49 - 82:54
    launched a distributed denial of service
    attack [DDoS] against WikiLeaks.
  • 82:54 - 83:01
    Now I’m making a bit of a joke because
    the words “distributed denial of service attack”
  • 83:01 - 83:05
    usually refer to a different kind of attack.
  • 83:05 - 83:09
    But they fit perfectly with
    what the United States did.
  • 83:09 - 83:16
    The United States went to the various kinds
    of network services that WikiLeaks depended on
  • 83:16 - 83:22
    and told them to cut off
    service to WikiLeaks. And they did.
  • 83:22 - 83:27
    For instance, WikiLeaks had rented
    a virtual Amazon server
  • 83:28 - 83:35
    and the US government told Amazon: “cut off
    service for WikiLeaks.” And it did, arbitrarily.
  • 83:36 - 83:41
    And then, WikiLeaks had certain domain names
    such as wikileaks.org
  • 83:42 - 83:47
    the US government tried to get
    all those domains shut off.
  • 83:47 - 83:53
    But it didn’t succeed, some of them were
    outside its control and were not shut off.
  • 83:55 - 84:02
    Then, there were the payment companies.
    The US went to PayPal and said:
  • 84:03 - 84:07
    “Stop transferring money to WikiLeaks
    or we’ll make life difficult for you.”
  • 84:07 - 84:11
    And PayPal shut off payments to WikiLeaks.
  • 84:11 - 84:18
    And then it went to Visa and Mastercard and
    got them to shut off payments to WikiLeaks.
  • 84:18 - 84:27
    Others started collecting money on WikiLeaks’
    behalf and their accounts were shut off too.
  • 84:27 - 84:31
    But in this case, maybe something can be done.
  • 84:32 - 84:38
    There’s a company in Iceland which began
    collecting money on behalf of WikiLeaks
  • 84:38 - 84:42
    and so Visa and Mastercard shut off its account;
  • 84:42 - 84:46
    it couldn’t receive money
    from its customers either.
  • 84:46 - 84:52
    Now, that business is suing Visa and
    Mastercard apparently under European Union law
  • 84:52 - 84:57
    because Visa and Mastercard
    together have a near-monopoly.
  • 84:57 - 85:02
    They’re not allowed to arbitrarily
    deny service to anyone.
  • 85:02 - 85:05
    Well, this is an example
    of how things need to be
  • 85:05 - 85:10
    for all kinds of services
    that we use on the Internet.
  • 85:11 - 85:17
    If you rented a store to hand out
    statements of what you think
  • 85:17 - 85:22
    or any other kind of information
    that you can lawfully distribute
  • 85:22 - 85:28
    the landlord couldn’t kick you out just
    because he didn’t like what you were saying.
  • 85:28 - 85:33
    As long as you keep paying the rent,
    you have the right to continue in that store
  • 85:33 - 85:37
    for a certain agreed-on period
    of time that you signed.
  • 85:37 - 85:41
    So you have some rights
    that you can enforce.
  • 85:41 - 85:44
    And they couldn’t shut off
    your telephone line
  • 85:44 - 85:48
    because the phone company
    doesn’t like what you said
  • 85:49 - 85:55
    or because some powerful entity didn’t like
    what you said and threatened the phone company.
  • 85:55 - 86:03
    No! As long as you pay the bills
    and obey certain basic rules
  • 86:03 - 86:11
    they can’t shut off your phone line.
    This is what it’s like to have some rights!
  • 86:11 - 86:18
    Well, if we move our activities from
    the physical world to the virtual world
  • 86:19 - 86:27
    then either we have the same rights in the
    virtual world, or we have been harmed.
  • 86:28 - 86:37
    So, the precarity of all our Internet activities
    is the last of the menaces I wanted to mention.
  • 86:37 - 86:46
    Now I’d like to say that for more information
    about free software, look at GNU.org.
  • 86:46 - 86:52
    Also look at fsf.org, which is the website
    of the Free Software Foundation.
  • 86:52 - 86:58
    You can go there and find many ways
    you can help us, for instance.
  • 86:58 - 87:04
    You can also become a member of the Free
    Software Foundation through that site
  • 87:04 - 87:11
    if you're willing to do e-commerce.
    If you'd like to join and pay cash
  • 87:11 - 87:14
    right here you can do that too.
    I've got cards you can fill out.
  • 87:14 - 87:24
    There is also the Free Software Foundation
    of Europe fsfe.org. You can join FSFE also.
  • 87:24 - 87:28
    Can you accept membership in cash?
  • 87:28 - 87:31
    Is there someone who wants to join right now?
  • 87:31 - 87:37
    Ok, so you can join FSFE also
    paying with cash.
  • 87:39 - 87:42
    Now it's time and, by the way
  • 87:42 - 87:48
    I know that in the case of the FSF
    we get most of our funds from members
  • 87:48 - 87:53
    so joining is really important
    and probably for FSFE as well.
  • 87:53 - 87:57
    Now it's time for me to raise funds
    in another way.
  • 87:57 - 88:04
    This is an adorable gnu
    that needs a home.
  • 88:05 - 88:11
    And I'm going to auction it on behalf
    of the Free Software Foundation.
  • 88:12 - 88:17
    If you buy the gnu, I'll sign
    a card for you, if you like
  • 88:18 - 88:22
    and if you have a penguin
    you need to get a gnu
  • 88:22 - 88:28
    because as we all know a penguin
    can hardly function without a gnu.
  • 88:31 - 88:38
    When you bid, please wave your arm
    and shout the quantity you are bidding
  • 88:38 - 88:41
    so that I notice you.
  • 88:41 - 88:44
    If you are bidding, I think you
    want me to notice that you're bidding.
  • 88:46 - 88:51
    The FSF can accept payments
    either in cash or with a credit card.
  • 88:51 - 88:56
    If the credit card works for ordering
    by telephone then it will work with us.
  • 88:57 - 89:03
    So, I'm going to start with 20 euros.
    Do I get 20 euros?
  • 89:04 - 89:07
    I've got 20 euros, do I have 25?
  • 89:09 - 89:12
    How much?
  • 89:16 - 89:20
    Trente? [Thirty?]
    Ok, I've got 30 euros, do I have 35?
  • 89:21 - 89:25
    How much?
    I've got 35, do I get 40?
  • 89:25 - 89:28
    I have 42
  • 89:32 - 89:36
    I have 42 euros, do I have 50?
  • 89:37 - 89:40
    How much?
    I've got 50, do I get 60?
  • 89:46 - 89:49
    I've got 60, do I get 70?
  • 89:52 - 89:56
    How much?
    I've got 70, do I get 80?
  • 90:02 - 90:05
    I've got 80, do I get 90?
  • 90:05 - 90:11
    I've got 80 euros, do I get 90
    for this adorable gnu?
  • 90:14 - 90:20
    How much?
    I've got 100 euros, do I get 110?
  • 90:25 - 90:29
    I've got 110, do I get 120?
  • 90:30 - 90:34
    I've got 120, do I get 130?
  • 90:39 - 90:46
    I've got 130, do I get 140?
  • 90:46 - 90:51
    How much?
    I've got 140, do I get 150?
  • 90:55 - 90:58
    I've got 150, do I get 160?
  • 90:58 - 91:02
    I've got 160, do I get 170?
  • 91:05 - 91:10
    How much?
    I've got 170, do I get 180?
  • 91:14 - 91:18
    Do you bid?
    I've got 200
  • 91:25 - 91:29
    I've got 200, do I get 210?
  • 91:33 - 91:36
    I've got 210, do I get 220?
  • 91:44 - 91:50
    I've got 220, do I get 230?
  • 91:50 - 91:55
    Do I get 230 for this adorable gnu
    that needs a home?
  • 91:55 - 92:00
    Do I get 230 to the Free Software
    Foundation to defend freedom?
  • 92:02 - 92:05
    Last chance to bid. How much?
  • 92:06 - 92:10
    I've got 230, do I get 240?
  • 92:14 - 92:17
    I've got 240, do I get 250?
  • 92:20 - 92:24
    For this adorable gnu
    to defend freedom?
  • 92:24 - 92:28
    How much?
    I've got 300
  • 92:32 - 92:44
    I've got 300, do I get 320 for this
    adorable gnu to defend freedom?
  • 92:46 - 92:53
    I've got 320, do I get 340?
  • 92:56 - 93:00
    How much? 340?
    I've got 340, do I get 360?
  • 93:04 - 93:08
    What? I've got 340.
  • 93:09 - 93:14
    No, no I don't want to go up by such
    small increments, we'll be here all night
  • 93:14 - 93:18
    I've got 340, do I get 360?
  • 93:20 - 93:23
    I've got 360, do I get 380?
  • 93:23 - 93:30
    I've got 380, do I get 400?
  • 93:30 - 93:34
    To defend freedom?
    For this adorable gnu, do I get 400?
  • 93:35 - 93:41
    I've got 380, do I get 400?
    Last chance to bid, 400 or more.
  • 93:41 - 93:46
    Last chance, going...
  • 93:46 - 93:54
    I've got 400, do I get 420?
  • 93:57 - 94:07
    How much?
    I've got 420, do I get 440?
  • 94:08 - 94:13
    Last chance to bid 440 or more
    for this adorable gnu.
  • 94:14 - 94:21
    Do I get 440?
    Last chance, going, going
  • 94:21 - 94:25
    sold for 420.
  • 94:29 - 94:38
    Please note that there are stickers
    which are gratuit [free] to take.
  • 94:38 - 94:40
    Please take as many as you can
    and make good use of them.
  • 94:40 - 94:49
    There are also various small things
    to sell, like badges and elegant pins
  • 94:49 - 94:52
    and the money supports
    the Free Software Foundation.
  • 94:52 - 94:53
    Anyway now it's time for me
    to answer questions.