Nicolas Dandrimont

http://blog.olasd.eu/

Hey! I'm a DD, a member of the [Outreach team](https://wiki.debian.org/Teams/Outreach), the maintainer of [mentors.debian.net](https://mentors.debian.net/) and of [fedmsg](http://www.fedmsg.com/en/latest/) in Debian.

I also happen to do most of the Google Summer of Code and Outreachy administrative duties. I'm apparently the current token Frenchman when it comes to Cheese and Wine parties.

In my spare time I'm also an engineer working on a Free Software preservation project at [Inria](http://www.inria.fr/en/) / [IRILL](http://www.irill.org/).

Accepted Talks:

20:30 in the Super Cow Hack Lab, Fuller Hall Residence

https://wiki.debconf.org/wiki/DebConf16/CheeseWineBoF

How do people trust a service that you run? They could blindly trust that you are both honest and competent, but they shouldn't have to, and you might not want to be trusted [0]. This BoF is meant for people who provide (or use?) services over the network to third parties, and want to do so as transparently as possible.

  • Step 0 is simply running free software: how could one trust the service without even knowing what it does?
    Yet, the software itself is not enough: whether it performs as intended (in the scope of the service) and securely depends on its configuration.
  • The next step should be to document the setup, and publish the documentation.
    Unfortunately, documentation will get out of sync with the configuration, as it is wont to do.
  • Then, it seems intuitive to publish the configuration (as in, the entire /etc directory), likely kept in version control. The solution I have used so far is simply keeping /etc versioned in git, using etckeeper, and automatically pushing the modifications to a public repository.
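
A rough sketch of what this can look like on a Debian system; the remote name, repository URL, and hook contents are assumptions rather than a prescribed setup:

```
# Put /etc under version control with etckeeper (which defaults to git on Debian)
apt install etckeeper
etckeeper init
etckeeper commit "Initial import of /etc"

# Add a public mirror (hypothetical URL) and push every new commit to it
cd /etc
git remote add public https://git.example.org/etc-config.git
cat > /etc/.git/hooks/post-commit <<'EOF'
#!/bin/sh
# Best-effort push after each etckeeper commit
git push public master || true
EOF
chmod +x /etc/.git/hooks/post-commit
```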

This approach also brings some unexpected benefits:

  • It makes it very easy to (re)build the same infrastructure, even if it involves several systems.
    This is a boon, both for automated testing and for users who decide they wish to self-host the same setup.
  • Users can actually contribute back configuration changes, as patches/pull requests.
    They are not passive consumers anymore.
  • It is extremely convenient to use this mechanism to manage the configuration of several servers running the same service (with the same config).
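
For that multi-server case, a minimal sketch, assuming the other servers were set up from (and track) the same public repository; the remote name, URL, and service name are hypothetical:

```
# On each additional server that tracks the published configuration
cd /etc
git pull public master         # fetch and merge the published changes
# Review what changed, then reload the affected services, for example:
systemctl reload nginx         # hypothetical service
```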

Some challenges I have been facing:

  • How to handle secrets? (cryptographic keys, credentials, ...)
  • How to handle automatic upgrades? (In the multi-server case)
  • How to prove to the user that the configuration that is published is what is actually being run?

[0] By some weird cultural quirk, “trusted” is usually seen as a positive adjective. Yet, “X is trusted” means “X can break your security if compromised”: the less trust there is in a system, the more resilient it is.

In the past few decades, software has become a critical part of every single bit of infrastructure running the world, from the tiniest devices we embed in our bodies to improve our health, to the biggest human creations. Software is the key to accessing all the digital information we're constantly creating, and therefore is an essential part of our cultural heritage. But software is just a bunch of bits. Unlike antique stone carvings, software gets lost, deleted, or corrupted.

Software Heritage has set out to build the biggest archive of free software ever conceived. Our mission is to collect, preserve, organise and facilitate the sharing of all the available free software. We are laying down foundations on which a wealth of applications can be built, ranging from cultural heritage, to research and industry.

We started working in May 2015, and (as of April 2016) we have archived 2.2 billion unique files, more than 480 million project revisions across more than 16 million data sources, among which Debian source packages from snapshot.debian.org, public GitHub repositories, and the GNU project's FTP archive.

This presentation will cover in more detail the why and the how of Software Heritage, as well as opportunities for the community to help us fulfill our goals.