Planet Debian
Updated: 2 days 14 hours ago

Andrew Cater: Chasing around installing CD images for Buster 10.1 ...

Yesterday, 08/09/2019 - 8:42pm
and having great fun, as ever, making a few mistakes and contributing mayhem and entropy to the CD release process. The Buster 10.1 point update was just released, thanks to RattusRattus, Sledge, Isy and Schweer (amongst others).

Waiting on the Stretch point release to try it all over again. I'd much rather be in Cambridge, but hey, you can't have everything.

Debian GSoC Kotlin project blog: Beginning of the end.

Yesterday, 08/09/2019 - 3:09pm
Work done.

Hey all, since the last post we have come a long way in packaging Kotlin 1.3.30. I am glad to announce that Kotlin 1.3.30's dependencies are completely packaged; only refining work on intellij-community-java (the source package of the IntelliJ-related jars that Kotlin depends on) and on Kotlin itself remains.

I have roughly packaged Kotlin (the debian folder is pretty much done) and have pushed it here. The bootstrap package can be found here.

The links to all the dependencies of Kotlin 1.3.30 can be found in my previous blog pages, but I'll list them here for the convenience of the reader.

1. java-compatibility-1.0.1 -> (DONE: here)
2. jps-model -> (DONE: here)
3. intellij-core -> (DONE: here)
4. streamex-0.6.7 -> (DONE: here)
5. guava-25.1 -> (DONE: used guava-19 from libguava-java)
6. lz4-java ->
7. libjna-java & libjna-platform-java recompiled in JDK 8 -> (DONE: commit)
8. liboro-java recompiled in JDK 8 -> (DONE: commit)
9. picocontainer-1.3 refining -> (DONE: here)
10. platform-api -> (DONE: here)
11. util -> (DONE: here)
12. platform-impl -> (DONE: here)
13. extensions -> (DONE: here)
14. jengeleman:shadow:4.0.3 -> (DONE)
15. trove4j 1.x -> (DONE)
16. proguard:6.0.3 in JDK 8 -> (DONE: released as libproguard-java 6.0.3-2)
17. io.javaslang:2.0.6 -> (DONE)
18. jline 3.0.3 -> (DONE)
19. protobuf-2.6.1 in JDK 8 -> (DONE)
20. com.jcabi:jcabi-aether:1.0 -> the file that requires this is commented out; can be seen here and here
21. org.sonatype.aether:aether-api:1.13.1 -> the file that requires this is commented out; can be seen here and here

Important Notes.

It should be noted that at this point in time, 8th September 2019, the Kotlin package only aims to package the jars generated by the ":dist" task of the Kotlin build scripts. This task builds the Kotlin home. So that's all we have: we don't have the kotlin-gradle-plugins or kotlinx or anything else that isn't part of the Kotlin home.

It can be noted that the Kotlin bootstrap package has the kotlin-gradle-plugin, kotlinx and kotlin-dsl jars. The eventual plan is to build kotlin-gradle-plugins and kotlinx from the Kotlin source itself, and to build kotlin-dsl from the Gradle source using Kotlin as a dependency of Gradle. After we do that, we can get rid of the Kotlin bootstrap package.

It should also be noted that, as of 8th September 2019, this Kotlin package may not be perfect and might contain a ton of bugs, for two reasons: partly because I have ignored some code that depended on jcabi-aether (mentioned above with a link to the commit), and mostly because the platform-api.jar and platform-impl.jar from intellij-community-idea are not the same as their upstream counterparts but are the minimum files required to make Kotlin compile without errors. I did this because they would have needed new dependencies packaged, and at the time that didn't look worth it.

Work left to be done.

Now I believe most of the building blocks of packaging Kotlin are done, and what's left is to remove this pesky bootstrap. I believe this can be counted as the completion of my GSoC (which officially ended on August 26). The tasks left are as follows:

Major Tasks.
  1. Make Kotlin build using just openjdk-11-jdk; it currently builds with openjdk-8-jdk and openjdk-11-jdk.
  2. Build kotlin-gradle-plugins.
  3. Build kotlinx.
  4. Build kotlindsl from gradle.
  5. Do 2, 3 and 4, and make Kotlin build without the bootstrap.
Things that will help the kotlin effort.
  1. Refine intellij-community-idea and do its copyright file properly.
  2. Import Kotlin 1.3.30 into a new debian-java-maintainers repository.
  3. Move the Kotlin changes (now maintained as git commits) to quilt patches. Link to Kotlin -> here.
  4. Do Kotlin's copyright file.
  5. Refine Kotlin.
Author's Notes.

Hey guys, it's been a wonderful ride so far. I hope to keep doing this and maintain Kotlin in Debian. I am only a final-year student, and my career fair starts this October 17th, 2019, so I have to prepare for coding interviews and start searching for jobs. So until late November 2019 I'll only be taking on the smaller tasks. Please note that I won't be doing them as fast as I have up until now, since I am going to be a little busy during this period. I hope I can land a job that lets me keep doing this :)

I would love to take this section to thank _hc, ebourg, andrewsh and seamlik for helping and mentoring me through all this.

So if any of you want to help please kindly take on any of these tasks.

!!NOTE: ping me if you want to build Kotlin on your system and are stuck!!

You can find me as m36 or m36[m] on #debian-mobile and #debian-java on OFTC.

I'll try to maintain this blog and post the major updates.

Dima Kogan: Are planes crashing any less than they used to?

Yesterday, 08/09/2019 - 1:25am

Recently, I've been spending more of my hiking time looking for old plane crashes in the mountains. And I've been looking for data that helps me do that, for instance the last post. A question that came up in conversation is: "are crashes getting more rare?" And since I now have several datasets at my disposal, I can very easily come up with a crude answer.

The last post describes how to map the available NTSB reports describing aviation incidents. I was only using the post-1982 reports in that project, but here let's also look at the older reports. Today I can download both from their site:

$ wget
$ unzip     # <------- Post 1982
$ wget
$ unzip     # <------- Pre 1982

I import the relevant parts of each of these into sqlite:

$ ( mdb-schema avall.mdb sqlite -T events;
    echo "BEGIN;";
    mdb-export -I sqlite avall.mdb events;
    echo "COMMIT;"; ) | sqlite3 post1982.sqlite

$ ( mdb-schema PRE1982.MDB sqlite -T tblFirstHalf;
    echo "BEGIN;";
    mdb-export -I sqlite PRE1982.MDB tblFirstHalf;
    echo "COMMIT;"; ) | sqlite3 pre1982.sqlite

And then I pull out the incident dates, and make a histogram:

$ cat <(sqlite3 pre1982.sqlite 'select DATE_OCCURRENCE from tblFirstHalf') \
      <(sqlite3 post1982.sqlite 'select ev_date from events') |
  perl -pe 's{^../../(..) .*}{$1 + (($1<40)? 2000: 1900)}e' |
  feedgnuplot --histo 0 --binwidth 1 --xmin 1960 --xlabel Year \
              --title 'NTSB-reported incident counts by year'

I guess by that metric everything is getting safer. This clearly just counts NTSB incidents, and I don't do any filtering by the severity of the incident (not all reports describe crashes), but it's close enough. The NTSB only deals with civilian incidents in the USA, and only after the early 1960s, it looks like. Any info about the military?

At one point I went through "Historic Aircraft Wrecks of Los Angeles County" by G. Pat Macha, and listed all the incidents described in that book. The histogram of that dataset looks like this:

Aaand there're a few internet resources that list out significant incidents in Southern California. For instance:

I visualize that dataset:

$ < [abc].htm perl -nE \
    '/^ \s* 19(\d\d) |
      \d\d \s*(?:\s|-|\/)\s* \d\d \s*(?:\s|-|\/)\s* (\d\d)[^\d]/x || next;
     $y = 1900+($1 or $2);
     say $y unless $y==1910' |
  feedgnuplot --histo 0 --binwidth 5

So what did we learn? I guess overall crashes are becoming more rare. There was a glut of military incidents in the 1940s and 1950s in Southern California (not surprising, given all the military bases and aircraft construction facilities here at that time). And by one metric there were lots of incidents in the late 1970s/early 1980s, but they were much more interesting to this "carcomm" person than they were to Pat Macha.

Andreas Metzler: exim update

Sat, 07/09/2019 - 6:39am

Testing users might want to manually pull the latest (4.92.1-3) upload of Exim from sid instead of waiting for regular migration to testing. It fixes a nasty vulnerability.

Iustin Pop: Nationalpark Bike Marathon 2019

Fri, 06/09/2019 - 8:00pm

This is a longer story… but I think it’s interesting, nonetheless.

The previous 6 months

Due to my long-standing foot injury, my training was sporadic at the end of 2018, and by February I realised I would have to stop most if not all training in order to have a chance of recovery. I knew that meant no bike races this year, and I was fine with that. Well, I had to be.

The only compromise was that I wanted to do one race, the NBM short route, since that is really easy (in both technique and endurance), and should be fine even untrained.

So my fitness (well, CTL) went down and down and down, and my weight didn’t improve either.

As April and May went by, my foot was getting better, but training on the indoor trainer was still causing problems and setting my recovery back, so I took it easy.

By June things were looking better - I was even able to do a couple of slow runs! July started even better, but trainer sessions were still a no-go. Only in early August could I reliably do a short trainer session without issues. The good side was that since around June I could bike to work and back without problems, but that's a short commute.

But, I felt confident I could do ~50Km on a bike with some uphill, so I registered for the race.

In August I could also restart trainer sessions, and to my pleasant surprise, even harder ones. So, I started preparing for the race in the last 2 weeks before it :( Well, better than nothing.

Overall, my CTL went from 62-65 in August 2018, to 6 (six!!) in early May, and started increasing in June, reaching a peak of 23 on the day before the race. That's almost three times lower… so my expectations for the race were low. Especially as the longest ride I did in these 6 months was about an hour, whereas the race is double that.

The race week

Things were going quite well. I also started doing some of the Zwift Academy workouts, more precisely numbers 1 and 2, which are at low power, and everything went well.

On Wednesday, however, I did workout number 3, which has two "as hard as possible" intervals. These are exactly the kind my foot can't yet handle, so it caused some pain, and some concern about the race.

Then we went to Scuol, and I didn't feel very well on Thursday while I was driving. I thought some exercise would help, so I went for a short run, which reminded me that I had made my foot worse the previous day, and I became even more concerned.

On Friday morning, instead of better, I felt terrible. All I wanted was to go back to bed and sleep the whole day, but I knew that would mean no race tomorrow. I thought maybe some light walking would be better for me than lying in bed… At least I didn't have a runny nose or a cough, but this definitely felt like a cold.

We went up with the gondola, walked ~2km, got back down, and I was feeling at least no worse. All this time I was overdressed and feeling cold, while everybody else was in t-shirts.

A bit of rest in the afternoon helped; I went and picked up my race number and felt better. After dinner, as I was preparing my stuff for the next morning, I started feeling a bit better about doing the race. "Did not start" was now out of the question, but whether it would be a DNF was not clear yet.

Race (day)

Thankfully the race doesn’t start early for this specific route, so the morning was relatively relaxed. But then of course I was late a few minutes, so I hurried on my bike to the train station, only to realise I’m among the early people. Loading bike, get on the bus (the train station in Scuol is off-line for repairs), long bus ride to start point, and then… 2 hours of waiting. And to think I thought I’m 5 minutes late :)

I spent the two hours just reading email and browsing the internet (and posting a selfie on FB), and then finally it was on.

And I was surprised how "on" the race was from the first 100 meters. Despite repeated announcements during those two hours that the first 2-3km do not matter, since they're through the S-chanf village, people started going very hard as soon as there was some space.

So I found myself going 40km/h (forty!!!) on a mountain bike on a relatively flat gravel road. This sounds unbelievable, right? But the data says:

  • race started at 1’660m altitude
  • after the first 4.4km, I was at 1’650m, with a total altitude gain of 37m (and thus a total descent of 47m); thus, not flat-flat, but not downhill
  • over this 4.4km, my average speed was 32.5km/h, and that includes starting from zero speed, and in the block (average speed for the first minute was 20km/h)

While 32.5km/h on an MTB sounds cool, the sad part was that I knew this was unsustainable, both from the pure speed point of view, and from the heart rate point of view. I was already at 148bpm after 2½ minutes, but then at minute 6 it went over 160bpm and stayed that way. That is above my LTHR (estimated by various Garmin devices), so I was dipping into reserves. VeloViewer estimates power output here was 300-370W in these initial minutes, which is again above my FTP, so…

But, it was fun. Then at 4.5km came a bit of a climb (800m long, 50m altitude, ~6.3%), after which it became mostly flow on gravel. And for the next hour, until the single long climb (towards Guarda), it was the best ride I had this year, and one of the best segments in races in general. Yes, there are a few short climbs here and there (e.g. a 10% one over ~700m, another ~11% one over 300m or so), but in general it's a slowly descending route from ~1700m altitude to almost 1400m (plus add in another ~120m gained), so ~420m of descent over ~22km. This means that, despite the short climbs, average speed is still good - a bit more than 25km/h, which made this a very, very nice segment. No foot pain, no exertion, mean heart rate 152bpm, which is fine. Estimated power is a bit high (mean 231W, NP: 271W ← this is strange, too high); I'd really like to have a power meter on my MTB as well.

Then, after about an hour, the climb towards Guarda starts. It’s an easy climb for a fit person, but as I said I was worse for fitness this year, and also my weight was not good. Data for this segment:

  • it took me 33m:48s
  • 281m altitude gained
  • 4.7km length
  • mean HR: 145bpm
  • mean cadence: 75rpm

I remember stopping to drink once, and maybe another time to rest for about half a minute, but not sure. I stopped in total 33s during this half hour.

Then slowly descending on nice roads towards the next small climb to Ftan, then another short climb (thankfully short, as I was pretty dead at this point) of 780m distance, 7m36s including almost a minute of stopping, then down, another tiny climb, then down to the finish.

At the finish, knowing that there’s a final climb after you descend into Scuol itself and before the finish, I gathered all my reserves to do the climb standing. Alas, it was a bit longer than I thought; I think I managed to do 75-80% of it standing, but then sat down. Nevertheless, a good short climb:

  • 22m altitude over 245m distance, done in 1m02s
  • mean grade 8.8%, max grade 13.9%
  • mean HR 161bpm, max HR 167bpm which actually was my max for this race
  • mean speed 14.0km/h
  • estimated mean power 433W, NP: 499W; seems optimistic, but I'll take it :)

Not bad, not bad. I was pretty happy about being able to push this hard, for an entire minute, at the end of the race. Yay for 1m power?

And obligatory picture, which also shows the grade pretty well:

Final climb! And beating my PR by ~9%

I don’t know how the photographer managed to do it, but having those other people in the picture makes it look much better :)

Comparison with last year

Let’s first look at official race results:

  • 2018: 2h11m37s
  • 2019: 2h22m13s

That’s 8% slower. Honestly, I thought I will do much worse, given my lack of training. Or does a 2.5× lower CTL only result in 8% time loss?

Honestly, I don’t think so. I think what saved me this year was that—since I couldn’t do bike rides—I did much more cross-train as in core exercises. Nothing fancy, just planks, push-ups, yoga, etc. but it helped significantly. If my foot will be fine and I can do both for next year, I’ll be in a much better position.

And this is why the sub-title of this post is “Fitness has many meanings”. I really need to diversify my training in general, but I was thinking in a somewhat theoretical way about it; this race showed it quite practically.

If I look at Strava data, it gives an even clearer picture:

  • on the 1-hour flat segment I was talking about, which I really loved, I got a PR, beating the previous year by 1 minute; Strava estimates 250W for this hour, which is what my FTP was last year;
  • on all the climbs, I was slower than last year, as expected, but on the longer climbs significantly so; and I was many times slower than even in 2016, when I did the next-longer route.

And I just realise, of the 10½m I took longer this year, 6½m I lost on the Guarda climb :)

So yes, you can't discount fitness, but leg fitness is not everything, and it seems Training Peaks can't show overall fitness.

At least I did beat my PR on the finishing climb (1m12s vs. 1m19s last year), because I had left aside those final reserves for it.

Next steps

Assuming I’m successful at dealing my foot issue, and that early next year I can restart consistent training, I’m not concerned. I need to put in regular session, I also need to put in long sessions. The success story here is clear, it all depends on willpower.

Oh, and losing ~10kg of fat wouldn’t be bad, like at all.

Reproducible Builds: Reproducible Builds in August 2019

Fri, 06/09/2019 - 1:33pm

Welcome to the August 2019 report from the Reproducible Builds project!

In these monthly reports we outline the most important things that have happened in the world of Reproducible Builds and what we have been up to.

As a quick recap of our project, whilst anyone can inspect the source code of free software for malicious flaws, most software is distributed to end users or systems as precompiled binaries. The motivation behind the reproducible builds effort is to ensure zero changes have been introduced during these compilation processes. This is achieved by promising identical results are always generated from a given source thus allowing multiple third-parties to come to a consensus on whether a build was changed or even compromised.
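The idea can be illustrated with a toy "build" (this is only a sketch of the principle, not a real package build): an artifact that embeds the current time differs on every run, while pinning that timestamp to an agreed value, as the SOURCE_DATE_EPOCH convention does, lets independent builders produce bit-for-bit identical results.

```shell
# Hypothetical one-line "build" that embeds a timestamp in its output.
build() { printf 'payload\nbuilt: %s\n' "$1" > "$2"; }

# Embedding the wall clock: two runs produce different artifacts.
build "$(date +%s%N)" out1
build "$(date +%s%N)" out2
cmp -s out1 out2 || echo 'builds differ'

# Pinning the embedded date to a fixed, agreed value: the artifacts match,
# so any third party can verify the result by rebuilding.
SOURCE_DATE_EPOCH=1567900800
build "$SOURCE_DATE_EPOCH" out1
build "$SOURCE_DATE_EPOCH" out2
cmp -s out1 out2 && echo 'builds identical'
```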

In August’s report, we cover:

  • Media coverage & events: Webmin, CCCamp, etc.
  • Distribution work: the first fully-reproducible package sets, openSUSE update, etc.
  • Upstream news: libfaketime updates, gzip, ensuring good definitions, etc.
  • Software development: more work on diffoscope, new variations in our testing framework, etc.
  • Misc news: from our mailing list, etc.
  • Getting in touch: how to contribute, etc.

If you are interested in contributing to our project, please visit our Contribute page on our website.

Media coverage & events

A backdoor was found in Webmin, a popular web-based application used by sysadmins to remotely manage Unix-based systems. Whilst more details can be found on upstream’s dedicated exploit page, it appears that the build toolchain was compromised. Especially of note is that the exploit “did not show up in any Git diffs” and thus would not have been found via an audit of the source code. The backdoor would allow a remote attacker to execute arbitrary commands with superuser privileges on the machine running Webmin. Once a machine is compromised, an attacker could then use it to launch attacks on other systems managed through Webmin or indeed any other connected system. Techniques such as reproducible builds can help detect exactly these kinds of attacks that can lay dormant for years. (LWN comments)

In a talk titled There and Back Again, Reproducibly! Holger Levsen and Vagrant Cascadian presented at the 2019 edition of the Linux Developer Conference in São Paulo, Brazil on Reproducible Builds.

LWN posted and hosted an interesting summary and discussion on Hardening the file utility for Debian. In July, Chris Lamb had cross-posted his reply to the “Re: file(1) now with seccomp support enabled” thread, originally started on the debian-devel mailing list. In this post, Chris refers to our strip-nondeterminism tool not being able to accommodate the additional security hardening in file(1) and the changes made to the tool in order to fix this issue which was causing a huge number of regressions in our testing framework.

The Chaos Communication Camp — an international, five-day open-air event for hackers that provides a relaxed atmosphere for free exchange of technical, social, and political ideas — hosted its 2019 edition, where there were many discussions and meet-ups at least partly related to Reproducible Builds. This included the titular Reproducible Builds Meetup session, which was attended by around twenty-five people, half of whom were new to the project, as well as a session dedicated to all Arch Linux related issues.

Distribution work

In Debian, the first “package sets” — ie. defined subsets of the entire archive — have become 100% reproducible, including the so-called “essential” set for the bullseye distribution on the amd64 and armhf architectures. This is thanks to work by Chris Lamb on bash, readline and other low-level libraries and tools. Perl still has issues on i386 and arm64, however.

Dmitry Shachnev filed a bug report against the debhelper utility that speaks to issues around using the date from the debian/changelog file as the source for the SOURCE_DATE_EPOCH environment variable, as this can lead to non-intuitive results when a package is automatically rebuilt via a so-called binary (NB. not “source”) NMU. A related issue was later filed against qtbase5-dev by Helmut Grohne, as this exact issue led to a problem with co-installability across architectures.
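For background, the value in question is derived from the RFC 2822 date in the latest debian/changelog entry; here is a minimal sketch of that derivation using GNU date and a hypothetical changelog date (in a real build, dpkg-parsechangelog supplies the date):

```shell
# Hypothetical trailer date from a debian/changelog entry.
changelog_date='Sun, 08 Sep 2019 20:42:00 +0200'

# GNU date converts it to the epoch value exported as SOURCE_DATE_EPOCH.
# A binary NMU rewrites the latest changelog entry, so this value changes
# too, which is the source of the non-intuitive results described above.
SOURCE_DATE_EPOCH=$(date -u -d "$changelog_date" +%s)
export SOURCE_DATE_EPOCH
echo "$SOURCE_DATE_EPOCH"   # prints: 1567968120
```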

Lastly, 115 reviews of Debian packages were added, 45 were updated and 244 were removed this month, appreciably adding to our knowledge about identified issues. Many issue types were updated by Chris Lamb, including embeds_build_data_via_node_preamble, embeds_build_data_via_node_rollup, captures_build_path_in_beam_cma_cmt_files, captures_varying_number_of_build_path_directory_components (discussed later), timezone_specific_files_due_to_haskell_devscripts, etc.

Bernhard M. Wiedemann posted his monthly Reproducible Builds status update for the openSUSE distribution. New issues were found from enabling Link Time Optimization (LTO) in this distribution’s Tumbleweed branch. This affected, for example, nvme-cli as well as perl-XML-Parser and pcc with packaging issues.

Upstream news

Software development

Upstream patches

The Reproducible Builds project detects, dissects and attempts to fix as many currently-unreproducible packages as possible. We endeavour to send all of our patches upstream where appropriate. In August we wrote a large number of such patches, including:


diffoscope is our in-depth and content-aware diff utility that can locate and diagnose reproducibility issues. It is run countless times a day on our testing infrastructure and is essential for identifying fixes and causes of non-deterministic behaviour.

This month, Chris Lamb made the following changes:

  • Improvements:
    • Don’t fall back to an unhelpful raw hexdump when, for example, readelf(1) reports a minor issue in a section of an ELF binary. For example, when the .frames section is of the NOBITS type, its contents are apparently “unreliable” and thus readelf(1) returns 1. (#58, #931962)
    • Include either standard error or standard output (not just the latter) when an external command fails. []
  • Bug fixes:
    • Skip calls to unsquashfs when we are neither root nor running under fakeroot. (#63)
    • Ensure that all of our artificially-created subprocess.CalledProcessError instances have output instances that are bytes objects, not str. []
    • Correct a reference to parser.diff; diff in this context is a Python function in the module. []
    • Avoid a possible traceback caused by a str/bytes type confusion when handling the output of failing external commands. []
  • Testsuite improvements:

    • Test for 4.4 in the output of squashfs -version, even though the Debian package version is 1:4.3+git190823-1. []
    • Apply a patch from László Böszörményi to update the squashfs test output and additionally bump the required version for the test itself. (#62 & #935684)
    • Add the wabt Debian package to the test-dependencies so that we run the WebAssembly tests on our continuous integration platform, etc. []
  • Improve debugging:
    • Add the containing module name to the (eg.) “Using StaticLibFile for ...” debugging messages. []
    • Strip off trailing “original size modulo 2^32 671” (etc.) from gzip compressed data as this is just a symptom of the contents itself changing that will be reflected elsewhere. (#61)
    • Avoid a lack of space between “... with return code 1” and “Standard output”. []
    • Improve debugging output when instantiating our Comparator object types. []
    • Add a literal “eg.” to the comment on stripping “original size modulo...” text to emphasise that the actual numbers are not fixed. []
  • Internal code improvements:
    • No need to parse the section group from the class name; we can pass it via type built-in kwargs argument. []
    • Add support to Difference.from_command_exc and friends to ignore specific returncodes from the called program and treat them as “no” difference. []
    • Simplify parsing of optional command_args argument to Difference.from_command_exc. []
    • Set long_description_content_type to text/x-rst to appease the linter. []
    • Reposition a comment regarding an exception within the indented block to match Python code convention. []
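The “original size modulo 2^32” text mentioned above comes from file(1)’s description of gzip data, which records the uncompressed size in its trailer; a quick way to see it (the exact wording depends on your file(1) version):

```shell
# Compress six bytes; gzip's trailer stores the original size, and
# file(1) reports it (on recent versions as "original size modulo 2^32").
printf 'hello\n' | gzip -n > demo.gz
file demo.gz
gzip -l demo.gz   # the "uncompressed" column shows 6
```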

In addition, Mattia Rizzolo made the following changes:

  • Now that we install wabt, expect its tools to be available. []
  • Bump the Debian backport check. []

Lastly, Vagrant Cascadian updated diffoscope to versions 120, 121 and 122 in the GNU Guix distribution.


strip-nondeterminism is our tool to remove specific non-deterministic results from a completed build. This month, Chris Lamb made the following changes.

  • Add support for enabling and disabling specific normalizers via the command line. (#10)
  • Drop accidentally-committed warning emitted on every fixture-based test. []
  • Reintroduce the .ar normalizer [] but disable it by default so that it can be enabled with --normalizers=+ar or similar. (#3)
  • In verbose mode, print the normalizers that strip-nondeterminism will apply. []

In addition, there was some movement on an issue in the Archive::Zip Perl module that strip-nondeterminism uses regarding the lack of support for bzip compression that was originally filed in 2016 by Andrew Ayer.

Test framework

We operate a comprehensive Jenkins-based testing framework that powers tests.reproducible-builds.org.

This month Vagrant Cascadian suggested, and subsequently implemented, that we additionally test a varying build directory of different string lengths (eg. /path/to/123 vs /path/to/123456); we also vary the number of directory components within this (eg. /path/to/dir vs. /path/to/parent/subdir). Curiously, whilst it was a priori believed that this was rather unlikely to yield differences, Chris Lamb has managed to identify approximately twenty packages that are affected by this issue.
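A toy analogue of the problem (a hypothetical archive, not a real package build): an artifact that records its input path is sensitive both to the path's length and to its number of directory components, even when the file contents and all other metadata are pinned.

```shell
# Two identical files under paths with different directory depth.
mkdir -p bp/dir bp/parent/subdir
echo 'same content' > bp/dir/f
echo 'same content' > bp/parent/subdir/f

# Pin mtime and ownership so the stored member path is the only variable.
tar --mtime=@0 --owner=0 --group=0 --numeric-owner -cf a.tar -C bp dir/f
tar --mtime=@0 --owner=0 --group=0 --numeric-owner -cf b.tar -C bp parent/subdir/f
cmp -s a.tar b.tar || echo 'artifacts differ'
# prints: artifacts differ
```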

It was also noticed that our testing of the Coreboot free software firmware fails to build the toolchain since we switched to building on the Debian buster distribution. The last successful build was on August 7th but all newer builds have failed.

In addition, the following code changes were performed in the last month:

  • Chris Lamb: Ensure that the size of the log for the second build in HTML pages is also correctly formatted (eg. “12KB” vs “12345”). []

  • Holger Levsen:

  • Mathieu Parent: Update the contact details for the Debian PHP Group. []

  • Mattia Rizzolo:

The usual node maintenance was performed by Holger Levsen [][] and Vagrant Cascadian [].

Misc news

There was yet more effort put into our website this month, including misc copyediting by Chris Lamb [], Mathieu Parent referencing his fix for php-pear [] and Vagrant Cascadian updating a link to his homepage [].

On our mailing list this month, Santiago Torres Arias started a “Setting up a MS-hosted rebuilder with in-toto metadata” thread regarding Microsoft’s interest in setting up a rebuilder for Debian packages, touching on issues of transparency logs and the integration of in-toto by the Secure Systems Lab at New York University. In addition, Lars Wirzenius continued a conversation regarding various questions about reproducible builds and their bearing on building a distributed continuous integration system.

Lastly, in a thread titled Reproducible Builds technical introduction tutorial Jathan asked whether anyone had some “easy” Reproducible Builds tutorials in slides, video or written document format.

Getting in touch

If you are interested in contributing to the Reproducible Builds project, please visit the Contribute page on our website. Alternatively, you can get in touch with us via:

This month’s report was written by Bernhard M. Wiedemann, Chris Lamb, Eli Schwartz, Holger Levsen, Jelle van der Waa, Mathieu Parent and Vagrant Cascadian. It was subsequently reviewed by a bunch of Reproducible Builds folks on IRC and the mailing list.

Tim Retout: PA Consulting

Wed, 04/09/2019 - 6:37pm

In early October, I will be saying goodbye to my colleagues at CV-Library after 7.5 years, and joining PA Consulting in London as a Principal Consultant.

Over the course of my time at CV-Library I have got married, had a child, and moved from Southampton to Bedford. I am happy to have played a part in the growth of CV-Library as a leading recruitment brand in the UK, especially helping to make the site more reliable - I can tell more than a few war stories.

Most of all I will remember the people. I still have much to learn about management, but working with such an excellent team, the years passed very quickly. I am grateful to everyone, and wish them all every future success.

Candy Tsai: Beyond Outreachy: Final Interviews and Internship Review

Wed, 04/09/2019 - 4:30am

The last few weeks (week 11 – week 13) of Outreachy were probably the hardest weeks. I had to do 3 informational interviews with the goal of getting a better picture of the open source/free software industry.

The thought of talking to someone I don’t even know just overwhelms me, so this assignment left me scared to death. Pressing that “Send Email” button to these interviewees required me to summon up all of my courage, but it was totally worth it. I really appreciate them taking the time to chat with me.

On the other hand, it’s hard to believe the internship is coming to an end! The good news is that I will be sticking around Debci after this.

Informational Interviews

The theme for week 11 was “Making connections”, so I had to reach out to 3 people beyond my network for an informational interview. I’d rather just call it an informational chat so it doesn’t sound too scary. My goal was to learn more about how companies involved with open source survive and how others work remotely. Therefore, my criteria for the interviewees were really simple, but not so easy to fulfil:

  • Lives in Taiwan
  • Works remotely
  • Their company is dedicated to open source/free software

At last I was really lucky to have them for my final assignment:

  • Andrew Lee: also part of the Debian community, has been working on open source for more than 20 years in Taiwan, works at Collabora, an open source consulting company
  • James Tien: works at Automattic, a company known for working on WordPress, link to his blog here, it’s in Chinese
  • Gordon Tai: works at Ververica, a company known for working on Apache Flink

A big thanks to them and to terceiro who guided me through this. During my search, it was hard to find someone working for a local company here in Taiwan that fulfilled my criteria.

I have organized and summarized below:

Staying in Open Source
  • Passion is needed for coding and open source; you have to really enjoy it to stay for the long run
  • Opportunities come unexpectedly; you never know when or how they will come to you
  • Write “code”
Remote work
  • People can still sense your ups and downs through your chat messages and facial expressions in video calls
  • Communication is much more important than the actual code itself; sometimes you spend more time speaking than coding
  • You can use a pomodoro timer to help focus, or try working different hours
  • Try working in different environments: a café, under a tree, in the forest, beside the ocean, etc.
  • Exercise, exercise, exercise!

The points above are very general, but the stories and experiences I heard were what made them special. You can find out for yourself by doing your own informational interviews!

Internship Review

Last but not least, here’s a wrap-up of my internship in Q&A format. I hope this helps anyone who wants to participate in future rounds get a better picture of what an Outreachy internship with Debian’s Debci is like.

What communication skills have you learned during the internship?

Asking questions and leaving comments. Since I am not a user of Debci, I started with absolutely zero knowledge. I even had to write a blog post to help me clarify what all the terminology meant, so I can come back to it if I forget in the future. I asked lots of questions, and luckily my mentors were really patient. As we only had a video chat once per week, we discussed mostly through comments on merge requests and issues. Sometimes I found it hard to convey my thoughts with just words (or images), so this was really good practice.

What technical skills have you learned during the internship?

I only started writing Ruby because of this internship. Also, I wrote my first Vagrantfile. In general, I think getting familiar with the code base was the best part.

How did your Outreachy mentor help you along the way?

My mentor reviewed my code thoroughly and guided me through the whole internship. We did pair programming sessions, and that was really helpful.

What was one amazing thing that happened during the internship?

The informational interviews were pretty horrifying and at the same time amazing. It had never really occurred to me that people would take the time to talk to someone they don’t know. I am really grateful for their time. Their personal stories were really inspiring and motivating too.

How did Outreachy help you feel more confident in making open source and free software contributions?

In my opinion, Outreachy’s initial contribution phase is really important. It kind of forces candidates to at least reach out and take the first step. Even if you don’t get accepted in the end, you still went from 0 to 1. That is when you find out that the community is actually pretty welcoming to newcomers. So for me, it wasn’t so much about becoming more confident as about being less scared.

What parts of your project did you complete?

I added a self-service section where people can request their own tests through the Debci UI without fumbling with curl commands. I also added a Vagrantfile so future newcomers can set up the project more easily. I hope it works for them, because I have only tested it on my computer. We’ll see.

What are the next steps for you to complete the project?

I’m sticking around, at least until I finish the parts I started, because I think it was fun and people have actually made some requests related to this work. It’s always exciting to see that what you are building is wanted by users.

I really appreciate the opportunity that Outreachy offers to interns! If you have read this far, you are probably interested in Outreachy. Please do come and apply if you are interested, or recommend it to others!

Norbert Preining: Debian Activities of the last few months

Mar, 03/09/2019 - 10:28pd

I haven’t written about specific Debian activities in recent times, but I haven’t been lazy. In fact I have been very active, contributing to a lot of new packages.

TeX and Friends

Lots of updates since we first released TeX Live 2019 for Debian, too many to actually mention. We also have bumped the binary package with backports of fixes for dvipdfmx and other programs. Another item that is still pending is the separation of dvisvgm into a separate package (currently in the NEW queue). Biber has been updated to match the version of biblatex shipped in the TeX Live packages.


Calibre

Calibre development is continuing as usual, with lots of activity to get Calibre ready for Python 3. To prepare for this move, I have taken over the Python mechanize package, which had not been updated for many years. It is already possible to build a Calibre package for Python 3, but unfortunately practically all external plugins are still based on Python 2 and thus fail with Python 3. As a consequence I will keep Calibre at the Python 2 version for the time being, and hope that Calibre officially switches to Python 3, which would also trigger a conversion of the plugins, before Bullseye (the next Debian release), which aims to get rid of Python 2, is released.


Cinnamon

The packages of Cinnamon 4.0 that I prepared together with the Cinnamon team have been uploaded to sid, and I have uploaded packages of Cinnamon 4.2 to experimental. We plan to move the 4.2 packages to sid after the 4.0 packages have entered testing.


Onedrive

Onedrive didn’t make it into the buster release, in particular because the release masters weren’t happy with an upgrade request I made to get a new version (scheduled to enter testing one day after the freeze date!) with loads of fixes into buster. So I decided to remove onedrive from buster altogether: better nothing than something broken. It is a bit of a pain for me, but users are advised to get the source code from GitHub and install a self-compiled version; this is definitely safer.

All in all quite a lot of work. Enjoy.

Junichi Uekawa: I have an issue remembering where I took notes.

Mar, 03/09/2019 - 12:24pd
I have an issue remembering where I took notes. In the past it was all in emacs. Now it's somewhere in one of the web services.

Sean Whitton: Debian Policy call for participation -- September 2019

Mar, 03/09/2019 - 12:04pd

There hasn’t been much activity lately, but no shortage of interesting and hopefully-accessible Debian Policy work. Do write to if you’d like to participate but are struggling to figure out how.

Consensus has been reached and help is needed to write a patch:

#425523 Describe error unwind when unpacking a package fails

#452393 Clarify difference between required and important priorities

#582109 document triggers where appropriate

#592610 Clarify when Conflicts + Replaces et al are appropriate

#682347 mark ‘editor’ virtual package name as obsolete

#685506 copyright-format: new Files-Excluded field

#749826 [multiarch] please document the use of Multi-Arch field in debian/c…

#757760 please document build profiles

#770440 policy should mention systemd timers

#823256 Update maintscript arguments with dpkg >= 1.18.5

#905453 Policy does not include a section on NEWS.Debian files

#907051 Say much more about vendoring of libraries

Wording proposed, awaiting review from anyone and/or seconds by DDs:

#786470 [copyright-format] Add an optional “License-Grant” field

#919507 Policy contains no mention of /var/run/reboot-required

#920692 Packages must not install files or directories into /var/cache

#922654 Section 9.1.2 points to a wrong FHS section?

Dirk Eddelbuettel: RcppArmadillo 0.9.700.2.0

Hën, 02/09/2019 - 5:43md

A new RcppArmadillo release based on a new Armadillo upstream release arrived on CRAN, and will get to Debian shortly. It brings continued improvements for sparse matrices and a few other things; see below for more details. I also appear to have skipped blogging about the preceding 0.9.600.4.0 release (which was actually extra-rigorous with an unprecedented number of reverse-depends runs) so I included its changes (with very nice sparse matrix improvements) as well.

Armadillo is a powerful and expressive C++ template library for linear algebra, aiming towards a good balance between speed and ease of use, with a syntax deliberately close to Matlab. RcppArmadillo integrates this library with the R environment and language, and is widely used by (currently) 656 other packages on CRAN.

Changes in RcppArmadillo version 0.9.700.2.0 (2019-09-01)
  • Upgraded to Armadillo release 9.700.2 (Gangster Democracy)

    • faster handling of cubes by vectorise()

    • faster handling of sparse matrices by nonzeros()

    • faster row-wise index_min() and index_max()

    • expanded join_rows() and join_cols() to handle joining up to 4 matrices

    • expanded .save() and .load() to allow storing sparse matrices in CSV format

    • added randperm() to generate a vector with random permutation of a sequence of integers

  • Expanded the list of known good gcc and clang versions in

Changes in RcppArmadillo version 0.9.600.4.0 (2019-07-14)
  • Upgraded to Armadillo release 9.600.4 (Napa Invasion)

    • faster handling of sparse submatrices

    • faster handling of sparse diagonal views

    • faster handling of sparse matrices by symmatu() and symmatl()

    • faster handling of sparse matrices by join_cols()

    • expanded clamp() to handle sparse matrices

    • added .clean() to replace elements below a threshold with zeros

Courtesy of CRANberries, there is a diffstat report relative to the previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Jonathan Carter: Free Software Activities (2019-08)

Hën, 02/09/2019 - 1:35md

Ah, spring time at last. The last month I caught up a bit with my Debian packaging work after the Buster freeze, release and subsequent DebConf. Still a bit to catch up on (mostly kpmcore and partitionmanager that’s waiting on new kdelibs and a few bugs). Other than that I made two new videos, and I’m busy with renovations at home this week so my home office is packed up and in the garage. I’m hoping that it will be done towards the end of next week, until then I’ll have little screen time for anything that’s not work work.

2019-08-01: Review package hipercontracer (1.4.4-1) ( request) (needs some work).

2019-08-01: Upload package bundlewrap (3.6.2-1) to debian unstable.

2019-08-01: Upload package gnome-shell-extension-dash-to-panel (20-1) to debian unstable.

2019-08-01: Accept MR!2 for gamemode, for new upstream version (1.4-1).

2019-08-02: Upload package gnome-shell-extension-workspaces-to-dock (51-1) to debian unstable.

2019-08-02: Upload package gnome-shell-extension-hide-activities (0.00~git20131024.1.6574986-2) to debian unstable.

2019-08-02: Upload package gnome-shell-extension-trash (0.2.0-git20161122.ad29112-2) to debian unstable.

2019-08-04: Upload package toot (0.22.0-1) to debian unstable.

2019-08-05: Upload package gamemode (gamemode-1.4.1+git20190722.4ecac89-1) to debian unstable.

2019-08-05: Upload package calamares-settings-debian (10.0.24-2) to debian unstable.

2019-08-05: Upload package python3-flask-restful (0.3.7-3) to debian unstable.

2019-08-05: Upload package python3-aniso8601 (7.0.0-2) to debian unstable.

2019-08-06: Upload package gamemode (1.5~git20190722.4ecac89-1) to debian unstable.

2019-08-06: Sponsor package assaultcube ( for debian unstable ( request).

2019-08-06: Sponsor package assaultcube-data ( for debian unstable ( request).

2019-08-07: Request more info on Debian bug #825185 (“Please which tasks should be installed at a default installation of the blend”).

2019-08-07: Close debian bug #689022 in desktop-base (“lxde: Debian wallpaper distorted on 4:3 monitor”).

2019-08-07: Close debian bug #680583 in desktop-base (“please demote librsvg2-common to Recommends”).

2019-08-07: Comment on debian bug #931875 in gnome-shell-extension-multi-monitors (“Error loading extension”) to temporarily avoid autorm.

2019-08-07: File bug (multimedia-devel)

2019-08-07: Upload package python3-grapefruit (0.1~a3+dfsg-7) to debian unstable (Closes: #926414).

2019-08-07: Comment on debian bug #933997 in gamemode (“gamemode isn’t automatically activated for rise of the tomb raider”).

2019-08-07: Sponsor package assaultcube-data ( for debian unstable (e-mail request).

2019-08-08: Upload package calamares (3.2.12-1) to debian unstable.

2019-08-08: Close debian bug #32673 in aalib (“open /dev/vcsa* write-only”).

2019-08-08: Upload package tanglet (1.5.4-1) to debian unstable.

2019-08-08: Upload package tmux-theme-jimeh (0+git20190430-1b1b809-1) to debian unstable (Closes: #933222).

2019-08-08: Close debian bug #927219 (“amdgpu graphics fail to be configured”).

2019-08-08: Close debian bugs #861065 and #861067 (For creating nextstep task and live media).

2019-08-10: Sponsor package scons (3.1.1-1) for debian unstable ( request) (Closes RFS: #932817).

2019-08-10: Sponsor package fractgen (2.1.7-1) for debian unstable ( request).

2019-08-10: Sponsor package bitwise (0.33-1) for debian unstable ( request). (Closes RFS: #934022).

2019-08-10: Review package python-pyspike (0.6.0-1) ( request) (needs some additional work).

2019-08-10: Upload package connectagram (1.2.10-1) to debian unstable.

2019-08-11: Review package bitwise (0.40-1) ( request) (need some further work).

2019-08-11: Sponsor package sane-backends (1.0.28-1~experimental1) to debian experimental ( request).

2019-08-11: Review package hcloud-python (1.4.0-1) (

2019-08-13: Review package bitwise (0.40-1) (e-mail request) (needs some further work).

2019-08-15: Sponsor package bitwise (0.40-1) for debian unstable (email request).

2019-08-19: Upload package calamares-settings-debian (10.0.20-1+deb10u1) to debian buster (CVE-2019-13179).

2019-08-19: Upload package gnome-shell-extension-dash-to-panel (21-1) to debian unstable.

2019-08-19: Upload package flask-restful (0.3.7-4) to debian unstable.

2019-08-20: Upload package python3-grapefruit (0.1~a3+dfsg-8) to debian unstable (Closes: #934599).

2019-08-20: Sponsor package runescape (0.6-1) for debian unstable ( request).

2019-08-20: Review package ukui-menu (1.1.12-1) (needs some more work) ( request).

2019-08-20: File ITP #935178 for bcachefs-tools.

2019-08-21: Fix two typos in bcachefs-tools (Github bcachefs-tools PR: #20).

2019-08-25: Published Debian Package of the Day video #60: 5 Fonts ( / YouTube).

2019-08-26: Upload new upstream release of speedtest-cli (2.1.2-1) to debian unstable (Closes: #934768).

2019-08-26: Upload new package gnome-shell-extension-draw-on-your-screen to NEW for debian unstable. (ITP: #925518)

2019-08-27: File upstream bug for btfs so that the python2 dependency can be dropped from the Debian package (BTFS: #53).

2019-08-28: Published Debian Package Management #4: Maintainer Scripts ( / YouTube).

2019-08-28: File upstream feature request in Calamares unpackfs module to help speed up installations (Calamares: #1229).

2019-08-28: File upstream request at smlinux/rtl8723de driver for license clarification (RTL8723DE: #49).

Mike Gabriel: My Work on Debian LTS/ELTS (August 2019)

Hën, 02/09/2019 - 12:33md

In August 2019, I have worked on the Debian LTS project for 24 hours (of 24.75 hours planned) and on the Debian ELTS project for another 2 hours (of 12 hours planned) as a paid contributor.

LTS Work
  • Upload fusiondirectory to jessie-security (1 CVE, DLA 1875-1 [1])
  • Upload gosa 2.7.4+reloaded2+deb8u4 to jessie-security (1 CVE, DLA 1876-1 [2])
  • Upload gosa 2.7.4+reloaded2+deb8u5 to jessie-security (1 CVE, DLA 1905-1 [3])
  • Upload libav 6:11.12-1~deb8u8 to jessie-security (5 CVEs, DLA 1907-1 [4])
  • Investigate CVE-2019-13627 (libgcrypt20). Upstream patch applies, build succeeds, but some tests fail. More work required on this.
  • Triage 14 packages with my LTS frontdesk hat on during the last week of August
  • Do a second pair of eyes review on changes uploaded with dovecot 1:2.2.13-12~deb8u7
  • File a merge request against security-tracker [5], add --minor option to contact-maintainers script.
  • Investigate CVE-2019-13627 (libgcrypt11). More work needed to assess whether libgcrypt11 in wheezy is affected by CVE-2019-13627.

Julien Danjou: Dependencies Handling in Python

Hën, 02/09/2019 - 11:22pd

Dependencies are a nightmare for many people. Some even argue they are technical debt. Managing the list of the libraries of your software is a horrible experience. Updating them — automatically? — sounds like a delirium.

Stick with me here as I am going to help you get a better grasp on something that you cannot, in practice, get rid of — unless you're incredibly rich and talented and can live without the code of others.

First, we need to be clear about something regarding dependencies: there are two types of them. Donald Stufft wrote about the subject better than I could years ago. To make it simple, there are two types of code packages depending on external code: applications and libraries.

Libraries Dependencies

Python libraries should specify their dependencies in a generic way. A library should not require requests 2.1.5: it does not make sense. If every library out there needs a different version of requests, they can't be used at the same time.

Libraries need to declare dependencies based on ranges of version numbers. Requiring requests>=2 is correct. Requiring requests>=1,<2 is also correct if you know that requests 2.x does not work with the library. The problem that your version range specification is solving is the API compatibility issue between your code and your dependencies — nothing else. That's a good reason for libraries to use Semantic Versioning whenever possible.

Therefore, a library's dependencies should be declared in setup.py with something like:

from setuptools import setup

setup(
    name="MyLibrary",
    version="1.0",
    install_requires=[
        "requests",
    ],
    # ...
)

This way, it is easy for any application to use the library and co-exist with others.
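To make the effect of range specifiers like requests>=1,<2 concrete, here is a toy checker. This is a simplified sketch, not how pip works: real tools implement the full PEP 440 rules (via the packaging library), while this handles only plain dotted versions and the basic comparison operators.

```python
# Minimal illustration of how a version range such as ">=1,<2" constrains
# which releases of a dependency are acceptable. Toy parser only; real
# resolvers follow PEP 440.
import operator

OPS = {">=": operator.ge, "<=": operator.le, "==": operator.eq,
       ">": operator.gt, "<": operator.lt}

def parse_version(text):
    """Turn "2.1.5" into a comparable tuple (2, 1, 5)."""
    return tuple(int(part) for part in text.split("."))

def satisfies(version, spec):
    """Check a version string against a comma-separated range like ">=1,<2"."""
    for clause in spec.split(","):
        # Try the two-character operators before ">" and "<".
        for symbol, op in sorted(OPS.items(), key=lambda kv: -len(kv[0])):
            if clause.startswith(symbol):
                wanted = parse_version(clause[len(symbol):])
                ver = parse_version(version)
                # Pad the shorter tuple so "2" compares like "2.0.0".
                width = max(len(wanted), len(ver))
                wanted += (0,) * (width - len(wanted))
                ver += (0,) * (width - len(ver))
                if not op(ver, wanted):
                    return False
                break
    return True

print(satisfies("2.1.5", ">=2"))     # True: a library asking for requests>=2
print(satisfies("2.1.5", ">=1,<2"))  # False: this library only supports 1.x
```

Two libraries declaring requests>=2 and requests>=1,<2 can therefore never be installed together, which is exactly why overly narrow ranges in libraries are harmful.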

Applications Dependencies

An application is just a particular case of a library. Applications are not intended to be reused (imported) by other libraries or applications, though nothing prevents it in practice.

In the end, that means you should specify the dependencies the same way you would for a library, in the application's setup.py.

The main difference is that an application is usually deployed in production to provide its service. Deployments need to be reproducible. For that, you can't rely solely on the requested version ranges of your dependencies: they are too broad. You're at the mercy of random version changes at any time when re-deploying your application.

You, therefore, need a different version management mechanism to handle deployment than just install_requires.

pipenv has an excellent section recapping this in its documentation. It splits dependency types into abstract and concrete dependencies: abstract dependencies are based on ranges (e.g., libraries) whereas concrete dependencies are specified with precise versions (e.g., application deployments), as we've just seen here.

Handling Deployment

The requirements.txt file has been used to solve application deployment reproducibility for a long time now. Its format is usually something like:

requests==3.1.5
foobar==2.0

Each library is pinned down to the micro version. That makes sure each of your deployments installs the same version of each dependency. Using a requirements.txt is a simple solution and a first step toward reproducible deployments. However, it's not enough.

Indeed, while you can specify which version of requests you want, requests itself depends on urllib3, and that could make pip install urllib3 2.1 or urllib3 2.2. You can't know which one will be installed, so your deployment is not 100% reproducible.

Of course, you could duplicate all of requests' dependencies yourself in your requirements.txt, but that would be madness!

An application dependency tree can be quite deep and complex sometimes.

There are various hacks available to fix this limitation, but the real saviors here are pipenv and poetry. The way they solve it is similar to many package managers in other programming languages. They generate a lock file that contains the list of all installed dependencies (and their own dependencies, etc.) with their version numbers. That makes sure the deployment is 100% reproducible.
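The transitive pinning that a lock file performs can be sketched in a few lines. This is a deliberately naive illustration: the package index, names, and versions below are made up for the example, and real resolvers (pipenv, poetry) additionally handle version ranges, conflicts, and hashes.

```python
# Toy illustration of what a lock file buys you: starting from the direct
# dependencies, walk the whole dependency tree and pin every package,
# including transitive ones such as urllib3, to an exact version.
def lock(direct, index):
    """Pin `direct` deps and everything they pull in.

    `index` maps package name -> (available_version, [its dependencies]).
    """
    pinned = {}
    queue = list(direct)
    while queue:
        name = queue.pop()
        if name in pinned:
            continue  # already pinned, skip
        version, deps = index[name]
        pinned[name] = version
        queue.extend(deps)  # recurse into transitive dependencies
    return pinned

# A fake index: requests pulls in urllib3, which a requirements.txt
# containing only "requests==2.1.5" would leave unpinned.
index = {
    "requests": ("2.1.5", ["urllib3"]),
    "urllib3": ("2.2", []),
    "foobar": ("2.0", []),
}

print(lock(["requests", "foobar"], index))
```

The resulting mapping covers the whole tree, which is exactly what the lock files of pipenv and poetry record so that every deployment installs identical versions.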

Check out their documentation on how to set up and use them!

Handling Dependencies Updates

Now that you have your lock file that makes sure your deployment is reproducible in a snap, you've another problem. How do you make sure that your dependencies are up-to-date? There is a real security concern about this, but also bug fixes and optimizations that you might miss by staying behind.

If your project is hosted on GitHub, Dependabot is an excellent solution to this issue. Enabling this application on your repository automatically creates pull requests whenever a new version of a library listed in your lock file is available. For example, if you've deployed your application with redis 3.3.6, Dependabot will create a pull request updating to redis 3.3.7 as soon as it gets released. Furthermore, Dependabot supports requirements.txt, pipenv, and poetry!

Dependabot updating jinja2 for you

Automatic Deployment Update

You're almost there. You have a bot that is letting you know that a new version of a library your project needs is available.

Once the pull request is created, your continuous integration system kicks in, deploys your project, and runs the tests. If everything works fine, your pull request is ready to be merged. But are you really needed in this process?

Unless you have a particular and personal aversion to specific version numbers ("Gosh, I hate versions that end with a 3. It's always bad luck.") or unless you have zero automated testing, you, human, are useless. This merge can be fully automatic.

This is where Mergify comes into play. Mergify is a GitHub application that lets you define precise rules about how to merge your pull requests. Here's a rule that I use in every project:

pull_request_rules:
  - name: automatic merge from dependabot
    conditions:
      - author~=^dependabot(|-preview)\[bot\]$
      - label!=work-in-progress
      - "status-success=ci/circleci: pep8"
      - "status-success=ci/circleci: py37"
    actions:
      merge:
        method: merge

Mergify reports when the rule fully matches.

As soon as your continuous integration system passes, Mergify merges the pull request for you.

You can then automatically trigger your deployment hooks to update your production deployment and get the new library version installed right away. This leaves your application always up-to-date with newer libraries and not lagging behind several years of releases.

If anything goes wrong, you're still able to revert the commit from Dependabot — which you can also automate if you wish with a Mergify rule.


This is to me the state of the art of dependency management lifecycle right now. And while this applies exceptionally well to Python, it can be applied to many other languages that use a similar pattern — such as Node and npm.

Russ Allbery: rra-c-util 8.0

Hën, 02/09/2019 - 2:22pd

This is a roll-up of a lot of changes to my utility package for C (and increasingly for Perl). It's been more than a year since the last release, so it's long overdue.

Most of the changes in this release are to the Perl test libraries and accompanying tests. Test::RRA now must be imported before Test::More so that it can handle the absence of Test::More (such as on Red Hat systems with perl but not perl-core installed). The is_file_contents function in Test::RRA now handles Windows and other systems without a diff program. And there are more minor improvements to the various tests written in Perl.

The Autoconf probe RRA_LIB_KRB5_OPTIONAL now correctly handles the case where Kerberos libraries are not available but libcom_err is, rather than incorrectly believing that Kerberos libraries were present.

As of this release, rra-c-util now tests the Perl test programs that it includes, which requires it to build and test a dummy Perl module. This means the build system now requires Perl 5.6.2 and the Module::Build module.

You can get the latest version from the rra-c-util distribution page.

Thorsten Alteholz: My Debian Activities in August 2019

Dje, 01/09/2019 - 11:06md

FTP master

This month the numbers went up again and I accepted 389 packages and rejected 43. The overall number of packages that got accepted was 460.

Debian LTS

This was my sixty-second month doing work for the Debian LTS initiative, started by Raphael Hertzog at Freexian.

This month my overall workload was 21.75h. During that time I did LTS uploads of:

  • [DLA 1887-1] freetype security update for one CVE
  • [DLA 1889-1] python3.4 security update for one CVE
  • [DLA 1893-1] cups security update for two CVEs
  • [DLA 1895-1] libmspack security update for one CVE
  • [DLA 1894-1] libapache2-mod-auth-openidc security update for one CVE
  • [DLA 1897-1] tiff security update for one CVE
  • [DLA 1902-1] djvulibre security update for four CVEs
  • [DLA 1904-1] libextractor security update for one CVE
  • [DLA 1906-1] python2.7 security update for one CVE

Last but not least I did some days of frontdesk duties.

Debian ELTS

This month was the fifteenth ELTS month.

During my allocated time I uploaded:

  • ELA-155-1 of cups
  • ELA-157-1 of djvulibre
  • ELA-158-1 of python2.7

I spent some time working on tiff3, only to find that the affected features are not yet available.

I also did some days of frontdesk duties.

Other stuff

This month I uploaded new packages of …

I also uploaded new upstream versions of …

I improved packaging of …

On my Go challenge I uploaded golang-github-gin-contrib-static, golang-github-gin-contrib-cors, golang-github-yourbasic-graph, golang-github-cnf-structhash, golang-github-deanthompson-ginpprof, golang-github-jarcoal-httpmock, golang-github-gin-contrib-gzip, golang-github-mcuadros-go-gin-prometheus, golang-github-abdullin-seq, golang-github-centurylinkcloud-clc-sdk, golang-github-ziutek-mymysql, golang-github-terra-farm-udnssdk, golang-github-ensighten-udnssdk, golang-github-sethvargo-go-fastly.

I again reuploaded some go packages (golang-github-go-xorm-core, golang-github-jarcoal-httpmock, golang-github-mcuadros-go-gin-prometheus, golang-github-deanthompson-ginpprof, golang-github-gin-contrib-cors, golang-github-gin-contrib-gzip, golang-github-gin-contrib-static, golang-github-cyberdelia-heroku-go, golang-github-corpix-uarand, golang-github-cnf-structhash, golang-github-rs-zerolog, golang-gopkg-ldap.v3, golang-github-yourbasic-graph, golang-github-ovh-go-ovh) that would not migrate because they had previously been binary uploads.

I also sponsored the following packages: golang-github-jesseduffield-gocui, printrun, cura-engine, theme-d, theme-d-gnome.

The DOPOM package for this month was gengetopt.

Petter Reinholdtsen: Norwegian movies that might be legal to share on the Internet

Dje, 01/09/2019 - 11:10pd

While working on identifying and counting movies that can be legally shared on the Internet, I also looked at the Norwegian movies listed in IMDb. So far I have identified 54 candidates published before 1940 that might no longer be protected by Norwegian copyright law. Of these, only 29 are available, at least in part, from the Norwegian National Library. It can be assumed that the remaining 25 movies are lost. It seems most useful to identify the copyright status of movies that are not lost. To verify that a movie is really no longer protected, one needs to verify the list of copyright holders and figure out if and when they died. I've been able to identify some of them, but for some it is hard to figure out when they died.

This is the list of 29 movies both available from the library and possibly no longer protected by copyright law. The year range (1909-1979 on the first line) is year of publication and last year with copyright protection.

  • 1909-1979 ( 70 year) NSB Bergensbanen 1909
  • 1910-1980 ( 70 year) Bjørnstjerne Bjørnsons likfærd
  • 1910-1980 ( 70 year) Bjørnstjerne Bjørnsons begravelse
  • 1912-1998 ( 86 year) Roald Amundsens Sydpolsferd (1910-1912)
  • 1913-2006 ( 93 year) Roald Amundsen på sydpolen
  • 1917-1987 ( 70 year) Fanden i nøtten
  • 1919-2018 ( 99 year) Historien om en gut
  • 1920-1990 ( 70 year) Kaksen på Øverland
  • 1923-1993 ( 70 year) Norge - en skildring i 6 akter
  • 1925-1997 ( 72 year) Roald Amundsen - Ellsworths flyveekspedition 1925
  • 1925-1995 ( 70 year) En verdensreise, eller Da knold og tott vaskede negrene hvite med 13 sæpen
  • 1926-1996 ( 70 year) Luftskibet 'Norge's flugt over polhavet
  • 1926-1996 ( 70 year) Med 'Maud' over Polhavet
  • 1927-1997 ( 70 year) Den store sultan
  • 1928-1998 ( 70 year) Noahs ark
  • 1928-1998 ( 70 year) Skjæbnen
  • 1928-1998 ( 70 year) Chefens cigarett
  • 1929-1999 ( 70 year) Se Norge
  • 1929-1999 ( 70 year) Fra Chr. Michelsen til Kronprins Olav og Prinsesse Martha
  • 1930-2000 ( 70 year) Mot ukjent land
  • 1930-2000 ( 70 year) Det er natt
  • 1930-2000 ( 70 year) Over Besseggen på motorcykel
  • 1931-2001 ( 70 year) Glimt fra New York og den Norske koloni
  • 1932-2007 ( 75 year) En glad gutt
  • 1934-2004 ( 70 year) Den lystige radio-trio
  • 1935-2005 ( 70 year) Kronprinsparets reise i Nord Norge
  • 1935-2005 ( 70 year) Stormangrep
  • 1936-2006 ( 70 year) En fargesymfoni i blått
  • 1939-2009 ( 70 year) Til Vesterheimen

To be sure which of these can be legally shared on the Internet, in addition to verifying that the right holders list is complete, one needs to verify the death year of these persons:

  • Bjørnstjerne Bjørnson (dead 1910)
  • Gustav Adolf Olsen (missing death year)
  • Gustav Lund (missing death year)
  • John W. Brunius (dead 1937)
  • Ola Cornelius (missing death year)
  • Oskar Omdal (dead 1927)
  • Paul Berge (missing death year)
  • Peter Lykke-Seest (dead 1948)
  • Roald Amundsen (dead 1928)
  • Sverre Halvorsen (dead 1936)
  • Thomas W. Schwartz (missing death year)
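The "last year with copyright protection" column follows the Norwegian rule of 70 years of protection after the death of the last right holder. A minimal sketch of that computation, using the death years listed above (and assuming, as the 1998 figure for the Amundsen film suggests, that Amundsen was its last-surviving right holder):

```python
# Norwegian copyright: protection lasts `term` years after the death of
# the last creator, so a work leaves protection the year after that.
def last_protected_year(death_years, term=70):
    """Last calendar year in which the work is still protected."""
    return max(death_years) + term

def in_public_domain(death_years, year, term=70):
    """True if the work is no longer protected in the given year."""
    return year > last_protected_year(death_years, term)

# Roald Amundsens Sydpolsferd (1910-1912): Roald Amundsen died in 1928.
print(last_protected_year([1928]))     # 1998, matching the 1912-1998 entry
print(in_public_domain([1928], 2019))  # True: free of protection in 2019
```

For the movies with "missing death year" above, this computation cannot be performed at all, which is exactly why identifying those death years matters.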

Perhaps you can help me figure out the death year of those missing it, or identify right holders that are missing from IMDb? It would be nice to have a definite list of Norwegian movies that are legal to share on the Internet.

This is the list of 25 movies not available from the library and possibly no longer protected by copyright law:

  • 1907-2009 (102 year) Fiskerlivets farer
  • 1912-2018 (106 year) Historien om en moder
  • 1912-2002 ( 90 year) Anny - en gatepiges roman
  • 1916-1986 ( 70 year) The Mother Who Paid
  • 1917-2018 (101 year) En vinternat
  • 1917-2018 (101 year) Unge hjerter
  • 1917-2018 (101 year) De forældreløse
  • 1918-2018 (100 year) Vor tids helte
  • 1918-2018 (100 year) Lodsens datter
  • 1919-2018 ( 99 year) Æresgjesten
  • 1921-2006 ( 85 year) Det nye år?
  • 1921-1991 ( 70 year) Under Polarkredsens himmel
  • 1923-1993 ( 70 year) Nordenfor polarcirkelen
  • 1925-1995 ( 70 year) Med 'Stavangerfjord' til Nordkap
  • 1926-1996 ( 70 year) Over Atlanterhavet og gjennem Amerika
  • 1926-1996 ( 70 year) Hallo! Amerika!
  • 1926-1996 ( 70 year) Tigeren Teodors triumf
  • 1927-1997 ( 70 year) Rød sultan
  • 1927-1997 ( 70 year) Søndagsfiskeren Flag
  • 1930-2000 ( 70 year) Ro-ro til fiskeskjær
  • 1933-2003 ( 70 year) I kongens klær
  • 1934-2004 ( 70 year) Eventyret om de tre bukkene bruse
  • 1934-2004 ( 70 year) Pål sine høner
  • 1937-2007 ( 70 year) Et mesterverk
  • 1938-2008 ( 70 year) En Harmony

Several of these movies completely lack right holder information in IMDb and elsewhere. Without access to a copy of a movie, it is often impossible to get the list of people involved in making it, and thus impossible to determine its correct copyright status.

Not listed here are the movies still protected by copyright law. Their copyright terms vary from 79 to 144 years, according to the information I have available so far. One of the non-lost movies might change status next year: Mustads Mono from 1920. The next one might be Hvor isbjørnen ferdes from 1935, in 2024.

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

Paul Wise: FLOSS Activities August 2019

Dje, 01/09/2019 - 6:15pd
Changes

Issues

Review

Administration

  • Debian: restart dead stunnels
  • Debian wiki: unblacklist IP addresses, whitelist email addresses, whitelist email domains

Communication

Sponsors

The purple-discord and libgoogle-protocolbuffers-perl work was sponsored by my employer. All other work was done on a volunteer basis.

Russ Allbery: C TAP Harness 4.5

Dje, 01/09/2019 - 3:53pd

Peter Paris requested that C TAP Harness support being built as C++ code. I've not been a big fan of doing this with pure C code, since I find some of the requirements of C++ mildly irritating, but Peter's initial patch also fixed one type error in a malloc call, uncovered because C++ requires the return value of malloc to be explicitly cast. It turned out to be a mostly harmless error, since the code was allocating a larger struct than it needed, but it's still evidence that there's some potential here for catching bugs.

That said, adding an explicit cast to every malloc isn't likely to catch bugs. That's just having to repeat oneself in every allocation, and you're nearly as likely to repeat yourself incorrectly.

However, if one is willing to use a macro instead of malloc directly, this is fixable, and I'm willing to do that since I was already using a macro for allocation to do error handling. So I've modified the code to pass in the type of object to allocate instead of the size, and then used a macro to add the return cast. This makes for somewhat cleaner code and also makes it possible to build the code as pure C++. I also added some functions to the TAP generator library, bcalloc_type and breallocarray_type, that take the same approach. (I didn't remove the old functions, to maintain backward compatibility.)

I'm reasonably happy with the results, although it's a bit of a hassle and I'm not sure if I'm going to go to the trouble in all of my other C packages. But I'm at least considering it. (Of course, I'm also considering rewriting them all in Rust, and considering my profound lack of time to do either of these things.)

You can get the latest release from the C TAP Harness distribution page.