Issues using GNU/Linux as a "desktop" (PC)

admin Saturday February 6, 2016

I hit numerous issues when I moved my desktop from Microsoft Windows to GNU/Linux. When these started to feel like too many, I remember starting a document listing the main ones. Eventually, I hit many more issues, became familiar enough with the system to report some of them, and started filing bugs, a process I am still far from having completed more than a decade later (thanks to our ability to produce new bugs being much more developed than our ability to reproduce our existing bugs). That left me without the time needed to maintain that list, and at some point, probably after having realized how huge it would have to become, I gave up and must have deleted it.

I was therefore amused when I stumbled last week upon another list, created by someone with the same objective. After going through it, I guess I was right to give up on mine - it is a huge list of problems (and although I learned of that list because it received a 2016 update, it remains huge despite the decade of progress since I started mine). In fact, I already knew there were millions of bugs and that such a list could be as long and anecdotal as I wanted it to be, but overall, I find that Artem's long list strikes a good balance between listing overly specific issues and failing to list serious ones. Artem's list is highly imperfect - it is sometimes repetitive, at times unsubstantiated, and as he acknowledges himself, some items are less bugs than disadvantages of GNU/Linux in comparison to other OS-en (in other words, it is a lot more than a list of bugs). Many of the issues listed I never experienced myself or even read about, and I am far from agreeing with every claim on that page. Some points just seem overly severe. Many issues affect software which most GNU/Linux users do not use. That being said, I think that overall, that document gives someone considering moving their PC to GNU/Linux a good idea of what this involves.

One of the reasons why I gave up on my list is that determining whether an issue should make the list was hard. It depended not only on the issue's importance, but on the affected software's popularity. In any OS, virtually no software is used by everyone. But with GNU/Linux, that problem is made a lot worse by high fragmentation. And indeed, the very issue of fragmentation is among the 9 general issues Artem listed in his summary. While some could point out that this is not an issue per se, it could instead be called a meta-issue, since fragmentation means extra complexity, more difficulty obtaining support for any specific GNU/Linux install, and fewer developers available to work on each piece of software which has alternatives. Even though Artem's list has been updated, it is unfortunately hard to use it to estimate how fast issues are solved (which was probably my main goal), but solving them could be much faster without such fragmentation.

I have never suggested that anyone switch a PC to GNU/Linux. At best, I might have significantly influenced 4 people to make the switch. All of them hold bachelor's degrees in computer science or computer engineering. At least 3 of them no longer use GNU/Linux as their primary OS. I do not regret having put my efforts into improving GNU/Linux rather than into directly recruiting new users. In fact, when reading Artem's list, it seems unreasonable for a system administrator to familiarize themselves with GNU/Linux for the sole purpose of switching their own PC to GNU/Linux, unless that administrator intends to improve GNU/Linux... although as soon as I go back to Microsoft Windows for a day, I'm reminded that GNU/Linux is far from having a monopoly on problems.

While most of my contribution to Artem's list is probably the absence of even more items, I have contributed to some of the pages the list links to. But these contributions are a bit ironic:

  • One page is a KDE bug report which was closed by a KDE developer because it does not affect Wayland. My contribution there was to point out that the fact that KDE could be used without hitting that bug did not mean the bug was solved, since X.Org is still affected. The developer at fault did not reopen the ticket, and it remains closed.
  • The other linked page I significantly contributed to is Wikipedia's article on Heartbleed, which I expanded, fully reviewed and maintained until it was assessed as a "good article" - the very first Wikipedia article dedicated to an open source software bug. In a sense, Heartbleed was a great bug, as it highlighted how vulnerable free software can be, and was the trigger needed for the Core Infrastructure Initiative, the "new initiative" Artem mentions in his section "On a positive note". I like to think that the "Root causes, possible lessons, and reactions" section I created helped Jim Zemlin convince organizations to join the CII.

    I believe free software's success in the last decade has made its weaknesses obvious. Technically, free software is neither more nor less secure than proprietary software. Each piece of software has its own security. But when I joined the free software movement, many claimed a piece of free software was generally more secure than a proprietary equivalent - for example, that a GNU/Linux distribution would be safer than Microsoft Windows. Since then, history has disproved that myth, and Artem's article reflects that very well. Free software projects themselves did not necessarily improve. A decade after I filed Debian's ticket #339837, Debian has made some progress. http://security.debian.org no longer claims that Debian's average response time to security issues is under 48 hours, but still claims that "Debian takes security very seriously.", now without any supporting statistical claim. But as I write these lines, Debian's security bug tracker lists over 10 high-impact vulnerabilities (in the current Debian version) acknowledged by Debian itself, along with tens of vulnerabilities still unrated.

    Some projects have made more serious changes to actually improve their security. Following Heartbleed, OpenSSL adopted a roadmap and a security policy. It created a blog, added members to its team, improved its responsiveness to reports of security issues, performed code cleanup, started a code audit, and adopted a code review system and a code review policy.

    Unfortunately, others thought securing OpenSSL required forking. Therefore, in the wake of Heartbleed, a major fork appeared: LibreSSL (not to mention BoringSSL). As if OpenSSL and GnuTLS were not enough, we now have 3 equivalent libraries, plus many lesser-known forks and equivalents: the very meta-issue Artem's summary denounces in its fourth point.
    So while Heartbleed's long-term effect was great in a sense, in another sense, if lack of resources was the root cause of Heartbleed, it is not clear that a reaction which worsens fragmentation will be helpful. I will not claim that OpenSSL is not more secure than it was before Heartbleed, but in the long term, I doubt the reaction is very helpful for TLS library users.


A lot of my work on GNU/Linux was focused on the desktop. I am proud of the difference I made. Yet, I am not so proud of the result at this point. A lot has changed since I started working on GNU/Linux, and yet, much remains the same. Thankfully, one thing also remains unchanged: users of fully free software GNU/Linux distributions do not need to worry about vendor lock-in from their operating system.
