Originally published at: Full-Time Linux Users React to the Linus Daily Driver Challenge Video – LearnLinuxTV
Linus from Linus Tech Tips recently released a couple of videos where he attempts to use Linux as his daily driver, and makes some interesting decisions. In this video, Tom and Jay, two full-time Linux users, share their thoughts on how Linus could’ve done better.
TBH I use my Linux PC (Manjaro-i3) mostly for work (Blender, Python, etc.), and my dad uses it for programming and ML stuff. I’m not really a gamer; I just like titles like MSFS and ATS/ETS2, which I play on my Windows PC. That actually works out nicely, because we both use the Linux machine for work, so keeping games separate is a good idea anyway.
IDK why LTT didn’t try a USB live distro first to test things, though; that seems so basic.
One nitpick I have regarding this video is that, around 00:54:00, Jay says something like:
I don’t think it is Linux’s fault that nVidia (or any other company, for that matter) doesn’t provide support or drivers for Linux. Many people, both those who criticized Linus and those who supported him, fail to see this. It is true that the Linux experience with nVidia products is not great, but people still put the blame on Linux for some reason.
I’m not going to pretend that Linux is perfect. If there’s a fault, I’m going to call them out on it, because it’s about being honest and I don’t want to be a fanboy to the point where “Linux does nothing wrong.” There are some things I wish it [Linux] was better at. I mean, the nVidia thing is another example of that, because Windows users legitimately have a better experience with the nVidia driver.
If Intel hypothetically stopped making CPUs compatible with Windows, say they moved to Itanium again because they get more demand or more money from that than from x86_64, people wouldn’t blame Intel for no longer making x86 CPUs, right? Well, some would, but that’s not a legitimate criticism; everyone would just vote with their wallet and move to AMD if they kept making x86. Most people would ask Microsoft to make Windows compatible with Itanium or ARM, but criticizing them for not doing so is invalid (or rather, not legitimate). And people wouldn’t complain to Microsoft that WinRAR doesn’t work on Windows on Itanium / ARM; they would ask the WinRAR developers to port their proprietary program to the new Windows.
And back to nVidia: I wouldn’t put the blame on them either. I would not blame anything or anyone. It’s not Linux’s responsibility to support nVidia products, and it is not distro developers’ job either. nVidia, in turn, has no obligation to support Linux. For that matter, even Microsoft has no obligation to keep making Windows for x86; they could kill it at any time, declaring that Windows 10 and 11 will only be supported until they hit EOL, and only with extended support rather than feature releases, i.e. the platforms would basically be dead unless something critical required a patch. Microsoft could then move on to, idk, making Android phones, or just continue to offer Azure and cloud Office 365.
Of course, Microsoft would probably not get away with killing Windows 11 before EOL, because they promised paying customers that it would receive feature updates. Red Hat managed to pull the rug from under CentOS 8 because they had no obligation to serve anyone until RHEL hit EOL. But after a while, it is plausible that MS could kill Windows if they so desired, and people could blame Microsoft for not making Windows for x86 anymore all day long, but that won’t change a thing unless MS changes their mind.
So, to conclude: it’s not Linux’s fault, and it’s not nVidia’s fault; it just so happens that the compatibility between the two is not great. People need to deal with it, grow up, and stop blaming manufacturers for not supporting a platform (of course, they can ask manufacturers to support a platform, which is perfectly reasonable, but they cannot legitimately criticize them for not doing it), and likewise stop blaming a platform for not supporting certain hardware.
You cannot have software that doesn’t work on any hardware (that would at best be pseudo-code) and you cannot have software that magically works on all hardware. Someone has to write software for a certain hardware if the instructions / protocols / APIs are not standardized. And people are not entitled to someone else’s labor.
Ok, I’ve digressed a bit too much, but I think I’ve made the point clear. Again, it’s more of a nitpick at Jay’s (and a lot of other people’s) argument. But unlike most people, Jay understands that it’s not Linux’s fault that nVidia doesn’t have good support for it. nVidia could do better and adhere to the Mesa stack on Linux, but they are not obligated to. With how things are turning out, though, it seems kind of inevitable at some point, because of how good that stack has become.
But Nvidia drivers work fine on Linux, even with multiple GPUs. I use them every day, beating on them with Blender, and my dad does the same with his ML stuff. It just works.
Yeah, the drivers aren’t open-source, but they do work well just the same.
Yeah, I know that; I was just making a point. Linus and Luke wouldn’t have gone through a whole month of using Linux on nVidia systems if it didn’t work at least partially. I used the nouveau drivers in the past (many years ago) and they worked perfectly fine for me. And it appears the proprietary drivers worked well for Linus and Luke (and for you), otherwise they wouldn’t have been able to play any games.
I agree for the most part.
I might’ve misspoken about something. Yes, Nvidia support on Linux not mirroring the quality on Windows isn’t the fault of Linux; that’s true. IIRC, I was bringing that up more as a legitimate thing that’s better on Windows, not as Linux’s fault. When I mentioned that I blame the Linux community for some things, I was talking more about the reputation side of things. Linux fans will often claim that Linux works on everything and blindly recommend that everyone install it (without checking compatibility), and I think that mindset sometimes works against it.
Not only that, but if there was a packaging issue with Steam, then there’s no excuse for it (assuming Linus didn’t do something off-camera to make that happen). I have seen packaging issues in distributions every now and then, and I never understand why that kind of thing happens, given that we have a culture of testing changes in development environments before graduating them to production. I think we need more information on whether Linus broke something beforehand, or whether there was an upstream package issue.
When it comes to Nvidia supporting the Linux platform, I can see your point there - but I think it gets a bit frustrating when a company decides to support a platform, and then decides to do the minimum amount of work possible and give users a subpar experience. In that case, the debate turns a corner and I wonder if it does more damage to not support a platform at all, or choose to support that platform but give their users a subpar experience. Either way, their customers lose.
To be fair, like Buffy mentioned, the drivers do work, but there have been some quality control issues on Linux that don’t seem to happen as often on Windows (unless I’m mistaken).
I believe what Linus hit was a classic case of “not updating the OS before installing packages.” Gardiner Bryant mentioned it in his reaction video to the LTT Linux challenge, and I recall this being a problem with `apt` too (albeit a very long time ago). If you try installing software that depends on a newer library, every piece of software depending on the old library would be uninstalled by `apt`. This could have been avoided simply by updating the OS to the latest version on first boot (which should be bare-basic digital hygiene for any computer user, IMO). Then installing Steam from the Pop!_Shop should have worked.
I’ve never encountered this, because I’m obsessive about keeping software up to date, so it hasn’t been an issue for me in the past, idk, 6-7 years? But it’s unfortunate that there is no easy fix. Debian (and Ubuntu) are enterprise OSes at their core, expected to run old versions of programs for very long periods of time. Arch (pacman) avoids this issue by forcing users to first update their system to the latest packages before they can install software that may depend on old library versions that have most likely been updated. But `apt` doesn’t do that, because its users expect to run a certain version of a piece of software for a long time despite newer releases, due to heavy API / ABI compatibility requirements, which can change slightly on minor version updates and are very likely to break on major ones. So you can’t expect bog-standard `apt` to tell you “bro, update your OS and packages before you install new software.”
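For what it’s worth, the “update before installing” routine boils down to a couple of commands on a Debian-family system like Pop!_OS (a sketch only; `steam` is just the example package here):

```shell
# Refresh the package index, then bring every installed package up to date
sudo apt update
sudo apt full-upgrade

# Only then install new software, so its dependencies resolve against
# current library versions instead of a stale snapshot
sudo apt install steam
```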
Also, Gardiner did point out a workaround for warning users away from something they don’t understand: flashing text, or colored output to highlight the big warning message that Linus probably didn’t read. That should be very easy to implement.
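And it really is easy: terminals have supported bold and colored output via ANSI escape sequences for decades. A minimal sketch (the warning text itself is made up):

```shell
# \033[1;31m switches the terminal to bold red; \033[0m resets to normal
printf '\033[1;31mWARNING: this operation will REMOVE essential packages!\033[0m\n'
```

Package managers could wrap their “you are about to remove X” lines in exactly this kind of highlighting.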
As for the Nvidia drivers: the proprietary drivers do work, but the software control stack, as Linus pointed out, is really outdated compared to what’s available on Windows. The drivers themselves are probably ok.
I’m not sure I agree with the “not updating the OS before installing packages” explanation. I haven’t seen that situation result in removing core distribution packages. What I have experienced is people not running “apt update” first (when using the command line), but even that doesn’t result in removed packages; usually apt will bail out and claim that a dependency isn’t found. And unless you manually add a force option (which he didn’t do), that shouldn’t cause any problem.

The only scenario I can think of off the top of my head that would cause this is a package being uploaded to the repository that depends on an older version of a library, while an already-installed package depends on a newer version of that same library. In that case, the only way to install the new package is to act against the one already installed, causing a removal to satisfy the dependency. This can be caused by adding a third-party repository that the owner doesn’t keep up to date, or by a package in the main repository not being fully tested. There may be other possibilities too, but after installing Linux distributions thousands of times by now, I’ve never personally seen missing updates cause this.

The other potential scenario is that Linus may have done something to break the package database in his installation, but I don’t know for sure. When creating a YouTube video, there’s often a lot of footage that doesn’t make it into the final cut. I’ve had 15-minute videos with over 30 minutes of unused footage. So there’s no telling what all happened in his case.
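One way to see in advance whether a transaction would remove anything is apt’s simulation mode; a quick sketch (assumes a Debian-family system with the package in its configured repositories):

```shell
# -s (--simulate) prints what apt WOULD install, upgrade, or remove,
# without changing anything on disk. Any "Remv" lines in the output are
# the red flag: the transaction would uninstall packages to satisfy a
# dependency.
apt-get install -s steam
```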
I don’t think the flashing text would’ve mattered; Linus did read the verbiage as it was written. Flashing text would’ve gotten his attention only if it didn’t already have his attention, and it did. IMHO, when you open up the terminal, all bets are off. I don’t care if it’s the Command Prompt or PowerShell on Windows, or Bash on Linux; literally every operating system (Windows included) assumes you know what you’re doing and won’t protect you from yourself. I’ve literally deleted the bootloader on Windows once (the file name was NTLDR, if I remember correctly) and it didn’t even attempt to stop me. And most of the commands in Linux are available on macOS; you can mess that up pretty badly too if you’re not careful. I just don’t see any reason whatsoever for anyone to open up a Linux terminal unless they’re a system administrator, experienced user, developer, DevOps person, or trying to learn. Linus has no use-case for the terminal at all.
You are correct, the Nvidia drivers do work - but not always very well. There are some bugs in the proprietary Nvidia driver that may not be apparent at first. A good example is the suspend bug. I experienced it last year, for most of the year: if any of my Nvidia systems went into suspend, there was a 1-in-3 chance it would never wake up, and I’d have to force the machine off by holding the power button. Lots of people complained, and it was only recently fixed. Also, Nvidia drivers still don’t properly support your display’s native resolution while using a TTY. But to be fair, I’m probably one of a small number of people who actually care about that.
I was thinking of the exact opposite scenario: Steam was a newer version and depended on a newer library version, while the whole desktop and Xorg depended on an older version of the same library, which is why the desktop got uninstalled. This is what I explained in my previous comment; sorry if it wasn’t clear.
I’m not sure about that. Or if he did read it, he did not understand it. There was a message along the lines of “don’t proceed if you don’t know exactly what you are doing.” Yeah, I get that a new Linux user would think “I do know, I’m installing Steam,” but I believe most people would give it a second thought. The message is kind of fuzzy and should be more explicit, and the “do as I say” prompt should probably be asked twice, with all the here-be-dragons warnings.
But I believe the fix would have been to just update his OS. The Pop!_OS team should put a notice in all error boxes in Pop!_Shop, somewhere easy to see (not buried inside the error output), something like: “if packages fail to install, try updating Pop!_OS first using the Update tab in Pop!_Shop.” That way, he wouldn’t have had the bright idea to look up online how to install Steam from the terminal.
It seems like “Yes, do as I say” tries to do exactly that, albeit it apparently failed miserably. Also, most `rm` implementations nowadays default to `--preserve-root`, just so people won’t run `rm -rf /`. So the CLI does try, to some degree, to protect the user from himself. Obviously this can be overridden with `--no-preserve-root`. It’s not even an alias; it’s hard-coded into `rm` itself. There are other examples, but they don’t come to mind right now.
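The `--preserve-root` failsafe is easy to verify without actually risking anything; a small sketch, assuming GNU coreutils `rm`:

```shell
# GNU rm ships with --preserve-root as the default; rather than invoking
# `rm -rf /` to prove it, just check for the documented option in the
# help text
if rm --help 2>/dev/null | grep -q 'preserve-root'; then
  echo "this rm refuses to recurse on / unless --no-preserve-root is given"
fi
```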
I agree. It’s just that people write tutorials for the CLI because articles are easier to write and copy-paste is easier to reproduce: a tutorial on how to install Steam in GNOME Shell on Pop!_OS will be completely different from one for Kubuntu, but with the terminal the steps should be pretty similar. This can be good or bad. If a tutorial is written for a certain OS on a certain shell, it’s pretty much sure to work there, while it’s a question mark whether it will work on other distros with the same base (i.e. the Debian family).
True. But I guess the point was to show that materials online are written for the terminal. I would bet that if he would have stayed in the GUI, Pop!_OS would have worked perfectly fine for him.
Ouch. I’m glad I decided not to buy their products for use on Linux. Well, I hate laptops in general, but that’s another topic. And I don’t make use of suspend, so I never had an issue, but obviously I’m a minority.
I think this is a prime example of why universal apps for GUI applications are such a great way forward. There’s a Flatpak version of Steam available (as you probably already know). Universal apps have their pros and cons, but removing a critical component of the distribution is not something that could’ve happened in that case. Although I’m sure it’s not a popular opinion, downloading Steam through the same means as critical distribution packages is a bit like using Windows Update to download Steam on Windows.
I haven’t personally run into a scenario where the distribution not being updated could cause the issue Linus experienced. The only other way I can think of where this might be impacted, is if someone adds a third party repository for Steam. I don’t think he did that, unless I missed it. I’ve only personally seen an issue like this come up when the package maintainer does something wrong.
My point there wasn’t so much about the “do as I say” message, but command-line use in general. Sure, there’s the --preserve-root option, but rm is just one example of many commands that can break things, and not every command has a safety precaution built in. I consider opening the terminal to be akin to opening up the engine bay of your car: if you know what you’re doing, you can get some great results, but if you don’t, you can cause damage. If someone decides to open up the terminal, they do so at their own risk and (IMHO) forfeit their right to complain if something breaks. Don’t get me wrong, if he had used Pop!_Shop at that point and it was the means by which critical packages were removed, that would be an entirely different issue, and Linus wouldn’t be to blame for that.
You have a great point there. My personal opinion is that if you’re writing a tutorial on something desktop-specific, showing the command-line method is probably not the best idea. For server-related tasks, the command line makes a lot of sense. I also agree about the different desktop environments; showing a method with GNOME Software wouldn’t make sense for someone using Linux Mint. And I know for sure I’m guilty of this same problem, as I often show command-line methods on desktop distributions as well. But in my own defense, the culture of LLTV is that of learning, so the focus is probably always going to be on the finer details under the hood. I agree completely, it’s a very, very hard balance to find here. The fact that we have many distributions is one of the biggest benefits of Linux. But it’s also one of the biggest problems too.
I’d be interested to know if he did read a tutorial that led to him doing that, I don’t remember.
I think the main problem with compatibility is that if everything works for you with your use-case, then there is no problem (for you). It’s probably reasonable to judge the quality of hardware support based on the general consensus, but even that’s flawed since someone might hate Nvidia for a problem they caused themselves, whereas the issue I had was experienced on multiple computers even after wiping and reinstalling everything. Context gives you a better understanding of how good drivers are, but bias confuses things. This is also what makes it tough on developers.
I know this is just an assumption on my part, but I feel it’s perfectly understandable that Linus might have been under a lot of pressure, and may not have made the same decisions if he had more time. I’m sure he was working on other videos at the same time, and probably had sponsorships causing hectic release schedules. It’s definitely frustrating when you’re trying to get a video out to the public, only to have it take much longer than you thought it would. I’ve had to completely scrap and redo entire videos under pressure before, and something in the tone of his voice really makes me think he was frustrated and under pressure. YouTube is great when the video you’re producing is going well, but when it doesn’t, it can be very stressful.
I recall that Linus said on the WAN show that the Linux challenge was something he did in his personal time. So my assumption is that the frustration didn’t come from sponsorship or other job-related stuff, but from the fact that things didn’t work flawlessly. When Linus comes home after a day of work (and despite appearances, Linus is a very busy man), he just wants to relax and enjoy some games on his PC. The challenge made it so that instead of relaxing, he had to debug Linux on his PC for something that was essentially work, so basically continuing work in his free time. When you own a company, this is bound to happen. A lot of the time! I know for a fact this is valid for you as well, because it cannot be any other way (unless you’re just a chairman earning a passive income).
So, instead of gaming and relaxing, Linus had to waste his night trying to install Linux, record himself doing so, then ended up in failure and probably had to go to sleep tired and frustrated because things didn’t work as he was hoping. So I completely understand where the frustration was coming from.
He also mentioned in the WAN show that the Linux challenge was very delayed to be uploaded, because of other very tight and important schedules, like new hardware releases and other sponsor spots that had priority before this literal side-project.
I’m also guilty of this, but in my defense, all my tutorials are explained thoroughly (IMO), so that people know what the commands they are running actually do (probably sometimes too verbose; people lose patience with my articles very often). I also make tl;dr run-once scripts that do most of the work for people who don’t want to bother with the setup, but some knowledge is usually required beforehand (like in my VPN tutorial, for example).
My tutorials are only on L1T forum, I wonder if I should post them here as well. I think the LLTV community would benefit from them too. But it appears this forum doesn’t have a Wiki topic.
I am vehemently opposed to universal containers, but I understand their advantages. IMO, Steam is one of the few programs where being flatpaked makes absolute, 100% sense. Steam has no need to access resources outside of its own sandbox, because all the programs it downloads are installed inside Steam’s own folder, which makes it the perfect candidate for Flatpak. I don’t like snaps because they aren’t universal (I run Void Linux; without systemd I cannot use snapd - nothing against systemd, I just like Void as a distro more than anything else I’ve seen).
Yeah, if Linus had run the Flatpak version of Steam - or rather, if Pop!_OS offered Steam as a Flatpak by default - many headaches would have been avoided.
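For reference, getting the Flatpak build is a short recipe (a sketch; assumes `flatpak` itself is already installed and uses Steam’s Flathub application ID):

```shell
# Add the Flathub remote if it isn't configured yet, then install Steam
# into its own sandbox, independent of the distribution's core packages
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install -y flathub com.valvesoftware.Steam
```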
According to Jeremy from System76, it was an issue with an official Ubuntu repository. Not sure if that’s them just trying to save face, but I’d like to give him the benefit of the doubt. It has become somewhat of a meme in the Pop community, and System76 has been less than pleased with those videos and the outcome. I tried to find the post Jeremy made explaining what happened, but wasn’t able to. You can no longer remove pop-desktop from Pop, though. At least they were quick to react and patch it.
There’s a lot that happened here, and I don’t think any one person or org deserves any “blame,” as it were. We know that Linus isn’t a fresh computer user who has never touched a computer in his life. With that comes a familiarity with what I’ll call scanning/skimming. A lot of online tutorials, StackOverflow answers, Windows documentation, etc. are padded with fluff and unnecessary information. We’ve all looked at tutorials, code snippets, and recipes and skipped over all the (presumed) nonsense to get to the meaty parts. Especially coming from the Windows world, oftentimes half of the stuff doesn’t apply; you just want that one part. As for the warnings in the command line, again, I think this comes from Linus’ experience in the Windows world. There’s plenty of software with big scary messages meant to keep people from poking around. In Linus’ defense, if you watch a lot of his videos, his Linux experience is being guided by people like Wendell from L1T, or by vendors. He knows that you have to use the terminal, he knows that he doesn’t necessarily understand exactly what is going on, but he just does what he is told to do.
The Nvidia drivers work, but I would say they work about 60-70% as well as their Windows counterparts. I have a 3080 Ti, and dragging windows is atrocious. There are artifacts, smearing, and general chugging that should not happen on that card (or really any card, for that matter). I have since moved to i3, so I don’t really notice it anymore, as I’m not dragging windows around. Not to mention the unsupported hardware decoding. I STILL have issues with suspend/wake on all of my computers: Arch and Pop, desktops and laptops.
On Pop, generally it’s been recommended to use the .deb version - as the flatpak hasn’t worked as well.
It does default to the flatpak in the store if I recall. It’s just not recommended to use that usually, as it hasn’t worked for many (Steam specifically). I think I tried to install via flatpak from the store, and gave up when it immediately crashed? I don’t remember. I wasn’t actually trying to game at the time, I just wanted to see what games I could play from my library.
I think that Linus’ experience is fair, but also somewhat misleading at times. I’ve had far worse experiences on Manjaro/KDE than I have on Pop. It really wasn’t fair to Pop, and I think Linus doesn’t quite understand how the distro / package maintainer relationships work. I have borked a Pop install before by upgrading, plus some other random things, without ever touching the terminal.
I’d really like to see a video (or series) on REAL regular people trying to install and use Linux. People with varying levels of familiarity with computers and Linux. That would be interesting to see. Someone like my dad, who knows nothing. My mom, who knows some things, but comes from Mac OS, my brother who has a high end gaming PC (but doesn’t know how any of it works as I put it together), and so on. That would be an interesting insight, or at least entertaining if nothing else.
Side note, I think I remember coming across Lawrence Tech’s channel from the MSP subreddit years ago. I think I used to talk with him on there on a semi-regular basis.
The only one I know of is a small video series from quidsup on YouTube, where his wife tried a new distro on the Raspberry Pi 4 each day for one week. She tried Raspberry Pi OS, Manjaro KDE, Lubuntu, Ubuntu MATE, and Ubuntu GNOME Shell, I think. A while after that, she also tried Twister (?) OS.
Complete Linux noob, coming from Windows, and she did spectacularly well. The experience was kind of terrible, though, because none of the OSes were very optimized for the Pi at the time, or still had bugs to be ironed out. My experience as a Pi 4 user for the past 6 months has been great, but I’ve been using sway from the beginning, so: 1. low resource usage; 2. I mostly use minimalist software, with the exception of Firefox, Thunderbird, and LibreOffice; and 3. sway uses Wayland, which fixes screen tearing and has better performance on low-end hardware for some reason (given good drivers, obviously; I’ve been using the full KMS driver for the VC4).
I think that’s down to Xorg. For computation tasks (Cycles rendering and ML) we see something like 10% better performance on Linux (Manjaro with kernel 5.10, current Nvidia drivers). IDK about suspend/wake; we don’t have ours doing that, so I can’t say. Sending the screens to sleep works fine, though.
I wonder how regular people would fare having a bare machine and having to install Windows on it; hardly any regular people ever do that since Windows comes pre-installed.
True. I haven’t ventured too far into Wayland to really have an opinion. I’d like to, but just haven’t had the time, unfortunately. Also, I don’t understand as much about that area as I’d like.
Fair point. Probably not very well, in all honesty. When I worked at an MSP, I wanted to run a class starting from the very basics of computers, as I felt that giving people a basic understanding of how these black boxes work would benefit everyone.