HDR support looks to be a ways off.
Unfortunately the HDR implementation in Windows also isn’t flawless and has some big issues.
Yeah Windows HDR is incredibly buggy, it breaks in a huge variety of cases, and is completely incompatible with screen sharing.
KDE Plasma has it in the next release though, and you can already download the beta.
Linux lacks GUI configuration tools for many things; you often have to edit text files, following guidance written for obsolete versions of the software, and hope it works. A single config file can run to thousands of lines, and if you write something wrong the program will crash or start acting weirdly. It's a very fragile design. GUI config tools mostly constrain you to valid inputs, like a true/false checkbox, and complain if a path isn't valid.
These things exist for Windows as well, but they are not accessible. Linux is a car without the plastic cover over the motor. It's not dumbed down.
Does that make it hard to see the three things a noob should touch? Yes.
But there are Linux distros that take care of this, so this comment isn't correct.
SUSE is weird but their YaST was compelling enough to make them an option. Cockpit in RHEL doesn’t compare. I think that having admins edit text files is bad. The capability should be there, but it should not be mandatory. Editing files manually instead of a GUI increases the odds of a mistype trashing the system.
I could see this comment maybe a decade ago. Things like Mint have made most of these complaints just echoes of a different era.
Adobe lightroom (with its multi-device editing and catalogue management - even when only using its cloud for smart previews).
Hardware support for music. NI Maschine is a non-starter. Most other devices are, at best, a 'hope it works' affair, and most definitely unsupported.
Music software. You can hack your way into getting a lot of your paid modules to work, but it is certainly not supported.
Wine is 'fun'(?), but it's a game of whack-a-mole chasing Windows' tail and will never allow everything to run. Either way, it's not 'supported'.
Businesses of any size tend to eschew software and hardware that doesn't have formal support (things like RHEL are most definitely supported as servers, and orgs certainly leverage them).
I keep installing Linux hoping I can get a sufficient amount of stuff to work "well enough" to move on from Windows, but it's just not to be (yet). I hope that changes, but it'll require buy-in from commercial product developers. As Linux continues to grow its foothold in desktop installs, maybe a critical mass will be reached, commercial devs will take notice, and it'll be easier to switch.
For now, I’m stuck with Windows and WSL. (But I am not happy with Windows’ direction).
This commenter used "NI Maschine" as though everybody'd know what "NI" stood for…
iirc, it stands for Native Instruments, and iirc, the “Maschine” is either hardware or hardware+software.
The ONLY Linux distro which may do what they're wanting is Ubuntu Studio.
I happen to agree that it is a damn "whack-a-mole" "game" for us in Linux, and I've been experiencing that since 1996 (when only Slackware mostly worked),
but … if ever the spyware in MS’s products gets made illegal, then … Linux’d be the only lifeboat left?
( don’t tell me that Apple isn’t every-bit as much into privacy-molestation as the other Big Tech corpos are: they aren’t a real alternative )
use 100% of the cpu (so efficient)!!!
Hey, mprime runs on linux (prime95)
Be useful
Be highly unified, which eases software distribution. With Windows, the system software at least is from a single vendor. You'll have differences in hardware and in versions of Windows, sure. But then compare that to Linux, where Wikipedia estimates a thousand different distros. Granted, a lot of those are members of families like Red Hat or Debian that can be supported relatively easily. However, others use more exotic setups like Alpine, NixOS, or Gentoo. Projects like Flatpak are working on distribution mechanisms, but they have their own issues. And even if you get it running, that doesn't mean it integrates well into the desktop itself. Wayland should improve that situation, though.
This is one of the issues that systemd purports to solve, and it gets nothing but flak for it.
Granted, systemd does have its flaws. But the religious war around it is unjustified.
The big one for me is running mobile apps as an integrated experience
Waydroid simply doesn’t work well yet.
Stream Deck support, too.
VMware install isn't as seamless as it should be.
Window tiling in GNOME, but Magnet for macOS is far better than both.
Double click to install a program
you can do that on linux too. just double click the .deb
*on most distros
can windows do that, without a wizard? i guess technically, but i've never seen it done, and i think it would be a security flaw.
linux does have gui package managers, you know
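For what it's worth, the no-wizard install is also a one-liner from the terminal on mainstream distros; the package file names below are made up for illustration:

```shell
# Debian/Ubuntu family: apt can install a local .deb and pull in
# its dependencies automatically (package name is a placeholder):
sudo apt install ./myapp_1.0_amd64.deb

# Fedora family: same idea with dnf and a local .rpm:
sudo dnf install ./myapp-1.0.x86_64.rpm
```

The GUI package managers mentioned above (GNOME Software, Discover, etc.) are front-ends over these same mechanisms.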
I don't want to compile anything. I don't want to "make". I don't want to use command lines. I don't want to download Rust, or know the difference between python2 and python3 and how you still have to be specific about it. I never wanna read a git manual with lines that mean nothing to me again. I don't care about snap or Flatpak or whatever package distribution gives me a year-old version of my program on this distro, or how that version differs from the one on the webpage.
Sorry Linux, but a big download button, double click, "install to C:\Program Files", "want a shortcut on the desktop?", done. That is the gold standard for 99% of applications.
I know there are reasons. But still, it sucks.
Have the Year of the Windows Desktop.
Run updates without me having to worry that "whoops, an update was fucked, and the system is now unbootable. Enjoy the next 6 hours of begging on forums for someone to help you figure out what happened, before being told that the easiest solution is to just wipe your drive and do a fresh install, while you get berated by strangers for not having the entirety of the Linux kernel source code committed to memory."
Moving to uBlue/Silverblue has really been a treat for avoiding this. An update borked my system? Time to boot into the previous deployment and wait for the next update. I personally really want to get CI/CD running next for my updates, to make sure my specific build and collection of software just works the way I want it to.
That’s why I make a btrfs snapshot of my system before every upgrade. Rolling back from a rescue image takes only a minute.
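The pre-upgrade snapshot the commenter describes is a one-liner; this is a sketch that assumes a btrfs root with a snapshots subvolume mounted at /.snapshots (paths and subvolume layout vary by distro):

```shell
# Take a read-only snapshot of the root subvolume before upgrading:
sudo btrfs subvolume snapshot -r / "/.snapshots/pre-upgrade-$(date +%F)"
```

Tools like snapper automate exactly this around package transactions, so each upgrade gets a before/after pair you can roll back to from a rescue image.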
What a great idea! They should automate something like that! Maybe they could call it System Restore?
I never claimed to have invented the technique.
They’re just pointing out that Windows does this too.
I literally had to give up on a Windows install that worked itself into an update hole: run the update, can't log in, undo the update, it tries to update again at night. Endless cycle, no possible fix.
I don't want to berate you, but just know that with enough practice, you'll be able to fix that Linux install. Windows won't let you fix it.
Just to provide another data point: I’ve had bad Windows updates render my machine unbootable too.
And then you’re left searching for bullshit error messages and potentially unable to fix the problem regardless of your level of expertise.
And Microsoft support that’s in fact clueless fanboys.
… No, you just use Windows' built-in rollback feature, which I think even auto-recovers these days if it detects a failure to boot after an update.
Hah! Can someone here chime in and tell me when the slow AF (as in, it can take hours) rollback feature actually worked‽
Who TF is that patient‽ You can reinstall Windows and all your apps in half the time required.
As someone who has hundreds of installed programs with tweaks on top of tweaks and hundreds of thousands of files, I always find the suggestion to “just reinstall” beyond laughable.
I think it recovered my PC for me twice, and it took about ~10 minutes each time at most. Good luck reinstalling everything in that time lol.
Windows recovery fails in plenty of circumstances; it's not a magic bullet. You can do snapshot-style rollbacks with btrfs, but that's not exactly how Windows recovery works.
Of course not, but it works 9/10 times for most people. Enough so that most people never have to deal with a faulty Windows update.
sfc /scannow didn’t work? Well too bad, cuz now you gotta reinstall your OS
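For what it's worth, the usual escalation step before reinstalling is to repair the component store that sfc pulls its known-good files from, then re-run the scan; both commands need an elevated prompt:

```shell
DISM /Online /Cleanup-Image /RestoreHealth
sfc /scannow
```

If the component store itself was corrupted, sfc alone will keep failing, which is why the DISM step comes first.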
Spoken like someone who doesn’t do stable releases
Even in the most stable distros I've had this issue. We had a RHEL 9 server acting as a Grafana kiosk and it failed after an update. Something dbus-related. I'd love to know why, as it's been the only failure we ever had, but nonetheless it shakes confidence. Windows 11 updates trashed three servers, one to the point we had to fly an engineer out. My hope is that immutable distros fix this.
You might be suffering from the opposite of survivorship bias: When you work in IT you end up having to fix the strangest shit that reoccurs on certain categories of hardware.
I know for a fact that RHEL 7 just did not like certain appliances by vendors that used it (back in the day). They would regularly break themselves until the vendor put out an update that switched it to a Debian-based custom thing.
Also, all the (thousands of) appliances that use Windows are utter shit so it’s not really a high bar. The vendor just needs to hire people that actually know what they’re doing and if they do they won’t use Windows on an appliance!
Embed ads on your desktop.
Play games with kernel-level anti-cheat.
Run professional software like fusion 360, Adobe suite and much more.
Use WSL to get a lot of the benefits of Linux.
Fusion 360 actually works under Linux with Bottles. Some other Autodesk products also have native Linux versions.
I've put more work into getting WSL to work at work than I have my home Linux machines. It's just so unreliable for some reason. I ended up giving up and running a full VM instead, and it's so much nicer since I can just pretend Windows doesn't exist.
Same here. It's nice that I can do some of my regular Linux flow on my laptop, but it's so much work to get it to consistently just work.
Especially when enabling WSL is incompatible with running a VM, and I want to run VMs for more than just Linux! Yeah, just installing a full VM is better.
Blue screen of death.
I feel like it's kinda different, but let's see.
The granularity and scale of Active Directory is a major thing keeping Linux out of offices, etc. I know you can do a lot with certain tools, but nothing comes close as far as I have seen.
The granularity of AD doesn’t scale though. I work for a huge bank and trying to get something changed in Group Policy is basically impossible. Making it even the tiniest bit bigger (e.g. adding a single new rule) will slow down every goddamned PC and VM in the entire organization. It adds up to real money lost real fast.
Not only that but some changes to GPOs can break things that you didn’t foresee so the general wisdom is, “don’t ever change it.” Rendering that whole “granularity” argument moot. What good is granularity if you can’t even use it?
Also, getting AD to scale to the size required the help of Microsoft. They had to change AD for us many times because the way it replicated certain things just does not scale past around 20,000 desktops (if memory serves). They gave us custom DLLs that run on our DCs to keep things operating reasonably smoothly but their lack of support on non-Windows platforms is a perpetual problem.
If literally every single computer in your company is Windows you’ll be fine. However, as soon as you start trying to connect your Linux servers to AD everything starts getting really fucking complicated and troublesome real fast.
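For context, the common path for joining a Linux box to AD these days is realmd backed by SSSD; the domain name and accounts below are placeholders:

```shell
# Check the domain is reachable and see what joining would require:
sudo realm discover corp.example.com

# Join the domain (prompts for the AD account's password):
sudo realm join corp.example.com -U Administrator

# Verify an AD user now resolves through SSSD:
id 'CORP\someuser'
```

Even when the join itself works, the complications the commenter describes tend to show up later, in Kerberos ticket handling, ID mapping, and Group Policy having no Linux equivalent.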
Microsoft made a lot of mistakes when they were designing AD but the biggest one was making it intentionally proprietary in so many ways. It prevents us from adopting it more. If AD actually worked with everything we’d be paying Microsoft a lot more in licenses every year.
Aside: their second biggest mistake with AD was allowing groups to be placed in other groups. This makes "simple" administration of your policies and access controls go from a single lookup to a recursive expansion across every nested group. It doesn't scale at all and massively increases network traffic and load on domain controllers.
LDAP + Kerberos running on Linux servers doesn’t have this problem because it doesn’t allow it (intentionally, because it’s stupid).
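A toy sketch of why nesting hurts: resolving effective membership becomes a graph walk instead of a single lookup, so every level of nesting triggers another round of queries. The groups and users here are invented:

```shell
#!/usr/bin/env bash
# Toy directory: group -> space-separated members. Values that are
# themselves keys represent nested groups.
declare -A members=(
  [all-staff]="eng sales"
  [eng]="alice bob"
  [sales]="carol"
)

# Recursively expand a group into its user members. Each nested
# group costs an extra lookup, which on a real DC is extra
# network round-trips and server load.
expand() {
  local m
  for m in ${members[$1]}; do
    if [[ -n "${members[$m]:-}" ]]; then
      expand "$m"          # nested group: another round of lookups
    else
      echo "$m"            # leaf user
    fi
  done
}

expand all-staff   # prints alice, bob, carol (one per line)
```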
Oh man, I’m thinking about it now and AD just makes me so upset, haha. It’s such a poorly engineered product. Don’t give it more credit than it’s due. It works fine for small organizations but that does not mean it’s a good product.
Can you elaborate…
I have looked after a few instances of Active Directory, and basic user management involved multiple steps through GUIs clearly written at different times (you would go from a Windows 8 to a Windows 95 to a Windows XP styled window, etc.).
I much prefer FreeIPA. If I wanted to modify a user account, it was two button clicks. Adding a group and bulk-applying it was the work of moments. You can set up replicas, and for a couple hundred users it uses next to no resources.
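FreeIPA also scripts cleanly; the same two-click operations map onto its CLI (the user and group names here are placeholders):

```shell
# Authenticate as an IPA admin, then create a user and add them
# to a group in two commands:
kinit admin
ipa user-add jdoe --first=Jane --last=Doe
ipa group-add-member developers --users=jdoe
```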
The only advantage I could see related to Exchange integration, as it makes it really easy to set up SharePoint, Skype, and email.
SharePoint never gets set up properly, and you find people switching to alternatives like Confluence, GitHub/GitLab Pages, or MediaWiki. So that isn't an advantage.
Everybody loathes Skype, and you're asked to set up an alternative (Mattermost, Slack, Zoom, etc.). I am not sure how integrated Teams is.
Which really only leaves email, and I can see the one-off pain of setting up Dovecot as worth it to avoid the ongoing usability pain of AD's user control.
Install stuff without asking and then force reboot?
Play through all of my laptop's speakers.
Same, or use the fingerprint reader.