A PC from 2003 still runs a modern OS. No, Apple isn’t the good guy; other companies are just even cheaper.
deleted by creator
It doesn’t get slow. Modern software just gets better, which means it needs more powerful hardware, which in turn makes older hardware feel slow.
deleted by creator
You’re correct, but you get what I was trying to say.
Can it run Windows 11, with that TPM (or whatever it was called) requirement?
It can always run Linux
Some very minimal versions of Linux.
And I’m sure an old iPhone can run some jailbroken shit as well. So afaic there’s little argumentative difference.
It can run a very competent desktop. The hard things to run are web browsers.
Why would you run Windows when there are better operating systems for free?
I’ve got Arch on a laptop. It’s fun for ricing and all, but Windows is so much easier from a user standpoint.
There’s a reason Linux makes up only 1.2% of OSes in use. Maybe next year will be the “year of Linux” and everything will suddenly work and support all software, but until then I’ll use the “worse” OS daily.
Bro, you’re talking about Arch. No duh it isn’t user friendly—it isn’t designed to be. If you’re going to compare Windows to Linux, the only fair comparison would be to Ubuntu or Linux Mint or something else designed for the people outside of the tech-illuminati.
I’ve used many distros before; don’t try to pretend it’s only this distro holding Linux back.
Linux is just not there. No one wants to deal with compiling from source, worrying about dependencies, and all that other shit that makes software such a fucking pain in the arse.
With Windows you have the world’s largest selection of software, and much of it continues to work 20+ years after release.
In my opinion, it is perceived difficulty that keeps people from using it. Most basic users will use the OS that is installed on the computer when it ships and never stray from that. It often takes another Linux user to introduce someone to it before they will use it.
Those concerns you mentioned are basically nonexistent for a low-level user who just wants to do email, internet, and word documents, which covers a decent chunk of home Windows users. Not all, of course, but many.
In my entire social and family circle, which comprises hundreds of people, I can count on one hand the number of people who are even close to proficient enough with a computer to be assured of a decent experience with *nix systems (excluding macOS, naturally).
My litmus test is “will you be comfortable with opening a terminal and typing in a bunch of text commands?”
If not, I’d only recommend Windows or macOS. I don’t want to play tech support for hours for my parents or granny or my in-laws at 11pm on a Saturday night, guiding them through a tutorial to fix their borked computer because they “accidentally clicked something but they can’t remember what and now it doesn’t load Facebook”.
If you use Rufus, you can remove that and all the other hardware requirements and install 11 on much older computers than Microsoft intended.
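For what it’s worth, my understanding is that Rufus’s extended-install option basically automates the widely documented LabConfig registry bypass. As a rough sketch of what that amounts to (the key path and value names are the commonly cited ones, not anything pulled from Rufus itself), here is the equivalent written out in Python:

```python
# Conceptual sketch only: the LabConfig DWORDs below are the widely
# documented Windows 11 setup bypass flags; tools like Rufus apply an
# equivalent tweak to the install media. Needs admin rights, and you
# run it at your own risk on a machine where this actually applies.
import winreg

BYPASS_FLAGS = [
    "BypassTPMCheck",         # skip the TPM 2.0 requirement
    "BypassSecureBootCheck",  # skip the Secure Boot requirement
    "BypassRAMCheck",         # skip the minimum-RAM requirement
]

def apply_labconfig_bypass():
    # Create (or open) HKLM\SYSTEM\Setup\LabConfig and set each flag to 1.
    with winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE, r"SYSTEM\Setup\LabConfig"
    ) as key:
        for name in BYPASS_FLAGS:
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    apply_labconfig_bypass()
```

In practice you’d just tick the checkboxes Rufus offers when writing the USB stick; the point of the sketch is only to show that the “bypass” is a handful of registry flags, nothing deep.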
No of course not. Thankfully open hardware means no company dictates it alone.
That 2003 PC probably does not receive firmware updates from the manufacturer, and hasn’t for over a decade.
It might still function, but that doesn’t mean it is still supported. At this point, many operating systems won’t even install on it due to the x86-64 requirement.
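If you want to sanity-check that on a specific old box, and assuming you can still boot some Linux live environment on it, one quick test is whether the CPU reports the “lm” (long mode) flag, since that is what x86-64 support looks like in /proc/cpuinfo. A small sketch:

```python
# Quick check (on Linux) for whether the CPU is x86-64 capable at all:
# look for the "lm" (long mode) flag in /proc/cpuinfo. Early-2003
# desktop CPUs generally predate long mode, which rules out most
# current 64-bit-only distros outright.
def cpu_supports_x86_64(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                return "lm" in line.split(":", 1)[1].split()
    return False

if __name__ == "__main__":
    print("64-bit capable:", cpu_supports_x86_64())
```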
None of that matters. No company can say what your hardware can do. Apple’s policies are disgustingly anti-user.
Nothing else matters except privacy and security for me. Apple provides that in their phones.
PCs from 2003 are full of vulnerabilities, use legacy instruction sets, lack power efficiency, lack manufacturer support, do not support UEFI, have no IOMMU hardware isolation, have no modern VM capabilities, probably have no TPM, etc etc etc.
If Apple is anti-user, then we need to also start blaming every single hardware manufacturer that doesn’t support their products anymore. Manufacturers of phones, motherboards, TVs, SSDs, displays, mice, keyboards, printers, network equipment, etc etc etc.
Nobody is forcing you to use an old PC. But other people exist, including the poor, who need affordable computers that last.
OK, then those who can’t afford Apple can shop other brands. They just won’t get Apple’s support and will have to rely on community efforts to keep their machines running.
What exactly do you want Apple to do here?
Provide an open boot loader on all devices they sell, at a minimum (I believe that should be law).
Basic documentation to help a community OS effort would also be nice.
Is there an example of an open bootloader you would want Apple to model theirs on?
Apple was an early adopter of EFI and is a member of the UEFI Forum. They should use modern UEFI.