• Welcome to Autism Forums, a friendly forum to discuss Aspergers Syndrome, Autism, High Functioning Autism and related conditions.


Anyone using Linux?

I think you meant the FGLRX driver. I'm describing an issue with the DRI infrastructure that affects all open-source drivers, including the open-source Radeon driver, which is separate and different. The nVidia driver basically does an end run around the entire Linux driver infrastructure in order to be user-friendly (in the process creating non-user-friendly issues, like incompatibility with additional graphics devices from non-nVidia manufacturers). That shouldn't be necessary, and it indicates a problem with the OS.
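As a quick way to see which of these drivers a given kernel has actually loaded, the live module list is exposed in /proc/modules. A minimal sketch; the sample text below is illustrative, not output from a real machine:

```python
def loaded_modules(proc_modules_text: str) -> list[str]:
    """Extract module names from /proc/modules-style text.

    Each line has the form: name size refcount dependencies state address
    """
    return [line.split()[0] for line in proc_modules_text.splitlines() if line.strip()]


# Illustrative sample in the /proc/modules format (not real output):
sample = (
    "radeon 1454080 3 - Live 0x0000000000000000\n"
    "drm_kms_helper 98304 1 radeon, Live 0x0000000000000000\n"
)
print(loaded_modules(sample))  # ['radeon', 'drm_kms_helper']

# On a real Linux box, read the file directly:
# with open("/proc/modules") as f:
#     names = loaded_modules(f.read())
```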

Enthusiast is hard to define, but does not include your grandma, your aunt, etc. It means technically inclined people who like to tinker. The only non-enthusiasts I have ever encountered running Ubuntu et al do it because they have an enthusiast in the family who set it up for them. And even those are rare.

I think it's entirely likely that Linux is the best for what you do - it's the best for much of what I do as well. Just recognize that you're more the exception than the rule among the general populace.

Re: "installed it for them" - like I mentioned before, one of the hurdles is the lack of pre-installed systems. Buy a computer, and odds are it's going to be loaded with Windows, unless you specifically seek out a Mac or Linux machine. I think Linux would have a lot more potential for a casual following if it were a pre-install option on more systems, since a fair portion of the population can adjust to it and use it without much issue (my sister being a good example).

Will it have the same market share Windows used to have? Not without dealing with the technical issues (a modular kernel would go a long way toward this end, I think), but it could have more than it currently does, even the way it is. Most users don't need the hardware that usually runs into problems (high end graphics cards for example), or run programs that don't have suitable analogues. The UIs are there, enough compatibility for software is there, for the basic users.
 
My point remains that Linux's lack of market share is far more likely to be Linux's fault than everybody else's fault. I do not want to hear blame.

If the hardware support list is smaller than that of Windows, that's a fatal blow. I don't care if the items missing from the list are high end graphics cards or any other type of peripheral. Users aren't going to buy a computer and cross reference whether the components are compatible. In my opinion, it has always been incumbent upon Linux to find a way to support Windows drivers as a stopgap, or else the critical mass of users that would eliminate the need for that stopgap will never materialize. However, most kernel developers tend to be more religious than practical, and they can be because they aren't getting paid. The ones that are getting paid are doing it with specific products targeted.

The same is true of Windows software. You might feel that there are UIs available for Linux that are as ergonomic as Windows or Mac OS, but I think a poll of the general population would disagree with you.
 
And again, plenty of people install Windows on Boot Camp. There are probably several times more installations of that nature than there are total Ubuntu Desktop installations.
 

I'm not blaming anyone, just stating opinions on where I think the trouble spots and strengths are. If anyone is blaming anyone, it's you blaming the kernel developers.

Whether a UI is ergonomic for a given person is purely subjective. In that sense, some will find Unity better, while others will prefer Explorer or the Mac UI, or something else entirely (KDE and Cinnamon are pretty seamless switches from old Explorer). Most people can adapt when exposed, as Metro and the rise of Apple have illustrated. However, when it comes to functionality, "maturity," and other, (relatively) more objective measures, the big Linux desktop environments are on par with Mac and Windows. In that sense, they are ready for wider exposure. (In some cases, they're even ahead, with better integration of things like messenger and calendar, and more powerful basic tools like text editors. Notepad doesn't hold a candle to Kate or Gedit, even on default settings.)

The general population hasn't been exposed to any of the Linux desktop environments, so a poll of them would be heavily biased. Expose more people to it, motivate them to give it a fair shake, and those numbers would balance out quite a bit more, I think. This is pretty much how Apple did it. The unification between the desktop and mobile environments prompted iDevice users to try Mac computers. Some stuck with it, some bootcamped Windows. Do you really think a person who has only been exposed to Windows (prior to 8) would find the Mac interface as "ergonomic" right off the bat? A few might, but most won't. Hell, Microsoft learned that the hard way with Metro.

OEM preinstallation bypasses most of the hardware compatibility issues, since the OEMs would use hardware that's tested and verified (Dell already does this, since they offer Linux computers in their non-American markets), and most of the issues come from internal hardware. Peripherals are a question, but all the ones I've ever used "just work," and the situations where I didn't get full functionality were due to software support (namely, the Logitech G15 keyboard's macros and LCD), which I could have addressed but never bothered to. (This could be a strike against Linux, but we already agree that the gaming experience isn't quite up to par yet, and since it's a gaming keyboard, I'd wrap it into that niche.) In fact, I've even seen a couple of cases where a peripheral worked on Linux but not Windows (it was a Canon flatbed scanner, so it's not like it was some obscure thing).

It's not a perfect solution, but most computers these days ship with an Intel/nVidia combo already, which works easily, making it a step in the right direction (besides, Red Hat has already proven Linux works in a business environment, which isn't much different from a casual home environment in terms of general needs). Such an offering may light a fire under both ATI and the kernel developers to improve their respective offerings. It may even prompt nVidia and/or ATI and/or Intel to spearhead a better Linux kernel project. It's not like there's no precedent: Apple has made hundreds of contributions to open source projects over the years because it made business sense to do so. We see it now with the open-sourcing of Microsoft's .NET framework.

I don't disagree that most of the kernel devs are religious more than practical (I've seen some of the conversations Linus has gotten himself into). The lack of monetary motivation plays a huge part in that. I think if Linux is ever to have a truly modular kernel, it'll be made by someone other than the incumbent group.

I think what might help is if someone starts what essentially amounts to a Linux mirror of Apple: branded hardware, an OS tuned to that hardware, strong branding, etc. You could start with a distro like Mint or Ubuntu and build from there, since they're already newbie-friendly. That also makes perfect one-for-one hardware compatibility with Windows a bit less of an issue, since not even Mac has that (the balance is weighted in Mac's direction now, but it's still not exact compatibility, and they don't even really try).
 
Apple built a complete OS on top of the BSD kernel. They didn't borrow userspace components from any BSD distribution.

Google has done the same with Android and Chrome OS.

Those are not really analogous to building a branded hardware/software combo using Ubuntu. Any manufacturer can sell a computer with Ubuntu; they generally choose not to because the market would not support it, especially in light of what they calculate would be a higher support cost.
 

I'm fully aware that they built their own userspace on top of the kernel. That's only a small part of the brand, though, and the closest analogue to my idea - which is to control and optimize for a specific set of hardware and build a brand around it.

Actually, Dell has a very successful Linux segment that's been going strong for close to a decade now. There are also small distributors that build Linux machines. The market is there; in the case of the small distributors, what's missing is the marketing money to get large-scale recognition.

Do you have actual numbers on these higher support costs? Because in my experience (working retail tech support and watching my husband manage an enterprise IT department), most support costs come from malware or hardware. You can't help hardware, really, but malware most people encounter is essentially stopped dead in a Linux environment, due to how the authorization works. As for the newbie questions, there are a dozen avenues by which to address them. Again, Apple has already proven that it can work, it just needs someone bold enough to step up and do it on a large enough scale.
 
Also not every new UI succeeds. Metro has been almost completely rejected on the desktop.

Never said they did. In fact, that's why I think a Cinnamon or KDE based setup would be the most viable one for a Linux competitor. The idea that desktop and mobile should be identical is garbage.
 
Dell offers Linux on specific machines targeted to developers. You cannot buy a Dell home or business desktop with Linux AFAIK. So it's really not similar to what you describe.

No numbers on support costs, just reiterating what I've been told.

Linux's virus resistance being due to its architecture is a myth that bothers me. It's certainly not impossible to get root access on a Linux machine via clandestine methods, just as it's not impossible to get Administrator access on Windows, which is a prereq for most viruses. The difference is, as I'm sure you've been told, that nobody bothers writing a virus for Linux because Windows has around 80 times the market share on the desktop. The only OSes I'm aware of that are genuinely virus resistant are Android and iOS, and it is because they run every process in a VM.
 

InfoSec 101 - it's not if, it's when.

Android has had malware in the wild (not strictly viruses, but malware nonetheless).

I didn't say Linux was immune (and to say no one bothers is also a cop-out; some 90% of web servers run Linux, and there's no better target than that), but the authorization process and file access rights make it far more idiot-tolerant than Windows'. There's a reason UAC was implemented starting in Vista.

It's also far easier to get root access on a Windows box than it is on Linux (especially when the root account is disabled entirely). Again, making it more resistant by nature.
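The file-access-rights half of that argument can be illustrated with Python's standard stat module, which decodes the Unix mode bits that decide who may write a file. A minimal sketch (0o755 is just the conventional mode for system binaries):

```python
import stat

def other_can_write(mode: int) -> bool:
    """Return True if the 'other' (world) write bit is set on a file mode."""
    return bool(mode & stat.S_IWOTH)

# A system binary is conventionally installed mode 0o755: only its owner
# (root) may write it, so an unprivileged process can't tamper with it.
print(stat.filemode(stat.S_IFREG | 0o755))  # -rwxr-xr-x
print(other_can_write(0o755))               # False

# A world-writable file (0o777) is exactly the misconfiguration that
# erodes this protection.
print(other_can_write(0o777))               # True
```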

As for Dell computers with Linux:

Ubuntu on Dell Models | Ubuntu

Linux is an option outside of the US - Dell
 
Web servers are an excellent target for malware, not viruses. There is a variety of malware for Linux and BSD web servers, some of which can escalate to root.

UAC is not that different from Linux authentication. What's superior about Linux in this regard? On either OS you would need to identify and exploit an OS flaw, or socially engineer someone with root privilege (Trojan horse).

Wasn't aware of the Ubuntu option on home/business Dell machines outside the U.S. I guess outside the U.S. there's a market.
 
I am following this topic with great interest. Allow me to interject my own opinions as somebody who can get around computers fairly well but has only a limited knowledge of Unix terminals:

It is true, many distros of Linux are far more "user-friendly" than they were ten years ago. Hardware support is significantly improved, software has become more robust, and for what most people do with computers, like office software and web browsing, it is certainly a viable option even for inexperienced or casual users. Things that used to take hours to figure out and find workarounds for, sometimes even when it came to the most basic hardware setup (I remember having great trouble a few years back with wireless cards) are now basically built in to the OS right out of the box. For someone with an older machine--say, you get a hand-me-down that is too old and slow to run current-generation software well--the fact that Linux is much more lightweight makes it a good candidate for replacing Windows; I've used it for this purpose more than once.

THAT SAID - I've noticed that there are still areas where Linux falls short of commercial OSes. Mostly these are for more specialty uses...say, audio production or media creation...for the simple reason that none of the open-source software for these purposes, as far as I can find (I've looked far and wide), comes anywhere near the quality, usability, and performance of commercial counterparts like Pro Tools or Photoshop. It can also require a great deal of tinkering to get running, more than someone who just wants to use the software may have the patience for, especially when it's inferior to the commercial equivalents anyway. Simply being open source does not automatically make software superior to the alternatives, and the average user certainly isn't going to get into the code to make it better!

As for my own personal taste...well, that's my personal taste, I suppose. I like Windows AND Mac AND Linux, all for different reasons. Disregarding which company we hate more than the rest, I currently use OS X because, the vast majority of the time, I don't CARE about customizability, and you have to bear in mind that a lot of end users feel the same; most Linux distros, as I recall, are still slightly more clunky as far as GUIs go, though they have, as mentioned before in this thread, gotten much better in recent years. Most people want software that "just works," and while Linux is certainly accessible given the right distro, it still isn't QUITE there yet, especially for those who want to go beyond web browsing and office software (as I already mentioned, commercial software for professional applications remains a few steps above what is currently available in open source). Throw in the fact (and again, this comes down to which tech company you hate the most) that with Windows and Mac you get a lot of device integration, which I use frequently and which simply does not exist in Linux and likely never will, unless by some miracle Linux-based mobile devices successfully infiltrate the marketplace--I can't see that happening anytime soon.

Anyway, just some rambling from an amateur perspective. Like I said, I've followed this thread with great interest and still would like to better my computer knowledge when it comes to open-source OS and software.
 
Ubuntu phones are on the way. (Not to mention Android, Tizen, Firefox OS, etc.)

There is no problem with wireless cards or hardware these days if you do a little research before choosing a computer at the store. :)

If you do graphics and sound work, you might want Windows or Mac (though Disney was using Linux for their animation workstations at one point). You can run Windows in a Virtualbox. Linux is stronger in other areas, like programming. One won't see many programmers using Windows here in the startup/tech scene. They use Mac and Linux...
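For anyone curious what "running Windows in a VirtualBox" involves from the command line, the setup reduces to a few VBoxManage calls. The sketch below only builds the invocations; the VM name, OS type, and sizes are placeholder values, and it assumes VirtualBox is installed so VBoxManage is on the PATH:

```python
def vbox_setup_cmds(name: str, ostype: str, memory_mb: int,
                    disk_path: str, disk_mb: int) -> list[list[str]]:
    """Build the VBoxManage invocations that register and configure a guest VM.

    All arguments are placeholders; to actually run them, pass each list to
    subprocess.run(cmd, check=True).
    """
    return [
        ["VBoxManage", "createvm", "--name", name, "--ostype", ostype, "--register"],
        ["VBoxManage", "modifyvm", name, "--memory", str(memory_mb)],
        ["VBoxManage", "createmedium", "disk", "--filename", disk_path,
         "--size", str(disk_mb)],
    ]


# Placeholder values for a hypothetical Windows 8.1 guest:
for cmd in vbox_setup_cmds("Win81", "Windows81_64", 4096, "Win81.vdi", 40960):
    print(" ".join(cmd))
```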
 
I only run Linux in a Virtualbox these days...dual-booting was more trouble than it was worth for me. :p
 

People familiar with the mobile market will tell you that Ubuntu Phone has very little chance of cracking more than the 1% market share it enjoys on the desktop. It is very hard and very expensive to break into that market, and there are already two separate OSes that app developers have to target; achieving critical mass is really hard to do and even longtime players like Blackberry have failed to make serious inroads once they're out. Blackberry 10 is, technically, a good operating system, easily superior to Android in a multitude of ways. It wasn't enough. They might have had a better chance if the first version had shipped with bulletproof Android app support, but it was felt there wasn't time. It's been argued that Ubuntu Phone might crack emerging markets, but I don't really see why it would and Android wouldn't. Firefox OS has many times the current buy-in from manufacturers that Ubuntu does and even it won't crack the market.

None of this means I think Ubuntu Phone is a shoddy product. Any Linux-based OS running on a smartphone is by definition going to lack the issues that make it problematic on the desktop.

Programmers use Windows if they're targeting Windows. I actually use Visual Studio 2015 as an IDE for code that exists in an Ubuntu VM. I just haven't found a comparably high quality IDE on Linux.
 
Linux doesn't have issues on the desktop if you just do a quick Google search before you choose which computer to buy. :)

Windows isn't used much in the startup scene here. Linux is more common outside of special cases. (I mentioned .NET above.) I prefer Vim, which isn't a good fit with Windows.

I'm not criticising anyone's choices, but am only saying that Linux is not inferior to Windows or Mac. Depends on what you're trying to do.
 
Yeah. It's all personal. However, for a majority of people, the personal choice appears to be Windows right now. No excuses, just market share.

I've found that people who are developers tend to just not notice the types of ergonomic issues that are glaring in sharp relief for the average person.
 
for a majority of people, the personal choice appears to be Windows right now.

That market share is a result of business practices, not necessarily of software quality. Microsoft charges the manufacturers higher fees unless they exclude competing operating systems. If another operating system were preinstalled on most computers, it might have the majority share, and it would also be finely tuned for the hardware.

OS X is limited by its lock-in with specific, expensive hardware. For $1,000 I can get a laptop with the specs of a $4,000 MacBook.
 
The decree that ended that Microsoft practice is 21 years old.

Dell charges more for Ubuntu desktops for no reason other than their calculated higher relative support cost.

Don't get me started on Mac fanboys, who are probably even less rational than Linux fanboys. For my part, I bought a MacBook Pro in 2013 for no reason other than it had the highest resolution display available on a laptop at the time, in addition to just being a good looking piece of hardware that was in a convenient form factor. I compile some code on it but I'm not trying to set performance records. I actually had to wait for Windows 8.1 before I could boot Windows on it, because scaling for high resolution displays on Windows sucked until recently.
 
