
The actual conversation is more interesting and revealing than the blogspam on top of it (toolbar I have to dismiss? really?). This quote is noteworthy (from Linus):

> I personally think that one reason that the Linux kernel has been so successful was the fact that I didn't have a huge vision of where I wanted to force people to go

I agree. Thing is, this just doesn't work when it comes to user interfaces. You end up with a morass of principalities and wanna-be-kings who all have their own different view of what Linux on the desktop should be. This is an area where consistency really matters (to users).

What's more, the primary contributors are engineers who are unrepresentative of any mass user base (potential or actual): people who want to be able to customize everything, when no one else cares.

Ultimately I think the pseudo-anarchic development process doomed Linux on the desktop, but some external factors didn't help, namely OSX. OSX really is "good enough" for anyone wanting a *nix desktop/laptop. Sure it has problems, but it means the gap between it and Linux is just that much smaller. And with Linux you give up great hardware (to varying degrees) and a snappy and consistent UI/UX.

The company that has had the most success putting Linux interfaces in people's hands is, of course, Google with Android (disclaimer: I work for Google). And even there the principality problem has created a fragmentation nightmare as different vendors seek to "differentiate" themselves.

Phones and tablets of course don't have the same expectation and technical legacy that desktops do (from X11 on up). Hell, Apple did the same thing with iOS (essentially forking OSX).

Audio is another problem. Peripherally I've had to deal with PulseAudio and it really is a solution looking for a problem. Graphics drivers are another issue. AMD/ATI and nVidia have of course not helped matters here.

I think Canonical had a chance to make a real impact here but they essentially blew it.



> I think Canonical had a chance to make a real impact here but they essentially blew it.

I keep seeing claims, mostly by (former?) Linux superusers, that Canonical and Ubuntu are "doomed". Could someone point me to some numbers that show signs of this?

I know when the Unity desktop came out, a lot of people were flaming. I didn't install it on my laptop (I'm still on 10.04 LTS), but I did on a colleague's desktop. From time to time, I have to assist him with code he's writing. During those sessions I have to interact with it and to be honest I'm not pleased, but each time I ask him if he wouldn't be interested in switching to Gnome or maybe installing Mint or something, he keeps saying it's not that bad. I'm sitting next to him and frustrated because I can't find where everything is and how to bring stuff up, but when I ask, he just presses a few keys and whatever we need shows up on screen.

When Miguel de Icaza's post came out last week and I saw claims about the doom of Ubuntu, I started looking for evidence. I couldn't find anything that actually confirmed this. I found a lot of people who complained about Unity last year when it first came out, but fast forward to 2012 and there are also more and more posts of people saying "I used to hate it with a passion, but I have to admit it's now growing on me. Not everything is perfect, but with a bit of work I think they're on to something." Also interesting to me was that I had a sense that the most positive reviews came from people with no prior Linux experience. So much so that I'm getting more and more curious about it and I'm slowly mentally preparing to make the switch.

Is it possible that Canonical is actually moving away from Gnome and taking things in-house to avoid exactly the problems that Miguel de Icaza is underlining?


Thanks for this comment. I personally think Unity kicks ass. It's the best out-of-the-box experience I've ever had with a desktop OS.

Of course my heavily customized GNOME 2 was better suited to me, because I had tweaked every detail myself. Unity I can use out of the box, and it works really well.

I sell laptops with Ubuntu 12.04 and have a return rate of 3%; normal return rates are 20 to 30%. It can't be that bad for normal users.


How can you live with having to search everything? Gnome-shell gives you a nice list of all your applications and you can switch between categories. The biggest bug since the introduction of Unity, besides it trying hard to make you remember application names, is that F1 no longer opens help. The question mark symbol has also disappeared from the top panel.

After grudgingly Googling how to use Unity, I found the steps to list applications and it's absolutely horrible. Here's how:

  Right-click the Ubuntu logo*
  Choose applications
  Click "Filter results"
The buttons presented are individual toggle-buttons, not the usual switch buttons. Turn the next category on, turn the previous category off. Madness. And on top of everything that's badly designed, it's also unresponsive. A lot of clicks are missed and this is especially noticeable with the workspace switcher.

For now, there's the option of installing gnome-session-fallback. It introduces a "GNOME classic (no effects)" option in LightDM which works well and has the old, functional workspace switcher. Still no F1 for help, but it sits under Applications > Accessories so it's relatively discoverable. The panels can be customised by holding down ALT+Super while right-clicking them (thank you again, Google).

Sadly, Gnome is deprecating the fallback session in favor of a software rasteriser ("llvmpipe") to run Gnome-shell where a hardware rasteriser is unavailable. The next release will also see the lock screen moved into the shell so it'll look pretty, but it had better be the most stable release they've had since the last 2.x. Sorry to be cynical, but I doubt it.

*: This doesn't work in Unity 2D. Tough.


> How can you live with having to search everything?

(Disclosure: satisfied Unity user here)

If you interact with the machine primarily through the keyboard, "search everything" is a plus, not a minus. It means you can easily launch new applications without having to take your hand off the KB and move it over to the mouse. Launching applications is much faster for me in Unity than it was in GNOME 2.x, simply because typing ALT plus the first three characters of the application name (or a generally descriptive word: ALT + "music" finds Rhythmbox, for instance) is faster than rooting around a multi-level menu.

> After grudgingly Googling how to use Unity, I found the steps to list applications and it's absolutely horrible

This is because "listing applications" in Unity is a failure case. Users don't want to list applications; they want to launch the one application they were looking for. The point of Unity's launcher is to make digging through a long list of apps to find the one you wanted obsolete.


So it takes Gnome Do and makes it more complex? I think that would make Ubuntu quite unique and thus back it into a corner they invented. It's not like the GUI is a new concept these days, designs have long been ironed out. Old Gnome was fine and I would've settled for something that resembles Android 3.0+, sorry to rant.

> "listing applications" in Unity is a failure case

My desktop takes the logical next step: before I moved back to gnome-panel, I simply dragged the applications I used out of the overlay thing onto the desktop. I keep the icons organised, so they're always in the same place. Anecdotal observations of "normal" people show that they use their Windows Vista/7 desktops much the same way, not even bothering with the start menu except to shut down. They get grumpy whenever anything at all is different in which icons are shown and in what order. Their allegations that I broke something weeks after I accidentally left, say, a copy of unzip on their desktop are of course laughable at best; but the concept of having shortcuts on the desktop certainly is not. It makes me wonder why Unity bothers with a separate desktop at all; it could benefit from the screen real estate if that overlay screen sat at the bottom of the stack as the root window. I loved the netbook-launcher it grew out of; too bad it was discontinued.

> Users don't want to list applications; they want to launch the one application they were looking for

The Gnome consensus used to be that users don't care which application they're using; they just want to carry out a certain task. Which is why they all have boring names such as "document viewer" and "web browser". How is a user supposed to know to look for evince, epiphany or firefox? In Unity, search quickly breaks because of this. If you know the name of the application, it can find it. But if you don't, it doesn't map keywords to application names. It would be impossible to maintain appropriate keywords for all applications in Debian (and by extension, Ubuntu), so it doesn't.

Fortunately, all the current desktops open a terminal when you press CTRL+ALT+T. It's the one failure mode they all support equally well; it has tab completion and isn't limited to applications with a .desktop file. This is what I do for most tasks besides web browsing, so even after all these years of developing desktop environments, the main purpose is still to run multiple terminals side by side. If only twm had niceties such as network manager and removable media.


> So it takes Gnome Do and makes it more complex?

No. I had doubts last year about the wisdom of Canonical starting from scratch with Unity rather than just building on GNOME Do (see http://jasonlefkowitz.net/2011/04/ubuntu-11-04-everything-ol...), but the last couple of releases have put my doubts to rest. Unity is everything GNOME Do is/was, but now with the possibility to extend even further into things like the new HUD (https://wiki.ubuntu.com/Unity/HUD), which does to application menu bars what Unity does to application lists.

> They get grumpy whenever anything is different at all in which icons are shown and in what order. Their allegations that I broke something weeks after I accidentally left, say, a copy of unzip on their desktop is of course laughable at best

You seem to be under the impression that the Unity desktop can't have icons/files/launchers/etc. on it. It can, mine does. It's accessible via the usual "Desktop" folder in your home directory.

> If you know the name of the application, it can find it. But if you don't, it doesn't map keywords to application names. It would be impossible to maintain appropriate keywords for all applications in Debian (and by extension, Ubuntu) so it doesn't.

You're incorrect here too. Unity does support searching by keywords, and all you need to do to hook into it is create a standard freedesktop.org-style .desktop file for the app (http://standards.freedesktop.org/desktop-entry-spec/latest/a...) and fill in the "Keywords" field inside it. Unity will pick up those keywords and use them when searching in the application lens. Most of the commonly used applications already have the most obvious keywords provided for them.

(Putting the keywords in the desktop entry file means that Ubuntu/Canonical don't have to maintain a master list of keywords for every application; app developers can just provide relevant keywords for their app in the DE file they ship, and users can tune them if needed just by editing the DE file.)
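As a sketch of what this enables, here's a minimal, hypothetical example (not Unity's actual code) of matching a search query against the Name and Keywords fields of a spec-style desktop entry. The Rhythmbox entry below is made up for illustration:

```python
# Minimal sketch of keyword matching against a freedesktop.org .desktop
# entry. Not Unity's implementation; the sample entry is hypothetical.
import configparser

SAMPLE_DESKTOP_FILE = """\
[Desktop Entry]
Type=Application
Name=Rhythmbox
Comment=Play and organize your music collection
Keywords=music;player;audio;mp3;
Exec=rhythmbox %U
"""

def keywords_match(desktop_text, query):
    """Return True if the query matches the Name or any Keywords entry."""
    parser = configparser.ConfigParser(interpolation=None)
    parser.read_string(desktop_text)
    entry = parser["Desktop Entry"]
    query = query.lower()
    if query in entry.get("Name", "").lower():
        return True
    # Keywords is a semicolon-separated list per the desktop-entry spec.
    keywords = entry.get("Keywords", "").lower().split(";")
    return any(query in kw for kw in keywords if kw)

print(keywords_match(SAMPLE_DESKTOP_FILE, "music"))   # True (keyword)
print(keywords_match(SAMPLE_DESKTOP_FILE, "rhythm"))  # True (matches Name)
print(keywords_match(SAMPLE_DESKTOP_FILE, "video"))   # False
```

Because Keywords is a semicolon-separated list in the spec, adding or tuning search terms really is just a one-line edit to the .desktop file.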

I get the feeling from your complaints that you checked out Unity briefly early in its lifecycle and haven't checked back in on it lately. You should try it again now, you might be pleasantly surprised.


> Unity is everything GNOME Do is/was, but now with the possibility to extend even further into things like the new HUD

I never got the hang of Do, either. In order to do things with Do, you have to learn its vocabulary. Whenever there was something I didn't know how to do with Do (or it didn't have a certain plugin), the old way of doing it was still there. This made me reluctant to ever press the button that triggers it. Maybe I'm bad for not investing time into making it work better; perhaps desktop environments are not being treated fairly in this regard.

To clarify, maybe it's hard to unlearn the ways of interfaces which came before. I was taught how to browse the web with Netscape, and today I'm still bitter about certain changes in Firefox (I know there's SeaMonkey but it's a second-class citizen nowadays). This doesn't hold true for all user interfaces, but I'm not sure if age plays a significant part (as in, learn UI X before age Y => forever stuck with the concepts of X), or prolonged use, or a combination of either along with not having any alternatives (thus never building the mental abstraction layer needed to differentiate the underpinnings of various UI concepts).

I have only myself to sample, and given that people are using user-unfriendly (or even hostile) window managers such as evilwm, fluxbox and xmonad, there have to be wildly different "mental operating systems" in any representative sample group.

Gnome usually worked well despite this psychological hellhole; I could introduce people to it and it generally wasn't met with contempt. They knew their users so well that I, for one, was shocked and baffled when, at the first command I entered in the preferences of the Fish panel applet, it berated me for trying to make it useful (there's only one command which does that).

> Most of the commonly used applications already have the most obvious keywords provided for them.

I see, that's great. Sadly this hasn't worked for me, but I'll keep this in mind whenever I'm searching in Unity and something is missing. I hope seeing how it works hasn't skewed the way I use it too much.

> app developers can just provide relevant keywords for their app in the DE file they ship

Yes, I hope package maintainers will do this too. This can be useful for a11y and other desktop environments, nice.

> I get the feeling from your complaints that you checked out Unity briefly early in its lifecycle and haven't checked back in on it lately.

This is false. Although I have tried Unity in each Ubuntu release starting from 10.10, I tried hard to work with it (instead of replacing it outright) in 12.04 over the course of months. I turned to Google just to figure out where my applications are, for starters, instead of flipping tables. What recently drove me away is the instability, not the glaring usability quirks. At some point, compiz flat out refused to start, but Unity 2D still worked (for a while). Compiz stability is very hard to fix, as it can depend on obscure GPU bugs only found in serial number x through y of model z manufactured by {.

However, I did manage to find one improvement over gnome-panel: it's easy to use the keyboard to navigate panel applets/indicators. Just by pressing e.g. Alt+F, you can move (using the arrow keys) all the way to the power button. A lack of pointer input always renders gnome-panel useless.


> The Gnome consensus used to be that users don't care which application they're using, they just want to carry out a certain task.

Still works: just press the Super key, type "doc", and you will see "Document Viewer", which is evince.


For Help press ALT + F1.

You can see most shortcuts by holding down the Super key for a few seconds.


This does not actually work; it merely highlights the Ubuntu button in the dock. According to the shortcuts overview, it's "Open Launcher keyboard navigation mode". I ran unity --reset to make sure it's not just me.


Thanks! In gnome-panel, this brings up the left side menu.

I'll try the help whenever I'm in Unity again.


What really bothers me about Unity is that I haven't seen it work. I've got a quad-core machine with a decent NVidia GPU and 4GB of ram, and I was seeing multiple seconds of UI lag, even on the 2D mode.

I switched to Xfce and haven't looked back. Everything is snappy, and everything is configurable.


The Unity interface is incredibly cool. Love it, love it.

Performance was a bit choppy when I had 4GB ram, and sometimes compiz would freeze my system. However, upgrading to 8GB RAM (costs about $50) totally fixed everything for me. Just upgrade.

I'm sure they could optimize things better, but that would slow down their frantic pace of innovation.


> Just upgrade.

Why? Why should someone upgrade their RAM just so the UI works smoothly? We've had smooth UIs for twenty years, even back when 32 MB of RAM was the norm - so why should someone upgrade to 8 GB just for the UI? Hell, if I upgrade my RAM (I have 8 already, but anyway) I'll want it to benefit all the applications I like to run, and I expect the UI and OS to use as little as possible and still perform smoothly.

My old Eee PC was able to run Windows XP just fine before I installed Arch with a tiling window manager on it, and everything ran well, everything was fast, everything was smooth - for about three years - and then I installed Ubuntu and Unity brought it to a standstill, despite my not using any different applications than I always had. Even switching workspaces sometimes took 15 seconds or so. Yes, it only has 1 GB of RAM, but like I said, I used the same applications that I always did without problems. I struggled with it for a few months, switched back to Arch, and haven't had any problems since (though I now use a laptop which does have 8 gigs of RAM, and rather than using that RAM to make Unity run smoothly, I'm using it so I can run Windows 7 and Linux at the same time with VirtualBox).


I love Xfce myself, but Unity on 12.04 feels really snappy on my System76 laptop.


Maybe a config problem? It works perfectly fine on my old Core2Duo laptop with 4GB RAM. I've almost never experienced multi-second lag (though I do admit I see some crashes here and there) in the latest Ubuntu release.


This, IMHO, is the problem right here in a nutshell.

Maybe it's a config problem. Maybe it's not. Maybe it's sunspots. Who cares? It doesn't JUST WORK and that - more than anything else - is what will doom Desktop Linux.

I run Ubuntu on what essentially amounts to a media desktop. It does work, but that's because I spent hours on researching the simplest, cheapest hardware config that would work seamlessly with Ubuntu without requiring me to compile my own drivers, etc. The fact is that I, even as a power user, really don't want to spend hours messing around with that stuff... so I can't imagine the Average Joe would want to.


Oh, but that has been an issue since the ancient times of the beginning of Linux. Be it monitors/screens, (win)modems, audio, etc.

Slashdot comments are full of "Works-for-me" replies to real issues with varying levels of "you are lUser" for not being able to make it work.


Predictions of Linux's doom have also been there since the ancient times of the beginning of Linux.

Actually, back then there was much more reason to doomsay, as there really wasn't a desktop to speak of (CDE on Solaris was far more sophisticated than bare X on Linux). No hardware manufacturers bothered to provide even closed-source drivers, hardware support was minuscule, there was no such thing as auto-detection of hardware, and web browsers didn't even exist, so apart from a handful of applications with pitiful GUIs by today's standards, every user was forced to interact with the system through the shell. There was no OpenOffice or LibreOffice (i.e. no option to use a Word/Excel/PowerPoint alternative on Linux), no desktop environments, no way to configure your system through a GUI. The list goes on and on.

Since those days, Linux has improved by many orders of magnitude. And it keeps getting better every year.

I think most people complaining about Linux these days really don't appreciate how bad and comparatively unusable it was in the old days. Back then you could not even dream of making any kind of comparison between Linux and Windows or MacOS in terms of ease of use for an average non-technical user. Now from an average user's perspective, they're all quite similar, with some minor differences on the periphery.

Sure, Linux still has some problems, but it's not like the competition is without its own problems.

Yes, sometimes a user who bought some obscure peripheral (made by a manufacturer who does not care about Linux compatibility) doesn't bother to check to see if the peripheral was listed as being Linux compatible. So they may have problems getting it to work.

But how do you think the typical Windows user likes using a virus and trojan infested system? Even Mac users are starting to fall victim to this problem. Or how do you think a typical Windows user enjoys having to reinstall the operating system every couple of years because it's slowed to a crawl through Windows-bloat?

> Slashdot comments are full of "Works-for-me" replies to real issues with varying levels of "you are lUser" for not being able to make it work.

Slashdot is not representative of the Linux community as a whole. And my experience in asking questions of Linux users has been quite different from yours. Virtually all Linux users I've encountered anywhere, from usenet to web forums to irc to mailing lists, have been exceedingly helpful. Even on Slashdot, there's quite a lot of help to be found.

And what could be more helpful than volunteering to write whole applications, whole operating systems, and even the documentation to go along with them for free? Countless Linux users have dedicated years and even decades of their life to helping the community by doing this.

The mercenary, closed-source culture that dominates in the Windows and Mac worlds is shamefully lacking compared to the vibrant open-source world that thrives in the Linux community. As a Linux user you are gifted with an entire free operating system, tens of thousands of free applications, hundreds of free device drivers, your choice of free desktops, etc. In the computer world, is this not the ultimate in generosity?


How can it spell the doom of the Linux desktop? Even if it does not work for 100% of folks, it works for a certain percentage of people, and the total installed base grows over time as more people try it and find it works for them. That is why you see the share of the Linux desktop not going down, but staying stable or growing a little, while the overall internet population grows from year to year.

Please stop the "Doom predictions". It won't happen. It does not need to be the MAINSTREAM system to exist.


Ok, perhaps my usage of the word "doom" was a tad strong. I should have said that it will relegate desktop Linux to a tiny (minuscule) percentage of the mainstream.

And that's a shame, because of the wealth of great, free tools (whether it's development tools, photo editing tools, audio editing tools, etc. etc.) in the Linux ecosystem.


I really don't think it's any problem with Desktop Linux. I accept that it's going to take a few hours to tweak a new machine to my liking. Linux makes it really easy to swap out a buggy or uncomfortable windowing environment. A lot of manufacturers seem to be going in the opposite direction, baking the windowing environment so deeply into the firmware you can't get rid of it. Linux has always been niche, and it's one I'm pretty happy in.


> Is it possible that Canonical is actually moving away from Gnome and taking things in-house to avoid exactly the problems that Miguel de Icaza is underlining?

I do think this is part of it, though they probably wouldn't say it that way. They certainly do want a consistent user experience - and they want one that rivals OS X.

I don't understand why people complain about using software they haven't learned how to use yet. It all takes some getting used to for a new user, including OS X. Now I can definitely understand the frustration though of someone who just upgraded their OS only to find they don't know how to use it anymore.


People complain about such software because we live in a world where there is no such thing as a Computer Driving Licence. Many users are perfectly fine to live in as much ignorance as possible, and are actually quite afraid of traditional computers. These users can easily be spotted at a glance on both Windows and Mac as they invariably use their desktop as a horrifically chaotic filesystem, never delving deeper, and the internet is essentially Facebook.


Actually it does exist: http://www.ecdl.com/

The world is not better off with this, as it merely teaches people how to use MS-Windows and MS-Office. It doesn't go into any detail at all about how a computer really works.


Wow, you learn something new every day! My thinking was more akin to a real life driving license, a mandatory thing you would need before being allowed on a computer unsupervised etc. Thanks for enlightening me though!


> I don't understand why people complain about using software they haven't learned how to use yet.

You shouldn't have to learn how to use the desktop! I tried out Unity, I really did. I didn't like it: my apps kept getting lost when minimized, I couldn't have lots of small windows open, and it just fixed something that wasn't broken for me. Like everyone else, I went to Xfce.


But you do have to learn how to use a desktop. You even spent some time learning how to use a mouse. I use Xfce on my main machine and Unity on the "oops, I need Skype" machine, and I hate the switch, but that's because I have not invested time to use it well - and chances are I probably won't, just like I won't learn the annoying parts of OSX.

Desktops are a tool like any other; the mentality that they should be intuitive or easy leads down the path of eye candy as opposed to functionality.



When you learn to drive a car, you can drive almost all cars; the biggest difference is between manual and automatic gears. When you learn to use a computer, you should be able to use many of them without having to learn anything.


A car does one job. A computer does many jobs. Learning to do each job a little differently adds up to a lot of learning. I don't know what "you should" has to do with it - I'm just pointing out that for every desktop that currently exists, you have to learn how to use it.


So you are going to make the claim that xfce does not require learning? Or just it doesn't require learning if you already knew gnome, or what? Because no learning is a pretty strong claim...


Yes, I agree, but they're going down the same path as Gnome: the "we know better" attitude.

"One that rivals OSX"? What a joke. Windows XP is a better rival to OSX than Unity.


Actually I think it's the other way 'round. OS X has a pretty lousy UX by default.


> with Linux you give up great hardware (to varying degrees)

I assume you're talking about Linux driver issues. But since we're talking about OSX vs Linux here, it's only fair to note that Linux supports a much greater variety of hardware than OSX. Linux also tends to support the very newest hardware more consistently than OSX (perhaps excluding video cards). See, e.g., the laughable specs of Mac Pros.

Software also benchmarks faster on Linux vs OSX on the same hardware. So there is still an opening on Linux for people concerned with performance.

But yes, the Linux desktop is still light-years behind in terms of design, for the reasons you mention.


> it's only fair to note that Linux supports a much greater variety of hardware than OSX.

Entirely true, but I also don't have to look up what is or is not supported by OSX. I just buy an Apple computer. Of course, there are obvious tradeoffs here around having the freedom to buy whatever hardware you want.


Ubuntu has initiatives to let you just buy a computer with Linux preinstalled too!

http://www.ubuntu.com/certification/


That's hardly an achievement of OSX though. Supporting pretty much a single set of hardware and some variations isn't really hard and does not count as "great hardware support" in my book. That's just one of the advantages you get by selling both the devices and the software.


I don't disagree, but as a consumer, I just care that it's easy.


> I assume you're talking about Linux driver issues. But since we're talking about OSX vs Linux here, it's only fair to note that Linux supports a much greater variety of hardware than OSX.

I'll take supporting a limited set of hardware very well over supporting tons of hardware often in a mediocre at best fashion. I'm not just talking about video cards either, over the years I've had "fun" with everything from SATA controllers to sound chips that were 'supported' under linux.


If you're fine with limited hardware just use hardware that is supported by Linux. Every Thinkpad I've used so far worked perfectly out of the box. If you're not sure what to buy just look at the list of Ubuntu certified hardware: http://www.ubuntu.com/certification/


Honest question: isn't hardware supported by Linux strictly a superset of OSX? I.e., is there any Apple computer you cannot install Linux on? I've been running Ubuntu on a mac mini and I know others who do the same. Is there a problem installing Ubuntu on a macbook or an imac? If not, then this whole subthread is moot, no?


IIRC, there are issues with the latest Retina MacBook Pro.


Usually, yes. For instance, Ubuntu runs perfectly on the latest MacBook Air.


Speaking of Thinkpad, there's also http://www.thinkwiki.org


Thinkwiki is absolutely fantastic. I think that may just be the thing I miss most about not using a Thinkpad anymore.


I'd rather spend time building. The value prop of doing the research isn't worth it when I can just buy a Macbook Air and not even have to think about it.


Frankly I have never done any research at all when buying laptops intending to use Linux on them. Maybe I've just gotten lucky, but I really don't think so.

The last computer I purchased, I purchased at a bar on my phone while rather intoxicated. The only thing I did was sort newegg's laptop selection by "cheapest first" and buy the cheapest. I don't think I missed out on any "building" time... If what everyone says about Linux hardware support were true, there is no way in hell that should have turned out fine.


Oh, statistically speaking I reckon this could work out just fine. However, Linux hardware support is a lot better than some here claim. The average desktop, with its x86 architecture, usually on-board sound, and some mediocre GeForce, is supported just fine. Seriously, that's what most people use.

Concerning video cards, I think people miss out on the proprietary drivers from Nvidia here, which have always worked brilliantly for me since mid-2007 or so. Yes, they are proprietary, but so what? As far as I know, ATI cards work pretty well too.

Linux sound support works fine as well; it's still mostly ALSA underneath calling the shots, and that works like a charm. PulseAudio, I admit, usually doesn't. You always have the option to remove it from your system though - or even better, don't install it in the first place. The actual drivers are in ALSA, so there's nothing to complain about here either.

CPU and RAM support is a no-brainer with Linux. I've never had any issues. I actually had far more issues under Win7. I recently plugged an additional 32GB of memory into a workstation (64GB now) and Win7 only accepts 48GB of it. Booting into Linux, everything works and I've got the full memory capacity at my fingertips.

So I wonder, what is this all about? I've been using Linux on quite a few pieces of hardware and never ran into any serious issues. Yes, I had to screw with the X config a few times, but that doesn't count as "not supported", just as "stupid defaults".


Me too. I always buy Thinkpad X-series laptops. They've always worked fine with Linux and I didn't really have to do much (if any) research.

So you buy Macbooks because they run OS X and I buy Thinkpads because they run Linux. Where's the difference? (Except that I would also be able to run Linux on other types of laptops if I wanted to).


Not the one you're responding to, but: I guess the difference is the preferred pointing device plus the screen quality. I have to tell you though, I've seen quite a few posts by people switching from Thinkpads to Macbooks, and not one of them looked back to the Thinkpad's pointer nipple, at least not after using a Macbook for a while. All other differences between those two products are basically a matter of taste IMO.


Here's a datapoint: after buying a 13 inch Macbook Pro (and having had it stolen), I went back to the T420. I miss two-finger scrolling at times, but I love having Page Up and Page Down. I love using the pointer and scroll bar without having to leave the home keys. When I was on a recliner or lying on a couch, using the trackpad was often annoying and uncomfortable.

Option-up/down doesn't replace an actual Page Down key.


It's certainly interesting that it's more comfortable for you to use Page Down than two-finger kinetic scrolling. I only find myself using the Page Down key on desktops, instead of the mouse wheel. Well, as I said, it's basically a matter of taste.

Btw. how are the screens these days? I've read that the resolution is quite low, but do they at least have good contrast / color spectrum / brightness / viewing angle? I'm asking because my father wants to buy one again (low res is actually a plus there).


As for Page Down, I think the nicest part is that when I hit it, I know exactly where to reset my eyes when reading something longer. Space usually worked in Safari, but it was surprisingly unpredictable. Of course, paging meant using option+up/down or scrolling... We all have our quirks, I guess. :)

I have the upgraded 1600x900 screen, so I can't speak to the quality of the lower resolution screen. I've heard it isn't as good, but I will say that for programming, the 1600x900 has been great. (I forgot to mention how much I hate glossy screens as opposed to matte...) From everything I've heard, the upgraded screen is much better.


Thanks for the heads up. I'm not sure yet whether 1600x900 is the right choice for him; Windows software is still not as resolution-independent as it should be and he needs rather large text. I think the retina Pro would actually be quite good in that respect because the scaled resolutions are very well implemented there; however, $2.3k for my father's use case is quite a stretch ;).


We come from different socio-economic groups. I simply can't fathom the idea of making a $1k purchase so casually that I couldn't even be bothered to do a bit of cursory research. Even 10-20 minutes of light googling is too much for you? How much can you actually build in that time?


Maybe you're happy with that, but there are others who want a decent package manager and don't want to deal with the differences between OSX and Linux.


> I'll take supporting a limited set of hardware very well over supporting tons of hardware often in a mediocre at best fashion

Ha! Check any recent thread on Mountain Lion updates and you'll find a ton of non-"very well supported" hardware glitches.


> I think the pseudo-anarchic development process doomed Linux on the desktop

I'll repeat myself here. Linux is the natural successor to the technical Unix workstation. It is, therefore, a niche product: built by a highly technical crowd for a highly technical crowd. Only recently have efforts been made to make Linux more muggle-friendly, and those efforts nearly coincide with the decline of the desktop PC's relevance.


You must have never seen Corel Linux, Caldera OpenLinux, Lindows/Linspire, etc.

Many attempts were made to make Linux more user-friendly at the start of the century (and ever since).


I did see those, indeed. Caldera's was somewhat successful in making Linux easier to install - at least when compared to the Debian and Slackware installers of the time. At least, I managed to install it on my machine. I would also mention Conectiva's as an easy to install and use Linux.

Unfortunately, at that time, the lack of desktop applications was a huge problem for the average user. Today, many people read e-mail, collaborate on documents and engage in a large part of their social lives through a browser. That was not the case in the early 2000s. Having an easy to use (or Windows-like) file manager, application launcher or even package manager was not enough.

Perhaps it's unfair to give credit to Canonical for making Ubuntu grandma-friendly when a lot of its ease of use comes from applications like LibreOffice, Thunderbird and Firefox (and Gmail, Gdocs and Facebook).


> I think Canonical had a chance to make a real impact here but they essentially blew it.

Curious to hear you elaborate on what they blew, and how. I don't use Ubuntu myself (prefer straight Debian), but I've always respected what they're trying to do (and to some extent, what they have already accomplished).


Linux is not a corporation. It's not fighting for turf. If Mac OS X is good enough for most of the desktop users, Linux doesn't have to throw resources and bullshit around (as corporations do) to counter that success. As long as Mac OS X does its job well Linux can focus on being linux-y, and succeed.


Since it's customary to bash on PulseAudio: Here's one user that is very happy to have it, using its end-user features.

I wouldn't want to have a machine again, where any of these things won't work:

- Set volume per source/application (turn up the Flash video for the presentation while some mp3 keeps playing in the background)

- Move applications between devices (wife complains about the sound of my video call/video/game? Move that thing to the USB headphones; no physical jack would directly do that for you anyway)

- Normalize output so that nothing blows up my headphones when I switch between different audio sources
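
These per-stream features can also be exercised from the command line with PulseAudio's pactl tool. A sketch only: the stream index and sink name below are hypothetical, check `pactl list short sink-inputs` and `pactl list short sinks` for the real ones on your machine:

```shell
# Show running audio streams ("sink inputs") with their index numbers
pactl list short sink-inputs

# Per-application volume: drop stream #5 to 40% (index is hypothetical)
pactl set-sink-input-volume 5 40%

# Move a running stream to another output, e.g. USB headphones,
# without restarting the application (sink name is hypothetical)
pactl move-sink-input 5 alsa_output.usb-headset.analog-stereo
```

The desktop mixers (pavucontrol, the GNOME/KDE sound applets) are just front-ends over these same operations.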

Apart from that I did use the 'stream over the network' feature a couple of times in the past and while it wasn't something I'd need on a regular basis it worked fine for me.

Long story short: PulseAudio solves problems for me. It totally might not do that for you, but please don't project.


I <3 pulseaudio as well, glad to hear someone else loves it. All the reasons you mention :)


Google's Android users activate one million new devices each day. This Linux based mobile OS has absolutely crushed Microsoft's mobile software, and has surpassed Apple worldwide.

I have begun to wonder: given the evident superiority of Linux in mobile, the lack of GNU within it, and the central control of Linux mobile development by Google and Amazon, to what degree have GNU and its development traditions held back Linux desktop adoption?

Thoughts?


GNU is not really relevant, except maybe to say that free software philosophy can't build something for the mass market, since it takes money to pay workers to slave away for the benefit of the plebes.

The UI layers (GNOME/KDE/Canonical) may be relevant. Linux desktop doesn't have a giant company funding massive ad campaigns and retail outlets and spit-and-polish and OEM integration, and all that is needed to launch a successful product for the mass market.

Linux desktop is very successful in the developer workstation and server markets, where the users are more tech savvy.


Now you made me curious. How would I be able to switch between different sound cards on the fly without restarting applications if PulseAudio did not exist?


Let's take a common case: I have a motherboard sound device which I use 95% of the time, and I have a USB camera that has a built-in microphone.

When I go to settings in Google Hangout, it asks me which device I want to use. ALSA presents the interfaces. I choose whatever combination I want -- mic from the camera, headphone out through the mobo -- and it works.
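
For ALSA-only setups where the application has no device picker, the same split (capture from the camera, playback through the mobo) can be wired up in ~/.asoundrc. A sketch; the card names are assumptions, list yours with `aplay -l` and `arecord -l`:

```conf
# ~/.asoundrc: asymmetric default device.
# Playback via on-board sound, capture via the USB camera's mic.
# Card names "PCH" and "Camera" are hypothetical for this machine.
pcm.!default {
    type asym
    playback.pcm "plughw:PCH"
    capture.pcm "plughw:Camera"
}
```

The difference PulseAudio adds is doing this per stream and at runtime, rather than as a static default.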


How many people ever had more than one sound card in their computer, let alone a need to switch between them?

http://xkcd.com/619/


Everyone with headphones or headsets using Bluetooth or USB.


I have some USB speakers I'm fond of, and PulseAudio was indeed the only way they ever worked right for me on Linux.


As CPU counts continue to rise and Flash support continues to get dropped from more platforms, that comic will only become more ironic.


I think I have four: onboard audio, a PCIe sound card, a USB microphone and HDMI audio on the video card.


Not sure if serious


The problem with Android is that it's diverging from the Linux desktop path (i.e. X11->Wayland). Being completely incompatible sets Android apart, and it arguably can't be called a Linux desktop at all. The split it caused in mobile devices (with no drivers available except for Android) is a horror; as Aaron Seigo put it, Android is the best friend and the worst enemy.

The real future of the Linux desktop and mobile is Wayland, and it seems Android will stay apart forever, going its own way.


"the real future"...

The real future is what users like best, not what developers think that users might like. I personally don't like the Wayland approach and I think that competition is fine. Let the "market" decide what's best.


Users like what's comfortable and functional to use. But those who create it are developers. Regular mobile Linux has a hard time presenting existing software solutions to the user because of the hardware barrier, not because they are in any way inferior to the Android user experience (if anything, they are superior).

That is, practically no manufacturers provide closed Linux drivers or open specs (from which open source drivers could be created). They mostly only care to provide Android drivers. Hopefully Jolla and Plasma Active will help break through somehow.

This situation is caused by Android's historic roots: it started as a proprietary project and didn't take the interests of the Linux community into account. The fact that it was open sourced later didn't really change anything - it's de facto completely separate from the Linux desktop, as well as from the mobile Linux varieties that share effort with the desktop distros. Wayland was created with collaboration in mind; Android's architecture was not.


I'm curious - why can't a desktop version of Android be possible? I mean, OSX is moving to a world where an app developer (optimistically) can code once and compile for both desktop and mobile targets. Why can't Android drive that on Linux?

I would like to think that if Android was available on both desktops and mobiles, we would have been happily playing Dark Souls using Steam-on-Linux by now.


It's not that it "can't". It's just that it would hurt Linux even more. Do you want the sick situation with mobile drivers to spread to the desktop as well? No, thanks.

Also, Android is rather narrow in its capabilities compared to normative Linux. No need to cripple the desktop like that.

It's going to be the other way around - normative Linux will start competing with Android in the mobile sphere.


> The split it caused in the mobile devices (with no drivers available except for Android) is a horror, and like Aaron Seigo called it - Android is the best friend and the worst enemy.

These mobile devices could grow some HDMI and USB ports in a few iterations and become stationary.

Tada - there's your Linux desktop.


That's not the point (many mobile devices already have USB and HDMI ports). The point is that Android's incompatible architecture sets it apart from the rest of the Linux world. And it's a horror to squeeze X.org or Wayland drivers (as well as other Linux drivers) out of the mobile manufacturers when they say "we are too busy with Android".


I have a Transformer Infinity with the dock that I use as a netbook and can plug into TVs to stream movies. The most recent iterations of the Nexus Q et al have mini USB and mini HDMI ports and can effectively do the same thing. Android devices have been at the point of being desktop replacements since this year, mainly thanks to the Transformer tablets, but any Bluetooth keyboard plus HDMI-compliant device can serve the same function given a beefy enough processor.


Actually I think Canonical is doing a terrific job. IMO it has the best out-of-the-box experience of all desktop OSes on the market.



