Utterly unfair comparison. The Apple and Microsoft business models are so unlike each other it's almost inaccurate to say they're in the same business. Apple can get product quality to the point they do because they laser focus on a few profitable segments. Microsoft's business model --- not "what Steve Ballmer thinks they should be doing this month" but their underlying business model --- involves them serving 80+% of the market.
This is sort of like complaining that Safeway can't make ice cream as well as Ben and Jerry's.
This is true, and I think it accounts for a lot of the difference in quality between their products. But in this case, it sounds like there just weren't clear, enforced rules about integration, interoperability, and usability. Not once in the email does Gates reference an existing, company-wide set of standards. Even a company like Microsoft is capable of having and enforcing such a thing, and it would actually boost productivity over the long run. If all the different teams are doing their own thing and subject to an off-the-cuff usability review only after they've shipped, they will ultimately spend more time making revisions than they would have spent targeting standards from the start.
I've no idea what Microsoft was like at the time, so maybe they did have such a thing, but if there was, I can't imagine it had much priority in the corporate culture.
> This is sort of like complaining that Safeway can't make ice cream as well as Ben and Jerry's.
And in fact, Safeway doesn't make ice cream as well as Ben and Jerry's. It's ice cream, not apples and oranges.
Apple is in a few markets where it serves a large majority of the market (music players being the obvious one). There is probably room for Apple to improve on those products, but I suspect most rational people would not claim Apple's products in those markets are as poorly made as most of Microsoft's products. Effectively, it is not the unfair comparison you claim.
How do any differences in the business model cause or justify bad design/integration in either their web experience or software products? Duplicating popular software in higher volume isn't a burden on design, is it? I fail to see why popularity should be an excuse for mediocrity.
The distribution methods and relationships with customers differ a great deal, but both companies develop software, some of it very similar in purpose, and both also sell hardware. The problems outlined in the email were not related to problems integrating with others' hardware, so lack of ability to integrate with that isn't an excuse (this time).
I think it is the ice cream comparison that's the unfair one. Software sold in volume is a high margin business. With 80% market share there is neither the justification nor the excuse for the sort of blunders outlined in the email. Unlike ice cream, there are no physical ingredients to cut corners with. It's doubtful that things that compromised quality saved them any money at all. What resource did they lack?
It's not as if they were a hardware business operating on razor-thin margins to get market share and cutting corners to break even. They've got revenue, they've got cash, they've got many talented engineers. They spent about 9 times as much as Apple on R&D in 2007, and about 7 times as much in 2009. Considering how often Apple ships major OS updates and introduces new or updated hardware designs, it seems like MS isn't producing as much fruit for its R&D money. Did I miss something?
It wasn't a comparison of companies, it was a rebuke of Microsoft's leadership. The criticism is certainly valid, and all the more convincing because Microsoft HAS shipped high-quality products, so they are CAPABLE of doing great work, they just get LAZY.
I agree. When you consider when products were released and compare them to similar offerings at those times, many Microsoft products are (or were) "decent". I would add these:
- Windows 95: Not perfectly stable or technically very groundbreaking, but a good enough combination of usability and features to be considered decent in my mind and millions of others at the time.
- IE6: great at the time. Some of the standards it doesn't support didn't exist when it came out. Also, IE5 introduced the Microsoft.XMLHTTP ActiveX control, which was the foundation for XMLHttpRequest. If Microsoft hadn't created that at the time, AJAX almost certainly wouldn't be as ubiquitous as it is today.
- SQL Server 20xx: Perhaps RDBMSs aren't your thing, but they're very much needed, and Microsoft's can certainly handle most relational db needs.
- .NET 3.5+ and C# 3+: If you want a statically typed language and platform, anything .NET 3.5 or greater is at least OK.
- ASP.NET MVC 2+: Not quite the same as a dynamically typed RoR or Django, and they don't give you a default ORM (you pick it), but it's solid and improving. Source code is available; contributions aren't accepted.
- Visual Studio 2008+: Again, not some crazy vim/emacs setup, although plugins can somewhat help simulate that, but these definitely don’t suck at all.
- IIS7+: Just because it’s not Apache doesn't mean it isn't decent. It is quite powerful and secure.
- Windows Server 2003+: Never had bad experiences with these.
- Microsoft Security Essentials: Very good and free antivirus.
I'm sure there's more, but it is an exaggeration to say Microsoft hasn't made any decent products. Certainly some of these don't seem as good overall when accounting for the licensing costs and the associated culture/stigma that comes with working in Microsoft shops, but many of these products, in and of themselves, are decent.
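To illustrate the XMLHTTP point above: before XMLHttpRequest became a standard browser global, cross-browser AJAX code had to fall back to Microsoft's ActiveX control. Here is a minimal sketch in the style of mid-2000s code; the `env` parameter is a hypothetical hook added here so the lookup can be exercised outside a browser, not part of any historical API.

```javascript
// Cross-browser XMLHttpRequest factory, mid-2000s style.
// Modern browsers expose a native XMLHttpRequest constructor;
// IE5/IE6 only offered the ActiveX control that started it all.
// `env` is a hypothetical parameter for testability; real code of
// the era checked the browser globals directly.
function createXHR(env) {
  var g = env || (typeof globalThis !== "undefined" ? globalThis : {});
  if (typeof g.XMLHttpRequest === "function") {
    return new g.XMLHttpRequest(); // standards path
  }
  if (typeof g.ActiveXObject === "function") {
    return new g.ActiveXObject("Microsoft.XMLHTTP"); // legacy IE path
  }
  throw new Error("XMLHttpRequest is not supported in this environment");
}
```

Real-world versions often tried several ActiveX ProgIDs ("Msxml2.XMLHTTP", then "Microsoft.XMLHTTP") in order, since different IE installs shipped different MSXML versions.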
I view all 32-bit protected-mode versions of non-NT Windows (i.e. 386, 3.1, 9x...) as amazing technical feats, because they do something that almost should not be possible: the ability to transparently use real-mode DOS drivers from Windows programs and vice versa (although at a huge performance and stability penalty).
"decent" is the perfect name. Better than the crap the anterior version was is a better description. Perfect for milking the cash cow.
-Windows 95? Come on, your memory fails. It was utterly unusable and crashed hard every day. Any other OS of the time was better. It was so basic... Paint? Notepad? A big pile of shit. And they kept bundling the same crap for more than 10 years, until Windows 7.
-IE6 was good at the time. It's sad it went so long without new versions.
-The first usable SQL Server was 2000, and frankly, until 2003 Oracle ran circles around it.
-.NET and C# are OK; C# is a fine language.
-ASP.NET I do not know.
-Visual Studio is OK, but largely because it has no competition.
-IIS7+: I do not remember the last version I touched, but I had to clean my hands with bleach afterwards. Oh, and it came with W2000 and was obligatory for some SQL Server stuff, for no apparent reason.
-Windows Server 2003, better than the 2000, yes. And grey.
Microsoft Security Essentials: Not sure what you have against this. I happily suggest it to all my friends and family when they ask for antivirus. It's free and effective, and doesn't cause headaches (à la Norton, etc).
I'm not sure what his beef is with Microsoft Security Essentials, but I just tell my friends and family to get a Mac.
It's too tiring keeping up with antivirus offerings, and constantly needing to clean up after their messes.
I'd recommend Linux, to be honest, because it's a cheaper alternative. But since there's still no good way of using it without needing to open the Terminal at some point, the Mac is, I suppose, a good compromise.
You can use Ubuntu without opening the terminal, and have been able to for a while. The problem with Linux is not the terminal; it's whether the hardware is supported and whether the software people need is available.
I recommend that you sit behind a non-technical relative or friend, and watch them use Ubuntu without offering them any help.
There are quite a number of things that they'd like to do that aren't clearly expressed - or even easily findable! - through Ubuntu's GUI. They will struggle with it for a bit.
Plus: even if we set that aside, there will be certain bits of hardware you'll want to use that would require opening the Terminal to configure/install.
Because my objective is to minimize the time I spend helping my friends/relatives with their computers, Ubuntu simply isn't a good enough alternative to Windows. You merely trade the time spent making the computer virus-proof for time spent teaching them where to find things (which is funny, because most of the time I'm finding things for the first time myself, seeing as I prefer the Terminal).
As opposed to the Mac - I send them away to buy one, and they never come back to me for help.
PS: Why the downvotes? I expressed a justified assertion, backed up with some evidence, and people react to the fact that I'm dissing Linux? Tsk, HN, I expected more of you.
I recommend that you sit behind a non-technical relative or friend, and watch them use a Mac without offering them any help.
There are quite a number of things that they'd like to do that aren't clearly expressed - or even easily findable! - through OSX's GUI. They will struggle with it for a bit.
--
Really, it's just a learning curve and it's the same for any new OS.
Most non-technical people don't have the habit of figuring out what to do. They have the habit of doing what they're used to. For most, if you change even the smallest thing in their interface it can be catastrophic. (My mom often calls me for things like not being able to use internet at all anymore because the bookmarks bar is gone because someone accidentally hit ctrl+b or something).
I think the whole "Apple just works" thing is majorly overblown. Every time I have to use an OSX laptop I cringe because I have to look for stuff in the weirdest places and it takes me so long to figure out how to do things that I often just give up (if I'm using an OSX laptop it's usually because I don't have much time and just bummed the first laptop that was around). I can't imagine my mom magically finding everything intuitively if I can''t.
Also, you say you backed your assertion up with evidence. What evidence? I see none at all. The plural of anecdote is not data. (I'm not saying I'm providing evidence either, I'm simply providing a counter-example to your anecdote.)
Evidence != data. Evidence = data and/or examples and/or anecdotes. Evidence is also most effective when backed up with analysis/logic (otherwise you get ping-pong debating, where both sides toss counter-examples around, proving nothing)
Note too that I said some evidence.
>I think the whole "Apple just works" thing is majorly overblown.
I suppose the simplest way to prove this is to consider the alternative: of what other operating system, taking into account everything that a non-technical user needs, can this be said to be true? Perhaps it is not true of all our current operating systems, but if it is more preposterous to claim that Windows or Linux 'just works', then Apple takes the cake and my point is justified.
Whether it is overblown or not is not my concern - it is currently the best that we have, and I have to make do. Perhaps iPad-like devices would prove to be better, but that future is still two-three years away.
Anecdotal evidence seems to support this: I wince whenever I see the Macs my friends and family use - the desktop is cluttered beyond repair, the dock is stretched across the screen, and they don't know how to use various core features (Exposé, Front Row, etc). But they don't complain, and they don't ask me for as much help as they did before, and this is the only metric that matters to me.
"Plus: even if we set that aside, there will be certain bits of hardware you'll want to use that would require opening the Terminal to configure/install."
Change your mindset from hacker to 'non-technical relative or friend', and that problem goes away.
"There will be certain bits of hardware you'll want to use that will not work with you new system." is how they would phrase it, and oftentimes, it does not really bother them at all.
Even if it would bother them, getting a Mac likely would not help, as it would get them the same results for some of their old hardware.
I think you could say the same about any foreign operating system and any user, regardless of how technical. The more technical have better guesses at where to find things (e.g., Oh, it's not under 'System', maybe it's under 'Administration') based on familiarity with OS patterns, but if you sat me down in front of a Gentoo box running XFCE, I'd look just as much the fool as the next guy for the first 20 minutes or so (except I'd drop down to shell -- though the same applies for BSD even there.)
The best synopsis I heard (I can't remember who said it, nobody famous) was that "The hardest part about learning Linux is UNlearning whatever else you know first."
I think that correctly asserts that yeah, you won't find things in the "Control Panel", and you won't be able to "Win+R" for a run dialog, etc. Otherwise, I personally have had nothing but positive feedback from my less technical friends that I've turned on to Ubuntu, except for the usual complaints, or when they have to give up foo application because it doesn't exist (Photoshop, World of Warcraft, whatever.)
Hardware support on Linux is a red herring. There are some few devices that don't work all that great, like some horrible sound chipsets and softmodems, plus the very latest in WiFi used to have problems (I think that's less of an issue nowadays).
Some years ago there was a project set up to fix those Linux hardware issues, and they effectively didn't do any work, because they couldn't find reasonable hardware that wasn't supported by Linux.
I have a (crappy) scanner that has no working drivers for any Windows beyond 98 and Mac OS beyond version 9, but it works just fine under Ubuntu/Linux. I haven't seen any mainstream, reasonable hardware that doesn't work under Linux for a long time. Really, from my perspective, hardware support under Linux is significantly better than under any other operating system.
Where it indeed gets spotty is the software to use the features of the hardware. E.g. Xsane rhymes a lot with insane, the various photo organiser tools are of varying quality, to say the least, and so on. Ubuntu is making progress, but there still are a lot of use cases where the GNU/Linux/Ubuntu desktop doesn't provide good alternatives.
> Windows 2000 and XP were very good operating systems
I'm by no means a Microsoft hater, but I'm amused by this rose-tinted nostalgia for XP. It's easy to say XP is a good system now, with the maturity of three service packs and the bad taste of Vista still in our mouths, but I remember, all those years ago, much dislike of XP around the internet. Among other things, it was a security nightmare that wasn't even fixed until SP2. Put release-day XP on your machine, connect it to the internet, and your machine will probably be hacked within the hour.
On the other hand, not too many people stuck with Vista. A fully patched Vista installation works just fine nowadays. It still has higher system requirements, but those requirements are now met by 4-year-old systems.
You are remembering it backwards. 2000 and XP (because it was the consumer version of 2000) were much better than Windows 98, Windows ME and NT. This is why the poster says they were good OSes: they were so much more stable that you could actually leave your machine on for a while.
who cares? you can say this about any major release number OS. go ahead and throw out netbsd stats all you want. no one tries to hack that b/c no one cares. stats are a funny game, my friend.
Win2k being released was a huge blow to Linux at the time. Redhat 9 was so awful in comparison. I remember how a lot of developers I knew who had been Linux users actually switched back.
IIRC, the Linux of the time had trouble with multiprocessors and ran on ext2. Nevertheless, I cannot imagine someone giving up a fully functional Unix environment to go back to a Windows box. Lots of mouse clicking for much less power.
Almost all releases of the NT line were good (including Win 7).
Was there any OS contemporary to NT 3.51 that was better than it at anything (besides Win 95 when it comes to compatibility with more software & hardware)?
Solaris 2.5 was a very solid OS and better than NT 3.5 in most respects, unless a defining criterion for "better" is "runs well on PCs", which it didn't.
OS/2 was also a good OS for x86 PCs. At least as good as NT, unless, again, you want it to run software designed for Windows.
I wasn't really into OS/2 but I concede that it was probably about as good as NT at the time.
I would, however, much rather use NT 3.51 as a desktop OS than Solaris 2.5 with CDE and its 80s-style command line utilities (luckily it seems they are using GNU stuff nowadays?).
I quite liked NT 3.5 as a development platform back then. I came from WfW 3.11 and used Borland C++ 4.5, and nasty programming bugs would hang 3.11, but not NT 3.5.
The /Mac/ was a pale imitation of itself, by those standards. Developing a Mac app was just as gnarly as developing a Windows 3.1 app, but by 1995 Microsoft had Visual Basic (to Apple's HyperCard, which was virtually moribund at that point) and MS took off and /owned/ the database client market.
Apple never really believed in its development tools group, and the market share they lost in the 90s shows that. They are still paying for lost opportunity.
Continuing the "Microsoft and Apple are different" theme, the thing that's worth considering is that the "goodness" of these products is the goodness of well engineered compromises.
Current Word is actually a fusion of DOS Word and the original Windows Word, and while it has a huge feature set, this set is not terribly consistent. It's an achievement to make things feel as good as they do. Similarly for the others, and very different from producing a clean, "just works" app from the ground up (not necessarily a better or worse achievement imho).
Mostly you're right, but I object to the claim about Excel being better.
Word was better than anything else I had found to use at the time (WordPerfect was a buggy mess).
However, I claim that there were better spreadsheet products available. Certainly the one I used, Enable, was quite robust and more capable than Excel.
Excel for Windows was much better than Lotus 123 for Windows. Lotus basically took the DOS version of 123 and added some menus to it. It never did feel like a first-class GUI app.
Lotus took Microsoft's word seriously when Gates proclaimed OS/2 was the future and Windows was an interim step. Lotus invested a lot in 123 for OS/2 (123/G, IIRC) and, when the IBM/Microsoft divorce made evident they chose the wrong side, they had to port it to Windows all over again (with lots of different and conflicting APIs). Microsoft had no such problem.
Their Macintosh products tend to be really great. I really loved IE5 for Mac and was sorry when they discontinued it. MS Word on the Macintosh always seemed well designed too. I assumed this was because their Macintosh division was long established and very good at what it did, or that designing for a platform they did not control brought out Microsoft's true design talents.
Word? There are still things that were a piece of cake to accomplish in WordPerfect (5.1 for DOS; I never used WPWin) or AmiPro[1] (3+ for Windows) that are impossible, or extremely difficult, in Word.
[1] I say this despite having once written a piece (c. 1993) based on a Dr. Seuss book that featured the lines "I could not merge 'grneggs' with 'ham'. I do not like you star dot sam."
To be fair, Windows and the Windows ecosystem is so huge that it would be almost impossible for one guy to sit down and say, OK, this is how Movie Maker is going to get installed, and this is how it should function, and this is how Windows Update Q52324 will get installed, and blah blah.
My point is that it's easy to think Steve Jobs is in complete and total control of every pixel on an OSX screen and has intimate knowledge of the tiniest details of how every bundled app works, but the truth is probably closer to "Jobs hired a bunch of very smart people to do it for him, and managed them in a way that allowed them to function without the bureaucratic paralysis that plagues Microsoft." Which is a state very different from "dictatorial death grip."
(Though I won't argue that Jobs is indeed a benevolent dictator, and a damned good one).
A wise man once taught me that this is the role of culture. You can't be there all the time, and don't want to be: Smart people need to be able to make their own decisions or they get bored and demotivated. A good culture helps them make the decision you would have made by propagating knowledge of which values are most important to the group.
You can still set rules. "All first-party Microsoft software will take at most 5 clicks to install. More than 5 and you need a sign-off from a VP." Palm had a rule that it should take 3 taps to get anywhere on the system. They had a guy whose full-time job was to count taps for various actions.
I agree, but I believe it helps for Apple's employees to know they won't get away with a user experience like that under Jobs. They've got smart people, the ability to get things done, and they know what specific areas cannot be compromised.
You're right and the anonymous commenter on their site said it well:
"The Steve Jobs version: "If the MovieMaker download site isn't working by tomorrow at 6 am I will come down there at 6:01 am an choke the living ___ out of all of you.""
Jobs may well provide more severe threats than Gates, but I still think there's an important distinction between firing/choking/maiming employees and refusing to ship (or canceling) a product.
That's right, and the difference between those kinds of threats is personal confrontation.
Some folks will rip apart shoddy work because they don't care if the creators get personally upset about it. They're not looking for a confrontation, but if there is one, they feel justified.
Other people are bothered that the creators are not personally upset about how shoddy the work is and make quite sure that they are. It's a subtle but important difference.
Well, I'm saying that's what motivates some directors to rip into work; I admit it's not a common attitude. But I have seen the "How can you think this is acceptable?" attitude and it is hard to take for a lot of people.
Yup, now they just need to actually do that with quite a few of their own products and we'd have two happy, completely different corporations that don't release sh*tty software from time to time.
Look at what Gates has done since he loosened his grip at MS. The Gates foundation's global health and global development programs have done tremendously important work to reduce human suffering in the 3rd world. Nothing Gates could do at MS would come even close to what he is doing now to make the world a better place.
Compare to Steve Jobs who knows how to make wonderful shiny devices that cost more than what many in the third world will make in a lifetime, but who has done very little when it comes to philanthropy.
I see people write this all the time, and it makes me sick.
Bill Gates decided he made enough money at Microsoft and wanted to pursue other things. That's fantastic, for him and millions of others. Steve Jobs has decided to do what he loves until he can't any more. That's also fantastic, for him and for millions of others. They're both making positive changes for lots and lots of people. Who are you, or anyone else, to criticize that?
I don't see why these two need to be directly compared this way, as if Gates finally showing his soft side means Jobs should too.
Steve Jobs releasing some new shiny thing that makes the lives of some very affluent people (on the global scale of wealth) marginally better doesn't compare, is not even in the same ballpark, as ensuring that hundreds of thousands of people don't go blind in central Africa. That is my belief, and of course I understand that not everyone shares it.
I actually believe that the iPhone will have a _huge_ (indirect) impact on subsaharan Africa, in a manner similar to traditional cellular technology.
Principally, the iPhone seems to have hastened the development of low-cost hand-held medical diagnostic equipment, including digital stethoscopes, portable ultrasound, eye examination, remote disease diagnosis, etc
Sometimes it takes a revolutionary product targeting affluent individuals to help drive innovation for everyone else.
I can appreciate that point of view, though I don't totally agree.
The problem I have is that they're no longer doing the same thing - it's like comparing Bill Gates to Donald Trump. It might have made sense to compare Bill and Steve while they were working in the same field, but why bother any more?
> Steve Jobs releasing some new shiny thing that makes the lives of some very affluent people (on the global scale for wealth) marginally better, doesn't compare [...]
> Compare to Steve Jobs who knows how to make wonderful shiny devices that cost more than what many in the third world will make in a lifetime, but who has done very little when it comes to philanthropy.
I really hate this comparison for one reason: Gates has a lot more money than Jobs - IIRC the amount of Gates' wealth that he has put toward the Gates Foundation is more than Jobs' entire net worth.
There's also the matter of whether Jobs isn't pursuing philanthropy or if he's just not doing it publicly.
This point seems to come up every time Bill's name is mentioned these days. It is really funny how he has gone from Darth Vader to Mother Teresa in a few short years. I mean, it's nice that he is giving all his money away and making a difference, but everyone seems to skip over the fact that his money was made by running an evil empire and being one of the most reviled figures in the computer world.
I didn't claim that it was. I do think it is worth considering all the costs and benefits of Bill G staying at MS. The benefits might be a better Windows. The costs are all the work he wouldn't have time to do outside the technology sphere.
Ah. I don't believe mrshoe was talking about Bill Gates remaining CEO of MS. He was suggesting that Gates should have managed more like Jobs back when he was CEO.
I can't shake the feeling that I've always been somehow out of touch, because I'm not really a fan of most apple software. First thing I do with osx is the same thing I always used to do with windows. Get rid of all the distracting annoying stuff (dock, dash, iEverything) then happily get down to business. I never did use vista but with all other windows versions I would always just remove a dozen or so things and happily move along with my work never to be annoyed by them again.
The only thing that keeps me using my iPhone is safari and the screen. It's great. That's the killer feature to me. Yet it's annoying I can't turn images off without jailbreaking.
I don't really think apple delivers software products that are much better than MS products. Hardware yes although the xbox line is pretty sharp.
It's a lot easier to remove all the junk from OSX.
It sounds like you haven't seen a brand new Windows machine in a while. They are cluttered beyond belief. The OSX clutter, in comparison, is unnoticeable.
The other nice thing about OSX though is that the migration tools have always been totally painless for me - when I get a new mac I DON'T have to go remove that stuff because it saves my settings automatically. I haven't had to remove the clutter since my first MacBook in like 2006 - the settings were carried over. In Windows this rarely works, and when it does it takes a lot of work and doesn't do a complete job.
To be fair, a clean retail Windows install is just as clean as OSX. The clutter comes from shitty bundleware.
This is, IMHO, one of MS's greatest struggles - it is not enough for Microsoft to develop a culture of top notch user experiences, but also all third party OEMs. This is a problem that Apple does not contend with.
We see this with Windows phone also - no amount of software and UI can save a device designed and manufactured with rock bottom quality.
The issue here though is that Bill was trying to download MovieMaker. I bet when Bill talked to the MovieMaker team he was given a demo of MovieMaker already installed on a computer.
With that said, at Apple the website feels like an extension of the company. With Microsoft the website seems like a standard site for a corporation.
"MS probably could have made some decent products."
Microsoft products are great. Not all of them, but most of their software products are, and it is the reason they own the desktop market and have millions of users using many of their software products every day.
I like Apple's products as well... but it is the hardware that I like... not so impressed with Apple's software such as Mail or Xcode.
Both companies have their strong and weak points... including some in critical strategic areas. For instance, if Jobs wants to live up to his reputation as a dictatorial advocate for quality and engineering, why doesn't he force-choke the program manager for iTunes?
I suspect that the iTunes UI Manager's job strongly resembles an Admiral's position. On a regular basis, I can see minions dragging the latest victim out of Steve's office while he intones, "Apology Accepted."
Reminds me of the phrase "Nobody ever got fired for buying IBM." SharePoint works with so many MS products and there is such a huge eco system that goes with it. I think it's a safe bet for many companies that already rely on MS products. I would also give a lot of credit to the MS consulting/salesforce for many sales of that product.
edit: there are categories of software that have representatives that are more dysfunctional than Sharepoint. Still, Sharepoint is pretty much the worst document management/intranet package you can find.
Enterprise software can be surprisingly awful. blasdel's story is the best, but I remember being astonished that you could actually pay for version control software. The fact that said version control software didn't even have atomic commits but did, interestingly enough, run on its own proprietary file system was almost a Kafkaesque punchline.
I've seen some gibberingly awful Oracle web interfaces for HR stuff like entering time cards, but the worst was a "Software Life Cycle Management" thing that a client was using — a bizarre issue tracker with piles of inane baked in process.
When you logged in it would open a sole chrome-less popup window. You weren't allowed to be logged in more than once per account: one computer, one browser session, one window, one tab, period. There were no hyperlinks and only one URL — all the buttons POSTed a form with tons of parameters back to the same URL, with lots of state both in the session on the server and independently in cookies. You couldn't link to anything, it had a ridiculous taxonomy system, and the search was actively antagonistic to you finding anything. Most of the actual data worth looking at was in unindexed Word documents attached to the issues.
Xbox was revolutionary. Online multiplayer on a console had been tried many times before — as far back as the Sega Genesis and as recently as the Xbox's rival, Playstation 2 — and failed miserably every time. Microsoft was the first company to make it work, and their current implementation is still probably the best.
And I don't know any developer who would dismiss Visual Studio. In most regards, it's a best-of-breed IDE.
I would be fascinated to hear from them. I know lots of developers don't use Visual Studio for some reason or another (I'm one of them), but I've never heard anyone call Visual Studio a poor IDE.
Java IDEs like Eclipse and IntelliJ made VS look like a toy until very recent editions. No built-in refactoring support, limited configurability for keyboard shortcuts, minimal integration with external build tools and SCM systems -- it was horrible going back to C# development from Java when I was doing both in 2005-2006.
Not sure where eclipse is at now but I have less than fond memories of that beast. I did most of my java coding in ultraedit so I could actually listen to mp3s at the same time back then.
Not that I had a lot of love for VS either, but I don't remember it ever running as heavy as eclipse.
I used VS for the better part of my 10-year stint at a BFE, and it served rather well. However, I would be reluctant to go back to it having spent serious quality time with emacs.
The plus about it is that it has all the stuff that you need to build the final payload/program/assembly/whatever. My discomfort with it these days would be that it requires too much mouse work.
(And best-of-breed sounds very BFE, or something that Gartner would say, hardly the people I would look to to know what is going on).
Heh, I don't know a better word for it. That's how people who are enthusiastic about IDEs tend to talk. I would hesitate to call it "awesome" or anything like that, but among IDEs, it's well thought-of.
They bought the source code for SQL Server from Sybase; the foundations of the product weren't created at Microsoft. The wire protocol is still largely compatible with Sybase's.
It's one thing to yell at the people who made this system via email. It's quite another to never allow them to ship it in the first place.