Now that the mobile titans are battling for market-share and mind-share, it seems that the golden days of the Mac switcher story are pretty much over. The future of computing lies not in the PC or the Mac, but in the more abstract mobile platforms that power the iPhone, the iPad, and their counterparts. So, before the fine art of the Mac switcher story becomes extinct, I’d like to contribute my own. But, rather than just write about how I came by a Mac, I will also write about why my story is as much about Apple itself as it is about becoming a Mac user.
Vista, Linux, or Mac?
I was late to the party. Very late. Prior to 2008, I owned no Apple products, not even the venerable iPod. It isn’t that I was a Luddite. I was working on a degree in computer science and I had considerable programming experience. And I wasn’t anti-Apple. I just had no interest in an iPod, an iPhone, a Mac, or anything else made by Apple. My music was on CDs, my phone wasn’t smart, and my custom-built PC dual-booted Linux and Windows. Thus, in a world full of people who either adored or loathed Apple, I was honestly apathetic toward the company. That was about to change.
In 2008, my wife’s two-year-old Dell laptop choked. Its battery failed to hold a charge, the integrated LAN port had died, the display had developed a flicker, and a few of the special keys weren’t so special anymore. But the problem that eclipsed all others was its performance: it had slowed to a crawl, as if the mere suggestion of opening Firefox caused the thing to sputter and gag and fall to its knees. It was virtually unusable, even for my wife’s meager computing needs.
Like a seasoned Windows power user, I first attempted the standard remedies: complete anti-virus and spyware sweeps, registry cleanup, cruft removal, defragging, disk analysis, and Windows updates. When none of this worked, I went for the nuclear option. I formatted the hard drive and re-installed Windows.
Sadly, there was little improvement. It was as if the hardware itself had given up on life.
Next, I wiped out Windows and installed Xubuntu, the lightweight version of Ubuntu targeted at lower end systems. Xubuntu worked well for a few days. It wasn’t perfect, but the speed was acceptable, and my wife was browsing the web and checking her e-mail without complaints. But then a few major bugs started giving her headaches. I don’t know if these bugs originated in the Linux kernel, in X.org, in XFCE, or in some other esoteric component. In any case, the laptop would not sleep or wake reliably. Even after closing the lid, it would often run at 100% CPU consumption for hours, overheating and causing all sorts of havoc. During these episodes, the power brick felt like a chunk of pyroclastic rock recently ejected from Eyjafjallajökull, and I feared it was a fire hazard. Also, it seemed that the sound system wasn’t exactly stable, as the Dell screeched and belched every time there was a confirmation “ding” or an accusatory “dong” from the UI. Finally, the virtual workspaces went wild and switched themselves randomly as if the machine were possessed by a productivity-obsessed demon. I spent hours Googling for fixes. I perused forums, blogs, and wikis. Despite my efforts, I gave up. I was exhausted. And it was time for a new laptop.
At the time, I was quite disillusioned with laptops. They all seemed to suck. I hated the small, cramped trackpads, the heat problems, their lack of serviceability, their creaky design, and batteries that — if they didn’t burn your house down — often required replacing after just a year. So, at my urging, my wife and I decided that we were in the market for a desktop rather than another laptop.
While shopping, we walked by the Apple kiosk at Best Buy — that tiny alcove of shiny glass and aluminum amongst a sea of black and gray mottled plastic — but we immediately turned away when we glimpsed the price tags. Compared to a similar Windows machine, the Macs’ specs were weak, their list of features small and unimpressive, and they were still more than double the price. What kind of racket was Apple running, anyway?
But by this time, the summer of 2008, another variable had entered the equation: Windows Vista. Despite having mostly migrated to Linux the year before, the demands of work and the occasional game of Company of Heroes meant that I was still shackled to Microsoft, and I had been using Vista on my dual-boot machine. Vista was a constant frustration that I loathed even more than its hideously ugly predecessor, Windows XP.
So we were in a bind. We could choose a system running XP (an old evil), Vista (a new evil), or go for a custom build with another Linux install. The biggest hurdle with Linux was instability. As a Linux user, I knew that each kernel update brought fixes for old problems and introduced new ones. Regressions were common and troubleshooting Linux’s weaknesses sometimes teetered on being a part-time job. I was used to it — I had convinced myself that I enjoyed it actually — but I was not about to put my wife in that situation. She needed something simple. She needed something that didn’t require a lot of tweaking. So, despite the price, we decided to take another look at the Macs.
As I considered buying a Mac, I was attempting to rationalize the purchase in a way that justified the increased price. Having never used a Mac before, I had to resort to something other than personal experience, so I fell back to what I knew.
I knew that Mac OS X had its roots somewhere in FreeBSD. Being a *nix geek, that was a huge selling point for me. I also knew that Apple had played a role in a few open source projects like CUPS and WebKit. I never suffered the delusion that Apple was an “open” company, but I wasn’t an open source zealot so I didn’t much care. I used Linux because I hated Windows, not because I was making a political statement.
And there was another, more embarrassing, factor in my decision: one of my earliest computers was an Apple IIe. I had adored that machine. In addition to an early model IBM PC AT, the Apple IIe was a computer on which I had spent many long hours programming text-based fantasy games that I had lovingly referred to as my “RPGs.” Later, once I had figured out graphics mode, I had written my magnum opus, a custom version of Pong, in BASIC on an Apple IIe. Yes, I loved that machine, and those fond memories definitely softened me to the idea of buying Apple’s new machines, even if they were a pale shadow of the computers of yore.
Granted, these are all poor reasons to drop an enormous amount of cash on what appeared to be a childish imitation of a computer. I mean, come on, this thing — this iMac — was all one piece. How did you swap out the hard drive? Replace the monitor? Graphics cards? Why was there no physical button to eject a DVD? And that Mighty Mouse thing? Utter rubbish! (And it remains rubbish. Despite their amazing track record on everything else, Apple has yet to produce a usable mouse.)
Ultimately, my disgust with Windows and my lack of faith in Linux left me no alternatives. It was either buy a Mac or go back to Microsoft, so I decided on the lesser of two evils and we went home with a base model 20″ iMac. Its weak specs were still far above what my wife required and I was happy knowing that by avoiding their latest OS abomination I had smitten Microsoft.
I admit that I was a little embarrassed while carrying our new iMac out to the car. I didn’t want people to think that this thing, delicately packaged in an effeminate white box with a convenient handle, was for me. I was a programmer, a geek who liked code obfuscation contests and technical manuals and C++ GUI programming guides. I wanted to shout to the passersby, “This is for my wife. I’m just tired of servicing crappy machines and I’m looking for a break. Don’t judge me.” Who could blame me? Really, who?
This Isn’t So Bad
At home, we booted the machine and began exploring. We played with iTunes, Spotlight, Exposé, and Spaces. As a Linux user, I knew the latter three under different monikers, but all were impressive to see on a mainstream machine. We snapped some cute photos using the iSight camera. We browsed the Internet, glanced at iPhoto and iMovie. I then opened a Terminal window and tried out some of my trusty bash commands.
I was impressed. The iMac was as snappy as my custom Linux box. The UI was simple and intuitive. Every keystroke, every icon, and every menu item had a clear, defined purpose. No detail had been overlooked. My wife was comfortable with it immediately. No waiting. No tweaking. No crashing. No Googling. No man pages to read. It was nice, even though I hesitated to admit it at the time.
To be fair, we were annoyed by a few of the Mac’s quirks. For one, the mouse didn’t feel right. Despite having the sensitivity jacked up to the max, the mouse felt strangely unresponsive, as if I were mousing through molasses. It made my arm and wrist sore. I eventually discovered that the problem was with OS X’s acceleration curve. To remedy the problem, I installed USB Overdrive (which also happens to be the best set of mouse drivers I have ever used on any platform). In my opinion, this is one of Apple’s largest UI failures and it remains unresolved.
I was also annoyed with the Mac UI’s tacit policy regarding “click-through.” (Isn’t it odd that almost everything that intensely annoys me about Macs is somehow related to the mouse?) I hate having to click a window just to bring an application into focus when I can quite easily use a single click to both perform a task and bring the application into focus. John Gruber has written extensively (and eloquently) about click-through, but I disagree with him. Gruber thinks that people are confused when, after clicking to bring a background window into focus, the click inadvertently performs an additional action. In practice, I have never seen a Windows user confused by this and the explanation is simple: people are able to logically conclude that the action of clicking has consequences. Click a button, get a response. That’s a good UI principle to adhere to, regardless of where the window is positioned. Most Windows users are quickly trained to click in empty space in background windows to avoid inadvertent actions. Sure, it isn’t elegant, but I believe it makes for a more responsive UI.
Also, for the “it confuses people” argument to hold, the Mac would first have to resolve the common problem of a windowless application having focus. For example, if I have both Safari and iTunes running, it is quite common to close the last Safari window to see iTunes occupying my desktop, but the focus still remains with Safari until I explicitly click on the iTunes window. That’s just stupid.
Ultimately, despite its quirks and its limitation to Apple hardware, I decided that Mac OS X was the best choice in a very short list of choices for desktop operating systems. My wife was pleased and I was pleased.
Enter the MacBook
A few months later the new aluminum unibody MacBooks were released. Despite my disgust with laptops, my wife’s iMac had impressed me so much that I was curious to see how a MacBook would fare. For me, the most enticing feature was the new trackpad: a giant, beautiful, solid piece of glass that fully supported gestures. Sure, it wasn’t the first trackpad to support gestures, but it was the first that I found usable. After trying it out at the local Best Buy, I proclaimed that it was the best trackpad I had ever used, a statement that I still believe to be true today.
So I ordered myself a 13″ MacBook.
That semester I was taking an advanced Unix programming class. It was the standard Unix system stuff: file I/O in C, fork, wait, exec, etc. Our final class assignment was to build a rudimentary web server to demonstrate our prowess with Unix IPC. I typically did my programming in Linux, under Cygwin on Windows, or via SSH to the university’s Solaris server, but I had accidentally mangled GRUB somehow and my PC was unbootable. Sure, I could fix it in an hour or so — I just needed to figure out how to restore GRUB and pray that I didn’t destroy anything else in the process — but I was frustrated and hurried, so I decided to try programming on the MacBook instead.
Surprisingly, everything worked as well as it had in Linux. Even the more complex programs that failed under Cygwin ran flawlessly on the Mac. I was shocked. Here was a machine that my wife could use to buy and sync music with iTunes and yet I too, the classic computer nerd, could use the very same machine to code, test, and debug a web server from a native command-line interface running vi. That was damn cool, and I was no longer embarrassed to admit it.
Over the next few months, it became obvious that my wife liked my MacBook far more than the iMac we had purchased for her. It turns out that she preferred mobility to screen size. Since her old Dell had to be constantly plugged in, she hadn’t realized that a MacBook would last for hours on a charge. Now that she was untethered, she wanted one. After my wife commandeered the MacBook, I inherited the iMac, which worked out well because I was actually beginning to prefer it to my own custom-built Linux/Vista PC.
Although my PC had more than double the iMac’s specs, it was significantly less usable. The iMac and MacBook both slept and woke within a few seconds every single time. Not once did either crash when waking from sleep. Both Linux and Vista regularly failed to wake from a low power state and, despite hours of troubleshooting, I was never able to fix the problem. Furthermore, my PC sounded like a hurricane. Even with the fan speeds turned down as low as possible, the noise was annoying. In comparison, the iMac and MacBook were whisper quiet, so quiet in fact that I couldn’t tell by ear whether they were on or off.
But the OS was the most important upgrade.
It wasn’t until I saw how elegant and graceful an OS could be that I became discriminating enough to be annoyed by Linux’s warts. After having been spoiled by Mac OS X, I was finding it increasingly difficult to be satisfied by the “good enough” creed by which I had lived for so long. Linux began to feel like OS X’s temperamental, half-baked doppelganger. If it weren’t for the hardware lock-in and Apple’s strict control of the OS, I would have shed a tear at the discovery of my dream OS. (Although a convincing argument can be made that Mac OS X is so great precisely because of the hardware lock-in and Apple’s strict control.)
Linux does give the user complete control over his system but, as Uncle Ben said, “With great power comes great responsibility.” Linux demands a lot of its users. It is a tangled web of independently developed components. It is so modular and flexible that the system sometimes breaks down at the seams. As I got older and had less free time to spend tinkering, I was finding it more difficult to justify spending long hours massaging Linux into compliance. Despite wanting to maintain my self-image as a geek (how incongruous is that?), I was secretly loving and using OS X for its simple usability. OS X is the antithesis to Linux: everything is designed to fit together perfectly, yet the perfect fit comes at the cost of flexibility. Whereas Linux can be one of a billion different things, a Mac is a Mac and, if you don’t like it, you can install Linux.
Then, of course, there is the aesthetic charm of Apple’s products. For many years, I thought Macs were the ugliest computers on the market. But once the Mac’s primary exterior components were made of aluminum and glass, I was lovestruck. The Mac, iPod, and the iPhone were stunning: simple, elegant, symmetrical, devoid of embossed plastic, markings, or unnecessary lines, rarely hinting at the raw industrial power that lay just millimeters beneath. My days of yearning for the futuristic devices seen on Star Trek: The Next Generation were finally over. Apple was now making them.
In which I actually pay for software
It took a few weeks but I eventually found alternatives to my favorite Linux and Windows software. And, in most cases, the Mac versions were superior to their counterparts. The only downside to this was that I found myself shelling out cash for software that I would never have paid for on Windows or Linux. This is a significant difference between the platforms. As a Windows user (or a Linux user), I was accustomed to crappy software. When people asked me to pay for it, I was offended: “Pay for it? For your crappy UI, your constant crashes, your spelling mistakes, and your restrictive license? No thanks.”
In the world of Windows, there wasn’t much worth paying for. I paid only for Microsoft Office, Adobe Photoshop, and games. These were worth the price. Everything else in the Windows-verse was bloatware, nagware, spyware, or shareware. If you could get past the horrible user interfaces and the instability, things sometimes worked just enough for you to get the job done. And this was the way of things.
This is not so on the Mac. Yes, there is crappy software in the Mac ecosystem too, but the good software is far better than anything I ever used on Windows or Linux. Applications like TextMate, BBEdit, PathFinder, Delicious Library, Things, ScreenFlow, and even iWork surpass their closest Windows or Linux incarnations. But Mac software is rarely free, and this is no coincidence. Good software requires paying customers, and Mac owners are willing to pay for quality.
Addicted to Quality
It has been almost two years since I bought my first Mac. Like many of Apple’s customers, once I bought one Apple product, I eventually bought more. This is not just because of the quality, but also because Apple’s stuff is designed to work together. If you have one Apple product, adding a second one almost always extends the first in some way.
Paying a premium for Apple products hurts, but that pain fades quickly. On the other hand, buying cheaper, junky products often results in a lackluster, underutilized device and a dose of buyer’s remorse. Years before buying a Mac or an iPod, I purchased a SanDisk Sansa MP3 player. Feature for feature, it was “better” than the iPod. But the Sansa’s software was awful and it never became a part of my daily life. Syncing it felt like more of a chore than an upgrade and, eventually, it ended up packed away in my graveyard of obsolete gadgets. None of my Apple devices ever ended their life in that graveyard.
At no point during those intervening years did I decide to become an Apple fanboy. I was just so happy with each purchase that I became spoiled. Spoiled by quality. Spoiled by good design. Spoiled by simplicity and attention to detail. Eventually, I began demanding these qualities from everything I bought, not just computers and electronic devices. For me, the days of “good enough” have come to an end. Apple has made me a more demanding consumer.
I think this is a good trend, especially for the electronics industry, where companies thoughtlessly cram “features” into their products. These features are often unimpressive, half-baked shams, or even outright lies. Their purpose is not to deliver value to the customer, but to deliver the illusion of value, setting their widget apart from their competitor’s nearly identical widget.
In the tech media, Apple products are often compared to these generic feature-stuffed products and found wanting. Some are convinced only by specs and benchmarks. Bigger numbers are always better. These consumers aren’t interested in any form of value other than those that can be directly quantified. Intangible features like the user experience are simply not factored into their equation, so they believe Apple products are less valuable and Apple’s customers are simply fanboys with no rational logic behind their purchases other than the desire to flaunt the Apple logo.
Take the Mac for example. Can you buy a computer for less than a Mac? Of course. Can it do everything a Mac can do? Absolutely. The discrepancy here isn’t function. You can buy any number of cheap computers that do virtually everything expected of a Mac. The discrepancy is with user satisfaction. Mac users are a specific type of consumer and they hate buyer’s remorse. They are willing to pay extra for the comfort associated with knowing that they will be satisfied with their purchase.
Where to Go From Here
As of this writing, Apple has surpassed Microsoft to become the United States’ second-largest company in terms of market capitalization, just behind Exxon Mobil. What does this mean for Apple and its fans? Hopefully, not much. I hope Apple continues to do what it does best: deliver innovative quality products to demanding consumers.
But there is another alternative. With Apple’s explosive growth has come FTC investigations, DoJ investigations, patent spats with Nokia and HTC, fierce competition from Google, and the swelling legions of Apple haters. Let’s face it: Apple has become “the man.” Apple is a bigger target now than it has ever been before. Even relatively obvious and innocuous business decisions, like supporting H.264/AVC over Ogg or VP8, are scrutinized and criticized ad infinitum by the scores of churlish tech bloggers who want to see Apple humiliated and defeated.
I am happy with Apple’s success. The market is ripe for the type of products Apple produces and Apple’s sales figures reflect this. But, with its enormous growth, Apple is asserting greater control over its products in unprecedented ways. They are circling the wagons. I am tempted to say that Apple is getting cocky, but Apple has always been cocky. That’s part of what makes Apple so unique. They do not shy away from, and rarely even respond to, criticism. They maintain an intense focus on their product line and pretend as if the world around them weren’t churning with a hurricane of anti-Apple animus.
So I am in the awkward position of both loving a company and being afraid of what it might become. Is it bad to be dependent upon a single company for all of my computing needs? Maybe. But considering the importance of computing, isn’t it just as bad to be dependent upon a handful of companies? Logic tells me that putting all of my eggs in one basket is not a good thing, but logic also tells me that I should use what works rather than distribute my eggs into baskets I can’t carry just to prevent a potential ideological threat. Besides, I currently use Google’s services quite heavily, including Gmail, Google Docs, Google Calendar, and Google Reader. Google has more of my data than Apple does, so why should I fear Apple?
Don’t get me wrong. I am concerned about some of Apple’s recent actions, primarily those that directly impact users. For example, remember Apple’s decision to block all device manufacturers from syncing with iTunes? While iTunes is indeed Apple’s property, the content managed within iTunes is owned or licensed by the consumer, and this is just a form of vendor lock-in. However, there is a good business case to be made against allowing other companies to pair their devices with iTunes. Why should Apple invest in a piece of software only to have their competitors use it against them? Since Palm competes directly with the iPhone, allowing Palm to do iTunes syncing would basically free Palm from the responsibility of building and maintaining software for its own platform. Like most of these discussions, the issue is not as dualistic as the throngs of choleric tech bloggers would have you believe.
All of the indignant, self-righteous fearmongering gushing forth from the blogosphere regarding Apple is so overblown that I find it difficult to take seriously. Ultimately, most of the complaints we hear are only tangentially relevant to the real argument these people are selling. They really want us to believe that Apple is evil, or, more specifically, that Apple is more evil than Google or Microsoft or Nokia. But we must remember that, unlike Microsoft and Google and, to a lesser extent, Nokia, Apple has never had, nor does it have now, a monopoly on anything. The market is flush with competitors nipping at Apple’s heels in every market, yet Apple is always there, at the top, siphoning off the most valuable consumers and leaving behind the dregs for everyone else to scrap over. Apple can’t do this because it has a monopoly, or because it absorbs its competitors, or because it manipulates the market, or even because its customers are so locked-in that they can’t flee. Apple can do this because people spend enormous amounts of money on Apple’s products.
Most of the bitching comes from people who, for whatever reason, think Apple’s lead in the marketplace obligates them to allow their competitors a handicap. Frankly, I think that is ludicrous. I want to see competition, and companies aren’t pressed to compete when they are given an unfair advantage just for being behind. If nothing else, Apple’s rise to dominance should be a clear sign that consumers are willing to pay for quality. As Mike Davidson said regarding Apple at the helm, this is a good problem to have.
In the two years since I first purchased a Mac, my entire perspective on computing, and even my behavior as a consumer, has been permanently altered. When I bought my first Mac, I had no intention of ever buying anything else from Apple. I never intended to even like the damn thing. But, in time, I found out that Apple really does make awesome stuff. Their products aren’t perfect, as tradeoffs must always be made in engineering, but these sacrifices are almost always made with the user experience, not the marketing or accounting departments, in mind. If Apple makes a contentious decision, it is most often because they truly believe it improves the product.
After writing this, I have finally figured out why switcher stories are so enjoyable. Each story is an individual’s tale of doubt, chronic frustration, and annoyance, ending in a splendid orgasm of user satisfaction. Until another company can prove to me — not with prototypes or vaporware but with real, shipping products — that they too can abandon the old, race-to-the-bottom strategy and build quality products, then I will continue opening my wallet for Apple. It really is that simple.