My personal journey to Apple

By | 2012/02/11

Apple Family

A few days ago a University buddy asked me if I’d mind blogging about the factors that led to me becoming such a strong Apple enthusiast. I’d like, from the outset, to clarify that I’m not a fanboy. I’m a technologist; indeed, those who lump me into the “fanboy” category frequently turn out to be fanboys of some other product or vendor themselves.

I’ve been using computers for a very long time. My first computer was a Vic-20, and even back then I started programming. I was intent on writing a database to keep track of names and facial features of people I knew: I was always terrible with names, and since my childhood dream was to become a mad scientist, I clearly needed a way of keeping track of my minions and peons.

The Vic-20 was replaced by a Commodore 64, which in turn was replaced by my first PC. Each computer I’ve owned became an exercise in power and long-term viability: the Vic-20 got a 19KB RAM expansion unit; the C64 got two floppy drives and a 512KB RAM expansion unit. The first PC came with a whopping 40MB hard drive, and I skipped over 5 1/4 inch floppies entirely, getting it with just a 3 1/2 inch drive instead.

At high school I had a fairly strong exposure to Apple – the Apple IIe was the primary educational computer then, and we had about 12 or 16 of them, with a IIc added to them over time. Somewhere along the line a IIgs arrived, and I certainly fell in love with that beastie. Towards the end of high school, we had a bunch of Macs as well; checking through apple-history I’m fairly certain they were SEs, without the hard drive.

By the time I got to University, I was already programming heavily in Pascal. I’d started with G Pascal while still on the Commodore 64, and already had Turbo Pascal before I got to Uni on my PC.

I was, by all definitions of the word, a geek. My first PC was just a 286, so for the first year of Uni I was limited to Windows. Sometime in second year, though, I managed to buy a friend’s 386; by that stage I’d discovered Unix and fallen in love with it. By then I was living in a granny flat with more of a studio arrangement, and I distinctly remember regularly going to sleep to the sound of a hard drive whirring and lights flashing as my Linux workstation took 6+ hours to recompile the kernel.

Other than a bit of programming work in it, that was the point where I pretty much ditched Windows. By the time I graduated and bought my first Pentium class PC, I had a Windows/Linux dual boot arrangement that saw me boot into Windows 95 about once every 6 months.

I had become a Linux fanboy.

The ironic thing of course was that I was rapidly heading towards meeting Darren, circa October 1996, and he used what by that stage was a completely foreign operating system to me – a toy operating system: Mac OS.

By the time I met Darren, Apple had either completed its acquisition of NeXT – and with it, Steve Jobs – or was well and truly along the way to doing so. I remember our first morning together, having coffee on the balcony, me expressing incredulity that someone otherwise so intelligent and technically savvy could be so passionate about and defensive of Apple. Mac OS, I practically sneered, was a toy operating system that didn’t even have command line functionality, and was therefore pointless for any sort of “power use” scenario.

Somehow we survived that gulf, probably from a mutual decision to not press each others’ buttons on it too much. I continued down the PC running Linux path for the first several years of our relationship, and he kept on going with the Mac.

By the time Mac OS X was introduced, I’d used Darren’s various computers enough to have a passing familiarity with the interface – though my opinion of it remained largely contemptuous.

Things started to change with OS X – not in terms of me immediately jumping across, but definitely getting more interested. A full Unix back end to the operating system suddenly made it, in my not so humble opinion at the time, capable of being a ‘real’ operating system.

I didn’t convert until sometime after the introduction of Mac OS X 10.4 – Tiger – and ultimately my conversion came down to two reasons:

  • Interoperability
  • Best of Both Worlds

Once I jumped across, those two key reasons were joined by another:

  • Efficiency

Interoperability was the initial driving force. By the time I transitioned, USB was rapidly becoming the de facto standard for device connectivity (thanks, of course, to Apple), and to be perfectly blunt, Linux sucked at handling USB. And for large values of “suck”. At that point I was still quite a strong Palm user, and naturally one of the things I liked was being able to sync my Palm with my desktop. I’d previously been able to do that without issue when Palms still interfaced via serial connectors, but the latest Palms with USB connectivity were a nightmare under Linux. It was a perpetual game of chasing my own tail – I’d plug the Palm into USB port 1, and the sync software would seemingly detect port 1 was busy, so it would try to sync against port 2. Swap ports, and the sync software would do the reverse. There was, without a doubt, very special magic required to sync a USB Palm Pilot to a Linux desktop in those early days – and despite four years of University and a desire to work from the command line, it escaped me.

Printers, too, were a nightmare. Unix printing in general was, in fact, until CUPS came along – but while CUPS gave some relief from the nightmare of Unix printing, it was about as friendly to interact with as a scratch to the scrotum from a rabies-infected monkey. And I wasn’t alone in that view – Eric Raymond, open source placard bearer, found it a nightmare to work with himself, and wrote a famous attack on it. Indeed, Eric’s final six questions to Linux hackers and developers are, to me, the core of what was wrong with Linux when I decided to leave it:

  1. What does my software look like to a non-technical user who has never seen it before?
  2. Is there any screen in my GUI that is a dead end, without giving guidance further into the system?
  3. The requirement that end-users read documentation is a sign of UI design failure. Is my UI design a failure?
  4. For technical tasks that do require documentation, do they fail to mention critical defaults?
  5. Does my project welcome and respond to usability feedback from non-expert users?
  6. And, most importantly of all…do I allow my users the precious luxury of ignorance?

Those questions reflect a problem that remains with Linux. It is still, so often, written from a hacker’s point of view. Oh, I know that these days the majority of contributions to the source and the surrounding packages come from some form of commercial entity, but even the people involved in those entities mostly retain that core-geek view of slotting in every possible option except an easy-to-use interface.

Interoperability didn’t just come in terms of hardware, though; it also applied at the software level. Multimedia was a big bugbear for me by this stage. Every time I changed distributions, or upgraded the one I was using, I’d have to go through the tedious process of recompiling my kernel to support my sound card. I’d have to work through all the magical, mystical options to get DVD playback working, and I’d have to hope like hell that all the planets were in alignment if I wanted to burn a CD. If I wanted to play a multimedia file, I could usually kiss that goodbye – while there were some player options available, the gap between when a new media format was released and when it was supported was often huge, and highly dependent on whether or not the format could be successfully reverse engineered.

I should note – I was in an engineering role at a highly technical company, where all technical staff ran Linux. I was being exposed to it for up to 20 hours a day, and it was still getting the better of me.

By this stage I’d been commuting for a few years, and my way of dealing with the commute, for the most part, was music. I started with commercial CDs, but carrying a pack of every CD I might want to listen to became cumbersome, so I’d transitioned to keeping a generous amount of my music in MP3 format on my laptop. However, laptop batteries back in those days weren’t really all that great, and nor was laptop CPU performance, so playing music the whole way through a 1.5 hour commute (if not longer – this was NSW CityRail, after all!) could be a real pain. So, having started with some cheap and nasty Creative device, by the time the iPod came out I was convinced I wanted and needed it. The only problem, of course, was that it connected via Firewire; the easiest way to transfer music onto it was to periodically plug it into Darren’s Mac. While there were some Linux-based iPod management options, these were mostly developed around the notion of hacking the iPod, and after using it for only a few hours there was no chance in hell I was going to replace its operating system with something else. Of course, Linux support for Firewire was also very poor, which didn’t help.

In 2005, though, a new iPod came out that supported the dock and, by virtue of that, a USB connector rather than a Firewire one. “Great”, I thought, “finally an iPod that I’ll be able to seamlessly sync under Linux”.

Linux handled the USB connected iPod about as well as it handled the Palm, and this was the straw that broke the camel’s back.

I was, to be perfectly frank, at a point in my life where I was tired of needing to be thinking like a full computer scientist and programmer every time I wanted to use my computer, or needing to “hack” any consumer device I wanted to attach to it to make it “better”. So one bright Saturday afternoon with an iPod I still couldn’t use against my own computer, I slammed my head against the wall for the last time in frustration, roared “Enough is enough!” and decided to switch.

There was only one direction to go, of course. Windows was a steaming pile of virus-riddled shit, with XP at the height of its popularity. It was buggy as all hell, despite all of Microsoft’s promises about its stability, and its command line was straight out of the stone age. On the other hand, Mac OS X offered the best of both worlds – a full GUI with excellent compatibility and interoperability, and a powerful Unix back-end that would let me drop to the shell and work as quickly as I wanted.

So I bought a 17″ eMac with a 1.25GHz PPC G4 CPU, 512MB of RAM and an 80GB hard drive. It was quickly upgraded to 1GB of RAM (the maximum for the eMac), but within a few hours of bringing it home I was already somewhat in awe, and changing my opinion of the Mac. Part of that, of course, came from the “new toy syndrome” that plagues most geeks – give us anything new and different and we’ll usually be all over it in a flash. Thanks to the similarities in mail file formats, though, I was pretty impressed at being able to simply copy my mail folders across from my Linux machine to the eMac and import them: 4+GB of email accumulated over 8 or so years imported with minimal fuss, leaving me feeling pretty satisfied with the switch almost right from the start.

To be sure, there were some things that bugged me. The last time I’d used any Apple computer with any regularity was during high school, so there was the standard interface learning curve; and having been used to assigning so many keyboard shortcuts and using multi-button mice, the invocation of contextual menus and the greater reliance on the menu bar took a while to get used to. (These days, the ongoing enhancements made by Apple, along with my stronger knowledge of keyboard shortcuts, mean that I can readily launch an app without going near the dock, do what I need to get done, and quit, all without having gone near the menu.)

One day, a couple of months into my switch, whilst still using Linux daily on my work laptop, I had an epiphany. Because of the interoperability, because of the best of both worlds, and because of the human interface design principles of the operating system, I was having a very different user experience on the Mac than I was having on Linux:

  • I was efficient on Mac OS X, and
  • I was full of creative inertia on Linux.

Whenever I sat in front of my Mac desktop, I got things done. The operating system really was like a butler – there to help you when you needed it, but otherwise staying out of your way as much as possible. (That comparison, by the way, comes from a Network Computing article from 2007 comparing Mac OS X to Windows Vista. To be fair, any comparison between OS X and Linux at the time would have been even more one-sided.)

Every time I tried to use Linux for anything productive (outside of genuine work tasks, at least), I’d find myself endlessly tweaking and optimising. I was busy without being productive. I’d start things and not finish them. I’d get easily distracted, fall into old habits, and mistake busyness for productivity. We often treat the two terms as meaning the same thing, but they can be so far apart that they don’t even overlap, were you to draw them as a Venn diagram.

(It took me a while after that, but I’ve subsequently come to terms with the fact – and this is why I now call myself a technologist – that just because I’m more efficient on Mac OS X doesn’t mean the same will apply to everyone. Setting aside Linux as a desktop environment and comparing directly to Windows, it’s patently clear that Windows and Mac OS X have radically different user experiences and workflows, yet some people are very productive on Windows too.)

Over time it’s become apparent to me that Apple grasped a couple of fundamental aspects to computing (regardless of whether it’s desktop, portable or mobile) that all the other computer companies failed to get. In fact, that most of them continue to fail to get:

  1. The real core target market for computers and computing devices is consumers – not workers, and most certainly not technical users.
  2. Throughout the entire history of humankind, as technology has become more complex, our interaction with it has become simpler.

This is something I’ve written about before – and rather than rehash the argument, I’ll link to it here.

I still have a keen interest in computing. I may not approach it from a rigorous math background like so many Universities focus on, but I am a computer scientist, and the theoretical nature of this industry fascinates and excites me like few other things on the planet do. The net has become like a sixth sense to me; I feel like I’m missing a limb when I’m without it, and despite what some people think, this is the future: highly connected people able to access information in the blink of an eye and communicate with not only the people around them, but people anywhere else on the planet. It will be, without a doubt, a singularity in and of itself, regardless of whether a machine-sentience singularity is reached or not.

But regardless of where my interests lie, what I’m most certain about is that computers are no longer an end in themselves for me; they’re a means to an end. That end may be some productive task, it may be communicating, or it may be, as I’ve recently rediscovered, playing a game. Just as I don’t go out into the kitchen to idly fiddle around with the toaster, I don’t sit down at a computer to idly fiddle around with it. I sit down to use it. And the user in me – the consumer – wants something that just works.

For me, that’s Apple.