Apple can’t do cloud because the cloud is Linux
April 11, 2013  

Apple’s been taking a lot of grief lately for their problems in the cloud space. For example, Tom Dale says “Google is getting better at design faster than Apple is getting better at web services”, and Michael Göbel says:

Core Data and iCloud sync are still a joke. I can’t count the number of developers and companies that all ran into the same trouble and finally gave up – meaning they dropped iCloud support completely after hundreds of thousands of users lost their data.

The reason that Apple is doing badly in web services is simple: all of the companies that are good at web services (Google, Facebook, Twitter, Amazon) use Linux, while Apple is stuck with their home-grown operating system, OS X Server.

This isn’t about technical superiority: Linux and OS X share a strong common heritage (Unix), and they are far more similar than they are different. It’s about community and ecosystems.

The Linux ecosystem is huge, diverse, and moving fast. The kernel has over 15 million lines of code and gets a new release every two to three months. There are dozens of major Linux distributions, each packaging the kernel with other software. Big companies like Google, Facebook, and Amazon use Linux heavily and contribute to it, as do universities, startups, and hobbyists. It runs on everything from smart watches to supercomputers, and typical server hardware is a cheap commodity.

Linux never succeeded on the desktop because, face it, the desktop sucks. Sure, there were always people who wanted to run open clones of Microsoft Word on Linux, but if your job requires you to use Microsoft Word, I pity you. Linux has always succeeded in the server market, though, and now, with Android, in the smartphone and tablet market. The significance of the desktop market is diminishing quickly.

Apple has been successful with their operating systems as well; they created the modern smartphone and tablet market, for example. But on the server, they’re screwed. Besides Apple, no one is using OS X on the server. Nobody is using WebObjects. Nobody outside of Apple is debugging or extending this stuff, and nobody is trying to port their projects to it. Even though both Linux and OS X are based on Unix, they are different enough that porting is painful.

Virtual machines are the cornerstone of the cloud, but you don’t see many OS X VMs because, for one thing, Apple’s license only permits OS X to run on Apple hardware. In contrast, you can have a Linux VM running on a cloud server for 5 dollars a month. You can buy a fraction of a Linux VM. You can spin up 1000 Linux VMs in an instant and shut them down a minute later. You can go to any of a thousand companies for this, and that number is growing.

Backend engineers know Linux, not OS X. Any Linux backend company looking to hire has a wide selection of experienced candidates to choose from; Apple has to hire people and train them from scratch.

This isn’t the Macintosh versus DOS. Apple doesn’t have the superior technical solution, so there’s no hope that the world will wake up and Apple will suddenly win. Apple is behind and there’s no chance they will catch up with their current strategy. My advice to them is to abandon OS X on the server in favor of Linux. The odds of that happening are, of course, nil.