Of Gardens and Sandboxes

4 Nov 2011

Yesterday, Matthew Bloch, MD of Bytemark Hosting[1], tweeted:

next stop: no installations on your own computer from outside the walled garden. A free desktop is much more important.

He was referring to Apple’s announcement that they’ll soon be requiring all apps sold via the Mac App Store to implement sandboxing. Broadly speaking, this strictly limits what an individual app can do once installed - for example, it can’t access files outside its own area without going through the standard open and save dialogs. Mac OS X’s younger sibling, iOS, has had even stricter sandboxing in place since it first allowed third-party applications. It also has another property, namely that the App Store is the only way to install applications under iOS. Matthew’s worry is that Apple’s adoption of one iOS policy (sandboxing) on the Mac suggests that they’re likely to adopt another, and forbid installation of non-App Store apps.
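To make the restrictions concrete: Mac OS X’s sandbox is driven by profiles written in a small Scheme-like language (the format consumed by the `sandbox-exec` tool). The sketch below is illustrative rather than a real App Store entitlements file - the specific paths are my own, and shipping apps declare entitlements rather than writing raw profiles - but it shows the deny-by-default shape of the policy:

```scheme
(version 1)
(deny default)                         ; nothing is allowed unless listed below
(allow file-read*
    (subpath "/usr/lib")               ; system libraries
    (subpath "/Users/rob/ExampleApp")) ; the app's own area (hypothetical path)
(allow file-write*
    (subpath "/Users/rob/ExampleApp"))
; no network, no access to other users' files, no system settings
```

Anything outside the allowed subpaths - including the user’s documents - is off limits, which is why sandboxed apps reach other files only via the system-mediated open and save dialogs.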

Personally, I think the two are separate issues. Sandboxing is a sensible policy in its own right. It’s a key component of the success of the iOS App Store - the fact that you don’t need to worry about the developer’s incompetence or malice before downloading that 69p impulse buy. Relatively few applications actually need to do things that sandboxing prevents, like alter system settings. For the rest, sandboxing allows the ease-of-use and convenience that iOS users have grown used to.

It’s not like sandboxing is even a new idea. Not only did Java applets (remember those?) give it a go in the late 90s, but Mac OS X’s own version has a lineage stretching back to chroot jails, introduced to Unix in 1979. The idea that every program a user runs must be able to do anything that user can is not an absolute, and there’s nothing inherently wrong with a user choosing to restrict the facilities available to specific programs. All modern systems do this to an extent, by restricting root privileges to programs explicitly granted them. Sandboxing could be thought of as simply a more fine-grained version of the same idea.

Restricting installation to only the official App Store, on the other hand, represents a significant break from the status quo. At present, if I want to install a random piece of software from some shady corner of the internet on my Mac, I am free to do so, and to live with the consequences. Channelling all installations through the App Store (and Apple’s somewhat mercurial review process) would take that option away. For many users, who just want to get work done and don’t want to worry about maintaining their computer, this might be a reasonable trade-off, but for others (such as myself) it is not. However, it’s important to remember that this isn’t an all-or-nothing distinction. For example, I’d want Galcon Fusion to be sandboxed, but still build Emacs myself. If I lost the option of installing arbitrary software, though, the range of things I could do on Mac OS as opposed to another system would be severely curtailed. I don’t think there’s much risk of this happening, for two reasons.

Firstly, the people that such a move would alienate - developers, designers, and those who see themselves as power users - are the sort of people who are willing to pay a premium for high-end hardware. Whilst not as numerous as general consumers, this group must still be a significant source of profit for Apple, and so they’d need a good reason to abandon them. Furthermore, many of these users are the very people who keep the App Store stocked with high-quality products. Driving them away would, in my opinion, be a significant own goal.

Secondly, and more importantly, why would they need to? Apple already sell a walled-garden OS for people who want it, and the devices that use it sell pretty much as fast as Apple can manufacture them. They don’t need to sneak a walled garden in via the back door, because they’ve already kicked down the front. My expectation is that Mac OS will be increasingly seen as a “Pro” platform, and iOS as the way for general consumers. This is arguably no bad thing, as it allows people to use computers for what they actually want to do without having to worry about servicing and maintaining them (I touched on this in a previous post). As John Gruber points out, the Mac/iOS division allows Apple to cater for those of us who want the flexibility offered by a general purpose system, whilst not burdening everyone with the attendant complexity. And if at some point they choose to stop catering for us, there are plenty of others who will.

Update: Matthew has responded on his blog. He makes some interesting points. Most importantly, he points out that Apple already charges developers $99 per year. Root access could be restricted to these developers (and anyone who paid the fee could join them), whilst not providing it for regular users. The latter would effectively be handing over both the hassle and the control of auditing the software on their machine to Apple, a bargain that many people would consider reasonable (and one that most people already make with, for example, their email).

The fallout for developers would be that they could no longer rely on end users being able to install software themselves, and therefore everything would have to go through the App Store, with Apple taking their 30% cut. For many small developers, this again is a good bargain - the simplicity of the App Store, plus the trust users have in it, are worth Apple’s cut (and the small but non-zero risk of running into problems with the review process). However, as Matthew points out, big developers such as Adobe and Microsoft are less likely to be happy, and could go as far as abandoning the platform.

That’s a key reason I don’t think that Apple are likely to go down this route. Over the last decade, Mac OS X has gone from having the scantiest selection of third-party software to having arguably the best. Part of this has come from the kinds of small commercial developers that are a natural fit to the App Store, but it also comes from big-name applications like Office and Creative Suite. Losing these applications would be a significant blow to the platform, and I’d be surprised if the additional App Store revenue would make up for it. Indeed, it’s not at all clear that there would be any additional revenue, given that the change would result in users leaving a Mac OS that no longer met their needs.

More importantly, the end result of this game makes no sense. Why would Apple turn Mac OS into another iOS, when they already have iOS? If people are choosing between a hassle-free/locked-down device (depending on your perspective) and a more flexible/fragile one, why would Apple not want a product on both sides? The only reason would be that one side doesn’t make enough money to justify pursuing. I don’t think that’s the case, but if and when it is, I wouldn’t expect them to turn Mac OS into iOS. They’ll just stop selling it.

In conclusion, I don’t expect Apple to remove the ability to install arbitrary applications from Mac OS in the foreseeable future, not through any moral compunction or fear, but simply because I don’t think it would increase their profits in the long term. I could be wrong about this; Apple clearly know their business better than I do. We’ll see.

[1] Who are, as I believe I’ve mentioned before, highly recommended if you’re in the market for hosting.
