For a couple of years now Apple has included a Caching Service within OS X Server. This service allows a company or school to locally cache certain OS X and iOS updates and apps to lower the load on their network when multiple users request the same update.
Originally the service only cached OS X installers, but year after year Apple has expanded it to include OS X and iOS updates, App Store apps and iBooks Store books. Combined with the excellent CacheWarmer, this service is a perfect add-on to any medium or large Apple deployment. I’ve even got it running at home, where I’m caching around 500GB of data.
OS X Server 5
Caching Server can accelerate the download of personalized data stored in iCloud, including photos and iCloud Drive. Enabling iCloud Acceleration reduces the amount of iCloud data that needs to be downloaded when users have multiple devices on the same network. – Apple Developer Portal
With the release of OS X Server 5.0 in the fall, Apple will expand this service once again to include users’ iCloud data as well. Users who use their iPads or Macs on a network that runs a Caching Server will gradually seed their iCloud Documents, Photos and iCloud Backups to that Caching Server. This allows for very fast device restores. Pretty sweet!
Capacity and management
But there’s an issue with the implementation. Imagine a network running a Caching Server for 100 users who all have an iPad. Even at the free 5GB cap, that’s a minimum of 500GB of iCloud data. But with the arrival of iCloud Photos I guess most users now pay for bigger storage tiers, so we’re potentially looking at around 100GB per user, or roughly 10TB of data.
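The capacity math above is easy to sanity-check. A quick sketch, using the 100-user count and the per-user storage tiers assumed in this paragraph:

```shell
users=100
free_tier_gb=5     # Apple's free iCloud tier
paid_tier_gb=100   # assumed average for users on a paid tier

# Minimum cache footprint if everyone stays on the free tier.
echo "$((users * free_tier_gb)) GB"     # prints "500 GB"

# Estimated footprint if most users sit on ~100GB paid tiers.
echo "$((users * paid_tier_gb)) GB"     # prints "10000 GB", i.e. ~10TB
```

And that’s before you account for OS updates and apps, which the server is already caching.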
Now consider a small company that probably caches its data on a Mac mini’s internal hard drive, and you can see where things will go terribly wrong.
Caching Server currently has a single setting to pick where you want to store your data, and a single slider that lets you pick anything between 0GB and 90% of that storage point’s capacity.
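For the command-line inclined, those two knobs are also exposed through serveradmin. A sketch, with the caveat that the key names below are the ones I know from the current caching service and the volume path is just an example, so verify both against your own server:

```shell
# Dump the full Caching Server configuration,
# including the data location and size limit.
sudo serveradmin settings caching

# Move the cache to another volume (example path; adjust to your setup).
sudo serveradmin settings caching:DataPath = "/Volumes/Cache/Library/Server/Caching/Data"

# Cap the cache size in bytes; 0 means "use as much space as available".
sudo serveradmin settings caching:CacheLimit = 2000000000000
```

That’s it: one path, one limit. There is no way to split the cache across volumes.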
At the office I currently cache updates and apps on a very fast external Thunderbolt drive, and we’ve got around 2TB of data cached. When OS X Server 5 is released, we would need at least 12TB of storage to cache everything. And we would have to migrate all our data, because we can’t just add a second, bigger hard drive to absorb this new influx of data.
A Solution: looking at NetBoot
There is an easy and existing solution for this problem: Just look at the other service that caches user data: NetInstall.
NetInstall has always allowed admins to select multiple hard drives to store images and/or user data. You can even select which drives contain only user data, and which store NetBoot images.
You can – for example – host your most important images on a fast internal drive, and the bulk of user-data on an external drive.
So what I would like for OS X Server 5.0 to do is something similar, but for Caching.
In this scenario you would select multiple storage devices to hold all your cached data: a big, slightly slower drive for cached iCloud data (which gets downloaded slowly to users’ devices after a restore anyway), and smaller, faster drives for OS updates and apps. This would make managing the server much easier.
You could even imagine a company starting with a small internal drive, and gradually expanding their Caching Server setup if necessary.
Although I love the idea of caching iCloud data on a local network, and can see how it would massively improve the restore experience for users, I find this lack of control over what’s stored where a dealbreaker. That’s why I’ve filed a bug report, hoping Apple will fix this before launch.
If you’re managing your own Caching Servers and agree with my statements above, please file a similar bug report and refer to mine: #21342840