
The author wonders why iCloud sharing is used by so few developers. I will point out, or remind people, that the problems go beyond it being an API whose use implies "I will never be able to build an Android app that interoperates with this data" (already pretty damning, since most developers don't want to close that door permanently). The first few releases of iCloud were so bad that documents, and even entire clients, would end up in permanently wedged states that developers could not fix for their users. It got bad enough that at WWDC one year, during the "developer state of the union" talk, I remember the Apple iCloud person on stage apologizing for iCloud being so broken and begging the audience for another chance.

While true, I don't think this explains the disparity between regular CloudKit use and CloudKit Sharing use. For example, this snippet from Apple's CloudKit paper[1] particularly struck me: "We identified the top apps using CloudKit, based on their number of active users in the past month, and examined their use of private and public databases. We found that 20% and 49% of the apps use only the public or the private database, respectively, and 31% of apps use both databases (20 apps use the shared database)."

So few apps in their sample used CloudKit Sharing that they didn't even bother with a percentage, just "20 apps". That's quite unusual for an Apple framework.
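For context on what "using the shared database" actually entails, here is a minimal sketch of adopting CloudKit Sharing. It is an illustration only, not from the paper; the record type, zone name, and field values are hypothetical. The key points it shows: shared records must live in a custom zone of the owner's private database, the CKShare is saved in the same operation as its root record, and participants then see the data through their shared database rather than their own private one.

```swift
import CloudKit

let container = CKContainer.default()
let privateDB = container.privateCloudDatabase

// Shared records cannot live in the default zone; a custom zone is required.
// "Notes" and the "Note" record type are hypothetical names for this sketch.
let zoneID = CKRecordZone.ID(zoneName: "Notes", ownerName: CKCurrentUserDefaultName)
let note = CKRecord(recordType: "Note",
                    recordID: CKRecord.ID(recordName: "note-1", zoneID: zoneID))
note["title"] = "Shared note" as CKRecordValue

// A CKShare is itself a record, created against the root record and saved
// alongside it in one atomic operation.
let share = CKShare(rootRecord: note)
share[CKShare.SystemFieldKey.title] = "Shared note" as CKRecordValue

let op = CKModifyRecordsOperation(recordsToSave: [note, share],
                                  recordIDsToDelete: nil)
op.modifyRecordsCompletionBlock = { _, _, error in
    // On success, the owner presents the share (e.g. via UICloudSharingController
    // or share.url) to invite participants; accepted shares then appear in the
    // participants' *shared* CloudKit database, not their private one.
}
privateDB.add(op)
```

Even this happy path hints at why adoption is low: the app now has to mirror two databases (private and shared), track zone-level permissions, and handle share acceptance flows, all on top of ordinary sync.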

[1]: http://www.vldb.org/pvldb/vol11/p540-shraer.pdf
