There are a lot more Java libraries than Objective-C libraries. Especially at Google.
We write a lot of Java at Google, but I wouldn't call anyone a "Java programmer". We're just programmers. And like all programmers, if we can save time by writing a tool and letting the computer do some work, we will. Laziness, impatience, hubris, and all that.
My 5-engineer team at Google all write Java day-to-day. But we've all needed to write Python, especially scripts, and have it properly checked in and maintained. Not sure how you could get by most places in the company without being able to do that. Besides, skill in other languages is an asset; I've written C++ client library code to push for broader adoption of my project.
Actually this has nothing to do with Java developers, much as people here like to look down on them.
This has to do with calling a single command and blasting out an app for iOS once you have already written an app for Android.
If you look carefully, Google could have done it the other way around, by implementing iOS shims on top of Android.
This would have been easier, both because Google controls Android and because low-level Objective-C code is easier to port to high-level Java than the other way around.
And it would have given an incentive to those who have already written iOS apps to port them to Android.
So why not do it that way? Because then everybody would develop for iOS first and Android second, and few people would use the Android-specific features. By going from Android to iOS, Android stays the primary system, but Google still lowers the cost of developing Android apps by having the iOS market artificially subsidise it.
Speaking as someone who has developed on both platforms, that's definitely my experience. And that includes starting with an Android app and porting it to iOS - half of the stuff I had to add to Android to make it usable I discovered was already built into iOS. Not to mention the poor choice of defaults, coping with device fragmentation, and the difficulty of making Android apps look attractive vis-à-vis iOS.
I haven't built ICS apps, though, so perhaps it's getting better.
It would be so much better to have the Objective-C runtime on Android. I haven't seen any Java-based Android apps that run as smoothly as Google Chrome on Android, which I strongly suspect was built with C/C++ rather than Java.
Learning ObjC is a one-time cost I've already paid. The problem is being continually confronted with all the compromises it made against safety and readability just to rush out semi-usable implementations thirty years ago. It's no longer necessary to regress to C and risk crashes and random misbehavior from unchecked pointer arithmetic and integer overflow and uninitialized memory, and debugging that (while sometimes fascinating) has long been a poor use of hackers' time.
How so? Looks like an opportunity to share some common code between the two platforms to me. Write your view layers twice. If you have a large amount of business logic, write it once in Java, and then translating that code to Objective-C can be part of the build or checkout process for iOS.
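As a rough sketch of what that shared layer might look like: the class below sticks to plain JDK types and has no Android or UIKit imports, which is the kind of code a Java-to-Objective-C translator like j2objc can handle unchanged. The class and method names here are my own illustration, not from any real project.

```java
// Hypothetical shared business-logic class. It depends only on java.lang,
// so the same source can compile for Android and be translated for iOS.
public class CartTotal {
    // Sum the prices and apply a flat tax rate.
    public static double total(double[] prices, double taxRate) {
        double sum = 0;
        for (double p : prices) {
            sum += p;
        }
        return sum * (1 + taxRate);
    }

    public static void main(String[] args) {
        System.out.println(total(new double[]{10.0, 20.0}, 0.1));
    }
}
```

The iOS build step would then run the translator over this file (for j2objc, something like `j2objc -d build CartTotal.java`, which emits a `CartTotal.h`/`.m` pair) while the Android build compiles it directly. The view layers on each side call into the same logic.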
I've recently been diving into mobile dev, and it seems strange to me how short the list of good options for sharing a common code base between the platforms is.
Actually, common code on 4 platforms. We share Java code between the server, the web (via GWT), Android (which already runs Java natively), and now iOS (via j2objc).
That's basically the reason for this. We have huge amounts of Java libraries, and we want to create and ship out features on 3 platforms simultaneously. Not having to write, say, an Operational Transform library 3 different times (JS, Java, and Objective-C) and keep them constantly in sync will go a long way.