Kind of reminds me of the old slogan from the anti-nuclear movement in Germany: "Atomkraft, Nein Danke! Strom kommt aus der Steckdose". (We don't need no stinkin' nuclear power, electricity comes out of the wall socket)
Sounds like a hoax slogan (possibly from anti-anti-nuclearists). The one I remember was "Atomkraft, Nein Danke!" by itself.
I can understand that there is probably a problem related to ARC and garbage collection. But missing this set of features means that there are a lot of things that simply cannot be implemented in Swift and thus have to remain in Objective-C (Core Data, other ORMs, IoC frameworks, etc.).
That means the iOS/OSX-platform will continue to be split in two programming languages and object-models for the foreseeable future.
iOS/OSX has been split across multiple languages for a long time anyway. The C/C++-based framework subsystem has produced many others.
I don't think so, since Objective-C also uses ARC and is very dynamic. I'd say the type system/runtime of Swift might have more to do with it.
The Objective-C compiler gets significantly more strict with ARC turned on; for example, sending unknown messages is an error instead of a warning. Various runtime functions that used to be quite safe also become dangerous: it used to be OK to send performSelector:-style messages to methods that don't return an object (void) and simply ignore the result. That now becomes a crash, as ARC tries to retain the return value. So in order to be compatible with ARC, the "cheap and cheerful" reflection that ObjC has isn't really sufficient.
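A minimal Objective-C sketch of the hazard described above (the class and selector names are illustrative, not from the original comment):

```objc
#import <Foundation/Foundation.h>

@interface Worker : NSObject
- (void)doWork; // returns void, not an object
@end

@implementation Worker
- (void)doWork { NSLog(@"working"); }
@end

void invoke(Worker *w, SEL sel) {
    // Pre-ARC, this was harmless: any garbage "return value" of a
    // void method was simply ignored. Under ARC the compiler warns
    // "performSelector may cause a leak because its selector is
    // unknown" -- it cannot tell whether the method returns an
    // object, and retaining the junk left in the return register of
    // a void method can crash at runtime.
    [w performSelector:sel];
}
```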
ARC is predicated on the compiler having essentially a statically checkable call-chain.
They could know what they are doing, or they could be blathering idiots. Or, possibly, their use cases might be different than another person's.
Whatever the case, by just saying that they "won't miss feature X", they still tell us something about themselves but nothing about the feature.
To do the latter, they'd have to up their argumentation.
> I am better Swift programmer after a year and a half in Swift than I was in Objective-C after 26 years.
I like and use Swift, but I haven't reached a point where a lightbulb went off and I proclaimed, "Hey, I needed this".
See also: The Siren Call of KVO and Cocoa Bindings http://blog.metaobject.com/2014/03/the-siren-call-of-kvo-and...
I'd also want more strongly typed keys and fewer strings everywhere. I do use NSStringFromSelector as much as I can, but strings are unavoidable in IB.
Also the handling of one-to-many dependencies could be made a lot more intuitive as well.
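For reference, the NSStringFromSelector trick mentioned above looks something like this (a sketch; `doc` and the `title` property are just stand-in examples):

```objc
#import <Foundation/Foundation.h>

// Instead of a raw string key path:
//   [doc addObserver:self forKeyPath:@"title" ...];
// derive the key from a selector, so a renamed property becomes a
// compile-time error instead of a silently broken observation:
[doc addObserver:self
      forKeyPath:NSStringFromSelector(@selector(title))
         options:NSKeyValueObservingOptionNew
         context:NULL];
```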
You don't feel more productive in Swift? Like, a lot more?
EDIT: Just the compile times alone drive me batshit. That doesn't mean that there aren't nice aspects, certainly some of the syntax has been simplified by unifying two disparate syntaxes...but they could have done so much better, even there.
"Apple's shared frameworks are very useful for sharing executable code and its associated resources among multiple applications, but were historically not designed to be created by authors of consumer applications. I discourage developers from creating frameworks as a method of sharing code, because they encourage code bloat, increase launch times, complicate the developer's fix/compile/link/debug cycle, and require extra effort in setting up correct and useful developer and deployment builds."
and the last one:
"Creating shared frameworks is a lot of hassle for third-party developers of consumer software, introduces instability into the development process, and encourages slower and larger applications. Code sharing is better accomplished through creating new directories for shared code in subversion and judiciously including only the files and resources needed by any application in its Xcode project."
(Note: framework here refers to a specific way of organizing compiled code in OS X for sharing: .framework bundles)
Which, by the way, is also how Xcode is set up. Have you looked at how large the actual binary is? On my system, it's 35K.
-rwxr-xr-x 1 root wheel 35K May 1 22:12 Xcode*
Holy compression, Batman! No, actually, they just moved all of the actual code out into frameworks. Just like the clang and llvm teams put all their code into libraries, which gives us tools like the static analyser and Xcode-integrated syntax tools.
So even if I don't have plans to share the code yet, I still just put it in a framework target in the same project as the app itself, which takes around 30 seconds and requires no extra effort afterwards.
And the idea that only Apple can create software that is worthy of reuse...well fanboys be fanboys, but it actually isn't true.
This isn't true: he said that people shouldn't use a specific way of sharing code in OS X, not that "frameworks are bullshit". He even said that "Apple's shared frameworks are very useful for sharing executable code and its associated resources among multiple applications". He also praises Cocoa, the framework, and many other frameworks created by Apple.
So, yes, you did misrepresent Wil's opinion, and you should take your words back.
"Frameworks are Teh Suck, Err".
Let's look at the urban dictionary:
"The absolute worst thing ever, completely beyond bad, lower than horrible, and more crappy than explosive diarrhoea"
So when I wrote "bullshit", my only crime was that I was being euphemistic. Guilty as charged. And as I wrote, he makes this farcical distinction between Apple and the rest of the world.
So, no, I did not misrepresent Wil's opinion, apart from softening it up. So you should take your words back.
Because he went on to give an extended explanation of what that meant, besides the funny/link-baity title -- which is exactly what the parent wrote.
Yes, but he writes about, and presumably talks about, slightly more ambitious apps.
We detached this subthread from https://news.ycombinator.com/item?id=11789509 and marked it off-topic.
None of those things is necessarily a personal attack by itself but they're personal, uncivil, and especially bad in combination.
"Note that Objective-C's metasystem...is a bad design."
You have a funny definition of "fanboy", but suit yourself. And if you've read my blog and think I write about type theory, then you are utterly confused.
That article just documents that the safety benefits of static typing are, empirically, rather minuscule, especially compared to the claims for the benefits, which are vastly overblown.
That said: I still like to be able to statically type my programs. I just don't expect it to yield a significant safety benefit (documentation benefit is more important).
Here is another:
Right, which is why I said that you fundamentally misunderstand type theory.
This article is laughable. There are like a billion confounding factors.
Let me explain it to you: if I claim that I have a new car that goes 200mph, and you measure the speed and it only does 20mph max, then it really isn't relevant whether you understand the theory behind the engine or not; it's simply not as fast as claimed.
Or do you mean that anyone who doesn't accept claims about benefits at face value "doesn't understand". That sounds more like a religion than anything having to do with science and engineering. Which is, sadly, my experience with this particular cult.
Of the article that you find "laughable": show me the research that actually validates the claim of significant safety benefits and we can talk.
> Of the article that you find "laughable": show me the research that actually validates the claim of significant safety benefits and we can talk.
I have yet to see one good research paper in this field that was worthwhile. This sort of thing is extremely hard to measure across different people. I speak from my experience and from what others have told me.
You still don't seem to understand that this is completely irrelevant to the point I am making. You don't have to be able to build an engine with my magic engine technology in order to measure the speed of the car and see whether it goes as fast as I claim.
However, since you asked so nicely:
- Our algorithms class at university was taught with statically typed FPs. Mostly Hope, some Miranda IIRC. (We were the last generation of students to be spared the institute's own Opal). Haskell didn't exist yet.
- I also took the advanced FP courses.
- And am a great fan of FP. Backus's FP, to disambiguate
- It became quickly clear that FPs were no panacea, they just had different problems than other languages
- We also quickly surmised that this whole FP thing was a religious cult and that you were required to take all the claims that were made on faith
- Also used Pascal and Modula
- Did a major system in Java, probably one of my best pieces of work to that date
- Also remember that Objective-C acquired static typing during my time with it (before that it was all "id"). I was hugely confident (kind of like you now) beforehand that this would be a major boost to my programs' correctness and my productivity, and I was very surprised when that turned out not to be the case
But again, all this is largely irrelevant to the point I am making.
> I speak from my experience and from what others have told me.
Really?! Not only do you ignore the evidence there is, you also, of course, have absolutely none yourself. And with that nothing, you make claims that anyone who disagrees with your personal opinion (backed by anecdote) is a complete idiot.
Well, at least I don't have to revise my 1989 opinion (based on the evidence at the time) that this is a cult. Boy is it ever a cult.