His work has been distinguished by the melding of language safety, reliability, and clarity: not merely having sophisticated constructs that help guarantee correctness, but also making code simple, beautiful, and easy to read. Ultimately, writing safe code depends on the programmer's ability to comprehend it, so creating a programming environment that succeeds on all fronts is a foundational achievement.
A notable example: LLVM enabled ARC, a beautifully simple approach to memory management that removed much (though not all) of the need for the developer to spell out memory management in code, while providing high efficiency and, perhaps even more importantly, predictable performance (no garbage-collection pauses). These are all essential for safety-critical realtime software.
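To make the determinism point concrete, here's a minimal Swift sketch (the class name is just illustrative):

    // Under ARC the release point is visible in the source: the object
    // dies when its last strong reference does, never at a collector's whim.
    final class SensorBuffer {
        deinit { print("buffer released") }
    }

    do {
        let buffer = SensorBuffer()
        print("using \(buffer)")
        // ARC releases buffer at a statically known point (after its last
        // use, no later than the end of this scope). No GC pause involved.
    }
    print("after scope")  // "buffer released" has already printed by now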
Reference counting is a GC algorithm that has been known since the early days of Lisp GC research and was used in languages like Mesa/Cedar in the late 70s.
I can think of lots of other examples, like how VB, Delphi and C++ Builder interfaced with COM in the mid-90's.
Yes, the PR around ARC made automatic memory management easier to accept for those without a background in compiler research, especially if they weren't aware of the stability issues that came with trying to implement a tracing GC on top of Objective-C semantics.
That said, it goes without saying that Lattner is really brilliant, and his work on LLVM as a compiler-building toolkit, as well as on Clang in improving the status quo of C static-analysis tooling and compiler error messages, is really notable.
In regards to ARC, his contribution was maybe also in terms of PR, as there is now a whole generation of developers who think reference counting isn't garbage collection.
One curious thing is how similar Swift is to modern Pascal, i.e. Delphi. It has many other influences, and it is certainly not Pascal with another syntax, but reading the Apple guide when Swift was released gave frequent moments of déjà vu.
On the subject of COM, yes. If you think COM means ATL, you should try it in a language designed to work with interfaces, with reference counting built in: Delphi has COM interfaces as first-class language primitives, and a heap of classes and other code making COM quite straightforward. Much of this even spills over into C++Builder, though since it's a different language it's not as clean as in Delphi. Still miles past ATL, though.
Re ARC: I think what makes it appealing is that it's conceptually simple, completely deterministic, and can be traced by reading code rather than by understanding an environment's implementation. Delphi does ARC now too, and if you've ever wanted ARC in C++, C++Builder optionally supports it for some classes. ARC is not yet on Windows for either language; we're talking just iOS, Android, and soon Linux here.
(Disclosure: I recently started working at Embarcadero on C++Builder. This is a personal comment only. But liking the languages Embarcadero makes was one main reason I joined.)
Lattner is a well-known expert on compilers. Having used Swift since its inception, I would call into question the reliability of the Swift compiler. In its current state (3.0.2) it's absolutely terrible and does not back up the sentiment: "But fast-moving Silicon Valley needs a fundamental shift in quality standards when it comes to safety-critical software, and if you look closely, Lattner has been leading this charge at the language and compiler level".
His online presence always has that Apple Arrogance™ to it. That's coming from someone who was born and raised an Apple fan.
So, yes, I can see why one would place Lattner at the forefront of making more reliable compilers and pushing shifts in quality standards in fast-moving Silicon Valley. It is an awesome achievement to create a new language that improves both readability and safety, and even more awesome to get it mainstreamed so quickly.
There are a few people I would like to trade places with. Lattner is one of them; Musk is another. They both fulfill different parts of my long-held dreams, so I consider them both quite awesome. It's cool that they'll be working together too, I guess.
Having used compilers for a few new languages (Rust, Go, Dart, Kotlin, Swift), Swift is the only one I've had any issues with, and Swift seems to be the only one to have adopted the "move fast and break things" philosophy of Silicon Valley. I dunno, I just don't see the argument.
LLVM is one of the most influential pieces of software of the past decade. Hero worship isn't good but credit where credit is due.
Additionally, I'm sure we can all agree there is no substitute for maturation through time and usage in the field, which frankly is an argument for more popular languages over obscure ones. None of the ones you mentioned are ready for safety-critical system development (including Swift 3), but which one is most likely to achieve widespread adoption and field testing in the long run?
I don't think Swift stands to gain widespread traction outside of Apple-oriented app development. Aside from a lack of stability, Apple is too well known for boxing its competitors out. I've used and loved their products my entire life, and I know how annoying it is to go against Apple's grain.
It already is, though; there are several Linux web frameworks, etc. It's open source and community-run, so I'm not sure how they're planning to box competitors out of it.
When writing a server, I would take Go over Swift any day. It outperforms it, uses less memory, it's simpler, oh, and it uses a "traditional" GC.
That is very much _not_ the case according to the testing I have done recently.
Swift uses a lot less memory than Go unless the program uses only trivial amounts of memory in the first place. Using interfaces in Go data structures makes the difference even more pronounced.
On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.
That said, Swift has a few very weak spots when it comes to memory. Most notably the String type, which is terrible on all counts, but that is a whole different story.
Only if said language doesn't allow stack allocation or static globals.
Quite a few languages do allow it.
Not really. An example in Active Oberon:

  TYPE
    point = RECORD x, y : INTEGER END;

  VAR
    staticPoint : point;                      (* on the stack or global *)
    gcPoint : POINTER TO point;               (* traced (GC) pointer *)
    noGCPoint : POINTER(UNTRACED) TO point;   (* pointer not traced by the GC *)
Mesa/Cedar, Oberon, Oberon-2, Active Oberon, Component Pascal, Modula-2+, Modula-3, D, Oberon-07, Eiffel, BETA.
There are probably a few other ones.
That's fine for one point. How about N points where N varies at runtime?
If I allocate memory dynamically outside the GC's remit, I'm going to have to release that memory somehow.
In Active Oberon's case, those pointers are still safe. They can only point to valid memory regions; think of them as weak pointers that can also point to data on the stack or in global memory. This is in safe code.
If the package imports SYSTEM, it becomes an unsafe package, and then, just like e.g. Rust's unsafe, the rules are bent a bit and use of SYSTEM.NEW()/SYSTEM.DISPOSE() is allowed.
Just as in any safe systems programming language, it is up to the programmer to ensure this pointer doesn't escape the unsafe package.
In a safe package it can only point to existing data; there isn't anything to release.
If the pointee is something that lives on the heap, it behaves like a weak reference: it points to GC data but doesn't count as yet another GC root.
If the pointee is on the stack or in global memory (the data segment in C), then there is also nothing to release. Global memory only goes away when the program dies, and the stack gets released on return. Memory that the compiler allocated for VAR declarations is static.
Usually the idea is that you use untraced pointers to navigate statically allocated data structures, and they are not to be exposed across modules.
My own code is unfortunately a bit messy and entangled with unrelated stuff. If I find the time I'm going to clean it up.
"The Garbage Collection Handbook", chapter 5
Garbage collectors are fully automatic and rarely, if ever, require the programmer to mind anything; automatic RC does almost everything but requires the programmer to analyze and annotate some things as 'weak'; manual RC requires a lot more programmer effort while still technically being "automatic"; and manual memory management means the programmer does everything.
Automatic/manual is a scale, not a boolean yes/no, and the point is that ARC lies on it a bit closer to manual than garbage collectors.
There isn't any such thing as ARC vs. GC; that is layman's knowledge and just wrong from a CS point of view.
True in some sense, but mostly useless. Come on.
FWIW, as someone who was a Java programmer for over a decade before learning Objective-C right after ARC came on the scene, I greatly prefer ARC over garbage collection. I find the things you have to remember to think about with both ARC and GC (e.g. circular references and unintentionally strongly reachable references) to be about the same cognitive load, but the deterministic, predictable behavior of ARC means you won't have to debug random GC hangs that only happen in prod under heavy load and then fiddle with a million GC options to get performance to be acceptable.
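To make the "circular references" caveat concrete, here is a minimal Swift sketch (the names are made up):

    // Tenant -> Apartment is strong; Apartment -> Tenant is weak.
    // Without the "weak" annotation the two objects would keep each
    // other alive forever: the classic leak you must remember under ARC.
    final class Apartment {
        weak var tenant: Tenant?
    }

    final class Tenant {
        var home: Apartment?
        deinit { print("tenant deallocated") }
    }

    var tenant: Tenant? = Tenant()
    let apartment = Apartment()
    tenant?.home = apartment
    apartment.tenant = tenant

    tenant = nil  // prints "tenant deallocated" right here, deterministically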
C++14 is a whole other world and can be written with most (if not all) of the safety guarantees you would expect from Swift or Rust.
I had to use it for a project and was very surprised about this too...
I used MFC like that in the late 90's/early 2000.
The problem is the teammates who write Win32/C-style code instead of MFC/C++ code.
Still, I would say it's mostly usable now. It used to be a lot worse.
-edit- I meant it as a serious question. But the person who responded to me sums up the issues.
The Swift compiler segfaults very frequently. I do find this amusing in that it's the compiler for a theoretically largely-memory-safe language (yes the compiler is written in C++, it's still funny). The syntax highlighter in Xcode, which is driven by the same stuff, also crashes, which breaks autocompletion and even indentation. Using Xcode, you just have to get used to it. It frequently reports the wrong error message - just something that isn't even close to related. Sometimes object files just become 0 bytes and you either need to clean (and experience the Swift compiler's blazing performance again) or go and modify that file so that the incremental compiler will pick it up.
I've found most of these to be triggered by using a lot of closures and possibly type inference. Shaking out the incorrect errors or segfaults is... not fun.
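For the curious: the failures often needed nothing exotic. In the Swift 3 era, even a plain chain of literals could blow up type checking (a representative sketch; exact behavior varied by compiler version):

    // Each operator over mixed int/double literals multiplies the overload
    // combinations the type checker must consider. Expressions like this
    // were known to produce "expression was too complex to be solved in
    // reasonable time", and similar inference-heavy code could crash the
    // compiler outright.
    let total = 1 + 2.0 + 3 + 4.0 + 5 + 6.0 + 7 + 8.0 + 9 + 10.0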
I should mention I also find the community to be sorta toxic. They are so focused on Swift being the one language to rule them all and they use terms like "Swifty".
I wonder if we'll get refactoring support in Xcode for C++ now that Lattner has gone. I wonder why they never added it.
Not that dev'ing should rise to that level, but it isn't the typical level most developers work at. Also, if he is more of a "software correctness and reliability" guy, as abalone says he is, then yes, that is the right direction.
The need to keep running 32-bit applications means that preexisting libraries cannot incorporate Swift code yet.
The "application may slow down your phone" warnings that users are getting with 32-bit apps this year is a pretty strong indicator that Apple is going to remove support for running 32-bit apps completely for iOS 11 or 12. They previously had a deadline for apps to have 64-bit version submitted, but backed off the ultimatum for now.
Apps have had to support 64-bit since June 2015 (February 2015 for new apps), and Apple hasn't backed off that deadline. But there are still 32-bit apps on the store that haven't received a 64-bit update, and I don't think Apple has ever stated what it is going to do about them (other than showing the warning).
Apple can't write Swift libraries (at least not ones that are publicly exposed) until the ABI stabilizes, so that will not happen until at least Swift 4. Apple can write Swift apps as long as they don't need to support 32-bit Mac, which they haven't needed to do for years.
There is a WWDC session talking about it.
I would be surprised if they actually did, though. I know someone did some static analysis of Mac and iOS apps and found Swift was barely used at all.
I have to stomp on coreaudiod a few times a month because my USB DAC stops responding.
It is not fair to judge Lattner's ability or commitment to safety based upon Xcode and Swift. One of these predates him, the other is the result of his decisions (no doubt) but also countless decisions of others, including those above his pay grade at Apple.
Xcode is basically the evolution of something designed for NeXT. Swift is a solution to various problems in application development. I'm not aware whether he (et al.) had real-time processing or safety-critical devices in mind with 3.0.2.
I'm not claiming Lattner is missing some innate ability -- it just appears they're putting someone in charge who has never been exposed to this mindset. Maybe they have a culture or other leaders already in place who can foster this within the team.
> It is not fair to judge Lattner's ability or commitment to safety based upon Xcode and Swift.
To be fair, we cannot judge Lattner's ability or commitment to safety at all, because he has no publicly-known experience with safety-critical systems.
Perhaps he has relevant experience that's not public. And if it turns out he has no relevant experience, I'm not saying he can't learn. It's just strange for Tesla to put someone in charge who will be learning on the job.
> I'm not aware if he (et.al.) had real time processing
C++ has become incredibly good recently, after languishing in the C++03 era for too long.
One way to get more correct code is clean code layout (I've heard good things about the LLVM code base), simple-to-read code, and compilers that exploit the theoretical knowledge we have gained over years of compiler and type-theory research. I think Chris Lattner has expertise in all of these (or at least knows their importance, contributions, and drawbacks). And if you want to build a full-blown self-driving car, it is important to have no indeterminism in your car; advanced languages and compilers help you guarantee specific statements about your code.
So it absolutely makes sense to invest in your compiler research team for safety, as our security, reliability, and correctness expectations will (rightly) rise, in particular for self-driving cars (they are not just DNNs).
That's why some of the big banks flat out refuse to implement any form of deep learning for risk analytics. They're much more reliant on simpler ML models like random forests and logistic regression that are easier to analyse and diagnose by model governance teams.
In the end you want to know what went wrong (the public will demand it, and they are right), and it might be a misclassification.
But that is not enough. You want to know why it classified situation X wrong. So the answer is: because the inference network (which was created via the training network) computed its weights the way it did because the input was Y. Now you might throw your hands in the air and say "oh, it's complicated, the network is nondeterministic, blame NVIDIA", but you can also go further and build your networks deterministically (which is possible and, AFAIK, carries no performance penalty). Compiler research helps at least in guaranteeing that certain parts of the code are deterministic, which makes it easier to debug and maybe avoid complex NN misclassification scenarios; but the way to do it has less to do with NNs themselves and more with language design and (real-time, in particular deterministic) OS research.
So the statement "oh, that's so complex, we do not know why we misclassified" is no excuse; we can do better.
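As a tiny sketch of what "deterministic by construction" means at the lowest level (Swift here, since it's the thread's language; the SplitMix64 constants are standard, everything else is illustrative): seed every source of randomness, and two training runs become byte-for-byte comparable.

    // A seeded generator: same seed, same stream, on every machine.
    struct SeededGenerator: RandomNumberGenerator {
        private var state: UInt64
        init(seed: UInt64) { state = seed }
        mutating func next() -> UInt64 {
            state &+= 0x9E3779B97F4A7C15            // SplitMix64 step
            var z = state
            z = (z ^ (z >> 30)) &* 0xBF58476D1CE4E5B9
            z = (z ^ (z >> 27)) &* 0x94D049BB133111EB
            return z ^ (z >> 31)
        }
    }

    var rng = SeededGenerator(seed: 42)
    // Reproducible weight initialization: rerunning always yields the same
    // bytes, so diffing two runs' outputs becomes a meaningful check.
    let weights = (0..<4).map { _ in Double.random(in: -0.1...0.1, using: &rng) }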
For starters, we have to publish NN papers with implementations that describe how to build a particular NN out of given training data (and provide that data too). We already publish the code and network structure (see Caffe, etc.), but often with pre-trained models that were built on a cluster, with many rounds of training data going through the network structure, etc.
At the moment, you read a paper, head to the published code (often available, which is again a desirable property of the ML community), and try to reproduce some examples by training on the data.
However, it is hard to say in the end whether your network is really as good as the published one, since a simple

    $ diff my-net-binary-blob.dat tesla-net-binary-blob.dat

tells you nothing when training isn't bit-for-bit reproducible.
It does make it simpler, but surely not usually simple enough to answer the question "why did we misclassify". It's like saying we will finally understand consciousness once we simulate the quantum mechanics of a certain cubic metre of space with perfect accuracy - which need not be true even if that cubic metre happens to contain a functioning brain.
One of the concepts is to prevent the use of any dynamic behavior, which means you don't need any garbage collection at all, because garbage collection is not predictable. A GC can be good or bad, but even a good one is not predictable.
Embedded design and development is completely different from classical IT. To IT people that sounds archaic, but it has always proved right in the end.
And regarding memory allocation in reliable automotive systems: yes, the best practice would be not to allocate at all, to get deterministic behavior. However, I've seen lots of projects where "don't allocate" is implemented as "don't allocate with malloc", and you find dozens of custom memory allocators and pools throughout the code. Some of those designs are probably less reliable and safe than using a garbage-collected language would be.
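For readers who haven't seen the pattern, it looks something like this (a generic sketch, in Swift for readability; real automotive code would be C, and the names are invented):

    // A fixed-capacity pool: no heap allocation after startup, but now
    // exhaustion and slot reuse are the programmer's problem.
    final class FixedPool<T> {
        private var slots: [T?]
        private var free: [Int]

        init(capacity: Int) {
            slots = Array(repeating: nil, count: capacity)
            free = Array((0..<capacity).reversed())
        }

        // Returns a slot index, or nil when the pool is exhausted;
        // forgetting to handle nil is one way these designs go wrong.
        func acquire(_ value: T) -> Int? {
            guard let slot = free.popLast() else { return nil }
            slots[slot] = value
            return slot
        }

        // Releasing the same slot twice corrupts the free list, a bug
        // class that a garbage-collected language simply doesn't have.
        func release(_ slot: Int) {
            slots[slot] = nil
            free.append(slot)
        }
    }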
It might work out with a very careful system design, but statically determining how many queued messages and how much memory are needed seems like a very hard task.
No garbage collection means more fragmentation, though, which can become a performance issue at some point; there's no free lunch.
So build the autopilot and other systems in Haskell?
This move increases his compensation and clout. Post-Tesla, he'll only have VP or founder titles elsewhere, never anything lower-level (unless he gets his old job back). The change must be welcome, too.
That Apple could not retain him speaks volumes of the company they've become. They're a conglomerate at the intersection of tech and fashion. Groundbreaking engineering is not always given its proper due (or compensation) because there's only so many seats at the table. They've become rigidly corporate and not particularly inspiring.
Good luck to him, and good job taking a chance. Working for someone with a vision other than "thin" must be a welcome change.
This is so spot on. I would kill for a thicker phone with a less vibrant display that would last 2x as long on a single charge. They're pushing stuff that the consumer doesn't want (thinner bodies/less battery volume, no headphone jack) in order to make money. It reminds me a lot of the TV industry pushing 3D TVs. I don't know a single person who has ever watched any 3D content on their 3D TV, let alone regularly. It's a technology that few wanted but that was pushed to drive sales.
But how do you know that people don't want this? All the people who told me they were switching to Android because of the headphone jack ended up with iPhone 7s anyways and love them. And now it seems that Samsung's next phone is going to ditch the jack too. Not to mention the numbers aren't really in favor of suggesting this change is hugely unpopular.
Not everyone hangs in the same circles, but lots of (for example) motorcyclists buy these for general "plug in at camp" use. Run a small light, charge the GoPro, and some do use them to charge a phone. Hell, some of them use them as jump start batteries which is about as far away from anything related to cell phones as I can imagine in the portable power market.
In summary, quoting battery pack market size merely reflects the desire for folks to have portable electrical power, for whatever they feel they might need it for.
The 3.5mm headphone market is also massive but people don't want 3.5mm headphones, they just want headphones. You're confusing demand for intrinsic virtue.
Sorry if this sounds naive (and it's not intended as veiled criticism), but could it not be the case just that he is more valuable to Tesla so they are able to justify offering him more compensation?
Or it could just be that there are hundreds of engineers and tech leads working on Swift/Xcode/LLVM, and giving enormous compensation to one because (s)he is a famous engineer who wanted to leave does not really make sense. Unlike marketing or sales, writing core software does not seem to be star-driven, or at least I hope so.
Well said. Apple would not have made the strides it did in its developer platform had it not been for Lattner. It does seem that the company has different priorities now than it did when he was hired.
He is an inspirational leader, and talented people want to work with him. It appears it will be harder and harder to attract and keep talent at Apple without a leader like Steve Jobs.
Tesla, meanwhile, is still in the upward-rocket phase, still figuring out its product/market fit. Much more exciting for the type of talent looking to make a big impact on the world.
Steve Jobs brought that life back into Apple when he rejoined and maintained that environment through various product launches over many years.
I'm not sure I'd want them to keep pretending they are the same company; rather, they should work with the talent they have. Just like basketball teams rebuild after losing their star players, you rebuild around new talent instead of clinging to past glory.
I know you are being facetious, but similar to Jobs, Musk got ousted as CEO from his first and second companies: Zip2 and PayPal.
Vance, Ashlee (2015). Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future, p. 72.
Chris Lattner has been at Apple for over 11 years. He shepherded LLVM, Clang, lldb, and Swift. All while climbing the ranks to Director. A huge number of engineers would simply be ready to think about and do something different after that much time.
You might say that Elon answers "Why Tesla?" but in no way can you claim an answer to "Why leave Apple?".
(I have no inside knowledge, I've just been in the industry long enough to have gone through this myself).
Seems obvious, doesn't it? Nothing interesting going on.
Under Jobs, terms like "courage" and "innovation" used to mean something, like kicking sand in the face of the entire mobile phone industry and competing against their own bestselling product. Tim Cook's idea of "courage" involves selecting a headphone jack in a CAD tool and hitting the Delete key. And "innovation" means late nights at the office pushing the limits of dongle engineering.
I do think his technical knowledge was probably higher than most give him credit for, however.
This is a convenient narrative when Tesla makes a high-profile hire, but they've also lost a lot of talented engineers, and I don't remember the conventional wisdom being that it was because of Elon's shortcomings.
I can't even imagine what Musk would do with Apple's $200 billion in cash. I think he would've been much more daring with that money than even Steve Jobs would've been.
But I think Apple missed its shot, and the merger of Tesla and Solar City probably sealed that for good. Now Musk is probably already seeing a 10-20x larger combined Tesla/Solar City company in his head, 10 years from now, and a potential merger with a bigger Space X as well.
So from his point of view, it probably won't be worth it anymore. He would have to take over a declining iPhone market and deal with that at the same time as dealing with explosive growth at Tesla and an imminent launch of SpaceX's big rocket to Mars.
On the other hand, there would be hundreds of billions of dollars he could get access to, so I wouldn't say it's impossible anymore either. However, at least to me, this would only be interesting from the "let's give Musk unlimited money and see what he can do with it" point of view. Otherwise, I would rather see Tesla/SpaceX on their own than joined in an Apple/Tesla/SpaceX megacorp.
I'm not knowledgeable enough about the psychological/sociological aspects of this, but I wonder if it's possible to maintain that kind of culture when you have $200 billion sitting in the bank. It might lead to the kind of resource curse that some countries suffer from.
I wonder if there have been studies done on this at the corporate level.
It's questionable if he would be the profit-maximizing CEO choice for Apple though.
Perhaps he wouldn't even want the job (if we ignore the cash), because the smartphone and computer industry is maturing, i.e. tougher competition, decreasing rate of innovation, maintenance mode ahead.
My favourite bosses were engineers first.
And as for the iPhone vs. electric car comparison: oh well, maybe I am old school, but if BMW or other carmakers made an electric car (assuming we have to choose an e-car), I would choose them over Tesla any day. The interior quality of Tesla just doesn't compare well.
The iPhone was truly a revolutionary product, in that many people tried smartphones before and they had ALL FAILED. This is coming from someone who used pretty much every smartphone prior to the iPhone's introduction.
If anything, I'd say the difference is Elon Musk's lack of taste.
It's well documented that Android was going to be Blackberry-ish, then they pivoted to be iPhone-ish after the original iPhone announcement. Think the App Store and things like Uber, which exist primarily as mobile apps.
Very rarely is a new technology useful in abstract; the inventor of the steering wheel made a contribution but without the rest of the car it is meaningless.
Touch screens existed, sure. There were one or two largish screen phones, sure. Smartphones existed (all using keyboards and styluses). OS X and Safari existed. But no one had put it all together into a single device, nor had anyone created a sensible UI design language to take advantage of things like multitouch.
You're basically saying Dropbox is garbage because rsync/SMB/NFS/FTP existed. Or Uber/Lyft are garbage because Taxis existed. Yes there are some superficial similarities but it turns out the details make a massive world of difference and it is intellectually dishonest to be so dismissive.
Interesting that Google recognized the importance and adapted quickly while Ballmer laughed it off: https://www.youtube.com/watch?v=eywi0h_Y5_U
Yes, he might change the world if/once he gets to Mars, but most of his stuff is a marketing tech demo.
Bell Labs changed the world. Tesla did not.
And I think your categorisation of his work as a tech demo is unfair, and uninformed. SpaceX have built rockets that are delivering satellites into orbit and supplies to the international space station, and have achieved reusability. They have played a significant part in building a private sector space industry. These are not tech demos, they are real things. Can you boast any such achievement in your own life?
Likewise, Tesla have built and sold electric cars that people want. They've created a global recharging network to supply them. And they've been the first company to deploy significant automation in the automobile market on a large scale. These are not tech demos, any more than the gigafactory rising out of the Nevada desert is.
That was an unnecessary remark; the commenter never claimed that he/she has achieved more than Musk.
I'm perfectly entitled to ask what they've achieved themselves that puts them in a position to so casually demean what, by most people's standards, are quite considerable feats.
Disclaimer: I work at SpaceX, albeit as a technician so I'm very far down the corporate ladder, but I feel that may be unduly harsh.
There can certainly be legitimate criticisms around Musk personally, and SpaceX/Tesla in regards to whether they are overhyped relative to competitors or whether they will succeed on delivering what they promise. With that being said, when a company delivers 70k+ cars in a year, even if this is just a tiny percentage of the overall new car market, or a company puts satellites into orbit, I think we've moved beyond "marketing tech demo" status.
Being a good CEO is just way more valuable than being a good individual contributor, because it multiplies the output and growth of the whole company.
And ego doesn't really matter that much in the big picture.
Most engineers wouldn't have thought about proportionally spaced fonts back in the 80s, when personal computers only had 80 x 24 fixed-width character green-phosphor displays.
When Jobs dropped out of regular college and dropped back in to take the classes he was truly interested in, he took calligraphy; years later that led to the Mac being the first personal computer (the Lisa had it too, but that was the $10,000 predecessor to the Mac) to have a proportionally spaced, bitmapped display.
That's not something Woz (or someone like him) would have prioritized for a brand new computing platform.
> If you think the job of a CEO is to increase sales, then Ballmer did a spectacular job. He tripled Microsoft’s sales to $78 billion and profits more than doubled from $9 billion to $22 billion. The launch of the Xbox and Kinect, and the acquisitions of Skype and Yammer happened on his shift. If the Microsoft board was managing for quarter to quarter or even year to year revenue growth, Ballmer was as good as it gets as a CEO. But if the purpose of the company is long-term survival, then one could make a much better argument that he was a failure as a CEO as he optimized short-term gains by squandering long-term opportunities.
Yes, he blew it on Mobile and that's a huge deal. The hugest. He had his blind spots, but he was very far from incompetent.
I can 100% assure you that I only nitpick because I want to understand the logical reasoning behind the downvoting, and I'm scared that I am too high to understand the meaning of the phrase "It's funny you mention that". I am not a native speaker, but I know all the words, have heard it in English, and there exists a translation into my language with the same interpretation in both, so, long story short: why is it funny?
But why is that funny (haha) or funny (strange)? If they can build a self-driving car for the masses Elon Musk will be the uber-tech guy for a whole generation, and Tesla seen as one of the good guys with cool tech.
Tangential, but I really hate the term "poach" when referring to recruiting employees.
We shouldn't think of hiring people as "poaching", because employees are not property. You can't "poach" an employee because there's no ownership, and employees should be free to make their own decisions regarding their employment opportunities.
Whenever I see it, I think of the two companies plying Y with enticements, with Y manipulating them until (s)he finally gets rich rewards from the new company.
You do hear "poach" used colloquially, or even somewhat jokingly, when it's not intended to imply anything nefarious, but the word does imply something underhanded.
Yes. If that wasn't what someone was trying to convey, why did they pick the word "poach" in the first place? The whole point of using that word is to evoke a metaphor where the hired employee is a mere game animal ensnared by the company.
> I think of the two companies plying Y with enticements, with Y manipulating them until (s)he finally gets rich rewards with the new company.
"Hire" is a good verb for that.
It's always worth thinking about the language we use to describe things, because that language does have an effect on our modes of thought.
I believe it's a mistake to treat words as though they were nothing more than abstract parse tokens, devoid of any cultural, historical or etymological baggage and conveying only the precise meaning that we intend to convey. Language is more complex than that, and the baggage that comes along with words does sometimes imply things that may be entirely unintended by the speaker, often slipping easily below the level of conscious awareness. Entire fields of propaganda (and advertising!) are based on this.
I am unsure why even the idea of thinking about the words we use is apparently so controversial - is critical self-examination really so scary?
This comparison really seems to miss the mark.
Of course, the employee is free to just up and walk away at any time, but if you feel it is common courtesy to discuss the relationship before parting, and it wasn't discussed, then "poach" does not seem completely inappropriate.
Having said that, in this case, we don't know many details about the parting. He and Apple may have had a good discussion about the future and came to a realization that the relationship wasn't going to work anymore, for whatever reason, and parted amicably.
From another perspective, perhaps we should not even think of employer/employee agreements as relationships in this day and age?
In the typical enterprise, whose domain isn't selling software products, C++ tends to be mostly used as infrastructure language for .NET/Java/JS native libraries or interacting with their runtime APIs.
My guess would be that Tesla's ambitions are set on controlling the whole stack. Something along the lines of a custom software toolchain for tailor-built silicon.
Given it's largely a field for intellectuals, sticking with one thing for a long time can become repetitive and boring, losing its challenge and appeal.
I'm reminded of their loss of both Avie Tevanian and Bertrand Serlet.
Now, the new mystery: What does this mean for Apple's car project?
Yes, I realize he probably didn't have much (or anything) to do with it (whatever it really even is). There are going to be some batshit crazy theories though, and I can't wait to see how this affects Apple's stock price.
I don't think we can derive much about their project from this hire tbh.
So his new work will have a lot to do with his interests ;)
All of those ("programming languages, compilers, debuggers, operating systems, graphics, and other system software") imply interest in tools for software development, where here they'd mostly be developing software (self driving engines etc), period.
Plus, I think the latest rumors say Apple has given up on making a car anyway, and it's working on a Waymo competitor of sorts.
This has gotten so much coverage on the back of ... nothing really.
That was a short mystery! :)
I had the impression he wasn't that high on the totem pole before Jobs died...
Because that was also part of the parent's framing of the conflict.
Pure luck, but my reasoning was that it would be difficult for a compiler (etc.) engineer like him to find a better job than the one he was doing at Apple: developer tools, compiler infrastructure, a new programming language, a coding playground, and so on.
The point is that Apple's been dropping the ball a lot lately. Lattner leaving is either another example of that (why didn't they make him a better offer to keep him?) or a consequence of it (what smart person wants to work for a company that's constantly screwing up?).
They might, but the examples given for that are themselves bad:
>Macbook with less ports
Which has always been something Apple pushed for: dropping deprecated ports early (to complaints) and adopting new ones. Few doubt USB-C is the future, even if they complain about the dongles.
>copy&paste iPhone with no originality
The iPhone, like the iPod before it, has always had incremental updates. What originality exactly should it have in its 10th year? Magic pixie dust spray? Can you point to some competitor doing anything original?
Besides, while everyone goes on about how "Apple is all about style and no substance", nobody pays attention to the large internal changes inside the iPhone year over year, with new processors, boards, camera setups, and other internals designed by Apple. Processors that, in all tests, leave the competing top-end Android phones behind in single- and multi-core performance.
>imac/macpro with old CPUs :-(
Intel announced Kaby Lake CPUs suitable for the iMac just last week (Jan 03).
The Mac Pro, yes, but it's probably a dying niche product.
I've never considered myself a diehard Apple fan, but over the years I've spent thousands and thousands of dollars on Apple products, from a 1 GB iPod shuffle all the way up to a Retina MBP, and quite a few things in between. I don't need to argue about whether Apple is going downhill, because I'm seeing it first hand while using their products, or, more often lately, choosing not to use their products.
My counterpoint is that I've been following Apple news ever since I was enthused with the idea of a NeXT-based OS X back in 2001, and know that "the simple fact that we're even discussing this" doesn't mean much.
People, pundits, and the media have been "discussing" that Apple "lost the plot", "is doomed", "can't compete anymore", etc. constantly, from the introduction of the iPod until now.
Maybe this time it really is. But NOT because people are discussing it: "people discussing this" has been a constant, not a differentiating factor for these times.
This is awful though. People can't really think spending time on autopilot for rich people's sports cars is more important than the LLVM compiler infrastructure.
Give me a break!
Even if Tesla invents cool new batteries and changes the way we think about power, all of that stuff still has to run on software, which in turn depends on compiler infrastructure...
But seriously, exciting news for Tesla.