I've always been amazed by their attitude.
Look at Microsoft Research, and their enormous scientific output over the years. IBM and Google look bleak by comparison, and Apple is not even on the chart.
Here is a nice snapshot of the major contributions that came out of IBM Research over the past 60 years: http://www.research.ibm.com/featured/history/. Edit: to be frank, the invention of copper interconnects alone is enough for me (http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/copperc...).
On a similar note, Intel is another company that is very active in research and has published a significant amount of peer-reviewed publications. Samsung Research is yet another; they have an amazing presence at circuit design conferences, for instance.
The modern world is to a great extent a byproduct of the experiment that was Bell Labs. Some laugh and say it was an experiment gone awry.
"A monopoly like Google is different. Since it doesn't have to worry about competing with anyone, it has wider latitude to care about its workers, its products and its impact on the wider world. Google's motto—"Don't be evil"—is in part a branding ploy, but it is also characteristic of a kind of business that is successful enough to take ethics seriously without jeopardizing its own existence. In business, money is either an important thing or it is everything. Monopolists can afford to think about things other than making money; non-monopolists can't"
There are also a lot of small tech companies, like 37signals and Fog Creek, that try to be ethical and treat their employees well. I don't think Thiel could argue that they are monopolies.
Nor frankly do I buy that Google is a monopoly. But that's beside the point.
Much like Xerox Palo Alto Research Center (inventors of the mouse, GUI, and object oriented programming), which Xerox failed to profit from.
I get that point. We're all interested in building great and successful companies on here, so of course it's a bummer when there is success from one perspective but the whole thing doesn't quite work out overall.
But I also want to ask: as a society, don't we benefit so much from advancements made in the open, through publicly shared research as well as the open source movement, that sometimes we would do well to just bask in the glory of those advancements, never mind which individual entities (financially, structurally) stuck around or not, or profited or not, in producing them?
It's also very clear, of course, that it's beneficial to look at the past in a discerning way, learn from it, and make things better now and in the future. Still, there's something about the thought in the paragraph above that I wanted to bring up.
I do admire a company that contributes in that way, recognizing that it's a contribution to humanity rather than investment for future profits.
Xerox PARC had a number of notable inventions and they created the Alto computer which had a bitmapped screen with the desktop metaphor, but they did not invent the mouse or object oriented programming.
In terms of the computer mouse:
>...Independently, Douglas Engelbart at the Stanford Research Institute (now SRI International) invented his first mouse prototype in the 1960s with the assistance of his lead engineer Bill English.
In regards to object oriented programming:
>...Simula (1967) is generally accepted as being the first language with the primary features of an object-oriented language.
Xerox didn't lack the commercial understanding to sell Alto+spinoffs, it lacked the understanding to realise you could build a developer ecosystem to support your hardware and make it the de facto standard.
DEC and IBM didn't understand this either. Gates and Jobs totally understood it, which is why Windows became a business standard and the Mac became the only serious business/home alternative.
But Xerox still did okay, because the use of GUI software transformed office culture and made it much more visual - which meant very steady sales of copiers and printers.
Xerox's stock price climbed steadily through the 1990s while paper remained a thing.
After the dot com crash, GUIs and screens had evolved to the point where paper became non-essential, and Xerox never entirely recovered - although you can still find a few people who print out and file all their emails.
tl;dr Xerox did very nicely indeed from Alto etc in an indirect way, for at least a decade or so.
I'd suggest that was due to the anti-trust case against Bell more than anything else. For example, I believe AT&T were forbidden from selling Unix direct to consumers for many years, leading them to license Unix to other entities (in the business and academic worlds).
I very much doubt it would have been adopted at such a scale, especially by the then-startups trying to create a workstation market, if Bell could have sold it at the same prices other OSes were being sold at.
Plus, maybe the Xerox PARC attempts would have been more successful if there hadn't been a cheaper UNIX workstation as an alternative, in spite of how they managed the whole process.
Why would it have needed to be priced that high? We're talking about software here, the cost of reproduction is close to zero. It could've competed in the same market space as CP/M.
It's not a fair comparison at all.
Especially since Bell Labs sat within a state-sponsored (enforced?) monopoly until the mid-80s, so there was no direct concern about revenue. Which is not to belittle them in the least as they made outstanding organizational decisions, and had the most enviable pure research->development->production pipeline to this day.
In terms of R&D in the USA, the closest comparison is IBM (which also enjoyed a monopoly for some of its existence). MS's monopoly, with the benefit of hindsight, was feeble in comparison even to IBM's -- though they, like IBM, are absolutely one of the greats and have redeemed themselves particularly in the past few years.
The author confused continuity of the DECISION FUNCTION with continuity of the OUTCOME CURVE.
In other words, an algorithm such as "keep trying to make a decision until you notice that t > C, after which always pick the left one" will in fact NOT have a continuous outcome curve, despite the decision function being continuous.
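The distinction can be illustrated with a toy model (my own sketch, not from the original argument): the deliberation rule varies continuously with the input evidence, but the timeout rule forces a default choice, so the final outcome jumps discontinuously as the input crosses the point where the threshold can no longer be reached before the deadline.

```python
# Hypothetical model: evidence x in [-1, 1] (negative favors left,
# positive favors right) accumulates continuously over time. If the
# accumulated confidence crosses a threshold before the deadline C,
# the favored side is picked; once t > C, the rule always picks left.

def decide(x, threshold=1.0, deadline=10.0, dt=0.1):
    t, confidence = 0.0, 0.0
    while t <= deadline:
        confidence += x * dt                 # continuous accumulation
        if abs(confidence) >= threshold:     # decisive before deadline
            return "right" if confidence > 0 else "left"
        t += dt
    return "left"                            # timeout: t > C, pick left

print(decide(0.5))    # strong rightward evidence -> "right"
print(decide(0.05))   # weak rightward evidence times out -> "left"
```

Even though every step of the decision function is continuous in x and t, the outcome curve has a jump: inputs that slightly favor "right" but are too weak to reach the threshold in time get mapped to "left".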
That continuity assumption he made is there in Newtonian mechanics and other very widely accepted models. Something just feels off about the whole assumption; maybe I've spent too much time with digital computers. Those models also have a hard time explaining the irreversibility of time.
Class of their own IMO.
“Our practices tend to reinforce a natural selection bias — those who are interested in working as a team to deliver a great product versus those whose primary motivation is publishing,” says Federighi.
In my opinion, this gives you a view of the thinking that goes on inside Apple. Publishing papers correlates negatively with being a team player and delivering great products, and here Apple is on record as saying that they do not want that sort of person working there.
Interesting quote. Federighi seems a tad condescending, implying that producing great products and producing publications are mutually exclusive.
Whilst Apple might expect their engineers to toil away anonymously under NDA without much recognition outside the organization, the world of academic research does not work like that at all.
I guess the proof of the pudding is in the eating, and in my experience with Apple's AI efforts they are far behind their competitors. I for one am glad Apple are going to start publishing research - it should attract talent, foster sharing, and ultimately result in better AI products.
If you go by publication count alone, I suspect that IBM Research is still near the top. They have a large research organization that made some fundamental contributions in computer science in everything from speech recognition to databases.
Using the second metric, the winner in all cases, without question, would be AT&T Bell Labs.
I'd also remind you of the cost of a telephone system with monopoly status that didn't even let you install your own phones in the house for a long time. And which led to skits like Lily Tomlin's "We don't care. We don't have to. We're the phone company."
MSR has done cool and important work, but Google's publications have often been major paradigm shifts. In a lot of ways they're currently way out ahead of everyone else.
MSR was among the first to bring FP into mainstream developer tools with LINQ and F#.
Investing into dependent type programming via F*.
Sponsors Haskell and OCaml research.
Researches OS designs that aren't yet another UNIX clone: memory-safe OSes with Singularity, Midori, and Drawbridge; theorem provers for device-driver validation; the P language; micro-kernels.
(I agree that Go is a horrific mediocrity in comparison.)
Do you have any metrics to back up this dubious claim? I find it very hard to believe Microsoft is even in the same ballpark as IBM considering all of the research centers IBM has throughout the world and the number of years they've been in business.
Depends on the compensation and several other factors. Lots of military/intelligence related research for example is not published either...
As for Apple: seeing is believing.
I watched that presentation and I don't recall them claiming to be the only ones using differential privacy.
But my impression from the presentation, as someone who had never heard of differential privacy before, was that this was brand new research from some professor, which they contracted to help them apply it to the real world. Definitely not "this is a technology that's used to achieve such and such, this is how it works".
I don't follow. How would preventing employees from writing papers stop them from reading papers?
Much of the current AI research starts with some seed idea, like GANs, which looks cool in small experiments but has tons of blind spots that need to be ironed out. Unlike typical product efforts, where you can probably work your way through with brute-force engineering, advances in most AI-related areas require a massive amount of collaboration, mathematical acrobatics, trial and error, and cross-pollination across fields, consuming many person-years before yielding fruit. In theory, Apple can still keep any silver bullets they find in the field secret, but in practice such silver bullets are rare and advances are very incremental, spanning many years and many people.
Basically, they can't provide the seed insight that motivates others.
All the AI/ML engineers were going to FB/Google/etc. (and on some rare occasions M$)
Are we still doing that?
So Alexa is really the biggest thing in the market so far, right? How many papers does Amazon publish?
Dunno about the rest of the company.
Mac site picks it up 5 hours ago and it gets reported on HN:
Bloomberg writes the same story an hour ago and it finally gets huge traction on HN. My guess is that many people ignore the "new" page and it's all a matter of luck that 3 or 4 people get it to the front page where a story takes off.