
This migration of physicists to Silicon Valley and the computer industry is not new. It has been happening at least since the first big physics employment bubble burst in the late 1960's and early 1970's. The post-Sputnik boom in physics degrees and grad students produced a huge surplus of physicists by the late 1960's. Dennis Ritchie started at Bell Labs in 1967. Back in the 60's, 70's, and early 80's, a fair number of physicists decamped for Bell Labs, mostly to work on computer- and telecommunications-related activities.

A high-profile example is Emanuel Derman, author of My Life as a Quant (2004) and later books, who worked at Bell Labs from 1980 to 1985 before moving on to Wall Street. He mentions quite a number of other physicists who were at Bell Labs at the same time.

Most physicists end up in some sort of software development. The high-profile "quant" jobs are actually rather rare and hard to get. The Wall Street firms typically go after very strong physicists, especially theoretical physicists like Derman.

Nathan Myhrvold of Microsoft and Intellectual Ventures fame (or infamy) has a Ph.D. in theoretical physics from Princeton. He did not go to Wall Street. :-)

The Large Hadron Collider (LHC) produced a huge surplus of experimental particle physics (high energy physics) Ph.D.'s with no jobs in the field. Experimental particle physics involves large amounts of software development for data acquisition, instrument monitoring and control, and data analysis, mostly in C and C++, although there is still some "legacy" FORTRAN software. The heyday of FORTRAN in physics was a long time ago.

Although there have been attempts to use neural networks and other machine learning methods in particle physics, the workhorse of data analysis in the field is Ronald Fisher's maximum likelihood estimation and classification -- primarily estimation of parameters such as the mass and width of the Higgs Boson. The discovery of the Higgs was a maximum likelihood analysis.
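To make that concrete, here is a minimal toy sketch in Python (the real analyses are far more elaborate, and mostly C++): it recovers the "mass" and "width" of a simulated Gaussian peak by minimizing the negative log-likelihood. All numbers and names here are illustrative, not taken from any actual LHC analysis.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical toy data: 5000 simulated "events" drawn from a
    # Gaussian resonance with a loosely Higgs-like mass and width.
    rng = np.random.default_rng(0)
    data = rng.normal(125.0, 2.0, size=5000)

    def negative_log_likelihood(params, x):
        mass, width = params
        if width <= 0:
            return np.inf
        # Gaussian log-density summed over events, constants dropped;
        # minimizing this maximizes the likelihood.
        return 0.5 * np.sum(((x - mass) / width) ** 2) + x.size * np.log(width)

    result = minimize(negative_log_likelihood, x0=[120.0, 5.0],
                      args=(data,), method="Nelder-Mead")
    print(result.x)  # estimates close to the true (125.0, 2.0)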

Although it is undoubtedly possible to map maximum likelihood onto neural networks, in practice they are different. Neural networks are an attempt to simulate the low-level structure of the neurons in the brain and to solve problems by brute-force fitting of data to models with huge numbers of adjustable parameters. In contrast, maximum likelihood involves an attempt to understand the phenomenon under study and model it with a small number of functions corresponding to higher-level concepts such as the Higgs Boson. A neural net could fit the Higgs Boson peak exactly yet never produce or confirm a physical model of what causes the peak.
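For contrast, here is an equally hypothetical sketch of the brute-force approach: a small neural network (scikit-learn's MLPRegressor, chosen purely for brevity) fit to a histogram of the same toy peak. It can match the shape closely, but none of its several hundred weights corresponds to a mass or a width.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Histogram the same toy events and fit the bin contents with a
    # small neural network regression.
    rng = np.random.default_rng(0)
    data = rng.normal(125.0, 2.0, size=5000)
    counts, edges = np.histogram(data, bins=50, range=(115.0, 135.0))
    centers = 0.5 * (edges[:-1] + edges[1:])

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                       random_state=0)
    net.fit(centers.reshape(-1, 1), counts.astype(float))

    # Hundreds of adjustable parameters, versus the two physically
    # meaningful ones in the maximum likelihood fit above.
    n_params = sum(w.size for w in net.coefs_) + sum(b.size for b in net.intercepts_)
    print(n_params)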




This is a good summary, but we used both machine learning and maximum likelihood to discover the Higgs Boson. The difference is that we frequently use machine learning to identify the (already discovered) particles that the Higgs decays to.


Part of that is because there were only physics and math degrees up to around the 60s. There wasn't really EE or CS, and even when there was, most of the professors were originally trained as physicists.


There were definitely many EE degrees. Electrical technology dates back to the telegraph, telephones, lights, and early electric power in the 1800's, followed by radio in the 1920's. Richard Feynman started out as an EE at MIT in 1934 and was one of only a small number of students in his class who got a physics degree. Most MIT students got EE and other engineering degrees. EE was hot in the 1930's. It is true that CS was a new field in the 1960's and 1970's.


>The heyday of FORTRAN in physics was a long time ago.

Ahh, how I wish that were true...



