
IBM announces $3B research initiative - jonbaer
http://electroiq.com/blog/2014/07/ibm-announces-3b-research-initiative/
======
Keyframe
I've been following this topic for quite some time, probably because I always
wanted to be an EE/HPC engineer but never was. When you listen to what the
people from IBM, Intel, Applied Materials etc. have to say, it seems that 5nm
will probably be the last silicon-based process. III-V materials (compounds of
the boron and nitrogen groups) will then be next, with the most likely
candidate being GaAs (gallium arsenide), which is already used in niche
production.

What is more interesting than the semiconductor base is the lithography
process. Deep UV (immersion, using the refraction of a medium such as water)
lithography, as currently used, is showing its limits for ever-smaller
features. So they came up with extreme UV lithography, which uses mirrors to
project features onto the surface, because at that short a wavelength
everything is opaque. Applied Materials apparently has a machine with EUVL
(there is one on their site, with a nice video description), as does ASML
(their machine reportedly costs $88 million).
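
To put rough numbers on why immersion DUV is hitting its limits, here's a
back-of-the-envelope sketch using the standard Rayleigh resolution criterion
(CD = k1 * lambda / NA); the k1 and NA values below are typical textbook
figures, not from the article:

    # Rayleigh criterion: minimum printable half-pitch CD = k1 * lambda / NA.
    # Typical textbook values assumed; not figures from the article.
    def min_feature_nm(wavelength_nm, na, k1=0.35):
        """Smallest half-pitch a single exposure can resolve."""
        return k1 * wavelength_nm / na

    # 193nm ArF immersion: water raises the effective NA to ~1.35.
    print(min_feature_nm(193, na=1.35))   # ~50 nm -- hence multi-patterning below that
    # 13.5nm EUV with early ~0.33 NA mirror optics.
    print(min_feature_nm(13.5, na=0.33))  # ~14 nm in a single exposure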

There is a chance that with a new semiconductor base (group III-V) there will
be a reset back to 22nm or similar, and a race to the bottom once again.
We'll see.

~~~
typon
GaN, SiC and diamond are also very good candidate materials.

Also, don't forget multi-patterning, which is what has allowed us to get to
14nm in the first place. I reckon Intel will ditch the immersion lithography
they love so dearly and move on to other techniques, which is when I will
actually get excited about them. X-ray lithography is also being investigated
and seems to hold some promise.

The crazy thing about all these advancements is that no one knows what the
future will hold. Right now we are essentially living in a renaissance era,
witnessing the death of silicon. I just hope the stuff I'm working on
succeeds so I can make that cash money.

~~~
Keyframe
Intel has already said they are moving from immersion-lithography multi-
patterning to EUV, but the tech won't be ready for them for another year or
two. X-ray lithography has been used in the past, if I'm not mistaken; I
didn't know there was interest in bringing it back. I know there is some
research into electron-beam lithography, but its jitter and slowness are major
hurdles.

What are you working on - graphene? In any case, very, very interesting times
ahead. Projections are that 2020 will see the last of the cycles (5nm) for
silicon. And that's only a few years away!

~~~
typon
You don't know if Intel's tech is ready. Whether Intel is a machine
manufacturer's customer is confidential, and the manufacturers can't disclose
it. I tried really hard to get an Intel guy at IEDM to tell me what they're
using for 10nm research, and he wouldn't say anything.

E-beam lithography has been the workhorse of research fabrication for decades
now. You can't get better resolution than E-beam lithography: the wavelength
of an electron at 5kV acceleration is something like 0.017 nm. There are some
crazy people trying to make E-beam systems for production processes (multiple
beams, etc.), but I don't think E-beam will ever see use outside of research.
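
(Quick sanity check on that 0.017 nm figure, using the non-relativistic de
Broglie relation; just a sketch with standard physical constants:)

    # De Broglie wavelength of an electron accelerated through V volts:
    #   lambda = h / sqrt(2 * m_e * e * V)   (non-relativistic)
    import math

    h = 6.626e-34    # Planck constant, J*s
    m_e = 9.109e-31  # electron mass, kg
    e = 1.602e-19    # elementary charge, C

    V = 5000  # 5kV acceleration
    lam = h / math.sqrt(2 * m_e * e * V)
    print(lam * 1e9)  # ~0.017 nm, matching the figure above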

I am working on a special type of transistor that is similar to this:
[http://arxiv.org/pdf/cond-mat/0401162.pdf](http://arxiv.org/pdf/cond-mat/0401162.pdf)

Graphene FET research is for peasants.

~~~
Keyframe
I can't know about Intel's tech, of course, but I can read what they are
saying:
[http://www.theregister.co.uk/2013/05/29/euv_lithography_stil...](http://www.theregister.co.uk/2013/05/29/euv_lithography_still_out_there/)
and where they are spending:
[http://spectrum.ieee.org/tech-talk/semiconductors/devices/in...](http://spectrum.ieee.org/tech-talk/semiconductors/devices/intel-invests-in-euv)
I can't find the link now, but somewhere I read an Intel guy saying they were
expecting production runs in 2017.

It might be due to my limited scope of knowledge, but how would one scale
E-beam to production capacity anyway?

Thanks for the link. I'll have to cross-read it with a lot of other material,
though, since it's 'a bit' over my head. I'm already lost at 1D and how it
relates to, well, anything. I still can't wrap my head around 1D geometry.

------
capkutay
There's so much hype over big data, analytics, machine learning, predictive
analytics, etc. In Silicon Valley, we're predisposed to think this disruption
is the result of startups, tech companies (Google/Yahoo certainly get a lot of
credit), and top engineering universities. But the truth is that the only
company effectively doing, and more importantly selling, this stuff at scale
is IBM. Everyone in the data management space is hoping to get just a piece of
IBM's pie. While we talk about the latest updates in Hadoop and Spark, IBM is
closing $50m deals with telcos, banks, and governments all over the world. If
you're a startup in analytics and you don't think of IBM as a competitor,
you're probably naive or your market isn't big enough.

~~~
yeukhon
I think the strategy is not to compete with IBM at first; there are plenty of
potential clients out there.

Banks and big governments won't switch their trust to a startup after working
with IBM for twenty years. A startup failing or being bought out is a real
risk. And even if you do make such a transition, it will take a long time;
maybe for small components you can switch within months.

I like PG's advice to startup founders: go out there, sell your product, do
the necessary customer support (demos, user training, troubleshooting, etc.)
and drop the hype, because big companies like IBM and VMware can sell their
products with ease. I think the right attitude is to start small, build a
reputation and make progress.

------
skrebbel
My pet theory is that IBM's goal is to make lots of money with boring IT
consultancy so they can spend it all on fun fundamental research.

~~~
porlw
Maybe Google should buy IBM.

~~~
capkutay
Does Google have $200b in cash?

~~~
eitally
Maybe they just buy IBM Research instead?

[http://www.research.ibm.com/](http://www.research.ibm.com/)

~~~
yeukhon
Why would Google be interested in IBM Research? Just offer the researchers a
more competitive salary and a few might leave IBM for Google.

------
jmspring
IBM has traditionally had a lot of very strong research labs, including IBM
Almaden. That said, they also have a reputation for souring relations with
academics and local researchers in the name of "budget" -- very similar to the
overall outsourcing regimen Cringely has outlined.

I'd like to know which of the labs are getting this money and what hiring they
are doing, contrasted with any downsizing they previously did.

~~~
gaze
Yeah, I dunno what's up with quantum computing. DiVincenzo isn't there
anymore.

------
NamTaf
Interesting. The article makes the following claim:

 _As the leader in advanced schemes that point beyond traditional silicon-
based computing, IBM holds over 500 patents for technologies that will drive
advancements at 7nm and beyond silicon — more than twice the nearest
competitor._

I would have guessed that Intel was the big player here in the low-level
hardware space. What has put IBM in that position, besides sheer age?

~~~
Hermel
Fundamental research. For example, Karl Alex Müller discovered
superconductivity while working at IBM's Swiss research lab and later won the
Nobel Prize for it
([http://www.uzh.ch/about/portrait/nobelprize_en.html](http://www.uzh.ch/about/portrait/nobelprize_en.html)).
They also invented the scanning tunneling microscope, among other discoveries
relevant at the nanoscale.

IBM is one of the very few companies that does fundamental research with a
time horizon of decades, not just one product cycle.

~~~
selimthegrim
You mean _high-temperature_ superconductivity. The regular stuff has been
known since the turn of the 20th century.

[http://en.wikipedia.org/wiki/Heike_Kamerlingh_Onnes](http://en.wikipedia.org/wiki/Heike_Kamerlingh_Onnes)

------
akuma73
The end of Moore's Law will have profound effects on the entire technology
industry.

What happens when computers stop getting faster?

There's no obvious successor to CMOS on the foreseeable horizon, so it's good
to see that IBM and others are searching for solutions.

~~~
eru
Moore's Law is not about speed per se. It's about the transistor count on the
chip that is cheapest per transistor. (If memory serves.)

~~~
boyaka
The reason Moore's Law worked so well is largely because of this. When we
began putting transistors on integrated circuits, the only things preventing
us from making them as small as they are today were cost and the ability of
fabs to make them reliable and stable. So ever since we started making them,
we've stepped the size down by about half each generation, rather than by
something smaller that would have been more difficult and expensive. The
entire manufacturing industry gradually became more efficient, increasing
reliability and reducing the cost of fabricating integrated circuits.

The history of performance speedups hasn't been solely due to doubling
transistor counts. Other architectural improvements in memory, buses,
parallelism, and multiple cores/threads have yielded performance doublings or
more. Memory access has always been a huge source of latency, and there are
lots of ways to improve it without making the transistors smaller.

We still have a lot of work to do! The majority of our computing has been on
the x86 architecture, which basically stems from the 8080, designed in the
mid-70s. We can still discover new and better architectures and materials.
It's just that the standard shrinkage of ICs that originally underpinned
Moore's Law has reached its limits.
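
As a toy illustration of that cadence (assuming the popular two-year doubling
period and the Intel 4004 as a baseline; the numbers are purely illustrative,
not from the comment above):

    # Toy Moore's-law projection: transistor count doubling every ~2 years.
    # Baseline: Intel 4004 (1971, ~2300 transistors). Illustrative only.
    baseline_year, baseline_count = 1971, 2300

    def projected_transistors(year, doubling_period=2):
        return baseline_count * 2 ** ((year - baseline_year) / doubling_period)

    for year in (1971, 1991, 2014):
        print(year, f"{projected_transistors(year):.3g}")
    # 2014 comes out around 7e9 -- the right order of magnitude for
    # contemporary high-end chips, which is all the "law" ever promised.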

------
chm
In what countries will they spend the money? Who can apply for grants?

~~~
anigbrowl
A reasonable question; I don't know why you were downvoted. Per the article,
the spending will be in the US and Europe, but I don't know which European
countries. If you're engaged in research, your best bet would be to contact
them through your institution.

~~~
chm
I am just beginning my studies (2nd year MSc), but my professor might be
interested!

------
magic_man
IBM just did massive layoffs in hardware. I used to visit the Essex Junction
plant a bunch, and they had a lot of layoffs there. I don't know how much
commitment management there has toward hardware.

~~~
johnward
There is probably more interest in developing the technology and then
licensing it. It seems like IBM doesn't want to be in the hardware market
unless it's high margin.

------
z3phyr
They are looking for digital neurosynaptic solutions. Traditionally these have
been analog. Can someone explain how that would work out?

~~~
aidenn0
I don't know about this case in particular, but these days just about any
analog circuit can be implemented digitally with better resolution, smaller
die area, and lower power consumption.

(In the analog domain, your resolution is limited by SNR; in the digital
domain, it's limited by the number of bits.)
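
(The usual rule of thumb tying the two together is roughly 6 dB of SNR per
bit; a minimal sketch of the ideal-quantization formula:)

    # Ideal quantization: SNR_dB = 6.02 * N + 1.76 for an N-bit converter,
    # so the effective number of bits (ENOB) = (SNR_dB - 1.76) / 6.02.
    def enob(snr_db):
        return (snr_db - 1.76) / 6.02

    print(enob(74))  # ~12 bits of effective resolution
    print(enob(98))  # ~16 bits -- audio-grade converter territory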

~~~
marcosdumay
Except for digital-to-analog and analog-to-digital converters.

As far as I know, our neurons are believed to have analog interfaces, so they
could not be interfaced directly by a digital circuit. If somebody found out
that they are digital, that would be very newsworthy, at least to me.

