
Ask HN: What are the most interesting emerging fields in computer science? - Norther
Hey HN,

What do you think is the most interesting emerging field in Computer Science? I'm interested in PhD areas, industry work, and movements in free software.
======
dbatten
Secure Multi-party Computation.

The basic idea is developing methods for two (or more) parties with sensitive
data to be able to compute some function of their data without having to
reveal the data to one another.

The classic example is developing an algorithm that allows two people to
figure out who is paid more without either revealing what their salary is.
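
(A toy sketch of the flavor involved, using additive secret sharing; this is my own illustration, not any particular MPC library. Comparison, as in the salary example, needs heavier machinery like garbled circuits, but summation shows the core idea in the semi-honest, no-collusion setting: three parties learn the total of their salaries, and hence the average, without anyone revealing their own number.)

    import random

    Q = 2**61 - 1                    # all arithmetic is mod a large prime

    def share(secret, n=3):
        """Split a value into n additive shares that sum to it mod Q."""
        shares = [random.randrange(Q) for _ in range(n - 1)]
        shares.append((secret - sum(shares)) % Q)
        return shares

    salaries = [70_000, 95_000, 120_000]   # each known to one party only
    all_shares = [share(s) for s in salaries]

    # Party i receives one share of every salary and publishes only the
    # sum of its shares; no individual salary can be reconstructed from
    # any single party's view.
    partial_sums = [sum(col) % Q for col in zip(*all_shares)]
    print(sum(partial_sums) % Q)           # 285000, the total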

Such algorithms get significantly more complicated if the threat model starts
changing from "we're all acting in good faith, but we just don't want to share
this private info" to "I'm not sure some of the people involved in this are
acting in good faith."

Based on my (admittedly limited) look into this field, it seems like there has
been some theoretical progress made here, but there's nothing like a
generalized framework or library for everyday development. Instead,
practical applications seem to be one-offs. For example, a contractor a while
back developed a system that lets parties (nation-states or private space
firms) figure out if their satellites are going to run into each other without
revealing anything about the location or orbit of their satellites. That way
they don't share sensitive data, but they can move their satellites if they're
on a collision course with somebody else.

Personally, I got interested in this when working for the government. I was
working on an extremely cool data integration project (State Longitudinal Data
System grant from the US Department of Education) that basically went nowhere
because we couldn't get over the legal hurdles to data sharing... If we didn't
have to share data, but could still compute interesting statistics about the
data, that would have been really cool.

~~~
numbsafari
This is a big deal.

If we can find ways to perform secure, multi-party computation, we could
develop fully distributed computational, networking, and power delivery
systems.

Your solar roof tiles could be, basically, CPUs or GPUs with embedded wireless
networking.

~~~
xtreme
Can you elaborate on why solar roof tiles need such intelligence, especially the
secure part? I don't see what data they would need to hide.

~~~
IpV8
Distributed power electronics are inherently dangerous. If you have a surplus
of power and your neighbor has a deficit, it is desirable for the two systems
to communicate to send power from you to the neighbor. If a bad actor can
leverage this system to tell everyone to send all of their power to the grid,
they could easily damage the infrastructure in a very costly way.

~~~
infogulch
Ok so the grid (that knows everything already) commands every house what to
do, using a pinned certificate to authenticate. What part of this needs to be
zero knowledge?

~~~
lgregg
It's also important to point out that the grid is archaic in many places and
knowledge of it isn't comprehensive across the system. In the US, we have three
major grids, which are also connected to Canada's (which I can't speak to).
[0] From my understanding, as someone with an amateur interest, we know a lot
about where those major grids connect but very little about the distributed
nodes that make up the network; the Northeast blackout of 2003 is a really
great example of this. [1] Essentially, a grid failure occurred in Ohio that
spilled over into the other sections of the grid. In short, a wire contacted a
tree in Ohio, which caused a cascading failure all the way to NYC.

So, let's now bring IoT into the mix. You and I have smart houses, with smart
solar tiles. John attacks our tiles plus all our neighbors' and directs a major
electrical spike towards our local substation. Now it's a physics question:
where are all those joules of energy going? It's a heat problem, and right now
there is no way to dissipate that heat from the system, so it will melt our
substation. Say our neighborhood sits between several other neighborhoods and
the main power station: we just killed a node, and guess who else loses power
because they don't have tiles like we do.

That's the basic reason why security and authenticity are important in
relation to the energy grid. It would be nice if there were an effective way
to dissipate an "electrical DDoS" (I'm not sure if it's called something
else). If you're interested in energy dissipation within the energy grid, this
is a great question on SE. [2]

All that said, it's also a major reason why the government is consistently
freaking out about our power grid being hacked.

[0]
[https://en.wikipedia.org/wiki/Continental_U.S._power_transmi...](https://en.wikipedia.org/wiki/Continental_U.S._power_transmission_grid)
[1]
[https://en.wikipedia.org/wiki/Northeast_blackout_of_2003](https://en.wikipedia.org/wiki/Northeast_blackout_of_2003)
[2]
[https://electronics.stackexchange.com/questions/117437/what-...](https://electronics.stackexchange.com/questions/117437/what-
happens-to-excess-energy-fed-into-the-power-grid)

------
resiros
I think the most interesting computer science fields are actually applications
of CS in other domains.

Science has changed a lot in recent decades, moving from a genius in a room
looking at the data and coming up with a grand theory to vast amounts of data
that no single human can make sense of. The work of the computer scientist is
to quickly understand problems from various fields, then solve them using
tailor-made algorithms that leverage the prior knowledge and the structure of
the data.

One such field (which I'm working on) is computational biology. We're working
on leveraging sparse experimental data for protein structure prediction. To do
that, we end up using algorithms and ideas from various CS fields, from
machine learning, to robotics, to distributed systems. Other people are
working on exciting areas like computational protein design and studying
drug-protein interactions in silico.

~~~
niklasd
Yes! I studied law before CS and now I learn all these algorithms which deal
with questions about how to do something efficiently – and these algorithms
are unknown to all the people thinking about important questions in that
field.

And I think this also applies to other fields. I gave the book "Algorithms To
Live By" (which is basically an overview of CS algorithms) to a medicine
student and he was immediately inspired and came up with ideas on how to apply
them to his research. CS algorithms capture such fundamental truths that I
think they should be more universally known.

~~~
terminalcommand
Slightly off-topic, but I wanted to ask why and when you started studying CS
after law.

I recently graduated from law school and am now an intern at a law firm. I
have a strong interest in CS, and it bothered me for a long time that I went
to law school instead of CS.

I've overcome those feelings over the years and dedicated myself to becoming a
lawyer. But your post caught my interest.

I'd be glad if you could share some of the story behind you studying cs after
getting your law degree.

~~~
niklasd
After my law degree I worked for a year at a big international law firm (I
didn't yet have my license, so I was a kind of trainee – similar to your
position right now). I realised that there is a huge interest in technical
solutions to make the work more efficient (often labeled "Legal Tech"). But
there was very little actual understanding of technology, which I think is one
of the reasons why there aren't yet many real-world applications that are
really making a difference. That was when I decided to go back to university
for a CS degree.

Some learnings so far: 1) I get great feedback on my decision from other
lawyers, who are generally very interested but not well versed in tech. 2)
Legal Tech feels a bit overhyped right now, but eventually it will change the
field drastically. Law firms need lawyers who have technical skills. And that
doesn't necessarily mean a whole CS degree; some programming skills etc. will
already do.

I personally love tech so much that I don't want to go back to a law firm to
practise law, but rather actually develop technology. But for you, if you want
to become a lawyer, I can promise you that you will find fertile ground for
your interest. It will soon be one of the most sought-after skills at law
firms. So if you learn some programming (maybe you already know some) and take
some online courses (there are great resources for CS online), then the next
time your law firm gets offered an ML tool (advertised as magic) or needs to
implement a new tech solution which really influences the workflow, you will
be the star of the firm for being a critical but competent colleague. Or if
you're starting your own law firm, I think there is great potential for a more
automated workflow. In your position, I would be very glad for your CS
interest – you're in the right field and it's the right time for it! :)

~~~
trampypizza
If you don't mind me asking - whereabouts (geographically) are you based? I'm
hoping to get into law after a couple of years working in technology but I
haven't been able to find that much information online about the meeting point
of law and technology and my searches haven't found me any communities for law
similar to HN for technology.

~~~
niklasd
Germany. If you look up Legal Tech meetups (often hosted by law firms) you
will find some law students/lawyers interested in tech. But it's still a small
community. And for real techies in law, that's an even smaller pool of people.

~~~
trampypizza
Cool, what you said about going to do another degree gave me the impression
you were probably based in Europe! I'm in the UK myself. Yeah I think that's
probably the way to go, thanks for the advice.

------
georgewsinger
Almost all of the answers on this list are not fields that are "emerging" but
fields that "have already emerged".

The ideal emerging field is one that's so obscure we haven't heard of it yet,
but so important that we will. If there are widely disseminated books on
Amazon about your field, it's not emerging. If there are hundreds of
professionals cranking out papers about your field, it's also not emerging.

Emerging fields are underrated and under-recognized. What are they?

~~~
xamuel
On the philosophical side, I recently published a paper which could
potentially lead to a whole new genre: making actual scientific (=falsifiable)
progress on the previously-ineffable question, "Do we live in a simulation?"

"A type of simulation which some experimental evidence suggests we don't live
in"
[https://philpapers.org/archive/ALEATO-6.pdf](https://philpapers.org/archive/ALEATO-6.pdf)

~~~
tlb
The x − x̂ property is very easy to avoid when building a simulator. Most
server-grade computers already use error correcting codes for their memory.
Or, the simulator could just abort and restart at a recent checkpoint if an
error is detected. It's possible to detect errors with arbitrarily low false-
negative rate for a small additional cost of computing and storing checksums.
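
(A minimal illustration, mine and not from the paper, with a cryptographic hash standing in for the checksum: the stored digest catches an injected bit flip, and a k-bit checksum in general misses a random corruption with probability about 2^-k.)

    import hashlib, random

    data = bytearray(b"simulated memory page")
    digest = hashlib.sha256(data).digest()          # stored checksum

    data[random.randrange(len(data))] ^= 0x01       # inject a single bit flip
    assert hashlib.sha256(data).digest() != digest  # corruption detected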

Nevertheless, it's an interesting observation that we can now easily do
experiments that demonstrate correct behavior of logic to the 10^-15 level. If
Descartes were looking for evidence of the fallibility of a daemon creating
his sense data, it would have been hard to demonstrate better than 10^-3 or
10^-4.

~~~
xamuel
You're right of course. Nevertheless there's a difference between saying "the
simulating computer probably uses error-correcting codes or something"
(speculation) vs. saying "an experiment suggests (same thing)" (science).

To borrow from Nick Bostrom: suppose we run two types of simulations.
Important simulations and un-important simulations. For the important sims, we
use error-correcting codes, we save checkpoint images, etc. For the
unimportant sims, we don't do those things, in order to save money. This
allows us to run far more unimportant sims than important sims. Thus, if
someone is incarnated randomly in one of the sims, it's probably one of the
cheap ones (just because there are more cheap sims than important sims, by
basic economics). The point is just to show that it is _possible_ for a
philosopher to argue against error-correcting codes etc. Indeed, if we leave
it to philosophers, we'll probably never make progress.

We need to appeal to the muse of science, that harsh mistress who serves us
cold hard facts, every single one of which throws 50% of philosophers out into
the darkness where there is wailing and gnashing of teeth :)

------
gota
Process Mining [1]. When I programmed a rather complex logistics simulator at
work, I told my coworkers 'whoever comes up with a way of instantiating a
simulator from data will be praised forever'. It turned out process discovery
is a thing (well, one of _the_ things in PM). And there's so much cool stuff
to do and being done. I'm now on the last stretch of my doctorate, researching
the mining of typical plans in non-competitive environments.
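
(To make "process discovery" concrete, a toy sketch of my own, with an invented event log: the first step in most discovery algorithms is counting the directly-follows relation between activities.)

    from collections import Counter

    log = [                    # each trace: ordered activities of one case
        ["register", "check", "approve", "pay"],
        ["register", "check", "reject"],
        ["register", "check", "approve", "pay"],
    ]

    dfg = Counter()            # directly-follows graph, edge -> frequency
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1

    for (a, b), n in sorted(dfg.items()):
        print(f"{a} -> {b}: {n}")   # e.g. check -> approve: 2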

[1]
[https://en.wikipedia.org/wiki/Process_mining](https://en.wikipedia.org/wiki/Process_mining)

~~~
janemanos
Have you ever had a look into analyzing processes with a graph database like
e.g. ArangoDB? I wonder if that would make sense for your needs. You can
traverse along the processes, find patterns, or use distributed graph
processing with Pregel to analyze from different angles.

~~~
whazor
It is not just the big scale of processes that makes process mining
interesting, but also the tooling that comes with the field. Look, for
example, at the tool Disco:
[https://www.youtube.com/watch?v=pmXZQhFSv10](https://www.youtube.com/watch?v=pmXZQhFSv10)

It provides automatic visualisation of graphs, analysis of bottlenecks, and
lots of analytics, while all you need is system logs linked to an id.

------
meuk
AI, machine learning, and neural networks are, of course, booming, but I
consider them to be hyped.

I consider type theory and formal verification to be more promising (but more
academic). Distributed systems and everything having to do with parallel
and/or high-performance systems is a good midway between what the industry
likes and what's interesting from an academic point of view.

~~~
Davidbrcz
Haha.

Formal verification has been around for 40-50 years, and we can't say it is a
wide success from an industrial point of view. It has some achievements in
terms of results/methods and projects checked, but on a daily basis pretty
much no one uses it. We are ages away from having every programmer understand
formal verification and from having all programs verified/proved.

Type theory is in a similar situation. Many issues in code could be solved
with basic typing algorithms, but people and companies favor languages with
poor/no typing (Python, JavaScript).

~~~
jacoblambda
The biggest barrier to adoption of formal verification that I have seen as
someone just starting in the field (working through Software Foundations and
have a number of projects planned with SPARK, Frama-C, and LiquidHaskell) is
the lack of groundwork. Verifying just your own code is complex enough as it
is, but working with libraries without any clear specification of their
interfaces and behaviours makes it so much harder.

I think there is real value in having verified libraries, or at least
libraries with well-defined specs, so that interfacing with other code isn't
so tedious. I think this issue is starting to be overcome with regard to the
usage of strong type systems. Truly strongly typed languages are finally
getting the libraries and communities built up so that they don't seem quite
as daunting.

~~~
bor0
To me, personally, the biggest barrier was lack of a proper introduction with
a _lot_ of examples.

I try to break this barrier a bit with my upcoming book: Gentle Introduction
to Dependent Types with Idris.

I am very interested in this area, but it is impossible for newcomers to get a
grasp of it without a lot of digging. Logical Foundations was OK, but I was
still missing the theoretical explanation ("why does this tactic work? it is
magic!").

So, with knowledge accumulated from IRC and forums, I hope to address this.

~~~
jacoblambda
Ooh I'll check this book out once I get through the pile of stuff I have right
now.

And as you noted there definitely is a lot of "magic" when it comes to the
inner workings of theorem proving tactics. I'm slowly figuring all of that out
but like you said it definitely takes time and digging at the moment.

------
stared
Various fields of Deep Learning. Right now: Reinforcement Learning.

See:
[https://www.forbes.com/sites/louiscolumbus/2018/01/12/10-cha...](https://www.forbes.com/sites/louiscolumbus/2018/01/12/10-charts-
that-will-change-your-perspective-on-artificial-intelligences-
growth/#38090e534758) or in general any other marker like NIPS submissions or
arXiv preprints on DL.

Of course the focus changes, and maybe in the next 2 years it will be on
something other than RL. But even Computer Vision is still a very vibrant
field, since its breakthrough in late 2012
([https://www.eff.org/ai/metrics](https://www.eff.org/ai/metrics)). The
majority of more traditional disciplines of CS had their breakthroughs a few
decades ago.

------
kmisiunas
DNA computing might be an interesting new domain [1]. The idea is to use DNA
as a memory, while using proteins/RNA as logic operators. This can provide
massive speed and efficiency gains, especially for optimisation problems that
need parallelization. Just consider that 4 bits of information on DNA take
only about 1 nm^3 of volume, whereas solid-state memory stores about 3
Tb/in^2, which is roughly equivalent to 10^7 nm^3.
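
(A back-of-the-envelope check of those figures; the ~12 um effective media thickness is my own assumption, chosen to reproduce the ~10^7 nm^3 figure, while everything else follows from the numbers above.)

    NM_PER_INCH = 2.54e7
    area_nm2 = NM_PER_INCH ** 2             # 1 in^2 expressed in nm^2
    bits = 3e12                             # 3 Tb/in^2
    footprint_4bits = 4 * area_nm2 / bits   # ~860 nm^2 of area for 4 bits
    thickness_nm = 12_000                   # assumed media thickness, ~12 um
    print(footprint_4bits * thickness_nm)   # ~1e7 nm^3, vs ~1 nm^3 for DNA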

To me it is still not clear how scalable DNA computing is, but there are
nice proofs of concept already [2].

[1]
[https://en.wikipedia.org/wiki/DNA_computing](https://en.wikipedia.org/wiki/DNA_computing)
[2]
[https://www.nature.com/articles/s41586-018-0289-6](https://www.nature.com/articles/s41586-018-0289-6)

~~~
Zaskoda
DNA computing is going to be huge. Nobody is talking about it but a handful of
people are slowly pushing it forward.

~~~
nojvek
I am willing to bet that both DNA computing and DNA manufacturing (organically
3D print things, but like how organisms grow) will be yuuuuuuge.

Not sure when it will have its internet moment, but the universe has been
doing this for a long time, and once we unlock its secrets, we become a wee
bit closer to Gods.

------
GolDDranks
Homomorphic encryption is a mind-blower. But I fear that we may never see it
in its fullest glory. It's going to be computationally too expensive or too
impractical for one reason or another. One can still hope.
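
(A tiny demonstration of the idea, using textbook RSA's multiplicative homomorphism; toy parameters, no padding, not secure, and fully homomorphic schemes are far more involved. Multiplying two ciphertexts yields a ciphertext of the product, computed without ever decrypting.)

    p, q, e = 61, 53, 17
    n = p * q                      # public modulus (3233)
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)            # private exponent (Python 3.8+)

    enc = lambda m: pow(m, e, n)   # textbook RSA, no padding
    dec = lambda c: pow(c, d, n)

    a, b = 7, 11
    assert dec(enc(a) * enc(b) % n) == (a * b) % n   # Enc(a)*Enc(b) ~ Enc(a*b)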

~~~
gnode
Aren't ring confidential transactions (RingCT), used in some cryptocurrencies,
a form of homomorphic encryption that is being applied now?

~~~
jacoblambda
Monero uses ring confidential transactions as of now, and Zcash's zkSNARKs take
advantage of some form of homomorphic encryption.

There are probably more but those are the ones off the top of my head.

------
QML
Algorithmic Game Theory [1]

Going into CS as an undergrad, I didn't anticipate the depth that the field
had in other domains -- and for some time, I wanted to double major in {math,
biology, economics} to supplement my education.

However, while in the algorithms course, I stumbled upon a connection between
linear programming and 2-player zero-sum games (the minimax theorem [2]). Up
to that point, I had never considered the idea of using a computational lens
to view problems outside of CS, such as "what is the complexity of computing a
Nash equilibrium?"

It turns out Algorithmic Game Theory can be applied to study the theory of
auctions (why does eBay use second-price auctions?) [3], tournament design
(why would a team purposely lose?) [4], or something as basic as routing (why
does building more roads lead to more congestion?).

[1]
[https://en.wikipedia.org/wiki/Algorithmic_game_theory](https://en.wikipedia.org/wiki/Algorithmic_game_theory)

[2]
[https://en.wikipedia.org/wiki/Minimax_theorem](https://en.wikipedia.org/wiki/Minimax_theorem)

[3]
[https://en.wikipedia.org/wiki/Auction_theory](https://en.wikipedia.org/wiki/Auction_theory)

[4]
[https://theory.stanford.edu/~tim/f13/l/l1.pdf](https://theory.stanford.edu/~tim/f13/l/l1.pdf)

------
dasmoth
I don't know for sure, but I certainly _hope_ we'll see some fresh thinking
about user interface design and construction. The past couple of decades seem
to have been substantially about recapitulating what came before in the web
browser, and while webification has its good sides (easier deployment), the
actual interfaces for data-entry type tasks still seem as clunky as ever.

AR is potentially an interesting sub-field, but doesn't seem to be the answer
for everything (e.g. those form-like data entry tools...)

~~~
TheOtherHobbes
I think UI progress is unlikely without good AI, and good AI has to be much
better than human to be passable.

(If you're not convinced, try watching how often you have to ask your fellow
humans what they meant by a communication and/or a request for information.
It's probably more often than you expect - but you give fellow humans a pass
because you're used to it, and so are they.)

Either that, or personal data has to be stored in a central server so it can
be accessed on demand by web apps - which would eliminate a lot of web forms,
but would have uncomfortable political and social implications.

There's still room to improve form-based pages, because there's still far too
little research into best practice. But forms are an efficient way to collect
information, so it's hard to imagine a secure and private UI paradigm that
would eliminate them altogether.

~~~
dasmoth
_Either that, or personal data has to be stored in a central server so it can
be accessed on demand by web apps - which would eliminate a lot of web forms,
but would have uncomfortable political and social implications._

That doesn't necessarily require centralisation. Web browsers have some form-
filling capabilities now, and that data can stay under end-user control.
Perhaps there's scope for building on something like this (although the growth
of, _e.g._ "social login" doesn't leave me too optimistic. That perhaps _does_
count as an example of UI innovation, although one which hasn't particularly
registered with me since I tend to avoid it).

_There's still room to improve form-based pages, because there's still far
too little research into best practice. But forms are an efficient way to
collect information, so it's hard to imagine a secure and private UI paradigm
that would eliminate them altogether._

Agreed. I don't see easy wins, but trying to make forms as good as they can be
seems a very worthwhile area of endeavour. I suspect part of this might be
trying _not_ to go too far in terms of baking "business rule" type stuff into
forms, which has a tendency to leave people in impossible states (thinking,
for instance, of academic grant systems which can end up with some very strong
assumptions about career paths built in).

------
deadalus
Deep Video Portraits -
[https://web.stanford.edu/~zollhoef/papers/SG2018_DeepVideo/p...](https://web.stanford.edu/~zollhoef/papers/SG2018_DeepVideo/page.html)

DeepFake Creation Tools -
[https://voat.co/v/DeepFake/2405562](https://voat.co/v/DeepFake/2405562)

Adobe Voco -
[https://en.wikipedia.org/wiki/Adobe_Voco](https://en.wikipedia.org/wiki/Adobe_Voco)

------
lldata
CRDTs look interesting and are new enough that there will be more to learn
about them. They seem like an important component in distributed systems
(i.e., almost all new systems).

[https://en.wikipedia.org/wiki/Conflict-
free_replicated_data_...](https://en.wikipedia.org/wiki/Conflict-
free_replicated_data_type)
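
(A minimal sketch of one of the simplest CRDTs, a grow-only counter; the class here is illustrative, not from a library. Merge is an element-wise max, which is commutative, associative and idempotent, so replicas converge no matter the order or duplication of merges.)

    class GCounter:
        def __init__(self, node_id, n_nodes):
            self.node_id = node_id
            self.counts = [0] * n_nodes

        def increment(self):
            self.counts[self.node_id] += 1

        def value(self):
            return sum(self.counts)

        def merge(self, other):
            # element-wise max: safe to apply in any order, any number of times
            self.counts = [max(a, b) for a, b in zip(self.counts, other.counts)]

    a, b = GCounter(0, 2), GCounter(1, 2)
    a.increment(); b.increment(); b.increment()
    a.merge(b); b.merge(a)
    assert a.value() == b.value() == 3   # both replicas agree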

------
Dowwie
Computational law. See:
[http://logic.stanford.edu/complaw/complaw.html](http://logic.stanford.edu/complaw/complaw.html)

~~~
lajarre
About the formalisation of laws in code (read: "write down laws as code"), I
wonder how far this can go (e.g. over a 10- or 20-year timespan). Any idea
which parts of law (notably public law) are the most suitable for applying
this principle? And how far can companies/startups go without the help of
governments in this venture?

Another note on "smart contracts" in the first sense, or as I would put it,
trusted computation as a way of executing multi-party agreements. This
approach is already in application in electronic markets, for example, and
public blockchains seem to be a way to bring it to the masses. But I think
it's still hard to say how this can interplay with "wet" decisions (involving
a judge or an arbitrator). That's probably one of the interesting questions in
this domain.

------
randcraw
Personally, I think higher order cognition in AI will be hot. Deep learning
has monetized the introduction of AI into numerous mainstream domains (e.g.
smart NLP search, vision, speech, game RL), which will motivate and underwrite
efforts to push AI beyond the shallow hacks of the past, finally breaking
through AI's brittleness problem.

Some problem domains are killer apps, like self-driving cars and, on
smartphones, personal digital assistants and verbal interfaces. There's no
stopping these initiatives. The only question is how far each can go without
moon-shot levels of investment. But between the economic interests of
especially Google and Apple in advancing their mobile devices, and the
military's in making weapons and intel as smart as possible, I'm convinced
there's enough critical mass for AI's pile to stay hot for a couple of decades
or more.

The trick is to avoid the roadblocks that today's academic agenda inflicts on
researchers by demanding they publish frequent shallow incremental novelties.
What's needed is 5-10 years to develop the infrastructure required to field
robust general reasoning with causation and rich knowledge bases.

------
chriswait
One interesting approach might be to work backwards from desired practical
applications: [https://www.gartner.com/smarterwithgartner/gartner-
top-10-st...](https://www.gartner.com/smarterwithgartner/gartner-
top-10-strategic-technology-trends-for-2018/)

------
ArtWomb
Within a year or two, we will see grad-level courses at top CS programs in
Chaos Engineering and SRE. Just as we've seen the addition of Distributed
Systems classes in the past few years with introductions to Zookeeper, Paxos,
BigTable, Raft and Spanner. There will be an explosion in academic work on the
science of "failure-injection" methods ;)

~~~
arithma
Supporting evidence: "Chaff Bugs: Deterring Attackers by Making Software
Buggier" [https://arxiv.org/abs/1808.00659](https://arxiv.org/abs/1808.00659)

------
akqu
There are currently huge opportunities in applied computing for people who can
break out of the status quo. There has never been such a big gap between what
technology can do and what technology culture does with it. Of course it isn't
easy, as it has also never been easier to waste time in technology.

------
jonbaer
I think anything "emerging" will come from the forms of unconventional
computing [1] ... ML/DL/AI are being rehashed on faster silicon hardware, I
wouldn't call it hype but it will be better applied to another form of
hardware - once it's realized. I personally think reversible computing [2]
(once understood) to make the most sense in terms of energy efficiency in CS
(much needed) ...

[1]
[https://en.wikipedia.org/wiki/Unconventional_computing](https://en.wikipedia.org/wiki/Unconventional_computing)

[2]
[https://en.wikipedia.org/wiki/Reversible_computing](https://en.wikipedia.org/wiki/Reversible_computing)

------
gnode
Out-of-order execution and caches are once again emerging fields,
unfortunately.

~~~
RantyDave
Emerging as in not leaking through side channels emerging?

~~~
gnode
Yup, that's what I was referring to.

------
therealmarv
Not so much science in this list, more personal and practical view:

As a web developer: WebAssembly

As a DevOp: Kubernetes

As a backend engineer: headless (CMS) API systems like Strapi or Wagtail

~~~
corpMaverick
"Drive mobile and Javascript front ends from wagtail's API"

What is the meaning of headless anyway ?

~~~
skfist
It's essentially a back-end-only content management system that makes content
accessible via a RESTful API.

According to Wikipedia, the term "headless" comes from the concept of chopping
the "head" (the front-end, i.e. the website) off the "body" (the back-end,
i.e. the content repository).
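
(In practice the "head" is then whatever you want. A minimal sketch of a front end pulling content over Wagtail's default /api/v2/ REST layout; the host and the blog.BlogPage model are made up.)

    import requests

    resp = requests.get(
        "https://cms.example.com/api/v2/pages/",
        params={"type": "blog.BlogPage", "fields": "title"},
    )
    for page in resp.json()["items"]:   # Wagtail wraps results in "items"
        print(page["title"])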

------
crawfordcomeaux
Help me create a field of human programming that's informed by computer
science? Below is a simplified description of some ideas informing what I do.
These days I'm more focused on language and behaviors in myself and my primary
relationship in preparation for our first child, so it'd be nice if someone
else started working on the theoretical stuff. I'm also down for informally
experimenting with things anyone comes up with from this.

Here's the basis:

Start with a category theoretical model connecting neuroanatomy and thought
(MENS, category theory and the hippocampus). Combine with the concepts of
universal embedding and fully abstract languages. Replace computers in the
previous sentence with computational model of human; I'm playing with modified
versions of differentiable neural computers and perceptual sets comprised of
beliefs, emotions, intentions, and behavior/thought patterns. Choose a human
language to mathematically hack into a strict subset of itself so it meets the
requirements for a target language in universal embedding. I suspect some form
of type theory might be needed for that; coeffects seem like they could be
useful, as well as quantitative type theory. Use yourself as the primary
experimental subject (ie. the test machine) to help guide things and don't
worry about reproducible results...trust that you're an ordinary human with
essentially the same cognitive functions as everyone else, for now. Explore
how this can impact human relationships. Discover ways to organize the self in
such a way as to more effectively organize at scale.

Teach the world how to program itself at an individual level.

~~~
AnimalMuppet
> Start with a category theoretical model connecting neuroanatomy and thought
> (MENS, category theory and the hippocampus).

Yeah... um... let us know when you've got that in a form that is really true
to neuroanatomy, really true to human thought, and really solid category
theory. I'm pretty sure you're not going to get there in this lifetime.

------
laser
Only because I don't yet see it mentioned, the one emerging field to rule them
all: program synthesis. :P

~~~
worldsayshi
Here's a somewhat relevant critique of that idea:
[http://www.commitstrip.com/en/2016/08/25/a-very-
comprehensiv...](http://www.commitstrip.com/en/2016/08/25/a-very-
comprehensive-and-precise-spec/)

Not saying that there isn't merit to the idea. Just saying that program
synthesis is more or less synonymous with programming language design when you
take into account the challenges involved.

------
montalbano
Zero knowledge proofs.

Secure execution environments.

------
jpamata
Graphical models [0] & probabilistic programming [1], with the latter making
it easier for developers to dive into this growing AI trend. Research in the
field has been steadily booming for the past decade, with more companies like
Microsoft leading the way. I recommend checking out some MOOCs [2] on
Coursera.

[0][http://www.computervisionblog.com/2015/04/deep-learning-
vs-p...](http://www.computervisionblog.com/2015/04/deep-learning-vs-
probabilistic.html)

[1][http://probabilistic-programming.org/wiki/Home](http://probabilistic-
programming.org/wiki/Home)

[2][https://www.coursera.org/specializations/probabilistic-
graph...](https://www.coursera.org/specializations/probabilistic-graphical-
models)
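
(For a feel of what "probabilistic programming" means without any library: a hand-rolled rejection sampler of my own that infers a coin's bias from made-up flips. A real PPL automates exactly this kind of conditioning, far more efficiently.)

    import random

    observed = [1, 1, 0, 1, 1, 1, 0, 1]      # made-up coin flips

    posterior = []
    while len(posterior) < 1000:
        p = random.random()                  # prior: Uniform(0, 1)
        sim = [1 if random.random() < p else 0 for _ in observed]
        if sim == observed:                  # condition on the data
            posterior.append(p)

    print(sum(posterior) / len(posterior))   # posterior mean, ~0.7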

------
onion2k
Functional correctness, formal verification and automated bug fixing.

------
vortico
Since you mentioned movements in free software, "open core" has been emerging
in the last 4 years as a viable way to do business while allowing other
individuals and companies to build onto your core platform while still being
able to monetize your work. For example, if you launch a startup specializing
in building foo, you can maintain a library called libfoo and sell a larger
foo application or foo plugins or foo services using the open-source library
you created.

------
deepaksurti
Open source computing hardware. RISC-V [0]

[0] [https://riscv.org/risc-v-foundation/](https://riscv.org/risc-v-
foundation/)

------
KaiserPro
Low-power sensors and associated networks.

It's hard to do and has lots of real-world applications.

------
msbroadf
Fully Homomorphic Encryption

------
otakucode
Amorphous computing has always seemed interesting to me. Computing with
emergent phenomena among scatterings of large numbers of unreliable simple
processors - like the sort of thing you could mix into paint. It's very young,
with lots of fundamentals to be worked out, but that's what makes it
interesting!

------
jfilter
Wide-scale adoption and promotion of open source software in the industry
(e.g. Microsoft, Facebook, Google)

------
simonhughes22
Adversarial Machine Learning. Fake news detection using ML. Integrating 'good
old-fashioned AI' ideas with modern ML techniques - to some extent AlphaGo
went in this direction. While I am glad AI has moved far more towards the
machine learning direction, I suspect the decades of AI research that preceded
it may come back in a form that is combined with more modern techniques in
some way. I see AlphaGo (and AlphaZero) as steps in that direction. Also,
applying deep learning to search engines and making that scale efficiently. I
suspect Google has partly solved this already but hasn't gone public with any
of the tech; that's pure speculation, though.

~~~
simonhughes22
I meant those as 4 separate areas. I don't think my post makes that clear.

------
mongol
Quantum computing?

~~~
adrianN
Yes. Once we build a scalable quantum computer, it will revolutionize so many
fields. Simulating chemistry would suddenly become practical. We could broaden
our understanding of biochemistry and materials science without designing
fickle experiments, just by simulating things. This would be a real game
changer and will probably lead to a bunch of breakthroughs on the way to
protein-based nanotechnology.

------
tabtab
An informal HN survey on what some feel is the future vs. over-hyped:
[https://news.ycombinator.com/item?id=17129481](https://news.ycombinator.com/item?id=17129481)

------
lazyjones
Swarm Computing. The hardware and networking to make it practical and useful
exist now, but the field is still in its infancy. There's some discussion
about its use in autonomous driving, construction, and warfare.

~~~
throwawaybbqed
What makes swarm computing different from distributed computing?

~~~
lazyjones
Swarm computing is about moving, cooperating devices with sensors, possibly AI
features. Distributed computing is just a minor aspect of it.

------
RantyDave
I am _certain_ there will be an emerging field in AI for engineering.
Suspension that 'learns' how to keep the car flat; buildings that start
shuffling warm air from a to b before it's needed ... things like that.
Programming is going to change from "explain how to do it" to "show it what
you want", and this has got to be a big deal.

~~~
engi_nerd
[https://en.wikipedia.org/wiki/Self-
levelling_suspension](https://en.wikipedia.org/wiki/Self-levelling_suspension)

[https://www.youtube.com/watch?v=eSi6J-QK1lw](https://www.youtube.com/watch?v=eSi6J-QK1lw)

You don't need AI to keep your car's body level. The technology has been
around in various forms for over 60 years.

So much of what people believe we need AI for is amenable to classical
engineering techniques.

~~~
RantyDave
Indeed, couldn't agree more. But quite possibly AI will prove simpler than
classical techniques, and more adaptable to e.g. changes in tyre pressure.

------
dalbasal
Just as an angle to answering the question (I don't have answers of my
own)....

What was the most interesting CS field(s) in 2008, 98, 88, etc?

~~~
nailer
88 (taking a stab here as I was pretty young):

DTP

OO

RISC CPUs

'graphics' (as in render farms)

98:

Linux, Apache, Mozilla, OSS in general

Perceptual audio compression: MP3 (layer3.org, MP3 vs TwinVQ; codecs created
during the period before Fraunhofer announced that the source code it had
uploaded to ISO without a license, which people had been building on for free,
in fact had a license and everyone owed them 10 grand).

2008:

Cloud

mobile (location in particular). Think Foursquare vs Gowalla vs Burbn, Grindr,
and other early mobile location-aware apps. App stores were popularised by
Apple that same year.

AJAX, Rails

blogging.

~~~
neilwilson
88

OSI network stacks. They were going to replace the 'old' Internet protocols.

Relational Databases, SQL and two phase commit.

Formal methods and verification.

~~~
sitkack
> Formal methods and verification.

Still emerging.

------
vikaskyadav
Computer Vision.

~~~
chriswait
Emerging for 50 years, and still going strong

~~~
RantyDave
Yes, but it works now.

------
doomjunky
Functional programming!

Functional programming languages have several classic features that are now
gradually being adopted by non-FP languages.

Lambda expressions [1] are one such feature, originating from FP languages
such as Standard ML (1984) or Haskell (1990), now implemented in C# 3.0
(2007), C++11 (2011), Java 8 (2014) and even JavaScript (ECMAScript 6, 2015).

Pattern matching [2] is another feature, now implemented in C# 7.0. My bet is
that Java and others will follow in their next versions.

Here is a list of FP features, some of which are already adopted by non-FP
languages: lambda expressions, higher-order functions, pattern matching,
currying, list comprehension, lazy evaluation, type classes, monads, no side
effects, tail recursion, generalized algebraic datatypes, type polymorphism,
higher-kinded types, first-class citizens, immutable variables.
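
(Two of the listed features as they look after landing in a non-FP language; here Python, which has had lambda expressions for ages and got structural pattern matching in 3.10.)

    from functools import reduce

    # Lambda expressions + a higher-order function
    total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0)   # 10

    # Structural pattern matching over tagged values (Python 3.10+)
    def area(shape):
        match shape:
            case ("circle", r):
                return 3.14159 * r * r
            case ("rect", w, h):
                return w * h
            case _:
                raise ValueError("unknown shape")

    print(total, area(("rect", 3, 4)))   # 10 12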

[1]
[https://en.wikipedia.org/wiki/Lambda_calculus](https://en.wikipedia.org/wiki/Lambda_calculus)

[2]
[https://en.wikipedia.org/wiki/Pattern_matching](https://en.wikipedia.org/wiki/Pattern_matching)

------
kitanata
The use of Lattice Boltzmann Equations to parallelize computation. It's
already being used in fluid dynamics simulations, but its applications are
pretty endless. I wouldn't be surprised to see LBM translated for use in
Machine Learning, Cognitive AI, NLP, Computational Biology, etc, etc.

------
amino
Programmatic identification and comprehension of morality in software
applications is a fascinating area!

------
k__
I think the core disciplines are the same; sometimes just some "updates"
happen, like AI or CV in recent years.

The big changes are happening in engineering (software and hardware).

Many things that were known for decades are now accessible for a broader
audience.

------
I_am_tiberius
Quantum computing and cryptography

------
azhenley
Human-computer interaction. The field is not new but the way people interact
with computers has drastically changed in the last ten years and will probably
continue to do so.

~~~
jacknews
And within that, Distributed/shared UI.

------
al_ramich
The hype is where the money is, and if you look at the established emerging
tech, AI and IoT are projected to get the most funding and create the most
disruption in the years to come: [https://uk.pcmag.com/feature/94662/blockchain-and-
robots-buz...](https://uk.pcmag.com/feature/94662/blockchain-and-robots-buzzy-
but-not-yet-vc-blockbusters)

The cutting-edge emerging tech, I feel, will be in the way we engage with data
and tech; Augmented Intelligence (assistive) will see huge advancements.

------
tugberkk
Internet of Things. It is still in development and there is lots of stuff to
work on.

~~~
lnsru
Just another hype. Sensors and the internet have been here for decades. The
internet is OK, but "things" are way too expensive and not reliable yet.

~~~
zaarn
IMO the best IoT is the DIY IoT, the kind that isn't really IoT but rather "I
put Wifi on a raspi and connected it to a PCB".

Thankfully the online resources around electronics are plentiful, and PCBs can
be had for under $10 incl. S&H.

That way I can make all my lighting IoT without having to deal with the
garbage of the IoT industry.

~~~
lnsru
Yes, IoT is great when you can DIY. But when you need to send a technician 2-4
times a year for each node... it’s a problem, not a business.

~~~
maccio92
It's a business for repair technicians :)

------
arisAlexis
Directed acyclic graphs for blockchain use. Homomorphic cryptography.

------
jklein11
Targeted Advertising

------
tastyham
Quantum Computing

------
fl0tingh0st
Computer Networks

~~~
frogdog55
This Internet thing is going to be huge-- but it's going to tear us apart.
Mark my words.

------
wellboy
Blockchain, though no one on hn really understands it. :D

~~~
SuddsMcDuff
It's an unfortunate characteristic of many in the cryptocurrency community
that they think anyone who doesn't support crypto simply doesn't understand
it. And by extension, as soon as they do understand it they will become
supporters.

No. There are those who do understand blockchain and still don't support it. A
great example is professor Jorge Stolfi. He is one of the more prominent
detractors, and yet he routinely displays a very thorough understanding of the
technology.

~~~
DoctorOetker
I tried looking up the arguments of Jorge Stolfi; however, my Spanish is
insufficient.

I don't claim to contradict that he is against the concept of cryptocurrencies
or blockchain in general, but I fail to find evidence that he is against the
technology in general.

I do find evidence that he is opposed to Bitcoin specifically, or at least
warns against it.

Could you point me to English writings where he argues against
blockchain/cryptocurrency in general?

~~~
laken
Perhaps the reason why your Spanish was insufficient for his writings is that
his writings are in Portuguese ;)

Here's his primary English writing on Bitcoin and cryptocurrency in general
(not necessarily blockchain), sent to the SEC:
[https://www.sec.gov/comments/sr-
batsbzx-2016-30/batsbzx20163...](https://www.sec.gov/comments/sr-
batsbzx-2016-30/batsbzx201630-2.htm)

~~~
DoctorOetker
Thanks for pointing out it's Portuguese!

This critique of Bitcoin is quite short and seems directed at Bitcoin in
particular; in no way do I conclude that he is against the concept of
blockchain (say, non-currency uses), or even against cryptocurrencies that do
not take on some of the Ponzi aspects.

After reading this I can perfectly imagine (but do not claim so) that he might
support certain other forms of blockchains and/or cryptocurrency...

~~~
Tesl
This is his reddit account:

[https://www.reddit.com/user/jstolfi](https://www.reddit.com/user/jstolfi)

He primarily posts in buttcoin, a sub that exists to mock bitcoiners. It's
pretty fair to say he thinks all coins are crap, not just bitcoin.

Also, please include me in the "understands cryptocurrencies and yet doesn't
support them" bucket :)

