What are emerging/promising areas of tech to get into today?
52 points by jkka on Sept 3, 2023 | 65 comments
What fields in tech are going to be prevalent, disruptors or gain a foothold in business or consumer markets?

I have seen that people on forums like this one and stackoverflow are usually able to predict the next big things in tech quite well. Like this thread:

https://news.ycombinator.com/item?id=21324768&p=2

So, what CS areas would be good to invest time in learning for the near future?



Construction technology. Design software for horizontal construction is astonishingly bad. The PDF is still the "product" in an era where GPS and automated machine guidance are the norm. We're decades overdue on rethinking project delivery from first principles in the context of modern practices, and the inefficiency costs billions. A single bill or FHWA policy memo on open data standards for digital project delivery will be the starting gun for an explosion of disruption in this market. No telling who will win, but it won't be Bentley, and I doubt Autodesk can innovate fast enough.


I can't predict what the future will bring in this sector, but I'm in the process of building a house and had to go through the architectural and planning process (by hiring professionals, of course, not going at it on my own). I am surprised at how advanced things are in building construction, e.g. with real-time rendering and Unreal Engine 5 integration into Autodesk Revit; making changes and seeing them instantly with photorealism feels like science fiction. All material quantities are calculated automatically, which makes cost estimations much more accurate and lets you play with options and iterate designs in a way that wasn't possible before.

Of course construction is not only about buildings, but for someone with no experience in this sector I was caught off guard, expecting to see old-fashioned blueprints.


All that could very well be true. The important question is: Is there anyone who is prepared to spend money to fix any of this?

Put another way: It doesn’t matter if you have a product which would help people and save them money, if people, for whatever reason, aren’t inclined to buy your product.


Bentley has a market cap of around $15B. It's worth something to somebody. Their only serious market is DOT work and their position is attributable to administrative inertia and aggressive vendor lock-in.

The biggest problem isn't making better software, which already exists, but getting DOTs to stop making Bentley's proprietary file type a requirement for deliverables. The second problem is realizing that the CAD approach of imitating a paper process from last century isn't actually necessary.


Current owners, operators, and tenants. Buildings need to work for the occupants. Even in the best scenarios this takes ongoing upgrades and maintenance. In some cases buildings need to be significantly redesigned to provide for the needs of current tenants. This means that the whole construction sector needs the hardware equivalent of continuous integration, based on the idea that construction is only the start of a process of keeping a building operational for its service lifetime.


It’s the same classic problem when selling business products; the buyers are usually not the users, so no user-friendly features are created; only features which would look good in a demo to the buyers.

Therefore, no products which only contain features which help users (and save them money) can be a success, since the buyers won’t be impressed by it, and will not pay for it.


It's also quite fun! There's a lot of computational geometry involved and you have a chance to positively affect the built environment around you.

We're quite close now to delivering the final version of IFC4.3, the open standard for BIM (building information modelling), now including semantic modelling of long linear infrastructure: https://ifc43-docs.standards.buildingsmart.org/ We're eager to replace your PDF.


Since you are an insider, would you please shed some light on a personal question I've had for a while? Is the reason the IFC specification is so extremely complicated because they're trying to map nicely to an object model that's already in Autodesk products?

I read a proposal a while ago to simplify an open BIM spec down to meshes for solids and a generic dictionary for attaching property data. That struck me as very sane and very easy to implement. You could have a free product like Blender spit out a BIM-compliant deliverable at minimal cost. Apparently some higher-ups at Autodesk talked to the author and he came to understand the error of his ways.


This is a difficult question, and the history of all of this predates my involvement.

Indeed, a fair number of people believe that IFC is intentionally, unnecessarily complex in order to limit interoperability and retain a monopoly.

Over time, I've come to believe Hanlon's razor is also in play here: it's more a matter of poorly understood use cases, academic ideas blowing up the scope, and inherited obscure schema and serialization tech.

Meshes + metadata can facilitate most of the use cases in industry today, which are coordination, interference checks, and visualization.
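As a toy illustration of the meshes-plus-metadata baseline discussed above (the names here are invented for the sketch, not actual IFC entities): a building element is just a triangle mesh plus a free-form property dictionary, and quantities like surface area fall out of the geometry alone.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """A building element as a bare mesh plus a property dictionary."""
    name: str
    vertices: list[tuple[float, float, float]]  # xyz coordinates
    faces: list[tuple[int, int, int]]           # triangle indices into vertices
    properties: dict[str, object] = field(default_factory=dict)

def surface_area(el: Element) -> float:
    """Sum of triangle areas via the cross-product formula."""
    total = 0.0
    for i, j, k in el.faces:
        ax, ay, az = el.vertices[i]
        bx, by, bz = el.vertices[j]
        cx, cy, cz = el.vertices[k]
        ux, uy, uz = bx - ax, by - ay, bz - az
        vx, vy, vz = cx - ax, cy - ay, cz - az
        nx = uy * vz - uz * vy
        ny = uz * vx - ux * vz
        nz = ux * vy - uy * vx
        total += 0.5 * (nx * nx + ny * ny + nz * nz) ** 0.5
    return total

# A 1m x 1m wall panel made of two triangles, with arbitrary metadata attached.
wall = Element(
    name="Wall-001",
    vertices=[(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)],
    faces=[(0, 1, 2), (0, 2, 3)],
    properties={"material": "concrete", "fire_rating": "EI60"},
)
print(surface_area(wall))  # 1.0
```

Note what this sketch can't answer: the wall's axis, its openings, or a door's swing direction, which is exactly the extra semantic information discussed next.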

But at the same time, there's some pretty compelling use of the standard that requires slightly more semantic geometry descriptions:

- calculating geometric quantities according to local norms, which requires some additional geometric knowledge of things like openings and the axis of a wall, for example

- the opening direction of doors is often good to know, but this is just a convention on the local transformation matrix, so it could be just meshes

- steel manufacturing can derive quite a bit of information from the IFC geometry, like parametric cross-section profiles and where to drill for bolt holes, and with which diameter

And then there is the ultimate end goal, according to some: being able to exchange all parametric and constraint information from the native model. But this is still quite far out.

The challenge for future editions of the spec is to better align the required complexity of certain use cases with a more modular spec, so that you depend on a more appropriate amount of complexity.

Btw, since you mention Blender: the BlenderBIM add-on https://blenderbim.org/ is actually one of the most avid users of complex and parametric constructs in IFC :)


IFCOpenShell was news to me. Thanks for sharing. You're doing God's work.


If everyone is saying AI/LLM then that ship already sailed.


Ikr, I don't know why that is being mentioned. If someone had suggested AI five years ago, it would have been emerging then.


Tech works in cycles, including AI. What's hot today (this year) will be less hot tomorrow (next year) but hot again in a week (N years). Invest in next week by spending your today/tomorrow learning about it.


1. Genetics and genomics. The researchers I have worked with closely in the past have been outstanding scientists who have highlighted huge room for computer engineers to step in and reap the benefits of applying software engineering, and to a large extent algorithms, to large-scale problems in the quantitative aspects of genetics and genomics. To give a broad example, such methods are used in the breeding industry to find suitable mating bulls for cows to achieve a certain goal/feature: milk production, beef, even reduction of methane in their offspring. I can only imagine the need for such methods in food and medicine, which are huge industries.

2. Spatial computing: if Apple's foray into it has validated anything, it is that it is just getting started and holds a lot of unrealized potential.

3. Robotics: it has been important for a long while, but its trajectory is still upward and its reach very promising.


In your given example (I know it's more complicated than that), it is basically an optimization algorithm, right?


I unfortunately don’t have the detailed picture, only a layman’s view of several of the problems they were working on. From what I understood, a lot of their research involved pattern recognition in strings of about 4-6 GiB, combinatorics of smaller patterns, classification across multiple samples, etc. I kept thinking these are elegantly solved problems in other domains, and if one were to possess both the algorithms skills and the knowledge of what it is they are looking for, it could click for a win.
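The kind of pattern recognition described above is often attacked with seed-and-verify indexing. A toy sketch (real genomics tools use far more sophisticated structures like suffix arrays and FM-indexes; the sequence and function names here are made up):

```python
from collections import defaultdict

def kmer_index(genome: str, k: int) -> dict[str, list[int]]:
    """Index every k-mer's start positions; one linear pass over the genome."""
    index = defaultdict(list)
    for i in range(len(genome) - k + 1):
        index[genome[i:i + k]].append(i)
    return index

def find_pattern(pattern: str, index: dict[str, list[int]], genome: str) -> list[int]:
    """Seed with the pattern's first k-mer, then verify full matches.
    Assumes the pattern is at least k characters long."""
    k = len(next(iter(index)))
    seeds = index.get(pattern[:k], [])
    return [pos for pos in seeds if genome[pos:pos + len(pattern)] == pattern]

genome = "ACGTACGTGACCTTACGT"
idx = kmer_index(genome, k=4)
print(find_pattern("ACGTG", idx, genome))  # [4]
```

Building the index once makes repeated pattern queries cheap, which is what makes the approach attractive on multi-GiB strings.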


• Spatial Computing

This is definitely an emerging field that is trying to develop market share. Unclear if the tech is really there in the long haul for comfortable spatial computing though. What you're seeing now is literally akin to something like the Apple I being released. We have a LONG road to go, but there's very little doubt that the end goal is something powerful.

• AI

Duh? It's useful now, and likely will only get better. No question here that sharpening your AI skills is a safe bet.

• Synthetics

Lots of work going on quietly around synthetic engineering, BCIs, etc. If you're interested in the human-body interface, then this is a field that has some interesting problems but, like spatial computing, probably won't come to fruition for many decades.

• Security

Always useful and always changing. A no brainer if you're looking to skill up in a specific computer science niche.


Glasses-free spatial computers you might not be familiar with:

http://voxon.co (disclosure: founded the company)

http://look.glass (disclosure: worked at the company)

http://lightfieldlabs.com

http://leiainc.com


Wow! These companies are building awesome tech!

Has demand for 3D tech like this increased since you started Voxon?


Drones drones drones drones drones.

The war in Ukraine has transformed drone tech. Ukraine alone is burning 20k to 30k drones a month. The military applications are leading to huge investments; economies of scale are leading to rapid improvements in tech and reduction in costs.


What are the most promising commercial startups in the space in your opinion?


Not really answering the question, but I'm most interested in mine-clearing drones:

https://nitter.net/Tatarigami_UA/status/1692962111675138074#...

A drone with a FLIR camera can do a great job of sweeping for UXO and mines, which will be big business (it would take every mine-clearing team on the planet 750 years to clear Ukraine with current non-drone tech).


I’ve been chasing security as a pseudo specialty. Being able to design a secure “thing” or service is becoming increasingly important. Not many security engineers actually have a software engineering background.


The CHIPS Act is going to have a huge impact on software-hardware interfaces going forward. Anyone who can work at the lowest levels of abstraction is going to see opportunities that most others will miss.


I really hope so. There's so much potential for this money to expand the pot hugely.

But there's also so much risk it all gets captured & eaten by the existing titans, and that we don't make any gains begetting a new "silicon foundry" where new folks & new ideas get created. There's so much risk that chipmaking remains rarefied and arcane, in which case this money doesn't really help the world much.


I don't know much about the CHIPS Act and its implications. Could you ELI5?


One of, if not the, first applications of the transistor was in missile guidance. China developed enough vertical integration and global consolidation in the chip industry to become a national security threat. Congress responded by allocating $280 billion to bring the at-risk aspects of design and development back to domestic soil. There is more nuance but that’s the gist.


ERP is waiting for a disruptor.

I wish I knew how to code, or just had the balls to quit my job and start a company. Having worked for years in various companies, I see that no ERP handles the usual problems.


I think many firms have already started entering the ERP space, no?

Also, could you mention some of the usual problems ERP cannot handle?


What are the usual problems that you see?


The systems aren’t built to handle the real world. The issue isn’t really technical though, at least not in my opinion. We manage a lot of companies in a lot of different countries, and the best fit for us was Microsoft’s BC365, which is frankly still a horrible fit. The issues are mainly related to legislation though, and I’m not sure how you could really plan on disrupting that part of company management even if you wanted to.


Is it because the systems were mostly designed in the '80s/'90s and are just outdated?

I guess you’d have to provide a solid reason for management to invest in something new, so improved UX won’t cut it. AI/automation might, given that we potentially face massive shortages of admin people, looking at western societies. But it’s hard to see a path into companies to even start with this.


I think it’s mainly because of the myriad of rules and laws surrounding budgeting. Like, we’re using BC365, right, but we’re also paying a 3rd-party consultant to build some custom APIs into BC365 for us, because the laws needed to “manage” Danish companies require more features than BC365 provides. SAP would be able to handle some of those better, but SAP is sort of this thing that takes over the way you do ERP. And it’s still not really built to handle our use case of managing many companies in many different countries.

There is also the issue of how you’d likely want payment systems to integrate with your ERP system in the year 2023. But payments are really two separate things: one deals with banking and managing bank accounts, and one deals with the transfer of payments. You’d think it should all be just one system (I do), but the rules and laws are so complex that there are three, because an entire enterprise-sized company can specialise in just one of those areas due to the complexity of the laws. I’ll give you one example: in Italy, every invoice ("faktura" in Danish) has to pass through a national registry. This is mafia prevention, and frankly a rather good idea. It also means that a budget system in Italy needs functionality to handle the flow of obtaining SDI approval (the Italian e-invoicing registry). And that’s just one small tip of the iceberg of complexity you’re dealing with.

Automation is a whole other topic. That is probably ripe for disruption, but it’s already been sort of disrupted by Robotic Process Automation. Which again has its own sets of issues, because some of the actions that you would like to automate can’t be. I forget which EU country made it illegal to approve payments automatically, but ideally you’d like automation on the outgoing payments of the transactions you’ve already got approved on your books. So you can in most countries, and not in others, because of the law. Sometimes there is also the need for physical two-factor verification to log into things. In Denmark, your systems can’t log into company bank accounts unless you build a robot to press a button and read a number on a little IoT device.


I share this sentiment, what do you think should be the first part to improve?


My son (15) has expressed interest in studying materials science. While he's still years away from a firm decision, it doesn't seem like a bad path to me.


Seems like most true technology breakthroughs come from materials science. It is much harder to get the benefits of those advances though, since the path from materials science to product can be long and complicated.


What is a "true" technology breakthrough compared to a "not true" technology breakthrough?


I'll leave "true" up to someone else, but in my opinion "not true" is all the stuff that's just a gimmick. Someone might think of the nail puller on the side of a Craftsman hammer to be "an innovation", but to me it's just a gimmick. It's useless compared to anything actually designed to pull nails, and it was added for the sole purpose of making people go "I don't have one of those, I need to buy one".

I think a good 50% of "innovations" under capitalism fall into this category. They're only thought of as "innovations" because the parasites who make a profit from it are the ones in charge of headlines.


Computing is ripe for disruption. I think it's high time for computing itself to be the next big field in tech.

The various industries have deeply balkanized the field & expertly extract value with endless apps that primarily capture value for the app maker, while barely creating value for the user. We are in a late-appification stage.

I think the biggest value area of the future are undoing the damage of all this disruption & market capture - of undoing all these apps! - & building strong principled powerful general computing systems.

This is targeted at data scientists far more than general computing, but I like how broadly this post expands the problem of what data-science computing is: The Road to Composable Data Systems: Thoughts on the Last 15 Years and the Future. That unpacking of concerns, understanding & building the general substrate, calls powerfully to me. I think even more general computing forms lie in this direction, ones that expand how all computing by all users would work, in a way that better augments the user's intellect. https://wesmckinney.com/blog/looking-back-15-years/ https://news.ycombinator.com/item?id=37367236

Figuring out how to make general progress that's even more accessible, less specialist & more "just how this computer works", in a way that everyone can potentially see & touch, is the valuable work that is almost entirely ignored & unworked on now. It's a colossal opportunity.


Focus on the basics and seek what’s profound. For example, you might learn a ton from implementing your own computer algebra system, and the code is extremely valuable, but also extremely basic. The difference between 10x engineers and average ones is a tolerance for busy boring work like strict TDD (test driven development).

Learn about strength training and cardio. Taking care of your body pays dividends long term.


Honestly?

There is so much unbroken ground in tech, especially point solutions with existing web technologies that haven't been built or have only just started to grow. LLMs are just icing in the grand scheme of things (especially in Business Process Management), along with other ML models.

I would say build apps and think about where it's appropriate to make a call to an LLM like GPT-4 or a TensorFlow.js model, and then decide for yourself.


AI for sure. I’d steer clear of prompt engineering though - it’s hot right now, but I expect it’ll be a flash in the pan. Whichever areas you pick will almost certainly be disrupted by AI, so stay on top of it. Generative AI seems like a safe place to invest in.

Biotech is heating up. Sure it’s not pure CS, but there’s plenty of crossover. Doing computer vision on the glossy insides of a person is a serious CS challenge for example. Its disruptions are slow but possibly larger than other areas. Intuitive’s surgery robots are everywhere you look now for example.

Augmented reality. Apple’s Vision Pro is a big validation of this direction. We haven’t figured out what the killer apps here are yet and I expect we’ll see a similar gold rush and set of disruptions like smartphones as folks figure it out.

Self driving cars. These were way over promised but we’re getting past the hype. I’d expect to see robust (probably geofenced) Level 4 systems by the end of the decade.


LLM-ops, distributed systems, high-volume data storage, better logs storage/observability across distributed systems, private personalization, AR (Vision Pro), VR (when it becomes less of a box on your face and more of just glasses).

These are just off the top of my head so take it with a grain of salt.


Manufacturing MRP. Every other manufacturing company has its own way of doing things, and none of the MRP solutions on the market are flexible enough to work for these companies.


Googling MRP, is it material requirements planning? Can you give some examples of existing big players and their shortcomings?


Yes, that's the one. I'm not going to go through all of the players, but take a look at Katana MRP. They have a very rigid way of defining what an order is, what a product is, etc. Very specifically, they only allow one layer of subassembly, whereas at least two manufacturers I know have at least two levels of subassemblies. They expect their users to adhere to their process, not the other way around.
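The multi-level subassembly limitation is easy to illustrate: a bill of materials is naturally recursive, so a one-level-only tool can't answer "how many bearings does one finished unit need?". A toy sketch (product names and quantities are invented):

```python
from collections import defaultdict

# A bill of materials with two levels of subassembly: the finished bike
# contains a wheel assembly, which itself contains a hub assembly.
bom = {
    "bike": [("frame", 1), ("wheel_assembly", 2)],
    "wheel_assembly": [("rim", 1), ("spoke", 32), ("hub_assembly", 1)],
    "hub_assembly": [("hub_shell", 1), ("bearing", 2)],
}

def explode(product: str, qty: int, totals=None) -> dict[str, int]:
    """Recursively expand nested subassemblies into raw part counts."""
    if totals is None:
        totals = defaultdict(int)
    for part, n in bom.get(product, []):
        if part in bom:          # it's a subassembly: recurse with scaled qty
            explode(part, qty * n, totals)
        else:                    # it's a leaf part: accumulate
            totals[part] += qty * n
    return totals

print(dict(explode("bike", 1)))
# {'frame': 1, 'rim': 2, 'spoke': 64, 'hub_shell': 2, 'bearing': 4}
```

A system that hard-codes one layer of nesting simply can't represent the middle level here, which is presumably the kind of mismatch the parent comment is describing.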


Anything revolutionary you are aware of? Knowing that we will need one soon, and dreading the usual options, I’m ready for a revolution.


Nothing that I'm aware of, no. The reason this is on my radar is I'm working part time for a manufacturing company for which the existing MRP solutions don't work.

Do you mind if I bend your ear about what your needs are? If so, send me an email. It's in my profile.


There's a ton of interesting things to do and important problems to be solved in information security. Think of this as the ecosystem of safety in software and systems.


Could you elaborate?


I would look at areas of technology that can help with the societal shift towards having greater numbers of elderly people living longer and wanting to maintain a higher standard of living.

Most western societies have an age curve moving to the right. Lots of opportunities with AI and disrupting the user experience for the elderly. I have relatives who can't use a touch screen because of arthritis. Bonus: they have money.


Remember when Steve Jobs introduced the iPhone and its own appstore? And how developing mobile apps is now its own field?

A new interface is ripe for development. Once a user is used to such a clear and intuitive UI, they can't go back. And GPT technology is exactly that. ChatGPT is the herald of the new era UI.


I am confused. What is the mobile app-equivalent for GPTech?


Synthetic food.

I think good synthetic meat etc is still some way in the future, but things like milk, juices, pulps, and animal feed should be easier and make even more sense.

Why grow a whole tree when you just want OJ, or a whole cow when you just want its breast secretions, etc.?


Broadly speaking, my bet is on three to four major ones:

* AI

* Cryptocurrency

* Solar

* Biotech/Bioinformatics

Doing some back of the envelope calculations you can get a rough guess that we'll have "human level AI" by 2030 or so. The current trend of AI producing spectacular results will, in my opinion, not cease but continue to pick up pace.

Back-of-the-envelope estimates can show cryptocurrency holding, conservatively, 20% of the world's wealth by 2030. I suspect the growth of cryptocurrency won't be independent of other technologies. Cryptocurrency gets a lot of flak here and in other places, but I'll remind people, yet again, that many of the critiques against cryptocurrency are eerily similar to critiques of the internet, email, the world wide web, and social networks in the late 1990s and early 2000s.

Back-of-the-envelope estimates can show solar producing over half of the world's energy needs by 2040. In addition to solar panels, there's battery technology, microgrids, etc., so this is really about which technologies are rolled out to satisfy the 2.5% yearly energy usage growth rate. I don't see any other technology that has "Moore's law"-like behavior in terms of the energy harvested vs. the energy invested other than solar and battery technology.
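One way to run that kind of back-of-the-envelope estimate: compound an exponentially growing energy source against total demand that also grows exponentially, and count years until a target share is reached. The growth rates below are illustrative assumptions for the sketch, not the commenter's actual figures.

```python
def years_to_share(initial_share: float, tech_growth: float,
                   demand_growth: float, target_share: float) -> int:
    """Years until an exponentially growing source reaches a target share
    of total demand that itself grows at a (slower) exponential rate."""
    share, years = initial_share, 0
    while share < target_share:
        share *= (1 + tech_growth) / (1 + demand_growth)
        years += 1
    return years

# Illustrative assumptions: solar at ~5% of supply, growing ~25%/yr,
# against total energy demand growing ~2.5%/yr (the figure cited above).
print(years_to_share(0.05, 0.25, 0.025, 0.5))  # 12 (years to a 50% share)
```

The punchline of such estimates is that the answer is dominated by the growth-rate gap: as long as the source compounds much faster than demand, the crossover arrives within a decade or two almost regardless of the starting share.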

Biotech and bioinformatics will have profound consequences for health, longevity, and a host of other issues, but I don't have a good sense of what the major innovations are that need to happen before this becomes widely adopted. I suspect AI will help with bio-engineering crops, drugs, and food, and help with general medicine, but I don't quite know what that looks like. We silently hit roughly $100 whole-genome sequencing [0], so it's progressing; I just don't know what rough goals to predict or what to look out for.

All of the above are relying on a sort of "generalized Moore's law" in that the reason the innovation and adoption is so quick is because cost is dropping exponentially.

I created a small post about it with some simple justifications for where the timelines come from [1].

[0] https://nebula.org/whole-genome-sequencing-dna-test/

[1] https://mechaelephant.com/dev/Future-Predictions.html


I think powerful AI programs and models running in the browser have to be a promising area to work on, especially with technologies like WASM and the Web APIs for using it becoming prevalent.


Think of areas that can benefit from "AI assist", pick a domain where you think you can add the most value, and go all in on mastering the intersection.


Military spending is probably going to be given a hefty boost as tensions ramp up with China and Russia. Dronetech will probably be a large chunk of that.


I’ll throw in anti-drone tech too. Countermeasures are where the money is!


My favorite is chocola-TECH-ip cookie


How about drastically reducing unneeded complexity and the de-bureaucratisation of software?

Although I don't know how that matches anyone's agenda: "software eating the world" is exactly feeding on these perpetuum mobiles..


"Pessimists sound smart, but optimists make money," is an adage I believe, so I've been trying to mine pessimism into contrapositive opportunities based on who benefits from it, so take these with a giant bag of salt as they are trying to derive optimism from dystopian views, but:

- (enterprisey languages and platforms, identity management) I don't think consumer tech mints any new unicorns or platforms for a while; as a discretionary cost, I think there's a consumer tech winter starting.

- (graphs, GANs, LLMs, forum tech, API aggregators) When I look anecdotally at where the money is (institutions, endowments, PACs), where it isn't (consumers), and where it is going (favored causes and mechanisms to secure political levers), the upside goes to an emerging class who works in moderation / trust and safety, campaign management, PR, ad tech/surveillance, and gov tech consulting. Like marketing, but for shifting narrative alignment and "funding," instead of discovering customer desire. "Influence hacking" is a thing.

- (not comp.sci, but likely trend) I joke that the biggest bubble over the next decade will be weekend vacation property and renovations near government towns. All them comrades gonna need dachas.

- (identity management, AI model alignment, software attestation) The mood of money in the economy now is being applied to converting frothy QE cash into political influence instead of being invested in innovation. Model alignment, model authenticity, and the ability to influence will be valuable.

- (next gen browser tech, virtualization, webasm, react, scripting frameworks) Social platforms are optimizing for hollowing out the value they provided instead of making new products people want, so there are unlimited dollars for anything that sustains their business model a bit longer. Social is the new legacy business model, and they will spend to sustain it the way banks and credit card companies spend to maintain their oligopolies. This means sustaining legacy tech features in browsers that enabled it.

- (PowerBI, python in Excel makes you a wizard) I'd bet on incentives for additional governance roles, where firms will get ESG and tax incentives to hire low tech-skill party affiliates, creating a public sector job consumer bubble. Those jobs are ongoing conversations about higher level metrics that come from these tools.

- (no tech, just a bet) A lot of those jobs will go to childless people without a lot of responsibility, and many will just drink/amuse themselves to death, so get long boxed white wine and short pampers. Luxury goods/bags do well, as they will need new ways to signal status.

- (piecework gig platforms, make google glass shared/augmented reality for toddlers and parents) Reproductive tech and services set for a boom, single parent family management tech could become its own category.

- (LLMs for teaching and testing/exams, higher level abstractions like category theory, graphs, domain specific language design) Premium edu tech to scrape out any tiny advantage in a now-globalized competition between students for university spots. Anything that communicates useful abstractions faster will win.

I'm going to go touch some grass now.


AI.

AI!

Did I mention ai?

Srsly go into ai.

Start learning what is already here (so much), learn to use it, learn to fine-tune it, and if you're then still not busy enough: learn to really understand AI architecture.


But hasn't AI boomed already? If you said this 5 years ago, then it would be called emerging tech but now it's already here with many practitioners already.


Nope.

The industry is adding AI left and right right now.

This takes time and experts.

And all the small companies probably haven't even touched it.

And currently there is no end in sight. The R&D progress is still as high as, if not higher than, yesterday.



