The first thing you should ask yourself is: what is your motivation?
You wanna be an entrepreneur? Just because? Honestly, that's just not smart.
You wanna be rich? Unicorn rich? Honestly, that's just not going to happen.
I had the itch to do 'something', but I never wanted to make a business out of it just for the sake of it. Still, I had plenty of ideas, and I keep seeing a lot of niches where you could make it as a normal small IT company, independent from others.
I only started to become an entrepreneur when I met a friend who had something I could relate to: a good business idea in a good niche, with the potential to make it in the sense that I will be able to pay myself a six-figure salary while already making a six-figure salary.
There is a small chance that I might get a little more out of that company than a self-managed, stable job with a good salary (say, 100-400k), but that's it.
Honestly, the best bet by far is to try to get into big, well-paying companies like Google.
You are at a uni, right? Go to hackathons, go to startup meetups, go to accelerators, and make sure you're good at something. But please don't just want to become an entrepreneur for the sake of becoming an entrepreneur.
The big hit came when Intel started doing this. It killed the whole category of low-end desktop GPUs.
Audio and NICs are very different though, and Apple's GPU integration has nothing to do with what happened to audio and NICs: Apple's customers demand GPU power for image and video editing and for the Retina display, and they pay a big price for these chips.
Integrated audio/NICs got optimized away because compute got so much better. iGPUs and the like are not compensated for by CPU compute; they exist because putting everything on one die makes it cheaper. The iGPU still has all the normal GPU components.
An Apple M-series chip is HUGE and damn expensive. If it weren't for people with deep pockets, it would be a lot cheaper to build the hardware yourself with normal GPUs. Mac Studio? 6k, vs. the same setup without Mac hardware: 4k or less, plus upgradability, etc.
That's often enough not true. They are mostly just integrated.
I know the distinction feels very thin, but come on: a GPU chip's complexity is far beyond a sound or NIC chip, and I don't think that comparison is fair at all.
You are not adding a network card to your desktop PC to get significantly better networking.
And they haven't even made new sound cards since 2021.
> Apple's customers demand GPU power for image and video editing and for the Retina display, and they pay a big price for these chips.
I know “too pricey” has been a talking point in Mac forum threads since the turn of the century, but you really should check the numbers before saying things like this. The M-series chips gave Apple a multi-year period of being notably cheaper, because an integrated chip saves money - the correct angle for criticism is the limited customization options.
Your pricing for the Mac Studio is high by 50%, but it also misses the point: the Mac Studio isn't competing with gaming rigs or home PCs (the $600 Mac Mini covers that market); it is aimed at people who need expansion options, like video editors - note how it has hardware acceleration for the ProRes codec they use, support for 8 displays, double or triple the Thunderbolt and USB ports, etc. You’re not buying it to play Call of Duty, you’re buying it to connect 8K cameras. The Mac Pro is even more of a specialist design with its PCIe slots.
The M1 chip was a game changer. That's true, and for whatever reason, the MacBook Air sits at an exceptional price/value point.
But not the Mac Studio: you can build your 8K, all-bells-and-whistles setup with a lot less money than giving it to Apple. The difference is volume: the Mac Studio market is probably 5-10x smaller.
The point is still valid: you do pay a big price for these chips. Apple pushes you toward a MacBook Pro because of RAM.
It's not bad criticism, don't get me wrong. My company laptop is really good, but it costs 3k.
The normal consumer market, outside of the Apple-ecosystem high-price bubble, actually starts a lot lower. You can get a normal laptop for 300 while the MacBook Air starts at 1000.
But before the M chips, this was totally different. Today I would try to convince people: 'If you can afford it, save up a little more and get a MacBook Air.' I would not have said that a few years back.
> The normal consumer market, outside of the Apple-ecosystem high-price bubble, actually starts a lot lower. You can get a normal laptop for 300 while the MacBook Air starts at 1000.
That $300 “normal” laptop was worse in almost every way and had a significantly shorter service life - I still remember people making those comparisons claiming spinning metal drives were the same as SSDs. What you’re conflating is that there isn’t a single market segment but several, and Apple relies on used kit for the lower-end price points. When you compare equivalent hardware capabilities, things have been roughly even since the switch to Intel, although it got tricky toward the end of that era, when Intel struggled to ship low-power parts and you really had to decide how much you valued battery life.
The problem is not that it's shit; the problem is that there are a lot of people in this world who can't afford a $1000 laptop, and Apple doesn't cater to them at all.
Smartphones have helped a lot here, but they are not always an alternative - think of a young person barely making enough to get through their studies, someone who needs a keyboard.
You were previously saying that you could get the same thing much cheaper, but now you’re talking about how you want something different. That’s a valid topic but conflating the two won’t help.
Well, that’s because Hugo is very simple to set up, there are lots of tutorials online about it, it hasn’t changed drastically in the past year, and it’s mostly the same across different static site generators.
You won’t find that kind of ease of use with, say, webgpu+winit for building a small renderer in Rust.
Blockchain and NFTs are and were stupid. A lot of people knew this, and the hype was more of a news hype because we had nothing else to talk about (until AI came along).
I have seen so many really good and helpful AI demos/features internally; it's impressive.
With AI/ML we are getting self-driving cars, robots (talking, listening, walking), agents, etc.
LLMs are not crazy good because they can generate stories; they are crazy good because they are a very, very good interface to humans.
Facebook's Segment Anything model basically solved the segmentation problem. AlphaFold solved protein folding. Nvidia's Omniverse, training robots in simulation, tackled the robot motion problem.
AI is not hype; AI delivers left and right, and every week there is something really cool and new.
Instead of writing uneducated blog posts or just blindly ranting about it, at least try to follow AI news; you will be amazed by how much it solves. And until we see ANY slowdown or ceiling, I do believe that this, right now, is what the iPhone or the internet was, just crazier.
It's frustrating that people are not even able to understand AI, blockchain, and NFTs well enough to separate them. Just because something gets hyped doesn't mean it's the same thing as the other thing that got hyped.
And no, you were not able to talk to a computer system as quickly and as well as you can today with OpenAI's voice input. And no, you never had a system that could answer that many questions at such high quality.
> LLMs are not crazy good because they can generate stories; they are crazy good because they are a very, very good interface to humans.
Do you have an example of a tool that uses an LLM as an interface? Seems like that'd be the fastest way to show people this is a superior interface.
We're obviously a long way away from Star Trek-style natural interaction with computers, so I'm curious what you're doing that works today. Aside from straightforward content generation, of course.
This is very cool and I could have seriously used this when recovering from RSI! However, it's not exactly a great argument that this is better than a keyboard and mouse for those who are abled enough to use them fully.
Well, to be fair, we’ve had decades of improving mouse-and-keyboard interfaces, but beyond better speech-to-text, natural-language interfaces have stayed roughly the same for the last 15 years.
The mouse was controversial on release as well, since most computers weren’t graphical at the time.
Let the LLM backed interfaces cook. I don’t think they’re a replacement for graphical UIs, but that doesn’t mean they won’t be better for some applications.
Braille, for example, can be read by blind people AND non-blind people in dark rooms. Not strictly better than regular text, but far from useless.
> It's frustrating that people are not even able to understand AI, blockchain, and NFTs well enough to separate them. Just because something gets hyped doesn't mean it's the same thing as the other thing that got hyped.
I agree with your main point: to those of us on HN there is obviously more substance to the current AI wave than to (say) web3.
But I can hardly blame people who aren't actively following tech news from believing that it's more of the same -- many of the VCs and tech media boosters are the same every cycle. If anything, I think it's more of an onus on those of us who do follow closely to sound the alarm on the bullshit. (And there is bullshit this AI wave, too, on top of the obvious substance.)
The GenAI stuff you don't like/hate helps blind people navigate the world, because the same models that generate images can also analyse images.
There are plenty of other areas where it will help a lot of people left and right, including you: cancer research, materials research for energy.
Cancer, for example, can be detected in X-rays thanks to GenAI-style models. It's a lot easier to get an X-ray machine somewhere around the globe than an expert, and we don't have enough radiologists anyway.
I agree with the general gist of the article that generative AI is bullshit for many use cases. But I'd rather point out why this comment is misguided for a very specific reason: it's a prime example of why we, as technologists, need to be careful with our terminology around some of these innovations.
You are lumping together many different things in this post. For example:
> With AI/ML we are getting self-driving cars, robots (talking, listening, walking), agents, etc.
The "AI / ML" part of this sentence is telling. I am aware of exactly zero self-driving cars that are powered by LLMs (what the general public almost always means when they say "AI" these days).
Self-driving cars are enabled by physical sensors in combination with various ML algorithms which have been around in some form for literally decades. I'm not an expert in this field, but my understanding is that what's actually happened in the last ~decade which has allowed them to flourish is the development of better _hardware_, that is, hardware that can run these algorithms fast enough, at a large enough scale, and still be small and cool enough to fit into a car.
Ditto to some extent with your other examples, though maybe a general-purpose robot could be made better by interfacing with an LLM.
I realize this may not be your intent, but by writing in this way, you are confusing the layperson into thinking that all of these innovations were enabled by ChatGPT-style "AI," when in fact some of them have nothing to do with that type of tech at all.
I really wish we'd all be more honest, and not conflate transformers/LLMs with other "AI" algorithms. In fact, I think it'd be good if we stopped saying "AI" completely, though I realize this will never happen given that term's stickiness with the public at large.
It's about mass, and 99% of the time it leads to some scam. I don't think that's a lot of people, but someone writes software for this type of thing, others buy it, and then they all do the same thing.
Sometimes just an impression (a visit to your page) can earn you money.
Do you think you are happier in life when you're at the top? I'll tell you a secret: no.
It's for sure better not to stress about things like money, but your definition of success is not universal.