Sounds like a potential business opportunity! I don't know much about cars: how much is standardized in car electronics? Would it be possible to build an infotainment module that you could sell to several car manufacturers with only minimal modifications?
I think I've heard of something called a CAN(?) bus that is used to communicate stuff in cars and is fairly standardised, maybe?
There are already companies doing 3rd party electronics as mentioned above, such as Visteon and Continental, and Garmin is trying to get into that business too.
That's what many OEMs have been doing for decades, and this is exactly what many SDVs (software-defined vehicles) have been trying to get rid of, since integrating many different products from many different manufacturers is slow, let alone iterating on and designing new features.
Related to CAN, the bus is standard, but the thing is, CAN is just a bus, not a protocol. There are many ways you can have two ECUs (vehicle's modules) talking in incompatible ways.
I've had one co-worker with something like a decade of experience on paper, who was proud of his C++ despite having never heard of the standard template library — lots of `new` and `free`, not a single smart pointer (https://en.cppreference.com/w/cpp/memory#Smart_pointers). And the code they wrote had a lot of copy-paste going on, which I ended up finding because I'd put in a "TODO: deduplicate this" comment somewhere and found it in his newly duplicated class one day.
They absolutely were not interested in learning anything. I left knowing more C++ than they did, despite having started there with a total C++ experience of one hello-world tutorial, and I still don't count myself as a C++ dev even today.
To be fair, when a company says they use C++, it can mean anything from "C with classes" to crazy metaprogramming with almost automatic memory management. Since they have over 10 years' experience, they are almost definitely in the former camp.
I would never utter the phrase "I know C++" because it can mean so many different things to so many different people, and I don't think anyone truly knows the whole language.
Not using templates or smart pointers doesn't sound that bad to me (unless the entirety or majority of the codebase was written with them in mind); the duplication thing is more questionable.
It's not so much that this specific person didn't use smart pointers, it's that they had never even heard of them, and weren't interested either.
"C with classes" is probably a good description, given what I saw from that one person — they didn't understand sub-typing either, and only had a cargo-cult understanding of access specifiers (revealed when the rest of us asked them why they'd duplicated a class file rather than subtyping).
Tbh, I also (sort of) knew C++: studied it in school and a few semesters' worth in college (CUDA, DSA, a computer vision elective, compiler design), but I still don't know the STL.
(I had been then interviewing using Java and Python.)
Nope! I reverted a commit once, since a colleague pushed something that didn't compile to master. Sent the guy a polite message notifying him that something seems to have been amiss with his last commit, and to please let me know if he wanted help fixing it.
Boss called me 5 minutes later and tells me off for creating "bad vibes" in the work environment.
Colleague then proceeded to force-push his "fix", which still didn't even compile, to master, removing a new feature I was about to roll out to production, because he didn't know how to merge his changes with the revert commit I'd added.
This was when I decided to quit.
Oh, I should add that this developer bragged he had 10+ years of working experience. Not that I believe him, but still.
It's got me wondering: does any of my hard work actually matter? Or is it all just pointless busy-work, invented since the industrial revolution to create jobs for everyone, when in reality we would be fine if like 5% of society worked while the rest slacked off? Don't think we'd have as many videogames, but then again, we would have time to play, which I would argue is more valuable than games.
To paraphrase Lee Iacocca:
We must stop and ask ourselves: how many videogames do we really need?
> It's got me wondering: does any of my hard work actually matter?
I recently retired from 40 years in software-based R&D and have been wondering the same thing. Wasn't it true that 95% of my life's work was thrown away after a single demo or a disappointingly short period of use?
And I think the answer is yes, but this is just the cost of working in an information economy. Ideas are explored and adopted only until the next idea replaces them or the surrounding business landscape shifts yet again. Unless your job is in building products like houses or hammers (which evolve very slowly or are too expensive to replace), the cost of doing business today is a short lifetime for any product; they're replaced in increasingly fast cycles, useful only until they're no longer competitive. And this evanescent lifetime is especially the case for virtual products like software.
The essence of software is to prototype an idea for info processing that has utility only until the needs of business change. Prototypes famously don't last, and increasingly today, they no longer live long enough even to work out the bugs before they're replaced with yet another idea and its prototype that serves a new or evolved mission.
Will AI help with this? Only if it speeds up the cycle time or reduces development cost, and both of those have a theoretical minimum, since designing and reviewing any software product carries an irreducible cost. If a human must use the software to implement a business idea, then humans must be used to validate the app's utility, and that takes time that can't be diminished beyond some point (just as there's an inescapable need to test new drugs on animals, since biology is a black box too complex to be simulated even by AI). Until AI can simulate the user, feedback from the user of new or revised software will remain the choke point on the rate at which new business ideas can be prototyped in software.
I think about this a lot with various devices I owned over the years that were made obsolete by smartphones. Portable DVD players and digital cameras are the two that stand out to me; each of them cost hundreds of dollars but only had a marketable life of about 5 years. To us these are just products on a shelf, but every one of them had a developer, an assembly line, and a logistics network behind them; all of these have to be redeployed whenever a product is made obsolete.
This is what makes software interesting. It theoretically works forever and has zero marginal production cost, but its durability is driven by business requirements and hardware and OS changes. Some software might have a 20-year life. Some might only get 6 months.
A house is way more durable. My house is older than all software, and I expect it to outlive most software written (either today or ever). Except Voyager, perhaps!
Yes... basically in life, you have to find the definition of "to matter" that you can strongly believe in. Otherwise everything feels aimless, life itself included.
The rest of what you ponder in your comment is the same. And I'd like to add that baselines have shifted a lot over the years of civilization. I like to think about one specific example: painkillers. Painkillers were not used during medical procedures in a widespread manner until some 150 years ago, maybe even later. Now it's much less horrible to participate in those procedures, for everyone involved really, and the outcomes are better just for this factor, because patients move around less while anesthetized.
But even this is up for debate. All in all, it really boils down to what the individual feels like it's a worthy life. Philosophy is not done yet.
Well, from a societal point of view, meaningful work would be work that is necessary to either maintain or push that baseline.
Perhaps my initial estimate of 5% of the workforce was a bit optimistic, say 20% of current workforce necessary to have food, healthcare, and maybe a few research facilities focused on improving all of the above?
I'm pretty sure it's not impossible, but rather just improbable, because of how human nature works. In other words, we are not incentivized to do that, and that is why we don't do that, and even when we did, it always fell apart.
You are very right that AI will not change this. As neither did any other productivity improvement in the past (directly).
It's impossible, and not just because of human nature. Even if humans were more cooperative or altruistic, it's impossible to plan for disruptive innovations.
Power itself seems to be the goal, and the reason for it is human DNA, I think. I have doubts that we can build anything different from this (over a sufficiently long run).
Mine doesn't, and I am fine with that; I never needed such validation. I derive more than enough fulfillment from my personal life and the achievements and passions there. Through that lens, office politics, the promotion rat race, and what people do in them just make me smile. I see how otherwise smart folks ruin (or miss out on) their actual lives and families in pursuit of excellence in a very narrow direction, often underappreciated by employers and not rewarded adequately. I mean, at a certain point you either grok the game and optimize, or you don't.
Over time the work brings modest wealth, allows me and my family to live in a long-term safe place (Switzerland), and builds a small reserve for bad times (or inheritance, early retirement, etc.; this is Europe, no need to save up for kids' education or potentially massive healthcare bills). Don't need more from life.
Agree. Now I watch the rat racers with bemusement while I put in just enough to get a paycheck. I have enough time and energy to participate deeply in my children’s upbringing.
I’m in America so the paychecks are very large, which helps with private school, nanny, stay at home wife, and the larger net worth needed (health care, layoff risk, house in a nicer neighborhood). I’ve been fortunate, so early retirement is possible now in my early 40s. It really helps with being able to detach from work, when I don’t even care if I lose my job. I worry for my kids though. It won’t be as easy for them. AI and relentless human resources optimization will make tech a harder place to thrive.
Unless you propose slavery, how are you going to choose the 5%?
Who in their right mind would work when 95 out of 100 people around them are slacking off all day? Unless you pay them really well. So well that they prefer to work than to slack off. But then the slackers will want nicer things to do in their free time that only the workers can afford. And then you'd end up at the start.
> It's got me wondering: does any of my hard work actually matter?
It mattered enough for someone to pay you money to do it, and that money put food on the table and clothes on your body and a roof over your head and allowed you to contribute to larger society through paying taxes.
Is it the same as discovering that E = mc² or Jonas Salk's contributions? No, but it's not nothing either.
> Don't think we'd have as many videogames, but then again, we would have time to play, which I would argue is more valuable than games.
Would we have fewer video games? If all our basic needs were met and we had a lot of free time, more people might come together to create games together for free.
I mean, look at how much free content (games, stories, videos, etc) is created now, when people have to spend more than half their waking hours working for a living. If people had more free time, some of them would want to make video games, and if they weren’t constrained by having to make money, they would be open source, which would make it even easier for someone else to make their own game based on the work.
Nope. The current system may be misdirecting 95% of labor, but until we have sufficiently modeled all of nature to provide perfect health and brought world peace, there is work to do.
I've been thinking similarly. Bertrand Russell once said: "there are two types of work. One, moving objects on or close to the surface of the Earth. Two, telling other people to do so". Most of us work in buildings that don't actually manufacture or process anything. Instead, we process information that describes manufacturing and transport, or we create information for people to consume when they are not working (entertainment). Only a small fraction of human beings are actually producing things that are necessary for physiological survival. The rest of us are, at best, helping them optimize that process or, at worst, leeching off of them in the name of "management" of their work.
Most work is redundant and unnecessary. Take for example the classic gas-station-on-every-corner situation that often emerges. This turf war between gas providers (or, by proxy, the franchisees they have licensed for this location) does not arise because three or four gas stations are operating at maximum capacity. No, this is 3 or 4 fishermen with a line in the river, made possible solely because inputs (real estate, gas, labor, merchandise) are cheap enough that a gas station need never run even close to capacity and still return a profit for the fisherman.
Who benefits from the situation? You or I, who don't have to make a U-turn to get gas at this intersection? Perhaps, but that is not much benefit compared to the opportunity cost of having three prime corner lots squandered on the same single use. The clerk at the gas station, for having a job available? Perhaps, although their labor in aggregate might have been employed in other, less redundant uses that could benefit our society more than selling smokes and putting $20 on pump 4 at 3am. The real beneficiary of this entire arrangement is the fisherman: the owner or shareholder who ultimately skims from all the pots, thanks to having what is effectively a modern version of a plantation sharecropper, spending all their money in the company store and on company housing, with a fig leaf of being able to choose from any number of minimum-wage jobs, spend their wages in any number of national chain stores, and rent any number of increasingly investor-owned properties. Quite literally all owned by the same shareholders, when you consider how people diversify their investments across these multiple sectors.
It's weird to read the same HN crowd that decries monopolies and extols the virtues of competition turn around and complain about job duplication and "bullshit jobs" like marketing and advertising that arise from competition.
Swedish senior Java backend developer who recently moved to Mexico, with long experience building scalable systems for clients with lots of traffic. Notable previous employers include Mojang (makers of Minecraft), EA/DICE (Battlefield), King (Candy Crush), and Discovery Networks.
Corn is my favourite example of this, mostly because it went the other direction, i.e., the kinds of corn that weren't cultivated into their current form over centuries are way cooler than the yellow we are used to!
There is blue, red, and disco ball "all colours" extravaganza corn.
The wild teosinte ancestors of corn are nigh-indistinguishable from dozens of other kinds of wild grasses in Mexico and Central America. The red and glass gem field corns are usually new varieties developed in the last century or so, comparable to roses in this analogy. Pretty much the only commonly available product made from heritage varieties is blue cornmeal, which usually comes from puebloan blue varieties still cultivated commercially.
Curious, do you write code on it? I've heard the ghosting makes them nearly unusable for coding, but the concept of coding in e-ink is just so enticing to me!
I have a Dasung monitor (black & white, not colour) and it's reasonably good for coding. You need to use light themes on e-ink devices (I use the default Emacs theme), as using a dark theme leads to unusably bad ghosting.
Mine is a few years old and newer panels are better, so I would expect this to be pretty good.
I do! I mostly use JetBrains stuff and Sublime Text. The ghosting is a non-issue for me.
I do keep a color calibrated panel running next to it in case I need a video going or something (conference, etc.). But the ghosting is a non-issue for coding once you figure out the settings and limits that work for you.
Of note, I spent a chunk of my career working with and programming on a Harris H100 via terminal, and slowing down is a nice little throwback for me.
I am not too sure about that. Isn't the whole point of art and music that you can convey something that words cannot? Of course, these models are starting to support image and audio inputs as well, but the most interesting mixing step, the one that happens in the artist's head, seems missing from the generated output. If you have some vision inside your head, making something out of it by hand is still the best way to convey it. Just as writing something down refines the thoughts and reveals holes in your thinking, drawing something is visual thinking that reveals holes in your imagination. You can imagine some scene or object pretty easily, but if you try to draw it into existence, you will immediately notice that a lot of detail is missing: a lot of stuff you didn't think through or didn't even notice was there at all. The same applies to creating music and to programming. Using generative AI certainly has some artistic component to it, but I feel like using these models you give up too much expressive bandwidth and the ability to reflect and take your time on the whole endeavor.
Who is the work for? If I lived in the automated future (or could afford private staff in the present) I would do more creative stuff just because I enjoy it and with no expectation of having an audience.
For context, I'm an occasionally-published photographer, and I like playing piano but I'm not at a level anyone else would want to listen to.
But photography is not art, you didn't paint it! You literally pointed a device at something, twiddled a few knobs and pushed a button. Literally anyone with a smartphone can do that!
/s of course, but basically that's the argument people make nowadays related to AI and art (of any form).
I'm not convinced the two are necessarily mutually exclusive. Surely a skilled developer could make use of AI to produce stuff faster, while still understanding everything and making sure the code is well written (well generated?)
Haven't really tried vibe coding myself yet, but I'm tempted to give it a go. I imagine stuff like integrating external APIs could be really handy: looking through external documentation and creating the correct calls to the correct endpoints is always a huge timesink. Seems like AI should be able to make short work of such tasks.
> Surely a skilled developer could make use of AI to produce stuff faster, while still understanding everything and making sure the code is well written (well generated?)
Yes. But in these early stages, that will only prove that LLMs can't do much more than be a fancy autocomplete, as opposed to a way to accelerate the workflow when navigating anything with real-world value.
Maybe in a few years for the most common domains this will truly shine.