
I'm sorry, I simply can't buy the idea that there's any kind of coding that can't be made fascinating.

I was asked today what I find so fulfilling about working with websites. The reason boils down to the fact that, more than any other domain out there, I feel like I'm working directly with the future. I'm working directly at the intersection of technology and business, driving real value for real people. Anybody who is making a website for dev money is doing this. It's just way too expensive to waste developer time on something that isn't.

I feel like if you consider your job as 'just gluing', you're missing out on something amazing.

Friday we were having a team discussion about data modeling and how we wanted to structure a particular concept. When does it become worthwhile to introduce another database to accommodate a new data flow? What precisely are the criteria for deciding when to denormalize? Our Postgres database worked just fine with little consideration towards perf, until we needed to run analytics on it. Now, all of a sudden, all those joins matter.
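To make that concrete, here's roughly the kind of thing we ended up sketching - a denormalized reporting table that pays the join cost once, off the query path. The schema and names below are made up for illustration, not our actual model:

    # Minimal sketch, assuming a hypothetical orders/customers schema.
    import psycopg2

    REPORTING_SQL = """
    CREATE MATERIALIZED VIEW IF NOT EXISTS order_facts AS
    SELECT
        o.id AS order_id,
        o.created_at,
        o.total_cents,
        c.country,        -- copied from customers so analytics queries
        c.signup_channel  -- don't need the join at read time
    FROM orders o
    JOIN customers c ON c.id = o.customer_id;
    """

    def refresh_reporting_table(dsn):
        # Rebuild the denormalized view on a schedule, outside the hot path.
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute(REPORTING_SQL)
            cur.execute("REFRESH MATERIALIZED VIEW order_facts;")

The usual rule of thumb: denormalize when the read path is hot and the copied columns rarely change; otherwise the joins are cheaper than the consistency headache.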

How do you work effectively with designers to produce a great-looking, functional website at the speed a media company wants to move? The whole team has had to really level up its CSS knowledge and code organization, because things change really, really fast. Layout goes from being a chore to something you urgently need to get right the first time, and fast.

Say you don't work for a fast-paced media company. Great, now you have tons of spare cycles to devote to maintainability and code quality. Just how far can you golf your job down? At my last job I was touching a code editor twice a week.

I refuse to buy into the mentality that says a programming job can be meaningless. There's always something neat you can be doing there.




I agree that anything can be made fun if you try hard enough. It's a kind of mental hack. Some people are better at it than others.

I'm very happy that you feel passionate about your work. I envy that. This comment is not an attempt at making you less passionate; I'm trying to point out differences in perspective.

One phenomenon I've observed is that people who have just learned programming are fascinated by everything. People on their first or second job. People just learning their first or second programming language. They're so full of enthusiasm. I envy them. For me, that moment was many years ago, way before I was able to join the job market. Because of that, I sometimes think there's a disadvantage to learning to program before adulthood: your first jobs won't be as exciting. You've already seen 90% of the problems you're about to face, and your attention is thus focused not on the technical aspects, but on the fact that you have this problem in the first place, likely for no good reason.

> The reason boils down to the fact that, more than any other domain out there, I feel like I'm working directly with the future. I'm working directly at the intersection of technology and business, driving real value for real people.

Yeah, for me - and many others - that "future" is a disastrous dystopia, a perversion of what computing could be: a land of corporate riches, total surveillance and a great waste of resources, instead of the promised efficiency and empowerment of individuals. The web of today - from the tech stack to the dev communities to the popular business models - is a tragic reminder of how "worse is better" can be taken to an extreme, with most of our compute wasted on software bloat and glitter. Consider Moore's law, and then consider that today's software rarely offers new user-facing features that are actually useful (i.e. not glitter) over the software of 15 years ago. Sure, most of it is the same old thing but networked; that's arguably more of a business-model change than something meaningful to end users. And the web is unfortunately driving all that bloat, glitter and user-hostile business practices.

Yes, I very much don't like the cloud.

> (...) driving real value for real people. Anybody who is making a website for dev money is doing this.

I question the "real value for real people" part of this. In fact, I believe it would be better for everyone if a lot of those web jobs were not done at all. After all, the dominant model of the web is to waste people's time (a result of optimizing for engagement) while making money off advertising (i.e. wasting more time), plus building surveillance infrastructure to make an extra buck off data collection. I do not subscribe to the view that just because some money changes hands, it's automatically a good thing.

But then, maybe I'm just burned out. I've noticed I've become increasingly depressed about this industry, and the typical jobs that are available.


I switched to being a data engineer and have found it remarkably refreshing. Sure, it’s “just” a combination of DBA, ETL engineer, devops, and glue coding, but the data world is 15-20 years behind programming practices and it’s fun to see unit testing and version control being applied to data, as well as the rusty tooling getting some new love. Especially with dbt, a data transform tool that’s fun to use. I generally agree with your downbeat view on things, but data engineering has its own problems that have given me a renewed interest in tech. For now!
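In case "unit testing applied to data" sounds abstract: dbt's built-in schema tests (unique, not_null) boil down to checks you could also hand-roll. A rough sketch of the hand-rolled version, with a made-up events table and connection string:

    # Minimal pytest-style data test; table name and DSN are hypothetical.
    import psycopg2

    def test_event_ids_unique_and_not_null(dsn="postgresql:///analytics"):
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute(
                "SELECT event_id FROM events "
                "GROUP BY event_id HAVING count(*) > 1"
            )
            dupes = cur.fetchall()
            cur.execute("SELECT count(*) FROM events WHERE event_id IS NULL")
            nulls = cur.fetchone()[0]
        assert not dupes, f"duplicate event_ids: {dupes[:5]}"
        assert nulls == 0, f"{nulls} rows with NULL event_id"

The nice thing about dbt is that these checks live next to the model definitions and run in CI like any other test suite.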


I can't even fathom a dystopic view at this point. All of my senses indicate that both the near and far term future are going to be amazing, and just get better all the time. I'm not a big fan of the current Javascript meta, but metas come and go. The current job market does depress me a bit, labor liquidity seems to be veering sharply downwards at the moment.

The only thing that worries me right now is World War 3. And losing the whales. That's really sad.


Every piece of computing hardware you own has been designed so that it is able to spy on you. If that, combined with the general government trends going on right now, doesn't sound dystopic to you, then nothing will.


I guess I've studied enough history to know what real dystopias look like and how they come about to realize that this just isn't it. Society is inherently adversarial, but the amount of actual violence is going down.

Violence doesn't just disappear; it morphs into another form. As the internet has slowly spread, it's allowing a lot of people's inner turmoil to manifest; this is the cause of the current political environment. Since this turmoil is chaotic and undirected, it's being harnessed by people who do have agency to serve political ends.

It's worrying until you realize that this is how politics has always been done. Democracies have always been mob rule with an aristocratic backbone to fall back on when things get too unruly.

I'll take the current political climate over that which generated the French Revolution any day of the week.

Spy networks also have a long long history. They used to be the tools of autocrats to enforce social order, and still are in political entities like ISIL. I'll take my computers spying on me for economic ends over people spying on me to actually literally control my behavior any day of the week and Sunday too.


> I'll take the current political climate over that which generated the French Revolution any day of the week.

The question is, are those two climates really that different?

> people spying on me to actually literally control my behavior

You've essentially defined the advertising industry. It's a very insidious form of control, but just because it doesn't involve threats of violence doesn't mean it isn't there and isn't working.


What makes the French Revolution different from today is the sheer murderousness of the population and its willingness to get whipped up into big mobs that took people from their homes, killed them, and put their heads up on pikes. This became normal. It got so bad that someone thought the guillotine was a humane invention.

You just can't conflate today's RedPillers with that.

If you look at the Civil War, this was a conflict that rent families apart. Brothers faced down brothers across a battlefield. Fathers fought sons. The Civil War was the culmination of a political quarrel that had steadily risen in tenor since the inception of the nation. It was a hundred years in the making, enough time for the sides involved to get real passionate about it.

The essence of fighting and violence is still there; the same deep inner chaos is driving these conflicts.

But what's missing is the ridiculous amount of zeal that both sides possessed. If you have that kind of zeal, these days the only way to fulfill it is to go join ISIL. And people do that. But most people aren't getting that bad. They find an outlet. Trump is that outlet. Brexit is that outlet. Before it was mob violence.

It's the same with advertising: the continuation and slow civilizing of the new domain of marketing human attention. However bad it is now, it was worse before. Before machines did the dirty work, it was other humans acting in unbelievably shady ways to rip people off for as much as they could. And it was considered normal.


You are not burnt out, you are 100% correct.

Combine that with the fact that web chumps think they're 'inventing' when they're solving problems that were solved decades ago, only worse, and most of them don't have the historical background, education, or even interest to know what mistakes they're making. They think it's "leveling up" to solve an already-solved problem, but in web format, while wasting everyone's time with ads, tracking, and gimmicks, for money.

You are not burnt out, or wrong. The web used to be better; computing used to be better. Windows is now a service, and Linux still can't drive a graphics card without nonfree drivers. We failed, but I think there might be a chance to fix things.

We need to put together a cohesive document that explains why modern computing needs a serious overhaul, and a plan to get there. The computer should be fast; the UI should be appealing, but minimal. It should be understandable and open down to the RTL of the processor. The core should be aggressively minimal, and fully understandable. Networks with no central auth should be at least an option (preferably a local mesh). It should be cheap.

I'm thinking like a modern DOS, but for RISCV, fully implemented on an FPGA (goal of course to move to custom silicon for performance improvements later).


> web chumps think they're 'inventing' when they're solving problems that were solved decades ago, only worse, and most of them don't have the historical background, education, or even interest to know what mistakes they're making.

Your tone is very out of place in this thread. If you don't know that something was solved before and you solve it yourself, it is a great feeling and very rewarding.

I love solving crossword puzzles, riddles and similar games. Yes, even though someone else _designed_ them. We were talking about gaming.


> We need to put together a cohesive document that explains why modern computing needs a serious overhaul, and a plan to get there. The computer should be fast; the UI should be appealing, but minimal. It should be understandable and open down to the RTL of the processor. The core should be aggressively minimal, and fully understandable. Networks with no central auth should be at least an option (preferably a local mesh). It should be cheap.

Agree on all those goals. Over the years, I've seen some essays and comments scattered all over the web containing those ideas; it would be great to collate all of those thoughts somewhere.


We could open a repo and start writing things down?

I can handle the FPGA side. Start with essentially DOS for RISCV. Use qemu for kernel development, and Verilator as the simulator for RTL. Develop a simple graphics unit that is "pluggable" so you can replace it with a more advanced one later (or if the user requires it). Same for networking.

We'll target one of the iCE40 FPGAs with an open toolchain. Not that they're great FPGAs, but the open toolchain is what's important.
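One way the Verilator piece could look in practice is a Python testbench via cocotb, which can run on top of Verilator. Just an illustration - the clk/rst/ready signals belong to a hypothetical top module, nothing that exists yet:

    # Hypothetical smoke test for a not-yet-written core, simulated with Verilator via cocotb.
    import cocotb
    from cocotb.clock import Clock
    from cocotb.triggers import RisingEdge

    @cocotb.test()
    async def comes_out_of_reset(dut):
        cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())  # 100 MHz clock
        dut.rst.value = 1
        for _ in range(2):
            await RisingEdge(dut.clk)
        dut.rst.value = 0
        await RisingEdge(dut.clk)
        assert dut.ready.value == 1, "core should be ready after reset deasserts"

cocotb treats Verilator as one of several simulator backends, so the same tests could later run against other simulators too.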

I've been batting this idea around for a while.


Of all the things I can think of that could make the world of computing better, reinventing the hardware / OS combo is so far out in the sticks that you're going to be fighting lions and tigers for your daily sandwich. Is x64 / Linux really that bad? And do none of the existing alternative OSes offer what you need?

My side projects generally involve creating my own little Garden of Eden on top of as mainstream a system as the world can muster, which at the moment appears to be Ruby driven by a shell; I'm not sure yet whether it's appropriate to try something other than bash. I never even get motivated to drop down to writing C extensions, much less reinvent hardware.

If I'm going to reinvent anything at this point, it will be the shell, which to me is the most powerful UI abstraction tool available atm. My main gripe at the moment is $PATH: if I want my own tool to be called sort, well, the answer is don't do that.

Reinvention is only relevant when you can actually add capabilities that didn't exist before. Then you have to communicate those new capabilities with a killer app. Even in my little example, nobody's going to care that I fixed shell tool namespacing if I can't articulate why I needed it with a better system for personal computing.

And if I go and do all that work to make a new shell, it has to solve all the same problems that the status quo evolved to solve. Now that I'm thinking about it, the answer to my dilemma is to invert the shell: make a very basic CLI that delegates to the shell when you really need it, but offers an extensible garden that is free of the shell otherwise. Build on Readline, not bash. I never really needed bash anyway.
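Something like this rough sketch, maybe (Python rather than Ruby just for brevity; the built-in sort is a placeholder command):

    # A toy "inverted shell": a Readline-driven REPL with its own command
    # namespace, shelling out only when explicitly asked with a leading '!'.
    import readline  # imported for its side effect: line editing/history for input()
    import shlex
    import subprocess

    COMMANDS = {}

    def command(fn):
        COMMANDS[fn.__name__] = fn
        return fn

    @command
    def sort(args):
        # My own 'sort', no $PATH collision with /usr/bin/sort.
        print("\n".join(sorted(args)))

    def repl():
        while True:
            try:
                line = input("> ")
            except EOFError:
                break
            if line.startswith("!"):
                subprocess.run(line[1:], shell=True)  # delegate to the real shell
                continue
            words = shlex.split(line)
            if not words:
                continue
            handler = COMMANDS.get(words[0])
            if handler:
                handler(words[1:])
            else:
                print(f"unknown command: {words[0]}")

    if __name__ == "__main__":
        repl()

The point being that the garden's namespace is mine, and bash is an escape hatch rather than the foundation.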

I mean, don't get me wrong, I'd be very very happy to implement my side project on the hardware and OS that adheres to your proposed doctrine. Openness is great. But the seething shifting mass of backwards compatibility isn't just going to go away after you reinvent it all. You're just resetting the clock.


Resetting the clock might be worth it at this point, if we can jettison accumulated cruft and simplify, simplify, simplify.

My problem with things like Haiku and ReactOS is that they're trying to emulate systems that already existed - BeOS and Windows, respectively. All I want is a modern DOS, say with a faster/more capable graphics layer.

The closest I've found is Wirth's Project Oberon, but the choice of language means there's very little that can be pulled from existing repositories and used directly - not too many people are programming in Oberon. C is a requirement, but you get that with the RISCV tooling.


You're probably right. Every now and then a revolution is actually appropriate.


> Linux still can't drive a graphics card without nonfree drivers

It can: https://en.wikipedia.org/wiki/Free_and_open-source_graphics_...


Yeah, that's exaggerated. What the author probably meant is that proprietary drivers offer some features that won't work with the open source drivers, for example better 3D acceleration. In my case, I was unable to properly configure a multi-display setup with 4 screens and HDMI audio without Nvidia's proprietary drivers.



