Agree with the sentiment. However, it's hard to stay curious, and even harder to stay up-to-date.
I liked fiddling with storage for a while, got really into it, deepened my knowledge about it. A couple of years later I realized everything else (networking, architectures, languages) had developed so much that most of my (non-basic) knowledge was obsolete. Picking up where I left off with all these technologies is incredibly hard and causes fatigue.
Now I'm at a point where I have the feeling I don't know anything about anything. It's factually not true, but that's what my gut tells me. If I were younger, this would trigger a lot of anxiety. Thankfully I can handle it by now.
That's understandable. I'm into databases (both professionally and personally), so I get to touch a wide variety of things. Following a query the whole way is pretty neat. Typed query --> client --> wire protocol --> network --> server --> DB frontend --> DB planner --> DB storage engine --> OS --> Memory / Disk. `blktrace` is super interesting to watch commands being sent to the disk.
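For anyone curious about that last hop, it's surprisingly easy to watch. A minimal sketch (assumes Linux, root privileges, and a device named `/dev/sda`; adjust for your system):

```shell
# Trace block-layer I/O on /dev/sda and decode it live.
# blktrace emits raw trace events; piping them into blkparse
# prints readable lines showing queue/dispatch/complete actions
# per request as they are sent to the disk.
sudo blktrace -d /dev/sda -o - | blkparse -i -
```

Run a query against the database in another terminal and you can watch the corresponding reads and writes get dispatched to the device.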
When you are deeply involved in tech both personally and professionally, you are probably very passionate about it, so it makes sense that you'd only look closely at this field and think "people are so deeply incurious. This seems to only happen in tech".
Tech is also one of the (if not the) most dynamic and fastest-evolving fields a normal person will ever touch. Curiosity in tech can drain every single bit of free time and energy you have, and you will still hardly keep up with the progress, maybe barely scratch the surface. But people's available free time and energy wane, and curiosity is a collateral victim.
I've painfully gone through the entire cycle of this, including the bit of resurgence later on when you have a combination of free time but less energy. What I can say is that this absolutely does not happen just in tech. If anything, tech is flooded with people with more curiosity than almost any other field.
> When you are deeply involved in tech both personally and professionally you are probably very passionate about this and makes sense that you'd only look closely at this field and think "people are so deeply incurious. This seems to only happen in tech".
Good point. I commented in a sibling post to the same effect.
I've certainly felt the personal strain of time sink and procrastination in my homelab. It's currently running k3os, which has been dead for about four years now, because everything I want is still running, and I never seem to have the motivation on the weekend to yell at my laptop when I could be playing board games.
> including the bit of resurgence later on when you have a combination of free time but less energy.
I'm guessing that will happen in another decade or so, when my kids are grown.
However, my sentiment is rather this: I wouldn't pass an assembly programming interview, but I know enough about it to know what I don't know. Same with embedded programming, FPGAs, machine learning, big data, networking, and so on.
As for LLMs and quantum computing, I don't even know the basics and have no idea about the broader science behind them. The worst part is that they don't seem to interest me; I don't feel excited about them.
I guess if tomorrow I had to work with them, I could learn some "libraries that hide the complexity", but it leaves me with an empty feeling about these new technologies. Hence the existential question if I'm "too old for this" career path at all.
Nah. You don't have to feel excited about every single new bit of tech that comes along.
About 15 years ago I became interested in really advanced cryptography, because it was presented at a Bitcoin conference I went to. If you think AI is hard, that's kindergarten stuff compared to the maths behind zero knowledge proofs. And because nobody cared at that time outside of a handful of academics, there were no tutorials, blog posts or anything else to help. Just a giant mound of academic papers, often undated so it was hard to even figure out if what you were reading had been superseded already. But it seemed important, so I dived in and started reading papers.
At first, maybe only 5% of the words made sense. So I grabbed onto those 5%. I read a paper, put it down for a while, re-read it later and found I understood more. I talked to the researchers, emailed them, asked questions. I read the older papers that initiated the field, and that helped. It was a lot of work.
You know what? In the end, it was a waste of time. The knowledge ended up being useful primarily for explaining why I wasn't using those algorithms in my designs. A lot of the claims sounded useful but ended up not being so for complicated reasons, and anyway, I was mostly interested in what you could do with the tech rather than the tech itself. Turns out there's always a small number of people who are willing to dive in and make the magic happen in a nicely abstracted way for everyone else, for any kind of tech. QC is no different. There's, as far as I can tell, very little reason to learn it. If QC does ever "happen" it'll presumably 95% of the time be in the form of a cloud service where you upload problems that fit a quantum algorithm worked out by someone else, pay, and download the answer. Just like LLMs are - another topic where I was reading papers back in 2017 and that knowledge turned out to not be especially useful in regular life.
Learn the details of stuff if it naturally interests you. Ignore it if it doesn't. Being a specialist in an obscure domain can occasionally be like striking the jackpot, but it's rare and not something to feel bad about if you just don't want to.
Funnily, I started using this word when referring to social media, the people addicted to daily TikTok and Instagram scrolling. Maybe I'm exposed to this brain rot myself, since I've been using the word a lot lately, and possibly picked it up unconsciously from other sources.
I do! Wow, what a flashback. I think it was formed from Mozilla's built-in editor; it died quite quickly and was forked under a different name, which in turn also died quickly.
I know it's just an example, but it didn't take modern medicine for us not to die of broken legs at twenty. There are some risks, sure, but the norm was presumably lying down and resting because it bloody hurts (especially doing anything else), the break gradually healing, and then varying degrees of hobbling for the rest of your life because it healed without a splint.
(And then it wouldn't be that surprising I don't think if we figured out splints long before anything you might call medicine.)
We didn't need modern medicine to heal broken bones, but we certainly needed more than what was "naturally" available.
You can switch out broken bones for "being dragged from the campfire into the dark and devoured" - another natural occurrence that I gladly live without.
IIRC, most of the evidence regarding screens before bed being detrimental was focused on the "blue light" thing, so a warmer light is probably better.
(I think the science behind blue light being bad for us is still up for debate, but it's hard for me to ignore the amount of anecdata I've heard, including my own experience using flux and the like)