
I've had a similar experience. I've now done math with Math Academy for 349 days in a row as of today. I'm not going as fast as I'd like due to other higher priorities like my kids and my startup, but Math Academy helps me make the most of the time I do have. I highly recommend it.

I also documented my experiences when I hit the 100 day streak mark here: gmays.com/math


It’s been great for me. As of today I crossed 324 days of using it straight. I wrote about my experience here: https://gmays.com/math

And yes, I’m a total fanboy. I’ve also known the founder for over a decade and he’s been working on it for most of that time. Math Academy came out of him helping his son learn math even before that.

I haven’t found anything that comes close to this for teaching math. It’s legit.


I feel like sending this to all my vegan friends: "Uh oh, what are you guys going to eat now?"


These days a Nasdaq ETF is likely the best bet (it's what I use in my IRA) since it has been consistently outperforming the S&P 500.

I'd be cautious with the more exotic leveraged ETFs. Instead, for my non-retirement accounts I've been mostly holding the Mag 7 since last year (after rotating out of pandemic stocks following the rebound). I shared my results over the last couple of years on my blog, where I document my thinking for future reference/improvement (https://gmays.com/2-year-follow-up-on-buying-the-dip-on-pand...). My current IRR from that portfolio is 83.70% over the last couple of years, for context.


QQQ has done great the last decade, but that's no guarantee it will outperform the next decade. Make sure you understand what you are getting yourself into by overweighting in the tech sector, and that it's a plan you can stick with during the hard times.



Love QQQ and the Australian version NDQ.


TL;DR:

1) In 2023, 72% of Americans reported that they were “at least doing okay financially”, a marginal decrease from 73% in 2022, but notably below the recent peak of 78% in 2021.

2) 34% of respondents reported higher monthly income in 2023 than in 2022, a share that has increased for four consecutive years.

3) Among retired individuals, who constitute more than a quarter of the U.S. population, 80% said they were doing at least okay financially, surpassing the national average.


I'd avoid straps or belts until you're at least around 400 lb so you develop your grip and core strength. If you think you need a belt, you could try lifting with a trap bar (hex bar) instead. That's what I switched to recently, and it's more comfortable for someone tall and turning 40.


Not an expert, but can give it a shot:

(1) Much development is already moving up from CUDA to the LLM layer, so it's less of an issue. Nvidia is also doing more work to increase interoperability. It could become an issue, I guess, but it doesn't seem like it since there's nothing close to CUDA or its ecosystem.

(2) AMD has attracted significant investment judging by the appreciation in its market cap, with a P/E ratio 3x Nvidia's. However, AMD is so far behind in so many ways that I don't believe the problem is investment; it's structural. Nvidia has been preparing for this for so long that it has a tremendous head start, not to mention being more focused on this. Remember that AMD also competes with Intel, etc.

(3) Hyperscalers are already building their own chips. It seems even Apple used its own chips for Apple Intelligence. It's relatively (which is doing a lot of lifting in this sentence because it's all HARD) not too hard to make custom chips for AI. The hard (near impossible) thing is making the cutting-edge chips. And cutting-edge chips are what the OpenAIs of the world demand for training, because releasing the newest, best model 1-3 months ahead of competitors is worth so much.

If anything, I'd say the biggest threat to Nvidia in the next 1-3 years is an issue with TSMC or some new paradigm that makes Nvidia's approach suboptimal.


Thanks, that was extremely helpful.

I don't think I understand your point in (1) that it's less of an issue because development is moving to the LLM. I can infer that maybe CUDA isn't a big part of the moat given your other points that the hard part is making cutting edge chips.


It's just the natural evolution of tech towards higher levels of abstraction. In the beginning most dev was on CUDA because the models had to be built and trained.

But since there are plenty of more advanced models now, the next level is getting built out as more developers start building applications that use the models (e.g. apps using GPT's API).

So where five years ago most AI dev was on CUDA, now most of it sits on top of the LLMs that were built with CUDA, building applications.
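
To make that concrete, here's a rough sketch of what building at the LLM layer looks like in practice (using the OpenAI Python client purely as an illustration; the model name and prompt are placeholders I picked, not anything specific): the application code is a single API call, and all of the CUDA-level work stays behind the provider's endpoint.

    # Minimal sketch of an "app built on the LLM" rather than on CUDA.
    # No kernels, no training loop; all of that lives behind the API.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def summarize(text: str) -> str:
        # One API call replaces what used to require model + GPU code.
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": "Summarize the user's text in one sentence."},
                {"role": "user", "content": text},
            ],
        )
        return response.choices[0].message.content

    print(summarize("Five years ago this would have meant writing CUDA; now it's an API call."))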


Coincidentally, a few days ago my wife was giving talks to students in NYC and mentioned how developed the girls seemed--much more than when she was in high school.


> The code isn't that complicated, you could probably implement training and inference for a single model architecture, from scratch, on a single kind of GPU, with reasonable performance, as an individual with a background in programming and who still remembers their calculus and linear algebra, with a year or so of self study.

Great overview. One gap I've been working on (daily) since October is the math, working towards MA's Mathematics for Machine Learning course (https://mathacademy.com/courses/mathematics-for-machine-lear...).

I wrote about my progress (http://gmays.com/math) if anyone else is interested in a similar path. I recently crossed 200 days of doing math daily (at least a lesson a day). It's definitely taking longer than I want, but I also have limited time (young kids + startup + investing).

The 'year of self study' definitely depends on where you're starting from and how much time you have, but it's very doable if you can dedicate an hour or two a day.
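
For anyone wondering where the calculus and linear algebra from the quote actually show up, here's a toy illustration (my own sketch, not something from the course): the smallest possible "training from scratch" is just a gradient computed by hand plus a parameter update, which is exactly the math that path builds towards.

    # Toy sketch: "training from scratch" at its simplest, i.e. linear
    # regression fit by gradient descent with nothing but numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                # 100 samples, 3 features
    true_w = np.array([2.0, -1.0, 0.5])
    y = X @ true_w + 0.1 * rng.normal(size=100)  # noisy targets

    w = np.zeros(3)   # model parameters
    lr = 0.1          # learning rate
    for step in range(200):
        pred = X @ w                              # inference: forward pass
        grad = 2 * X.T @ (pred - y) / len(y)      # gradient of mean squared error
        w -= lr * grad                            # gradient descent update

    print(w)  # should land close to true_w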


You might want to take a look at Math Academy. I'm similar to you and have been using it since last year.

My tool to get me through slogs is streaks, so I committed to doing a lesson (or at least part of a lesson) every day, and I'm at 198 days so far.

I wrote this at 100 days in case it's helpful: http://gmays.com/math I'm not sure if I'll have time to write an update for 200 days, but maybe at the 1 year mark.


Wanna see if we could be accountability partners for each other? Let me know! You can email me or I can contact you via LinkedIn.

