Would love it if you could go into more detail about those two weeks. Had you already played around with tone.js beforehand? Did you have an idea or a draft written up for that initial lesson so you knew what was gonna be in it, and was it your initial vision to have a picture of each waveform accompanied by the sound, and buttons for each note in the scale? Did you have to change part of your design (even small details) when you discovered that it was hard to build something a certain way and that another way would be easier?
I'm trying to understand how this tutorial works. I noticed it doesn't make any network requests when I run one of the examples, but if I edit the code in an example it does make a request. Is this the magic of Next.js and server-side components at work? Do you have a real EdgeDB instance running on the server side that's used to pre-render the page and also handle updates? I'd love to hear more about how this setup works, and even look at the code if it's available.
Aw man... and I just bought an Oryx Pro. Oh well, I guess this one doesn't come with coreboot and it's not even out yet - plus I think the Oryx Pro has a slightly faster processor?
I know git has a way to show diffs based on words instead of lines. Is there a way to use words instead of lines for these stats? When I'm writing markdown I usually turn on word wrap in my editor, which means there are entire paragraphs on one line.
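For what it's worth, here's a rough sketch of how you could count changed words yourself, using git's `--word-diff=porcelain` output (changed runs of words are printed on their own lines prefixed with `+` or `-`, with `~` marking newlines). The `[^+]`/`[^-]` in the grep patterns are just there to skip the `+++`/`---` file headers:

```shell
# Count words added/removed in uncommitted changes, per the
# porcelain word-diff format described above.
added=$(git diff --word-diff=porcelain | grep '^+[^+]' | wc -w)
removed=$(git diff --word-diff=porcelain | grep '^-[^-]' | wc -w)
echo "words added: $added, removed: $removed"
```

This is an approximation (a changed word that itself starts with `+` or `-` could be miscounted), but for prose in markdown it gets you word-level stats even when a whole paragraph lives on one soft-wrapped line.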
On the other hand, editing takes a lot of time per word changed, because you have to consider the whole sentence or even paragraph to make sure the grammar is still right, you're not repeating things, and you haven't removed any necessary context. So maybe a bit of overestimation is good.