Re: your experience teaching synths in university, do you feel there is certain knowledge/concepts that are difficult for those self-teaching sound design?
I follow the same "no presets" rule, with most of my sounds being the result of random experimentation and fiddling with parameters. This does a good job of "creating never before heard, totally out there sounds", but I imagine it gets trickier the more specific your end goal is.
I think the hardest part is getting the basics straight, and by that I mean things like understanding that oscillation produces sound, that amplitude influences the sound level while the speed of the oscillation influences the pitch, and that by modulating pitch and amplitude alone you can already create a ton.
The biggest challenge for students in my experience is grasping all these obscure words and acronyms: VCOs, LFOs, keytracking, envelopes, ADSR, triggers, gates, CVs, oscillations, overtones, frequency graphs, filter cutoff points, resonance, ...
It is just a lot of concepts at once. Sometimes it can pay off to take a step back, limit one's arsenal, and figure out how far you can get using only one or two of those.
Totally. I'm self-taught in this area and I feel like the biggest obstacle for self-study is that the basic concepts are often passed over too quickly (like simply the idea of using phase/the unit circle to represent oscillations, and how all that relates to frequency and period). I recently sat down and tried to make sure I really understood the mathematical modeling part of the whole business, and it brought a lot of clarity when tackling more advanced topics. This is true for self-studying any mathematical domain generally. It's easy to pass over the "entry-level" or foundational stuff quickly because on the surface it seems straightforward, but getting those fundamental ideas crystal clear and burned into your brain is absolutely crucial if you're going to comprehend anything that follows.
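For anyone trying to nail down those fundamentals, here's a minimal sketch (my own illustration, not from any particular textbook) of the standard sinusoid model: amplitude sets the level, frequency sets the pitch, and the argument of sin() is just an angle sweeping around the unit circle.

```python
import math

def sample(t, amplitude=0.5, frequency=440.0, phase=0.0):
    """One sample of a sinusoid at time t (seconds).

    amplitude sets the level, frequency sets the pitch. The argument of
    sin() is an angle in radians going around the unit circle; one full
    turn (2*pi) is one cycle, so the period is 1/frequency seconds.
    """
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

# For a 440 Hz tone the period is 1/440 s, so the wave repeats there:
period = 1 / 440.0
print(sample(0.0))           # 0.0 (sin of angle 0)
print(sample(period / 4))    # peak: a quarter turn around the circle
```

Modulating `amplitude` over time gives you tremolo (and envelopes), modulating `frequency` gives you vibrato, which is the "you can already create a ton" part.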
This is exactly what I think. The basics need to be understood. That doesn't necessarily mean they need to be understood first, though. People who learn synthesizers are first and foremost interested in creating cool sounds, so the best strategy to teach them the (for some: boring) basics is IMO to repeatedly overwhelm them a little and then step back and explain things thoroughly.
That sounds like my ideal job. The longer I've been a developer, the more I've come to dislike work that doesn't address problems faced by end-users/the org.
I couldn't be happier. I am the in-house expert in my field, I replaced an agency that was far more expensive and incompetent, and I have great hours (9-4:30!) and benefits. Oh, and I'm fully remote in an org that's been remote since 2013.
Two economists are walking in a forest and they come across a pile of shit.
The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.
They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats the pile of shit.
Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."
"That's not true," responded the second economist. "We increased the GDP by $200!"
I was not expecting panpsychism to pop up on HN. Years ago I explored the topic out of curiosity; at first it seems pretty absurd, but you can find some interesting discussions and insights about it. At the very least, it can encourage you to think differently about consciousness, and perhaps even question some of your own assumptions.
It's odd to me that you chose color as an example of something objective, since so much about perception is subjective.
Even if we think about it in more quantitative terms, with red being defined as having a dominant wavelength of approximately 625–740 nanometres, it's a bit of an arbitrary definition, isn't it? If we observe a wavelength of 624 nm, objectively we might say it's not red, but someone may still perceive it as red considering how close it is. And someone with protanopia won't see anything in those bounds as red at all.
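To make the arbitrariness concrete, here's a toy sketch (the 625–740 nm band is the one mentioned above; the function name and hard cutoffs are my own illustration):

```python
def is_red(wavelength_nm, lo=625.0, hi=740.0):
    """Hard-edged 'red' band in nanometres; the cutoffs are conventional."""
    return lo <= wavelength_nm <= hi

print(is_red(625.0))  # True
print(is_red(624.0))  # False: one nanometre out, yet perceptually still "red"
```

The hard boundary is exactly what perception doesn't have: nothing interesting happens between 624 and 625 nm except that the definition flips.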
Color relies on ostensive definition. It's a public part of language whether someone is color-blind or not. When we say "that car is red" we are, in a sense, pointing to something and then using the concept and rules of color in our language. We might see something as a particular color through perception, but when we say "that --->" we are, in a sense, "seeing" in language (including body language: you could ask me which one is red and I could simply point).
We, of course, might disagree, but color-blind people learn which traffic lights are red, green, or yellow, regardless of their perceptual faculties. Because the color is not just what you see, but what you say.
I use f.lux - it changes the screen color temperature through the day to make it easier to sleep.
If I'm sitting up late at night, my desktop wallpaper (which is a regular photo with a lot of blue sky) becomes basically all red if you look at the RGB values. But I still perceive the sky in it as blue, because other things are "more red" so it looks blue by comparison, and because I know the sky is blue, and because I remember how it looked before and the change happened slowly.
All of this screams "relativism" to me; in fact, the mapping to common moral fallacies is surprisingly direct :) When the law changes around you, you might not recognize when it turned evil. Obeying the law is good because it's the law. When everything around you is evil, a small evil seems good.
As for language and law - these are arbitrary. "Yellow pages" can be any color, so can "blue screen of death". Green traffic lights are actually blue in some countries.
Sometimes I wonder how many criticisms of JavaScript frameworks are driven by a nostalgia for an internet that no longer exists. Not this post per se, but many seem to forget why the industry landed on React in the first place.
Minor nitpick, but I wish this post started by defining what we mean by logging vs tracing, since some people use these terms interchangeably. The reader instead has to infer the distinction from the criticisms of logging.
I've never encountered this confusion anywhere, so I wouldn't ever think to dispel it. Which isn't to say that I disagree with the more general point that defining your terms is a good thing.
In any case, the post itself (which is not long) illustrates and marks out many of the differences.
I wouldn't assert that the confusion is non-existent. But I think the audience for a post comparing technical differences between logging and tracing is unlikely to be a junior one.
But again, I do think the (brief) post marks out the differences throughout, so regardless, it still doesn't strike me as a problem here.
I agree. I'm working with code that uses 'verbose "message"' for level 1 verbosity logs and 'trace "message"' for level 2 verbosity. Makes sense in its world, but it's not the same meaning as how cloud-devops-observability culture uses those words.
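A sketch of the kind of convention I mean (the level names and numbers are hypothetical, not from any real codebase): "trace" here is just a verbosity level below debug, nothing to do with distributed-tracing spans.

```python
import logging

# Hypothetical scheme where "verbose" and "trace" are verbosity levels:
# level-1 verbosity sits between DEBUG (10) and INFO (20), level-2 below DEBUG.
VERBOSE = 15
TRACE = 5

logging.addLevelName(VERBOSE, "VERBOSE")
logging.addLevelName(TRACE, "TRACE")

logging.basicConfig(level=TRACE)
logger = logging.getLogger("demo")
logger.setLevel(TRACE)

logger.log(VERBOSE, "level-1 verbosity message")
logger.log(TRACE, "level-2 verbosity message")
```

Perfectly coherent inside that codebase, but a newcomer from the observability world would expect "trace" to mean spans and trace IDs, not a log level.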
Both this and bran cereal are good options in my experience. Just make sure to watch how much sugar is in it; some brands will sneak sugar even into granola.