> “These bears have become synonymous with gentrification in San Francisco,” he told fnnch, “and the displacement of the artists that come from here.”
I have mixed feelings about his honeybear art from a purely aesthetic point of view (i.e. I understand your boredom). However, as any modern viral influencer knows, any successful artist will attract haters. This article reinforces the notion that fnnch is very successful...
I concur with you (that this is an excellent introduction)!
Imo, your suggestions are more for intermediate/advanced active listeners who need to interact with folks in their job (e.g. bartenders, reporters, middle managers...).
Still, I feel being repetitive (e.g. 'It sounds like XYZ...is that right?') is better than nothing. Sometimes, training wheels aren't bad when learning how to ride a bike.
Author here. Exactly, “it sounds like” etc. are training wheels. Use them while you figure out how to do the technique. And yes, when you’re learning, it can sound stilted. As you master it, you don’t need to use those exact phrases any more.
I think the problem statement is: How do you know when to Let Go of the current boulder?
The poem suggests many, many possible whens. Here's one: "unless it comes out of / your soul like a rocket,".
Unfortunately (or fortunately), in life, there is no methodology to prove that a given search problem is futile (e.g. NP-complete)... so we have to take our chances and choose. I believe that's the beauty of life: choice.
This is correct. Delving into cognitive load without talking about germane load disqualifies this article. (Germane load is similar to extraneous load in terms of effort, but it's beneficial: it builds the mental schemas that improve the coder's reading ability.)
The examples are good, but readers must not take away that all effortful code is bad (e.g. Haskell is extremely hard to read at first, but every developer who uses it will swear its cognitive load is intrinsic, not extraneous).
I read the page https://www.succeedsocially.com/morefun. Here are my initial impressions. Pros: it identifies several important pain points and gives several decent examples. Cons: being a truly fun person is all about reaction, reaction, reaction. Fun people react authentically (while censoring their ahole side, because you don't want to be fun but unlikable), ridiculously (while reading the room), and intelligently (playing to the top of the crowd's intelligence).
Consider the intended audience though. This is for people who are lost and need perspective and concrete steps for improving. Compared to all the "fake it 'till you make it" or "just stop caring" type of advice, it's helpful.
> Fun people react authentically (while censoring their ahole side because you don't want to be fun but unlikable)
But here you explain exactly what is difficult. It's like walking a tightrope and someone tells you not to fall to the left and, by the way, also not to the right.
Hmmm, what is a simple "hello world" project in chip design?
In computer science courses, that's as simple as a println().
In machine learning courses, that's training on the MNIST dataset to do character recognition.
In electrical engineering, that's buying a Raspberry Pi to blink an LED.
In chip design ... ChatGPT says to design a 1-bit full adder in Verilog?
...
I understand why the article thinks the market is looking for graduate education. Designing even a simple chip requires an *initial investment* (as with all hardware startups, really). This is different from software, where one can simply launch a web app in a container hosted on your preferred cloud provider...
... That said, with the rise of LLMs lowering the barrier to entry for software even further (e.g. vibe coding), might we see more hardware startups and innovation?
FPGA dev boards are cheap nowadays, and you can start coding in a hardware definition language with a simulator. The ChatGPT answer of doing a 1-bit full adder as "hello world" makes sense.
You are obviously not going to etch silicon at home, but the design part is rather accessible as far as hardware goes.
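Since the thread keeps landing on the 1-bit full adder as "hello world": here is the combinational logic it implements, sketched in Python rather than an HDL (function name and structure are just for illustration; in Verilog the whole design is roughly `assign {cout, s} = a + b + cin;`).

```python
# Combinational logic of a 1-bit full adder, modeled in Python.
def full_adder(a, b, cin):
    """Return (sum, carry_out) for one-bit inputs a, b, cin."""
    s = a ^ b ^ cin                    # sum bit: XOR of all three inputs
    cout = (a & b) | (cin & (a ^ b))   # carry-out: majority function
    return s, cout

# Exhaustively check all 8 input combinations against integer addition,
# the same way a tiny HDL testbench would.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert cout * 2 + s == a + b + cin
```

The exhaustive loop is the point: for a design this small you can prove correctness by brute force, which is exactly the habit chip design tries to instill from day one.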
You can absolutely etch silicon at home. Processes like wet etching (KOH, HF), reactive ion etching (RIE), laser ablation, and even electron beam lithography using repurposed CRTs are all viable at the DIY scale.
They're not used in high-volume manufacturing (you’re not replacing ASML), but they’re solid for prototyping, research, and niche builds.
Just don’t underestimate the safety aspect—some of these chemicals (like HF) are genuinely nasty, and DIY high voltage setups can bite hard.
You're not hitting nanometer nodes, but for MEMS, sensors, and basic ICs, it’s totally within reach if you know what you’re doing.
It's actually pretty scary how far this guy got: 10nm. Makes you wonder if something as powerful as an Intel N100, famously a current 10nm chip, plus DDR4 memory could be made like this. That would already support quite a bit in terms of AI, even if transformers are probably a bit much to ask.
> rise of LLMs lowering the barrier to entry for software even further
Getting to your first wafer costs something like $250k and up in fab costs, depending on what process you're using. Hence much of chip-design effort is already spent on verification; it's probably over 50% by now. This is the exact opposite of vibes, because mistakes are expensive.
Business-wise it's a quite tough B2B sale, because you're selling into other people's product-development pipelines. They need to trust you, because you can sink their project, at a cost way over and above that of the actual parts.
Edit: I cannot emphasise enough how much more conservative the culture is in chip design and EE more broadly. It belongs to a world not just before "vibe coding" but before "web 2.0". It's full of weird closed source very expensive tooling, and is built on a graveyard of expensive mistakes. You've got to get the product 100% right on the first go.
Well, maybe the second go; production silicon is usually a "B" rev. But that's it. Economics dictate that you then need to be able to sell that run for a few years before replacing it with an upgraded product line.
The rule of thumb I use for chip design is that verification takes at least two-thirds of development. Sometimes more. 50% would be nice, but I think that's optimistic.
Verification is indeed the majority of the time spent. Unlike programming, Verilog and VHDL and higher level things like Chisel aren’t executed serially by the hardware they describe like a von Neumann machine. Hello World for a chip isn’t designing the circuit, or simulating the circuit, or synthesizing the circuit to some set of physical primitives. No, it’s proving that the circuit will behave correctly under a bunch of different conditions. The less commoditized the product, the more important it is to know the real PDK, the real standard cell performance, what to really trust from the foundry, etc. Most of the algorithms to assist in this process are proprietary and locked behind NDAs. The open source tools are decades behind the commercial ones in both speed and correctness, despite heavy investment from companies like Google.
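The flavor of that verification work can be sketched in plain Python (a toy sketch, not how real testbenches or formal tools operate): compose a design out of primitives, then check it against a golden reference model under every condition you can afford to cover. Here a hypothetical 4-bit ripple-carry adder, built from 1-bit full adders, is checked exhaustively against Python's own addition.

```python
# "Design under test": a 4-bit ripple-carry adder built from 1-bit full adders.
def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_adder_4bit(x, y, cin=0):
    """Add two 4-bit values by chaining four 1-bit full adders."""
    result = 0
    carry = cin
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry  # (4-bit sum, carry-out)

# "Golden model" check: exhaust all 16 * 16 * 2 = 512 input conditions
# and compare against Python's built-in integer addition.
for x in range(16):
    for y in range(16):
        for cin in (0, 1):
            total, cout = ripple_adder_4bit(x, y, cin)
            assert cout * 16 + total == x + y + cin
```

At 4 bits exhaustion is trivial; real designs have state spaces far too large for it, which is why the industry leans on constrained-random stimulus, coverage metrics, and formal proofs instead.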
And so my point: the place where people best know how to make chips competitively in a cutthroat industry is NOT in schools, but in private companies that have signed all the NDAs. The information is literally locked away, unable to diffuse into the open where universities efficiently operate. Professors cannot teach what they don’t know or cannot legally share.
Chip design is a journeyman industry. Building fault-tolerant, fast, power-efficient, correct, debuggable, and manufacturable designs is table stakes, because if you can't, there are already a ton of chip varieties available. Don't reinvent the wheel: the intersection of logic, supply-chain logistics, circuit design, large-scale multi-objective optimization, chemistry, physics, materials science, and mathematical verification is unforgiving.
I think you would just buy a cheap FPGA board and use that, wouldn't you? No need to do a full chip until you know what you are doing. That would be like building a server farm just to run your software hello world.
> Few of Twitter's most vocal posters spend time reading contemporary poetry collections, attending readings, or tracing the evolution of forms.
I recently got hooked on contemporary (i.e. modern) poetry. I fully understand why modern poetry seems hard to understand.
I believe most people innately love simple and deep modern poems.
If you like poetry related to nature (sorry major typo!), check out Ada Limon (1)
If you like poetry related to medicine and life, check out ACP poetry prize (2)
As a cloud security analyst who is thinking of going back to coding or DevSecOps, if I'm honest with myself, there is nothing new here that I have not seen before... (This is not a criticism or anything. If anything, the problem is mine: whether I can allocate time to learn this, or use Anki to retain it.)
The headline is borderline clickbaity. Specifically, the word "Ruthless" made me think of something unethical, like Delve's business.
[1]: https://news.ycombinator.com/item?id=47634690