Did the author start a client-side debugging process without running any kind of diagnostics in the dev tools?
To me this sounds like "I saw some slowness in my application, so naturally I started adjusting indexes in my database." Without doing any upfront research, you're basically just throwing darts and hoping one sticks. This type of approach to debugging has never served me well.
"...even as he fell, Leyster realized that he was still carrying the shovel. In his confusion, he’d forgotten to drop the thing. So, desperately, he swung it around with all his strength at the juvenile’s legs.
Tyrannosaurs were built for speed. Their leg bones were hollow, like a bird’s. If he could break a femur …
The shovel connected, but not solidly. It hit without breaking anything. But, still, it got tangled up in those powerful legs. With enormous force, it was wrenched out of his hands. Leyster was sent tumbling on the ground.
Somebody was screaming. Dazed, Leyster raised himself up on his arms to see Patrick, hysterically slamming the juvenile, over and over, with the butt of the shotgun. He didn’t seem to be having much effect. Scarface was clumsily trying to struggle to its feet. It seemed not so much angry as bewildered by what was happening to it.
Then, out of nowhere, Tamara was standing in front of the monster. She looked like a warrior goddess, all rage and purpose. Her spear was raised up high above Scarface, gripped tightly in both hands. Her knuckles were white.
With all her strength, she drove the spear down through the center of the tyrannosaur’s face. It spasmed, and died. Suddenly everything was very still."
left-pad even being a package is pretty funny, no? How many bytes got pumped across CDNs, proxies, build pipelines, etc. just to write a tiny utility function? I'm all for taking advantage of existing solutions, but I can't wrap my head around needing to pad a string and thinking "oh, I bet there's a package for that"
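For context, the entire utility fits in a few lines. A minimal sketch of a left-pad function (not the original npm source) might look like this:

```javascript
// Minimal left-pad sketch (not the original npm code):
// pad `str` on the left with `ch` until it is at least `len` long.
function leftPad(str, len, ch = ' ') {
  str = String(str);
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}

console.log(leftPad('5', 3, '0')); // "005"
```

Since ES2017 this is built into the language as `String.prototype.padStart`, which makes the package even harder to justify today.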
I remember part of the discourse being that this was a much needed wake-up call to web devs for their relentless reliance on micro packages like left-pad. Part of it was the culture of publishing packages for the sake of popularity and GitHub stars. Part of it was also devs insisting that implementing anything that could otherwise be installed through npm was "reinventing the wheel". Today I work with a lot of devs who still prefer using micro packages, regardless of their simplicity, because to them it means "less maintenance". Go figure.
Repeatedly extending a string is not a linear-time operation. Behind the scenes, the JS runtime allocates new memory for it. In the naive case, you start by allocating 1 byte; when you append to it, you need 2 bytes, so you allocate a new 2-byte string and copy the data in. Each new byte means a fresh allocation and a copy of the entire string so far. That's how it's quadratic.
In practice, memory allocators tend to grow an allocation like this geometrically (e.g. doubling the capacity). That actually makes repeated appends amortized linear rather than quadratic: the copies performed at each doubling form a geometric series that sums to O(n). The quadratic behavior comes from copying the whole string on every single append, which is what immutable strings force in the naive case.
In practice, JS runtimes also tend to use data structures like Ropes for strings to handle this sort of issue. That brings it down to linear time in practice (I think?)
In each loop iteration, prepending a single character can take O(m) (shifting all m existing characters one place to the right), so the combined cost is O(nm), where n is the number of padding characters and m is the total length of the string.
Only when the underlying JS implementation does this naively. In reality JS implementations do a lot of optimizations which often can reduce the time complexity.
I didn't mean that. JS doesn't have any lower-level interface for handling memory, so such optimization has to be in the implementation. It should be quite obvious that relying on such optimization can be problematic.
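The worst case discussed above is easy to sketch. Assuming a naive engine with no rope or in-place-growth optimizations, the prepend loop does on the order of n² character copies, while building the padding once is linear:

```javascript
// Quadratic in the naive model: each prepend may copy the whole
// string, so n prepends cost roughly n^2 / 2 character copies.
function padNaive(str, len, ch) {
  while (str.length < len) {
    str = ch + str; // fresh allocation + full copy (absent rope tricks)
  }
  return str;
}

// Linear: compute the padding length once, build it with repeat(),
// and do a single concatenation.
function padLinear(str, len, ch) {
  if (str.length >= len) return str;
  return ch.repeat(len - str.length) + str;
}

console.log(padNaive('7', 5, '0'));  // "00007"
console.log(padLinear('7', 5, '0')); // "00007"
```

In real engines the loop version is often fine for small inputs precisely because of the cons-string/rope optimizations mentioned above, but the linear version doesn't depend on them.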
Really, what's the qualitative difference between reaching for a utility function that someone else already wrote within your project and reaching for a package that someone else already published within your ecosystem? They're obviously not the same thing, but are they so far apart that you can't wrap your head around wanting to treat them the same, given sufficiently advanced tooling?
The utility function just has to fit your use case, and can be easily refactored as it lives in a local context.
The package has to have a public API, meet millions of different people's use cases, and any change to the API will cause millions of man-hours of useless work... and yet if it's a poorly designed API, it might cause millions of sub-optimal programs.
Every project has a StringUtils file. But every project's StringUtils file is different.
Published packages in an ecosystem ought to be well-designed, with good performance, good APIs, good security. Packages that don't meet that bar ought to be kept out. npm is jarring because it let any old shit get published, despite there being long-standing package ecosystems in other languages with much better standards that it could have copied from (CPAN, PyPI, RubyGems, Maven Central, NuGet Gallery, etc.)
The number of distinct entities in your supply chain and whether those developers are on your payroll.
As a business, each additional human or company you add to your supply chain represents additional risk that you're taking on. You can go some ways towards mitigating those risks—one of the most common is to sign a contract with them rather than doing business ad hoc—but the risk doesn't go away entirely. Given that additional risk for each additional downstream supplier, it's generally safer to use code written by someone who's already on your payroll than it is to use code written by someone you've never met and have no way of vetting.
I'm pretty pessimistic about AI in general, but the quality of web search results has gone down so much that I've resorted to asking an AI for the short answer or the starting point that Google would have given me just a few years ago...
The biggest reason for this is reuse between libraries - if you use 10 libraries, you don't want each of them to add its own leftpad. This is especially a problem if this happens in client code and you then send duplicate code to the browser.
Small addendum: some traditional wooden joinery is deliberately prepared to account for the varying rates and effects of drying across the timber.
This is particularly relevant in timberframing, where you want to work with the wood when it is as green as possible. Green pine, though heavier to lug around, is significantly more receptive to a chisel than drier lumber. In a classic mortise and tenon joint [0], it's common to leave the outer edge of the shoulder slightly raised from the inner edge to account for the natural warping as the exterior of the beam dries more aggressively.
Although it's more outside my area of experience, I believe fine carpentry also has a few techniques that see a higher frequency of use in areas that enjoy seasonal swings in humidity. The split-tenon is the only one that comes to mind, but, now that I think of it, I realize my mental model isn't great. More surface area to account for seasonal swelling / shrinkage? Maybe someone else can chime with a better explanation of this one.
Timber framing uses dry wood as well, with slightly different techniques, but in the softwoods and some of the hardwoods it is not all that much harder to work dry than green, and in some ways easier. It depends on the tradition and location as to the exact process and technique; some preferred dry timbers, some green, some something in between.
In US farm country it was common to fell the trees in late fall/early winter after the harvest was all taken care of and then leave the trees where they dropped until the ground froze. After the ground froze you haul them to the build site, much easier to drag logs on hard frozen ground than on soft wet ground. Then you would forget about them until after the spring planting is taken care of and build in the summer. Those big timbers would be far from dry but they will have lost a fair amount of weight and will be more stable which makes everything easier.
I can only speak to my own experience of doing this professionally in northern climes without power tools for ~5 years, but both of your suggestions are foreign to me. I take this as a nice reminder that there is lots of regional variation to this craft around the world, which isn't surprising.
Even then, building a barn with dried pine or hemlock is much more tedious and incurs many more trips to the sharpening wheel. It is in no way easier.
The joints used in dried wood are dictated by the operations that are easier to do in dry wood, not by what the wood will do as it dries. With dried wood you get to use a saw with considerably less kerf and a thinner plate, and augers can be more aggressive and take better advantage of the lead screw and spurs. Chisel work will be a bit slower when chopping across the grain, but not harder; if it incurs many more trips to the sharpening stone, you are most likely trying to chop that mortise as you would in green wood.
I read a biography of the earthmoving equipment maker R.G. Le Tourneau, and it was really eye-opening how much this was a thing before mechanized equipment was readily available. A lot of moving was put off until winter because it was so much easier to drag logs, boulders, buildings, etc. over ice than over thawed ground.
Or waiting for things to freeze real good so you can dig the kind of hole or trench that would make HN clutch its pearls, or to simplify de-watering problems.
I’ve never tried to dig in frozen ground - isn’t that going to require blasting equipment or techniques closer to mining? (Heavy pneumatic jackhammers)
Generally you just build a fire on the ground you want to dig up, possibly throwing in some good-sized stones to hold the heat longer. If the frost is deep, you might turn it all over once the flames have died down and bury those coals and stones so their heat is more contained and not just going up into the air, or have a second fire after you have dug out the thawed soil.
Green woodworking is an entire field of its own. Not very common in industrial scale but it was a common method a few centuries ago.
Examples of things where green woodworking is common: spoon carving, bowl turning, chair making, etc.
The idea is that wood is worked while green to make 80% finished blanks, which are dried slowly for some months or years before finishing the rest of it. This gives less distortion to the shape as it dries. And the drying times are faster because it's all small pieces at that point. The time from tree to product is shorter.
It is an almost extinct craft but it is a lot of fun for woodworkers not under schedule pressure.
I just finished a green wood post-and-rung chairmaking class last week. The posts are split out and steam-bent, while the rungs are dried in a makeshift kiln (a box with a heat lamp). The posts are then above ambient humidity, while the rungs are dried below it. As the entire chair equals out, the posts will dry out and compress onto the tenons of the rungs, which will swell up a bit and lock in place. We did use glue but you don't really need to. Neat stuff.
Cool. I've also built a bar stool with green wood but it's a fairly crude shop stool rather than a fine chair.
A green wood specialty in my neck of the woods is sauna ladles (used for throwing water). You can buy wooden ones, but they are made from seasoned lumber with CNC machines and don't survive more than a year before they crack. The one I made from green wood is still going strong after 7 years in an extreme humidity and temperature environment.
It's absolutely routine for hobby and artisan turners and carvers, though. In between the first turn and the second turn, you can air dry, kiln dry, or use other techniques. With air drying, you actually want to slow the drying so that it happens more evenly; otherwise, the outside of a vessel dries faster than the inside, which splits the wood. In general, packing a vessel inside and out with wood shavings helps even out the process.
I've also had great results using silica gel on smaller items, although it can be hard to scale it to larger vessels. Much faster drying than air alone, with greatly reduced distortion and cracking.
What's particularly wild about the choice to tax software development in this way is that it assumes that code is always an asset. For companies that are pre-product-market-fit, it's often a liability!
Interesting perspective. Firms actually have to evaluate each year whether it is really an asset. If they determine that the product is no longer useful, they write off the remaining balance immediately.