And contact lenses too. An HBS case study I remember from grad school:
"Optical Distortion, Inc"
A new product, contact lenses for chickens, is to be introduced by a small firm formed to market the product. An entry strategy must be planned including price, sales force size, and location. Allows data for computation of economic benefit to farmers. Includes state-by-state chicken population data for planning a rollout sales program.
But some ideas cannot be crushed by bankruptcy and the dream of providing lenses to all of America’s hens was carried on by the son of one of Vision Control Inc.’s founders, a young Mr. Randall Wise. Wise, a Harvard Business school graduate and former nautical shipping consultant, used the millions he made from selling his software company to establish Animalens, Inc.
Instead of pecking at each other (success!), the hens were now pecking at the air, rubbing their eyes repeatedly on their wings, and suffering from corneal ulcers and ruptured eyes.
Thanks for reminding me that I haven't been to the Lost Art Press site in a while. I have a few of their books; they are fantastic. If you are interested in quality woodworking tools and high-quality woodworking books (both in terms of content as well as binding and paper), check them out.
I hadn't seen their "Build a Chair from Bullshit" book before. I've always been a little intimidated at the thought of building a chair, but this one looks like it's easy to build and has nice lines. Definitely more involved than the OP's chair, but it still looks approachable with basic tools and skills.
Edit: As mentioned in dlbucci's comment below, I forgot to mention that it's available for free as a downloadable PDF as well as for purchase as a high-quality bound book on their site. If you have any interest in an easy-to-build chair with an attractive design, it's worth a look.
These were my go-to guys for sciencey stocking stuffers at Christmas for my kids. Their catalog was always a joy to read, with excellent puns.
Like the guy who wrote the linked article, a GoFundMe for a for-profit enterprise rubs me the wrong way. However, I just donated because of all the great memories they've provided me and my kids. Seems like those of us that like these things may need to pitch in from time to time.
I wonder if something like this could have helped Lindsay's Publications, who went out of business a decade ago. I have so many fantastic books from them. They're really worth a HN post all on their own.
Scored a lot of Lindsay's books when you still could (lot of build-your-own regenerative receivers and similar, reprints of "The Boy Mechanic", "5 Acres and Independence", etc.).
I would suspect (hope) many have been "archived" at "the org".
This is a great recommendation, thanks! I've found a lot of interesting old radio hobbyist magazines on the Archive, whose copyright has expired. While it's great to see how people could get stuff done with the very limited electrical resources available in the 01910s and 01920s, many of the designs depend on materials that are now hard to find, while materials that are now easy to find (like PN2222s) didn't exist.
Thanks! I had no idea! Your work is one of the three or four most important projects sustaining civilization right now; it will be profoundly missed when it's gone.
You're spot on. The IBM XT was introduced immediately after I graduated and went to work. My employer's employees got a 20% discount from IBM, as IBM was a client. I bought an XT (PC + 10MB hard drive) for $4K (list price $5K). A lot of money then for a new grad. The contrast between then and now is stunning.
And software costs... this was also the era when every "application" (think accounting software: accounts payable, payroll, accounts receivable, word processing) was $495 per application. A small business could easily pay $2-5K for basic software to run their business. And then, of course, it was a nightmare to set up and use and almost impossible to pull off without a "consultant".
But VisiCalc -- it was such a game changer. A totally different way of using a PC that enabled the "ordinary" non-computer person to become an order of magnitude more productive. I think Lotus 123 was the pinnacle of the golden era of keyboard-driven spreadsheets, but it was only an incremental improvement over VisiCalc. The journey into the abyss began with Lotus Symphony.
I do like Excel and use it on occasion. But I pine for the days of lean software that did one thing exceptionally well.
> I think Lotus 123 was the pinnacle of golden era of keyboard-driven spreadsheets, but it was only an incremental improvement over VisiCalc.
They sat on their laurels with 123 v2 for much too long, and competitors surpassed them. v3 wasn't enough more to catch up.
Enable OA's spreadsheet module[1] was certainly head and shoulders above 123, offering real 3D capability (what Excel views as multiple independent sheets could be addressed directly, so you could do things like have a layer for each month of an annual report, with the topmost holding an @SUM() over the column of monthly numbers below it), a much richer set of functions, and integration with a database and word processor.
And there were others, too. I liked Lucid-3D, which wasn't really 3D but kinda fractal, where you could set up any given cell to drill into a sub-spreadsheet that was used to calculate a single value that would roll up into its parent. And Borland had a competitor, but I don't recall anything about that one.
[1] During college I had a part-time job working on this, on the testing team.
One cool thing we testers did to keep ourselves entertained was to build a spreadsheet casino. Each of us took on a given casino game to implement via spreadsheet macros. My game was blackjack, and it supported the full range of features: multiple decks, double-down, insurance, and all that.
Another guy did craps, the result of which was that we found a subtle bug in the app's random number generator. He set it up to play itself automatically and left it running overnight. When we came back in the next day, he was rich; that's not supposed to happen. He ran the test again the next night, and same thing. The bias in the RNG was causing rolls of 11 to happen more often than they should have.
I really wish multi-dimensional spreadsheets were more popular --- Lotus Improv was _amazing_, but unfortunately, Quantrix Financial is not something I could convince my employer to pay for, let alone justify for my own purchase, and sadly Flexisheet seems moribund.
Pyspread has some interesting features along those lines.
There were a number of alternative spreadsheet models trotted out over time. But things had pretty much coalesced around the 2D VisiCalc/Lotus model for mainstream users. Excel did add features like pivot tables but, basically, once Microsoft Office became dominant with Windows, that sort of cemented what an office suite looked like, at least until you added in video conferencing. So you don't, for example, really have a mainstream desktop publishing program that goes beyond the limitations of word processing offerings.
In practice, though, you need to upgrade to stay current. For software that people use as a daily driver, subscriptions are not obviously more expensive in general.
that's where we disagree: you don't usually need to stay current. as long as it does the job, it's current enough. if there are new features available that would add value to the business, then you have a business case to buy a new license. 95% of software updates haven't really added any value since the early 00s.
I'm not sure I want to work at a company that nickels and dimes purchases to the degree that I'm running unsupported 20 year old software because someone in procurement doesn't think I need an upgrade unless I write up a business case for it. I assume they're equally cheap in many other ways.
So, in other words, you need to upgrade--or have a subscription. In fact, extended support agreements for some enterprise products are a premium offering that don't require moving up to the next version given the effort associated with backporting bug fixes for a fairly small base.
I went quite a ways up the parent chain without seeing anyone seriously proposing that. All executables become unusable eventually (except maybe for IBM mainframes), so "pay once and you're done forever" is inherently impossible.
Unless you keep running every part of the hardware / software stack, which works only until some piece of hardware wears out.
> $495 per application.
it was a lifetime licence though. you can probably still run it today.
Also:
that's where we disagree: you don't usually need to stay current. as long as it does the job, it's current enough. if there are new features available that would add value to the business, then you have a business case to buy a new license. 95% of software updates haven't really added any value since the early 00s.
-----
Maybe I misinterpreted but the implication is I could run something from the early 00s without changes which--while true in some cases--I wouldn't do in general.
> Maybe I misinterpreted but the implication is I could run something from the early 00s without changes
Yes, you did. If he meant that, he wasn't thinking. Some piece of your hardware won't work with the old software, unless it's also old. You get a new printer and there are no drivers for the old OS. The old software won't work with the new network. You want some new app and it doesn't work with the old software. Etc. Etc.
We ran into this in Google Patent Litigation all the time. You have to have old everything to run old software, whether your license is still good or not.
But I think we agree that paying for maintenance only, and no new features, is fair. Of course, most vendors don't want to do that; they want to shove new "features" down your throat.
When I worked for an enterprise Linux vendor, we definitely provided long-term support/maintenance options followed by security patches. It was mostly for government-related customers, but also for some companies, especially those that were using it for embedded applications. But we also had regular meetings with those customers, who certainly weren't just installing the software and not touching it for 10+ years.
I've also known customers with old software, e.g. for test systems, running on old hardware that they basically don't breathe on. For years, United's entertainment system would also sometimes reboot to a 20+ year-old pre-Red Hat Enterprise Linux kernel, but that's not really a critical system.
There were these two clean-cut young men in short-sleeved white shirts and ties who came to my door and handed me a pamphlet. I think it was about that.
and that's why we ended up with agile and alpha crapware released every week, breaking functionality that used to work and moving everything useful around until you can't find it.
Zuar | Sr. Software Developer (Python) | Austin, TX or Remote (US only) | Full Time
We're a rapidly growing profitable startup in the ETL space. Our seven person development team is hiring a Senior (8+ years) Python Developer (backend) to help grow Mitto, our flagship product. Your knowledge and experience will be put to use helping set product direction, define architecture/features, implement new features, and improve deployments and scalability. Our products are large and complex. Our team is small, but agile and highly productive. Developers with initiative, broad technical interests, and inquisitiveness will thrive in our environment.
One of my favorites is "Zippy the pinhead" quotes (M-x yow). I believe this has been removed due to copyright issues, but it is still available if you look. Zippy's quotes could be sent to the doctor via M-x psychoanalyze-pinhead.
Not the OP, but my experience with Lyme 25+ years ago in New Jersey was similar. I took oral and IV antibiotics over multiple years. The antibiotics I recall taking are Claforan, Ceftin, Augmentin, Penicillin, Ampicillin, Rocephin, Doxycycline, Azithromycin.
While my symptoms would often improve during/after a course of IV antibiotics, they would always return. I don't ever recall thinking that one of them was what cured me. It was more like the symptoms gradually tapered off over years. I do, however, credit my MD's willingness to aggressively treat the symptoms with my eventual recovery.
Interestingly, both my MD and his wife (also an MD) had Lyme, as did at least one of their children. My advice to people who think they have Lyme is to seek out a physician that has experience in treating Lyme and a willingness to do so. At the time, there was a lot of pressure in the medical community to simply give two weeks of oral doxycycline and then tell the patient that they have to live with whatever remains.
These are the two that sound familiar to me. Doxycycline in particular.
The bite was ~3 years ago for me. I still have the mark on my ankle. It was red and raised until about 6 months ago. Now it just looks like a faint bit of scar tissue.
> I took oral and IV antibiotics over multiple years.
This is not a standard or reasonable treatment for Lyme. Antibiotics will kill it in a standard course. It sucks that you still had issues afterwards. Did they do additional blood tests?
In the Texas Hill Country bluebirds used to be common. I'm not claiming that their decline is due to house sparrows, but it's clear that house sparrows can be quite harmful to many cavity-dwelling native birds. They can be extremely destructive; it is not uncommon for them to kill both parents and offspring to take over a nest or nesting spot. One of many resources on the topic:
Nor did I. I've been using gimp a few times a year for 15+ years -- i.e., never enough to build much muscle memory. Your suggestions will make a huge difference for me.
I was first exposed to Awk when I started work at Bell Labs in the late 80s. Until then, I'd been using either Lisp or C exclusively and was really blown away by how simple some things were in Awk. I used it with impunity to munge all sorts of data for input into fault prediction tools I was working on. Speed was never an issue for me, so I never explored the potential improvements offered by 'awkcc'. Although perl was becoming the new hotness at that time, Awk remained my go-to tool for many years.
If you are interested in learning Awk, I highly recommend "The AWK Programming Language" by Aho, Kernighan, and Weinberger. It's about the same size as the original "The C Programming Language" and is equally well-written. Previously on HN: https://news.ycombinator.com/item?id=13451454
>If you are interested in learning Awk, I highly recommend "The AWK Programming Language" by Aho, Kernighan, and Weinberger. It's about the same size as the original "The C Programming Language" and is equally well-written.
I'm a big fan of small utilities :) - as I sometimes say in my email sig; but more importantly, I'm a big fan of Kernighan et al, where by "et al" I mean the others from the core early Unix days, such as Dennis Ritchie, Rob Pike, Ken Thompson and many unnamed others, from whom I (and tons of others) learned about the Unix command-line (tools), the shell (scripting), and the Unix philosophy [1].
Had written this just a few weeks ago on HN, in the thread titled "Technical Writing: Learning from Kernighan", but worth repeating here in the context of this thread:
"If someone has already solved a problem once, don't let pride or politics suck you into solving it a second time rather than re-using."
I'd strongly argue it's overzealous. As much as I agree that "reinventing the wheel" is dangerous, tempting, and can quickly spiral into yak shaving, Unix itself, and all the good it brought, is a prime example of "solving [a problem] a second time" after Multics.
In other words, I'd restate it in Sage Speak™: "Don't do this. Except when you need to." ;P Or, just want to have fun :P
>But Unix itself, and all the good it brought, is a prime example of "solving [a problem] a second time" after Multics.
Not sure about that. I mean, I know it came after Multics and was inspired by it (due to some of the early Unix people having worked on Multics), including that the name was originally Unics (I've heard, as a word play on Multics, because it was originally written by one person or was originally a single-user OS, maybe), but I am not so sure that all the good it brought was from Multics. Likely Unix brought some new stuff too. Others who know better may be able to say more.
>In other words, I'd restate it in Sage Speak™: "Don't do this. Except when you need to." ;P Or, just want to have fun :P
Good one. A bit Zennish :) Check out one of ESR's other compilations, the Unix Koans of Master Foo, if not seen already ...
Wasn't meaning that the good was from Multics. Just that Unix was after Multics, "solving [a similar problem] a second time". In fact, that Unix brought extra good doing this actually strengthens the thesis that reinventing the wheel may bear good fruit :)
I sort of agree. I didn't quote those sections in the sense of recommending that the advice in them be followed strictly, and to the letter. Also, ESR is known to talk a bit that way - sort of overzealous, as you put it. But that is part of the fun of reading his stuff. As long as one takes it with a pinch of salt, and common sense, it is okay, and one usually gets to learn something from his writings.
The awk book is a lot of fun. I just finished reading it after seeing it recommended here a few weeks ago. The highlights for me:
1. A simple interpreter for an awk-like language called qawk. qawk is like awk except that it allows for querying by field name rather than field number. For instance, it allows doing
{ print $country, $population, $capital }
instead of the more cryptic
{ print $1, $3, $5 }
2. An awk program that takes another awk program (in their example, a sorting algorithm) and outputs a version of that program modified to include profiling statements and an END section that outputs the results of those profiling statements to some file; then, another awk program that reads the data in that file and inserts that data back into the original awk program, thereby approximating where the hotspots are.
There's a lot more in the book besides these, but to me these are the coolest programs because they are the awk-iest, by which I mean that they loop over lines of input, split the fields of those lines, and then manipulate the fields. Some of the programs in the book don't do this; instead, they consist of a single large BEGIN block with typical for-loops, arrays, etc. Used in this way, awk is just yet another dynamic language.
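That pattern-action style is easy to show in a few lines. A minimal sketch (the data layout here, country / area / population / capital in whitespace-separated columns, is made up for the demo):

```shell
# The awk-iest style: an optional pattern selects lines, the action
# manipulates the auto-split fields. Here: print capital and population
# for rows whose population field ends in "M".
awk '$3 ~ /M$/ { print $4 ": " $3 }' <<'EOF'
France 551695 68M Paris
Japan 377975 125M Tokyo
EOF
# prints:
# Paris: 68M
# Tokyo: 125M
```

No explicit read loop, no explicit split: awk supplies both, which is exactly what makes these programs so short.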
Thank you, I remember wanting to follow up on these more abstract constructions in the book. They seemed to be leading me somewhere amazing and very computer science-y. Programs that take programs as input and generate new code to do that thing I wanted with some data files — I’m sure this will be useful if I put the time in.
Am I right that qawk was included as a program in the text? Did they ever follow up with further uses?
Yeah, the code is all in the book, and it works! Here's the main body of the qawk interpreter:
BEGIN { readrel("relfile") }
/./ { doquery($0) }
where
- relfile is a file containing the field attributes used in various database files,
- readrel is a function that parses the relfile and stores the fields in a dictionary, and
- doquery is a function that takes a qawk query, converts it to an awk query by replacing each field name with its corresponding field number, and then executes the resulting awk command.
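The name-to-number rewriting at the heart of doquery can be sketched in a few lines of awk itself (the relfile contents and field names here are made up, and this skips the real qawk's parsing of multi-table relfiles):

```shell
# Hypothetical relfile: field name, then its column number.
cat > relfile <<'EOF'
country 1
population 3
capital 5
EOF

query='{ print $country, $capital }'

# First input (relfile) fills the lookup table; the second input (the
# query, on stdin) has each $name replaced by its column number.
awkprog=$(printf '%s\n' "$query" |
  awk 'NR == FNR { num[$1] = $2; next }
       { for (f in num) gsub("\\$" f, "$" num[f]); print }' relfile -)

echo "$awkprog"
# → { print $1, $5 }
```

The rewritten program is then just ordinary awk, ready to be run against the data file.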
Perl was basically written because Larry Wall found Awk's syntax to be a little too cryptic. In the language design business this is what we refer to as baby steps.
Also, Awk isn't great for making reports, which is why Perl 5 to this day has an awkward report creation system[1] that looks like some COBOL refugee instead of idiomatic perl code.
From the link: "The lone dot that ends a format can also prematurely end a mail message passing through a misconfigured Internet mailer (and based on experience, such misconfiguration is the rule, not the exception). So when sending format code through mail, you should indent it so that the format-ending dot is not on the left margin; this will prevent SMTP cutoff."
If you're of a certain age and read that and grimace as you immediately understand why this would happen, it does rather put into perspective the misery of dealing with, say, Webpack configuration.
> From the link: "The lone dot that ends a format can also prematurely end a mail message passing through a misconfigured Internet mailer (and based on experience, such misconfiguration is the rule, not the exception). So when sending format code through mail, you should indent it so that the format-ending dot is not on the left margin; this will prevent SMTP cutoff."
Did email clients not handle "dot stuffing" back then? That is, if a line begins with a single dot, the client would automatically insert another dot right before it. Then, at the receiving end, the client would remove the extra dot at the beginning of the line.
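For reference, the dot-stuffing transform itself (specified in RFC 5321 §4.5.2, "Transparency") is trivial; the problem the Perl docs describe is mailers that skipped it. A sketch with sed:

```shell
# Sender side: prepend an extra dot to any line that starts with a dot,
# so a lone "." in the body can't be read as SMTP end-of-data.
body='a format line
.
more text'
stuffed=$(printf '%s\n' "$body" | sed 's/^\./../')

# Receiver side: strip one leading dot from such lines, recovering the
# original body byte for byte.
unstuffed=$(printf '%s\n' "$stuffed" | sed 's/^\.//')
```

A mailer that omits the sender-side step is exactly the "misconfigured Internet mailer" the quoted passage warns about.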
> Perl was basically written because Larry Wall found Awk's syntax to be a little too cryptic.
That seems ridiculous; where is it substantiated?
When Wall posted Perl to comp.sources.unix for the first time, he wrote "If you have a problem that would ordinarily use sed or awk or sh, but it exceeds their capabilities or must run a little faster, and you don't want to write the silly thing in C, then perl may be for you."
Or rather, not Larry Wall, but the apparent newsgroup moderator added that text, lifting it from the Perl manual page.
Thus he was pitching it as something that performs faster than awk and sed, with a greater range of capabilities.
On pp. 381-382, my copy of Programming Perl (1992 printing of first edition) says he was trying to build a configuration management system for 6 Vax and 6 Sun machines, and he needed to solve some problems like file replication across a 1200 baud link and approvals. So he installed B-news, the Usenet news software at the time. Then he was asked to generate some reports and:
> News was maintained in separate files on a master machine, with lots of cross references between files. Larry's first thought was "Let's use awk." Unfortunately, awk couldn't handle opening and closing of multiple files based on information in the files. Larry didn't want to have to code a special-purpose tool. As a result, a new language was born.
So that's why it's the Practical Extraction and Reporting Language. He wanted to extract data from files and generate reports.
I found awk after using perl (perl 4 was the new hotness).
Before long I was trying to figure out just what in my daily work perl was supposed to be protecting me from.
>IIRC there's no method of putting headers or footers on pages, nor page number without manually counting the lines yourself.
Good point. The BEGIN and END patterns do work for global headers and footers, totals, etc., but not for per-page stuff. You can do it yourself with some extra awk code, but yes, you have to write it. Not difficult, though. I guess it was not designed to be a Crystal Reports-like reporting tool, with report bands and what not.
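A sketch of the do-it-yourself version: count lines and emit a header every N lines (the page length of 3 and the input are just for the demo):

```shell
# Per-page headers by hand: awk has no pagination, so track the line
# count yourself and print a header at the top of each "page".
seq 1 5 | awk -v page=3 '
  (NR - 1) % page == 0 { printf "---- page %d ----\n", (NR - 1) / page + 1 }
  { print }'
# prints:
# ---- page 1 ----
# 1
# 2
# 3
# ---- page 2 ----
# 4
# 5
```

Footers work the same way, triggered when `NR % page == 0` and once more in an END block for the final partial page.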
>I could be wrong though. Awk is one of those tools like vi where you can use it for years and still be discovering new features.
Agreed :) Not only new features, even new uses for existing features, because, although it is a sort of DSL or little language (but a programmable one), the area it is applicable to, pattern matching and data processing of many kinds, is vast.
As many others pointed out in this thread, the fact that reading input is built into it (whether from standard input or files given as command-line arguments) saves you a bit of boilerplate code each time (cumulatively) you write an awk program using that feature. So does the pattern-action model, with its two defaults for a missing pattern or action (match all lines, or print). And again as others have said, Perl, Ruby, etc. have that feature too (the first one).
Aho taught the programming languages course at my Uni, and he loved to tell stories about Bell Labs, and this Brian guy in particular. ‘Let me tell you, I’d walk into Brian’s office and say, Brian, you really messed that language up...’ The entire semester I was like who the f is this Brian guy? Years later I was like oh shit, Brian is Brian Kernighan, and the language is C! And I realized I missed a freaking awesome opportunity to ask Al Aho about some serious heads in CS.
I've heard about _The Awk Programming Language_ but could get my hands only on _The GNU Awk User's Guide_ by Arnold Robbins, since it's available as Info pages on my Linux box. It's also free online
I'm not a coder, and awk was my first more serious or more complete effort to learn a programming language. What I did notice afterwards was that the syntax and semantics of C also became a lot clearer to me.
I think Kernighan also stated somewhere that awk was (also) designed as a helper-tool to learn C.
All in all, I think it's a great first language, even if you're initially more compelled by Lisp family languages. If you have time to learn only one language, then awk is not a bad choice; it'll open many doors in the Unix/KISS world.
I can also state that "The Awk Programming Language" is, among other things, an excellent introduction to computer science or "the way programmers think" in general. A remarkably well-written book for a general audience.
https://www.hbs.edu/faculty/Pages/item.aspx?num=17120