
The other article fails to mention the building vs land+building valuation. I posted it over there, but thought this was a better article on its own.


Right, I saw your comment when I went to add this URL to that thread myself.

And yes, it's arguably the better (and non-paywalled) source. You can email mods if you think it's a preferable URL.


Here's a better article covering this situation: https://wolfstreet.com/2024/08/01/values-of-old-office-tower...

Unsurprising that a building's value goes down over time while the land on which it resides appreciates. The 97% discount is a comparison of the building price versus the building + land price, so the 97% isn't very relevant.


Still pretty surprising. Buildings often go up in value over time, and it's often the building that appreciates more than a vacant lot. NYC is a special case, however, because land is so scarce in Manhattan. It's also frequently the case in NYC that one party owns the land and another owns the building.


The building underwent a $76M renovation in 2021, so this is absolutely a huge loss in value.


Insane that the building gets depreciated on the balance sheet.


What is insane about that? Nothing lasts forever.


nah, it's super normal in RE and the source of a lot of tax advantages


Sadly, it's not always that simple. If one of your well-pinned packages has a build dependency which is not correctly pinned, then you're subject to the problem.


You raise a great point. I believe cached builds, say with Nix, are the way to go. What do you do?



When Perl came out we were living in horrific times. You had the choice of Bourne, C, or Korn shell. Automation was glued together in one of these from a series of grep, awk, sed, ls, and test commands. Anything more complicated was written in C and called from one of these things.

Perl in one stroke collapsed the programming of C, text manipulation, the capabilities of all of the Unix utilities, and data structures into one system. Anything which wasn't subsumed into the monolith of Perl you could easily access via backticks. It was very friendly in dealing with text streams, and that's what those call-outs in those backticks spoke.

Yes, awk and sed were replaced by Perl, but more importantly, the unmaintainable nightmare that glued all of it together was wiped out.
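To make the collapse concrete, here's a hypothetical sketch (file name and data are mine, not from the thread) of the same job as a classic tool pipeline and as a Perl one-liner:

```shell
# Hypothetical example: extract login names with UID >= 1000 from a
# passwd-style file. Field layout: name:pw:uid:gid:gecos:home:shell
cat > /tmp/demo_passwd <<'EOF'
root:x:0:0:root:/root:/bin/bash
alice:x:1000:1000:Alice:/home/alice:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
bob:x:1001:1001:Bob:/home/bob:/bin/sh
EOF

# The pre-Perl way: single-purpose tools in a pipeline.
awk -F: '$3 >= 1000 { print $1 }' /tmp/demo_passwd | sort

# The Perl way (guarded, since perl may not be installed): one process,
# with real data structures on hand the moment the job grows.
if command -v perl >/dev/null 2>&1; then
  perl -F: -lane 'print $F[0] if $F[2] >= 1000' /tmp/demo_passwd | sort
fi
```

Both print the same two names; the difference only shows once the job needs a hash or a second pass, which is where the pipeline version starts sprouting temp files.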


> Automation was glued together in one of these from a series of grep, awk, sed, ls, and test commands. Anything more complicated was written in C and called from one of these things.

This doesn't sound that horrific to me. It's the classic Unix approach of building small tools that do one thing well, and composing them in novel ways to solve problems. For any problem that can't be solved this way you write another small tool using your programming language of choice. Rinse and repeat.

But occasionally Unix attracts users and programmers who reject this approach, and who prefer building a monolithic tool, or in the case of Larry Wall, new programming languages. To be clear, I'm a fan of Perl and think it has its place, especially in the era it came out. It inspired many modern languages, and its impact is undeniable.

Personally, I find solutions you refer to as "unmaintainable nightmare" to be simple and elegant, if used correctly. No, you probably shouldn't abuse shell scripts to build a complex system, and beyond a certain level of complexity, a programming language is the better tool. But for most simple processing pipelines, Unix tools are perfectly capable and can be used to build maintainable solutions.

The classic Knuth-McIlroy bout[1] comes to mind. Would you rather maintain Knuth's solution or McIlroy's?

[1]: https://matt-rickard.com/instinct-and-culture


I don’t think you’ve seen the kind of scripts the person you are responding to is talking about.

I have, and mentioned one lower down in the comments. The Unix philosophy was great, but it does not scale well in terms of maintainability or efficiency. Invoking processes over and over again in loops is godawfully slow. And the horror of complicated shell scripts is legendary.


As a self-taught coder, I've experienced many times how highly skilled software engineers groan and sweat when they encounter shell scripts. I don't understand why, but it seems like people with a CS background are never really taught shell scripts and have come to irrationally fear them. It's sort of taboo.

This results in weird behavior, such as writing a Groovy (Java?) script for Jenkins to execute bazel in order to build a Go binary that runs the very same commands in an exec.Command() construct. Or people who download and import pandas to grab the third field in a CSV file.

During the course of learning, I've naturally written code in bash that should have been written in another language. I replaced if statements with case because they turned out to be more performant. It's a great learning experience and why I got into python and go.

IMO we should use the right tool for the job. Sometimes that tool is a combination of unix utilities that you can put in a shell script for easier maintenance. It's just procedural execution of (usually very efficient) binaries, akin to a jenkins script or gitlab pipeline. Just mind the exceptions and use exit codes.
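For instance (a made-up file, and assuming the CSV contains no quoted or escaped commas, which is exactly the edge case where a real parser earns its keep):

```shell
# Grabbing the third field of a *simple* CSV -- no quoted commas --
# needs nothing heavier than cut or awk.
printf 'id,name,city\n1,alice,london\n2,bob,paris\n' > /tmp/demo.csv

cut -d, -f3 /tmp/demo.csv                    # prints city, london, paris
awk -F, 'NR > 1 { print $3 }' /tmp/demo.csv  # same, skipping the header
```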


I'm the type of person that would grab pandas to parse a CSV. Here's my reasoning

* often times, it's not just the third column I want. Sometimes it becomes "third column unless the first column is 'b' then instead grab the fourth column". Having a good data representation makes sure that I'm not mixing logic code with representation parsing code

* I don't have to care about CSV parsing edge cases. Escaped comma? Quotes? I don't care, the library will either handle it or throw an explicit error. With custom parsing code, instead of an error, I'll get some mangled result in the middle of the file that I won't even catch / notice until later down the line

* when working with CSVs, in my area (ML / scientific compute), Python is often the right context to be in.


> I've experienced many times how highly skilled software engineers groan and sweat when they encounter shell scripts. I don't understand why, but it seems like people with a CS background are never really taught shell scripts and have come to irrationally fear them. It's sort of taboo.

It's not a lack of being "taught" shell scripts. It's the fact that shell programming constructs aren't well documented, your "standard library" is basically dependent on whatever binaries happen to be available on the filesystem, error handling is almost non-existent, etc.

It's very easy to write a bad shell script that "solves" a problem as long as a bunch of assumptions aren't violated. In my experience, senior software engineers are extremely averse to hidden assumptions and very concerned with reliability of the systems they build.


Yea... "Well the script works fine in MY SYSTEM" was the most common issue with said scripts. Running across different versions of Linux was fraught with issues, much less any other operating system that could execute a shell script.

Of course this can happen with any language, especially as it ages and adds complexity.


I certainly have, and might have written a few of those myself.

But this doesn't make this approach inherently wrong or obsolete. The programmer is wrong for trying to use the tools beyond their capabilities. Where that line is drawn is subjective, as is the concept of maintainability, but if you feel that you're struggling to accomplish something, and that it's becoming a chore to maintain, the path forward is choosing a more capable tool, like a programming language.


I think we are actually mostly in agreement there then.

Perl was invented because the gap from shell to more capable languages was (and is) really big. Languages like Python and Ruby didn’t exist yet, and Perl had a really, really strong sweet spot in text processing.


> Perl had a really, really strong sweet spot in text processing.

Still does.


Since Ruby took the best bits of Perl what advantage does Perl retain?


>> what advantage does Perl retain?

Ubiquity, speed, and conciseness.

Perl is usually installed by default on Linux and Unix systems. Ruby might be there, it depends.

Perl is faster than Ruby. Ruby has been one of the slower scripting languages. But Ruby has been working on performance improvements in the past few releases. I have not seen any benchmarks of the current Perl versus the current Ruby, so this may have changed.

Perl is more concise than Ruby, allowing more functionality for less code.


I don't know about conciseness. Ruby excels here. As for speed it's useful to distinguish between startup and runtime.


>> As for speed it's useful to distinguish between startup and runtime.

The JVM would like a word. (It has slow startup, but can be very fast at runtime due to JIT optimization and caching.)

Scripts should start and run quickly.

Ruby has historically been fairly slow which is why Ruby 3 focused heavily on performance. It has been improving a lot, but I have not seen any benchmarks against other programming languages.


The thing to remember here is that speed is relative. I haven't checked, but Ruby 3 is probably faster than Perl 5.0. For most scripting purposes on modern hardware Ruby is plenty fast enough. While there may be marginal speed differences between Perl, Ruby and Python, the difference is insignificant.


I used Perl then Ruby as my main language for almost a decade each. These days, I don't really write Ruby anymore; I moved on to Elixir and never looked back. But I still find myself using Perl on the command line, in contexts where Awk or Sed would also make sense. Ruby never optimized for the one-liner case IMO.


I don't understand your last point. `ruby -e` has excellent parity with `perl -e`.


Yeah, the actual experience of leaky abstractions and non-portable code is forgotten. Perl solved a very real problem in the 90s. Grief, I shudder to think back to the sheer complexity of my bashrc file back then.


Yes, exactly. I have seen entire backend systems written in bash. Everything was shell script, sed and awk. The owner didn't want python or perl because he only knew bash and the related tools.

Everything was needlessly hard because these tools were not built for that. Easy to talk about philosophy and the "classic Unix approach" if you don't have to build modern applications this way.


Horror is apt.

You hit it on the head with the slowness of loops when the body comprises a series of program invocations. The horror really seeps in when you realize the original author wasn't stopped by the lack of data structures: they could get around that with some creative variable names.
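A minimal sketch of that pattern, with invented data: the slow shape forks a grep per input line, while the fast shape lets one awk invocation do the whole pass.

```shell
seq 1 50 > /tmp/nums.txt

# The slow shape: one process fork per input line (grep, 50 times).
slow() {
  while read -r n; do
    if echo "$n" | grep -q '0$'; then
      echo "$n"
    fi
  done < /tmp/nums.txt
}

# The fast shape: a single process reads every line.
fast() { awk '/0$/' /tmp/nums.txt; }

slow
fast
```

Both print the multiples of ten; scale the input to millions of lines and the fork-per-line version is the day-long run described above.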


Programming environments, including shells and operating systems, are just tools. And every tool can be misused. I opened a can of beans with a screwdriver once. Reality is messy. That doesn’t make the tool bad.


At the time, before everything became Linux, all these tools and the shells used to glue them together were an incoherent mess. Was your glue sh, ksh, csh, tcsh, bash or something uncommon like zsh? Did your grep, awk and sed use the same regexp syntax as your text editor? Single-letter command line options, all meaning something different to each tool. Dozens of domain-specific languages (shells, awk, sed etc.) meant dozens to learn and keep in your head. And you needed it in your head, because finding the information you needed in the massive single-document man page reference was a pain; hypertext links had not been invented yet (well, except probably in Emacs, which was another tool like Perl that people used to avoid the command line nightmare).


> or something uncommon like zsh

By the time our lord and savior zsh appeared on the scene Perl was already at Perl 3. And, to be fair, I do not think many used zsh before 2.1, which came some time in fall 1991, and by then the Camel Book had been out for half a year or something like that. So the pre-Perl and the we-use-zsh days do not really overlap.

When Perl appeared the absolute hotness was the version of the Korn shell that later became known as ksh88. https://github.com/weiss/original-bsd/tree/master/local/tool...


> And you needed it in your head, because finding the information you needed in the massive single document man file reference was a pain

That's what the O'Reilly books were for, especially the Nutshell series.


I remember capturing every password at my university via "methods". Because we had a printer quota. In the summer when everyone was gone I printed out all the man pages (all the mans, the system libraries, etc) so I'd have a nice reference book. I made sure to make it so no one was charged any money.

The one thing people can't possibly fathom if they started coding after the mid-late 90s was how much we relied on the printed medium.


I still remember when we measured the documentation IBM shipped with the mainframes not in pages but in yards it occupied on the shelves. It was a lot.


Oracle used to be hard on your lower back, within the last 25 years.


The Unleashed and Bible series of books come to mind. Waite group or Sams or so, publishers.

Rather Waite-y.


Honest question-- why weren't the tools glued together or perhaps replaced entirely with the lisp inside Emacs? What was it missing?


While there are many possible answers, it eventually boils down to Unix being a C runtime and thus having a C culture. Lisp is from outside this section of the world, so it simply had less adoption and support inside Unix land. Other languages, like sed, awk, and shell, are not C but share its heritage (essentially, they were made by people close to the making of C).


Memory and CPU efficiency, since most systems were memory constrained. My first server had 128MB of RAM...


> My first server had 128MB of RAM

Whippersnappers! :D

The first big iron I had the luck to work with was an IBM 3090, essentially a gift from IBM. It handled the university entrance exams of the entire country of some ten million people and it had 64 MB of RAM. (It was also the first computer in Hungary permanently connected to the Internet, via a leased line to Austria, so it had an Austrian IP address. Hungary didn't have its own IP region for two more years.)

I think the first machine with 128MB was a VAX 6510 a year or two later at another university. A little bit later, in 1994, CERN had gifted a VAX 9000 with an astounding 256MB of RAM.

To compare, the first server I installed Linux on had a grand total of 4MB RAM -- and that was one of the largest computers a small department at the university had.

It would be a long, long time before "128MB" and "mine" entered the same sentence.


Shades of the Monty Python sketch here, but the following is true...

4MB?!?

My first encounter with IBM kit was a, er, darn I'm not sure cuz I'm getting old, but I think it was a 4300? Not big iron in some senses, but still with a box that was something like 6-8 feet long iirc and definitely several feet wide and high. (And a bank of about 6-8 tape decks, each as tall as me, and two disk units, each the size of a washing machine, and so on.)

Its RAM? A massive 1 MB.

That IBM kit was the heart of the super new expensive upgrade in 1980 that cost something like 5-10 million pounds iirc to build, including a brand new building to house it and a team of programmers.

The older setup, which is where I was until its last days, was an ICL system that was expanded at the end of its life to a whopping 48KB -- yes, KB -- of RAM.

And that kit ran all the systems, internal (payroll, accounting, etc., etc.) and external (sales etc.) for the largest car dealership in the UK.

128MB? 4MB? Even 1MB? That was an unimaginably insanely large amount of RAM!

(Yes, it was very weird to be working with this physically enormous setup, and dealing with keeping it all cool enough not to halt for a half hour or so, through superhuman efforts when the A/C broke down, when the likes of PETs, Sinclair ZX80s, and Acorn Atoms were a thing...)


> including a brand new building to house it

Ha yes, the aforementioned IBM 3090 was so big that for installation they removed the roof of the building it was living in, craned it into place, and put the roof back. Bringing it up the elevator or stairs was impossible.

Much later, in the second half of the 90s, I remember the four of us carrying an IBM HDD -- I think it was your normal 5.25" drive but it needed four people because it was mounted on a vibration dampening base ...


> Much later, in the second half of the 90s, I remember the four of us carrying an IBM HDD -- I think it was your normal 5.25" drive but it needed four people because it was mounted on a vibration dampening base ...

Continuing the shades of Monty Python theme[1]:

I remember one of my first few nights being in charge of the new IBM kit (I was a "computer operator" back then, in 1980), leaning back in the fancy new chair at the desk with its fancy "virtual" teletypes (a couple "terminals" displaying the status of the OS with a CICS system), and showing off to an "underling" by swinging a long plastic slide rule or something stupid like that (I no longer recall), and me accidentally banging it on the desk. Right "near" a recessed big red button. Or perhaps "on" the button? As I snapped my head around to look at the button and begin to understand what I may have just done I heard an ominous series of whirring and clicking sounds coming from the cpu box, right near where there was an 8" diskette drive that wasn't supposed to be doing anything while the OS was running (it was just for starting the OS). Then I looked at the console... Uhoh. They didn't fire me but it took months before they decided to let me be "in charge" again with someone else actually hovering over me...

Fast forward to when I was a coder (BCPL) in a small software startup, during the second half of the 80s, presumably 10 years before you were carrying your 5.25" drive monster, I vividly recall someone bringing a 700MB hard drive back from a local computer store. It cost an astonishingly paltry 700 quid or thereabouts. A pound a MB!

[1] https://www.youtube.com/watch?v=VKHFZBUTA4k


> I vividly recall someone bringing a 700MB hard drive back from a local computer store. It cost an astonishingly paltry 700 quid or thereabouts.

I ... do not know. That sounds very low. Look at https://jcmit.net/diskprice.htm and note the pound was at 1.8-ish around this time, so the price should have been well above 1000 pounds even in the early 90s. We are talking about a 5.25" full-height drive; here's a 1987 model http://www.bitsavers.org/pdf/maxtor/MXT8760E.pdf that was rare in personal computers, it was definitely for workstations / servers.


There's an old joke that emacs stands for Eight Megs And Constantly Swapping.


Eventually Munches All Computer Memory.

Other backronyms too.


Elisp is very much a niche language. For whatever reasons, the use of Elisp outside of Emacs is basically non-existent. Elisp is quite clunky, and AFAIK there hasn’t really been any big efforts to make it usable outside of Emacs. People who wanted Lisp outside of Emacs already had Common Lisp. (And Chez Scheme, and Scheme 48, etc etc.)


> you probably shouldn't abuse shell scripts to build a complex system, and beyond a certain level of complexity, a programming language is the better tool

but the only free programming languages available at the time were C/C++, various shells, and awk. everything else was expensive or not generally usable for other reasons. all the really useful languages to build complex systems didn't really appear or become freely available until the 90s. and perl was first among those.


I'm not saying that Perl didn't have its time and place. It certainly fulfilled a need at the time for a language more capable than shell scripts, but less cumbersome than C/C++.

But the thing is that today the shell landscape is much more mature for solving simple problems, and we have C/C++ alternatives that are saner and more capable than Perl (e.g. Go). So it arguably has lost its place, as shell tools are still in widespread use, while Perl is mostly underused. Raku is interesting, but it goes in a different direction, and its adoption is practically zero.


Python killed the game.


Unfortunately. Ruby should have been Perl's natural successor. Python is the VHS of scripting languages. For a start it doesn't have a decent answer to Perl or Ruby's one-liners. Then there's the crippled lambda implementation. Python is a sad case of worse is better.


> For a start it doesn't have a decent answer to Perl or Ruby's one-liners.

This is by design. Readability is core to the design and philosophy of python. One liners are cool and fun to write, but trying to decipher someone else's incredibly dense bash or perl one-liner is absolutely awful.


>> One liners are cool and fun to write, but trying to decipher someone else's incredibly dense bash or perl one-liner is absolutely awful.

You can write hard-to-read code in any programming language.

Python lets you do it too; the mandatory whitespace just means the awfulness spans multiple lines instead.

Really talented Python programmers can do downright demonic stuff with list comprehensions.

Python appears to be simple, but is actually quite complex. I recommend reading "Effective Python" (https://effectivepython.com/) to see beneath the surface.


The readability complaint usually comes from people who never took the time to grok the language and its idioms. At least give the user the option. Advocating a language based on what it denies you doesn't make sense. Why use a scripting language at all if belt and braces is what you're looking for?


This topic is akin to "holy wars" and since I used to think the same way (still do to some extent), I would like you to at least consider another aspect: the effort it takes to "grok the language and its idioms" is vastly different depending on the design of the language. Whether to let people do whatever they want, whatever way they want it, isn't only a question of protecting them from themselves.

Just think of C: I'd argue its design is actually more akin to Python than Perl (and it definitely inspired languages like Go and Zig, NOT languages like C++). It's a small language, and this is a very important characteristic of it: you can count on being able to actually master it, or at least comprehend it very well. Other effects of a simple and literally straightforward language can be: easier implementation and evolution, less mental load on the developer, easier portability among developers (both for general knowledge and actual code), etc. I'm not saying that C is the way it is for all these reasons, but I wouldn't overlook this factor, and I do think that languages like Python are deliberately building on these advantages.

Now, I don't dislike C++ at all but back when I studied it at university, I noticed that it was the first language for me that needed to be actually studied, unlike Pascal, C, Python and "oldschool" JS. Ever since, the only languages where I felt the same were Prolog (mostly because it requires a different mindset; other than that, it didn't seem bloated) and Raku. Not C#, not Java, not Erlang. I didn't really have to touch Perl but from all I know, Raku started off as a fresh take on the Perl approach. It seems somewhat more organized but huge nevertheless, to the extent that there literally isn't one person who really "groks the language" all around. In the case of Raku, I wouldn't even say it encourages you to write unreadable code (especially if you have a thing for APL look-alikes, lol) - it's just so rich that there is a good chance you will come across something in someone else's code you have never used before and don't quite remember how it will act in your specific use case.

There are different types of freedom than "do whatever you want". Like, the freedom to feel safe and confident about code. These days, humanity has aggregated an immense amount of knowledge and technology and "I will do it all by myself" is not that much of an option. And even people with such puritanistic tendencies will choose simple and straightforward tools, even if not for the "limitations".


There have always been feature-rich and bare-bones languages. No point pitting one against the other. I don't consider myself particularly clever but with the help of "Programming Perl" and "Learning Perl" I managed to get a pretty good grasp of the language with no prior programming experience. I just feel a lot of the knee-jerk response to the mention of Perl comes from people who have never put any effort into learning it.


Well, I'm saying that "you just didn't bother to learn it, duh" is not an equally valid argument for different languages; it's much less valid for Perl than it would be for Python. Perl rewards your efforts much less. It is notoriously a language that drew desperate criticism even among its users by the mid-90s ($[ and $] stuff comes to mind), quickly led to the creation of Ruby, and famously "forked itself" with what is now known as Raku.

Now I have barely spent any time with the oldschool Perl but trust me, I have put a lot of effort into learning Raku, the language that was meant to fix Perl. Whenever something that "seemed like a good idea at first but it's actually harmful" shows up, it's usually Perl's legacy. I'm thinking of things like the conceptual mishmash between a single-element list and a scalar value (or in general, trying hard to break down variables arbitrarily into list-alikes, hash-alikes and the rest of the world), the concept of values that try to implicitly pretend they are strings and numbers at will, or the transparency of all subroutines to loop control statements which is some next level spaghetti design. If you ever actually use something like this, you introduce a brand new level of complexity, somewhere inbetween a "goto" and a "comefrom", so I would really think about if this was worth learning at all.

Oh right... from what I remember, it was also Perl that fostered this idiotic idea that a name of a concrete thing could be overloaded to be a namespace as well, so a concrete Foo::Bar could very well be something that has no logical relation to a concrete Foo. Moreover, I'm quite sure Perl invented this nonsensical distribution-module dichotomy where you are supposed to depend on modules, despite the smallest publishable and installable unit being a distribution. There are three outcomes with that:

- if the distribution contains only one module: what was the point of drawing the distinction?

- if the distribution contains tightly coupled modules: you can pretend to only depend on one of the modules, but in fact you are depending on the whole distribution together

- if the distribution is a collection of unrelated modules: why are you trying to couple the metadata when this will make the versioning meaningless?

I can only hope that it's somehow better than Raku but the whole principle is just an anomaly.

And you know, then these people move around in the world, pretending that all of this is just normal and you just have to learn it. Well guess what, there is a reason people might want to put that effort into something else.


> This doesn't sound that horrific to me. It's the classic Unix approach of building small tools that do one thing well, and composing them in novel ways to solve problems.

This works really well if your problem can be solved in one or two liners.

It goes bad very quickly when, say, you have two CSV files and want to join them the sql-way. In sed, you have to use positional variables and think about shell escaping. In perl, you can at least name those variables and use \Q


> This works really well if your problem can be solved in one or two liners.

My personal comfort threshold is around the 100-line mark. It's even possible to write maintainable shell scripts up to 500 lines, but it mostly depends on the problem you're trying to solve, and the discipline of the programmer to follow best practices (use sane defaults, ShellCheck, etc.).

> It goes bad very quickly when, say, you have two CSV files and want to join them the sql-way.

In that case we're talking about structured data, and, yeah, Perl or Python would be easier to work with. That said, depending on the complexity of the CSV, you can still go a long way with plain Bash with IFS/read(1) or tr(1) to split CSV columns. This wouldn't be very robust, but there are tools that handle CSV specifically[1], which can be composed in a shell script just fine.

So it's always a balancing act of being productive quickly with a shell script, or reaching out for a programming language once the tools aren't a good fit, or maintenance becomes an issue.

[1]: https://miller.readthedocs.io/
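As a sketch of the IFS/read approach mentioned above (an invented file, and again assuming no quoted commas):

```shell
# Split each CSV row on commas with IFS + read; any surplus fields
# land in the final variable ("rest").
printf 'name,dept,room\nalice,eng,101\nbob,ops,202\n' > /tmp/people.csv

tail -n +2 /tmp/people.csv | while IFS=, read -r name dept rest; do
  echo "$name works in $dept"
done
```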


You’re discussing modern tooling in a conversation about early UNIX tooling. Back in the period being discussed, even ‘read’ was less functional. Ksh introduced a lot of the stuff we now take for granted, some of which wasn’t even available until ksh93 (long after Perl was released). Bash itself is a younger project than Perl, albeit not by much.


Fair point. I'm not arguing that Perl wasn't an improvement back then, but that the approach of composing Unix tools is not inherently bad. And as the shell ecosystem evolved since then, and more capable programming languages appeared, Perl has been left by the wayside as a historical relic, rather than the replacement of Unix tools that Wall envisioned.

So I don't disagree that it was needed back then, but it's important to mention the modern context it struggles to exist in.


Perl is still a commonly used tool chain. It is far from being a “historic relic”.

I agree that there’s nothing wrong with composing UNIX tools. I mean, that was one of its key selling points. If you watch any early promo videos for UNIX you’ll see them talk heavily about the composability of the command line and shell scripting. It wasn’t an accident; it was designed that way.

The point of the conversation wasn’t to say that one shouldn’t write shell scripts, it was just to say that there was a massive and unfilled gulf between what was easy to do in Ksh, awk and sed, and what could be done in C.


> It goes bad very quickly when, say, you have two CSV files and want to join them the sql-way.

Then just put them in a database and write a simple SQL query. If you use Perl it’s really very simple to do.


> It goes bad very quickly when, say, you have two CSV files and want to join them the sql-way.

That sounds like a perfect use case for `join`.
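Indeed; with invented data, here's a sketch of `join` doing an SQL-style inner join (both inputs must be sorted on the key column):

```shell
# Hypothetical files, each keyed on the first column and pre-sorted.
printf '1,alice\n2,bob\n3,carol\n' > /tmp/users.csv
printf '1,admin\n3,viewer\n'       > /tmp/roles.csv

# -t, sets the delimiter; the join field defaults to column 1, so rows
# with a matching key are merged (an inner join).
join -t, /tmp/users.csv /tmp/roles.csv
```

This prints `1,alice,admin` and `3,carol,viewer`; bob drops out, exactly as an inner join would do.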


> This doesn't sound that horrific to me.

I think it's possible that things that seem normal and inoffensive can become horrific simply from scale. You'll climb the stepladder without complaint, but then there's that radio tower in Canada...

Others like the idea of the family cow, then they see the 10,000 head feedlot from the highway.

Scale is sometimes sufficient by itself to induce horror.


csh was a decent interactive tool, but not great for scripting. Bourne shell had the right idea, but there were so many bugs in various corners of it (I still sometimes end up writing `test "x$foo" = "xbar"` even though shells that need that are long gone).

If you can depend on a recent bash and use shellcheck, then it's actually quite a pleasant programming environment, with fewer footguns than one might think. (I want a @#$@# "set -e" equivalent that returns non-zero from a function if any statement in the function results in non-zero).

There are some things that are more awkward than they should be though (e.g. given a glob, does it match 0, 1, or many files, or the way array expansions work).

Also, there's no builtin way to manage libraries (I don't know about Perl, but Python suffers from this as well). This results in me pasting a few dozen lines of shell at the top of any of my significant shell scripts, for quality-of-life functions. Then I have to use "command -v" to check if the various external programs I'm going to use are present. Say what you will about C, but a statically-linked C program can be dropped in anywhere.
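Those `command -v` checks can at least be folded into one small helper; a sketch with an invented function name:

```shell
# Fail fast if a dependency is missing, instead of dying halfway
# through the script. "require" is my own name for this, not standard.
require() {
  for cmd in "$@"; do
    command -v "$cmd" >/dev/null 2>&1 || {
      echo "missing required command: $cmd" >&2
      return 1
    }
  done
}

require sed grep && echo "deps ok"
```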


Mostly agree. The modern shell scripting environment is much more robust than 30 years ago, with ShellCheck and some sane defaults, as you say. I also find it pleasant, once you get over some of its quirks.

As for managing libraries, that's true, but you can certainly import and reuse some common util functions.

For example, this is at the top of most of my scripts:

    set -eEuxo pipefail

    _scriptdir="$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")"
    source "${_scriptdir}/lib.sh"
This loads `lib.sh` from a common directory where my shell scripts live, which has some logging and error handling functions, so it cuts down on repetition just like a programming language would.


I haven't seen -E (aka -o errtrace) before, but it looks like a useful addition to the standard-ish -euxo pipefail.

Moreover, its existence being required explains why my error handling recently wasn't working as expected.
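A minimal illustration of the difference (the function and trap message are made up): without `-E` (errtrace), an ERR trap set at the top level is not inherited by function bodies.

```shell
# Without -E: the trap does not fire for the failing command inside f.
bash -c 'trap "echo trap fired" ERR; f() { false; echo ran; }; f'
# prints: ran

# With -E: the same failure now triggers the trap inside the function.
bash -E -c 'trap "echo trap fired" ERR; f() { false; echo ran; }; f'
# prints: trap fired, then ran
```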


The effortless composition of complex commands out of simple standalone programs is one of the best features of Unix. And yes, I admire and love it as well.

That said, imagine a metrics system for a huge networking company that used these methods to cover all automated testing or defect analysis. Those inner loops were made of greps and seds and so forth, and each one is the invocation of a new program. It wasn't uncommon for these runs to take almost a day.

Besides performance, the other nightmare was what someone described below: each script was a one-off that didn't leverage the work from others. If the author only knew C shell, then you know you're going to be doing gymnastics to catch the stderr of some of those programs (you can't capture it in the same manner that Bourne variants do).

Anyway, yes, we all adore the Unix philosophy, but there are limits.


A "philosophy" I never saw in commercial UNIXes, starting with my first experience with Xenix in 1993, beyond it being endlessly repeated in FOSS circles. Which is ironic, given how GNU and BSD applications work, with their endless number of command-line parameters.


Not just the language itself but the whole ecosystem it brought with it was revolutionary. CPAN was incredible. There seemed like there was a module for just about anything! Perldoc and a testing framework were built right in. Regexes, backticks to shell out to the system, reporting built in. It really was the whole package.


Yeah CPAN is easy to take for granted now that every language has its own package manager (npm, cargo, pip), and even for C/C++ you can find what you want on GitHub. At the time though CPAN was revolutionary and no one else had it. To be able to just search for a module, find one that did what you wanted, then download it to your project was pure magic.


There is some question whether CTAN (for TeX) or CPAN was the first big library. It was pretty close. I was involved tangentially with the guys in the UK who were setting up the first pass at CTAN as the administrator of the ymir.claremont.edu archive and I remember one of the things that they came up with back then was that you could do an FTP get of any directory and get back a zip archive of its contents which was pretty fancy in the 90s. That said, both CPAN and CTAN definitely show their age in that they assume a single version of dependencies will be installed on any system which causes untold nightmares when it turns out that there’s a non-backwards compatible update to something three dependencies deep that you weren’t even aware you were using.


> There is some question whether CTAN (for TeX) or CPAN was the first big library.

There isn't "some question", there is a definite answer, namely in "Programming Perl" and I quote the 4th edition, page 629:

History

Toward the end of 1993, Tim Bunce, Jarkko Hietaniemi, and Andreas König set up the perl-packrats mailing list to discuss the idea of an archive for all the Perl 4 stuff floating around the Internet. Perl 5 development had started that year, and one of its main features would be an extensible module system that would allow people to extend the language without changing perl. Jared Rhine suggested the idea of a central repository, but nothing much happened. His idea had come from CTAN (http://www.ctan.org), the Comprehensive TeX Archive Network.

Edit: fix markup


>There is some question whether CTAN (for TeX) or CPAN was the first big library.

I'm not in this space, but the similarity in names made me wonder if these would really have started around the same time, with an unclear "dependency order", so I reached for a search engine, and according to Wikipedia, CPAN (1993) is based on CTAN (1992).


See also The Timeline of Perl and its Culture https://history.perl.org/PerlTimeline.html


>assume a single version of dependencies will be installed on any system which causes untold nightmares when it turns out that there’s a non-backwards compatible update to something three dependencies deep that you weren’t even aware you were using.

I may not completely understand what you're describing, but with perlbrew managing different Perl versions, and various CPAN clients installing modules in versioned libraries - as well as the test/smoke servers that test CPAN on tons of versions of Perl for Perl CPAN authors automatically, I don't know if I've come across what you're describing.

What's still a pita is when there's an outside dependency from the Perl ecosystem that breaks a module, but I'm not sure that's Perl's fault.


It happens more with CTAN where there is no possibility of a testing framework and there is no ability for multiple versions of an artifact to coexist.


As far as I know, https://cpantesters.org/ is still unrivaled though. None of those have the level of testing that CPAN Testers provides.


> Yes, awk and sed were replaced by Perl,

I still use awk and sed semi regularly. I haven’t used Perl in over a decade.


1980s awk and sed aren't the same as the 2020s version.

They've become much more useful because of the influence of things like perl.

To see what I mean, here are SunOS 4.1.1's man pages for awk and sed, as grabbed from https://github.com/ambiamber/Run-Sun3-SunOS-4.1.1/blob/main/... and run through groffer(1). These are from 1989 and 1987 respectively. The core stuff is there but not much else.

https://9ol.es/sed.1v.pdf

https://9ol.es/awk.1.pdf


Awk today is very much the same language it was in 1977 when it was introduced. Comparing your links with today's POSIX awk manpage doesn't yield a single thing Perl brought to awk. At best, while JavaScript syntax is even based on awk, you could say JavaScript regexpes are based on PCRE, and it wasn't clear whether the use of capture groups/back references makes Perl regexpes accidentally Turing-complete.


POSIX, right. POSIX sucks. POSIX anything is stuck 30 years ago. Nobody in 2023 really uses POSIX awk on some commercial UNIX like IBM AIX, they use GNU awk, that's the modern one I'm talking about https://www.gnu.org/software/gawk/manual/html_node/index.htm...


Well, I look at the POSIX standards whenever I want to write shell scripts and Makefiles that work on both Linux and macOS. Which, in my current role, is often enough that I am driven crazy by how far POSIX is behind GNU tooling...


Yep. Funny enough, Linux is actually not POSIX certified

POSIX was initially a bunch of the Unix vendors getting together in the 80s, seeing a rise of incompatibility, saying "this shit is crazy, let's find something we can agree on", and then running into the conflicts of trying to still have a differentiating but also compatible product. They knew that if they cannibalized themselves, IBM, Novell, and DEC were there to snatch up the customers.

As a result, POSIX was intentionally kneecapped by all the players but not enough to make it completely worthless.

By the 2010s I decided to simply ignore it and if something breaks on FreeBSD or whatever, make the decision then whether to support it or not. Mostly it's platform testing at the top and then subbing the gnu version of things and bailing out if it's not there.


Got it. I'm the dev tools guy at my job right now, and we have a mix of Linux and Mac users, so I've gotta make sure that any shell I write works on both. For me, POSIX seems to be a good indicator of the intersection between the two OSs, even though the true intersection is neither a superset nor subset of POSIX (eg, Mac has bash, which isn't POSIX shell). Oh well. It's close enough to be useful, and I test it all manually anyway.


Mac OS ships zsh since 10.15 as default shell as I'm sure you know, with the (restricted ksh subset called) POSIX shell pretty much the intersection of what works on both Linux and Mac OS. There you have the usefulness of POSIX/SUS.


What are you talking about? There is no POSIX awk implementation, just a language spec and nawk AKA the one true awk. As shipped on Mac OS, whereas Debian ships mawk in addition to gawk. gawk is just much slower and unfortunately also buggy (crashes on a non-"C" locale with complex regexpes).


Right, "the one true awk" corresponds to a book written in 1988, very explicitly. https://github.com/onetrueawk/awk

I was incorrect about 30 years. It's actually 35.

You were the one that said POSIX awk to begin with; I was using your terms. I understand what you meant, why don't you?

As far as shitting on the GNU tools, I don't think I've seen someone do that in decades, especially just referencing the bottom of the man page like that.

As far as speed, I don't know how you're using these tools, but if gawk is your bottleneck and not i/o, there's probably smarter ways to do things.

This is not a productive conversation. You can live life however you want, and if you're happy then I'm happy. We are definitely at an impasse. I'm off to bed.


Agreed. I would turn to Python for anything more involved that couldn’t be done very directly and efficiently using awk/sed. I would never use Perl for anything.


Same, but if there were no Python/Ruby/etc, I would probably be reaching for Perl quite a lot.


Sounds like you were also replaced by Perl.


…that you’re aware of.


> I still use awk and sed semi regularly. I haven’t used Perl in over a decade.

Same. Awk and Sed are delightful little tools that have aged exceptionally well.


They are also constrained languages which gives many advantages. POSIX regex will always execute in O(n), Perl "regex" are potentially exponential and can be a security risk in certain situations.


Well, that's at least an actual argument, even though POSIX regex would have never proven so useful in a variety of use cases.

Other than that, I'm completely shocked that somebody would praise sed or awk well into the 21st century. They are Turing-complete languages barely capable of properly solving any problem that has any sort of algorithmic complexity, let alone one that requires proper data types, variables, or interacting with any sort of system. Really, you can do that in Perl (even in Raku) without your couple-of-liner horribly breaking down once you need some custom logic in it. Oh, and it will even resemble a programming language, not some hieroglyphs left behind by aliens.


What is the reason for that? I think they should be long gone for the lack of proper data structures, if nothing else.


Ignore the haters - they're too young to remember.

I remember different CLI tools working differently on SysV and other variants. I remember AIX and HPUX.

Perl meant one thing that just goddamned worked mostly the same way. That and vi and emacs. :-)


Back in the 90s I got my first 'real' job, and needed to learn grep, awk, sed, col, et al. well. Or learn Perl. Perl kept us afloat until we had too much Perl and needed to switch to Python. And I still don't know awk or sed; to this day my fingers just go with 'perl -pie'.


The dozens of different versions of UNIX did not help either. A bug in their sed/grep/awk toolchain meant you had to write workarounds for things like "Sequoia Unix" to support its single customer. As long as the UNIX flavor had a Perl port, all of that could go away.

Perl being open source meant you were a "make" away from having the same environment on all those unixes.

Even today, when the dozens of UNIX distributions have collapsed into a couple of Linux flavours, getting a shell-based script to work reliably between, say, macOS and Red Hat can take some effort.


Yes, writing shell scripts calling awk and sed was awful, just an aesthetic nightmare to work with. Properly quoting strings was always problematic, and while Usenet was available there wasn't a plethora of information like nowadays. This was a few years before Bash was released.

Through my use of Usenet I learned that a better reader program was available called 'rn', which I downloaded and built. It had an amazing handwritten install script (autoconf was years off) which could automatically configure and build rn on any *NIX system. All developed by a fellow named Larry Wall.

rn was truly a joy to use and made reading Usenet swift and efficient. It would get updated with the fixes that came across Usenet that I'd apply using the clever 'patch' program, also written by Larry Wall.

Based on my experience with his other software, when Larry Wall released Perl on Usenet I immediately downloaded, built, and started using it. As promised, for scripting things not requiring a C program, it was massive improvement. Version 2.0 came out and brought many great new capabilities.

I wasn't writing software while versions 3 and 4 came out; I started using it again after version 5 and the appearance of CPAN. Over the years I've used Perl extensively for task automation and data wrangling.

Python now dominates Perl's niche because it's easier to learn and interfaces better with C. It's also less flexible, which compared to Perl is a virtue. One of Perl's mottos is TMTOWTDI—there's more than one way to do it. But many of them are bad. Much of Perl's poor reputation ("line noise") stems from this.

But when Perl was released it was a revelation, like a drab day when the clouds suddenly part letting warm sunlight pour down on the land.


    # Sometimes I wake up screaming.  Famous figures are gathered in the nightmare,
    # Steve Bourne, Larry Wall, the whole of the ANSI C committee.  They're just
    # standing there, waiting, but the truely terrifying thing is what they carry
    # in their hands.  At first sight each seems to bear the same thing, but it is
    # not so for the forms in their grasp are ever so slightly different one from
    # the other.  Each is twisted in some grotesque way from the other to make each
    # an unspeakable perversion impossible to perceive without the onset of madness.
    # True insanity awaits anyone who perceives all of these horrors together.
https://github.com/openembedded/openembedded/blob/fabd8e6d07...


sed and awk are still around and still used a lot especially for one-liners. Anything more than that is typically not done as much anymore, though is certainly possible.
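Typical examples of the kind of one-liners meant here (the data is inline for illustration):

```shell
# awk: sum a column of numbers.
printf '3\n4\n5\n' | awk '{ s += $1 } END { print s }'   # 12

# sed: a quick substitution.
printf 'grey\n' | sed 's/grey/gray/'                     # gray
```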


sed and awk are still around and still used a lot especially for one-liners.

Adding to that some popular examples are here [1][2]

[1] - https://www.commandlinefu.com/commands/matching/awk/YXdr/sor...

[2] - https://www.commandlinefu.com/commands/matching/sed/c2Vk/sor...


For me in the current day, perl is too much like "real programming", while I'm fairly comfortable with grep and sed, and to a lesser extent, awk. Piping a bunch of stuff together is just easier to understand and gradually expand. One thing I do in a lot of scripts is read from and write to the clipboard, usually transforming text with sed and co. Stuff like turning YouTube shorts URLs into regular ones, taking just the last directory name from a long file path on my clipboard, making my clipboard lowercase, etc.
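Sketches of transforms like the ones described, fed from echo here rather than a clipboard tool such as xclip or pbpaste (the URL and path are made up):

```shell
# Shorts URL -> regular watch URL.
echo 'https://www.youtube.com/shorts/abc123' | sed 's#/shorts/#/watch?v=#'

# Last component of a file path.
echo '/home/user/projects/demo' | awk -F/ '{ print $NF }'   # demo

# Lowercase everything.
echo 'MiXeD CaSe' | tr '[:upper:]' '[:lower:]'              # mixed case
```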

I barely get into loops and lists when trying to learn Python before losing interest/focus and not trying again for months. It feels very abstract and like it's not useful for day-to-day stuff without learning and writing a lot. Shell stuff is very short and powerful and feels more grounded in reality. To be clear, I am not using if/else and the like in my shell scripts either. Usually they're just a few lines with little to no logic, but they're immensely useful. I can't imagine learning perl and then using it for all this stuff.


All software eventually becomes an unmaintainable nightmare.


> Yes, awk and sed were replaced by Perl, but more importantly, the unmaintainable nightmare that glued all of it together was wiped out.

Er, Perl replaced an unmaintainable nightmare? Perl, the language infamous for being indistinguishable from line noise?


Yes. It's comparative. The absolute value of Perl's readability, maintainability, and general intuitiveness is massively better than all comparable tools that existed when Perl came out. Perl is a massive improvement over those tools.

If you have doubts, try writing a complicated production software stack in awk. Then hand it off to a coworker.

One need not be irreplaceably good, if one is already beating the current state of the art by an order of magnitude. Perl did this.


Moreover, as often as people joke about the readability of Perl code, that's entirely a function of the developer.

I've easily written tens of thousands of lines of Perl, and not a single person has complained about difficulty reading or maintaining that code. Why? Because I apply all the usual best practices for code hygiene that apply to any language.

Frankly, I think most people are just repeating a meme they heard once, and the rest just get put off by a) sigils, and b) the use of implicit variables (which I tend to use only very sparingly, and mostly in quick one-liners).

But a developer that poorly names their functions or variables, fails to modularize appropriately, fails to document their code, abuses language features because they want to be excessively terse or clever, that kind of person is gonna write crappy, difficult-to-maintain code no matter what language they use.

In fact, I'd argue the advent and popularity of Python--which was pretty radical in how opinionated it was at the time--is a direct response to languages like Perl and C that were a lot more free-form and easier to abuse by poor coders.


and not a single person has complained about difficulty reading or maintaining that code

Developers extremely seldom complain about other developers' code to their faces.

Instead they go to lengths to avoid it if they dislike it severely; right or wrong.


> that's entirely a function of the developer.

I would argue the same of sed/awk; you can write an unmaintainable mess, but you don't have to.


Oh sure. IME Perl just pulls it all together into a much more convenient, all-in-one package that has all the libraries and tools you need to build anything from small scripts all the way up to large projects.


absolutely a response to perl. “TOOWTDI” has been an expression in the python community for a long time :)


TIMTOWTDI is Python's motto for managing dependencies and deployments:

https://packaging.python.org/en/latest/overview/

<kidding> OCI (Docker) Containers were invented to overcome the problems inherent with deploying Python applications. </kidding>


Gen-Zer spotted. No, Perl is not line noise.

Get the Perl source code for PangZero and you'll understand proper Perl code.


All nontrivial sed programs are line noise, there is no other way to program in it. In Perl you at least have a chance. Awk is not much more readable than Perl actually.


Oh yeah! Perl sure did have a bunch of ugly programs, but I believe that was because it was so easy for non-experts to program in it. There have been many companies successfully built atop of a Perl code base, and I've seen fantastic systems built with it. I've also seen one-off programs handed from manager to manager which would scare you to death.


Yep, that’s how bad it was before.


A single-executable perl5 (like awk, but with advanced string manipulation + sigils + TMTOWTDI madness) would be great.


And now you can just use PowerShell which combines it all in one cross platform scripting language.


assuming that the security administrators of the systems you want to run the script on haven't followed Microsoft's 'recommendations' and basically made powershell unusable on the systems due to random GPOs ;)

PS is 'okay' as a scripting language, but it's very frustrating how Microsoft's security defaults seem to be dead-set on ensuring you can't use scripts on systems without jumping through a bunch of hoops. I get how MS got to this conclusion that it needs to be locked down, but it makes me think twice about pumping out a script since I know I might need to walk my colleagues through how to actually run the damn thing depending on what the Security Admins decided to lock down that week.


Cross-platform? I guess technically it is...


Perl didn't replace anything. It was replaced by PHP for CGI programming (which says a lot!) and botched itself with the second-worst version migration of all time.


Perl 4 to 5 was quite successful. Perl 5 to 6 was definitely worse than Python 2 to 3, since the latter actually worked out in the end.


I was an innkeeper, in this crazy, little town in Vermont...

https://youtu.be/OwYw2i2icNg?t=430


The claim: reduces CO₂e emissions by up to 80%*

The asterisk: water boiling phase excluded

When I boil pasta, I use full heat to get the water boiling, drop in the pasta, then let it come back up to a boil, then drop the temperature as low as possible to keep a slow boil going--about medium-low.

Guesstimating that the boil takes about as long as the time to cook the pasta, I'd say that the pasta cooking portion only takes up about 1/3 of the energy of the total (cold pot to cooked pasta).

The 80% savings only covers the pasta cooking phase, so overall, it's only saving 4/15 or 27% (roughly). If you use a lot more water, then that initial boil time further reduces your savings.
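The arithmetic above, checked with a throwaway awk one-liner: 80% savings applied to a cooking phase that is roughly 1/3 of the total energy.

```shell
awk 'BEGIN { printf "%.0f%%\n", 0.8 * (1/3) * 100 }'   # 27%
```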


A better way to save energy is to use an electric kettle to get the water up to boiling.

This is about three times as energy-efficient as a gas hob. Combined with passive cooking it could save real money…

Technology Connections on YouTube has a good video about how electric kettles are the superior way to boil water: https://youtu.be/_yMMTVVJI4c


Alec, of the Technology Connections YouTube channel, shows it's not just energy saving, but faster and cleaner. Even with US outlet power limits a kettle will boil a volume of water much more quickly than a gas stovetop, and my indoor CO2 sensor won't spike by hundreds to a thousand PPM of CO2.

While the time savings is modest, if you have an electric kettle it's a no brainer to prefer that over gas.

Though induction stovetops can be faster yet, and just as clean.


Your gas stove will likely dramatically outperform an 1800W electric heater if you use a proper pot:

https://turbopot.com/

I would still rather use a nice induction range, but it is possible to get decent efficiency out of gas.


What efficiency (in terms of energy transfer to the intended material vs to the environment) is observed with gas? IIRC, gas achieves maybe 35-50% efficiency (depending on the surface area and material of the pan) compared with ~80% for electric resistance heating and >90% for induction.

Adam Ragusea did a water boiling comparison and the gas stove was dumping so much energy into the air around the pot that his thermometer melted [0]. I just don't see how any pan geometry could extract much of the energy from the quickly rising hot gas produced by combustion.

[0] https://youtu.be/Xn1LUo5ra_A?t=249


The claim on the site is that the heat exchanger base boosts efficiency up to 60%.

You still of course do have the problem that home rangehoods usually aren't powerful enough to create enough air-flow to properly deal with the NOx and SOx produced by burning gas, which turns out to be a big health risk...


Most home range hoods are too powerful. Go to any site that calculates commercial range hood requirements, enter in your parameters, and then try to find a nice range hood that matches. Those 700 cfm or 1000 cfm beasts the appliance stores sell are entirely inappropriate.

There may well be a problem with a poorly designed range hood and exhaust that inevitably isn’t captured. And people might not like using the hoods all the time.


Comparing gas with electrical efficiency does not represent realistic end-to-end efficiencies w.r.t. CO2 as long as the vast majority of electricity is generated from primary energy. Power plants tend to have efficiencies below 50%.


The gas end-to-end "efficiency" will not improve drastically ever, while for an electrical stove it is tied to energy production. Arguing with end-to-end efficiencies and power plants is misleading, as I can power my induction stove from my solar panels


That's true, but I power mine from the municipal grid that burns stuff to make steam and then electricity


Might still be better as efficiency goes up with temperature and industrial generators can achieve much higher temperatures that what you can at home, and also include some clean sources of energy into the mix.


There's still a huge amount of energy that gets dumped out into the kitchen by any non-induction cooktop, no matter what pot you use.


I always wanted some kind of "collar" that I could put around a burner to try to direct that excess heat back into the pan/pot.


The problem is you need airflow or you generate carbon monoxide instead of carbon dioxide. So simple solutions can get more heat but also kill you.


minor side effect


Death only increases the amount of carbon emissions this method saves.


Product idea: a broken CO detector that saves up to 100% of person’s future emissions. Saves money in the future too.


Lower carbon emissions with this one weird trick!


But your decomposing family will release CO2!


It's offset against the future CO2 emissions of the family and their descendants.

(Gee, this thread got dark. Like carbon-black dark)


There used to be (still is?) a sort of vertical chute/collar available to put around camp pots. Think mountaineering applications. The idea was that they would capture heat energy escaping off the bottom of the pot that would rise/disperse into the environment, and channel it close to the side of the pot where it would heat the vessel as it rose. Think of wrapping a section of corrugated cardboard around the outside wall of a pot. Now a birds-eye view down the cardboard should reveal the energy capturing channels that will allow the sides of your pot to heat the pot contents. Adjust your material and tune the sizing and you’ve got a camping gadget.


This is maybe useful for boiling water but not so great for cooking. You want your pot to have a relatively uniform temperature, and gas burns very hot. This means that, especially if the walls of the pan are thin, any part of the wall that has good thermal contact with combustion products and poor contact with food will get extremely hot, with various unfortunate consequences. Also the handles will get hot.


Well, it takes absolutely forever to boil water on a camp stove, so I'll take it.


Look into jetboil campstove/pot. It's got a heat exchanger fin stack on the bottom of the pot to improve boil time.


Typically, those collars are used as a wind screen.

https://www.rei.com/product/139472/toaks-titanium-windscreen



Old cast iron skillets had a collar under them to better fit on the pot belly stovetop.


I have a gas stove in my studio apartment. I still use an electric kettle to get water boiling before dumping it into a preheated pan to cook noodles/whatever. Highly recommend this approach if you are stuck with gas.


> Highly recommend this approach if you are stuck with gas.

My partner, who cooks more, bemoans that our new apartment has electric. Grew up using gas and claims that it’s better for cooking. I prefer not to intentionally dump gas into our house and welcome the minor benefit of energy efficiency.

Funny how different people see the world (and technology).


Watch this video about how the idea that gas was better for cooking was a marketing ploy with no basis in reality, and how they used lobbyists to force gas connections into building codes:

https://m.youtube.com/watch?v=hX2aZUav-54


"no basis in reality". Do you cook? Gas is functionally superior to traditional electric resistance cooking. That's not some evil lobby, that's practical experience. You can instantly control the temperature without juggling multiple burners at different heats. That makes it better. I'd take gas over electric (resistive or halogen) any day, and I've lived with both.

Induction is on par with gas in controllability, and although there are some downsides, the upsides (so easy to clean!) make it worth it (imo). I'm not saying that because Big Induction paid me to, either.


Gas burners put out 3x-4x the power of an electric burner and have much lower thermal mass, so the acceleration in thermal power is much higher and the absolute thermal power is higher.

A large electric burner is 2400-3600 watts, a large gas burner can be 8-16 kw. I think induction could get better power delivery than gas with time.

You can get skillful with electric, you just have to see 30 seconds into the future and anticipate the thermal lag and overshoot when adjusting burner power.


Gas burners may be higher total power, but the heat transfer is surely pretty poor. Induction delivers something like 90% of the power into the pan. Boiling water (which is just an exercise in energy transfer) is much quicker on induction than gas.

In my experience living with an AEG induction stove with a peak single-burner power output of 3.7kW, I never did any cooking (apart from boiling water) that needed that level of power for more than a few tens of seconds. The gas stove in my current house felt underpowered in comparison.


No disagreement here, induction has way better power transfer to the food than electric resistance heating. I was only comparing resistance heating with gas.


I cook, and while I acknowledge there are benefits to gas, I can't escape the feeling that what a lot of people end up comparing are crappy electric stoves from when they lived in cheap apartments vs. higher-end gas stoves that they/another homeowner bought for themselves.


My preference for gas has a lot to do with the fact that my pans, unless they're exceptionally heavy, never sit stably on the conventional coil-type electric stove burners. So the pan isn't level and doesn't heat evenly. Gas range grates don't tend to have that problem.

I've never owned an induction cooktop but I'd imagine the flat surface wouldn't have that problem (assuming the pans aren't warped).


Every gas range (Wolf, Viking, GE, Bosch, Maytag) I've used seems to vary dramatically in temperature at different settings.

The lowest setting on some ranges might be a High setting on others.


I used to be a cook in a few Michelin-starred restaurants, so you can trust my opinion here.

Gas is far superior to electric heating elements because of the speed at which it can change temperature, and makes flambeing a cinch.

That being said Induction ranges are several orders of magnitude superior to gas. As are ovens with humidity and air flow control.


This is the case for every range, gas or electric. Every time I've moved I'd had to "recalibrate" my expectations and use of the stove to account for different temperatures and quirks.


Induction is much better than gas about 90% of the time. Because the heat can be set much lower than a flame’s temperature, and because heat transfer happens through the whole bottom surface of the pan, it’s much better to keep a low-ish temperature for any length of time, which opens a lot of possibilities. For high heat, the larger contact area makes it much quicker to heat up evenly the whole pan. It pairs very nicely with a cast iron pan as well.

And that’s just from a practical, cooking point of view, without mentioning all the health benefits.

It was not our choice when we got an induction cooktop the first time, but now it would be.


Get a (high power) portable induction hob. If you have an outlet that can supply a 1.8-3kW one, it's better than gas unless you're using a wok or similar.


> claims that it’s better for cooking

It's not really "better"; it's simply easier to keep a constant heat. Most electric cooking equipment works in intermittent on-and-off cycles. An electric oven powers up, then starts cycling its internal resistances: say 10 s powered, 5 s off, then 10 s powered again, and so on. Some users here showed me a video of an induction plate that uses "mini-coils" rotating constantly, nearly nullifying such cycle effects on food cooking, but most other electric gear chooses a "simpler" approach from its OEM.

Personally I've been all-electric for around 8-10 years or so; it's become a habit. I'm not a professional cook (while remaining a professional eater), but I understand those who dislike the initial impact...

Beside that: most current tech is developed in crappy ways, most products explicitly made not to last and not to allow evolution or built-in recycling, but simply to ensure constant purchases of new gear that is just as crappy as the gear it replaces. Most people might not realize that as clearly as a techie does, but anyone can feel it. So...


If you can afford it, a high end induction stove might be an upgrade you both enjoy without needing gas. Higher end ones have more precision and consistency and easier controls.


I cook a lot. I enjoy cooking with gas and I appreciate its benefits, but I avoid/minimize it because it is so plainly inefficient and frankly more hazardous than a modern electric or induction burner.


I generally heat half the water in the pot and half in the electric kettle (adjust proportions depending on the relative power of your appliances). It's the quickest way to heat up the required amount of water.


Assuming you have a four-burner stove, you could split your water five-way, and use four pots and your electric tea kettle.

If you want to push, you could improve the efficiency even more. That would require a capital investment into additional stoves and/or electric kettles. You'd probably need something like kubernetes for orchestration as well.


Since I don't have a dishwasher, using more pots actually has a time cost, so that approach doesn't work.

Interestingly, I feel compelled to wash a pot even if I only used it to boil water, while that's not the case for the kettle. Force of habit I guess


Crazy. Like a Jetboil for your home stove.


Buying cookware that works better on gas but is specifically incompatible with induction cooking seems a bit like rigging your Ram to roll coal, at this point in time.


How's that? Most people who have serviceable gas ranges are not going to rip them out and replace them with induction any time soon.

Someone who wants to go electric badly enough to spend thousands of dollars installing a high-current power outlet is not likely to balk at the cost of new pans.


> While the time savings is modest, if you have an electric kettle it's a no brainer to prefer that over gas.

Not quite. Water is heavy and boiling water is dangerous; if you're boiling your water in a teakettle you then have to transfer it to a pot on the stove without scalding yourself.

(This isn't the exact problem I'd experience if I adopted your advice. I have an electric teakettle, but it is low volume and is also sharply limited in the rate at which you can pour out water from it. That's fine if you want to prepare individual servings of tea. It's unworkable if you want to prepare a bunch of boiling water to boil stuff in. But fixing that problem will immediately cause the "boiling water is dangerous" problem.)


Presumably you mean you have a gooseneck kettle, which I agree is not well suited to this task (though if it takes 30s to pour out the kettle, it's not really a big deal). Regardless, pouring boiling water out of a kettle is far less dangerous than pouring the cooked pasta + boiling water out of the pot and into a strainer.

If I'm looking to boil say 2 litres of water, I'll put 1.5 l in the electric kettle, 0.5l in the pot on the gas stove with a lid on. Generally the kettle boils first.


> Regardless, pouring boiling water out of a kettle is far less dangerous than pouring the cooked pasta + boiling water out of the pot and into a strainer.

I don't think this is true. The mechanics are essentially the same. But the colander receiving the pasta + boiling water is situated inside the sink, which will catch the water that is in that case intended to spill out.

The pot is situated on the stove, which is a raised platform that can't catch water at all. Any spill there will splash all over.


Well, with my sample size of one, I've splashed boiling water on the floor when trying to drain pasta or potatoes, but never while pouring out a kettle. Perhaps related, a kettle has a cold base in addition to a cold handle oriented more appropriately for pouring.


>Though induction stovetops can be faster yet, and just as clean.

You have the mother of all hotspots on the pan (less relevant for boiling) where the induction ring itself is and getting simmer right is harder. Induction as implemented right now is on/off with full blast for the duration and just increased downtime. It is not the same as low constant output.


> You have the mother of all hotspots on the pan (less relevant for boiling) where the induction ring itself is and getting simmer right is harder.

Not at all. There is no significant hot spot as heat is actually produced by the bottom of the pan and not the coils themselves.

> Induction as implemented right now is on/off with full blast for the duration and just increased downtime. It is not the same as low constant output.

Again, not at all. All the devices I have used were perfectly fine generating a low, constant heat. What you describe may be the case for bargain-basement ones, and it was the case for most resistive cooktops, but it is definitely not the case for induction ones.


I guess you could have a hot spot on the pan if the coils are ill-designed or you’re putting a large pan on a small burner?

The pan will only heat up near the coil, so if the coil covers only the center third of the pan, only that will heat up (though some pans have a heat-spreading layer to mitigate this, and sometimes to add some more inertia depending on the pan’s purpose).


> I guess you could have a hot spot on the pan if the coils are ill-designed or you’re putting a large pan on a small burner?

Yes, but then it’s hardly a problem with the technology if you put a large pan on a small burner.

As for the rest, in all cookers I have seen, the coils cover the whole surface, except for a small spot in the centre. Besides, induced current does not happen only where the cookware is closest to the coil. The magnetic field is more spread out than that and the heating surface is larger than just the surface of “contact” (there is no real contact, but anyway).

I assume there could be an exceptionally badly designed induction cooktop with hot spots (it’d have to have a very weird geometry, though), but that would take some effort.


With most cookware it's pretty obvious where the coils are when trying to simmer. With thinner stuff like carbon steel the hot spots can be pretty nasty: https://www.youtube.com/watch?v=pifD__DIxGU


> All the devices I have used were perfectly fine generating a low, constant heat

I have never seen an induction cooktop that could generate low, constant heat. Even pretty pricy ones (eg. a 3000€ Bora hob with integrated extraction) showed a clear on/off cycle at the lowest heat settings. But maybe tech has improved in recent years, and there are hobs now that have constant low power output?

Maybe the effect also just depends on your cookware? On pots with a heavy bottom with an aluminium or copper core you probably won't see the coil patterns, and the on/off effect will be less pronounced. If you have a pot with a thin stainless-steel bottom, you will definitely see uneven heat; the power from an induction coil is not completely homogeneous.


All of the cheap tabletop cooktops I have seen have terrible cycling at lower powers: there seem to be two power levels, and when you need something below the lower one, it cycles power at ~10s intervals. In contrast, a Bosch integrated cooktop from a couple of years ago also does cycling, but seems to have more power levels available and the cycling is faster, around 1Hz or so. At least for me that is good enough.

My issues with it are more about inaccurate placement causing hot and cold spots, and if you move a pot it can trigger cookware detection, which then clicks different coils on and off for 5-10s before it is satisfied with the new configuration. Both of those issues are probably exacerbated by the "FlexInduction" system that promises one large automatic cook area.

I'm hoping that power electronics development for electric cars creates some innovation in this area, but it's really hard to tell because there are no useful review sites and there aren't any places that let you take a cooktop for a test drive.


> there are no useful review sites

Right! The only thing there is are occasional reviews on shopping websites, but people can usually only compare it to their previous one, so there really are no useful reviews (spoiler: most induction hobs are faster than whatever people had before, and people seem to hate touch controls)

Especially once you get to the fancy features (eg. internal and external temperature sensors) there are almost no reviews at all and all you have is the manufacturers marketing.


> This is about three times as energy-efficient as a gas hob.

I sincerely doubt this figure in cases where electricity is generated from fossil fuels in the first place.

For example, I lived in the Netherlands where 80% of electricity is still produced by fossil fuels, mostly in gas-fired power plants. As I understand it, generating electricity from gas has an efficiency of only about 50%. That means you lose a lot of energy before it even arrives in your home.

An electric kettle has an efficiency of about 80%, with gas stove around 40%. Assuming the cost of transportation is approximately equal for gas and electricity, that means using the gas stove is about as efficient as using the electric kettle, if you assume (most of) the electricity comes from a gas-fired plant.

In the winter, the gas stove will be more efficient since all the heat that doesn't go into your food, heats up the room (heating is typically also based on gas in the Netherlands, so this is basically free energy).

> Combined with passive cooking it could save real money

Again, I don't think this is true. Currently the energy market is fucked up because of the Ukraine war, but up until recently a cubic meter of gas cost around 1 euro, which produces roughly 10 kWh of energy, while a kWh of electricity cost around 40 cents. That means that on a per-kWh basis, gas costs only a quarter of electricity. So even if the electric kettle is twice as efficient as the gas stove, it is still twice as expensive.

In countries where most electricity is generated by burning coal (like Poland, for example) there is also an environmental cost, since coal-fired plants emit more CO2 per joule than gas-fired plants.

YMMV based on local energy prices obviously, but I don't think it's straightforwardly true that electric kettles are always more efficient or more environmentally friendly than gas stoves, if you look at it holistically.
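The per-kWh arithmetic above can be sketched as a quick calculation (all figures are the rough assumptions from this comment, not measurements):

```python
# Rough cost per kWh of heat actually delivered into the water.
# All numbers are assumed, pre-2022 Dutch prices from the comment above.
GAS_PRICE_PER_M3 = 1.00      # EUR per cubic meter of gas
KWH_PER_M3_GAS = 10.0        # approximate energy content of natural gas
ELEC_PRICE_PER_KWH = 0.40    # EUR per kWh of electricity

GAS_STOVE_EFFICIENCY = 0.40  # share of gas energy reaching the water
KETTLE_EFFICIENCY = 0.80     # share of electric energy reaching the water

gas_cost = GAS_PRICE_PER_M3 / KWH_PER_M3_GAS / GAS_STOVE_EFFICIENCY
elec_cost = ELEC_PRICE_PER_KWH / KETTLE_EFFICIENCY

print(f"gas stove:       {gas_cost:.2f} EUR per useful kWh")   # 0.25
print(f"electric kettle: {elec_cost:.2f} EUR per useful kWh")  # 0.50
```

With these assumptions the kettle indeed comes out at twice the running cost of the gas stove, despite being twice as efficient.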


The kettle might come out around the same for particularly dirty grids, but my understanding is that induction stoves are efficient enough (and gas stoves so poor efficiency) that it's basically always better CO₂-wise to use the induction hob.

Your assumption that "the cost of transportation is approximately equal for gas and electricity" is probably not accurate, given that while the infrastructure costs are probably similar, the cost of the energy used in the compressors to pipe the gas around is likely quite a bit higher than transmission losses (around me at least, the service fees on the bill if you have gas were at least 50% higher than electricity last I heard).

You really don't want to be getting much of your heating from a gas stove (or unflued gas heater) not just because of the possibility of carbon monoxide formation, but because NOx and SOx produced by burning gas is actually a health risk (primarily asthma for kids, cardiovascular for adults).


And the gas pipes lose gas in transmission, which does a lot more damage to the environment than lost electricity.


> In the winter, the gas stove will be more efficient since all the heat that doesn't go into your food, heats up the room (heating is typically also based on gas in the Netherlands, so this is basically free energy).

The efficiency loss of the electric kettle also gets converted into heat that heats up your home, energy doesn't just disappear.


Not the waste heat from the power plant.


As with everything in this discussion, this also depends on where you live as district heating is a thing


An electric kettle is ~90% efficient (the rest is lost to the surroundings), but the electricity generation (assuming gas) is ~60%, so the entire process, gas to hot water, is ~55% efficient, and you don't get the benefit of most of the waste.

And a kettle on a gas stove is ~40% efficient, but you get the other 60% back as heat in the surroundings, so the entire process is ~100% efficient.

Assuming you need that waste heat of course. In the summer that 60% inefficiency really is an inefficiency.
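As a sanity check on the chain of efficiencies above (same rough numbers, which are assumptions, not measurements):

```python
# Gas -> electricity -> kettle, vs. gas burned directly on the stove.
plant_eff = 0.60         # gas-fired power plant (assumed)
kettle_eff = 0.90        # electric kettle (assumed)
stove_eff = 0.40         # gas stove, heat into the water (assumed)

electric_path = plant_eff * kettle_eff       # fraction of gas energy reaching the water
stove_waste_as_room_heat = 1.0 - stove_eff   # recoverable as room heat in winter

print(f"electric path: {electric_path:.0%} of the gas energy reaches the water")
print(f"gas stove: {stove_eff:.0%} into the water, {stove_waste_as_room_heat:.0%} into the room")
```

The electric path lands around 54%, which is where the "~55%" figure above comes from; whether the stove's 60% "waste" counts as useful depends entirely on whether you wanted the room heated.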


Well the heating with gas is cheaper than with electrics in most places.

So it is "heat your house by X at price of the gas heating" vs "heat your house by X at price of electric heating". And the price of old school resistive heating, not the more efficient heat pump.


I imagine part of the efficiency of electric kettles is due to the heating element actually being submerged in the water, or at least directly adjacent to it.

Compare to a stovetop where there's a big heavy pot between the element and the water, and it's not in complete contact with it. Also often the heating element is underneath a piece of glass.


> Compare to a stovetop where there's a big heavy pot between the element and the water, and it's not in complete contact with it.

You’re describing resistive stovetops, not induction ones where the heating element is the bottom of the pot itself.

> Also often the heating element is underneath a piece of glass.

These things under the glass do not heat anything. They are there to induce some electric current in the bottom of the pan you put on top, which actually does the heating.


> These things under the glass do not heat anything

Glass infrared cookers also exist, my parents had one back in the 90's before induction was much of a thing. It was touted as easy to clean (and looks cool)


Yes, you’re right, I forgot about those!

They are mostly historical curiosities now, right? Not as cheap as simple resistive cookers, and worse in about all respects than induction ones.


Probably depends on your location. They are still sold here in Sweden for instance, but otoh almost nobody has a gas stove since electricity used to be dirt cheap here.


I have one! I think they're still pretty common in New Zealand too.

They're awful though, I've only got one because I'm renting. The old school exposed spiral element ones are the best non induction electric stoves imo.


With an induction cooktop the pot is the heating element.


Damn, 40 per kWh electricity.

My last energy bill (including taxes and fees) is showing $0.124 per kWh electricity, and $0.62 per cubic meter of gas.


With those prices, electricity is still twice as expensive as gas per kWh, so even if the gas stove is only half as efficient, you still break even.

And yes, electricity prices in Europe are crazy right now. You can see some graphs here that show how enormous the spikes are compared to years of relative stability (no need to read the text; the graphs speak for themselves, and the y-axis starts at 0!)

Electricity: https://www.overstappen.nl/energie/stroomprijs/

Gas: https://www.overstappen.nl/energie/gasprijzen/

So apparently I slightly misremembered the prices; those graphs show that in 2020 the average gas price was about 84 cents per m3 (or about 8.4 cents per kWh, assuming 10 kWh per m3), versus 23 cents per kWh of electricity. That's closer to 1:3 than 1:4 on a per-kWh basis, but the general argument still holds: cooking on gas looks cheaper than cooking on electricity.



Americans don't really do electric kettles, largely on account of using 110V mains, which limits power to around 1100W, making it a lot slower than ~2000W electric kettles in the 200+V world.
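The power difference matters because boil time scales inversely with power. A back-of-the-envelope estimate (assuming ~85% kettle efficiency and 20 °C tap water; real kettles vary):

```python
# Time to boil water at a given kettle power: t = m * c * dT / (P * eta)
def boil_time_seconds(liters, power_watts, start_c=20.0, efficiency=0.85):
    SPECIFIC_HEAT = 4186.0  # J/(kg*K), water; 1 liter of water is ~1 kg
    energy_joules = liters * SPECIFIC_HEAT * (100.0 - start_c)
    return energy_joules / (power_watts * efficiency)

# 1100 W ~ 6.0 min per liter, 2000 W ~ 3.3 min, 3000 W ~ 2.2 min
for watts in (1100, 2000, 3000):
    print(f"{watts} W: {boil_time_seconds(1.0, watts) / 60:.1f} min per liter")
```

So a 3 kW kettle boils a liter nearly three times faster than an 1100 W one, which is the gap the comment above is pointing at.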


Americans largely don't use electric kettles because Americans largely don't drink tea at home and thus don't use kettles as often as the rest of the world.

Most households I know don't even own a kettle. Not a stovetop one, not an electric one.

If you were to ask most Americans why they don't have an electric kettle at home they won't say "because I only have 120V outlets in the kitchen." They'll say it is because they don't need a kettle.


Once introduced, it's hard to downplay the usefulness of an electric kettle.


I'm American and learned about electric kettles during trips to Asia. They are definitely handy, even at 110V. We make our daily coffee in a French press, so the electric kettle is a no-brainer.


> We make our daily coffee in a French press

Moka pots are much better. Your French press will start letting the coffee grounds through at some point. I use these: https://www.rommelsbacher.de/en/coffee-tea-co/espresso-maker....


You can't compare a moka pot with a french press, they are fundamentally different things that make different styles of coffee. Espresso and coffee are not identical.

Personally I find French press too grainy, so at home I use an AeroPress, but when out I prefer a properly done pour-over filtered coffee. Black, of course.


I don’t like to force things on people, but I usually buy people Moka pots. Moka pots ARE better but they make a different coffee. Moka pots are an espresso machine substitute more than a French press substitute.

I do recommend it though. Lots more options for coffee drinks.


Agreed. Someone gave us ours as a wedding gift. I thought, “Great. More clutter I don’t need.”

I use it daily for tea and coffee (Aeropress). And now, thanks to this thread, I may use it to speed up my pasta water boiling.


I have a pretty nice electric kettle. You can set it to a number of common tea temperatures and it'll hold that temp for up to a half hour. I got it as a wedding gift. I use it about weekly.

If it broke tonight I don't know that I'd bother replacing it tomorrow. I'd probably go a while before I got another kettle. It might even take someone gifting me one before I bother getting one.

Sure, it's marginally faster boiling water than my stove. It's about as fast as my microwave (which is insane at 1650W). It's definitely more efficient, but the break even on that is measured in years probably for even a cheap kettle and decades for this fancy one I have.

I just don't really drink many hot drinks and my microwave does just about as good of a job for getting things hot.

https://www.cuisinart.com/shopping/appliances/tea_kettles/cp...


While traveling through Europe, most AirBnBs had electric kettles and they all had gross scale deposits in them so I ended up boiling water in a pot.


It's calcium. It won't hurt you, and you can't taste it.


You can definitely taste it.

If you care about energy efficiency, you should descale your kettles every so often.


> You can definitely taste it.

If it gets deposited, it means that it’s not in the water any longer, or at least the concentration in the water has lowered. What you can taste is already in the water before you boil it and does not come from the kettle (well, in kettles that have been used normally with normal water).

If it bothers you, just boiling a bit of slightly diluted vinegar will get rid of it.

> If you care about energy efficiency, you should descale your kettles every so often.

Yes, these are terrible at conducting heat.


It's not even hard: just boil a solution of 1/2 white vinegar + 1/2 water, and the CaCO3 (+ 2 H+ from the vinegar) changes back to Ca2+ + CO2 + H2O. You can even see the bubbles from the CO2.


I'd recommend using citric acid instead; they sell it here as "lemon salt" so it comes in an easy-to-use salt shaker. I usually put in the minimum amount of water required to boil safely, plus an amount of acid depending on how heavy the deposits are (usually up to a spoonful is enough for me), though you could also just pour it in and wait.

It works really well and doesn't leave the same smell. When I'd previously used vinegar I'd had to boil another round of water and throw it away just to clean the kettle from the vinegar itself, but with citric acid there's no need (just don't drink the citric acid, it tastes like acid :)).


Do you recommend citric acid over vinegar just because of the smell?

I use vinegar all the time, put in something like 50 ml to the remaining hot water just after making a tea. After a few minutes you can just rinse it and let it evaporate, 5 more mins and the smell is totally gone.


It's pretty inconvenient when you are in an unknown location on a hotel. But yes, some areas of Europe has very hard water.


For making coffee or tea, soft water is preferable.

For drinking straight up, hard water tastes better. At least to me. Perhaps because that's what I grew up with in central Europe.

Btw, the Romans also built their towns preferably in places with hard water.


I come from a soft water area and I prefer the soft water for drinking. Our kettle never needs vinegar; it just doesn't get the residue :-)


Will it also make my bones stronger? Where does it come from? Was it in the water to begin with?


Don’t know about your bones but all tap water has some minerals in it.


Good thing too, because if you drink demineralised water it pulls electrolytes out of your cells (I think by osmosis) and eventually you also end up with decreasing bone density.


Most tap water is far from being demineralised. It's a problem if you have a reverse osmosis filtration system though.


Yes it was in the water to begin with, that’s what hard water is.


Ahh right, hard water, thanks.



Adding a little vinegar and water and boiling it will often remove most of the scale. Maybe hard when you're travelling, but at home it works fine.


Citric acid is commonly used in Germany.


With the added bonus it (imo) smells better. In the US, you can usually find it with pickling supplies or in the cleaning section as coffee pot cleaner


I'd gladly take an electric kettle over what I had to use the last time I was in an American hotel room to heat water, which was the bedside drip coffee maker. Water heated through that still tasted like bad coffee.


You clean that by boiling vinegar and water, takes 3 minutes.


Throw in some white vinegar. Boil. Rinse. Clean.


It’s part of my bug out bag. You can cook with that thing and one bowl.


I know it's heresy to the British, but you can heat water for tea in the microwave just fine. And at least in my European imagination, every American has a microwave in their home.


Anecdote of one: I am an American that does not own a microwave and does own an electric kettle. We are legion.


>every American has a microwave in their home

Of course we do! Europeans don't all have microwave ovens? How do you warm up leftovers?

And yes, I drink tea daily with microwaved water.


I'm American, and I often heat up leftovers with a toaster oven or on the stove. For some foods it gets better results.


There's also steamers. They are especially popular in Asia.


I don't heat up leftovers, I just eat them cold. But we do have a microwave (we bought it at the beginning of the pandemic so we could heat up frozen meals--we didn't have one before). I had cold leftover pasta for lunch today. My wife turns up her nose at cold leftovers but I'm not picky.


Isn’t it problematic to heat water in a microwave?


It's possible to superheat water in a microwave so that it will explosively boil when disturbed, but it is fairly easy to safeguard against, as well.


My microwave has a picture of a cup with a spoon in it, this stops superheating.


How?


It provides a non-smooth surface to induce steam bubbles to form.

I know in chem labs we used to use little chunks of ceramic for test tubes, but those aren't desirable in your tea.


>My microwave has a picture of a cup with a spoon in it, this stops superheating.

This appears to be a microwave setting. I don't see how that would provide a non smooth surface


“My microwave has a picture on it” is what was said; graphic instructions are common.


Ah ok I see now!


Sorry if I was unclear. It's basically this: https://physics.stackexchange.com/questions/234042/should-i-...


Only if you overheat it. That drives all the gas out and makes the water taste off/flat.


You can end up superheating the water that way and scalding yourself. I also prefer to control the temperature of my water which you cannot do in a microwave.

https://www.snopes.com/fact-check/boil-on-troubled-waters/


It's something to be aware of, but in practice it's very difficult to superheat water at home.

For superheating to occur, you need very pure water in a smooth container with no imperfections. Tap water in most places cannot be superheated because it contains too many minerals, and most ceramic cups aren't smooth enough to prevent boiling.

If you're making tea, an easy way to prevent superheating is to drop the bag in the water before you heat it. Another way is to put a wooden stirrer in the cup.

There's a mythbuster video here where they show water being superheated: https://www.youtube.com/watch?v=1_OXM4mr_i0, but note that they use distilled water (which nobody drinks) in a Pyrex container (which nobody uses). And they actually use a mug of tap water as a control.


That's not really true. I've superheated tap water at least a dozen times. Never had a negative consequence from it, besides having to clean up some water, so it's also not something that really bothers me. From my experience the superheated state can be surprisingly stable, a couple of times only explosively boiling over while being poured, instead of the moment I grab the container.

Edit: as far as 'nobody' using Pyrex, my preferred container for heating water is a big Pyrex measuring cup.


> It's something to be aware of, but in practice it's very difficult to superheat water at home.

Eh, I've had it happen many times at home. A coffee mug of water comes out of the microwave with just a hint of bubbles. Add sugar/tea/instant coffee and the whole thing instantly bubbles up and spills.


My microwave has a picture of a cup with a spoon in it to warn you to only boil water with something in it. Despite being warned for years not to put anything metal in a microwave it's actually fine to leave a metal spoon in water to stop it from superheating.


You can burn yourself with an oven or a hob. In fact a kitchen is full of ways to maim yourself. So why single out the microwave? A microwave is a tool, tools can be unsafe if used improperly.


It's really easy to control the temperature of the water: boil it, then wait for it to cool to the correct temperature.

For a fixed amount of water, you can learn the time and not have to pull out your instant-read thermometer.


But that's partially because 1100W kettles are less _useful_.

I'm mostly a coffee drinker, and use a bean-to-cup machine for that. But I still use my kettle a good bit, because it's quicker to boil the kettle and then pour into the pan and bring back to the boil than to bring cold water to the boil in the pan. I've got a 3kW kettle, though; if I had a 1100W one I wouldn't do that, because it _wouldn't_ be quicker.


Unless you have some sort of thunder-wok burner the 110V kettle is still way faster than the stove.

Source: have a 110V kettle and a gas cooktop with very large burners, still use the aforementioned trick.


High-end gas ranges will beat a good electric kettle in the USA (kettle 7 min, range <6 min). My wife picked up a used Thermador this last summer and it beats any kettle we've had in the USA so far by at least a minute. That said, the kettle is more convenient overall (auto shutoff, auto start, pour from the same container), so we end up using it.

On the other hand, the US kettle beat the crap out of our old gas range, and even the good range is not nearly as fast as a crappy electric kettle in Germany, which clocks in at 4 min.


I timed mine (1kW kettle vs glass topped electric stove, not induction) and the kettle was only a bit faster: https://www.jefftk.com/p/electric-kettle-vs-stove


Did you try a French press?


Strange.. every household and even hotel room I've ever been in North America had a kettle.

I only went to an Airbnb once that didn't have a kettle, so I Amazon'd one and just left it there.


If you did it must not have been in the United States, or you got lucky.

I've been in over a hundred different hotels in the United States. East coast, West coast, Midwest, Rockies, deep South, South Atlantic coast, pretty much every region. I've probably only had a kettle a handful of times. Nearly universally a drip coffee maker, but practically never a kettle.

Note I'm not talking about the machine with the glass carafe on a small burner with a basket above it. That's a coffee maker in US terms, not an electric kettle.


Oh yea ok I think you're right about the hotels.

It's a pretty universal item in any Canadian home and possibly even dorm room.


In the hotels, that’s not a kettle. It’s a coffee maker. If you choose to not put coffee in it, that’s up to you.

(Us Americans tend to optimize for coffee over tea. If something also works for tea, that’s a happy coincidence, but not intentional.)


You sure? Germany is a (shit) coffee country and everybody has had electric kettles since the 90s.


Lots of Americans drink kind of shit coffee as well; they just largely drink it from drip coffee makers. Alternative brewing methods like French press, AeroPress, pour-over, etc. are starting to become more popular but are definitely still not the predominant way of doing it. And even then, the really bougie people will just have an instant boiling-water tap in their kitchen instead of taking up counter space with something that takes longer to get hot water.


In my experience drip coffee in a halfway decent machine from halfway decent beans is miles better than most/all instant. Just because drip coffee is convenient doesn't necessarily mean it's bad.

The beans you use are more important than the brewing method - if you use a giant tub of stale Folgers it probably won't be great. But grind your own decent beans and a drip machine can be fantastic.


Recommend any particular beans?


I doubt we live in the same place so I won't recommend anything specific. But I would find local roasters and try to buy beans that were roasted in the last couple of days. Freshness makes a pretty big difference. Finding specific ones you really like will take a bit of trial and error. Single origin if you want interesting/specific flavours, blends if you want something a bit more consistent.


Drip coffee is pretty good, good taste and caffeine content while being pretty simple to brew, a decent machine is cheap as well. It's just a method, the beans you use are what actually count.


I don't drink the bean juice personally, so I've got no skin in the game. To me it definitely seems like a drip coffee maker could make a cup of Joe just about as good as most other methods. I do think it's a combination of the burner burning the coffee after and also a coolness factor (that's how my parents did it, gross!) that drives a lot of the hipness of alternative brewing methods in the US.

But as mentioned I don't really have a horse in this race to begin with. Drink what makes you happy however you brew it. :)


I'm not going to judge which one is better (though I personally prefer espresso, it's a matter of preference), but coffee brewed with different methods definitely tastes different. Coffee brewed at higher pressure (3~4 bar for a moka pot, >9 bar for espresso) tends to taste way stronger and more flavorful (which is why it's drunk in smaller quantities).


Hard Disagree.

Espresso and similar pressure based brewing (Moka Pot) is far superior with equal beans. Tastes way better, more flexibility in creating different drinks, very fast to brew, etc . If it’s too concentrated or hard to drink an espresso shot then you dilute it (for all you black-coffee drinkers, it’s ok to use milk, it’s still coffee).


Different methods of brewing give you different results. If you like (or just want) a stronger flavour, use a pressurised method (espresso, moka pot, etc.), more flavour but the caffeine content is going to be lower than a longer brewing method.

I drink coffee in different ways in different days. If I want a large amount in the morning I go for a drip coffee or french press, the taste is smoother and caffeine content higher. If I want to taste the fullness of the beans I use a moka at home.

I don't see other methods of brewing as objectively better or worse, it depends on the result you expect or want, some days I want an espresso in the morning, some days I want a lot more caffeine, etc.

Also, I just drink coffee, I don't make drinks or turned coffee into a hobby.


The worst part of a standard American drip coffee machine is cleaning it. Too many different nooks and crannies to get gross.

I find pour over to be one of the easiest methods in amortized time. It takes three minutes to make a cup but the clean up time is only another 30 seconds: throw away the paper filter and rinse the dripper.


Those suck, I use a Moccamaster that's been beating around for at least some 30 years. Bought it second hand and never had an issue in the past 15 years, reliable, easy to clean. It's a delight of design based in functionality for me.


And pour-over is just you manually doing what the drip coffee machine does for you.


Recommend any particular beans?


The time since roast matters a lot. Find a local roaster, you can get fresh roasted and it’s unlikely someone that went through the effort of setting up a local roaster is going to buy crap beans. Also, I’d recommend light roast.


> an instant boiler tap in their kitchen instead of taking up counter space with something that takes longer to get hot water.

These things usually serve "near-boiling" water, somewhere around 95 C. This is fine for some cases - e.g. making ramen - but not appropriate for many kinds of tea.

I have such a tap, and I also have an electric kettle. The tap is mostly used to prefill pots with near-boiling water for cooking, so as to not wait for so long for the stove to do it. The kettle is used for tea and coffee.


It was weird how before that they only ever had wall mounted Wasserkocher (water boiler) and then en-masse discovered kettles.


What do they use the kettles for? For most non tea drinking households in the US there is no essential use for a kettle.

They do sell them here, people buy them, they are usually tea drinkers.


German in Canada here. I used mine to boil water for pasta as it's usually faster and more efficient than my electric cooktop. Bought one in Canada and it's close to useless for that purpose. My German kettle would bring 1.7L of tap water to a rolling boil in about 4 minutes. My Canadian one needs about 10 minutes to achieve the same.


To perhaps state the obvious, I use mine mainly for coffee. And the occasional tea cup.


Honestly it seems more likely for an American household to have a Keurig machine (or hell, even multiple Keurigs) than to have any kind of kettle.


You're on target here. I have become the dumping ground for misbehaving keurigs from family and have managed to make most of them work or frankenstein parts between them. I hate the machines and wish this never happened. I was happy with a french press. Curse my urge to not waste things.


Last sentence is ironic because keurigs waste so much plastic if you use them daily.


It feels like a lot of plastic since you touch each one, but it's a very tiny amount if you weigh your monthly usage.


> I was happy with a french press. Curse my urge to not waste things.

I don't understand this sentence, I would much rather waste some ground coffee beans and water than plastic.


With a French Press, you don't have to buy and throw away a constant stream of paper filters.


A Keurig machine is just an electric kettle with an electric pump inside of it.


I strongly disagree. With a kettle, the whole volume of water is being heated at once and is a uniform temperature. A Keurig or drip coffee maker is only heating a small volume of water at a time. By the time you process a whole liter of water the first bit will have already cooled off a lot. It's a very different process and potentially a very different outcome.


If you're making a single cup (~250mL) why do you need to heat 1000mL of water?


Can it be used like a kettle - ie can it rapidly heat up a quart of water without a coffee residue taste? No, it cannot.

Just because they both heat water doesn’t make them similar.


Pretty much. I used my keurig this way for a few years until i realized I'm really not using pods and am only using it for hot water to make tea (via normal steeping). If you brew coffee and then use it for something right after I don't recall there being much coffee taste, but you could probably run a small cup setting to flush things in the pod area a bit if needed.

I've since switched to a Zojirushi water boiler which I adore, especially after learning my keurig wasn't getting hot enough to really brew the tea well.


I used to use my Senseo to make ramen so ¯\_(ツ)_/¯


Ok exactly - in addition to keurigs/nespressos etc mentioned (which I will refuse to buy) - for years the predominant coffee-making apparatus in the US has been the automatic drip coffee maker.

French press, pour over, chemx are all niche.


Tea.

French press coffee.

Instant coffee, hot cider, hot chocolate, etc.

Heating water for instant soups, etc.


> there is no essential use for a kettle

electric kettle

french press for coffee

tea kettle for tea


European here that doesn't drink tea yet still has a kettle.

> they won't say "because I only have 120V outlets in the kitchen." They'll say it is because they don't need a kettle.

Well no they wouldn't say that, unless they have experience of 240v perhaps.

They might reasonably say that it's too slow. Or maybe they're aware that kettles are 'just slow' and so arrange their lives so that they don't need a kettle, in which case they would say "I don't need a kettle".


My wife drinks tea, I drink coffee. Both of us grudge the time needed to heat water in our 120V American electric kettle compared to European 240V kettles. I^2 makes a big difference.
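A back-of-the-envelope check of that difference (this sketch assumes a 1 L fill, water heated from 20 °C to 100 °C, and zero heat losses, so real kettles will run a bit slower):

```python
# Time to bring water to a boil at different kettle power ratings.
# Assumes 1 kg (1 L) of water heated from 20 °C to 100 °C with no losses.
SPECIFIC_HEAT = 4186  # J/(kg*K) for water

def boil_time_seconds(power_watts, mass_kg=1.0, delta_t_kelvin=80):
    energy_joules = mass_kg * SPECIFIC_HEAT * delta_t_kelvin
    return energy_joules / power_watts

for watts in (1500, 3000):  # typical US 120 V kettle vs UK/EU 240 V kettle
    t = boil_time_seconds(watts)
    print(f"{watts} W: {t:.0f} s (~{t / 60:.1f} min)")
# 1500 W: 223 s (~3.7 min)
# 3000 W: 112 s (~1.9 min)
```

Since the heating element's power scales with the square of the voltage for a fixed resistance, doubling the voltage quarters the boil time for the same element, or halves it for elements sized to each market's circuit limits, as above.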


We use a hot water pot at home, a Zojirushi. Never have to bother with hot water, it's always available and the thing uses a negligible amount of electricity (we're almost entirely on solar anyway, so it's a moot point for our family) comparable to all of our other appliances.


I own a stove top kettle. Only used for hot chocolate or aeropress coffee (but I usually use drip maker)


Japan is on 100V mains, and electric kettles are everywhere.


Good point. What's the typical kitchen set-up in a Japanese apartment?


I'd be very surprised to see a standard US gas range capable of boiling water faster than an electric kettle.


Yeah, even a 1200W kettle can bring a liter of water to a boil faster than a pot on an average gas stove.


Yeah it's just not. I have one of those ridiculous bazillion-BTU gas burners and I still prefer to boil water using a combination of a 120V tea kettle (1L) and an 120V portable induction cooker. An installed 240V range is even faster, but I don't have one of those.


I'm a European who finds kettles stupid. I own one because someone didn't appreciate my house's lack of kettle and got one despite my protestations. I own (1) an induction stovetop, (2) a microwave… both will perfectly boil water just as fast. Why do I want to waste counter space on a kettle? My kitchen is tiny.

I have the same complaint about a rice cooker. It's perfectly easy to cook rice in a pot. Sure it's convenient to use the automated device, but it's wasteful.


I've never owned a rice cooker but I'm led to believe the rice they make is superior to anything you can reasonably achieve in a simple pot.


It's far easier to achieve consistent results, for sure. We have a small one and it's the best rice we can make at home without significant effort.


Cooking perfect rice in a pot takes technique, for sure. But it's not that hard, and if you do it right it comes out perfect just like a rice cooker.

Wash your rice. Throw in cold pot with 1:1 ratio of rice to water, + a bit extra to account for evaporation[1]. Bring to a full boil. As soon as it's boiling, drop the heat to the lowest setting and cover your pot as tight as you can and let it steam itself until ready. When ready, fluff it immediately.

[1]: the precise amount extra depends on your setup, tbh… how tight your lid seal is, how much surface area your saucepan exposes, etc. but usually an extra 25% will be about right


A dedicated appliance is worth having if it's used frequently. As a Japanese person who uses both a rice cooker and a kettle, they're must-haves.


> which limits power to around 1100W

All those 1800+ watt hair dryers that are used very commonly in the U.S. are wondering where you got your wattage limit from.

https://news.energysage.com/how-many-watts-does-a-hair-dryer...


Continuous vs noncontinuous overcurrent rating.[1] Depending how you get your appliance approved, you can draw more. The actual requirement is 3 hours, but like all regulations you have to look at intended usage:

[1] https://www.csemag.com/articles/understanding-overcurrent-pr...


The comment I replied to said the U.S. was limited to around 1100W which is why The U.S. doesn't use tea kettles.

I reply with a very commonly used device that uses over 63% more power than 1100W showing that obviously isn't the reason the U.S. doesn't use tea kettles.

And you reply with essentially, "Oh, but that's because the hair dryer isn't used continuously for 3 hours". Did you think a tea kettle is used for 3 hours continuously?


Please actually read what I said: there are differences in the appliance approval process which determines how much continuous power you can draw from an electrical circuit.

Why would a manufacturer bother trying to prove discontinuous draw when they can just current limit, use the same element they do everywhere else with the same resistance, and not worry about it?


I read what you wrote but still fail to see what it has to do with a claim that 1100W is the limit on US circuits and that is the reason Americans don’t use electric kettles.

Because that is what was being discussed.


> Americans don't really do electric kettles, largely on account of using 110V mains, which limits power to around 1100W

Most sources I can find indicate the usual (but not maximum) draw of US electric kettles is 1500W, and checking a few popular models confirms that 1500W is common.


I don't have a kettle because I have a hot water tap that gives me water at more than 200F. If I need it boiling, it takes a very short period of time on the range to get it there. Those are relatively common, almost every one of my friends and family have one too.


Technology Connections tried it.

https://youtu.be/_yMMTVVJI4c?t=232


Most kettles I've had in the US are 1500W, similar to a space heater.

It does take longer to boil, but it's like 2 min compared to 1 min.

I survive.


FWIW in my American experience most everyone I know has one and uses it, so your mileage may vary.


In my experience, electric kettles are reasonably common in America.


I wonder how I’m making my Chemex every morning…


It’s not that slow. They sell plenty of kettles in the US (touch grass and go to Target sometime or something) - a large portion of those sales are probably to Asian households. It is simply that if you have a drip coffee maker and don’t drink much tea, why do you need a kettle?

No idea where this dumb myth comes from - kettles are not hard to find in the US, they work fine - there is a ton of demand it is just relatively miniscule.


> No idea where this dumb myth comes from

I think it’s a hangover from when it was true - fairly recently in a human-lifespan timescale. I moved to the SF Bay Area from the UK 20 years ago, and had to buy a weird kettle from Amazon. None of the nearby stores had a decent one. That’s definitely not true now though.


I had an electric kettle growing up, but nobody else I knew/know has one. I don't know why, they're extremely practical, don't take up much space, and don't cost much money unless you get a fancy temperature-controlled one.


They're found in most college dorm rooms.


No it's not.

The most efficient (and fastest) way is an induction hob with copper disc pan. Something that Technology Connections didn't include in his video.

See some test here: https://youtu.be/EBlyuahlplI


Where's the energy loss from the kettle that's not present in your proposed method? Legit asking, I don't see it. Just heat radiating from the body of the kettle for longer cuz of longer time to make it boil?


> Where's the energy loss from the kettle that's not present in your proposed method?

You're not heating the water directly, you're heating it through a piece of metal. The wasted energy is heating that piece of metal up to more than 100 deg C. A surprising amount of energy is lost there.

If you use the induction hob you're only heating the metal that you actually need to heat to do the cooking.

For us in northern climates, none of this energy is really wasted during winter. But many of us now have heat pumps, which are more efficient than resistive heating.


The claim seems dubious in Europe, but in North America the stove has access to twice the voltage potential as the kettle, so can heat faster. Then with equal losses the stove wins.


Europeans use 220V for everything.


Well, Germans like to use three-phase power for their electric ovens.

(Still nominally at 230 Volt, but it's not quite the same as two-phase power.)


So in Europe the phenomenon of a stove heating faster than a kettle would not be observed. Isn't that what I said?


230V/3kW induction stove boils a cup of water faster than you can get the cup from the cupboard.


I have a modern induction stove. I think it's great. But not for bringing water to boil. It's about as fast as a kettle for 500-750 ml. But over a liter and it'll be much slower than a kettle (yes, boost function). I do not use it for bringing pasta water to boil.


This is not just efficient but also saves a lot of time. I do this with rice/pasta or anything that requires boiling. Also, using a pressure cooker can save a lot of time/energy with foods like legumes/meats etc.


+1 for pressure cooker. I use mine twice a week or so. For just about anything since there are so many recipes online. And less clean up!


> A better way to save energy is to use an electric kettle to get the water up to boiling.

An even better way (and one that also saves cleanup) is to use an electric pressure cooker like an InstantPot to cook the pasta


or an induction stove. I had an induction cooktop, moved and ended up with a gas cooktop again - I was astonished at how much slower and overall worse it was than induction. I quickly replaced the gas with induction and am now enjoying it once again. The only downside for induction is you have to have induction-ready cookware - but really it's not that hard these days. Cookware and More has All-Clad seconds, and if you sign up for their email newsletter they run a sale - buy one, get a second 50% off - it's how I built my collection. Made In is a medium quality alternative - made in Italy. For affordability, Tramontina is surprisingly good. You can find it at Walmart, Costco and other places. It's not quite as dense as Made In or All-Clad but any clad cookware is way better than anything with a damn disk in it.


Speaking of electric kettles: why aren't there insulated kettles that would help me save energy when I re-boil water after 20 minutes?

EDIT: Ah, there are some: https://www.amazon.com/DASH-Insulated-Electric-Kettle-Cordle...


I just have a thermos flask next to the kettle. The excess ALWAYS gets put to good use: extra cups of tea (without reboiling), cooking, dishes in the sink, or even the kids' bath.



If you're reboiling water, aren't you already doing it wrong?

Ideally you should only be boiling the water you need.


I don't have reverse-osmosis filters, so I get quite a lot of residue (limescale), which I'd rather leave in the kettle.

Moreover, I sometimes want to add extra hot water to my 0,5L cup of tea. When boiling water, I already add the extra water I'll need so I get it just in time (except it has already lost much heat, which is why I'll try to get an insulated kettle). My wife may or may not want to drink tea, but I have hot water ready anyway (yeah, this could be organized better).


Throw a kitchen towel over the kettle after you pour off your first drink.


Unless the electricity came from gas or coal. Then it's a wash.

Induction hobs are about as good as a kettle (and you lose less energy because you're not heating up the glass/plastic/metal of the kettle).


My housemate makes pasta nearly every night, so we decided to optimize the process. We found that bringing the water to boil in a borosilicate flask in the microwave, adding the pasta and then returning it to the microwave with a pyrex bowl of the sauce and meatballs was the fastest and most economical method. I think from start to finish it takes 9 minutes total, 7 to bring the water to a boil and another 2 to reheat the sauce and cook the pasta. Something like 150 watt-hours of energy in a 1kW microwave.
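The arithmetic behind that figure, assuming the microwave draws a steady 1 kW for the full 9 minutes (real magnetrons cycle, so this is an upper bound):

```python
# Energy used by the microwave pasta method: ~9 minutes in a 1 kW microwave.
power_kw = 1.0
minutes = 9
energy_kwh = power_kw * minutes / 60  # kWh = kW * hours
print(f"{energy_kwh:.2f} kWh = {energy_kwh * 1000:.0f} Wh")
# 0.15 kWh = 150 Wh
```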


Doesn't this seem crazy counter-intuitive. Turn gas into heat, water to steam energy to electricity, then transmit the electricity, then turn the electricity back into heat. Versus turn gas into heat, water to steam?


Unless it's winter where you're heating your house anyway, in which case the gas hob is more efficient.

But gas is a lot cheaper than electric anyway so I wouldn't expect a kettle to be much cheaper.


Isn't this a thing? This is how I have been cooking for a long time - bring water to boil in kettle and use it to cook.


For some reason I thought most people had switched to electric induction cooktops.


After installing an induction stove, I did a side-by-side test on boiling water on it vs kettle. The kettle won, both on power consumption and time. This is for a ~2000W kettle.


There are a lot of resistive electric ranges, and around 40% of Americans still use gas stoves.


You can also use a lot less water than is standard: just enough to cover the noodles. Kenji Lopez-Alt has written about this a bunch [1]. Noodles cooked that way were indistinguishable in his testing. He's also discussed it on his cooking youtube channel, which I recommend to anyone. For example, a 3-ingredient macaroni and cheese [2] in which he discusses the noodles at 0:40-ish.

[1] https://www.seriouseats.com/how-to-cook-pasta-salt-water-boi...

[2] https://www.youtube.com/watch?v=kWge-2jT9ZQ


Kenji's 3-ingredient mac and cheese is a masterpiece. It's become the go-to meal for my kids. It also has to be the peak in energy efficiency for pasta.

The idea is that by minimizing the water you can maximize the starchiness of the water, which creates a great emulsifier for the cheese sauce, along with evaporated milk. The combo is truly creamy mac and cheese, with no grittiness or separation of oils. It tastes great, it's faster than Kraft, it's energy efficient, it's easy to learn (literally 3 ingredients plus water).

It also has easy-to-remember units (6 oz pasta, 6 oz cheese, 6 fl oz evap milk). The evap milk amount perfectly divides the USA standard 12 fl oz evap milk can.

Kenji has tweaked the ergonomics of scratch mac and cheese light years ahead of where it was.


We use the recipe all the time too, but I can't understand it being given in 6oz proportions. None of the ingredients, except maybe some cheeses, come in 6oz sizes standard.

I do a 16oz box of pasta, 12 oz can of evaporated milk, and 16oz bag of shredded cheese. I put the milk and cheese in while there's a bit more water still in the pan, and have never missed the extra bit of evaporated milk.


And you even get pasta water way more concentrated in starch, which is useful if you're using it to make a sauce


The claim is weird, though. If they said “just add to boiling water and turn off immediately” then they would claim it’s 100% more efficient*.

If you turn off your water right before it’s boiling and cook your pasta, then Barilla is an energy generator*.


Actually reverse cycle air conditioners are said to have a heating efficiency of over 100%, but what they really do is to move the heat from one point to another. It's all about definitions; if you define the system as your room, it will be more than 100% efficient. If your system is the universe, then it's always 100%.

By the same logic, yes, your hypothetical statement can be rephrased as "if you already have boiling water, cooking pasta doesn't require any energy".
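To make the definitional point concrete (the numbers here are illustrative, not from any specific heat pump):

```python
# A heat pump moves heat rather than generating it, so the heat delivered
# to the room can exceed the electrical energy consumed.
def cop(heat_delivered_kwh, electricity_used_kwh):
    """Coefficient of performance: heat out per unit of electricity in."""
    return heat_delivered_kwh / electricity_used_kwh

# 1 kWh of electricity moves 2 kWh of outdoor heat indoors, delivering
# 3 kWh total -> "300% efficient" if the system boundary is the room.
print(cop(heat_delivered_kwh=3.0, electricity_used_kwh=1.0))  # 3.0
```

If you instead draw the boundary around the whole system, including the outdoor air the heat came from, energy in equals energy out and nothing exceeds 100%.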


But the whole point is that Barilla's definition is ludicrous. The only definition that really makes sense is how much energy did it take to get the pasta from the box to my plate, and Barilla is excluding the vast majority of that energy in their 80% number for the sole purpose of inflating it to make it look better.


Barilla's definition is ludicrous, and I'm playing the Devil's advocate.

> The only definition that really makes sense is how much energy did it take to get the pasta from the box to my plate

Including the energy required (and the carbon footprint) to manufacture the pasta in the first place, yes? Or no?


no, because we're only comparing the energy required to cook the pasta; not the whole lifecycle.


If your definition of efficiency encompasses change within the entire universe, you could say that everything has an efficiency of 0% because matter and energy cannot be created or destroyed.

Back in reality, the purpose of a heat pump is to heat/cool an enclosed space. Its efficiency is rated by comparing it to a direct conversion from electricity to heat. So long as what occurs outside doesn't affect what occurs inside, it isn't relevant to the question of efficiency.


Except that would not cook the pasta.

I haven’t tried this technique yet, but one of the aims is to still cook the pasta effectively. I know from experience that your hypothetical won’t cook most dried pasta.


You "know" incorrectly. Pasta "cooking" is mostly rehydration, and that happens at temperatures well, well below boiling.

https://www.seriouseats.com/tips-for-better-easier-pasta

Getting the water to a boil is probably just a way for Barilla to make the instructions foolproof- after adding the pasta, the water will still be hot enough to rehydrate said pasta.


> instructions foolproof

and allows leeway for different pots, stoves, fried power cables...

I use this technique when I don't want to bother running a second stove top and making a sauce or something for the pasta.

It takes more time, though.


This is a bizarre thing to put in the fine print. They're literally just excluding most of the cooking process from their calculation and I can't see any reason why other than to artificially inflate the efficiency stat.


> When I boil pasta, I use full heat to get the water boiling, drop in the pasta, then let it come back up to a boil, then drop the temperature as low as possible to keep a slow boil going--about medium-low.

Do you put a lid on the pot?

If not, you're running a reboiler. If you were to do that in a region of the earth and season of the year where you have to heat your place you'd probably be wasting double. Because the additional moisture from evaporation has to be removed from your place later through ventilation.

It always baffles me how many people don't care/know just how much less energy they could use to achieve the exact same result, just by putting a lid on.


> Guesstimating that the boil takes about as long as the time to cook the pasta,

You need a new stove. Edit: or a new pot


Bringing 5+ quarts of water to a boil takes a lot of energy. Hell, it takes my jetboil 90s-120s to boil low-temp water in like...1 quart quantities. It doesn't seem unreasonable to me that a large pot of water takes 8-10 minutes to bring to a rolling boil.


More than a gallon? That's a big batch of spaghetti - it stands to reason that if you're feeding a great multitude, there's going to be a cost in cooking fuel.


You don't need to cook pasta in a lot of water. Just enough to cover the pasta is fine.


Right. And in fact, it's better that way. There's more starch in the water, which makes it more useful when you add some to the sauce in your sauce pan.

What, you don't finish your pasta by heating it with the sauce??? Hang your head in shame.


No, they shamelessly pan-fry pasta (and rinse it briefly with fresh water to get rid of the starch and stop the cooking, preserving the bite; this also works just fine with a low amount of water).


Or just cook your pasta in the sauce.


Or he is stuck with an electric coil stove that he can't replace. Those suck.


You can actually turn off the stove after the water starts boiling again, as long as you keep the lid on. That's how I have been doing it for a while, with no dramatic difference in the end result (and yeah, I am Italian, so I tend to complain about the details in food-related matters).


The claim smelled of bullshit from the get-go. Boiling the water takes much longer than cooking the pasta.


The asterisk always means bullshit.


Also probably assuming gas. Induction would take another 90% off.


Actually since they're using percentage, it doesn't matter


If you google your question, you literally get a set of videos showing you how to do it.

"oneplus 10 pro qr": https://www.google.com/search?q=OnePlus+10+Pro+qr

In 30 seconds, I found that you can do it by using the Google Lens feature, which is a button to the left of your zoom button in your camera app. If you don't want to use that, you can use google lens directly.


Ah, the trick is not to have disabled the Google Lens app. I'd seen that video before, but it never clicked that Lens was required. Thanks.



If you read through to the bottom, you'll see the article itself is just a call to action for yet another service which the author is starting. It's an ad. Seeing it buried so deep doesn't make me feel that the author has any better intentions than Google itself -- credibility already gone.


That's an extremely jaded outlook. Just because the OP is presenting an alternative solution doesn't necessarily mean it was all borne out of greed/profit-seeking. At least the OP's service was built from the outset to be facts- and evidence-based unlike Google search results.

I'm really not sure what you expect. Should articles which point out problems be written completely independently of the people creating solutions to those problems? Cos clearly that is totally unrealistic.


Well, yuck. I did completely miss that this was just a shitty advert complaining about other shitty adverts.


I'll ask you what I asked the other commenter: should articles which point out problems be written completely independently of the people creating solutions to those problems?

Even if it is an advert, it's not a shitty one. It presents scientific arguments and is well-sourced. It's about as a good as an "advert" as you'd ever get. Really, what more do you want from people?


Frankly, I'm just done with advertisements and commercial "speech" being normalized everywhere, including in articles, regular speech amongst humans, and more. I've been done with commercialism fake "culture" for a LONG time. And the more we have, the worse the pollution is with genuine conversations.

I think that's the real problem with articles like this. The moment I realized it was a commercial piece masquerading as a scholarly meta-article, I have to question all the previous discussion they have. Is it right? What are they trying to sell? What viewpoint are they trying to get me to follow?

Advertising is codified monetary deceit. We've moved long past "Buy my soda cause its yummy", to an ever-present dread of "if you dont have this, you'll be sorry" in a round-about way.

> should articles which point out problems be written completely independently of the people creating solutions to those problems?

If there's a commercial motive to use/buy/rent their shit, absolutely yes.

> It presents scientific arguments and is well-sourced.

Depends. What are the biases of their arguments that the underlying commercialism is modifying? It now needs its own analysis to see if they were trying to sell me something different on dread or FOMO.


The article is just an ad.

> This was my motivation for starting GlacierMD: I wanted to give consumers the information they needed to make evidence-based health decisions. And although I lost that battle (spectacularly) I'm still fighting the war! If this is a topic that interests you, I'd urge you to

