Hacker News
Dt: Duck tape for your Unix pipes (dt.plumbing)
226 points by signa11 10 months ago | 157 comments



This article links to https://en.wikipedia.org/wiki/Vesta_Stoudt, which links to https://en.m.wikipedia.org/wiki/Cotton_duck, which explains that the word "duck" here comes from the Dutch doek, or "linen canvas".


TIL that "duck tape" is actually the correct/original version, and "duct tape" is a modern derivative.

I had always assumed it was the other way round, and some people simply don't know what a duct is so they substitute a similar-sounding, more common word.


Me too! The situation isn't helped by the registration of "Duck Tape" as a trademark for a specific brand of tape -- but that appears not to have happened until 1981, well after both names were in common use: https://trademarks.justia.com/733/00/duck-73300816.html


What an epic failure by the trademark office. That's as if a company had tried to trademark "Hammer" as a brand name for hammers, and the trademark office just said "Yeah, sure!"


They don’t check trademarks like they (are supposed to) do patents. You want a registration? Here you go. Now someone has to take it to court to see if it stands up.


Some countries do patents the same way. They're called non-examining patent offices.


You should check Windows. (Total Commander)


Or like "Apple" as a brand name for... nevermind


That's different. Apple as a trademark for a computer (or for a record company) is fine. Apple as a trademark for apples... not so.


Just that the "difference" stops when they now sue any grocery related company having anything like an apple in their logos..


Anyone can sue anyone else over anything. If those grocery companies actually fought them in court instead of caving in, Apple would get their ass handed to them.


I welcome you to try suing your neighbor for murdering you.

Courts do not accept all lawsuits.


Not a great example. After being found not guilty of murdering his wife in criminal court, OJ Simpson was famously sued by her family and lost in civil court.


Apple did complain about this logo too, for a bicycle route through Germany's second largest apple growing region:

https://apfelroute.nrw/


I used to be mad about this sort of thing too, until I learned that the incentives encourage companies to mount at least a nominal defense of their trademarks, in order to maintain an edge in court against actual, less defensible violations of the trademark.


Not to forget "Mail" for an email program. Made searching so much easier ... Not.

And never mind the dog named Dog. (A long time ago, some IEEE journal had a great cartoon where someone painted "house" on their house, "car" on their car, ... you get it. The cartoon illustrated an article about good and bad names for systems.)


Ever since I read the story about how the "Max Headroom" guys picked their name (because it was written at the entrance of every parking garage), I think the best way of naming a company is to make it something that is said hundreds of times by people every day.

"an apple a day" etc... wonder if I can name my comapny "the The company"

or "the corporation of Is" haha


If Max had been killed by a train, he would have taken the name, "Mind the Gap" or perhaps just the very appropriate, "Mind Gap".


> wonder if I can name my comapny [sic] "the The company"

https://en.wikipedia.org/wiki/LaCie


Ironically, the kind of tape you actually use on metal ducting isn't even duck tape; instead it's a thin aluminum foil tape.

Duck tape kind of sucks for most temporary applications as it leaves so much residue behind. Gaffer tape is an awesome replacement (although quite a bit pricier).


Someone won an Ig Nobel for research finding that duc(t/k) tape is singularly bad on actual ducts. (Because the adhesive melts at normal heating-duct temperatures.)

Their acceptance speech was written in Seussian rhyme. "Do not use duct tape on ducts. Do not use it: t'won't stay stuck."


Actually I even bring this up in the user-guide https://dt.plumbing/user-guide/tutorial/expectations.html

Also +1 for Gaffer tape, especially for audio/video.


Sounds like “duck tape” is adhesive fabric tape (maybe water resistant?) and “duct tape” is shiny metal tape for connecting air ducting? And both names are used informally for either or both types of tape by most of the population?

And then so many people post “facts” about a clearly fuzzy issue and proceed to argue about who’s “right”. I’m getting real blegg/rube vibes from the whole argument here.


I believe it's professionally called foil tape, or sometimes HVAC tape, to distinguish the two. In product listings, you often see SEO hacks like "HVAC foil sealer duct tape", because consumers still search for duct tape.

https://www.lowes.com/pd/3M-2-5-in-W-x-150-ft-L-HVAC-Tape/40...


> "Sounds like “duck tape” is adhesive fabric tape (maybe water resistant?) and “duct tape” is shiny metal tape for connecting air ducting? And both names are used informally for either or both types of tape by most of the population?"

This is how I always understood the situation, and then I've recently discovered the issue is further "muddied" by a brand name/trademark "Duck Tape" also, so that's fun...


Me too. This explains the thing about "duct tape" that never made any sense to me -- it is not the sort of tape you'd want to use on ducts.

But I grew up calling it "gaffer's tape", because it was popularly used by gaffers on film and TV sets.



I'm talking about how the term was used in the studios I was exposed to.

Even in that article, the two definitions are very, very close to being the same thing, and in the studios I was familiar with, both of the tapes the article defines would have been referred to as "gaffer's tape". They're just different varieties of gaffer's tape.

It's all just slang, though, so I'm not surprised that the definition varies from crowd to crowd.


Just ask KaTe Bush.


Or her Usenet fan group.

Damn, that neuron hasn't lit up since about 1992.


It would be amusing if this were merely an elaborate citogenesis hoax and the original name was indeed "duct tape", but we're all being gaslighted now.


ISTR from my scout days that "ducking" is a fabric surround at the bottom of tent walls that you tuck under the groundsheet - probably from the same root, I guess - though few online dictionaries seem to have heard of it.


The same cotton duck material is still used for old school Carradice cycle touring bags.

It's a nice waterproof alternative to oil based materials as the cotton swells when wet to seal its own seams.

https://carradice.co.uk/


Yeah, which explains why duck tape is not for ducts (or pipes), since it's made of fabric.

> She is often misattributed as the inventor of duct tape. However, numerous variations of adhesive cotton duck tape had existed for decades, nor did she invent the specific formulation of the popularized duct tape.


That is not what the thing you quoted says, nor is it correct. All duct tape is made of fabric, and it is… well, duct tape, so ducts were the intended use case when the name was created.

Duct tape has kinda become a general term for all cloth-based tapes though, so you can indeed find duct tape not intended for ducts, but the fact the name was originally “duck tape” has nothing to do with that.


ASHRAE/UMC actually specify tape 'for ducts' to be UL 181A-P/181B-FX [0] which is commonly made of metal foil and not cloth.

[0] https://tapeuniversity.com/products/foil-tapes/ul181ap-ul181...


I can find more [0] than [1] one [2] source stating that duct tape was ORIGINALLY used in duct work, and that it was later found to be no good for duct work, so we don’t do that anymore. I’m a bit dumb, so are you saying that everyone else is wrong about the history and duct tape was never intended for ducts? That duct tape was not used for ducts at all until 181A-P/181B-FX, because it wasn’t allowed?

Even your own source on a separate page [3] says:

> After the war, duct tape became popular with the general public. One popular use was holding together ventilation ducts. Ironically, while this is a use that duct tape does not normally have today, the name stuck and is used to this day.

So I am now confused.

[0] https://www.chicagotribune.com/redeye/redeye-is-it-duck-or-d... [1] https://en.m.wiktionary.org/wiki/duct_tape [2] http://www.todayifoundout.com/index.php/2010/02/duct-tape-wa... [3] https://tapeuniversity.com/industry/building-construction/du...


No matter how mundane a random claim posted on HN is, there is always a user with expert knowledge willing and able to respond with a "well actually...", citing incredibly specific details that I've never even tangentially heard about in my entire life until that point.


Nope, duct tape is for ducts and can't be made of cloth because the water would clearly seep through the cloth. There are strings in duct tape but not cloth.

The part I quoted says that duck tape is made of cloth and therefore is not duct tape.

I think people get confused because there is a brand of duct tape named Duck Tape. But even the Duck Tape packaging calls it a "duct tape".


> Nope, duct tape is for ducts and can't be made of cloth because the water would clearly seep through the cloth. There are strings in duct tape but not cloth.

I am sorry, but you are wrong. The term duct tape specifically refers to “Duct tape: cloth- or scrim-backed pressure-sensitive tape, often coated with polyethylene.” We don’t even need to argue whether it is good for use in ducts; it is not, I agree. However, that isn’t what we are discussing here. From Wikipedia:

“The ultimate wide-scale adoption of duck tape, today generally referred to as duct tape, came from Vesta Stoudt. Stoudt was worried that problems with ammunition box seals could cost soldiers precious time in battle, so she wrote to President Franklin D. Roosevelt in 1943 with the idea to seal the boxes with a fabric tape which she had tested. … [Johnson & Johnson’s] new unnamed product was made of thin cotton duck coated in waterproof polyethylene (plastic) with a layer of rubber-based gray adhesive (branded as "Polycoat") bonded to one side. It was easy to apply and remove, and was soon adapted to repair military equipment quickly, including vehicles and weapons. This tape, colored in army-standard matte olive drab, was widely used by the soldiers. After the war, the duck tape product was sold in hardware stores for household repairs. The Melvin A. Anderson Company of Cleveland, Ohio, acquired the rights to the tape in 1950. It was commonly used in construction to wrap air ducts. Following this application, the name "duct tape" came into use in the 1950s, along with tape products that were colored silvery gray like tin ductwork. Specialized heat- and cold-resistant tapes were developed for heating and air-conditioning ducts.”

> The part I quoted says that duck tape is made of cloth and therefore is not duct tape.

The part you quoted doesn’t say that at all. The part you quoted says “She is often misattributed as the inventor of duct tape. However, numerous variations of adhesive cotton duck tape had existed for decades, nor did she invent the specific formulation of the popularized duct tape.” Did you quote the wrong thing? It’s simply saying that the duct tape Vesta Stoudt developed was not the final product we know today, nor was she the first to come up with the idea.

> I think people get confused because there is a brand of duct tape named Duck Tape. But even the Duck Tape packaging calls it a "duct tape”

I honestly can’t keep track anymore.


Honestly I think we're elaborating more than needed, I won't die on this hill :)


You know what. In retrospect I have no idea why I cared so much and got so aggressive. I apologize for that. Cheers.


Not all ducts carry water...


I always believed that the name "duck tape" was given because it is waterproof as in "water off a duck's back" and duct tape was just an eggcorn.


Gaffer tape

Gaffer Duck

Donald Duck

These toons stick together


In Norwegian too, 'duk'. It's also called 'lerretstape' (canvas tape).


I can’t find a source at the moment, but I’ve read that it was also branded as duck tape because of its waterproof nature, making for some clever word play.


Nushell has filled this void for me. It elevates pipes by structuring them into lists or tables. It has all sorts of goodies for manipulating those data structures like inserting/updating rows and cells, better string interpolation, useful utility functions like case transformation, and more.


Agreed. Nushell fulfills the Unix philosophy promise that the standard Unix tools have been breaking since forever. Traditional Unix commands feel like a hodgepodge of hacks by comparison that only work together occasionally, and even then mostly by accident.


The Unix philosophy doesn't dictate any specific data format for interoperation between commands, beyond the data being streams, typically of text. Commands are free to interpret these streams however they want, which is why this seems like a "hodgepodge", but in no way were tools breaking any sort of promise.

FWIW I've been using *nix machines for decades, and very rarely have I had the need to use structured data between commands. Yes, it would make things cleaner or more robust in certain cases, but in practice there are common ways of handling this. E.g. most tools support outputting or inputting null-terminated lines.


Null separators between fields is structured data. It might just be the simplest that can possibly work (because the null character is outside the normal data range for text, unlike the newline, which demonstrably doesn't work in that sense) but it is structure. A list structure, specifically.

The world would be a much happier place if -print0 (or whatever) was universal, but no such luck.
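Concretely, the convention looks like this (a sketch; `echo` is prepended to the `rm` so nothing actually gets deleted, and the filename patterns are invented):

```shell
# Newline-delimited pipelines break on filenames containing spaces or
# newlines; NUL-delimited ones don't, since NUL can't appear in a filename.
find . -name '*.tmp' -print0 | xargs -0 echo rm --

# The same splitting, shown with plain printf:
printf 'one file\0another\0' | xargs -0 printf '[%s]\n'
```

Both `-print0` and `xargs -0` are non-POSIX, but supported by the GNU and BSD userlands.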


The painful part of all this is that there were and still are actual record separator and field separator characters in ASCII.

ASCII was perfect for table output. All a tool had to do when outputting was use the standard existing characters when no TTY was detected, and tabs/spaces and newlines otherwise.

I always wondered why no one uses the ASCII RS and FS codes.
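Nothing stops you from using them today, though. A sketch (the names and values are invented) that emits US/RS-delimited records and parses them back with awk's octal string escapes:

```shell
# ASCII unit separator (US, octal 037) between fields, record separator
# (RS, octal 036) between records: no quoting or escaping needed.
US=$(printf '\037'); RS=$(printf '\036')
printf 'alice%s30%sbob%s25%s' "$US" "$RS" "$US" "$RS" |
  awk 'BEGIN { FS = "\037"; RS = "\036" } NF { print $1, $2 }'
```

This prints "alice 30" and "bob 25", one record per line.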


That ... is an excellent question.


How would you deal with a tree?



That seems like a text format on its own? It doesn't even use RS/FS. Even when translating, it uses { ( ) , ; }: that's 4 characters to indicate a tree. Maybe you could use

  FS=, RS=; GS=( US=)
Is that what you mean?


Who cares?

It deals with 99.99% of the use cases that pipes are being used for right now.

That's more than good enough.

If you really needed to output a tree using only RS and FS characters, and since they are only a single byte each, you could cover that 1-in-1000 use case by using empty fields for the depth of a record, attaching each record to the current parent until the depth changes.
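That depth-prefix scheme can be made concrete in a few lines of awk (the separators and the sample tree are invented for illustration): N leading empty fields mean depth N.

```shell
# \036 = record separator (RS), \037 = field separator (US).
# Each record is one node; its leading empty fields give its depth.
printf 'root\036\037child1\036\037\037grandchild\036\037child2\036' |
  awk 'BEGIN { RS = "\036"; FS = "\037" }
       NF {
         depth = 0
         while ($(depth + 1) == "") depth++
         indent = ""
         for (i = 0; i < depth; i++) indent = indent "  "
         print indent $NF
       }'
```

This prints the tree with two spaces of indentation per level: root, then child1, grandchild, child2.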


I care, I wrote a converter[1] for it a long time ago; when I wanted to write a JSON one I got stuck.

Depth of a record seems useful, but then you'll have to make fields mandatory to distinguish real records from depth markers, slowing down the parsing.

[1]: https://github.com/Nomarian/AsciiDT


Sure, but the reason it works with any tool is because it's generic and simple. If each tool had to implement anything more sophisticated like a JSON parser and serializer, it would be a nightmare to maintain. Projects like Nushell essentially need to handle every type of output from any command, and every type of input to any command, which is an absurd amount of work, and just not scalable. Subtle changes in this strict contract means that pipelines will break, or just not be well supported[1].

If programs simply input and output unstructured data, it's up to the user to (de)structure this data in any way they need. The loose coupling is a feature, not a bug.

[1]: You can see this in Nushell's issue tracker[2]. I'm not judging the amount of issues, as any healthy OSS project will have many issues, but some of these are critical bugs related to command handling and interop. I'm not blaming the Nushell team either, my hat's off to them, but just pointing out that the nature of the project will inevitably lead to a neverending stream of these types of issues.

[2]: https://github.com/nushell/nushell/issues


> Projects like Nushell essentially need to handle every type of output from any command, and every type of input to any command, which is an absurd amount of work, and just not scalable.

I think you're misunderstanding how Nushell works. They don't parse outputs, or generate inputs, from/for standard Unix commands. Instead, they implement their own commands with the same names as standard commands, and generate/consume structured data by default, using the same data structures everywhere. There is only a single implementation of those data structures. That's very easy to maintain.

So running `ls` from Nushell does not shell out to the `ls` program on your system and then try to make sense of its output. It runs a Nushell-internal command that is tailored to the kind of pipelines that Nushell is built around. They already have hundreds of such commands implemented and working, and that approach absolutely does scale. Whatever issues may remain, it already works much more reliably than the default Unix tools.

Saying that unstructured text streams are a universal interface is like saying that atoms are a universal construction kit – it's technically correct, but pretty useless in practice.


You're right, I misunderstood the way it worked. But I'm not sure that approach is better. They either need to maintain full compatibility with existing tools, or users need to learn the idiosyncrasies of Nushell tools. And commands not reimplemented by Nushell wouldn't work, or they would need some kind of generic wrapper, which would have the drawbacks I mentioned.

But, hey, this obviously has users who prefer it, so if this works for you, that's great. Personally, I'll stick to the standard GNU and POSIX tools. I do concede that this is partly due to the robustness of this ecosystem and my familiarity with it, which is hard to abandon.

> Saying that unstructured text streams are a universal interface is like saying that atoms are a universal construction kit – it's technically correct, but pretty useless in practice.

My point is that offloading to the user the decision of how tools are integrated beyond raw byte streams is the most flexible and future-proof approach, with the least overhead for individual tools. Doing anything more sophisticated, while potentially easier for the user, would require maintenance of the glue layer by each tool's developer, or by a central maintainer à la Nushell. This loose coupling is a good thing.


> They either need to maintain full compatibility with existing tools, or users need to learn the idiosyncrasies of Nushell tools.

The existing tools aren't fully compatible with each other either. There are significant differences between GNU and BSD tools, for example, and yet more differences with BusyBox and others. The idea of "standard" tools is unfortunately an illusion, so not much is lost there.

But more importantly, most of the many options the traditional tools offer are related to output selection and formatting. In Nushell, those problems are solved in a unified way by piping to builtin commands that work with structured data. So instead of learning twenty different cryptic flags for `ls`, you just learn three or four postprocessing commands, and use them for `ls` and everything else.


Both gnu and bsd grep input and output a stream of bytes.

Without ever having tried it, I know that one random day, when for whatever reason I wish to take the output from BSD grep and send it over TCP through netcat, to be collected by zsh's built-in TCP support and fed into GNU's grep, it will work. No piece along the way made any jerk assumptions or required any jerk tight coupling, and the BSD and GNU tools were completely compatible.

That is more valuable and a greater convenience than any other poorly conceived idea to make it more convenient.

This is all 50 years after Unix pipes were invented, and in an environment the inventors did not even try to predict and handle. Instead they handled infinity by not trying to predict anything. They just made useful low-level tools which you assemble however you turn out to need to, and the tools make as few assumptions as possible, since assumptions all eventually break. The hammer doesn't only work with one kind of nail.


> The existing tools aren't fully compatible with each other either.

Right, but those incompatibilities, as well as the way commands interoperate, are left to the user to resolve. No monolithic tool could realistically make that easier, unless they reimplement everything from scratch, as Nushell has done. But then you have to work with an entirely different and isolated ecosystem, and you depend on a single project to maintain all your workflows for you. Again, the ability for loosely coupled tools to work together is one of the strengths of Unix.

We clearly have a difference of opinion here, so let's agree to disagree. :)


Murex (https://GitHub.com/lmorg/murex) doesn’t replace coreutils with builtins but manages interop with commands just fine.

Most output is relatively easy to parse, sometimes you need to annotate the pipe with what format to expect but that’s easy enough to do. And Murex does come with builtins that cover some of the more common coreutils use cases for instances when you want greater assurances of the quality of the data - but those are named differently to their coreutil counterparts to avoid confusion.


Murex is pretty neat, thanks for sharing.

Still, you must have issues parsing all variations of output, depending on the flags passed to the source command and its version. How do you parse the output of ls or ps without knowing the column headers, delimiters, or which version of the command was run (GNU, BSD, BusyBox, etc.)? Piping data into commands must also require a wrapper of some sort.

Not knocking on the project, it does look interesting, especially the saner scripting language. But the usefulness seems limited to the commands and workflows it supports.


Basically the same way you'd parse a CSV, except whitespace-delimited. You assume the headings are the first row. You can use named headings or numbered headings (like AWK), so you have options depending on the input and whether it contains headings.

The current implementation does break a little if records contain a space as part of its input (eg ‘command parameter parameter’ in ps) but I’m working on some code that would look at column alignment as well as separators etc — basically reading the output like a human might but without going to the extreme of machine learning. (I’m already doing this to parse man pages and --help output as part of automatic autocompletions so I know the theory works, I just haven’t yet applied that to more generalised command output).
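The basic heading-row idea can be sketched in plain awk (sample `ps`-like input hardcoded so it's reproducible; this is an illustration, not Murex's actual parser):

```shell
# Read column names from the header row, then select fields by name --
# whitespace-delimited "CSV" parsing as described above.
printf 'PID TTY TIME CMD\n101 pts/0 00:00 sh\n102 pts/1 00:03 vim\n' |
  awk 'NR == 1 { for (i = 1; i <= NF; i++) col[$i] = i; next }
       { print $col["PID"], $col["CMD"] }'
```

And as noted, this naive version breaks as soon as a field itself contains a space.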


That's the first time I'm hearing about this project (which I take it you are the creator of?). Very interesting!

How would you say Murex compares to Nushell? The syntax seems vaguely similar. Are there any fundamental differences?


Yes, I'm the author :)

Murex was created before most of the alt shells existed, to scratch a personal itch. It's only relatively recently that I've been promoting it. What I wanted to create was a shell that had typed pipes but still worked 100% with traditional POSIX abstractions. So it's still just standard POSIX pipes underneath, with type information sent out-of-band. This basically means you can have a richer set of functionality from anything that understands Murex while still falling back to plain old byte streams for anything that doesn't.

I've also taken inspiration from IDEs with regards to the interactive UX. You'll get syntax highlighting, dynamic autocompletions based on man pages (I'm shortly going to push an enhancement in that area as well), smarter hints (like tooltips), inline spell checking, and all sorts.

There's also been some focus on making the shell more robust, such as a built-in unit testing framework, watches (for debugging), etc.

There will still be plenty of rough edges (as is the case with all shells to be honest) but it's a vast improvement over Bash in my biased opinion. So much so that it's been my primary shell for > 5 years.


> Null separators between fields is structured data. It might just be the simplest that can possibly work (because the null character is outside the normal data range for text, unlike the newline, which demonstrably doesn't work in that sense) but it is structure. A list structure, specifically.

Sometimes I daydream about a parallel universe in which the designers of Unix decided on record-oriented IO instead of stream-oriented IO.

If pipes were defined in terms of records as opposed to an unstructured byte stream, there'd be no need for a special character (whether newline or null) to separate records. How is in-band signalling in pipes any better than in-band signalling in telecommunications?


> Sometimes I daydream about a parallel universe in which the designers of Unix decided on record-oriented IO instead of stream-oriented IO.

I would attack you with tea bags and waxed paper, if you tried to alter the timeline, and make this so.

On the other hand, I would happily sing your praises if you invented another kind of pipe, to your specs; it sounds like a great addition.

Imagine combining the two, in one series of commands!


> On the other hand, I would happily sing your praises, if you invented another kind of pipe, to your specs, it sounds like a great additional

Already exists - Unix domain sockets. Some shells (in particular some versions of ksh) use them to implement pipes. And, on some platforms (Linux yes, but I think maybe not macOS???) Unix domain sockets support record-oriented operation (SOCK_SEQPACKET). The problem is that for it to work, you don't just need the kernel to support it and the shell to use it, you also need all the utilities to support it too - that's a big ask.

The idea has been implemented on IBM mainframes (CMS pipelines aka Hartmann pipelines). But that’s a radically different platform, and IBM has never tried porting that to a non-mainframe platform, and nobody has ever sought to directly clone it (although stuff like PowerShell and NuShell share some of its ideas, albeit none of the details)


Hmm. Shame it never caught on in the OSS/GNU world.

Thanks for the FYI.


What if most standard Unix commands had a new option to insert into their output those nifty separators? (FS, GS, RS, US)

That ought to make life simpler for downstream commands that try to go beyond processing a text stream.


> What if most standard Unix commands had a new option to insert into their output those nifty separators ? (FS, GS, RS, US)

That's not what I mean by record-oriented IO though. That's signalling record boundaries in-band. I'm talking about signalling them out-of-band. So you have record lengths kept separately from the data, and passed around by APIs separately from the data bytes.


I've had the same idea. It would (I guess) require rewriting the kernel from the ground up to turn every stream into two streams?

Telecoms uses both concepts (in-band signaling and out-of-band signaling), but I guess the Unix developers didn't get the memo?


> I've had the same idea. It would (I guess) rewriting the kernel from the ground up to turn every stream into two streams ?

Many (but not all) Unix implementations already have support for “record-oriented pipes” in the kernel, it is just that support has rarely been used. Linux has record-oriented Unix domain sockets (SOCK_SEQPACKET), but few use them.

> Telecoms uses both concepts (in-band signaling and out-of-band signaling) but I guess the Unix developers didn't get the memo ?

The original Bell Labs Unix team created STREAMS, which does support true record-oriented IO. Most commercial Unix implementations (such as Solaris and AIX) support it (or at least did at one point). But Berkeley sockets won the mindshare competition, and open source Unix-likes (such as Linux and the BSDs) never added STREAMS support. There was a project to add it to Linux, but Linus was opposed to the idea, so it never got merged, and I think it has since been abandoned.

Even before STREAMS, Unix supported record-oriented IO in the terminal subsystem (cooked mode). But it wasn’t general, it was very specific to the needs of interactive use. STREAMS was intended as a generalisation but it never caught on. So even today all Unix-likes have a purpose-specific (rather than general) record-IO implementation in their tty subsystems, pseudoterminals, etc


It sounds like the situation is ripe for a couple of dedicated (crazed) developers to pick up the ball.


> but in no way were tools breaking any sort of promise

The Unix philosophy as usually stated includes the rule "write programs that work together".

Most Unix commands don't work together in any meaningful sense. For example, `ps | kill` will not terminate all processes, as one might naively expect.

I'd say this is a clear violation of both the letter and the spirit of the Unix philosophy.


I mean, neither will `ls | rm`, and a myriad of such combinations if you try them blindly, but it's unreasonable to assume that every combination of commands will work in the simplistic way you expect them to.

Tools like xargs exist to fill that void, and it allows the tools themselves to work independently.

Developing a set of tools from scratch that share a strict design principle is much easier than enabling an ecosystem of purpose-built tools developed over decades by external contributors to interoperate with each other. Leaving the exchange format open is a reason this still works so well today. Or would you rather use XML or whatever format was popular 40 years ago, have to adopt a new format whenever the previous one becomes outdated, and deal with compatibility hell when tools support different versions of the format?
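To spell out the xargs point (a sketch; `echo` is prepended so it prints the command instead of killing anything, and the PIDs are made up):

```shell
# kill takes PIDs as arguments, not on stdin, so a bare `ps | kill` has no
# way to connect ps's output to kill's argument list. xargs is the glue:
printf '101\n102\n' | xargs echo kill   # -> kill 101 102

# In real use, something like: ps -o pid= -u "$USER" | xargs kill
# (or just pkill <name>, which does the lookup itself).
```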


IMO, it's rather embarrassing how rarely they work together without a decent amount of effort. Especially in light of the fact that, e.g., `ls | rm` does work exactly as expected on Windows nowadays.


> For example, `ps | kill` will not terminate all processes, as one might naively expect.

Who expects it to behave like that? Quite the assumption.

Expecting automagic inference based on typing is building a system based on assumption. Have fun making breaking changes to a foundational component that every command will rely on. Everyone will assume you want typed data and if you don't accept/generate typed data then what? Will every shell program have to be burdened with this nonsense and require a rewrite?

In the end ps does exactly what it was meant to do: print information on processes. Your assumption is missing the step where you extract the necessary information you want, e.g. the pid. That is where unix philosophy comes in. Your assumption also seems to make foot guns easier as ps|kill should not be that easy.


kill `ps`

rm `ls`

kind of work


Everyone wants protobuf in their pipes.


The “Unix philosophy” is whatever the writer's philosophy is - it has no real meaning


Isn't this what PowerShell tries to do, with the difference that it's structured objects being passed instead of tables?


That's why I use PowerShell


I read the page a few times but I'm still scratching my head. How is this intended to be used?


Presumably it's meant to be used when you're writing a shell script and you have some problem in front of you that would be trivial to solve in a real programming language and you find yourself saying a sentence starting with "I just wanna…" and rage-googling or asking ChatGPT or whatever.


Would be great if it could show some things it can do that awk can’t. Because awk is typically my Swiss Army knife if I run into these situations.


I think it's written by someone who finds pipes and awk too awkward to work with.

I know people who can't be bothered to learn these tools and write complete programs to solve problems instead of a shell one-liner.

Looks similar.

Edit: My brain skipped words while typing.


> I think it's written by someone who finds pipes and awk too awkward to work with.

Actually it's exactly the opposite, it's born out of a love for pipes, and shells, and tools like awk. If you know anyone working at Amazon, ask them to search "11 years of work in a weekend" for a tale of shell heroics that I wrote about while I worked there.

dt is intended to be a new tool in the "shell one-liners" category, just with concatenative FP semantics. :) It will not be everyone's cup of tea, and I will still love and use awk when appropriate


> I think it's written by someone who finds pipes and awk too awkward to work with.

This idea of substituting AWK/shell/sed/Perl with a forth-like lang is a good idea in a sense, it doesn't break the flow because presentation comes later and logic is at the beginning of the dt part, with the aforementioned tools you have the logic and output mixed all over the place. I will however still use AWK.


> ...it doesn't break the flow because presentation comes later and logic is at the beginning of the dt part

I didn't mean to discredit the work done. It's a big undertaking in any case. The idea and the aim is good, however it breaks the conciseness and reduces the speed of implementation.

> with the aforementioned tools you have the logic and output mixed all over the place.

I think this is a secondary effect of composability, pipe and conciseness requirement.

> I will however still use AWK.

Me too, and this is why I made my prior comment, exactly.


awk is an inspiration, and a great simple tool. Not trying to compete, but add more tools in the space.

Probably dt will never be able to do things that awk can't do... At least for the non-trivial things. But I think it will be able to do some things with a more readable/declarative syntax.

I'll fill this out later, but imagine dt as trying to be a shell-friendly Functional Programming riff on awk, with first-class functions and no need to regex match or BEGIN etc. At the end of the day, assuming it catches on, I suspect choosing dt will probably be more often about taste


> you have some problem in front of you that would be trivial to solve in a real programming language

Which is the exact point in time when I solve the problem by using a real programming language.


Isn't that why Larry made perl? To avoid using a real programming language like awk? ;)


My read as well.

I write shell-scripts when the current tools solve the problem easily. I distribute shell scripts to colleagues (never customers) only when I absolutely do not want to install extra software on their system.

I avoid awk and perl because if I'm going to introduce a second language to a tool, I'm not going to pick the niche ones everyone only learns opportunistically if at all. At that point I'd rather pick something my colleagues are deeply familiar with.

And on a small level, writing these little binaries that truly do one thing and do it well and that I understand intimately is a private joy.


Awk and Perl niche? How about "reliably installed on every GNU/Linux box this side of the century". Their fault for not knowing their own systems. Anything pre-installed is fair game.


Yeah, the examples are focused on dt’s internal state and features. A few real world practical examples of what you can do with it would be much better.


For example, I am trying to figure out how I would split each line on a delimiter. I can't figure out what the syntax is:

dt [ "," split ] pls prints split

dt [ "," split ] map pls prints stack underflow

I think I could definitely use this if it had better docs and I could figure out how to.


I don't know what is intended there, but for simple line splits I'd use awk, which is typically everywhere, and its -F parameter (F stands for field separator). That is, given the file L with the lines

   hello,world,one
   maybe,baby,you
awk -F, '{ print $2 }' L

should give

   world
   baby
(tested on https://busybox.net/live_bbox/live_bbox.html )


I either use Bash (which is actually a decent programming language if you spend the time to learn it properly) or Python.


> Bash (which is actually a decent programming language)

How so? I tried to seriously learn Bash a few days ago, and once I learned about word splitting, [[ ]] vs. [ ], and "${MY_VAR}" just to properly reference a variable, it seemed even more full of technical debt than Wat-full [1] JavaScript. I figured it would be obsoleted by a modern shell language alternative by the time I'm 30 (currently 18), so it's not worth it, and gave up. I think the bar for a programming language to be decent should not be Turing completeness.

1: https://www.destroyallsoftware.com/talks/wat
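The word-splitting trap mentioned above, as a runnable sketch (plain sh or bash, default IFS assumed):

```shell
# Unquoted expansion undergoes word splitting; quoted expansion does not.
things="a b c"

set -- $things      # unquoted: splits into three positional parameters
echo "$#"           # prints 3

set -- "$things"    # quoted: stays a single parameter
echo "$#"           # prints 1
```

The same variable, with and without quotes, yields different arities -- that's the kind of "wat" being discussed.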


For me, somewhat surprisingly, the best bash resource I've come across is the manpage. If you haven't read this and have instead relied on any other resource, I would recommend spending some time to read the bash manpage before deciding to give up on it. To me this has transformed bash from what I once considered a cryptic ugly mess to one of my favourite languages to write even relatively general-purpose code on.

The help command is the only other resource you need, imo.


There is a zero percent chance that Bash will be gone by the time you're 30. I recommend you use some of that youthful neuroplasticity of yours to get used to it; it is too valuable a skill to give up on simply because it's "wat-full". It is in part a valuable skill to have precisely because it is "wat-full".


I understand, but I don’t and probably won’t ever code for an employer. And AI assistants can help with the occasional wat syntax e.g. when I need to troubleshoot some software I use. Btw I just switched to fish.


Learning a programming language properly usually takes 5 years of daily practice using it to solve real problems.

There is no 'wat' situation for a tool you really are proficient with.


I use Python's `fileinput.FileInput()` a lot for ad hoc scripts, because its behavior is to read from STDIN if available, or to attempt to open a file from `sys.argv[1]` (possibly `[1:]`?) if not, which covers a ton of this kind of use cases.
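A minimal sketch of that pattern (the `number_lines` helper is hypothetical, just for illustration; `fileinput.input()` is the convenience wrapper around `FileInput`):

```python
import fileinput

def number_lines(files=None):
    """Print each input line prefixed with its line number.

    With files=None, fileinput falls back to sys.argv[1:], and to
    stdin when no filenames were passed -- the behaviour described
    in the comment above.
    """
    for line in fileinput.input(files=files):
        print(f"{fileinput.lineno()}: {line}", end="")

if __name__ == "__main__":
    number_lines()
```

So `./script.py data.txt` and `cat data.txt | ./script.py` behave the same, which covers most ad hoc filter scripts.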


It wasn’t mentioned on the landing page (I think) but is this a concatenative programming language?


It's stack based, like Forth and RPN. So 1 2 3 * + means something like

    push(1); push(2); push(3); multiply-top-of-the-stack(); add-top-of-the-stack();
So `status pls` means: call the function status (which leaves its results on the stack) and print the top of the stack with new-lines (multiple lines if it's an array).

Then `status upcase pls` would do something like the above, but it calls a "to uppercase" function before printing.
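The stack discipline above can be sketched as a tiny RPN evaluator (plain Python, nothing dt-specific -- dt's actual semantics are richer):

```python
import operator

def eval_rpn(tokens):
    """Evaluate RPN tokens: numbers are pushed, operators pop two
    values and push the result."""
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # top of stack is the right operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack[-1]

# "1 2 3 * +": push 1, push 2, push 3, multiply, add -> 7
print(eval_rpn("1 2 3 * +".split()))  # 7.0
```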


While I think 'pl' and 'pls' for print line and print lines is probably an excellent choice overall I can't help but see 'pls' and try and parse it as an abbreviated INTERCAL.


Got it cheers


In my professional life I have cause to work with Powershell a great deal, and I love the object oriented pipelines that it has.

Being able to filter, sort, or transform properties of the objects passed over makes it much more straightforward to work with than bash et al.


I’m with you here. Once you get the hang of it you even miss the most awkward pipelines once they’re gone and you’re just passing around single dimensional strings.

The memory overhead with sets of objects and their properties vs vanilla strings is significant though and easy to bump up against when working with large data sets. Best to try to keep all of the processing for that dataset in a single pipeline if you can. But that’s tradeoffs.


Personally I only use shell scripting for simple tasks, not anything that would run into a memory limit. If it requires more heavy lifting, I'll write it in a programming language like C#. Nothing we do in production relies on shell scripts, they exist only as shortcuts for our workflow


I think that's a good rule of thumb but not universally applicable. Powershell is used a lot with Microsoft sysadmin/identity/mail software and other vendor software used to manage or integrate with said MS software. If you're processing tens of thousands of AD objects you can easily hit memory issues. You could have your staff learn a few best practices and continue using the wealth of existing tooling/knowledge that exists for Powershell, or write a bunch of custom .net code for every simple integration or job they want to run to process the objects in their directory.


Personally we use Linux boxes for everything remote, and I just develop on Windows locally


Yeah the memory usage of the pipeline is real. Often using foreach instead to save memory is needed. Or good old fashioned array chunking.


Exactly. If you really have to you can take advantage of its ability to use .net libraries directly and force immediate cleanup/gc of variables but it's super hacky and always found pipelining to work better once you get the hang of processing data in flight.


There's a lot of people trying to copy this to the Linux space. NuShell comes to mind: https://www.nushell.sh/

I had an idea where I'd make wrappers for all the popular commands that would accept JSON structured data instead of raw binary data and "do the right thing". And that way you could take advantage of it without having to change your shell tooling. And they'd all accept an extra argument called either --structured-out or --unstructured-out which would either emit JSON or render the output back to a "flat" string, as appropriate.
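A minimal sketch of that wrapper idea (everything here is hypothetical: the `jls` name, the `--structured-out` flag, and the chosen JSON fields are made up for illustration):

```python
#!/usr/bin/env python3
"""Hypothetical 'jls': an ls-like wrapper that can emit JSON,
sketching the --structured-out idea described above."""
import json
import os
import sys

def jls(path="."):
    """Return directory entries as a list of dicts."""
    entries = []
    for name in sorted(os.listdir(path)):
        full = os.path.join(path, name)
        entries.append({"name": name,
                        "size": os.stat(full).st_size,
                        "is_dir": os.path.isdir(full)})
    return entries

if __name__ == "__main__":
    if "--structured-out" in sys.argv:
        json.dump(jls(), sys.stdout, indent=2)   # machine-readable
    else:
        for e in jls():                          # render back to flat text
            print(e["name"])
```

Downstream tools would then consume the JSON with something like jq, without the shell itself having to change.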


up: Ultimate Plumber for shell

https://github.com/akavel/up


Nice one.

There have been so many nice CLI tools these last 10 years. Quality of life kind of nice. ag/rg, fzf, entr, up, lf...

Go and rust have been appreciated enablers, even though ruby/python/js demonstrated the concepts, with somewhat poor performances and UX.


Oh my!


Maybe I’m not the target audience, but this README is very light on real usecases and examples to help me understand when I would need this.


"Duck Tape" (versus duct tape) is a trademark of Manco, now Henkel Consumer Adhesives.[0]

Did some fancy DB work for them on OS/2 and early AS/400s in the '90s.

[0] https://www.duckbrand.com/about


Trivia: "DT" is Japanese slang for "dōtei" ("童貞", means "virgin").

Names are hard.


It's generally considered a feature for your pipeline to -not- be screwed, after all.


Is it nailed?


There's a fairly successful version control system whose name is an insult in British English.

And wait until you hear what the most popular open source alternative to Photoshop is called...


There are 5 pages of entries in Urban Dictionary for "DT" -- there is no winning with short names.


I don't really see what existing shell tools and scripting languages are missing that this project provides. Especially if you consider the expressive range from, say, grep/bc/tr, through sed, through awk, to perl or python.


I write a lot of perl -e pipeline members for this sort of thing and I can absolutely see dt producing something that's easier to skim read for somebody who only really knows shell.

(I got sufficiently competent at perl before really noticing sed and awk existed that I'm pretty terrible at the latter two, so YMMV etc.)


Title edit: original title is “duct tape for your unix pipes”


Forth/Joy/Factor vibes


What do the -- in the documentation mean? Also why does one command use double backslash (\\word) while others only a single one (\word)?


Isn't it the bash double dash[0] that signifies the end of cli options?

[0] https://unix.stackexchange.com/a/11382
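A quick illustration of `--` ending option parsing (this is the general POSIX convention, not anything dt-specific):

```shell
# A file awkwardly named "-n": without "--", cat would parse it as
# its "number lines" option instead of a filename.
printf 'hello\n' > ./-n
cat -- -n        # prints: hello
rm -- -n         # same trick to delete it safely
```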


So this is basically just Forth.

I'd suggest that a functional shell like es-shell is more interesting than this experiment =)

https://wryun.github.io/es-shell/


This is really interesting and fills a nice niche for when you don’t want to reach for python or whatever.

As an aside, I wish I was cool enough for lobste.rs!


Thank you to the two people who sent me invites, that was very kind :)


[flagged]


> During World War II, Revolite (then a division of Johnson & Johnson) developed an adhesive tape made from a rubber-based adhesive applied to a durable duck cloth backing. This tape resisted water and was used to seal some ammunition cases during that period.

> "Duck tape" is recorded in the Oxford English Dictionary as having been in use since 1899 and "duct tape" (described as "perhaps an alteration of earlier duck tape") since 1965.

https://en.wikipedia.org/wiki/Duct_tape


I stand corrected... thanks


Straight from the link:

> Excuse me, is it "duct" or "duck" tape?

> I mean sure, "duct tape" and "duck tape" are both fine. "Duct tape" is the more common name today. The older name is "duck tape" after the duck cloth under the adhesive.


Maybe at least read the link next time before wasting all of our time commenting (everyone who read your useless comment, multiplied by the amount of time it takes to read it).

The very thing you touched on is literally mentioned in like the 2nd paragraph.


Thanks for the moral lesson on how to spend my time.


Anyone else immediately think of the attention hack w.r.t. duck vs. duct? The hack is based on the observation that any content with little, weird mistakes or errors gets outsized attention.


In that case, I fell for it


I did as well. I should stop grouchily reading HN in the morning waiting for coffee.



"Duck Tape" is actually one brand of duct tape: https://www.duckbrand.com/products/duck-tape


Early ones at least had some parts from ducks, but there's no relation to ducts at all. Ducts are not sealed using duct tape.

Language should be updated to call it reinforced self-adhesive tape or something.


Title should be

> Dt: Duct tape for your Unix pipes

      ^^^^


Curiously enough, not really, from the post:

    Excuse me, is it "duct" or "duck" tape?
    I mean sure, "duct tape" and "duck tape" are both fine. "Duct tape" is the more common name today. The older name is "duck tape" after the duck cloth under the adhesive.

    As an aside: If you care about such things, make sure to read up on how today's common version became popular. A 51-year-old woman named Vesta Stoudt, mother of 8, mailed her idea for a better ammunition box seal to the president (who approved production) after her bosses didn't do anything with the suggestion.


Barely a day goes by I don't learn something surprising on HN that's entirely unrelated to IT/software dev.


"I suggested we use a strong cloth tape to close seams, and make tab of same. It worked fine, I showed it to different government inspectors they said it was all right, but I could never get them to change tape." Vesta Stoudt to President Roosevelt, February 10, 1943 [1]. History is full of "never get them to change" stories, probably one of the more famous is Napoleon's dismissal of the steam engine, although the story is a bit more complicated, "Fulton (and his design) failed at the worst possible time" [2]. As we found out recently, submarines are hard.

And on a completely unrelated note, one of the greater stories of quasi-forgotten sacrifice of a mother for her son is the story of a woman in 1850s travelling around 2,000 kilometres by foot, by horse, by any means to get her son enrolled into university, dying shortly after: her name was Maria Dmitrievna Mendeleeva, her son's name was Dmitri Ivanovich Mendeleev [3], that Mendeleev.

[1] https://en.wikipedia.org/wiki/Vesta_Stoudt

[2] https://hsm.stackexchange.com/a/13154

[3] https://chemaust.raci.org.au/article/julyaugust-2019/mother%...


The title of the article spells it "duct", and titles should not be editorialized.


Actually the title as in, html > head > title is "dt: duck tape for your unix pipes".

The poster probably used the HN bookmarklet which sets the title automatically.



