
Microsoft cURLs too - TXCSwe
https://daniel.haxx.se/blog/2018/01/13/microsoft-curls-too/
======
lettergram
One thing I always make sure to share with colleagues when we discuss curl is
that, from the command line, you can generate the underlying C code.

This is useful when creating a CLI for pretty much any app, and I've used it
regularly for exactly that.

My post on how to do it: [http://austingwalters.com/export-a-command-line-
curl-command...](http://austingwalters.com/export-a-command-line-curl-command-
to-an-executable/)
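
For reference, the flag in question is `--libcurl`, and it's easy to try without touching the network by using a `file://` URL (the filenames here are arbitrary):

```shell
# --libcurl writes out C source, built on the libcurl "easy" interface,
# that reproduces the command-line transfer. file:///dev/null keeps this
# offline; generated.c is an arbitrary output name.
curl -s --libcurl generated.c -o /dev/null file:///dev/null
grep -c curl_easy_setopt generated.c    # the emitted code is plain easy-API calls
```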

~~~
aplorbust
It wasn't until 2006 that curl added HTTP/1.1 pipelining support. Hence I
always used netcat instead of curl, because I utilised pipelining heavily for
text/html retrieval (IME, most servers supported it).

Imagine something like this with curl:

    curl << eof
    http://example.com/a.htm
    http://example.com/b.htm
    eof

where curl only opens a single connection.

Alas, AFAIK, pipelining is still not enabled in the curl binary.

As I understand it, the --libcurl option only generates code for what is
possible with the curl binary, e.g., curl_easy_init(), curl_easy_setopt(),
etc.

As such, it will not generate code using curl_multi_init(),
curl_multi_setopt(), etc.

I have to automate the code generation myself.

~~~
aplorbust
To reproduce:

    curl --libcurl 1.c http://example.com/a.htm http://example.com/b.htm
    grep curl_multi_init 1.c

[https://curl.haxx.se/mail/archive-2008-02/0036.html](https://curl.haxx.se/mail/archive-2008-02/0036.html)

~~~
aplorbust
Here is how RFC 7230 defines "HTTP/1.1 pipelining":

"6.3.2. Pipelining

A client that supports persistent connections MAY "pipeline" its requests
(i.e., send multiple requests _without waiting for each response_)."

AFAIK, the curl binary does not do pipelining by this definition.

And, AFAIK, it will not generate code to do pipelining by invoking it with
--libcurl.

------
orf
Curl is alright, and congratulations on this massive and very impressive step
forward, but the CLI is not exactly user friendly. httpie[1] is a great tool
if you find curl invocations somewhat arcane.

1\. [https://httpie.org/](https://httpie.org/)

~~~
Something1234
HTTPie is great, but sometimes you need curl. Curl supports a lot more
protocols.

~~~
Rapzid
Plus, it's easy to forget that httpie is SLOW. Sometimes when testing an API
the timings are worrying, until you remember and switch over to cURL to
confirm everything is right with the world.

~~~
nicolaslem
I don't know why you are getting downvoted. That's true, starting a Python
interpreter is a relatively expensive operation. There is a noticeable lag
with most CLIs written in Python.

There has been some work recently to make the interpreter start faster. I hope
we will see the result in Python 3.7.
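
The overhead is easy to measure for yourself; this sketch assumes `python3` on PATH and GNU `date` with nanosecond support (absolute numbers vary by machine):

```shell
# Time a do-nothing Python invocation: the cost is almost entirely
# interpreter start-up (initializing the runtime, importing site, etc.),
# typically tens of milliseconds.
start=$(date +%s%N)
python3 -c 'pass'
end=$(date +%s%N)
echo "python3 cold start: $(( (end - start) / 1000000 )) ms"
```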

------
gregmac
For debugging on remote servers this is pretty handy (at least, once it makes
its way into the server versions).

Now I have hope they'll put in a text editor that understands unix line
endings.

~~~
martin-adams
It's like they're trying to set a record by not doing that. For years I've
been baffled why Notepad, at the very least, couldn't understand unix line
endings.

~~~
eat_veggies
It's possible they lost the source code and literally _can't_. They've done
it before:

[https://www.bleepingcomputer.com/news/microsoft/microsoft-
ap...](https://www.bleepingcomputer.com/news/microsoft/microsoft-appears-to-
have-lost-the-source-code-of-an-office-component/)

~~~
userbinator
Notepad is literally a window with the standard Windows Edit control inside
it, so they certainly have the source code.

My guess as to why they don't care to support \n-only is that there's been
very little need to; anyone who needs more advanced editing isn't going to use
Notepad anyway.

As the sibling comment mentions, WordPad (which is similar but with a RichEdit
control) _does_ support \n-only.

~~~
TazeTSchnitzel
If you want notepad-with-LF-support, ReactOS, the open-source Windows NT
clone, has this.

------
nodesocket
> They ship 7.55.1, while 7.57.0 was the latest version at the time. That’s
> just three releases away so I consider that pretty good. Lots of distros and
> others ship (much) older releases.

Indeed. I am running the latest High Sierra and:

    curl --version
    curl 7.54.0 (x86_64-apple-darwin17.0) libcurl/7.54.0
    LibreSSL/2.0.20 zlib/1.2.11 nghttp2/1.24.0

------
ComodoHacker
On a side note, for all my daily download tasks (other than debugging some web
API) I've settled with aria2[0]. It seems to support every protocol used in
modern Internet and has plenty of flexibility (connection multiplexing,
bandwidth control etc.). It can even serve as 24/7 torrent client managed via
remote API.

0\. [https://aria2.github.io/](https://aria2.github.io/)

------
qwerty456127
BTW another program (and a library) that should, IMHO, be made a standard
component of every modern OS (except those that would choose to exclude it for
a practical reason, e.g. some of extremely-lightweight and heavily specialized
embedded ones) is SQLite3

------
qwerty456127
Cool. I've been installing wget on every Windows PC in my authority since the
days of Windows 98SE. Every net-enabled operating system is to have such a
tool installed by default.

------
ape4
Now, just ~1000 other commands to go

~~~
shitloadofbooks
But they're _all_ already there via WSL (Bash on Ubuntu on Windows) if you
want them.

Not to mention that the PowerShell equivalents of a lot of *nix commands are
_much_ better. "Everything is an object" is a brilliant philosophy and it's a
joy to use.

~~~
frou_dh
Regardless of its merits, they blew it in terms of marketing. PowerShell is
destined to be a dead-end technology used only by Windows sysadmins.

~~~
Spivak
I don't think MS envisioned PS as anything other than a sysadmin tool. If they
did, it doesn't seem to come through in the design.

~~~
frou_dh
Doubt that. You think in the PowerShell design phase, the team would have
shrugged if some MS decision maker somehow mentioned "BTW, 10 years from now,
for developers, we're going to be promoting installing a Linux emulation layer
and using Bash"? I bet the PS folks would have been dismayed.

------
colemannugent
Cool, now I don't need to remember the arcane incantation to download a file
with Powershell.

Do you think we'll see things like the good old "curl <some url> | bash" for
Windows now? They still have no package manager worth using.

~~~
Iv
> the good old "curl <some url> | bash"

I would be thankful if the habit of trusting a random IP with control of one's
shell could die, forever.

~~~
jjnoakes
Why would you be using a random ip? That incantation is usually used with a
URL where you would have just downloaded and installed the software manually
anyway.

There are various arguments against the curl-piped-to-shell idiom but "random
ip" doesn't seem like a valid one.

~~~
shakna
A URL and an IP are not equal.

Building a script that acts differently for a web browser, a normal download,
and curl, is trivial, and I've seen it happen. Here's a proof-of-concept[0]
someone else wrote.
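
The linked proof-of-concept detects curl server-side; the core trick can be sketched in a few lines (`serve_payload` is a made-up name, and real attacks can key off more than the User-Agent header):

```shell
#!/bin/sh
# Sketch of a server branching on the client's User-Agent: a curl client
# can be handed different content than a browser fetching the same URL.
# serve_payload is hypothetical; a real server would do this in its handler.
serve_payload() {
  case "$1" in
    curl/*) echo "echo this-only-curl-sees" ;;
    *)      echo "echo what-a-browser-sees" ;;
  esac
}
serve_payload "curl/7.55.1"
serve_payload "Mozilla/5.0 (X11; Linux x86_64)"
```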

Manually downloading is safer: at least you can review what you got. The
better option is still a package manager, but curling straight into a shell
is inherently unsafe.

[0] [https://jordaneldredge.com/blog/one-way-curl-pipe-sh-
install...](https://jordaneldredge.com/blog/one-way-curl-pipe-sh-install-
scripts-can-be-dangerous/)

~~~
jjnoakes
Why would you be using an IP and not a URL?

And why wouldn't you trust the source of the install script as much as any
other installer?

Do you audit the binary installers you use as well?

I don't disagree about a minor difference between the methods, but I
definitely disagree with piping to sh being "very unsafe". If you trust the
site/author enough to run their code on your computer at all, the install
method risk difference is but a tiny drop in the bucket.

~~~
shakna
> If you trust the site/author enough to run their code on your computer at
> all

Piping curl means you can't be sure it came from the author's site.

It means you can't be sure you're getting the same software you've been
considering installing.

It means a broken connection is a broken install, with no cleanup and no idea
what it has changed.

> Do you audit the binary installers you use as well?

Don't install random binaries either. The security implications of that should
be fairly obvious.

~~~
Spivak
> Piping curl, means you can't be sure it came from the author's site.

Up to the trustworthiness of the CA system yes you can. If the author's site
is serving malicious downloads to the curl UA then you're probably hosed
either way. It would be easier to just slip malicious code in the software
itself.

> It means a broken connection is a broken install, with no cleanup and no
> idea what it has changed.

This is the real draw of package management. The argument surrounding
curl|bash should really focus on this rather than hand-wavy security.

> Don't install random binaries either

Nobody who is running curl|bash is installing a 'random' binary; they're
downloading an installer from a source _they_ trust.

~~~
shakna
> Up to the trustworthiness of the CA system yes you can. If the author's site
> is serving malicious downloads to the curl UA then you're probably hosed
> either way. It would be easier to just slip malicious code in the software
> itself.

Only if they have HSTS; otherwise you might end up using plain ol' http by
accident. Like over at surge.sh, but at least they use a package manager.
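
For what it's worth, curl has a built-in guard against exactly this accidental-plaintext case: `--proto` restricts which schemes it will speak at all:

```shell
# With --proto '=https', curl rejects any non-HTTPS URL before a single
# byte goes over the wire, so an accidental http:// link fails loudly
# instead of silently downgrading.
curl -s --proto '=https' http://example.com/install.sh -o /dev/null \
  || echo "refused: http is disabled"
```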

> Nobody who is running curl|bash is installing a 'random' binary; they're
> downloading an installer from a source they trust.

But you can't trust it, because most shell scripts out there are woefully
inadequate. So you're one broken connection, one WiFi drop, from corrupting
your system. At least a binary needs to be complete to run.

Example: Heroku's CLI [0]

If it breaks on the echo, you could end up overwriting your entire source
list.

e.g. It breaks to:

    echo "deb https://cli-assets.heroku.com/branches/stable/apt ./" > /etc/apt/sources.list

instead of the intended

    echo "deb https://cli-assets.heroku.com/branches/stable/apt ./" > /etc/apt/sources.list.d/heroku.list

And it'll work too, because the entire thing runs as the root user.
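
For completeness, the standard defense against this truncation hazard is to wrap the entire script in a function and call it only on the last line; a cut-off download then parses as an unterminated definition and executes nothing. A minimal sketch (the body is illustrative, not Heroku's actual script):

```shell
#!/bin/sh
# If the download breaks anywhere before the final line, the shell hits
# end-of-file inside the function definition and runs nothing at all.
main() {
  echo "adding apt source (illustrative)"
  echo "installing package (illustrative)"
}
# Only reached if the whole script arrived intact:
main "$@"
```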

> The argument surrounding curl|bash should really focus on this rather than
> hand-wavy security.

They're the same thing. A broken connection with curl | sh is a security
problem. As is downgraded https, because of an accidentally misconfigured
host. As is running without even the basic check of seeing if you get the
complete file before executing it.

Everything about curl | sh is inherently untrustworthy.

[0] [https://cli-assets.heroku.com/install-ubuntu.sh](https://cli-
assets.heroku.com/install-ubuntu.sh)

~~~
jjnoakes
> Everything about curl | sh is inherently untrustworthy

Nope. Only one item is of _minor_ concern (which I've covered many times in
this thread) and it has an easy and known solution.

The rest of your objections are not specific to curl-piped-to-sh and are
irrelevant to this discussion.

The sky is not falling, so I'm not sure what your agenda really is.

------
chrisper
What would be the reason behind disabling all those protocols?

~~~
syncsynchalt
Probably to reduce exposure to security issues. The more code you ship, the
more code you're responsible for keeping secure.

Looking briefly at the list at
[https://curl.haxx.se/docs/security.html](https://curl.haxx.se/docs/security.html)
I see issues for FTP (x2), IMAP, and TFTP in 2017 alone. These protocols,
which are outside of curl's core competency of HTTP, are likely to have less
scrutiny and more bugs. While FTP shouldn't be removed from curl, I don't
think a protocol like TFTP or Gopher is crucial, and I wouldn't mind too much
if one got the axe in a distribution I used.
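
You can see exactly which protocols a given curl binary was compiled with; the trimming described above happens through curl's per-protocol `./configure` switches (e.g. `--disable-tftp`, `--disable-gopher`):

```shell
# The "Protocols:" line of curl --version reflects compile-time choices;
# a distributor that built with --disable-tftp simply won't list tftp here.
curl --version | grep -i '^protocols:'
```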

~~~
pritambaral
HTTP is NOT curl's core competency, "transferring data with URLs" is. HTTP
just happens to be the most often used in the world, and thus in curl.

------
ghews
I don't like Microsoft, but I still hope Linux on Windows will have a bright
future.

------
oblio
Either the Curl developers are at fault somewhere, which I somehow doubt, or
distributions are really special snowflakes, which I also doubt, or software
distribution in the Open Source world is, in my opinion, flawed:

> Finally, I’d like to add that like all operating system distributions that
> ship curl (macOS, Linux distros, the BSDs, AIX, etc) Microsoft builds,
> packages and ships the curl binary completely independently from the actual
> curl project.

Why would _everyone_ rebuild it? There are some security considerations
(matching source and binary; disabling "dangerous" stuff) and some feature
considerations (disabling stuff you don't need to reduce resource usage,
maybe), but conceptually this seems so wrong to me.

Conceptually, I'd want downstream packagers to talk to upstream developers so
that upstream has reasonable defaults and settings, and I'd want packagers to
just _package_ and make the package follow distribution conventions. But
rebuilding seems like overkill.

Maybe I'm missing something obvious?

~~~
lwf
Almost all Linux distributions rebuild upstream software from source -- this
ensures everything is built from the same toolchain (gcc/libc etc), and that
the binaries distributed match the source.

It also allows for ease of patching in a stable release -- generally it's
preferred to just fix specific high-impact bugs rather than moving to a new
upstream version, which might introduce regressions.

(Context: I'm a Debian developer and on the Ubuntu MOTU team)

~~~
digi_owl
And then there is the whole dependencies shitstorm, where far too many
upstreams have a bad habit of breaking APIs etc. as they see fit.

There are ways to work around it, but it gets messy quickly. And rather than
clean up their act, they start championing things like Flatpak, which is
basically a throwback to the DOS days of everything living in its own folder
tree, with a bit of souped-up chroot thrown on top.

I really expect that if the likes of Flatpak become mainstream in the Linux
world, a flaw found in some lib somewhere will produce a stampede of updates,
because every damn project crammed in its own copy to make sure it was
present.

