
Curl, 17 years old today - bagder
http://daniel.haxx.se/blog/2015/03/20/curl-17-years-old-today/
======
xrstf
> "If it doesn't load through curl, it's broken." --someone

So, so true. Thanks, curl.

~~~
laurent123456
Curl is great and all, but I don't think this quote makes much sense. Are
there examples of services that would, for example, load in a browser but not
through curl? I'd think it's rather the opposite, since you can throw pretty
much anything at curl and it will work.

~~~
cmg
Anything that requires JavaScript to fetch and display content.

~~~
twerquie
Yeah, building web apps this way is a fad. I predict everyone will go back to
full page reloads and server-rendered content.

/s

~~~
acdha
On a serious note, “progressive enhancement – still the right way”. I don't
think client-side rendering is going away but everyone who's relied on it
exclusively has learned the hard way that it's just too unreliable and slow to
have a failure mode which is an empty page unless a lot of complex code works
perfectly.

Just think about how many engineer-hours Twitter flushed with that silly #!
kludge – and then when they switched back, saw an 80% improvement in page load
time.

~~~
ceejayoz
> Just think about how many engineer-hours Twitter flushed with that silly #!
> kludge – and then when they switched back, saw an 80% improvement in page
> load time.

No hours were wasted, and they didn't really switch back. They're just using
HTML5's History API on browsers that support it now. Essentially the same
mechanism under the hood, just prettier URLs for it.

~~~
acdha
They did more than just switch to the history API. During that period, if
anything went wrong, you saw a blank page and, of course, robots saw only the
generic launcher HTML instead of any content.

Now, here's what a tweet looks like without JavaScript enabled:

[https://www.dropbox.com/s/me7kinvje7ly781/Screenshot%202015-...](https://www.dropbox.com/s/me7kinvje7ly781/Screenshot%202015-03-20%2019.42.30.png?dl=0)

Here's what it looks like with JavaScript enabled:

[https://www.dropbox.com/s/04pjdlkuht6t2ja/Screenshot%202015-...](https://www.dropbox.com/s/04pjdlkuht6t2ja/Screenshot%202015-03-20%2019.42.38.png?dl=0)

(The main difference would be that things like the search & menus are either
interactive controls or simple links to basic HTML forms, depending on whether
JavaScript loads)

During the hashbang era you couldn't use a page at all unless the full
client-side render completed. Now, however, all of the content is available
with fairly rich markup:

[https://redbot.org/?uri=https%3A%2F%2Ftwitter.com%2Facdha%2F...](https://redbot.org/?uri=https%3A%2F%2Ftwitter.com%2Facdha%2Fstatus%2F578935560187826176)

------
escherize
I love curl so much. I just learned that you can 'copy to curl command' from
the chrome inspector's network panel by right clicking on any request!!

I want to make a library that reads the curl command (and maybe request
syntax?) and outputs a function that will do that command.
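For reference, the command Chrome copies looks something like this (the URL
and header values below are made-up placeholders, not a real request):

```shell
# A typical "Copy as cURL" command from Chrome's network panel
# (hypothetical request; example.com and the header values are placeholders)
curl 'https://example.com/api/items?page=1' \
  -H 'Accept: application/json' \
  -H 'User-Agent: Mozilla/5.0' \
  --compressed
```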

~~~
artpar
[https://shibukawa.github.io/curl_as_dsl/](https://shibukawa.github.io/curl_as_dsl/)

~~~
DenisM
So many languages supported, but no plain C?

~~~
tlrobinson
As someone else mentioned, this is actually built into curl via the
"--libcurl file.c" option.
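For anyone curious, that option makes curl write out a standalone C source
file reproducing the transfer; a quick sketch (example.com is a placeholder):

```shell
# Emit C source that performs the same transfer using libcurl
# (example.com is just a placeholder URL)
curl --libcurl fetch.c https://example.com/

# fetch.c is now a ready-to-compile program built around
# curl_easy_init() / curl_easy_setopt() / curl_easy_perform()
```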

------
pascalo
Can I just say thank you for all those hours of hard work the maintainers have
put in over the years?

~~~
ceejayoz
Yes, you can! Here's the link:

[http://curl.haxx.se/donation.html](http://curl.haxx.se/donation.html)

~~~
pascalo
Good call!

------
teamhappy

        alias wget='echo "How dare you." && curl -O'
        brew rm wget
    

Happy birthday.

~~~
awalGarg
Also, we have aria2 already :P

------
wging
I'm surprised it's so new. And wget is only a year older... what did people
use before then?

~~~
barsonme
How long have GET/POST commands been around? Probably not as long, but just
curious.

~~~
castell
The first documented version of HTTP, HTTP/0.9 (1991), had only GET:
[http://www.w3.org/pub/WWW/Protocols/HTTP/AsImplemented.html](http://www.w3.org/pub/WWW/Protocols/HTTP/AsImplemented.html)

Basic HTTP as defined in 1992 had GET, PUT, HEAD, POST, LINK, TEXTSEARCH,
CHECKIN, etc.:
[http://www.w3.org/Protocols/HTTP/Methods.html](http://www.w3.org/Protocols/HTTP/Methods.html)

More generic info:
[http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol](http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol)

------
josephpmay
It's slightly weird to me that I am older than Curl and Wget. They always
seemed like Unix Monoliths to me; I had just assumed they had always existed.

~~~
gobengo
THIS. I have the exact same feelings. Same with the 'Kubuntu is 10 years old'
link on the homepage. I think we just grew up at the perfect time!

------
tlrobinson
Curl is great, but I also recently came across HTTPie
([https://github.com/jakubroztocil/httpie](https://github.com/jakubroztocil/httpie))
which has some nice features for playing around with HTTP APIs (JSON
formatting, syntax highlighting, etc)

------
lloydde
I abuse curl most weeks. How many more web apps would fail at a header-only
request, if not for prodding from curl users?
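(For the curious: a header-only request is just curl's -I flag, which sends
HEAD instead of GET; example.com below is a placeholder.)

```shell
# Send a HEAD request and print only the response headers
curl -I https://example.com/

# -i instead performs a normal GET but includes the headers in the output
```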

------
noselasd
Daniel Stenberg, the maintainer, was also on the Packet Pushers podcast
recently, talking about HTTP/2:
[http://packetpushers.net/show-224-http2-its-the-biggest-netw...](http://packetpushers.net/show-224-http2-its-the-biggest-network-thing-happening-on-the-internet-today-repost/)

------
jkoudys
I was wgetting all my http requests until about 2 years ago. Curl's undeniable
coolness won me over after 15 years. Now I practically live in curl when I'm
setting up webservices, and libcurl for PHP does something on nearly every
page request I have.

For PHP people, curl_multi_exec is the new event loop.

------
kyberias
> Rough estimates say we may have a billion users already.

This cannot be true by a long shot. Or am I missing something?

~~~
anonfunction
The most common language used on websites is PHP which uses libcurl to handle
HTTP requests.

~~~
kyberias
PHP applications may use libcurl when making outbound HTTP requests, right?
That is only a small subset of these web sites. Most web sites just process
incoming HTTP requests normally.

This does not seem to explain the 1 billion users figure.

~~~
anonfunction
WordPress uses libcurl, and according to a Google search there are 74,652,825
sites using WordPress. How many unique visitors do you think these sites get?
I would guess over a billion.

~~~
kyberias
Oh come on, you just don't get it, do you?

Just because WordPress, which is a blog platform, uses libcurl for something
you didn't even state (probably some outbound HTTP requests), doesn't mean it
uses libcurl to process most of the incoming requests from those millions of
users.

We don't say Solitaire is the most successful game ever just because it's
installed with every copy of Windows.

------
Pephers
curl is just awesome, thanks so much to the author and all maintainers over
the years! It's still my go to application for testing and debugging HTTP
requests.

------
cdnsteve
Happy bday!

------
awalGarg
cUrl, I am as old as you <3

------
IshKebab
Curl is pretty great to use, but be warned it has old smelly code and is
probably full of security issues. I wouldn't use it for anything too critical.

~~~
charonn0
That's sufficiently vague to be useless. Would you care to elaborate?

------
nailer
And still better than wget, which only does HTTP/1.0 and thus has problems
due to lacking a 'Host' header. Curl just works.

~~~
leni536
> And still better than wget, which only does HTTP/1.0 and thus has problems
> due to lacking a 'Host' header.

Well I had to check, but this is not true at all:

    
    
       $ nc -l -p 9999
       GET / HTTP/1.1
       User-Agent: Wget/1.15 (linux-gnu)
       Accept: */*
       Host: localhost:9999
       Connection: Keep-Alive
    

Also wget can recursively mirror webpages, and there are nice options to
carefully select the content you want to download. It's quite dated though; I
wish it could use an external downloader (like aria2) and only do the walking
and link-converting part itself.
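The mirroring mentioned above looks roughly like this (flags from wget's
manual; example.com is a placeholder):

```shell
# Recursively mirror a site, fetching page assets and rewriting links
# so the local copy is browsable offline (example.com is a placeholder)
wget --mirror --convert-links --page-requisites --no-parent \
     https://example.com/docs/

# --accept / --reject (e.g. --accept '*.pdf') narrow what gets downloaded
```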

~~~
baldfat
I love aria2c

I use it all the time, even as the downloader for Arch Linux's pacman
(package manager).

~~~
heyalexej
It's also used internally in apt-fast[1] and speeds up the process by an order
of magnitude for me.

[1] [https://github.com/ilikenwf/apt-fast](https://github.com/ilikenwf/apt-fast)

