
HTTPie: A cURL-like tool for humans - jkbr
https://github.com/jkbr/httpie/
======
javajosh
This is a nice curl and wget replacement that handles a bunch of modern use-
cases without a lot of hard-to-remember command-line flags.

That said, there is a broader problem of "hard-to-remember command-line flags"
which I have personally solved using snippet management (I use notational
velocity or command history, whichever is handiest).
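
For the curious, the snippet approach can be as simple as a grep over a text
file of one-line recipes. A sketch (the helper name, file location, and the
seeded recipe are all my own invention):

```shell
#!/bin/bash
# snip: look up a saved command-line recipe by keyword.
# The snippet file is just plain text, one recipe per line.
SNIPPETS="${SNIPPETS:-$HOME/.snippets}"

snip() {
    # case-insensitive keyword search over the snippet file
    grep -i -- "$1" "$SNIPPETS"
}

# seed one entry so the example is self-contained
echo 'json-post: curl -X POST -H "Content-Type: application/json" -d @body.json $URL' >> "$SNIPPETS"

snip json-post
```

Notational Velocity does the same job, just with a nicer search box.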

There is no doubt httpie's interface is a lot better, but it creates another
problem (again, which is somewhat universal) of installing, learning and
remembering to use a new tool. This is a non-trivial problem that is a key
concern for anyone evaluating a new tool, and it's a problem that only really
gets solved with ubiquity.

Finally, an observation that so many of our "traditional" command line tools
pay no attention to usability because, at least back in the day, the problem
they solved was hard. People had a choice: either put up with an (admittedly)
bad interface or write their own version in C. The individual cost of learning
a bad interface outweighed the cost of rewriting the tool, and so standard
tools were born.

And now, decades later, new generations are stuck having to learn needlessly
obtuse interfaces to standard tools. We have a situation where newcomers pay
the cost of developer UI laziness _in perpetuity_. This is, of course, a
terrible outcome and it's projects like this one that are trying to change it.

So I applaud the effort and hope it catches on and becomes ubiquitous, so I
can take the curl and wget snippets out of NV.

~~~
chernevik
DELETED

This was a warning of a hazard to navigation, when a more diligent effort to
remove the hazard is called for.

~~~
javajosh
_> on at least some occasions the Python client gets non-standard results,
while curl and (Guido, forgive me) PHP do fine_

Ok, young one, here's the thing. If you see a problem like this then file a
bug. Ideally write a test case. And if you're an overachiever, dig into the
code and fix it. Any bug in an http client that reposts data is incredibly
serious, and needs to be fixed.

The other value in doing this is that you don't spread Fear, Uncertainty, and
Doubt - or FUD, as it is often called. FUD is usually ascribed to big
companies trying to discourage the use of a competitor's product, but it can
also be spread inadvertently by the ignorant or misinformed. No offense, but I
think that's what happened here, because Python is not a niche language - it
(and its HTTP libraries) is used by a lot of people, and the failure mode you
describe is very, very serious.

~~~
chernevik
No offense taken.

I did report this to the API maintainers, who couldn't make heads or tails of
it. I didn't file a Python bug because I honestly don't think Python is the
problem.

But you are right, it is more responsible to pursue this until fixed rather
than raise warnings.

~~~
Ralith
Do you have a link to the report? I'd be interested to read the discussion.

~~~
chernevik
No. I will be following up on this, if you send me an email (in my profile)
I'll try to let you know when an analysis is available.

~~~
tripzilch
FYI, the `email` field in your profile is not publicly visible. If you want
other people to be able to email you, you need to put it in your `about` text.

------
fromhet
Everything on HN is just "hey, learn to use this complicated complex thing -
in just 30 minutes! You'll be so productive with all your creative startups!"

"Learn VIM essentials in this blog post!" "Never bother reading 'man curl'!"
"Learn the basics of C in three easy steps!"

Sometimes HN feels like a lifestyle magazine for people who dream of being PG.

EDIT: Don't get me wrong, I too dream of having the same success as PG. Why
else would I be writing here?

~~~
dschobel
Also not sure what your complaint is. Working at the right level of
abstraction is fundamental to being a good engineer.

~~~
fromhet
There is no real criticism; I agree with you. I was just feeling uneasy and
vented it. I welcome downvotes to my GP comment.

------
slurgfest
"for humans" seems to have no better meaning than "for the OS X sensibility".

In other words: this is a style change rather than a productivity gain. And
the superiority of the style is not obvious - unless you just HATE the style
of existing tools and need to be set apart.

Most humans don't operate the command line or write scripts to begin with.
Those who do can usually handle wget <http://foo/bar>. It took me all of a
few seconds to start using wget and all of 10 minutes to gain access to the
fancier features. (But the truth is that a certain level of complexity really
just wants a script rather than ad hoc commands.)

So here is a new tool, and it looks nice. But it doesn't at all relieve me
from having to learn syntax and conventions - I still have to go to a
doc/manpage and read that same kind of technical prose. So the only effective
difference is that now I am using different punctuation, like @filename and
-b. But the use of this "@" character is not really consistent with anything
else.

So the tool is fine and I am sure people will use it but the competitive
advantage is incredibly thin and the project smacks of NIH.

If curl and wget are not for humans then what are they for? People who do not
have that magical design sensibility. Lame code-monkeys without vision, who
are not creative and different. Soulless agents of the man.

This emphasis on branding over substance irks me quite a bit.

~~~
javajosh
Gosh, what a horrible comment! The simple fact is that curl's and wget's
interfaces are bad, and _everyone_ who learns to use them spends that 10
minutes. And then, if you're like me, you re-spend those 10 minutes whenever
you need to do something fancy again. If a new tool saves 5 minutes (which
this one does), that's 5 minutes saved on every use - probably a total of a
few hours for me personally, and, spread over the population of future users,
a few lifetimes.

Rather than admit that even incremental usability improvements are not only
useful, but continue to pay dividends long after the tool is produced, you
lambast the author and the effort.

Not cool.

P.S. You should watch Bret Victor again, talking about how much easier it is
to crush an idea than to support it and nurture it.
<http://vimeo.com/36579366>

~~~
benatkin
Your comment is an ad hominem attack where you falsely accuse someone of
making an ad hominem attack.

~~~
javajosh
You know, I was about to defend myself, but you may be right. My tone of
outrage at slurgfest being _wrong_ was itself rather wrong. If he didn't see
the long-term incremental improvements this tool offers, that's nothing to be
outraged over. People make mistakes.

That said, I think it's a stretch to call this mistake of mine an _ad hominem
attack_. It doesn't fit my normal understanding of such an attack - there was
no name calling, etc. And certainly I didn't accuse _him_ of an ad hominem.

~~~
benatkin
Thanks for pointing that out. I made slurgfest's comment a proxy for slurgfest
when I interpreted your comment, and that is indeed a stretch. I agree that
neither of you were all that harsh.

------
jcromartie
If by "for humans" you mean "for programmers who mostly use JSON". I have to
say I'm not sending JSON with cURL too often, compared to any other payload.
And when I do send JSON it's more complex than a flat set of keys and values.

~~~
jkbr
There is more to it than that. It provides an expressive syntax for
constructing all sorts of requests. You can submit forms, upload files and set
headers without having to use flags. You also get syntax highlighting, the
ability to pipe in request data, etc.

If you are sending complex JSON, it's probably stored in a file or it's the
output of another program:

    http PUT httpbin.org/put @/data/test.json

    http -b localhost:8888/couchdb/ | http PUT httpbin.org/put

~~~
zeroonetwothree
I don't really think curl is much harder, e.g.:

    curl -X PUT -d @data/test.json httpbin.org/put

It also has the advantage of supporting all the options you might ever need;
HTTP authentication and proxies, for example, are often useful.

~~~
jkbr
HTTPie also supports proxies and HTTP auth (see --proxy and -a/--auth).
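
For reference, the two combine along these lines (the proxy address and
credentials below are made up):

```shell
# HTTPie: basic auth via -a user:pass, proxying via --proxy PROTOCOL:PROXY_URL
cmd='http --proxy=http:http://10.10.1.10:3128 -a user:secret example.org/api'
echo "$cmd"
```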

~~~
shazow
Doesn't support SOCKS 4/5 proxies. :)

Requests and urllib3 have a long way to go to be complete competitors to
cURL.

P.S. Good job nonetheless. Seems like a good idea to make a specialized HTTP
CLI client for JSON/RESTful services.

------
mjs
The UI for curl is awful (--request to change the method??) and wget's is only
slightly better, but they do have the advantage of ubiquity, and it's often
useful to email/Skype complete curl or wget command lines about the place to
explain how to use an API, or demonstrate problems.

(e.g. Stripe and others document their API in terms of curl commands:
<https://stripe.com/docs/api>.)

I do wish the curl UI were better, but I can't see it being trivially replaced.
(It's a similar issue with git: bad UI, but every git question and answer is
described in terms of the CLI, so even if you prefer a GUI client, say, you
still need to be able to formulate your problem in CLI terms for anyone on
stackoverflow to understand you.)

~~~
SoftwareMaven
It is possible for another tool to become ubiquitous (and in this case, it
really would be for the best). In order to do that, though, you need a better
tool, so this is a great start toward making that happen.

~~~
justincormack
Curl after all mostly replaced the earlier wget.

~~~
harbud
Except wget -r, which is still widely used and has no equivalent functionality
in curl.
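
For anyone who hasn't used it, a typical recursive fetch looks something like
this (the URL is made up; flags per wget(1)):

```shell
cmd='wget -r -np -k -p http://example.com/docs/'
# -r   follow links recursively
# -np  never ascend above /docs/
# -k   rewrite links so the local copy browses correctly
# -p   also fetch page requisites (images, CSS)
echo "$cmd"
```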

------
jparise
I'd also recommend Curlish (<http://packages.python.org/curlish/>). It
performs nice JSON highlighting and also handles OAuth 2.0 token
authentication.

It simply wraps curl(1), so all of the familiar arguments and recipes continue
to work just fine, as well.

------
benatkin
It looks great! I see no reason to slight cURL, though. Its CLI was intended
for humans.

~~~
jkbr
It's not meant as an insult to cURL. cURL is a great library/tool and supports
far more than HTTP, but its command-line interface simply isn't as convenient
for everyday HTTP work as it could be. The "for humans" slogan is borrowed
from the underlying python-requests library and is meant to communicate that
good UX is one of the project's top priorities. Glad you like it!

~~~
benatkin
I don't think it's the most obvious kind of good UX - rather, it's good UX for
people who value expressiveness, like Ruby programmers. The items that change
meaning based on symbols are quick to type, but that comes at the cost of
clarity. curl's parameters aren't very clear either, and since it supports
more than just HTTP, its man page is harder to wade through. I think there's
still room for a tool that uses option names/letters, rather than symbols, to
differentiate between headers and body properties.

~~~
jkbr
Thanks for the feedback!

I tried to come up with a syntax that makes the most common tasks (e.g.,
sending JSON objects, submitting forms, setting headers) as easy as possible
and also feel "natural". The reason for the chosen style is that it closely
corresponds to the actual HTTP request being sent. For example, say you want
to send a PATCH request with a custom header and a form field:

    PATCH /patch HTTP/1.1
    X-API-Token: 123
    Host: httpbin.org
    Content-Type: application/x-www-form-urlencoded; charset=utf-8

    foo=bar

You can simply copy the header (X-API-Token: 123) and the data (foo=bar) and
paste them into the terminal:

    http --form PATCH httpbin.org/patch X-API-Token:123 foo=bar

It's not as obvious as '--request PATCH --header X-API-Token:123 --form
foo=bar', but on the other hand, the command includes almost nothing that
doesn't become part of the actual request, which keeps it short and makes it
easy to focus on what's important.

~~~
benatkin
Hmm, that makes more sense. If it isn't already in the README or man page, it
might be a worthy addition.

I suggest you add an option where you can start off with a bare request.

Is there a way to have it construct the query string when you are doing a GET
request? Also is there a way to have it construct a query string when you have
a JSON or form body? Might be something to add below the description of items,
as something that doesn't fit into that list but is related. Perhaps -q page=2
-q rpp=20 would be a good way of saying it.

------
kodeninja
Another alternative, in Ruby, is HTTY - <https://github.com/htty/htty>

~~~
0x5a177
One difference to note is that HTTY puts you in a REPL-like environment
instead of being a one-shot command like curl.

------
pkulak
This is a great idea. Sometimes when something is not working, the last thing
I want to do is pore through the curl man page before I can even get started
figuring it out.

------
akvlad
I was looking for something like this a while back and found a useful Firefox
plugin called Poster
(<https://addons.mozilla.org/en-us/firefox/addon/poster/>). It's useful for
testing a RESTful API without writing any front-end code to issue the
requests - or for anything you'd do with cURL, just simpler.

~~~
freestyler
I wrote a similar one for Chrome:
[https://chrome.google.com/webstore/detail/cdjfedloinmbppobah...](https://chrome.google.com/webstore/detail/cdjfedloinmbppobahmonnjigpmlajcd)

~~~
roller
A coworker recently pointed us to yet another one of these, Dev HTTP Client,
which has worked really well for us.

[https://chrome.google.com/webstore/detail/aejoelaoggembcahag...](https://chrome.google.com/webstore/detail/aejoelaoggembcahagimdiliamlcdmfm)

------
barrkel
My main difficulties in using wget are in organizing output location (things
like -nH, --cut-dirs, -P), choosing between -nc / -c / default (rename), error
/ retry policy (-T / -t), logging (-a vs -o), etc.
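
To illustrate, a single fetch can end up looking like this (URL and numbers
are made up):

```shell
cmd='wget -P downloads -nH --cut-dirs=2 -c -t 3 -T 10 -a fetch.log http://example.com/a/b/file.tar.gz'
# -P downloads   save under ./downloads/
# -nH            omit the host name from the local path
# --cut-dirs=2   drop the leading a/b remote path components
# -c             resume a partial download rather than renaming
# -t 3 -T 10     three retries, ten-second timeout
# -a fetch.log   append to a log file instead of overwriting
echo "$cmd"
```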

This tool doesn't really solve any of my actual problems. YMMV. It's less a
cURL replacement than a web API invocation tool.

------
masto
I like it, though I got around curl's painful syntax by wrapping it in a few
shell scripts. For example, here's my api_post script (meant to be used like
"api_post users/123 first_name=Foo last_name=Bar"); pardon my incompetent
shell scripting and the redaction of company internals:

    #!/bin/bash
    # (bash, not plain sh: the script uses arrays)

    resource="$1"
    shift

    # collect remaining key=value args as curl -F form fields
    declare -a post
    while [ "$1" ]; do
        post=("${post[@]}" "-F" "$1")
        shift
    done

    if [ -z "${API_BASE:=}" ]; then
        API_BASE=http://localhost:3000/api/v1/
        echo "No API_BASE set.  Using $API_BASE."
    fi

    verbose=""
    [ -n "$API_VERBOSE" ] && verbose="-v"

    if [ -z "${API_COOKIES:=}" ]; then
        cookies=~/.api.cookies
    else
        cookies="$API_COOKIES"
    fi

    curl -0 -k -s -S $verbose -b "$cookies" -c "$cookies" -X POST "${post[@]}" "${API_BASE}${resource}"

------
mixmastamyk
I was prepared to not be impressed, but this looks nice to use. Installed.

------
ww520
This is a good tool. I wish there were an editor-integrated interactive HTTP
tool.

OT: Is there an Emacs package that can do interactive invocation of HTTP?
Like having a text buffer holding all the URLs, hitting Ctrl-E on a URL to
invoke it, and displaying the HTTP response headers and body in separate
buffers.

~~~
gvalkov
OT: You just pretty much summed up _restclient.el_ -
<https://github.com/pashky/restclient.el>

~~~
ww520
That's fantastic. Thanks for the find. Emacs once again delivers.

------
ericmoritz
See Resty <https://github.com/micha/resty>

------
ubasu
For website scraping, something like CasperJS/PhantomJS or Selenium is more
suitable, since they emulate a browser and evaluate JavaScript, which curl
cannot do - and it is not clear whether this tool can.

What are the use cases where curl is more suitable than PhantomJS or Selenium?

~~~
exogen
APIs. Lots of HTTP usage is not browser-related.

------
vlucas
If you are doing cURL calls as a part of testing functionality, you may want
to consider using a tool like Frisby.js ( <http://frisbyjs.com/> ) to create a
suite of automated tests that involve HTTP calls.

------
snissn
<http://xkcd.com/927/>

~~~
kstenerud
Not applicable. This is not a new standard; merely a new tool.

~~~
rhomboss
Replace standard with tool. Still applicable.

~~~
kstenerud
So by that rationale, we should have stuck with Mosaic, and Internet Explorer,
Firefox, Chrome, Safari, Opera, and Konqueror are all bad things?

How about the various mail clients, office products, ftp clients, torrent
clients & servers, programming languages, power drills, clothing lines,
gasoline engines, kitchen knives and so on?

Tools are about innovation. Standards are about locking down feature sets so
that tools can interact in a common way. The whole point of the xkcd comic is
that too many standards make it difficult to build tools, and adding yet
another one-standard-to-rule-them-all usually backfires.

------
bmuon
I really like how writing little tools like this in Node is so simple:

    node -e "require('request')('http://www.asdf.com/').pipe(process.stdout)"

------
rafaelferreira
A similar project is htty, which presents an interactive console UI for
making and inspecting HTTP requests.

------
jameswyse
This is fantastic! Thank you!

------
stonnyfrogs
I find the implication that people who know how to use curl are inhuman to be
insulting and arrogant.

