

How `curl -L cueup.com/jobs` works - danicgross
http://tech.cueup.com/blog/2013/03/06/how-our-terminal-friendly-jobs-page-works/

======
akavi
I dumped it into a file out of habit, which sort of ruined the effect. I did
wonder why a ~300 line textfile took so long to DL, though.

~~~
danicgross
Any hacks you can think of to detect this from the server?

~~~
bdarnell
Start out by sending dummy data as fast as you can. Once you've sent a few TB
you can be relatively confident that it's going to a terminal instead of a
file (which would have filled up the disk by now). Then you can start slowly
sending the real data.

------
raverbashing
I'm afraid my system does not yet have support for the HTTP protocol

Apparently this server does not have any service listening on port 23, so
unfortunately I can't telnet to it. (I'm not sure what this /jobs is either;
is it a Gopher page? Oh well...)

~~~
laumars
Port 22 is SSH. Telnetting to it wouldn't do much good aside from telling you
which sshd was listening.

I'm also a little confused by what you mean when you say your system doesn't
support the "HTTP protocol"; _curl_ has been a staple on Unix and Linux for
over a decade. Or are you trying to run _curl_ from cmd.exe on Windows?

~~~
raverbashing
I meant port 23, sorry

And maybe you're too young to understand the joke ;)

~~~
laumars
Possibly. I know good jokes are ruined if they have to be explained, but would
you mind explaining on this occasion? :)

~~~
raverbashing
Well, here goes:

Once upon a time (around 1995, the first time I tried this 'internet' thing)
the internet already existed, but this newfangled HTTP protocol was not very
popular (it had existed since 1991).

You see, curl was created in 1997, Wget in 1996.

The way to access systems was mostly through telnet. Some more modern clients,
like lynx, also supported Gopher <http://en.wikipedia.org/wiki/Gopher_(protocol)>

(Yes, lynx got http support at a later time)

So their choice of a page readable from a terminal ends up being kind of an
anachronism =)

~~~
laumars
Oh, I know all that (I was building websites in the mid 90s, so I'm old enough
to remember a life before the WWW).

I assumed you were referencing an old meme or a famous quote from some notable
UNIX greybeard. I hadn't realised you were just joking about being old (that
wasn't clear from your original comment).

Sorry mate, I just misunderstood the context of your joke. (my epic lack of
sleep probably didn't help)

~~~
raverbashing
Ah it happens, don't worry =)

------
danicgross
OP here. I'm curious to learn of any other creative uses of ANSI and curl.
Does anyone know of websites that use it as an easter egg?
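For anyone who wants to play with this themselves: the colors are just
standard ANSI SGR escape sequences, nothing cueup-specific. A minimal sketch
(the `colorize` helper is made up for illustration):

```python
# ANSI SGR escape sequences take the form ESC [ <codes> m
RESET = "\x1b[0m"
BOLD = "\x1b[1m"
GREEN = "\x1b[32m"

def colorize(text, *codes):
    """Wrap text in the given ANSI codes and reset formatting afterwards."""
    return "".join(codes) + text + RESET

print(colorize("We're hiring!", BOLD, GREEN))
```

Pipe that through `curl` to a terminal and it renders in bold green; redirect
it to a file and you just see the raw escape bytes.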

~~~
songgao
One thing that I often use is

    curl ifconfig.me
But this is so damn cool!

------
cheeze
Maybe it's just me (using PuTTY), but it is next to impossible to read while
it scrolls up my screen. Once it's done scrolling I can't see most of the
listings because by default my terminal doesn't save that many lines of
history.

Awesome idea though

~~~
pjscott
I'm pretty sure there's a setting in PuTTY to increase the size of the
scrollback buffer. Memory is cheap these days.

------
jkbr
With HTTPie (master):

    http --pretty=none https://www.cueup.com/jobs User-Agent:curl

(`--pretty=none` is needed because the server incorrectly sends `Content-Type:
text/html`)
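This also shows why spoofing `User-Agent:curl` works: the server branches on
the User-Agent string, not on anything about the connection. A rough sketch of
that branching (function and variable names are hypothetical, not from the
cueup.com source):

```python
def body_for(user_agent, html_body, ansi_body):
    """Return the terminal-friendly ANSI body for curl-like clients,
    and the normal HTML page for everything else."""
    if "curl" in (user_agent or "").lower():
        return ansi_body
    return html_body

# A curl client gets the ANSI text; a browser gets HTML.
print(body_for("curl/7.29.0", "<html>...</html>", "\x1b[32mJobs\x1b[0m"))
```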

------
ysangkok
Works with Wget too:

    wget -q --user-agent="curl" -O - cueup.com/jobs

------
spitfire
Reminds me of an old dialup BBS. Can I post a comment on the wall?

------
philfreo
Where are these colorMarkdown and htmlToMarkdown functions?

------
iso-8859-1
Sadly, there seems to be no MIME type for ANSI art. Of course, this should
depend on the "Accept" header and not the User-Agent, but that seems infeasible.
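If such a media type did exist, Accept-based negotiation might look roughly
like this (the `text/x-ansi` type is made up, as the comment notes; there is
no registered MIME type for ANSI art):

```python
def negotiate(accept_header, html_body, ansi_body):
    """Rough Accept-header negotiation: serve the ANSI body only when the
    client explicitly lists the (hypothetical) ANSI media type."""
    accepted = [part.split(";")[0].strip()
                for part in (accept_header or "").split(",")]
    if "text/x-ansi" in accepted:
        return "text/x-ansi", ansi_body
    return "text/html", html_body

print(negotiate("text/x-ansi, text/html;q=0.9", "<html/>", "\x1b[1mJobs\x1b[0m"))
```

In practice no terminal HTTP client sends such a header, which is why
User-Agent sniffing wins here.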

------
patrickod
Is anyone else unable to see this due to tech.cueup.com not resolving?

~~~
pjscott
You know the worst kind of DNS problem? The kind that only affects some
people. Yeah, I can't resolve the domain name either. _Ugh._ For now, the
Coral Cache hack works:

[http://tech.cueup.com.nyud.net/blog/2013/03/06/how-our-
termi...](http://tech.cueup.com.nyud.net/blog/2013/03/06/how-our-terminal-
friendly-jobs-page-works/)

EDIT: Looks fixed now.

------
redmattred
Your /jobs link in your site's footer is 404ing.

------
hamburglar
This is a cool idea, but anyone who needs "how it works" explained is really
not the target audience.

------
recroad
Honestly, fuck anyone who does this for a job posting.

~~~
arjie
Why? There's a normal (nice looking) page for other user agents. You haven't
lost anything.

