
Unix command line queue utility - evandrix
https://github.com/chneukirchen/nq
======
justin_vanw

       $ mkfifo ~/.jobqueue
       $ parallel :::: ~/.jobqueue
    

now, from another terminal:

    
    
       echo ls > ~/.jobqueue
    

Tada!

for a remote job queue, use nc instead of a fifo

for a job queue with a buffer, use a file and run $ parallel :::: <(tail -f
thefile). You can even truncate the file from time to time.

Of course, this doesn't capture the context you are in, such as your current
directory or the value of shell variables.
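For anyone without GNU parallel installed, the fifo consumer above can be sketched in plain shell (the path is hypothetical; the read-write open on fd 3 is what keeps the fifo from hitting EOF every time a writer disconnects):

```shell
#!/bin/sh
# Plain-shell stand-in for `parallel :::: ~/.jobqueue`.
# /tmp/jobqueue is a hypothetical path; adjust to taste.
q=/tmp/jobqueue
mkfifo "$q"

# Opening the fifo read-write keeps fd 3 alive between writers,
# so the loop doesn't exit on EOF after each enqueued command.
exec 3<> "$q"
while read -r cmd <&3; do
    eval "$cmd"        # run each queued line, one at a time
done &

# From another terminal:
#   echo ls > /tmp/jobqueue
```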

~~~
giis
What's 'parallel'? Is it a *nix command? It doesn't seem to be available on my
Fedora box.

~~~
teraflop
Yes, it's a GNU tool. Try "yum install parallel".

[http://www.gnu.org/software/parallel/](http://www.gnu.org/software/parallel/)

------
kovek
This stackoverflow answer[0] shows how you can queue jobs even after you start
running other processes in your queue. Essentially, you can do:

    
    
      sleep 2; echo "A" && sleep 2 ; echo "B" && sleep 2 ; echo "C" 
      # Quickly Ctrl-Z
      fg && sleep 2 ; echo "D"
    

and you will see A\nB\nC\nD\n slowly printed out in your terminal.

[0]: [http://superuser.com/a/345455](http://superuser.com/a/345455)

------
CyberShadow
Could you please add some usage examples or elaborate on when this tool would
be useful? For example,

> building several targets of a Makefile

As opposed to `make && sudo make install` ?

> downloading multiple files one at a time

As opposed to `(wget url1 &) ; (wget url2 &)` ?

> or simply as a glorified `nohup`

As opposed to `nohup` ?

~~~
cozzyd
I imagine it's useful if you don't know all the commands you want to queue at
the beginning.

For example:

    
    
        nq ./my_awesome_analysis /data/complicated_file_with_long_name.dat
    
        # um... what's the next file? 
    
        ls /data 
        nq ./my_awesome_analysis /data/even_more_complicated_path.dat
    
        man rsync
        nq rsync -avh --progress out* somewhere_else.net:

~~~
jessaustin
You don't need nq for that. Just kick off ./my_lame_analysis in the background
whenever you want.

~~~
cozzyd
Yes, but then they will execute at the same time instead of sequentially.

~~~
shanemhansen
How about:

    
    
        ./my_awesome_analysis /data/complicated_file_with_long_name.dat &
        # do some stuff
        wait
        ./my_awesome_analysis /data/even_more_complicated_path.dat

~~~
e40
It's still not equivalent, and it's more cumbersome.

------
jschwartzi
Which systems guarantee monotonic behavior for gettimeofday()? I can't think
of any. I hope there's no behavior in nq which relies on monotonicity other
than file naming, because otherwise you might run into some obscure issues.

------
pwg
How does this differ from task spooler:

[http://vicerveza.homeunix.net/~viric/soft/ts/](http://vicerveza.homeunix.net/~viric/soft/ts/)

------
dsl
    cat list.txt | xargs -n 1 -P 20 wget

Queues up the downloads, running up to 20 concurrently with wget, for example.

~~~
icebraining
You could also use a for loop. But as others pointed out, that doesn't allow
you to add items to the queue after it has started.

~~~
stonogo
It does if you use a fifo instead of a text file. This is the difference
between basic Unix literacy and actual Unix proficiency. I'm not being
insulting; very few people take the time to learn the tools Unix provides. And
naturally, a developer is going to be more interested in writing tools than
most.

~~~
icebraining
I don't think you can use a fifo with xargs, though it'd definitely work with
a loop. Even with plain text files, using tail -f.
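The tail -f loop mentioned here can be sketched like this (the file name is hypothetical); because tail -f follows appends, items can be enqueued after the consumer has started:

```shell
#!/bin/sh
# Buffered queue on a plain file (hypothetical name "thefile").
: > thefile                       # create/empty the queue file

# Consumer: tail -f follows appends; the loop runs lines sequentially.
tail -f thefile | while read -r cmd; do
    eval "$cmd"
done &

# Enqueue from any shell, even after the consumer has started:
echo 'echo job done' >> thefile
```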

