
Using Unix as an IDE (2012) - nz
https://sanctum.geek.nz/arabesque/series/unix-as-ide/
======
rbc
It's not for everyone, but the Unix/BSD/Linux shell is still my favorite
command invocation environment. Despite their simplicity, pipelines are a
really handy way to pass data between programs when solving certain kinds of
problems. They also provide a cheap way to get some parallel processing.

~~~
zrobotics
The other thing I absolutely love about pipes is the ease of writing custom
utilities. I'd refactored a filetype for use with a game last week, and needed
to insert several lines into ~1.5k ASCII files. Half an hour to write a quick
C++ program to make the changes, and pass the output of grep to it.
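That grep-to-tool pattern is easy to sketch in the shell. Here `fixup` stands in for the hypothetical C++ program and `OLD_MARKER` for whatever grep was matching, so this is just an illustration of the shape of the workflow:

```shell
# find every file containing the marker, then rewrite each one
# in place with the (hypothetical) fix-up tool
grep -rl 'OLD_MARKER' assets/ | while read -r f; do
    fixup "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```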

Although I'm unfamiliar with using pipes to provide parallel processing, could
you elaborate? If you can pass to another thread I will be kicking myself for
missing that.

~~~
flukus
> Although I'm unfamiliar with using pipes to provide parallel processing,
> could you elaborate? If you can pass to another thread I will be kicking
> myself for missing that.

You didn't miss it: each stage in the pipeline runs in its own process, so
you're already using it. With, say, "grep -r foo ./ | do-something |
do-something-else", the second and third stages will be operating on the data
from grep while grep is still running.
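A quick way to convince yourself of this: `yes` on its own never terminates, yet the pipeline below finishes immediately, because `head` runs concurrently, prints five lines, exits, and the resulting SIGPIPE stops `yes`:

```shell
# yes would loop forever by itself; head stops the whole
# pipeline after five lines because both run at once
yes foo | head -n 5
```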

In addition to that, you can get more parallelism pretty simply with xargs
([http://offbytwo.com/2011/06/26/things-you-didnt-know-about-xargs.html](http://offbytwo.com/2011/06/26/things-you-didnt-know-about-xargs.html)).
If that's not enough, then check out GNU parallel
([https://www.gnu.org/software/parallel/](https://www.gnu.org/software/parallel/)).
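For example, `xargs -P` fans work out across several processes at once. Here up to four `gzip` processes run in parallel; the `.log` file names are just illustrative:

```shell
# compress every .log file, running up to four gzip
# processes at a time; -0 handles names with spaces
printf '%s\0' *.log | xargs -0 -n1 -P4 gzip
```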

While we're talking about Unix as an IDE, I want to put in a plug for
inotifywait
([https://linux.die.net/man/1/inotifywait](https://linux.die.net/man/1/inotifywait)).
Combining this with a makefile, you can get some very fast builds with near-
instant feedback every time you save a file. I have one that executes unit
tests and valgrind on every save for instant memory leak detection. Another
great use case is getting WYSIWYG-like features with LaTeX files.
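A minimal version of such a watch loop might look like this, assuming the inotify-tools package is installed, sources live under src/, and the makefile has a test target:

```shell
#!/bin/sh
# block until a file under src/ is written, then rebuild and
# run the tests; repeat forever
while inotifywait -q -r -e close_write src/; do
    make test
done
```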

~~~
harry8
Using make -j $num is another way to get parallelism that is sometimes
convenient.

~~~
flukus
Just make sure $num is set; otherwise make tries to do everything it can at
once. I found out the hard way compiling gcc last week: the system was
unresponsive, so I reset it, but I'm curious whether it would have finished
eventually.
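One way to guard against that is a fallback in the expansion itself, so an unset $num degrades to one job per CPU rather than unlimited jobs (nproc is GNU coreutils; on BSD/macOS, sysctl -n hw.ncpu plays the same role):

```shell
# if $num is unset or empty, fall back to the CPU count
make -j"${num:-$(nproc)}"
```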

~~~
btschaegg
This is a pet peeve I have had for a while now. I still have a definition for
MAKEFLAGS (with -j [# of CPUs - 1]) in my profile and rc files, and had to
learn that if you allow make to start too many things at once, the _whole_
machine hangs. Even switching to a different TTY took ages.

To me, it sounds like a tremendous scheduling issue if a build job can drown a
whole system like that. Shouldn't the scheduler favor non-waiting
threads/processes that haven't done much over those that have hogged the CPU
in the past, without me having to manually adjust the niceness of every build
task?

~~~
__blockcipher__
I can’t offer specifics, but at that point you’d probably have to compile a
different scheduler?

------
smadge
Maybe this is implicitly assumed, but a terminal multiplexer such as tmux is
also a useful utility for using unix as an IDE. For example, you can have a
pane for your code in an editor like vim, and a pane for running your tests
next to it, and maybe another pane for an interactive shell of your language.
Lacking a consistent local development environment, tmux is indispensable when
I ssh into my remote development environment.

~~~
majewsky
These days, you can also run terminals as panels in vim, so vim is sort of its
own terminal multiplexer now.

    
    
      :help terminal.txt

~~~
skriticos2
I tend to just write a command in a vim buffer (with all that nice Vim magic),
select it and run it with :@"

e.g. I write and yank: r!fancy command arg1 arg2 ..

Then go to the next line and do :@", which will run the command (well, execute
the colon-mode command) and read back the results.

edit: The nice thing about this is that I can do this as a progressing
document (command, output, command, output) and scroll back to see what I have
done and what the result was.

------
partycoder
If you're using LLVM, you may want to debug with lldb instead. Also, with gdb
you can use the TUI mode.

To read the environment variables used by a program you can use

    
    
        cat /proc/{pid}/environ
    

update: Some of this overlaps with reverse engineering. I posted this comment
yesterday about useful reverse engineering resources on Linux
[https://news.ycombinator.com/item?id=17342197](https://news.ycombinator.com/item?id=17342197)

~~~
glandium
Note that `/proc/$pid/environ` has null bytes between the entries, so it's not
very readable if you just `cat` it. So you want to filter its content, e.g.:

    
    
        tr "\0" "\n" < /proc/$pid/environ

------
twblalock
I like Acme because it is built to facilitate using shell commands from within
the editor. You can write scripts in any language you like and run them, which
in some ways makes it the ultimate scriptable editor.

However, an IDE like IntelliJ brings so much to the table that it's hard to
imagine working without it. I know I'm much more productive using it than
Emacs or Acme or Vi or any other editor that integrates well with Unix.

~~~
tambourine_man
One thing that annoyed me the last time I tried it (the WebStorm variant, if
I’m not mistaken) is that you can’t use it as an editor only.

You have to create a project. For everything.

And of course, multi-second launch time is unacceptable for an editor in my
book. It’s just too frustrating.

But yeah, autocompletion is amazing. It even understands image dimensions.

And it understands PHP, JavaScript, HTML and CSS, all mixed together in the
same file, respecting indentation and syntax highlighting like nothing I’ve
ever seen.

~~~
pfranz
I can't find the source, but I remember hearing a quote similar to, "What
separates a text editor from an IDE is that you have to create a project to
save a file."

I feel the same way as you. I spent a good amount of time with PyCharm,
WebStorm, and RubyMine.

------
haolez
Highly recommend the inotify tools. They make things like auto-reloading very
easy to set up.

~~~
DarkCrusader2
Though it looks unmaintained, I use watchdog[1] for automatically running
build and test commands on file change.

[1]
[https://github.com/gorakhargosh/watchdog](https://github.com/gorakhargosh/watchdog)

~~~
lloeki
Maybe look into the very unix-y fswatch, which is cross-platform too.

[https://github.com/emcrisostomo/fswatch](https://github.com/emcrisostomo/fswatch)

------
Teckla
Midnight Commander (mc) is also a great file management tool.

~~~
kakwa_
ranger (or rangerfm) is also quite nice.

It provides a three-column view with either:

* parent directory | current directory | child directory

* parent directory | current file | preview of the file (as text, if possible)

I find it quite nice to use because it gives you a good picture of where you
are and what your neighboring files are, which can be helpful when navigating
a code base.

------
pvg
Previouslies:
[https://hn.algolia.com/?query=unix%20as%20ide&sort=byPopular...](https://hn.algolia.com/?query=unix%20as%20ide&sort=byPopularity&prefix=false&page=0&dateRange=all&type=story)

------
Rolpa
But isn't an OS, in and of itself, the ultimate IDE? I can't say I like the
idea of writing software without any system calls at my disposal.

~~~
pjmlp
That is the case for C, because UNIX is its runtime; hence what was left out
of ISO C ended up in POSIX.

Any programming language with a richer runtime can be OS-agnostic to a
certain point. The runtime is the OS.

------
cryptonector
tmux + cscope + a $CSCOPE_EDITOR script that starts $EDITOR in a new tmux
window == awesome.

EDIT: Also, add nested tmux sessions to make it awesome++.

------
jlebrech
Is it possible to write man pages for tasks rather than commands?

So you could have: man search, man restart, man compile, etc.

~~~
majewsky
Manpage names are not tied to command names at all. Sure, sections 1 and 8 are
conventionally sorted by command name, but the other sections have different
purposes. Sections 2 and 3 document system calls and library calls, one
function at a time:

    
    
      man 2 fork
      man 3 fread
    

Section 5 documents file formats and conventions:

    
    
      man 5 passwd
      man 5 rsyncd.conf
    

And section 7 is described by `man man` as "miscellaneous". There is some nice
high-level overview documentation hidden in there, e.g.:

    
    
      man 7 signal
      man 7 socket
    

My all-time favorite from section 7 is `man ascii`. Very handy whenever
you're pushing bits.

NOTE: If these commands fail for you, check whether you have the "man-pages"
package installed (it may have a different name in your distro).

------
NVRM
ls -ltrapR

