
A week-long programming retreat - bane
https://www.facebook.com/permalink.php?story_fbid=2110408722526967&id=100006735798590
======
qychtkd
Masters of Doom contains an anecdote on this aspect of John Carmack's life. On
page 252, it mentions how he sequestered himself in a "small, anonymous
hotel room somewhere in Florida" as he researched Trinity. He had a Dolch
portable computer with a Pentium II and full-length PCI slots, while subsisting
only on pizza and Diet Coke. That bit for some reason made a big impression on
me when I read it on the bus ride to school. To be able to let yourself go and
research and code what you truly believe in, or are curious or excited about
(with room service, and without having to clean up after yourself, haha), seemed
incredible. I wonder if John still sticks to Florida, or if he goes to a
different place each year; in a city, or just a hotel off a highway or near an
airport. My favorites have been the Hyatt Place Amsterdam Airport, the Hyatt
Regency Charles de Gaulle, and the Hyatt at Lake Tahoe. Something about sterile
rooms, room service, and a hotel near, but not too close to, beautiful and
historical landmarks just centers you and allows you to think.

~~~
yitchelle
Agreed. I guess the trick to selecting a location is to be somewhere where the
inside of the hotel/cabin room is much more attractive than the outside.

I had a colleague who told me that his most productive period was when he was
stuck in the hospital for a couple of weeks but was able to do some coding.

~~~
nemo1618
There was a post on HN years ago advocating coding on a cruise ship -- it's
like a nice hotel, but the internet is so crappy that you'll only use it when
you _really_ need it (e.g. finding docs, syncing git), which is great for
productivity!

~~~
wittjeff
Post may have been by Tynan ([http://tynan.com](http://tynan.com)); he's
big on working while on transatlantic cruises. Following his advice, I have
done the same thing several times.

Here are some tips:

\- Find the best cruises most easily on cruisesheet.com (disclosure: it's
Tynan's project)

\- Royal Caribbean has the best internet at sea through O3b. In the Caribbean
or Mediterranean it's about 70ms latency. In the middle of the Atlantic it's
about 220ms latency. So Skype may work but the delays are annoying. Plenty of
bandwidth now though. You may have only "one 9" service on average though. So
scheduled conference calls are always a gamble.

\- Repositioning cruises (many ships move from the Mediterranean to the
Caribbean and back seasonally) in April/May and October/November are the
cheapest cruises you'll ever find. 5-8 days at sea means plenty of time to get
some work done, as well as goof off a bit during the evenings. For the second
week (Europe or Caribbean), I find it easy enough to get an hour or two of
email catch-up in on port days after getting back from a shore excursion, but
it's easier to just say port day = vacation, sea day = work and great food.

A 2-week repositioning cruise may cost as little as $600 per person including
taxes and fees. Add another $200pp for gratuities, a few hundred dollars for
shore excursions, and a few hundred dollars for airfare to get back home.
Depending on where you go, you may get the benefits of a 2-week vacation for
the price of one on land.

Similarly, if you want to do a one-week offsite with your startup
(particularly if you're all remote most of the time anyway), this is probably
cheaper than flying your crew to any big city.

The seminars-at-sea model was also well proven out by geekcruises.com, now
renamed insightcruises.com.

My wife and I have been renting an off-season beach house near Boston for 8
months out of the year for less than the price of a basement studio in
Cambridge, then traveling during the summer. This means our repositioning
cruises are further discounted by the fact that we aren't paying rent or
mortgage on an empty house. It's hard to beat if you both have good schedule
flexibility.

------
United857
Good writeup -- and one of the main reminders for me is this:

People throw around words like "revolution" for the current deep-learning
push. But it's worth remembering that the fundamental concepts of neural
networks have been around for decades. The current explosion is due to
breakthroughs in scalability and implementation through GPUs and the like, not
any sort of fundamental algorithmic paradigm shift.

This is similar to how the integrated circuit enabled the personal computing
"revolution" but down at the transistor level, it's still using the same
principles of digital logic since the 1940s.

~~~
Iv
In computer vision at least, deep learning has been a revolution. More than
half of what I knew in the field became obsolete almost overnight (it took
about a year or two, I would say), and a lot of tasks received an immediate
boost in terms of performance.

Yes, neural networks have been here for a while, gradually improving, but they
were simply non-existent in many fields where they are now the favored
solution.

There WAS a big fundamental paradigm shift in the algorithms. Many people argue
that it should not be called "neural networks" but rather "differentiable
function networks". DL is not your dad's neural network, even if it looks
superficially similar.

The shift is that now, if you can express your problem in terms of
minimization of a continuous function, there is a whole new zoo of generic
algorithms that are likely to perform well and that may benefit from throwing
more CPU resources at them.
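As a concrete illustration of that recipe (my own toy sketch, not from the thread): once a problem is phrased as minimizing a differentiable function, plain gradient descent already works. Here a line y = w*x + b is fit by minimizing mean squared error; all names are illustrative.

```python
import numpy as np

# Toy example: express "fit a line" as minimization of a continuous function
# (mean squared error) and solve it with plain gradient descent.
def loss(w, b, x, y):
    return np.mean((w * x + b - y) ** 2)

def grad(w, b, x, y):
    # Analytic gradient of the MSE with respect to w and b.
    err = w * x + b - y
    return 2 * np.mean(err * x), 2 * np.mean(err)

x = np.linspace(0.0, 1.0, 50)
y = 3.0 * x + 1.0               # data generated with w=3, b=1
w, b = 0.0, 0.0
for _ in range(2000):           # gradient descent with a fixed step size
    dw, db = grad(w, b, x, y)
    w -= 0.1 * dw
    b -= 0.1 * db
print(round(w, 2), round(b, 2))  # w, b converge to about 3.0 and 1.0
```

The same descent loop works unchanged for any differentiable loss, which is exactly why the "express it as minimization" framing is so powerful.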

Sure, it uses transistors in the end, but revolutions do not necessarily mean a
shift in hardware technology. And, by the way, if we one day switch from
transistors to things like opto-thingies, and it brings a measly 10x boost in
performance, it won't be on par with the DL revolution we are witnessing.

~~~
Cacti
It is indeed pretty amazing in CV. In a field nearly as old as CS itself, I'd
say at least 75% of the existing techniques were made obsolete in a span of
only a few years.

You could have started your PhD in 2006, made professor in 2012, and nearly
everything you had learned would have been _completely_ different.

~~~
Iv
I got my diploma in 2003. Actually, I got lucky and found a client who
needed a computer vision specialist to add fancy features to their DL
framework, so I could train myself in this new direction.

------
rhacker
It's actually super refreshing learning that even programming masters like
Carmack are just now learning NNs and watching Youtube Stanford classes like
the rest of us. These are actual people, not gods :) Everybody poops!

~~~
tostitos1979
Yeah... but his first impulse was to write backprop from scratch. I've seen the
lectures and been dabbling with NNs for years, and I never thought to do it. I
always thought the Stanford people made you do it on assignment 1 to pay your
dues or something. I continue to think of Carmack as the Master hacker.

~~~
jorgemf
> his first impulse was to write backprop from scratch

Backprop is a very simple algorithm, nothing to fear there. The hard part is
calculating the derivatives if you want to be flexible in building your model.
But for feedforward networks with sigmoid activation, the equations to update
the weights are a joke.
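For what it's worth, those update equations do fit in a few lines. Below is a minimal sketch of my own (one hidden layer, sigmoid activations, squared-error loss, trained on XOR); the shapes, learning rate, and variable names are all illustrative, not anyone's production code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.5, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 0.5, (4, 1)), np.zeros(1)   # hidden -> output

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])              # XOR targets

initial_loss = np.mean((forward(X, W1, b1, W2, b2) - Y) ** 2)
lr = 0.5
for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                        # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - Y) * out * (1 - out)             # delta = error * sigmoid'
    d_h = (d_out @ W2.T) * h * (1 - h)              # chain rule, one layer back
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
final_loss = np.mean((forward(X, W1, b1, W2, b2) - Y) ** 2)
print(f"loss: {initial_loss:.3f} -> {final_loss:.3f}")
```

The whole backward pass is the two `d_out`/`d_h` lines plus the weight updates, which is the "joke" jorgemf is pointing at.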

~~~
neiled
I'm not sure we agree on the definition of very simple. It may be very simple
if you already know it...

~~~
shbm
[https://m.youtube.com/watch?v=i94OvYb6noo&feature=youtu.be&t...](https://m.youtube.com/watch?v=i94OvYb6noo&feature=youtu.be&t=5m20s)

You won't regret it. It's one of the best explanations of backprop on the internet.

------
r00k
I do trips like this every few years. I can't recommend them enough.

I wrote about my specific recommendations here:
[https://robots.thoughtbot.com/you-should-take-a-codecation](https://robots.thoughtbot.com/you-should-take-a-codecation).

~~~
Cthulhu_
I should do that sometime. I'm afraid I've lost my ability to focus, something
like that might help. I'd need a solid mission though, instead of a "look into
tech X" without a goal.

------
justonepost
Pretty awesome! If I ever had to say the one thing that differentiates
successful people from unsuccessful people it wouldn't be intelligence, or
even perseverance, or passion. It'd be focus. With focus, you can be amazingly
successful in so many types of occupations.

(That being said, passion / perseverance / intelligence can often lead to
focus)

~~~
michaelmcmillan
IQ tests seem to suggest otherwise.

~~~
chooseaname
Are these even legitimate? If a person can paint a masterpiece but can barely
balance their checkbook, IQ tests won't show their mastery in painting, only
their mediocrity in math.

~~~
habitue
The argument you're making has often been made as "there are different kinds
of genius". The research seems to have debunked this. Rather than seeing
multiple ways to be a genius, we see some general intelligence ability that
helps you no matter what you're trying to do.

In the case of our painter, if we're talking about Michelangelo or someone like
that, he'd likely come out with an excellent IQ.

As an aside, IQ tests don't make you do math problems and the like; it's all
pattern-matching questions where you learn all of the necessary context within
the test itself. Which makes sense, right? It's trying to measure your ability
to learn and generalize, not what you happen to know before you take the test.

------
arnioxux
I second his recommendation of CS231N:
[http://cs231n.stanford.edu/](http://cs231n.stanford.edu/)

You can probably go through the whole thing including assignments in under a
week full-time.

~~~
colmvp
Personally, I found it took me much longer. I watched Karpathy's lectures,
took notes and stewed upon the ideas, and read a bunch of other materials such
as blog posts and research papers to try and truly comprehend some of the
concepts mentioned in the course.

I found myself knowing how to create CNNs, but the why of the entire process
still feels under-developed. But I'll admit it could be because my
understanding of Calc and Linear Algebra was far more under-developed back
when I was studying the course than it is now.

~~~
_mhr_
What did you read/do to develop your calc and linear algebra skills? I feel
like I know calc and linear algebra fairly well on paper, but I'm unsure how
to translate that to the computer.

------
plg
The idea of programming something from "scratch" (whatever your definition,
and programming language) is the best way to really understand something new.
Reading about it, hearing someone speak about it is one thing ... but opening
up a blank .c file and adopting a "ok, let's get on with this" approach is
something much different.

It takes time though and one has to combat the "how come you're reinventing
the wheel" comments from co-workers, spouses, bosses, etc., which can be a
challenge.

~~~
tweek273
This is the biggest challenge I face. As I attempt to teach myself computer
science and programming, the toughest aspect is to work through SICP. I am
always tempted to take the path of least resistance and follow a tutorial to
build a tangible program that will impress. Must remember that this is a
journey and to build brick by brick, even if that means gathering the
ingredients for the clay, then mixing, then laying the bricks!

~~~
st1ck
I'm not really sure it's the most effective learning paradigm. It might be
easier to just copy-paste trivial programs and then try to modify them,
gradually increasing the complexity, until you reach a certain level of
understanding.

------
BenoitEssiambre
"I initially got backprop wrong both times, comparison with numerical
differentiation was critical! It is interesting that things still train even
when various parts are pretty wrong — as long as the sign is right most of the
time, progress is often made."

That is the bane of probabilistic code. Errors show up not as clear-cut
wrong values or crashes but as subtle biases. You are always wondering, even
when it is kinda working: is it REALLY working, or did I miss a crucial
variable initialization somewhere?
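Carmack's remedy from the post, comparing against numerical differentiation, is cheap to sketch. This is my own toy example (the function and names are illustrative): an analytic gradient is checked against central finite differences, which catches exactly the subtle-bias bugs described above.

```python
import numpy as np

def f(w):                        # toy "loss" standing in for a network
    return np.sum(w ** 3)

def analytic_grad(w):            # hand-derived gradient: d/dw sum(w^3) = 3w^2
    return 3 * w ** 2

def numeric_grad(w, eps=1e-5):   # central finite differences
    g = np.zeros_like(w)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (f(wp) - f(wm)) / (2 * eps)
    return g

w = np.array([0.3, -1.2, 2.0])
diff = np.max(np.abs(analytic_grad(w) - numeric_grad(w)))
print(diff)   # tiny when the analytic gradient is right; large when it's buggy
```

If the hand-derived gradient had a sign or scaling bug, `diff` would jump by orders of magnitude, turning a subtle bias into a clear-cut failure.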

~~~
hotmilk
Reproducing known results goes a long way.

~~~
BenoitEssiambre
That is only possible when you are not experimenting with new algorithms.

------
dopeboy
The Recurse Center (formerly known as Hacker School) is offering one-week mini
retreats (their typical program is months long). I applied and didn't get in,
but after reading this, I may try again.

[https://www.recurse.com/blog/127-a-new-way-to-join-the-rc-co...](https://www.recurse.com/blog/127-a-new-way-to-join-the-rc-community)

~~~
twic
Reminds me of /dev/fort:

[https://devfort.com/](https://devfort.com/)

------
tinderliker
>On some level, I suspect that Deep Learning being so trendy tweaked a little
bit of contrarian in me, and I still have a little bit of a reflexive bias
against “throw everything at the NN and let it sort it out!”

I am the same kind of person. But when John Carmack approaches this with
scepticism and concludes that it is indeed not over-hyped, I guess it's worth
learning after all!

CS231N, here I come.

~~~
tail-recursion
He never said it is not over-hyped. Most people who work in the field think it
is over-hyped.

------
deepaksurti
>> I’m not a Unix geek. I get around ok, but I am most comfortable developing
in Visual Studio on Windows.

This is a lesson for me and probably many others. Don't get hung up on tools,
ship!!!

~~~
munificent
I _think_ the subtext of your comment here is something like, "Look how
productive he is without mastering the _real_ tools we Unix hackers use! He
can get by with second rate Windows stuff!"

Maybe I'm reading you wrong. But if I am right, it's good to take a look
outside of the Unix bubble. Visual Studio is literally the world's most
sophisticated developer tool. More human hours of engineering have been poured
into it than likely any other piece of software we use on a daily basis.

Windows isn't my jam, but VS is incredible.

~~~
mxschumacher
Does it beat IntelliJ's IDEs?

~~~
cobalt
yes

------
cluoma
If anybody else is interested in doing something similar, I highly recommend
Michael Nielsen's online book: Neural Networks and Deep Learning[1]. He gives
really good explanations and some code examples.

I ended up writing the most basic feed-forward network in C[2]; although I
didn't use base libs like Carmack :(

[1]
[http://neuralnetworksanddeeplearning.com/chap1.html](http://neuralnetworksanddeeplearning.com/chap1.html)

[2] [https://github.com/cluoma/nn_c2](https://github.com/cluoma/nn_c2)

------
doomlaser
I like his insight that a lot of base neural network implementation code is
conceptually simple in the same way as writing a raytracer.

~~~
fossuser
I think the point about NN and ray tracing being simple systems that allow for
complex outcomes is something that seems to be a deeper truth about the
universe. Stephen Wolfram and Max Tegmark both talk and write about this - it
also shows up in old cellular automata like Conway’s game of life.

It’s pretty cool that so much complexity can come from a few small rules or
equations.

~~~
kridsdale1
You could say that physics is a relatively simple program that allows for
outcomes as complex as the universe.

~~~
fossuser
In his book Max Tegmark makes this argument, though he says it’s math rather
than physics.

------
Cieplak
Interesting to hear that C++ isn’t that well supported on OpenBSD. The story
is quite the opposite with FreeBSD, where it’s really easy to use either clang
or gcc. I usually spin up new jails to keep different installations sandboxed.
CLion works quite nicely with most window managers on FreeBSD, but I rarely
boot Xwindows these days and usually prefer to work with emacs inside tmux
from the console.

~~~
dijit
My gut feeling is (was) that C++ is a first-class citizen on OpenBSD. It's
interesting having my beliefs challenged like this by someone so influential.

The question becomes: if it's not C++, then what is the first-class citizen on
OpenBSD? And if it is C++, then how do we improve support? Just pointing to
ports seems like a poor answer.

~~~
pjmlp
> The question becomes: if it's not C++, then what is the first-class citizen
> on OpenBSD?

Being a UNIX derivative, the answer is quite easy: C.

~~~
cwyers
Sure, but is there a popular, modern C compiler out there that doesn't compile
C++ as well? Clang, GCC, Intel, MSVC... all of them compile C++ as well as C.
I don't think it's possible to have first-class support for C and not C++ any
more, unless you ship your own compiler. It sounds like the issue here is that
OpenBSD is shipping older versions of LLVM/Clang and GCC.

~~~
temprature
OpenBSD's base ships the most recent LLVM and Clang (5.0.1), but there's no
LLDB yet and gdb is outdated (6.3, the last GPLv2 version).

He would have had a much better time using C99 than C++11, especially since he
didn't want to install a newer gdb from ports.

------
TheAceOfHearts
I find it very admirable that he can sit down for a week and just focus on one
main subject. Personally, I get derailed all the time when I don't have a very
well defined goal in mind.

It looks a little something like this: I'll be reading a manpage and notice
another manpage referenced at the bottom. So I obviously keep crawling this
suggestion tree until I bump into a utility whose purpose is unclear. Then
I'll go searching online to try and figure out what kinds of problems or use
cases it's meant to help with.

------
abenedic
He mentions it a bit, but OpenBSD is really a good place to start for
multiplatform code. Its focus on security and POSIX helps a ton.

------
blt
This post made me appreciate being a graduate student a lot. I have many weeks
like this!

~~~
bsenftner
I've structured my life like this. I have a remote employer in a time zone 10
hours away; I get months-long tasks, simply update my boss on progress, and am
left to focus so deeply that my health suffers, because I naturally obsess over
my work, which I love. I work from home, and my wife also works from home
(she's a freelance film producer), so we just immerse ourselves, have meals
together, but otherwise both obsess over our work. No commuting makes this very
enjoyable.

------
z3phyr
I really like the attitude of picking a topic at hand and hacking around it for
fun. It radiates a very MIT hacker feel. The writeup is very motivating.

John has been experimenting with a lot of stuff -- Racket, Haskell, Computer
Vision and now Neural Networks. I guess there is no professional intent, but
the spirit of hacking lives on.

------
forgotmypw
No-JS link:

[https://m.facebook.com/permalink.php?story_fbid=211040872252...](https://m.facebook.com/permalink.php?story_fbid=2110408722526967&id=100006735798590)

John Carmack

After a several year gap, I finally took another week-long programming
retreat, where I could work in hermit mode, away from the normal press of
work. My wife has been generously offering it to me the last few years, but
I’m generally bad at taking vacations from work.

As a change of pace from my current Oculus work, I wanted to write some from-
scratch-in-C++ neural network implementations, and I wanted to do it with a
strictly base OpenBSD system. Someone remarked that is a pretty random
pairing, but it worked out ok.

Despite not having actually used it, I have always been fond of the idea of
OpenBSD — a relatively minimal and opinionated system with a cohesive vision
and an emphasis on quality and craftsmanship. Linux is a lot of things, but
cohesive isn’t one of them.

I’m not a Unix geek. I get around ok, but I am most comfortable developing in
Visual Studio on Windows. I thought a week of full immersion work in the old
school Unix style would be interesting, even if it meant working at a slower
pace. It was sort of an adventure in retro computing — this was fvwm and vi.
Not vim, actual BSD vi.

In the end, I didn’t really explore the system all that much, with 95% of my
time in just the basic vi / make / gdb operations. I appreciated the good man
pages, as I tried to do everything within the self contained system, without
resorting to internet searches. Seeing references to 30+ year old things like
Tektronix terminals was amusing.

I was a little surprised that the C++ support wasn’t very good. G++ didn’t
support C++11, and LLVM C++ didn’t play nicely with gdb. Gdb crashed on me a
lot as well, I suspect due to C++ issues. I know you can get more recent
versions through ports, but I stuck with using the base system.

In hindsight, I should have just gone full retro and done everything in ANSI
C. I do have plenty of days where, like many older programmers, I think “Maybe
C++ isn’t as much of a net positive as we assume...”. There is still much that
I like, but it isn’t a hardship for me to build small projects in plain C.

Maybe next time I do this I will try to go full emacs, another major culture
that I don’t have much exposure to.

I have a decent overview understanding of most machine learning algorithms,
and I have done some linear classifier and decision tree work, but for some
reason I have avoided neural networks. On some level, I suspect that Deep
Learning being so trendy tweaked a little bit of contrarian in me, and I still
have a little bit of a reflexive bias against “throw everything at the NN and
let it sort it out!”

In the spirit of my retro theme, I had printed out several of Yann LeCun’s old
papers and was considering doing everything completely off line, as if I was
actually in a mountain cabin somewhere, but I wound up watching a lot of the
Stanford CS231N lectures on YouTube, and found them really valuable. Watching
lecture videos is something that I very rarely do — it is normally hard for me
to feel the time is justified, but on retreat it was great!

I don’t think I have anything particularly insightful to add about neural
networks, but it was a very productive week for me, solidifying “book
knowledge” into real experience.

I used a common pattern for me: get first results with hacky code, then write
a brand new and clean implementation with the lessons learned, so they both
exist and can be cross checked.

I initially got backprop wrong both times, comparison with numerical
differentiation was critical! It is interesting that things still train even
when various parts are pretty wrong — as long as the sign is right most of the
time, progress is often made.

I was pretty happy with my multi-layer neural net code; it wound up in a form
that I can just drop into future efforts. Yes, for anything serious I should
use an established library, but there are a lot of times when just having a
single .cpp and .h file that you wrote every line of is convenient.

My conv net code just got to the hacky-but-working phase; I could have used
another day or two to make a clean and flexible implementation.

One thing I found interesting was that when testing on MNIST with my initial
NN before adding any convolutions, I was getting significantly better results
than the non-convolutional NN reported for comparison in LeCun ‘98 — right
around 2% error on the test set with a single 100 node hidden layer, versus 3%
for both wider and deeper nets back then. I attribute this to the modern best
practices — ReLU, Softmax, and better initialization.

This is one of the most fascinating things about NN work — it is all so
simple, and the breakthrough advances are often things that can be expressed
with just a few lines of code. It feels like there are some similarities with
ray tracing in the graphics world, where you can implement a physically based
light transport ray tracer quite quickly, and produce state of the art images
if you have the data and enough runtime patience.

I got a much better gut-level understanding of overtraining / generalization /
regularization by exploring a bunch of training parameters. On the last night
before I had to head home, I froze the architecture and just played with
hyperparameters. “Training!” is definitely worse than “Compiling!” for staying
focused.

Now I get to keep my eyes open for a work opportunity to use the new skills!

I am dreading what my email and workspace are going to look like when I get
into the office tomorrow.

~~~
bringtheaction
Better non-JS link: [http://archive.is/MvKHy](http://archive.is/MvKHy). This
link is a snapshot of the DOM of the rendered page. The one you linked has a
not-so-comfortable layout, since it was made for old phones with small screens.

------
Ono-Sendai
I've also taken the time to implement a NN in C++ and train it on the MNIST
handwriting data. It's a lot of fun :) As a result I have some pretty fast CPU
NN code lying around.

------
swah
Note: this is a post from John Carmack, not Facebook HQ.

------
jenkstom
I'm pretty sure this is what being a teenager or in college is for. In middle
age I'd much rather spend a week doing a meditation retreat.

------
peternicky
Since when is Carmack writing on Facebook??

~~~
balls187
Since he's employed by them. (He works for Oculus, which is owned by FB.)

That said, I chuckled at his note about being contrarian, as he _writes_ a
post on FB.

------
burnte
Hands up for everyone who expected John to announce the first self-aware AI at
the end.

~~~
justonepost
I think his passion is and always will be total immersive VR. I truly believe
we are where we are today completely on the back of Carmack's recognition of
Oculus's potential. He was just trying to be friendly and gave Palmer
assistance. Glad it all worked out, though.

------
balls187
Compatible with Firefox Reader View:
[https://pastebin.com/raw/hZ32q3Rv](https://pastebin.com/raw/hZ32q3Rv)

~~~
porjo
The Facebook page loaded fine for me in Firefox Reader View!?

------
swang
I find people's continued good feelings towards Carmack peculiar.

During the Zenimax/Oculus case:

He claimed that he never wiped his hard drive: An independent court expert
found that most of his hard drive was wiped after Carmack heard about the
lawsuit. So Carmack lied in his affidavit.

He claimed that no source code from Zenimax ever got transferred over to
Oculus. Then he later admitted that the emails he had taken from his Zenimax
laptop on his last day there did contain source code. He denies that the
source code in the emails benefited him, saying that he "rewrote" all the code
anyway. But this runs counter to the testimony of Oculus programmers who
admitted they copied Zenimax code straight into the Oculus SDK.

He also has not outright denied the copying claims in the testimony of David
Dobkin, in which Dobkin testified about the similarity between Zenimax's
source code and the source code in Oculus's SDK. Carmack instead accused
him of doing it for money and argued that his methodology wasn't very robust.
But no denial, just ad hominem attacks.

So why do people still fawn over him when it seems like his ethics are
dubious?

(Granted it could be possible that the technical aspects of the case went over
the heads of jurors. And I will admit to being wrong if the Oculus appeal ends
up revealing more information.

But his HD was discovered to be wiped, and even Oculus programmers admitted to
copying code. Why does he get a free pass on this?)

~~~
panic
Yeah, Carmack has a history of unethical behavior. From the book "Masters of
Doom":

 _> Late one night Carmack and his friends snuck up to a nearby school where
they knew there were Apple II machines. Carmack had read about how a thermite
paste could be used to melt through glass, but he needed some kind of adhesive
material, like Vaseline. He mixed the concoction and applied it to the window,
dissolving the glass so they could pop out holes to crawl through. A fat
friend, however, had more than a little trouble squeezing inside; he reached
through the hole instead and opened the window to let himself in. Doing so, he
triggered the silent alarm. The cops came in no time._

 _> The fourteen-year-old Carmack was sent for psychiatric evaluation to help
determine his sentence. He came into the room with a sizable chip on his
shoulder. The interview didn’t go well. Carmack was later told the contents of
his evaluation: “Boy behaves like a walking brain with legs ... no empathy for
other human beings.” At one point the man twiddled his pencil and asked
Carmack, “If you hadn’t been caught, do you think you would have done
something like this again?”_

 _> “If I hadn’t been caught,” Carmack replied honestly, “yes, I probably
would have done that again.”_

 _> Later he ran into the psychiatrist, who told him, “You know, it’s not very
smart to tell someone you’re going to go do a crime again.”_

 _> “I said, ‘if I hadn’t been caught,’ goddamn it!” Carmack replied. He was
sentenced to one year in a small juvenile detention home in town. Most of the
kids were in for drugs. Carmack was in for an Apple II._

~~~
bitwize
Carmack: Breaks into a building at age 14, steals a bit of code today. "Oh, he
has a history of unethical behavior, we shouldn't look up to him."

Bill Gates: Steals a bulldozer to race with his buddies at age 19(?), is
responsible for Microsoft's corporate culture and history of predatory
behavior. Hackernews loves him and can't get enough of Microsoft. Boy, they're
a fair sight better than Google, eh lads?

I pity the person who learns ethics from Hackernews.

~~~
sokoloff
I haven't experienced a consistent love fest here for Bill Gates. There are
supporters and detractors, just as you'd expect for any public figure. Perhaps
you notice more those opinions with which you disagree.

------
gigatexal
The man is a genius, but why OpenBSD? FreeBSD would have been my choice.

~~~
gtsteve
I've never used a *BSD, but perhaps you could say why you think FreeBSD is
superior to OpenBSD?

~~~
gigatexal
I have used both and found them equally consistent. But maybe my usage was
superficial and I missed something.

