
Going “Write-Only” - begriffs
http://begriffs.com/posts/2015-04-20-going-write-only.html
======
rntz
> The discussion in The Republic maintains that would-be citizens of the ideal
> republic should be exposed to music that cultivates their good qualities,
> and prohibited from listening to the bad. Much modern music creates
> agitation and aggression. I’ll listen to serene and balanced songs like
> Gregorian chant and neoclassical

> The etching, and even the pressing and collating of pages were difficult
> processes but somehow the artists outdid us, we who can so easily create,
> modify and distribute images. Goodbye cartoonish web images, let me be
> immersed in nature and see uninhibited art instead.

The author should be careful not to confuse a desire for focus, and for the
lasting rather than the ephemeral, with a fetishisation of the past, of the
"authentic", and of the "natural". It is all too tempting to step from using
survivorship bias as a tool, as a filter, into a false belief in the
superiority of things past.

> the (false?) feeling of connectedness through a glowing rectangle

This particularly gets to me. I know people who would not be here on this
earth today if not for the ability to connect with others through the
internet. If you feel like disconnecting makes your life more fulfilling,
great! But don't project that onto others by saying that what makes their life
meaningful is false. (At least he had the self-awareness to throw in a
question mark.)

~~~
mhd
I predict that this will come full circle again in a few years, and we'll be
back to the "leave meatspace, enter the cyber world" over-enthusiasm. And
then a few years after that it's vinyl/Thoreau again, etc.

~~~
kordless
Ebb and flow.

------
saalweachter
> To program any more was pointless. My programs would never live as long as
> [Kafka’s] The Trial.

As a programmer, I want to believe I'm a writer, a poet, a sculptor.

But the truth is programmers are dancers, mimes, ventriloquists. We're
performance artists, and the systems we program really only have meaning as
long as we keep programming them (or, if we're really lucky, someone else
takes up the dance when we grow tired of it). Once we take our final bows, the
programs will fade away and die and all that will be left is a memory of the
performance.

~~~
tbirdz
I don't think this is true. Consider the popularity of emulators for older
systems. People still care about and use old programs and games. You can even
still buy games released 35+ years ago from online retailers like GOG. Think
about how old programs like ls, dd, etc. are (even older when you consider the
design and not just the implementation).

People still care about "Super Mario Bros" (a computer program) today, over 30
years after its original release. Is it really going to be forgotten in the
next 10 years? 20? 30? I think not. Will all emulators be abandoned?

Maybe appreciation for old computer programs is a niche interest, but let's
face it, Kafka is not exactly popular among the masses either.

Most software may be forgotten, but most books will be too.

~~~
bentcorner
I find it a little sad that most modern computing will be lost to time. Early
console games were easily self-contained systems, and lent themselves well to
emulation. Today's consoles are towers of complexity that will likely never be
emulated. At best, compatibility layers will be built into future systems to
interoperate; the likely result is that the classics of today will continue to
hobble along on a dwindling number of devices. At worst, a corporation loses
interest and all their consoles immediately turn into expensive paperweights.

~~~
korethr
I disagree. It's easy today to think of the consoles of yesteryear as simple
self-contained systems (A 6502-based CPU? How quaint!). However, consider the
unholy trinity that was the Sega Genesis/Master System/CD/32X. Though no
longer modern, that absolutely is a tower of complexity, especially when it
was brand new.

The emulators of the game consoles of the past exist today because some hacker
got curious and interested. Complexity certainly isn't stopping emulator devs.
The Wii was more complex than the Gamecube, which was more complex than the
N64, which was more complex than the SNES, which was more complex than the
NES. And yet there are emulators available for all of those systems.

What is different today that would prevent an enterprising hacker tomorrow
from making emulators of today's consoles?

~~~
lloeki
> What is different today that would prevent an enterprising hacker tomorrow
> from making emulators of today's consoles?

Not a sliver of hope of access to the server-side runtime of cloud-assisted†
games?

† not full-scale MMOs but Diablo III, Starcraft II, Destiny, even Guild
Wars...

~~~
abricot
If they follow the example of Ultima Online they certainly will live on
forever.

In many cases, though, they will be replaced by full conversions that get
even more popular than the original.

------
dkhenry
What a waste. I am amazed that you can take the attitude of "all new things
are broken" and be involved in the creation of anything new at all. To me, you
need one of two outlooks on life to hold that viewpoint. Either you really
think that you, independently of anyone else, are so awesome that you will
create something of value by yourself that is somehow better than what anyone
else is doing, or you think that you yourself are so much better than everyone
else that things which hold value to others are meaningless to you. Either
way, you're placing so much value on yourself that whatever comes out will be
so isolated as to be useless to everyone else -- but why would you care, since
you're so much better than everyone else?

I mean, consider the blog post "I am going write-only", which loosely
translates to: what I produce is worth your time to read, but what you produce
is not worth my time to read. No thank you.

~~~
allendoerfer
Especially the part about his Twitter usage shows his attitude pretty well.

------
delluminatus
I like this post a lot, because it reminds me a lot of myself. I have spent a
long time thinking about how the deluge of easy entertainment and shallow
articles on the Internet has affected the way people operate. I've even done a
similar experiment in the past, except in my case it was banning the Internet
except for a specific set of use cases. It was certainly an interesting
experience.

One warning I will throw out for those considering entertainment/distraction
deprivation: you will need something to spend your time on. Instead of just
taking away, use the same opportunity to try to cultivate a productive habit
(hey, you're going to be suffering anyway -- might as well).

Also, you need to think about whether this really addresses the issues that
cause you to be "mediocre"/lazy/unproductive. It's easy to blame external
factors like the Internet for providing easy entertainment, but it's also
important to look inward and see if you have mental roadblocks that are
inhibiting you -- perhaps you are mentally exhausted from work, or maybe you
just aren't interested. Also, I've started to think that some people are just
flat out less productive than others. Or to put it another way, some people
are abnormally productive. After all, productivity is really more of a means
than an end. If you're not sure where you are going, it doesn't matter how
fast you get there. And if you do have a goal, abstract ideas like quality or
productivity are quickly redefined in more concrete terms to fit your new
intention.

------
Kenji
He has delusions of grandeur. He compares himself to Kafka and says he
insulates himself from "online mediocrity". Does he realize how incredibly
arrogant he sounds? What he needs right now is to acknowledge that he's just
another ordinary human, capable of ordinary deeds (basically an "ego
adjustment" to realistic levels). He should do what he loves doing how he
loves doing it, without any thoughts about how it measures up against peers or
great artists from the past. That's how great art comes into being. But hey -
if being a judgmental prick is part of his creative process, then so be it.

~~~
ky3
You'd also think Thoreau a dick for renouncing society for a time, yes?

_What he needs right now is to acknowledge that he's just another ordinary
human_

OP is on an experiment to put his ordinariness as a human being to the test.
I'm grateful for a view from the sidelines.

"A man is called selfish not for pursuing his own good but for neglecting his
neighbor's." \-- Whately

And a man is called arrogant not for how much he strives to better himself but
for how he belittles his neighbor.

Sorry that you feel belittled.

------
scott_karana
As a counterpoint, how many people can actually read Chaucer's Canterbury
Tales in its original form, or Beowulf for that matter?

Human language changes on a _longer_ timescale, no doubt, but in
Neo-Sanfransokyo in the year 2599, will English even resemble its present
self?

~~~
rjbwork
Almost definitely not. Four quick examples of dialects: AAVE, Indian English,
Australian English, and, for a bit of localization, Iowan English. All are
mutually intelligible to someone who cares to pick up the slang, turns of
phrase, and alternate grammars, but they are clearly and swiftly diverging.

~~~
computmaxer
I live in Iowa. Is our dialect concrete enough to put it in a group with
Indian English and Australian English?

~~~
timv
I'm sorry, I can't quite understand what you're asking.

Could you repost it in Australian English?

------
qrohlf
> My first concrete step will be to eliminate variable information rewards
> from my computing life.

By posting your article to HN? I appreciated the article and found it
interesting, but it seems odd that you're the one posting it given the
sentiment behind the article.

~~~
oillio
Posting to HN is consistent with his plan, as long as he doesn't respond to
comments. He has said he will be write-only: he can post tweets, HN articles,
etc. However, he cannot read the responses and comments to his output more
than once a week.

~~~
unholiness
That's a pretty ridiculous stance to take. By posting here, he clearly thinks
his thoughts are worth our time to read, but by going "write-only", he clearly
thinks other peoples' thoughts aren't worth his time to read.

The whole exercise seems very conceited.

------
bitwize
Remember how deep and cool we thought the guy from _The Guy I Almost Was_ was
when he decided to check out of techno-hip cybersociety, become a citizen of
the past, and search for true meaning in old, or absent, technology?

Yeah, turns out he was just going full hipster. Which is just as bad in its
own way.

Balance. In all things, balance. Not very profound to say. You can't make a
manifesto out of it, because duh. But yeah. It's what works, I think.

------
nicolethenerd
> A computer will never live as long as The Trial. … What if Amerika was only
> written for 32-bit Power PC?

This line kind of got me, and made me think about the transience of my own
work. I build educational software, and I entered this field because I was
inspired by the educational games I played as a kid. Most of those games,
developed for a specific operating system, can still be run today on a VM or
an emulator of an older Windows machine. But the apps I work on are web
applications - we're constantly racing to update and maintain them in a sea of
ever-changing devices and standards. The odds of somebody being able to run my
work, even just a few years from now, and have it work without issue are
unfortunately kind of small (the introduction of iOS 8 already wreaked havoc
on some of our layouts). And that makes me kind of sad.

~~~
wtetzner
I've been thinking about this problem lately as well. I think one solution
might be to build your games for a well specified abstract machine. Then, you
can just build an implementation of that machine for the web, and run the
games on that.

The inspiration for the idea was SCUMM, and how we're able to play SCUMM games
now with SCUMMVM.
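The abstract-machine idea can be sketched as a toy: ship the "game" as bytecode for a tiny, fully specified stack machine, and port only the interpreter to each new platform (cf. SCUMM/ScummVM). Every opcode and name here is invented for illustration; a real machine like SCUMM is far richer.

```javascript
// A toy "abstract machine": game logic ships as bytecode, and only this
// small interpreter needs re-implementing on each new platform.
const OPS = { PUSH: 0, ADD: 1, PRINT: 2, HALT: 3 };

function run(bytecode, output = console.log) {
  const stack = [];
  let pc = 0; // program counter
  while (pc < bytecode.length) {
    const op = bytecode[pc++];
    if (op === OPS.PUSH) stack.push(bytecode[pc++]); // next cell is the operand
    else if (op === OPS.ADD) stack.push(stack.pop() + stack.pop());
    else if (op === OPS.PRINT) output(stack.pop());
    else if (op === OPS.HALT) break;
    else throw new Error(`unknown opcode ${op}`);
  }
  return stack;
}

// "Game" program: compute 2 + 3 and print it.
const program = [OPS.PUSH, 2, OPS.PUSH, 3, OPS.ADD, OPS.PRINT, OPS.HALT];
run(program); // prints 5
```

As long as the machine's semantics stay fixed, the same bytecode keeps running on every future host that carries the interpreter.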

------
joeyh
Oh precious irony of clicking on this in HN in a bored moment!

I think you'll do better if you find ways to make your clarifying restrictions
be imposed upon you, rather than attaining them through willpower.

I do find 2-3 day email and internet fasts useful from time to time, but the
internet is also a source of much inspiration.

------
lnanek2
So his plan is: no online news; only write to GitHub and Twitter, never read;
only old, time-tested books. I don't think you can be a good coder like this.

A lot of improvements to software development are too new to be time tested
and encoded into book form. For example, Node scales a lot better than Rails,
and maybe there are books about that now and not just everyone hearing about
companies that switched, but beginner Node programmers are going to get stuck
in massive ugly callback pyramids unless they learn mitigation techniques like
Promises.
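That mitigation can be sketched with made-up, synchronously invoked stand-ins (`getUser`, `getPosts`) for real async calls: the nested-callback shape first, then the same pipeline flattened with Promises.

```javascript
// Hypothetical callback-style API: each step nests inside the last,
// and the "pyramid of doom" deepens with every further step.
function getUser(id, cb) { cb(null, { id, name: "ada" }); }
function getPosts(user, cb) { cb(null, [`post by ${user.name}`]); }

let pyramidResult;
getUser(1, (err, user) => {
  if (err) throw err;
  getPosts(user, (err2, posts) => {
    if (err2) throw err2;
    pyramidResult = posts;
  });
});
console.log(pyramidResult); // [ 'post by ada' ]

// The same pipeline as Promises flattens into a chain,
// with a single error path at the end instead of one per level.
const getUserP = (id) => Promise.resolve({ id, name: "ada" });
const getPostsP = (user) => Promise.resolve([`post by ${user.name}`]);

getUserP(1)
  .then(getPostsP)
  .then((posts) => console.log(posts))
  .catch((err) => console.error(err));
```

The chain stays the same depth no matter how many steps are added, which is exactly what the callback version can't do.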

There are simply too many things a good coder should know that are picked up
by reading other people's code. What if you didn't know about A* pathfinding
and had a buggy, slow pile of loops and if statements instead? Then you are
doing shoddy work and charging your clients more for reinventing and
debugging the wheel when you should have just looked up the algorithm.
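For the A* point, here is a compact sketch of the algorithm on a toy grid (0 = open, 1 = wall), using a Manhattan-distance heuristic and a naive sorted array where a real implementation would use a priority queue.

```javascript
// Minimal A* on a 2D grid. Returns the shortest path length, or -1 if
// the goal is unreachable.
function aStar(grid, start, goal) {
  const key = ([r, c]) => `${r},${c}`;
  // Admissible heuristic: Manhattan distance to the goal.
  const h = ([r, c]) => Math.abs(r - goal[0]) + Math.abs(c - goal[1]);
  const open = [{ pos: start, g: 0, f: h(start) }]; // naive "priority queue"
  const bestG = new Map([[key(start), 0]]);
  while (open.length > 0) {
    open.sort((a, b) => a.f - b.f); // pop the node with the lowest f = g + h
    const { pos, g } = open.shift();
    if (pos[0] === goal[0] && pos[1] === goal[1]) return g;
    for (const [dr, dc] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
      const next = [pos[0] + dr, pos[1] + dc];
      const [r, c] = next;
      if (r < 0 || c < 0 || r >= grid.length || c >= grid[0].length) continue;
      if (grid[r][c] === 1) continue; // wall
      const ng = g + 1;
      if (ng < (bestG.get(key(next)) ?? Infinity)) {
        bestG.set(key(next), ng);
        open.push({ pos: next, g: ng, f: ng + h(next) });
      }
    }
  }
  return -1;
}

const grid = [
  [0, 1, 0],
  [0, 1, 0],
  [0, 0, 0],
];
console.log(aStar(grid, [0, 0], [0, 2])); // → 6 (around the wall)
```

Thirty-odd lines versus a pile of ad-hoc loops: the point of the comment is that knowing this exists is what separates looking it up from reinventing it badly.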

The other problem I see is that he wants to live in the city, unlike the
people he cited, who moved to cheaper, less busy places like Walden Pond; and
he is focusing more on art than work, so it will be tougher for him to make
rent. Maybe Loop/Recur is paying him enough that he has time for that, though
-- who knows. Many people in big, expensive cities like NYC have to struggle
with multiple jobs.

~~~
cgag
People ignoring old works is why we have Node and people trying to polish it
with promises, rather than people just using Haskell or Erlang or something
else with lightweight threads. Or at least something inspired by them.

~~~
sbov
One of my favorite insights I've heard is that the present came from just a
tiny part of the past.

------
rrggrr
This post is a beautiful illustration of self-determination. Often on HN I
read about the tyranny of employers, government and co-workers and often I
identify with the aggrieved poster. But Begriff's post is a welcome reminder
for me that most of us already have everything we need to exercise what Locke
(and I think Begriff) would call our "natural rights", except the will to do
so. After reading this post I have less regard for my own complaints about
'self' unless accompanied by determination.

------
chrismartin
begriffs will create new content, but only read the work of others if it is
decades/centuries old? That sort of delayed feedback loop is no way to
participate in a community, or even in a discussion larger than comments on
your own work.

The survivorship bias is a logical fallacy, not a prescription for choosing
what to read!

Why disable images in your browser, when many concepts are best expressed with
a diagram or photograph?

It's important to consciously limit your content consumption, but this is not
the way to do it.

~~~
PopeOfNope
> That sort of delayed feedback loop is no way to participate in a community

Who says he's participating in any particular community?

> The survivorship bias is a logical fallacy, not a prescription for choosing
> what to read!

So you suggest reading works that didn't survive? How would you go about
finding content that is by definition dead?

> Why disable images in your browser, when many concepts are best expressed
> with a diagram or photograph?

Yeah, you've got me there. I understand his viewpoint on this, but the
execution is problematic. I'd wager the vast majority of images on websites
have nothing to do with the content you're reading. Even images that are
technically a part of the article you're reading frequently add nothing to the
text. How do you filter the useful from the superfluous?

------
jcoffland
This article makes a great argument for not reading it.

------
6d0debc071
There are things that last because they were done once and were done, mostly,
right. Then there are things that have lasted because they were done once,
mostly wrong, and still provided enough value at the time that they became
popular among the group with the power to determine what would be the dominant
solution before they could be refined.

The flip-side of permanence is path-dependency. So many programs get written
for a certain architecture of chip and instruction set, for instance, that to
change it to something more efficient would break too many things.

To make something that lasts because it is good, that lasts on its own
virtues, would be a wonderful thing. Nonetheless, the next generation ought to
be able to surpass us - and if they are not able to do so, we have done
something wrong. Blanket approval of past solutions simply _because_ they have
lasted seems likely to lead one astray.

------
moultano
I'd be surprised if this was the attitude of the people whose work he intends
to read. I expect they were mostly very in touch with all the other art of
their time.

------
Nadya
If you find your problem is reading, I would argue you are reading the wrong
things and not balancing your input with your output.

I agree that limiting your input might help improve your output. It could also
harm your output, as you are having fewer ideas come in to feed inspiration.

Best of luck when you read this on Monday. ;)

------
heurist
This would be too hard for me to maintain, but I'm sure it will bring benefits
for as long as it can be sustained. The content we expose ourselves to
definitely affects our subconscious thoughts. However, I instead prefer to
envelop myself in things I enjoy and attempt to recognize and improve the
quality of things I enjoy over time. For example, reddit is a low-quality
entertainment source. The more I recognize it the less I enjoy it, and the
less time I spend on it. Progress depends on exposing myself to things I don't
know how to enjoy and attempting to find ways to enjoy them. My preferences
have shifted from consuming to creating over time. The adjustments required
for my philosophy are much smaller and easier than for this 'write-only'
philosophy.

------
jonsterling
This resonates in a certain sense, but honestly I could not help but find this
post incredibly narcissistic.

~~~
mrweasel
Part of the narcissism comes in because he wants to broadcast, not
communicate. I don't disagree that email can be a distraction, but only
answering your email once a week makes communication impossible. Twitter would
be useless (more than I already believe it is) if everyone decided to use it
as a broadcast medium and never read anything. What's the point if there are
no readers?

You can do broadcasting if you're Linus Torvalds; if not, you might need to
check your email a little more frequently.

~~~
jonsterling
Yeah (though I don't agree with the Torvalds comment), I basically agree.
“Write-only” means “I am the center of the universe, and the plebs should
listen to what I have to say and write, but I will listen to nobody: and if
you were smart, you would stop listening to me and only broadcast, but if you
are not, then you should keep reading my blog.”

It really strikes me as another variation on the Carefully Curated Instagram
Feed that contains a million pictures of some hustler douche skydiving or
touring Africa, or something. Building engagement with the Personal Brand,
etc. I mean, everybody kind of does it, but it's a new level of narcissism to
write a blog post about how “I’m going to focus on #1 right now, which is me,
and the fact that I am the only one who has anything worthwhile to say.”

If you really want to focus on knowing yourself and being reflective, that's
great! But don't “broadcast”. Just be silent for God's sake.

------
clay_to_n
I agree with the sentiment in this that contemporary media is distracting and
oversaturated.

But when he says:

>The simple test of time does yield false negatives.

I wonder what makes him so sure it doesn't yield false positives. My own view
on aesthetics is that they're highly personal - something "timeless" might not
appeal to you at all. Timeless works are the ones which happen, overall, to
appeal to the most people - it doesn't mean they necessarily are higher
"quality." It also has to do with how accessible they are, how ubiquitous (/
marketed), etc.

(Sidenote, but his arguments against startup aesthetics could also apply to
modernism and many other movements. It's about distilling something to its
essence, removing superfluous details in favor of clarity. For all the faults
"flat" design has, I think it is a noble cause even if it takes some
missteps.)

There is a lot of contemporary work (programming, music, art, etc) that I find
compelling, and personally I want to embrace it. Understanding the
contemporary helps you understand the current moment of time, and where things
are going. It lets you see larger trends and where you might want to fit in.
I'm not very afraid of embracing something that turns out to be a false start
- at least, at the moment, I was doing what I thought was best. Maybe this is
a fallacy of youth.

I'm all for embracing the classics, the tried and true. But I think applying
lessons learned from them to the unproven ideas is where things get
interesting.

------
serve_yay
I've seen a lot of pieces lately, talking about the career of a programmer, or
the state of the tech industry or what have you, written with rather a lack of
perspective. Sorry, but even the work of those who are deeply fulfilled by it
will not last as long as a Kafka novel. It's simply not possible, or else we
would be drowning in the past. Even for writers, even for whatever occupation
you think noblest. The blowing away of the past is what gives us a chance to
build, and the blowing away of all we have is what will give that chance to
future generations. I believe someone may have written a song about this at
some point.

Of course, just because something's not going to happen doesn't mean we
shouldn't strive for it anyway. But I am unsure that posting less frequently
on Github will get the author any closer to that achievement. To me this
attitude of needing one's work to survive signals an uneasiness with death
that will permeate the rest of one's life, but then that would be quite a leap
to make on the basis of a blog post.

~~~
k-mcgrady
>> "To me this attitude of needing one's work to survive signals an uneasiness
with death"

I look at it slightly differently. It signals an acceptance of death being the
end and that you should make what you do now really count.

~~~
PopeOfNope
It also speaks to a universal human desire to have a purpose in life, to make
a lasting positive impact on the world.

------
sparaker
A pretty good article; I agree with most of the points made. However, I think
there are times when the creativity inside you needs some fuel, and looking
around and browsing the internet is a very fast way of acquiring that
knowledge. The key is to have a rule like 30/70: if you write 70% of the
time, you should be reading for 30% to charge your batteries.

------
gamblorr
If we create dependable, practical things that never break, few people will do
anything more than depend upon them, and most would likely never learn to
create things themselves.

So then, if six or seven billion people learn, and then go on to create
something that lasts, compounded across multiple generations, each with a
delta of several billion, we're left with pollution.

~~~
hargup
Well, we _have_ created a lot of dependable, practical things. I say something
is dependable when you don't have to look out for instances where it breaks.
The hardware of your computer, the architecture of your house, the transport
system, the water, electricity and food supply. Almost everything that matters
is so dependable that it is possible not to learn to create anything at all.

------
A_COMPUTER
I used to argue with the conclusions of the author, as presented to me by
other people older and wiser than myself. But I had to learn from experience
that I was wrong and the author is right. I thought I was smarter than
everyone else who told me to moderate, and who told me that you become what
you put in your body -- but I wasn't.

------
scotty79
> To program any more was pointless. My programs would never live as long as
> [Kafka’s] The Trial.

If you wrote COBOL it might live just as long.

On a more serious note: your code might live in a billion devices and do
useful stuff. Don't dismiss stuff that doesn't run on meatware. People might
become obsolete faster than some of your code.

------
AnonJ
All pretty good. However, the author didn't answer the question he posed at
the beginning of the article: how does he plan to escape the "ephemerality" of
computer programs? It seems from the footnote of his website that he plans to
do so by "journeying from web ephemera to the timeless world of data." But is
that really so fundamental a change? Or, actually, maybe programming isn't
that "ephemeral" after all. It's just that people nowadays have the luxury of
updating things at a much higher rate than ancient people did, while still
preserving (and improving upon) the core spirit. Languages, libraries and
classical programs have been around for years. They're not really being
overhauled or outdated as suggested; they're just being improved.

------
chrisbennet
An _inherent_ desire to make long lasting things is an aspect of
craftsmanship. However, I think some people are driven by a need for external
validation (N people use my creation!).

Much of my programming is a (hopefully great) meal that I make and eat myself.

------
brendano
If you bet on the right platform, code can live for a while.

For example: this classic paper on compression
[http://web.stanford.edu/class/ee398a/handouts/papers/WittenA...](http://web.stanford.edu/class/ee398a/handouts/papers/WittenACM87ArithmCoding.pdf)

was published with a C, Unix-based implementation (I put it here
[http://brenocon.com/WittenNealCleary-ArithmeticEncodin-cacm-...](http://brenocon.com/WittenNealCleary-ArithmeticEncodin-cacm-87.shar))

which compiles and runs fine (just lots of compiler warnings) on modern Linux
and OS X, 28 years later.

------
lifeisstillgood
> I will eliminate all use of the computer that is not directly related to
> creating things. If I’m not coding, writing, or editing videos then there
> will be literally nothing to do.

Err, but aren't I reading this on a computer? Did he not read inspiring things
online, or watch videos on how to code or cook?

Consumption of others' output is not an intrinsically bad thing; it is
balancing it with creation that matters. Much like the academic's closed door:
if you create without ever consuming, you may be productive, but eventually
you will be working on the wrong problem.

------
diminoten
Oh he'll _only_ use the command line twitter tool during his month of solitude
and reflection, huh?

It just seems artificially constructed as a framework for doing... something,
like this guy is waving his arms about, trying to find some deliberate way of
being like Thoreau or Emerson.

It just reminds me a bit of when Sarah Silverman talked about hecklers -- how
once, a woman simply shouted out, "I exist!", because that's what hecklers are
trying to do -- show people they exist.

This guy just seems like he's trying to let other people know he exists.

------
jqm
Code might not live forever. So what? It's not supposed to.

I'd be disturbed if something better didn't eventually replace what we are
currently doing. Evolution. The progressive succession of organisms. Unless
you are a horseshoe crab, your species is probably just a flash in the pan and
a rung on a ladder.

I consider myself less an architect and more a cook. The nutrition is provided
and the meal enjoyed, but then people go on their way. The meal is gone, but
the effects it provided echo in time forever.

------
evjan
I did something similar in 2013, see my blog:
[http://peterevjan.com/posts/a-week-of-no-media-consumption/](http://peterevjan.com/posts/a-week-of-no-media-consumption/)

In the end, it forced me to be more creative and social, but it didn't produce
any lasting change. I need to keep forcing myself to turn off the internet
etc. to be able to get creative stuff done during evenings and weekends. Sad
but true.

------
btilly
His plan is missing the inevitable step of, "Go crazy."

Seriously, as an extrovert I've found that "shut down all external stimuli" is
a recipe for disaster.

~~~
egypturnash
It sounds like a pretty good plan if you're an introvert.

And if he needs human contact, he's not moving out to the middle of nowhere.
He can go outside. Turn on the TV or get a paper for news. Go to a restaurant.
Hook up with a friend for an evening. He's just turning off all the little
social drip-feeds that live in the computer.

I've never done that for a month, but I've done it for a week or two now and
then. It changed the way I think to not have Twitter to distract me on a bus
ride or walk through the park or whatever.

~~~
tonyedgecombe
Yes, strangely there were people with fulfilling lives before the internet
came along.

------
scotty79
> These are repeated activities which occasionally – and unpredictably – give
> a pleasant surprise.

That's an interesting observation. But I feel it's not only distractions that
work that way. Programming is a repetitive activity that occasionally and
unpredictably gives you a pleasant surprise that keeps you coming back to it.
Learning too.

------
scott_karana
Somehow, I suspect that Knuth's "Art of Computer Programming" will survive.

I feel that algorithms, especially as presented in concise research papers,
will survive, and continue to show beauty in the centuries to come, just as
the first-principles of mathematics and physics do. :-)

------
nemoniac
Nice article but are you sure you want to end the article with a "Contact me"
link?

------
ThomPete
_> To program any more was pointless. My programs would never live as long as
[Kafka’s] The Trial._

Had The Trial been written today, it wouldn't have survived long either. Its
survival is a function of much less competition rather than of quality, IMHO.

------
robgibbons
Excellent and thought-provoking post, very relevant to our industry and
specifically to our times. I would personally add a daily period of meditation
to further strengthen awareness and emotional tranquility.

------
walterbell
Reminder: we live in a time where many university libraries have digitized
classic books and made them available on archive.org and elsewhere. A bounty
of pre-1920s history of human civilizations.

------
72deluxe
He should change his taste in modern music instead of listening to old old old
music. Just because it is modern does not mean that it is agitated and
aggressive.

------
mori
His comparison of logos with the "free expression of a skilled painter" is
honestly ridiculous. They are optimized for completely different things!

------
ribs
Only place I've ever seen the word "hamartia" (before I Googled it) was in a
clue from a very, very hard crossword puzzle.

------
MatthewJBrown
The hipster is strong in this one.

------
steamy
Why do some people here insist that programming/coding is an art form?

No, despite how much you'd like it to be, and how much you'd like to see it
happen, it is not and will not be.

~~~
apalmer
If you want to be an artist, be an artist.

Being a programmer is not going to get you there, if that's what you need.

I don't really understand the mindset, and I think that is because I am an
engineer -- by training, schooling, and occupation I did industrial and civil
engineering for 10 years. I tend to take "software engineering" literally,
well, more literally than most...

It's kind of the antithesis of being an artist to me. There's still beauty in
it, though.

~~~
heurist
Art and engineering, to me, are different paths to the same place. Art is very
free-form, flow-based, and seemingly arbitrary, but it contains deeper meaning
and structure that often emerges subconsciously (when done well) while working
toward a goal. Engineering is a methodical, reliable way to reach a goal. The
end results may be the same -- a program that does something -- but how they
were reached varies.

