

Ask HN: Impossible to successfully make big software? - mediaman

An earlier article linked from HN posits that software that has become big has failed. Many here argued that there are naturally big problems -- for example, powering Amazon.com -- that must be solved with big software. Others countered that such software still becomes a mess and causes high turnover and lower innovation.

What are the principles for building very large systems over long periods of time, such that they remain easy to maintain and improve upon?

What are examples of companies with very big problems needing big software solutions that have done this successfully?

What architecture is used? Is there an optimal class of languages to use?

If you were the CTO of a big company dependent on a massive technology infrastructure, how would you build it?
======
makecheck
You should develop a large system by building many small parts that are bound
together (probably by scripts). The "big monolithic program" approach is
insane: it invariably leads to reinventing wheels (e.g. there is no good
reason for an office suite to have its own scripting language).

Small parts can be written in whatever language makes sense for the task.
Partitioning the work this way forces clean interfaces: command-line options,
file formats, or protocols, since without them the parts could not be usefully
combined.
This has the great side effect of making each piece easier to test in
isolation, as regression test data can be created for each part without regard
for how the data reached that point. It also makes everything inherently more
flexible, for unforeseen improvements.
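A minimal sketch of the idea, assuming each part is a filter that reads stdin and writes stdout (the commands are standard Unix tools; the pipeline itself is a made-up example):

```shell
# Three small, single-purpose stages bound together by a script.
# Each stage reads stdin and writes stdout, so each can be tested in
# isolation with a fixture file, regardless of where the data came from.
printf 'b 2\na 1\nc 3\n' |
  sort |               # stage 1: order the records
  awk '{print $1}' |   # stage 2: project the key column
  tr 'a-z' 'A-Z'       # stage 3: normalize case
# prints:
# A
# B
# C
```

Each stage can be swapped, tested, or written in a different language, as long as it honors the same text-stream contract.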

When you have small parts that are easily tested, you can easily replace them.
This is the basis for innovation or other maintenance: if you really have to,
you can confidently throw out one piece and put in something better that is
compatible.
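One way to make that swap safe, sketched under the assumption that the part is a stdin/stdout filter: keep a regression fixture for the old piece and run any candidate replacement against the same fixture.

```shell
# Hypothetical regression check for one small part. Any replacement that
# honors the same stdin/stdout contract is verified against the same fixture.
expected='1
2
3'
actual=$(printf '3\n1\n2\n' | sort -n)   # swap a rewritten sorter in here
[ "$actual" = "$expected" ] && echo PASS || echo FAIL
# prints: PASS
```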

It also helps immensely to open-source each part; at least, within your
organization. It should not be surprising that people are more willing to
contribute when there is something small they can wrap their heads around.

~~~
DanielStraight
Exactly. What task at Amazon requires big software for _just that task_?
You've got lots of tasks, so write lots of software. Lots of software != big
software. I have well over 2,000 packages installed on my Ubuntu Linux
machine, yet somehow the developers of each one keep working without getting
bogged down in what all the other packages are doing.

------
DanielStraight
You're doing the exact thing Steve Yegge and Ola Bini warn against. You're
assuming that you need a "massive technology infrastructure" and then deciding
how to go about building one. Well of course it's going to be massive; you're
_planning_ to make it massive. Let's try another thought experiment. Assume
you had to run Amazon.com but were only allowed to write 50k lines of code.
What would you do? What percent of Amazon's problems could you solve with this
code?

------
stonemetal
The Unix philosophy is the only one I have ever seen work. If you can't
componentize the system enough that you can pretend the rest of it doesn't
exist (or is magic), then you can't focus on a smaller, human-sized chunk and
make that piece work properly. The second you have to break open the black
box, complexity has won.
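A sketch of that black-box discipline, using a hypothetical `report` stage: its test substitutes a stub for whatever really produces the data, so the rest of the system can stay imaginary.

```shell
# The report stage only knows its input format, not who produced it.
report() { awk '{sum += $1} END {print "total:", sum}'; }

# In production this might be: expensive_producer | report
# In the black-box test, a one-line stub stands in for the producer:
printf '1\n2\n3\n' | report
# prints: total: 6
```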

