The Complexity Paradox (1998) (asktog.com)
43 points by declanhaigh on Oct 14, 2020 | 17 comments



The article is missing the key factor that people will accept some level of crap (in their computer, or in their car) to obtain value, and that minimizing complexity strictly for its own sake is almost no one's goal except monks.


I understood the opposite from the article: it's saying exactly what you're saying. People will accept a CONSTANT AMOUNT of crap to get to their goals, and technology doesn't change this. It only changes how far people can get with that constant amount of crap they're willing to put up with.


This is a principle, not a paradox. The principle is simply that any construct has an inherent complexity, and that the burden of complexity can be shifted to optimal domains.

As an analogy, consider the analysis of harmonic systems: a complex waveform in the time domain versus the same signal considered in the frequency domain, via the Fourier transform.

https://en.wikipedia.org/wiki/Fourier_transform

(So, here, quite a lot of the time-domain complexity is handled by very complex mathematical machinery that supports FT. The new picture in frequency domain is “simple”, but the FT machinery is hardly that. The total system complexity is asserted to be equivalent.)

The name of the game with complexity is to shift the burden of complexity to a capability domain that is within one’s reach.
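
To make the analogy concrete, here is a minimal sketch (in Python with NumPy; the signal and frequencies are my own invented example, not anything from the article): a waveform that looks messy in the time domain reduces to three clean spikes in the frequency domain, once the heavy FFT machinery has done its work.

    # A sum of three sinusoids: complicated-looking in the time domain.
    import numpy as np

    fs = 1000                     # sample rate in Hz (arbitrary choice)
    t = np.arange(0, 1, 1 / fs)   # one second of samples
    signal = (np.sin(2 * np.pi * 50 * t)
              + 0.5 * np.sin(2 * np.pi * 120 * t)
              + 0.2 * np.sin(2 * np.pi * 300 * t))

    # The FFT shifts the complexity into the transform machinery; the
    # frequency-domain picture is just three dominant peaks.
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    top3 = np.argsort(np.abs(spectrum))[-3:]
    print(sorted(freqs[top3].tolist()))   # -> [50.0, 120.0, 300.0]

The complexity hasn't vanished; it has moved into the FFT implementation, which is exactly the shifting of burden described above.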


I don't think this is a good analogy. The FT is like changing the reference frame. Similarly, the movements of the planets are more easily described in a heliocentric model than in an Earth-centric one. You can write the transform between those two models, but this does not increase the complexity of the solar system itself: you don't add features, which is exactly the issue with software.

A slightly better analogy would be the shift from Newtonian physics to relativistic physics, which describes more accurately how things move. However, you still don't add complexity to the described thing; you just discover more of its complexity and get a more complex description as a result.

What we have here is simply an instance of Parkinson's law [1], which is a specific case of the induced demand phenomenon [2].

[1] https://en.wikipedia.org/wiki/Parkinson%27s_law

[2] https://en.wikipedia.org/wiki/Induced_demand


“Simple can be harder than complex: You have to work hard to get your thinking clean to make it simple.” - Steve Jobs


Why do people put celebrity names on common sense proverbs just because the celebrity said it once?


Why not? Should they instead quote their neighbor Joe Six Pack? This is a technique called Appeal to Authority, and you can read about it on Wikipedia.


Because S. Jobs got rich via multiple market-changing products. It's as proof-is-in-the-pudding as you can get. There are not many practical alternatives. Careful scientific studies would be nice, but nobody is paying for them so far.

Re: "technique called Appeal to Authority"

More like "appeal to merit".

That being said, good factoring takes work. You need raw logical and mathematical ability to identify patterns and propose reworks, and you also need to understand the domain so you know what can be chopped, reworked, and/or consolidated without harming operations. Often when talking to subject matter experts, I say something like, "I see a pattern here, but before we rely on it in the new system, how likely is it to change in the future, and what kinds of changes are likely?"


You forgot to cite and link the wikipedia page


I’ve come across a similar line of thought while reading about capitalism. When technology improves such that it takes less time to perform work, there are three possible outcomes:

1. The labourers finish the work in less time and get more leisure time (production and labourers stay constant, time spent decreases)

2. Some of the labourers are fired since fewer are needed to perform the same work in the same amount of time now (time and production stay constant, labourers decrease)

3. The same number of labourers work the same amount of time, but produce more due to the increased efficiency (time and labourers stay constant, production increases)

In my experience, unless you work for yourself, outcome 1 will never happen. Outcome 3 is desirable from a broader perspective because that extra production must be benefitting someone. But I don’t see any upside to option 2. Some people lose their jobs and there’s no extra production. I guess you could argue it would have been a waste for them to continue working on something that could be done more efficiently without them, so in the long term it works out, but in the short term they lose.
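
A toy way to see the bookkeeping (my own framing, assuming output = labourers × hours × productivity):

    # Toy model, my own framing: output = labourers * hours * productivity.
    # Technology doubles productivity; each outcome holds two factors fixed.
    labourers, hours, productivity = 10, 40, 1.0
    baseline = labourers * hours * productivity   # 400 units

    productivity = 2.0                            # the improvement

    print(10 * 20 * productivity)  # outcome 1: same output, half the hours
    print(5 * 40 * productivity)   # outcome 2: same output, half the labourers
    print(10 * 40 * productivity)  # outcome 3: same inputs, double the output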


> I don’t see any upside to option 2.

That's because you're stopping too soon with option 2. There are actually two sub-options to option 2:

2a. Some of the laborers are fired, and they can't find any other work so they now aren't producing anything (time and production stay constant, laborers decrease).

2b. Some of the laborers are fired, and that means a pool of unused labor now exists, which entrepreneurs hire to do new jobs that couldn't be done at all before because there was no labor available (time and laborers stay constant, production increases).

Outcome 2a will virtually never happen in a healthy economy because there are always more things that people want, so there are always additional things that could be produced if labor were available. So what actually happens, at least in a healthy economy, is outcome 2b. If you're seeing outcome 2a, it means the economy is not healthy: something is preventing the natural process of labor that is no longer needed for existing production being redirected into new production. Almost always that something is the government.


Outcome 1 could happen at a lifestyle company; they're out there, but rare (and never hiring, obviously)


Complexity is easy to build.


As somebody who made two honest attempts at building a PRNG... no, not quite. Building complexity -- especially complexity that isn't trivially reducible -- is quite hard.


What does trivially reducible mean?

A PRNG is inherently complex, and the winners of https://www.ioccc.org/ are a class of complexity you deem reducible. That competition involves building complexity, and since there are winners, it's feasible to claim that winning it is "hard". Since this straddles a form of art, placing a value judgement of easy vs. hard seems invalid.

The reason I'm thinking about this is because I recognize that adding accidental complexity is far easier than removing it. In fact, I spend a significant amount of time "trivially reducing" such complexity.


Both of my attempts at PRNGs resulted in rather short cycles -- ones that were easily compressed several-fold by GZIP and similar algorithms. Thus I consider them to have been trivially reducible.

Yes, a well-designed PRNG algorithm and implementation can be quite simple, but it is quite hard to design one well. It takes either non-trivial math or a non-trivial amount of iterative improvement.
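
To illustrate what "trivially reducible" looks like in practice, here is a toy of my own (not the parent's actual code): a linear congruential generator with a tiny modulus cycles after at most 256 outputs, so a longer stream of its output is massively redundant, and GZIP-style compression flags that immediately.

    # Hypothetical toy LCG (invented for illustration): the tiny modulus
    # caps the cycle length at 256 outputs, so the stream soon repeats.
    import zlib

    def toy_lcg(seed, a=5, c=3, m=256):
        x = seed
        while True:
            x = (a * x + c) % m
            yield x

    gen = toy_lcg(seed=42)
    stream = bytes(next(gen) for _ in range(4096))

    # DEFLATE (what gzip uses) shrinks the repeating stream several-fold;
    # output from a well-designed PRNG would be essentially incompressible.
    packed = zlib.compress(stream, 9)
    print(f"{len(stream)} bytes -> {len(packed)} bytes compressed")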


You are mixing up hard with complex. A PRNG is very simple (not complex) and hard (not easy).



