
Why does it take so long for us to form our first memory? My guess is that it's because human memories are ideas, and like all our ideas, they depend on other ideas for their meaning.

However, when we were young, our ideas about the world were very different from what they are now: much simpler and containing many falsehoods. So our earliest memories don't make sense to us and can't be recalled.

What we call our 'first memory' may just be the earliest thing we can recall and make sense of now.


Isn't there a valid distinction between the joy of making genuine progress in life and mere pleasure such as that derived from drugs or alcohol?


This is a question of virtue versus pleasure, much explored by the Stoics because a contemporary, competing movement, Epicureanism, extolled pleasure.

OP is referring, however, to the "life hack" promoted by the article's author, which suggests keeping our baseline low so that we are constantly surprised to the upside.


I like to consider the difference between 'one', 'a couple', 'a few', 'several', and 'many': e.g. 1, 2, 3-6, 5-12, and 10+ respectively.
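
For what it's worth, here's a toy Python sketch of those ranges. The cut-offs are just my own guesses rather than anything standard, and the overlaps between 'a few', 'several', and 'many' are deliberate:

    # Rough ranges for vague English quantifiers; the boundaries are
    # illustrative guesses, and the overlaps are intentional.
    QUANTIFIER_RANGES = {
        "one": range(1, 2),        # exactly 1
        "a couple": range(2, 3),   # exactly 2
        "a few": range(3, 7),      # 3-6
        "several": range(5, 13),   # 5-12
        "many": range(10, 10**6),  # 10+, capped only for illustration
    }

    def words_for(n):
        """Return every quantifier whose range admits n."""
        return [word for word, r in QUANTIFIER_RANGES.items() if n in r]

    print(words_for(5))  # ['a few', 'several'] -- the overlap is the point

That 5 falls under both 'a few' and 'several' is exactly the kind of context-dependence at issue.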

>Vagueness isn’t a problem about logic; it’s a problem about knowledge.

I think it's more to do with context. The transition from 'a grain' to 'a pile' to 'a heap' where those terms are actually useful is normally clear in the context of whatever problem we're trying to solve.

The confusion arises from the fact that 'a grain' is considered to be well-defined without context and that language refers to it directly. Thus we lament that we can't adequately define 'heap'. In fact, all language is metaphorical and indirect, and definitions can't rescue us from this. So even the label 'a grain' is fuzzy if you look deeply and scientifically enough.


>The transition from 'a grain' to 'a pile' to 'a heap' where those terms are actually useful is normally clear in the context of whatever problem we're trying to solve.

That's the position of pragmatist philosophers, the existential-phenomenologists and the analytic philosophers influenced by the later Wittgenstein.

In Western philosophy, there are two competing ideas about the metaphysical status of basic concepts. According to one, as in Platonism, dualism, and materialism, concepts concerning the real world are perfect, like mathematical ideas. In the other view, concepts are the products of the great complexities of human living, and so are inherently vague, complex, and contextual. In the last century or so, I would say the latter idea has been winning out, though many philosophers still hold the former one.


>All language is metaphorical and indirect, and definitions can't rescue us from this.

I think that overstates the problem. Just don't use 'heap' in a context where greater precision is needed; there are alternatives.


Ok here goes.

A heap is more than three.

You can't have a heap if the objects aren't stacked, and stacking nonrectangular objects generally requires four if you want a stable heap: three as the base, one on top.
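
Concretely, a toy Python encoding of that proposed rule (the threshold of four and the 'stacked' requirement just restate the definition above, nothing standard):

    def is_heap(count, stacked):
        # Proposed rule: the objects must be stacked, and stacking
        # nonrectangular objects stably takes at least four:
        # three as the base, one on top.
        return stacked and count >= 4

    print(is_heap(4, True))     # True: the minimal stable heap
    print(is_heap(100, False))  # False: unstacked objects are a pile, not a heap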


You need the math required to understand the language in which the current best physical theories are written. It's unknown as yet which mathematical objects will be required for their successors.


>terribly introverted adolescent, painfully aware of his own suffering

Self-awareness seems to be a key component of creativity.


>excessive screen time early in life can change the circuits in a growing brain

Any activity changes the brain. Also, as BurningFrog points out, "rewires" is a metaphorical term and therefore vague. Ditto "circuits".

>mice

...aren't human.

> But it also meant they acted like they had an attention deficit disorder

Such disorders are pretty loosely defined. Something "like" such a disorder is vaguer still. And, again, these are mice, for goodness' sake.

>In a video game, he said, you can meet the equivalent of a lion every few seconds.

No you can't. Lions are dangerous!

>our understanding of how sensory stimulation affects developing brains.

We're not passive. We decide what to pay attention to. Thus we can't be stimulated arbitrarily by the environment. Actually, I think this is assumed by the contradictory concept of "attention deficit" used elsewhere in the piece.


Isn't this sort of thing inevitable? I mean, the more technology advances the easier and cheaper it becomes to build terrorist devices or WMDs. Therefore society, in order to protect itself, has to be vigilant about how the relevant knowledge disseminates. An analogy might be an individual guarding against suicidal thoughts.

More interesting to me is how to create mechanisms to control who gets access to the data and under what circumstances. I think in the UK we have a pretty bad record of local councils and other busybodies using snooping powers not intended for them.


Indeed. Turn the spotlight of publicity onto what people, be they good or bad, have to say. Turn a platform into a "safe space" and pretty soon any kind of criticism will be interpreted as harassment. This will deny users a powerful means of making progress. So ultimately it becomes an unsafe space.


Wonder if drones could be shielded against this? Or, better yet, could the technology be adapted to transmit energy to drones or spacecraft?


Of course. Wrap the drone in a Faraday cage. It would then need to be capable of flying itself and communicating via lasers to work properly.


Myself, I prefer optimistic SF such as classic Asimov and Clarke. I don't mind things going wrong, because mistakes are normal. But there ought to be resolution, and problems solved by hard-won knowledge and the efforts of good people. If you think that is immoral or unrealistic, consider whether you would read your children bedtime stories with no good guys and/or no happy endings. Consider how such children might grow up. Then consider the effect on society of unrelenting pessimism in SF.

