A fun exercise is to eliminate the adjective "emergent" from any sentence in which it appears, and see if the sentence says anything different... Does not each statement convey exactly the same amount of knowledge about the phenomenon's behavior? Does not each hypothesis fit exactly the same set of outcomes?
Before: Human intelligence is an emergent product of neurons firing.
After: Human intelligence is a product of neurons firing.
Before: The behavior of the ant colony is the emergent outcome of the interactions of many individual ants.
After: The behavior of the ant colony is the outcome of the interactions of many individual ants.
When you talk about an emergent property, you are saying that there is an issue of scale to pay attention to. Below some threshold the phenomenon does not show up; above another threshold it clearly does. And this is sometimes a useful piece of information.
Take, for example, the statistical algorithms used by Google Translate. Effective translation is an emergent behavior of the corpus the algorithm is trained on. If you provide it with a small corpus, you get garbage translations, if anything. If you provide it with a large corpus, you get good translations. And the knowledge that size matters is sometimes useful information. For one thing, it tells you that results from a small corpus are not indicative of how effective the algorithms are.
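A toy sketch makes this concrete (the data, the function names, and the method here are all invented for illustration; real statistical MT is far more sophisticated than co-occurrence counting): guess each word's translation from how often it shares a sentence pair with each target word. With a one-sentence corpus every candidate is tied and the guess is noise; a few more sentences are enough for the right answer to pull ahead.

```python
# Toy illustration (not Google's actual method): estimate word
# translations from co-occurrence counts in a parallel corpus.
from collections import Counter
from itertools import product

def cooccurrence(pairs):
    """Count how often each source word appears in the same
    sentence pair as each target word."""
    counts = Counter()
    for src, tgt in pairs:
        for s, t in product(src.split(), tgt.split()):
            counts[(s, t)] += 1
    return counts

def translate_word(word, counts):
    """Best guess: the target word that co-occurs with `word`
    most often (None if the word was never seen)."""
    candidates = {t: n for (s, t), n in counts.items() if s == word}
    return max(candidates, key=candidates.get) if candidates else None

# One sentence pair: every target word ties with "cat", so any
# guess is essentially arbitrary -- garbage in, garbage out.
small = cooccurrence([("the cat", "le chat")])
print({t: n for (s, t), n in small.items() if s == "cat"})

# A few more pairs and "chat" pulls ahead: usable translation
# behavior only shows up once the corpus is big enough.
big = cooccurrence([
    ("the cat", "le chat"),
    ("the dog", "le chien"),
    ("a cat sleeps", "un chat dort"),
])
print(translate_word("cat", big))  # "chat"
```

The point is not the algorithm but the scale-dependence: nothing about the code changes between the two runs, only the size of the corpus.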
1. The "self-organization" of novel entities. The paradigmatic example is the glider in Conway's Game of Life. Gliders have properties their substrate entities (the individual Life cells, perhaps) do not: direction, motion, information transmission, speed, and a whole logic of destruction/creation (from which, e.g., Turing machines can be constructed). Are gliders as ontologically "real" as the substrate cells and update rules that comprise them? I don't know, but clearly they are novel in some sense, and the term "emergent" seems to apply naturally here.
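A minimal sketch shows the point directly (the set-of-live-cells representation is just one convenient way to code Life): nothing in the update rule mentions direction or motion, yet after four generations the glider pattern reappears one cell down and to the right.

```python
# Minimal Game of Life on an unbounded grid, with cells stored as
# a set of (row, col) coordinates of live cells.
from collections import Counter
from itertools import product

def step(live):
    """One generation: count the live neighbors of every cell
    adjacent to a live cell, then apply the birth/survival rules."""
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr, dc in product((-1, 0, 1), repeat=2)
        if (dr, dc) != (0, 0)
    )
    # A cell is live next generation if it has 3 neighbors, or
    # 2 neighbors and was already live.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider: five live cells.
#   .O.
#   ..O
#   OOO
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}

cells = glider
for _ in range(4):
    cells = step(cells)

# After 4 generations the same shape sits one cell diagonally away:
# "motion" that no individual cell or rule possesses.
shifted = {(r + 1, c + 1) for r, c in glider}
print(cells == shifted)  # True
```

The glider's speed (one diagonal cell per four generations, i.e. c/4) is a well-defined property of the pattern, not of any cell.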
2. Conflation with the notion of "downward causality". Downward causality is a concept that arises mostly in the philosophy of mind. The basic idea is that conscious entities seem to control their bodies (including, to some extent, the neurons in their brains - for example, if I want to imagine a pink elephant, presumably "I" am causing some neurons in my head to fire differently than they were before I envisioned the pink elephant). And yet, conscious entities are, we believe, comprised of their bodies (including, to some extent, the neurons in their brains - presumably there is some necessary subset of neural activity in my brain comprising my conscious volition and the like). Downward causation is the idea that from certain patterns of activity arise sub-patterns that control, or have causal influence on, the underlying activity. And in this context the term 'emergent' seems appropriate.
Other examples: dark matter, dark energy, etc.