At times, I find it weirdly surprising that I don't actually experience fear every time I look up at the sky. Then again, the lack of fear is easy to understand: a deadly encounter with an alien race happens only once, so evolution had no chance to tune our responses appropriately.
This reminds me of the dodo of Mauritius. It is said that the birds were not fearful of humans (nor, presumably, of other abstract threats that could conceivably come from across the seas).
It's actually significantly less probable than that, by many orders of magnitude, but it's hard for humans to grasp just how isolated everything in the universe is from everything else.
The fact that we don't find drones suggests one of only a few things:
* life is rare
* we're first
* they all kill themselves before they get the drones built
Or that there is something fundamentally flawed with the concept of self-replicating drones covering the entire galaxy, something that would make it infeasible in practice despite it seeming perfectly rational on paper.
The decision to use resources to pollute the wilderness with arbitrary technology is a leadership decision, relying on the personality characteristics of an entity or social chain of command.
This business about "drones" is an anachronistic paraphrasing of the original concept. People didn't speak in terms of the "drone" fad in the 20th century, like we do now. Drones were usually just target practice for the Air Force and Navy.
The original concept just specified a range of influence and a demonstration of presence. It did not impose a manner of activity, be it drone replication or direct colonization with regimented staff and divisions of duty among personnel. The Fermi paradox remained agnostic, simply implying a possible speed of travel given geological time scales.
Carl Sagan's Cosmos mentioned unmanned satellites (or unaliened? unoccupied...) as the most likely hypothetical form of first contact. Before we bump into any living thing, we will probably notice a few remote-control devices fanned out in front of the main corpus of their civilization or colonization. That TV show also hypothesized about the possibility of dying civilizations leaving behind self-perpetuating remnants of technology, the likes of which might or might not be sentient. All of it was TV speculation though, not presented as surely factual.
If you read between the lines, the premise of a "dying" civilization hints at the lack of self-control present in a runaway factory, neglected and left to churn out garbage. That idea does not assume that a collective of entities would always wish to tamper with and contaminate their surrounding domain presumptuously.
It's pretty clear that the "drones" the GGP is referring to are self-replicating autonomous machines, which could, with a relatively tiny initial mass-energy investment, visit every solar system in the galaxy in a matter of megayears. It doesn't require leadership or social approval or whatever you're talking about; anyone with technology marginally more advanced than what we have could do it with the equivalent of a few tens of billions of dollars of machinery. The more advanced you are, the cheaper it gets. It's very odd that we haven't seen anything like that yet.
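A back-of-envelope sketch of why "a matter of megayears" is plausible. Every constant here (star count, hop distance, cruise speed, replication time, copies per stop) is an illustrative assumption, not a figure from the thread:

```python
import math

# Rough numbers only; every constant below is an assumption for illustration.
STARS_IN_GALAXY = 1e11   # order-of-magnitude star count for the Milky Way
HOP_LIGHT_YEARS = 10     # assumed typical distance to the next target system
CRUISE_SPEED_C = 0.1     # assumed cruise speed as a fraction of light speed
BUILD_YEARS = 500        # assumed time to mine and build two copies per stop

# If every probe builds two copies at each stop, coverage doubles per hop:
hops = math.ceil(math.log2(STARS_IN_GALAXY))   # ~37 doublings cover ~1e11 systems
years_per_hop = HOP_LIGHT_YEARS / CRUISE_SPEED_C + BUILD_YEARS
print(hops, hops * years_per_hop)   # 37 doublings, ~22,000 years of replication
```

Under these assumptions the replication chain itself takes only tens of thousands of years; the real limit is crossing the roughly 100,000-light-year disk at 0.1c, which alone takes about a million years, which is why the timescale comes out in megayears rather than gigayears.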
It's like Kurzweil describing the exponential curve of self-replicating AI leading to the Singularity and infinite machine intelligence... it's an elegant, mathematically self-evident solution that just happens not to correlate with reality.
It has literally nothing to do with the nature of whatever aliens might build these hypothetical machines. The only requirement is that some group with modest resources and marginally better technology than us wants to do it.
> It's like Kurzweil describing the exponential curve of self-replicating AI
A little early to try and refute that, don't you think? We're still on an exponential production trend.
That's not "modest resources and marginally better technology than us", that's perilously close to being magic. Not even viruses, the most aggressive and efficient self-replicators we know of, have managed to consume all the biomass on the planet, or could reasonably be expected to do so.
This is a false premise. It doesn't have to do anything this fancy.
You can expect that any given asteroid is likely to have a certain amount of iron, carbon, silicon, nickel, etc.
In almost any solar system, you can mine the materials needed for construction very easily. Solar gives you enough energy to do (slow, deliberate) resource extraction from asteroids.
As for propellant and reaction mass, we already know nuclear explosions work excellently. Unfortunately, based on analysis of meteorites, we don't expect asteroids to have great heavy-metal concentrations: on the order of 10 ppb for uranium. This presents a challenge, but not an insurmountable one; it just means that you'll probably want to get explosive materials closer to the center of the system.
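As a sanity check on what 10 ppb means in practice (the 1 kg target is an illustrative assumption, not a payload figure from the thread):

```python
PPB = 1e-9
uranium_fraction = 10 * PPB   # ~10 ppb uranium, per the meteorite figure above
target_kg = 1.0               # illustrative: one kilogram of uranium

rock_kg = target_kg / uranium_fraction
print(rock_kg)   # 1e8 kg: roughly 100,000 tonnes of rock milled per kg of uranium
```

That throughput is why concentrating on the inner system, where heavy elements are more accessible, looks attractive.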
> Not even viruses, the most aggressive and efficient self-replicators we know of, have managed to consume all the biomass on the planet, or could reasonably be expected to do so.
This is true, but 100% irrelevant to the challenges of creating space-based manufacturing capable of reproducing all its own components.
It doesn't have to be "self-replicating drones." It could be inert bullets, arrowheads, sharp sticks, or carefully arranged electromagnetic retro-reflectors.
Drones are a fad. The paradox doesn't require an alien implementation of drone technology, self replicating, autonomous, or what have you.
The concept of autonomous machines is a fad?
As technology advances, autonomous machines may very likely grow beyond the definition/parlance of a "drone" (machinery designed and dedicated only to perform specific tasks) into more powerful forms of intelligence.
At that point, such machinery graduates beyond the definition of an artifact (a possession held in the ownership of higher-order life), and one might speculate whether it represents a form of life unto itself, or where one might draw such a boundary.
Either way, such sentient machinery (read: not drones) might not be bound by geological/astronomical timescales, outlasting its inventors possibly permanently, but still represents a component of the Fermi Paradox. If such things are possible, where are they?
Yeah, you're just making shit up. The first machine intelligence in pop culture can be found in https://en.m.wikipedia.org/wiki/R.U.R. (the origin of the phrase "robot"). Non-sentient autonomous machines are even older. https://en.m.wikipedia.org/wiki/Golem Sci-fi authors have been talking about modern-looking self-replicating machines for decades.
Maybe you have some weird definition of a drone? Drones in common parlance have come to be associated with any robotic vehicle, and are used in technical parlance to refer to a huge variety of autonomous or semi-autonomous machines.
Please don't create accounts to break the HN guidelines with.
Another option is something like the outcome of the novel Blood Music, where engineering efforts become smaller and smaller and the outside universe is ignored completely in favor of microscopic exploration and engineering efforts.
I think we get caught up in the physical nature of the universe. Anyone who could efficiently traverse the galaxy would have to completely understand the laws of the universe and somehow travel as fast as or faster than the speed of light, meaning they would have to violate the mass = infinite, length = 0, time = 0 constraints at the speed of light. There's no way you could travel efficiently under those physical constraints. It would have to be some dimensional, non-physical method of travel. And if they can do that, then physical drones would be unneeded.
Might just as well be: they manage to build the drones, but the drones don't carry on, or the power gets turned off.
The concern isn't little green men in flying saucers; it's if, for example, someone makes an exponentially replicating terraforming system. It wouldn't take very long to swamp the galaxy.
-- Bertrand Russell, The Problems of Philosophy
Seeking growth, they will need more and more resources. Direct competition inevitably ensues, leading to conflict and destruction either via warfare or economic means.
That's all pretty theoretical, but just one example of why aliens might not be benign. I don't think it's reasonable to say we have nothing to fear.
The amount of material present in Jupiter is rather large. If a species arrives that is routinely capable of converting gas giants into habitats and is currently in pressing need of housing, then yes, we may be bulldozed into extinction - there might even be some poetic justice in that. But why would the Developers choose Jupiter? Why not some other system with many more gas giants, or currently coalescing clouds?
Resource based arguments don't hold much water for me, not at this scale anyway.
In any case, we would still be ants compared to them.
You just need to pump out a lot of satellites.
There's nothing special about a self-replicating "swarm", either.
Unless maybe you are losing a war, being chased relentlessly across the stars and need to make a bomb shelter under a thousand miles of rock. You could offer the natives all sorts of technology, or all the precious metals you find in exchange for a few cubic kilometers deep under ground. Whatever deals you make with them don't matter anyway. They'll all be glassed in a few years when your pursuers figure out where you're hiding and begin bombardment.
Or imagine making peace with the aliens: what will they think of our nearly endless collections of movies, video games and stories wherein we're slaughtering, mutilating and murdering our visitors from the sky? I wonder how _that_ is going to look once they understand what they're seeing.
Highly doubtful. Any trait that's collectively exhibited was adaptive, and such a species would value it to some extent, so they wouldn't breed it out entirely. If some new sub-species found a better way and outcompeted those with this trait, then they are already better adapted by definition, and so wouldn't need this trait.
To study us. Intelligent life, and maybe even complex animal life, may be incredibly rare in the universe even if life is common. It may even be so rare that even an aggressively expansionist species might choose not to colonize our system because of the value that might be associated with a natural experiment such as ourselves.
So, assuming expansionism, the question is: can such a species survive long enough to be a threat to us? If their society is unstable, they may well destroy themselves before getting very far. Stability seems out of the question here - that would imply they are able to maintain a steady-state society within limits, but still have the need or impulse to hunt for new resources and living space.
So maybe we're not likely to meet such a species as it is likely to be self-destructive. One configuration is worrying, though: An expansionist species sufficiently advanced to reliably spread across space without end, but still bound to eventually exhaust any resources they find. (Inevitable for all species given entropy?)
This species will either die out or monopolize all resources it can find. (And then die out.) If it really requires constant growth, it can do nothing else.
Which is basically what you said: every civilization's prime directive is to survive, resources in the universe are limited, and there are what he calls "chains of suspicion".
On resources being limited in the universe: think about how fast we have used up many of the resources on Earth. The universe has a finite amount of resources.
He also talks about how fast technology progresses. In just the past 100 years we've made airplane travel common, gone to the moon, launched a probe that is on the edge of (or outside) the solar system, and built self-driving cars, computers, etc. Who's to say whether this is fast or slow? By the time new communication from an alien race arrives, they/we might have made that much technological progress.
Either way there is reason for fear.
We can only take comfort that such an event would wipe out their own civilization, and they have adequate safeties in place as a result.
Oh, and hopefully there isn't a galaxy-wide, cold-war-esque standoff over the possession and use of supernova-creating bombs.
I'm really not sure how anyone can be so certain about the motives of beings that might be radically different from any life we know of.
They could exterminate humans as an experiment, out of curiosity, for enjoyment, because some other species or their god wants them to, randomly, for no reason at all, or for many other reasons we can't even conceive of, because they may not think in any way relatable to us or what we know. There are an infinity of possibilities.
I think there's a lot of jobs that war would select for that don't involve killing. Early man interested in treating wounds would study the human body and try to repair it, as one example. These valuable and intelligent humans would be promoted just like today's .mil and be more likely to procreate.
Also, since personality traits are distributed on a bell curve, out of 7 billion people there will be people born with the perfect storm of traits to start a war, and who, with the right (or wrong, more accurately) life experiences, will start that war. Hitler was a perfect example. Statistically it has to happen.
And we are only aggressive because, socially, we are barely out of the trees. We have the ability to blow the world up many times over, yet are consumed by petty intolerance. Our social intelligence lags our technical intelligence considerably. This is the most dangerous time in any civilisation's existence.
Similarly, different races and nations on earth kind of decided that we're not actually going to kill each other and take each other's resources. We gain a lot more by cooperating.
If you'll allow me to strawman you a bit, I've always found such arguments facile at best. They seem to reason from something akin to this: "As mankind has become more advanced, he has grown from being tribalistic and violent to understanding that fear itself is more dangerous than actual external threats."
Yes, but we've never actually reached such nirvana except in our imaginations. Instead, history has shown us all sort of intelligent and evolved populations both being the perpetrators and victims of indiscriminate violence. In addition, looking at how humans develop and what we know of species in general on the planet, there's never any reason to be afraid of anything -- until there is. That's the problem with induction and not knowing enough about the greater universe. Things are always fine and dandy for the turkey until Thanksgiving comes.
It's extremely problematic to speculate about alien beings. I note, however, that we tend to think in terms of aliens having some sort of policy towards humans: they want to eat us, they want to conquer us, and so on.
Assuming we are just a few rungs up the evolutionary ladder from pond scum, it's much more likely that we're not important at all -- and that if there were some terrible interaction with aliens, it would probably be more akin to a man accidentally kicking over an anthill on his way to work than to some sort of dramatic invasion or intervention. The scariest thing about potential technologically superior alien beings isn't that they are hostile. It's that they're completely apathetic and have better things to do. Which takes care of Fermi's Paradox quite nicely.
The potential that humans may create and use relativistic missiles combined with the philosophy of safety first suggests that humans might be treated more like fire ants or brown recluse spiders:
1. if you see them you call pest control so they don't sting you when you aren't paying attention,
2. to get rid of the ones you can't find you leave some poison traps out,
3. and you make sure to vacuum regularly ("do you want ants this is how we get ants").
Not to mention it's totally debatable that we are any less violent than we used to be, which seems to be the core premise of this 'friendly alien' argument.
This is why the spectre of dead civilizations like Rome haunt us to this day.
It was quite a surprise, after wiping out the most advanced and powerful civilization other than mine (they had 19th-century riflemen; I was several tiers ahead of them, with tanks, battleships, and modern artillery), to have the other civilizations "trade up" to the same level of technology as me and have nukes flying at my cities.
The best way to stop computer civilizations from becoming a threat is to capture or raze all of their cities except for a single one, which you will keep as a pet. After all, you're not a monster.
The aliens who can visit us, however, are likely to possess technologies to terraform unsuitable planets. But maybe their technological development was not well rounded, and space exploration was pursued far more than other forms of technology. Remember that we went to the moon before we had smartphones. Just a simple analogy, but who knows?
How do you know so much about alien psychology?
Trying to reason about what motivates aliens, their desires, or how they think, is an error of anthropomorphization.
If you can travel over light years, a joule is going to be extraordinarily cheap, or time is not relevant. Either way, it's very hard for me to see how human slaves would be of any use to them.
Maybe they need a creativity core for their warship AI.
But, I don't think enslavement or a human zoo is a very likely motivation for a species advanced enough to travel light years to get here.
It's like living on a Pacific island 5000 years ago. There are definitely other people in the world but there is no way you can contact them.
These distances only seem long to us because we're mostly thinking in terms of getting to other stars within a single human lifetime, which is a pretty short lifetime even compared to some other organisms on Earth (like trees, not to mention organisms that can go into stasis indefinitely, like the mushroom spores of Terence McKenna alien-contact-hypothesis fame). Once you remove the limit of a single human lifetime, your reach expands greatly.
The many billions of years that the universe has existed should give an advanced civilization plenty of time to visit the entire galaxy, either in person or by proxy, without even having to invoke the possibility of FTL drives or wormholes.
... we are either alone in the simulation or we are not.
not alone? false dichotomy.
What if "we" are both alone and not alone. We are part of some universal life form whose elements prefer to be disconnected from the others for some periods.
So yes, under the laws of physics, evolution and the eventual writings of avc (you) are possible, yet even a tiny sample, just half a paragraph, is unique in the history of this world and untold billions of possible other ones. While a hundred billion galaxies with a hundred billion stars might seem "infinite", it's not so large at all. We can very easily be totally unique.
Actually, infinite monkeys typing truly randomly would bang out your whole comment (not just your first paragraph) within a few minutes. Never confuse a few hundred billion times a few hundred billion with actual infinity. :)
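For concreteness, here are the rough magnitudes, assuming a 27-character alphabet and a 500-character comment (both assumptions for illustration):

```python
import math

ALPHABET = 27       # assumed: lowercase letters plus space
COMMENT_LEN = 500   # assumed comment length in characters

# log10 of the probability that one uniformly random attempt matches exactly:
log10_p = -COMMENT_LEN * math.log10(ALPHABET)

# "a few hundred billion times a few hundred billion" attempts:
log10_attempts = math.log10(3e11 * 3e11)

print(round(log10_p), round(log10_attempts))   # about -716 versus about 23
```

One chance in roughly 10^716 per attempt, against only about 10^23 attempts: any finite monkey farm is hopeless, while infinitely many monkeys succeed almost surely on the very first round.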
That's why I think there are no super-intelligent species out there: either they blow themselves up over petty issues like racism, nationalism, etc., or they have evolved past the need for physical form and have basically just disappeared off the face of the universe.