Radio waves take time to travel; there could be many civilisations out there whose signals haven't had time to reach us.
If that's the case, we're just the first intelligent civilisation to evolve in our immediate vicinity.
The aliens are probably out there wasting time on social media and saying stuff like "We wanted flying cars, instead we got 140 characters."
No reason to spend money on searching for "aliens"; just look near Cuba.
~5% ordinary matter and energy. The stuff you and I, dollhouses, dogs, and sugar cubes are made of. Of that, the overwhelming majority by mass is hot gas and stars.
~27% dark matter. This is currently under investigation, but really all we know about it is that it is incredibly sparse and falls down.
~68% dark energy. We know nearly nothing about it except that it falls up (?!)
Point is, if there are other civilizations out there, and they have existed even a few thousand years longer than ours has, a cosmic eyeblink, they are probably not just using matter and energy anymore. Heck, their understanding of physics would be so far beyond ours that even comparing us to Plato and them to Devoret is an insult.
"Can advanced beings evolve beyond the need for civilization?"
Future Star Trek episode:
The crew arrives at what appears to be a completely lifeless planet, and sees the empty buildings of a futuristic city...
Crew member: "These warlike people must have annihilated themselves... a nuclear war with advanced energy devices that penetrate buildings with high-energy radiation, leaving the buildings standing but destroying all life..."
Beings in light bodies (who suddenly appear out of nowhere!): "No, it's totally cool -- all of us are still alive! We simply evolved beyond the need for civilization!"
Dumbfounded crew member (whose sociological theory about civilizational collapse is now proven utterly and completely wrong!): "Oh..."
:-)
It is hardly inconceivable that another world might offer an abundance of energy and materials so vast that the primitive notion of competition becomes irrelevant – an absence that would, I imagine, deeply offend the instincts of Earth’s more belligerent inhabitants.
Unless humanity is prepared to extricate itself from its parochial assumptions and entertain the possibility of wildly divergent modes of existence, it is unlikely we shall ever encounter a species with whom a meaningful exchange is even possible. Worse still – though entirely within the realm of plausibility – we may one day confront a civilisation whose moral and ethical framework is so colossally alien to our own that the mere act of contact would not enlighten, but unsettle, fracture and shatter.
I would push back on the idea that alien civilizations might somehow be more enlightened so as to avoid internal conflict altogether. Unless they were artificially designed by a creator who explicitly factored out these traits, they likely also evolved from primitive beginnings. If we know anything from our single sample of living organisms, it is that competition and survival play a key role in driving evolution. Even if their planet had ample resources for everyone (which can also be said about Earth), those resources might not be accessible to everyone equally. This would inevitably lead to hoarding, tribalism, and conflict. Besides, physical resources aren't the only cause of conflict. If they're social creatures, relationships, hierarchies, and politics also play a role.
So all of these things would be embedded in their organisms even after they've evolved into a technological civilization, just as they are in ours. Therefore it is not difficult to imagine that they would also struggle to balance their use of technology with their instinct to distrust each other. I don't think this is a human-centric viewpoint, but one we observe in nature itself. However limited that sample may be, it's the only place we can draw any kind of conclusion from. Thinking otherwise is interesting, but it belongs to the realm of science fiction.
It's at least 4 billion years old on Earth; that's not "relatively recently".
We've been technologically capable for less than 200 years, that's nothing in the grand scheme of things.
We could have just missed them by a few million years and we would have no idea at all. There might be ten ancient-Egypt-tier civilisations out there right now that might develop radio tech by the time we're extinct or back to low tech.
It's far more likely that signal-emitting life is so rare (or so short-lived) that civilisations are separated by distances at which their signals weaken to undetectability, than that we are the fantastically lucky first among a hundred billion stars.
Star Trek showing all the rival civilizations exploring the galaxy at the same time makes no sense at all. It's far more likely that civilizations would arise separated by time of millions or billions of years, than that they would all be concentrated within a few centuries. (Trek does hint strongly that the explanation is panspermia, that life was seeded everywhere at the same time to account for the time-concentrated development.)
What we don't know is what makes signal-emitting civilizations cease to do so. But (if we aren't the fantastically unlikely first one) either something must, or they're so far apart that signals can't be detected between them.
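To put rough numbers on the "signals weaken to undetectability" point, here is a sketch assuming free-space inverse-square spreading, an Arecibo-radar-class beacon, a 100 m-class receiving dish, a 1 Hz narrowband channel and a simple SNR cutoff. All of those figures are assumptions for illustration, not the commenter's, and interstellar effects and integration time are ignored.

    import math

    k_B   = 1.380649e-23   # Boltzmann constant, J/K
    EIRP  = 2e13           # assumed beacon effective isotropic radiated power, W
    A_eff = 5.5e3          # assumed effective collecting area of our dish, m^2
    T_sys = 25.0           # assumed receiver system temperature, K
    B     = 1.0            # assumed narrowband channel width, Hz
    noise = k_B * T_sys * B                 # receiver noise power, W
    ly = 9.4607e15                          # metres per light-year

    for d_ly in (10, 100, 1000, 10000, 100000):
        d = d_ly * ly
        received = EIRP * A_eff / (4 * math.pi * d ** 2)   # inverse-square law
        print(f"{d_ly:>6} ly: SNR ~ {received / noise:.1e}")

With these assumptions the beacon drops below an SNR of 10 somewhere between 100 and 1,000 light-years, a tiny slice of a galaxy roughly 100,000 light-years across.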
As Lynn Margulis would say, the chimps aren't the main show. Intelligence may be an overrated and very buggy feature of Life. And the bugs get amplified as minds interact with each other and group size increases.
Philosophers have talked about the bugs for a long time (see Plato's chariot, Hobbes's passion vs. reason, Freud's id-ego-superego, Kahneman's System 1 vs. System 2, Haidt's elephant and rider). The mind needs stories to handle the bugs. And there is no dearth of stories on Earth to keep the 3-inch chimp brain occupied forever.
We have a head start on other apes, but they might catch up if we weren't in the picture. If octopi stopped dying so young, they might give us a run for our money. Orcas have fashion trends ("did you see Becky's dead salmon hat? I'm getting one of those!"). Mess with corvids at your own risk.
There's a ridiculous number of stars in the sky; unless you put the odds of intelligent life per star absurdly low, you still end up with more than one civilized species in the universe.
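A quick expected-value check of that claim (the star count and per-star odds below are illustrative assumptions, not figures from the comment):

    n_stars = 1e22      # rough order of magnitude for the observable universe
    for p in (1e-6, 1e-12, 1e-18, 1e-22):
        print(f"odds per star {p:.0e} -> expected civilisations ~ {n_stars * p:.0e}")

The conclusion holds whenever the per-star odds aren't smaller than roughly one over the number of stars.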
This is, at least for me, the primary utility of such extrapolations. And eventually - extrapolations will be tested.
That paper used a horribly faulty logical argument, because it has been well known for quite a while that evolution is quite slow most of the time, with short bursts of rapid change. For example: https://en.wikipedia.org/wiki/Cambrian_explosion
The universe is approximately 13.8 billion years old. Primates evolved only six million years ago. So let's be charitable and say it took only 13 billion years for a lifeform capable of transmitting radio waves to evolve.
Unless I'm missing something, if we're listening for ETs, we would only have visibility into a sphere with a radius around 1/45th the radius of what is out there?
My point is, if the Universe is such that it takes a minimum of, say, 10 billion years for planetary conditions to make evolution possible and evolution to wind up at intelligent life, then there could be hundreds or thousands of civilisations out there, at this very moment, whose earliest signals won't reach us for millions of years.
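A back-of-envelope version of that listening-sphere argument, taking the charitable 13-billion-year figure above at face value and using standard round numbers for the size of the observable universe (cosmic expansion is ignored, which is why the ratio depends on which radius you pick):

    age_universe_gyr  = 13.8   # age of the universe, Gyr
    time_to_radio_gyr = 13.0   # assumed minimum time for radio-capable life to evolve
    window_gyr = age_universe_gyr - time_to_radio_gyr   # signals at most 0.8 Gyr old

    listening_radius_gly = window_gyr   # light covers 1 Gly per Gyr
    radii = {"light-travel radius": 13.8, "comoving radius": 46.5}  # Gly
    for name, R in radii.items():
        frac = listening_radius_gly / R
        print(f"{name}: radius fraction ~ 1/{1/frac:.0f}, volume fraction ~ {frac**3:.1e}")

Either way, the volume we could possibly have heard from so far is a vanishing fraction of what is out there.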
The control of fire is what has enabled humans to produce and use new classes of materials that all other living beings are unable to make, e.g. ceramics, metals, glasses, cements, thermoplastics and thermosets, various kinds of crystals, including semiconductors, etc.
These materials have been essential in the development of human technology during the last twenty thousand years.
All the other living beings can use only a limited range of materials, consisting of polymers that can be synthesized at ambient temperature and pressure (e.g. wood or horn or chitin), adhesives and the equivalent of sedimentary rocks (in various kinds of skeletons, most commonly made from composites of proteins or chitin with insoluble salts of calcium, strontium or barium, but also including those made of sedimentary glass, i.e. opal, like in sponges and diatoms). (Natural glass is either volcanic, made by rapid cooling, like most artificial glass, or sedimentary, i.e. opal deposited from a solution of silicic acid in water. Living beings can catalyze the latter reaction in order to make siliceous skeletons.)
So the myth of Prometheus was actually quite wise in comparison with many later attempts to define essential features of humans.
Any living beings that have remained confined to a water environment, like cephalopods, do not have any chance to develop a technology comparable to that of humans, because without being able to use fire they could not make metals and the other materials required for that. Being unable to create an advanced technology is not an obstacle to reaching a high intelligence, as high as that of humans. The humans of fifty thousand years ago were as intelligent as those of today (perhaps on average even more, as all the dumb ones died quickly), even if they did not have yet any technologies that could not have been developed by something like an octopus.
I'm pretty sure the issue you raise is averaged away quite neatly by the math.
If there were no eukaryotes, there'd likely be no way of getting past the energy per gene ceiling that constrains the other two domains of life.
A hypothetical civilisation may well have emerged with a head start of five billion years – a span of time sufficiently vast to allow for profound advancement, assuming favourable conditions and uninterrupted development.
However, I find the oft-repeated assertion – that alien civilisations have likely annihilated themselves due to traits as tediously predictable as short-sightedness, internal strife or some anthropocentric parody of self-sabotage – to be both intellectually lazy and philosophically barren. Such a narrative reveals far more about the limitations of the human imagination than it does about the potential trajectories of alien life.
A more measured and plausible explanation lies not in self-inflicted extinction, but in the nature of the galaxy itself. Five or more billion years ago, the Milky Way could have been a significantly less hospitable environment – more unstable, more violent, and subject to higher stellar activity. One must also consider the precariousness of location: the galactic core, unlike the relative quietude of our outer spiral arm, is a congested and perilous stellar thoroughfare, far less conducive to the emergence and persistence of complex life.
To default to the notion of civilisational self-destruction is to betray a lack of imagination – or worse, a projection of our own inadequacies onto the cosmos.
> The control of fire is what has enabled humans to […]
… on Earth with an iron-nickel core and an oxygen-rich atmosphere, with easy access to vast amounts of low-energy-density organic material (dead trees) and a higher-density energy source (coal). Either of the two can literally be picked up from the surface, which was even more important for ancient civilisations.
But mastery of combustion is just one successful path, not a universal prerequisite. What matters is access to controllable high energy densities and redox chemistry that can extract, shape and join structural and conductive materials. On many plausible worlds those needs could be met without fire. For instance:
1. Native metals and cold working – worlds rich in native copper, silver or gold from hydrothermal deposition or reducing atmospheres allow metal use with no smelting. Cold hammering, annealing on warm geothermal surfaces, and pressure-sintering can produce wire, sheet and simple tools. Meteoric iron is another route to early ironwork by cold forging.
2. Electrochemical extraction at ambient temperatures – acidic or chloride brines can leach Cu²⁺, Ag⁺, Zn²⁺ and similar ions that can be plated onto seed cathodes. Electricity could come from: a) galvanic piles built from naturally dissimilar minerals in a brine; b) tidal, wind or river generators driven by simple turbines; c) lightning harvesting into capacitors, then steady discharge into plating cells. This is essentially solvent-extraction and electrowinning without a firebox. (A rough energetics sketch follows below.)
3. Solar furnaces without flames – on oxygen-less or thin-oxygen worlds with intense sun, arrays of polished stone, mica sheets or vitrified sand mirrors can reach smelting temperatures. Early optics need not be metallic – glazed ceramics and transparent minerals suffice – so a civilisation could jump straight to photothermal metallurgy.
4. Non-combustive chemical heat – highly exothermic mixtures – thermite-class reactions and metal sulphide or halide reductions – release smelting-grade heat once initiated. On halogen-rich worlds, fluorides or chlorides could be reduced by hydrogen or metals to yield both heat and purified product. This is chemistry as furnace.
5. Under-ice or ocean worlds – combustion may be impossible, yet hydrothermal chimneys deposit native metals and sulphides. Technology could develop around: a) ceramic and glassworking using geothermal heat; b) galvanic circuits using sulphide–metal couples in seawater for plating; c) arc heating from captured lightning or magnetospheric induction to melt and weld underwater.
6. Biological ore upgrading and metal precipitation – an interesting idea to entertain as microbial consortia already leach copper and gold on Earth at ambient temperatures. Projecting further, an alien biosphere could be domesticated to: a) acidify heaps and liberate ions from ore; b) reduce and precipitate metals onto templates for near-net-shape parts; c) grow conductive biomaterials that substitute for early copper wiring. Biometallurgy can start well below 100 °C and scale industrially.
We should also not entirely dismiss the possibility of material alternatives that postpone, or even replace, metallurgy – advanced ceramics, glasses, cements, laminated woods and fibre composites can carry a civilisation far in structural engineering, containers, even turbines and high-temperature reactors. Conductive paths can begin with graphite, sulphides and native copper; semiconducting minerals such as galena enable primitive electronics. Metallurgy may arrive later or remain niche.
None of the above sits outside the laws of physics and chemistry, and, most assuredly, the suggested alternatives are not crackpot theories or hard science fiction, provided the local environment meets one or more of these criteria. Even more so if an alien civilisation had an earlier head start: modern humans have only been around for roughly 200 thousand years (not counting earlier hominid forms), and a civilisation that started advancing «just» one billion years earlier would have had the luxury of progressing steadily, even at a slower pace.
To sum it up, none of the above requires fire for a hypothetical extraterrestrial civilisation to advance. Fire was the proverbial low-hanging fruit and the path of least resistance, so early humans enthusiastically adopted it, given that the terrestrial environment favoured and «encouraged» the use of fire, so to speak.
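A rough energetics sketch for point 2 above (electrochemical extraction at ambient temperature). The standard potentials are textbook values; the zinc/copper couple is an illustrative choice, and overpotentials and resistive losses are ignored:

    F = 96485.0                 # Faraday constant, C/mol
    n = 2                       # electrons per Cu2+ ion reduced
    E_cu, E_zn = 0.34, -0.76    # standard reduction potentials, V

    E_cell = E_cu - E_zn        # 1.10 V for Zn + Cu2+ -> Zn2+ + Cu
    dG = -n * F * E_cell        # Gibbs energy per mole of Cu deposited, J/mol
    print(f"E_cell = {E_cell:.2f} V, dG = {dG/1000:.0f} kJ/mol (negative, so spontaneous at ~25 C)")

    M_cu = 63.55                # molar mass of copper, g/mol
    kwh_per_kg = (1000.0 / M_cu) * n * F * E_cell / 3.6e6
    print(f"ideal plating energy ~ {kwh_per_kg:.1f} kWh per kg of copper, no furnace required")

In this idealised picture, bulk copper only needs a steady source of roughly a volt and on the order of a kilowatt-hour per kilogram, which any of the generators listed in point 2 could in principle supply.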
We should look for those.
Primates evolved six million years ago here. There are plenty of stars [...] older than our sun [...] where they could have
Fair enough. If we're certain a star billions of years older than ours could have the same conditions that would support evolution, then my comment is unlikely. I, embarrassingly, have a crackpot theory where G isn't a constant. I'll spare HN; it would make me sound like a moron.
You could call this technique "complexity dating". First you show there is exponential growth (or decay) occurring naturally. The actual changes occur randomly, but the mean rate is fixed. Then you plot on a log scale and voila, you have complexity-dated life itself. The only argument you can make against it is that the laws of physics are somehow not constant, but I think everything froze out by the time molecules were forming.
So in your cartoon, the bride marries a random, normally distributed number of husbands each time. We would determine the average rate of husband accretion. Then, given the number of husbands at any time, we could determine when the rapacious bride began marrying.
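A minimal sketch of that technique, with made-up placeholder data just to show the mechanics (fit a line to log-complexity versus time and read off where it crosses one base pair); the real argument stands or falls on the actual data points and on the constant-mean-rate assumption:

    import math

    # (billions of years ago, assumed functional complexity in base pairs) -- placeholders
    data = [(3.5, 5e5), (2.0, 3e6), (1.0, 5e7), (0.5, 5e8), (0.0, 3e9)]
    xs = [-t for t, _ in data]              # time axis, negative = past
    ys = [math.log10(c) for _, c in data]

    # ordinary least-squares fit of log10(complexity) against time
    n = len(data)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx

    origin_gya = intercept / slope          # where the fit crosses log10(complexity) = 0
    print(f"doubling time ~ {math.log10(2) / slope:.2f} Gyr")
    print(f"extrapolated start of the curve ~ {origin_gya:.1f} billion years ago")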
Native metals were always used only as jewellery before it became possible to melt them in order to make hard alloys, e.g. copper-arsenic or copper-antimony, and later copper-tin.
The exception has been meteoric iron (i.e. Fe-Ni-Co-Ge alloy), which is hard enough to be useful (because it is an alloy), but that could never be an abundant resource. Moreover, meteoric iron cannot be forged without heating it. Cutting and polishing it like you do with stones is very difficult, but possible. However, such a method of using it does not provide the main advantage of metals, of enabling the creation of complex shapes by plastic deformation. Had it not been possible to forge meteoric iron by heating it in fire, nobody would have bothered with attempts to make knives out of it, instead of using stone tools.
Using concentrated light instead of fire would work, so such an invention could be imagined on a planet whose atmosphere, unlike Earth's, could not sustain fire. However, this is not something that could be invented by aquatic animals, because they would first have to invent means of making chambers empty of water in which materials could be heated that way. Such a succession of inventions, including adequate pumps, is extremely implausible, because their utility would not appear until all the necessary techniques existed. It would be far more plausible for aquatic animals to first develop means to live outside the water, and only then make such inventions, in an environment where they are much easier.
While there are bacteria that can reduce some metal ions, e.g. of copper, silver or gold, to the corresponding metal, they do not do this to produce a metal, but to remove poisonous ions from their environment by sequestering them in the insoluble metal. On Earth, living beings have either a complex structure or chemical versatility, never both. It seems likely that this constraint would also exist elsewhere, so one should not expect intelligent beings able to reduce metal ions to metals through their own physiology. In any case, the pure metals produced in this way, like the native metals precipitated under abiotic conditions, are useless for making tools unless they can be heated to high temperatures, to make alloys and to control their polycrystalline structure. The same applies to metals produced by electrolysis.
Fire using air is the source of heat originally used by humans on Earth. Any other available source of strong heat could have replaced it in the history of another planet. The point is that, to ascend to the level of human technology, any living being must acquire the capability of processing materials at temperatures very different from its ambient temperature. None of the techniques for producing metals at ambient temperature is sufficient for making useful things out of them. The same goes for other classes of materials, e.g. semiconductors or glasses.
One could imagine an extremely advanced technology in which a big object is made by placing one atom after another in just the right place. That could work at ambient temperature and produce anything. But even if such a technique were possible, I find it impossible to believe that any living beings could become capable of developing it without first passing through a stage where many material transformations can be done only at high temperatures and/or high pressures.
Inaccurately, with recent revisions to the tune of hundreds of millions of years.
> The age of rocks?
Surprisingly inaccurately, despite the smooooooooth exponential curve of radioactive decay.
Evolution is not smooth.
> Do you not accept carbon dating?
I do, within its error range. Which is large. Like tens of percent in common scenarios.
Moreover, fire – as it is understood on Earth – is not merely a product of an oxygen-rich atmosphere. It is inextricably tied to the cyclical processes of organic growth and decay specific to this biosphere. To assume that such a mechanism is universal is a failure of both imagination and scientific rigour. It is patently unreasonable to presume that alien worlds capable of hosting life would replicate the precise biochemical and atmospheric conditions of this planet.
One might as well expect a symphony to be performed identically by instruments fashioned from entirely different matter – and yet remain surprised when the melody diverges.
1. The rate of change per generation is very much not constant, as described by the theory of punctuated equilibria. Sudden changes in the environment can cause sudden bursts of evolution. We don't know whether there were any (and how many) mass extinctions or sudden-change events before the fossil record starts, which is already hundreds of millions of years into the existence of single-celled life!
2. The time elapsed per generation has changed over time too, and we have virtually no direct evidence of the actual rate for the earliest epochs of life, before multi-cellular life.
These are particularly bad problems for any theory trying to extrapolate backwards, with compounding issues that can blow out any naive error estimates massively.
For example:
RNA-based vs DNA-based life. We know that DNA is more stable and resistant to mutation than RNA, which was the foundation of the earliest life forms. But we have no idea how that difference specifically affected early life evolutionary rates! We can guess... but only guess. However, almost certainly, early life had a much higher mutation rate per generation than modern life AND a consistently short generation time.
1) The rate of change not being constant is not significant as long as its variation is normally distributed about some mean. I expect that changes in environment are randomly distributed.
2) This is too small scale to have any impact on the trajectory of the numerical growth in number of base pairs.
We don't need to guess, life isn't special. The growth of complexity under favorable conditions is observable and occurs all over the place at the same rate, at scale.
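Here is a tiny Monte Carlo of the claim in point 1, that bursty rates fluctuating around a fixed mean still give a predictable long-run total (the burst sizes and probabilities are arbitrary); note it deliberately does not model a mean that drifts over time, which is the objection raised in the replies below:

    import random

    random.seed(0)
    epochs, mean_rate, trials = 4000, 0.01, []
    for _ in range(200):
        total = 0.0
        for _ in range(epochs):
            # mostly slow change, occasionally a 20x "punctuation" burst,
            # scaled so the long-run mean stays at mean_rate
            total += mean_rate * (20.0 if random.random() < 0.02 else 0.612)
        trials.append(total)

    print(f"expected {epochs * mean_rate:.1f}, simulated range {min(trials):.1f}..{max(trials):.1f}")

The totals cluster tightly around the expected value, which is the averaging this comment relies on; it says nothing about whether the mean itself stayed constant over four billion years.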
Mutation rates of RNA-based life were likely 1,000x higher than those of later, DNA-based life!
How are these “not relevant”?
It’s like estimating the velocity of a ball from a replay that mixes slow motion and time-lapse, while the game being played keeps changing!
Your assumptions are faulty.

(Image credit: European Southern Observatory)
Most of the alien civilizations that ever dotted our galaxy have probably killed themselves off already.
That's the takeaway of a new study, published Dec. 14 to the arXiv database, which used modern astronomy and statistical modeling to map the emergence and death of intelligent life in time and space across the Milky Way. Their results amount to a more precise 2020 update of a famous equation that Search for Extraterrestrial Intelligence founder Frank Drake wrote in 1961. The Drake equation, popularized by physicist Carl Sagan in his "Cosmos" miniseries, relied on a number of mystery variables — like the prevalence of planets in the universe, then an open question.
This new paper, authored by three Caltech physicists and one high school student, is much more practical. It says where and when life is most likely to occur in the Milky Way, and identifies the most important factor affecting its prevalence: intelligent creatures' tendency toward self-annihilation.
The authors looked at a range of factors presumed to influence the development of intelligent life, such as the prevalence of sunlike stars harboring Earth-like planets; the frequency of deadly, radiation-blasting supernovas; the probability of and time necessary for intelligent life to evolve if conditions are right; and the possible tendency of advanced civilizations to destroy themselves.
Modeling the evolution of the Milky Way over time with those factors in mind, they found that the probability of life emerging based on known factors peaked about 13,000 light-years from the galactic center and 8 billion years after the galaxy formed. Earth, by comparison, is about 25,000 light-years from the galactic center, and human civilization arose on the planet's surface about 13.5 billion years after the Milky Way formed (though simple life emerged soon after the planet formed).
In other words, we're likely a frontier civilization in terms of galactic geography and relative latecomers to the self-aware Milky Way inhabitant scene. But, assuming life does arise reasonably often and eventually becomes intelligent, there are probably other civilizations out there — mostly clustered around that 13,000-light-year band, mostly due to the prevalence of sunlike stars there.

A figure from the paper plots the age of the Milky Way in billions of years (y axis) against distance from the galactic center (x axis), finding a hotspot for civilization 8 billion years after the galaxy formed and 13,000 light years from the galactic center. (Image credit: Cai et al.)
Most of these other civilizations that still exist in the galaxy today are likely young, due to the probability that intelligent life is fairly likely to eradicate itself over long timescales. Even if the galaxy reached its civilizational peak more than 5 billion years ago, most of the civilizations that were around then have likely self-annihilated, the researchers found.
This last bit is the most uncertain variable in the paper; how often do civilizations kill themselves? But it's also the most important in determining how widespread civilization is, the researchers found. Even an extraordinarily low chance of a given civilization wiping itself out in any given century — say, via nuclear holocaust or runaway climate change — would mean that the overwhelming majority of peak Milky Way civilizations are already gone.
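The compounding behind that last point is easy to check; the per-century probabilities below are purely illustrative, not numbers from the paper:

    horizon_gyr = 5.0                        # echoing the ~5-billion-year-old peak above
    n_centuries = horizon_gyr * 1e9 / 100    # 50 million centuries
    for p in (1e-4, 1e-6, 1e-8):
        survival = (1 - p) ** n_centuries
        print(f"p = {p:.0e} per century -> survival over {horizon_gyr:g} Gyr ~ {survival:.2g}")

Even a one-in-a-million chance per century leaves essentially no survivors after five billion years; the odds have to be down around one in a hundred million per century before a majority make it.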
The paper has been submitted to a journal for publication and is awaiting peer review.
Originally published on Live Science.