
Supernovae 2 million years ago may have changed human behaviour


Reefa


Quote


 

Two stellar explosions could have made life interesting for early humans.

 

Roughly 2 million years ago, two supernovae exploded so close to Earth that they showered our pale blue dot with debris, leaving behind traces of radioactive iron-60 found buried in the sea floor across the globe and even mixed within the dust layers on the moon.

 

Those supernovae were several hundred light-years from Earth, far enough away that their radiation shouldn’t have led to a mass extinction, but close enough that the blast could have affected our ancestors. At the time, the human ancestor Homo erectus was descending from the trees.

 

Now, Brian Thomas at Washburn University in Topeka, Kansas, and his colleagues posit that the two supernovae could have hurled enough radiation at Earth to affect our ancestors’ behavioural patterns, and potentially increase cancer rates.

 

The first radiation to bombard Earth would have simply been visible light. Supernovae can be so bright that they briefly outshine all the stars in their host galaxy – an effect that wouldn’t go unnoticed on Earth.

 

In fact, such a close supernova would have been as bright as a full moon every night for up to a year after the initial explosion. The added light pollution could have had some biological impact, Thomas says, as we know from studies of the effect of artificial lights on wildlife.

 

“Certain species use light from the moon to navigate,” he says. “They also use that cue for mating, reproduction, laying eggs, things like that. Even just foraging for food. This can screw with their usual behavioural processes.”

 

Additionally, recent evidence suggests that increased light at night can affect hormone production in people. Take melatonin for example: it doesn’t just put us to sleep, it also mediates some of the repair mechanisms in our bodies.

 

“We’re not talking about wiping out species here, but there may be some impact on one or two generations,” says Thomas.

 

But visible light isn’t the only radiation that would have burst from these stellar explosions. Roughly 500 years after the supernova faded, its radioactive particles would have pelted Earth.

 

Thomas and his colleagues calculated that the average radiation felt across the globe would have been three times higher than the background levels typical today. They speculate that our ancestors could have faced an increased cancer risk as a result.

 

But they might not have had much to worry about, says Michael Weil at Colorado State University.

 

That’s because the average isn’t very telling – there’s a considerable range across the globe. As an example, Weil says he often sends his students to do field work at the site of the Fukushima nuclear disaster in Japan.

 

“They receive less of a dose at Fukushima while they’re doing their field work than they do when they’re studying in classes at Colorado State,” he says. That’s because there is a large amount of granite in Colorado, which means a fair amount of uranium in the soil and thus a larger radiation dose.

 

Colorado isn’t even the most radioactive place on Earth: there’s a radiation hotspot in Kerala, India, where the radiation can be 20 times the global average.

 

“People have struggled to show an increase in cancer rates in those areas, and they haven’t managed to do it,” Weil says. “So it’s really, really hard to spot any biological effects from tripling the average background radiation levels, which is what this [supernova] would do.”

 

Astronauts receive 30 times the average background radiation, so just three times would probably not be a problem, says Don Hassler, the principal investigator of the radiation assessment detector on the Mars rover Curiosity.

 

The numbers aren’t worrisome until they are 1000 times higher than Earth’s average, says Hassler. “That’s the canonical number that agencies use as a lifetime limit,” he says. And it corresponds to a 3 per cent increase in the risk of developing a fatal cancer. “Given that an average American may have a 20 to 25 per cent chance of developing fatal cancer in their lifetime, this number is still relatively small,” he says.
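
As a rough sanity check on those figures, the back-of-envelope arithmetic below (a quick Python sketch, not anything from the paper) puts the quoted multiples side by side. It assumes the added risk scales linearly with dose from the 3 per cent figure at the 1000-times lifetime limit – a simplifying assumption for illustration only.

# Back-of-envelope comparison of the dose multiples quoted above.
# Only the 3x / 30x / 1000x multiples, the 3% added risk at the lifetime
# limit, and the 20-25% baseline risk come from the article; the linear
# scaling of risk with dose is an illustrative assumption.

baseline_risk = (0.20, 0.25)   # average American's lifetime risk of fatal cancer
risk_at_limit = 0.03           # added risk at the agencies' lifetime limit (~1000x background)

multiples = {
    "supernova (Thomas et al.)": 3,
    "astronauts (Hassler)": 30,
    "agency lifetime limit": 1000,
}

for label, m in multiples.items():
    added = risk_at_limit * m / 1000   # naive linear scaling, for illustration
    print(f"{label}: {m}x background -> ~{added:.3%} added lifetime risk")

print(f"Baseline lifetime risk: {baseline_risk[0]:.0%} to {baseline_risk[1]:.0%}")

On that crude scaling, a tripled background adds only a tiny fraction of a per cent to a 20-25 per cent baseline, which is the comparison Weil and Hassler are drawing.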

 

That said, the radiation from the supernovae would mostly reach the ground as muons – unstable subatomic particles produced when cosmic rays strike the atmosphere – which are extremely penetrating. This places them in a different camp to the predominant source of background radiation, radon, which has to be inhaled or ingested in order to be damaging.

 

So instead of thinking about background radiation, it might make more sense to think about the supernovae in terms of diagnostic radiation. Thomas and his colleagues calculated that the increased dose is equivalent to getting one CT scan per year. Two recent studies have suggested that less than one scan a year in children leads to an increase in leukaemia and brain tumours.
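
To see where the CT-scan comparison comes from, here is another small illustrative calculation. The millisievert values are commonly cited ballpark figures assumed here for illustration only (the article quotes no actual doses): roughly 2-3 mSv per year of natural background and a few mSv for a single CT scan.

# Rough check of the "one CT scan per year" equivalence.
# The mSv figures are assumed ballpark values, not from the article or paper.

background_msv_per_year = 2.4   # assumed typical natural background dose
supernova_multiple = 3          # article: dose ~3x normal background
ct_scan_msv = (2.0, 10.0)       # assumed range for a single CT scan

extra_dose = (supernova_multiple - 1) * background_msv_per_year
print(f"Extra annual dose at 3x background: ~{extra_dose:.1f} mSv")
print(f"Assumed dose from one CT scan: {ct_scan_msv[0]:.0f}-{ct_scan_msv[1]:.0f} mSv")

With those assumed numbers the extra dose comes out at roughly 5 mSv a year, which sits inside the range of a single CT scan – consistent with the equivalence Thomas’s team describes.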

 

Any damage caused by the supernovae is hard to quantify. Luckily, the chance that there will be another nearby supernova any time soon is small, says Michael Sivertz, who works at the NASA Space Radiation Laboratory. He puts that number at roughly one nearby supernova every billion years. “You wouldn’t have to take out a life insurance policy on it,” he says.

 

Reference: arxiv.org/abs/1605.04926

 

source

 
