
CERN has revealed plans for a gigantic successor to the LHC, the giant atom smasher that is already the biggest machine ever built. Particle physicists will never stop asking for ever larger big bang machines. But where are the limits for society in terms of costs and existential risks?

CERN boffins are already conducting a mega experiment at the LHC, a 27 km circular particle collider, at a cost of several billion euros, to study the conditions of matter as it existed fractions of a second after the big bang and to hunt for the smallest possible particle – though the question is how they could ever know they had found it. Now they profess to be somewhat disappointed because they have not found any particles beyond the standard model, that is, anything they would not expect. To change that, particle physicists would like to build an even larger “Future Circular Collider” (FCC) near Geneva, where CERN enjoys extraterritorial status, with a ring of 100 km – for about 24 billion euros.

Experts point out that this research could be as limitless as the universe itself. The UK’s former Chief Scientific Advisor, Prof Sir David King, told the BBC: “We have to draw a line somewhere, otherwise we end up with a collider that is so large that it goes around the equator. And if it doesn’t end there, perhaps there will be a request for one that goes to the Moon and back.”

“There is always going to be more deep physics to be conducted with larger and larger colliders. My question is to what extent will the knowledge that we already have be extended to benefit humanity?”

There have been broad discussions about whether high energy nuclear experiments could sooner or later pose an existential risk, for example by producing micro black holes (mBH) or strange matter (strangelets). Strangelets could convert ordinary matter into strange matter and eventually start a runaway chain reaction from the moment they became stable – theoretically at a mass of around 1,000 protons.

CERN has argued that even if micro black holes were produced, they would not be stable and would evaporate immediately due to “Hawking radiation”, a theoretical process that has never been observed.

Furthermore, CERN argues that similar high energy particle collisions occur naturally in the universe and in the Earth’s atmosphere, so the artificial ones could not be dangerous. However, such natural high energy collisions are rare and have only been measured rather indirectly. Fundamentally, nature does not set up LHC experiments: the density of these artificial particle collisions, for example, never occurs in Earth’s atmosphere. And even if the cosmic ray argument were legitimate, CERN produces as many high energy collisions in an artificially narrow space as occur naturally in the atmosphere over more than a hundred thousand years. Physicists look quite puzzled when they recalculate it.

Others argue that a particle collider ring would have to be bigger than the Earth to be dangerous.

A study on “Methodological Challenges for Risks with Low Probabilities and High Stakes” was provided by Lifeboat member Prof Raffaela Hillerbrand et al. Prof Eric Johnson submitted a paper discussing the juridical difficulties (lawsuits were either unsuccessful or not accepted) as well as the problem of groupthink within scientific communities. Further important contributions to the existential risk debate came from risk assessment experts Wolfgang Kromp and Mark Leggett, and from R. Plaga, Eric Penrose, Walter Wagner, Otto Roessler, James Blodgett, Tom Kerwick and many more.

Since these discussions can become very sophisticated, there is also a more general approach (see video): according to present research, there are around 10 billion Earth-like planets in our galaxy, the Milky Way, alone. Intelligent life might send radio waves, because they are extremely long lasting, yet we have not received any (the “Fermi paradox”). Theory postulates that there could be a “great filter”, something that wipes out intelligent civilizations at a rather early stage of their technical development. Let that sink in.

All technical civilizations would presumably start building particle smashers to find out how the universe works, to get as close as possible to the big bang and to hunt for the smallest particle with bigger and bigger machines. But maybe a very unexpected effect that nobody would ever think of, and that theory does not provide, is lurking at a certain threshold. Indeed, this could be a logical candidate for the “great filter” and an explanation for the Fermi paradox. If it is, a disastrous big bang machine may not be that big at all: had civilizations needed to construct a collider of epic dimensions before disaster struck, a lack of resources would have stopped most of them first.

Finally, the CERN member states will have to decide on the budget and the future course.

The political question behind it is: how far are the ordinary citizens who pay for all this willing to go?

LHC-Critique / LHC-Kritik

Network to discuss the risks at experimental subnuclear particle accelerators

www.lhc-concern.info

LHC-Critique[at]gmx.com

https://www.facebook.com/LHC-Critique-LHC-Kritik-128633813877959/

Particle collider safety newsgroup at Facebook:

https://www.facebook.com/groups/particle.collider/

https://www.facebook.com/groups/LHC.Critique/

Interview with Scott Aaronson — covering whether quantum computers could have subjective experience, whether information is physical, and what might be important for consciousness. He touches on classic philosophical conundrums and on the observation that while people want to be thoroughgoing materialists, brain states, unlike the states of traditional computers, are not obviously copyable. Aaronson wrote about this in his paper ‘The Ghost in the Quantum Turing Machine’ (found here https://arxiv.org/abs/1306.0159). Scott also critiques Tononi’s integrated information theory (IIT).



Questions include:
- In “Could a Quantum Computer Have Subjective Experience?” you speculate that a process has to ‘fully participate in the arrow of time’ to be conscious, and this points to decoherence. If pressed, how might you try to formalize this?

- In “Is ‘information is physical’ contentful?” you note that if a system crosses the Schwarzschild bound it collapses into a black hole. Do you think this could be used to put an upper bound on the ‘amount’ of consciousness in any given physical system?

- One of your core objections to IIT is that it produces blatantly counter-intuitive results. But to what degree should we expect intuition to be a guide for phenomenological experience in evolutionarily novel contexts? I.e., Eric Schwitzgebel notes “Common sense is incoherent in matters of metaphysics. There’s no way to develop an ambitious, broad-ranging, self-consistent metaphysical system without doing serious violence to common sense somewhere. It’s just impossible. Since common sense is an inconsistent system, you can’t respect it all. Every metaphysician will have to violate it somewhere.”

Many thanks to Mike Johnson for providing these questions!

Bio: Scott Aaronson is a theoretical computer scientist and David J. Bruton Jr. Centennial Professor of Computer Science at the University of Texas at Austin. His primary areas of research are quantum computing and computational complexity theory.

He blogs at Shtetl-Optimized: https://www.scottaaronson.com/blog/


Scientists working with the Event Horizon Telescope project may have captured an image of a black hole in the Milky Way, which could be ‘the most iconic ever’.

Scientists have suggested that they may have finally captured the first image of a black hole within the Milky Way. A team of international astronomers has been hard at work analyzing two specific regions of space, Sagittarius A* and M87, through the Event Horizon Telescope (EHT) project, and has reported discovering what amounts to “spectacular” data during the research: the faint image of the silhouette of a black hole.

According to the Daily Mail, this image may very well prove to be “one of the most iconic ever.” The scientists involved in the EHT collaboration are currently analyzing tremendous amounts of data from 2017, which is set to be made public later this year.


Astronomers have been watching a very hungry black hole devour the gases of a nearby star for almost a year.

A specialized instrument aboard the International Space Station in March detected an enormous explosion of X-ray light nearly 10,000 light years from Earth.

The source: a black hole called MAXI J1820+070, caught in an outburst, spewing surges of X-ray energy as it inhales celestial dust and gas.


The ¥16.4-billion (US$148-million) observatory — Japan’s Kamioka Gravitational Wave Detector (KAGRA) — will work on the same principle as the two detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) in the United States and Virgo, a solo machine in Italy. In the past few years, these machines have begun to detect gravitational waves — long-sought ripples in the fabric of space-time created by cataclysmic cosmic events such as the merging of two black holes or the collision of two neutron stars.


LIGO’s Asian cousin will this year deploy ambitious technology to improve sensitivity in the search for these faint, cosmic ripples — but its biggest enemy could be snowmelt.
