
CERN has revealed plans for a gigantic successor to the LHC, the giant atom smasher that is already the biggest machine ever built. Particle physicists will never stop asking for ever larger big bang machines. But where do the limits lie for ordinary society in terms of costs and existential risks?

CERN boffins are already conducting a mega-experiment at the LHC, a 27 km circular particle collider, at a cost of several billion euros, to study the conditions of matter as it existed fractions of a second after the big bang and to hunt for the smallest possible particle – though the question is how they could ever know they had found it. Now they profess to be a little disappointed because they have not found any particles beyond the Standard Model, that is, nothing they did not already expect. To push the search further, particle physicists would like to build an even larger “Future Circular Collider” (FCC) near Geneva, where CERN enjoys extraterritorial status, with a ring of 100 km – for about 24 billion euros.

Experts point out that this research could be as limitless as the universe itself. The UK’s former Chief Scientific Advisor, Prof Sir David King, told the BBC: “We have to draw a line somewhere otherwise we end up with a collider that is so large that it goes around the equator. And if it doesn’t end there perhaps there will be a request for one that goes to the Moon and back.”

“There is always going to be more deep physics to be conducted with larger and larger colliders. My question is to what extent will the knowledge that we already have be extended to benefit humanity?”

There have been broad discussions about whether high-energy nuclear experiments could sooner or later pose an existential risk, for example by producing micro black holes (mBH) or strange matter (strangelets). Strangelets could convert ordinary matter into strange matter and, once stable – theoretically at a mass of around 1,000 protons – could in principle trigger a runaway chain reaction.

CERN has argued that even if micro black holes were produced, they would not be stable and would evaporate almost immediately through “Hawking radiation”, a theoretical process that has never been observed.

Furthermore, CERN argues that similar high-energy particle collisions occur naturally in the universe and in the Earth’s atmosphere, so the experiments could not be dangerous. However, such natural high-energy collisions are rare and have only been measured rather indirectly. Fundamentally, nature does not set up LHC experiments: the density of collisions achieved artificially in the machine never occurs in Earth’s atmosphere. And even if the cosmic-ray argument were legitimate, CERN produces as many high-energy collisions in a narrow artificial space as occur naturally in the atmosphere over more than a hundred thousand years. Physicists look quite puzzled when they recalculate it.

Others argue that a particle collider ring would have to be bigger than the Earth to be dangerous.

A study on “Methodological Challenges for Risks with Low Probabilities and High Stakes” was provided by Lifeboat member Prof Raffaela Hillerbrand et al. Prof Eric Johnson submitted a paper discussing the legal difficulties (lawsuits were either unsuccessful or not accepted) as well as the problem of groupthink within scientific communities. Further important contributions to the existential risk debate came from risk assessment experts Wolfgang Kromp and Mark Leggett, from R. Plaga, Eric Penrose, Walter Wagner, Otto Roessler, James Blodgett, Tom Kerwick and many more.

Since these discussions can become very sophisticated, there is also a more general approach (see video): according to present research, there are around 10 billion Earth-like planets in our galaxy, the Milky Way, alone. Intelligent civilizations might be expected to send out radio waves, which are extremely long-lasting, yet we have not received any (the “Fermi paradox”). Theory postulates that there could be a “great filter”, something that wipes out intelligent civilizations at a rather early stage of their technological development. Let that sink in.

All technological civilizations would presumably start building particle smashers to find out how the universe works, to get as close as possible to the big bang, and to hunt for the smallest particle with bigger and bigger machines. But maybe there is a very unexpected effect lurking at a certain threshold, one that nobody would ever think of and that theory does not predict. That would make it a logical candidate for the “great filter” and an explanation for the Fermi paradox. If so, a disastrous big bang machine need not be all that big: if civilizations had to construct a collider of epic dimensions before anything went wrong, a lack of resources would have stopped most of them first.

Finally, the CERN member states will have to decide on the budget and the future course.

The political question behind all this is: how far are the ordinary citizens who pay for it willing to go?

LHC-Critique / LHC-Kritik

Network to discuss the risks at experimental subnuclear particle accelerators

www.lhc-concern.info

LHC-Critique[at]gmx.com

https://www.facebook.com/LHC-Critique-LHC-Kritik-128633813877959/

Particle collider safety newsgroup at Facebook:

https://www.facebook.com/groups/particle.collider/

https://www.facebook.com/groups/LHC.Critique/

Back in the first moment of the universe, everything was hot and dense and in perfect balance. There weren’t any particles as we’d understand them, much less any stars or even the vacuum that permeates space today. The whole of space was filled with homogeneous, formless, compressed stuff.

Then, something slipped. All that monotonous stability became unstable. Matter won out over its weird cousin, antimatter, and came to dominate the whole of space. Clouds of that matter formed and collapsed into stars, which became organized into galaxies. Everything that we know about started to exist.

So, what happened to tip the universe out of its formless state?

Read more

Today the collaboration for the LHCb experiment at CERN’s Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b’- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”. In the Xi_b’- state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b*- state they are aligned.
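As a rough sanity check of the “more than six times as massive as the proton” statement, here is a minimal sketch comparing the new baryons’ masses to the proton mass. The Xi_b mass values below are approximate figures from the published LHCb measurement and are used purely for illustration.

```python
# Back-of-the-envelope check of the "more than six times the proton mass" claim.
# Masses in MeV/c^2; the Xi_b values are approximate figures from the LHCb
# measurement and are used here purely for illustration.
PROTON_MASS = 938.3
XI_B_PRIME_MASS = 5935.0   # Xi_b'-
XI_B_STAR_MASS = 5955.3    # Xi_b*-

for name, mass in [("Xi_b'-", XI_B_PRIME_MASS), ("Xi_b*-", XI_B_STAR_MASS)]:
    print(f"{name}: {mass:.1f} MeV/c^2 is {mass / PROTON_MASS:.2f} times the proton mass")
```

Both ratios come out at roughly 6.3, consistent with the statement in the text.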

“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”

Read more

Electronegativity is one of the most well-known models for explaining why chemical reactions occur. Now, Martin Rahm from Chalmers University of Technology, Sweden, has redefined the concept with a new, more comprehensive scale. His work, undertaken with colleagues including a Nobel Prize-winner, has been published in the Journal of the American Chemical Society.

The theory of electronegativity is used to describe how strongly different atoms attract electrons. By using electronegativity scales, one can predict the approximate charge distribution in different molecules and materials, without needing to resort to complex quantum mechanical calculations or spectroscopic studies. This is vital for understanding all kinds of materials, as well as for designing new ones. Used daily by chemists and materials researchers all over the world, the concept originates from Swedish chemist Jöns Jacob Berzelius’ research in the 19th century and is widely taught at high-school level.
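To make “predicting approximate charge distribution from an electronegativity scale” concrete, here is a minimal sketch using the classic Pauling values and Pauling’s empirical relation for the ionic character of a bond. This is an illustrative use of the traditional scale, not the new scale described in the article.

```python
import math

# Classic Pauling electronegativities (not the new scale from the article) and
# Pauling's empirical relation for the ionic character of a single A-B bond:
#   fraction_ionic ~ 1 - exp(-(delta_EN)^2 / 4)
PAULING_EN = {"H": 2.20, "C": 2.55, "O": 3.44, "Na": 0.93, "Cl": 3.16}

def ionic_character(a: str, b: str) -> float:
    """Estimate the fractional ionic character of an A-B bond from the EN difference."""
    delta = abs(PAULING_EN[a] - PAULING_EN[b])
    return 1.0 - math.exp(-(delta ** 2) / 4.0)

for a, b in [("C", "H"), ("H", "Cl"), ("C", "O"), ("Na", "Cl")]:
    print(f"{a}-{b}: roughly {100 * ionic_character(a, b):.0f}% ionic character")
```

The output ranks C-H as nearly covalent and Na-Cl as strongly ionic, which is the kind of quick, calculation-free estimate the passage refers to.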

Now, Martin Rahm, Assistant Professor in Physical Chemistry at Chalmers University of Technology, has developed a brand-new scale of electronegativity.

Read more

The production of entropy, which means increasing the degree of disorder in a system, is an inexorable tendency in the macroscopic world owing to the second law of thermodynamics. This makes the processes described by classical physics irreversible and, by extension, imposes a direction on the flow of time. However, the tendency does not necessarily apply in the microscopic world, which is governed by quantum mechanics. The laws of quantum physics are reversible in time, so in the microscopic world, there is no preferential direction to the flow of phenomena.
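As an illustration of this point – individually reversible microscopic moves that nevertheless drive a macroscopic system toward higher entropy – here is a minimal sketch of the classic Ehrenfest urn model. The model and its parameters are an illustrative choice, not something taken from the study discussed below.

```python
import math
import random

# Ehrenfest urn model (an illustrative toy model, not from the article):
# N labelled particles sit in two boxes; each step one randomly chosen particle
# hops to the other box. Each individual move is reversible, yet the
# coarse-grained entropy S = ln(N choose n) almost always climbs toward its maximum.
N = 1000
n_left = N            # start far from equilibrium: everything in the left box
random.seed(0)

def entropy(n: int) -> float:
    """ln of the number of microstates with n particles in the left box."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

for step in range(5001):
    if step % 1000 == 0:
        print(f"step {step:5d}: n_left = {n_left:4d}, S = {entropy(n_left):.1f}")
    # pick one of the N particles at random and move it to the other box
    if random.randrange(N) < n_left:
        n_left -= 1
    else:
        n_left += 1
```

The entropy starts at zero and rises toward its maximum near an even split, even though every elementary step could just as well run backwards.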

One of the most important aims of contemporary scientific research is knowing exactly where the transition occurs from the quantum world to the classical world and why it occurs — in other words, finding out what makes the production of entropy predominate. This aim explains the current interest in studying mesoscopic systems, which are not as small as individual atoms but nevertheless display well-defined quantum behavior.

Read more

A scientific collaboration has released a concept design for the Large Hadron Collider’s successor, an enormous new experiment that would sit inside a hundred-kilometer (62-mile) tunnel.

The design concept plans for two Future Circular Colliders, the first of which would begin operation perhaps in 2040. The ambitious experiments would hunt for new particles with collision energies 10 times higher than those created by the Large Hadron Collider (LHC). The concept design is the first big milestone achieved by the scientific collaboration.

Read more

Australia’s New South Wales scientists have adapted single-atom technology to build 3D silicon quantum chips – with precise interlayer alignment and highly accurate measurement of spin states. The 3D architecture is considered a major step in the development of a blueprint for building a large-scale quantum computer.

They aligned the different layers in their 3D device with nanometer precision – and showed they could read out qubit states with what is called ‘single-shot’ readout, i.e. within one single measurement, with very high fidelity.
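To illustrate what “single-shot readout with very high fidelity” means in practice, here is a toy sketch in which each measurement produces one noisy signal value, a fixed threshold assigns the state from that single value, and fidelity is the fraction of correct assignments. All numbers are invented for illustration and are not taken from the device described here.

```python
import random

# Toy illustration (all numbers invented) of single-shot readout: each measurement
# yields one noisy signal value, a fixed threshold assigns the qubit state from that
# single value, and the readout fidelity is the average probability of being right.
random.seed(1)
SIGNAL_0, SIGNAL_1, NOISE_SIGMA = 0.0, 1.0, 0.15   # arbitrary units
THRESHOLD = 0.5
SHOTS = 100_000

def readout(true_state: int) -> int:
    """Measure once and assign a state from the single noisy signal value."""
    signal = (SIGNAL_1 if true_state else SIGNAL_0) + random.gauss(0.0, NOISE_SIGMA)
    return 1 if signal > THRESHOLD else 0

correct_0 = sum(readout(0) == 0 for _ in range(SHOTS)) / SHOTS
correct_1 = sum(readout(1) == 1 for _ in range(SHOTS)) / SHOTS
print(f"estimated single-shot readout fidelity: {(correct_0 + correct_1) / 2:.4f}")
```

With the made-up noise level above, the threshold sits several standard deviations from both signal levels, so almost every single measurement is assigned correctly – the essence of high-fidelity single-shot readout.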

“This 3D device architecture is a significant advancement for atomic qubits in silicon,” says Professor Simmons.

Read more