
Do you remember all the hoopla last year when the Higgs Boson was confirmed by physicists at the Large Hadron Collider? That’s the one called the ‘God particle’, because it was touted as helping to resolve the forces of nature into one elegant theory. Well—Not so fast, bucko!…

First, some credit where credit is due: The LHC is a 27-kilometer ring of superconducting magnets interspersed with accelerating structures that boost the energy of the particles as they whip around and smash into each other. For physicists—and anyone who seeks a deeper understanding of what goes into everything—it certainly inspires awe.

Existence of the Higgs Boson (aka the God Particle) was predicted. Physicists were fairly certain that it would be observed. But its discovery is a ‘worst case’ scenario for the Standard Model of particle physics: it points to shortcomings in our ability to model and predict things. Chemists have long had a master blueprint of atoms in the Periodic Table, which charts all the elements in their basic states. But physicists are a long way from building anything analogous. That’s because we know a lot more about atomic elements than about the fundamental building blocks of matter and energy.

So, what do we know about fundamental particles and the forces that bind them? HINT: There are 61 that we know of or have predicted, and at least two about which we don’t yet have any clue: the pull of gravity and dark matter/dark energy.

This video produced by the BBC Earth project is an actors’ portrayal of a news interviewer and a particle physicist. If we were simply to watch these two guys talk in front of a camera, it would be pretty boring (unless, of course, the physicist had the charm and panache of the late Richard Feynman or my own Cornell professor, Carl Sagan). So, to spice it up a bit, the BBC has added a corny animation of the two men talking, complete with anthropomorphic cartoon particles. Corny? Yes! But it helps keep a viewer captivated. And, for any armchair physicist, the story is really exciting!

See the video here. It takes a moment to load—but for me, the wait is worthwhile.


July 2015, as you know, was all systems go for CERN’s Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn’t blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each collision is recorded as a “collision event”. Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say, the hunt is on!

The Data Center processes about one Petabyte of data every day — the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores. Some 6000 changes in the database are performed every second.
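
A quick sanity check of those figures (my own back-of-the-envelope Python, not CERN’s; it assumes decimal units, 1 PB = 10^15 bytes, and a 4.7 GB single-layer DVD) reproduces both the DVD count and the implied sustained rate:

PETABYTE = 10**15             # bytes (decimal convention, assumed)
DVD_CAPACITY = 4.7 * 10**9    # bytes per single-layer DVD (assumed)

daily_bytes = 1 * PETABYTE
dvds_per_day = daily_bytes / DVD_CAPACITY      # ~212,800 DVDs
rate_gbps = daily_bytes / 86_400 / 10**9       # ~11.6 GB/s sustained

print(f"{dvds_per_day:,.0f} DVDs/day, {rate_gbps:.1f} GB/s sustained")

That lands close to the “around 210,000 DVDs” quoted above, and shows that a one-petabyte day means handling well over ten gigabytes every second, around the clock.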

With experiments at CERN generating such colossal amounts of data, the Data Center stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of over 8000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.

Image: CERN

PROCESSING THE DATA (info via CERN): Subsequently, hundreds of thousands of computers from around the world come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, which are linked to CERN. Every day WLCG processes more than 1.5 million ‘jobs’, corresponding to a single computer running for more than 600 years.
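
Out of curiosity, those two WLCG figures can be combined to estimate the average job length (my arithmetic, not CERN’s):

jobs_per_day = 1.5e6                    # CERN figure quoted above
compute_years_per_day = 600             # ditto

compute_hours = compute_years_per_day * 365.25 * 24   # ~5.3 million hours
avg_job_hours = compute_hours / jobs_per_day

print(f"Implied average job length: {avg_job_hours:.1f} hours")  # ~3.5

So the two quoted numbers are mutually consistent if a typical Grid job runs for roughly three and a half hours.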

Racks of servers at the CERN Data Centre (Image: CERN)
CERN DATA CENTER: The server farm in the 1450 m² main room of the DC (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the DC houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second); a quick sum of the individual streams appears after the list:

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s
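
Summing the listed streams is straightforward (note that ALICE’s 4 GB/s applies only during Pb-Pb running, so the four streams do not all peak at once; how the headroom up to the quoted ~25 GB/s aggregate is made up is not broken down here):

rates_gb_per_s = {
    "ALICE (Pb-Pb)": 4.0,
    "ATLAS": 1.0,        # upper end of the 800 MB/s - 1 GB/s range
    "CMS": 0.6,
    "LHCb": 0.75,
}
print(f"Listed streams combined: {sum(rates_gb_per_s.values()):.2f} GB/s")  # 6.35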

In July, the LHCb experiment reported observation of an entire new class of particles:
Exotic Pentaquark Particles (Image: CERN)

Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together (right).

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, are composed of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. This quark model also allows the existence of other quark composite states, such as pentaquarks composed of four quarks and an antiquark.
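
The charge bookkeeping behind that model is simple enough to verify in a few lines of Python (a toy illustration using the standard quark charge assignments; the uudcc̄ content shown for the pentaquark matches the charmonium-pentaquark interpretation of the LHCb states):

from fractions import Fraction

# Quark charges in units of the elementary charge e;
# an uppercase letter denotes the corresponding antiquark.
QUARK_CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3),
                "c": Fraction(2, 3), "s": Fraction(-1, 3),
                "t": Fraction(2, 3), "b": Fraction(-1, 3)}

def charge(quarks: str) -> Fraction:
    return sum(QUARK_CHARGE[q.lower()] * (-1 if q.isupper() else 1)
               for q in quarks)

print(charge("uud"))    # proton, a baryon (three quarks):       1
print(charge("uD"))     # pi+, a meson (quark plus antiquark):   1
print(charge("uudcC"))  # pentaquark (four quarks + antiquark):  1

Each case comes out with integer charge, as the quark model requires for observable particles.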

Until now, however, no conclusive evidence for pentaquarks had been seen; earlier experiments that searched for them proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei
Nuclei and antinuclei (Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today — from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show — with unparalleled precision — that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics, known as Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry is true, a system remains unchanged if three fundamental properties — charge, parity (the inversion of spatial coordinates), and time — are reversed.

The latest study takes the research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons — hydrogen nuclei with an additional neutron — and antideuterons, as well as for helium-3 nuclei — two protons plus a neutron — and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles’ tracks in the ALICE detector’s magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips, LinkedIn Pulse @



Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Why the LHC must be shut down

CERN-Critics: LHC restart is a sad day for science and humanity!


PRESS RELEASE “LHC-KRITIK”/”LHC-CRITIQUE” www.lhc-concern.info
CERN-Critics: LHC restart is a sad day for science and humanity!
These days, CERN has restarted the world’s biggest particle collider, the so-called “Big Bang Machine”: the LHC. After an upgrade of the world’s biggest machine costing hundreds of millions of euros, CERN plans to smash particles at double the energies achieved before. This poses certain, hopefully small (?), but fundamentally unpredictable catastrophic risks to planet Earth.
Basically the same group of critics, including professors and doctors, that had previously filed lawsuits against CERN in the US and Europe still opposes the restart, for basically the same reasons. The dangers of (“micro”) black holes, strangelets, vacuum bubbles, etc., are still under discussion, and maybe forever will be. In the meantime, no specific improvements to the safety assessment of the LHC have been made by CERN or anyone else. There is still no proper and truly independent risk assessment (the ‘LSAG report’ was done by CERN itself), and the science of risk research is still not really involved in the issue. This is a scientific and political scandal, and that is why the restart is a sad day for science and humanity.
The scientific network “LHC-Critique” calls for a stop to any public sponsorship of gigantomaniacal particle colliders.
Just to demonstrate how speculative this research is: even CERN has to admit that the so-called “Higgs Boson” was only “probably” discovered. Very probably, mankind will never find any use for the “Higgs Boson”. (We are not talking here about the use of collider technology in medical applications.) It could be a minor, but very improbable, advantage for mankind to comprehend the Big Bang one day. But it would surely be fatal – as the Atomic Age has already demonstrated — to know how to handle this or other extreme phenomena in the universe.
Within the next billions of years, mankind will have enough problems without CERN.
Sources:
- A new paper by our partner “Heavy Ion Alert” will be published soon: http://www.heavyionalert.org/
- Background documents provided by our partner “LHC Safety Review”: http://www.lhcsafetyreview.org/

- Press release by our partner “Risk Evaluation Forum” emphasizing renewed particle collider risk: http://www.risk-evaluation-forum.org/newsbg.pdf

- Study concluding that “Mini Black Holes” could be created at planned LHC energies: http://phys.org/news/2015-03-mini-black-holes-lhc-parallel.html

- New paper by Dr. Thomas B. Kerwick on lacking safety argument by CERN: http://vixra.org/abs/1503.0066

- More info at the LHC-Kritik/LHC-Critique website: www.LHC-concern.info
Best regards:
LHC-Kritik/LHC-Critique

On a casual read last week of Duncan R. Lorimer’s well-regarded work on Binary and Millisecond Pulsars (2005), I noted the reference to the lack of pulsars with P < 1.5 ms. It cites a mere suggestion that this is due to gravitational wave emission from R-mode instabilities, but no solid reason has been offered for such an absence from our Universe. As the surface magnetic field strength of such pulsars would be lower (B ∝ (PṖ)^(1/2)) than that of other pulsars, one could equally suggest that the lack of sub-millisecond pulsars is due to their weaker magnetic fields allowing cosmic ray (CR) impacts to result in stable micro black hole (MBH) capture… Therefore, if one could interpret the 10^8 G field strength adopted by Giddings & Mangano (G&M) as an approximate cut-off point where MBH are likely to be captured by neutron stars, then one would perhaps have some phenomenological evidence that MBH capture results in the destruction of neutron stars into black holes. One should note that more typical observed neutron stars calculate to a 10^12 G field, a factor of 10^4 above the borderline-existence cases used in the G&M analysis (and so much less likely to capture).

That is not to say that MBH would equate to a certain danger for capture in a planet such as Earth, where the density of matter is much lower — and accretion rates much more likely to be lower than radiation rates — an understanding that is backed up by the ‘safety assurance’ in observational evidence of white dwarf longevity. However, it does take us back to the question — regardless of the frequently mentioned theorem here on Lifeboat that states Hawking Radiation should be impossible — that Hawking Radiation, as an unobserved theoretical phenomenon, may not be anywhere near as effective as derived in theoretical analysis. This oft-mentioned concern of ‘what if Hawking is wrong’ is of course addressed by a detailed G&M analysis, which set about proving safety in the scenario that Hawking Radiation was ineffective at evaporating such phenomena.

Though doubts about the neutron star safety assurance immediately make one question how reliable the safety assurances of white dwarf longevity are, my belief has been that the white dwarf safety assurance seems highly rational (as derived in a few short pages in the G&M paper, and not particularly challenged except for the hypothesis that they may have over-estimated TeV-scale MBH size, which could reduce their likelihood of capture). It is quite difficult to imagine a body as dense as a white dwarf not capturing any such hypothetical stable MBH over its lifetime of CR exposure — which validates the G&M position that accretion rates therein must be vastly outweighed by radiation rates, so the even lower accretion rates on a planet such as Earth would be even less of a concern. However, given the gravity of the analysis, the various assumptions on which it is based perhaps deserve greater scrutiny, underscored by a concern raised recently that 20% of the mass/energy in current LHC collisions is unaccounted for.

Pulsars are often considered among the most accurate references in the Universe due to their regularity and predictability. How ironic if those pulsars which are absent from the Universe also provided a significant measurement.

Binary and Millisecond Pulsars, D.R. Lorimer: http://arxiv.org/pdf/astro-ph/0511258v1.pdf
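
For readers who want to see where those field strengths come from, the standard magnetic-dipole spin-down estimate quoted in the pulsar literature (including Lorimer’s reviews) is B ≈ 3.2×10^19 (PṖ)^(1/2) gauss, with the period P in seconds. A minimal sketch, using illustrative rather than measured values of Ṗ:

from math import sqrt

def surface_field_gauss(period_s: float, period_derivative: float) -> float:
    # Characteristic dipole surface field of a pulsar, in gauss.
    return 3.2e19 * sqrt(period_s * period_derivative)

# A recycled pulsar near the P < 1.5 ms cutoff discussed above (Pdot assumed):
print(f"{surface_field_gauss(1.5e-3, 1e-20):.1e} G")   # ~1.2e8 G
# A typical slow pulsar (Pdot assumed):
print(f"{surface_field_gauss(0.5, 1e-15):.1e} G")      # ~7.2e11 G

This shows why millisecond pulsars sit near the 10^8 G regime adopted by G&M while ordinary pulsars sit near 10^12 G.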

High energy experiments like the LHC at the nuclear research centre CERN are extreme energy consumers (needing the power of a nuclear plant). Their construction is extremely costly (presently 7 billion euros) and practical benefits are not in sight. The experiments possibly pose existential risks, and these risks have not been properly investigated.

This is not the first time that CERN has announced record energies and news around April 1 – apparently hoping that some critique and concerns about the risks could be misinterpreted as an April Fools’ joke. Additionally, CERN regularly starts up the LHC during Easter celebrations and just before weekends, when news offices are empty and people prefer to spend peaceful days with their friends and families.

CERN has just announced new record collision energies at the LHC. And instead of conducting a neutral risk assessment, the nuclear research centre plans costly upgrades of its Big Bang machine. Facing an LHC upgrade in 2013 costing up to CHF 1 billion and the prospect of a Mega-LHC in 2022: how long will it take until risk researchers are finally integrated into a neutral safety assessment?

There is abundant evidence of the necessity of an external and multidisciplinary safety assessment of the LHC. According to a pre-study in risk research, CERN meets less than a fifth of the criteria for a modern risk assessment (see the press release below). It is not acceptable that the clueless member states point to the operator CERN itself, while the latter regards its self-imposed safety measures as sufficient, in spite of critique from risk researchers, continuous debate and the publication of further papers pointing to concrete dangers and even existential risks (black holes, strangelets) possibly arising from the experiments sooner or later. Presently, science has to admit that the risk is disputed and basically unknown.

It will not be possible to keep up this ostrich policy much longer. Especially in the face of the planned LHC upgrades, CERN will be confronted with increasing critique from scientific and civil quarters that the most powerful particle collider has not yet been subjected to a neutral and multidisciplinary safety assessment. CERN has not yet responded to pragmatic proposals for such a process, one that would also constructively involve critics and CERN. Further legal steps from various sides are also possible.

The member states that finance the CERN budget, the UN, and private funders are called upon to provide the resources to finally initiate a neutral and multidisciplinary risk assessment.

German version of this article published in Oekonews: http://www.oekonews.at/index.php?mdoc_id=1069458

Related LHC-Critique press release and open letter to CERN:

https://lifeboat.com/blog/2012/02/lhc-critique-press-release-feb-13-2012-cern-plans-mega-particle-collider-communication-to-cern-for-a-neutral-and-multi-disciplinary-risk-assessment-before-any-lhc-upgrade

Typical physicist’s April joke on stable black holes at the LHC (April 1 2012, German): http://www.scienceblogs.de/hier-wohnen-drachen/2012/04/stabiles-minischwarzes-loch-aus-higgsteilchen-erzeugt.php

Latest publications of studies demonstrating risks arising from the LHC experiment:

Prof Otto E. Rössler: http://www.academicjournals.org/AJMCSR/PDF/pdf2012/Feb/9%20Feb/Rossler.pdf

Thomas Kerwick B.Tech. M.Eng. Ph.D.: http://www.vixra.org/abs/1203.0055

Brief summary of the basic problem by LHC-Kritik (still valid since Sep. 2008): http://lhc-concern.info/wp-content/uploads/2008/12/lhc-kritik-cern-1st-statement-summary-908.pdf

Detailed summary of the scientific LHC risk discussion by LHC-Kritik and ConCERNed International: http://lhc-concern.info/wp-content/uploads/2010/03/critical-revision-of-lhc-risks_concerned-int.pdf

We wish you a happy Easter and hope for your support of our pragmatic proposals to urgently increase safety in these new fields of nuclear physics.

LHC Critique / LHC Kritik — Network for Safety at nuclear and sub-nuclear high energy Experiments.

www.LHC-concern.info

[email protected]

Tel.: +43 650 629 627 5

New Facebook group: http://www.facebook.com/groups/LHC.Critique/

Info on the outcomes of CERN’s annual meeting in Chamonix this week (Feb. 6–10 2012):

In 2012 the LHC’s collision energies are to be increased from 3.5 to 4 TeV per beam, and the luminosity is planned to be greatly increased. This means many more particle collisions at higher energies.

CERN plans to shut down the LHC in 2013 for about 20 months for a very costly upgrade (CHF 1 billion?) in order to run the LHC at 7 TeV per beam afterwards.

Future plans: A High-Luminosity LHC (HL-LHC) is planned, “tentatively scheduled to start operating around 2022” — with a beam energy increased from 7 to 16.5 TeV(!).

One might well ask where this is leading – sooner or later – if the risks are not properly investigated.

For comparison: the AMS experiment for directly measuring cosmic rays operates on a scale of around 1.5 TeV. Very high-energy cosmic rays have only been measured indirectly (via their momentum). The type, velocity, mass and origin of these particles are unknown. In any case, the number of collisions under the extreme and unprecedented artificial conditions at the LHC is orders of magnitude higher than anywhere else in the nearer cosmos.

There were many talks on machine safety at the Chamonix meeting. The safety of humans and the environment was obviously not an official topic. So far there has been no reaction to the recent call for a truly neutral, external and multidisciplinary risk assessment.

Official reports from the LHC performance workshop by CERN Bulletin:

http://cdsweb.cern.ch/journal/CERNBulletin/2012/06/News%20Articles/?ln=de

LHC Performance Workshop — Chamonix 2012:

https://indico.cern.ch/conferenceOtherViews.py?view=standard&confId=164089

Feb 10 2012: COMMUNICATION directed to CERN for a neutral and multidisciplinary risk assessment to be done before any LHC upgrade:

http://lhc-concern.info/?page_id=139

More info at LHC-Kritik / LHC-Critique: Network for Safety at experimental sub-nuclear Reactors:

www.LHC-concern.info

The famous Chilean philosopher Humberto Maturana describes “certainty” in science as subjective emotional opinion, and astonished the assembled physics luminaries. The French astronomer and “Leonardo” publisher Roger Malina hopes that the LHC safety issue will be discussed in a broader social context, and not only within the closer scientific framework of CERN.

(Article published in “oekonews”: http://oekonews.at/index.php?mdoc_id=1067777 )

The latest edition of the renowned “Ars Electronica Festival” in Linz (Austria) was dedicated in part to an uncritical worship of the gigantic particle accelerator LHC (Large Hadron Collider) at the European Nuclear Research Center CERN, located on the Franco-Swiss border. CERN in turn promoted an art prize with the idea of “cooperating closely” with the arts. This time the objections were of a philosophical nature – and they carried weight.

In a thought-provoking presentation Maturana addressed the limits of our knowledge and the intersubjective foundations of what we call “objective” and “reality.” His talk was peppered with excellent remarks and witty asides that contributed much to the accessibility of these fundamental philosophical problems: “Be realistic, be objective!”, Maturana pointed out, simply means that we want others to adopt our point of view. The great constructivist and founder of the concept of autopoiesis clearly distinguished his approach from a solipsistic position.

Given Ars Electronica’s spotlight on CERN and its experimental sub-nuclear research reactor, Maturana’s explanations were especially important — though to the assembled CERN celebrities they may have come as a mixture of unpleasant surprise and something they could not relate to.

During the question-and-answer period, Markus Goritschnig asked Maturana whether it wasn’t problematic that CERN is basically controlling itself and dismissing a number of existential risks discussed in relation to the LHC — including hypothetical but mathematically demonstrable risks also raised — and later downplayed — by physicists like Nobel Prize winner Frank Wilczek — and whether he thought it necessary to integrate other sciences besides physics, such as risk research, into the LHC safety assessment process. In response Maturana replied (in the video from about 1:17): “We human beings can always reflect on what we are doing and choose. And choose to do it or not to do it. And so the question is, how are we scientists reflecting upon what we do? Are we taking seriously our responsibility of what we do? […] We are always in the danger of thinking that, ‘Oh, I have the truth’, I mean — in a culture of truth, in a culture of certainty — because truth and certainty are not as we think — I mean certainty is an emotion. ‘I am certain that something is the case’ means: ‘I do not know’. […] We cannot pretend to impose anything on others; we have to create domains of interrogativity.”

Disregarding these reflections, Sergio Bertolucci (CERN) found the peer review system within the physicists’ community a sufficient scholarly control. He dismissed all the disputed risks with the “cosmic ray argument,” arguing that much more energetic collisions take place naturally in the atmosphere without any adverse effect. This safety argument for the LHC can, however, also be criticized from different perspectives, for example: very high-energy cosmic ray collisions can be measured only indirectly — and the collision frequency under the unprecedented artificial and extreme conditions at the LHC is orders of magnitude higher than in the Earth’s atmosphere or anywhere else in the nearer cosmos.

The second presentation of the “Origin” Symposium III was held by Roger Malina, an astrophysicist and the editor of “Leonardo” (MIT Press), a leading academic journal for the arts, sciences and technology.

Malina opened with a disturbing fact: “95% of the universe is of an unknown nature, dark matter and dark energy. We sort of know how it behaves. But we don’t have a clue of what it is. It does not emit light, it does not reflect light. As an astronomer this is a little bit humbling. We have been looking at the sky for millions of years trying to explain what is going on. And after all of that and all those instruments, we understand only 3% of it. A really humbling thought. […] We are the decoration in the universe. […] And so the conclusion that I’d like to draw is that: We are really badly designed to understand the universe.”

The main problem in research is: “curiosity is not neutral.” When astrophysics reaches its limits, cooperation between arts and science may indeed be fruitful for various reasons and could perhaps lead to better science in the end. In a later communication Roger Malina confirmed that the same can be demonstrated for the relation between natural sciences and humanities or social sciences.

However, the astronomer emphasized that an “art-science collaboration can lead to better science in some cases. It also leads to different science, because by embedding science in the larger society, I think the answer was wrong this morning about scientists peer-reviewing themselves. I think society needs to peer-review itself and to do that you need to embed science differently in society at large, and that means cultural embedding and appropriation. Helga Nowotny at the European Research Council calls this ‘socially robust science’. The fact that CERN did not lead to a black hole that ended the world was not due to peer-review by scientists. It was not due to that process.”

One of Malina’s main arguments focused on differences in “the ethics of curiosity”. The best ethics in (natural) science include notions like intellectual honesty, integrity, organized scepticism, disinterestedness, impersonality and universality. “Those are the belief systems of most scientists. And there is a fundamental flaw to that. And Humberto this morning really expanded on some of that. The problem is: curiosity is embodied. You cannot make it into a neutral ideal of scientific curiosity. And here I got a quote of Humberto’s colleague Varela: ‘All knowledge is conditioned by the structure of the knower.’”

In conclusion, a better co-operation of various sciences and skills is urgently necessary, because: “Artists ask questions that scientists would not normally ask. Finally, why we want more art-science interaction is because we don’t have a choice. There are certain problems in our society today that are so tough we need to change our culture to resolve them. Climate change: we’ve got to couple the science and technology to the way we live. That’s a cultural problem, and we need artists working on that with the scientists every day of the next decade, the next century, if we survive it.”

Then Roger Malina turned directly to the LHC safety discussion and openly contradicted the safety assurance given earlier: he would generally hope for a much more open process concerning the LHC safety debate, rather than discussing it only within the narrow field of particle physics. Concretely: “There are certain problems where we cannot cloister the scientific activity in the scientific world, and I think we really need to break the model. I wish CERN, when they had been discussing the risks, had done that in an open societal context, and not just within the CERN context.”

Presently CERN is holding its annual meeting in Chamonix to fix the LHC’s 2012 schedule, aiming to increase luminosity by a factor of four and perhaps finally find the Higgs Boson – against a 100-dollar bet by Stephen Hawking, who is convinced that micro black holes, immediately decaying via hypothetical “Hawking Radiation”, will be observed instead — with the God Particle’s blessing. In that case, Hawking pointed out, he himself would gain the Nobel Prize. Quite ironically, official T-shirts were sold at Ars Electronica bearing the “typical signature” of a micro black hole decaying at the LHC – by a totally hypothetical process involving a bunch of unproven assumptions.

In 2013 CERN plans to adapt the LHC, at a cost of up to CHF 1 billion, to correct construction failures and run the “Big Bang Machine” at double the present energies. A neutral and multi-disciplinary risk assessment is still lacking, while a couple of scientists insist that their theories pointing to even global risks have not been invalidated. CERN’s latest safety assurance, comparing natural cosmic rays hitting the Earth with the LHC experiment, is only valid from rather narrow viewpoints. The relatively young analyses of high-energy cosmic rays are based on indirect measurements and calculations; the type, velocity, mass and origin of these particles are unknown. But, taking the comparison at face value and calculating with the “reassuring” figures given by CERN PR, within ten years of operation the LHC, under extreme and unprecedented artificial circumstances, would produce as many high-energy particle collisions as occur in about 100,000 years in the entire atmosphere of the Earth. Just to illustrate the energetic potential of the gigantic facility: one LHC beam, thinner than a hair and consisting of billions of protons, carries the kinetic energy of an aircraft carrier moving at 12 knots.
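
That last comparison is easy to check to first order. A back-of-the-envelope sketch using the LHC’s nominal design parameters (2808 bunches of 1.15×10^11 protons at 7 TeV) and an assumed ship mass, since “aircraft carrier” is not a precise unit:

EV_TO_J = 1.602e-19                  # joules per electronvolt
bunches = 2808                       # nominal LHC design value
protons_per_bunch = 1.15e11          # nominal LHC design value
proton_energy_J = 7e12 * EV_TO_J     # 7 TeV per proton, in joules

beam_energy_J = bunches * protons_per_bunch * proton_energy_J
print(f"Stored beam energy: {beam_energy_J / 1e6:.0f} MJ")   # ~362 MJ

ship_mass_kg = 2.0e7                 # assume a ~20,000-tonne vessel
speed = 12 * 0.5144                  # 12 knots in metres per second
print(f"Ship at 12 knots:   {0.5 * ship_mass_kg * speed**2 / 1e6:.0f} MJ")  # ~380 MJ

So one nominal beam stores a few hundred megajoules: the kinetic energy of a vessel of roughly twenty thousand tonnes moving at 12 knots.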

This article in the Physics arXiv Blog (MIT’s Technology Review) reads: “Black Holes, Safety, and the LHC Upgrade — If the LHC is to be upgraded, safety should be a central part of the plans,” closing with the claim: “What’s needed, of course, is for the safety of the LHC to be investigated by an independent team of scientists with a strong background in risk analysis but with no professional or financial links to CERN.”
http://www.technologyreview.com/blog/arxiv/27319/

Australian ethicist and risk researcher Mark Leggett concluded in a paper that CERN’s LSAG safety report on the LHC meets less than a fifth of the criteria of a modern risk assessment. There but for the grace of a goddamn particle? Probably not. Before pushing the LHC to its limits, CERN must be challenged by a really neutral, external and multi-disciplinary risk assessment.

Video recordings of the “Origin III” symposium at Ars Electronica:
Presentation Humberto Maturana:

Presentation Roger Malina:

“Origin” Symposia at Ars Electronica:
http://www.aec.at/origin/category/conferences/

Communication on LHC Safety directed to CERN
Feb 10 2012
For a neutral and multidisciplinary risk assessment to be done before any LHC upgrade
http://lhc-concern.info/?page_id=139

More info, links and transcripts of lectures at “LHC-Critique — Network for Safety at experimental sub-nuclear Reactors”:

www.LHC-concern.info

Lee Smolin is said to believe (according to a personal communication from Danila Medvedev, who was told about it by John Smart; I tried to reach Smolin for comment, but failed) that global catastrophe is impossible, based on the following reasoning: the multiverse is dominated by those universes that are able to replicate. This self-replication occurs in black holes, and especially in those black holes which are created by civilizations. Thus, the parameters of the universe are selected so that civilizations cannot self-destruct before they create black holes. As a result, all physical processes in which a civilization may self-destruct are closed off or highly unlikely. An early version of Smolin’s argument is here: http://en.wikipedia.org/wiki/Lee_Smolin — but this early version was refuted in 2004, and so he (probably) added the existence of civilizations as another condition for cosmic natural selection. Anyway, even if this is not Smolin’s actual line of thought, it is quite a possible one.

I think this argument is not persuasive, since selection can operate both in the direction of universes with more viable civilizations and in the direction of universes with a larger number of civilizations, just as biological evolution works towards more robust offspring in some species (mammals) and towards a larger number of offspring with lower viability in others (plants, for example, the dandelion). Since some parameters for the development of civilizations are extremely difficult to adjust via the basic laws of nature (for example, the chances of nuclear war or of a hostile AI), but it is easy to adjust the number of emerging civilizations, it seems to me that universes, if they replicate with the help of civilizations, will use the strategy of the dandelion, not the strategy of mammals. So they will create many unstable civilizations, and we are most likely one of them (the self-indication assumption also supports this view – see the recent post by Katja Grace: http://meteuphoric.wordpress.com/2010/03/23/sia-doomsday-the-filter-is-ahead/)

But some pressure for the preservation of civilizations may still exist. Namely, if an atomic bomb were as easy to create as dynamite – much easier than on Earth (which depends on the quantity of uranium and its chemical and nuclear properties, i.e., is determined by the original basic laws of the universe) – then the chances of the average civilization’s survival would be lower. If Smolin’s hypothesis is correct, then we should encounter insurmountable difficulties in creating nano-robots, the microelectronics needed for strong AI, harmful accelerator experiments involving strangelets (except those that lead to the creation of black holes and new universes), and several other potentially dangerous technology trends whose success depends on the basic properties of the universe, which may manifest themselves in the peculiarities of its chemistry.

In addition, Smolin’s evolution of universes implies that a civilization should create a black hole as early as possible in the course of its history, leading to the replication of universes, because the later it happens, the greater the chance that the civilization will self-destruct before it can create black holes. Moreover, a civilization is not required to survive after the moment of “replication” (though survival may be useful for replication, if a civilization creates many black holes during a long existence). From these two points it follows that we may be underestimating the risk that the Hadron Collider will create black holes.

I would repeat: the early creation of a black hole suggested by Smolin, one that destroys the parent civilization, is very consistent with the situation of the Hadron Collider. The collider is a very early opportunity for us to create a black hole, compared with the alternative — becoming a super-civilization and learning how to merge stars so that they collapse into black holes. That would take millions of years, and the chances of surviving to that stage are much smaller. Also, collider-created black holes may be special in some way, which is a requirement for civilization-driven replication of universes. However, the creation of black holes in the collider most probably means the death of our civilization (though not necessarily: a black hole could grow extremely slowly in the bowels of the Earth, over millions of years, for example, leaving us time to leave the Earth — it depends on unknown physical conditions). For this to work, the black hole must have some feature that distinguishes it from other holes arising in our universe — for example, a powerful magnetic field (which exists in the collider) or a particular initial mass (which also exists at the LHC: it will collide lead ions).

So Smolin’s logic is sound, but it does not prove that our civilization is safe; in fact it proves quite the opposite: that the chance of extinction in the near future is high. We are not obliged to participate in the replication of universes suggested by Smolin, if it ever happens, especially if it is tantamount to the death of the parent civilization. If we continue our lives without black holes, it does not change the total number of universes that have arisen, as it is infinite.

Experts regard safety report on Big Bang Machine as insufficient and one-dimensional

International critics of the high energy experiments planned to start soon at the particle accelerator LHC at CERN in Geneva have submitted a request to the Ministers of Science of the CERN member states and to the delegates to the CERN Council, the supreme controlling body of CERN.

The paper states that several risk scenarios (which have to be described as global or existential risks) cannot currently be excluded. Under present conditions, the critics have to speak out against the operation of the LHC.

The submission includes assessments from experts in fields markedly missing from the physicist-only LSAG safety report — risk assessment, law, ethics and statistics. Further weight is added by the fact that these are all university-level experts — from Griffith University, the University of North Dakota and Oxford University respectively. In particular, it is criticised that CERN’s official safety report lacks independence — all its authors have a prior interest in the LHC running — and that the report uses physicist-only authors, when modern risk-assessment guidelines recommend risk experts and ethicists as well.

As a precondition of safety, the request calls for a neutral and multi-disciplinary risk assessment and additional astrophysical experiments – Earth-based and in the atmosphere – for a better empirical verification of the alleged comparability of particle collisions under the extreme artificial conditions of the LHC experiment and relatively rare natural high-energy particle collisions: “Far from copying nature, the LHC focuses on rare and extreme events in a physical set up which has never occurred before in the history of the planet. Nature does not set up LHC experiments.”

Even under the greatly improved safety circumstances proposed above, big jumps in energy — as presently planned, by a factor of three compared to previous records — should be avoided on principle until the results of each previous run have been carefully analyzed before any further increase in energy.

The concise “Request to CERN Council and Member States on LHC Risks” (PDF with hyperlinks to the described studies) by several critical groups, supported by well-known critics of the planned experiments:

http://lhc-concern.info/wp-content/uploads/2010/03/request-to-cern-council-and-member-states-on-lhc-risks_lhc-kritik-et-al_march-17-2010.pdf

The answer received so far does not address these arguments and studies, but merely repeats that, from the operators’ side, everything appears sufficient, as agreed by a Nobel Prize winner in physics. The LHC restart and record collisions at three times previous energies are presently scheduled for March 30, 2010.

Official, detailed and readily understandable paper and communication with many scientific sources by ‘ConCERNed International’ and ‘LHC Kritik’:

http://lhc-concern.info/wp-content/uploads/2010/03/critical-revision-of-lhc-risks_concerned-int.pdf

More info:
http://lhc-concern.info/