
It has been a few years since I posted here on the Lifeboat Foundation blogs, but with the news breaking recently of CERN's plans to build the FCC [1], a new high-energy collider to dwarf that groundbreaking engineering triumph, the LHC, I feel obliged to write a few words.

The goal of the FCC is to greatly push the energy and intensity frontiers of particle colliders, with the aim of reaching collision energies of 100 TeV in the search for new physics [2]. Linked below is a technical note I wrote and distributed last year on 100 TeV collisions (at the time referencing the proposed China supercollider [3][4]), highlighting the weakness of the white dwarf safety argument at these energy levels and calling for a more detailed study of the neutron star safety argument if it is to be relied on as the sole astrophysical assurance. The argument applies equally to the FCC, of course:

The Next Great Supercollider — Beyond the LHC: https://environmental-safety.webs.com/TechnicalNote-EnvSA03.pdf

The LSAG, and others including myself, have already written at length on the topic of astrophysical assurances. The impact of cosmic rays on neutron stars is the most compelling of those assurances with respect to new higher-energy colliders (other analogies, such as white dwarf capture based assurances, don't hold up quite as well at higher energy levels). CERN will undoubtedly publish a new paper on such astrophysical assurances as part of the FCC development process, though one would hope it comes sooner rather than later, to lay to rest concerns before outsider debate incubates to a larger audience.

Hope springs eternal. Hearing that people from China's IHEP were later in contact with the LSAG on this specific issue, one infers that due diligence is in mind, albeit seemingly in retrospect again. As CERN takes up the baton, significant progress in collecting further input for the overall assessment (e.g. from cosmic rays and direct astrophysical observations) can be expected over the roughly 20-year development timescale.

Meanwhile those of us keen on new science frontiers, and large scale engineering projects, have exciting times ahead yet again with a new CERN flagship.


[1] Cern draws up plans for machine four times the size of Large Hadron Collider, The Guardian, 2019. https://www.theguardian.com/science/2019/jan/15/cern-draws-up-plans-for-collider-four-times-the-size-of-large-hadron

[2] The Future Circular Collider Study (FCC) at CERN. https://home.cern/science/accelerators/future-circular-collider

[3] The next super-collider, The Economist, 2018. https://www.economist.com/leaders/2018/01/11/the-next-super-collider-should-be-built-in-china

[4] Reflecting on China's Ambition to Build the World's Most Powerful Supercollider, Existential Risk/Opportunity Singularity Management, 2015. http://www.global-risk-sig.org/erosmB9F.pdf

[5] The Next Great Supercollider — Beyond the LHC. https://environmental-safety.webs.com/TechnicalNote-EnvSA03.pdf

[6] Progress in Seeking a More Thorough Safety Analysis for China's Supercollider. http://www.global-risk-sig.org/EROSM7Ui.pdf

CERN has revealed plans for a gigantic successor to the giant atom smasher LHC, the biggest machine ever built. Particle physicists will never stop asking for ever larger big bang machines. But where are the limits for ordinary society in terms of costs and existential risks?

CERN boffins are already conducting a mega-experiment at the LHC, a 27 km circular particle collider built at a cost of several billion euros, to study the conditions of matter as they existed fractions of a second after the Big Bang and to find the smallest possible particle – but the question is, how could they ever know? Now they profess to be a little upset because they could not find any particles beyond the Standard Model, that is, anything they would not have expected. To achieve that, particle physicists would like to build an even larger "Future Circular Collider" (FCC) near Geneva, where CERN enjoys extraterritorial status, with a ring of 100 km – for about 24 billion euros.

Experts point out that this research could be as limitless as the universe itself. The UK's former Chief Scientific Advisor, Prof Sir David King, told the BBC: "We have to draw a line somewhere otherwise we end up with a collider that is so large that it goes around the equator. And if it doesn't end there perhaps there will be a request for one that goes to the Moon and back."

“There is always going to be more deep physics to be conducted with larger and larger colliders. My question is to what extent will the knowledge that we already have be extended to benefit humanity?”

There have been broad discussions about whether high energy nuclear experiments could sooner or later pose an existential risk, for example by producing micro black holes (mBH) or strange matter (strangelets) that could convert ordinary matter into strange matter and eventually start a runaway chain reaction from the moment it became stable – theoretically at a mass of around 1,000 protons.

CERN has argued that micro black holes could eventually be produced, but that they would not be stable and would evaporate immediately due to "Hawking radiation", a theoretical process that has never been observed.
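To get a feel for what "evaporate immediately" means in this argument, here is a minimal back-of-the-envelope sketch using the standard semiclassical Hawking evaporation formula; both the TeV-scale black hole mass and the validity of that formula at such scales are assumptions, not established facts.

```python
# Back-of-the-envelope Hawking evaporation time for a hypothetical micro black hole.
# Assumes the semiclassical formula t = 5120 * pi * G^2 * M^3 / (hbar * c^4),
# which is an extrapolation far beyond any experimentally tested regime.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s
eV = 1.602e-19       # joules per electronvolt

def evaporation_time(mass_kg: float) -> float:
    """Semiclassical Hawking evaporation time in seconds."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)

# Hypothetical micro black hole with a rest energy of ~10 TeV (an assumed, LHC-scale figure).
m_bh = 10e12 * eV / c**2
print(f"mass ~ {m_bh:.2e} kg, evaporation time ~ {evaporation_time(m_bh):.1e} s")
# ~5e-85 s for this mass: effectively instantaneous, if (and only if) the semiclassical picture holds.
```

Under those assumptions the lifetime is an unimaginably small fraction of a second, which is why CERN describes the decay as immediate; critics, of course, question the assumptions themselves.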

Furthermore, CERN argues that similar high energy particle collisions occur naturally in the universe and in the Earth's atmosphere, so they cannot be dangerous. However, such natural high energy collisions are rare and have only been measured rather indirectly. Fundamentally, nature does not set up LHC experiments: the density of such artificial particle collisions, for example, never occurs in Earth's atmosphere. And even if the cosmic ray argument were legitimate, CERN produces as many high energy collisions in a confined artificial space as occur naturally in the atmosphere in more than a hundred thousand years. Physicists look quite puzzled when they recalculate it.

Others argue that a particle collider ring would have to be bigger than the Earth to be dangerous.

A study on "Methodological Challenges for Risks with Low Probabilities and High Stakes" was provided by Lifeboat member Prof Raffaela Hillerbrand et al. Prof Eric Johnson submitted a paper discussing the legal difficulties (lawsuits were unsuccessful or not accepted, respectively) as well as the problem of groupthink within scientific communities. Further important contributions to the existential risk debate came from risk assessment experts Wolfgang Kromp and Mark Leggett, from R. Plaga, Eric Penrose, Walter Wagner, Otto Roessler, James Blodgett, Tom Kerwick and many more.

Since these discussions can become very sophisticated, there is also a more general approach: according to present research, there are around 10 billion Earth-like planets in our galaxy, the Milky Way, alone. Intelligent life might send radio waves, because they are extremely long-lasting, yet we have not received any (the "Fermi paradox"). Theory postulates that there could be a "great filter", something that wipes out intelligent civilizations at a rather early stage of their technical development. Let that sink in.

All technical civilizations would start to build particle smashers to find out how the universe works, to get as close as possible to the Big Bang, and to hunt for the smallest particle with bigger and bigger machines. But maybe there is a very unexpected effect lurking at a certain threshold that nobody would ever think of and that theory does not predict. Indeed, this could be a logical candidate for the "great filter" and an explanation for the Fermi paradox. If it is, a disastrous big bang machine might not be that big at all, because if civilizations had to construct a collider of epic dimensions, a lack of resources would have stopped most of them first.

Finally, the CERN member states will have to decide on the budget and the future course.

The political question behind it is: how far are the ordinary citizens who pay for all this willing to go?

LHC-Critique / LHC-Kritik

Network to discuss the risks at experimental subnuclear particle accelerators

www.lhc-concern.info

LHC-Critique[at]gmx.com

https://www.facebook.com/LHC-Critique-LHC-Kritik-128633813877959/

Particle collider safety newsgroup at Facebook:

https://www.facebook.com/groups/particle.collider/

https://www.facebook.com/groups/LHC.Critique/


July 2015, as you know, was all systems go for CERN's Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn't blink an eye hearing that large amounts of data were being collected at every turn. BUT, I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary is recorded as a "collision event". Physicists must then sift through the 30 petabytes or so of data produced annually to determine if the collisions have thrown up any interesting physics. Needless to say — The Hunt is On!

The Data Center processes about one Petabyte of data every day — the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores. Some 6000 changes in the database are performed every second.
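As a quick sanity check on those figures, here is a small sketch converting one petabyte per day into DVD equivalents and daily database changes; the 4.7 GB single-layer DVD capacity is an assumed value.

```python
# Rough sanity check on the CERN Data Centre figures quoted above.
DVD_BYTES = 4.7e9          # assumed single-layer DVD capacity in bytes
PETABYTE = 1e15            # decimal petabyte in bytes

daily_data = 1 * PETABYTE                      # ~1 PB processed per day
dvds_per_day = daily_data / DVD_BYTES          # ~213,000 DVDs, close to the quoted 210,000
db_changes_per_day = 6000 * 86400              # ~518 million database changes per day

print(f"{dvds_per_day:,.0f} DVDs/day, {db_changes_per_day:,} DB changes/day")
```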

With experiments at CERN generating such colossal amounts of data, the Data Center stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of over 8000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.


PROCESSING THE DATA (processing info via CERN): Subsequently, hundreds of thousands of computers from around the world come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. The WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, which are linked to CERN. Every day the WLCG processes more than 1.5 million 'jobs', corresponding to a single computer running for more than 600 years.
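To see how 1.5 million daily jobs can "correspond to a single computer running for more than 600 years", here is a short sketch working out the average job length implied by those two quoted numbers; nothing beyond the quote is used.

```python
# What average job duration makes 1.5 million jobs/day equal ~600 years
# of single-computer running time? Simple arithmetic on the quoted figures.
jobs_per_day = 1.5e6
cpu_years_per_day = 600

seconds_per_year = 365.25 * 24 * 3600
total_cpu_seconds = cpu_years_per_day * seconds_per_year
avg_job_hours = total_cpu_seconds / jobs_per_day / 3600

print(f"implied average job length ~ {avg_job_hours:.1f} CPU-hours")  # ~3.5 hours per job
```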

Racks of servers at the CERN Data Centre (Image: CERN)
CERN DATA CENTER: The server farm in the 1450 m2 main room of the DC (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the DC houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second):

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s

In July, the LHCb experiment reported the observation of an entirely new class of particles:
Exotic Pentaquark Particles (Image: CERN)

Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together.

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, are composed of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. This quark model also allows the existence of other quark composite states, such as pentaquarks composed of four quarks and an antiquark.
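The quark model's bookkeeping is simple enough to spell out in a few lines. The sketch below uses the standard fractional quark charges to show that a baryon such as the proton (uud), a meson (u plus anti-d), and a uudc-plus-anti-c pentaquark candidate (the quark content generally discussed for the LHCb states, taken here as an illustrative assumption) all come out with integer electric charge.

```python
# Minimal quark-model bookkeeping: electric charge (in units of e) and baryon number.
# Standard quark charges; the uudc + anti-c pentaquark content is illustrative.
from fractions import Fraction

CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "c": Fraction(2, 3),
          "s": Fraction(-1, 3), "t": Fraction(2, 3), "b": Fraction(-1, 3)}

def properties(quarks, antiquarks):
    """Total charge and baryon number of a bound state of quarks and antiquarks."""
    charge = sum(CHARGE[q] for q in quarks) - sum(CHARGE[q] for q in antiquarks)
    baryon_number = Fraction(len(quarks) - len(antiquarks), 3)
    return charge, baryon_number

for name, quarks, anti in [("proton (uud)", "uud", ""),
                           ("pi+ meson (u, anti-d)", "u", "d"),
                           ("pentaquark (uudc, anti-c)", "uudc", "c")]:
    q, b = properties(quarks, anti)
    print(f"{name:<28} charge = {q}, baryon number = {b}")
```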

Until now, however, no conclusive evidence for pentaquarks had been seen.
Earlier experiments that have searched for pentaquarks have proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

"The quarks could be tightly bound," said LHCb physicist Liming Zhang of Tsinghua University, "or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei." More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei, Antinuclei
(Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today — from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show — with unparalleled precision – that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics, known as the Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry is true, a system remains unchanged if three fundamental properties — charge, parity, which refers to a 180-degree flip in spatial configuration, and time — are reversed.

The latest study takes the research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons — a hydrogen nucleus with an additional neutron — and antideuterons, as well as for helium-3 nuclei — two protons plus a neutron — and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles' tracks in the ALICE detector's magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.
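To make concrete what a mass-to-charge comparison involves, here is a minimal sketch computing that quantity for the proton and the deuteron from CODATA reference values; under CPT symmetry the corresponding antiparticles should show the same magnitudes, which is what BASE and ALICE set out to test. The numbers below are textbook constants, not the experiments' actual measured values.

```python
# Illustrative mass-to-charge ratios (kg/C) for particles whose antiparticles
# should, under CPT symmetry, show identical magnitudes. CODATA values via scipy.
from scipy.constants import physical_constants, e

m_p = physical_constants["proton mass"][0]      # kg
m_d = physical_constants["deuteron mass"][0]    # kg

print(f"proton   m/q = {m_p / e:.6e} kg/C")
print(f"deuteron m/q = {m_d / e:.6e} kg/C")
# A CPT test asks whether the corresponding antiparticle ratios differ from these
# magnitudes at all, to within the experiment's precision.
```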

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips' LinkedIn Pulse @

Article: Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction


Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Why the LHC must be shut down

CERN-Critics: LHC restart is a sad day for science and humanity!


PRESS RELEASE “LHC-KRITIK”/”LHC-CRITIQUE” www.lhc-concern.info
CERN-Critics: LHC restart is a sad day for science and humanity!
These days, CERN has restarted the world's biggest particle collider, the so-called "Big Bang machine", the LHC. After an upgrade of the world's biggest machine costing hundreds of millions of euros, CERN plans to smash particles at double the previous energies. This poses certain, one would hope eventually small (?), but fundamentally unpredictable catastrophic risks to planet Earth.
Essentially the same group of critics, including professors and doctors, that had previously filed lawsuits against CERN in the US and Europe still opposes the restart, for essentially the same reasons. The dangers of ("micro") black holes, strangelets, vacuum bubbles, and so on are of course still in discussion, and maybe will forever be. No specific improvements concerning the safety assessment of the LHC have been made by CERN or anybody else in the meantime. There is still no proper and truly independent risk assessment (the 'LSAG report' was done by CERN itself), and the science of risk research is still not really involved in the issue. This is a scientific and political scandal, and that is why the restart is a sad day for science and humanity.
The scientific network "LHC-Critique" calls for a stop to any public sponsorship of gigantomaniac particle colliders.
Just to demonstrate how speculative this research is: even CERN has to admit that the so-called "Higgs boson" was only "probably" discovered. Very probably, mankind will never find any use for the Higgs boson. (We are not talking here about the use of collider technology in medical applications.) It could be a minor, but very improbable, advantage for mankind to comprehend the Big Bang one day. But it would surely be fatal – as the Atomic Age has already demonstrated – to learn how to handle this or other extreme phenomena of the universe.
Within the next billions of years, mankind will have enough problems even without CERN.
Sources:
- A new paper by our partner “Heavy Ion Alert” will be published soon: http://www.heavyionalert.org/
- Background documents provided by our partner “LHC Safety Review”: http://www.lhcsafetyreview.org/

- Press release by our partner "Risk Evaluation Forum" emphasizing renewed particle collider risk: http://www.risk-evaluation-forum.org/newsbg.pdf

- Study concluding that “Mini Black Holes” could be created at planned LHC energies: http://phys.org/news/2015-03-mini-black-holes-lhc-parallel.html

- New paper by Dr. Thomas B. Kerwick on lacking safety argument by CERN: http://vixra.org/abs/1503.0066

- More info at the LHC-Kritik/LHC-Critique website: www.LHC-concern.info
Best regards:
LHC-Kritik/LHC-Critique

I was about to discuss the third of three concepts, but thought a look back would be appropriate at this time. In my earlier post I had shown that the photon/particle wave function could not be part of the photon/particle as this would violate the empirical Lorentz-Fitzgerald transformations and therefore, Einstein’s Special Theory of Relativity. The wave function is only the photon/particle’s disturbance of the spacetime it is in, and therefore explains why photons/particles have wave properties. They don’t. They disturb spacetime like a pebble dropped into a pond. The pond’s ripples are not the pebble.

In the recent findings, Dr. Alberto Peruzzo of the University of Bristol (UK), the lead author of the paper, is quoted as saying: "The measurement apparatus detected strong nonlocality, which certified that the photon behaved simultaneously as a wave and a particle in our experiment … This represents a strong refutation of models in which the photon is either a wave or a particle." This is a very important finding and another step in the progress of science towards a better understanding of our Universe.

Those of you who have been following my blog posts will recognize that this is empirical validation, using a single-structure test, that wave and particle properties occur together. What is required next, to be empirically rigorous, is to either confirm or deny that this wave function is a spacetime disturbance. For that we require a dual-structure test.

If this wave function is a spacetime disturbance, then Einstein's Special Theory of Relativity is upheld, and we would require a major rethink of quantum physics or the physics of elementary particles. If this wave function is not a spacetime disturbance but part of the particle structure, then there is an empirical exception to the Lorentz-Fitzgerald transformation and we would require a rethink of Einstein's Special Theory of Relativity.

Here is a proposal for a dual-structure test (to test two alternative hypotheses) which probably only an organization like CERN could execute: is it possible to disturb spacetime in such a manner as to exhibit the properties of a known particle, but with no mass? That is, the underlying elementary particle is not present. I suppose other research institutions could attempt this, too. If successful, it will be a bigger discovery than that of Dr. Alberto Peruzzo and his team.

My money is on Lorentz-Fitzgerald and Einstein being correct, and I infer that the community of quantum and string theorists would not be happy at the possibility of this dual-structure test.

So I ask, in the spirit of the Kline Directive, can we as a community of physicists and engineers come together, to explore what others have not, to seek what others will not, to change what others dare not, to make interstellar travel a reality within our lifetimes?

Previous post in the Kline Directive series.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.

The Kline Directive: Theoretical-Empirical Relationship (Part 3)


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In Part 1, we learned that Einstein was phenomenally successful because his work was deeply meshed with the experimental evidence of the day. In Part 2, we learned that to be successful at developing new useful theories and discovering new fundamental properties of Nature that will bring forth new interstellar travel technologies, we need to avoid hypotheses that are not grounded in experimental data, as these are purely mathematical conjectures.

In my book on gravity modification I classified physics hypotheses and theories into 3 categories, as follows:

A. Type 1: The Millennium Theories
These are theories that would require more than 100 years, and up to 1,000 years, to prove or disprove. They are mathematically correct but beyond the reach of physically verifiable experiments, even in the distant future.

String and quantum gravity theories fall into this category. Why? If we cannot even figure out how to engineer-modify 4-dimensional spacetime, how are we going to engineer-modify a 5-, 6-, 9-, 11- or 23-dimensional universe?

How long would it take using string theories to modify gravity? Prof. Michio Kaku, in his April 2008 Space Show interview, suggested several hundred years. Dr. Eric Davis, in his G4TV interview, suggested more than 100 years, maybe 200 years. So rightly, by their own admission, these are Millennium Theories. It should be noted that Richard Feynman (Nobel Prize 1965) and Sheldon Lee Glashow (Nobel Prize 1979) were against string theory, but their opinions did not prevail.

Even hypotheses that conjecture time travel should be classified as Millennium Theories, because they require 'exotic' matter. John Eades, a retired CERN senior scientist, in his article Antimatter Pseudoscience, states in no uncertain terms that antimatter is impossible to create and handle in real quantities. Then what about exotic matter?

For that matter, any hypothesis that requires antimatter or exotic matter should be classified as a Millennium Theory.

B. Type 2: The 100-Year Theories
These are theories that show promise of being verified with technologies that would require several decades to engineer, test and prove.

These types of theories do not lend themselves to an immediate engineering solution. The engineering solution is theoretically feasible but a working experiment or technology is some decades away, because the experimental or physical implementation is not fully understood.

Note that there is a gap here. We do not have 100-Year Theories in our repertoire of physical theories to keep the pipeline supplied with new and different ways to test the physical Universe.

C. Type 3: The Engineering Feasible Theories
These are theories that lend themselves to an engineering solution, today. They are falsifiable today, with our current engineering technologies. They can be tested and verified in the laboratory if one knows what to test for and how to test for these experimental observations.

Today Relativity falls into this category because we have the engineering sophistication to test Einstein's theory, and it has been vindicated time and time again. But there is a very big 'but': Relativity cannot give us gravity modification or new propulsion theories, because it requires mass. We need to stand on Einstein's shoulders to take the next step forward.

Therefore, if we are to become an interstellar civilization, in the spirit of the Kline Directive, we need to actively seek out and explore physics in such a manner as to bring forth Engineering Feasible and 100-Year Theories.

We need to ask ourselves, what can we do, to migrate the theoretical physics research away from Theory of Everything research to the new field of propulsion physics? Gravity modification is an example of propulsion physics. Here is the definition of gravity modification, from my book:

“Gravity modification is defined as the modification of the strength and/or direction of the gravitational acceleration without the use of mass as the primary source of this modification, in local space time. It consists of field modulation and field vectoring. Field modulation is the ability to attenuate or amplify a force field. Field vectoring is the ability to change the direction of this force field.”

Note that by this definition, which requires no mass, relativity, quantum mechanics and string theories cannot be used to theorize propulsion physics. Hence the urgent need to find genuinely new ways in physics to achieve interstellar travel.
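Purely to restate that definition in concrete terms, here is a toy sketch separating the two operations it names: a scalar modulation factor applied to the strength of a baseline gravitational acceleration, and a vectoring angle applied to its direction. It encodes only the definition from the book, not any physical mechanism.

```python
# Toy restatement of the gravity-modification definition quoted above:
# field modulation = scaling the strength of a force field,
# field vectoring = changing its direction. No physics claimed, just bookkeeping.
import math
from dataclasses import dataclass

@dataclass
class ModifiedField:
    baseline_g: float        # m/s^2, e.g. 9.81 at Earth's surface
    modulation: float        # 1.0 = unchanged, <1 attenuated, >1 amplified
    vectoring_deg: float     # rotation of the field direction, degrees from "down"

    def acceleration(self):
        """Return (gx, gz) components of the modified field in a vertical plane."""
        g = self.baseline_g * self.modulation
        theta = math.radians(self.vectoring_deg)
        return (g * math.sin(theta), -g * math.cos(theta))

# Example: attenuate Earth's field by half and vector it 30 degrees off vertical.
print(ModifiedField(9.81, 0.5, 30.0).acceleration())
```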

Can we get there? The new physics? To answer this question let me quote Dr. Andrew Beckwith, Astrophysicist, Ph.D.(Condensed Matter Theory) who wrote the Foreword to my book:

“I believe that Quantum Mechanics is an embedded artifact of a higher level deterministic theory, i.e. much in the same vein as G. t’Hooft, the Nobel prize winner. In this sense, what Benjamin has done is to give a first order approximation as to what Quantum Mechanics is actually a part of which may in its own way shed much needed understanding of the foundations of Quantum Mechanics well beyond the ‘Pilot model’ of DICE 2010 fame (this is a conference on the foundations of Quantum Mechanics and its extension given once every two years in Pisa , Italy, organized by Thomas Elze).”

Why does Dr. Andrew Beckwith reference quantum mechanics in a book on gravity modification?

Because my investigation into gravity modification led me to the conclusion that gravitational acceleration is independent of the internal structure of the particle. It does not matter if the particle consists of other particles, strings, pebbles or rocks. This led me to ask the question: what, then, is the internal structure of a photon? I found that the photon probability is not Gaussian but a new distribution, Var-Gamma. Therefore I believe Robert Nemiroff's three-photon observation will be vindicated by other physicist-researchers sifting through NASA's archives for gamma-ray bursts.

Previous post in the Kline Directive series.

—————————————————————————————————


The Kline Directive: Economic Viability


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In this post I will explore Economic Viability. I have proposed the Interstellar Challenge Matrix (ICM) to guide us through the issues so that we can arrive at interstellar travel sooner rather than later. Let us review the cost estimates of the various star drives just to reach a velocity of 0.1c, as detailed in previous blog posts:

Interstellar Challenge Matrix (Partial Matrix)

Propulsion Mechanism | Legal? | Cost Estimate
Conventional Fuel Rockets | Yes | Greater than US$1.19E+14
Antimatter Propulsion | Do not know | Between US$1.25E+20 and US$6.25E+21
Atomic Bomb Pulse Detonation | Illegal (banned as of 1963 per the Partial Test Ban Treaty) | Between US$2.6E+12 and US$25.6E+12 (Project Orion's original costs converted to 2012 dollars); requires anywhere between 300,000 and 30,000,000 bombs!!
Time Travel | Do not know | Requires exotic matter, therefore greater than the antimatter propulsion cost of US$1.25E+20
Quantum Foam Based Propulsion | Do not know | Requires exotic matter, therefore greater than the antimatter propulsion cost of US$1.25E+20
Small Black Hole Propulsion | Most probably illegal in the future | Using CERN to estimate: at least US$9E+9 per annual budget. CERN was founded 58 years ago, in 1954, so a guesstimate of the total expenditure required to reach its current technological standing is US$1.4E+11

Note: the Atomic Bomb numbers were updated on 10/18/2012 after Robert Steinhaus commented that the cost estimates "are excessively high and unrealistic". I researched the topic and found that Project Orion's own detailed costs, of $2.6E+12 to $25.6E+12, are worse than my estimates.
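Since the full matrix is promised for a later post, here is a small sketch that encodes the partial matrix above as a data structure and orders the options by their lower-bound cost; the figures are simply the ones quoted in the matrix, with open-ended entries represented by their stated lower bounds.

```python
# The partial Interstellar Challenge Matrix above, encoded as data and sorted by
# lower-bound cost. Figures are taken directly from the matrix; a "None" upper
# bound means the matrix gives no upper estimate.
drives = [
    # (propulsion mechanism, legal?, lower cost US$, upper cost US$)
    ("Conventional Fuel Rockets",     "Yes",                               1.19e14, None),
    ("Antimatter Propulsion",         "Do not know",                       1.25e20, 6.25e21),
    ("Atomic Bomb Pulse Detonation",  "Illegal (Partial Test Ban Treaty)", 2.6e12,  25.6e12),
    ("Time Travel",                   "Do not know",                       1.25e20, None),
    ("Quantum Foam Based Propulsion", "Do not know",                       1.25e20, None),
    ("Small Black Hole Propulsion",   "Most probably illegal in future",   1.4e11,  None),
]

for name, legal, low, high in sorted(drives, key=lambda d: d[2]):
    upper = f" to ${high:.3g}" if high else "+"
    print(f"{name:<32} legal: {legal:<35} cost: ${low:.3g}{upper}")
```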

These costs are humongous. The Everly Brothers said it best.

Let's step back and ask ourselves the question: is this the tool kit we have to achieve interstellar travel? Are we serious? Is this why DARPA — the organization that funds many strange projects — said it will take more than 100 years? Are we not interested in doing something sooner? What happened to the spirit of the Kline Directive?

From a space exploration perspective, economic viability is a strange criterion. It is not physics, nor is it engineering, and until recently the space exploration community has been government funded to the point where realistic cost accountability is nonexistent.

Don’t get me wrong. This is not about agreeing to a payment scheme and providing the services as contracted. Government contractors have learned to do that very well. It is about standing on your own two feet, on a purely technology driven commercial basis. This is not an accounting problem, and accountants and CFOs cannot solve this. They would have no idea where to start. This is a physics and engineering problem that shows up as an economic viability problem that only physicists and engineers can solve.

The physics, materials, technology and manufacturing capabilities have evolved so much that companies like Planetary Resources, SpaceX, Orbital Sciences Corp, Virgin Galactic, and the Ad Astra Rocket Company are changing this economic viability equation. This is the spirit of the Kline Directive, to seek out what others would not.

So I ask the question: who among you physicists and engineers would like to be engaged in this type of endeavor?

But first, let us learn a lesson from history to figure out what it takes. Take, for example, DARPA's funding of gallium arsenide. "One of DARPA's lesser known accomplishments, semiconductor gallium arsenide received a push from a $600-million computer research program in the mid-1980s. Although more costly than silicon, the material has become central to wireless communications chips in everything from cellphones to satellites, thanks to its high electron mobility, which lets it work at higher frequencies."

In the 1990s Gallium Arsenide semiconductors were so expensive that “silicon wafers could be considered free”. But before you jump in and say that is where current interstellar propulsion theories are, you need to note one more important factor.

The Gallium Arsenide technology had a parallel commercially proven technology in place, the silicon semiconductor technology. None of our interstellar propulsion technology ideas have anything comparable to a commercially successful parallel technology. (I forgot conventional rockets. Really?) A guesstimate, in today’s dollars, of what it would cost to develop interstellar travel propulsion given that we already had a parallel commercially proven technology, would be $1 billion, and DARPA would be the first in line to attempt this.

Given our theoretical physics and our current technological feasibility, this cost analysis would suggest that we require about 10 major technological innovations, each building on the other, before interstellar travel becomes feasible.

That is a very big step. Almost like reaching out to eternity. No wonder Prof. Adam Frank, in his July 24, 2012 New York Times Op-Ed "Alone in the Void", wrote: "Short of a scientific miracle of the kind that has never occurred, our future history for millenniums will be played out on Earth".

Therefore, we need to communicate to the theoretical physics community that they need to get off the Theory of Everything locomotive and refocus on propulsion physics. In a later blog posting I will complete the Interstellar Challenge Matrix (ICM). Please use it to converse with your physicist colleagues and friends about the need to focus on propulsion physics.

In the spirit of the Kline Directive — bold, explore, seek & change — can we identify the 10 major technological innovations? Wouldn’t that keep you awake at night at the possibility of new unthinkable inventions that will take man where no man has gone before?

PS. I was going to name the Interstellar Challenge Matrix (ICM), the Feasibility Matrix for Interstellar Travel (FMIT), then I realized that it would not catch on at MIT, and decided to stay with ICM.

Previous post in the Kline Directive series.

Next post in the Kline Directive series.

—————————————————————————————————


EOH (end of humanity) events are events that cause the irreversible termination of humanity. They are not the events that start the physical destruction of humanity (that would be too late), but the fundamental, non-threatening and inconspicuous events that eventually lead to the irreversible physical destruction of humanity. Using nations and civilizations as examples, I explain how.

(1) Fundamental: These events have to be fundamental to the survival of the human species or else they cannot negatively impact the foundation of humanity’s existence.

On a much smaller scale, drought and war can and have destroyed nations and civilizations. However, that is not always the case. For example, it is still not known what caused the demise of the Mayan civilization.

The act of war can lead to the irreversible destruction of a nation or civilization, but the equivalent EOH event lies further back in history, and can only be identified by asking who and why.

For example, the assassination of Archduke Franz Ferdinand of Austria is the EOH event that triggered a domino effect: it started with Austria-Hungary's declaration of war against Serbia, and led the Central Powers (including Germany and Austria-Hungary) and the Allies of World War I (countries allied with Serbia) to declare war on each other, starting World War I.

In this case Europe was not destroyed as it still had the capability to rebuild, but it led to massive loss of human lives.

Lesson: This illustrates that an EOH event acts like a trigger. Therefore, EOH events must have the capability to trigger destruction in such a manner as to annihilate the capability to rebuild, too.

(2) Non-Threatening: They have to be non-threatening, or else these types of events cannot take hold and become mainstream.

The Hindu numeral system, designed for positional notation in a decimal system and invented (the trigger event) in India, was transmitted via Arab traders to Europe, where it took root and bloomed into the counting and mathematical systems we now accept universally.

Note that the Roman Empire – essentially Southern Europe, the Mediterranean and parts of the Middle East – used a comparatively awkward system by contrast, and that system has not survived into general usage today.

The development of mathematics in a Europe hungry not to be left behind led to the development of sciences and engineering not envisioned by India. Several hundred years later it came back to India in the form of the British Raj, and changed how Indians live.

Lesson: This illustrates that for an EOH event to prosper it requires a conducive environment – in this case a Europe hungry not to be left behind.

(3) Inconspicuous: They have to be inconspicuous to facilitate the chain reaction of irreversible events. If these events were visible to the majority of humanity when they occurred, people could intervene and prevent the chain reaction leading to the irreversible destruction of humanity.

The 9/11 attacks on the Twin Towers of the World Trade Center were inconspicuous simply because no one believed that commercial airplanes could be used as weapons of destruction. The subsequent chain reaction, the sequential collapse of building floors, led to destruction and major loss of life.

Lesson: Inconspicuous does not necessarily mean ‘cannot be seen’ as the 9/11 example illustrates that it would also encompass ‘cannot be believed’.

.

In summary, an EOH event is a non-threatening, inconspicuous trigger in a conducive environment that chain-reacts into the irreversible physical destruction of the foundations of humanity in a manner that prevents rebuilding.

By this definition there are two EOH events within our comprehension.

The first that comes to mind is the irreversible expansion of our Sun into a red giant, which will lead to the total destruction of humanity, with no ability to rebuild, if we remain on Earth. It is triggered by the inconspicuous exhaustion of hydrogen in the Sun's core, which alters the Sun's thermonuclear fusion chain reaction.

Therefore, the EOH event is the exhaustion of hydrogen.

.

The second is experiments in small black hole production. The hypothesis that small black holes can be used for interstellar propulsion (https://lifeboat.com/blog/2012/09/debunking-the-black-hole-interstellar-drive) provides a conducive environment to trigger the funding of such experiments. The realization of small black holes without experimentally proven controls will lead to the irreversible chain reaction of black hole growth as it consumes the matter around it at ever faster rates. This will result in the complete destruction of humanity and everything within our reach, in a manner from which we cannot rebuild.

Therefore, the EOH event is the approval of funding for small black hole experimental research. And with CERN we have achieved an EOH event.

—————————————————————————————————


There are four camps that comprise the present-day interstellar travel community, and only one camp will succeed.

The first camp, the conventional rocket camp, believes it is possible using conventional rockets (chemical, ion, nuclear or antimatter) to realize interstellar travel to our nearest star, Alpha Centauri. One of the problems is the cost, estimated at an unthinkably large $238,596 billion and upwards. It is several thousand times greater if we choose to use antimatter.

Further, John Eades, a former senior scientist with CERN, in his March/April 2012 Skeptical Inquirer article “Antimatter Pseudoscience”, lays down the reasons why antimatter based propulsion will never be technologically feasible.

A black hole of wealth. One down, three to go.

.

The second, the hypothesis camp, believes that there is some equation that will allow us to reach 1,000 times the velocity of light and upwards, based on quantum foam. Nonsense. Be very clear: the experimental evidence proves that anything with mass cannot be accelerated to exceed the velocity of light. Sure, we have hypotheses (i.e. mathematical guesses without experimental proof) that point every which way, but at best these are guesses and they have not been, or cannot be, proven experimentally. In addition, Robert Nemiroff's three-photon discovery suggests that both quantum foam and quantum gravity may be in part or in whole invalidated, while relativity is upheld.

Wrong turn. Two down and two to go.

.

The third, the impossible camp, believes that interstellar travel is impossible. As Prof. Adam Frank stated in his July 24, 2012 New York Times Op-Ed "Alone in the Void", "Short of a scientific miracle of the kind that has never occurred, our future history for millenniums will be played out on Earth". Obviously the impossible camp disagrees with the hypothesis camp on the basis of the physics.

Don’t argue. Three down one more to go.

.

I belong to the fourth, the new physics camp, which holds that there is a new physics that the other three camps do not subscribe to. There are 57 of us physicist-engineers from 16 countries – the US, Russia, the UK, China, Japan, Romania, Austria, India and more – who have researched or are researching new propulsion technologies that are not based on chemical, ion, nuclear or antimatter engines, or on untested hypotheses. We search out and investigate anomalies.

Change is coming. We will be successful.

.

Based on my work as evidence, several important phenomena have been discovered:

1. A new formula for gravitational acceleration that does not require us to know the mass of the planet or star. This is an immense discovery, never before accomplished in the 346-year history of the physics of gravitational fields since Newton, as all theories of gravity require us to know the mass of the planet or star.

2. Solved Laithwaite's Big Wheel experiment, which nobody else had solved in the last 35 years.

3. Asked questions that neither relativity nor quantum theory has asked. For example, how is probability implemented in Nature?

Because we have learned to ask questions that the other three camps have not, we the new physics camp will find different answers and reach the stars before anyone else.
