
It may have gone unnoticed by most, but the first expedition toward mankind’s first permanent undersea colony will begin in July of next year. These aquanauts represent the first humans who will soon (around 2015) move to such a habitat and stay, with no intention of ever calling dry land home again. Further details: http://underseacolony.com/core/index.php

Of the roughly 100 billion humans who have ever lived, not one has lived permanently undersea. The Challenger Station habitat, the largest manned undersea habitat ever built, will establish the first permanent undersea colony, with the aspiration that the ocean will become a new frontier of human colonization. Could it be a long-term success?

The knowledge gained from adapting and growing isolated ecosystems in unnatural environs, and from the effects on the mentality and social well-being of the colony, may provide interesting insights into how to establish effective off-Earth colonies.

One can start to pose the questions: what makes the colony self-sustainable? What makes the colony adaptive and able to expand its horizons? What socio-political structure works best in a small interdependent colony? Perhaps it is not in the first six months of sustainability, but after decades of regeneration, that the true dynamics become apparent.

Whilst one does not find a lawyer, a politician or a management consultant on the initial crew, one can be assured that if the project succeeds, it will start to require professions not previously considered. At what size does a colony need a medical team, and not just one part-time doctor? What about teaching skills and schooling, so that each mandatory skill set is sustained across generations? In this light, it could become the first social project to determine the minimal crew balance for a sustainable permanent off-Earth Lifeboat. One can muse back to the satire of the Golgafrincham B Ark in The Hitch-Hiker’s Guide to the Galaxy, in which Golgafrinchan telephone sanitisers, management consultants and marketing executives were persuaded that their planet was under threat from an enormous mutant star goat, packed into Ark spaceships, and sent to an insignificant planet… which turned out to be Earth. It is a satirical reminder that the choice of crew and colony for a real Lifeboat would require the utmost social research.

To achieve interstellar travel, the Kline Directive instructs us to be bold: to explore what others have not, to seek what others will not, to change what others dare not; to extend the boundaries of our knowledge, to advocate new methods, techniques and research, and to sponsor change, not the status quo, on five fronts: Legal Standing, Safety Awareness, Economic Viability, Theoretical-Empirical Relationships, and Technological Feasibility.

In this post I discuss the third and final part, Concepts and Logical Flow, of how to read or write a journal paper, something that is not taught in colleges.

A paper consists of a series of evolving concepts expressed as paragraphs. If a concept is too complex to be detailed in a single paragraph, then break it down into several sub-concept paragraphs. Make sure there is logical evolution of thought across these sub-concepts, and across the paper.

As a general rule your sentences should be short, or at least shorter. Try very hard not to exceed two lines of Letter or A4 paper at font size 11. Use commas judiciously. Commas are not meant to extend sentences or divide a sentence into several points. They break a sentence into sub-sentences and indicate a pause when reading aloud. How you use commas can alter the meaning of a sentence. Here is an example.

And this I know with confidence, I remain and continue …

Moving the comma changes the meaning to

And this I know, with confidence I remain and continue …

We see how ‘confidence’ changes from the speaker’s assessment of his state of knowledge to the speaker’s reason for being. So take care.

When including mathematical formulae, always wrap them: give them an opening paragraph and a closing paragraph. Why? This enhances the clarity of the paper. The opening paragraph introduces the salient features of the equation(s), i.e. what the reader needs to be aware of in the equation(s), an explanation of the symbols, or why the equation is being introduced.

The closing paragraph explains what the author found by stating the equations, and what the reader should expect to look for in subsequent discussions, or even why the equation(s) is or is not relevant to subsequent discussions.
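As a minimal LaTeX sketch of this wrapping, using the massless acceleration formula discussed elsewhere in this series purely as an illustration:

```latex
The gravitational acceleration $g$ can be written in terms of the
spatial gradient of time dilation, denoted $\tau$, and the speed of
light $c$:
\begin{equation}
  g = \tau c^{2}
\end{equation}
Because no mass term appears, the equation can be applied to any
source that produces a time-dilation gradient.
```

The first paragraph tells the reader what the symbols mean, and the closing paragraph tells the reader what to take away from the equation before the discussion moves on.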

Many of these concept-paragraphs are logically combined into sections, and each section has a purpose for its inclusion. Though this purpose may not always be stated in the section, it is important to identify what it is and why it fits in with the overall schema of the paper.

The basic schema of a paper consists of an introduction, body and conclusion. Of course there are variations to this basic schema, and you need to ask the question: why does the author include other types of sections?

In the introduction section(s) you summarize your case: what your paper is about and what others have reported. In the body sections you present your work. In the conclusion section you summarize your findings and the future direction of the research. Why? Because a busy researcher can read your introduction and conclusion and then decide whether your paper is relevant to his or her work. Remember, we are working within a community of researchers in an asynchronous manner, an asynchronous team, if you would. As more and more papers are published every year, we don’t have the time to read all of them completely. So we need a method of eliminating papers we are not going to read.

An abstract is usually a summary of the body of the paper. It is difficult to do well and should only be written after you have completed your paper. That means planning ahead: have your paper written and the abstract completed by the time you receive the call for papers.

An abstract tells us if the paper could be relevant, i.e. whether to include it in the list of papers to be considered for the shortlist of papers to be read. The introduction and conclusion tell us if the paper should be removed from our shortlist. If the conclusion fits in with what we want to achieve, then don’t remove the paper from the shortlist.

I follow a rule when writing the introduction section. If I am writing to add to the body of consensus, I state my case and then write a review of what others have reported. If I am negating the body of consensus, then I write a review of what others have reported, and only then do I state my case for why not.

As a general rule, you write several iterations of the body first, then the introduction and finally the conclusion. You’d be surprised by how your thinking changes if you do it this way. This is because you have left yourself open to other inferences that had not crossed your mind between the time you completed your work and the time you started writing your paper.

If someone else has theoretical or experimental results that apparently contradict your thesis, then discuss why and why not, and you might end up changing your mind. It is not a ‘sin’ to include contradictory results, but make sure you discuss them intelligently and impartially.

Your work is the sowing and growing period. Writing the paper is the harvesting period. What are you harvesting? Wheat, weeds or both? Clearly, the more wheat you harvest the better your paper. The first test for this is the logical flow of your paper. If it does not flow well, something is amiss. You the author, and you the reader, beware! There is no substitute for rethinking your paper.

The second test: are there tangential discussions in your paper that seem interesting but are not directly relevant? Prune, prune and prune. If necessary, split the paper into multiple concise papers. A concise, sharp paper that everyone remembers is more valuable than a long one that readers have to plough through.

Go forth, read well and write more.


—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.


I was not intending to write Part 5, but judging from the responses I thought it was necessary to explain how to read a journal paper; a good read cannot be done without pen and paper. If you are writing a paper, once you have completed it, I suggest you set it aside for at least a week. Don’t think about your paper or its topic during this shmita period. Then come back to it with pen and paper and read it afresh. You’d be surprised by the number of changes you make, which means you have to start well before your deadline.

Note: you can find articles on how to review or write papers. Here is one by the IOP (Institute of Physics, UK), titled Introduction to refereeing; it is a good guide to read before reading or writing a paper. It is aimed at physics but applies to all the sciences and engineering disciplines.

Note, for those who have been following the comments on my blog posts: the IOP explicitly states “Do not just say ‘This result is wrong’ but say why it is wrong…” and “be professional and polite in your report”. So I hope we, as commentators, will be more professional in both our comments and their focus. Thanks.

In this post I will address what is not taught in colleges. There are three things to look out for when reading or writing a paper: Explicit and Implicit Axioms, Mathematical Construction versus Mathematical Conjecture, and finally Concepts and Logical Flow. In this first part I discuss Explicit and Implicit Axioms.

This may sound silly, but 1+1 = 2 is not an axiom. Alfred North Whitehead and Bertrand Russell proved in Principia Mathematica that 1+1 equals 2. The immense success of modern civilization compared with all previous civilizations is due to the quiet encroachment of mathematical rigor into our daily lives by nameless, faceless scientists, engineers and technicians. Now that is something to ponder. If we lose that rigor we lose our society. We can discuss economic and political theory, but without this mathematical rigor nothing else works.
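As an aside on that rigor: today the same proposition can be checked mechanically. In Lean 4, for instance, 1 + 1 = 2 is provable directly from the definitions of the natural numbers:

```lean
-- Both sides reduce to the same numeral, so reflexivity suffices.
example : 1 + 1 = 2 := rfl
```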

Any theoretical work is based on axioms. For example, in Euclidean geometry one assumes that surfaces are flat, such that the sum of the angles of a triangle adds to 180º. In Riemannian geometry this is not the case. Explicit axioms are those stated in the paper.

Implicit axioms are axioms that are taken for granted to be true and therefore not stated, or considered too trivial to be mentioned. More often than not, the author is not aware that he or she is using an implicit axiom.

For example, ‘mass causes a gravitational field’ is an implicit axiom, as we cannot, with our current theoretical foundations or our current technologies, prove either way whether mass is or is not the source of a gravitational field. This axiom is also considered trivial, because what else could it be?

But wait, didn’t Einstein…? Yes, correct, he did…

Mass is a carryover from Newton. It shows how difficult it is to break from tradition, even when we are breaking from tradition! Since Newton figured out that mass was an excellent means (a “proxy”, to be technically rigorous) of determining gravitational acceleration in mathematical form, mass had to be the source. All tests of Einstein’s relativity test the field characteristics, not how the source creates the gravitational field.

But our understanding of the world has changed substantially since Newton and Einstein. We know that quarks are at the center of matter and are present in the same ‘amount’ as mass. So how does one tell the difference between quark interactions and mass as the gravitational source?

The importance of implicit axioms in particular, and axioms in general, is that when we recognize them we can change them, and so drive fundamental changes in theory and technology. I asked the questions: what is gravity modification, and how can we do it? These questions are at best vague, but they were as good a starting point as any. But life happens backwards. We get the answer, and only then do we recognize the precise question we were attempting to ask!

When I started researching gravity modification in 1999, I just had this sense that gravity modification should be possible in our lifetimes, but I did not know what the question was. Everything was vague and unclear at that time, but I was very strict about the scope of my investigation: I would deal only with velocity and acceleration.

I spent eight years searching, examining, discarding, testing and theorizing about anomalies, trying to get a handle on what gravity modification could be. Finally, in 2007, I started building numerical models of how gravitational acceleration could work in spacetime. In February 2008 I discovered g = τc², and at that moment I knew the question: can gravitational acceleration be described mathematically without knowing the mass of the planet or star?

So the implicit axiom that mass is required for gravitational acceleration is no longer valid, and because of that we now have propulsion physics.

If, in the spirit of the Kline Directive, you want to explore what others have not, and seek what others will not, my advice is that when you read a paper ask yourself, what are the implicit and explicit axioms in the paper?



The Kline Directive: Theoretical-Empirical Relationship (Part 4)



In this post I have updated the Interstellar Challenge Matrix (ICM) to guide us through the issues so that we can arrive at interstellar travel sooner, rather than later:

Interstellar Challenge Matrix (Partial Matrix)

Conventional Fuel Rockets. Relatively safe? Yes, but susceptible to human error. Theoretical-empirical relationship: known; the theoretical foundations are Engineering Feasible Theories and have been evolving since Robert Goddard launched the first liquid-fueled rocket in 1926.

Antimatter Propulsion. Relatively safe? No: extensive gamma-ray production (Carl Sagan), and the issue is how one protects the Earth; capable of an End of Humanity (EOH) event. Theoretical-empirical relationship: dependent on Millennium Theories; John Eades states in no uncertain terms that antimatter is impossible to handle and to create in quantity.

Atomic Bomb Pulse Detonation. Relatively safe? No, because (per Project Orion) one would need to manage between 300,000 and 30,000,000 atomic bombs per trip. Theoretical-empirical relationship: known, and based on Engineering Feasible Theories.

Time Travel. Relatively safe? Do not know; depends on how safely exotic matter can be contained. Theoretical-empirical relationship: dependent on a Millennium Theory; exotic-matter hypotheses are untested, and there is no experimental evidence that Nature allows a breakdown in causality.

String / Quantum Foam Based Propulsion. Relatively safe? Do not know; depends on how safely exotic matter can be contained. Theoretical-empirical relationship: dependent on a Millennium Theory; string theories have not been experimentally verified, exotic-matter hypotheses are untested, and the existence of quantum foam is now suspect (Robert Nemiroff).

Small Black Hole Propulsion. Relatively safe? No; capable of an End of Humanity (EOH) event. Theoretical-empirical relationship: we do not know whether small black holes actually exist in Nature; their theoretical basis should be considered a Millennium Theory.

It is quite obvious that the major impediments to interstellar travel are the Millennium Theories. Let us review. Richard Feynman (Nobel Prize 1965) and Sheldon Lee Glashow (Nobel Prize 1979) criticized string theory for not providing novel experimental predictions at accessible energy scales, while other theoretical physicists (Stephen Hawking, Edward Witten, Juan Maldacena and Leonard Susskind) believe that string theory is a step toward the correct fundamental description of nature. The Wikipedia article String Theory gives a good overview and notes other critics and criticisms of string theories. In What is String Theory? Alberto Güijosa explains why string theories have come to dominate theoretical physics: it is about forces, and especially about unifying gravity with the other three forces.

Note that strings expand when their energy increases, but the experimental evidence (the Lorentz-FitzGerald transformations) tells us that everything contracts with velocity, i.e. as energy is increased.

In my opinion, the heady rush to a theory of everything is misguided, because there is at least one question physics has not answered that is more fundamental than strings and particles: what is probability, and how is it implemented in Nature?

Probabilities are more fundamental than particles, as particles exhibit non-linear spatial probabilistic behavior. So how can one build a theory of everything on a complex structure (particles) if it cannot explain something substantially more fundamental (probabilities) than that structure? The logic defies me.

We can ask more fundamental questions. Is this probability really a Gaussian function? Experimental data suggest otherwise: a Var-Gamma distribution. Why is the force experienced by an electron moving in a magnetic field orthogonal to both the electron’s velocity and the magnetic field? Contemporary electromagnetism just says it is a vector cross product, i.e. it is just that way. The cross product is another way of saying it has to be a left-hand rule or a right-hand rule. But why?
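The orthogonality that the cross product enforces is easy to verify numerically. The sketch below (the field and velocity values are illustrative, not taken from any experiment) computes the Lorentz force F = q(v × B) for an electron and checks that it is perpendicular to both v and B:

```python
# Lorentz force F = q (v x B): the cross product makes F orthogonal
# to both the velocity v and the magnetic field B.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

q = -1.602e-19           # electron charge, C
v = (2.0e6, 0.0, 0.0)    # illustrative velocity, m/s
B = (0.0, 0.5, 0.0)      # illustrative magnetic field, T

F = tuple(q * component for component in cross(v, B))
print(dot(F, v), dot(F, B))   # both zero: F is orthogonal to v and B
```

The mathematics guarantees the orthogonality, but, as the post asks, it does not explain why Nature implements it.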

Is mass really the source of a gravitational field? Could it not be due to quark interactions? Can we devise experiments that distinguish between the two? Why do photons exhibit both wave and particle behavior? What is momentum, and why is it conserved? Why are mass and energy equivalent?

Can theoretical physicists construct theories without using the laws of conservation of mass-energy and momentum? That would be a real test for a theory of everything!

In my research into gravity modification I found that the massless formula for gravitational acceleration, g = τc², works for gravity, electromagnetism and mechanical forces: a unification of gravity and electromagnetism. This formula has been tested and verified against experimental data. Further, a force field is a Non-Inertia (Ni) field, present wherever there is a spatial gradient in time dilation or velocity. This is very different from the Standard Model, which requires that forces be transmitted by the exchange of virtual particles.

So if there is an alternative model that has united gravity and electromagnetism, what does that say about both string theories and the Standard Model? I raise these questions because they are opportunities to kick-start research in a different direction. I answered two of these questions in my book. In the spirit of the Kline Directive, can we use these questions to explore what others have not, to seek what others will not, and to change what others dare not?

That is why I’m confident that we will have real working gravity modification technologies by 2020.

In concluding this section we need to figure out funding rules to ensure that Engineering Feasible and 100-Year Theories get first priority. That is the only way we are going to be able to refocus our physics community to achieve interstellar travel sooner rather than later.



The Kline Directive: Theoretical-Empirical Relationship (Part 3)



In Part 1, we learned that Einstein was phenomenally successful because his work was deeply meshed with the experimental evidence of the day. In Part 2, we learned that to be successful at developing new useful theories and discovering new fundamental properties of Nature that will bring forth new interstellar travel technologies, we need to avoid hypotheses that are not grounded in experimental data, as these are purely mathematical conjectures.

In my book on gravity modification I classified physics hypotheses and theories into three categories, as follows:

A. Type 1: The Millennium Theories
These are theories that would require more than 100 years, and up to 1,000 years, to prove or disprove. They are mathematically correct but inscrutable to physically verifiable experiments, even in the distant future.

String and quantum gravity theories fall into this category. Why? If we cannot even figure out how to engineer-modify 4-dimensional spacetime, how are we going to engineer-modify a 5-, 6-, 9-, 11- or 23-dimensional universe?

How long would it take to modify gravity using string theories? Prof. Michio Kaku, in his April 2008 Space Show interview, suggested several hundred years. Dr. Eric Davis, in his G4TV interview, suggested more than 100 years, maybe 200. So, by their own admission, these are rightly Millennium Theories. It should be noted that Richard Feynman (Nobel Prize 1965) and Sheldon Lee Glashow (Nobel Prize 1979) were against string theory, but their opinions did not prevail.

Even hypotheses that conjecture time travel should be classified as Millennium Theories because they require ‘exotic’ matter. John Eades, a retired CERN senior scientist, in his article Antimatter Pseudoscience, states in no uncertain terms that antimatter is impossible to handle and create in real quantities. Then what about exotic matter?

For that matter, any hypothesis that requires antimatter or exotic matter should be classified as a Millennium Theory.

B. Type 2: The 100-Year Theories
These are theories that show promise of being verified with technologies that would require several decades to engineer, test and prove.

These types of theories do not lend themselves to an immediate engineering solution. The engineering solution is theoretically feasible but a working experiment or technology is some decades away, because the experimental or physical implementation is not fully understood.

Note there is a gap here: we do not have 100-Year Theories in our repertoire of physical theories to keep the pipeline supplied with new and different ways to test the physical Universe.

C. Type 3: The Engineering Feasible Theories
These are theories that lend themselves to an engineering solution, today. They are falsifiable today, with our current engineering technologies. They can be tested and verified in the laboratory if one knows what to test for and how to test for these experimental observations.

Today, Relativity falls into this category, because we have the engineering sophistication to test Einstein’s theory, and it has been vindicated time and time again. But there is a very big ‘but’: Relativity cannot give us gravity modification or new propulsion theories, because it requires mass. We need to stand on Einstein’s shoulders to take the next step forward.

Therefore, if we are to become an interstellar civilization, in the spirit of the Kline Directive, we need to actively seek out and explore physics in such a manner as to bring forth Engineering Feasible and 100-Year Theories.

We need to ask ourselves, what can we do, to migrate the theoretical physics research away from Theory of Everything research to the new field of propulsion physics? Gravity modification is an example of propulsion physics. Here is the definition of gravity modification, from my book:

“Gravity modification is defined as the modification of the strength and/or direction of the gravitational acceleration without the use of mass as the primary source of this modification, in local space time. It consists of field modulation and field vectoring. Field modulation is the ability to attenuate or amplify a force field. Field vectoring is the ability to change the direction of this force field.”

Note that by this definition, which requires no mass, relativity, quantum mechanics and string theories cannot be used to theorize propulsion physics; hence the urgent need to find genuinely new ways in physics to achieve interstellar travel.

Can we get to this new physics? To answer this question, let me quote Dr. Andrew Beckwith, astrophysicist, Ph.D. (Condensed Matter Theory), who wrote the Foreword to my book:

“I believe that Quantum Mechanics is an embedded artifact of a higher level deterministic theory, i.e. much in the same vein as G. ’t Hooft, the Nobel prize winner. In this sense, what Benjamin has done is to give a first order approximation as to what Quantum Mechanics is actually a part of, which may in its own way shed much needed understanding of the foundations of Quantum Mechanics well beyond the ‘Pilot model’ of DICE 2010 fame (this is a conference on the foundations of Quantum Mechanics and its extension given once every two years in Pisa, Italy, organized by Thomas Elze).”

Why does Dr. Andrew Beckwith reference quantum mechanics in a book on gravity modification?

Because my investigation into gravity modification led me to the conclusion that gravitational acceleration is independent of the internal structure of the particle. It does not matter whether the particle consists of other particles, strings, pebbles or rocks. This led me to ask: so what is the internal structure of a photon? I found that the photon probability is not Gaussian but a new distribution, Var-Gamma. Therefore I believe Robert Nemiroff’s three-photon observation will be vindicated by other physicist-researchers sifting through NASA’s gamma-ray burst archives.




From Part 1 … “that mathematics has become so sophisticated and so very successful that it can now be used to prove anything and everything, and therefore, the loss of certainty that mathematics will provide reasonability in guidance and correctness in answers to our questions in the sciences”.

We need to note that there are several different relationships between the mathematics of physics and the physics of the real world.

The first, and most common, relationship is that several different types of equations in physics describe the same physics of the world. Gravity is a great example. The three mathematical theories of gravity are relativity, quantum theories and string theories. All three model the same single physical phenomenon, gravitational fields. So if one is correct, then the other two must be wrong. All three cannot be correct. So which is it?

Just for argument’s sake, there is another alternative: all three are wrong. But wait, didn’t all those experiments and observations prove that General Relativity is correct? Remember, for argument’s sake, that showing something fits the experimental observations does not mean that is how Nature works. That is why theoretical physicists spend so much time, money and effort considering alternatives like quantum and string theories.

The second relationship is that different mathematical descriptions can be ascribed to different parts of a physical phenomenon. For example, Einstein’s General Relativity describes spacetime using tensor calculus, a very complex mathematical model that he did not get right on his first attempt. General Relativity addresses the question of gravity’s source with an energy-momentum tensor. To put it simply, these equations are complex.

Whereas in my work, I realized at some point during my investigation into gravity modification that, to develop technologies which could modify gravity, we needed a mathematical equation describing the phenomenon of gravitational acceleration without needing to include mass. I discovered this equation, g = τc², after very extensive numerical modeling of gravitational acceleration in spacetime, where tau, τ, is the change in time dilation divided by the change in distance (for more, look up my Physics Essays paper, “Gravitational Acceleration Without Mass And Noninertia Fields”). Consider how elegantly simple this equation is; without mass, we can now replace the source with something more technology-friendly.
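To illustrate the definition of τ, one can sketch a quick numerical check in the weak-field limit (this sketch uses the standard gravitational time-dilation factor from relativity as an assumption, not the author’s own numerical models): differencing the time-dilation factor over a small radial step gives τ, and multiplying by c² recovers the familiar near-Earth acceleration.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of Earth, kg
c = 2.998e8     # speed of light, m/s
r = 6.371e6     # radius of Earth, m

def time_dilation(radius):
    # Standard (Schwarzschild) gravitational time-dilation factor
    return math.sqrt(1.0 - 2.0 * G * M / (radius * c**2))

# tau: change in time dilation divided by change in distance
dr = 1.0e4  # 10 km radial step
tau = (time_dilation(r + dr) - time_dilation(r)) / dr

g = tau * c**2
print(g)    # ~9.8 m/s^2
```

The result agrees with Newton’s GM/r² to within the coarseness of the finite-difference step, even though no mass appears in g = τc² itself; the mass enters only through the time-dilation profile used to generate τ.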

And the third type of relationship is mathematics of physics that cannot, or cannot yet, be verified with experimental evidence. String theories are great examples of this. From what I know, there is nothing in the string theories (that has not been borrowed from quantum mechanics) that is experimentally verifiable. And yet we go on. Why?

Consider this. The experimental evidence shows that nothing with mass can be accelerated past the velocity of light (per the Lorentz-Fitzgerald transformations), and yet Dr. Eric Davis agrees with Dr. Richard Obousy that, using string/quantum theory, the maximum velocity one can reach is 10³² × c (100,000,000,000,000,000,000,000,000,000,000 times the velocity of light). Now which would you believe, experimental evidence or mathematical conjecture?
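To see why the experimental side of this comparison is so compelling, here is a short sketch of the Lorentz factor, γ = 1/√(1 − v²/c²), which governs how the energy cost of acceleration grows with speed. The sample speeds are arbitrary illustrations:

```python
import math

def lorentz_gamma(v_frac):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_frac**2)

# Kinetic energy scales with (gamma - 1) m c^2, so the energy required
# diverges as v approaches c -- this is the experimental wall.
for v in (0.5, 0.9, 0.99, 0.9999):
    print(f"v = {v}c -> gamma = {lorentz_gamma(v):.2f}")
```

The factor, and with it the energy requirement, grows without bound as v approaches c, which is why no massive object has ever been observed to cross that boundary.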

Now, do you agree that mathematics has become so sophisticated and so very successful that it can be used to prove anything and everything, and that it has therefore lost the certainty needed to provide reasonable guidance and correct answers to our questions in the sciences?

Don’t get me wrong. Mathematics is vital for the progress of the sciences, but it needs to be tempered with real-world experimental evidence; otherwise it is just conjecture, and it retards our search for interstellar travel technologies.

Previous post in the Kline Directive series.

Next post in the Kline Directive series.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.

The Kline Directive: Economic Viability


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In this post I will explore Economic Viability. I have proposed the Interstellar Challenge Matrix (ICM) to guide us through the issues so that we can arrive at interstellar travel sooner, rather than later. Let us review the cost estimates of the various star drives, just to reach a velocity of 0.1c, as detailed in previous blog posts:

Interstellar Challenge Matrix (Partial Matrix)

Propulsion Mechanism | Legal? | Cost Estimate
Conventional Fuel Rockets | Yes | Greater than US$1.19E+14
Antimatter Propulsion | Do not know | Between US$1.25E+20 and US$6.25E+21
Atomic Bomb Pulse Detonation | Illegal (as of 1963, per the Partial Test Ban Treaty) | Between US$2.6E+12 and US$25.6E+12; these are the original Project Orion costs converted to 2012 dollars. Requires anywhere between 300,000 and 30,000,000 bombs!
Time Travel | Do not know | Requires exotic matter, therefore greater than the antimatter propulsion cost of US$1.25E+20
Quantum Foam Based Propulsion | Do not know | Requires exotic matter, therefore greater than the antimatter propulsion cost of US$1.25E+20
Small Black Hole Propulsion | Most probably illegal in the future | Using CERN to estimate: at least US$9E+9 per annual budget. CERN was founded 58 years ago, in 1954, so a guesstimate of the total expenditure required to reach its current technological standing is US$1.4E+11

Note: the atomic bomb numbers were updated on 10/18/2012 after Robert Steinhaus commented that the cost estimates “are excessively high and unrealistic”. I researched the topic and found that Project Orion details costs of US$2.6E+12 to US$25.6E+12, which are worse than my estimates.
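To put the matrix in perspective, here is a rough sketch comparing some of the estimates above against 2012 gross world product (taken as roughly US$7.3E+13, an outside figure that is an assumption, not part of the matrix itself):

```python
# Express the matrix's cost estimates as multiples of 2012 gross world
# product. GWP_2012 is an assumed outside figure (~US$73 trillion).
GWP_2012 = 7.3e13

estimates = {  # lower bounds taken from the matrix above
    "Small black hole propulsion": 1.4e11,
    "Atomic bomb pulse detonation": 2.6e12,
    "Conventional fuel rockets": 1.19e14,
    "Antimatter propulsion": 1.25e20,
}

for name, cost in sorted(estimates.items(), key=lambda kv: kv[1]):
    print(f"{name}: {cost / GWP_2012:.2g} x gross world product")
```

Even the cheapest legal option exceeds the entire planet's annual economic output, which is the point of the next sentence.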

These costs are humongous. The Everly Brothers said it best.

Let’s step back and ask ourselves the question: is this the tool kit we have to achieve interstellar travel? Are we serious? Is this why DARPA, the organization that funds many strange projects, said it will take more than 100 years? Are we not interested in doing something sooner? What happened to the spirit of the Kline Directive?

From a space exploration perspective, economic viability is a strange criterion. It is not physics, nor is it engineering, and until recently the space exploration community has been government funded to the point where realistic cost accountability is nonexistent.

Don’t get me wrong. This is not about agreeing to a payment scheme and providing the services as contracted. Government contractors have learned to do that very well. It is about standing on your own two feet, on a purely technology driven commercial basis. This is not an accounting problem, and accountants and CFOs cannot solve this. They would have no idea where to start. This is a physics and engineering problem that shows up as an economic viability problem that only physicists and engineers can solve.

The physics, materials, technology and manufacturing capabilities have evolved so much that companies like Planetary Resources, SpaceX, Orbital Sciences Corp, Virgin Galactic, and the Ad Astra Rocket Company are changing this economic viability equation. This is the spirit of the Kline Directive: to seek out what others would not.

So I ask the question: who among you physicists and engineers would like to be engaged in this type of endeavor?

But first, let us learn a lesson from history to figure out what it takes. Take, for example, DARPA’s funding of gallium arsenide research. “One of DARPA’s lesser known accomplishments, semiconductor gallium arsenide received a push from a $600-million computer research program in the mid-1980s. Although more costly than silicon, the material has become central to wireless communications chips in everything from cellphones to satellites, thanks to its high electron mobility, which lets it work at higher frequencies.”

In the 1990s gallium arsenide semiconductors were so expensive that, by comparison, “silicon wafers could be considered free”. But before you jump in and say that is where current interstellar propulsion theories are, you need to note one more important factor.

The gallium arsenide technology had a commercially proven parallel technology in place: silicon semiconductor technology. None of our interstellar propulsion ideas has anything comparable to a commercially successful parallel technology. (Have I forgotten conventional rockets? Really?) A guesstimate, in today’s dollars, of what it would cost to develop interstellar travel propulsion, given that we already had a commercially proven parallel technology, would be $1 billion, and DARPA would be the first in line to attempt this.

Given our theoretical physics and our current technological feasibility, this cost analysis would suggest that we require about 10 major technological innovations, each building on the other, before interstellar travel becomes feasible.

That is a very big step, almost like reaching out to eternity. No wonder Prof. Adam Frank, in his July 24, 2012 New York Times Op-Ed “Alone in the Void”, wrote: “Short of a scientific miracle of the kind that has never occurred, our future history for millenniums will be played out on Earth”.

Therefore, we need to communicate to the theoretical physics community that they need to get off the Theory of Everything locomotive and refocus on propulsion physics. In a later blog posting I will complete the Interstellar Challenge Matrix (ICM). Please use it to converse with your physicist colleagues and friends about the need to focus on propulsion physics.

In the spirit of the Kline Directive — bold, explore, seek & change — can we identify the 10 major technological innovations? Wouldn’t that keep you awake at night at the possibility of new unthinkable inventions that will take man where no man has gone before?

PS. I was going to name the Interstellar Challenge Matrix (ICM), the Feasibility Matrix for Interstellar Travel (FMIT), then I realized that it would not catch on at MIT, and decided to stay with ICM.


—————————————————————————————————


Congratulations, skydiver Felix Baumgartner, on the success of your 24-mile skydive. You proved that it is possible to bail out of a spaceship and land on Earth safely.

The records are nice to have but the engineering was superb!

To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In this post I will explore Safety Awareness.

In the heady rush to propose academically acceptable ideas about new propulsion systems or star drives, it is very easy to overlook safety considerations. The eminent cosmologist Carl Sagan said it best: “So the problem is not to shield the payload, the problem is to shield the earth” (Planet. Space Sci., pp. 485–498, 1963).

It is perfectly acceptable, if not warranted, to propose these technologically infeasible star drives based on antimatter and exotic matter as academic exercises, because we need to understand what is possible and why. However, we need to inform the public of the safety issues when doing so.

I do not understand how any physicist or propulsion engineer, in his or her right mind, would not qualify an academic exercise in antimatter propulsion or star drives with a statement similar to Carl Sagan’s. At the very least it gets someone else thinking about those safety problems, so that we can arrive at a solution sooner, if one exists.

We note that the distinguished Carl Sagan did not shy away from safety issues. He was mindful of the consequences and is an example of someone pushing the limits of safety awareness in the spirit of the Kline Directive, to explore issues which others would (could?) not.

We have to ask ourselves: how did we regress from Sagan’s “let us consider all ancillary issues” to our current “let us ignore all ancillary issues”? The inference I am forced to draw is that Carl Sagan was a one-man team, while the rest of us lesser beings need to come together as multi-person teams to stay on track to achieve interstellar travel.

In interstellar and interplanetary space there are two parts to safety: radiation shielding and projectile shielding. Radiation shielding is about protection from x-rays and gamma rays. Projectile shielding is about protection from the physical damage caused by small-particle collisions.

I may be wrong, but I haven’t come across anyone even attempting to address either problem. I’ve heard of strategies such as using very strong electric fields, or even millions of tons of metal shielding, but these are not realistic. I’ve heard of the need to address these issues, but nothing more.

Safety is a big issue that has not been addressed. So how are we going to solve this? What do we need to explore that others have not? What do we need to seek that others would not? What do we need to change that others dare not?


—————————————————————————————————


At first glance, one might consider the proposition of a base on Mercury, the planet closest to our Sun, ludicrous. With daytime temperatures reaching 700 K (hot enough to melt lead) while the night side of the planet averages 110 K (far colder than anywhere on Earth), combined with the lack of any substantial atmosphere and its position deep in the Sun’s gravitational potential well, conditions seem unfavorable.

First impressions can be misleading, however. It is well known that the polar areas do not experience the extreme daily variation in temperature, with temperatures in a more habitable range (below 273 K (0 °C)), and it is anticipated that there may even be deposits of ice inside craters: http://nssdc.gsfc.nasa.gov/planetary/ice/ice_mercury.html

And it is not just habitable temperatures and water ice in its polar regions that make Mercury an interesting candidate for an industrial base. There are a number of other factors making it more favourable than either a lunar or a Martian base:

Mercury is the second densest planet in our solar system, just slightly less dense than Earth, and is rich in resources, with the highest concentrations of many valuable minerals of any surface in the Solar System, in highly concentrated ores. Being the closest planet to the Sun, Mercury also has vast amounts of solar power available, and there are predictions that Mercury’s soil may contain large amounts of helium-3, which could become an important source of clean nuclear fusion energy on Earth and a driver for the future economy of the Solar System. It is therefore a strong candidate for an industrial base.

Ticking other boxes: the gravity on the surface of Mercury is more than twice that of the Moon and very close to the surface gravity of Mars. Since there is evidence of human health problems associated with extended exposure to low gravity, from this point of view Mercury might be more attractive for long-term human habitation than the Moon. Mercury also has the additional advantage of a magnetic field protecting it from cosmic rays and solar storms.
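The gravity claim is easy to check from first principles with g = GM/r². The masses and radii below are standard reference values assumed for this sketch, not figures from this post:

```python
# Back-of-envelope check of the surface-gravity comparison using g = GM / r^2.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

bodies = {                      # (mass in kg, mean radius in m)
    "Mercury": (3.301e23, 2.440e6),
    "Moon":    (7.342e22, 1.737e6),
    "Mars":    (6.417e23, 3.390e6),
}

g = {name: G * m / r**2 for name, (m, r) in bodies.items()}
for name, val in g.items():
    print(f"{name}: {val:.2f} m/s^2")
# Mercury's surface gravity comes out more than twice the Moon's and
# within a few percent of Mars's, as stated above.
```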

In fact, this idea is not a new one. Back in the 1980s, C. R. Pellegrino proposed covering Mercury with solar power farms and transferring some of the resulting energy into a form useful for propulsion for interstellar travel. When one looks at the options available for our first steps into space, Mercury offers us another.