
To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on five fronts: Legal Standing, Safety Awareness, Economic Viability, Theoretical-Empirical Relationships, and Technological Feasibility.

My apologies to my readers for the long break since my last post of Nov 19, 2012. I write the quarterly economic report for a Colorado bank’s Board of Directors. Based on my quarterly reports to the Board, I gave a talk on the US Economy, Are We Good Stewards?, to about 35 business executives at a TiE Rockies’ Business for Breakfast event. The talk was originally scheduled for Dec 14 but was moved forward to Nov 30 because the original speaker could not make the time commitment for that day. There was a lot to prepare, and I am very glad to say it was very well received. For my readers who are interested, here is the link to a pdf copy of my slides to Are We Good Stewards?

Now back to interstellar physics and the Kline Directive. Let’s recap.

In my last four posts, (2c), (2d), (2e) & (2f), I identified four major errors taught in contemporary physics. First, to be consistent (2c) with the Lorentz-Fitzgerald transformations and the Special Theory of Relativity, elementary particles must contract as their energy increases. This is antithetical to string theories, and it explains why string theories are becoming more and more complex without discovering new, empirically verifiable fundamental laws of Nature.
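For reference, the contraction being appealed to here is the standard Lorentz-Fitzgerald length contraction, L = L0·√(1 − v²/c²); since kinetic energy rises with speed, “contracts as energy increases” follows. A minimal sketch (textbook special relativity, not the author’s own result):

```python
import math

def contracted_length(L0, v, c=299_792_458.0):
    """Lorentz-Fitzgerald contraction: measured length shrinks with speed."""
    return L0 * math.sqrt(1 - (v / c) ** 2)

c = 299_792_458.0  # speed of light, m/s
for beta in (0.0, 0.5, 0.9, 0.99):
    # A 1 m rod at 99% of c measures only ~0.141 m along its direction of motion.
    print(f"v = {beta:.2f}c  ->  L = {contracted_length(1.0, beta * c):.3f} m")
```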

Second, (2d) again to be consistent with Lorentz-Fitzgerald and the Special Theory of Relativity, a photon’s wave function cannot have length. It must be infinitesimally thin, of zero length. Therefore, this wave function necessarily has to be the part of the photon’s disturbance of spacetime that is non-moving, just as a garden rake moving under a rug creates the appearance that the bulge, the wave-function-like envelope, is moving.

Third, exotic matter, negative mass in particular, converts the General Theory of Relativity into perpetual motion physics (sacrilege!) and therefore cannot exist in Nature. Fourth, the baking bread model (2e) of the Universe is incorrect, as our observations of the Milky Way necessarily point to the baking bread model not being ‘isoacentric’.

Einstein (2f) had described the Universe as an expanding 4-dimensional surface of a sphere (E4DSS) in one of his talks, to explain why the number of galaxies looks the same in every direction we look. If Einstein is correct, then time travel theories are not, as an expanding surface would necessarily require that the 4-dimensional Universe we know does not exist inside the expanding sphere, and therefore we cannot return to a past. And we cannot head to a future, because that surface has not yet happened. Therefore, first, the law of conservation of mass-energy holds, as nothing is mysteriously added by timelines. And second, causality paradoxes cannot occur in Nature. Note there is a distinction between temporal reversibility and time travel.

In this E4DSS model, wormholes would not cause time travel but would connect us to other parts of the Universe, tunneling through the inside of the sphere from one part of the surface to another. So the real problem for theoretical physics is: how does one create wormholes without using exotic matter?

Previous post in the Kline Directive series.

Next post in the Kline Directive series.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.


The 100,000 Stars Google Chrome Galactic Visualization Experiment Thingy

So, Google has these things called Chrome Experiments, and they like, you know, do that. 100,000 Stars, their latest, simulates our immediate galactic zip code and provides detailed information on many of the massive nuclear fireballs nearby.

Zoom in & out of the interactive galaxy: state, city, neighborhood, so to speak.

It’s humbling, beautiful, and awesome. Now, is 100,000 Stars perfectly accurate and practical for anything other than having something pretty to look at and explore and educate and remind us of the enormity of our quaint little galaxy among the likely 170 billion others? Well, no — not really. But if you really feel the need to evaluate it that way, you are an unimaginative jerk and your life is without joy and awe and hope and wonder and you probably have irritable bowel syndrome. Deservedly.

The New Innovation Paradigm Kinda Revisited
Just about exactly one year ago technosnark cudgel Anthrobotic.com was rapping about the changing innovation paradigm in large-scale technological development. There’s chastisement for Neil deGrasse Tyson and others who, paraphrasically (totally a word), have declared that private companies won’t take big risks, won’t do bold stuff, won’t push the boundaries of scientific exploration because of bottom lines and restrictive boards and such. But new business entities like Google, SpaceX, Virgin Galactic, & Planetary Resources are kind of steadily proving this wrong.

Google in particular, a company whose U.S. ad revenue now eclipses that of all other ad-based businesses combined, does a load of search-unrelated, interesting little and not-so-little research. Their mad scientists have churned out innovative, if sometimes impractical, projects like Wave, Lively, and Sketchup. There’s the mysterious Project X, rumored to be filled with robots and space elevators and probably endless lollipops as well. There’s Project Glass, the self-driving cars, and they have also just launched Ingress, a global augmented reality game.

In contemporary America, this is what cutting-edge, massively well-funded pure science is beginning to look like, and it’s commendable. So, in lieu of a national flag, would we be okay with a SpaceX visitor center on the moon? Come on, really — a flag is just a logo anyway!

Let’s hope Google keeps not being evil.

[VIA PC MAG]
[100,000 STARS ANNOUNCEMENT — CHROME BLOG]

(this post originally published at www.anthrobotic.com)


In this post I explain two more mistakes in physics. The first is 55 years old, and should have been caught long ago.

Bondi, in his 1957 paper “Negative Mass in General Relativity”, suggested that mass could be negative, and that there are surprising results from this possibility. I quote,

“… the positive body will attract the negative one (since all bodies are attracted by it), while the negative body will repel the positive body (since all bodies are repelled by it). If the motion is confined to the line of centers, then one would expect the pair to move off with uniform acceleration …”

As a theoretician, Bondi required that the motion be “confined to the line of centers”, that is, confined to a straight line. However, as experimental physicists we would take a quantity of negative mass and another quantity of positive mass and place them in special containers attached to two spokes. These spokes form a small arc at one end and are fixed to the axis of a generator at the other end. Let go, and watch Bondi’s uniform straight-line acceleration be translated into circular motion driving a generator. Lo and behold, we have a perpetual motion machine generating free electricity!
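Bondi’s runaway pair is easy to caricature numerically. Below is a toy Newtonian sketch (not General Relativity, and the negative mass is purely hypothetical): the positive body is repelled by the negative one, the negative body is attracted to the positive one, and the pair self-accelerates with constant separation.

```python
def accel(m_other, x_self, x_other, G=1.0):
    """Newtonian acceleration of one body due to the other; the sign of
    m_other decides whether the pull is attractive or repulsive."""
    r = x_other - x_self
    return G * m_other * r / abs(r) ** 3

# Hypothetical masses in arbitrary units.
m_pos, m_neg = 1.0, -1.0
x_pos, x_neg = 1.0, 0.0          # positive body placed ahead of the negative one
v_pos, v_neg = 0.0, 0.0
dt = 1e-3

for _ in range(10_000):          # integrate 10 time units with Euler steps
    a_pos = accel(m_neg, x_pos, x_neg)   # repelled by the negative mass
    a_neg = accel(m_pos, x_neg, x_pos)   # attracted to the positive mass
    v_pos += a_pos * dt
    v_neg += a_neg * dt
    x_pos += v_pos * dt
    x_neg += v_neg * dt

# Both bodies accelerate in the same direction with their separation
# unchanged: Bondi's uniformly accelerating "runaway" pair.
print(v_pos, v_neg, x_pos - x_neg)
```

Hook those equal, never-ending accelerations to the spokes described above and you have the post’s free-electricity generator.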

Wow! A perpetual motion machine hiding in plain sight in the respectable physics literature, and nobody caught it. What is really bad about this is that Einstein’s General Relativity allows for this type of physics, and therefore in General Relativity this is real. So was Bondi wrong, or does General Relativity permit perpetual motion physics? If Bondi is wrong, then could Alcubierre be wrong too, as his metric requires negative mass?

Perpetual motion is sacrilege in contemporary physics, and therefore negative mass cannot exist; negative mass is in the realm of mathematical conjecture. What really surprised me was that General Relativity allows for negative mass, at least in Bondi’s treatment of General Relativity.

This raises the question, what other problems in contemporary physics do we have hiding in plain sight?

There are two types of exotic matter that I know of: the first is negative mass, per Bondi (above), and the second is imaginary (square root of −1) mass. The recent flurry of activity over the possibility that some European physicists had observed FTL (faster-than-light) neutrinos should also teach us some lessons.

If a particle is traveling faster than light, its mass becomes imaginary. This means that such particles could not be detected by ordinary, plain and simple mass-based instruments. So what were these physicists thinking? That somehow the Lorentz-Fitzgerald transformations were no longer valid? That mass would not convert into imaginary matter at FTL? It turned out that their measurements were incorrect. It just goes to show how difficult experimental physics can get, and these experimental physicists are not given the recognition due to them for the degree of difficulty of their work.
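The “imaginary mass” statement is just the Lorentz factor pushed past c: in m = m0/√(1 − v²/c²) the square root goes imaginary when v > c. A quick sketch using Python’s complex math:

```python
import cmath

def relativistic_mass(m0, v, c=299_792_458.0):
    """m0 / sqrt(1 - v^2/c^2); cmath returns a complex result when v > c."""
    return m0 / cmath.sqrt(1 - (v / c) ** 2)

c = 299_792_458.0
subluminal = relativistic_mass(1.0, 0.8 * c)     # real part ~1.667, no imaginary part
superluminal = relativistic_mass(1.0, 1.25 * c)  # purely imaginary
print(subluminal, superluminal)
```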

So what type of exotic matter was Dr. Harold White of NASA’s In-Space Propulsion program proposing in his presentation at the 2012 100-Year Starship Symposium? Both Alcubierre and White require exotic matter. Specifically, Bondi’s negative mass. But I’ve shown that negative mass cannot exist as it results in perpetual motion machines. Inference? We know that this is not technologically feasible.

That is, any hypothesis that requires exotic negative mass cannot be correct. This includes time travel.




In this post on technological feasibility, I point to some more mistakes in physics, so that we are aware of the types of mistakes we are making. This, I hope, will facilitate the changes required in our understanding of the physics of the Universe, and thereby speed up the discovery of the new physics required for interstellar travel.

The scientific community recognizes two alternative models for force. Note I use the term recognizes because that is how science progresses. This is necessarily different from the concept of how Nature operates, or Nature’s method of operation. Nature has a method of operating that is consistent with all of Nature’s phenomena, known and unknown.

If we are willing to admit that we don’t know all of Nature’s phenomena — our knowledge is incomplete — then it is only logical that our recognition of Nature’s method of operation is always incomplete. Therefore, scientists propose theories on Nature’s methods, and as science progresses we revise our theories. This leads to the inference that our theories can never be an exact representation of Nature’s methods, because our knowledge is incomplete. We can come close, but we can never be sure ‘we got it’.

With this understanding that our knowledge is incomplete, we can now proceed. The scientific community recognizes two alternative models for force: Einstein’s spacetime continuum, and quantum mechanics’ exchange of virtual particles. String theory borrows from quantum mechanics and therefore requires that force be carried by some form of particle.

Einstein’s spacetime continuum requires only 4 dimensions, though other physicists have added more to attempt a unification of forces. String theories have required up to 23 dimensions to solve their equations.

However, the discovery of the empirically validated g=τc² proves, once and for all, that gravity and gravitational acceleration are a 4-dimensional problem. Therefore, any hypothesis or theory that requires more than 4 dimensions to explain gravitational force is wrong.

Further, I have been able to do a priori what no other theory has been able to do: unify gravity and electromagnetism. Again working with only 4 dimensions, using spacetime continuum-like, empirically verified Non Inertia (Ni) Fields proves that non-nuclear forces are not carried by the exchange of virtual particles. And if non-nuclear forces are not carried by the exchange of virtual particles, why should Nature suddenly change her method of operation and be different for nuclear forces? Virtual particles are mathematical conjectures that were a convenient mathematical approach in the context of the Standard Model.

Sure, there is always that ‘smart’ theoretical physicist who will convert a continuum-like field into a particle-based field, but a particle-continuum duality does not answer the question: what is Nature’s method? So we come back to a previous question: is the particle-continuum duality a mathematical conjecture or a mathematical construction? Also note, now that we know of g=τc², it is not a discovery if other hypotheses or theories claim to show or reconstruct g=τc² a posteriori; that is known as back-fitting.

Our theoretical physicists have to ask themselves many questions. Are they trying to show how smart they are? Or are they trying to figure out Nature’s methods? How much back fitting can they keep doing before they acknowledge that enough is enough? Could there be a different theoretical effort that could be more fruitful?

The other problem with string theories is that they don’t converge to a single set of descriptions of the Universe; they diverge. The more they are studied, the more variations and versions are discovered. The reason for this is very clear: string theories are based on incorrect axioms. The primary incorrect axiom is that particles expand when their energy is increased.

The empirical Lorentz-Fitzgerald transformations require that length contracts as velocity increases. However, the eminent Roger Penrose showed in the 1950s that macro objects elongate as they fall into a gravitational field. The portion of the macro body closer to the gravitational source falls at a slightly greater velocity than the portion further away, and therefore the macro body elongates. This effect is termed tidal gravity.
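Tidal stretching is straightforward to quantify in the Newtonian picture: the near end of a falling body feels a slightly stronger pull than the far end, and the difference is approximately 2GMΔr/r³. A small sketch with illustrative numbers for a 10 m body falling toward Earth:

```python
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24     # mass of the Earth, kg
r = 7.0e6        # distance of the body's centre from Earth's centre, m
L = 10.0         # body length along the fall line, m

# Exact difference in pull between the near and far ends of the body.
a_near = G * M / (r - L / 2) ** 2
a_far = G * M / (r + L / 2) ** 2
tidal_exact = a_near - a_far

# Standard first-order tidal approximation: 2 G M L / r^3.
tidal_approx = 2 * G * M * L / r ** 3

print(tidal_exact, tidal_approx)   # both ~2.3e-5 m/s^2: the body is stretched
```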

In reality, as particles contract in length per Lorentz-Fitzgerald, the distance between these particles elongates due to tidal gravity. This macro expansion has been carried into theoretical physics at the elementary level of string particles, as the notion that the particles themselves elongate, which is incorrect. That is, even theoretical physicists make mistakes.

Expect string theories to be dead by 2017.




…here’s Tom with the Weather.
That right there is comedian/philosopher Bill Hicks, sadly no longer with us. One imagines he would be pleased and completely unsurprised to learn that serious scientific minds are considering and actually finding support for the theory that our reality could be a kind of simulation. That means, for example, a string of daisy-chained IBM Super-Deep-Blue Gene Quantum Watson computers from 2042 could be running a History of the Universe program, and depending on your solipsistic preferences, either you are or we are the character(s).

It’s been in the news a lot of late, but — no way, right?

Because dude, I’m totally real
Despite being utterly unable to even begin thinking about how to consider what real even means, the everyday average rational person would probably assign this to the sovereign realm of unemployable philosophy majors or under the Whatever, Who Cares? or Oh, That’s Interesting I Gotta Go Now! categories. Okay fine, but on the other side of the intellectual coin, vis-à-vis recent technological advancement, of late it’s actually being seriously considered by serious people using big words they’ve learned at endless college whilst collecting letters after their names and doin’ research and writin’ and gettin’ association memberships and such.

So… why now?

Well, basically, it’s getting hard to ignore.
It’s not a new topic, it’s been hammered by philosophy and religion since like, thought happened. But now it’s getting some actual real science to stir things up. And it’s complicated, occasionally obtuse stuff — theories are spread out across various disciplines, and no one’s really keeping a decent flowchart.

So, what follows is an effort to encapsulate these ideas, and that’s daunting — it’s incredibly difficult to focus on writing when you’re wondering if you really have fingers or eyes. Along with links to some articles with links to some papers, what follows is Anthrobotic’s CliffsNotes on the intersection of physics, computer science, probability, and evidence for/against reality being real (and how that all brings us back to well, God).
You know, light fare.

First — Maybe we know how the universe works: Fantastically simplified, as our understanding deepens, it appears more and more the case that, in a manner of speaking, the universe sort of “computes” itself based on the principles of quantum mechanics. Right now, humanity’s fastest and sexiest supercomputers can simulate only extremely tiny fractions of the natural universe as we understand it (contrasted to the macro-scale inferential Bolshoi Simulation). But of course we all know the brute power of our computational technology is increasing dramatically like every few seconds, and even awesomer, we are learning how to build quantum computers, machines that calculate based on the underlying principles of existence in our universe — this could thrust the game into superdrive. So, given ever-accelerating computing power, and given that we can already simulate tiny fractions of the universe, you logically have to consider the possibility: If the universe works in a way we can exactly simulate, and we give it a shot, then relatively speaking what we make ceases to be a simulation, i.e., we’ve effectively created a new reality, a new universe (ummm… God?). So, the question is how do we know that we haven’t already done that? Or, otherwise stated: what if our eventual ability to create perfect reality simulations with computers is itself a simulation being created by a computer? Well, we can’t answer this — we can’t know. Unless…
[New Scientist’s Special Reality Issue]
[D-Wave’s Quantum Computer]
[Possible Large-scale Quantum Computing]

Second — Maybe we see it working: The universe seems to be metaphorically “pixelated.” This means that even though it’s a 50 billion trillion gajillion megapixel JPEG, if we juice the zooming-in and drill down farther and farther and farther, we’ll eventually see a bunch of discrete chunks of matter, or quantums, as the kids call them — these are the so-called pixels of the universe. Additionally, a team of lab coats at the University of Bonn think they might have a workable theory describing the underlying lattice, or existential re-bar in the foundation of observable reality (upon which the “pixels” would be arranged). All this implies, in a way, that the universe is both designed and finite (uh-oh, getting closer to the God issue). Even at ferociously complex levels, something finite can be measured and calculated and can, with sufficiently hardcore computers, be simulated very, very well. This guy Rich Terrile, a pretty serious NASA scientist, cites the pixelation thingy and poses a video game analogy: think of any first-person shooter — you cannot immerse your perspective into the entirety of the game, you can only interact with what is in your bubble of perception, and everywhere you go there is an underlying structure to the environment. Kinda sounds like, you know, life — right? So, what if the human brain is really just the greatest virtual reality engine ever conceived, and your character, your life, is merely a program wandering around a massively open game map, playing… well, you?
[Lattice Theory from the U of Bonn]
[NASA guy Rich Terrile at Vice]
[Kurzweil AI’s Technical Take on Terrile]

Thirdly — Turns out there’s a reasonable likelihood: While the above discussions on the physical properties of matter and our ability to one day copy & paste the universe are intriguing, it also turns out there’s a much simpler and straightforward issue to consider: there’s this annoyingly simplistic yet valid thought exercise posited by Swedish philosopher/economist/futurist Nick Bostrom, a dude way smarter than most humans. Basically he says we’ve got three options: 1. Civilizations destroy themselves before reaching a level of technological prowess necessary to simulate the universe; 2. Advanced civilizations couldn’t give two shits about simulating our primitive minds; or 3. Reality is a simulation. Sure, a decent probability, but sounds way oversimplified, right?
Well go read it. Doing so might ruin your day, JSYK.
[Summary of Bostrom’s Simulation Hypothesis]
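Bostrom’s trilemma is, at bottom, a counting argument: if even a small fraction of civilizations each run many ancestor simulations, simulated minds vastly outnumber unsimulated ones. A toy sketch with made-up numbers (the function and figures are illustrative, not Bostrom’s own notation):

```python
def simulated_fraction(f_sim, n_sims):
    """Share of all minds that are simulated, if a fraction f_sim of
    civilizations each run n_sims simulations the size of one real history."""
    return (f_sim * n_sims) / (f_sim * n_sims + 1)

# Even if only 1% of civilizations ever simulate, a million runs each
# makes almost every mind a simulated one.
print(simulated_fraction(0.01, 1_000_000))   # ~0.9999
```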

Lastly — Data against is lacking: Any idea how much evidence or objective justification we have for the standard, accepted-without-question notion that reality is like, you know… real, or whatever? None. Zero. Of course the absence of evidence proves nothing, but given that we do have decent theories on how/why simulation theory is feasible, it follows that blithely accepting that reality is not a simulation is an intrinsically more radical position. Why would a thinking being think that? Just because they know it’s true? Believing 100% without question that you are a verifiably physical, corporeal, technology-wielding carbon-based organic primate is a massive leap of completely unjustified faith.
Oh, Jesus. So to speak.

If we really consider simulation theory, we must of course ask: who built the first one? And was it even an original? Is it really just turtles all the way down, Professor Hawking?

Okay, okay — that means it’s God time now
Now let’s see, what’s that other thing in human life that, based on a wild leap of faith, gets an equally monumental evidentiary pass? Well, proving or disproving the existence of god is effectively the same quandary posed by simulation theory, but with one caveat: we actually do have some decent scientific observations and theories and probabilities supporting simulation theory. That whole God phenomenon is pretty much hearsay, anecdotal at best. However, very interestingly, rather than negating it, simulation theory actually represents a kind of back-door validation of creationism. Here’s the simple logic:

If humans can simulate a universe, humans are its creator.
Accept the fact that linear time is a construct.
The process repeats infinitely.
We’ll build the next one.
The loop is closed.

God is us.

Heretical speculation on iteration
Ever wonder why older polytheistic religions involved the gods just kinda setting guidelines for behavior, and they didn’t necessarily demand the love and complete & total devotion of humans? Maybe those universes were 1st-gen or beta products. You know, like it used to take a team of geeks to run the building-sized ENIAC, the first universe simulations required a whole host of creators who could make some general rules but just couldn’t manage every single little detail.

Now, the newer religions tend to be monotheistic, and god wants you to love him and only him and no one else and dedicate your life to him. But just make sure to follow his rules, and take comfort that you’re right and everyone else is completely hosed and going to hell. The modern versions of god, both omnipotent and omniscient, seem more like super-lonely cosmically powerful cat ladies who will delete your ass if you don’t behave yourself and love them in just the right way. So, the newer universes are probably run as a background app on the iPhone 26, and managed by… individuals. Perhaps individuals of questionable character.

The home game:
Latest title for the 2042 XBOX-Watson³ Quantum PlayStation Cube:*
Crappy 1993 graphic design simulation: 100% Effective!

*Manufacturer assumes no responsibility for inherently emergent anomalies, useless
inventions by game characters, or evolutionary cul de sacs including but not limited to:
The duck-billed platypus, hippies, meat in a can, reality TV, the TSA,
mayonnaise, Sony VAIO products, natto, fundamentalist religious idiots,
people who don’t like homos, singers under 21, hangovers, coffee made
from cat shit, passionfruit iced tea, and the pacific garbage patch.

And hey, if true, it’s not exactly bad news
All these ideas are merely hypotheses, and for most humans the practical or theoretical proof or disproof would probably result in the same indifferent shrug. For those of us who like to rub a few brain cells together from time to time, attempting both to understand the fundamental nature of our reality/simulation, and to guess at whether or not we too might someday be capable of simulating ourselves, well — these are some goddamn profound ideas.

So, no need for hand wringing — let’s get on with our character arc and/or real lives. While simulation theory definitely causes reflexive revulsion, “just a simulation” isn’t necessarily pejorative. Sure, if we take a look at the current state of our own computer simulations and A.I. constructs, it is rather insulting. So if we truly are living in a simulation, you gotta give it up to the creator(s), because it’s a goddamn amazing piece of technological achievement.

Addendum: if this still isn’t sinking in, the brilliant
Dinosaur Comics might do a better job explaining:

(This post originally published I think like two days ago at technosnark hub www.anthrobotic.com.)

The Kline Directive: Theoretical-Empirical Relationship (Part 4)



In this post I have updated the Interstellar Challenge Matrix (ICM) to guide us through the issues, so that we can arrive at interstellar travel sooner rather than later:

Interstellar Challenge Matrix (Partial Matrix)

Propulsion Mechanism: Conventional Fuel Rockets
Relatively safe? Yes, but susceptible to human error.
Theoretical-empirical relationship? Known. Theoretical foundations are based on Engineering Feasible Theories, and have been evolving since Robert Goddard invented the first liquid-fueled rocket in 1926.

Propulsion Mechanism: Antimatter Propulsion
Relatively safe? No. Extensive gamma ray production (Carl Sagan). The issue is how does one protect the Earth? Capable of an End of Humanity (EOH) event.
Theoretical-empirical relationship? Dependent on Millennium Theories. John Eades states in no uncertain terms that antimatter is impossible to handle and create.

Propulsion Mechanism: Atomic Bomb Pulse Detonation
Relatively safe? No, because (Project Orion) one needs to be able to manage between 300,000 and 30,000,000 atomic bombs per trip.
Theoretical-empirical relationship? Known, and based on Engineering Feasible Theories.

Propulsion Mechanism: Time Travel
Relatively safe? Do not know. Depends on how safely exotic matter can be contained.
Theoretical-empirical relationship? Dependent on a Millennium Theory. Exotic matter hypotheses are untested. No experimental evidence to show that Nature allows for a breakdown in causality.

Propulsion Mechanism: String / Quantum Foam Based Propulsion
Relatively safe? Do not know. Depends on how safely exotic matter can be contained.
Theoretical-empirical relationship? Dependent on a Millennium Theory. String theories have not been experimentally verified. Exotic matter hypotheses are untested. Existence of Quantum Foam is now suspect (Robert Nemiroff).

Propulsion Mechanism: Small Black Hole Propulsion
Relatively safe? No. Capable of an End of Humanity (EOH) event.
Theoretical-empirical relationship? Don’t know if small black holes really do exist in Nature. Their theoretical basis should be considered a Millennium Theory.

It is quite obvious that the major impediments to interstellar travel are the Millennium Theories. Let us review. Richard Feynman (Nobel Prize 1965) & Sheldon Lee Glashow (Nobel Prize 1979) have criticized string theory for not providing novel experimental predictions at accessible energy scales, but other theoretical physicists (Stephen Hawking, Edward Witten, Juan Maldacena and Leonard Susskind) believe that string theory is a step towards the correct fundamental description of nature. The Wikipedia article String Theory gives a good overview, and notes other critics and criticisms of string theories. In What is String Theory? Alberto Güijosa explains why string theories have come to dominate theoretical physics. It is about forces, and especially about unifying gravity with the other three forces.

Note: strings expand when their energy increases, but the experimental evidence, the Lorentz-Fitzgerald transformations, tells us that everything contracts with velocity, i.e., as energy is increased.

In my opinion, the heady rush to a theory of everything is misguided, because there is at least one question that physics has not answered that is more fundamental than strings and particles. What is probability and how is it implemented in Nature?

Probabilities are more fundamental than particles as particles exhibit non-linear spatial probabilistic behavior. So how can one build a theory of everything on a complex structure (particles), if it cannot explain something substantially more fundamental (probabilities) than this complex structure? The logic defies me.

We can ask more fundamental questions. Is this probability really a Gaussian function? Experimental data suggests otherwise: a Var-Gamma distribution. Why is the force experienced by an electron moving in a magnetic field orthogonal to both the electron’s velocity and the magnetic field? Contemporary electromagnetism just says it is a vector cross product, i.e., it is just that way. The cross product is another way of saying it has to follow a Left Hand Rule or a Right Hand Rule. But why?
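The orthogonality in question is built into the definition: F = qv × B is perpendicular to both v and B because a cross product is perpendicular to both of its factors. A quick check with illustrative numbers:

```python
def cross(a, b):
    """Right-hand-rule cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

q = -1.602e-19              # electron charge, C
v = (2.0e5, 3.0e4, 0.0)     # illustrative velocity, m/s
B = (0.0, 0.0, 1.5)         # illustrative magnetic field, T
F = tuple(q * component for component in cross(v, B))

# Both dot products vanish (to floating-point precision): the force is
# orthogonal to the velocity and to the field, by construction.
print(dot(F, v), dot(F, B))
```

The code states the rule the post is questioning; it does not, of course, answer the author’s “but why?”.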

Is mass really the source of a gravitational field? Could it not be due to quark interaction? Can we devise experiments that can distinguish between the two? Why do photons exhibit both wave and particle behavior? What is momentum, and why is it conserved? Why are mass and energy equivalent?

Can theoretical physicists construct theories without using the laws of conservation of mass-energy and momentum? That would be a real test for a theory of everything!

In my research into gravity modification I found that the massless formula for gravitational acceleration, g = τc², works for gravity, electromagnetism and mechanical forces. Yes, a unification of gravity and electromagnetism. This formula has been tested and verified against experimental data. Further, a force field is a Non-Inertia (Ni) field, present wherever there is a spatial gradient in time dilations or velocities. This is very different from the Standard Model, which requires that forces be transmitted by the exchange of virtual particles.

So if there is an alternative model that has unified gravity and electromagnetism, what does that say for both string theories and the Standard Model? I raise these questions because they are opportunities to kick-start research in a different direction. I answered two of these questions in my book. In the spirit of the Kline Directive, can we use these questions to explore what others have not, to seek what others will not, to change what others dare not?

That is why I’m confident that we will have real working gravity modification technologies by 2020.

In concluding this section we need to figure out funding rules to ensure that Engineering Feasible and 100-Year Theories get first priority. That is the only way we are going to be able to refocus our physics community to achieve interstellar travel sooner rather than later.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.

The Kline Directive: Theoretical-Empirical Relationship (Part 3)


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In Part 1, we learned that Einstein was phenomenally successful because his work was deeply meshed with the experimental evidence of the day. In Part 2, we learned that to be successful at developing new useful theories and discovering new fundamental properties of Nature that will bring forth new interstellar travel technologies, we need to avoid hypotheses that are not grounded in experimental data, as these are purely mathematical conjectures.

In my book on gravity modification I classified physics hypotheses and theories into 3 categories, as follows:

A. Type 1: The Millennium Theories
These are theories that would require more than 100 years, and up to 1,000 years, to prove or disprove. They may be mathematically correct, but they cannot be tested with physically verifiable experiments, even in the distant future.

String and quantum gravity theories fall into this category. Why? If we cannot even figure out how to engineer-modify 4-dimensional spacetime, how are we going to engineer-modify a 5-, 6-, 9-, 11- or 26-dimensional universe?

How long would it take using string theories to modify gravity? Prof. Michio Kaku in his April 2008 Space Show interview had suggested several hundred years. Dr. Eric Davis in his G4TV interview had suggested more than 100 years maybe 200 years. So rightly, by their own admission these are Millennium Theories. It should be noted that Richard Feynman (Nobel Prize 1965) & Sheldon Lee Glashow (Nobel Prize 1979) were against string theory, but their opinions did not prevail.

Even hypotheses that conjecture time travel should be classified as Millennium Theories because they require ‘exotic’ matter. John Eades, a retired CERN senior scientist, in his article Antimatter Pseudoscience, states in no uncertain terms that antimatter is impossible to handle and create in real quantities. Then what about exotic matter?
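Eades's point about real quantities is easy to make concrete. Matter-antimatter annihilation converts the entire rest mass of both partners to energy, E = 2mc², so even trivially small amounts carry weapon-scale energies; a back-of-envelope sketch (constants are standard values, the TNT conversion is the conventional 4.184 MJ/kg):

```python
C = 299_792_458.0        # speed of light, m/s
TNT_J_PER_KG = 4.184e6   # joules per kg of TNT equivalent

def annihilation_energy(antimatter_kg):
    """Energy released annihilating m kg of antimatter with m kg of matter: E = 2 m c^2."""
    return 2.0 * antimatter_kg * C * C

for m in (1e-9, 1e-3):   # a nanogram, then a gram
    e = annihilation_energy(m)
    print(f"{m:.0e} kg of antimatter -> {e:.2e} J (~{e / TNT_J_PER_KG:.2e} kg TNT)")
```

A single gram works out to roughly forty kilotons of TNT, which underlines both the handling problem and the safety problem of any antimatter drive.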

For that matter, any hypothesis that requires antimatter or exotic matter should be classified as a Millennium Theory.

B. Type 2: The 100-Year Theories
These are theories that show promise of being verified with technologies that would require several decades to engineer, test and prove.

These types of theories do not lend themselves to an immediate engineering solution. The engineering solution is theoretically feasible but a working experiment or technology is some decades away, because the experimental or physical implementation is not fully understood.

Note that there is a gap here. We do not have 100-Year Theories in our repertoire of physical theories to keep the pipeline supplied with new and different ways to test the physical Universe.

C. Type 3: The Engineering Feasible Theories
These are theories that lend themselves to an engineering solution, today. They are falsifiable today, with our current engineering technologies. They can be tested and verified in the laboratory if one knows what to test for and how to test for these experimental observations.

Today, Relativity falls into this category because we have the engineering sophistication to test Einstein's theory, and it has been vindicated time and time again. But there is a very big 'but': Relativity cannot give us gravity modification or new propulsion theories, because it requires mass. We need to stand on Einstein's shoulders to take the next step forward.

Therefore, if we are to become an interstellar civilization, in the spirit of the Kline Directive, we need to actively seek out and explore physics in such a manner as to bring forth Engineering Feasible and 100-Year Theories.

We need to ask ourselves, what can we do, to migrate the theoretical physics research away from Theory of Everything research to the new field of propulsion physics? Gravity modification is an example of propulsion physics. Here is the definition of gravity modification, from my book:

“Gravity modification is defined as the modification of the strength and/or direction of the gravitational acceleration without the use of mass as the primary source of this modification, in local space time. It consists of field modulation and field vectoring. Field modulation is the ability to attenuate or amplify a force field. Field vectoring is the ability to change the direction of this force field.”

Note that by this definition, which requires no mass, relativity, quantum mechanics and string theories cannot be used to theorize propulsion physics. Hence the urgent need to find genuinely new ways in physics to achieve interstellar travel.

Can we get there? The new physics? To answer this question let me quote Dr. Andrew Beckwith, Astrophysicist, Ph.D.(Condensed Matter Theory) who wrote the Foreword to my book:

“I believe that Quantum Mechanics is an embedded artifact of a higher level deterministic theory, i.e. much in the same vein as G. 't Hooft, the Nobel prize winner. In this sense, what Benjamin has done is to give a first order approximation as to what Quantum Mechanics is actually a part of, which may in its own way shed much needed understanding of the foundations of Quantum Mechanics well beyond the ‘Pilot model’ of DICE 2010 fame (this is a conference on the foundations of Quantum Mechanics and its extension given once every two years in Pisa, Italy, organized by Thomas Elze).”

Why does Dr. Andrew Beckwith reference quantum mechanics in a book on gravity modification?

Because my investigation into gravity modification led me to the conclusion that gravitational acceleration is independent of the internal structure of the particle. It does not matter if the particle consists of other particles, strings, pebbles or rocks. This led me to ask the question: so what is the internal structure of a photon? I found that the photon probability is not Gaussian but a new distribution, Var-Gamma. Therefore I believe Robert Nemiroff's three-photon observation will be vindicated by other physicist-researchers sifting through NASA's archives for gamma-ray bursts.


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

From Part 1 … “that mathematics has become so sophisticated and so very successful that it can now be used to prove anything and everything, and therefore, the loss of certainty that mathematics will provide reasonability in guidance and correctness in answers to our questions in the sciences”.

We need to note that there are several different relationships between the mathematics of physics and the physics of the real world.

The first and most common type of relationship is that several different equations in physics describe the same physics of the real world. Gravity is a great example. The three mathematical theories of gravity are relativity, quantum theories and string theories. All three model the same single physical phenomenon, gravitational fields. So if one is correct, then the other two must be wrong. All three cannot be correct. So which is it?

Just for argument's sake, there is another alternative: all three are wrong. But wait, didn't all those experiments and observations prove that General Relativity is correct? Remember, for argument's sake, that proving that something fits the experimental observations does not mean that is how Nature works. That is why theoretical physicists spend so much time, money and effort considering alternatives like quantum and string theories.

The second relationship is that different mathematical descriptions can be ascribed to different parts of a physical phenomenon. For example, Einstein's General Relativity describes spacetime using tensor calculus, a very complex mathematical model which he did not get right on his first attempt. General Relativity addresses the question of gravity's source with an energy-momentum tensor. To put it simply, these equations are complex.

In my own work, I realized at some point during my investigation into gravity modification that, to develop technologies that could modify gravity, we needed a mathematical equation that would describe the phenomenon of gravitational acceleration without needing to include mass. I discovered this equation, g = τc², after very extensive numerical modeling of gravitational accelerations in spacetime, where tau, τ, is the change in time dilation divided by the change in distance (for more, see my Physics Essays paper, “Gravitational Acceleration Without Mass And Noninertia Fields”). Consider how elegantly simple this equation is; without mass, we can now replace the source with something more technology-friendly.
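Taking τ as the spatial gradient of the time-dilation factor, g = τc² can at least be sanity-checked against the standard weak-field limit of gravity, where the dilation factor at radius r is approximately 1 − GM/(rc²). A finite-difference sketch for the Earth's surface follows; this is my reading of the definition above, checked only for dimensional and numerical consistency with Newtonian gravity, not a verification of the paper's wider claims:

```python
# Weak-field sanity check of g = tau * c^2, with tau the spatial gradient of
# the time-dilation factor ~ 1 - GM/(r c^2). We track the small deficit
# GM/(r c^2) directly to avoid floating-point cancellation against the 1.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of Earth, kg
R = 6.371e6     # mean radius of Earth, m
c = 2.998e8     # speed of light, m/s

def deficit(r):
    """1 - (time-dilation factor) at radius r, weak-field approximation."""
    return G * M / (r * c * c)

dr  = 1.0                                    # finite-difference step, metres
tau = (deficit(R) - deficit(R + dr)) / dr    # gradient of the dilation factor
g   = tau * c * c                            # g = tau * c^2

print(f"tau = {tau:.3e} per metre")
print(f"g = tau*c^2 = {g:.2f} m/s^2")        # ~9.8 m/s^2, Earth's surface gravity
```

The gradient recovers GM/r², i.e. the familiar 9.8 m/s², which is exactly what a massless reformulation must reproduce in the weak field.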

And the third type of relationship is the mathematics of physics that cannot, or cannot yet, be verified with experimental evidence. String theories are great examples of this. From what I know, there is nothing in the string theories (that has not been borrowed from quantum mechanics) that is experimentally verifiable. And yet we go on. Why?

Consider this. The experimental evidence proves that nothing with mass can be accelerated past the velocity of light (aka the Lorentz-Fitzgerald transformations), and yet Dr. Eric Davis agrees with Dr. Richard Obousy that, using string theory, the maximum velocity one can reach is 10³² × c (100,000,000,000,000,000,000,000,000,000,000 × the velocity of light). Now which would you believe, experimental evidence or mathematical conjecture?

Now, do you agree that mathematics has become so sophisticated and so very successful that it can be used to prove anything and everything, and that we have therefore lost the certainty that mathematics will provide reasonable guidance and correct answers to our questions in the sciences?

Don’t get me wrong. Mathematics is vital for the progress of the sciences, but it needs to be tempered with real world experimental evidence, otherwise it is just conjecture, and retards our search for interstellar travel technologies.


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In Part 1 of this post I will explore Theoretical-Empirical Relationship. Not theoretical relationships, not empirical relationships but theoretical-empirical relationships. To do this let us remind ourselves what the late Prof. Morris Kline was getting at in his book Mathematics: The Loss of Certainty, that mathematics has become so sophisticated and so very successful that it can now be used to prove anything and everything, and therefore, the loss of certainty that mathematics will provide reasonability in guidance and correctness in answers to our questions in the sciences.

The history of science shows that all three giants of science of their time, Robert Boyle, Isaac Newton and Christiaan Huygens, believed that light traveled in an aether medium, but by the end of the 19th century there was enough experimental evidence to show that aether could not be a valid concept. The primary experiment that changed our understanding of aether was the Michelson-Morley experiment of 1887, which showed once and for all that the aether did not have the properties required of a medium in which light travels.

Only after these experimental results were published did a then-unknown Albert Einstein invent the Special Theory of Relativity (SRT) in 1905. The important fact to note here is that Einstein did not invent SRT out of thin air, as many non-scientists and scientists today believe. He invented SRT by examining the experimental data and putting forward a hypothesis, described in mathematical form, for why the velocity of light is constant in every direction, independent of the direction of relative motion.

But he also had clues from others, namely George Francis FitzGerald (1889) and Hendrik Antoon Lorentz (1892), who postulated length contraction to explain the negative outcome of the Michelson-Morley experiment and to rescue the ‘stationary aether’ hypothesis. Today their work is named the Lorentz-Fitzgerald transformation.
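The Lorentz-FitzGerald contraction they postulated is simple to state quantitatively: a rod of rest length L₀ moving at speed v is measured as L = L₀·√(1 − v²/c²). A short illustrative sketch:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def contracted_length(rest_length, v):
    """Lorentz-FitzGerald contraction: L = L0 * sqrt(1 - v^2/c^2)."""
    if not 0 <= v < C:
        raise ValueError("speed must satisfy 0 <= v < c")
    return rest_length * math.sqrt(1.0 - (v / C) ** 2)

# A 1 m rod at increasing fractions of c:
for frac in (0.1, 0.5, 0.9, 0.99):
    print(f"v = {frac:.2f}c -> L = {contracted_length(1.0, frac * C):.4f} m")
```

The contraction is negligible at everyday speeds and only becomes dramatic near c, which is why the effect was inferred from interferometry rather than observed directly.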

So Einstein did not invent the Special Theory of Relativity (SRT) out of thin air, there was a body of knowledge and hypotheses already in the literature. What Einstein did do was to pull all this together in a consistent and uniform manner that led to further correct predictions of how the physics of the Universe works.

(Note: I know my history of science in certain fields of endeavor, and therefore use Wikipedia a lot, not as a primary reference, but as a starting point for the reader to take off for his/her own research.)


To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change not status quo, on 5 fronts:

1. Legal Standing. 2. Safety Awareness. 3. Economic Viability. 4. Theoretical-Empirical Relationship. 5. Technological Feasibility.

In this post I will explore Safety Awareness.

In the heady rush to propose academically acceptable ideas about new propulsion systems or star drives, it is very easy to overlook safety considerations. The eminent cosmologist Carl Sagan said it best: “So the problem is not to shield the payload, the problem is to shield the earth” (Planet. Space Sci., pp. 485–498, 1963).

It is perfectly acceptable, if not warranted, to propose these technologically infeasible star drives based on antimatter and exotic matter as academic exercises, because we need to understand what is possible and why. However, we need to inform the public of the safety issues when doing so.

I do not understand how any physicist or propulsion engineer, in his/her right mind, would not qualify their academic exercise in antimatter propulsion or star drives with a statement similar to Carl Sagan's. At the very least it gets someone else thinking about those safety problems, so that we can arrive at a solution sooner, if one exists.

We note that the distinguished Carl Sagan did not shy away from safety issues. He was mindful of the consequences and is an example of someone pushing the limits of safety awareness in the spirit of the Kline Directive, to explore issues which others would (could?) not.

We have to ask ourselves, how did we regress? From Sagan's ‘let us consider all ancillary issues’ to our current ‘let us ignore all ancillary issues’. The inference I am forced to draw is that Carl Sagan was a one-man team, while the rest of us lesser beings need to come together as multi-person teams to stay on track, to achieve interstellar travel.

In interstellar and interplanetary space there are two parts to safety: radiation shielding and projectile shielding. Radiation shielding is about shielding from x-rays and gamma rays. Projectile shielding is about protection from the physical damage caused by small-particle collisions.

I may be wrong, but I haven't come across anyone even attempting to address either problem. I've heard of strategies such as using very strong electric fields, or even millions of tons of metal shielding, but these are not realistic. I've heard of the need to address these issues, but nothing more.
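The scale of the projectile problem is easy to quantify: even a milligram dust grain carries enormous kinetic energy at interstellar cruise speeds. Using the relativistic kinetic energy E = (γ − 1)mc², a back-of-envelope sketch (not a shielding design):

```python
import math

C = 299_792_458.0        # speed of light, m/s
TNT_J_PER_KG = 4.184e6   # joules per kg of TNT equivalent

def kinetic_energy(mass_kg, v):
    """Relativistic kinetic energy E = (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * mass_kg * C * C

grain = 1e-6  # a 1 milligram dust grain, in kg
for frac in (0.01, 0.1, 0.5):
    e = kinetic_energy(grain, frac * C)
    print(f"v = {frac:.2f}c: E = {e:.2e} J (~{e / TNT_J_PER_KG:.1f} kg TNT)")
```

At a tenth of lightspeed a single milligram grain hits with the energy of roughly a hundred kilograms of TNT, which is why hand-waving about electric fields or massive armor does not settle the question.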

Safety is a big issue that has not been addressed. So how are we going to solve this? What do we need to explore that others have not? What do we need to seek that others would not? What do we need to change, that others dare not?
