To achieve interstellar travel, the Kline Directive instructs us to be bold, to explore what others have not, to seek what others will not, to change what others dare not. To extend the boundaries of our knowledge, to advocate new methods, techniques and research, to sponsor change, not status quo, on five fronts: Legal Standing, Safety Awareness, Economic Viability, Theoretical-Empirical Relationships, and Technological Feasibility.

There is one last mistake in physics that needs to be addressed: the baking bread model. To quote from the NASA page:

“The expanding raisin bread model at left illustrates why this proportion law is important. If every portion of the bread expands by the same amount in a given interval of time, then the raisins would recede from each other with exactly a Hubble type expansion law. In a given time interval, a nearby raisin would move relatively little, but a distant raisin would move relatively farther — and the same behavior would be seen from any raisin in the loaf. In other words, the Hubble law is just what one would expect for a homogeneous expanding universe, as predicted by the Big Bang theory. Moreover no raisin, or galaxy, occupies a special place in this universe — unless you get too close to the edge of the loaf where the analogy breaks down.”

Notice the two qualifications. The obvious one is “unless you get too close to the edge of the loaf where the analogy breaks down”. The second is that this description is correct only from the perspective of velocity. But there is a problem with this.

Look up in the night sky, and you can see the band of stars called the Milky Way. It helps if you are up in the Rocky Mountains, above 7,000 ft (2,133 m), away from the city lights. Dan Duriscoe produced one of the best pictures I have seen of our Milky Way, from Death Valley, California.

What do you notice?

I saw a very beautiful band of stars rising above the horizon, and one of my friends pointed to it and said “That is the Milky Way”. Wow! We could actually see our own galaxy from within.

Hint: the Earth is about halfway between the center of the Milky Way and the outer edge.

What do you notice?

We are not at the edge of the Milky Way; we are halfway inside it. So “unless you get too close to the edge of the loaf where the analogy breaks down” should not apply. Right?

Wrong. We are only halfway in, and yet we see the Milky Way severely constrained to a narrow band of stars. That is, if the baking bread model were correct, we would have to be far from the center of the Milky Way. This is not the case.

The Universe is on the order of 10³ to 10⁶ times larger. Using our Milky Way as an example, the Universe should look like a large smudge on one side of the sky and a small smudge on the other if we are even halfway out, and like two equally sized smudges if we are at the center of the Universe. More importantly, from the relative sizes of the smudges we could calculate our position with respect to the center of the Universe. But the Hubble pictures show that this is not the case. We do not see directional smudges, but a random and even distribution of galaxies across the sky in any direction we look.
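The smudge argument can be made quantitative with a toy Monte Carlo, a sketch under the simplifying assumption that galaxies fill a uniform, finite ball: an observer at the center counts roughly equal numbers of galaxies in opposite directions, while an observer halfway to the edge sees a strong asymmetry.

```python
import random

random.seed(0)

def hemisphere_counts(offset, n=200_000, radius=1.0):
    """Count 'galaxies' on either side of an observer sitting at
    (offset, 0, 0) inside a uniform ball of the given radius
    (rejection sampling from the enclosing cube)."""
    toward, away = 0, 0
    for _ in range(n):
        x = random.uniform(-radius, radius)
        y = random.uniform(-radius, radius)
        z = random.uniform(-radius, radius)
        if x * x + y * y + z * z > radius * radius:
            continue  # point fell outside the ball
        if x > offset:
            toward += 1
        elif x < offset:
            away += 1
    return toward, away

center = hemisphere_counts(0.0)  # observer at the center of the ball
edge = hemisphere_counts(0.5)    # observer halfway to the edge

print("center:", center, "halfway out:", edge)
```

At the center the two counts agree to within statistical noise; halfway out, the “away” count is several times the “toward” count, which is exactly the directional smudge we do not observe in the real sky.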

Therefore the baking bread model is an incorrect model of the Universe, and necessarily any theoretical model that depends on the baking bread structure of the Universe is incorrect.

We know that we are not at the center of the Universe. The Universe is not geocentric. Neither is it heliocentric. The Universe is such that anywhere we are in the Universe, the distribution of galaxies across the sky must be the same.

Einstein (TV series Cosmic Journey, Episode 11, “Is the Universe Infinite?”) once described an infinite Universe as being the surface of a finite sphere. If the Universe were the 3-dimensional surface of a 4-dimensional sphere, then all the galaxies would be expanding away from each other, from any perspective or any position on this surface. And, more importantly, unlike the baking bread model, one could not have a ‘center’ reference point on this surface. That is, the Universe would be ‘isoacentric’, and both the velocity property and the center property would hold simultaneously.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.

In this post I explain two more mistakes in physics. The first is 55 years old and should have been caught long ago.

Bondi, in his 1957 paper “Negative Mass in General Relativity”, suggested that mass could be negative, with surprising results. I quote:

“… the positive body will attract the negative one (since all bodies are attracted by it), while the negative body will repel the positive body (since all bodies are repelled by it). If the motion is confined to the line of centers, then one would expect the pair to move off with uniform acceleration …”

As a theoretician, Bondi required that the motion be “confined to the line of centers”, that is, confined to a straight line. However, as experimental physicists, we could take a quantity of negative mass and another quantity of positive mass and place them in special containers attached to two spokes. These spokes sweep a small arc at one end and are fixed to the axis of a generator at the other. Let go, and watch Bondi’s uniform straight-line acceleration be translated into circular motion driving the generator. Lo and behold, we have a perpetual motion machine generating free electricity!
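Bondi’s runaway pair can be sketched numerically. This is a Newtonian stand-in (Bondi’s analysis is in General Relativity, but the Newtonian analogue shows the same runaway motion), with units chosen so G = 1 and masses of +1 and −1 on the line of centers:

```python
# Toy Newtonian sketch of Bondi's negative-mass pair.
# The negative body chases the positive body: both accelerate in the same
# direction while their separation stays constant.
G = 1.0
m1, m2 = 1.0, -1.0       # gravitational and inertial mass of each body
x1, x2 = 0.0, 1.0        # positions on the line of centers
v1, v2 = 0.0, 0.0
dt, steps = 1e-3, 10_000

for _ in range(steps):
    r = x2 - x1                    # separation (stays positive here)
    f1 = G * m1 * m2 / (r * r)     # force on body 1 (negative: repelled)
    f2 = -f1                       # Newton's third law
    v1 += f1 / m1 * dt             # a = F / inertial mass
    v2 += f2 / m2 * dt
    x1 += v1 * dt
    x2 += v2 * dt

separation = x2 - x1
print(f"separation = {separation:.6f}, v1 = {v1:.3f}, v2 = {v2:.3f}")
```

Note that in this Newtonian sketch the total kinetic energy ½m₁v₁² + ½m₂v₂² remains zero throughout, because the negative mass contributes negative kinetic energy; whether that bookkeeping rescues the situation from being a free-energy machine is precisely the point of contention.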

Wow! A perpetual motion machine hiding in plain sight in the respectable physics literature, and nobody caught it. What is really troubling is that Einstein’s General Relativity allows for this type of physics, and therefore in General Relativity this is real. So was Bondi wrong, or does General Relativity permit perpetual motion physics? And if Bondi was wrong, could Alcubierre be wrong too, since his metric requires negative mass?

Perpetual motion is sacrilege in contemporary physics, and therefore negative mass could not exist; it remains in the realm of mathematical conjecture. What really surprised me was that General Relativity allows for negative mass, at least in Bondi’s treatment of it.

This raises the question, what other problems in contemporary physics do we have hiding in plain sight?

There are two types of exotic matter that I know of: the first is negative mass, per Bondi (above), and the second is imaginary (square root of −1) mass. The recent flurry of activity over the possibility that some European physicists had observed FTL (faster-than-light) neutrinos should also teach us some lessons.

If a particle is traveling faster than light, its mass becomes imaginary. This means that such particles could not be detected by ordinary mass-based instruments. So what were these physicists thinking? That somehow the Lorentz-Fitzgerald transformations were no longer valid? That mass would not become imaginary at FTL? It turned out that their measurements were incorrect. This just goes to show how difficult experimental physics can get, and these experimental physicists are not given the recognition due to them for the degree of difficulty of their work.
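The point about imaginary mass follows directly from the Lorentz factor; a quick sketch in natural units (c = 1):

```python
import cmath

def relativistic_mass(m0, v, c=1.0):
    """Relativistic mass m0 / sqrt(1 - v^2/c^2); imaginary when v > c."""
    return m0 / cmath.sqrt(1 - (v / c) ** 2)

sub = relativistic_mass(1.0, 0.8)   # v < c: real mass increase
ftl = relativistic_mass(1.0, 2.0)   # v > c: the mass is purely imaginary
print(sub, ftl)
```

For v = 0.8c the mass is a real 1.667 m₀; for v = 2c the result has no real part at all, which is why a superluminal particle would be invisible to mass-based instruments.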

So what type of exotic matter was Dr. Harold White of NASA’s In-Space Propulsion program proposing in his presentation at the 2012 100-Year Starship Symposium? Both Alcubierre’s and White’s proposals require exotic matter: specifically, Bondi’s negative mass. But I have shown that negative mass cannot exist, as it results in perpetual motion machines. The inference? We know that this is not technologically feasible.

That is, any hypothesis that requires exotic negative mass cannot be correct. This includes time travel.

—————————————————————————————————

In this post on technological feasibility, I point out some more mistakes in physics, so that we are aware of the type of mistakes we are making. This, I hope, will facilitate the changes required in our understanding of the physics of the Universe and thereby speed up the discovery of the new physics required for interstellar travel.

The scientific community recognizes two alternative models for force. Note that I use the term recognizes because that is how science progresses. This is necessarily different from how Nature actually operates, Nature’s method of operation. Nature has a method of operating that is consistent with all of Nature’s phenomena, known and unknown.

If we are willing to admit that we don’t know all of Nature’s phenomena, that our knowledge is incomplete, then it is only logical that our recognition of Nature’s method of operation is always incomplete. Therefore, scientists propose theories on Nature’s methods, and as science progresses we revise those theories. This leads to the inference that our theories can never be the exact representation of Nature’s methods, because our knowledge is incomplete. We can come close, but we can never be sure ‘we got it’.

With this understanding that our knowledge is incomplete, we can now proceed. The scientific community recognizes two alternative models for force: Einstein’s spacetime continuum, and quantum mechanics’ exchange of virtual particles. String theory borrows from quantum mechanics and therefore requires that force be carried by some form of particle.

Einstein’s spacetime continuum requires only 4 dimensions, though other physicists have added more in attempts to unify the forces. String theories have required up to 26 dimensions to solve their equations.

However, the discovery of the empirically validated g = τc² proves, once and for all, that gravity and gravitational acceleration are a 4-dimensional problem. Therefore, any hypothesis or theory that requires more than 4 dimensions to explain gravitational force is wrong.

Further, I have been able to do a priori what no other theory has been able to do: unify gravity and electromagnetism. Again working with only 4 dimensions, using spacetime continuum-like, empirically verified Non Inertia (Ni) Fields shows that non-nuclear forces are not carried by the exchange of virtual particles. And if non-nuclear forces are not carried by the exchange of virtual particles, why should Nature suddenly change her method of operation and be different for nuclear forces? Virtual particles are mathematical conjectures that were a convenient mathematical approach in the context of the Standard Model.

Sure, there is always that ‘smart’ theoretical physicist who will convert a continuum-like field into a particle-based field, but a particle-continuum duality does not answer the question: what is Nature’s method? So we come back to a previous question: is the particle-continuum duality a mathematical conjecture or a mathematical construction? Also note that, now that g = τc² is known, it is not a discovery by other hypotheses or theories if they claim to be able to show or reconstruct it a posteriori; that is known as back fitting.

Our theoretical physicists have to ask themselves many questions. Are they trying to show how smart they are? Or are they trying to figure out Nature’s methods? How much back fitting can they keep doing before they acknowledge that enough is enough? Could there be a different theoretical effort that could be more fruitful?

The other problem with string theories is that they don’t converge to a single description of the Universe; they diverge. The more they are studied, the more variations and versions are discovered. The reason for this is very clear: string theories are based on incorrect axioms. The primary incorrect axiom is that particles expand when their energy is increased.

The empirical Lorentz-Fitzgerald transformations require that length contract as velocity increases. However, the eminent Roger Penrose showed in the 1950s that macro objects elongate as they fall into a gravitational field. The portion of the macro body closer to the gravitational source falls a little faster than the portion further away, and therefore the macro body elongates. This effect is termed tidal gravity.

In reality, as particles contract in length per Lorentz-Fitzgerald, the distance between these particles elongates due to tidal gravity. This macro-level expansion has been carried into theoretical physics at the elementary level of string particles, as the claim that the particles themselves elongate, which is incorrect. That is, even theoretical physicists make mistakes.
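Penrose’s tidal elongation is easy to put numbers on. Here is a sketch for a 2 m body in free fall at the Earth’s surface, using standard constants; the first-order tidal formula Δa ≈ 2GML/r³ is textbook Newtonian gravity:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # mass of the Earth, kg
r = 6.371e6          # radius of the Earth, m
L = 2.0              # length of the falling body, m

# Exact difference in gravitational acceleration between the near and far ends
a_near = G * M / r**2
a_far = G * M / (r + L) ** 2
tidal_exact = a_near - a_far

# First-order tidal approximation: delta_a ~ 2 G M L / r^3
tidal_approx = 2 * G * M * L / r**3

print(tidal_exact, tidal_approx)
```

Both give about 6 × 10⁻⁶ m/s² across the 2 m body: tiny near Earth, but the same 1/r³ law is what spectacularly stretches objects falling toward compact sources.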

Expect string theories to be dead by 2017.

—————————————————————————————————

I was about to discuss the third of three concepts, but thought a look back would be appropriate at this time. In an earlier post I showed that the photon/particle wave function could not be part of the photon/particle itself, as this would violate the empirical Lorentz-Fitzgerald transformations and therefore Einstein’s Special Theory of Relativity. The wave function is only the photon/particle’s disturbance of the spacetime it is in, and this explains why photons/particles appear to have wave properties. They don’t. They disturb spacetime like a pebble dropped into a pond. The pond’s ripples are not the pebble.

In recent findings, Dr. Alberto Peruzzo of the University of Bristol (UK), the lead author of the paper, is quoted as saying: “The measurement apparatus detected strong nonlocality, which certified that the photon behaved simultaneously as a wave and a particle in our experiment. … This represents a strong refutation of models in which the photon is either a wave or a particle.” This is a very important finding and another step in the progress of science towards a better understanding of our Universe.

Those of you who have been following my blog posts will recognize that this is empirical validation, using a single-structure test, that wave and particle properties occur together. What is required next, to be empirically rigorous, is to either confirm or deny that this wave function is a spacetime disturbance. For that we require a dual-structure test.

If this wave function is a spacetime disturbance, then Einstein’s Special Theory of Relativity is upheld, and we would require a major rethink of quantum physics, the physics of elementary particles. If this wave function is not a spacetime disturbance but part of the particle structure, then there is an empirical exception to the Lorentz-Fitzgerald transformations and we would require a rethink of Einstein’s Special Theory of Relativity.

Here is a proposal for a dual-structure test (a test of two alternative hypotheses), which probably only an organization like CERN could execute: is it possible to disturb spacetime in such a manner as to exhibit the properties of a known particle, but with no mass present? That is, the underlying elementary particle is not present. I suppose other research institutions could attempt this, too. If successful, it would be a bigger discovery than that of Dr. Alberto Peruzzo and his team.

My money is on Lorentz-Fitzgerald and Einstein being correct, and I infer that the community of quantum and string theorists would not be happy at the possibility of this dual-structure test.

So I ask, in the spirit of the Kline Directive, can we as a community of physicists and engineers come together, to explore what others have not, to seek what others will not, to change what others dare not, to make interstellar travel a reality within our lifetimes?

—————————————————————————————————

In this post I discuss the second of three concepts that, if implemented, should speed up the rate of innovation and discovery so that we can achieve interstellar travel within a time frame of decades, not centuries. And I must warn you: this will probably upset some physicists.

One of the findings of my 12-year study was that gravitational acceleration is independent of the internal structure of a particle; hence the elegantly simple formula for gravitational acceleration, g = τc². This raised the question: what is the internal structure of a particle? For ‘normal’ matter, the Standard Model suggests that protons and neutrons consist of quarks or other mass-based particles. Electrons and photons are thought to be elementary.

I had a thought: a test for mass as the gravitational source. If ionized matter showed the same gravitational acceleration effects as non-ionized matter, then one could conclude that mass, not quark interaction, is the source of gravitational acceleration, because the different ionizations would have different electron mass but the same quark interaction. This would be a difficult test to do correctly, because electric field effects are much greater than gravitational effects.
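The difficulty that last sentence points to can be quantified with a back-of-the-envelope sketch using standard constants: the ratio of the electrostatic to the gravitational force between two protons is enormous, and the separation cancels out because both forces fall off as 1/r².

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
k = 8.988e9          # Coulomb constant, N m^2 C^-2
e = 1.602e-19        # elementary charge, C
m_p = 1.673e-27      # proton mass, kg

# Ratio of electrostatic to gravitational force between two protons;
# r cancels since both forces scale as 1/r^2.
ratio = (k * e * e) / (G * m_p * m_p)
print(f"{ratio:.2e}")
```

The ratio is about 10³⁶, which is why any stray charge in an ionized sample would utterly swamp the gravitational signal one is trying to measure.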

One could ask, what is the internal structure of a photon? The correct answer is that no one knows. Here is why. In electromagnetism, for radio antennas specifically, the energy inside the hollow antenna is zero. However, in quantum theory, specifically for light photons in a nanowire, the energy inside the nanowire increases towards its center. I’m not going to provide any references, as I am not criticizing any specific researcher. So which is it?

One could then ask: at what wavelength does this energy distribution change from zero (for radio waves) to an increase towards the center (for light photons)? Again, this is another example of the mathematics of physics providing correct answers while being inconsistent. So we don’t know.

To investigate further, I borrowed a proposal from two German physicists, I. V. Drozdov and A. A. Stahlhofen (“How long is a photon?”), who had suggested that a photon is about half a wavelength long. I thought, why stop there? What if it were an infinitely thin slice? Wait. What was that? An infinitely thin slice! That would be consistent with Einstein’s Special Theory of Relativity, since the Lorentz-Fitzgerald transformations dictate that anything traveling at the velocity of light must have a thickness of zero. But then, if the photon is indeed an infinitely thin pulse, why do we observe a wave function, which is inconsistent with the Special Theory of Relativity?

The only consistent answer I could come up with was that the wave function was the photon’s effect or the photon’s disturbance on spacetime, and not the photon itself.

Here is an analogy. Take a garden rake, turn it upside down, and place it under a carpet. Move it. What do you see? The carpet exhibits an envelope-like wave function that appears to be moving in the direction the garden rake is moving. But the envelope is not moving. It is a bulge that shows up wherever the garden rake is. The rake is moving, but not the envelope.

Similarly, the wave function is not moving; it is spread across the spacetime where the photon is. Now both are consistent with Einstein’s Special Theory of Relativity. Then why is the Standard Model successful? Because, just as the bulge is unique to the shape of the garden rake, the wave function disturbances of spacetime are unique to the properties of the photon and of the respective particles.

In my book, this proposed consistency with Special Theory of Relativity points to the existence of subspace, and a means to achieve interstellar travel.

There are a lot of inconsistencies in our physical theories, and we need to start addressing these inconsistencies if we are to achieve interstellar travel sooner rather than later.

—————————————————————————————————

In this post I discuss three concepts that, if implemented, should speed up the rate of innovation and discovery so that we can achieve interstellar travel within a time frame of decades, not centuries.

Okay, what I’m going to say will upset some physicists, but I need to say it, because we need to resolve some issues in physics to distinguish between mathematical construction and conjecture. Once we are on the road to mathematical construction, there is hope that this will eventually lead to technological feasibility. This post is drawn from my published paper “Gravitational Acceleration Without Mass And Noninertia Fields” in the peer-reviewed AIP journal Physics Essays, and from my book An Introduction to Gravity Modification.

The Universe is much more consistent than most of us (even physicists) suspect. Therefore, we can use this consistency to weed out mathematical conjecture from our collection of physical hypotheses. There are two sets of transformations that are observable. The first: in a gravitational field, at a point where the acceleration is a, compared to a location 0 at an infinite distance from the gravitational source, there exist non-linear transformations Γ(a) which state that time dilation ta/t0, length contraction x0/xa, and mass increase ma/m0 behave in a consistent manner such that:

ta/t0 = x0/xa = ma/m0 = 1/√(1 − 2GM/(rc²)) (1)

The second consistency is the Lorentz-Fitzgerald transformations Γ(v), which state that at a velocity v, compared to rest at 0, time dilation tv/t0, length contraction x0/xv, and mass increase mv/m0 behave in a consistent manner such that:

tv/t0 = x0/xv = mv/m0 = 1/√(1 − v²/c²) (2)

Now here is the surprise. The Universe is so consistent that if we use the non-linear transformation, equation (1), to calculate the free-fall velocity (from infinity) to a certain height above a planet’s or star’s surface, and its corresponding time dilation, we find that it is exactly what the Lorentz-Fitzgerald transformation, equation (2), requires. There is this previously undiscovered second level of consistency!

You won’t find this discovery in any physics textbook. Not yet, anyway. I published it in my 2011 peer-reviewed Physics Essays paper, “Gravitational Acceleration Without Mass And Noninertia Fields”.

Now let us think about this for a moment. What this says is that the Universe is so consistent that the linear velocity-time dilation relationship must be observable wherever velocity and time dilation are present, even in non-linear spacetime relationships where acceleration is present and altering the velocity, and therefore the time dilation.
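The consistency claim can be checked numerically. In this sketch I assume the gravitational transformation takes the standard weak-field (Schwarzschild) form 1/√(1 − 2GM/(rc²)); the free-fall velocity from infinity is then the classical escape velocity, and feeding it into the Lorentz-Fitzgerald factor reproduces the same number. Values here are for the surface of the Sun:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M = 1.989e30         # mass of the Sun, kg
r = 6.957e8          # solar radius, m

# Gravitational transformation at radius r (assumed Schwarzschild form)
gamma_grav = 1.0 / math.sqrt(1.0 - 2.0 * G * M / (r * c * c))

# Free-fall velocity from infinity down to r (classical escape velocity)
v = math.sqrt(2.0 * G * M / r)

# Lorentz-Fitzgerald transformation at that velocity
gamma_lorentz = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

print(gamma_grav, gamma_lorentz)
```

The two factors agree because 2GM/r is exactly v² for free fall from infinity, which is the algebraic content of the consistency being described.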

Or to put it differently, wherever Γ(a) is present, the space, time, velocity and acceleration relationships must allow for Γ(v) to be present in a correct and consistent manner. When I discovered this I said, wow! Why? Because we now have a means of differentiating hypothetical-theoretical gravitational fields, which are mathematical conjectures, from natural-theoretical gravitational fields, which are correct mathematical constructions.

That is, we can test the various quantum gravity and string hypotheses, and any of the tensor metrics! Einstein’s tensor metrics should be correct, but from a propulsion perspective there is something more interesting: the Alcubierre tensor metrics. Alcubierre was the first, using General Relativity, to propose the theoretical possibility of warp speed (note: not how to engineer it). Alcubierre’s work is very sophisticated, but the concept is elegantly simple: one can wrap a spacecraft in gravitational-type deformed spacetime to get it to ‘fall’ in the direction of travel.

The concept suggests that both equations (1) and (2) are no longer valid, as the relative velocity between the outer edge of the spacetime warp and an external observer is at c, the velocity of light, or greater (one needs to do the math to get the correct answer). Even at an acceleration of 1g, and assuming that this craft has eventually reached c, equations (1) and (2) are no longer consistent. Therefore, my inference is that the Alcubierre metric allows for zero time dilation within the warp, but not velocities greater than the velocity of light. It is therefore also doubtful that Dr. Richard Obousy’s hypothesis, that it is possible to achieve velocities of 1E30c with a quantum string version of the Alcubierre warp drive, is correct.
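As a quick sanity check on the 1g scenario, here is a naive Newtonian estimate of how long constant 1g acceleration takes to reach the speed of light (ignoring relativistic velocity addition, which in reality prevents ever reaching c):

```python
c = 2.998e8          # speed of light, m/s
g = 9.81             # 1 g of acceleration, m/s^2

t_seconds = c / g            # naive time to reach c at constant 1 g
t_days = t_seconds / 86_400  # convert seconds to days
print(f"{t_days:.0f} days")
```

The answer is roughly 354 days, so "accelerate at a comfortable 1g for about a year" is the usual shorthand for this scenario.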

—————————————————————————————————

In this set of posts I discuss three concepts. If implemented, these concepts have the potential to bring about major changes in our understanding of the physical Universe. But first, a detour.

In an earlier post I suggested that both John Archibald Wheeler and Richard Feynman, giants of the physics community, could have asked different questions (what could we do differently?) regarding certain solutions to Maxwell’s equations, instead of asking whether retrocausality could be a solution.

I worked 10 years for Texas Instruments in the 1980s and 1990s. Corporate, in Dallas, had given us the daunting task of raising our Assembly/Test yields from 83% to 95% within 3 years, across 6,000 SKUs (products), with only about 20 engineers and no assistance from Dallas. Assembly/Test skills had moved offshore; therefore, Dallas was not in a position to provide advice. I look back now and wonder how Dallas came up with the 95% number.

It was impossibly daunting because many of our product yields were in the 70s. We had good engineers and managers. The question, therefore, was how do you do something seemingly impossible without changing your mix of people, equipment and technical skill sets?

Let me tell you the end first. We achieved 99% to 100% Assembly/Test yields across the board, for 6,000 SKUs, within 3 years. And this in a third-world nation not known for any remarkable scientific or engineering talent! I don’t have to tell you the other lessons we learned, as they should be obvious. So my telling Dr. David Neyland of DARPA’s TTO, “I’ll drop a zero,” at the first 100YSS conference in 2011 still holds.

How did we do it? For my part, I was responsible for Engineering Yield (IT) Systems, test operation cost modeling for overhead transfer pricing, and tester capacity models to figure out how to increase test capacity. But the part that is relevant to this discussion is teamwork. We organized the company into teams and brought in consultants to teach what teamwork was and how to arrive at and execute operational and business decisions as teams.

And one of the keys to teamwork was to allow anyone and everyone to speak up; to voice their opinions; to ask questions, no matter how strange or silly those questions appeared to be; and to never put down another person because he or she had different views.

Everyone, from the managing director of the company down to the production operators, was organized into teams. Every team had to meet once a week to ask those questions and seek those answers. That was some experience, working with and in those teams. We found things we did not know or understand about our process. That in turn set off new and old teams to go figure! We understood the value of a matrix-type organization.

As a people not known for any remarkable scientific or engineering talent, we did it. We did the impossible. I learned many invaluable lessons from my decade at Texas Instruments that I’ll never forget and will always be grateful for.

My Thanksgiving this year is that I am thankful I had the opportunity to work for Texas Instruments when I did.

So I ask, in the spirit of the Kline Directive, can we as a community of physicists and engineers come together, to explore what others have not, to seek what others will not, to change what others dare not, to make interstellar travel a reality within our lifetimes?

Previous post in the Kline Directive series.

Next post in the Kline Directive series.

—————————————————————————————————

Benjamin T Solomon is the author & principal investigator of the 12-year study into the theoretical & technological feasibility of gravitation modification, titled An Introduction to Gravity Modification, to achieve interstellar travel in our lifetimes. For more information visit iSETI LLC, Interstellar Space Exploration Technology Initiative.

Solomon is inviting all serious participants to his LinkedIn Group Interstellar Travel & Gravity Modification.


In this post I will explore Technological Feasibility. At the end of the day, that is the only thing that matters. If a hypothesis cannot vindicate itself with empirical evidence, it will not become technologically feasible; and if it is not technologically feasible, it stands no chance of becoming commercially viable.

If we examine historical land, air and space speed records, we can construct an estimate of the velocities that future technologies can achieve, aka technology forecasting. See the table below for some of the speed records.

Year | Fastest Velocity | Craft | Velocity (km/h) | Velocity (m/s)
2006 | Escape Earth | New Horizons | 57,600 | 16,000
1976 | Capt. Eldon W. Joersz and Maj. George T. Morgan | Lockheed SR-71 Blackbird | 3,530 | 980
1927 | Car land speed record (not jet engine) | Mystery | 328 | 91
1920 | Joseph Sadi-Lecointe | Nieuport-Delage NiD 29 | 275 | 76
1913 | Maurice Prévost | Deperdussin Monocoque | 180 | 50
1903 | Wilbur Wright at Kitty Hawk | Wright Aircraft | 11 | 3

A quick and dirty model derived from the data shows that we could achieve the velocity of light, c, by about 2152, in the early 2150s. See the table below.

Year | Velocity (m/s) | % of c
2200 | 8,419,759,324 | 2,808.5%
2152 | 314,296,410 | 104.8%
2150 | 274,057,112 | 91.4%
2125 | 49,443,793 | 16.5%
2118 | 30,610,299 | 10.2%
2111 | 18,950,618 | 6.3%
2100 | 8,920,362 | 3.0%
2075 | 1,609,360 | 0.5%
2050 | 290,351 | 0.1%
2025 | 52,384 | 0.0%
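A projection of this kind can be reproduced, at least approximately, with an ordinary least-squares fit of log speed against year. The sketch below is my reconstruction under the assumption of a simple exponential growth trend in record speed; the original model behind the table above may have been built differently.

```python
import math

# Speed records from the table above: (year, top speed in m/s)
records = [(1903, 3), (1913, 50), (1920, 76), (1927, 91),
           (1976, 980), (2006, 16000)]

# Ordinary least-squares fit of ln(v) = a + b*year,
# i.e. an assumed exponential growth trend in record speed.
xs = [year for year, _ in records]
ys = [math.log(v) for _, v in records]
n = len(records)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
    / sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

C = 299_792_458.0  # speed of light, m/s

def year_reaching(v_target):
    """Year at which the fitted trend line reaches v_target (m/s)."""
    return (math.log(v_target) - a) / b

print(f"record speed doubles roughly every {math.log(2) / b:.1f} years")
print(f"0.1c reached around {year_reaching(0.1 * C):.0f}")
print(f"c reached around {year_reaching(C):.0f}")
```

On this fit, record speed doubles roughly every 10 years, crossing 0.1c around 2118 and c around 2152, consistent with the table above.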

The extrapolation suggests that at our current rate of technological innovation we won’t achieve light speed until the early 2150s. The real problem is that we won’t achieve 0.1c until 2118! That is more than 100 years from today.

In my opinion this rate of innovation is too slow. Dr. David Neyland of DARPA’s TTO was the driving force behind DARPA’s contribution to the 100-Year Starship Study. When I met up with Dr. David Neyland during the first 100YSS conference, Sept. 30 to Oct. 2, 2011, I told him “I’ll drop a zero”. That is, I expect interstellar travel to be achievable in decades, not centuries. And to ramp up our rate of technological innovation we need new theories and new methods of sifting through theories.

Previous post in the Kline Directive series.



In this post I discuss part 2 of 3, Mathematical Construction versus Mathematical Conjecture, of how to read or write a journal paper, something that is not taught in colleges.

I did my Master of Arts in Operations Research (OR) at the best OR school in the United Kingdom, the University of Lancaster, in the 1980s. We were always reminded that models have limits to their use. There is an operating range within which a model will provide good and reliable results; outside that operating range, a model will provide unreliable, incorrect and even strange results.

Doesn’t that sound a lot like what the late Prof. Morris Kline was saying? We can extrapolate this further and ask our community of theoretical physicists: what is the operating range of your theoretical model? We can turn the question around and require our community of theoretical physicists to inform us of, or suggest, the boundaries where their models fail “… to provide reasonability in guidance and correctness in answers to our questions in the sciences …”

A theoretical physics model is a mathematical construction that is not necessarily connected to the real world until it is empirically verified or falsified; until then, these mathematical constructions are in limbo. Search the term ‘retrocausality’, for example. The Wikipedia article Retrocausality says a lot about the how and why of the origins of theoretical physics models that are not within the range of our informed common sense. Let me quote,

“The Wheeler–Feynman absorber theory, proposed by John Archibald Wheeler and Richard Feynman, uses retrocausality and a temporal form of destructive interference to explain the absence of a type of converging concentric wave suggested by certain solutions to Maxwell’s equations. These advanced waves don’t have anything to do with cause and effect, they are just a different mathematical way to describe normal waves. The reason they were proposed is so that a charged particle would not have to act on itself, which, in normal classical electromagnetism leads to an infinite self-force.”

John Archibald Wheeler and Richard Feynman are giants in the physics community, and these esteemed physicists used retrocausality to solve a mathematical construction problem. Could they not have asked different questions? What is the operating range of this model? How do we rethink this model so as not to require retrocausality?

This unfortunate leadership in retrocausality has led to a whole body of ‘knowledge’ by the name of ‘retrocausality’ that is in a state of empirical limbo; thus, the term mathematical conjecture applies.

Now, do you get an idea of how mathematical construction leads to mathematical conjecture? Someone wants to solve a problem, which is a legitimate quest because that is how science progresses, but the solution causes more problems (not questions) than before, which leads to more physicists trying to answer those new problems, and so forth … and so forth … and so forth …

In Hong Kong, the Cantonese have an expression “chasing the dragon”.

Disclaimer: I am originally from that part of the world, and enjoyed tremendously watching how the Indian and Chinese cultures collided, merged, and separated, repeatedly. Sometimes like water and oil, and sometimes like water and alcohol. These two nations share a common heritage, the Buddhist monks, and if they could put aside their nationalistic and cultural pride, who knows what could happen?

Chasing the dragon in the Chinese cultural context “refers to inhaling the vapor from heated morphine, heroin, oxycodone or opium that has been placed on a piece of foil. The ‘chasing’ occurs as the user gingerly keeps the liquid moving in order to keep it from coalescing into a single, unmanageable mass. Another more metaphorical use of the term ‘chasing the dragon’ refers to the elusive pursuit of the ultimate high in the usage of some particular drug.”

Solving a mathematical equation always gives a high, and discovering a new equation gives an even greater high. So when we write a paper, we have to ask ourselves: are we chasing the dragon of mathematical conjecture or the dragon of mathematical construction? I hope it is the latter.

Previous post in the Kline Directive series.
