
Quoted: “Sometimes decentralization makes sense.

Filament is a startup that is taking two of the most overhyped ideas in the tech community—the blockchain and the Internet of things—and applying them to some of the most boring problems the world has ever seen: gathering data from farms, mines, oil platforms and other remote or highly secure places.

The combination could prove to be a powerful one, because monitoring remote assets like oil wells or mining equipment is expensive, whether you are paying people to drive around and manually check gear or relying on sensitive electronic equipment and a pricey satellite internet connection.

Instead, Filament has built a rugged sensor package that it calls a Tap, plus a network technology, the real secret sauce of the operation, that allows its sensors to conduct business even when they aren’t actually connected to the internet. The company, a graduate of the Techstars program, has attracted an array of investors who have put $5 million into it. Bullpen Capital led the round, with Verizon Ventures, Crosslink Capital, Samsung Ventures, Digital Currency Group, Haystack, Working Lab Capital, Techstars and others participating.

To build its technology, Filament is using a series of protocols that include the blockchain transaction database behind Bitcoin; BitTorrent, the popular peer-to-peer file sharing software; JOSE, a contract management protocol that is also used in the OAuth authentication service that lets people use their Facebook ID to log in and manage permissions to other sites around the web; TMesh, a long-range mesh networking technology; and Telehash for private messaging.”

“This cluster of technologies is what enables the Taps to perform some pretty compelling stunts, such as sending small amounts of data up to 9 miles between Taps and keeping a contract inside a sensor for a year or so even if that sensor isn’t connected to the Internet. In practical terms, that might mean that a sensor in a field gathering soil data might share that data with other sensors in nearby fields belonging to other farmers, based on the permissions the soil sensor has to share that data. Or it could be something a bit more complicated, like a robotic seed-tilling machine sensing that it was low on seed and ordering up another bag from inventory based on a “contract” it has with the dispensing system inside a shed on the property.

The potential use cases are hugely varied, and the idea of using a decentralized infrastructure is fairly novel. Both IBM and Samsung have tested out using a variation of the blockchain technology for storing data in decentralized networks for connected devices. The idea is that sending all of that data to the cloud and storing it for a decade or so doesn’t always make economic sense, so why not let the transactions and accounting for them happen on the devices themselves?

That’s where the blockchain and these other protocols come in. The blockchain is a great way to store information about a transaction in a distributed manner, and because it’s built into the devices there’s no infrastructure to support for years on end. When combined with mesh radio technologies such as TMesh, it also becomes a good way to build out a network of devices that can communicate with each other even when they don’t have connectivity.”
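To make the offline-contract idea concrete, here is a minimal sketch of how a sensor might hold a signed, expiring permission locally and enforce it without any internet connection. This is purely illustrative: the names, the pre-shared-key scheme and the JSON format are assumptions for the example, not Filament’s actual protocol or API.

```python
import hashlib
import hmac
import json
import time

# Illustrative sketch only; not Filament's actual protocol or API.
# A "contract" is a signed permission blob that a Tap can verify locally,
# offline, using a key provisioned at deployment time (an assumption here).

SHARED_KEY = b"field-cooperative-key"

def sign_contract(payload: dict) -> dict:
    """Attach an HMAC signature so peers can detect tampering."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def may_share(contract: dict, requester: str, now: float) -> bool:
    """Decide locally, with no connectivity, whether data may be shared."""
    body = json.dumps(contract["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, contract["sig"]):
        return False  # contract was tampered with
    p = contract["payload"]
    return requester in p["allowed"] and now < p["expires"]

contract = sign_contract({
    "sensor": "soil-tap-17",
    "allowed": ["neighbor-farm-a", "neighbor-farm-b"],
    "expires": time.time() + 365 * 24 * 3600,  # roughly the year mentioned above
})

print(may_share(contract, "neighbor-farm-a", time.time()))  # True
print(may_share(contract, "stranger", time.time()))         # False
```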

Read the Article, and watch the Video, here > http://fortune.com/2015/08/18/filament-blockchain-iot/

Quoted: “Traditional law is a form of agreement. It is an agreement among people and their leaders as to how people should behave. There are also legal contracts between individuals. These contracts are a form of private law that applies to the participants. Both types of agreement are enforced by a government’s legal system.”

“Ethereum is both a digital currency and a programming language. But it is the combination of these ingredients that makes it special. Since most agreements involve the exchange of economic value, or have economic consequences, we can implement whole categories of public and private law using Ethereum. An agreement involving transfer of value can be precisely defined and automatically enforced with the same script.”

“When viewed from the future, today’s current legal system seems downright primitive. We have law libraries — buildings filled with words that nobody reads and whose meaning is unclear, even to courts who enforce them arbitrarily. Our private contracts amount to vague personal promises and a mere hope they might be honored.

For the first time, Ethereum offers an alternative. A new kind of law.”
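The quoted idea, that one script can both define and enforce an agreement, is easy to illustrate. The toy sketch below is written in Python rather than an actual Ethereum language such as Solidity, and every name in it is invented; it only shows the shape of a self-enforcing escrow agreement.

```python
# Toy illustration of a self-enforcing agreement; not Ethereum code.

class EscrowAgreement:
    """The terms are the code: the script, not a court, decides who is paid."""

    def __init__(self, buyer: str, seller: str, deadline: int):
        self.buyer, self.seller, self.deadline = buyer, seller, deadline
        self.deposited = 0
        self.delivered = False

    def deposit(self, amount: int) -> None:
        self.deposited += amount

    def confirm_delivery(self) -> None:
        self.delivered = True

    def settle(self, now: int) -> tuple:
        # Delivery on time pays the seller; otherwise the buyer is
        # automatically refunded. No interpretation, no enforcement gap.
        if self.delivered and now <= self.deadline:
            return (self.seller, self.deposited)
        return (self.buyer, self.deposited)

deal = EscrowAgreement("alice", "bob", deadline=30)
deal.deposit(100)
deal.confirm_delivery()
print(deal.settle(now=10))  # ('bob', 100): the agreement enforced itself
```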

Read the article here > http://etherscripter.com/what_is_ethereum.html

Concerns about the future of artificial intelligence (AI) have recently gained coverage thanks to pioneers like Hawking, Gates, and Musk, though certainly others have been peering down that rabbit hole for some time. While we certainly need to keep our eyes on the far-reaching, it behooves us to take a closer look at the social issues that are right under our noses.

The question of artificial intelligence transforming industry is not a question of when — it’s already happening — but rather of how automation is creeping in and impacting some of the biggest influencers in the economic sphere, e.g. transportation, healthcare, and others, some of which may surprise you.

I recently discussed these near-at-hand social implications and ambiguities with Steve Omohundro, CEO and founder of Possibility Research.

Social Implications of AI

In the words of Mr. Omohundro, we’re “on the verge of major transformation” in myriad ways. Consider near-term economics. McKinsey & Company has estimated that AI automation could impact the economy by $10 to $25 trillion over the next 10 years. Gartner, an information technology research group, estimates that one-third of all jobs will be relegated to the world of AI by 2025.

Evidence of these trends is particularly visible in the areas of “the cloud”, i.e. the ‘Internet of All Things’; the delivery of services; knowledge work; and emerging markets. Tesla recently announced a software upgrade that would allow its self-driving cars to take better control on highways. In the same market, Daimler just released the first 18-wheeler ‘Freightliner Inspiration’ truck, which will be allowed to drive autonomously on Nevada freeways.

Leaps are already being made in the areas of healthcare and medicine; engineering and architecture (a Chinese design company recently produced 10 3D-printed houses in 24 hours); and, perhaps one that’s not as obvious — the legal profession.

Dealing with Shades of Grey

Electronic discovery, i.e. e-discovery, is the electronic solution now used to identify, collect and produce electronically stored information (ESI), such as emails, voicemail and databases, in response to a request for production in a lawsuit or investigation. The legal industry leverages this software when dealing with companies that sometimes have millions of emails, which its natural language programs help sift and search.
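As a simplified illustration of the kind of sifting such software automates (real e-discovery products use far richer natural language processing; the data and terms below are invented):

```python
import re

# A production request, reduced here to search terms and custodians of interest.
emails = [
    {"id": 1, "from": "cfo@corp.com", "body": "Shred the Q3 pricing memo."},
    {"id": 2, "from": "hr@corp.com",  "body": "The picnic is rescheduled."},
    {"id": 3, "from": "ceo@corp.com", "body": "Call me about the pricing deal."},
]

terms = re.compile(r"\b(pricing|memo|deal)\b", re.IGNORECASE)
custodians = {"cfo@corp.com", "ceo@corp.com"}

# Keep only messages from custodians of interest that hit a search term.
responsive = [m for m in emails
              if m["from"] in custodians and terms.search(m["body"])]
print([m["id"] for m in responsive])  # [1, 3]
```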

Future impacts in the legal industry could resound in areas where much of what human lawyers do is considered quite routine, such as creating contracts. This type of work usually has a high ticket price, so there is tremendous incentive to automate these types of tasks.

There is also an overlap and spillover of AI’s impacts from one industry to the next, and the legal industry is right at the intersection. Think back to the autonomous cars. Lawyers are now poised to confront new and weird questions such as: ‘What if a self-driving car hits and kills a person? Who’s responsible? The people who built the car, or the faulty software?’ We are on the cusp of having an “onslaught of new technology with very little clue of how to manage it,” says Omohundro.

Big Data and AI Implications

Another area that is changing the lay of the land is big data, which is constantly being applied by consumer companies as they gather data about consumers and then target ads based on this information. Once again, the question arises of how to manage this process and define legal restrictions.

Price fixing presents another ambiguous case. It’s illegal to collaborate with other companies in the same business to set prices, and a recent case arose in which an online seller looked as if it was colluding and fixing prices. It turns out the seller was running bots to check competitors’ prices, which were then adjusted according to an algorithm. “What happens when the bot is doing the price fixing; is that illegal?” Apparently so, judging by the outcome of the case, but the question of volition is a valid one.
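A sketch of the kind of repricing bot described in that case might look like the following; the logic and numbers are invented for illustration. This is exactly what makes the volition question interesting: the loop is trivial, but who holds the intent behind it?

```python
# Hypothetical repricing bot; all names and numbers are invented.

def reprice(competitor_prices: list, floor: float,
            undercut: float = 0.01) -> float:
    """Slightly undercut the lowest rival price, but never sell below cost."""
    lowest = min(competitor_prices)
    return max(floor, round(lowest - undercut, 2))

# Competitors list at $18.50, $21.00 and $19.75; our cost floor is $15.00.
print(reprice([18.50, 21.00, 19.75], floor=15.00))  # 18.49
```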

In a similar vein, a Swiss group working in the name of art created a bot, gave it bitcoin, hooked it up to the ‘dark net’, a realm of the Internet where people trade illegally, and had the bot randomly buy things. The art exhibit displayed what the bot bought while roving the dark markets. “Police allowed the exhibit, and then came and arrested the bot…carted the computer away,” explains Omohundro. “Every aspect of today’s society is going to be transformed by these technologies.”

While there are no succinct answers to the economic or ethical considerations of the “big questions” that Steve brought up in our conversation, he’s confident that more informed and serious discourse will help us make better decisions about the human future — and I certainly hope he’s right.

Quoted: “IBM’s first report shows that ‘a low-cost, private-by-design “democracy of devices” will emerge’ in order to ‘enable new digital economies and create new value, while offering consumers and enterprises fundamentally better products and user experiences.’”

“According to the company, the structure we are using at the moment already needs a reboot and a massive update. IBM believes that the current Internet of Things won’t scale to a network that can handle hundreds of billions of devices. The operative word is ‘change’, and this is where the blockchain will come in handy.”

Read the article here > https://99bitcoins.com/ibm-believes-blockchain-elegant-solution-internet-of-things/

Gravity modification, the scientific term for antigravity, is the ability to modify the gravitational field without the use of mass. Thus legacy physics, the RSQ (Relativity, String & Quantum) theories, cannot deliver either the physics or the technology, as these theories require mass as their field origin.

Ron Kita, who recently received the first US patent in recent history related to gravity modification (8901943), introduced me to Dr. Takaaki Musha some years ago. Dr. Musha has a distinguished history of researching the Biefeld-Brown effect in Japan, going back to the late 1980s, and has worked for the Ministry of Defense and Honda R&D.

Dr. Musha is currently editing New Frontiers in Space Propulsion (Nova Publishers), expected later this year. He is one of the founders of the International Society for Space Science, whose aim is to develop new propulsion systems for interstellar travel.

Wait. What? Honda? Yes. For us Americans it is unthinkable for General Motors to investigate gravity modification, and yet here was Honda, in the 1990s at that, researching this topic.

In recent years Biefeld-Brown has gained some notoriety as an ionic wind effect. I, too, was of this opinion until I read Dr. Musha’s 2008 paper “Explanation of Dynamical Biefeld-Brown Effect from the Standpoint of ZPF field.” Reading this paper I realized how thorough, detailed and meticulous Dr. Musha was. Quoting selected portions from Dr. Musha’s paper:

In 1956, T.T. Brown presented a discovery known as the Biefeld-Brown effect (abbreviated B-B effect): that a sufficiently charged capacitor with dielectrics exhibited unidirectional thrust in the direction of the positive plate.

From the 1st of February until the 1st of March in 1996, the research group of the HONDA R&D Institute conducted experiments to verify the B-B effect with an improved experimental device which rejected the influence of corona discharges and electric wind around the capacitor by setting the capacitor in the insulator oil contained within a metallic vessel … The experimental results measured by the Honda research group are shown …

V. Putz and K. Svozil,

… predicted that the electron experiences an increase in its rest mass under an intense electromagnetic field …

and the equivalent

… formula with respect to the mass shift of the electron under intense electromagnetic field was discovered by P. Milonni …

Dr. Musha concludes his paper with,

… The theoretical analysis result suggests that the impulsive electric field applied to the dielectric material may produce a sufficient artificial gravity to attain velocities comparable to chemical rockets.

Given Honda R&D’s experimental research findings, this is a major step forward for the Biefeld-Brown effect, and Biefeld-Brown is back on the table as a potential propulsion technology.

We learn two lessons.

First, any theoretical analysis of an experimental result is advanced or handicapped by the contemporary physics. While the experimental results remain valid, at the time of publication zero point fluctuation (ZPF) was the appropriate theory. However, per Prof. Robert Nemiroff’s stunning 2012 discovery that quantum foam, and thus ZPF, does not exist, the theoretical explanation for the Biefeld-Brown effect needs to be reinvestigated in light of Putz, Svozil and Milonni’s research findings. This is not an easy task, as that part of the foundational legacy physics is now void.

Second, it took decades of Dr. Musha’s own research to correctly advise Honda R&D on how to conduct this type of experimental research with great care and attention to detail. I would advise anyone seriously considering Biefeld-Brown experiments to talk to Dr. Musha first.

Another example of similar lessons relates to the Finnish/Russian Dr. Podkletnov’s gravity-shielding spinning superconducting ceramic disc: an object placed above this spinning disc would lose weight.

I spent years reading and rereading Dr. Podkletnov’s two papers (the 1992 “A Possibility of Gravitational Force Shielding by Bulk YBa2Cu3O7-x Superconductor” and the 1997 “Weak gravitational shielding properties of composite bulk YBa2Cu3O7-x superconductor below 70K under e.m. field”) before I fully understood all the salient observations.

Any theory of Dr. Podkletnov’s experiments must explain four observations: the stationary disc weight loss, the spinning disc weight loss, the increase in weight loss along a radial distance, and the weight increase. Other than my own work, I haven’t seen anyone else attempt to explain all four observations within the context of the same theoretical analysis. The most likely inference is that legacy physics does not have the tools to explore Podkletnov’s experiments.

But it gets worse.

Interest in Dr. Podkletnov’s work was destroyed by two papers claiming null results: first, Woods et al. (the 2001 “Gravity Modification by High-Temperature Superconductors”), and second, Hathaway et al. (the 2002 “Gravity Modification Experiments Using a Rotating Superconducting Disk and Radio Frequency Fields”). Reading through these papers, it was very clear to me that neither team was able to faithfully reproduce Dr. Podkletnov’s work.

My analysis of Dr. Podkletnov’s papers shows that the disc is electrified and bi-layered: the top side is superconducting and the bottom non-superconducting. Therefore, to get gravity-modifying effects, the key to experimental success is that the bottom side needs to be much thicker than the top. Without getting into too much detail, this introduces asymmetrical field structures, and with them gravity-modifying effects.

The dialog between theoretical explanation and experimental insight is vital to any scientific study. Without it, confounding obstructions arise: experiments that work despite being theoretically impossible, or theories that are sound yet whose experiments fail. With respect to Biefeld-Brown, Dr. Musha has completed the first iteration of this dialog.

Above all, we cannot be sure what we have discovered is correct until we have tested these discoveries under different circumstances. This is especially true for future propulsion technologies where we cannot depend on legacy physics for guidance, and essentially don’t understand what we are looking for.

In the current RSQ (pronounced risk) theory climate, propulsion physics is not a safe career path to select. I do hope that serious researchers reopen the case for both Biefeld-Brown and Podkletnov experiments, and the National Science Foundation (NSF) leads the way by providing funding to do so.

(Originally published in the Huffington Post)

Recent revelations about NASA Eagleworks’ EmDrive caused a sensation on the internet as to why interstellar propulsion can or cannot be possible. The naysayers pointed to shoddy engineering and impossible physics, and the ayes pointed to the physics of Alcubierre-type warp drives based on General Relativity.

So what is it? Are warp drives feasible? The answer is both yes and no. Allow me to explain.

The empirical evidence of the Michelson-Morley experiment of 1887, now captured in the Lorentz-FitzGerald Transformations (LFT) proposed by FitzGerald in 1889 and by Lorentz in 1892, shows beyond a shadow of doubt that nothing can have a motion with a velocity greater than the velocity of light. In 1905 Einstein derived LFT from first principles as the basis for the Special Theory of Relativity (STR).
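The quantitative content of LFT fits in one line. Times and lengths for a body moving at speed v scale by the Lorentz factor:

```latex
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

As v approaches c, γ grows without bound, and for v > c the square root becomes imaginary, i.e. there is no physical solution. That is the formal sense in which LFT forbids faster-than-light motion.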

So if nothing can travel faster than light, why does the Alcubierre-type warp drive matter? The late Prof. Morris Kline explained in his book Mathematics: The Loss of Certainty that mathematics has become so powerful that it can now be used to prove anything, hence the loss of certainty in the value of these mathematical models. The antidote is to stay close to the empirical evidence.

My good friend Dr. Andrew Beckwith (Prof., Chongqing University, China) explains that there are axiomatic problems with the Alcubierre-type warp drive theory. Basically, the implied axioms (the starting assumptions of the mathematics) require a multiverse, i.e. multiple universes, yet the mathematics is based on a single universe. Thus even though the mathematics appears to be sound, its axioms contradict that mathematics. As Dr. Beckwith states, “reductio ad absurdum.” For now, this unfortunately means that there is no such thing as a valid warp drive theory. LFT prevents this.

For a discussion of other problems in physical theories please see my peer reviewed 2013 paper “New Evidence, Conditions, Instruments & Experiments for Gravitational Theories” published in the Journal of Modern Physics. In this paper I explain how General Relativity can be used to propose some very strange ideas, and therefore, claiming that something is consistent with General Relativity does not always lead to sensible outcomes.

The question we should be asking is not whether we can travel faster than light (FTL), but how we can bypass LFT. Put another way, our focus should not be how to travel but how to effect destination arrival.

Let us take one step back. Since Einstein, physicists have been working on a theory of everything (TOE). Logic dictates that a true TOE must be able to propose, from first principles, why conservation of mass-energy and conservation of momentum hold. If a theory cannot, it cannot be a TOE. Unfortunately, all existing TOE candidates have these conservation laws as their starting axioms, and therefore are not true TOEs. The importance of this requirement is that if we cannot explain why conservation of momentum is true, as Einstein did with LFT, how do we know how to apply it in developing interstellar propulsion engines? Yes, we have to be that picky, or else we will be throwing millions if not billions of dollars of funding into something that probably won’t work in practice.

Is a new physics required to achieve interstellar propulsion? Does a new physics exist?

In 2007, after extensive numerical modeling, I discovered the massless formula for gravitational acceleration, g=τc^2, where tau (τ) is the change in the time dilation transformation (the dimensionless LFT) divided by the distance over which that change occurs. (The error in the modeled gravitational acceleration is less than 6 parts per million.) This proves that mass is not required for gravitational theories, falsifying the RSQ (Relativity, String & Quantum) theories of gravity. There are two important consequences of this finding: (1) we now have a new propulsion equation, and (2) legacy or old physics cannot deliver.
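As a quick dimensional check of the formula as stated (a units argument only, not a derivation): τ is a dimensionless change divided by a distance, so it carries units of 1/m, and therefore

```latex
[g] = [\tau]\,[c^{2}] = \frac{1}{\mathrm{m}} \cdot \frac{\mathrm{m}^{2}}{\mathrm{s}^{2}} = \frac{\mathrm{m}}{\mathrm{s}^{2}},
```

which are indeed the units of an acceleration.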

But gravity modification per g=τc^2 is still based on motion, and is therefore constrained by LFT. That is, gravity modification cannot provide for interstellar propulsion. For that we require a different approach: the new physics.

At least from the perspective of propulsion physics, having a theoretical approach for a single formula g=τc^2 would not satisfy the legacy physics community that a new physics is warranted or even exists. Therefore, based on my 16 years of research involving extensive numerical modeling with the known empirical data, in 2014, I wrote six papers laying down the foundations of this new physics:

1. “A Universal Approach to Forces”: There is a 4th approach to forces that is not based on Relativity, String or Quantum (RSQ) theories.
2. “The Variable Isotopic Gravitational Constant”: The Gravitational Constant G is not a constant, and independent of mass, therefore gravity modification without particle physics is feasible.
3. “A Non Standard Model Nucleon/Nuclei Structure”: Falsifies the Standard Model and proposes Variable Electric Permittivity (VEP) matter.
4. “Replacing Schrödinger”: Proposes that the Schrödinger wave function is a good but not an exact model.
5. “Particle Structure”: Proposes that the Standard Model be replaced with the Component Standard Model.
6. “Spectrum Independence”: Proposes that photons are spectrum independent, and how to accelerate nanowire technology development.

This work, published under the title Super Physics for Super Technologies, is available for all to review, critique and test for validity. (A non-intellectual, emotional gut response is not a valid criticism.) That is, the new physics does exist. The outcome relevant to interstellar propulsion is that subspace exists, and that this is how Nature implements probabilities. Note that neither quantum nor string theories ask how Nature implements probabilities, and therefore they are unable to provide an answer. The proof of subspace can be found in how the photon’s electromagnetic energy is conserved inside the photon.

Subspace is probabilistic and therefore does not have the time dimension. In other words, destination arrival is not constrained by LFT and motion-based travel, but is effected by probabilistic localization. We therefore have to figure out navigation in subspace, i.e. vectoring and modulation. Vectoring is the ability to determine direction, and modulation is the ability to determine distance. This approach is new and has enormous potential to be realized, as it is not constrained by LFT.

Yes, interstellar propulsion is feasible, but not via the warp drives we understand today. As of 2012, there are only about 50 of us on this planet working, or who have worked, toward solving the gravity modification and interstellar propulsion challenge.

So the question is not, whether gravity modification or interstellar propulsion is feasible, but will we be the first nation to invent this future?

(Originally published in the Huffington Post)

Companies looking to launch satellites into space typically spend anywhere from $10–50 million per launch, but thanks to 3D printing, those costs are set to drop in a big way.

For $4.9 million, businesses can use RocketLab to send small satellites into orbit. The firm’s engine, called the Rutherford, is powered by an electric motor and is the first oxygen and hydrocarbon engine to use 3D printing for all its primary components. The New Zealand company is set to begin test flights this year and aims to launch weekly commercial operations next year. Read more

Cryptocurrency aficionados have been discussing Bitcoin’s limitations ever since the blockchain buzz hit the street. Geeks toss around ideas for clearing transactions faster, resisting potential attacks, rewarding miners after the last coin is mined, and supporting anonymity (or the opposite—if you lean toward the dark side). There are many areas in which Bitcoin could be improved, or made more conducive to one camp or another.

Distinguished Penn State professor John Carroll believes that Bitcoin may eventually be marginalized due to its early arrival. He believes that its limitations will eventually be overcome by newer “altcoins,” presumably with improved mechanisms.

So, does progress in any of these areas threaten the reigning champ? It’s unlikely…

More than any other individual, Andreas Antonopoulos is the face of Bitcoin. We discussed this very issue in the outer lobby of the MIT Bitcoin Expo, at which he was keynote speaker (March 2015). Then we discussed it again when I hosted his presentation at The Bitcoin Event in New York (also in March). He clearly and succinctly explained to me why it is unlikely that an altcoin will replace Bitcoin as the dominant—and eventually surviving—cryptocurrency.

It is not simply that Bitcoin was first or derived from Satoshi’s original paper, although this clearly established precedent, propelled it into the media, and ignited a grassroots industry. More importantly, Bitcoin is unlikely to be surpassed by an altcoin because:

  1. Bitcoin is open source. It is difficult enough for skeptics to trust an open source protocol at all. Users, businesses, banks, exchanges and governments may eventually trust a distributed, open source movement. After all, math is more trustworthy and less transient than governments: math cannot inflate itself, bend to political winds, or print future generations into debt if it is tied to a cap. But it is unlikely that these same skeptics will allow an inventor with a proprietary mechanism to take custody of their wealth, or trust a coin in which the contents of all wallets cannot be traced back to the origin.
  2. If we accept #1 (that a viable contender must be open source and either public or freely licensed), then Bitcoin developers or wallet vendors are free to incorporate the best protocols and enhancements from the alt-developers. These can gradually be folded into Bitcoin and adopted by consensus. This is what Gavin and the current developers at Bitcoin Core do. They protect, enhance, extend, and promote. Looked at another way, when a feature or enhancement is blessed—and when 3 or 4 of the leading 7 wallets honor it—it becomes part of Bitcoin.

Bitcoin has achieved a two-sided network effect, just like Acrobat PDF. Unseating an entrenched two-sided network requires disruptive technology and implementation with clear benefits. But in the case of a widely distributed, trusted and universally adopted tool (such as a public-use monetary instrument), a contender must be open source. The Cryptocurrency Standards Association, The Bitcoin Foundation and the leading wallet vendors have always been open and eager to incorporate the best open source ideas into Bitcoin.

Even if Bitcoin were replaced by an altcoin or by “Bitcoin 2.0”, it is likely that the public would only migrate to the enhanced coin if it were tied to the original equity corpus of earned and mined coins from the Bitcoin era. That is, we all know that Satoshi may have thousands of original Bitcoins, but few among us would tolerate (a) losing all of our Bitcoin value, and (b) rewarding a blockchain wannabe who declares that his coins are worth more than the grassroots legacy of vested millions that came before.

Consider Prof. Carroll’s analogy: “Who will use an acoustic string telephone when he could access a mobile phone?” A more accurate analogy is the evolution of the 32-year-old AMPS phone network (the first widely deployed cell phone network). In 1983, the original phones were analogue and limited to 400 channels. Like their non-cellular predecessors, user equipment was bulky: phones were divided into components mounted in the trunk and under the seat, plus a corded handset. They lacked GPS, LTE and many signaling features that we now take for granted. Yet carriers, equipment manufacturers and users were never forced to throw away their equipment and start over. The network grew, adapted, and yielded incentives for incremental user-equipment upgrades.

With all due respect to the distinguished Penn State professor, John Carroll, I stand with Andreas. Bitcoin needn’t relinquish the throne. It is evolving!

Philip Raymond is Co-Chair of The Cryptocurrency Standards Association and CEO of Vanquish Labs.
This is his first article for Lifeboat Foundation.

Related: Stellar & Ripple: Pretender to Bitcoin throne?

Game-changing technologies can be a waste of money or a competitive advantage. It depends on the technology and the organization.

It seems like the term “game-changing” gets tossed around a lot lately. This is particularly true with respect to new technologies. But what does the term mean, what are the implications, and how can you measure it?

With regard to what it means, I like the Macmillan Dictionary definition of game-changing: “completely changing the way that something is done, thought about, or made.” The reason I like this definition is that it captures the transformational nature of what springs to mind when I hear the term. A game-changer should be just what it says: not just a whole new ball game, but a whole new type of game entirely.

Every industry is unique, and what is a game-changer for one might be only a minor disruption or improvement for another. For example, the internal combustion engine was a game-changer for the transportation industry. It was important for the asphalt industry too, though less of a game-changer, via the secondary effect of increased demand for paved roads.

Just as every industry is unique, so is every organization. In order to prosper in a dynamic environment, an organization must be able to evaluate how a particular technology will affect its strategic goals, as well as its current operations. For this to happen, an organization’s leadership must have a clear understanding of itself and the environment in which it is operating. While this seems obvious, for large complex organizations, it may not be as easy as it sounds.

In addition to organizational awareness, leadership must have the inclination and ability to run scenarios of how the organization would be affected by the candidate game-changer. These scenarios provide the ability to peek a little into the future, and enable leadership to examine different aspects of the potential game-changer’s immediate and secondary impacts.

Now, there are a lot of potential game-changers out there, and it is probably not possible to run a full evaluation on all of them. Here is where an initial screening comes in useful. An initial screen might ask: is it realistic, actionable, and scalable? Realistic means: does it appear feasible from a technical and financial standpoint? Actionable means: does this seem like something that can actually be produced? Scalable means: will the infrastructure support rapid adoption? If a potentially transformational technology passes this initial screening, then its impact on the organization should be thoroughly evaluated.

Let’s run an example with augmented reality as the technology and a space launch services company. Despite the (temporary?) demise of Google Glass, augmented reality certainly seems to have the potential to be transformational. It literally changes how we can look at the world! Is it realistic? I would say yes, the technology is almost there, as evidenced by Google Glass and Microsoft HoloLens. Is it actionable? Again, yes. Google Glass was indeed produced. Is it scalable? The infrastructure seems available to support widespread adoption, but the market readiness is a bit of an issue. So yes, but perhaps with qualifications.

With the initial screening done, let’s look at the organizational impact. A space launch company’s leadership knows that due to the unforgiving nature of spaceflight, reliability has to be high. They also know that they need to keep costs low in order to be competitive. Inspection of parts and assembly is expensive but necessary in order to maintain high reliability. With this abbreviated information as the organizational background, it’s time to look at scenarios. This is the “What if?” part of the process. Taking into account the known process areas of the company and the known and projected capabilities of the technology in question, ask “what would happen if we applied this technology?” Don’t forget to try to look for second order effects as well.

One obvious scenario for the space launch company would be to examine what would happen if augmented reality were used in the inspection and verification process. One could imagine an assembly worker equipped with augmented reality glasses seeing the supply chain history of every part being worked on, and perhaps getting expert guidance from an artificial intelligence during assembly. The immediate effect would be reduced inspection time, which equates to cost savings and increased reliability. A second order effect could be greater market share due to a stronger competitive advantage.

The bottom line in this hypothetical example is that, for the space launch company, augmented reality stands a good chance of greatly improving how it does business. It would be a game-changer in at least one area of operations, but wouldn’t completely rewrite all the rules.

As the company runs additional scenarios and visualizes the potential, it can determine whether this technology is something to watch and wait on, to adopt early, or perhaps to invest in directly to bring it along a little bit faster.

The key to all of this is that organizations have to be vigilant in knowing what new technologies and capabilities are on the horizon, and proactive in evaluating how they will be affected by them. If something can be done, it will be done, and if one organization doesn’t use it to create a competitive advantage, rest assured its competitors will.