

“To identify emerging trends, I use a six-part methodology beginning with seeking out those on the fringes doing unusual experimentation or research. Next I look for patterns using my CIPHER model, where I identify previously unseen contradictions, inflections, practices, hacks, extremes, and rarities. Then I ask practical questions, mapping trajectories, building scenarios, and pressure-testing my conclusions.”

Read more


“When the world’s smartest researchers train computers to become smarter, they like to use games. Go, the two-player board game born in China more than two millennia ago, remains the nut that machines still can’t crack.”

Read more


July 2015, as you know, was all systems go for CERN's Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn't blink an eye hearing that large amounts of data were being collected at every turn. BUT, I was quite surprised to learn the amount being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each collision is recorded as a "collision event". Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say, the hunt is on!

The Data Center processes about one Petabyte of data every day — the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores. Some 6000 changes in the database are performed every second.
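As a quick sanity check on the "210,000 DVDs" figure, one petabyte per day divided by the roughly 4.7 GB capacity of a single-layer DVD lands right where the article says:

```python
# Check the claim: one petabyte per day is "around 210,000 DVDs".
PETABYTE = 1e15          # bytes (decimal convention, as storage vendors use)
DVD_CAPACITY = 4.7e9     # bytes per single-layer DVD

dvds_per_day = PETABYTE / DVD_CAPACITY
print(round(dvds_per_day))  # ~212,766, i.e. "around 210,000 DVDs"
```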

With experiments at CERN generating such colossal amounts of data, the Data Center stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG), a distributed computing infrastructure arranged in tiers, gives a community of over 8,000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.

Image: CERN

PROCESSING THE DATA (via CERN): Subsequently, hundreds of thousands of computers from around the world come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries, which are linked to CERN. Every day WLCG processes more than 1.5 million 'jobs', corresponding to a single computer running for more than 600 years.
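The "single computer running for more than 600 years" equivalence also implies an average job length, which is easy to back out from the numbers quoted above:

```python
# Back out the implied average job length from the WLCG figures:
# 1.5 million jobs per day equating to 600+ years of single-computer time.
JOBS_PER_DAY = 1.5e6
YEARS_OF_COMPUTE = 600

hours_of_compute = YEARS_OF_COMPUTE * 365 * 24      # ~5.26 million hours
avg_job_hours = hours_of_compute / JOBS_PER_DAY
print(f"{avg_job_hours:.1f} h per job")             # ~3.5 hours on average
```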

Racks of servers at the CERN Data Centre (Image: CERN)
CERN DATA CENTER: The server farm in the 1450 m2 main room of the DC (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the DC houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second):

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s
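To put those per-experiment rates in perspective, converting them into daily volumes is straightforward (taking the upper bound of the ATLAS range):

```python
# Convert the quoted per-experiment rates (GB/s) into daily volumes.
rates_gb_per_s = {"ALICE": 4.0, "ATLAS": 1.0, "CMS": 0.6, "LHCb": 0.75}
SECONDS_PER_DAY = 86_400

for name, rate in rates_gb_per_s.items():
    tb_per_day = rate * SECONDS_PER_DAY / 1000  # 1000 GB per TB (decimal)
    print(f"{name}: {tb_per_day:.0f} TB/day")   # e.g. ALICE: ~346 TB/day
```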

In July, the LHCb experiment reported the observation of an entirely new class of particles:
Exotic Pentaquark Particles (Image: CERN)

Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together (right).

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that baryons, the category of particles that includes protons and neutrons, are composed of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. The quark model also allows the existence of other quark composite states, such as pentaquarks, composed of four quarks and an antiquark.

Until now, however, no conclusive evidence for pentaquarks had been seen; earlier experiments that searched for them proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei, Antinuclei
(Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today, from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show, with unparalleled precision, that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics known as Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry holds, a system remains unchanged when three fundamental properties are reversed: charge, parity (the inversion of spatial coordinates, as in a mirror image), and time.

The latest study takes research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons (a hydrogen nucleus with an additional neutron) and antideuterons, as well as for helium-3 nuclei (two protons plus a neutron) and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles' tracks in the ALICE detector's magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips LinkedIn Pulse @


“I am excited to introduce the first wave of TechLuxe in a form of a resin handbag with an LCD video screen. The idea is to radically bring technology to fashion, but with creative beauty within a functional beautifully designed bag.”

Read more


Leave it to Berlin to breed the coolest of tech-related gatherings. At this interdisciplinary ‘unconference’, industry leaders and tech enthusiasts rubbed shoulders as they sipped chia seed smoothies among the sun-kissed gardens of an abandoned carpet factory.

Read more


““Who is Sabita?” I was looking right at Sabita Devi when she said these words. She was describing her life as a wife and mother in Jharkhand, one of the poorest states in India, where she has spent most of her days inside the four walls of her home. “No one in my village knew my name,” Sabita told me. Her contact with the outside world was mediated entirely by her husband: who she could talk to, what she could buy, when (and if) she could see a doctor. She was isolated from everyone and everything but her children.”

Read more


As everyone is pointing out, 2015 is a crucial year for sustainable development, with three critical international meetings in the calendar starting this month. But what role do science, technology and innovation play in these processes?

Read more


Behind London and Berlin, the Dutch startup scene is already considered to be one of the most prominent in Europe. (If it feels unfair to weigh an entire country against individual cities, consider that the Netherlands has 17 million people crammed into an area half the size of South Carolina.)

Read more

Game-changing technologies can be a waste of money or a competitive advantage. It depends on the technology and the organization.

It seems like the term “game-changing” gets tossed around a lot lately. This is particularly true with respect to new technologies. But what does the term mean, what are the implications, and how can you measure it?

With regard to what it means, I like the Macmillan dictionary definition for game-changing. It is defined as "Completely changing the way that something is done, thought about, or made." The reason I like this definition is that it captures the transformational nature of what springs to mind when I hear the term game-changing. This should be just what it says. Not just a whole new ball game, but a whole new type of game entirely.

Every industry is unique. What is a game-changer for one might be only a minor disruption or improvement for another. For example, the internal combustion engine was a game-changer for the transportation industry. For the asphalt industry it was important, though less of a game-changer, its impact arriving as a secondary effect: increased demand for paved roads.

Just as every industry is unique, so is every organization. In order to prosper in a dynamic environment, an organization must be able to evaluate how a particular technology will affect its strategic goals, as well as its current operations. For this to happen, an organization’s leadership must have a clear understanding of itself and the environment in which it is operating. While this seems obvious, for large complex organizations, it may not be as easy as it sounds.

In addition to organizational awareness, leadership must have the inclination and ability to run scenarios of how the organization would be affected by the candidate game-changer. These scenarios provide the ability to peek a little into the future, and enable leadership to examine different aspects of the potential game-changer's immediate and secondary impacts.

Now there are a lot of potential game-changers out there, and it is probably not possible to run a full evaluation on all of them. Here is where an initial screening comes in useful. An initial screen might ask: is it realistic, actionable, and scalable? Realistic means: does it appear feasible from a technical and financial standpoint? Actionable means: does this seem like something that can actually be produced? Scalable means: will the infrastructure support rapid adoption? If a potentially transformational technology passes this initial screening, then its impact on the organization should be thoroughly evaluated.
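The screening above can be sketched as a simple checklist. This is just an illustration of the three questions as a pass/fail gate; the class and field names are hypothetical, not from any real tool:

```python
# A minimal sketch of the initial screening: realistic, actionable, scalable.
# A candidate technology advances to full evaluation only if all three hold.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    realistic: bool   # feasible from a technical and financial standpoint?
    actionable: bool  # can it actually be produced?
    scalable: bool    # will the infrastructure support rapid adoption?

    def passes_screen(self) -> bool:
        return self.realistic and self.actionable and self.scalable

ar = Candidate("augmented reality", realistic=True, actionable=True, scalable=True)
print(ar.passes_screen())  # True -> proceed to full organizational evaluation
```

In practice each question would be a judgment call with qualifications (as the augmented-reality example below shows for scalability), but even a coarse gate like this keeps the full evaluation effort focused on a short list.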

Let’s run an example with augmented reality as the technology and a space launch services company. Despite the (temporary?) demise of Google Glass, augmented reality certainly seems to have the potential to be transformational. It literally changes how we can look at the world! Is it realistic? I would say yes, the technology is almost there, as evidenced by Google Glass and Microsoft HoloLens. Is it actionable? Again, yes. Google Glass was indeed produced. Is it scalable? The infrastructure seems available to support widespread adoption, but the market readiness is a bit of an issue. So yes, but perhaps with qualifications.

With the initial screening done, let’s look at the organizational impact. A space launch company’s leadership knows that due to the unforgiving nature of spaceflight, reliability has to be high. They also know that they need to keep costs low in order to be competitive. Inspection of parts and assembly is expensive but necessary in order to maintain high reliability. With this abbreviated information as the organizational background, it’s time to look at scenarios. This is the “What if?” part of the process. Taking into account the known process areas of the company and the known and projected capabilities of the technology in question, ask “what would happen if we applied this technology?” Don’t forget to try to look for second order effects as well.

One obvious scenario for the space launch company would be to examine what if augmented reality was used in the inspection and verification process? One could imagine an assembly worker equipped with augmented reality glasses seeing the supply chain history of every part that is being worked on. Perhaps getting artificial intelligence expert guidance during assembly. The immediate effect would be reduced inspection time which equates to cost savings and increased reliability. A second order effect could be greater market share due to a better competitive advantage.

The bottom line of this hypothetical example is that for the space launch company, augmented reality stands a good chance of greatly improving how it does business. It would be a game-changer in at least one area of operations, but wouldn't completely rewrite all the rules.

As the company runs additional scenarios and visualizes the potential, it could determine whether this technology is something to watch and wait on, adopt early, or perhaps invest in directly to bring it along a little bit faster.

The key to all of this is that organizations have to be vigilant in knowing what new technologies and capabilities are on the horizon, and proactive in evaluating how they will be affected by them. If something can be done, it will be done, and if one organization doesn’t use it to create a competitive advantage, rest assured its competitors will.

A realistic and desirable human destination would produce a different space program than what we have today.

“We reach for new heights and reveal the unknown for the benefit of humankind.” This is NASA’s Vision Statement. This is NASA’s reason for being, its purpose. This is a vision statement for science and knowledge. This vision statement was crafted in a solar system that has only one planet that is environmentally friendly to human life.

Thanks to the ongoing search for exoplanets, we've identified several planets in our galaxy that are Earth-sized and in their star's habitable zone. Based on statistics, potentially billions more are waiting to be found. We are just now developing the technology to detect them. But we're nowhere near having the technology needed to visit them. They are simply too far away.

Now here is where I'd like to pose a what-if question: What if there were another habitable planet just like Earth, right here in our own solar system? What would Earth's space programs look like if anyone with a good telescope could look up and see another world with oceans, and continents, and clouds, and green forests? I think it is safe to say that space programs in this imaginary solar system would be vastly different from ours today. This is conjecture, but it seems likely that the vision statement above would be more in line with making that new world available for humanity.

Of course, the key difference between our present reality and this imaginary scenario is the existence of an obviously desirable destination relatively close to Earth. The lack of such destinations has shaped space programs into the form we see today. The science-oriented form described in the current NASA vision statement is a good example.

It has been said that leadership begins with a vision. To be compelling, a vision describes a desirable end state to be obtained. In the case of the fictional scenario with another Earth like planet in the solar system, that leadership vision might include making it possible for people to move freely to this new world.

As an analogy, in the mid-1800s the transcontinental vision (paraphrased) was to secure the U.S. position on the Pacific through a speedy and direct means of travel from one coast to the other. That vision did not include establishing and building the city of San Francisco! The prior existence of San Francisco enabled the vision of a transcontinental railroad.

Since our situation lacks a visible, desirable destination, a bit more effort is required in the vision department. We know that the solar system contains all the resources we need to construct vast places for people to live. Immense structures with forests, streams, and farmland, as advocated by Dr. Gerard O'Neill back in the 1970s, are all possible. We can achieve the same vision of having another habitable planet in this solar system; we just have to add the intermediate step of a vision to develop the manufacturing capability to construct our own desirable destinations!

Using the transcontinental vision as a guide, it is premature for the space vision to focus on sending millions of people out into space, since apart from the International Space Station, there are no destinations yet! No, to get to the transcontinental vision for space, we first need a vision of building a San Francisco in space! But in order for that vision to be considered, it must be realistic. The focus would be on developing the tools and robots necessary to rapidly and economically build up in-space manufacturing industries that can begin the construction of the first villages that will grow into the human cities.

Even though we do not have another Earth in our solar system, it is possible to envision the creation of other Earth equivalents. This leap in leadership would produce a vision unlike what we have now. This new vision, focused on manufacturing and development utilizing the resources of our solar system, would empower capabilities for even greater accomplishments in the future.