
Since its acquisition of ITA Software, the company behind the Matrix flight-search tool, eight years ago, Google has been quietly rolling out new tools for travelers. Its progress has been especially notable in recent months, as it has unveiled tools to help predict flight delays, plan trips, and manage itineraries, among other things.

These changes have some wondering: Is Google making a run at total domination in the travel space? If it is, there is a strong case that it could disrupt the travel and hospitality sector in much the same way Amazon has disrupted retail and, more recently, grocery.

Read more

Realistic climate simulations require huge reserves of computational power. An LMU study now shows that new algorithms allow interactions in the atmosphere to be modeled more rapidly without loss of reliability.

Forecasting global and local climates requires the construction and testing of mathematical models. Since such models must incorporate a plethora of physical processes and interactions, climate simulations require enormous amounts of computing power. And even the best models inevitably have limitations, since the phenomena involved can never be modeled in sufficient detail. In a project carried out in the context of the DFG-funded Collaborative Research Center “Waves to Weather”, Stephan Rasp of the Institute of Theoretical Meteorology at LMU (Director: Professor George Craig) has now looked at the question of whether the application of machine learning can improve the efficacy of climate modelling. The study, which was performed in collaboration with Professor Mike Pritchard of the University of California at Irvine and Pierre Gentine of Columbia University in New York, appears in the journal PNAS.

General circulation models typically simulate the global behavior of the atmosphere on grids whose cells have dimensions of around 50 km. Even using state-of-the-art supercomputers, the relevant processes that take place in the atmosphere are simply too complex to be modelled at the necessary level of detail. One prominent example concerns the modelling of clouds, which have a crucial influence on climate. They transport heat and moisture and produce precipitation, as well as absorbing and reflecting solar radiation. Many clouds extend over distances of only a few hundred meters, much smaller than the grid cells typically used in simulations, and they are highly dynamic. Both features make them extremely difficult to model realistically. Hence today’s models lack at least one vital ingredient and, in this respect, provide only an approximate description of the Earth system.
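In broad terms, the idea behind this line of work is to train a machine-learning model to emulate the output of expensive, high-resolution simulations of subgrid processes such as convection and clouds, so that the cheap emulator can stand in for them inside a coarse-grid climate model. The snippet below is only a minimal sketch of that workflow on synthetic data; the input variables, the small network, and the target quantity are illustrative assumptions, not the setup used in the PNAS study.

```python
# Minimal sketch: train a small neural network to emulate an expensive
# subgrid parameterization, then call the cheap emulator inside a coarse model.
# The data here are synthetic placeholders, not output from a real simulation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical coarse-grid inputs: temperature, humidity, vertical velocity per cell.
X = rng.normal(size=(10_000, 3))

# Stand-in for the "expensive" high-resolution result, e.g. a subgrid heating rate.
# In a real study this would come from a cloud-resolving simulation.
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=10_000)

# Fit a small emulator to the high-resolution output.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
emulator.fit(X, y)

# Inside the coarse climate model, the learned emulator replaces the costly
# subgrid computation for each grid cell at each time step.
new_cells = rng.normal(size=(5, 3))
print(emulator.predict(new_cells))
```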

Read more

Citi has produced another of its Disruptive Innovations publications, which takes a look at what it considers to be the top ten disruptive technologies. It is a sign of the changing times that anti-aging medicines are number 2 in its list.

1. All-Solid-State Batteries
2. Anti-Aging Medicines
3. Autonomous Vehicle Networks
4. Big Data & Healthcare
5. Dynamic Spectrum Access
6. eSports
7. 5G Technology
8. Floating Offshore Wind Farms
9. Real Estate Market Disruptors
10. Smart Voice-Activated Assistants

What was considered fringe science a decade ago is now rapidly becoming a mainstream industry. Our understanding of aging has advanced quickly in the last 10 years, and the tools and innovations seem to come more quickly with each passing year. A variety of therapies that target different aging processes are in development, and some are at fairly advanced stages; if you are interested in their progress, check out the Rejuvenation Roadmap.

Read more

The World Economic Forum suggests we are on the cusp of a Fourth Industrial Revolution driven by ‘ubiquitous automation, big data and artificial intelligence’. The Institute for Public Policy Research, however, says that “despite the growing capability of robots and artificial intelligence (AI), we are not on the cusp of a ‘post-human’ economy.”

IPPR suggests that an estimated 60 percent of occupations have at least 30 percent of activities which could be automated with already-proven technologies. As tasks are automated, work is likely to be redefined, focusing on areas of human comparative advantage over machines.

The CIPD point out that “new technology has changed many more jobs than it has destroyed, and it does not destroy work. Overall, the biggest advanced industrialized economies have between them created over 50 million jobs, a rise of nearly 20 percent, over the past 20 years despite huge economic and technological disruptions.”

Read more

If you’ve ever experienced jet lag, you are familiar with your circadian rhythm, which manages nearly all aspects of metabolism, from sleep-wake cycles to body temperature to digestion. Every cell in the body has a circadian clock, but researchers were unclear about how networks of cells connect with each other over time and how those time-varying connections impact network functions.

In research published Aug. 27 in PNAS, researchers at Washington University in St. Louis and collaborating institutions developed a unified, data-driven computational approach to infer and reveal these connections in biological and chemical oscillatory networks, known as the network topology, based on their time-series data. Once they establish the topology, they can infer how the agents, or cells, in the network work together in synchrony, an important state for the brain. Abnormal synchrony has been linked to a variety of brain disorders, such as epilepsy, Alzheimer’s disease and Parkinson’s disease.

Jr-Shin Li, professor of systems science & mathematics and an applied mathematician in the School of Engineering & Applied Science, developed an algorithm, called the ICON (infer connections of networks) method, that shows for the first time the strength of these connections over time. Previously, researchers could only determine whether a connection existed between networks.
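The ICON method itself is described in the PNAS paper; the snippet below is only a simplified illustration of the general idea of inferring connection strengths from time-series data, here by regressing each node's rate of change on the states of all nodes in a toy linear network. The network, the synthetic data, and the least-squares model are assumptions for illustration, not the authors' algorithm.

```python
# Simplified illustration: estimate connection strengths in a small network
# from time-series data by regressing each node's rate of change on the
# states of all nodes. This is a toy linear example, not the ICON method.
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_steps, dt = 4, 2000, 0.01

# Hypothetical "true" coupling matrix that we will try to recover.
A_true = rng.normal(scale=0.5, size=(n_nodes, n_nodes))
np.fill_diagonal(A_true, -1.0)          # self-damping keeps trajectories bounded

# Simulate x(t+dt) = x(t) + dt * A_true @ x(t) + small noise.
x = np.zeros((n_steps, n_nodes))
x[0] = rng.normal(size=n_nodes)
for t in range(n_steps - 1):
    x[t + 1] = x[t] + dt * A_true @ x[t] + 0.001 * rng.normal(size=n_nodes)

# Finite-difference estimate of dx/dt, then a least-squares fit of the couplings.
dxdt = (x[1:] - x[:-1]) / dt
A_est, *_ = np.linalg.lstsq(x[:-1], dxdt, rcond=None)
A_est = A_est.T                          # lstsq solves x @ A.T ≈ dxdt

print("largest recovery error:", np.max(np.abs(A_est - A_true)))
```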

Read more

An algorithm is saving Flint, Michigan, about $10 million as part of the city’s effort to replace its water infrastructure.

To catch you up: In 2014, Flint began getting its water from the Flint River rather than the Detroit water system. Improper treatment of the new water supply, combined with old lead pipes, left residents with contaminated water.

Solving the problem: Records that could be used to figure out which houses might be affected by corroded old pipes were missing or incomplete. So the city turned to AI. Using 71 different pieces of information—like the age or value of the home—Georgia Tech researchers developed an algorithm that predicted whether or not a home was connected to lead pipes.
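The Georgia Tech model was built on 71 variables and the city’s own records; the sketch below merely illustrates the general pattern of fitting a classifier on home attributes to predict lead service lines and ranking homes by predicted risk. The feature names and the synthetic data are hypothetical placeholders, not the actual variables or model used in Flint.

```python
# Illustrative sketch: predict whether a home has a lead service line from
# home attributes, in the spirit of the Flint model. Features and data here
# are synthetic placeholders, not the actual 71 variables used by the city.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_homes = 5_000

# Hypothetical features: year built, assessed value, neighborhood code.
year_built = rng.integers(1900, 2015, size=n_homes)
value = rng.normal(80_000, 30_000, size=n_homes)
zone = rng.integers(0, 10, size=n_homes)
X = np.column_stack([year_built, value, zone])

# Synthetic label: older, lower-value homes are more likely to have lead pipes.
p_lead = 1 / (1 + np.exp(0.05 * (year_built - 1950) + 0.00001 * (value - 80_000)))
y = rng.random(n_homes) < p_lead

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Rank unverified homes by predicted risk so excavation crews dig where
# lead is most likely, which is how such a model saves money.
risk = model.predict_proba(X_test)[:, 1]
print("homes flagged as high risk:", int((risk > 0.5).sum()))
```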

Read more

One of the most significant AI milestones in history was quietly ushered into being this summer. We speak of the quest for Artificial General Intelligence (AGI), probably the most sought-after goal in the entire field of computer science. With the introduction of the IMPALA architecture, DeepMind, the company behind AlphaGo and AlphaZero, would finally seem to have AGI firmly in its sights.

Let’s define AGI, since it’s been used by different people to mean different things. AGI is a single intelligence or algorithm that can learn multiple tasks and exhibits positive transfer when doing so, sometimes called meta-learning. During meta-learning, the acquisition of one skill enables the learner to pick up another new skill faster because it applies some of its previous “know-how” to the new task. In other words, one learns how to learn — and can generalize that to acquiring new skills, the way humans do. This has been the holy grail of AI for a long time.

As it currently exists, AI shows little ability to transfer learning to new tasks. Typically, it must be trained from scratch for each one. For instance, the same neural network that recommends Netflix shows to you cannot use that learning to suddenly start making meaningful grocery recommendations. Even these single-purpose “narrow” AIs can be impressive, such as IBM’s Watson or Google’s self-driving car technology. However, they are nothing like an artificial general intelligence, which could conceivably unlock the kind of recursive self-improvement variously referred to as the “intelligence explosion” or “singularity.”
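To make the notion of positive transfer concrete, here is a toy sketch: weights learned on one task are reused to initialize learning on a related task, which then converges in fewer steps than training from a cold start. This is plain transfer learning, simpler than meta-learning proper, and the tasks, data, and model are invented for illustration; it has nothing to do with DeepMind's actual IMPALA architecture.

```python
# Toy illustration of positive transfer: weights learned on task A are used to
# initialize task B, which then converges in fewer gradient steps than a random
# start. Both tasks are invented; this is not DeepMind's IMPALA.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 10))

w_task_a = rng.normal(size=10)
w_task_b = w_task_a + 0.1 * rng.normal(size=10)   # task B is related to task A

y_a = X @ w_task_a
y_b = X @ w_task_b

def train(X, y, w0, lr=0.01, tol=1e-3, max_steps=10_000):
    """Plain gradient descent on mean squared error; returns weights and steps."""
    w = w0.copy()
    for step in range(max_steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
        if np.mean((X @ w - y) ** 2) < tol:
            return w, step + 1
    return w, max_steps

# Learn task A from scratch, then start task B either from A's weights
# ("transfer") or from zeros ("from scratch") and compare the effort needed.
w_a, _ = train(X, y_a, np.zeros(10))
_, steps_transfer = train(X, y_b, w_a)
_, steps_scratch = train(X, y_b, np.zeros(10))
print(f"task B from scratch: {steps_scratch} steps, with transfer: {steps_transfer} steps")
```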

Read more