KITCHENER — Big jumps in life expectancy will begin in as little as 10 years thanks to advances in nanotechnology and 3D printing that will also enable wireless connections among human brains and cloud computers, a leading futurist said Thursday.
“In 10 or 15 years from now we will be adding more than a year, every year, to your life expectancy,” Ray Kurzweil told an audience of 800 people at Communitech’s annual Tech Leadership conference.
Kurzweil, a futurist, inventor and author, as well as a director of engineering at Google, calls this “radical life extension.”
Quantum future discussed at London’s Royal Society Conference.
By Tushna Commissariat
Not a week goes by here at Physics World that we don’t cover some advance in quantum mechanics – be it another step towards quantum computing or error correction, or a new type of quantum sensor, or another basic principle being verified and tested at new scales. While each advance may not always be a breakthrough, it is fair to say that the field has grown by leaps and bounds in the last 20 years or so. Indeed, it has seen at least two “revolutions” since it first began and is now poised on the brink of a third, as scientific groups and companies around the world race to build the first quantum computer.
With this in mind, some of the stalwarts of the field – including Peter Knight, Ian Walmsley, Gerard Milburn, Stephen Till and Jonathan Pritchard – organized a two-day discussion meeting at the Royal Society in London, titled “Quantum technology for the 21st century”, which I decided to attend. The meeting’s main aim was to bring together academic and industry leaders “in quantum physics and engineering to identify the next generation of quantum technologies for translational development”. As Knight said during his opening speech, the time has come to “balance the massive leaps that the science has made with actual practical technology”.
The software startup launching out of a garage or a dorm room is now the stuff of legend. We can all name the stories of people who got together in a garage with a few computers and ended up disrupting massive, established corporations — or creating something the world never even knew it wanted.
Until now, this hasn’t really been as true for physical things you build from the ground up. The cost of tools and production has been too high, and for top quality, you still had to go the traditional manufacturing route.
If you’ve ever seen a “recommended item” on eBay or Amazon that was just what you were looking for (or maybe didn’t know you were looking for), it’s likely the suggestion was powered by a recommendation engine. In a recent interview, Raefer Gabriel, co-founder of machine learning startup Delvv, Inc., said these applications for recommendation engines and collaborative filtering algorithms are just the beginning of a powerful and broad-reaching technology.
Gabriel noted that content discovery on services like Netflix, Pandora, and Spotify is most familiar to people because of the way those services seem to “speak” to one’s preferences in movies, games, and music. Their relatively narrow focus on entertainment is a common thread that has made them successful as constrained domains. The challenge lies in developing recommendation engines for unbounded domains, like the internet, where there is more or less unlimited information.
“Some of the more unbounded domains, like web content, have struggled a little bit more to make good use of the technology that’s out there. Because there is so much unbounded information, it is hard to represent well, and to match well with other kinds of things people are considering,” Gabriel said. “Most of the collaborative filtering algorithms are built around some kind of matrix factorization technique and they definitely tend to work better if you bound the domain.”
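To make the matrix factorization idea Gabriel mentions concrete, here is a minimal sketch in Python with NumPy. The ratings matrix, factor count, and learning rate are invented for illustration; production systems work at vastly larger scale, but the underlying idea of fitting low-rank user and item factors to observed ratings is the same.

```python
import numpy as np

# Toy user-item rating matrix (0 = unrated); values are invented for illustration.
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 0, 5, 4],
], dtype=float)

n_users, n_items = R.shape
k = 2                               # number of latent factors (assumed)
lr, reg, epochs = 0.01, 0.02, 5000  # learning rate, regularization, iterations

rng = np.random.default_rng(0)
P = rng.normal(scale=0.1, size=(n_users, k))   # user factor matrix
Q = rng.normal(scale=0.1, size=(n_items, k))   # item factor matrix

# Plain stochastic gradient descent over the observed (non-zero) ratings only.
observed = [(u, i) for u in range(n_users) for i in range(n_items) if R[u, i] > 0]
for _ in range(epochs):
    for u, i in observed:
        p_u = P[u].copy()
        err = R[u, i] - p_u @ Q[i]
        P[u] += lr * (err * Q[i] - reg * p_u)
        Q[i] += lr * (err * p_u - reg * Q[i])

# P @ Q.T fills in predicted scores for the unrated cells,
# which become candidate recommendations.
print(np.round(P @ Q.T, 2))
```

The predicted scores for cells that were zero in the original matrix are what a recommender would rank and surface, which is also why bounding the domain helps: the tighter the item space, the better behaved the factorization.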
Of all the recommendation engines and collaborative filters on the web, Gabriel cites Amazon as the most ambitious. The eCommerce giant uses a number of strategies to make item-to-item recommendations, suggest complementary purchases, model user preferences, and more. The key to developing those recommendations lies less in the algorithm itself than in the value of the data Amazon is able to feed into it initially; once the company reaches a critical mass of data on user preferences, creating recommendations for new users becomes much easier.
“In order to handle those fresh users coming into the system, you need to have some way of modeling what their interest may be based on that first click that you’re able to extract out of them,” Gabriel said. “I think that intersection point between data warehousing and machine learning problems is actually a pretty critical intersection point, because machine learning doesn’t do much without data. So, you definitely need good systems to collect the data, good systems to manage the flow of data, and then good systems to apply models that you’ve built.”
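The fresh-user situation Gabriel describes is often handled with item-to-item similarity: even a single click identifies one item, and the items most similar to it become the first recommendations. The sketch below is a hypothetical illustration using cosine similarity over a toy interaction matrix; the data and function are invented for the example and are not any particular company's method.

```python
import numpy as np

# Toy user-item interaction matrix (rows = existing users, columns = items).
# A 1 means the user clicked or purchased the item; the data is invented.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 0],
], dtype=float)

# Item-to-item cosine similarity computed from co-occurrence across users.
item_vectors = interactions.T                       # one row per item
norms = np.linalg.norm(item_vectors, axis=1, keepdims=True)
similarity = (item_vectors @ item_vectors.T) / (norms * norms.T)

def recommend_for_first_click(clicked_item: int, top_n: int = 2):
    """Rank other items by similarity to the single item a fresh user clicked."""
    scores = similarity[clicked_item].copy()
    scores[clicked_item] = -np.inf                  # never recommend the same item
    return np.argsort(scores)[::-1][:top_n]

print(recommend_for_first_click(clicked_item=1))    # items most like item 1
```

This is also why the data-collection side Gabriel mentions matters so much: the similarity table is only as good as the interaction history that feeds it.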
Beyond consumer-oriented uses, Gabriel has seen recommendation engines and collaborative filtering systems used in a narrow scope for medical applications and in manufacturing. In healthcare, for example, he cited recommendations based on treatment preferences, doctor specialties, and other relevant decision-based suggestions; more generally, anything you can transform into a “model of relationships between items and item preferences” can map directly onto some form of recommendation engine or collaborative filter.
One of the most important elements driving the development of recommendation engines and collaborative filtering algorithms has been the Netflix Prize, Gabriel said. The competition, which offered a $1 million prize to anyone who could design an algorithm that improved upon Netflix’s proprietary recommendation engine, allowed entrants to use pieces of the company’s own user data to develop a better algorithm. The competition spurred a great deal of interest in the potential applications of collaborative filtering and recommendation engines, he said.
In addition, relatively easy access to abundant, cheap memory is another driving force behind the development of recommendation engines. An eCommerce company like Amazon, with millions of items, needs plenty of memory to store millions of different pieces of item and correlation data while also storing user data in potentially large blocks.
“You have to think about a lot of matrix data in memory. And it’s a matrix, because you’re looking at relationships between items and other items and, obviously, the problems that get interesting are ones where you have lots and lots of different items,” Gabriel said. “All of the fitting and the data storage does need quite a bit of memory to work with. Cheap and plentiful memory has been very helpful in the development of these things at the commercial scale.”
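A rough back-of-envelope calculation shows why memory is such a constraint at that scale. The catalogue size and neighbour count below are assumptions picked for illustration; the point is simply that a dense item-item matrix is infeasible, while pruned, sparse storage fits comfortably in the cheap memory Gabriel credits.

```python
# Hypothetical numbers chosen for illustration only.
n_items = 1_000_000        # items in the catalogue
avg_neighbors = 100        # correlated items retained per item after pruning

# Dense float32 item-item matrix: every pairwise correlation stored.
dense_bytes = n_items * n_items * 4
print(f"dense item-item matrix: ~{dense_bytes / 1e12:.1f} TB")    # ~4.0 TB

# Sparse (CSR-style) storage: only retained correlations, plus index arrays.
nnz = n_items * avg_neighbors
sparse_bytes = nnz * (4 + 4) + (n_items + 1) * 8   # values + column ids + row pointers
print(f"sparse item-item matrix: ~{sparse_bytes / 1e9:.1f} GB")   # ~0.8 GB
```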
Looking forward, Gabriel sees recommendation engines and collaborative filtering systems evolving more toward predictive analytics and getting a handle on the unbounded domain of the internet. While those efforts may ultimately be driven by the Google Now platform, he foresees a time when recommendation-driven data will merge with search data to provide search results before you even search for them.
“I think there will be a lot more going on at that intersection between the search and recommendation space over the next couple years. It’s sort of inevitable,” Gabriel said. “You can look ahead to what someone is going to be searching for next, and you can certainly help refine and tune into the right information with less effort.”
While “mind-reading” search engines may still seem a bit like science fiction at present, the capabilities are evolving at a rapid pace, with predictive analytics leading the way.
You are really starting to see the shape of the Singularity, ever more clearly, in the convergence of so many engineering and scientific discoveries, inventions, and philosophical musings.
I can say, without a doubt, that we are all living in truly extraordinary times!
This five-fingered robot hand developed by University of Washington computer science and engineering researchers can learn how to perform dexterous manipulation — like spinning a tube full of coffee beans — on its own, rather than having humans program its actions. (credit: University of Washington)
A University of Washington team of computer scientists and engineers has built what they say is one of the most highly capable five-fingered robot hands in the world. It can perform dexterous manipulation and learn from its own experience without needing humans to direct it.
The Wyss Institute at Harvard is creating miniaturised versions of human organs that could one day be used to test drugs as specific as the patients who take them.
I am so happy to see others seeing the value, because Quantum is changing everything: not just computing, but raw-material enrichment, medical technology and treatments, and more. Once more and more folks start seeing the various capabilities around Quantum and just how wide that range is, we will begin to see an explosion of demand for Quantum. We’re still in discovery mode, with some taking a wait-and-see stance. However, the Quantum Revolution will exceed even the Industrial Revolution in the span of change it brings across so many areas and industries.
Quantum physics research that could enhance self-driving vehicles, spearheaded by a Dalhousie University team, is now a $6-million commercial venture that counts U.S. aerospace giant Lockheed Martin among its partners.
What started as a theoretical research project backed by Lockheed Martin hit paydirt when physics professor Jordan Kyriakidis realized quantum software could be used to perfect the design and operation of self-driving cars and new aircraft.
“A self-driving car is an example of a machine that’s really an infrastructure that is going to be entrusted with our lives and our children’s lives, so it’s absolutely critical that the machine behaves exactly as intended, doesn’t have any flaws, and so our software helps ensure that, before they even build anything, when it’s still at the blueprint stage,” said Kyriakidis, now CEO of Halifax-based Quantum Research Analytics (QRA).
If you’ve ever had a train set, you might remember the tiny street lamps that are often part of the model landscape. Today, the bulbs from those toy lamps are helping to shed light on quantum computing.
In fact, there’s been a spate of developments lately in quantum computing, and not just IBM’s announcement of its upcoming cloud service. Here are three recent advances from research institutions around the world.
Nice list of experts on Quantum; however, I would love to see someone from the Los Alamos lab discuss the Quantum Internet, and someone from the University of Sydney’s innovation lab, or Michelle Simmons herself, on the panel. Hope to see registration open soon.
“Quantum computers enable us to use the laws of physics to solve intractable mathematical problems,” said Marcos López de Prado, Senior Managing Director at Guggenheim Partners and a Research Fellow at Lawrence Berkeley National Laboratory’s Computational Research Division. “This is the beginning of a new era, and it will change the job of the mathematician and computer scientist in the years to come.”
As de Prado points out on the Quantum for Quants website, “Our smartphones are more powerful than the systems used by NASA to put a man on the moon.”
Twenty-five years after the introduction of the World Wide Web, the Information Age is coming to an end. Thanks to mobile screens and Internet everywhere, we’re now entering what I call the “Experience Age.”
When was the last time you updated your Facebook status? Maybe you no longer do? It’s been reported that original status updates by Facebook’s 1.6 billion users are down 21 percent.
The status box is an icon of the Information Age, a period dominated by desktop computers and a company’s mission to organize all the world’s information. The icons of the Experience Age look much different, and are born from micro-computers, mobile sensors and high-speed connectivity.