
If you’ve ever seen a “recommended item” on eBay or Amazon that was just what you were looking for (or maybe didn’t know you were looking for), it’s likely the suggestion was powered by a recommendation engine. In a recent interview, Raefer Gabriel, co-founder of machine learning startup Delvv, Inc., said these applications of recommendation engines and collaborative filtering algorithms are just the beginning of a powerful and broad-reaching technology.

Raefer Gabriel, Delvv, Inc.

Gabriel noted that content discovery on services like Netflix, Pandora, and Spotify is most familiar to people because of the way those services seem to “speak” to one’s preferences in movies, games, and music. Their relatively narrow focus on entertainment is the common thread that has made them successful: they operate in constrained domains. The challenge lies in developing recommendation engines for unbounded domains, like the internet, where there is more or less unlimited information.

“Some of the more unbounded domains, like web content, have struggled a little bit more to make good use of the technology that’s out there. Because there is so much unbounded information, it is hard to represent well, and to match well with other kinds of things people are considering,” Gabriel said. “Most of the collaborative filtering algorithms are built around some kind of matrix factorization technique and they definitely tend to work better if you bound the domain.”
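To make the matrix factorization idea concrete, here is a minimal sketch of factorizing a user-item rating matrix with stochastic gradient descent. The data, factor count, and hyperparameters are all invented for illustration; this is the textbook technique Gabriel alludes to, not any particular company's system.

```python
import numpy as np

# Hypothetical user-item rating matrix; 0 means "unrated".
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 0, 5, 4],
], dtype=float)

n_users, n_items = R.shape
k = 2                                           # latent factors
rng = np.random.default_rng(0)
P = rng.normal(scale=0.1, size=(n_users, k))    # user factors
Q = rng.normal(scale=0.1, size=(n_items, k))    # item factors

lr, reg = 0.01, 0.02
for epoch in range(2000):
    for u, i in zip(*R.nonzero()):              # observed ratings only
        err = R[u, i] - P[u] @ Q[i]
        pu = P[u].copy()
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * pu - reg * Q[i])

# Predicted ratings, including the previously unrated cells.
print(np.round(P @ Q.T, 2))
```

The bounded-domain point shows up directly here: the factorization works because every row and column refers to a fixed, known catalog of users and items.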

Of all the recommendation engines and collaborative filters on the web, Gabriel cites Amazon as the most ambitious. The eCommerce giant uses a number of strategies to generate item-to-item recommendations, suggest complementary purchases, model user preferences, and more. The key to developing those recommendations lies less in the algorithm than in the value of the data Amazon is able to feed into it initially; having reached a critical mass of data on user preferences, the company finds it much easier to create recommendations for new users.
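For a flavor of what item-to-item recommendation involves, here is a toy sketch in the spirit of the approach Amazon has published (Linden et al., IEEE Internet Computing, 2003): compute similarities between item purchase vectors and recommend each item's nearest neighbors. The purchase matrix below is invented and microscopically small.

```python
import numpy as np

# Hypothetical purchase matrix: rows = users, columns = items (1 = bought).
X = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# Cosine similarity between item column vectors.
norms = np.linalg.norm(X, axis=0)
sim = (X.T @ X) / np.outer(norms, norms)
np.fill_diagonal(sim, 0)          # an item shouldn't recommend itself

# "Customers who bought item 0 also bought..." -> most similar items first.
print(np.argsort(sim[0])[::-1])
```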

“In order to handle those fresh users coming into the system, you need to have some way of modeling what their interest may be based on that first click that you’re able to extract out of them,” Gabriel said. “I think that intersection point between data warehousing and machine learning problems is actually a pretty critical intersection point, because machine learning doesn’t do much without data. So, you definitely need good systems to collect the data, good systems to manage the flow of data, and then good systems to apply models that you’ve built.”
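A minimal illustration of the cold-start step Gabriel describes: with only a single click from a fresh user, about all you can do is rank items by similarity to that one item, falling back to global popularity when you know nothing at all. The similarity matrix and popularity counts below are invented.

```python
import numpy as np

# Hypothetical item-item similarities (e.g., from the sketch above).
sim = np.array([
    [0.0, 0.9, 0.2, 0.1],
    [0.9, 0.0, 0.5, 0.2],
    [0.2, 0.5, 0.0, 0.8],
    [0.1, 0.2, 0.8, 0.0],
])
popularity = np.array([120, 300, 80, 150])   # hypothetical global click counts

def recommend(first_click=None, n=2):
    """Rank by similarity to the single known click; if there is no
    click at all, fall back to global popularity."""
    scores = popularity if first_click is None else sim[first_click]
    return list(np.argsort(scores)[::-1][:n])

print(recommend(0))      # the user's first click was item 0 -> [1, 2]
print(recommend())       # brand-new user with no clicks yet -> [1, 3]
```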

Beyond consumer-oriented uses, Gabriel has seen recommendation engines and collaborative filtering systems used in a narrow scope for medical applications and in manufacturing. In healthcare, for example, he cited recommendations based on treatment preferences, doctor specialties, and other relevant decision-based suggestions. More generally, anything you can transform into a “model of relationships between items and item preferences” can map directly onto some form of recommendation engine or collaborative filter.

One of the most important drivers of the development of recommendation engines and collaborative filtering algorithms was the Netflix Prize, Gabriel said. The competition, which offered a $1 million prize to anyone who could design an algorithm that improved upon Netflix’s proprietary recommendation engine, allowed entrants to use pieces of the company’s own user data to develop a better algorithm. The competition spurred a great deal of interest in the potential applications of collaborative filtering and recommendation engines, he said.

In addition, the relative ease of access to abundant, cheap memory is another driving force behind the development of recommendation engines. An eCommerce company like Amazon, with millions of items, needs plenty of memory to store millions of different pieces of item and correlation data while also storing user data in potentially large blocks.

“You have to think about a lot of matrix data in memory. And it’s a matrix, because you’re looking at relationships between items and other items and, obviously, the problems that get interesting are ones where you have lots and lots of different items,” Gabriel said. “All of the fitting and the data storage does need quite a bit of memory to work with. Cheap and plentiful memory has been very helpful in the development of these things at the commercial scale.”
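A quick back-of-the-envelope calculation shows why. The item count below is illustrative, not Amazon's actual figure, but it makes the point: a dense item-item matrix is hopeless at this scale, which is why sparse representations (storing only the nonzero correlations) are standard.

```python
# Back-of-the-envelope memory for an item-item correlation matrix.
n_items = 1_000_000          # illustrative catalog size
bytes_per_float = 4          # float32

dense_bytes = n_items * n_items * bytes_per_float
print(f"dense:  {dense_bytes / 1e12:.0f} TB")    # 4 TB -> impractical in RAM

# Most item pairs never co-occur, so store only ~100 neighbors per item
# (a similarity value plus a 4-byte item index for each neighbor).
neighbors_per_item = 100
sparse_bytes = n_items * neighbors_per_item * (bytes_per_float + 4)
print(f"sparse: {sparse_bytes / 1e9:.1f} GB")    # 0.8 GB -> fits in memory
```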

Looking forward, Gabriel sees recommendation engines and collaborative filtering systems evolving more toward predictive analytics and getting a handle on the unbounded domain of the internet. While those efforts may ultimately be driven by the Google Now platform, he foresees a time when recommendation-driven data will merge with search data to provide search results before you even search for them.

“I think there will be a lot more going on at that intersection between the search and recommendation space over the next couple years. It’s sort of inevitable,” Gabriel said. “You can look ahead to what someone is going to be searching for next, and you can certainly help refine and tune into the right information with less effort.”

While “mind-reading” search engines may still seem a bit like science fiction at present, the capabilities are evolving at a rapid pace, with predictive analytics leading the way.

If you’ve ever tried to learn how to spin a pencil in your hand, you’ll know it takes some concerted effort—but it’s even harder for a robot. Now, though, researchers have finally built a ‘bot that can learn to do it.

The reason that tasks like spinning a stick are hard is that a lot happens in a very short time. As the stick moves, the forces exerted by the hand can easily send it flying out of control if they’re not perfectly co-ordinated. Sensing where the stick is and varying the hand’s motion is an awful lot for even the smartest algorithms to handle based on a list of rules.

Read more

QC meets blockchain; nice.


CoinFac Limited, a technology company, has recently introduced next-generation quantum computing technology into cryptocurrency mining, allowing current Bitcoin and altcoin miners to enjoy a 4,000-fold speed increase.

Quantum computing is perceived as the next generation of supercomputing, capable of processing dense digital information and generating multi-sequential algorithmic solutions 100,000 times faster than conventional computers. With each quantum computing server carrying an exorbitant price tag of $5 million to $10 million, this combination of advanced technological servers with a new wave of currency systems amounts to one of the most disruptive events in the cryptocurrency ecosystem.

“We envision cryptocurrency being the game changer in most developed countries’ economies within the next five years. Reliance on quantum computing technology expedites the whole process, and we will be recognized as the industry leader in bringing about this tidal change. We aren’t the only institution planning to leverage this technology; other Silicon Valley big boys are already in advanced talks over a possible tie-up,” said Mike Howzer, CEO of CoinFac Limited. “Through the use of quantum computing, the usual Bitcoin mining processes are expedited by a blazing factor of 4,000. We bring lucrative mining back to the Bitcoin industry, all over again.”

Google, NASA and Microsoft have been in close talks with the developers about a possible integration of quantum computing into their existing products and platforms.

Read more

Researchers at the University of Liverpool have developed a set of algorithms that will help teach computers to process and understand human languages.

Whilst mastering language is easy for humans, it is something that computers have not yet been able to achieve. Humans understand an unfamiliar word in a variety of ways: for example, by looking it up in a dictionary, or by associating it with other words in the same sentence in a meaningful way.

The algorithms will enable a computer to act in much the same way as a human would when encountering an unknown word. When the computer comes across a word it doesn’t recognise or understand, the algorithms have it look up the word in a dictionary (such as WordNet) and guess what other words should appear alongside this unknown word in the text.
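A rough sketch of that dictionary-lookup step, using Python's NLTK interface to WordNet. This illustrates the general idea, not the Liverpool group's actual code.

```python
# pip install nltk   (then: python -m nltk.downloader wordnet)
from nltk.corpus import wordnet as wn

def describe_unknown_word(word):
    """Look an unknown word up in WordNet and collect related words
    that might plausibly appear near it in a text."""
    for synset in wn.synsets(word):
        related = set(synset.lemma_names())
        for hypernym in synset.hypernyms():      # more general terms
            related.update(hypernym.lemma_names())
        print(f"{synset.name()}: {synset.definition()}")
        print(f"  related: {sorted(related)}")

describe_unknown_word("quark")
```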

Read more

Hmmm; the jury is still out for me on this one, because I haven’t seen anything showing me that IBM is a real player in this space.


IBM is bringing quantum computing to a device near you by delivering its IBM Quantum Experience through the IBM Cloud. The platform is part of IBM’s Research Frontiers Institute and could be a data scientist’s newest tool and a data junkie’s dream come true.

The platform is available on any desktop or mobile device. The tech allows users to “run algorithms and experiments on IBM’s quantum processor, work with the individual quantum bits (qubits), and explore tutorials and simulations around what might be possible with quantum computing,” the press release noted.
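For a flavor of what “working with the individual quantum bits” looks like programmatically, here is a minimal two-qubit circuit written in Qiskit, the open-source Python SDK that grew out of the IBM Quantum Experience. Running it on IBM's actual hardware requires an account and backend configuration, so this sketch only builds and draws the circuit.

```python
# pip install qiskit
from qiskit import QuantumCircuit

# A Bell-state circuit: Hadamard plus CNOT entangles the two qubits.
qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # measuring should yield only 00 or 11

print(qc.draw())             # ASCII diagram of the circuit
```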

The processor itself, which is housed at the T.J. Watson Research Center in New York, is made up of five superconducting qubits.

Read more

https://youtube.com/watch?v=EyOuVFQNMLI

Cambridge University spin-out Optalysys has been awarded a $350k grant for a 13-month project from the US Defense Advanced Research Projects Agency (DARPA). The project will see the company advance their research in developing and applying their optical co-processing technology to solving complex mathematical equations. These equations are relevant to large-scale scientific and engineering simulations such as weather prediction and aerodynamics.

The Optalysys technology is extremely energy efficient, using light rather than electricity to perform intensive mathematical calculations. The company aims to provide existing computer systems with massively boosted processing capabilities, eventually reaching exaFLOP rates (a billion billion calculations per second). The technology operates at a fraction of the energy cost of conventional high-performance computers (HPCs) and has the potential to run orders of magnitude faster.

In April 2015 Optalysys announced that they had successfully built a scalable, lens-less optical processing prototype that can perform mathematical functions. Codenamed Project GALELEO, the device demonstrates that second-order derivatives and correlation pattern matching can be performed optically in a scalable design.
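The optical trick rests on a classical result: a lens system physically performs a Fourier transform, and correlation is just pointwise multiplication in the Fourier domain. A purely digital analogue of the pattern matching described above, using invented data, looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.random((64, 64))
template = scene[20:28, 30:38].copy()   # plant a known 8x8 patch

# Zero-pad the template to the scene's size, then correlate via FFT:
# correlation in space = multiplication by the complex conjugate in frequency.
padded = np.zeros_like(scene)
padded[:8, :8] = template
corr = np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(padded))).real

# The correlation peak marks where the template matches the scene.
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)                             # -> (20, 30)
```

An optical processor evaluates those Fourier transforms at the speed of light for roughly the cost of illuminating the optics, which is where the energy-efficiency claim comes from.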

Read more

Nice; however, I also see 3D printing, along with machine learning, becoming part of cosmetic procedures and surgeries.


With an ever-increasing volume of electronic data being collected by the healthcare system, researchers are exploring the use of machine learning—a subfield of artificial intelligence—to improve medical care and patient outcomes. An overview of machine learning and some of the ways it could contribute to advancements in plastic surgery are presented in a special topic article in the May issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS).

“Machine learning has the potential to become a powerful tool in plastic surgery, allowing surgeons to harness complex clinical data to help guide key clinical decision-making,” write Dr. Jonathan Kanevsky of McGill University, Montreal, and colleagues. They highlight some key areas in which machine learning and “Big Data” could contribute to progress in plastic and reconstructive surgery.

Machine Learning Shows Promise in Plastic Surgery Research and Practice

Machine learning analyzes historical data to develop algorithms capable of knowledge acquisition. Dr. Kanevsky and coauthors write, “Machine learning has already been applied, with great success, to process large amounts of complex data in medicine and surgery.” Projects with healthcare applications include the IBM Watson Health cognitive computing system and the American College of Surgeons’ National Surgical Quality Improvement Program.
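As a hedged illustration of the kind of decision support the authors describe, a model trained on historical cases could score a new patient's risk of a complication. Everything below (the features, the data, the outcome labels) is invented for the example; a real clinical model would need far more data and careful validation.

```python
# pip install scikit-learn
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented historical cases: [age, smoker (0/1), operative time in hours].
X = np.array([[34, 0, 2.1], [61, 1, 4.5], [45, 0, 3.0],
              [70, 1, 5.2], [29, 0, 1.8], [55, 1, 3.9]])
y = np.array([0, 1, 0, 1, 0, 1])   # 1 = post-operative complication

model = LogisticRegression().fit(X, y)

new_case = np.array([[50, 1, 4.0]])
print(f"complication risk: {model.predict_proba(new_case)[0, 1]:.2f}")
```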

Read more

Due to the pace at which quantum computing is developing, NIST is rushing to create quantum-proof cryptographic algorithms to prevent QC hacking. As I have stated before, I believe we’re now less than 7 years away from QC being in many mainstream devices, infrastructure, etc. And with China and its partnership with Australia, the race is now on and hotter than ever.


The National Institute of Standards and Technology has begun to look into quantum cybersecurity, according to a new report that details and plans out ways scientists could guard against these futuristic computers.

April 29, 2016.

Ransomware has taken off in 2016, with attacks already eclipsing the numbers observed in a recently published threat report from Symantec.

Read more