
What if computers could recognize objects as well as the human brain can? Electrical engineers at the University of California, San Diego have taken an important step toward that goal by developing a pedestrian detection system that performs in near real-time (2–4 frames per second) and with roughly half the error rate of existing systems. The technology, which incorporates deep learning models, could be used in “smart” vehicles, robotics, and image and video search systems.
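To make the deep-learning detection idea a little more concrete, here is a rough Python sketch of a two-stage cascade, where a cheap filter rejects most candidate windows and an expensive model scores only the survivors. Every function body, threshold, and number below is a made-up placeholder, not the UCSD team’s actual pipeline.

```python
# Hypothetical two-stage detection cascade: a cheap filter rejects most
# candidate windows; a (more expensive) deep model scores the survivors.
# All function bodies and thresholds are placeholders, not UCSD's code.
import random
import time

def cheap_filter_score(window):
    # Placeholder for a fast hand-crafted feature (e.g., edge density).
    return sum(window) / len(window)

def deep_model_score(window):
    # Placeholder where a CNN forward pass would go.
    return sum(w * w for w in window) / len(window)

def detect(windows, cheap_thresh=0.3, deep_thresh=0.35):
    """Return indices of windows the cascade accepts as pedestrians."""
    hits = []
    for i, window in enumerate(windows):
        if cheap_filter_score(window) < cheap_thresh:
            continue                      # early rejection = speed
        if deep_model_score(window) >= deep_thresh:
            hits.append(i)
    return hits

if __name__ == "__main__":
    # Fake data: 10,000 candidate windows of 64 "pixel" values each.
    windows = [[random.random() for _ in range(64)] for _ in range(10_000)]
    start = time.perf_counter()
    hits = detect(windows)
    print(f"{len(hits)} detections in {time.perf_counter() - start:.3f}s")
```

The near real-time frame rates come from the early rejection: if the cheap filter discards most windows, the expensive model only runs on the hard cases.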

Read more

Another step forward for quantum: the quantum current. The U.S. Dept. of Energy has developed a new method for generating very low-resistance electric (“quantum”) current, which could improve energy technologies, quantum computing, and medical imaging, and may even point to a new mechanism for inducing superconductivity, the ability of some materials (in this case, zirconium pentatelluride) to carry current with no energy loss.

Read more

Individual brain cells within a neural network are highlighted in this image obtained using a fluorescent imaging technique (credit: Sandra Kuhlman/CMU)

Carnegie Mellon University is embarking on a five-year, $12 million research effort to reverse-engineer the brain and “make computers think more like humans,” funded by the U.S. Intelligence Advanced Research Projects Activity (IARPA). The research is led by Tai Sing Lee, a professor in the Computer Science Department and the Center for the Neural Basis of Cognition (CNBC).

The research effort, through IARPA’s Machine Intelligence from Cortical Networks (MICrONS) research program, is part of the U.S. BRAIN Initiative to revolutionize the understanding of the human brain.

A “Human Genome Project” for the brain’s visual system

“MICrONS is similar in design and scope to the Human Genome Project, which first sequenced and mapped all human genes,” Lee said. “Its impact will likely be long-lasting and promises to be a game changer in neuroscience and artificial intelligence.”

Read more

By now, everyone is probably familiar with holographic performances from such artists as Elvis, Michael Jackson and, the one that started it all, Tupac Shakur at Coachella. However, the real pioneers of performance holograms were that quirky cartoon band, Gorillaz. The costs of a holographic setup would make your toes curl, but the technology itself is fairly straightforward: projecting onto a specialized screen that is as close to invisible as one can get. There are two main players in the space, Holo-gauze and Musion.

When it comes to digital artists, there is one name that stands out from all others: Hatsune Miku. What makes Miku so unique compared to holograms of dead celebrities, or even the animated Gorillaz with Blur frontman Damon Albarn, is that she is entirely computer generated: a software instrument manifested as an anime character who has become as much of a ‘real’ celebrity as anyone currently in the charts.

Read more

As I said this morning, there is definitely something going on with quantum today. Maybe it’s the planetary alignment (I saw there was something going on with an alignment in Aquarius today). Either way, this is awesome news.


Rigetti Computing is working on designs for quantum-powered chips to perform previously impossible feats that advance chemistry and machine learning.

Read more

Wham! Another headline: two new companies (Rigetti and QxBranch) are trying to capture the quantum platform crown from D-Wave. Now we can say a real industry race is on.


Based on a recent analysis of our most popular articles, investors seem to have a strong interest in quantum computing. The problem for investors is that there aren’t any pure-play opportunities to invest in quantum computing at the moment. The main reason for this is that there aren’t many companies working on quantum computing. In fact, there’s just one company right now that’s actually selling a quantum computer: Canadian-based startup D-Wave.

D-Wave has released a controversial “quantum computer” and is working with big names like Google, NASA, and Lockheed. D-Wave received some major credibility recently when Google announced that it had solved an optimization problem in seconds that would normally take 10,000 years on a conventional computer. There is one way to get exposure to D-Wave, but it’s hardly a pure play and doesn’t seem overly promising. While there are very few companies other than D-Wave directly involved in quantum computing, we did find two companies that quantum computing investors should keep an eye on.
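As a quick sanity check on the quoted numbers, here is a rough back-of-the-envelope calculation using only the article’s round figures:

```python
# Back-of-the-envelope check on the quoted speedup: "seconds" versus
# "10,000 years" on a conventional machine (the article's round numbers,
# not a benchmark).
SECONDS_PER_YEAR = 365.25 * 24 * 3600       # ~3.16e7
conventional = 10_000 * SECONDS_PER_YEAR    # ~3.16e11 seconds
quantum = 1.0                               # take "seconds" as ~1 s
print(f"Implied speedup: ~{conventional / quantum:.1e}x")
# ~3.2e11x at one second; a run time of minutes instead lands nearer the
# ~1e8x ("100 million times") figure Google quoted for the D-Wave 2X.
```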


Read more

In quantum physics, creating a state of entanglement in particles any larger and more complex than photons usually requires temperatures close to absolute zero and enormously powerful magnetic fields. Now scientists working at the University of Chicago (UChicago) and the Argonne National Laboratory claim to have created this entangled state at room temperature on a semiconductor chip, using atomic nuclei and relatively small magnetic fields.

When two particles, such as photons, are entangled – that is, when they interact physically and are then forcibly separated – the spin direction imparted to each is directly opposite to the other. However, when one of the entangled particles has its spin direction measured, the other particle will immediately display the reverse spin direction, no matter how great a distance they are apart. This is the “spooky action at a distance” phenomenon (as Albert Einstein put it) that has already seen the rise of applications once considered science fiction, such as ultra-safe cryptography and a new realm of quantum computing.
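For readers who want the textbook version, the anticorrelation described above is usually written as the two-particle spin singlet state (standard notation, not taken from the article):

```latex
% Two-particle spin singlet state (standard notation, not from the article):
\[
  \lvert \psi \rangle \;=\; \frac{1}{\sqrt{2}}
  \bigl( \lvert \uparrow \downarrow \rangle - \lvert \downarrow \uparrow \rangle \bigr)
\]
% Measuring particle A's spin along z and finding "up" leaves the pair in
% |up, down>, so the same measurement on particle B gives "down" with
% certainty, regardless of the distance between the two particles.
```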

Ordinarily, quantum entanglement is a rarely observed occurrence in the natural world, as particles coupled in this way first need to be in a highly ordered state before they can be entangled. In essence, this is because thermodynamic entropy dictates that a general chaos of particles is the standard state of things at the atomic level, making such alignments exceedingly rare. Going up to the macro level, the sheer number of particles involved makes entanglement an exceptionally difficult state to achieve.

Read more

“Full exploitation of this information is a major challenge,” officials with the Defense Advanced Research Projects Agency (DARPA) wrote in a 2009 brief on “deep learning.”

“Human observation and analysis of [intelligence, surveillance and reconnaissance] assets is essential, but the training of humans is both expensive and time-consuming. Human performance also varies due to individuals’ capabilities and training, fatigue, boredom, and human attentional capacity.”

Working with a team of researchers at MIT, DARPA is hoping to take all of that human know-how and shrink it down into a processing unit no bigger than your cellphone, using a microchip known as “Eyeriss.” The concept relies on “neural networks”: computerized memory networks based on the workings of the human brain.
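As a concrete picture of what “neural network” means here, the toy forward pass below wires a couple of layers of weighted sums and nonlinearities together in Python. It is purely illustrative and has nothing to do with the Eyeriss chip’s actual architecture.

```python
# Toy "neural network" forward pass: layers of units, each a weighted sum
# of inputs followed by a nonlinearity. Illustrative only; it bears no
# relation to the Eyeriss chip's actual architecture.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# 64 inputs -> 16 hidden units -> 2 output scores.
W1 = rng.normal(scale=0.1, size=(64, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 2));  b2 = np.zeros(2)

def forward(x):
    """Score a 64-value input patch; returns two class scores."""
    hidden = relu(x @ W1 + b1)   # each unit: weighted sum + nonlinearity
    return hidden @ W2 + b2

patch = rng.random(64)           # stand-in for a chunk of sensor data
print(forward(patch))
```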

Read more

Meet “Unreal Engine”: VR’s new friend in VR game creation.


Epic Games has been teasing “the future of VR development” recently, and the team is finally ready to tell everyone what that is: Creating virtual reality content within virtual reality itself, using the full version of its Unreal Engine 4. Epic cofounder Tim Sweeney says that while the company’s been supporting the likes of the Oculus Rift from the outset, the irony is that, up to this point, the experiences we’ve seen so far have been developed using the same tools as traditional video games. “Now you can go into VR, have the entire Unreal editor functioning and do it live,” he says. “It almost gives you god-like powers to manipulate the world.”

So rather than using the same 2D tools (a keyboard, mouse and computer monitor) employed in traditional game development, people making experiences for VR in Unreal can now use a head-mounted display and motion controllers to manipulate objects in a 3D space. “Your brain already knows how to do this stuff because we all have an infinite amount of experience picking up and moving 3D objects,” Sweeney says. “The motions you’d do in the real world, you’d do in the editor and in the way you’d expect to: intuitively.”

Imagine walking around an environment you’re creating in real time, like a carpenter surveying his or her progress while building a house. Looking around, you notice that the pillar you dropped in place earlier is unexpectedly blocking some of the view through a window you just added. Now there isn’t a clear line of sight to the snowcapped mountain on the horizon. Within the VR update for Unreal Engine 4, you can pick the pillar up with your hands and adjust its placement until it’s right.
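For the curious, the grab interaction described above can be sketched in a few lines: while an object is held, it simply keeps its offset from the motion controller, so moving your hand moves the object. This is a hypothetical standalone Python sketch, not Unreal Engine’s API.

```python
# Hypothetical grab-and-move sketch (plain Python, not Unreal's API):
# while held, the object keeps its offset from the controller, so
# controller motion carries it along. Positions only; no rotation.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __add__(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

class Grabbable:
    def __init__(self, position):
        self.position = position
        self._offset = None              # offset from controller while held

    def grab(self, controller_pos):
        self._offset = self.position - controller_pos

    def update(self, controller_pos):
        if self._offset is not None:     # follow the hand each frame
            self.position = controller_pos + self._offset

    def release(self):
        self._offset = None

pillar = Grabbable(Vec3(5.0, 0.0, 2.0))
hand = Vec3(4.0, 1.0, 2.0)
pillar.grab(hand)                        # squeeze the grip near the pillar
pillar.update(Vec3(6.0, 1.0, 3.0))       # move your hand; the pillar follows
pillar.release()
print(pillar.position)                   # Vec3(x=7.0, y=0.0, z=3.0)
```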

Read more