
“Beyond implementation of quantum communication technologies, nanotube-based single photon sources could enable transformative quantum technologies including ultra-sensitive absorption measurements, sub-diffraction imaging, and linear quantum computing. The material has potential for photonic, plasmonic, optoelectronic, and quantum information science applications…”


In optical communication, critical information ranging from credit card numbers to national security data is transmitted in streams of laser pulses. However, information transmitted this way can be stolen by splitting a few photons (the quanta of light) off the laser pulse. This type of eavesdropping could be prevented by encoding bits of information in quantum mechanical states (e.g., the polarization state) of single photons. The ability to generate single photons on demand holds the key to realizing such a communication scheme.
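The scheme described here is, in essence, quantum key distribution in the style of the BB84 protocol. As a rough illustration of why single photons thwart the photon-splitting attack, here is a toy simulation of an intercept-and-resend eavesdropper; the two-basis polarization encoding is standard BB84, but everything in the code is our own illustrative sketch, not the Los Alamos work:

```python
import random

# Toy BB84-style model: each bit rides on ONE photon, prepared in one of
# two polarization bases. An eavesdropper forced to measure single photons
# in a randomly guessed basis corrupts ~25% of the matched-basis bits,
# making the tap detectable -- unlike a classical pulse, where a few
# split-off photons leak the data silently.

BASES = ("+", "x")  # rectilinear and diagonal polarization bases

def measure(photon, basis):
    """Matching basis returns the encoded bit; wrong basis is a coin flip."""
    prep_basis, bit = photon
    return bit if basis == prep_basis else random.randint(0, 1)

def error_rate(n=10_000, eavesdrop=False):
    errors = total = 0
    for _ in range(n):
        bit, a_basis = random.randint(0, 1), random.choice(BASES)
        photon = (a_basis, bit)                      # Alice prepares
        if eavesdrop:                                # Eve intercepts, resends
            e_basis = random.choice(BASES)
            photon = (e_basis, measure(photon, e_basis))
        b_basis = random.choice(BASES)
        result = measure(photon, b_basis)            # Bob measures
        if b_basis == a_basis:                       # keep matched-basis bits
            total += 1
            errors += (result != bit)
    return errors / total

print(f"quiet channel error:  {error_rate():.3f}")                # ~0.000
print(f"tapped channel error: {error_rate(eavesdrop=True):.3f}")  # ~0.250
```

Alice and Bob then publicly compare a sample of their matched-basis bits; an error rate far above zero reveals the eavesdropper, a guarantee that classical laser pulses cannot offer.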

By demonstrating that incorporating pristine carbon nanotubes into a silicon dioxide (SiO2) matrix can create solitary oxygen dopant states capable of fluctuation-free, room-temperature single-photon emission, Los Alamos researchers revealed a new path toward on-demand single photon generation. Nature Nanotechnology published their findings.

Read more

1. Silicon technology has taken humanity a long way forward from 1947, when the first transistor was invented by the Nobel prize winners Shockley, Bardeen and Brattain.

2. From the smart mobile telephones we rely on to the sophisticated satellite navigation systems guiding our cars, much of the techno-magic we see around us is the result of our ability to scale silicon technology, turning what was hitherto science fiction into everyday reality at affordable prices.

3. All the Nobel laureates, scientists and engineers we liaise with at Quantum Innovation Labs http://QiLabs.net collectively realise that the silicon-scaling era is coming to an end as Moore’s Law for silicon-based computers finally runs its course.

Read more

1. Google Search.

2. Facebook’s News Feed.

3. OKCupid Date Matching.

4. NSA Data Collection, Interpretation, and Encryption.

5. “You May Also Enjoy…”

6. Google AdWords.

7. High Frequency Stock Trading.

8. MP3 Compression.

9. IBM’s CRUSH (Criminal Reduction Utilizing Statistical History).

10. Auto-Tune.


The importance of algorithms in our lives today cannot be overstated. They are used virtually everywhere, from financial institutions to dating sites. But some algorithms shape and control our world more than others — and these ten are the most significant.

Just a quick refresher before we get started. Though there’s no formal definition, computer scientists describe an algorithm as a set of rules that defines a sequence of operations: a series of instructions that tell a computer how to solve a problem or achieve a certain goal. A good way to think about an algorithm is to visualize a flowchart.
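To make the flowchart picture concrete, here is one of the oldest algorithms on record, Euclid’s method for finding the greatest common divisor, written out as a minimal Python sketch (our own illustrative example, not one drawn from the list above):

```python
def gcd(a, b):
    """Euclid's algorithm: a sequence of simple rules, applied repeatedly.

    As a flowchart:
        [start] -> <is b zero?> --yes--> [answer: a]
                        |
                        no
                        v
            [replace (a, b) with (b, a mod b)] -> back to the test
    """
    while b != 0:        # decision box: is there more work to do?
        a, b = b, a % b  # process box: apply the one rule
    return a             # terminator: the goal is reached

print(gcd(48, 18))  # -> 6, after three passes through the loop
```

Every algorithm on the list above, from search ranking to Auto-Tune, is ultimately this same thing scaled up: a fixed set of rules marched through until a goal is reached.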

Read more

An international team of researchers from the National Physical Laboratory (NPL), IBM, the University of Edinburgh and Auburn University have shown that a new device concept — a ‘squishy’ transistor — can overcome the predicted power bottleneck caused by CMOS (complementary metal-oxide-semiconductor) technology reaching its fundamental limits.

Moore’s law predicted that the number of transistors able to fit on a given die area would double every two years. As transistor density doubled, chip size shrank and processing speeds increased. This march of progress led to rapid advances in computing and a surge in the number of interconnected devices. The challenge with making anything smaller is that there are fundamental physical limits that can’t be ignored, and we are now entering the final years of CMOS transistor shrinkage.
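For a rough feel for what doubling every two years means, here is a back-of-envelope sketch; the 1971 baseline of about 2,300 transistors (Intel’s 4004) is our own illustrative datum, not a figure from the article:

```python
# Moore's law as arithmetic: N(t) = N0 * 2**((t - t0) / 2), i.e. the
# transistor count doubles every two years. Baseline: ~2,300 transistors
# on the Intel 4004 in 1971 (illustrative datum, not from the article).
N0, T0 = 2_300, 1971

def transistors(year):
    return N0 * 2 ** ((year - T0) / 2)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
# 1971: ~2,300   1991: ~2.4 million   2011: ~2.4 billion   2021: ~77 billion
```

Fifty years of that curve is why the end of shrinkage matters so much: few engineering trends have ever compounded this fast for this long.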

Furthermore, this proliferation is driving an increase in data volume, accompanied by rising energy demands to process, store and communicate it all; as a result, IT infrastructure now draws an estimated 10% of the world’s electrical power. Previous efforts have focused on reducing the energy consumed per bit. However, we will soon hit a power barrier that prevents continued voltage scaling. The development of novel, low-power devices based on different physical principles is therefore crucial to the continued evolution of IT.

Read more

A small Santa Fe, New Mexico-based company called Knowm claims it will soon begin commercializing a state-of-the-art technique for building computing chips that learn. Other companies, including HP and IBM, have already invested in developing these so-called brain-based chips, but Knowm says it has just achieved a major technological breakthrough that it hopes to push into production within a few years.

The basis for Knowm’s work is a piece of hardware called a memristor, which functions (warning: oversimplification coming) by mimicking synapses in the brain. Rather than committing information to software and conventional computing memory, memristors “learn” by adjusting their resistance (the “ristor” part of memristor) according to the history of current passed through them, much like synapses strengthen the connections between commonly used neurons in the brain.
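To make the synapse analogy concrete (with the same oversimplification warning as above), here is a toy model of a memristive synapse; the update rule is our own illustration and says nothing about Knowm’s actual device physics:

```python
# Toy memristive synapse: conductance (the "weight") persists and drifts
# with the history of voltage pulses applied, so repeated use strengthens
# the connection -- the synapse analogy from the paragraph above. The
# update rule is purely illustrative, not Knowm's device physics.

class MemristiveSynapse:
    def __init__(self, conductance=0.10, rate=0.05):
        self.g = conductance   # persistent state: the "mem" in memristor
        self.rate = rate       # how quickly use strengthens the device

    def pulse(self, voltage):
        """Apply one voltage pulse: the device conducts AND adapts."""
        current = self.g * voltage                       # it computes...
        self.g = min(1.0, self.g + self.rate * voltage)  # ...and remembers
        return current

syn = MemristiveSynapse()
for i in range(5):  # repeated activity, like a frequently used synapse
    print(f"pulse {i}: current = {syn.pulse(1.0):.3f}, g = {syn.g:.3f}")
```

The key architectural point is that computation and memory happen in the same element, so there is no separate fetch-and-store traffic between a processor and RAM, which is where the claimed energy savings come from.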

Done correctly—and this is the result that HP and IBM are after—memristors can make computer chips much smarter, but also very energy efficient. That could mean data centers that don’t use as much energy as small towns, as well as more viable robotics, driverless cars, and other autonomous devices. Alex Nugent, Knowm’s founder and CEO, says memristors—especially the ones his company is working on—offer “a massive leap in efficiency” over traditional CPUs, GPUs, and other hardware now used to power artificial intelligence workloads.

Read more

Intel today announced plans to invest $50 million over the next ten years as part of a quantum computing push to help solve problems such as “large-scale financial analysis and more effective drug development.”

But despite the ambitions and huge cost of the project, company vice president Mike Mayberry admits that “a fully functioning quantum computer is at least a dozen years away.”

The money will be channeled through QuTech, the quantum research institute of Delft University of Technology, and TNO, with Intel additionally pledging to commit its own “engineering resources” to the collaborative effort.

Read more