
Excellent overview of BMI technology.


Less than a century ago, Hans Berger, a German psychiatrist, placed silver foil electrodes on his patients’ heads and observed small ripples of continuous electrical voltage emerging from them. These were the first human brain waves ever recorded. Since Berger’s first recordings, our knowledge of brain structure and function has developed considerably. We now have a much clearer understanding of the neuronal sources that generate these electrical signals, and the technology now available gives us a far denser and more accurate picture of how these signals change over time and across the human scalp.

The recording and analysis of brain signals has advanced to a level where people can now control and interact with devices around them using their brain signals. The field of brain-computer interfaces has garnered huge interest over the past two decades, and the development of low-cost hardware, together with continuously evolving signal analysis techniques, has brought this technology closer to market than ever before.
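The “signal analysis” step often comes down to extracting band-limited power features from the raw voltage trace. A minimal sketch in Python, using a synthetic signal rather than a real EEG recording (the sampling rate, band edges, and simulated rhythm are all illustrative assumptions, not a particular BCI pipeline):

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise, sampled at 256 Hz.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

alpha = band_power(eeg, fs, 8, 12)   # band carrying the simulated rhythm
beta = band_power(eeg, fs, 13, 30)   # comparison band with noise only
print(alpha > beta)
```

A real BCI would compute features like these over sliding windows and feed them to a classifier that maps brain states to device commands.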

Research in the field of brain-computer interfaces was primarily propelled by the need to find novel communication channels for individuals with severe mobility disorders, such as patients with locked-in syndrome. People with the condition have a perfectly functioning brain but are trapped inside a body that no longer responds to the signals transmitted from their brain.

Read more

Microsoft has been working on quantum computing for years, but the company is now coming out of stealth mode and taking a more active role in quantum computer development. It wants to build a theoretical type of machine known as a topological quantum computer, despite the difficulty of doing so.

Read more

Scientists from MIT and Boston University have developed biological cells that can count and ‘remember’ cellular events by creating simple circuits through a series of genes that are activated in a precise order. These circuits, which the scientists say simulate computer chips, could be employed to tally the number of times a cell divides or to track a cycle of developmental stages. Such counting cells could also be used as biosensors to count the number of toxin exposures present in an environment.
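As a loose software analogy (not the actual genetic design from the MIT/BU work), the sequential gene activation can be pictured as a cascade: each event switches on the next gene in a fixed series, and the number of active genes records the count. All names and numbers here are illustrative:

```python
class GeneCascadeCounter:
    """Toy model of a cellular counter: each pulse (e.g. a cell division
    or toxin exposure) activates the next gene in a fixed series, so the
    number of active genes records how many events have occurred."""

    def __init__(self, n_genes):
        self.active = [False] * n_genes

    def pulse(self):
        # The first still-inactive gene switches on, like the next
        # stage in the engineered genetic series.
        for i, on in enumerate(self.active):
            if not on:
                self.active[i] = True
                break

    def count(self):
        return sum(self.active)

counter = GeneCascadeCounter(n_genes=3)
for _ in range(2):          # two events to tally
    counter.pulse()
print(counter.count())      # → 2
```

As in the biological version, the counter's capacity is fixed by the length of the series: a three-gene cascade can count at most to three.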

Read more

Glad that the author is highlighting the need for investment; however, the US government has had a quantum network since 1991, and Wall Street, various overseas banks, ISPs, and others followed well over a year to two years ago. The tech industry has invested in QC for a decade or longer. So while the article raises the need for investing in QC, that investment is no longer experimental: it is now about the daily usage of this technology, as well as planning for the technical transformation coming in the next 5 to 7 years.


Researchers led by Lockheed Martin and IBM are pushing quantum computing prototypes and military applications.

Read more

An international team of scientists has succeeded in making further improvements to the lifetime of superconducting quantum circuits. An important prerequisite for the realization of high-performance quantum computers is that the stored data should remain intact for as long as possible. The researchers, including Jülich physicist Dr. Gianluigi Catelani, have developed and tested a technique that removes unpaired electrons from the circuits. These are known to shorten the qubit lifetime (Science, DOI: 10.1126/science.aah5844).

Quantum computers could one day achieve significantly higher computing speeds than conventional digital computers in performing certain types of tasks. Superconducting circuits are among the most promising candidates for implementing quantum bits, known as qubits, with which quantum computers can store and process information. The high error rates associated with previously available qubits have up to now limited the size and efficiency of quantum computers. Dr. Gianluigi Catelani of the Peter Grünberg Institute (PGI-2) in Jülich, together with his colleagues, has now found a way to prolong the time in which the superconducting circuits are able to store a “0” or a “1” without errors. Besides Catelani, the team comprises researchers working in the USA (Massachusetts Institute of Technology, Lincoln Laboratory, and the University of California, Berkeley), Japan (RIKEN), and Sweden (Chalmers University of Technology).

When superconducting materials are cooled below a material-specific critical temperature, electrons come together to form pairs, and current can then flow without resistance. However, it has so far not been possible to build superconducting circuits in which all of the electrons pair up. Some electrons remain unpaired and cannot flow without resistance. These so-called quasiparticles dissipate energy, which limits how long the circuits can store data.
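The effect of quasiparticles on data retention is often modeled as an extra decay channel whose rate adds to the qubit's intrinsic relaxation rate and grows with the quasiparticle density. A rough numerical sketch of that picture (the rates and densities below are made-up illustrative values, not figures from the Science paper):

```python
import math

def survival_probability(t_us, gamma_intrinsic, gamma_per_xqp, x_qp):
    """Probability the qubit still holds its state after t_us microseconds,
    assuming the decay rates simply add: Gamma = Gamma_int + c * x_qp."""
    gamma = gamma_intrinsic + gamma_per_xqp * x_qp   # total decay rate (1/us)
    return math.exp(-gamma * t_us)

# Illustrative (not measured) numbers: pumping away quasiparticles
# lowers x_qp by two orders of magnitude.
before = survival_probability(10, gamma_intrinsic=0.01, gamma_per_xqp=1e5, x_qp=1e-6)
after = survival_probability(10, gamma_intrinsic=0.01, gamma_per_xqp=1e5, x_qp=1e-8)
print(before, after)
```

In this simple model, once the quasiparticle term is pushed well below the intrinsic rate, the lifetime is set by the intrinsic losses alone, which is why removing unpaired electrons pays off only up to that point.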

Read more

Nantero Inc., the nanotechnology company developing next-generation memory using carbon nanotubes, today announced the closing of a financing round of over $21 million. The round was led by Globespan Capital Partners and included participation from both new and existing strategic and financial investors. Nantero currently has more than a dozen partners and customers in the consumer electronics, enterprise systems, and semiconductor industries actively working on NRAM®. The new funding will enable the company to support these partners in bringing multiple products into the market, while also enabling new customers to begin development. This financing round brings the total invested in Nantero to date to over $110 million.

“This round enables Nantero to accelerate its pace in product development, especially of its multi-gigabyte DDR4-compatible memory product,” said David Poltack, Managing Director, Globespan Capital Partners. “Nantero has multiple industry-leading customers who would like to receive NRAM even sooner. The fact that several of these customers, as well as key partners in the ecosystem, have decided to also invest in Nantero is a strong sign of confidence given how well they know Nantero and its product from years of working together.”

“The customer traction we’ve achieved at Nantero has been overwhelming, as evidenced by our recent announcement that NRAM had been selected by both Fujitsu Semiconductor and Mie Fujitsu Semiconductor,” said Greg Schmergel, Co-Founder & CEO of Nantero. “With this additional funding, we will be able to help these existing customers speed their time to market while also supporting the many other companies that have approached us about using Nantero NRAM in their next generation products.”

Read more

Developments in computing are driving the transformation of entire systems of production, management, and governance. In this interview Justine Cassell, Associate Dean, Technology, Strategy and Impact, at the School of Computer Science, Carnegie Mellon University, and co-chair of the Global Future Council on Computing, says we must ensure that these developments benefit all society, not just the wealthy or those participating in the “new economy”.

Why should the world care about the future of computing?

Today computers are in virtually everything we touch, all day long. We still have an image of computers as being rectangular objects either on a desk, or these days in our pockets; but computers are in our cars, they’re in our thermostats, they’re in our refrigerators. In fact, increasingly computers are no longer objects at all, but they suffuse fabric and virtually every other material. Because of that, we really do need to care about what the future of computing holds because it is going to impact our lives all day long.

Read more

Creative machines: but are they truly free of built-in bias from their own creators?


Despite nature’s bewildering complexity, the driving force behind it is incredibly simple. ‘Survival of the fittest’ is an uncomplicated but brutally effective optimization strategy that has allowed life to solve complex problems, like vision and flight, and colonize the harshest of environments.

Researchers are now trying to harness this optimization process to find solutions to a host of science and engineering problems. The idea of using evolutionary principles in computation dates back to the 1950s, but it wasn’t until the 1960s that the idea really took off. By the 1980s the approach had crossed over from academic curiosities into real-world fields like engineering and economics.

Applying natural selection to computing

Evolutionary algorithms are numerous and diverse, but they all seek to replicate key features of biological evolution, such as natural selection, reproduction, and mutation. Typically these methods rely on a kind of trial and error: a large population of potential solutions to a problem is randomly generated and tested against a so-called “fitness function.” This lets the system rank the solutions in order of how well they solve the problem.
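A minimal sketch of that trial-and-error loop in Python, on a toy problem (evolving a number toward a target value; the population size, mutation scale, and fitness function are arbitrary choices for illustration, not a production-grade evolutionary algorithm):

```python
import random

random.seed(0)

TARGET = 42.0  # toy problem: evolve a number close to TARGET

def fitness(x):
    """Higher is better; penalize distance from the target."""
    return -abs(x - TARGET)

# 1. Randomly generate a large population of candidate solutions.
population = [random.uniform(-100, 100) for _ in range(50)]

for generation in range(100):
    # 2. Rank candidates with the fitness function.
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]            # selection: keep the fittest
    # 3. Reproduction with mutation refills the population.
    population = survivors + [
        random.choice(survivors) + random.gauss(0, 1.0)
        for _ in range(40)
    ]

best = max(population, key=fitness)
print(best)
```

Because the fittest survivors are carried over unchanged each generation, the best fitness can only improve, and after a hundred generations the population clusters tightly around the target.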

Read more

Synbio computing is where we ultimately want to progress more and more, especially once the basic infrastructure is updated with technology like QC.


Cells are often likened to computers, running an operating system that receives signals, processes their input, and responds, according to programming, with cellular output. Yet untangling computer-like pathways in cells is anything but simple, say Denise Montell, professor at the University of California, Santa Barbara, and Aviv Regev, a Howard Hughes Medical Institute investigator at the Massachusetts Institute of Technology and the Broad Institute. However, both are eager to try and will outline their latest efforts at the “Logic of Signaling” symposium at the 2016 ASCB Annual Meeting.

“My lab is understanding how cells maintain and build normal tissues. We’re studying cellular behaviors that underlie normal behavior and tumor metastasis, a great unsolved question in cancer,” Montell said. Her lab recently discovered that cells can bounce back from the brink of apoptotic cell death. “This wasn’t known before so now we’re looking at how cells do it, when do they do it, under what circumstances, and what does it mean,” Montell said.

To track these near-death experiences in cells, the Montell lab generated a genetically encoded sensor in Drosophila. The researchers expected the mechanism to be a stress response, but they found that it also occurs during normal development. “It makes sense retrospectively,” Montell explained, pointing to neuronal development as an example. “You produce way more neurons than you need, and the neurons compete for trophic factors. If a group of cells are competing for trophic factors, then one cell starts to die, but if it gets more trophic factor, it could bounce back.”

Read more