JILA researchers have measured hundreds of individual quantum energy levels in the buckyball, a spherical cage of 60 carbon atoms. It’s the largest molecule that has ever been analyzed at this level of experimental detail in the history of quantum mechanics. Fully understanding and controlling this molecule’s quantum details could lead to new scientific fields and applications, such as an entire quantum computer contained in a single buckyball.
Engineering researchers have demonstrated proof-of-principle for a device that could serve as the backbone of a future quantum Internet. University of Toronto Engineering professor Hoi-Kwong Lo and his collaborators have developed a prototype for a key element for all-photonic quantum repeaters, a critical step in long-distance quantum communication.
A single quantum particle can send a two-way signal, scientists have discovered — something that’s impossible in classical physics. That means a particle can essentially send messages to itself thanks to the wacky state of uncertainty known as superposition.
Superposition means that a single particle can occupy two positions at once, and that is how the two-way communication happens.
In May 2016 I stumbled upon a highly controversial Aeon article titled “The Empty Brain: Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer” by psychologist Robert Epstein. The article showed me once again just how wide the range of professional opinion can be when it comes to the brain and mind in general. Unsurprisingly, it drew outrage from the reading audience. I myself disagree with the author on most fronts, but on one point I actually agree with him: yes, our brains are not “digital computers.” They are, rather, neural networks in which each neuron might function somewhat like a quantum computer. The author never offered his own account of what human brains are like; he only criticized IT metaphors. It’s my impression that at the time of writing the psychologist hadn’t even come across such terms as neuromorphic computing, quantum computing, cognitive computing, deep learning, evolutionary computing, computational neuroscience, deep neural networks, and the like. All these IT concepts clearly indicate that today’s AI research and computer science derive their inspiration from human brain information processing, notably neuromorphic neural networks aspiring to incorporate quantum computing into AI cognitive architecture. Deep neural networks learn by doing, just like children.
By Alex Vikoulov.
“I have always been convinced that the only way to get artificial intelligence to work is to do the computation in a way similar to the human brain. That is the goal I have been pursuing. We are making progress, though we still have lots to learn about how the brain actually works.”
The human brain has amazing capabilities, making it in many ways more powerful than the world’s most advanced computers. So it’s not surprising that engineers have long been trying to copy it. Today, artificial neural networks inspired by the structure of the brain are used to tackle some of the most difficult problems in artificial intelligence (AI). But this approach typically involves building software that processes information in a similar way to the brain, rather than creating hardware that mimics neurons.
My colleagues and I instead hope to build the first dedicated neural network computer, using the latest “quantum” technology rather than AI software. By combining these two branches of computing, we hope to produce a breakthrough which leads to AI that operates at unprecedented speed, automatically making very complex decisions in a very short time.
We need much more advanced AI if we want it to help us create things like truly autonomous self-driving cars and systems for accurately managing the traffic flow of an entire city in real time. Many attempts to build this kind of software involve writing code that mimics the way neurons in the human brain work and combining many of these artificial neurons into a network. Each neuron mimics a decision-making process by taking a number of input signals and processing them to give an output corresponding to either “yes” or “no”.
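The decision process just described can be sketched as a minimal artificial neuron (a classic perceptron). The weights, bias, and example inputs below are illustrative values I am assuming, not figures from the article:

```python
# A minimal artificial neuron: weighted inputs, a bias, and a yes/no output.
# Weights and inputs are illustrative assumptions, not from the article.

def neuron(inputs, weights, bias):
    """Return 1 ("yes") if the weighted sum of inputs exceeds 0, else 0 ("no")."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# Example: with these weights and bias, the neuron says "yes"
# only when both input signals are active (a logical AND).
weights = [1.0, 1.0]
bias = -1.5

print(neuron([1, 1], weights, bias))  # 1 ("yes")
print(neuron([1, 0], weights, bias))  # 0 ("no")
```

A network is built by feeding the outputs of such neurons into further layers of neurons; learning then amounts to adjusting the weights and biases.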
An old thought experiment now appears in a new light. In 1935 Erwin Schrödinger formulated a thought experiment designed to capture the paradoxical nature of quantum physics. A group of researchers led by Gerhard Rempe, Director of the Department of Quantum Dynamics at the Max Planck Institute of Quantum Optics, has now realized an optical version of Schrödinger’s thought experiment in the laboratory. In this instance, pulses of laser light play the role of the cat. The insights gained from the project open up new prospects for enhanced control of optical states that could in future be used in quantum communications.
Aeronautics giant Airbus today announced that it is creating a global competition to encourage developers to find ways quantum computing can be applied to aircraft design.
Quantum computing is one of many next-generation computing architectures being explored as engineers worry that traditional computing is reaching its physical limits.
Computers today process information using bits, either 0s or 1s, stored in electrical circuits made up of transistors. Quantum computers harness the power of quantum systems, such as atoms that can simultaneously exist in multiple states and can be used as “quantum bits” or “qubits.” These can theoretically handle far more complex calculations.
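The bit-versus-qubit distinction can be made concrete in a few lines. This is a generic illustration of superposition amplitudes and the exponential growth of quantum state space, not any particular quantum computer’s API:

```python
import math

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes a and b in the state a*|0> + b*|1>, with |a|^2 + |b|^2 = 1.
# Measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.

a = 1 / math.sqrt(2)   # amplitude on |0>
b = 1 / math.sqrt(2)   # amplitude on |1>

p0 = abs(a) ** 2       # probability of reading out 0  (here 0.5)
p1 = abs(b) ** 2       # probability of reading out 1  (here 0.5)
# Until measured, the single qubit carries both values at once.

# Describing n qubits classically requires 2**n amplitudes. This exponential
# state space is why qubits can "theoretically handle far more complex
# calculations" than the same number of classical bits.
def amplitudes_needed(n_qubits):
    return 2 ** n_qubits

print(amplitudes_needed(50))  # roughly 10**15 numbers just to write the state down
```

Note that the exponential advantage is conditional: extracting an answer still requires a measurement, which collapses the superposition to a single classical outcome.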
An international research team led by the University of Liverpool and McMaster University has made a significant breakthrough in the search for new states of matter.
CERN has revealed plans for a gigantic successor to the LHC, the giant atom smasher that is already the biggest machine ever built. Particle physicists will never stop asking for ever-larger big bang machines. But where are the limits for ordinary society in terms of costs and existential risks?
CERN boffins are already conducting a mega-experiment at the LHC, a 27 km circular particle collider built at a cost of several billion euros, to study the conditions of matter as they existed fractions of a second after the big bang and to hunt for the smallest possible particle – though the question is how they could ever know they had found it. Now they profess to be a little upset because they have not found any particles beyond the Standard Model, that is, anything they did not expect. To remedy that, particle physicists would like to build an even larger “Future Circular Collider” (FCC) near Geneva, where CERN enjoys extraterritorial status, with a ring of 100 km, for about 24 billion euros.
Experts point out that this research could be as limitless as the universe itself. The UK’s former Chief Scientific Advisor, Prof Sir David King, told the BBC: “We have to draw a line somewhere otherwise we end up with a collider that is so large that it goes around the equator. And if it doesn’t end there perhaps there will be a request for one that goes to the Moon and back.”
“There is always going to be more deep physics to be conducted with larger and larger colliders. My question is to what extent will the knowledge that we already have be extended to benefit humanity?”
There have been broad discussions about whether high-energy nuclear experiments could sooner or later pose an existential risk, for example by producing micro black holes (mBH) or strange matter (strangelets). A strangelet could convert ordinary matter into strange matter and eventually trigger a runaway chain reaction from the moment it became stable – theoretically at a mass of around 1,000 protons.
CERN has argued that even if micro black holes were produced, they would not be stable and would evaporate almost immediately due to “Hawking radiation”, a theoretical process that has never been observed.
Furthermore, CERN argues that similar high-energy particle collisions occur naturally in the universe and in the Earth’s atmosphere, so the experiments cannot be dangerous. However, such natural high-energy collisions are rare, and they have only been measured rather indirectly. Basically, nature does not set up LHC experiments: the density of such artificial particle collisions never occurs in Earth’s atmosphere. And even if the cosmic-ray argument were legitimate, CERN produces as many high-energy collisions in an artificially narrow space as occur naturally in more than a hundred thousand years in the whole atmosphere. Physicists look quite puzzled when they recalculate it.
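The rate comparison can be sanity-checked with a back-of-envelope calculation. Every input number below is a rough order-of-magnitude assumption supplied purely for illustration; none of them comes from the article or from CERN:

```python
# Back-of-envelope sketch of the claim that one year of LHC running packs in
# as many LHC-energy collisions as the atmosphere sees naturally over a very
# long time. ALL inputs are assumed order-of-magnitude values for illustration.

lhc_collisions_per_sec = 1e9      # assumed proton-proton collision rate
run_seconds_per_year = 1e7        # assumed effective running time per year
lhc_per_year = lhc_collisions_per_sec * run_seconds_per_year   # 1e16

# Cosmic rays with LHC-equivalent energy are rare; assume a flux of
# ~0.1 such particles per km^2 per day over the whole planet.
flux_per_km2_per_day = 0.1
earth_area_km2 = 5.1e8            # Earth's total surface area
natural_per_year = flux_per_km2_per_day * earth_area_km2 * 365  # ~2e10

years_of_nature_per_lhc_year = lhc_per_year / natural_per_year
print(f"{years_of_nature_per_lhc_year:.0e}")   # hundreds of thousands of years
```

Under these assumed inputs the ratio comes out in the hundreds of thousands, which is the order of magnitude the article asserts; with different assumptions about the rates and flux, the ratio shifts accordingly.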
Others argue that a particle collider ring would have to be bigger than the Earth to be dangerous.
Since these discussions can become very sophisticated, there is also a more general approach (see video): according to present research, there are around 10 billion Earth-like planets in our galaxy, the Milky Way, alone. Intelligent life might be expected to send radio waves, because they are extremely long-lasting, yet we have not received any (the “Fermi paradox”). Theory postulates that there could be a “great filter”: something that wipes out intelligent civilizations at a rather early stage of their technical development. Let that sink in.
All technical civilizations would presumably start building particle smashers to find out how the universe works, to get as close as possible to the big bang, and to hunt for the smallest particle with bigger and bigger machines. But maybe a very unexpected effect lurks at a certain threshold, one that nobody would ever think of and that theory does not predict. This could indeed be a logical candidate for the “great filter” and an explanation for the Fermi paradox. If so, a disastrous big bang machine need not be that big at all: had civilizations needed to construct a collider of epic dimensions, a lack of resources would have stopped most of them first.
Finally, the CERN member states will have to decide on the budget and the future course.
The political question behind it is: how far are the ordinary citizens who pay for all this willing to go?
LHC-Critique / LHC-Kritik
Network to discuss the risks at experimental subnuclear particle accelerators