The Higgs mode associated with the amplitude fluctuation of an order parameter can decay into other low-energy bosonic modes, which renders the Higgs mode usually unstable in condensed matter systems. Here, the authors propose a mechanism to stabilize the Higgs mode in anisotropic quantum magnets. They show that magnetic anisotropy gaps out the Goldstone magnon mode and stabilizes the Higgs mode near a quantum critical point. The results are supported by three independent approaches: a bond-operator method, field theory, and quantum Monte Carlo simulation with analytic continuation.
Category: quantum physics
Caltech’s OrbNet deep learning tool outperforms state-of-the-art solutions.
Machine learning, a branch of artificial intelligence (AI), is being applied to help accelerate the complex science of quantum mechanics—the branch of physics that studies matter and light on the subatomic scale. Recently, a team of scientists at the California Institute of Technology (Caltech) published a breakthrough study in The Journal of Chemical Physics that unveils a new machine learning tool called OrbNet, which can perform quantum chemistry computations 1,000 times faster than existing state-of-the-art solutions.
“We demonstrate the performance of the new method for the prediction of molecular properties, including the total and relative conformer energies for molecules in a range of datasets of organic and drug-like molecules,” wrote the researchers.
Quantum chemistry is a scientific field that combines chemistry and physics. Also known as molecular quantum mechanics, quantum chemistry is a subset of chemistry that studies the properties and behavior of molecules at the subatomic level through the lens of quantum mechanics.
Physics theory suggests that exotic excitations can exist in the form of bound states confined in the proximity of topological defects, for instance, in the case of Majorana zero modes that are trapped in vortices within topological superconducting materials. Better understanding these states could aid the development of new computational tools, including quantum technologies.
One phenomenon that has attracted attention over the past few years is “braiding,” in which quasiparticles in particular states (i.e., Majorana fermions) are wound around one another so that their paths intertwine. Some physicists have theorized that this phenomenon could enable the development of a new type of quantum technology, namely topological quantum computers.
Researchers at Pennsylvania State University, University of California-Berkeley, Iowa State University, University of Pittsburgh, and Boston University have recently tested the hypothesis that braiding also occurs in particles other than electrons, such as photons (i.e., particles of light). In a paper published in Nature Physics, they present the first experimental demonstration of braiding using photonic topological zero modes.
International Business Machines, still the legal name of the century-plus-old IBM, has managed over the years to pull off a dubious feat. Despite selling goods and services in one of the most dynamic industries in the world, the IT sector the company helped create, it has avoided growing.
The company that was synonymous with mainframes, that dominated the early days of the personal computer (a “PC” once meant a device that ran software built to IBM’s technical standards), and that reinvented itself as a tech-consulting goliath, lagged while upstarts and a few of its old competitors zoomed past it.
What IBM excelled at more often was marketing a version of its aspirational self. Its consultants would advise urban planners on how to create “smart cities.” Its command of artificial intelligence, packaged into a software offering whose name evoked its founding family, would cure cancer. Its CEO would wow the Davos set with cleverly articulated visions of how corporations could help fix the ills of society.
What IBM did not do was grow or participate sufficiently in the biggest trend in business-focused IT, cloud computing. Now, in the words of veteran tech analyst Toni Sacconaghi of research shop Bernstein, new IBM CEO Arvind Krishna is pursuing a strategy of “growth through subtraction.” The company is spinning off its IT outsourcing business, a low-growth, low-margin portion of its services business that rings up $19 billion in annual sales. Krishna told Aaron and Fortune writer Jonathan Vanian that he plans to bulk back up after the spinoff via acquisitions. “We’re open for business,” he said.
The move is bold, if risky. The reason it took so long, and presumably a new leader, to jettison the outsourcing business is that it was meant to drive sales of IBM hardware and other services. But Krishna, promoted for his association with IBM’s nascent cloud-computing effort—just as Microsoft’s Satya Nadella ran his company’s cloud arm before taking the top job—recognizes that only by discarding a moribund business can IBM focus and invest properly in the one that matters.
IBM CEO Arvind Krishna is spinning off IBM’s IT outsourcing services unit to focus on cloud and quantum computing.
A new algorithm that fast-forwards simulations could bring greater usability to current and near-term quantum computers, opening the way for applications to run past the strict time limits that hamper many quantum calculations.
“Quantum computers have a limited time to perform calculations before their useful quantum nature, which we call coherence, breaks down,” said Andrew Sornborger of the Computer, Computational, and Statistical Sciences division at Los Alamos National Laboratory, and senior author on a paper announcing the research. “With a new algorithm we have developed and tested, we will be able to fast forward quantum simulations to solve problems that were previously out of reach.”
Computers built of quantum components, known as qubits, can potentially solve extremely difficult problems that exceed the capabilities of even the most powerful modern supercomputers. Applications include faster analysis of large data sets, drug development, and unraveling the mysteries of superconductivity, to name a few of the possibilities that could lead to major technological and scientific breakthroughs in the near future.
Quantum mechanics, the physics of atoms and subatomic particles, can be strange, especially compared to the everyday physics of Isaac Newton’s falling apples. But this unusual science is enabling researchers to develop new ideas and tools, including quantum computers, that can help demystify the quantum realm and solve complex everyday problems.
That’s the goal behind a new U.S. Department of Energy Office of Science (DOE-SC) grant, awarded to Michigan State University (MSU) researchers, led by physicists at Facility for Rare Isotope Beams (FRIB). Working with Los Alamos National Laboratory, the team is developing algorithms – essentially programming instructions – for quantum computers to help these machines address problems that are difficult for conventional computers. For example, problems like explaining the fundamental quantum science that keeps an atomic nucleus from falling apart.
The $750,000 award, provided by the Office of Nuclear Physics within DOE-SC, is the latest in a growing list of grants supporting MSU researchers developing new quantum theories and technology.
Secure telecommunications networks and rapid information processing make much of modern life possible. To provide more secure, faster, and higher-performance information sharing than is currently possible, scientists and engineers are designing next-generation devices that harness the rules of quantum physics. Those designs rely on single photons to encode and transmit information across quantum networks and between quantum chips. However, tools for generating single photons do not yet offer the precision and stability required for quantum information technology.
Now, as reported recently in the journal Science Advances, researchers have found a way to generate single, identical photons on demand. By positioning a metallic probe over a designated point in a common 2-D semiconductor material, the team led by researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has triggered a photon emission electrically. The photon’s properties can be adjusted simply by changing the applied voltage.
“The demonstration of electrically driven single-photon emission at a precise point constitutes a big step in the quest for integrable quantum technologies,” said Alex Weber-Bargioni, a staff scientist at Berkeley Lab’s Molecular Foundry who led the project. The research is part of the Center for Novel Pathways to Quantum Coherence in Materials (NPQC), an Energy Frontier Research Center sponsored by the Department of Energy, whose overarching goal is to find new approaches to protect and control quantum memory that can provide new insights into novel materials and designs for quantum computing technology.
Whenever I tell my friends about the potential of Quantum Computing, for example, how a Quantum Computer (QC) can do a large number of calculations in parallel worlds, they look at me like I’m kind of crazy.
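To make the "parallel" intuition a little more concrete: a register of n qubits is described by 2**n complex amplitudes, so a single operation acts on all 2**n basis states at once. Here is a minimal NumPy sketch (an illustration of the standard state-vector picture, not code from the article) showing how quickly that state space grows:

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Return the state vector of n qubits in an equal superposition.

    An n-qubit register is described by 2**n complex amplitudes, so a
    single gate applied to this state acts on all 2**n basis states at
    once -- the sense in which a quantum computer computes "in parallel".
    """
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(10)
print(len(state))                                    # → 1024 amplitudes from just 10 qubits
print(round(float(np.sum(np.abs(state) ** 2)), 6))   # → 1.0 (probabilities sum to 1)
```

The catch, of course, is that a measurement collapses all those amplitudes to a single outcome, which is why quantum speedups require clever algorithms rather than brute-force parallelism.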
Quantum mechanics arose in the 1920s, and scientists have disagreed ever since on how best to interpret it. Many interpretations, including the Copenhagen interpretation presented by Niels Bohr and Werner Heisenberg, and in particular the von Neumann–Wigner interpretation, hold that the consciousness of the person conducting the measurement affects its result. On the other hand, Karl Popper and Albert Einstein thought that an objective reality exists. Erwin Schrödinger put forward the famous thought experiment involving the fate of an unfortunate cat, intended to illustrate the shortcomings of quantum mechanics.
In their most recent article, Finnish civil servants Jussi Lindgren and Jukka Liukkonen, who study quantum mechanics in their free time, take a look at the uncertainty principle that was developed by Heisenberg in 1927. According to the traditional interpretation of the principle, location and momentum cannot be determined simultaneously to an arbitrary degree of precision, as the person conducting the measurement always affects the values.
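For reference, the relation Lindgren and Liukkonen revisit is usually written in its standard textbook form (not quoted from the article):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where $\Delta x$ and $\Delta p$ are the standard deviations of position and momentum, and $\hbar$ is the reduced Planck constant.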
However, in their study Lindgren and Liukkonen concluded that the correlation between a location and momentum, i.e., their relationship, is fixed. In other words, reality is an object that does not depend on the person measuring it. Lindgren and Liukkonen utilized stochastic dynamic optimization in their study. In their theory’s frame of reference, Heisenberg’s uncertainty principle is a manifestation of thermodynamic equilibrium, in which correlations of random variables do not vanish.