
The first 256-qubit quantum computer has been announced by startup company QuEra, founded by MIT and Harvard scientists.

QuEra Computing Inc. – a new Boston, Massachusetts-based company – has emerged from stealth mode with $17 million in funding and has completed the assembly of a 256-qubit device. Its funders include Japanese e-commerce giant Rakuten, Day One Ventures, Frontiers Capital, and the leading tech investors Serguei Beloussov and Paul Maritz. The company recently received a DARPA award, and has already generated $11 million in revenue.

QuEra Computing builds on ground-breaking neutral-atom research developed at Harvard University and the Massachusetts Institute of Technology, which serves as the basis for a highly scalable, programmable quantum computing platform. The QuEra team aims to build the world’s most powerful quantum computers to take on computational tasks that are currently deemed impossibly hard.

Physicists have created a new ultra-thin, two-layer material with quantum properties that normally require rare earth compounds. This material, which is relatively easy to make and does not contain rare earth metals, could provide a new platform for quantum computing and advance research into unconventional superconductivity and quantum criticality.

The researchers showed that by starting from seemingly common materials, a radically new quantum state of matter can appear. The discovery emerged from their efforts to create a quantum spin liquid which they could use to investigate emergent quantum phenomena such as gauge theory. This involves fabricating a single layer of atomically thin tantalum disulphide, but the process also creates islands that consist of two layers.

When the team examined these islands, they found that interactions between the two layers induced a phenomenon known as the Kondo effect, leading to a macroscopically entangled state of matter that behaves as a heavy-fermion system.

Bringing global giants into the economic fight.

The onetime giant of chip-making is getting serious.

Japan has committed $5.2 billion (roughly 600 billion yen) toward providing support for semiconductor manufacturers in a bid to help solve the world’s ongoing chip shortage.

While the funds will go to several chipmakers, the most notable is the world’s largest, Taiwan Semiconductor Manufacturing Co. (TSMC), according to an initial Tuesday report from Nikkei.

TSMC also said that it would construct a new chip plant in Japan for $7 billion in a joint effort with Sony Group Corp. Understandably, the government of Japan was pleased.

The remaining 200 billion yen of Japan’s new investment will be directed toward preparing other factories for multiple new projects, including one under development by the U.S. memory chipmaker Micron Technology Inc. and Japan’s Kioxia Holdings, according to the report.

Japan had the world’s largest chip-making industry in the 1980s. But since then the nation has fought an uphill battle to maintain its competitive edge in an increasingly crowded industry, falling into a steady decline over the last three decades as rivals such as Taiwan-based manufacturers steadily closed the gap.


TSMC getting half of its factory paid for by the Japanese government shows how concerned governments have become about the fact that only three companies in the world can make high-end chips. Sony is also contributing $500 million to this factory on top of the Japanese government money.


Japan will provide 600 billion yen ($5.2 billion) as part of its fiscal 2021 supplementary budget to support advanced semiconductor manufacturers.

Circa 2021


By varying the presence of different building blocks in a computational model, Jackson et al. reverse-engineer the architecture for controlled semantic cognition and test this model against evidence from anatomy, neuropsychology and functional imaging.

Calculating what happens at the hidden center of particle collisions is where physicists are getting stuck.

Zooming in to that hidden center involves virtual particles — quantum fluctuations that subtly influence each interaction’s outcome. The fleeting existence of a virtual quark pair, like many virtual events, is represented by a Feynman diagram with a closed “loop.” Loops confound physicists — they’re black boxes that introduce additional layers of infinite scenarios. To tally the possibilities implied by a loop, theorists must turn to a summing operation known as an integral. These integrals take on monstrous proportions in multi-loop Feynman diagrams, which come into play as researchers march down the line and fold in more complicated virtual interactions.
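
To make that “summing operation” concrete, the simplest one-loop case is a scalar “bubble” diagram: the loop momentum k is unobservable, so every value it could take must be integrated over. A standard schematic form (purely illustrative, with external momentum p and internal mass m, not tied to any particular calculation discussed here) is:

$$ I(p^2) = \int \frac{d^4 k}{(2\pi)^4} \, \frac{1}{\left(k^2 - m^2 + i\epsilon\right)\left((k+p)^2 - m^2 + i\epsilon\right)} $$

Each additional loop adds another four-dimensional integration of this kind, which is why multi-loop diagrams balloon in difficulty.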

Physicists have algorithms to compute the probabilities of no-loop and one-loop scenarios, but many two-loop collisions bring computers to their knees. This imposes a ceiling on predictive precision — and on how well physicists can understand what quantum theory says.

New work on linear-probing hash tables from MIT

MIT is an acronym for the Massachusetts Institute of Technology, a prestigious private research university in Cambridge, Massachusetts, founded in 1861. It is organized into five schools: architecture and planning; engineering; humanities, arts, and social sciences; management; and science. MIT’s impact includes many scientific breakthroughs and technological advances.
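
For readers unfamiliar with the data structure in the headline: a linear-probing hash table stores keys in one flat array and resolves collisions by stepping forward to the next free slot. The MIT results themselves are not reproduced here; the Python sketch below, with hypothetical class and method names, only illustrates the basic mechanism.

class LinearProbingTable:
    """Minimal linear-probing hash table (illustrative sketch only)."""

    EMPTY = object()  # sentinel marking unused slots

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.slots = [self.EMPTY] * capacity   # keys
        self.values = [None] * capacity        # parallel array of values
        self.size = 0

    def _probe(self, key):
        # Start at the hashed slot and walk forward (wrapping around)
        # until we find the key or hit an empty slot.
        i = hash(key) % self.capacity
        while self.slots[i] is not self.EMPTY and self.slots[i] != key:
            i = (i + 1) % self.capacity
        return i

    def put(self, key, value):
        if self.size >= self.capacity // 2:    # keep the table at most half full
            self._resize(self.capacity * 2)
        i = self._probe(key)
        if self.slots[i] is self.EMPTY:
            self.size += 1
        self.slots[i] = key
        self.values[i] = value

    def get(self, key):
        i = self._probe(key)
        if self.slots[i] is self.EMPTY:
            raise KeyError(key)
        return self.values[i]

    def _resize(self, new_capacity):
        entries = [(k, v) for k, v in zip(self.slots, self.values) if k is not self.EMPTY]
        self.capacity, self.size = new_capacity, 0
        self.slots = [self.EMPTY] * new_capacity
        self.values = [None] * new_capacity
        for k, v in entries:
            self.put(k, v)

# Example usage:
table = LinearProbingTable()
table.put("qubits", 256)
print(table.get("qubits"))  # -> 256

Linear probing’s appeal is cache friendliness, since lookups scan contiguous memory; its classic weakness is clustering of occupied slots as the table fills, which is why this sketch caps the load factor at one half.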

The most promising application in biomedicine is in computational chemistry, where researchers have long exploited a quantum approach. But the Fraunhofer Society hopes to spark interest among a wider community of life scientists, such as cancer researchers, whose research questions are not intrinsically quantum in nature.

“It’s uncharted territory,” says oncologist Niels Halama of the DKFZ, Germany’s national cancer center in Heidelberg. Working with a team of physicists and computer scientists, Halama is planning to develop and test algorithms that might help stratify cancer patients, and select small subgroups for specific therapies from heterogeneous data sets.

This is important for precision medicine, he says, but classical computing has insufficient power to find very small groups in the large and complex data sets that oncology, for example, generates. The time needed to complete such a task may stretch out over many weeks—too long to be of use in a clinical setting, and also too expensive. Moreover, the steady improvements in the performance of classical computers are slowing, thanks in large part to fundamental limits on chip miniaturization.
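
As a rough point of reference for what “finding small groups in large, heterogeneous data sets” looks like on classical hardware, here is a hypothetical Python sketch. It is not the DKFZ or Fraunhofer team’s method, and the data, cluster count, and library choices are assumptions for illustration only: it clusters synthetic patient feature vectors with scikit-learn and reports the smallest resulting subgroup.

# Hypothetical illustration of classical patient stratification,
# not the approach described in the article.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for heterogeneous patient data:
# 1,000 "patients", 50 mixed clinical/molecular features.
patients = rng.normal(size=(1000, 50))

# Scale the features, then group patients into candidate strata.
features = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(features)

# The clinically interesting objects are often the small subgroups.
sizes = np.bincount(labels)
print("subgroup sizes:", sizes)
print("smallest subgroup:", int(sizes.argmin()), "with", int(sizes.min()), "patients")

On real oncology data the feature matrix is vastly larger and messier, which is where the week-long runtimes mentioned above come from and where a quantum speed-up would matter.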