
“Today’s technology announcement is about challenging convention and rethinking how we continue to advance society and deliver new innovations that improve life, business and reduce our environmental impact,” said Dr. Mukesh Khare, Vice President of Hybrid Cloud and Systems, IBM Research. “Given the constraints the industry is currently facing along multiple fronts, IBM and Samsung are demonstrating our commitment to joint innovation in semiconductor design and a shared pursuit of what we call ‘hard tech.’”

Moore’s Law – an ongoing trend that shows the number of transistors on a computer chip doubling every two years or so – is now approaching what are considered fundamental barriers. Simply put, as more and more transistors are crammed into a finite area, engineers are running out of space.
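
As a rough illustration of that doubling trend, consider the following minimal Python sketch. The 1971 Intel 4004 baseline of roughly 2,300 transistors is a commonly cited historical figure; the projection is illustrative only and not a claim about any specific chip.

```python
# Rough illustration of Moore's Law: transistor counts doubling every ~2 years.
# Baseline: Intel 4004 (1971), ~2,300 transistors -- a commonly cited figure.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011, 2021):
    print(y, f"{transistors(y):,.0f}")
# By 2021 the model projects tens of billions of transistors per chip,
# which is the right order of magnitude for modern processors.
```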

Historically, transistors have been built to lie flat upon the surface of a semiconductor, with the electric current flowing laterally, or side-to-side, through them. Vertical Transport Field Effect Transistors (VTFET), by contrast, are built perpendicular to the surface of the chip with a vertical, or up-and-down, current flow.

That could change though, because Samsung and IBM have announced a partnership in which the two companies are working together to develop new chip technology that could allow our smartphones to run for an entire week on a single charge.

This comes in the form of a new chip architecture, the Vertical-Transport Nanosheet Field Effect Transistor (VTFET), that could reduce energy consumption by as much as 85%. As the name suggests, the new design stacks transistors on top of each other, allowing signals to travel through the chip vertically.

As a result, phones could perform just as well as they do right now, but with massive gains in energy efficiency. Alternatively, the design could improve performance by as much as 100% compared to modern FET alternatives.
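
To see how an 85% energy reduction maps onto the week-long battery claim, a quick back-of-the-envelope calculation helps. The one-day baseline and the assumption that the chip dominates the phone's power budget are simplifications for illustration, not figures from the announcement.

```python
# Back-of-the-envelope: if the chip consumes 85% less energy for the same
# work, the same battery lasts 1 / (1 - 0.85) ~= 6.7x longer -- close to a
# week, assuming (simplistically) the chip dominates total power draw.
energy_reduction = 0.85
baseline_days = 1.0  # assumed: roughly one day of battery life today
extended_days = baseline_days / (1 - energy_reduction)
print(f"{extended_days:.1f} days")  # ~6.7 days
```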

LG Electronics USA has just announced its first gaming laptop, the LG UltraGear 17G90Q.

The LG UltraGear 17G90Q is powered by an 11th Gen Intel® Tiger Lake H processor, an NVIDIA GeForce™ RTX 3080 Max-Q graphics card, dual-channel memory and an ultra-fast dual SSD setup. With a 17-inch IPS panel offering a 1-millisecond response time and a 300 Hz refresh rate, the LG UltraGear gaming laptop ensures immersive, fluid gameplay. To stop all this high-end hardware from melting, the 17G90Q’s cooling system features a vapor chamber that keeps the laptop running cool, even when pushed to the limits.


“Until now researchers have encoded and stabilized. We now show that we can compute as well.”

Researchers at QuTech—a collaboration between TU Delft and TNO—have reached a milestone in quantum error correction. They have integrated high-fidelity operations on encoded quantum data with a scalable scheme for repeated data stabilization. The researchers report their findings in the December issue of Nature Physics.

Physical quantum bits, or qubits, are vulnerable to errors. These errors arise from various sources, including quantum decoherence, crosstalk, and imperfect calibration. Fortunately, the theory of quantum error correction establishes that it is possible to compute while simultaneously protecting quantum data from such errors.
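
The QuTech result involves far more sophisticated encoding and stabilization than can be shown here, but the core idea behind error correction can be sketched with its simplest classical cousin: a three-bit repetition code, where majority voting over redundant copies recovers the logical value. The toy simulation below (plain Python, modeling only bit flips, not general quantum errors) is illustrative and is not the scheme used in the paper.

```python
import random

def encode(bit):
    """Encode one logical bit as three physical copies."""
    return [bit, bit, bit]

def apply_noise(codeword, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    """Recover the logical bit by majority vote."""
    return int(sum(codeword) >= 2)

def logical_error_rate(p, trials=100_000):
    errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
    return errors / trials

for p in (0.01, 0.05, 0.10):
    # For small p the encoded error rate scales as ~3p^2, beating the bare rate p.
    print(f"physical p={p:.2f}  logical ~{logical_error_rate(p):.4f}")
```

Real quantum error correction, as in the QuTech work, cannot read the data qubits directly without destroying the quantum state; it instead measures stabilizers (parity checks) repeatedly, but the redundancy-plus-decoding principle is the same.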

During NeurIPS 2021, seven quantum computer scientists from Amazon came together to discuss the current state of quantum computing, some of the biggest challenges facing the field, and what the future might hold.

Panelists included:
• Simone Severini, director of quantum computing.
• Antia Lamas-Linares, principal research scientist.
• Earl Campbell, senior research scientist.
• John Preskill, Amazon Scholar.
• Katharine Hyatt, applied scientist.
• James Whitfield, Amazon Visiting Academic.
• Helmut Katzgraber, senior practice manager.


The company’s current Glass hardware is built on Android.


Google is hiring an “Augmented Reality OS” team focused on building software for an “innovative AR device,” according to job listings spotted by 9to5Google. The team is led by Mark Lucovsky, who announced he’d joined the company this week. Lucovsky previously worked at Meta developing an in-house alternative to Android to power the company’s hardware, and also co-authored the Windows NT operating system.

According to Google’s job listings, the Augmented Reality OS team is building “the software components that control and manage the hardware on [its] Augmented Reality (AR) products.” This is far from Google’s first stab at developing AR software, and follows the company’s work on ARCore for Android and Tango. The company’s Google Glass, which is aimed at the business and enterprise market, is currently built on Android.

“This morning I became a Noogler. My role is to lead the Operating System team for Augmented Reality at Google,” Lucovsky wrote in his announcement.

“We will constantly be ‘within’ the internet, rather than have access to it, and within the billions of interconnected computers around us,” Ball wrote in his Metaverse Primer. Mark Zuckerberg described the metaverse similarly, calling it “an even more immersive and embodied internet.” Picture this: you strap on a headset or pair of goggles, flick a switch, and boom—you’re still standing in your living room, but you’re also walking through a 3D world as an avatar of yourself, and you can interact with other people who are doing the same thing from their living rooms.

Being constantly within the internet doesn’t sound all that appealing to me personally—in fact, it sounds pretty terrible—but the good news for those with a similar sentiment is that the “full vision” of the metaverse, according to Ball, is still decades away, primarily because of the advances in computing power, networking, and hardware necessary to enable and support it.

In fact, according to Raja Koduri, VP of Intel’s accelerated computing systems and graphics group, powering the metaverse will require a 1,000-fold improvement over the computational infrastructure we have today. “You need access to petaflops [one thousand teraflops] of computing in less than a millisecond, less than ten milliseconds for real-time uses,” Koduri told Quartz. “Your PCs, your phones, your edge networks, your cell stations that have some compute, and your cloud computing need to be kind of working in conjunction like an orchestra.”
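
Koduri’s numbers can be made concrete with simple arithmetic. The “today’s device” baseline below is an assumption for this sketch, not a figure from the article.

```python
# Illustrative arithmetic behind the quoted requirement: roughly a petaflop
# (10^15 FLOP/s) reachable within ~10 ms for real-time metaverse workloads.
petaflop = 1e15          # FLOP per second: one petaflop = 1,000 teraflops
latency_budget = 10e-3   # seconds: Koduri's bound for real-time uses
flops_per_slice = petaflop * latency_budget
print(f"{flops_per_slice:.0e} FLOPs available per 10 ms slice")  # 1e+13

todays_device = 1e12     # assumed: ~1 teraflop consumer device
print(f"gap vs. today: {petaflop / todays_device:,.0f}x")        # ~1,000x
```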

Albert Einstein and Stephen Hawking, the most famous physicists of the twentieth century, both spent decades trying to find a single law that could explain how the world works on the scale of the atom and on the scale of galaxies. In short, the Standard Model describes the physics of the very small. General relativity describes the physics of the very large. The problem? The two theories tell different stories about the fundamental nature of reality. Einstein described the problem nearly a century ago in his 1923 Nobel lecture, telling the audience that a physicist who searches for “an integrated theory cannot rest content with the assumption that there exist two distinct fields totally independent of each other by their nature.” Even on his deathbed, Einstein worked on a way to unite all the laws of physics under one unifying theory.

Yet despite the chip giant’s manufacturing struggles, it still maintains nearly 90% market share in data-center chips, compared with AMD’s 10%, according to data from Mercury Research. Intel has lost more ground in desktop and laptop computers, holding 83% and 78% market share respectively, with the remainder going mostly to AMD, according to Mercury data.

After years of hearing about these problems, Wall Street had largely written off the company’s manufacturing prowess. Investors expected the company to move to a hybrid approach to chip making, contracting more of its chip manufacturing to TSMC and potentially to Samsung. Some analysts suggested the company go as far as spinning out the manufacturing business, as AMD did with what is now known as GlobalFoundries years ago.

But weeks after Gelsinger took over, he announced that the company planned to double down on its manufacturing business in an effort to return Intel to its roots, including a bid to compete with TSMC as a contract manufacturer. Since his return to Intel after nearly nine years as chief executive of VMware, he has shaken up the company’s executive team. That includes re-hiring several notable Intel staffers, including Natarajan.

The mathematician Ben Green of the University of Oxford has made a major stride toward understanding a nearly 100-year-old combinatorics problem, showing that a well-known recent conjecture is “not only wrong but spectacularly wrong,” as Andrew Granville of the University of Montreal put it. The new paper shows how to create much longer disordered strings of colored beads than mathematicians had thought possible, extending a line of work from the 1940s that has found applications in many areas of computer science.

The conjecture, formulated about 17 years ago by Ron Graham, one of the leading discrete mathematicians of the past half-century, concerns how many red and blue beads you can string together without creating any long sequences of evenly spaced beads of a single color. (You get to decide what “long” means for each color.)

This problem is one of the oldest in Ramsey theory, which asks how large various mathematical objects can grow before pockets of order must emerge. The bead-stringing question is easy to state but deceptively difficult: For long strings there are just too many bead arrangements to try one by one.
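
To make the question concrete, here is a small brute-force sketch in Python (an illustrative toy, not code from Green’s paper; the function names are invented for this example). It checks a string of ‘R’/‘B’ beads for evenly spaced runs of one color exceeding that color’s allowed length, then exhaustively finds the longest “good” string for tiny limits.

```python
from itertools import product

def has_long_progression(coloring, max_len):
    """Does the coloring contain evenly spaced same-colored beads
    in a run longer than that color's allowed maximum?"""
    n = len(coloring)
    for start in range(n):
        color = coloring[start]
        limit = max_len[color]
        for step in range(1, n):
            length = 1
            pos = start + step
            # Extend the arithmetic progression while beads match in color.
            while pos < n and coloring[pos] == color:
                length += 1
                if length > limit:
                    return True
                pos += step
    return False

# Exhaustive search: grow n until no 2-coloring of length n is "good".
max_len = {"R": 2, "B": 2}  # forbid monochromatic 3-term progressions
n = 1
while any(not has_long_progression(c, max_len)
          for c in product("RB", repeat=n)):
    n += 1
print("longest good string has length", n - 1)
# With both limits set to 2, this prints 8, matching the classical
# van der Waerden number W(3) = 9 for two colors.
```

The exponential blow-up in this search (2^n colorings to test) is exactly the “too many bead arrangements to try one by one” obstacle the article describes; Green’s advance comes from theory, not enumeration.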