
The cable, called hollow-core fiber, is a next-generation version of the fiber-optic cable used to deliver broadband internet to homes and businesses. Made of glass, such cables carry data encoded as beams of light. But instead of being solid, hollow-core fiber is empty inside, with dozens of parallel, air-filled channels narrower than a human hair.

Because light travels nearly 50% faster through air than glass, it takes about one-third less time to send data through hollow-core fiber than through the same length of standard fiber.
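The arithmetic is easy to check. Here is a quick sketch of the one-way latency over 1,000 km of each fiber type, using typical refractive indices (roughly 1.46 for solid silica, just above 1.0 for an air core; both values are illustrative assumptions, not vendor specifications):

```python
# One-way latency comparison for 1,000 km of fiber.
C = 299_792.458  # speed of light in vacuum, km/s

def one_way_latency_ms(distance_km, refractive_index):
    """Time for light to traverse the fiber, in milliseconds."""
    speed = C / refractive_index  # light slows by the refractive index
    return distance_km / speed * 1_000

glass = one_way_latency_ms(1_000, 1.46)    # solid silica core
hollow = one_way_latency_ms(1_000, 1.003)  # air-filled core
print(f"solid core: {glass:.2f} ms, hollow core: {hollow:.2f} ms")
print(f"saving: {1 - hollow / glass:.0%}")  # about one-third
```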

The difference is often just a minuscule fraction of a second. But in high-frequency trading, that can make the difference between profits and losses. HFT firms use sophisticated algorithms and ultrafast data networks to execute rapid-fire trades in stocks, options and futures. Many are secretive about their trading strategies and technology.

Smartwatches and other battery-powered electronics would be even smarter if they could run AI algorithms. But efforts to build AI-capable chips for mobile devices have so far hit a wall: the so-called “memory wall” that separates the data-processing and memory chips that must work together to meet the massive and continually growing computational demands imposed by AI.

“Transactions between processors and memory can consume 95 percent of the energy needed to do machine learning and AI, and that severely limits battery life,” said computer scientist Subhasish Mitra, senior author of a new study published in Nature Electronics.

Now, a team that includes Stanford computer scientist Mary Wootters and electrical engineer H.-S. Philip Wong has designed a system that can run AI tasks faster, and with less energy, by harnessing eight hybrid chips, each with its own data processor built right next to its own memory storage.
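A back-of-envelope model illustrates why putting each processor next to its own memory matters. The energy figures below are assumptions chosen to illustrate the effect, not numbers from the study:

```python
# Toy "memory wall" model: with plausible (assumed) per-operation energies,
# moving data off-chip dominates the energy budget of an AI workload.
COMPUTE_PJ_PER_MAC = 1.0   # assumed energy per multiply-accumulate, pJ
DRAM_PJ_PER_BYTE = 100.0   # assumed energy per byte fetched off-chip, pJ

def energy_breakdown(macs, bytes_moved):
    compute = macs * COMPUTE_PJ_PER_MAC
    movement = bytes_moved * DRAM_PJ_PER_BYTE
    total = compute + movement
    return compute / total, movement / total

# A layer doing 1e9 multiply-accumulates while streaming 200 MB of weights:
c, m = energy_breakdown(macs=1e9, bytes_moved=2e8)
print(f"compute: {c:.0%}, data movement: {m:.0%}")  # movement ~95%
```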

A new deep-learning algorithm could provide advance notice when systems — from satellites to data centers — are falling out of whack.

When you’re responsible for a multimillion-dollar satellite hurtling through space at thousands of miles per hour, you want to be sure it’s running smoothly. And time series can help.

A time series is simply a record of a measurement taken repeatedly over time. It can keep track of a system’s long-term trends and short-term blips. Examples include the infamous Covid-19 curve of new daily cases and the Keeling curve that has tracked atmospheric carbon dioxide concentrations since 1958. In the age of big data, “time series are collected all over the place, from satellites to turbines,” says Kalyan Veeramachaneni. “All that machinery has sensors that collect these time series about how they’re functioning.”
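The team’s method is a deep-learning one, but a simple classical baseline shows what flagging a blip means in practice. The rolling z-score detector below is a stand-in, not the new algorithm, and its window size, threshold, and test signal are illustrative assumptions:

```python
import numpy as np

def rolling_zscore_anomalies(series, window=50, threshold=4.0):
    """Flag points that deviate strongly from the recent past."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        past = series[i - window:i]
        std = past.std()
        if std > 0 and abs(series[i] - past.mean()) / std > threshold:
            flags[i] = True
    return flags

# A smooth sensor signal with one injected fault at index 300.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
signal = np.sin(t) + 0.05 * rng.standard_normal(500)
signal[300] += 3.0
print(np.nonzero(rolling_zscore_anomalies(signal))[0])  # flags index 300
```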

(Inside Science) — It took years of painstaking work for Carlos Souza and his colleagues to map out every road in the Brazilian Amazon biome. Official maps of the 4.2 million-square-kilometer region show only roads built by federal and local governments. But by carefully tracing lines on satellite images, the researchers concluded in 2016 that the true combined length of all roads was nearly 13 times what the official maps showed.

“When we don’t have a good understanding of how much roadless areas we have on the landscape, we probably will misguide any conservation plans for that territory,” said Souza, a geographer at a Brazil-based environmental nonprofit organization called Imazon.

Now, Imazon researchers have built an artificial intelligence algorithm to find such roads automatically. Currently, the algorithm is reaching about 70% accuracy, which rises to 87%-90% with some additional automated processing, said Souza. Analysts then confirm potential roads by examining the satellite images. Souza presented the research last month at a virtual meeting of the American Geophysical Union.

Scientists at DGIST in Korea and at UC Irvine and UC San Diego in the US have developed a computer architecture that runs unsupervised machine learning algorithms faster than state-of-the-art graphics processing units while consuming significantly less energy. The key is processing data where it is stored, in computer memory, and in an all-digital format. The researchers presented the new architecture, called DUAL, at the 2020 53rd Annual IEEE/ACM International Symposium on Microarchitecture.

“Today’s computer applications generate a large amount of data that needs to be processed by algorithms,” says Yeseong Kim of Daegu Gyeongbuk Institute of Science and Technology (DGIST), who led the effort.

Powerful “unsupervised” machine learning involves training an algorithm to recognize patterns in data without providing labeled examples for comparison. One popular approach is a clustering algorithm, which groups similar data into different classes. These algorithms are used for a wide variety of data analyses, such as identifying communities on social media, filtering spam email and detecting criminal or fraudulent activity online.
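For a concrete picture of clustering, here is a minimal k-means implementation in plain NumPy. This illustrates only the kind of unsupervised algorithm such hardware accelerates, not the DUAL architecture itself, and the two-blob test data is invented for the example:

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Minimal k-means: group similar points into k classes, no labels given."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None] - centers[None, :], axis=2)
        labels = np.argmin(dists, axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
data = np.vstack([rng.standard_normal((50, 2)),
                  rng.standard_normal((50, 2)) + 5])
labels, centers = kmeans(data, k=2)
print(centers)  # approximately [0, 0] and [5, 5]
```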

Proteins are essential to cells, carrying out complex tasks and catalyzing chemical reactions. Scientists and engineers have long sought to harness this power by designing artificial proteins that can perform new tasks, like treating disease, capturing carbon or harvesting energy, but many of the processes designed to create such proteins are slow and complex, with a high failure rate.

In a breakthrough that could have implications across the healthcare, agriculture, and energy sectors, a team led by researchers in the Pritzker School of Molecular Engineering at the University of Chicago has developed an artificial intelligence-led process that uses big data to design new proteins.

By developing machine-learning models that can review protein information culled from genome databases, the researchers found relatively simple design rules for building artificial proteins. When the team constructed these artificial proteins in the lab, they found that they performed chemical processes so well that they rivaled those found in nature.
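The study’s models are more sophisticated than this, but a toy sketch conveys the basic move: learn statistics from an aligned family of natural sequences, then sample new artificial sequences from them. The five-residue “family” and the position-independent statistics below are illustrative assumptions, not the paper’s actual design rules:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def position_frequencies(alignment, pseudocount=1.0):
    """Per-position amino-acid frequencies from an aligned protein family."""
    length = len(alignment[0])
    counts = np.full((length, len(AMINO_ACIDS)), pseudocount)
    for seq in alignment:
        for i, aa in enumerate(seq):
            counts[i, AMINO_ACIDS.index(aa)] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def sample_sequence(freqs, rng):
    """Draw a new artificial sequence from the learned statistics."""
    return "".join(rng.choice(list(AMINO_ACIDS), p=row) for row in freqs)

rng = np.random.default_rng(0)
family = ["MKVLA", "MKILA", "MRVLA", "MKVLG"]  # toy aligned family
freqs = position_frequencies(family)
print(sample_sequence(freqs, rng))  # a new "MKVLA"-like variant
```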

Scientists at Freie Universität Berlin develop a deep learning method to solve a fundamental problem in quantum chemistry.

A team of scientists at Freie Universität Berlin has developed an artificial intelligence (AI) method for calculating the ground state of the Schrödinger equation in quantum chemistry. The goal of quantum chemistry is to predict chemical and physical properties of molecules based solely on the arrangement of their atoms in space, avoiding the need for resource-intensive and time-consuming laboratory experiments. In principle, this can be achieved by solving the Schrödinger equation, but in practice this is extremely difficult.

Up to now, it has been impossible to find an exact solution for arbitrary molecules that can be computed efficiently. But the team at Freie Universität has developed a deep learning method that achieves an unprecedented combination of accuracy and computational efficiency. AI has transformed many technological and scientific areas, from computer vision to materials science. “We believe that our approach may significantly impact the future of quantum chemistry,” says Professor Frank Noé, who led the team effort. The results were published in the journal Nature Chemistry.
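To make the variational idea concrete, here is a toy variational Monte Carlo calculation for the one-dimensional harmonic oscillator, whose exact ground-state energy is 0.5. Deep-learning approaches like the team’s replace the hand-picked Gaussian trial wavefunction below with a neural network; the system, ansatz, and sampler settings here are all simplifying assumptions for illustration:

```python
import numpy as np

# H = -1/2 d^2/dx^2 + 1/2 x^2; trial wavefunction psi(x) = exp(-a x^2).

def local_energy(x, a):
    # E_L = -(1/2) psi''/psi + (1/2) x^2, worked out analytically.
    return a + (0.5 - 2 * a**2) * x**2

def vmc_energy(a, n_steps=20_000, step=1.0, seed=0):
    """Estimate the energy by Metropolis sampling of |psi|^2."""
    rng = np.random.default_rng(seed)
    x, energies = 0.0, []
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1, 1)
        # Accept with probability |psi(x_new)|^2 / |psi(x)|^2.
        if rng.uniform() < np.exp(-2 * a * (x_new**2 - x**2)):
            x = x_new
        energies.append(local_energy(x, a))
    return np.mean(energies[n_steps // 10:])  # discard burn-in

for a in (0.3, 0.5, 0.7):
    print(f"a={a}: E ~ {vmc_energy(a):.3f}")  # minimum near a=0.5, E ~ 0.5
```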

Digital data storage is a growing need for our society, and finding alternatives to solutions based on silicon or magnetic tape is a challenge in the era of “big data.” The recent development of polymers that can store information at the molecular level has opened up new opportunities for ultrahigh-density data storage, long-term archiving, anticounterfeiting systems, and molecular cryptography. However, synthetic informational polymers have so far been deciphered only by tandem mass spectrometry. In comparison, nanopore technology can be faster, cheaper, and nondestructive, and it provides detection at the single-molecule level; moreover, it can be massively parallelized and miniaturized in portable devices. Here, we demonstrate the ability of engineered aerolysin nanopores to accurately read, with single-bit resolution, the digital information encoded in tailored informational polymers, alone and in mixed samples, without compromising information density. These findings open promising possibilities for developing writing-reading technologies that process digital data using a biologically inspired platform.

DNA evolved to store genetic information in living systems; it was therefore a natural candidate as a medium for data storage (1–3), given its high information density and long-term stability compared with existing technologies based on silicon and magnetic tape. Alternatively, synthetic informational polymers have also been described (5–9) as a promising approach to digital storage. In these polymers, information is stored in a controlled monomer sequence, a strategy nature also uses in genetic material. In both cases, single-molecule data writing is achieved mainly by stepwise chemical synthesis (3, 10, 11), although enzymatic approaches have also been reported (12). While most progress in this area has been made with DNA, an obvious starting choice, DNA’s molecular structure is set by its biological function, leaving little room for optimization and innovation.
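As a toy picture of the readout step, imagine each monomer producing one of two distinguishable but noisy current levels as it passes through the pore, so that decoding reduces to thresholding. The current levels and noise below are invented for illustration and are not the aerolysin measurements:

```python
import numpy as np

LEVEL_0, LEVEL_1, NOISE = 40.0, 55.0, 3.0  # assumed currents (pA) and noise

def simulate_read(bits, rng):
    """Model a nanopore trace: one noisy current level per monomer."""
    levels = np.where(np.array(bits) == 1, LEVEL_1, LEVEL_0)
    return levels + NOISE * rng.standard_normal(len(bits))

def decode(trace):
    """Classify each blockade level against the midpoint threshold."""
    return (trace > (LEVEL_0 + LEVEL_1) / 2).astype(int)

rng = np.random.default_rng(1)
message = [1, 0, 1, 1, 0, 0, 1, 0]
trace = simulate_read(message, rng)
print(decode(trace).tolist())  # recovers the message with high probability
```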

With the rapid advancement of humanoid robots on the market today, we can see how our lives are becoming simpler and easier. In this video, let’s look at some of the most advanced humanoid robots being developed by various companies and organisations.


Simplilearn’s Artificial Intelligence course provides training in the skills required for a career in AI. You will master TensorFlow, Machine Learning and other AI concepts, plus the programming languages needed to design intelligent agents, deep learning algorithms & advanced artificial neural networks that use predictive analytics to solve real-time decision-making problems without explicit programming.

Why learn Artificial Intelligence?
The current and future demand for AI engineers is staggering. The New York Times reports a candidate shortage for certified AI engineers, with fewer than 10,000 qualified people in the world to fill these jobs, which, according to Paysa, pay an average salary of $172,000 per year in the U.S. (or Rs. 17 lakh to Rs. 25 lakh in India) for engineers with the required skills.