
To say we’re at an inflection point of the technological era may be an obvious declaration to some. How various technologies and markets will advance is nuanced, but a common theme is emerging: innovation is accelerating at a pace humankind has seen at only rare points in history. The invention of the printing press and the rise of the internet come to mind as similar inflection points, but today’s innovation is being driven aggressively by machine learning and artificial intelligence (AI). In fact, AI is powering rapid technology advances in virtually every area, from the edge and personal devices, to the data center, and even chip design itself.

There is also a self-perpetuating effect at play: demand for intelligent machines and automation everywhere is ramping up, whether you consider driver-assist technologies in the automotive industry, recommenders and speech recognition in phones, or smart home technologies and the IoT. Spurring this voracious demand is the fact that leading-edge OEMs, from big names like Tesla and Apple to scrappy start-ups, are now beginning to realize great gains in silicon and system-level development beyond the confines of Moore’s Law alone.

Don’t worry, you’re safe.

Engineers from Stanford University created a robot that can grasp irregular objects.

The robot, called the Stereotyped Nature-inspired Aerial Grasper (SNAG), is inspired by the peregrine falcon, the fastest animal on Earth, which reaches 200 miles per hour (320 km/h) when diving.

Do you know what the Earth’s atmosphere is made of? You’d probably remember it contains oxygen, and maybe nitrogen. And with a little help from Google you can easily reach a more precise answer: 78% nitrogen, 21% oxygen and about 1% argon. However, when it comes to the composition of exo-atmospheres—the atmospheres of planets outside our solar system—the answer is not known. This is a shame, as atmospheres can indicate the nature of planets, and whether they can host life.

As exoplanets are so far away, it has proven extremely difficult to probe their atmospheres. Research suggests that artificial intelligence (AI) may be our best bet to explore them—but only if we can show that these algorithms think in reliable, scientific ways, rather than cheating the system. Now our new paper, published in The Astrophysical Journal, has provided reassuring insight into their mysterious logic.

Astronomers typically use the transit method to investigate exoplanets, which involves measuring dips in light from a star as a planet passes in front of it. If the planet has an atmosphere, it absorbs a very tiny bit of extra light, too. By observing this event at different wavelengths—colors of light—the fingerprints of molecules can be seen in the absorbed starlight, forming recognizable patterns in what we call a spectrum. A typical signal produced by the atmosphere of a Jupiter-sized planet reduces the light of a Sun-like star by only ~0.01%; Earth-sized planets produce signals 10–100 times weaker still. It’s a bit like spotting the eye color of a cat from an aircraft.
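The size of that atmospheric signal can be estimated with a back-of-envelope calculation: the atmosphere blocks an extra annulus of starlight a few pressure scale heights thick around the planet’s disk. The sketch below uses illustrative, assumed values for a hot, Jupiter-sized planet around a Sun-like star (temperature, gravity, and annulus thickness are not from the article) and recovers a signal of order 0.01%.

```python
import math

# Back-of-envelope estimate of an exoplanet's atmospheric transit signal.
# All planet/star parameters below are illustrative assumptions.

k_B = 1.380649e-23   # Boltzmann constant, J/K
m_H = 1.6735575e-27  # hydrogen atom mass, kg

R_star = 6.957e8     # radius of a Sun-like star, m
R_planet = 7.1e7     # radius of a Jupiter-sized planet, m
T = 1300.0           # assumed atmospheric temperature, K (hot Jupiter)
mu = 2.3             # assumed mean molecular weight (H2/He atmosphere)
g = 25.0             # assumed surface gravity, m/s^2

# Pressure scale height: the altitude over which pressure drops by 1/e.
H = k_B * T / (mu * m_H * g)

# Treat the opaque part of the atmosphere as an annulus ~n_H scale
# heights thick; its blocked light relative to the stellar disk is:
n_H = 5  # assumed effective thickness in scale heights
signal = 2 * math.pi * R_planet * n_H * H / (math.pi * R_star**2)

print(f"scale height: {H / 1e3:.0f} km")
print(f"atmospheric signal: {signal:.2e} (~{signal * 100:.3f}% of starlight)")
```

With these assumptions the annulus depth comes out around 10^-4, i.e. a few hundredths of a percent, consistent with the ~0.01% figure quoted above; an Earth-sized planet’s smaller radius and thinner atmosphere shrink both factors in the numerator.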

Tesla’s head of AI has released new footage of the automaker’s auto-labeling tool for its self-driving effort.

It’s expected to be an important accelerator in improving Tesla’s Full Self-Driving Beta.

Tesla is often said to have a massive lead in self-driving data thanks to having equipped all its cars with sensors early on and collecting real-world data from a fleet that now includes over a million vehicles.

One year ago, Google artificial intelligence researcher Timnit Gebru tweeted, “I was fired” and ignited a controversy over the freedom of employees to question the impact of their company’s technology. Thursday, she launched a new research institute to ask questions about responsible use of artificial intelligence that Gebru says Google and other tech companies won’t.

“Instead of fighting from the inside, I want to show a model for an independent institution with a different set of incentive structures,” says Gebru, who is founder and executive director of Distributed Artificial Intelligence Research (DAIR). The first part of the name is a reference to her aim to be more inclusive than most AI labs—which skew white, Western, and male—and to recruit people from parts of the world rarely represented in the tech industry.

Gebru was ejected from Google after clashing with bosses over a research paper urging caution with new text-processing technology enthusiastically adopted by Google and other tech companies. Google has said she resigned and was not fired, but acknowledged that it later fired Margaret Mitchell, another researcher who with Gebru co-led a team researching ethical AI. The company placed new checks on the topics its researchers can explore. Google spokesperson Jason Freidenfelds declined to comment but directed WIRED to a recent report on the company’s work on AI governance, which said Google has published more than 500 papers on “responsible innovation” since 2018.

Harrison.ai, a Sydney-based company that creates medical devices with AI technology, announced today it has raised $129 million AUD (about $92.3 million USD) in what it called one of the largest Series B rounds ever for an Australian startup.

The funding was led by returning investor Horizons Ventures and included participation from new investors Sonic Healthcare and I-MED Radiology Network. Existing backers Blackbird Ventures and Skip Capital also returned for the round, which brings Harrison.ai’s total raised over the past two years to $158 million AUD.

Harrison.ai announced it has also formed a joint venture with Sonic Healthcare, one of the world’s largest medical diagnostics providers, to develop and commercialize new clinical AI solutions in pathology. The partnership will focus first on histopathology, or the diagnosis of tissue diseases.

Recently, Microsoft Azure joined the Top 10 club of the TOP500 supercomputer rankings by delivering 30.05 petaflops. The result was based on Microsoft’s recently announced Azure NDm A100 80GB v4 instances, available on demand. These Azure NDm A100 v4 instances are powered by NVIDIA GPU acceleration and NVIDIA InfiniBand networking.

Microsoft today highlighted the latest (December 2021) MLPerf 1.1 results, in which Azure delivered the #2 performance overall and the #1 performance among cloud providers.

At its re:Invent 2021 conference today, Amazon announced Graviton3, the next generation of its custom Arm-based server chip. Soon to be available in Amazon Web Services (AWS) C7g instances, the company says the processors are optimized for workloads including high-performance compute, batch processing, media encoding, scientific modeling, ad serving, and distributed analytics.

Alongside Graviton3, Amazon unveiled Trn1, a new instance for training deep learning models in the cloud — including models for apps like image recognition, natural language processing, fraud detection, and forecasting. It’s powered by Trainium, an Amazon-designed chip that the company last year claimed would offer the most teraflops of any machine learning instance in the cloud. (A teraflop corresponds to 1 trillion floating-point operations per second.)
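To make the teraflop figure concrete, the sketch below works through the arithmetic with made-up numbers: both the instance throughput and the training budget are assumptions for illustration, not figures from Amazon.

```python
# Illustration of the teraflop arithmetic mentioned above.
# Both figures below are assumed for the sake of the example.

TFLOP_PER_S = 1e12  # 1 teraflop/s = 1 trillion floating-point ops per second

instance_tflops = 100.0  # hypothetical sustained throughput, teraflop/s
training_flops = 1e18    # hypothetical total cost of one training run, FLOPs

# Ideal (compute-bound) wall-clock time for the run:
seconds = training_flops / (instance_tflops * TFLOP_PER_S)
print(f"{seconds:,.0f} s (~{seconds / 3600:.1f} h)")  # 10,000 s (~2.8 h)
```

Real training runs fall short of peak throughput because of memory and communication overheads, so figures like this are an upper bound on effective speed.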

As companies face pandemic headwinds including worker shortages and supply chain disruptions, they’re increasingly turning to AI for efficiency gains. According to a recent Algorithmia survey, 50% of enterprises plan to spend more on AI and machine learning in 2021, with 20% saying they will be “significantly” increasing their budgets for AI and ML. AI adoption is, in turn, driving cloud growth — a trend of which Amazon is acutely aware, hence the continued investments in technologies like Graviton3 and Trn1.