
A world where people are monitored and supervised by machines isn’t confined to the realms of sci-fi. It’s here now.

Tough conditions: There have been many reports in recent years about the unpleasant conditions workers face at Amazon warehouses. Employees are under pressure to pack hundreds of boxes per hour and face being fired if they aren’t fast enough.

What’s new: Documents obtained by The Verge show that it’s far more common for people to be fired due to lack of productivity than outsiders realize. Roughly 300 people were fired at a single facility between August 2017 and September 2018 for that reason. And crucially, the documents show that much of the firing process is automated.

Read more

Move over Spot, there’s a new four-legged flipping robot in town. Boston Dynamics’ dog-like droid has some new, friendly competition in the form of a quadruped built by undergraduate students at Stanford University, who have made the designs open source with the aim of encouraging advances through low-cost robotics.

Read more

Researchers from Washington State University and Ohio State University have developed a low-cost, easy way to make custom lenses that could help manufacturers avoid the expensive molds required for optical manufacturing.

Led by Lei Li, assistant professor in the School of Mechanical and Materials Engineering, and graduate student Mojtaba Falahati, the researchers developed a liquid mold from droplets that they can manipulate with magnets to create lenses in a variety of shapes and sizes. Their work is featured on the cover of the journal Applied Physics Letters.

High-quality lenses are increasingly used in everything from cameras to self-driving cars and virtually all robotics, but the traditional molding and casting processes used to manufacture them require sophisticated and expensive metal molds. As a result, manufacturers are mostly limited to mass-producing one kind of lens.

Read more

In less than two decades, you won’t just use your computers, you will have relationships with them.

Because of artificial intelligence, computers will be able to read at human levels by 2029 and will also begin to have different human characteristics, said Ray Kurzweil, a director of engineering at Google.

“My timeline is computers will be at human levels, such as you can have a human relationship with them, 15 years from now,” he said. Kurzweil’s comments came at the Exponential Finance conference in New York on Wednesday.

Read more

In a bid to help those with limited mobility get to the gate, Tokyo Narita International Airport is set to welcome a number of self-driving wheelchairs to its floors. Able to navigate the airport on their own, the new wheelchairs are expected to streamline foot traffic in one of Japan’s busiest airports and form part of a wider plan to boost mobility options at such facilities.

Read more

Visitors to The Dali Museum in St. Petersburg, Florida, will now be greeted by a digitally resurrected simulation of Salvador Dali. Created using machine learning and deepfake technologies, the digital Dali is programmed to communicate in novel ways, from commenting on the day’s weather to taking a selfie with museum patrons.

Read more

An innovative system to predict lung cancer could make a huge difference to survival rates, with Google exploring how artificial intelligence could dramatically improve diagnosis. Despite advances in cancer treatment, lung cancer remains one of the deadliest diseases, not least because it is difficult to detect early, which often means it is too late to treat by the time it is found.

Read more