
Imagine this: a driverless car cruises around in search of passengers.

After dropping someone off, the car uses its profits to pay for a trip to a charging station. Except for its initial programming, the car doesn’t need outside help to determine how to carry out its mission.

That’s one “thought experiment” from former bitcoin contributor Mike Hearn, in which he describes how bitcoin could help power leaderless organizations roughly 30 years into the future.

Read more

Fusion power — the process that keeps stars like the Sun burning — holds the promise of nearly unlimited clean power. But scientists have struggled for decades to make it a practical energy source.

Now, laser scientists say a machine learning breakthrough has smashed the standing record for a fusion power yield. It doesn’t mean fusion power is practical yet, but the prestigious journal Nature called the result “remarkable” and wrote that it has “major implications” — so, at the very least, it’s another hint that the long-deferred technology is starting to come into focus.

Read more

After years of promise, AI is finally becoming useful. But what usually happens to useful technologies is that they disappear. We forget about the things that just work, and we shouldn’t let that happen to AI. Any technology destined to change the world needs scrutiny, and AI, with its combination of huge imaginative presence and very real, very dangerous failings, needs that scrutiny more than most.

So, for the AI Issue at The Verge, we’re taking a closer look at some of the ways artificial intelligence and machine learning are affecting technology right now — because it’s too late to understand something after it’s changed the world.

Read more

Facebook is working on an artificial intelligence that it hopes could one day detect people’s emotions based on the tone of their voice, aiming to alleviate the frustrations of modern voice assistants such as Alexa.

Engineers at the social network’s research labs are working out how to train its voice-controlled video chat device, Portal, to understand when a user is angry, an employee said during a tech conference in San Francisco.

The system could one day be used across Facebook Messenger and WhatsApp calls, but could lead to privacy fears about the scope of the company’s data collection.
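
Facebook hasn’t said how such a system would work, but speech emotion recognition typically maps acoustic features of a short clip to an emotion label. Below is a minimal, hypothetical sketch of that general idea — the file names, the two labels, and the feature and classifier choices are all assumptions for illustration, not Facebook’s system.

```python
# Hypothetical sketch of speech emotion recognition (not Facebook's system):
# summarize each clip with MFCC statistics and train a simple classifier
# to label the speaker's emotion.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def clip_features(path):
    """Describe a clip by the mean and variance of its MFCCs."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.var(axis=1)])

# Assumed training data: (audio file, label) pairs such as "angry" / "neutral".
labeled_clips = [("call_01.wav", "angry"), ("call_02.wav", "neutral")]
X = np.stack([clip_features(path) for path, _ in labeled_clips])
y = [label for _, label in labeled_clips]

model = RandomForestClassifier(n_estimators=200).fit(X, y)
print(model.predict([clip_features("new_call.wav")]))  # e.g. ['angry']
```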

Read more

Neural networks have been used to turn words that a human has heard into intelligible, recognizable speech. It could be a step toward technology that can one day decode people’s thoughts.

A challenge: Thanks to fMRI scanning, we’ve known for decades that when people speak, or hear others, it activates specific parts of their brain. However, it’s proved hugely challenging to translate thoughts into words. A team from Columbia University has developed a system that combines deep learning with a speech synthesizer to do just that.

The study: The team temporarily placed electrodes in the brains of five people scheduled to have brain surgery for epilepsy. (People who have this procedure often have implants fitted to learn more about their seizures.) The volunteers were asked to listen to recordings of sentences, and their brain activity was used to train deep-learning-based speech recognition software. Then they listened to 40 numbers being spoken. The AI tried to decode what they had heard on the basis of their brain activity—and then spoke the results out loud in a robotic voice. What the voice synthesizer produced was understandable as the right word 75% of the time, according to volunteers who listened to it. The results were published in Scientific Reports today (and you can listen to the recordings here.)
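
The paper’s actual model and vocoder aren’t reproduced here, but the basic decoding idea — regress from frames of recorded neural activity to frames of an audio spectrogram, then resynthesize sound from the predictions — can be sketched roughly as follows. The electrode and frequency-bin counts, the simple feed-forward network, and the use of Griffin-Lim in place of the team’s synthesizer are assumptions for illustration.

```python
# Rough sketch of the decoding idea, not the Columbia team's code: learn a
# frame-wise mapping from electrode activity to audio spectrogram frames,
# then turn predicted spectrograms back into sound.
import torch
import torch.nn as nn
import librosa

N_ELECTRODES, N_FREQ_BINS = 128, 257   # assumed recording/audio dimensions

decoder = nn.Sequential(               # simple frame-wise regressor
    nn.Linear(N_ELECTRODES, 512), nn.ReLU(),
    nn.Linear(512, N_FREQ_BINS),
)
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(neural_frames, target_spectrogram):
    """One gradient step: predict audio frames from brain-activity frames."""
    optimizer.zero_grad()
    loss = loss_fn(decoder(neural_frames), target_spectrogram)
    loss.backward()
    optimizer.step()
    return loss.item()

def decode_to_audio(neural_frames):
    """Decode held-out brain activity and resynthesize a waveform
    (Griffin-Lim stands in for the paper's vocoder)."""
    with torch.no_grad():
        magnitudes = decoder(neural_frames).clamp(min=0).numpy().T
    return librosa.griffinlim(magnitudes)
```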

Read more

WASHINGTON — Boeing is on track to launch its new astronaut taxi to the International Space Station (ISS) next month.

Along with SpaceX, the private spaceflight company was contracted by NASA to begin launching astronauts from U.S. soil again for the first time since the space shuttle program ended in 2011. Boeing’s CST-100 Starliner won’t be taking any astronauts along for its first flight to the ISS, however. After docking robotically with the orbiting lab, it will return to Earth for a parachute landing in Texas.

If this test flight goes according to plan, Boeing will be ready to launch its first crew of astronauts to the space station in August, Boeing spokesperson Maribeth Davis told Space.com during a presentation of Boeing’s future vision for space travel here.

Read more

AI farms are well suited to impoverished regions like Guizhou, where land and labor are cheap and the climate is temperate enough to run large machines without expensive cooling systems. It takes only two days to train workers like Yin in basic AI tagging, or a week for the more complicated task of labeling 3D pictures.


A battle for AI supremacy is being fought one algorithm at a time.

Read more

Now, this is awesome. A stationary robot, two mobile robots, and a human cooperating to perform a task. The humanoid robot also interprets human gestures and obeys those commands.

More information: http://www.co4robots.eu/


The Co4Robots MS2 scenario consists of collaborative grasping and manipulation of an object by two agents, the TIAGo mobile manipulator and a static manipulator, as well as a mobile platform and a stationary manipulator collaborating to facilitate loading and unloading tasks onto the mobile platform.

Find out more on the Co4Robots project at: http://www.co4robots.eu

The Co4Robots project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 731869.

Read more

Another step forward in robot self-awareness: this robot learns its own kinematics without human intervention and then learns to plot solution paths.


Columbia Engineering researchers have made a major advance in robotics by creating a robot that learns what it is, from scratch, with zero prior knowledge of physics, geometry, or motor dynamics. Once their robot creates a self-simulation, it can then use that self-model to adapt to different situations, to handle new tasks as well as detect and repair damage in its own body.
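
The Columbia group’s code isn’t shown here, but the core idea — babble random motor commands, record where the arm ends up, fit a “self-model” of the robot’s own kinematics, and then plan against that learned model — can be sketched roughly like this. The joint count, network size, and gradient-based planner are assumptions for illustration, not the paper’s method.

```python
# Hedged sketch of learning a "self-model": fit a neural forward model from
# (joint command, observed hand position) pairs gathered by random motion,
# then search that model for commands that reach a target.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_JOINTS = 4                            # assumed arm configuration

self_model = nn.Sequential(             # learned kinematics: joints -> (x, y, z)
    nn.Linear(N_JOINTS, 128), nn.ReLU(),
    nn.Linear(128, 3),
)
model_opt = torch.optim.Adam(self_model.parameters(), lr=1e-3)

def learn_from_babbling(joint_samples, observed_positions, epochs=200):
    """Fit the self-model to (command, outcome) pairs from random motion."""
    for _ in range(epochs):
        model_opt.zero_grad()
        loss = F.mse_loss(self_model(joint_samples), observed_positions)
        loss.backward()
        model_opt.step()

def plan_reach(target_xyz, steps=500):
    """Search for joint angles whose predicted hand position hits the target,
    treating the learned self-model as fixed during planning."""
    for p in self_model.parameters():
        p.requires_grad_(False)
    joints = torch.zeros(N_JOINTS, requires_grad=True)
    planner = torch.optim.Adam([joints], lr=0.05)
    for _ in range(steps):
        planner.zero_grad()
        loss = F.mse_loss(self_model(joints), target_xyz)
        loss.backward()
        planner.step()
    return joints.detach()
```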

Read more