
To check out any of the lectures available from Great Courses Plus go to http://ow.ly/dweH302dILJ

We’ll soon be capable of building self-replicating robots. This will not only change humanity’s future but reshape the galaxy as we know it.

Get your own Space Time t-shirt at http://bit.ly/1QlzoBi.
Tweet at us! @pbsspacetime.
Facebook: facebook.com/pbsspacetime.
Email us! pbsspacetime [at] gmail [dot] com.
Comment on Reddit: http://www.reddit.com/r/pbsspacetime.
Support us on Patreon! http://www.patreon.com/pbsspacetime.

Help translate our videos! http://www.youtube.com/timedtext_cs_panel?tab=2&c=UC7_gcs09iThXybpVgjHZ_7g.

Previous Episode — Is There a 5th Fundamental Force?
https://www.youtube.com/watch?v=MuvwcsfXIIo.

Should we Build a Dyson Sphere?
https://www.youtube.com/watch?v=jW55cViXu6s.

A new kind of technology developed by Meta AI could enable more intelligent and efficient robots to enter our homes and take over warehouse work through advances in artificial intelligence. DIGIT and ReSkin are two tactile-sensing technologies that give robots a sense of touch finer, in some respects, than a human's. Yann LeCun, one of the field's leading AI scientists, is working on this futuristic technology, which may rank among the most notable robotics and AI developments of 2021. Through deep learning and machine-learning robotics, these humanoid robots will gain abilities previously thought impossible.

If you enjoyed this video, please consider rating it and subscribing to our channel for more frequent uploads. Thank you!

TIMESTAMPS:
00:00 A new type of Robot.
02:25 A new way to sense the world.
04:45 Is this technology for everyone?
07:13 DIGIT and the Metaverse.
08:32 Last Words.

#ai #meta #facebook

Deadlines.

You either love them, hate them, or experience both sentiments at the same time.

For AI-based true self-driving cars, there isn’t a human driver involved. Keep in mind that true self-driving cars are driven via an AI driving system. There is no need for a human driver at the wheel, nor is there normally a provision for a human to drive the vehicle. For my extensive and ongoing coverage of Autonomous Vehicles (AVs) and especially self-driving cars, see the l…


A deadline can be handy as a focal point that rallies everyone toward achieving something great. On the other hand, a deadline can turn people against one another and foment a bitter fight that leaves all involved scarred and upset over a seemingly arbitrary and reprehensible line in the sand.

Deadlines do decidedly set expectations, though at times the expectations are out of whack with reality.

The wheeled Jaeger-C is a small machine with a low profile designed to attack from ambush. In some ways, it might be seen as a mobile robotic mine. This is especially true because the makers note it can be remote-controlled or “autonomously with image analysis and trained models linked to robotic actions,” according to a report in Overt Defense. This sounds very much like the sort of deep learning increasingly used for other automatic target recognition, a trend driven by the ready availability of new, low-cost hardware for small uncrewed systems.

The Jaeger-C will sit in ambush in Gaard mode – a long-term silent watch mode – until it detects potential targets. It will then switch into either Chariot mode or Goliath mode depending on whether the targets are personnel or vehicles.
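The mode logic described above can be sketched as a simple state machine. This is purely illustrative: the mode names come from the report, but the `classify_target` stub and the transition code are hypothetical, not the maker's actual control software.

```python
from enum import Enum, auto

class Mode(Enum):
    GAARD = auto()    # long-term silent watch
    CHARIOT = auto()  # respond to personnel targets
    GOLIATH = auto()  # respond to vehicle targets

def classify_target(detection):
    """Hypothetical stand-in for the trained image-analysis model:
    returns 'personnel', 'vehicle', or None for no target."""
    return detection.get("label") if detection else None

def next_mode(current, detection):
    # While in silent watch, switch modes based on what the model reports.
    if current is Mode.GAARD:
        label = classify_target(detection)
        if label == "personnel":
            return Mode.CHARIOT
        if label == "vehicle":
            return Mode.GOLIATH
    return current
```

For example, `next_mode(Mode.GAARD, {"label": "vehicle"})` yields `Mode.GOLIATH`, while a detection of nothing leaves the machine in `Mode.GAARD`.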

The Air Force’s Skyborg team flew two General Atomics MQ-20 Avenger stealth drones on the “multi-hour” Oct. 26 flight over California. One of the Ave…


Two stealth drones soared over Edwards Air Force Base in California last week, offering some encouraging evidence that the U.S. Air Force’s new drone “brain” not only works—it works with a bunch of different drone types.

The Air Force hopes to install the Skyborg autonomy core system (ACS) in a wide array of unmanned aerial vehicles. The idea is for the ACS to steer armed drones with minimal human control—even in the heat of battle. That way the drones can fly as robotic wingmen for manned fighters without demanding too much of the busy human pilots.

The value of attending major industry events like Nvidia’s GTC (GPU Technology Conference) is to see what companies are and are not focusing on going forward. Nvidia has transformed the agenda for GTC from gaming into one of the leading AI events. The agenda also includes HPC and data center networking topics, representing other areas Nvidia has been expanding into in the last few years. If the agenda for the upcoming GTC event is any indication, the company has greatly increased its focus on autonomous machines, which includes all forms of robotics.

In addition to autonomous vehicles, this GTC agenda includes more than ten sessions focused on autonomous machines. As the company has done with other market segments, the autonomous machines sessions will bring together experts from academia, industry, and Nvidia to provide training, industry insights, and technical assistance in AI and robotics. Some of the experts attending include Brian Gerkey, co-founder and CEO of Open Robotics, Patty Delafuente from the University of Maryland, Ajit Jaokar and Ayşe Mutlu from the University of Oxford, and Johan Barthelemy from the University of Wollongong. There will also be AI and robotics experts from Denso Wave, Digeiz, Hammerson, Integral AI, Milestone Systems, Nota, and SK Telecom presenting at the conference.

DeepMind is mostly known for its work in deep reinforcement learning, especially in mastering complicated games and predicting protein structures. Now, it is taking its next step in robotics research.

According to a blog post on DeepMind’s website, the company has acquired the rigid-body physics simulator MuJoCo and has made it freely available to the research community. MuJoCo is now one of several open-source platforms for training artificial intelligence agents used in robotics applications. Its free availability will have a positive impact on the work of scientists who are struggling with the costs of robotics research. It can also be an important factor for DeepMind’s future, both as a science lab seeking artificial general intelligence and as a business unit of one of the largest tech companies in the world.

Simulation platforms are a big deal in robotics. Training and testing robots in the real world is expensive and slow. Simulated environments, on the other hand, allow researchers to train multiple AI agents in parallel and at speeds that are much faster than real life. Today, most robotics research teams carry out the bulk of training their AI models in simulated environments. The trained models are then tested and further fine-tuned on real physical robots.
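The speed advantage is easy to see in miniature. The toy loop below—a generic sketch, not MuJoCo's actual API—advances a batch of trivially simple point-mass "environments" in lockstep, the pattern that lets a research team collect thousands of agent-environment interactions per wall-clock second before fine-tuning on a real robot.

```python
# A toy illustration of batched simulation: many independent
# point-mass "environments" advanced in lockstep. Real platforms
# such as MuJoCo expose a far richer physics step, but the training
# pattern -- step every environment, collect every observation -- is the same.

def step_all(states, actions, dt=0.01):
    """Advance every environment one timestep: x' = x + v*dt, v' = v + a*dt."""
    return [
        (x + v * dt, v + a * dt)
        for (x, v), a in zip(states, actions)
    ]

# 1,000 parallel environments, all starting at rest at the origin.
states = [(0.0, 0.0)] * 1000
for _ in range(100):                 # 100 simulated timesteps
    actions = [1.0] * len(states)    # constant unit acceleration
    states = step_all(states, actions)
```

One pass of this loop simulates 100,000 environment steps; a single physical robot would need 1,000 separate one-second trials to gather the same experience, which is exactly why simulation-first training dominates current practice.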