SpaceX and NASA are targeting Saturday, May 1 at 8:35 p.m. EDT (00:35 UTC on May 2) for Dragon to autonomously undock from the International Space Station (ISS) and splash down off the coast of Florida on Sunday, May 2 at approximately 2:57 a.m. EDT (6:57 UTC), completing its first six-month operational mission to the Station.

A series of departure burns will move Dragon away from the orbiting laboratory, after which the vehicle will jettison its trunk to shed mass and save propellant for the deorbit burn. Once the deorbit burn is complete, Dragon will re-enter Earth’s atmosphere and deploy its two drogue and four main parachutes in preparation for a soft water landing.

Aboard the spacecraft will be NASA astronauts Mike Hopkins, Victor Glover, Shannon Walker, and JAXA astronaut Soichi Noguchi, who flew to the space station on Dragon six months ago when Falcon 9 launched the spacecraft from historic Launch Complex 39A (LC-39A) at Kennedy Space Center in Florida on Sunday, November 15, 2020.

Upon splashdown, Dragon and the astronauts will be quickly recovered, with the spacecraft returned to Cape Canaveral and the crew to Houston. Dragon will then be inspected and refurbished for future human spaceflight missions.

It’s exactly as weird as it sounds.


The U.S. Army is looking into using animal muscle tissue as a means to move robots.

The Army Research Laboratory believes its bots could travel across the battlefield powered by real muscle, the tissue that lets most living things move and manipulate their environments, instead of mechanical arms, wheels, tracks, and other systems. The concept, which some might find disturbing, is an example of the new field of “biohybrids.”

In order to effectively navigate real-world environments, legged robots should be able to move swiftly and freely while maintaining their balance. This is particularly true for humanoid robots, robots with two legs and a human-like body structure.

Building robots that are stable on their legs while walking can be challenging. In fact, legged robots typically have unstable dynamics, due to their pendulum-like structure.
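To see why, consider the linear inverted pendulum model often used to reason about bipedal balance (a textbook simplification, not something from the article): the center of mass accelerates away from the support foot in proportion to its offset, so any small disturbance grows exponentially unless the controller steps or shifts its support. A minimal Python sketch:

```python
# Minimal sketch of the linear inverted pendulum model (a textbook
# simplification, not from the article). The center of mass at height h
# obeys x'' = (g / h) * x, so any offset from the support point grows
# exponentially -- the "unstable dynamics" of a pendulum-like robot.

g, h = 9.81, 1.0       # gravity (m/s^2), assumed center-of-mass height (m)
dt, steps = 0.01, 100  # time step (s), 1 second of simulation

x, v = 0.01, 0.0       # 1 cm initial offset, zero initial velocity
for _ in range(steps):
    a = (g / h) * x    # acceleration pushes the offset further out
    v += a * dt
    x += v * dt

print(f"offset after 1.0 s: {x:.3f} m")  # roughly 0.1 m, a tenfold growth
```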

Researchers at Hong Kong University of Science and Technology recently developed a computer vision-based robotic foot with tactile sensing capabilities. When integrated at the end of a robot’s legs, the artificial foot can improve the robot’s balance and stability during locomotion.

Yes, but they won’t be trusted until 2035.


Current trends in AI use in healthcare lead me to posit that this market will grow significantly in the coming years. So, should leaders in healthcare expect the emergence of a fully automated electronic physician, sonographer or surgeon as a replacement for the human healthcare professional? Can the development of AI in healthcare help overcome the difficulties the industry faces today? To answer these questions, I would like to analyze the current challenges of using AI in healthcare.

Let’s discuss two promising examples: the application of AI in diagnosis and reading images, and the use of robotic systems in surgery.

Diagnostic Robots: Accuracy And Use For Treatment Recommendations

The success of AI in diagnostics is borne out by its results in a number of medical studies, for example in optical coherence tomography (OCT), which requires serious qualifications to interpret. Google’s AI-based DeepMind Health system, for instance, demonstrated 94% diagnostic accuracy across more than 50 types of eye disease in an early trial. Nevertheless, the system operates in conjunction with human experts.
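For context, a headline figure like 94% is computed against a labeled test set, and per-class sensitivity often matters more clinically than the overall number, since a model can score well overall while missing a rare but serious condition. A toy sketch of the arithmetic (illustrative data, not DeepMind’s evaluation):

```python
import numpy as np

# Toy illustration of how diagnostic accuracy is scored (made-up labels;
# classes 0..2 stand in for disease categories).
y_true = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 0])
y_pred = np.array([0, 0, 1, 2, 1, 2, 2, 2, 1, 0])

print(f"overall accuracy: {np.mean(y_pred == y_true):.0%}")  # 80% here

# Per-class sensitivity (recall): the fraction of true cases of each
# disease that the model actually caught.
for c in np.unique(y_true):
    mask = y_true == c
    print(f"class {c} sensitivity: {np.mean(y_pred[mask] == c):.0%}")
```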

Protocol to reverse engineer Hamiltonian models advances automation of quantum devices.

Scientists from the University of Bristol’s Quantum Engineering Technology Labs (QETLabs) have developed an algorithm that provides valuable insights into the physics underlying quantum systems — paving the way for significant advances in quantum computation and sensing, and potentially turning a new page in scientific investigation.

In physics, systems of particles and their evolution are described by mathematical models, requiring the successful interplay of theoretical arguments and experimental verification. Even more complex is the description of systems of particles interacting with each other at the quantum mechanical level, which is often done using a Hamiltonian model. The process of formulating Hamiltonian models from observations is made even harder by the nature of quantum states, which collapse when attempts are made to inspect them.
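The excerpt does not spell out the protocol, but the core idea of Hamiltonian learning can be shown on a single qubit: posit a parameterized model, simulate the measurement statistics it predicts, and fit the parameters to observed data rather than trying to inspect the state itself. A hedged sketch (my illustration with assumed parameters, not QETLabs’ algorithm):

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# Sketch of Hamiltonian learning on one qubit: fit the parameters of a
# candidate model H = a*X + b*Z to measured <Z>(t) data. All numbers
# here are assumptions for illustration.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi0 = np.array([1, 0], dtype=complex)   # start in |0>
times = np.linspace(0.0, 2.0, 20)

def z_expectations(a, b):
    """Predicted <Z>(t) under H = a*X + b*Z (hbar = 1)."""
    H = a * X + b * Z
    preds = []
    for t in times:
        psi = expm(-1j * H * t) @ psi0
        preds.append(np.real(psi.conj() @ Z @ psi))
    return np.array(preds)

# Stand-in "experimental" data: hidden true parameters plus shot noise.
rng = np.random.default_rng(0)
data = z_expectations(0.8, 0.3) + rng.normal(0.0, 0.02, times.size)

# Recover the parameters by least squares against the observations.
loss = lambda p: np.sum((z_expectations(*p) - data) ** 2)
fit = minimize(loss, x0=[0.5, 0.5], method="Nelder-Mead")
print("recovered (a, b):", fit.x)        # close to the hidden (0.8, 0.3)
```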

SAN FRANCISCO — Kleos Space is conducting a six-month test of technology for in-space manufacturing of large 3D carbon fiber structures that could be used to construct solar arrays, star shades and interferometry antennas.

The company, with operations in Luxembourg, the United States and the United Kingdom, is best known for its radio frequency reconnaissance satellites. In the background, however, Kleos has been designing and developing an in-space manufacturing technology called Futrism to robotically produce a carbon-fiber I-beam with embedded fiber-optic cables that is more than 100 meters long.

“It’s something that we have linked to our roadmap for RF, because it’s something that could deploy very large antennas for RF reconnaissance,” Kleos CEO Andy Bowyer told SpaceNews. “However, it’s useful for a whole range of other applications as well that we are very keen to work with partners on. We firmly believe that manufacturing in space is the future.”

Machine learning is capable of doing all sorts of things as long as you have the data to teach it how. That’s not always easy, and researchers are always looking for a way to add a bit of “common sense” to AI so you don’t have to show it 500 pictures of a cat before it gets it. Facebook’s newest research takes a big step toward reducing the data bottleneck.

The company’s formidable AI research division has been working for years now on how to advance and scale things like advanced computer vision algorithms, and has made steady progress, generally shared with the rest of the research community. One interesting development Facebook has pursued in particular is what’s called “semi-supervised learning.”

Generally when you think of training an AI, you think of something like the aforementioned 500 pictures of cats — images that have been selected and labeled (which can mean outlining the cat, putting a box around the cat or just saying there’s a cat in there somewhere) so that the machine learning system can put together an algorithm to automate the process of cat recognition. Naturally if you want to do dogs or horses, you need 500 dog pictures, 500 horse pictures, etc. — it scales linearly, which is a word you never want to see in tech.
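Semi-supervised methods sit between the two extremes, stretching a small labeled set with a much larger unlabeled pool. Facebook’s specific technique isn’t described in this excerpt, but a common baseline, pseudo-labeling, gives the flavor (a minimal sketch; the dataset and confidence threshold are arbitrary choices):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Pseudo-labeling sketch: 100 labeled examples plus 1,900 unlabeled ones.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_lab, y_lab = X[:100], y[:100]   # small labeled set
X_unlab = X[100:]                 # large pool with labels withheld

# 1. Train an initial model on the labeled data alone.
model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)

# 2. Predict on the unlabeled pool; keep only high-confidence guesses.
probs = model.predict_proba(X_unlab)
confident = probs.max(axis=1) > 0.95
pseudo_y = probs.argmax(axis=1)[confident]

# 3. Retrain on the labeled set plus the pseudo-labeled examples.
X_aug = np.vstack([X_lab, X_unlab[confident]])
y_aug = np.concatenate([y_lab, pseudo_y])
model = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)

print(f"added {confident.sum()} pseudo-labeled examples to 100 labeled ones")
```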

Still calling 2025 for the debut of a robotic set of human-level hands.


Although robotic devices are used in everything from assembly lines to medicine, engineers have a hard time accounting for the friction that occurs when those robots grip objects – particularly in wet environments. Researchers have now discovered a new law of physics that accounts for this type of friction, which should advance a wide range of robotic technologies.

“Our work here opens the door to creating more reliable and functional haptic and robotic devices in applications such as telesurgery and manufacturing,” says Lilian Hsiao, an assistant professor of chemical and biomolecular engineering at North Carolina State University and corresponding author of a paper on the work.

At issue is something called elastohydrodynamic lubrication (EHL) friction, which is the friction that occurs when two solid surfaces come into contact with a thin layer of fluid between them. This would include the friction that occurs when you rub your fingertips together, with the fluid being the thin layer of naturally occurring oil on your skin. But it could also apply to a robotic claw lifting an object that has been coated with oil, or to a surgical device that is being used inside the human body.
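The new law itself isn’t reproduced in this excerpt. As a point of reference, the classical way to organize EHL friction is through the Hersey number (viscosity times sliding speed over normal load), with the friction coefficient rising along the fluid-film branch of the Stribeck curve. A rough sketch using that textbook scaling, with assumed values (not the researchers’ result):

```python
# Classical EHL scaling, not the new law from the paper. All values are
# assumptions chosen only to illustrate the arithmetic.
eta = 0.05    # fluid viscosity, Pa*s (a thin oil)
v = 0.01      # sliding speed, m/s (fingertip-scale motion)
load = 1.0    # normal load per unit contact length, N/m

hersey = eta * v / load
print(f"Hersey number: {hersey:.1e}")          # 5.0e-04

# Hypothetical power-law fit for the fluid-film branch of a
# Stribeck-style curve; k and n would come from experiments.
k, n = 5.0, 0.5
mu = k * hersey ** n
print(f"estimated friction coefficient: {mu:.2f}")   # about 0.11
```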

A seabed mining robot being tested on the Pacific Ocean floor at a depth of more than 4 km (13,000 ft) has become detached, the Belgian company running the experimental trial said on Wednesday.

Global Sea Mineral Resources (GSR), the deep-sea exploratory division of dredging company DEME Group, has been testing Patania II, a 25-tonne mining robot prototype, in its concession in the Clarion Clipperton Zone since April 20.

The machine is meant to collect the potato-sized nodules rich in cobalt and other battery metals that pepper the seabed in this area, and was connected to GSR’s ship by a 5 km cable.