
It was invented by David Gathu and Moses Kinyua and is powered by brain signals.

The signals are converted into an electric current by a “NeuroNode” biopotential headset receiver; this current is then fed into the robot’s circuitry, giving the arm its mobility.

The arm is built from several materials, including recycled wood, and can move both vertically and horizontally.



Micro-sized robots could bring a new wave of innovation in the medical field by allowing doctors to access specific regions inside the human body without the need for highly invasive procedures. Among other things, these tiny robots could be used to carry drugs, genes or other substances to specific sites inside the body, opening up new possibilities for treating different medical conditions.

Researchers at ETH Zurich and Helmholtz Institute Erlangen–Nürnberg for Renewable Energy have recently developed micro- and nano-sized robots inspired by biological micro-swimmers (e.g., bacteria or spermatozoa). These robots, presented in a paper published in Nature Machine Intelligence, are capable of upstream motility, meaning they can autonomously move against the direction in which a fluid (e.g., blood) flows. This makes them particularly promising for interventions inside the human body.

“We believe that the ideas discussed in our multidisciplinary study can transform many aspects of medicine by enabling tasks such as targeted and precise delivery of drugs or genes, as well as facilitating non-invasive surgeries,” Daniel Ahmed, lead author of the recent paper, told TechXplore.



A universal basic income worth about one-fifth of workers’ median wages did not reduce the amount of effort employees put into their work, according to an experiment conducted by Spanish economists, a sign that the policy could help mitigate inequalities and the impact of automation, and a finding that debunks a common criticism of the proposal.

The researchers also found that the threat of being replaced by robots did not affect workers’ productivity; nor did a tax on firms that replace a worker with a robot or automated process, though that tax did create a disincentive for managers.

The Army’s top modernization official said Monday that the Pentagon may have to relax its rules on human control over artificially intelligent combat systems to defeat swarms of enemy drones that often move too fast for soldiers to track.

All branches of the U.S. military have expressed interest in using artificial intelligence, or AI, for faster target recognition; however, the Defense Department until now has stressed that humans, not machines, will always make the decision to fire deadly weapons.

But as small unmanned aerial systems, or UAS, proliferate around the world, Army modernization officials are recognizing that swarms of fast-moving drones will be difficult to defeat without highly advanced technology.

Our goal is audacious — some might even say naive. The aim is to evaluate every gene and drug perturbation in every possible type of cancer in laboratory experiments, and to make the data accessible to researchers and machine-learning experts worldwide. To put some ballpark numbers on this ambition, we think it will be necessary to perturb 20,000 genes and assess the activity of 10,000 drugs and drug candidates in 20,000 cancer models, and measure changes in viability, morphology, gene expression and more. Technologies from CRISPR genome editing to informatics now make this possible, given enough resources and researchers to take on the task.
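The scale implied by those ballpark numbers is worth making explicit. A back-of-envelope sketch, using only the figures stated above (one experiment per gene or drug per cancer model is an assumption for illustration):

```python
# Combinatorial scale of the proposed screen, using the numbers
# stated in the text. Assumes one experiment per gene (or drug)
# per cancer model, which is a simplification for illustration.

GENES = 20_000
DRUGS = 10_000
MODELS = 20_000

gene_experiments = GENES * MODELS   # one perturbation per gene per model
drug_experiments = DRUGS * MODELS   # one assay per drug per model
total = gene_experiments + drug_experiments

print(f"{gene_experiments:,} gene-perturbation experiments")
print(f"{drug_experiments:,} drug assays")
print(f"{total:,} experiments in total")
```

Even before replicates or multiple readouts, that is on the order of 600 million experiments, which is why the authors frame the effort as a worldwide, resource-intensive undertaking.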


It is time to move beyond tumour sequencing data to identify vulnerabilities in cancers.

Exploring new approaches to improve the capabilities and accuracy of robots, a team of researchers in Singapore has turned to an unexpected source: plants.

Robots have been dispatched to move cars, lift weighty inventory in warehouses and assist in construction projects.

But what if you need to delicately lift a tiny object 1/50th of an inch?

In recent years, countless computer scientists worldwide have been developing deep neural network-based models that can predict people’s emotions based on their facial expressions. Most of the models developed so far, however, merely detect primary emotional states such as anger, happiness and sadness, rather than more subtle aspects of human emotion.

Past psychology research, on the other hand, has delineated numerous dimensions of emotion, for instance, introducing measures such as valence (i.e., how positive an emotional display is) and arousal (i.e., how calm or excited someone is while expressing an emotion). While estimating valence and arousal simply by looking at people’s faces is easy for most humans, it can be challenging for machines.
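The dimensional view described above can be made concrete with a small sketch. The quadrant labels below are illustrative assumptions based on the common circumplex model of affect, not the labels used by the Samsung AI / Imperial College system:

```python
# Illustrative sketch of the dimensional (circumplex) model of emotion:
# each expression is scored on two continuous axes, valence
# (negative..positive) and arousal (calm..excited), rather than
# assigned one of a few discrete labels. Quadrant names are assumed
# for illustration only.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] to a coarse quadrant
    of the affect circumplex."""
    if not (-1.0 <= valence <= 1.0 and -1.0 <= arousal <= 1.0):
        raise ValueError("valence and arousal are expected in [-1, 1]")
    if valence >= 0:
        return "excited/happy" if arousal >= 0 else "content/relaxed"
    return "angry/stressed" if arousal >= 0 else "sad/bored"

print(circumplex_quadrant(0.8, 0.6))   # high valence, high arousal
print(circumplex_quadrant(-0.7, 0.9))  # low valence, high arousal
```

A discrete label such as “anger” collapses to a single point, whereas continuous valence and arousal scores can distinguish, say, mild irritation from rage — the subtlety the regression-style models aim to capture.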

Researchers at Samsung AI and Imperial College London have recently developed a deep-neural-network-based system that can estimate emotional valence and arousal with high levels of accuracy simply by analyzing images of human faces taken in everyday settings. This model, presented in a paper published in Nature Machine Intelligence, can make predictions fairly quickly, which means that it could be used to detect subtle qualities of emotion in real time (e.g., from snapshots of CCTV cameras).

Automation ‘to keep people safe’

Hong Kong-based Hanson Robotics said four models, including Sophia, will enter mass production in the first half of 2021.

This coincides with a rise in automation documented worldwide, as robotics technologies are used to carry out everyday tasks amidst social-distancing restrictions.


Hanson Robotics plans to sell its famous robot amidst increased automation linked to the pandemic.

Researchers from Harvard University have 3D printed a school of soft robotic fish that are capable of swimming in complex patterns without the aid of Wi-Fi or GPS.

Inspired by the distinctive reef-dwelling surgeonfish, the team’s ‘Bluebots’ feature four fins for precision navigation, and a system of LEDs and cameras that enable them to swarm without colliding. The self-sufficiency of the tiny bots could make them ideal for ecological monitoring applications, in areas that wouldn’t otherwise be accessible to humans.

“Just by observing how far or close they are in a picture, they know how far or close the robot must be in the real world. That’s the trick we play here,” the study’s lead author Florian Berlinger told Wired.
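The “trick” Berlinger describes — inferring distance from how large a neighbor appears in the camera image — matches the classic pinhole-camera relation. A minimal sketch, assuming a known real-world LED separation and a camera focal length expressed in pixels (all numbers are illustrative, not the Bluebots’ actual calibration):

```python
# Pinhole-camera sketch of estimating a neighbor's distance from its
# apparent size in the image. All values are illustrative assumptions,
# not the Bluebots' actual parameters.

def estimate_distance(real_separation_m: float,
                      pixel_separation: float,
                      focal_length_px: float) -> float:
    """Pinhole model: distance = f * real_size / apparent_size,
    with the focal length f expressed in pixels."""
    if pixel_separation <= 0:
        raise ValueError("apparent separation must be positive")
    return focal_length_px * real_separation_m / pixel_separation

# Two LEDs 0.05 m apart that appear 20 px apart through a lens with a
# 400 px focal length would be about 1 m away.
print(estimate_distance(0.05, 20.0, 400.0))
```

Because the apparent separation shrinks as a robot moves away, each Bluebot can gauge its neighbors’ range from a single image, with no Wi-Fi, GPS, or external positioning required.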