
Circa 2020


Robots and other machines have been using a particular band of ultraviolet light to sterilize surfaces that might be contaminated with coronavirus. Those that must decontaminate large spaces, such as hospital rooms or aircraft cabins, use large, power-hungry mercury lamps to produce ultraviolet-C light. Companies around the world are working to improve the abilities of UV-C-producing LEDs, to offer a more compact and efficient alternative. Earlier this month, Seoul Viosys showed what it says is the first 99.9 percent sterilization of SARS-CoV-2, the coronavirus that causes COVID-19, using ultraviolet LEDs.

UV LEDs are deadly to viruses and bacteria because the 100–280 nanometer wavelength C-band shreds genetic material. Unfortunately, it’s also strongly absorbed by oxygen and ozone in the air, so sources have to be powerful to have an effect at a distance. (Air is such a strong barrier that the sun’s UV-C doesn’t reach the Earth’s surface.) Working with researchers at Korea University, in Seoul, the company showed that its Violed LED modules could eliminate 99.9 percent of the SARS-CoV-2 virus with a 30-second dose from a distance of three centimeters.

Unfortunately, the company did not disclose how many of its LEDs were used to achieve that result. Assuming that it and the university researchers used a single Violed CMD-FSC-CO1A integrated LED module, a 30-second dose would have delivered at most 600 millijoules of energy. This is somewhat in line with expectations: a study of UV-C’s ability to kill influenza A viruses on N95 respirator masks indicated that about 1 joule per square centimeter would do the job.
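To put those numbers in context, here is a back-of-the-envelope dose check in Python. It is only a sketch: the 600 mJ / 30 s figure comes from the article, but the implied 20 mW module output and the 1 cm² illuminated spot size are assumptions, since the study didn’t disclose either.

```python
# Back-of-the-envelope UV-C dose check. The 600 mJ over 30 s figure is from
# the article; the 20 mW output and 1 cm^2 spot size are assumptions.

def radiant_energy_mj(power_mw: float, seconds: float) -> float:
    """Total radiant energy in millijoules: power (mW) x time (s)."""
    return power_mw * seconds

def dose_mj_per_cm2(energy_mj: float, spot_area_cm2: float) -> float:
    """Fluence (dose) if the energy is spread evenly over the spot."""
    return energy_mj / spot_area_cm2

energy = radiant_energy_mj(power_mw=20.0, seconds=30.0)  # 600 mJ
dose = dose_mj_per_cm2(energy, spot_area_cm2=1.0)        # 600 mJ/cm^2

# The influenza/N95 study cited above suggests ~1 J/cm^2 (1,000 mJ/cm^2).
print(f"{energy:.0f} mJ delivered, {dose:.0f} mJ/cm^2 vs ~1,000 mJ/cm^2 benchmark")
```

Under those assumptions the 30-second dose lands in the same order of magnitude as the influenza benchmark, which is why the result is described as roughly in line with expectations.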

As the electronic health record grows in detail, the possibilities for customized care are becoming a reality. This article features some useful links to projects in the making.


While AI is driving value in all aspects of our lives, there are times when it’s hard to separate the aspirations of those who want to use it to do good from those leveraging AI today to positively impact real change in health and medicine.

I have the privilege of working with many talented leaders and organizations that are truly making health and medical services better by harnessing the power of healthcare’s data tsunami using AI and other analytical solutions.

COVID-19, part two

There is growing optimism about how we will manage COVID-19 going forward and restore many of the daily living activities we miss and treasure. One of the good things we learned from COVID-19 is that, when faced with a challenge, health systems are capable of agile transformation. As part of this, we also demonstrated that AI could drive a “short time to value.”

Audio content production company Aflorithmic and digital human creators UneeQ have collaborated to synthesize the voice of renowned historical scientist, Albert Einstein.

Both organizations intend to give users the opportunity to ask a lifelike Einstein AI practical questions, just as if they were engaging the real-life physicist himself. The companies say they chose Einstein because of his reputation as a genuine genius, historical icon, and technology enthusiast, and as someone many people would actually want to ask questions.

For the Einstein proof of concept, UneeQ has combined visual character rendering techniques with an advanced computational knowledge engine in order to make the prototype as realistic as possible. In terms of resurrecting an authentic voice based on the real Albert Einstein, however, researchers had little to go on. The only accounts they managed to uncover reported Einstein to have had a heavy German accent, and that he spoke slowly, wisely, and kindly in a high-pitched tone.

Someday, scientists believe, tiny DNA-based robots and other nanodevices will deliver medicine inside our bodies, detect the presence of deadly pathogens, and help manufacture increasingly smaller electronics.

Researchers took a big step toward that future by developing a new tool that can design much more complex DNA robots and nanodevices than were ever possible before, in a fraction of the time.

In a paper published today in the journal Nature Materials, researchers from The Ohio State University—led by former engineering doctoral student Chao-Min Huang—unveiled new software they call MagicDNA.

Cambridge Quantum Computing’s (CQC) hiring of Stephen Clark as head of AI last week could be a sign the company is boosting research into ways quantum computing could be used for natural language processing.

Quantum computing is still in its infancy but promises such significant results that dozens of companies are pursuing new quantum architectures. Researchers at technology giants such as IBM, Google, and Honeywell are making measured progress on demonstrating quantum supremacy for narrowly defined problems. Quantum computers with 50–100 qubits may be able to perform tasks that surpass the capabilities of today’s classical digital computers, “but noise in quantum gates will limit the size of quantum circuits that can be executed reliably,” California Institute of Technology theoretical physics professor John Preskill wrote in a recent paper. “We may feel confident that quantum technology will have a substantial impact on society in the decades ahead, but we cannot be nearly so confident about the commercial potential of quantum technology in the near term, say the next 5 to 10 years.”
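Preskill’s point about noise is easy to see with a toy model. Below is a minimal sketch assuming independent gate errors (an assumption for illustration, not a model from the article): if each gate succeeds with probability (1 − p), an n-gate circuit runs error-free with probability (1 − p)^n.

```python
# Toy noise model: each gate fails independently with probability p,
# so a circuit of n gates completes cleanly with probability (1 - p) ** n.

def circuit_success_probability(gate_error: float, num_gates: int) -> float:
    """Probability that an entire circuit executes with no gate errors."""
    return (1.0 - gate_error) ** num_gates

for num_gates in (100, 1_000, 10_000):
    p = circuit_success_probability(gate_error=1e-3, num_gates=num_gates)
    print(f"{num_gates:>6} gates at 0.1% error -> {p:.3%} chance of a clean run")

# ~90% at 100 gates, ~37% at 1,000, ~0.005% at 10,000: deeper circuits
# quickly become unreliable without error correction.
```

Even at a 0.1 percent gate error rate, reliability collapses well before circuits get deep enough for many practical workloads, which is why near-term machines are confined to narrowly defined problems.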

CQC has been selling software focused on specific use cases, such as cybersecurity and pharmaceutical drug delivery, as the hardware becomes available. “We are very different from the other quantum software companies that we are aware of, which are primarily focused on consulting-based revenues,” CQC CEO Ilyas Khan told VentureBeat.

😃 From robotic people to robotic dogs, it seems scientists are now pushing forward with robotic vines. 😃


This robot has applications in archaeology, space exploration, and search and rescue, with a simple, elegant design inspired by a plant.

Make your own Vine Robot! — https://www.vinerobots.org.
Special thanks to A/Prof. Elliot Hawkes, Nicholas Naclerio, Margaret Coad, David Haggerty for appearing in this video and showing off your amazing robots. For more info on vine (and other types of) robots check out https://ve42.co/HawkesLab, and https://ve42.co/CHARM

B-roll footage of robots from the supplementary materials of (Hawkes et al., 2017) https://ve42.co/VineVideos, and from Stanford University https://ve42.co/StanfordVideo.

Additional info on the intubation vine robot here: https://www.wardenchem.com/vine.

References: Hawkes, E. W., Blumenschein, L. H., Greer, J. D., & Okamura, A. M. (2017). A soft robot that navigates its environment through growth. Science Robotics, 2. — https://ve42.co/Hawkes2017

Coad, M. M., Blumenschein, L. H., Cutler, S., Zepeda, J. A. R., Naclerio, N. D., El-Hussieny, H.,… & Okamura, A. M. (2019). Vine robots: Design, teleoperation, and deployment for navigation and exploration. IEEE Robotics & Automation Magazine, 27, 120-132. — https://ve42.co/Coad2019

Blumenschein, L. H., Coad, M. M., Haggerty, D. A., Okamura, A. M., & Hawkes, E. W. (2020). Design, modeling, control, and application of everting vine robots. Frontiers in Robotics and AI, 7. — https://ve42.co/Blumenschein2020

**Engineers, using artificial intelligence and wearable cameras, now aim to help robotic exoskeletons walk by themselves.**

Increasingly, researchers around the world are developing lower-body exoskeletons to help people walk. These are essentially walking robots users can strap to their legs to help them move.

One problem with such exoskeletons: They often depend on manual controls to switch from one mode of locomotion to another, such as from sitting to standing, standing to walking, or walking on level ground to walking up or down stairs. Relying on joysticks or smartphone apps every time you want to change the way you move can prove awkward and mentally taxing, says Brokoslaw Laschowski, a robotics researcher at the University of Waterloo in Canada.


AI and wearable cameras could help exoskeletons act a bit like autonomous vehicles.
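As a rough illustration of that idea, here is a hypothetical sketch of the control-side logic: per-frame environment predictions from a vision model are debounced before the exoskeleton actually switches modes. The label set, window size, and threshold are all assumptions for illustration, not details from the Waterloo work.

```python
# Hypothetical mode-switching logic for a camera-driven exoskeleton.
# Per-frame predictions come from some vision classifier (not shown).

from collections import Counter, deque

# Assumed locomotion modes (placeholder label set).
MODES = ["level-ground", "stairs-up", "stairs-down", "sit-to-stand"]

class ModeSwitcher:
    """Debounces per-frame environment predictions before switching modes."""

    def __init__(self, window: int = 15):
        self.history = deque(maxlen=window)  # last N per-frame predictions
        self.current_mode = "level-ground"

    def update(self, predicted_mode: str) -> str:
        self.history.append(predicted_mode)
        # Only switch when a clear majority of recent frames agree, so a
        # single misclassified frame can't toggle the actuators.
        mode, count = Counter(self.history).most_common(1)[0]
        if count > len(self.history) * 0.8:
            self.current_mode = mode
        return self.current_mode

switcher = ModeSwitcher()
for frame_prediction in ["level-ground"] * 10 + ["stairs-up"] * 20:
    mode = switcher.update(frame_prediction)
print(mode)  # "stairs-up" once recent predictions agree consistently
```

The smoothing step matters for the same reason it does in autonomous vehicles: a transient misclassification should never translate directly into an actuator command attached to someone’s legs.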

A Russian company is building the country’s first-ever specialized factory solely for manufacturing unmanned aerial vehicles (UAVs). It plans to mass-produce military drones, like those deployed by the Russian Army in Syria.

The 45,000-square-meter plant, under construction in the town of Dubna near Moscow, will cost at least four billion rubles ($52 million) and will create jobs for more than 1,500 people. If all goes to plan, it will be built in record time, with the launch of production scheduled for November 2021.

The company, called ‘Kronshtadt Group,’ is the developer and manufacturer of the Inokhodets UAV, also known as the Orion. This medium-altitude drone, which is capable of flying for a whole day, can carry a payload of up to 200 kg and has already seen action in the Middle East.

As well as Kronshtadt, many other Russian enterprises in the military-industrial complex are developing drones for deployment on the front lines. For example, aircraft manufacturer Sukhoi has teamed up with defense company Mikoyan to build the Okhotnik-B, which will have a top speed of 1,000 km/h. Another aerospace company, OKB Sokol, has developed a UAV named Altius, due to be delivered to the Russian Army this year.

Four of the newfound quadruply imaged quasars are shown here. From top left, moving clockwise, the objects are GraL J1537-3010, or “Wolf’s Paw”; GraL J0659+1629, or “Gemini’s Crossbow”; GraL J1651-0417, or “Dragon’s Kite”; and GraL J2038-4008, or “Microscope Lens.” The fuzzy dot in the middle of each image is the lensing galaxy, whose gravity splits the light from the quasar behind it in such a way as to produce four quasar images. By modeling these systems and monitoring how the different images vary in brightness over time, astronomers can determine the expansion rate of the universe and help solve cosmological problems. Credit: The GraL Collaboration.

With the help of machine-learning techniques, a team of astronomers has discovered a dozen quasars that have been warped by a naturally occurring cosmic “lens” and split into four similar images. Quasars are extremely luminous cores of distant galaxies that are powered by supermassive black holes.

Over the past four decades, astronomers had found about 50 of these “quadruply imaged quasars,” or quads for short, which occur when the gravity of a massive galaxy that happens to sit in front of a quasar splits its single image into four. The latest study, which spanned only a year and a half, increases the number of known quads by about 25 percent and demonstrates the power of machine learning to assist astronomers in their search for these cosmic oddities.
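For a sense of how machine learning fits into such a search, here is an illustrative sketch with placeholder data, not the team’s actual pipeline: a classifier is trained to separate lens-like systems from ordinary contaminants using catalog features, then used to rank unlabeled survey sources for follow-up observation.

```python
# Illustrative quad-lens candidate ranking (placeholder features and labels;
# the real search used survey catalogs and its own models).

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder training set: 6 catalog features per source grouping
# (e.g., image separations, magnitudes, colors), with 1 = simulated quad.
X_train = rng.normal(size=(1_000, 6))
y_train = rng.integers(0, 2, size=1_000)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Rank unlabeled candidates by predicted quad probability for follow-up.
X_candidates = rng.normal(size=(50, 6))
scores = clf.predict_proba(X_candidates)[:, 1]
top = np.argsort(scores)[::-1][:5]
print("Top candidates for spectroscopic follow-up:", top, scores[top])
```

The payoff of this kind of ranking is speed: instead of visually inspecting millions of sources, astronomers only need to confirm a short list, which is how a year-and-a-half search could grow the four-decade tally of known quads by about a quarter.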