
With the help of next-generation artificial intelligence, scientists have created the smallest and most efficient camera in the world. A specialist medical camera measuring just under a millimeter has just entered the Guinness Book of Records. The size of a grain of sand, it is the camera’s tiny sensor that is actually being entered into the world-famous record book, for being the smallest commercially available image sensor.

TIMESTAMPS:
00:00 A new leap in Material Science.
00:57 How this new technology works.
03:45 Artificial Intelligence and Material Science.
06:00 The Privacy Concerns of Tiny Cameras.
07:45 Last Words.

#ai #camera #technology

Engie and Macquarie to build 150MW one-hour battery at site of shuttered Hazelwood coal generator.


French energy giant Engie and Macquarie’s Green Investment Group are to jointly fund the construction of a 150MW/150MWh big battery at the site of the now closed Hazelwood brown coal generator.

The announcement, which comes four years after Engie closed what was Australia’s dirtiest power station, continues the trend of using the sites of closed or ageing coal and gas plants to build battery storage to support the switch to 100 per cent renewables.

Construction has already begun on the Hazelwood Battery, which will be built and maintained over a 20-year period by US-based Fluence, using – for the first time in Australia – its sixth-generation Gridstack product and its AI-enabled bidding system.

The robot navigates using sensors and removes weeds mechanically, without the need for chemicals. The LiDAR (light detection and ranging) scanners installed in the weed killer continuously emit laser pulses as the vehicle moves, which are then reflected by objects in the surrounding area. This produces a 3D point cloud of the environment, which helps the mobile weed killer find its way and determine the position of plants or trees. “AMU-Bot is not yet able to classify all plants; however, it can recognize crops such as trees and shrubs in the rows of the tree nursery cultivations,” said team leader Kevin Bregler.
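As a rough sketch of how those laser returns become a navigable map (a generic illustration, not the AMU-Bot's actual software; the scan format and the 0.3 m height threshold are assumptions), each return can be converted from range-and-angle form into Cartesian points, and taller points treated as candidate crop rows:

```python
import numpy as np

def scan_to_points(ranges, azimuths, elevations):
    """Convert one LiDAR scan (polar form) into an (N, 3) point cloud.
    Ranges in meters, angles in radians, one entry per laser return."""
    r, az, el = map(np.asarray, (ranges, azimuths, elevations))
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.column_stack((x, y, z))

def crop_row_candidates(points, min_height=0.3):
    """Keep points tall enough to be trees or shrubs rather than weeds
    or bare soil (the 0.3 m threshold is an illustrative assumption)."""
    return points[points[:, 2] > min_height]

# Example: one synthetic return roughly 2 m ahead and slightly above the sensor.
cloud = scan_to_points([2.0], [0.0], [0.2])
print(crop_row_candidates(cloud))
```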

The weeds in the spaces between the plants or trees are also reliably eliminated. To do this, the manipulator moves into the gaps between the crops. The weeds do not need to be collected separately and are left on the ground to dry out. Thanks to its caterpillar drive, the self-driving weed killer moves along the ground with ease and is extremely stable. Even holes in the ground created when saplings are removed do not pose a problem for AMU-Bot. The AMU-Bot platform is economical, robust, easy to use, and at the same time highly efficient.

The project is funded by the German Federal Office of Agriculture and Food. The AMU-Bot platform relies on the ingenious interaction of three sophisticated modules: caterpillar vehicle, navigation system, and manipulator. Bosch is responsible for the navigation and the sensor system, while KommTek developed the caterpillar drive. The Fraunhofer IPA designed the height-adjustable manipulator, including rotary harrows, and was responsible for overall coordination.

Eureka Robotics, a tech spin-off from Nanyang Technological University, Singapore (NTU Singapore), has developed a technology, called Dynamis, that makes industrial robots nimbler and almost as sensitive as human hands, able to manipulate tiny glass lenses, electronics components, or engine gears that are just millimeters in size without damaging them.

This proprietary force feedback technology developed by NTU scientists was previously demonstrated by the Ikea Bot, which assembled an Ikea chair in just 20 minutes. The breakthrough was first published in Science Robotics in 2018 and went viral when the robot matched the dexterity of human hands in assembling furniture.
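Dynamis itself is proprietary, so the sketch below only illustrates the general idea of force feedback with a simple admittance law: the measured contact force pulls the commanded position back so the gripper yields instead of crushing a millimeter-scale part. The gains, the single-axis interface, and the example numbers are assumptions, not Eureka's or NTU's actual controller.

```python
def admittance_step(x_cmd, f_measured, f_target=0.5, damping=200.0, dt=0.001):
    """One control-loop step of a basic admittance law (generic illustration).

    x_cmd      : commanded position along the approach axis (m)
    f_measured : contact force from a wrist force/torque sensor (N)
    f_target   : desired contact force, e.g. 0.5 N for a fragile lens
    damping    : virtual damping (N*s/m); higher values give gentler motion
    dt         : control period (s)
    """
    velocity = (f_target - f_measured) / damping  # back off when force is too high
    return x_cmd + velocity * dt

# Example: a 2 N contact force exceeds the 0.5 N target, so the commanded
# position retreats slightly on this 1 ms control step.
print(admittance_step(x_cmd=0.100, f_measured=2.0))
```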

NTU Associate Professor Pham Quang Cuong, Co-founder of Eureka Robotics, said they have since upgraded the software technology, which will be made available for a large number of industrial robots worldwide by Denso Wave, a market leader in industrial robotics that is part of the Toyota Group.

OAKLAND/LOS ANGELES, Calif., Dec 2 – Andy Chanley, the afternoon drive host at Southern California’s public radio station 88.5 KCSN, has been a radio DJ for over 32 years. And now, thanks to artificial intelligence technology, his voice will live on simultaneously in many places.

“I may be a robot, but I still love to rock,” says the robot DJ named ANDY (derived from Artificial Neural Disk-JockeY) in Chanley’s voice, during a demonstration for Reuters in which the voice was hard to distinguish from that of a human DJ.

Our phones, speakers and rice cookers have been talking to us for years, but their voices have been robotic. Seattle-based AI startup WellSaid Labs says it has finessed the technology to create more than 50 real human voice avatars like ANDY so far, so that a producer only needs to type in text to create the narration.
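WellSaid's own service is not shown here, but the text-in, audio-out workflow the article describes can be sketched with an open-source engine; pyttsx3 uses the operating system's built-in voices rather than a trained neural voice avatar, so treat it only as a stand-in:

```python
import pyttsx3  # offline text-to-speech; a stand-in for a neural voice service

engine = pyttsx3.init()
script = "I may be a robot, but I still love to rock."
engine.save_to_file(script, "andy_demo.wav")  # render the typed text to an audio file
engine.runAndWait()                           # block until synthesis finishes
```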

The face is pretty impressive, and not CGI as far as I know. The big challenge in robotics remains building robotic hands that can match human hands.


Ameca, a new humanoid robot from a company in the UK, has taken the internet by storm with its ultra-realistic movements and expressions.

Bongard said they found that the xenobots, which were initially sphere-shaped and made from around 3,000 cells, could replicate. But it happened rarely and only in specific circumstances. The xenobots used “kinetic replication” — a process that is known to occur at the molecular level but has never been observed before at the scale of whole cells or organisms, Bongard said.


The US scientists who created the first living robots say the life forms, known as xenobots, can now reproduce — and in a way not seen in plants and animals.

Formed from the stem cells of the African clawed frog (Xenopus laevis), from which they take their name, xenobots are less than a millimeter (0.04 inches) wide. The tiny blobs were first unveiled in 2020 after experiments showed that they could move, work together in groups and self-heal.

Now the scientists that developed them at the University of Vermont, Tufts University and Harvard University’s Wyss Institute for Biologically Inspired Engineering said they have discovered an entirely new form of biological reproduction different from any animal or plant known to science.

https://youtube.com/watch?v=zJH6J7rKn9I

SpaceX launches, Tesla updates its terms and conditions, and Elon Musk sighs.


SpaceX launches NASA’s DART spacecraft with its Falcon 9 rocket, Tesla updates its terms and conditions for Full Self-Driving, and Elon Musk sighs about it.

Circa 2018 #artificialintelligence #doctor


Abstract: Online symptom checkers have significant potential to improve patient care; however, their reliability and accuracy remain variable. We hypothesised that an artificial intelligence (AI) powered triage and diagnostic system would compare favourably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of an AI powered triage and diagnostic system. Identical cases were evaluated by both an AI system and human doctors. Differential diagnoses and triage outcomes were evaluated by an independent judge, who was blinded to the source (AI system or human doctor) of the outcomes. Independently of these cases, vignettes from publicly available resources were also assessed to provide a benchmark against previous studies and the diagnostic component of the MRCGP exam. Overall, we found that the Babylon AI powered Triage and Diagnostic System was able to identify the condition modelled by a clinical vignette with accuracy comparable to human doctors (in terms of precision and recall). In addition, we found that the triage advice recommended by the AI system was, on average, safer than that of human doctors, when compared to the ranges of acceptable triage provided by independent expert judges, with only a minimal reduction in appropriateness.
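As a minimal sketch of the precision-and-recall comparison the abstract refers to (the vignette, the differential, and the single-condition ground truth below are illustrative assumptions, not Babylon's data), each differential diagnosis can be scored against the conditions judged relevant for that vignette:

```python
def precision_recall(predicted, relevant):
    """Precision and recall of a predicted differential diagnosis
    against the set of conditions judged relevant for one vignette."""
    predicted, relevant = set(predicted), set(relevant)
    true_pos = len(predicted & relevant)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(relevant) if relevant else 0.0
    return precision, recall

# Illustrative vignette: the modelled condition is migraine.
ai_differential = ["migraine", "tension headache", "sinusitis"]
print(precision_recall(ai_differential, ["migraine"]))  # -> (0.333..., 1.0)
```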


Bridging Technology And Medicine For The Modern Healthcare Ecosystem — Dr. Mona G. Flores, MD, Global Head of Medical AI, NVIDIA.


Dr. Mona Flores, M.D., is the Global Head of Medical AI at NVIDIA (https://blogs.nvidia.com/blog/author/monaflores/), the American multinational technology company, where she oversees the company’s AI initiatives in medicine and healthcare to bridge the chasm between technology and medicine.

Dr. Flores first joined NVIDIA in 2018 with a focus on developing their healthcare ecosystem. Before joining NVIDIA, she served as the chief medical officer of digital health company Human-Resolution Technologies after a 25+ year career in medicine and cardiothoracic surgery.

Dr. Flores received her medical degree from Oregon Health and Science University, followed by a general surgery residency at the University of California at San Diego, a Postdoctoral Fellowship at Stanford, and a cardiothoracic surgery residency and fellowship at Columbia University in New York.

Dr. Flores also holds a Master’s degree in Biology from San Jose State and an MBA from the University at Albany School of Business. She initially worked in investment banking for a few years before pursuing her passion for medicine and technology.