
An animal scientist with Wageningen University & Research in the Netherlands has created an artificial-intelligence-based application that can gauge the emotional state of farm animals based on photographs taken with a smartphone. In his paper uploaded to the bioRxiv preprint server, Suresh Neethirajan describes his app and how well it worked when tested.

Prior research and anecdotal evidence have shown that farm animals are more productive when they are not living under stressful conditions. This has led to changes in slaughterhouse practices, such as shielding cows’ eyes from the spike that is used to kill them prior to slaughter, to prevent stress hormones from entering the meat. More recent research has suggested that it may not be enough to shield animals from stressful situations; adapting their environment to promote peacefulness or even playfulness can produce desired results as well. Happy cows or goats, for example, are likely to produce more milk than those that are bored. But as Neethirajan notes, assessments of an animal’s emotional state can be quite subjective, leading to incorrect conclusions. To address this problem, he adapted human face recognition software for use in detecting emotions in cows and pigs.

The system is called WUR Wolf and is based on several pieces of technology: the YOLO object detection system, YOLOv4, which works with a convolutional neural network, and Faster R-CNN, which also detects objects but does so with different feature sets. For training, he used an Nvidia GeForce GTX 1080 Ti GPU on a CUDA 9.0 machine. The data consisted of thousands of images of cows and pigs taken with a smartphone at six farms located in several countries, with associated classification labels indicating which facial features could be associated with which mood; raised ears on a cow, for example, generally indicate that the animal is excited.
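The labeling step described above, pairing observed facial features with moods, can be sketched as a simple lookup. Note that every feature/mood pair below except raised ears meaning an excited cow is an invented placeholder for illustration, not the actual label set from Neethirajan's paper.

```python
# Illustrative sketch of mapping observed facial features to mood labels.
# Only the ("cow", "ears_raised") -> "excited" pairing comes from the article;
# the other entries are hypothetical examples.
FEATURE_TO_MOOD = {
    ("cow", "ears_raised"): "excited",
    ("cow", "ears_back"): "calm",        # assumed
    ("pig", "ears_forward"): "alert",    # assumed
    ("pig", "tail_curled"): "content",   # assumed
}

def label_image(species: str, features: list[str]) -> list[str]:
    """Return the mood labels associated with the observed facial features."""
    return [FEATURE_TO_MOOD[(species, f)]
            for f in features
            if (species, f) in FEATURE_TO_MOOD]

print(label_image("cow", ["ears_raised"]))  # ['excited']
```

In the actual system this mapping is learned by the detection networks from the labeled training images rather than hand-coded.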

This article is part of our new series, Currents, which examines how rapid advances in technology are transforming our lives.

Imagine operating a computer by moving your hands in the air as Tony Stark does in “Iron Man.” Or using a smartphone to magnify an object as does the device that Harrison Ford’s character uses in “Blade Runner.” Or a next-generation video meeting where augmented reality glasses make it possible to view 3D avatars. Or a generation of autonomous vehicles capable of driving safely in city traffic.

These advances and a host of others on the horizon could happen because of metamaterials, which make it possible to control beams of light with the same ease that computer chips control electricity.

Li-ion batteries and other emerging lithium-based battery technologies are currently used to power a wide range of devices, including smartphones, laptops, tablets and cameras. Despite their advantages, batteries containing lithium do not always retain their performance over time.

One of the main reasons for the performance decay observed in some Li-based batteries is that the lithium contained within them sometimes becomes inactive or “dead.” This “dead lithium” can cause capacity decay and thermal runaway, which can ultimately reduce a battery’s lifespan and impair its performance.

Researchers at Zhejiang University of Technology in China and Argonne National Laboratory in the U.S. have recently devised a strategy to restore inactive lithium in Li anodes. This strategy, outlined in a paper published in Nature Energy, is based on a chemical reaction known as iodine redox.

If you can’t beat ‘em, join ‘em.


Arm is the technology company of the hour. Or one of them, at least. The chip designer rose to great heights in the mobile phone biz, and now its many license holders are looking to twist an Arm processor into something more computer-shaped. Arm is finding an increasing number of advocates among Intel and AMD’s firmest customers too: perhaps the most notable among them is Apple, with the M1 chip in MacBooks and the new iMac, but Amazon, Microsoft, and Arm’s prospective buyer, Nvidia, all have skin in the game.

Yet Intel has a plan: a brand-new foundry business, one that will offer flexibility in a way that was largely ruled out by oppressive x86 licenses and Intel’s unwillingness to share in the past. It’s what Arm offers, after all: a way for companies to design a chip as they see fit and leave the unwanted features on the cutting room floor.

Imagine a foldable smartphone or a rollable tablet device that is powerful, reliable and, perhaps most importantly, affordable.

New research directed by Wake Forest University scientists and published today in the journal Nature Communications has led to a method for both pinpointing and eliminating the sources of instability in the materials and devices used to create such applications.

“In this work, we introduced a strategy that provides a reliable tool for identifying with high accuracy the environmental and operational device degradation pathways and subsequently eliminating the main sources of instabilities to achieve stable devices,” said lead author Hamna Iqbal, who worked closely with Professor of Physics Oana Jurchescu on the research.

Cybersecurity specialists at Human Security have revealed the discovery of a massive botnet made up of compromised Android devices. The malicious operation, identified as Pareto, reportedly aims to conduct advertising fraud against paid connected television (CTV) services and is so far made up of about one million infected devices.

As you will recall, the term botnet refers to a network of computer systems compromised by a specific malware variant, running autonomously and automatically under the remote control of the attack operators.

Experts say the hackers have used dozens of mobile apps to spoof more than 6,000 CTV apps, accounting for around 650 million ad requests per day. The botnet was first identified in 2020, and since then companies such as Google and Roku have tried to curb its progress, although its operators have nonetheless managed to grow the network inordinately.

Cystic fibrosis is diagnosed in infants by use of sweat testing as elevated chloride concentrations in sweat are indicative of cystic fibrosis. The current approach can have poor sensitivity and require repeated testing. Toward the goal of developing a noninvasive, simple test for cystic fibrosis, Ray et al. devised an adhesive microfluidic device, or “sweat sticker,” to capture and analyze sweat in real time with colorimetric readout. Benchtop testing and validation in patients with cystic fibrosis showed that smartphone imaging of sweat stickers adhered to the skin could monitor sweat chloride concentrations. Results support further testing of the sweat stickers in larger studies.
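The colorimetric readout described above amounts to converting a measured color intensity into a chloride concentration through a calibration curve. Here is a minimal sketch of that idea; the calibration points and the linear relationship are invented for illustration, not taken from Ray et al.'s assay.

```python
# Minimal sketch of a colorimetric readout: map a measured color-channel
# intensity to a chloride concentration via a linear calibration curve.
# The calibration data below are hypothetical; a real device would be
# calibrated against reference chloride standards.
def fit_linear(points):
    """Least-squares fit of y = a*x + b through (intensity, concentration) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical calibration: (normalized color intensity, chloride in mmol/L)
calibration = [(0.10, 10.0), (0.30, 30.0), (0.50, 50.0), (0.70, 70.0)]
a, b = fit_linear(calibration)

def chloride_mmol_per_l(intensity: float) -> float:
    """Convert a smartphone-imaged color intensity to a chloride estimate."""
    return a * intensity + b

# Sweat chloride of 60 mmol/L or more is the conventional diagnostic
# threshold for cystic fibrosis.
print(round(chloride_mmol_per_l(0.65), 1))  # 65.0
```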

The concentration of chloride in sweat remains the most robust biomarker for confirmatory diagnosis of cystic fibrosis (CF), a common life-shortening genetic disorder.

The next version of Android remains focused on developers until the first beta launches next month. With that in mind, we’re diving into today’s release of Android 12 DP3 to find all the new features.

Over the coming hours, we’ll dive into all of Android 12 DP3’s new features and every single change. (The newest updates will be at the top of this list. Be sure to check back often and tell us what you find in the comments below.)

Google is planning eight releases over the coming months before the consumer launch later this year on Pixel phones and other devices. If you want to quickly install Android 12 DP1 on your compatible Pixel 3, Pixel 3 XL, Pixel 3a, Pixel 3a XL, Pixel 4, Pixel 4 XL, Pixel 4a, Pixel 4a 5G, or Pixel 5, be sure to check out our step-by-step guide.

Massive solar storms in space can be picked up by iOS and Android smartphones, meaning billions of people have a personal geomagnetic storm detector — but the signals threaten to interfere with future location-based applications.

Hoping to get the public more involved in science, study author Sten F. Odenwald, an astronomer at the NASA Goddard Spaceflight Center, published a paper on the topic April 2 in Space Weather. It indicates that even through the unavoidable interference caused by other smartphone components, the phone’s built-in magnetometers can detect geomagnetic storms.

“Smartphones — at least theoretically — should be able to detect some of the strongest storms, pretty easily in fact,” Odenwald told The Academic Times. “Especially if you happen to live up in the northern latitudes — in Minnesota or in Canada, or places like that where it really rocks and rolls.”
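The detection idea here is straightforward: a phone's 3-axis magnetometer reports the local field in microtesla, and a geomagnetic storm shows up as a deviation of the total field magnitude from the quiet-time baseline. The sketch below illustrates that logic; the 50 µT baseline and 1 µT storm threshold are illustrative assumptions, not values from Odenwald's paper.

```python
import math

# Sketch of geomagnetic-storm detection from a phone's 3-axis magnetometer.
# Baseline and threshold values are illustrative assumptions; real detection
# must also reject interference from the phone's own components.
def field_magnitude(bx: float, by: float, bz: float) -> float:
    """Total magnetic field strength (microtesla) from the three axes."""
    return math.sqrt(bx * bx + by * by + bz * bz)

def storm_detected(sample_ut, baseline_ut=50.0, threshold_ut=1.0) -> bool:
    """Flag a reading whose magnitude deviates from the quiet-time baseline."""
    return abs(field_magnitude(*sample_ut) - baseline_ut) > threshold_ut

print(storm_detected((30.0, 0.0, 40.0)))  # False: magnitude is 50.0, at baseline
print(storm_detected((30.0, 0.0, 44.0)))  # True: magnitude ~53.3, a clear deviation
```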

This year-old ZDNet article notes that the company plans a photosensitivity range from ultraviolet through visible light to 2,000 nm infrared. The sensor itself captures almost 4x the light of ordinary CMOS sensors while being 2,000x more sensitive to light. This would put it on par with the best analogue image-intensification tubes used for night vision. Until now, no digital night vision system has been able to match analogue systems; this one would be better, with higher resolution and multichromatic output. It also has a 100x greater dynamic range than ordinary CMOS sensors, according to the specifications on SeeDevice’s site linked below. (This means it can image both bright and dark areas clearly and simultaneously, instead of having the bright areas wash out the image or the dark areas go black. The included photo, from its website, demonstrates a wide-dynamic-range image produced by the system. In a normal photo, either the sky would appear black, or the road would be so bright that it would look washed out.)
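To put the "100x greater dynamic range" claim in the units sensor spec sheets usually quote, and assuming the 100x figure is a linear signal ratio, the conversion to decibels and photographic stops is simple arithmetic:

```python
import math

# Convert a 100x linear dynamic-range improvement into decibels and stops,
# assuming the quoted figure is a linear signal ratio.
ratio = 100.0
extra_db = 20.0 * math.log10(ratio)  # amplitude ratio expressed in dB
extra_stops = math.log2(ratio)       # each stop is one doubling of light range

print(round(extra_db, 1))     # 40.0 dB
print(round(extra_stops, 2))  # 6.64 stops
```

So the claim corresponds to roughly 40 dB, or about six and two-thirds additional stops of usable range over an ordinary CMOS sensor.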

Hopefully coming soon to a cell phone camera near you…

SeeDevice’s site: https://www.seedeviceinc.com/technology