
The team has set an internal deadline of 2025.

In a move that could pit it against electric vehicle market leader Tesla, Apple has begun working aggressively on a fully autonomous electric car, Bloomberg reported. Developing a car has been on Apple’s agenda since 2014, but recent moves within the company signal a push towards making an Apple car a reality.

Given Apple’s history of taking everyday products and, through excellent design, turning them into must-have versions, the move is hardly a surprise. With Steve Jobs at the helm, Apple launched the iPod when music players were already ubiquitous. The company then revealed the iPhone while Nokia was still selling resistive touchscreens as its premium product. More recently, the Apple Watch has become the “it” wearable despite the many other smartwatch options on the market. At a time when electric vehicles are surging, it seems only natural that an electric car is Apple’s next target.

OrCam’s reading device, ElectReon’s ‘smart road’ tech, a sensor for farming and security drones all make the list.


1. OrCam Read, a smart reading support device developed by OrCam Technologies, the maker of artificial intelligence-based wearable devices that help the blind and visually impaired read text via audio feedback. The company launched OrCam Read in 2020, a handheld digital reader meant to help people with language processing challenges, including dyslexia. The device (priced at $1,990) captures and reads out full pages of text and digital screens, and follows voice commands.
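OrCam has not published its software stack, but the basic capture-and-read-aloud loop the device performs can be sketched with off-the-shelf tools. The snippet below is only an illustration using open-source stand-ins (OpenCV for image capture, Tesseract for OCR, pyttsx3 for speech), not OrCam’s actual pipeline.

```python
# Illustrative capture-then-read-aloud loop.
# NOTE: this is NOT OrCam's implementation; it uses open-source
# stand-ins (OpenCV, Tesseract OCR, pyttsx3) to sketch the idea.
import cv2                  # camera capture
import pytesseract          # wrapper around the Tesseract OCR engine
import pyttsx3              # offline text-to-speech

def read_page_aloud(camera_index: int = 0) -> str:
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()          # grab one frame of the printed page
    cap.release()
    if not ok:
        raise RuntimeError("could not capture image from camera")

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # OCR works best on grayscale
    text = pytesseract.image_to_string(gray)         # extract the page text

    engine = pyttsx3.init()
    engine.say(text)                # speak the recognized text aloud
    engine.runAndWait()
    return text

if __name__ == "__main__":
    print(read_page_aloud())
```

A production device layers much more on top (page detection, layout analysis, voice-command handling), but the core idea is the same: turn a camera frame into text, then turn that text into audio.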

Wireless sensing devices, tools that allow users to sense movements and remotely monitor activities or changes in specific environments, have many applications. For instance, they could be used for surveillance purposes as well as to track the sleep or physical activities of medical patients and athletes. Some videogame developers have also used wireless sensing systems to create more engaging sports or dance-related games.

Researchers at Florida State University, Trinity University and Rutgers University have recently developed Winect, a new wireless sensing system that can track the poses of humans in 3D as they perform a wide range of free-form physical activities. The system was introduced in a paper pre-published on arXiv and is set to be presented at the ACM Conference on Interactive, Mobile, Wearable and Ubiquitous Technologies (UbiComp) 2021, one of the most renowned computer science events worldwide.

“Our research group has been conducting cutting-edge research in wireless sensing,” Jie Yang, one of the researchers who carried out the study, told TechXplore. “In the past, we have proposed several systems to use Wi-Fi signals to sense various human activities and objects, ranging from large-scale human activities to small-scale finger movements, sleep monitoring and daily objects. For example, we proposed two systems, dubbed E-eyes and WiFinger, which are among the first work to utilize Wi-Fi sensing to distinguish various types of daily activity and finger gestures.”
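The paper details Winect’s actual signal-processing pipeline; purely as a hedged illustration of the general idea behind Wi-Fi pose sensing (feeding windows of channel state information, or CSI, into a neural network that regresses 3D joint positions), here is a minimal PyTorch sketch. The channel counts, window length and architecture are illustrative assumptions, not the authors’ design.

```python
# Hedged sketch: map a window of Wi-Fi channel state information (CSI)
# to 3D joint coordinates with a small neural network.  Shapes and the
# architecture are illustrative assumptions, not Winect's actual design.
import torch
import torch.nn as nn

N_SUBCARRIERS = 30      # CSI subcarriers per antenna pair (assumed)
N_ANTENNA_PAIRS = 9     # e.g. 3 TX x 3 RX antennas (assumed)
WINDOW = 100            # CSI samples per time window (assumed)
N_JOINTS = 14           # body joints to estimate (assumed)

class CSIPoseRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(N_SUBCARRIERS * N_ANTENNA_PAIRS, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # pool over the time window
        )
        self.head = nn.Linear(128, N_JOINTS * 3)   # (x, y, z) per joint

    def forward(self, csi: torch.Tensor) -> torch.Tensor:
        # csi: (batch, subcarriers * antenna_pairs, window)
        features = self.encoder(csi).squeeze(-1)
        return self.head(features).view(-1, N_JOINTS, 3)

model = CSIPoseRegressor()
dummy_csi = torch.randn(8, N_SUBCARRIERS * N_ANTENNA_PAIRS, WINDOW)
poses = model(dummy_csi)        # -> (8, 14, 3) estimated joint positions
print(poses.shape)
```

Real systems also have to isolate the reflections produced by the moving body from everything else in the room, which is where most of the engineering effort goes.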

Penn State researchers developed a prototype of a wearable, noninvasive glucose sensor, shown here on the arm. Credit: Jia Zhu, Penn State.

Penn State researchers develop first-of-its-kind wearable, noninvasive glucose monitoring device prototype.

Noninvasive glucose monitoring devices are not currently commercially available in the United States, so people with diabetes must collect blood samples or use sensors embedded under the skin to measure their blood sugar levels. Now, with a new wearable device created by Penn State researchers, less intrusive glucose monitoring could become the norm.

The University of Bristol is part of an international consortium of 13 universities that, in partnership with Facebook AI, has collaborated to advance egocentric perception. As a result of this initiative, we have built the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.


Progress in the fields of artificial intelligence (AI) and augmented reality (AR) requires learning from the same data humans process to perceive the world. Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities—from the mundane act of opening a door to the exciting interaction of a game of football with friends.

Egocentric 4D Live Perception (Ego4D) is a massive-scale dataset that compiles 3,025 hours of footage from the wearable cameras of 855 participants in nine countries: UK, India, Japan, Singapore, KSA, Colombia, Rwanda, Italy, and the US. The data captures a wide range of activities from the ‘egocentric’ perspective—that is, from the viewpoint of the person carrying out the activity. The University of Bristol is the only UK representative in this diverse and international effort, collecting 270 hours from 82 participants who captured footage of their chosen activities of daily living—such as practicing a musical instrument, gardening, grooming their pet, or assembling furniture.

“In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike—they could even remind you where you left your keys,” said Principal Investigator at the University of Bristol and Professor of Computer Vision, Dima Damen.

A Japanese startup at CES is claiming to have solved one of the biggest problems in medical technology: noninvasive continuous glucose monitoring. Quantum Operation Inc, exhibiting at the virtual show, says that its prototype wearable can accurately measure blood sugar from the wrist. Looking like a knock-off Apple Watch, the prototype crams in a small spectrometer, which is used to scan the blood and measure glucose. Quantum’s pitch adds that the watch is also capable of reading other vital signs, including heart rate and ECG.

The company says that its secret sauce is in its patented spectroscopy materials which are built into the watch and its band. To use it, the wearer simply needs to slide the watch on and activate the monitoring from the menu, and after around 20 seconds, the data is displayed. Quantum says that it expects to sell its hardware to insurers and healthcare providers, as well as building a big data platform to collect and examine the vast trove of information generated by patients wearing the device.

Quantum Operation supplied a sampling of its data compared with readings from a commercial monitor, the FreeStyle Libre. And, at this point, there does seem to be a noticeable amount of variation between the wearable and the Libre. That, for now, may be a deal breaker for those who rely upon accurate blood glucose readings to determine their insulin dosage.
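The report doesn’t quantify that variation, but the standard headline metric for comparing a glucose monitor against a reference device is the mean absolute relative difference (MARD) across paired readings. The sketch below shows how it is computed; the numbers are invented for illustration, not Quantum Operation’s data.

```python
# Mean absolute relative difference (MARD): the standard headline metric
# for comparing a glucose monitor against a reference device.
# The readings below are made-up illustration values, not Quantum's data.

def mard(test_readings, reference_readings):
    """Average of |test - reference| / reference, expressed as a percentage."""
    pairs = list(zip(test_readings, reference_readings))
    return 100 * sum(abs(t - r) / r for t, r in pairs) / len(pairs)

wearable = [112, 98, 145, 160, 130]    # hypothetical wrist readings (mg/dL)
libre    = [105, 101, 132, 171, 122]   # hypothetical FreeStyle Libre readings (mg/dL)

print(f"MARD: {mard(wearable, libre):.1f}%")   # lower is better
```

Commercially available continuous glucose monitors typically report MARD values of roughly 9–11 percent, which gives a sense of the bar a wrist-worn optical device would need to clear.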

Researchers from Georgia Tech’s Center for Human-Centric Interfaces and Engineering have created soft scalp electronics (SSE), a wearable wireless electroencephalography (EEG) device for reading human brain signals. By processing the EEG data with a neural network, the system allows users wearing the device to control a video game simply by imagining activity.
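The Georgia Tech team describes its own classifier in its paper; purely as a hedged illustration of the general approach (sliding windows of multi-channel EEG fed to a small convolutional network that outputs an imagined-movement command), here is a minimal PyTorch sketch. The electrode count, window length and architecture are assumptions for illustration, not the SSE system.

```python
# Hedged sketch: classify a window of multi-channel EEG into one of a few
# imagined-movement commands (motor imagery) for game control.  The channel
# count, window length and network are illustrative assumptions only.
import torch
import torch.nn as nn

N_CHANNELS = 8       # EEG electrodes (assumed)
WINDOW = 250         # samples per decision window, e.g. 1 s at 250 Hz (assumed)
N_COMMANDS = 4       # e.g. left, right, forward, rest (assumed)

class MotorImageryNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),                    # downsample in time
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, N_COMMANDS),          # logits over game commands
        )

    def forward(self, eeg: torch.Tensor) -> torch.Tensor:
        # eeg: (batch, channels, window) of band-pass filtered signals
        return self.net(eeg)

model = MotorImageryNet()
window = torch.randn(1, N_CHANNELS, WINDOW)
command = model(window).argmax(dim=1)   # index of the predicted command
print(command.item())
```

In practice the network is trained on labeled recordings of the user imagining each movement, and the predicted command index is then mapped to a game input.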

GraphWear, a company pursuing needle-free approaches to glucose monitoring, has closed a $20.5 million Series B round. This Series B round is a vote of confidence by investors in GraphWear’s approach: to monitor key metrics in the body, like glucose, without breaking the skin at all.

GraphWear Technologies was founded in 2015 by Rajatesh Gudibande and Saurabh Radhakrishnan, who had both completed master’s degrees in nanotechnology at the University of Pennsylvania. Specifically, GraphWear is developing a skin-surface-level wearable made of graphene (more on this material later). The sensor is small, about the size of an Apple Watch — but the key piece of technology is actually housed on the bottom. It’s a thin slice of graphene that fits onto the back of the watch, or onto a sticker that can be worn on the abdomen.

This Series B round, says Gudibande, will be focused on helping the company build upon previous validation studies of the wearable, completing a pivotal trial and submitting for FDA clearance. The round was led by Mayfield, with participation from MissionBio Capital, Builders VC and VSC Ventures.

But first, Facebook is going to have to navigate the territory of privacy — not just for those who might have photos taken of them, but for the wearers of these microphone- and camera-equipped glasses. VR headsets are one thing (and they come off your face after a session). Glasses you wear around every day are the start of Facebook’s much larger ambition to be an always-connected maker of wearables, and that’s a lot harder for most people to get comfortable with.

Walking down my quiet suburban street, I’m looking up at the sky. Recording the sky. Around my ears, I hear ABBA’s new song, I Still Have Faith In You. It’s a melancholic end to the summer. I’m taking my new Ray-Ban smart glasses for a walk.

The Ray-Ban Stories feel like a conservative start. They lack some features that have been in similar products already. The glasses, which act as earbud-free headphones, don’t have 3D spatial audio like the Bose Frames and Apple’s AirPods Pro do. The stereo cameras, on either side of the lenses, don’t work with AR effects, either. Facebook has a few sort-of-AR tricks in a brand-new companion app called View that pairs with these glasses on your phone, but they’re mostly ways of using depth data for a few quick social effects.

Interesting.


Everybody knows sleep is important, but there’s still a lot we don’t understand about what it actually does to the brain – and how its benefits could be boosted. To investigate, the US Army has awarded researchers at Rice University and other institutions a grant to develop a portable skullcap that can monitor and adjust the flow of fluid through the brain during sleep.

Most of us are familiar with the brain fog that comes with not getting enough sleep, but the exact processes going on in there remain mysterious. In 2012 scientists made a huge breakthrough in the field by discovering the glymphatic system, which cleans out toxic waste products from the brain during deep sleep by flushing it with cerebrospinal fluid. Disruptions to sleep – and therefore the glymphatic system – have been increasingly associated with neurological disorders such as Alzheimer’s.

Studying the glymphatic system could provide new insights into sleep disorders and how to treat them, but currently it requires big, bulky MRI machines. So the US Army is funding researchers at Rice University, Houston Methodist and Baylor College of Medicine to develop a wearable skullcap instead.