Symbiosis between humans and artificial intelligence is the corporate goal of Elon Musk's #Neuralink, and I agree it is a strong strategy for the future, and part of why he is now the richest person in the world. 🌎
Some smart robots can perform complex tasks on their own, without the programmers understanding how they learned them.
The maker of a defunct cloud photo storage app that pivoted to selling facial recognition services has been ordered to delete user data and any algorithms trained on it, under the terms of an FTC settlement.
The regulator investigated complaints that the Ever app (which gained earlier notoriety for using dark patterns to spam users' contacts) had applied facial recognition to users' photographs without properly informing them what it was doing with their selfies.
Under the proposed settlement, Ever must delete photos and videos of users who deactivated their accounts, and also delete all face embeddings (i.e., data derived from facial features that can be used for facial recognition) generated from photos of users who did not give express consent to such use.
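To make the notion of a "face embedding" concrete, the sketch below uses the open-source face_recognition Python library (purely as an illustration; there is no indication Ever used this tool) to derive embeddings from two photos and compare them. The file names are placeholders.

```python
# Illustrative sketch (not Ever's pipeline): deriving and comparing face embeddings
# with the open-source face_recognition library. File names are placeholders.
import face_recognition

# Load two photos and compute a 128-dimensional face embedding for each detected face
image_a = face_recognition.load_image_file("user_selfie.jpg")
image_b = face_recognition.load_image_file("gallery_photo.jpg")

embeddings_a = face_recognition.face_encodings(image_a)  # one embedding per detected face
embeddings_b = face_recognition.face_encodings(image_b)

if embeddings_a and embeddings_b:
    # Smaller distances mean the two faces are more likely the same person
    distance = face_recognition.face_distance([embeddings_a[0]], embeddings_b[0])[0]
    print(f"Embedding distance: {distance:.3f} (a typical match threshold is around 0.6)")
```

It is exactly this kind of derived numerical representation, rather than the photos themselves, that the settlement requires Ever to delete.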
Two researchers at Duke University have recently devised a useful approach for examining how essential particular variables are to the reliability and accuracy of predictive models. Their paper, published in Nature Machine Intelligence, could ultimately aid the development of more reliable and better-performing machine-learning algorithms for a variety of applications.
“Most people pick a predictive machine-learning technique and examine which variables are important or relevant to its predictions afterwards,” Jiayun Dong, one of the researchers who carried out the study, told TechXplore. “What if there were two models that had similar performance but used wildly different variables? If that was the case, an analyst could make a mistake and think that one variable is important, when in fact, there is a different, equally good model for which a totally different set of variables is important.”
Dong and his colleague Cynthia Rudin introduced a method that researchers can use to examine the importance of variables across a range of almost-optimal predictive models. This approach, which they refer to as "variable importance clouds," could be used to gain a better understanding of machine-learning models before selecting the most promising one for a given task.
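As a rough illustration of the idea (my own sketch, not the authors' code), one could train several models that reach similar accuracy, compute a variable-importance score for each, and look at how widely each variable's importance spreads across the set. Here scikit-learn's permutation importance stands in for the paper's importance measure, and random forests with different seeds stand in for a set of near-optimal models.

```python
# Minimal illustration of a "variable importance cloud" (not the authors' implementation):
# collect permutation importances across several similarly accurate models and compare spread.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

importance_cloud = []
for seed in range(10):  # several models with similar accuracy but different internals
    model = RandomForestClassifier(n_estimators=100, random_state=seed).fit(X_train, y_train)
    result = permutation_importance(model, X_test, y_test, n_repeats=5, random_state=seed)
    importance_cloud.append(result.importances_mean)

importance_cloud = np.array(importance_cloud)  # shape: (n_models, n_features)

# A variable whose importance varies widely across equally good models should be
# interpreted with caution -- exactly the failure mode described above.
for i in range(X.shape[1]):
    print(f"feature {i:2d}: importance {importance_cloud[:, i].mean():.4f} "
          f"+/- {importance_cloud[:, i].std():.4f}")
```

A variable that looks crucial to one model but is nearly ignored by an equally accurate one is precisely the case Dong warns about.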
The process of systems integration (SI) functionally links together infrastructure, computing systems, and applications. SI can allow for economies of scale, streamlined manufacturing, and better efficiency and innovation through combined research and development.
New to the systems integration toolbox are transformative technologies and, especially, a growing capability to integrate functions, driven by exponential advances in computing, data analytics, and materials science. These new capabilities are already having a significant impact on our future destinies.
The systems integration process has served us well and will continue to do so. But it needs augmenting. We are on the cusp of scientific discovery that often combines the physical with the digital: Techno-Fusion, the merging of technologies. Like Techno-Fusion in music, Techno-Fusion in technology is a trend that experiments with, and transcends, traditional ways of integration. Among many, there are five areas I consider good examples of the changing paradigm: Smart Cities and the Internet of Things (IoT); Artificial Intelligence (AI), Machine Learning (ML), Quantum and Super Computing, and Robotics; Augmented Reality (AR) and Virtual Reality (VR) Technologies; Health, Medicine, and Life Sciences Technologies; and Advanced Imaging Science.
Over the past decade or so, deep neural networks have achieved very promising results on a variety of tasks, including image recognition. Despite these advantages, the networks are complex and sophisticated, which makes interpreting what they have learned, and determining the processes behind their predictions, difficult or sometimes impossible. This lack of interpretability makes deep neural networks somewhat untrustworthy and unreliable.
Researchers from the Prediction Analysis Lab at Duke University, led by Professor Cynthia Rudin, have recently devised a technique that could improve the interpretability of deep neural networks. This approach, called concept whitening (CW), was first introduced in a paper published in Nature Machine Intelligence.
“Rather than conducting a post hoc analysis to see inside the hidden layers of NNs, we directly alter the NN to disentangle the latent space so that the axes are aligned with known concepts,” Zhi Chen, one of the researchers who carried out the study, told Tech Xplore. “Such disentanglement can provide us with a much clearer understanding of how the network gradually learns concepts over layers. It also focuses all the information about one concept (e.g., ‘lamp,’ ‘bed,’ or ‘person’) to go through only one neuron; this is what is meant by disentanglement.”
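A toy numpy sketch of the underlying mechanics, whitening a batch of latent activations and then rotating one axis onto a known concept direction, is given below. It is only an illustration under simplified assumptions; the actual CW module is a trainable layer inside the network, as described in the paper.

```python
# Toy sketch of the whitening-plus-rotation idea behind concept whitening (CW).
# This is an illustration, not the authors' module: CW is trained inside the network,
# while here we simply whiten a batch of latent activations and align one axis
# with a given "concept" direction.
import numpy as np

def zca_whiten(activations, eps=1e-5):
    """Decorrelate latent activations so each axis has unit variance (ZCA whitening)."""
    centered = activations - activations.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    whitening = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return centered @ whitening

def align_first_axis(whitened, concept_direction):
    """Rotate the whitened space so that the first axis points along a known concept."""
    dim = len(concept_direction)
    basis = np.column_stack([concept_direction, np.random.randn(dim, dim - 1)])
    q, _ = np.linalg.qr(basis)  # first column of q spans the concept direction
    return whitened @ q         # column 0 of the result now tracks the concept

# Fake latent activations (e.g., outputs of a hidden layer) and a fake concept direction
rng = np.random.default_rng(0)
latent = rng.normal(size=(256, 8))
concept = rng.normal(size=8)

aligned = align_first_axis(zca_whiten(latent), concept / np.linalg.norm(concept))
print(aligned.shape)  # (256, 8); axis 0 now corresponds to the "concept" axis
```

The point of the exercise is the same as in the quote above: after whitening and rotation, each concept's information is routed through its own axis, so an analyst can read the latent space directly rather than reverse-engineering it after training.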
A thermoelectric device is an energy conversion device that uses the voltage generated by the temperature difference between the two ends of a material; it can convert heat energy, such as waste heat from industrial sites, into electricity for everyday use. Existing thermoelectric devices are rigid because they are composed of hard metal-based electrodes and semiconductors, which prevents them from fully absorbing heat from uneven surfaces. Researchers have therefore been studying flexible thermoelectric devices that can generate energy while in close contact with heat sources such as human skin and hot-water pipes.
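For a rough sense of scale, the open-circuit voltage of a thermoelectric generator follows the Seebeck relation V = S × ΔT per leg; the snippet below plugs in illustrative numbers (not measurements from this work) for a bismuth-telluride-like material.

```python
# Back-of-the-envelope Seebeck estimate with illustrative values (not data from the study).
seebeck_coefficient = 200e-6   # V/K, typical order of magnitude for Bi2Te3-type materials
delta_t = 10.0                 # K, e.g., skin (~32 C) against room-temperature air (~22 C)
legs_in_series = 100           # thermoelectric legs wired electrically in series

open_circuit_voltage = legs_in_series * seebeck_coefficient * delta_t
print(f"Open-circuit voltage: {open_circuit_voltage * 1000:.1f} mV")  # -> 200.0 mV
```

The arithmetic makes clear why close thermal contact matters: every kelvin of temperature difference lost at the interface directly shrinks the output voltage.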
The Korea Institute of Science and Technology (KIST) announced that a collaborative research team led by Dr. Seungjun Chung from the Soft Hybrid Materials Research Center and Professor Yongtaek Hong from the Department of Electrical and Computer Engineering at Seoul National University (SNU, President OH Se-Jung) has developed flexible thermoelectric devices with high power-generation performance by maximizing flexibility and heat-transfer efficiency. The team also presented a plan for mass production via an automated process that includes printing.
The heat-transfer efficiency of the substrates currently used in research on flexible thermoelectric devices is low because of their very low thermal conductivity. Their heat-absorption efficiency also suffers from a lack of flexibility, which leaves a heat-shielding layer (e.g., air) between the device and the heat source. To address this, highly flexible organic-material-based thermoelectric devices have been under development, but applying them to wearables is difficult because their performance is significantly lower than that of existing inorganic-material-based rigid thermoelectric devices.