NASA is on a mission to return to the moon by 2024 and use it as a “backyard” for experimentation, according to Lucien Junkin, chief engineer of the space exploration vehicle at NASA.
The wave function represents the quantum state of an atom or molecule, including the positions and motions of its nucleus and electrons. For decades, researchers have struggled to determine the exact wave function of even an ordinary chemical system, in which the nuclear positions are fixed and the electrons move around them. Pinning down the wave function has proven difficult even with the help of the Schrödinger equation.
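For reference, the equation in question is the time-independent Schrödinger equation, written here in its standard textbook form (generic notation, not taken from the work discussed):

```latex
\hat{H}\,\psi(\mathbf{r}_1,\dots,\mathbf{r}_N) = E\,\psi(\mathbf{r}_1,\dots,\mathbf{r}_N)
```

where \hat{H} is the Hamiltonian of the system, \psi is the wave function over the N electron positions, and E is the energy of the state. The equation can be solved exactly only for the very simplest systems, which is why practical methods must approximate the wave function.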
Previous research in this field applied quantum Monte Carlo (QMC) methods with a Slater-Jastrow Ansatz, which takes a linear combination of Slater determinants and adds a multiplicative Jastrow factor to capture close-range correlations.
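In generic notation (the symbols here are illustrative, not drawn from the paper), a Slater-Jastrow wave function has the form:

```latex
\psi(\mathbf{r}) \;=\; e^{J(\mathbf{r})} \sum_k \omega_k \,\det\!\left[\phi_i^{k}(\mathbf{r}_j)\right]
```

where each determinant of single-electron orbitals \phi_i^{k} enforces the antisymmetry that fermions require, the weights \omega_k combine the determinants linearly, and the Jastrow factor e^{J(\mathbf{r})} multiplies in the short-range electron-electron correlations that the determinants alone miss.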
Now, a group of DeepMind researchers has brought QMC to a higher level with the Fermionic Neural Network, or Fermi Net, a neural network ansatz with more flexibility and higher accuracy. Fermi Net takes the electron information of a molecule or chemical system as input and outputs its estimated wave function, which can then be used to determine the energy states of the system.
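To make the underlying idea concrete, here is a minimal variational Monte Carlo sketch in Python. It is emphatically not Fermi Net: in place of a neural network it uses a one-parameter trial wave function for the 1D harmonic oscillator. But it shows the loop such methods plug into: sample configurations from the wave function, evaluate the local energy, and average.

```python
import numpy as np

# Minimal variational Monte Carlo (VMC) sketch, NOT DeepMind's Fermi Net.
# Trial wave function psi(x) = exp(-alpha * x^2) for the 1D harmonic
# oscillator (hbar = m = omega = 1); the exact ground-state energy 0.5
# is reached at alpha = 0.5.

def local_energy(x, alpha):
    # For psi(x) = exp(-alpha x^2):  E_L = -(1/2) psi''/psi + (1/2) x^2
    return alpha + x**2 * (0.5 - 2.0 * alpha**2)

def vmc_energy(alpha, n_steps=50_000, step_size=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    energies = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step_size, step_size)
        # Metropolis step: accept with probability |psi(x_new)/psi(x)|^2
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        energies.append(local_energy(x, alpha))
    return np.mean(energies)

for alpha in (0.3, 0.5, 0.8):
    print(f"alpha = {alpha:.1f}  ->  <E> ~ {vmc_energy(alpha):.4f}")
```

At alpha = 0.5 the trial function is the exact ground state and the estimated energy settles at 0.5. Fermi Net's contribution is to replace the hand-picked ansatz with a flexible network whose parameters are optimized to drive this energy estimate down.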
To evaluate the performance of robotics algorithms and controllers, researchers typically use software simulations or real physical robots. While these may appear to be two distinct evaluation strategies, a whole range of possibilities exists that combines elements of both.
In a recent study, researchers at Texas A&M University and the University of South Carolina set out to examine evaluation and execution scenarios that lie at the intersection of simulation and real implementation. Their investigation, outlined in a paper pre-published on arXiv, focuses on instances in which real robots perceive the world through their sensors, yet the environment they sense is, in effect, a staged illusion.
“We consider problems in which robots conspire to present a view of the world that differs from reality,” Dylan Shell and Jason O’Kane, the researchers who carried out the study, wrote in their paper. “The inquiry is motivated by the problem of validating robot behavior physically despite there being a discrepancy between the robots we have at hand and those we wish to study, or the environment for testing that is available versus that which is desired, or other potential mismatches in this vein.”
Researchers at Hefei University of Technology in China and several universities in Japan have recently developed an emotion-sensing system that can recognize people’s emotions based on their body gestures. They presented this new AI-powered system, called EmoSense, in a paper pre-published on arXiv.
“In our daily life, we can clearly realize that body gestures contain rich mood expressions for emotion recognition,” Yantong Wang, one of the researchers who carried out the study, told TechXplore. “Meanwhile, we can also find out that human body gestures affect wireless signals via shadowing and multi-path effects when we use antennas to detect behavior. Such signal effects usually form unique patterns or fingerprints in the temporal-frequency domain for different gestures.”
Wang and his colleagues observed that human body gestures can affect wireless signals, producing characteristic patterns that could be used for emotion recognition. This inspired them to develop a system that can identify these patterns, recognizing people’s emotions based on their physical movements.
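As a toy illustration of this idea (not the authors' pipeline; the signals, gesture labels, and frequencies below are all synthetic), the following Python sketch turns a 1-D channel trace into a time-frequency fingerprint and matches it against per-gesture centroids:

```python
import numpy as np

# Toy illustration of EmoSense-style sensing, NOT the authors' system:
# a body gesture modulates a wireless channel trace, leaving a
# time-frequency "fingerprint" that a simple classifier can separate.

def stft_fingerprint(trace, win=64, hop=32):
    """Flattened magnitude spectrogram of a 1-D channel trace."""
    frames = [trace[i:i + win] * np.hanning(win)
              for i in range(0, len(trace) - win, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1))
    return spec.ravel() / np.linalg.norm(spec)

def synthetic_trace(gesture_freq, n=1024, rng=None):
    """Fake channel trace: the gesture appears as a periodic modulation."""
    rng = rng or np.random.default_rng()
    t = np.arange(n)
    return np.sin(2 * np.pi * gesture_freq * t / n) + 0.3 * rng.standard_normal(n)

rng = np.random.default_rng(0)
gesture_freqs = {"arms_raised": 30.0, "slouch": 80.0, "wave": 160.0}

# "Training": average fingerprint per gesture (a nearest-centroid classifier).
centroids = {g: np.mean([stft_fingerprint(synthetic_trace(f, rng=rng))
                         for _ in range(20)], axis=0)
             for g, f in gesture_freqs.items()}

# "Testing": classify a fresh trace by similarity (dot product) to each centroid.
probe = stft_fingerprint(synthetic_trace(80.0, rng=rng))
print(max(centroids, key=lambda g: probe @ centroids[g]))  # expected: slouch
```

The real system works from antenna measurements rather than synthetic sine waves, but the pipeline shape is the same: gesture, channel distortion, time-frequency fingerprint, classifier.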
Recent surveys, studies, forecasts and other quantitative assessments of the impact and progress of AI highlighted the strong state of AI surveillance worldwide, the lack of adherence to common privacy principles in companies’ data privacy statements, the growing adoption of AI by global businesses, and the perception of AI as a major risk by institutional investors.
Far from being a mystical “ghost in the machine”, consciousness evolved as a practical mental tool, and we could engineer it in a robot using a few simple guidelines.
Downloading your brain may seem like science fiction, but some neuroscientists think it’s not only possible, but that we’ve already started down a path to one day make it a reality. So, how close are we to downloading a human brain?
We’ve Put a Worm’s Mind in a Lego Robot’s Body https://www.smithsonianmag.com/smart-news/weve-put-worms-mind-lego-robot-body-180953399/?no-is “A wheeled Lego robot may not look like a worm, but it ‘thinks’ like one after programmers gave it the neuron connections in a C. elegans roundworm”
Crumb of Mouse Brain Reconstructed in Full Detail https://www.nature.com/news/crumb-of-mouse-brain-reconstructed-in-full-detail-1.18105 “The resulting three-dimensional map is the first complete reconstruction of a piece of tissue in the mammalian neocortex, the most recently evolved region of the brain.”
The Immortalist: Uploading the Mind to a Computer https://www.bbc.com/news/magazine-35786771 “Within the next 30 years,” promises Dmitry Itskov, “I am going to make sure that we can all live forever.”
Self-driving vehicles will lead to a rise in car sex, according to a new study.
People will be more likely to eat, sleep and engage in on-the-road hanky-panky when robot cars become the new normal, according to research published in the most recent issue of the journal Annals of Tourism Research.
“People will be sleeping in their vehicles, which has implications for roadside hotels. And people may be eating in vehicles that function as restaurant pods,” Scott Cohen, who led the study, told Fast Company magazine.
Artificial intelligence is infiltrating every industry, allowing vehicles to navigate without drivers, assisting doctors with medical diagnoses, and giving financial institutions more nuanced ways to predict risk. But for all the authentic use cases, there’s a lot of hype too.