Folks, if you are in the NYC area, we will be at the Long Island Retro Gaming EXPO 2017 this weekend. We believe it is important to reach out to other technology-forward communities, such as gamers, which is why we will be there. We talked about this at the DNA conference in Holland last year, in the video here:
DeepMind’s scientific mission is to push the boundaries of AI by developing systems that can learn to solve complex problems. To do this, we design agents and test their ability in a wide range of environments from the purpose-built DeepMind Lab to established games, such as Atari and Go.
Testing our agents in games that are not specifically designed for AI research, and where humans play well, is crucial to benchmark agent performance. That is why we, along with our partner Blizzard Entertainment, are excited to announce the release of SC2LE, a set of tools that we hope will accelerate AI research in the real-time strategy game StarCraft II. The SC2LE release includes:
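As a hedged sketch of how an agent can be hooked up to the game through PySC2, the Python component of this release, the snippet below runs the bundled random agent on a mini-game map; the map name, step multiplier, and exact SC2Env constructor arguments are assumptions that differ between PySC2 versions.

```python
# Minimal sketch: drive PySC2's built-in random agent on a mini-game map.
# Assumes the 2017-era PySC2 API; constructor arguments changed in later releases.
import sys

from absl import flags
from pysc2.agents import random_agent
from pysc2.env import run_loop, sc2_env


def main():
    # PySC2 defines absl flags internally; parse them before creating the environment.
    flags.FLAGS(sys.argv)

    # "MoveToBeacon" is one of the mini-game maps; step_mul controls how many
    # game steps pass between agent actions.
    with sc2_env.SC2Env(map_name="MoveToBeacon", step_mul=8, visualize=False) as env:
        agent = random_agent.RandomAgent()
        # Run the standard agent/environment loop for a bounded number of frames.
        run_loop.run_loop([agent], env, max_frames=1000)


if __name__ == "__main__":
    main()
```

A similar loop can also be started from the command line with python -m pysc2.bin.agent --map MoveToBeacon.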
Human beings have always wanted to improve themselves; it's an intrinsic human drive. We've reached a point where technology allows us to do just that, and in the very near future we're going to see dramatic changes in what it means to be a human being. So let's take a look at the likely advancements we may all soon be upgrading to.
Exo-skeletons:
1984 was the year that introduced The Terminator to the world as a cold, ruthless killing machine, though only part machine. The cybernetic organism was described in the movie as "living tissue over a metal endoskeleton." It was a fictional concept back then, but in the 2020s it might not be fiction but reality.
We’re not going to stop taking pictures and recording movies, and we need to develop new ways to save them.
- By Luis Ceze, Karin Strauss, The Conversation US on July 29, 2017
As moviemaking becomes as much a science as an art, the moviemakers need ever-better ways to gauge audience reactions. Did they enjoy it? How much… exactly? At minute 42? A system from Caltech and Disney Research uses a facial expression tracking neural network to learn and predict how members of the audience react, perhaps setting the stage for a new generation of Nielsen ratings.
The research project, just presented at IEEE’s Computer Vision and Pattern Recognition conference in Hawaii, demonstrates a new method by which facial expressions in a theater can be reliably and relatively simply tracked in real time.
It uses what’s called a factorized variational autoencoder — the math of it I am not even going to try to explain, but it’s better than existing methods at capturing the essence of complex things like faces in motion.
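For intuition, here is a minimal sketch of a plain (non-factorized) variational autoencoder in PyTorch. It is not the model from the Caltech/Disney paper, and the input size, layer widths, and latent dimension are arbitrary assumptions.

```python
# Minimal sketch of a plain variational autoencoder (VAE) in PyTorch.
# Not the factorized variant from the Disney/Caltech work; sizes are arbitrary.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VAE(nn.Module):
    def __init__(self, input_dim=1024, hidden_dim=256, latent_dim=16):
        super().__init__()
        # Encoder maps an input (e.g. a flattened face crop) to the parameters
        # of a Gaussian over a small latent code.
        self.enc = nn.Linear(input_dim, hidden_dim)
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.logvar = nn.Linear(hidden_dim, latent_dim)
        # Decoder reconstructs the input from a sampled latent code.
        self.dec1 = nn.Linear(latent_dim, hidden_dim)
        self.dec2 = nn.Linear(hidden_dim, input_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps so gradients flow through mu and logvar.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar


def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to a standard normal prior.
    recon_loss = F.binary_cross_entropy(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl
```

The compression is the point: the network must squeeze each input into a small latent code and then reconstruct it, so the code ends up capturing the essential structure of the input, which is what makes this family of models good at summarizing things like faces in motion.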
Earlier Work
Wernquist's earlier short film "Wanderers" explored many of the same themes about humanity's drive to explore and experience, built around beautiful images of space and the cosmos and narration drawn from an excerpt of Carl Sagan's "Pale Blue Dot."
Wernquist described the film as: