
Short Bytes: Artificial Intelligence holds a special place in the future of humanity. Many tech giants, including Facebook, have long been working on improving AI to make lives better. Facebook has decided to reveal its milestones in Artificial Intelligence research in the form of a progress report.

It doesn’t matter if you are scared of AI like Elon Musk or Stephen Hawking, or if, like Google’s chief of Artificial Intelligence, you think computers are remarkably dumb. Companies are still working through the byzantine process of training machines and building algorithms modeled on the human brain. Meanwhile, Facebook has just announced its progress report.

Facebook’s AI research team (FAIR) will present its report card at NIPS, an Artificial Intelligence conference, and reveal the team’s achievements with its state-of-the-art systems. Facebook has been working to improve image recognition and has created a system that speeds up the process by 30% while using 10 times less training data than previous benchmarks.

Read more

The local delivery market is worth approximately £150bn in the UK alone. This includes parcel and delivery companies (20%) and personal shopping trips (80%). Starship said that robot deliveries are potentially five to fifteen times cheaper than current “human-powered” delivery services.

“It does not take the whole delivery chain from an Amazon warehouse to your doorstep, it only takes the last few miles. But right now the last few miles are the most difficult part for the delivery vans. They need to find parking spaces and so forth, so our robot is taking care of that,” said Mr Heinla.

“For the large e-commerce companies it helps to reduce the costs. For the local businesses it opens up new possibilities, allowing people to order deliveries over the internet rather than coming to the store physically.”
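To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The £150bn total, the 20/80 split, and the five-to-fifteen-times claim come from the article above; the per-delivery cost is an assumed number used purely for illustration, not a Starship figure.

```python
# Illustrative arithmetic only; the per-delivery cost below is an assumption.
UK_LOCAL_DELIVERY_MARKET_GBP = 150e9   # ~£150bn total (from the article)
PARCEL_SHARE = 0.20                    # parcel and delivery companies
PERSONAL_SHARE = 0.80                  # personal shopping trips

parcel_market = UK_LOCAL_DELIVERY_MARKET_GBP * PARCEL_SHARE      # ~£30bn
personal_market = UK_LOCAL_DELIVERY_MARKET_GBP * PERSONAL_SHARE  # ~£120bn

# Starship's claim: robots are potentially 5-15x cheaper than human-powered delivery.
human_cost_per_delivery = 6.00  # assumed £ cost per delivery, for illustration only
robot_cost_low = human_cost_per_delivery / 15
robot_cost_high = human_cost_per_delivery / 5

print(f"Parcel segment:   £{parcel_market / 1e9:.0f}bn")
print(f"Personal segment: £{personal_market / 1e9:.0f}bn")
print(f"Robot cost per delivery: £{robot_cost_low:.2f}-£{robot_cost_high:.2f} "
      f"vs £{human_cost_per_delivery:.2f} human-powered")
```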

Read more

A team of researchers with Ulsan National Institute of Science and Technology and Dong-A University, both in South Korea, has developed an artificial skin that can detect both pressure and heat simultaneously, with a high degree of sensitivity. In their paper published in the journal Science Advances, the team describes how they created the skin, what they found in testing it, and the other types of things it can sense.

Many scientists around the world are working to develop artificial skin, both to benefit robots and to help human beings who have lost skin sensation or limbs. Such efforts have led to a wide variety of artificial skin types, but until now, none of them has been able to sense both pressure and heat to a high degree at the same time.

The new artificial skin is a sandwich of materials: at the top there is a layer meant to mimic the human fingerprint (it can sense texture), and beneath that sit sensors sandwiched between additional layers. The sensors are dome-shaped and compress to different degrees when the skin is exposed to different amounts of pressure. The compression also causes a small electrical charge to move through the skin, as do heat and sound, which are likewise transmitted to the sensors. The more pressure, heat or sound exerted, the more charge there is, and using a computer to measure the charge allows the degree of sensation “felt” to be measured. The ability to sense sound, the team notes, was a bit of a surprise; additional testing showed that the artificial skin was actually better at picking up sound than an iPhone microphone.
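To make the readout principle concrete (charge roughly proportional to the applied stimulus, measured by a computer), here is a minimal sketch. The linear sensor model, the calibration constants, and the function names are hypothetical illustrations, not the Korean team’s actual implementation.

```python
# Hypothetical readout sketch: map a measured charge from a dome-shaped sensor
# back to an estimated stimulus magnitude. All constants are assumed, for illustration.

PRESSURE_CHARGE_COEFF = 2.5e-11  # coulombs per pascal (assumed calibration)
TEMP_CHARGE_COEFF = 8.0e-10      # coulombs per kelvin of change (assumed calibration)

def estimate_pressure(charge_c: float) -> float:
    """Estimate applied pressure (Pa) from measured charge (C),
    assuming a linear charge-vs-pressure response."""
    return charge_c / PRESSURE_CHARGE_COEFF

def estimate_temperature_change(charge_c: float) -> float:
    """Estimate temperature change (K) from measured charge (C),
    assuming a linear charge-vs-heat response."""
    return charge_c / TEMP_CHARGE_COEFF

if __name__ == "__main__":
    reading = 5.0e-9  # example charge reading in coulombs
    print(f"~{estimate_pressure(reading):.0f} Pa if the signal were pure pressure")
    print(f"~{estimate_temperature_change(reading):.1f} K if the signal were pure heat")
```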

Read more

Post-Human is a scifi proof-of-concept short based on the bestselling series of novels by me, David Simpson. Amazingly, filmed over just three hours by a crew of three, the short depicts the opening of Post-Human, drawing back the curtain on the Post-Human world and letting viewers see the world and characters they’ve only been able to imagine previously. You’ll get a taste of a world where everyone is immortal, has an onboard mental “mind’s eye” computer, and, thanks to the magnetic targeted fusion implants every post-human has, can fly (and yep, there’s flying in this short!), while nanotechnology can make your every dream a reality. But there’s a dark side to this brave new world, including the fact that every post-human is monitored from the inside out, and the one artificial superintelligence running the show might be about to make its first big mistake. (wink)

The entire crew was only three people, including me, and I was behind the camera at all times. The talent is Madison Smith as James Keats and Bridget Graham as his wife, Katherine. Because the spectacular location was expensive, the entire short had to be filmed in three hours, so we had to be lean and fast. What a rush! (Pun intended.)

The concept was to try to replicate what a full-length feature would look and feel like by adapting the opening of Post-Human, right up to what would be the opening credits. Of course, as I was producing the movie myself, we only had a micro-budget, but after researching the indie films here on Vimeo over the last year, I became convinced that we could create a reasonable facsimile of a big-budget production and hopefully introduce this world to many more people who aren’t necessarily aficionados of scifi on the Kindle. The series has been downloaded over a million times since 2012, and I’ve always intended for it to be adapted for film, so I’m excited to have, in some small measure, finally succeeded.

Read more

UC San Diego is establishing a robotics institute aimed at developing machines that can interpret such things as facial expressions and walking styles and size up people’s thoughts, actions and feelings.

The See-Think-Do technology is largely meant to anticipate and fulfill people’s everyday needs, especially for the growing number of older Americans who want to remain in their own homes instead of moving into an assisted-living facility or nursing home.

Engineers also envision creating robots so good at sizing up people, places and situations that they could help evacuate crowds from dangerous areas and pick through the rubble of an earthquake in search of survivors.

Read more

Emerging technologies are shaking up how we grow food, how we distribute it, and even what we eat. We are seemingly on the cusp of a food revolution, and technologies including artificial intelligence will undoubtedly play a huge role in helping people grow healthier, more resilient food faster and with less energy than ever before.

Rob Nail, Singularity University’s CEO and Associate Founder, provides a few examples of how robotics, automation, and drones are transforming agriculture in this short video:

Read more

During the press conference for the release of the Autopilot, Tesla CEO Elon Musk referred to each Model S owner as an “expert trainer” – meaning that each driver will train the autonomous features of the system, feeding the collective network intelligence of the fleet simply by driving the electric vehicle on Autopilot.

He said that the system should improve every day, but that improvements might only become noticeable every week or so as they add up. Just a few weeks after the release, Model S owners are already taking to the Tesla Motors Club forum to describe how the Autopilot is improving…
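The “expert trainer” idea, as described here, amounts to aggregating many drivers’ small corrections into periodic fleet-wide updates. The sketch below is a generic, hypothetical illustration of that pattern only; the class, the lane-offset signal, and the simple averaging scheme are assumptions, not Tesla’s actual pipeline.

```python
# Hypothetical sketch of fleet learning: each car logs small corrections
# (e.g. lane-centering offsets) and the fleet average becomes a periodic update.
from statistics import mean

class FleetLearner:
    def __init__(self) -> None:
        self.pending_corrections: list[float] = []
        self.lane_offset_bias = 0.0  # current fleet-wide calibration value

    def log_driver_correction(self, offset_m: float) -> None:
        """Record one driver's steering correction (metres from lane centre)."""
        self.pending_corrections.append(offset_m)

    def weekly_update(self) -> float:
        """Fold accumulated corrections into the shared value; improvements are
        small per drive but add up across the fleet over a week or so."""
        if self.pending_corrections:
            self.lane_offset_bias += mean(self.pending_corrections)
            self.pending_corrections.clear()
        return self.lane_offset_bias

fleet = FleetLearner()
for correction in (0.12, -0.05, 0.08, 0.03):  # corrections logged on several drives
    fleet.log_driver_correction(correction)
print(f"Updated lane-offset bias: {fleet.weekly_update():+.3f} m")
```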

Read more