
One of the largest drawbacks in robotics is the rigid parts and movements of the robots. Well, that may soon be changing thanks to non-Newtonian fluids.


(Inside Science) — By using fluids similar to Silly Putty that can behave as both liquids and solids, researchers say they have created fluid robots that might one day perform tasks that conventional machines cannot.

Conventional robots are made of rigid parts that are vulnerable to bumps, scrapes, twists and falls. In contrast, researchers worldwide are increasingly developing robots made from soft, elastic plastic and rubber that are inspired by worms, starfish and octopuses. These soft robots can resist many of the kinds of damage, and can squirm past many of the obstacles, that can impede hard robots.

However, even soft robots and the living organisms they are inspired by are limited by their solidity — for example, they remain vulnerable to cutting. Instead, researcher Ido Bachelet of Bar-Ilan University in Israel and his colleagues have now created what they call fluid robots that they say could operate better than solid robots in chaotic, hostile environments. They detailed their findings online Jan. 22 in the journal Artificial Life.

Read more

This may be true in the UK. However, I am in the US. In the US, if a robot represents me and I lose my case, can I claim improper representation? I believe that I can. Also, which states, counties, and cities recognize a robot as an attorney? What federal, state, county, and city ordinances and laws will need to change for robots to be recognized as attorneys in the US? Just having a robot that interprets laws is not enough in the US.

http://mic.com/articles/135693/this-robot-lawyer-can-get-you-out-of-a-parking-ticket-for-free


It has already saved people millions of dollars.

Read more

Seeking to “push the limits of what humans can do,” researchers at Georgia Tech have developed a wearable robotic limb that transforms drummers into three-armed cyborgs.

The remarkable thing about this wearable arm, developed at GT’s Center for Music Technology, is that it’s doing a lot more than just mirroring the movements of the drummer. It’s a “smart arm” that’s actually responding to the music, and performing in a way that complements what the human player is doing.

The two-foot long arm monitors the music in the room, so it can improvise based on the beat and rhythm. If the drummer is playing slowly, for example, the arm will mirror the tempo.

Read more

Fujitsu Laboratories today announced that it has developed deep learning technology that can analyze time-series data with a high degree of accuracy. Demonstrating promise for Internet-of-Things applications, time-series data can also be subject to severe volatility, making it difficult for people to discern patterns in the data. Deep learning technology, which is attracting attention as a breakthrough in the advance of artificial intelligence, has achieved extremely high recognition accuracy with images and speech, but the types of data to which it can be applied are still limited. In particular, it has been difficult to accurately and automatically classify volatile time-series data, such as that taken from IoT devices, in which people have difficulty discerning patterns.

Now Fujitsu Laboratories has developed an approach that uses advanced techniques to extract geometric features from time-series data, enabling highly accurate classification of volatile time-series. In benchmark tests using data from the UC Irvine Machine Learning Repository that classified time-series data captured from gyroscopes in wearable devices, the new technology was found to achieve roughly 85% accuracy, about a 25% improvement over existing technology. This technology will be used in Fujitsu’s Human Centric AI Zinrai artificial intelligence technology. Details of this technology will be presented at the Fujitsu North America Technology Forum (NAFT 2016), which will be held on Tuesday, February 16, in Santa Clara, California.
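
Fujitsu has not published implementation details, but the general idea of turning a volatile time series into geometric shape descriptors before classifying it can be sketched with a toy delay-embedding example. Everything below (function names, the two features chosen) is illustrative, not Fujitsu’s actual method:

```python
import math

def delay_embed(series, dim=2, tau=1):
    """Map a 1-D series to points in R^dim via time-delay embedding."""
    return [tuple(series[i + j * tau] for j in range(dim))
            for i in range(len(series) - (dim - 1) * tau)]

def geometric_features(series):
    """Two simple shape descriptors of the embedded trajectory:
    mean step length, and total turning (a crude curvature proxy
    that grows for jittery, volatile signals)."""
    pts = delay_embed(series)  # 2-D trajectory
    steps = [math.dist(a, b) for a, b in zip(pts, pts[1:])]
    mean_step = sum(steps) / len(steps)

    def angle(p, q, r):
        # turning angle between consecutive displacement vectors
        v1 = (q[0] - p[0], q[1] - p[1])
        v2 = (r[0] - q[0], r[1] - q[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n = math.hypot(*v1) * math.hypot(*v2)
        return math.acos(max(-1.0, min(1.0, dot / n))) if n else 0.0

    turning = sum(angle(p, q, r) for p, q, r in zip(pts, pts[1:], pts[2:]))
    return (mean_step, turning)
```

Feature vectors like these can then be fed to any ordinary classifier; the point is that geometry of the trajectory, rather than raw sample values, is what gets learned.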

Background

In recent years, in the field of machine learning, which is a central technology in artificial intelligence, deep learning technology has been attracting attention as a way to automatically extract feature values needed to interpret and assess phenomena without rules being taught manually. Especially in the IoT era, massive volumes of time-series data are being accumulated from devices. By applying deep learning to this data and classifying it with a high degree of accuracy, further analyses can be performed, holding the prospect that it will lead to the creation of new value and the opening of new business areas.

Read more

https://youtube.com/watch?v=KV3hHGUGXIU

A complete cognitive architecture to implement systems that are self-aware and capable of intentional mutations. Now available at mecasapiens.com.

HALIFAX, CANADA, February 16, 2016 (Newswire.com) — Monterège Design Inc. is pleased to announce the publication of a cognitive architecture to implement synthetic consciousness. The systems based on this architecture will be fully autonomous, self-aware and capable of intentional mutations. The architecture, published under the title The Meca Sapiens Blueprint, is complete and ready for design and implementation. It can be purchased online at mecasapiens.com.

Read more

How robotics is making live music a more enriching experience.


Scientists have developed a ‘smart’ wearable robotic limb that responds to human gestures and the music it hears, allowing drummers to play with three arms.

The two-foot long robotic arm can be attached to a musician’s shoulder, and knows what to play by listening to the music in the room. It improvises based on the beat and rhythm. For instance, if the musician plays slowly, the arm slows the tempo. If the drummer speeds up, it plays faster.

Another aspect of its intelligence is knowing where it is located at all times, where the drums are, and the direction and proximity of the human arms.
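
The coverage doesn’t describe Georgia Tech’s control software, but the tempo-following behavior described above, watching recent drum onsets, estimating the beat, and easing the arm’s rate toward it, can be sketched minimally. The class and parameter names here are hypothetical:

```python
from collections import deque

class TempoFollower:
    """Toy beat tracker: estimates BPM from recent onset times and
    smoothly eases the arm's playing rate toward the human's tempo."""

    def __init__(self, window=4, smoothing=0.5):
        self.onsets = deque(maxlen=window + 1)  # keep last few hit times
        self.smoothing = smoothing  # 0 = never adapt, 1 = jump instantly
        self.bpm = None

    def on_hit(self, t):
        """Register a drum onset at time t (seconds); return BPM estimate."""
        self.onsets.append(t)
        if len(self.onsets) < 2:
            return self.bpm  # not enough evidence yet
        times = list(self.onsets)
        intervals = [b - a for a, b in zip(times, times[1:])]
        target = 60.0 / (sum(intervals) / len(intervals))
        if self.bpm is None:
            self.bpm = target
        else:
            # exponential smoothing avoids jerky tempo changes
            self.bpm += self.smoothing * (target - self.bpm)
        return self.bpm
```

If the drummer hits every 0.5 seconds the estimate settles at 120 BPM; closer hits pull the estimate up gradually rather than instantly, which is roughly the “mirror the tempo” behavior the article describes.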

Read more

Actors and actresses will never have to worry about reading through pages of scripts to decide whether or not a role is worth their time; AI will do the work for them.


A version of this story first appeared in the Feb. 26 issue of The Hollywood Reporter magazine. To receive the magazine, click here to subscribe.

During his 12 years in UTA’s story department, Scott Foster estimates he read about 5,500 screenplays. “Even if it was the worst script ever, I had to read it cover to cover,” he says. So when Foster left the agency in 2013, he teamed with Portland, Ore.-based techie Brian Austin to create ScriptHop, an artificial intelligence system that manages the volume of screenplays that every agency and studio houses. “When I took over [at UTA], we were managing hundreds of thousands of scripts on a Word document,” says Foster, who also worked at Endeavor and Handprint before UTA. “The program began to eat itself and become corrupt because there was too much information to handle.” ScriptHop can read a script and do a complete character breakdown in four seconds, versus the roughly four hours required of a human reader. The tool, which launches Feb. 16, is free, and is a sample of the overall platform coming later in 2016 that will recommend screenplays as well as store and manage a company’s library for a subscription fee of $29.99 a month per user.

As for how exactly it works, Austin is staying mum. “There’s a lot of sauce in the secret sauce,” he says. Foster and Austin aren’t the first to create AI to analyze scripts. ScriptBook launched in 2015 as an algorithmic assessment to determine a script’s box-office potential. By contrast, ScriptHop is more akin to a Dewey Decimal System for film and TV. Say a manager needs to find a project for a 29-year-old male client who is 5 feet tall; ScriptHop will spit out the options quickly. “If you’re an agent looking for roles for minority clients, it’s hugely helpful,” says Foster. There’s also an emotional response dynamic (i.e., Oscar bait) that charts a character’s cathartic peaks and valleys as well as screen time and shooting days. So Meryl Streep instantly can find the best way to spend a one-month window between studio gigs. Either way, it appears that A.I. script reading is the future. The only question is what would ScriptHop make of Ex Machina’s Ava? “That would be an interesting character breakdown,” jokes Foster.
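
ScriptHop’s internals are secret, but the kind of query Foster describes, filtering structured character breakdowns by attributes like age, gender, and shooting days, is easy to sketch. The record schema and data below are invented for illustration:

```python
# Hypothetical character-breakdown records; ScriptHop's real schema is not public.
characters = [
    {"name": "Ava", "script": "Ex Machina", "gender": "F", "age": 25,
     "screen_time_min": 68, "shooting_days": 30},
    {"name": "Lou", "script": "Night Run", "gender": "M", "age": 29,
     "screen_time_min": 41, "shooting_days": 18},
    {"name": "Sam", "script": "Tidewater", "gender": "M", "age": 52,
     "screen_time_min": 12, "shooting_days": 6},
]

def find_roles(records, **criteria):
    """Return records matching every criterion: callables are applied
    as predicates, other values must match the field exactly."""
    def ok(rec):
        return all(v(rec[k]) if callable(v) else rec[k] == v
                   for k, v in criteria.items())
    return [r for r in records if ok(r)]

# An agent seeking a role for a male client, age ~29,
# who is free for roughly a month of shooting:
matches = find_roles(characters, gender="M",
                     age=lambda a: 25 <= a <= 33,
                     shooting_days=lambda d: d <= 22)
```

The hard part of a real system is, of course, extracting those structured records from raw screenplay text in the first place; once that exists, role matching reduces to ordinary filtering like this.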

Read more

Big Blue is cool again according to investors.


NEW YORK: Here’s a vexing question for artificial mega-brain Watson: Why is IBM stock surging? Big Blue’s market value rose about $6 billion after the computer giant agreed on Thursday to buy Truven Health Analytics for $2.6 billion. Giving IBM’s artificial-intelligence platform more data to chew on is useful, but investors’ glee over an opaque addition to an enigmatic business effort is confusing.

Big Blue’s top line has been shrinking steadily for nearly four years. In the fourth quarter of 2015, all major divisions had declining sales, with overall revenue falling 8.5 percent compared with the same period a year earlier. Clients need less of IBM’s hardware, and its software and consulting businesses are faltering in competition with rivals’ cloud-based versions.

The upshot is a falling share price. It has dropped about 25 percent in the past four years, while the S&P 500 has risen about 40 percent.

Read more

This is so true, and even more important in the technology space as we introduce more products and services built on AI. The reason is that we are seeing consumers’ buying patterns change, especially as consumers have more options around devices, services, and AI available to them.

As a result of more choices and AI sophistication, consumers now, and even more so in the future, will choose to buy things that “fit” their own style and personality. This places pressure on companies to change and expand their thinking on product innovation to include emotional thinking as well. Gone are the days of technology being just machines and devices designed only to process information and provide insights. Tech consumers today and in the future want technology that marries with their own sense of style and personality. Therefore, corporate culture as a whole will need to change its thinking at all levels.


I once wrote an article about how people with outstanding academic achievement or technical brilliance can easily get hired, but brilliance will get them nowhere if they lack emotional intelligence and the ability to build strong working relationships. This is especially true in today’s highly competitive world where organisations rely heavily on interdependence to stay ahead of the game.

However, I have heard arguments against my claim from people who point out that there is no shortage of notoriously heartless CEOs lacking in EQ. While that argument might ring true to some extent, I find the reasons for that situation rather interesting. It is also essential to note that most CEOs with low EQ scores are not the best-performing business leaders.

First, let’s make it clear that we are talking about managers or C-level executives who have to climb the ladder themselves and not those who founded or inherited a business. In this case, I have found research showing that middle managers often stand out with the highest emotional intelligence scores in the workplace because companies generally promote high-EQ types to supervisory positions as they are level-headed and good with people. However, EQ scores tend to decrease as people move up further in the hierarchy.

Read more

There is a need for a larger “official and governmental” review and oversight board for drones, robots, etc. due to criminal elements; however, any review needs to focus more on the immediate criminal elements that can use and are using this technology, plus how best to manage it. As with guns, we may see a need for background checks, registration, and licensing for drones and certain robots, as a way to better vet and track who can own them.


At AAAI-16, a panel discussed the safety that will be necessary when it comes to autonomous manned and unmanned aircraft. Here’s what you need to know.

Read more