Science fiction has always provided an outlet where we can let our imaginations run wild about the possibilities of technological advances. Not long ago, concepts like self-aware robots, autonomous cars, and 3D printers belonged purely to the realm of fiction, but in the digital age we're watching many of them come to life before our eyes.
Deep learning was previously a technology that seemed straight from the plot of the latest blockbuster, yet these days it's no longer fiction and is proliferating across real-life applications. Deep learning, a branch of machine learning under the broader umbrella of artificial intelligence (AI), teaches machines to approximate the pattern-recognition and decision-making processes of the human brain. Models are trained on extremely large historical datasets so they can adapt and learn from prior experience, identify anomalous patterns, and improve predictive accuracy.
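To make that train-then-predict idea concrete, here is a minimal sketch in PyTorch. Everything in it is a hypothetical placeholder, not any vendor's actual system: the "historical" records are synthetic random data, and the labeling rule simply stands in for real anomaly labels.

```python
# Minimal illustrative sketch: a small feed-forward network that learns from
# historical examples to flag anomalies. Data and labels are synthetic stand-ins.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical "historical" data: 1,000 records with 8 features each,
# labeled 1 for anomalous and 0 for normal via a synthetic rule.
X = torch.randn(1000, 8)
y = (X.abs().sum(dim=1) > 9).float().unsqueeze(1)

model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# Training loop: the model adjusts its weights to reduce prediction error,
# "learning from prior experience" in the sense described above.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Score a new, unseen record; a higher probability suggests an anomaly.
new_record = torch.randn(1, 8)
prob = torch.sigmoid(model(new_record)).item()
print(f"anomaly probability: {prob:.2f}")
```

Real deployments differ mainly in scale, not in kind: the loop above is the same adapt-from-data cycle, just run over far larger datasets and far deeper networks.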
These techniques are becoming so popular that Gartner recently named AI and Advanced Machine Learning (which includes technologies such as deep learning) its #1 Strategic Technology Trend for 2017. The firm went on to predict that these technologies will increasingly augment and extend virtually every technology-enabled service, thing, or application, making them the primary battleground for technology vendors through at least 2020.