
In 1610, Galileo redesigned the telescope and discovered Jupiter’s four largest moons. Nearly 400 years later, NASA’s Hubble Space Telescope used its powerful optics to look deep into space—enabling scientists to pin down the age of the universe.

Suffice it to say that getting a better look at things produces major scientific advances.

In a paper published on July 18 in The Astrophysical Journal, a team of scientists led by Craig DeForest, a solar physicist at Southwest Research Institute's branch in Boulder, Colorado, demonstrates that this historical trend still holds. Using advanced algorithms and data-cleaning techniques, the team discovered never-before-detected, fine-grained structures in the outer corona, the Sun's million-degree atmosphere, by analyzing images taken by NASA's STEREO spacecraft. The new results also foreshadow what might be seen by NASA's Parker Solar Probe, which, after its launch in the summer of 2018, will orbit directly through that region.

Read more

Our Fast Lightweight Autonomy program recently completed Phase 2 flight tests, demonstrating advanced algorithms designed to turn small air and ground systems into team members that can autonomously perform tasks dangerous for humans — such as pre-mission reconnaissance in a hostile urban setting or searching damaged structures for survivors following an earthquake.

Read more

WANT a job with a successful multinational? You will face lots of competition. Two years ago Goldman Sachs received a quarter of a million applications from students and graduates. Those are not just daunting odds for jobhunters; they are a practical problem for companies. If a team of five Goldman human-resources staff, working 12 hours every day, including weekends, spent five minutes on each application, they would take nearly a year to complete the task of sifting through the pile.
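The "nearly a year" figure checks out; a quick back-of-the-envelope sketch (a sanity check, not from the article):

```python
# Verify The Economist's estimate: five HR staff, 12-hour days,
# five minutes per application, a quarter of a million applications.
applications = 250_000
staff = 5
minutes_per_application = 5
hours_per_day = 12

total_minutes = applications * minutes_per_application   # 1,250,000 minutes
minutes_per_person = total_minutes / staff               # 250,000 minutes each
days = minutes_per_person / 60 / hours_per_day           # about 347 days
```

At roughly 347 twelve-hour days per person, "nearly a year" of nonstop work is, if anything, understated.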

Little wonder that most large firms use a computer program, or algorithm, when it comes to screening candidates seeking junior jobs. And that means applicants would benefit from knowing exactly what the algorithms are looking for.
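The article doesn't describe any particular vendor's screening algorithm, but keyword matching is a common first-pass filter in applicant-tracking systems. A toy sketch of the idea (the keywords and CV text below are invented for illustration):

```python
def score_application(text: str, keywords: list[str]) -> int:
    """Toy screening filter: count how many target keywords appear
    in an application. Real applicant-tracking systems are more
    sophisticated, but this captures why wording matters."""
    text = text.lower()
    return sum(1 for kw in keywords if kw.lower() in text)

# Hypothetical job posting keywords and candidate CV snippet.
keywords = ["python", "sql", "financial modelling", "teamwork"]
cv = "Built financial modelling tools in Python with SQL backends."
score_application(cv, keywords)  # 3 of 4 keywords matched
```

Which is exactly why applicants benefit from knowing what the filter looks for: mirroring the posting's vocabulary raises the score.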

Read more

An international team of scientists from Eindhoven University of Technology, the University of Texas at Austin, and the University of Derby has developed a revolutionary method that quadratically accelerates artificial intelligence (AI) training algorithms. This gives full AI capability to inexpensive computers, and within one to two years could enable supercomputers to utilize artificial neural networks that quadratically exceed the possibilities of today's networks. The scientists presented their method on June 19 in the journal Nature Communications.

Artificial Neural Networks (or ANN) are at the very heart of the AI revolution that is shaping every aspect of society and technology. But the ANNs that we have been able to handle so far are nowhere near solving very complex problems. The very latest supercomputers would struggle with a 16 million-neuron network (just about the size of a frog brain), while it would take over a dozen days for a powerful desktop computer to train a mere 100,000-neuron network.

Read more

Scientists at the California Institute of Technology can now assess a person’s intelligence in moments with nothing more than a brain scan and an AI algorithm, university officials announced this summer.

Caltech researchers led by Ralph Adolphs, PhD, a professor of psychology, neuroscience and biology and chair of the Caltech Brain Imaging Center, said in a recent study that they, alongside colleagues at Cedars-Sinai Medical Center and the University of Salerno, were successfully able to predict IQ in hundreds of patients from fMRI scans of resting-state brain activity. The work is pending publication in the journal Philosophical Transactions of the Royal Society.

Adolphs and his team collected data from nearly 900 men and women for their research, all of whom were part of the National Institutes of Health (NIH)-driven Human Connectome Project. The researchers trained their machine learning algorithm on the complexities of the human brain by feeding the brain scans and intelligence scores of these hundreds of patients into the algorithm—something that took very little effort on the patients’ end.
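The study's recipe is standard supervised regression: connectivity features from resting-state fMRI go in, measured intelligence scores are the target, and the fitted model predicts scores for held-out subjects. A minimal sketch of that recipe with synthetic stand-in data (the random features below are placeholders, not real brain data, and this is not the authors' actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in: 900 subjects, 50 connectivity features each.
n_subjects, n_features = 900, 50
X = rng.standard_normal((n_subjects, n_features))
true_w = rng.standard_normal(n_features)
iq = 100 + X @ true_w + rng.standard_normal(n_subjects) * 5  # noisy scores

# Train on 800 subjects, hold out 100 for evaluation.
X_train, X_test = X[:800], X[800:]
y_train, y_test = iq[:800], iq[800:]

# Ridge regression in closed form: w = (X'X + aI)^-1 X'y
alpha = 1.0
mu_x, mu_y = X_train.mean(axis=0), y_train.mean()
Xc, yc = X_train - mu_x, y_train - mu_y
w = np.linalg.solve(Xc.T @ Xc + alpha * np.eye(n_features), Xc.T @ yc)

# Predict held-out scores and correlate with the true values.
pred = (X_test - mu_x) @ w + mu_y
r = np.corrcoef(pred, y_test)[0, 1]
```

On data with real signal, a positive held-out correlation `r` is what "successfully able to predict IQ" means in practice; the published result is a statistically significant, not perfect, correlation.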

Read more

It’s amusing that these people know where this is headed, but aren’t interested enough to stop it.


The co-chief investment officer and co-chairman of Bridgewater Associates shared his thoughts in a Facebook post on Thursday.

Dalio says he was responding to a question about whether machine intelligence would put enough people out of work that the government will have to pay people to live with a cash handout, a concept known as universal basic income.

My view is that algorithmic/automated decision making is a two-edged sword that is improving total productivity but is also eliminating jobs, leading to big wealth and opportunity gaps and populism, and creating a national emergency.

Read more

Contrary to what Silicon Valley portrays, you’ll need more than drive and intelligence to land a high-paying job in the tech world. You’ll need to be well versed in one of the most popular and fastest growing programming languages: Python.


Python made its debut in 1991, and since then it’s been honed and refined by some of the brightest programmers in the industry. That’s resulted in its current status as a multi-faceted, yet beautifully simple language with a wide variety of applications, from interfacing with SQL databases to building websites.
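To make "interfacing with SQL databases" concrete: Python's standard library alone can talk SQL via the `sqlite3` module, no third-party installs required. A minimal example:

```python
import sqlite3

# An in-memory SQLite database: create a table, insert a row, query it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE langs (name TEXT, first_release INTEGER)")
conn.execute("INSERT INTO langs VALUES ('Python', 1991)")
row = conn.execute(
    "SELECT first_release FROM langs WHERE name = 'Python'"
).fetchone()
print(row[0])  # 1991
conn.close()
```

That batteries-included breadth, from database drivers to web frameworks like Django and Flask, is a big part of why the language shows up in so many job listings.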

Read more

What if a large class of algorithms used today—from the algorithms that help us avoid traffic to the algorithms that identify new drug molecules—worked exponentially faster?

Computer scientists at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a completely new kind of algorithm, one that exponentially speeds up computation by dramatically reducing the number of parallel steps required to reach a solution.
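The SEAS work itself concerns a specific class of optimization problems, but the core idea, trading many sequential steps for a few rounds of independent work done in parallel, can be illustrated with a classic toy example (this sketch is an illustration of round reduction in general, not the Harvard algorithm):

```python
def parallel_max(items: list) -> tuple:
    """Find the maximum in ceil(log2 n) 'rounds'. Each round compares
    disjoint pairs, and every comparison in a round is independent, so
    on parallel hardware a whole round takes one step. A sequential
    scan would need n - 1 steps; this needs only logarithmically many
    rounds -- an exponential reduction in sequential depth."""
    rounds = 0
    while len(items) > 1:
        # All of these pairwise max() calls could run simultaneously.
        items = [max(items[i:i + 2]) for i in range(0, len(items), 2)]
        rounds += 1
    return items[0], rounds

parallel_max(list(range(1000)))  # (999, 10): 10 rounds instead of 999 steps
```

Cutting the number of parallel rounds, rather than the total work, is exactly the resource the new algorithm attacks.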

The researchers will present their novel approach at two upcoming conferences: the ACM Symposium on Theory of Computing (STOC), June 25–29, and the International Conference on Machine Learning (ICML), July 10–15.

Read more

Recommended Books ➤

📖 Life 3.0 — http://azon.ly/ij9u
📖 The Master Algorithm — http://azon.ly/excm
📖 Superintelligence — http://azon.ly/v8uf

This video is the twelfth and final installment in a multi-part series discussing computing. In this video, we’ll be discussing the future of computing, specifically the evolution of the field and extrapolations forward based on topics we’ve discussed so far in this series.

[0:31–5:50] Starting off, we’ll discuss the three primary eras in the evolution of the field of computing since its inception: the tabulating, programming, and cognitive eras.

Afterwards, we’ll discuss infinite computing, a paradigm that incorporates cloud computing and the principles of heterogeneous architecture, which is accelerating the transition to cognitive computing.

Finally, to wrap up, we’ll discuss the future of computing, ubiquitous computing, fueled by the rise of abundant, affordable and smart computing devices, where computing is done using any device, in any location and in any format.

Read more