
Amidst these complex security challenges and the sea of unknowns coming our way, what remains fundamental to the safety and security of the human race is the role of programmers and programming, along with the integrity of semiconductor chips. The reason is that programmers can define and determine the nature of autonomous weapons systems (AWS), at least in the beginning, until AI begins to program itself.


Weaponized artificial intelligence is almost here. As algorithms begin to change warfare, the rise of autonomous weapons systems is becoming a terrifying reality.

Read more

These dated interfaces are not equipped to handle today’s exponential rise in data, which has been ushered in by the rapid dematerialization of many physical products into computers and software.

Breakthroughs in perceptual and cognitive computing, especially machine learning algorithms, are enabling technology to process vast volumes of data, and in doing so they are dramatically amplifying our brain’s abilities. Yet even with these powerful technologies that at times make us feel superhuman, the interfaces are still crippled by poor ergonomics.

Many interfaces are still designed around the concept that human interaction with technology is secondary, not instantaneous. This means that any time someone uses technology, they are inevitably multitasking, because they must simultaneously perform a task and operate the technology.

Read more

Facial recognition is going mainstream. The technology is increasingly used by law-enforcement agencies and in schools, casinos and retail stores, spurring privacy concerns. In this episode of Moving Upstream, WSJ’s Jason Bellini tests out the technology at an elementary school in Seattle and visits a company that claims its algorithm can identify potential terrorists by their facial features alone.


According to a straightforward interpretation of general relativity, the Big Bang wasn’t the start of ‘everything’.

Taking Einstein’s famous equations at face value and making as few assumptions as possible, a team of researchers has rewound the clock on our Universe to find it wouldn’t lead to a stopping point at all, but would take us through a different kind of beginning into a flipped space.
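For context (an editorial aside, not part of the researchers’ paper): “Einstein’s famous equations” here are the field equations of general relativity, which in a homogeneous, isotropic universe reduce to the Friedmann equation for the cosmic scale factor a(t), the quantity the team effectively rewinds toward the Big Bang.

```latex
% Einstein field equations, and the Friedmann equation they reduce to
% for a homogeneous, isotropic (FLRW) universe with scale factor a(t).
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\qquad\Longrightarrow\qquad
\left(\frac{\dot{a}}{a}\right)^{2}
  = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}
```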

To understand what all the fuss over the Big Bang is about, we need to rewind a bit and see why physicists think it may not have been the start of everything.

Read more

Companies use different algorithms based on different sets of data. Most of that data comes from people of recent European ancestry.

The problem, obviously, is that a lot of people don’t have grandparents or great-great-great-grandparents from England or Italy or Denmark. Most people on Earth, actually! That means if you’re from, say, Asia or Africa, you might not get as detailed a profile as you’d like.

My mother, who was born in the Philippines, actually got an update from 23andMe with new information about her heritage. Her history didn’t change. But as the company gets more DNA kits from people of Asian descent, the algorithm churns out modified results. Which is great … but that does mean right now, if you’re not white, you might have to wait a bit longer for more accurate results.

Read more

Of course, all the algorithmic rigmarole is also causing real-world problems. Algorithms written by humans — tackling harder and harder problems, but producing code embedded with bugs and biases — are troubling enough. More worrisome, perhaps, are the algorithms that are not written by humans, algorithms written by the machine, as it learns.


Profiles in science.

Donald Knuth, master of algorithms, reflects on 50 years of his opus-in-progress, “The Art of Computer Programming.”

Donald Knuth at his home in Stanford, Calif. He is a notorious perfectionist and has offered to pay a reward to anyone who finds a mistake in any of his books. Credit: Brian Flaherty for The New York Times.

Read more

Forbes (Innovation): “Human 2.0 Is Coming Faster Than You Think. Will You Evolve With the Times?” by Neil Sahota, Cognitive World contributor group, October 1, 2018. Topics: artificial intelligence, big data.


IonQ was founded on a gamble that trapped-ion quantum computing could outperform the superconducting quantum computers that Google and others are building. As of right now, it does. IonQ has constructed a quantum computer that can perform calculations on a 79-qubit array, beating the previous king, Google, by 7 qubits.

Its gate fidelities are also the best in the business: an average single-qubit gate fidelity of 99.97 percent, where the nearest competitors sit around 99.5 percent, and a two-qubit gate fidelity of 99.3 percent, where most competitors fall below 95 percent. But how does it compare to regular computers?
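To see why those fidelity figures matter, here is a rough back-of-the-envelope sketch (my illustration, not IonQ’s benchmark): if each two-qubit gate succeeds independently with the quoted fidelity, the chance of running a circuit with no gate error shrinks exponentially with the number of gates.

```python
# Rough intuition only: treat each two-qubit gate as succeeding
# independently with the quoted fidelity, and ask how likely a whole
# circuit is to run without a single gate error.

def error_free_probability(gate_fidelity: float, num_gates: int) -> float:
    """Probability that num_gates consecutive gates all succeed."""
    return gate_fidelity ** num_gates

for fidelity in (0.993, 0.95):  # IonQ's reported two-qubit fidelity vs. a ~95% competitor
    p = error_free_probability(fidelity, num_gates=100)
    print(f"fidelity {fidelity:.3f}: P(clean 100-gate circuit) = {p:.3f}")

# fidelity 0.993: P(clean 100-gate circuit) = 0.495
# fidelity 0.950: P(clean 100-gate circuit) = 0.006
```

That exponential falloff is why a few tenths of a percent in gate fidelity translates into a large gap in usable circuit depth.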

According to IonQ, on the kinds of workloads quantum computers are being built for, it is already overtaking them. The Bernstein-Vazirani algorithm, a benchmark IonQ is hoping will take off, tests a computer’s ability to determine a secret encoded number by querying a black-box function (called an oracle) that answers only yes/no questions; a quantum computer can recover the whole number with a single query.
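For the curious, here is a small statevector simulation of the Bernstein-Vazirani routine itself (a generic sketch in plain NumPy, not IonQ’s hardware implementation): the secret bit string is hidden in a phase oracle, and a single oracle query sandwiched between two rounds of Hadamards reveals every bit at once.

```python
import numpy as np

def bernstein_vazirani(secret: str) -> str:
    """Statevector simulation of the Bernstein-Vazirani algorithm.

    The secret bit string s is hidden in the phase oracle
    f(x) = s.x (mod 2); one oracle query suffices to recover s.
    """
    n = len(secret)
    s = int(secret, 2)
    dim = 2 ** n

    # Build the n-qubit Hadamard transform H^(x)n.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)

    # Start in |0...0> and put every basis state into superposition.
    state = np.zeros(dim)
    state[0] = 1.0
    state = Hn @ state

    # Single oracle query: phase flip (-1)^(s.x) on each basis state x.
    phases = np.array([(-1) ** bin(s & x).count("1") for x in range(dim)])
    state = phases * state

    # Interfere again and measure: all amplitude lands on |s>.
    state = Hn @ state
    outcome = int(np.argmax(np.abs(state)))
    return format(outcome, f"0{n}b")

print(bernstein_vazirani("10110"))  # -> '10110' after a single oracle call
```

A classical machine playing the same game needs one yes/no query per bit of the secret, which is exactly the gap this benchmark is designed to expose.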

Read more