
About ten years ago, scientist Dave Bacon, now at Google, showed that a time-travelling quantum computer could rapidly solve a class of problems, known as NP-complete, which mathematicians have lumped together as being hard. The problem was that Bacon’s quantum computer was travelling around ‘closed timelike curves’. These are paths through the fabric of spacetime that loop back on themselves. General relativity allows such paths to exist through contortions in spacetime known as wormholes.

Why send a message back in time, but lock it so that no one can ever read the contents? Because it may be the key to solving presently intractable problems. That’s the claim of an international collaboration.

Read more

Nice list of experts on quantum; however, I would love to see someone from Los Alamos National Laboratory on the panel to discuss the quantum Internet, someone from the University of Sydney’s Innovation Lab, or the lady herself, Michelle Simmons. Hope to see registration open soon.


The announcement was made at the Global Derivatives Trading & Risk Management conference in Budapest, Hungary.

“Quantum computers enable us to use the laws of physics to solve intractable mathematical problems,” said Marcos de López de Prado, Senior Managing Director at Guggenheim Partners and a Research Fellow at Lawrence Berkeley National Laboratory’s Computational Research Division. “This is the beginning of a new era, and it will change the job of the mathematician and computer scientist in the years to come.”

As de Prado points out on the Quantum for Quants website, “Our smartphones are more powerful than the systems used by NASA to put a man on the moon.”

Read more

Great move by my friends at D-Wave! Nice.


BUDAPEST, HUNGARY—(Marketwired — May 10, 2016) — D-Wave Systems Inc., the world’s first quantum computing company, 1QB Information Technologies Inc. (1QBit), a quantum software firm, and financial industry experts today announced the launch of Quantum for Quants (quantumforquants.org), an online community designed specifically for quantitative analysts and other experts focused on complex problems in finance. Launched at the Global Derivatives Trading & Risk Management conference in Budapest, the online community will allow quantitative finance and quantum computing professionals to share ideas and insights regarding quantum technology and to explore its application to the finance industry. Through this community, finance industry experts will also be granted access to quantum computing software tools, simulators, and other resources and expertise to explore the best ways to tackle the most difficult computational problems in finance using entirely new techniques.

“Quantum computers enable us to use the laws of physics to solve intractable mathematical problems,” said Marcos de López de Prado, Senior Managing Director at Guggenheim Partners and a Research Fellow at Lawrence Berkeley National Laboratory’s Computational Research Division. “This is the beginning of a new era, and it will change the job of the mathematician and computer scientist in the years to come.”

Experts in finance, mathematics, computer science and physics have agreed to participate as editors and content contributors of the community, including:

Read more

A Chinese robot is set to compete with grade 12 students in the country’s national college entrance examination next year, aiming for a score that would qualify it for admission to first-class universities.

The robot, which is still being designed, will sit three exams – math, Chinese, and a comprehensive liberal arts test covering history, politics and geography, said Lin Hui, CEO of an artificial intelligence company in Chengdu.

The robot will have to finish the exams within the same designated time periods as the other examinees, and it will take them in a closed room with only proctors and a notary present.

Read more

What would you say if I told you that aging happens not because of accumulation of stresses, but rather because of the intrinsic properties of the gene network of the organism? I’m guessing you’d be like: :o.

So, here’s the deal. My biohacker friends, led by Peter Fedichev and Sergey Filonov, in collaboration with my old friend and longevity record holder Robert Shmookler Reis, published a very cool paper. They proposed a way to quantitatively describe the two types of aging – negligible senescence and normal aging. We all know that some animals just don’t care about time passing by: their mortality doesn’t increase with age. Such negligibly senescent species include the notorious naked mole rat and a bunch of other critters, like certain turtles and clams, to name a few. So the paper explains what it is exactly that makes these animals age so slowly – it’s the stability of their gene networks.

What does network stability mean then? Well, it’s actually pretty straightforward – if the DNA repair mechanisms are very efficient and the connectivity of the network is low enough, then this network is stable. So, normally aging species, such as ourselves, have unstable networks. This is a major bummer by all means. But! There is a way to overcome this problem, according to the proposed math model.
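To put the stability condition in symbols – purely as a schematic restatement of the sentence above, not the paper’s exact criterion – it reads roughly

\text{(repair efficiency)} \;>\; \text{(network connectivity)} \times \text{(rate of new damage)}

As long as repair outpaces the spread of errors through the network, the network stays stable; once the inequality flips, errors proliferate and you get normal aging.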

The model describes, in very general terms, what happens to a gene network over time – the majority of the genes are working perfectly, but a small number aren’t. There are repair mechanisms that take care of that. There are also mechanisms that take care of defective proteins, like heat shock proteins, etc. Put all of this together in an equation and solve it, and bam! You get an equation that yields the Gompertz law for all species that age normally, and a time-independent constant for the negligibly senescent ones.
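For reference, the Gompertz law mentioned here is the standard exponential mortality law (textbook form, not copied from the paper):

\mu(t) = \mu_0 \, e^{\Gamma t} \quad \text{(normally aging species)}
\mu(t) = \mu_0 = \text{const} \quad \text{(negligibly senescent species)}

where \mu(t) is the mortality rate at age t, \mu_0 is the initial mortality rate, and \Gamma sets how quickly the risk of death climbs with age.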

What’s the difference between those two aging regimes? The model suggests it’s the right combination of DNA repair efficiency and the combined efficiency of the proteolysis and heat shock response systems, which mediate degradation and refolding of misfolded proteins. So it’s not the accumulation of damage that is responsible for aging, but rather the properties of the gene network itself. The good news is that even though we are playing with a terrible hand at first, there is a chance we can still win by changing the features of our network and making it stable. For example, by optimizing the misfolded protein response or DNA repair.

Read more

Ask an Information Architect, CDO, or Data Architect (Enterprise or non-Enterprise) and they will tell you they have always known that information/data is a basic staple like electricity, and that they are glad folks are finally realizing it. So the same view we apply to utilities as core to our infrastructure and survival, we should also apply to information. In fact, information in some areas can be even more important than electricity, when you consider that information can launch missiles, cure diseases, make you poor or wealthy, and take down a government or even a country.


What is information? Is it energy, matter, or something completely different? Although we take this word for granted and without much thought in today’s world of fast Internet and digital media, this was not the case in 1948, when Claude Shannon laid the foundations of information theory. His landmark paper interpreted information in purely mathematical terms, a decision that dematerialized information forevermore. Not surprisingly, there are many nowadays who claim — rather unthinkingly — that human consciousness can be expressed as “pure information”, i.e. as something immaterial graced with digital immortality. And yet there is something fundamentally materialistic about information that we often ignore, although it stares us — literally — in the eye: the hardware that makes information happen.
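For the curious, the “purely mathematical terms” boil down to Shannon’s entropy, which measures the information content of a source in bits with no reference whatsoever to the physical carrier (standard formula, not quoted from this article):

H(X) = -\sum_i p_i \log_2 p_i

where p_i is the probability of the i-th symbol the source emits.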

As users we constantly interact with information via a machine of some kind, such as our laptop, smartphone or wearable. As developers or programmers we code via a computer terminal. As computer or network engineers we often have to wade through the sweltering heat of a server farm, or deal with the material properties of optical fibre or copper in our designs. Hardware and software are the fundamental ingredients of our digital world, both necessary not only for engineering information systems but for interacting with them as well. But this status quo is about to be massively disrupted by Artificial Intelligence.

A decade from now the postmillennial youngsters of the late 2020s will find it hard to believe that once upon a time the world was full of computers, smartphones and tablets, and that people had to interact with these machines in order to access information or build information systems. For them, information will be more like electricity: it will always be there, and always available to power whatever you want to do. And this will be possible because artificial intelligence systems will be able to manage information complexity so effectively that they can deliver the right information to the right person at the right time, almost in an instant. So let’s see what that would mean, and how different it would be from what we have today.

Read more

https://youtube.com/watch?v=EyOuVFQNMLI

Cambridge University spin-out Optalysys has been awarded a $350k grant for a 13-month project from the US Defense Advanced Research Projects Agency (DARPA). The project will see the company advance their research in developing and applying their optical co-processing technology to solving complex mathematical equations. These equations are relevant to large-scale scientific and engineering simulations such as weather prediction and aerodynamics.

The Optalysys technology is extremely energy efficient, using light rather than electricity to perform intensive mathematical calculations. The company aims to provide existing computer systems with massively boosted processing capabilities, with the goal of eventually reaching exaFLOP rates (a billion billion calculations per second). The technology operates at a fraction of the energy cost of conventional high-performance computers (HPCs) and has the potential to operate orders of magnitude faster.

In April 2015 Optalysys announced that they had successfully built a scalable, lens-less optical processing prototype that can perform mathematical functions. Codenamed Project GALELEO, the device demonstrates that second-order derivatives and correlation pattern matching can be performed optically in a scalable design.
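To give a sense of what correlation pattern matching involves mathematically, the sketch below computes a cross-correlation digitally via the Fourier domain with NumPy. It is purely an illustration of the underlying operation (the function name and toy data are invented for this example and are not Optalysys code); an optical system carries out the equivalent Fourier transforms with light in effectively a single pass, which is where the claimed speed and energy advantages come from.

import numpy as np

def correlate_via_fourier(scene, template):
    # Convolution theorem: correlation = inverse FFT of (FFT(scene) * conj(FFT(template)))
    return np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(template))).real

# Toy usage: locate a 10x10 bright patch hidden in a noisy 256x256 scene.
rng = np.random.default_rng(0)
scene = rng.normal(0.0, 0.1, (256, 256))
scene[100:110, 150:160] += 1.0          # embed the pattern at row 100, column 150
template = np.zeros((256, 256))
template[:10, :10] = 1.0                # the same pattern, placed at the origin
corr = correlate_via_fourier(scene, template)
print(np.unravel_index(np.argmax(corr), corr.shape))   # expect approximately (100, 150)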

Read more

I read this article and its complaints about the fragility of processing and storing information on a quantum computing platform. However, I suggest the writer review the news released two weeks ago about the new quantum data bus highlighted by PC World, GizMag, etc., which is about to go live in the near future. Another article to consider is today’s Science Daily article on electron spin currents, which highlights how this technique effectively processes information.


Rare-earth materials are prime candidates for storing quantum information, because the undesirable interaction with their environment is extremely weak. However, this lack of interaction also implies a very small response to light, making it hard to read and write data. Leiden physicists have now observed a record-high Purcell effect, which enhances the material’s interaction with light. The work was published on April 25 in Nature Photonics (“Multidimensional Purcell effect in an ytterbium-doped ring resonator”).

Ordinary computers perform calculations with bits—ones and zeros. Quantum computers, on the other hand, use qubits. These information units are a superposition of 0 and 1; they simultaneously represent a zero and a one. This enables quantum computers to process information in a totally different way, making them exponentially faster for certain tasks, such as solving certain mathematical problems or breaking encryption.
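In the usual notation (standard textbook form, not taken from the article), a qubit’s state is written

|\psi\rangle = \alpha |0\rangle + \beta |1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

where the complex amplitudes \alpha and \beta set the probabilities of reading out a 0 or a 1, and a register of n such qubits can hold a superposition over all 2^n bit strings at once.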

Fragile.

The difficult part now is to actually build a quantum computer in real life. Rather than silicon transistors and memories, you need physical components that can process and store quantum information; otherwise the key to the whole idea is lost. But the problem with quantum systems is that they are always coupled to their environment to some degree, which makes them lose their quantum properties and become ‘classical’. Thermal noise, for example, can destroy the whole system. This makes quantum systems extremely fragile and hard to work with.

Read more

New research by UCSF scientists could accelerate – by 10 to 100-fold – the pace of many efforts to profile gene activity, ranging from basic research into how to build new tissues from stem cells to clinical efforts to detect cancer or auto-immune diseases by profiling single cells in a tiny drop of blood.

The study, published online April 27, 2016, in the journal Cell Systems, rigorously demonstrates how to extract high-quality information about patterns of gene activity in individual cells without using expensive and time-consuming technology. The paper’s senior authors are Hana El-Samad, PhD, an associate professor of biochemistry and biophysics at UCSF, and Matt Thomson, PhD, a faculty fellow in UCSF’s Center for Systems and Synthetic Biology.

“We believe the implications are huge because of the fundamental tradeoff between depth of sequencing and throughput, or cost,” said El-Samad. “For example, suddenly, one can think of profiling a whole tumor at the single cell level.”

Read more