
Plug-and-play is preparing to launch.


DARPA hopes to shrink traditional military machines into single ‘chiplets’ to build a library of components to aid everything from smart drone building to instant language translation. Shown, an artist’s impression of the components that could be shrunk onto a single chip.

Read more

Luv it; and this is only the beginning too.


In the continued effort to build a viable quantum computer, scientists report that they have performed the first scalable quantum simulation of a molecule.

Quantum computing, if it is ever realized, will revolutionize computing as we know it, offering dramatic speedups over many of today's computing methods. However, such computers have yet to be built at scale, as they pose monumental engineering challenges (though much progress has been made in the past ten years).

Case in point: scientists now report that, for the first time, they have performed a scalable quantum simulation of a molecule. The paper appears in the open-access journal Physical Review X.

Read more

This is a real question, especially since China launches its new quantum communications satellite in the next few weeks. I do believe some data will be protected; for the broader majority, however, that will be a stretch.


The encryption of today will be broken by the computers of tomorrow, even retroactively.

Read more

Some folks will be freaked out by this while others will luv it.


A visitor tries out an HP Spectre XT laptop computer featuring an Intel Ultrabook processor at the Internationale Funkausstellung (IFA) 2012 consumer electronics trade fair on August 31, 2012 in Berlin, Germany. (Getty Images — Representational Image)

Read more

AI and Quality Control in Genome data are made for each other.


A new study published in The Plant Journal helps to shed light on the transcriptomic differences between tissues in Arabidopsis, an important model organism, by creating a standardized "atlas" that can automatically annotate samples to recover missing metadata such as tissue type. By combining data from over 7,000 samples and 200 labs, this work represents a way to leverage the growing amounts of publicly available 'omics data while improving quality control, allowing for large-scale studies and data reuse.

“As more and more ‘omics data are hosted in the public databases, it becomes increasingly difficult to leverage those data. One big obstacle is the lack of consistent metadata,” says first author and Brookhaven National Laboratory research associate Fei He. “Our study shows that metadata might be detected based on the data itself, opening the door for automatic metadata re-annotation.”

The study focuses on data from microarray analyses, an early high-throughput genetic analysis technique that remains in common use. Such data are often made publicly available through tools such as the National Center for Biotechnology Information's Gene Expression Omnibus (GEO), which over time accumulates vast amounts of information from thousands of studies.
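The core idea, inferring a sample's tissue type from its expression profile alone, can be illustrated with a toy nearest-centroid classifier. Everything below (the gene count, the tissue names, the simulated expression values) is invented for illustration; it is not the authors' actual method or data:

```python
import math
import random

random.seed(0)
GENES = 50  # toy number of genes per expression profile

def profile(mean):
    """Simulate a mean expression profile as random values around `mean`."""
    return [random.gauss(mean, 1.0) for _ in range(GENES)]

# Toy "atlas": one reference centroid per tissue type.
centroids = {
    "leaf": profile(5.0),
    "root": profile(3.0),
    "flower": profile(7.0),
}

def annotate(sample):
    """Label a sample with the tissue whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda t: math.dist(sample, centroids[t]))

# A sample resembling the root profile, plus measurement noise,
# gets its missing "tissue" metadata re-annotated as "root".
query = [x + random.gauss(0.0, 0.2) for x in centroids["root"]]
print(annotate(query))  # → root
```

A real atlas would be trained on thousands of curated GEO samples and would use a far more robust classifier, but the input/output shape is the same: expression values in, predicted metadata out.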

Read more

Horizon Robotics, led by Yu Kai, Baidu's former deep learning head, is developing AI chips and software to mimic how the human brain solves abstract tasks, such as voice and image recognition. The company believes that this will provide more consistent and reliable services than cloud-based systems.

The goal is to enable fast and intelligent responses to user commands, without an internet connection, to control appliances, cars, and other objects. Health applications are a logical next step, although not yet discussed.

Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center.

Read more

Cloud syncing and sharing software company Dropbox today announced that it has released an image compression algorithm called Lepton under an Apache open source license on GitHub.

Lepton can both compress and decompress files, and for the latter it can work while streaming: files are expanded back to full size as they are being sent over the network. That makes Lepton important for user experience, since it transfers data and shows content more quickly. At the same time, it reduces storage pressure on the data center infrastructure where files often end up.

“We have used Lepton to encode 16 billion images saved to Dropbox, and are rapidly recoding our older images. Lepton has already saved Dropbox multiple petabytes of space,” Dropbox software systems architect Daniel Reiter Horn wrote in a blog post.

Read more