
The General Index is a collection covering more than 100 million scientific papers that can be downloaded as 38 terabytes of data. It is structured and can be searched via code.


There’s a vast amount of research out there, with the volume growing rapidly with each passing day. But there’s a problem.

Not only is a lot of the existing literature hidden behind a paywall, but it can also be difficult to parse and make sense of in a comprehensive, logical way. What’s really needed is a super-smart version of Google just for academic papers.

Enter the General Index, a new database of some 107.2 million journal articles, totaling 38 terabytes of data in its uncompressed form. It spans more than 355 billion rows of text, each featuring a keyword or phrase plucked from a published paper.
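
Because the index is meant to be searched programmatically, here is a minimal sketch of what a query against a local slice might look like. The table and column names (`keywords`, `doc_id`, `keyword`) are assumptions for illustration only; check the schema that ships with the downloaded files, which may differ.

```python
# Minimal sketch: searching a local slice of a keyword index with SQLite.
# The table/column names below (keywords, doc_id, keyword) are hypothetical;
# verify the actual schema of the downloaded distribution before running.
import sqlite3


def find_papers(db_path: str, term: str, limit: int = 20) -> list:
    """Return document identifiers whose extracted keywords match `term`."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT DISTINCT doc_id FROM keywords WHERE keyword LIKE ? LIMIT ?",
            (f"%{term}%", limit),
        ).fetchall()
        return [doc_id for (doc_id,) in rows]
    finally:
        conn.close()


if __name__ == "__main__":
    for doc_id in find_papers("general_index_slice.db", "quantum annealing"):
        print(doc_id)
```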

This work expands the repertoire of DNA-based recording techniques by developing a novel DNA synthesis-based system that can record temporal environmental signals into DNA with minute-scale resolution. Employing DNA as a high-density data storage medium has paved the way for next-generation digital storage and biosensing technologies. However, the multipart architecture of current DNA-based recording techniques renders them inherently slow and incapable of recording fluctuating signals with sub-hour frequencies. To address this limitation, we developed a simplified system employing a single enzyme, terminal deoxynucleotidyl transferase (TdT), to transduce environmental signals into DNA. TdT adds nucleotides to the 3’ ends of single-stranded DNA (ssDNA) in a template-independent manner, selecting bases according to inherent preferences and environmental conditions.

“The Rejuvenome Project was launched to target these bottlenecks,” said Nicholas Schaum, PhD, Scientific Director at the Astera Institute. “We hope to do that by characterising treatments and regimens, both established and newly invented, which we have reason to believe improve health and longevity.”

Previously, Schaum worked as a researcher at Stanford University, California, in conjunction with the Chan Zuckerberg BioHub. He organised dozens of labs and hundreds of researchers into a consortium that produced cell atlases characterising aging tissues in mice. These cell atlases became the foundation for Schaum’s further studies into whole-organ aging and single-cell parabiosis.

The Rejuvenome Project is expected to be complete in 2028. All wet lab operations will be centred at Buck, while the dry lab computational aspects will reside at the Astera Institute.

Facebook has just announced it’s going to hire 10,000 people in Europe to develop the “metaverse”.

This is a concept which is being talked up by some as the future of the internet. But what exactly is it?

**What is the metaverse?**
To the outsider, it may look like a souped-up version of Virtual Reality (VR) — but some people think the metaverse could be the future of the internet.

In fact, the belief is that it could be to VR what the modern smartphone is to the first clunky mobile phones of the 1980s.

Instead of being on a computer, in the metaverse you might use a headset to enter a virtual world connecting all sorts of digital environments.

Unlike current VR, which is mostly used for gaming, this virtual world could be used for practically anything — work, play, concerts, cinema trips — or just hanging out.

Most people envision that you would have a 3D avatar — a representation of yourself — as you use it.

But because it’s still just an idea, there’s no single agreed definition of the metaverse.

Estimate measures information encoded in particles, opens door to practical experiments.

Researchers have long suspected a connection between information and the physical universe, with various paradoxes and thought experiments used to explore how or why information could be encoded in physical matter. The digital age propelled this field of study, suggesting that solving these research questions could have tangible applications across multiple branches of physics and computing.

In AIP Advances, from AIP Publishing, a University of Portsmouth researcher attempts to shed light on exactly how much of this information is out there and presents a numerical estimate for the amount of encoded information in all the visible matter in the universe — approximately 6 × 10⁸⁰ bits of information. While not the first estimate of its kind, this study’s approach relies on information theory.
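
For a sense of how a figure like this is assembled, the sketch below multiplies an assumed particle count by a Shannon-entropy estimate of bits per particle. The specific numbers are placeholders, not the paper’s derived values; only the ~6 × 10⁸⁰ total quoted above comes from the article.

```python
import math

# Back-of-envelope sketch of the Shannon-entropy bookkeeping behind an
# information estimate of this kind. The state probabilities and particle
# count below are placeholders for illustration; only the ~6e80 total
# quoted in the article above is taken from the source.


def shannon_bits(probabilities) -> float:
    """Average information per particle (in bits) for the given state probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)


bits_per_particle = shannon_bits([1 / 3, 1 / 3, 1 / 3])  # ~1.58 bits (placeholder)
n_particles = 4e80  # assumed count of stable particles (placeholder)

total_bits = bits_per_particle * n_particles
print(f"{bits_per_particle:.2f} bits/particle x {n_particles:.0e} particles "
      f"~ {total_bits:.1e} bits")  # lands on the order of 6e80
```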

The ATLAS collaboration is breathing new life into its LHC Run 2 dataset, recorded from 2015 to 2018. Physicists will be reprocessing the entire dataset – nearly 18 PB of collision data – using an updated version of the ATLAS offline analysis software (Athena). Not only will this improve ATLAS physics measurements and searches, it will also position the collaboration well for the upcoming challenges of Run 3 and beyond.

Athena converts raw signals recorded by the ATLAS experiment into more simplified datasets for physicists to study. Its new-and-improved version has been in development for several years and includes multi-threading capabilities, more complex physics-analysis functions and reduced memory consumption.

“Our aim was to significantly reduce the amount of memory needed to run the software, widen the types of physics analyses it could do and – most critically – allow current and future ATLAS datasets to be analysed together,” says Zach Marshall, ATLAS Computing Coordinator. “These improvements are a key part of our preparations for future high-intensity operations of the LHC – in particular the High-Luminosity LHC (HL-LHC) run beginning around 2028, which will see ATLAS’s computing resources in extremely high demand.”
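
As a schematic illustration of the multi-threaded reprocessing pattern described above (this is not Athena code; the event structure and reconstruction step are invented for the example), the sketch below reduces raw events to simplified analysis records across a pool of worker threads.

```python
# Schematic only: the general shape of multi-threaded event reprocessing.
# Event fields and the "reconstruction" step are invented for illustration
# and do not reflect Athena's actual data model or algorithms.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass
from typing import List


@dataclass
class RawEvent:
    event_id: int
    signals: List[float]  # stand-in for raw detector readout


@dataclass
class AnalysisRecord:
    event_id: int
    total_energy: float  # stand-in for a derived physics quantity


def reconstruct(event: RawEvent) -> AnalysisRecord:
    """Reduce one raw event to the simplified form an analyst would study."""
    return AnalysisRecord(event.event_id, sum(event.signals))


def reprocess(events: List[RawEvent], workers: int = 8) -> List[AnalysisRecord]:
    """Run reconstruction over a batch of events on a thread pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct, events))


if __name__ == "__main__":
    batch = [RawEvent(i, [0.1 * i, 0.2 * i]) for i in range(1000)]
    print(len(reprocess(batch)), "events reprocessed")
```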

Earlier this month D-Wave Systems, the quantum computing pioneer that has long championed quantum annealing-based quantum computing (and sometimes taken heat for that approach), announced it was expanding into gate-based quantum computing.

Surprised? Perhaps we shouldn’t be. Spun out of the University of British Columbia in 1999, D-Wave initially targeted gate-based quantum computing and discovered how hard it would be to develop. The company strategy morphed early on.

“I joined in 2005 when the company was first transitioning from a gate-model focus to a quantum annealing focus,” recalled Mark Johnson, now vice president of quantum technologies and systems products. “There was still this picture that we wanted to find the most direct path to providing valuable quantum applications, and we felt that quantum annealing was the way to do that. We felt the gate model was maybe 20 years away.”