July 2015 was, as you know, all systems go for CERN’s Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn’t blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn just how much was being collected and processed each day: about one petabyte.

Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each collision is recorded as a “collision event”. Physicists must then sift through the 30 petabytes or so of data produced annually to determine whether the collisions have thrown up any interesting physics. Needless to say, the hunt is on!

The Data Centre processes about one petabyte of data every day — the equivalent of around 210,000 DVDs. The centre hosts 11,000 servers with 100,000 processor cores. Some 6,000 database changes are performed every second.
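
As a quick sanity check on those figures, here is a back-of-the-envelope sketch in Python (my own arithmetic, assuming the decimal convention of 10^15 bytes per petabyte and a 4.7 GB single-layer DVD):

    # Rough arithmetic behind the figures quoted above (illustrative only).
    PETABYTE = 10**15              # bytes, decimal convention
    DVD_CAPACITY = 4.7 * 10**9     # bytes, single-layer DVD

    print(f"DVDs per day: {PETABYTE / DVD_CAPACITY:,.0f}")   # ~212,766, close to the quoted 210,000
    print(f"DB changes per day: {6_000 * 86_400:,}")         # ~518 million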

With experiments at CERN generating such colossal amounts of data, the Data Centre stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of over 8,000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.

By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.

Image: CERN

PROCESSING THE DATA (processing info via CERN): Hundreds of thousands of computers from around the world then come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, all linked to CERN. Every day WLCG processes more than 1.5 million ‘jobs’, corresponding to a single computer running for more than 600 years.
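
Reading those two WLCG numbers together gives a feel for the workload. The implied average job length (my own rough inference from the quoted figures, not a CERN statistic) comes out to a few hours:

    # What "1.5 million jobs/day = 600+ CPU-years/day" implies (illustrative only).
    jobs_per_day = 1_500_000
    cpu_years_per_day = 600
    avg_job_hours = cpu_years_per_day * 365.25 * 24 / jobs_per_day
    print(f"average job ~ {avg_job_hours:.1f} CPU-hours")   # ~3.5 hours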

Racks of servers at the CERN Data Centre (Image: CERN)
CERN DATA CENTER: The server farm in the 1450 m² main room of the DC (pictured) forms Tier 0, the first point of contact between experimental data from the LHC and the Grid. As well as servers and data storage systems for Tier 0 and further physics analysis, the DC houses systems critical to the daily functioning of the laboratory. (Image: CERN)

The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second); a rough conversion of the per-experiment rates into daily volumes is sketched just after the list:

  • ALICE: 4 GB/s (Pb-Pb running)
  • ATLAS: 800 MB/s – 1 GB/s
  • CMS: 600 MB/s
  • LHCb: 750 MB/s
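
To put those rates in more familiar units, here is a back-of-the-envelope conversion into per-day volumes (the ATLAS midpoint and the decimal unit conventions are my assumptions):

    # Convert the quoted Run 2 rates into daily volumes (illustrative only).
    SECONDS_PER_DAY = 86_400
    rates_gb_s = {              # GB/s; ATLAS uses the midpoint of 800 MB/s - 1 GB/s
        "ALICE (Pb-Pb)": 4.0,
        "ATLAS": 0.9,
        "CMS": 0.6,
        "LHCb": 0.75,
    }
    for name, rate in rates_gb_s.items():
        print(f"{name:>14}: {rate * SECONDS_PER_DAY / 1000:,.1f} TB/day")
    # The combined ~25 GB/s peak figure would fill ~2.2 PB in a single day:
    print(f"{'25 GB/s total':>14}: {25 * SECONDS_PER_DAY / 1e6:.2f} PB/day")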

In July, the LHCb experiment reported the observation of an entirely new class of particles:
Exotic Pentaquark Particles (Image: CERN)

Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together.

The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”

Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, are comprised of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. This quark model also allows the existence of other quark composite states, such as pentaquarks composed of four quarks and an antiquark.
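
A quick charge-counting example shows why the quark model hangs together (standard quark charges; the u u d c c̄ content shown for the new LHCb states is the reported interpretation):

    % Fractional quark charges sum to integer hadron charges.
    \begin{gather*}
      Q_u = +\tfrac{2}{3}, \qquad Q_d = -\tfrac{1}{3}, \qquad Q_c = +\tfrac{2}{3} \\
      \text{proton } (uud):\ \tfrac{2}{3} + \tfrac{2}{3} - \tfrac{1}{3} = +1 \\
      \text{neutron } (udd):\ \tfrac{2}{3} - \tfrac{1}{3} - \tfrac{1}{3} = 0 \\
      \text{pentaquark } P_c^{+}\ (uudc\bar{c}):\ \tfrac{2}{3} + \tfrac{2}{3} - \tfrac{1}{3} + \tfrac{2}{3} - \tfrac{2}{3} = +1
    \end{gather*}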

Until now, however, no conclusive evidence for pentaquarks had been seen.
Earlier experiments that have searched for pentaquarks have proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.

“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!

August 18th, 2015
CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei, Antinuclei (Image: CERN)

Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.

The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.

Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today — from the smallest microbe on Earth to the largest galaxy in the universe.

Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show — with unparalleled precision — that protons and antiprotons have identical mass-to-charge ratios.

The experiment tested a central tenet of the Standard Model of particle physics, known as Charge, Parity, and Time Reversal (CPT) symmetry. If CPT symmetry holds, a system remains unchanged when three fundamental properties are reversed: charge, parity (a mirror-like inversion of spatial coordinates), and the direction of time.
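
Stated as equations (standard consequences of the CPT theorem; notation mine), exact CPT symmetry fixes the antiproton’s static properties from the proton’s, so the BASE test reduces to checking a single ratio against −1:

    % CPT relates each particle to its antiparticle:
    \[
      m_{\bar{p}} = m_p, \qquad q_{\bar{p}} = -q_p
      \quad\Longrightarrow\quad
      \left(\frac{q}{m}\right)_{\bar{p}} \Big/ \left(\frac{q}{m}\right)_{p} = -1
    \]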

The latest study takes the research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons — a hydrogen nucleus with an additional neutron — and antideuterons, as well as for helium-3 nuclei — two protons plus a neutron — and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles’ tracks in the ALICE detector’s magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.
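
For a sense of how curvature and time of flight yield a mass-to-charge ratio, here is the standard relativistic kinematics (a sketch of the general method, not ALICE’s exact analysis chain):

    % Rigidity from curvature, velocity from time of flight, then m/z:
    \begin{align*}
      \frac{p}{z} &= e B \rho && \text{rigidity from the track's curvature radius } \rho \text{ in field } B \\
      \beta &= \frac{L}{c\,t} && \text{velocity from flight path } L \text{ and time of flight } t \\
      \frac{m}{z} &= \frac{p/z}{\gamma \beta c}, && \gamma = \frac{1}{\sqrt{1-\beta^{2}}}
    \end{align*}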

IN CLOSING..

A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.

UNTIL THEN…

ORIGINAL ARTICLE POSTING via Michael Phillips LinkedIn Pulse @

Quoted: “Sometimes decentralization makes sense.

Filament is a startup that is taking two of the most overhyped ideas in the tech community—the blockchain and the Internet of things—and applying them to the most boring problems the world has ever seen: gathering data from farms, mines, oil platforms and other remote or highly secure places.

The combination could prove to be a powerful one, because monitoring remote assets like oil wells or mining equipment is expensive whether you are using people driving around to manually check gear or trying to use sensitive electronic equipment and a pricey satellite internet connection.

Instead, Filament has built a rugged sensor package that it calls a Tap, along with a technology network (the real secret sauce of the operation) that allows its sensors to conduct business even when they aren’t actually connected to the internet. The company has attracted an array of investors who have put $5 million into the company, a graduate of the Techstars program. Bullpen Capital led the round, with Verizon Ventures, Crosslink Capital, Samsung Ventures, Digital Currency Group, Haystack, Working Lab Capital, Techstars and others participating.

To build its technology, Filament is using a series of protocols that include the blockchain transaction database behind Bitcoin; BitTorrent, the popular peer-to-peer file sharing software; JOSE, a contract management protocol that is also used in the OAuth authentication service that lets people use their Facebook ID to log in and manage permissions to other sites around the web; TMesh, a long-range mesh networking technology; and Telehash for private messaging.”

“This cluster of technologies is what enables the Taps to perform some pretty compelling stunts, such as send small amounts of data up to 9 miles between Taps and keep a contract inside a sensor for a year or so even if that sensor isn’t connected to the Internet. In practical terms, that might mean that the sensor in a field gathering soil data might share that data with other sensors in nearby fields belonging to other farmers based on permissions the soil sensor has to share that data. Or it could be something a bit more complicated like a robotic seed tilling machine sensing that it was low on seed and ordering up another bag from inventory based on a “contract” it has with the dispensing system inside a shed on the property.

The potential use cases are hugely varied, and the idea of using a decentralized infrastructure is fairly novel. Both IBM and Samsung have tested out using a variation of the blockchain technology for storing data in decentralized networks for connected devices. The idea is that sending all of that data to the cloud and storing it for a decade or so doesn’t always make economic sense, so why not let the transactions and accounting for them happen on the devices themselves?

That’s where the blockchain and these other protocols come in. The blockchain is a great way to store information about a transaction in a distributed manner, and because it’s built into the devices, there’s no infrastructure to support for years on end. When combined with mesh radio technologies such as TMesh, it also becomes a good way to build out a network of devices that can communicate with each other even when they don’t have connectivity.”
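
To make the “transactions stored on the device itself” idea concrete, here is a minimal hash-chain sketch in Python (purely illustrative: the function names and sensor IDs are mine, and Filament’s actual stack is the blockchain/TMesh/Telehash combination described above):

    import hashlib
    import json
    import time

    def hash_block(block: dict) -> str:
        """Deterministic SHA-256 digest of a block's contents."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append_reading(chain: list, sensor_id: str, reading: float) -> dict:
        """Append a sensor reading, chained to the previous block's hash."""
        block = {
            "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
            "sensor_id": sensor_id,
            "reading": reading,
            "timestamp": time.time(),
        }
        block["hash"] = hash_block({k: v for k, v in block.items() if k != "hash"})
        chain.append(block)
        return block

    # A Tap-like device can accumulate readings offline and sync later;
    # tampering with any earlier block breaks every subsequent prev_hash link.
    chain: list = []
    append_reading(chain, "soil-tap-01", 0.42)
    append_reading(chain, "soil-tap-01", 0.44)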

Read the Article, and watch the Video, here > http://fortune.com/2015/08/18/filament-blockchain-iot/

“I am prepared to meet my Maker. Whether my Maker is prepared for the great ordeal of meeting me is another matter.” — Winston Churchill

Death still enjoys a steady paycheck, but being the Grim Reaper isn’t the cushy job that it used to be.


As everyone is pointing out, 2015 is a crucial year for sustainable development, with three critical international meetings in the calendar starting this month. But what role do science, technology and innovation play in these processes?

Read more


One of the most symbolic and substantively important examples of environmental conflict is over Yellowstone National Park. Yellowstone is the first national park in the world, and perhaps the most important natural treasure in the US. More recently it has become a site for bitter and long-lasting environmental conflict. And it has made me wonder how the scientific arguments around the issues sit with the emotional reactions inspired by the landscape and history.

Read more

OK. In scientific terms, it is only a ‘hypothesis’ — the reverse of the ‘Disposable Soma’ theory of ageing. Here’s how it goes.

For the past several decades, the Disposable Soma theory of ageing has been enjoying good publicity and a lively interest from both academics and the public alike. It stands up to scientific scrutiny, makes conceptual sense and fits well within an evolutionary framework of ageing. The theory basically suggests that, due to energy resource constraints, there is a trade-off between somatic cell and germ cell repair. As a result, germ cells are being repaired effectively and so the survival of the species is assured, at a cost of individual somatic (bodily) ageing and death. To put it very simply, we are disposable, we age and die because all the effective repair mechanisms have been diverted to our germ cell DNA in order to guarantee the survival of our species.
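
The trade-off at the heart of the theory can be captured in a toy allocation model (my own simplification for illustration, not a published model):

    # Toy disposable-soma trade-off: a fixed repair budget split between
    # germ line and soma (illustrative only).
    def repair_allocation(soma_fraction: float, budget: float = 1.0):
        somatic_repair = budget * soma_fraction          # slows bodily ageing
        germline_repair = budget * (1 - soma_fraction)   # preserves offspring fidelity
        return somatic_repair, germline_repair

    for f in (0.1, 0.5, 0.9):
        soma, germ = repair_allocation(f)
        print(f"soma fraction {f:.1f}: somatic {soma:.2f}, germ line {germ:.2f}")
    # Classic disposable soma: selection keeps the soma fraction low, so we age.
    # The 'reverse' hypothesis in this post asks what happens when the split shifts.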

The theory accounts for many repair pathways and mechanisms converging upon the germ cell, and for many of those mechanisms being driven away from somatic cell repair just to ensure germ cell survival. In the past two or three years, however, it has increasingly been realised that this process is not unidirectional (from soma to germ) but bi-directional: under certain circumstances, somatic cells may initiate damage that affects germ cells, and germ cells may initiate repairs that benefit somatic cells!

I can’t even begin to describe how important this bi-directionality is. Taking this in a wider and more speculative sense, it is, in fact, the basis for the cure of ageing. The discovery that germ cells can (or are forced to) relinquish their repair priorities, and that resources can then be re-allocated for somatic repairs instead, means that we may be able to avoid age-related damage (because this would be repaired with greater fidelity) and, at the same time, avoid overpopulation (as our now damaged genetic material would be unsuitable for reproduction).

Ermolaeva et al. raised the further possibility that DNA damage in germ cells may protect somatic cells. They suggested that DNA injury in germ cells upregulates stress resistance pathways in somatic cells, and improves stress response to heat or oxidation. This is profoundly important because it shows that, in principle, when germ cells are damaged, they produce agents which can then protect somatic cells against systemic stress.

This mechanism may reflect an innate tendency to reverse the trade-offs between germ cell and somatic cell repair: when the germ cells are compromised, there is delay in offspring production matched by an increased repair of somatic cells. In Nature’s ‘eyes’, if the species cannot survive, at least the individual bodies should.

In addition, it was shown that neuronal stress induces apoptosis (orderly cell death) in the germ line. This process is mediated by the IRE-1 factor, an endoplasmic reticulum stress response sensor, which then activates p53 and initiates the apoptotic cascade in the germ line. Therefore germ cells may die due to a stress response originating from the distantly-located neurons.

If this mechanism exists, it is likely that other similar mechanisms also exist, waiting to be described. The consequence could be that neuronal positive stress (i.e. exposure to meaningful information that entices us to act) can affect our longevity by downgrading the importance of germ cell repair in favour of somatic tissue repair. In other words, the disposable soma theory can be seen in reverse: the soma (body) is not necessarily disposable, but can survive longer if it becomes indispensable, if it is ‘useful’ to the whole. This, as we claimed last week, can happen through mechanisms which are independent of any artificial biotechnological interventions.

We know that certain events which downgrade reproduction may also cause a lifespan extension. Ablation of germ cells in the C. elegans worm leads to an increased lifespan, which shows that signals from the germ line have a direct impact upon somatic cell survival; this may be due to an increased resistance of somatic cells to stress. Somatic intracellular clearance systems are also up-regulated following signals from the germ line.

In addition, protein homoeostasis in somatic cells is well-maintained when germ cells are damaged, and it is significantly downgraded when germ cell function increases. All of the above suggest that when the germ cells are healthy, somatic repair decreases, and when they are not, somatic repair improves as a counter-effect.

In an intriguing paper published last month, Lin et al. showed that under certain circumstances, somatic cells may adopt germ-like characteristics, which may suggest that these somatic cells can also be subjected to germ line protection mechanisms after their transformation. A few days ago Bazley et al. published a paper elucidating the mechanisms of how germ cells may induce somatic cell reprogramming and somatic stem cell pluripotency. This is an additional piece of evidence of the cross-talk mechanisms between soma and germ line, underscoring the fact that the health of somatic tissues depends upon signals from the germ line.

In all, there is sufficient initial evidence to suggest that my line of thinking is quite possibly correct: the disposable soma theory is not unidirectional, and the body may not, after all, always be ‘disposable’. Under certain evolutionary pressures we could experience increased somatic maintenance at the expense of germ cell repairs, and thus reach a situation where the body actually lives longer. I have already discussed that some of these evolutionary pressures could depend upon how well one makes oneself ‘indispensable’ to the adaptability of the Homo sapiens species within a global techno-cultural environment.

“It’s much easier to replicate experiments and catch fraud if you have access to the original data. Some journals currently reward researchers for sharing the data that they used in an experiment. In the highest level of this new framework, data sharing would not only become compulsory, but independent analysts would conduct the same tests on it as those reported by the researchers, to see whether they get the same results.” Read more


“If goodwill and curiosity aren’t motivating researchers to work with open-source data on their own, there is still something that probably will: human limitation. ‘We have tiny little brains. We can’t understand the big stuff anymore,’ said Paul Cohen, a DARPA program manager in the Information and Innovation Office. ‘Machines will read the literature, machines will build complicated models, because frankly we can’t.’ When all you have to do is let your algorithms loose on a trove of publicly available data, there won’t be any reason not to pull in everything that’s out there.” Read more