Scientists, including one of Indian origin, have developed a new, highly efficient and low-cost light-emitting diode that could help spur more widespread adoption of LED technology.
“It can potentially revolutionise lighting technology. In general, the cost of LED lighting has been a big concern thus far. Energy savings have not balanced out high costs. The new discovery could change that,” explained Zhibin Yu, assistant professor of industrial and manufacturing engineering at Florida State University.
Yu developed this technology with a team that included post-doctoral researcher Junqiang Li and graduate students Sri Ganesh Bade and Xin Shan.
IBM scientists are advancing toward “neuromorphic” computing — digital systems that process information like the brain — and launching a complete ecosystem for brain-like computing, with important near-term applications and visionary long-term prospects.
“For decades, computer scientists have been pursuing two elusive goals in parallel: engineering energy-efficient computers modeled on the human brain and designing smart computing systems that learn on their own — like humans do — and are not programmed like today’s computers,” said Dharmendra S. Modha, IBM Fellow and Chief Scientist for brain-inspired computing.
But the ultimate goals of the project are nothing short of amazing: “The best possible outcome is to map the entirety of the existing cache of neural network algorithms and applications to this energy-efficient substrate,” said Modha. “And, to invent entirely new algorithms that were heretofore impossible to imagine.”
July 2015, as you know, was all systems go for CERN's Large Hadron Collider (LHC). On a Saturday evening, proton collisions resumed at the LHC and the experiments began collecting data once again. With the observation of the Higgs already in our back pocket, it was time to turn up the dial and push the LHC into double-digit (TeV) energy levels. From a personal standpoint, I didn't blink an eye hearing that large amounts of data were being collected at every turn. But I was quite surprised to learn just how much is collected and processed each day: about one petabyte.
Approximately 600 million times per second, particles collide within the LHC. The digitized summary of each collision is recorded as a “collision event”. Physicists must then sift through the 30 petabytes or so of data produced annually to determine if the collisions have thrown up any interesting physics. Needless to say, the hunt is on!
The Data Center processes about one Petabyte of data every day — the equivalent of around 210,000 DVDs. The center hosts 11,000 servers with 100,000 processor cores. Some 6000 changes in the database are performed every second.
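As a quick back-of-the-envelope check of that DVD comparison (assuming decimal units and a standard 4.7 GB single-layer DVD, neither of which is specified in the source):

```python
# Rough sanity check of the "one petabyte a day ~ 210,000 DVDs" figure.
# Assumptions (not from the article): decimal units, 4.7 GB single-layer DVDs.
PETABYTE_GB = 1_000_000      # 1 PB expressed in GB (decimal)
DVD_GB = 4.7                 # capacity of a single-layer DVD

dvds_per_day = PETABYTE_GB / DVD_GB
print(f"{dvds_per_day:,.0f} DVDs per day")   # ~212,766, i.e. roughly 210,000
```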
With experiments at CERN generating such colossal amounts of data, the Data Center stores it and then sends it around the world for analysis. CERN simply does not have the computing or financial resources to crunch all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of over 8000 physicists near real-time access to LHC data. The Grid runs more than two million jobs per day. At peak rates, 10 gigabytes of data may be transferred from its servers every second.
By early 2013 CERN had increased the power capacity of the centre from 2.9 MW to 3.5 MW, allowing the installation of more computers. In parallel, improvements in energy-efficiency implemented in 2011 have led to an estimated energy saving of 4.5 GWh per year.
Image: CERN
PROCESSING THE DATA (via CERN): Hundreds of thousands of computers from around the world then come into action: harnessed in a distributed computing service, they form the Worldwide LHC Computing Grid (WLCG), which provides the resources to store, distribute, and process the LHC data. WLCG combines the power of more than 170 collaborating centres in 36 countries around the world, which are linked to CERN. Every day WLCG processes more than 1.5 million ‘jobs’, corresponding to a single computer running for more than 600 years.
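A rough cross-check of that “600 years” comparison (my own arithmetic, not CERN's), treating each job as occupying a single computer:

```python
# Rough arithmetic behind the "600 years" comparison (my estimate, not CERN's).
jobs_per_day = 1_500_000
single_computer_years = 600

hours_of_compute = single_computer_years * 365.25 * 24   # ~5.26 million hours
avg_job_hours = hours_of_compute / jobs_per_day
print(f"~{avg_job_hours:.1f} hours of compute per job on average")  # ~3.5 hours
```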
The data flow from all four experiments for Run 2 is anticipated to be about 25 GB/s (gigabytes per second); the per-experiment rates below are converted into rough daily volumes in the sketch that follows the list:
ALICE: 4 GB/s (Pb-Pb running)
ATLAS: 800 MB/s – 1 GB/s
CMS: 600 MB/s
LHCb: 750 MB/s
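For a sense of scale, here is a rough conversion of those per-experiment rates into daily volumes, assuming (unrealistically) continuous running, so these are upper bounds rather than actual figures:

```python
# Convert the quoted per-experiment rates into daily volumes, assuming
# continuous running; real duty cycles are lower, so treat these as ceilings.
rates_gb_per_s = {
    "ALICE (Pb-Pb)": 4.0,
    "ATLAS": 1.0,          # upper end of the quoted 800 MB/s - 1 GB/s range
    "CMS": 0.6,
    "LHCb": 0.75,
}

SECONDS_PER_DAY = 86_400
for experiment, rate in rates_gb_per_s.items():
    tb_per_day = rate * SECONDS_PER_DAY / 1_000
    print(f"{experiment}: ~{tb_per_day:,.0f} TB/day")
# ALICE alone would be ~346 TB/day at that rate.
```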
In July, the LHCb experiment reported the observation of an entirely new class of particles: Exotic Pentaquark Particles (Image: CERN)
Possible layout of the quarks in a pentaquark particle. The five quarks might be tightly bound (left). They might also be assembled into a meson (one quark and one antiquark) and a baryon (three quarks), weakly bound together (right).
The LHCb experiment at CERN’s LHC has reported the discovery of a class of particles known as pentaquarks. In short, “The pentaquark is not just any new particle,” said LHCb spokesperson Guy Wilkinson. “It represents a way to aggregate quarks, namely the fundamental constituents of ordinary protons and neutrons, in a pattern that has never been observed before in over 50 years of experimental searches. Studying its properties may allow us to understand better how ordinary matter, the protons and neutrons from which we’re all made, is constituted.”
Our understanding of the structure of matter was revolutionized in 1964 when American physicist Murray Gell-Mann proposed that a category of particles known as baryons, which includes protons and neutrons, are composed of three fractionally charged objects called quarks, and that another category, mesons, are formed of quark-antiquark pairs. The quark model also allows the existence of other quark composite states, such as pentaquarks, composed of four quarks and an antiquark.
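To make the fractional-charge bookkeeping concrete, here is a small illustration. The charges are the standard quark-model values; the uud + c + anti-c content shown for the pentaquark is the composition commonly quoted for the LHCb candidates and is my assumption here, not something stated above:

```python
# Charge bookkeeping in the quark model (illustration only).
from fractions import Fraction

CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "c": Fraction(2, 3)}

def total_charge(quarks, antiquarks=()):
    """Sum quark charges; antiquarks carry the opposite charge."""
    return sum(CHARGE[q] for q in quarks) - sum(CHARGE[q] for q in antiquarks)

print(total_charge("uud"))                    # proton (baryon): 1
print(total_charge("d", antiquarks="u"))      # pi-minus meson: -1
print(total_charge("uudc", antiquarks="c"))   # pentaquark candidate: 1
```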
Until now, however, no conclusive evidence for pentaquarks had been seen; earlier experiments that searched for them proved inconclusive. The next step in the analysis will be to study how the quarks are bound together within the pentaquarks.
“The quarks could be tightly bound,” said LHCb physicist Liming Zhang of Tsinghua University, “or they could be loosely bound in a sort of meson-baryon molecule, in which the meson and baryon feel a residual strong force similar to the one binding protons and neutrons to form nuclei.” More studies will be needed to distinguish between these possibilities, and to see what else pentaquarks can teach us!
August 18th, 2015: CERN Experiment Confirms Matter-Antimatter CPT Symmetry for Light Nuclei, Antinuclei (Image: CERN)
Days after scientists at CERN’s Baryon-Antibaryon Symmetry Experiment (BASE) measured the mass-to-charge ratio of a proton and its antimatter particle, the antiproton, the ALICE experiment at the European organization reported similar measurements for light nuclei and antinuclei.
The measurements, made with unprecedented precision, add to growing scientific data confirming that matter and antimatter are true mirror images.
Antimatter shares the same mass as its matter counterpart, but has opposite electric charge. The electron, for instance, has a positively charged antimatter equivalent called the positron. Scientists believe that the Big Bang created equal quantities of matter and antimatter 13.8 billion years ago. However, for reasons yet unknown, matter prevailed, creating everything we see around us today — from the smallest microbe on Earth to the largest galaxy in the universe.
Last week, in a paper published in the journal Nature, researchers reported a significant step toward solving this long-standing mystery of the universe. According to the study, 13,000 measurements over a 35-day period show — with unparalleled precision — that protons and antiprotons have identical mass-to-charge ratios.
The experiment tested a central tenet of the Standard Model of particle physics known as charge, parity, and time reversal (CPT) symmetry. If CPT symmetry holds, a system remains unchanged when three fundamental properties are reversed: charge, parity (a mirror-like inversion of the spatial coordinates), and the direction of time.
The latest study takes the research on this symmetry further. The ALICE measurements show that CPT symmetry holds true for light nuclei such as deuterons — a hydrogen nucleus with an additional neutron — and antideuterons, as well as for helium-3 nuclei — two protons plus a neutron — and antihelium-3 nuclei. The experiment, which also analyzed the curvature of these particles’ tracks in the ALICE detector’s magnetic field and their time of flight, improves on the existing measurements by a factor of up to 100.
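The measurement principle can be sketched in a few lines: the curvature of a track in a known magnetic field gives the particle's rigidity (momentum per unit charge), and the time of flight over a known path gives its velocity; together they determine the mass-to-charge ratio. The following is a simplified sketch with roughly proton-like illustrative numbers, not ALICE's actual analysis:

```python
# Simplified sketch of extracting mass-to-charge from track curvature and
# time of flight; the numbers and idealisations are mine, not ALICE's.
import math

C = 299_792_458.0  # speed of light, m/s

def mass_to_charge(b_field_t, radius_m, path_m, tof_s):
    """m/q in kg/C from curvature in a field B plus time of flight."""
    rigidity = b_field_t * radius_m          # p/q = B * r
    beta = path_m / (C * tof_s)              # v/c from time of flight
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return rigidity / (gamma * beta * C)     # m/q = (p/q) / (gamma * beta * c)

# Roughly proton-like illustrative values: 0.5 T field, 6.7 m curvature
# radius, 3.7 m flight path, 16.9 ns time of flight.
print(mass_to_charge(0.5, 6.7, 3.7, 1.69e-8))  # ~1.0e-8 kg/C, close to a proton
```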
IN CLOSING..
A violation of CPT would not only hint at the existence of physics beyond the Standard Model — which isn’t complete yet — it would also help us understand why the universe, as we know it, is completely devoid of antimatter.
A long time ago I wondered why we don't use drones (*) (named Extreme Access Flyers for this particular application) to explore space, to reach new planets, asteroids … it would be exciting … rovers are limited in what they can do, so what if we made them airborne? Once in space, why not send a drone, or a swarm of them, from the main spacecraft to explore a new planet? They could interact, share capabilities, morph, etc.
While the market looks more or less promising for both civil and military applications, there is still a long path to walk …
“Teal Group’s 2015 market study estimates that UAV production will soar from current worldwide UAV production of $4 billion annually to $14 billion, totaling $93 billion in the next ten years. Military UAV research spending would add another $30 billion over the decade.”
Read more at http://www.suasnews.com/2015/08/37903/teal-group-predicts-worldwide-uav-production-will-total-93-billion-in-its-2015-uav-market-profile-and-forecast/
NASA is now pursuing the use of drones to overcome the limitations of rovers …
Read more at http://www.engadget.com/2015/07/31/space-drones-mars-moon-asteroid/
Filament is a startup that is taking two of the most overhyped ideas in the tech community—the blockchain and the Internet of things—and applying them to the most boring problems the world has ever seen: gathering data from farms, mines, oil platforms and other remote or highly secure places.
The combination could prove to be a powerful one, because monitoring remote assets like oil wells or mining equipment is expensive whether you are using people driving around to manually check gear or trying to use sensitive electronic equipment and a pricey satellite internet connection.
Instead, Filament has built a rugged sensor package that it calls a Tap, plus a technology network, the real secret sauce of the operation, that allows its sensors to conduct business even when they aren’t actually connected to the internet. The company, a graduate of the Techstars program, has attracted an array of investors who have put $5 million into it. Bullpen Capital led the round, with Verizon Ventures, Crosslink Capital, Samsung Ventures, Digital Currency Group, Haystack, Working Lab Capital, Techstars and others participating.
To build its technology, Filament is using a series of protocols: the blockchain transaction database behind Bitcoin; BitTorrent, the popular peer-to-peer file sharing software; JOSE (JSON Object Signing and Encryption), used here for managing device contracts and also found in the OAuth authentication service that lets people use their Facebook ID to log in and manage permissions to other sites around the web; TMesh, a long-range mesh networking technology; and Telehash for private messaging.
“This cluster of technologies is what enables the Taps to perform some pretty compelling stunts, such as sending small amounts of data up to 9 miles between Taps and keeping a contract inside a sensor for a year or so even if that sensor isn’t connected to the Internet. In practical terms, that might mean that a sensor in a field gathering soil data shares that data with other sensors in nearby fields belonging to other farmers, based on permissions the soil sensor has to share that data. Or it could be something a bit more complicated, like a robotic seed tilling machine sensing that it was low on seed and ordering up another bag from inventory based on a “contract” it has with the dispensing system inside a shed on the property.
The potential use cases are hugely varied, and the idea of using a decentralized infrastructure is fairly novel. Both IBM and Samsung have tested out using a variation of the blockchain technology for storing data in decentralized networks for connected devices. The idea is that sending all of that data to the cloud and storing it for a decade or so doesn’t always make economic sense, so why not let the transactions and accounting for them happen on the devices themselves?
That’s where the blockchain and these other protocols come in. The blockchain is a great way to store information about a transaction in a distributed manner, and because it’s built into the devices there’s no infrastructure to support for years on end. When combined with mesh radio technologies such as TMesh, it also becomes a good way to build out a network of devices that can communicate with each other even when they don’t have connectivity.”
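For a flavour of what signing data on the device with JOSE-style tokens might look like, here is an illustrative sketch using the PyJWT library; the key handling, field names, and workflow are my assumptions, not Filament's actual stack or API:

```python
# Illustrative only: signing a sensor reading as a JSON Web Token (part of the
# JOSE family of standards). Not Filament's actual implementation.
import time
import jwt  # PyJWT

DEVICE_SECRET = "replace-with-a-per-device-key"  # hypothetical shared secret

def sign_reading(device_id: str, soil_moisture: float) -> str:
    """Package a reading so peers can verify it offline, given the key."""
    claims = {
        "device": device_id,
        "soil_moisture": soil_moisture,
        "iat": int(time.time()),
    }
    return jwt.encode(claims, DEVICE_SECRET, algorithm="HS256")

def verify_reading(token: str) -> dict:
    """Another Tap or gateway can check the signature with no internet access."""
    return jwt.decode(token, DEVICE_SECRET, algorithms=["HS256"])

token = sign_reading("tap-field-7", 0.31)
print(verify_reading(token))
```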
Read the Article, and watch the Video, here > http://fortune.com/2015/08/18/filament-blockchain-iot/
Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, together with an international team of scientists studying ultrafast physics, have solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.
A cutaway view of the proposed ARC reactor (credit: MIT ARC team)
MIT plans to create a new compact version of a tokamak fusion reactor with the goal of producing practical fusion power, which could offer a nearly inexhaustible energy resource in as little as a decade.
Fusion, the nuclear reaction that powers the sun, involves fusing pairs of hydrogen nuclei together to form helium, accompanied by enormous releases of energy.
The new fusion reactor, called ARC, would take advantage of new, commercially available superconductors — rare-earth barium copper oxide (REBCO) superconducting tapes (the dark brown areas in the illustration above) — to produce stronger magnetic field coils, according to Dennis Whyte, a professor of Nuclear Science and Engineering and director of MIT’s Plasma Science and Fusion Center.
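Why stronger coils matter so much: for a tokamak of fixed size and plasma pressure ratio, fusion power scales roughly with the fourth power of the magnetic field. That scaling is a commonly cited rule of thumb added here for context, not a figure from the article, and the sketch below is illustrative only:

```python
# Commonly cited tokamak rule of thumb (my addition, not from the article):
# at fixed size and plasma pressure ratio, fusion power scales roughly as B^4.
def relative_fusion_power(new_field_t: float, old_field_t: float) -> float:
    """Ratio of fusion power after changing the magnetic field strength."""
    return (new_field_t / old_field_t) ** 4

# Doubling the field at fixed size gives roughly a 16x gain in fusion power,
# which is why high-field REBCO magnets allow a much more compact reactor.
print(relative_fusion_power(2.0, 1.0))  # 16.0
```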
Professor Hyun-Gyu Park of the Department of Chemical and Biomolecular Engineering at the Korea Advanced Institute of Science and Technology (KAIST) has developed a technique to analyze various target DNAs using an aptamer, a DNA fragment that can recognize and bind to a specific protein or enzyme. This technique should allow the development of affordable genetic diagnostics for new bacteria or viruses, such as Middle East Respiratory Syndrome (MERS). The research findings were published in the June issue of Chemical Communications, issued by the Royal Society of Chemistry in the United Kingdom, and the paper was selected as a lead article of the journal.
SSDs and other flash memory devices will soon get cheaper and larger thanks to big announcements from Toshiba and Intel. Both companies revealed new “3D NAND” memory chips that are stacked in layers to pack in more data, unlike the single-plane chips currently in use. Toshiba said that it has created the world’s first 48-layer NAND, yielding a 16GB chip with boosted speed and reliability. The Japanese company invented flash memory in the first place and has the smallest NAND cells in the world at 15nm. Toshiba is now giving manufacturers engineering samples, but products using the new chips won’t arrive for another year or so.