
But where advocates like Foxx mostly see the benefits of transhumanism, some critics say it raises ethical concerns about risk, and others point to its potential to exacerbate social inequality.


Foxx says humans have long used technology to make up for physical limitations — think of prosthetics, hearing aids, or even telephones. More controversial technology aimed at enhancing or even extending life, such as cryogenic freezing, is also charted terrain.

The transhumanist movement isn’t large, but Foxx says there is growing awareness of, and interest in, technology used to enhance or supplement physical capability.

This is perhaps unsurprising given that we live in an era in which scientists are working to create artificial intelligence that can read your mind and millions of people spend much of their day clutching a supercomputer in the form of a smartphone.

Read more

Japan’s government is facing serious fiscal challenges, but its main science ministry appears hopeful that the nation is ready to once again back basic research in a big way. The Ministry of Education (MEXT) on 31 August announced an ambitious budget request that would allow Japan to compete for the world’s fastest supercomputer, build a replacement x-ray space observatory, and push ahead with a massive new particle detector.


Proposed successor to Super-Kamiokande, exascale computer and x-ray satellite win backing.

Read more

Realistic climate simulations require huge reserves of computational power. An LMU study now shows that new algorithms allow interactions in the atmosphere to be modeled more rapidly without loss of reliability.

Forecasting global and local climates requires the construction and testing of mathematical models. Since such models must incorporate a plethora of physical processes and interactions, climate simulations require enormous amounts of computing power. And even the best models inevitably have limitations, since the phenomena involved can never be modeled in sufficient detail. In a project carried out in the context of the DFG-funded Collaborative Research Center “Waves to Weather”, Stephan Rasp of the Institute of Theoretical Meteorology at LMU (Director: Professor George Craig) has now looked at the question of whether the application of machine learning can improve the efficacy of climate modelling. The study, which was performed in collaboration with Professor Mike Pritchard of the University of California at Irvine and Pierre Gentine of Columbia University in New York, appears in the journal PNAS.

General circulation models typically simulate the global behavior of the atmosphere on grids whose cells have dimensions of around 50 km. Even using state-of-the-art supercomputers, the relevant processes that take place in the atmosphere are simply too complex to be modelled at the necessary level of detail. One prominent example concerns the modelling of clouds, which have a crucial influence on climate: they transport heat and moisture, produce precipitation, and absorb and reflect solar radiation. Many clouds extend over distances of only a few hundred meters, much smaller than the grid cells typically used in simulations, and they are highly dynamic. Both features make them extremely difficult to model realistically. Hence today’s models lack at least one vital ingredient and, in this respect, provide only an approximate description of the Earth system.
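To make the approach concrete, here is a minimal sketch of the general idea: train a neural network to emulate an expensive subgrid scheme from column-wise inputs, then call the cheap emulator for every grid column at run time. The data, dimensions and network configuration below are illustrative assumptions, not the setup used in the study.

```python
# Hypothetical sketch: emulating a subgrid cloud/convection scheme with a
# neural network. All names, dimensions and data here are assumptions made
# for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Inputs: coarse-grid column state (e.g. temperature and humidity profiles
# plus a few surface fields) flattened to 64 features per grid column.
n_samples, n_features, n_targets = 10_000, 64, 60
X = rng.standard_normal((n_samples, n_features))

# Targets: the subgrid heating/moistening tendencies that an expensive
# high-resolution model or conventional scheme would produce for each column.
# A synthetic nonlinear function stands in for real training data.
true_map = rng.standard_normal((n_features, n_targets))
y = np.tanh(X @ true_map) + 0.05 * rng.standard_normal((n_samples, n_targets))

# A modest fully connected network learns the column-wise mapping.
emulator = MLPRegressor(hidden_layer_sizes=(256, 256),
                        activation="relu",
                        max_iter=50,
                        random_state=0)
emulator.fit(X, y)

# At run time, the climate model would call the cheap emulator for every
# grid column and time step instead of the expensive subgrid calculation.
tendencies = emulator.predict(X[:5])
print(tendencies.shape)  # (5, 60)
```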

Read more

An AI is set to try to work out how a potentially limitless supply of energy could be harnessed on Earth.

It could finally solve the mysteries of fusion power, letting researchers capture and control the process that powers the sun and stars.

Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University hope to harness a massive new supercomputer to work out how the doughnut-shaped devices known as tokamaks can be used to produce fusion power.

Read more

https://youtube.com/watch?v=2qTuZlMvFgY

Researchers involved in the Blue Brain Project – which aims to create a digital reconstruction of the brain – have announced the deployment of a next-generation supercomputer.

Image credit: HPE

Ecole Polytechnique Fédérale de Lausanne (EPFL), the Swiss university and research institute developing the Blue Brain Project, has announced that it has selected Hewlett Packard Enterprise (HPE) to build a next-generation supercomputer that will model and simulate the mammalian brain in greater detail than ever before. The powerful new machine, called “Blue Brain 5”, will be dedicated to simulation neuroscience, in particular simulation-based research, analysis and visualisation, to advance the understanding of the brain.

Read more

An international team of scientists from Eindhoven University of Technology, the University of Texas at Austin, and the University of Derby has developed a revolutionary method that quadratically accelerates artificial intelligence (AI) training algorithms. This gives full AI capability to inexpensive computers and could, within one to two years, make it possible for supercomputers to utilize artificial neural networks that quadratically exceed the capabilities of today’s networks. The scientists presented their method on June 19 in the journal Nature Communications.

Artificial Neural Networks (or ANN) are at the very heart of the AI revolution that is shaping every aspect of society and technology. But the ANNs that we have been able to handle so far are nowhere near solving very complex problems. The very latest supercomputers would struggle with a 16 million-neuron network (just about the size of a frog brain), while it would take over a dozen days for a powerful desktop computer to train a mere 100,000-neuron network.
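For a sense of how such a speed-up can arise, the sketch below shows, with assumed layer sizes and sparsity, the bookkeeping behind sparse-connectivity training: only a small fraction of a layer’s possible weights are kept, and the weakest connections are periodically pruned and regrown at random. It is a toy illustration of the general idea, not the published implementation.

```python
# Toy illustration of sparse-connectivity training with prune-and-regrow
# topology updates. Layer sizes and sparsity are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 2048, 2048          # a dense layer here would hold ~4.2M weights
epsilon = 10                      # keep roughly epsilon*(n_in + n_out) weights

dense_params = n_in * n_out
sparse_params = epsilon * (n_in + n_out)

# Random sparse topology: a boolean mask marks which weights exist.
mask = np.zeros(dense_params, dtype=bool)
mask[rng.choice(dense_params, size=sparse_params, replace=False)] = True
mask = mask.reshape(n_in, n_out)
weights = np.where(mask, rng.standard_normal((n_in, n_out)) * 0.01, 0.0)

print(f"dense: {dense_params:,} weights, sparse: {int(mask.sum()):,} weights")

def evolve_topology(weights, mask, fraction=0.3):
    """One prune-and-regrow step: drop the weakest active connections and
    add the same number of new ones at random empty positions."""
    active = np.flatnonzero(mask)
    k = int(fraction * active.size)

    # Prune the k active connections with the smallest |weight|.
    weakest = active[np.argsort(np.abs(weights.flat[active]))[:k]]
    mask.flat[weakest] = False
    weights.flat[weakest] = 0.0

    # Regrow k connections at random positions that are currently empty.
    empty = np.flatnonzero(~mask)
    new = rng.choice(empty, size=k, replace=False)
    mask.flat[new] = True
    weights.flat[new] = rng.standard_normal(k) * 0.01
    return weights, mask

weights, mask = evolve_topology(weights, mask)
print(f"after one evolution step: {int(mask.sum()):,} active weights")
```

Keeping the number of stored connections roughly proportional to the number of neurons, rather than to its square, is what allows far larger networks to fit in the same memory and compute budget.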

Read more

Thomas Sterling has retracted his prediction that we will never reach zettaFLOPS computers. He now predicts that zettaFLOPS can be achieved in less than 10 years if innovations in non-von Neumann architectures can be scaled up, and that with a shift to cryogenic technologies we could reach yottaFLOPS by 2030.
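As a rough sense of scale, the back-of-the-envelope calculation below shows the sustained year-over-year growth such a timeline would imply. The roughly 0.2-exaFLOPS 2018 starting point is an assumption used only for illustration; the prediction itself gives no baseline.

```python
# Implied annual growth factors for the zettaFLOPS and yottaFLOPS targets,
# assuming a ~0.2 exaFLOPS starting point in 2018 (an illustrative figure).
baseline_flops = 0.2e18          # assumed 2018 starting point (~200 petaFLOPS)
zetta, yotta = 1e21, 1e24

years_to_zetta = 10
growth_zetta = (zetta / baseline_flops) ** (1 / years_to_zetta)
print(f"zettaFLOPS in {years_to_zetta} years needs ~{growth_zetta:.2f}x per year")

years_to_yotta = 2030 - 2018
growth_yotta = (yotta / baseline_flops) ** (1 / years_to_yotta)
print(f"yottaFLOPS by 2030 needs ~{growth_yotta:.2f}x per year")
```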

Read more

Chinese physicists have realized genuine entanglement of 18 quantum particles, beating their own world record set in 2016, and the team has set its next goal at 50-qubit entanglement.

The result of the study was published in the US journal Physical Review Letters on June 28. Leading Chinese quantum physicist Pan Jianwei led the project. Together with his team, Pan demonstrated quantum entanglement with 10 quantum bits, or “qubits,” in 2016, according to a report sent by Pan’s team to the Global Times on Tuesday.

Quantum entanglement is a strange phenomenon, which Einstein called “spooky action at a distance,” in which quantum particles remain connected “even if they are at opposite ends of the universe,” the Australia-based Cosmos Magazine reported.
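For intuition about what multi-particle entanglement means, here is a small numerical sketch of an idealized 18-qubit GHZ (“all-or-nothing”) state, in which a measurement of any one particle fixes the outcomes of all the others. This is a toy illustration, not a model of the photonic experiment itself.

```python
# Minimal numerical sketch of an n-particle GHZ entangled state.
import numpy as np

n = 18                           # number of qubits
dim = 2 ** n

# GHZ state: equal superposition of |000...0> and |111...1>.
state = np.zeros(dim, dtype=complex)
state[0] = 1 / np.sqrt(2)        # |00...0>
state[-1] = 1 / np.sqrt(2)       # |11...1>

# Sample computational-basis measurements: every shot yields either all zeros
# or all ones, so the outcomes of all 18 qubits are perfectly correlated even
# though each individual qubit on its own is 50/50 random.
rng = np.random.default_rng(42)
probs = np.abs(state) ** 2
shots = rng.choice(dim, size=5, p=probs)
for s in shots:
    print(format(int(s), f"0{n}b"))
```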

Read more

China has 206 of the top 500 supercomputers — compared to the U.S.’s 124.


America is now home to the world’s speediest supercomputer. But the new list of the 500 swiftest machines underlines how much faster China is building them.

The list, published Monday, shows Chinese companies and the Chinese government pulling away as the most prolific producers of supercomputers, with 206 of the top 500. American corporations and the United States government designed and made 124 of the supercomputers on the list.

For years, the United States dominated the supercomputer market. But two years ago, China pulled even on the Top 500 list. China moved decisively ahead last fall and extended the gap in the latest tally.

Read more