
Until 350 years ago, there was a distinction between what people saw on Earth and what they saw in the sky. There did not seem to be any connection.

Then, in 1687, Isaac Newton showed that the planets move due to the same forces we experience here on Earth. If such things could be explained with mathematics, to many people this called into question the need for a God.

But in the late 20th century, arguments for God were resurrected. The standard model of particle physics and general relativity are remarkably accurate. But there are constants in these equations that have no explanation; they simply have to be measured. Many of them seem to be very finely tuned.

Proponents point out, for example, that the mass of a neutrino is about 2×10^-37 kg. It is argued that if this mass were off by just one decimal place, life could not exist: if the mass were too high, the additional gravity would cause the universe to collapse; if it were too low, galaxies could not form because the universe would have expanded too quickly.

On closer examination, this argument has some problems. It exaggerates the degree of fine-tuning by using misleading units of measurement, making the tuning seem far more improbable than it may be. The neutrino's mass is quoted in kilograms, and using kilograms to measure something this small is the equivalent of measuring a person's height in light-years. A more appropriate unit for the neutrino would be electron volts or picograms.
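To see the unit argument concretely, here is a minimal back-of-the-envelope sketch in Python that converts the quoted 2×10^-37 kg figure into electron volts via E = mc². The constants are standard physical values; the snippet only illustrates why kilograms are an awkward unit at this scale, and is not a calculation taken from the video.

```python
# Convert the quoted neutrino mass from kilograms to electron volts via E = m * c^2.
M_NEUTRINO_KG = 2e-37      # mass as quoted in the passage
C = 2.998e8                # speed of light, m/s
J_PER_EV = 1.602e-19       # joules per electron volt

energy_joules = M_NEUTRINO_KG * C ** 2
energy_ev = energy_joules / J_PER_EV

print(f"{M_NEUTRINO_KG} kg  ~  {energy_ev:.2f} eV")  # roughly 0.11 eV, a far more natural number
```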

Another point is that most of the constants could not really take any arbitrary value; they are bound to hover near the values they actually have. The mass of a neutrino could not be the mass of a bowling ball, because particles that massive with the properties of a neutrino could not have been created during the Big Bang.

China’s Ministry of Industry and Information Technology (MIIT) on Saturday released its second batch of extended goals for promoting the use of China’s 5G network and the Industrial Internet of Things (IIoT).

IIoT refers to the interconnection between sensors, instruments and other devices to enhance manufacturing efficiency and industrial processes. With a strong focus on machine-to-machine communication, big data and machine learning, the IIoT has been applied across many industrial sectors and applications.

The MIIT announced that the 5G IIoT will be applied in the petrochemical industry, building materials, ports, textiles and home appliances as the 2021 China 5G + Industrial Internet Conference kicked off Saturday in Wuhan, central China’s Hubei Province.

Researchers at the USC Viterbi School of Engineering are using generative adversarial networks (GANs)—technology best known for creating deepfake videos and photorealistic human faces—to improve brain-computer interfaces for people with disabilities.

In a paper published in Nature Biomedical Engineering, the team successfully taught an AI to generate synthetic brain activity data. The data, known as spike trains, can be fed into machine-learning algorithms to improve the usability of brain-computer interfaces (BCIs).

BCI systems work by analyzing a person’s brain signals and translating them into commands, allowing the user to control devices such as computer cursors using only their thoughts. These devices can improve quality of life for people with motor dysfunction or paralysis, even those with locked-in syndrome, in which a person is fully conscious but unable to move or communicate.
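The article does not describe the network architecture, but a generative adversarial setup for this task could, in rough outline, look like the sketch below: a generator maps random noise to binned spike-count vectors, while a discriminator learns to distinguish real recordings from synthetic ones. The class names, layer sizes, and binning are illustrative assumptions, not details from the Nature Biomedical Engineering paper.

```python
# Hedged sketch of a GAN for synthetic spike-train data (illustrative assumptions throughout).
import torch
import torch.nn as nn

N_BINS = 100   # assumed: each trial represented as spike counts in 100 time bins
LATENT = 16    # assumed latent-noise dimension

class SpikeGenerator(nn.Module):          # hypothetical name
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT, 64), nn.ReLU(),
            nn.Linear(64, N_BINS), nn.Softplus(),  # non-negative, firing-rate-like outputs
        )

    def forward(self, z):
        return self.net(z)

class SpikeDiscriminator(nn.Module):      # hypothetical name
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_BINS, 64), nn.ReLU(),
            nn.Linear(64, 1),                      # raw logit: real vs. synthetic
        )

    def forward(self, x):
        return self.net(x)

G, D = SpikeGenerator(), SpikeDiscriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    """One adversarial update: D learns to spot fakes, G learns to fool D."""
    batch = real_batch.size(0)
    fake = G(torch.randn(batch, LATENT))

    # Discriminator: push real recordings toward 1, synthetic ones toward 0
    opt_d.zero_grad()
    d_loss = bce(D(real_batch), torch.ones(batch, 1)) + bce(D(fake.detach()), torch.zeros(batch, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator label fakes as real
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

Synthetic trials sampled from a trained generator of this kind could then be mixed with a small amount of real data when calibrating a BCI decoder, which is the kind of usability gain the article describes.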

There’s a multibillion-dollar race going on to build the first complete map of the brain, something scientists are calling the “connectome.” It involves slicing the brain into thousands of pieces, and then digitally stitching them back together using a powerful AI algorithm.


Rutgers researchers and their collaborators have found that learning — a universal feature of intelligence in living beings — can be mimicked in synthetic matter, a discovery that in turn could inspire new algorithms for artificial intelligence (AI).

The study appears in the journal PNAS.

One of the fundamental characteristics of humans is the ability to continuously learn from and adapt to changing environments. But until recently, AI has been narrowly focused on emulating human logic. Now, researchers are looking to mimic human cognition in devices that can learn, remember and make decisions the way a human brain does.

There are billions of people around the world whose online experience is shaped by algorithms that rely on artificial intelligence (AI) and machine learning (ML). Some form of AI or ML is employed almost every time people go online, whether they are searching for content, watching a video, or shopping for a product. These technologies not only make consumption more efficient and more accurately targeted; in the online ecosystem, service providers also innovate upon and monetize behavioral data captured directly from a user’s device, from website visits, or by third parties.

Advertisers are increasingly dependent on this data and the algorithms that adtech and martech employ to understand where their ads should be placed, which ads consumers are likely to engage with, which audiences are most likely to convert, and which publisher should get credit for conversions.

Additionally, the collection and better utilization of data helps publishers generate revenue, minimize data risks and costs, and provide relevant consumer-preference-based audiences for brands.

A team of researchers from Tri Alpha Energy Inc. and Google has developed an algorithm that can be used to speed up experiments conducted with plasma. In their paper published in the journal Scientific Reports, the group describes how they plan to use the algorithm in nuclear fusion research.

As research into harnessing fusion energy has progressed, scientists have found that some of its characteristics are too complex to be worked out in a reasonable amount of time using current technology. So they have increasingly turned to computers for help. More specifically, they want to adjust certain parameters of a device built to achieve fusion in a practical way. Such a device, most in the field agree, must involve the creation of a certain type of plasma that is not too hot or too cold, is stable, and has the desired density.

Finding the right parameters that meet these conditions has involved an enormous amount of trial and error. In this new effort, the researchers sought to reduce the workload by using a computer algorithm to cut down the number of trials needed. To that end, they created what they call the “optometrist’s algorithm.” In its most basic sense, it works like an optometrist measuring a patient’s vision by showing them images and asking whether each is better or worse than the last. The idea is to combine the number-crunching power of a computer with the judgment of a human being: the computer generates the options, and the human tells it whether a given option is better or worse.
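A rough sketch of what such a human-in-the-loop search could look like is below. The parameter names, the proposal rule, and the stand-in "preference" function are toy assumptions; in the real experiments the comparison is made by a human physicist looking at actual plasma shots, and the published procedure may differ in its details.

```python
# Toy sketch of an optometrist-style search: the computer proposes settings,
# a human picks "better or worse" between the incumbent and the candidate.
import random

def propose_neighbor(settings, scale=0.05):
    """Computer's half of the loop: jitter each machine parameter slightly."""
    return {k: v * (1 + random.uniform(-scale, scale)) for k, v in settings.items()}

def human_prefers(candidate, incumbent):
    """Stand-in for the human expert's judgment; here a hidden toy score decides."""
    def score(s):  # assumed toy objective, not a real plasma model
        return -((s["field"] - 1.0) ** 2 + (s["density"] - 0.5) ** 2)
    return score(candidate) > score(incumbent)

settings = {"field": 0.8, "density": 0.7}   # hypothetical starting machine parameters
for shot in range(200):
    candidate = propose_neighbor(settings)
    if human_prefers(candidate, settings):  # keep whichever option the "human" prefers
        settings = candidate

print(settings)  # drifts toward the preferred operating point over many shots
```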

Dr. Ben Goertzel with the Philip K. Dick robot at the Web Summit in Lisbon, 2019.

Ben showcases the use of OpenCog within the SingularityNET environment, which powers the AI of the Philip K. Dick robot.

We apologise for the poor audio quality.

SingularityNET is a decentralized marketplace for artificial intelligence. We aim to create the world’s global brain with a full-stack AI solution powered by a decentralized protocol.

We gathered the leading minds in machine learning and blockchain to democratize access to AI technology. Now anyone can take advantage of a global network of AI algorithms, services, and agents.


Turbulence makes many people uneasy or downright queasy. And it’s given researchers a headache, too. Mathematicians have been trying for a century or more to understand the turbulence that arises when a flow interacts with a boundary, but a formulation has proven elusive.

Now an international team of mathematicians, led by UC Santa Barbara professor Björn Birnir and University of Oslo professor Luiza Angheluta, has published a complete description of boundary turbulence. The paper appears in Physical Review Research and synthesizes decades of work on the topic. The theory unites empirical observations with the Navier-Stokes equation (the mathematical foundation of fluid dynamics) into a single mathematical description.
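For reference, the incompressible Navier-Stokes equations the article alludes to are commonly written as below (standard textbook form; the paper's specific boundary-layer formulation is not reproduced here):

```latex
% Incompressible Navier-Stokes equations (standard form)
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  &= -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f},\\
\nabla\cdot\mathbf{u} &= 0,
\end{aligned}
```

where u is the flow velocity, p the pressure, ρ the density, ν the kinematic viscosity, and f any body force acting on the fluid.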

This phenomenon was first described around 1920 by Hungarian physicist Theodore von Kármán and German physicist Ludwig Prandtl, two luminaries in fluid dynamics. “They were honing in on what’s called boundary layer turbulence,” said Birnir, director of the Center for Complex and Nonlinear Science. This is turbulence caused when a flow interacts with a boundary, such as the fluid’s surface, a pipe wall, the surface of the Earth and so forth.

A group of scientists at the U.S. Department of Energy’s Ames Laboratory has developed computational quantum algorithms that are capable of efficient and highly accurate simulations of static and dynamic properties of quantum systems. The algorithms are valuable tools to gain greater insight into the physics and chemistry of complex materials, and they are specifically designed to work on existing and near-future quantum computers.

Scientist Yong-Xin Yao and his research partners at Ames Lab use the power of advanced computers to speed discovery in condensed matter physics, modeling the incredibly complex quantum mechanics of materials and how they change over ultrafast timescales. Current high-performance computers can model the properties of very simple, small quantum systems, but larger or more complex systems rapidly expand the number of calculations a computer must perform to arrive at an accurate answer, slowing the pace not only of computation but also of discovery.

“This is a real challenge given the current early-stage of existing quantum computing capabilities,” said Yao, “but it is also a very promising opportunity, since these calculations overwhelm classical computer systems, or take far too long to provide timely answers.”
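The exponential scaling Yao alludes to can be illustrated with a few lines of Python: the memory needed just to store the state vector of an n-particle two-level quantum system grows as 2^n. The byte counts below are a generic illustration, not a description of the Ames Laboratory algorithms.

```python
# Why classical simulation of quantum systems hits a wall: the state vector of
# n two-level particles (spins/qubits) holds 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"n = {n:2d} particles -> {amplitudes:>16,d} amplitudes ~ {gib:,.1f} GiB")
```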