
Graeme Ross: “Once again the overriding need to measure the immeasurable raises its ugly head. Statistics are proof of ignorance. Numbers are not knowledge. It has been mooted that we are a mental construct that incorporates multiple personas in our subconscious and semi-conscious mind. Find the theory for yourself; I won’t quote what you can find yourselves. If we are a construct, ever-changing, ever-evolving in complexity and moment-to-moment inner focus, and if, as has been mooted, we hold constant and endless conversation with these ever-changing inner mental personas, then it follows that without capturing that process in mid-flight (as it were) we cannot deduce the reasoning that results from these conversations. Therefore we are not able to quantify these processes in any way at all. It is ephemeral. Thought takes place in the interval between knowing and asking. Trying to build a machine that will think would take far more resources than mankind will ever possess.”


Abstract: This paper outlines the Independent Core Observer Model (ICOM) Theory of Consciousness, defined as a computational model of consciousness that is objectively measurable. Consciousness is treated as an abstraction produced by a mathematical model: the system’s experience is subjective only from the point of view of the abstracted logical core, or conscious part, of the system, while in the core of the system it is modeled objectively. Given the lack of agreed-upon definitions in consciousness theory, this paper sets precise definitions designed to act as a foundation, or baseline, for further theoretical and real-world research in ICOM-based AGI (Artificial General Intelligence) systems whose qualia can be measured objectively.

Published via Conference/Review Board: ICIST 2018 – International Conference on Information Science and Technology – China – April 20-22nd. (IEEE conference) [release pending] and https://www.itm-conferences.org/

Introduction

The Independent Core Observer Model Theory of Consciousness builds in part on the Computational Theory of Mind (Rescorla 2016). One of the core issues with research into artificial general intelligence (AGI) is the absence of objective measurements and data: results remain ambiguous given the lack of agreed-upon objective measures of consciousness (Seth 2007). To continue serious work in the field, we need to measure consciousness in a consistent way that does not presuppose any particular theory of the nature of consciousness (Dienes and Seth 2012) and is not dependent on particular ways of measuring biological systems (Dienes and Seth 2010), but instead focuses on the elements of a conscious mind in the abstract. Even for the more nebulous Computational Theory of Mind, research into the human brain shows some underlying evidence.

Read more

Spin-based quantum computers have the potential to tackle difficult mathematical problems that cannot be solved using ordinary computers, but many problems remain in making these machines scalable. Now, an international group of researchers led by the RIKEN Center for Emergent Matter Science has crafted a new architecture for quantum computing. By constructing a hybrid device made from two different types of qubit (the fundamental computing element of quantum computers), they have created a device that can be quickly initialized and read out, and that simultaneously maintains high control fidelity.


Single-spin qubits in semiconductor quantum dots hold promise for universal quantum computation, with demonstrations of a single-qubit gate fidelity above 99.9% and two-qubit gates in conjunction with a long coherence time. However, initialization and readout of a qubit are orders of magnitude slower than control, which is detrimental for implementing measurement-based protocols such as error-correcting codes. In contrast, a singlet-triplet qubit, encoded in a two-spin subspace, has the virtue of fast readout with high fidelity. Here, we present a hybrid system which benefits from the different advantages of these two distinct spin-qubit implementations. A quantum interface between the two codes is realized by electrically tunable inter-qubit exchange coupling. We demonstrate a controlled-phase gate that acts within 5.5 ns, much faster than the measured dephasing time of 211 ns. The presented hybrid architecture will be useful to settle remaining key problems with building scalable spin-based quantum computers.
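As a minimal sketch of what the controlled-phase gate above does (the matrix and input state here are textbook illustrations, not the experiment’s actual Hamiltonian): in the two-qubit computational basis, a controlled-phase (CZ) gate flips the sign of only the |11⟩ amplitude, which turns a product of superposition states into an entangled state.

```python
import numpy as np

# Controlled-phase (CZ) gate in the basis |00>, |01>, |10>, |11>.
CZ = np.diag([1, 1, 1, -1]).astype(complex)

# Start both qubits in the |+> = (|0> + |1>)/sqrt(2) superposition.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
state = np.kron(plus, plus)  # product state, amplitudes all 1/2

entangled = CZ @ state

# Only the |11> amplitude changes sign: amplitudes are now
# proportional to [1, 1, 1, -1], an entangled (Bell-like) state.
print(entangled)
```

The conditional phase in the experiment is accumulated via the electrically tunable exchange coupling; the point of the 5.5 ns figure is that many such gates fit within the 211 ns dephasing time.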

Read more

“As a result, it’s nonsensical to ask what happens to space-time beyond the Cauchy horizon because space-time, as it’s regarded within the theory of general relativity, no longer exists. This gives one a way out of this philosophical conundrum,” said Dafermos.


Mathematicians have disproved the strong cosmic censorship conjecture. Their work answers one of the most important questions in the study of general relativity and changes the way we think about space-time.

Read more

Scientists at the University of Würzburg have been able to boost current super-resolution microscopy by a novel tweak. They coated the glass cover slip as part of the sample carrier with tailor-made biocompatible nanosheets that create a mirror effect. This method shows that localizing single emitters in front of a metal-dielectric coating leads to higher precision, brightness and contrast in Single Molecule Localization Microscopy (SMLM). The study was published in the Nature journal Light: Science and Applications.

The sharpness of a microscope is limited: structures that are closer together than 0.2 thousandths of a millimeter blur and can no longer be distinguished from each other. The cause of this blurring is diffraction. Each point-shaped object is therefore shown not as a point, but as a blurry spot.
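The 0.2-thousandths-of-a-millimeter figure follows from the standard Abbe diffraction limit, d = λ / (2·NA). A quick sanity check with illustrative (assumed) numbers for green light and a high-NA oil-immersion objective:

```python
# Abbe diffraction limit: d = wavelength / (2 * numerical aperture).
# Example values (assumed, not from the article):
wavelength_nm = 550          # green light
numerical_aperture = 1.4     # typical oil-immersion objective

d_nm = wavelength_nm / (2 * numerical_aperture)
print(f"{d_nm:.0f} nm")  # about 200 nm = 0.2 thousandths of a millimeter
```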

With single-molecule localization, the resolution can still be drastically improved. The method calculates the exact center of each emitter from the brightness distribution of its blurry spot. However, it only works if two closely adjacent points of the object are made visible not simultaneously but sequentially, and are merged later in the reconstructed image. This temporal decoupling prevents superimposition of the blurry spots. For years, researchers have been using this tricky method for super-resolution light microscopy of cells.
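The localization step described above can be sketched as follows (a toy model with a simulated Gaussian spot and a simple intensity-weighted centroid; real SMLM software typically fits a full point-spread-function model):

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid of a 2-D image, in pixel units."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    return (xs * image).sum() / total, (ys * image).sum() / total

# Simulate the blurry spot of a single emitter at (12.3, 8.7) pixels:
# a Gaussian several pixels wide, like a diffraction-limited spot.
ys, xs = np.indices((25, 25))
true_x, true_y = 12.3, 8.7
spot = np.exp(-((xs - true_x) ** 2 + (ys - true_y) ** 2) / (2 * 3.0 ** 2))

# The centroid recovers the emitter position with sub-pixel precision,
# far better than the width of the blurry spot itself.
est_x, est_y = centroid(spot)
print(est_x, est_y)
```

This is why the temporal decoupling matters: the centroid is only meaningful when each spot comes from a single emitter, not from several overlapping ones.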

Read more

by Eloisa Marchesoni

Today, I will talk about the recent creation of truly intelligent machines: machines able to solve difficult problems, to recreate the creativity and versatility of the human mind, and not only to excel at a single activity but to abstract general information and find solutions that are unthinkable for us. I will not talk about blockchain, but about another revolution (less economic and more mathematical) that is all about computing: quantum computers.

Quantum computing is not really new; we have been talking about it for a couple of decades already, but we are only now witnessing the transition from theory to realization of the technology. Quantum computers were first theorized at the beginning of the 1980s, but only in the last few years, thanks to the commitment of companies like Google and IBM, has a strong push driven the development of these machines. A quantum computer uses quantum particles (imagine them to be like electrons or photons) to process information. The particles act as positive or negative (i.e., the 0 and the 1 that we are used to seeing in traditional computer science) alternately or at the same time, thus generating quantum bits of information called “qubits”, which can have value 0, 1, or a quantum superposition of 0 and 1.
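The superposition idea in the paragraph above can be written down directly. A qubit is a complex 2-vector α|0⟩ + β|1⟩ with |α|² + |β|² = 1; measurement yields 0 or 1 with probabilities |α|² and |β|². A minimal illustration (plain state-vector arithmetic, no quantum library assumed):

```python
import numpy as np

# A qubit as a complex 2-vector: alpha|0> + beta|1>,
# normalized so that |alpha|^2 + |beta|^2 = 1.
alpha = 1 / np.sqrt(2)
beta = 1 / np.sqrt(2)
qubit = np.array([alpha, beta], dtype=complex)

# Measurement probabilities are the squared amplitude magnitudes:
# an equal superposition gives 0 and 1 each with probability 1/2.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]
```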

Read more