
In their book “Era of Exponential Encryption — Beyond Cryptographic Routing,” the authors present a vision of rapidly multiplying options for encryption and decryption. Like the grain of rice that doubles on each square of a chessboard, newer concepts and implementations in cryptography multiply these options: both encryption and decryption require more session-related and multiple keys, so that numerous configurations exist even for hybrid encryption: different keys and algorithms, symmetric and asymmetric methods, or modern multiple encryption, in which ciphertext is converted again and again into new ciphertext. The book analyzes how a handful of newer applications, e.g. Spot-On, the GoldBug E-Mail Client & Crypto Chat Messenger, and other open-source programs, implement these encryption mechanisms.

Renewing a key several times within a dedicated session, via “cryptographic calling,” has extended the term “perfect forward secrecy” into “instant perfect forward secrecy” (IPFS). Even more: if a bunch of keys is sent in advance, decoding a message must consider not just one current session key but the dozens of keys delivered before the message arrives. The IPFS paradigm has already evolved into the newer concept of Fiasco Keys: keys that provide over a dozen possible ephemeral keys within one session and define Fiasco Forwarding, the approach that complements and follows IPFS. Further, by adding routing and graph theory to the encryption process, a constant part of the so-called Echo Protocol, an encrypted packet may take different graphs and routes within the network. If routing no longer requires destination information but is instead handled cryptographically, the current status shifts to a new age: the Era of Exponential Encryption, as the authors envision and describe it.
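For a concrete feel of “ciphertext converted again and again into new ciphertext,” here is a minimal sketch of layered multiple encryption in Python. The layering idea is generic: the Fernet cipher and the three-key loop are illustrative assumptions, not Spot-On’s or GoldBug’s actual mechanism.

```python
# Minimal sketch of multiple encryption: each pass turns ciphertext
# into new ciphertext under a different session key. Fernet is only
# a convenient stand-in cipher, not the Echo Protocol's real choice.
from cryptography.fernet import Fernet

keys = [Fernet.generate_key() for _ in range(3)]  # several session keys

def multi_encrypt(message: bytes, keys: list) -> bytes:
    for key in keys:
        message = Fernet(key).encrypt(message)  # ciphertext -> ciphertext
    return message

def multi_decrypt(blob: bytes, keys: list) -> bytes:
    for key in reversed(keys):  # peel the layers off in reverse order
        blob = Fernet(key).decrypt(blob)
    return blob

blob = multi_encrypt(b"hello echo", keys)
assert multi_decrypt(blob, keys) == b"hello echo"
```

A receiver lacking even one of the layered keys recovers nothing but ciphertext, which is why multiplying keys multiplies the decoding options an attacker must consider.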

Read more

RSA encryption is an essential safeguard for our online communications. Yet it was destined to fail even before the Internet made RSA necessary, thanks to the work of Peter Shor, whose 1994 algorithm proved that quantum computers could solve problems classical computers practically cannot.

Read more

WhatsApp has admitted to a major cybersecurity breach that enabled both iPhone and Android devices to be targeted with spyware from Israel’s NSO Group. This is a serious blow for WhatsApp, whose encrypted voice calls are seen as a secure alternative to standard calls.

Read more

In conventional holography, a photographic film records the interference pattern of monochromatic light scattered from the object to be imaged with a reference beam of unscattered light. Scientists can then illuminate the developed film with a replica of the reference beam to create a virtual image of the original object. Holography was originally proposed by the physicist Dennis Gabor in 1948 to improve the resolution of the electron microscope, and it was first demonstrated using light optics. A hologram is formed by capturing the phase and amplitude distribution of a signal by superimposing it with a known reference. The original concept was followed by holography with electrons, and after the invention of lasers, optical holography became a popular technique for 3-D imaging of macroscopic objects, information encryption, and microscopy.
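The phase-and-amplitude point can be made precise with the standard interference relation (generic textbook notation, not taken from the study):

```latex
% Signal and reference waves: S = A_S e^{i\phi_S}, \quad R = A_R e^{i\phi_R}
% Intensity recorded on the film:
I = |S + R|^2 = A_S^2 + A_R^2 + 2 A_S A_R \cos(\phi_S - \phi_R)
```

The cross term carries the phase difference between signal and reference, which is exactly the information an ordinary intensity photograph discards.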

However, extending holograms to the ultrafast domain currently remains a challenge with electrons, although developing the technique would allow the highest possible combined spatiotemporal resolution for advanced imaging applications in condensed matter physics. In a recent study now published in Science Advances, Ivan Madan and an interdisciplinary research team in the departments of Ultrafast Microscopy and Electron Scattering, Physics, and Science and Technology in Switzerland, the U.K., and Spain detailed the development of a hologram using local electromagnetic fields. The scientists obtained the electromagnetic holograms with combined attosecond/nanometer resolution in an ultrafast transmission electron microscope (UEM).

In the new method, the scientists relied on electromagnetic fields to split an electron wave function into a quantum superposition of different energy states. The technique deviates from the conventional method, in which the signal of interest and the reference are spatially separated and then recombined to reconstruct the amplitude and phase of the signal and subsequently form a hologram. The principle can be extended to any kind of detection configuration involving a periodic signal capable of undergoing interference, including sound waves, X-rays, or femtosecond pulse waveforms.

Read more

Checking out a stack of books from the library is as simple as searching the library’s catalog and using unique call numbers to pull each book from its shelf location. Using a similar principle, scientists at the Center for Functional Nanomaterials (CFN)—a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory—are teaming with Harvard University and the Massachusetts Institute of Technology (MIT) to create a first-of-its-kind automated system to catalog atomically thin two-dimensional (2-D) materials and stack them into layered structures. Called the Quantum Material Press, or QPress, this system will accelerate the discovery of next-generation materials for the emerging field of quantum information science (QIS).

Structures obtained by stacking single atomic layers (“flakes”) peeled from different parent bulk crystals are of interest because of the exotic electronic, magnetic, and optical properties that emerge at such small (quantum) size scales. However, flake exfoliation is currently a manual process that yields a variety of flake sizes, shapes, orientations, and numbers of layers. Scientists use optical microscopes at high magnification to manually hunt through thousands of flakes to find the desired ones; this search can sometimes take days or even a week, and it is prone to human error.

Once high-quality 2-D flakes from different crystals have been located and their properties characterized, they can be assembled in the desired order to create the layered structures. Stacking is very time-intensive, often taking longer than a month to assemble a single layered structure. To determine whether the generated structures are optimal for QIS applications—ranging from computing and encryption to sensing and communications—scientists then need to characterize the structures’ properties.

Read more

With new advances in technology, it all comes down to simple factoring. Classical factoring systems are outdated: some problems would take 80 billion years to solve, yet with new technologies such as the D-Wave 2 we could do the same problems in about 2 seconds. Shor’s algorithm also shows we could hack anything with it; we would simply need technology and code simple enough and strong enough. Basically, with new infrastructure we can do like jason…

RSA is the standard cryptographic algorithm on the Internet. The method is publicly known but extremely hard to crack. It uses two keys for encryption. The public key is open, and the client uses it to encrypt a random session key. Anyone who intercepts the encrypted key must use the second key, the private key, to decrypt it; otherwise, the capture is just garbage. Once the session key is decrypted, the server uses it to encrypt and decrypt further messages with a faster, symmetric algorithm. So, as long as we keep the private key safe, the communication will be secure.
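As a sketch of that hybrid flow, the snippet below uses Python’s cryptography package: RSA-OAEP protects a random session key, and a faster symmetric cipher (Fernet, chosen here purely for illustration) carries the further messages. The key size and cipher choice are assumptions, not a prescription.

```python
# Minimal sketch of RSA hybrid encryption with the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Server: generate a key pair; the public key is distributed openly.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Client: encrypt a random session key with the public key.
session_key = Fernet.generate_key()
encrypted_session_key = public_key.encrypt(session_key, oaep)

# Server: only the private key recovers the session key; to anyone
# else the intercepted blob is just garbage.
recovered = private_key.decrypt(encrypted_session_key, oaep)

# Both sides now use the faster symmetric cipher for the actual traffic.
token = Fernet(recovered).encrypt(b"further messages go here")
print(Fernet(session_key).decrypt(token))
```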

RSA encryption is based on a simple idea: prime factorization. Multiplying two prime numbers is easy, but factorizing the result is hard. For example, what are the factors of 507,906,452,803? Answer: 566,557 × 896,479.
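That asymmetry is easy to see in code: multiplying the two primes above is instantaneous, while recovering them by naive trial division already costs about 280,000 divisions for this 39-bit example (a toy sketch; real attacks use far stronger methods such as the number field sieve).

```python
# Toy demonstration of the multiply-vs-factor asymmetry.
def factor(n: int) -> tuple[int, int]:
    """Naive trial division: trivial to write, slow as n grows."""
    if n % 2 == 0:
        return 2, n // 2
    p = 3
    while p * p <= n:          # only need to test up to sqrt(n)
        if n % p == 0:
            return p, n // p
        p += 2                 # skip even candidates
    return 1, n                # no nontrivial factor found: n is prime

print(566_557 * 896_479)        # multiplication: instant
print(factor(507_906_452_803))  # factoring: ~280,000 trial divisions
```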

Based on this asymmetry in complexity, we can distribute a public key based on the product of two prime numbers to encrypt a message. But without knowing the prime factors, we cannot decrypt the message back to its original content. In 2014, WraithX used a budget of $7,600 on Amazon EC2, plus his or her own resources, to factorize a 696-bit number. With a sizeable budget, a 1024-bit key could be broken within months or a year. This is devastating, because SSL certificates holding the public key last for 28 months. Fortunately, the difficulty of the prime factorization problem grows dramatically with the key length, so we are reasonably safe now that we have already switched to 2048-bit keys.

Read more