
By Jeremy Batterson 11-09-2021

The equivalent of cheap 100-inch binoculars will soon be possible. This memo is a quick update on seven rapidly converging technologies that augur well for astronomy enthusiasts of the near future. All of these technologies already exist in either fully developed or nascent form, and all are being rapidly improved by the gigantic global cell-phone market and the retinal-projection market that will soon replace it. The seven technologies are listed first, after which they are brought together into a single system.

1) Tracking.
2) Single-photon image sensing.
3) Large effective exit pupils via large sensors.
4) Long exposure non-photographic function.
5) Flat optics (metamaterials).
6) Off-axis function of flat optics.
7) Retinal projection.

1) TRACKING: This is already widely used in so-called “go-to” telescopes, where the instrument finds any object and tracks it, so that Earth’s rotation does not carry the object out of the field of view. The viewer does not have to find the object and does not have to set up a clock drive to track it. Related techniques are used in the image-stabilization software of cameras and smartphones to prevent motion blur.
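
To see why tracking matters, here is a minimal back-of-the-envelope sketch (the numbers are standard astronomy figures, not from the memo): an untracked object on the celestial equator drifts across the field at the sidereal rate, about 15 arcseconds per second.

```python
# How fast does an untracked object drift out of view?
# The sky rotates once per sidereal day (~23 h 56 min 4 s).

SIDEREAL_DAY_S = 86164.1  # seconds in one sidereal day
DRIFT_ARCSEC_PER_S = 360 * 3600 / SIDEREAL_DAY_S  # ~15.04 arcsec/s

def drift_arcmin(seconds: float) -> float:
    """Apparent drift (in arcminutes) of an object on the celestial
    equator after `seconds` of observation without tracking."""
    return DRIFT_ARCSEC_PER_S * seconds / 60

# A half-degree (30 arcmin) field of view empties in about two minutes:
print(round(drift_arcmin(120), 1))  # ≈ 30.1 arcmin
```

This is why the clock drive (or its software equivalent) is a prerequisite for the long-exposure function described in point 4 below.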

2) SINGLE-PHOTON IMAGE SENSORS, whether of the single-photon avalanche diode (SPAD) type or the quanta image sensor type developed by Dr. Eric Fossum, will allow passive imaging in nearly totally dark environments, without IR or other illumination. This new type of image sensor will replace monochromatic analogue “night-vision” devices, allowing color imaging at higher resolution than they can produce. Unlike those current devices, such sensors are not destroyed by exposure to normal or bright lighting. In practice, these sensors increase the effective light-gathering power of a telescope by at least an order of magnitude, allowing small telescopes to see what observatory telescopes see now.

3) EXIT PUPIL: The pupil of the dark-adapted human eye is around 7mm, which means the beam of light exiting a telescope must be no wider than this, or a portion of the light captured by the objective lens or mirror is lost. Lowering a system’s magnification to get brighter images runs into this roadblock, a well-known problem for visual astronomers. Astro-photographers get around it with two tricks. The first is to use a photographic sensor wider than 7mm, allowing a larger exit pupil and thus brighter images. A 1-inch sensor or photographic plate, for example, already allows an image thirteen times brighter than what a 7mm human pupil can see.
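
The “thirteen times” figure follows directly from collecting area, which scales with the square of the diameter. A quick sketch:

```python
# Brightness gain of a sensor wider than the 7 mm dark-adapted pupil:
# light collected scales with area, i.e. the square of the diameter ratio.

def brightness_gain(sensor_mm: float, pupil_mm: float = 7.0) -> float:
    """Ratio of light collected by a sensor of width `sensor_mm`
    versus a pupil of width `pupil_mm`."""
    return (sensor_mm / pupil_mm) ** 2

# A 1-inch (25.4 mm) sensor versus a 7 mm pupil:
print(round(brightness_gain(25.4), 1))  # ≈ 13.2x
```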

4) LONG EXPOSURE: The other trick astro-photographers use is to keep the shutter open for longer periods, capturing more light and allowing a bright image of a faint object to build up over time. As a telescope tracks the stars–so that they appear motionless in the telescopic view–this can be done for hours. The Hubble Space Telescope took a roughly 100-hour exposure to produce the famous “deep field” of ultra-faint distant galaxies. A visual application of the same principle is the SiOnyx Aurora Pro camera, which keeps its shutter open for a fraction of a second per frame. If the exposures are short enough, the result is a video that appears brighter than what the unaided eye sees. SiOnyx adds to this with its black-silicon sensors, which retain more of the light that hits them. For astronomy, where tracked stellar objects do not move and therefore do not blur, longer exposures can be used, with the image rapidly brightening as the viewer watches. Unistellar’s eVscope and Vaonis’s Stellina already use this function, but without an eyepiece: their images are sent to viewers’ cell phones or other devices. Most astronomers, however, want to see something directly with their eyes, which is a limitation of such telescopes.
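
The payoff of stacking exposures can be sketched with a simple shot-noise model (an assumption of this example, not a claim from the memo: photon arrivals are Poisson, so noise grows as the square root of the signal, and signal-to-noise ratio improves as the square root of the number of frames stacked).

```python
import math

# Shot-noise model: summing N short exposures multiplies the signal by N
# but the photon noise only by sqrt(N), so SNR improves as sqrt(N).

def stacked_snr(photons_per_frame: float, n_frames: int) -> float:
    """Signal-to-noise ratio of `n_frames` stacked exposures, each
    collecting `photons_per_frame` photons, under pure shot noise."""
    signal = photons_per_frame * n_frames
    noise = math.sqrt(signal)  # Poisson shot noise
    return signal / noise      # = sqrt(photons_per_frame * n_frames)

# A faint object delivering 4 photons per 1/30 s video frame:
print(stacked_snr(4, 1))    # 2.0  -- a single frame, barely above the noise
print(stacked_snr(4, 900))  # 60.0 -- 30 seconds of stacked video
```

This is why the image “rapidly brightens as the viewer watches”: the first seconds of stacking deliver the largest relative gains.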

5) FLAT OPTICS are already entering the cell-phone market and will increase in aperture over the coming years. A flat metamaterial lens, replacing a thick and heavy series of lenses, can make a system short enough to fit into a cell phone with no protruding camera bump. Such lenses can be produced with extremely short focal ratios. Eventually, very large 20-inch or 30-inch objectives will be producible this way, but in the interim, multiple small objectives could be combined at a DISTANCE from each other. There are two reasons for larger-aperture objectives: increased light-gathering power and higher resolution. The higher resolution, however, can also be obtained by keeping two or more smaller objectives far apart from each other, and if single-photon detectors already provide enough light-gathering power, a larger aperture is not necessary.
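
The resolution argument can be made concrete with the Rayleigh criterion (a standard diffraction-limit formula, assumed here along with a 550 nm green-light wavelength; neither figure is from the memo). Two small apertures separated by a baseline B resolve detail along that baseline like a single aperture of diameter B.

```python
import math

# Rayleigh criterion: theta = 1.22 * wavelength / aperture (radians).

def rayleigh_arcsec(aperture_m: float, wavelength_m: float = 550e-9) -> float:
    """Diffraction-limited resolution, in arcseconds, of an aperture
    (or baseline) of the given diameter at the given wavelength."""
    return math.degrees(1.22 * wavelength_m / aperture_m) * 3600

single = rayleigh_arcsec(0.0635)  # one 2.5-inch (63.5 mm) lens
paired = rayleigh_arcsec(0.762)   # two such lenses on a 30-inch (762 mm) baseline
print(round(single, 2), round(paired, 2))  # ≈ 2.18 vs 0.18 arcsec
```

The baseline buys a twelvefold resolution gain (along the separation axis) without any increase in the individual lenses.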

6) THE OFF-AXIS ability of flat metamaterial optics is another game-changer. In normal optics, the focal axis runs straight down the perpendicular of the main light-collecting lens or mirror, the so-called “objective.” In an off-axis mirror or lens, the focal axis is not down the middle, but off to the side, or even entirely outside the diameter of the objective. Such optics are very difficult to produce traditionally but easy to produce with the flat, metamaterial method. A reflector telescope shows the value of off-axis optics. When light bounces off a reflector’s objective mirror, it hits a secondary mirror and is redirected to an eyepiece, but that secondary mirror sits in the incoming light path and partially obstructs the main mirror. This creates diffraction patterns that reduce the resolution of the image. For this reason refractors, which use lenses and thus have no secondary obstruction, are superior in quality, although far harder and more expensive to build. The off-axis function would also be valuable for large binoculars, which would no longer need secondary prisms or mirrors to bring the two images in to the spacing of human eyes, typically about two and a half inches apart. With off-axis binoculars, the focus of each objective would fall to the side of its diameter, and no secondary guidance of the light cones would be needed.

7) RETINAL PROJECTION is being developed in several ways by numerous companies and should be fairly common within a decade. It is known generically as “augmented reality”: a pair of glasses or contact lenses projects an image over one’s normal vision. With such a system, a person in a darkened room could watch movies, dictate papers–or view images of the stars from a telescope. For visual astronomers, who like seeing something with their own eye instead of on a TV screen, retinal projection will allow a nearly identical experience, but with all the superior functions listed in this memo.

These functions can be brought together in numerous ways. Noted here is but one such possible way.

A) Two flat off-axis metamaterial lenses, around 2.5 inches (~64mm) in diameter and corrected for chromatic, spherical, and other optical aberrations, are positioned 30 inches apart, giving the resolution (along the axis of separation) of a telescope with a 30-inch objective. With single-photon detectors behind them, they will conservatively have the effective light-gathering power of something like 10-inch binoculars.

B) Whoever has control of the scope looks up at a stellar object, such as the famous M-42 nebula, and says, “zoom and track.” The scope then moves to where the controller is looking, or wherever else it is told to go.

C) A large image sensor increases the brightness of the view another order of magnitude over what the dark-adapted eye would see with 10-inch binoculars. Now, the image is as bright as what would be seen in 30-inch binoculars.

D) The time-exposure function adds yet another order of magnitude or two of light-gathering power, depending on the length of exposure. Even exposures of a fraction of a second add brightness, and longer exposures add more, in almost real time. Now the image is equivalent to looking through something like 100-inch binoculars. As the producible aperture of metamaterial optics increases, the effective aperture will rise to 200-inch, 500-inch, and beyond.
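
The chain of estimates in steps A through D can be tallied in one place. The gain factors below are the memo’s own order-of-magnitude guesses, not measured values; the only physics assumed is that brightness scales with aperture area, so equivalent diameter grows with the square root of total gain.

```python
import math

# Rough brightness budget for the system sketched above.
base_aperture_in = 10  # step A: 2.5" optics + single-photon sensing ~ 10" binoculars
sensor_gain = 10       # step C: large sensor vs. 7 mm pupil (~1 order of magnitude)
exposure_gain = 10     # step D: stacking (1 order of magnitude; longer stacks give more)

# Brightness scales with aperture area, so the equivalent diameter
# grows with the square root of the combined gain:
equivalent_in = base_aperture_in * math.sqrt(sensor_gain * exposure_gain)
print(round(equivalent_in))  # 100 -- the "100-inch binoculars" of the title
```

Raising `exposure_gain` toward the memo’s “order of magnitude or two” pushes the equivalent aperture well past 100 inches, which is where the 200-inch and 500-inch projections come from.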

Finally, the image is projected onto the back of the viewer’s eye as a wide-field image, wider than that of the best TeleVue eyepiece, and is simultaneously projected onto the retinas of many other people. This multiple-viewer function already exists with Stellina and the eVscope, but the view appears on multiple cell-phone screens instead of in multiple viewers’ own eyes.

As pointed out in an earlier memo, the rise of the single-photon detector will also render much of our bright city lighting obsolete, since simple night-glasses will allow driving at night without headlights or streetlamps. This means the night sky could be much darker to begin with, another boon to astronomers. Finally, with currently existing filters it is already possible to filter out much of the light pollution coming from urban areas. This function will no doubt be incorporated as well, along with built-in dew heaters and other useful features.


