
I get this question a lot. Today, I was asked to write an answer at Quora.com, a Q&A web site at which I am the local cryptocurrency expert. It’s time to address this issue here at Lifeboat.

Question

I have many PCs lying around my home and office. Some are current models with fast Intel CPUs. Can I mine Bitcoin to make a little money on the side?

Answer

Other answers focus on the cost of electricity, the number of hashes or teraflops achieved by a computer CPU or the size of the current Bitcoin reward. But, you needn’t dig into any of these details to understand this answer.

You can find the mining software to mine Bitcoin or any other coin on any equipment. Even a phone or wristwatch. But, don’t expect to make money. Mining Bitcoin with an x86 CPU (Core or Pentium equivalent) is never cost effective—not even when Bitcoin was trading at nearly $20,000. A computer with a fast $1500 graphics card will bring you closer to profitability, but not by much.

The problem isn’t that an Intel or AMD processor is too weak to mine for Bitcoin. It’s just as powerful as it was in the early days of Bitcoin. Rather, the problem is that the mining game is a constantly evolving competition. Miners with the fastest hardware and the cheapest power are chasing a shrinking pool of rewards.

The problem arises from a combination of things:

  1. There is a fixed rate of rewards available to all miners—and yet, over the past 2 years, hundreds of thousands of new CPUs have been added to the task. You are competing with all of them.
  2. Despite a large drop in the Bitcoin exchange rate (from $19,783.21 on Dec. 17, 2017), we know that it is generally a rising commodity, because both speculation and gradual grassroots adoption outpace the very gradual increase in supply. The rising value of Bitcoin attracts even more individuals and organizations into the game of mining. They are all fighting for a pie that is shrinking in overall size. Here’s why…
  3. The math (a built-in mechanism) halves the size of rewards every 4 years. We are currently between two halving events; the next one will occur in May 2020. This halving forces miners to be even more efficient to eke out any reward. (A short sketch of the reward schedule appears after this list.)
  4. In the past few years, we have seen a race among miners and mining pools to acquire the best hardware for the task. At first, it was any CPU that could crunch away at the math. Then, miners quickly discovered that an Nvidia graphics processor was better suited to the task. Now, ASICs dominate: specialized, large-scale integrated circuits designed specifically for mining.
  5. Advanced mining pools have the capacity to instantly switch between mining for Bitcoin, Ethereum Classic, Litecoin, Bitcoin Cash and dozens of other coins, depending upon conditions that change minute-by-minute. Although you can find software that does the same thing, it is unlikely that you can outsmart the big boys at this game, because they have super-fast internet connections and constant software maintenance.
  6. Some areas of the world have a surplus of wind, water or solar energy. In fact, there are regions where electricity is free.* Although regional governments would rather that this surplus be used to power homes and businesses (benefiting the local economy), electricity is fungible! And so, local entrepreneurs often “rent” out their cheap electricity by offering shelf space to miners from around the world. Individuals with free or cheap electricity (and some, with a cold climate to keep equipment cool) split this energy savings with the miner. This further stacks the deck against the guy with a fast PC in New York or Houston.
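For readers who want to see the arithmetic behind point 3, here is a minimal Python sketch of Bitcoin’s fixed, halving reward schedule. The constants (a 50 BTC starting subsidy, halved every 210,000 blocks, with blocks arriving roughly every 10 minutes, or about 4 years per halving) are the protocol’s published parameters.

    # Bitcoin's fixed, halving block-reward schedule (protocol constants).
    HALVING_INTERVAL = 210_000   # blocks between halvings
    INITIAL_SUBSIDY  = 50.0      # BTC per block at launch in 2009

    def block_subsidy(height: int) -> float:
        """Mining reward, in BTC, for a block at the given height."""
        halvings = height // HALVING_INTERVAL
        return INITIAL_SUBSIDY / (2 ** halvings)

    for era in range(5):
        start = era * HALVING_INTERVAL
        print(f"from block {start:>9,}: {block_subsidy(start):6.3f} BTC per block")

No amount of added hardware changes these numbers; more miners simply divide the same shrinking subsidy into smaller slices.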

Of course, with Bitcoin generally rising in value (over the long term), this provides continued incentive to mine. It is the only thing that makes this game worthwhile to the individuals who participate.

So, while it is not impossible to profit by mining on a personal computer, unless you have very cheap power, the very latest specialized mining rigs, and the skills to constantly tweak your configuration, your best bet is to join a reputable mining pool. Take your fraction of the mining rewards and let them take a small cut. Cash out frequently, so that you are not betting on their ability to resist hacking or remain solvent.

Related: The largest US operation mines 0.4% of daily Bitcoin rewards. Listen to the owner describe the efficiency of his ASIC processors and the enormous capacity he is adding. This will not produce more Bitcoin. The total reward rate is fixed and halves every 4 years. His build-out will consume a massive amount of electricity, but it will only grab share from other miners—and encourage them to increase consumption just to keep up.


* Several readers have pointed out that they have access to “free power” in their office — or more typically, in a college dormitory. While this may be ‘free’ to the student or employee, it is most certainly not free. In the United States, even the most efficient mining results in only a 20 or 30% return over electric cost—and that comes with the added cost of constant equipment updates. This is not the case for personal computers. They are sorely unprofitable…

So, for example, if you have 20 Intel computers cooking away for 24 hours each day, you might receive $115 in rewards at the end of a year, along with an electric bill for $3,500. Long before this happens, you will have tripped the circuit breaker in your dorm room or received an unpleasant memo from your boss’s boss.
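If you want to run the numbers yourself, the back-of-the-envelope Python sketch below compares expected mining revenue with the electric bill. Every constant in it is an illustrative assumption (hash rate, power draw, network hash rate, exchange rate); plug in current figures and the conclusion barely moves, because a CPU’s share of the network hash rate is vanishingly small.

    # Back-of-the-envelope comparison of CPU-mining revenue vs. electricity.
    # Every constant below is an illustrative assumption; replace with current figures.
    NUM_PCS          = 20          # desktop computers running around the clock
    HASHRATE_PER_PC  = 20e6        # hashes per second from a fast x86 CPU (assumed)
    POWER_PER_PC_W   = 200         # watts drawn per machine under load (assumed)
    PRICE_PER_KWH    = 0.12        # USD per kWh, a typical US residential rate (assumed)

    NETWORK_HASHRATE = 40e18       # total Bitcoin network hash rate (assumed, ~2018)
    BLOCK_REWARD     = 12.5        # BTC per block between the 2016 and 2020 halvings
    BLOCKS_PER_YEAR  = 6 * 24 * 365
    BTC_PRICE_USD    = 6500        # assumed exchange rate

    my_share      = (NUM_PCS * HASHRATE_PER_PC) / NETWORK_HASHRATE
    yearly_reward = my_share * BLOCK_REWARD * BLOCKS_PER_YEAR * BTC_PRICE_USD
    yearly_power  = NUM_PCS * POWER_PER_PC_W * 24 * 365 / 1000 * PRICE_PER_KWH

    print(f"Expected mining revenue: ${yearly_reward:,.2f} per year")
    print(f"Electricity cost:        ${yearly_power:,.2f} per year")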


Bitcoin mining farms

  • Professional mining pool (above photo and top row below)
  • Amateur mining rigs (bottom row below)

This is what you are up against. Even the amateur mining operations depicted in the bottom row require access to very cheap electricity, the latest processors, and the skill to expertly maintain hardware, software, and the real-time mining decision process.


Philip Raymond co-chairs CRYPSA, hosts the New York Bitcoin Event and is keynote speaker at Cryptocurrency Conferences. He sits on the New Money Systems board of Lifeboat Foundation. Book a presentation or consulting engagement.

IBM and the Department of Energy’s Oak Ridge National Laboratory have revealed the world’s “most powerful and smartest scientific supercomputer.” Known as Summit, IBM says that its new computer will be capable of processing 200 quadrillion calculations per second (200 petaflops). To put that into perspective, if every person on Earth did a single calculation per second, it would take 305 days to do what Summit does in a single second. Assuming those numbers are accurate, that would make Summit the world’s fastest supercomputer. It would also mark the first time since 2012 that a U.S. computer held that title.
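That comparison is easy to check. A quick Python sketch, assuming a world population of roughly 7.6 billion:

    # Sanity check of the "305 days" comparison (Summit at 200 quadrillion calc/sec).
    summit_ops_per_sec = 200e15    # 200 petaflops
    world_population   = 7.6e9     # rough 2018 estimate (assumed)

    seconds_needed = summit_ops_per_sec / world_population   # one calculation per person per second
    print(seconds_needed / 86_400, "days")                    # roughly 305 days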

Summit has been in the works for several years now and features some truly impressive specs. According to TechCrunch, the computer features 4,608 compute servers, each with two 22-core IBM Power9 chips and six Nvidia Tesla V100 GPUs. In addition, the machine features more than 10 petabytes of memory. As the Nvidia GPUs attest, this machine will be primarily used for the development of artificial intelligence and machine learning. In addition to the work on A.I., Summit will also be used for research into energy and other scientific endeavors at Oak Ridge.

IBM was the Department of Energy’s general contractor for the Summit project, but it also had the help of several other partners within the tech industry. The GPUs were provided by Nvidia, which remains one of the leaders in cutting-edge GPU development. Mellanox and Red Hat were also brought on to work on the development of Summit.


In general, modelers attack the problem by breaking it into billions of bits, either by dividing space into a 3D grid of subvolumes or by parceling the mass of dark and ordinary matter into swarms of particles. The simulation then tracks the interactions among those elements while ticking through cosmic time in, say, million-year steps. The computations strain even the most powerful supercomputers. BlueTides, for example, runs on Blue Waters—a supercomputer at the University of Illinois in Urbana that can perform 13 quadrillion calculations per second. Merely loading the model consumes 90% of the computer’s available memory, Feng says.
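To make the particle approach concrete, here is a deliberately tiny Python sketch of the idea: parcel mass into particles, compute their mutual gravitational pulls, and tick forward in fixed time steps. Real codes such as BlueTides add hydrodynamics, radiation feedback and far cleverer force solvers (tree and particle-mesh methods) to handle billions of elements; this direct-sum version only illustrates the stepping loop.

    import numpy as np

    # Toy particle-based simulation step: mass parceled into particles, advanced
    # in fixed time steps under their mutual gravity. Purely schematic.
    G = 1.0                        # gravitational constant in code units
    N = 1_000                      # particle count (tiny compared with real runs)
    rng = np.random.default_rng(0)

    pos  = rng.uniform(-1.0, 1.0, size=(N, 3))   # particle positions
    vel  = np.zeros((N, 3))                      # particle velocities
    mass = np.full(N, 1.0 / N)                   # equal-mass particles
    soft = 1e-2                                  # softening avoids infinite forces

    def accelerations(pos):
        """Direct-sum gravitational acceleration on every particle (O(N^2))."""
        delta = pos[None, :, :] - pos[:, None, :]            # pairwise separations
        inv_r3 = ((delta ** 2).sum(-1) + soft ** 2) ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                        # no self-interaction
        return G * (delta * (mass * inv_r3)[:, :, None]).sum(axis=1)

    dt = 1e-3
    for step in range(100):              # each tick stands in for, say, a million years
        vel += accelerations(pos) * dt   # "kick"
        pos += vel * dt                  # "drift"

The O(N²) force sum here is exactly what the specialized solvers exist to avoid; at billions of particles it would be hopeless, which is why these runs strain machines like Blue Waters.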

For years such simulations produced galaxies that were too gassy, massive, and blobby. But computer power has increased, and, more important, models of the radiation-matter feedback have improved. Now, hydrodynamic simulations have begun to produce the right number of galaxies of the right masses and shapes—spiral disks, squat ellipticals, spherical dwarfs, and oddball irregulars—says Volker Springel, a cosmologist at the Heidelberg Institute for Theoretical Studies in Germany who worked on Millennium and leads the Illustris simulation. “Until recently, the simulation field struggled to make spiral galaxies,” he says. “It’s only in the last 5 years that we’ve shown that you can make them.”

The models now show that, like people, galaxies tend to go through distinct life stages, Hopkins says. When young, a galaxy roils with activity, as one merger after another stretches and contorts it, inducing spurts of star formation. After a few billion years, the galaxy tends to settle into a relatively placid and stable middle age. Later, it can even slip into senescence as it loses its gas and the ability to make stars—a transition our Milky Way appears to be making now, Hopkins says. But the wild and violent turns of adolescence make the particular path of any galaxy hard to predict, he says.


A new proof by SFI Professor David Wolpert sends a humbling message to would-be super intelligences: you can’t know everything all the time.

The proof starts by mathematically formalizing the way an “inference device,” say, a scientist armed with a supercomputer, fabulous experimental equipment, etc., can have knowledge about the state of the universe around them. Whether that scientist’s knowledge is acquired by observing their universe, controlling it, predicting what will happen next, or inferring what happened in the past, there’s a mathematical structure that restricts that knowledge. The key is that the inference device, their knowledge, and the physical variable that they (may) know something about, are all subsystems of the same universe. That coupling restricts what the device can know. In particular, Wolpert proves that there is always something that the inference device cannot predict, and something that they cannot remember, and something that they cannot observe.

“In some ways this formalism can be viewed as many different extensions of [Donald MacKay’s] statement that ‘a prediction concerning the narrator’s future cannot account for the effect of the narrator’s learning that prediction,’” Wolpert explains. “Perhaps the simplest extension is that, when we formalize [inference devices] mathematically, we notice that the same impossibility results that hold for predictions of the future—MacKay’s concern—also hold for memories of the past. Time is an arbitrary variable—it plays no role in terms of differing states of the universe.”


1. Blame the American public, which lost serious interest in science in the 1990s, and 2. the US government, whose only real interest now is war, and how to spend money on war.


If you want to crunch the world’s biggest problems, head east. According to a newly published ranking, not only is China home to the world’s two fastest supercomputers, it also has 202 of the world’s fastest 500 such devices—more than any other nation. Meanwhile, America’s fastest device limps into fifth place in the charts, and the nation occupies just 144 of the top 500 slots, making it second according to that metric.

The world’s fastest supercomputer is still TaihuLight, housed at the National Supercomputing Center in Wuxi, China, and pictured above. Capable of performing 93 quadrillion calculations per second, it’s almost three times faster than the second-place Tianhe-2. The Department of Energy’s fifth-placed Titan supercomputer, housed at Oak Ridge National Laboratory, performs 17.6 quadrillion calculations per second—making it less than a fifth as fast as TaihuLight.

China also beats out all comers on total computational resources, commanding 35.4 percent of the computing power in the list, compared with America’s 29.6 percent. The new list clearly and painfully underscores America’s decline as a supercomputing heavyweight. Indeed, this is the weakest representation by the U.S. since the Top500 supercomputers list started ranking the industry 25 years ago.


Using supercomputer modeling, University of Oregon scientists have unveiled a new explanation for the geology underlying recent seismic imaging of magma bodies below Yellowstone National Park.

Yellowstone, a supervolcano famous for explosive eruptions, large calderas and extensive lava flows, has for years attracted the attention of scientists trying to understand the location and size of the magma bodies below it. The last caldera-forming eruption occurred 630,000 years ago; the last large volume of lava surfaced 70,000 years ago.

Crust below the park is heated and softened by continuous infusions of magma that rise from an anomaly called a mantle plume, similar to the source of the magma at Hawaii’s Kilauea volcano. Huge amounts of water that fuel the dramatic geysers and hot springs at Yellowstone cool the crust and prevent it from becoming too hot.
