
How important is failure – yes, failure – to the health of a thriving, innovative business? So important that Ratan Tata, chairman of India’s largest corporation, gives an annual award to the employee who comes up with the best idea that failed. So important that Apple, the company that gives us the world’s most beautifully designed music players, mobile phones, and tablets, wouldn’t be here if it hadn’t dared to fail. Remember the Apple Newton? Probably not, since it was a flop, but it was a precursor to today’s wildly successful iPad.

In a struggling economic climate, failure is what separates mediocre companies from businesses that break through and astound us with their creativity.

Yet for most CEOs, failure has become a scenario to be avoided at all costs. Caught between fear of losing their jobs and fear of rattling investors, business leaders are expected to deliver a perfect track record of product launches and expansion programs. They indirectly instill this attitude in their employees, who then lack the confidence to spearhead any corporate initiative that isn’t guaranteed to work.

Call it the Tiger Mom effect: In the business world today, failure is apparently not an option.

We need to change this attitude toward failure – and celebrate the idea that only by falling on our collective business faces do we learn enough to succeed down the road. Sure, this is a tough sell at a time when unemployment figures are still high and a true economic recovery is still a long way off. But without failures, no business can grow and innovate.

Why is it that we are willing to accept a certain amount of failure along with success in other arenas, but it’s forbidden in business? Take basketball: The best player on his best night will only score about 50 percent of the time. The other 50 percent of the shots aren’t considered failures – they are attempts to test playing strategy that will figure into dunks later in the game.

If a basketball player is scoring close to 100 percent of the time, he is clearly making layups and not considering the three-pointer that may be needed to win the game. The same is true for business leaders: if every one of your pet projects rolls out successfully, maybe you aren’t stretching far enough.

Apple Computer would not have reached its current peak of success if it had feared to roll the dice and launch products that didn’t always hit the mark. In the mid-1990s, the company was considered washed up, Steve Jobs had departed, and a string of lackluster product launches unrelated to the company’s core business had failed to catch fire. But the company learned lessons from its mistakes, and shifted focus to the development of flawlessly designed consumer electronics goods. Yesterday’s failures bred today’s market dominance.

Fail today, profit tomorrow
The good news about failing is that this is a smart time to do it. Today, the cost of failure is much lower than it used to be. And if you take chances while the economy is down, your successful business launch will grow and become exponentially more profitable as times improve.

Strange as it may sound, a positive attitude towards failure starts at the top. Here’s how business leaders can create an environment where failure is encouraged, not punished.

Applaud people who fail: As a leader, you need to praise people who take risks and explore new ways to gain market share, since short-term failures can lead to the biggest business successes down the road. Talk publicly about why the failed venture has merit, and what your employees have learned from it. At performance review time, let the innovators who dare to fail know that you value their contributions.

Acknowledge your own failures: When you experience a failure as a leader, don’t hide it – talk about it. Your missed opportunity will encourage others to take risks. When you tell personal stories about your own failed plans, you give permission to everyone in the organization to do the same, without fear of reprisals. (Of course, you should always remind people that they should dig deep for lessons learned from every failed attempt.)

Create a culture of innovation and entrepreneurship: Grant people the time to work on projects that they are passionate about – beyond their daily responsibilities. This sends employees a signal that their failures as well as their successes have great value to the business. Google does this, allowing its engineers to spend 20 percent of their work time on side projects. You can’t argue with success: Google’s Orkut social networking service and AdSense ads for website content came directly from this “20 percent time” program.

So get out there and fail – and take inspiration from Thomas Edison, who suffered through more than a thousand experiments before finally inventing a working light bulb. “I didn’t fail a thousand times,” Edison told a journalist. “The light bulb was an invention with a thousand steps.”

Follow Naveen Jain on Twitter: www.twitter.com/Naveen_Jain_CEO

When examining the delicate balance in which life on Earth hangs, it is impossible not to consider the ongoing love/hate relationship between our parent star, the sun, and our uniquely terraqueous home planet.

On one hand, Earth is situated so perfectly, so ideally, inside the sun’s habitable zone that it is impossible not to regard our parent star with a sense of ongoing gratitude. It is, after all, the onslaught of spectral rain, the sun’s seemingly limitless output of charged particles, that provides the initial spark for all terrestrial life.

Yet on the other hand, during those brief moments of solar upheaval, when highly energetic, Earth-directed ejecta threaten our precariously perched technological infrastructure with destruction, one cannot help but eye with caution the mere 93 million miles separating our entire human population from this unpredictable stellar inferno.

On 6 February 2011, the twin solar observational spacecraft of the STEREO mission aligned at opposite ends of the sun along Earth’s orbit and, for the first time in human history, offered scientists a complete 360-degree view of the sun. Since solar observation began hundreds of years ago, humanity has been able to see only one side of the sun at any given time, as it slowly completes a rotation roughly every 27 days. Launched in 2006, the two STEREO satellites are glittering jewels in a growing crown of heliophysics science missions that aim to better understand solar dynamics, and for the next eight years they will offer this dual-sided view of our parent star.

In addition to providing the source of all energy to our home planet, the sun occasionally spews violent bursts of energy from its active regions, known as coronal mass ejections (CMEs). These fast-traveling clouds of ionized gas are responsible for lovely events like the aurorae borealis and australis, but beyond a certain point they have been known to overload orbiting satellites, set fire to ground-based technological infrastructure, and even usher in widespread blackouts.

CMEs are natural occurrences, and they are better understood than ever thanks to the emerging perspective of our sun as a dynamic star. Though humanity has known for centuries that the solar cycle follows a roughly eleven-year ebb and flow, only recently has the scientific community pieced together a more complete picture of how our sun’s subtle changes affect space weather and, unfortunately, of how little we can feasibly do to contend with this legitimate global threat.

The massive solar storm of 1 September 1859 produced aurorae that were visible as far south as Hawai’i and Cuba, with similar effects observed around the South Pole. The Earth-directed CME took a mere 17 hours to make the 93 million mile trek from the corona of our sun to the Earth’s atmosphere, because an earlier CME had cleared a path for its journey. The one saving grace of this massive space weather event was that the North American and European telegraph system was in its delicate infancy, having been in place for only 15 years. Even so, telegraph pylons threw burning sparks, and telegraph paper worldwide caught fire spontaneously.

Considering the ambitious improvements in communications lines, electrical grids, and broadband networks that have been implemented since, humanity faces the threat of space weather on uneven footing. Large CME events are known to occur around every 500 years, based on ice core samples measured for high-energy proton radiation.

The CME event of 14 March 1989 overloaded the Hydro-Québec transmission lines and caused the catastrophic collapse of an entire power grid. The resulting aurorae were visible as far south as Texas and Florida. The estimated cost totaled in the hundreds of millions of dollars. A later storm in August 1989 interfered with semiconductor functionality, and trading was halted on the Toronto Stock Exchange.

Beginning in 1995 with the launch and deployment of the Solar and Heliospheric Observatory (SOHO), continuing in 2010 with the launch of the Solar Dynamics Observatory (SDO), and finally this year with the launch of the Glory science mission, NASA is making ambitious, thoughtful strides to gain a clearer picture of the dynamics of the sun, to offer a better means of predicting space weather, and to evaluate more clearly both the great benefits and the grave threats our star presents.

Earth-bound technology infrastructure remains vulnerable to high-energy output from the sun. However, the growing array of orbiting satellites that the best and the brightest among modern science use to continually gather data from our dynamic star will offer humanity its best chance of modeling, predicting, and perhaps some day defending against the occasional outburst from our parent star.

Written by Zachary Urbina, Founder Cozy Dark

Transhumanists are into improvements, and many, such as Nick Bostrom, talk about specific problems. However, Bostrom’s problem statements have been criticized for not necessarily being problems at all, and I think this is largely why one must consider the problem definition (see step #2 below).

Sometimes people talk about their “solutions” for problems, for instance this one in H+ Magazine. But in many cases they are actually talking about their ideas of how to solve a problem, or making science-fictional predictions. So if you surf the web, you will find a lot of good ideas about possibly important problems—but a lot of what you find will be undefined (or not very well defined) problem ideas and solutions.

These proposed solutions often do not attempt to find root causes or assume the wrong root cause. And finding a realistic complete plan for solving a problem is rare.

8D (Eight Disciplines) is a process used in various industries for problem solving and process improvement. The 8D steps described below could be very useful for transhumanists, not just for talking about problems but for actually implementing solutions in real life.

Transhuman concerns are complex not just technologically but also socioculturally. Some problems are more than just “a” problem – they are a dynamic system of problems, and a process for problem solving by itself is not enough. There has to be management, goals, etc., most of which is outside the scope of this article. But one should first know how to deal with a single problem before scaling up, and 8D is a process that can be used on a huge variety of complex problems.

Here are the eight steps of 8D:

  1. Assemble the team
  2. Define the problem
  3. Contain the problem
  4. Root cause analysis
  5. Choose the permanent solution
  6. Implement the solution and verify it
  7. Prevent recurrence
  8. Congratulate the team
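
For readers who think in code, the eight disciplines can be sketched as an ordered checklist that refuses to skip ahead. The `EightD` tracker below is a toy illustration of the process ordering, not part of the 8D standard itself:

```python
from dataclasses import dataclass, field

# The eight disciplines, in order; names mirror the list above.
STEPS = [
    "Assemble the team",
    "Define the problem",
    "Contain the problem",
    "Root cause analysis",
    "Choose the permanent solution",
    "Implement the solution and verify it",
    "Prevent recurrence",
    "Congratulate the team",
]

@dataclass
class EightD:
    """Minimal tracker: steps must be completed in order."""
    completed: list = field(default_factory=list)

    def complete(self, step: str) -> None:
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"Out of order: expected {expected!r}, got {step!r}")
        self.completed.append(step)

    @property
    def done(self) -> bool:
        return len(self.completed) == len(STEPS)

run = EightD()
for s in STEPS:
    run.complete(s)
print(run.done)  # True
```

The point of the `ValueError` is the point of 8D itself: you cannot contain a problem you haven’t defined, and you cannot pick a permanent solution before the root cause analysis.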

More detailed descriptions:

1. Assemble the Team

Are we prepared for this?

With an initial, rough concept of the problem, a team should be assembled to continue the 8D steps. The team will make an initial problem statement without presupposing a solution. They should attempt to define the “gap” (or error)—the big difference between the current problematic situation and the potential fixed situation. The team members should all be interested in closing this gap.

The team must have a leader; this leader makes agendas, synchronizes actions and communications, resolves conflicts, etc. In a company, the team should also have a “sponsor”, who is like a coach from upper management. The rest of the team is assembled as appropriate; this will vary depending on the problem, but some general rules for a candidate can be:

  • Has a unique point of view.
  • Logistically able to coordinate with the rest of the team.
  • Is not committed to preconceived notions of “the answer.”
  • Can actually accomplish change that they might be responsible for.

The size of an 8D team (at least in companies) is typically 5 to 7 people.

The team should be justified. This matters most within an organization that is paying for the team, but even a group of transhumanists out in the wilds of cyberspace will have to defend itself when people ask, “Why should we care?”

2. Define the Problem

What is the problem here?

Let’s say somebody throws my robot out of an airplane, and it immediately falls to the ground and breaks into several pieces. This customer then informs me that this robot has a major problem when flying after being dropped from a plane and that I should improve the flying software to fix it.

Here is the mistake: The problem has not been properly defined. The robot is a ground robot and was not intended to fly or be dropped out of a plane. The real problem is that a customer has been misinformed as to the purpose and use of the product.

When thinking about how to improve humanity, or even how to merely improve a gadget, you should consider: Have you made an assumption about the issue that might be obscuring the true problem? Did the problem emerge from a process that was working fine before? What processes will be impacted? If this is an improvement, can it be measured, and what is the expected goal?

The team should attempt to grok the issues and their magnitude. Ideally, they will be informed with data, not just opinions.

Just as with medical diagnosis, the symptoms alone are probably not enough input. There are various ways to collect more data, and which methods you use depends on the nature of the problem. For example, one method is the 5 W’s and 2 H’s:

  • Who is affected?
  • What is happening?
  • When does it occur?
  • Where does it happen?
  • Why is it happening (initial understanding)?
  • How is it happening?
  • How many are affected?

For humanity-affecting problems, I think it’s very important to define what the context of the problem is.

3. Contain the Problem

Containment

Some problems are urgent, and a stopgap must be put in place while the problem is being analyzed. This is particularly relevant for problems such as product defects which affect customers.

Some brainstorming questions are:

  • Can anything be done to mitigate the negative impact (if any) that is happening?
  • Who would have to be involved with that mitigation?
  • How will the team know that the containment action worked?

Before deploying an interim expedient, the team should have asked and answered these questions (they essentially define the containment action):

  • Who will do it?
  • What is the task?
  • When will it be accomplished?

A canonical example: You have a leaky roof (the problem). The containment action is to put a pail underneath the hole to capture the leaking water. This is a temporary fix until the roof is properly repaired, and mitigates damage to the floor.

Don’t let the bucket of water example fool you—containment can be massive, e.g. corporate bailouts. Of course, the team must choose carefully: Is the cost of containment worth it?

4. Root Cause Analysis

There can be many layers of causation

Whenever you think you have an answer to a problem, ask yourself: Have you gone deep enough, or is there another layer below? If you implement a fix, will the problem grow back?

Generally in the real world events are causal. The point of root cause analysis is to trace the causes all the way back for your problem. If you don’t find the origin of the causes, then the problem will probably rear its ugly head again.

Root cause analysis is one of the most overlooked, yet most important, steps of problem solving. Even engineers often lose their way when solving a problem and jump right into a fix that later turns out to be a red herring.

Typically, driving to root cause follows one of these two routes:

  1. Start with data; develop theories from that data.
  2. Start with a theory; search for data to support or refute it.

Either way, team members must always keep in mind that correlation is not necessarily causation.

One tool to use is the 5 Why’s, in which you move down the “ladder of abstraction” by continually asking: “why?” Start with a cause and ask why this cause is responsible for the gap (or error). Then ask again until you’ve bottomed out with something that may be a true root cause.
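
The 5 Why’s drill-down can be sketched as a simple chain walk. The machine-stoppage chain below (a classic example from the quality literature) and the `five_whys` helper are illustrative assumptions, not a formal tool:

```python
# Each answer becomes the subject of the next "why?".
# The example chain is hypothetical.
answers = {
    "The machine stopped": "A fuse blew from overload",
    "A fuse blew from overload": "The bearing was not lubricated",
    "The bearing was not lubricated": "The lubrication pump was clogged",
    "The lubrication pump was clogged": "Metal shavings got into the pump",
    "Metal shavings got into the pump": "The pump has no intake filter",
}

def five_whys(symptom, answers, depth=5):
    """Follow the why-chain until it bottoms out or depth is reached."""
    chain = [symptom]
    for _ in range(depth):
        cause = answers.get(chain[-1])
        if cause is None:
            break
        chain.append(cause)
    return chain

chain = five_whys("The machine stopped", answers)
print(chain[-1])  # The pump has no intake filter
```

Note how replacing the fuse (the first “cause”) would not have prevented recurrence; only the last link in the chain, the missing filter, is a candidate root cause worth a permanent corrective action.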

There are many other general purpose methods and tools to assist in this stage; I will list some of them here, but please look them up for detailed explanations:

  • Brainstorming: Generate as many ideas as possible, and elaborate on the best ideas.
  • Process flow analysis: Flowchart a process; attempt to narrow down what element in the flow chart is causing the problem.
  • Ishikawa: Use an Ishikawa (aka fishbone, or cause-and-effect) diagram to try narrowing down the cause(s).
  • Pareto analysis: Generate a Pareto chart, which may indicate which cause (of many) should be fixed first.
  • Data analysis: Use trend charts, scatter plots, etc. to assist in finding correlations and trends.
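
Pareto analysis, in particular, is easy to sketch in code: tally how often each cause shows up in a defect log, sort the causes by frequency, and watch the cumulative share of defects. The defect log and cause names below are invented purely for illustration:

```python
from collections import Counter

# Hypothetical defect log: one entry per observed failure, tagged by cause.
log = (["loose connector"] * 42 + ["firmware bug"] * 27 +
       ["cracked solder"] * 9 + ["bad capacitor"] * 5 + ["shipping damage"] * 2)

counts = Counter(log)
total = sum(counts.values())

# Sort causes by frequency and accumulate their share of all defects.
cumulative = 0.0
for cause, n in counts.most_common():
    cumulative += n / total
    print(f"{cause:16s} {n:3d}  cumulative {cumulative:5.1%}")
```

In this made-up log the top two causes account for roughly 80 percent of all defects, which is exactly the signal a Pareto chart is meant to surface: fix the connector and the firmware first.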

And that is just the beginning—a problem may need a specific new experiment or data collection method devised.

Ideally you would have a single root cause, but that is not always the case.

The team should also come up with various corrective actions that address the root cause, to be selected and refined in the next step.

5. Choose the Permanent Solution

The solution must be one or more corrective actions that solve the cause(s) of the problem. Corrective action selection is additionally guided by criteria such as time constraints, money constraints, efficiency, etc.

This is a great time to simulate/test the solution, if possible. There might be unaccounted for side effects either in the system you fixed or in related systems. This is especially true for some of the major issues that transhumanists wish to tackle.

You must verify that the corrective action(s) will in fact fix the root cause and not cause bad side effects.

6. Implement the Solution and Verify It

This is the stage when the team actually sets the corrective action(s) into motion. But doing this isn’t enough – the team also has to check whether the solution is really working.

For some issues the verification is clear-cut. Other corrective actions have to be evaluated for effectiveness, for instance against a benchmark. Depending on the time scale of the corrective action, the team might need to add various monitors and/or controls to continually make sure the root cause stays squashed.

7. Prevent Recurrence

It’s possible that a process will revert back to its old ways after the problem has been solved, resulting in the same type of problem happening again. So the team should provide the organization or environment with improvements to processes, procedures, practices, etc. so that this type of problem does not resurface.

8. Congratulate the Team

Party time! The team should share and publicize the knowledge gained from the process as it will help future efforts and teams.


I’ve been an entrepreneur most of my adult life. Recently, on a long business flight, I began thinking about what it takes to become successful as an entrepreneur – and how I would even define the meaning of “success” itself. The two ideas became intertwined in my thinking: success as an entrepreneur, entrepreneurial success. I’ve given a lot of talks over the years on the subject of entrepreneurship. The first thing I find I have to do is dispel the persistent myth that entrepreneurial success is all about innovative thinking and breakthrough ideas. I’ve found that entrepreneurial success usually comes through great execution – by doing a superior job of the basic blocking and tackling.

But what else does it take to succeed as an entrepreneur — and how should an entrepreneur define success?

Bored with the long flight, sinking deeper into my own thoughts, I wrote down my own answers.

Here’s what I came up with, a “Top Ten List” if you will:

10. You must be passionate about what you are trying to achieve. That means you’re willing to sacrifice a large part of your waking hours to the idea you’ve come up with. Passion will ignite the same intensity in the others who join you as you build a team to succeed in this endeavor. And with passion, both your team and your customers are more likely to truly believe in what you are trying to do.

9. Great entrepreneurs focus intensely on an opportunity where others see nothing. This focus and intensity help to eliminate wasted effort and distractions. Most companies die from indigestion rather than starvation; that is, companies suffer from doing too many things at the same time rather than doing a few things very well. Stay focused on the mission.

8. Success only comes from hard work. We all know that there is no such thing as overnight success. Behind every overnight success lie years of hard work and sweat. Even people blessed with luck will tell you there’s no easy way to achieve success – and that luck comes to those who work hard. Successful entrepreneurs always give 100% effort to everything they do. If you know you are giving your best effort, you’ll never have any reason for regrets. Focus on the things you can control; stay focused on your efforts, and let the results be what they will be.

7. The road to success is going to be long, so remember to enjoy the journey. Everyone will teach you to focus on goals, but successful people focus on the journey and celebrate the milestones along the way. Is it worth spending a large part of your life trying to reach the destination if you didn’t enjoy the journey along the way? Won’t the team you attract to join you on your mission also enjoy the journey more as well? Wouldn’t it be better for all of you to have the time of your life during the journey, even if the destination is never reached?

6. Trust your gut instinct more than any spreadsheet. There are too many variables in the real world that you simply can’t put into a spreadsheet. Spreadsheets spit out results from your inexact assumptions and give you a false sense of security. In most cases, your heart and gut are still your best guides. The human brain works like a binary computer and can only analyze exact, well-defined information – zeros and ones, black and white. The heart is more like a chemical computer that uses fuzzy logic to analyze information that can’t easily be reduced to zeros and ones. We’ve all had experiences in business where our heart told us something was wrong while our brain was still trying to use logic to figure it all out. Sometimes a faint voice based on instinct resonates far more strongly than overpowering logic.

5. Be flexible but persistent — every entrepreneur has to be agile in order to perform. You have to continually learn and adapt as new information becomes available. At the same time you have to remain persistent to the cause and mission of your enterprise. That’s where that faint voice becomes so important, especially when it is giving you early warning signals that things are going off-track. Successful entrepreneurs find the balance between listening to that voice and staying persistent in driving for success — because sometimes success is waiting right across from the transitional bump that’s disguised as failure.

4. Rely on your team – it’s a simple fact: no individual can be good at everything. Everyone needs people around them with complementary skills. Entrepreneurs are an optimistic bunch, and it’s very hard for them to believe that they are not good at certain things. It takes a lot of soul searching to find your own core skills and strengths. After that, find the smartest people you can who complement your strengths. It’s easy to be attracted to people who are like you; the trick is to find people who are not like you but who are good at what they do – and at what you can’t do.

3. Execution, execution, execution – unless you are the smartest person on earth (and who is?), it’s likely that many others have thought about doing the same thing you’re trying to do. Success doesn’t necessarily come from breakthrough innovation but from flawless execution. A great strategy alone won’t win a game or a battle; the win comes from basic blocking and tackling. All of us have seen entrepreneurs who waste too much time writing business plans and preparing PowerPoints. I believe that a business plan is too long if it’s more than one page. Besides, things never turn out exactly the way you envisioned them. No matter how much time you spend perfecting the plan, you still have to adapt to the realities on the ground. You’re going to learn a lot more useful information from taking action than from hypothesizing. Remember – stay flexible and adapt as new information becomes available.

2. I can’t imagine anyone ever achieving long-term success without having honesty and integrity. These two qualities need to be at the core of everything we do. Everybody has a conscience — but too many people stop listening to it. There is always that faint voice that warns you when you are not being completely honest or even slightly off track from the path of integrity. Be sure to listen to that voice.

1. Success is a long journey and much more rewarding if you give back. By the time you get to success, lots of people will have helped you along the way. You’ll learn, as I have, that you rarely get a chance to help the people who helped you because in most cases, you don’t even know who they were. The only way to pay back the debts we owe is to help people we can help — and hope they will go on to help more people. When we are successful, we draw so much from the community and society that we live in we should think in terms of how we can help others in return. Sometimes it’s just a matter of being kind to people. Other times, offering a sympathetic ear or a kind word is all that’s needed. It’s our responsibility to do “good” with the resources we have available.

Measuring Success — Hopefully, you have internalized the secrets of becoming a successful entrepreneur. The next question you are likely to ask yourself is: How do we measure success? Success, of course, is very personal; there is no universal way of measuring success. What do successful people like Bill Gates and Mother Teresa have in common? On the surface it’s hard to find anything they share — and yet both are successful. I personally believe the real metric of success isn’t the size of your bank account. It’s the number of lives where you might be able to make a positive difference. This is the measure of success we need to apply while we are on our journey to success.

Naveen Jain is a philanthropist, entrepreneur and technology pioneer. He is a founder and CEO of Intelius, a Seattle-based company that empowers consumers with information to make intelligent decisions about personal safety and security. Prior to Intelius, Naveen Jain founded InfoSpace and took it public in 1998 on NASDAQ. Naveen Jain has been awarded many honors for his entrepreneurial successes and leadership skills including “Ernst & Young Entrepreneur of the Year”, “Albert Einstein Technology Medal” for pioneers in technology, “Top 20 Entrepreneurs” by Red Herring, “Six People Who Will Change the Internet” by Information Week, among other honors.

Dear Ray,

I’ve written a book about the future of software. While writing it, I came to the conclusion that your dates are way off. I talk mostly about free software and Linux, but the argument has implications for how quickly we can get things like driverless cars and other amazing technologies. I believe that we could have had all the benefits of the singularity years ago if we had done things like start Wikipedia in 1991 instead of 2001. There is no technology in 2001 that we didn’t have in 1991; it was simply a matter of starting an effort that allowed people to work together.

Proprietary software and a lack of cooperation among our software scientists has been terrible for the computer industry and the world, and its greater use has implications for every aspect of science. Free software is better for the free market than proprietary software, and there are many opportunities for programmers to make money using and writing free software. I often use the analogy that law libraries are filled with millions of freely available documents, and no one claims this has decreased the motivation to become a lawyer. In fact, lawyers would say that it would be impossible to do their job without all of these resources.

My book is a full description of the issues but I’ve also written some posts on this blog, and this is probably the one most relevant for you to read: https://lifeboat.com/blog/2010/06/h-conference-and-faster-singularity

Once you understand this, you can apply your fame toward getting more people to use free software and Python. The reason so many people know Linus Torvalds’s name is that he released his code under the GPL, a license whose viral nature encourages people to work together. Proprietary software makes as much sense as a proprietary Wikipedia.

I would be happy to discuss any of this further.

Regards,

-Keith
—————–
Response from Ray Kurzweil 11/3/2010:

I agree with you that open source software is a vital part of our world, allowing everyone to contribute. Ultimately software will provide everything we need when we can turn software entities into physical products with desktop nanofactories (there is already a vibrant 3D printer industry, and the scale of key features is shrinking by a factor of a hundred in 3D volume each decade). It will also provide the keys to health and greatly extended longevity as we reprogram the outdated software of life. I believe we will achieve the original goals of communism (“from each according to their ability, to each according to their need”), which forced collectivism failed so miserably to achieve. We will do this through a combination of the open source movement and the law of accelerating returns (which states that the price-performance and capacity of all information technologies grow exponentially over time). But proprietary software has an important role to play as well. Why do you think it persists? If open source forms of information met all of our needs, why would people still purchase proprietary forms of information? There is open source music, but people still download music from iTunes, and so on. Ultimately the economy will be dominated by forms of information that have value, and these two sources – open source and proprietary – will coexist.
———
Response back from Keith:
Free versus proprietary isn’t a question of whether only certain things have value. A Linux DVD contains 10 billion dollars’ worth of software. Proprietary software exists for a reason similar to why ignorance and starvation exist: a lack of better systems. The best thing my former employer Microsoft has going for it is ignorance about the benefits of free software. Free software gets better only as more people use it. Proprietary software is an inferior development model and anathema to science because it hinders people’s ability to work together. It has infected many corporations, and I’ve found that even PhDs who work for public institutions often write proprietary software.

Here is a paragraph from my writings I will copy here:

I start the AI chapter of my book with the following question: imagine 1,000 people, broken into groups of five, working on two hundred separate encyclopedias, versus the same 1,000 people working on one encyclopedia. Which will be better? This sounds like a silly analogy when described in the context of an encyclopedia, but it is exactly what is going on in artificial intelligence (AI) research today.

Today, the research community has not adopted free software and shared codebases sufficiently. For example, I believe there are more than enough PhDs today working on computer vision, but there are 200+ different codebases plus countless proprietary ones. Simply put, there is no computer vision codebase with critical mass.

We’ve known approximately what a neural network should look like for many decades. We need “places” for people to work together to hash out the details. A free software repository provides such a place. We need free software, and for people to work in “official” free software repositories.

“Open source forms of information,” I have found, is a separate topic from the software issue. Software always reads, modifies, and writes data (state that lives beyond the execution of the software), and there can be an interesting discussion about the licenses of that data. But movies and music aren’t science, so licensing doesn’t matter for most of them. Someone can only sell or give away a song after the software is written and on their computer in the first place. Some of this content can be free and some can be protected, and this is an interesting question, but it is mostly a separate topic. The important things to share are scientific knowledge and software.

It is true that software always needs data to be useful: configuration parameters, test files, documentation, etc. A computer vision engine will have lots of data, even though most of it is used only for testing and little of it at runtime. (Perhaps it has learned the letters of the alphabet, state that it caches between executions.) Software begets data, and data begets software; people write code to analyze the Wikipedia corpus. But you can’t truly have a discussion about sharing information unless you’ve got a shared codebase in the first place.

I agree that proprietary software is and should be allowed in a free market. If someone wants to sell something useful that another person finds value in and wants to pay for, I have no problem with that. But free software is a better development model, and we should be encouraging, even demanding, it. I’ll end with a quote from Linus Torvalds:

Science may take a few hundred years to figure out how the world works, but it does actually get there, exactly because people can build on each others’ knowledge, and it evolves over time. In contrast, witchcraft/alchemy may be about smart people, but the knowledge body never “accumulates” anywhere. It might be passed down to an apprentice, but the hiding of information basically means that it can never really become any better than what a single person/company can understand.
And that’s exactly the same issue with open source (free) vs proprietary products. The proprietary people can design something that is smart, but it eventually becomes too complicated for a single entity (even a large company) to really understand and drive, and the company politics and the goals of that company will always limit it.

The world is screwed: while we have things like Wikipedia and Linux, we don’t have places for computer vision and lots of other scientific knowledge to accumulate. To get driverless cars, we don’t need any more hardware, and we don’t need any more programmers; we just need 100 scientists to work together in SciPy and under the GPL, ASAP!

Regards,

-Keith

Did you know that many researchers are searching for light-capturing materials that could convert more of the sun’s power into carbon-free electricity?

A new study reported in August this year in the journal Applied Physics Letters (published by the American Institute of Physics) explains how solar energy could potentially be collected using oxide materials that contain the element selenium. A team at the Lawrence Berkeley National Laboratory in Berkeley, California, embedded selenium in zinc oxide, a relatively inexpensive material that could make more efficient use of the sun’s power.

The team noticed that even a relatively small amount of selenium, just 9 percent of the mostly zinc-oxide base, significantly enhanced the material’s efficiency in absorbing light.

The study’s lead author, Marie Mayer (a fourth-year doctoral student at the University of California, Berkeley), says that photoelectrochemical water splitting, which means using energy from the sun to cleave water into hydrogen and oxygen gases, could be the most fascinating future application of her work. Managing this reaction is key to the eventual production of zero-emission hydrogen-powered vehicles, which hypothetically would run only on water and sunlight.

Journal reference: Marie A. Mayer et al., Applied Physics Letters, 2010 [link: http://link.aip.org/link/APPLAB/v97/i2/p022104/s1]

The conversion efficiency of a PV cell is the proportion of sunlight energy that the photovoltaic cell converts to electric power. This is very important when discussing PV products, because improving this efficiency is vital to making photovoltaic energy competitive with more traditional sources of energy (e.g., fossil fuels).

For comparison, the earliest photovoltaic products converted about 1–2% of sunlight energy into electric energy. Today’s photovoltaic devices convert 7–17% of light energy into electric energy. Of course, the other side of the equation is the cost of producing the PV devices. This has improved over the decades as well; in fact, today’s PV systems generate electricity at a fraction of the cost of early PV systems.
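As a back-of-the-envelope sketch of what these efficiency figures mean (the 1000 W/m² insolation value is the standard test-condition assumption, and the panel area is invented for the example; neither comes from the article):

```python
# Conversion efficiency is simply electric power out divided by sunlight power in.
def electrical_output_watts(insolation_w_per_m2: float, area_m2: float, efficiency: float) -> float:
    """Electric power a PV panel produces at a given conversion efficiency."""
    return insolation_w_per_m2 * area_m2 * efficiency

# One square meter under standard test-condition sunlight (about 1000 W/m^2):
early = electrical_output_watts(1000, 1.0, 0.01)   # an early ~1% cell: about 10 W
modern = electrical_output_watts(1000, 1.0, 0.17)  # a modern ~17% cell: about 170 W
print(early, modern)
```

So the same square meter of cell area delivers roughly seventeen times the power it did in the earliest devices.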

In the 1990s, when silicon cells were twice as thick as today’s, efficiencies were much lower and lifetimes shorter, and it may well have cost more energy to make a cell than it could generate in its lifetime. Since then the technology has progressed significantly, and the energy payback time (defined as the time needed to recover the energy spent producing the system) of a modern photovoltaic module is generally 1 to 4 years, depending on the module type and location.

Usually, thin-film technologies, despite having comparatively low conversion efficiencies, achieve significantly shorter energy payback times than standard systems (often under 1 year). With a normal lifetime of 20 to 30 years, this means that contemporary photovoltaic cells are net energy producers: they generate significantly more energy over their lifetime than was expended in producing them.
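The payback arithmetic can be sketched as follows; the embodied-energy and annual-output numbers below are invented for illustration, chosen only to land inside the 1-to-4-year range stated above:

```python
def energy_payback_years(embodied_energy_kwh: float, annual_output_kwh: float) -> float:
    """Years a module needs to generate the energy that was spent producing it."""
    return embodied_energy_kwh / annual_output_kwh

def lifetime_energy_return(lifetime_years: float, payback_years: float) -> float:
    """How many times over the module repays its embodied energy during its life."""
    return lifetime_years / payback_years

# Hypothetical module: 600 kWh of energy to manufacture, generating 300 kWh per year.
payback = energy_payback_years(600, 300)   # 2.0 years
ret = lifetime_energy_return(25, payback)  # a 25-year module repays itself 12.5 times
print(payback, ret)
```

Any return ratio above 1.0 makes the cell a net energy producer, which is the article’s point.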

The author, Rosalind Sanders, writes for the solar pool cover ratings blog, a personal weblog focused on tips to help homeowners save energy with solar power.

Perhaps you think I’m crazy or naive to pose this question. But more and more over the past few months I’ve begun to wonder whether this idea may not be too far off the mark.

Not because of some half-baked theory about a global conspiracy or anything of the sort, but simply based upon the recent behavior of many multinational corporations and the effects this behavior is having upon people everywhere.

Again, you may disagree, but my perspective on these financial giants is that they are essentially predatory in nature, and that their prey is any dollar in commerce that they can possibly absorb. The problem is that for anyone in the modern, or even quasi-modern, world, money is nearly as essential as plasma when it comes to our well-being.

It has been clearly demonstrated again and again, all over the world, that when a population has become so destitute that the survival of the individual is actually threatened, violence inevitably occurs. On a large enough scale this violence can erupt into civil war, and wars, as we all know too well, can spread like a virus across borders, even oceans.

Until fairly recently, corporations were not big enough, powerful enough, or sufficiently meshed with our government to push the US population to the point of violence. Perhaps we’re not there yet, but between the bank bailout, the housing crisis, the bailouts of the automakers, the subsidies to the big oil companies, and ten thousand other government gifts that come straight from the taxpayer, I fear we are getting ever closer to the brink.

Who knows: it might just take one little thing, like that new one-dollar charge many stores have suddenly begun instituting for any purchase made with an ATM or credit card, to push us over the edge.

The last time I got hit with one of these dollar charges, I thought about the ostensible reason for it: the credit card company is now charging the merchant more per transaction, so the merchant is passing that cost on to you. But this isn’t the whole story. The merchant is actually charging you more than the transaction costs him, and even if this violates the law or the terms-of-service agreement between the card company and the merchant, the credit card company looks the other way, because the merchant’s surcharge makes the transaction bigger and increases the company’s profits even further.

Death by big blows or a thousand cuts: the question is whether we will be forced to do something about it before the big corporations eat us alive.

Existential Threats

With our growing resources, the Lifeboat Foundation has teamed with the Singularity Hub as Media Sponsors for the 2010 Humanity+ Summit. If you have suggestions on future events that we should sponsor, please contact [email protected].

The summer 2010 “Humanity+ @ Harvard — The Rise Of The Citizen Scientist” conference is being held on the East Coast, at Harvard University’s prestigious Science Hall, on June 12–13, following the inaugural conference in Los Angeles in December 2009. Futurist, inventor, and author of the NYT bestseller “The Singularity Is Near”, Ray Kurzweil will be the keynote speaker.

Also speaking at the H+ Summit @ Harvard is Aubrey de Grey, a biomedical gerontologist based in Cambridge, UK, and Chief Science Officer of SENS Foundation, a California-based charity dedicated to combating the aging process. His talk, “Hype and anti-hype in academic biogerontology research: a call to action”, will analyze the interplay of over-pessimistic and over-optimistic positions on the research and development of cures, and propose solutions to alleviate the negative effects of both.

The theme is “The Rise Of The Citizen Scientist”, as illustrated by Alex Lightman, Executive Director of Humanity+, in his talk:

“Knowledge may be expanding exponentially, but the current rate of civilizational learning and institutional upgrading is still far too slow in the century of peak oil, peak uranium, and ‘peak everything’. Humanity needs to gather vastly more data as part of ever larger and more widespread scientific experiments, and make science and technology flourish in streets, fields, and homes as well as in university and corporate laboratories.”

Humanity+ Summit @ Harvard is an unmissable event for everyone who is interested in the evolution of the rapidly changing human condition, and the impact of accelerating technological change on the daily lives of individuals, and on our society as a whole. Tickets start at only $150, with an additional 50% discount for students registering with the coupon STUDENTDISCOUNT (valid student ID required at the time of admission).

With over 40 speakers and 50 sessions in two jam-packed days, attendees and speakers will have many opportunities to interact and discuss, complementing the conference with the necessary networking component.

Other speakers already listed on the H+ Summit program page include:

  • David Orban, Chairman of Humanity+: “Intelligence Augmentation, Decision Power, And The Emerging Data Sphere”
  • Heather Knight, CTO of Humanity+: “Why Robots Need to Spend More Time in the Limelight”
  • Andrew Hessel, Co-Chair at Singularity University: “Altered Carbon: The Emerging Biological Diamond Age”
  • M. A. Greenstein, Art Center College of Design: “Sparking our Neural Humanity with Neurotech!”
  • Michael Smolens, CEO of dotSUB: “Removing language as a barrier to cross cultural communication”

New speakers will be announced in rapid succession, rounding out a schedule that is guaranteed to inform, intrigue, stimulate, and provoke, advancing our planetary understanding of the evolution of the human condition!

H+ Summit @ Harvard — The Rise Of The Citizen Scientist
June 12–13, Harvard University
Cambridge, MA

You can register at http://www.eventbrite.com/event/648806598/friendsofhplus/4141206940.

MediaX at Stanford University is a collaboration between the university’s top technology researchers and companies innovating in today’s leading industries.

Starting next week, MediaX is putting on an exciting series of courses in The Summer Institute at Wallenberg Hall, on Stanford’s campus.

Course titles that are still open are listed below, and you can register and see the full list here. See you there!

————–

July 20: Social Connectedness in Ambient Intelligent Environments, Clifford Nass and Boris deRuyter

July 23: Semantic Integration, Carl Hewitt

August 3–4: Social Media Collaboratory, Howard Rheingold

August 5–6: New Metrics for New Media: Analytics for Social Media and Virtual Worlds, Martha Russell and Marc Smith

August 7: Media and Management Bridges Between Heart and Head for Impact, Neerja Raman

August 10–11: Data Visualization: Theory and Practice, Jeff Heer, David Kasik and John Gerth

August 12: Technology Transfer for Silicon Valley Outposts, Jean Marc Frangos, Chuck House

August 12–14: Collaborative Visualization for Collective, Connective and Distributed Intelligence, Jeff Heer, Bonnie deVarco, Katy Borner

————-

Cross-posted from Next Big Future by Brian Wang, Lifeboat Foundation Director of Research

I am presenting disruption events for humans, and also for biospheres and planets, and, where I can, correlating them with historical frequency and scale.

There has been previous work on categorizing and classifying extinction events. There is Bostrom’s paper, and there is also the work by Jamais Cascio and Michael Anissimov on classifying and identifying risks (presented below).

A recent article discusses the inevitable “end of societies” (it refers to civilizations, but it seems to mean things more like the end of the Roman Empire, after which Italy, Austria-Hungary, etc. still eventually emerged).

The theories around complexity seem to me to say that core developments along connected S-curves of technology and societal processes (around key areas such as energy, transportation, governing efficiency, agriculture, and production) cap out, and then a society falls back (a soft or hard dark age), reconstitutes, and starts back up again.

Here is a wider range of disruptions, which can also be correlated with how frequently they have occurred historically.

High growth dropping to low growth (short business cycles, every few years)
Recession, soft or deep (every five to fifteen years)
Depression (every 50–100 years, can be more frequent)

List of recessions for the USA (includes depressions)

Differences between a recession and a depression

A good rule of thumb for determining the difference between a recession and a depression is to look at the change in real GDP. A depression is any economic downturn where real GDP declines by more than 10 percent; a recession is an economic downturn that is less severe. By this yardstick, the last depression in the United States was from May 1937 to June 1938, when real GDP declined by 18.2 percent. The Great Depression of the 1930s can be seen as two separate events: an incredibly severe depression lasting from August 1929 to March 1933, during which real GDP declined by almost 33 percent, a period of recovery, and then the less severe depression of 1937–38. (Depressions come every 50–100 years; they were more frequent in the past.)
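The rule of thumb reduces to a single threshold, which can be sketched as a toy classifier (the 10 percent cutoff is the one stated above; the function name and structure are mine):

```python
def classify_downturn(real_gdp_decline_pct: float) -> str:
    """Label a downturn using the more-than-10%-real-GDP-decline rule of thumb."""
    if real_gdp_decline_pct > 10.0:
        return "depression"
    if real_gdp_decline_pct > 0.0:
        return "recession"
    return "no downturn"

print(classify_downturn(18.2))  # the 1937-38 episode: depression
print(classify_downturn(33.0))  # the 1929-33 episode: depression
print(classify_downturn(2.5))   # a milder downturn: recession
```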

Dark age (a period of societal collapse, soft/light or regular)
I would say the difference between a long recession and a dark age has to do with a breakdown of societal order, some level of population decline or die-back, and a loss of knowledge or breakdown of education. (Once per thousand years.)

I would say that a soft dark age is also something like what China had from the 1400s to 1970: basically a series of really bad societal choices. Maybe it is something between a depression and a dark age, or something that does not categorize as neatly, but it amounted to underperformance by a factor of twenty versus competing groups. Perhaps there should be some kind of scale of societal disorder, with levels and categories of major society-wide screw-ups: historic-level mistakes. The Chinese experience, I think, was triggered by the renunciation of the ocean-going fleet and of outside ideas and technology, plus a lot of follow-on screw-ups.

Plagues played a part in weakening the Roman and Han empires.

There is a talk on societal collapse which includes a Toynbee analysis.

Toynbee argues that the breakdown of civilizations is not caused by loss of control over the environment, over the human environment, or attacks from outside. Rather, it comes from the deterioration of the “Creative Minority,” which eventually ceases to be creative and degenerates into merely a “Dominant Minority” (which forces the majority to obey without meriting obedience). He argues that creative minorities deteriorate due to a worship of their “former self,” by which they become prideful and fail to adequately address the next challenge they face.

My take is that the Enlightenment would be strengthened by a larger creative majority, in which everyone has a stake and the capability to creatively advance society. I have an article about who the elite are now.

Many now argue that the Dark Ages were not as completely bad as commonly believed.
The Dark Ages are also called the Middle Ages.

Population during the middle ages

Between dark age/social collapse and extinction, there are levels of decimation/devastation (using orders of magnitude: 90+%, 99%, 99.9%, 99.99%).

Level 1 decimation = 90% population loss
Level 2 decimation = 99% population loss
Level 3 decimation = 99.9% population loss

Level 9 decimation (99.9999999% population loss) would pretty much be extinction for current human civilization: only 6–7 people left, or fewer, which would not be a viable population.

Can be regional or global, some number of species (for decimation)
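These levels follow a simple order-of-magnitude pattern: level n means only a fraction 10^-n of the population survives. A minimal sketch (the 6.8 billion starting population is an assumed 2010-era figure, not a number from the text):

```python
def loss_fraction(level: int) -> float:
    """Level-n decimation: all but 10**-n of the population is lost."""
    return 1.0 - 10.0 ** (-level)

def survivors(population: float, level: int) -> float:
    """How many individuals remain after a level-n decimation."""
    return population * 10.0 ** (-level)

print(loss_fraction(1))              # 0.9   -> 90% loss (Level 1)
print(loss_fraction(3))              # 0.999 -> 99.9% loss (Level 3)
print(survivors(6_800_000_000, 9))   # about 6.8 people left: effectively extinction
```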

Categorizations of Extinctions, end of world categories

Can be regional or global, some number of species (for extinctions)

Mass extinction events have occurred in the past, to other species: the dinosaurs, and many others. (For each species there can only be one extinction event.)

Unfortunately Michael’s Accelerating Future blog is having some issues, so here is a cached link.

Michael was identifying manmade risks. The easier-to-explain existential risks (remember, an existential risk is something that can set humanity way back, not necessarily kill everyone):

1. neoviruses
2. neobacteria
3. cybernetic biota
4. Drexlerian nanoweapons

The hardest to explain is probably #4. My proposal here is that, if someone has never heard of the concept of existential risk, it’s easier to focus on these first four before even daring to mention the latter ones. But here they are anyway:

5. runaway self-replicating machines (“grey goo” is not recommended as a term, because it is too narrow)
6. destructive takeoff initiated by an intelligence-amplified human
7. destructive takeoff initiated by a mind upload
8. destructive takeoff initiated by an artificial intelligence

Another classification scheme is the eschatological taxonomy by Jamais Cascio on Open the Future. His scheme has seven categories, one with two sub-categories. These are:

0: Regional Catastrophe (examples: moderate-case global warming, minor asteroid impact, local thermonuclear war)
1: Human Die-Back (examples: extreme-case global warming, moderate asteroid impact, global thermonuclear war)
2: Civilization Extinction (examples: worst-case global warming, significant asteroid impact, early-era molecular nanotech warfare)
3a: Human Extinction-Engineered (examples: targeted nano-plague, engineered sterility absent radical life extension)
3b: Human Extinction-Natural (examples: major asteroid impact, methane clathrates melt)
4: Biosphere Extinction (examples: massive asteroid impact, “iceball Earth” reemergence, late-era molecular nanotech warfare)
5: Planetary Extinction (examples: dwarf-planet-scale asteroid impact, nearby gamma-ray burst)
X: Planetary Elimination (example: post-Singularity beings disassemble the planet to make computronium)

A couple of interesting posts about historical threats to civilization and life by Howard Bloom.

Natural climate shifts, and threats from space (not asteroids but interstellar gases).

Humans are not the most successful form of life; bacteria are. Bacteria have survived for 3.85 billion years, humans for 100,000 years, and all other kinds of life lasted no more than 160 million years. [Other species have only managed to hang in there for anywhere from 1.6 million years to 160 million. We humans are one of the shortest-lived natural experiments around. We’ve been here in one form or another for a paltry two and a half million years.] If your numbers are not big enough and you are not diverse enough, then something in nature eventually wipes you out.

Following the bacterial survival model could mean using transhumanism as a survival strategy: creating more diversity to allow for better survival. Think of humans adapted to living under the sea, deep in the earth, or in various niches in space; humans with more radiation resistance; non-biological forms; etc. It would also mean spreading into space (panspermia). Individually, using technology, we could become very successful at life extension, but it will take more than that for good long-term survival planning for the human civilization, society, and species.

Other periodic challenges:
142 mass extinctions; 80 glaciations in the last two million years; a planet that may once have been a frozen iceball; and a klatch of global warmings in which the temperature soared by 18 degrees in ten years or less.

In the last 120,000 years there were 20 interludes in which the temperature of the planet shot up 10 to 18 degrees within a decade. Until just 10,000 years ago, the Gulf Stream shifted its route every 1,500 years or so. Such a shift would melt mega-islands of ice, put our coastal cities beneath the surface of the sea, and strip our farmlands of the conditions they need to produce the food that feeds us.

The solar system has a 240-million-year orbit around the center of our galaxy, an orbit that takes us through interstellar gas clusters called local fluff: interstellar clusters that strip our planet of its protective heliosphere, bombard the earth with cosmic radiation, and trigger giant climate change.