
To many people, the introduction of the first Macintosh computer and its graphical user interface in 1984 is viewed as the dawn of creative computing. But if you ask Dr. Nick Montfort, a poet, computer scientist, and assistant professor of Digital Media at MIT, he’ll offer a different direction and definition for creative computing and its origins.

Defining Creative

“Creative Computing was the name of a computer magazine that ran from 1974 through 1985. Even before micro-computing there was already this magazine extolling the capabilities of the computer to teach, to help people learn, help people explore and help them do different types of creative work, in literature, the arts, music and so on,” Montfort said.

“It was a time when people had a lot of hope that computing would enable people personally as artists and creators to do work. It was actually a different time than we’re in now. There are a few people working in those areas, but it’s not as widespread as hoped in the late ’70s or early ’80s.”

These days, Montfort notes that many people use the term “artificial intelligence” interchangeably with creative computing. While there are some parallels, Montfort said what is classically called AI isn’t the same as computational creativity. The difference, he says, is in the results.

“A lot of the ways in which AI is understood is the ability to achieve a particular known objective,” Montfort said. “In computational creativity, you’re trying to develop a system that will surprise you. If it does something you already knew about then, by definition, it’s not creative.”

Given that, Montfort quickly pointed out that creative computing can still come from known objectives.

“A lot of good creative computer work comes from doing things we already know computers can do well,” he said. “As a simple example, the difference between a computer as a producer of poetic language and person as a producer of poetic language is, the computer can just do it forever. The computer can just keep reproducing and, (with) that capability to bring it together with images to produce a visual display, now you’re able to do something new. There’s no technical accomplishment, but it’s beautiful nonetheless.”
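
As a rough illustration of that point, here is a minimal sketch of the kind of endlessly running text generator Montfort describes. The vocabulary and line template are invented for this example and are not taken from any of his programs.

```python
import random
import time

# Toy vocabulary and line template; these lists are invented for
# illustration and are not drawn from any of Montfort's generators.
ADJECTIVES = ["hollow", "silver", "patient", "electric", "borrowed"]
NOUNS = ["harbor", "signal", "orchard", "machine", "tide"]
VERBS = ["remembers", "dissolves", "answers", "gathers", "forgets"]

def line():
    """Compose one line of poetic language from the word lists."""
    return (f"the {random.choice(ADJECTIVES)} {random.choice(NOUNS)} "
            f"{random.choice(VERBS)} the {random.choice(NOUNS)}")

if __name__ == "__main__":
    # Unlike a human poet, the program can keep producing lines indefinitely.
    while True:
        print(line())
        time.sleep(1)
```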

Models of Creativity

A poet himself, Montfort keeps an eye on another area of creative computing: the study of models of creativity used to imitate human creativity. While the goal of such models may be to replicate human creativity, Montfort has a greater appreciation for end results that don’t necessarily appear human-like.

“Even if you’re using a model of human creativity the way it’s done in computational creativity, you don’t have to try to make something human-like, (even though) some people will try to make human-like poetry,” Montfort said. “I’d much rather have a system that is doing something radically different than human artistic practice and making these bizarre combinations than just seeing the results of imitative work.”

To further illustrate his point, Montfort cited a recent computer-generated novel contest that yielded some extraordinary, and unusual, results. Those novels were nothing close to what a human might have written, he said, but depending on the eye of the beholder, the results at least bode well for the future.

“A lot of the future of creative computing is individual engagement with creative types of programs,” Montfort said. “That’s not just using drawing programs or other facilities to do work or using prepackaged apps that might assist creatively in the process of composition or creation, but it’s actually going and having people work to code themselves, which they can do with existing programs, modifying them, learning about code and developing their abilities in very informal ways.”

That future of creative computing lies not in industrial creativity or video games, Montfort believes, but in the sharing of information and the re-envisioning of ideas across the many hands and minds of connected programmers.

“One doesn’t have to get a computer science degree or even take a formal class. I think the perspective of free software and open source is very important to the future of creative programming,” Montfort said. “…If people take an academic project and provide their work as free software, that’s great for all sorts of reasons. It allows people to replicate your results, it allows people to build on your research, but also, people might take the work that you’ve done and inflect it in different types of artistic and creative ways.”

From Wired: It’s taken close to half a decade. But WikiLeaks is back in the business of accepting truly anonymous leaks.

On Friday, the secret-spilling group announced that it has finally relaunched a beta version of its leak submission system, a file-upload site that runs on the anonymity software Tor to allow uploaders to share documents and tips while protecting their identity from any network eavesdropper, and even from WikiLeaks itself. The relaunch of that page—which in the past served as the core of WikiLeaks’ transparency mission—comes four and a half years after WikiLeaks’ last submission system went down amid infighting between WikiLeaks’ leaders and several of its disenchanted staffers. Read more

Article: Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Why the LHC must be shut down

Quoted: “IBM has unveiled its proof of concept for ADEPT, a system developed in partnership with Samsung that uses elements of bitcoin’s underlying design to build a distributed network of devices – a decentralized Internet of Things. The ADEPT concept, or Autonomous Decentralized Peer-to-Peer Telemetry, taps blockchains to provide the backbone of the system, utilizing a mix of proof-of-work and proof-of-stake to secure transactions.”

Read the article here > http://www.coindesk.com/ibm-reveals-proof-concept-blockchain-powered-internet-things/

Quoted: “If you understand the core innovations around the blockchain idea, you’ll realize that the technology concept behind it is similar to that of a database, except that the way you interact with that database is very different.

The blockchain concept represents a paradigm shift in how software engineers will write software applications in the future, and it is one of the key concepts behind the Bitcoin revolution that need to be well understood. In this post, I’d like to explain 5 of these concepts, and how they interrelate to one another in the context of this new computing paradigm that is unravelling in front of us. They are: the blockchain, decentralized consensus, trusted computing, smart contracts and proof of work / stake. This computing paradigm is important, because it is a catalyst for the creation of decentralized applications, a next-step evolution from distributed computing architectural constructs.”


Read the article here > http://startupmanagement.org/2014/12/27/the-blockchain-is-the-new-database-get-ready-to-rewrite-everything/
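
To make the database analogy concrete, the sketch below shows the core data structure the quote describes: blocks chained together by hashes and secured with a toy proof-of-work. It is only an illustration with invented field names; a real system adds peer-to-peer networking, consensus rules, and transaction validation.

```python
import hashlib
import json
import time

DIFFICULTY = 4  # number of leading zeros required; a toy value for illustration

def block_hash(block):
    """Hash the block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine_block(prev_hash, transactions):
    """Toy proof-of-work: find a nonce so the block hash has a zero prefix."""
    block = {"prev_hash": prev_hash, "timestamp": time.time(),
             "transactions": transactions, "nonce": 0}
    while not block_hash(block).startswith("0" * DIFFICULTY):
        block["nonce"] += 1
    return block

# Each block commits to its predecessor, so altering history invalidates
# every later hash -- the property that makes the shared ledger tamper-evident.
chain = [mine_block("0" * 64, ["genesis"])]
chain.append(mine_block(block_hash(chain[-1]), ["alice pays bob 5"]))

for b in chain:
    print(block_hash(b)[:16], b["transactions"])
```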

Quoted: “Ethereum will also be a decentralised exchange system, but with one big distinction. While Bitcoin allows transactions, Ethereum aims to offer a system by which arbitrary messages can be passed to the blockchain. More to the point, these messages can contain code, written in a Turing-complete scripting language native to Ethereum. In simple terms, Ethereum claims to allow users to write entire programs and have the blockchain execute them on the creator’s behalf. Crucially, Turing-completeness means that in theory any program that could be made to run on a computer should run in Ethereum.” And, quoted: “As a more concrete use-case, Ethereum could be utilised to create smart contracts, pieces of code that once deployed become autonomous agents in their own right, executing pre-programmed instructions. An example could be escrow services, which automatically release funds to a seller once a buyer verifies that they have received the agreed products.”
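
The escrow use-case can be pictured as a small state machine. The following sketch only illustrates that logic; the class and method names are invented, and on Ethereum the equivalent code would be written in a contract language and executed by the network itself rather than by a script someone runs.

```python
class EscrowContract:
    """Toy escrow: holds the buyer's funds until delivery is confirmed.
    Purely illustrative; on a platform like Ethereum this logic would be
    enforced by the blockchain, not by trusted local code."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.funded = False
        self.released = False

    def deposit(self, sender, value):
        # Funds become locked in the contract once the buyer pays in full.
        if sender == self.buyer and value == self.amount and not self.funded:
            self.funded = True

    def confirm_delivery(self, sender):
        # Once the buyer confirms receipt, payment releases automatically.
        if sender == self.buyer and self.funded and not self.released:
            self.released = True
            return (self.seller, self.amount)  # payout instruction
        return None

escrow = EscrowContract("alice", "bob", 10)
escrow.deposit("alice", 10)
print(escrow.confirm_delivery("alice"))  # -> ('bob', 10)
```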

Read Part One of this Series here » Ethereum — Bitcoin 2.0? And, What Is Ethereum.

Read Part Two of this Series here » Ethereum — Opportunities and Challenges.

Read Part Three of this Series here » Ethereum — A Summary.

Quoted: “Bitcoin technology offers a fundamentally different approach to vote collection with its decentralized and automated secure protocol. It solves the problems of both paper ballot and electronic voting machines, enabling a cost effective, efficient, open system that is easily audited by both individual voters and the entire community. Bitcoin technology can enable a system where every voter can verify that their vote was counted, see votes for different candidates/issues cast in real time, and be sure that there is no fraud or manipulation by election workers.”

Read the article here » http://www.entrepreneur.com/article/239809?hootPostID=ba473face1754ce69f6a80aacc8412c7
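
As an illustration of the verification claim, here is a toy sketch of a public vote ledger in which every voter holds a receipt they can check and anyone can recount the totals in real time. The receipt scheme and field names are invented, and real blockchain voting designs also have to handle ballot secrecy and voter eligibility, which this sketch ignores.

```python
import hashlib

# Toy public ledger of votes; in a blockchain system these entries would be
# transactions that anyone can read and audit.
ledger = []

def cast_vote(voter_secret, candidate):
    """Append a vote and return a receipt the voter can later look up."""
    receipt = hashlib.sha256(f"{voter_secret}:{candidate}".encode()).hexdigest()
    ledger.append({"receipt": receipt, "candidate": candidate})
    return receipt

def verify(receipt):
    """A voter checks that the entry holding their receipt was counted."""
    return any(entry["receipt"] == receipt for entry in ledger)

def tally():
    """Anyone can recount the public ledger at any time."""
    counts = {}
    for entry in ledger:
        counts[entry["candidate"]] = counts.get(entry["candidate"], 0) + 1
    return counts

r = cast_vote("alice-secret", "Candidate A")
cast_vote("bob-secret", "Candidate B")
print(verify(r), tally())
```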

Preamble: Bitcoin 1.0 is currency — the deployment of cryptocurrencies in applications related to cash, such as currency transfer, remittance, and digital payment systems. Bitcoin 2.0 is contracts — the whole slate of economic, market, and financial applications using the blockchain that are more extensive than simple cash transactions: stocks, bonds, futures, loans, mortgages, titles, smart property, and smart contracts.

Bitcoin 3.0 is blockchain applications beyond currency, finance, and markets, particularly in the areas of government, health, science, literacy, culture, and art.

Read the article here » http://ieet.org/index.php/IEET/more/swan20141110

My Brief Q&A session with Christoffer De Geer about BitCoin, Cryptocurrency, and Blockchain Technology.

This Q&A was first published by Mr. Geir Solem, Director of Cryptor Trust Inc., on the Cryptor Primary Investor Blog (Date: October 31, 2014).

Quote: “BitCoin was the first small step in what I believe will be a truly transformational journey, for each and every one of us. In 10 Years Cryptocurrency and Blockchains have every chance to have the same, or greater, impact on our lives, society, and civilization, as the creation of Email had to the Postal Service, and the Fax Machine as compared to the Internet; in 25 Years Monetary Systems, Systems of Trade and Exchange, Systems of Transaction of Goods, Ledger and Recordation Systems, Everything You Know – Will – Be – Different – and, Unrecognizable relative to what we know today at the end of the year 2014.”

See the Q&A article here » [Article: BitCoin, Cryptocurrency, and Blockchain Technology]

If the controversy over genetically modified organisms (GMOs) tells us something indisputable, it is this: GMO food products from corporations like Monsanto are widely suspected of endangering health. On the other hand, an individual’s right to genetically modify and even synthesize entire organisms as part of their own dietary or medical regimen could someday be a human right.
The suspicion that agri-giant companies do harm by designing crops is legitimate, even if evidence that GMOs themselves are harmful is scant or absent. Based on their own priorities and actions, we should have no doubt that self-interested corporations disregard the rights and wellbeing of local producers and consumers. This makes agri-giants producing GMOs harmful and untrustworthy, regardless of whether individual GMO products are actually harmful.
Corporate interference in government of the sort opposed by the Occupy Movement is also connected with the GMO controversy, as the US government is accused of going to great lengths to protect “stakeholders” like Monsanto via the law. This makes the GMO controversy more of a business and political issue than a scientific one, as I argued in an essay published at the Institute for Ethics and Emerging Technologies (IEET). Attacks on science and scientists themselves over the GMO controversy are not justified, as the problem lies solely with a tiny handful of businessmen and corrupt politicians.
An emerging area that threatens to become as controversial as GMOs, if the American corporate stranglehold on innovation is allowed to shape its future, is synthetic biology. In his 2014 book, Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, top synthetic biologist J. Craig Venter offers powerful words supporting a future shaped by ubiquitous synthetic biology in our lives:

“I can imagine designing simple animal forms that provide novel sources of nutrients and pharmaceuticals, customizing human stem cells to regenerate a damaged, old, or sick body. There will also be new ways to enhance the human body as well, such as boosting intelligence, adapting it to new environments such as radiation levels encountered in space, rejuvenating worn-out muscles, and so on.”

In his own words, Venter’s vision is no less than “a new phase of evolution” for humanity. It offers what Venter calls the “real prize”: a family of designer bacteria “tailored to deal with pollution or to absorb excess carbon dioxide or even meet future fuel needs”. Beyond this, the existing tools of synthetic biology are transhumanist in nature because they create limitless means for humans to enhance themselves to deal with harsher environments and extend their lifespans.
While there should be little public harm in the eventual ubiquity of the technologies and information required to construct synthetic life, the problems of corporate oligopoly and political lobbying threaten synthetic biology’s future as much as they threaten other facets of human progress. The best chance of an outcome that is maximally beneficial for the world relies on synthetic biology taking a radically different direction from GM agriculture. That alternative direction, of course, is an open source future for synthetic biology, as called for by Canadian futurist Andrew Hessel and others.
Calling himself a “catalyst for open-source synthetic biology”, Hessel is one of the growing number of experts who reject biotechnology’s excessive use of patents. Nature notes that his Pink Army Cooperative venture relies instead on “freely available software and biological parts that could be combined in innovative ways to create individualized cancer treatments — without the need for massive upfront investments or a thicket of protective patents”.
While offering some support to the necessity of patents, J. Craig Venter more importantly praises the annual International Genetically Engineered Machine (iGEM) competition in his book as a means of encouraging innovation. He specifically names the Registry of Standard Biological Parts, an open source library from which to obtain BioBricks, and describes this as instrumental for synthetic biology innovation. Likened to bricks of Lego that can be snapped together with ease by the builder, BioBricks are prepared standard pieces of genetic code, with which living cells can be newly equipped and operated as microscopic chemical factories. This has enabled students and small companies to reprogram life itself, taking part in new discoveries and innovations that would have otherwise been impossible without the direct supervision of the world’s best-trained teams of biologists.
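
The “snap together” idea can be pictured with a toy sketch like the one below. The part names and DNA strings are placeholders invented for illustration, not real entries from the Registry of Standard Biological Parts, and real assembly involves laboratory protocols rather than string concatenation.

```python
# Toy illustration of the "Lego brick" idea behind BioBricks: standardized
# parts composed, in order, into a single construct. The part names and DNA
# sequences are placeholders, NOT real Registry of Standard Biological Parts
# entries.
parts = {
    "promoter_X":   "TTGACATATAAT",            # placeholder promoter
    "rbs_Y":        "AGGAGG",                  # placeholder ribosome binding site
    "gene_Z":       "ATG" + "GCT" * 10 + "TAA",  # placeholder coding sequence
    "terminator_W": "AAAAAAGCCCGC",            # placeholder terminator
}

def assemble(part_names):
    """Concatenate standardized parts, in order, into one construct."""
    return "".join(parts[name] for name in part_names)

device = assemble(["promoter_X", "rbs_Y", "gene_Z", "terminator_W"])
print(len(device), device[:30] + "...")
```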
There is a similar movement towards popular synthetic biology by the name of biohacking, promoted by such experts as Ellen Jorgensen. This compellingly matches the calls for greater autonomy for individuals and small companies in medicine and human enhancement. Unfortunately, despite their potential to greatly empower consumers and farmers, such developments have not yet found resonance with anti-GMO campaigners, whose outright rejection of biotechnology has been described as anti-science and “bio-luddite” by techno-progressives. It is for this reason that emphasizing the excellent potential of biotechnology for feeding and fuelling a world plagued by dwindling resources is important, and a focus on the ills of big business rather than imagined spectres emerging from science itself is vital.
The concerns of anti-GMO activists would be better addressed by offering support to an alternative in the form of “do-it-yourself” biotechnology, rather than by rejecting sciences and industries that are already destined to be a fundamental part of humanity’s future. What needs to be made is a case for popular technology, in the hope that we can reject the portrayal of all advanced technology as an ally of powerful states and corporations and instead unlock its future as a means of liberation from global exploitation and scarcity.
While there are strong arguments that current leading biotechnology companies feel more secure and perform better when they retain rigidly enforced intellectual property rights, Andrew Hessel rightly points out that the open source future is less about economic facts and figures than about culture. The truth is that there is a massive cultural transition taking place. We can see a growing hostility to patents, and an increasing popular enthusiasm for open source innovation, most promisingly among today’s internet-borne youth.
In describing a cultural transition, Hessel is acknowledging the importance of the emerging body of transnational youth whose only ideology is the claim that information wants to be free, and we find the same culture reflected in the values of organizations like WikiLeaks. Affecting every facet of science and technology, the elite of today’s youth are crying out for a more open, democratic, transparent and consumer-led future at every level.

By Harry J. Bentham - More articles by Harry J. Bentham

Originally published at h+ Magazine on 21 August 2014