OpenAI, the non-profit AI research lab co-founded and initially funded by Elon Musk, has created a breakthrough system for generating high-quality text. Beyond text generation, it performs basic reading comprehension, machine translation, question answering, and summarization, all without task-specific training.
The system can take a few sentences of sample writing and produce a multi-paragraph article in the style and context of that sample. This capability would allow an AI to imitate the writing style of any person from prior writing samples.
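To make the prompt-then-continue workflow concrete, here is a minimal sketch using the publicly released GPT-2 weights through the Hugging Face `transformers` library, a third-party interface rather than OpenAI's original code. The prompt echoes the famous unicorn example from OpenAI's announcement; everything else is illustrative.

```python
# A sketch of prompt-conditioned generation with the released GPT-2
# weights, loaded through Hugging Face's transformers library
# (an assumption for illustration; not OpenAI's original tooling).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# A few sentences of sample writing serve as the prompt; the model
# continues in the same style and context.
prompt = (
    "In a shocking finding, scientists discovered a herd of unicorns "
    "living in a remote, previously unexplored valley."
)
outputs = generator(prompt, max_new_tokens=100, num_return_sequences=1)
print(outputs[0]["generated_text"])
```

Because generation is sampled, each run produces a different continuation in the prompt's style.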
GPT-2 is a 1.5-billion-parameter Transformer that achieves state-of-the-art results on 7 out of 8 tested language modeling datasets in a zero-shot setting, yet it still underfits (that is, fails to fully model) its training corpus, a dataset called WebText. Samples from the model reflect these improvements and contain coherent paragraphs of text. These findings suggest a promising path towards building language processing systems that learn to perform tasks from naturally occurring demonstrations.
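"Zero-shot" here means no fine-tuning at all: tasks are induced purely by the prompt. For instance, the GPT-2 paper reports triggering summarization by appending "TL;DR:" to an article and sampling with top-k = 2. A hedged sketch of that recipe, again via the Hugging Face `transformers` library and with placeholder article text:

```python
# Zero-shot summarization as described in the GPT-2 paper: appending
# "TL;DR:" after an article induces summarizing behavior without any
# task-specific training. Loading via Hugging Face is an assumption
# for illustration.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

article = "..."  # full article text goes here (placeholder)
inputs = tokenizer(article + "\nTL;DR:", return_tensors="pt")
summary_ids = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_k=2,  # the paper uses top-k random sampling with k = 2
)
# Strip the prompt tokens so only the generated summary is printed.
print(tokenizer.decode(summary_ids[0][inputs["input_ids"].shape[1]:]))
```

The same pattern, with a different prompt format, is how the paper elicits translation and question answering from the unmodified model.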