Language modelling at scale: Gopher, ethical considerations, and retrieval

We are publishing three papers: a detailed study of Gopher, a 280-billion-parameter transformer language model; a study of the ethical and social risks associated with large language models; and an investigation of a new retrieval-based architecture with improved training efficiency.