
Many people reject scientific expertise and prefer ideology to facts. Lee McIntyre argues that anyone can and should fight back against science deniers.
Watch the Q&A: https://youtu.be/2jTiXCLzMv4
Lee’s book “How to Talk to a Science Denier” is out now: https://geni.us/leemcintyre.

“Climate change is a hoax—and so is coronavirus.” “Vaccines are bad for you.” Many people may believe such statements, but how can scientists and informed citizens convince these ‘science deniers’ that their beliefs are mistaken?

Join Lee McIntyre as he draws on his own experience, including a visit to a Flat Earth convention, as well as on academic research, to explain the common themes of science denialism.

Lee McIntyre is a Research Fellow at the Center for Philosophy and History of Science at Boston University and an Instructor in Ethics at Harvard Extension School. He holds a B.A. from Wesleyan University and a Ph.D. in Philosophy from the University of Michigan (Ann Arbor). He has taught philosophy at Colgate University (where he won the Fraternity and Sorority Faculty Award for Excellence in Teaching Philosophy), Boston University, Tufts Experimental College, Simmons College, and Harvard Extension School (where he received the Dean’s Letter of Commendation for Distinguished Teaching). Formerly Executive Director of the Institute for Quantitative Social Science at Harvard University, he has also served as a policy advisor to the Executive Dean of the Faculty of Arts and Sciences at Harvard and as Associate Editor in the Research Department of the Federal Reserve Bank of Boston.

This talk was recorded on 24 August 2021.



Summary: Exposure to even low levels of common chemicals called organophosphate esters can harm IQ, memory, learning, and overall brain development in young children.

Source: Green Science Policy Institute

Chemicals increasingly used as flame retardants and plasticizers pose a larger risk to children’s brain development than previously thought, according to a commentary published today in Environmental Health Perspectives.

The research team reviewed dozens of human, animal, and cell-based studies and concluded that exposure to even low levels of the chemicals—called organophosphate esters—may harm IQ, attention, and memory in children in ways regulators have not yet examined.

“The use of organophosphate esters in everything from TVs to car seats has proliferated under the false assumption that they’re safe,” said Heather Patisaul, lead author and neuroendocrinologist at North Carolina State University. “Unfortunately, these chemicals appear to be just as harmful as the chemicals they’re intended to replace but act by a different mechanism.”

Advanced Nuclear Power Advocacy For Humanity — Eric G. Meyer, Founder & Director, Generation Atomic


Eric G. Meyer is the Founder and Director of Generation Atomic (https://generationatomic.org/), a nuclear advocacy non-profit. He started the organization after hearing about the promise of advanced nuclear reactors and deciding to devote his life to saving and expanding the use of atomic energy.

Eric worked as an organizer on several political, union, and issue campaigns while in graduate school for applied public policy, taking time off to attend the climate talks in Paris and sing opera about atomic energy.

Eric began his full-time nuclear work in May 2016 with Environmental Progress, organizing marches, rallies, and trainings in California, New York, and Illinois, before leaving to found Generation Atomic in late 2016.

In a short time, Generation Atomic has made significant progress in the world of nuclear advocacy. Over the last year they’ve held several advocacy trainings at conferences, joined the March for Science, talked to tens of thousands of voters, and carried the banner for nuclear energy at the climate talks in Morocco, Germany, and Poland.

Eric attended the University of Minnesota Duluth, where he obtained a Master’s degree in Advocacy and Political Leadership, with concentrations in the public sector and non-profits, and a Bachelor of Arts in Music Performance.

Without a new legal framework, autonomous weapon systems could destabilize societal norms.


Autonomous weapon systems – commonly known as killer robots – may have killed human beings for the first time ever last year, according to a recent United Nations Security Council report on the Libyan civil war. History could well identify this as the starting point of the next major arms race, one that has the potential to be humanity’s final one.

Autonomous weapon systems are robots with lethal weapons that can operate independently, selecting and attacking targets without a human weighing in on those decisions. Militaries around the world are investing heavily in autonomous weapons research and development. The U.S. alone budgeted US$18 billion for autonomous weapons between 2016 and 2020.

Meanwhile, human rights and humanitarian organizations are racing to establish regulations and prohibitions on such weapons development. Without such checks, foreign policy experts warn that disruptive autonomous weapons technologies will dangerously destabilize current nuclear strategies, both because they could radically change perceptions of strategic dominance, increasing the risk of preemptive attacks, and because they could be combined with chemical, biological, radiological and nuclear weapons themselves.

Accounting and consulting firm PwC told Reuters on Thursday it will allow all its 40,000 U.S. client services employees to work virtually and live anywhere they want in perpetuity, making it one of the biggest employers to embrace permanent remote work.

The policy is a departure from the accounting industry’s rigid attitudes, known for encouraging people to put in late nights at the office. Other major accounting firms, such as Deloitte and KPMG, have also been giving employees more choice to work remotely in the face of the COVID-19 pandemic.


A dress worn this week by Democratic Congresswoman Alexandria Ocasio-Cortez (D-NY), which bore the message “tax the rich,” set off a wave of debate over how best to address wealth inequality, as Congress weighs a $3.5 trillion spending bill that includes tax hikes on corporations and high-earning individuals.

The debate coincides with the ongoing pandemic, during which billionaires, many of whom are tech company founders, have added $1.8 trillion in wealth while consumers have come to depend increasingly on services like e-commerce and teleconferencing, according to a report released last month by the Institute for Policy Studies.

In a new interview, artificial intelligence expert Kai-Fu Lee — who worked as an executive at Google (GOOG, GOOGL), Apple (AAPL), and Microsoft (MSFT) — attributed the rise of wealth inequality in part to the tech boom of recent decades, predicting that the trend will worsen in coming years with the continued emergence of AI.

A new Rutgers study will examine how COVID-19 is affecting individuals in a number of cognition-related areas, including memory loss, “brain fog,” and dementia.

“Many people who recover from mild or moderate COVID-19 notice slowed thinking or memory loss, and this motivated us to leverage our experience in studying cognitive issues related to Alzheimer’s disease, multiple sclerosis, and HIV to examine this phenomenon,” said Dr. William T. Hu, associate professor and chief of cognitive neurology at Rutgers Robert Wood Johnson Medical School and the Institute for Health, Health Care Policy, and Aging Research.

A leading cognitive neurologist and neuroscientist, Dr. Hu is spearheading the characterization of cognitive impairment following mild-to-moderate COVID-19 at Rutgers.

U.S. Secretary of Commerce Gina Raimondo has announced that the Commerce Department has established a high-level committee to advise the President and other federal agencies on a range of issues related to artificial intelligence (AI). Working with the National AI Initiative Office (NAIIO) in the White House Office of Science and Technology Policy (OSTP), the Department is now seeking to recruit top-level candidates to serve on the committee.

A formal notice describing the National Artificial Intelligence Advisory Committee (NAIAC), along with the call for nominations for the committee and its Subcommittee on Artificial Intelligence and Law Enforcement, appears in today’s Federal Register.

“AI presents an enormous opportunity to tackle the biggest issues of our time, strengthen our technological competitiveness, and be an engine for growth in nearly every sector of the economy,” said Secretary Raimondo. “But we must be thoughtful, creative, and wise in how we address the challenges that accompany these new technologies. That includes, but is not limited to, ensuring that President Biden’s comprehensive commitment to advancing equity and racial justice extends to our development and use of AI technology. This committee will help the federal government to do that by providing insights into a full range of issues raised by AI.”