“I think it’s changed everything, and I think it’s changed everything fundamentally,” James Livingston, a history professor at Rutgers University and the author of No More Work: Why Full Employment Is a Bad Idea, told Vox.
We’ll (probably) always have work, but could the job as the centerpiece of American life be on the way out?
To understand the question, you have to know how the country got to where it is today.

The story starts, to some degree, with a failure. Much of American labor law — as well as the social safety net, such as it is — stems from union organizing and progressive action at the federal level in the 1930s, culminating in the New Deal. At that time, many unions were pushing for a national system of pensions not tied to jobs, as well as national health care, Nelson Lichtenstein, a history professor at the University of California, Santa Barbara, told Vox. They did win Social Security, but because many people were left out, such as agricultural and domestic workers, it wasn't a full nationwide retirement system. And when it came to universal health care, they lost entirely.