2022-03-28 Education in a Time of Pandemic IV

Prologue: What Education Cannot Do

Before taking up the long list of public school crypto- (and not-so-crypto) privatization initiatives in our final installment concerning the consequences for public education in a time of pandemic, let’s clear the slate regarding the stated purposes of education and the reason/s these purposes have become controversial over the past 30 years.

Below, a sampling from several state constitutions, setting out the purposes of education and responsibilities of (four) individual states for providing such an education to all residents. [Note: Highlighting is ours.]

(1)

Article 14, §1 Arkansas Constitution

Text of Section 1:

Free School System

Intelligence and virtue being the safeguards of liberty and the bulwark of a free and good government, the State shall ever maintain a general, suitable and efficient system of free public schools and shall adopt all suitable means to secure to the people the advantages and opportunities of education.

(2)

State Constitution, New Hampshire

Part 2, Form of Government, Encouragement of Literature, Trades, Etc., New Hampshire State Constitution.

[Art.] 83. [Encouragement of Literature, etc.; Control of Corporations, Monopolies, etc.] Knowledge and learning, generally diffused through a community, being essential to the preservation of a free government; and spreading the opportunities and advantages of education through the various parts of the country, being highly conducive to promote this end; it shall be the duty of the legislators and magistrates, in all future periods of this government, to cherish the interest of literature and the sciences, and all seminaries and public schools, to encourage private and public institutions, rewards, and immunities for the promotion of agriculture, arts, sciences, commerce, trades, manufactures, and natural history of the country; to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and economy, honesty and punctuality, sincerity, sobriety, and all social affections, and generous sentiments, among the people: Provided, nevertheless, that no money raised by taxation shall ever be granted or applied for the use of the schools or institutions of any religious sect or denomination.

(3)

ARTICLE VIII, Section 1. North Dakota Constitution

EDUCATION

Section 1. A high degree of intelligence, patriotism, integrity and morality on the part of every voter in a government by the people being necessary in order to insure the continuance of that government and the prosperity and happiness of the people, the legislative assembly shall make provision for the establishment and maintenance of a system of public schools which shall be open to all children of the state of North Dakota and free from sectarian control. This legislative requirement shall be irrevocable without the consent of the United States and the people of North Dakota.

(4)

The Page Amendment, Minnesota Constitution (proposed 2022; not yet approved*)

“All children have a fundamental right to a quality public education that fully prepares them with the skills necessary for participation in the economy, our democracy, and society, as measured against uniform achievement standards set forth by the state. It is a paramount duty of the state to ensure quality public schools that fulfill this fundamental right. The duty of the state established in this section does not infringe on the right of a parent to choose for their child a private, religious, or home school as an alternative to public education.”

Democrats have backed themselves into a corner over the past 30 years by defending education on the claim that a university degree represents the 21st-century path to the middle class, out of poverty and into the land of plenty. In other words, their argument for public education has been essentially an instrumental (as opposed to intrinsic) one: “Get an education and get a good-paying job.” The flip side of this argument? “If you don’t get an education, that’s on you.” It’s somewhat analogous to blaming those who became ill with COVID-19 because of the nature of their work, because they live in multi-generational households, or because they suffer from underlying comorbidities such as diabetes or autoimmune diseases: “Your bad luck is your own fault.”

It’s an argument that devalues and shames those without a college degree, who accounted for 62.5% of the U.S. adult population in 2020. How can roughly one-third of the population look down on the other two-thirds, when lacking a degree is, for most people, not a personal failing?

How’d we get here? Here are some of the major economic and political transformations in the U.S. economy and political environment over the past generation:

  • The rise of heavy industrial production from the mid- to late 19th century up through approximately 1970 (mechanization / urbanization / two World Wars which ignited the economy)
  • This shift in the economic base (to industrial production) was accompanied by a long, often violent conflict between industrial workers and owners (“robber barons”), culminating in the National Labor Relations Act (1935) that inter alia guaranteed all workers the right to organize without employers’ exercising unfair labor practices. Formerly rural farmworkers became industrial workers, were unionized, and gained access to wages well above poverty level, enabling their entry into the post-World War II middle class.
  • Inflation and changes in individual and corporate tax structures after around 1980 (in part fueled by the oil crisis/embargo of 1973-1974, which had been partly caused by the falling value of the U.S. dollar) and the rise of industrial capacity outside the U.S. initiated
  • A process of deindustrialization, i.e. the movement of large production facilities to locations where labor was cheaper; this process involved both shifts from more costly, more heavily-unionized northern states to southern ones, as well as the shift to other countries (offshoring) including Mexico, China, and Southeast Asia, where labor costs were a fraction of what they were in the U.S.
  • This loss of industrial production in turn led to the U.S.’s transformation from a production economy to a human and financial services economy, accompanied by
  • A precipitous drop in union membership (down to 10.3%, from 20% in 1983), which in turn
  • Depressed wages for workers, while it simultaneously
  • Increased profits for owners, leading to
  • An ever-growing gap in earnings/savings between owners/the (higher) professional classes and workers/service providers (cf., for example, the roughly 70% increase in the wealth of U.S. billionaires during the pandemic, whose combined worth soared from roughly $3T to $5T after March 2020), which has
  • Resulted in roughly 40% of Americans today lacking $400 in savings to cover an unforeseen emergency (car repair, dental work, a death in the family)

During the past 30-40 years, as these transformations were occurring, the working middle class lost ground continuously. Factories closed and no comparable jobs came to replace them. There was a shift from a “production” to a “service” economy. Democrats, whose most powerful base – both in terms of funding and in terms of voter turnout – was falling out of the middle class and into the “working poor,” had to figure out a way to continue to attract the displaced and discouraged. The solution they arrived at was “go to college” – a university degree suddenly became the ticket (formerly provided by high-paying union jobs) to the middle class.

But many – most – of those to whom this new mantra was addressed couldn’t afford the cost of a college degree. And so, another “solution” was found: the college loan program.  

Given that this mantra was financially-motivated (“Go to college and you’ll get a good job”), colleges and universities obliged by gearing ever more programs of study to getting a job after graduation. Many degrees have ended up as what was once referred to as “technical training” with a much higher price tag. In consequence, the humanities and social sciences have seriously declined in terms of enrollment and offerings; entire degree programs have been cancelled. Many university graduates have only rudimentary (or no) knowledge of the subjects which once formed the foundation for an educated human being – philosophy, history, literature, foreign languages. All have fallen out of favor because they don’t automatically guarantee a decent job.  

Today, about 37.5% of the U.S. adult population holds a college degree; 13% hold a master’s degree, and 1.2% hold a PhD. Although many graduates don’t enjoy middle-class salaries, one thing both higher- and lower-end graduates share is upper-class debt. In 2021, 43.4 million student borrowers were burdened with $1.6 trillion in debt; the average undergraduate loan balance held by an individual is $28,950, and $57,520 for families where both partners carry debt burdens (2019 figures). Loan totals increase considerably at the professional degree levels (e.g. medical school: $201,490; law school: $145,500). Sixty-two percent of students graduate with debt weighing down their future – the prospect of owning a home, for example, diminishes considerably for those with student debt, which is nearly impossible to discharge in bankruptcy (thanks, Senator Joe Biden).
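To make the scale of those balances concrete, here is a rough, back-of-the-envelope sketch of what they imply for monthly payments. The 10-year term and the ~5% interest rate are assumptions for illustration only, not figures drawn from the sources above.

```python
# Illustrative only: approximate monthly payment on a standard 10-year
# repayment plan, applied to the loan balances cited above.
# The 5% interest rate is an assumption, not a figure from the post.

def monthly_payment(principal: float, annual_rate: float, years: int = 10) -> float:
    """Standard fixed-rate amortization formula."""
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # total number of payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

ASSUMED_RATE = 0.05  # assumed ~5% annual interest

balances = {
    "average undergraduate borrower": 28_950,
    "household with two borrowers": 57_520,
    "law school graduate": 145_500,
    "medical school graduate": 201_490,
}

for label, balance in balances.items():
    print(f"{label}: roughly ${monthly_payment(balance, ASSUMED_RATE):,.0f}/month for 10 years")
```

Under these assumptions, the average undergraduate balance alone works out to roughly $300 a month for a decade – the squeeze on a would-be first-time homebuyer’s budget is easy to see.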

The dream of gaining admission to the middle class through education, which has for a generation been touted as the solution to the loss of manufacturing and production jobs as the latter were being offshored, has not materialized, nor will it.

Of course there were other means of steering the economy in new directions in the wake of offshoring, and there were other tools in the political economy toolkit to address the drastic decline in prosperity of the former blue-collar middle class: Medicare for All; a permanent Child Tax Credit applied to all incomes, even those too meager to be taxed; a progressive taxation system for the top 1-5% (remember – the top marginal rate reached 90% during Dwight Eisenhower’s presidency); and tuition-free study at all public colleges and universities, to name a few. And the specter of climate change, the urgent need to transition to renewable/sustainable energy, to retrofit the entire U.S. housing stock for extreme temperatures, to rebuild our infrastructure and urban cores sustainably – all of these needs have been known for the last 20 years; we simply haven’t acted on them at a scale that would have both gainfully employed former industrial production workers and made the U.S. more resilient in the face of what is coming in the next half-century. Creating a sustainable and resilient new national infrastructure to ward off at least some of the consequences of climate change would have compensated for the working-class prosperity which has been lost.

But none of the above happened, and there are only faint prospects of action before it’s too late, both for the working class and for the climate and natural environment.

Education must walk in lockstep with the overall goals of a society – and in societies where the pursuit of the greatest possible profit for the smallest possible number of beneficiaries reigns above all, without regard to economic externalities that harm both people and the environment, there is little chance for education to accomplish what the business class, the financial class, and the political class do not want it to accomplish.

Neither K-12 nor college education can put food in hungry children’s mouths (in 2019, 10 million children, around 20% of school-age children, were living below the poverty level), nor money in their parents’ pockets when they’re earning $7.25 an hour (the federal minimum wage since 2009). Education cannot pay children’s or their caregivers’ medical costs in case of a major illness or life-threatening accident; it cannot clean up the air they breathe, or the water they drink, or the food they eat. And in the absence of all of these, teachers cannot do their job, which is to teach.

What we are left with, then, are the idealistic promises and goals (today more honored in the breach than in the observance) set forth in the state constitutions quoted above. More than 40 million students/graduates are burdened with debt that will never permit them to enter the traditional middle class, symbolized perhaps most conspicuously by home ownership following World War II (the U.S. is now in the early stages of a seismic shift from a home-ownership society to a rental society, as we will discuss in the post on Housing and Homelessness later in this series). Our population has been gravely harmed by the coronavirus pandemic, in which an estimated 10%-20% of those infected will suffer “long COVID” for an unknown period, at as-yet unknown cost to the economy. And we confront a K-12 public education system which, already weakened over the past 30 years, is now being subjected to a brutal frontal attack on its very existence.

In our next – and final – post on Education in a Time of Pandemic, we will consider the push towards privatization which is threatening a foundational institution of the United States: free public education for all.

Further Reading

I Am Not a Proof of the American Dream

If You Think Republicans Are Overplaying Schools, You Aren’t Paying Attention

Disdain for the Less Educated Is the Last Acceptable Prejudice

Moral Relativism and the Bottom Line

*The Page Amendment Is a Trojan Horse to Destroy Public Schools

2022-03-25 Education in a Time of Pandemic III

Governance & Education Policy: Where Have All the Grown-ups Gone?

“It’s not just one or two people here — there’s a mind-set coming from the governor on down to ban conversation and to segment communities and to erase life experiences from classroom discussion” (Hedy Weinberg, director, ACLU Tennessee)

The sociocultural, socioeconomic, and sociopolitical schisms in the U.S. which we observe daily on the national front cannot avoid being reflected at the local level. And one local governance body which has been especially vulnerable to spillover conflict is the school district.

School boards are representative government at its best (and worst): members are elected for set terms, and, at least traditionally, have tended to be individuals with children who were studying or had studied in the district schools, former teachers or other district employees, or civic-minded individuals inclined to involvement in educational affairs.

Boards set general policies for their districts (remember: there are roughly 13,000 districts across the nation; Illinois alone has 859 – a fact that deserves a post of its own), and the degree to which individual boards become involved in the day-to-day running of schools varies from extremely hands-off to way too hands-on. Superintendents (who are hired by the Board and answer, ultimately, to it) are ex officio members of the Board, although they may not be voting members.

The “Great Unraveling” of civility and civic-mindedness has not occurred in all 13,000 districts, of course. One of the most strife-ridden districts, Loudoun County, is next-door to another district in Alexandria which has experienced a conflict-free pandemic.

Most media reporting contends that the civil war that’s erupted in many school districts was caused by the pandemic. In contrast, we believe that the pandemic simply hastened a process of “unraveling” that had been underway for decades. The initial triggering event was, understandably, school closings (March 2020) and the almost-overnight shift to online/virtual instruction, for which many districts were, also understandably, unprepared.

Parents suddenly found themselves in the role of tutors-in-chief, or overseers of their children’s educational curriculum and day-to-day learning experiences, and there were not a few parents who didn’t like what they were seeing and hearing onscreen, whether it was how math was being taught (or not), or what books their children were reading, or how crucial racial issues in American history were being presented and discussed.

The early weeks and months of the pandemic have practically retreated to the status of ancient history two years on, but from March to September 2020, chaos reigned throughout the country. Schools, which in industrial and post-industrial societies serve in loco parentis for seven or eight (or more) hours each weekday so that children may be under adult supervision (i.e. “cared for”) during adult working hours, were unable to fulfill their twin role as community institutions of learning and as providers of care before, during, and after the school day. For those parents able to switch to working from home – by and large, middle and upper-middle class office workers – this situation was stressful. When both parents were working from home and trying to supervise / mentor two or three school-age children simultaneously, patience waned and tempers flared. But where were parents to direct their anger and frustration?

In late May 2020, George Floyd’s murder was captured by cell phone and the country erupted in horror. Protests and demonstrations continued for weeks, and millions of Americans began, perhaps for the first time, to awaken to the harsh reality of systemic (structural) racism in the nation’s justice system. Much of the background and some of the foreground of the School District Wars has been played out over What to Teach about Our Nation’s Racist History, and is currently before state legislatures (states have a major role in funding and dictating state educational policy, as a result of our diverse, non-federal system of public education) in the form of bills that would, for example, forbid schools to teach subject matter that might make students feel “uncomfortable.” Clearly, some state legislatures do not fully grasp the purpose of education itself: if you’re not feeling uncomfortable, you’re probably not learning.

Below we examine, in chronological order of their emergence, the issues which have made governing local school districts so difficult during the pandemic:

School Closures (March 2020)

When school districts across the U.S. began shutting down in March 2020, mostly within about two to three weeks, what ensued can only be described as chaotic. Among the issues children and their guardians confronted when schools shifted to virtual (online) platforms: (1) many families (poor urban and rural populations) could not afford or did not have access to the high-speed broadband required for synchronous online learning; (2) public schools offer meals to children of eligible families (those earning below 130% of the poverty level, or who are on TANF or SNAP), and emergency accommodations had to be made for meal preparation / pick-up / delivery so that children wouldn’t go hungry during the first few months of the pandemic; (3) public schools, for want of a more flattering description, offer childcare services (aka “babysitting”) during the normal workday (8-4 / 9-5), and when schools closed suddenly, parents/guardians were left scrambling to make alternative care arrangements. At every socioeconomic level, knotty problems emerged.

There were two-parent, two-income families living in cramped urban apartments, trying to work full-time from home while simultaneously supervising their children’s online learning – a “first-world” problem, but a problem nonetheless, and one that drove thousands of professionals to seek cheaper and more spacious dwellings in the suburbs. That exodus, an option only for those with the financial means, will have knock-on effects on public school enrollments – and thus finances – for years to come.

There were children of essential workers whose parents’ jobs couldn’t be performed from home; who was to look after the youngest of these, and who was to supervise the coursework of their older siblings? There were no adults at home, so inevitably, older siblings looked after younger ones, often to the detriment of their own learning. Families which had relied on older relatives (grandparents / great-aunts) for childcare when their children missed school due to illness could not responsibly expose elderly caregivers (many of whom were not comfortable using a laptop or iPad) to COVID-19. Of course, in multi-generational households, such exposure was unavoidable.

Parents of children under five who were in daycare or preschool programs had nowhere to leave them during the first months of the pandemic as daycare centers too were shut down (many, apparently, permanently). As a solution of last resort, some engaged friends/neighbors to look after the under-5s during the first several months, but we should remember that COVID-19 was running rampant in densely-populated urban neighborhoods, particularly those inhabited by POC. Everyone was fearful, and rightfully so.

It is understandable that many parents – often those parents who were financially able to confront the crisis in school closures – became upset and later, angry. They began writing emails to school boards, attending meetings via Zoom, and posting on closed Facebook groups, lobbying to re-open schools. Those most likely to become the object of anger were local school boards, populated by their friends, neighbors, fellow church-members – people they knew or knew of, people they might even have voted for in local elections.

This anger was expressed despite the fact that school closures were not the fault of school boards or districts; board members and administrators were, however, far more accessible and vulnerable to attacks than the federal government, whose refusal to issue clear guidelines and explanations through the CDC will ultimately be seen as responsible for a pandemic whose end is not yet in sight. The CDC’s policies, while helpful after some months had elapsed (although by then it was too late), never clearly identified the coronavirus as airborne (aerosol-transmitted), even though this was evident from the earliest superspreader events in Washington State and Boston.

Why did the virus’s airborne transmission matter? Well, it turns out it mattered more than just about any other characteristic feature of the virus, particularly in regard to transmission in congregate settings like public schools.

What’s the problem with our public schools? By and large, they’re old, and they haven’t been properly maintained and retrofitted during the past 30-40 years. The American Society of Civil Engineers (ASCE) most recently graded the U.S. public school infrastructure (2021) with a D+. Many urban schools in older industrial cities, built between 1900 and 1950, have outdated, malfunctioning ventilation systems and/or windows that haven’t been opened in years. With aerosol viruses, ventilation is the key to successful mitigation; when the air in a school (both in central areas like cafeterias and in classrooms) can be changed every 10 minutes – roughly six air changes per hour – the virus’s spread is significantly lessened. Air purifier manufacturers made millions from contracts with school districts in 2020-2021, but many of these systems did not actually meet the standard of air replacement classrooms require to be considered (relatively) safe.

Today there’s a DIY means of filtering the air in classroom-size spaces called the Corsi-Rosenthal box. It’s cheap (around $100 for all materials) and can be assembled by amateurs. And it works very well, even in spaces that are otherwise poorly ventilated. Every classroom should have one.
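For readers who want to check the arithmetic behind the “air changed every 10 minutes” target, here is a minimal back-of-the-envelope sketch. The room dimensions and clean-air delivery rates below are illustrative assumptions, not measurements of any particular purifier or classroom.

```python
# Rough, illustrative calculation: does a given air cleaner deliver enough
# clean air for a classroom? "Air changed every 10 minutes" is about
# 6 air changes per hour (ACH).

def air_changes_per_hour(cadr_cfm: float, room_volume_cubic_feet: float) -> float:
    """Convert a clean-air delivery rate (cubic feet per minute) into ACH."""
    return cadr_cfm * 60.0 / room_volume_cubic_feet

# Assumed classroom: 30 ft x 25 ft floor, 10 ft ceiling -> 7,500 cubic feet.
room_volume = 30 * 25 * 10

# Assumed clean-air delivery rates (CFM); real-world values vary widely.
devices = {
    "typical consumer purifier (assumed)": 250,
    "Corsi-Rosenthal box (assumed)": 600,
}

for label, cadr in devices.items():
    ach = air_changes_per_hour(cadr, room_volume)
    verdict = "meets" if ach >= 6 else "falls short of"
    print(f"{label}: {ach:.1f} ACH -> {verdict} the ~6 ACH target")
```

On these assumed numbers, a single device falls short of the target on its own – which is consistent with the general guidance that filtration should supplement, not replace, opened windows and upgraded HVAC.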

In short: parents/guardians were right to be angry that schools were closed by the force majeure imposed by COVID-19, but their anger was misdirected. School boards and administrations could, as a practical matter, do little to mitigate the virus’s spread in old, poorly-ventilated, over-crowded and under-staffed buildings. Furthermore, districts didn’t possess reliable information about the virus’s airborne spread.

The George Floyd killing (May 25, 2020)

The murder of George Floyd in Minneapolis by a police officer in late May, which was recorded on a cell phone, ignited a series of protests and demonstrations across the country (starting from Minneapolis itself on the day after Floyd’s killing). Suddenly, Americans “woke up” to the realities of structural racism in the U.S. justice system. During most of the summer, public and private institutions and organizations scrambled to step up racial justice programs (often referred to as DEI = Diversity, Equity, Inclusion programs). There is a highly profitable service industry devoted to training schools (boards, administrators, teachers, students) to be more racially aware both in practice (e.g. active recruitment of Black / AAPI / Hispanic teachers and senior staff) and in the classroom (e.g. through use of teaching materials which more accurately reflect the country’s racist history, including the 20th-century civil rights movement, etc.).

Cultural Conflicts (2020-present)

CRT

Not all districts reopened with in-person classes in the fall 2020 semester; many remained closed until spring 2021. Overcome by despair at their children’s prospective “learning loss” and concerned that precious teaching/teacher resources were being devoted not to the basics but to sociocultural initiatives, conservative parents in (primarily red/purple) districts directed their anger against an acronym, CRT, i.e. Critical Race Theory. Let’s clear up a much-misunderstood point: CRT is not taught in K-12 schools anywhere in the country. Rather, it’s a term lifted from a school of legal analysis introduced in some law schools in the early 1980s. DeedSpeakOut is clear about this because the types of issues CRT actually considers – how the persistent negative effects of earlier practices continue today (examples: redlining in housing, school segregation in education, environmental sacrifice zones, the school-to-prison pipeline in education/justice) – are precisely the sorts of issues this blog has been dealing with for the last five years.

From a purely academic perspective, we believe that a 12th-grade AP American History or Sociology class could undertake incredibly useful archival research into these areas – we’ve often thought that high school seniors could, for example, study the original deeds for houses built between 1920 and 1960 in their neighborhood/city, or examine K-12 school boundaries as these were drawn and redrawn during the same period (or up to 2000; the process of gerrymandering school attendance boundaries continues), or examine publicly-available detainment/arrest/sentencing records of young juvenile offenders between 1960 and 2020, or partner with an investigative reporting group (the Mississippi Center for Investigative Reporting comes to mind; another group is ProPublica) to examine historical zoning regulations for residential/light/heavy industry within the boundaries of their district.

Such research projects, depending on how they were ultimately presented (and deployed), could legitimately be considered “critical race theory-related/relevant” (although their primary focus would not be on legal issues, which are graduate-level). But what angry parents are protesting isn’t this sort of student work. Rather, they’re disturbed by efforts to introduce more racially- and ethnically-sensitive texts and discussions of U.S. history. Nikole Hannah-Jones’s The 1619 Project (which came out in 2019, on the 400th anniversary of the arrival of enslaved Africans in Virginia) has provided a lot of fodder to the anti-CRT movement, which has now spread to numerous state legislatures via bills that, in various iterations, outlaw teaching about racism, even going so far as to forbid the teaching of material that would make students feel “uncomfortable.” Such bills originate from a small number of conservative activist groups; in some states (e.g. Alaska), The 1619 Project is explicitly banned, while in Missouri, “Students must be presented with a positive picture of US history. Discussions of current policy issues are banned.” Many of these bills (not all have been voted into law; some remain pending and may be voted down) also include clauses forbidding classroom discussions of LGBTQ (or sex or gender) issues; Virginia has expanded on this with a bill that would require students to use the bathrooms of their original “biological sex.” Ah, Virginia.

Book Banning

When difficult and painful subjects are banned from K-12 education, whether by states or individual districts, can the banning of books which treat difficult and painful subjects be far behind? A small Tennessee school district board (McMinn County) enjoyed nationwide notoriety for several weeks over its decision to ban a modern-day classic, the graphic novel Maus, which garnered the Pulitzer Prize for author Art Spiegelman. The novel is a story of the Holocaust, and apparently the bones of contention were the use of “mild swear words” (as if students in 2022 had never heard them) and the depiction of a naked female mouse.

Another book on the blacklist in 2021-2022 is Toni Morrison’s novel Beloved (Pulitzer Prize for Fiction, 1988). Glenn Youngkin, then a Republican candidate for Governor of Virginia (now Governor), featured in one of his campaign ads the mother of a former high school student who had been assigned the novel. The controversy, which culminated in a book-banning (or “explicit content warning”) bill that then-Governor Terry McAuliffe vetoed twice, was eventually capped by McAuliffe’s infamous – and eminently quotable – remark, “I don’t think parents should be telling schools what they should teach,” a statement which the national press believed cost McAuliffe the election. The main objections were that the book depicted violence, sex, and above all, the heroine Sethe’s killing of her baby daughter.

It should be noted that Beloved was taught at the AP level only (Morrison’s other classic, The Bluest Eye, was taught in regular English classes). The campaign to ban Beloved because the heroine killed her little girl so she wouldn’t have to suffer the depredations of slavery (in line with the conservative line that “all life is sacred,” which is honored by conservatives mostly in the breach) reminded us of another American classic, Pearl Buck’s The Good Earth (Pulitzer Prize for fiction, 1932). We checked to see whether The Good Earth had similarly been banned for its depiction of a mother killing her newborn daughter. Yes – it had been banned, but not in the U.S. It was banned in China because it depicted poverty in a way that made the Maoist-era Communists “uncomfortable.” The U.S. may want to think carefully about whether it wants to follow China’s example.

Having set the stage, we now proceed to a consideration of the havoc wrought by the pandemic, school closings, and school re-openings on two school boards, one on the East Coast (Loudoun County, Va.) and one on the West (San Francisco). The bitterness and successive controversies which rent both boards asunder were well documented in the national and independent media, and they illustrate the complex issues which each board confronted before and during the pandemic.

Case Study 1: Loudoun County, Virginia

“The core misread of the national press is an idea the Equity Collaborative essentially labeled taboo. ‘The culture war is not a proxy for race,’ is how Grim put it. ‘It’s a proxy for class.’” (from Matt Taibbi, “Loudoun County Epilogue”)

“‘Economic diversity across the county/division complicates the discussions about race, leading many people to steer the conversation away from race to focus on poverty,’ would be among their main initial observations about Loudoun.” (Taibbi, “Part 2: The Incident”)

Loudoun County lies in Northern Virginia; it is heavily populated by federal civil servants and high-tech employees (with many overlaps between the two groups; since the Clinton era, it has been known as “the Silicon Valley of the East”). Loudoun is the wealthiest county in the U.S., and that’s germane to the story of what happened there between 2018 and 2021, although it’s hardly ever mentioned by anybody.

Here’s how the MSM story goes: Terry McAuliffe lost the governor’s race in November 2021 because of white (i.e. racist) backlash against social justice movements within schools and over whether parents should have a say in what their children are taught.

But that’s not what happened in Loudoun County, which has for several election cycles been seen as a quintessential professional-managerial class (PMC) voting bloc – i.e. Democratic. Matt Taibbi, an experienced and open-minded independent reporter, decided to go to Loudoun County and find out for himself what was going down.

Our summary of his four-part series (see Further Reading below): As noted above, Loudoun County is wealthy and predominantly white (67%). However, during the past 20 years, its population of Asian – and in particular, South Asian – first- and second-generation immigrants has risen to more than 20% of the total (Blacks, on the other hand, account for just 8% of the county’s population). This newly and recently arrived population has flocked to Loudoun for its well-paying high-tech jobs and for its nationally-ranked public school system.

Loudoun’s school district has for years maintained an agreement with neighboring Fairfax County (also wealthy; it ranked no. 4 in 2020) so that 250-300 students from Loudoun could attend Thomas Jefferson High School (rank: no. 1 in the U.S.) in Fairfax. Fairfax’s accommodation of its neighbors doesn’t come cheap; the cost is more than $4 million a year. Each spring, when it comes time for the Loudoun board to approve the upcoming year’s budget, there is grumbling, especially since Loudoun built its own state-of-the-art Thomas Jefferson clone, Loudoun Academies. The Board was unhappy with allotting $4+ million for TJ when the district had spent a bundle on Loudoun Academies, and parents whose children were destined for admission to TJ – the biggest public feeder school in the country to the Ivy League (and MIT, naturally) – weren’t at all pleased at the idea that their children might have to attend a knock-off school which could take decades to acquire the reputation of Thomas Jefferson.

The grumbling grew worse over the past three years, and then racial justice initiatives entered stage left. The local NAACP and activist parents began maintaining that a blind admissions system based on examination was racially biased, although blind admissions were actually created to avoid racial bias (it’s a topsy-turvy world in racial justice land these days). They began lobbying for admissions based on criteria that would take into consideration children’s race, recommendations, and so forth. (One wants to say “and SES,” but that’s a bit of a stretch in a county where the average income is more than $150,000 a year.)

The thing was, Black students were somewhat underrepresented in admissions to TJHS, but the group that was most seriously underrepresented was white students, whose parents were unhappy but had long since resolved the issue of non-admittance by sending their children to private schools in the wider area.

What was the irony here? The group that was over-represented (by a factor of more than 3:1 relative to their population share) in Thomas Jefferson admissions was Asian / South Asian students, many of whose parents / grandparents had left South India to escape racism in their own country. As Taibbi notes, many of these students are darker than their Black peers, but in Virginia they are classified as “white.”

Indian – and in particular South Indian – parents had moved to Loudoun County for its jobs and its public schools. Their mantra was that by working hard and excelling at academics, they could succeed in America in a way their skin color would not have allowed them to do in India. In other words, they believed in the American Dream.

These parents, who had previously voted pretty solidly Democratic, were unfailingly polite but really, really angry, and in November 2021 they expressed their anger by voting – many for the first time in their life – Republican.

And this wasn’t all. The District had hired a consultancy firm to carry out racial sensitivity training on a no-bid contract ($500,000, an amount the Board would normally have had to approve). The firm, the “Equity Collaborative” out of Oakland, California (its headquarters located not far from San Francisco, the wokest of woke school districts, as we’ll see below), implemented something called the “Action Plan to Eliminate Systemic Racism,” which was rather like putting a torch to the kindling of parent discontent.

By late 2020, Loudoun County still hadn’t reopened its schools, and with white-collar workers returning to their offices, suburban mothers were furious. Schools reopened in early 2021, but by this time parent anger had reached the boiling point – and the Board’s patience, particularly that of members who had fully and uncritically embraced the Equity Collaborative’s anti-racist training, had come to an end. School Board meetings devolved into shouting (and more); parents split into small “for” and “against” camps. A special security detail had to be hired to protect board members during meetings, and parents who wished to speak before the Board were let in one by one to avoid rioting.

Meanwhile, LGBTQ student rights had come to the fore in Loudoun County and Virginia as a whole, with a focus on bathroom choice. The national press, Taibbi documents, got a particularly ugly incident in spring 2021 wrong: a 16-year-old student was sexually assaulted by a fellow student in a (girls’) bathroom, and the affair led, at least indirectly, to the arrest of her father. Her attacker was wearing a skirt, but was apparently not transitioning. The school transferred him to another school in the district, where he promptly assaulted another student. The incident was misrepresented as an attack on transgender student rights, when in fact it was a case of sexual assault on a minor.

Finally, we come to Terry McAuliffe’s “gaffe,” when he maintained in a late September 2021 gubernatorial debate with his opponent that “I don’t think parents should be telling schools what they should teach.” Youngkin took that statement and ran with it – and won.

On the surface, this sounds pretty extreme – and Taibbi (who should perhaps be forgiven since he has three young children of his own) got this one wrong, as did the national press.

Jennifer Berkshire, writing for The Nation in the wake of the November election, fills in the details: in fact, McAuliffe was correct when he said that parents won’t be telling their children’s schools what to teach in future. But neither the state nor individual school boards/districts will either. McAuliffe had several years previously (2017) signed away curricular privileges for Virginia public schools to Amazon as part of the bid that brought Amazon’s HQ2 to Crystal City in 2018. As Berkshire puts it: “Virginia is essentially retooling its schools to train an army of future Bezos employees …”. Henceforth, the state will be divided up into regional workforce development districts, and companies / curriculum development businesses will present curricula tailored to training students to work for local employers.

Here’s the most ironic thing of all: Youngkin (co-CEO of private equity firm the Carlyle Group before entering politics in 2020; est. net worth $440 million) and McAuliffe are in perfect accord regarding who’ll be telling schools what they should teach in future.  And it won’t be parents.

Case Study 2: San Francisco, California

And now for San Francisco, whose Unified School District has around 120 schools and 54,000 students (7th largest district in California). Like Loudoun and Fairfax Counties, San Francisco County is perched near the top of the income pyramid: it’s in 5th place in the nation, right behind Fairfax.

San Francisco schools closed early in the pandemic and reopened late: they remained closed for three full terms, with in-person classes only beginning to return in April 2021. This was in line with the county health department’s recommendations – San Francisco was more proactive about closures and required mitigation measures than most cities/counties in the state (or country), and as a result it experienced a comparatively lighter incidence of COVID-19 during its first wave than other comparably-sized cities/counties.

But enough was enough. Parents wanted the schools reopened earlier than the Board did, and San Francisco being San Francisco, the City ended up suing its own School Board to force schools to reopen. Clearly, relations had worsened during the first year of the pandemic.

Given that schools were closed for more than a year, how did the San Francisco School Board spend its time during this period? One issue that appears to have occupied it intensively (since 2018) was renaming no fewer than 44 (out of 120) schools in a gesture towards racial equity – not an empty gesture, admittedly, but one not exactly geared towards solving San Francisco’s desperate housing shortage, nor even the business of reopening public schools safely. The Board’s decision, which included renaming Jefferson, Washington, and Lincoln (yes, Lincoln) schools because of the racism of the three Presidents, had to be rescinded in the wake of public opposition (partly motivated by a host of historical mistakes by the presumptive “blue-ribbon” committee which undertook the renaming project), but for many, the Board had crossed the Rubicon of racial equity extremism. In a recent (and very rare) recall election, three of the Board’s most outspoken anti-racism advocates were recalled by margins exceeding 70%.

There’s more – in fact, the story of the city’s various excursions into the tangled web of racial equity sans economic equality deserves a post of its own.

The next post in “Education in a Time of Pandemic” will look at how educational entrepreneurs seized on the opportunity afforded by pandemic school closures to hasten the process of school privatization on the public dime.

Further Reading: Governance & Education Policy

General

School Boards get death threats amid rage over race, gender, mask policies

Death threats, online abuse, police protection: School board members face dark new reality

Why Public School Supporters Need to Keep On Pushing Back Against Laws Banning Discussion of ‘Divisive’ Subjects at School

This Is Not Transparency

Opinion: Cruz Attacks Jackson for ‘Critical Race Theory’ — But Sends His Own Daughters to Learn It

Book Banning

Where Have You Gone, Laura Bush?

The Woman Who Wanted Beloved Banned from Schools Is Right about One Thing

Virginia Governor Highlights Irony of Banning ‘Beloved’ from Schools

Holocaust Novel ‘Maus’ Banned in Tennessee School District

The Fight over ‘Maus’ Is Part of a Bigger Cultural Battle in Tennessee

Loudoun County, VA

Loudoun County, Virginia: A Culture War in Four Acts

A Culture War in Four Acts: Loudoun County, Virginia. Part Two: ‘The Incident’

The Holy War of Loudoun County, Virginia

Loudoun County Epilogue: A Worsening Culture War, and the False Hope of ‘Decorum’

Corporate Democrat Goes Down to Defeat in Virginia

Fairfax Schools Request Stay of Judge’s Order Invalidating TJ Admissions System

San Francisco

San Francisco Sues its Own School District for Not Reopening

San Francisco recalls school board members seen as too focused on racial justice

The Radical History of the Murals at George Washington High School

What Happens When an Elite Public School Becomes Open to All?

2022-03-05 Education in a Time of Pandemic II

Where Have All the Teachers Gone?

Most American readers will be familiar with the euphemistically-termed “teacher shortage,” often presented in the media as an out-of-the-blue consequence of the pandemic. But just as we saw with nurses and nursing home workers, the teacher shortage has been decades in the making.

Like nursing, teaching is both a profession and a vocation. The best teachers are “called” to teach (cf. the derivation of “vocation” from Lat. vocare, “to call”); their knowledge is acquired through university study, honed through teacher training, and later through the professional development courses in which committed members of the profession continue to enroll throughout their careers. As the parents of school-age children who were at home doing virtual classes in the early months of the pandemic have realized, teaching is not a matter of showing up and handing out homework at the end of the day. It requires an incredible amount of mental energy – even for the “natural” teachers among us – and the 20 or 30 hours of in-class time standard for most primary and secondary school teachers are accompanied by as many hours again of out-of-class preparation and grading. It’s easily a 60-hour week for a conscientious teacher.

But most teachers gladly give of themselves – their time, mental engagement, dramatic skills (yes) – because that’s why they entered the profession in the first place. They chose to contribute to children’s growth through the acquisition of “book knowledge” as well as “social knowledge” – a fair amount of school time in the primary years involves socializing very young children to the idea that there are other people in the world outside their family. It’s not easy.

Through the middle decades of the 20th century, teaching was still deemed a middle-class profession; salaries varied considerably by state (reflecting both cost-of-living differences between states/regions and the fact that wealthier districts/states paid higher salaries than poor ones), but in most states a teacher could maintain a decent lifestyle, particularly as one earner in a two-income household. Of the few career options available to lower- and lower-middle-class women in the late 19th and first half of the 20th century (teaching, nursing, clerical work), it was the most secure.

Today, there are two major national teachers’ unions: the National Education Association (NEA), with 3 million members (including teachers and all others who work in education, as well as future teachers and retirees), and the American Federation of Teachers (AFT), with 1.7 million members and around 3,000 local affiliates, currently led by one of the nation’s best-known public-sector unionists, Randi Weingarten. Of the two, the AFT was from the outset a true union (militant, strike-ready), while the NEA began as a professional organization which only later acquired the characteristics of a true union (collective bargaining, for example). Teachers in individual districts (recalling that there are 13,000 of these), when unionized, form “locals” – thus, the Chicago Teachers Union is “Local 1” of the AFT; its NYC counterpart is the “United Federation of Teachers” (UFT). Since the 1960s, teachers unions have wielded considerable lobbying and ballot-box power at the local, state, and federal levels.

But teachers and the unions which represent them are not without powerful opponents. To some extent this has always been the case, its origins going back to the mid-19th century when teaching shifted from the home and into the institutional setting of the school room – often, throughout the smaller towns and rural regions of the U.S., the one-room school house [note: our blog’s masthead features an early 20th-c. one-room schoolhouse in Central Illinois]. Young women assumed responsibility for imbuing a small group of children (aged 5-18) with sufficient “reading, writing, and arithmetic” to enable them to function in adulthood as farmers and laborers, but their tenures were short – only until marriage, when they were normally required to resign – their ambitions seen as non-existent, and their “vocation” a temporary one which terminated once they had a husband and family of their own. The emerging professional class (white, male dominated), in its effort to professionalize office work / management of enterprises both service- and production-oriented, looked down on the nation’s teaching ranks as inferior, largely due to the profession being dominated by young women, whom they saw as docile and obedient but not really up to the job of educating the country’s youth.

For the past generation or more, teachers have been attacked by numerous organizations which have systematically downplayed and downgraded their work and its results, and which have directed an equal amount of vitriol at their unions. While men began entering the teaching profession in significant numbers after WW II, especially at the secondary school level, much of the activism that led to collective bargaining rights, decent pensions, health insurance, sick days – all the benefits of white male private-sector unionism, in other words – was conducted by women, and two of the most powerful unions – the UFT and the CTU – were led by women in the 2010s.

Over the past 30 years, teachers have been systematically attacked by both political parties for the inadequacy of “outcomes,” while privately-backed groups hacked away at the primacy of public schools through the introduction of the “Big Test,” VAM (the value-added model of teacher evaluation), charter schools offering parents “choice” if they were unhappy with their children’s outcomes in public schools, voucher (private) schools, online (virtual) schools, and home schools, of which the “pod” or “micro school” that gained some traction during the pandemic is but a recent variant.

When teachers themselves, their profession, and their unions have been assailed on all sides for decades, it’s hardly surprising that applications to schools of education decreased in the years leading up to 2020; in fact, what’s surprising is that applications didn’t fall even further. Deeds have consequences, and the consequence of late 20th- and early 21st-century “teacher bashing” was that when the pandemic arrived, the system was already estimated to be heading toward a shortfall of 200,000 teachers by 2025, out of the roughly 3.5 million required to maintain fully staffed classrooms.

As we enter Year 3 of the pandemic (apparently having decreed that it is at an end), U.S. school districts across the country are struggling, often unsuccessfully, to find staff. One reason is clearly the pandemic: teachers at or near retirement age, considered to belong to a COVID-vulnerable group, took early retirement. Others were forced to quit because of family obligations – caring for elderly relatives who were vulnerable themselves, or for young children who were at home due to pandemic school closures or COVID. Similarly, the ranks of substitute teachers shrank as many districts’ substitute corps is made up of retired teachers.

Other staff essential to operating and maintaining our public school systems similarly decreased in numbers throughout the pandemic, including classroom assistants (aides), bus drivers, cafeteria workers, crossing guards, and custodians. Many of these workers live in poverty, in communities hit hard by the initial wave of the pandemic in 2020, and in multi-generational settings with elderly relatives whom they were loath to expose to the virus.

How have states dealt with personnel shortages, which have not yet abated? Two states, New Mexico and Massachusetts, have called upon their National Guards to fill in for sick teachers (New Mexico) or for bus drivers (Massachusetts). Oklahoma has recruited police officers.

New Mexico holds the dubious distinction of having the highest child poverty rate and the lowest average teacher salaries in the nation (an argument could probably be made that these two data points are connected), although salaries are slated to go up 20% this summer. So the state, in collaboration with its National Guard, normally tasked with providing assistance in times of natural disasters and serving abroad in military missions, created the “S.T.A.F.” (Support Teachers and Families) program. The Guard was hoping around 70 of its members would step up; in the end, 96 did. This may not sound like many, but for some schools like those in rural areas featured in this NYT story, it meant that schools could stay open even when 10% of their staff was absent.

In Massachusetts, which in January had over 1,000 school employees out sick on an average day (20% absence rate in food/nutrition, 100 bus monitors, 30 bus drivers),  hundreds of school administrative staff went into classrooms, including the Boston Public Schools Superintendent herself. When administrators and clerical staff must enter classrooms, their work doesn’t get done in a timely manner; often, those teachers still working are tasked with additional paperwork and quasi-administrative tasks which add to the burden without benefit to children’s learning.

In 2021, 37% of all teachers were considering leaving the profession earlier than they had planned. Between July 2021 and January 2022, teacher retirements and resignations jumped 85% in Chicago Public Schools, in addition to 72 resignations by principals and assistant principals. With a total workforce of 39,000, there were 1,842 resignations and 524 retirements during the same period, up 50% from 2019-2020. Percentage-wise, the highest turnover was observed among principals/assistant principals. While the stresses on school leaders have been different from those on teachers, they’ve been no less severe: the initial shift to online learning, reopening (or not) school facilities, ensuring the safety of students and staff through mitigation measures, resisting anti-vax and anti-mask activists (mostly parents, not students), dealing with the repercussions of the annus horribilis 2020 for our country’s race relations, managing massive amounts of federal coronavirus assistance responsibly, confrontational school board meetings, critical race theory, book banning, and the list goes on.

But there’s more than the grievous effects of the pandemic at work here. We have often read about “teacher burnout” and “low morale” during the past two years, but teachers were burning out and morale was falling well before COVID-19. In a recent post, education writer Peter Greene suggests another name for the ill that has befallen our public school personnel: “moral injury.” He adopts the definition employed by Syracuse University’s Moral Injury Project: “Moral injury is the damage done to one’s conscience or moral compass when that person perpetrates, witnesses, or fails to prevent acts that transgress one’s own moral beliefs, values, or ethical codes of conduct.” Greene – a long-time high school teacher in Pennsylvania who recently retired – gives an example of one classroom practice which he considers to have inflicted moral injury, viz. “teaching to the test.” He estimates in another post that between 6 and 10 weeks a year were devoted to practicing, preparing for, and taking standardized tests – when you have only around 180 teaching days and as many as 50 of them (10 weeks of 5 school days) have to be devoted to “the test,” that’s more than a quarter of the year’s teaching time lost, to nobody’s benefit apart from the testing companies’ bottom line.

But there are many other aspects of teaching today which contribute to moral injury, i.e. the sense that what you are being forced to do goes against your values and indeed against the very reason you entered the profession in the first place.

It’s easy to say “Well, let’s just all pull together and agree on our values so teachers can inculcate them in our students.”

The thing is, our country’s values are fractured along very deep fault lines today. And inevitably, these fractures are played out in classrooms.

Teachers didn’t create them, but they’re paying the price in moral injury.

Next up: Governance & Education Policy in a Time of Pandemic. Lots of misbehaving and conduct unbecoming to adults – stay tuned.

Further Reading:

Opinion: I see firsthand why teachers are burning out and quitting. We owe it to children to fix this.

Teacher Voice: Why We Are Being Driven Straight Out of Our Classrooms

I’m Never Going Back

Who Wants to Be a Teacher?

Burnout and Moral Injury

The Blame Game: 100 Years of Teacher Bashing (Episode #84, Have You Heard Blog)

Iowa Won’t Require Schools to Put Live Cameras in Classrooms after Republican Bill Dies

Who wants to lead America’s school districts? Anyone? Anyone?

In Chicago Public Schools, More Principals and Teachers Are Leaving

New Twist on Pandemic’s Impact on Schools

‘We Are Losing Good Teachers and Staff Every Day’: Report Reiterates Pandemic Shortages

2022-02-26 Public Education & COVID-19

Part I: Why is Public Education Public?

“…it’s hard to think of an education-related policy that has effectively and sustainably worked, beyond the granddaddy of all ed policy: a free, high-quality, fully public education for every American child, no matter what they bring to the table.” -Nancy Flanagan, Teacher in a Strange Land

We return to our overview of what COVID-19 has revealed about systemic weaknesses in the various areas DeedSpeakOut covers, starting with public education.

Let’s start this group of posts with a question: What is the purpose of public schools? Sounds simple, right? But the answer has become more controversial over the past 30 years as the U.S. has been inundated by “school choice” (charter schools, expanded voucher/quasi-voucher programs for private schools, home schooling, virtual [online] schools), national curricular and assessment programs (Common Core, No Child Left Behind [2001], Race to the Top [2009], Every Student Succeeds Act [2015]), anti-union and anti-teacher agitation, and aging school facilities. The drastic budget cuts that followed the financial crisis of 2008-2009 have never been made up for in many states/districts by a return to pre-2009 funding levels. And then the pandemic arrived.

The almost-overnight shift to online learning did not proceed smoothly in many schools/districts. It particularly affected those already resource-strained before the pandemic, i.e. high-poverty inner-city and rural schools (a significant percentage of which lacked adequate [or any] broadband coverage).

By 2021, parents in better-resourced districts were lobbying for school re-openings. Working parents (particularly mothers, who still bear the burden of most child-rearing) were obliged to return to their offices but could not leave children at home all day without an adult presence; mothers of preschool-age children struggled to find day care facilities because so many such centers had closed. Lobbying sometimes turned into hostile confrontations with school administrations and boards; parents, goaded by frustration at lengthy school closures, continuing mask mandates, curtailed extracurricular programs and “learning loss”, accused boards/district leaders of infringing on their own and their children’s “freedoms” (to attend in-person class, to ignore mask mandates at will). For these parents, COVID-19 has devolved into a minor inconvenience to be treated as “endemic.” “We’re done with COVID,” parents and like-minded community members claim. How many ever pause to ask whether COVID is done with us?

To return to our initial question: Why is public education public?

The U.S. public school system is not a federal one, although federal funds are disbursed to support schools, for example through Title I, which provides additional support to poor schools in the amount of $16.7 billion (2020). But we have no “national” prescribed curriculum (the closest thing being the “Common Core”), and the various states are largely free to determine, in collaboration with school districts, the curricula, textbooks, and requirements for graduation from primary and secondary schools.

This, as we shall see in this group of posts, has proved a double-edged sword.

Individual states have enshrined their commitment to educate all residents within their constitutions. The fourth Illinois Constitution (1970) is typical:

Article X.

Goal – Free Schools

A fundamental goal of the People of the State is the educational development of all persons to the limits of their capacities.

The State shall provide for an efficient system of high quality public educational institutions and services. Education in public schools through the secondary level shall be free. There may be such other free education as the General Assembly provides by law.

The State has the primary responsibility for financing the system of public education.

Thus: “the educational development of all persons to the limits of their capacities” is defined as a goal, further elaborated as an “efficient system of high quality public educational institutions and services.” It “shall be free.” And the State “has the primary responsibility for financing ….”

To restate for the purposes of discussion:

– the public education system is for all persons, i.e. it is universal

– the public education system shall be efficient and of high quality

– the State shall provide primary financing

Defined in these terms – universal, high-quality, state-financed, free – public education is a public good. In this it resembles our interstate highway system, our bridges and dams, our public parks (national, state, local), our public libraries, our law enforcement personnel (local, county, state police), our fire departments, and emergency services. All of these are public goods for everyone who uses/needs them.

The guiding principle behind public goods is that they are financed by everyone (through taxes), that they are equally accessible to everyone, and that one person’s use does not materially affect their availability or quality for others. They thus differ from private – consumer – goods, which are paid for by individuals at their individual discretion. Private consumption is a matter of individual preference in concert with financial means, and is sometimes referred to as discretionary consumption.

Applying the terminology and adopting the criteria associated with private, discretionary consumption to refer to public goods is intellectually disingenuous and deliberately misleading. Over the past 20-30 years, school reformists have insisted on using the term “consumers” to refer to public school parents. This is strange, because while parents (along with all other taxpayers, parents or not) are indeed funding public schools, if anyone is a “consumer” it is their children, not themselves. The appropriate term should be “beneficiaries” – you won’t see that term being bandied about – or simply “users.” (Think of “library users” or “highway users” – we’d hardly call people who check out books from the local library, or drive their autos on public interstates “consumers,” would we?)

Public goods are public because they demand massive investment, planning, coordination, oversight, long-term maintenance, and regular renovation/replacement, all of which are too costly for any private individual to fund. Not even billionaires could have built the Hoover Dam, or the New York Public Library; to take a recent example, not even Elon Musk could have funded the James Webb Space Telescope.

In the case of public education, both private individuals – students – and the public itself – “society” – are beneficiaries. Each student benefits to the “limits of their capacities,” and when those limits are attained, society as a whole reaps long-term benefits.

What sort of “freedom” is involved for parents here? Well, there is the freedom to opt out of the system entirely, for one; wealthy parents may choose to send their offspring to private schools whose tuition ($60,000 per year is not uncommon for an elite private school today) their fellow citizens could never afford. This doesn’t, however, mean that wealthy parents’ obligation to the universal good ensured by public schools can be abnegated; they can opt out of sending their children but they cannot opt out of the more general obligation to the common good. Thus, the rich continue to pay local property taxes and state and federal taxes, some portion of which return to states/districts in the form of public school funding.

Against this background of public schools as public goods supported by public funds as enshrined in our various state constitutions, we will examine a number of issues exacerbated by the pandemic but not caused by the pandemic.

First, we’ll consider personnel shortages. It’s estimated that 90% of public schools are currently short of staff, including administrators, teachers, teachers’ aides, substitute teachers, bus drivers, cafeteria workers, and custodians. What happens when 20% of a school’s bus drivers are out on any given day? Some students won’t get to school, or they’ll get to school two periods late. When 10% of your teaching staff is out, and there are no substitutes to call on? Administrators, secretaries, custodians are asked to fill in, or classes are combined and placed in a gym – in which case, gym classes are curtailed. Many districts depend on retired teachers for substituting, but because this population tends to be over 65, many were reluctant to serve during the pandemic.

Second, we’ll look at school infrastructure. The U.S. has 50,000,000 school-age children enrolled in 13,000 districts and around 100,000 separate school facilities. Many schools couldn’t manage to make their facilities COVID-safe because the buildings themselves were too old to renovate quickly or indeed at all. The great period of public school construction was the first third of the 20th century; many of these structures are still in use, but they have not been maintained or renovated to 21st-century standards. This is particularly true of our older industrial and commercial cities, both large and small. In NYC, for example, where more than 50% of all schools operate out of facilities more than a century old, many buildings had non-functioning windows or ancient ventilation systems that would have required gut renovations to meet COVID ventilation standards. This had consequences for the virus’s dissemination.

Following infrastructure, we’ll take up privatization, which, along with governance and policy, is among the most fractious aspects of public education today. Privatization of the public schools has been presented as a matter of “choice” and (personal, individual) “freedom.” Beginning in the 1990s, school “reformists,” funded by various private groups and individuals, devoted their efforts to dismantling U.S. public schools and replacing them with charter schools (tax-funded private schools) and vouchers (for private, largely religious schools). Two states (Alabama and Oklahoma) currently have pending legislation that would essentially abolish public schools entirely – parents would be awarded a sum of money each year and left on their own to find schooling. That isn’t as easy as it sounds (and that’s another reason we call public schools a public service): charter schools are often loath (or refuse outright) to admit special needs students and English Language Learners (ELL), and in order to keep their test scores/rankings high, are prone to expel students each school year, leaving them to scramble to find a school that will accept them. Nor do charter schools offer any guarantee that they’ll remain open indefinitely; in fact, they sometimes close without notice over a weekend. Vouchers/quasi-vouchers (such as Education Savings Accounts and Tax Credit Scholarships) supposedly enable parents to enroll their children in private schools, but the amount doled out never covers tuition and fees at the private schools of parents’ dreams.

Governance has become increasingly difficult for many school boards during the pandemic, partly due to controversial virus-related measures such as mask mandates and, during the first year of the pandemic, cancellation of athletics – not everywhere, but in many states/districts. Many school boards moved their meetings online, and it proved a lot easier for parents to participate vocally via Zoom than in person. And then in the summer-fall of 2020, CRT hit the public schools like a spiritual-ethical pandemic. Some boards jumped on the pro-CRT bandwagon, others on the anti-, but few had any clear understanding of what CRT even was. Republican legislators in Iowa proposed requiring a live camera in every public school classroom in the state to ensure that teachers aren’t teaching “CRT,” but what they mean by that is that teachers would be forced to ignore key events in U.S. history and social life from the 16th through the 21st centuries.

The fifth and final topic of this series of posts will be the more general crisis of public education towards which most states have been heading for the last 30 years. But the crisis of public education is in fact only part of the crisis of American society itself – schools are microcosms of society at large, and their problems are mirrored in other public sectors. The U.S. never fully transitioned to the form of social democracy enjoyed by many European countries during the post-WW II era extending from the late forties to the late seventies. Its social welfare system remained anemic, universal healthcare was never implemented, daycare / preschool programs were never federally mandated or funded, university-level education was never free. And since the eighties, a concerted assault on the working classes, combined with a powerful anti-tax movement, has intentionally starved the public sector of the funding needed to maintain public services even at their 1970s level. Poverty has become more prevalent at both the individual and the public-sector level. Many reformists and like-minded opponents of public schools – of public anything, really, except perhaps freeways and law enforcement – have for a generation now been engaged in a misdirected assault on public schools as responsible for a plethora of ills.

But social inequality, systemic racism, and a deliberately-underfunded public sector are not the fault of our public schools, and they cannot provide a full redress for larger social failings.

Above all, schools cannot be made to compensate for mass poverty. Nearly 17% of all U.S. children were living below the poverty line in 2020. One in 10 of NYC’s 1,000,000 school children is homeless at some point during the school year. No teacher and no school can compensate for such social tragedies.

In the remaining posts in our “Public Education & COVID-19” series, this stark and shameful reality will serve as the backdrop.

2022-02-08 Covid Revelations IV: The Opioid Epidemic

The Pandemic and the Epidemic

“If you’re alone, there’s nobody to give you the Narcan”

“One prevailing theme is the fact that the epidemic now is driven by illicit fentanyl, fentanyl analogs, methamphetamine, and cocaine, often in combination or in adulterated forms. Overdoses related to prescription opioids and heroin remain high and also are increasingly contaminated with illicit fentanyl.” (Issue brief: “Nation’s drug-related overdose and death epidemic continues to worsen,” AMA, Nov. 12, 2021)

We conclude our review of COVID-19 revelations about health/healthcare with a look at the “epidemic within the pandemic”: overdose deaths, which rose more than 30% during the first year of the pandemic (the year-over-year increase between March 2020 and March 2021 was 38%). This was the largest increase in overdose deaths in U.S. history. Of these deaths, around 75% involved opioids, and 60% involved fentanyl.

The extraordinary measures required by COVID-19 led to stay-at-home orders, isolation, and despair for both those with resources and those without. Millions turned to alcohol, opioids, and other drugs as coping mechanisms. Those with existing substance-abuse problems feared in-person treatment centers and residential detox facilities as possible sites of contagion; many stayed away and became more vulnerable to relapsing. One of the responses by treatment centers/clinics was to shift to telehealth provision of services, but participation in online appointments with counselors/caregivers requires a stable, fast internet connection – something millions of rural residents don’t have and millions of urban residents can’t afford.

Because so many people who under normal circumstances would have been using in social contexts began to use in isolation, overdoses that might otherwise have been witnessed and reversed with naloxone became fatalities. Naloxone prescriptions decreased 26% during the same period, although this statistic may be connected with a not-fully-explained supply “disruption” in the early months of the pandemic – a disruption related not so much to actual supply (though supply may have been an issue early on) as to pricing, as the cost of Narcan, an opioid receptor antagonist that reverses the effects of an overdose, skyrocketed.

The opioid supply chain was also interrupted and contaminated: many people had to seek out dealers they didn’t know and therefore couldn’t trust to provide a clean product. Producers, distributors, and dealers took advantage of this and contributed to the untrustworthiness of supplies; it became more common for heroin to be laced with fentanyl, a synthetic opioid 80-100 times more potent than morphine.

The “opioid epidemic” has of course been around much longer than the pandemic. Large-scale prescription “programs” began to take off in the mid-nineties in some of the U.S.’s poorest rural regions (Kentucky, Tennessee, Virginia, West Virginia – Appalachia – and Maine), and since the early 2000s Americans have associated use of prescription opioids with a stereotypical demographic: poor, white, rural. From the late forties into the seventies, heroin use was similarly associated with a poor, black/Hispanic, urban population.

Stereotypes are hard to overcome, but this pandemic has shown itself to be an equal-opportunity destroyer in the substance abuse sector. Opioid addiction has now spread everywhere, and the demographics have therefore changed as well, with some of the steepest recent rises in opioid use/abuse now found in states west of the Mississippi. And it no longer matters what socioeconomic class users are from: opioid addiction now strikes the rich as it once struck the poor.

Let’s take a short detour to trace the history of opioid addiction in the U.S. The first and, for nearly a century, most widely used opiate was morphine, which comes from the poppy (in common with opium, laudanum, and heroin). It was first extracted by the German pharmacist Friedrich Sertürner in the early 1800s. Initially used for medical purposes only, it rose to popularity in the U.S. during and after the Civil War (“every war has its drug”), when soldiers and veterans received it for acute and chronic pain from war wounds; addiction became so prevalent that it was referred to as “the soldiers’ disease.” But morphine (named after Morpheus, the Greek god of dreams) wasn’t just a soldier’s drug; by the final quarter of the 19th century it was also a “mother’s drug” (the 19th century’s equivalent of “mother’s little helper”). Thousands of middle-class women became addicted, having been prescribed the drug for migraines, gynecological complaints, and general states of “unwellness.” Until aspirin came on the market in 1899, morphine was essentially the only painkiller widely on offer.

Around the turn of the 20th century, the pharmaceutical company Bayer began producing and distributing heroin (rel. to “heroic,” because of its power), and by the 1920s-1930s America was in the throes of its second great addiction wave. Heroin (derived from morphine, but more potent) was marketed as a safe alternative to morphine, even though it was soon realized (by 1906) that it could become addictive after only a few days’ use. Early advertisements were directed at both adults and children (yes, children).

Once heroin was acknowledged to be addictive, an attempt was made to create a closed system, and with doctors no longer prescribing it, production and distribution shifted to the underworld – there was, after all, continued demand, but no legal supply for users. With the arrival of Prohibition in 1920, the “syndicate” began employing the same routes used for alcohol to smuggle in street drugs – heroin, in this case. Heroin was outlawed for all purposes, including medicinal ones, in 1924.

The Boggs Act (1951) imposed mandatory minimum sentences for drug possession, which fell most heavily on poor urban minority populations. But illicit heroin continued to circulate widely from the fifties through the early seventies; it was deemed a pestilence of the inner cities. America is still grappling with the effects of the Boggs Act 70 years later, as the debate continues between those who would criminalize use and those who prefer to see addiction as a disease, and thus a public health crisis. It would appear that the pendulum is now swinging towards decriminalization and treatment rather than punishment; many local jurisdictions across the country have introduced innovative programs which aim to keep users out of jails/prisons and shepherd them towards treatment (methadone clinics being the best-known) or, at least, harm-reduction programs (needle exchanges).

In the forties, three brothers from Brooklyn, all doctors, all keen on research (together they published more than 150 scholarly articles), took a particular interest in biological psychiatry – i.e., in the use of pharmaceuticals as opposed to psychoanalysis. One of the three, Arthur, also had a distinct talent for marketing and business, and he purchased a small medical advertising agency. Arthur Sackler must be considered the father of modern pharmaceutical marketing, which was based first on direct marketing to physicians (to persuade them to write prescriptions) and secondly on appeals to the public such as “Ask your doctor about XX drug.” (Cf. “Most of the questionable practices that propelled the pharmaceutical industry into the scourge it is today can be attributed to Arthur Sackler.”) The brothers purchased a small, nondescript pharmaceutical company in 1952; Arthur’s agency, meanwhile, handled the marketing of two benzodiazepines, Valium (= diazepam) and Librium, used primarily as tranquilizers for anxiety. By 1973, physicians were writing 100 million prescriptions a year for tranquilizers. The Sacklers (Arthur as CEO of the medical ad agency, Mortimer and Raymond as co-CEOs of the pharma company, later rechristened “Purdue Pharma”) made millions in the sixties and seventies.

During the seventies and eighties, Purdue Pharma had tremendous success with a slow-release formulation of morphine (“MS Contin”). But in 1987 MS Contin’s patent was about to run out, so they needed another narcotic – a drug that would also contain the “twist” of delayed release. This drug, which consisted of the opioid oxycodone, was approved by the FDA in 1995 and christened OxyContin. Oxycodone (synthesized in 1916 by German scientists) was already available in other formulations (Percocet = oxycodone + Tylenol; Percodan = oxycodone + aspirin), but OxyContin was pure oxycodone with the delayed-absorption feature. And it was produced in a range of strengths, from 10 mg all the way up to 160 mg. It was promoted for use for “moderate to severe” pain.

In the mid-1990s, pain was re-classified as the “5th vital sign,” and the opportunities for production and marketing of effective pain medications, including opioids, changed again. The Sacklers found the opening they’d been waiting for. OxyContin was marketed not just for the types of pain opioids had legally been limited to – basically, end-of-life pain – but for just about any type of pain – injury-related, dental-related, surgery-related; back pain, neuralgia, arthritis, athletic injuries – you name it, OxyContin could treat it. Not surprisingly given the technique employed to sell it – by doctors (paid by the Sacklers) to doctors (including general practitioners and other non-specialists) – by 2000 OxyContin was generating a billion dollars a year in revenue.

Throughout the late nineties and into the early years of the 21st century, OxyContin was marketed aggressively to doctors in poor, rural, white communities (with a population described as “opioid-naive”) as a “delayed absorption” opioid, the implication being that it was therefore less addictive (in the absence of any objective evidence). It took only a few years for the realization to sink in that OxyContin was just as addictive as the other opioids which had come before it.

Purdue Pharma has now been sued thousands of times, but has never admitted complicity in causing the opioid epidemic; cases are settled out of court for a fraction of the billions in profit the family made from others’ suffering. In 2019, Purdue Pharma filed for bankruptcy; the settlement plan approved in that proceeding was recently rejected (January 2022) upon appeal. We shall see; a further appeal is pending with the 2nd U.S. Circuit Court of Appeals.

Back to 2022:

The majority of now-adult substance abusers first became dependent as a result of a legal opioid prescription (estimates range as high as 80%). What does one do once doctors become fearful of writing further prescriptions? (Today, for example, patients are normally given only a week’s supply following routine surgeries and modest injuries – i.e. for “moderate” pain.) Initially users turned to street supplies – the “pill mills” which churned out prescriptions in the late nineties and early 2000s put legally prescribed supplies out there for a price, with pills often being sold as singletons. Once that supply began to dry up, addicts turned again to street providers – but this time, of heroin.

And that heroin is frequently laced with fentanyl (more than 25% of drugs confiscated by the DEA now contain fentanyl), a synthetic opioid 80-100 times more potent than morphine and 25-40 times more potent than heroin, rather than the innocuous baking soda, sugar, or starch which were once used to cut it. In consequence, during the first year of the pandemic, around 190 people died every day of opioid-related overdoses, making opioids the leading cause of drug-related deaths. It’s a sellers’ market out there, and buyers – at least in part due to the disruption of reliable supply chains during the pandemic – are at their mercy.

We noted above that the most-used antidote for overdoses – especially those caused by unknowing ingestion of fentanyl – also experienced market “turbulence” during the pandemic. Pfizer admitted that it “ran into problems in manufacturing doses of naloxone” early in 2021, but insists that the “issues are now fixed” (without revealing what those issues were). The real turbulence was connected with pricing, with the cost of a single dose of Narcan (produced by Emergent BioSolutions) going from $2.50 to $75.00 for harm-reduction centers, which do not receive a discounted price. The Opioid Safety and Naloxone Network (OSNN) Buyers Club, which Pfizer had been supplying with an injectable formulation, was cut out of purchasing altogether. It’s estimated that the astronomical rise in cost resulted in between 12,000 and 18,000 unnecessary drug-related deaths in the past year.*

That’s a lot of excess deaths, and the explanation for what happened with naloxone – first to the supply (if anything) and then to the price – has not yet been thoroughly investigated.

*Addendum: Further research after this post was written failed to reveal more about what happened to Pfizer’s manufacturing of naloxone in early 2021:

The drab Pfizer website lists the availability of its injectable naloxone formulation as “depleted.” In a perfect world, Pfizer’s listing would be banal. A shortage of one generic WHO-declared essential medication for one manufacturer (among several in the US) doesn’t sound like a crisis scenario. Yet Pfizer’s supply disruptions are causing the worst naloxone shortage the country has faced since at least 2012, when overdose levels were less than half of what they are now.

In a juster world, Pfizer could have diverted some of its estimated net profit of almost $22 billion (2021, up from $9.1 billion in 2020) to ensure that the supply of injectable naloxone could meet demand.

Readings

“Opioids and the COVID-19 Pandemic”

“An Epidemic within a Pandemic: The Opioid Crisis and COVID-19”

“How the COVID Pandemic Made the Opioid Epidemic Worse Even as Telehealth Helped”

“2022 a critical year to address worsening drug-overdose crisis”

“Drug Overdose Deaths in the U.S. Top 100,000 Annually”

“It’s really, truly everywhere: How the Opioid Crisis Worsened with COVID-19” (podcast)

“America’s Opioid Epidemic” (pre-pandemic podcast)

“Price for drug that reverses opioid overdoses soars amid record deaths”

“The Family That Built an Empire of Pain” (on the Sacklers)

“Affordable naloxone is running out, creating a perfect storm for more overdose deaths, activists say”

“The Evolving Opioid Crisis”

“Opioids, Inc.” (PBS Frontline)

“Chasing Heroin” (PBS Frontline)

“7 Days: The Opioid Crisis in Arkansas” (PBS Arkansas)

2022-02-04 Covid Revelations III: Long-Term Care Facilities

The Nursing Home Crisis

In this, our third post covering the failures of the U.S. health/healthcare system over the past two years, we consider the case of nursing homes, the best-known and largest category of Long-term Care (LTC) providers. Other congregate living facilities include assisted living, memory care facilities, group homes for the disabled and of course, prisons, which we will consider under the COVID-19 lens in a later post dealing with the justice system.

When queried, around 77% of Americans state that they would prefer to “age in place,” i.e. to remain in their home during their final years. But there are reasons, both economic and health-related, why this is often not possible. Between 1.3 and 1.4 million people are currently nursing home residents; another 800,000 or so are in assisted living facilities (considered a “step up” from nursing homes in terms of resident independence).

The only significant publicly owned and operated portion of this sector (run by the federal government in collaboration with individual states) consists of Veterans Administration facilities, popularly referred to as “VA Homes.” Of the approximately 15,000 nursing homes in operation today, 70% are for-profit; the other 30% are non-profits or “safety net” homes funded by smaller government units, e.g. states/counties.

While the majority of homes are privately-owned and operated, and the majority of these are for-profit, this is not to say that state and federal governments aren’t directly involved in their funding and oversight: Medicare (overseen by the Department of Health and Human Services, HHS), under which 97% of nursing homes are approved for receipt of federal funding, provides around $25 billion yearly for the rehabilitation/convalescence of patients released from hospital but not yet able to return home, while Medicaid (federal/state) contributes around $50 billion a year, for a total of 60% of nursing home costs.
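(A rough bit of arithmetic, assuming the figures above: $25 billion from Medicare plus $50 billion from Medicaid comes to roughly $75 billion a year; if that represents about 60% of nursing home costs, total U.S. spending on nursing home care works out to something on the order of $125 billion annually.)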

During the early days of the pandemic (March – July 2020), those paying attention to the statistics became alarmed when data on nursing home mortality rates began to be published – at least by some states, and at least in part – although much of the data was hard to interpret, since posting practices differed from state to state, making accurate state-to-state comparisons difficult for epidemiologists and other investigators. Overall, however, it would appear that about 40% of total deaths in the first wave were of those living and working in nursing homes (140,000 deaths, even though residents and staff accounted for only around 5% of total cases). There was considerable variation among states: in New Hampshire, for example, 81% of all reported COVID-19 deaths were of nursing home residents/staff, whereas in Nevada, only 19% of deaths were attributable to nursing home residents and workers. Data from 18 states showed that more than 50% of all first-wave deaths were of nursing home residents/workers.

It was noted by pundits and news analysts that this was “natural” and “inevitable,” given that those over 65 residing in nursing homes normally have multiple co-morbidities which made them particularly vulnerable to COVID-19. But, as we saw in our two previous posts (and as we will see repeatedly throughout this series on the revelations of COVID-19), the nursing home / long-term care industry was already operating under its own version of “just-in-time” conditions, something most news reports didn’t make clear.

Some of the challenges homes had long been facing by February 2020:

  • Standards for nursing homes / congregate care facilities exist, but enforcement has historically been lax. This was crucially the case with infection control, which led to many homes’ turning into superspreader sites practically from the onset of the pandemic. Inspectors tended to be too few in number to carry out regular, rigorous inspections; nursing home owners often felt that a modest penalty or fine, e.g. for systematic failure to implement infection control mechanisms, was cheaper than addressing protocol breaches.
  • Funding conundrums. Medicaid, which foots around 60% of the total budget for nursing home care in the U.S., has no aging-in-place option, even when someone could remain at home with minimal-to-modest regular visits/support and community services. If a given state wants to use federal Medicaid money to provide community / in-home care options, it must apply for a waiver through an onerous, time-consuming process.
  • Nursing homes are by their nature – as “congregate facilities” – unhealthy environments for the elderly, sick, and frail. Most consist of shared rooms, toilets, and showers; staff often “float” among a large number of residents (even between wings, which led to additional spread before isolation procedures for COVID-19 residents were devised) – and of course, it was staff members who brought COVID into the nursing homes in the first place, given that residents live in a quasi-bubble.
  • Nursing homes are systematically understaffed, and staff that carry out nearly all of direct patient care / service – CNAs (certified nurse assistants), food service workers/janitorial/maintenance/laundry staff – are the lowest-paid in the healthcare provision sector. CNAs earned on average $13.00 an hour before the pandemic (with considerable variation between regions/states based on cost of living and local minimum wage laws). In order to survive, many were working at more than one facility, or in another sector altogether (e.g. fast-food service). The more hours they worked outside any given facility, the greater the danger of their contracting COVID through “community spread” and bringing it into one or more of the homes where they worked.
  • Systematic understaffing soon reached crisis levels in many homes as staff began to fall ill or quit in fear. This led, inevitably, to some homes’ shutting down entire wings, thus reducing the number of beds available to patients ready for release from hospitals into convalescent care. And this had a backward-ripple effect on hospitals themselves, which were forced to keep patients who, although ready for discharge, had nowhere to go, in turn forcing the hospitals to turn away patients who urgently required hospitalization.
  • The long-term care sector provides few decent – and no public, universally-available – options for insurance. Medicaid pays $6,180 per month ($74,160 a year) per resident, but this is not enough to provide sufficient staffing/services even in non-pandemic times. The sector, which is now enormous, is thus both a victim of underfunding and an inevitably predatory actor, given that the majority of homes are run for profit.
  • The profit motive means that many homes do not provide the bare minimum of professional oversight of facilities, e.g. by having an RN present 24 hours a day, or by employing an infectious disease specialist to monitor for infection control. Homes which do have an RN present experience both lower morbidity/mortality rates and an overall better level of care. Given that a federal agency, HHS, has the right to set standards for staffing of nursing homes, it would be possible to establish minimum staffing ratios across the board / across the country (as we saw that California did in 2004 for hospital ratios, and as Illinois and Pennsylvania are proposing). But this would require that around 150,000 care workers be added to nursing home staffs.
  • This leads us to staff shortages, which have now (2022) expanded from primary care providers like CNAs to nursing home directors (liability concerns?), RNs (as part of the overall shortage / fear of liability?), and even dining staff. Today, 54.5% of all nursing homes are experiencing staffing shortages; since February 2020 (i.e. the past 2 years), 420,000 nursing home staff have left the field entirely – partly out of fear of contracting COVID, certainly, but also partly because more attractive / less health-threatening jobs opened up during the 2021 recovery. In some states, the National Guard has been called up to assist with keeping homes open, but overall, 58% of the country’s 15,000-odd  homes are now limiting admissions, either because they have closed beds or due to inadequate (even by minimum standards) staffing, or both.

As if the above weren’t enough, a significant percentage of elder-care workers remained unvaccinated months after vaccines became available to them. In July 2021 (seven months after vaccines became available), 40% of CNAs remained unvaccinated; as of September 2021, it was 30%. Some states (a total of 15 as of Feb. 1, 2022) have mandated that healthcare workers be vaccinated, but in states where mandates do not exist (e.g. Ohio), the rate of unvaccinated workers remains stubbornly stuck at around 40%. The highly-transmissible omicron variant has led to high rates of infection among these workers, and consequently to even more serious staff shortages.

At first glance, this reluctance – or outright refusal – appears hard to comprehend or countenance, given the population with which LTC workers interact.

Let’s turn the mike over to the workers themselves, whose initial concerns mimic those of millions of others (some of whom hold PhDs) who refuse to get vaccinated. But nursing home workers have additional concerns rarely aired in mainstream media.

We begin with the well-known standard objection, which may be summarized as “I’m not against vaccines, but it all happened too quickly”:

First up: Kia Cooper, Philadelphia, who has worked as a CNA for nearly 2 decades: “I’m not totally against it. But it was so rushed. I want to wait and see how others do.”

However, Kia has additional concerns:

“Her experience with a health-care industry that seems to put profits over the interests of patients and staff—that denies hazard pay, that fails to provide adequate protective equipment—also contributes to her hesitancy. ‘I do wonder if it’s a money thing. These are big companies trying to force these products on everyone. You have to wonder, Are they doing it for us or are they just trying to make money?’” (emphasis added)

Second up: Destiny Hankins, an LPN from Tennessee currently working in Ohio (no vaccine mandate):

“Sometimes, it feels like no one cares about us. I’ve worked in places where pretty much the whole staff walked out because the facility lied to us. They said there was no COVID when there was. They didn’t give us P.P.E. They didn’t have the decency to be straight with us.” (emphasis added)

Ms. Hankins has, after reading / reflection, decided she does want to receive the vaccine. “But because she works part time at several facilities, and full time at none, she hasn’t been able to get one.”

And then there’s lack of trust, which must be considered in light of a distrust of the medical establishment among the POC and marginalized communities whose members form the backbone of the CNA workforce:

David Grabowski, a professor of health-care policy at Harvard:

“In many cases, vaccine hesitancy is not a lack-of-information problem. It’s a lack-of-trust problem. Staff doesn’t trust leadership. They have a real skepticism of government. They haven’t gotten hazard pay. They haven’t gotten P.P.E. They haven’t gotten respect. Should we be surprised that they’re skeptical of something that feels like it’s being forced on them?”

There were facilities which succeeded in protecting residents and staff during the initial wave of the virus, but this required a genuine sense of community, shared commitment and sacrifice which had been created over time.

Kimberly Delbo, the director of nursing services and innovation at an assisted-living facility in central Pennsylvania:

“‘We’re a small, tight-knit family. The most important thing we can do as an organization is make sure people know that we truly care about them.’ In an industry where a fifty-per-cent annual staff-turnover rate is not uncommon, Delbo’s facility did not lose a single employee in 2019; last year, it had a ninety-per-cent retention rate.” (emphasis added)

Delbo herself engaged in active campaigning for the vaccine:

“We’ve been very proactive about building confidence in it, about getting them the facts, about debunking conspiracy theories and social-media myths. We can engage in this dialogue because they trust us. I think what’s important for people to understand is that you don’t build trust in a day and you don’t build it for a specific purpose. We’ve been investing in trust for years. We were doing this before the pandemic, and we’ll do it after.” (emphasis added)

In sum, the major problems identified in the nursing home sector pre-pandemic – all of which were exacerbated by the pandemic itself – included: systematic underfunding and understaffing; unsanitary living conditions (shared rooms/toilets/showers/staff); inadequate senior (RN) supervision and infection control protocols; the absence of a stockpile of PPE (masks, goggles/shields, gowns, gloves, sanitizer) at federal, state, or local level; and an absence of testing/tracing capabilities.

When the pandemic struck, it was inevitable that COVID-19 would wreak havoc among nursing home residents and staff. On average, the mortality rate in nursing homes (residents plus staff) during the early months of the pandemic was more than five times that among the general population (16% vs. 3%).

As regards PPE, nursing homes were/are not prioritized, and many had but a week’s supply of equipment laid by. This led to competition between homes and hospitals, and between homes themselves (similar to the competition witnessed among states for ventilators), with larger chains winning out and smaller ones (including safety-net homes) left behind, along with their residents and staff. Price-gouging was common. And during the first months, there was a “nearly complete absence of national efforts to improve the availability of testing and PPE.”

The nursing home crisis must therefore be interpreted within a “wider context of historical disinvestment and chronic underfunding,” rather than as a one-off, unavoidable disaster.

Addendum: The Biden Administration’s “Build Back Better” Act of 2021 foresaw a substantial and unprecedented investment ($150 billion, i.e. twice the amount spent yearly by Medicare and Medicaid for nursing homes) in Home and Community Based Services (HCBS) programs.

It appears that Senator Joe Manchin (WVa) has succeeded in killing the bill’s chance of passage, certainly in its original form and perhaps even in some stripped-down future iteration. (“Pressed by CNN on whether he has had talks on the proposal, Manchin said, ‘No, no, no, it’s dead.’”)

Interestingly, Manchin’s family once built a “safety net” nursing home, the John Manchin Sr. Health Care Center (founded 1899 for disabled miners) in Marion County, West Virginia (near the Pennsylvania border). Until he was questioned about his involvement with the home, Manchin was listed as a corporate officer of the home, which is registered as an LLC (two days after questioning, his name had been removed from the corporate officers). There is currently a concerted effort underway to close the Manchin Center (which provides vital care and health – as well as food-provision – services to Marion County) because it is inefficient – it has around 30 residents requiring a high level of care who would probably not find places in another facility. The ongoing struggle to save the Center – there are only 150 state-run nursing homes left in the U.S. (1%) – would have been aided by the passage of the BBB Act.

The pandemic has revealed that while the U.S. pays lip service to caring for its most vulnerable population, in fact the oldest and frailest among us have been neglected for decades due to increasing privatization and the profit-driven operations of congregate care.

We will see later in this series, when considering COVID’s revelations about pre-K through 12 education, that the youngest among us – infants and toddlers – have been similarly neglected. They too would have benefited greatly from the passage of the Build Back Better Act.

Readings

“Nursing home staff shortages are worsening problems at overwhelmed hospitals”

“Nursing home health care shortages, a ‘crisis’”

“State Policy Responses to COVID-19 in Nursing Homes”

“Nursing Homes Can’t Find Enough Workers: How That Affects Care”

“Rising from the COVID-19 crisis: Policy responses in the long-term care sector”

“‘Don’t You Work with Old People?’: Many Elder-Care Workers Still Refuse to Get COVID-19 Vaccine”

“Reimagining the Nursing Home Industry after the Coronavirus”

“Why Are So Many Health-Care Workers Resisting the COVID Vaccine?”

“Senate Build Back Better Act Draft Language Maintains Historic $150 Billion Investment in HCBS”

“The Fight to Save the Manchin Nursing Home”

2022-01-31 Covid Revelations II: The Nursing Crisis

Where Have All the Nurses Gone?

The issues involving crisis-level shortages of nurses and healthcare providers (for home health care, assisted living facilities) are somewhat different, although the level of crisis in each sector is comparable. The pandemic has made these worse – and more apparent to the public – but both sectors were in trouble well before the onset of COVID.

General news articles blame the pandemic-era shortage of registered nurses (RNs) on COVID (the most frequently cited causes: burnout, stress, emotional exhaustion), but just as we saw in our previous post on how the pandemic led to a public health disaster in the U.S., the roots of the U.S. nursing shortage go back much further – years before COVID-19 struck.

Most hospitals in the U.S. are now privately-owned and operated. Consequently, they are driven by the profit motive rather than the delivery of critical healthcare services. Given that approximately 50% of a hospital’s budget goes to staffing, the latter has been “streamlined,” i.e. minimized to increase profits. Not-for-profit hospitals, forced to compete with for-profits, have adopted the same practices to avoid being priced out of the market. Thus for practical purposes, there is little difference in how their budgets operate.

This is not a question of an actual lack of nurses – in 2015, for example, there were 2.7 million RNs in the U.S. Rather, hospitals’ “flex-staffing” (equivalent to the “just in time” production-delivery system we saw in the case of PPE in our previous post) has led to systemic understaffing, which revealed gaping holes during the first wave of the pandemic. Even in the midst of the omicron wave – formally, the third – the U.S. has not been able to make up for chronic, deliberate understaffing. Today, the problem is not so much a lack of beds as a lack of staffing to care for patients occupying those beds. And because remaining staff are diverted to COVID wards, millions of elective procedures (a major source of profit) have necessarily been postponed or cancelled.

When staff shortages become chronic, existing staff cannot properly meet the needs of a surge in patients (even, for example, in the case of a major accident / natural disaster). It therefore should not be surprising that 66% of critical care nurses (who bear the brunt of caring for COVID patients) were considering leaving the profession when surveyed in September 2021 (before the omicron wave got underway), and 40% of ALL nurses were considering leaving.

Hospitals have dealt with persistent nursing shortages by “outsourcing” demand to private agencies, which contract as middlemen for the provision of visiting nurses in COVID “hotspots” as each wave has rippled through the country. This practice, engendered by stark necessity, cost hospitals approximately $24 billion in additional outlays, producing a 14% increase in total labor costs (as of Sept. 2021), even though the number of full-time (i.e. regular) employees fell by 4% during the same period – against an average 20% increase in staffing requirements during the pandemic.

The visiting nurse business has proved enormously popular during the pandemic; in 2020, it increased by 35% in volume (and, predictably, profits). Visiting nurses, who sign short-term contracts for periods ranging from a few weeks to a few months, earn between $5,000 and $10,000 a week. They can go where they choose, and work when they choose, i.e. they can rest and recuperate between stints. Staff nurses, on the other hand, often work 12-hour shifts for weeks on end (when there is a chronic shortage, the term “days off” flies out the window), and it was inevitable that many would suffer burnout and emotional trauma during the multiple waves of the pandemic. And for this, they earn on average around $1,400 per week ($1,200 in rural hospitals). The pay disparity inevitably impacts the morale of staff nurses, some of whom chose to quit their regular jobs and go to work as agency contract workers.

Understandably, the term “price-gouging” has arisen in the discussion of visiting nurse agencies’ charges, but this issue really needs to be addressed at the federal level; individual states which attempt to impose caps on “excess profits” of, say, 10%, or a cap on service charges (e.g. 150%, i.e. “time-and-a-half”) would soon find themselves shut out of the market for visiting nurses entirely.

The outside observer of what is clearly a crisis in staffing – 99% of rural hospitals have declared staffing shortages, and 96% have noted that they have the most difficulty in hiring RNs – may well wonder why RN staffing levels are not regulated. Currently, only one state, California, has legislation (since 2004) regulating RN-to-patient ratios in hospitals (ratios range from 1:1 to 1:6, depending on the level of care demanded; ICU wards, where critically ill COVID patients are typically cared for, require a 1:2 ratio). Two other states (Illinois, Pennsylvania) have pending legislation to impose similar ratios on hospitals within their jurisdiction; unfortunately, similar legislation proposed in Massachusetts was defeated in 2018, thanks to $25 million invested by the American Hospital Association, which conducted a campaign opposing the bill under the guise of “freedom and choice,” an all-too-familiar slogan. Federal legislation regulating standard nurse-to-patient ratios in all hospitals which receive federal funding (e.g. Medicare, Medicaid) is clearly needed, but seems an unattainable goal given the current donor base of Congressional members of both parties.

In the meantime, what is to be done? One solution – although it’s not a quick fix by any means – would be to increase the overall number of practicing RNs in the U.S. Despite COVID, applications to many B.S. programs in Nursing are up. But the American Association of Colleges of Nursing (AACN) has noted that in 2021, around 80,000 applicants to such programs were rejected due to a lack of teaching staff and clinical placement sites (66,000 rejected from B.S. programs, and 13,000 from graduate programs).

Herein we find yet another problem: teaching in nursing programs pays less on average than actual nursing; thus, many senior nurses with advanced degrees choose to remain practitioners or continue for a nurse practitioner degree (with even higher salaries), or do a few stints of visiting nursing each year rather than enter the teaching profession. In short, the U.S. cannot train more nurses in the short term because there aren’t enough qualified teaching faculty.

Here’s another problem: the RN workforce is “counter-cyclical,” because the majority of RNs are married and remain outside the workforce while their children are young and while their spouses / significant others are gainfully employed. But because many U.S. schools went online during the first year or so of the pandemic, even those who might otherwise have reentered the workforce were unable to do so due to childcare / home schooling responsibilities. Thus, many qualified nurses were constrained in their ability to return to work (around 500,000 of the country’s 2.7 million RNs in 2015 were not working).

And finally: The peak baby boom year in the U.S. was 1957 – and the large cohort of nurses born in that year will turn 65 in 2022. Thus we must anticipate a larger-than-normal retirement cohort of highly experienced, long-term nursing staff.

Nurses today are university-educated professionals, like teachers (later in this series, we will see that similar issues have arisen in the teacher crisis). Staff nurses earn a middle-class wage, but they are increasingly burdened by technology demands many older nurses have difficulty adapting to, and by administrative tasks which take many RNs off the floor for extended periods each day they work (most nurses entered the field not to engage with computers and paperwork, but to engage with human beings whom they want to care for and help). COVID has greatly increased the levels of stress, burnout, and overwork – much of which is caused not by too much patient contact, but by too little.

Privatization of U.S. institutional healthcare providers over the past several decades created a crisis-in-waiting. COVID heightened it, but didn’t cause it.

Sources:

Covid will continue to highlight America’s nursing shortage in 2022 and the looming ‘silver tsunami’ (OP-ED)

Why is the U.S. perpetually short of nurses?

Post Covid-19 global nursing workforce challenges ‘too big to be ignored’

COVID-19’s Impact on Nursing Shortages, The Rise of Travel Nurses, and Price Gouging

We Know the Real Cause of the Crisis in our Hospitals. It’s Greed.

Why the U.S. Nursing Crisis Is Getting Worse

COVID-19 fueled nursing shortage but also inspired new generation of nurses

What do the California ratio laws actually require?

2022-01-28 Covid Revelations

Setting the Scene for a Public Health Catastrophe

Over the past two years, successive waves of the coronavirus SARS-CoV-2 (“COVID” for short, after the disease it causes, COVID-19) – the Alpha, Delta, and now Omicron variants – have revealed grave failings in all the policy fields DeedSpeakOut has regularly featured: health/healthcare, education, housing, the justice system, and the environment. As we resume coverage of these areas, it may be useful to point to the initial cracks which the pandemic has turned into chasms. We’ll start by setting out the big picture of what happened to public health.

Remember all the way back to late 2019/early 2020, and the stories which began surfacing from Wuhan about a newly-detected respiratory (pneumonia-like) virus? China put Wuhan (pop. 14,000,000) in full lockdown, promptly implemented a massive testing-and-tracing program, and imposed strict quarantine protocols. Westerners were appalled by these images (eerily-empty streets, massive field hospitals), but the Chinese (and Taiwanese, Japanese, Vietnamese, Singaporeans, Malaysians, Australians, New Zealanders, etc.) had more extensive and recent experience of such viruses. They knew the only way to tame them into submission was an all-out “zero COVID” policy, and they had the knowledge, experience, means, and political will to do so.  

Remember that Biogen conference in Boston in February 2020 [estimated to have been responsible for some 330,000 cases]? Those two Washington state choir practices in early March? That image of Chicago O’Hare as thousands of Americans rushed home when the U.S. government finally decided to close its air borders [March 15, 2020]? The U.S. dithered, but there’s no time to dither when confronting a new, highly-contagious virus for which neither vaccines nor known therapeutics exist. Every step taken in the U.S. at both federal and state levels came too late.

In consequence of its failure to follow the tried-and-true public health model, the U.S. ended up with one of the highest viral case rates per million, the largest number of hospitalizations, and, inevitably, deaths; even now, in the midst of the Omicron variant, new confirmed cases in the U.S. (pop. 330 million) on Jan. 26, 2022 exceeded 600,000, while those in China (pop. 1.4 billion) stood at 57.

The absence of prompt, informed action backed up by long-term preparation for implementing public health’s standard toolkit for dealing with an airborne (aerosol-transmitted) virus has now resulted in nearly 900,000 U.S. deaths. The U.S. healthcare system, which is largely privatized and therefore profit-driven, has been overwhelmed by successive waves of the virus. Once eradication fails, we are at the mercy of the virus itself, forced to await each successive mutation’s unique features (current variant: Omicron = B.1.1.529, estimated to be about 10 times more contagious than the original strain), hoping that what eventually emerges as the long-term variant is no more lethal than the common cold. In the meantime, however, new variants emerge. Delta emerged in India; Omicron appears to have emerged in South Africa; an even more contagious sub-variant of Omicron [BA.2] has now been identified in Denmark, and appears to have landed in California. It’s not really a race against time; rather, it’s a case of getting by on a wing and a prayer.

The U.S. doesn’t fund its public health programs sufficiently (less than 3% of total health spending [2020: $4.1tn] goes to public health), which has unavoidable consequences for large-scale health emergencies. The long-term lack of funding results in a chronic dearth of personnel, something all-too-evident during those first months when thousands of county health departments (the first line of defense for ordinary citizens) were unable to coordinate with one another, to articulate in a clear and compelling manner their initial policies (in coordination with state departments of public health), or to implement those policies once they were published.
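For a rough sense of scale – a back-of-the-envelope sketch based only on the two figures just cited – that share implies an annual public health envelope of well under $125 billion, spread across federal and state agencies and thousands of county health departments:

    # Implied ceiling on U.S. public health funding, using only the figures cited above
    total_health_spending_2020 = 4.1e12   # $4.1 trillion in total U.S. health spending
    public_health_share = 0.03            # "less than 3%" goes to public health

    implied_ceiling = total_health_spending_2020 * public_health_share
    print(f"< ${implied_ceiling / 1e9:.0f} billion per year")   # < $123 billion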

Chronic underfunding at local, state, and federal levels cannot be overcome in the short term; thus we continued to witness a lack of staffing, coordination, and implementation in the initial rollout of the first vaccines in early 2021. The federal government, unable to deliver a mass vaccination program, contracted with private vendors such as mega-pharmacy chains (CVS, Walgreens) to assist with delivery. County health departments were in many cases compelled to contract out for vaccination appointment software, rather than the federal government rolling out a single platform for the entire country. Time was lost, inefficiencies ensued, federal funding was often spent unwisely or unnecessarily – and private companies profited.

During the first and most critical wave of the pandemic, the dearth of funding for public health in the U.S. meant that we lost both the battle for eradication in 2020 and the one on which suppression was staked, viz. vaccines, in 2021. Even today, over a year after vaccines became widely available to at-risk groups (healthcare workers, residents of long-term care facilities, those with comorbidities, the elderly) and more than eight months after they became widely available to the general adult population, only 63.8% of the U.S. population is fully vaccinated – an unimpressive data point which situates the U.S. between San Marino (64%) and Sri Lanka (63.4%). Something went wrong, because the U.S., in contrast for example to Africa (where only 10% of the population is fully vaccinated), had access to abundant supplies of the vaccine from early on, once initial production hitches and distribution problems were overcome. Much of what went wrong must be attributed to the absence of federal policy coordination and oversight, the lack of public health personnel to carry out a fast, mass campaign, and the inevitable involvement of the private sector in vaccine manufacture (and patents), distribution, and delivery. Billions of dollars in corporate profits were earned, but the job didn’t get done. It’s been estimated that 80% or more of the population needs to be vaccinated given Delta’s R0 (= reproduction number) of 5.09, vs. 2.79 for the ancestral strain. The percentage is even higher for the Omicron variant, given an estimated R0 of 7.0-14.0. We’re not going to reach the required percentage, ever, given that (a) the vaccines aren’t sterilizing and their effectiveness begins to wane after 3-4 months, and (b) the vaccines are geared to the most recent dominant mutation – they’re always at least one step and several months behind any newly-emerging variant(s).
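The 80%-plus figure is consistent with the standard – and admittedly idealized – herd-immunity approximation, threshold ≈ 1 − 1/R0. The sketch below simply plugs in the reproduction numbers cited above; note that the formula assumes a perfectly sterilizing vaccine, which, as just noted, the COVID-19 vaccines are not, so the real-world bar is higher still.

    # Idealized herd-immunity threshold: 1 - 1/R0 (assumes a fully sterilizing vaccine,
    # which the COVID-19 vaccines are not, so the real-world bar is higher still).
    def herd_immunity_threshold(r0: float) -> float:
        return 1 - 1 / r0

    print(f"Ancestral (R0 = 2.79): {herd_immunity_threshold(2.79):.0%}")   # ~64%
    print(f"Delta     (R0 = 5.09): {herd_immunity_threshold(5.09):.0%}")   # ~80%
    print(f"Omicron   (R0 = 7-14): {herd_immunity_threshold(7.0):.0%} to "
          f"{herd_immunity_threshold(14.0):.0%}")                          # ~86% to ~93%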

Why? Countries which succeeded in (nearly) eradicating the virus during the first year – before the emergence of Omicron, which presents a different and much bigger challenge – did so by the stringent imposition of NPIs (non-pharmaceutical interventions), namely masking, social distancing, good ventilation (critical for aerosol viruses), and testing-tracing-isolation/quarantine – measures the U.S. never applied consistently.

There was also a misunderstanding/misrepresentation about the purpose and efficacy of the vaccines themselves; this misrepresentation, however, may have been necessary given the failure to implement first-line NPIs. The vaccines were promoted as offering (nearly) full protection against infection, when in fact their logic is that of the yearly flu shot, which in everyday speech is rarely referred to as a “vaccine.” The latter term is reserved for diseases such as polio, smallpox, tetanus, whooping cough, measles-mumps-rubella and others for which one shot (or one shot + a booster at a later date) is deemed “sterilizing.” The COVID-19 “vaccine” is not sterilizing and cannot be deemed a definitive solution to the virus. This was known to the developers, the CDC, the WHO, the NIH, the NIAID (of which Anthony Fauci serves as Director), public health personnel, epidemiologists …

In order to gain modest control over the virus once front-line defenses had failed, speed was of the essence – i.e., vaccinating the largest possible percentage of any given population within the shortest possible timeframe. Speed was necessary to head off mutations of the virus among the unvaccinated and partially-vaccinated – the greater the percentage of the population that is at least partially protected, the smaller the chance that new, more lethal and/or more contagious variants will emerge.

The emergence of the Omicron variant (formally identified in South Africa in November 2021) well before the retreat of Delta in many countries proved particularly unlucky; Omicron is more contagious (with cases doubling every 2-3 days), has a shorter incubation period (thus making it less likely to be detected in its asymptomatic stage), and, due to the presence of a large number of “spike” mutations, is better able to evade the protection provided by current vaccines, which were developed in response to the ancestral (now essentially extinct) strain of the virus. The two major producers of vaccines for the American market (Pfizer-BioNTech and Moderna) have recently announced that they are entering the trial phase of boosters directed specifically at the Omicron variant, but it will be several months before trials are completed and production ramped up. In the meantime, we must hope that further viable mutations do not occur.
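To get a feel for what a 2-3 day doubling time means, here’s a purely illustrative calculation; the 1,000-case starting point is hypothetical, and the sketch assumes unchecked exponential growth, which real outbreaks eventually depart from:

    # Illustration only: how quickly cases compound at a 2-3 day doubling time,
    # starting from a hypothetical 1,000 cases and assuming unchecked growth.
    initial_cases = 1_000
    for doubling_time_days in (2, 3):
        cases_after_two_weeks = initial_cases * 2 ** (14 / doubling_time_days)
        print(f"Doubling every {doubling_time_days} days: "
              f"~{cases_after_two_weeks:,.0f} cases after 14 days")
    # Doubling every 2 days: ~128,000 cases; every 3 days: ~25,398 cases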

At this point, the virus is out of control in most of the West. Only those countries which continue to enforce strict NPIs have been able to hold it in check. The measures China has taken in preparing for the Olympics are worth considering. For example, China has decided to forgo foreign spectators and members of the local general public at events, where stands will be nearly empty starting February 4. Daily testing of contestants, journalists, and support staff is already being carried out; journalists are tested and quarantined for two weeks upon arrival; impenetrable “bubbles” have been created for distinct groups (contestants, media, staff); inter-city travel has been severely curtailed; cities where even a handful of cases have been detected have been promptly placed on lockdown.

Why didn’t the U.S. do all the above in the initial phase of the pandemic (say, January – February 2020)? Arguments that such measures would have been “un-American” were disingenuously applied to conceal the fact that the U.S. was objectively unable to do what China, Taiwan, Japan, Vietnam, Australia, and New Zealand – to name a few notable success stories during the first year of the pandemic – did.

The U.S. production economy (apart from the military) has been largely off-shored over the past forty years. During the early days of the pandemic (February – April 2020), the U.S. did not have enough protective (surgical-grade or higher, e.g. N95) masks for its health providers, let alone for the population at large. Most masks, like other disposable medical gear, were being produced in East Asia for the U.S. market. While reshoring was eventually adopted (through a Presidential invocation of the Defense Production Act), the time gap was one of several months – a period during which Americans were successively advised that “masks aren’t really all that necessary,” to “make your own (cloth) masks,” and (later) “don’t wear N95 masks because they’re needed for healthcare personnel.” Masks, of course, are indispensable for protection against an aerosol-borne virus (that the virus was transmitted in this fashion was clear from the first superspreader events in early 2020). Cloth masks are nowhere near as effective as surgical masks or N95s, but they were all the U.S. could muster because the supply chain from East Asia had been interrupted, firstly because China (and other East / Southeast Asian countries) needed all their production for domestic use, and secondly because the supply chain itself was cut off with the closing of ports / interruptions in shipping.

The dominant economic approach governing the production of both durable and disposable goods over the past four-plus decades is referred to as “just-in-time” production, delivery, and distribution. Producers (the majority of whom are offshore) keep production to a minimum to avoid maintaining excessive inventory; local (onshore) distributors do the same – orders are deliberately kept low to minimize costs and maximize short-term private profits.

The “just-in-time” production philosophy fails in a pandemic, when response time is counted in days, not months. A public health emergency calls for excess production and inventories, available at the emergency’s onset.

Such shortages occurred not only with masks; other crucial disposables required in the earliest stages of the pandemic such as additional PPE (personal protective equipment) for healthcare providers/hospitals, ventilators, and testing materials and supplies (test tubes, reagents) just weren’t available, or were in such short supply that they couldn’t be used on the scale required. Inevitably, supplies went to the wealthy and better-organized (states / healthcare providers / individuals); other states/counties/communities went without – and the virus spread.

Two years later, we still haven’t got masks right, and politicians are issuing orders against their mandated use in schools, of all places – just at the moment we’ve learned that Omicron is especially contagious among school-age children, and that approximately 10% of children (20% of adults) will suffer “long Covid” for some indeterminate period. The newly-installed Governor of Virginia, keen to give parents more “say” in their children’s education, has issued an executive order giving parents the right to decide whether their children wear masks (the cheapest, simplest, and most effective single means of protection against an aerosol virus) at school.

Next up: we’ll consider how neoliberalism / financial capitalism / techno-feudalism contributed to the crisis of healthcare personnel, including hospital nurses and long-term care facility staff. And then we’ll move on to multi-national pharmaceutical companies, pandemic health/care profiteering, and vaccine apartheid.

In the meantime, stay well, mask up, maintain social distancing, test often (if you can), and if you must go to indoor venues, make sure they are adequately ventilated. The vaccine, especially when accompanied by a booster (3rd shot) is remarkably effective in preventing hospitalization and death, but it doesn’t halt breakthrough cases.  

2021-04-05 The Pandemic and the Commons

The Social Discourse Commons

In an increasingly class-stratified and politically-polarized society, whose values reign supreme? And where do those who disagree with those values go to dispute them?

The values of traditional liberalism – among others, the freedom to live as, where, and with/among whom we choose; the expectation of fairness in the dispensation of social goods and social sanctions; and a basic tolerance of those whose beliefs differ from our own – are being sorely tested both by the isolation imposed on us during the year-long series of pandemic lockdowns and by our self-selected presence/absence on what was once touted as a substitute for the “commons,” the Internet (the “digital commons”).

In real life (“IRL”), how many of us are in a position to live anywhere we choose, with whomever we choose, and in any way we choose? How many of us genuinely believe that jobs are awarded (sic) to those best qualified and most “deserving” (and “deserving” in what sense, exactly?) of them? And how tolerant are we IRL of those with whom we disagree – often, sharply – on fundamental issues involving our shared polity?

As we continue to endure masking, social distancing, the ins and outs of (sometimes inequitable) vaccination, openings and shutdowns of both businesses and public services, many of us – those of us not providing essential pandemic services – continue to seek human interaction and discourse through online means, whether Skype, or Zoom, or FaceTime, or FB Messenger. I have initiated regular meet-ups (some complete with each of us consuming their own glass of wine and slices of cheese, or sushi or dessert, separately but together) with old friends, and it’s been a real boon to my morale. With old friends, we can laugh and joke, discuss current events, share our latest COVID-19 complaints, even proceed to engage in more substantive discussion as I do with one bi-weekly meet-up, where we have an assigned topic-of-the-evening and where there’s homework involved in preparation. Initiating such meet-ups was the most positive thing I’ve done during the pandemic in support of my personal mental well-being.

Make no mistake, however, about such get-togethers: they’re a good thing, but they’re not the real thing. And I assume that holds true for most of us – what we most miss is actual human contact. An education podcast we follow, HaveYouHeard, recently asked a previous guest to recruit her students and ask them what they felt they’d lost over the past year. In stark contrast to what administrators and Department of Education officials are bemoaning (“Learning Loss”), what the students feel they’ve lost is almost exclusively related to the social nature of school. Student after student said, first off, that they missed their friends; that they missed their teacher’s physical presence; that they missed their team sports, their drama club’s activities, even the sense of hanging out – or, to put it another way, of not being alone.

Not that school is perfect, far from it. Concerned teachers who’ve been paying attention this past year are entreating administrators to take heed of the lessons learned from remote teaching. For one thing (and this has long been known), school starts too early for children and adolescents – if school bells rang at 8:30 or 9:00 each morning, there would be many fewer students asleep during the first and second periods. For another, children get hungry at odd – and different – hours during the day; why not allow them to snack as needed? And it turns out that some children are doing better with remote learning than their past classroom performance would have predicted because, well, thirty other students in any given class is too much stimulation – it distracted them from learning rather than helping them learn. We need – indeed we must, for the sake of both public and mental health – to reduce class sizes by at least one-third.

The overall message is that what our leaders are trying to force students and teachers back to – what they call “normal” – is not all great. Many things will never be truly “normal” again, and rather than bemoaning this, we should seek to implement what we’ve learned from a year of far-from-normal. Perhaps the biggest takeaway is that students go to school to learn, yes, but they love school because it offers them a social commons, a place to engage with their peers, a place where they can learn to be. One student noted that a year of non-engagement meant that they had “lost my sense of self.” Truly we learn to define ourselves through the eyes of others, no matter how young we are.

Human beings, it need scarcely be repeated, are highly social animals. Once we emerge from infancy and dependency on a primary caretaker, we spend our lives with others – our peers, be they as young as 2 or as old as 92. We play together, we explore together, we collaborate, we learn – I mean, part of the thrill of being a 3rd-grader was when the whole class understood a new concept in concert: we were all excited, each of us an individual with greater or lesser intrinsic abilities but in that moment learning as one, together – the sum of us all was greater than our individual selves.

The same holds true for most adults. Millions of members of our Professional Managerial Class (the “PMC”) have worked remotely throughout the pandemic. Some fared well, others poorly. Those who fared well tended to be in stable relationships where they were isolated, yes, but not entirely alone; those who fared poorly were either entirely alone, lacking family or friends with whom to create a pandemic “bubble,” or in failing/failed relationships which provoked much unhappiness and provided little comfort. Understandably, the latter are hankering to return to “the office,” to a social world which provides interest, stimulation, distraction, and at least some solace from troubles at home.

In the absence of real-life social interaction, both children and adults have resorted to a simulacrum of fractured, interrupted relations: online communities. Here, however, we have duplicated and unfortunately exacerbated a significant pre-pandemic social issue, i.e. the tendency to gravitate towards forums/spaces where we feel most “comfortable.” Liberals subscribe to liberal online publications/news feeds and survey liberal websites; ditto for conservatives, of course, and for progressives and the far right – and the overlap between the progressive/liberal cluster of sites and sources and the conservative/far-right cluster has gotten smaller, not larger, over the past year. How many of us spend significant time checking out what the other side is making of current affairs, what they’re focusing on, what they feel is most important today, this week, this month, this past year?

And what does this narrowing of focus signify for our return to the physical commons, once we return to work and gyms and clubs and church groups and school board meetings and restaurants and cultural events and … well, wherever we choose to go once we’re able, once COVID-19 has been tamed by natural means or once we reach herd immunity through mass vaccination programs?  

Are we returning in the belief that the game’s not rigged against us (i.e. with the expectation that the dispensation of society’s goods / sanctions will be reasonably fair), or that it’s even more rigged than before? Are we returning with more tolerance for the Other – be that the irritating person whose desk is next to ours at the office, the show-off at the gym who worked out for two hours daily over the past year, or the shrill mother complaining because “some” students can no longer “keep up” with the curriculum, and that’s harming her child?

When the pandemic struck the West last March and it became clear that students – and millions of adults – weren’t going to be able to return to school/the office anytime soon (the West’s generalized failure to respond in a timely/appropriate/decisive manner to COVID-19 should by now be obvious), we believed that the most appropriate response-in-lieu-of-any-effective-response should have been for those in charge of education to hit the “pause” button.

That, of course, didn’t happen, but the decision by newly-inducted Secretary of Education Miguel Cardona that the yearly “Big Test” would be administered as planned this spring is pointless and in some sense cruel. Who believes that children who have lost loved ones (nearly 600,000 deaths now), whose parents have been largely absent because they – in contrast to PMC parents – were deemed “essential workers” (essential, but not really deserving of recognition by enhanced pay and benefits), will do as well as the children whose parents hired tutors, or formed “learning pods,” or rushed to exclusive summer enclaves to enroll their offspring in private schools that remained more-or-less open? The Department and its collaborators in the high-stakes, for-profit testing industry claim that the tests are necessary so that students’ “learning loss” can be assessed and remedial programs designed (in many cases by these same for-profit companies or their offshoots) so students can “catch up.” Does anyone doubt what the test will demonstrate, viz. that students in our poorest districts, whether urban or rural, whose parents didn’t have the luxury of staying home and supervising their remote learning experience, whose access to remote learning may well have been negatively impacted by inadequate or non-existent broadband connections, will not do well on the standardized tests? And for that matter, what does “well” even mean in such unprecedented circumstances?

Thousands of the 13,000-odd school districts across the country are already engaged in planning for remedial summer sessions (hoping against hope the pandemic will retreat in the next eight weeks, as if they’ve learned absolutely nothing from its course to date), but students don’t want to go back to school this summer. They want to go back to each other – to re-connect, to re-socialize, to re-discover their lost sense of self. There will be plenty of time to get back to school work and formal learning once social bonds are re-established. And the learning that takes place will be all the more effective once communal ties are re-established.  

Adults should heed what students claim they need, given that our needs are not all that different from theirs. When the pandemic passes – or when, as seems more probable, it retreats to a level deemed “acceptable” to the powers that be – we should re-connect with one another in person for any number of reasons every chance we get, even if masking and some degree of social distancing continue to be de rigueur.

Why? Because American adults’ social discourse commons, like our children’s, has been damaged in the pandemic. It was already fractured and split into sharply-divided camps before the pandemic, and some writers are now questioning whether our commons is beyond rescue. These pessimists believe that the rise of the latest iteration of the liberal class (the PMC, the “Mandarins,” the “credentialed classes” – pick your pejorative) and the now-vast socio-economic divide leave almost no occasions for the working class – 70%-75% of the population – to interact with the top 25%-30% except in relations of dependency. The well-off, well-educated, and well-paid live apart and play apart; their code of behavior (based on purported “values”) is different, and, not insignificantly, their children are often educated apart from everyone else’s, whether in elite private schools or in elite public ones protected from the poor (and their parents) by income segregation, which has now become a useful stand-in for racial segregation.

Where is the social commons? Well, it’s everywhere in the real world, but the locus where it’s critical we all return with a renewed commitment to tolerance – not performative tolerance, but actual, demonstrated tolerance – is the public commons, that raucous, contentious place where issues large and small deserve to be discussed, openly and respectfully but honestly, even perhaps in loud tones, over the months and years to come. There’s going to be plenty of contention, because we’re discussing issues critical to our survival, but it’s vital that everybody show up for the discussion. This means direct involvement in the life of the polis – politics – and breaking through the barriers now protecting and segregating decision-takers from those who have to live with their decisions.

How did we all learn to participate in this commons? Where do we all come together as children, to make friends and enemies (to embrace the former and tolerate the latter), to share in the joy of discovery, creativity, friendship, companionship – togetherness – to collaborate, succeed and sometimes, fail? Where’s that place where for 12 years American children come together to learn, including learning how to become full-fledged members of the social discourse commons?

Yep – public school, the training-ground par excellence for a well- and variously-informed, rational, tolerant citizenry, the bedrock prerequisite for the continued existence of an open and lively public discourse commons.

The success of the adult commons depends directly on its predecessor, and that’s one reason the stark income disparities that increasingly characterize the U.S. don’t bode well for the future. If our children don’t play together, learn together, and collaborate/compete together when they’re very young, the chances they’ll be able to do so as 30- or 40-year-olds are significantly diminished. When rich and poor children study and learn and play – and quarrel, yes, that’s part of life – with one another from the age of five, by the time they’re 25 they understand one another far better than if they’d never come in contact. They speak one another’s language – and as any good rhetorician knows, if you don’t speak the other’s language there’s no hope of ever finding common ground.

Public school is the most egalitarian social institution societies have hit upon to induct their members into what it means to be full and equal participants in the social commons. Public schools remain the sole effective preparation we know of for active, life-long membership in the body politic by the majority of citizens/residents. And said schools should be integrated and reflect both the racial and socio-economic demographics of their locale, with every school appropriately funded by its district, its state, and the federal government, so that the poorest student in any school is denied none of the opportunities the school offers.

Here’s how an education blogger we admire but have not had occasion to cite put it:

“Early in the 20th century, public schools had been established serving every community from coast to coast. The results from this vast American public education experiment shine like a lighthouse beacon on the path of Democracy and social happiness. A nation that entered the century as a 2nd rate power ended the century as the undisputed world leader in literacy, economy, military power, industrial might, cultural influence and more.

“Today, unbelievably, more and more forces are agitating to undo public education and even American Democracy itself.”

When we emerge from our pandemic silos, we must rejoin the social discourse commons to preserve both that commons itself, which makes liberal democracy possible, and its foundational institution: public schools.  

2021-03-18 What Went Wrong?

The West’s Failure to Vanquish Covid

“This is a national emergency, this is a war that we’re in, and instead of putting generals in positions of power, we’ve deferred to academics. Imagine in World War II, if that was how we treated it all — that we couldn’t make a single mistake” (Michael Mina, Harvard epidemiologist)

Europe – by which we refer to the 27 separate and unique nation-states which compose the European Union – is now in its third wave of the coronavirus pandemic, battling a virus which has mutated into a more transmissible and lethal form. While such a turn was not inevitable, it can happen that a virus becomes more – rather than less – lethal as time passes, and that’s what apparently has happened.

The piece we discuss in this post, “How the West Lost Covid” (David Wallace-Wells), is the best retrospective we’ve read on the almost-universal failure by the “rich West” to confront the virus successfully in the past year – and how a handful of countries bucked the trend and largely succeeded in eradicating it from the start.

There are no easy explanations, though, for why some rich countries have suffered far more than others: climate, demography, the presence or absence of nationalized health care systems, infrastructure – every example has a counter-example. Take California vs. Florida: one state imposed strict lockdowns and closures and masking requirements fairly early on, while the other partied on into late spring 2020 as if nothing was amiss. A year into the pandemic, the two states’ statistics aren’t all that different, and California was ravaged by the virus last fall, a season when Californians could still be outside and required neither heating nor air-conditioning.

A significant contributing factor to the West’s failure, however, was its failure to act proactively. Perhaps due to inexperience with large-scale epidemics which threaten to become pandemics (the SARS outbreak in 2003, Ebola), perhaps out of arrogance both scientific (“Our sophisticated, advanced medical infrastructure can handle it”) and cultural (“Our citizens would never submit to a total lockdown/shutdown”), the West dithered throughout January 2020, when the images from Wuhan were illustrating what our future would be like if countries didn’t take prompt action.

The “West” (= the EU and U.S. primarily) went first into denial, then into a sort of fatalistic capitulation mode, convinced that there was nothing to be done except to tough it out and hope for the magic bullet of one or more vaccines. This, however, is not how you confront a global pandemic: the attitude has to be one of full-out war against a common enemy; for the zillionth time, viruses have no respect for national borders, especially in an age of globalization. The only thing they succumb to is the total eradication approach: zero COVID.

Across Europe, with the exception of outliers such as Finland, Norway, and Iceland, the failure not just to eliminate but even to contain COVID is more or less general – and although the U.S.’s enormous caseload and number of deaths seems incomprehensible, in terms of cases/deaths per million the U.S. is near the European average, close to Spain, France, and Italy; the UK, Portugal, and the Czech Republic all have higher mortality rates. There are plenty of reasons to be puzzled by why the world’s wealthy West basically surrendered ab initio in a war against an unseen but deadly enemy – the war metaphor, so frequently employed with disease (“fighting cancer,” “conquering polio”), is the most appropriate one in a pandemic, and countries which viewed it as such from the outset and adopted the goal of total and complete defeat (eradication, not containment) had incomparably better outcomes: South Korea, Taiwan, New Zealand, Australia.

On the other hand, it must be said that the countries which succeeded in eliminating or nearly eliminating the virus did nothing terribly different from those that failed – Peru instituted draconian measures and has been devastated by the coronavirus. And the number of factors that might be in play (including chance [stochasticity], demography, distribution of comorbidities, geography, a country’s location, its neighbors, and its place in the global travel network, climate, the presence or absence of air conditioning, residential density, blood type, ICU capacity, proximity to bats, and so on) is high. But nearly every factor that would seem to have contributed to a higher infection and mortality rate in one country can be countered by the absence of that factor’s significance in another. Take the case of Japan, whose population is elderly and whose proximity to China seemed at best dangerous, at worst fatal: Japan has managed the pandemic successfully – perhaps not at the level of New Zealand, but its caseloads have been far lower than the West’s despite its aging population and location, an only partial lockdown, and an absence of mass testing. Britain, like New Zealand, is an island nation, yet it has been among the hardest-hit countries in the world. Experts can’t explain all these discrepancies.

There is a lot about this disease which just seems chaotic – unpredictable, surprising, and alarming to many academic medical scientists, who seek predictability even in its absence. However, national outcomes can be classified into three broad categories:

  • Europe/North America/South America: failure
  • Sub-Saharan Africa / South Asia: high caseloads, low death rates (due to demographics?)
  • East Asia / Southeast Asia / Oceania: resounding success

While there are variations in success within each category (Canada did better than the U.S.; Uruguay did better than Argentina, etc.), the biggest predictor of how well a country has succeeded against the virus is its location on the world map.

Consider death rates per million, first among the failures and then among the successes:

  • UK: 1,800; U.S.: 1,600; Sweden: 1,300; Germany: 900
  • New Zealand: 5; Australia 36
  • Taiwan: 0.42; Cambodia: 0; Vietnam: 0.36; Singapore: 5; South Korea: 32; Japan: 67 (despite an elderly population and the absence of strict lockdowns)

There are two issues here worth noting: first, while the virus originated in China, its Western origin was Northern Italy – and the mutation that infected the West came from there (the U.S. Eastern seaboard was infected with the “Italian variant”). Italy was heavily and fatally infected before it even knew the virus was present. (A question which seems worth asking is: How did the virus reach Northern Italy? – it’s one Wallace-Wells doesn’t address, but we think it’s an important one.)

Secondly, the attitude towards China when it locked down Wuhan – a “super-affluent” city of 11 million people – is characterized as “pandemic Orientalism”: “The disease was dismissed as a culturally backward outgrowth of wet markets and exotic-animal cuisine, and the shutdown was seen not as a demonstration of extreme seriousness but as a sign of the reflexive authoritarianism of the Chinese regime.” In fact this wasn’t the case – China, all rumors and biases to the contrary, is not in the habit of forcing millions of people into lockdown.

One thing that would have helped in the very first stages of the virus’ trajectory: a global travel shutdown – yes, global. It would have given the West a reprieve at the very least – a chance to go into full pandemic preparation mode (to gear up for testing, tracing, and isolation, and to stock up on PPE), and if it had occurred early enough, might well have averted what followed. And it needn’t have been an endless shutdown – several weeks might well have halted the virus’ spread to the West (and the rest of the world, for that matter).

Even after COVID arrived in Europe, many European countries (and of course the U.S.) chose a state of denial. What were leaders thinking? They hesitated to impose strict lockdowns and travel bans early on, not wanting to “dis-accommodate” their residents – or, perhaps more importantly, adversely impact “business.” When COVID was first detected in the state of Washington on the West coast, the East coast dithered – despite Governor Cuomo’s reputation as a “COVID hero,” the lockdown in NYC came too late, and there was nothing that could have saved the city by that time.  And states that eventually shut down all opened up too quickly – again, the goal seems to have been quasi-“suppression” or semi-“containment,” never “eradication.”

So firstly: EU countries and U.S. states acted too late. And their shutdowns (which were never total lockdowns), accompanied by advice on “hand washing, social distancing, and mask-wearing,” were not paired with the other triplet of measures that the most successful countries employed: “testing, tracing, and quarantining.” The U.S., for example, should have been testing around 25 million people a day last spring; it barely ever made it to 2 million (and seems to have given up on general population testing now). Without massive testing, contact tracing and quarantining became pointless. It’s been quite amazing to witness the second triplet of measures only partially embraced and eventually abandoned in the U.S. (and elsewhere).

A good number of EU countries did well initially with strict, extensive lockdowns; since cases were in decline (there was no discussion about eradication) by early summer, the EU decided – too early, as it turned out – to open its borders for vacationers in July – September. Now they’re vowing to do the same thing this year, with discussion of a “Covid passport” to allow (vaccinated) vacationers to pretty much go where they please. In the EU, where vaccine supplies and consequently, vaccination rates remain alarmingly low, this sounds more like a (dangerous and deluded) pipe dream than a plan.

So what was the West thinking (assuming they were thinking at all) a year ago? Essentially, the West decided to sacrifice a few million people on the altar of keeping their economies as open as possible in the hope of the rapid discovery, approval, production and distribution of a vaccine – the modern-medicine obsession (“magic bullet”) that characterizes the rich West.

On March 13, 2020, Michael Ryan, the WHO’s Executive Director of health emergencies – a man who’d spent his career fighting Ebola outbreaks – was asked what lessons he’d learned:

“What we’ve learned through the Ebola outbreaks is you need to react quickly. You need to go after the virus. You need to stop the chains of transmission. You need to engage with communities very deeply — community acceptance is hugely important. You need to be coordinated, you need to be coherent.”

With respect to the coronavirus:

“Be fast. Have no regrets. You must be the first mover. The virus will always get you if you don’t move quickly… If you need to be right before you move, you will never win. Perfection is the enemy of the good when it comes to emergency management. Speed trumps perfection. And the problem in society we have at the moment is everyone is afraid of making a mistake, everyone is afraid of the consequence of error. But the greatest error is not to move. The greatest error is to be paralyzed by the fear of failure.” (Emphasis added)

In sum: what did differentiate the three broad geographic categories above was speed and intensity of response. When every day counted, the West let weeks – about nine of them, 60-odd days – pass without acting decisively and in a coordinated fashion. From modern historian Adam Tooze, who is writing a book on the pandemic:

“Either you control this early on, in which case the trade-offs are relatively manageable and all sorts of conventional things make sense, or you don’t and you end up in a space which really no advanced polity’s decision-making process is very good at coping with. And so then it’s really a matter of degrees of failure across the board.”

With respect to how the U.S. in particular confronted COVID-19 in the early days, it was one PR disaster after another – and the President was by no means the only one to contribute to this. Dr. Anthony Fauci continued to insist throughout February that the virus was relatively unthreatening, like the flu, no cause for alarm, etc. – what was he imbibing, exactly? The Governor of New York has admitted that what he was most concerned about was not the virus, but panic among the populace – in other words, “Stay calm, everybody. Cuomo’s in charge.” But in fact, controlled panic is a pretty sound response to an invisible and insidious enemy – you’re under siege, and playing it cool just won’t cut it. 

Fauci, Trump, and Cuomo weren’t alone in their blasé confrontation style – the media were complicit, and in retrospect the Times and Post and other major outlets should be ashamed of writing stuff like “beware the pandemic panic” (Times), “we should be wary of an aggressive government response to coronavirus” (Post), and “Coronavirus Is Scary, but the Flu Is Deadlier, More Widespread” (USA Today). Alas, to the enormous detriment of the U.S. – and at the cost of more than half a million lives in the pandemic’s first year – “the cause of the alarm was picked up not by those in positions of social authority or with the power to enact preparatory measures but by a rogues’ gallery of outsiders and contrarians” – in other words, cranks and doomsday types with no access to power.

None of the early measures that would have stopped the virus in its tracks were imposed early enough or rigorously enough to succeed – just recall the testing debacle, the failure to set up contact tracing on a massive scale, and what was basically just a theoretical wave in the direction of quarantining and/or isolation. The U.S., acting too late and too disjointedly (if there’s one justification for a national public health policy in a country like the U.S., it’s a pandemic threatening to decimate your population and destroy your economy), eventually had to employ lengthy lockdowns which succeeded only in part – this, because they were never full lockdowns, and because they were meant to be employed in conjunction with the other measures, not independently of them. The loss of life and damage to the economy (770,000 people filed new unemployment claims in the U.S. most recently, the 52nd week in a row that claims have been higher than their highest point in the 2008-2009 financial crisis/recession) are incalculable, with whole sectors knocked out – aviation, tourism and the hospitality industry, restaurants and bars, the performing arts.

Another issue in the U.S. (and not only) is the blinkered focus on, and worship of, scientific, individual-centered, research-driven medicine. It’s a system focused on absolute knowledge and certainty, on testing hypotheses and confirming results, rather than on broad-stroke policy decisions which rely on back-of-the-envelope calculations and rapid action, as was required. Even today, Western leaders – including medical authorities and policy gurus – would rather not act than act and be wrong. But by not acting, they’ve been wrong all along.

The precepts of (Western, enlightenment-inspired, experimentally-driven) medicine have been followed throughout the past year, rather than the precepts of public health, which is often viewed by the medical establishment as a poor (literally and metaphorically, as events proved) step-daughter of medicine. But it’s not – its values and approaches to pandemics, including dealing with masses of people, are fundamentally different. The U.S. has noted, for example, that the elderly are far more vulnerable to the virus than others – but public health acknowledges and addresses the fact that one major factor making the elderly vulnerable on a mass scale is that so many live in congregate, enclosed, poorly-ventilated and inadequately maintained settings, viz. nursing homes and assisted care facilities. This acknowledgement demanded an entirely different approach. The same goes for all those living in congregate facilities: prisons (jails, state and federal prisons), institutions for the disabled, homeless shelters, and ICE facilities on the southern border. The wealthiest senior citizens without significant co-morbidities could afford to shelter in place and self-isolate – and most of the advice was aimed at this privileged group.

For some mysterious reason, the U.S. (and other Western nations) thought that human beings could be forced into isolation, despondence, and depression for a year or more by shaming. Public health experts know that shaming doesn’t work – it may succeed with some small number of people (who didn’t need to be shamed in the first place) for a year, and with a larger number for a few months, but with each successive cycle of lockdowns and re-openings, people become less subject to shaming and therefore less compliant. People will congregate in secret at homes, they’ll walk along a seaside promenade in droves and throw caution to the winds, they’ll attend protests – thereby counteracting the very measures their government is trying to enforce.

“[W]e have to think also a bit with sustainability in mind. How do we communicate with people? What is the goal? What is the plan? Because I think there’ve been times when it felt like we were a little aimless as a country — just sort of muddling through. At least we should, you know, have a goal” (Natalie Dean, biostatistician, Univ. of Florida)

What, exactly, was the goal of the West in its confrontation of the pandemic? Eradication? Certainly not – that was seen as impossible. Suppression? Perhaps, in a few cases. Containment? Maybe. But mostly it seems to this writer to have been “Let’s cross our fingers, shut our eyes tight, and hope for a vaccine.”

Now we have several vaccines – shutting one’s eyes, crossing one’s fingers and going into full denial mode seems to have worked if you ignore the loss of 538,000 U.S. lives [as of March 17, 2021] and the destruction of the U.S. economy. The EU, sclerotic to the bitter end, played it coy with advance vaccine purchases; it was slow to place orders and haggled over prices. That meant that it’s not getting the supplies it needs, and thus, the vaccination rollout is embarrassingly slow. “Vacation Europe” wants to re-open – Greece has announced it will open to foreign tourists on May 14 to take advantage of its five-month season, which it couldn’t do in 2020. Along with other summer destination countries (Spain, Italy), Greece is pressing for a COVID passport system to allow vaccinated travelers into the country.

But with fewer than 1 million people a month being vaccinated (around 30,000 daily), it’s very doubtful that Greece will be anywhere near vaccine-induced herd immunity by May – just in terms of sheer numbers, it would take 8 months to vaccinate 8 million people (out of a population of just under 11 million), putting that goal near the end of August – and that assumes the vaccine supply holds steady.
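The arithmetic behind that eight-month figure is simple enough; here’s a rough sketch using only the pace and population figures cited above:

    # Rough timeline for the Greek rollout, using only the pace and population figures above
    daily_vaccinations = 30_000                      # cited current pace
    monthly_vaccinations = daily_vaccinations * 30   # ~900,000 a month, i.e. "fewer than 1 million"
    people_to_vaccinate = 8_000_000                  # out of a population of just under 11 million

    months_needed = people_to_vaccinate / monthly_vaccinations
    print(f"~{months_needed:.0f} months at the current pace")   # ~9 months, i.e. late summer at best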

In the meantime, variants are multiplying daily – and it stands to reason that one or more – perhaps many – will evade the vaccines developed to date. What then?