Why Holocaust Fiction?

 

As a biographer, I envy novelists, who can craft a captivating tale without needing to carefully document sources of information. Though they face definite challenges, fiction writers can, with whatever degree of knowledge and understanding they possess, invent composite characters and telling dialogue.

 

As a reader of Holocaust literature, I prefer non-fiction. Give me the facts straight up, please. I do not want to be left wondering whether some person existed or some action occurred. And though I love good stories, I see little need to manufacture them when the truth is powerful and strange and terrible enough. 

 

As a daughter of survivors, I think my grandparents—two of whom were in their early forties and two in their early sixties when they were killed in Auschwitz-Birkenau—would have wanted the full truth of what happened to them and their children to be wailed unto the heavens—lest they all vanish without a trace of having existed, as evidence of the human capacity for evil. 

I understand, however, that a case can be made for Holocaust fiction. In portraying virtuous, ignoble, or complex characters in extreme situations, novelists shed light on human behavior more generally. Vivid scenes facilitate our entry into foreign worlds. And in distilling events, the fiction writer can make the complicated comprehensible. Finally, readers who might not otherwise have known that Mengele experimented on twins, or that diplomat-rescuers saved some prominent Jews, or that tattooists engraved numbers on the arms of select Auschwitz inmates—or about any other of the innumerable dimensions of the maelstrom—might learn something. They may be spurred to further exploration.

 

When fiction is “based on true events” we receive more than the author’s imagination. Of course, the degree to which such stories can be relied upon for historical accuracy varies. How deeply did the author research the subject? How can those of us who are not scholars evaluate whether we are getting a true picture? 

 

Some fiction writers use the Holocaust in the service of a good yarn—as if throwing perpetrators or victims or survivors into their narrative adds pathos or heft. Sometimes the most fantastical accounts (for example, the movies Life Is Beautiful and Jojo Rabbit) dish up the absurd enmeshed in the plausible, like an SS officer barking orders at inmates in a language they did not understand. This happened. Nazi leaders training Hitlerjugend to defend the fatherland—this happened. While such works may be accused of trivializing the most serious of subjects, they make no claim to being other than farcical.

 

But pretenders to truth (such as Binjamin Wilkomirski’s Fragments: Memories of a Wartime Childhood) indisputably cross a line. Passing off a false account as true provides fodder for Holocaust deniers and affronts us all. 

 

Sometimes true accounts are mistaken for fiction. Each semester that I taught a college course on the Holocaust, students would hand in papers that read, “In Elie Wiesel’s novel Night...” The Nobel laureate wrote several novels, but Night is a true account of teenaged Wiesel’s experience of the war. I wanted my students to know that. 

 

Survivor accounts are among our most trustworthy sources. Though it was nearly impossible for people in extreme situations to remember precise dates and times (not even decently fed soldiers could recall such details), those who were there were (and are) experts on what they saw and felt on their own skin.

 

Perhaps, then, the greatest good that ever came out of a work of Holocaust fiction was the Institute for Visual History and Education of the University of Southern California Shoah Foundation. In March of 1994, after accepting an Academy Award for Best Picture for Schindler’s List (based on Thomas Keneally’s historical novel), Steven Spielberg launched an ambitious, seemingly impractical project: knowing that survivors’ stories would soon be lost to history, he would capture on video as many of their testimonies as possible. Among the roughly 350,000 survivors then alive, most had been young adults during the war; they were now seniors; it would be a race against time.

 

Moving quickly and efficiently, Spielberg set about achieving his goal with the aid of historians, scholars, and production logistics experts, along with project directors, coordinators, and, eventually, 2,500 interviewers in 33 cities in 24 countries. Wanting viewers to “see the faces, to hear the voices,” he insisted that survivors be interviewed in their own homes wherever possible. They were to tell their complete life stories, but spend most of the interview recounting their Holocaust experiences. Spielberg’s team ultimately amassed 52,000 videotaped testimonies.

 

What possessed these (mostly) ordinary citizens, including those who were shy or humble or who had never before spoken about their experiences, to dress neatly, invite interviewers and videographers into their living rooms, and open up about the darkest period of their lives? For one thing, most knew about Steven Spielberg’s Schindler’s List. For another, they learned about the project through multiple media sources; flyers and ads with the headline “So Generations Never Forget What So Few Lived to Tell” awakened their sense of moral responsibility. Ms. Miller, a child who had hidden with her family in a crowded farmhouse in the hills of Italy, said, “We come forward because we are aware of our own mortality and how important it is to share what happened.”

 

Fortuitously, during the six-year period (1994-2000) in which the interviewing took place, there was an explosion in the field of information technology, enabling Spielberg’s team to create a vast, searchable cyber-archive. The carefully catalogued and scientifically preserved videos were distributed to various organizations (including the U.S. Holocaust Memorial Museum and Yad Vashem). 

 

It would take twelve years (or more than 105,000 hours) to watch all of the interviews. I have only had time to view some, obtained online through the USC Shoah Foundation. Once I began listening to certain survivors, I could not tear myself away. Their stories are inherently harrowing, gripping, and educational. And authentic. 

 

For his noble work, all of humanity owes a debt of gratitude to Steven Spielberg (whose foundation has subsequently worked with the Kigali Genocide Memorial to capture the testimonies of survivors of the Rwandan genocide). Twenty-five years after the release of Schindler’s List, in December 2018, the filmmaker reflected on this “most important experience” of his career. Survivors who could bear to watch the film told him that it could not compare to what was. But they were glad he told the story—it should not be forgotten.

 

Owing to the singular circumstances, perhaps no author of Holocaust fiction can aspire to again produce a work as far-reaching as Schindler’s List. But writers who ignore or take liberties with the truth ought to reflect on their purposes. If in some measure they aim to edify, counter hate, and inspire empathy, they might be mindful of those who did not live to tell their stories—who, when they could, engraved their names and places of birth in the walls of barracks, or implored others to remember them. Had they had a choice, I believe Hitler’s victims would have wanted nothing about the mortal crimes against them falsified. 

Plots Against America?: Jim Crow was Homegrown Fascism

 

Note: this essay quotes a sign attached to the body of a man lynched in 1919, which uses a racial slur.

HBO’s recent release of a prestige adaptation of Philip Roth’s 2004 novel The Plot Against America makes it worthwhile to examine whether fascism is really so alien to the United States as many wish to believe. In Roth’s novel, a Jewish extended family in Newark experiences fascism’s arrival in America, with the 1940 election of Charles Lindbergh to the presidency, as an intrusion of European extremism against an American Way—and a short-lived one, at that. When many white thinkers ponder Roth’s Plot or the sardonic title of Sinclair Lewis’s 1935 novel It Can’t Happen Here, they often miss the many ways in which it has already happened here. After all, the public prominence of the Ku Klux Klan and the massive riots that rock the climax of Roth’s novel were already regular features of American life, depending upon where you lived. For example, in early May 1927, just a few weeks before the historical Lindbergh took off from Roosevelt Field in the Spirit of St. Louis, some 5,000 whites rioted in the black business district of Little Rock, Arkansas, where they burned the body of a man named John Carter. Local police officers, many rumored to be Klan members, did nothing to stop the violence—and may have even taken part.

Black observers of current trends, however, tend to be a little more astute. For example, in his February 21, 2020, New York Times column, Jamelle Bouie argues that the expansive authoritarianism of Donald Trump has its analogue in the Jim Crow South. However, we should not treat the Venn diagram of fascism and Jim Crow as a single circle. The reality is a little more complicated.

The word “fascism” has also long been employed as a political Rorschach blot. As early as 1946, just one year after the end of World War II, George Orwell was complaining of this fact in his essay “Politics and the English Language,” writing: “The word Fascism has now no meaning except in so far as it signifies ‘something not desirable.’” But let us take this as a working definition:

Fascism is the attempt, birthed in reactionary politics, to resolve the contradictions of democracy for purposes of preserving elite power against the demands of the masses.

This will need some explanation. Although we today associate democracy with high ideals, its origins are a bit grubbier. For example, opposition to the Angevin kings of England (which led to the Magna Carta) included, according to historian Robert Bartlett, such charges as “heavy taxation, elevation of low-born officials, slow and venal justice, disregard for the property rights and dignities of the aristocracy.” Much of the Magna Carta focuses upon preserving the property rights and prestige of the aristocracy while limiting the king’s ability to levy certain taxes without “the common counsel of the kingdom.” The eventual emergence of a mercantile class in the late Middle Ages and early Renaissance sparked another expansion of “democracy,” as the bourgeoisie sought similar privileges in order to protect their own wealth. With industrialization, and the eventual concentration of the lower classes into cities, the emerging proletariat began to press for access to the franchise itself on both sides of the Atlantic, as exemplified in the UK by the Reform Act of 1832 and in the US by Jacksonian democracy and the removal of property qualifications for the vote. 

At each step in the expansion of democracy, those who already possessed the franchise feared the loss of their power and wealth by allowing any “lower” classes the privilege of voting. The United States has been much more a racial society than a class society along the lines of the UK, and so here it was easier to get elite buy-in to the idea of universal male suffrage, so long as those males were exclusively white. The abolition of slavery and the expansion of suffrage to those former slaves and their eventual descendants provoked the rage of the South’s idle landlords, who initiated a campaign of violence in the immediate aftermath of the Civil War in order to return to the status quo ante of black servitude and submissiveness. What historians call the first Ku Klux Klan was an elite project to scuttle the political empowerment of African Americans.

The “contradictions of democracy” can be seen in this struggle between those who believe that republican government should preserve elite power and the democratic desire that all citizens be given a voice in how they are governed. Fascism is an attempt to short-circuit this tension through the advancement of a purely corporate figure who is cast as the savior of “the people,” not by empowering them but rather by emphasizing his own unique attributes to act on their behalf. As the Israeli scholar Ishay Landa points out in The Apprentice’s Sorcerer: Liberal Tradition and Fascism, while fascism regularly employs the rhetoric of collectivism, it centralizes such collective and democratic yearnings upon the individual strongman leader, so that he becomes democracy personified, the one true spokesman for “the people,” who no longer need engage in self-governance. 

But there is more to it. As historian Aristotle Kallis observes in Genocide and Fascism: The Eliminationist Drive in Fascist Europe, fascist ideology was born with the specific aim of seeking redemption from recent “humiliations” by latching onto the glories of the past to drive a new utopian future. This “redemption” manifested itself externally, through expansionist policies of conquest, and internally, through a “cleansing” of the population aimed at eliminating those figures responsible for recent humiliations: socialists, communists, Jews and other minority groups. The drive to “cleanse” the state, Kallis writes, “helped shape a redemptive licence to hate directed at particular ‘others’ and render the prospect of their elimination more desirable, more intelligible, and less morally troubling.”

This has been just a brief overview, but it allows us to draw some parallels between fascism and Jim Crow. Both fascism and Jim Crow were means of limiting democratic participation and thus the political and economic emancipation of certain “others.” And both fostered a “license to hate” that resulted in massive violence against the enemies of the elite. But there are more parallels. As Landa writes in Fascism and the Masses: The Revolt against the Last Humans, 1848–1945, “Rhetoric of honoring labor aside, the Nazis strove to achieve the exact opposite: keeping wages low and increasing working hours, which was precisely what German business was insisting should be done throughout the years of the Weimar Republic.” Much the same held true in the Jim Crow South, where particular ire was reserved for those who resisted the southern tradition of racialized economic exploitation. In June 1919, after Clyde Ellison refused to work for David Bennett, a planter in Lincoln County, Arkansas, for a mere 85 cents a day, he was hanged from a bridge, with a sign attached to his body reading, “This is how we treat lazy niggers.” Later that year, and not too far away, white mobs and soldiers would slaughter untold numbers of African Americans, in what has become known as the Elaine Massacre, for daring to organize a farmers’ union.

Although both fascism and Jim Crow constituted violent means of securing elite power, there are important distinctions to note. While southern states had their share of demagogues, Jim Crow was a multi-generational project of the Democratic Party, one not centered upon any particular individual. Too, while both fostered a “license to hate” against racial and ideological others, the Jim Crow project made a distinction between “good negroes” who “knew their place” and “bad negroes” who sought the privileges reserved to whites. The latter might have to be killed, and the region “cleansed” of those “outsider” whites who spread dreams of equality, but black people who were dutifully submissive could be tolerated in so far as their unpaid or underpaid labor created the region’s wealth.

Back in 2004, The Plot Against America was widely regarded as a commentary about the administration of George W. Bush. No doubt, the 2020 television adaptation will be viewed in the light of Donald Trump, whose rhetoric and policies have been compared by critics to both fascism and Jim Crow. Perhaps the television series will exhibit a more sophisticated understanding of fascism and America than did the book. Perhaps not. Either way, the series should provide a good opportunity for historians to educate the public on who, exactly, lay behind the centuries-old plot against all Americans.

The Three Daring Women Who Traversed the Himalayas

 

 

Antonia Deacock, Anne Davies and Eve Sims were three rather extraordinary women who, in their mid-twenties and thirties, set off overland from England for the Himalayas in 1958. Their aim was to climb one of the range’s unexplored high peaks. They made the 16,000-mile drive to India and back, adding a 300-mile trek on foot to Zanskar, a remote province of Ladakh (part of Kashmir), and became the first European women to set foot there.

 

They called their adventure the Women’s Overland Himalayan Expedition, and were inspired and encouraged by their husbands, explorers and climbers themselves. Not wanting to be left behind "chained to the kitchen sink" when their husbands went on a trek, the women spent six months planning their own adventure. "It was just something we wanted to do," said Anne matter-of-factly. "And it was a good idea."

 

They had no vehicle, little money and no equipment, and didn’t even know each other very well at the start, but by working together they mustered enough support to fund their expedition, gaining sponsorship from, among other companies, Brooke Bond Tea (they mistakenly ordered enough tea “to keep a family going for 150 years,” according to Eve), Illustrated magazine, John Player & Sons, and even the cosmetics company Max Factor. They persuaded Land Rover to sell them a demonstration model of a modified long-wheelbase all-terrain vehicle at a significant discount, and the British Ministry of Agriculture and Fisheries donated steak, vegetables, and berries preserved with the new technology of freeze-drying. A publicity campaign before they left captured the public’s attention, and headlines such as “Four Fed-up Wives” (one member had to pull out at the last minute, due to an unexpected pregnancy) helped raise the required funds.

 

They weren’t completely naïve, however. Anne was fluent in Urdu and Hindi and had previously trekked in Kashmir with her husband, their baby strapped to the back of a mule. Eve had spent two years motorbiking around Australia and New Zealand, and before that had learned to climb in Wales. Antonia was an experienced rock climber. They weren’t the kind of women who would let the fact that two of them couldn’t even drive when they started planning their trip deter them.

 

Eve and Antonia had not yet had children, but Anne left behind her three sons, aged 15, 14 and five. "I didn't feel guilty," she said. "Because Lester had gone off on expeditions, and this was my turn."

 

Their five-month journey saw them battle illness, delays, inhospitable terrain, the effects of altitude, and terrible roads. They occasionally had to fend off unwelcome advances and negotiate with recalcitrant porters. Rest days were spent scrubbing their laundry on rocks in a nearby river, catching up on correspondence–communication was slow and difficult, and they could only occasionally get word to their husbands and families that they were safe–and checking their supplies. The whole undertaking was a considerable feat of organisation: after the women estimated what they would need to last the entire trip, several crates were shipped to Bombay and had to be retrieved from the docks after bureaucratic delays.

 

Due to the heat in Iran, they were often forced to drive at night, and were faced with constant enquiries as to where the ‘sahibs’ were and incredulity that the women were travelling on their own. They camped almost everywhere, and were welcomed by Land Rover’s agents in the European cities they travelled through, who praised the women for their maintenance of the vehicle, though many expressed surprise that they successfully completed the trip.

 

Following an audience with the Indian Prime Minister Nehru they were granted the rare privilege of travelling beyond the “inner line,” a boundary between India and Tibet that had been drawn up in the 1800s and beyond which no British subject might rely on government protection or rescue. No non-Indian had been granted permission to cross the inner line since before the Second World War, and it was one of the last regions on the planet untraveled by Europeans. They were also thought to be the first European women to cross Afghanistan unescorted.

 

Fording fast-flowing meltwater streams, sometimes up to their waists in water, they trekked through snow and ice, climbing an 18,700-foot peak and naming it Biri Giri (“Wives Peak”). As they travelled, they passed through villages untouched by European contact, meeting locals who had never seen things such as zippers or nylon climbing ropes. One elderly woman was terrified by the sight of her own face in one of their mirrors. “How privileged we were to witness and partake in societies that were virtually strangers to the modern world that we know,” said Antonia afterwards.

 

The women were bound by a common desire to prove themselves. "I’d been my father’s daughter, my husband’s wife," said Eve. "But this time I was somebody on my own." Despite living in close proximity and enduring many hardships, surviving dust and discomfort, the freezing cold and the intense heat, they claim never to have argued, preferring to thoroughly discuss issues and abide by a majority rule on decisions.

 

After their expedition, they carried on the spirit of adventure in their lives. Antonia wrote a book about their exploits, then eventually moved to Australia, where she and her husband established an Outward Bound school and the world’s first dedicated adventure travel company, also forging close links with Nepal. Eve had three children and went on to run an Outward Bound centre with her husband, and Anne helped her husband to run an Outward Bound school in the Lake District.

 

The 1950s were a time when relations between Britain and other Western countries and the newly independent nations of South Asia were in their infancy, and the women’s fearless endeavour is all the more remarkable for it. Although trekking in remote lands is now far more common, it is unlikely that they would be able to make such a journey today, particularly through Afghanistan and Iran.

 

A film about their expedition, by Pulse Films and Britain’s Film4, is in development. Antonia Deacock’s book, No Purdah in Padam, is now out of print, but available from antiquarian booksellers.

As We Zoom into Online Learning…

 

 

“Zoom” – this playful kid-word, which once referred to fast cars, now signals a fast-approaching sea change in higher education, unfolding before our eyes thanks to COVID-19 and the need to move live university classes online. Zoom, for those who do not know, is the video meeting platform by which we faculty are all migrating our classes to an online format.

 

We all might have larger things to worry about in the next few weeks and months, like our loved ones, our colleagues, and our students becoming terribly ill. If this happens, then the nature of online education will hardly be our biggest problem. 

 

But for the moment at least, as an academic who has been teaching at state universities for nearly thirty years, I am torn concerning the issue at hand. On the one hand, the students who signed up for my History of the Holocaust class this semester at the University of Florida did so because they were interested in the topic, some intensely so. As I try to move my lectures and discussion sessions from a classroom to a Zoom format, I want to provide something as close to the classroom experience as I can. On the other hand, I suspect, as do many of my colleagues, that university administrators and state legislators throughout the US will study this crash experiment in online education very closely one day. Are we academics showing them how they might replace us in the name of heightened efficiency?

 

We can agree that some of the efficiencies are indeed desirable. Those of us who remember putting books on reserve in the library for twenty-five students at a time will attest to this. A certain number of online classes, moreover, have existed for the last couple of decades, helping place-bound students and those students, younger and older, who work full time. But what happens when everything goes online, and all at once? If we discover that all classes can be delivered online from a remote location, then what is the point of having lecture halls, classrooms, or for that matter a diverse faculty of broad expertise and talents? The arguments that have been percolating in universities for the past decade will intensify overnight. Yes, there are faculty who have put in an immense amount of time in order to develop fine online experiences. But I have also seen half-baked efforts over the years that are rather disastrous, even within the oft-cited rationalizing context of the apocryphal 1970s professor (I never actually had one of these guys) who mumbled through his yellowing lecture notes.

 

My colleagues are proceeding cautiously. One colleague warned me not to record my lectures into the Zoom cloud, but to provide them live through Zoom. Everyone, I hear, is giving synchronous (“live” in Zoomspeak) as opposed to asynchronous (“recorded” in Zoomspeak) lectures. Anyone who has seen the infatuation with online learning in higher education administration over the past twenty years knows that this is hardly a paranoid reaction. The university would own the recorded content, as the work is done for the university in return for compensation. I actually recorded the first few lectures for my class. I needed to crawl with this technology before I could walk, and the students, I thought, would need time to adjust to the new reality of a full course load online as they simultaneously move from Gainesville back to their homes in Florida and elsewhere in the US.  Nonetheless, like some of my colleagues, I am uneasy even with synchronous content. If I have learned anything from other people’s travails with Facebook and Twitter over the years, it is that nothing put online, even briefly, is truly protected or truly deleted. 

 

And if I know what faculty will say once the experiment is over, I am less sure about the students. I hope that they give a contextual yet roundly negative assessment -- something like, “I understand the situation, but I can’t wait to get back to live classes.” But today’s students are online-surveyed to death, starting with online evaluations of faculty each semester that most do not complete. Worse, twenty-somethings believe that they can effectively multitask–what others of us would call diffusing one’s focus. The professor telling them at the start of class to turn off their cell phones and laptops is now the professor who depends on these devices to lecture or hold discussion sections from a remote location. Zoom actually has a feature that discloses a student’s level of attention—have they left their screen? Are they messaging? Are they watching Netflix? But I really don’t want to check that feature, and the fact that Zoom has it at all reveals the nature of the problem. Live classes promote a level of decorum that everyone in the classroom understands and from which everyone in the room benefits. But taking a class alone in one’s kitchen or bedroom? There is a reason that different rooms have different names, and in every language.

 

Finally, there is the quality of our own work, our pedagogical preparation. Like most of my colleagues in certain disciplines, I believe that each lecture and each discussion is the result of having worked at our craft over a period of years. What material shall we present to make a particular point about, say, Jewish resistance in the Warsaw ghetto, or about Reconstruction after the Civil War, or about Robespierre’s dictatorship? How shall we present it? What wording will we use? What visuals will we use? When will we leave the lectern for a stroll up the aisle? When will we pause and urge the students to think rather than just take notes? What questions will we pose to them when they discuss? How can we encourage them to interact and even debate with one another face-to-face-to-face, complete with expressions and gestures? How will we get them to understand that there are no black and white answers but only arguments, some thoughtful, some needing intensive development?

 

These questions and many others form the very stuff that makes live higher education on a university campus an experience for faculty and students that cannot be replicated online, at least through the Zoom technology with which I have become familiar. Even if all of the technology “works,” how can our broader efforts, having been squeezed through the portal between a faculty computer and those of the students, avoid coming out distorted on either side in ways that we cannot yet fully recognize? Zoom is fine technology—for conference calls. It enables business executives to talk to one another over long distances while presenting flowcharts and such. Academics can even have faculty meetings via Zoom, so that we ourselves can “multitask” to our heart’s content while discussing the minutiae of departmental by-laws.

 

But the real interaction that results in true learning? I am not sure at all. Zoom at its bandwidth-driven heart allows us to see one another and hear one another only to the point where we can talk to one another’s images (with cheesy optional backgrounds of the tropics or of outer space no less) and not speak to one another as human beings. We can see, but our vision is circumscribed. We can listen, but our hearing is muffled. We can connect, but our interaction is impeded.

 

For this, we all need to be, once again, in the same room. 

 

Let’s hope it is soon.

Use What You Know: Online Teaching Tools

If you teach history at any sort of educational institution, whether K–12 or higher ed, chances are your institution has “pivoted” to remote (online) instruction, if not closed campus entirely, in response to the rapid spread of COVID-19 and the imperative for social distancing. Over the past week, a wave of these pivots and closures has left many of us scrambling for alternative means of engaging our students in an online and likely asynchronous setting. This is not the optimal way to teach and learn history, given that good online courses take more time and planning to develop than the handful of days most of us have been given.

So what do we do? How do we keep as many of the essential elements of our course as possible, even if those look different online? We want to help our students continue to be engaged with history; that is, to still feel present in the course and actively work with the course material. We want our students to do things like discuss, analyze, work with primary sources, and communicate their interpretations to others.

Many historians have already been doing this kind of teaching online, so there may be no need for you to rediscover fire. Check out the #twitterstorians hashtag on Twitter to read and participate in ongoing conversations and resource sharing. Waitman Beorn, a senior lecturer in history at Northumbria University, has generously created a spreadsheet of teaching tools, digital history sites you can direct students to, digital tools for historical scholarship, digital humanities projects, and digital archives (use the tabs at the bottom of the spreadsheet to navigate between categories). H-Net has put together a repository for resources on teaching history online, which should soon be a thriving community. One of the most exciting cross-disciplinary products has been the “Keep Teaching” online community set up by the staff of Kansas State University’s Global Campus, an excellent virtual gathering spot where faculty, staff, and designers are sharing tips, tricks, techniques, and—most importantly—solidarity as we navigate this rapidly changing landscape together.

As historians and teachers, we pride ourselves on being able to engage students with the complexity and wonders of the past. Though our current circumstances are far different than we anticipated, we have the research skills and critical faculties to help solve this new set of problems. Being analytical and discerning about the tools we use is a necessary part of that process, but so, too, is our discipline’s remarkable willingness to collaborate and share expertise. If you’re one of the thousands of us “moving online,” good luck, and see you on the internet!

The Cultural Constants of Contagion

 

Sometime during the year 541, a few rats found their way into Byzantium. Soon more would arrive in the city. Whether they came from ships unloading cargo in Constantinople’s bay, or overland in carts bringing goods from points further east, the rodents carried fleas harboring the Yersinia pestis bacterium. Byzantium’s citizens, ruled over by the increasingly erratic Justinian, had heard accounts of the plague as it struck down other nations first. The historian Procopius, who witnessed the ensuing pestilence, writes in his History of the Wars that the resultant disease “seemed to move by fixed arrangement, and to tarry for a specified time in each country, casting its blight… spreading in either direction out to the ends of this world, as if fearing some corner of the earth might escape it.”

Known as Justinian’s Plague, the pandemic marked the first instance of the bubonic plague in Europe, almost a millennium before its more famous manifestation as the medieval Black Death. William Rosen writes in Justinian’s Flea that while contagion was a feature of human life, before the sixth century “none of them [had] ever swept across what amounted to the entire known world, ending tens of millions of lives, and stopping tens of millions more from ever being born.” By the pandemic’s end, epidemiologists estimate that perhaps a quarter of the known world’s population had perished.

There is something deep within our collective unconscious that dimly apprehends these past traumas. The accounts of a historian like Procopius sound familiar to us; echoes of such horror persist in the skeleton masks of Halloween, in the morbid aesthetic of the gothic, and in the horror movies of pandemic, from the clinical Contagion and Outbreak to the fantastical The Walking Dead and 28 Days Later. We so fearfully thrill to narratives of apocalypse that, when faced with our own pandemic, there is something almost uncanny about it. For the past several months a few Western observers nervously read reports about the coronavirus outbreak in Wuhan, China. More people paid attention as cases emerged in Italy. Shades of Procopius, who reflected that “at first the deaths were a little more than the normal, then the mortality rose still higher, and afterwards the tale of dead reached five thousand each day, and again it even came to ten thousand and still more than that.” Today the coronavirus affects almost every nation in the European Union, a majority of U.S. states, and every continent save Antarctica. Physicians have yet to fully ascertain the disease’s mortality rate, but thousands have already died, and contrary to the obstinate denials from the president of the United States, Covid-19 clearly seems to be more than “just the flu.”

Those of us who work in the humanities have, for more than a generation, been loath to compare radically different time periods and cultures–and for good reason. There can be a flattening to human experience when we read the past as simply a mirror of our own lives. Yet plague is, in some ways, a type of cultural absolute zero, a shared experience of extremity that does seem to offer certain perennial themes that approach universality. For the first time in more than a century, since the Spanish Influenza outbreak of 1918, we collectively and globally face a pandemic that threatens to radically alter the lives of virtually everybody on the planet. It behooves us to converse with the dead. Much is alien about our forebears, but when we read accounts of pestilence raging through ancient cities, it’s hard not to hear echoes of our own increasingly frenzied push notifications. 

There is a collection of morbid motifs which recur with epidemics–the initial disbelief, the governmental incompetence, hoarding of goods, desperate attempts at protection (including the embrace of superstition), social stigma and bigotry, social distancing and quarantine, and of course panic. In the pilfered N95 surgical masks there are echoes of the avian masks worn by Renaissance plague doctors; in the rosemary, sage, and thyme clutched by medieval peasants there are precursors of the Purell which we’re all desperately using (albeit the latter is certainly more effective). As different as those who lived before us may be, the null void which is the pandemic does display a certain universality in how people react to their circumstances, born necessarily from biology itself. For example, gallows humor is an inextricable and required aspect of human endurance, though it can also mark a dangerous denialism. Catharine Arnold writes in Pandemic 1918 that in its earliest days, the Spanish flu was either denied or joked about: “At the [Cape Town, South Africa] Opera House, a cough in the audience provided the actor on stage with an excellent opportunity to ad-lib. ‘Ha, the Spanish flu, I presume?’ The remark brought the house down.” However, “within days, that joke wasn’t funny anymore.”

Few emotions are as all-encompassing, understandable, and universal as panic. In the queasy feeling which a pandemic inculcates throughout a society there is a unity of experience across disparate time periods. Consider the language of The New York Evening Post in July of 1832 as it traced the inevitable arrival of a cholera epidemic into the city, where inhabitants of the metropolis fled “from the city, as we may suppose the inhabitants of Pompeii or Reggio fled from those devoted places.” Hoarding is naturally another predictable behavior, if often a dangerous and self-defeating one, as anyone who has tried to buy hand sanitizer or toilet paper in the last week can attest, or as seen in the squirreling away of face masks currently needed by public health workers. Procopius again notes that in Byzantium, “it seemed a difficult and very notable thing to have sufficiency of bread or of anything else; so that with some of the sick it appeared that the end of life came about sooner than it should have come by reason of the lack of the necessities of life.”

This panic can metastasize into rank irrationalism, as pandemic often leads to the embrace of quack cures, or more disturbingly the promulgation of noxious hatreds. Philip Ziegler writes in The Black Death that during the pandemic of 1348 in France, a “fashionable course of study” regarding the plague included “an ointment called ‘Black de Razes’” that was “on sale at apothecaries as a cure recommended for virtually any ailment.” Its effectiveness might be compared to that of the colloidal silver solutions hawked by televangelist Jim Bakker as a prophylactic against coronavirus. During the Black Death, fear was a convenient catalyst for old hatreds, as the plague justified antisemitic pogroms. By dint of both their isolation and the religious strictures that encouraged rigorous hygiene, medieval Jewish communities were sometimes spared the worst of the plague. This was interpreted by Christians as complicity in the pandemic, and so the Jews were made to suffer for their previous good fortune. John Kelly writes in The Great Mortality that antisemitism “bubbled up from the medieval Teutonic psyche,” where in Germany and other nations the religious cult of the Flagellants “believed the curse of the mortality could be lifted through self-abuse of the flesh and slaying Jews.”

Xenophobia similarly marked more modern pandemics. In the first four years of the twentieth century, San Francisco saw America’s only prolonged outbreak of bubonic plague. Brought by ship into the city, where today a coronavirus-contaminated cruise ship lingers off the bay, the plague broke out in Chinatown, leading to hundreds of deaths. Marilyn Chase explains in The Barbary Plague that “public health efforts… were handicapped by limited scientific knowledge and bedeviled by the twin demons of denial and discrimination.” Rather than offering treatment, Mayor James D. Phelan spread fear about the Chinese residents of the city, anticipating today’s talk of a “foreign virus.” He called them “a constant menace to the public health,” while ironically exacerbating the outbreak by dint of his policy. Meanwhile, California governor Henry Gage simply denied the existence of the plague at all, fearing more that the economy would be harmed than that his fellow citizens were dying.

If there is a commonality to the recurring themes that mark pandemics throughout history, it’s because of the physical reality of the ailment. Covid-19 can’t be explained away, can’t be ignored, can’t be obscured, can’t be denied. It can’t be tweeted out of existence. It turns out that internet memes aren’t the same thing as viruses. We may yet discover that in a “post-reality” world, reality has a way of coming to collect its debts. 

An important reminder, however: for all of the uncertainty, panic, and horror which pandemics spread, there is also the acknowledgment, at least among some, of our shared affliction, our collective ailment, our common humanity. Within a plague there are fears both rational and irrational, there are prejudices and panics, there is selfishness, cruelty, and hatred. But there is also kindness, and the opportunity for kindness. The story of pandemics contains flagellants, but also selfless physicians and nurses; it includes the shunning of whole groups of people but the relief of treatment as well. We’ve never faced something like the coronavirus in the contemporary Western world, but we’d do well to remember something strangely hopeful that Daniel Defoe observed in his quasi-fictional A Journal of the Plague Year, set in 1665, when the last of the major bubonic outbreaks killed a fifth of London’s population: “a close conversing with Death, or the Diseases that threaten Death, would scum off the Gall from our Tempers, remove the Animosities among us, and bring us to see with differing Eyes, than those which we look’d on Things with before.”

 

 

Bancroft Prize Goes to Books on Emancipation and Urban Renewal

A sweeping reconsideration of the complexities of Emancipation and a biography of the nearly forgotten mid-20th-century urban planner who reshaped Boston and other cities have won this year’s Bancroft Prize, which is considered one of the most prestigious honors in the field of American history.

Lizabeth Cohen’s “Saving America’s Cities: Ed Logue and the Struggle to Renew Urban America in the Suburban Age,” published by Farrar, Straus and Giroux, was cited for offering “a nuanced view of federally-funded urban redevelopment and of one of its major practitioners that goes beyond the simplicity of good and bad, heroes and villains.”

Reviewing the book last year in The New York Times Book Review, Alan Ehrenhalt praised Dr. Cohen, a professor of American Studies at Harvard, for her “incisive treatment of the entire urban-planning world in America in the last half of the 20th century,” and for her fair-mindedness in addressing what has become, he writes, “a highly polarized subject.”

The second winner, Joseph P. Reidy’s “Illusions of Emancipation: The Pursuit of Freedom and Equality in the Twilight of Slavery,” published by the University of North Carolina Press, was cited by the prize committee for the way it builds on and departs from the huge existing literature on the subject to “deepen our understanding of the vagaries of Emancipation in the United States.”

Updated 3/27: What Historians Are Saying About Covid-19 and Trump’s Response [Twitter roundup]

Updated 3/27: Historians Discuss the Media’s Coverage of COVID-19 [Twitter roundup]
Roundup Top Ten for March 20, 2020

The Shortages May Be Worse Than the Disease

by Elise A. Mitchell

Societies further their own destruction whenever they fail to provide anyone health care, housing, or dispensation from work because of their employment, socioeconomic, or immigration status.

 

We Need Social Solidarity, Not Just Social Distancing

by Eric Klinenberg

To combat the coronavirus, Americans need to do more than secure their own safety.

 

 

Hurricane Katrina Provides Lessons about Closing Campuses

by Andre M. Perry

Students in New Orleans needed resources to return to normalcy. But when racial wealth gaps are the norm, a stumble can become a fall.

 

 

Work Requirements are Catastrophic in a Pandemic

by Elisa Minoff

Instead, we should be implementing policies that support people’s work in the wage labor force and make it possible for working families to make ends meet.

 

 

Counting Everyone—Citizens and Non-Citizens—In the 2020 Census is Crucial

by Brendan A. Shanahan

Even without a citizenship question, the Trump administration wants to shape how states reapportion their legislatures.

 

 

Why Sanders Isn’t Winning Over Black Voters

by Keeanga-Yamahtta Taylor

For millions, even when government “works” it is not working for them.

 

 

An Epicenter of the Pandemic Will Be Jails and Prisons, if Inaction Continues

by Amanda Klonsky

How will we prevent incarcerated people and those who work in these institutions from becoming ill and spreading the virus?

 

 

Democracy: How 1860 Connects to 2020

by Daniel W. Crofts

In the years before the Civil War, just as today, minority rule was the norm. White Southerners dominated the Democratic Party, and the Democratic Party dominated the federal government.

 

 

College Worth Fighting For

by Ryan Boyd

Professors are in a class struggle, a real fight that cannot be won with critique alone.

 

We Can’t Forget Women as We Tell The Story of COVID-19

by Jennifer Brier

Women who have been medical (and political) subjects of HIV/AIDS also have much to teach us during our current pandemic.

An Interview with Mary V. Thompson on the Lives of the Enslaved Residents of Mount Vernon

 

Mount Vernon Historian Mary V. Thompson is the author of “The Only Unavoidable Subject of Regret”: George Washington, Slavery, and the Enslaved Community at Mount Vernon (University of Virginia Press, 2019).

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of human rights and conflict. He can be reached by email: robinlindley@gmail.com.

 

Drawing on years of extensive research and a wide variety of sources, from financial and property records to letters and diaries, Ms. Thompson recounts the back-breaking work and everyday activities of those held in bondage. Without sentimentality she describes oppressive working conditions; the confinement; the diet and food shortages; the illness; the drafty housing; the ragged clothes; the spasms of cruel punishment; the solace in religion and customs; and the episodic resistance.

Ms. Thompson also illuminates the lives of George and Martha Washington through their relationships with black slaves. Washington was a strict disciplinarian with high expectations of himself and his slaves. As a young man, he callously bought and sold slaves like cattle. However, as Ms. Thompson explores, his attitudes toward slavery and race changed with the American Revolution, when he saw black men fight valiantly beside white troops. Although he was not a vocal abolitionist, his postwar statements reveal that he found slavery hypocritical and incompatible with the ideals of democracy and freedom for which he had fought. He was the only Founding Father who freed his slaves in his will.

Ms. Thompson brings to life this complicated history of enslaved people and their legendary owner. Her careful explication of the many aspects of life at Mount Vernon offers a vivid microcosm through which readers can better understand the institution of slavery and its human consequences during the colonial period and the early decades of the republic.

Since 1980, Mary V. Thompson has worked at George Washington's Mount Vernon in several capacities, and currently serves as Research Historian, supporting programs in all departments at Mount Vernon, with a primary focus on everyday life on the estate, including domestic routines, foodways, religious practices, slavery, and the slave community. She has lectured on many subjects, ranging from family life and private enterprise among the slaves, to slave resistance, to religious practices and funerary customs in George Washington's family. Her other books include “In the Hands of a Good Providence”: Religion in the Life of George Washington and A Short Biography of Martha Washington. Ms. Thompson also has written chapters for several books, entries in encyclopedias, and numerous articles. She earned an M.A. in History from the University of Virginia.

Ms. Thompson generously responded by email to a series of questions on her work and her new book on the slave community at Mount Vernon.

 

Robin Lindley: Congratulations Ms. Thompson on your recent book on George Washington and enslavement at Mount Vernon. Before getting to your book, I wanted to ask about your background. How did you decide on a career as a historian?

Mary V. Thompson: My father was a major influence on that.  He served for 32 years as an Army Chaplain and, through quite a few moves, would drag us to nearby museums and historic sites and encourage us to read about the next place we were going and all the exciting things that happened there, so we were pretty psyched by the time we got there. He was also the first curator of the Army Chaplains Museum, when it was in Brooklyn, during the Bicentennial of the Revolutionary War.  As part of that job, he also edited a 5-volume history of the Chaplains Corps, while writing the first volume, which covered the American Revolution.  So, as I went through high school, I helped in the museum with some of the exhibits, helped with acquisitions, and with research. I loved all of it.

Robin Lindley: I understand that you’ve spent most of your professional career as a historian at Mount Vernon. How did you come to work at this historic plantation and what is your role?

Mary V. Thompson: This was definitely a result of serendipity—or providence, depending on your world view. I was getting ready to finish a master’s degree at the University of Virginia, while working as a volunteer for the Army Ordnance Museum at Aberdeen Proving Ground in Maryland, and sending out what felt like bazillions of resumes for jobs all over the country. I started out part-time [at Mount Vernon] as an historic interpreter (giving tours to about 8,000 visitors per day). From there, I moved on to doing special projects for the Curator, then to assisting full-time in the Curatorial Department. I moved up to being the Registrar in the Curatorial Department, which involved cataloguing new objects as they came into the collection, keeping track of where everything was, doing inventories, working with insurance companies, etc.

To keep me from going nuts, they gave me one day per week to do research on a specific, agreed-upon topic, the first of which dealt with foodways. After a few years, my boss asked me to switch to studying slavery and slave life at Mount Vernon. In the late 1990s, as the 200th anniversary of George Washington’s death was rapidly approaching, I worked on three major projects: a travelling exhibition entitled “Treasures from Mount Vernon: George Washington Revealed,” which opened in late 1998 and travelled to five cities around the country; redoing the furnishings in the mansion, with special exhibitions to make the house look as though the Washingtons had just walked out of the room; and the recreation/reenactment of George Washington’s funeral, a three-hour event on C-Span.

I was then moved to the Library, where I worked as the Research Specialist and then as Research Historian. This involved fielding questions from people all over the country, generally about domestic life here at Mount Vernon; helping authors, illustrators, and publishers by vetting publications; helping pretty much every department on the estate with helpful quotes; and deciding whether we had enough information on a particular subject to do a special exhibit or program built around it. Best of all was the opportunity to give talks on and publish my own research.

Robin Lindley: What sparked your recent book on enslavement at Mount Vernon? 

Mary V. Thompson: I actually started working on the topic in the late 1980s, because Mount Vernon really needed to be able to teach its staff and visitors about this issue, but it was probably about seven or eight years after that before it knew it wanted to be a book.  It was in the early 1960s that I first learned about slavery, as a result of the Civil War centennial, which was going on when I was an elementary school student, at the same time that the Civil Rights movement was playing out on the news every night during dinner.  Then in graduate school at the University of Virginia in the late 1970s, slavery was the subject of much of our reading and classroom discussions.

Robin Lindley: Your book has been praised for its impressive detail and extensive research. What was your research process?

Mary V. Thompson: Thankfully, I was able to start with some of the sources compiled by prior members of our Library staff.  One of the Librarians had put together a bound volume of statements by George Washington on the topic of slavery, which she’d typed up back in the 1940s.  I went through that, page by page, listing the topics covered on each and then photocopied the pages and put them into loose-leaf binders for each of those topics.

I also went through bound volumes of photostats of the Weekly Work Reports that Washington required from his overseers, as well as photostats of his financial records. The Weekly Reports provided detailed information on the work being done on each of the five farms that made up the Mount Vernon estate, as well as information on the food being delivered to each farm, the weather on each day, the number of people working on each farm, and explanations for why certain people were not working each week. This last category was really interesting, because it provides information on illnesses, injuries, childbirth, and how long women were out of work because they were recovering from giving birth.

Another great source was correspondence by family members other than George Washington, as well as descriptions of Mount Vernon by visitors to the plantation, that often mention those enslaved there.  In order to understand where Mount Vernon fit in the overall picture of plantations in Virginia, it was also necessary to learn about life at Monticello, Montpelier, Sabine Hall, and elsewhere in the colony/state.   

Robin Lindley: You reconstruct and put a human face on the lives of slaves at Mount Vernon—despite the virtual lack of any contemporary documents by slaves from that period. How did you deal with that challenge?

Mary V. Thompson:  Getting at the enslaved community was one of my favorite parts of this project.  I started by taking the two fullest slave lists, from 1786 and 1799, and used them to try to reconstruct families. Thankfully, these two lists enumerated the people on each of the five farms and what their work was, with the 1786 list linking mothers and their children who were too young to work and giving the ages of those children.  The 1799 list did the same, but also linked women and their husbands and told where those husbands lived (whether they lived on the same farm as their wives and children, lived on another of Washington’s farms, belonged to another owner altogether, or were free men).

Comparing the two lists made it possible to start reconstructing extended, multigenerational families.  I put together a document for each of the farms, organized by family, and then, as people were named in the work reports, the financial records, or correspondence, I would add those references to the individual records, if I was as sure as I could be that I’d found the right person.  

For most of the people, I was keeping track of such things as the work they were doing, references to their health, their children, ways they might have made extra money, rations of food and clothing, instances of resistance, and so on.

Robin Lindley: I was impressed by your description of the massive size of Mount Vernon and the number of slaves who worked there. How would you briefly describe the Mount Vernon plantation in Washington’s era in terms of area, farming, crops, forests, and number of slaves? 

Mary V. Thompson:  Mount Vernon reached an ultimate size of 8,000 acres during Washington’s lifetime. While Washington, like many plantation owners prior to the American Revolution, started out as a tobacco grower, by the late 1760s he was making the switch from tobacco to grain and from markets in Europe to American and West Indian markets.  Much of the land was still forested even after the switch in crops and markets. As I understand it, to keep fireplaces running on a daily basis for heating, cooking, and washing, it takes ten acres of forest to supply enough naturally dying trees and branches to do those things without the need to cut any more trees.  The largest number of enslaved people on the plantation was 317 in 1799, the last year of George Washington’s life. 

Robin Lindley: What are a few salient things you learned about Washington’s treatment of slaves? 

Mary V. Thompson: Washington was a stickler for detail and a strict disciplinarian.  He was also approachable when his enslaved workers had problems with their overseers, needed to borrow something, or were interested in moving from one plantation job to another that carried more responsibility.  They even talked with him to clarify things when he didn’t understand a particular problem.  

Robin Lindley: How did Washington’s military background affect his treatment of slaves and other workers?

Mary V. Thompson: Washington used the same methods to keep an eye on his army as he did on his plantation and its enslaved workers.  He directed that officers and overseers spend time with his soldiers and slaves, respectively; he expected regular reports from them so that he had a very good idea of how things were going; and he would travel daily through his military camps and farms to catch problems before they became major issues.  He also insisted on proper medical care for both soldiers and slaves and was a strict disciplinarian in both situations.

Robin Lindley: How did Martha Washington see and treat slaves? It seems she was more dismissive and derogatory than her husband concerning black people.

Mary V. Thompson:  Like her husband, Martha Washington tended to doubt the trustworthiness of the enslaved people at Mount Vernon.  Upon learning of the death of an enslaved child with whom her niece was close, she wrote that the younger woman should “not find in him much loss,” because “the Blacks are so bad in th[e]ir nature that they have not the least grat[i]tude for the kindness that may be sh[o]wed them.”  

The Washingtons never seemed to realize that they only knew Africans and African-Americans as people who were enslaved, which meant that they were not interacting as equals and any ideas they may have had about innate qualities of this different culture were tainted by the institution of slavery.

Robin Lindley: I realize that direct evidence from slaves is limited, but what did you learn about how slaves viewed George Washington? 

Mary V. Thompson:  Because Washington was so admired by his contemporaries, many of whom came to Mount Vernon to see his home—and especially his tomb—those visitors often talked with the slaves and formerly enslaved people on the plantation in order to learn snippets about what the private George Washington was like. 

Extended members of the Washington family, former neighbors, official guests, and journalists, often wrote about their experiences at Mount Vernon and what they learned about Washington from those enslaved by him. Some people were still angry about how they were treated, while others were grateful for having been freed by him.

Robin Lindley: In his early years as a plantation owner, Washington—like most slave owners—saw his slaves as his property and he bought and sold slaves with seeming indifference to the cruelty and unfairness of this institution. He broke up slave marriages and families, and he considered black people indolent and intellectually inferior. However, as you detail, his views evolved. How do you see the arc of Washington’s life in terms of how he viewed his slaves and slavery?

Mary V. Thompson: That change primarily happened during the American Revolution.  Washington took command of the American Army in mid-1775.  Within three years, he was confiding to a cousin, who was managing Mount Vernon for him, that he no longer wanted to be a slave owner.  In those years, Washington was spending long periods of time in parts of the country where agriculture was successfully practiced without slave labor and he saw black soldiers fighting alongside white ones. He also could see the hypocrisy of fighting for liberty and freedom, while keeping others enslaved.  There were even younger officers on his staff who supported abolition.  

While he came to believe that slavery was something he wanted nothing more to do with, it was one thing to think that slavery was wrong, and something else again to figure out what to do to remedy the situation.  For example, it was not until 1782 that Virginia made it possible for individual slave owners to manumit their slaves without going through the state legislature.  After an eight-year absence from home, during which he took no salary, Washington also faced legal and financial issues that would hamper his ability to free the Mount Vernon slaves.

Robin Lindley: Many readers are familiar with the story of Thomas Jefferson and Sally Hemings. Did you find any evidence that George Washington had intimate relationships with any of his slaves or any free blacks?  

Mary V. Thompson:  Not really. As a young officer on the frontier during the French and Indian War, one of his brother officers wrote a letter, teasing him about his relationship with a woman described as “M’s Nel.”  The wording suggests several possibilities: she might have been a barmaid working for a tavern owner or pimp, whose first initial was M; another possibility is that she was the mistress of a brother officer; or perhaps that she was enslaved to another person.  With the minimal evidence that survives, there are many unanswered questions about this mystery woman.

The oral history of an enslaved family at Bushfield, the home of Washington’s younger brother, John Augustine Washington, alleges that George Washington was the father of a young male slave named West Ford, who was born in Westmoreland County, Virginia, roughly 95 miles from Mount Vernon, about a year or two after the American Revolution.  Here, the surviving documentary evidence contradicts the oral history, indicating that Ford’s father was someone in the Bushfield branch of the family.

Robin Lindley: What struck you particularly about the working conditions for slaves at Mount Vernon and how did they compare to conditions at other plantations?

Mary V. Thompson:  As was true on other Virginia plantations in the eighteenth century, the enslaved labor force at Mount Vernon worked from dawn to dusk six days per week, with the exception of four days off for Christmas, two days each off for Easter and Pentecost, and every Sunday throughout the year.  Because Easter and Pentecost took place on Sunday, which was already a day off, the slaves were given an additional day off on the Monday following the religious holiday.  If they were required to work on a holiday, there is considerable evidence that they were paid for their time on those days.

Robin Lindley: What are a few things you’d like readers to know about the living conditions of slaves at Mount Vernon?

Mary V. Thompson:  Most of the enslaved residents at Mount Vernon lived in wooden cabins—the smaller ones served as homes for one family, while the larger “duplexes” housed two families, separated by a fireplace wall.  

The majority of Americans at this period, free and enslaved, lived in very small quarters.  Comparing the cabins used by enslaved overseers and their families at two of the farms at Mount Vernon with that of the overseer on a plantation in Richmond County, the two at Mount Vernon had a total living space of 640 square feet, while the other had 480 square feet.

The homes of 75% of middle-class white farmers in the southwestern part of Virginia in 1785 were wooden cabins ranging from 394 square feet to 640 square feet.  Our visitors tend to be very surprised to learn that the average Virginia home of a middle-class or poor family in the eighteenth century would fit easily into just “the New Room,” the first room they enter in the Mount Vernon mansion. In other words, pretty much everyone was on the poor end of the scale, unless they were like the Washingtons, the Custises, or the Carters.  

Robin Lindley: I was surprised that some of the Mount Vernon slaves were literate. I had thought that education of slaves was illegal then. 

Mary V. Thompson: There were no restrictions on teaching slaves to read in eighteenth-century Virginia, and, in fact, it might have been a useful skill, especially for slaves working in more of a business capacity than in agricultural labor.  It was not until after a slave revolt known as Gabriel’s Rebellion (1800) that the state passed a law forbidding enslaved people to gather together in order to learn to read.  At least one historian has suggested that between 15 and 20 percent of slaves could read in the eighteenth century.

Robin Lindley: You found evidence that many slaves were aware of African lore and practices—at times from stories passed down through generations and at times from black people more recently arrived from Africa. What are some things you learned about African influences?

Mary V. Thompson:  African influence can be seen in everything from naming practices within families, to family lore and folk tales told to children, the languages spoken in the quarters, religious beliefs and practices, and even some of the food and cooking traditions.

Robin Lindley: You note that slaves were punished physically at Mount Vernon and that even Washington at times applied the lash. What did you find about forms of punishment at the plantation?

Mary V. Thompson:  One of the changes on the plantation after the war, recorded by Washington’s secretary Tobias Lear, was that his employer was trying to put limits on the physical punishment doled out to the enslaved.  According to Lear, Washington wrote that no one was to be punished unless there was an investigation into the case and “the defendant found guilty of some bad deed.”  After the war, Washington also tried to use more positive reinforcement, instead of punishment, in order to get the sort of behavior he wanted.  Those positive reinforcements included such things as the chance to get a better job, monetary rewards, or even better quality clothing.

Robin Lindley: What happened to slaves at Mount Vernon who escaped and were recaptured? 

Mary V. Thompson: It would depend on the circumstances and how difficult it was to get them back.  Some people might run away briefly because of a conflict with someone else in the quarters, or with an overseer, and needed a breather to let the situation cool off.  Others might have left to visit relatives on another plantation.  If they were not gone long and came back on their own, there might be little punishment.  In other cases, if someone continually ran away or was involved in petty crimes, they might be punished physically or even sold away.  

We know of at least one slave who was sold to another plantation in Virginia after running away four times in five years; of three occasions when George Washington sold a person to the West Indies, something many people today consider akin to a death sentence; and of one case where a young man at Mount Vernon—and his parents—were told that he would be sold there as well if he didn’t start exhibiting better behavior.

Robin Lindley: Did you find examples of slave resistance?

Mary V. Thompson:  Yes, many. When people today think of resistance, most are probably thinking of things like running away, physically fighting back against an overseer, stealing something to eat, or poisoning someone in the big house. Not everyone was brave enough or desperate enough to do something so easily detectable.  They might well have tried something less obvious, like slowing down the pace of work, procrastinating on finishing a particular job, or even pretending to be sick or pregnant.  

Robin Lindley: Oney Judge Staines was a Mount Vernon slave who escaped to New Hampshire a few years before Washington died. Washington was angry and vigorously sought her return, but was unsuccessful. Did you find new information on this fascinating case?

Mary V. Thompson:  It wasn’t exactly new information, but the fact that this young woman was one of the “dower slaves” from the estate of Martha Washington’s first husband meant that Martha did not own her or any of the others, but only had the use of them (and any offspring they had) until her death.  George Washington would lose access to those slaves upon Martha’s death, when the dower slaves would be divided among the heirs of her first husband, who in this case were her four Custis grandchildren.

According to a Virginia law at the time, if any dower slave from that state was taken to another state, without the permission of the heirs—or presumably the guardian of those heirs if they were minors—then the heirs or the guardian acting on their behalf would be entitled to take the entire estate immediately, without having to wait for the death of either the husband or wife. Oney’s escape may well have threatened the entire Custis estate.

Robin Lindley: You note that Washington was the only slave-owning Founder who freed all of his slaves in his will. You also note that he seemed circumspect and perhaps ashamed about owning slaves later in his life. Did he ever speak out publicly for the abolition of slavery in his lifetime?

Mary V. Thompson:  It depends on what a person means by “publicly.”  Washington corresponded with quite a few abolitionists, both British and American, after the Revolution.  In response to those people who were pushing him to emancipate those he held in bondage, Washington typically responded that he thought the only legitimate way to do so was through a gradual process of manumission, much like the northern states were setting up.  He noted that he would always vote to forward such a plan; however, he never stood in front of a legislative body as a proponent of a plan like that.  

Robin Lindley: What do you hope readers take from your groundbreaking book? 

Mary V. Thompson:  I would like people to understand that slavery in eighteenth-century Virginia differed from the same institution in both the seventeenth and nineteenth centuries, and that it was a complex institution.  For example, there were people at Mount Vernon who were free, hired, indentured, and enslaved.  They came from many countries and cultures on two continents, represented a variety of both European and African religious traditions, and began their relationships speaking many different languages.  

Robin Lindley: It’s a complicated story. Thank you very much for your thoughtful comments, Ms. Thompson, and congratulations on your illuminating book on the Father of the Country and enslavement on his plantation. 

 

When the Western Wall Was A Battleground For Jewish Rights

 

Moshe Phillips reviews this book as part of Herut North America's Zionist History Book of the Month series.

 

The book The Western Wall Wars (Whirlwind Press, 2019) tells the stories of the young men who, from 1930 to 1947, violated British regulations banning the sounding of the shofar at the Western Wall at the conclusion of Yom Kippur services each year.

 

Moshe Zvi Segal was the first of these young men, and he was arrested for sounding the shofar. He blazed a path for young Zionist revolutionaries to follow in what became the longest-running operation in the history of the Zionist underground.

 

Rabbi Moshe Segal (1904-1985) was the quintessential Zionist rebel and a key figure in the histories of Betar, Brit HaBiryonim, the Irgun, LEHI (the Stern Group), and the Haganah, and he was the founder of the Brit HaShmonaim religious youth movement. All of these organizations were part of the movement initiated by Zev Jabotinsky (1880-1940), the greatest pre-World War Two Zionist leader after Theodor Herzl. Segal himself was a close comrade of Yitzhak Shamir when the future prime minister was a 1940s commander of LEHI.

 

Author Zev Golan knew Rabbi Segal personally and interviewed him many times in addition to attending his lectures and translating his writings. Golan is one of only a handful of Americans who made it their business to seek out the aging heroes of the Irgun and LEHI and to get to know them, their stories, and the ideas that animated their deeds while they were still alive.

 

The Western Wall Wars is subtitled How the Wailing Wall Became the Heroic Wall and is a direct result of Golan's relationship with Segal. In 1930, Segal became the first individual to violate the British regulations against the sounding of the shofar at the Western Wall at the conclusion of the Yom Kippur service. Until 1947, a volunteer from the Irgun, Betar, or the Brit HaShmonaim sounded the shofar every year–often after receiving personal training from Segal in both the mitzvah of shofar and how to elude the British police. The British authorities went to great lengths to stop the shofar from being sounded. That the British tried to stop Jews from performing a mitzvah will probably seem impossible for today’s readers to fathom, and that is just one of the reasons this book is so important.

 

The book explains how Segal and the others who followed in his footsteps transformed the Western Wall from a site of wailing to one of national pride. The book reveals the details of the actual operations at the Western Wall and the full stories of the volunteers who were arrested, escaped from prison, and/or deported to prisons in Africa. Some were involved in the 1946 Irgun attack on the King David Hotel and other Irgun or LEHI operations. Many later fought in Israel's wars. The Western Wall Wars also covers Arab attempts during the 1920s to drive the Jews from the Western Wall and the Jewish response to the Arab effort. Segal was a leader of the opposition in this area as well.

 

In the pre-1940 period, as the Jewish underground was emerging, the Jabotinsky movement suffered the slings and arrows of the leftist establishment and bravely soldiered on. The light of history has shown that the stances of the Jabotinsky Zionists were correct. If Jabotinsky had been more successful, perhaps the tragedies of the Holocaust and the loss of life in the 1948 war could have been lessened. Progressive historians have always downplayed–and often completely removed–the role of Jabotinsky's movement in their histories of Zionism. This book helps to preserve authentic history, and that is a highly praiseworthy thing.

 

The story of the Zionist underground in the pre-state period told here also helps the reader to understand the ideology that guided these warriors as they fought for Jewish rights and rebelled against the British Empire.

 

And this is no small thing. The ideology of Jabotinsky, Rabbi Segal and their comrades is just as instructive and relevant now as it was many decades ago--probably more so.

 

Now that the Jewish People possess a sovereign Jewish State, the concept of just what a Jewish State should rightly be is of vital importance. Avrum Burg, a former Speaker of the Knesset and a former chairman of both the World Zionist Organization and the Jewish Agency, said in a June 2007 interview with Israel's Haaretz newspaper, "To define the state of Israel as a Jewish state is the key to its end." Now, we live in a time when many radical Jewish organizations in the U.S. struggle to redefine Israel as something other than a Jewish State.

  

For today’s Zionists to be truly successful in a way that transcends politics and elections–in a nation-transforming way–we must reevaluate the philosophy of the heroes who fought for Israel’s freedom and for Jewish rights in Jerusalem. These heroes were not only the ideological heirs of Jabotinsky but the champions who brought Jabotinsky's deepest hopes into reality.

The Cost of Loyalty: A Crisis of Ethics in the Military

 

At a time when faith in our institutions is precarious, and our old dependence on norms has proven to be misguided at best, Americans take succor in believing that the backbone of our country’s security—the U.S. military—is, well, secure. 

It’s necessary to think this way because the alternative is just too disturbing to consider. Yet, from the U.S. Military Academy, known as West Point, which its military administrators assure us with no hint of humility, or data, is “the world’s preeminent leadership institution,” I’d like to sound the alarm. Our faith in military leaders and their progeny is dangerously misplaced. 

The Afghanistan Papers, more than 600 recorded interviews of military and civilian leaders obtained by the Washington Post after years of litigation against the government, showed that, according to former lieutenant general Douglas Lute, “we didn’t have the foggiest notion of what we were undertaking” in a war that started more than 18 years ago.

Now 75 years since America’s last victory in a major war, one led by a general, Eisenhower, who was born in 1890, Americans are numb to the military’s strategic failures: bombing to pulp the cities of North Korea, the war of attrition in Vietnam, the wayward invasion and dismal occupation of Iraq. When President Trump and Congress increase the military budget, the near worship expressed by many Americans for the military prevents discussion about what is necessary for self-defense and what is wasted on the self-aggrandizement of generals and admirals and military contractors. 

Recent policies announced by the U.S. military regarding a redeployment of landmines at the discretion of commanders and the placement of low-yield nuclear warheads on submarines should make us shudder. From outside, the military may appear to be an undifferentiated mass of strong-jawed soldiers who come from a sturdier fabric than the rest of us, people who could be trusted with such massive responsibilities. It’s a myth. As I’ve described recently, the military does not have the capacity necessary to achieve success in a modern world. These are average men and women trained within an inch of their nature at the U.S. military academies. And I’ve seen these people up close. I’ve answered to them. I’ve been blackballed by them, retaliated against by them, and will likely be attacked by them again simply for speaking up. At West Point, the headwaters that feeds the armed forces, the values of loyalty and conformity have blasted away any sense of ethics. When fealty to anyone of higher rank becomes the norm—and the essential component of careerism—truth will dissipate quickly. 

The military is in trouble because the schools feeding it have long been breeding grounds for its worst tendencies. One 2017 study by researchers at West Point shows that the intellectual core of the Army and West Point has been eroding for the past 75 years.

Most Americans still consider the military institution to be an invincible force. Even though, led mainly by generals who graduated from West Point, it snatched a stalemate from the jaws of victory in Korea, got 58,000 young American men killed in Vietnam, and has run Iraq and Afghanistan into the ground, we continue to worship at the altar of military infallibility. Its leaders are driven by self-regard, its soldiers by self-preservation, and all are conditioned to protect the institution, not us.

From the first day of Beast Barracks, a summer training and indoctrination ritual, new cadets at West Point learn that loyalty to each other is the preeminent value, high above all the others, including truth. Of course, loyalty is important—we all understand the need for it, and how war requires unity forged in this trust. But in reality, loyalty doesn’t look like it does in movies—an exhausted soldier avoiding fire to save his comrade. It takes the form of hiding a breach from a sanctioned “outsider,” protecting a superior from the suspicion of an inspector general, lying on a casualty report, or falsifying records to evade criticism. The value being cultivated by the military ethos at the academies and in the theaters across the world goes by a different word for the rest of us: deceit.

And the military knows it. Leonard Wong and Stephen Gerras, former colonels now at the Army War College, wrote in “Lying to Ourselves: Dishonesty in the Army Profession,” a 2015 study describing a culture devoid of integrity, that “’white’ lies and ‘innocent’ mistruths have become so commonplace in the U.S. Army that there is often no ethical angst, no deep soul-searching, and no righteous outrage when examples of routine dishonesty are encountered.” Rather, “mutually agreed deception exists in the Army because many decisions to lie, cheat, or steal are simply no longer viewed as ethical choices.” 

The Afghanistan Papers illustrate a military command and civilian bureaucracy in almost unanimous agreement that the strategies employed in the war in Afghanistan have been futile. But not a single official in a position to know ever spoke up. This is not a coincidence. This kind of deception is not a bug of the military hierarchy. It is a feature.

The problems coming from inside the U.S. military will be difficult to solve for two reasons. First, Americans’ trust in the military is akin to religious worship, so we don’t engage in any real oversight. Second, the military itself has no incentives for admitting faults in order to rectify problems, despite a continuing cascade of losses and harm to soldiers at home. “We’re creating an environment where everything is too rosy because everyone is afraid to paint the true picture,” an army officer said in the Wong-Gerras study. “You just wonder where it will break, when it will fall apart.”

The regal military “chain of command,” a relic in 2020, has become, in the hands of military officers, a device to coerce blind unquestioning loyalty at the expense of truth. This has spawned silence among generals when their speaking up could prevent America from engaging in almost perpetual warfare. 

They could best fulfill their constitutional obligations by telling us the truth. But, alas, they weren’t trained for that. 

Though the evidence amasses—through scandals, leaks, facts on the ground, and decades of failure—Americans stubbornly hold faith in the U.S. military. A staggering 80% of Americans believe our military will always act in our country’s best interest. Is it because, were we to consider the evidence, it would be hard to sleep at night? 

"Faster" Author Neal Bascomb on the Jewish Auto Racer who Defied Hitler

 

Neal Bascomb has written many works of nonfiction, digging into historical episodes to find stories of perseverance and triumph over adversity. The research for his latest book involved unearthing a forgotten challenge to Nazi doctrines of racial supremacy by a Jewish driver pushed out of racing by European antisemitism, and his unlikely allies. It also involved high-speed laps in a rare vintage Delahaye race car. He shares insight into the process of research and writing here. 

 

How did you come to write about Rene Dreyfus and his once-famous Delahaye?

Book ideas originate from many places. Sometimes you’re out fishing for ideas; other times you come across a vignette in a history that you believe could play out on a much larger scale. And on rare occasions, one drops into your lap like a gift from heaven. This was the latter, courtesy of my good friend and talented Wall Street Journal columnist Sam Walker. Four years ago, while in New York visiting his family, he passed along to me a small news article about classic car collector Peter Mullin, who had just premiered his latest gem, called the Million Franc Delahaye. There was an intriguing story behind its genesis. According to the piece, the French-made car had been produced to take on the fearsome German Silver Arrows before the outbreak of World War II; its creation was financed by an American heiress named Lucy Schell; and, if this was not epic tale enough, the Delahaye was piloted by Jewish driver Rene Dreyfus. It did not take a genius to know this was a remarkable sports story of David beating Goliath, perhaps even better than Jesse Owens at the Berlin Olympics. Instead of running, we had race cars. Even better, Hitler sought to destroy the Delahaye when he invaded France, and the car was disassembled and hidden to avoid discovery. After years of restoration efforts, Mullin had brought the car back to its former glory. Even after focusing on this story for three years, I still get excited thinking about it.

 

Beyond what sounds like a compelling narrative, why does this history of a race car matter to today’s reader?

In building the Silver Arrow race cars, Hitler wanted to prove the superiority of the German nation—yes, in motor sport, but also in terms of its engineering, technological, and economic prowess. These bullets on wheels were nationalistic symbols (in some ways of capitalism versus fascism). The same game continues to play out today. One sees it most pervasively in China, through its high-speed trains, its skyscrapers, its leaps in artificial intelligence and supercomputers, and even its “Belt and Road” initiative spanning the globe. Not only must they dominate, but they need to be seen to dominate above all others, past and present. The battle on the Grand Prix circuit was an early predecessor of this, and the attempt by Hitler to erase the history of the German defeat at the hands of Dreyfus, Schell, and Delahaye (by destroying the records of the 1938 season and seeking out the car itself for the same) has echoes today—and throughout time—as well. It is not much different when you think of ISIS trying to eliminate any vestige of Shiite mosques, tombs, and shrines in Iraq. Or the mortaring of the famed Mostar bridge during the Bosnian war. It is rewriting history by attempting to erase it. That is why the resurrection of the Delahaye to its former glory resonates so deeply today. 

 

Have you always been a car fanatic?

Truth be told, no. As a kid, my grandmother owned a 1968 red Mustang convertible, which I was in absolute awe of, but my fascination waned, not least because when I was old enough to drive, my experience was limited to utilitarian snorers, including a Pontiac Sunbird, Isuzu Impulse, and Suzuki Grand Vitara. Cubic capacities, the differences between a turbo and supercharged engine, and the dynamics of suspension systems—all were lost on me. That said, when I first visited the Petersen Museum in Los Angeles to meet Peter Mullin, everything changed. There was a special exhibit on Bugattis, and I simply could not believe the beauty and refinement of these masterpieces of design. Then, when I first got a chance to ride in some of these classics, to feel their power and the thrum of their engines, I was lost to this world. Now I frequent classic car sites like bringatrailer.com and revel in the weekly “My Ride” dispatches from the Wall Street Journal’s A.J. Baime, to name just a few distractions. One day soon, I intend to get my hands on a ’68 convertible too!

 

Why have we never heard of the name Lucy Schell?

Over the past two years, the New York Times has been running a spectacular project called “Overlooked.” Originated by Amisha Padnani, it features the obituaries of remarkable women whom the paper-of-record overlooked at their deaths, from the first American woman to claim an Olympic championship to a literary star of the Harlem Renaissance. Surely the faded memory of Lucy Schell suffered from the same discrimination and prejudice. The fact is Lucy was a true path-breaker who lived an incredible life. A nurse in WWI, she went on to become one of the first speed queens. For almost half a decade, she was one of the best Monte Carlo Rallyers, man or woman. She was surely the top-ranked American. Then she became the first woman to start her own Grand Prix race car team. She helped win the famed Million Franc prize for Delahaye, and it was her car—and her driver Rene Dreyfus—that beat the German Silver Arrows in an epic race before the war. One of my proudest accomplishments with Faster is restoring her to her rightful position in racing history!

 

What was the most challenging aspect of the research?

On previous books, there was often a world of publicly available primary research. Sometimes I had to dig for weeks or months among obscure files to discover what I needed, but a picture ID and a fair travel budget usually sufficed to obtain entry (except in Russia!). In the automotive world, private collectors often corner the market on archival material, including company documents, interviews, photographs, and personal papers. At first, I was rebuffed by these collectors, likely because I was seen as an interloper (and not French enough when it came to Delahayes). Charm offensives and a lot of follow-up finally gained me access to many treasures scattered about the globe, whether found in a sprawling French farmhouse, a cluttered Seattle garage, or a storybook English manor, among other places. Fortunately, the reward was never-before-heard interviews with Rene Dreyfus, personal histories of Lucy Schell, grainy video footage of 1930s Grand Prix racing, and rarely seen blueprints and production figures from Delahaye. The generosity of these collectors, not to mention the exquisite archives at the REVS Institute and Daimler-Benz, has allowed me to tell this remarkable story in what I hope is a visceral, edge-of-your-seat way.

 

What was most memorable about writing Faster?

The first I chronicle in the introduction: zooming through the orange groves of California in the 1938 Delahaye 145 at speeds that still make me tremble. Second to that would be my journey to France, Germany, and Monaco. It’s one thing to watch a race on the Nurburgring or through Monte Carlo; it’s another to drive these same stretches, then walk them on foot. I wanted to know every turn and dip in an attempt to get some sense of the challenges of racing these courses at speed, amid a crowded field of other cars that could vault off the road at any minute (or directly into you). Of special note was a private tour of Montlhery outside Paris. We raced around the oval autodrome, and I truly got a sense of what it was like to round the banked curves, feeling like a fly stuck on a wall but in danger of slipping off at any moment. It was deliciously frightening, and the experience of a lifetime. 

 

Golda Meir and Women's Wartime Leadership

 

A CNN poll at the time of the New Hampshire primary revealed that 30 percent of voters believed a female Democratic nominee would have a harder time beating Donald Trump than a male nominee would. As the number of women in the field dwindled from six, to two, to zero, in an election cycle that had been widely discussed as a possible referendum on women’s empowerment, a question emerged, generally implicit but still present:  will voters trust a woman to lead the nation in time of war?

 

Whether a woman could wield ultimate power in a field jealously guarded by men for centuries was answered, though not definitively, in 2016. Nearly 66 million citizens voted for Hillary Clinton to be their military commander-in-chief, three million more than voted to give that authority to Donald Trump. Matters of war are certainly not the only election issues. But the absence of a woman war leader is remarkable for a country that has stood on democracy’s cutting edge since its founding and has been engaged for decades in nearly continuous armed conflicts abroad—while many other countries have looked to women for wartime leadership.

 

Among modern leaders, none played for higher stakes than a Jewish grandmother named Golda Meir. She helped coordinate Jerusalem’s defense during Israel’s war of independence in 1948, served as foreign minister during the 1956 Suez War, and informally advised the government during the 1967 Six-Day War. Two years later, she became Israel’s first female prime minister. In a country in a constant state of war, she received many of the sort of middle-of-the-night calls that were portrayed prominently and ominously in American campaign commercials in 2008.

 

On October 6, 1973, one of those calls informed her that Syria and Egypt would be launching a full-scale attack that very day—Yom Kippur, the highest of holy days. Golda’s military advisors recommended a strategy that had proved so successful in the past: a preemptive air strike against Syrian and Egyptian airfields. Years of war had hardened Golda to the reality of life surrounded by enemies. “Are we supposed to sit here with our hands folded, praying and murmuring, ‘Let’s hope that nothing happens?’” she asked one interviewer. “Praying doesn’t help. What helps is to counterattack. With all possible means, including means that we don’t necessarily like.”

 

Yet as a statesman, Golda also had to focus beyond the war’s opening salvos. Israel depended upon the United States for ammunition, aircraft, and replacement parts, and a first strike by Israel would jeopardize relations with its superpower patron. Haggard from stress, she forbade her commanders from launching a preemptive strike. For the war’s first few days, Israel’s survival hung in the balance. Its army lost a quarter of its tanks and an eighth of its fighter-bombers. Israel’s defense line in the Sinai began cracking, and Syrian troops made inroads along the Golan Heights. Had either attacker broken through Israel’s thin green line, its heartland would have been wide open to attack. 

 

Golda spent several anxious days playing the parts of strategist, diplomat, and cheerleader. Chain-smoking and gulping a gallon of coffee each day, the 75-year-old woman stiffened the badly shaken spirits of her defense minister, the veteran general Moshe Dayan. When Dayan asked Golda for permission to assemble Israel’s nuclear weapons, she refused. The battle would be fought in the Sinai and on the Golan Heights, but the war would be won—or lost—by the support of Israel’s friends. There would be no nuclear option. Shadowing Golda as she reviewed military reports with Dayan, the defense editor of the Israeli newspaper Ha’aretz wrote, “It was strange to see a warrior of seven campaigns and brilliant past chief of staff of the IDF bringing clearly operational subjects to a Jewish grandmother for decision.”

 

With U.S. supplies and Israel’s reserves taking the field, the tide of battle turned. Israeli troops threw back the Syrians on the Golan Heights and surrounded an Egyptian army in the Sinai. On October 29, less than a month after the war began, Israeli commanders met their Egyptian counterparts under a tent stretched across the guns of four parked tanks. They negotiated a withdrawal of Egyptian forces on terms that permitted Egypt’s president Anwar Sadat to save face and, in time, negotiate a lasting peace with Israel.

 

Golda’s was just one chapter in a rich history of women leading their nations in wartime. In ancient times, Egypt’s Cleopatra and Tomyris of the Massagetae led their nations against the superpowers of their day. In the Middle Ages, Queen Manduhai of the Mongols reclaimed a chunk of Genghis Khan’s empire about the time that Marguerite d’Anjou, wife of England’s King Henry VI, rallied the Lancaster faction in the Wars of the Roses. Elizabeth Tudor turned back the Spanish Armada, Queen Njinga ravaged Portugal’s slave-trading outposts in Angola, and Catherine the Great led her nation to victory against the Ottoman Empire, Sweden and Poland. In the Age of Democracy, Indira Gandhi resorted to war to solve a humanitarian crisis in Bangladesh, and Margaret Thatcher ejected Argentine invaders from the Falkland Islands.

 

These examples suggest sex is no bar to effective political leadership of a country at war, or to leadership over military forces. Indeed, comparatively few male American presidents have boasted high-level military expertise when confronted with a decision to go to war. Our historical bias in favor of male leaders in wartime will come to a close, sooner or later, and the next generation of conflicts may find the United States led, capably and competently, by a commandress-in-chief. 

 

Jonathan W. Jordan and Emily Anne Jordan are co-authors of the book, THE WAR QUEENS: Extraordinary Women Who Ruled the Battlefield (Diversion Books, March 10, 2020).

 

I Taught my Students to Create Hoaxes in Public. Here’s Why

 

Two hoaxes surfaced on the Internet in late November and early December 2019. The hoaxes drew interest and comment, prompted clicks and reposts, and fooled both the unsuspecting and the highly trained. The hoaxes did not spark widespread panic or promise essential health cures; they were not created by marketing bots or Russian spies. The hoaxes were designed, planned, and launched by my undergraduate students at the University of Utah as part of their preparation to become savvy citizens in the twenty-first century.

 

One hoax announced the “discovery” of a wedding ring in Frisco, Utah, bearing the inscription “Etta Place - Chubut, 1904.” A photograph of the ring was enough to excite Facebook users to fill in the gaps. “Etta Place was the girlfriend of Henry Longbaugh aka the Sundance Kid,” wrote one. “Chubut is the province in Argentina where the couple settled,” added another. “I thought I knew pretty much all of the legends of Butch and his companions,” observed a third, “I’m looking forward to the rest of the story.” This ruse met its match in a museum professional who pointed out that the dirt in the photo did not reflect the composition of soil from southern Utah and that a gold ring would not have fractured like the tungsten ring in the photo.

 

The second hoax “uncovered” a document in the university archive with implications for a collegiate rivalry. A first-year graduate student “found” an architectural sketch indicating that the statue of Brigham Young on his namesake campus in Provo was originally intended to fill a pedestal that remains empty to this day on the campus of the University of Utah. Then an ardent “fan” created an online petition calling for the return of the statue, complete with a Twitter hashtag: #BringBrighamBack! BYU fans seemed amused, while a Utah fan retorted, “Please keep the name and the statue!” One reader knowingly stated that he’d “already heard” this as an urban myth and was glad the answer had finally been found in the archives. A professor from one of the schools read the petition, clicked through to the faked study, and then reposted it.

 

Why teach students to create hoaxes? First and foremost, we need citizens with the skills and perspective to survive in “the golden age of hoaxes.” Photoshopped images and deep faked videos go viral on social media, cable channels air documentaries about mermaids and monsters, celebrities become politicians, and politicians call the media “fake.” And yet, what circulates so quickly on the Internet today also circulated in other forms in the past—forged diaries and documents, a mermaid body washed up on shore, photographs of fairies or Lincoln’s ghost, and tales of life on the moon.

 

Teaching about hoaxes also represents good professional practice. I taught the course as a night class at a university, but by day I am the director of an archive/research library where I am responsible both to share information and preserve its integrity for perpetuity. My library joins the Smithsonian among the countless victims of the most notorious murderer-forger in history. We follow the best practices of the archival profession for providing secure access in our reading room. Our IT professionals constantly monitor for phishing, social engineering, or outright hacking. One way to protect the historical record is to understand how it can be stolen, manipulated, modified, or erased.

 

Teaching with hoaxes also presented an opportunity for engaging pedagogy. In designing the course, I took cues from psychologists who examine abnormal mental dysfunction to promote better mental health, from police officers who experience tasers and K-9 takedowns before being authorized to use them, and from athletes who run the plays of opponents in order to beat them. If we are to avoid being fooled by bad information, we must understand how it works. How better to understand how it works than by creating our own hoaxes? We began by exploring the history and methods of past hoaxers, frauds, and forgers. We sought usable lessons by developing skills in the historical method to identify and debunk false claims and by seeking to understand how digital misinformation and disinformation circulate today.

 

The students succeeded in creating hoaxes that fooled some in their target audiences, but our creations faced stiff competition on the worldwide web of misinformation. Both hoaxes ran for about two weeks before we came clean. Those weeks also witnessed presidential impeachment hearings by the House Judiciary Committee, a rumor about the height of Disney’s most famous animated snowman, and a Facebook hoax about women being abducted in white vans. One of the takeaways noted by my students was that it was hard to get people’s attention in an environment in which they are constantly bombarded by misinformation.  

The experience also prompted increased respect for actual professional expertise. The group working on the fake ring explicitly targeted baby boomers, thinking they’d be easy marks. That intergenerational spite dissipated after their best effort was so effortlessly debunked by someone with more experience.

 

Other lessons learned came out of the ongoing classroom discussions about the ethics of our activities. After all, it’s not every day that one gets assigned to forge a historical document, photoshop an image, create fake social media accounts, and spread deception on the Internet. We drew a line that prohibited the forgery of medical claims, soliciting money, public harm, or anything that would get any of us arrested, fired, or expelled. That left us free to experiment with information, history, and emotional appeal.

 

The discipline of history has much to offer our present age of misinformation in terms of subject matter and analytical methods. Here’s hoping for a future in which we recognize the flood of misinformation around us, acknowledge the value of expertise, and develop the thinking skills necessary to survive our century. 

Trump's Budget Proposal Reveals His Values

 

It is often said that government budgets are “an expression of values.”  Those values are clear in the Trump administration’s $4.8 trillion budget proposal for 2021, unveiled early this February.

The budget calls for deep cuts in major U.S. government programs, especially those protecting public health. The Department of Health and Human Services would be slashed by 10 percent, while the Centers for Disease Control and Prevention, which has already been proven to be underfunded and unprepared to deal with the coronavirus outbreak, would be cut by a further 9 percent. Spending on Medicaid, which currently insures healthcare for one out of five Americans, would plummet by roughly $900 billion, largely thanks to reductions in coverage for the poor and the disabled. Meanwhile, Medicare expenditures would drop by roughly $500 billion. The budget proposal’s reshuffling of agency responsibilities in connection with tobacco regulation also seems likely to contribute to a decline in public health.

 

Public education constitutes another low-priority item in the Trump administration’s budget proposal. Calling for a funding cut of nearly 8 percent in the Department of Education, the proposal hits student assistance programs particularly hard. Despite the soaring costs of a college education, the budget would eliminate subsidized federal student loans and end the Public Service Loan Forgiveness program, which currently cancels federal student loan debt for teachers and other public servants after a decade of loan payments. The budget would also reduce student work-study funding and increase the percentage of discretionary income student borrowers must devote to repayment.

 

Some of the deepest cuts in the Trump budget relate to the environment. The Environmental Protection Agency would lose 26 percent of its funding, including a 10 percent reduction in the Superfund hazardous waste cleanup program, a nearly 50 percent reduction in research and development, and a $376 million decrease in efforts to improve air quality. EPA staffing would fall to its lowest levels in three decades, thereby hampering enforcement of existing environmental regulations. Moreover, there would be cuts to the National Park Service of $587 million and to the U.S. Fish and Wildlife Service of $80 million, including an $11 million reduction in funding for determining extinction risk under the Endangered Species Act. The administration’s approach to the environment is also evident in the budget’s call for a cut of half the funding for the ecosystem work of the U.S. Geological Survey and for a 74 percent reduction in funding for the Energy Efficiency and Renewable Energy program of the Department of Energy.

 

The budget targets other domestic programs for sharp cutbacks, as well. In the area of public transportation, Amtrak’s federal grants would be reduced from $2 billion to less than half that amount. The budget proposal also calls for ending all federal funding for the Corporation for Public Broadcasting, which supports PBS and National Public Radio stations. Although Trump has repeatedly promised not to cut Social Security, his budget would do just that, slashing it by $71 billion worth of benefits earmarked for disabled workers.

 

Programs aiding impoverished Americans come in for particularly harsh treatment. Although homelessness and securing affordable housing in urban areas are major problems in the United States, the budget calls for a 15 percent decrease in funding for the Department of Housing and Urban Development. Programs that help pay for rental assistance for low-income people would be slashed, while grants awarded to neighborhoods with deteriorating public and federally assisted housing would be eliminated. 

 

Furthermore, the budget proposes cutting funding for the Supplemental Nutrition Assistance Program, the federal government’s primary effort to feed the hungry, by $180 billion between 2021 and 2030. Stricter work requirements would be implemented, and are expected to result in nearly 700,000 Americans being dropped from the program’s coverage. Among them are large numbers of children, who would also lose their enrollment in the free school lunch program. Explaining the cuts, the Trump budget message stated that “too many people are still missing the opportunity to move from dependence to self-sufficiency.”

 

It’s also noteworthy that, in a world plagued by wars, a massive refugee crisis, climate disasters, and disease epidemics, the Trump budget calls for slashing State Department and U.S. Agency for International Development funding by 22 percent. The biggest cuts target diplomatic engagement, food assistance, and international organizations such as the United Nations, which would receive $447 million less for UN peacekeeping efforts and $508 million less for U.S. dues to the world organization. The budget also calls for slashing more than $3 billion in funding from global health programs, including half of U.S. funding for the World Health Organization.

 

By contrast, the Trump budget proposes substantial increases for the president’s favorite programs. Additional spending would be devoted to restricting immigration, including another $2 billion for building Trump’s much-touted wall on the U.S.-Mexico border and another $544 million to hire 4,636 additional ICE enforcement officers and prosecuting attorneys. Moreover, as a New York Times analysis noted, “the budget promotes a fossil fuel ‘energy boom’ in the United States, including an increase in the production of natural gas and crude oil.”  

 

Furthermore, despite the fact that U.S. military spending already surpasses the combined expenditures of the next seven military powers throughout the world, the Trump budget would add billions of dollars to annual U.S. military appropriations, raising them to $741 billion. This military spending would focus on developing a new generation of weapons, especially nuclear weapons, with the Department of Defense receiving $29 billion (a 16 percent increase) and the Department of Energy $20 billion (a 19 percent increase) for this purpose. If one adds in proposed expenditures on “missile defense” and cleaning up nuclear weapons sites, the annual cost of U.S. nuclear war preparations would soar to $75 billion.

 

With presidential and congressional elections now looming, Americans will soon have the opportunity to show whether these priorities―and the values underlying them―accord with their own. As the coronavirus pandemic indicates, a government's priorities and values can be matters of life and death.

Churchill, Stalin and the Legacy of the Grand Alliance: An Interview with Professor Geoffrey Roberts

 

Geoffrey Roberts, Martin Folly and Oleg Rzheshevsky have recently published Churchill and Stalin: Comrades-in-Arms During the Second World War (Pen & Sword, South Yorkshire: UK, 2019). The book offers an overview—based on material from the Russian archives, including never-before-released documents—of the relationship between the two leaders in the period surrounding the Grand Alliance of the US, UK and USSR, which defeated Nazi Germany. Aaron Leonard recently exchanged emails with Professor Roberts about his research.

 

‘Common knowledge’—in the West anyway—holds that Joseph Stalin was among the most evil persons of the twentieth century, yet your book presents a picture of him as a key figure, working hand in hand with the Western icon, Winston Churchill, in the successful defeat of Hitler. How does one reconcile such a seeming paradox?

 

There is no paradox. They united to defeat a common foe who threatened the very existence of their states and societies. Churchill saw Stalin as by far the lesser of two evils compared to Hitler, while for Stalin a common interest in the defeat of fascism prevailed over his hostility to British capitalism and imperialism. While Churchill had been one of the main organisers of the capitalist coalition that tried to overthrow Bolshevik Soviet Russia after the First World War, in the 1930s he opposed Anglo-French appeasement of Nazi Germany and campaigned for a grand alliance of Britain, France and the Soviet Union to oppose Hitler. When negotiations for an Anglo-Soviet-French pact failed, Churchill did not like the ensuing Nazi-Soviet pact, but he understood Stalin’s reasons for doing a deal with Hitler in August 1939 and believed that sooner or later Britain and the Soviet Union would be fighting alongside each other. During the period of the Nazi-Soviet pact, from August 1939-June 1941, Stalin was wary of being prematurely dragged into the war on Britain’s side but he admired Churchill’s refusal to capitulate to Hitler after the fall of France in 1940.

 

You describe the relationship between Churchill and Stalin as one—in Churchill’s view anyway—of ‘warlord to warlord.’ This struck me as a very particular word, and not one that immediately springs to mind. Could you explain why you think Churchill felt it captured their relationship?

 

By the time Churchill first met Stalin – in Moscow in August 1942 – the Soviet Union had survived the initial German onslaught, though not without the Red Army suffering millions of casualties. Like Churchill in 1940, Stalin had shown his mettle by remaining in Moscow in November 1941 when the Wehrmacht was at the gates of the Soviet capital. Stalin’s counterpart to Churchill’s ‘Finest Hour’ speech of June 1940 was his patriotic appeal to troops parading through Red Square on their way to the front on 7 November 1941. It was also clear from the two men’s correspondence with each other that they were both deeply involved in directing the British and Soviet war efforts. Both men had personal experience of war: Churchill in British imperial wars in Africa and World War One, Stalin during the Russian civil war. They were both steeped in military history, strategy and doctrine and had a penchant for military-style clothing. While Stalin described Roosevelt as a great man in both peace and war, he called Churchill his comrade-in-arms.

 

What about Stalin’s relationship with Roosevelt, the other prong of the troika? How did it compare with Churchill’s? Put another way, between which pair was the relationship strongest?

 

You have to remember that Stalin viewed personal relationships through a political prism. To him Roosevelt was a representative of the American progressive bourgeoisie and a potential ally against Fascism, Nazism and war-mongering capitalists and imperialists. Roosevelt’s New Deal wasn’t socialist but it leaned left and FDR was seen by American communists as a part of an embryonic popular front – a perspective that became even more pronounced during the Second World War when the CPUSA dissolved into a Communist Political Association that sought affiliation with the Democratic Party. 

 

When Roosevelt became President in 1933, the Soviet Union and the United States established diplomatic relations and resolved financial disputes arising from the Bolshevik takeover in Russia in 1917. Roosevelt was a bit player in events leading to the outbreak of war in 1939 but he came to the fore as an exponent of the United States as the ‘arsenal of democracy’. Stalin was impressed by Roosevelt’s rapid decision to extend Lend-Lease to the USSR when the Germans attacked the Soviet Union – even though the United States had yet to enter the war – and he knew from his sources within the US government that Roosevelt battled to overcome bureaucratic and political obstacles to shipping as much aid as possible to the Soviets. Roosevelt was fortunate to have at his disposal two special envoys – Harry Hopkins and Averell Harriman (US ambassador in Moscow from 1943 to 1945) – who got on well with Stalin and had the Soviet leader’s confidence.

 

Stalin was also a great respecter of the power that Roosevelt represented and he aspired to emulate US industrial and military might through a combination of American mass production methods and socialist economic planning. “Had I been born and brought up in America,” Stalin told the head of the American Chamber of Commerce in June 1944, “I would probably have been a businessman.” 

 

Stalin had a great deal of personal affection for FDR, which started at the Tehran summit in 1943 – a conference that entailed a long and arduous journey for the wheelchair-bound Roosevelt. The ailing US President undertook another gruelling journey when he met Churchill and Stalin at Yalta in 1945. Stalin was genuinely upset by Roosevelt’s unexpected death in April 1945 and apprehensive about the future of Soviet-American relations. Stalin was well aware of pressures in Washington for a more hard-line approach to relations with the USSR but was reassured by reports that Roosevelt’s successor, Harry S. Truman, was committed to continuing co-operation with the Soviets. In a piece for the History News Network a few years ago called “Why Roosevelt was Right about Stalin,” I concluded that “Stalin was sincere in his commitment to collaborate with Roosevelt during the war and was sorely disappointed when that cooperation did not continue under Truman.”

 

It seems to me that while he was emotionally closer to Stalin, Churchill’s relationship with Roosevelt – for class, political and life-history reasons – was stronger. The same was true of Stalin’s relations with Churchill and Roosevelt. He was emotionally intimate and bonded with Churchill but trusted Roosevelt more. When the Republican politician Harold Stassen met him in April 1947, Stalin told him “I am not a propagandist, I am a man of business,” and pointed out that he and Roosevelt had never indulged in the name-calling game of “totalitarians” v. “monopoly capitalists.” Stalin was referencing not just Truman’s recent speech to Congress calling for a Free World struggle against totalitarianism but also Churchill’s “Iron Curtain” speech at Fulton, Missouri in March 1946, a clarion call for a harder line against the Soviet Union which had disappointed but not surprised Stalin.

 

In reading your account what comes through is that both Churchill and Stalin attempted to maneuver and leverage their respective positions in the alliance with charm and personality. How critical were such things when weighed against the larger historical forces shaping their decisions?

 

The answer to that question depends on how you view the role of individuals in history. Generally speaking, historians think that individuals matter, and the more important and powerful the individual the more they matter. In The Hero in History (1943) Sidney Hook distinguished between what he called "eventful" individuals and "event-making" individuals. Eventful individuals are important because of their role in events, while event-making individuals shape and change the course of events. Hook’s key case-study of an event-making individual was Lenin’s role in 1917 when he transformed the character of the Russian Revolution and changed the course of world history. Had he published his book after World War Two, Hook could have added case-studies of Churchill and Stalin. Churchill’s role in keeping Britain fighting in 1940 was both eventful and event-making. Had he taken Britain out of the war, Hitler’s domination of Europe would have been secured and provided an even stronger springboard for his attack on the USSR in 1941, which, with Britain neutral, might well have succeeded.

 

Churchill’s immediate declaration of solidarity with the Soviet people when the Germans invaded the USSR in June 1941 was the first critical step in the formation of the grand alliance of Britain, the Soviet Union and the United States – one of the most effective war-fighting coalitions in history. Some would argue that British interests dictated an alliance with the Soviets against Hitler. That’s true, but clear only in retrospect. At the time there were those in Britain arguing for a more equivocal response to the Nazi invasion of the USSR, while others welcomed the prospect of Hitler crushing the Soviet communists. Large quantities of western military aid did not flow to the Soviet Union until 1943 but the Eastern Front was on a knife-edge in 1941-42 and every little bit helped. Important, too, in those early months of the Soviet-German war, was the positive impact on Soviet morale of the alliance with Britain, a psychological plus that was further strengthened by the US entry into the war in December 1941.

 

Another example in relation to Churchill is the impact of his personal opposition to opening a Second Front in France in order to relieve German pressure on the Red Army. The Soviets survived the absence of this second front in 1942 but it was a close-run thing. A second front in 1943 would have altered the military course of the war and might have had vast geopolitical consequences. Churchill had his reasons – in the book you can read for yourself the arguments between him and Stalin about this issue – and some people still think that delaying the second front until 1944 was necessary to avoid a costly failure that would have set back the allied cause. But there is no doubt about Churchill’s personal influence in being able to prevent an early second front.

 

Stalin as both an eventful and event-making individual during the war presents a paradox. On the one hand, he was a leader who brought his country to the brink of disaster by his handling of plans and preparations for war, notably by restraining mobilization in the face of the imminent German attack. While this was not Stalin’s sole responsibility, he alone had the power to change the course of events on the Soviet side. On the other hand, he was the leader who then saved his country by holding its war effort together, albeit by brutal and costly methods.

 

You describe how the preeminent capitalist powers of Britain and the US were able to collaborate with the preeminent socialist one, the Soviet Union — another paradox. What was the basis for this and to what degree did each entity need to compromise to make it work?

 

As a Marxist, Stalin believed that the material interests of Britain, the United States and the Soviet Union constituted a solid and durable basis for the anti-Hitler coalition. He also believed that the common interest in containing a resurgence of the German (and Japanese) threat meant the alliance could continue after the war. The ideological struggle between communism and capitalism would continue but in the context of peaceful coexistence and collaboration to maintain postwar security for all states. Stalin also thought that the growing popularity of communism in Europe and the global swing to the left would facilitate a peacetime grand alliance.

 

Churchill and Roosevelt had a more individualistic view of the foundations of the grand alliance – they had faith in Stalin as a moderate leader. At the same time, they perceived changes in the Soviet internal regime – a degree of convergence with western systems – which meant that a cooperative USSR would become more open to western influence.

 

The perceptions and beliefs of all three leaders were reinforced by their experience of the grand alliance, which was a history of successful negotiation and compromise. So successful that by the end of the war the Big Three, as they came to be called, were convinced it would continue into the foreseeable future. That didn’t happen but it wasn’t for want of trying, at least on Stalin’s part. Churchill lost office in July 1945 but when he returned to power in Britain in 1951 he was an advocate of renewed negotiations with the USSR, notwithstanding his reputation as an early-adopter cold warrior. And who knows what would have happened if Roosevelt had lived a little longer. Maybe he would have been able to restrain the advocates of a tough line with Russia, as he had done during the war.

 

Looking back 75 years on, what is the legacy of the ‘Grand Alliance’ today?

 

The immediate legacy of the Grand Alliance was mixed. On the one hand, it had defeated Hitler’s grab for world power and his attempt to establish a Nazi racist empire in Europe. A Europe of independent sovereign states was re-established after the war. The allied coalition, including the Soviet Union, had also fought the war under the banner of democracy and the allied victory did indeed reverse the trend towards authoritarianism which had gathered momentum following the world economic crisis of the late 1920s and early 1930s. On the other hand, this war for democracy had largely been won by another authoritarian state – the Soviet Union – and, with the outbreak of the cold war, Stalin was quick to establish a tightly controlled communist bloc in central and eastern Europe. The postwar failure of the Grand Alliance resulted in the cold war and decades of crisis, confrontation and conflict, not just in Europe but across the globe. That cold war is over but we are still grappling with its consequences.

 

Historical memory of the Grand Alliance, and of the popular anti-fascist unity that underpinned it, remains strong, particularly in Russia, where there is a yearning for a return to great power politics based on negotiation, compromise, mutual respect and trust.

 

The Life and Times of Flamboyant Rock Music Impresario Bill Graham

Baron Wolman. Jimi Hendrix performs at Fillmore Auditorium, San Francisco, February 1, 1968.

Gelatin silver print. Iconic Images/Baron Wolman

 

You remember rock music promoter Bill Graham. Sometime in your life you read about him, saw his photo in a newspaper or magazine or went to one of the thousands of rock and roll shows he produced from coast to coast. Even if you never went to one of his rock shows, you were influenced somehow by his work. Regardless of your age, you were a witness to the legacy of Bill Graham. You may think you know all about him—friend of Mick Jagger, late night confidant of Bob Dylan, pal of Janis Joplin, discoverer of the Grateful Dead, the Jefferson Airplane and Carlos Santana.

Whether you think you know all the details of Graham’s life and career, or know little, you will be dazzled by a mammoth and wildly pleasing new exhibit about his rich life, The Rock & Roll World of Impresario Bill Graham at the New-York Historical Society (running February 14-August 23).

The fun starts when you enter and they give you a headset. On the headset, for the one or two hours you spend at the exhibit you listen to some of the greatest rock and roll music ever played. You go from room to room to see, chronologically, Graham’s life as a promoter.

The Grajonca Family, Berlin, ca. 1938. Gelatin silver print. Collection of David and Alex Graham

 

Did you know that he was not a Bronx native, but was born in Berlin and spirited out of the country as a child in the 1930s by his mother (who would die at Auschwitz) as the Nazis took over, enduring a dangerous train and boat ride from Berlin to New York? Did you know that when he got to Lisbon on that journey, they asked him where he wanted to go to start a new life? He had no answer. “The United States?” an official said. “Yeah, sure,” he said and shrugged his shoulders. Thus was the legend born.

Graham’s love of music and promotion intrigued NYHS CEO Louise Mirrer, but so did his childhood. “Few know about Graham’s immigrant background and New York roots. We are proud to collaborate with our colleagues at the Skirball Cultural Center to present this exhibition in New York – Graham’s first American hometown—and to highlight his local experience. His rock and roll life was a pop cultural version of the American Dream,” she said.

That dream got started when he was the young manager of a street mime show. The performers got arrested and Graham staged a concert to raise bail. That was the beginning of a 40-year music career that ended suddenly when he died in a freak helicopter crash at the age of 60.

Graham started relatively small; tickets to one of his theaters, the Fillmore East, went for just $3.50 in 1970. In the late ’60s, you could buy a Sunday afternoon show ticket at the Fillmore West in San Francisco for $1 and get to see the Grateful Dead AND the Jefferson Airplane AND Carlos Santana. The enterprise grew, however. Graham put up 5,000 of those now-famous wild psychedelic posters in each city where he promoted a show and told the storekeepers that displayed the posters to keep them. The walls of the exhibit are covered with these posters featuring performers and friends such as Janis Joplin, Grace Slick, Mick Jagger and the Rolling Stones, Etta James, Lenny Bruce, the Grateful Dead, Carlos Santana, and Sam the Sham and the Pharaohs. Original copies of some of those posters are worth several thousand dollars today.

Sections of the exhibit tell the story of the Fillmore Auditoriums, East and West, and their short but eventful lives from 1968 to 1971, and of the Winterland Ballroom in San Francisco.

Visitors then pass through rooms telling the story of the huge, 50,000-seat arenas and upscale venues Graham booked for later shows. When Graham was looking for a theater to house a Grateful Dead concert in the 1970s, he turned down his own Fillmore, the Beacon, and Madison Square Garden to put the Dead onstage at Lincoln Center’s swanky, blue-blood Metropolitan Opera House (they sold out quickly).

Baron Wolman. B.B. King backstage at Winterland Auditorium, San Francisco, December 8, 1967. Gelatin silver print. Iconic Images/Baron Wolman

 

Graham’s social activism is also represented. Visitors can learn of a show he staged to raise money for poor children’s school lunches in California, which drew 50,000 music fans, among them Willie Mays and Marlon Brando. Graham’s concerts also introduced black blues and R&B musicians like Etta James to young white fans of the rock groups they had influenced. Once, B.B. King came to him and said he saw a “bunch of long-haired white people” on the ticket line. “I think they booked us in the wrong place,” he said.

There is social relevance, such as Graham’s campaign to stop Ronald Reagan from visiting a cemetery in Germany where Nazi soldiers were buried, and touching stories, such as the hundreds of letters he was sent upon the closing of one of his venues. The exhibit also showcases Graham’s personal friendships with musicians. Sometimes these friendships were tested by business. Graham described his time managing the Grateful Dead and the Jefferson Airplane as “the longest year of my life.” There are delicious personal stories, such as his relationship with Bob Dylan. “We heard that Dylan wanted absolute quiet around him all day on the day of a show, so I ordered everybody in the theater not to talk to him. Later at night he comes to my office and says ‘why is nobody talking to me?’”

Note from Donovan to Bill Graham, San Francisco, November 1967. Offset print with inscribed ink. Collection of David and Alex Graham. Photo by Robert Wedemeyer

 

The exhibit contains a wealth of visual and physical artifacts, including stage costumes and a dozen guitars played by musicians in Graham’s orbit. And there are walls full of the famous wildly colored psychedelic posters with all the writing that nobody could understand. “The only thing anybody is going to understand in that poster is the asterisk,” Graham complained to one artist, who, of course, paid no attention to him.

Gibson Guitar Corporation. 1959 Cherry Sunburst Gibson Les Paul played by Duane Allman of the Allman Brothers Band during live concert recording at Fillmore East, March 12–13, 1971. Collection of Galadrielle Allman. Photo by Robert Wedemeyer

 

Graham was a gregarious, flamboyant man. Actor Peter Coyote said he “was a cross between Mother Teresa and Al Capone.” The promoter always knew that in the rock music world he was working with oddballs. “I always felt that someone had to relate to reality. That someone was me.”

Ken Friedman. Bill Graham between takes during the filming of “A ’60s Reunion with Bill Graham: A Night at the Fillmore,” Fillmore Auditorium, San Francisco, 1986. Courtesy of Ken Friedman

 

And he created a marvelous reality for tens of millions of music fans.

 

 

Does Lincoln or Trump Represent the Conscience of the Republican Party?

 

Are we headed for another Civil War?

The idea might seem hard to imagine for a great many obvious reasons, and the prospect is easy to deride. And yet the years in which the “blues” and “greys” went to war can shed light upon America now. The comparison troubled me as I composed my new Lincoln biography, Summoned to Glory. It is hard to ponder Lincoln’s life and times without contemplating the condition of America today: divided, enraged, and almost coming apart at the seams.

And the comparison between Lincoln’s Republican Party and today’s Republican Party is especially troubling.

From the founding of the party through the end of the Civil War, the issue of slavery was paramount for Republicans, in spite of the diversity of motivations within their ranks. The party unified around the principle of outright emancipation by 1865 under Lincoln’s leadership. And Lincoln saw the issue as a commentary on the best and the worst of human nature--nothing less. “Slavery,” he proclaimed in 1854, “is founded in the selfishness of man’s nature, opposition to it in his love of justice. Those principles are an eternal antagonism.”

At the heart of Lincoln’s creed was the principle of egalitarian decency: the principle articulated by Jefferson in the Declaration of Independence. Lincoln urged Americans to rededicate themselves to that creed--the nation’s founding creed--and he said this again and again in the years leading up to the war. The war itself was a test to determine whether any nation so conceived and so dedicated could long endure.

He said this, of course, in the Gettysburg Address, and the words of that speech are quite famous. But he made the same point in much sharper language in the course of an earlier speech--one that he delivered on August 17, 1858 in his campaign against Stephen Douglas.

The Founding Fathers, said Lincoln, believed that nothing “stamped with the divine image and likeness” was meant to be “trodden on, and degraded, and imbruted by its fellows.” So they “erected a beacon to guide their children, and their children’s children.” They “knew the tendency of prosperity to breed tyrants, and so they established these great self-evident truths, that when in the distant future some man, some faction, some interest, should set up the doctrine that none but rich men, or none but white men, or none but Anglo-Saxon white men, were entitled to life, liberty, and the pursuit of happiness, their posterity might look up again to the Declaration of Independence and take courage to renew the battle… so that truth, and justice, and mercy… might not be extinguished from the land.”

That quotation makes poignant reading when today’s Republican Party is led by a man who takes pleasure in degrading and “imbruting” whole classes of people who, in his view, are human “scum.”

As for the proposition that only rich men, or white men, or Anglo-Saxon white men deserve life, liberty, and the pursuit of happiness, consider Trump’s actions toward Latin Americans who come here seeking those things. He wants to treat them like animals.

He vilifies undocumented immigrants at every opportunity and comes close to calling them subhuman. He even vilifies Americans who ask for humanitarian assistance--like the victims of the hurricane in Puerto Rico--if they happen to be people of color.

Lincoln was opposed to the “Know Nothings” who were trying to restrict immigration. He supported opportunity for all. Compare the position of Lincoln with the current Republican immigration policy, which actively frowns upon the pursuit of happiness by anyone who comes as a refugee to these shores unless they happen to be… whites. That was the gist of Trump’s reported remarks when he said that new immigrants ought to come from places like Norway instead of from . . . Africa or from “shithole” nations like Haiti.

Trump denies that he said this. But participants in the meeting in question insist that he did. And such sentiments are perfectly consistent with the public behavior that Trump displays all the time. His bigotry cannot be contained because it flows from the impulse that makes Donald Trump Donald Trump.

He likes to hurt and humiliate anyone he thinks is “beneath” him. Lincoln’s politics emphasized decency. Trump doesn’t know the meaning of the term.

Imagine the exercise of reciting Lincoln’s words in the presence of Trump:  nothing made in God’s image should be “trodden on, and degraded, and imbruted by its fellows.”  Would he pause long enough — in his tweeting of insults — to listen?  And even if he did pay attention to the words (presuming he understood them) what would he say?

We can easily imagine what Donald Trump would say.

And we can easily imagine the responses of Mitch McConnell, Lindsey Graham, and all the other Republican worshippers of power who avert their eyes from Trump’s deeds or else think up ways to excuse them.

Compare Lincoln’s politics to those of Donald Trump and you will see how far downhill the Republican Party has come. Abuse and degradation now flow from the party of Lincoln. The “better angels of our nature” are unknown in the party today. Republicans in Lincoln’s day insisted that the moral issue dividing the nation should be settled in favor of common decency before the country could unite. Common decency and American unity could not be farther away from the minds of Trump and his supporters.

Trump’s Republican Party pushes endless division and, rather than fostering a decent re-unification with an emphasis on mutual respect and common decency, the party sinks our country ever downward. Before very long, we may be mauling one another in the dark.

Scores of decent Republicans recoiled when Trump succeeded in his quest for the Republican nomination in 2016. “Anyone but Trump” was their slogan and some of them left the Republican Party. As the bestiality of Trump’s administration took shape, some Republicans like Jeff Flake and Bob Corker retired from Congress and indeed from politics in revulsion against what was happening.

But Trump’s steady quest for absolute power has proceeded, and only one lone Republican senator--Mitt Romney, the party’s 2012 presidential nominee--had the courage to take a stand in the Senate impeachment trial. All the others went along like sheep, either bleating out pathetic excuses or else remaining silent. They refused to hear the sworn testimony of witnesses. Several of them—Lamar Alexander and Pat Roberts, for instance—were retiring, so this was their now-or-never moment to consider their own place in history. 

They took the coward’s way out.

Perhaps they’re just tired old men. But Trump is quite old and yet he never seems to get tired. Perhaps they simply haven’t the stomach any longer for continued… “unpleasantness.” But that’s a justification that anyone can use—it’s something that anyone can say to themselves—if they decide to give up and let a brutal thug have his way.

Roberts chairs the Dwight D. Eisenhower Memorial Commission, which seeks to honor another Republican exemplar of decency. The connection between Roberts and Eisenhower and Trump is fraught with irony—to put it mildly. Eisenhower worked behind the scenes to defeat a demagogue who was just as vile as Trump:  Joe McCarthy. It bears noting that Trump learned the arts of demagoguery from none other than McCarthy’s top henchman Roy Cohn.

It is sad to anticipate the cant that Pat Roberts will probably intone when the Eisenhower memorial is dedicated. How can he live with himself after failing to take a stand when he had the opportunity? Perhaps he said to himself, “what good would it do if I took a stand—no one would listen.” Well, if others had followed the lead of Mitt Romney, who can say what events might have followed?  It might have triggered a catharsis.

If someone in the future writes a book like John F. Kennedy’s Profiles in Courage, Mitt Romney will be in it and Pat Roberts will not.

Theodore Roosevelt had a pithy way of describing defeatists. In his language of robust and swashbuckling action, he would probably say that the equivocal Republicans of today have a “yellow streak.”  Lincoln had something more exalted to say to the Republican Congress in 1862:  “Fellow citizens, we cannot escape history.... The fiery trial through which we pass, will light us down, in honor or dishonor, to the latest generation.”

Republicans:  you have drifted away from noble origins that would be the envy of any political party. You have cut yourselves off from a heritage of altruism and benevolence to obey the commands of a brute—a leader who, like his idol Mr. Putin, believes in nothing whatsoever but the grubby acquisitions that plunder can deliver into his foul and selfish little hands.

Awaken, Republicans, and make some effort to reflect!  If you value any principle, prove it by your words and your deeds. Prove to yourselves and to the world that there is more to life than raw power.

Do We Want the Progressive Sanders or the Pragmatist Biden?

 

Since my state of Michigan held its presidential primary on March 10, it was time to make a choice. The only realistic and sensible one was between Bernie Sanders and Joe Biden. They may be two old white guys, but no other candidates remained who had any chance of defeating another old white guy who happens to be one of our worst presidents ever.

There’s a lot to like about the progressive Sanders, including his realization of how serious our climate crisis is. But Biden is also “a solid contender on climate and environment.” As Robert Dallek has noted, Franklin Roosevelt (FDR) was both a progressive and a pragmatist. And the ideal 2020 Democratic presidential candidate would be both, but there is no denying that by historical standards Sanders is more progressive and Biden more pragmatic. So which should one choose?

 

I recommend using history and the need for political wisdom as a guide. First, let’s look at the changing meaning of the term progressivism. The Progressive Era (1890-1914) produced a diverse movement to assert and protect a public interest, both economic and social, against unfettered capitalism. It did not attempt to overthrow or replace capitalism, but to have government bodies and laws constrain and supplement it in order to ensure that it served the public good.

 

Progressive reforms included passing laws improving sanitation, education, food and drug safety, housing, and workers’ rights and conditions, especially for women and children. Progressive efforts also helped create the National Park Service and a graduated federal income tax (16th Amendment). In addition, they reduced corruption in city governments, limited trusts and monopolies, expanded public services, and worked to aid the poor and secure the vote for women, which was not achieved in presidential elections until 1920.

Importantly for our discussion, progressivism was a broad coalition that included both the bellicose Republican (and president from 1901 to early 1909) Theodore Roosevelt and the likes of Jane Addams, a radical reformer and pacifist. It was Addams’s pragmatism that led her to shun political labels and seek alliances to advance social progress.

The 1920 presidential election brought the Republican Warren Harding to office, and for the next twelve years the presidency was held by Republicans, but then the progressive, pragmatic Democrat Franklin Roosevelt came to office and remained there until his death in 1945.

I have described progressive values as similar to the values of political wisdom. They include the proper mix of realism and idealism, love, compassion, empathy, self-discipline, passion, courage, persistence, peace, optimism, humility, tolerance, humor, creativity, and an appreciation for beauty and the need to compromise.

Like progressivism, pragmatism has a rich tradition in the USA. Walter Isaacson wrote that while popular opinion of the founders emphasizes high principle, deliberation on the Constitution in 1787 “showed that they were also something just as great and often more difficult to be: compromisers.” Isaacson also stated that for Franklin “compromise was not only a practical approach but a moral one. Tolerance, humility and a respect for others required it…. Compromisers may not make great heroes, but they do make great democracies.”

Recent Democratic presidents—Truman, JFK, Lyndon Johnson, Jimmy Carter, Bill Clinton, Barack Obama—have embodied this pragmatic impulse. As former President Bill Clinton said, “This is a practical country. We have ideals. We have philosophies. But the problem with any ideology is that it gives the answer before you look at the evidence.” And historian James Kloppenberg has emphasized the centrality of pragmatism to Obama’s politics. This type of pragmatism “challenges the claims of absolutists--whether their dogmas are rooted in science or religion--and instead embraces uncertainty, provisionality, and the continuous testing of hypotheses through experimentation,” in order to see what works.

Thus, both progressivism and pragmatism can claim to be authentically American approaches, but where does that leave us regarding the Sanders/Biden choice? Has Sanders demonstrated enough pragmatism? Unfortunately, I do not think so. The political-wisdom qualities of compassion, empathy, passion, courage, persistence, and love of peace are certainly there. But the proper mix of realism and idealism, creativity, humility, tolerance, and a willingness to compromise--not so much.  

Today the USA is badly split between pro- and anti-Trumpers and is likely to remain split between right and left politicians. We are a country that has traditionally favored the pragmatic over the ideological. The reality is that almost all the former Democratic contenders for the 2020 nomination now support Biden, and he, rather than Sanders, is more likely to attract independents and dissatisfied Republicans and be able to work with a new Congress to achieve results. We need a unifier, not an ideologue who will keep us divided.

By sticking too closely to an ideological democratic-socialist message, Sanders has failed to display the creativity that historian R. B. Bernstein (in his The Founding Fathers Reconsidered) wrote of regarding those men, as well as FDR and his supporters who “reinterpreted the Constitution’s origins, stressing the founding fathers’ creative experimentation, which they sought to foster in the new nation.” 

 

Regarding humility, tolerance, and a willingness to compromise, they all go together. Humility should tell us not to be dogmatic, that we do not have all the answers. We should tolerate the views of others not only for reasons of love, compassion, and empathy, but also because we might learn something from different views. Compromise implies both humility and tolerance, plus the realization that to achieve the common good in a democratic society compromises will often be necessary. 

 

Sanders has stressed a we-versus-they approach. In a speech after Super Tuesday he made it clear--as he often has--who is on each side. The “we” includes those “who in many cases are working longer hours for lower wages . . . people who have not traditionally been involved in the political process.” It also includes many of those favoring his Medicare-for-All plan, those suffering from a “dysfunctional and cruel healthcare system in which we are spending twice as much per person on healthcare as are the people of any other country and yet we have 87 million Americans who are uninsured, underinsured.” Not specifically mentioned, but a big part of the “we,” are the millions of young people, many of whom would benefit from the free college tuition and the canceling of “all student loan debt” that Bernie’s web site promises.

 

Among the “they” are the “corporate establishment” and the “political establishment,” as well as “Wall Street and the drug companies and the insurance companies and the fossil fuel industry and the military industrial complex and the prison industrial complex, and most of the 1%.” Not mentioned are all those who voted for Trump in 2016, including a majority of Protestant and Catholic voters, and still hold views contrary to Bernie’s. 

 

Bernie’s approach is not a unifying one, not one that says, “Let’s reason together and see what we can achieve for the common good.” However enlightened his ideas might be in theory--and I think many of them are noble--a majority of U.S. citizens and their politically diverse legislators are not yet ready to back them. He advocates, in his own words, “a political revolution,” and revolutions are more divisive than unifying--think of the civil war that followed the Russian revolution. Biden insists that voters “aren't looking for revolution, they're looking for results." And the primary results heretofore (through Mar. 10) bear him out. 

 

In essays of 2012 and 2015 about Obama’s pragmatism vs. Republican dogmatism I criticized Republicans for not being open enough to policies that might further the common good. Bernie’s supporters in 2020 tend to be more ideological and less pragmatic than other Democratic primary voters, and the danger is real that if Bernie is not the Democratic nominee many of them might not vote for whoever is. But they should think twice. 

 

Biden is not yet that nominee. Things could still change. A poor Biden debate performance. Stunning upsets in some big yet-to-vote states. Unknown political effects of the coronavirus. Who knows? But one thing is certain. Our country needs a president who is more progressive and pragmatic than Trump, who is neither, but only an unprincipled opportunist. Either Biden or Sanders would be better than Trump, and Democrats need to rally around whoever it is. But because his pragmatic style better matches the American tradition, Biden has a better chance of unifying our nation and delivering positive long-range results. An enlightened vote is one that considers the common good, not just that of those who think like us.

 

Roundup Top Ten for March 13, 2020

Coronavirus School Closings: Don’t Wait Until It’s Too Late

by Howard Markel

In the history of medicine, we have never been more prepared to confront this virus than we are today. But this history also teaches us that when it comes to school closings, we must always be ready to act today — not tomorrow.

 

Joe Biden Personifies Democratic Party Failures Since the Cold War

by Michael Brenes

The Democratic Party needs to escape the shadow of anti-communism and embrace economic and racial justice.

 

 

Liberal Activists Have to Think Broadly and Unite Across Lines

by Matthew D. Lassiter

Fifty years before Greta Thunberg, students at the University of Michigan organized a Teach-In that paved the way for Earth Day demonstrations that mobilized 20 million people in 1970.

 

 

Coronavirus and the Great Online-Learning Experiment

by Jonathan Zimmerman

Let’s use this crisis to determine what our students actually learn when we teach them online.

 

 

The History of Slavery Remains With Us Today

by Ariela Gross and Alejandro de la Fuente

Two historians trace how law and institutions developed around anti-black ideology in the Americas.

 

 

We’ve Been Looking in the Wrong Places to Understand Sanders’s Socialism

by Richard White

Detractors like to equate Senator Bernie Sanders’s socialism with Soviet and Chinese Communism, but they’re swinging at the wrong century, the wrong country and the wrong socialism.

 

 

My Abortion Before Roe v. Wade

by Elizabeth Stone

Roe v. Wade is in peril, flinging me back to a terrifying time in my own life, one I never expected women today would have to face.

 

 

I Helped Fact-Check the 1619 Project. The Times Ignored Me.

by Leslie M. Harris

The paper’s series on slavery made avoidable mistakes. But the attacks from its critics are much more dangerous, argues historian Leslie M. Harris.

 

 

The Latest Battle over the Confederate Flag Isn’t Happening Where You’d Expect

by Megan Kate Nelson

Confederate actions in the Far Western theater of the war reveal the extent to which the Confederate flag became a symbol of white supremacy and conquest.

 

 

There’s a Complex History of Skin Lighteners in Africa and Beyond

by Lynn M. Thomas

The politics of skin colour in South Africa have been importantly shaped by the history of white supremacy and institutions of racial slavery, colonialism, and segregation. My book examines that history.

 

The Gross Clinic

HNN Introduces Classroom Activity Kits

As part of a new initiative to merge the worlds of education and journalism, HNN is introducing a series of Classroom Activity Kits.

 

These kits, all crafted by undergraduate students at George Washington University, are designed to illustrate the relationship between current events and history. Each kit consists of a complete 45-60 minute lesson plan which requires little to no preparation on the part of the instructor. These lessons are designed for high school and college students, but they can be altered to suit other education levels.

 

During these lessons, students will use news articles as a tool for understanding a specific history. In doing so, students will engage in critical thinking, group work, text analysis, research and more. This first batch of activity kits covers the history of U.S. immigration, climate change, sports activism, and the U.S. prison system.

 

Aside from the educational benefits to students, these kits are also a great resource for educators. All you need to do is click one of the download links below and boom! – you have a fully formed unique activity.

 

HNN’s Classroom Activity Kits are part of a continuing effort to bring news into the classroom.

 

Click on each link below to learn more and download the activity kit. 

 

 

Classroom Activity Kit: The History of Climate Change

What do farmers from the 1950s, anti-smoking campaigns and climate change have in common? Download this Classroom Activity Kit to find out.

 

Classroom Activity Kit: The History of U.S. Immigration

This Classroom Activity Kit teaches students about U.S. immigration history while also highlighting their personal histories.

 

 

Classroom Activity Kit: The History of Sports Activism

Discussing athletes from Jackie Robinson to Colin Kaepernick, this Activity Kit teaches students about the history of political activism in sports.

 

 

Classroom Activity Kit: The History of Private Prisons in the U.S.

Download this Classroom Activity Kit to teach students about the history of the American prison system.

 

 

Classroom Activity Kit: The History of Climate Change and Activism

Download this Classroom Activity Kit to help students understand climate change activism in its historical context.

 

A Founder of American Religious Nationalism

Right around the time the House began its impeachment inquiry, the homepage of the U.S. Department of State featured a talk by Secretary of State Mike Pompeo titled “Being a Christian Leader.” Only a few weeks had passed since Attorney General William Barr told students at Notre Dame Law School that “secularists” are to blame for “moral chaos” and “immense suffering, wreckage, and misery,” and that “Judeo-Christian moral standards are the ultimate utilitarian rules for human conduct.” Then, at a January campaign rally at a Miami megachurch, President Donald Trump told the largely evangelical crowd that God is “on our side.”

 

Most of us have a sense that this kind of religious-nationalist rhetoric and behavior got its start with the revolution that Reagan brought to power. A decisive moment was in August 1980, at the Reunion Arena in Dallas, Texas, when Reagan addressed 15,000 pastors and religious activists. “I know that you can’t endorse me,” but “I want you to know that I endorse you and what you are doing,” he said, to wild applause.

 

Reagan’s speech at the Reunion Arena marked a sea change in the role of conservative religion in American politics. But some of those who had helped organize the event were concerned that the one individual who deserved the most credit for the transformation of the interface of politics and religion was not on the podium. “We agreed that it was unfortunate that Rousas Rushdoony was not speaking,” radical theologian Gary North later observed, recalling an exchange with Robert Billings, a Reagan campaign staffer who had previously served as executive director of Jerry Falwell’s Moral Majority organization. Billings responded, “If it weren’t for his books none of us would be here.”  “Nobody in the audience understands that,” North countered. “True, but we do,” Billings reportedly replied. 

 

Howard Phillips, a former Nixon administration aide who was also present at the Reunion Arena, called Rushdoony the “most influential man of the 21st century.” As he confided to author and religious studies professor Julie Ingersoll in 2007, “The whole Christian conservative political movement had its genesis in Rush.” 

 

Rousas John Rushdoony was born in 1916 to Armenian immigrants who had narrowly escaped the genocide, in which as many as 1.5 million Armenians were murdered by Turks of the Ottoman Empire. Rushdoony’s father, who founded the Armenian Martyrs Presbyterian Church in Kingsburg, California, ministered to a community of fellow Armenian refugees, who agonized and grieved as letters from relatives back home came to a standstill. “In Armenia, there was no neutral ground between Islam and Christianity,” Rushdoony wrote in 1997. “And I came to realize there is no neutral ground anywhere.”

 

Rushdoony left his family home and made his way to college at the University of California, Berkeley. He did not fit in. Advised to read the classics, Rushdoony later called this “the ugliest experience of my life.”  He pronounced the works of Shakespeare, Homer and the rest of the canon “classics of degenerate cultures. What they offer at their best is evil.”

 

Rushdoony emerged from Berkeley with all the distinctive features of his intellectual persona in place: a resolutely binary form of thought that classified all things into one of two absolutes; a craving for order; and a loathing of the secular world. 

 

Rushdoony began to advocate for a return to “biblical law” in America. The Bible, Rushdoony said, commands Christians to exercise dominion over the earth and all its inhabitants. Women are destined by God to be subordinate to men; men are destined to be ruled by a spiritual aristocracy of right-thinking Christian leaders, and public education is a threat to civilization for it promotes a “secular world-view.” In over thirty books and publications, including The Messianic Character of American Education and The Institutes of Biblical Law – often hailed as his magnum opus and recommended as one of the Choice Evangelical Books of 1973 by evangelical flagship journal Christianity Today -- Rushdoony laid it all out in a program he called Christian Reconstruction. 

 

There is little mystery about the historical sources from which Rushdoony drew his own inspiration. He laid out all the details in his works. Setting aside the hardline Dutch Reformed theologians who supplied the backbone of his thought, Rushdoony drew on two traditions that would prove essential in understanding the genesis of today’s Christian nationalist movement. The first was the proslavery theology of America’s antebellum preachers. The second was the economic libertarianism that took root in reaction to the New Deal. 

Among apologists for Christian nationalism today, the favored myth is that the movement represents an extension of the abolitionism of the nineteenth century and perhaps of the civil rights movement of the twentieth century, too. Many antiabortion activists self-consciously refer to themselves as the new abolitionists. Mainstream conservatives who lament that the evangelicals who form Trump’s most fervent supporters have “lost their way” suggest that they have betrayed their roots in the movements that fought for the abolition of slavery and the end of discrimination. But the truth is that today’s Christian nationalism did not emerge out of the movement that opposed such rigid hierarchies. It came from the one that endorsed them.

Rushdoony understood this well. Not long after escaping the horror of Berkeley, he took an interest in the work of Robert Lewis Dabney, a defender of slavery before the Civil War who also supported patriarchy and the American form of apartheid after the war.

Rushdoony reprinted and disseminated some of Dabney’s works through his Vallecito, California-based Chalcedon Foundation, as well as through his publishing company, Ross House Books. He found himself agreeing with Dabney that the Union victory was a defeat for Christian orthodoxy. In Rushdoony’s mind, Dabney’s great adversaries, the abolitionists, were the archetypes of the anti-Christian rebels – liberals, communists, secularists, and advocates of women’s rights – who continued to wreak havoc on the modern world. As Rushdoony’s fellow Reconstructionist C. Gregg Singer put it, proslavery theologians including Dabney, Thornwell and their contemporaries “properly read abolitionism as a revolt against the biblical conception of society and a revolt against divine sovereignty in human affairs.” Rushdoony himself concluded, “Abolitionist leaders showed more hate than love on the whole.” The defeat of the orthodox side in the Civil War, Rushdoony asserted, paved the way for the rise of an unorthodox Social Gospel. 

Rushdoony’s admiration for southern religious orthodoxy was such that he adopted a forgiving attitude toward certain forms of slavery. In books such as Politics of Guilt and Pity and The Institutes of Biblical Law, which is essentially an 890-page disquisition on “the heresy of democracy” and the first of a three-volume series under the same title, he makes the case that “the move from Africa to America was a vast increase of freedom for the Negro, materially and spiritually as well as personally.”

“Some people are by nature slaves and will always be so,” Rushdoony muses, and the law requires that a slave “recognize his position and accept it with grace.”

One of Dabney’s pet peeves was the provision of public education to Black children, whom he referred to as “the brats of black paupers,” which required unjust (in his view) taxation of “oppressed” “white brethren.” Rushdoony sympathized with Dabney’s point of view on public education, and here began to fuse his views with a small-government ideology. “State supported and controlled education is theft,” he wrote, and called the claim of ownership to the lives of citizens by a “humanist state” slavery, too.

Rushdoony did not agitate for the literal enslavement of Black Americans in his time. But his fascination with proslavery theology was no passing fancy. The idea that the United States is a Redeemer Nation, chosen by God; that it is tasked with becoming an orthodox Christian republic in which women are subordinate to men, education is in the hands of conservative Christians, and no one pays taxes to support Black people; that at some point in the past the nation deviated horribly from its mission and fell under the control of atheist, communist, and/or liberal elites—the stuff of proslavery theology was the life of Rushdoony’s political thought.

Rushdoony soon found an even greater source of inspiration in the libertarian economic thinkers who emerged to beat back the New Deal. He was very much taken with figures like James W. Fifield as well as members of the Austrian school of economics, and began to churn out works arguing that the modern welfare state was “organized larceny” and “capitalism is supremely a product of Christianity.”

In this Christian-libertarian vision, Rushdoony saw the foundation for a thoroughly religious – or better, theocratic – understanding of the American republic. In Rushdoony’s telling, it was not the intention of America’s founders to establish a nonsectarian representative democracy. The First Amendment, he argues, aimed to establish freedom “not from religion but for religion.” “The Constitution was designed to perpetuate a Christian order,” he wrote.

 

The idea that the United States is a “Christian nation” – not just in the sense that its population was originally mostly Christian but that it was intended to serve as part of a Christian world order – was Rushdoony’s central contribution to the religious right. 

 

Rushdoony had many prominent admirers, among them evangelical leader D. James Kennedy, whose ministry received millions of dollars in donations from the DeVos family. Kennedy popularized Rushdoony’s ideas through multiple sermons and publications, exhorting attendees at a 2005 conference organized by his ministry to “exercise godly dominion” over “every aspect and institution of human society.” The ultraconservative Catholic leader Richard John Neuhaus, who worked to broker a conservative evangelical and conservative Catholic alliance, pointed out that Rushdoony’s “theonomy,” or the idea of a social and political order rooted in “biblical law,” has “insinuated itself in circles where people would be not at all comfortable to think of themselves as theonomists.” He also attested, in 1990, to “increasing encounters with ideas clearly derived from Christian Reconstructionism even among conservatives in the mainline/oldline churches.”

 

Today, the ideas of Rushdoony and his fellow Reconstructionists have penetrated into evangelical and conservative Catholic circles that are, quite often, unaware of their original sources. “Though we hide their books under the bed, we read them just the same,” as one observer put it to Michael J. McVicar, author of Christian Reconstruction: R. J. Rushdoony and American Religious Conservatism.

To be clear, the Christian right is large and diverse in its specific theologies. Many of its representatives know very little about R.J. Rushdoony and others take pains to distance themselves from him. Some of his extreme positions, such as the idea that homosexuals, blasphemers and adulterers are all worthy of the death penalty, have been loudly repudiated by conservative leaders. Yet it is difficult to understand the ideological origins and structure of Christian nationalism in America today without taking into account Rushdoony’s ideas. 

 

Rushdoony’s work is touted today by some of the leading personalities and policy groups on the Christian right. Perhaps the most telling example comes from David Barton, whose efforts to reframe our constitutional republic as a Christian nationalist enterprise are at the center of so many of the movement’s cultural and legislative initiatives. Though perhaps not in a formal sense a Reconstructionist, Barton dances around many of Rushdoony’s defining ideas, even on the question of slavery. 

In a paper titled “The Bible, Slavery, and America’s Founders,” posted on the WallBuilders website, Barton cites Rushdoony extensively, and argues that “in light of the Scriptures, we cannot say that slavery, in a broad and general sense, is sin. But this brief look at the Biblical slave laws does reveal how fallen man’s example of slavery has violated God’s laws.” 

Where Barton strikes out on his own, it is to take a swipe at modern, liberal government as a form of slavery, a gesture that Rushdoony surely would have endorsed. “Since sinful man tends to live in bondage, different forms of slavery have replaced the more obvious system of past centuries,” Barton explains. “The state has assumed the role of master for many, providing aid and assistance, and with it more and more control, to those unable to protect themselves.” In a 2018 blog post, Watchmen on the Wall, the Family Research Council’s alliance of an estimated 25,000 pastors, praised Rushdoony as a “powerful advocate for the Christian and homeschool movements across America” who “challenged Christian leaders of his day to stand on biblical truth in the public square.” Rushdoony is a foundational thinker whose ideas continue to speak, long after his death in 2001.

For a long time now, critics have viewed America’s religious right as a social or cultural movement. They assume that it represents a reaction to modern, secular culture, and that it speaks for a large mass of disaffected conservative evangelicals and others who are preoccupied with issues of concern to the family. But that simplistic interpretation is plausible only for those who do not actually listen to what its leaders have to say or trace its ideas to their primary sources. Christian nationalism today is a political movement, and its primary goal is power. Its ultimate aim is not just to win elections but to replace our modern constitutional republic with a “biblical” order that derives its legitimacy not from the people but from God and the Bible – or, at least, the God and the Bible that men like Rushdoony claimed to know.

A Price to Be Paid

 

None of us had any idea how terrible the cost would be before we finished taking Okinawa. The casual observer might have concluded that the landing was so easy, war must be a walk in the park. Some park! 

As the postwar decades passed, I realized that the average American’s view of death was shaped by movies and television shows. In thirty minutes, you’d watch a number of people get killed, break for a couple of commercials, get a beer out of the fridge, change the channels, and watch another detective kill some other guys. Finally, you’d turn the set off and go to bed. All those killings were imaginary, impersonal, abstract; they meant nothing. Just show business. 

The situation changes when the dead are your buddies, a brother, a friend whom you’d played poker with and shared stories of what you hoped to do when you got home. The cost of war mounts up and takes a staggering emotional toll on every soldier. 

Weeks after we had come ashore on Okinawa and fought a good number of battles, Sergeant McQuiston and I were in my jeep driving in the opposite direction from the front lines and following a set of tracks that our boys had cut through the remnant of what had once been a footpath of some sort. The smell of war hung in the air, but by this time we’d gotten used to it and just kept going. We happened onto an area where bodies of soldiers were kept, awaiting burial. After men fell in battle, they were usually carried to the rear and eventually ended up in black body bags. 

Somebody had set up a shelter under tarps tied up on wooden poles. Underneath, corpses were stacked like firewood. The pile must have been three to four feet high. The soldiers’ dog tags were fastened on the end of the zippers, the only remaining identification of who they had been. 

McQuiston brought the jeep to a halt. The anonymity of the body bags only added to the pathos. Could have been us. We sat there staring at that heap of cadavers. Each one was someone’s son, brother, maybe fiancé, and now they were gone. Forever gone. We stared. 

No break for a television commercial there. 

Lieutenant Colonel Ed Stare’s Third Battalion of the 383rd Infantry made the most spectacular advance of the day pushing south from the landing beaches. Company I marched along the coastline until they stopped for a flag raising. Someone had brought an American flag from the States. The soldiers hoisted the flag to signal American sovereignty; the men cheered as the flag went up. Let the Japanese think on that one for a while! 

I wasn’t with them at the time, but later the men kept talking about what followed. The experience proved to be so vivid, I couldn’t forget it. The soldiers shared the struggle with me in great detail. 

The battalion kept moving; they knew a river was ahead. The men were prepared for a difficult time crossing the waterway. It could be a big-time problem, as all the enemy had to do was set up machine guns on the other bank and fire at the approaching soldiers. Amazingly enough, the Japanese had forgotten to destroy the bridge. Our men rolled across without missing a beat. They hadn’t gone far up the road when they finally ran into the opposition. About twenty Japanese had set up a machine-gun nest and were waiting for our boys. Company I hit the ground and began returning the fire. 

Captain Gordon Wheeler yelled that they’d found the enemy. 

The gunfire went back and forth while the battalion tried to position itself. Bullets were flying in rapid fire. Soldiers dropped behind boulders or downed trees. 

A soldier called back that there weren’t many of them but they were dug in below ground level. We needed to hit ’em from both sides. 

The enemy answered with a blast of machine-gun fire. 

Wheeler called for the unit to spread out, to split up and work both angles. 

The battalion divided on the flanks and started working their way around the opposition. A blast much larger than a simple machine gun ripped through the air. Sounded like they had a 155mm artillery piece over there. 

Two American soldiers fell to the ground. A third man screamed and rolled over into a ditch. For thirty minutes intense fire went back and forth. The unit didn’t seem to be making any progress. Casualties were quickly mounting. 

The voice of Captain Wheeler echoed through the trees that he needed help. He’d caught it in the leg and couldn’t walk. 

Someone called for a medic. 

An anonymous man called from the left to keep the captain on the ground. 

Men began pulling in behind where Wheeler lay sprawled. 

The medic called for them to drag him out and get him behind shelter. Three men pulled the captain across the grass and finally propped him up behind a large boulder. 

Wheeler groaned that he’d gotten smoked in the thigh and the pain was killing him. 

The medic gave him a shot of morphine. He worked feverishly over the captain, telling him they’d quickly get him out of there. 

The soldiers around Wheeler nodded, but a burst of machine-gun fire sent them diving to the ground again. A sergeant crawled over and informed them that they had killed some of the enemy but the rest, though outnumbered, were still dug in below ground level. The battalion couldn’t hit them, and trying would only get them killed. They needed an ace in the hole. Something different. 

The radioman crawled over with the phone pack on his back. The sergeant jerked the receiver off and reported the problem. Captain Wheeler had been hit and so had a number of other men. The Japanese weren’t budging. They needed a tank or something big of that order. A weapon that packed a punch. The sergeant listened for a moment before hanging up. 

In five minutes, a soldier poked through the trees carrying a long black tube at his side. He said Command had sent him and he had to make it fast because “I’m busier than a one-legged tap dancer over there in my unit.” 

The sergeant asked what he’d brought. The soldier answered that he’d show them. 

The sergeant beckoned him to follow and they inched through bushes until they stopped behind a fallen tree closer to the enemy. The sergeant pointed out a machine gun there on ground level that was their problem. 

The soldier nodded and told them, “Don’t stand behind me. The bazooka has a considerable back blast.” 

The soldier lowered himself almost to ground level. Abruptly, the enemy fired a round or two. The sergeant and the bazooka man could hear them jabbering. 

After putting his finger to his mouth to signal silence, the soldier rested the long, recoilless antitank launcher on the fork of a scrubby tree. For a second, he carefully aimed low, and then he pulled the trigger. A ball of fire spiraled upward before smoke covered the area around the enemy’s foxhole. The Japanese were history. 

The sergeant couldn’t believe the bazooka had knocked them out with one shot. 

The two men started inching backward until they rejoined the rest of the squad. 

The Japanese wouldn’t be bothering them anymore. The bazooka man saluted and disappeared through the trees and scrub just as he had come in. 

By then Captain Wheeler had passed out. Apparently, morphine had done the trick but his leg continued to bleed heavily. The sergeant took a second look. 

The sergeant knew that they had to get the captain out of there. A couple of the men carried him out. It was getting late enough in the afternoon that they ought to have been digging in for the night. Fortunately, none of them got hit with that 155mm. 

The sergeant who told me this story paused then, caught his breath, and gritted his teeth, cursing the loss of their men. The survivors knew that Wheeler would end up in a body bag with a hundred other guys all laid out under a tarp. 

Nobody said anything. 

At night we’d sit around and talk, sharing this kind of story. I remember thinking how we were really processing what might happen to us. 

Jack Welch Was a Bitter Foe of American Workers


 

With the United States firmly in its second Gilded Age, the creators of this period of epic income inequality, growing racial violence, and the undermining of our democratic institutions are already beginning to pass from the scene. One of these is Jack Welch, who died last week at the age of 84. Most of the national obituaries of Welch focused on his outsized personality and aggressive management style. Those things did help change the trajectory of American capitalism, but the obituaries left out or downplayed Welch’s greatest impact in shaping the unequal and unfair America of today: unionbusting. 

 

For many years, General Electric was one of the largest corporations that had a stable relationship with its unions. Under Gerald Swope’s leadership, GE had come to terms with labor unions in a system of collective bargaining that created a profitable stability. While Swope’s successors at GE were not as favorable to unionism as he, the company was a stalwart of union power for decades.  

 

When Jack Welch took over GE, he brought a very different perspective to the question of unions. He wanted them out. He was no outsider; he had worked for GE for twenty years before becoming CEO. But rather than learn how unions can sustain a workplace where workers feel valued, where they have safe and healthy jobs, and where they make wages that allow them to live with dignity, the lesson Welch took was that unions got in the way of maximizing profits. That Welch came to power the same year that Ronald Reagan ushered in the new era of unionbusting by firing the air traffic controllers in 1981 was coincidence, but a fitting one. 

 

Moreover, Welch’s rise coincided with the moment when American corporations began moving millions of industrial jobs overseas, helping to undermine organized labor and reorient the nation to concentrate wealth in the top 1 percent of income holders, which very much included Welch and other GE executives. He made GE one of the leaders in corporate mobility. As he infamously said, “Ideally, you'd have every plant you own on a barge.” He closed union plants in the north and reopened them in the non-union south or overseas. The era of unfettered capital mobility went far to undermine the postwar stability of the working class. American steel companies laid off forty percent of their workers between 1979 and 1984, while the United Auto Workers lost half its members between 1970 and 1985. As late as the 1990s, there were tens of thousands of jobs in textile plants in the South. But these were the last remnants of a once robust manufacturing base. Beginning in 1995, Fruit of the Loom closed a series of Alabama mills and moved production to Mexico, Central America, and the Caribbean, a process that continued through 2009, when its last two factories closed, laying off 270 workers. Jack Welch was not the only person responsible for the outsourcing of union jobs, but no one celebrated it with more obnoxious fervor. 

 

Welch believed that unions destroyed the American economy, stating in a 2009 forum that there were no competitive unionized industries in the United States. Between 1980 and 1985, GE’s workforce plummeted from 411,000 to 299,000 workers. The overall percentage of unionized workers in the company fell from 70 percent when Welch took over to 35 percent by 1988. That Welch himself rose from nothing, the son of Irish-Americans who did not graduate from high school, only bred greater contempt. If he could become rich and famous, why couldn’t all the other working class kids? That his personal actions foreclosed that possibility for millions of others hardly entered his mind, as it didn’t bother earlier generations of formerly working class capitalists such as Andrew Carnegie. Welch personified the survival of the fittest, at least for himself. 

 

Welch’s impact on upstate New York was devastating. GE’s home of Schenectady, New York, once a thriving industrial city, became a deindustrialized shell filled with crime and poverty thanks to GE layoffs. When GE pulled out of its Fort Edward, New York electrical capacitor plant in 2013, after Welch’s retirement but with his influence still strong, Chris Townsend, political director for the United Electrical Workers, stated in fury, “It shouldn’t be easy to close a plant. The General Electric corporation has been shown every imaginable consideration. Our members have worked with the company to keep this plant profitable. Now the company decides to walk off, leave hundreds of people stranded with no jobs, no income.” That, compounded by hundreds of other closures and layoffs, is the legacy of Jack Welch. 

 

Welch’s entire career was class warfare: the rich against the poor. The only worthwhile value for Welch was the quarterly profit report. Anything that got in the way of that—especially an unproductive division of workers—was happily sacrificed for that goal. For all this, Fortune magazine named Welch “manager of the century,” a symbol of how toxic American corporate culture had become by 1999. Whereas workers were placed into competitions to cull the least productive, Welch helped bring in the era of outsized CEO pay that far outstripped that of average workers, recreating the world of the late nineteenth century where obscene wealth for a few was overtly built on the poverty of the many. 

 

Today, we are living in the world Jack Welch made, one in which a racist, proto-fascist, self-proclaimed billionaire whose businesses have worked with organized crime figures is president, governing by the New Gilded Age principles of letting business rule itself and eviscerating labor, consumer, and environmental regulations. In fact, while Welch criticized Donald Trump’s chaotic management style, he also loved how Trump had governed in favor of business and warned that impeaching the president would “blow the markets away.” Impeachment had no impact on the market, but Jack Welch being wrong about America was the cornerstone of his career. His life immeasurably hurt this nation and should not be mourned. Instead, his methods and beliefs should stand as symbols of the decline of America in the twenty-first century. 

Who Killed Ralph Featherstone?

 

Fifty years ago, just before midnight on March 9, 1970, a bomb demolished a car carrying two young men approximately 20 miles north of Baltimore and two miles south of the courthouse in Bel Air, Maryland, where the black radical H. Rap Brown was set to go on trial the following day. 

The explosion was so severe that authorities could not identify the remains of one of the car’s passengers. At first, many speculated that Brown, who was being tried for inciting a riot in Cambridge, Maryland, in 1967, had been in the car. It was quickly announced that Ralph Featherstone, a veteran Student Non-Violent Coordinating Committee (SNCC) activist, had been killed by the blast. Based on a small sampling of skin fragments, medical examiners subsequently identified William “Che” Payne, another SNCC veteran, as the second victim. 

Almost immediately, two opposing explanations for their deaths emerged. Framing the incident as part and parcel of a wave of terrorist acts, authorities, the national media and most local news outlets cast Featherstone as a “bitter revolutionary” and uniformly concluded that Featherstone and Payne had blown themselves up with a bomb that they intended to use to disrupt Brown’s trial.

 Numerous press outlets, including Time, The Washington Post and the NBC Nightly News, punctuated this interpretation of the Bel Air bombing by quoting a poem allegedly found on Featherstone’s body, which read: “To Amerika: I’m playing heads-up murder. When the deal goes down I’m gon [sic] be standing on your chest screaming like Tarzan. Dynamite is my response to your justice.”

In contrast, virtually everyone who knew them, from family members to movement colleagues, as well as the black press, insisted that Featherstone and Payne had been assassinated, perhaps inadvertently by those who sought to kill Brown. Concomitantly, they complained about the press’s coverage of the incident.

Shortly before the bombing, William Kunstler — who was serving as director of the ACLU at the time — had argued in favor of removing the trial from Bel Air because of the poisonous pre-trial coverage by the local press and evidence that hundreds of whites had begun to arm themselves. The day after the bombing, Kunstler and local black activists contended that Featherstone and his yet-to-be-identified passenger had been victims of an “ambush laid by the KKK.”

Charlotte Orange-Featherstone, Featherstone’s wife of just a few weeks, expressed outrage over the way the white press “took the side of the police and the FBI.” “They never talked to me,” she told reporters from the Pittsburgh Courier, one of the nation’s most esteemed African-American newspapers. “They accepted the white man’s lies and they deliberately attempted to destroy Ralph’s character.”

About a year ago, making use of heretofore classified government material, especially Featherstone’s FBI files, I set out to answer the question: “Who killed Ralph Featherstone?” While I have no definitive answer, I can say beyond a shadow of a doubt that Featherstone’s death and the media coverage of it offer a cautionary tale. This case reminds us, alongside the better-known assassination of Black Panther Fred Hampton by Chicago police with FBI complicity, of the dangers of unchecked state powers and the risk of relying on the national press to ferret out the truth about the repression of self-avowed dissidents. Fifty years later, it is time to reopen the investigation of their deaths. 

The argument that Featherstone and Payne killed themselves rested on statements made by government authorities which the national press, with very few exceptions, uncritically relayed to the public. On March 16, 1970, FBI Director J. Edgar Hoover asserted that FBI experts had concluded that the bomb had been sitting in plain sight when it exploded, which meant, according to authorities, it could not have been planted under the seat in their car.  Hoover asserted that the two had transported the bomb from Washington, D.C., and that it was “intended for use in Black extremist activities.”   Nowhere did Hoover, other state officials or the media explain how they came to these latter conclusions.  None of the documents in Featherstone’s declassified files confirm such conclusions. 

In his public statements, Hoover never revealed that government agents had been monitoring Featherstone and Brown for years and that none of their investigations had uncovered evidence that they were involved in bomb-making.  Nor did Hoover explain that the FBI unsuccessfully tapped a vast network of confidential informants to ascertain the bomb’s origin.  The declassified files make clear that Hoover and other top officials cared more about protecting the secret operations of the government, particularly the existence of the FBI’s COINTELPRO program, whose mission called for “neutralizing” black extremists, which included Martin Luther King, Jr., than getting to the truth of Featherstone’s case.  

Lacking direct evidence related to Featherstone’s and Payne’s intent, government officials and the national media supported their interpretation by deploying innuendo and guilt by association. Featherstone and “Che” Payne, they emphasized, had travelled to Cuba, held anti-Semitic views,  and were closely allied to Stokely Carmichael, who had recently urged “black militants to go underground.”  Numerous press outlets highlighted the “Amerika” poem they allegedly found on Featherstone’s body. 

The poem, given so much weight by authorities and the media, is not on any of the FBI’s itemized lists of materials found on Featherstone’s body. Given what we now know about the willingness of the government to pen false letters and plant fake stories in its efforts to “neutralize” black activists, as described by scholars such as Ward Churchill and Kenneth O’Reilly, there is good reason to wonder if Featherstone wrote the poem. 

Almost exactly a year after Featherstone’s death, antiwar activists broke into the FBI’s offices in Media, Pennsylvania, where, among other things, they uncovered evidence of the COINTELPRO program. Several years later, reports surfaced that the Baltimore police had operated a secret program to spy on Maryland’s citizens. In the interim, the state of Maryland dropped all riot and arson charges against H. Rap Brown, in part because Bob Woodward reported that the arson charge against Brown had been “phony” all along.

Yet neither Woodward nor his future partner, Carl Bernstein, nor any other national journalist or broadcaster, followed up on Congressman John Conyers’ call for a full and impartial investigation of Featherstone’s and Payne’s deaths, even though a who’s who of civil rights activists from Julian Bond to Dorothy Height made the same demand. And when the Church Committee and other government investigations revealed the vast overreach of the FBI and other police agencies in the mid-1970s, they ignored, as did the press, the government’s possible complicity in Featherstone’s death, focusing instead on its wire-tapping of Martin Luther King, Jr. and other more mainstream figures. 

Given the doubts expressed by many of Featherstone’s and Payne’s colleagues about the cause of their deaths, the laxity of the contemporary investigation, and ongoing revelations about government complicity in the assassination of a wide array of civil rights activists, their case deserves to be re-opened. At the very least, doing so will shed further light on the dangers of unchecked government agencies and the shortcomings of the fourth estate. 

Covering the Troubles in Northern Ireland

 

As the summer of 1969 began, I was looking forward to a less intense posting. I had spent the past few years running the NBC News Saigon bureau, serving as South East Asia bureau chief in Hong Kong, and covering the Vietnam War. My thoughts of a gentlemanly tour in London quickly ended once the Troubles started. Although I didn’t know it yet, the Troubles would dominate my role as bureau chief in London for the next four years. 

The Troubles were a physical, social and intellectual explosion in Northern Ireland against the British government that began in 1968 and continued for more than three decades. For many in Northern Ireland, mainly those who were Catholic, the Troubles were nothing more than a continuation of what had been festering for 800 years. Northern Ireland, the territory of Ulster, was owned, operated and dominated by the aristocracy. Protestants ruled and Catholics felt suppressed. The Catholic minority wanted its independence from Great Britain and to merge with the Republic of Ireland to the south. That would never happen as long as the British government ruled. 

The Troubles started with a fury in 1968 at a civil rights protest march in Londonderry when the Royal Ulster Constabulary broke up the march using truncheons and water cannon for reasons still unclear. Then, in August 1969, the Troubles kicked into high gear with the two-day Battle of the Bogside, also in Derry (Londonderry). The demonstrations quickly spread to Belfast and throughout Northern Ireland. The Irish Republican Army (IRA) led the charge for freedom with a well-organized and fierce guerrilla campaign. In over three decades of conflict, more than 3,000 died, 30,000 were wounded, and untold millions suffered psychic scars that remain to this day. 

I covered the Troubles in Northern Ireland for NBC News for four years, from the summer of 1969 until 1973. Even as a journalist with significant experience covering conflict, I knew I was in for a new ride. Because we now live in an era of transparency about how the press covers news, I thought I would take you inside how I and other journalists approached covering the Troubles. 

After quickly settling into my new job as a producer and bureau chief based in London, I was soon making frequent trips to Northern Ireland and to the Republic of Ireland, especially Dublin. When I drove across the closed and heavily guarded border that separated the two entities, my camera crew and I underwent long delays, passport checks, and searches of our car for weapons or contraband. We suffered harassment but we were journalists in a high intensity situation, and we expected nothing less.

My team consisted of a cameraman, a soundman, a correspondent and myself as field producer. We were witnesses to what seemed to be never-ending riots between the British troops who fired rubber bullets and tear gas and the disaffected crowds of anti-British Catholic protestors who threw paving stones. 

Even back then, the press was not welcomed by anyone. We knew we were in danger the moment we stepped into the path of marching Northern Irish, whether Protestant or Catholic, and the heavily armed military and police. Sometimes to keep safe we would run down narrow back alleys to escape the rock-tossing demonstrators. Add to the rocks the British troops wielding batons and firing rubber bullets at whoever stood in their path, and life on the streets was, to put it mildly, difficult. We learned not to be in the way of the demonstrators' anger or the retaliating British troops or police. Their wrath only added to our fear. 

My job was to get the story, make it understandable and palatable for our American audience, and to show the continuing riots in all their ugliness. My problem was how to cover that kind of story when the thirst in America was for riots rather than analysis. We did not want to dignify or glorify what took place in the streets.

I recall the many faces: the armed and frightened, often brutish, very young British troops, some of whom had barely started shaving, clad in their all-weather, khaki-colored wool uniforms. Alongside them stood the blue-clad R.U.C., the Royal Ulster Constabulary, the police arm of the Protestant Unionist government, generally anti-Catholic and always anti-IRA, its sworn enemy. With them were the B-Specials, an even more select and tougher arm of the Ulster government, if you could identify them, which was difficult because they wore civilian clothing. 

In Belfast and other cities, the clash of philosophy, ideals, and plain hatred made for frequent demonstrations against the British troops. Those fierce clashes were accompanied by the pounding of metal garbage can covers, sounds of support from mothers, daughters and grandmothers outside their homes. The throwing of cobblestones, paving stones, bricks, rocks, and any weapon at hand flared up quickly and ended just as fast. Many demonstrators were young and out of work, disenfranchised by any standard.

The riots were frequently bloody. The rioters, often the same age as the British troops they confronted, were difficult to contain when let loose on the street. Because they felt they had no future, they usually rioted with abandon. Some rioters used what the British called petrol bombs: a quart bottle filled with gasoline with a wick at the top, which they would light, throw, and watch explode, hoping to injure as many as possible, especially anyone wearing a uniform. 

The truth is that it was difficult to be sympathetic toward any of the players, though that is not what an objective journalist should feel. Each of them was hardcore in its own way: the IRA, who wanted the British out; the Protestant Unionists and loyalists, who wanted to dominate life in Northern Ireland; the R.U.C.; and the British troops. This does not include the average person raising a family, going to work in a weak economy, trying to survive under difficult conditions. Fanaticism was the rule. I learned not to argue with anyone. I could not agree with violent tactics on any side. I did agree that oppression by the British government had been too long the norm in Northern Ireland, and it was time for those people to be free of British rule. 

I witnessed the aftermath of bombings in the streets and in pubs. I talked to politicians, the police and the military, and civilians. I often traveled to Dublin over the then heavily guarded border for interviews with underground IRA officials to whom we could not speak in the north. But we never told enough stories about how life was different in the Republic of Ireland, and I was never sure how effective those stories were when we did. 

The Troubles is one of the stories I covered that does not go away. For different reasons, along with Vietnam, its imprint is with me always. The British and the IRA ended the Troubles, after years of negotiation, on Good Friday in 1998. The hope of what came to be called "The Good Friday Agreement" was that there would be no more terror. No more bombings. No more killings. No more deaths. Peace had a chance. Importantly, the border between Northern Ireland and the Irish Republic would be open, allowing people, goods and services to cross freely, aiding the economy and the new social compact. I sighed with relief. The people deserved their new reality. 

With Brexit now on its way, its implementation must include the pledge that there will be no hard border between Northern Ireland and Ireland. If not, it could threaten the nearly twenty years of that convenient, open border and put stability between north and south at risk. The people will suffer. It would not take much to resurrect a Troubles for the 21st Century if the architects of Brexit do not intelligently address the border between the two Irelands. 

Meanwhile, there are enough people unhappy with the 1998 peace accord. Some hardcore opponents in Northern Ireland are not above restarting the Troubles. Various IRA splinter groups continue to emerge. Not talked about, but deep in the consciousness of many, Great Britain remains the enemy. Deep-seated memories of plantation dictatorship and the idea that Northern Ireland is still a colony of Great Britain prevail. The possibility of Northern Ireland joining the Republic of Ireland is more alive than ever. It seems that though the Troubles are quiet for the moment, they are ever present in Northern Ireland.

But for me, it was over. After four years of intense coverage of the story, with the smell of tear gas forever implanted in my nostrils, I departed London for a new assignment, never to return to Northern Ireland. 

Kamikazes at the Battle of Okinawa

 

On May 6, 1945, a twin-engine kamikaze plane’s bomb exploded beside the destroyer Luce, part of the radar picket ship screen surrounding Okinawa, and ripped her starboard side “like a sardine can.” Flames shot 200 feet high. A minute later, a kamikaze fighter slammed into Luce’s 5-inch stern port guns, and their magazine erupted in a fireball. Luce went down five minutes later with 149 men lost. In the water, sharks hit men “left and right, just tearing them up,” said radioman Tom Matisak, who saw them rip into the ship’s barber. “It was an awful, bloody mess as they chopped him up and pulled him under.”

 

For three months in 1945, this was an all-too-common occurrence in the seas off Okinawa, where 10 mass kamikaze attacks, each with hundreds of suicide planes, struck the U.S. Fifth Fleet. The attacks did not alter the course of the Pacific war, but the death toll of more than 4,900 Navy crewmen increased the misgivings of some members of the Joint Chiefs of Staff about invading Japan.

 

As American forces edged closer to mainland Japan in 1944 and 1945, Japanese leaders adopted desperate measures to thwart the looming disaster. One was the mass kamikaze attack. The loss of Saipan, Tinian, and Guam in the Mariana Islands and the better part of Japan’s air force during the summer of 1944 forced many senior officials to realize that the war was lost. B-29s now menaced mainland Japan’s major cities and ports from new Mariana bases. American submarines were shutting down the oil and rubber pipeline from Southeast Asia. Peleliu was about to fall, and the Philippines would be next.

 

A negotiated peace being Japan’s best hope, Japanese military leaders embraced attritional warfare as a means of forcing the Allies to drop their demand for unconditional surrender.

Its ideological underpinnings were gyokusai and Bushido. Gyokusai was an ancient term meaning “smashing the jewel” — perishing by suicide or in battle rather than suffering the ignominy of capture. A vestige of the samurai warrior code, Bushido was characterized by a studied indifference to death. The new strategy was first applied in September 1944 during the defense of the Palau Islands stronghold of Peleliu. Rather than launch a banzai attack at the beach, the usual Japanese tactic, Colonel Kunio Nakagawa’s troops awaited the invaders inside the caves, tunnels, and fortifications that they had carved into the jagged coral ridges. They patiently waited for U.S. Marines to enter prepared “kill zones” where they could be raked by gunfire from multiple positions.

 

The Japanese achieved their goal at Peleliu: during the battle’s first two weeks, the American casualty rate surpassed anything seen in the Pacific war. The new strategy became the template for the defenses of Iwo Jima and Okinawa in 1945. Japan’s air forces officially embraced the strategy on October 19, 1944, when Admiral Takijiro Ohnishi, commander of the First Air Fleet, met with the 201st Air Group’s senior pilots at Mabalacat Airfield in the Philippines. He told them Japan’s salvation no longer depended on civilian and military leaders, but on its young pilots and their “body-hitting spirit.” When Ohnishi finished speaking, “in a frenzy of emotion and joy” all of the pilots volunteered for the first Special Attack Unit. 

 

Never before or since has there been a phenomenon quite like the Japanese suicide pilot — the kamikaze, named for the “divine wind” typhoon that destroyed an invasion fleet under Kublai Khan in 1281 before it reached Japan. General Torashiro Kawabe claimed that the kamikaze did not regard himself as suicidal. “He looked upon himself as a human bomb which would destroy a certain part of the enemy fleet … [and] died happy in the conviction that his death was a step toward the final victory.” It was a coldly logical decision considering that Japan had ever fewer skilled pilots, and they were flying outdated planes that were being routinely shot down.

 

The Japanese simply armed their warplanes with 500-pound bombs and crashed them into American ships. “If one is bound to die, what is more natural than the desire to die effectively, at maximum cost to the enemy?” wrote Captain Rikihei Inoguchi, the First Air Fleet’s senior staff officer. The “fight to the death” strategy’s objectives were embodied in the slogan of the Thirty-Second Army that defended Okinawa: “One plane for one warship/One boat for one ship/One man for ten of the enemy or one tank.” The kamikaze pilots wore white headbands emblazoned with the Rising Sun and good-luck “thousand-stitch” wrappers made by 1,000 civilians who had each sewn a stitch with red thread; the wrappers supposedly made their wearers bullet-proof. Before climbing into their cockpits, the pilots lifted their sake cups in a final toast to the emperor and sang, “If we are born proud sons of the Yamato race, let us die/Let us die with triumph, fighting in the sky.”

 

The suicide attacks began October 25, 1944, during the U.S. invasion of the Philippines. A kamikaze squadron commander sent off his 18 pilots with the exhortation, “Put forth everything you have. All of you, come back dead.” They sank the escort carrier St. Lo, killing 113 crewmen, and damaged the escort carrier Santee. Six pilots returned after failing to find targets. Days later, kamikazes crashed into and badly damaged the aircraft carriers Franklin and Belleau Wood.

 

It was just the beginning.

 

Between October 1944 and March 1945, suicide attacks killed more than 2,200 Americans and sank 22 vessels. At Iwo Jima on February 21, fifty kamikazes from the 601st Air Group sank the escort carrier Bismarck Sea and badly damaged the carrier Saratoga. The kamikazes’ acme came during the 10 large-scale attacks, or “kikusuis” — meaning “chrysanthemums floating on water” — launched against the picket ships surrounding Okinawa. During Kikusui No. 1 on April 6 — five days after L-Day on Okinawa — the onslaught by 355 kamikazes and 344 escort fighters began at 3 p.m. and lasted five hours. “Dear parents,” wrote Flying Petty Officer 1/c Isao Matsuo on the eve of the mission, “please congratulate me. I have been given a splendid opportunity to die. This is my last day.” Twenty-two kamikazes penetrated the combat air patrol shield on April 6, sinking six ships and damaging 18 others. Three hundred fifty U.S. crewmen died.

 

The clash between death-seeking Japanese flyers and American sailors and pilots determined to live produced gruesome casualties. John Warren Jones Jr., aboard the destroyer Hyman when a kamikaze crashed into her, saw two men stagger from the inferno with their naked bodies covered with third-degree burns. Two shipmates had their heads blown open. One had “a big piece of plane through his chest and sticking out both sides.” By April 1945, though, it was apparent that many kamikaze pilots, perhaps because of fuel shortages that limited their training, possessed meager flying skills and could be easily shot down. As defeat loomed larger by the week, volunteers for kamikaze duty dried up; resentful conscripts increasingly filled the ranks. They often flew to their deaths drunk and bitter. One pilot, after takeoff, strafed his own command post.

 

The Japanese fell short of their goal of “one plane one ship,” but sank 36 American warships, and damaged 368 other vessels at Okinawa. The Navy’s losses were the highest of the Pacific war: 4,907 sailors and officers killed, and 4,824 wounded. Japan lost an estimated 1,600 suicide and conventional planes at Okinawa. The 9/11 hijackers excepted, the kamikaze disappeared after the advent of unmanned missiles, and in the absence of a samurai tradition like that of World War II Japan.

Historical Novelists Owe the Truth to Readers–and to History

In an article I wrote for National Public Radio in 2011, I described the past as uncharted territory, comparing it to visiting a country where we do not speak the language and depend upon the historical novelist to act as our translator. For that relationship to work, it must be based upon trust; we need to be able to rely upon a novelist’s interpretation of that past. So, truth matters. I would never have expected that statement to become controversial, but never has that bedrock value been under such relentless assault. As writers, we owe the truth to our readers—and to history. Many people accept what they read in a book or see on-screen as gospel, and that can give novelists and screenwriters more influence than even they realize. We need only think of Braveheart or Kingdom of Heaven, a visually striking film that transformed Balian d’Ibelin, one of the most influential noblemen in the Kingdom of Jerusalem, into an illegitimate French blacksmith. 

Historical novelists need to adhere to the known facts, whether they are writing of a battle, a rebellion, or the lives of the men and women caught up in these events. Writers must inevitably rely upon our imaginations to a great extent, for there are always blanks that must be filled in. Medieval chroniclers were indifferent to the needs of modern authors, neglecting to give us the date of a marriage or birth, the cause of death even of kings. 

When I am forced to fill in any significant blanks, I clear my conscience in my Author’s Notes. Since my readers often tell me how much they enjoy a writer’s “AN,” this is a win-win situation. Not all writers take it to the extremes that I do; the AN for one of my novels ran for eleven pages. But I feel disappointed when I finish a historical novel and then discover that the author has not included an AN. The best historical novelists—Bernard Cornwell and Margaret George to name just two—always offer this intriguing look behind the curtain of the creative process, telling us when they had to take liberties and why. And I think most readers are comfortable with this approach, as long as writers play fair and alert them when a diversion is coming. I understand that few of the battles in Bernard Cornwell’s splendid Saxon series were well chronicled and so it makes sense that he must draw upon his own strategic skills to make these battles both memorable and bloody; when he then reveals that he borrowed a battle tactic from Napoleon for one of his books, we are amused and even more impressed. 

I am not suggesting, of course, that we should accept “history” without question. It is a cliché but a true one that history is usually written by the victors, and while I do not agree with Napoleon’s cynical comment that history is a fable agreed upon, it is obvious that history reflects more than facts. We must bear in mind that historians tend to interpret the past in light of their own beliefs and biases. This is why the reputations of controversial kings like Richard III and Richard Coeur de Lion have fluctuated so wildly over the centuries. 

A good example of this can be found in the story of the medieval Kingdom of Jerusalem. Historians in the twentieth century explained the political rift that would doom the kingdom in modern terms—as a struggle between hawks and doves, between those who urged war with the Muslims and those who argued for accommodation and compromise. They were not entirely wrong; European crusaders were more likely to have a take-no-prisoners mentality than the men and women who’d been born in the Levant and had a more realistic understanding of their vulnerability as a small Christian enclave in a Saracen sea. But truth is always messier than that, rarely able to fit into such a neat “hawks vs. doves” paradigm. 

Historians like Peter Edbury have convincingly shown that these rivalries were more complex. Personalities mattered, too. If the last king of Jerusalem, Guy de Lusignan, had not been as malleable as wax, well-meaning but easily influenced by men of stronger will, it is likely that the battle of Hattin, one of history’s great military blunders, would not have happened. The Count of Tripoli, cast by earlier historians in a heroic mold, has been reassessed by the current crop of historians, who have not been so quick to disregard his acts of blatant self-interest. A more balanced portrayal of the count’s character makes it easier to understand why his enemies scorned his advice when he pleaded with the king not to fight Saladin at Hattin. 

In insisting upon the importance of historical accuracy in novels, I am also making an argument that novels can add a valuable dimension to the study of history. I realize that my viewpoint is not universally accepted; to the contrary, some historians dismiss fiction as a superficial distraction from the real study of the past. I suppose it is only to be expected that I’d want to defend what is both my profession and my passion. But novelists have learned that our books can reach a wider audience than many academics can, that we can win converts to the cause—the belief that history should be an essential element in any school curriculum, as it once was. It helps us understand our place in the universe and it teaches us the importance of context, that we are united in our common humanity and there is nothing new under the sun. And learning about history is great fun, especially if camouflaged in fictional form! 

Most of us have heard a variation of the comment by George Santayana, that those who cannot remember the past are condemned to repeat it. This is true, yet it is also true that even those who know the pitfalls of the past can still stumble into them. While the trappings of civilization have changed dramatically over the centuries, human nature remains a constant. So, I would rather conclude by giving the last word to William Faulkner in Requiem for a Nun: “The past is never dead. It’s not even past.”   

Roundup Top Ten for March 6, 2020  

 

The Problem with Women's History Month in 2020

by Kimberly A. Hamlin

As we honor the ongoing work of women to gain equal citizenship, it is time to integrate women’s stories more fully into our national narratives and civic memories.

 

Democratic Socialism All Around: What Bernie Sanders’ New York Can Teach Us about America’s Future

by Joshua Freeman

Mid-20th-century New York had serious flaws. But it stands as an example of what can be done when the power of government is combined with a capacious vision of human rights, equality, and democracy. 

 

What Can the Black Death Tell Us About the Global Economic Consequences of a Pandemic?

by Adrian R. Bell, Andrew Prescott, and Helen Lacey

There will be winners and losers economically as the current public health emergency plays out.

 

What Shirley Chisholm Can Teach 2020 Candidates as They Exit 

by Anastasia Curwood

Chisholm wanted to show the power of new voices in the Democratic Party: women, African Americans, the poor and youth, and to challenge the authority of conservative Southern white Democrats at the Democratic National Convention.

 

Little Women: Greta Gerwig’s Love Letter to the 19th-Century Novel 

by Christine Jacobson

It’s thrilling to see the historic Roberts Brothers publishing house richly imagined in the film’s first scene and to watch Jo spar with her editor over the rights to her novel and its ending.

 

Pennies and Nickels Add Up to Success: Black Banking Pioneer Maggie Lena Walker

by Crystal M. Moten

Maggie Lena Walker was one of the most important Black businesswomen in the nation, and today too few people have heard of her.

 

What the Plague Can Teach Us about the Coronavirus

by Hannah Marcus

The distant past is not our best source of advice for pathogen containment. But it does offer clear lessons about human responses to outbreaks of infectious disease.

 

Onward, Christian Soldiers: The Triumph of Christian Nationalism

by David Austin Walsh

The Christian Right wields wide-ranging political power in contemporary America. We live in the country that the movement has created.

 

Can 50 Years of Minimizing Nuclear Proliferation Continue?

by Ivo H. Daalder

The Nuclear Nonproliferation Treaty has mostly succeeded in keeping more countries out of the nuclear club. But as U.S. alliances fray, its future success is not assured.

 

On Behalf of All 'Future Historians,’ Leave Us Out of Your Brexit Rants

by Charlotte Lydia Riley

The people who invoke historians of the future are less keen on listening to actually existing historians today.

 

Greetings from New HNN Editor Michan Connor

 

Greetings to the History News Network community. I’ve had the pleasure of hearing from some of you already, and am very impressed by readers’ devotion to the site as a forum for historians and people who love to learn about history and history’s importance for understanding our present. It’s a great honor to be able to carry forward such an important resource. I am very appreciative of the efforts of HNN founder Rick Shenkman, and the enthusiastic support of the history department at George Washington University. I am especially grateful to my predecessor Kyla Sommers, who diligently explained not only the nuts and bolts of the website, but its spirit. 

 

I came to HNN after a long path through (and out of) the minefield of contemporary academe.  I was always interested in history as a child in Massachusetts, and enjoyed how historical fiction (both books and movies) could challenge me to imagine life in a different time and place, to read more about it, and to understand what a story got right, or got wrong, or left out completely. 

 

As a first-year undergraduate student, I spent two terms slogging away at chemistry and calculus in pursuit of medical school admission, but was led astray by my courses in the humanities and joined an honors program in American studies. Living and studying outside of Chicago, I was inspired by the city’s physical form and the richness of its multiple artistic and musical traditions, but also troubled by its inequality, segregation and political dysfunction. My history classes on the American city and Black Chicago pushed me to ask about who made the city, how, and why. 

 

After a couple of years of fairly aimless but fun young adult life at the rough edges of one of Chicago’s gentrifying neighborhoods, I entered the new and dynamic PhD program in American Studies and Ethnicity at the University of Southern California in 2001. The 9/11 attacks shook the world in the second week of my graduate studies. In hindsight, I think those events shaped the way I looked at cities, including my new home of Los Angeles, as places where the contradictions, conflict, and violence of modern life are on full display, but where diverse people work to figure out how to survive and thrive together.  The key for me was understanding how those people and places were shaped by the imprint of the past. 

 

That perspective guided me through writing a dissertation on how city boundaries shaped racial dynamics in Los Angeles County, and researching and publishing articles about suburban secession movements in LA and metro Atlanta. Along the way I moved to Texas, spent a year in Atlanta as a fellow of the James Weldon Johnson Institute at Emory University, and started a family. I had the pleasure and challenge of designing and teaching courses that brought historical perspective to understanding policy problems like environmental injustice, economic insecurity, and racial inequality, and worked with incredibly diverse groups of students, who taught me that teaching history is about dialogue and communication.

 

Many readers of this site are no doubt aware that today’s universities are increasingly unwilling to support stable and long-term careers for scholars in fields like history, and I’m among the many who have sought a path outside of academe. If there’s a silver lining to this professional crisis, it is that historians have expanded their public engagement into new forms. I am very excited to be carrying on the legacy of HNN, which has long championed the idea that historians have important things to say about today. 

 

If you are a past contributor to HNN, I look forward to reading your future submissions. If you are a new contributor, and especially if you are a history graduate student or early career scholar looking to start writing to bring your expertise to the public, send your drafts to editor@hnn.us.

 

 

 

Truth and History: Historical Truth and Historical Narrative

Is there such a thing as objective truth in history? Is history a compilation of narratives advanced by different groups and nations? The influence that historical narratives wield on international relations makes it imperative to define the relevant terms conceptually and to dwell, albeit briefly, on a few cases. 

Historical truth is objective by its very nature. It is there, so to speak, to be discovered and unearthed. Certainly, there may be occasions in which the truth cannot be discovered. However, the inability to discover the truth does not negate its objective existence. In this context, a distinction ought to be drawn between historical truth and interpretation. The former is objective and the latter is subjective. The former refers to a fact, which can be determined as true, at least in principle, by empirical study, whereas the latter entails an explanation of the fact in question. To be sure, the lack of historical truth may lead to an act of inference, accompanied by interpretation, designed to assess what the truth might have been. 

Thus “narratives,” the commonly used catch-phrase invoked to afford legitimacy to historical interpretations, are, at best, an attempt at explaining historical events from a subjective perspective. Their importance resides in the influence they wield in shaping the perception of reality by groups or nations. Narratives may determine historical truth insofar as they describe perceptions of groups or nations that exist objectively, but the historical veracity of the facts which those narratives depict does not derive necessarily from them. They may be objectively true or false. 

This is not to belittle the importance of historical narratives. Their emotional impact may determine the manner by which decision-makers interpret the external environment in which they operate, and make decisions affecting the group or nation they represent. 

Still, historical narratives are not a synonym for historical truth. However powerful historical narratives may be in shaping the actions of a certain group or nation, they do not, per se, reflect historical truth. A historical narrative may be based on historical truth, but to believe that historical truth may not be objectively determined and thus one is left only with historical narratives is to confuse the objective existence of truth with its subjective interpretation. 

The assumption that the subjective interpretation of history is automatically rendered into a historical truth on account of its historical impact is clearly wrong.  

We can witness the effects of historical narratives on the nature of contemporary international relations. 

Suffice it to glance at the differing narratives by the Turks and the Armenians of the Armenian Genocide and their effects on international relations. Indeed, the term “Armenian Genocide” is part and parcel of the fierce dispute between the two sides about the events surrounding the murder of around one and a half million Armenians by the Ottoman Turks, starting in 1915. While the Armenians contend that the Ottoman Turks perpetrated a well-prepared and thought-out act of genocide, the Turks argue that the Armenians were a hostile element within the Ottoman Empire and that the events concerned reflected a violent conflict between two contending sides, and not an organized effort at genocide. Any attempt by a third party to recognize the Armenian Genocide is immediately followed by strong protests by the Turkish Government. Governments and parliaments assess the pros and cons of recognizing the Armenian Genocide on the basis not only of moral but also of pragmatic considerations as to its effect on bilateral relations with Turkey. 

The differing accounts by the Palestinian Arabs and the Israelis of the nature of the Arab-Israeli conflict are a further example of historical narratives that still wield a strong influence on the character of an international conflict. The Palestinian Arabs refer to the events surrounding the establishment of the State of Israel as the Nakba, or Catastrophe in Arabic, which led to the displacement of hundreds of thousands of Palestinian Arabs from their homes. The Israelis, by contrast, stress the refusal of the Palestinian Arab leadership to accept the UN Partition Plan of 1947, which could have led to the establishment of a Palestinian Arab state alongside Israel; its subsequent decision to launch an all-out attack against the Jewish community in Mandatory Palestine; and the ensuing attack by the Arab countries against the newly established State of Israel. For the Palestinian Arabs, the establishment of Israel led to the Nakba; for the Israelis, the Nakba was the result of the Palestinian Arabs' refusal to accept a compromise solution and their decision to launch an all-out attack, without which there would have been no war and no refugee problem. 

A further example relates to the tensions prevailing between Russia and Poland over the events surrounding the start of the Second World War and the role played by the Soviet Union in it. Russia stresses the Soviet Union's role in defeating Nazi Germany and liberating Poland from the yoke of German occupation, while Poland places as much emphasis on the Ribbentrop-Molotov Pact of August 1939, whose secret protocol stipulated that Poland would be divided between Nazi Germany and the Soviet Union. For Poland, the Soviet Union was as much an oppressor as a liberator. 

Historical narratives may or may not reflect historical truth. Their aim is not necessarily to ascertain what actually happened in the past, but to justify what happens in the present. Narratives are important for understanding the attitudes that shape the decision-making of the sides involved in an international dispute. A clear distinction ought to be drawn between historical narratives as a tool for comprehending the mindset behind the positions adopted by the sides concerned, and historical truth as such. The historical narrative of one side may reflect historical truth more than that of the other; indeed, one may be subjective and right. Still, conceptually, the two are not necessarily related. Historical truth stands alone, in its own right. Historical narratives may reflect it, but however influential they may be in historical and contemporary parlance, they occupy a separate place. 

America’s First Literary Voice

As we celebrate Women's History Month, let's remember America's first literary star, Anne Bradstreet, who was a pioneer in more ways than one. 

Anne came to the new American colony in 1630 from England, with her husband Simon Bradstreet and family. She was just 18. They were Puritans who left the home country because of political and religious tensions. But traveling by ship across the Atlantic Ocean was no easy journey; it was a nightmare. According to biographer Elizabeth Wade White, “many of the colonists arrived ill and weakened by malnutrition” from the voyage.

The Bradstreets landed in Salem, Massachusetts before going to Boston. They lived in Ipswich along the coast before moving inland to Andover. At that time the western frontier for European settlers was the Massachusetts wilderness. Life was hard: just getting shelter and food, and fighting off sickness, were constant challenges. Sarah Loring Bailey, author of Historical Sketches of Andover, writes, “The hardships and privations of pioneer life told severely upon the delicate constitution of Anne Bradstreet.” Anne became the mother of eight children, so she had incredible responsibilities as a parent on the frontier, and her health gradually suffered.

Anne was a well-educated woman, something rare for those times. She was skilled with the pen, took to writing poetry in Ipswich, and continued in Andover. You can imagine her on winter days in Massachusetts, where the sun sets very early, taking some time to write down her feelings. The long winters, and sometimes sleepless nights, gave rise to many deep thoughts of life in the New World. Anne wrote, “By night when others soundly slept / And hath at once both ease and Rest, / My waking eyes were open kept / And so to lie I found it best.” Writing poetry is therapeutic, and Anne needed that outlet.

The poems she wrote were not for a public audience, but rather for her children. She was a devoted parent. But Anne, especially with her health failing, knew someday she would be gone. She wanted to leave something with her kids. Anne wrote, “That being gone, you here may find / What was your loveing mother's mind, / Make use of what I leave in Love / And God shall blesse you from above.”

Not everyone approved of Anne writing poetry. At that time it was considered out of place, even dangerous, for women to be educated, to write, and to express opinions. Anne wrote, “I am obnoxious to each carping tongue / Who says my hand a needle better fits, / A poet's pen all scorn I should thus wrong, / For such despite they cast on female wits.” But Anne became a trailblazer as a writer and proved that women could break through barriers that society wrongly imposed.

Anne got quite a surprise when her poems were published in a book in London. Her brother-in-law, the Reverend John Woodbridge, had taken her poems there for publication without her knowledge. The next family dinner must have been very interesting after that stunt! All of a sudden Anne was a published poet, author of the book titled The Tenth Muse, Lately Sprung Up in America. Anne was a good editor too and wanted to fix the little errors in her hastily published book; she wrote about this in her poem "The Author to Her Book." Good writing is rewriting. Years later a revised edition of The Tenth Muse, with additional poems, was printed in North America, and she became an acclaimed poet on both sides of the ocean.

What started as thoughts in Anne's mind are now writings available in bookstores, libraries, and on the web. They can inspire writers everywhere, like the poetry workshops that Save the Children runs in Africa for girls who suffer from discrimination, conflict, and hunger. The power of writing. 
Anne was the first published poet from America. Her best works were about love and loss. Her poem "To My Dear and Loving Husband" reads, “If ever two were one, then surely we. / If ever man were loved by wife, then thee. / If ever wife was happy in a man, / Compare with me, ye women, if you can. / I prize thy love more than whole mines of gold, / Or all the riches that the East doth hold.” Anne suffered a tragedy in 1666 when her home burned to the ground. She lost her possessions but came to realize that what truly mattered was her faith in God. She writes, “Farewell, my pelf, farewell, my store. / The world no longer let me love, / My hope and treasure lies above.” The thoughts and feelings of Anne Bradstreet, written hundreds of years ago, live on with us today.