The Second Amendment has Never Covered Kenosha Shooter Kyle Rittenhouse

Kenosha shooter Kyle Rittenhouse’s lawyer made headlines recently when he claimed that his client’s gun possession fell under the “well regulated militia” clause of the Second Amendment. The claim was novel, at best; even the Supreme Court rulings that have struck down some gun control laws have made a point of not striking down the very idea of gun control. Even for those eager to see both some level of sanity in US gun laws and some measure of justice for the men killed in Kenosha, the proposed defense was enough to raise questions about the lawyer who offered it – John Pierce, whose past clients include Rudy Giuliani – and about the competence of Rittenhouse’s representation. Experts in modern case law have already made clear that Mr. Pierce’s claim that a member of the militia has the right to carry any weapon anywhere lacks any legal validity.

 

Those claims also lack historical merit, and that they could get as far as they have is one more sign of the general ignorance in today’s America about what the militia was when the Bill of Rights was written and, therefore, what the Second Amendment was supposed to accomplish. So first things first: Rittenhouse was not part of the militia. Not in the way that the Second Amendment intended, in any case, nor by any eighteenth-century definition of “militia.” If anyone in Kenosha could claim to be part of the militia, it was the National Guardsmen there (more on that below). 

 

The eighteenth-century militia was an official institution which answered to the state government and, before that, to the colonial governments. To the extent that the militia ever was “well regulated,” one could not simply declare oneself a member; nor, for that matter, could the men whom Rittenhouse shot declare that they were not members. The militia as it existed in the eighteenth century was the backbone of a system of citizen military institutions that died a long time ago. There were scores of old militia laws, from both the colonial era and the early state constitutions. Those regulations made a few things clear – things which, though common knowledge at the time, seem unknown to most Americans today, including those who insist on their status as “Second Amendment supporters.”

 

In the eighteenth century, militia membership and participation were not a choice. The men in it were legally required to participate. To put this in more concrete terms, the colonial regulations would state that all men “16 to 60,” or “above seventeen years and under sixty,” or “21 to 60,” etc., would be members of the militia. There were often qualifiers to this, such as “all able-bodied men,” or “all free male persons.” Most of the adult men not in the militia were prohibited from participating, or even from owning weapons. As one might expect, this division was racial – white men were required to be part of the militia, and non-whites were either excluded or limited to specific positions like drummer or scout. There was some variation along these lines; poor white men were in most but not all of the militias, and Catholics were occasionally excluded, but as a general rule of thumb white men served in the militia, and women and men of color did not. This was no accident, nor was it just symbolic – with little to no professional police or military forces, the militia was how whites maintained their domination. The militia was also how whites in slave states prevented enslaved people from rising up, and it was how communities along the frontier were able to take land away from Native American tribes. Rather than relying on full-time professionals, the citizens – a status limited at the time to white men – participated part-time in their local units, in which anyone eligible was required to register, and the officers of those units kept registers of those men. The militia also trained together several times a year. Men who were required to participate but who chose not to were subject to fines and other penalties.

 

It is fair to note, though, that while the militia was a key institution in colonial society and in the first decades of the republic, it was never as well regulated as its advocates hoped. The regulations themselves often mentioned the sorry state into which the militia had fallen. Virginia’s 1755 militia act, for instance, noted that the previous acts “hath proved very ineffectual,” and as a result “the colony is deprived of its proper defence in time of danger.” So in Pierce’s defense – although this is hardly sufficient – the militia rarely lived up to its goals, or acted as its regulators expected it to. It was also not unknown for the men who made up the militia to reject their government and rise up in arms against it, and even to claim to be the militia while doing so. Shays’s Rebellion and the Whiskey Rebellion are the two most famous examples. When that happened, there was relatively little that a governor could do. And while those rebellions were always put down eventually, the punishments imposed on the men who participated were fairly mild (especially compared to the punishments meted out against enslaved people who participated in any sort of rebellion). There were also occasions like the 1782 Gnadenhutten Massacre, when local militiamen took it upon themselves to execute unarmed Native Americans who were not at war with the US, leading to widespread condemnation, but not to any criminal charges against the militiamen. Still, the constitution’s militia clause made it clear that the role of the militia was to suppress insurrections, not to participate in them.

 

The militia, then – that “palladium of liberty” – was a messy and unstable institution whose members often resented having to participate, and which at times rose up and left the government in desperate straits. It was also only really effective at maintaining racial divisions and inflicting violence on non-whites. So it’s not surprising that as the United States grew, the militia became less popular. During the first years of the Republic, there were several plans to revitalize the militia, all of which died on the vine – both under Washington, who was skeptical of the militia’s potential on the battlefield, and under Jefferson, who was more enthusiastic about its possibilities. Over the course of the nineteenth century, professional forces began to take over many of the militia’s duties, both as an internal police force and as an external army. The Militia Act of 1903 provided a long overdue acknowledgment that the militia of the founders’ generation was dead; in its place came the “organized militia,” which consisted of “the regularly enlisted, organized, and uniformed active militia” – henceforth to be known as the National Guard (as it was already known in much of the US). The “unorganized militia,” meanwhile, consists of all remaining “able-bodied males” aged seventeen to forty-five – a fact unknown to most of the members of that unorganized militia, as it brings with it no obligation to train, register, or do anything else whatsoever.

 

Replacing the citizens’ militia with paid professionals has hardly been a perfect solution, as the Black Lives Matter movement has shown (following in the footsteps of a long tradition of civil rights groups’ criticism of police violence and racism). The uneasy overlap between professional police and racist vigilantism has also been a recurring problem, as shown by the presence of someone like Rittenhouse – an admirer of the police – taking part in vigilantism.

 

So Rittenhouse does have his historical precedents – and if his lawyer wants to argue that, by attempting to repress the Black Lives Matter movement, Rittenhouse was acting in the spirit of those eighteenth-century militias which went outside the law and defied their state governments, and especially those which did so in the interest of promoting white supremacy – his case would be fairly solid. It would not, however, be an exoneration. Far from it. What Rittenhouse and his lawyers cannot argue is that he was part of the well-regulated militia. Unlike the National Guard, which is under government authority, Rittenhouse was not acting as that militia was legally required to act. Beyond that, though, the militia which the Second Amendment declared “necessary to the security of a free state” died a long time ago.

Breaking Lincoln's Promise

“Why should I go to that cemetery? It’s filled with losers,” President Donald Trump reportedly stated when justifying his unwillingness to visit the American war dead at the Aisne-Marne American cemetery in France. The cemetery commemorates soldiers from the First World War, many of whom died at the nearby battlefield at Belleau Wood. Mr. Trump’s supporters who accompanied the President on his 2018 trip to France, including current White House staffers, denied that the Commander-in-Chief of the U.S. military made such utterances. Jeffrey Goldberg, Editor-in-Chief of The Atlantic and author of the article, quotes four anonymous sources “with firsthand knowledge of the discussion that day.” President Trump had denigrated American soldiers, veterans, and their families before and during his presidency, including Senator John McCain, who died from cancer in 2018 and had been a prisoner of war for over five years after his Navy plane was shot down over North Vietnam. Mr. Trump was also accused of being disrespectful to Khizr Khan, a Gold Star father whose Muslim son, Captain Humayun Khan, died heroically protecting his fellow soldiers from a suicide attack, for which he was posthumously awarded the Bronze Star and Purple Heart medals.

 

Critics of the President refer to these moments to suggest that Donald Trump is unfit to be Commander-in-Chief, while the President’s supporters use them to suggest that the media and liberals maliciously distort and lie about Mr. Trump to undermine his reelection bid. The controversy persists because the 2020 presidential election between Donald Trump and Joe Biden is taking place within the context of a war over American memory. President Abraham Lincoln crystallized American cultural memory when he dedicated the cemetery at Gettysburg in November 1863. Lincoln summoned the loss and grief of hundreds of thousands of Americans and harnessed their mourning to the bodies of Union soldiers, declaring “that these dead shall not have died in vain,” but rather that their sacrifices guaranteed “a new birth of freedom” and that “government of the people, by the people, for the people shall not perish from the earth.” He constituted what can be described as a promise to the dead that the nation would remember them and their collective sacrifices, and Lincoln obligated the living to guarantee this promise. More than a justification to continue the war, Lincoln activated a hot memory, cultivating a culture that yoked America’s slave past to the democratic underpinnings of the Civil War in the present.

 

The hot memory that Lincoln initiated was short-lived. The intensification of the postwar Confederate lost cause movement, coupled with the eventual de facto and then legal segregation codified within Jim Crow laws, helped cool American cultural memory by denying the role that slavery played in American history, thus effectively separating the past from the present. American cultural memory of the Civil War continued to cool in the first half of the twentieth century, allowing Americans to ignore their domestic racist past, which enabled their imperialist ambitions and interests across the Atlantic during the First World War. American soldiers buried in the Aisne-Marne cemetery in France died in segregated combat units in the cause of expanding American influence in Europe. In his Memorial Day speech in 1919, President Woodrow Wilson stood in the Suresnes military cemetery and explicitly (without noting the irony) tied the war dead from the First World War to the war dead of the Civil War. He claimed that the Civil War dead died to create a new nation and that the First World War dead sacrificed their lives to create a new Europe. Wilson succeeded in incorporating the First World War dead into Lincoln’s promise and likewise obligated the living to remember them collectively as a noble brotherhood of American sacrifice.

 

As the United States expanded its interests across the Atlantic, Americans could not sustain the chilling effect on cultural memory. The hypocrisy of segregation, especially in the eyes of the nations Americans were trying to influence, manifested itself through the redlining of residential neighborhoods, the death of Emmett Till, and the marches in Birmingham and Selma. This duplicity became acute during the Cold War, when Communist nations used American segregation to undermine the United States’ spread of democracy. Inside the U.S., the Civil Rights Movement of the 1950s and 60s attempted to heat American cultural memory again by calling for democratic reforms that would incorporate African Americans and others into the American dream. Civil rights leaders marched and protested, in part, to remind Americans of how much their collective past, especially when it came to slavery, was intertwined with the segregationist policies of the present. While this movement succeeded in thawing and even reheating cultural memory, its efficacy waned as the twentieth century closed. The logic of the Cold War, the imperialistic nature of the Vietnam conflict, and the economic stagnation of the 1970s helped cool American memory again. This cooling effect continued as the end of the Cold War allowed Americans to embrace the present and to celebrate their defeat of Communism, which helped distract them from completing the program of racial equality. This cooling trend continued into the twenty-first century, with the invasions of Iraq and Afghanistan in the war on terror pushing the American present further and further away from dealing with the traumatic and difficult aspects of the American past.

 

Donald Trump emerged as a presidential candidate taking advantage of this already frozen memory. President Trump’s denigration of dead soldiers is part of this movement to “drain the swamp” and is a disruption of Lincoln’s promise and of the obligation on which American commemorative traditions have been built. This time he is not just attacking an individual politician who had been a Navy pilot, or a Gold Star father who spoke out against his presidential candidacy; he is calling the war dead collectively “losers” and “suckers,” including the war dead that Lincoln claimed “gave their lives that that nation might live.”

 

The first Republican President reminded his audience in Pennsylvania that “the world will little note, nor long remember what we say here, but it can never forget what they did here.” The current Republican President’s personal memory of the past is not unique; rather, it reflects the cultural memory of America: cold, presentist, and amnesiac. His unwillingness to remember the war dead in Aisne-Marne, or apparently even to know “who were the good guys in this war,” is symptomatic of a larger cultural amnesia that has been around since the end of the Cold War. American cultural memory, in its frozen state, cannot accommodate the past. Americans must warm their collective memories of the past if they want to navigate the present. They must bring the past into the present in a way that allows for an honest discussion of the past so that they can make informed decisions about the present. Abraham Lincoln understood this when he spoke at Gettysburg. Donald Trump’s refusal to commemorate the war dead at Aisne-Marne illustrates not only how far the Republican party has shifted away from Lincoln but also how far American society has moved away from its obligation to remember the war dead. Lincoln’s promise to the dead seems unfulfilled, if not wholly broken.

Woody Guthrie's Communism and "This Land Is Your Land"

Just before her high-wire performance at the 2017 Super Bowl, Lady Gaga sang the opening lines of Irving Berlin’s “God Bless America” before segueing into Woody Guthrie’s “This Land is Your Land.” That Guthrie’s song was written in angry response to Berlin’s, and that its incorporation into such a corporate spectacle likely would have caused Woody no small amount of distress, appears to have been something Gaga was unaware of. Such confusion is hardly unique. The song has been mired in ambiguity for decades.

 

Woody Guthrie’s inspiration for the song came as he traveled from California to New York in the winter of 1940. At the time there was no getting away from Berlin’s song, which was everywhere on the radio, sung by Kate Smith — someone back in the news, not for that composition, but for singing songs with racist content. When Guthrie finally reached New York, he sat down and wrote “God Blessed America for Me,” which would become “This Land is Your Land” — with its melody taken from the Carter Family’s “When the World’s on Fire.” Guthrie’s song, rather than extolling God’s special relationship with the United States, asked how it could be that He blessed a country where people were standing in the relief lines, hungry and out of work. It was, to say the least, not a barnstormer of unabashed patriotism.

 

Guthrie & the Communist Party

 

To the degree people know the politics of Woody Guthrie today he is thought of as an advocate for social justice, with a degree of association with mid-century US communism, but mainly a free spirit who travelled among and gave voice to the dispossessed in the United States. The matter of whether or not he was an actual member of the Communist Party USA (CPUSA) has long been debated, with the consensus being he was simply not party material. 

 

On a general level this is correct, but not wholly so. Not only was Guthrie a close supporter of the party; there is strong evidence he was a member of the group for a short time in the early 1940s, before being dropped because of discipline issues. Not only that, he reapplied to the group during World War II but was rejected, according to his second wife, Marjorie Guthrie, something she revealed in Oregon’s Northwest Magazine in 1969.

 

This is different from the prevailing view, most forcefully argued in Ed Cray’s Ramblin’ Man: The Life and Times of Woody Guthrie, which cites numerous friends and former comrades who claim Guthrie was not, and could never have been, in the Party. Cray, however, contradicts himself in a footnote:

 

[The writer] Gordon Friesen, on the other hand, maintained that Guthrie was a member of the Communist Party briefly in 1942. It ended sometime in the summer months, Friesen wrote, after Guthrie was summoned by “his organizer” to “a branch meeting” in Greenwich Village. Guthrie was to answer charges of lack of discipline… He had pledged to appear at a certain Village street corner to sell Daily Workers and then had failed to show up.

 

Cray found this anecdote in a letter from Friesen to historian Richard Reuss, the author of the seminal work American Folk Music and Left-Wing Politics: 1927-1957.

 

What is notable about this story is that it is one that had been told elsewhere, by music critic Robert Shelton in his biography of Bob Dylan. Shelton says Friesen wrote to him directly, remarking on how badly the CPUSA treated artists, citing Guthrie as an example, “I remembered Woody showing me a letter from his section organizer in the Village ordering him to appear to answer charges for ‘lack of discipline’ because he had failed to show up at a certain corner to sell the Daily Worker.”

 

Both these stories, in turn, align with Pete Seeger’s recollections. According to Seeger, who spoke with Robert Santelli for his book, Hard Travelin: The Life and Legacy of Woody Guthrie, “Woody considered himself a communist. Was he a member of the Party? Not really.” According to Seeger, “The Party turned him down. He was always traveling here and there. He wasn’t the kind of guy you’d give a Party assignment to. He was a valued fellow traveler.” Seeger, however, continues:

 

On the other hand, Sis Cunningham, who was a much more disciplined person than either me or Woody, was in a Greenwich Village Branch of the Party. She got Woody in. She probably said, ‘I’ll see Woody acts responsibly.’ And so Woody was briefly in the Communist Party.

 

Sis Cunningham, it should be noted, was married to Gordon Friesen, and was one of the Almanac Singers in New York before World War II, along with Guthrie, Seeger, Bess Lomax, Millard Lampell and others. That aside, Seeger’s characterization matches with the information above.

 

Dues paying or not, Guthrie was a committed partisan of the CPUSA. As Los Angeles Party leader Dorothy Healey put it, “If he wasn’t a Party member, he was the closest thing to it.” This is important not only factually, but for what it says about the attention the FBI directed at him. The Bureau tracked Woody Guthrie for over three decades, compiling files adding up to 593 pages. If his association with the Party was as tenuous as some have claimed, then the FBI was seriously off track in the attention they gave him. From their perspective, pursuing people with serious ties to US communism made sense. Which is not to say it was just. Guthrie was not breaking any laws or otherwise engaged in activity meriting such attention. In fact it is abominable that the FBI continued to monitor him for years after his diagnosis with Huntington’s Chorea, as he was losing his ability to walk and even speak.

 

A Song With Many Meanings

 

Woody Guthrie met the Communist Party in Los Angeles in 1939, while working at radio station KFVD. There he met Ed Robbin, a writer for People’s World, the party’s West Coast newspaper. One of the first things Guthrie did after meeting Robbin was to write a song about Tom Mooney, a labor organizer who had been imprisoned for allegedly bombing a “Preparedness Day” parade held in preparation for the US entering World War I. Guthrie’s first partisan act in that respect was to write “Tom Mooney is Free” on the occasion of Mooney’s release. A greater example of his partisanship came in the lyric “Why Do You Stand There in the Rain?” which he wrote in response to Franklin Roosevelt’s scolding of a youth rally — one that included communists — held soon after the USSR went to war against Finland. Among other things the lyrics take a strong anti-war stand consistent with the CP’s slogan at the time, “The Yanks Aren’t Coming” — a position the party held during the non-aggression pact between Germany and the USSR. Guthrie, in other words, was already incorporating the political line of the Communist Party into his lyrics when he sat down at actor Will Geer’s house in New York to write what would become “This Land is Your Land.”

 

In that respect, in order to better understand the song, one needs to understand the peculiarity of the CPUSA under the leadership of Earl Browder. A major slogan of the CP when Woody came on the scene was, “Communism is 20th Century Americanism.” That slogan was in keeping with Browder’s attempt to create a big tent for communism in the United States, steeped in anti-fascism and social democracy. That the slogan was a mash-up of communism and US exceptionalism helps explain why Guthrie’s song stops in New York, rather than going on to the wider world — communism, after all, was supposed to be internationalist. Browder and the CP’s approach to communism was far more U.S.-centric than internationalist — except of course when it came to supporting the geopolitical dictates of the Soviet Union. All of which explains the orientation of the song.

 

Depending on the listener, “This Land is Your Land” can be heard in different ways. Guthrie most likely intended it as a call to move beyond private property, and toward a greater equality and common humanity. Notably, the lines usually excluded talk about encountering a sign reading “No Trespassing,” while the other side of the sign “didn’t say nothing” — that being the sign that was “made for you and me.”

 

More moderately it can be heard  as a liberal-secular hymn, in which all people ought to share in the country’s bounty. That is why Pete Seeger and Bruce Springsteen could safely perform it at Barack Obama’s first inaugural celebration.

 

Alternatively still, and not without basis, it can be heard as a proclamation of American chauvinism — in fact the song has been criticized as a justification of manifest destiny and even of the theft of Native lands, because its lines extol a US landscape acquired through no small amount of blood and conquest.

 

Such debate will likely never get fully resolved. However, given the politics of its author, and in the interests of showing a little respect for the dead, it would seem a modest request — all due respect to Lady Gaga — to ask that the song not again be sung as part of a medley with “God Bless America.”

Ruth Bader Ginsburg Helped Shape the Modern Era of Women’s Rights – Even Before She Went on the Supreme Court  

Judge Ruth Bader Ginsburg paying a courtesy call on Sen. Daniel Patrick Moynihan, D-N.Y., left, and Sen. Joseph Biden, D-Del., in June 1993, before her confirmation hearing for the Supreme Court. AP/Marcy Nighswander

Jonathan Entin, Case Western Reserve University

Justice Ruth Bader Ginsburg died on Friday, the Supreme Court announced.

Chief Justice John Roberts said in a statement that “Our nation has lost a jurist of historic stature.”

Even before her appointment, she had reshaped American law. When he nominated Ginsburg to the Supreme Court, President Bill Clinton compared her legal work on behalf of women to the epochal work of Thurgood Marshall on behalf of African-Americans.

The comparison was entirely appropriate: As Marshall oversaw the legal strategy that culminated in Brown v. Board of Education, the 1954 case that outlawed segregated schools, Ginsburg coordinated a similar effort against sex discrimination.

Decades before she joined the court, Ginsburg’s work as an attorney in the 1970s fundamentally changed the Supreme Court’s approach to women’s rights, and the modern skepticism about sex-based policies stems in no small way from her lawyering. Ginsburg’s work helped to change the way we all think about women – and men, for that matter.

I’m a legal scholar who studies social reform movements and I served as a law clerk to Ginsburg when she was an appeals court judge. In my opinion – as remarkable as Marshall’s work on behalf of African-Americans was – in some ways Ginsburg faced more daunting prospects when she started.

Thurgood Marshall, in 1955, when he was the chief counsel for the NAACP. AP/Marty Lederhandler

Starting at zero

When Marshall began challenging segregation in the 1930s, the Supreme Court had rejected some forms of racial discrimination even though it had upheld segregation.

When Ginsburg started her work in the 1960s, the Supreme Court had never invalidated any type of sex-based rule. Worse, it had rejected every challenge to laws that treated women worse than men.

For instance, in 1873, the court allowed Illinois authorities to ban Myra Bradwell from becoming a lawyer because she was a woman. Justice Joseph P. Bradley, widely viewed as a progressive, wrote that women were too fragile to be lawyers: “The paramount destiny and mission of woman are to fulfil the noble and benign offices of wife and mother. This is the law of the Creator.”

And in 1908, the court upheld an Oregon law that limited the number of hours that women – but not men – could work. The opinion relied heavily on a famous brief submitted by Louis Brandeis to support the notion that women needed protection to avoid harming their reproductive function.

As late as 1961, the court upheld a Florida law that for all practical purposes kept women from serving on juries because they were “the center of the home and family life” and therefore need not incur the burden of jury service.

Challenging paternalistic notions

Ginsburg followed Marshall’s approach to promote women’s rights – despite some important differences between segregation and gender discrimination.

Segregation rested on the racist notion that Black people were less than fully human and deserved to be treated like animals. Gender discrimination reflected paternalistic notions of female frailty. Those notions placed women on a pedestal – but also denied them opportunities.

Either way, though, Black Americans and women got the short end of the stick.

Ginsburg started with a seemingly inconsequential case. Reed v. Reed challenged an Idaho law requiring probate courts to appoint men to administer estates, even if there were a qualified woman who could perform that task.

Sally and Cecil Reed, the long-divorced parents of a teenage son who committed suicide while in his father’s custody, both applied to administer the boy’s tiny estate.

The probate judge appointed the father as required by state law. Sally Reed appealed the case all the way to the Supreme Court.

Ginsburg did not argue the case, but wrote the brief that persuaded a unanimous court in 1971 to invalidate the state’s preference for males. As the court’s decision stated, that preference was “the very kind of arbitrary legislative choice forbidden by the Equal Protection Clause of the 14th Amendment.”

Two years later, Ginsburg won in her first appearance before the Supreme Court. She appeared on behalf of Air Force Lt. Sharron Frontiero. Frontiero was required by federal law to prove that her husband, Joseph, was dependent on her for at least half his economic support in order to qualify for housing, medical and dental benefits.

If Joseph Frontiero had been the soldier, the couple would have automatically qualified for those benefits. Ginsburg argued that sex-based classifications such as the one Sharron Frontiero challenged should be treated the same as the now-discredited race-based policies.

By an 8–1 vote, the court in Frontiero v. Richardson agreed that this sex-based rule was unconstitutional. But the justices could not agree on the legal test to use for evaluating the constitutionality of sex-based policies.

New York Times article about the Wiesenfeld case, which refers to Ginsburg as ‘a woman lawyer.’ New York Times

Strategy: Represent men

In 1974, Ginsburg suffered her only loss in the Supreme Court, in a case that she entered at the last minute.

Mel Kahn, a Florida widower, asked for the property tax exemption that state law allowed only to widows. The Florida courts ruled against him.

Ginsburg, working with the national ACLU, stepped in after the local affiliate brought the case to the Supreme Court. But a closely divided court upheld the exemption as compensation for women who had suffered economic discrimination over the years.

Despite the unfavorable result, the Kahn case showed an important aspect of Ginsburg’s approach: her willingness to work on behalf of men challenging gender discrimination. She reasoned that rigid attitudes about sex roles could harm everyone and that the all-male Supreme Court might more easily get the point in cases involving male plaintiffs.

She turned out to be correct, just not in the Kahn case.

Ginsburg represented widower Stephen Wiesenfeld in challenging a Social Security Act provision that provided parental benefits only to widows with minor children.

Wiesenfeld’s wife had died in childbirth, so he was denied benefits even though he faced all of the challenges of single parenthood that a mother would have faced. The Supreme Court gave Wiesenfeld and Ginsburg a win in 1975, unanimously ruling that sex-based distinction unconstitutional.

And two years later, Ginsburg successfully represented Leon Goldfarb in his challenge to another sex-based provision of the Social Security Act: Widows automatically received survivor’s benefits on the death of their husbands. But widowers could receive such benefits only if the men could prove that they were financially dependent on their wives’ earnings.

Ginsburg also wrote an influential brief in Craig v. Boren, the 1976 case that established the current standard for evaluating the constitutionality of sex-based laws.

Like Wiesenfeld and Goldfarb, the challengers in the Craig case were men. Their claim seemed trivial: They objected to an Oklahoma law that allowed women to buy low-alcohol beer at age 18 but required men to be 21 to buy the same product.

But this deceptively simple case illustrated the vices of sex stereotypes: Aggressive men (and boys) drink and drive, women (and girls) are demure passengers. And those stereotypes affected everyone’s behavior, including the enforcement decisions of police officers.

Under the standard delineated by the justices in the Boren case, such a law can be justified only if it is substantially related to an important governmental interest.

Among the few laws that satisfied this test was a California law that punished sex with an underage female but not with an underage male as a way to reduce the risk of teen pregnancy.

These are only some of the Supreme Court cases in which Ginsburg played a prominent part as a lawyer. She handled many lower-court cases as well. She had plenty of help along the way, but everyone recognized her as the key strategist.

In the century before Ginsburg won the Reed case, the Supreme Court never met a gender classification that it didn’t like. Since then, sex-based policies usually have been struck down.

I believe President Clinton was absolutely right in comparing Ruth Bader Ginsburg’s efforts to those of Thurgood Marshall, and in appointing her to the Supreme Court.

Jonathan Entin, Professor Emeritus of Law and Adjunct Professor of Political Science, Case Western Reserve University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Historians Respond to the Death of Justice Ruth Bader Ginsburg

Nostalgia and the Tragedy of Trump's Speech at Mount Rushmore  

Standing in front of Mount Rushmore on July 3, Donald Trump offered the American people a recitation of their history that was not only stirring but saturated with nostalgia.  With visages of notable presidents looming behind him, he presented a magical tale of a wonderful nation pursuing its manifest destiny and the heroes who helped it along.  Magnificent achievements abounded; victims and aggression were nowhere to be found.  His oration perfectly underscored David Lowenthal's point that nostalgia is memory with the pain removed. The pain is in the present. 

 

In Trump's sentimental and self-satisfying rendering of our past, Americans emerge as a people beyond reproach--free of any hint of original sin or culpability for sordid deeds. He legitimately praised the laudable actions of America's doctors and nurses who fought the coronavirus, the "wonderful veterans" who fought our wars and the law enforcers who patrol our streets. He stood in awe of George Washington for persevering through the bitter winter at Valley Forge, of Thomas Jefferson for authoring the Declaration of Independence, calling it "one of the greatest treasures of human history," and of Abraham Lincoln for saving the union and ending slavery. He paid tribute to "our founders" for launching not only a revolution in government, but a revolution in the pursuit of "justice, liberty, and prosperity." He even found space in his pantheon of heroes for Muhammad Ali, Buffalo Bill Cody, Frederick Douglass, Andrew Jackson, General George Patton, and Elvis Presley. Certainly there was much that Americans could be proud of in this particular history lesson, but Trump's omissions were glaring. The ache of Indian removal, Jim Crow, lynchings, race riots, labor exploitation, Vietnam, Iraq, sexual abuse, and environmental degradation was left out of this version of who we were.

                  

Soon, however, Trump's wistful saga turned ominous. The president declared that the future of the "most just and exceptional nation ever to exist on earth" was in peril, threatened by forces that emanated not from distant shores but from within. He warned his listeners of dangers that jeopardized "every blessing our ancestors fought so hard for." America was now threatened by "angry mobs" who were attempting to deface "our most sacred memorials and unleash a wave of violent crime in our cities." Entrenched in urban enclaves run by "liberal democrats," these hordes represented the vanguard of a "new far-left fascism," intolerant of contradictory views, now spreading a form of totalitarianism in our schools and newsrooms and threatening to unravel the very revolution that gave birth to the nation in the first place. In an age when Americans had come to fear Muslims from abroad, immigrants with criminal intent, and a deadly virus, Trump now insisted that the real existential threat to America and its glorious history came from domestic terrorists who opposed him politically and plotted to "defame our heroes, erase our values, and indoctrinate our children."

 

Trump's speech offered a classic case of what scholars such as Svetlana Boym have called "restorative nostalgia," a highly emotional impulse that longs to flee from the incessant swirl of change and tension in the present and to find truth and comfort in a romanticized past free of turmoil and trauma. Such wistfulness is popular in our time because it promises to serve as a source of enduring truths--suspending the need for critical thinking--and to free the virtuous nation from any responsibility to address wrongs it may have committed. In Trump's retelling of American history, in fact, there were no misdeeds. He offered a memory and a history free of guilt, and certainly without any evidence that might support claims for justice in the present.

 

Boym also identified a contrasting form of nostalgia that she claimed was more "reflective." In this particular turn to the past, facts are prized more than myths, and a careful assessment is made of what has worked and what has failed so that improvements or reforms can be made going forward. This more thoughtful nostalgia longs not for the return of paradise but for the implementation of lessons learned from a history filled with forward and backward thrusts. It serves as an antidote to the lure of utopia, or to an "unreflected nostalgia," which can breed "monsters" or evil forces like "angry mobs." Senator Joseph McCarthy's warnings in 1950 that there were communists or "men in high places" lurking in the government that threatened American values were a prime example of this turn to "monsters." When "mobs" appear in the streets, defenders of paradise can never see their grievances as legitimate, because there can be no reason to critique what is already exceptional and faultless. The only recourse is to quash the mob, drain the swamp, demand a restoration of law and order, and reaffirm traditional values. Negotiations would be a waste of time.

 

Ironically, Abraham Lincoln, whose monument glared down at Trump as he spoke in South Dakota, turned out to be a nostalgic as well, but one who relied more on a sober analysis of the past he longed for than on a wistful one. Drawing upon his experience of leading the nation through the Civil War, the sixteenth president yearned for earlier times when America was peaceful and united. As most historians will argue, Lincoln fought the war to save the union. But Lincoln, drawing upon his view of history, saw the fight to save the United States as much more than a preservation project. He felt Americans had a debt to their forefathers to sustain not only the nation they created but what he called America's "first principles." He shared the sense that the nation was exceptional, not because it produced heroic figures but because it was born with a moral obligation to promote and protect the ideal of universal liberty and equal rights for all.

 

Lincoln did not always believe as fully in these ideals as he might have. Before the Civil War he often felt the best solution to the race problem in America was for African-Americans to return to their home continent. But over time--as he reflected upon the bitter disputes over slavery that had divided America before the 1860s and upon the human sacrifices of the war--he came to see clearly that the task before him--and for Americans who followed him--was to save the union so that it could continue to serve as a moral agent pressing for tolerance and equal rights for men and women everywhere. To Lincoln, America's destiny--shaped by its history--was less about producing heroes or quashing imagined conspiracies than ensuring that all Americans had access to their birthright of equality. That is why his famous speech at Gettysburg mentioned "our fathers," the pain of soldier deaths, and the continuing need to see that a government of the people should not perish from the earth.

                  

Trump's story of a flawless people and nation would not have resonated with Lincoln because it failed to emphasize the central tenet of the American experience that all human beings needed to be seen in the light of the dignity they possessed and the rights they deserved. Trump did praise Jefferson and the Declaration of Independence in his talk but then he buried them in a convoluted tale of heroes, destiny, statues,  mobs, and violent cities that left the impression that his need to destroy political opponents took precedence over everything else.  He showed no interest in endorsing Lincoln's point that the government and the nation needed to be saved so that it could continue the progressive dream of expanding human rights into the foreseeable future.  In Trump's future he planned to build a monument to all of America's heroes and ventured that centuries from now our legacy would be seen in the cities we built and the "champions we forged."  The tragedy of his nostalgia was not only that it failed to contemplate the calamities of the history he told but that his vision for the nation had drifted so far from the razor-sharp devotion Lincoln had to human rights.  

Dwight Eisenhower Built up American Intelligence at a Crucial Moment

Soviet officer inspects CIA tunnel under East Berlin, 1956. Photo Bundesarchiv, Bild. CC BY-SA 3.0 de

More than any other president--with the possible exception of George Washington--Dwight D. Eisenhower did not need on-the-job training to understand the value of good intelligence. As Supreme Allied Commander in Europe during World War II, Eisenhower relied heavily on Ultra, the British code-breaking operation that allowed the Allies to read encrypted German communications. At the war’s conclusion, Eisenhower said the intelligence had been “of priceless value to me.”

So it was with no little chagrin that upon taking office in Washington in January 1953, Eisenhower learned just how far western intelligence had declined since the war.

This week, the Dwight D. Eisenhower Memorial is being dedicated in Washington, not far from the U.S. Capitol. Eisenhower’s presidency is sometimes overshadowed by his wartime command. One aspect in particular that is often overlooked is how vastly US intelligence capabilities increased during his administration.

Eisenhower took office at a uniquely vulnerable period in American history. The Soviet Union had already shocked the West in 1949 by successfully testing a nuclear bomb after stealing Manhattan Project secrets. In August 1953, the Soviets detonated their first hydrogen bomb. It was an unpleasant surprise for Eisenhower—Western intelligence had no inkling the Soviets would achieve such destructive capability so quickly. On top of the nuclear threat, an enormous Red Army force--never withdrawn from Eastern Europe at the end of World War II--remained poised along the borders with Western Europe.

But as Eisenhower soon learned, the U.S. had virtually no good intelligence on the Soviet Union. For some years during and after World War II, the U.S. intercepted and decrypted secret Soviet radio communications as part of a secret program codenamed VENONA. But in 1948, after the secret was betrayed by KGB spies Kim Philby and William Weisband, the Soviets changed their cryptographic systems and shifted much of their communications from radio to landlines, leaving the West almost entirely in the dark about Moscow’s military capabilities and intentions. CIA efforts to place agents inside the Soviet Union had failed miserably. Other than rare overflights along the periphery of Soviet territory by U.S. and British military aircraft, there was none of the overhead imagery that the U-2 and satellites would later provide. “We were simply blind,” said David Murphy, a CIA officer who would serve in Berlin.

Not long after the Soviet hydrogen bomb test, CIA Director Allen Dulles told Eisenhower “the Russians could launch an atomic attack on the United States tomorrow.” It left the president wondering whether he should consider launching a first strike to preempt the Soviets. “As of now, the world is racing toward catastrophe,” he wrote gloomily in his diary.

So Eisenhower was receptive when Dulles brought him a proposal soon afterwards for what would become one of the most audacious espionage operations of the Cold War, involving the divided city of Berlin. The idea was to team with British intelligence to dig a quarter-mile tunnel from West Berlin into East Berlin to tap into underground cables used by the Red Army to communicate with Moscow. The Berlin Tunnel, as recounted in Betrayal in Berlin, would be simultaneously the largest signals intelligence and covert operation the CIA had conducted to that point--not to mention effectively an incursion into Soviet-held territory.

Eisenhower gave the tunnel his ready approval, as did his former partner from World War II, Winston Churchill—who had returned to power in 1951 as Britain’s prime minister and was likewise dismayed at the lack of intelligence about the Soviet military.

As president, Eisenhower pushed for aggressive intelligence gathering--within limits. “In general we should be as unprovocative as possible but he was willing to take some risks,” Andrew Goodpaster, then an Army colonel serving as Eisenhower’s staff secretary, later recalled. Eisenhower refrained from asking too many questions about exactly what the CIA was up to in Berlin. “He insisted that he have access to everything, and I think we did,” said Goodpaster. “But there were things that he deliberately did not inform himself about.” Eisenhower liked having plausible deniability, to guard against having to lie to the press or Congress about what he had known.

“President Eisenhower did not feel that he wanted to know the specifics of all these activities,” recalled Dillon Anderson, who served as Eisenhower’s national security advisor. “I don’t think he particularly wanted to know” the elaborate details of how the CIA intended to tunnel into East Berlin, Anderson said. But the president was keenly interested in the end product.

Construction of the tunnel began in great secrecy in September 1954, dug by a small U.S. Army Corps of Engineers team. They used the cover of an Army warehouse in Rudow, a remote corner of the American sector, to disguise the project from curious Soviet and East German guards across the nearby border.

As work continued, Dulles came to the president seeking authorization for another secret program, this one to develop a special high-altitude reconnaissance aircraft that would become known as the U-2. Once again, Eisenhower approved without hesitation. "Our relative position in intelligence, compared to the Soviets, could scarcely have been worse," he later wrote. Bigger and better fleets of bombers and improved guided missile capability had given the Soviets an "ever-growing capacity for launching surprise attacks against the United States," Eisenhower believed.

He admitted to being "haunted" by the threat of a nuclear Pearl Harbor and created two commissions in 1954 to examine the ability of U.S. intelligence to protect the nation against such an attack. The first report, a review of CIA covert operations led by Lieutenant General James Doolittle, hero of the wartime raid on Tokyo, described the U.S. as losing an intelligence battle that could have apocalyptic consequences: "If the United States is to survive, long-standing American concepts of 'fair play' must be reconsidered," Doolittle wrote. The second commission, headed by MIT President James Killian, was more sober-minded but equally chilling in its conclusions. "The advantage of surprise attack has never been so great as now," the Killian report said.

Good intelligence could not come too soon, as far as Eisenhower was concerned. “Our old conceptions of the time that would be available to governments for making of decisions in the event of attack are no longer tenable,” Eisenhower wrote to Churchill in January 1955. “I think it possible that the very life of a nation, perhaps even of Western civilization, could . . . hang upon minutes and seconds used decisively at top speed or tragically wasted in indecision.” 

In May 1955, after eight months of delicate work, the American and British tunnel team succeeded in tapping the first of the targeted trunk cables, located 27 inches below the surface of a heavily traveled East Berlin road. Back in the warehouse, a team of linguists and analysts was quickly inundated by a gusher of captured telephone calls and teletype communications, involving everyone from senior Soviet commanders to low-ranking logistician clerks across East Germany. Cartons filled with tape recordings were soon being flown almost every day to processing centers in London and Washington staffed by hundreds of translators, transcribers and analysts.

Bit by bit, a mosaic of the Soviet military was painstakingly assembled—its organization, deployment of forces, strengths and weaknesses, training, tactics, weaponry, radio and telephone networks, and system of passwords. The captured conversations also revealed details about the Soviet nuclear program, Kremlin machinations, and Soviet intelligence operations, along with much other critical information. For Eisenhower, though, the greatest value of the tunnel lay in what it did not show: any indication that the Soviets were planning an attack. The preemptive strike Eisenhower feared he would need to launch never happened.

Ironically, the KGB had been tipped to plans for the tunnel by George Blake, a British intelligence officer involved in the project. But the KGB found itself in a dilemma. Blake was proving himself invaluable as a Soviet spy, and if the KGB did anything to stop the tunnel, he would immediately fall under suspicion, as one of only a handful who knew of the operation. Planting disinformation was likewise too risky, because it would stick out like a sore thumb amidst the torrent of real information captured by the tunnel. So the KGB left Red Army commanders in the dark. By the time the Soviets finally staged a discovery in April 1956, the tunnel had intercepted some 90,000 communications. In a sense, it was a precursor to the mass surveillance that would be employed by the National Security Agency.

Less than three months after the tunnel’s demise, a U-2 took off from West Germany on July 4, 1956, for the first overflight of Soviet territory. It was almost like handing off a baton. For nearly a year, the tunnel had provided the early warning the U.S. and its allies needed. Now, the U-2, with high resolution cameras able to cover vast amounts of territory from high altitudes, would be able to track the movement of military equipment, weaponry, troops and other logistical signs that might signal plans for an attack. The downing of a spy plane over the USSR in 1960 would prove deeply embarrassing to Eisenhower, when the Soviets exposed White House denials of espionage as lies by producing captured CIA pilot Francis Gary Powers. But despite Eisenhower’s regret over a cancelled summit meeting with Soviet leader Nikita Khrushchev, the value of the U-2 would be indisputably proven two years later when an overflight spotted Soviet nuclear missiles in Cuba.

Before Eisenhower left office in January 1961, the world’s first photo reconnaissance satellites had been launched as part of CORONA, another secret program the president had authorized. CORONA revolutionized intelligence collection, providing the CIA with the capability of scanning the globe. While many American intelligence disasters lay ahead, from the Bay of Pigs to Iraq and beyond, never again would the U.S. be as utterly blind as it had been when Eisenhower took office.

Rick Perlstein’s Reaganland: America’s Right Turn, 1976-1980

Cover Detail, Rick Perlstein's Reaganland, Simon & Schuster

Here are two key things to know about Rick Perlstein’s new book, Reaganland: 

First, despite the title, the book is much more than a political biography of Reagan. Covering the years 1976-1980, this is the latest in Perlstein’s four-volume series on the rise of the American conservative movement.  In fact, much of Reaganland details the rise and fall of the Carter administration, which, of course, made Reagan’s election in 1980 possible.

Second, while Donald Trump is mentioned only once (as a young real estate developer buying a Manhattan hotel with government subsidies), this book goes a long way to explaining the rise of the conservative constituencies (e.g. blue-collar workers, evangelicals) that made his 2016 election possible.

As Reaganland sets forth, many of the tactics used by Trump’s campaign were invented or perfected by Reagan’s skilled team of advertising and PR professionals. These include thinly veiled racist language to appeal to “aggrieved” white voters, careful cultivation of evangelical Christians and foot-stomping, flag-waving rallies in blue-collar cities.   

Although the phrase “Make America Great Again” was used in Reagan’s campaign, it was just one of a dozen different messages, all implying the nation had declined under Jimmy Carter. The all-white crowds that attended Reagan’s rallies often chanted “Reagan’s right! Reagan’s right!”

Some of the issues in the 1980 campaign would be familiar to today’s voters, such as the loss of manufacturing jobs, increased crime in big cities and a perceived weakness in the military. Other issues, such as Carter’s decision to cede control of the Panama Canal, the SALT II nuclear arms talks and the campaign for the Equal Rights Amendment, are topics that have faded out of sight.

A Pre-Internet Age

To read Reaganland is to be immersed in the world of 1979, where the pace of life, at least in terms of media consumption, was far more leisurely. Mass communication was limited to the three TV networks and your local daily newspaper. Cable TV was generally limited to distant rural areas, and the Cable News Network did not produce its first newscast until June 1980. The worldwide web, smart phones and streaming media were the stuff of science fiction. Most phones used rotary dials and music lovers purchased vinyl albums.

For political campaigns, this meant a relentless focus on staging a series of colorful campaign events to gain coverage in local newspapers and TV stations. Print was the dominant medium, with some 1,800 daily newspapers across America, all of them influencers on local voters. 

In this pre-Internet age, the only way to reach directly into American homes with an unfiltered message was through the mail — and a skilled right-wing marketer gave conservative causes a major advantage. Richard Viguerie had perfected the art of direct mail over twenty-five years; he was dubbed “the six-million-dollar man” for his ability to rake in huge sums with a single, carefully targeted mailing.

This fundraising tool was just one of the factors that gave the conservative movement new power that enabled it to fracture the traditional liberal coalition of labor unions, urban dwellers and white-collar workers. Perlstein describes the rise of a right-wing “counterintelligentsia,” including the American Enterprise Institute, the Heritage Foundation and the American Conservative Union. These groups churned out fact sheets, position papers and editorials supporting “free enterprise” and “reduced government intervention.”  

While candidate Reagan usually projected an image of sunny optimism, his campaign staff was busy appealing to the darker recesses of the American psyche. Perlstein notes “Reagan’s managers were targeting voters who felt victimized by government actions that cost them the privileges their whiteness once afforded them.”

This climate of white racial grievance was fostered by conservative-leaning newspapers, like The Chicago Tribune, which ran dozens of stories in 1979 about “welfare queens,” usually Black mothers, who “grew comfortable living off the public purse.”

A National Election Studies poll in 1980 found that the share of Americans who supported increased spending to improve the conditions of minorities fell to 24%, a record low. Instead, 46% of those surveyed said minorities should “help themselves” out of poverty.

Other campaign messages appealed to the fears of suburban women and evangelical Christians. Campaign surrogates spoke of “threats” to the American family in the form of homosexual teachers, “women’s libbers,” abortion mills and rampaging criminals from the inner city.

Carter’s Mistakes

Reagan benefited enormously from President Carter’s many stumbles, some self-inflicted and others due to outside forces. In the summer of 1979, a gasoline shortage swept across America. Kicked off by increasing worldwide consumption and then exacerbated by an American trucking strike, the shortage produced long lines at service stations. Frustrated drivers demanded Carter end it.

Then in November 1979, Iranian militants seized the American embassy in Teheran, taking dozens of hostages; 52 Americans would be held for more than a year. Carter’s favorability ratings plummeted. At the August 1980 Democratic Convention, Carter barely survived a challenge from Senator Ted Kennedy. Moderate Republican Congressman John Anderson ran as an independent candidate and attracted voters by advocating a 50-cent-per-gallon gasoline tax. On election night, Reagan swept to victory, winning 51% of the popular vote to Carter’s 41% and gaining a landslide in the electoral college: 489 to 49.

Perlstein’s great strength is weaving together a compelling narrative from a range of disparate sources. He is particularly skillful at showing how popular culture can influence voter concerns. Thus, Reaganland includes references to Eric Clapton, Tom Wolfe, Jane Fonda and movies such as The China Syndrome, Star Wars and The Godfather. 

One of the book’s failings, however, is a superficial treatment of Reagan the man. His two younger children, Patti Davis and Ronald Reagan, Jr., are never mentioned. Nancy Reagan is treated as a lightweight who merely organized parties. Several other biographers have shown that she had a major influence on her husband’s policies and personnel choices.

In this campaign year, a shelf full of books has been written about Trump. Administration insiders (John Bolton) and family members (Mary Trump) have shined a light on the many flaws of Donald J. Trump.

But as Reaganland reveals, Trump did not rise in a vacuum. He tapped into racial divisions and cultural anxieties that had been carefully cultivated years ago by Reagan and his far-right supporters.  

Unlike the Germans We Have Failed to Recognize and Atone for Our Holocausts

Photo 1940.

 

 

 

As a nation we have not owned up to the grievous, centuries-long harm we have done to Native and African Americans. To what extent are we historians responsible?

 

In late August 2020 a policeman in Kenosha, Wis. fired seven shots into the back of a Black man named Jacob Blake. For three months--from late May, when another Black man, George Floyd, was killed by a policeman who knelt on his neck--we have seen nearly continuous protests over how police have treated Black Americans. Shortly after the Blake shooting, various professional athletes, following the lead of the NBA’s Milwaukee Bucks, canceled some scheduled games or practices.

 

Former NBA hoopster and now Los Angeles Clippers coach Doc Rivers succinctly expressed the grief felt by many Black people: 

 

“We're the ones getting killed. We're the ones getting shot. We're the ones that were denied to live in certain communities. We have been hung, we have been shot. And all you do is keep hearing about fear. It's — it's amazing to me why we keep loving this country, and this country does not love us back. And it's just — it's really so sad.”

 

“This country does not love us back.” Why? Why does systemic racism continue to trouble our country, a century and a half after the end of slavery? One answer is that we have not sufficiently acknowledged, nor sufficiently atoned for, the sin of slavery.

In a previous article I referred to two of the USA’s “most heinous crimes—genocide against Native Americans and slavery.” Now (in my title) I refer to them as “our holocausts.”

 

I do not use the word “holocaust” carelessly. Historians should guard against throwing around such words loosely. But holocaust should not be limited to “the killing of millions of Jews by the German Nazi government in the period 1941–5.” This is only an Oxford dictionary’s second definition. The first is “a situation in which many things are destroyed and many people killed, especially because of a war or a fire.” Long before 1941 the term was used in this way, and even more broadly--see, for example, F. Scott Fitzgerald’s The Great Gatsby.

 

I use “holocausts” here because it seems appropriate for situations “in which many things are destroyed and many people killed.” And it captures best the enormity of what European colonizers and then white Americans have done to Native and African Americans. Moreover, some scholars, such as Russell Thornton in American Indian Holocaust and Survival: A Population History since 1492 (1990) and David Stannard in American Holocaust: Columbus and the Conquest of the New World (1993), have previously applied the word in a similar way. 

To get some idea of the numbers involved, let’s start with the following quote from Jill Lepore’s These Truths: A History of the United States. “Between 1500 and 1800, roughly two and a half million Europeans moved to the Americas [note: not just the USA]; they carried twelve million Africans there by force; and as many as fifty million Native Americans died, chiefly of disease,” most of them because they had no immunity to the diseases passed on to them by those of European ancestry. In discussing this most common means of death, however, historian Roxanne Dunbar-Ortiz emphasizes that colonizers didn’t regard all those deaths as just unfortunate accidents, but from the very beginning intended to eliminate, one way or another, Indian civilization. Stannard stressed the continuing interaction of the diseases with a “deliberate racist purge.”

Although estimates of Native American population in 1492 in what is today the conterminous United States vary widely, Thornton estimated it at about 5+ million, and added that it “declined to but 600,000 by 1800,” and “to about 250,000 by the last decade of the nineteenth century…. This was a population some 4 to 5 percent of its former size.” No wonder he referred to the “American Indian Holocaust.”

Stannard claimed that “the destruction of the Indians of the Americas was, far and away, the most massive act of genocide in the history of the world.” And despite some twentieth-century improvements and an increased Indian population, he still faulted (in 1993) the U. S. government for “its willful refusal to deal adequately with the life-destroying poverty, ill health, malnutrition, inadequate housing, and despair that is imposed upon most American Indians who survive today.”

One government report indicates that Native Americans “have long experienced lower health status when compared with other Americans. Lower life expectancy and the disproportionate disease burden exist perhaps because of inadequate education, disproportionate poverty, discrimination in the delivery of health services, and cultural differences.” In 2020, according to a late-July New York Times article, “there are strong indications that Native Americans have been disproportionately affected by the coronavirus. The rate of known cases in the eight counties with the largest populations of Native Americans is nearly double the national average.”

According to the 2018 Census, Native Americans have a national poverty rate of 25.4%, African Americans 20.8%, Hispanics 17.6%, and Whites 8.1%. Indians also are less educated than other groups. From 2013 to 2017, only 14.3% of Native Americans had a bachelor’s degree or higher, compared to 15.2% of Hispanics, 20.6% of African Americans and 34.5% of Whites. Also, “Native Americans experience substance abuse [including alcohol] and addiction at much higher rates than other ethnic groups.”

With any such woes there is always the question of responsibility. To what extent is it societal or due to personal failings? Although the exact mix is difficult to determine, there is no doubt that we Whites and the governments that have represented our interests bear a heavy responsibility for the historic mistreatment of Native and African Americans.

Of the horrendous conditions facing slaves captured in Africa and sent to the United States much has been written. By 1790, “there were almost 700 thousand slaves in the US . . . which equated to approximately 18 percent of the total population.” By 1860, “there were four million slaves in the South, compared with less than 0.5 million free African Americans in all of the US.” The slaves, who made up about half of the South’s population, worked mainly on large cotton plantations. (For a fictional treatment of slavery, see Harriet Beecher Stowe’s Uncle Tom’s Cabin.)

After the Civil War years and the end of slavery, the Reconstruction era (1865–1877) followed, but it got off to a slow start due to President Lincoln’s assassination and his successor’s presidency. For Andrew Johnson was a former Tennessee governor and, in the words of historian Eric Foner, “incorrigibly racist.” As Lepore writes, “By the winter of 1865–66, Southern legislatures consisting of former secessionists had begun passing ‘black codes,’ new, racially based laws that effectively continued slavery by way of indentures, sharecropping, and other forms of service.” In 1866 the Ku Klux Klan began their decades of White terror against Black people.  

Congress, however, opposed Johnson and Southern racist policies. In April 1866, it overcame a Johnson veto to pass the Civil Rights Act. During the two-term presidency of Ulysses Grant (1869-1877), Reconstruction policies, aided by federal troops stationed in the South, prevailed. According to Douglas Brinkley, during Reconstruction 22 Black legislators served in Congress, and “Blacks were elected to the legislatures of every one of the Confederate states.”  

Yet, as Lepore writes, “Political equality had been possible, in the South, only at the barrel of a gun.” What followed was what Henry Louis Gates Jr. calls “the Redemption era” of 1877 to 1915, when Black enfranchisement ended; the Klan and other Whites terrorized Black people; and Jim Crow laws segregated them from Whites in various places from playgrounds to public transport.  

Thousands of Black men were also lynched. As late as 1930, in Marion, Indiana, two innocent young Black men were lynched, surrounded by white onlookers. In Cincinnati, where I grew up, the swimming pool at Coney Island did not permit Black swimmers until 1961. In the early 1960s, I lived in northern Virginia, where interracial marriage was still prohibited and where I remember picketing a local movie theater that still discriminated against Black patrons.

More recently, Gates recalls other signs of racism: Dylann Roof murdering “the Reverend Clementa Pinckney and the eight other innocents in Mother Emanuel AME Church in Charleston, South Carolina, on June 17, 2015”; the white supremacy rally in Charlottesville, Virginia, on August 12, 2017, “when an avowed white supremacist plowed his car into a crowd of counter-protesters”; and a White racist, in October 2018, unable to get into the “predominantly black First Baptist Church in Jeffersontown, Kentucky,” settling instead for fatally shooting “two African American shoppers” at a local Kroger. 

Between 2018 and today, besides the George Floyd and Jacob Blake episodes, numerous other cases reflecting our ongoing racism could be mentioned, many of them abetted by President Trump, who commented about the white supremacy rally in Charlottesville and opponents of it, “there's blame on both sides,” and there “were very fine people, on both sides.”

The disadvantaged poverty rate and educational attainment of Black Americans have already been mentioned. In addition, the life expectancy of Black males is lower and their incarceration rate much higher than for White men. In 2020, as the government CDC reports, Black and other “racial and ethnic minority groups are being disproportionately affected by COVID-19. Inequities in the social determinants of health, such as poverty and healthcare access, affecting these groups are interrelated and influence a wide range of health and quality-of-life outcomes and risks.”   

Thus, from the days of Columbus to the present, Native and African Americans have suffered grievously. Despite significant gains, including the election of our first Black president, the Trump presidency has set back attempts to alleviate the effects of past injustices--see, e.g., here and here. Trump has reinforced White resistance to any efforts toward atoning for past and present racial injustices.

In the early postwar years after the downfall of the German Nazis, Germans also resisted any talk of guilt. In his much-discussed 2014 essay, “The Case for [U. S.] Reparations,” Ta-Nehisi Coates mentioned this, but then indicated how the Germans came around to owning up to their Holocaust guilt. More recently (in 2019) Susan Neiman, in her Learning from the Germans: Race and the Memory of Evil, noted that “after the 1963 Birmingham church bombing, James Baldwin said that white Americans share collective guilt for the persecution of black Americans as Germans did for their silence during the Nazi persecution of Jews.” Later Neiman added, “after white nationalist demonstrators [in 2017] screamed ‘Blood and Soil’ in Charlottesville, does the comparison require further argument?”

Having been raised in the U. S. South, and then spent many years in Germany, Neiman details how both areas have (or have not) grappled with responsibility for their racial crimes. While presenting numerous examples of how Germans have attempted to atone--see a recent HNN article for a few examples--she writes that “America's failure to face its past is evident not only in the vicious outbursts of white supremacy that Donald Trump encouraged, but in subtler ways as well.”

Exactly what form this atonement should take is a complicated question, but first we have to own up to our responsibilities. And this is where we historians come in. Some, like Thornton, Stannard, and Gates, have stressed past racial injustices. But many others have not sufficiently done so. More of us have to write and teach U. S. history in such a way that readers and students recognize how grave U. S. injustices to Native and African Americans have been. 

In his Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, James Loewen writes “our teachers and our textbooks still leave out most of what we need to know about the American past.” This is especially true in pre-college classrooms and regarding the depth of our past racial injustices. Loewen writes, “Almost half the states have textbook adoption boards. Some of these boards function explicitly as censors, making sure that books . . . avoid topics and treatments that might offend some parents.” 

Under such circumstances, teaching history as we should, especially in pre-college courses, can be very difficult. But as I have written previously, avoiding parental disapproval or teaching patriotism should not be our goal. “Historians’ main allegiance should be to truth-telling.” And the truth includes owning up to the grievous harm we have done to Native and African Americans. Only then can we atone for our past misdeeds and create a truly “composite nation,” where we are “made better, and stronger,” not in spite of our “many elements, but because of them.” 

The Religion of Patriotism

 

 

“I pledge allegiance to the flag of the United States of America and to the republic for which it stands, one nation under God, indivisible, with liberty and justice for all.”

 

I have not said that pledge for many years, but I remembered every word from all the times I had said it as a youth, in school and at other places. The history of that wording reflects the history of American patriotism.

 

The pledge was written by Francis Julius Bellamy (1855-1931) in 1892. He had studied at the Rochester Theological Seminary to become a Baptist minister, following his father, a Baptist minister in Rome, NY. He led congregations in Little Falls, NY, and then in Boston. Bellamy believed that the rights of working people and the equal distribution of economic resources were inherent in the teachings of Jesus. In the labels of the late 19th century, he was a Christian socialist.

 

His cousin Edward Bellamy, whose father was also a Baptist minister, shared Francis Bellamy’s late 19th-century version of liberation theology. He wrote the novel Looking Backward: 2000–1887 (published in 1888), a futurist fantasy in which a Boston man falls asleep in 1887 and wakes up in 2000, when the United States has been transformed into a socialist utopia: all industry is nationalized, working hours are reduced with retirement at age 45, and all goods are distributed equally. Looking Backward, along with Harriet Beecher Stowe’s Uncle Tom’s Cabin, was among the great best-sellers of the 19th century. His sequel, entitled simply Equality (1897), promoted equality for women and imagined television, air travel, and universal vegetarianism.

 

The Bellamy cousins wanted radical change, but so did millions at that time. During the last decades of the 19th century, often labeled the Gilded Age, rapid industrialization and capitalism unfettered by regulation led to widespread poverty and unprecedented concentrations of wealth. The top 1% owned half of the nation’s property, and the bottom 44% owned 1%. American industry had the world’s highest accident rate. Socialist and labor movements grew in response.

 

Francis Bellamy preached against the evils of capitalism, offered a public education class entitled “Jesus the socialist”, and was founding vice president of the Society of Christian Socialists. He was forced out of his Boston congregation in 1891. Daniel Sharp Ford, a member of Bellamy’s congregation who published Youth’s Companion, a children’s magazine, hired him to promote Ford’s campaign to put an American flag in every school. To coincide with the 400th anniversary, in 1892, of Christopher Columbus’s voyage, celebrated in coordination with the World’s Columbian Exposition in Chicago in 1893, Bellamy wrote a flag pledge published in Youth’s Companion in September 1892.

 

Bellamy’s pledge read: “I pledge allegiance to my Flag and to the Republic for which it stands, one nation, indivisible, with liberty and justice for all.” He later wrote about his thinking when composing the pledge. The Civil War led to his reference to “indivisible”. Although he was deeply religious, he strongly believed in the separation of church and state, thus including no reference to God. He had been inspired by the French Revolutionary slogan, “liberty, equality, fraternity”, but wrote, “No, that would be too fanciful, too many thousands of years off in realization. But we as a nation do stand square on the doctrine of liberty and justice for all.” He knew that most state superintendents of education were opposed to equality for women and African Americans.

 

Bellamy’s political thinking was among the most progressive of his era, but did not escape the racism inherent in American culture. He argued that the assimilation of non-white “races” into American society would lower “our racial standard”. His and Ford’s and official America’s veneration of Columbus was itself a political statement based on white supremacy and targeted at Italian voters.

 

As a national ritual of patriotism, the pledge was yanked to the right over the course of the 20th century. In 1924, the conservative leaders of the American Legion and the Daughters of the American Revolution persuaded the National Flag Conference to change “my flag” to “the flag of the United States of America”, despite Bellamy’s opposition. A much more serious distortion came in 1954 with the addition of the words “under God”. Although that change is often attributed to the recommendation of President Dwight Eisenhower, its longer history, as described by historian Kevin Kruse in One Nation Under God: How Corporate America Invented Christian America in 2015, is much more revealing.

 

In response to Franklin Roosevelt’s New Deal, which introduced significant regulation of business and empowered labor unions, giant corporations created a public relations campaign for big capitalism using organizations like The American Liberty League. This secular political campaign was a flop. Jim Farley, chair of the Democratic National Committee under Roosevelt, said, “They ought to call it The American Cellophane League, because No. 1: It’s a DuPont product, and No. 2: You can see right through it.”

 

Corporate America then turned to conservative Christian ministers, literally employing them to link capitalism with Christianity by arguing that the New Deal is evil and capitalism is “freedom under God”. In 1951, Cecil B. DeMille organized a Fourth of July ceremony, backed by the leaders of corporate America and hosted by Jimmy Stewart, carried live over national radio. Their message was that “the American way of life” was Christian individualism expressed in unchecked capitalism.

 

This is the background for the insertion of religious messages into American patriotic rituals. The pledge now asserted that the separation of church and state was un-American. “In God We Trust” appeared on a postage stamp that same year, 1954, and on paper money in 1955. In 1956, it became our first national motto. Since Ronald Reagan began using “God bless America” to end his speeches in the 1980s, that phrase has become a staple in both parties, like the flag pin as patriotic adornment.

 

The claim in the modern Pledge of Allegiance about “liberty and justice for all” is not true today. In the Jim Crow era, when I first learned to recite it, it was an outright lie. The stirring words of the national anthem about America, “the land of the free”, were similarly false. Like the Lost Cause mythology about the Civil War and its aftermath, which was enshrined in the school textbooks I read and taught as American history, these assertions were propaganda for an American society based on white supremacy. Patriotic rituals were designed to indoctrinate young and old with the belief that the racist, sexist, antisemitic America of the 20th century was already perfect, that criticisms of racial injustice or gender discrimination were illegitimate, that America was God’s country and corporate capitalism was God’s handiwork.

 

On Flag Day in 1943, the Supreme Court declared, in West Virginia State Board of Education v. Barnette, that a law requiring schoolchildren to salute the flag and recite the Pledge was unconstitutional. That ruling still stands as settled American constitutional law. Justice Robert H. Jackson wrote then, “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion.” Yet the patriotic rituals we take for granted do exactly that, prescribing that belief in a specific kind of God is patriotic and that freedom and justice for all already exist.

 

Steve Hochstadt

Jacksonville IL

September 8, 2020

Dirty Politics, Then and Now

 

 

Donald Trump has brought to American politics and to the presidency a uniquely personal, combative, and mean-spirited style, honed in the cutthroat worlds of high-end real estate, finance, and show business, that relies on personal insults and denigration of foes. In the realm of dirty politics, virtually by default, he has been allowed to rewrite the rules of the game—to the enormous detriment of our country.              

To be sure, personal attacks, mudslinging, and name calling date to the beginning of this republic. “JEFFERSON—AND NO GOD,” nervous Federalists screamed in 1800 in a vain effort to thwart election of the allegedly infidel, pro-French Virginian. Andrew Jackson’s rivals had the temerity to besmirch his beloved wife, Rachel. The twin issues of slavery and secession made the election of 1860 especially ugly. Abraham Lincoln’s enemies depicted him as a “horrid looking wretch,” assaulted him with vicious racist attacks, and claimed that he favored miscegenation.  

Sometimes the mudslinging took on a lighter tone. In 1884, Democrat Grover Cleveland was rumored to have fathered a child out of wedlock, inspiring the ditty “Ma, ma, where’s my pa? Gone to the White House, ha, ha, ha.” At times, it has been downright silly. In 1944, Republicans charged that Franklin Roosevelt had wasted millions of taxpayer dollars by sending a U.S. Navy vessel back across the Pacific to rescue his dog Fala, who, allegedly, had been left on a remote island. Today’s politicians thus follow a time-honored tradition.

But there is also something new in the volume of the attacks, who is purveying them, and how they are purveyed. The internet gives a platform to political hacks, private citizens, extremist groups, conspiracy theorists, and even foreign governments—think Russia and Iran—to spread misinformation and outright lies with little or no test for truthfulness. This has vastly increased the volume of personal smears and made them nastier.

More important is the role of the mud-slinger-in-chief. In years past, presidents have generally turned a blind eye to the antics of their zealous underlings. They have left to others the job of responding to attacks. Surprisingly, perhaps, Jackson defended his wife’s honor with words rather than dueling pistols. In another exception, FDR turned the tables on his opponents with a brilliantly sarcastic riposte pronouncing that Fala was a Scotty and his “Scotch soul” would never condone such a waste of money. “Roosevelt’s dog got Dewey’s goat” was the verdict on this incident.   

Trump is the ringmaster of today’s supercharged political circus. The role comes naturally to him. It is a part of his persona and his modus operandi. He is a master of innuendo and hyperbole. He is not troubled by moral or ethical standards, and has no more than a passing acquaintance with the truth. He delights in churning up chaos and seeks to exploit it to his advantage. He early latched on to Twitter, and he spews out with seeming impunity so many tweets that opponents are hard pressed to know how or whether to respond. Mud-slinging seems to be the one part of the job that he truly enjoys. One must grudgingly concede that he has a certain gift for it.

He ventured into dirty politics before he ran for office by taking up the notorious “birther” theory that Barack Obama was not eligible for the office he already held. In 2016, presidential candidate Trump dreamed up belittling nicknames for his primary opponents: “Low Energy Jeb” (Bush); “Little Marco” (Rubio); “Lyin’ Ted” (Cruz). He hawked the preposterous insinuation that Cruz’s father was involved in the JFK assassination. In the campaign itself, he labeled his opponent “Crooked Hillary” for her alleged misuse of government emails and led the chant “Lock Her Up.” His slurs can be blatantly racist and sexist: “Pocahontas” for Senator Elizabeth Warren, for example, and the unspeakably crude remarks directed at presidential debate moderator Megyn Kelly after she dared challenge him.

Seeking reelection, he has picked up where he left off, shifting the low energy moniker to “Sleepy Joe” and “Slow Joe” Biden and questioning his opponent’s mental acuity. Childishly, he insulted Vice-Presidential nominee Kamala Harris by knowingly mispronouncing her first name. He launched a half-baked birther theory for her as well and piled on with a barrage of misogynist slurs: “phony Kamala,” “nasty,” “angry,” “madwoman.” The Trump family low--so far--has been the “like” (later removed) that son Eric attached to a tweet calling Harris a “whorendous pick.” Older son Don Jr.’s labeling of Biden as the “Loch Ness monster of the swamp” ranks a distant second. Junior has also hinted that Biden may be a pedophile.

It could be argued, of course, that Trump is simply being honest, that he is merely bringing into the open stuff that usually remains behind the scenes. But that is too easy. What the president of the United States says makes a difference. He demeans himself, if that matters. He demeans the office of the presidency, one of the most prestigious positions in the world. He demeans this nation in the eyes of its own citizens and the world. Actions have consequences, and Trump’s tirades can incite his followers and provoke his opponents to act similarly, revving up the already rampant divisiveness in this country, sparking hatred and violence, and even causing death, as with the killing of two people by a teenage Trump enthusiast in Kenosha, WI. “The unthinkable has become normal,” Senator Bernie Sanders has rightly noted.  

Shamelessly, the Republicans either won’t or can’t rein in their president. The only way to get rid of his toxic influence is to vote him out.   

“We Are Ourselves”: Review of For Workers’ Power: The Selected Writings of Maurice Brinton  

 

For Workers’ Power: The Selected Writings of Maurice Brinton, David Goodway, ed. (AK Press, 2020)

 

By day, Christopher Agamemnon Pallis was a distinguished British clinical neurologist, scion of a prominent Anglo-Greek family that included poets, soldiers, businesspeople, a botanist, and an authority on Tibetan Buddhism. The rest of the time, he was “Maurice Brinton,” one of the leaders of Solidarity, a breakaway British Marxist group, and the author of a stream of polemics, reporting, and historical works that helped move much of the left from hard Marxism to libertarian socialism—and, in some cases, anarchism. Chris Pallis’s pseudonymous writings, along with those of other maverick materialists, freed the left from its own past and are one of the reasons that today’s left is typified not by party cadres and excruciating sectarian quarrels but by Black Lives Matter, Antifa, anarchists, Mexico’s Zapatistas, and indigenous movements in Latin America and Asia.

For Workers’ Power: The Selected Writings of Maurice Brinton is an expanded edition of a collection that first appeared in 2005. It provides more than an introduction to this shadowy but important figure’s work; it includes just about everything important that he wrote under the Brinton byline. British scholar David Goodway, who edited the collection, provides a compact account of Pallis’s life (1923-2005) and unusual career, including the time he was outed under his original pseudonym, “Martin Grainger,” and nearly lost his position as consultant in neurology at the Hammersmith Hospital, as well as an introduction that neatly traces his intellectual development and accomplishments.

Brinton was an excellent writer, scholar, and eyewitness journalist; For Workers’ Power includes political commentary and critique, theoretical writings, his groundbreaking historical work on the Russian Revolution, and powerful accounts of key events in postwar working-class politics, including the May 1968 uprising in Paris and the failed Portuguese revolution of 1974. Two things make Brinton (as I’ll refer to him from now on) and the Solidarity circle interesting and important today: their very serious effort to understand what went wrong with Soviet Russia and the Marxist Left in the decades after the Russian Revolution, and their commitment to finding ways that revolutionary socialists could adjust to the new economic and social realities of the postwar era.

To put this in context, Solidarity and Brinton were part of a generation of postwar thinkers and activists who came of age as Marxists (Brinton joined the Communist Party at university in 1941), then gradually rejected many of the basic assumptions of Marx as flawed or outdated. Some of the biggest names on the Continent were Cornelius Castoriadis (many of whose works Brinton translated into English), Claude Lefort, and Jean-François Lyotard.

Like them, Brinton realized that the kind of proletarian politics that Marxists had practiced in the 19th and the first half of the 20th centuries wouldn’t work anymore. Ethnic and racial issues were becoming more important; the western working class had grown more prosperous, then splintered and moved to the right (in many cases) as the industrial economy evolved; and women’s and LGBTQ rights (among others) were becoming immediate issues. How, he wanted to know, can you forge a powerful workers’ movement of the left in a world where the sharp definitional lines that Marx drew were suddenly blurring?

Brinton could see these developments coming years before terms like “neoliberalism,” “globalization,” and computerization were common. “New productive tech­niques have led to greater division between the producers,” he wrote in 1961. “Thousands of jobs and profes­sions formerly requiring skill and training and offering their occupants status and satisfaction have today been stripped of their specialized nature. Not only have they been reduced to the tedium and monotonous grind of any factory job, but their operatives have been degraded to simple executors of orders, as alienated in their work as any bench hand.”

“Marxists,” he added, “would be bet­ter employed analyzing the implications of this important change in the social structure rather than waving their antiquated economic slide-rules.” The French crisis of 1968 involved university students who were far from starving and factory workers at Renault and Sud-Aviation who were among the best paid in the country. What was driving them, Brinton asked, and what would drive future uprisings? All working people, and not just those who Marx classified as “workers,” were increasingly cut off from the management of their own lives, and felt it, even if they didn’t always know how and couldn’t find a way to pull together in opposition to the new post-industrial landscape. 

“We live in neither the Petrograd of 1917 nor the Barcelona of 1936,” Brinton wrote in 1972. “We are ourselves: the product of the disintegration of tra­ditional politics, in an advanced industrial country, in the second half of the 20th century. It is to the problems and conflicts of that society that we must apply ourselves.”

By and large, the supposed leadership of the working class wasn’t much help, because it didn’t understand, or didn’t want to understand, what was changing. Time and again—in Hungary in 1956, in Belgium during a 1960 general strike, in France in 1968, in Portugal in 1974—a potentially revolutionary movement, bringing together industrial and agricultural workers, students, and other dissidents, was discouraged not by a right-wing government but by a sclerotic Communist Party that was suspicious of any formation that didn’t fit its ideological preconceptions. Social democratic and labor parties in Europe and North America play a similar role in the neoliberal present.

Even in the days of the Russian Revolution, Brinton argued, Marxists were wrong to think that working people’s lives were entirely defined by economic circumstances, and many of their mistakes sprang from their failure to understand the other aspects of human existence. In an approving article about the renegade psychoanalyst and sometime Marxist Wilhelm Reich, Brinton quotes Reich’s 1934 pamphlet, What is Class Consciousness?, which was becoming a popular New Left text. Mass consciousness, Reich wrote, is “made up of concern about food, clothing, family relationships, the possibilities of sexual satisfaction in the narrowest sense, sexual pleasure and amusements in a broader sense, such as the cinema, theatre, fairground entertainments and dancing.” It is concerned “with the difficulties of bringing up children, with furnishing the house, with the length and utilization of free time, etc.”

The conclusion, for Brinton, is that a revolutionary party can’t create revolutionary class consciousness; it has to come from the bottom up, from working people themselves and their understanding of their lives. What’s left of Marxism, then? Not much, but it’s important: workers’ control of production. Socialism isn’t about who owns the fields, the factories, and the workshops, but who controls them. States can expropriate real estate, factories, and data sets—as was done in Russia in 1917 or China in 1949—but if they put them in the charge of a professional managerial class, that’s not workers’ control.

Brinton stakes out an uncompromisingly radical position that would be familiar to many anarchists today: socialism and economic democracy can’t be achieved through reform or even a revolution within the State, like the one the Bolsheviks pulled off. They require a “total social revolution,” including workers’ management of production. A “meaningful” social revolution only comes about “when a large number of people seek a total change in the conditions of their existence.”

Just as a practical matter, simply taking over the government, and even the levers of the economy, is not enough. “No island of libertarian communism can exist in a sea of capitalist production and of capitalist consciousness,” Brinton wrote; any attempt to do so will revert to a capitalist model, sooner rather than later.

Does this make Brinton an anarchist? He lamented that Marxist revolutions all too often produced authoritarian regimes, and he rejected scientific socialism—the theory that the study of historical trends can predict their future development. “Genuine creation is the act of producing that which is not totally implicit in the previous state of affairs,” he wrote. “By its very nature it defies the dictates of predetermination. For those who see history as the unfurling of a dialectical process which leads inevitably ‘forward’ towards a particular brand of ‘socialism’ … there is no real history. There are just mechanisms.”

Those certainly sound like the words of a contemporary anarchist; “Brinton’s politics are fully anarchist,” Goodway firmly asserts. But I’m skeptical. While he rejected much of Marxism, Brinton still relied on Marxist terminology and categories of thought. The few references to major anarchist thinkers like Proudhon, Bakunin, and Kropotkin suggest that he never seriously grappled with their ideas. A better way to view Brinton—and Chris Pallis—is through the lens of the history that succeeded him. 

Today, much of the left is more concerned with building social movements than with political parties or seizing power. Anarchists, libertarian socialists, Indigenous organizers—even Black Lives Matter—would likely agree on the need to keep agency in the hands of these movements, not cede it to a clique of politicians or a revolutionary conspiracy. That’s a demanding assignment; Brinton was unsparing in defining the terms and exposing the pitfalls.

Is This the Most Important Election?

 

 

 

The conventions are now behind us, and the post-Labor Day period is often considered the launch of the full presidential campaign season. As in most election seasons, this one is being cast in apocalyptic terms by the two parties. “Do not let them take away your democracy,” former President Obama urged during his convention speech. “This is the most important election in our history,” President Trump countered.

 

Is Trump right? Is this the most important election in our history? Is democracy on the ballot, as Obama claimed? Or is that just a conceit, something we say every four years? Perhaps a look at some other crucial elections in our history will help to enlighten us.

 

The election of 1800 established the first peaceful transfer of power in the United States, one that almost didn’t happen. Without this, it is hard to see how America would have become a democracy. The election featured two men who were old friends and now political rivals: John Adams and Thomas Jefferson. They had faced each other in 1796, with Adams prevailing. Jefferson, who came in second, became vice president based on the original wording of the Constitution, under which electors voted for two people. The one with the most votes became president, while the runner-up became vice president.

 

The two had a brief flirtation with bipartisanship at the beginning of Adams’ term, but things soon fell apart over lingering differences about the direction the new nation should take, including foreign policy. Relations with revolutionary France had fallen apart over the Jay Treaty, which was seen as pro-British. Adams ended up in a quasi-war with France, and his Federalist Party passed a series of bills known as the Alien and Sedition Acts. The Sedition Act was clearly pointed at Jefferson and the Republicans, making it illegal to publish “false, scandalous, and malicious writings against the United States.” Partisanship had spun out of control by the late 1790s, and actual violence between the two sides, both in Congress and in the streets, broke out.

This was the setting as the election of 1800 unfolded. Surprisingly, Jefferson and his vice-presidential candidate, Aaron Burr, tied with 73 electoral votes, while Adams received 65 electoral votes. The election was thrown to the House, but the Federalists began to consider extraconstitutional means to deprive Jefferson of the presidency. Jefferson then warned Adams that this “would probably produce resistance by force and incalculable consequences.” Ultimately Jefferson emerged as the winner after thirty-six ballots. While Adams peacefully gave up power, he refused to attend Jefferson’s inauguration. As David McCullough has written, the “peaceful transfer of power seemed little short of a miracle…and it is regrettable that Adams was not present.”

 

The election of 1860 took place when the future of the nation was literally at stake. Abraham Lincoln, a man who had risen from humble circumstances, had become one of the leaders of the new Republican Party in the 1850s. Lincoln wanted to stop the spread of slavery into the new territories that had been obtained during the Mexican-American War. His main rival for power, Stephen Douglas, believed that each territory should vote on whether to allow slavery, that popular sovereignty was the answer. Lincoln’s response is instructive. “The doctrine of self government is right---absolutely and eternally right—but it has no just application” to the issue of slavery, which Lincoln believed was morally wrong. 

 

Lincoln, the dark horse candidate for the Republicans, emerged on the third ballot at the convention in Chicago. Douglas won the Democratic Party nomination, but it was a Pyrrhic victory. The Democrats from the South had walked out of the convention and nominated Vice-President John C. Breckinridge as their candidate. To make matters worse, a fourth candidate joined the fray as John Bell of Tennessee ran for the Constitutional Union Party. Ultimately Lincoln prevailed in the election, winning solidly in the North and West, but barely gaining any votes in the South. By mid-December, South Carolina seceded from the Union, and the Civil War began in April when southerners fired on Fort Sumter in Charleston harbor.

 

The question at the start of the war was whether the Union would survive, but ultimately the next four years of Civil War would lead to the elimination of slavery in the United States and “a new birth of freedom” for the nation, as Lincoln framed it at Gettysburg. The question of who can be an American, of who is part of the fabric of our nation, continued to evolve. During a brief period known as Reconstruction, America began to live up to its founding creed, that all are equal. Amendments were added to the Constitution which formally ended slavery, provided for birthright citizenship and equal protection under the law, and allowed Black men to vote. But Reconstruction was just a blip in our history, and the era of segregation and Jim Crow laws soon emerged and would not be removed until the Civil Rights protests of the 1960s.

 

The 1932 election occurred against the backdrop of the Great Depression. Herbert Hoover had been elected in 1928 as the “Great Engineer.” He had made a fortune as a mining engineer and then had become involved in public affairs. “The modern technical mind was at the head of government,” one admirer wrote of the president. Hoover has often been cast as a disciple of laissez faire when it came to the economy, but he in fact believed in “government stimulated voluntary cooperation,” as historian David Kennedy has written. He took many actions early in the crisis, like getting businesses to agree to maintain wages and urging state and local governments to expand their spending on public works. But Hoover was limited by his own view of voluntary action and could never bring himself to use the federal government to take direct action to fight the depression.

 

Franklin Delano Roosevelt had no such qualms. A rising politician in the early part of the 20th century, FDR had been struck down by polio in 1921. It made him a more focused and compassionate man who identified with the poor and underprivileged, as Doris Kearns Goodwin argues. Roosevelt began with some bold pronouncements, talking about “the forgotten man at the bottom of the economic pyramid” and the need for a “new deal for the American people.” Those two words, which James MacGregor Burns has written “meant little to Roosevelt and the other speech writers at the time,” soon came to define Roosevelt’s approach to the depression. FDR swept to victory, winning almost 60 percent of the popular vote and 42 of the then 48 states. The election established that the government had a responsibility for the well-being of the people of the nation. FDR would eventually adopt the Four Freedoms as part of his approach, which included the traditional support for freedom of speech and worship, but also freedom from want and fear.

 

The 2020 election features each of the elements that made these prior elections so important. Democracy and the peaceful transfer of power are clearly on the line. Donald Trump has already called into question the fairness of the election, especially over mail-in voting, and has begun once again to claim that he will lose the election only if it is rigged. One can imagine Trump refusing to leave office if he loses a close election to Joe Biden.

 

The unity of our nation is also at stake. Trump “is polarization personified,” write political scientists Suzanne Mettler and Robert C. Lieberman, and he has “repeatedly stoked racial antagonism and nativism.” Trump has even encouraged violence on the part of his supporters in response to Black Lives Matter protests. “The big backlash going on in Portland cannot be unexpected,” Trump tweeted regarding the violence perpetrated by his supporters.

 

Prior to COVID-19, Trump’s economic and tax policies favored the already wealthy and contributed to ever-worsening income inequality. To their credit, the president and his party supported an aggressive initial stimulus package to assist businesses and individuals. The extent to which the Republican Party will continue to support aggressive government action in response to the economic damage caused by the coronavirus, in order to aid the middle and working classes rather than the wealthy, is an open question.

 

President Donald Trump may indeed be right: this is the most important election in our history. Just not for the reasons he believes.

The Roundup Top Ten for September 18, 2020

The Deep Roots of Disdain for Black Political Leaders

by Carole Emberton

From Thomas Jefferson's writings, through the proslavery argument of the middle of the 19th century, the overthrow of Reconstruction, and the Jim Crow era, American politics has been influenced by the racist idea that Black people were incapable of exercising leadership in a democracy.

 

Who Owns the Evidence of Slavery’s Violence?

by Thomas A. Foster

A lawsuit demands that Harvard University give custody of famous images of enslaved men and women--taken without consent by a biologist seeking to demonstrate white supremacy--to the subjects' descendants. A Howard University historian agrees, putting the images in the context of other intimate violations endured by enslaved persons.

 

 

The Long History Behind Donald Trump’s Outreach To LGBTQ Voters

by Neil J. Young

Gay Republicans emerged as a political force in response to both radical leadership in the gay liberation movement and the rise of evangelicals as a force in the Republican party. Today they may have to decide which fight is more important. 

 

 

Lampooning Political Women

by Allison K. Lange

Backlash against women's emancipation in the nineteenth century took to the most potent social media of the day--political cartoons--to decry feminism as a threat to civilization itself. 

 

 

The Dark Side of Campus Efforts to Stop COVID-19

by Grace Watkins

While colleges have a legitimate interest in suppressing virus transmission on campus, it is dangerous to expand the surveillance powers of campus police. 

 

 

The Forgotten History of the Radical ‘Elders of the Tribe’

by Susan J. Douglas

The Gray Panthers fought for the civil rights, social services and respect denied to older Americans. But they did so by challenging inequality in ways that sought alliances instead of antagonism between young and old. 

 

 

Why Do Women Change Their Stories Of Sexual Assault? Holocaust Testimonies Provide Clues

by Allison Sarah Reeves Somogyi

Despite the horrific frequency of sexual abuse of women during the Holocaust and during World War II, stigmas attached to victims encouraged survivors to self-censor in their testimonies. The historical record may help to understand the behavior of victims today.

 

 

American Democracy Is in the Mail

by Daniel Carpenter

The Postal Service has been a circuit of information vital to democracy, a non-exclusionary employer, and a service connecting all communities in the nation. It's also been a tool of conquest and voracious capitalism. For good and ill, the history of the USPS is the history of America. 

 

 

Why ‘Glory’ Still Resonates More Than Three Decades Later

by Kevin M. Levin

The film based on the story of the 54th Massachusetts Volunteer Infantry is streaming on Netflix. Kevin Levin suggests that despite the narrative license taken, the film puts the story of Black freedom fighters and the question of emancipation at the center of the story of the Civil War. 

 

 

Where Kamala Harris’ Political Imagination Was Formed

by Tessa Rissacher and Scott Saul

A Black cultural center in Berkeley introduced Kamala Harris to activism and the connections between culture and politics. 

 

The Pentagon is Missing the Big Picture on "Stars and Stripes"

Editorial Room of Stars and Stripes, WWI

 

 

 

In February, the Pentagon proposed slashing funding for the famed soldiers’ newspaper Stars and Stripes, a story that roared back into the news in September after its publisher reported he had been ordered to halt publication by the end of the month. By the morning of September 4th, President Trump tweeted out his insistence that the paper would not be closing, though the Senate has not yet voted on a defense appropriations bill to resolve this issue. While veterans have raised important concerns about the elimination of an important journalistic voice independent of military officials’ control, administrators framed the paper’s closure as a budgetary issue to save $15.5 million - a seeming pittance in a department collecting over $705 billion in federal funding. If budgetary issues are truly the concern, I’d propose considering the paper’s history, which demonstrates how the paper actually can and does function as an important part of the public-private partnerships driving the country’s economy in connection with its journalistic mission.

 

In recent days, many articles have mentioned the paper’s roots during the Civil War, but few have described crucial developments during the First World War, when Guy T. Viskniskki, a Spanish-American War veteran and New York area journalist for the Wheeler Syndicate, argued a new paper describing the wartime experience through the eyes of rank-and-file servicemen could raise morale without becoming a form of propaganda. While training at Camp Lee outside of Richmond, Viskniskki established a newspaper for the community, one of many camp newspapers funded by the YMCA’s War Work Council. However, when he reached Europe in November 1917, Viskniskki dreamed of establishing a paper free from the oversight of his commanding officers and believed his knowledge of censorship regulations allowed him to effectively follow the rules placed on war correspondents. While Viskniskki was proud to highlight his paper’s financial successes and independence, the Intelligence Section provided the first 25,000 francs he needed to begin publication in January 1918. Viskniskki worked hard to push the paper beyond the narrow, divisional or company focus of other soldiers’ papers, rejecting calls to spin-off specialty publications for the Services of Supply because he believed a mass paper created a sense of unity spanning front and rear lines. He found his mass audience by providing troops with news of the war, tales from the home front including sports coverage, and comics, all written in a relatable, informal style. At its peak circulation near the armistice in November 1918, staffers printed 526,000 copies per issue and distributed them to readers on both sides of the Atlantic. When the paper closed shop in June 1919, the paper had even turned a profit (though Viskniskki was disappointed it was turned over to the US Treasury rather than distributed to French war orphans).

 

Stars and Stripes staff adjust linotype machines, WWI

 

 

Viskniskki financed his paper primarily through advertisements, securing free assistance from A.W. Erickson in New York. The paper charged prospective buyers the very low rate of one dollar per inch, raising it to six dollars when the cost of newsprint grew more cumbersome after circulation rose over 400,000. By the third issue, Viskniskki packed the paper with ads for products including Boston Garters, Colgate dental cream, Lowney’s chocolates, Wrigley’s chewing gum, 3-In-One oil, Mennen shaving cream, Fatima cigarettes, and Auto Strop razors. These advertisements were part of a broader modernization of military culture that introduced servicemen to new consumer products. For example, military regulations required troops to maintain clean-shaven faces to create a firm seal for their gas masks, thus requiring them to carry their own shaving kits and creating a market for the supplies advertised in their paper; a 1919 survey by the J. Walter Thompson advertising firm found that 30 percent of safety razor users learned of the product in the army. Similarly, army officers estimated that 50 percent of conscripts had not brushed their teeth on a regular basis, leading them to order over 4 million toothbrushes for troops who purchased toothpaste at local canteens or post exchanges.

 

Staffers for The Stars and Stripes also developed tools to effectively distribute their paper amid harsh wartime conditions. The paper secured subscribers by collecting a large cash payment up front, making it easy for delivery agents to simply drop off the paper rather than manage individual subscribers’ accounts. Viskniskki relied on staffers from Hachette, a leading French publishing house, to carry papers from train stations to troops even when they were under fire. The paper’s staff also acquired ninety-one cars that allowed field agents to deliver the paper to the most remote regions where their readers served. Such decisions to develop an internal distribution service predated similar efforts by leading retailers by several years, as major mail order retailer Sears only developed its own trucking service in the early 1920s. The Stars and Stripes’ distribution network was so effective that staffers approached the Red Cross and YMCA to offer help with those organizations’ pre-USO era responsibilities of delivering treats and entertainments to troops across service areas.

 

Viskniskki was far from the paper’s only reporter, and its large and successful staff both reinforced the paper’s reputation for independence and established contacts that would further its commercial impact long after the war. While Viskniskki conducted the reporting for the first few issues primarily by swiping official cables from the censorship office, he quickly expanded his team by identifying experienced journalists who had difficulty fitting in with their units because of their writing habits. These included Harold Ross, whose commanding officer in a railway engineering unit forwarded Viskniskki many articles Ross had drafted alongside a plea to remove the man he considered the unit’s headache; New York Times writer Alexander Woollcott, who Viskniskki knew wanted to cover the front lines; New York Tribune sports reporter Grantland Rice, who transferred from artillery work mere issues before Viskniskki decided to cancel the sports page amid Americans’ increasing combat responsibilities; fellow Tribune scribe Franklin P. Adams, who opined on military life in his “The Listening Post” column; and Philip Von Blon, the enterprising reporter who developed sources within the SOS and broke the Harts uniform story. These writers formed a close social circle during and after the war, particularly once Ross founded The New Yorker and recruited Woollcott as a writer, palling around with Adams in their famed Algonquin Round Table meetings, while Von Blon returned home and took a job as managing editor of American Legion Monthly. Viskniskki himself declined to capitalize on the Stars and Stripes brand he created, turning down a $300,000 offer to establish a paper in the United States. However, other staffers risked Viskniskki’s ire and marketed their connections to the paper when founding veterans’ publications such as The Home Sector.

 

While short-lived, the World War One-era Stars and Stripes demonstrated the features that would become hallmarks of the paper when it resumed publication in 1942. The paper’s editors relied on funding from the government and its affiliates to open its doors, and it contributed to the country’s commercial development by inspiring new distribution ideas and familiarizing troops with new products for future consumption. Such factors clearly demonstrate why Congressional leaders are right to offer continued support for a paper that has shaped the country’s economy and culture for over a century.

Native Actors Outside the Frame

Harry Smith, under the name of Jay Silverheels, was a Mohawk actor who famously portrayed Tonto on The Lone Ranger. 

 

 

 

Remember Tonto and the Lone Ranger? You might recognize my book cover with Harry Smith, aka Jay Silverheels, ready to grab his gun. I am a citizen of Cherokee Nation and an Assistant Professor of History and Native American and Indigenous Studies at Indiana University. My book, Picturing Indians: Native Americans in Film, 1941-1960, has just been released.

 

In this book, I draw attention outside the frame of the films we watch from this era and remind readers that the movie sets were workplaces. Although I was interested in all aspects of work on the sets, including makeup artists, costumers, and the food prep people, just to name a few, I look in particular at those playing Native American characters, especially Native people playing Native characters. This comprises both actors and extras. With actors, I am invoking union guidelines around speaking parts and time on screen, and Native actors never took the lead role. This meant that supporting or minor parts were the highest level Native workers achieved at the time.

 

Some of these men included Harry Smith, or Jay Silverheels, who graces the cover of Picturing Indians. Harry was a Mohawk man from the Six Nations Reserve in Canada, as is Gary Farmer, the actor who appears in many films, including Powwow Highway and Dead Man. Smith had over 100 film credits, with a commanding film presence even in the limiting roles he was offered. In spite of working non-stop for decades, generating tremendous wealth for the many studios where he worked, Harry struggled financially his entire life. In LA he rented a one-bedroom apartment near the corner of Sunset and Bronson. He suffered the effects of medical malpractice and dragged himself through a legal battle until the day he died, leaving behind massive legal debts.

 

Like Harry Smith, Daniel Simmons, a member of the Yakama Nation, used Chief Yowlachie as a name that would define and present him as a Native American to casting agents and the American public. He too has over 100 film credits, but as far as I know never owned a home in Los Angeles. In fact, he rented a granny flat in East LA where he received his meager checks from the studio. 

 

There are several other Native men who worked regularly in supporting roles, and I go over this in the book, but let’s move on to those who worked as extras. Again, I use union terminology, emphasizing that extras are people working in front of the camera with no lines. There are hundreds, perhaps thousands of Native people who appeared in movies of the 1940s and 1950s according to the studios’ archives. Sometimes I know their names, such as Plain Feather, a Crow man who worked as an extra in Warpath, and Donald Deer Nose, also Crow, who worked in Warpath as well. Often extras went unidentified in photographs taken by the studio. Only from archival materials would I know, for instance, that the woman in a studio photograph is Diné or Navajo. Perhaps now that the book is out, I will be able to identify her and stop referring to her and others as anonymous extras.

 

To be clear, Picturing Indians is a behind-the-scenes look at movies of the 1940s and 1950s. Initially I believed the movies and the film sets ran on parallel tracks, separate and uninformed by each other. Yet the more I looked at the archival materials alongside the films themselves, the more I saw just how oppositional they are. What I mean by that is the films recreate American history in a particular way, usually with complicated plot devices for white characters, extremely simplistic ones for Native characters, and the constant of Indian violence and white innocence. Yet the materials from the sets where Native people worked tell something very different.

 

For instance, an image from the set of Drum Beat shows two Apache women being photographed as they take a photograph of Charles Bronson, in Indian costume, leaning back seductively in a chair; it seems to be saying something about Native women finding Charles Bronson attractive. Yet this film is about hundreds of white soldiers and volunteers hunting down and surrounding Modocs and then executing their leader.

 

Another example comes from an image of an Apache male extra taking a photo of a beaming William Holden on the 1953 set of Escape from Fort Bravo. A studio photographer photographed this moment, staged or spontaneous, which seems to indicate pleasure and camaraderie, yet this film made by MGM tells a story about deeply divided northern and southern whites during the Civil War, who come together when faced with violence from Apaches.

 

The last example I will give of this disjuncture, and perhaps the most stunning, comes from the set of Far Horizons in 1955. We see tribal chairperson Herman St. Clair with a number of Eastern Shoshone men offering Donna Reed a fishing permit, invoking their sovereign fishing rights to give her the right to fish on their waters. They have maintained these rights by way of the Fort Bridger Treaty of 1868. Yet St. Clair took this action, perhaps nothing more than a stunt, on the set of a film that has nothing to do with tribal sovereignty. Instead the film tells the story of settler colonialism with Lewis and Clark as heroes.

 

There are so many moments I wish more people knew about, especially those who know and love these movies. But Picturing Indians maintains a steady analysis of the exploitation of the Diné and their land by the movie industry. Monument Valley is Navajo land, yet it came to embody the West and the filmic West through the economic exploitation of the Diné. I document this quite precisely in the book in terms of how they were paid by John Ford and other filmmakers of the era. To better understand the Diné today, I would strongly recommend several movies, such as The Return of Navajo Boy, Basketball or Nothing, and Drunktown’s Finest.

 

But more than anything, I want my readers to see that Harry Smith and other Native American actors gave Americans tremendous entertainment value with very little in return. Warner Bros. owns nearly all of the images in the film archives. Yet they gave permission for me to reproduce them, then revoked that permission at the last minute as we went to press. My publisher pulled the cover image of Harry Smith from the public domain. Smith’s family earns nothing from this and has no rights to the image. Yet the studios possess the rights and refuse to allow anyone to reproduce the vast numbers of images they hold of Native people who worked in film. Harry Smith made the studios a small fortune, but died with just about nothing.

The Garbage Troop: Segregation, Primatology, and Republican Rhetoric

Postcard of Charleston High School, 1910. Postcard Collection (UALR.PH.0105), UA Little Rock Center for Arkansas History and Culture.

 

 

 

If you watched the Republican National Convention at all, you were probably struck by the expressions of fear that permeated the proceedings—namely, the fear that any failure to re-elect Donald Trump as president of the United States would result in the collapse of the American experiment, if not the dissolution of civilization itself. Words to that effect were spoken many times over; Trump himself, accepting the nomination, said, “This election will decide whether we SAVE the American Dream, or whether we allow a socialist agenda to DEMOLISH our cherished destiny.” (The capitalization is original to the transcript.) But can we take their expressions of concern at face value, or does the Republican Party’s rhetoric conceal another fear entirely?

 

If we roll back several decades, we find that those who opposed school desegregation similarly warned of the collapse of civilization if black and white students were allowed (or “forced”) to study in the same buildings. But reality proved them wrong. The very first school district in the former Confederacy to desegregate following the 1954 Brown v. Board decision was that of Charleston, Arkansas, although it did so rather secretly. This small school district in the western part of the state had been paying to bus black students to Fort Smith for their education, and so the decision to desegregate was as much economic as it was moral. Local leaders did not seek to attract national attention to the fact that eleven African American students were admitted on the first day of classes on August 23, 1954, and desegregation went off with very little opposition.

 

The first reported school desegregation in the former Confederacy occurred in Fayetteville in the northwestern corner of Arkansas. Seven black students entered Fayetteville High School in September 1954. The only opposition was one lone woman with a placard, despite the district having publicly announced its intentions. And although black students did report instances of harassment and the use of racial slurs during the school year, a certain camaraderie seemed to have formed between black and white students. Many local schools refused to play the integrated Fayetteville football team, and when Coach Harry Vandergriff gave his players the option of benching black players or forfeiting the games, they chose the latter.

 

By the following year, however, segregationists had apparently had enough of the success of school desegregation efforts, drawing a line in the sand at Hoxie, a small town in northeastern Arkansas. As at Charleston, school district officials pursued desegregation to save the cost of busing black students to the city of Jonesboro, but also because such an act was, in the words of Superintendent Kunkel Edward Vance, “right in the sight of God.” And so on July 11, 1955, all school facilities at the local white school were opened up to black children. Everything seemed to be going okay for the next couple of weeks, but later that month, Life magazine published a pictorial essay highlighting the success of desegregation and showing white and black children playing and studying together. Soon, outsiders began flooding into the town, raising the threat of violence. Although they were not successful in rolling back desegregation at Hoxie, they developed the techniques of harassment and intimidation that would come into play two years later in the much-publicized desegregation of Little Rock Central High School, when the federal government was forced to federalize the National Guard and send in federal troops to restore order after nine black students attempted to enter the school. Finally, segregationists could point at the violence they created and assert, with much more confidence, that letting black and white students study together would disrupt civilization as we know it.

 

Segregationists insisted that the difference between black and white was unalterable and would necessarily produce violent conflict if the proper hierarchy were not maintained. Interestingly, as the battles over school desegregation were raging in America, the study of primatology was coming into its own, giving us a glimpse into the deeper realities of human nature. The first overview of the subject was Irven DeVore’s 1965 Primate Behavior: Field Studies of Monkeys and Apes. In this book, DeVore insisted that aggression in savannah baboons “is an integral part of the monkeys’ personalities, so deeply rooted that it makes them potential aggressors in every situation.” But later studies called this “fact” into question. In the 1980s, Robert M. Sapolsky was studying a particular baboon troop when a neighboring troop began foraging at the garbage pit of a nearby tourist lodge, which provided a wealth of high-energy foods, such as discarded beef and chicken and sweets. Soon, certain members of Sapolsky’s troop began going over to this garbage pit every morning to fight over these new resources. As Sapolsky writes in Behave: The Biology of Humans at Our Best and Worst, these baboons typically “were male, big, and aggressive. And morning is when baboons do much of their socializing—sitting in contact, grooming, playing—so going for garbage meant forgoing the socializing. The males who went each morning were the most aggressive, least affiliative members of the group.”

 

However, some of the meat over which these baboons were fighting came from tubercular cows, and soon TB wiped out not only most of the troop that had found the garbage pit, but also those males from Sapolsky’s troop who were going there. He returned to his troop some years later and discovered that the culture had changed radically. Not only were levels of aggression lower across the board, but “there was minimal displacement of aggression onto innocent bystanders—when number three lost a fight, he’d rarely terrorize number ten or a female.” And the social culture was being transmitted. Adolescent males typically leave their own troop, and those who entered this one were greeted with affiliative overtures by the less-stressed females, such as grooming or sexual solicitation, much earlier than in other troops, and soon assimilated to this new culture themselves.

 

What does all of this talk of school desegregation and primatology have to do with the rhetoric coming out of the RNC? Simply this—that our culture can change in egalitarian ways without threatening our survival. The previously unthinkable can become simply an everyday reality for us, and quickly, too. Black and white children can attend school together without conflict, even in some godforsaken corner of a state not known for its progressive worldview. Those appealing to the power of tradition must create conflict in order to prove their point. Natural hierarchies are anything but; they are not written in our DNA. The study of more “primitive” species illustrates that fact.

 

In other words, Donald Trump and his Republican Party are not afraid that Joe Biden’s election will destroy America. They’re afraid that it won’t. They’re afraid that Joe Biden’s election won’t herald the end of our American experiment in a widening gyre of violence and chaos. They’re afraid that a turn toward egalitarian thinking won’t unravel the survivability of our troop and thus herald our doom. They’re afraid that equality might prove a strength rather than a weakness. And so between now and November, they will create as much chaos as possible in order to prove themselves right. Just as their forebears did at Hoxie sixty-five years ago when they saw black and white children playing together, as happy as they could be.

Twenty-One Days Later: Ventura County's Participation in the Chicano Moratorium of 1970

 

 

 

Last month, hundreds of people marked with moxie the 50th anniversary of the August 29th, 1970 Chicano Moratorium in East Los Angeles. To protest our nation’s war in Vietnam, racism, and police brutality, starting at 9 am that day, nearly 30,000 ethnic Mexicans and their allies from all over the Southwest took to the streets in a 3-mile peace march through the boulevards of Atlantic and Whittier.

 

Among many slogans, they chanted and held signs expressing “¡Raza Si! ¡Guerra No!,” “Our Fight Is Not in Vietnam,” “Chicano Power,” and “Stop Chicano Genocide!”

 

In the spirit of the Black Lives Matter movement since George Floyd’s killing by now-former Minneapolis police officer Derek Chauvin this May, the protests of Chicanos today concentrate on law enforcement’s abuse of power.

 

In 1970 Chicanos protested how US casualties in Vietnam disproportionately consisted of young men from their communities in the Southwest. Dr. Ralph Guzmán documented that from 1961 to 1967 their brothers and friends made up 19.4 percent of those killed in action, when this group was only 10 to 12 percent of the national population.

 

Now, they protest the killings of Latina and Latino soldiers such as Army Private First Class Vanessa Guillen, stationed at Fort Hood, and Specialist Enrique Roman-Martinez of the 82nd Airborne Division at Fort Bragg. Roman-Martinez’s sister and mother delivered impassioned speeches at Atlantic Park in East L.A. before the commencement of the 50th-anniversary march this past August 29th. They criticized the Army for its less-than-transparent investigation and decried only having received Enrique Roman-Martinez’s partial remains.

 

In 1970, the Brown Berets of Los Angeles, along with UCLA student Rosalio Muñoz and others, formed the National Chicano Moratorium Committee and organized many demonstrations in Southern California. But the August 29th march and rally at the then-named Laguna Park was the granddaddy of them all.

 

Then tragedy struck. With the pretext of responding to a robbery at a nearby liquor store, Los Angeles sheriff’s deputies and police stormed the peaceful assembly with batons and teargas. The law enforcement-instigated riot resulted in three deaths and hundreds arrested and abused. Ruben Salazar, a former Los Angeles Times reporter turned KMEX-TV news director and considered the voice of the Chicano community, was among the slain; he had stopped at the Silver Dollar Bar on Whittier Blvd, far from the melee, to decompress from law enforcement’s merciless assault.

 

After several contradictory official explanations, it was found that Los Angeles County sheriff’s deputy Thomas Wilson killed Salazar with a 10-inch teargas projectile designed to pierce walls. Many in the community contended then, and believe now, that the powers that be in Los Angeles conspired to assassinate Salazar due to his refusal to temper his reportage of law enforcement misconduct. 

 

In adjacent Ventura County, the Chicano community also viewed Salazar’s homicide as the system’s culling of its leadership. In a September 3, 1970 letter to the Ventura County Star-Free Press titled, “Siesta Is Over!” Arthur Gómez of Santa Paula addressed Governor Ronald Reagan and local elected officials when he stated, “Yes, the siesta is over! The siesta was broken by the murder of two innocent Mexican nationals in a Los Angeles hotel and the 10-inch projectile that shattered Ruben Salazar’s head… One day we shall not have our leaders murdered. One day we shall not have our children made ashamed of being part Mexican. One day we shall have justice and dignity.”

 

Intrepidly, Chicano men and women conducted a peace march in Oxnard on September 19th, twenty-one days after law enforcement’s rampage in East Los Angeles. Approximately 1,000 marchers from all walks of life, different communities, and a span of generations again took to the streets.

 

In their planning that started weeks, if not months, in advance of the August 29th tragedy, the organizers declared the community’s goal of liberation as well as the end of Chicano genocide in Vietnam and police brutality.

 

To avoid an August 29th-like catastrophe, the Brown Berets of Oxnard, the Ventura County chapter of the Mexican American Political Association, and MEChA representatives from local colleges and high schools met in advance with law enforcement.

 

In the week leading up to the “La Raza” (the People’s) peace march, men and women of the Brown Berets leafleted neighborhoods to promote the demonstration and, to further ensure amity at the event, disseminated a code of conduct to the public, the Oxnard Police Department, and media.

 

On the day of the demonstration, people paraded boldly through the streets of the La Colonia barrio from La Virgin de Guadalupe Church and the downtown district with a coffin that symbolized 8,000 ethnic Mexican servicemen killed in Vietnam. The procession ended at the city’s Community Center. There, as national chairman for the Chicano Moratorium Committee, Muñoz characterized the Vietnam War as the “systematic murder” of Chicanos.

 

The La Raza Moratorium Committee’s communication with law enforcement and the press garnered the community’s goodwill and contributed to the event’s success. Indeed, the Oxnard Press-Courier commended the organizers in an editorial as it acknowledged the disproportionate ethnic Mexican casualty rate in the Vietnam War. It also complimented law enforcement in general, in a backhanded manner, for its “diplomacy and restraint.”

 

Fifty years later, Chicanos are proud of being ethnic Mexicans. But with the controversial homicides of Latino soldiers and civilians such as PFC Guillen and Specialist Roman-Martinez on the one hand and Andres Guardado, shot in the back by an LA County sheriff’s deputy, on the other, we, Chicanas and Chicanos, still await justice.

The "Noble Dead": Warren Harding and the Resting Places of the WWI Fallen

American cemetery, Aisne-Marne. Photo by author.

 

 

With controversy swirling around President Trump’s decision in 2018 not to visit Aisne-Marne, a World War I cemetery for American soldiers located some fifty miles outside Paris, one wonders why some American war dead from the Great War were left behind in France, and why some were brought home. A president from the time provides the answer, one who referred to fallen Americans not as “losers” but as “the noble dead.”

Warren Harding, our nation’s twenty-ninth president, not only received the first flag-draped wooden coffins to be returned from Europe after the war, he was also the Chief Executive who dedicated the Tomb of the Unknown Soldier at Arlington National Cemetery.

On May 23, 1921, two and one-half years after the end of the fighting in Europe, 5,112 coffins, containing bodies of soldiers, sailors, marines and nurses, newly returned from France, were carefully set out in a shipyard at Army Pier 4 in Hoboken, New Jersey. The rows of coffins stretched for city blocks. President Harding, who had just taken office in March, arrived via the presidential yacht, the USS Mayflower. While onboard, he composed a short address that reflected the solemnity and the expected shock of seeing so many caskets arrayed in one place.

“There grows upon me,” he said from a bunted platform erected in front of a single, representative coffin, “the realization of the unusual character of this occasion.” Because this simple ceremony had been hastily arranged, President Harding and First Lady Florence Harding appeared in front of what one correspondent described as “a pitiful little handful of soldier relatives while a guard of honor, grim in khaki and trench helmets, stood frozen at attention over their comrades.”

Harding recognized that “our Republic has been at war before, it has asked and received the supreme sacrifices of its sons and daughters, and faith in America has been justified.” But this display was different, unparalleled. “We never before sent so many to battle under the flag in foreign lands,” he said. “Never before was there the spectacle of thousands of dead returned to find their eternal resting place in the beloved homeland.”

The decision to bring remains home from foreign soil was a complicated, extended and negotiated affair. America had no established precedent to consult. When it became clear that there would be a staggering death toll during the Civil War, President Abraham Lincoln signed a law authorizing the creation of national cemeteries (which would include a cemetery at Gettysburg). For years after the war, the remains of Northern soldiers hastily buried near battlefields were exhumed and reburied in venerated cemeteries. And in the handful of small wars where Americans died overseas, sometimes remains were recovered, sometimes not.

 

Makeshift gravesite, France c.1918

 

World War I created a dual challenge. Nearly 75,000 Americans were buried in temporary graves in France and the cost to recover that many bodies was daunting. Moreover, leaders in France did not relish the idea of endless trains bearing disinterred remains of American dead rumbling through the countryside to ports for shipment back to the United States. France had its hands full with the staggering work to reclaim dangerous and devastated land, not to mention millions of corpses, from a war that had been waged mostly on its soil. So France banned the repatriation of any bodies from January 1919 until January 1922, though it relented before the ban ran its course in response to American pressure. Hence, it fell to Warren Harding, elected 100 years ago in November 1920, to meet the first returned.

In the United States many families demanded a return of their loved one’s remains, worried that they would be forgotten in unmarked or untended graves. The government decided to let families decide whether to seek the return of remains or to leave them where they had fallen, either in existing graves or in nearby official American cemeteries established in France. Ballots were sent to over 80,000 families, who discussed and debated the decision. In the end, about 40,000 bodies were returned and 30,000 were left, buried almost exclusively in American cemeteries.

 

The names of dead and missing are engraved on a chapel wall near Belleau Wood. Photo by author.

 

Enter Aisne-Marne. This American cemetery is the final resting place for nearly 2,300 Americans. It was built at the base of a hill on which stands Belleau Wood, the site of one of the most monumental battles of the war. This is where the Marines helped stop the German advance towards Paris in the summer of 1918. The Americans arrived just in time and the cost in human lives was severe. The Marine Corps venerates Belleau Wood as sacred ground, no doubt the reason that John Kelly, then chief of staff to President Trump, made the trip to Aisne-Marne even when the president bailed, allegedly because of weather.

Kelly was a retired four-star general of the United States Marine Corps. His son Robert, also a Marine, was killed in action in Afghanistan in 2010. John Kelly knew the importance of visiting Aisne-Marne on the one-hundredth anniversary of America’s pivotal engagement in the war; he understood the duty to the families of those buried overseas in American cemeteries to remember and honor “the noble dead.”

Six months after Harding welcomed home the remains of the first 5,000 returned from Europe, he dedicated the Tomb of the Unknown Soldier at Arlington National Cemetery. On November 11, 1921, the third anniversary of the Armistice, Harding said it mattered little whether the unknown was “a native or adopted son.” The sacrifice was the same. “We do not know the eminence of his birth,” he added, “but we do know the glory of his death.”

 

Warren Harding and William Howard Taft observe the Unknown Soldier in state, U.S. Capitol. 

 

President Harding expressed the gratitude of the nation for the ultimate sacrifice of the warriors, what Lincoln called at Gettysburg the “last full measure of devotion.” But he challenged his fellow citizens to do more than to pay tribute to the fallen hero in the unknown tomb. He asked that every American "unite to make the Republic worthy of his death for flag and country.”

Just as Americans visit and revere the graves of those in Arlington and other national cemeteries in the United States, it is important to remember that the nation made a solemn compact with the families of those who were lost in the First World War. The government promised that the sons or daughters of those gold-star families would be buried in American cemeteries, cared for and tended to by Americans, so that no one would forget them or their sacrifice and so that Americans, when overseas, could locate and venerate their honored dead.

Richard Haass on the Need for Historically Informed Policy in a Changing World

 

Richard Haass is the President of the Council on Foreign Relations. He served as senior Middle East advisor to President George H.W. Bush and as Director of the Policy Planning Staff under Secretary of State Colin Powell and is the author of fifteen books, most recently  The World: A Brief Introduction. He discussed the work and the importance of historical understanding with HNN Contributing Editor David O'Connor. 

 

 

David O'Connor: Can you share the story of how a fishing trip sparked your interest in writing this book on history and international relations?

 

Richard Haass: The idea for writing The World: A Brief Introduction was sparked on a summer's day fishing with a friend and his nephew in Nantucket. The young man was about to enter his senior year at Stanford and would graduate with a degree in computer science. As we began talking, it became clear that he had been exposed to little history or politics or economics and would leave the campus with almost no understanding of why the world mattered and how it worked. When I got back to my office, I began looking into this issue and realized that a young American could graduate from nearly any high school or college in the country without taking so much as an introductory course on U.S. history, international relations, international economics, or globalization. To be sure, there are distribution requirements at nearly every college or university, but a student can choose to narrowly focus on one period of history or one region of the world without ever taking a survey course that provides a framework for putting it all together. I decided to write The World to provide that foundation for students or even people who had graduated from college decades ago but need a refresher. A democracy requires that its citizens be informed, and it was evident far too many citizens in the United States and other countries could not be described as globally literate.

Are you an advocate for universities and colleges to mandate a core curriculum?  If so, what courses would you want to see included in it?  

I am a firm believer in a core curriculum. Students (and their parents) should know before choosing to attend a particular institution just what it is they will be sure to learn. Would-be employers should know what a degree from a particular institution stands for. I believe a core curriculum should at a minimum include courses devoted to promoting critical skills (analysis, writing, speaking, teamwork, digital) and knowledge (world history, civics, global literacy). Such a core would still allow every student to have ample opportunity to specialize.

 

How have you and your colleagues at the Council on Foreign Relations encouraged those who are not in college to learn about world history and current international events?  Which efforts do you think have been the most successful?

We continue to publish Foreign Affairs, which releases a print edition six times per year and remains the magazine of record in the field. The magazine contains articles that present fresh takes and new arguments on international issues - the magazine published the famous "X" article by George Kennan that introduced Americans to the concept of containment, for example. Its website, ForeignAffairs.com, publishes shorter pieces every day more closely tied to the news cycle. On CFR.org we publish a host of backgrounders that aim to provide what a person needs to know to get up to speed on issues ranging from global efforts to find a vaccine for COVID-19 and U.S. policy toward the Israeli-Palestinian conflict to the role of the IMF and the U.S. opioid epidemic. We have also produced a series of award-winning InfoGuides on China's maritime disputes, modern slavery, and refugees, among others. We have a series of podcasts, including The President's Inbox, which each week focuses on a foreign policy challenge facing the United States, and another titled Why It Matters, which takes issues and as its title suggests explains to listeners why they should care about them. 

Just as important, a few years ago I created an entirely new education department at the Council. Its mission is explicitly to teach Americans how the world works. Its flagship initiative, World101, explains globalization, including climate change, migration, cyberspace, proliferation, terrorism, global health, trade, and monetary policy, regions of the world, the ideas basic to understanding how the world operates, and, as of early 2021, history. Each topic includes videos, infographics, interactives, timelines, and written materials. It also includes teaching resources for teachers who want to use the lessons in their classrooms. We have also created Model Diplomacy, which helps students learn about foreign policy and how it is made by providing free National Security Council and UN Security Council simulations.

 

You begin this book with an explanation of the Treaty of Westphalia, one that many people don’t know very well. Why did you start your study in 1648? How have the concepts and practices established in the Westphalian system endured?  

I started with the Treaty of Westphalia because the principles enshrined in those arrangements created the modern international system. The treaty (in actuality a series of treaties) established the principle of sovereignty that increased respect for borders along with the notion that rival powers ought not to interfere in the internal affairs of others. These agreements helped bring about a period of relative stability, ending the bloody Thirty Years War that was waged over questions of which religion could be practiced within a territory's borders. More important for our purposes, they put forward the principle of sovereignty that remains largely unchanged to this day. When you hear the Chinese government declare that foreign powers have no right to criticize what happens inside of China's borders, they are harkening back to Westphalia. At the same time, as I argued in my book A World in Disarray, this conception of sovereignty is inadequate for dealing with global challenges. For issues like climate change, global health, terrorism, and migration, what happens inside a country's borders has huge ramifications for other countries. For instance, Brazil's decision to open up the Amazon for commercial purposes and deplete this natural resource has negative implications for the world's ability to combat climate change. China's failure to control the outbreak of COVID-19 has caused massive suffering around the world. I introduced the concept of sovereign obligation to capture the idea that governments have certain responsibilities to their citizens and the world, and if they do not meet those obligations the world should act. The challenge will be how to preserve the basic Westphalian respect for borders (something violated in 1990 by Iraq in Kuwait and by Russia in Ukraine more recently) and at the same time introduce the notion that with rights come obligations that must also be respected.

 

How did Wilsonian idealism at the Versailles Conference propose to reform the Westphalian model?  Why did the effort fail to prevent another world war a couple decades later?  

Wilson famously declared the United States had entered World War I because "the world must be made safe for democracy." This was a decidedly anti-Westphalian statement, as he was in essence calling for the United States to transform other societies and influence their internal trajectory. The Treaty of Westphalia, as I mentioned above, emphasized that a country's internal nature was its own business, and countries should instead focus on shaping each other's foreign policies. It is too much to say that Wilson's approach failed to prevent another world war. World War II was the result of a convergence of forces, including the Great Depression, protectionism, German and Japanese nationalism, U.S. isolationism, and the weakness of international institutions, above all the League of Nations. What I would highlight about Wilsonianism is that it remains an important strain of American political thought. To this day, there is a school of American foreign policy that emphasizes the promotion of democracy, and, in some cases, the transformation of other societies. My personal preference is to focus our efforts mostly on shaping the foreign policies of other countries.

 

I found your coverage of what you call China’s “century of humiliation” to be one of the most interesting parts of the book. What were some of the key developments that led to this troubled period in China’s history?  How do you think this “humiliation” affects Chinese domestic and international policies today?

As I mention in the book, the "century of humiliation," as the Chinese term it, began with the Opium Wars and closed with the establishment of the People's Republic of China in 1949. It was mostly the result of the internal decay of the Qing Dynasty, which was in large part brought on by its inability to grasp the changes that were going on around it and adjust to the new reality. While Japan, following Commodore Perry's mission, modernized and attempted to catch up with the West in areas where it had fallen behind, the Qing Dynasty remained set in its ways, convinced that the world had nothing to offer China. More important, this "humiliation" shapes the Chinese Communist Party's (CCP) narrative and how it wants Chinese citizens to think about the world. In the CCP's telling, only a strong government can prevent foreign powers from taking advantage of China, while a fractious and weak country invites foreign aggression. Of course, what the CCP then claims is that only it can provide the stability and strength that China needs and uses this take on history to justify one-party rule and the repression of civil liberties.

 

Though you do not deny the hardships and missteps that occurred during the Cold War, you do offer a rather positive evaluation of the stability in the decades-long bipolar contest between the US and Soviet Union.  What were some of the features of the Cold War that helped manage the tensions between the superpowers and prevent the outbreak of a hot war?  Can some of these be applied to the current Sino-American relations?

 

We should not discount the role that nuclear weapons played in keeping the Cold War cold. Simply put, the specter of nuclear war kept the competition between the United States and the Soviet Union bounded, as any potential war between the two powers could have led to a nuclear exchange that would have decimated both countries and the world. Many international relations scholars argue that a bipolar world is inherently more stable than a multipolar one, because it is easier to maintain a balance of power and stability more broadly when there are only two centers of decision-making. I would add that the United States focused most (although not exclusively) on the Soviet Union's international behavior and did not seek to overthrow the regime. There was a measure of restraint on both sides. Finally, there were frequent high-level summits, arms control agreements, and regular diplomatic interactions. These all helped set understandings for each side and communicate what would not be acceptable to each side. 

In terms of Sino-U.S. relations, I believe nuclear deterrence will work to lower the prospect of war between the two countries. I am concerned, though, that we do not have a real strategic dialogue with China. We need to be able to sit in a room with each other and at an authoritative level communicate what we will not tolerate in areas like the South China Sea, the East China Sea, and the Taiwan Strait. The chances of miscalculation are too high. I also believe we should focus less on China's internal trajectory and more on shaping its foreign policy. We cannot determine China's future, which will be for the Chinese people to decide. We should continue to call out the government's human rights abuses in Xinjiang and its dismantling of Hong Kong's freedoms, but we should not make this the principal focus of our relationship. Instead, we should compete with China, push back against its policies that harm U.S. interests, and seek cooperation where possible with China to address global challenges. 

 

In the Cold War era, both Europe and parts of Asia experienced tremendous economic growth, peace, and prosperity. What role did the United States play in facilitating these positive outcomes?  Are there lessons from Europe and East Asia that can be applied to other parts of the world today?

First of all, we should give credit to the people of Europe and Asia for their tremendous economic success. In terms of the U.S. role, there was of course the Marshall Plan in Europe that provided the funding Europe needed to get back on its feet and rebuild after World War II. In Asia, the United States gave significant aid to its allies. The point I would make is that this aid was not done purely out of altruism. Instead, it furthered U.S. interests. It ensured Western Europe did not go over to the Soviet Union and that U.S. allies in Asia could be stronger. Foreign aid continues to be an important tool in our foreign policy toolbox, and we should continue to use it to further our interests. For instance, with China extending its reach around the globe through the Belt and Road Initiative, the United States should respond with a better alternative that would provide funding for infrastructure in the developing world but make it conditional on the infrastructure being green and on the countries undertaking necessary reforms. Trade can also be a powerful tool for promoting development.

 

What are some of the key developments that undermined the great hope that followed the end of the Cold War?  

In many ways, the Cold War was a simpler time for U.S. foreign policy. The country had one adversary, and it could devote most of its resources and the bulk of its foreign policy apparatus to addressing it. After the Soviet Union collapsed, containment lost its relevance, and U.S. foreign policy lost its compass. The United States enjoyed unparalleled power, but no consensus emerged as to how it should use that power: should it spread democracy and free market economics, prevent other great powers from emerging, alleviate humanitarian concerns, tackle global challenges, or something else? I've begun calling the post-Cold War period of U.S. foreign policy "the great squandering" given that U.S. primacy was not converted into lasting arrangements consistent with U.S. interests.

I would point to a few U.S. missteps that set back its foreign policy agenda and undermined the hope you refer to. First there was the mistaken 2003 invasion of Iraq, where the United States initiated a war of choice in the hope of transforming the country and the region. The Iraq War, and the nation-building effort in Afghanistan, soured many Americans on their country playing an active role internationally. Simply put, they believed the costs of such a role outweighed the benefits. Now, as the United States faces challenges from China to Russia, Iran, and North Korea, Americans are weary of getting involved. Relations with Russia soured, some would argue at least in part because of NATO enlargement.  The 2008 global financial crisis raised doubts worldwide about U.S. competence, as has the American response to COVID-19. In short, the relative position and standing of the United States have deteriorated.

 

After World War II, the United States helped construct what you call the liberal world order.  What are the key features of this order?  What do you consider its greatest strengths and weaknesses?  

The liberal world order is an umbrella term for the set of institutions the United States helped to create in the wake of the Second World War, including the United Nations, the World Bank, the International Monetary Fund, and the General Agreement on Tariffs and Trade (now the World Trade Organization). It was rooted in liberal ideas of free trade, democracy, and the peaceful settlement of disputes, and was also liberal in the sense that any country could join the order as long as it abided by its principles. It was never truly a global order during the Cold War, as the Soviet Union and its satellite countries opted out of many of its elements. 

The great strengths of the liberal world order are that it has promoted unprecedented peace, prosperity, and freedom. But increasingly it is being challenged. Its liberalness is rejected by authoritarian regimes. Many governments or non-state actors are not prepared to hold off using force to advance their international aims. In addition, the order has had difficulty adjusting to shifting power balances (above all China’s rise) and in developing collective responses to global challenges such as climate change, proliferation, and the emergence of cyberspace.

 

China’s emergence as a world economic power has greatly challenged this liberal world order and efforts to get it to conform to some of its basic principles have come up short.  How can other countries persuade and/or pressure China to adhere to the practices and rules of institutions (e.g., the World Trade Organization) dedicated to upholding the order?

First, it is fair to say that some institutions, such as the WTO, were not set up to address a country such as China, with a hybrid economy that mixes free market enterprise with a large state role. And the WTO failed to adjust sufficiently to China’s rise. The United States should be working with its allies and principal trading partners to bring about a major reform of the WTO. More broadly, the single greatest asset that the United States enjoys is its network of alliances. China does not have allies, whereas the United States enjoys alliances with many of the most prosperous and powerful countries in Europe and Asia. The United States needs to leverage those alliances to present a united front in pushing back where China does not live up to its obligations. It should also work with its allies to develop an alternative 5G network, for example, and negotiate new trade deals that set high standards and would compel China to join or risk being left behind. In the security realm, it should coordinate with its allies in Asia to resist Chinese claims to the South China Sea and make clear to China that any use of force against Taiwan would be met with a response.

 

Despite the fact that the US was a driving force behind establishing and maintaining this liberal world order, many Americans have grown weary of the costs involved and fail to see how it benefits them.  Indeed, this was a key feature in President Trump’s 2016 campaign message and continues to influence his foreign policy.  How can policymakers who want to continue American leadership in this order persuade Americans that the system actually benefits them?  

Policymakers need to be more explicit in highlighting the benefits of the liberal order and contextualizing its costs. We avoided great power war with the Soviet Union and the Cold War ended on terms more favorable to the United States than even the most optimistic person could have imagined. Global trade has skyrocketed, and America remains the richest country on earth. Alliances have helped keep the peace in Europe and Asia for decades. In terms of the costs, defense spending as a percentage of GDP is currently well below the Cold War average, which was still a time Americans did not have to make a tradeoff between butter and guns. We can assume a leadership role abroad without sacrificing our prosperity. On the contrary, playing an active role internationally is necessary in order to keep America safe and its people prosperous. The United States may be bordered by two oceans, but these oceans are not moats. Even if we choose to ignore the world, it will not ignore us. Both 9/11 and the COVID-19 pandemic have made this abundantly clear. 

Making History with Music

PFC Richard Burt, March Field, Riverside, CA 

 

Seventy-five years ago, World War II ended, but the stories of the men and women who served continue to be woven into the history of the United States.  When the war ended, Corporal Richard Burt attended the Juilliard School of Music with his constant wartime companion, his trumpet.  During the war, the 19-year-old private served in the 746th Far East Air Force Band in the Philippine Campaign and shared musical experiences with front line troops, generals, foreign dignitaries, and some of the most famous service members taken prisoner by the Japanese when U.S. forces in the Philippines surrendered in 1942.  

His journey began stateside, in the band at March Field in Riverside, California.  It was there that he learned under the best in show business: “I learned an awful lot about blowing my horn there.  Three-fourths of March Field had been professionals in Hollywood recording industries or were members of nationally known big swing bands.”  Richard was challenged musically by military formations at March Field as well, when he was asked to play taps for the first time as a 19-year-old PFC at the funeral service of legendary World War II pilot Lt. Col. William “Ed” Dyess.  Dyess was well known for his bravery at Bataan and as one of a group of twelve POWs who made the only successful mass escape from a Japanese POW camp during the entire war in the Pacific.  Dyess’ final act of bravery earned him the Soldier’s Medal when his plane malfunctioned over Glendale, California and he chose to crash land in a vacant lot rather than ditch his plane and possibly kill or injure any civilians.  “I’ll always remember the first time I played taps in the war for William Dyess.  He was a real hero and that experience always stuck with me.”

 

Lt. Col. William "Ed" Dyess, 1943, after his return from his POW experience

As the war in the Pacific raged in 1944, a call came to March Field to furnish a trumpet player with a sergeant’s rating for overseas assignment in a newly formed band.  Being the youngest in the group and only a PFC, Richard volunteered, “All of our Sergeants were married, so, I marched into our Chief Warrant Officer’s office and asked if I could take the place of that married man that was slated to go.”  His request was granted and he was off to an unknown destination with his new companions, the 746th Far East Air Force Band.  

Upon arrival to Leyte Island in the Philippines, the newly formed band played their first show with a USO group for front line soldiers.  “We played that show three nights in a row, the last being up at the front.  It was an area where all the palm trees had been blown in half.   A make shift stage had been set up and when we arrived there were GIs climbing up these half blown up palm trees to attach spotlights for the show.  All the men who came to see the show came in their ponchos with their helmets on and their rifles sticking out.  As the show progressed, across the ravine, there would be sounds of automatic weaponry and you could see the flashes every once in a while that the shooting made.  So, even while we performed, on the other side of the ravine, there was action going on.  That was as close as I ever came to fighting in that war.”  Stories such as this exemplify the experience of the front line band and the role that they played in World War II, using music to give a respite to soldiers, sailors, marines, and airmen and make them feel a little more connected to home, even on the other side of the world in a combat zone. 

 

PFC Burt practices his trumpet in the jungle of Letye

 

As the Philippine Campaign progressed, Richard and the 746th Far East Air Force Band would move into the City of Manila on Luzon, eventually being stationed at the headquarters of the Far East Air Force at Fort McKinley, under the command of General George Kenney.  It was here that the band would cross paths with another legendary group from the Philippines: the Army and Navy Nurse Corps “Angels of Bataan” who had been freed from their POW camp at Santo Tomas in February of 1945.  According to Richard, “We played for one formation only, that is a military formation, and that was after the war had ended with Japan.  This was when nurses who had been taken prisoner when Corregidor fell were given medals.  We stayed in the shade to play that formation and for good reason, it was suffocatingly hot.”

 

Army Nurses, popularly known as the "Angels of Bataan," awarded the Bronze Star. Leyte Island, 1945

 

As the war came to an end and the unit was to break up, it was decided that it would be fun to record the group.  It was done in the band’s rehearsal tent, using a wire recorder.  In what was described as a fatiguing session, the band recorded themselves on wire playing ten chart-topping big band hits, musically arranged by members of the 746th.  Upon completion of the recording session, lead trumpet player PFC Richard Burt asked his commanding officer if he could have the recordings to take home. Chief Warrant Officer John Washburn granted Richard’s request and he brought them back to his home in Salt Lake City, where he took them to KSL Radio and had them transferred to records.

Richard’s life after the war always included music and his longtime wartime companion, the trumpet. He graduated from the Juilliard School of Music in 1953 and received his BA and MA in music education from Drake University in Des Moines, Iowa. He passed his passion for music on to his family and to the students he taught in the public school systems of San Francisco and West Sacramento. He kept the recordings of his World War II band safe for decades, but lost track of them in his home in the 1980s. Richard passed away in August 2016 at the age of 92.

When Richard’s wife Marilyn passed away in October 2019, I found my grandfather’s misplaced recordings in his attic. As a historian, I felt bound to preserve this one-of-a-kind artifact and honor my grandfather, his band mates, and all those who served during the war. Working with a four-time Grammy Award-winning sound engineer, I am producing the band’s work into a modern album with a 28-minute narration on the band and the military experience of Richard Burt, recorded by Richard himself in the 1980s. The project has multiple goals, but the boldest is historic: to take an album created by a front-line Army band in the Pacific 75 years ago and make a band of World War II veterans a platinum-selling artist by selling a million albums. The album created 75 years ago by the 746th Far East Air Force Band will be available for sale on Veterans Day 2020. A portion of the proceeds from album purchases will be donated to the United Service Organizations (USO) and the National World War II Museum in New Orleans. If you wish to follow this World War II music project as it unfolds, you can follow the 746th Far East Air Force Band on Facebook at https://www.facebook.com/746thFEAFband/ or on Twitter @746thFEAFband.

 

A Conversation with Seattle Author Dr. Lawrence Matsuda on His Debut Historical Novel "My Name is Not Viola"

Lawrence Matsuda portrait by Alfredo Arreguin

 

 

 

On December 7, 1941, forces of the Japanese Empire attacked the American naval base at Pearl Harbor and left thousands of American military members and civilians dead or wounded. In response to the surprise attack, the United States declared war on Japan the next day. The attack on America inflamed anti-Japanese sentiment and hysteria that led to hate crimes, particularly on the West Coast, against aliens and US citizens of Japanese extraction—and those who looked like them.

Under President Franklin D. Roosevelt’s February 1942 Executive Order 9066, the US government forcibly removed 120,000 people of Japanese ancestry from their homes and incarcerated them in concentration camps. Most of these interned people were kept in the camps until 1945, with the exception of a few who were released early, such as the valiant souls who volunteered to serve in the American armed forces, including members of the Japanese American 442nd Regimental Combat Team, which became the most decorated American unit of the war for its size and length of service. Others were released to attend college or work in defense industries like munitions factories in areas away from the West Coast.

The unfortunate internees subjected to the harsh and dehumanizing conditions of the prison camps had committed no crime but were rounded up, dispossessed, and detained unconstitutionally based only on their ancestry and race. And about two-thirds of the internees were United States citizens. 

The detainees included Hanae and Ernest Matsuda, who lost their home and grocery business in Seattle when they were removed in 1942. Like thousands of others, they were evacuated without due process and were incarcerated at the Minidoka concentration camp in Idaho, where Hanae gave birth to two sons and a stillborn child.

Hanae and Ernest Matsuda’s youngest son Lawrence was born in 1945 in Block 26, Barrack 2, of Minidoka Camp. Their baby’s prisoner number was 11464d. 

Now Dr. Lawrence Matsuda, a renowned Seattle writer and human rights activist, brings to life his mother’s travails, traumas, and triumphs in mid-20th century America in his debut historical novel My Name is Not Viola. The events experienced by the fictional Hanae of the novel mirror actual incidents in the life of his mother including her girlhood in Seattle’s Japantown; her pre-war journey to Hiroshima, Japan; her removal from her Seattle home and incarceration at the brutal Minidoka concentration camp; her quest for Hiroshima relatives after the atomic obliteration of the city; her marital woes; her severe depression and incarceration at Western State Hospital, a psychiatric facility; her resilience grounded in Japanese and western beliefs; and her evolution as a force for good.

The novel captures the rhythm of life in Seattle’s Japantown, the unrelenting misery of internment at the Minidoka camp, and the pain and loss of internees as they returned home after the war to face dispossession and poverty. This history through the eyes of the fictional Hanae grips the reader with its lively writing and evocative imagery while sharing an important and heartbreaking chapter from our American experience. Yet it is also a story of hope and triumph despite recurrent traumas—and quite timely as we face an unprecedented pandemic and political crises today.

Dr. Matsuda is known in Seattle as a voice for social justice, equality, and tolerance. He is a former secondary school teacher, administrator, principal, and professor. He received an MA and PhD at the University of Washington.  

As a writer, Dr. Matsuda is best known for his poetry. His first book of poems, A Cold Wind from Idaho, was published by Black Lawrence Press in 2010. He has published two other books of poetry, one in collaboration with renowned American poet Tess Gallagher, as well as a graphic novel about the Second World War experiences of the Japanese American 442nd Regimental Combat Team. Chapters one and two of that graphic novel were animated by the Seattle Channel, and both won regional Emmys, one in 2015 and the other in 2016. His poems have appeared in publications including Raven Chronicles, New Orleans Review, Floating Bridge Review, the Poets Against the War website, Nostalgia Magazine, Plumepoetry, Surviving Minidoka (book), Meet Me at Higos (book), Minidoka-An American Concentration Camp (book and photographs), the Seattle Journal for Social Justice, and many others. And he co-edited the book Community and Difference: Teaching, Pluralism and Social Justice, winner of the 2006 National Association of Multicultural Education Phillip Chinn Book Award. 

And Dr. Matsuda continues to work tirelessly for a more just and tolerant nation.

He graciously talked about his new novel and his writing career by telephone from his home in Seattle.

 

Robin Lindley: You had a successful career as an educator, administrator, and professor. How did your “encore career” as a poet and writer come about? 

Dr. Lawrence Matsuda: When I got my PhD, I decided to take something fun because the PhD was tough sledding and not always enjoyable. So, I took a poetry class from Nelson Bentley. 

Robin Lindley: He was a beloved professor at the University of Washington.

Dr. Lawrence Matsuda: Yes. I enjoyed it a lot. I attended his class several times and read for the Castalia reading series for several years. He always encouraged me to publish my poetry. He was a good person and took great pride in having his students published. 

I moved my energy into poetry after my PhD, and continued to write poetry when I was working. Most of it was not great, just mediocre poetry. 

In about 2008, I decided to get good at poetry. I worked with Tess Gallagher. She helped me with my first book of poetry A Cold Wind from Idaho. I thought I was done because I had worked with some other people who helped. I gave the manuscript to my friend, the artist Alfredo Arreguin, and he said Tess Gallagher was coming to his house, and that he would show the book to her. Evidently, she was taken by the manuscript, but decided it needed revisions. She worked with me for about a year, mostly electronically. We finally met and I submitted to Black Lawrence Press as part of a contest. It didn't win first prize, but received honorable mention, and it was published in 2010. Currently more than 1,300 copies are in print.

Robin Lindley: Thanks for sharing that story. It’s wonderful that one of our great American poets, Tess Gallagher, helped launch your writing career. Now you've written this historical novel, My Name is Not Viola, based on the life of your mother. What sparked a novel at this time? Did you see it as a memoir for you as well as the story of your mother? 

Dr. Lawrence Matsuda: It started as a play set in the Minidoka [concentration camp] canteen, where old guys were sitting around and talking in a general store, cracker-barrel scene.

I decided that the play wasn't going anywhere. It was just talking, and it needed a little more action. So, I looked to my own life and I compared it to my mother's and my mother had a much better story. 

It's not a memoir because some of it is fiction, and it's not an autobiography. It follows the same character in the first person from beginning to end. It's a historical novel that reads very much like a memoir.

The bones of the novel are my mother’s story and that structure is true. My mother was born in the United States. She went to Japan and was educated there. She came back to the United States, and then got married. She was incarcerated. And she went to a mental hospital. So, all the bones are true, and to add flesh, I borrowed some of the stories that she told me. I filled in the blanks and then, to move the story further, I added stories that I heard from other people about Minidoka. 

I’ve made pilgrimages to Minidoka six or seven times. They have a story time when former internees talk about being there. I borrowed some of those stories, and then farther out, I brought in stories of my friends, and then way out farther it was just fiction. So, the book is historic fiction based on the general outline of my mother's life. 

What motivated me is, I have always thought that each person has a good story, and at least one novel. I decided I needed to write and find my one novel, but it wasn't my story. It was my mother's story. 

The other thing is that I’ve always felt an artist should keep moving. I went from poetry to a graphic novel, to a kind of poetry exchange with Tess, and then to a novel. I'm always trying to do different things. I think an artist should always try something new. Because the incarceration is so powerful, it is very tempting to dwell on it and not move forward.  For the novel, I wanted to present the context of the incarceration and the aftermath to give a larger perspective. 

Robin Lindley: Thanks for your words on your process. How did you decide on the novel’s title, My Name is Not Viola?

Dr. Lawrence Matsuda: I found my mother’s high school annual and there were inscriptions like “Good Luck, Viola.” I asked her who Viola was, and she said her teacher gave her that name. 

Robin Lindley:  In your novel, you take your mother’s life and add to the story. Picasso said that art is the lie that tells the truth. You share an engaging human story that deals on so many levels with the forces of history such as racism and injustice and the aftermath of war. It’s incredible how much she dealt with in her life.

Dr. Lawrence Matsuda: There are 120,000 stories of people who were forcibly incarcerated, and each one is different but similar. They all experienced the same thing at different levels. My story is only one of 120,000. 

Robin Lindley: You were born at Minidoka in 1945, so you must not have any direct memory of the internment.

Dr. Lawrence Matsuda: No, but I do have borrowed memories. No matter what, at every Christmas, every Thanksgiving, every New Year's party, every wedding, funeral, the evacuation and the incarceration always came up. It's just a part of life. And I have these borrowed memories that usually focus on the worst of the experience. 

I don't have clear memories in the traditional sense, but my friend, a psychiatrist, says that, when my mother was pregnant, more than likely some chemicals were sent to me in her womb, and that affected me in terms of fear and stress that made up my personality. He also has said that, when he talks to someone who has deep problems, oftentimes he asks if their grandparents suffered any problems. He says big traumas are passed down for three generations. He feels that what happened to your grandparents and your parents is relevant to your current situation. 

Robin Lindley: I’ve heard about studies on genetics and past trauma. There are several studies with grandchildren and children of Holocaust survivors. 

Dr. Lawrence Matsuda:  So the trauma is passed down, and somehow you adjust. The third generation of trauma can still affect you.

Robin Lindley: So, we’re haunted by the traumas of earlier generations. You deal with almost a century of modern American history in the book. What was your research process as you wrote the novel?

Dr. Lawrence Matsuda: I went to Minidoka about six or seven times. In 1969, I taught the first Oriental American history class in the state of Washington at Sharpless Junior High School—now Aki Kurose Middle School. So I was interested in history and, while there, a number of things happened. I met Mineo Katagiri, a reverend who founded the Asian Coalition for Equality, and we worked together. 

Later on, some members of the Asian Coalition for Equality and I confronted the University of Washington because they were not admitting Asian students into their educational opportunity program (EOP). At the time, it was called the Special Opportunity Program, which served poor whites, blacks, Latinos, and Native Americans, but not Asians. 

And so, my interest in history took a step into activism. Ironically, it did again with the kids in the Oriental American history class. At that time, we were still referred to as “Orientals” and the term “Asian” was emerging. The class made a display of miniature barracks like those at Minidoka for an exhibit called “The Pride and the Shame,” a Japanese American Citizens League traveling exhibit for the University of Washington Museum. 

Bob Shimabukuro, in his book Born in Seattle, writes about how the traveling exhibit was the impetus for the reparations movement for Japanese Americans. So, my history interest moved me into activism, and my activism was rooted in history, especially anti-Asian, anti-Chinese, and anti-Japanese prejudice, which culminated in the forced incarceration.

Robin Lindley: Thank you for your work for change. To go back to your novel, I’m curious about the story of your main character Hanae, who is based on your mother, and your mother's actual experiences. Did your mother go to Hiroshima, as in the novel, when she was about nine and have a rather dismal experience with her relatives, especially her older brother’s wife?  

Dr. Lawrence Matsuda: That was not true. She was born in Seattle and she went to Japan at age one and she returned with her mother and brothers about eight years later.  Her father stayed in Seattle and sent money home to Hiroshima when the family was there. And when she was nine years old, she came back to Seattle. When she was 21, she returned to Hiroshima to live with her older brother and that's when she couldn't get along with her sister-in-law and left after a year. 

Robin Lindley: And did she have an older brother Shintaro who was an officer in the Japanese Navy? 

Dr. Lawrence Matsuda: Yes. He was a submarine officer. He was not a captain, but he was a high-ranking officer on a submarine. He mentioned that the warlords were feeling very confident because of the victory over a Western power in the Russo-Japanese War.

Robin Lindley: The militarists were building sentiment for war in Japan in the early 1930s. In your novel, you depict the removal, the evacuation, and the internment vividly. Was your depiction of Hanae’s story in the novel similar to what your mother experienced in the shocking removal and then the incarceration? 

Dr. Lawrence Matsuda: Yes, it was as described.

I think most of the Japanese were shocked. They knew that the Japanese nationals were at risk as non-citizen aliens. There was a law that wouldn't allow them to become naturalized citizens, so they were aliens. That would be her father's generation. But the initial thought among the Japanese was that they would not take the Nisei [second generation] who were US citizens. So, they were shocked when citizens were taken because it was totally unconstitutional and un-American. You don't round up and arrest citizens for no crime without due process, right?

Robin Lindley: Didn’t the US government contend that the order of evacuation and internment was to protect people of Japanese origin because of extreme anti-Japanese sentiment after the Pearl Harbor attack?

Dr. Lawrence Matsuda: Some people used that excuse, but that wasn't the reason that they were evacuated. If you read the actual evacuation notice, it says all persons of Japanese ancestry, alien and non-alien, were to report to designated locations. And overnight the Nisei, who were citizens, became non-aliens. 

Robin Lindley: And weren't the families and others of Japanese ancestry actually rounded up by troops armed with rifles with fixed bayonets? 

Dr. Lawrence Matsuda: Yes. There were troops. The people were told to report to certain places. The earliest pickups were done by the FBI, which took mostly first-generation people who were leaders of the community shortly after Pearl Harbor, while the bulk of the Japanese community was taken in April. 

Robin Lindley: It was a heartbreaking violation of human rights and the rule of law. What happened once these citizens and non-citizens were rounded up? What happened to their property and possessions? 

Dr. Lawrence Matsuda: It was different in every region of the country, but here the Japanese obviously sold off a lot of their goods at fire sale prices. And they stored some items. My parents actually stored some goods at a storage company and also at the Buddhist church. 

There were people in rural areas who left their land to others to care for. For example, on Bainbridge Island, some leased their land to their Filipino workers. They did take care of it, and when the Japanese returned, the land was in good shape. And some of the Japanese split the land with the Filipino workers. Other Japanese left the land and it was totally in disrepair when they came back. Many couldn’t keep their properties because they couldn't pay the taxes. So it was lost. 

There are countless stories. One storeowner left his ten-cent store to a Jewish man to care for. I think he was a jeweler who watched the boarded-up store and took care of it. Nothing happened to that store, but other places such as farmhouses were destroyed, especially by the time the owners came back. A farmhouse was burned on Vashon Island. Farmhouses were vandalized in anti-Japanese incidents in Hood River, where the whole town signed a petition not to permit the Japanese to return--but the Japanese did anyway. 

Each place has a different story, but overall, most of the people lost their businesses. Most of them lost their jobs. Most of them lost their homes. Most of them sold whatever they had at a huge discount. So it was a very difficult time. Goods were sold for pennies on the dollar, and customers took advantage because they knew that the Japanese were vulnerable. 

Robin Lindley: You have some remarkable scenes in your novel. I was struck when some white person wanted to buy a piano for a dollar. 

Dr. Lawrence Matsuda: Yes. The Japanese knew they couldn't take it with them. And, if a store was going out of business, they would sell at a huge discount on all goods. They were trying to make something, no matter how small.

Robin Lindley: Were there physical attacks on people of Japanese origin following the Pearl Harbor attack? 

Dr. Lawrence Matsuda: I hadn’t heard of any physical attacks. I know some Filipinos were beaten up because they were thought to be Japanese. The Chinese wore buttons saying “I am Chinese.” And I know that there was a man who was impersonating an FBI agent and he tried to do some bad things to Japanese women. 

Robin Lindley: That was such a time of fear and hysteria. What are some things you’d like people to know about the conditions of the concentration camp at Minidoka where your parents were held and where you and your brother were born? You describe the circumstances vividly in your novel. 

Dr. Lawrence Matsuda: They were in the desert. The food was not always sanitary. The quarters were cramped. There was no privacy. People had to use the latrines instead of regular toilets. There were scorpions and rattlesnakes and dust storms. 

All of that was just a given, but the worst part of it was being betrayed by your country. I compare it to rape. The whole community was raped and we handled it like rape victims. Some were in denial and others tried to prove that they were good citizens. Some committed suicide. Others were just depressed. So, the worst part of it was the mental realization that the whole community was raped. And very few on the outside really cared. I compare it to a rape by your uncle--by someone you trust in your family. It was a rape by our Uncle Sam.

Robin Lindley: And wasn’t the internment out of sight and out of mind, without much press coverage or any outside attention? 

Dr. Lawrence Matsuda: Yes. Minidoka was tucked into a ravine and 9,000 people were imprisoned there. If you drove by, you wouldn't even see Minidoka, even though it was the third-largest city in Idaho at the time.

The physical conditions were bad, but I think the mental trauma was really devastating. The fact that your country betrayed you. And afterwards. Think about it. Who can you trust if you can't trust your government to protect you and maintain your rights? Who can you trust? 

Robin Lindley: That history is devastating. What sort of housing did your mom and dad live in there at the concentration camp? I understand the shelters were very crude and crowded with little privacy.

Dr. Lawrence Matsuda: They lived in barracks that were hastily constructed. They had tar paper on the outside and weren't shingled or sided. It was like army barracks. It was open and they used blankets as curtains, and several families shared each building. The noises and the smells spread. The barracks were heated by a potbelly stove that burned coal.

At the first relocation center, my parents were given ticking and sent to a pile of straw to stuff a mattress. That's what they slept on at Camp Harmony in Puyallup, which was actually a county fairground. Some of the bachelors lived in the horse stalls that still had horse smells. My cousin got the measles and was quarantined in a horse stall. 

When they moved to the permanent camps, like Minidoka and the other camps, they lived in hastily-constructed, army-style barracks with cracks in the floors, cracks in the walls. The wind would blow through. And the barracks all looked alike so people could get lost and wander into your area at night. 

Robin Lindley: And there were extreme temperatures in the hot summers and cold winters. The weather must have been miserable. 

Dr. Lawrence Matsuda: It was cold and muddy in winter. The residents had to walk on boards that were laid down on the mud. And that was how they got to the mess hall. My mother would never eat Vienna sausage because it caused dysentery several times. 

Robin Lindley: And wasn’t healthcare limited? 

Dr. Lawrence Matsuda: There was an in-patient hospital on site. When there was an outbreak of dysentery, you had to line up at the latrine with everyone else, because everyone who ate at the same mess hall had dysentery. One night, the lines were so long and the internees so upset that the guards thought there was a riot. Soldiers were going to shoot. The residents shouted, “No, no, it's dysentery. We've got the trots.” And so, the soldiers left them alone.

Robin Lindley: When your parents were released from Minidoka with you and your brother, they returned to Seattle where they had been dispossessed. And your mother was facing the additional trauma of dealing with the probable deaths of her relatives in the atomic bombing of Hiroshima. 

Dr. Lawrence Matsuda: They actually released many people at Minidoka before the end of the war to work, attend college, or join the army. My father left several times to find housing, which he never found.  So, they stayed in camp until it closed. The administration shut it down, turned off the electricity, told them to leave, and gave them a train ticket and $25.  

Back in Seattle, my family stayed in the basement of my mother's friend's house for a while. We lived there until my dad could find proper housing, but it was in short supply because of the war and the GIs coming back. 

It was not an easy time. And there was racial real estate redlining in Seattle, so we couldn't move to the best part of town. We could only move to certain parts of town. If those areas were taken, it was tough luck. And in fact, some of the Japanese who moved out of the Central Area returned and found that African Americans who came up from the South to work during the war had moved into the redlined area.  

Robin Lindley: That’s another tale of discrimination in America, and we're still living with the results of racist redlining. Thanks for sharing that insight. I didn't realize the effect on the Japanese community. Your mother must have been shaken by the terrible atomic bombing of Hiroshima and the lack of news about her relatives. 

Dr. Lawrence Matsuda: Yes. The first news they heard was that Hiroshima was bombed. Tokyo had suffered firebombing with more or less conventional weapons like napalm, but people did not understand what an atomic bomb was or what its effects were.  

Recently, I read an article about how the US was suppressing news about the Hiroshima destruction until John Hersey visited Hiroshima and wrote his famous book, which revealed the aftermath. 

The news came in very slowly. It wasn’t like today when, if something happens, CNN is there by the next day. This news dribbled in. They knew that Hiroshima was destroyed, but they didn't know quite what that meant. It was the instantaneous destruction that was hard to comprehend. You could understand something being destroyed slowly, but everything in Hiroshima was vaporized or destroyed in an instant. 

My mother didn't know what happened to our relatives. It was only because of our relatives in the countryside that she found out the full story. But it was tough for her because she had lived in Hiroshima and she knew the city, so it was really devastating to realize that the city and many of her relatives were gone instantly. 

The people of Hiroshima were not soldiers. Soldiers expect to be put in harm's way and die, but these were civilians: old women, old men, young children, and workers. They were vaporized or killed instantly, or died later of radiation sickness. 

Robin Lindley: Have you traveled to Japan and visited Hiroshima? 

Dr. Lawrence Matsuda: Yes. I was actually in Hiroshima during the 50th anniversary of the bomb.  It is a strange city. Kyoto is very old. You see the shrines and the old architecture. Hiroshima is modern. It doesn't look like a Japanese city, but a modern city because it was totally destroyed. And in real life, our family home was only a thousand meters from ground zero. 

Robin Lindley: That visit must have been very moving for you then. Now it’s the 75th anniversary. 

Dr. Lawrence Matsuda: Yes. But I was surprised too when I met my relatives, the children and grandchildren of my mother's oldest brother. They were all very positive, very healthy, and very energetic. They were generally happy people. I met Akiko, who survived the bomb. She was in the family home at the time. I met her son, and her son’s son. So it seems life goes on. 

Robin Lindley: Yes, that’s encouraging. Didn’t Akiko suffer radiation illness and severe burns?  

Dr. Lawrence Matsuda: Yes. She’s mentioned in the book. 

Robin Lindley: Your description of Hanae’s treatment for depression at Western State Hospital, a psychiatric facility, is very moving. It happened in 1962 and you juxtapose her experience with the Cuban Missile Crisis. You also destigmatize mental illness. Does your portrayal in the novel parallel your mother’s own “incarceration” at the hospital when she was admitted for severe depression? 

Dr. Lawrence Matsuda: I really couldn't say for sure because she never talked about it. But I did talk to my friend who is a psychiatrist.  He took me to the Western State Hospital Museum and I saw what it was like, and I knew what they did at the time. I studied the hospital’s history and learned that doctors specialized in lobotomies at the time.

Robin Lindley: Did you visit your mother when she was in the hospital? You must have been a teenager then. 

Dr. Lawrence Matsuda: I visited her once. They wouldn't let me go inside. We had to meet her in front of the hospital, in the parking area, at the turnaround. She came out to see us.

Robin Lindley: What do you remember about that visit?

Dr. Lawrence Matsuda: She was very thin and she looked worse than when she entered. 

Robin Lindley: And what kind of treatment did she receive? Did she actually have shock treatment or electroconvulsive therapy? 

Dr. Lawrence Matsuda: I'm sure she did. My psychiatrist friend told me that was pretty standard. 

Robin Lindley: Did your mother seem depressed to you before she was hospitalized? Did she talk about suicide? 

Dr. Lawrence Matsuda: Yes, she seemed depressed, and she was very distant and not engaged. But she did admit to her sister-in-law that she was contemplating suicide. 

Robin Lindley: Wasn’t there almost an epidemic of suicide among the internees after the war?

Dr. Lawrence Matsuda: Yes. There’s no real data on that because nobody kept track of it. But I talked to Tets Kashima, who was a professor of Asian American studies, and he said in California suicide was prevalent. There were just a lot of suicides. And the other thing was, few people talked about it. 

Robin Lindley: From some history I’ve read, such as The Nobility of Failure by Ivan Morris, it seems that suicide is honorable in Japanese culture and tradition. And in your novel, some characters see suicide as an acceptable way to cope with loss and depression. 

Dr. Lawrence Matsuda: That's the samurai tradition. If you dishonor your master, or yourself, you must die too. That led to a custom of ritual suicide: hara-kiri, which translates as “cut the stomach.” And that’s what samurai did. And my friend, [the artist] Roger Shimomura, had ancestors who were famous for a double suicide. They stood face to face and stabbed each other simultaneously. So, they committed ritual suicide together. 

Robin Lindley: That's an elaborate way to go. You indicate that Hanae and your mother were influenced by both Japanese and Christian traditions. Were those traditions a source of your mother’s strength and resilience through the catastrophes in her life? 

Dr. Lawrence Matsuda: Yes. I think both of them helped her. She could call on Japanese tradition to deal with her stress if an American tradition did not help. So, she had a little more of an arsenal, if you will, or two toolboxes to pull from. However, some tools that helped her survive became counterproductive. Take the Japanese word shikatanganai. “It can't be helped.” That word helps you get through, but after a while it doesn't move you forward. 

Robin Lindley: Yes. “It can't be helped.” When I read that phrase in your book, it reminded me of Vonnegut’s refrain: “So it goes.” It can’t be helped seems a pessimistic adage rather than we can change this or we can do better. 

Dr. Lawrence Matsuda: It isn't really. Japan was a harsh land of starvation, earthquakes, and typhoons. When your house fell down, no one in the village wanted to hear you crying because their house fell down too. And so it’s shikatanganai, it can't be helped. It's just what happened. 

And in America, a rich country, not a poor country like Japan, there is no shikatanganai. Here, your house falls and you call your lawyer. You sue the city. You sue the architect. You sue your neighbors. But it's not that it couldn't be helped. You’ve got to sue somebody. And it's really an irony that, in a poor country, they accept their fate but in a rich country, they always want to contest what happens. Not always, but there’s a different feeling. So this Japanese value helped my mother and others cope with overwhelming forces. 

Robin Lindley: Maybe that's akin to the acceptance stage of grief. 

Dr. Lawrence Matsuda: Yes, you accept fate rather than get angry.

Robin Lindley: It’s a different perspective. I was interested in your influences, and you have mentioned the naturalist writers such as Frank Norris and his classic novel The Octopus. Naturalism concerns how characters deal with the forces of nature, the forces aligned against them, and you write beautifully of how your characters take on fate. Do you see the influence of writers like Norris in how Hanae deals with forces beyond her control and then, it seems, becomes a force herself? 

Dr. Lawrence Matsuda: Yes. The naturalists felt that the forces of nature superseded human ambition. Human beings have to deal with natural forces at work in this world and these forces often overcame individuals.  In The Octopus, the novel by Norris, the railroad was a force which had to reach from coast to coast to deliver grain to the starving people in India. So that was another force to deal with. And even though the ranchers resisted the railroad, they couldn't stand up to it because the force was more potent. It had to deliver the grain to feed the starving masses. 

If you look at our situation today, there are numerous outside forces at play. One is obviously the pandemic. The other is the political situation. And these forces are largely out of our control. But in the novel, Hanae managed to survive the adverse forces and learned to surf the waves of the tsunami and become a force herself--not a capital-F force like feeding the starving in India, but a small force that is filled with equality and social justice. 

We're in that kind of a situation now. The large forces out there can destroy us, but we must learn to use them and to survive them and become forces for good. And if many people get together and become forces themselves, they can become a large force, like a natural force, like the starving masses in need of grain. We need to persevere and make it to the other side and become forces ourselves.

Robin Lindley: And you have been a force for social justice and for democracy in your writing and in your activism and teaching.

Dr. Lawrence Matsuda: I have tried.

Robin Lindley: I’ve read about your many accomplishments. You’re too humble. You’ve written now about atrocious incidents and the resulting trauma, but you have also shared triumphs of the human spirit. Where do you find hope today?

Dr. Lawrence Matsuda: When I was a kid, I read all the Greek mythology in the Beacon Hill Library at grade three. And that helped me. I think that mythology is something like history. I recall that Pandora opened a box and unleashed all these horrible things. But the thing that was left in the box was hope. There is still hope.

Robin Lindley: Is there anything you’d like to add for readers?

Dr. Lawrence Matsuda: I'd like to speak to why the Japanese were incarcerated. Three presidents, Reagan, Bush Senior, and Clinton, addressed this in their letters of apology. They said the causes were racial discrimination, wartime hysteria, and failed leadership. And I ask you to take a look at what we have now regarding racial discrimination. My hope is that things get better. As for wartime hysteria: what was called propaganda then is now called fake news. I hope that the network that peddles fake news crashes and burns. And the last one, failed leadership. I hope that our failed leaders are repaired or replaced soon. So those are my three hopes. 

Robin Lindley: Those are powerful thoughts to end on. Thank you for sharing your thoughtful comments Dr. Matsuda, and congratulations on your moving new novel, My Name is Not Viola. It’s been an honor to speak with you.

 

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, Huffington Post, Bill Moyers.com, Salon.com, NW Lawyer, ABA Journal, Real Change, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.


 

 

 

 

How an American TV Mini-Series Helped the Germans Acknowledge the Holocaust

Meryl Streep in Holocaust, NBC, 1978

 

 

In a fascinating book, Learning from the Germans: Race and the Memory of Evil (2019), philosopher Susan Neiman praises the German people for coming to terms with their country’s role in the Holocaust. The reckoning took time, Neiman reports. For a few decades after the Second World War there was not much public discussion or teaching about the subject in Germany. In the late 1970s, however, a significant change occurred. Germans began to deal more openly and frankly with the record of Nazi persecutions. 

A visitor to present-day Germany can find numerous examples of this “remembrance,” notes Susan Neiman. There is a memorial to the Holocaust at the center of Berlin and there are “Stumbling Stones,” small brass plaques around the city indicating where Jews and other victims of the Nazis lived before deportation. Exhibits about the Holocaust can be found throughout the country, and educational programs at Buchenwald and other concentration camps describe horrible practices at these sites. On the anniversaries of tragic events, such as Kristallnacht, Germany holds “public rites of repentance.” Neiman says Americans can learn how to confront their nation’s troublesome history of slavery and racial oppression by considering Germany’s progress dealing with unpleasant facts about the past.

Why did the German people’s curiosity and interest in the Holocaust surge in the late 1970s? Years ago, I discovered an important clue to this attitudinal change when conducting research for my 2002 book, Reel History: In Defense of Hollywood. Working on a chapter called “Impact,” I examined history-oriented dramatic films that influenced public opinion and behavior in significant ways. During that investigation, I came upon details concerning Holocaust, an American-made mini-series that NBC released in the United States in 1978. Subsequent programming in Britain, France, and Sweden attracted large audiences. The greatest buzz and public discussion took place in Germany. 

Holocaust is a four-part docudrama with mostly fictional characters. Among its stars is Meryl Streep, then a young actress in an early stage of an extraordinary career. At the center of the story is a kind and respected Jewish medical doctor, Josef Weiss, and his extended family. Weiss’s nemesis is Erik Dorf, an unemployed lawyer who joins the Nazis. Eventually Dorf becomes a deputy to Reinhard Heydrich, a principal leader of the “Final Solution.” By the end of the film, most members of Josef Weiss’s family perish. The story exposes viewers to major historical developments from 1935 to 1945, including the Nuremberg Laws, Kristallnacht, concentration camps, and the Warsaw ghetto uprising. 

When Holocaust became available for West German television in 1979, some German TV executives did not want to broadcast the film. One complained that it represented “cheap commercialism” in a soap opera format. A program director dismissed the production as typical Hollywood entertainment, “not quite real, not quite truth.” Despite the executives’ resistance, the program appeared on local TV stations and it became an instant hit. About half of West Germany’s population viewed some or all programs in the series, and many people in East Germany managed to watch it through antenna reception. About 30,000 viewers called television stations, requesting information. They asked: How could it happen? How many people knew? 

The film made a significant impact on German society. A few months after its broadcast, West Germany scrapped the statute of limitations on Nazi war crimes. Media attention to the film provoked a “historians’ debate,” leading scholars to clash on questions about lessons from the record of German society under the Nazis. Educational leaders responded to the public’s interest by developing new courses for schools.

Books and documentary films about the Nazis and the Holocaust appeared in Germany before 1979, but they did not excite the degree of curiosity and interest that the mini-series aroused. Several media analysts in Germany pointed to the dramatic film’s powerful effect. Viewers became emotionally attached to the characters. They were upset when seeing the Germans’ indifference to human suffering and seeing Jewish figures harassed or cut down in brutal actions. Previous reports about this tragic history provided only names and numbers, the analysts noted. This production displayed the impact of historical events in graphic form. The victims seemed like real people. Audiences cared about the Jewish characters’ fate.

Susan Neiman makes a good point in her book. Americans, now struggling to acknowledge their country’s history of racial oppression and wishing to do something about it can learn from Germany’s progress toward “remembering.” Yet the Americans’ recognition of evils from history is not as limited as Neiman suggests. “Hollywood,” the generic name for America’s vast film and video-based industry, has made some worthy contributions to humanitarian awakenings. Holocaust helped Germans to confront their troubled past, and in another notable example, Hollywood confronted Americans with demons from their history. 

Marvin J. Chomsky, the director of Holocaust, had tugged at the heartstrings of American viewers through the broadcast of an emotionally powerful drama on ABC Television in 1977, a year before the release of Holocaust. Roots, a mini-series about the experience of Africans and African Americans in slavery, attracted enormous audiences. The fourth and final program of Roots became the most-watched single episode of an American television show in history up to that point. Chomsky’s Roots did for the American people what Holocaust did for the Germans. The film aroused viewers to ask questions and seek information about a history that was less familiar than it should have been.

Fried Ice Cream and Steak: A Personal History of Hong Kong

 

 

The flight from Tan Son Nhut Airport in Saigon to Kai Tak Airport in Hong Kong on Cathay Pacific Airways was short but sometimes rough and windy, especially when landing. After clearing customs and leaving the airport, I hailed a cab. With my luggage in hand, "Jimmy's, please," I said to the driver, who knew the way without my telling him. It was usually my first stop, even before checking in at my hotel. I needed a Western-style meal, or even the semblance of one, after almost two months in Vietnam without a break from the daily grind of the war. Weaving through traffic, we were fast on our way to Jimmy's Kitchen, a European-style restaurant and a Hong Kong staple, a respite for Western journalists, contractors, military men, entertainers, and people with money from everywhere. NBC News' policy for Vietnam War staffers gave us ten days off every two months to catch our breath and refresh. More often than not, Hong Kong was my primary destination, a place I visited many times and then lived in during 1966, 1967, 1968, and 1969. 

Known simply as Jimmy's or, more formally but rarely, Jimmy's Kitchen, the restaurant sat on a darkened street in Kowloon, down several concrete steps, behind a dark wood door with only a single light at the entrance. Kowloon was a comfortable distance from the Central District of the city, where most of the high-end shops, expensive restaurants, hotels, and financial institutions were (and still are). Despite the huge demonstrations that are going on today, people tend to forget there were equally serious three-day demonstrations that turned into riots in 1966 over the price of a ferry ride. I was then in Saigon covering the Vietnam War when many people in Hong Kong protested the proposed rise of a few pennies, about 25 percent over the average cost of a daily ferry trip. With so many residents of Hong Kong on or near the poverty line, a few cents more for a ferry ride would seriously hurt their pocketbooks. Because people were getting no response from the government, they demonstrated and ultimately rioted, at the cost of one dead, almost 2,000 injured, and a like number jailed. The riots lasted three days. The rioters lost and the ferry raised its rates. Quiet ensued. However, the idea that demonstrations might work against government policies had come into play. It is now apparent that those early protests were a precursor to today’s huge demonstrations in Hong Kong. 

One year later in 1967, there were sudden and unexpected (for some) demonstrations in the name of freedom from the rule of the British, then the colonial masters of Hong Kong. Mao Zedong, China's cultish dictator, inspired the riots with his brand of brutal communist ideology outlined in his Little Red Book, the bible for his philosophy. I vividly recall witnessing a reasonably peaceful demonstration on a Hong Kong main street with hundreds of Chinese students wearing starched white shirts, dark trousers and plastic sandals, marching together as one, brandishing the famous red vinyl-covered book. It was late afternoon, the sun not yet down, and the end of the workday. Most of the people on the crowded street were passive and stood silently saying nothing while a few others punctuated the march with a smattering of orderly clapping hands.

The late 1960s was also a time of terror in Hong Kong, when makeshift bombs went off in doorways and on the streets of the city. The Red Guard, a Mao creation formidable in its control of Mainland China, was looking to establish a foothold in Hong Kong and set off many bombs in the city in an effort to oust the British. Though people died and suffered injuries when some bombs went off, what the Red Guard did was more disruptive and fear-inducing than seriously destructive. It was enough to keep the city on edge. One day at lunchtime I was in a crowd of people surrounding a small suitcase in the middle of an intersection. Traffic stopped. Police were everywhere. The growing crowd was quiet but tense as we shuffled our feet and waited for the police to disarm what everyone in the crowd believed was a bomb. To a collective sigh of relief, the bag was empty and everyone quietly dispersed. Dinner was waiting. 

Today, with British colonialism long gone, the many continuing mass riots, mostly in the Central District and fomented by Hong Kong's younger generation, are ironically for freedom from China, in opposition to an increasingly repressive Mainland government. The riots regularly take place in the center of the city, on Hong Kong Island. Jimmy's, until it closed a few years ago, remained tucked away on a quiet street in Kowloon, successfully serving its diverse patrons. 

I do not want to overly extol Jimmy's virtues. In a big room with low lights and dark wood paneling, with wide space between tables covered in starched white tablecloths, waiters clad in pajama-style uniforms moved silently and served with no fuss. The restaurant had the look and feel of an exclusive British club in Mayfair, London. The drinks were lavish and large. The Hong Kong-brewed San Miguel or Japanese Asahi beer was never cold enough, but that was how the British liked their favorite drink, so we adjusted. The food was good, not great. The steaks were well aged. Chicken Kiev was consistent. Beef Stroganoff always satisfied. The onion soup with a heavy dollop of cheese was chewy and delicious. It was more, though, than a restaurant that catered mostly to Westerners. Instead of being designated American or French or British, we all fell under the easier term, European. In times of war we accepted the designation with good will. 

The best part of the meal at Jimmy's was dessert, particularly its Baked Alaska, a treat to behold for its beauty and richness. I preferred the fried ice cream, an orb as big as a baseball consisting of freshly made vanilla ice cream wrapped in cake batter and quickly deep-fried to create a sweet treat that I found nowhere else in Asia. It was always worth the trip. 

Coming from Saigon as I did at least twice a year, the lure of Jimmy's was, strange as it may seem, its sameness. Even now, if I close my eyes, I see the British club it wanted to be, and for a few hours a night at dinner, or at lunch, it became a peaceful break from the reality of war in Vietnam. Today, all semblance of peace is gone as Hong Kong seethes in its struggle between authoritarianism and freedom. 

Prop 16 and the "Chinese Virus" Bring Two Views of Asian American History into Conflict

Anti-Proposition 16 Car Parade, San Francisco Peninsula, August 2020

 

 

Nationwide Black Lives Matter protests over the past several months have rejuvenated fights against ongoing racism. A surge of harassment and assault against Asian Americans during the pandemic signifies the recurrence of xenophobia and racial animosity. The racialization of Covid-19 as the “Chinese virus” awakens the dormant yellow peril trope. Meanwhile, a group of Chinese Americans, mostly first-generation immigrants, have been organizing flag-flying, placard-displaying car rallies in the Bay Area and southern California, protesting Proposition 16—a ballot measure that aims to restore affirmative action in California.

Disregarding the structural inequalities that race and gender-conscious affirmative action seeks to dismantle, anti-Prop 16 protesters embrace a conception of equality that comprises two basic ideas: individual effort and colorblindness. They consequently consider racial and gender preferences inherently discriminatory. To comprehend this individualistic view underscoring mere surface equality requires one to trace the history of Chinese Americans’ struggle for equality. 

In the late 1960s and early 1970s, a group of Chinese American social activists, inspired by the civil rights movement, contested surface equality in American courts. Lau v. Nichols is a landmark case wherein limited-English-speaking students of Chinese ancestry in San Francisco alleged the denial of equal educational opportunity by the school district due to the lack of bilingual education. The United States Supreme Court ruled in 1974 that “there is no equality of treatment” without adequate bilingual education, the “effect” of which constituted discrimination. Mandating “different treatment,” the court directed the school district to “take affirmative steps to rectify the language deficiency” for racial minority students. As direct beneficiaries, Chinese American students enjoyed the benefits of these structural improvements.

In a more open and equal social milieu, a growing yet diverse Chinese American community emerged. The ethnic Chinese population almost doubled in the 1970s as a result of the Hart–Celler Immigration Act of 1965. This more liberal immigration law favored immigrants seeking family reunification and those with professional occupations. Many Chinese Americans with professional skills and capital rode the wave of opportunity in the post-civil rights era to achieve socioeconomic success, whereas the majority of new immigrants who came for family reunification struggled in urban poverty. Public perceptions of a successful minority group rising from historical discrimination overlooked the vast intragroup socioeconomic divisions. 

Re-emerging in the 1980s, the model minority myth portrayed Asian Americans as an example of self-sufficiency and individual achievement. In contrast to the structural interpretation, the cultural rhetoric that emphasized familial and cultural attributes dominated the public view. Many middle-class and wealthy Chinese families welcomed the illusory rhetoric because it fit in well with traditional values and beliefs that the parents had carefully maintained to nurture their children. This cultural discourse functioned as a powerful force informing Chinese Americans’ understanding of equality. 

The positive stereotypes soon backfired. Public perceptions of Asians as disproportionately successful in American society drove a growing amount of anti-Asian resentment. The once positive portrayal of Asian students was repositioned to depict them as monotonous and lacking character and leadership. In order to curtail rising Asian American enrollment and shore up declining white enrollment, UC Berkeley made several undisclosed admissions policy changes in the mid-1980s that disfavored Asian applicants. After discovery and investigation by a coalition of Asian American community organizations and further pressure from state government agencies, Berkeley Chancellor Ira Michael Heyman apologized twice and publicly acknowledged the university’s discriminatory policies. 

With the disputes barely settled, anti-affirmative action politicians moved in quickly to exploit the Berkeley situation by targeting race-based policies in general. Historian Dana Takagi argues that the political manipulation shifted the focus of discourse from anti-Asian racial discrimination to the faults of affirmative action. Elaine Chao, then U.S. Deputy Maritime Administrator, wrote a 1987 op-ed in Asian Week, connecting the racial quotas against Asian Americans in Berkeley’s admissions process to the university affirmative action programs for underrepresented minorities. Other conservative politicians and intellectuals joined the fray to reinforce the conflation. 

The heightened conflation of anti-Asian racial discrimination and race-based policy manifested in Ho v. San Francisco Unified School District (SFUSD). The SFUSD had implemented court-mandated racial caps in public schools to achieve school integration since the 1980s. In the 1990s, the racial caps’ negative impact on ethnic Chinese students, who faced the highest score cutoff among all racial groups to qualify for admission to a top alternative high school, became more pronounced. Several ethnic Chinese students filed a class-action suit against the school district alleging that the imposed racial caps constituted racial discrimination. The lawsuit found impassioned support from anti-affirmative action Chinese Americans who ignored the mandatory nature of public education and equated the racial caps with affirmative action. This resentment of race-based policies dovetailed with the conviction in the cultural rhetoric, forging a specious argument among some Chinese Americans in support of colorblind policies. 

This stance has resonated with many newly arrived Chinese immigrants. These well-off suburban dwellers, most of whom work as professionals, rushed to adopt a misguided position that suppresses race as an essential element in American social relations. Even the pandemic failed to shake their belief in the model minority myth and subdue their passion for protesting Prop. 16. Little wonder that the car rally organizers are part of a broader coalition that supports a recent suit against Harvard, whose political repercussions recall the Berkeley admissions controversy. 

The past never vanishes. But what the past really entails for our present depends on an accurate and nuanced interpretation of it. The revival of racist and anti-immigration narratives around the “Chinese virus” attests to the illusion of a colorblind society. Deeply rooted in the history of the United States, racism and xenophobia never go away. In a society where people still believe in racial hierarchies, even a “model minority” group runs the risk of being cast as a threat by those who see themselves as racially superior, whether in the form of university overrepresentation or disease carrying. Over the years, race-conscious policies have induced profound change in institutions, bringing about structural improvements. Nevertheless, until racial hierarchies are shattered, racial discrimination persists regardless of how a racial minority group is ranked along the racial hierarchy.

Roundup Top Ten for September 11, 2020

Our Long, Forgotten History of Election-Related Violence

by Jelani Cobb

"A weather forecast is not a prediction of the inevitable. We are not doomed to witness a catastrophic tempest this fall, but anyone who is paying attention knows that the winds have begun to pick up." 

 

Think The Trump Tapes Are Worse Than The Nixon Tapes? Think Again.

by Leonard Steinhorn

Recordings of the President reveal not only racial bigotry but a cynical indifference to the rule of law and a belief that any means were justified to prevail over political adversaries.  

 

 

This Republican Party Is Not Worth Saving

by Tom Nichols

"The hardening of the GOP into a toxic conglomeration of hucksters, quislings, racists, theocrats, and cultists is already happening. The party gladly accepted support from white supremacists and the Russian secret services, and now welcomes QAnon kooks into its caucus. Conservatives must learn that the only way out of “the wilderness” is first to vanquish those who led them there."

 

 

Neoliberal Hong Kong Is Our Future, Too

by Macabe Keliher

While orthodox economists like to point to Hong Kong as an ideal free market, the social consequences have been disastrous. Inequality is rising, wages are declining and working hours increasing, overall economic opportunity is dwindling, and housing is so unaffordable that office workers sleep in McDonald's. Is it any wonder that the streets are now burning?

 

 

The Supreme Court’s Starring Role In Democracy’s Demise

by Carol Anderson

The Supreme Court today repeats the shameful actions of the courts in the 1890s, which gave judicial cover to state laws explicitly designed to disenfranchise Black voters, by accepting bad faith arguments that the laws in question were race-neutral. 

 

 

Reform the Police? Guess Who Funds My State’s Officials

by Miriam Pawel

Translating protest into reform depends on breaking the influence law enforcement unions exert on state legislators, including through campaign contributions.

 

 

Trump’s Law-and-Order Campaign Relies on a Historic American Tradition of Racist and Anti-Immigrant Politics

by Austin Sarat

"Throughout this nation’s history, appeals to law and order have been as much about defending privilege as dealing with crime. They have been used in political campaigns to stigmatize racial, ethnic and religious groups and resist calls for social justice made by, and on behalf of, those groups."

 

 

In 2020, Voting Rights are on the Ballot

by Peniel Joseph

Black citizenship remains the best yardstick to measure the nation’s democratic health, and even before the coronavirus pandemic, the Black vote in large parts of the country remained imperiled.

 

 

Trump’s 2020 Playbook Is Coming Straight From Southern Enslavers

by Elizabeth R. Varon

In arguing that radical protesters endanger U.S. law and order, Trump is echoing the attacks leveled by Southern enslavers against abolitionists.

 

 

Covid-19 Has Exposed The Consequences Of Decades Of Bad Public Housing Policy

by Gillet Gardner Rosenblith

Poor and economically precarious Americans are at risk of eviction in the COVID-19 crisis because American policymakers have spent decades rejecting a public role in providing decent housing outside of the market system. 

 

Americans Have Feared Another Civil War Since the End of the Last One

 

 

 

Is the United States on the brink of a second civil war? The question has hovered around the margins of national political discussion for three or four years now, gaining new purchase with every crescendo of a hard-fought election, every fresh outbreak of disorder, every highly publicized evisceration of a long-cherished, now-eroded norm. 

 

Yet this year of unprecedented upheaval--a presidential impeachment trial, a deadly pandemic, a precipitous economic crisis, a succession of carbon-fueled catastrophes--has raised the possibility of a violent eruption of civic discord on a scale unseen in this country for more than a century and a half. The roots of the second civil war, should it come, lie deep in American history, as far back as the founding, if not before. But the end of the first one also offers answers as to why it failed to resolve the underlying conflicts and contradictions in American political life. Even within months of Appomattox, the traumatized inhabitants of a battered and broken land were asking themselves if the cycle of violence would ever end, and if the questions that led to the conflict might, sooner or later, start a new one.

 

Rumors and reports of another looming conflict began circulating immediately after Appomattox. Given that the country had just torn itself apart, it wasn't hard to imagine the worst happening once again. President Andrew Johnson, a lifelong Democrat who took over in April 1865 for the assassinated Lincoln, sympathized with the former slave owners and wished to welcome the former Confederate states back into the Union with no strings attached. To Benjamin Butler, the Union general who had first offered protection to wartime fugitive slaves behind army lines and thus played a crucial role in advancing military-backed emancipation, such a generous Reconstruction would effectively amount to a surrender to the South. "All is wrong," Butler wrote. "We are losing the just results of this four years struggle."

 

To Republicans in the North, those too-hasty steps toward Reconstruction would risk surrendering the whole purpose of the war. Would it be necessary once again to resort to arms? "The First Southern War may prove not the last," George Templeton Strong wrote in his diary in 1866. He feared that "there may well be another sectional war within three years." "We are to have another war," lamented William Brownlow, a longtime foe of Andrew Johnson in Tennessee politics. Johnson, in that next fight, would "take the place of Jeff Davis," Brownlow anticipated.

 

To Johnson and his allies, excluding the Southern states from the Union was as good as an invitation to another cataclysmic clash. Johnson railed against his opponents as "Radical Distructionists" and accused Republicans, by excluding the Southern delegations from Congress, of engaging in "secession in another form." Democratic papers warned that "Black Republicans plunged the country into one civil war" and would gladly do so again. Every day Congress met without newly elected Southern representatives--among whom were numerous veterans of the Confederate army and government--was "a triumph of treason," one newspaper argued.

 

In 1867, Republicans seized control of Reconstruction, forced passage of the Fourteenth Amendment extending citizenship to Black Americans, and imposed a military occupation on the defeated Southern states. Democrats were furious, and threatened violent resistance. In letters pouring into the White House, they encouraged Johnson to do whatever it took to force Congress to accept the Southern delegates. Should Johnson try to use the battle-hardened federal army to overcome congressional opposition, one correspondent assured him, thousands of “trained men, of the late Confederate army...will bear the banner of the Union, to sustain the constitution.” These were the same men, of course, who only a few years earlier had vowed to destroy that Union and overthrow the Constitution. One wounded Confederate soldier promised to fight on Johnson’s side in the next war once he had recovered from an injury sustained in the first one.

 

As in the recent fight over President Trump’s impeachment, the Republicans’ effort to remove Johnson from office in 1868--as Brenda Wineapple recently showed in her absorbing study, The Impeachers--was haunted by the possibility of constitutional breakdown and a return to catastrophic violence. Republican supporters around the country vowed to once again sacrifice “blood and treasure” if Johnson refused to leave office. One even professed himself eager to murder Democrats “as you would wolves.” An estimable New England senator feared the political battle in Washington would lead to “another fight...the final one.”

 

A politician in Massachusetts wrote to the state’s long-serving senator, Charles Sumner, suggesting Northerners gird themselves for that ultimate fight. “Well, We have been through One War,” he wrote, “But rather than to have treason and traitors triumph, we will fight Again, And Make Clean Work and Sure, if the Cause demands it.”

 

Johnson's supporters, meanwhile, offered to rally to his defense. A Washington paper allied with the president described impeachment as a "second rebellion," and the president himself publicly expressed his faith in the "good sense of the army" should push come to shove.

 

Ultimately, the country was spared a second war over Johnson's removal, but only because the Senate voted to acquit him by a single vote--a margin likely influenced by the widespread talk of violence should the trial end in his conviction. In that event, one sympathizer wrote to Johnson afterward, the result could only have been "civil war or despotism, or both."

 

The president's acquittal in 1868 telegraphed the decision the Republican Party would eventually make a decade later to withdraw support from military-backed Reconstruction in order to maintain its grip on national political power. The North had fought the Civil War, from the beginning, to preserve the Union, not to free the slaves or to guarantee equal rights for all Americans regardless of color. That difference still held after the war, when Northerners decided it wasn't worth risking another war to see that the Constitution, as amended after the fighting, was enforced in the South. From that moment, the palpable (and understandable) fear of provoking a second civil war prevented the victorious side in the first one from consolidating its gains and enjoying the fruits of victory. The North won the war, while the South won the peace, as the common formulation has it. Undermining the white-supremacist consensus has always threatened the myth of national unity, and it still does today.

The "Triple Nickles": Jim Crow Was an Elite Black Airborne Battalion's Toughest Foe

The 555th Parachute Infantry Battalion, Fort Dix, New Jersey, 1947. By year's end, the 555th would be inactivated and its members assigned to units of the 82nd Airborne Division, making it the first racially integrated division in the Army. Photo: U.S. Dept. of Agriculture

 

 

Since the killing of George Floyd, men and women from all walks of life have joined in solidarity around the country to protest the institutional racism pervasive in the United States. Since emancipation, Black people in this country have been pressured to be quiet, and when they have spoken out to bring attention to injustice, they've often been told to "shut up and dribble." Even when, as happened in the 1940s, a select group of African American soldiers managed to prove themselves as members of an elite military fraternity, their success did little to earn them equality in American society. Today, as protests against systemic oppression explode across the nation, examining the fate of past attempts to earn equality helps explain the modern frustration of African Americans.

Unsurprisingly, African Americans serving in World War II struggled against racism in Jim Crow America. The Double V campaign positioned Black soldiers as fighting fascism abroad and Black civilians as fighting racism at home. America regarded them as good enough to serve in the military, yet not good enough to enjoy full citizenship. Within the Army, paratroopers enjoyed a status higher than the average GI. So, when the call went out for African American volunteers to undertake parachute training, a new path toward defeating Jim Crow seemed at hand. But, unfortunately, racism and prejudice at all echelons of the World War II Army precluded even elite airborne infantrymen from receiving equal treatment during the war.

Constituted on November 25, 1944, the 555th Parachute Infantry Battalion represents a microcosm of the African American experience in the United States. Seventeen of the 24 original volunteers came from the 92nd Infantry Division, better known as the "Buffalo Soldiers." The new battalion's distinct nickname, the "Triple Nickles," came from the three buffalo nickels that formed its distinctive unit insignia, an homage to the 92nd. According to a December 1942 memorandum from the Advisory Committee on Negro Troop Policies, however, this battalion was born not of a desire to foster racial equality but rather "for purposes of enhancing the morale and esprit de corps of the negro people." Rather than serving to normalize the inclusion of African Americans in elite units of the Army, the Triple Nickles ultimately represented more of the same Jim Crow segregation.

From its inception, the unit and its men faced the same institutional racism that plagued the United States Army and the country. In the Jim Crow Army, Black soldiers were not allowed into the Post Exchange at Fort Benning. Meanwhile, German and Italian prisoners of war, many captured after killing Americans, enjoyed American cigarettes and sodas from that same establishment. Black soldiers like Walter Morris walked by thinking "there must be something wrong with us," to be denied the same comforts as America's enemies. Reports of bottlenecks in the transfer of Black troops, caused by white officers not passing along applications, permeate African American newspapers of the period, such as the Chicago Defender and the Baltimore Afro-American. In spite of every effort to hinder their development, however, the Army's only all-Black airborne unit impressed more than just its white trainers. The 22 officers and hundreds of men made an impact on nearly every officer who witnessed their training—including the Commanding General of the Army Air Forces, General Henry "Hap" Arnold. Exhibiting the high morale and esprit de corps expected of all paratroop units, they were keen to test themselves in combat while trying to remain "neither wistfully glorified nor overpublicized," according to the Baltimore Afro-American.

The first African American paratroopers were eager to prove that “all blood runs red” and that their race “was being underestimated,” according to Sgt. James E. Kornegay, who stated as much in a note to his hometown newspaper, the Philadelphia Tribune. Proving themselves quietly, however, was never enough. Thanks in part to personnel shortages created by white officers deliberately misplacing transfer documents, or otherwise preventing the battalion from reaching full strength, the unit never made it overseas. As the first Company First Sergeant in the 555th, Walter Morris recalled, “Since no Field Commander in Europe or the Far East wanted ‘colored troops’ mixing with their racist white troops, the Army was stuck with us.”

Ultimately, the Triple Nickles received an important, if unglamorous, mission—Operation FIREFLY. This task involved partnering with the United States Forest Service to jump into forests and fight wildfires throughout the Pacific Northwest. Starting in November 1944, the Japanese sent more than 9,000 balloon bombs toward North America, intending to start fires. A handful reached the United States. Five children and a woman were killed by a balloon that exploded on May 5, 1945, during their Sunday school outing near Bly, Oregon. That incident, combined with the impending fire season, prompted the Army to send the Triple Nickles to assist. The 555th arrived at Camp Pendleton, Oregon on May 12, 1945, responded to at least 36 fires and made more than 1,200 individual jumps up and down the West Coast.

While fighting fires on the homefront was a critical part of the broader effort to safeguard the country during the war, for a battalion of paratroopers imbued with the same sort of camaraderie, elitism, and esprit de corps as white parachute infantry units, this assignment was more than disappointing. "We felt it was a dodge to avoid using us in combat," Second Lieutenant Roger Walden asserted. Thus, the rank and status of Triple Nickles members did little to save them from prevailing racist attitudes, and sometimes even worked against individual members of the 555th. Former First Sergeant turned First Lieutenant Walter Morris, for instance, recalled an incident where a military police officer questioned his status at a North Carolina train station. His printed orders did not suffice, and he was arrested for impersonating a paratrooper. In Atlanta in April 1944, white paratroopers accosted Private B. F. Lane, Jr. because they refused to believe there could possibly be Black members of the airborne elite. Add this kind of harassment to the humiliation of being treated worse than enemy POWs, while training on a base named after an early advocate of secession who would never have recognized their humanity, and it is easy to see why, as Morris suggests, Black soldiers developed an inferiority complex. The opportunity to serve as paratroopers helped assuage feelings of inadequacy but did little to rectify extant racism in the ranks. "We didn't win any wars, but we did contribute," Morris stated in 2000. "What we proved was that the color of a man had nothing to do with his ability."

Despite Morris' modesty, the unit did enjoy some small victories, and found itself marching with the famous 82nd Airborne Division during the January 1946 victory parade in New York City. The inclusion of the Triple Nickles in the ceremony, and their success fighting fires out west, served as a step toward integration, as was soon demonstrated by a shift in barracks. The battalion remained attached to the division until it was reorganized as 3rd Battalion, 505th Parachute Infantry at Maj. Gen. James M. Gavin's request. Gavin, the Army's youngest division commander and a progressive thinker, personally ensured the movement of the 555th from a run-down portion of Fort Bragg into proper barracks in the 505th Regiment's area. At this point, Black and white soldiers moved into the same barracks together for the first time. These actions made the 82nd the Army's first integrated division in December 1947—a full six months before President Harry S. Truman signed Executive Order 9981 ordering integration in the Department of Defense. Integration, however, was far from full citizenship, something African Americans are clearly still fighting for today.

As the nation struggles to confront systemic racism in its ongoing quest to form a more perfect union, it is helpful to remember that becoming members of the elite airborne did little to create equality for African Americans in World War II and beyond. Black soldiers have consistently answered the call to serve a nation that considered them mere second-class citizens, but their quiet requests for equality, even after they had proven themselves beyond the typical service member, went unheeded. The lesson of the 555th Parachute Infantry Battalion for African Americans is a sadly familiar one: proving oneself is not enough; even membership in a select fraternity was not enough to earn the respect and equality that come with full citizenship. It is no wonder, then, that people of color and their allies are currently demanding, rather than quietly requesting, equality.

Fabrication and Fraud in the Lost Cause: Historian Adam Domby Interviewed

 

 

 

Adam Domby, an award-winning historian and specialist on the Civil War and Reconstruction, examines the role of lies and exaggeration in the Lost Cause narratives and their celebration of white supremacy in his timely and groundbreaking new book The False Cause: Fraud, Fabrication, and White Supremacy in Confederate Memory (University of Virginia Press, 2020). 

 

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Re-Markings, Documentary, NW Lawyer, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of human rights and conflict. He can be reached by email: robinlindley@gmail.com.

 

Members of the Minneapolis Police Department killed George Floyd, a 46-year-old African American man, on May 25, 2020. The shockingly brutal, televised asphyxiation of Mr. Floyd, lasting eight minutes and 46 seconds, sparked nationwide outrage and protests against police brutality and the many forms of systemic racism.

Mr. Floyd’s death also led to renewed efforts to remove Confederate monuments that celebrate slavery, treason, white supremacy, and racism. In several cities, public officials or protesters removed these memorials. The Southern Poverty Law Center reported in 2018 that more than 1,700 monuments to the Confederacy stood in public places. Many remain.

As the president shares racist talking points and vows to protect all memorials, including the Confederate tributes, many Americans are learning more about their history, particularly the cruelty of slavery, the South's treason in defense of that brutal enterprise, the postwar defeat of Reconstruction, and the Jim Crow era of rigid segregation and disenfranchisement of Black citizens.

The Confederate monuments were constructed as reminders to white citizens of their supposed racial superiority and as a means to intimidate and even terrorize Black people and keep them subservient. Most of these statues were erected in the first two decades of the twentieth century and later during the height of the Civil Rights Movement in the fifties and sixties. The monuments are points of discord: they dishonor the Black and white soldiers who died for the Union while demeaning and disregarding the humanity of Black people.

The monuments that sentimentalize Confederate heroes who lost the war convey the Lost Cause myth of the Confederacy, a narrative that romanticizes slavery as benevolent as it recalls Confederate soldiers as gallant and chivalrous, rebel leaders as saintly, the South solidly united against Northern aggression, and Reconstruction as corrupt. Above all, the Lost Cause myth is built on a belief in the superiority of the white race and the need to forcibly control and subjugate Black people. 


 

In his book, Professor Domby debunks the romantic Confederate myths by exposing the cruelty and barbarity of slavery; the ambivalence of many Confederate troops; the high rate of Confederate desertion; the extent of Union sympathizers in the South; the rise of Jim Crow; the brutal violence against Blacks; the monuments erected to intimidate Black citizens; the many lies about war service by white Southerners to claim veterans’ pensions; the myth of Black Confederates; and more. 

 

Professor Domby’s book reveals that much of our understanding of the Civil War remains influenced by falsehoods of the Lost Cause, lies perpetuated in popular movies such as the silent classic The Birth of a Nation and the lavish Gone with the Wind. As a historian, he sees an obligation to share the reality of history and put to rest longtime falsehoods that were used to justify white supremacy, Jim Crow segregation, and the disenfranchisement and subjugation of African Americans. 

 

Dr. Domby, an Assistant Professor at the College of Charleston, is an award-winning historian of the Civil War, Reconstruction, and the American South. In addition to Civil War memory, lies, and white supremacy, Professor Domby has written about prisoners of war, guerrilla warfare, Reconstruction, divided communities, and public history. He received his MA and PhD from the University of North Carolina at Chapel Hill after receiving his BA from Yale University.

Professor Domby generously discussed his work and his writing by telephone from Charleston, South Carolina.

 

Robin Lindley: Thanks for speaking with me Professor Domby about your work as a historian and your new book, The False Cause, a debunking of myths about the Confederacy and the Lost Cause narrative. 

I understand that you’re dealing with many media inquiries, and it must be somewhat surprising how newsworthy your book has become with the increasing focus on our history of racism since the brutal police murder of George Floyd in May. It seems many Americans are rethinking our fraught past as our president doubles down on his racist rhetoric.

Professor Adam Domby: Yes. I wrote the book knowing that the Lost Cause memory obviously still mattered. And I wanted the book out by the 2020 election in part because of the appeals to Confederate symbols.  

I didn't necessarily expect the president himself to be quite so blatant as he's been. I also didn't realize how many monuments were going to be suddenly removed. I knew the book would be somewhat timely, but I did not expect to have at least one reporter calling me a day, if not more, for about a month this summer. I have talked to reporters before, but now it's very busy, and it definitely caught me a little off guard.

I felt the book needed to be written because it relates to present issues. The historical narratives that these monuments support fundamentally undergird systemic racism. And the longer these falsely propagated narratives of history persist, the harder it will be to dismantle systemic racism. I see it as the start of a longer process—of first correcting the historical narratives so that we can actually address the ramifications of what has happened historically.

Robin Lindley: You mention in the epilogue of your book that historians have a duty to question false narratives and myth, and you do a masterful and carefully documented, point-by-point, debunking of the Lost Cause narrative. How did you got involved in this project? Did you grow up in the South? 

Professor Adam Domby: Yes. I was born and raised in Georgia. In fact, my childhood home was not far from where Sherman's headquarters was during the siege of Atlanta. And I always had an interest in the Civil War. I saw the movie Gettysburg as a kid, and I followed other stories of the war.

But I went to college as a math major. I thought I would get a degree in math, and that lasted all of one semester. I was fortunate that, on a whim, I took a class called "Wilderness and the North American Imagination" that was taught by Aaron Sachs, and I really liked this class. I picked another class that he was teaching, and that led me to eventually major in history. During college, I initially thought I'd be a colonial historian, but I fell in love with the Civil War again when I took a class with David Blight and started to see things that I had grown up learning in new ways.

I grew up in a region where the Confederacy was still often romanticized. So it was eye-opening for me to take a Civil War class and additional seminars on memory that presented a different view of the war.

I hadn’t decided yet to be a professional historian. I was going to be a park ranger. I worked as a park ranger for a while, and then left that, worked in politics for a bit, and then decided to attend graduate school. Even then, I didn't think I'd write about Confederate monuments. I planned to write on prisoners of war. But, in my first year of grad school, I stumbled upon a document while working on a term paper, and that document sort of launched me on this process. The document was a [1913] dedication speech by Julian Carr for a Confederate monument [at the University of North Carolina]. 

I didn't realize at the time how important that speech would be in shaping my own future. I thought this was an opportunity to teach people a little bit about Jim Crow. But I put it out there and activists took it and ran with it, really educating people about these monuments. I didn't really think about social activism, and I didn't fully appreciate yet the impact these monuments have on students and faculty of color who have to walk by them. That only came later. I saw this as a side project, maybe for some articles in the future. 

I had two articles that I wanted to write. One was all about white supremacy and memory and the other was about lies and memory. And then I looked at those projects and it eventually dawned on me that this was actually the same project. The lies were part of the monuments and the white supremacy aspect was tied to the monuments and the Confederate fraud. 

I arrived at the College of Charleston just after the Mother Emanuel terror attack here. And then the 2016 election of Donald Trump led me to put aside my dissertation project and focus full time on this project. This was a book I felt was important to have out there, one that would be useful to people who have to engage neo-Confederates by showing how this Lost Cause narrative was propagated and continues to be propagated. I think it has something both for the public and for other scholars.

Although it seems timely on some level, I did not realize how timely it would become until it came out. I was actually worried it would come out too late. I really was hoping to have it out before the primary elections, but it turned out to be perfect timing.

Robin Lindley: You begin your book with a description of the Silent Sam monument and that speech of Julian Carr, who you researched. From your book, it seems that Carr was a grandiose conman who misled people about his background as he celebrated the Confederacy. What did this Silent Sam statue represent to him?

Professor Adam Domby: Silent Sam was a Confederate monument put up by the University of North Carolina, Chapel Hill.  It came up at the same time as all those Confederate monuments at city halls and courthouses. In fact, this statue at Chapel Hill is across the street from a courthouse and Post Office, so it’s central in the town. It was put up in 1913, and other colleges have similar monuments such as the University of Mississippi. Increasingly they're being removed because they were put up at a time when the schools were segregated and there were no Black students at the time. Making Black students feel welcome was the opposite of what college leadership wanted to do at the time. These monuments were put up explicitly to celebrate white men and to teach white supremacy to the next generation. 

The state's future leaders and the next generation were being trained at UNC, according to Carr, and he wanted them to learn to be white supremacists. That's basically what he said. And this monument was meant to celebrate the success of overturning Reconstruction. And something we often forget about the Lost Cause is that it wasn’t just about how we remember the war. It's also about how we remember the antebellum era and how we remember the era after the war. 

You might say the Lost Cause narrative includes history all the way to the present. It’s an evolving memory of course, but the memory that they wanted and still want.  

Robin Lindley: How did Carr embrace the Lost Cause narrative? 

Professor Adam Domby: The Carr speech was unremarkable at the time, despite the fact that he was saying that Silent Sam was a monument to white supremacy. His speech was so unremarkable that the newspapers did not carry any note of it other than that he gave a speech and he had given so many speeches before. They ran the governor's speech and a few other speeches given that weekend, some of which also hinted at white supremacy, but the Carr speech was largely forgotten with a few exceptions because he'd given so many similar speeches. The only thing that stands out from this one was the statement he makes about his personal part. To Carr, this was a monument devoted to celebrating the overturning of gains that African Americans had made, and this was in the aftermath of the disenfranchisement of African-Americans. Carr wanted to celebrate that North Carolina was now a controlled by whites. He was helping create the solid South. It didn't exist in 1876. It was created by disenfranchisement largely and aided by false narrative of history. 

Robin Lindley: What would you like readers to know about the major aspects of the Lost Cause narrative?

Professor Adam Domby: The first is the denial of the cause of the war. Denying the central role of slavery is crucial for the Lost Cause myth because, if you fought a war for slavery, you were a loser. If you fought a war for states' rights and it's 1901 and state's rights still exist and are constantly being cited as a reason why the federal government shouldn't intervene in disenfranchisement, then you're a victor. In fact, you can be celebrated by continuing the fight for segregation, and you were a hero for segregation as well. This was overtly making Confederate soldiers into white heroes of white supremacy while also stopping them from being the losers. 

Lost Cause narratives also remember slavery as benevolent, while simultaneously remembering Reconstruction as a time of corrupt African Americans and carpetbaggers. The benevolent-slavery aspect brings with it this idea that there were happy slaves. But slaves were actually brutally terrorized into labor. That is what slavery does. But in this false memory there was a time when the races got along and everyone was happy before the war, and only the intervention of Northerners during Reconstruction caused poor race relations in the South. By this narrative, the disenfranchisement of African Americans can be seen not as the cause of strained race relations but as the cure for them. To be clear, Reconstruction was also not a time of exceptional corruption.

Similarly, Confederate soldiers are remembered as the most heroic and devoted soldiers of all time who were only defeated due to overwhelming manpower. The problem with this, of course, is the high levels of desertion in North Carolina and other states. 

The conduct of the war is another aspect of the Lost Cause that is often overlooked. How do we remember the conduct of the war when we ignore racial massacres? Race and racism shaped every aspect of the war: how the war was fought and how that shaped Confederate strategy. So, with the Lost Cause, these Confederate soldiers are rewritten as noble men, despite what they fought for, how they fought, and the fact that they participated in racial massacres.

And, returning to this narrative of history, the lie is not just used to remember Confederate soldiers well, but to justify Jim Crow as a system, to defend Jim Crow, and to support Jim Crow. To me, that's the central key to this whole thing. All of these lies are holding up the biggest lie of all: that the races are different and that one race is better than the other. That's the big lie at the top.

Robin Lindley: Thanks for that explanation Professor Domby. When I was in grade school and high school in late fifties and early sixties it seems that these myths were the view reflected in our history textbooks. I grew up in Spokane Washington, and our texts shared a romanticized view of the South and slavery, and stressed the failures of Reconstruction.

Professor Adam Domby: It's a narrative that became the norm and was pushed very much. In the South, this Lost Cause version was appreciated as a way of defending the South from Northern intervention. That narrative of history allows you to also say that race relations in the South are a Southern issue. That's the message of the Lost Cause. Whereas, if you have a more accurate version and you're living in Washington, you might think, maybe we should intervene. Instead, this was all about convincing and exporting the views of the South to the North and West. 

Robin Lindley: And also, I think, so publishers could sell the same books in the North and South. Your book is very timely. The president has been criticized by many for seeming to care more about these monuments to dead racists and traitors than he cares about living Americans. 

Professor Adam Domby: Especially the lives of Black Americans. It's worth pointing out whose lives he cares least about. It’s true that many Americans are dying right now due to Covid-19, but communities of color are impacted at much higher rates of infection and death. 

He also ignores that police violence is similarly disproportionately borne by African Americans. And Trump is not a historical rarity. We see the same thing in Julian Carr’s language back in the early 20th century when Carr said, “lynching is bad but. . .”  There should be no “but” at the end of the statement, because lynching is bad. We should be able to stop there, but he didn’t. He said lynching is bad but, until all white women feel safe themselves, we can't even address it. For Julian Carr, the lives of Black men were less important than white women feeling comfortable. This seems reminiscent of Trump’s appeals to suburbia.  

The statement Black Lives Matter is so important right now because historically Black lives had been viewed by those in power as less important than things like monuments and every white woman feeling entirely safe at all times. 

We see something similar now with Trump as he focuses on monuments while ignoring their underlying meaning, a form of signaling, whether intended or not. But we're seeing that signaling again and again, and the roots of that racism are not new.

Robin Lindley: Thanks for pointing out the meaning of Black Lives Matter. After all this time, some people are still confused. You mention in your book that, when many of us read about Southerners, we immediately think only of white men while ignoring most of the other people there, particularly the non-white people. I do that at times.

Professor Adam Domby: I'm guilty of that myself. I think we all are. I think you'd be hard pressed to find a history teacher in this country, even a historian of the South, who doesn't occasionally mess that up when they're in class or writing. When I finished the book, one of the last things I did was go through and check every time I used the word “Southerner” and I caught myself multiple times in that final rewrite where I said Southerner when I meant white Southerner. It's short hand that we grew up with. That's what happens when you live in a society that is built around systemic racism. You don't realize you're inviting it, so it requires conscious effort to avoid these mistakes. 

Robin Lindley: What do you think should happen with the symbols of the Lost Cause such as the Confederate statues and monuments? 

Professor Adam Domby: The first thing I would say is that my own views are constantly evolving. One of the things that I've learned by talking to colleagues and students who are Black is that I can't fully appreciate the harm done by these symbols when they're seen by a person of color or a Black person. The closest I can imagine, and this is not the same, is my visceral reaction when I see a swastika, and I'm of Jewish descent. Another historian of the South phrased it like this: So, when I see a Confederate monument, it doesn't insult my personhood in what it says, but it insults my principles, which is a very different experience than if I were Black.

It's important for us to realize the harm that these monuments do, and it's very hard for a white person to appreciate it. A white person must listen and hear what harm is done by these monuments and what message these monuments represent, whether or not they believe that's what the monuments were meant for. When symbols send messages, there's always a subtext. The same goes for the Confederate flag. So I think that's the first thing to remember.

The second thing to remember is that, when it comes to monuments, I don’t think I’m the most important person to ask. One of the things that I see is that these monuments were put up in an incredibly undemocratic fashion. I think that there isn't anything wrong with allowing communities to decide what to do with them. If they decide they want to put them in a museum, that’s on them. If they decide they want to try contextualization, that's on them. If they decide to pull the thing down and leave it broken in the park, that might be the best solution for them. 

In some ways, I feel like my job is to give the context of the monuments that allows communities to decide what to do about it. I try usually to avoid giving an opinion about what to do with monuments, except when the monuments are in communities that I've lived in. That being said, I think that these monuments are tools of racial oppression at times, as when they stand in front of a courthouse. And so those are especially problematic, and I don't think that they teach history.

Robin Lindley: And what should happen with the Confederate flag?

Professor Adam Domby: I think the Confederate flag has no place being put in front of schools or public buildings. It is a symbol that was from its creation designed to intimidate Black people and to celebrate white supremacy. From its earliest days, the flag was tied to white supremacy and that meaning has only grown over time. [Former Republican Governor] Nikki Haley liked to say that the symbol was appropriated. It wasn't appropriated by white supremacists. It was already owned by them. It just became even more so. 

It's important for us to remember that fixing the narrative is the starting point for undoing systemic racism. Taking down the monuments is not the endpoint, but it is also not just a symbolic act, because it undermines the Lost Cause, which upholds white supremacy. It's more than a symbolic act because having a democratic landscape that is welcoming to all shapes how the next generation learns who is worthy of admiration, what values we should emulate, and whom we want to emulate. But it doesn't solve the problems society faces.

Some political figures are perhaps hoping that, if we just remove the monument, the problem goes away. But the problem is still there. The problem is systemic racism. Getting rid of the monuments is a first step in being able to understand the sin of racism. And so long as people believe the Lost Cause, they will be able to claim with a straight face that systemic racism doesn't exist.

You literally have Republican politicians right now saying that systemic racism is not a thing and that these monuments teach history and have nothing to do with slavery. So long as that message persists, I don't see this [monument removal] as a solely symbolic action, because the narrative they are upholding is both false and a justification of white supremacy.

I would love to see a democratic process and see communities discuss and decide as a group where everyone has a say, but the reality of the situation is that Southern legislatures have made that nearly impossible. And so, perversely, what you have going on is that heritage acts are leading to the destruction of monuments because there is no democratic process. What do people do instead when all legal means are exhausted? There's only one option: they resort to extra-legal means. 

And the process has the potential also to be educational. I saw in Charleston over the last five years that the city has learned about the history of John C. Calhoun through debates, and that's the history that wasn't there when he was just a monument. In the process of removal, people were able to learn something. That being said, that it took five years is both upsetting and surprising. It should have been quicker perhaps but I am also a little surprised it didn’t take longer. 

I would love to see our landscape be one that is welcoming to all. That is more important than using monuments to teach history. I thought for a long time, that Silent Sam and other statues had the potential to be a teaching tool. I could use them to teach about Jim Crow and white supremacy, and racial violence during Reconstruction. And I thought that for longer than I should have. 

But I knew there was no going back when avowed white supremacists started showing up on the UNC campus to “protect” the monument. That's when I had the realization that this monument was making students of color and myself feel unsafe. And I don't need monuments to teach Jim Crow. I don't even need a picture of them. I can talk about it. But I can't teach about Jim Crow if my students and I can't get to the classroom safely. And to me that moment meant long term there was no solution that kept the monument on campus.

The other key thing that led me to shift my views was talking to students and faculty of color and listening to them, hearing them say that they would take a different route when they're going to Franklin Street [to avoid Silent Sam]. When they're walking through campus, they avoid that part of campus. Why are they doing that? The monument was harmful because students need to feel welcome on their campus. A fundamental aspect of being open to learning is to feel safe. It’s very hard to learn when you don't feel safe in that environment. 

Robin Lindley: It’s impressive that you consulted with Black students and Black faculty members on the Silent Sam statue at UNC. Now, with the current climate and the president's comments about race, people are learning more about history. 

Professor Adam Domby: I am not sure it is impressive. It seems UNC should have been doing that all along and they still aren’t. I don’t know about learning history, but with the president’s overt appeals to racism, it's become increasingly hard to deny racism as a problem in our society. 

There were plenty of people in the Obama era who wanted to say that racism was solved. We have a Black president. You will not find people on the left saying that anymore, but you will find people on the right saying that while pushing white supremacist talking points at the exact same time. Some are very clearly lying and are very clearly dog whistling at times. I think there's no question that, when the Trump administration takes a lot of these actions, it's a purposeful dog whistle that is becoming a foghorn.

Trump went to Mount Rushmore [on July 4, 2020] and announced his plan for a garden of heroes that does not include a single Native American [at this sacred Native American site]. And, if you look at the language of that executive order, it explicitly says it includes American citizens at the time of the action. If you look at who was considered an American citizen in the 19th century, Native Americans aren't included. Legally speaking, you cannot include Sacajawea, who would be an easy choice. She's noncontroversial to whites because she helped white people; she's nonthreatening.

The idea that America is a white nation is embodied in that historical narrative that Trump's pushing. You'll also notice which African Americans are included. It's always whoever is perceived by the right as safe, at least in memory if not in reality. You have MLK, but you don't have Malcolm X. You have Frederick Douglass, but you don't have W.E.B. Du Bois. You have Booker T. Washington instead. You have Frederick Douglass, but you don't have Nat Turner. So, you have safe individuals. A lot of conservatives love to quote MLK and present themselves as being racially progressive.

Robin Lindley: And they ignore the last three years of Dr. King’s life when he talked about militarism, materialism, racism and economic injustice. 

Professor Adam Domby: They also forget what happened to him. He said all these great things, and then he got shot and killed.

When we look at these larger questions of monuments and memory, again and again what we're seeing is that white supremacy is alive and well. And the narratives in history are being used to either signal or overtly state who belongs and who doesn't, who's included and who’s not. When Trump says we’re trying to celebrate our heritage, he's not talking about the heritage of Native Americans or the heritage of Black Americans. He's saying that heritage is about white Americans because it's a concerted, exclusionary perspective. I think that his speechwriter, who presumably was Stephen Miller, did it on purpose. Even though he knew exactly what he was dealing with, he chose to not have a single Native American on that list. The surprise to me is Phil Sheridan and George Custer weren’t on there. 

Robin Lindley: Yes, Custer didn’t make the final cut. Rev. William Barber recently mentioned that Trump focuses on statues but ignores issues of health care, education, economic inequality, and the cost of racism. 

Professor Adam Domby: It's a really important point that Barber made and it’s worth reiterating time and time again: no monument is worth more than a single life. I don't care what the monument is to. I value the life of a human that's living far more than I value any monument to someone who's dead. The idea that you're going to shoot someone for tagging a monument screams that the life of a dead individual from the past is worth more than a current life is very telling. 

Robin Lindley: You stress the extreme violence against Blacks with the massacres of Black Union prisoners of war by Confederate troops during the Civil War, and the postwar lynching of Blacks by the KKK and violence of other vigilantes. How do these crimes and violence against Black Americans tie in with the Lost Cause narrative? 

Professor Adam Domby: We see what the Lost Cause narratives tell us and who we're supposed to celebrate. Why the police violence toward Black men and women is so much more than toward white men and women, has a long history. 

I like to use the example of Nathan Bedford Forrest, who grew up in a slave society and worked as a slave trader. He made a fortune off separating families and, to him, that was perfectly acceptable. Profit mattered more than keeping families together. Let's remember that he sold children away from their families because his bottom line was more important than their lives. When he saw an enslaved person who was out of line in his eyes, his immediate reaction was violence.

And it's no surprise that, during the Civil War, when Forrest sees armed Black soldiers, his immediate reaction is to consider this a slave revolt. And that was the Confederacy's reaction. Jefferson Davis's orders said that Black troops were to be treated as participants in a slave revolt. So Forrest was directly involved in murdering Black prisoners of war because, for him, that was the appropriate response.

And then after the war, when you see African Americans asserting independence and not doing what Forrest wants, he joins in the Klan violence and passes that on to his grandson, Nathan Bedford Forrest II, who became a head of the Sons of Confederate Veterans and was a Klan leader as well. What he learned growing up in that family was that the appropriate response to African Americans asserting independence or being out of line was violence. Values get passed on to the next generation, not just by people saying you should be racist, but by observing what your father does and what your grandfather does and remembering how he's celebrated after the fact. And Forrest is still celebrated, so it's no surprise. Then, when we move forward to the 1960s, you have Bull Connor in Alabama thinking that violence is the appropriate response to Black children asserting that they had rights. We saw him use dogs and fire hoses and billy clubs. That is the historical legacy of a learned behavior: that a quick resort to violence is the appropriate way to maintain order. Connor understood violence as the way to maintain the status quo.

And so, it's no surprise today that police disproportionately shoot African Americans, because they grew up in a society that came out of the society Bull Connor came out of, which came out of the society that Nathan Bedford Forrest II came out of, and that Nathan Bedford Forrest came out of. And so, this is a long, drawn-out process. You talk about fighting against white supremacy. This doesn't end overnight. This is a process that will take generations. And I'd love for us to be able to declare racism dead tomorrow, but I don't foresee it.

Robin Lindley: Thanks for those powerful words Professor Domby. Have you read historian Professor Heather Cox Richardson’s new book, How the South Won the Civil War? 

Professor Adam Domby: It's a brilliant book There’s an old saying that the North won the war but the South won the peace. That is problematic. We must be clear. When we say the South won, what do we mean? We mean the white South, because Southerners also include Black legislators and governors and other as well. If you're talking about South Carolina, for instance, the vast majority of South Carolina was supporting the Union because the vast majority of South Carolinians were enslaved. 

Her book does a wonderful job explaining how we got to where we are today. I think it pairs quite nicely with my own book that looks at the underlying historical narratives of an ideology she studied. 

And our books bring the history to the present, unlike some earlier histories. Bringing the past into the present is actually important right now, in this moment when the Lost Cause persists and white supremacy has seen a resurgence. And the tie to the president is something that historians should address, in my opinion. Some may say that is too presentist or too political. But you can't ignore the fact that all of our history writing has an agenda. We all choose to write on topics because we think they matter. Being more upfront about it is actually the best solution.

I appreciate that many recent books tie directly into the present situation. I think there's a growing sense among historians that the act of writing history is political and that, in the teaching of history, the narrative we choose is a political decision. And to pretend it's not is itself a political decision.

Robin Lindley: Thank you Professor Domby for your thoughtful and illuminating comments. And congratulations on your groundbreaking new book The False Cause on the myths of the Confederacy.

The Headless Horseman: William Barr and the Attorney General in History (Part 2)

 

 

To read Part 1 of this article, go here

 

None of the founders of the country could have imagined that, if the Attorney General is indeed the creature of the President, he would be ill-equipped to investigate, much less prosecute, the President or his associates for wrongdoing, whether before or during his presidency. It obviously didn't occur to Hamilton, Madison, or Jay that we might one day have a rogue President like Donald Trump, with a life history of staying one step ahead of the law and little or no regard for presidential or constitutional norms.

 

Barr enjoys the dubious distinction of being the most controversial Attorney General in history. He testified in his confirmation hearings that he would make law enforcement decisions based on the facts and the law, "not on politics." But he quickly broke his promise and emerged as a political apparatchik, leaving a trail of decisions favorable to Trump and his political cronies that appear to have ignored both the facts and the law, and that raise serious questions as to whether he has any independence from the President at all. Indeed, Barr not only embraced, but bedded down with, the controversial doctrine of the unitary executive. In a devastating Washington Post op-ed, Obama Attorney General Eric Holder found Barr "unfit" to hold the office because he failed to distance himself from the President:

 

Barr made the outlandish suggestion that Congress cannot entrust anyone but the President himself to execute the law. In Barr's view, sharing executive power with anyone "beyond the control of the President" [emphasis Holder's]… presumably including a semi-independent Cabinet member [the Attorney General], "contravenes the Framers' clear intent to vest that power in a single person." This is a stunning declaration … revealing of Barr's own intent: to serve not at a careful remove from politics, as his office demands, but as an instrument of politics — under the direct "control" of President Trump.

 

Barr’s interpretation of the office is totally at odds with that expressed by Cushing or, for that matter, any of his illustrious predecessors. 

 

Barr had briefly served as George H.W. Bush’s Attorney General from 1991 to 1993. The only blemish on his reputation was the tragic Ruby Ridge massacre, in which a woman, a 14-year-old boy and a dog were killed by federal snipers while Barr was at the helm of the Justice Department. Barr successfully distanced himself from the tragedy, and the incident did not figure in his confirmation hearings. 

 

When Barr was appointed by Donald Trump, people in the legal community regarded him as a reasonable guy, a moderate Republican, a lawyer who understood his constitutional duties. In Barr, Trump found his Roy Cohn, a cunning advocate and protector of his political flank. Barr got his job because, months before his appointment, he wrote a letter to Deputy Attorney General Rod Rosenstein contending that the Mueller probe could not “provide a legitimate basis for interrogating the President,” because “the President alone constitutes the Executive branch,” has “all-encompassing” authority over federal law enforcement and cannot commit obstruction of justice even by firing an official, such as James Comey or Preet Bharara, directing an investigation into presidential misconduct.

 

Since assuming office, Barr has gone so far as to state that the Russia probe was “one of the greatest travesties in American history,” manufactured “without any basis…to sabotage the presidency.” The Justice Department inspector general concluded otherwise, finding that the investigation was justified and conducted without bias. Nevertheless, Barr deputized Connecticut federal prosecutor John Durham to look into the matter. This fit neatly with Trump’s debunked “Obamagate” scenario, in which Obama and his Vice President Joe Biden supposedly ginned up the Russia probe to discredit his presidency. Durham’s findings may deliver the fantasy of the hard right, an indictment of James Comey, or a report strategically timed for October, amid the run-up to the November election. 

 

Barr told the Senate Judiciary Committee that “nothing could be more destructive of our system of government …than any toleration of political interference with the enforcement of the law.” He lied. On June 24, two career prosecutors told the House Judiciary Committee that Barr had done precisely that. John Elias, a 14-year veteran of the Department of Justice whose motives Republican Congressmen were quick to impugn, testified that Barr had ordered the antitrust division to investigate mergers in the legalized marijuana industry simply because Barr “did not like the nature of their underlying business.” “Personal dislike of the industry is not a valid basis upon which to ground an antitrust investigation,” Elias testified. 

 

Aaron Zelinsky, lead prosecutor in the case of Roger Stone, the longtime “dirty tricks” political fixer for Donald Trump, declared that political pressure accounted for Barr’s Justice Department’s cutting a break for Stone. A jury had convicted Stone at trial on seven felony counts, including witness tampering and lying to Congress. Line prosecutors had recommended a sentence of seven to nine years in prison in accordance with sentencing guidelines. Barr countermanded the recommendation in accordance with Trump’s wishes.

 

The government filed a new sentencing memorandum, without a prosecutor’s signature, seeking a sentence that was a downward departure from the guidelines. The court sentenced Stone to three years in prison, a sentence which Trump commuted just as Stone was about to enter prison. Zelinsky and the three other career prosecutors on Mueller’s team, who represented the government at trial, withdrew from the case in protest or else resigned from the Justice Department. 

 

At least in Stone’s case, Barr’s Justice Department sought some reduced jail time. Witness tampering is still a serious offense. But in the case of Michael Flynn, another Trump crony, Barr moved for leave of court to scupper Flynn’s sentencing, and dismiss the case altogether after Flynn had twice pleaded guilty to lying to the FBI. Other career prosecutors resigned over the move. In an extraordinary brief filed in court, Barr contended that even assuming, for sake of the argument, that he acted out of improper political motive or “gross abuse” in moving to dismiss the case, his action was  “an unreviewable exercise of prosecutorial discretion.” A panel of the D.C. Circuit upheld Barr’s position and dismissed the case by a 2-1 vote. Further litigation on this issue is ongoing. 

 

Jonathan Kravis, one of the federal prosecutors who resigned from the Justice Department after being overruled in the Stone case, wrote an op-ed in the Washington Post in which he argued that in both Stone’s and Flynn’s cases “the department undercut the work of career employees to protect an ally of the president, an abdication of the commitment to equal justice under the law.” “Prosecutors must make decisions based on facts and law, not on the defendant’s political connections,” Kravis wrote. “When the department takes steps that it would never take in any other case to protect an ally of the president, it betrays this principle.” Zelinsky told Congress that Barr’s actions were “because of politics,” “virtually unprecedented” and “unheard of.” “In the United States of America, we … don’t cut [people] … a break because of politics,” he testified.

 

No Attorney General has gone so far as Barr in using the office to twist the law to protect the President politically. He perverted the findings of the Mueller Report into an exoneration of the President. He swept under the rug whistleblower allegations that Trump had sought to hold hostage millions in appropriated military aid to Ukraine in exchange for the “favor” of Ukraine’s announcing an investigation of Democrats for supposed wrongdoing. The Democrats included his political rival Joe Biden. The case bristled with evidence of attempted bribery and extortion, but Barr, closing his eyes to the obvious, saw nothing criminal in it. 

 

Some lawyers have the courage of their retainers. Barr has proven to be Trump’s handmaiden. He intervened on behalf of the United States in the subpoena cases in which congressional committees and New York District Attorney Cy Vance, on behalf of a grand jury, sought Trump’s tax returns and other financial records covering years and conduct before he even took office. He lost before every court that considered the cases, including the Supreme Court.

 

Wearing his secret police cap, he directed federal officers to clear away Washington demonstrators, who were exercising their constitutional right of assembly, with an unprovoked pepper-spray attack so that Trump could pose for a political photo op in front of a Lafayette Square church. And when Twitter tried to rein in Trump’s misleading and violence-inciting tweets, Barr backed the president’s vengeful executive order seeking a broad overhaul of the legal shield protecting social media companies.

 

He commenced suit against former national security adviser John Bolton to enjoin publication of Bolton’s tell-all book entitled The Room Where It Happened, making blunderbuss claims that publication would breach national security. The suit was an obvious tactical ploy to delay the book until after the 2020 election. The suit made no sense for Trump, and any lawyer worth his salt would have refused to file it. The Room Where It Happened immediately soared to the top of Amazon’s best seller list for many of the same reasons that  steamy movies used to attract crowds if they were “banned in Boston.” Sensationalism, like sex, sells. In any event, bootleg copies of the book leaked out to the media, rendering any injunction moot. As the judge said, “the train has left the station.”

 

Donald Ayer, a deputy attorney general in the first Bush administration, called Barr “the greatest threat in my lifetime to our rule of law and to public trust in it.” Quite an indictment coming from a former colleague.

 

More will undoubtedly unfold about the complicated character of William Barr beyond the damning and by now well-known evidence of his conflicting loyalties. More than 2,000 former Justice Department employees, including federal prosecutors, have called on Barr to resign, and on Congress to censure him, for his “repeated assaults on the rule of law.”

 

Engraved on the west pediment above the entrance to the Supreme Court building in Washington is the aspirational legend EQUAL JUSTICE UNDER LAW. But the promise of “equal justice” turns decidedly on an adversarial system with an independent Attorney General unblemished by political considerations.

 

As the story goes, Benjamin Franklin was walking out of Independence Hall after the Constitutional Convention in 1787, when someone shouted out, “Doctor, what have we got? A republic or a monarchy?” To which Franklin supposedly responded ominously: “A republic, if you can keep it.”

"Elvis In The Box": The National Enquirer Issue that Made Today's Celebrity Culture

 

 

 

Admit it, fellow boomers, you looked.  How could you not?

You were on the slow-moving checkout line at the grocery store, stalled by a complicated series of price checks involving the customer in front of you.  Your kids were transfixed by the candy rack.  “Just one apiece,” you finally said, surrendering to their whining while maintaining the pretense of laying down a tough line.

Besides, it was the eye candy, not the Gummy Bears, that had your attention.  There it was, right next to the cash register: the front page of the National Enquirer.  

“Ed McMahon Flies into Rage over Fiancee’s Secret Past of Men, Men, Men!,” it shouted.

Or: “Vanna and Kathie Lee Square Off in Vicious Feud over Who’s Most Beautiful.”  Or even: “New White House Cat Faces Nervous Breakdown—the Untold Story.”

You—we— looked but probably didn’t buy.  

Unless, that is, the headline blared: “Elvis: The Untold Story,” as it did on September 6, 1977, 43 years ago this week.  Even more compelling, the cover photo was of Elvis’s corpse in an open coffin, an astonishing shot known ever after as “Elvis in the Box.”

Don’t feel bad for buying that one.  I did.  More than six-and-a-half million of our fellow Americans did too.

The National Enquirer has fallen on hard times in recent years, the victim of its own success.  It’s been overshadowed by everything it spawned: competing supermarket tabloids, People magazine, E!, TMZ, and a mainstream media that has become ever more besotted by celebrity and scandal.  

The Enquirer’s most recent chief executive, David J. Pecker, best known for buying and burying former Playboy model Karen McDougal’s account of her affair with Donald Trump as a favor to the candidate in 2016, was just fired.  Weekly circulation has dropped to about 265,000, down more than 95% from its peak. Just last week the paper’s long-expected sale to James Cohen, the CEO of airport chain Hudson News, fell through.

But that’s now; this was then.  The story of the Elvis in the Box cover might well be subtitled: The Week the National Enquirer Came to Memphis.

By 1977 the Enquirer had outgrown its “Kills Pal & Eats Pieces of His Flesh” and “I Cut out Her Heart and Stomped on It” days.  Sure, it still featured aliens, quack diets, and mystics, but what founder-editor Generoso Pope Jr. discovered was that celebrity stories drove circulation, largely by enabling the paper to move from old-style street corner urban newsstands to new-style suburban check-out lines.

“We love to feel part of this lifestyle—the privilege, the money, the mansions,” wrote former editor Steve Coz in “The National Enquirer: Thirty Years of Unforgettable Images.”  

“At the same time we want to know that there is a price, that after the glory comes the fall, the drugs, the scandals, the divorce, the murder.”  Then after a while “we feel guilty for wanting to see the famous fall, and when they do, we most often want to see them rise back up to new heights of glamour and fame.”

Pope paid his writers, many of whom he imported from British or Australian tabloids, lavishly but on one condition, notes New York Times Magazine staff writer Jonathan Mahler in his conclusion to the same volume:  they must be willing to do anything it took—anything—to “deliver high-octane scoops. If that meant digging through Henry Kissinger’s trash” or “dressing up as a llama to infiltrate a herd near the site of Michael J. Fox’s wedding, then so be it.”

By age 42, Elvis’s star was waning.  Recent Elvis covers hadn’t done much for Enquirer sales.  “He weighed a ton and a half and he looked like shit,” said articles editor Tom Kuncl.  But Pope sensed that the public, having moved two-thirds of the way through its cycle of first idolizing, then despising, and then resurrecting stars, was primed to explode in adoration when Elvis unexpectedly died.

As soon as Elvis’s death was announced in August 1977, Pope chartered a Lear jet to relay teams of reporters from the Enquirer’s Lantana, Florida, headquarters to Memphis, where they rented an entire floor of a Holiday Inn, had dozens of secure phone lines installed, and hired scores of local writers, photographers, and private detectives to help pursue the story.  

Their pockets stuffed with tens of thousands of dollars in cash, the Enquirer team grabbed everyone it could find who had any connection at all to Elvis or his death and was willing to sell their stories.  Among those they rounded up were current girlfriend Ginger Alden, stepmother Dee Presley, Uncle Vernon, even the paramedics who tended Elvis on the ambulance ride from Graceland to Baptist Hospital.  Their exclusive stories bought and paid for, they were whisked off to Lantana to keep them from talking to anyone else.  

When Memphis newspapers and television stations sniffed disapprovingly in print and on the air that the Enquirer had stooped to paying sources, Memphians with their own Elvis stories headed straight for the Holiday Inn. 

Words—like Ginger Alden’s editorially pumped-up account in “Girl Elvis Was Going to Marry Tells Her Heartbreaking Story”—were one thing.  

Pictures were another. 

“Our photo desk had one major imperative,” writes another former editor, Iain Calder, in The Untold Story: My 20 Years Running the National Enquirer.

“Get a photo of Elvis in his coffin.”

But how? Elvis’s “Memphis Mafia” of friends, retainers and relations were eagle-eyed guardians of the coffin as thousands of fans filed through Graceland’s foyer to pay their respects.  

“I bought every Minox [spy] camera in a three-state area,” says Kuncl, but “people just chickened out” when they caught sight of “Red and Sonny West standing there ready to kick the shit out of anyone who tried to take a picture.”

Enquirer photographer Jimmy Rutherford finally found a distant cousin of Elvis’s who was willing to sneak in after dark.  He got off four quick shots, and the film was rushed back to Lantana to be developed.

As recounted by Calder, here’s what emerged in the darkroom.

“Frame one: The cousin’s blurry face. He had pointed the camera at himself.

“Frame two: A picture of the chandelier hanging above the coffin.

“Frame three: Bingo. Elvis, full face, in the coffin.

“Frame four: Who cared after frame three, but it was another good picture, taken from the side of the casket, showing Elvis in profile.”

The demand for the Enquirer’s Elvis in the Box issue was so great that Pope wanted to print 8 million copies instead of the record 6.7 million the Enquirer actually sold, but he couldn’t find printers with enough paper to pull it off.  Issues became collector’s items—you can still find them all over eBay.

And so began “Elvis’s elevation”—more than four decades long this month (yes, Elvis would be 85 if he’d lived) and still going strong.  He went “from a fat, fading celebrity to a beloved nationally, and eventually worldwide, fetish,” notes journalism professor Jack Vitek in The Godfather of Tabloid: Generoso Pope Jr. and the National Enquirer.

Vitek, Calder, Coz, and Mahler have all written marvelous nonfiction accounts of Elvis in the Box, but my favorite version is comic crime novelist Donald E. Westlake’s Trust Me on This. 

In Westlake’s telling the death of the lightly fictionalized Elvis (a 38-year-old country singer named Johnny Crawford) spurs the Weekly Galaxy (an even more lightly fictionalized Enquirer) into frantic action.  

After a cousin from the dead singer’s “teeming and scrofulous family” delivers the photo, Sara, the lead reporter, asks editor Jack Ingersoll why it’s such a big deal.

“’Because America wants it . . .,’ Jack told her. ‘Never mind pictures of the guy at the White House, pictures of him dancing, laughing, eating pizza. What America wants is the dead body, on its back, hands folded over shriveled balls, eyes sewed shut, eyelids with that special caved-in look, puffy silk casket lining all around.’”

It’s like “something primitive, tribal,” Sara marvels.  “It’s like cavemen.”

“’Sara,’ Jack said mildly. ’Who do you think our readership is?  The senior class at MIT?’”

No, just me and maybe you and everyone else on the check-out line at the grocery store 43 years ago this week.

Trump Can Use MS-13 as a Prop Because the US Won't Acknowledge a Role in Creating It

 

 

 

As I listen to President Trump speechify about how he is launching “an all-out-campaign to destroy MS-13,” the Salvadoran gang whose members (maras) I’ve known for over 30 years, the first thing I think is “it must be election season.” The second thing I think about is history, specifically a historical figure whose contributions to creating the “gang threat” have had, and are still having, catastrophic consequences: Attorney General William Barr. More than anyone in U.S. history, Barr perfected the Salvadoran gang formula—mixing elections, gangs, immigration and policy—that he has been developing for decades, since the L.A. riots of 1992.

 

Trump’s Oval Office speeches and other recent MS-13 rants come on the heels of many previous pronouncements, like his 2019 midterm statement referring to the Speaker of the House as “the MS-13 lover Nancy Pelosi.”

 

While Trump’s deploying MS-13 as an election year prop is predictable and laughable in how it (again) tries to paint the gangs as “terrorists,” these electioneering ploys hide a history of policing and intervention that created in Los Angeles, and exported to Central America, both the gangs and the policing models many are now calling on the country to abolish. Few people have worsened the devastation in Central America and the humanitarian crisis of refugees fleeing the region as much as William Barr. In attacking the maras, Trump is continuing a pattern of lies backed materially by policy and police and military boots on the ground. 

 

I first noticed Barr’s outsized role while watching him comment on Face the Nation right after the L.A. riots of 1992. As Barr started talking about “the significant involvement of gang members at the inception of the violence (of the riots),” it was obvious that he was laying responsibility for a billion dollars of destruction on the maras and other gangs, even though those of us who were actually on the streets knew it was the country’s first multiracial uprising. In order to understand the killings of George Floyd and Breonna Taylor, and the other cases that moved Black Lives Matter back into action in cities across the U.S., we must understand what William Barr did in those cities to Salvadoran and other gangs following the L.A. riots.

 

Those of us working in the central parts of L.A. most impacted by the riots—the South Central and Pico-Union neighborhoods—understood Barr’s Face the Nation statements for what they were: a cynical use of the uprisings and the gangs to justify a major shift in US law enforcement. This was a shift that would have profound implications for Salvadorans in the Pico-Union neighborhood where the Salvadoran maras were born. At the time, however, few to none of us in South Central and Pico-Union had any idea that Barr’s policy and policing maneuvers previewed a shift in U.S. law enforcement that would have catastrophic results far beyond the communities that burned to the ground.

 

Barr was using the complexity and chaos of post-uprising L.A. as an opportunity to push his new priority for federal law enforcement: fighting gangs, especially the maras. Weeks after his appointment to the G.H.W. Bush administration, the new attorney general had targeted the gangs with the largest reallocation of FBI resources in history, moving three hundred agents from counterintelligence work against potential foreign enemies to instead target the gangs, including the Salvadoran young people in and around maras. Before making his decision to prioritize gangs, Barr consulted with the man who started a new era of anti-immigrant politics in the United States, California governor Pete Wilson, who was preparing to launch Proposition 187. Proposition 187, which denied government aid and services to undocumented immigrants, was the de facto start of the contemporary anti-immigration movement in the United States.

 

Barr went on to send US Justice Department trainers and aid to El Salvador to institute policing models based on William Bratton’s “broken windows” policing in New York, championed and expanded by Rudy Giuliani. These were the same controversial policing models that #BlackLivesMatter protests throughout the country are calling on governments to abolish. Barr’s failed policies are global in scope—and still growing. Even before Trump’s attempts to designate the estimated 10,000 mara members in the U.S. as “terrorists,” other U.S. presidents and government officials, FBI agents, and U.S. allies had made baseless claims comparing the maras with Al Qaeda and other terrorist organizations. The haze of Barr’s racialized war on gangs is enhanced by my peers in the media, many of whom still use tattoo-faced images of the maras, even though most stopped tattooing their faces many years ago.

 

Lost in the haze of Barr’s legacy is the fact that the overwhelming majority of the estimated 10,000 mara members in the U.S. (a number that has not grown for years, according to the FBI) have not killed anyone. In any given year, the three or four handfuls of white men involved in mass shootings with AR-15s have likely killed more people than have all MS-13 members living in the United States.

 

The most recent testament to William Barr’s legacy lives in Trump’s Oval Office Salvadoran gang statements yesterday. Those statements included linking Democratic mayors to the maras, calling the mayors “left wing people that are running our cities” and “not doing the job that they’re supposed to be doing (about the gangs).”

 

All indicators are that Barr’s legacy will continue and expand its destructive trajectory. Trump said that Barr himself would be holding a press conference next week announcing policy initiatives targeting MS-13.

 

The other thing I hear in Trump’s MS-13 speeches is that in L.A., across the United States and around the world, we’re all living in William Barr’s vision of gangs, cities and politics.

Can Biden Beat Van Buren's Curse?

A political cartoon, likely 1836, mocks the personal life of Richard Mentor Johnson and suggests his selection will appeal to abolitionists and Black people, groups loathed by large parts of the Democratic polity. Image Library of Congress.

 

 

 

Will Joe Biden be different?

 

As Election Day approaches, he confronts not only an incumbent Republican president flush with campaign cash but also a nearly two-century-old hazard of history. Since the 1830s, the vice presidency has become a dead end for Democratic aspirants to the White House. 

 

Believers in dark wizardry might consider this prolonged string of ill luck as the curse of Martin Van Buren.

 

Of course, Harry Truman (in 1948) and Lyndon Johnson (in 1964) won presidential campaigns after occupying the nation’s second office. Each, though, ran as an incumbent chief executive, following Franklin Roosevelt’s death in 1945 and John Kennedy’s assassination in 1963. Neither was still viewed in the secondary role by the time of his election.

 

Over the past 52 years, three Democrats who were a heartbeat away have sought the presidency. All three—Hubert Humphrey in 1968, Walter Mondale in 1984 and Al Gore in 2000—lost. During that same period, two Republicans who served as two-term veeps—Richard Nixon and George H. W. Bush—triumphed in their pursuit of the Oval Office. 

 

What is it about Democrats?

 

Except for the atypical cases of Truman and LBJ, not a single Democrat since Andrew Jackson picked Van Buren as his running mate in 1832 has moved up electorally after being—in Mondale’s sobering phrase—“standby equipment.”

 

Even Van Buren’s political career took a tumble not long after he succeeded “Old Hickory” by winning the 1836 election. The Economic Panic of 1837—with high unemployment, unstable currency and debt defaults—caused the most destructive depression in the young nation’s history, jolting confidence in Washington. 

 

With the economy still struggling and other domestic problems, Van Buren, a diminutive New Yorker with the nickname the “Little Magician,” couldn’t conjure enough electoral tricks to win a second term in 1840. He lost soundly to Whig candidate William Henry Harrison, who appealed to voters by using the ballyhoo of America’s first catchy campaign slogan: “Tippecanoe and Tyler Too.” 

 

Part of the Democratic Party’s predicament that year came from the reputation of Van Buren’s vice president, Richard Mentor Johnson, whom Van Buren never wanted in his administration, let alone in close proximity to presidential power. 

 

The Van Buren jinx starts with Johnson, so he deserves more than cursory consideration.

 

Having spent 30 years in Congress representing Kentucky—one decade in the Senate and two more in the House of Representatives—Johnson was a conspicuous fixture in Washington. 

At one point he set the capital’s heads shaking by leading a drive to support a government-funded expedition to the center of the earth. As planned, this bizarre scheme involved about a hundred men—transported by reindeer and sleigh—intent on discovering an opening at the Arctic Circle for a descent to the earth’s core. 

 

“As outlandish as this proposal seems, what is truly remarkable is that twenty-three senators joined Johnson in supporting it!” write Mitch McConnell and Roy Brownell II in The U.S. Senate and the Commonwealth: Kentucky Lawmakers and the Evolution of Legislative Leadership (2019). McConnell, the Senate majority leader, and his co-author devote a dozen memorable pages to the country’s ninth vice president. 

 

During the War of 1812, he took a break from House duties to command a regiment of volunteers from his home state that helped win the Battle of the Thames in Canada. 

 

This Kentucky colonel returned to the hurly-burly of politics as a combat hero, becoming a favorite of the lionized military leader of the War of 1812: Gen. Andrew Jackson. 

 

Jackson selected Van Buren to run with him for his second term in 1832 after clashing with his first vice president, John C. Calhoun. Four years later, “Old Hickory” still wielded political clout and vigorously supported Johnson over presidential candidate Van Buren’s choice for a running mate. Jackson won.

 

Johnson swiftly became a controversial national figure. His common-law wife, Julia Chinn, happened to be a mixed-race enslaved woman (described as an “octoroon” in the racial taxonomy of the day), whom he inherited from his father. Other relationships with enslaved women after Chinn’s death in 1833 kept tongues wagging in Kentucky, Washington and elsewhere. 

 

Commingling personal affairs and government business was never an ethical dilemma for Johnson, who also seemed to care little about his public image. In her widely read and quoted 1837 survey, Society in America, British author Harriet Martineau offered this close-up of Johnson at a Washington dinner: “If he should become president, he will be as strange-looking a potentate as ever ruled. His countenance is wild, though with much cleverness in it; his hair wanders all abroad, and he wears no cravat.”

 

Johnson’s personal life, duly noted and derided in the press, helped keep him from securing the necessary Electoral College votes for vice president after the 1836 popular vote. For the first and only time in history, the Senate formally assembled to pick the occupant of the nation’s second office. With each senator casting a vote, Johnson defeated Francis Granger of New York, 33 to 16.

 

Given that Van Buren didn’t want Johnson on the Democratic ticket, the chief conductor of the executive branch had virtually nothing to do with his second fiddle. The official Senate website summarizes Johnson’s service as vice president and president of the Senate in two matter-of-fact sentences: “During his four years in office, Johnson broke 14 tie votes. When not presiding over the Senate, Johnson could regularly be found in Kentucky, operating his tavern.”

 

Serious questions about Johnson triggered robust debate at the 1840 Democratic convention. This time Jackson dropped his backing, but Van Buren fretted that dumping Johnson from the ticket might bolster the chances of a bona fide war hero, “Tippecanoe” Harrison. What to do?

 

Convention delegates dodged a formal selection, fearing any choice could be divisive. Nobody was nominated for the second place on the ticket, and 1840 became the only time since the passage of the 12th Amendment (spelling out the process to choose the president and vice president) when a major party failed to slate two people on a national ticket. 

 

Van Buren ran alone. He lost alone. 

 

Historian Arthur M. Schlesinger Jr. thought the 12th Amendment, ratified in 1804, turned the vice presidency of the 19th century into “a resting place for mediocrities.” He’s accurate, but they proved to be quite ambitious mediocrities.

 

Johnson, who campaigned unsuccessfully on his own to remain as vice president in 1840, sought the Democratic presidential nomination four years later—and got nowhere.

 

James Polk won in 1844, and his promise to serve just one term made his vice president, George M. Dallas of Pennsylvania, drool for the 1848 Democratic nomination. At that year’s convention he could convince only three delegates he was White House timber. 

 

Dallas followed Johnson in what’s turned into a Democratic Party tradition of defeat for hapless, if not cursed, White House hopefuls. Adlai Stevenson I (Grover Cleveland’s second VP), Thomas Marshall (Woodrow Wilson’s two-term second banana), John Nance Garner (Franklin Roosevelt’s first Number Two) and Alben Barkley (Truman’s vice president) all thought they could find success at Democratic conventions by using their performances in front of delegates to vault to the highest office. 

 

None, however, mustered much support. Garner, remembered for his bon mot that the “vice presidency is not worth a bucket of warm piss,” even ran unsuccessfully against FDR—his boss—at the 1940 convention.   

 

Henry Wallace, FDR’s second vice president, was dropped from the 1944 ticket and replaced by Truman. Wallace’s consolation prize, appointment as Secretary of Commerce, continued for a year-and-a-half until Wallace and Truman parted ways in 1946 over the president’s confrontational approach to the Soviet Union. After his firing as Secretary of Commerce, Wallace sought revenge. He mounted a presidential campaign against Truman in 1948 as the Progressive Party candidate. 

 

Wallace’s untraditional and ill-fated attempt at political advancement resembled a similar effort by John Breckinridge, James Buchanan’s vice president during the much-reviled Democratic administration between 1857 and 1861. 

 

Stephen A. Douglas, a senator from Illinois, won the 1860 Democratic nod for president, but disgruntled Southern Democrats, largely pro-slavery stalwarts, bolted, conducting a separate convention that chose Breckinridge of Kentucky as the nominee. 

 

Though Breckinridge received just 18 percent of the popular vote (to Republican Abraham Lincoln’s 40 percent and Douglas’s 29 percent), Southern states awarded Breckinridge 72 electoral votes to Lincoln’s 180. Douglas got just 12. 

 

Starting with Van Buren, there have been 17 vice presidents in Democratic administrations. Three made it to the White House—Van Buren, Truman and Lyndon (not Richard) Johnson. Eleven tried and failed either to win the highest office or the party’s nomination. (Two—William R. King, Franklin Pierce’s VP for just six weeks, and Thomas Hendricks, a nine-month Number Two in Cleveland’s first term—died in office.)

 

Joe Biden is Number 17, and he faces another hurdle of history. Except for Thomas Jefferson in the controversial election of 1800, no candidate of either party who’s been vice president has defeated an incumbent president to grab the brightest brass ring of U.S. politics.

 

Will the curse of Martin Van Buren continue to cast its sinister spell on Democrats in 2020, or is this the year when history takes a dramatic turn by elevating an understudy to become the leading character on America’s—and the world’s—stage?

 

Place your bets. 

The Latest Resurgence of Ethnic Studies

Black Student and Third World Liberation Strike, San Francisco State, 1968

 

 

Ethnic Studies is back at the forefront of higher education. It has been more than half a century since the first ethnic studies courses became part of the curriculum in higher education. Ethnic studies evolved out of the civil rights movements of the late 1960s and early 1970s, an era dominated by an intense level of civil disobedience that contributed to increasing self-awareness and ethnocentric pride among many people of color. Black American Studies, Asian American Studies, Chicano Studies, Mexican American Studies, Native American Studies, Jewish Studies, Arab Studies, and studies of indigenous cultures were established as fields in their own right. Such activism is still occurring.

The first official ethnic studies department was established in 1968 as the result of a long and tense strike at San Francisco State University staged by the Third World Liberation Front, a progressive and radical coalition of Black, Asian, Latino, and Native American students. Frustrated at the dearth of courses that were representative of or spoke to their ethnic and cultural experiences, these young radicals demanded that administrators address and correct what students viewed as chronic deficiencies in the curriculum. Their steadfast determination resulted in senior-level administrators, including then President S.L. Hayakawa, surrendering and incorporating ethnic studies courses into the university by establishing a School of Ethnic Studies.

This victory inspired other students and faculties of color across the nation to follow San Francisco State University's lead and stage their own strikes. A few years later, in 1972, the National Association for Ethnic Studies was founded to bring together scholars from diverse disciplines to promote interdisciplinary research. 

Throughout the early to mid-1970s, ethnic studies programs and departments continued to flourish. One major difference, though, was that groups that previously worked in solidarity became more territorial, demanding departments and programs focused on their particular cultures. Thus, fields such as Black Studies, Latin American Studies, Native American Studies, Chicano Studies, and others were established. Often such departments worked in unison with one another, but sometimes fights over power, prestige, recognition, and resources occurred.

By the late 1970s, as the U.S. economy fell into recession and the 1978 Supreme Court decision in Regents of the University of California v. Bakke  took aim at race-based affirmative action in higher education, more Americans focused on how race was being addressed in higher education. A backlash began toward issues or institutions viewed as having a racial agenda. Anything seen as “too race-identified” was viewed with a suspicious eye.

Such sentiments continued after Ronald Reagan defeated incumbent president Jimmy Carter in November 1980 to become the 40th president of the United States. While the backlash against racial identification and perceived racial preferences, no matter how misguided, had begun in the late 1970s, such reactionary attitudes became even more pronounced as the 1980s progressed. Social programs, the safety net, welfare programs, higher education, labor unions, people of color, Jews, and any group associated with the specter of liberalism came under aggressive attack and were treated as institutions or issues non grata.

By the 1980s, business and engineering were the degrees du jour on college campuses. Liberal arts were considered passé and all but dead. While majors such as English, history, philosophy, and others declined, the drop-off was even more acute in ethnic studies programs such as Black Studies and Chicano Studies. In some cases, funding became harder to secure from hostile, suspicious, or outright indifferent state legislatures; some programs or departments closed outright, while others were severely scaled back or folded into other departments on campus. It was a less than robust time for ethnic studies at many institutions. Interestingly, during the same period, women's studies programs were established on many campuses.

By the later 1980s, many employers had realized that their business, engineering, and hard science graduates were often good at crunching numbers, figuring out equations, and processing data, but deficient in abstract reasoning and critical-thinking skills. This posed a dilemma for corporations that needed workers who could think outside the box. Thus, the much-maligned and dismissed liberal arts majors were suddenly back in demand and being eagerly recruited. 

This trend continued into the 1990s. In contrast with the previous decade, when such fields had been largely marginalized, employers suddenly wanted and demanded graduates in communication, philosophy, history, literature, and similar majors as part of their workforce. In higher education, issues such as diversity and academic perspectives rooted in multiculturalism or intersectionality became mainstream. Institutions suddenly became committed (at least in theory) to diversifying their faculty ranks, which had been (and in truth still are today) overwhelmingly white.

After languishing on the academic sidelines throughout the 1970s and 80s, the often overlooked and neglected stepchild—Ethnic Studies—was finally becoming recognized as a valuable entity in higher education, and such majors increased tremendously in popularity. For several years, Black American Studies became the discipline du jour. Much of this was due to the immense popularity of scholars such as Cornel West, bell hooks, Patricia Williams, Henry Louis Gates Jr., and many others.

Suddenly, Ivy League and other prestigious institutions were developing African American and other high-powered, well-funded ethnic studies departments or programs. Colleges and universities appeared eager to hire, recruit, and retain faculty of color. Ethnic studies enjoyed the positive reception that had previously eluded it. A new day had seemingly emerged in the ivory tower.

While a renaissance of sorts did indeed occur, not everyone was pleased with this sudden coronation being bestowed upon race, ethnic, and gender studies. There were those who saw this new recognition of the contributions of people of color as an attack on “classic, well-established (read Eurocentric) scholarship” that had supposedly “stood the test of time.” The much-touted idea of “culture wars” reared its intense, combative head throughout the mid-1990s. These battles were primarily waged in the liberal arts, most notably in history and English departments.

To say the debates were fierce is akin to saying that water is wet. It appeared that everyone from academics to the mainstream media to politicians and private citizens found it appropriate to weigh in with commentary. Heated debates took place across the political spectrum as men and women from the political and cultural left, center, and right took to the airwaves on radio programs and penned passionate arguments on the op-ed pages of well-respected newspapers, highbrow magazines, upscale journals, and other venues to defend their positions.

By the late 1990s, it appeared that a cease-fire had occurred, and all parties involved had decided to “live and let live.” At the risk of troubling long-settled waters and reopening old wounds, I would argue that proponents of the new school—the deconstructionists, post-colonialists, and social and cultural historians—were successful in integrating their work into the literary and historical canon. It is now a given that the works of writers other than white, heterosexual men will remain in the curriculum of many humanities courses, despite grumblings from certain right-wing quarters.

During the first two decades of the 21st century, stark cuts to higher education by state legislatures, a declining college-student-aged population, overreliance on adjunct faculty, the skyrocketing cost of college, and the rise of online education and for-profit colleges, coupled with similar roadblocks, have had a dramatic impact on higher education. Some colleges have been forced to merge or close down. Such predicaments have left many parents (and students) questioning the value of a college education.

 In an era of such endemic uncertainty, it is not surprising that parents are more cautious and determined to ensure that the money they spend on their children’s education will result in tangible outcomes. In essence, only the most economically privileged students can afford the supposed “luxury” of pursuing a liberal arts education. Arguments from the early to mid-1980s regarding the profitability of certain degrees have suddenly resurfaced. But this time, business has been eclipsed by STEM and healthcare-related fields.

Once again, the humanities are under fire and viewed as being worthless in the job market. Ethnic studies courses have faced particular ire. They are often the first programs/departments to be severely downsized or terminated. The more things change, the more they stay the same.

But have we not heard this argument before? Ethnic studies and the humanities have routinely been targeted for ridicule. Yet whenever they seem down for the count, they rise from the ashes and find their way back onto the road. Something tells me this time will be no different. For one, rapidly changing racial, ethnic, and cultural demographics in our nation will demand that such courses become permanently etched into the fabric of the educational curriculum for K–12 and beyond. Secondly, our increasingly global economy and interconnected world require that they become the norm. Ethnic studies are the epitome of resilience. I would not bet against the discipline; such courses are more important now than ever.

A Coup of "Clerqs"?: Anne Applebaum's "Twilight of Democracy" Reviewed

Fidesz rally, Budapest, 2006. Photo Derzsi Elekes Andor, CC BY-SA 3.0

 

 

 

 

On New Year’s Eve in 1999, historian and journalist Anne Applebaum and her husband Radek Sikorski, who was then Poland’s deputy foreign minister, hosted what sounded like a really great party. Their home was filled with American and British journalists, Polish dignitaries, and some of their neighbors. Applebaum characterizes them as representatives of “the right,” which at the time meant something vastly different from what it means today. They were staunch anti-communists in the Thatcherite tradition, who were optimistic about Poland’s embrace of democracy and free markets. The gathering was held at an old manor house that had been abandoned in 1945 by its previous owners, who had fled from the advancing Red Army. Sikorski’s family had purchased and renovated the dilapidated house, making it the perfect metaphor for a nation rising from decades of oppression at the hands of the Soviet Union. Unfortunately, the friendships and the optimism that animated that celebration have largely evaporated. According to Applebaum, “Nearly two decades later, I would now cross the street to avoid some of the people who were at my New Year’s Eve party. They, in turn, would not only refuse to enter my house, they would be embarrassed to admit they had ever been there.”  Many of her Polish friends, colleagues, and acquaintances have abandoned their belief in classical liberalism and embraced an illiberal right-wing authoritarianism that sees democracy as flawed and free markets as rigged by elites. She maintains that the bitter political divisions that developed among those who attended this party in Poland are similar to those now found in many countries today, including Hungary, Spain, Italy, Great Britain and the United States. 

 

In Twilight of Democracy: The Seductive Lure of Authoritarianism, Applebaum identifies several factors that have contributed to the rise of twenty-first-century illiberal movements, often using personal anecdotes to illustrate her points. She is not particularly concerned with theories, nor is she attempting to offer a grand thesis about the phenomenon. She makes a few passing references to the ideas of Hannah Arendt and Theodor Adorno among others, and relies on the work of a behavioral economist, Karen Stenner, to point out that about a third of the world’s population simply has an authoritarian disposition that makes them deeply suspicious of views different from their own. The bulk of her analysis focuses on the journalists, propagandists, intellectuals, and opportunists who support and enable authoritarian leaders. She refers to these sycophants as clerqs—a term she borrows from Julien Benda’s 1927 book, The Treason of the Intellectuals—and explores their ideas (such as they are) and methods. She provides dozens of examples of how these clerqs used the system to advance themselves and perpetuate the system but devotes little attention to the actions of dictators like Viktor Orban, Andrzej Duda, and Vladimir Putin. It is an interesting approach to the phenomenon of rising authoritarianism. 

 

Twilight consists of six chapters, three of which have already appeared in the Atlantic or Washington Post. Each chapter can be read independently, but they fit together well in book form. Some readers may be disappointed that she devotes little attention to left-wing authoritarianism arguably found in some political parties, among college activists, and in cultural institutions, but she chose to focus on right-wing authoritarianism because she sees it as the greater threat to Western democratic liberalism right now. Poland and Hungary serve as her two most important examples of authoritarians using democratic institutions to gain power, but she refers to Russia, Turkey and other nations as well. The book is a powerful warning to citizens of democracies around the world: “Given the right conditions, any society can turn against democracy. Indeed, if history is anything to go by, all of our societies eventually will.”  

 

Applebaum’s exploration of the demise of Polish and Hungarian liberalism illuminates key developments that are also found in other European countries and the US. She asserts that today’s authoritarians essentially follow Vladimir Lenin’s view of political parties, but neither Law and Justice in Poland nor Fidesz in Hungary seized power through violent revolution. However, these modern authoritarians embrace the Leninist model of a polity ruled by a single party. Though the veneer of a pluralistic democracy endures, the party dominates many aspects of life in both countries. The clerqs who support the party and run the government, “…understand their role, which is to defend the leaders, however dishonest their statements, however great their corruption, and however disastrous their impact on ordinary people and institutions. In exchange, they know that they will be rewarded and advanced.”  For example, she tells the story of Jacek Kurski, who rose from “relative obscurity of fringe politics,” (he was known for his work spreading the rumor that Polish presidential candidate Donald Tusk had a grandfather who voluntarily fought with the Wehrmacht in World War II) to become the director of Telewizja Polska, Poland’s public broadcaster. Since he took control of the broadcasting company, it has become a mere propaganda outlet for Law and Justice. Applebaum describes his appointment as akin to Alex Jones of InfoWars taking over the BBC. Law and Justice and Fidesz have not resorted to the creation of a new Gulag to suppress dissent; they rule through intimidation, manipulation of the media, and control of the courts.

 

These new authoritarians do not try to promote what anti-Stalinist writers like Arthur Koestler and George Orwell called the Big Lie. Instead, they promote the Medium-Size Lie, a term coined by historian Timothy Snyder. Law and Justice promotes many such lies wrapped up in vast conspiracy theories. One example is the Smolensk conspiracy concerning the 2010 plane crash that killed the Polish president, Lech Kaczynski, who was on his way to Smolensk for a ceremony commemorating the thousands of Polish officers murdered in 1940 by the Soviet secret police. Since the truth of the official investigation into the causes of the crash embarrassed Law and Justice (the black box revealed that Kaczynski himself insisted that they land the plane despite a thick fog), the party concocted an elaborate and flexible theory that the plane was brought down by various conspirators, though the group and its goals can change on an as-needed basis. According to the author: “Anyone who professes belief in the Smolensk lie is by definition a true patriot—and thus qualified for a government job.”  (I was surprised the author did not connect this lie to the Polish Communist Party’s insistence that the Katyn Forest Massacre was the work of the Nazis, not the Soviets.) Fidesz dusts off old anti-Semitic tropes to frighten Hungarians into believing that Viktor Orban is the great bulwark against George Soros’s sordid attempt to destroy Christian civilization by enabling mass immigration of Muslim refugees into Christian Hungary. 

 

Illiberal political movements in Europe and the United States use conspiracy theories to promote what the author calls “restorative nostalgia.”  They contend that their once-great nations have been weakened by a whole host of similar threats that have specific variations depending on the country. The Vox party in Spain claims that Muslim invaders are poised to undo the fifteenth-century Reconquista of Catholic Spain. Prior to winning office Boris Johnson, in his columns for the Daily Telegraph, painted an often-inaccurate picture of venal and corrupt European Union bureaucrats destroying the fabric of English society. In a BBC interview, Johnson candidly admitted, “...everything I wrote from Brussels was having this amazing, explosive effect on the Tory party—and it really gave me this, I suppose, rather weird sense of power.”  In Poland, the LGBTQ community is seen as an existential threat to the nation. And some members of the Republican Party in the US, including President Donald Trump, have lent credence, if not active support, to the bizarre QAnon conspiracy of deep state actors destroying the country, maintaining that only Trump can Make America Great Again. 

 

These movements share similar enemies and tactics, sometimes relying on the same viral memes that can instantly transcend borders. For example, a video purporting to show a large group of French Muslims in Paris celebrating the 2019 fire in Notre Dame Cathedral was shown across multiple platforms on both sides of the Atlantic as a dire warning of the threat of Muslim immigrants. In reality, the images were from an anti-government rally in Algeria held before the tragic fire. Amidst the “cascade of lies,” the truth has little chance of catching up. What’s worse, even if the truth catches up, it is too easy to ignore because of the sheer volume of information spread across the internet. 

 

Applebaum illustrates her points with numerous personal experiences that she shared with leading European and American journalists, writers, artists, and politicians, which make the book both compelling and enjoyable to read. Twilight of Democracy raises alarms that liberal democracy is facing its most significant threat since the Cold War. She does not, however, insist that it’s doomed. In the final chapter, she recounts another party she hosted in 2019 (she seems to have a great social life), which included large numbers of guests including her teenage sons’ friends from Poland, Britain, and the US: “They mixed English and Polish, danced to the same music, knew the same songs. No deep cultural difference, no profound civilizational clashes, no unbridgeable identity gaps appeared to divide them.”  The threat of authoritarianism is not going away, but liberal democracy can still prevail. 

Will California Voters End a 24-Year Ban on Affirmative Action?

 

 

This November, California voters will have the chance to repeal the state’s longtime ban on affirmative action. A state ballot measure, Proposition 16, will be the first major test of voter enthusiasm for social justice measures since the May 25 murder of George Floyd.  

The ban on the use of affirmative action by state government was passed as a ballot measure (Proposition 209) in 1996 with strong support from Republican Governor Pete Wilson. 

Proposition 16, if passed, would allow the state to use affirmative action programs in government hiring, contracting and university admissions. The measure’s most immediate impact would be on college admissions, where officials could once again use an applicant’s race as a factor in determining acceptance.

The ballot proposition, which would amend the state’s constitution, needs a simple majority to take effect. It was placed on the ballot after a June 24 vote in the Democratic-controlled state Senate.  The Senate vote came shortly after the university board of regents voted unanimously to endorse the ballot measure and one month after the killing of George Floyd in Minneapolis.

Nine States Ban Affirmative Action

California is currently one of nine states that ban affirmative action; the others are Washington, Texas, Florida, Michigan, Idaho, Nebraska, Oklahoma and New Hampshire. No state that imposed an affirmative action ban has later had it repealed by voters or the legislature. In November 2019, Washington State voters narrowly defeated a measure to repeal that state’s ban on affirmative action.

How did California, now seen as a bastion of progressive legislation, pass a ban on affirmative action? 

The 1990s were a period of rapid change and social unrest in the Golden State. Between 1988 and 1992 the state experienced a 42% increase in illegal immigration. In 1992, Los Angeles endured three days of rioting after the acquittal at trial of the LAPD officers who beat Rodney King.

 A “tough on crime” mood infused state politics. In 1994, Pete Wilson won a tough re-election campaign and state voters approved two conservative-backed ballot measures, Propositions 184 and 187. The former, dubbed “Three Strikes and You’re Out,” lengthened criminal sentencing, while the latter denied health and welfare benefits to undocumented workers. 

By 1996, Governor Wilson was contemplating a run at the Republican presidential nomination and saw wedge issues such as illegal immigration and affirmative action as ones he could use to his advantage. Ward Connerly, a Black businessman appointed by Wilson to the university board of regents, became the most prominent supporter of Proposition 209, the measure banning affirmative action. In speeches and editorials, Connerly characterized it as a “civil rights” measure.

Shifting Demographics

In 1996, the California electorate was 74% white. Proposition 209 passed by an overwhelming majority of white voters, 63-37%, while being opposed by majorities of Black, Hispanic and Asian voters, who together constituted a small fraction of the electorate.

Today, the demographics of California have shifted. According to the Public Policy Institute of California, Whites make up only 42% of the state’s adult population but 58% of its likely voters. In comparison, Latinos comprise 35% of the adult population but just 19% of likely voters. 

Asian Americans make up 15% of adults and 13% of likely voters.  African Americans comprise 6% of both the adult population and likely voters.

The repeal measure will benefit from a well-funded “Yes” campaign, which has raised more than $3 million. The “Yes” campaign enjoys the support of the state’s leading Democrats and most major unions.  Proposition 16 opponents, who have tapped Proposition 209 backer Ward Connerly as a spokesperson, have raised only around $100,000. 

The Asian American community appears to be divided in its support. While State Controller Betty Yee, the state’s highest elected Asian American, supports Proposition 16, the Silicon Valley Chinese Association, a business group, strongly opposes it. The group’s president, Crystal Lu, said “We cannot let government choose winners or losers on the basis of your race or gender.”

Although the University of California (UC) and California State University (CSU) systems have been prohibited from considering race in admissions in recent decades, they have implemented policies to diversify enrollment by giving weight to applicants’ economic status. 

In 2019, White students comprised 22% of UC undergraduates and 21% of CSU’s, compared to 36% of the adult population.

While Latinos comprise 39% of the state’s adults, they make up 25% of UC undergraduates and 44% of CSU’s. Note that in 2019, Latinos comprised 55% of high school graduates. Asian Americans comprise about 14% of the adult population but 33% of UC undergraduates, the largest share of any racial group.

African Americans make up about 6.5% of the state’s adult population, but only 4% of undergraduates at both the UC and CSU systems.

While no major statewide polls have been conducted on Proposition 16, many of the state’s political experts believe it will pass. The November ballot will include Democratic nominees Joe Biden and Kamala Harris, the state’s popular U.S. Senator. California is strongly anti-Trump and a large turnout is expected, which should favor progressive candidates and social justice causes.

On Labor Day, Think of Bread and Roses

Lawrence, Massachusetts, 1912

 

 

As we celebrate Labor Day and hard-working Americans, let’s also think of Bread and Roses. Back in 1912, that’s what inspired laborers in the textile mills of Lawrence, Massachusetts to go on a history-making strike. They stood up for the rights of workers and the poor, ideals we must carry on today.

Many immigrant women and children worked long, grueling hours at the American Woolen Company’s mills in Lawrence, and the work of producing fabric for clothing was dangerous. One fourteen-year-old, Carmela Teoli, caught her hair in the machines and was so badly wounded she had to be hospitalized for months.

The laborers appeared to win a small victory in January 1912, when Massachusetts passed a law lowering the working week for women and children from 56 hours to 54. How did management respond? They cut each worker’s wages by thirty-two cents to make up for the lost hours. Thirty-two cents may not seem like a lot of money today, but back in 1912 it could make the difference between affording food and going hungry. The workers in the textile mills lived in poor, cramped conditions and were struggling to get by.

When the workers learned of the pay cut, they were furious. The Bread and Roses Strike of 1912 began, and it spread quickly throughout the textile mills. More than 20,000 workers organized to protest the low wages and poor treatment. These oppressed workers wanted to be able to afford food in exchange for their hard work, but they also wanted respect. A rallying cry was captured in a poem by James Oppenheim: “Hearts starve as well as bodies; give us bread, but give us roses!”

When the public learned of the horrid working conditions in the textile mills, the strike gained even more support. Representative Edward Townsend of New Jersey saw firsthand the poverty in the homes of the textile workers, where malnutrition was causing a high death rate among infants. In a speech before Congress in March 1912, Townsend said, “In the City of Lawrence...out of every 100 deaths 47 were of children under 5 years of age.” As reported in the New York Times, Townsend said, “It was hard to believe in some cases that human beings could exist in the conditions I found. In only a score of homes was there anything to eat except bread. The supply of that was, many times, pitifully small.”

Hearings were held in Washington, and Carmela Teoli delivered powerful testimony about her experiences working in the mills. President Taft and Mrs. Taft heard her story and pledged their support. The American Woolen Company soon gave in to the workers’ demands, including wage increases. The workers had successfully united for justice and human rights, and child labor laws were soon passed to begin protecting youth.

As we celebrate Labor Day, let’s remember that the struggle for the rights of workers and the poor continues. There are Americans who work long hours but are barely able to get by and still need food assistance. Bread for the World reminds us that “We cannot end hunger in the U.S. without raising the minimum wage.” According to the World Bank, almost half the world’s population – 3.4 billion people – lives on less than $5.50 a day. Many people, including children, go to bed hungry at night around the world.

We must do more to help the poor. One way is by lifting up the many small farmers around the globe so they can produce more food and become suppliers to their own communities. Small farmers in developing countries could become the suppliers to national school lunch programs, helping to end hunger and poverty.

On Labor Day we should be thinking of ways to give everyone a chance to work hard and make a decent living. No one should be left out. Everyone working to make a better life for their family deserves Bread and Roses too. 

Life during Wartime 519

"Have You Lived Your Whole Life in Vermont? Well, Not Yet!": One State's Joke Culture

Having never seen this T-shirt within the state, I don’t think it’s really a Vermont joke. 

 

 

 

Why do some states develop a joke culture while others don’t?

 

For that matter, some states develop much stronger state identities than others. One is or is not a native Vermonter, for example. Indeed, I probably should have written “Native Vermonter.” 

 

I moved to Vermont in 1975. Shortly after arriving, I went to a potluck dinner at the home of the minister of the Burlington Unitarian Church, hoping to make new friends. Two dozen people came, and we went around the room introducing ourselves. Perhaps the third person to speak was also the oldest, 84 years of age, and he spoke for several minutes, but his point was not to tell of any of the interesting jobs he had held or experiences he had had. No, he went back in time to explain and lament that he was not a native Vermonter, having lived in the state for only the last 82 years. From that point on, around the room, everyone took care to note whether they were or were not native to the state. 

 

Mississippi is like that. So is Texas. 

 

Compare Illinois, my home state. We don’t even know how to pronounce it! Some say “Illinoisan” without pronouncing the “s”; some say “Illinoisian,” and in that camp no one knows whether to pronounce the “s” or not; and no one cares anyway. Massachusetts does not even have a name for it—“Massachusettsian”? “Massachusettser”? [there is in fact a widely-shared nickname for Bay State residents, popular among residents of the northern New England states, but it is profanity-adjacent--ed.]

 

Vermont, you may know, is also a state that has jokes. Most states do not. Some states are known to be the butt of others’ jokes, such as the cruel and misleading anecdotes told about West Virginians by people in neighboring states. But Vermonters tell their own jokes—wry, a bit sly, sometimes showing some wisdom. 

 

Often these jokes tie to the notion of the Vermont native and his[i] character. For example, a “flatland” tourist finds herself asking a Vermonter in at least his 80s, “Have you lived in Vermont your whole life?” Comes the laconic reply, “Not yet.”

 

Here are all the Vermont jokes I ever heard told within the state, starting with the worst. 

 

            Flatlander during dreary all-day drizzle: “Think this rain’ll stop?”

            Vermonter: “Always has.”

 

            Flatlander (me) during dreary all-day drizzle: “Think this rain’ll keep up?”

            Lumber yard worker: “Hope so.”

            Me: “You hope so?”

            Lumber yard worker: “Ayup. Then it won’t come down!”

 

            Vermonter to flatlander: “Don’t like our weather? Wait a minute.” 

 

A flatlander wants to take a shortcut across a field, but is worried by the bull he sees grazing in the middle of it. So he asks the nearby farmer, “Say, is that bull safe?” 

Vermonter: “Sure, he’s safe.” 

So the flatlander jumps the fence and starts across the pasture. 

Vermonter: “Can’t say the same for you, though.”

 

An elderly Vermont farm couple sits on their porch, watching a typical yet stupendous Vermont sunset over the low scarlet-tinged hills. Slowly pink turns to crimson turns to vermilion. It is literally breathtaking. At last, as they turn to go in, the husband says quietly, “We’ll pay for that.” 

 

Driving through one of our many intersections with no stop signs, a flatlander and a Vermonter both reacted too slowly for the icy conditions and had a fender bender. As they were exchanging insurance information, the Vermonter suggested they retreat to the pub that happened to be at the corner, to get out of the cold. As they entered, the Vermonter called to the waitress, “A beer for my new friend here.” 

“How nice,” thought the New Yorker. “This would never happen in the city.”

They called the authorities and wrote down each other’s insurance company and phone number, and then the flatlander realized that the Vermonter hadn’t bought himself anything to drink. “Let me get you a beer!” he said.

“Oh, no thank you,” replied the Vermonter. “I’ll just wait until the police have come and left.” 

 

No one had won the Vermont Lottery for several weeks in a row, so it had grown to more than $3,000,000. Finally a winner was announced, and it turned out to be a native Vermonter, a farmer all his life, so a newspaper reporter from Burlington was sent to interview him.[ii] “What do you plan to do with the winnings?” she asked.

“Nothin’,” replied the farmer.

“$3,000,000?! You must have some plans!”

“Nope,” said the farmer. “Reckon we’ll just stay right here and farm, ‘til it’s gone.” 

                       

The Burlington Free Press followed the custom of interviewing couples celebrating their 50th anniversaries and then writing little human interest stories about them.[iii] Then came the news that an old couple up in the Northeast Kingdom had just passed their 75th anniversary! Of course a reporter went to interview them. She knocked on the door and the husband let her in. They both sat down in the front room and the wife joined them. 

“You’ve been married longer than anyone else in the state, so far as we can tell,” gushed the reporter. “Do you have any secrets to share as to what helped you stay married so long?”

“Nope,” said the wife. “We hate each other.”

The reporter blanched. “Really?”

“Ayup,” said the husband. “For years we’ve had an in-house separation. And look – here she is, in my part of the house. It’s an outrage.” He glared at her. 

“You’re one to talk,” said the wife. “Last Sunday you ate in the kitchen.” And she shot him a look of sheer malevolence.

The reporter was shaken by the hatred spewing from both sides. With a tremulous voice she asked, “Well, if you hate each other so much, why haven’t you divorced?”

In unison, they both replied, “We’re planning to. We just wanted to wait ‘til the kids were dead.” 

 

A young woman schoolteacher in Boston decided to make a major change in her life. Her boyfriend had just moved out, and she wanted to leave old memories behind. So she applied for a teaching job in Brattleboro, the town closest to Boston but nevertheless in Vermont, and she got hired. Moving to her new state, she decided to go for the full Vermont experience and found a cabin in the woods for rent. It was an elegant modern cabin, but still, it was rural Vermont, in a lovely wood and with a beautiful view. 

The second week of school, she was driving home when suddenly a deer leaped out in front of her. Unprepared, she hit it, crumpling her fender against her wheel, and had to get roadside assistance. 

She was telling the tow truck man how the deer had surprised her. He pointed out the yellow sign with the dancing deer. “That means ‘deer crossing,’” he said. “Oh,” she replied. From then on she drove more carefully. 

Nevertheless, after the first PTA meeting, coming home late in the evening, another deer was in the roadway, and again she hit it. Another repair bill. Colleagues at work told her dusk was a particularly dangerous time for deer to be out. 

She drove still more carefully at dusk. 

As October passed, however, she failed to realize that dusk-like conditions also occur at dawn, and dawn came later every day. Setting forth early one morning, planning to have her wake-up coffee at school, she was stunned when a big buck jumped in front of her. This time her car was totaled, although she was not hurt. 

That evening, irate, she wrote a letter to the Vermont Department of Transportation asking – no, demanding – that they move the sign. 

 

It was foliage season. A wealthy Texas rancher on holiday was driving along one of our quaint two-lane highways and came upon a farmer tinkering with his tractor by the side of the road. “I’m a farmer too,” he reasoned to himself. “I’ll have a conversation with the fellow.”

So he stopped and introduced himself. “This your place here?”

“Ayup.”

“How big a spread you got, then?”

“Well, my land begins up there by the potato shed, takes in the woodlot, comes down along the creek there, and then back up along the road.”

The rancher had never heard of anything so dinky. He just had to reply, “Y’know, back home in West Texas, I can get in my truck and drive west all day and never reach the end of my property.”

“Ayup,” said the Vermonter. “Had a truck like that.”

 

Down in southwestern Vermont, the New York/Vermont border isn’t Lake Champlain, but a manmade line, and indeed, there had been a dispute about the exact location of that line since the formation of Vermont back in the eighteenth century. Finally the selectmen of the Vermont town got together with their counterparts across the state line and hired a surveyor to resolve the matter. Much of the dispute was on Ebenezer Jones’s property, and after the surveyor finished, part of his land, including the farm home, proved to be in New York state.

Who would tell him? Who would tell a fifth-generation Vermonter that in fact he was not, had never been, a native Vermonter? Finally they decided to go as a group. 

It was a warm August evening. The head selectman knocked. “Eb, you recollect we got this border dispute with New York?”

“Ayup,” he replied. “Had surveyors on my property.”

“Yes,” said the selectman. “That’s what we want to talk to you about. It turns out that the line was bad. Actually, most of your land, including the house here, is not in Vermont at all, but in New York state.”

To their astonishment, a broad smile came across Ebenezer’s face.

“You’re smiling!” the selectman exclaimed. “We thought you’d be downcast. You realize this means you’re not a native Vermonter, don’t you? Never have been? Why are you smiling?”

“Well boys,” replied Eb, “It’s like this. You see, I’m gettin’ on in years. Passed my 82nd birthday last May. I just don’t think I can take another Vermont winter!” 

 

That’s it. Well, there were a few more, having to do with giving directions to tourists, but they were truly terrible. 

 

Hoping to supplement my own haphazard in-person research on Vermont jokes, I went to the web. The first URL listed by Google, “Vermont Jokes” at Jokes4Us.com,[iv] proves a dead end. Not one has ever been told within the state.[v] They simply demean the state, and most are generic, applicable to any state (or college, city, etc.). For example, 

 

“Q: Did you hear about the fire in the University of Vermont’s football dorm that destroyed twenty books?

“A: The real tragedy was that fifteen hadn’t been colored yet.”

 

Since the University of Vermont gave up football in 1974 – the only flagship state school ever to do so – it’s safe to say that this joke has never even been told about Vermont, let alone in the state. 

 

At another website, I did come upon one possible Vermont joke that was new to me: 

 

A flatlander was visiting Vermont and stopped at a farm stand to buy some apples. As he stood in the barnyard talking with the farmer, a three-legged pig walked by. “I’ve never seen a three-legged pig,” said the tourist. “How’d he get that way?”

"Last summer I was out plowing and my tractor overturned and pinned me,” the farmer replied. “That pig came running and all by himself dug the dirt out from around my head and then ran back to the farmhouse and got help. That pig saved my life!"

"But how'd he lose the leg?"

"Well," said the farmer, "a pig that good you don't eat all at once." 

 

Further research reveals versions of that joke in many other locales, however, including England, “the country,” and even Australia. In the Australian version, the pig is even more amazing: it herds the farmer’s sheep, milks the cows, collects the eggs, and even does the farmer’s taxes!

 

So I concluded it’s not a Vermont joke. 

 

Also, I made up a Vermont joke myself: “I’ve been stealing my neighbor’s maple sap,” said Tom, surreptitiously. But my native Vermont friends assured me it was not really a Vermont joke, indeed that no Tom Swiftie or other pun could ever be a Vermont joke. 

 

However, if you know a Vermont joke or two – real ones – please send them to me. Eventually they’ll wind up on my website. 

 

I did substantial research – of sorts – and I’ve reached the end of this essay, but I’m no closer to understanding why some states develop state jokes while others don’t. I did enjoy collecting the jokes, though. Maybe you enjoyed them too? 

 

[i] Usually “his,” not “hers,” I’m sorry to report.

 

[ii] Back when the Free Press had reporters. 

 

[iii] Back when the Free Press wrote stories.  

 

[iv] http://www.jokes4us.com/miscellaneousjokes/worldjokes/vermontjokes.html.

 

[v] I realize this is an impossibility theorem, but I stand by it. You read them. All of them demean the state and most are generic, to be applied to any state (or college, city, etc.). As well, few are funny.

 

 

 

The Roundup Top Ten for September 4, 2020

The Pope, the Jews, and the Secrets in the Archives

by David I. Kertzer

Newly available Vatican documents, reported here for the first time, offer fresh insights into larger questions of how the Vatican thought about and reacted to the mass murder of Europe’s Jews, and into the Vatican’s mindset immediately after the war about the Holocaust, the Jewish people, and the Roman Catholic Church’s role and prerogatives as an institution.

 

What's Next for Abortion Law?

by Mary Ziegler

Thinking historically about the abortion debate shows a shift in the ground of conflict from questions of rights to questions of restriction. The debate has always been about how the costs and benefits of childbearing are shared in society.

 

 

Even After Their Fearmongering Proves Wrong, Republicans Keep At It. Here’s Why.

by Lawrence B. Glickman

Trump's politics of fear are an extension of reactionary political rhetoric that dates back to the effort by right-wing leaders to argue that New Deal programs would lead to totalitarianism. 

 

 

Gerald Ford Rushed Out a Vaccine. It Was a Fiasco

by Rick Perlstein

If steady, mature Gerald Ford succumbed to haste when his presidency was on the line, imagine what Donald Trump will do.

 

 

Looking Out For Each Other

by Leah Valtin-Erwin

The wrenching transitions that East Germans faced in adapting to western commercial culture after reunification offer lessons for the COVID crisis, and a warning that the burdens of managing social change and stress often fall on retail workers. 

 

 

Opening Up New Avenues to Understanding the Path to War in Iraq

by Joseph Stieb

National security historian Joseph Stieb reviews journalist Robert Draper's account of the drive to war against Iraq in 2003, concluding that Draper explains how the principals built a case for war out of selectively embroidered intelligence, but not why war appeared as a positive option or why much of the American political establishment got on board. 

 

 

Whose Anger Counts?

by Whitney Phillips

Many complaints about "cancel culture" depend on a false equivalency between left and right forms of internet argument that ignores the nature of far-right online harassment as a tool of power. 

 

 

The Real Suburbs: Unpacking Distortions and Truths about America’s Suburbs

by Becky Nicolaides

A leading historian of American suburbs points to the fine-grained changes in the L.A. metro area that confound Donald Trump's 1950s version of the suburban dream. Do the suburbs he's pandering to even exist today? (Photo by author)

 

 

The History Of Racist Colonial Violence Can Help Us Understand Police Violence

by Sarah Olutola

Racial ideologies and practices of social control honed by colonial powers are present today in American police tactics and defenses of police abuse against communities of color. 

 

 

What Liberals Get Wrong About Work

by Michael J. Sandel

Michael Young, who coined the term meritocracy in the late 1950s—and who used it as a pejorative—observed four decades later: “It is hard indeed in a society that makes so much of merit to be judged as having none. No underclass has ever been left as morally naked as that.”

 
