The Weakness of Democracy

 

Donald Trump is the most dishonest and most ignorant president in living memory, perhaps in American history. With his disdain for fundamental elements of democratic practice, such as freedom of the press and separation of powers, he is a danger to our American democracy.

 

But his election and the continued support he receives from a significant minority of voters are themselves symptoms of weaknesses which seem to be inherent in modern democracy itself. When we extend our gaze beyond the US, we can more easily perceive that democracy often works badly. I am not talking about fake democracies, where there is voting but no choice, as in the Soviet Union and the states it controlled. Even in countries where there is real opposition and secret ballots, voting can produce terrible results.

 

Venezuela, currently suffering a constitutional and humanitarian crisis, appears to have a functioning democracy, but the system has been rigged in favor of Nicolás Maduro, the successor of Hugo Chavez. Physical attacks on and arrests of opposition leaders, banning of opposition parties, sudden changes in the date of the election, and vote buying helped produce a victory for Maduro in 2018.

 

Algeria is currently experiencing a popular revolt against the elected president Abdelaziz Bouteflika, who was first elected in 1999, when the five other candidates withdrew just before the vote. He was re-elected in 2004, 2009, and 2014, and had announced he would run again this year, until massive protests forced him to withdraw as a candidate. He is very ill and has not said a word in public since 2013. His power has been based on military control, corruption, voting manipulation, and extensive use of bribery to create supporters and discourage opposition. The rebels are calling for an overthrow of the whole system.

 

These two cases are exceptional: the illusion of democracy hid authoritarian reality where democracy had never achieved a foothold. Much more common over the past two decades has been a gradual decline of existing democracies across the world, a process which could be called autocratization. A recent study shows that gradual autocratization has weakened democracies, in places as diverse as Hungary, Turkey and India. By extending government control of media, restricting free association, and weakening official bodies which oversee elections, modern autocrats can undermine democracy without a sudden coup. The authors argue with extensive data that the world has been undergoing a third wave of autocratization covering 47 countries over the last 25 years, after the first two waves in the 1930s and in the 1960s and 1970s.

 

The efforts of would-be autocrats to maintain their power by restricting democracy discourage trust in democracy itself. Nearly three-quarters of voters in Latin America are dissatisfied with democracy, according to a survey in 18 countries by Latinobarómetro, the highest number since 1995.

 

This is the context for the current failures of democracy in the United States (Trump) and Great Britain (Brexit). What can explain these failures? Physical coercion of political opponents is nearly non-existent. Corruption and voter suppression certainly play a role, at least in the US, but probably not a decisive one. Voters were overwhelmingly free to choose. Why did so many make such bad choices? I believe that conservative politicians in both countries used carefully chosen political tactics to appeal to widespread voter dissatisfaction. Those tactics are fundamentally dishonest, in that they promised outcomes that were impossible (Brexit) or were not actually going to be pursued (better health care than Obamacare). White voters made uncomfortable by the increasingly equal treatment of women and minorities were persuaded that it was possible and desirable to return to white male supremacy.

 

Voters made poor choices, even by their own professed desires. There is a dangerous disconnect between the voting preferences of many Americans and their evaluations of American political realities. A survey by the Pew Research Center at the end of 2018 offers some insight into the fundamental weakness of American democracy. A wide bipartisan majority of 73% think the gap between rich and poor will grow over the next 30 years. Two-thirds think the partisan political divide will get wider and 59% believe the environment will be worse. Only 16% believe that Social Security will continue to provide benefits at current levels when they retire, and 42% think there will be no benefits at all. Nearly half say that the average family’s standard of living will decline, and only 20% believe it will improve. These are not just the views of liberals. 68% of Republicans say that no cuts should be made to Social Security in the future. 40% say that the government should be mostly responsible for paying for long-term health care for older Americans in the future.

 

Yet when asked about their top political priorities, Republicans offer ideas which don’t match their worries about the future. Their three top priorities for improving the quality of life for future generations are reducing the number of undocumented immigrants; reducing the national debt; and avoiding tax increases. The richer a Republican voter is, the less likely they are to want to spend any money to deal with America’s problems. Republicans with family incomes under $30,000 name more spending on affordable health care for all (62%) and on Social Security, Medicare and Medicaid (50%) as top priorities, while those with family incomes over $75,000 give these a much lower priority. 39% of poorer Republicans say a top priority is reducing the income gap, but that is true for only 13% of richer Republicans. Republican politicians follow the preferences of the richest Republican voters, but that doesn’t seem to affect the voting patterns of the rest.

 

Nostalgia for the “whites only” society of the past also pushes Americans into the Republican Party. About three-quarters of those who think that having a non-white majority in 2050 will be “bad for the country” are Republicans.

 

A significant problem appears to be ignorance, not just Trump’s, but also his voters’. Many are ignorant about the news which swirls around us every day. A poll taken last week by USA Today and Suffolk University shows that 8% of Americans don’t know who Robert Mueller is.

 

But much of the ignorance on the right is self-willed. Only 19% of self-identified Republicans say the news media will have a positive impact in solving America’s problems. Only 15% are “very worried” about climate change and 22% are not worried at all. Despite the multiple decisions that juries have made about the guilt of Trump’s closest advisors, one-third of Americans have little or no trust in Mueller’s investigation and half agree that the investigation is a “witch hunt”. Despite the avalanche of news about Trump’s lies, frauds, tax evasions, and more lies, 27% “strongly approve” of the job he is doing as President, and another 21% “approve”. 39% would vote for him again in 2020.

 

Peter Baker of the NY Times reports that “the sheer volume of allegations lodged against Mr. Trump and his circle defies historical parallel.” Yet the percentage of Americans who approve of Trump is nearly exactly the same as it was two years ago.

 

Ignorance and illogic afflict more than just conservatives. The patriotic halo around the military leads Americans of both parties to political illusions. 72% of adults think the military will have a positive impact on solving our biggest problems, and that rises to 80% of those over 50.

 

The British writer Sam Byers bemoans his fellow citizens’ retreat into national pride as their political system gives ample demonstration that pride is unwarranted. His words apply to our situation as well. He sees around him a “whitewash of poisonous nostalgia”, “a haunted dreamscape of collective dementia”. He believes that “nostalgia, exceptionalism and a xenophobic failure of the collective imagination have undone us”, leading to “a moment of deep and lasting national shame”.

 

One well-known definition of democracy involves a set of basic characteristics: universal suffrage, officials elected in free and fair elections, freedom of speech, access to sources of information outside of the government, and freedom of association.

 

We have seen some of these attributes violated recently in the United States. Republican state governments have tried to reverse electoral losses by reducing the powers of newly elected Democratic governors. Trump, following the lead of many others, has urged Americans to ignore the free press and to substitute information that comes from him. Many states have tried to restrict the suffrage through a variety of tactics.

 

Across the world, democracy is under attack from within. Winston Churchill wrote, “it has been said that democracy is the worst form of Government except for all those other forms that have been tried”. Unless we want to try one of those other forms, we need to fight against autocratization, at home and abroad.

A Heartwarming Lost Chapter on Immigrants Emerges

 

From 1907 to 1911, the U.S. Government sponsored the Galveston Movement, a massive effort to direct arriving Jewish immigrants away from New York City and other East Coast ports, all overcrowded with immigrants and their families, and bring them to cities on the Gulf of Mexico, primarily Galveston, Texas. The government wanted to populate the Southern and Western states with immigrants as well as the Atlantic Seaboard. At first, it worked. Nearly 1,000 immigrants, mostly Jewish people fleeing from the pogroms of Russia at that time, moved to the Galveston area and were gradually assimilated into the city and its suburbs. The movement had problems, though. Jews refused to work on Saturday, annoying their employers. There was some anti-Semitism. Low paid Texas workers complained that the Jews took their jobs. There was not a big Jewish community in Galveston to embrace newly arrived Jews, as there was in cities like New York. The new arrivals encountered many of the same problems that confront Jewish, and other, immigrants today. The movement shut down in 1912.

Among those Jews who did move to Galveston was Russian-born Haskell Harelik, who spent his life there. His story is now being told by his grandson, playwright Mark Harelik, in The Immigrant. It is a heartwarming, engaging and thoroughly lovable story not just about the Jews, but the Texans who befriended them and, like so many Americans, helped them to become Americans themselves. The play just opened at the George Street Playhouse in New Brunswick, N.J.

The play starts with an impressive display of dozens of huge black and white photos of the Galveston immigrants and what their lives were like in those years. Black and white pictures appear from time to time in the play, at just the right times, too, to help tell the story. As the play proper begins, we meet Haskell Harelik. Harelik, a charming, affable young man, arrived in Galveston by ship in 1909, leaving his parents behind in Russia. He had nothing. Milton and Ima Perry, a Galveston couple, take him in, renting him a room in their house, and help him start a banana sales business that he runs out of an old wooden cart.

Young Harelik, a hard worker, soon builds the banana trade into a produce store and then a larger dry goods store. His wife arrives from Russia to join him and they have three children. They become patriotic Americans, and his three sons all fight for the U.S. in World War II.

Harelik has his struggles, though. Angry residents of a nearby town shoot at him when he visits there with his banana cart. Others scorn him. Many ignore him. Eventually, though, he succeeds.

Playwright Harelik does not just tell his grandfather’s personal story in The Immigrant; he tells the story, in one way or another, of all immigrants. They all faced the same difficulties upon arrival in America and, in some way, overcame their problems and were assimilated. This is a story of triumph, not just for Harelik, but for all the immigrants who came to America over all the years. It is a reminder, too, to those on both sides of the immigration wars today, that the entry of foreigners into America, however they got here, was always controversial.

There are wonderful scenes in the play, such as those when a thrilled Harelik carries his newborn babies out of his house and lays them on the ground so that they become part of America. Then, years later, he names his baby after his friend Milt, and Milt happily carries him out of the house and lays him on the ground.

There is the story of the first Shabbat, the Jewish Sabbath, when Haskell and his wife invite Milt and his adorable wife Ima to their home.

It is the story of Milt and Ima Perry, too. One of their two children died quite young and the other ran away from home and was rarely heard from. They battle each other, the Hareliks, and the townspeople from time to time, as we all do. Their story is the story of Texans, and Americans, embracing, with problems, these new immigrants.

The play succeeds, mostly, because of the mesmerizing acting of Benjamin Pelteson as Haskell. He is funny, he is sad, he is exuberant. You cheer for him and cry for him.  Director Jim Jack, who did superb work on the drama, also gets outstanding performances from R. Ward Duffy as Milt, Gretchen Hall as Milt’s wife Ima, and Lauriel Friedman as Haskell’s wife Leah.

There are some gaps in the play. We don’t know if Harelik spoke English when he arrived in Galveston or whether he learned it here. We know very little about the story of his wife Leah or the troubles his kids might have had in school. All of that, of course, would require a 450-hour play. The drama in this one is good enough.

Haskell and his family were assimilated into Galveston life, his business did succeed and they made friends. It was an American dream for them.

PRODUCTION: The play is produced by the George Street Playhouse. Scenic Design: Jason Simms, Costumes: Asta Bennie Hostetter, Lighting: Christopher J. Bailey, Sound: Christopher Peifer, Projection Design: Kate Hevner. The play is directed by Jim Jack. It runs through April 7. 

    

What Does William Barr Have to Do With Iran-Contra?

Donald Trump’s nomination of William Barr to become attorney general has recast the spotlight on the presidency of George H.W. Bush. Barr served as attorney general in the Bush administration from late 1991 to early 1993. Most notably, Barr railed publicly against a long-running independent counsel investigation of the Reagan-Bush administration, and he fully supported President Bush’s last-minute pardon of Caspar Weinberger, Reagan’s former defense secretary. Weinberger had been indicted on five felony charges, including accusations that he obstructed federal investigations and lied to Congress about the Iran-Contra affair.

In the wake of Bush’s recent death, innumerable editorials have heaped praise on the late president for his prudent and polite leadership. Far too little attention has been paid to his role in the Iran-Contra scandal.

No writer has been more generous to Bush than journalist Jon Meacham, the author of Destiny and Power: The American Odyssey of George Herbert Walker Bush. In a New York Times editorial assessing Bush’s legacy, Meacham lauded the nation’s forty-third vice president and forty-first president for being especially principled and pragmatic; a leader whose “life offers an object lesson in the best that politics…can be.” Bush, Meacham noted admiringly, saw politics as a noble pursuit, a means to faithfully serve the public, “not a vehicle for self-aggrandizement or self-enrichment.”

But the history of Bush’s involvement in the Iran-Contra scandal is not one of nobility and virtue. The object lesson, in fact, is that even our most revered leaders are fallible human beings subject to making unethical decisions out of misdirected loyalties or self-preservation. 

There is no doubt that Bush, as a loyal vice president, was aware of and endorsed the Reagan administration’s covert policies in the Middle East and Central America. Specifically, he knew of the illicit program of selling arms to Iran, a U.S.-designated terrorist state, in hopes of recovering American hostages in Lebanon. And he knew of the illegal program of supplying aid to the Contra rebels in Nicaragua. Years later, when running for reelection as president, Bush admitted to his diary that, “I’m one of the few people that know fully the details [of Iran-Contra]….It is not a subject we can talk about.”

It is also clear that Reagan and his senior staff, Bush included, understood that the Iran and Contra programs were illegal. At one point, in regard to the arms-for-hostages initiative, Reagan informed his advisers that he would risk going to prison because the American people would want him to break the law if it meant saving the lives of hostages. “They can impeach me if they want,” Reagan said, and then he quipped “visiting days are Wednesday.”

Shortly after the Iranian weapons deals became public, Bush tried to distance himself from the Iran-Contra scandal by telling reporters that it was “ridiculous to even consider selling arms to Iran.” Knowledge of Bush’s involvement could jeopardize his plans to succeed Reagan. Such deceptive maneuvering was galling to Reagan’s secretary of state, George Shultz, who knew all too well that Bush had supported the Iran project. Shultz told a friend: “What concerns me is Bush on TV,” because he risks “getting drawn into a web of lies….He should be very careful how he plays the loyal lieutenant.”

Bush did become president and his eventual pardon of Weinberger, just weeks before leaving office, was not an act of virtuous public service; even Reagan had refused to grant pardons to those involved with Iran-Contra. Bush’s decision was a self-serving one as a trial examining Weinberger’s role in Iran-Contra, including the administration’s orchestrated cover-up, risked exposing the outgoing president’s complicity.

Hearing of Weinberger being pardoned, Judge Lawrence Walsh, the independent counsel investigating Iran-Contra, issued a statement of condemnation: “President Bush’s pardon…undermines the principle that no man is above the law. It demonstrates that powerful people with powerful allies can commit serious crimes in high office—deliberately abusing the public trust without consequence."

Among the lessons of Iran-Contra is that a healthy democracy must have robust checks on executive authority in order to minimize abuses of power. A quarter century ago, the president’s attorney general, William Barr, staunchly opposed the independent counsel’s investigation of wrongdoing in the White House, and he also firmly supported Bush’s use of pardons as a means of self-protection. Are we to believe that Barr’s relationship with President Trump will be any different? 

 

If you enjoyed this piece, be sure to check out Dr. Matthews’ forthcoming book: 

Is Shoeless Joe Jackson Innocent? The Black Sox Scandal 100 Years Later

 

What do Pete Rose, Rob Manfred, Barry Bonds, and Ted Williams have in common? Why, Shoeless Joe Jackson, of course. 

Major League Baseball has had its share of controversies and scandals, but perhaps none has had a more lasting impact than the Black Sox Scandal of 1919. At the center of that legacy is Shoeless Joe Jackson, the legendary outfielder for the Chicago White Sox. He was arguably the best player in baseball at the time and remains one of the game’s greatest hitters, with the records to prove it. He also remains permanently banned from professional baseball and therefore ineligible for the Hall of Fame. The whys and wherefores of his banishment have stirred the passions of countless fans for the last one hundred years.

Gambling is at the heart of the Black Sox story. Eight White Sox players conspired with gamblers to throw the World Series, which Cincinnati won in game 8 (the Series was 9 games that year). That is not in doubt. What continues to be questioned is Jackson’s role in the conspiracy. When did he know about it? Was he in on it? What did he do about it? Did he take money for it? Did he field and hit poorly in order to lose or did he play his heart out? The answers to those questions are not the subject of this article. Instead, I investigate why Joe was banned and how his legacy still shapes baseball today. 

The rules against gambling sprang from the Black Sox Scandal and are clearly posted in every professional clubhouse in the land: “Any player, umpire, or club or league official or employee who shall bet any sum whatsoever upon any baseball game in connection with which the bettor has a duty to perform, shall be declared permanently ineligible.” Pete Rose, who has the most hits in baseball history, clearly broke that rule and has also been declared ineligible for the Baseball Hall of Fame. Yet many still argue that Rose should be allowed into the Hall of Fame. Fans often debate who has a better case for reinstatement: Joe Jackson or Pete Rose?

Joe Jackson broke no rule. In fact, it might be argued that gambling was the national pastime in 1919 (it might still be argued that gambling is our national pastime). Gamblers often greased a player’s palm in exchange for inside dope on who was hurt, who was drinking too much, anything that would help solidify the bet. The owners knew it, which is why the White Sox owner, Charles Comiskey, wasn’t that concerned when he heard rumors that the fix was in. After the owners elected Kenesaw Mountain Landis baseball’s first commissioner in 1921, gambling was banned, but that was two years after the 1919 scandal. Shoeless broke no rule. Pete Rose broke the rules, plain and simple. That alone gives Jackson the better case for reinstatement.

More importantly, Joe was only alleged to have broken the law and was never convicted. In the 1921 jury trial, “The Eight” were found not guilty. And when Joe sued Comiskey for back pay, a 1924 jury awarded it to him, finding him not guilty of the gambling conspiracy. How, then, did he come to be banned from baseball?

The answer goes to another part of Joe’s legacy: the autocratic power of baseball’s commissioner. Landis, a former judge, would not take the job unless he had absolute power when making decisions. The owners gave it to him. One need look no further than his ruling: “Regardless of the verdict of juries, no player who throws a game...will ever play professional baseball.” Imagine being able to act “regardless of the verdict of juries!” Still today, Rob Manfred, the current Commissioner of Baseball, has almost unlimited power to investigate and issue punishment for any practice or transaction he believes is “detrimental to the best interests of baseball.” He owes that power to the legacy of Joe Jackson and the “eight men out.” 

“Cheating” has become a modern-day equivalent of “gambling.” And the question of Jackson’s banishment has also impacted the conversation about whether Barry Bonds, Roger Clemens, Mark McGwire, or others who used steroids should be voted into the Hall of Fame. Unlike Shoeless, they are not banned from baseball; the sports writers could vote them into the Hall. It has become a question of character. Should the writers—and by extension, the fans—consider only the baseball statistics, or should the morality of what the players did be considered? Bonds, to take one example, had his obstruction of justice conviction overturned. Like Shoeless, he has never been convicted of anything. Nevertheless, the writers have refused to vote him in, his highest share of the vote, 56%, falling well short of the necessary 75%. 

Judge Landis certainly considered the morality of Joe Jackson when he banned him from professional baseball. Is the shadow of Joe’s banishment lingering in the minds of today’s sports writers when they refuse to vote into the Hall any otherwise eligible player credibly accused of using steroids to enhance his performance on the field?

Among die-hard baseball fans, no one question elicits more “discussion”—i.e. argument—than that of “Who was the greatest hitter of all time?” Was Teddy Ballgame better than the Babe? How about Ty Cobb vs. Tony Gwynn? Aaron, Rodriguez, or Bonds, or Joltin’ Joe DiMaggio or Stan Musial? Or perhaps some “ancients” like Ed Delahanty, Dan Brouthers, Cap Anson? Joe Jackson could outhit them all, some say. In almost any discussion of hitting, in fact, the name Shoeless Joe Jackson usually arises. Many consider him to be the best “natural” hitter of all time, with a swing so perfect that no one could match it. Both Babe Ruth, who patterned his swing after Jackson’s, and Ty Cobb expressly said just that. 

Jackson played in the “dead ball” era of baseball, where one baseball was used for an entire game, if possible, and his lifetime batting average of .356 stands third of all time. Had he played in the “live ball” era, where new balls were frequently inserted into the game and scuffed balls disallowed, there is no telling what average he could have hit for. In any event, he is on almost everyone’s list of top hitters and to this day is one of the gold standards of hitting when fans discuss the “best of all time.”

But perhaps the greatest legacy of Shoeless Joe and the Black Sox Scandal of 1919 is simply this: we’ll never know exactly what happened one hundred years ago and that gives baseball lovers the chance to do what they love best: argue. 

Shoeless hit .375 in the series, had 12 base hits, a record not broken until 1964, committed no errors, threw out a runner at the plate. 

Oh yeah? His average in the games they lost was only .286. And what about the $5,000? Joe said he tried to give it back to Comiskey. 

Oh yeah? How come he made no mention of that in his grand jury testimony? He knew about the fix, he should have done more to stop it. 

He tried to, he asked his manager to bench him. 

Oh, yeah? Prove it!

My novel takes place in 1951 and uses flashbacks to describe the Scandal. Ultimately, I had to decide for myself whether Joe was innocent or not. My answer turns on the question of character. Do I believe Jackson deserves to be reinstated and then voted into the Hall of Fame? You’ll have to read the novel. Although he probably wouldn’t have wanted it this way, the wonderful legacy of Shoeless Joe is that he’ll never have a last at-bat.

To read the author's latest book on Shoeless Joe, click below!

An American Socialite and the Biggest British Constitutional Crisis Since Henry VIII's Divorce

 

It was the biggest constitutional crisis since Henry VIII divorced Catherine of Aragon – except this time everyone agreed who the villain was. On December 10, 1936, Edward VIII renounced what Winston Churchill called “the greatest throne in history,” giving up an empire of 500 million people, to marry the twice-divorced American socialite, Wallis Simpson. Ever since, this Baltimore native has been blamed as the wicked witch who almost derailed the British monarchy. Throughout the intervening decades, we have been overfed a diet of such fantastical slander about Mrs. Simpson that it has become impossible to discern the real woman. Wallis has been written off as a seductress, a gold-digger, a Nazi sympathizer and seen as a cold, ambitious bitch who schemed from the outset in the hopes of becoming Queen of England. She has become a caricature of villainous womanhood.

History is mostly perceived from the perspective of his-story. But what about her story? This Women’s History Month, Wallis Simpson is an important woman to revisit and uncover her real history. 

It was the unholy Trinity of the Church, the Palace, and Parliament, who did not want Edward VIII on the throne. They considered him weak and ill-disciplined and saw Wallis as the perfect excuse to rid England of a man they deemed unfit to rule. Far from the villain of the history books, Wallis was the victim of the abdication. She was undermined by a cunning powerful British establishment who sought to destroy and diminish her. Bright and perceptive, she soon realized that machinations to use her were underway. At the time of the abdication crisis, she wrote, “I became obsessed with the notion that a calculated and organised effort to discredit and destroy me had been set afoot.” She was right. She became the perfect pawn for the wily palace courtiers. 

That Edward did not conform to court life, preferring a vigorous and flamboyant social life to the grey strictures of monarchical duty, was tantamount to treachery in the eyes of his advisors. In 1927, courtier Tommy Lascelles told Prime Minister Stanley Baldwin of his violent disdain for the Prince of Wales: “You know, sometimes when I am waiting to get the result of some point-to-point in which he is riding, I can’t help thinking that the best thing that could happen to him and the country, would be for him to break his neck.” “God forgive me,” Baldwin replied. “I have often thought the same thing.” This conversation occurred seven years before Wallis Simpson met Edward Prince of Wales at a weekend house party in the British countryside.

When Edward fell in love with Wallis Simpson, no one could have predicted the strength of his obsession. At the time, she was happily married to her second husband, Ernest Simpson.  Many ask, why didn’t she break off her relationship with Edward, especially when he became King? Why did she divorce Ernest Simpson? Her detractors fail to acknowledge that she never wanted to divorce Ernest or to marry Edward. Initially, she was flattered by his attention. What woman would not have been beguiled by the prince’s “unmistakeable aura of power and authority?” Yet she never expected the infatuation to last. In 1935, she wrote to her beloved aunt, Bessie Merryman, “What a bump I’ll get when a young beauty appears and plucks the Prince from me. Anyway, I’m prepared.” 

It was Edward, then King, who forced her into an untenable position, refusing ever to give her up. At the time of the abdication, Edward slept with a loaded gun under his pillow and threatened to kill himself if Wallis forsook him. Aides described him as “exalté to the point of madness.” Wallis knew that her fate would be far worse if a beloved and popular King took his life because of her. In the name of Edward’s needy, obsessive love, Wallis paid the ultimate price: entrapment by a childish narcissist who threw the biggest tantrum in history when he could not have the two things he wanted most in the world — her and the throne. After he chose Wallis, the couple was devastated when the royal family closed ranks against them, forcing them into exile from Britain for the rest of their lives. Wise to this, Wallis wrote to Edward post-abdication, “It is the politicians whose game it is to build up the puppet they have placed on the throne. I was the convenient tool in their hands to get rid of you and how they used it!” 

During my research, in which I gained entrée into Wallis’s coterie of living friends, I listened with mounting incredulity and fury as they told me repeatedly of her kindliness, sense of fun and depth of friendship, which contradicted the public image of a hard-nosed, shallow woman. It was a revelation for me to discover what a warm, witty, loyal friend Wallis was. Her friends adored her. The Conservative MP Sir Henry “Chips” Channon said, “She has always shown me friendship, understanding, and even affection. I have known her to do a hundred kindnesses and never a mean act.” 

She was no saint – but she was far from a sinister manipulator. Her detractors continue to argue that she was a Nazi sympathizer and traitor, yet her friends and eminent historians, such as Hugo Vickers, Philip Ziegler, and the late Lord Norwich, are adamant that there is no concrete evidence of Nazi conspiracy. She did go with Edward to Germany to meet Hitler in 1937, but it was before the atrocities of the Second World War and only because Edward wanted his wife to experience the pomp and ceremony of a royal tour that was denied to Wallis in England. Edward was keen for this trip when the Germans agreed to his request that Wallis would be curtsied to and addressed as Your Royal Highness. This was blisteringly important to him, as he felt so aggrieved that the British royal family refused to give Wallis the crucial HRH title, even though she was legally entitled to this as the wife of the former King. 

When the world learned of the abdication, it recoiled in shock. How could this strange, angular-looking woman take a beloved King from his people, many wondered. Wallis received vicious hate mail. “It’s no exaggeration to say that my world went to pieces every morning on my breakfast tray,” she later wrote. Admirably, she schooled herself to survive what would have felled the hardiest of souls: “To be accused of things that one has never done; to be judged and condemned on many sides by the controlling circumstances; to have one’s supposed character day after day laid bare, dissected and flayed.” She succeeded with “a kind of private arrangement with oneself.” She knew who she was and her friends knew too. She learned to uphold what matters in life and to endure being a woman misunderstood and excluded on an international scale.

From the moment she was locked in the Faustian pact of marriage to the Duke of Windsor, she determined to make their marriage a success and to ensure that her husband was as happy as he could be, ousted by his family and exiled from his country. For thirty-five years, she triumphed in this endeavour, even as she endured psychological assassination from the entire world. Wallis Simpson was no ordinary woman. An inscrutable dignity gave her strength. She has been misunderstood and misinterpreted for far too long. This woman deserves our admiration for the situation she became embroiled in and what she subsequently had to endure. Most of all, she deserves for her reputation to be rehabilitated in the annals of history.

The Elon Musk of Global Crime and European Colonialism in Africa

 

Paul LeRoux, a renegade tech titan from southern Africa, introduced Silicon Valley-style entrepreneurship to transnational organized crime.  A pioneer in the field of cyber security, LeRoux broke bad and used his exceptional gifts to become the international criminal underground’s premier innovator. He dealt in arms, weapons systems, drugs, gold, natural resources, phony documents, bribery and contract murder, trading illicit commodities and favors with Iran, North Korea, the Chinese Triads, the Serb mafia, the Somali pirates, warlords, militias, terrorists and mercenaries of various nations. 

 

Two agents of the U.S. Drug Enforcement Administration’s elite, secretive 960 Group, part of the agency’s Special Operations Division, tracked him down, penetrated his inner circle, lured him to Liberia, and arranged to have him arrested and expelled to their custody on September 26, 2012. LeRoux promptly flipped and helped the DEA agents round up his hit men and his North Korean methamphetamine trafficking team. All have been convicted. The last to face justice was Joseph “Rambo” Hunter, a former U.S. Army sniper trainer and drill sergeant, who was sentenced in New York on March 7 to three life sentences plus 10 years for setting up the brutal murder of a Filipino woman targeted by LeRoux. LeRoux himself has pleaded guilty to arms and drug trafficking and other crimes and is incarcerated in New York, awaiting sentencing.

 

Paul LeRoux is the fruit of a poisonous tree. I could never think about him without thinking of the horrific history of European colonialism in Africa. I read deeply into histories of colonial Rhodesia and South Africa, where LeRoux was born and grew up. I thought of Kurtz, the corrupted, blood-soaked, self-exiled anti-hero of Joseph Conrad’s masterpiece, Heart of Darkness. “His soul was mad,” Conrad writes. “Being alone in the wilderness, it had looked within itself and, by heavens I tell you, it had gone mad.” It became clear to me that, for all his technological prowess, LeRoux has to be seen in the context of the history of southern Africa: a rapacious, swashbuckling profiteer trading in guns, drugs, gold, timber, false documents and human life.

 

He was born on Christmas Eve 1972 in Bulawayo, Rhodesia’s gritty, vibrant second city. He was the illegitimate son of a young white woman and her lover, both of British descent. A married white Rhodesian couple, Paul and Judith LeRoux, adopted him.  

 

Today, only 17,000 people of European extraction live in Zimbabwe, a landlocked nation of 150,000 square miles wedged between South Africa, Mozambique, Botswana, and Zambia. At the time of LeRoux’s birth, some 260,000 whites resided there, ruthlessly dominating, exploiting, and fearing the country’s 4.8 million blacks. Bulawayo, a precolonial tribal capital whose name meant “place of slaughter,” had blossomed into an industrial powerhouse and processing center for the region’s abundant metal ores, cattle, cotton, tobacco, and maize. The colony’s wealth cushioned it from international sanctions meant to force Prime Minister Ian Smith, an unyielding champion of white rule, to agree to a transition to majority—black—rule.

 

Three days before LeRoux was born, the colony’s long-simmering racial and economic disparities ignited into what whites called the Bush War and blacks called the War of Liberation. The civil war escalated as both sides engaged in hideous atrocities. The British historian Piers Brendon, in his 2008 book, The Decline and Fall of the British Empire, wrote:

 

…Guerrillas, some backed by China and others by Russia, crossed the frontier from Mozambique and Zambia to attack remote farmsteads, railways and roads….The guerrillas tried to enlist the native population, using terror tactics against anyone who resisted. Chiefs were regularly tortured and murdered. Schoolteachers were raped. Villages were looted and burned. Counter-insurgency measures were no less savage.… African cattle were seized or deliberately infected with anthrax. Captured combatants were given electric shocks, dragged through the bush by Land Rovers or hung upside down from a tree and beaten. 

 

Under pressure from the United Nations, Great Britain, and the United States, Smith reluctantly held elections. On March 4, 1980, guerrilla leader Robert Mugabe’s party won in a landslide.  The colony of Rhodesia disappeared from the map, replaced by the independent nation of Zimbabwe.

 

Liberation brought no peace. Mugabe launched a dirty war against tribal and political rivals. He created a 5,000-man Fifth Brigade, had it trained and equipped by North Korea, and dispatched it into the countryside to pillage, rape, torture, and slaughter. Between 20,000 and 80,000 people, mostly civilians, died.

 

The LeRoux family reportedly lived in the mining town of Mashaba, where Paul LeRoux the elder was a supervisor of underground asbestos mining in the enormous Shabanie and Mashaba asbestos mining complex, one of the largest and most hazardous mining operations in the world at the time. Mashaba meted out misery and early death to black asbestos miners, but it would have afforded an uneventful childhood to the son of a white overseer.

 

White privilege couldn’t survive Mugabe’s financial mismanagement, which launched the Zimbabwean economy into a tailspin and sent white professionals fleeing. The LeRoux family joined the white exodus in 1984 and landed in the grimy South African mining town of Krugersdorp, 540 miles to the south of Bulawayo. LeRoux’s father parlayed his knowledge of mining into work as a consultant to South African coal mines. LeRoux later claimed that his father developed an off-the-books sideline as a diamond smuggler and introduced his son to figures in the South African underworld.

 

Whites were at the top of the economic and social heap in South Africa. LeRoux, chunky and socially awkward, buried himself in his computer. He studied programming at a South African technical school, soon outstripped his classmates and his teacher, and developed a specialty in cyber security programming. In 1992, when he was twenty, he snagged his first job, working at a London-based information technology consultancy. He became a digital nomad, traveling between Europe, Hong Kong, Australia and the United States, setting up secure data systems for government ministries, corporations, law firms and banks.

 

 In 2002, when he was 30, he felt a vocation to entrepreneurship, in the mold of his South African contemporary Elon Musk. LeRoux’s innovations were always on the dark side – drugs, arms, smuggled gold, illegal timber, false documents, murder. As he accumulated wealth, he fell back on a mind-set he absorbed in Rhodesia—dig in hard, don’t spare the bullets, and be ready to move.

 

He bought bolt-holes – safe houses – throughout Africa and Asia, but he dreamed of returning to the place he was born. He paid a broker $12 million to pass to Mugabe so that he could acquire a plantation confiscated from white farmers. He was cheated: Mugabe never delivered. In 2009, he tried another tack, sending “Jack,” a European aide, to travel around the Zimbabwean countryside to search out a colonial-era villa with white, plantation-style columns, some acreage, and a “big, curvy driveway.” He evidently harbored a fantasy of the bygone colonial era, when white gentlemen planters, called “verandah farmers,” enjoyed idle days and debauched nights, sipping cool drinks on the broad front porches, observing from a distance the toils of the black farmhands, then toddling off at dusk for dinner and an orgy with other planters’ bored wives.

 

As a child in a whites-only school, he would have been taught the so-called Pioneers Myth, about intrepid English settlers taming the verdant, empty plain and carving out a civilization. It is highly doubtful a white schoolboy would have been told the truth: that Rhodesia was founded on and sustained by blood and lies. When the indigenous people rebelled in 1896, British troops and Rhodes’s militiamen exterminated them. Historian Brendon described scenes of horrific cruelty: British soldiers and settlers putting villages, grain stores and crops to the torch; slaughtering men, women and children; collecting trophy ears; and making their victims’ skin into tobacco pouches. In the famine that resulted, people were reduced to eating roots, monkeys, and plague-ridden cattle corpses. The streets of Bulawayo filled with emaciated refugees trying to escape to South Africa. 

 

LeRoux wasn’t interested in his homeland’s shameful imperial history, or Mugabe’s misrule.  

 

 “It was about getting what he wanted,” Jack said, “and if he had to do business with an evil person like Mugabe, then so be it, as long as he got his part off the deal. He didn’t care about the people. They were all monkeys to him.”  

 

To read more about Paul LeRoux, check out Hunting LeRoux: The Inside Story of the DEA Takedown of a Criminal Genius and His Empire by the author:

 

 

 

A House Once More Divided

 

Beginning with the Three-Fifths Compromise in the U.S. Constitution, United States history is filled with “compromises” intended to preserve a rough balance of power between slave-holding and free states. The Three-Fifths Compromise was followed by the Missouri Compromise of 1820 and the Compromise of 1850. These negotiations helped America delay war, but after the Kansas-Nebraska Act of 1854, further concessions meant not only preserving but expanding slavery.

 

The election of Abraham Lincoln outraged many in the South. Prior to the Civil War, South Carolina Governor Francis Pickens declared he “would be willing to cover the state with ruin, conflagration and blood rather than submit” to abolition.(1) After decades of compromise on the issue of slavery, South Carolina became the first state to secede. Ultimately Governor Pickens reached his goal, but before peace was restored, conflagration and blood truly covered South Carolina.

 

Lincoln’s campaign and election prompted a different response from activists like Joseph Medill, co-owner and editor of the Chicago Tribune. Medill was motivated by a desire to preserve the Union and emancipate slaves, and he felt a good newspaper must report stories in ways that advanced society. To him that meant abolishing slavery. Joseph became a key player in a new generation of abolitionist leadership. 

 

Public advocacy in the Tribune made Joseph a target. In 1860, while in Washington, D.C., he criticized concessionists, including Illinois Congressman William Kellogg. At the National Hotel, Congressman Kellogg attacked Joseph, landing blows to Joseph’s head and face. Kellogg had been appointed to the Committee of Thirty-Three of the U.S. House of Representatives, tasked with averting a civil war. Joseph described the assault in a letter to his wife, Katherine: “Wm. Kellogg started home in a hurry to Springfield to help beat Judd (2) for a place in the Cabinet. He is talking compromise. He [Kellogg] is a cowardly Republican and wants to back down. I quarreled with him." (3)

 

Joseph Medill and his partner, Dr. Charles Ray, used the pages of the Tribune to support the Lincoln administration and rally the public to the cause of emancipation. Joseph urged the swift organization of black regiments and broadcast the goals of the Union League of America (U.L.A.), a group established to promote loyalty to the Union. Joseph played a prominent role in Union League programs.(4) The U.L.A. supported organizations such as the United States Sanitary Commission and provided funding and organizational support to the Republican Party. 

 

Joseph’s early public calls for war turned to personal anxiety and grief when two of his younger brothers became casualties of war. Yet, he continued to support a war of liberation and pursue principles of freedom and self-government. Joseph provides a poignant example of moral imperative informing political activism.

 

Abraham Lincoln and supporters like Joseph Medill taught that politics must not violate human rights. Immoral behavior must never be subject to a majority vote. Robert Todd Lincoln explained his father’s views on democracy eloquently in 1896. “In our country there are no ruling classes. The right to direct public affairs according to his might and influence and conscience belongs to the humblest as well as to the greatest…But it is time of danger, critical moments, which bring into action the high moral quality of the citizenship of America.”(5)

 

 

People didn’t grasp the danger of a house divided then, and many fail to grasp it now, but history repeats itself in elusive, yet profound, ways. Today, the ugly specter of divided parties returns. No matter which party we align with, President Trump’s ability to divide us and willingness to condone violence should alarm us all. 

 

From the beginning of his campaign, Donald Trump used rhetoric to incite supporters, using baseless slurs to disparage immigrants (6) and political opponents. During the presidential campaign in March 2016, it seemed unlikely that Trump had enough votes at the Republican National Convention to secure his nomination. If that happened, Trump warned during an interview with CNN, “I think you would have riots.” (7) When President Trump wages verbal war with the intelligence community and independent sources of investigation, he provokes divisions that threaten to become an “irrepressible conflict,” echoing the pre-Civil-War rancor. If Americans don’t reject politicians who divide us, condone violence, label a group of people as criminal, and another group enemies of the people, we do so at our own peril.

 

Once again we face dilemmas that require as much of us as any time in the nation’s past. Modern Americans tend to take our stable democracy for granted, but Mr. Lincoln realized the freedoms gained in the Revolution could be lost. He enlisted newsmen like Joseph Medill to champion justice and liberty. Lincoln understood that involved citizens preserve the union, and he taught a vital lesson that only when human rights are respected is democracy worth preserving.

 

(1) Orville Vernon Burton, Age of Lincoln (New York: Hill and Wang, 2007), 118.

(2) Longtime Lincoln friend and supporter Norman Judd did not receive a Cabinet post but was named Minister to Prussia.

(3) Georgiann Baldino, ed., A Family and Nation Under Fire (Kent: Kent State University Press, 2018), 25.

(4) Robert McCormick’s papers in the McCormick Research Center at the First Division Museum, Medill Family Correspondence.

(5) Speech of the Hon. Robert T. Lincoln made at the Celebration of the Thirty-eighth Anniversary of the Lincoln-Douglas Debate, Galesburg, Ill., October 7, 1858 (Hancock, NY: Herald Print, 1921), 2.

(6) Jennifer Rubin, “Most Americans agree: President Trump is divisive,” Washington Post, January 17, 2018, accessed March 11, 2019, https://www.washingtonpost.com/blogs/right-turn/wp/2018/01/17/most-americans-agree-president-trump-is-divisive/?noredirect=on&utm_term=.e89c9aaeb8b0

(7) Jonathan Cohn, Politics, HuffPost, June 9, 2016 (updated June 16, 2016), accessed March 11, 2019, https://www.huffingtonpost.com/entry/worst-trump-quotes_us_5756e8e6e4b07823f9514fb1

What I’m Reading: An Interview With Environmental Historian Eleonora Rohland

 

Eleonora Rohland is Assistant Professor for Entangled History in the Americas (16th-19th centuries) at Bielefeld University, Germany. Rohland was trained as an environmental historian at the University of Bern, Switzerland. She received her PhD from the Ruhr-University of Bochum, Germany, in 2014 and was a doctoral fellow at the Institute for Advanced Study in the Humanities Essen (KWI) from 2008-2014. Her research was supported by the Swiss Study Foundation, as well as by the German Historical Institutes in Washington and Paris. Her MA as well as her PhD thesis were awarded prizes. Rohland is the author of two books, Sharing the Risk: Fire, Climate and Disaster. Swiss Re 1864-1906 (Lancaster, 2011) and Changes in the Air: Hurricanes in New Orleans, 1718 to the Present, which just appeared in the Rachel Carson Center (RCC)’s series Environments in History: International Perspectives (Berghahn Books) in 2018. With her third book project, tentatively entitled Encountering the Tropics and Transforming Unfamiliar Environments in the Caribbean, 1494 to 1804, her research focus moves geographically from the U.S. Gulf Coast into the Caribbean (Hispaniola and Jamaica).

 

What books are you reading now?

 

Related to the courses I am going to teach next semester (starting in April in Germany), I am reading Origins: How the Earth Made Us by Lewis Dartnell and Energy and Civilization: A History by Vaclav Smil. I am also very interested in the neurology of the creative process, partly for my own writing, but also in relation to teaching, so in this context I am reading Imagine: How Creativity Works by Jonah Lehrer. And I just love the humor and voice in Ursula Le Guin’s A Wave in the Mind: Essays on the Writer, the Reader, and the Imagination. Her texts are beautiful, profound and inspiring.

 

What is your favorite history book?

 

It’s hard to mention just one… Les Paysans du Languedoc (The Peasants of the Languedoc) by Emmanuel Le Roy Ladurie; Mosquito Empires by John McNeill; Round About the Earth by Joyce Chaplin.

 

Why did you choose history as your career?

 

I think there’s a difference between the career aspect and my personal relationship to the field of history. At the present state of academia (and I think that is true for the U.S. as well as for Germany and Switzerland), it’s not so much a question of my choice, but of whether you get lucky and the system chooses you. But even beyond that, it was not a straightforward choice, though on my father’s side, I come from a family of historians. That doesn’t mean my career was in any way predetermined, it’s rather that, I guess, historical thinking has been part of my upbringing, and I was lucky to have a great history teacher in school who managed to foster that already existing interest. Also, I was lucky to have come to the field of environmental and climate history early during my undergraduate studies with Christian Pfister at the University of Bern (Switzerland), one of the pioneers of climate history. The way he taught Annales School-style history at his department of economic-, social- and environmental history was very comprehensive, geared towards understanding macro-scale historical connections while not losing sight of the micro-developments, and it was always related to present-day questions. 

 

During research for my MA thesis I realized that I really loved working with archival materials, the materiality and aesthetics of it intrigued me. The detective work that is involved in the research process intrigued me. And that connection only deepened during research for my doctoral thesis (though of course neither of these research projects was always pleasurable and easy). History is a fascinating subject that allows you to see so many levels and facets of human existence – the very light and the very dark – at different times and in different places. And I would say I am studying and teaching history most of all, to understand how we’ve got where we are today in our current, troubled era of unprecedented global change.

 

What qualities do you need to be a historian?

 

An open mind. Curiosity. Imagination. Inquisitiveness. A very healthy dose of skepticism towards anything already written. Persistence. Meticulousness. Frustration tolerance. Self-criticism. Patience.

 

Who was your favorite history teacher?

 

My history teacher at high school, Jürg Düblin. He had a wonderful sense of humor and always spiced up our history lessons with jokes and references about current events. I went to high school during the Clinton era, so of course there was ample opportunity to crack jokes about the Lewinsky affair. He also taught us to see the interconnectedness and entanglement of historical processes, and to accept and welcome complexity. 

 

And Christian Pfister, now Professor Emeritus at the University of Bern. Early in my undergrad studies I took his seminar on the history of disasters, which became my starting point into environmental and climate history.

 

What is your most memorable or rewarding teaching experience?

 

Again, it’s difficult to point out a single moment or course. I would rather say more generally that it’s most rewarding when the student group and I manage to create an atmosphere in which the students feel comfortable enough to really ask fundamental questions and in which they start discussing among themselves, without me having to guide much. This is not a situation I can create at will, it’s a co-creation between students and teacher, that depends on the make-up of the student group and on how they interact with the subject of the course. 

 

What are your hopes for history as a discipline?

 

That it fully absorbs the profound implications that the Anthropocene has for the discipline. Dipesh Chakrabarty in his 2009 “The Climate of History: Four Theses” clearly and brilliantly laid out how the fact that humans were now shaping the earth’s climate and other realms of the ecosystem spelt the end of the separation between human and natural history. Chakrabarty himself and others (Julia Adeney Thomas, John McNeill, Amitav Ghosh, and Franz Mauelshagen, to mention just a few) have since elaborated on this new perspective, and I am basing my hopes and opinion on their work. 

 

History needs to get comfortable with deep time scales that reach beyond the written record; that is, it needs to become more interdisciplinary, to connect more, and more naturally, with related disciplines such as anthropology and archaeology to include artefactual evidence alongside the written record; and with the natural sciences in order to understand environmental and climatic aspects concerning past societies. But even beyond the immediate theoretical and methodological implications of the Anthropocene, and more focused on the current, worrying changes in political arenas around the globe, I think history as a discipline needs to renew itself, needs to reassert or renegotiate its place in society, needs to be vocal on political abuses of terminology or populist and racist reinterpretations of historical events.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I own a facsimile of the Nuremberg Chronicle (the Schedelsche Weltchronik) and the first edition of Jacob Burckhardt’s The Greeks and Greek Civilization (Griechische Kulturgeschichte, 1898), which was edited by my great-great-grandfather, Jacob Oeri.

 

What have you found most rewarding and most frustrating about your career? 

 

I see it as a great privilege to be able to teach what I research and to work with students in general, thinking about ways to help them find what sparks their interest and passion. The historical profession unites many of the activities and themes I am passionate about myself (researching, teaching, writing, books, discussing, conceptualizing projects), and I consider myself very fortunate to be working as a professional historian.

 

One of the more frustrating experiences on this career path is – and this probably applies more to Germany, where environmental and climate history are completely under-institutionalized and where, at the same time, one has to acquire project funding from national or EU funding bodies to do research – that funders have not yet realized that the discipline of history has substantial contributions to make to questions of sustainability, climate change and societal change. Consequently, funding formats that deal with these topics are never made with historians (or the humanities) in mind. On the other hand, environmental history is still seen as somehow narrow, restricted to the history of political movements since the 1950s, even among historians. So, as an environmental historian in the German-speaking countries of Europe, one is kind of wedged in between those two (mis-)perceptions, and it will take some work to get out of this situation…

 

How has the study of history changed in the course of your career?

 

Well, my career has not been very long yet, but I would say that when I started studying history in the early 2000s, the cultural turn was in full swing and its methodologies were becoming mainstream in history. By now, however, it has reached a sort of dead end. As far as I can see, no renewal or alternative is yet in sight, though one is urgently needed – be it through taking the Anthropocene’s implications into account or through other important aspects related to current events. So, I guess this observation connects to what I said above about my hopes for the discipline.

 

What is your favorite history-related saying? Have you come up with your own?

 

“The past is a foreign country: they do things differently there.” By L.P. Hartley, from The Go-Between.

 

To me this saying perfectly expresses my fascination with history. In our everyday relationships we tend instinctively to look for similarity. In history, and in particular in early modern history, the challenge is to understand as far as possible the foreignness of those worlds, to embrace the feeling of being estranged.

 

What are you doing next?

 

I am writing a short publication on “Entangled History and the Environment,” which is an offshoot of my third book project on the socio-environmental transformation of the island of Hispaniola, from Columbus to the Haitian Revolution.

 

And I am preparing two courses on resource and energy history in the Americas.

Political Leadership and the Need for a New American Story

Franklin Delano Roosevelt delivering a "Fireside Chat" to the American people.

 

I recently came across Political Humor (1977) by Charles Schutz, and it detailed what a great storyteller Abraham Lincoln was. That got me thinking, “What other notable presidents were also good storytellers?” “How are stories and storytelling related to effective political leadership?” “Do we need a new American Story?”   

I checked out Franklin Roosevelt and found that historian Cynthia M. Koch wrote that FDR used heroic individuals like Washington and Lincoln “to tell stories that would unite people and provide comfort, courage, reassurance, and inspiration to Americans facing fear, hardship, uncertainty, and war.” The FDR Foundation added that “to heal a wounded nation,” FDR relied “in no small part” on “storytelling . . . to tap into humankind’s primeval need to understand issues not only in intellectual terms, but on an emotional level as well.”

This sounded much like Lincoln. At a cabinet meeting he once said, “I don’t propose to argue this matter because arguments have no effect upon men whose opinions are fixed and whose minds are made up.” Instead, he told a story to illustrate his point. 

Both great presidents possessed an acute understanding of the common people. Lincoln’s came partly from his origins among common frontier folk, but the aristocratic Roosevelt, who campaigned extensively throughout the nation, “principally relied on his feel for public mood to guide him in leading the country.” Yet both presidents understood that to reach people, to motivate them, to win them over, a president had to appeal to their emotions, and storytelling was one way of doing so. Carl Sandburg, who wrote a six-volume biography of Lincoln and was a strong FDR supporter, saw numerous parallels between the two presidents, especially their attunement to the will of the American people.

In recent years much has been made of the great U.S. political divide. In a previous article, I mentioned that Donald Trump appeals to the anti-intellectual strain in American life earlier highlighted by historian Richard Hofstadter’s Anti-Intellectualism in American Life (1962). Liberals, progressives, and even some thoughtful conservatives bemoan the anti-intellectualism of Trump supporters. “Why do they believe the steady spew of lies told by Trump and all his talk of ‘fake news’?” “How can they deny the scientific consensus on climate change?”

The answer is simple. Our problem is that we keep forgetting it: Most people’s politics are not based on reason or rationality. This was a good part of the message of former HNN editor Rick Shenkman’s book Political Animals, as well as earlier ones like Predictably Irrational and The Righteous Mind.

Supporters of former President Obama are especially prone to forgetting this message, for he spoke on an intellectual level more than most politicians. In 2009, former presidential adviser and television journalist Bill Moyers stated that in the medical care debate Obama “didn't speak in simple, powerful, moral, language. He was speaking like a policy wonk.” In 2011, a similar criticism came from historian John Summers, who wrote that Obama paid insufficient attention to the irrational and emotions, and that conservatism was better at recognizing that “successful politicians tapped into the collective unconscious of voters, controlling their perceptions.” Liberalism, however, as Lionel Trilling wrote in Liberal Imagination (1950), “drifts toward a denial of the emotions and the imagination.”

Not coincidentally, Junot Diaz wrote in 2010 that one of the main responsibilities of a president is to be a good storyteller, and that President Obama had failed miserably in this regard. “If a President is to have any success, if his policies are going to gain any kind of traction among the electorate, he first has to tell us a story.” [In his pre-presidential days Obama had told a good story in his Dreams from My Father.] Republicans, Diaz believed, were “much better storytellers.”

In contrast to Obama, President Reagan did pay more attention to the irrational and to storytelling, as Jan Hanska emphasized in Reagan’s Mythical America: Storytelling as Political Leadership (2012). Historian Koch agrees: “Like FDR, he [Reagan] was a great storyteller.” He used stories “to look backward to an earlier time to promote ideas of self-reliance and free enterprise.” President Trump, though not much of a storyteller, also appeals to myths about America’s past—note all the “Make America Great Again” hats worn by his supporters.

But how does storytelling help political leaders? Lincoln provides some interesting insights.  He believed it often helped “avoid a long and useless discussion . . . or a laborious explanation.” Moreover, it could soften a rebuke, refusal, or wounded feelings. Influenced by Aesop’s fables, as well as the parables of Jesus, Lincoln intuitively understood that to sway the American public simple stories were often more effective than reasoned arguments. As he once stated, “They say I tell a great many stories . . . but I have found in the course of a long experience that common people . . . are more easily informed through the medium of a broad illustration than in any other way.” In his emphasis on Lincoln’s storytelling ability and the humor that often accompanied it, Schutz notes that they reflected his ability to identify with the common people, his appreciation of their practical bent, and his good-natured acceptance of the flawed human condition. 

More modern thinkers also have recognized the usefulness of storytelling for political purposes. In a book on violence, John Sifton quoted the philosopher Richard Rorty on the usefulness of “sad stories,” rather than reasoned appeals, to change people’s minds about using violence. Similarly, British climate-change activist Alex Evans came to realize that bombarding people “with pie-charts, acronyms and statistics” was not persuasive enough and that activists “could only touch people's hearts by telling stories.” He believes “that all successful movements, including those that overturned slavery and racial discrimination, consisted of a network of small and large communities held together not by common calculations or common acceptance of certain technical facts, but by commonly-proclaimed narratives about the past and the future. In his view the political shockwaves of 2016, including Brexit and Donald Trump's victory, reflected the winning camps’ ability to tell better stories, not their superior command of facts.”

Futurist Tom Lombardo believes that the personal narratives we tell ourselves “give order, meaning, and purpose to our lives.” He also writes that “the most powerful way to generate change is to change the personal narrative. . . . Similarly, to change a society, its grand narrative needs to be changed” to one that will help provide “society a sense of integrity, distinctiveness, and overall purpose.”

In a 2016 HNN essay, historian Harvey J. Kaye argued that “the time has come for progressive historians and intellectuals to join with their fellow citizens in the making of a new American narrative,” one that would “encourage renewed struggles to extend and deepen American democratic life.” We now have at least one such narrative, Jill Lepore’s These Truths: A History of the United States (2018). 

In a recent Foreign Affairs essay, “A New Americanism: Why a Nation Needs a National Story,” she asks, “What would a new Americanism and a new American history look like?” Her essay and book answer that they would accept and celebrate our ethnic, religious, and gender-identity diversity. In both works, she quotes from an 1869 speech of Frederick Douglass where he refers to a “‘composite nation,’ a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them.” She ends her essay by criticizing those with a false view of our nation. “They’ll call immigrants ‘animals’ and other states ‘shithole countries.’ They’ll adopt the slogan ‘America first.’ They’ll say they can ‘make America great again.’ They’ll call themselves ‘nationalists.’ Their history will be a fiction. They will say that they alone love this country. They will be wrong.”

In two earlier HNN essays (here and here), I wrote of the need for a new “compelling, unifying vision that a majority of Americans would embrace.” It would build upon the visions suggested in the 1960s by Martin Luther King, Jr. and Robert Kennedy, which foreshadow Lepore’s “new Americanism.” In addition, it might sprinkle in the spirit of Carl Sandburg (1878-1967), “the one living man,” according to Adlai Stevenson, “whose work and whose life epitomize the American dream.” It would also recall other qualities that our country has demonstrated in its finest moments such as tolerance, compromise, pragmatism, generosity, and a willingness to undertake great tasks. 

Ben Franklin’s biographer Walter Isaacson wrote that our forefathers who wrote the Constitution demonstrated that they were great compromisers. Also, for Franklin “compromise was not only a practical approach but a moral one. Tolerance, humility and a respect for others required it. . . . Compromisers may not make great heroes, but they do make great democracies.”

Today we again need to demonstrate both the idealism and compromising ability of the Constitution makers. Our present climate-change crisis provides such an opportunity. The Democrats’ Green New Deal reflects our idealism and willingness to once again take up a great task—as FDR did in fighting the Depression and mobilizing U.S. power in World War II. But transforming this idealistic resolution into effective legislation also requires political compromises.

As various Democratic presidential contenders vie for the 2020 nomination, we need at least one of them to provide a unifying vision. Being a storyteller able to relate to and inspire most Americans, as Lincoln and FDR did, would also help create a “new Americanism.” So too would a Republican change of heart about compromise, a word most of them have rejected for far too long.

Vernon Johns: An Often Forgotten Controversial Civil Rights Activist

 

Atop a hill on the southwest part of Lynchburg, “Hill City,” there sits on Garfield Avenue a small educational institution with a rich, storied history: Virginia University of Lynchburg (VUL). It comprises three old and weathered buildings, each much in need of repairs—Graham Hall, Humbles Hall, and Mary Jane Cachelin Memorial Science and Library Building. It is there that part of the dispute between black separatists and black accommodationists played out—a dispute centered on the disparate views of Booker T. Washington and W.E.B. Du Bois concerning disfranchisement of Blacks.

 

Washington, an erstwhile slave, argued for accommodationism—for a sort of bootstrapping. He resisted the temptation to challenge directly the injustices of Jim Crow laws and black disfranchisement. In his famous “Atlanta Compromise,” he maintained that Blacks ought to challenge racial injustices by tardigrade progress through “industrial education, and accumulation of wealth.” His approach was conciliatory, not aggressive. He said in his Atlanta address: “To those of my race who depend on bettering their condition in a foreign land or who underestimate the importance of cultivating friendly relations with the Southern white man, who is their next-door neighbor, I would say: ‘Cast down your bucket where you are’—cast it down in making friends in every manly way of the people of all races by whom we are surrounded.” Blacks could in time eliminate racial injustices by gradual integration in white society through learning industrial skills, vital in the Southern economic climate.

 

W.E.B. Du Bois began as a proponent of Washington’s conciliatory approach to racial injustices. Yet he soon strayed from that approach, which was too tardigrade, conceded too much to the interests of Whites, and did too little to address disfranchisement, lynching, and Jim Crow laws. In his watershed book, The Souls of Black Folk, Du Bois maintained that the separate-but-equal policy was itself proof of Blacks’ inequality. In an obvious poke at Washington, he stated, “To make men, we must have ideals, broad, pure, and inspiring ends of living, not sordid money-getting, not apples of gold.” Education was his vehicle for racial reform. “The function of the Negro college, then, is clear: it must maintain the standards of popular education, it must seek the social regeneration of the Negro, and it must help in the solution of problems of race contact and co-operation. And finally, beyond all this, it must develop men.” He argued that the “Talented Tenth,” a group of highly educated Blacks, would in time change the racist landscape more rapidly.

 

The disparate views of Washington and Du Bois shaped accommodationist and separatist strategies thereafter.

 

Enter Vernon Johns, born on April 22, 1892, in Darlington Heights, Virginia. Johns disagreed with both Washington and Du Bois—viz., with Washington’s slow, conciliatory approach and with Du Bois’ separatism and elitism.

 

Johns, we know, attended VUL when it was Virginia Theological Seminary and College (VTSC, est. 1886), an educational institution that trained young black men in theology. He matriculated in 1912. The institution was separatist in ideology, and that probably appealed to Johns, who was no stranger to antagonism. Says Ralph E. Luker in “Johns the Baptist”: “The transfer [to VTSC] was crucial to his development and would shape his career for another two decades, for Virginia Seminary challenged Virginia Union’s cooperation with Northern white Baptists with coeducation of men and women, an emphasis on the liberal arts, and unceasing devotion to African American autonomy.” Johns entered the seminary, but was expelled permanently from VTSC for rebelliousness in 1915. The school at the time focused heavily on study through Greek and Latin.

 

Having left the seminary, he matriculated at Oberlin College in 1915, where he received an education that, Luker says, “no one of color might have found anywhere in Virginia” and won honors among his classmates. Johns graduated in 1918 and thereafter studied theology for a year at the University of Chicago’s graduate school.

 

Johns revisited VTSC in 1919 and taught homiletics and the New Testament. He became pastor of Court Street Baptist Church in 1920 and kept that position till 1926. In a 1920 letter to Professor G.W. Fiske of Oberlin College, he writes of his appointment, “My sailing at present is smooth with no clouds in sight, and my prayer is that I may do some good on the voyage and at last be granted a safe harbor.”  In 1923, he was removed from the faculty of VTSC on account of his harsh criticism of its curriculum. He removed to West Virginia, New York, and then to North Carolina, where he married. He returned inauspiciously to VTSC in 1929—the year of the Great Depression. For five years, he functioned as president of VTSC. In 1933, he was forced to resign, due to students’ protests—they went on strike until Johns was dismissed—and the financial impoverishment of the institution. Formal complaints were these: “We want a president whose presence on the campus will be a source of joy and not intense displeasure. We want a president who will not regard students and trustees as enemies of the school simply because they oppose or object to certain of his policies or actions. We want a president who will not convert classrooms into coal bins and chicken houses. … We want a president whose remarks to students in chapel services will be advisory and not adverse criticism or lambaste. We want a president who will not stoop to the use of profanity and vulgarity in addressing the students in chapel services and in the presence of young women on the campus.” Johns, it is obvious, was not a popular president.

 

In 1937, Johns began a second appointment at First Baptist Church in West Virginia, where he caught and sold fish for additional income. In 1941, he again returned to Lynchburg as pastor of Court Street Baptist Church. He was removed from that position in 1943, after disputes with laymen.

When his wife Altona Trent Johns joined the faculty at Alabama State University’s Department of Music in 1947, Johns became pastor of the esteemed Dexter Avenue Baptist Church. There he continued his rebelliousness with incendiary speeches (e.g., “Segregation after Death” and “When the Rapist Is White”) and loud actions in protest of racial discrimination. Johns resigned in 1952, due once again to unrest from his congregation, and never again held a pastorate. He was eventually succeeded by Dr. Martin Luther King, Jr. Johns died from a heart attack on June 10, 1965 in Washington, D.C.

 

What of Johns the man?

 

The pattern of his life, as the short biographical sketch shows, reveals that Johns could never stay in one place too long. Why? His message for racial equality was direct and unbending, and his vision was broad and far-seeing. As biographers Patrick Louis Cooney and Henry Powell note, “He had a natural ability to be socially insensitive to people individually, yet, at the same time, caring mightily for them as a group.”

 

Yet that seems understated. Johns was so lost in the end he pursued, eminently just, that he could not see a suitable means to achieve that end. He alienated both Whites and Blacks who were impassioned advocates of civil rights. At a convention of white and black preachers in Baltimore in 1960, Johns objected to a talk by a white minister. “The thing that disappoints me about the Southern white church is that it spends all of its time dealing with Jesus after the cross, instead of dealing with Jesus before the cross. … I don’t give a damn what happened to [Jesus] after the cross.” Though he might have been right, the comment offended almost all present. Again, aiming to democratize the elitism of Du Bois, he in effect asked all Blacks to be, each in their own way, part of the Talented Tenth—and he was asking too much of them. At Dexter Church, he earned the reputation, as one biographer notes, of a “‘militant guy,’ who exhorted the congregation like a ‘whirlwind’ to get involved in social issues.” Martin Luther King, Jr., wrote of Johns: “A fearless man, … he often chided the congregation for sitting up so proudly with their many academic degrees, and yet lacking the very thing the degrees should confer, that is, self-respect. One of his basic theses was that any individual who submitted willingly to injustice did not really deserve more justice.”

 

Johns was a critical figure in the push for racial equality, chiefly because he was not merely a pusher, but also a shover. Never afraid of offending others, chiefly the Whites with whom he sought equality, he challenged Jim Crow laws by bringing to trial Whites accused of raping Blacks, sitting with Whites in the front of a bus, and entering Whites-only restaurants. He also consistently offended black students and parishioners by demanding of them the sort of deeds that only a person of his bulky courage, vision, and abilities could do.

 

Though a critical figure, he was never politically mainstream in the Civil Rights movement. He was kept at the margins of organized political action against racial injustice because, while he praised many Blacks’ literary and intellectual achievements, he railed against general black indifference to or noninvolvement in civic issues.

 

Johns was also hampered by his large intellectual capacity. Self-educated as a boy, he came to learn Greek, Latin, Hebrew, and German and committed to memory lengthy biblical passages and numerous quotes from philosophers from antiquity, Shakespeare and other poets, sermons, and other literature. Yet his prodigious intellect made him largely inaccessible as professor, school president, and pastor, and he seems to have made little effort to make himself accessible—a problem shared by numerous others of great intellect.

 

Johns was also off-putting because he was a strange admixture of husbandman and preacher. He loved the land. Brought up on a farm, he was always a farmer and fisherman at heart. He would unabashedly sell his farmed goods and fish at his churches and at the various schools he attended. Many thought that that was pompous and untoward.

 

Moreover, Johns was restive—always aiming to be a harbinger of outsized change—and he had to instigate that change in his own, inimitable manner. Prodigiously intelligent, unsubtle, and uncomfortable with compromise, he irritated and angered Whites and Blacks alike, and so he was less effective as a harbinger of change than he would have been had he been less intelligent, more subtle, and more open to compromise. His successor at Dexter Church, Martin Luther King, Jr., fortunately possessed those qualities that Johns lacked.

 

Finally, Johns was an enigma, because he stood solidly for civil rights, but it was never quite clear what his modus operandi for Blacks was. He was no accommodationist, but he also did not fit squarely into the separatist mold. He might best be categorized as an antagonist or a revolutionist, who championed swift and decisive counteractions to unjust actions. In a sermon delivered after the shooting of a black man by a white officer in 1948, Johns reminded his congregation that one of the Ten Commandments was “Thou shalt not kill.” He added that God did not qualify that commandment with “unless you are a police officer” or with “unless you’re White.” Johns then added: “I’ll tell you why it’s safe to murder Negroes. Because Negroes stand by and let it happen.” Johns was arrested for instigation. He was seen as a threat to the white status quo, and he also managed in the process to offend his black congregation by his statement that Blacks would not act.

 

Was Johns a failure because he could have been a greater harbinger for change had he aimed for subtlety and conciliation?

 

Johns once said, “You should be ashamed to die until you’ve made some contribution to mankind.” Thus, Johns throughout his life strove to improve not just Blacks’ condition, but the human condition. He lived up to and greatly exceeded that mark. In doing so, he set a high mark for us, irritated today by injustices of any sort, to match.

What Democratic Socialism Is and Is Not

In recent weeks, Donald Trump and other Republicans have begun to tar their Democratic opponents with the “socialist” brush, contending that the adoption of socialist policies will transform the United States into a land of dictatorship and poverty.  “Democrat lawmakers are now embracing socialism,” Trump warned the annual Conservative Political Action Conference in early March.  “They want to replace individual rights with total government domination.” In fact, though, like many of Trump’s other claims, there’s no reason to believe it.

The ideal of socialism goes back deep into human history and, at its core, is based on the notion that wealth should be shared more equitably between the rich and the poor. Numerous major religions have emphasized this point, criticizing greed and, like the revolutionary peasants of 16th century Germany and the rebellious Diggers of 17th century England, preaching the necessity for “all God’s children” to share in the world’s abundance. The goal of increased economic equality has also mobilized numerous social movements and rebellions, including America’s Populist movement and the French Revolution.                                                                               

But how was this sharing of wealth to be achieved?  Religious leaders often emphasized charity.  Social movements developed communitarian living experiments. Revolutions seized the property of the rich and redistributed it.  And governments began to set aside portions of the economy to enhance the welfare of the public, rather than the profits of the wealthy few.

In the United States, governments at the local, state, and federal level created a public sector alongside private enterprise.  The American Constitution, drafted by the Founding Fathers, provided for the establishment of a U.S. postal service, which quickly took root in American life.  Other public enterprises followed, including publicly-owned and operated lands, roads, bridges, canals, ports, schools, police forces, water departments, fire departments, mass transit systems, sewers, sanitation services, dams, libraries, parks, hospitals, food and nutrition services, and colleges and universities.  Although many of these operated on a local level, others were nationwide in scope and became very substantial operations, including Social Security, Medicare, National Public Radio, the National Institutes of Health, and the U.S. armed forces.  In short, over the centuries the United States has developed what is often termed “a mixed economy,” as have many other countries.

Nations also found additional ways to socialize (or share) the wealth.  These included facilitating the organization of unions and cooperatives, as well as establishing a minimum wage, unemployment insurance, and a progressive tax policy―one with the highest levies on the wealthy and their corporations.

Over the course of U.S. history, these policies, sometimes termed “social democracy,” have enriched the lives of most Americans and have certainly not led to dictatorship and economic collapse. They are also the kind championed by Bernie Sanders and other democratic socialists.

Why, then, does a significant portion of the American population view socialism as a dirty word?  One reason is that many (though not all) of the wealthy fiercely object to sharing their wealth and possess the vast financial resources that enable them to manipulate public opinion and pull American politics rightward.  After all, they own the corporate television and radio networks, control most of the major newspapers, dominate the governing boards of major institutions, and can easily afford to launch vast public relations campaigns to support their economic interests.  In addition, as the largest source of campaign funding in the United States, the wealthy have disproportionate power in politics.  So it’s only natural that their values are over-represented in public opinion and in election results.

But there’s another major reason that socialism has acquired a bad name:  the policies of Communist governments.  In the late 19th and early 20th centuries, socialist parties were making major gains in economically advanced nations.  This included the United States, where the Socialist Party of America, between 1904 and 1920, elected socialists to office in 353 towns and cities, and governed major urban centers such as Milwaukee and Minneapolis. But, in Czarist Russia, an economically backward country with a harsh dictatorship, one wing of the small, underground socialist movement, the Bolsheviks, used the chaos and demoralization caused by Russia’s disastrous participation in World War I to seize power. Given their utter lack of democratic experience, the Bolsheviks (who soon called themselves Communists) repressed their rivals (including democratic socialists) and established a one-party dictatorship.  They also created a worldwide body, the Communist International, to compete with the established socialist movement, which they denounced fiercely for its insistence on democratic norms and civil liberties.

In the following decades, the Communists, championing their model of authoritarian socialism, made a terrible mess of it in the new Soviet Union, as well as in most other lands where they seized power or, in Eastern Europe, took command thanks to post-World War II occupation by the Red Army.  Establishing brutal dictatorships with stagnating economies, these Communist regimes alienated their populations and drew worldwide opprobrium.  In China, to be sure, the economy has boomed in recent decades, but at the cost of supplementing political dictatorship with the heightened economic inequality accompanying corporate-style capitalism.

By contrast, the democratic socialists―those denounced and spurned by the Communists―did a remarkably good job of governing their countries. In the advanced industrial democracies, where they were elected to office on numerous occasions and defeated on others, they fostered greater economic and social equality, substantial economic growth, and political freedom.

Their impact was particularly impressive in the Scandinavian nations.  For example, about a quarter of Sweden’s vibrant economy is publicly-owned. In addition, Sweden has free undergraduate college/university tuition, monthly stipends to undergraduate students, free postgraduate education (e.g. medical and law school), free medical care until age 20 and nearly free medical care thereafter, paid sick leave, 480 days of paid leave when a child is born or adopted, and nearly free day-care and preschool programs.  Furthermore, Sweden has 70 percent union membership, high wages, four to seven weeks of vacation a year, and an 82-year life expectancy.  It can also boast the ninth most competitive economy in the world. Democratic socialism has produced similar results in Norway and Denmark.

Of course, democratic socialism might not be what you want.  But let’s not pretend that it’s something that it’s not.

Trump's Executive Order Censors Free Speech on College Campuses

 

In 1961, a historian at the University of Pittsburgh named Robert G. Colodny was called before the House Un-American Activities Committee. Colodny was just one of HUAC’s many targets, a list which included screenwriters like Dalton Trumbo and playwrights such as Arthur Miller. HUAC remained a fearsome and fundamentally anti-democratic means of intimidation and often professional ruin even after the height of the McCarthy era’s Red baiting. The professor drew suspicion after he innocuously referred to Cuban “agrarian reforms” in the Pittsburgh Press, which was enough for a local state representative to label Colodny a communist sympathizer. Shortly after, Congress and then the university itself launched investigations. This, it should be said, is what an attack on academic freedom looks like. 

Part of what contributed to the professor’s new-found notoriety was that Colodny had been among those idealists and visionaries, including writers like Ernest Hemingway and George Orwell, who enlisted themselves in the army of the democratically elected government of Republican Spain, which in the late 1930s was threatened and ultimately defeated by the fascist forces of the future dictator Francisco Franco. They were often tarred as “prematurely anti-fascist,” with historian Adam Hochschild explaining in Spain in Our Hearts: Americans in the Spanish Civil War 1936-1939 that for those fighters the conflict was “seen as a moral and political touchstone, a world war in embryo, in a Europe shadowed by the rapid ascent of fascism.” Franco received aid and assistance from Mussolini and Hitler, with the Luftwaffe’s brutal destruction of the Basque city of Guernica indeed a prelude to the coming horror of the bloodiest war in human history. Women and men like Colodny, who served in the international brigades, correctly believed that right-wing nationalism and international fascism should be countered on the battlefields of Spain. As Orwell would write in his 1938 account Homage to Catalonia, “I recognized it immediately as a state of affairs worth fighting for.” From 1937 until the following year, Colodny would fight in a battalion of volunteers known as the Abraham Lincoln Brigade, the first integrated squadron of American soldiers, and one of over fifty international brigades composed of leftists who fought against the Spanish fascists. The future professor sustained a gunshot wound above his right eye, which left him partially paralyzed and blind. Despite his injuries, he’d later serve in the American armed forces, going on to receive a doctorate in history at the University of California at Berkeley, where he specialized in the philosophy of science.

Such were the vagaries of a fascinating, if unassuming, professional career until Colodny was called to account for his anti-fascist record. After his congressional testimony, the University of Pittsburgh was under pressure to terminate Colodny’s appointment, but after six months of investigation it concluded that the professor’s political opinions and service didn’t constitute a reason for dismissal. Pitt’s Chancellor Edward H. Litchfield wrote in his conclusion to the investigation, in a statement that deserves to be the canonical statement on academic freedom, that a university “embraces and supports the society in which it operates, but it knows no established doctrines, accepts no ordained patterns of behavior, acknowledges no truth as given. Were it otherwise, the university would be unworthy of the role which our society has assigned it.”

It is as moving and apt an encapsulation of the free inquiry that lies at the heart of American higher education as any that’s ever been written, and one that today is under serious threat from the machinations of the Trump administration. On March 21st Trump signed an executive order with the anodyne designation of “Improving Free Inquiry, Transparency, and Accountability at Colleges and Universities,” a declaration that by name alone would be easy to assume is congruent with Litchfield’s idealistic argument of half a century ago. But the order’s language, which claims that we must “encourage institutions to appropriately account” for free inquiry in their “administration of student life and to avoid creating environments that stifle competing perspectives,” lacks not just Litchfield’s poetry, but indeed means the exact opposite of that earlier defense. Trump’s order, fulfilling a promise to his right-wing supporters and their long-standing obsession with a perceived liberal bias in the academy, exists not to promote inquiry, but to stifle it; not to expand perspectives, but rather to limit them; not to encourage free speech, but to censor it.

Trump’s order germinated in the debate that has surrounded the scheduling of fascist speakers at universities. It can arguably be traced back to an incident at Colodny’s alma mater of Berkeley, which incidentally was also the birthplace of the Free Speech Movement of the 1960s. In 2017 violent confrontations between political groups, none of which was affiliated with the university, led to the cancelling of one event due to security concerns. Importantly, the university had approved the speaker’s visit, and indeed the speaker was paid with student activity funds. At no point was the speaker censored or oppressed, despite his abhorrent views.

With his characteristic grammar, punctuation, orthography and enthusiasm for capitalization, the president tweeted on February 2, 2017, that “If U.C. Berkeley does not allow free speech and practices violence on innocent people with a different point of view – NO FEDERAL FUNDS?” Tallying the inaccuracies in a Donald J. Trump statement is a bit like searching for sand at the beach, but it should go without saying that neither Berkeley faculty nor its administration had enacted “violence on innocent people.” Rather, the invited right-wing speaker arrived with his own retinue of supporters, who were countered by community groups not affiliated with the university itself, and unsurprisingly hate speech generated hate.

The language of the March 21st executive order is nebulous, but seems to imply that colleges and universities will lose federal funds if they choose not to host certain speakers. This is, as should be obvious, the opposite of free speech. A university has every right to decide who will speak on its campus, and the community certainly has the right to object to certain speakers, who are normally paid from the budget generated by student activity fees. It’s unclear if such a federal order will be consistently applied, so that an evangelical college would be required to invite pro-choice speakers, or a Christian university would have to pay visiting atheist lecturers, but I’ll let you guess what the intent of the proclamation most likely is.

Colodny’s brother-in-arms George Orwell would probably have something astute to say about the manner in which the Trump administration has commandeered the language of free speech so as to subvert free speech. At the very least you have to appreciate the smug arrogance of it. Such an executive order, which is red meat to Trump’s base, is the culmination of two generations of neurotic, anxious, right-wing fretting about apparent liberal infiltration of colleges and universities. While it’s true that faculty, depending on discipline, tend to vote liberal, you’re as likely to find a genuine Marxist among university professors as you are to find an ethical member of the Trump administration itself. Furthermore, this concern over “political diversity” is only raised when conservatives feel threatened, and academe is simply the one small corner of society not completely dominated by the right. Ask yourself what insecurity encourages those who dominate the executive branch, dozens of state governments, business, and increasingly the judiciary to continually fulminate about academe, Hollywood, and the media.

You’ll note that the concern over the perceived lack of political diversity among faculty normally begins and ends at the social sciences and humanities, though more recently the natural sciences have also been attacked for daring to challenge the conservative ideological orthodoxy on issues such as climate change. Conservatives aren’t concerned about a lack of diversity among business faculty, or even more importantly among the trustees of colleges and universities, where every higher education worker knows that the real power is concentrated. For that matter, there is no equivalent hand-wringing about political diversity on corporate boards, though perhaps a socialist sitting in on a board meeting at the Bank of America could have done us all some good in 2008. Nobody in the Republican Party seems terribly concerned that other professions which hew to the right, be they law enforcement or investment bankers, don’t have a “diversity” of political opinions represented in their ranks.

That’s because today’s order obviously has nothing to actually do with free inquiry and diversity, but rather intends to put a stranglehold on it. Terry Hartle, the senior vice president for government and public affairs at the American Council on Education, said in a speech that “As always in the current environment, irony does come into play. This is an administration that stifles the views of its own research scientists if they are counter to the political views of the administration… And the president vigorously attacks people like Colin Kaepernick.”

It’s impossible to interpret much of what the administration does without an awareness of its own finely honed sense of sadistic irony and mocking sarcasm. In such a context, Thursday’s executive order, whose full ramifications remain unclear, is far from a defense of free inquiry; rather, it is a sop to those like right-wing activist David Horowitz, director of his own self-named and so-called “Freedom Center,” or the administrators of the website Professor Watchlist, maintained by the conservative group Turning Point USA. Trump’s executive order is an attempt to return us to the era in which Colodny could be fired for his progressive views, an age of blacklists and loyalty oaths.

Anyone who attends a college, has children enrolled in one, or works in higher education is amply aware that the state of the American university is troubled. The recent enrollment scandal whereby wealthy parents simply paid their children’s way into elite institutions (as cynically unsurprising as this may be) only underscores the malignancies which define too much of post-secondary education in America today. College is too expensive, too exclusionary, and its resources are misallocated. The academic job market is punishing and does not serve the graduate students who aspire to professorial jobs. Undergraduates take on obscene amounts of debt, and the often-inflated reputation of the Ivy League and a handful of other institutions still sets too much of the tenor of American social, political, and cultural life. But none of these problems are because the university is too “liberal.” To the contrary, American higher education could stand to move a lot more to the left in terms of admissions and employment. If anything, the current crisis in higher education is most closely related to the imposition of a certain business mentality upon institutions whose goal was never to be the accumulation of profit for its own sake.

Because despite its contradictions, American higher education has historically remained the envy of the world. There is a reason that international students clamor for a spot at an American college. Since the emergence of the American research university in the 19th century, higher education has been at the forefront of research and innovation. Even more importantly, democratizing legislation such as the GI Bill and affirmative action transformed American universities into the greatest engine of upward class mobility in human history. It’s not a coincidence that conservative attacks on higher education occurred right at the moment when it became available to the largest number of people, but the nature of these most recent attacks, making federal funding contingent on which right-wing agitator receives a hefty speaker’s fee, could have a chilling effect on education.

Sociologist Jonathan R. Cole writes in The Great American University that our system of higher education has been “able to produce a very high proportion of the most important fundamental knowledge and practical research discoveries in the world.” By intervening in the details of who is invited to speak on a college campus (which is of course separate from censorship), the federal government threatens the independence and innovation of higher education, imposing an ideologically approved straitjacket upon what has historically been our great laboratory of democracy. Colodny wrote that the goal of higher education was to ensure that “some traditional holder of power feels the tempest of new and renewing ideas.” The man who currently occupies the Oval Office can’t abide either of those things, and so he’d rather burn it all down than spend a moment being threatened by institutions that actually enshrine free inquiry. The gross obscenity is that he’s self-righteously claiming the mantle of that same free inquiry to do it.

 

 

What is Antisemitism?

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

 

Antisemitism is alive and well these days. In Europe and America, the number of antisemitic incidents is increasing every year, according to those who try to keep track.

 

News about antisemitism has recently wandered from the streets and the internet into the halls of Congress. The presence of two newly elected young Muslim women in the House, who openly advocate for Palestinians against Israel, has upset the strongly pro-Israel consensus that has dominated American politics for decades. Accusations of antisemitism are especially directed at Ilhan Omar from Minneapolis, who has used language that is reminiscent of traditional antisemitic themes in her criticism of Israeli policies. Her case demonstrates that it can be difficult to distinguish between unacceptable antisemitism and political criticism of the Jewish government of Israel and its supporters.

 

Some incidents seem to be easy to label as antisemitic. For example, when a large group of young people physically attacked Jewish women while they were praying. Many women were injured, including the female rabbi leading the prayers. The attackers carried signs assailing the women’s religious beliefs, and the press reported that the women “were shoved, scratched, spit on and verbally abused”.

 

An obvious case of antisemitism? No, because the attackers were ultra-Orthodox Jewish girls and boys, bussed to the Western Wall in Jerusalem in order to attack the non-Orthodox Women of the Wall, who were violating misogynist Orthodox traditions about who can pray at the Wall. This incident fulfills every possible definition of antisemitism. For example, the International Holocaust Remembrance Alliance offers the following description of public acts that are antisemitic: “Calling for, aiding, or justifying the killing or harming of Jews in the name of a radical ideology or an extremist view of religion.” The ultra-Orthodox leaders who encouraged the assault would argue that they were protecting, not attacking Judaism, and that the Women of the Wall were not really Jewish anyway.

 

Acts of antisemitism are political acts. Accusations of antisemitism are likewise political acts, deployed in the service of the political interests of the accusers. Many, perhaps most accusations of antisemitism are made in good faith for the purpose of calling attention to real religious prejudice. But such accusations are often made for less honest political purposes.

 

The Republicans in Congress who demand that Democrats denounce Ilhan Omar are cynically using the accusation of antisemitism for political gain. Many Republicans have themselves made statements or employed political advertisements that are clearly antisemitic. The rest have stood by in silence while their colleagues and their President made antisemitic statements. But they saw political advantage in attacking a Democrat as antisemitic.

 

Supporters of the Israeli government’s policies against Palestinians routinely accuse their critics of antisemitism as a means of drawing attention away from Israeli policies and diverting it to the accusers’ motives. Sometimes critics of Israel are at least partially motivated by antisemitism. But the use of this rhetorical tactic also often leads to absurdity: Jews who do not approve of the continued occupation of land in the West Bank or the discrimination against Palestinians in Israel are accused of being “self-hating Jews”.

 

This linking of antisemitism and criticism of Israeli policy has worked well to shield the Israeli government from reasonable scrutiny of its policies. In fact, there is no necessary connection between the two. Criticism of current Israeli policy is voiced by many Jews and Jewish organizations, both religious and secular.

 

Supporters of the idea of boycotting Israeli businesses as protest against Israeli treatment of Palestinians, the so-called BDS movement, are sometimes assumed to be antisemitic and thus worthy of attack by extremists. But the pro-Israel but also pro-peace Washington Jewish organization J-Street argues that “Efforts to exclude BDS Movement supporters from public forums and to ban them from conversations are misguided and doomed to fail.” I don’t remember that any of the supporters of boycotting and divesting from South Africa because of its racial policies were called anti-white.

 

Those who advocate a “one-state solution” to the conflict between Israel and the Palestinians are sometimes accused by conservatives of being antisemitic, with the argument that this one state will inevitably eventually have a majority of Muslims. The Washington Examiner calls this equivalent to the “gradual genocide of the Jewish people”.

 

The absurdity of equating anti-Zionism with antisemitism is personified by the denunciations of Zionism and the existence of Israel by the Orthodox Satmar, one of the largest Hasidic groups in the world.

 

On the other side, the most vociferous American supporters of Prime Minister Netanyahu’s government have been evangelical Christians. Although they claim to be the best friends of Israel, the religious basis of right-wing evangelical Christianity is the antisemitic assertion that Jews will burn in hell forever, if we do not give up our religion. Robert Jeffress, the pastor of First Baptist Church in Dallas, who spoke at President Trump’s private inaugural prayer service, has frequently said that Jews, and all other non-Christians, will go to hell. The San Antonio televangelist John C. Hagee, who was invited by Trump to give the closing benediction at the opening of the new American Embassy in Jerusalem, has preached that the Holocaust was divine providence, because God sent Hitler to help Jews get to the promised land. Eastern European nationalists, who often employ antisemitic tropes to appeal to voters, are also among the most vociferous supporters of Netanyahu and Israel.

 

Political calculations have muddied our understanding of antisemitism. Supporters of the most right-wing Israeli policies include many people who don’t like Jews. Hatreds which belonged together in the days of the KKK may now be separated among right-wing white supremacists.

 

But no matter what they say, purveyors of racial prejudice and defenders of white privilege are in fact enemies of the long-term interests of Jews all over the world, who can only find a safe haven in democratic equality.

Mike Pence Says the US Has Been "A Force For Good in the Middle East" for "nearly 200 years"; Here's How Historians Responded

Allen Mikaelian is a DC-based editor and writer. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. The Political Uses of the Past Project collects and checks statements by elected and appointed officials. This is the first installment of what will hopefully become a regular feature of the project. Read more about the project here. Contact the editor of the project here.

Vice President Pence: "For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East"

For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East. Previous administrations in my country too often underestimated the danger that radical Islamic terrorism posed to the American people, our homeland, our allies, and our partners. Their inaction saw the terrorist attacks from the U.S.S. Cole; to September 11th; to the expansion of ISIS across Syria and Iraq — reaching all the way to the suburbs of Baghdad. But as the world has witnessed over the past two years, under President Trump, those days are over. —Vice President Michael Pence, Remarks, Warsaw Ministerial Working Luncheon, February 14, 2019

Historians say...

Eight historians responded to our request for comment; their full statements and recommended sources are on the Political Uses of the Past page.

The vice president starts with the 1833 treaty with Oman, and so shall we, even though it’s an odd place to start. As Will Hanley of Florida State University noted in his reaction to Pence’s claim, the treaty itself is a piece of routine boilerplate, not so different “from dozens of other 1830s agreements between Middle East authorities and representatives of American and European states.” But there was at least one innovation, as Hanley explains: “The Sultan of Muscat inserted a clause saying that he, rather than the US, would cover the costs of lodging distressed American sailors. A more accurate statement [by Pence] on this evidence would be ‘For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, representatives of the United States have pursued standardized agreements in the Middle East and enjoyed meals that we haven't paid for.’”

Vice President Pence made this broad statement at a ministerial meeting on terrorism, but his mind was primarily on Iran. His intent was to draw a contrast between the United States and Iran, with the former being a “force for good” in the region and the latter being a perpetrator of continual violence. But by going back to 1833 to reference a routine and fairly boring trade agreement with a minor kingdom, he appears to be grasping at straws.

If Pence was looking for good done by the United States in the Middle East, he could have asked some of the historians who reacted to his statement. He may have learned from Joel Beinin how “American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region.” He may have been interested to hear from Indira Falk Gesink that "after World War I, most of those polled in the regions surrounding Syria wanted the US as their mandatory power (if they wanted any)." He may have learned from Lior Sternfeld how the United States has sponsored “schools, universities, and orphanages” and took a stand against its European allies and Israel during the Suez Crisis of 1956.

But if he had asked and had learned about these efforts, he would also have learned from Professor Beinin that many of the missionaries who established these schools went to work for the CIA in the postwar period, “so even the very best thing that Americans have done in the Middle East since the early 19th century was corrupted by government efforts to exert power over the region in order to control its oil.” And Pence would have also had to hear Professor Sternfeld tell about the 1953 coup in Iran that cemented a brutal regime in place for the next quarter-century and how, as described by Professor Gesink, "from that point on, US actions in the Middle East were guided by demand for oil and anti-Communist containment." Finally, he would have had to hear about how much that 1953 coup has to do with our relations with Iran now.

Historians who replied to our request for comment could not find much “force for good” in the historical record. Instead, they find “death, displacement, and destruction” (Ziad Abu-Rish), support for “the most ruthless and brutal dictators at every turn” and the “most fanatical and chauvinistic nationalist and religious forces at every turn” (Mark Le Vine), “intense and destructive interventions … characterized by public deception, confusion, and mixed motives” (Michael Provence), "a moral compromise with authoritarianism"  (Indira Falk Gesink), and actions that have “contributed to breakdowns in security, widespread violence, and humanitarian disaster” (Dale Stahl).

Homage to the Shah after coup d'état, 5 September 1953, The Guardian - Unseen images of the 1953 Iran coup.

Three historians below recommend The Coup: 1953, The CIA, and The Roots of Modern U.S.-Iranian Relations by Ervand Abrahamian, and this book is incredibly pertinent today. Previous historical accounts and justifications by 1950s policymakers made the coup all about Mosaddegh’s unwillingness to compromise or said it was all about winning the Cold War. Abrahamian instead shows that it was about oil, or, more specifically, “the repercussions that oil nationalization could have on such faraway places as Indonesia and South America, not to mention the rest of the Persian Gulf.” And for this, Iran and the Middle East got, courtesy of the United States, the brutal Mohammad Reza Shah. The shah crushed the democratic opposition, filling his jails with thousands of political prisoners, and left “a gaping political vacuum—one filled eventually by the Islamic movement.” And so here we are.

Mike Pence’s incredibly blinkered statement can be viewed as an extreme counterpoint to the right-wing view of Obama’s Cairo speech, in which the president mildly acknowledged that the US had not always been on the side of right in the Middle East, and that its history of actions has come back to haunt us all. Such things, it seems, must not be spoken in the muscular Trump administration, even if it means abandoning an understanding that might actually be useful. “For me as an historian,” Mark Le Vine notes below, “perhaps the worst part of the history of US foreign policy in the region is precisely that scholars have for so long done everything possible to inform politicians, the media and the public about the realities there. Largely to no avail.” Indeed, Mike Pence here appears intent on utterly blocking out history and historical thinking, even as he dreams of a long and glorious past.

Browse and download sources recommended by the historians below from our Zotero library, or try our in-browser library.

 

Ziad Abu-Rish, Assistant Professor of History at Ohio University

I'm only going to tackle the "force for good" claim, without getting into the claims about Trump compared to his predecessors or the notion of "radical Islamic terrorism." Let's give Vice President Pence a chance at being correct... Read more

Joel Beinin, Donald J. McLachlan Professor of History and Professor of Middle East History, Emeritus, Stanford University

American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region... Read more

Indira Falk Gesink, Baldwin Wallace University

I think this is a much more complicated question than is generally acknowledged. On the one hand, some American private citizens have had long-lasting positive impact—for example the founding of educational institutions such as Roberts College, the American University in Beirut (originally the Syrian Protestant College), and the American University in Cairo. At that time, the US generally was viewed positively in the region. ... Read more

Will Hanley, Florida State University

It's not possible to use historical evidence to support a black-and-white statement like "The United States has been a force for good in the Middle East." Even if it were possible, the slim 1833 treaty between the US and the Sultan of Muscat is meager evidence. ... Read more

Mark Andrew Le Vine, Professor of Modern Middle Eastern History, UC Irvine

This statement is ridiculous even by the standards of the Trump administration. The US has been among the most damaging forces in the Middle East for the last three quarters of a century. ... Read more

Michael Provence, Professor of Modern Middle Eastern History, University of California, San Diego

The United States had no role in the Middle East before 1945, apart from private business and educational initiatives. Within a couple years of 1945, the US tilted toward Israel in its first war, began overthrowing democratic Middle Eastern governments, and propping up pliant dictators. ... Read more

Dale Stahl, Assistant Professor of History, University of Colorado Denver

I see this statement as "more or less false" because there are clear examples where the United States has not had a positive influence in the Middle East. One needn't reflect very far back into that "nearly 200 years" of history to know that this is so. ... Read more

Lior Sternfeld, Penn State University

While the US had some moments where it was a force for good, with projects like schools, universities, and orphanages, it was also a source for instability in cases like the 1953 coup against Mosaddegh that overturned the course not just of Iran but of the region in its entirety. Read more

 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/blog/154194 https://historynewsnetwork.org/blog/154194 0
Roundup Top 10!  

The New Zealand Shooting and the Great-Man Theory of Misery

by Jelani Cobb

Most of the men who committed these recent acts of terror composed manifestos. A sense of history turning on the fulcrum of a single man’s actions is a theme within them.

 

Nazis Have Always Been Trolls

by Adam Serwer

Historically, they rely on murderous insincerity and the unwillingness of liberal societies to see them for what they are.

 

 

The first time the U.S. considered drafting women — 75 years ago

by Pamela D. Toler

As legislative debate about drafting women in 1945 shows, if the military need is great enough, women will be drafted no matter how uncomfortable lawmakers are with the prospect.

 

 

Poor criminal defendants need better legal counsel to achieve a just society

by Connie Hassett-Walker

Why we must fulfill the promise of a famous Supreme Court decision to truly achieve criminal justice reform.

 

 

Native children benefit from knowing their heritage. Why attack a system that helps them?

by Bob Ferguson and Fawn Sharp

For 40 years, the Indian Child Welfare Act has protected the best interests of Native children and helped preserve the integrity of tribal nations across the United States.

 

 

The Story of the Dionne Quintuplets Is a Cautionary Tale for the Age of ‘Kidfluencers’

by Shelley Wood

The pitfalls and payoffs of advertising directly to children have consumed psychologists, pediatricians, marketers and anxious parents for the better part of a century.

 

 

Citizenship in the Age of Trump

by Karen J. Greenberg

Death By a Thousand Cuts

 

 

When bad actors twist history, historians take to Twitter. That’s a good thing.

by Waitman Wade Beorn

Engaging with the public isn’t pedantry; it’s direct engagement.

 

 

Americans don’t believe in meritocracy — they believe in fake-it-ocracy

by Niall Ferguson

This illegal “side door” into college came into existence because the back door of a fat donation — like the $2.5 million paid by Jared Kushner’s father to Harvard — isn’t 100 percent reliable.


 

Who’s the snowflake? We tenured professors, that’s who

by Anita Bernstein

Our freedom to say what we want is not only tolerated but celebrated.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171561 https://historynewsnetwork.org/article/171561 0
Andy Warhol: A Lot More than Soup Cans A month ago, I watched a television program that covered, briefly, the art of pop icon Andy Warhol, he of all the Campbell’s Soup cans. The narrator said that Warhol had passed into history and that young people today probably had no idea who he was.

I was startled. Young people did not know who the thin man with the white hair was, the man who hung out with Liz Taylor, Liza Minnelli, dress designer Halston and the Jaggers? The man who painted the famous Mao portrait? Truman Capote’s buddy?

I’m a professor, so the next day I asked my classes, 25 students in each, if they knew who Andy Warhol was. I didn’t say artist or painter Andy Warhol, just Andy Warhol.

The hands shot into the air. About 95% of them knew who he was.

Andy Warhol will never pass from the scene. That is proven, conclusively, in the largest exhibit of his work in generations at the Whitney Museum, in New York, Andy Warhol – From A to B and Back Again. It is a marvelous and exciting tribute to his work and is attracting huge crowds.

The crowds are not art aficionados from the 1960s, either, but young women with baby carriages, high school student groups, young couples and foreign tourists. Warhol was an international celebrity superstar in addition to being a memorable artist, and, these crowds indicate, he always will be remembered.

“Modern art history is full of trailblazers whose impact dims over time,” said Curator Scott Rothkopf. “But Warhol is that extremely rare case of an artist whose legacy grows only more potent and lasting. His inescapable example continues to inspire, awe and even vex new generations of artists and audiences with each passing year.”

Another curator, Donna De Salvo, said the originally avant-garde Warhol has become part of mainstream art. “Warhol produced images that are now so familiar that it’s easy to forget just how unsettling and even shocking they were when they debuted,” she said.

Warhol really became famous not so much because of his new age art, but because of his celebrity. He was friends with many of the biggest entertainment stars in the world, was a fixture at legendary New York nightclub Studio 54 in the 1980s, palled around with fashion designer Halston, drank wine with Liza Minnelli and lunched with Liz Taylor. He was almost murdered in 1968 when an irate actress from his film studio, the Factory, shot him several times. The shooting made front page news all over the world. He was a central character in the movie Factory Girl, about Edie Sedgwick, one of his Factory actresses.

Everybody recognized him instantly since he wore those thick glasses and had that mop top of dyed white hair. That fame was why people paid so much attention to his often-bizarre work. Some said that the quiet boy from Pittsburgh, who fell in love with Shirley Temple as a kid, created a unique persona for himself that worked well.

The Warhol exhibit, a real achievement in cultural history, occupies all of the fifth floor at the Whitney plus additional galleries on the first and third floors. The best way to start is on the first floor, in the gallery of his oversized portraits. They are mounted in long rows across the walls of the room and they introduce you to Andy the celebrity and Andy the artist at the same time. The portraits also tell you a lot about show business and art history in the 1960s and ‘70s. There are lots of famous people on the walls here, like Liza Minnelli, Dennis Hopper, soccer star Pele, socialite Jane Holzer and Halston, but lots of people you never heard of, too.

The third floor houses wall after wall of his famous “Cow Wallpaper,” adorned with hundreds of similar heads of a brown cow. It is eye-opening and hilarious.

Another room has a stack of his popular blue and white Brillo pad boxes and a wall full of S & H Green Stamps (remember them?).

There are his paintings of magazine covers and lots of newspaper front pages (an eerie one about a 1962 Air France plane crash).

You learn a lot about his personal life. As an example, as a young man he became a fan of Truman Capote, the author of Breakfast at Tiffany’s, and called the writer every single day.

There are drawings of celebrities’ shoes to show how they represented their personalities. Christine Jorgensen was one of the first modern openly transgender women, so she has shoes that don’t match each other.

Unknown to most, he loved to do paintings of comic strip characters. Two in the exhibit, of Superman and Dick Tracy, in blazing bright colors, were displayed in a New York City department store window.

What makes the exhibit so enjoyable at the Whitney Museum, recently opened on Gansevoort Street near the Hudson River, is the way the curators use its space. Unlike most museum exhibits, where everything is scrunched together, the curators used the large, high ceilinged rooms wisely, putting the 350 Warhol pieces, especially the very large ones (some are thirty feet wide) alone on the pristine white walls so they jump off the wall at you. You go around one corner and there is Elvis Presley as a gunslinger in four separate portraits firing one of his six-guns. Next to him is Marlon Brando in a leather jacket and on his motorcycle.

There are weird walls of photos, such as the most wanted criminals he drew from photos in a New York State booklet, “13 Most Wanted Men.” There is a series of his copies of Leonardo da Vinci’s Mona Lisa and then his copy of his copy.

He did inspirational photos and silkscreens. A woman named Ethel Scull arrived at his studio one day for what she thought would be a traditional portrait. Instead, Warhol took her to Times Square and had her sit for dozens of photos in the cheapie photo booths there, where all the going-steady high school kids went. The result – a sensational wall of photos of her in different giddy and seductive poses. Brilliant.

There are photos of Jackie Kennedy Onassis. One set shows her smiling on the morning before her husband’s murder and then, in the next strip, somber at the President’s funeral. There is a wall full of his famous photo of Marilyn Monroe. There is a world famous, mammoth, and I mean mammoth, portrait of China’s Chairman Mao. One wall is filled with his fabled Campbell’s soup can paintings and another with his Coca-Cola works.

Sprinkled among all of these paintings are real life photos and videos of Warhol at work.

There is a large television set on the third floor on which you see a truly bizarre video of Warhol simply eating a cheeseburger for lunch (he’s going to get sick eating so fast!).

Warhol was also a well-known avant-garde filmmaker and the museum is presenting dozens of his 16mm movies in a film festival in its third-floor theater. Some of these star the famous Edie Sedgwick, who appeared in many of his films and died tragically of a drug overdose.

Andy Warhol, who died at the age of 58 during a minor operation, led a simple middle-class existence until he arrived in New York. He was born in 1928 in Pittsburgh, graduated from Carnegie Mellon University there and then went to New York, where he became well-known. He began his career as a commercial artist, earning money for drawings for magazine ads (dozens of them are in the show).

He became famous for his portraits of Campbell’s Soup cans. He painted them because as a kid his family was so poor that he and his brothers had Campbell’s Soup for lunch every day. Warhol said he had Campbell’s soup for lunch every day for 20 years. He also saw the soup can as a window into America. He was right.

The exhibit is a large open window on American history and culture in the 1960s and ‘70s and how the outlandish Warhol starred in it and, with his genius, changed it.

Andy Warhol not remembered? Hardly.

The exhibit runs through March 31.

 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171533 https://historynewsnetwork.org/article/171533 0
Richmond’s Robert E. Lee Statue: A Southern Unionist’s Viewpoint

 

"Ever since the Lee unveiling," Elizabeth Van Lew of Richmond, Virginia wrote to a Massachusetts friend in 1891, "I have felt that this was no place for me." 

 

This was a remarkable and revealing confession. Van Lew had long been at odds with the majority of her fellow white Richmonders: while they supported the Confederacy, she had stayed loyal to the Union and played a heroic role during the Civil War as the head of an interracial Federal espionage network in the rebel capital.  The network helped Union prisoners of war escape the Confederacy and funneled military intelligence to General U.S. Grant. Van Lew skillfully stayed one step ahead of the Confederate authorities, aware that she risked banishment, imprisonment or death if caught. 

 

Through all the trials of secession, war, and Reconstruction, her resolve to remain in Richmond had never wavered.  But the May 29, 1890 unveiling of a massive equestrian statue of Robert E. Lee in Richmond--an event attended by a throng of more than 100,000 spectators--was the final straw.  Van Lew knew that the ascendant cult of the Lost Cause, which posited a "solid South" of dauntless Confederates and loyal slaves, was designed to erase her and other anti-Confederate Southerners from the region's public life and from its history books.

 

Recovering the history of Southern Unionism has been challenging work for modern historians, and it has yielded some bracing insights. William W. Freehling's 2001 book The South vs. the South: How Anti-Confederate Southerners Shaped the Course of the Civil War revealed that 450,000 men from slave states fought for the Union army during the Civil War. The group of anti-Confederate Southerners with the most measurable impact on the war were the 150,000 black men who fought in United States Colored Troops regiments.  They are part of the larger story of African American resistance to the Confederacy:  slaves fled farms and plantations by the hundreds of thousands to seek refuge with the Union army and contributed to the Union victory not only as soldiers but as nurses, spies, scouts and in a host of other ways.  

 

White Southern Unionists were more unevenly distributed across the slave states and their impact is harder to assess. Unionism prevailed in the four slaveholding border states (Maryland, Kentucky, Missouri and Delaware). There were pockets of Unionists even in the seceded states, especially in mountainous, “upcountry” districts where plantation slavery did not predominate—places like West Virginia and East Tennessee. But in the Confederacy's plantation belts, white Unionists like Elizabeth Van Lew were a small and beleaguered minority. Their motivations varied widely. Some had family ties to the North; some advocated the economic modernization of the South; some had class-based resentments against the secessionist elite; some belonged to dissenting religious communities, such as the Quakers.  Some, like Van Lew, embraced emancipation, but most were anti-secessionist rather than anti-slavery.

 

Modern scholarship has established that white Southern Unionists in the Confederacy had an impact disproportionate to their sparse numbers--by destroying Confederate assets, sowing dissension, and aiding the Union army they exasperated Confederate authorities, who devoted time and resources to trying to roust them out. In a recent study of North Carolina, historian Barton Myers found that Unionists there furnished enough localized resistance to undermine Confederate control of roughly 1/3 of the state’s counties, forcing the Confederates to deploy resources, including vital manpower, to try to stabilize the contested areas. 

 

 

White Southern Unionists played just as important a role in sustaining Northern morale as they did in bedeviling the Confederates. The very existence of loyalists such as Van Lew sustained the abiding Northern belief that substantial numbers of white Southerners yearned for deliverance from Confederate rule.  This belief was rooted in the popular "Slave Power Conspiracy" theory that elite secessionists had bullied and duped the non-slaveholding white Southern masses into accepting disunion.  Northerners went to war to save the South from the secessionists, and clung tenaciously to that war aim even in the face of massive evidence that Confederates did not want to be saved.  The belief in deliverance was fueled by stories of dissent and disaffection, which were widely disseminated in Northern popular culture:  stories of loyalist scouts, guides, and spies lending their services to the Federal army; of refugee families accepting food from Union commissary stores; of teeming crowds welcoming Union armies of occupation; of white Southerners seeking amnesty by taking oaths of allegiance to the Union; of Confederate deserters returning like prodigal sons to the Union fold.  

 

White Southern Unionists willing to endorse and defend Abraham Lincoln's leadership and policies played a key role in this vision of deliverance, as a vanguard who could help lead other Southerners into the light.  Men such as Cassius Clay of Kentucky, Andrew Jackson Hamilton of Texas, Edward Gantt of Arkansas, and Andrew Johnson of Tennessee loomed large in Northern politics as symbols of Southern deliverance.  As the head of the “Richmond Underground,” Van Lew too became such a symbol.  As George Sharpe, Union chief of military intelligence for the Army of the Potomac, put it, “for a long, long time, [Van Lew] represented all that was left of the power of the U.S. government in the city of Richmond.” 

 

When Federal troops entered Richmond in April 1865 Van Lew felt it a personal vindication. "Oh, army of my country," she confided to her journal, "how glorious was your welcome!"  But the postwar era brought further struggles and deep disappointments.  Reconstruction exposed the fault lines among white Southern Unionists, pitting the small number of progressives such as Van Lew, who supported the Radical Republican program of black civil rights, against reactionaries such as President Andrew Johnson, who accepted emancipation but rejected black suffrage and racial equality.  Van Lew had a season of influence during Congressional Reconstruction, when President U.S. Grant appointed her postmaster of Richmond, and she used that influence to promote civil rights and woman suffrage.  But her political enemies gradually drove her out of office, charging that she was too radical, and mentally unstable.  As ex-Confederates recaptured Southern politics, the complex story of the wartime divisions among Southern whites was overshadowed by the ritual glorification of the Lost Cause.

 

Van Lew’s life raised the question:  Who counts as a Southerner? The stakes of the question were high.  Defenders of slavery had long argued that the only “real” Southerners were those who upheld the dominant proslavery political orthodoxies of the region; anyone who dissented was by definition an outsider.  Van Lew objected to the Lee statue because it sent an unmistakable message to Richmond’s African Americans and to their small number of white allies that they must remain on the margins of Southern politics.  Van Lew’s 1891 lament about the Lee statue echoed that of John Mitchell, Jr., editor of the leading black newspaper in Virginia, the Richmond Planet.  In a May 31, 1890 editorial on the Lee unveiling—entitled “What It Means”—Mitchell asserted that Confederate memorialization “fosters in this Republic, the spirit of Rebellion and will ultimately result in handing down a legacy of treason and blood.”  This was a plea for justice and recognition for those Southerners who chose patriotism over treason during the Civil War, and progress over reaction in its aftermath. The South’s Confederate statues obscured the region’s political diversity, and still do.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171506 https://historynewsnetwork.org/article/171506 0
Parents, College, Money, and the American Dream

The University of Southern California, one of the schools mixed up in the college admissions scandal. 

 

The front-page news about the college admissions bribery arrests has people talking about social class, fairness, status anxiety, helicopter parenting, and whether an expensive education can translate into a lifetime of wealth and happiness.  None of this is new.  In writing about the history of babies in the 20th century United States, I discovered early 20th century baby books distributed by banks and insurance companies prompting parents to save for college. At a time when less than 20 percent of Americans completed high school and far fewer went on to higher education, financial institutions told parents to start saving for college.  

 

Insurance companies, banks, and savings and loan firms enticed customers by encouraging parental hopes and dreams.  Just as manufacturers of disinfectants, patent remedies, and infant foods turned to baby books to advertise the products parents could buy to keep babies healthy, financial firms sold their services as ways for making babies wealthy and wise--in the future. For all kinds of companies, playing to parental anxieties and aspirations became the means of expanding their clientele. 

 

Consider this example. In 1915 an Equitable Life Assurance baby book advertisement in the Book of Baby Mine began, “Say, Dad, what about my college education?” At the time, high school graduation rates hovered below 13 percent and college attendance and graduation rates were much lower. Nevertheless, parents looked to the future with great hopes for their offspring. In 1919 the United States Children’s Bureau conducted an investigation of infant mortality in the city of Brockton, Massachusetts. An immigrant Italian mother interviewed for the study reported she was saving to send all four of her young children to college. Clearly, in reaching out with a save-for-college message, financial firms were capitalizing on a common but mostly unrealized dream and helping to reinforce the message that college was a pathway to success.

 

Banks promoted thrift by reaching out to customers via motion pictures, newspaper advertisements, and programs in schools collecting small deposits from children. Competition for savers grew as the number of banks doubled between 1910 and 1913. Accounts for babies soon became part of banks' advertising strategy. Savings and loans and banks gave away baby books with perforated deposit slips, slots for coins, or simply included pages for listing deposits into the baby's bank account. The 1926 Baby's Bank and Record Book even included a section on college savings estimating a future scholar would need $1000--a figure it derived, the advertisement explained, from the University of Pennsylvania catalog. In addition to citing this source, the ad included a helpful chart showing that saving $1 a week would, with compounding interest, yield $1065.72 in fifteen years.  

 

 

The Great Depression wiped out many of the banks and small insurance companies holding the savings of infants, children, and adults, thus erasing the hopes of many who had dreamed their child would obtain a college education.  However, as children withdrew from the workforce because of the lack of job opportunities and New Deal laws limited their employment, high school completion rates grew to 40 percent by 1935.  As scholars have pointed out, G.I. benefits after World War II (the Serviceman's Readjustment Act) and the National Defense Education Act of 1958 both led to big increases in college attendance thanks to the financial support they provided. 

 

What changed in the wake of enhanced federal financial support was not the desire for one’s children to acquire more education, but the numbers of young people able to go to college. A quick look through baby books from the first half of the twentieth century shows the “go to college” message being sent and received well before government dollars came into the picture. Banks and insurance companies knew what customers dreamed of for their offspring and they made it the centerpiece of some of their advertising. Today, the vast majority of students and families still save up and borrow to afford higher education. And, of course, financial firms still promote themselves as critical resources for fulfilling this dream. What just might surprise us is how, for over a century, banks and insurance companies have been delivering this message, aware of what parents dreamed of when they gazed at their new babies and imagined their futures.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171508 https://historynewsnetwork.org/article/171508 0
The Antigay "Traditional Plan" and the History of the Methodist Church

 

Recently in St. Louis, the United Methodist General Conference strengthened its ban on same-sex marriage, doubled down on its prohibition against ordaining gay clergy, and sent an unmistakable message to its LGBTQ brethren. Advocates of this so-called “Traditional Plan” believe that Biblical teachings against homosexuality are unambiguous, that the teachings of the Bible are eternal and unchanging, and that the church should not bow to political pressure when evaluating its stance on major issues. Advocates of LGBTQ inclusion look to other Biblical precedents, insisting that the Sermon on the Mount and Christ’s injunction to love one’s neighbor as oneself are the only moral guideposts they need when considering same-sex marriage, gay ordinations, and the place of LGBTQ members within the life of the church. Both sides are equally sure that God is on their side.

 

Although conservatives in the church claim to be protecting the Bible from what they see as a dangerous faction hellbent on watering down Christian teachings, the Traditional Plan was not about defending the entire breadth of what the scriptures have to say regarding marriage and relationships, which would have included commentary on a range of practices. Proponents of the measure rejected proposals to add language condemning adultery, divorce, and polygamy into the resolution and focused entirely on homosexuality. 

 

At this moment, the future of the United Methodist Church is unclear, but feelings are raw and schism is likely. The St. Louis Conference was not routine, but it is not without historical precedent. It is ironic that the church’s antigay coalition described its plan as “traditional,” because public schisms over social and political issues are deeply woven into the fabric of Methodist history. Whether it was pressure from Prohibitionists which ultimately convinced Methodists to ban fermented wine during communion or democratic rhetoric in the Jacksonian era which coincided with a schismatic wave of anticlericalism, Methodists have never shied away from pressing issues throughout their long history.

 

But until this month, slavery was the only other issue that seriously threatened the unity of American Methodism. Just as Methodists today look to the Bible for guidance on LGBTQ inclusion, their nineteenth-century forebears turned to scripture when considering the morality of slavery. For instance, southern Methodists pointed to Romans 13:1-2, in which Paul instructs, “Let every soul be subject unto the higher powers. For there is no power but of God: the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God: and they that resist shall receive to themselves damnation.” This convinced them that their slave-based hierarchy was divinely ordained and that it was a sin to resist the clearly evident will of God. Furthermore, proslavery theologians argued that if slavery was as grave a sin as their adversaries in the North claimed, there would be some kind of explicit Biblical injunction against the practice. But there are no such teachings in the Bible and slavery was omnipresent in both the Old and New Testaments. For them, the truth of the Bible never changed, the institution of slavery had always existed, and would continue in perpetuity.

 

Opponents of slavery looked to the Sermon on the Mount and the Golden Rule when arguing that it was a sin for Christians to degrade fellow humans as chattel. Like today, both sides were equally sure that God was on their side.

 

Tension reached a boiling point at the General Conference of 1844 in New York City. According to church law, bishops could not own slaves, but Bishop James O. Andrew of Georgia had inherited slaves through his wife. Antislavery forces wanted to suspend him from the ministry for violating church law. Southern delegates passionately defended Andrew and the institution of slavery while northerners argued that it was dangerous to vest religious authority in somebody who willingly participated in such a corrupt system. 

 

When neither side showed a willingness to back down, the church split into separate denominations. The new southern branch of Methodism rested on a cornerstone of slavery and white supremacy. The schism opened the door to legal wrangling over church property that embittered both sides for generations and which ultimately had nothing to do with morality and everything to do with wealth and power. The churches reunited in 1939, after a ninety-five-year estrangement.

 

Today, it might seem shocking that Christians would look to the Bible to defend slavery, but advocates were defending more than the institution; they genuinely felt that they were protecting the integrity of the Bible against an attempt by radicals in their own church to dilute the scriptures. This month, proponents of the Traditional Plan tapped into a Methodist tradition that should give them serious pause. Meanwhile, legal challenges over property might again become the ultimate arena for power plays masquerading as morality. 

 

The historical lessons of 1844 are abundantly clear. Methodists have always been committed to faith, prayer, charity, and walking with God, but they still turn their backs on their brethren and struggle to live up to the highest ideals of Christianity and even their own motto, “open hearts, open minds, and open doors.”

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171510 https://historynewsnetwork.org/article/171510 0
The Death of Appeasement: the 80th Anniversary of the Invasion of Prague

Hitler entering Prague Castle, 15 March, 1939.

 

A turning point in the history of international relations is an event that significantly alters the course of international affairs and has a long-lasting, considerable effect on it. A turning point may not necessarily be the trigger of a significant change in international relations, but rather part of the underlying cause leading to it.

The turning point in the history of international relations in the 1930s occurred in 1939. However, it was not the outbreak of World War II in September of that year, but rather the German invasion of what remained of Czechoslovakia in March 1939, following the Munich Agreement of 1938, that represented that turning point.

It was a significant landmark as it showed even to the most enthusiastic supporters of the appeasement policy towards Nazi Germany that German ambitions went well beyond the supposed rectification of the wrongs done to Germany by the Versailles settlement of 1919, following World War I (or the Great War as it was then known).

The appeasement policy pursued by Britain and France was founded on the premise that Germany was maltreated by the victors of World War I, and that German grievances had some justification and could be accommodated in order to prevent the outbreak of a major European war.

Employing the rhetoric of the parliamentary democracies, Adolf Hitler and the Nazi regime argued that the German people had the right to collective self-determination. Consequently, the prohibition included in the Versailles settlement according to which Austria should remain a separate state and not be allowed to be part of a larger German state was deemed to be unjust. After all, shouldn’t the Austrian people ‘freely’ decide whether they wish to live in a separate sovereign entity or be incorporated into Germany?

By the same token, and following the same logic, Germany argued that the German inhabitants of the Sudeten region in Czechoslovakia should have the right of self-determination. The fact that the area concerned was an integral part of the sovereign territory of Czechoslovakia (incidentally the only truly parliamentary democracy in Central and Eastern Europe) only enhanced the case put forward by Germany. The German population in the Sudeten region was being treated harshly by the government in Prague, claimed Hitler and the Nazi propaganda machine. 

Thus, the case for national self-determination was no less valid as regards the German minority in Czechoslovakia than it was in the case of Austria. The more Germany accused Czechoslovakia of the supposedly inhuman treatment of its German inhabitants, the more Britain and France feared the prospect of a general European war. Why risk such a war if applying the right of collective self-determination could actually avert it?

Of course, Britain’s Prime Minister, Neville Chamberlain, thought of that conflict as “a quarrel in a far-away country between people of whom we know nothing.” He didn’t seem to be particularly concerned with the fate of the German minority in Czechoslovakia or, indeed, with their right of self-determination. His policy, though, was based on the assumption that German grievances in this matter were related to the supposed wrongs inflicted on Germany and the German people following World War I.

To be sure, the real issue was not the quarrel to which Chamberlain referred, but rather the ambitions of the Nazi regime. After all, even Chamberlain himself asked Hitler whether he had any further territorial demands, beyond the Sudeten region; to which the German leader replied in the negative. Indeed, Hitler stressed cynically, albeit in a rhetorical way, that he was not interested in adding Czechs and Slovaks to the German Reich.

Once German troops entered Prague in March 1939, the whole conceptual edifice justifying German demands collapsed. There was no way anyone could logically justify the German move by resorting to the supposed evils imposed upon Germany by the Versailles settlement. The cherished principle of collective self-determination could apply now in reverse: it was the Czech people who were denied their right of self-determination. 

In a sense, it could be argued that the day German forces occupied Prague was the day that the anti-appeasers in Britain, led by Winston Churchill, turned from a cornered minority into a solid majority. It was also the watershed that altered once and for all the character of international relations in Europe: the time for pretense was over. The sense of remorse over the post-World War I settlement and the widely accepted principle of national self-determination could no longer justify any policy aimed at preventing war with Nazi Germany.

March 1939 was the real turning point. September 1939 was to be the climax.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171456 https://historynewsnetwork.org/article/171456 0
Why is the Mona Lisa not the Mona Lisa?

 

One of the great riddles of modern times is why a 500-year-old portrait of a Florentine housewife, of no rank or title, is today the most famous painting in the world. But the mystery around that portrait has now deepened. New findings suggest that Leonardo da Vinci’s most celebrated work may not depict M(ad)onna Lisa, wife of the silk merchant Francesco del Giocondo, at all. For example, art experts have long believed that the painting dates from Leonardo’s late period, given its heavy sfumato or dark, “smoky” treatment. But a handwritten note, discovered in the library of the University of Heidelberg, has revealed that Leonardo was painting the work as early as 1503, at least a decade before Leonardo’s late period. What’s more, a 16th century biographer, Giorgio Vasari, claims that Leonardo worked on it for only four years and then left it unfinished. That’s a bigger problem, because if anything, the Mona Lisa in the Louvre is as polished as any Leonardo work would ever get. And to top it off, Leonardo himself once told a visitor in 1517 that the portrait was made for Giuliano de’ Medici, brother of Pope Leo X, under whose patronage Leonardo lived in Rome between 1513 and 1516.

 

Why is this important? The simple answer is: we believe these sources don’t agree because they are not talking about the same painting. In other words, there must be two versions of the Mona Lisa, both painted by da Vinci. Would this be a radical theory? Not at all. We know, for example, that Leonardo painted two versions of The Virgin of the Rocks, now in the Louvre and the National Gallery in London, just as he painted two versions of The Virgin of the Yarnwinder, now in private collections. In many of these works, Leonardo allowed collaborators to “fill in” the less important parts, such as the background and the clothing, while he concentrated on the critical stuff: the faces and hands. This process served two purposes: it allowed Leonardo’s assistants to learn the master’s technique, and it increased the number of works that could be sold. Leonardo worked very slowly and only produced 18 paintings that we know of, which is a problem when you have a large workshop and many mouths to feed.

 

 

But if Leonardo painted two Mona Lisas, where is the other one? Diligent research into a number of possible candidates, notably by Salvatore Lorusso and Andrea Natali at the University of Bologna, has produced a winner: a portrait known as the Isleworth Mona Lisa, long believed to be a copy. Close inspection reveals that unlike all other 16th century versions of the Mona Lisa, the Isleworth is not a copy at all. In key respects, including its size, its composition, its landscape, and its support (it was painted on canvas, rather than wood), it is strikingly different from the Louvre Mona Lisa. What’s more, it depicts Lisa del Giocondo as a beautiful young woman. That makes sense, because in 1503 Lisa was just 24 years old, quite unlike the matronly figure who stares at us from behind her bulletproof glass at the Louvre. And here is the clincher: the Isleworth is unfinished, just as Vasari wrote. Most of the background never progressed beyond the gesso, the underpaint.

 

So why has no one else looked more closely at the Isleworth version? As we argue in our book, the answer is that for much of the past 500 years, she was not on public view. First acquired by an 18th century English nobleman, the work was discovered in 1913 by a British art dealer, Hugh Blaker, during a visit to a stately home in Somerset. The Isleworth then spent much of the 20th century in vaults, in part to protect her from two world wars. In the 1960s, she was purchased by a collector, Henry Pulitzer, who soon thereafter locked her up in another vault, where she was discovered in 2012. And now that’s going to change, because there are plans to put her on exhibit in the land of her birth, later this year—as part of the commemoration of the 500th anniversary of Leonardo’s death in 1519. That’s when all the world will be able to see her in all her glory—as the true likeness of a lovely young mother who would remain on Leonardo’s mind for the rest of his life.

 

That brings us to the final question: if the Isleworth Mona Lisa is indeed the earlier version, then who is the older woman in the Louvre? Is she a later version of Lisa del Giocondo, painted in middle age? But why would Giuliano de’ Medici, a powerful aristocrat who took several mistresses, want a likeness of the wife of a silk merchant? Could the Louvre Mona Lisa depict one of his paramours instead? A number of candidates come to mind, but the solution that seems obvious to us is also the simplest: the Louvre Mona Lisa is no longer the likeness of a particular person. While Leonardo used the same “cartoon,” the preparatory drawing, from the earlier version, he was no longer interested in painting an actual portrait. Instead, he wanted to create an idealized image of motherhood and the mysterious forces of creation, as evidenced by the primordial landscape in the background. Remember, Leonardo was taken away from his mother, Caterina, when he was only five years old. That’s why Leonardo was so struck by Lisa, because in 1503 she was close in age to his last memory of his mother. And that’s why, for the Louvre painting, Leonardo chose Lisa again—not to paint her likeness, but to capture the eternal grace and mystery of womanhood.

 

Don't miss the TV special based on the author's research: The Search for the Mona Lisa, narrated by Morgan Freeman and produced by Pantheon Studios. Timed to coincide with the 500th anniversary of Leonardo’s death on May 2, 2019, the film shows that the portrait has become the center of a swirling controversy. If she is not the Mona Lisa, then who is she? Why did Leonardo da Vinci paint her? Using newly discovered evidence, and featuring Italian star Alexandro Demcenko as Leonardo, the film is a thriller-like pursuit for the real identity of the most famous portrait in the world.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171455 https://historynewsnetwork.org/article/171455 0
London at War, 1941, and a New Alice in Wonderland LONDON, ENGLAND: The German Luftwaffe staged yet another bombing raid on London last night. Prime Minster Winston Churchill said the German Air Force, sneaking through the Royal Air Force defenses in the sky, dropped more than 1,100 bombs on the southeastern area of London. 122 people were killed and 42 injured in the raid, yet another strike on England by Hitler’s Germany

Below ground, hiding with tens of thousands of others in London’s underground train stations, was Alice Spencer, a teenager whose boyfriend, Alfred, is slowly dying of tuberculosis and lying on a nearby cot. She cares for him in the crowded station while reading her favorite book, Lewis Carroll’s classic Alice’s Adventures in Wonderland. There is a series of explosions, the lights go dim and suddenly, without warning, Alice, looking for an escape from the grimness of the scene, and the poor health of her love, tumbles down an underground rabbit hole, pulling some of the other people in the train station with her. They all find themselves in Wonderland, populated by the Queen of Hearts, Mr. Caterpillar, the White Rabbit and others, a thoroughly zany place to be while a World War rages above them.

This is the very unusual start of Alice By Heart, a new musical with book by Steven Sater and Jessie Nelson and music and lyrics by Duncan Sheik and Sater, which just opened at the new MCC Theater at 511 W. 52nd Street in New York. It starts slowly and it takes a while to figure out who is who and what the authors are trying to do with the 154-year-old Carroll classic story of the girl who chased a rabbit and fell head first down a hole into Wonderland. About twenty minutes into the musical, though, everything starts to click. What follows is a highly amusing, thoroughly enjoyable and at times brilliant show, a unique and highly creative re-invention of Carroll’s timeless story.

In the play, the British Alice goes through many of the same adventures as the Alice in the book, cavorting with the Caterpillar, chasing the White Rabbit and shaking her head at the antics of the Mad Hatter. She is put on trial by the Queen of Hearts, a delightful character who must have the loudest and scariest screech in the world. She could wake up people in Brazil with that screech. At the end of the Wonderland trip, she returns to the shelter to try to save her boyfriend.

What makes Alice by Heart succeed is the stellar skill of its ensemble cast. Each person not only acts out his/her role, but pays careful attention to the movements of the others. The two stars of the show are Molly Gordon as Alice and Colton Ryan as Alfred. Others in the fine cast include Kim Blanck, Noah Galvin, Grace McLean, Catherine Ricafort, Heath Saunders, Wesley Taylor and others.

 Jessie Nelson does a fine job of directing the play. The choreography, by Rick and Jeff Kuperman, is very impressive. 

The musical has some wonderful special effects. As everybody knows, the Queen is always asking for people to decapitate Alice (“Off with her head!”). They do that in the show. As Alice sings, you see her shadow on a large white sheet. After a few seconds her shadow is rather neatly beheaded. In another scene one actress douses Alice with a magical dust that slowly hovers in the air. People in gas masks run amuck. At the end of the play, there is a wonderful musical number about the Mock Turtle in which soldiers sing and dance in a wondrous mass. There are people popping out from underneath a woman’s skirt, a caterpillar who grows in size as people jump on and off of him. The magnificently dressed Mad Hatter leaps onto and off of a table.

The musical does have its flaws. It has a slow and dreary start. The music is OK, but after a while some songs sound just like one another. There is also too much bouncing back and forth between the underground station and Wonderland.

The writers should also have included more history on the “Blitz” bombing of London by the Luftwaffe during the war. My father was a GI in London in WW II and he told me really scary stories about hiding out in the underground during the bombings. The playwrights should have given audiences a better picture of that. The British use of their subways as bomb shelters was fascinating. During the height of the bombings the Luftwaffe struck just about every night of the week. The subways held roughly 150,000 men, women and children, who got on line at 4 p.m. to secure a spot in the underground train shelters. The shelters were run by the Red Cross, the Salvation Army and other charitable groups.  Concerts, films, plays and books donated by local libraries were used as entertainment for the residents.

The shelters were not completely safe. Direct bomb hits wiped out some, killing hundreds. 173 people were trampled to death in a panic that followed a woman’s fall down the stairs in the Bethnal Green subway station.

Lewis Carroll’s two books, Alice’s Adventures in Wonderland (1865) and Through the Looking Glass (1871), generated forty movies and television shows and, of course, the rock and roll hit “White Rabbit,” performed by Grace Slick and the Jefferson Airplane in the late 1960s.

Despite these small complaints, Alice by Heart is a smart, nifty show about London in World War II and yet another colorful tale of the Mad Hatter and Wonderland.

Can you really spend a better evening than chasing a white rabbit trotting through the forest with a pocket watch in his hand?

PRODUCTION: The play is produced by the MCC Theater. Set Design: Edward Pierce, Costumes: Paloma Young, Lighting: Bradley King, Sound: Dan Moses Schreier. The play is directed by Jessie Nelson and choreographed by Rick and Jeff Kuperman. It runs through April 7.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171534 https://historynewsnetwork.org/article/171534 0
Why the U.S. Bombed Auschwitz, But Didn't Save the Jews

Hungarian Jewish Women and children after their arrival at Auschwitz.

 

Seventy-five years ago this week—on March 19, 1944—German troops marched into Hungary. The country’s 800,000 Jews, the last major Jewish community to have eluded the raging Holocaust, now lay within Hitler’s grasp.

The railway lines and bridges across which Hungary’s Jews would be deported to the Auschwitz death camp in Poland were within range of American bombers. So were the gas chambers and crematoria in the camp itself. The refusal of the Roosevelt administration to drop bombs on those targets is one of the most well-known and troubling chapters in the history of international responses to the Holocaust.

What few realize today is that U.S. planes actually did bomb part of Auschwitz—but they left the mass-murder machinery, and the railways leading to it, untouched. Why?

The same week that the Germans occupied Hungary, two Jewish prisoners in  Auschwitz were in the final stages of plotting to escape, something that only a tiny handful of inmates had ever accomplished. Their goal was to alert the Free World that the gas chambers of Auschwitz were being readied for the Jews of Hungary. They hoped these revelations would prompt the Allies to intervene.

On April 7, 1944, Rudolf Vrba, 19, and Alfred Wetzler, 25, slipped away from their slave labor battalion and hid in a hollowed-out woodpile near the edge of the camp. On the advice of a Soviet POW, the fugitives sprinkled the area with tobacco and gasoline, which confused the German dogs that were used to search for them.

After three days, Vrba and Wetzler emerged from their hiding place and began an eleven-day, 80-mile trek to neighboring Slovakia. There they met with Jewish leaders and dictated a 30-page report that came to be known as the "Auschwitz Protocols." It included details of the mass-murder process, maps that pinpointed the location of the gas chambers and crematoria, and warnings of the impending slaughter of Hungary's Jews. "One million Hungarian [Jews] are going to die," Vrba told them. "Auschwitz is ready for them. But if you tell them now, they will rebel. They will never go to the ovens." 

 

What FDR Knew, and When

The fate of Hungarian Jewry unfolded before the eyes of the world. Unlike previous phases of the Holocaust, which the Germans partially succeeded in hiding from the international community, what happened in Hungary was no secret.

A common refrain among defenders of President Franklin D. Roosevelt’s response to the Holocaust is the claim that he and his administration learned about the deportations from Hungary too late to do much about it. For example, a recent essay in The Daily Beast, by journalist Jack Schwartz, claimed that “The Allies learned of the Hungarian deportations and their lethal destination in late June”—that is, not until five weeks after the deportations commenced. 

But in fact, Washington knew what was coming. At a March 24, 1944, press conference, FDR, after first discussing Philippine independence, farm machinery shipments, and war crimes in Asia, acknowledged that Hungary’s Jews “are now threatened with annihilation” because the Germans were planning “the deportation of Jews to their death in Poland.” The president blurred the issue by coupling it with a remark about the danger that “Norwegians and French” might be deported “to their death in Germany,” but the key point is clear: If we wonder “what did they know, and when did they know it,” the answer with regard to Hungary is that the Roosevelt administration knew plenty, and knew it early.

The Holocaust in Hungary was widely reported, and often in timely fashion, by the American news media (although it was not given the prominence it deserved). For example, on May 10, nine days before the deportations to Auschwitz began, the New York Times quoted a European diplomat warning that the Germans were preparing “huge gas chambers in which the one million Hungarian Jews are to be exterminated in the same fashion as were the Jews of Poland.” 

Likewise, on May 18, the Times reported that “a program of mass extermination of Jews in Hungary” was underway, with the first 80,000 “sent to murder camps in Poland.” The notion that the Roosevelt administration only learned about all this in “late June” is preposterous.

 

Appeals for Bombing

Meanwhile, copies of the Auschwitz escapees’ report reached rescue activists in Slovakia and Switzerland. Those activists then authored an appeal to the Roosevelt administration to bomb “vital sections of these [railway] lines, especially bridges” between Hungary and Auschwitz, “as the only possible means of slowing down or stopping future deportations.” The plea reached Washington in June.

Numerous similar appeals for bombing the gas chambers, or the rail lines and bridges leading to them, were sent to U.S. officials by American Jewish organizations throughout the spring, summer, and fall of 1944.

Assistant Secretary of War John McCloy was designated to reply to the requests. He wrote that the bombing idea was "impracticable" because it would require "diversion of considerable air support essential to the success of our forces now engaged in decisive operations." He also claimed the War Department's position was based on "a study" of the issue. But no evidence of such a study has ever been found by researchers. 

In reality, McCloy's position was based on the Roosevelt administration’s standing policy that military resources should not be used for "rescuing victims of enemy oppression."

The aforementioned Daily Beast article claimed that the administration’s rejection of the bombing requests “reflected military reality as perceived by a defense establishment with stretched resources trying to meet the diverse demands of an all-encompassing war.”

That’s nonsense. The “military reality” was that at the same time McCloy was saying Auschwitz could not be bombed, Auschwitz was being bombed. Not the part of Auschwitz where the gas chambers and crematoria were situated, but rather the part where slave laborers were working in German oil factories.

On August 20, a squadron of 127 U.S. bombers, accompanied by 100 Mustangs piloted by the all-African American unit known as the Tuskegee Airmen, struck the factories, less than five miles from the part of the camp where the mass-murder machinery was located.

 

What Elie Wiesel Saw

Future Nobel Laureate Elie Wiesel, then age 16, was a slave laborer in that section of the huge Auschwitz complex. He was an eyewitness to the August 20 bombing raid. Many years later, in his best-selling book ‘Night’, Wiesel wrote: “If a bomb had fallen on the blocks [the prisoners’ barracks], it alone would have claimed hundreds of victims on the spot. But we were no longer afraid of death; at any rate, not of that death. Every bomb that exploded filled us with joy and gave us new confidence in life. The raid lasted over an hour. If it could only have lasted ten times ten hours!” 

There were additional Allied bombing raids on the Auschwitz oil factories throughout the autumn. American and British planes also flew over Auschwitz in August and September, when they air-dropped supplies to the Polish Home Army forces that were fighting the Germans in Warsaw. They flew that route twenty-two times, yet not once were they given the order to drop a few bombs on the death camp or its transportation routes.

Adding insult to inaccuracy, Jack Schwartz claimed (in The Daily Beast) that “in Palestine, the Jewish Agency [the Jewish community’s self-governing body] overwhelmingly opposed the bombing [of Auschwitz] on the grounds that it would likely take Jewish lives,” and “American Jewish leaders were equally divided over the issue, which led to recriminations during and after the war.”

Wrong, and wrong again. The minutes of Jewish Agency leadership meetings show they opposed bombing for a period of barely two weeks, and even then only because they mistakenly thought Auschwitz was a labor camp. Then they received the Vrba-Wetzler “Auschwitz Protocols,” revealing the true nature of the camp. At that point, Jewish Agency representatives in Washington, London, Cairo, Geneva, Budapest and Jerusalem repeatedly lobbied U.S., British and Soviet officials to bomb Auschwitz and the routes leading to it.

As for American Jewish leaders, a grand total of one of them urged the Allies to use ground troops against Auschwitz instead of air raids. By contrast, pleas in support of bombing were made in Washington by multiple representatives of the World Jewish Congress, Agudath Israel, the Labor Zionists of America, and the Emergency Committee to Save the Jewish People of Europe (the Bergson Group). Calls for bombing also appeared in the columns of a number of American Jewish newspapers and magazines at the time.

 

Motives for Rejection

Now we come to the vexing question of why the Roosevelt administration rejected the bombing requests.

The explanation that the administration gave at the time—that bombing Auschwitz or the railways would require diverting bombers from battle zones—was clearly false, since we know that U.S. bombers did bomb other targets within the Auschwitz complex (the oil factories).

A second argument has been made by some FDR apologists: that bombing was a bad idea because some of the Auschwitz inmates would have been killed. But that does not hold up, either—first, because that was not the reason given for the rejections at the time; and second, because it fails to explain why the administration refused to bomb the railway lines and bridges, which would not have involved any risk to civilians.

So what, then, was the real reason for the administration’s rejection?

In all likelihood, it was the result of several factors. One was old-fashioned antisemitism. The antisemitic sentiments rife among senior officials of the State Department and War Department have been amply documented. What about the White House? Jack Schwartz, in The Daily Beast, mocked any suggestion that President Roosevelt harbored antisemitic feelings, pointing out that he “surrounded himself with Jewish advisers” and “staffed the New Deal…with Jewish activists.” In other words, some of FDR’s best friends were Jewish.

A more informed perspective would consider Roosevelt’s actual statements on the subject. For example, as a member of the Harvard board of governors, he helped impose a quota on admitting Jewish students so they would not be “disproportionate,” as he put it. He called a questionable tax maneuver by the owners of the New York Times in 1937 “a dirty Jewish trick.” He said in 1938 that the behavior of “the Jewish grain dealer and the Jewish shoe dealer” was to blame for antisemitism in Poland. 

FDR continued to make such remarks (behind closed doors) in the 1940s. He complained to his cabinet in 1941 that there were “too many Jews among federal employees in Oregon” (which he had recently visited). In 1942, he used the slur “kikes” in reference to Jewish Communists. At the Casablanca conference in 1943, he said strict limits should be imposed on North African Jews entering professions, in order to “eliminate the specific and understandable complaints which the Germans bore towards the Jews in Germany, namely, that while they represented a small part of the population, over fifty percent of the lawyers, doctors, school teachers, college professors, etc, in Germany, were Jews.” 

Do such statements reflect antisemitism? Or when it comes to assessing antisemitism, should there be one standard for revered former presidents and a different standard for everyone else?

Another factor in the decision not to bomb Auschwitz was a practical consideration: rescuing Jews meant the Allies would be stuck with a lot of Jewish refugees on their hands. At one point during the war, a senior State Department official warned his colleagues that any U.S. action to rescue Jewish refugees was “likely to bring about new pressure for an asylum in the Western hemisphere.” Another official characterized Jewish refugees as a “burden and curse,” and he worried about the “danger” that the Germans “might agree to turn over to the United States and to Great Britain a large number of Jewish refugees.”

This is not to say that antisemitism and the fear of pressure to admit refugees were the decisive factors. More likely they served to buttress or reinforce the main factor, which was the overall mindset in the administration that America had neither a national interest in nor a moral obligation to pursue humanitarian objectives abroad.

This attitude was articulated, most notably, in the War Department’s internal decision, in early 1944, that it would not utilize any military resources “for the purpose of rescuing victims of enemy oppression unless such rescues are the direct result of military operations conducted with the objective of defeating the armed forces of the enemy.”

Bombing bridges and railway lines over which both deported Jews and German troops were transported could have qualified as necessary for military purposes. But not when the prevailing attitude in the White House and other government agencies was one of hardheartedness when it came to the Jews, reinforced by antisemitism and nativism. 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171509 https://historynewsnetwork.org/article/171509 0
History Will Clash With History in the 2020 Election

 

Americans love predicting the future based on historical precedent. “No president has ever been reelected when the unemployment rate was above X percent,” “Missouri has predicted the winner for the last so-and-so many years,” etc. 

As we look to the 2020 presidential election, three different historical precedents each suggest a different outcome. In other words, "history clashes with history," and, at the end of it all, the historical record will change.

First, only twice in American history have three consecutive presidents each won and completed two terms of office. The first period is that of Thomas Jefferson, James Madison, and James Monroe, from 1801-1825. The second came from 1993-2017, when Bill Clinton, George W. Bush, and Barack Obama all served two terms.

Never before has a fourth president in a row won two terms. If Donald Trump goes on to win a second term and finishes it in January 2025, it will be unprecedented. This precedent suggests Trump is unlikely to win.

A second historical trend, however, indicates that it is extremely unlikely a Republican would lose the presidency to a Democrat after a single term. A political party has lost control of the White House after serving one term just once in the past 31 election cycles. The norm has been that a political party serves at least eight years before being ousted. For example, Democrats held the White House from 1853-1861 (Franklin Pierce, James Buchanan), Democrats from 1913-1921 (Woodrow Wilson), Republicans from 1953-1961 (Dwight D. Eisenhower), Democrats from 1961-1969 (John F. Kennedy, Lyndon B. Johnson), and Republicans from 1969-1977 (Richard Nixon, Gerald Ford). Also, parties have held control of the White House for 12 years, as from 1789-1801 (Federalists George Washington and John Adams), 1829-1841 (Democrats Andrew Jackson, Martin Van Buren), 1921-1933 (Republicans Warren G. Harding, Calvin Coolidge, and Herbert Hoover), and 1981-1993 (Republicans Ronald Reagan and George H. W. Bush). We also had 16 years of the same party from 1897-1913 (Republicans William McKinley, Theodore Roosevelt, William Howard Taft), 20 years from 1933-1953 (Democrats Franklin D. Roosevelt and Harry Truman), and 24 years from 1801-1825 (Democratic-Republicans Thomas Jefferson, James Madison, James Monroe) and 1861-1885 (Republicans Abraham Lincoln, Andrew Johnson, Ulysses S. Grant, Rutherford B. Hayes, James A. Garfield, and Chester Alan Arthur).

The only times a party has controlled the White House for just one term were 1841-1845 (William Henry Harrison, John Tyler), 1845-1849 (James K. Polk), 1849-1853 (Zachary Taylor, Millard Fillmore), 1885-1889 (Grover Cleveland), 1889-1893 (Benjamin Harrison), 1893-1897 (Grover Cleveland), and 1977-1981 (Jimmy Carter). So a party has lost control of the Presidency after only one Presidential term just once in the past 31 election cycles. If a Democrat wins in 2020, therefore, it would go against historical precedent.

The third historical trend is that most presidents who served just one term faced a significant challenge from within their party prior to the general election. Every President since 1900 who has lost reelection, with one exception, did so after his party nomination was challenged. This was the case for William Howard Taft in 1912 (challenged by Theodore Roosevelt), Gerald Ford in 1976 (challenged by Ronald Reagan), Jimmy Carter in 1980 (challenged by Ted Kennedy and Jerry Brown), and George H. W. Bush in 1992 (challenged by Pat Buchanan). The one exception is Herbert Hoover in 1932.

The ongoing investigations into Donald Trump’s Presidential campaign, his administration, and every other aspect of his life and business activities, coupled with the likelihood of an economic slowdown, make it harder to think Trump will win reelection. A serious challenge from within his party seems unlikely at this time, but it would certainly not be unprecedented.

Regardless of what happens, history will clash with history, and history in some fashion will be changed as a result.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171511 https://historynewsnetwork.org/article/171511 0
The Cult of Trump

 

I am a ferociously independent  and passionately moderate voter. But in 2016 I voted a straight party ticket for the Democrats. It was the only way - however meek - to send a message to the Republican Party: You gave us Trump, shame on you, now get rid of him.  

But Republican legislators, who have much to lose in the long run, cave in to the Trump base, fearing they might be “primaried” and lose their seats, and give the disaster that is the Trump presidency a free pass. They are wrong, morally and politically, to do so. What happened to the Republican Party? Where is it hiding? Wherever you are, come back; we need you badly.

Part of the problem is that many Republicans confuse Donald Trump with a Republican. He is not a party president, and barely a party member. He is, instead, the leader of a cult. This transformation was noted a few months back by then-Senator Bob Corker of Tennessee, a Republican, who openly asked if his party was “becoming a cultish thing.” Donald Trump does not represent traditional Republican values, policy positions, or core principles such as limited government, individual freedom, respect for democratic norms, freedom of expression and press, active and strong democratic alliances, a pro-democracy foreign policy, and the dignity of public service. No, this is not your daddy’s Republican Party; it is Trump’s personal cult.

As the Republicans morph from national party to cult of personality, it might be useful to reflect on the role cults have played in American politics.  Ordinarily, when we think of cults we think of religious cults, but there have been political cults in America as well (see: Dennis Tourish and Tim Wohlforth’s ON THE EDGE: POLITICAL CULTS RIGHT AND LEFT, 2000).  Not surprisingly, cults have had a very short shelf-life in the United States. A largely pragmatic, non-ideological nation, Americans have been suspicious of extremism and narrow politics. Part of the reason for the failure of cults to catch on in the U.S. can be seen in Alexis de Tocqueville’s 1835 reminder in DEMOCRACY IN AMERICA that “the spirit of association” exerts a powerful influence in the United States, and that such associations, often cross-cutting ideologically and politically, help produce a more moderate and perhaps even a more tolerant political atmosphere.  

Parties in the U.S. have succeeded by being “big tents” that are somewhat inclusive, large, and centrist (center-right for the Republicans, center-left for the Democrats). By contrast, cults are narrow, extreme, and exclusive. Traditionally, parties served as gatekeepers, keeping radical extremists at the fringes and not allowing them to capture the party. In Europe, the rise of fascism in the 20th century was seen by many as a cult movement, and in North Korea today, some see the 60-year rule of the Kim family as a three-generation “cult of personality.” But in the United States, such takeovers by cults have been largely unknown.

What are the key characteristics of a cult, and how closely does Trump fit the bill? Cults blindly and mindlessly follow a charismatic leader. Donald Trump recognizes this element in his base with such sayings as “I could stand in the middle of Fifth Avenue and shoot somebody and I wouldn't lose any voters.” Indeed, such is the blind loyalty to Trump that he is probably right. Cults worship their leader. Even if the leader says things like “John McCain isn’t a hero” or trashes Gold Star families, his base applauds and follows their Pied Piper wherever he wants to take them.

The cult leader’s word is gold to his followers. And so, his base turns a blind eye to the thousands (yes, thousands) of lies he tells. Cults have their ritualistic chants: “Lock her up! Lock her up!” and “Build the wall!” are shouted at Trump rallies from coast to coast. Cult leaders claim to be on a “special mission,” and as Trump says, “Only I can do it.” Cults have insiders, and the rest of the world is an outsider to be berated and hated. Cult leaders are not accountable, and thus Trump says he will not release his tax returns as previous presidents have routinely done. Cults believe the ends justify the means, and thus the President bullies, demeans, and calls others ugly names, behavior that seems to the rest of us undignified and unpresidential but to the Trump cult is fully justified (my mother would have washed my mouth out with soap if I had said such things).

Former cult members write scathing exposés of their terrible lives within the cult. After only two years in office, a spate of Trump administration tell-all books has come out describing the horrors of working for such a monstrous boss. Cults have a persecution complex. How many times does Trump tweet out messages condemning Saturday Night Live for its impersonations of the President? How many times does Trump blame the press for his failings? Cults engage in group-think. Cults kowtow to the leader’s every whim. They show disdain for non-members. Cults are paranoid. How many tweets does one have to read to see that our President thinks the media, our allies, college professors, and authors are out to get him? Cults control the information members receive. And of course, President Trump calls the media “the enemy of the people,” not to be believed, and tells his base to “not believe what you see and hear, believe me.” Cults tolerate, even celebrate, the inappropriate and egregious behavior of their leader. And so, Trump is not to blame for the strippers and Playboy models with whom he may have had relationships while still married. Boys will be boys, and demeaning references to women are just “locker room talk.” I could go on.

What to do? The Republicans gave us Trump; they should now clean up their mess. Someone must take him on in the primaries. Republicans need to return to their core values and principles and not be pet poodles for Donald Trump’s excesses. Our system works best when we have two strong parties that vie for power but can come together at times for the good of the nation. Caving in to the cult leader is not politics; it is party suicide. If the Republican Party is to survive into and beyond the next decade, it must wrest control of the party from the Trump cult. If not, it deserves to be electorally defeated and to collapse into the dustbin of history.

Donald Trump did not create the conditions that allowed for his rise. Global events, easily witnessed in the increasingly fractious politics of Europe, are challenging liberal democracies with a brand of illiberal democracy that threatens rule-of-law systems across the globe. Donald Trump is the American manifestation of this dangerous drift. He is riding a wave he did not create but has been masterful at exploiting. But just as the United States resisted the temptations of Communism and Fascism in the 1930s and 40s and instead committed to a rule-of-law system for our country, we now need to recommit to liberal democracy and the rule of law in our age.

Where will we be when this long national nightmare is over? Will our democracy be stronger? Our society more just and equal? Our politics more civil and our language more compassionate? Will we move toward what our better angels would have us do, or will we follow the cult of leadership toward a politics of fear and division? Come home, Republicans. We need you.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171507 https://historynewsnetwork.org/article/171507 0
The History of International Women’s Day and the Ongoing Fight for Gender Equality

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Theresa Serber Malkiel

 

Last Friday, March 8, was International Women’s Day. You might not have known that, since little notice is given to this date in the US, even though American women initiated it. Here in Berlin, one could not help but be aware of this special day, because the city government had declared it a holiday, and everything was closed except restaurants and museums.

 

A “National Women’s Day” was first declared by women in the Socialist Party of America for February 28, 1909. It was proposed by Theresa Serber Malkiel (1874-1949), whose story exemplifies the history of the uneasy connection between leftist politics and women’s rights in Europe and America, and the continued relevance of a “women’s day”.

 

As part of the emigration of 2 million Jews from the increasingly antisemitic Russian Empire between 1881 and the beginning of World War I, the Serber family moved from Ukraine to New York in 1891. Theresa went to work in a garment factory. At age 18, she organized the Woman’s Infant Cloak Maker’s Union of New York, made up mostly of Jewish women workers, and became its president. Like many trade unionists, she gradually came to believe that socialism was the only path to liberation for workers and for women. The next year, she led her union into the Socialist Labor Party, the first socialist party in the US. Angered at the authoritarian tendencies of the SLP leader, Daniel De Leon, she and others joined with Midwestern socialists Eugene Debs and Victor Berger to form the Socialist Party of America in 1901.

 

At that time, both in the US and in Europe, socialists were the only political group to openly advocate women’s equality. In contrast to suffragists, socialists argued that gaining the vote was only the first step in creating an egalitarian society. But Theresa Serber almost immediately attacked the tendency of socialist men to say they favored gender equality, but to do nothing to bring it about, even within their own ranks. She formed separate women’s organizations to reach out to women workers and discuss their particular issues. She denounced the relegation of women in the Party to traditional women’s roles: women were “tired of their positions as official cake-bakers and money-collectors.” In 1909 she published an essay, “Where Do We Stand on the Woman Question?” criticizing her socialist “brothers” for their attitude toward female colleagues: “they discourage her activity and are utterly listless towards the outcome of her struggle.”

 

That year, Serber was elected to the new Women’s National Committee of the Socialist Party, and she promoted the idea of a “National Women’s Day” on February 28. In 1910, she published “The Diary of a Shirtwaist Worker”, a novel about the 3-month strike by about 20,000 mostly Jewish women factory workers in New York, the largest strike by women to that point in American history, which won better pay and shorter hours.

 

In 1910, German socialist women at the International Socialist Women's Conference in Copenhagen proposed creating an annual Women’s Day to promote equal rights. By 1914, March 8 was established as the day for demonstrations across Europe and America. The importance of this event grew when a women’s strike on March 8, 1917, in St. Petersburg began the Russian Revolution.

 

Women won the vote across Europe and America over the next few years: Russia 1917, Germany 1918, United States 1920, England 1928, although many individual American states had already given women the vote. Some nations moved slowly toward women’s suffrage: France and Italy only granted women voting rights in 1945.

 

But as socialist women had argued for decades, neither one celebratory day a year nor the right to vote brought equal rights. March 8 was declared a national holiday in many communist countries, but women continued to occupy secondary social, economic and political roles. Even after feminists in the US began in the 1960s to use the day to protest their continued subordinate status and the United Nations declared International Women’s Day in 1975, equality was still far away.

 

The socialist origins of a day devoted to women’s rights exemplify the long-lasting political controversy over gender equality. The idea of equal rights was heretical for conservatives: a German poster calling for the vote for women on March 8, 1914, was banned by the Emperor’s government. Issues of equal rights continue to be marked by partisan political division in the US. The Lilly Ledbetter Fair Pay Act was passed in 2009, supported by Democrats in the House 247-5 and in the Senate 56-0, and opposed by Republicans 172-3 in the House and 36-5 in the Senate. Democrats support the #MeToo movement and Republicans mock it. The Republican Party itself, as represented in Congress, is overwhelmingly male: 93% in the House and 85% in the Senate. Democrats are represented by a more equal, though still unequal, gender division: about 62-38 male in both chambers.

 

The same differences exist in Germany, but with more women overall. From left to right, the percentages of women delegates in the Bundestag, the federal legislature, are: Left 54%, Greens 58%, Social Democrats 43%, Free Democrats 24%, Christian Democrats 21%, and right-wing Alternative for Germany 11%.

 

A major point of discussion in German politics is the introduction of a gender quota system to ensure equal representation in legislative assemblies. In November the Left Party proposed a law that would raise the proportion of women in the Bundestag, but it was voted down by a majority led by the Christian Democrats and Free Democrats. The far-right Alternative for Germany was most vehemently against any effort to raise the proportion of women.

 

In the state of Brandenburg, ruled by a leftist coalition of Social Democrats, Greens, and the Left Party, the Parity Law, the first German law requiring all parties to put forward equal numbers of men and women on their lists of candidates, was passed this January and takes effect in 2020.

 

The Social Democrats in Berlin proposed at the end of 2018 that March 8 should be a new holiday, and this was passed in January with the support of the Left and Greens. A coalition of activists used the March 8 holiday as a Kampftag, or day of struggle, which included a demonstration of about 10,000 people. Their demands included that abortion be fully legalized, pay be equalized, and more action be taken against sexism in daily life, especially violence against women.

 

International Women’s Day serves to highlight the remaining gender inequality in our society. The #MeToo movement exemplifies the much more vigorous public discussion of how to keep moving toward equality and the need for significant behavioral changes for both men and women to make that possible.

 

The goal is to make International Women’s Day superfluous.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/blog/154193 https://historynewsnetwork.org/blog/154193 0
Roundup Top 10!  

 

The government effort to make FOIA “as bad as possible”

by Nate Jones

The Department of Justice's historical effort to weaken the Freedom of Information Act and why Congress must strengthen the law.

 

Why the College-Admissions Scandal Is So Absurd

by Alia Wong

For the parents charged in a new FBI investigation, crime was a cheaper and simpler way to get their kids into elite schools than the typical advantages wealthy applicants receive.

 

 

The real history of women wouldn’t look quite so nice on a tote bag

by Laurie Penny

“Empowerment” has always been more palatable and easier to sell than the idea of women simply taking power, and it’s more cheerful than the reality that plenty of women’s history has been defined as much by frustration and pain as by perky self-actualization.

 

 

How presidential empathy can improve politics

by Jeremi Suri

The legacy of FDR and his fireside chats.

 

 

The Black Gun Owner Next Door

by Tiya Miles

I’m an African-American historian and, on most issues, decidedly liberal. Could I rethink my anti-gun stance?

 

 

America’s Long History of Hysteria about Women’s Veils: Jeanine Pirro and Ilhan Omar

by Juan Cole

In fact, nothing is more American historically than veiling and debates on veiling.

 

 

The Case for Reparations

by David Brooks

A slow convert to the cause.

 

 

What's Behind the Lamentations Over History?

by James Grossman

Max Boot’s questions imply change over time. But we can’t know why something has declined if we don’t know what conditions have changed.

 

 

Getting the Right History vs. Getting the History Right

by L.D. Burnett

An excellent summary of recent popular critiques of historians--and a rebuttal.

 

 

Woodrow Wilson and ‘the Ugliest of Treacheries’

by Erez Manela

After World War I, America was supposed to lead the fight against colonialism. What happened?

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171505 https://historynewsnetwork.org/article/171505 0
Mennonite Values

Menno Simons

I claim to be "genetically Mennonite." Of course, since Mennonites are a religious group, not a racial/ethnic group, the claim is oxymoronic. Nevertheless, I mean it seriously as well as tongue-in-cheek. It does turn out, I think, that all Loewens in the world, at least all I have ever met, are of Mennonite origin. Add anything -- "Loewenberg," "Loewenstein" -- and it's Jewish. Subtract -- "Loewe," "Lowe" -- and it's likely Jewish but not always. But "Loewen," ironically meaning "lions," is usually Mennonite. 

Mennonites are followers of the Protestant minister Menno Simons, who lived in Holland 1496-1561. Mennonites were the first group in the Western World to come out against slavery and against war. Particularly that last stand -- against military service -- has caused them centuries of hardship and grief. 

"Old Order Mennonites" are also called Amish, and they famously forbear modern technology. Most "regular" Mennonites look like everyone else. "My" Mennonites, in Mountain Lake, MN, were good farmers, among the first to electrify. Besides, my dad stopped being a Mennonite and a believer when he was about 24. I was born when he was 39. So I was definitely "regular." Indeed, I grew up Presbyterian, since that church was closest to my house, and since Mom was a Christian. 

Nevertheless, my sister and I recently talked with each other about these matters, and we agreed that some Mennonite values seeped into our upbringing. We both seem to favor the underdog, for example. We both have worked for social justice. We are not impressed by mansions or BMWs. Today I am happy to choose my Mennonite heritage, if not religiously, well, then, as a statement of my values.

In particular, on the last page of the coffee-table book, In the Fullness of Time: 150 Years of Mennonite Sojourn in Russia, by Walter Quiring and Helen Bartel (3rd edition, 1974), are nine lines. Perhaps they are by Menno Simons; I have asked Mennonite scholars but they do not know. They sum up Mennonite values for me. I am particularly taken with the two words "we hope." What a modest claim! We hope that the good and the mild will have the power. Surely they ought to! 

Whose is the Earth?

Whose is the Earth? The toiler's. Who rules the earth? The wise one. Who has the might? Only the good, we hope, and mild. Vengeance and fury devour themselves. The peaceful abide and save. Only the wisest shall be our guide. The chain does men no honour and even less the sword.  

At the end of my life, I publish these lines thinking that they may come to be meaningful to you. You can claim them just as well as I can! You don't have to be genetically Mennonite to do so! Remember, genetically Mennonite is a contradiction anyway. You don't even have to attend a Mennonite church. (I go Unitarian. But that's another story.) "The chain does men no honor and even less the sword." 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/blog/154192 https://historynewsnetwork.org/blog/154192 0
What Historians Are Saying About the College Admissions Scandal

 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171480 https://historynewsnetwork.org/article/171480 0
The Most Alarming Argument in Jill Lepore's These Truths

 

“Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” We find these words near the beginning of “The Machine (1946-2016),” the last part (some 270 pages) of Jill Lepore’s lengthy and highly-praised These Truths: A History of the United States. The rest of this section provides little hope that the outpacing she writes of is narrowing. This failure of ours is what is most alarming about these years.

Lepore’s survey of our post-WWII years addresses computing developments, polling, and political polarization. UNIVAC, the Universal Automatic Computer, was first revealed to the public in 1951. Along with subsequent computing, it helped turn “people into consumers whose habits could be tracked and whose spending could be calculated, and even predicted.” It also wreaked political “havoc, splitting the electorate into so many atoms,” and it contributed to newer forms of alienated labor.

Lepore thinks that conservatives took over the Republican Party in the late 1970s and early 1980s and gained a “technological advantage” over Democrats that “would last for a long time.” In this same period, corporations increasingly used computers to conduct their own polls, the accuracy of which Lepore often questions. By the 1990s, conservatives were increasingly using “targeted political messaging through emerging technologies” and were contributing to “a more atomized and enraged electorate.”

Although collapsing communist regimes and the end of the Cold War, culminating in the disintegration of the USSR in 1991, boosted Americans’ confidence in the future, Lepore believes “they were unable to imagine the revolution in information technology that would resist regulation and undermine efforts to establish a new political order.”

Despite the early Republican advantage in information technology, its impact on Democrats was also great. In the 1990s, Silicon Valley entrepreneurs and other professionals came to dominate the party, which deemphasized concerns of blue-collar workers, as it “stumbled like a drunken man, delirious with technological utopianism.” In February 1996, in what “would prove a lasting and terrible legacy of his presidency,” Bill Clinton signed the Telecommunications Act.  By deregulating the communications industry, it greatly reduced antimonopoly stipulations, permitted media companies to consolidate, and prohibited “regulation of the Internet with catastrophic consequences.”

Despite claims that the Internet helped democratize political life, Lepore thinks that social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.” She writes of how the alt-right used web sites like Breitbart to spread its influence and how the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world.

Beginning in the 1990s the concept of innovation “gradually emerged as an all-purpose replacement” for progress. The newer goal was more concerned with profit than any moral improvement, and it was often perceived as “disruptive innovation.” One of its proponents was Mark Zuckerberg, who in 2004 founded Facebook. Lepore quotes him as saying, “Unless you are breaking stuff, you aren’t moving fast enough.”

Newspapers were one of the casualties of this disruption. Compared to them, Internet information was “uneven, unreliable,” and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So too did Internet-related companies that fed people only what they wanted to see and hear. Further, social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

During the twenty-first century political polarization accelerated as the Internet enabled people “to live in their own realities.” Lepore quotes conservative talk-radio host Rush Limbaugh as saying in 2009 that “science has been corrupted” and “the media has been corrupted for a long time. Academia has been corrupted. None of what they do is real. It’s all lies!” Instead the “conservative establishment” warned audiences away from any media outlets except those that reinforced right-wing views. Such polarization also affected people’s ability to deal with our most pressing global problem—climate change—because, as Limbaugh believed, the science of the “alarmists” could not be trusted.

Although one can argue that Lepore pays insufficient attention to all the plusses of technological change, her main point that our moral advances have failed to keep pace with technological developments is irrefutable. One can further argue that many of our main problems today, such as climate change, nuclear buildups, cybersecurity, growing economic inequality, and the Trump presidency, are related to our inability to relate wisely to our technological changes. The popularity of our tweeting president, for example, was greatly boosted by his starring role in the twenty-first-century reality TV show The Apprentice.

More than four decades ago economist and environmentalist E. F. Schumacher bemoaned that “whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.” A decade ago I concluded that “it was indeed evident how difficult it was for people’s prudence, wisdom, and morality to keep pace with technological change.” More recently, I updated this perspective by citing the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.”

Undoubtedly, how to ensure the use of digital and other technology to improve the common good is a tough problem. One place to look is to futurists. Psychologist Tom Lombardo is one of the wisest ones. He recognizes that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences,” but in his 800-page Future Consciousness: The Path to Purposeful Evolution he details a path by which we can evolve toward a more noble way of managing technology: by developing “a core set of character virtues, most notably and centrally wisdom.”

Another source of wisdom regarding technology is from religious and philosophical thinkers. In the 1970s Schumacher in his chapter on “Buddhist Economics” in Small Is Beautiful sketched out a way wholly different than in the West for looking at technology and economics. More recently, in an encyclical on climate change—which the nonbeliever neurologist Sacks referred to as “remarkable”—Pope Francis devoted many pages to technology and acknowledged that at present it “tends to absorb everything into its ironclad logic.” But in opposition to our present “technocratic paradigm” he called for a “bold cultural revolution” based on noble values and goals. 

Finally, on a history site such as HNN it is appropriate to ask, “Does history give us any hope that such a ‘bold cultural revolution’ can occur?” Can our approach to technology change from that of the dominant Western one of the last few centuries? Despite indicating our many post-WWII failures to cope wisely with technological change, Lepore does provide examples of movements and individuals that changed our history’s trajectory. 

She writes of the Second Great Awakening, a religious revival movement that swept over the USA in the 1820s and 1830s. It increased church membership from one out of ten Americans to eight out of ten. She recalls the long struggle to end U.S. slavery, from Benjamin Franklin, whose “last public act was to urge abolition,” to Frederick Douglass, whose writings helped inspire Lincoln and continue to inspire Lepore. She notes that after the money-grubbing Gilded Age, the Progressive Era emerged, and that “much that was vital” in it grew out of the Social Gospel movement, which “argued that fighting inequality produced by industrialism was an obligation of Christians.”

She recounts the many battles for civil rights from the Civil Rights Act of 1866, through Martin Luther King’s efforts and the Civil Rights Act of 1964, to the contested battles for the rights of blacks, women, immigrants, and LGBTs during the Trump presidency. She also details Franklin Roosevelt’s New Deal, whose scope was “remarkable” in combatting the Great Depression, when “nearly five in ten white families and nine in ten black families endured poverty,” and during which President Herbert Hoover argued against government relief, believing it would plunge the nation “into socialism and collectivism.” 

One significant historical change that Lepore pays scant attention to is the end of the Cold War, which she notes simply: “by 1992, more than four decades after it began, the Cold War, unimaginably, was over.” That “unimaginable” ending, however, was due to individuals (like Soviet leader Mikhail Gorbachev and Ronald Reagan) who acted in unexpected ways to carry out steps that other individuals (like activist Andrei Sakharov and other protesters) and movements had long been demanding. In other parts of the world, leaders like Gandhi and Nelson Mandela also produced results, like nonviolent resistance and the end of apartheid, that changed history’s trajectory.

As discouraging as post-WWII efforts to manage technology wisely have been, there may be, paradoxically, glimmers of hope emerging from our present dire climate-change situation. In a recent New York Times op-ed, “Time to Panic,” we read that “we’re at a point where alarmism and catastrophic thinking are valuable, for several reasons.” One is that “politics, suddenly, is on fire with climate change.” Just as the catastrophe  of the Great Depression led to the imaginative New Deal, so too the present climate-change crisis might soon alarm us enough to spark new actions and ways of interacting with our planet—and with technology in general.    

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171379 https://historynewsnetwork.org/article/171379 0
What I’m Reading: An Interview With Civil War Historian Anne Sarah Rubin

 

Anne Sarah Rubin is a Professor of History at the University of Maryland, Baltimore County, where she teaches courses on the Civil War, American South, and the Nineteenth Century United States. She is also the Associate Director of the Imaging Research Center. Find her at her website.

 

What books are you reading now? 

I'm working on a project about starvation in the Civil War and Reconstruction South, so I just finished Amy Murrell Taylor's book Embattled Freedom: Journeys through the Civil War's Slave Refugee Camps. I'm also working on a digital project about African Americans in early republic Baltimore, so I am digging into Martha Jones' Birthright Citizens: A History of Race and Rights in Antebellum America; I can see using this in my Civil War course next fall. Finally, I always read something for fun before bed, often true crime or fiction. I'm in the middle of The Feather Thief: Beauty, Obsession, and the Natural History Heist of the Century by Kirk Wallace Johnson, and next up is Washington Black by Esi Edugyan.

 

What is your favorite history book?

It's hard to choose just one! Edmund Morgan's American Slavery, American Freedom was mind-blowing when I first read it as an undergrad, thought-provoking as a graduate student, and a pleasure to teach with. I think Seth Rockman's Scraping By is so impressive in illuminating the lives of people who we thought we couldn't find. Finally, I think Thavolia Glymph's Out of the House of Bondage is such a powerful work, and one that I use every year in my Civil War class.

 

Why did you choose history as your career?

I always loved history and thinking about the past – my parents took us to all different historical sites like Williamsburg and Old Sturbridge Village when my brother and I were kids. Then I took AP US History as a high school junior, taught by Eric Rothschild and Ted Morse (more on them below) and I realized that you could make a career out of figuring out the past – why did people do things? What was it like? So I am the rare person who chose a career at 16 and stuck to the plan!

 

What qualities do you need to be a historian?

Curiosity is number one—you need to want to know what the past was like and to find all sorts of aspects of the past interesting. Beyond that you need to love to read, and be willing to write, and rewrite, even if you don't love the process. I think that historians also need to be tenacious. It takes a long time to do good research, which is often tedious. And good writing also takes a lot of time. You just need to keep plugging away at a project.

 

Who was your favorite history teacher?

Eric Rothschild at Scarsdale High School in Scarsdale, NY. He taught my AP US history class (along with Ted Morse) and showed us how much fun history could be. He used lots of primary sources, sang to us, showed political cartoons and popular art. He also took our class to the AHA and OAH annual meetings, so we could see the wide variety of work that historians did. Eric loved teaching, and his joy was infectious.

 

What is your most memorable or rewarding teaching experience?

In 2015 I taught a class called Replaying the Past, about using video games to teach history. My students (a mix of advanced undergrads and MA students) were the clients for students in UMBC's video game design major, and together we built an educational game about the Pratt Street Riots in Baltimore in April, 1861. The premise is that you are a fox running around the city collecting documents about the riot. Then you read the documents to put them in chronological order. Besides working with game designers, my students also built their own board games and interactive digital fiction. It was interesting to see them think through using the same corpus of research in different ways and for different audiences, and I learned a lot about gaming and digital history myself.

 

What are your hopes for history as a discipline?

Honestly, I just hope that it survives as a distinct scholarly practice. Historians learn to think logically and systematically, to analyze arguments, and to organize reams of evidence into their own arguments. It’s a habit of mind. I'm all for working across the disciplines—I am the associate director of a digital media lab at UMBC and work every day with programmers, artists, geographers, etc.—but the value that I bring to the table is my historical thinking. I worry sometimes that there are too many majors being offered at universities, and that History is falling out of fashion.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

I have some 19th-century editions of books like Albion Tourgée's A Fool's Errand, but I wouldn't call myself a collector. I collect kitschy Civil War items: Playmobil Civil War soldiers and funny magnets, for example. My favorite one is an antique handmade cookie-cutter shaped like Abraham Lincoln. Sometimes I make Lincoln cookies for my students.

 

What have you found most rewarding and most frustrating about your career? 

Teaching is rewarding, especially at a place like UMBC where I can have a student in multiple classes over the years and watch her or him grow intellectually. Working on books is both incredibly frustrating, because it’s time-consuming and difficult, but also tremendously rewarding. To hold your own book is to have achieved something permanent in an ephemeral world.

 

How has the study of history changed in the course of your career?

The profession, and therefore the work that people produce, has become much more diverse. As a Civil War historian, it’s been a pleasure to see more and more women and people of color come into the field and make it their own.

 

What is your favorite history-related saying? Have you come up with your own?

I don't have a favorite one for history. But I often tell students and colleagues that the best paper/dissertation/book is a finished one! I also always tell students that the people in the past were not better or worse than we are today—sort of an antidote to the idea of a "greatest generation." 

 

What are you doing next?

I'm working on a project about starvation in the Civil War South, from the start of the war through the famine of 1867. I'm trying to use culinary history to get at what people were really eating, from the perspectives of elite whites, poor whites, and African Americans, particularly those who ran away during the war. I also want to explore the different groups and agencies providing relief to blacks and whites after the war. Right now, I am still in the research phase, and it hasn't yet come together for me. But it will. Because the best book is a finished book.

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171452 https://historynewsnetwork.org/article/171452 0
Hitler's Own Maginot Line

 

"Monsieur Maginot built a fortified line," noted the German justice inspector Friedrich Kellner in his diary in June 1940, just after Hitler's army burst through the French fortifications. If France really expected to keep its neighbor from storming its territory, he added sardonically, it should have covered every foot of its border with "dynamite, deep pits of hydrochloric acid, traps of the largest extent, trip-wire guns, electrically charged fences -- and with thousands of heavy caliber cannons playing a major role." Water wells in the fields would have to be poisoned, and the more persistent intruders greeted by "machine gunners in swiveling steel towers." The humiliating failure of the vaunted Maginot Line thrilled the vast majority of Germans, who adored their Führer and supported his ruthless agenda. "The foolish people are intoxicated by the victories," wrote Kellner. He had opposed the Nazis from the beginning. A political organizer for the Social Democrats, he had campaigned against them during the ill-fated Weimar Republic. When Hitler came to power, Kellner moved his family to the small town of Laubach where he found a position as courthouse manager. On September 1, 1939, when the Nazi war machine introduced the world to blitzkrieg, Kellner began to record Nazi crimes and the German people's complicity in Hitler's program of terror. He occasionally forgot himself and gave voice to his feelings and was written up as a "bad influence" by the local Nazi leader. The SS placed him and his wife, Pauline, under surveillance, and only Kellner's position in the courthouse kept them from arbitrary arrest by the Gestapo.  If Adolf Hitler's megalomania had been satisfied after he avenged Germany's WWI defeat by overcoming the Maginot Line, he might not have had to create one of his own. But two years later Hitler was anxiously constructing a line of defense to secure his ill-gotten gains. He promised that his "Atlantic Wall" -- thousands of fortifications of concrete and steel stretching over 1,500 miles -- would keep out the hordes. Should anyone miraculously make it through the Wall, declared a boastful Hitler to laughter and thunderous applause from his audience, "It will only be a matter of luck if he remains nine hours."  Joseph Goebbels, the Minister of Propaganda, contributed with special presentations in the papers and over the radio, calling the Wall "the most enormous protective cordon of all times." In weekly newsreels, a narrator described the panoramas of the fortifications: "We admire the many and confounding systems of the trenches, crossbars and barricades, of the bunkers and machine gun emplacements, the threatening, elongated muzzles of the super-heavy cannons." To those who might dare to test the Wall, the narrator added, "We understand the nervous mood on the British island, where they hope in vain to grate the German people's nerves by shouting about invasion." "The Atlantic Wall is presently a favorite topic in the press and on the radio for rousing the people," wrote Kellner. "But it ranks along with the Maginot Line that made France believe it was safe." He was unimpressed when Hitler assigned Germany's most popular military leader to command the project. "General Field Marshal Rommel, the Great Retreater, who at one time had 'the gates of Egypt' in his hand, is being brought out of mothballs," said Kellner. "He is the darling of the propaganda machine, and his blemished fame will fill the forgetful ones with hope."

 

A photo of Friedrich Kellner's diary, courtesy of the author.

Kellner was certain neither Rommel nor his battalions nor the fabled Atlantic Wall would repel what was coming. The invaders might not even bother with the Wall, he suggested. There were other borders they could test. If they entered Germany from elsewhere, the Wall would prove a farce. "Besides," Kellner wrote, "they can fly over it." In any case, Kellner was tired of the boasting and bluster in Germany, and the procrastination of the Allies, and he wrote, "One would like to proclaim, 'Let us finally see the action.' Then the proof will be furnished whether the legendary Atlantic Wall is to be breached or not."

That day of action arrived three weeks later, on June 6, 1944. "Finally!" Kellner wrote in capital letters at the top of that diary entry. "With the utmost inner excitement we heard the announcement today landings were made on the northern French coast." The Wall fell, and rightly so in the diarist's mind. "The human community does not end at border crossings artificially placed within a country," said the Social Democrat Friedrich Kellner, "but has to embrace everything that carries a human face."

 

For more on Friedrich Kellner, read My Opposition: 

Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171378 https://historynewsnetwork.org/article/171378 0
Lessons from Studying Corporate History and the Case of IBM

 

There are usually three audiences for a long-lived corporation’s history: industry analysts trying to figure out if the firm will die off soon (and if clients should sell off stock); senior business management studying how to keep their companies successful for years, if not decades; and business and economic historians. The first two constituencies often turn quietly to historians for these lessons. For example, Harvard’s business historian Alfred D. Chandler, Jr. was a welcome guest in classrooms and boardrooms, and Louis Galambos and Geoffrey Jones are followed by many businesses. All three historians spent decades studying the role of large multinational corporations.

 

IBM fits into that galaxy of firms; the body of literature about it generated by historians, economists, journalists, employees, and industry experts over the last half century is extensive. For nearly a century, other multinational companies looked to see what IBM was doing and drew lessons from its experience. It is no accident, therefore, that when I wrote IBM: The Rise and Fall and Reinvention of a Global Icon I emphasized IBM as an icon. My study of the company’s 130-year history is the first such comprehensive book to appear in over 15 years and the first to be written by an historian since Robert Sobel did so in 1981.

 

Looking at IBM through the lens of a large information ecosystem highlighted two findings. First, from the end of the 1920s until the late 1980s, anyone who wanted to understand how to use data processing—IT in today’s language—turned to IBM and the many thousands of experts, user communities, and industry associations it populated and often dominated with its own perspectives. That practice put IBM in the middle of much of what happened with computing in all societies for most of the twentieth century and facilitated its success and growth. Second, that ecosystem’s existence led to the realization that IBM was less the monolithic institution that both it and its historians had long touted and accepted.

 

I argue that IBM was a collection of different factions (think product lines and divisions) that collaborated, competed with each other, and shared a common culture.  It proved useful to think of IBM much the way one would think about a midsized American city of several hundred thousand people.  Its mayor (the CEO at IBM) had to deal with factions and neighborhoods scattered across scores of countries, persuading them to do his bidding, while employees worked to succeed within the economies of so many countries.  That perspective is applicable to such other enterprises as GM, Ford, Philips, Shell, and Exxon/Mobil.  All have long histories, multiple product lines, and subgroups competing for resources. 

 

 

If IBM is a city that shares a common language (in this case English) and set of clearly articulated and accepted values (corporate culture) in the middle of an information ecosystem, a new perspective of IBM and other firms emerges.  The story becomes more complicated, of course, as we move from simple discussions of organization and strategy to the more sociological views advocated by business historians over the past three decades.  But it also allows us to explore old themes in new ways, such as Chandler’s fixation on unified strategies now explained as evolving dynamic strategic intents, or Philip Scranton’s advocacy of looking at innovation and structures in smaller units.  Pro- and anti-Chandlerian views that dominated much of business historiography over the past 15 years become less relevant.

 

The history of IBM shows that the emerging notion of information ecosystems and their underlying information infrastructures leads to useful methods for studying such large organizations at both macro and micro levels, from the grand strategy of “The Corporation” to how a sales office or factory worked, linking together the experiences of both types of history and reflecting the realities memoirists described.  In IBM’s case, we have the benefit of several dozen memoirs, a well-stocked corporate archive, and a vast literature on the daily affairs of the company to support research.  When viewed through the lens of ecosystems and infrastructures, we see a new IBM.

         

Briefly put, IBM was in constant churn for 130 years, always challenged by technological complexities, effective and dangerous competitors, antitrust litigation that followed its waves of success three times, rivalries among executives promoting their personal careers, customers dependent on IBM, others that wanted new products or solutions to problems, and always the angst of possibly going out of business, a fear felt in the 1910s, the early 1920s, during the Great Depression of the 1930s, and in the early 1990s, and one that an analyst periodically hints at today.  But it did not go out of business.  No longer can one view IBM just as a slick, monolithic, well-run operation.  It had feet of clay, like other large organizations.

         

Why did IBM succeed?  First, IBM developed a corporate culture aligned with the business it was in—the processing of data—and tuned it to meet the realities of every era.  Second, because of the technological churn that was always its reality, it was constantly willing to retire old products and bring in new ones, to replace old employee skills with new ones, and to alter long-standing operational practices.  Each change represented a painful transformation that put business prospects at risk and cost some employees their careers while opening up opportunities for others, all against the background of growing customer dependence on data processing, from punch card tabulating equipment through computers, then PCs, then the Internet, and now the ubiquitous computing appearing in all corners of modern life.  

 

For many decades what happened at IBM mirrored events in scores of similar enterprises, and with so many now appearing around the world, its experiences have much to teach each audience. Nowhere does this seem more urgently true than for such firms as Amazon, Google, Facebook, Microsoft, and Apple. It is no accident that the last two have formed various business relationships with IBM, Microsoft since the early 1980s and Apple in the last decade.

         

https://historynewsnetwork.org/article/171385
Saving Lives in the Crimean War: “They Are My Sons”

 

Mary Seacole was a Creole, born in Jamaica to a Jamaican woman who had married a Scottish soldier. Her mother, a nurse, ran a boarding house for invalid soldiers there. Mary lived in Jamaica until she was 39 and then, widowed, traveled to Panama, where she opened up a hotel. She developed numerous medical skills over the years and when the Crimean War broke out in 1853, raced to England in an attempt to join Florence Nightingale and her cadre of 39 nurses to care for the wounded British troops in the war.

She was rejected by Nightingale (Mary said it was because she was black) and then, with a friend, she traveled to Crimea, set up a small hotel/hospital behind the battle lines at Balaclava, and went to work as a nurse, caring for the soldiers, who appreciated everything she did for them. She also risked her life treating soldiers on the battlefield. A British journalist covered her and she became famous all over the world.

Her story is told well, and with a considerable amount of emotion, by Jackie Sibblies Drury in the play Marys Seacole, which opened recently at Lincoln Center’s Claire Tow Theater in New York.

You realize before you sit down and stare at the very contemporary set of a hospital waiting room that this is a very unusual play. It is a hard-hitting drama that hammers away at your senses, full of vivid skits and scenes. Characters hurtle back and forth through time between Jamaica of 2019 and Crimea of the late 1850s. It depicts Mary’s life, and her struggle in the war, very well and tells yet another true story of a brave woman in history.

The hospital set is used as a theater in itself in this story of Jamaica in the 1840s and Crimea in the 1850s. Added on to the war story are numerous contemporary events, such as a dress rehearsal for a mass shooting exercise, Mary’s disputes with her dead mother’s ghost and the anguish of a pregnant woman. The last half of the play is set in 1850s Crimea, covering Mary’s heroism there and more visits from her mom. If you pay careful attention to the rather chaotic plot, it all makes surreal sense, but if you don’t pay attention you can get a bit lost.

Director Lileana Blain-Cruz does a fine job of telling the 1853 tale and the contemporary story at the same time. She also is skilled at keeping the pace of the play moving along at a brisk speed even though she works hard, with her actresses in this all-female play, to build deep, rich characters slowly.

She gets good work from Quincy Tyler Bernstine as a wonderful Mary, who is a tornado of emotions at times. The director also gets superlative work from Karen Kandel as Duppy Mary, her mom’s ghost. Others in the talented ensemble cast include Gabby Beans as Mamie, Lucy Taylor as May, Marceline Hugot as Merry and Ismenia Mendes as Miriam.

There is a point, about two thirds through the play, when the ensemble goes through a torrent of activity, people running madly across the stage, arguing and defending themselves. There is, well, a lot of high-pitched screaming. Fans of Super Bowl winners have not screamed this loud. It is a magical moment.

Lincoln Center has printed a short history of Mary’s life (the Marys in the title refers to the different Marys in the story) that is slid into the program so that you know her history and that of the world in the 1850s. It is helpful. 

A problem with the play is that you learn absolutely nothing about the Crimean War. You don’t even learn who won the conflict (Alfred Lord Tennyson’s immortal poem “The Charge of the Light Brigade” was written about that war).

The war, which lasted until 1856, was fought over the rights of Christian visitors to the Holy Land, at the time part of the Ottoman Empire. It pitted Russia against the Ottomans, France, the British and Sardinia. Russia lost. A play about a nurse in the Crimean War must have more information on the war. It should also tell more about Mary’s life as a nurse there, especially her visits to the battlefield while gunfire was exchanged. People do not know much about that war, and more information on stage would be valuable. 

The play also starts slowly, mopes for a while and has too many people with clipped Jamaican accents. These are just minor complaints, though. After about half an hour the magic of the playwright and director takes hold and Marys Seacole soars.

PRODUCTION: The play is produced by Lincoln Center. Scenic Design: Mariana  Sanchez, Costumes: Kaye Voyce, Lighting: Ilyoun Chang, Sound: Palmer Hefferan. The play is directed by Lileana  Blain-Cruz. It runs through April 17.

https://historynewsnetwork.org/article/171479
Robinson Crusoe’s Wall

 

The question of building a wall on the southern border of the United States has been elevated to a national emergency, despite the objections of national security experts who say that a wall, however long or high, will not secure the border. Opponents of the wall also say that its cost, not only in funds previously allocated for other projects but also in the erosion of our democratic system of government, is not worth the benefit that may be obtained. Yet there is a persistent resolve at the highest levels of government and among a significant number of voters that we must have a wall, no matter what. Perhaps we can look to history, especially literary and cultural history, to explain this passionate desire to build a wall.

Robinson Crusoe, a novel written by Daniel Defoe in 1719, has much to tell us about walls. In the first century after its publication, Robinson Crusoe became a popular book for children because of the salutary model it was thought to provide for young Englishmen and women. One feature that remained in every edition, no matter how much abridged, was Crusoe’s wall. Thousands of young readers from Britain, America, and many other nations learned from Robinson Crusoe that building a wall was the first step toward creating a civilized domestic space and eventually extending it into an empire.

In the central episode of the novel, the mariner Robinson Crusoe is washed ashore on a deserted island in the Caribbean Sea. To protect himself against wild animals, he builds a wall of posts set in the earth, backed with layers of turf two feet thick, across the mouth of a cave. The wall, a half-circle twenty-four yards in length, takes him fourteen weeks of strenuous labor to build, “but I thought I should never be perfectly secure ’till this Wall was finish’d.”

The completion of the wall enables Crusoe to improve his circumstances, both material and spiritual. He builds a domestic space; he plants crops and raises goats; he reads his Bible and gives thanks for his salvation. But all this security is swept away one day when he discovers a single unfamiliar footprint in the sand. He retreats in terror behind his wall, where he remains for several months. At last he emerges and builds a second wall, “thickened with Pieces of Timber, old Cables, and every Thing I could think of, to make it strong.” This new wall, fortified with seven gunports, can only be climbed with a series of ladders which Crusoe can take down when he is inside his cave. Upon finishing this second wall, he rejoices again in his new sense of security: “I was many a weary Month a finishing [it], and yet never thought my self safe till it was done.”

The object of Crusoe’s terror is the indigenous people of the northeastern coast of South America, then known as Caribs, who visit the island occasionally to celebrate their victories and sacrifice their prisoners. In Defoe’s day it was commonly believed that the Caribs were cannibals, though that point is contested now. Crusoe’s providential rescue of one of their victims, whom he names Friday, allows him to initiate a military-style campaign against the Caribs. Driven by fear and anger, Crusoe attacks the visitors deliberately, not reflecting at first that in doing so “I should be at length no less a Murtherer than they were in being Man-eaters, and perhaps much more so.” Crusoe and Friday kill or wound twenty-one of the “Savages,” whom he implicitly blames for their own deaths.

Crusoe’s last exploit on the island involves assisting in the re-capture of a ship whose crew has mutinied. Instead of hanging the mutineers, Crusoe proposes that they should be marooned on the island, where they may repent and reform themselves as he has done. In effect, he converts his island paradise into a penal colony, leaving the mutineers his cave, his tools, and his wall upon his return to England. On a subsequent visit to the island, years later, he finds that the mutineers have quarreled, fought amongst themselves, and lost their island to the Spaniards. The ending suggests what happens when the lesson of the wall and the purpose it serves are not understood. 

Defoe’s Robinson Crusoe suggests that there is nothing better than a wall to provide a sense of security, so long as the wall is a symbol of civic order and domesticity. If those values are lacking, the wall is no better than a penitentiary, locking up disorderly elements within. Before extending the wall on the southern border, we need to decide whether its purpose is to strengthen a civil society, or to perpetuate division and discord on both sides of the wall. If we merely lock ourselves up in our own fear and anger, we are likely to repeat the conclusion of Robinson Crusoe.

 

 

https://historynewsnetwork.org/article/171453
How the Allies Won on D-Day

 

The following is an excerpt from Soldier, Sailor, Frogman, Spy, Airman, Gangster, Kill or Die: How the Allies Won D-Day

 

George Lane [a Hungarian living under a pseudonym who signed up for the elite British-led X-Troop that consisted of foreign nationals whose countries had been overrun by the Nazis] viewed his life in much the same way as a professional gambler might view a game of poker: something to be played with a steady nerve, a dash of courage and a willingness to win or lose everything in the process. 

 

His addiction to risk had driven him to join the commandos; it had also led him to volunteer for a perilous undercover mission codenamed Operation Tarbrush X. In the second week of May 1944, Lane was to smuggle himself into Nazi-occupied France using the cover of darkness to paddle ashore in a black rubber dinghy. His task was to investigate a new type of mine that the Germans were believed to be installing on the Normandy beaches.

 

Operation Tarbrush X was scheduled for 17 May, when a new moon promised near-total darkness. Lane selected a sapper named Roy Wooldridge to help him photograph the mines, while two officers, Sergeant Bluff and Corporal King, would remain at the shoreline with the dinghy. All four were fearless and highly trained. All four were confident of success. 

 

The mission got off to a flying start. The men were ferried across the Channel in the motor torpedo boat and then transferred to the black rubber dinghy. They paddled themselves ashore and landed undetected at exactly 1.40 a.m. The elements were on their side. The rain was lashing down in liquid sheets and a stiff onshore squall was flinging freezing spray across the beach. For the German sentries patrolling the coast, visibility was little better than zero.

 

The four commandos now separated, as planned. Bluff and King remained with the dinghy, while Lane and Wooldridge crawled up the wet sand. They found the newly installed mines just a few hundred yards along the beach and Lane pulled out his infrared camera. But as he snapped his first photograph, the camera emitted a sharp flash. The reaction was immediate. ‘A challenging shout in German rang out and within about ten seconds it was followed by a scream which sounded as if somebody had been knifed.’ Soon after, three gunshots ricocheted across the beach.

It was the signal for a firework display unlike any other. The Germans triggered starshells and Very lights (two different types of flare) to illuminate the entire stretch of beach and then began firing wildly into the driving rain, unable to determine where the intruders were hiding. 

 

Lane and Wooldridge scraped themselves deeper into the sand as they tried to avoid the bullets, but they remained desperately exposed and found themselves caught in a ferocious gun battle. Two enemy patrols had opened fire and it soon became apparent that they were shooting at each other. ‘We might have laughed,’ noted Lane after the incident, ‘if we had felt a bit safer.’

 

It was almost 3 a.m. by the time the gunfight ended and the German flashlights were finally snapped off. Sergeant Bluff and Corporal King were convinced that Lane and Wooldridge were dead, but they left the dinghy for their erstwhile comrades and prepared themselves for a long and exhausting swim back to the motor torpedo launch. They eventually clambered aboard, bedraggled and freezing, and were taken back to England. They would get their cooked breakfast after all. 

George Lane and Roy Wooldridge faced a rather less appetizing breakfast. They flashed signals out to sea, hoping to attract the motor torpedo boat and then flashed a continuous red light in the hope of attracting attention. But there was never any response. As they belly-crawled along the shoreline, wondering what to do, they stumbled across the little dinghy. Lane checked his watch. It was an hour before dawn, precious little time to get away, and the Atlantic gale was whipping the sea into a frenzy of crests and troughs. It was not the best weather to be crossing the English Channel in a dinghy the size of a bathtub. 

 

‘Shivering in our wet clothes, we tried to keep our spirits up by talking about the possibility of a Catalina flying boat being sent out to find us and take us home.’ Wooldridge glanced at his watch and wryly remarked that it was the date on which he was meant to have been going off on his honeymoon. Lane laughed at the absurdity of it all. ‘There he was, poor bugger, with me in a dinghy.’

 

Any hopes of being rescued by a flying boat were dealt a heavy blow in the hour before dawn. As the coastal town of Cayeux-sur-Mer slowly receded into the distance, Lane suddenly noticed a dot in the sea that was growing larger by the second. It was a German motor launch and it was approaching at high speed. He and Wooldridge immediately ditched their most incriminating equipment, including the camera, but kept their pistols and ammunition. Lane was considering a bold plan of action: ‘shooting our way out, overpowering the crew and pinching their boat’. But as their German pursuers began circling the dinghy, Lane was left in no doubt that the game was up. ‘We found four or five Schmeisser machine guns pointed at us menacingly.’ The two of them threw their pistols into the sea and ‘with a rather theatrical gesture, put up our hands’.

They were immediately arrested and taken back to Cayeux-sur-Mer, zigzagging a careful passage through the tidal waters. Lane swallowed hard. Only now did it dawn on him that he had paddled the dinghy through the middle of a huge minefield without even realizing it was there. ‘It was an incredible bit of luck that we weren’t blown to bits.’ 

 

The two men feared for their lives. They were separated on landing and Lane was manhandled into a windowless cellar, ‘very damp and cold’. His clothes were drenched and his teeth were chattering because of the chill. He was also in need of sustenance, for he had not eaten since leaving England. 

 

It was not long before an officer from the Gestapo paid him a visit. ‘Of course you know we’ll have to shoot you,’ he was told, ‘because you are obviously a saboteur and we have very strict orders to shoot all saboteurs and commandos.’ Lane feigned defiance, telling his interrogators that killing him would be a very bad idea. The officer merely scowled. ‘What were you doing?’

 

Lane and Wooldridge had cut the commando and parachute badges from their battledress while still at sea, aware that such badges would condemn them to a swift execution. They had also agreed on a story to explain their predicament. But such precautions proved in vain. The German interrogator examined Lane’s battledress and told him that he ‘could see where the badges had been’. Lane felt his first frisson of fear. ‘They knew we were commandos.’ 

To read more, check out the book!

 

https://historynewsnetwork.org/article/171458
Three Unexpected Deaths That Shaped Presidential History

 

As I’ve written before, random circumstances often shape history. Unexpected tragedy in the lives of three political leaders—John F. Kennedy, Jr., Paul Wellstone, and Joe Biden—further demonstrate the profound impact of chance on political history. 

John F. Kennedy Jr. was the “golden lad”: the most famous offspring of John F. Kennedy, he was a public celebrity who published “George” magazine. He was rumored to be planning to run for the open US Senate seat in New York vacated by Senator Daniel Patrick Moynihan in 2000.  With the famous Kennedy name behind him, plus his good looks and winning personality, it seemed as if he was likely to announce his campaign and even win. Then, he and his wife were suddenly killed in a small plane crash off Martha’s Vineyard, Massachusetts on the evening of July 16, 1999.  

First Lady Hillary Rodham Clinton won the seat of Senator Moynihan in 2000.  Hillary Clinton sought the Presidency in 2008, and when that candidacy failed, she served as Secretary of State to President Barack Obama. She ran again in 2016, winning the popular vote, but losing the Electoral College to Donald Trump.  Without the Senate seat, Clinton would have had no opportunity to seek the Presidency twice. Some believe JFK, Jr would have defeated Clinton in the Democratic primary, gone on to win the Senate seat, and could have sought the Presidency in 2008, potentially derailing Barack Obama’s campaign.  This is, of course, speculation, but it's also a reasonable possibility. 

Paul Wellstone was a Minnesota Senator representing the Democratic Farmer Labor Party from 1991 until his tragic death in another small plane crash on October 25, 2002. His wife and daughter also died in the crash, just two weeks before the 2002 election. Wellstone was a rising progressive star in the Democratic Party and a likely candidate for the Presidential nomination in 2004.  In the 2004 presidential election, Democratic Presidential nominee John Kerry lost to George W. Bush in large part because Kerry lost the state of Ohio. When one considers that Wellstone was from an agricultural state similar to Ohio, one can speculate that Wellstone might have drawn a lot more interest and support than Kerry or the early “flash in the pan” candidate, former Vermont Governor Howard Dean. If Wellstone had won, he would have been the first Jewish President.

Finally, Joe Biden’s political career was altered by personal tragedy. Biden has had the longest career of anyone who ran for the Presidency.  He ran in 1988, but was forced out by accusations of plagiarism and suffered an aneurysm shortly after. He again ran in 2008, but he could not compete with the stardom of Barack Obama and Hillary Clinton. His 36 years in the Senate and 8 years as an actively engaged Vice President made him a leading contender in the lead-up to the 2016 election until his beloved son, Beau Biden, died of cancer in 2015. This led to his decision not to run for President in 2016; instead, Hillary Clinton and Bernie Sanders battled it out. 

The fact that Clinton lost the states of Pennsylvania, Michigan and Wisconsin by small official margins, and that the white working class of those states went to Trump, made many feel Democrats could have won if Biden had been the nominee. There is no certainty, of course, that Biden would have overcome both Hillary Clinton and Bernie Sanders and won the Democratic nomination in 2016, but many think that his known appeal to the white working class over his career would have helped him win the states Clinton lost. Whether Biden will run in 2020, and whether he could accomplish what might have been in 2016, is of course open to speculation.

While there are no guarantees that John F. Kennedy, Jr., Paul Wellstone, and now Joe Biden would have been elected President, it is certainly interesting to think how the future of America in the early 21st century might have been very different.

https://historynewsnetwork.org/article/171459
A Historian Reflects on a Long Life

 

A lifelong historian, beginning with a spring 1936 term paper on Franklin D. Roosevelt’s rise to the Presidency, Vaughn Davis Bornet frequently contributes to HNN. At 101 years old, he offers advice on living a long life, with a little history along the way, in this personal essay.  

 

Recently I saw the movie South Pacific all the way through.  The Frenchman in exile has decided that the rest of his life (half is gone) has to be with Mitzi Gaynor, the Navy Nurse.  His two children are not enough, and neither is his ample estate.  He will live long if and only if he has  loving companionship! Yes!  

Beth Bornet was with me for 68 Anniversaries, beginning after our wedding in late 1944.  It is the place to start on the subject of  seeking and getting Longevity beyond any doubt.  After all, she planned and prepared every meal of those at Home for over half of my life, satisfied yearnings, selected and vetoed our guests, handled children.

So it is out of the question for me to start explaining/bragging over my oh-so-sensible choices on “how to live.”  After all, my partner in Life is the prime consideration as I ask myself:  “Why did you live so long?”  To you I say, “Choose and preserve Companionship, to guarantee a long and healthy life.”  (Anyway, that is my educated opinion.)

Next, I do think that Being Active is important. Those who lived long whilst drifting slowly and quietly and inactively through all those years do have a point. It can’t be denied.  But I do think I’m right as rain in urging activity, avoiding “just sitting there.”

Yes, I’m a Navy veteran, over 5 years active; 18 busy reserve.  Not sure the difference this participation has made, overall; did get help with medical bills, without doubt.

Move about, talk, play, be doing something all or most of the time.  (That doesn’t mean I am hostile to Rest, Relaxation, and quiet Reflection now and then. But it does seem that friendly companions around most of the time has to be related to wanting to Stay Alive and quite possibly to succeeding!)

Minding the business of your communities is related to longevity.  Join things. Meddle.  Speak up! Be alive and show it.  Make a difference.  Each of us, in his and her way, certainly did.  No detailed proof necessary.

I am going to waste little time on urging No Smoking.  It is now so obvious.  When I changed seats at Rotary in the early ‘60s to move away from second hand smoke I had no idea how important was each and every move.

This body of mine looks about the way it should, but really it is better.  I took real, no kidding, real barbell weightlifting deadly seriously in spring 1941, as we worriers thought war was around the corner for my age group.  I learned and did the snatch, press, clean and jerk, and dead lift.  A few years earlier I had been a high school tennis player of ability (Duke offered a scholarship--refused), and I won five victories in a row as an intramural baseball pitcher; I adored swimming.

With a few exceptions, it was one drink before dinner, and that was it.  In diet: lots of fresh and salt water fish; crab; poached eggs; meat—but no fetish; salads and vegetables “if I must.” But:  in later life, bananas, and lots of grapefruit shipped in from Texas and/or Florida, and even at the store in bottles out of season.

After a real, no kidding heart infarction (attack) in 1977 there was pretty long companionable walking in our Ashland, Oregon hills. There was always a large dog to demand that we go:  rain or snow! In late retirement, I sought out  a two-way exercise machine in our place; I have long given it a daily 20 minute workout, tension on full, with both arms and legs involved.  It’s totally routine, expected.

I confess there has been one of three different heart pills only once a day for over thirty years, prescribed by “the Best.” Since I was inheritor of an oddity, I long since had a pacemaker installed in my right shoulder; I am on my third as I write.  They keep heartbeats above 60, night and day.

Anything else?  I’m in bed, mostly asleep, 10:30 PM to 7:30 AM, nightly.  Little change.  I live, eat, and behave normally—I do think. I was in a terrible one car auto wreck  at over 70 mph in a bit earlier decade.  Then I was the victim of a freak accident that broke  open my left femur its whole length—forcing a form of incarceration nearly four months.

But: I had a 30 ft. above ground swimming pool for over 30 years and gave it the use it was sold for, relying on a battery of solar panels to extend the season.  Great!

For all I know I could be deceased tonight or next week.  I am, after all, 101 and four months.  Ambulatory, I am unsteady; have had eyelid operations; hearing OK but unsharp just a bit. Skin not what it was. Still have nice cuttable hair on my head.  No fingerprints left!  Feet learning numbness; let’s wish them well.  Care to trade?  More sensitive to pepper than you.  Use potassium instead of salt; fake sugar in packets; on advice of one with a Ph.D. in “Nutrition” at the VA, eat minimum bread.  Fast typist, but lean on the keyboard now and then and swear under my breath. 

Why in the world did I write this article instead of doing something else?  Well, around me where I reside are a number of older males and females who are starting to push 100 just a bit,  and some are well over that.  Somebody may want to read it! And my sense of obligation is well developed, especially when I am pretty sure I can extend their life of chatting and musing maybe more than a few months!

LONG LIFE MAY BE GOTTEN, that is, if you want to have it.  This lifetime Research Historian wishes you well, and does hope you get what you want out of the rest of your hopefully happy life.

 

https://historynewsnetwork.org/article/171386
What Historians Are Tweeting: The Women Historians Who Inspire on International Women's Day

Dr. Sarah Bond (@SarahEBond) asked "How did your female mentor make a difference in your career?" Here are some of the responses. 


 

https://historynewsnetwork.org/article/171457
Can Artists Remake Society?

In Grey (1919) by Kandinsky, exhibited at the 19th State Exhibition, Moscow, 1920

 

One hundred years ago today, fighting raged in the streets of Berlin. Kaiser Wilhelm II had abdicated in November 1918, and a new socialist government, led by reform-minded members of the German Social Democratic Party (SPD), had declared a democratic republic. Thousands of workers and sailors, dissatisfied with the moderate stance of the SPD leaders, demanded more radical policies, and revolted in Berlin in January. Soldiers not yet demobilized, the so-called Freikorps, fresh from defeat in World War I, were employed by the new government to destroy the revolt. The Freikorps killed hundreds of workers and assassinated two leaders of the newly founded German Communist Party, Rosa Luxemburg and Karl Liebknecht.

 

The radicals tried again in March to overthrow the SPD government – they called a general strike on March 3, which developed into renewed street fighting. Again the much more heavily armed Freikorps were dispatched by the government to put down the revolt. Gustav Noske, the new Defense Minister, issued a fateful order: “Any individual bearing arms against government troops will be summarily shot.” The ruthless Freikorps, led by extreme conservative officers who hated any manifestation of workers’ power, including the SPD government, hardly needed any encouragement. With few losses, they killed over a thousand workers. When several hundred unarmed sailors demanded back pay at a government office on March 11, twenty-nine were selected out and murdered.

 

Berlin was relatively quiet for a year. On March 12, 1920, the Freikorps, sporting swastikas on their helmets, and other right-wing military formations marched on Berlin in an attempt to create an authoritarian government. Military leaders on the government side refused to fire on fellow soldiers. The SPD government had to flee and a group of extreme conservatives declared themselves rulers of Germany. Adolf Hitler flew into Berlin to support the coup. Across Germany, army commanders and bureaucrats fell into line. This attempt to end the life of the new German democracy finally brought all leftist parties together in a call for a general strike, in which millions of workers paralyzed the country as protest against the so-called Kapp putsch. After four days, the putsch collapsed and the SPD government returned to Berlin.

 

The conspirators were treated leniently in comparison to the leftist rebels. Kapp and the other leaders were allowed to leave the country. Most participants were given amnesty. The Freikorps were eventually dissolved and many of their members later joined the Nazi Party. 

 

Its violent birth severely weakened the first German democracy, the Weimar Republic. The far left continued to advocate revolution. The far right was never reconciled to democracy and used violence against its representatives. The Nazi Party, while never gaining a majority among voters, was tolerated and supported by business and military leaders and conservative politicians, and was able to overthrow Weimar democracy bloodlessly in January 1933, and later murder 96 members of the German parliament, the Reichstag.

 

The city of Berlin is now commemorating the hundredth anniversary of the revolution of 1918-1919 with a broad palette of museum exhibitions, educational events, discussions, and tours under the title “100 Years of Revolution – Berlin 1918-19”.

 

One of the most striking changes triggered by the November Revolution in Germany, and more generally the revolutions in eastern Europe provoked by the Russian Revolution, was the conquest of the art world by a radically new conception of the nature of visual expression. The political revolution encouraged and was welcomed by young German artists, who sought to overthrow the traditional reliance of visual artists on more or less realistic representations of the material world. Calling themselves the Novembergruppe, an “association of radical fine artists”, they, like their colleagues in the new Soviet Union, rejected most accepted artistic conventions and called for a radical rethinking of what art meant. Breaking out of the stultifying traditionalism of German official art, the Novembergruppe offered artistic “freedom for every pulse”. But their ambitions went beyond aesthetics to seek the “closest possible mingling of the people and art”. “We regard it as our principal duty to dedicate our best energies to the moral construction of this young, free Germany.”

 

Among the many “pulses” that the Novembergruppe promoted was a rejection of all forms of artistic realism in favor of pure abstraction. Following the lead of Russian innovators like Kazimir Malevich, the painters Wassily Kandinsky, Otto Freundlich, Walter Dexel and others created non-objective works of color and form. They invited the Dutch abstractionist Piet Mondrian and the Russian Lazar El Lissitzky to exhibit with them in Berlin.

 

Also exhibiting more recognizably political works challenging the German economic, military, and religious elite, the Novembergruppe caused outrage in the early 1920s. By the later 1920s, they had achieved astounding success. Their paintings, sculptures, and architectural drawings became accepted and copied. The innovative artists of the 1920s revolutionized our conceptions of the nature of art. In nearly every cultural field, forms of creative expression which had been deemed distasteful, even repulsive, by the cultural elite became first acceptable and then dominant. Without the innovations of the 1920s, it is not possible to understand contemporary music, painting, or architecture.

 

Yet the broader ambitions of the German cultural radicals of the 1920s fell flat. Their radical ideas had little appeal to broader masses of the population, who still sought traditional forms of beautiful art. Art did not transform life. Their radical politics had restricted appeal. After 1933, the Nazis exploited popular preference for traditional art to categorize the Novembergruppe as “degenerate”.

 

In modern society, we are used to political art. Artists often express political beliefs through artistic creations as a means of influencing popular opinion. Some are individually successful, such as Margaret Atwood’s The Handmaid’s Tale or Norman Rockwell’s painting about school integration “The Problem We All Live With”. But a collective ambition to remake society through art has been absent since the idealism of Russian and German artists of the 1920s ended in disaster in the 1930s. The social vision of the Bauhaus has been subsumed in capitalist commercialism at IKEA. The Novembergruppe’s radical manifestoes are now museum pieces on display at the Berlinische Galerie for 10 Euros.

https://historynewsnetwork.org/blog/154191
Writer Amiri Baraka and the Endless Controversies

 

LeRoi Jones (he later changed his name to Amiri Baraka) was one of the most controversial playwrights and poets in American history. He wrote the plays Dutchman, Black Mass, Home on the Range and the Police, among others, dozens of books of poems, plus fiction and non-fiction works. He gave hundreds of talks on writing and Civil Rights across the country and befriended Martin Luther King Jr., Jesse Jackson, Andrew Young and numerous other African-American leaders of the 1960s and 1970s. He was famous for his stand on black nationalism, but infamous, too, for that stand and his anti-Semitic essays and comments. Many people loved him and many people hated him. He was a lightning rod of emotions for Americans during a turbulent era.

His story is now being told in the play Looking for LeRoy, by Larry Muhammad, at the New Federal Theater’s Castillo Theater at 543 W. 42d Street, in New York, where it opened Saturday. It is a searing, strident and alarming play about a racial rabble rouser and yet, at the same time, a very deep, warm and rich play about a man looking back on his life and talking about what he did and what he did not do.

Playwright Muhammad smartly wrote it as a two-man play, using the character of Baraka and adding a fictitious young intern, Taj, who serves as both friend and enemy of the playwright, letting Baraka bask in his glory at some points and, at others, forcing him to confront severe charges made against him over the years by both black and white audiences.

Director Petronia Paley has done a superb job of taking a tight two-man play and working it so that you see a whole nation of trouble and a landscape of characters alongside Amiri and Taj at the same time.

Baraka, who died in 2014, was a very complicated man. He was a playwright, poet, speaker and in many ways, the conscience of black America. Kim Sullivan, a highly skilled actor, plays him exactly that way. In some scenes he smiles and nods his head knowingly while discussing some well-remembered moments in his life and at others he flies into a rage, stomping about the stage as he remembers other, not-so-fine moments.

His counterpart, Taj, is played with enormous energy by the talented Tyler Fauntleroy, who is an emotional whirlwind on stage. The battles between the writer and intern are wondrous to behold in the gorgeous set by Chris Cumberbatch that serves as Baraka’s apartment.

The play is split into two sections. In the first, the young, headstrong intern, who had met Baraka several times years ago, sees him as a celebrity. In the second, Taj, angrier as his internship goes on, becomes Baraka’s enemy and grills him as if he were a one-man Congressional Committee.

The first section is good, but, like many plays, it starts off very slowly. As the two men spar over Baraka’s view of the theater, and race, though, it starts to sizzle. The second section is even better because the intern confronts Baraka on not only his work, but his life. Many people charged that as Baraka moved more into politics his writing suffered and Taj hammers him on that. He continually sticks intellectual pins into him and the playwright winces.

The play has its problems. It starts slowly and never really explains how famous Amiri Baraka was (sort of like Spike Lee today). Playwright Muhammad also holds his fire and does not present the very outspoken Baraka, despised by so many, until the last twenty minutes of the play, when some of the playwright’s withering language is used. That portrait of the playwright should have come much sooner.

There are also too-brief references to important things. Baraka’s enthusiastic support for Kenneth Gibson in his successful drive to become Newark, New Jersey’s first black Mayor is glossed over in a few seconds, as are the Presidency of Barack Obama and the playwright’s lengthy duels with members of the Jewish faith (he was fired as New Jersey’s poet laureate over anti-Semitic writings).  

These are small criticisms, though. Looking for LeRoy is an impressive look at a provocative writer and electric speaker.

Amiri Baraka would love this play if he was around to see it, although he probably would have insisted on a few more protest posters, louder microphones and a few books about his friend Malcolm X on his coffee table.

 

PRODUCTION: The play is produced by Woodie King Jr.’s New Federal Theater. Sets: Chris Cumberbatch, Costumes: Kathy Roberson, Lighting: Antoinette Tynes, Sound: Bill Toles. The play was directed by Petronia Paley. It runs through March 31.

https://historynewsnetwork.org/article/171404
Life during Wartime 485: “Fatal Embrace”

Previous installments are archived at http://www.joshbrownnyc.com/ldw.htm

https://historynewsnetwork.org/blog/154190
Roundup Top 10!  

 

These women were denied veteran status for decades. Congress can’t overlook them again.

by Elizabeth Cobbs

Sens. Jon Tester (D-Mont.) and Marsha Blackburn (R-Tenn.) now propose to honor the women of the Signal Corps with the Congressional Gold Medal. 

 

The Rise of the Pedantic Professor

by Sam Fallon

When academic self-regard becomes an intellectual style.

 

 

The toxic legacy of the Korean War

by Mary L. Dudziak

The conflict upended the constitutional balance. It has been cited by presidents ever since.

 

 

Women in Ancient Rome Didn’t Have Equal Rights. They Still Changed History

by Barry Strauss

If we look hard at the history, we discover some women who made their mark, either working within their prescribed gender roles as wives, lovers, mothers, sisters or daughters, or exercising so much political, religious or, even in a few cases, military power that they smashed those roles altogether and struck out on their own. 

 

 

The History of Sexism in the Southern Baptist Church

by Susan M. Shaw

Recent media reports have revealed decades of abuse by Southern Baptist pastors. Here is the history behind the reports.

 

 

Barack Obama’s Presidential Library Is Making a Mockery of Transparency

by Anthony Clark

The leader of the “most transparent administration in history” has been anything but transparent when it comes to plans for his presidential center.

 

 

Grant’s First Tomb

by Jamelle Bouie

Ulysses S. Grant, inaugurated as president 150 years ago today, missed a chance to reconstruct the South economically as well as politically.

 

 

The Island That Changed History

by Sergey Radchenko

A 1969 border clash between Moscow and Beijing pushed the two apart, and opened the door for Nixon to go to China.

 

 

Policing black Americans is a long-standing, and ugly, American tradition

by Vanessa Holden and Edward E. Baptist

A new database of all the fugitive slave ads from U.S. and colonial history reveals how white Americans trained and incentivized themselves to police black Americans’ movements.

 

 

Michael Cohen’s testimony exposed a direct parallel between Trump and Watergate

by Shane O'Sullivan

Payoffs kept Watergate hidden, but eventually whistleblowers like Cohen flipped.

 

 

Five Reasons Why Republicans Won’t Abandon Trump Like They Ditched Nixon

by Ed Kilgore

There are five reasons a broader Republican backlash like the one that helped push Nixon out of office won’t happen if Mueller’s suggestions of law-breaking are limited to obstruction of justice.

 

https://historynewsnetwork.org/article/171451
The English Diggers, the "Commons," and the Green New Deal

 

In April of 1649 a group of radicals, some of them veterans of the recent English civil wars, occupied a bit of common grass on a plot known as St. George’s Hill in Surrey and began to grow vegetables. These radicals called themselves “True Levellers” to distinguish themselves from another, more moderate political faction, but everybody else called them “Diggers” in reference to both a scriptural passage and their literal activity on the hill. In a spirit of communal fellowship, they invited the people of Surrey, then suffering under exorbitant food prices, to “come in and help them, and promise them meat, drink, and clothes.” 

Gerrard Winstanley was the major theorist of the group, and in his manifesto The True Levellers Standard Advanced he advocated for a form of public ownership of land, seeing the idea of a commons not as radical, but rather as a restitution. In Winstanley’s understanding the commons were a feature of English rights that had been violated by the development of privatization, whereby enclosures had begun to partition off formerly collective lands, which were now owned by individual aristocrats and noble families. The result, since the end of the fifteenth century, had been increasing inequity, with the landless poor often not having space on which to graze their animals. There was an explicitly ecological gloss to Digger politics, with Winstanley claiming that “true freedom lies where a man receives his nourishment and preservation, and that is in the use of the earth.” The dream of the commons as exemplified by the Diggers has something to say in our current moment, as we face not just rising prices on vegetables, but the possibility of complete ecological collapse. 

Critics of supply-side economics point to the Reagan and Thatcher revolutions of the 1980s as the moment when the traditional social contract, which held that a democratically organized state had a responsibility of care towards collective rights, began to fray. This analysis isn’t wrong: the conservatives of that decade attacked the welfare state in favor of privatization, a shell-game redefinition of “freedom” whereby the undemocratic allocation of resources and power was shifted to the few, who’ve demonstrated a perilously uncaring stewardship towards the environment. But I’d argue that there has long been a Manichean struggle between an understanding of democratic control of resources and a certain aristocratic libertarianism. The “reforms” of the latter go back far in our history, with thinkers like Winstanley understanding what’s lost when the commons are turned over to the control of fewer and more powerful people. Results of that ignoble experiment now threaten to end life on earth as we know it.   

If it seems as if there has been an increasing attack on the idea of the commons by the neo-liberal forces of privatization, then perhaps we should draw some succor from the English historian Christopher Hill, who noted in his classic The World Turned Upside Down: Radical Ideas During the English Revolution that the “Diggers have something to say to twentieth-century socialists,” and perhaps to twenty-first-century socialists as well. In 2019, I would argue, the idea of the “commons” as a space of collective ownership, responsibility, engagement, and possibility must be a metaphor that the left draws from rhetorically, wrenching it free from the realms of theory and philosophy, and using it to more fully define a concept of freedom that holds true for the largest possible number of humans. 

A good idea never really dies. Even the right-leaning The Economist in an article entitled “The Rise of Millennial Socialism” admits that the newly resurgent left in both Great Britain and the United States’ Democratic Party has “formed an incisive critique of what has gone wrong in Western societies.” Partially this has been by recourse to traditional labor politics, but as The Economist notes it’s only on the left that there has been any credible attempt to solve the apocalyptic threat of climate change, the right either burying their heads in the sand or engaging in irrational and unempirical denialism.  Part of the new socialist movement’s environmental approach is a return to how the Diggers understood stewardship of the land, so that in policy proposals like Representative Alexandria Ocasio-Cortez and Senator Edward Markey’s Green New Deal we arguably have the emergence of a new ethic that could be called the “people’s right to the natural commons.” As Jedediah Britton-Purdy wrote in The New York Times, “In the 21st century, environmental policy is economic policy.” 

Just as the Diggers hoped to redefine the commons back towards its traditional understanding, so too do today’s eco-socialists see this as a fruitful moment in which to expand the definition of freedom as meaning “something more than the capitalist’s freedom to invest or the consumer’s freedom to buy,” as the authors of a recent Jacobin article on the Green New Deal write. Kate Aronoff, Alyssa Battistoni, Daniel Aldana Cohen, and Theo Riofrancos argue that for too long the “Right has claimed the language of freedom. But their vision of freedom as your right as an individual to do whatever you want – so long as you can pay for it – is a recipe for disaster in the twenty-first century, when it’s clearer than ever that all our fates are bound up together.” In opposition, the authors argue that the Green New Deal presents the freedoms of a commonwealth, the “freedom to enjoy life, to be creative, to produce and delight in communal luxuries.” I’d add a freedom of access to our collectively owned environment, including its climate, its land, and its oceans. As Woody Guthrie sang with patriotic fervor, “This land is your land, and this land is my land.” 

Increasingly the mass of people has come to understand that the exorbitant wealth of the upper fraction of the 1% signals something more than mere luxury: the transfer of undemocratically manifested political power and the ability to do with the earth and its climate whatever they want, even if the entire ecosystem hangs in the balance. By contrast, eco-socialism requires a return to the Diggers’ promise, not the abolition of private property, but an equitable say in how the resources which belong to the common treasury of all people of the earth should be allocated. Why should executives at Exxon have any ability to decide that they’re alright with apocalypse just because it helps their shareholders? Remember that we’re all shareholders of the earth. Far more pragmatic to consider the potential of what philosophers Michael Hardt and Antonio Negri write about in Commonwealth, when they argue that an embrace of the commons is a rejection of “nihilism,” for in turning away from an apocalyptic capitalist economics we can rather imagine “opening up the multitude’s process of productivity and creativity that can revolutionize our world and institute a shared commonwealth.” 

Even if the Levellers’ nascent eco-socialist revolution failed, there is a golden thread connecting them to their own past and to our current moment. Such beliefs about the commons were held by participants in the Peasants’ Rebellion of the fourteenth century, and similar ideas about a collective right to some part of the environment can be seen everywhere from the commons at the center of many colonial New England towns to the environmental progressivism of President Theodore Roosevelt and the establishment of national parks. A collective right to a natural common, whereby we once again reaffirm the interdependent and communal ownership of the earth, sounds like a radical idea, but a shift towards understanding our environmental crisis in this manner might be the spiritual change required to fully grapple with climate change. 

At St. George’s Hill, Winstanley didn’t understand the occupation as being anarchic, but rather conservative in the truest sense of that abused word, as the root of the word “conservation” and as a return to a “merry old England.” As historian Peter Linebaugh explains, the commons have “always been local. It depends on custom, memory, and oral transmission for the maintenance of its norms rather than law, police, and media.” For the Diggers, it was nascent capitalism which was truly radical, and they rather advocated a return to how they defined the natural state of things. Half a century before the occupation at St. George’s Hill, the anonymous author of a broadsheet ballad of 1607 wrote that “The law locks up the man or woman/Who steals the goose from off the common/But lets the greater villain loose/Who steals the common from the goose.” The Diggers’ rhetoric has even earlier precursors, their politics recalling a rhyming couplet of the Lollard priest John Ball, who helped lead the Peasants’ Rebellion of 1381, and who used to sing a song asking where aristocrats could possibly have been in a state of nature, for “When Adam delved and Eve span, /Who was then the gentleman?” 

In The Century of Revolution: 1603-1714, Hill writes that “Freedom is not abstract. It is the right of certain men to do certain things.” We think of “freedom” as an innate quality – and it is – but we have let those with a very limited definition of the word run roughshod over our environment, where the freedom that has been defined is the right of a very small number of men to make momentous and detrimental changes to the world itself. A world which should be our shared commonwealth. Of course, proposals like the Green New Deal are upsetting to elites who have long profited by their small definition of freedom as being merely their freedom to exploit the earth. Surrey nobles were also less than pleased by the presence of hundreds of radicals encamped on St. George’s Hill, and by August of 1649 the Diggers had lost a court case advocating for their squatters’ rights, so they voluntarily abandoned the plot before the threat of violence could force them to do so. Linebaugh writes that the “commons is invisible until it is lost.” Today St. George’s Hill is the site of an exclusive gated community and a tennis club. Maybe it’s time to tear down some of those enclosures again? 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171387 https://historynewsnetwork.org/article/171387 0
Genocide Denial In Bosnia: A Danger to Europe and to the World

The Srebrenica Genocide Memorial in Potočari

 

On July 11, 1995, Ratko Mladic and his Bosnian Serb (VRS) units arrived in the sleepy eastern Bosnian city of Srebrenica. Designated a “safe zone” by the United Nations, the area had drawn civilians from neighboring cities and municipalities who fled there in hopes of salvation and safety. That day, over 8,000 young Bosniak men and boys were brutally executed by Mladic’s troops in the biggest single massacre of the Bosnian genocide. It was an event unseen in Europe since the murderous Holocaust campaigns carried out by Hitler’s Nazi regime. Today, this event, the details around it, and the nature of the killings have become political fuel for nationalist politics in Bosnia and Herzegovina. Despite the annual exhumation of new mass graves, genocide denial has once again raised its ugly head, just as it did in the aftermath of World War II.

 

Despite thousands of testimonies, photographs, video evidence and overwhelming physical evidence in the form of mass graves, Bosnian Serb and Serbian politicians such as Milorad Dodik (currently a member of the Presidency of Bosnia and Herzegovina) continue to question and deny that a genocide took place in Srebrenica and in wider Bosnia and Herzegovina. These are by no means passive remarks but rather a targeted campaign of denial. The latest iteration of this heinous and destabilizing effort is Republika Srpska’s (one of the political entities created under the Dayton Agreement) so-called “truth investigation” into the Srebrenica genocide. The implications could not be clearer: a rise in nationalist fervor and fascistic political ideologies (the same ones that fueled the last wars in the Balkans), historical revisionism, political instability, and, perhaps most worrying, a return to the denial of human rights, truth, and reconciliation in the country and in this precarious part of Europe. 

 

Misinformation campaigns are nothing new. Nazi authorities and their co-conspirators denied the killing of over 6,000,000 Jews during the war, and many who were sympathetic to their cause continued to do so afterwards. This did not simply stop at passively dismissing or denying the Holocaust, but ramped up through targeted campaigns of misinformation. Nazi propaganda dehumanized Jews and cultivated support for their mass murder before, during, and after the war. Their supporters, such as the historian Harry Elmer Barnes, actively promoted reports denying the existence of Nazi death camps and even published literature on the topic. Neo-Nazi “think tanks” (akin to RS’s investigative body) opened old wounds by downplaying the death count or actively denying the existence of a well-planned, full-fledged campaign of extermination. 

 

Dodik and authorities in the Republika Srpska seem to have taken a page out of this playbook. During the war, mass graves were routinely covered up and concealed, and the bodies of victims moved. Today, this makes identifying the victims very difficult because of the significant commingling of remains. For example, one victim’s remains were found at two separate locations more than 30km apart. The disinformation and deceit did not stop with the halting of hostilities. Serb nationalist politicians and their supporters routinely downplay the genocide or dismiss it outright, refusing to accept blame or to begin a process of reconciliation. They are aggressively pursuing a policy of genocide denial and introducing unsubstantiated doubt in an effort to destabilize the country and, further, to deny the humanity of the victims of the genocide. In 2004, the Appeals Chamber of the International Criminal Tribunal for the former Yugoslavia (ICTY), located in The Hague, ruled that the massacre in Srebrenica constituted genocide, a crime under international law. That ruling was upheld by the International Court of Justice (ICJ) in 2007. These rulings matter little to nationalist leaders such as Dodik and those of his ilk. Ultimately, they have very little respect for international bodies, considering them nothing more than attack dogs set against the Bosnian Serb people. Their tools of the trade have been misinformation campaigns, propaganda, and political investigations. What they fail to understand is that genocide denial has further societal implications. The distrust and enmity in Bosnia cannot be overcome without the truth being taken seriously and without authorities formally apologizing and undertaking actions to prevent similar atrocities from ever happening again.  

 

Ultimately, why is this so important? The same dehumanizing philosophy which fed the ethnic and religious tropes that led to genocide is back, perhaps stronger than ever. The denial of history and of the truth has become normalized in many parts of the world, sometimes through masked efforts at legitimacy. In this moment it is especially important for scholars, journalists, and other professionals to stand up for the truth and demand a platform that overshadows lies and misinformation. Historical revisionism threatens not just the sense of justice for families in Bosnia, but the democratic process in the region. If Europe is indeed serious about protecting democracy and individual rights, it needs to respond to attacks on the truth first. 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171382 https://historynewsnetwork.org/article/171382 0
The Entangled History of “America First” and “The American Dream”

“America First.” 

This simple two-word phrase, which had lain dormant for decades, was suddenly placed front and center in the 2016 Presidential campaign of Donald Trump.

For the Trump campaign, unconcerned about historic meanings or previous connotations, “America First” in 2016 meant higher tariffs, protecting Midwestern manufacturing jobs, trashing NAFTA and turning away from NATO and other longstanding global alliances. 

As Sarah Churchwell explains in her new book, “Behold America,” Trump adopted the phrase as a wedge issue, designed to polarize Americans along racial and geographic lines and peel off blue-collar Democratic voters.

There is a great historical irony in his provocative use of the phrase, because “America First” had first entered the national political dialogue in 1915, when a progressive Democrat, President Woodrow Wilson, sought to unify the country during the horrifying first years of World War I.  

Wilson, looking ahead to his 1916 re-election campaign, used the phrase in early 1915 to justify non-intervention in the bloody conflict. In a major political address, he advocated a carefully calibrated American neutrality. 

Wilson proclaimed that “Our whole duty for the present…is summed up in the motto: America First. Let us think of America before we think of Europe, in order that America may be fit to be Europe’s friend when the day of tested friendship comes.” 

He added that by remaining neutral, and thinking of “America First,” the nation was not being ignorant, or self-centered.  Instead, American neutrality meant “sympathy for mankind. It is fairness, it is good will at bottom. It is impartiality of spirit and judgment.”

So how did we get from that benevolent meaning of “America First” to its use as a provocative threat by candidate Donald Trump? 

Churchwell, professor of American Literature at the University of London, unravels the complicated history behind the “America First” and the equally problematic phrase, “the American Dream” in her new book Behold America. 

Churchwell reports that the first written use of the phrase came in 1884, when an Oakland, California newspaper ran “America First and Always” in the headline above a report on a looming trade war with the British Empire.  It then fell into disuse until President Wilson resurrected it in 1915, ahead of his re-election campaign.

In April 1917, America declared war on Germany and the concept of a “fair-minded” neutrality vanished. After the World War ended and the Versailles Peace Treaty was negotiated in 1919, the phrase “America First” continued to be used, but with new meanings. 

For example, Warren Harding used the slogan “Prosper America First” in his successful 1920 campaign. He called Wilson’s proposed League of Nations treaty a “supreme blunder.”  One newspaper, in endorsing him, cited the fact that Harding would usher in “an era of nationalism, instead of internationalism.”   

According to Churchwell, the massive Harding victory (he won 60 per cent of the vote) “legitimized” the phrase for many Americans. It was soon adopted by anti-immigrant and anti-Catholic groups including the newly resurgent Ku Klux Klan.  For these groups, “America First” meant White supremacy and returning the nation to its “Anglo Saxon” or “Nordic” origins, and restricting immigration from Italy and Eastern Europe. 

In 1924, President Calvin Coolidge signed the Johnson-Reed Immigration Act, which fulfilled the hopes of those who saw “America First” as an invocation of racial supremacy.  The new act severely restricted immigration from southern and eastern Europe by imposing quotas based on the 1890 census.  It also, in effect, banned immigrants from China and Japan.  

In the 1930s, as the nation descended into the Great Depression, references to “America First” rapidly declined.  In 1940, however, the phrase roared back into national prominence with the founding of the America First Committee (AFC), which chose aviator Charles Lindbergh as its spokesman.

The AFC, funded by wealthy businessmen and run primarily by Ivy League law students, was launched nationwide shortly after Germany conquered France. The committee claimed 800,000 dues-paying members within its first year. The AFC vehemently opposed American entry into World War II and directly attacked President Roosevelt’s aid to Britain. Charles Lindbergh, speaking at rallies across the country, suggested that American Jews were behind the effort to support Britain because they were angry at Germany for its vicious anti-Semitic policies.  

On December 7, 1941, Japan bombed Pearl Harbor and America entered the war the next day. The America First Committee collapsed overnight.  

While the majority of “Behold America” is devoted to exploring the tangled evolution of “America First,” Churchwell also examines the changing meaning of “the American Dream.” Trump, of course, famously declared “The American Dream” is dead in his campaign speeches, blaming the loss of upward mobility on unfair competition by China and a “flood” of immigrants.   

As Churchwell noted in a recent interview with Smithsonian magazine, “the American Dream” has always been about economic success, but 100 years ago “the phrase meant the opposite of what it does now.” It was a “dream of equality, justice and democracy,” not just a vision of a large house full of expensive possessions.  

The author lamented that “the American dream isn’t dead…we just have no idea what it means anymore.”   

“Behold America” is extensively researched and generally well written, guiding the reader through a century of political dialogue. However, it is a one-dimensional work dependent on newspaper articles, editorials and letters to the editor.  It is an etymological study of two specific political phrases, rather than a broader look at America’s self-image. 

Churchwell bases her research exclusively on print sources, citing hundreds of newspaper articles and a handful of novels.  She mentions only one movie:  D. W. Griffith’s “Birth of a Nation” (in 1915 it became the first film ever shown in the White House). She also completely ignores theater, music and radio, despite the fact that broadcasts reached millions of Americans in their homes. She skips over President Roosevelt’s “Fireside Chats” on radio and the widely popular weekly political commentary (often openly anti-Semitic) of Detroit’s Father Coughlin.  

She also ignores the motion picture industry. Hollywood had a major influence on American perceptions of opportunity and social justice. One only has to think of movies like The Grapes of Wrath, Mr. Smith Goes to Washington or It’s a Wonderful Life to realize how they shaped depictions of the American Dream. 

Behold America brushes aside the impact of these newer, influential media.  Perhaps Churchwell’s reliance on print sources is due to her background as a professor of Literature. Her previous books include Careless People: Murder, Mayhem, and the Invention of the Great Gatsby.

In the introduction to Behold America, Churchwell notes that “We risk misreading our own moment if we don’t know the historical meanings of expressions we resuscitate or perpetuate.” 

This is certainly true, and despite the book’s narrow focus, readers interested in American politics will find that it offers important new context on the contested meanings of “America First” and “The American Dream.”

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171383 https://historynewsnetwork.org/article/171383 0
What I’m Reading: An Interview With Public Historian Amanda Higgins

 

Amanda Higgins is a public history administrator, working outside the academy. She often describes herself as academic-adjacent. Her work is closely aligned with academic pursuits and she loves talking with students, especially graduate students who are thinking about careers outside the academy. A scholar of 20th-century Kentucky and American history, Amanda’s understanding of the not-so-distant past helps her to connect to and build lasting relationships with people across the commonwealth. She also oversees such outreach activities as oral history efforts, the Kentucky Historical Markers program, Kentucky History Awards and the Local History Trust Fund. She holds a Ph.D. in American history. She can be reached on Twitter at @Doc_Higgs.

 

 

What books are you reading now?

 

Beyond the texts that I pull to support ongoing projects related to my public history work, I’m reading: Jeanne Theoharis, A More Beautiful and Terrible History: The Uses and Misuses of Civil Rights History; Kevin M. Kruse and Julian E. Zelizer, Fault Lines: A History of the United States since 1974; and Rebecca Traister, Good and Mad: The Revolutionary Power of Women’s Anger.

 

Reading these books alongside each other is like taking a graduate seminar in contemporary history. I do miss the lively discussion, but the books speak so nicely to each other. 

 

What is your favorite history book?

 

I struggle with the idea of a favorite, because I am always bouncing between projects, eras, and interpretations. One day I’ll be working on historical marker text about indigenous Kentucky and the next my own research rooted in the twentieth century. The best history books, in my mind at least, weave complex interpretations with compelling narrative. 

 

Books like Hasan Kwame Jeffries, Bloody Lowndes, Timothy B. Tyson, Radio Free Dixie, and Donna Murch, Living for the City helped me frame my dissertation project and were models for the best parts of my work. 

 

In Kentucky history, my friend Patrick A. Lewis creates arguments with clauses and structure that I deeply admire. I wish I was even half the writer of my advisor, Tracy A. Campbell, who makes the most mundane details compelling. So, that doesn’t answer the question, but it does name check some of the folks I recommend that others should read! 

 

Why did you choose history as your career?

 

I am nosy by nature and ask many, many questions. I grew up in a home full of books and was encouraged to read anything I wanted at a young age. I loved stories about people, especially people who weren’t like me. I started college as a journalism major, but did not enjoy my introductory class. I gravitated toward my history courses because I enjoyed reading, identifying and engaging with arguments, and digging for information. I thought I’d turn the history major into a law degree, but in the fall semester of my senior year I took a US Legal History course and a Constitutional Law course. I hated the law parts of the classes and loved the policy and implications of the laws—how the laws affected peoples, unintended consequences of rulings, precedent and challenges—and skipped the LSAT. I took the GRE, went to graduate school, and became a historian because I had more questions and wanted to do history. 

 

What qualities do you need to be a historian?

 

Endless curiosity and a dogged determination to find answers. 

 

Who was your favorite history teacher?

 

I’ve been very fortunate throughout my life to be surrounded by incredible educators. My seventh grade social studies teacher was the first teacher who showed me that history was more than names and dates. She tied history to relevant, contemporary topics and encouraged us to be independent and critical thinkers. 

 

What is your most memorable or rewarding teaching experience?

 

In the penultimate year of my doctoral program I was the primary instructor in a course called “the World at War.” The course subject isn’t my favorite or my specialty, but I had a promising student who decided she wanted to be a historian that semester. She was majoring in business or some “sensible” career path that pleased her parents, but she didn’t like those courses. History made her mind race, helped her to understand her world, and ignited a passion in her. We talked through the arguments for and against majoring in history, how she could “sell” the change to her parents, and what her future may look like. She became a history major, graduated with honors, earned a Master’s in public history and is doing a fantastic job as the second in command at a small museum. 

 

In helping her think about what her future could be, I also articulated what I wanted for myself. She helped me much more than I helped her, by asking questions about my career goals and skills. Her continued success brings me so much joy!

 

What are your hopes for history as a discipline?

 

That we get over ourselves and invite folks into the process of history. The best historians are removing the layer between the finished thing and the work to get to that finished project. History is powerful and it matters deeply for a healthy and engaged citizenship, but as historians, we’re not always good at or comfortable with showing our work. To steal a line from my advisor, we hide behind—or even in—our footnotes. We should stop doing that. 

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I don’t own rare books, but I get to work amongst them every day at the Kentucky Historical Society. 

 

My home is full of mid-century bourbon decanters that my partner and I salvaged from my grandfather’s bar after he passed. Jim Beam used to (maybe still does?) put out a collectible decanter every year. They are such fun little pieces of Americana. Living and working in Kentucky means you’re never far from bourbon and I do enjoy historical ephemera from the industry.

 

What have you found most rewarding and most frustrating about your career?

 

The most rewarding parts of my job are the colleagues and friends who I get to collaborate with on my many projects. Seeing friends and collaborators succeed, helping connect a good idea with the right person to make sure that idea becomes a project, and championing the good history work I get to be a part of every day sustains me through self-doubt or bad moments.

 

The frustrating parts are not anything unique or noteworthy. I like my job. I’m proud of the choices I made to get where I am and willing to take most of the frustrations that come with working in an institution to do work worth doing. 

 

How has the study of history changed in the course of your career?

 

My career is quite young. I’ve only been at this professionally for about five years now. Still, I am so impressed by the research fellows who come through the Kentucky Historical Society’s program. Their projects are inventive and inspiring. The way many of the fellows are using court records to build digital projects, or ArcGIS mapping to illustrate the networks of enslaved labor, or material culture to understand the lived experiences of working-class families is incredible. 

 

The other thing I’m really excited about is the way folks are thinking about projects that span multiple formats. The people I interact with on a daily basis aren’t thinking that the monograph is the only outlet for publication or that the monograph is the last the project will see. They’re (we’re) proposing multifaceted projects—monographs, and scholarly essays, but also public programming, exhibitions, digital tools and games, and experiential learning classes. By democratizing the end product, we’re making the most cutting edge, relevant, and impactful history more available, especially to folks who aren’t as likely to pick up or engage with a scholarly monograph. 

 

What is your favorite history-related saying? Have you come up with your own?

 

“For history, as nearly no one seems to know, is not merely something to be read. And it does not refer merely, or even principally, to the past. On the contrary, the great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do. It could scarcely be otherwise, since it is to history that we owe our frames of reference, our identities, and our aspirations.”—James Baldwin, “The White Man’s Guilt,” in Ebony Aug. 1965 (pg. 47).

 

I haven’t come up with one myself, at least not anything worth repeating and definitely not alongside James Baldwin!

 

What are you doing next?

 

I’m currently managing a three-year IMLS-funded diversity and inclusion initiative at KHS, working on the planning stages of a new exhibition, a new research project I’m not ready to put into the world yet, and collaborating with a number of public history projects throughout Kentucky. 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171180 https://historynewsnetwork.org/article/171180 0
Will the U.S. Government Abide by the International Law It Created in Venezuela?

 

The Trump administration’s campaign to topple the government of Venezuela raises the issue of whether the U.S. government is willing to adhere to the same rules of behavior it expects other nations to follow.

During the nineteenth and early twentieth centuries, U.S. foreign policy was characterized by repeated acts of U.S. military intervention in Latin American nations.  But it began to shift in the late 1920s, as what became known as the Good Neighbor Policy was formulated.  Starting in 1933, the U.S. government, responding to Latin American nations’ complaints about U.S. meddling in their internal affairs, used the occasion of Pan-American conferences to proclaim a nonintervention policy.  This policy was reiterated by the Organization of American States (OAS), founded in 1948 and headquartered in Washington, DC.

Article 19 of the OAS Charter states clearly:  “No State or group of States has the right to intervene, directly or indirectly, for any reason whatever, in the internal or external affairs of any other State.”  To be sure, the Charter, in Article 2(b), declares that one of the essential purposes of the OAS is “to promote and consolidate representative democracy.” But this section continues, in the same sentence, to note that such activity should be conducted “with due respect for the principle of nonintervention.”  The U.S. government, of course, is an active member of the OAS and voted to approve the Charter.  It is also legally bound by the Charter, which is part of international law.

The United Nations Charter, also formulated by the U.S. government and part of international law, includes its own nonintervention obligation.  Attempting to outlaw international aggression, the UN Charter declares, in Article 2(4), that “all Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations.”  Although this wording is vaguer than the OAS Charter’s condemnation of all kinds of intervention, in 1965 the UN General Assembly adopted an official resolution that tightened things up by proclaiming:  “No State has the right to intervene, directly or indirectly for any reason whatever, in the internal or external affairs of any other State.”

Unfortunately, the U.S. government has violated these principles of international law many times in the past―toppling or attempting to topple numerous governments.  And the results often have failed to live up to grandiose promises and expectations. Just look at the outcome of U.S. regime change operations during recent decades in Iran, Guatemala, Cuba, Chile, Cambodia, Haiti, Panama, Nicaragua, Afghanistan, Iraq, Libya, Syria, and numerous other nations.

Of course, there are things worth criticizing in Venezuela, as there are in many other countries―including the United States.  Consequently, a substantial majority of OAS nations voted in January 2019 for a resolution that rejected the legitimacy of Nicolas Maduro’s new term as president, claiming that the May 2018 electoral process lacked “the participation of all Venezuelan political actors,” failed “to comply with international standards,” and lacked “the necessary guarantees for a free, fair, transparent, and democratic process.” 

Nonetheless, the January 2019 OAS resolution did not call for outside intervention but, rather, for “a national dialogue with the participation of all Venezuelan political actors and stakeholders” to secure “national reconciliation,” “a new electoral process,” and a peaceful resolution to “the current crisis in that country.”  In addition, nonintervention and a process of reconciliation between Venezuela’s sharply polarized political factions have been called for by the government of Mexico and by the Pope.

This policy of reconciliation is far from the one promoted by the U.S. government.  In a speech to a frenzied crowd in Miami on February 18, Donald Trump once again demanded the resignation of Maduro and the installation as Venezuelan president of Juan Guaidó, the unelected but self-proclaimed president Trump favors. “We seek a peaceful transition to power,” Trump said.  “But all options are on the table.” 

Such intervention in Venezuela’s internal affairs, including the implicit threat of U.S. military invasion, seems likely to lead to massive bloodshed in that country, the destabilization of Latin America, and―at the least―the further erosion of the international law the U.S. government claims to uphold. 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171384 https://historynewsnetwork.org/article/171384 0
Remembering Audrey Hepburn's Best Role

 

As International Women's Day (March 8) comes right after the Academy Awards, it's worth remembering a past winner for best actress. For she almost never got the chance to achieve that dream.

As a young girl, this future actress was living in Nazi-occupied Holland during World War II. As the war continued so did food shortages. It became a famine known as the Hunger Winter of 1944-45. People were starving to death. Some children never got to realize their dreams because their lives were lost to malnutrition.

This future Academy Award winning actress almost became one of a lost generation. Her name was Audrey Hepburn. She thankfully survived and so did others thanks to food aid.

The Allied Forces airlifted food into the Netherlands (Holland) near the end of the war after reaching agreement with the German forces. Truck convoys would soon follow upon liberation. The Supreme Allied Commander, Dwight Eisenhower, had organized a relief plan so that food stocks were ready to move in and people could eat again.

Audrey Hepburn would go on to become a star, winning the Best Actress Academy Award in 1954 for Roman Holiday. Her best role though was later in life becoming an ambassador for UNICEF, the United Nations agency which fights child hunger and disease. Audrey's own experience living in famine moved her to help children facing that same plight.

After visiting famine-ravaged Ethiopia in 1988, Audrey said “there is a moral obligation for those who have, to give to those who have nothing."  If we remember any of Audrey's lines it should be that one. There are many children today, like Audrey, who are living in areas threatened by famine.

Yemen, South Sudan, the Central African Republic, Syria, Afghanistan, Haiti, Mali and so many other nations are facing major hunger emergencies. There are future doctors, scientists, researchers, teachers, writers, farmers and even actresses among this population. But they may not get the chance to realize their potential if they are lost to hunger.

Even children who survive famine may become stunted for life. Audrey herself may have suffered some lifelong health issues from being malnourished during the war.

For the sake of humanity, we have to save these children. In a world where food is abundant no child should go hungry and lose their future.

Audrey thought her role as ambassador was to educate the world about the nightmares of famine. She knew people were good and would help once they realized something terrible was happening. As she told the Christian Science Monitor in 1992, “the world is full, I’ve discovered, of kind people. And I've also discovered once they know, they give, they help. It’s not knowing that holds them up.”

This is especially true because many people who can help are far away from the hunger emergencies. The starvation in civil war-torn Yemen or South Sudan is not often seen in media coverage. They need ambassadors, maybe you, to change that. You could educate others for example, that 70 percent of Yemen’s population lives in hunger and relief agencies are short on funds to help them.

We need to support relief agencies like UNICEF, the World Food Program, Save the Children, Catholic Relief Services, Mercy Corps and others fighting hunger. These charities on the frontlines of famine are desperately short on funds. Far more resources are put into military expenditures than into feeding the hungry.

We should increase funds for the U.S. Food for Peace program, which was started by Eisenhower. The McGovern-Dole school lunch program, which feeds children in developing countries, should also see more funding.

We need to step up our diplomatic efforts to resolve conflicts that are causing so much hunger. We need to fortify peace agreements with food aid.

Global conflict, hunger and displacement are at the highest levels since the World War II era. We still have time to save many children who are suffering. As Audrey reminds us, we have a moral imperative to take action and save lives.

 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171403 https://historynewsnetwork.org/article/171403 0
Is Bernie Sanders Actually A Democratic Socialist?

 

In his “State of the Union” address, President Trump railed against “socialism,” taking aim at rising left-wingers in the Democratic Party, such as Bernie Sanders and Alexandria Ocasio-Cortez, who often identify themselves as “democratic socialists.” But are their policies actually socialist? According to Webster’s Dictionary, socialism is “a system or condition of society in which the means of production are owned and controlled by the state,” whereas social welfare is “organized public or private social services for the assistance of disadvantaged groups.” Social welfare was in fact an antidote to socialism, initiated by one of the most conservative politicians in world history, Otto von Bismarck, the first Chancellor of a newly unified Germany. 

 

When Germany was unified in 1871, Europe had already witnessed a steady increase of socialist and communist influence. “A spectre is haunting Europe – the spectre of communism,” Karl Marx declared in 1848, while revolutions were spreading all over the continent. In the 1870s, the Social Democratic Party rose quickly in Germany, and Bismarck called the socialists “this country’s rats” and “enemies bent on pillage and murder.” Bismarck tied the socialist party to an attempted assassination of William I and banned the party. At the same time, he led the legislative effort to establish a social welfare system in order to reduce the appeal of radical socialism and communism to the working class and to increase commoners’ loyalty to the German state. Three important laws laid the foundation of the German social welfare system: the Health Insurance of Workers Law of 1883, the Accident Insurance Law of 1884, and the Old Age and Invalidity Insurance Law of 1889. Thus the German social welfare system, arguably the first of its kind in human history, was created as an antidote meant to undermine socialist and communist radicalism in politics.

 

The hatred of Bismarck and many other establishment figures for socialism and communism was based on the ideology’s doctrine of abolishing private ownership through class struggle and “proletarian dictatorship.” Given what happened in the Soviet Union under Stalin, or in China under Mao, when millions of landlords and business owners were killed and their properties confiscated, Bismarck’s harsh words against radicals bent on “pillage and murder” may not seem too far off the mark. However, neither Bismarck nor any other politician of his time could possibly foresee the split of the socialist and communist parties during the First World War. Many radical leaders of the Second International abandoned Karl Marx’s call for proletarian international solidarity against their own “bourgeois” national governments, and they became “national socialists.” At the same time, Lenin split with those “national socialists” and formed the Third (Communist) International, the Comintern. He condemned his fellow socialists as “revisionists” who betrayed Marx’s famous saying that the “working class has no fatherland.” He went on to wage revolution against the Russian “bourgeois” state and succeeded. 

 

The split of the socialist camp in Europe had profound social and political consequences, including bloodshed. A case in point was Mussolini in Italy, who betrayed his father’s socialist beliefs and enlisted to fight in the war. Later, his stormtroopers fought in the streets against their former comrades, who seemed to be loyal to Moscow rather than to Rome. One of Mussolini’s admirers was Hitler, who later named his party the National Socialist party (Nazi). The elite establishments in both Italy and Germany faced a tough choice between Marxist socialists determined to wage class struggle against private ownership and national socialists who wanted to make their nation “great again.” The nationalists proved more popular and gained more seats in parliament than their former comrades. The rest is history: Mussolini and Hitler gained power in their respective countries. 

 

In the US, the influence of socialism was on the rise during the “Gilded Age” because of unregulated and rapid capitalist development. But Eugene Debs’ Socialist Party did not succeed by any means, thanks to the “antidote” provided by progressive reformers, who were largely middle-class professionals. They tried to prevent a social revolution of the European type, and they did not want to see the tragedies of St. Petersburg or of the Paris Commune occur in the New World. Their strategy was to educate the public to push for legislation on behalf of the public interest, especially the poor and marginalized, and against the greedy instincts of the corporate world. Consequently, we have workmen’s compensation laws and federal regulatory agencies such as the FDA in place to make the unregulated capitalist market economy behave more rationally, and to prevent the accumulation of private “wealth against common wealth.” 

 

Thanks in part to the effectiveness of this “antidote,” socialism has never been influential in America. As a result, the American public is much less informed about the nature and history of socialism than its European counterparts. Some political hacks have taken advantage of this knowledge gap to accuse someone they dislike, such as Obama, of being a “socialist,” or to brand “Obamacare” as “socialist legislation.” People who might buy what these political hacks are selling need to know the basic definition of socialism: an ideology advocating “public,” “collective,” or “common” ownership of the means of production as against private ownership. In practice, socialism became quite popular in Western European countries such as the UK, and even in Canada, where many “Crown corporations” were owned and controlled by the government. The problem was that they were all losing money and depended on taxpayers’ support to stay alive. During the 1980s, the so-called Reagan-Thatcher decade of conservatism, these socialist enterprises were largely privatized. In the 1990s, Tony Blair led the Labour Party to drop the “common ownership” clause from its constitution, moving the party from the left to the center. That allowed Labour to win election after election.

 

At the same time, Bill Clinton moved the Democratic Party to the center and won election twice. In the era of Trump, some elected Democratic officials again seem to be moving decidedly toward the left, and some openly call themselves “democratic socialists.” Given the history of socialism in this country and around the world, is this really a winning strategy? If you are in favor of “Medicare for all,” you don’t have to call yourself a “democratic socialist,” because Medicare is not an enterprise of “productive means,” as are US Steel or ExxonMobil. This is especially true if you argue that health care is a human right for everyone. Of course, you should call yourself a socialist if you really believe in the abolition of private ownership and in social control of the means of production by the state. For the sake of history, please tell your voters what you really believe, and don’t just throw out a label without a precise definition.

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171380 https://historynewsnetwork.org/article/171380 0
Battlefield Tourism At Ypres and How Public History Sites Change

 

Ypres or ‘Wipers’ (the soldiers’ slang name) in Belgium has been a magnet for battlefield tourists since 1918. The first wave of (mainly Anglophone) visitors in the 1920s thought of themselves as ‘pilgrims’ rather than ‘tourists’, but the boundaries between commemoration and commerce were as fluid then as they are now. After a brief slump in the 1970s, the tourists have been returning to West Flanders in ever greater numbers, especially since the 1990s. 

Visiting Ypres, or Ieper to use its modern name, is an amazing experience. First, there is the sheer wonder of wandering around a seemingly historic city which, on closer inspection, proves to be of very recent completion. Then, there is the impressive scale of the massive Cloth Hall, the great medieval trading market which attracted merchants from across Europe. But that too proves to be a bit of a curiosity when stared at, as the mix of very smooth, sharply cut stone merges with the pock-marked, scarred and worn pillars along the ground floor. Next to the Cloth Hall is a soaring medieval cathedral, but step inside and it feels so new you almost expect it to squeak as it comes out of the shrink-wrap. Finally, there is the Menin Gate, a huge memorial to the British and Commonwealth missing of ‘the salient’. Tucked into the ramparts, the Menin Gate almost leaps out at the visitor walking along the street from the central square (the Grote Markt). 

It is the Menin Gate that provides the key to the rest of the mystery, for it commemorates the fact that this charming West Flanders city witnessed some of the most intense and prolonged fighting on the Western Front between 1914 and 1918. Inscribed on the Gate are the names of some 56,000 soldiers who have been missing since the war. During that fighting Ypres was reduced to rubble and ashes, only to rise again in replica form. And that is an underlying theme of our work: the recycling, rebuilding, and reconstruction of images, stories, and histories of Ypres, which stands alongside the physical construction of memorials, monuments and cemeteries in a reconstructed landscape. It is about construction and reconstruction, the encoding and reinterpreting of a major historical event within its original space, and how the battlefield of Ypres could be brought home. 

Battlefield tourism (or ‘pilgrimage’) has been at the heart of this often deeply emotional process of bringing the war home since 1918. For the British the city was indeed a site of pilgrimage, as summed up in the title of an early guidebook, Ypres: Holy Ground of British Arms. The British thought the ground was made sacred by the bloody sacrifice of countless soldiers from the British Empire. Running alongside this reverential mode was that of the tourist, as British people sought to buy souvenirs and enjoy home comforts. By the mid-twenties Ypres businesses were appealing directly to the British. ‘If you want a cup of good strong English tea have it at the Tea Room’, ran one café’s advertisement.

What the Baedeker guidebook noted in 1930 still holds true today, namely that Ypres had ‘acquired a certain English air’. Today’s visitor, walking the streets of the ‘medieval’ town, attending the daily remembrance ceremony at the Menin Gate or visiting a war cemetery on the outskirts of the city, is unlikely to bump into a German battlefield tourist; even at the German war cemetery at Langemarck (some six miles to the north-east of the city), anglophone visitors greatly outnumber their German counterparts. This was not always the case. ‘Langemarck’ had once occupied a special place in the German memory of the Great War. It was the site where war volunteers had allegedly marched to their deaths singing ‘Deutschland über alles’ in November 1914. The memory of their noble ‘self-sacrifice’ became a rallying cry for the political right during the 1920s and 1930s. When the Wehrmacht overran Belgium in May 1940, this was celebrated as a ‘second Langemarck’. After 1945, following a hiatus of several years, German veterans of the First World War returned to the city for the 50th anniversary commemorations, often at the invitation of a city keen to foster a spirit of reconciliation. With the passing away of the veterans during the 1970s, the presence of German visitors in and around Ypres declined dramatically – precisely at the moment when British battlefield tourism started to increase again. Most German veterans went to their graves with their war stories, and only a negligible number of families toured the battlefields in the hope of recapturing something of the war experiences of their grandfathers. Flanders – the mere mention of the name could still send shock waves through the generation of survivors in the 1950s – faded from German collective memory and largely disappeared from tourist itineraries.  The new generation, it seems, no longer felt a deep emotional connection to the war dead and the landscape of the Ypres salient.

Today’s Ieper still has thousands of British visitors, with tourism as important to the economy of the city as it was in the twenties. But, in addition to the British, the Australians, Canadians, New Zealanders and also Americans are now coming in even greater numbers, as well as people from many other nations fascinated and intrigued by meeting the last great eyewitness left of the Great War: the landscape. Modern Ieper is a world forged and shaped in the furnace of a conflict that ended one hundred years ago this November.

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171381 https://historynewsnetwork.org/article/171381 0
What FDR Can Teach Us About Congress and National Emergencies

Franklin D. Roosevelt, Sam Rayburn, and Alben Barkley

 

Seventy-five years ago this week, there was a serious conflict between President Franklin Roosevelt and Congress. The United States was at war, indisputably a national emergency. Today we face a serious conflict between President Donald Trump and Congress. President Trump has declared a national emergency in order to spend monies appropriated by Congress for other purposes in order to build a wall between the United States and Mexico. Only Trump’s supporters, a minority of the country, see an emergency. If Trump is not stopped, we will have taken a serious step toward authoritarian government. We may draw some lessons from the conflict between Roosevelt and Congress in 1944 that may be helpful today.

As a follow-up to his call for an Economic Bill of Rights in his January 11, 1944, State of the Union address, Roosevelt had proposed to raise $10.5 billion for the prosecution of the war and domestic needs. The resulting Revenue Act raised only $2.1 billion and included tax cuts and new benefits for bondholders and the airline, lumber, and natural gas industries. On February 22, 1944, Roosevelt issued a veto message, charging that the measure enacted by Congress was “not a tax bill but a tax relief bill providing relief not for the needy but for the greedy.” Although Roosevelt was right in his criticism, the reaction on Capitol Hill was outrage. 

The next morning, Senate Majority Leader Alben Barkley of Kentucky, hitherto a close supporter of the president, charged that the president had given a “calculated and deliberate assault upon the legislative integrity of every Member of Congress.” Barkley announced that he would resign as majority leader because he needed to maintain “the approval of my own conscience and my own self-respect.”  He concluded his speech to a packed Senate with a call for action: “if the Congress of the United States has any self-respect yet left, it will override the veto of the President and enact this tax bill into law, his objections to the contrary notwithstanding.” The Senate gave Barkley a standing ovation. The House overrode the president’s veto later that day. The Senate did so the following day, February 24, 1944. 

Instead of persisting in his criticism of the Congress, Roosevelt sent Barkley a conciliatory message, hoping that he would not resign and that “your colleagues will not accept your resignation; but if they do, I sincerely hope that they will immediately and unanimously re-elect you.” Shortly before the Senate override of the veto, Barkley resigned and was unanimously reelected majority leader. 

An immensely popular president who had been elected three times and would be reelected to a fourth term in less than nine months’ time, Roosevelt nevertheless knew he needed to work with Congress and respect its authority. Although the media at the time characterized Barkley’s resignation and the Congressional override of Roosevelt’s veto as a crisis, the rift was quickly healed, and Roosevelt and the Congress continued to work together on the war emergency. The controversy, it’s true, meant that Roosevelt no longer considered Barkley a potential running mate in 1944. That was a sacrifice Barkley was willing to make. He later served as vice president under Harry Truman.

What lessons may we draw today from this controversy?  To maintain its role as the holder of “All legislative Powers” of our constitutional government, Congress should vote to cancel Trump’s emergency declaration as provided by the National Emergencies Act of 1976 and then vote to override his likely veto. Senate Majority Leader Mitch McConnell should follow the example of fellow Kentuckian Alben Barkley and support such legislation and an override of a presidential veto. Because the stakes are nothing less than the survival of representative government, a grassroots movement should make its voice heard to stop the imposition of authoritarian government.

 

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171377 https://historynewsnetwork.org/article/171377 0
Roundup Top 10!  

Michael Cohen revealed Trump doesn’t understand America’s racist past

by Elizabeth A. Herbin-Triant

Ghettos exist because of housing discrimination.

 

I wrote about the waning popularity of history at universities. Historians weren’t happy.

by Max Boot

Americans are in vital need of the instruction that historians can provide. Instead of responding defensively to criticism, historians would be better advised to think about what all of us — I include myself — can do to counter the abysmal ignorance that has made so many people susceptible to a demagogue like Donald Trump.

 

 

Black women led the charge against R. Kelly. They’re part of a long tradition.

by Danielle McGuire

Why has it taken more than 20 years and testimony by about 50 accusers to get to this moment?

 

 

Revisiting The American Nazi Supporters of "A Night at the Garden"

by Margaret Talbot

One advantage to living through Trumpism is that it has compelled a reckoning with aspects of our country’s past that, for a long time, many Americans preferred not to acknowledge.

 

 

What journalists miss when they ignore history

by Kathryn Palmer

Media historian Earnest Perry explains why journalists should put more history in the headlines.

 

 

The Academy Is Unstable and Degrading. Historians Should Take Over the Government, Instead.

by Daniel Bessner

Were Mills and Chomsky correct to assume that radical intellectuals could have little effect on U.S. policy?

 

 

Beyond Slavery and the Civil Rights Movement: Teachers Should Be Integrating Black History Into Their Lessons

by Melinda D. Anderson

Much of what students learn about black people’s distinct American story is hit-or-miss.

 

 

2020 Will See a Monumental Clash Over America’s Place in the World

by Stephen Wertheim

Is it time for the U.S. to confront other great powers — or to retreat?

 

 

Stop calling Trump “medieval.” It’s an insult to the Middle Ages.

by Eric Weiskott

It’s not only ahistorical. It obscures uniquely modern evils.

 

 

Obama Makes It Harder to See the Arc of History Bend

by John Gans

My old boss’ post-presidential center is a missed opportunity.


 

The important way the 2008 crisis was worse than the Great Depression

by Matt O'Brien

The 2008 crisis is still with us to this very day even though it officially ended almost a decade ago.

]]>
Tue, 26 Mar 2019 23:32:08 +0000 https://historynewsnetwork.org/article/171402 https://historynewsnetwork.org/article/171402 0