All posts by sineadmceneaney

Historian at the Open University. Primarily interested in post-war United States history, with an emphasis on race, gender, social movements and protest. Currently writing about gender, autobiography and civil rights.

1776, and all that

“There has to be some kind of accountability, because we cannot have a redemption, we cannot have healing without accountability, without the truth being told, without responsibility being accepted.”

Those were the words of Senator Cory Booker (D-NJ) on Stephen Colbert’s The Late Show on the evening of January 18th, a federal holiday marking Martin Luther King Jr.’s birthday. He was talking about the attack on Congress, and the failure of the President and several high-profile Republican politicians to take responsibility for their part in inciting the violence on January 6th.


But he could also have been talking about the 1776 Report, published the same day. Conceived as a rebuttal to the 1619 Project, the report set out to establish a definitive conservative historical ‘truth’ of American exceptionalism for use in school history curricula, in place of those pesky scholarly studies that reveal real problems of slavery and genocide in the American past. There was no expert in American history among its authors, and historians across the board have dismissed it as propagandist rubbish. Indeed, incoming President Biden has declared he will dissolve the commission on his first day in office.

Also on January 18th, the White House released the names of 244 people who would be cast in stone or bronze in a proposed National Garden of American Heroes, each to be commemorated for their contributions to American society and, by the standard of the 1776 Report, to the inevitable march of American progress. The list is most peculiar. It includes Whitney Houston and William F. Buckley, Frederick Douglass and George Patton. “The chronicles of our history show that America is a land of heroes,” said the press release. Such heroism conveniently allows us to focus on a narrative of progress: if Sitting Bull is a hero alongside Andrew Jackson, then they must both be good, right? Like the Declaration of Independence in 1776, would these statues also be theoretically sacred, and would it be ‘erasing history’ to destroy them?

In reality, just like the Declaration of Independence, the Statue Garden would provide just a snapshot of the past, open to interpretation and re-interpretation in context for generations to come. It is also, thankfully, unlikely to ever be built.

What is truly striking in these stories is how unusual it is to see this kind of wrangling about the past in mature democracies. We normally associate disputes over what is ‘true’ in history with autocratic states, or with new nations emerging from internal strife. In 2007, Northern Ireland established a Consultative Group on the Past, in an effort to find some kind of usable historical truth that would be accepted by two sides in a still deeply divided post-Troubles society. After the death of Franco, the political left and right in Spain colluded to ignore the history of Francoism so as to move forward without having to reckon with the past: they called this the ‘Pact of Forgetting’.

In contrast, the Germans have a dedicated word for the process of coming to terms with a difficult and indefensible past: “Vergangenheitsbewältigung”. Through careful curation of public space, museums, statues and other monuments, and by encouraging detailed examination of the atrocities of the 20th century, Germans have sought to create an open discussion that reconciles them with their past.

The United States is no stranger to historical myth-making. But the inability to engage in good faith with the past, and the reliance instead on fairy tales which obscure the real harms done – and still being done – by slavery and its legacies, indicates real challenges ahead for its democratic norms. There are challenges here in the UK too, where we see a similar wrangling over the legacy of empire and Britain’s central involvement in the slave trade. There is a determination in some quarters (primarily conservative) to produce simplistic jingoistic narratives, and to protect statues rather than protect research into the nuance of the past. We can see this in the recent criticism of the National Trust for ‘erasing history’ through provision of new, detailed research on their properties’ historical links with slavery, and the incomprehensible attack by some newspapers on the Arts Council funded Colonial Countryside project. 

Weaponizing bad-faith, inaccurate narratives of the past for political leverage creates further division, now and in the future. It is more useful in a mature democracy to encourage detailed study of history and robust good-faith debate based on sources rather than ideological agendas. History is erased through lies and simplistic platitudes that appeal to prejudices, not through research. What we need is more, and more detailed, historical research into these difficult questions in our pasts. Accountability, understanding, responsibility: these are the only things that will lead to a robust civic society capable of engaging in a critical and positive way with democracy.

The Spectre of Trump Haunts America

When Joseph R. Biden becomes President of the United States at noon on 20 January, you could forgive him a sense of déjà vu. When he first entered the White House in 2009, then as Vice President in Barack Obama’s incoming administration, he was facing a crisis of unprecedented proportions that would require swift action and a huge stimulus from Congress to resolve. The Republican party had been taken over by a small, ideologically fragile fringe group, which sought to stoke social and political division through lies and misrepresentations. Conspiracy theories abounded.

This time, the Covid pandemic replaces the financial crisis, Trumpists are the new Tea Party, and QAnon and assorted ‘stop the steal’ conspiracists take up where the Birthers left off. Same problems, different January. And this time, Biden takes the oath as President instead of VP.

Some other things will be familiar. The ceremony will be outside the Capitol building, as has been the case for most inaugurations since the 1830s. The Chief Justice of the Supreme Court, John Roberts, will administer the 35-word oath required by the Constitution, swearing in the new President. Biden is expected to use his family Bible for the ceremony, as he follows in John F. Kennedy’s footsteps to become only the second Catholic to occupy the Oval Office. President Biden will give an inauguration speech, but to a small in-person crowd. He will almost certainly echo most of his recent predecessors, who sought to use their first remarks in office to reach out to those who did not vote for them – unlike the outgoing President, whose inaugural speech conjured up a dystopian image of American carnage, a country “destroyed” by immigration, universal healthcare, anti-racist movements and eight years of progressive Obama policies.

But for the most part, this inauguration will bear little resemblance to any other.

There will be no parade. There will be no inaugural balls in the evening. There won’t even be proper crowds: the National Mall will be empty, and television and online coverage will provide a proxy. The entire city is in lockdown, the product of dual concerns over security and the pandemic. And with the exception perhaps of Abraham Lincoln’s first inauguration on 4 March 1861 – when seven states had already seceded from the Union and Civil War looked inevitable – it’s difficult to think of a more fraught inaugural ceremony. Now, just as then, Washington DC will resemble a warzone, with tens of thousands of police, military and National Guardsmen patrolling the Capitol and the city, expecting riots, protests and possibly even assassination attempts. Given the reports that some rioters on 6 January may have had plans to assassinate members of Congress, these fears do not seem unreasonable.

Outside the Capitol, 6 January 2021

Even as Biden takes command of the nuclear codes, the spectre of Trump haunts America.

When he takes office, Biden will inherit the carnage of Trump’s making. Trump’s vice-like grip on the Republican party has only exacerbated the party’s worst failings. As a result of Trump’s routine lying, the spineless failure of party grandees to stand up to him, and the cynical exploitation of unfounded conspiracy theories by others for their own political gain, confidence in the political system is at an all-time low. Political discourse is broken. Bipartisanship has been destroyed. In direct contradiction of the clear voice of an electorate that voted in unprecedented numbers for the Biden/Harris ticket, 147 Republicans in Congress voted to block the certification of the election results. Trump’s part in encouraging a dangerous attack on the Capitol on 6 January left the Democrats in the House little choice but to call for his impeachment. Those impeachment proceedings will distract from Biden’s agenda in his first weeks in office. Although Trump will not be physically present at the handover of power (he is only the fourth President to refuse to attend the inauguration of his successor), Joe Biden is not yet rid of Donald Trump.

As a student of the 1960s, I can’t help but recall the words of John F. Kennedy, 60 years ago to the day: “United, there is little we cannot do in a host of cooperative ventures. Divided, there is little we can do–for we dare not meet a powerful challenge at odds and split asunder.” The line harked back to Lincoln’s “A house divided against itself cannot stand.” Any sensible examination of American history reveals that unity as a country has always been difficult to articulate. But rarely has the nation been quite so divided, and rarely have these divisions seemed quite so irreconcilable. There are two large constituencies in the United States, each in its own echo chamber of talk radio, television, social media. It is difficult to see how they can be brought together.

Perhaps Biden will echo Lincoln’s first inaugural exhortation to friendship between the two sides: “We are not enemies, but friends. We must not be enemies. Though passion may have strained, it must not break our bonds of affection.” But, as with Lincoln, I suspect such words would fall on deaf ears. Perhaps instead Biden will follow Franklin D. Roosevelt’s first inaugural pledge to truth: “This is preeminently the time to speak the truth, the whole truth, frankly and boldly. Nor need we shrink from honestly facing conditions in our country today. This great Nation will endure as it has endured, will revive and will prosper.” But truth has been the great casualty of the past four years. Each side has its own truth.

What we do know is that Biden’s theme will be “American Unity”. Whatever he says at his inauguration, he’s going to need a lot of help to achieve that.

Is America better than this?

On Wednesday morning, I woke up to the news that the state of Georgia had elected the Reverend Dr Raphael Warnock to one of the state’s two Senate seats, in a keenly anticipated run-off election after the November ballot failed to produce winners in either of Georgia’s Senate races. Warnock’s victory was unprecedented: when he is sworn in, not only will he be the first Black Senator in Georgia’s history, he will also be the first Black Senator representing the Democratic Party in any former Confederate state. He and Tim Scott (R-SC) are the only two Black men elected to the Senate from the post-Reconstruction South. Together with Cory Booker (D-NJ), they make up three Black senators out of a total of 100. It’s hard to overstate the significance of Warnock’s election. It’s hard to overstate the historic whiteness of the Senate.

But by Wednesday evening, Warnock’s momentous victory in Georgia was overshadowed by the insurgent attack on the Capitol building by supporters of Donald Trump, the most openly racist President since Wilson, who has teetered on the brink of sedition since he lost to Joe Biden in the November election. One can’t help but wonder whether Warnock’s victory pushed Trump’s mostly white, conspiracy-theory-loving, Confederate-flag-bearing base over the edge.

In the aftermath of the riots, President-elect Joe Biden delivered remarks from Wilmington to reassure Americans that this was an aberration. “America is so much better than what we’ve seen today”, he said.

Is it though?

One of the other notable things about Raphael Warnock is that he preaches at Ebenezer Baptist Church in Atlanta, where Martin Luther King, Jr began his ministry. Biden’s inauguration is scheduled for two days after the federal holiday named after King. When King and his colleagues in the Civil Rights movement marched peacefully against racism, they did not meet with police forces who moved barriers for them. They did not take selfies with local policemen. They were arrested en masse.

On 2 May 1963, in Birmingham Alabama, Commissioner of Public Safety Bull Connor and his police forces jailed 1,200 men, women and children who took part in peaceful protest. Last August, DC police had no difficulty arresting 41 participants in a Black Lives Matter protest march. You might come to the conclusion that the police know perfectly well how to arrest peaceful Black and Brown people in the street, but seem at a loss when it comes to arresting white far-right insurrectionists attacking the seat of government. So far, Capitol Police have arrested 14 people as a result of the riots at the Capitol. Many more were arrested overnight for breach of curfew; in a city that is majority Black, one can only wonder who they were, and marvel at the injustice that the lives of ordinary DC citizens were disrupted by curfew because of the actions of mostly white, radicalized Trump supporters intent on anti-government activity. The double standard is striking.

The mythology that these white racist radicals are somehow ‘un-American’ is one of the most dangerous stories of the post-Reconstruction era. White racism upended any potential for real healing after the Civil War. White racism created the need for the very Compromise of 1877 that Ted Cruz used as a pretext for opposing certification of the electoral college results, before senators had to leave the chamber because of the incursion. Donald Trump has consistently used racist dog whistles to foment divisions that allowed him to win the presidency.  In using the campaign phrase ‘America First’ and styling himself on Andrew Jackson, Donald Trump has deliberately aligned himself with a vision of America that privileges white experience over all others. The rioters yesterday thought nothing of filming themselves illegally entering and damaging the Capitol building, exposing their identities and giving interviews to the mainstream press. Why? It literally did not occur to them that the police would turn on them. And they were right, for the most part.

On 18 January, many Americans will mark the federal holiday commemorating Martin Luther King and the Civil Rights movement. They will remember King’s ‘I have a dream’ speech, and reassure themselves that the arc of the moral universe bends towards justice. But let’s not forget that in Mississippi and Alabama, the holiday commemorates Martin Luther King and Robert E. Lee, the Confederate general who fought to retain slavery. Because commemorating sedition is as American as apple pie. Trump’s attempt to overturn the election is just the sharp end of generations of voter suppression that were not eliminated by the 15th Amendment, nor by the Voting Rights Act. And the attacks on the Capitol are the natural end point of generations of Know-Nothings, Klansmen, Massive Resistance, Tea Partiers and politicians using race-baiting disguised as ‘law and order’ and vague notions of ‘socialism’ to deliberately sabotage normal political discourse.

When Joe Biden is inaugurated as the 46th President two days after MLK day, he may well give a speech articulating his own dream of harmony, a desire to shift away from Trump’s American carnage. But this crisis has been years, generations, in the making. Fuelled by systemic racism, economic inequality, an under-resourced public education system and the intentional sabotage of bipartisan politics (mostly by the Republican Party), the United States is dangerously divided. As Abraham Lincoln warned in 1858, “a house divided against itself cannot stand.”

The election of Raphael Warnock will be interpreted as a beacon of hope, and maybe it is: the tireless work of Stacey Abrams and others in Georgia took almost a decade to get to this point. Kamala Harris will perhaps overtake Dick Cheney as the most powerful Vice President in history. But there’s no getting away from the reality that America is not better. It has never come to terms with its past. It is unlikely that the incoming Congress and President will be able to mend these divisions within a divisive climate encouraged by mainstream and social media alike. The only salve will be a wholesale change of political culture.

From Bhagat Singh Thind to Kamala Harris: an American story

As I write this, Pennsylvania has just flipped. If the trend continues, we are very close to a declaration that Joe Biden and Kamala Harris will be the President elect and Vice-President elect of the US. Whatever about the change at the top of the ticket, Kamala Harris’ election as the first Black woman Vice President is truly historic. I still remember as a child watching Geraldine Ferraro being eviscerated during the 1984 election, mocked during her VP run for not being ‘tough enough’, for being a woman. Somewhere out in the cosmos, Geraldine Ferraro is raising a toast to Kamala Harris. And I have a lump in my throat.

But I’m also thinking this afternoon of Bhagat Singh Thind, an Indian Sikh man whose claim to citizenship through naturalization was rejected by the Supreme Court in 1923. Thind was born near Amritsar in the state of Punjab in India, and moved to the US in 1913 to undertake his studies, eventually earning a PhD at the University of California, Berkeley. Towards the end of World War I, Thind enlisted in the US Army, becoming its first turbaned soldier. Before his honourable discharge in December 1918, Thind became a US citizen through naturalization. However, the Naturalization Act of 1906 restricted naturalized citizenship to “free white persons” and people of African descent. Four days after he received his citizenship in Washington State, the Bureau of Naturalization applied to have it revoked. Thind, an Indian Sikh, was not deemed to be ‘white’.

Thind took his case to the Supreme Court at a time when the Court’s perception of racial hierarchy was shaped by Plessy v. Ferguson (1896), which established the “separate but equal” colour line in law between Black and white Americans, and by a range of anti-immigrant measures: the Chinese Exclusion Act (1882), the California Alien Land Law of 1913, which prohibited citizens ineligible for naturalization from owning land, and the Immigration Act of 1917, alongside pressure from the Asiatic Exclusion League, which sought to curtail immigration from Asia, and especially from India. Thind’s lawyers sought to establish that their client was white enough to claim access to naturalization routes to citizenship. They failed.

When the Supreme Court heard the case, it concluded that people from India could not be naturalized as US citizens. In United States v. Bhagat Singh Thind, 261 U.S. 204 (1923), the English-born Justice George Sutherland authored the unanimous decision declaring that Indian Sikhs were not white, and were therefore ineligible for naturalization under the terms of existing legislation.

Towards the end of his opinion, Sutherland wrote:

“the children of… European parentage quickly merge into the mass of our population and lose the distinctive hallmarks of their European origin,” but “the children born in this country of Hindu parents would retain indefinitely the clear evidence of their ancestry.” Thind presented in his turban, unwilling to compromise his own religion and identity to become invisible in Americana.

Part of the explanation for the Court decision lies in racism, but it was also the case that the courts were suspicious of Indians, like Thind, who articulated political sympathies with anti-imperial movements challenging British and western hegemony on the continent. Thind was not only deemed to be non-white; he was potentially a dangerous political radical.

Thind did eventually gain US citizenship in 1935, although his eligibility at that stage was based on his military service in World War I. The Supreme Court decision in Thind was not overturned until after World War II, when Harry Truman signed the Luce-Celler Act in 1946, finally overturning much of the previous discriminatory legislation and allowing people from India to naturalize as citizens, to own property, and to sponsor family members abroad to come into the US. Quotas were tiny at first (only 100 immigrants per year under the 1946 Act), eventually revised upwards in the 1950s and later.

Some 40 years after Thind’s case, Kamala Harris’ mother Shyamala Gopalan – herself born and raised in India – was awarded her PhD by the University of California at Berkeley, Thind’s alma mater. Kamala Harris has long identified both with her father’s Jamaican and her mother’s Indian heritages — in historical terms, quite complex, since there had been little solidarity between immigrant Indians and African Americans during Thind’s lifetime. Like Barack Obama, Harris’ self-identity is intersectional. Her election is important for a wide constituency: a triumph for Black women, many of whom are responsible for pushing the Biden-Harris ticket past Trump in the key states of Georgia and Pennsylvania; a triumph for women in the Democratic party, where she stands on the shoulders of Shirley Chisholm, Geraldine Ferraro and Hillary Clinton, among many others; and a repudiation of the racist exclusion of immigrant groups in the past, who despite their commitment to the nation were denied access to its citizenship. A new administration will need to grapple with the racist legacies of the Trump era, including the debacle of separated children at the Mexican border. Thind’s story reminds us how the courts and the government have conspired in the past to use concepts of “whiteness” to decide who gets to be American. Kamala Harris, with dual heritage steeped in histories of exclusion, should be in a unique position to challenge this.

The Electoral College

On 6 November 2012, Donald Trump tweeted: “The electoral college is a disaster for a democracy.” In that election, Barack Obama won almost 66 million votes nationwide, carried 26 states and the District of Columbia, and took 332 Electoral College votes. His overall share was 51% to his opponent Mitt Romney’s 47%. This stands in stark contrast to the margins in 2016, when Trump won the Presidency through the Electoral College even though his opponent Hillary Clinton won the popular vote. As we wait for votes to come through in Georgia, Pennsylvania, North Carolina, Nevada and Arizona, Trump is again playing for Electoral College votes – his opponent Joe Biden has already won an unassailable lead in the popular vote, and has in fact garnered more votes in hard numerical terms than any other presidential candidate in history.

So, what is the electoral college, why does it exist, and how does it work?

When Americans vote in a presidential election, they are not actually voting for the President. They are voting on a state-by-state basis for a slate of Electors, appointed in the manner directed by their state legislature (in practice, usually nominated by the political parties), who will then cast their votes in line with the decision of the voters in their state. While the Electors never actually meet as a whole group, this body is called the Electoral College, and its votes are usually cast and certified by December.

There are currently 538 Electors distributed across the states: each state receives one Elector for each of its members of Congress (its Representatives plus its two Senators), and the District of Columbia receives three under the 23rd Amendment. So, when you look at the interactive electoral maps produced in the media showing ‘Electoral College votes’, this tells you how many Electors each state has. This all means that, whatever the voters may think, the Presidential race in the US is decided by indirect voting: the national popular vote doesn’t matter, and the winner is determined on a state-by-state basis according to the number of Electors casting their vote.
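The state-by-state logic above can be sketched in a few lines of Python. The candidates, states and vote counts below are invented purely for illustration, and Maine and Nebraska, which split their Electors by congressional district, are ignored for simplicity.

```python
# Winner-takes-all tally: each state's Electors go to the candidate
# who wins that state's popular vote. All figures are hypothetical.

ELECTORS = {"Pennsylvania": 20, "Georgia": 16, "Arizona": 11}

results = {
    "Pennsylvania": {"A": 3_400_000, "B": 3_000_000},
    "Georgia": {"A": 2_400_000, "B": 2_410_000},
    "Arizona": {"A": 1_600_000, "B": 1_610_000},
}

def electoral_tally(results, electors):
    """Return each candidate's Electoral College total."""
    tally = {}
    for state, votes in results.items():
        winner = max(votes, key=votes.get)  # statewide popular-vote leader
        tally[winner] = tally.get(winner, 0) + electors[state]
    return tally

print(electoral_tally(results, ELECTORS))  # {'A': 20, 'B': 27}
```

Note that candidate B takes 27 of the 47 Electors here despite trailing A by some 380,000 votes overall: narrow wins in two smaller states outweigh a landslide loss in one large one, which is exactly how the popular vote and the Electoral College can diverge.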

The existence and selection of Electors is provided for in Article II, Section 1 of the Constitution. So, to change the Electoral College system is a tricky thing and would require a constitutional amendment. When the constitution was written and ratified in the late 1780s, the Electoral college was devised as a tool to balance out the power of the more populated states: a national winner-takes-all approach would have advantaged the interests of states with bigger urban populations, and for the Founding Fathers, it was important to ensure that some power was held by rural, low-population states, and of course they also had to balance the power of slave states and free states to maintain national unity.

But a lot has changed since the 1780s, and demographic shifts in the 20th and 21st century reveal real weaknesses in the Electoral college system. No president in the 20th century managed to win victory in the Electoral College without also carrying the popular vote. This is true even in the notorious 1960 election where John F. Kennedy was accused of ‘stealing’ the election through Democratic rigging of the Chicago vote. That year, Kennedy won the popular vote by a mere 100,000 or so votes, but he won the Electoral College by 303 to Nixon’s 219 Electoral votes.

But in the 21st century, the story is very different. Famously, in 2000 Al Gore conceded victory to George W. Bush after the Florida count was decided against him. Bush took the Presidency with 271 Electoral College votes, despite winning only 47.9% of the popular vote to Gore’s 48.4%. In 2016, four years after criticizing the Electoral College via tweet, Donald Trump took the Presidency with a large Electoral College majority, but only 46% of the popular vote, to Hillary Clinton’s 48.1%.

Why does this happen? It goes back to the balance of power envisaged by the Founding Fathers. The system preserves the power of rural, less populous states (like Nevada) against the dominant interests of more urban, more populous states (like California and New York). One way to see this is to compare a state’s Electors per resident with the national average: Nevada, with a population of around 3 million and 6 Electoral College votes, has a relative vote weight of roughly 1.2, while Pennsylvania, with a population of over 12.8 million and 20 Electoral votes, comes in at roughly 0.95, and California lower still. The Electoral College means that not all voters’ votes count equally.
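That weighting can be made concrete with a quick calculation: divide a state’s Electors per resident by the national average of Electors per resident. The population figures below are rough 2019 estimates (in millions) used only for illustration, so the exact ratios will shift slightly with the baseline chosen.

```python
# Relative vote weight: a state's Electors per million residents,
# normalized so that the national average is 1.0. Populations are
# rough 2019 estimates in millions (illustrative assumptions).

TOTAL_ELECTORS = 538
US_POPULATION = 328.2  # millions

NATIONAL_AVG = TOTAL_ELECTORS / US_POPULATION  # Electors per million people

def vote_weight(electors, population_millions):
    """How much a vote in this state counts relative to the national average."""
    return (electors / population_millions) / NATIONAL_AVG

for name, electors, pop in [
    ("Nevada", 6, 3.1),
    ("Pennsylvania", 20, 12.8),
    ("California", 55, 39.5),
]:
    print(f"{name}: {vote_weight(electors, pop):.2f}")
```

On these assumptions, a Nevadan’s vote carries roughly 1.18 times the national average weight, a Pennsylvanian’s about 0.95, and a Californian’s about 0.85.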

That would be less divisive if the population were more evenly distributed, and if we could talk about a broad spread of voting values across the US. But the population is becoming less white, more urbanized, and more progressive, and voting outcomes often seem at odds with this. If the Presidential vote were decided on a national winner-takes-all basis, you can be pretty sure that California and the heavily populated, more progressive East Coast states would pick the winner every time. Nobody would care about Nebraska, or South Dakota. The existing system, however flawed, keeps these states in active political participation. As the country becomes more politically polarized, this is important. Political alienation encourages separatist mindsets, and we have seen in the recent past that this poses a real threat to national security. As much as we might complain that the Electoral College damages democracy, it serves a function, and it is unlikely that political powers will seek to change it any time soon.

As I write this, Joe Biden leads Donald Trump by 253 Electoral College votes to 214. He needs 270 to win. If Trump does overtake him to snatch victory, he will do so without a popular vote mandate and, to use his own words, this would be a disaster for democracy. If this does happen, he will continue this century’s trend of Republicans taking the White House while losing the popular vote: only in 2004 has a Republican (G.W. Bush) won the Presidency this century with a popular vote majority. What this underscores is the inability of the Republican Party to appeal sufficiently to the majority feeling in the country. A party that appeals to a narrowing population base will eventually run out of steam: perhaps that is where real change is needed to protect democracy.

The Republican Revolution and the Death of Trust

“So let us begin anew–remembering on both sides that civility is not a sign of weakness, and sincerity is always subject to proof.”

John F. Kennedy’s inauguration speech in January 1961.

This could be a blog post with a very short shelf life. As we wait for results to start coming in from today’s general election in the US, I’m reminded of Kennedy’s inaugural speech in 1961. Kennedy was speaking about Cold War divisions, but I think his words have a peculiar resonance now: an opportunity for change, for healing divisions, for rounding a corner and beginning anew in the current domestic situation.

I think most of us would agree that the most significant casualty of the past four years has been political trust. But the decline of trust did not start in November 2016. From the early 1990s, Newt Gingrich developed a destructive partisanship that led almost inevitably to Trumpworld. He created conspiracy theories, engaged in strategic obstructionism, and sought to use the so-called ‘culture wars’ to destroy bipartisanship and create dysfunction in Washington. Since the start of his career as a Republican activist in the late 1970s, Gingrich had called for Republicans to act “nasty” in what he called a “war for power”. This was the so-called Republican Revolution: destroy trust in the system and divide the electorate along what Karl Rove, Bush’s key strategist, would later call wedge issues – mostly progressive social change like abortion, marriage equality, and broad civil rights agendas.

It is perhaps no surprise, then, that Gingrich has been one of Trump’s big supporters. Trump’s nasty, name-calling, personal-attack strategy is straight from Gingrich’s playbook. Rather than draining the swamp, his golfing, Fox-watching, late-night tweeting, lying obstructionism has done exactly what Gingrich wanted: made people sick of politics, made them mistrust Washington, buy into completely ludicrous conspiracy theories, or opt out completely.

It was Gingrich who turned legislating into a reality show. With C-SPAN cameras installed in the House, Gingrich became a performer in his own political reality show. He repurposed the term ‘communist’ as an insult. The Trump supporters who accuse Harris and Biden of being socialist have, for the most part, no idea what that term actually means; their insult is handed down from Gingrich, through Rove, through Palin, through Trump. Gingrich even wrote a memo about language use: his 1990 “Language: A Key Mechanism of Control” encouraged Republicans to call Democrats “radical”, “traitors”, “corrupt”, “socialist”. He weaponized impeachment – again, an opportunity for reality TV – and he weaponized Supreme Court nominations in new and nefarious ways.

If Trump wins in 2020, it will be another triumph for Gingrich, and for Gingrich’s successors like Mitch McConnell. We’ll spend four more years going around and around in a spiral of lies, obfuscation, maladministration and distraction, while the foundations of democracy are hacked away, live on TV. I am not usually this melodramatic. But Trump didn’t start this – he is a symptom of it. I’m not sure a return to civility and sincerity is entirely possible after four years of Trump, but it certainly won’t be after eight.

The high turnout figures being reported this week give some pause for thought here. Not to indulge too much in counterfactuals, or “what if” history, but I do wonder whether this level of engagement would have been as high if we weren’t dealing with a pandemic. I’m sure the political scientists will be analysing the ways in which the coronavirus influenced political engagement over the past months. They will also be able to gauge how far voters trace their political engagement to racial polarization, reactions to police brutality, and the Black Lives Matter movement. Democrats, and those who lean Democrat, do not trust the government. Melania voted maskless. Do you trust the masked people or the unmasked? Do you trust Fox News or MSNBC? A good friend of mine told me yesterday that in relatively affluent parts of Philadelphia, the drugstores and other shopfronts have been boarded up for the past couple of weeks. There is no trust that civility will return, no matter who wins.

All the norms appear to be gone. Disinformation has eroded all trust.

Today’s New York Times editorial is headlined: “You’re not just voting for President. You’re voting to start over.” “The American experiment has taken a beating, but there’s a chance to renew our democracy”, it tells us. But Republican activists in Texas are still – even now, even after several court cases – trying to throw out 120,000 votes. Voter suppression is real. It’s not clear that democracy can be renewed.

What do Biden and Harris represent? Contrary to the Republican hype, they are about as middle ground as you can get. I like Kamala Harris a lot, but I suspect that if she were a politician in Britain, she could easily have been one of David Cameron’s so-called compassionate conservatives. It’s telling that one of Biden’s last pre-election tv ads was voiced by Bruce Springsteen – a symbol of good old-fashioned, solid, Born in the USA American values. Whereas Biden came into the vice presidency in 2008 on the Obama ticket of Hope and Change, he’s now essentially running on a ticket of God, Can We Rewind the Clock to Civility and Sincerity? Less catchy.

When I was a child, and I would go for a walk with my grandmother, I would start to whine when my feet got sore and I didn’t want to walk any more. She would always tell me the same thing: just a little further. It’s just down the road, and around the corner. Oddly, this is what this Trump cartoon reminds me of. We’re just rounding the corner. Except, like Trump, my grandmother was always lying. Will the US round the corner this week?

Working from home, but not alone

As I write this, my wife is in the kitchen, trying out Skype for Business for the first time. I can hear her comments to her colleague: “Oh yes, I can see that Word document. Yes, PowerPoint is good. Oh no, I can’t see the whiteboard. Download it?” I presume these kinds of exchanges are happening across the country, for those of us who thankfully still have jobs in this new, challenging economy.

The biggest change for many people with non-essential jobs is that we are being asked to work from home. For lots of people, this will be a welcome shift. But for many others, this will be very challenging. If your employer is already difficult to work for, my bet is that working from home won’t make things all that much better. If you have a good employer, you might find that the things that make them good don’t fully transfer when you’re working in your living room surrounded by kids who are going stir crazy without their usual play dates.

A year ago, I made the switch from working in a regular, bricks-and-mortar university to a university that specializes in distance and online education. As many of my academic friends scramble to put their lecture and seminar content online, I am vaguely amused by their discovery of the tools that make the Open University so good at distance education provision. Friends who railed against Skype for Business a month ago are embracing its potential for seminar teaching, others are debating the relative merits of Zoom and Skype, and I’ve even heard whispers of Adobe Connect. But the big challenge is translating materials designed for face-to-face delivery into online-friendly formats. This is hard. They are being asked to do in two weeks what often takes nine months at the OU, with sophisticated IT support teams. I can only imagine how hard I would have found that in my underfunded bricks-and-mortar institution just over a year ago.

When I moved to the OU, my biggest challenge was changing my work practices to accommodate the distance between me and my colleagues. I have colleagues who work from home across the country, and we see each other only a few times a year, at departmental or School meetings. These moments of personal interaction are prized.

I am not a designated home worker, but I routinely work about 2-3 days at home each week, going to my office at Milton Keynes on average twice a week. This routine suits me. It allowed me to move from overpriced suburban London to Birmingham, where my wife works, in April last year. With the Covid-19 crisis, I am very glad that I no longer shuttle up and down the M40 on a weekly basis. The nature of the academic job market is such that couples often live apart for chunks of time during term; this is another good reason for universities to shift teaching online during this crisis, so that families can pull together in one common place.

But working from home has its challenges. I don’t have children, so I will leave it to others to give advice on how to successfully work from home if you have to juggle children into the mix. But I can share some tips I learned over the past year about working from home:

  1. Create a dedicated space for work. In our house, that means I’ve repurposed our dining room into a home office. We had already planned to move house this summer (will there be any houses on the market??) so that I can have a more formal office space. But for the moment, a room with a door, a table that is big enough to work at, and a chair that is comfortable will work. Find a way for your kiddos to understand that this is mommy or daddy’s office, and that they need to knock if they want to come in. And that sometimes they won’t be able to, because you’re in a meeting or busy. And use that dedicated space to create the psychological distinction between home and work: you’ll need that.
  2. Shower and dress properly: OK, so I have conducted interviews while wearing sweatpants, which I would never have done in my previous job. But I absolutely can’t work in my pajamas. Dress like you’re going to work on casual Friday. Wear shoes, even if they’re trainers. I promise: you’ll feel more productive than if you’re wearing slippers.
  3. Start your day at a time that is more or less normal, and end it at a more or less normal time. If you’re working around kids, you might find that you can shift your work hours around a bit to let you spend some time with them that you wouldn’t ordinarily do. But don’t expect to be able to do all your work in the evening when they’re asleep. It just won’t happen.
  4. Begin your day with an easy and enjoyable job. That will kick start you in work mode and give you that sense that home has ended and work has begun.
  5. Keep the heating on. One of the big challenges is that you are now responsible for all of the utilities you use during the day, including the internet. Most people won’t be able to claim compensation for these, because your status as a home worker is temporary (at least for the moment). But you won’t be able to work if you’re cold or generally uncomfortable. So, if you’re the sort of person who turns on the heating system on 1 December and turns it off again on 1 February, you might want to rethink this position. Keep your working space well heated, and as well ventilated as possible. Make sure you have enough light. If this is financially difficult, talk to your employer about mitigating electricity costs, and about help with increasing your internet data capacity. It’s in their interests to help you out.
  6. Do not look at cats on the internet. It’s all very well to get distracted in the office where there are other people around. It’s quite a different thing when you’re at home, and there is nobody to shame you into stopping the autoplay on YouTube.
  7. If you are newly working from home with a spouse or partner (or several…. whatever your situation is), try to provide moral support for each other rather than distraction. Office romances are never a good idea, and that’s as true when your office is your home. No hanky-panky just because the boss isn’t looking.
  8. Keep in touch with your colleagues. One of the things I struggled with over the past year is the lack of casual interaction we take for granted in the workplace. I had to learn to use our messenger apps, Skype for Business, and even email as replacements for those conversations we have at the photocopier, or wherever you get coffee, or en route to the bathroom. It seems obvious: you won’t meet people by accident when you are working from home. You have to engineer those informal chats. Use whatever works: WhatsApp groups for your team, your company messenger apps, Twitter. Think about how many times a day you just wander over to speak to a colleague. Halve that. Now try to engineer short informal interactions about nothing in particular that number of times per day using online tools. Working from home doesn’t have to mean working alone.
  9. Get out of the house for at least a half hour, if you can. Again, I know this can be difficult if you have kids (especially small napping ones). But it’s important. Your morning and evening commute gets you out of the house, and you need to replicate this. Also, if you usually go to the gym or play a team sport, it’s likely that you might not be able to do that as normal for a while. So, get out of the house at least once a day. Take a walk around the block. Go for a jog. Take your kids out somewhere uncrowded for half an hour. Park nearby? Go for a walk. This can be at lunchtime or in the evening. And remember if it’s raining, you can at least have a shower and change your clothes when you get back. Getting out of the house will keep you sane.
  10. Take breaks, but not too many. The temptation to watch tv is overwhelming, especially if you are in the house on your own. Resist it. Take the breaks you usually would do at work. But don’t take more. Once you sit down in your living room to watch that episode of Doctors that you wouldn’t usually see because you are at work, you’re done for. Spoiler alert: daytime tv is terrible. You’re not missing anything. The same goes if you are watching tv on your laptop. Switching on Netflix at 2pm is a Very. Bad. Idea.

Right. Back to work for me. I can hear the wife (I’ve taken to calling her my co-worker) in the kitchen on another call. She’ll be back in our home office again in 10 minutes, and I don’t want her to catch me watching cat videos on YouTube…

Damian Hinds and the distraction of grade inflation panic

There are so many mixed messages coming from government about Higher Education that it seems there is no clear strategy at all, just a handful of reactions to perceived problems. Cambridge’s response to TEF highlights the clear errors in government proposals to ‘measure’ teaching excellence; it is ludicrous to think that some subjects at Cambridge might be awarded TEF ‘silver’ because students are smart enough to boycott the NSS. And the papers over the weekend were full of Damian Hinds, gnashing his teeth over the number of Firsts awarded in universities. More distraction and reaction, rather than strategy. Students may be getting more Firsts than ever before, but this is not necessarily a crisis of ‘grade inflation.’ In fact, it is a crisis of misunderstanding by government about how universities have changed their teaching model – often for the better – in response to the never-ending barrage of bright ideas put forward by a long list of Universities Ministers desperate to get up the political ladder.

When I started university in 1993, I was the first of my family to attend university. I was 17. I had no idea what university was about, really. For each of my classes I received a reading list of around 200 items, alphabetized by author surname; there were no further instructions about what to read for each session. The outline of each module was vague: a four page outline of broadly what subjects would be covered in each week. Often lecturers and professors would stray away from these broad topics. One of my lecturers – a very highly regarded academic in his field – would routinely come into the lecture theatre, sit down, and read verbatim from the folder of notes he had used for the same class for at least 10 years. Individual tutorials were a thing to be feared: students were viewed as a necessary evil, but office hours were certainly an interruption from the ‘real’ work of most academics on campus.

I do not think that my experience differs greatly from that of most of my pre-internet generation. I graduated in 1997 with a first class in both History and French: my recollection is that two Firsts were awarded that year in Arts, out of a cohort of several hundred. I had taken a year off before my final year to teach English in northern France, and the point of university clicked for me that year. What I know for sure is that many of my classmates were smarter than me, and could have achieved first class grades. Why did they not? Some of the reason is that they were not supported to do so. There were no grade descriptors. There were no Personal Tutors who could explain the mystery of university to a naive and lost student. There were three mental health counselors for the whole university, and appointments were seen as a last resort. Exams were at the end of the year, and in three weeks of pressure, students were examined on material that they had learned in all three terms. No wonder first class grades were rare.

In the past 15 years or so of an academic career teaching in large and small universities, elite and non-elite in the UK and Ireland, I have seen a major shift in the way that teaching is undertaken. It is absolutely true that more students are being awarded first class grades for their university work. But this is not necessarily a negative thing, although it is often reported as such in the many reports that come out about grade inflation. In reality, this is the inevitable result of the stellar work in curriculum reform, in student support, and in teaching methods, that has been undertaken in universities over the past 10 years.

I hope students today are much more informed about how university works than I was. They should be. They have lots of pre-entry information about universities. They are offered significant support on writing and time management in the first year, they are given very detailed and prioritized reading lists and module outlines – now all online for accessibility and interaction. They know they are able to ask questions if they don’t understand, and to book individual tutorials to discuss their work. There are dedicated student support teams – academic, mental health, learning supports – on campus to act as a safety net for students who are struggling. Academic colleagues work harder to support students who, 15 years ago, might have failed out in first or second year. Lecturers are encouraged to reflect on their teaching formally and informally, and are given incentives to do so. Many undertake formal teaching qualifications. Pedagogy is becoming more and more important, even in a regulatory environment that still rewards research over teaching innovation. If half my class were failing, for example, I would question the efficacy of my teaching practice.

The bottom line is that political panic over grade inflation is a manufactured crisis. Many more students are getting higher grades, but they have never been so carefully supported to do so. As competition for student numbers steps up, Russell Group universities are catching up on the good practices that have been developed in teaching-focused universities. So, it should not be a surprise to anybody that the number of first class grades has risen. We equip our students with the resources and skills to excel in ways that were reserved only for the exceptionally clued-in when I was a student. Demonizing this as ‘grade inflation’ is deeply unhelpful. Seeking a return to a time when only a small handful of firsts were awarded is retrograde.

We should absolutely seek to uphold academic standards. But we should also not devalue the excellent work many universities do to ensure that their students can do their best work. For some, their best is a 2.2. For others, their best is a First. It is great to see that more students are capable of reaching that standard than ever before. This is something to be celebrated, not penalized.

New article: ‘Home Sweet Home? Housing Activism and Political Commemoration in Sixties Ireland’ in the History Workshop Journal

Lots of important changes last week:

On Thursday 28th February, I officially finished working for St Mary’s University, Twickenham, after 8 and a half mostly happy years as part of a great team of historians. It’s odd to leave an academic position in the middle of a semester; I shall miss my historian colleagues, my other amazing colleagues in the Humanities and Social Sciences, and of course my students who have kept me sharp, engaged and grounded (and entertained!) for many years.

Overnight, I moved from one of the UK’s smallest Higher Education providers to one of its largest. I’m very pleased to start work with the Open University as a Staff Tutor in the History programme, and I’m looking forward to all the new challenges, opportunities and experiences I’ll have in the new position.

Also last week, the History Workshop Journal published an article I wrote about housing activism in Dublin in the 1960s. In particular, the article seeks to place this protest movement within the context of both the political commemorations in Ireland and the wider landscape of global protest associated with the decade. It will be out later in the year in the print edition of HWJ, but you should also be able to read the advance publication here on the journal’s OUP site. 

New Article: “Sex and the radical imagination in the Berkeley Barb and the San Francisco Oracle”

In November, the journal Radical Americas published a special issue on ‘radical periodicals.’ There are all sorts of interesting articles in the issue, from anarchist periodicals of the Depression era through to the ways in which Black Power aesthetics were captured through graphic design in magazines.

[Image: Berkeley Barb cover. Photo credit: Berkeley Barb Archives]

My contribution to the volume looks specifically at two influential newspapers of the American underground press during the 1960s. Using the Berkeley Barb and the San Francisco Oracle, my paper proposes two arguments: first, that the inability of the countercultural press to envisage real alternatives to existing sexuality and sex roles stifled any wider attempt within the countercultural movement to address concerns around gender relations; and second, that this limitation of the ‘radical’ imagination invites us to question the extent to which these papers can be considered radical or countercultural at all. The reinforcement of heterosexism, especially the primacy of the male gaze, gave little space for any radical challenge to gender norms. In short, these radicals weren’t as radical as they might have thought they were!


You can read and download the full text here: