
Working from home, but not alone

As I write this, my wife is in the kitchen, trying out Skype for Business for the first time. I can hear her comments to her colleague: “Oh yes, I can see that Word document. Yes, PowerPoint is good. Oh no, I can’t see the whiteboard. Download it?” I presume these kinds of exchanges are happening across the country, for those of us who thankfully still have jobs in this challenging new economy.


The biggest change for many people with non-essential jobs is that we are being asked to work from home. For lots of people, this will be a welcome shift. But for many others, this will be very challenging. If your employer is already difficult to work for, my bet is that working from home won’t make things all that much better. If you have a good employer, you might find that the things that make them good employers don’t fully transfer when you’re working in your living room surrounded by kids who are going stir crazy without their usual play dates.

A year ago, I made the switch from working in a regular, bricks and mortar university, to a university that specializes in distance and online education. As many of my academic friends scramble to put their lecture and seminar content online, I am vaguely amused by their discovery of the tools that make the Open University so good at distance education provision. Friends who railed against Skype for Business a month ago are embracing its potential for seminar teaching, others are debating the relative merits of Zoom and Skype, and I’ve even heard whispers of Adobe Connect. But the big challenge is translating materials designed for face to face delivery into online-friendly formats. This is hard. They are being asked to do in 2 weeks what it often takes 9 months to do in the OU, with sophisticated IT support teams. I can only imagine how hard I would have found that in my underfunded bricks and mortar institution just over a year ago.

When I moved to the OU, my biggest challenge was changing my work practices in order to accommodate the distance between me and my colleagues. I have colleagues who work from home across the country, and we see each other only a few times a year, at departmental or School meetings. These moments of personal interaction are prized.

I am not a designated home worker, but I routinely work about 2-3 days at home each week, going to my office at Milton Keynes on average twice a week. This routine suits me. It allowed me to move from overpriced suburban London to Birmingham, where my wife works, in April last year. With the Covid-19 crisis, I am very glad that I no longer shuttle up and down the M40 on a weekly basis. The nature of the academic job market is such that couples often live apart for chunks of time during term; this is another good reason for universities to shift teaching online during this crisis, so that families can pull together in one common place.

But working from home has its challenges. I don’t have children, so I will leave it to others to give advice on how to successfully work from home if you have to juggle children into the mix. But I can share some tips I learned over the past year about working from home:

  1. Create a dedicated space for work. In our house, that means I’ve repurposed our dining room into a home office. We already planned to move house this summer (will there be any houses on the market??) so that I can have a more formal office space. But for the moment, a room with a door, a table that is big enough to work at, and a chair that is comfortable will work. Find a way for your kiddos to understand that this is mommy or daddy’s office, and that they need to knock if they want to come in. And that sometimes they won’t be able to, because you’re in a meeting or busy. And use that dedicated space to create the psychological distinction between home and work: you’ll need that.
  2. Shower and dress properly: Ok, so I have conducted interviews while wearing sweatpants, which I would never have done in my previous job. But I absolutely can’t work in my pajamas. Dress like you’re going to work on casual Friday. Wear shoes, even if they’re trainers. I promise: you’ll feel more productive than if you’re wearing slippers.
  3. Start your day at a time that is more or less normal, and end it at a more or less normal time. If you’re working around kids, you might find that you can shift your work hours around a bit to let you spend some time with them that you wouldn’t ordinarily do. But don’t expect to be able to do all your work in the evening when they’re asleep. It just won’t happen.
  4. Begin your day with an easy and enjoyable job. That will kick start you in work mode and give you that sense that home has ended and work has begun.
  5. Keep the heating on. One of the big challenges is that you are now responsible for all of the utilities you use during the day, including the internet. Most people won’t be able to claim compensation for these, because your status as home worker is temporary (at least for the moment). But you won’t be able to work if you’re cold or generally uncomfortable. So, if you’re the sort of person who turns on the heating system on 1 December and turns it off again on 1 February, you might want to rethink this position. Keep your working space well heated, and as well ventilated as possible. Make sure you have enough light. If this is financially difficult, talk to your employer about mitigating electricity costs, and about help with increasing your internet data capacity. It’s in their interests to help you out.
  6. Do not look at cats on the internet. It’s all very well to get distracted in the office where there are other people around. It’s quite a different thing when you’re at home, and there is nobody to shame you into stopping the autoplay on YouTube.
  7. If you are newly working from home with a spouse or partner (or several…. whatever your situation is), try to provide moral support for each other rather than distraction. Office romances are never a good idea, and that’s as true when your office is your home. No hanky-panky just because the boss isn’t looking.
  8. Keep in touch with your colleagues. One of the things I struggled with over the past year is the lack of casual interaction we take for granted in the workplace. I had to learn to use our messenger apps, Skype for Business, and even email as replacements for those conversations we have at the photocopier, or wherever you get coffee, or en route to the bathroom. It seems obvious: you won’t meet people by accident when you are working from home. You have to engineer those informal chats. Use whatever works: WhatsApp groups for your team, your company messenger apps, Twitter. Think about how many times a day you just wander over to speak to a colleague. Halve that. Now try to engineer short informal interactions about nothing in particular that number of times per day using online tools. Working from home doesn’t have to mean working alone.
  9. Get out of the house for at least a half hour, if you can. Again, I know this can be difficult if you have kids (especially small napping ones). But it’s important. Your morning and evening commute gets you out of the house, and you need to replicate this. Also, if you usually go to the gym or play a team sport, it’s likely that you might not be able to do that as normal for a while. So, get out of the house at least once a day. Take a walk around the block. Go for a jog. Take your kids out somewhere uncrowded for half an hour. Park nearby? Go for a walk. This can be at lunchtime or in the evening. And remember if it’s raining, you can at least have a shower and change your clothes when you get back. Getting out of the house will keep you sane.
  10. Take breaks, but not too many. The temptation to watch TV is overwhelming, especially if you are in the house on your own. Resist it. Take the breaks you usually would do at work. But don’t take more. Once you sit down in your living room to watch that episode of Doctors that you wouldn’t usually see because you are at work, you’re done for. Spoiler alert: daytime TV is terrible. You’re not missing anything. The same goes if you are watching TV on your laptop. Switching on Netflix at 2pm is a Very. Bad. Idea.

Right. Back to work for me. I can hear the wife (I’ve taken to calling her my co-worker) in the kitchen on another call. She’ll be back in our home office again in 10 minutes, and I don’t want her to catch me watching cat videos on YouTube…


Damian Hinds and the distraction of grade inflation panic

There are so many mixed messages coming from government about Higher Education that it seems there is no clear strategy at all, just a handful of reactions to perceived problems. Cambridge’s response to TEF highlights the clear errors in government proposals to ‘measure’ teaching excellence; it is ludicrous to think that some subjects at Cambridge might be awarded TEF ‘silver’ because students are smart enough to boycott the NSS. And the papers over the weekend were full of Damian Hinds, gnashing his teeth over the number of firsts awarded in Universities. More distraction and reaction, rather than strategy. Students may be getting more Firsts than ever before, but this is not necessarily a crisis of ‘grade inflation.’ In fact, it is a crisis of misunderstanding by government about how universities have changed their teaching model – often for the better – in response to the never-ending barrage of bright ideas put forward by a long list of Universities Ministers desperate to get up the political ladder.

When I started university in 1993, I was the first of my family to do so. I was 17. I had no idea what university was about, really. For each of my classes I received a reading list of around 200 items, alphabetized by author surname; there were no further instructions about what to read for each session. The outline of each module was vague: a four-page outline of broadly what subjects would be covered in each week. Often lecturers and professors would stray away from these broad topics. One of my lecturers – a very highly regarded academic in his field – would routinely come into the lecture theatre, sit down, and read verbatim from the folder of notes he had used for the same class for at least 10 years. Individual tutorials were a thing to be feared: students were viewed as a necessary evil, and office hours were certainly an interruption to the ‘real’ work of most academics on campus.

I do not think that my experience differs greatly from that of most of my pre-internet generation. I graduated in 1997 with a first class degree in both History and French: my recollection is that two Firsts were awarded that year in Arts, out of a cohort of several hundred. I had taken a year off before my final year to teach English in northern France, and the point of university clicked for me that year. What I know for sure is that many of my classmates were smarter than me, and could have achieved first class grades. Why did they not? Part of the reason is that they were not supported to do so. There were no grade descriptors. There were no Personal Tutors who could explain the mystery of university to a naive and lost student. There were three mental health counselors for the whole university, and appointments were seen as a last resort. Exams were at the end of the year, and in three weeks of pressure, students were examined on material that they had learned in all three terms. No wonder first class grades were rare.

In the past 15 years or so of an academic career teaching in large and small universities, elite and non-elite, in the UK and Ireland, I have seen a major shift in the way that teaching is undertaken. It is absolutely true that more students are being awarded first class grades for their university work. But this is not necessarily a negative thing, although it is often framed that way in the many reports about grade inflation. In reality, this is the inevitable result of the stellar work in curriculum reform, in student support, and in teaching methods that has been undertaken in universities over the past 10 years.

I hope students today are much more informed about how university works than I was. They should be. They have lots of pre-entry information about universities. They are offered significant support on writing and time management in the first year, and they are given very detailed and prioritized reading lists and module outlines – now all online for accessibility and interaction. They know they are able to ask questions if they don’t understand, and to book individual tutorials to discuss their work. There are dedicated student support teams – academic, mental health, learning supports – on campus to act as a safety net for students who are struggling. Academic colleagues work harder to support students who, 15 years ago, might have failed out in first or second year. Lecturers are encouraged to reflect on their teaching formally and informally, and are given incentives to do so. Many undertake formal teaching qualifications. Pedagogy is becoming more and more important, even in a regulatory environment that still rewards research over teaching innovation. If half my class were failing, for example, I would question the efficacy of my teaching practice.

The bottom line is that political panic over grade inflation is a manufactured crisis. Many more students are getting higher grades, but they have never been so carefully supported to do so. As competition for student numbers steps up, Russell Group universities are catching up on the good practices that have been developed in teaching-focused universities. So, it should not be a surprise to anybody that the number of first class grades has risen. We equip our students with the resources and skills to excel in ways that were reserved only for the exceptionally clued-in when I was a student. Demonizing this as ‘grade inflation’ is deeply unhelpful. Seeking a return to a time when only a small handful of firsts were awarded is retrograde.

We should absolutely seek to uphold academic standards. But we should also not devalue the excellent work many universities do to ensure that their students can do their best work. For some, their best is a 2.2. For others, their best is a First. It is great to see that more students are capable of reaching that standard than ever before. This is something to be celebrated, not penalized.

New article: ‘Home Sweet Home? Housing Activism and Political Commemoration in Sixties Ireland,’ in the History Workshop Journal

Lots of important changes last week:

On Thursday 28th February, I officially finished working for St Mary’s University, Twickenham, after 8 and a half mostly happy years as part of a great team of historians. It’s odd to leave an academic position in the middle of a semester; I shall miss my historian colleagues, my other amazing colleagues in the Humanities and Social Sciences, and of course my students who have kept me sharp, engaged and grounded (and entertained!) for many years.

Overnight, I moved from one of the UK’s smallest Higher Education providers to one of its largest. I’m very pleased to start work with the Open University as a Staff Tutor in the History programme, and I’m looking forward to all the new challenges, opportunities and experiences I’ll have in the new position.

Also last week, the History Workshop Journal published an article I wrote about housing activism in Dublin in the 1960s. In particular, the article seeks to place this protest movement within the context of both the political commemorations in Ireland and the wider landscape of global protest associated with the decade. It will be out later in the year in the print edition of HWJ, but you should also be able to read the advance publication here on the journal’s OUP site. 

New Article: “Sex and the radical imagination in the Berkeley Barb and the San Francisco Oracle”

In November, the journal Radical Americas published a special issue on ‘radical periodicals.’ There are all sorts of interesting articles in the issue, from anarchist periodicals of the Depression era through to the ways in which Black Power aesthetics were captured through graphic design in magazines.

Berkeley Barb cover. Photo credit: Berkeley Barb Archives, http://www.berkeleybarb.net/gallery.ht


My contribution to the volume looks specifically at two influential newspapers of the American underground press during the 1960s. Using the Berkeley Barb and the San Francisco Oracle, my paper proposes two arguments: first, that the inability of the countercultural press to envisage real alternatives to existing sexuality and sex roles stifled any wider attempt within the countercultural movement to address concerns around gender relations; and second, that the limitation of the ‘radical’ imagination invites us to question the extent to which these papers can be considered radical or countercultural. The reinforcement of heterosexism, especially the primacy of the male gaze, gave little space for any radical challenge to gender norms. In short, these radicals weren’t as radical as they might have thought they were!


You can read and download the full text here: https://www.scienceopen.com/document/read?vid=bb3dcdc4-981b-4231-af9e-9ffe6aff80f2

The value of higher education

Two things happened this morning. At 7am, I started work writing up a new section of my new first-year module on Transatlantic Slaveries. At 8am I read Laura Kennedy’s column in today’s Irish Times. The headline caught my eye: “I have my PhD, but what is the value of a university education?” Full disclosure: I have a PhD, and I value university education. So, I expected a sharp analysis of the challenges facing the higher education landscape, and perhaps some discussion of the ways in which access to education could be improved to make it more attractive to groups of people who have traditionally found themselves excluded from (what we crassly call in the UK) high-tariff universities.


This was not what I got. So I did what every good academic does when she reads something she disagrees with on the internet: I had a bit of a rant on Twitter.

Mostly it was the timing. Across the UK and Ireland over the next few weeks, universities are welcoming hundreds of thousands of students, all eager to learn, meet, think, drink, experience. Today, we welcomed a new cohort of first-year students in the Humanities at my university – a small university in west London where our student profile is extremely diverse, and where we offer a supportive and encouraging environment to students who often feel they would get lost in a larger institution. I have taught elsewhere, in larger, more elite institutions in both Ireland and the UK. I can honestly say some of the smartest students I have taught in 14 years have been here at St Mary’s University.

The core proposition of Kennedy’s article is that university education gives poor value for money in an era when people can research and learn through their own endeavours. That people who want to know about, say, history, can do so by reading the internet. A sort of home-school higher ed. In theory, Kennedy is correct. In practice, she’s missing the point and real value of university education.

University education encourages people to think differently, beyond what they thought was possible or acceptable. Academic staff, and the wonderful people who support tertiary learning, guide and push students towards knowledge, and towards reflection on what and how they learn.

The line in the article that made me genuinely sad was this: “We joylessly and fruitlessly engage in the accumulation of education we don’t value or use.” If the writer’s experience of higher education was joyless, then I feel very sorry for her. It is true that some people do find education joyless. I’m sorry for them too, and I suspect that they chose the wrong course. This is a flaw in the system, and one that I would change given half a chance: due to funding priorities, it can be very difficult for somebody to get out of a course that is not for them.

But my sense from talking to students over many years, from all kinds of backgrounds and of all abilities, is that the vast majority find their degree an enjoyable, difficult, rewarding experience.

Can we “do” higher education ourselves? Of course. But most people don’t, and don’t want to. If I want to learn how to play tennis, then I can watch YouTube clips to learn topspin technique, and I can bang a ball against the wall for hours to practice. But a true understanding of the game can only be gained through playing with other people: this is where you learn tactics, quick responses, new strokes. And hiring a coach? She’ll push you well beyond what you think you’re capable of.

Is higher education too expensive for students? Yes it is, especially in the UK. But it is expensive at the point of delivery because since the turn of the century, governments have decided that the purpose of education is to serve industry. This is one way to see education, but not the only one. Higher education is not synonymous with vocational training, nor should it be. And we accept without question that GCSEs and A-Levels (or for that matter the Junior and Leaving Certificates) are not vocational qualifications. Why should it be assumed that university education should be direct training for industry, easily measured in usefulness? Is it because students pay so much? Well, if so, then the value for money narrative is one that has been constructed by government policy. It is not a measure of inherent value.

Indeed, this attitude has influenced parents and students to think of higher education as any other commodity, something that can be easily evaluated by a “value for money” calculation. Utilitarian approaches undermine the whole experience of higher education. By this measure, nobody should ever study drama, or English literature, or Classics. There are few requirements these days for expert knowledge of Greek mythology.

The real value of higher education is much more difficult to quantify. It lies in the quest, the divergent pathways taken, the development of self-knowledge, the skills to acquire further knowledge, a critical understanding of working with other people, assessing new ideas and challenging them, having the time and space to think and engage critically with information, knowledge and people, and developing a love of something, a passion for something, even if only for a few years. It is the joy of having time to push your boundaries, and to be guided in doing so. That has value, and it is about time that society and government policy recognized this inherent value.

“Where do we go from here?”: #MLK50

In August 2011, 48 years after the March on Washington, the African American poet Maya Angelou dedicated a poem to Martin Luther King, Jr to mark the unveiling in Washington D.C. of the new memorial to his legacy.

The opening lines of ‘Abundant Hope’ remembered King as a prophet, a saviour:

The great soul
Flew from the Creator
Bearing manna of hope
For his country
Starving severely from an absence of compassion.


Angelou was born on 4 April 1928, and today marks what would have been her 90th birthday. Between 1968 and her death in 2014, she shared her birthday with the anniversary of Martin Luther King’s assassination. The death of King devastated her. She had first met him in 1960 and had collaborated with him and the Southern Christian Leadership Conference on various civil rights projects. After his death, she refused to celebrate her birthday, and instead marked the anniversary by sending flowers to Coretta Scott King, with whom she remained close until Coretta’s death in 2006.

Angelou’s devastation at King’s murder was felt across the United States. His assassination at the Lorraine Motel on the evening of 4 April 1968 was followed by outrage from civil rights leaders, and riots in 110 cities across the country. In Washington D.C., the site of the massive march in 1963, over 20,000 people took to the streets in anger and frustration, looting and burning large parts of the inner city. Stokely Carmichael, the former chairman of the Student Non-violent Coordinating Committee who later espoused black power militancy, warned reporters: “When White America killed Dr. King, it declared war on us. We have to retaliate for the execution of Dr. King.”

King had arrived in Memphis, Tennessee on 3 April, and had delivered a searing speech at the Mason Temple in the Washington Heights district of the city. He was in Memphis to support striking sanitation workers, who had been protesting poor conditions and wages for almost two months. This was indicative of King’s radical turn: exactly a year before his death he had made a controversial public statement against the war in Vietnam, and by 1968 he was preoccupied by the economic disadvantage that African Americans continued to face despite the Civil Rights legislation passed in 1964 and 1965. He had spent two years pressurizing the government of Lyndon B. Johnson to pass a fair housing bill. To the eyes of conservatives and racists, all of this seemed to confirm that King was a communist.

Indeed, in the years prior to his assassination, King was deeply unpopular. In 1966, Gallup measured his popularity amongst the American public: 32% positive versus 63% negative. In the midst of the Selma controversy in March 1965, King appeared on the cover of Time magazine: the depiction was of an angry black man, in stark contrast to the statesmanlike portrayal a year before when Time named him Man of the Year.

Years later, during the debates about whether to institute a federal holiday in honour of King, Senator Jesse Helms (R.-N.C.) attempted to block the passage of the bill by accusing King of “action-oriented marxism”, of espousing the “official policy of communism.” And there was a kernel of truth to Helms’ accusations: King was a radical. He was attempting to upend the systemic inequality within American society. The danger of remembering King as a prophet, a saviour, is that we risk de-radicalizing his memory. This is made all the easier by a national holiday that remembers King’s contribution as ‘service.’ The most recent misuse of the memory of King was during the Super Bowl interval in February, when Ram used King’s voice in their advert to sell trucks. “In the spirit of Dr. Martin Luther King, Jr.,” the ad people tell us, “Ram truck owners also believe in a life of serving others. They serve because they’re driven by a higher calling. They serve because they feel a shared responsibility and commitment to their family and community.” Conservatives like Helms have got their way: the memory of King is not of the radical preacher who sought to undermine the “giant triplets of racism, extreme materialism, and militarism.” Instead, his memory is preserved in the aspic of 1963, when he had a dream.

The monument to King’s memory in Washington D.C. does very little to counter that saccharine version of his legacy. In Lei Yixin’s design, a great white King looms up over the Tidal Basin, overlooking a series of decontextualized quotes that emphasize humanity, justice, peace, loyalty and love. This forms a comfortable narrative that allows King to be accepted as an establishment figure, completely consistent with the mythology that America is like the arc of the moral universe, and will bend inevitably towards justice. This is the version of King that allowed Bill O’Reilly to claim in 2016 (with goodness knows what authority) that King would not have marched with the Black Lives Matter protesters. It is a simplistic version of a complex man, who spent his final days criticizing the government for their war in Vietnam, campaigning for fair housing provision and for labour rights for sanitation workers. King understood that change would not be given easily: it had to be forced.

In his address to the SCLC convention in August 1967, King tried to answer the question, “where do we go from here?” His answer questioned the very foundations of the American project:

“…as we talk about ‘Where do we go from here?’ that we must honestly face the fact that the movement must address itself to the question of restructuring the whole of American society. There are forty million poor people here, and one day we must ask the question, ‘Why are there forty million poor people in America?’ And when you begin to ask that question, you are raising a question about the economic system, about a broader distribution of wealth. When you ask that question, you begin to question the capitalistic economy. And I’m simply saying that more and more, we’ve got to begin to ask questions about the whole society.”

Maya Angelou’s recollection of King in her 2011 poem is a rose-tinted one. She deliberately forgets the animosity shown towards King when she writes:

All creeds and cultures
Were comfortable in
His giant embrace
And all just causes
Were his to support and extol
Through sermons and allocutions
With praise songs and orations

In his remarks at the delayed dedication of the King memorial in October 2011, Barack Obama got much closer to describing a more real King:

“We forget now, but during his life, Dr. King wasn’t always considered a unifying figure. Even after rising to prominence, even after winning the Nobel Peace Prize, Dr. King was vilified by many, denounced as a rabble rouser and an agitator, a communist and a radical. He was even attacked by his own people, by those who felt he was going too fast or those who felt he was going too slow; by those who felt he shouldn’t meddle in issues like the Vietnam War or the rights of union workers. We know from his own testimony the doubts and the pain this caused him, and that the controversy that would swirl around his actions would last until the fateful day he died.”

He continued: “Change has never been simple, or without controversy. Change depends on persistence. Change requires determination.”

So, today we remember Maya Angelou, rising with an abundance of hope, and Martin Luther King, Jr, whose vision of change for African Americans still has a way to go.


#Takeaknee: lessons for Donald Trump in heritage and respect

“We have a great country. We have great people representing our country, especially our soldiers, our first responders, and they should be treated with respect. And when you get on your knee, and you don’t respect the American flag or the anthem, that’s not being treated with respect. This has nothing to do with race. I have never said anything about race. This has nothing to do with race, or anything else. This has to do with respect for our country and respect for our flag.” Donald Trump, in an interview with CNN, 24 Sept 2017.

In the last 48 hours, the President of the United States has been picking a fight with NFL players. Specifically, he has criticized the practice that some players have adopted of kneeling during the playing of the national anthem at the start of a game. In August 2016, San Francisco 49ers quarterback Colin Kaepernick began this #takeaknee protest in order to bring attention to the Black Lives Matter movement, and to protest killings of black men by white police officers. According to Kaepernick, he was “not going to stand up to show pride in a flag for a country that oppresses black people and people of color.” Of course, when Mr Trump says “this is not about race”, he is deliberately ignoring that this is all about race. In doing so, he adopts an ahistorical stance of colour blindness in order to disempower black Americans who protest against ongoing discrimination through economic, political and social systems that are structurally racist.

I’m struck by the timing of this recent outburst by this president who staunchly refuses to engage in any meaningful way with the history of the nation. Sixty years ago, in September 1957, Mr Trump’s predecessor President Dwight D. Eisenhower struggled with an escalating situation in Little Rock, Arkansas. At the start of the school year, on September 3rd, nine students arrived at Central High School in Little Rock in order to begin the process of desegregation of the school in line with the Supreme Court decision in Brown v. Board of Education three years earlier. Met by throngs of protesters, and opposed by Arkansas governor Orval Faubus, the black students were turned away from the school amid fears that widespread riots would break out. When Eisenhower eventually sent the 101st Airborne Division (without its black soldiers) to Arkansas, he explained the move as one designed to enforce the orders of the court, not as something that should be interpreted as indicating his position regarding integration or segregation. The crisis was never fully resolved: the riots petered out, but the students were verbally and physically abused during their time at Central High. Eisenhower’s official explanation for sending in the troops was that he wished to avoid anarchy; but everybody (including him) knew this was about race. Eisenhower had the good sense to never utter those words: “it’s not about race.” But then Mr Trump is no Eisenhower.

Ten years before Little Rock, in April 1947, Jackie Robinson became the first African American to play for a major-league baseball team since the 1884 season. The grandson of slaves, Robinson was probably the most talented American athlete of all time. At UCLA he excelled at track, baseball, football and basketball. He served with honour in the military during World War II. Years before Rosa Parks, he refused to sit at the back of a military bus, and was court-martialled (and acquitted) for his opposition to an authority enforcing rules that were patently unfair. Despite his success, he was taunted and racially abused throughout his career in major league baseball. Signed to the Brooklyn Dodgers in 1947, he faced racial taunts on the baseball diamond and in the press. In 1949 he played with two other black players on the All-Star team, the first time an All-Star team was desegregated. Those players were also given lockers in a secluded part of the locker room; they showered separately from their white teammates. It was clear to Robinson that desegregation did not mean equality. Writing not long before his death in 1972, Robinson revealed his attitude towards the national anthem and the flag:

“As I write this twenty years later, I cannot stand and sing the anthem. I cannot salute the flag; I know that I am a black man in a white world. In 1972, in 1947, at my birth in 1919, I know that I never had it made.” – Jackie Robinson, I Never Had It Made: An Autobiography (1972)

Donald Trump sees the flag as universal: in his “colour blindness” he does not recognize the historical baggage of that flag and of that anthem. Of course, he doesn’t acknowledge problems with Confederate symbols, so how can we expect him to understand the complex historical relationship between the descendants of slaves and the descendants of slaveholders? He should have paid more attention to Frederick Douglass, whom he lauded last February as having “done an amazing job and is being recognized more and more.” In 1852, Douglass’ address to mark the fourth of July reminded his audience of the lack of universality of American national symbolism. “What, to the American slave, is your 4th of July?” he asked. The answer, he suggested, was “a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim. To him, your celebration is a sham; your boasted liberty, an unholy license… your shouts of liberty and equality, hollow mockery…” To think that key national celebrations — the fourth of July, or Thanksgiving — hold an inherent universality is to misunderstand all of American history.

There are deep divisions in the American experience. The symbols of nationhood have long been used to cover those differences, to convince Americans that theirs is one progressive story. But this is an ahistorical notion, promoted now by an ahistorical president. Donald Trump chose a rally at Huntsville, Alabama as the opportunity to attack NFL players following in Kaepernick’s wake. Trump used his platform to denounce this protest as a “total disrespect of our heritage, a total disrespect of everything that we stand for.” Alabama is still a highly segregated state. Huntsville is still a highly segregated city. A more historically sensitive president might have chosen to steer clear of criticizing non-violent protest in a state where the civil rights movement escalated in December 1955, through the leadership of Ralph Abernathy, Rosa Parks and Martin Luther King. That is the heritage of Alabama: racial segregation, and non-violent protest against it.

But Donald Trump is a modern know-nothing: an ultra-nationalist without any understanding of the nuances of nationalism, embracing a “colour blindness” which allows him to deny pervasive problems of racism, and an ahistorical proponent of a “heritage” that is neither shared nor universal. You might not agree with the NFL players’ protest, but the #takeaknee protest is precisely in line with a strand of American heritage. Respect that.

The ‘woman issue’ in the last US election

On Monday I spent a lovely evening in the company of an old friend at the University of Lincoln, who had invited me to participate in a Historical Association roundtable on the 2016 US election. He originally asked me to offer perspectives on race and gender, but in the end I only talked about gender. There’s a lot to say about gender and the US election. In fact, although a lot was said about women and gender in the course of the campaigning, the reality is that for an election featuring the first female major-party presidential candidate, the election cycle didn’t have much impact on the number of women in national politics. So why did this conversation about women and politics, and more broadly women in the US, not translate into real change in the political landscape?

Across US political life, women make up about 20-25% of the overall representation in elected offices. At federal level, this is a little lower: after the 2016 election just under 20% of the House of Representatives and 21% of the Senate are women. For all the horse-trading, the total number of women in the House dropped from 84 to 83 in the new Congress. Organizations like Emily’s List have been pushing for more gender diverse selection in the Democratic primaries. In the aftermath of the election, their workshops and training sessions experienced unprecedented attendance by women who were prepared to run in the next cycle.

There were some notable wins for women in November, particularly in the Senate. The Center for American Women and Politics (CAWP) is a great resource for tracking the election of women at federal and state levels. Kamala Harris (D-CA) became only the second African American woman to sit in the Senate (and the third woman of colour); Tammy Duckworth (D-IL) and Catherine Cortez Masto (D-NV) also took seats in the Senate, to bring the total number of WoC ever to sit in the Senate to five. Five, in the history of the Senate. Only 50 women have taken seats in the Senate since the franchise was extended to women in 1920.

This is significant because we often think of the Senate as the cradle of the presidency. The chamber has variously been described as “the mother of Presidents,” “the Presidential incubator,” and “the Presidential nursery.” This language of mothering stands in stark contrast to the historically male nature of the Senate. In historical terms, though, very few Senators have ever transitioned directly from the chamber to the White House: the only three are Warren G. Harding (1920), John F. Kennedy (1960) and Barack Obama (2008). Many Senators have come to the Presidency via a more indirect route, for example via the Vice Presidency. Hillary Clinton’s stint as Secretary of State should, therefore, have made her a much stronger candidate than she was in 2008. It did, but in the end it was not enough to push her over the line.


Looking at the coverage of Clinton’s loss in the aftermath of November, the general consensus is that she lost because of the interplay of four factors: she was perceived as untrustworthy, perceived as too ambitious, seen as part of an oligarchy/dynasty, and – perhaps most importantly – perceived to be “unlikeable.” Her campaign lacked the ideological buzz generated by Obama and Sanders, but she was fighting a campaign on several fronts: her ability to use her extensive track record was hampered by her need to also run against her own past. And as the Atlantic put it before the election, she was also fighting the fear of a female president.

All of this was accentuated via the rally chants of “Lock Her Up” (often led by the now-fired Michael Flynn) and Trump’s use of the “Crooked Hillary” label that stuck. Of course, the idea that women are inherently untrustworthy is not at all new. It was most clearly expressed in popular culture during the Second World War, as the War Department and other government agencies produced propaganda to warn against sharing military information with women who might gossip and inadvertently pass it on to the enemy. The received wisdom is that women are gossips, and so they are untrustworthy and dangerous. This was particularly accentuated in an era when women were pushing against traditional gender boundaries, also a feature of Clinton’s campaign.

The focus on Clinton’s e-mails, and use of an unsecured e-mail server, confirmed this sense that women were untrustworthy. The “gossipy” nature of many of the leaked/released e-mails further indicated that Clinton’s self-presentation as a qualified, hard-nosed politician simply masked her gossipy, vain womanliness.

The perception that she was untrustworthy was also a legacy of her husband’s sex scandals. In a 2008 article in the Yale Law Journal, Julia Simon-Kerr explores the ways that the legal system has generally hinged a woman’s credibility or honour on her moral integrity and sexual virtue. Simon-Kerr concludes that “while our cultural definition of sexual virtue has shifted drastically since the 18th century, and even since the initial enactment of the rape statutes [in the 1970s], the idea that a woman’s sexual virtue bears upon her credibility is still present today.” For one group of American voters, Clinton’s sexual virtue was compromised by her decision to support her husband through scandal, and her excoriation of her husband’s accusers (especially Lewinsky); for another group, her virtue was undermined by her inability to be sufficiently ‘woman’ in order to keep her husband from straying.

It is not a coincidence, then, that the “lying Hillary” myth emerged at the same time as the sex-scandals of the mid 1990s. Her own morality, and so her credibility, was undermined by her husband’s immorality. When William Safire’s “Blizzard of Lies” column appeared in the New York Times in January 1996, he cast Hillary as an over-ambitious woman who was pushing a suspect political agenda alongside her husband’s infidelities. Often identified as the root of the idea that Hillary is a congenital liar, the Safire column cannot be divorced from a narrative of morality/credibility.

In 2016 this took on a surreal significance, given the obvious capacity for pathological lying displayed by her Republican opponent, and fact checking of Hillary’s statements that showed she was really quite honest, for a politician. But where Trump’s lying provoked disbelief, and often cartoonish or comic effect, for Clinton the e-mail scandal simply confirmed a 20-year old mythology. A false equivalence was drawn, for example, between her e-mails (disclosed by the FBI) and early evidence of his ties to Russia (which the FBI chose not to disclose in advance of the election).

This false equivalence is part of a wider phenomenon that holds women in politics to a different set of standards. The Barbara Lee Family Foundation’s “Keys to Elected Office: Essential Guide for Women” indicates that women still have to be more qualified than their male counterparts, but must also establish themselves within a language of family. Every time Clinton began a sentence with “as a grandmother,” she was playing by the rules.

The results of Pew research analyzing the likely behaviour of voters (pre-election) and attitudes (through exit polls) reveal major differences in the ways that men and women saw the historical significance of the election, and in the “likeability” factor. The research also indicates that Clinton did not get the expected “woman bounce.” Women supported Clinton over Trump by 54% to 42%. But this is approximately the same as the Democratic advantage among women in 2012 (55% Obama vs. 44% Romney) and 2008 (56% Obama vs. 43% McCain).

There is not, and never has been, such a thing as a women’s voting bloc. Delays in extending the suffrage to women can be partly explained by this fear that women would all vote in the same way. For example, the dominance of women in the Temperance movement suggested to the alcohol industry that a vote for women would be a vote for prohibition. Once the Eighteenth Amendment was ratified in 1919, alcohol interests no longer funnelled money into blocking the women’s vote. The fear of a monolithic vote amongst women was never realized. Gender was not the defining factor explaining women’s vote for Clinton, although it may have been a significant factor in why men did not vote for her.


Too ‘wretched’ for a visa?: a brief history of immigration policy in the US

In 1883, Emma Lazarus published a sonnet called The New Colossus. Influenced by a renewed interest in her own Jewish ancestry, and spurred on by the influx of thousands of Jews fleeing Russia after the anti-semitic crackdowns of 1881, Lazarus articulated a vision of an America that was a safe haven, a place of welcome where those who had been oppressed could find freedom. Twenty years later, the words of Lazarus’ poem were affixed to the base of the Statue of Liberty, a statue that became the symbol of immigration, the welcoming symbol for immigrants on ships heading to the Ellis Island processing centre that opened in 1892.

“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”

In 2010 President Barack Obama quoted Lazarus’ sonnet at the end of a key speech on immigration reform, but left out the line about the ‘wretched refuse’. Commentators at the time wondered whether this was a deliberate omission, or simply a blunder in recitation. Either way, it should make us think about the ways in which US immigration policy struggles to square the mythology of ‘give me your huddled masses’ with the reality that many of those seeking entry are perceived by the political class, and by the wider population, to be ‘wretched refuse.’ That mythology exploded this week when President Donald Trump signed an executive order banning entry to the US by nationals (including dual nationals not holding a US passport) from a list of ‘wretched’ countries: Iran, Iraq, Syria, Yemen, Sudan, Libya and Somalia.

Signed on Holocaust Memorial Day, Trump’s executive order seems to fly in the face of what America means for those who seek refuge from persecution in their own countries, as well as for those who see the US as a place to create new economic futures for themselves and their families. But none of this is new. In fact, this is just the latest effort by a US administration to decide who is ‘deserving’, and who is ‘wretched refuse’. Just over fifty years after Lazarus penned her poem, the US turned away thousands of Jews seeking refuge from the Nazis. In 1882, the year before the poem was published, Congress passed the Chinese Exclusion Act, putting a 10-year moratorium on immigration of Chinese labourers. This was a racist measure justified by a perceived economic imperative. For Chinese people already living in the US, travelling outside the country was fraught with uncertainty, in case they would not be able to re-enter. Sound familiar?

In truth, while Americans tell themselves that the nation was built by hardworking immigrants who brought prosperity and modernity, the reality of immigration law is quite different. Since the inception of the state, governments have tried to figure out who is ‘deserving’ and who is too ‘wretched’ to be a desirable immigrant. In defining nationhood, and belonging, American administrations targeted what Stanley Cohen called ‘folk devils’ by fashioning moral panics around the perceived threat of immigration. The folk devils these days are Muslims, or, as the Trump administration says, people from specific Muslim-majority countries which are deemed suspect (though notably not Turkey or Saudi Arabia). Certain kinds of Muslims are just too ‘wretched.’

But however the US wishes to cast itself as a haven for the poor and tired masses, its history is as much one of exclusion as inclusion. In the early republic, moral panics were initially visible through the rules negotiated around naturalization. Under the first Naturalization Act of 1790, a mere two years of residency were required for naturalization. By 1795, after the peace had been concluded with Britain, this increased to five years. In 1797, politicians seeking to ensure that only the wealthy could be naturalized proposed a $20 tax on naturalization certificates. This proposal failed, but in the following year Congress passed the Naturalization Act and the Alien and Sedition Acts, which established a 14-year residency requirement for naturalization, and made it easier to criminalize and deport any immigrants who criticized the government. Alexander Hamilton, himself an immigrant, declared in support of the Acts that

“…[F]oreigners will generally be apt to bring with them attachments to the persons they have left behind; to the country of their nativity, and to its particular customs and manners… The influx of foreigners [will serve to]…change and corrupt the national spirit; to complicate and confound public opinion; to introduce foreign propensities.”

Forty years later, during the administration of Andrew Jackson, a prominent anti-slavery advocate warned against the dangers of Catholic immigration. Writing about the promise of Westward Expansion, Lyman Beecher (father of Uncle Tom’s Cabin author Harriet Beecher Stowe) appealed to his fellow Americans not to allow the republican spirit of the West to be tainted by the corruption of Catholic immigrants. As Irish and Italian immigrants sought entry for economic opportunity, Beecher deemed them too ‘wretched’ to be part of the American nation. Incidentally, it is interesting that President Trump has chosen a portrait of Andrew Jackson to hang above his desk in the Oval Office: Jackson was a populist who pursued a sustained policy of Indian removal, helping to ensure that America defined itself primarily as a white man’s nation.

Thomas Nast’s anti-Irish cartoons captured this anti-Catholic immigration sentiment, as it continued to rise mid-century. The American Party, or the Know Nothings, reflected this in the political arena. Their nativist policies would be at home on Breitbart; they were essentially the Tea Party of the 1840s. Their America was a Protestant one. In the words of their leader Thomas Whitney:

“Religion, patriotism, and morality have been the foundation stones of our success as a nation, and our happiness and prosperity as a people. These foundation stones were laid upon the rock of a stern Protestant faith, and their fruits have been all that our institutions promised: civil and religious liberty… But the foundation is being removed, and the rock upon which it was laid is in danger of being undermined. Imported infidelity is supplanting the religion of our fathers.”

Replace ‘Catholic’ with ‘Muslim’ in the spirit of the Know-Nothings, and you have the key to today’s moral panic and folk devils. Things escalated during the Mexican War, with prominent opponents of the war framing their opposition in racist terms. America should not seek to annex parts of Mexico, they said, because (in the words of Rev. Theodore Parker), Mexicans were “a wretched people; wretched in their origin, history, and character.” President Trump may have stolen his election slogan from Ronald Reagan, but he stole his position on Mexico from Theodore Parker.

The aftermath of the Civil War opened up more questions about who constituted the citizenry of the nation; eventually the 14th amendment recognized that all people (including former slaves) born on US soil are citizens. But simultaneously, new colour lines were drawn. Even Frederick Douglass, prominent abolitionist, saw immigrants as the enemy of free Blacks who were vulnerable to being enslaved. Writing in 1855, he complained that

“Every hour sees us elbowed out of some employment to make room perhaps for some newly arrived immigrants, whose hunger and color are thought to give them a title to special favor.”

At any rate, only immigrants deemed to be ‘white’ were allowed to become naturalized citizens. Propaganda about the “Yellow Peril” targeted Chinese immigrants who had mostly settled on the west coast, and the Exclusion Act of 1882 and the so-called Gentlemen’s Agreement of 1907 clearly indicated that Asiatic people were not part of the American ‘nation.’ State and territorial legislation restricted Chinese and Japanese rights to own property, and a landmark Supreme Court decision in U.S. v. Bhagat Singh Thind (1923) reaffirmed that people from India were not eligible to become naturalized citizens (even though, ironically, the court acknowledged that they were ‘Caucasian’). This decision would hold until 1946.

The assassination of President McKinley provided an excuse for an emergency crackdown on suspected anarchists. The passage of new legislation through Congress was facilitated by the rise of eugenics. In 1911, prominent biologist, statistician and eugenicist Charles Davenport bemoaned the impact of immigration from south and eastern Europe:

“The population of the United States will… rapidly become darker in pigmentation, smaller in stature, more given to crimes of larceny, kidnapping, assault, murder, rape and sex immorality. And the ratio of insanity in the population will rapidly increase.”

Sound familiar?

The anarchist exclusion laws of the early 1900s were followed by the Immigration Act of 1917, a resoundingly nativist piece of legislation. In particular, it targeted Jewish immigrants from Russia and Eastern Europe who were tarred with the label of communism. A waiver to the law exempted Mexicans, perhaps ironically given today’s political climate; businesses in the Southwestern states relied heavily on immigrant labour, especially in a war economy.

After the war, immigration quotas were introduced, which disproportionately favoured western European applicants and disadvantaged people from Asia and Latin America. Much has been written about the isolationism of the ‘America First’ movement, spearheaded by noted anti-semites like Charles Lindbergh. We know a lot about Japanese internment during World War II, after generations of silence. Marking Holocaust Memorial Day, somebody has been tweeting the names of Jewish refugees who were turned away from the border in 1939, and died in work and death camps. They were too ‘wretched’ to be accommodated in a US which would only two years later organize its war effort around Roosevelt’s ideas of Four Freedoms.

The next major developments in immigration policy came in 1952, when the Immigration and Nationality Act abolished racially-based restrictions, although it kept quotas based on national origins which, it was hoped, would control the immigration of ‘undesirables’ who would taint American democracy. Critics of the bill claimed (rightly) that it would give preference to immigrants from northern and western Europe. The co-sponsor of the bill, Senator Pat McCarran (D-NV) was eager to maintain the national origins controls, arguing that

“…this nation is the last hope of Western civilization and if this oasis of the world shall be overrun, perverted, contaminated or destroyed, then the last flickering light of humanity will be extinguished.”

Although the 1952 Act opened the door to increased Asian immigration, the numbers remained low. In his message vetoing the bill, President Harry S. Truman said

“We do not need to be protected against immigrants from these countries–on the contrary we want to stretch out a helping hand, to save those who have managed to flee into Western Europe, to succor those who are brave enough to escape from barbarism, to welcome and restore them against the day when their countries will, as we hope, be free again.”

The key piece of legislation that people are talking about in the context of current developments is the 1965 Immigration and Nationality Act. This was a major departure from previous policy, and radically overhauled the ways that immigration quotas were decided. For the first time, national origin and race were eliminated from immigration decisions; labour became the defining issue. In fact, the 1965 Act specifically made it unlawful to discriminate on the grounds of race, echoing the terms of the 1964 Civil Rights Act. Fifty years after the 1965 Act, it is clear that the legislation did not have the consequences that were intended. But this piece of legislation established a principle of non-discrimination in the issuance of immigrant visas, and it is likely that Trump’s Executive Order is in contravention of this law.

The reality is that the US has a troubled history of squaring its mythology as an immigrant nation, a beacon upon a hill, a refuge for the tired, poor and huddled masses yearning for freedom, with a white, protectionist political system that has used immigration as a wedge issue since the establishment of the early Republic. Donald Trump’s actions must be seen in historical context, in an arc of history that has rarely bent towards justice. This does not mean they shouldn’t, and can’t, be resisted. It will be interesting to see which way the courts go. The US should protect itself from illegal immigration and terror, but this kneejerk, arbitrary and misguided reaction only reminds us of the mid-19th century and the Know-Nothings. It’s certainly not by accident that Trump has chosen Jackson as his office-mate.

Home sweet home: homelessness in Dublin, fifty years after the Dublin Housing Action Committee

The current ‘Home Sweet Home’ campaign to force the Irish government to take notice of the endemic problem of homelessness on Dublin streets has captured the empathy of a city. Reading the reports over the past few days, it is clear that there is a public appetite to address an obvious unfairness: the deliberate decision to leave properties vacant while people are homeless on the streets. When these properties, as is the case with Apollo House, belong to the government (or its agency, NAMA) the situation appears even more acutely illogical. It is not without significance that the occupation of Apollo House comes at the end of the celebratory year of the Rising, when we re-told ourselves stories of the bravery of men and women who had a social and political vision for an independent Ireland. Jim Sheridan, one of the celebrity supporters of the campaign, captured the problem within a historic narrative:

We were the first nation in the world to do that and, coming from a famine country where everyone was displaced and had to leave, we think we should be in the forefront of ending homelessness especially in these cruel times of austerity, of banking crisis, of people paying debt.

Indeed, this historic sensibility was a feature of the founding of the Dublin Housing Action Committee fifty years ago (1966 was also a commemoration year) against the background of rising homelessness, lengthening waiting lists for social housing, and the apparent contradictions between the promise of the Proclamation and the delivery of a social vision for justice within an Irish Republic. One wonders whether the Home Sweet Home campaign might finally deliver on the state’s promise to cherish its children equally, and to realize the social justice that we often attribute to the foundation myth of the state. If it does, then it might finally provide a resolution to problems that have been fifty years in the making.

In 1968 and 1969 a small activist organization called the Dublin Housing Action Committee grabbed headlines in much the same way: its protests and occupations against the spiraling problem of homelessness were framed within a language of historical responsibility. The roots of the crisis, like ours now, lay in events several years before. The tenements of the early 1900s were mostly gone, but housing provision in the city was still well below contemporary standards. Matters deteriorated rapidly in the summer of 1963, one of the wettest on record. Flooding caused numerous houses to collapse, almost all in areas traditionally associated with ‘tenement’ dwelling. On both sides of the Liffey, fears increased regarding the safety of old houses. The housing shortage became so critical that the Dublin Health Authority acquired a section of Griffith Barracks to house homeless families.

When the Housing Act of 1966 was signed into law, the government’s priorities in housing matters became clear. Already in the pipeline since the Housing Act of 1962, the 1966 Act set out government incentives for citizens to purchase their own homes. This was a move away from state responsibility for housing provision, and effectively called upon citizens to make provision for their own accommodation. Government-backed loans would be offered to help families (the basic constitutional unit of the state) to purchase homes; the expectation was that these measures would encourage families to move out of the city centre and into homes in the suburbs, rather than adding their names to lengthy local authority lists in the hope of acquiring social housing.

By 1968, housing protests had escalated, and demonstrations and arrests gained media attention. In September that year a group heavily influenced by Sinn Fein protested outside the Shelbourne Hotel to highlight the ways that the Republic had failed to deliver on its promise to free the working people of the nation:

Our freedom has not yet been won, that the 26-county “Republic” declared in 1949 is a sham. Ireland cannot be free until her whole wealth is under the control of the organized working people of the whole country. To achieve this we must sweep aside the present administrators of money-grabbing politicians and their foreign monopolist bosses.[1] [italics in original]

The new Lord Mayor of Dublin, Frank Cluskey, attempted to meet with a deputation from the DHAC in August, but Minister Kevin Boland remained intransigent on the issue. In fact, apart from the Minister, the whole country was obsessed with the issue of housing towards the end of the year. In his presidential address to the annual conference of the Association of Municipal Authorities of Ireland, Dan Spring stated that ‘the provision of houses was one of the most pressing matters for all councils’[2]; RTE’s Seven Days programme invited Fr Michael Sweetman to show them around what he considered to be the worst parts of Dublin (they were unable to broadcast the footage because it was deemed too upsetting); and Kevin Boland was plagued by questions from deputies in the Dail regarding plans to address the lengthy housing waiting lists.

Frustration grew. At a Conference of Dublin’s Homeless, held in November 1968, a resolution was passed stating that squatting was the only resort left to homeless people.[3] This was a direct challenge to private property, designed to test Article 41 of the Constitution, which requires the state to protect the family. On 17 November Denis Dennehy, a key organizer of the DHAC, moved his whole family into a room at 20 Mountjoy Square. The property belonged to a prominent Dublin businessman and member of the Georgian Society, Ivor B. Underwood; it had been left vacant for some time, possibly in the hope that its conditions of use could be changed from residential to commercial. On 16 December, Mr Justice Butler ordered the Dennehys to vacate the premises or find themselves in contempt of court. Reporting the case in its January issue, The United Irishman concluded that ‘despite the grand language of the Sacred 1937 constitution, a working-class family counts for nothing against the might and majesty of landlordism in Ireland.’[4] Dennehy refused to leave: on 3 January, he was imprisoned for contempt of court. In protest at his arrest, he went on hunger strike.

There is no doubt that the timing of the incident was carefully choreographed: January 1969 marked the fiftieth anniversary of the establishment of the first Dail. On 20 January, one day before a civic reception was planned to commemorate the anniversary, Lord Mayor Frank Cluskey sent a telegram to Taoiseach Jack Lynch, appealing for

the release from prison of Denis Dennehy to his wife and children on humanitarian grounds, as a tangible token of our acceptance on the great occasion we will commemorate tomorrow and of the princibles (sic) espoused on that occasion.[5]

Since Dennehy was in prison for contempt of court, neither the Minister for Justice nor the Taoiseach could intervene on his behalf. Notwithstanding this, the commemoration held on 21 January at the Mansion House in Dublin was interrupted by a veteran of the 1916 Rising, who used the occasion to highlight the perceived hypocrisy of the government. Una O’Higgins-O’Malley, daughter of veteran Kevin O’Higgins and member of a well-connected Irish political dynasty, wrote to Lynch the following day:

The wrong elements may be being used for the wrong motives but the truth is that far too many people are living in sub-human conditions and the children of the nation are very far from being cherished equally. (I do not question the validity of the High Court decision in the case of Mr. Dennehy – but rather the position which gave rise to all this).[6]

Significant marches were organized in mid-January in support of Dennehy. In fact, the Dennehy hunger strike was the single most important consciousness-raising activity undertaken by the DHAC. Not only did it increase popular sympathy, it also galvanised support from external groups: opposition politicians, students, the unions and the Dochas Society all called for his release. A statement from the Cooperative Society, Dochas, summed up the mood of protesters:

A housing crisis exists in Dublin, despite all the good work of Corporation officials… What happened to Denis Dennehy on the eve of the first Dail’s 50th anniversary must never be allowed to happen again. The gaoling of homeless Denis Dennehy should be the last indignity that we allow the homeless to suffer.[7]

More radical voices also came out in support of the Dennehy protest. A group calling themselves the ‘Irish Exiles Association’ placed a picket on the Irish embassy in London: they threatened violence if Dennehy was not released from prison.[8]

The eventual release of Dennehy in late January, and the publication of a new Housing Bill the following March, marked some degree of success for the DHAC. The main purpose of the Bill, in the words of its accompanying explanatory memorandum, was to ‘secure more effective control over the demolition or change of use of houses.’ The Bill also attempted a more precise definition of a ‘habitable house’ as ‘one which in the opinion of the housing authority is reasonably fit for human habitation or is capable of being rendered so fit at reasonable expense.’

While the Bill did not offer total control over the demolition of sound houses, it did represent a softening of the establishment’s position on the housing issue, and demonstrated at least some willingness to consider the most significant concern of the DHAC. It was not enough for many activists. Demonstrations continued until late 1969, but the focus of attention began to gravitate north as violence escalated in Derry and Belfast. The antics of Hilary Boyle, a 70-year-old grandmother who chained herself to the railings outside City Hall in November 1969, did not attract the same level of attention as Dennehy’s hunger strike.[10]

And now, almost fifty years later, activists are resorting to the same kind of tactics (consciousness-raising, occupation) in order to raise the profile of the capital city’s homelessness. The obvious disconnect between the promise of the founding period of the state and the delivery of the ideals of the republic continues to capture the public imagination. Commemoration comes with expectation. Half a century after its ‘Just Society’ platform, Fine Gael still struggles to balance social justice with political imperatives. The occupation of Apollo House is the latest attempt to force successive governments to prioritize people over profit.


[1] This was the language from a DHAC membership bulletin, in issues of the United Irishman from mid-1968 through 1969.

[2] Irish Independent, 18 September 1968, p. 7.

[3] Irish Independent, 18 November 1968, p. 13.

[4] United Irishman, January 1969, p. 10.

[5] Cluskey to Lynch, 20 January 1969, Dept. of the Taoiseach, NA 2000/6/423.

[6] O’Higgins-O’Malley to Lynch, 21 January 1969, Dept. of the Taoiseach, NA 2000/6/423.

[7] Irish Independent, 21 January 1969, p. 9.

[8] Security briefings, 20-24 January 1969, Dept. of the Taoiseach, NA 2000/6/423.

[9] For example, Seamus Costello, a local councillor in Bray who was involved with the Dublin Committee and the Bray Housing Action Committee, joined the INLA and was shot dead in 1977.

[10] Correspondence, Hilary Boyle to Jack Lynch, October-November 1969, Dept. of the Taoiseach, NA 2000/6/423.