North Carolina on a cold winter night in 1962. Nina Simone and her trio have just come off tour and are desperate to get home. A snowstorm has grounded their flight, so they wander into town and end up at a tiny greasy spoon run by locals who are dazzled by the presence of four glamorous strangers. But when the band suddenly find themselves accused of a crime they didn’t commit, neither local goodwill nor star power will be enough to get them home.
The Champion offers a rare look into the heart and mind of an artist known as much for her indictment of American racism as for her artistic brilliance. The project began in 2006 as a collaboration between American playwright Amy Evans and UK-based director Mark Rosenblatt and actor Noma Dumezweni. The team embarked on a period of extensive research, including personal interviews and a series of workshops, all of which helped generate material for the script. The result is an intimate portrayal of a cultural icon, the outstanding musicians who worked with her, and the turbulent era during which they rose to fame.
On May 8, 2014, 651 ARTS will present The Champion to the public as a staged reading for the first time in the United States (venue TBC). Following the performance, the audience will be invited to share their responses to the play with the company, playwright, and special guests. This key moment of interaction will unearth the most urgent themes raised in the play, feeding directly into the further growth of the project.
This project needs your support! Please click here to learn more about how you can get involved in The Champion.
Filed under: music, performance writing, theatre, Uncategorized
Tags: 651 Arts, Brooklyn, music, Nina Simone, performance, plays, The Champion, theater, theatre
At Jubilee Market Place on 68th and Freedom Place in Manhattan, a sign hangs just beside the deli counter that reads ‘English-Only Zone.’ It’s not one of those notices you might find scribbled on a sheet of paper and taped haphazardly to the wall; whoever hung this sign did a careful job of having it printed in clear, large font, on paper that matches the purple and green color scheme of the small market. It’s meant to be taken seriously, the same way one might frame the company motto and display it in the reception lounge for everyone to see.
Is it a warning against patrons who might slow the mad pace of lunch hour as they try to explain what kind of cold cuts they want on their sandwich? Is it reassurance for customers who grow afraid when the clerks serving them speak to one another in a language they can’t understand? Is it an ominous reminder to employees that they are to leave their mother tongue at the door when they show up for their shift – unless, of course, their mother tongue is English?
One of the essays I enjoy sharing with my ESL students is ‘When and Why We Speak Spanish in Public’ by journalist Myriam Marquez. For Marquez, speaking her mother tongue in public places – even when, heaven forbid, she and her family are surrounded by strangers who can’t understand them – is a matter of ‘respect for our parents and comfort in our cultural roots.’ It’s a simple assertion – so simple, in fact, that I have to wonder each time I read it what has driven her to make it in the first place. What are those who insist on the supremacy of English so frightened of? What has scared them so out of their wits that they need to be reminded of what language means?
During the 2012 presidential race, Republican candidates Newt Gingrich and Mitt Romney both expressed their support for English-Only legislation. Such a policy would, in Gingrich’s words, ‘create a continuity.’ A continuity of what, one must ask? The country we now call America was multilingual long before Europeans invaded it, and the Europeans that did eventually show up were not exclusively English-speakers. What grand tradition does Gingrich see himself carrying forward by imposing an official language in a country founded on the principle of Freedom of Speech?
Those who would like to see an ‘English-Only Zone’ clause written into the Constitution base their reasoning on any number of bizarre fantasies. But these fantasies cloud a more dangerous and violent line of thinking. Linguist Derek Bickerton aptly calls it ‘racism lite’: English-Only is a convenient vehicle for exercising prejudice with relative impunity. The store manager knows that he could never get away with a ‘Whites Only’ sign in his establishment. But ‘English-Only Zone’ allows him to indulge in any silly fantasy he wants: he is only trying to bring unity and order to his unruly customers and give his employees equal access to the American Dream. In fact what he has done is reserve the privilege of entering his establishment for those who resemble him – if not in outward appearance or in culture then at least in speech. Gingrich calls it ‘continuity’; fifty years ago George Wallace called it ‘preservation.’ In both cases, the goal is the same: maintaining a power structure where there are more white people at the center and more people of color on the margins.
When I explain that I teach ESL, I am often asked how many languages I speak besides English – the assumption being that there is no way I could teach someone English without first knowing that person’s mother tongue. In fact I’ve taught students at all levels, from absolute beginners to advanced learners, and 90% of the time I can’t speak a word of their mother tongue. Teaching ESL has proven to me that language is only one of many means of communicating: little more is needed than the desire of one person to share information and the readiness of another person to receive that information. It might take a little longer to get the information across, but it can be done: it only requires empathy from both sides. And if you’re busy trying to occupy a position of dominance, if you’re more interested in controlling the person in front of you than actively communicating with him/her, you probably don’t have much empathy to spare.
Language is fluid. It changes over time, adopting words and disowning others, shedding its skin generation after generation, bleeding across borders and taking new shapes. Language follows our lead, shifting direction as we do, adapting to our choices. That’s what is beautiful about it. But language is also power. And applying privileges to that power is a ruthless perversion of all that language can be.
Filed under: news
40 for free
39 for his sister
38 for his mother
37 for his baby
36 for the half-love at home
35 for the reasons he ran here instead of staying
33 for the soldiers who were on his trail
32 for the authorities who let him go
31 for the hours spent at the airport waiting
30 people in line
29 visas to go
26 too few for hope
25 years old, too young to die
24 dead anyway and many more to come
23 friends buried
22 burned alive
21 shot in cold blood and the rest got away
20 with major injuries and
19 maimed for life
18 who will never be able to multiply and make up for
17 year old children with children of their own
16 chances missed
15 different times but choices are like that, impossible to guess that
14 could ever be a lucky number and
13 maybe he’ll survive if he gets there in time
12 but this time too late
11 days without a minute of sleep
10 times punishment for doing the job no one else would
9 coffees to wash out
8 badly needed drinks
7 seconds to fall asleep and dream of Ricky Reel
and dream of Stephen Lawrence
and dream of Rodney King
and dream of Nigeria and what used to be home,
and dream of the sound of footsteps coming up to the door
6 raps on the window or maybe
5 who’ll ever know because
4 police officers’ memories aren’t enough to hold what’s happened and
3 are failing failing
2 times would’ve kept him down but not far down enough for them
and let one of them be left for me
let one of them be left for me
leave me one don’t let them
leave me out
amy evans © 2000
Filed under: Uncategorized
After the Sandy Hook shooting, a statement circulated around Facebook that was falsely attributed to Morgan Freeman about the role of the media in glamorizing perpetrators of gun violence. That the author of the statement not only remained anonymous but projected his or her feelings onto a high profile Hollywood actor should be reason enough to ignore the sentiments expressed within it. But this didn’t stop the Huffington Post from reprinting the comments along with remarks by L. Steven Sieden, who seemed to feel that this faceless author had a point. “Do we dare to think for ourselves rather than allowing the media to dictate how the world is and should be?” Sieden asks – implying that had Adam Lanza been more of an independent thinker, he would have just died ‘a sad nobody’ and everyone else would have been better off for it. In fact if the media would just stop glorifying mass shootings, ‘disturbed people’ would simply do the mature thing and blow themselves away in private. If you want to take your own life, you’re free to do it – just don’t make it anyone else’s business.
I, for one, have a problem with that. It wasn’t the promise of media hype – hype that he wouldn’t live to enjoy – that prompted Adam Lanza to kill his mother, twenty children, six teachers, and then himself. It wasn’t video games, the latest Hollywood flick, Lanza’s genetic make-up, his parents’ divorce, or the possibility that he might have suffered from a mild form of autism that caused him to do it. Lanza committed murder and completed suicide because 1) he was mentally ill – not mentally disabled, but mentally ill – and 2) he had access to guns. Remove either one of those two factors from the equation, and twenty-eight people might still be alive today and the rest of us would be going on with our lives blissfully unaware that a little place called Newtown, Connecticut even exists.
I come from a town much like Newtown, if the gushing descriptions of the place are to be believed. My parents bought a house in Chapel Hill, North Carolina in 1974, the year I was born; my mother was pregnant with me when she and my father repainted the exterior of the house yellow. I remember dodging caterpillars, wriggling earthworms and the occasional garden snake on the way to school in the mornings. After school I played in the street with the kids on the block and in summer went swimming at the local pool. There were bushes along the street where blackberries grew in spring, and my grandmother and I used to harvest them if she happened to be in town for a visit. I saw Michael Jordan play basketball when he was just a freshman at UNC and watched jealously as my neighbor piled her kids in the car and rushed up to Franklin Street to celebrate when the Tarheels brought home NCAA titles. (We weren’t allowed to go; ‘too dangerous,’ my father said. I still remember going to school the next day and listening to stories of schoolmates who arrived on Franklin Street with their parents and five minutes later found themselves drenched in beer.) If it sounds idyllic, it’s because it was; even today Chapel Hill can be found listed among the top ten best places to live in the United States.
Still my father decided to invest in a gun. This may seem odd in a place where neighbors left their front doors wide open on hot summer nights and sometimes never bothered locking their back doors at all. But it seems less odd when I try to put myself in my father’s shoes. Both my parents came of age in Mississippi during a time when a Black child couldn’t attend a majority white school without the protection of federal marshals. Now here my parents were twenty years later sending their children off to integrated schools, with no references to draw upon and no idea what to expect. It must have been baffling at times, if not downright frightening. And in my father’s mind, having a weapon in the house and making sure his wife and daughter knew how to use it was at least one way to ensure our safety.
It never occurred to him or to any of us that my mother would one day use the gun to take her own life. In retrospect the signs were all there; on the Centers for Disease Control and Prevention website, there’s a long list of risk factors for suicide, and more than a few apply in my mother’s case: a history of mental illness, a recent loss, feelings of isolation, and, near the bottom of the list, easy access to lethal methods – in other words, guns. In 2009, 56% of men and 30% of women who died by suicide used a firearm, making guns the leading means of suicide death among men and the second most prevalent among women. Firearms offer an alternative that is quick and immediate; a gunshot wound to the chest or head can snuff out a life a lot faster than sleeping pills or a razor blade. The chances that a friend or family member might discover the suicide in progress – and consequently the chances of survival – are far slimmer in cases where firearms are used.
My mother, taking no chances, planned her death long in advance. She chose to use my father’s gun, but only after procuring four bottles of prescription sleeping pills, a supply it must have taken weeks to collect. She had always been responsible for the family finances, and she arranged bill payments a solid three months in advance to ensure my father had a comfortable grace period before he was forced to take over her responsibilities. She shot herself in the bathroom, the only room in the house with a lock, probably because she knew that her husband would be more likely to force open a locked door than her daughter. She chose the bathtub, not an especially comfortable or romantic place to end one’s life, but easy to clean up in case things got messy. And lastly she slipped away to complete her suicide when she was least likely to draw attention to herself – on a Sunday morning while her husband was out doing yard work and her teenage daughter was holed up in her room with music blaring. My mother committed the ‘sad nobody’ kind of suicide, the kind of suicide that’s sanctioned by our right as Americans to do whatever we want with our lives as long as we don’t make it anyone else’s business.
In the twenty years since I lost her, in twenty years of missing her, in twenty years of wondering if I could have done anything to stop her, I’ve never once paused to think about the role of the media in her death. I’ve never wondered which violent film or graphic video game might have desensitized her to the grief her decision would inflict on her family. Maybe I should learn to be proud that my mother, even in the depths of her depression, still had the sense to die sad and lonely instead of making a murderous spectacle of herself and shooting up the local elementary school. But even though her death didn’t make front page news and didn’t break the nation’s heart, it broke three hearts at least: my father’s, my brother’s and mine. And though I’m now finally able to move beyond my grief and embrace my life as a gift that I would trade for nothing in the world, I refuse to reduce my mother, who gave me this life and who nurtured me in spite of her depression for as long as she could, to a ‘sad nobody’ who vanished from among us without a trace.
Before we start blaming the media for turning ‘sad nobodies’ into cold-blooded mass murderers, we should ask ourselves who’s really the more desensitized. The direct correlation between the presence of firearms and the likelihood of suicide completion has been proven again and again. But it seems we don’t actually care enough about suicidal people to act on these facts unless they kill someone else along with themselves, unless they choose to be ‘monsters’ instead of ‘nobodies.’ Maybe it’s because we’re scared of mental illness. If gun control could help curb heart disease or lung cancer, maybe we’d be more inclined to talk about it. But we’re barely at the point where we can talk plainly and knowledgeably about depression, let alone suicide. We still cling to the belief that a person only needs offspring, a stable home, and a successful spouse to keep her rooted firmly to this earth, even though there is plenty of evidence pointing to the contrary.
Don’t try to pin that one on the media. Our denial is all our own fault. And only we can change it.
Even though my mother was dangerously ill, she probably would not have shot herself if she had not had access to a loaded firearm. She might have used pills or another method, but at least then there would have been some chance, however small, of a timely intervention. And the beautiful, graceful woman who was as much my beloved elder as she was my best girlfriend might still be alive today.
It may sound as if I blame my father for bringing a gun into the house in the first place. I don’t blame him, not at all. To do so would be to ignore the complex history of gun ownership in America and the real dangers that Black people have faced throughout that history. I clearly recall driving with my parents through Alabama late one night after visiting relatives in Mississippi and sampling what for my parents must have been a common occurrence when they were my age. The car was running low on gas, so my father stopped at the first place we saw, which was a tiny gas station on a remote section of the highway. My father pulled the car into the station, but he didn’t get out to fill the tank; he just sat very still, looking into the windows of the station. The lights were on, and there were clearly people milling about inside. My mother followed his gaze and then looked back at him; from the backseat I heard her ask, ‘You think it’s Klan?’ My father replied by gunning the engine and driving away. This wasn’t 1950; this was the late 1980s. And there it was, the word ‘Klan,’ rolling off my mother’s tongue with frightening ease. It’s no wonder to me that my father opted to keep a weapon in the house. But it is a wonder to me that the gun owners whose raucous voices dominate the debate on gun control look so little like my father and so very much like the figures I saw through the gas station window that night.
In debating gun control, there is a great deal of paranoia about protecting the Second Amendment at all costs. Radical gun nuts who seem to think the United States is still a British colony have received a rather remarkable wealth of air time for their outrageous views. I wish that more air time were given to gun owners with legitimate fears. When we unpack these fears, acknowledge them and address them, we might better empower gun owners to give up their weapons, making their homes and communities safer for their loved ones and opening up spaces where conversations about healing can finally take place.
Filed under: news
In one of my writing classes at Fordham, we debated Mayor Bloomberg’s recent proposal to address the obesity epidemic by imposing a ban on 20 oz. soft drinks sold at New York City sports arenas, delis, and movie theaters. My students were convinced the ban would fail, not only because it challenged the self-determination of the individual (this comment came from an attorney, as you might guess), but because, quite logically, anyone who had a craving for a large soda would only have to go next door to the pharmacy or the local grocery store and help themselves. And if sugary drinks have a role to play in adding pounds to the bodies of the people who consume them, what about the tempting array of candy bars available at every convenience store, pharmacy, gas station and grocery store checkout line? Or the two-for-a-dollar bags of chips, loaded with artificial flavors, coloring, cholesterol and saturated fat? Banning sodas is little more than a symbolic gesture in light of all the other options available to the salt and sugar addict.
And yet symbolism is important. When I was a kid growing up in the seventies and eighties, smoking was still associated with glamor and sophistication. Now, at least in New York, smoking is synonymous with exile and social exclusion. The smoking ban surely has had a lot to do with that. So why shouldn’t a ban on soft drinks work the same magic? The obvious answer is that, unlike soft drinks, smoking stinks. The smell gets in your clothes, your hair, and your belongings, whether it’s you who’s smoking or the person next to you. If a person takes a drag and you happen to be in the wrong place at the wrong time, you could very well end up taking a drag too, whether you want to or not. In other words, smoking had an impact on the non-smoking public that was as apparent as it was bothersome. Thus turning smoking into a behavior with morally incorrect undertones was not such a long shot. Drinking soda is different. If the person sitting next to me slurps on a soft drink, it has little to no impact on my personal health. It won’t inadvertently end up going down my throat, giving me ulcers or rotting my teeth. Slapping a moral code on behavior that has no apparent impact on society as a whole – especially in a country where ill health is not only acceptable but clearly lucrative – will do nothing to curb trends in obesity.
So what will? The answer is simple. Ill health must become intolerable in the eyes of the general public. And the only way to make ill health intolerable is to make quality health care entirely accessible. Lack of funds or lack of transportation could then no longer serve as explanations for why so many Americans go for years without medical check-ups or without treatment for illness and injury. If every resident in the New York metropolitan area had access to affordable health care, regardless of their income, then I would be less inclined to turn a blind eye to the health of the stranger nursing a soft drink next to me; both of us would have a stake in his effort to cut back on his consumption of sugar and salt.
This might sound far-fetched. After all not even the US federal government has been able to pass universal health care despite decades of attempts. Undoubtedly the elected official that presides over the institution of something so incontrovertibly beneficial to the American public would go down in history a hero, and that seems to be a status that no political party dares afford a member of the rival party – or a rival member of their own party for that matter. What I’m proposing, however, is not universal healthcare, but a scheme that would make sure every New Yorker, regardless of employment status, has consistent access to essential services. The plan would include emergency room visits, preventative and routine care, mental health care services, and prescription medicine. Premiums would be paid on a sliding scale according to one’s income and number of dependents. Rather than imposing a random and often conflicting web of in-network facilities for customers to sort through in the event of a health crisis, this plan would assign participants a local clinic: a place close to home where they could go and feel confident that they will receive the care they need or be referred to a specialist whose fees are covered by the scheme.
If any of this sounds familiar, it’s because it’s all based on San Francisco’s award-winning Health Access Program, which has been alive and kicking since 2007. If you are a resident of San Francisco earning up to 500% of the Federal Poverty Level – that’s $54,000 or less for an individual and $111,000 or less for a family of four – then you are eligible for access to health care at a rate ranging from $0 to $450 per quarter. Program participants are assigned a ‘medical home’ where they receive regular care from personnel who are then able to keep consistent records of their patients’ medical history. The program is not meant to take the place of health insurance plans – it won’t cover your root canal or your new prescription glasses or any emergencies you might incur while outside San Francisco. But for those people stuck in the twilight zone between indemnity insurance and unaffordable health care packages, the Health Access Program has been a proven success: at the end of 2011, over 54,000 uninsured adults were enrolled in the program, accounting for 85% of the total 64,000 uninsured residents of San Francisco. That same year the program added two additional medical homes to its network of 36 total. All of this came at a cost of $177.7 million in 2011, or approximately $272 monthly per enrolled participant. Of this amount, 28% was covered by city revenue, 56% was covered by city and county subsidies, and 16% came from private community program providers.
I’m not a statistician, an economist or a psychologist. Yet I have a sneaking suspicion that implementing a comparable program in a big city like New York is not beyond the realm of possibility. At present the obesity epidemic costs the city of New York $4 billion and has put over 5,000 lives at risk. Instead of trying in vain to stop people from drinking so much soda, why not actively encourage people to take better care of themselves and give them the means to do it? It’s one thing when the mayor of New York – whom most of us have only seen on TV and in the newspaper – reaches down from on high and snatches away your 20 oz. soda cup; it’s another thing when a doctor sits across from you and calmly informs you that your diet is killing you and offers you the numbers to prove it. I personally would find the latter far more effective, and my guess is that most people would as well. What’s left is to afford all people – and not just those with far-reaching health insurance coverage – the chance to sit down with a doctor in the first place.
Some naysayers have called Mayor Bloomberg’s soda ban proposal an attempt to create a ‘nanny state.’ Undoubtedly many would say the same about a city-wide program that promises access to health care for everyone, including the city’s poorest. But I claim that the exact opposite is true. Guaranteeing quality health care for everyone allows individuals to thrive. It means more people will go to the doctor when they’re sick and seek out treatment rather than ignoring their symptoms because they’re afraid that a diagnosis might lead to financial ruin. It means that more people will get treated for illnesses instead of ‘toughing it out’ because they can’t afford it and inevitably making themselves even sicker. And it means fewer people losing homes and going bankrupt because they are unable to keep up with medical bills.
As for critics who insist that affordable health care will result in a flood of attention-seekers with nothing better to do than take advantage of resources, I would counter that most people don’t actually enjoy being poked and prodded at the doctor’s office, and those who do would be insignificant in number compared to those who go only when they need to. If San Francisco’s program is anything to go by, then I predict a healthier population, higher productivity, and an overall stronger sense of satisfaction with the city in which we all live. But more importantly, instituting such a scheme would make good health a moral imperative instead of a symbolic gesture. And instead of ignoring the guy beside me guzzling soda, I would be in a better position to bark at him in true New York style and tell him to move to New Jersey if he wants to trash his body.
Filed under: Uncategorized
Madness abounds in Shakespeare. King Lear strips off his clothes and goes racing into a lightning storm, driven to the breaking point by his two ingrate daughters. Hamlet pretends so well to be mad that everyone is fooled, including the lovestruck Ophelia, who goes mad for real. Judging by the world of Shakespeare, madness is as close to us as death: one step too far in the wrong direction, and our minds fly to the winds. So it wasn’t terribly surprising to watch a version of Macbeth set entirely in a mental institution with Alan Cumming in the role of a tormented man who becomes possessed by one character after the next while detached but curious figures in white coats observe and take notes from a comfortable distance. Apparently, he’s gone ‘mad’ in the Shakespearean sense of the word: he’s lost so much and suffered so deeply that his poor brain has become overwhelmed with grief. So he talks to himself, flits from one identity to the next, and attempts suicide twice, all in less than two hours.
This version of the Scottish play, produced by the National Theatre of Scotland and directed by John Tiffany and Andrew Goldberg, is inspired by the moment when Macduff is informed that his wife and children have been murdered. The loss spurs him forward to fight alongside England and overthrow the tyrannical Macbeth – but not before he has first felt his grief ‘like a man’. But here Macduff has not fully managed to process his grief; yes, he fought the tyrant and won, but this feat has done nothing to help him come to terms with the loss of his family. As a result, he’s landed in an institution where he recounts the events that brought him to this sad state. It’s a clever twist, and demands that we consider what exactly there is to celebrate in justice that has come too late and at too high a cost, a question that resonates throughout contemporary history and especially now in places like Egypt, Libya and Syria. What happens when overthrowing a tyrant and changing the course of history cannot heal the pain of personal loss? But unfortunately this is where the production stops asking questions and instead becomes a showcase for the masterful Alan Cumming, who transforms before our very eyes into a campy King Duncan, a loyal Banquo, a seductive Lady Macbeth, and a whining, sniveling Malcolm with sublime precision and clarity. Watching Cumming at work was a magical experience, yet I left the production feeling vaguely disappointed that I wasn’t as in awe of the clever take on the story as I was of Cumming’s craft.
What bothered me was that I didn’t know exactly what mental condition I was meant to have seen depicted before me. Cumming’s Macduff seemed to exhibit symptoms more characteristic of schizophrenia than of grief. And while it is not entirely clear what causes schizophrenia, statistics show that it is primarily genetic. Certainly we all grieve at one point or another in our lives, and while despair can at times feel overwhelming, it does not always result in hallucinations and suicide attempts. But in the world of drama, there’s something irresistible about ignoring the fact that mental illness, like any other kind of illness, has symptoms that appear with a rhythm and a logic of their own. If you’re suffering from the flu, abdominal bleeding is generally not a symptom; if you have a migraine headache, it’s rare that you get a cough and fever along with it (unless you also happen to have the flu at the same time). So what happens if you lose your mind? Well, anything at all, which is convenient if you’re trying to find a clever way of staging a one-man version of Macbeth.
The powerful stigma attached to mental illness allows us to adhere much less to reality in favor of indulging in what is popularly perceived as ‘mad’. Those perceptions might include a vast range of behaviors, many of them utterly subjective. And of course we can get away with it. After all we have centuries of generic insanity that we can call upon without running the risk of alienating our audiences, thanks to the fact that mental illness, compared to other forms of illness, is still so widely misunderstood. The result is stories where schizophrenia is a disorder that can be conquered with a bit of mathematical reasoning, or worse, where depression is merely a metaphor for bucking the system, and schizophrenics, like wise sages, actually are the ones who see the world in all its clarity.

Words like ‘crazy’, ‘nuts’, ‘mad’, and ‘mental’ are attached to anything from the rising price of a pint of milk to the American-led invasion of the Middle East. But what are we actually talking about when we use these terms? What does ‘crazy’ look like? And would a more accurate depiction of mental illness necessarily negate the beauty of a tormented man, alone, tearing through the story of Macbeth in an effort to make some sense of the losses he’s endured? I doubt it.
I only wish that more of the stories we see on film and on stage accurately reflected the growing body of information available to us about schizophrenia and depression – not simply because it would help educate audiences, which it would, but because the result would be stories that are challenging, honest, and very, very timely. This Macbeth fell short of doing just that – very slightly short, thanks to a brilliant performance, but short nonetheless.
Filed under: news, theatre | Leave a Comment
Tags: Alan Cumming, Lincoln Center, Macbeth, schizophrenia, Shakespeare, theater, theatre
Earlier this year I took out indemnity insurance with Empire Health Choice, fulfilling a promise to myself to buy some sort of coverage after seven years of playing the numbers. Indemnity insurance is, in short, ‘emergency insurance’, popular among artists who can afford little else. It covers in-patient hospital care – provided, of course, that the hospital is within the company’s circle of friends. That means all the lab work must be done in the hospital’s lab and not outsourced to a facility that might do a better job of it; any specialists who deal with my case must be employed directly by the hospital; and, failing all other red tape, the insurance company – and not my physician – must deem whatever treatment I receive absolutely necessary. So, for example, if my physician says a hospital stay was necessary to set my broken leg, but Empire Health Choice insists my broken leg could’ve just as easily been set in the hospital parking lot, then heaven forbid Empire Health Choice cough up a single cent for a luxurious hospital stay, even if that hospital is among their circle of friends. (The paperwork the company sent when I first joined recommended that customers remember to ask, before receiving any treatment, whether their doctors are directly employed by the hospital. Here’s hoping that if the time comes when I need to use my plan, I’ll be lucid enough to ask for such important information and not too preoccupied with whether I’m going to live or die.)
When I signed up I thought the plan was better than nothing. But after six months of sending in checks for $181 and each time imagining how I’d have to instruct the ambulance driver to take me to an in-network hospital rather than the closest one, I was starting to feel jaded about my great desire to be among the privileged insured. Even when Obamacare received the stamp of approval from the Supreme Court this month, I felt about as thrilled as I would have had Tiffany’s announced it was slashing the prices on all its platinum pendants by 50%. Great that President Obama has been spared the humiliation of having what little was left of his healthcare reforms squashed by the highest judges in the land, all of whom I’m sure are fully covered in case of an emergency and would never have to ask whether the surgeon about to cut them open is on the hospital’s payroll. But what’s there to celebrate in having comprehensive coverage a little less unaffordable than it used to be, but still very much out of reach?
All of this leads me to the surprise I got in the mail last week. A letter showed up from Empire Health Choice, long before my next payment was due. I opened the letter, and a check tumbled out, a check made out to me. To me? From an insurance company? With no visit to the hospital, no arguing back and forth about whether or not my procedure was covered by my plan? This had to be a mistake. I quickly read the enclosed memo and learned that the Affordable Care Act requires insurance companies to issue a rebate to customers if the company does not spend at least 80% of customer premiums on health care services (i.e. doctors and hospital bills and ‘activities to improve healthcare quality’). Last year Empire Health Choice fell short by just over 10 percentage points, spending only 69.4% of the $3,767,992.20 it received in premiums. Yay for Obamacare! Yay for an unexpected check in the mail! Yay for an extra $28.44 to put in the bank! But as the thrill of the moment began to fade, I did a few calculations. Empire Health Choice raked in approximately $400,000 last year, not a cent of which went toward the cause of providing health care to its clients. What ‘activities to improve healthcare quality’ could that money have funded?
- Three years of weekly visits to a mental health practitioner (at $50 a visit) for the 48 people who committed suicide in New York last year.
- A year’s membership to a local YMCA for 400 families, including parents and their dependents aged under 18. (This is especially pertinent in Brooklyn and the Bronx, both ranked among the five unhealthiest counties in New York in 2011.)
- An annual teeth cleaning and oral exam, including four X-rays, for 2,000 adults.
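For the curious, the arithmetic is easy to check. Here is a quick sketch using the figures quoted in the rebate memo (the 80% threshold is the ACA medical-loss-ratio minimum; the therapy cost uses the $50-per-visit, weekly-for-three-years terms from the first item above):

```python
# Figures quoted in Empire Health Choice's rebate memo
total_premiums = 3_767_992.20   # premiums collected last year
spent_on_care = 0.694           # share actually spent on care
required_share = 0.80           # ACA medical-loss-ratio minimum

# The shortfall is what the law requires be returned to customers as rebates
shortfall = (required_share - spent_on_care) * total_premiums
print(f"${shortfall:,.2f}")     # roughly the $400,000 cited above

# First item on the list: weekly $50 therapy visits for 48 people, 3 years
therapy_cost = 48 * 52 * 3 * 50
print(f"${therapy_cost:,}")     # $374,400, comfortably within the shortfall
```

Even after covering three full years of weekly counseling for 48 people, roughly $25,000 of the shortfall would still be left over.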
It’s all too easy to find ‘activities to improve healthcare quality’ in New York. So why on earth should I be receiving a rebate in any amount? My premium could go toward the health of so many people – myself included – if the plan I was paying for prioritized care to any realistic degree. And I would happily fork over my $181 a month if it did. But for now, I’m paying for coverage in the unlikely event that I will find my way to an in-network hospital in an emergency, and once there, be in the frame of mind to investigate which anesthesiologist, physical therapist, surgeon or oncologist is on payroll at that hospital rather than which anesthesiologist, physical therapist, surgeon or oncologist is best qualified and readily available to treat my ailment.
I suppose the idea behind my suddenly becoming $28.44 richer this week was to urge Empire Health Choice to adjust its allotments to include more ‘activities to improve healthcare quality’ and fewer executive bonuses. And of course, it makes Obama look like Santa Claus in July, and who wouldn’t want to elect Santa for President? In the meantime I’ll spend my money on extra carrots and greens, and hope that by the time I do need to make a claim, the healthcare system will have evolved into something that puts people before profits. But I have the nagging feeling that revolution, and not evolution, is the only way this monstrous profit-making machine will ever change.
Filed under: news | Leave a Comment
Tags: Barack Obama, Empire Health Choice, health insurance, healthcare, indemnity insurance, Obamacare, rebate, single payer