Popular Posts Today

Room for Debate: Foreigners Behaving Badly

Written By Unknown on Friday, January 31, 2014 | 13:26

  • Ruben Navarrette Jr.

    Revoke Bieber's Visa

    Ruben Navarrette Jr., syndicated columnist

    It's obscene and not in keeping with the best traditions of this country to have one standard for the rich and famous, and another for everyone else.

  • Nisha Agarwal

    The Costs of Deportation Are Too High Already

    Nisha Agarwal, Immigrant Justice Corps

    Punishing green card holders for minor offenses will overwhelm our legal system, tear apart families, and cost our local and state governments time and money.

  • Mark Krikorian

    Lax Policies Create a Sense of Entitlement

    Mark Krikorian, Center for Immigration Studies

    Obama could show that he's committed to holding outsiders to our standards by sending Justin Bieber back home. But he won't.

  • Peter Spiro

    An Extraordinary Case That Highlights the Ordinary

    Peter Spiro, Temple University

    The Bieber episode highlights the low threshold for deportation that has haunted our immigration regime for almost 20 years.

  • Joseph H. Carens

    Let Settled Immigrants Stay

    Joseph H. Carens, author, "The Ethics of Immigration"

    Over time immigrants' lives become bound up with the place where they live, and it is wrong to take that away from them, even when they behave badly.



    Opinionator | Private Lives: Meetings with a Murderer

    Written By Unknown on Thursday, January 30, 2014 | 13:26

    Private Lives: Personal essays on the news of the world and the news of our lives.

    Of all the troubling events from my childhood, one of the most enduring remains the afternoon I visited a prisoner serving a life sentence for murder. It was 1978 and I was 9 years old, escorted to Western Penitentiary in Pittsburgh by my mother, who, compelled by a lifelong objective of raising her son's awareness of injustice in the world, no doubt considered this to be a well-suited occasion.

    The injustice, in this particular instance, was the framing of a 21-year-old black man named Stanton Story for the killing of a white Pittsburgh police officer. At the time of our visit, three years had passed since Mr. Story's trial, in which, despite having apparently been in North Carolina on the day of the shooting, he was found guilty by an all-white jury and sentenced to die in the electric chair. Almost three years later, however, the Pennsylvania Supreme Court granted Mr. Story a new trial (on the grounds that prejudicial evidence had been introduced at the first one) and, having recently declared the death penalty unconstitutional, set aside his death sentence. It was during the run-up to this second trial that the Socialist Workers Party, of which my mother was a dedicated member, began advocating on Mr. Story's behalf.

    My mother's substantial commitment, bordering on mania, had made Mr. Story the predominant subject of our household for months. I was acutely aware of the effort she had been expending, at all hours, on meetings, rallies, protests. "Do you dream of freedom?" she had written in one of her many letters to Mr. Story. Then, for fear that the guards would suspect this question to be a coded invitation to attempt escape, she had frantically whited it out.

    What I recall about that afternoon visit, several hours long, is mostly a feeling of dismay. Mr. Story was so pleasant, so courteous, diffident even, that the prospect of his spending the rest of his life in prison was not something I could fathom. I was also bored. The conversation between my mother and Mr. Story, which I was expected to sit through silently, revolved mainly around the particulars of a forthcoming carwash that would raise funds for his legal team. The visit ended abruptly with his being led away, but before the door closed behind him, he turned to wave a melancholy goodbye to my mother, who, standing beside me, gripping my hand, was cursing the guards under her breath and sobbing uncontrollably.

    At the new trial, Mr. Story was again found guilty by an all-white jury, and since the death penalty had been reinstated, sentenced a second time to the electric chair — a sentence that on appeal would once more be reduced to life in prison. My mother, mercifully, spared me the details, informing me only that he had "lost." What are we going to do now, I remember asking, because surely, given Mr. Story's innocence, and given my mother's unflagging determination, there was always something more to be done. But no, my mother said, this was it, there was nothing else we could do. So after that, we never mentioned his name again.

    But I never forgot him. Over the years, that final image of Mr. Story, looking back at us, would pop into my head at the most inopportune moments. Here I am playing basketball, I would think, and Stanton Story is still in prison. Here I am sitting on my new couch from Crate and Barrel, and Stanton Story is still in prison. Thus my mother's goal to raise my awareness of injustice in the world had been achieved. Achieved so effectively, in fact, that 30 years after that visit it occurred to me that I could contact Mr. Story, perhaps hear his account of the injustice done to him and, as with other wrongful convictions, help free him. If this sounds like a childish thought, that's because it is.

    After a series of unsuccessful phone calls and Internet searches, I finally learned that Mr. Story was now incarcerated at a supermax prison 50 miles south of Pittsburgh. In addition, I found a Twitter account, ostensibly set up for Mr. Story, listing a few unsettling tweets, including "HELP HELP HELP HELP PLEASE HELP."

    "Dear Mr. Story," I wrote, "I'm not sure if you'll remember me, but years ago I came to visit you one afternoon with my mother …"

    One week later came a reply. No, he did not remember me. He remembered my mother, though, "a very good and dear friend," whom he thought of often. He still had a photograph in his cell of the two of them, taken at one of her many visits. His grammar was occasionally off, but all things considered he wrote with elegance and optimism. "I guess you can say that I'm constantly trying to make the best out of a bad situation" was a refrain he would repeat in nearly every letter to me. He was still hoping for a new trial. He thought the prospects were good. We made plans for me to visit. He was excited to get started, to tell me his story. "I don't want to get too far ahead of myself," he wrote.

    In the meantime, I began to research his case. One of the first websites that I came across, though, was a memorial for slain police officers, which had dedicated a page to Patrick Wallace, the officer who had allegedly been killed by Mr. Story. Up to this point, I had never given much thought to Mr. Wallace. In fact, I had never given any thought to him. It occurred to me as I read that not only did I know very little about Mr. Wallace, but I also knew very little about any of the details of the case.

    I soon discovered some troubling things. I learned, for instance, that at his second trial, Mr. Story admitted he had lied about his alibi of being in North Carolina. He had been in Pittsburgh, at the scene of the shooting, but he insisted that it was his companion, a man named Richard Davis, who had fired the fatal shot. Moreover, I found that he had a long history of crime, beginning as a teenager. When he was 21, he was convicted on multiple counts of armed robbery and sent to Western Penitentiary. In prison, his behavior was so exemplary that he was granted a three-day furlough, but during those three days he robbed two banks and fled to North Carolina. A month later he returned to Pittsburgh, where he may or may not have shot and killed Patrick Wallace.

    All of this I was just beginning to process as I made my way from New York City to Waynesburg, Pa., to visit Mr. Story.

    I had made the mistake of skipping breakfast, partly out of poor planning, but mostly out of anxiety, so that by the time I arrived I was famished. He was waiting for me when I walked in, sitting patiently at a table in the visitor's room. He had gray in his beard. He was 57 years old now.

    We broke the ice by having a good laugh at the expense of the Socialist Workers Party, whose members, Mr. Story said, had disapproved when he told them that when he got out of prison he would buy a house. "We don't believe in private property," they had counseled. He spoke highly of my mother, however, and seemed to bear no ill will that she had fallen out of touch. For the next six hours we talked about his childhood, his life in prison, the improprieties in his two trials, the details of the crime.

    I was plagued by a sinking feeling that even if he were innocent — which he might very well be — there would be no way to prove it. His conviction appeared to hinge largely on the testimony of Mr. Wallace's partner, whose identifications of the two men at the scene were somewhat marred by ambiguities. No bullets had been found, which meant no gun could be connected to the crime. There seemed to be no hard evidence to prove either innocence or guilt. Still, he had been sentenced twice to the electric chair.

    Unable to feed myself, I fed Mr. Story. Fish sandwiches and green tea from the vending machine. I was surprised at how high his spirits remained. He described how years ago he had been shackled and transferred across the country by bus. "I looked out the window the entire time," he said. "It was the best week of my life."

    When our visit was over, we promised to keep in touch. We parted with hopes and expectations. I was aware that he was smiling at me when he went back to his cell.

    But the hopes and expectations were soon replaced by the monumental task that lay before me, as it had once lain before my mother. I visited with members of his family, who offered little in the way of assistance. I contacted his old lawyers, who never returned my calls. I spoke with legal experts who agreed that Mr. Story had some legitimate arguments in his favor but said that countless men and women suffered from inadequate counsel and an unfair trial.

    Meanwhile, Mr. Story and I wrote letters back and forth, going over the same thin material. I thought of letting my mother know that I had reconnected with Mr. Story, but as she was nearing 80 years old, I did not want her to have to contemplate those grave and ponderous issues of hopelessness and the passage of time.

    A year passed. Our letters became shorter. We began to write mostly about the Steelers. The space between sending and receiving letters grew longer. How long can a correspondence like this go on? Not long. Eventually I took so many months to respond that he never wrote back. Or perhaps it was the other way around. Either way, it came as a relief.

    Recently, while helping to organize some of my mother's things, I found a large envelope that was labeled "Stanton Story Letters." The envelope was thick, and I had the urge to open it and read what she had written to him — but I refrained. My mother had not been able to figure out how to keep up a correspondence with a man imprisoned for life. Thirty years later, neither could I.


    Saïd Sayrafiezadeh is the author of the short-story collection "Brief Encounters With the Enemy" and the memoir "When Skateboards Will Be Free."



    Room for Debate: Stability Versus Democracy in Egypt

    Written By Unknown on Tuesday, January 28, 2014 | 13:26

    Almost three years ago, Egyptians celebrated the end of six decades of military rule. But fury at the autocratic governance of Egypt's first democratically elected president, the Islamist Mohamed Morsi, led to his ouster six months ago by the military. Now many Egyptians are supporting the military again as its leaders authorize their commander to run for president.

    Is a military government actually the best recipe for stability and progress in Egypt?


    Opinionator | Measure for Measure: The Song Remains the Same (and Kind of Blue)

    Written By Unknown on Sunday, January 26, 2014 | 13:25

    If you grew up in the 1970s and you loved '70s rock as much as I did, then there is a good chance you saw the Led Zeppelin concert film, "The Song Remains the Same." The film documents a 1973 Zeppelin concert at Madison Square Garden and played in theaters throughout the late '70s and after. It became an almost instant cult classic, bringing my favorite rock band ever to life, with only a trip to the local theater.

    My melancholia probably explains why I prefer Led Zeppelin's acoustic 'Rain Song' to the blues jam 'Rock 'N' Roll.'

    I first saw the movie at a mall in Canton, Ohio, when I was in seventh grade. I watched it again last year on cable in my apartment in San Francisco, and the same scenes that resonated with me back then still moved me: Jimmy Page playing the hurdy-gurdy at a lake, and the John Paul Jones dream sequence scene during "No Quarter." It was a funny thing to realize that at 46 years old, I'm still in many ways the same person I was then.

    Like a lot of real-life experiences that influenced me, the movie ended up in one of my songs. This one is called — probably not surprisingly — "I Watched the Film 'The Song Remains The Same.'" I recorded it last year and the track is going to be on the new Sun Kil Moon record, "Benji," coming out next month. It's the sixth record I've made under the Sun Kil Moon moniker.

    Like many of my songs, the lyrics on this one start on a defined subject, then spiral into other areas and eventually return to the same place. This song reflects on some of my earliest memories, paying tribute not only to the inspiration of Led Zeppelin, but also to my grandmother, who passed away from lung cancer in 1976. During the last year of her life, the sight of her condition troubled me so much that I waited in the car during visits. I was sitting at our kitchen table in Ohio when my mom called to say she'd died. I was 9 years old, and for some strange reason I broke into hysterical laughter before running to my room and crying.

    There is also a reference to a person I have deep gratitude and respect for, Ivo Watts-Russell, the founder of 4AD Records. He signed my early band Red House Painters in 1992 and gave me my start in the music business. There was a myth that he and I had a falling out when I signed to Island Records in 1995, but we have always remained friends. I visited him this past year at his home outside Santa Fe.

    There is also a reference to an old friend of mine named Chris Waller, who smoked pot and fished and did the best Bon Scott impression I've ever heard (he could sing "Touch Too Much" and hit every note), but Chris got bumped off a moped and died when he was only 13. He was overdue to make an appearance in one of my songs, and he's in good company on this album.

    Though it's one of my least favorite memories, there is an odd reference to a kid I sucker punched on a playground when I was in elementary school (someone dared me). The incident always bothered me, and this is my very belated apology.

    The way this song drifts in and out of different realities and memories is a lot like the movies — weaving documentary, imagination and memory throughout, always coming back to the music.

    The main theme of this song is melancholia, something I've carried around since I was a kid, and it probably explains why I prefer Led Zeppelin's acoustic "Rain Song" to the blues jam "Rock 'N' Roll," or why the piano ballad "Changes" is my favorite Black Sabbath song, not "Iron Man." It's not deep depression I'm singing about, or even necessarily a bad feeling, but a state of being (an "unspecific sadness," as a friend of mine put it) that I believe is rooted in my early life experiences, some of which are mentioned in this song, and in others throughout the album.

    The feel of this track makes me think of Judy Collins's "Send In The Clowns" or Van Morrison's "Beside You." The music works at its own pace without a defined time signature, the vocals tossed off and falling wherever they land. The musicians who played on this track — including the drummer Steve Shelley of Sonic Youth and the keyboardist Owen Ashworth of Advance Base — had a big challenge trying to lock in the rhythm of the nylon-string guitar, which was intended to be in 6/8 time but didn't exactly turn out that way. The Bay Area singer Keta Bill got choked up during her backup vocals, saying, "This song is so sad."

    This song, like the rest of the album, is a thank-you to those who have inspired me along the way, and an apology to a few, as well.

    Mark Kozelek is an American singer and songwriter who has recorded more than 40 albums with his bands Red House Painters and Sun Kil Moon, and under his own name.



    Opinionator | The Stone: Should Pope Francis Rethink Abortion?

    Written By Unknown on Friday, January 24, 2014 | 13:25

    The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

    Pope Francis has raised expectations of a turn away from the dogmatic intransigence that has long cast a pall over the religious life of many Roman Catholics. His question "Who am I to judge?" suggested a new attitude toward homosexuality, and he is apparently willing to consider allowing the use of contraceptives to prevent sexually transmitted diseases. But his position on what has come to be the hierarchy's signature issue — abortion — seems unyielding. "Reason alone is sufficient to recognize the inviolable value of each single human life," he declared in his recent apostolic exhortation, "Evangelii Gaudium," adding: "Precisely because this involves the internal consistency of our message about the value of the human person, the church cannot be expected to change her position on this question."

    Revising the ban on abortion would not contradict the pope's overall commitment to the 'value of the human person.'

    I want to explore the possibility, however, that the pope might be open to significant revision of the absolute ban on abortion by asking what happens if we take seriously his claim that "reason alone is sufficient" to adjudicate this issue. What actually follows regarding abortion once we accept the "inviolable value of each single human life"? This appeal to rational reflection has been a central feature of the tradition of Catholic moral teaching. I put forward the following reflections in the spirit of this tradition.

    There is considerable rational basis for moral concern about abortions. In many (probably most) cases, it would be immoral to abort a pregnancy. (Note, however, that this by no means implies that most abortions actually performed are immoral.) Late-term fetuses, for example, are no different biologically or psychologically from babies born prematurely at the same stage of development. It's hard to see how killing a premature baby is immoral but killing an identical late-term fetus isn't. At a minimum, aborting a healthy late-term fetus would, except when the mother's life is at risk, be immoral — which is no doubt why it is seldom, if ever, done.

    Further, from conception on, an embryo or fetus is at least potentially human in the sense that, allowed to develop along its natural path, there is a human life ahead for it. As the philosopher Don Marquis has pointed out, one reason it's wrong to kill a human being is that, when you take a life, you take away a human future. The same is true when you kill a potential human being: All the human goods that it might have enjoyed are eliminated. At the very least, even early abortions for trivial reasons (e.g., not having to postpone a trip or pass up an athletic competition) would be immoral, even if not the "murder" of pro-life rhetoric.

    At the same time, the "inviolable value of each human life" does not imply that no abortion can be moral. Here the case of rape is especially relevant. It is hard to claim that a rape victim has a moral duty to bring to term a pregnancy forced on her by rape, even if we assume that there is a fully human person present from the moment of conception. We might admire someone who has the heroic generosity to do this, but talk of murder is out of place. As the philosopher Judith Jarvis Thomson has noted, if someone kidnapped you and connected your kidneys to those of someone who would die unless the connection were maintained for the next nine months, you would hardly be obliged to go along with this. How can we require a woman pregnant by a rapist to do essentially the same thing?


    Other exceptions to the condemnation of abortion arise once we realize that an early-stage embryo may be biologically human but still lack the main features — consciousness, self-awareness, an interest in the future — that underlie most moral considerations. An organism may be human by purely biological criteria, but still merely potentially human in the full moral sense. As we saw, Marquis's argument shows that killing a potential human is in itself bad, but there's no reason to think that we are obliged to preserve the life of a potential human at the price of enormous suffering by actual humans.

    Another point, seldom discussed, is that not even pro-life advocates consistently act on their belief that any embryo has full moral standing. As the philosopher Peter Smith has noted, they do not, for example, support major research efforts to prevent the miscarriages or spontaneous abortions (many so early that they aren't ordinarily detected) that occur in about 30 percent of pregnancies. If 30 percent of infants died for unknown reasons, we would all see this as a medical crisis and spend billions on research to prevent these deaths. The fact that pro-life advocates do not support an all-out effort to prevent spontaneous abortions indicates that they themselves recognize a morally relevant difference between embryos and human beings with full moral standing.

    There is, then, a strong case for thinking that abortions always bring about some bad results — at a minimum the loss of potential human life — and that for most pregnancies abortion would be morally wrong. But this conclusion is limited in two ways: A woman's right to control her reproductive life can, as in the case of rape, offset even a person's right to life; and at least at the earlier stages of pregnancy, the embryo has only the moral standing of potential, not actual, human life, which may be overridden by harm to humans with full moral standing.

    These limitations, I suggest, correspond to the "very difficult situations" (such as "rape" and "extreme poverty") in which the pope, in "Evangelii Gaudium," admitted the church has "done little to adequately accompany women." Allowing for exceptions to the moral condemnation of abortion in some of these painful situations would not contradict the pope's overall commitment to the "value of the human person." Rather, it would admit what reason shows: There are morally difficult issues about abortion that should be decided by conscience, not legislation. The result would be a church acting according to the pope's own stated standard: preaching not "certain doctrinal or moral points based on specific ideological options" but rather the gospel of love.


    Gary Gutting is a professor of philosophy at the University of Notre Dame, and an editor of Notre Dame Philosophical Reviews. He is the author of, most recently, "Thinking the Impossible: French Philosophy Since 1960" and writes regularly for The Stone.



    Room for Debate: Politics on the Strip

    "What happens in Vegas, stays in Vegas," is still the city's coy come-on. But as the city once known for mobsters, high stakes and showgirls becomes a destination for moms, steakhouses and shopping, the Republicans are considering holding their national convention there.

    Does it send a wrong message to nominate a president surrounded by roulette wheels and just down the road from legal brothels, or does it merely recognize that Vegas is as American as apple pie?


    Opinionator | Private Lives: Cousins, Across the Color Line

    Written By Unknown on Thursday, January 23, 2014 | 13:25

    Private Lives: Personal essays on the news of the world and the news of our lives.

    EL CERRITO, Calif. — I learned about her through the comments section of an article in Publishers Weekly. I had recently published a book of poems crafted out of family stories, and it had been written up, along with a brief interview. In the interview, I reckon with the complicated history of my family — I am a white descendant of Thomas Jefferson — and the fact that some of my ancestors were slave owners from 1670 until the Civil War.

    In the comments section, the woman, Gayle Jessup White, had written: "I am an African-American Jefferson descendant. My grandmother was a Taylor (although her mother didn't exactly marry into the family!), a direct descendant from J.C. Randolph Taylor and Martha Jefferson Randolph" — Thomas Jefferson's daughter. "Tess Taylor — I wonder if we share great-great-grandparents? The plot thickens."

    The story of Sally Hemings, a slave in the Jefferson household — and the children she most likely bore the third president — is by now widely accepted. That story has offered a chance for people descended from slave owners and those descended from enslaved people to begin to recognize their connections. But the situation, at least in my family, remains delicate. Some white Jefferson descendants have welcomed Hemings descendants. Others have not. Hemings descendants are not allowed to be buried in the family graveyard at Monticello, Jefferson's home, because despite increased evidence, there is, technically, room for scientific doubt. The doubt in turn points to great historical violence: Because it was not the custom of slave owners to name who fathered the mulatto children on their plantations, we have little documentary evidence that would constitute legal "proof" of our interrelationship.

    Yet the fact is that many so-called white and so-called black people in our country are actually deeply interrelated. It is highly likely that I have distant cousins I'll never know, people who'll never come to any family reunion. Historians have obsessed over Jefferson's possible liaisons, but slavery lasted many generations. Among his sons, grandsons, great-grandsons and great-great-grandsons, there were bound to be other liaisons and therefore other direct lineal descendants of Jefferson and enslaved people or domestic servants.

    I wrote to Gayle immediately. Frankly, I was delighted to get her note. I looked her up. I sent her an email. "Hey. It's Tess," I wrote. "Let's talk."

    A few weeks later, we did, long distance from California, where I live, to Virginia. Gayle's story includes oral history, a mysterious Bible and surnames that match a particular branch of the Taylors. The 1870 census shows Gayle's great-grandmother Rachael Robinson working as a servant in my great-great-great-grandfather's household. We can't know this, but it is highly probable that she was enslaved there earlier. Given what I do know about the way slavery was practiced in the Randolph-Jefferson-Taylor families, it is also highly possible that Gayle's family had been handed down and passed along among my family for generations. Our blood may indeed be quite thick.

    Gayle and I consulted Cinder Stanton, a leading Jefferson historian who has worked for decades to help reconstruct the genealogies of enslaved people at Monticello. Cinder's hunch was this: because of Gayle's oral history, and because the census shows Gayle's great-grandmother Rachael as unmarried but living with two children in close proximity to a man named Moncure Robinson Taylor, he was a likely father of Rachael's children.

    Gayle believes this; Cinder believes this. I think that whether it was Moncure or some other Taylor man, it's not unreasonable to assume that Gayle and I are cousins. But because of the state of DNA testing, which is most accurate on male-only lines, it's unlikely that we'll see a DNA test that proves this.

    Here is just one painful legacy of slavery that still sends tendrils out from the past. My family line is written down, legitimated. In letters housed at the University of Virginia library in Charlottesville, cousins gossip about one another for nearly two centuries. Meanwhile, if you're looking for information about hundreds — even thousands of people — enslaved in the Randolph families over those generations, you are mostly out of luck.

    Gayle and I confronted this together when we met with Cinder for lunch in Charlottesville last fall. The first thing I felt meeting Gayle was pleasure: She is warm, gracious, wry. And we have a lot in common: We have both worked as writers; we are interested in genealogy. But I also felt the strangeness of looking deeply into her eyes and face. Gayle is lovely — I'd be honored to look like her. Still, as we met, I thought about how I had not been taught to look into black faces for traces of family; the common thing that cousins do at reunions is not something I have much practice doing across racial lines.

    We were both dressed up — it was a lunch meeting in the South after all — and Gayle brought photos of her ancestors, her grandmother and her great-grandmother. I fumbled clumsily with my iPhone, realizing that I had pictures only of my son. Gayle seemed disappointed. I realized that my family photos were traces, clues she'd been hoping for.

    We talked about how Gayle could gather more evidence to support her long-held family story. I suggested she look at letters at the University of Virginia, sources I used for my research. "Are you looking for more proof?" I asked. I felt I saw her stiffen. I realized she thought I was doubting her.

    For a moment, frustration hung in the air. I wanted there to be more evidence, more physical clues. But by trying to talk about written documents, I seemed to be insisting on a kind of proof Gayle might never find.

    "You know," Gayle said firmly. "This is my story. This is my family's story. I don't really need any more proof."

    I looked at her Bible and photographs. I felt a knot in myself. Finally I spoke. "I am sorry," I said. "I do not have any more evidence to give you, but I also do not mean to sound as if I doubt you."

    What could I show Gayle, really? I told her about a church in Crozet, Va., on land that had been given by the Randolph family to some of their former slaves to start a church and school. "Descendants of original members still go there," I said. "Maybe somebody knows something."

    But I can't prove or disprove that her great-grandmother had children or a relationship with my great-great-grandfather's brother. I can't tell her what the contents of that relationship were. At one point, Gayle asked me if I was looking for absolution for what my family did, and we both agreed the word was imperfect. I think instead we are both looking for some present-tense reconciliation. We acknowledge our desire to feel connected to our shared history, and appreciate the fact that we can sit together, looking at the mystery of the past and trying to articulate what it means.

    After lunch, Gayle and I took a trip to the Monticello graveyard. I unlocked the gate. I took her in to see the stones — Jefferson's surely, but also John Charles Randolph Taylor's, my great-great-grandfather Bennett's, his brother Moncure's. We stood listening to the wind through the tulip poplar. We agreed to stay in touch.

    There is something radical in knowing Gayle. For what if we were to begin — all of us — to see each other as family? What if we looked into each other's eyes with recognition? For now, Gayle and I make sporadic phone calls. She sent me pralines for my birthday. We exchanged cards at Christmas. Over Martin Luther King Day weekend, we met up for dinner in Washington. I don't yet know what will happen, or all that we have to say to each other. We have only now come to the table. We are only now beginning to talk.

    At Monticello, Gayle and I had a picture taken together just outside the graveyard's locked gate. When we met, I couldn't see the resemblance. But later, looking at the shot, I saw us both squinting up at the camera. When I look back, I see it: We are looking out with a mix of confusion and wonder. We are wearing the same quizzical face.


    Tess Taylor is the author of a collection of poetry, "The Forage House."



    Opinionator | Draft: How I Stopped Procrastinating

    Written By Unknown on Tuesday, January 21, 2014 | 13:26

    Draft is a series about the art and craft of writing.

    I learned something about a year ago that I wish I'd learned much sooner. And it happened only after I woke up one morning and couldn't walk. An X-ray revealed that my hip cartilage had made a unilateral decision to jump ship. I liked my hip cartilage. I thought we'd be together forever.

    Of course, as often happens with a bad divorce, problems had been brewing for years. I'd been ignoring increasingly strong leg pains because I thought their origin might be emotional. In my early 20s I'd had asthma so intense that I couldn't walk up a flight of stairs, only to have all the symptoms disappear entirely at the age of 26 after breaking up with a boyfriend. The finer points of this lesson were not lost on me.

    After all, I was raised in a family with a rich history of psychosomatic ailments. When the leg pains started, I eschewed the medical establishment in favor of talking to a shrink and also consulting a cryptic, sometimes baffling homeopathic "doctor" who favored meridian readings and dietary advice. He said my hips were fine, told me to eat more red meat and avoid summer fruits. Plums, peaches and melons were like eating poison, he explained solemnly. Obediently, I eliminated them from my shopping list. He sent me home with a jar of topical hormone cream to add to the 40-some-odd bottles of essential vitamins, minerals and food supplements he'd sold me over the years.

    Turned out I had to take time out from immortality to get both of my hips replaced. I was hardly a font of good cheer during the six-week wait for surgery, when I couldn't make the short trek from my bedroom to my bathroom without a walker. So there I was: crippled and forced to rethink my daily schedule. No more going to the gym or walking the dogs. My usual morning ritual of coffee and the newspaper became physically impossible since I couldn't walk out to the front yard to retrieve The New York Times. Now I suddenly couldn't even carry that cup of coffee from the kitchen to the office because I had no hand free during transit.

    Things seemed pretty bleak until I accidentally stumbled upon something astonishing: I learned how not to hate writing. In this new and more difficult morning paradigm, I found myself wide-awake at 6 a.m. with no paper, no coffee and no scheduled distractions. I am unable to tolerate anchor people smiling and talking at the same time, so morning television was out. I was left so desperate for an activity that I decided to pursue a little writing.

    I had a vague idea for a play that I had tried to begin many times at 3 in the afternoon. Each time my efforts were thwarted by the tyrannical voices in my head, which grew louder as the hour grew later, berating me for not taking care of bills, cleaning, shopping, grooming, pet care, more bills, more grooming. And if I got caught up on those things, the voices would quickly remind me that I was too ill informed to begin writing even a personal anecdote without undertaking years of painstaking research. A constant feeding of this negativity cyclone would put me in such a state of anxiety that I'd start reflexively checking Internet headlines in search of an environmental catastrophe or a massacre of some kind to help me refocus my anguish.

    Of course, along the way, whenever I encountered a slide show titled "Eight Diet Foods That Pack on the Pounds" or "Celebrity Fashion Fails," I'd have to stop and investigate because hey, it might be information I'd need in some unforeseeable future where I had become, for some reason, a fat celebrity. And before I knew it: uh-oh. Sunset: time for cocktails and falling asleep. Years of behaving like this convinced me that I'd do anything in my power to avoid writing. And I say this as someone who has earned a living as a writer for 25 years and used to imagine punching myself in the face, then wrestling myself into a chair before I agreed to start working. I took comfort in the rich tradition of legendary literary figures who hated to write. One of my heroes, Dorothy Parker, proudly said: "I hate writing. I love having written." As I understood it, this was a problem that came with the territory.

    And if that wasn't evidence enough, there was also all that biological information about the two hemispheres of the brain to prove there could be no cure. Sure, the whole brain participated in most brain function, but art and music were known to be anchored mostly on the right, where they happily partied with "euphoria" and "intuitive flow." When I used to paint, I was always impressed by how I would be transported to some floaty, nirvana-like state, only to wonder six hours later where all the time had gone. Not so with writing, which takes place in the left brain, where it's stuck in an airless waiting room suffocating with all the other poorly dressed, anal-retentive and under-loved superpowers we rely on for most of life's homework: organizing, structuring, analysis, logic, math, science.

    But back to my revelation: When I tried writing at 6 a.m., to my complete surprise I effortlessly wrote 15 pages that first day. The same thing happened when I did it the next day and the day after that. And so it came to pass that in the six weeks before my surgery and in the weeks that followed, I actually enjoyed writing a first draft of my play.

    Here's what I learned: First thing in the morning, before I have drowned myself in coffee, while I still have that sleepy brain I used to believe was useless — that is the best brain for creative writing. Words come pouring out easily while my head still feels as if it is full of ground fog, wrapped in flannel and gauze, and surrounded by a hive of humming, velvety sleep bees.

    And I have a theory about why. Although I am not really known for my scrupulous neurological research and have only personal experience to back me up, what I believe may be happening is that before you are fully awake, your right brain continues to dominate for a while, allowing you access to the pleasant sensation of right brain creativity. Or maybe it's because during the phase of sleep known as REM the brain becomes more active while at the same time the muscles of the body become more relaxed. And since tests have shown that REM periods become more prolonged as we progress toward waking, maybe a bit of the old "active mind/relaxed body" afterglow lingers on in a helpful way. (Note to Nobel Committee: I can be contacted through my website. I'm also on Twitter.)

    Conversely, the relentlessly negative voice that comes from your critical parent seems to be a left brain resident and doesn't like to wake up too early. (And by the way, there is no faster way to bring that demonic harpy voice to consciousness than restless Internet activity. Once you start randomly interacting with anything on the Internet, the delightful relaxation of the right brain has better taste than to stick around. Soon you will have no one but yourself to blame for the large amount of time you spend contemplating Jennifer Aniston's favorite summer outfits.)

    I'm happy to report that these days I can walk just fine again. But having learned this lesson, I now get up in the morning and immediately start writing. I recommend using a pen as often as possible because it seems to maintain the right brain connection better. I am only trying to help you start the writing process. I can't guarantee your ideas will be good or reduce the need for endless rewrites. Now if you'll excuse me, I'm at the end of my four hours. I'm going back to sleep.


    Merrill Markoe is a humorist and the author of nine books.

    A version of this article appears in print on 01/19/2014, on page SR9 of the New York edition with the headline: How I Stopped Procrastinating.


    Opinionator | The Stone: Fifty States of Fear

    The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

    The British philosopher Bertrand Russell, writing as World War II was drawing to a close in Europe, observed that "neither a man nor a crowd nor a nation can be trusted to act humanely or to think sanely under the influence of a great fear." Russell's point was that irrational fear can propel us into counterproductive activities, ranging from unjust wars and the inhumane treatment of others to more mundane cases like our failure to seize opportunities to improve our everyday lives.

    Just like authoritarian states, democracies can use fear to exert control over the populace and consolidate power.

    It is hard to dispute Russell's claim. We all know that fear can impair our judgment. We have passed up opportunities in our personal lives and we have also seen groups and nations do great harm and unravel because of their irrational fears. The 20th century was littered with wars and ethnic cleansings that were propelled in large measure by fear of a neighboring state or political or ethnic group. Given this obvious truth, one might suppose that modern democratic states, with the lessons of history at hand, would seek to minimize fear — or at least minimize its effect on deliberative decision-making in both foreign and domestic policy.

    But today the opposite is frequently true. Even democracies founded in the principles of liberty and the common good often take the path of more authoritarian states. They don't work to minimize fear, but use it to exert control over the populace and serve the government's principal aim: consolidating power.

    Philosophers have long noted the utility of fear to the state. Machiavelli notoriously argued that a good leader should induce fear in the populace in order to control the rabble.

    Hobbes in "The Leviathan" argued that fear effectively motivates the creation of a social contract in which citizens cede their freedoms to the sovereign. The people understandably want to be safe from harm. The ruler imposes security and order in exchange for the surrender of certain public freedoms. As Hobbes saw it, there was no other way: Humans, left without a strong sovereign leader controlling their actions, would degenerate into mob rule. It is the fear of this state of nature — not of the sovereign per se, but of a world without the order the sovereign can impose — that leads us to form the social contract and surrender at least part of our freedom.

    Most philosophers have since rejected this Hobbesian picture of human nature and the need for a sovereign. We have learned that democratic states can flourish without an absolute ruler. The United States of America was the original proof of concept of this idea: Free, self-governing people can flourish without a sovereign acting above the law. Even though the United States has revoked freedoms during wartime (and for some groups in peacetime), for most of its history the people have not been under the yoke of an all-powerful sovereign.

    However, since 9/11 leaders of both political parties in the United States have sought to consolidate power by leaning not just on the danger of a terrorist attack, but on the fact that the possible perpetrators are frightening individuals who are not like us. As President George W. Bush put it before a joint session of Congress in 2001: "They hate our freedoms: our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other." Last year President Obama brought the enemy closer to home, arguing in a speech at the National Defense University that "we face a real threat from radicalized individuals here in the United States" — radicalized individuals who were "deranged or alienated individuals — often U.S. citizens or legal residents."

    Even the founder of Blackwater Worldwide, the military contractor, says that the American security state has gone too far.

    The Bush fear-peddling is usually considered the more extreme, but is it? The Obama formulation puts the "radicalized individuals" in our midst. They could be American citizens or legal residents. And the subtext is that if we want to catch them we need to start looking within. The other is among us. The pretext for the surveillance state is thus established.

    And let there be no mistake about the consolidation of power in the form of the new surveillance state. Recent revelations by Edward Snowden have shown an unprecedented program of surveillance, both worldwide and of the American population. Even Erik Prince, the founder of the private military contractor Blackwater Worldwide, thinks the security state has gone too far:

    America is way too quick to trade freedom for the illusion of security. Whether it's allowing the N.S.A. to go way too far in what it intercepts of our personal data, to our government monitoring of everything domestically and spending way more than we should. I don't know if I want to live in a country where lone wolf and random terror attacks are impossible 'cause that country would look more like North Korea than America.

    The widespread outrage over the new surveillance state has been great enough that President Obama announced on Friday that he would scale back some of its programs, but he remained strident in his overall support for aggressive surveillance.

    The interesting thing about the security measures that are taken today is that they provide, as Prince puts it, the "illusion of security"; another way to put it is that they provide "security theater." Or perhaps it is actually a theater of fear.

    During the George W. Bush administration we were treated to the color-coded terror threat meter. It was presented as a way to keep us secure, but constantly wavering between orange and red, it was arguably a device to remind us to be fearful. The same is true of the elaborate Transportation Security Administration screenings at airports. Security experts are clear that these procedures are not making us safe, and that they are simply theater. The only question is whether the theater is supposed to make us feel safer or whether it is actually intended to remind us that we are somehow in danger. The security expert Bruce Schneier suggests it is the latter:

    By sowing mistrust, by stripping us of our privacy — and in many cases our dignity — by taking away our rights, by subjecting us to arbitrary and irrational rules, and by constantly reminding us that this is the only thing between us and death by the hands of terrorists, the T.S.A. and its ilk are sowing fear. And by doing so, they are playing directly into the terrorists' hands.

    The goal of terrorism is not to crash planes, or even to kill people; the goal of terrorism is to cause terror. … But terrorists can only do so much. They cannot take away our freedoms. They cannot reduce our liberties. They cannot, by themselves, cause that much terror. It's our reaction to terrorism that determines whether or not their actions are ultimately successful. That we allow governments to do these things to us — to effectively do the terrorists' job for them — is the greatest harm of all.

    As the Norwegian philosopher Lars Svendsen notes in his book "A Philosophy of Fear," Hobbes already anticipated the need for the sovereign to manipulate our fears. The state, Svendsen writes, "has to convince the people that certain things should be feared rather than others, since the people will not, just like that, fear what is appropriate from the point of view of the state. Hobbes points out that this can necessitate a certain amount of staging by the state, which magnifies certain phenomena and diminishes others."

    We are conditioned to fear persons in caves in Pakistan but not fatal industrial accidents or the work-related deaths of thousands of Americans every year.

    One way in which our fears can be manipulated by the government is to lead us to fear the lesser danger. Schneier provides a simple example of this: 9/11 caused people to irrationally fear air travel and led them to the much more dangerous route of traveling in automobiles.

    Another example of this misdirection of fear came with the Boston Marathon bombings on April 15, when the Boston Police Department effectively imposed martial law, seizing control of people's homes and using them as command posts in the effort to apprehend the perpetrators. The bombings were terrible (three people died and more than 260 were injured), but just two days later another terrible thing happened: a giant explosion in a fertilizer plant in Texas killed at least 14 people and injured more than 160. For a moment we held our collective breath. Could it have been terrorists?

    When we learned that it was probably an accident caused by the ignition of stored ammonium nitrate, a collective sigh of relief was heard, and then not another word about the event. But why? And what if the explosion in that factory was part of a larger problem of industrial safety? In fact, according to a report by the United States Congressional Research Service, thousands of industrial facilities across the country risk similar harm to nearby populations.

    Meanwhile, 300,000 residents of West Virginia were without safe drinking water last week after 7,500 gallons of 4-methylcyclohexane methanol leaked into the Elk River from an industrial storage tank at a plant owned by a company called Freedom Industries. Few, if any, of the Sunday TV talk shows discussed the matter, but imagine the fear that would have been peddled on those shows if terrorists had poisoned the water of those 300,000 Americans. Of course the danger is the same whether the cause is terrorism or corporate indifference and malfeasance.

    Dangers are not limited to large scale events. In 2012, according to the Bureau of Labor Statistics, 4,383 workers were killed on the job, and it has been at this level or higher since 9/11. In other words, we suffer a 9/11 every year in terms of workplace fatalities.

    But the problem is not limited to workplace deaths. The A.F.L.-C.I.O. estimates another 50,000 die every year from occupational diseases. And none of this accounts for the thousands of workers who are permanently disabled each year.

    In total, 54,000 Americans die every year due to work-related illnesses and accidents. This is the equivalent of 148 deaths each day; in terms of fatalities it is roughly a Boston Marathon bombing every half hour of every day.
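    The equivalence is simple arithmetic; here is a quick back-of-the-envelope check using the figures cited above (54,000 deaths a year; three deaths in the Boston bombing):

    \[
    \frac{54{,}000\ \text{deaths/year}}{365\ \text{days/year}} \approx 148\ \text{deaths/day},
    \qquad
    \frac{148\ \text{deaths/day}}{3\ \text{deaths per bombing}} \approx 49\ \text{bombings/day},
    \]
    \[
    \frac{24 \times 60\ \text{minutes/day}}{49\ \text{bombings/day}} \approx 29\ \text{minutes between bombings}.
    \]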

    But while we spend more than $7 billion a year on the T.S.A.'s national security theater, in which more than 58,000 T.S.A. employees make sure we are not carrying too much toothpaste or shampoo onto airplanes, the budget for the Occupational Safety and Health Administration is under $600 million per year. It seems that our threat assessments are flawed.

    We are conditioned to fear persons in caves in Pakistan but not the destruction of our water supply by frackers, massive industrial accidents, climate change or the work-related deaths of 54,000 American workers every year. Fear of outside threats has led us to ignore the more real dangers from within.

    Fear has also driven us to wage a "war on terror" that, as the political writer Jeremy Scahill has shown in his book "Dirty Wars," creates still more enemies. As Scahill describes it, United States Special Forces kill lists that once named seven targets grew to lists of hundreds, which in turn gave rise to the lists of thousands we have today. Has it not occurred to the United States that drone strikes and assassinations create more terrorists than they neutralize? Perhaps it has, but the calculation has been made that it does not matter: the newly minted enemies can be used to gin up more fear and more restrictions on our freedoms, and so the cycle goes. One might argue that the United States has become a government of fear, by fear and, ultimately, for fear.

    Obama's drone wars also arise from Hobbesian assumptions about society: that the sovereign, enlisted to impose order, stands above the law and is free to do whatever is in his power to impose that order. If the United States must be in charge of providing order in the world, then its sovereign is above the law. Here lie the roots of so-called American exceptionalism.

    Svendsen describes the dynamic thus: "The social contract is absolutely binding on all citizens, but the sovereign himself is not subject to the contract that he undertakes to guarantee. Similarly, the U.S. is conceived as being the guarantor of a civilized world, as the country that can maintain moral order, but that stands outside this order." Fear is driving the United States to believe it is above the law.

    Fear is even used to prevent us from questioning the decisions supposedly being made for our safety. The foundation of this approach in our government can be traced back to the burning rubble of the World Trade Center, and it is exemplified by this statement by John Ashcroft, then the attorney general of the United States, in December 2001: "To those who scare peace-loving people with phantoms of lost liberty, my message is this: Your tactics only aid terrorists, for they erode our national unity and diminish our resolve. They give ammunition to America's enemies, and pause to America's friends."

    As Svendsen points out, Ashcroft's reasoning is straight out of the playbook of the German legal philosopher Carl Schmitt, who was notorious for defending Hitler's extrajudicial killings of his political enemies. Schmitt, too, felt that national unity was critical and that liberty should be subordinated to safety. Svendsen writes:

    A political act consists in maintaining one's own existence and destroying those that threaten it, and there is little room for overcoming conflicts via discussion. Such political action is the sole right of the state, and in order to maintain itself the state must also eliminate all enemies within, that is, all those who do not fit into a homogeneous unity. Every genuine political theory, according to Schmitt, must assume that man is evil, that man is a dangerous being. It is here, in the fear of what humans can do to each other, that the state finds the justification of its own existence — the ability of the state to protect one is the argument for submitting to it.

    Fear is a primal human state. From childhood on, we fear the monsters of our imaginations, lurking in dark closets, under beds, in deserted alleyways; now we also fear monsters in the deserts of Yemen and the mountains of Pakistan. But perhaps it is possible to pause and subdue our fears by carefully observing reality, just as we might advise for calming and comforting a fear-stricken child. We might find that, in reality, the more immediate danger to our democratic society comes from those who lurk in the halls of power in Washington and other national capitals and who manipulate our fears to their own ends.

    What are these ends? They are typically the protection of moneyed interests. In 1990, the Secretary of State James Baker tried to make the case for the first Gulf War on economic grounds. "The economic lifeline of the industrial world," he said, "runs from the gulf and we cannot permit a dictator such as this to sit astride that economic lifeline."

    That rationale, although honest, did not resonate with the American people — it hardly seemed to justify war. The George W. Bush administration abandoned the economic justification and turned to fear as a motivator. We were told that Saddam Hussein had weapons of mass destruction. If we did not act against him, the national security adviser Condoleezza Rice argued, the next thing we would see might be a "mushroom cloud."

    This playbook of fear has not been limited to motivating military actions. Environmentalists, once ridiculed as "tree-huggers," are now often characterized as "environmental terrorists," individuals we should fear and neutralize. The hacktivist Jeremy Hammond, who exposed the nefarious dealings of the private intelligence corporation Stratfor and its clients, was characterized as someone seeking to cause "mayhem" by Federal District Judge Loretta Preska when she sentenced him to 10 years in prison.

    In each case, the images of mushroom clouds, environmental terrorists and agents of mayhem were used to justify actions that would otherwise seem excessive, all in the service of protecting corporate interests.

    Whatever their motivation, by using fear to induce the rollback of individual rights, politicians, judges and lawmakers are working against the hard-won democratic principles and ideals that we and other democracies have defended for almost 250 years. They are manipulating our fears to undo centuries of democratic reform. And it doesn't matter if the empowered leader is called a king or a prime minister or a president; the end result is that fear has been used to place us back under the yoke of Hobbes's sovereign and Machiavelli's prince.

    Yet ultimately we are not powerless. We can resist the impulse to be afraid. We may not at the moment have answers to the very real dangers that we face in this world, but we can begin to identify those dangers and seek solutions once we overcome our fear. Or as Bertrand Russell rather more elegantly put it, as World War II was drawing to a close, "to conquer fear is the beginning of wisdom."

    Peter Ludlow, a professor of philosophy at Northwestern University, writes frequently on digital culture, hacktivism and the surveillance state.


    This post has been revised to reflect the following correction:

    Correction: January 20, 2014

    An earlier version of this article misstated the estimated budget of the Occupational Safety and Health Administration. It is less than $600 million, not less than $600,000.


    Room for Debate: How to Compensate Victims of Terrorism

    After each national tragedy, the American people ask: How can we help? But the answer changes each time. After the terrorist attacks in 2001, Congress set up a taxpayer-financed fund to compensate victims and their families – and to prevent lawsuits against the airlines. After the Newtown shootings in 2012, nonprofit groups distributed (or did not distribute) donations. After the Boston Marathon bombing last year, a single fund collected contributions, and its administrator decided how to divide them up.

    Should the U.S. government compensate victims of mass violence like bombings and shooting sprees? Or should compensation come from other sources, like lawsuits and private donations?


    Opinionator | Draft: How I Stopped Procrastinating

    Written By Unknown on Minggu, 19 Januari 2014 | 13.25

    Draft is a series about the art and craft of writing.

    I learned something about a year ago that I wish I'd learned much sooner. And it happened only after I woke up one morning and couldn't walk. An X-ray revealed that my hip cartilage had made a unilateral decision to jump ship. I liked my hip cartilage. I thought we'd be together forever.

    Of course, as often happens with a bad divorce, problems had been brewing for years. I'd been ignoring increasingly strong leg pains because I thought their origin might be emotional. In my early 20s I'd had asthma so intense that I couldn't walk up a flight of stairs, only to have all the symptoms disappear entirely at the age of 26 after breaking up with a boyfriend. The finer points of this lesson were not lost on me.

    After all, I was raised in a family with a rich history of psychosomatic ailments. When the leg pains started, I eschewed the medical establishment in favor of talking to a shrink and also consulting a cryptic, sometimes baffling homeopathic "doctor" who favored meridian readings and dietary advice. He said my hips were fine, told me to eat more red meat and avoid summer fruits. Plums, peaches and melons were like eating poison, he explained solemnly. Obediently, I eliminated them from my shopping list. He sent me home with a jar of topical hormone cream to add to the 40-some-odd bottles of essential vitamins, minerals and food supplements he'd sold me over the years.

    Turned out I had to take time out from immortality to get both of my hips replaced. I was hardly a font of good cheer during the six-week wait for surgery, when I couldn't make the short trek from my bedroom to my bathroom without a walker. So there I was: crippled and forced to rethink my daily schedule. No more going to the gym or walking the dogs. My usual morning ritual of coffee and the newspaper became physically impossible since I couldn't walk out to the front yard to retrieve The New York Times. Now I suddenly couldn't even carry that cup of coffee from the kitchen to the office because I had no hand free during transit.

    Things seemed pretty bleak until I accidentally stumbled upon something astonishing: I learned how not to hate writing. In this new and more difficult morning paradigm, I found myself wide-awake at 6 a.m. with no paper, no coffee and no scheduled distractions. I am unable to tolerate anchor people smiling and talking at the same time, so morning television was out. I was left so desperate for an activity that I decided to pursue a little writing.

    I had a vague idea for a play that I had tried to begin many times at 3 in the afternoon. Each time my efforts were thwarted by the tyrannical voices in my head, which grew louder as the hour grew later, berating me for not taking care of bills, cleaning, shopping, grooming, pet care, more bills, more grooming. And if I got caught up on those things, the voices would quickly remind me that I was too ill informed to begin writing even a personal anecdote without undertaking years of painstaking research. A constant feeding of this negativity cyclone would put me in such a state of anxiety that I'd start reflexively checking Internet headlines in search of an environmental catastrophe or a massacre of some kind to help me refocus my anguish.

    Of course, along the way, whenever I encountered a slide show titled "Eight Diet Foods That Pack on the Pounds" or "Celebrity Fashion Fails," I'd have to stop and investigate because hey, it might be information I'd need in some unforeseeable future where I had become, for some reason, a fat celebrity. And before I knew it: uh-oh. Sunset: time for cocktails and falling asleep. Years of behaving like this convinced me that I'd do anything in my power to avoid writing. And I say this as someone who has earned a living as a writer for 25 years and used to imagine punching myself in the face, then wrestling myself into a chair before I agreed to start working. I recited the long list of legendary literary figures who hated to write. One of my heroes, Dorothy Parker, proudly said: "I hate writing. I love having written." As I understood it, this was a problem that came with the territory.

    And if that wasn't evidence enough, there was also all that biological information about the two hemispheres of the brain to prove there could be no cure. Sure, the whole brain participated in most brain function, but art and music were known to be anchored mostly on the right, where they happily partied with "euphoria" and "intuitive flow." When I used to paint, I was always impressed by how I would be transported to some floaty, nirvana-like state, only to wonder six hours later where all the time had gone. Not so with writing, which takes place in the left brain, where it's stuck in an airless waiting room suffocating with all the other poorly dressed, anal-retentive and under-loved superpowers we rely on for most of life's homework: organizing, structuring, analysis, logic, math, science.

    But back to my revelation: When I tried writing at 6 a.m., to my complete surprise I effortlessly wrote 15 pages that first day. The same thing happened when I did it the next day and the day after that. And so it came to pass that in the six weeks before my surgery and in the weeks that followed, I actually enjoyed writing a first draft of my play.

    Here's what I learned: First thing in the morning, before I have drowned myself in coffee, while I still have that sleepy brain I used to believe was useless — that is the best brain for creative writing. Words come pouring out easily while my head still feels as if it is full of ground fog, wrapped in flannel and gauze, and surrounded by a hive of humming, velvety sleep bees.

    And I have a theory about why. Although I am not really known for my scrupulous neurological research and have only personal experience to back me up, what I believe may be happening is that before you are fully awake, your right brain continues to dominate for a while, allowing you access to the pleasant sensation of right brain creativity. Or maybe it's because during the phase of sleep known as REM the brain becomes more active while at the same time the muscles of the body become more relaxed. And since tests have shown that REM periods become more prolonged as we progress toward waking, maybe a bit of the old "active mind/relaxed body" afterglow lingers on in a helpful way. (Note to Nobel Committee: I can be contacted through my website. I'm also on Twitter.)

    Conversely, the relentlessly negative voice that comes from your critical parent seems to be a left brain resident and doesn't like to wake up too early. (And by the way, there is no faster way to bring that demonic harpy voice to consciousness than restless Internet activity. Once you start randomly interacting with anything on the Internet, the delightful relaxation of the right brain has better taste than to stick around. Soon you will have no one but yourself to blame for the large amount of time you spend contemplating Jennifer Aniston's favorite summer outfits.)

    I'm happy to report that these days I can walk just fine again. But having learned this lesson, I now get up in the morning and immediately start writing. I recommend using a pen as often as possible because it seems to maintain the right brain connection better. I am only trying to help you start the writing process. I can't guarantee your ideas will be good or reduce the need for endless rewrites. Now if you'll excuse me, I'm at the end of my four hours. I'm going back to sleep.


    Merrill Markoe is a humorist and the author of nine books.

    A version of this article appears in print on 01/19/2014, on page SR9 of the New York edition with the headline: How I Stopped Procrastinating.

    Opinionator | The Great Divide: What Happens When the Poor Receive a Stipend?

    Growing up poor has long been associated with reduced educational attainment and lower lifetime earnings. Some evidence also suggests a higher risk of depression, substance abuse and other diseases in adulthood. Even for those who manage to overcome humble beginnings, early-life poverty may leave a lasting mark, accelerating aging and increasing the risk of degenerative disease in adulthood.

    Today, more than one in five American children live in poverty. How, if at all, to intervene is almost invariably a politically fraught question. Scientists interested in the link between poverty and mental health, however, often face a more fundamental problem: a relative dearth of experiments that test and compare potential interventions.

    So when, in 1996, the Eastern Band of Cherokee Indians in North Carolina's Great Smoky Mountains opened a casino, Jane Costello, an epidemiologist at Duke University Medical School, saw an opportunity. The tribe elected to distribute a proportion of the profits equally among its 8,000 members. Professor Costello wondered whether the extra money would change psychiatric outcomes among poor Cherokee families.

    When the casino opened, Professor Costello had already been following 1,420 rural children in the area, a quarter of whom were Cherokee, for four years. That gave her a solid baseline measure. Roughly one-fifth of the rural non-Indians in her study lived in poverty, compared with more than half of the Cherokee. By 2001, when casino profits amounted to $6,000 per person yearly, the number of Cherokee living below the poverty line had declined by half.

    The poorest children tended to have the greatest risk of psychiatric disorders, including emotional and behavioral problems. But just four years after the supplements began, Professor Costello observed marked improvements among those who moved out of poverty. The frequency of behavioral problems declined by 40 percent, nearly reaching the risk of children who had never been poor. Already well-off Cherokee children, on the other hand, showed no improvement. The supplements seemed to benefit the poorest children most dramatically.

    When Professor Costello published her first study, in 2003, the field of mental health remained on the fence over whether poverty caused psychiatric problems, or psychiatric problems led to poverty. So she was surprised by the results. Even she hadn't expected the cash to make much difference. "The expectation is that social interventions have relatively small effects," she told me. "This one had quite large effects."

    She and her colleagues kept following the children. Minor crimes committed by Cherokee youth declined. On-time high school graduation rates improved. And by 2006, when the supplements had grown to about $9,000 yearly per member, Professor Costello could make another observation: The earlier the supplements arrived in a child's life, the better that child's mental health in early adulthood.

    She'd started her study with three cohorts, ages 9, 11 and 13. When she caught up with them as 19- and 21-year-olds living on their own, she found that those who were youngest when the supplements began had benefited most. They were roughly one-third less likely to develop substance abuse and psychiatric problems in adulthood, compared with the oldest group of Cherokee children and with neighboring rural whites of the same age.

    Cherokee children in the older cohorts, who were already 14 or 16 when the supplements began, on the other hand, didn't show any improvements relative to rural whites. The extra cash evidently came too late to alter these older teenagers' already-established trajectories.

    What precisely did the income change? Ongoing interviews with both parents and children suggested one variable in particular. The money, which amounted to between one-third and one-quarter of poor families' income at one point, seemed to improve parenting quality.

    Vickie L. Bradley, a tribe member and tribal health official, recalls the transition. Before the casino opened and supplements began, employment was often sporadic. Many Cherokee worked "hard and long" during the summer, she told me, and then hunkered down when jobs disappeared in the winter. The supplements eased the strain of that feast-or-famine existence, she said. Some used the money to pay a few months' worth of bills in advance. Others bought their children clothes for school, or even Christmas presents. Mostly, though, the energy once spent fretting over such things was freed up. That "helps parents be better parents," she said.

    A parallel study at the University of North Carolina at Chapel Hill also highlights the insidious effect of poverty on parenting. The Family Life Project, now in its 11th year, has followed nearly 1,300 mostly poor rural children in North Carolina and Pennsylvania from birth. Scientists quantify maternal education, income and neighborhood safety, among other factors. The stressors work cumulatively, they've found. The more they bear down as a whole, the more parental nurturing and support, as measured by observers, declines.

    By age 3, measures of vocabulary, working memory and executive function show an inverse relationship with the stressors experienced by parents.

    These skills are thought important for success and well-being in life. Maternal warmth can seemingly protect children from environmental stresses, however; at least in these communities, parenting quality seems to matter more to a child than material circumstances. On the other hand, few parents managed high levels of nurturing while also experiencing great strain. All of which highlights an emerging theme in this science: Early-life poverty may harm, in part, by warping and eroding the bonds between children and caregivers that are important for healthy development.

    Evidence is accumulating that these stressful early-life experiences affect brain development. In one recent study, scientists at the Washington University School of Medicine in St. Louis followed 145 preschoolers between 3 and 6 years of age for up to 10 years, documenting stressful events — including deaths in the family, fighting and frequent moves — as they occurred. When they took magnetic resonance imaging scans of subjects' brains in adolescence, they observed differences that correlated with the sum of stressful events.

    Early-life stress and poverty correlated with a shrunken hippocampus and amygdala, brain regions important for memory and emotional well-being, respectively. Again, parental nurturing seemed to protect children somewhat. When it came to hippocampal volume in particular, parental warmth mattered more than material poverty.

    The prospective nature of both studies makes them particularly compelling. But as always with observational studies, we can't assume causality. Maybe the children's pre-existing problems are stressing the parents. Or perhaps less nurturing parents are first depressed, and that depression stems from their genes. That same genetic inheritance then manifests as altered neural architecture in their children.

    Numerous animal studies, of course, show that early life stress can have lifelong consequences, and that maternal nurturing can prevent them. Studies on rats, for example, demonstrate that even when pups are periodically stressed, ample maternal grooming prevents unhealthy rewiring of their nervous systems, favorably sculpting the developing brain and making the pups resilient to stress even in adulthood.

    Yet in observational human studies, it's difficult to rule out the possibility that the unwell become poor, or that some primary deficiency stresses, impoverishes and sickens. This very uncertainty is one reason, in fact, that Professor Costello's findings are so intriguing, however modest her study size. A naturally occurring intervention ameliorated psychiatric outcomes. A cash infusion in childhood seemed to lower the risk of problems in adulthood. That suggests that poverty makes people unwell, and that meaningful intervention is relatively simple.

    Bearing that in mind, Randall Akee, an economist at the University of California, Los Angeles, and a collaborator of Professor Costello's, argues that the supplements actually save money in the long run. He calculates that 5 to 10 years after age 19, the savings generated by the Cherokee income supplements surpass the initial costs: the payments to parents while the children were minors. That's a conservative estimate, he says, based on reduced criminality, a reduced need for psychiatric care and savings gained from not repeating grades. (The full analysis is not yet published.)

    But contrary to the prevailing emphasis on interventions in infancy, Professor Akee's analysis suggests that even help that comes later — at age 12, in this case — can pay for itself by early adulthood. "The benefits more than outweigh the costs," Emilia Simeonova, a Johns Hopkins Carey Business School economist and one of Professor Akee's co-authors, told me.

    Not all changes in the Cherokee's "natural experiment" were benign, however. For reasons neither Professor Costello nor Professor Akee can explain, children who were the poorest when the supplements began also gained the most weight.

    Another analysis, meanwhile, found that more accidental deaths occurred during those months, once or twice a year, when the tribe disbursed supplements. The authors attributed that, in part, to increased drinking, as well as to buying cars and traveling more.

    Then there's the broader context of gaming, an often contentious issue around the country. Opponents often cite the potential for increases in crime, problem gambling and bankruptcies. And some early studies suggest these concerns may have merit.

    But Douglas Walker, an economist at the College of Charleston who has done some consulting for pro-gaming organizations, says many of the studies on gaming have methodological problems. Increased criminal behavior may simply be a function of more visitors to the casino area, he says. If the population increases periodically, it's natural to expect crime to rise proportionally. "The economic and social impacts of casinos are not as clear, not as obvious as they seem," he said.

    So Professor Costello's findings are not necessarily a sweeping endorsement of Native American gaming, or of casinos generally. Rather, they suggest that a little extra money may confer long-lasting benefits on poor children. And in that respect, the Cherokee experience is unique in several important ways.

    First, this was not a top-down intervention. The income supplements came from a business owned by the beneficiaries. The tribe decided how to help itself. Moreover, the supplements weren't enough for members to stop working entirely, but they were unconditional. Both attributes may avoid perverse incentives not to work.

    Also, fluctuations in the casino business aside, the supplements would continue indefinitely. That "ad infinitum" quality may both change how the money is spent and also protect against the corrosive psychological effects of chronic uncertainty.

    And maybe most important, about half the casino profits went to infrastructure and social services, including free addiction counseling and improved health care. Ann Bullock, a doctor and medical consultant to the Cherokee tribal government, argues that these factors together — which she calls the exercising of "collective efficacy" — also may have contributed to the improved outcomes. She describes a "sea change" in the collective mood when the tribe began to fund its own projects. A group that was historically disenfranchised began making decisions about its own fate.

    "You feel controlled by the world when you're poor," she said. "That was simply no longer the case."

    Professor Costello and Professor Akee don't entirely agree. They think cold hard cash made the real difference. For one thing, Professor Akee says, outcomes started improving as soon as the supplements began, before many of the communitywide services went into effect.

    If that's the primary takeaway, then we have some thinking to do. Some people feel that "if you're poor, it's because you deserve it," Professor Costello said. "If you're sick, it's because you deserve it," she said.

    But if giving poor families with children a little extra cash not only helps them, but also saves society money in the long run, then, says Professor Costello, withholding the help is something other than rational.

    "You're not doing it because it pains you to do it," she said. "That's a very valuable lesson for society to learn."


    Moises Velasquez-Manoff is a science writer and the author of "An Epidemic of Absence."

    A version of this article appears in print on 01/19/2014, on page SR12 of the New York edition with the headline: When the Poor Get Cash.
