

Opinionator | The Stone: Is Belief a Jewish Notion?

Written by Unknown on Monday, March 31, 2014 | 13:25

The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

This is the fourth in a series of interviews about religion that I am conducting for The Stone. The interviewee for this installment is Howard Wettstein, a professor of philosophy at the University of California, Riverside, and the author of "The Significance of Religious Experience."

Gary Gutting: You say you're a naturalist and deny that there are any supernatural beings, yet you're a practicing Jew and deny that you're an atheist. What's going on here? What's a God that's not a supernatural being?

Just as a practicing mathematician need not have views about the metaphysical status of numbers, so too religious life does not require a theoretical stance on God's existence.

Howard Wettstein: Let's begin with a distinction between participation in a practice and the activity of theorizing, philosophically and otherwise, about the practice. Even an advanced and creative mathematician need not have views about, say, the metaphysical status of numbers. Richard Feynman, the great physicist, is rumored to have said that he lived among the numbers, that he was intimate with them. However, he had no views about their metaphysical status; he was highly skeptical about philosophers' inquiries into such things. He had trouble, or so I imagine, understanding what was at stake in the question of whether the concept of existence had application to such abstractions. Feynman had no worries about whether he was really thinking about numbers. But "existence" was another thing.

It is this distinction between participation and theorizing that seems to me relevant to religious life.

G.G.: How so?

H.W.: I had a close friend in Jerusalem, the late Rabbi Mickey Rosen, whose relation to God was similarly intimate. To watch him pray was to have a glimpse of such intimacy. To pray with him was to taste it; God was almost tangible. As with Feynman, Mickey had no patience with the philosophers' questions. God's reality went without saying. God's existence as a supernatural being was quite another thing. "Belief," he once said to me, "is not a Jewish notion." That was perhaps a touch of hyperbole. The point, I think, was to emphasize that the propositions we assent to are hardly definitive of where we stand. He asked of his congregants only that they sing with him, song being somewhat closer to the soul than assent.

This brings to mind Buber's emphasis on the distinction between speaking to God, something that is readily available to all of us, and significant speech/thought about God, something that Buber took to be impossible.

G.G.: But you can't in fact speak to someone who doesn't exist — I can't speak to Emma Bovary, although I can pretend to or think I can. Further, why would you even want to pray to someone you didn't believe exists? On your account praying to God seems like playacting, not genuine religious commitment.

H.W.: Were I to suggest that God does not exist, that God fails to exist, then what you suggest would have real purchase. My thought is otherwise; it's rather that "existence" is, pro or con, the wrong idea for God.

My relation to God has come to be a pillar of my life, in prayer, in experience of the wonders and the awfulness of our world. And concepts like the supernatural and transcendence have application here. But (speaking in a theoretical mode) I understand such terms as directing attention to the sublime rather than referring to some nonphysical domain. To see God as existing in such a domain is to speak as if he had substance, just not a natural or physical substance. As if he were composed of the stuff of spirit, as are, perhaps, human souls. Such talk is unintelligible to me. I don't get it.

The theism-atheism-agnosticism trio presumes that the real question is whether God exists. I'm suggesting that the real question is otherwise and that I don't see my outlook in terms of that trio.

G.G.: So what is the real question?

H.W.: The real question is one's relation to God, the role God plays in one's life, the character of one's spiritual life.

Let me explain. Religious life, at least as it is for me, does not involve anything like a well-defined, or even something on the way to becoming a well-defined, concept of God, a concept of the kind that a philosopher could live with. What is fundamental is no such thing, but rather the experience of God, for example in prayer or in life's stunning moments. Prayer, when it works, yields an awe-infused sense of having made contact, or almost having done so. Having made contact, that is, concerning the things that matter most, whether the health and well-being of others, or of the community, or even my own; concerning justice and its frequent absence in our world; concerning my gratefulness to, or praise of, God. The experience of sharing commitments with a cosmic senior partner, sharing in the sense both of communicating and literally sharing, "dreaming in league with God," as A.J. Heschel puts it, is both heady and heartening. Even when that partner remains undefined and untheorized.

G.G.: How could you share commitments with someone — a "senior partner" no less — who wasn't a person?

H.W.: I have been speaking as if God were a person, and such is our experience. However, overlaying this is the sense, sometimes only a dim sense, that somehow God is beyond being a person, that we are over our heads in even raising the question. Do you sense a tension, one that, on the face of it, might make theorizing a tad difficult?

G.G.: I see a tension in religious practice itself. In prayer, you say, you have a sense of God as a person but at the same time a sense that God is perhaps not a person. It seems to me that, if God is not a person, then the religious practice of praying to God isn't what most religious people think it is. It may be edifying, therapeutic, or whatever. But it's not, for example, expressing our thoughts to someone who understands and loves us.

H.W.: I wasn't speaking about what God is, nor do I know what he is. (Remember his enigmatic remark in Exodus 3:14, "I am what I am.") I was addressing my experience, with its strange duality: In prayer, we express our deepest selves to God who understands. I pray, and I mean it. But I am "blessed" with an additional sense that, in so supposing, I'm over my head; I don't know what I'm talking about. Both feelings are real and powerful.

These experiences are not theory-driven. The perceptions and understandings of the religious practitioner are more like the outpourings of a poet than they are like theoretical pronouncements. Moments of insight, illumination and edification do not necessarily respect one another; illuminating one aspect of a phenomenon may occlude others. Poetry, at its most profound, need not observe consistency.

G.G.: This sounds like Whitman: "Do I contradict myself? Very well, then, I contradict myself." Hardly a philosophical response.

H.W.: The philosophy of religion is, of course, another matter. My approach departs from the way the philosophy of religion has been and is still often pursued, largely as a treatment of the putative metaphysics of religion and then the epistemology needed to support such a metaphysics. For me, religious practice is primary; the philosophical project consists in an elucidation of the significance of that practice and of the religious life that embeds the practice.

G.G.: I agree that some questions philosophers of religion ask have purely theoretical significance. But here the question arises out of religious practice itself. Is there a person I'm praying to? How could that not matter to me, precisely as someone engaged in the practice of praying? Compare: I'm on the phone and suddenly get a sense that the responses I'm hearing are from an automated program, not a human being. That's a matter of practical importance. Why is the case of talking to God different?

We anthropomorphize God, yet we also know that this anthropomorphism is inadequate.

H.W.: What I've been suggesting about God's personhood is a special case of the problem of anthropomorphism, the way we are drawn into and out of anthropomorphic characterization of God. Such characterization of God is at the heart of religious life. "Taste and see God's goodness." (Psalms 34:8) And there is also a dark side of anthropomorphic depiction: "I form the light, and create darkness: I make peace, and create evil." (Isaiah 45:7) God's goodness, nurture, and the like, but also his anger, his hiddenness, all of these are available to experience.

Yet religious anthropomorphism coexists with a sense that, while hardly universal even in my religious community, goes deep: in thinking about God, about what he is, about how he works in our world, we are over our heads. "How the hell do I know why there were Nazis?" protests one of Woody Allen's characters, "I don't even know how the can opener works." And such an attitude reflects itself in the anti-anthropomorphist outlook that is an important if controversial stand in religious thought at least since medieval times. Maimonides's attempt, in "Guide for the Perplexed" to explain away biblical anthropomorphism is a Jewish case in point.

G.G.: Well, personal experience can be hard to explicate. But as you've just said, the inclination to think of God in human terms also comes from the Bible, which certainly often talks of God as acting like a person: expressing love or anger, giving commands, making plans. In fact, much of the Hebrew Scriptures are a narrative with God as the major protagonist. Doesn't accepting, as I suppose you do, this narrative of how God dealt with his people require thinking of God in human terms? How else can you make sense of God as an agent intervening in human history?

H.W.: For a philosophical anti-anthropomorphist like Maimonides, the Bible "speaks in the language of the folk." Maimonides takes the phrase from the Talmud (but in quite another connection). It is a medieval theological counterpart to Bishop Berkeley's advice that we ought to "think with the learned and speak with the vulgar."

My own view is different. One way to put what I've been saying: The anthropomorphic is one mode of our access to God. I'm not sure that it's the exclusive biblical mode, but it's close to that. As religiously powerful as it is, the anthropomorphized sense of the divine coexists with the humble sense that we are over our heads. This latter feeling can itself be infused with awe. It can have its own religious power.

At the end of your last question, you raise the difficult issue of God's intervention in human affairs. I can't tackle it here. But we should bear in mind that to speak of God as intervening in history, as with characterizing him as creating, planning, willing, and the like, is to speak anthropomorphically.

G.G.: Coming back to the personal experiences that seem to be the core of your religious commitment, what's your response to suggestions that such experiences have some sort of entirely human psychological explanation? Doesn't that thought ever seem plausible to you?

H.W.: That one can explain love of our fellows in psychological terms does not suggest that there is something unreal about our fellows or about what we feel for them. The threat to religion is not from the psychological intelligibility of religious experience; it's from that intelligibility in the service of a reductive account. Freud argued persuasively, I think, for the psychological explicability of the religious impulse, and for the psychological needs to which the impulse is responsive. I'm sure something like that is right but, contrary to Freud's thinking, it doesn't threaten my own outlook or even the more usual supernaturalism. God's reality or existence is compatible with the putative needs.

What would it be like for love to be beyond the reach of psychology? Perhaps there are romantics who find such a scenario attractive or compelling. Perhaps this is due to the sense that a naturalistic explanation would render null and void the mysteries of love, and similarly, the magic of religious experience. For Einstein, though, the awe deepened with increased understanding.

G.G.: So then you can't argue for God's reality or existence as the best explanation of religious experience.


H.W.: That's one way to argue for the reality or existence of God, but it's not my way. Such an argument is subject to refutation by showing that naturalistically acceptable reasons can explain our experience, either in Freud's way or some other. And given the propensity of the universe to disclose itself increasingly to scientific understanding, this argument seems, among other things, risky.

Nor do people who are blessed with religious experiences, even the intense ones of the mystic, uniformly suppose that their experiences are only explicable by reference to the truth of their religious beliefs. Rowan Williams points out that Teresa of Avila, a 16th-century saint and mystic whose life was punctuated by ecstatic experiences, never supposed that her uncanny experiences established the truth of religious claims.

G.G.: You seem to be saying that we'd have a complete explanation of religious experience, even assuming there's no God. Isn't this just the case that many naturalists make for atheism?

H.W.: I didn't say that we would have a complete explanation in psychological terms. I'm not easy with the idea of a "complete explanation." Say we had a really satisfying psychological account of, for example, what we experience in a moment of intense love. Say further that this was somehow perfectly correlated with a neurophysiological account. Would this be a complete explanation? Would there be no more questions — "why" questions — to ask about the experience? Couldn't we still be puzzled about the role that love plays in the human emotional economy? Wouldn't we want to know what it says about these creatures, their needs, their frustrations, the things that make life worthwhile for them? I'm not sure that we can ever close the book on our multiple explanatory projects.

The subject requires much more than can be said here. It's important to me, however, that my discomfort with the idea of a phenomenon receiving a once-and-for-all finished explanation is not only in the service of defending religion.

One of my complaints about the New Atheists, like Richard Dawkins, is their reductive tendency. I don't see why the psychological (or more generally naturalistic scientific) explicability of a phenomenon should suggest that questions associated with its meaning are put to rest. Indeed, were I a supernaturalist theist, I would feel no need to resist naturalistic explanation.

Earlier naturalistically inclined philosophers like Dewey and Santayana, by contrast with the New Atheists, appreciate the substantial power for good that religion exercises in people's lives. Needless to say, such appreciation is entirely consistent with a deep appreciation for the negative side of the impact of religion: wars, bigotry, narrow-mindedness and the rest. Such is the way with institutions of such power.

This interview was conducted by email and edited. Previous interviews in this series were with Alvin Plantinga, Louise Antony and John D. Caputo.


Gary Gutting is a professor of philosophy at the University of Notre Dame, and an editor of Notre Dame Philosophical Reviews. He is the author of, most recently, "Thinking the Impossible: French Philosophy Since 1960" and writes regularly for The Stone.



Room for Debate: Scholars, Players and Union Members

Written by Unknown on Friday, March 28, 2014 | 13:26

  • Andrew Zimbalist

    Proof That College Athletics Need to Be Reformed

    Andrew Zimbalist, economist

    Although unionization will help reduce the hypocrisy and exploitation in college sports, it will move intercollegiate athletics away from its professed purpose.

  • Allen Sack

    It Is Their Legal Right to Unionize

    Allen Sack, professor of sport management

    Unionization of college athletes is a natural outgrowth of the N.C.A.A.'s 1973 decision to dump four-year scholarships in favor of one-year renewable scholarships that can be canceled for just about any reason.

  • Kenneth L. Shropshire

    The Labeling Gives Me Pause

    Kenneth L. Shropshire, Wharton School Sports Business Initiative

    As this battle expands beyond Northwestern, these athletes, who are not "primarily students," will be largely African-American, particularly if we envision future March Madness participants joining in.

  • Amy Privette Perko

    Colleges Can Help Without an Athletes' Union

    Amy Privette Perko, Knight Commission on Intercollegiate Athletics

    Colleges need to shift their priorities to restore the educational role of athletics and improve athletes' experiences.

  • Glenn Wong

    Be Careful What You Wish For

    Glenn Wong, University of Massachusetts

    Unionization would have benefits and also some substantial costs.

  • Billy Hawkins

    Success May Not Be What It Seems

    Billy Hawkins, author, "The New Plantation"

    True success could take years and would mean a new bureaucracy. Maybe continued grassroots activism would achieve more real success.



    Opinionator | Disunion: Dissent in Milledgeville

    Disunion follows the Civil War as it unfolded.

    Thanks to the particularities of the Confederate Constitution, President Jefferson Davis had no real claim on the loyalty of his sitting vice president, Alexander H. Stephens, who was selected by the Confederate Legislature. Even so, Davis recognized Stephens's appearance before the Georgia Legislature in early 1864 as an act of utter betrayal – and not only because it fell the day after the Ides of March.

    Stephens's three-hour harangue on the night of March 16 jolted the sleepy town of Milledgeville, then the state capital, with its withering attack on the Davis administration's violation of the fundamental rights of Southern whites through policies like the military draft and martial law. Stephens sent the packed crowd home with echoes of Patrick Henry's revolutionary injunction of 1775: The choice for Confederate Georgians, as it had been for Henry's generation, was the alternative of civil liberty or patriotic death, imposed by a tyrannical ruler.

    Stephens's rhetorical broadside spread quickly across the wartime South and became one of the most important political developments in the Civil War's third lingering winter. The sour personal relationship between the Confederacy's top two officials was hardly news. But the rift in the South's political establishment now aligned with a building anti-Lincoln campaign in the North that was pushing hard for a negotiated peace. Those Confederates who already considered Davis to be a Southern-bred Caesar could now imagine Stephens in the role of Brutus.

    Stephens raised the stakes of political opposition by presenting Richmond's governmental overreach on the same level of danger as the Union's military onslaught. An administration that had weathered three years of federal military assault now faced a profound crisis of legitimacy from within.

    Tensions between Davis and Stephens emerged early in the war but were resolved, at least at first, when Stephens, a Georgia politician, withdrew from Richmond in the spring of 1862. From his semi-exile in his Crawfordville, Ga., home (called Liberty Hall as early as 1860), the vice president privately seethed about his boss, whom he dismissed as "weak and vacillating, timid, petulant, peevish, obstinate, but not firm."

    His grievances found a ready audience among fellow Georgia politicians whose anti-Davis animus was just as bitter. Early in 1862, Gov. Joseph Brown had marshaled a states-rights critique of Richmond's new military draft. Robert Toombs followed his brief stint as Davis's secretary of state with interminable squabbling with the president and Confederate military brass; the culmination of these disputes came when Toombs was placed under military arrest just a few weeks after Stephens's "Ides of March" address. Linton Stephens, Alexander's younger brother and a state legislator, channeled his hatred of the president into a series of anti-administration resolutions taken up during his state's 1864 late winter legislative session. By the end of their March assembly, Georgia state legislators had followed the Stephens brothers' lead by passing a strongly worded rebuke of Confederate martial law.

    Stephens stood out among these Georgia "malcontents" for his oratorical skills no less than for the high office he uneasily occupied. His speechifying in Congress 20 years earlier had brought tears from Representative Abraham Lincoln. His impassioned address against disunion in November of 1860 also touched Lincoln (as it did most other Unionists) and prompted a famous interchange between the president-elect and his former Whig ally from Georgia. A few months later, Stephens offered up his so-called "Cornerstone speech," which would shape his reputation thereafter. His full-throated identification of the new Southern Confederacy with racial slavery gained international notoriety.

    Stephens's "Ides of March," speech, which ranks just below his "Cornerstone" address in its rhetorical power, was built around a central tenet of classical republicanism: the fragility of popular rights during the emergency of war. "Liberty is the animating spirit, the soul of our system of government, and like the soul of man, when once lost, it is lost forever," Stephens said. Insisting that even the most dire circumstances could not justify curtailing popular rights, he noted that "without liberty, I would not turn upon my heel for independence." True patriotism required citizens to defy an overreaching government, even if that government faced the crisis of its existence. "Let no one therefore be deterred from performing his duty on this occasion by the cry of counter-revolution, nor by the cry that it is the duty of all, in this hour of peril, to support the government," he warned. "Be not misled by this cry, or that you must not say anything against the administration, or you will injure the cause."

    Stephens reached a peak by warning of imminent political bondage. "I was not born to have a master from either the North or South," he thundered, insisting that he was no more willing to subordinate himself to his own Richmond government than to Lincoln's despotism in Washington. Were his options reduced to serving one or another of these two tyrants, his resolve would be clear. "I shall never choose between candidates for that office. Shall never degrade the right of suffrage in such an election." Death would be welcome in this event, since Stephens had "no wish or desire to live after the degradation of my country, and have no intention to survive its liberties."

    Ironically, with Stephens away in Milledgeville defending white liberty, one of the slaves at Liberty Hall made his own bid for freedom. Stephens learned upon his return to Crawfordville in late March that a slave named Pierce had left the premises soon after he had departed, his own speech in hand, for the capital. Early reports that Pierce had been killed were updated by information that he was both alive and no longer a slave, having escaped to the Union lines and placed himself in the service of a cavalry unit commanded by Gen. Joseph Wheeler.


    This stunned Stephens. For him, the unquestioning obedience he expected from Pierce and his other 30-odd slaves was a function of their racial deficiencies. Among the central points of his Cornerstone speech was the natural subservience of the "African race" to whites, whose superiority Stephens insisted had been vindicated by modern science. Pierce's decision to abandon the role of devoted slave haunted Stephens into the postwar period. Late in 1865, during his five-month incarceration in Fort Warren and facing a possible trial for treason to the United States government, Stephens recorded several dreams of his once trusted servants. In one of the most vivid, he imagined the life that a wayward Pierce would be living beyond his master's command.

    Stephens's psychic entanglement with Pierce's loyalty and his betrayal resulted from the deep history between these two men. Pierce had become Stephens's human property in 1845, at the age of 6; the master immediately laid claim to the slave by endowing him with the name of his close friend George Foster Pierce, the Methodist bishop of Georgia. As Pierce the slave grew older, Stephens worked to mold him into a trusted dependent and regularly traveled to Washington with the enslaved personal attendant.

    Their history didn't end with Pierce's emancipation. When Stephens returned to take up a seat in the United States House of Representatives eight years after the Confederate collapse, he resumed contact with Pierce (who had taken the last name of Lafayette) and used his influence to gain his former slave a Civil Service appointment at the Interior Department. In a world that had made slaves into citizens, Stephens's Southern mastery was replaced by a routine act of political patronage.

    But there would be no rapprochement between Stephens and Davis after the March 1864 rupture in their political and personal relationship. At the same time, fears (and hopes) that Stephens's dissent might lead to a political upheaval within the Confederate government proved unfounded. Despite his fiery rhetoric, Stephens was still doing his president's bidding up through his February 1865 meeting with Lincoln and Secretary of State William H. Seward at Hampton Roads (a vignette aboard the River Queen that Steven Spielberg's 2012 film "Lincoln" has memorably dramatized). Even under the extreme hardships of war and the looming inevitability of defeat, Stephens's allegiance to the Southern cause remained firm.

    Follow Disunion at twitter.com/NYTcivilwar or join us on Facebook.


    Sources: "The Great Speech of Hon. A.H. Stephens, Delivered Before the Georgia Legislature, on Wednesday Night, March 16th, 1864"; James Z. Rabun, "Alexander H. Stephens and Jefferson Davis," American Historical Review, 1953; Henry Cleveland, "Alexander Stephens in Public and Private;" Alexander H. Stephens Papers, Library of Congress.

    Robert E. Bonner is a professor of history at Dartmouth and the author of a forthcoming biography of Alexander H. Stephens.



    Op-Ed | Errol Morris: The Certainty of Donald Rumsfeld (Part 2)

    Written by Unknown on Thursday, March 27, 2014 | 13:26

    Errol Morris on photography.

    This is the second installment of a four-part series.

    2.
    THE KNOWN AND THE UNKNOWN

    The phrase "unknown known" first appears in early 19th-century Romantic poetry — in John Keats's Endymion, his ode to the sovereign power of love.[1] Fifty years later it appears again in Robert Browning's The Ring and the Book.[2] A metaphor for the unknowability of the mind of man. And then John Wesley Powell, the one-armed Civil War veteran who traveled through the Grand Canyon, compared the unknown known and the known unknown. Savagery versus civilization — the first use I can document of the two phrases in one sentence.[3] Powell wrote:

    There is an unknown known, and there is a known unknown. The unknown known is the philosophy of savagery; the known unknown is the philosophy of civilization. In those stages of culture that we call savagery and barbarism, all things are known — supposed to be known; but when at last something is known, understood, explained, then to those who have that knowledge in full comprehension all other things become unknown. Then is ushered in the era of investigation and discovery; then science is born; then is the beginning of civilization. The philosophy of savagery is complete; the philosophy of civilization fragmentary. Ye men of science, ye wise fools, ye have discovered the law of gravity, but ye cannot tell what gravity is. But savagery has a cause and a method for all things; nothing is left unexplained.[4]


    In short, the savage is free to imagine anything; the civilized man is constrained by evidence. The known unknown "usher[s] in the era of investigation and discovery;" the unknown known is the savage's false belief that he can explain everything.

    ——–

    Rumsfeld, in his memoir, Known and Unknown, says he learned about the known and unknown from William R. Graham, who served with him in the late 1990s on the Ballistic Missile Threat Commission — an attempt to undermine or at least question the C.I.A. assessment asserting the improbability of a ballistic missile attack since the fall of the Soviet Union.[5] (For years Rumsfeld had proposed "defensive" missiles as a shield against offensive missiles — in my opinion, setting off yet another arms race. On my visit to his offices in Washington, Rumsfeld went into a closet and pulled out a heavy, crumpled piece of metal — part of a deployed antiballistic missile. It was his trump card: "Who says you can't shoot down a missile with a missile?")

    The known known, the known unknown and the unknown unknown seemingly have straightforward interpretations. Or do they? Things we know we know — like the name of the president of the United States or the capital of France. And things we know we don't know — like the exact population of Kathmandu. (I know I don't know it.) Things we know we don't know but can look up, say on Wikipedia. Like the atomic number of tungsten. (It's 74. I just looked it up.) Or things that we know we don't know but need to be investigated. (Who killed JonBenét Ramsey? I don't know, but someone probably does know — the killer? — although I know I don't know who that person is.) Things that our enemies know but may not be known to us. (How many atomic warheads are there in North Korea?) And then, of course, there are the things I once knew but can't remember. It goes on and on and on. It begs us to answer the question: What does it mean to know something? Or to know that we know something? Or to know that we don't know something? Doesn't it depend on evidence?[6]

    As Rumsfeld tells the story, the known and unknown are linked (see also the aforementioned Feb. 12 news conference) with the absence not the presence of evidence. Rumsfeld writes in his memoir:

    The idea of known and unknown unknowns recognizes that the information those in positions of responsibility in government, as well as in other human endeavors, have at their disposal is almost always incomplete. It emphasizes the importance of intellectual humility, a valuable attribute in decision making and in formulating strategy. It is difficult to accept — to know — that there may be important unknowns. The best strategists try to imagine and consider the possible, even if it seems unlikely. They are then more likely to be prepared and agile enough to adjust course if and when new and surprising information requires it — when things that were previously unknown become known.

    I also encountered this concept in Thomas Schelling's foreword to Roberta Wohlstetter's book Pearl Harbor: Warning and Decision, in which Schelling identified a "poverty of expectations" as the primary explanation for America's inability to anticipate and thwart the Japanese attack on Hawaii. Schelling's message was as clear as it was prescient: We needed to prepare for the likelihood that we would be attacked by an unanticipated foe in ways that we might not imagine.[7]

    Let's examine this passage.

    As Rumsfeld writes, the known and unknown recognizes that information is always incomplete. Correct as far as it goes. Information is always incomplete — do we ever have all the evidence we want or need? Of course not. But was the threat of the Japanese in 1941 or Al Qaeda in 2001 an unknown unknown or even a known unknown? Evidence was ignored or underestimated — in 1941 and 2001 — not because it was "unknown," but because it didn't fit a preconceived agenda.

    Both Roberta Wohlstetter and Thomas Schelling, writing for publication in the early 1960s, were concerned with the possibility of a nuclear war — how to prevent it. Wohlstetter's book ends with an admonition, not a solution:

    We cannot count on strategic warning. We might get it, and we might be able to take useful preparatory actions that would be impossible without it. We certainly ought to plan to exploit such a possibility should it occur. However, since we cannot rely on strategic warning, our defenses, if we are to have confidence in them, must be designed to function without it. If we accept the fact that the signal picture for impending attacks is almost sure to be ambiguous, we shall prearrange actions that are right and feasible in response to ambiguous signals, including signs of an attack that might be false. We must be capable of reacting repeatedly to false alarms without committing ourselves or the enemy to wage thermonuclear war …. We have to accept the fact of uncertainty and learn to live with it. No magic, in code or otherwise, will provide certainty. Our plans must work without it.[8]

    Schelling's foreword, likewise, spells out the ways in which intelligence can fail despite our best efforts — not because we don't know about it, but because we fail to interpret it correctly or to act on it. As Schelling puts it, "There is a tendency in our planning to confuse the unfamiliar with the improbable." But this is not an invitation to imagine the worst and to act on it.

    Call this the Chicken Little Principle. Do you remember Chicken Little? An acorn falls on Chicken Little's head, and she decides the sky is falling. Other animals are warned in turn — Henny Penny, Ducky Lucky, Goosey Loosey, Turkey Lurkey — until they are all eaten in their panic by Foxy Loxy, who sees an unparalleled gustatory opportunity. There are a number of staggering what-ifs. What if Chicken Little had asked for additional evidence that the sky was falling? What if Henny Penny or Goosey Loosey had been more skeptical of Chicken Little's claims? You can't fault Chicken Little for a lack of imagination, but the fable is a warning against unfettered credulity — and imagination. If Chicken Little had reacted to the falling acorn with greater equanimity, she might still be alive today — along with many, if not all of her barnyard friends.

    Remarkably, the Chicken Little imagery comes from Rumsfeld himself, not just from me. For years he had been Mr. Naysayer — second-guessing the C.I.A., predicting Soviet nuclear dominance, attacking détente, proposing antiballistic missile shields, conjuring images of a Saddam armed with nuclear weapons and an assortment of biological and chemical W.M.D. He was the boy who cried "Armageddon." Now, the shoe was on the other foot. In a Pentagon news conference on April 11, 2003, a few weeks before victory in Iraq was declared — somewhat prematurely, I should add — Rumsfeld responded to reports of looting and anarchy by accusing his critics of being — guess what? — naysayers.

    DONALD RUMSFELD: Let me say one other thing. The images you are seeing on television you are seeing over, and over, and over, and it's the same picture of some person walking out of some building with a vase, and you see it 20 times, and you think, "My goodness, were there that many vases?" (Laughter.) "Is it possible that there were that many vases in the whole country?"

    CHARLES ALDINGER: Do you think that the words "anarchy" and "lawlessness" are ill-chosen —

    DONALD RUMSFELD: Absolutely. I picked up a newspaper today and I couldn't believe it. I read eight headlines that talked about chaos, violence, unrest. And it just was Henny Penny — "The sky is falling." I've never seen anything like it! And here is a country that's being liberated, here are people who are going from being repressed and held under the thumb of a vicious dictator, and they're free. And all this newspaper could do, with eight or 10 headlines, they showed a man bleeding, a civilian, who they claimed we had shot — one thing after another. It's just unbelievable how people can take that away from what is happening in that country!

    For Donald Rumsfeld, evidence of anarchy and chaos is not evidence of anarchy and chaos. For Donald Rumsfeld, the presence of evidence isn't evidence of presence.


    [1] John Keats's Endymion (Book II) (1818). But would Keats have seen Rumsfeld's known unknown as a thing of beauty or a joy forever? Here is the quote from Endymion:

    O known Unknown! from whom my being sips
    Such darling essence, wherefore may I not
    Be ever in these arms? in this sweet spot
    Pillow my chin for ever? (Book II, l. 741-44)

    [2] Robert Browning, The Ring and The Book (1869)

    O Thou, — as represented here to me
    In such conception as my soul allows, —
    Under Thy measureless, my atom width! —
    Man's mind — what is it but a convex glass
    Wherein are gathered all the scattered points
    Picked out of the immensity of sky,
    To re-unite there, be our heaven on earth,
    Our known unknown, our God revealed to man? (ll. 1308-15).

    [3] Powell was clearly interested in the known and the unknown. Even though he disapproved of the philosophy of the unknown known, he toyed with the formulation in his description of the Grand Canyon, referring to it as "the Great Unknown." See Edward Dolnick's Down the Great Unknown: John Wesley Powell's 1869 Journey of Discovery and Tragedy Through the Grand Canyon (2001).

    [4] J. W. Powell, "Sketch of the Mythology of the North American Indians," in the First Annual Report of the Bureau of Ethnology to the Secretary of the Smithsonian Institution, 1881.

    [5] Donald Rumsfeld, Known and Unknown: A Memoir, Penguin, 2011, p. xiv:

    I first heard a variant of the phrase "known unknowns" in a discussion with former NASA administrator William R. Graham, when we served together on the Ballistic Missile Threat Commission in the late 1990s. Members of our bipartisan commission were concerned that some briefers from the U. S. intelligence community treated the fact that they lacked information about a possible activity to infer that the activity had not happened and would not. In other words, if something could not be proven to be true, then it could be assumed not to be true. This led to misjudgments about the ballistic missile capabilities of other nations, which in some cases proved to be more advanced than previously thought.

    [6] Maria Ryan at the University of Birmingham writes:

    Thus what appeared to be an intelligence failure over Iraq's alleged weapons of mass destruction actually represented the temporary institutionalisation of a method of intelligence analysis long favoured by some conservative and neo-conservative hawks. Team B, the Rumsfeld Commission and the Office of Special Plans were all successful on their own terms, encouraging increases in defence expenditure, missile defence and war in Iraq as their authors had conceived.

    However, in hindsight, not one of these reports proved correct in the long term. Team B reported just as the Soviet Union's military expenditure was slowing and its economy was contracting (and 15 years later it would no longer exist); the United States does not face a hostile ballistic missile threat and will not in the near future; and Iraq's WMD are nowhere to be found. In sum, although intelligence gathering may always be an inexact science, policy makers would do better to concentrate on what we do know rather than fantasise about what we do not.

    [7] Donald Rumsfeld, op. cit., pp. xiv-xv.

    [8] Roberta Wohlstetter, Pearl Harbor: Warning and Decision, Stanford University Press, 1962, pp. 400ff.



    Opinionator | Draft: Keep It Short

    Written by Unknown on Tuesday, March 25, 2014 | 13:26

    Draft is a series about the art and craft of writing.

    In my first daily newspaper job some 25 years ago, I learned a few lessons about brevity that I'm still using today. Back then, among other editorial chores, I contributed to a weekly feature called "Books in Brief." Each review could be no longer than 200 words — less than a fourth the length of a usual article. Imagine a slender column of type slightly taller than your middle finger, and you'll get some idea of the word limit.

    As a recent college graduate accustomed to discussing books in 12-page term papers, I chafed at writing in miniature. But I tried to think deliberately about my reviews as a form of quick conversation. If I were briefly summarizing my opinion over the phone, for example, how would I shape my argument to nab my listener by the collar before he hung up? What I was reminding myself, I suppose, is that writing is a kind of talk, a discourse that must eventually answer to the clock. In writing, brevity works not only as a function of space on a page, but of the time that an audience is willing to spend with you. Even if the Internet has made infinite texts possible, the reader's attention is not without end.

    To shorten my articles, I often worked through several versions, and with a merciless finger on the delete button I could surgically reduce my first draft by half.

    The exercise taught me that successful economy of expression often depends on vigorous revision. Perhaps no one exemplifies this principle more vividly than E.B. White, the magazine essayist and children's author celebrated for his deceptively simple style. White excelled in a number of forms, including "Talk of the Town" items for The New Yorker — graceful editorials that derived much of their charm from their compact scale. Although White's gift for saying much in a few words looked effortless, he often achieved his pith by distilling one draft after another to its elegant essence. At the conclusion of his White biography, the author Scott Elledge lets readers look over White's shoulder as he hones a New Yorker commentary on the first moon landing in 1969.

    White begins his first draft with some wry comparisons of a moon launch to a picnic at the beach, but by his sixth draft, he's jettisoned all the preliminary repartee, beginning with his true subject, international unity, and dropping about 100 unnecessary words in the bargain. Here's the first paragraph of the piece he finally turned in to The New Yorker's editor, William Shawn:

    The moon, it turns out, is a great place for men. One-sixth gravity must be a lot of fun, and when Armstrong and Aldrin went into their bouncy little dance, like two happy children, it was a moment not only of triumph but of gaiety. The moon, on the other hand, is a poor place for flags. Ours looked stiff and awkward, trying to float on the breeze that does not blow.

    White's handiwork reminds us that writing economically also means getting to the point. Such urgency can be a blessing, helping a writer achieve the kind of clarity that sustains all good composition.

    The late economist John Kenneth Galbraith, who wrote wisely and well about his chosen field, once recalled the relentless winnowing of his prose at the hands of his former boss, Henry Luce, the founder of Time:

    In his hand was a pencil; down on each page one could expect, at any moment, a long swishing wiggle accompanied by the comment: "This can go." Invariably it could. It was written to please the author and not the reader. Or to fill in the space. The gains from brevity are obvious; in most efforts to achieve it, the worst and the dullest go. And it is the worst and the dullest that spoil the rest.

    Refining a draft is a process of elimination that, like any contest advancing the survival of the fittest, tends to dramatize what's left standing when the competition is complete. Like passengers in a lifeboat, all the words in a concise text must pull their own weight. That's why good poetry, which places a premium on brevity, stakes such a claim on a reader's attention.

    Consider how much William Carlos Williams manages to convey in his tiny poem, "The Red Wheelbarrow": "so much depends / upon / a red wheel / barrow / glazed with rain / water / beside the white / chickens." In only a handful of words, Williams gives us a lot to think about: the tension between beauty and function, the boundary between man and nature, the interplay between what we make and what, in a sense, makes us.

    I frequently hear champions of brevity advising writers to cut their word counts by scratching all the adjectives or adverbs. But "The Red Wheelbarrow" would be radically diminished if that little modifier, "red," had been left on the cutting room floor. In coloring his wheelbarrow, Williams transforms it from an abstraction into a tangible object of reflection.

    The point of brevity isn't to chop a certain kind of word, but to make sure that each word is essential. And brevity, whatever its virtues, must be balanced against a multitude of other needs in composition. If extreme brevity were the only goal of writing, after all, we wouldn't have "Moby-Dick" or "Anna Karenina." Not every piece of writing requires a Spartan word limit.

    "The Elements of Style," coauthored by White and William Strunk Jr., shows us how to strike the right balance:

    Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid all detail and treat his subjects only in outline, but that every word tell.

    I tend to have longer word limits for my work today than those bite-size book reviews I wrote many years ago, but the basic imperative of brevity remains. Which is why, I suppose, the essay you're reading now has been cut by about a third of its original length.

    Danny Heitman, a columnist for The Advocate newspaper in Louisiana, is the author of "A Summer of Birds: John James Audubon at Oakley House."



    Opinionator | The Stone: When Nature Looks Unnatural

    Written by Unknown on Monday, March 24, 2014 | 13:26

    The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

    Nothing makes scientists happier than an experimental result that completely contradicts a widely accepted theory. The scientists who first invented the theory might not be tickled, but their colleagues will be overjoyed. Science progresses when a good theory is superseded by an even better theory, and the most direct route to building a better theory is to be confronted by data that simply don't fit the old one.

    Nature is not always so kind, however. Fields like particle physics and cosmology sometimes include good theories that fit all the data but nevertheless seem unsatisfying to us. The Hot Big Bang model, for example, which posits that the early universe was hot, dense, and rapidly expanding, is an excellent fit to cosmological data. But it starts by assuming that the distribution of matter began in an incredibly smooth configuration, distributed nearly homogeneously through space. That state of affairs appears to be extremely unnatural. Of all the ways matter could have been distributed, the overwhelming majority are wildly lumpy, with dramatically different densities from place to place. The initial conditions of the universe seem uncanny, or "finely tuned," not at all as if they were set at random.

    Good scientific theories can fit all the data but still seem unsatisfying to us.

    Faced with theories that fit all the data but seem unnatural, one can certainly shrug and say, "Maybe that's just the way it is." But most physicists take the attitude that almost none of our current models are exactly correct; our best ideas are still approximations to the underlying reality. In that case, apparent fine-tunings can be taken as potential clues that might prod us into building better theories.

    Last week's announcement of the observation of gravitational waves from the earliest moments of the history of the universe will — if the observation holds up — represent a resounding victory for this kind of approach. The observations seem to verify a prediction of the theory of cosmic inflation, proposed by the physicist Alan Guth in 1980 (with both important predecessors and subsequent elaborations). Guth was primarily motivated by a desire to provide a more natural explanation of why our universe looks the way it does.

    Guth's proposal was that the extremely early universe was dominated for a time by a mysterious form of energy that made it expand at a super-accelerated rate, before that energy later converted into ordinary particles of matter and radiation. We don't know exactly what the source of that energy was, but physicists have a number of plausible candidates; in the meantime we simply call it "the inflaton." Unlike matter, which tends to clump together under the force of gravity, the inflaton works to stretch out space and make the distribution of energy increasingly smooth. By the time the energy in the inflaton converts into regular particles, we are left with a hot, dense, smooth early universe: exactly what is needed to get the Big Bang model off the ground. If inflation occurred, the smoothness of the early universe is the most natural thing in the world.

    Inflation has become a starting point for much contemporary theorizing about the beginning of the universe. Cosmologists either work to elaborate the details of the model, or struggle to find a viable alternative. Which is why excitement was so high last week when cosmologists announced that they had found the imprint of primordial gravitational waves in the cosmic microwave background, the leftover radiation from the Big Bang. These gravitational waves are a direct prediction of inflation. Before last week, our reliable knowledge of the universe stretched back to about one second after the Big Bang; this observation pushes our reach back to one trillionth of a trillionth of a trillionth of a second.

    The theory of cosmic inflation was motivated by the simple desire to have a more natural explanation of the early universe.

    Cosmic inflation is an extraordinary extrapolation. And it was motivated not by any direct contradiction between theory and experiment, but by the simple desire to have a more natural explanation for the conditions of the early universe. If these observations favoring inflation hold up — a big "if," of course — it will represent an enormous triumph for reasoning based on the search for naturalness in physical explanations.

    The triumph, unfortunately, is not a completely clean one. If inflation occurs, the conditions we observe in the early universe are completely natural. But is the occurrence of inflation itself completely natural?

    That depends. The original hope was that inflation would naturally arise as the early universe expanded and cooled, or perhaps that it would simply start somewhere (even if not everywhere) as a result of chaotically fluctuating initial conditions. But closer examination reveals that inflation itself requires a very specific starting point — conditions that, one must say, appear to be quite delicately tuned and unnatural. From this perspective, inflation by itself doesn't fully explain the early universe; it simply changes the kind of explanation we are seeking.

    Fortunately — maybe — there is a complication. Soon after Guth proposed inflation, the physicists Alexander Vilenkin and Andrei Linde pointed out that the process of inflation can go on forever. Instead of the inflaton energy converting into ordinary particles all throughout the universe, it can convert in some places but not others, creating localized "Big Bangs." Elsewhere inflation continues, eventually producing other separate "universes," eventually an infinite number. From an attempt to explain conditions in the single universe that we see, cosmologists end up predicting a "multiverse."

    This may sound like a very peculiar result. But in the news conference after last week's announcement, both Guth and Linde suggested that evidence for inflation boosts the case for the multiverse. And perhaps the multiverse repays the favor. The fundamental laws of physics obey the principles of quantum mechanics: Rather than predicting definite outcomes, we attach probabilities to members of an ensemble of many different experimental outcomes. If inflation begins in any part of this quantum ensemble, and that inflation goes on forever, it creates an infinite number of individual universes. So even if inflation itself seems unlikely, multiplying by the infinite number of universes it creates makes it quite plausible that we find ourselves in a post-inflationary situation.


    If you find the logic of the previous paragraph less than perfectly convincing, you are not alone. Not that it is obviously wrong; but it's not obviously right, either. The multiverse idea represents a significant shift in the philosophy underlying inflation: Rather than explaining why we live precisely in this kind of universe, eternal inflation admits there are many kinds of local universes, and expresses the hope that ones like ours are more likely than other kinds.

    Perhaps they are. At this point, however, we simply don't know how to do the math. The multiverse is a provocative scenario, but the specific models that predict it are very tentative, far from the pristine rigor one expects of a mature physical theory. How many universes are there? How do we decide which sets of conditions are most "likely" within the giant ensemble of possibilities? How do we balance the intrinsic probabilities of quantum mechanics with the possible infinite proliferation of local regions? Does absolutely everything happen within the multiverse, or are only some possibilities actually realized?

    The questions are daunting, but they're not necessarily hopeless. Physical theories are often vague when first proposed, and only come into focus after a great deal of effort. Inflation was originally motivated by a quest for naturalness, and its audacious extrapolations away from established physics have apparently been vindicated by new observations. The multiverse, in its modern cosmological guise, has a similar origin. It's not that a group of theoretical physicists were unwinding with some adult beverages and started wondering out loud, "What if there were billions and billions of universes?" It's that the equations we invented to explain observational data (the smoothness of our initial conditions) ended up pointing in the direction of this provocative possibility, and it's the responsibility of scientists to take the predictions of their models as seriously as possible.

    Naturalness is a subtle criterion. In the case of inflationary cosmology, the drive to find a natural theory seems to have paid off handsomely, but perhaps other seemingly unnatural features of our world must simply be accepted. Ultimately it's nature, not us, that decides what's natural.

    Sean Carroll, a theoretical physicist at the California Institute of Technology, is the author of "The Particle at the End of the Universe: How the Hunt for the Higgs Boson Leads Us to the Edge of a New World."



    Opinionator | The Great Divide: All Economics Is Local

    Written by Unknown on Sunday, March 23, 2014 | 13:25

    The Great Divide is a series about inequality.

    In the face of congressional inaction, the debate on raising the minimum wage is moving to the local level. As more cities and counties consider setting their own wage standards, they can learn from the policy experiments already underway.

    Since the mid-1980s, states in every region of the country have raised the local minimum wage, often numerous times. Twenty-one states (and Washington, D.C.) currently have wage floors above the federal level ($7.25), and 11 of these raise them every year to account for inflation. Washington State currently has the highest, at $9.32; California's is set to increase to $10 on July 1, 2016.

    More than 120 cities and counties have adopted living wage laws that set pay standards, many of them in the $12 to $15 range. These higher standards usually apply only to employees of city service contractors, like security guards, landscapers and janitors. In some cities, living wage laws cover workers at publicly owned airports or stadiums, as well as at shopping malls subsidized by local development funds. While the impact on the individual workers covered under these laws is often quite significant, their reach is rarely broad enough to affect the local low-wage labor market as a whole.

    For this reason, cities and counties are now enacting higher local minimum wage policies that cover all work performed in the area. Cities as varied as Albuquerque, San Francisco, San Jose, Calif., Santa Fe, N.M., and Washington, D.C., have minimum wages ranging from $8.60 in Albuquerque to $10.74 in San Francisco. The District of Columbia, which is raising its minimum wage to $11.50 in 2016, wisely joined with two neighboring Maryland counties to create a regional standard.

    Many more cities are getting ready to follow suit. Richmond, Calif., Oakland and Seattle are seriously considering setting their own minimum wages. The Richmond City Council just voted for an increase that will reach $12.30 by 2017. Advocates in Oakland are aiming for $12.25. Seattle is discussing $15. Prodded by its new mayor, New York City is seeking the right to set its own minimum wage rate, instead of using New York State's.

    With the national debate stuck in the same old rut, states and cities have again become laboratories of democracy. Are they on the right path? For the last 15 years we have been doing research on just this question.

    One city we have studied in detail, San Francisco, has passed a dozen labor standards laws since the late 1990s. After adding the effects of other local laws requiring employers to pay for sick leave and health spending, the minimum compensation standard at larger firms in San Francisco reaches $13. Our studies show that the impact of these laws on workers' wages (and access to health care) is strong and positive and that none of the dire predictions of employment loss have come to pass. Research at the University of New Mexico on Santa Fe's wage floor (now $10.66) found similar results.

    These are not isolated cases. Research on the effects of differing minimum wage rates across state borders confirms the results of the city studies. But how can minimum wage increases not have negative effects on employment? After all, according to basic economic theory, an increase in the price of labor should reduce employer demand for labor.

    That's not the whole story, though. A full analysis must include the variety of other ways labor costs might be absorbed, including savings from reduced worker turnover and improved efficiency, as well as higher prices and lower profits. Modern economics therefore regards the employment effect of a minimum-wage increase as a question that is not decided by theory, but by empirical testing.

    Our research and that of other scholars illuminate how businesses in low-wage industries actually absorb minimum wage increases. Higher standards have an immediate effect in reducing employee turnover, leading to significant cost savings. Minimum wage increases do lead to small price increases, mainly in restaurants, which are intensive users of low-paid workers. How much? A 10 percent minimum wage increase adds 0.7 cents on the dollar to restaurant prices. Price increases in most other sectors, like retail, are too small to be visible, partly because retail pays more than restaurants.
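
    To make that pass-through arithmetic concrete, here is a minimal back-of-envelope sketch, not the model used in the research described above. The 10 percent wage increase and the roughly 0.7 percent price effect come from the paragraph; the cost share in the code is a purely illustrative assumption.

        # Back-of-envelope sketch of minimum wage pass-through to prices.
        # The 10 percent wage increase and the ~0.7 percent price effect come
        # from the text above; the cost share below is an illustrative
        # assumption, not a figure from the underlying studies.

        def price_increase(wage_increase, affected_labor_cost_share):
            """Price change if the affected slice of costs rises with wages
            and is passed through to prices one-for-one."""
            return wage_increase * affected_labor_cost_share

        # Hypothetical: minimum-wage labor is about 7% of a restaurant's costs.
        print(f"{price_increase(0.10, 0.07):.1%}")  # -> 0.7%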

    Related
    Interactive Feature: Can You Live on the Minimum Wage?

    Calculate the hard choices that have to be made living on the smallest paychecks.

    Even if Congress finally acts to raise the federal minimum wage, higher standards at the state and local level still make sense. It will surprise no one that living costs are generally higher in cities than in rural areas. They often vary among cities in the same state. The New York City metro area is 26 percent more expensive than upstate Utica; costs in the San Jose metro area are 30 percent higher than in El Centro, in southeastern California. Policy makers need to take these variations into account. This is not just a theoretical idea. It has long been policy in Japan. Minimum wages in Tokyo and Osaka are as much as 30 percent higher than they are in regions with the lowest cost of living.
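
    For illustration only, that adjustment logic can be written as a simple scaling rule, in the spirit of the Tokyo and Osaka example; the index values below are hypothetical, not figures from the article.

        # Illustrative sketch (not a formula from the article): scaling a base
        # wage floor by relative living costs. Index values are hypothetical.

        def local_floor(base_wage, local_cost_index, reference_index=100.0):
            """Scale a base wage floor by a regional cost-of-living index."""
            return base_wage * (local_cost_index / reference_index)

        base = 10.10  # hypothetical base floor; the $10.10 proposal is discussed below
        for region, index in [("Lowest-cost region", 100), ("Metro area 30% costlier", 130)]:
            print(f"{region}: ${local_floor(base, index):.2f}")
        # -> $10.10 and about $13.13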

    Here's another way to think about it. One measure of employers' latitude to absorb higher wages compares the minimum wage to the median wage. From the 1960s into the 1970s, the minimum-median ratio in the United States varied between 41 and 55 percent. Since the mid-1980s, it has been much lower, varying between 33 and 39 percent. A minimum wage increase to $10.10 by 2016, as President Obama proposed earlier this year, would restore the national ratio to 50 percent. By comparison, San Francisco's $10.74 minimum wage is 40 percent of the city's median wage. In other words, although some of the proposed rates may seem high by national standards, they look more reasonable when measured against local wage levels.
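
    As a rough check on those figures, here is a minimal sketch of the minimum-to-median calculation; the median wages used are illustrative values implied by the ratios quoted above, not data from the study.

        # Minimal sketch of the minimum-to-median wage ratio described above.
        # The minimum wages come from the text; the median wages are
        # illustrative values chosen only to reproduce the quoted ratios.

        def min_to_median(minimum_wage, median_wage):
            """Minimum wage as a share of the median wage."""
            return minimum_wage / median_wage

        examples = [
            ("U.S. at $10.10 (assumed ~$20.20 median)", 10.10, 20.20),
            ("San Francisco at $10.74 (assumed ~$26.85 median)", 10.74, 26.85),
        ]
        for label, minimum, median in examples:
            print(f"{label}: {min_to_median(minimum, median):.0%}")
        # -> roughly 50% and 40%, matching the ratios in the paragraph above.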

    Local minimum wages also represent a response to growing inequality within cities, in too many of which a growing army of low-paid workers — maids, gardeners, janitors, restaurant and security workers — provides personal services to an increasingly well-heeled minority.

    The record is clear. Employers can afford to pay higher wages that raise families out of poverty and bear a closer relation to local living costs. And there's a moral value, too. An increase in the local minimum wage restores, on a very personal level, some of our notion of fairness.

    Michael Reich is a professor of economics at the University of California, Berkeley, and the director of the Institute for Research on Labor and Employment; Ken Jacobs is the chairman of the Center for Labor Research and Education at Berkeley; together with Miranda Dietz, they are the authors of "When Mandates Work: Raising Living Standards at the Local Level."

    A version of this article appears in print on 03/23/2014, on page SR4 of the National edition with the headline: All Economics Is Local.


    Room for Debate: Turkey and Twitter

    Written By Unknown on Sabtu, 22 Maret 2014 | 13.25

    Introduction


    On Thursday night, the Turkish government blocked Twitter because of security concerns, drawing protests the next day.

    What does this attempt to dismantle social media mean for Prime Minister Recep Tayyip Erdogan and Turkey, his so-called Muslim democracy?

    Read the Discussion »

    Opinionator: A Wedding in Intensive Care

    Written By Unknown on Kamis, 20 Maret 2014 | 13.25

    Private Lives: Personal essays on the news of the world and the news of our lives.

    BOSTON — There wasn't going to be a happy ending. The patient had metastatic cancer and had just gone through her third unsuccessful regimen of chemotherapy. Now it seemed that everywhere we looked, we found disease. An X-ray of her belly revealed an obstruction in her intestines. A CT scan of her chest uncovered a pulmonary embolism. Her labs demonstrated that she had almost no white blood cells left with which to defend herself.

    When she arrived in the intensive care unit, she was delirious. I asked her the usual questions, about her medical history, and whether she wanted us to do CPR if her heart were to stop beating, but she didn't answer. I was just setting the clipboard aside when she raised a hand and told me, in a moment of lucidity: "Doc, do everything you can. I need to make it to my daughter's wedding."

    She was in a lot of pain. She had a tube down her nose draining her stomach.

    "When is the wedding?" I asked.

    "Next summer."

    I blinked. I blinked again. She didn't — she was looking right at me. At this point, I doubted she'd make it through the hospitalization, let alone eight more months. I didn't know what I could say. I put the stethoscope against her chest and retreated into silence.

    I met Stefanie, her daughter, the next morning. She was 24, but was only 8 when her mother's cancer was first diagnosed. Stefanie's mother had Muir-Torre syndrome, a condition that predisposed her to malignancies. So Stefanie had shared her home with cancer for many years, and had always seen her mother fight.

    But she knew that this time was different. The oncology fellow who had been treating her mother as an outpatient was the one to tell her that her mother was dying. Stefanie broke down, but understood there was no use denying it. The dream of a family wedding under the summer sun turned sour.

    Stefanie called her fiancé that morning. Crying, she told him the news. But he flipped the fatalistic script. Without hesitation he told her, "I want her to be there, too," and he proposed not only to have the wedding done sooner, but to have it done right there in the I.C.U.

    Our team was used to dealing with all kinds of crises: Handling a last-minute wedding was not one of them. While having more than one opinion on a medical team regarding how best to manage a patient is fairly routine, we received no pushback from anyone as we started to make arrangements for the wedding. Soon the whole medical team was involved. We sent a letter to the court to expedite the marriage certificate. A pastor and harp player were booked. The hospital cafeteria baked a chocolate cake, and the nurses brought in flowers. In just a few days, we were ready.

    My job was to make sure our patient's pain was controlled while also avoiding the confusion that is a side effect of narcotic medications. But almost miraculously, she didn't need pain medications for hours and was fully aware of everything that was going on. Looking at the bride and groom from her hospital bed, she seemed more comfortable than I had seen her before. The whole day had an unreal feel to it; everything felt like it slowed down. The sun shone through the windows and glistened on the bags of fluid. For once in the hospital, there were tears but no pain. It felt as if, after all these years of chasing our patient down, even the cancer took a break.

    The next morning, the family decided to transition to hospice. No intubation, no CPR — nothing that would prolong life. It was all about trying to make the patient comfortable. (And yet, four months later, she is still alive, and doing as well as can be hoped in hospice.)

    In today's outcome-driven, efficiency-obsessed medical world, it's easy to forget that healing patients isn't just about treating diseases and relieving symptoms. There are things doctors and nurses can do, meaningful interventions — like helping patients fulfill final goals or spend quality time with their families — that cannot be documented in a discharge summary or be converted into a blip on a screen.

    As a physician, I never liked the word "miracle." I preferred to think in terms of "medical outliers." And yet that day of the wedding did feel like a miracle. Physicians often share their patients' sorrow, but rarely their joys. No, we had not discovered the cure to cancer, but we felt that we had achieved something powerful — freeing, if only temporarily, our patient from her disease.

    One of the nurses, smiling through her tears, spoke to me after it was all over. "It was magical," she said. "None of the patient alarms went off."


    Haider Javed Warraich is a resident in internal medicine at the Beth Israel Deaconess Medical Center and the author of the novel "Auras of the Jinn."

    A version of this article appears in print on 03/20/2014, on page A27 of the NewYork edition with the headline: A Wedding in Intensive Care.


    Room for Debate: Drywall Drama

  • Carrie Schoenfeld

    It Can Happen To You

    Carrie Schoenfeld, writer

    My husband and I had heard the nightmare stories, but we thought it would be different with us. We were mistaken.

  • Timothy Dahl

    Contractors Have Concerns, Too

    Timothy Dahl, builder, entrepreneur

    Too often the contractors are cast as villains. In a world where "the customer is always right," the entitlement of many clients has spiraled out of control.

  • Joseph Burgo

    The Narcissism of Home Improvement

    Joseph Burgo, psychotherapist

    The relationship between homeowners and their contractor often resembles an overly idealistic marriage that starts off well and founders in the face of reality.

  • Fernando Lozano

    Happy Worker, Happy Renovation

    Fernando Lozano, economist

    A day labor center can negotiate on behalf of workers and its reputation can put potential employers at ease.

  • Alison Rogers

    Keep an Experienced Renovator on Speed Dial

    Alison Rogers, real estate broker

    The mantra of renovations is "plan, plan, plan" — or as carpenters put it, "measure twice, cut once."

  • Daniel McGinn

    TV Ups the Ante of a (Sometimes) Tedious Pursuit

    Daniel McGinn, author, "House Lust"

    As with cooking shows, home improvement television is hyper-produced and is usually more about aspiration than actuality.



    Opinionator | Draft: Weighing My Words

    Written By Unknown on Selasa, 18 Maret 2014 | 13.26

    Draft is a series about the art and craft of writing.

    As a child, I had a relatively unusual speech impediment: I couldn't form the sounds sh, j or ch properly, and this made a large swath of words difficult to pronounce. The word just would come out sounding like chust or shust; double-whammy words like church never emerged cleanly even if I squared myself and took a good run at them; if I said that I was a Jew someone in earshot might call out "gesundheit!" (everybody's a comedian), and I dreaded having to speak my own name. An adult's uncomprehending "Can you say that again?" was a staple of my life.

    Because I found this mortifying, I learned early to plan each word in advance. Given enough determination, almost any message could be recast in less perilous, albeit slightly formal vocabulary — vocabulary that might have seemed a bit peculiar coming from a child, but served me well. I never offered a suggestion or a choice, only an alternative; I never judged a playground contest, only decided or considered or even weighed it; I'd no sooner have used a word like challenge in front of my peers than I'd have ordered chimichangas. When called on in class, I hesitated on the verge, surveying options: a dozen alternate paths for every sentence, an infinitely proliferating map of every possible route to a given meaning. What might have sounded to others like a thoughtful pause was, in reality, the interval in which I hectically planned my route — turning the syntax of a sentence on its head if necessary in order to land nimbly without touching those three dreaded sounds.

    If, as often happened, there was no getting around a verbally hazardous name or title, I learned to hesitate long enough — pretending to have momentarily forgotten a word — to induce other people to say it for me. Recently, while explaining to a friend the lengths I used to go to in order to avoid certain sounds, I raised the topic of that movie about the king of England who had to overcome his stutter … (here I trailed off; my friend helpfully offered the movie's title) …. and after making reference to that Australian actor who plays the tutor, I once more paused … (my friend gladly supplied the actor's name). Only then did I alert her to the fact that I'd choreographed my way out of having to say either "The King's Speech" or that sonic double-gainer, Geoffrey Rush.

    (Incidentally, no unnecessary words were harmed in the making of the prior paragraph. Hence the use of topic instead of subject … king of England rather than English king … and I'd choreographed my way out of rather than she'd enabled me to avoid. You see how it works.)

    Out loud, language was a minefield, hazards looming all around. But on the blank page! The blank page was a different element altogether — spacious, bright, astonishingly unimpeded. Every word at my disposal, crisp and clean and comprehensible. When I wrote, my self-consciousness vanished. I could, if so inclined, tell the world Joshua changed his judgment. I could order a lifetime's worth of fictional chimichangas, swear without anyone thinking I was saying chit. On the page I could speak my own name clearly.

    Only later, years after the improvement of both my speech and my self-confidence allowed me to drop my verbal hyper-vigilance, did it occur to me to wonder about the literary watermark left by speech impediments. Sure enough, the literary map is dotted with fellow sufferers. Lewis Carroll and Somerset Maugham had stutters, as did Henry James, whose affliction was immortalized by Edith Wharton in "A Backward Glance." In his memoir "Self-Consciousness," John Updike discusses his own stutter ("a kind of windowpane suddenly inserted in front of my face while I was talking … an obdurate barrier thrust into my throat"); and Margaret Drabble's description of the verbal circumlocutions necessary to avoid words that trigger her stammer will leave any veteran of a speech impediment nodding in recognition.

    While one can't know for sure, it seems to me that the flexibility necessary to overcome such verbal adversity must occasionally leave its signature on a writer's style. Drabble has speculated that Henry James's stammer might have helped shape his literary voice, and in a 2004 article in The Telegraph titled "The Literary Stammer," Robert Douglas-Fairhurst discusses James's "snaking sentences, full of measured subclauses and self-qualifications, which may or may not have emerged from the way that stammerers learn to avoid words likely to snag their voices, nimbly sidestepping danger with an alternative word, a new direction." A friend of mine has even suggested, though there's no way to be certain, that the famously playful portmanteau words of "Jabberwocky" might have their origin in Lewis Carroll's need to avoid troublesome sounds.

    Certainly it's always seemed to me that, as literary boot camps go, navigating a mild speech impediment has its merits. Not only can the page be a refuge from the harassments of speech, but needing to avoid certain words brings home firmly that most fundamental of writerly lessons: there is always, always an alternate way to get your meaning across. There are, in fact, 50 or 500 ways to say any given thing.

    And while treating a segment of one's vocabulary as verboten might not be a sustainable way to write a novel, it's not necessarily a bad form of writerly calisthenics. Not when one is operating in a field where one lives or dies by one's ability to throw a sentence in the air, rotate it in three dimensions — then make it dance on the head of a pin, do a quadruple flip, and stick the landing.

    I was 17 when, on one intimidating and lovely day, my own verbal calisthenics ended decisively. I needed to deliver a speech in front of a large audience — and no matter how I tried to divert my vocabulary or my thoughts as I prepared for the event, what I really wanted to talk about was changes. I wrote and rewrote sentence after sentence, only to arrive, exasperated and bored with myself, at the conclusion that no synonym would do — and that enough was enough. Hang the self-consciousness, from then on I was simply going to use the best words — the deliciously, luxuriously right words — for what I needed to say.

    But although my verbal hyper-vigilance ended that day at the podium — and although I now enjoy public speaking, and revert to my old speech pattern only when fatigue or a poor phone connection prompts me to make extra effort to pronounce each word distinctly — the gratitude I feel for the luminous, silent and utterly barrier-free world of writing has never vanished.

    In high school, our swim coach would have us train wearing drag suits: loose second suits on top of the regular team suits we wore. The intent was to slow us swimmers down, make us have to work just a bit harder on every stroke. Come the day of the race, when we'd pop into the pool wearing only our team suits, the experience never failed to shock. The water felt colder, the submerged lights somehow brighter, the entire underwater world looser and astonishingly unimpeded. A person could fly in that water — and we did, we took off through the chill, clear element like dolphins. It was the same sensation I had as a child, and continue to have, when I turn to the blank page: perfect, sweet freedom.

    Rachel Kadish is the author of the novels "From a Sealed Room" and "Tolstoy Lied: a Love Story." Her novella "I Was Here" is currently being released in serial form on the Rooster app for iPhone (www.readrooster.com).



    Room for Debate: Lowering the Deadly Cost of Drug Abuse

    As even quaint New England towns deal with desperate residents chasing their next hit of heroin, painkillers or other hard drugs, and with overdose deaths increasing, officials trying to stem drug abuse have begun to focus more on treatment than on punishment.

    With more people seeing law enforcement's war on drugs as a failure, what are the best, or boldest, ways that experts around the world have proposed to take care of the problem?

    Read the Discussion »

    Opinionator | The Stone: What Would Plato Tweet?

    Written By Unknown on Senin, 17 Maret 2014 | 13.26

    The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

    It began when a writer friend asked me what my Klout score was. We were sitting at the sushi bar of a Japanese restaurant, the master chef assembling edible origami of torched fish and foam. My husband and I used to patronize this neighborhood place quite a lot, until a restaurant critic ruined it for us by his unrestrained rave, so that now you have to make reservations months in advance. But my friend had magically procured us two seats just like that, and when I asked him for the secret of his influence he responded by asking me about my Klout score.

    I didn't know what a Klout score was, but I was pretty sure I didn't have one. And yes, under his raised-eyebrow questioning, it was revealed that since I didn't use Facebook or Twitter or any of the other social media by which a website called Klout calculates your online influence, my score was probably low to nonexistent.

    Perhaps studying the ancient Greeks might give me perspective on today's social-media obsession.

    On either side of us, diners were pointing their cellphones at their plates, taking pictures to be posted on their Facebook pages or Instagram accounts. I knew that's what they were doing. People have taken to putting themselves out there in all kinds of ways, producing — in words, pictures, videos — the shared stories of their lives as they are transpiring. They disseminate their thoughts and deeds, large and small (sometimes very small), in what can seem like a perpetual plea for attention. I wasn't so out of touch that I didn't know about the large cultural changes that had overtaken our society while my attention was directed elsewhere.

    The elsewhere was ancient Greece. For the past few years I'd been obsessed with trying to figure out what lay behind the spectacular achievements that had occurred there. In a mere couple of centuries, Greek speakers went from anomie and illiteracy, lacking even an alphabet, to Aeschylus and Aristotle. They invented not only the discipline of philosophy, but also science, mathematics, the study of history (as opposed to mere chronicles) and that special form of government they called democracy — literally rule of the people (though it goes without saying that "the people" didn't include women and slaves). They also produced timeless art, architecture, poetry and drama. What lay behind the explosive ambition and achievement? I'd always planned eventually to catch up on the changes that were going on all around me — once I'd gotten the ancient Greeks out of my system.

    Only now did it occur to me that I might be able to arrive at some contemporary perspective precisely because I hadn't gotten the Greeks out of my system. Parallels between their extraordinary time and our extraordinary time were suddenly making themselves felt.

    For starters, the Klout on which my friend prided himself struck me as markedly similar to what the Greeks had called kleos. The word comes from the old Homeric word for "I hear," and it meant a kind of auditory renown. Vulgarly speaking, it was fame. But it also could mean the glorious deed that merited the fame, as well as the poem that sang of the deed and so produced the fame. The medium, the message, and the impact: all merged into one shining concept.

    Kleos lay very near the core of the Greek value system. Their value system was at least partly motivated, as perhaps all value systems are partly motivated, by the human need to feel as if our lives matter. A little perspective, which the Greeks certainly had, reveals what brief and feeble things our lives are. As the old Jewish joke has it, the food here is terrible — and such small portions! What can we do to give our lives a moreness that will help withstand the eons of time that will soon cover us over, blotting out the fact that we ever existed at all? Really, why did we bother to show up for our existence in the first place? The Greek speakers were as obsessed with this question as we are.

    Like us, the Greeks wanted to make their lives matter. And like a Twitter user, they did so by courting the attention of other mortals.

    And like so many of us now, they approached this question secularly. Despite their culture's being saturated with religious rituals, they didn't turn to their notoriously unreliable immortals for assurance that they mattered. They didn't really want immortal attention. Something terrible usually happened when they attracted a divine eye. That's what all those rituals were trying to prevent. Rather, what they wanted was the attention of other mortals. All that we can do to enlarge our lives, they concluded, is to strive to make of them things worth the telling, the stuff of stories that will make an impact on other mortal minds, so that, being replicated there, our lives will take on moreness. The more outstanding you were, the more mental replication of you there would be, and the more replication, the more you mattered.

    Not everybody back then was approaching this question of mattering in mortal terms. Contemporaneous with the Greeks, and right across the Mediterranean from them, was a still obscure tribe that called themselves the Ivrim, the Hebrews, apparently from their word for "over," since they were over on the other side of the Jordan. And over there they worked out their notion of a covenantal relationship with one of their tribal gods whom they eventually elevated to the position of the one and only God, the Master of the Universe, providing the foundation for both the physical world without and the moral world within. From his position of remotest transcendence, this god nevertheless maintains a rapt interest in human concerns, harboring many intentions directed at us, his creations, who embody nothing less than his reasons for going to the trouble of creating the world ex nihilo. He takes us (almost) as seriously as we take us. Having your life replicated in his all-seeing, all-judging mind, terrifying as the thought might be, would certainly confer a significant quantity of moreness.

    And then there was a third approach to the problem of mattering, which also emerged in ancient Greece. It, too, was secular, approaching the problem in strictly mortal terms. I'm speaking about Greek philosophy, which was Greek enough to buy into the kleos-like assumption that none of us are born into mattering but rather have to achieve it ("the unexamined life is not worth living") and that the achievement does indeed demand outsize ambition and effort, requiring you to make of yourself something extraordinary. But Greek philosophy also represented a departure from its own culture. Mattering wasn't acquired by gathering attention of any kind, mortal or immortal. Acquiring mattering was something people had to do for themselves, cultivating such virtuous qualities of character as justice and wisdom. They had to put their own souls in order. This demands hard work, since simply to understand the nature of justice and wisdom, which is the first order of business, taxes our limits, not to speak of then acting on our conclusions. And the effort may not win us any kleos. Socrates got himself a cupful of hemlock. He drank it calmly, unperturbed by his low ratings.

    The divergent Greek and Hebrew approaches went into the mix that is Western culture, often clashing but sometimes also tempering one another. Over the centuries, philosophy, perhaps aided by religion, learned to abandon entirely the flawed Greek presumption that only extraordinary lives matter. This was progress of the philosophical variety, subtler than the dazzling triumphs of science, but nevertheless real. Philosophy has laboriously put forth arguments that have ever widened the sphere of mattering. It was natural for the Greeks to exclude their women and slaves, not to mention non-Greeks, whom they dubbed barbarians. Such exclusions are unthinkable to us now. Being inertial creatures, we required rigorous and oft-repeated arguments that spearheaded social movements that resulted, at long last, in the once quixotic declaration of human rights. We've come a long way from the kleos of Greeks, with its unexamined presumption that mattering is inequitably distributed among us, with the multireplicated among us mattering more.

    Only sometimes it feels as if we haven't. Our need to feel as if our lives matter is, as always, unabating. But the variations on the theistic approach no longer satisfy on the scale they once did, while cultivating justice and wisdom is as difficult as it has always been. Our new technologies have stepped in just when we most need them. Kleos — or Klout — is only a tweet away.

    It's stunning that our culture has, with the dwindling of theism, returned to the answer to the problem of mattering that Socrates and Plato judged woefully inadequate. Perhaps their opposition is even more valid today. How satisfying, in the end, is a culture of social-media obsession? The multireplication so readily available is as short-lived and insubstantial as the many instances of our lives they replicate. If the inadequacies of kleos were what initially precipitated the emergence of philosophy, then maybe it's time for philosophy to take on Klout. It has the resources. It's far more developed now than in the day when Socrates wandered the agora trying to prick holes in people's kleos-inflated attitudes. It can start by demonstrating, just as clearly and forcefully as it knows how, that we all matter.

    Mattering — none of us more than the other — is our birthright, and we should all be treated accordingly, granted the resources that allow for our flourishing. Appreciating this ethical truth might help calm the frenzy surrounding our own personal mattering, allowing us to direct more energy toward cultivating justice and wisdom. In fact, fully appreciating this ethical truth, in all its implications for both thought and deed, would itself constitute a significant step toward the cultivation of justice and wisdom.


    Rebecca Newberger Goldstein is the author, most recently, of "Plato at the Googleplex: Why Philosophy Won't Go Away."



    Room for Debate: Are Infrastructure Needs Truly Urgent?

    Written By Unknown on Jumat, 14 Maret 2014 | 13.26

    After a gas explosion in East Harlem killed at least eight people on Wednesday, New Yorkers were shocked to learn that some gas mains in the area, and probably elsewhere, are 127 years old. While it's unclear what caused the blast or whether the main was involved, news that such vital equipment dates to the first Cleveland administration has raised alarm about the state of United States infrastructure and renewed concern that more needs to be done.

    Does the nation need to allocate resources more urgently to repairing and replacing transmission lines, roads, bridges, etc., or are these concerns exaggerated?

    Read the Discussion »

    Opinionator | The Stone: Deconstructing God

    Written By Unknown on Senin, 10 Maret 2014 | 13.25

    The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.

    This is the third in a series of interviews about religion that I am conducting for The Stone. The interviewee for this installment is John D. Caputo, a professor of religion and humanities at Syracuse University and the author of "The Prayers and Tears of Jacques Derrida: Religion Without Religion."

    Gary Gutting: You approach religion through Jacques Derrida's notion of deconstruction, which involves questioning and undermining the sorts of sharp distinctions traditionally so important for philosophy. What, then, do you think of the distinction between theism, atheism and agnosticism?

    John Caputo: I would begin with a plea not to force deconstruction into one of these boxes. I consider these competing views as beliefs, creedal positions, that are inside our head by virtue of an accident of birth. There are the people who "believe" things from the religious traditions they've inherited; there are the people who deny them (the atheism you get is pegged to the god under denial); and there are the people who say, "Who could possibly know anything about all of that?" To that I oppose an underlying form of life, not the beliefs inside our head but the desires inside our heart, an underlying faith, a desire beyond desire, a hope against hope, something which these inherited beliefs contain without being able to contain.

    If you cease to 'believe' in a particular religious creed, you have merely changed your mind. But if you lose 'faith,' a way of life, everything is lost.

    If you cease to "believe" in a particular religious creed, like Calvinism or Catholicism, you have changed your mind and adopted a new position, for which you will require new propositions. Imagine a debate in which a theist and an atheist actually convince each other. Then they trade positions and their lives go on. But if you lose "faith," in the sense this word is used in deconstruction, everything is lost. You have lost your faith in life, lost hope in the future, lost heart, and you cannot go on.

    G.G.: I'm having some trouble with your use of "deconstruction." On the one hand, it seems to be a matter of undermining sharp distinctions, like that between atheism and theism. On the other hand, your own analysis seems to introduce a sharp distinction between beliefs and ways of life — even though beliefs are surely part of religious ways of life.

    J.C.: After making a distinction in deconstruction, the first thing to do is to deconstruct it, to show that it leaks, that its terms are porous and intersecting, one side bleeding into the other, these leaks being the most interesting thing of all about the distinction. I am distinguishing particular beliefs from an underlying faith and hope in life itself, which takes different forms in different places and traditions, by which the particular traditions are both inhabited and disturbed.

    I agree they are both forms of life, but on different levels or strata. The particular beliefs are more local, more stabilized, more codified, while this underlying faith and hope in life is more restless, open-ended, disturbing, inchoate, unpredictable, destabilizing, less confinable.

    G.G.: O.K., I guess you might say that all thinking involves making distinctions, but deconstructive thinking always turns on itself, using further distinctions to show how any given distinction is misleading. But using this sort of language leads to paradoxical claims as, for example, when you say, as you just did, that beliefs contain a faith that they can't contain. Paradox is fine as long as we have some way of understanding that it's not an outright contradiction. So why isn't it a contradiction to say that there's a faith that beliefs both contain and can't contain?

    J.C.: The traditions contain (in the sense of "possess") these events, but they cannot contain (in the sense of "confine" or "limit") them, hold them captive by building a wall of doctrine, administrative rule, orthodoxy, propositional rectitude around them.

    G.G.: So the distinction that saves you from contradiction is this: Beliefs contain faith in the sense that, in the world, beliefs are where we find faith concretely expressed; but any given faith can be expressed by quite different beliefs in quite different historical contexts. In this sense, the faith is not contained by the beliefs. That makes sense.

    Presumably, then, deconstructive theology is the effort to isolate this "common core" of faith that's found in different historical periods — or maybe even the differing beliefs of different contemporary churches.

    J.C.: No! I am not resurrecting the old comparative-religion thesis that there is an underlying transcendental form or essence or universal that we can cull from differing empirical religious beliefs, that can be approached only asymptotically by empirical cases. I am saying that the inherited religious traditions contain something deeper, which is why they are important. I don't marginalize religious traditions; they are our indispensable inheritance. Without them, human experience would be impoverished, its horizon narrowed. We would be deprived of their resources, not know the name of Moses, Jesus, Mohammed, the startling notion of the "kingdom of God," the idea of the messianic and so on.

    As a philosopher I am, of course, interested in what happens, but always in terms of what is going on in what happens. The particular religious traditions are what happen, and they are precious, but my interest lies in what is going on in these traditions, in the memory of Jesus, say. But different traditions contain different desires, promises, memories, dreams, futures, a different sense of time and space. Nothing says that underneath they are all the same.

    G.G.: That doesn't seem to me what typically goes on in deconstructive theology. The deconstructive analysis of any religious concept — the Christian Trinity, the Muslim oneness of God, Buddhist nirvana — always turns out to be the same: an endless play of mutually undermining differences.

    J.C.: There is no such thing as deconstructive theology, in the singular, or "religion," in the singular. There are only deconstructive versions of concrete religious traditions, inflections, repetitions, rereadings, reinventions, which open them up to a future for which they are not prepared, to dangerous memories of a past they try not to recall, since their tendency is to consolidate and to stabilize. Accordingly, you would always be able to detect the genealogy, reconstruct the line of descent, figure out the pedigree of a deconstructive theology. It would always bear the mark of the tradition it inflects.

    A lot of the "Derrida and theology" work, for example, has been following the wrong scent, looking for links between Derrida's ideas and Christian negative theology, while missing his irregular and heretical messianic Judaism. I like to joke that Derrida is a slightly atheistic quasi-Jewish Augustinian, but I am also serious.

    Derrida said he 'rightly passes for an atheist,' but if we stop there we miss everything interesting and important about his thinking about religion.

    G.G.: I can see that there are influences of Judaism, Augustinian Christianity and enlightenment atheism in Derrida. But isn't this just a matter of his detaching certain religious ideas from their theistic core? He talks of a messiah — but one that never comes; he's interested in the idea of confessing your sins — but there's no one to forgive them. After all the deconstructive talk, the law of noncontradiction still holds: Derrida is either an atheist or he isn't. It seems that the only reasonable answer is that he's an atheist.

    J.C.: In the middle of his book on Augustine, Derrida said he "rightly passes for an atheist," shying away from a more definitive "I am an atheist." By the standards of the local rabbi, that's correct, that's the position to attribute to him, that's a correct proposition. But if we stop there we miss everything interesting and important about what he is saying for religion and for understanding deconstruction.

    G.G.: So if I insist on expressing religious faith in propositions (assertions that are either true or false), then, yes, Derrida's an atheist. But according to you, the propositions that express faith aren't what's interesting or important about religion.

    I agree that there's much more to religion than what's stated in creeds. There are rituals, ascetic practices, moral codes, poetry and symbols. But for most people, believing that God exists entails believing such propositions as that there's someone who guarantees that justice will eventually prevail, that no suffering is without meaning, that there is a life after death where we can find eternal happiness.

    J.C.: We have to appreciate the deep distrust that Derrida has for this word "atheism." This kind of normalizing category has only a preliminary value — it finds a place to put him in a taxonomy of "positions" — but it obscures everything that is valuable here. This word is too powerful for him, too violent. That is why in another place he said calling him an atheist is "absolutely ridiculous." His "atheism" is not unlike that of Paul Tillich, when Tillich said that to the assertion that God is a Supreme Being the proper theological response is atheism, but that is the beginning of theology for Tillich, not the end.

    Derrida is not launching a secularist attack on religion. Deconstruction has nothing to do with the violence of the "new atheists" like Richard Dawkins and Christopher Hitchens. Derrida approaches the mystics, the Scriptures, Augustine with respect — they are always ahead of him, he says — and he always has something to learn from them. He is not trying to knock down one position ("theism") with the opposing position ("atheism"). He does not participate in these wars.

    G.G.: You keep saying what Derrida doesn't do. Is there any positive content to his view of religion or is it all just "negative theology"? Is he in any sense "making a case" for religion? Can reading Derrida lead to religious belief?

    J.C.: In its most condensed formulation, deconstruction is affirmation, a "yes, yes, come" to the future and also to the past, since the authentic past is also ahead of us. It leads to, it is led by, a "yes" to the transforming surprise, to the promise of what is to come in whatever we have inherited — in politics, art, science, law, reason and so on. The bottom line is "yes, come."

    Derrida is reading, rereading, reinventing inherited texts and traditions, releasing the future they "harbor," which means both to keep safe but also conceal, all in the name of what Augustine calls "doing the truth." He is interested in all the things found in the Scriptures and revelation, the narratives, the images, the angels — not in order to mine them for their "rational content," to distill them into proofs and propositions, but to allow them to be heard and reopened by philosophy. Deconstruction is a way to read something meticulously, feeling about for its tensions, releasing what it itself may not want to disclose, remembering something it may not want to recall — it is not a drive-by shooting.

    G.G.: But why call this "religion"?

    J.C.: Derrida calls this a "religion without religion." Other people speak of the "post-secular," or of a theology "after the death of God," which requires first passing through this death. In Derrida's delicate logic of "without," a trope also found in the mystics, a thing is crossed out without becoming illegible; we can still see it through the cross marks. So this religion comes without the religion you just described — it is not nearly as safe, reassuring, heartwarming, triumphant over death, sure about justice, so absolutely fabulous at soothing hearts, as Jacques Lacan says, with an explanation for everything. His religion is risky business, no guarantees.

    G.G.: If Derrida doubts or denies that there's someone who guarantees such things, isn't it only honest to say that he is an agnostic or an atheist? For most people, God is precisely the one who guarantees that the things we most fear won't happen. You've mentioned Derrida's interest in Augustine. Wouldn't Augustine — and virtually all the Christian tradition — denounce any suggestion that God's promises might not be utterly reliable?

    J.C.: Maybe it disturbs what "most people" think religion is — assuming they are thinking about it — but maybe a lot of these people wake up in the middle of the night feeling the same disturbance, disturbed by a more religionless religion going on in the religion meant to give them comfort. Even for people who are content with the contents of the traditions they inherit, deconstruction is a life-giving force, forcing them to reinvent what has been inherited and to give it a future. But religion for Derrida is not a way to link up with saving supernatural powers; it is a mode of being-in-the-world, of being faithful to the promise of the world.

    The comparison with Augustine is telling. Unlike Augustine, he does not think a thing has to last forever to be worthy of our unconditional love. Still, he says he has been asking himself all his life Augustine's question, "What do I love when I love my God?" But where Augustine thinks that there is a supernaturally revealed answer to this question, Derrida does not. He describes himself as a man of prayer, but where Augustine thinks he knows to whom he is praying, Derrida does not. When I asked him this question once he responded, "If I knew that, I would know everything" — he would be omniscient, God!

    This not-knowing does not defeat his religion or his prayer. It is constitutive of them, constituting a faith that cannot be kept safe from doubt, a hope that cannot be kept safe from despair. We live in the distance between these pairs.

    G.G.: But if deconstruction leads us to give up Augustine's way of thinking about God and even his belief in revealed truth, shouldn't we admit that it has seriously watered down the content of Christianity, reduced the distance between it and agnosticism or atheism? Faith that is not confident and hope that is not sure are not what the martyrs died for.

    J.C.: In this view, what martyrs die for is an underlying faith, which is why, by an accident of birth or a conversion, they could have been martyrs for the other side. Mother Teresa expressed some doubts about her beliefs, but not about an underlying faith in her work. Deconstruction is a plea to rethink what we mean by religion and to locate a more unnerving religion going on in our more comforting religion.

    Deconstruction is faith and hope. In what? In the promises that are harbored in inherited names like "justice" and "democracy" — or "God." Human history is full of such names and they all have their martyrs. That is why the difference between Derrida and Augustine cannot be squashed into the distinction between "theism" and "atheism" or — deciding to call it a draw — "agnosticism." It operates on a fundamentally different level. Deconstruction dares to think "religion" in a new way, in what Derrida calls a "new Enlightenment," daring to rethink what the Enlightenment boxed off as "faith" and "reason."

    But deconstruction is not destruction. After all, the bottom line of deconstruction, "yes, come," is pretty much the last line of the New Testament: "Amen. Come, Lord Jesus."

    This interview was conducted by email and edited. Previous interviews in this series were with Alvin Plantinga and Louise Antony.

    Gary Gutting is a professor of philosophy at the University of Notre Dame, and an editor of Notre Dame Philosophical Reviews. He is the author of, most recently, "Thinking the Impossible: French Philosophy Since 1960" and writes regularly for The Stone.

