Monday, July 29, 2013

from chronicle.com

The Coastal Consciousness of John Gillis

Richard Howard for The Chronicle Review
The historian John Gillis has spent almost half a century of summers at Great Gott Island, off the coast of Maine.
Clamorous and gusting, Superstorm Sandy blew ashore last fall with a force that felt at once scarily new and, in this, our own Age of Disaster, quite familiar. Watching its frigid waters gushing into Manhattan's subways and overtopping seawalls in the Rockaways and Atlantic City, we were reminded of other storms, like the monster that inundated the citizens of New Orleans—and then turned their plight into a touchstone of our politics. Katrina's aftermath helped torpedo a blustering president's second term, but the images of Sandy, looping past on YouTube and CNN, carried even more-far-reaching impacts. They brought urgency to a climate-change debate finally ready, it seemed, to make all of us envision a world where oceans will be several feet higher than those of today.
As Tropical Storm Andrea began the 2013 hurricane season, many of us were grateful for the warning calls. But as the conversations prompted by those calls grow increasingly suffused with hyperbole and guff, many of us commit that sin, anathema to historians, of condescending to the past. Was it really so, what New York's governor said in Sandy's wake—that "we had never seen a storm like this"? Sandy brought rain and high waters, yes, but Nor'easters have been buffeting America's Atlantic shores for centuries. It wasn't even close to the strongest storm to hit New York during the century that precise wind speeds and rainfall have been recorded. Climate change is real and serious, but was not last fall's "natural disaster," like Katrina and like all the rest to come, as much about human failures—in infrastructure, planning, and our proclivity for building homes on shifting sandbars—as it was a natural catastrophe?
Those questions aren't new. But their new urgency may account for the feeling of providence that accompanied the arrival of the historian John Gillis's latest book. Reaching back into the days when early hominids became human, The Human Shore: Seacoasts in History (University of Chicago Press, 2012) also looks forward to what will happen if we don't change how we relate to seacoasts. The book represents a fitting capstone to the career of a remarkable historian whose arc of interests has anticipated two key, entwined strands in his discipline—the rise of environmental history and global history—and whose work has long exemplified how, in our changing present, the ways we imagine the past can and must change as well.
Gillis well understands the age-old human urge to find our way back to what Rachel Carson called "the great mother of life." He's less sanguine, however, about what most people do when they get there. "Never," he writes, "have shores been so rich in property values and so impoverished in what once had made them the first home of humankind." One of his book's guiding motifs, borrowed from a signpost that had stopped him short on a cliff-top hike in Northern California, is a simple admonition he thinks readers of Coastal Living magazine, and all those who'd love to inhabit the glossy million-dollar views it features, would do well to heed: Never turn your back on the ocean.
Gillis doesn't want us to just remember that. He wants us to understand why we must, as he said this spring when I called to ask him what he hoped readers might take from The Human Shore. Gillis—who divides his time between two shores: San Francisco Bay and an island off Maine where he and his wife, the writer Christina Marsden Gillis, have summered for decades—was direct. "The first step is to start imagining our coasts as less a 'natural' artifact than the product of hundreds and thousands of years of human creation. If we do that, then I think we'd be a long way toward saving them, and ourselves, from utter destruction."
As befits a scholar whose work has sought to trace both those "hundreds and thousands of years of human creation" and their larger effects on the earth, I first met Gillis in a department not of history but of geography. I was a graduate student at the University of California, and Gillis had retired from a long career at Rutgers University, back East, and come to live in Berkeley. This distinguished-looking fellow would turn up at our weekly colloquium and, when the speaker was through discoursing on landscape morphology or settler colonialism, ask incisive questions from behind his white beard. Gillis's predilection for geography in his emeritus years signaled his trajectory in the half-century since he had completed his own Stanford University history Ph.D., as he recalls with a chuckle, on "the Prussian bureaucracy."
After Stanford, Gillis returned to his native New Jersey, first for a brief stint at Princeton University and then up the road to Rutgers's history department for 34 years. Leaving behind his early vocation for sifting Munich's archives, he turned to British history, and, in exploring intimate questions pertaining to hearth and home, built a reputation as a social historian. Youth and History (1974) is a study of age relations in European society across time. For Better, for Worse (1985) traced the rise of the institution of marriage in Britain. And A World of Their Own Making (1996) explored the roots and effects of rituals, like wedding days and Christmas dinner, with which we forge family bonds and contend with their breaking.
Glancing back toward civilization's dawn but locating many of those rituals' birth in Victorian England, A World of Their Own Making offered a keen genealogy of the concept of "family" that doubled as a subtle excoriation of the Christian Coalition types who, at the time, were using "family values" as a club with which to bash sodomites and sex educators. Prompted in part by a family tragedy—the death of the Gillises' son Ben when a small plane he was piloting crashed in Kenya—the book concluded the historian's decades of studying family by synthesizing grand currents with the smaller scale at which we live them. It's been no surprise, then, that as Gillis has expanded his scope, his most recent books have evinced a similar determination to examine history vis-à-vis the ways we imagine its unfolding.
In Islands of the Mind (2004), he traced the grip that islands have exerted on human imagination since the ancients began thinking of them as paradises or prisons; as places to be marooned, reborn, or transformed. Exploring how the West's long obsession with islands "made the Atlantic World," Islands of the Mind included as many citations from poets and writers as from historical theorists or government documents, indicating Gillis's long-nurtured frustration with disciplinary boundaries. He has always bristled at the ways academe rewards narrow expertise and the cultivation, across a tenure-winning march of monographs and articles, of a discrete field. When I asked him why, he explained with a typically geographic metaphor. "'The field'—it's so redolent of territory, and property, isn't it?" he said. "I don't want to be trapped in a field. I want to trespass!"
Even the practitioners of "Atlantic history," the voguish subdiscipline that his work helped to create by treating the world that mariners made in crossing the ocean as a subject for study as worthy as any nation lapped by its waves, can get his goat. "Historians connect all these dots, across the Atlantic, and they get to feel they've gone beyond America's shores," he says. "But they don't really have to do so, or have any apt way, as many critics have started pointing out, to address the degree to which [the Atlantic] is connected to other bodies of water."
Gillis thinks the rise of maritime history has helped correct that—but suffers from the opposite problem: "It turns out to be sea-locked," he says. "It has its jaunty sailor out there, but he never really comes ashore. And so again the shore, and coastal people, end up betwixt and between. They don't have a history, or a geography, to call their own."
The Human Shore is Gillis's attempt to fill that gap. His book places coasts, and their minders, at history's heart. But as befits a historian who has "grown only more and more aware of how much history is an imaginative activity," what most distinguishes his work is the depth he brings to combining the arc of human imagination with its effects—to synthesizing our thinking about seacoasts with the material history of how those ideas will shape the prospects of the planet.
Opening his narrative in earth's amniotic seas, Gillis extends what we all know—that life began in the ocean—to sketch a broader argument about the central role of coastal peoples in the development of civilization. Most modern historians and archaeologists in the West have inherited a bias for the landed from forebears for whom the Bible was a bible of not only history but also geography—a bias visible in our picturing Eden as an inland garden, and, in terms of science, our evolving ancestors as transient hunters on the plain who, thanks to good fortune in the Fertile Crescent, began cultivating wheat and evolving complex societies.
Finding evidence in newly discovered ruins of homes along the marshy coasts of Wales and the huge shell-mounds, built by Ohlone Indians, that still line San Francisco Bay, Gillis argues that it was early humans' engagement with the sea, not their activities on the savannah, that led to their divergence from primates. Echoing the Berkeley geographer Carl Sauer's famous view that "the shore is the primitive home of man," Gillis reminds us that on the shores of Africa, Eurasia, and the Americas alike, aquaculture predated agriculture. Long before our forebears planted wheat, they were setting aside areas for cultivating clams and shellfish. Scholars may disagree about what all this means. But Gillis shows how our historical underplaying of those muddy margins where land and water meet is manifested in the difficulty that our intellectual traditions, like our laws, have had in contending with places that don't definitely belong to either land or sea.
Moving rapidly through the centuries, Gillis describes how the first Homo sapiens to leave our species' East African cradle reached the Indian Ocean's shores 125,000 years ago and then migrated north, across the Red Sea, as "coasting" people whose descendants, from there, moved along the shores of the Arabian Peninsula and on to the Indian subcontinent and beyond. Eventually they surrounded the Indian Ocean, turning its rim into a contiguous web of seaboard civilizations, crosscut and interlinked by shipping routes that have existed for some 5,000 years.
Describing the varied mythological traditions by which people everywhere came to distill their views about the sea, he notes the commonality of belief in land symbolizing order and sea chaos. Coasts, accordingly, were looked on as shifting zones of sharp rocks and deadly sirens: scary sites that belonged more to the realm of the god Oceanus than to the land. It was only as the old maritime empires became modern states (and tamed Oceanus, at least in mind, by dividing its contiguous mass into "seas" with their own names) that the modern urge to transform our shores' terra infirma into territory, and thus to fix the frontier between order and chaos, grew ascendant.
Gillis describes how the "water people" of such marsh-and-island landscapes as England's vast Fens looked on helplessly as their coastal-wetland home was filled in—a drama that was replayed, again and again, from Holland to Boston to the shorelines of the South China Sea, as such projects came to represent harbingers of progress. Recounting how Europe's seamen stitched together a new world in their old one's image, Gillis explains that, at the end of that continent's great Age of Exploration, in the late 18th century, the word "coastline" entered our vocabulary. That moment, he writes, marked the start of a new phase in the life of the shore—typified by ever-expanding human efforts to fix our coasts in place, but also suffused with a new Romantic interest in the sea. The ocean became not merely a terrifying abyss but also a vision of beauty, to be admired.
This conception of the sea, which spread throughout Western culture in the 19th century, is nowhere more visible than in the uniquely modern mania for the beach—for lazing about on the shore three-quarters naked as a form of recreation. It was only at the end of the 1800s that visiting the "beach" (a neologism derived from an English word for coastal stones, Gillis tells us) became common as a leisure activity; it took a few decades more for the beach to grow, in Europe and beyond, into the destination par excellence for another modern invention: the vacation. Gillis reads those developments in terms of the larger social history of leisure and of work. But his discussion of the beach's changing meaning is also a means of examining the far more worrisome effects of its shifting uses, in literally concrete terms.
Whether made of sand or pebbles, beaches are formed by the movement of water. They are, by their nature, ever-changing. "No wonder our ancestors had no name or affection for them," Gillis writes. Few examples so starkly illustrate our changing relationship to the shore as the fetishization of a once-worthless substance—white sand—and the billions of dollars we pour, each year, into keeping the stuff in place. Such efforts, along with the billions more spent on "fixing" coastlines in general (half of New Jersey's shore is engineered in place), bespeak a larger contradiction of our era: that even as more of us than ever settle near the sea—some three billion people now live within 100 miles of its edge—we grow only more ignorant of its protean ways.
A similar disconnect is visible in the ways that our cities' working waterfronts, once the haunt of stevedores and sailors, have been turned into maritime theme parks—New York's South Street Seaport, San Francisco's Fisherman's Wharf, Baltimore's Inner Harbor. Once working wharves, these sites are now for shopping and wave-gazing, mirroring our once-industrial cities' evolution from sites for labor into shrines to conspicuous consumption.
Reconceiving our relationship to the shore in the way Gillis recommends is plainly sensible; translating that reconception into large-scale shifts in our behavior and policies is daunting. Stop building homes ever closer to the edge; protect and restore the coastal marshes and wetlands; redesign the levee systems. Those steps are necessary, but part of what slows their being taken is an ingrained recalcitrance that Gillis finds expressed in a term from Canada's Prince Edward Island: "chasing the shore." It was long used, Gillis writes, to describe poets or idlers who venture down to the sea for purposes other than hauling lobster traps or digging clams. He notes it in discussing the suspicion with which we have historically viewed activities on the shore as not at home in the rational world—and also to suggest how, in our hyperrational age, the shore's lure has seemed only to strengthen.
It certainly has for me. Although I've never heard the words "chasing the shore" spoken on Prince Edward Island, where I've spent some of every summer of my life, it is precisely what I've always done there. Sleeping in a fisherman's shack that my great-grandparents turned into a seasonal cottage, making memories on the red sandbars and mussel-covered rocks of PEI (as the island's lovers and locals call it), I realize that "chasing the shore" is something my family, like many, has turned into a core vocation and value. In our summer home's refashioning, and in the larger transformation of the shore it sits on, from the old aquaculture of the indigenous Micmac through to that of the hardscrabble Scots and Irish, is distilled much of what Gillis discusses about our human shores' past—and their future. The plot on which that cottage sits has been losing a foot of shorefront a year; locals say the erosion is speeding up, apace with waters of the Northumberland Strait, whose level may rise by at least a yard this century.
In our era when climate-change deniers are beginning to resemble those who once denied that germs make us sick, geographers are beginning to speak of the Anthropocene—the epoch of the earth's history defined by Homo sapiens' impress on it. For Gillis, turning toward the environment is only logical, as is his recent turn to the shore. "We need to stop looking at [history] as something that emanates from centers," he told me recently, "and begin to think of it as something that has its origins and dynamics on margins. And coasts, of course, are one of our chief margins."
The rhetorical flip, grounding his metaphor in real geography, is typical Gillis. But in an academy still structured by old disciplines and ingrained fields of expertise, his call may yet be heeded. In recent years, not a few institutions and scholars have embraced proliferating programs and centers for environmental studies and global affairs to try to address our era's most pressing concerns. Many such initiatives, in abetting cross-disciplinary work by climatologists and anthropologists who study, say, the linked scientific and social effects of global warming, have shaped public debate on the issue in crucial ways. Reading Gillis, though, one is struck by how few have met that rarest of intellectual challenges: to produce scholarly work not merely made timely by its engagement with varied fields and modern problems, but also enriched by a historian's understanding of how the human imagination of our planet has helped shape it—and how that history, as Gillis insisted when I visited him at the place that has inspired much of his work, may yet contain seeds for the solving of its problems.
Great Gott Island, where Gillis has spent summers for almost half a century, is a gorgeous bit of evergreened granite with no driveable roads (and no cars), a summer population of some 20 families, and a little wooden shack, down by the wooden jetty in the little harbor, affixed with a sign reading, U.S. Post Office. It's another place whose evolution from a year-round outpost for a few hearty fisherfolk to summer place of memories for a few bohemians and scribblers mirrors much of what Gillis, a self-proclaimed "islander by choice," has mined in his books.
Stopping off to see him there, after my yearly pilgrimage to PEI last summer, I strolled around the island with Gillis on a spotless August afternoon. We looked out at white lobster boats bobbing in the glinting blue waves. Gillis took me to the 19th-century wood-frame house that he and his wife bought for $3,000, back in his Prussian-bureaucracy days, then led me toward the small cemetery plot where Great Gott's minders and lovers—including the Gillises' son Ben—lie at rest beneath stone graves.
Walking past the little cemetery, I asked John about how he thought this little place, and his life here, had informed his determination to write histories of the world entire. He gestured out toward the waves. "'Go west, young man!' That's the line people draw; they think of history as moving west, across the land. But that's not how it actually went, except for during a small chapter of history."
His eyes glinted to match the waves as he invoked a local expression for the bit of human shore I'd just traveled, from Canada's Maritimes down into New England. "I often say that history went more 'down east' than out west. You know how journalists say 'Follow the money'? Well, follow the wind, follow the tide, follow the shore—you'll find what you're looking for."
Joshua Jelly-Schapiro is a lecturer in geography and American studies at the University of California at Berkeley. His book Island People will be published by Alfred A. Knopf.


from time

It's Not What the Pope Said About Gays, It's How He Said It


At first glance, Pope Francis's statement on homosexuality, delivered today in an impromptu press conference aboard the papal plane, seemed to indicate a remarkable break with Church tradition. "If someone is gay and he searches for the Lord and has good will, who am I to judge?" Francis told journalists, as he flew from Rio de Janeiro to Rome. "The tendency [to homosexuality] is not the problem…They're our brothers."
The Pope’s words were warmly received by gay activists in Italy and abroad. “From now on, when I hear a bishop or a priest say something against me, I’m going to say, ‘Who are you to judge,’” says Franco Grillini, president of Gaynet Italia, the association of gay journalists in Italy.
But like many of Francis's more news-making statements, the real difference lies less in the content of his words than in the direct, earthy style in which he delivers them, and in the Church teachings he chooses to emphasize. "It's the way he's expressing himself, with great candor, that is surprising to people," says John Wauck, a professor of communication at the Pontifical University of the Holy Cross. "Actually, the substance of it is nothing exceptional."
Francis’s comment in May that some atheists might make it into heaven drew headlines. The Vatican’s subsequent explanation that his words were in line with a long tradition of Church teachings did not. Similarly, Francis’s statement on the plane was not far from the passage on homosexuality in the Catechism of the Catholic Church, published under Pope John Paul II in 1992. That text calls on Catholics to accept homosexuals “with respect, compassion and sensitivity,” avoiding “every sign of unjust discrimination in their regard.”
Where he differed was in what he left out: the accompanying message in the Catechism that while a gay person is to be accepted, homosexual acts themselves are to be deplored: "Under no circumstances can they be approved … Homosexual persons are called to chastity." Francis, who cited the Catechism in his answers to reporters, said nothing to contradict this. Asked for his position on gay marriage, he answered: "You know perfectly the position of the Church."
But while Francis has put little doctrinal space between himself and his predecessors, comments like the one on the plane reflect a clear choice in the early months of his papacy to de-emphasize the issues of sexual morality that have made the Church a lightning rod in the culture wars. Even as France was consumed last spring in debate over the legalization of gay marriage, a battle that pitted the French Church against the government, Francis made no mention of the issue.
In Brazil, he told the reporters on the plane, he purposefully avoided talking about abortion or gay marriage, in order to stay focused on the positive. "His message is not 'Don't do that, don't do this'," says Wauck. "The moral strictures are present, but they're implicit. The attention of the Pope is on a much larger vision of the Church and what Christianity has to offer to the world."


Read more: http://world.time.com/2013/07/29/its-not-what-the-pope-said-about-gays-its-how-he-said-it/


Tuesday, July 16, 2013

Of polar bears and consciousness: A tribute to Daniel Wegner

from scientific american





July 16, 2013
Last Friday, July 5, the psychology community lost one of its greatest minds, Daniel Wegner. It's hard to overstate his influence on psychology as a whole — and on individual students and researchers (myself included) along the way. Just last week, I came across a new study that bears his clear imprint: the effect of suppressing your craving for cigarettes on the value you place on smoking. The more you suppress, the higher the value you assign to smoking. Wegner would have approved.
About a year and a half ago, I wrote a piece about the origins of Wegner's famed white bear — the one you can't stop thinking about no matter how hard you try. To honor his memory, I'm reproducing it below. Now, try not to think of Dan Wegner.
What do polar bears and social faux pas have in common? (originally published January 12, 2012)
Fyodor Dostoyevsky is a psychological goldmine. If you can think it, chances are he wrote about it. But as far as I know, only once has his writing directly inspired psychological research—and it was his non-fiction at that. Specifically, his reminiscences of travels to the European continent, Winter Notes on Summer Impressions. One chapter in particular, “An Essay Concerning the Bourgeois,” has sparked some of the most prominent social psychology research of the last twenty years: Daniel Wegner’s studies of thought suppression.
In his essay, Dostoyevsky poses a challenge to his readers: rather than doing what writers normally ask you to do—that is, think—try not to think. And what’s more, try not to think of something quite specific – and see how far you can get. Dostoyevsky is not at all optimistic about the result. He writes, “Try to pose for yourself this task: not to think of a polar bear, and you will see that the cursed thing will come to mind every minute.”
When Wegner read this, he was intrigued. So intrigued, in fact, that he decided to test it directly: would people be successful in keeping thoughts of a polar bear at bay when directed to do so? The point of the research was to look at conscious thought suppression, those moments when we deliberately try to keep from thinking about something, as opposed to unconscious thought suppression, an area made famous by Sigmund Freud in his writing on repression and apparent amnesia.
So, Wegner and his colleagues asked a group of students to do just what Dostoyevsky had suggested: not to think of a white bear. For a period of five minutes, the students were asked to report their thoughts verbally. Each time they either thought of or said the words "white bear," they were asked to ring a bell. Then, for five additional minutes, they were instructed to think of a white bear all they wanted, and to continue ringing the bell whenever they did so. Another group received the opposite instructions: to first think of a white bear all they wanted, and then, to not think of the bear at all.
What happened next has since become one of the most widely replicated phenomena in psychology: those participants who had been instructed to avoid all thoughts of a white bear couldn't do it. On average, they either said the words or reported thinking about the bear more than once a minute. Moreover, when they were later told to think of the bear, they experienced a significant rebound effect, mentioning it much more often than any other group.
In a follow-up study, the researchers tried to help out by telling participants to think instead of a red Volkswagen each time they had the urge to think of a white bear. While the additional instructions had little influence in the suppression phase—people still couldn’t help but think of the white bear, even though they were now also thinking of the red car—they did help mitigate the subsequent rebound effect.
Since then, the ironic effects of thought suppression have been illustrated under countless circumstances and with far more candidates than white bears—pink elephants, ex-boyfriends, you name it. And the effects tend to last far longer than a five-minute laboratory session. Often, people will report rebounds of unwanted thoughts over periods of days and even weeks.
The phenomenon shouldn’t be at all surprising to people who have tried, for example, not to think of food when dieting (what else do you think of?) or who’ve done their very best to avoid a sensitive topic in a conversation—only to find themselves saying just the thing they had wanted to avoid. It seems that our natural tendency, whenever a topic bothers us or is in some way unwanted, is to do precisely the thing that makes us worst off: try not to think of it. And the more we try, the harder it can be, and the stronger the rebound we are likely to experience when the thought inevitably makes its way back.
Why would that be the case? The more effort we expend on keeping something from our mind, the more likely we are to be reminded of it—because at some level, we have to keep reminding ourselves not to think about it. As long as not thinking is in the back of our minds, we will be prompted to think of precisely the thing we shouldn’t be thinking about. Wegner calls this an ironic monitoring process: each time we think about a distracter topic to put off the topic we’d like to avoid (something we do consciously), our minds unconsciously search for the unwanted thought so that they can pounce on it if it makes so much as a peep. And if we are tired or stressed or distracted—or even if our mind goes silent for a moment—the unwanted thought will take the opportunity to assert itself.
It’s especially bad in social situations, when we try to avoid making mistakes that would carry some sort of social cost, such as trying not to swear or make sexual references or touch on an otherwise sensitive area of conversation. People who are asked to keep something private are more likely to mention it or allude to it in some way in a conversation. People who are asked not to think of anything sexual are more likely to slip up—and even show greater levels of physical arousal. People with eating disorders are more likely to mention food. People who have some sort of social prejudice—racism, sexism, homophobia—are more likely to say something biased when they are trying to be on their best behavior—especially if they are stressed or otherwise mentally engaged at the time.
The effects can even be physical. If we try to stop a pendulum from swinging in a specific direction, we may find it swinging in just the way we tried our best to avoid—especially if we are told to count backward from 1000 in threes. Athletes who concentrate too hard on avoiding a certain error may find themselves making just that error at the most inopportune of times (in golf, the effect even has a name: the yips). And if you are worried about not being able to fall asleep? Good luck trying to get to sleep. Dostoyevsky’s polar bear, it seems, just won’t let us go.
But, as it turns out, the news does get better. We may not always be at the mercy of the white bear. Twenty-five years after Wegner's original studies, further research has found ways we can keep—or at least help keep—unwanted thoughts from resurfacing precisely when they shouldn't. If we devote time and mental resources to avoiding a topic—and especially if we become absorbed in something else—we can successfully keep it at bay. If we practice focused self-distraction, or try to think intently about one specific topic that isn't the topic we want to avoid, we will also be much more successful than if we let our minds wander without a focused purpose. If we avoid stress and other mental load, we are more likely to be in control of our thoughts. We can also practice techniques of mindfulness, meditation, focused breathing, and attention training (i.e., repeated practice of directing our attention toward specific targets and away from others), all of which allow us to be in better control of our minds more generally. And most interesting of all, if we deliberately try to think of what we want to avoid, we may find ourselves better able to avoid it down the line—a tactic that is known as exposure or habituation in anxiety and phobia research.
Dostoyevsky was right. If we pose as our task the act of not thinking about a polar bear, the cursed thing will indeed jump out at us from around every corner. The worst thing we can possibly do if we don't want something to bother us is to try to avoid it. But if we take a different tack, acknowledging it, embracing it, confronting it, or if we learn to focus our minds on other, more productive lines of thought—through a positive process of actively trying to think of something rather than trying to avoid something else—we are much more likely to learn that the bear is not as powerful as once thought. It may be big and scary, but our minds have the potential to be even bigger and scarier if only we recognize the proper approach.
About the Author: Maria Konnikova is a writer living in New York City. She is the author of the New York Times best-seller MASTERMIND (Viking, 2013) and received her PhD in Psychology from Columbia University. Follow on Twitter @mkonnikova.
The views expressed are those of the author and are not necessarily those of Scientific American.

Saturday, July 13, 2013

Consciousness is ultimate truth and ultimate truth is God.

from speakingtree


By: Santthoshkumaar Kumaar on Jul 13, 2013
 
 
 
 
 
When one stands in front of a mirror, one does not find the reflection of the self; one finds only the body and the background world. That which is conscious of the body and of the experience of the world is not the body but consciousness, which is the innermost self.
 
 
The relationships, love, hate, happiness, and misery of practical life within the practical world are not connected to the soul (formless consciousness), which is one's true self. The true self is beyond form, time, and space, and beyond birth, life, death, and the world.
 
 
Thus, rest in consciousness by recognizing it as our innermost self, which is God, and realize that the practical life within the practical world is a mere mirage created out of consciousness. Consciousness is ultimate truth, and ultimate truth is God.
 
 
Getting rid of the physical shackle is necessary to overcome the burden and bondage of the illusory samsara. Love and relationships seem real in practical life within the practical world, but they are mere illusion from the ultimate standpoint. Thus the game of life is a mere passing show.
 
 
The Upanishads clearly declare:
 
 
Katha Upanishad 1:2:23 The Soul cannot be realized through hearing scholarly explanation of the discourses, not even by the intellect.
 
 
Katha Upanishad 1:3:6 “Through the knowledge of the Soul, God, one is pure and clean constantly.” Neither by reading the book, nor by taking a bath at holy place has one become pure. Inner purity is possible when one remains in constant touch with the Soul. Constant Soul-Consciousness is the real purity.
 
 
Kena Upanishad 2:4 When it is known through every state of cognition, it is rightly known, for (by such knowledge) one attains life eternal. Through one's own self one gains power and through wisdom one gains immortality.
 
 
Kena Upanishad 2:5 If here one knows it, then there is truth, and if here one knows it not, there is a great loss. Hence, seeing the Real in all beings, wise men become immortal on departing from this world.
 
 
Mundaka Upanishad 1:2:8 "Remaining in the fold of ignorance and thinking 'we are extremely wise and learned,' the fools with boastful nature ramble about like the blind led by the blind alone."
 
 
Mundaka Upanishad 3:2:3  “The weak and timid cannot realize the Self. Self-Realization is not possible through intellect or hearing spiritual discourse. One who welcomes God in every activity, through a thorough controlled and disciplined life, to him also the Soul is revealed."
 
 
Mundaka Upanishad 3:2:3  The Soul cannot be realized by the weak and timid.  
 
 
That is why Sage Sri Sankara says: VC 56. Neither by Yoga, nor by Sankhya, nor by work, nor by learning, but by the realisation of one's identity with Brahman is Liberation possible, and by no other means.
 
 
58. Loud speech consisting of a shower of words, the skill in expounding the Scriptures, and likewise erudition - these merely bring on a little personal enjoyment to the scholar, but are no good for Liberation.
 
 
59. The study of the Scriptures is useless so long as the highest Truth is unknown, and it is equally useless when the highest Truth has already been known.
 
 
60. The Scriptures consisting of many words are a dense forest which merely causes the mind to ramble. Hence men of wisdom should earnestly set about knowing the true nature of the Self.
 
 
61. For one who has been bitten by the serpent of Ignorance, the only remedy is the knowledge of Brahman. Of what avail are the Vedas and (other) Scriptures, Mantras (sacred formulae) and medicines to such a one?
 
 
62. A disease does not leave off if one simply utter the name of the medicine, without taking it; (similarly) without direct realisation one cannot be liberated by the mere utterance of the word Brahman.
 
 
63. Without causing the objective universe to vanish and without knowing the truth of the Self, how is one to achieve Liberation by the mere utterance of the word Brahman ? — It would result merely in an effort of speech.
 
 
64. Without killing one’s enemies and possessing oneself of the splendour of the entire surrounding region, one cannot claim to be an emperor by merely saying, ‘I am an emperor’.
 
 
Until one knows the truth of his true existence, whatever he knows about God is mere belief. Belief is individual, whereas the ultimate truth is universal; every belief system has its own idea of God, and thus there is no universality in belief systems.
Source: Santthosh Kumaar (self)

Genes and Consciousness

from speakingtree




“Each and every fractional waves of the vast cosmic mind then takes the form of an individual animate and inanimate structure”. P.R.Sarkar


                The human body has about sixty million million cells, and every single one of them contains a selection of genetic material. Twenty thousand pairs of genes are arranged on twenty-three pairs of chromosomes in each cell. Some estimates suggest that more than a million genes are unused by humans. These genes carry a complete blueprint for making us again, down to the last detail. The genes, which consist of DNA molecules, have a limited life of months. They can replicate, however, making exact copies of themselves as often as necessary, and can continue to exist for more than five thousand million years. During this long journey, they are shuffled millions of times with other similar genes through breeding and reproduction, like a huge pack of cards. Their combinations and permutations are countless. The same genetic material may have passed through a number of species, carrying some of their attributes. Therefore a man may carry a beast, an insect or even a plant in his genes.

            Genes do not appear to be conscious of us or of each other. They do not know that they are involved in evolution. They just exist, and to maintain their existence they exercise full control over their survival machines. Genes dictate the way in which our biological machines are built and the manner in which they operate. They have ultimate power over our behaviour.

            [1]"Genes are the policy-makers, we are their executives. But as evolution progresses, the executive apparatus has become increasingly sophisticated and management has begun to make more and more decisions on its own. Nervous systems have evolved to levels where learning, memory and model-making becomes possible and take over many of the policy decisions. And the logical conclusion to this trend would be for the genes to send out a very elaborate survival machine with only one all-encompassing instruction ~ `do whatever you think best to keep us alive'. But no species on earth has yet reached that level."

            It appears that evolution serves the purpose of "selfish genes". They have to be preserved at any cost. As long as these chemicals can survive, it hardly matters which physical structure happens to carry them. They are even prepared to make adjustments in their sequence to suit their carrier and the environment. This is called mutation. Sometimes they accept foreign genetic material in the form of viruses and take a giant leap in the development of new species. They are not loyal to any species. Their ultimate purpose is simply to survive.

            Genes, however, are like computer software, programmed to direct specific functions. Admittedly, the potential of this software is enormous. It has been discovered that even in the most complex organism, less than three percent of its DNA in the cells is being used, and the parts being used are selected, seemingly at random, in much the way genetic engineers work: for a given instruction, little bits of genetic material are cut up from all over the place and pieced together to carry out a specific function. There is, it appears, an editor in action.

            It seems improbable, moreover, that the genes with all their chemical potentialities have what it takes to make a man on their own. The evolutionary biologist Lyall Watson says, "DNA is not the Bible of life, not an encyclopaedia of precise instructions." He suggests that instead of being airtight, the DNA system is flexible and dynamic, struggling to survive, like its carrier. Is this struggle  being engineered by the  Consciousness?
           
            The debut of consciousness is a great controversy of the twentieth century. Its seat and site are an even greater controversy. The neurologist  Roger Sperry says, [2]"There seems to be good reason to regard the evolutionary debut of consciousness as very possibly the most critical step in the whole of evolution."

            The philosopher Karl Popper says that, [3]"The emergence of consciousness in the animal kingdom is perhaps as great a mystery as the origin of life itself."

            The biologist Lyall Watson says[4], "Consciousness exists in man and not in molecules." He believes that it began not with matter, nor with the origin of life, but at some mid-point in evolution.

            Current theories of consciousness are guided by one or another assumption, depending on one's perspective. Some assume that consciousness is a product of neural elements like the reticular activating system (RAS) of the brain; it is connected with learning and can therefore be quantified through behaviour modification. Others assume that it is intangible and therefore incapable of investigation. Some assume it is a feedback system with survival value: it lets one know how one is doing. Still others assume that it arises when the organism builds its own mental model of the world against the dictates of the genes.

            In all the above assumptions there is one common denominator: they all assume that there is a beginning, and therefore an end, of consciousness. This is where the new science of biopsychology differs from the rest. It proposes that consciousness is beginningless and endless. It is all-pervading and all-knowing. It is the primordial stuff from which this universe has metamorphosed under the influence of primordial energy. Everything from the smallest sub-atomic particles to the largest of planets, from the smallest of viruses to the largest of mammals, and from moulds to the largest of trees is the outcome of interactions between these two primordial principles.

            Consciousness is therefore inherent in every tangible and intangible entity of the universe. We live in a soup of consciousness, from within and from without. We emerge from this soup and dissolve back into it. When the soup is solidified we are tangible matter, and when it is thawed, we are abstract ideas. The soup is infinite but its forms are limited. Thus with the emergence of matter, consciousness is compartmentalised into an infinite entity and numerous forms or units. In Shrii Sarkar's Bio-psychology the infinite entity is called Supreme Consciousness and its forms Unit Consciousness.

            Perhaps quantum mechanics is right in suggesting that consciousness is a basic property of matter. In inert matter it is dormant. With the first sign of life, the soup begins to thaw and consciousness begins to evolve. The transition from non-living to living matter occurs with the development of the famous double helix of DNA, which becomes increasingly complex as newer and newer species evolve. However, the mutations in this genetic material, responsible for new species, cannot produce a single gene.

            From the development of the first strand of DNA to the final emancipation of man, the development and refinement of the nervous system, hormones and immune system, among innumerable other biological changes, have been brought about by genetic mutations. However, the genes have served one and only one fundamental purpose - the liberation of the consciousness that is frozen into unit form. Due to their enormous capacity to carry information from generation to generation and from species to species, the genes are indispensable for the liberation of consciousness.



[1] Watson, L. (1979), Lifetide, Hodder and Stoughton, London.
[2] Sperry, R. (1964), "The Great Cerebral Commissure," Scientific American, Vol. 210, pp. 42-52.
[3] Popper, K. (1972), Objective Knowledge, Oxford University Press, Oxford.
[4] Watson, L. (1979), Lifetide, Hodder and Stoughton, London.

Tuesday, July 9, 2013

What Is Nostalgia Good For? Quite a Bit, Research Shows

from nytimes



Science of Nostalgia: It was first thought to be a “neurological disease of essentially demonic cause,” but it turns out that nostalgia is good for your brain. And there’s science to prove it.
SOUTHAMPTON, England — Not long after moving to the University of Southampton, Constantine Sedikides had lunch with a colleague in the psychology department and described some unusual symptoms he’d been feeling. A few times a week, he was suddenly hit with nostalgia for his previous home at the University of North Carolina: memories of old friends, Tar Heel basketball games, fried okra, the sweet smells of autumn in Chapel Hill.
His colleague, a clinical psychologist, made an immediate diagnosis. He must be depressed. Why else live in the past? Nostalgia had been considered a disorder ever since the term was coined by a 17th-century Swiss physician who attributed soldiers' mental and physical maladies to their longing to return home — nostos in Greek, and the accompanying pain, algos.
But Dr. Sedikides didn’t want to return to any home — not to Chapel Hill, not to his native Greece — and he insisted to his lunch companion that he wasn’t in pain.
“I told him I did live my life forward, but sometimes I couldn’t help thinking about the past, and it was rewarding,” he says. “Nostalgia made me feel that my life had roots and continuity. It made me feel good about myself and my relationships. It provided a texture to my life and gave me strength to move forward.”
The colleague remained skeptical, but ultimately Dr. Sedikides prevailed. That lunch in 1999 inspired him to pioneer a field that today includes dozens of researchers around the world using tools developed at his social-psychology laboratory, including a questionnaire called the Southampton Nostalgia Scale. After a decade of study, nostalgia isn't what it used to be — it's looking a lot better.
Nostalgia has been shown to counteract loneliness, boredom and anxiety. It makes people more generous to strangers and more tolerant of outsiders. Couples feel closer and look happier when they’re sharing nostalgic memories. On cold days, or in cold rooms, people use nostalgia to literally feel warmer.
Nostalgia does have its painful side — it’s a bittersweet emotion — but the net effect is to make life seem more meaningful and death less frightening. When people speak wistfully of the past, they typically become more optimistic and inspired about the future.
“Nostalgia makes us a bit more human,” Dr. Sedikides says. He considers the first great nostalgist to be Odysseus, an itinerant who used memories of his family and home to get through hard times, but Dr. Sedikides emphasizes that nostalgia is not the same as homesickness. It’s not just for those away from home, and it’s not a sickness, despite its historical reputation.
Nostalgia was originally described as a “neurological disease of essentially demonic cause” by Johannes Hoffer, the Swiss doctor who coined the term in 1688. Military physicians speculated that its prevalence among Swiss mercenaries abroad was due to earlier damage to the soldiers’ ear drums and brain cells by the unremitting clanging of cowbells in the Alps.
A Universal Feeling
In the 19th and 20th centuries nostalgia was variously classified as an “immigrant psychosis,” a form of “melancholia” and a “mentally repressive compulsive disorder” among other pathologies. But when Dr. Sedikides, Tim Wildschut and other psychologists at Southampton began studying nostalgia, they found it to be common around the world, including in children as young as 7 (who look back fondly on birthdays and vacations).
“The defining features of nostalgia in England are also the defining features in Africa and South America,” Dr. Wildschut says. The topics are universal — reminiscences about friends and family members, holidays, weddings, songs, sunsets, lakes. The stories tend to feature the self as the protagonist surrounded by close friends.
Most people report experiencing nostalgia at least once a week, and nearly half experience it three or four times a week. These reported bouts are often touched off by negative events and feelings of loneliness, but people say the “nostalgizing” — researchers distinguish it from reminiscing — helps them feel better.
To test these effects in the laboratory, researchers at Southampton induced negative moods by having people read about a deadly disaster and take a personality test that supposedly revealed them to be exceptionally lonely. Sure enough, the people depressed about the disaster victims or worried about being lonely became more likely to wax nostalgic. And the strategy worked: They subsequently felt less depressed and less lonely.
Nostalgic stories aren’t simple exercises in cheeriness, though. The memories aren’t all happy, and even the joys are mixed with a wistful sense of loss. But on the whole, the positive elements greatly outnumber the negative elements, as the Southampton researchers found by methodically analyzing stories collected in the laboratory as well as in a magazine named Nostalgia.
“Nostalgic stories often start badly, with some kind of problem, but then they tend to end well, thanks to help from someone close to you,” Dr. Sedikides says. “So you end up with a stronger feeling of belonging and affiliation, and you become more generous toward others.”
A quick way to induce nostalgia is through music, which has become a favorite tool of researchers. In an experiment in the Netherlands, Ad J. J. M. Vingerhoets of Tilburg University and colleagues found that listening to songs made people feel not only nostalgic but also warmer physically.
That warm glow was investigated in southern China by Xinyue Zhou of Sun Yat-sen University. By tracking students over the course of a month, she and colleagues found that feelings of nostalgia were more common on cold days. The researchers also found that people in a cool room (68 degrees Fahrenheit) were more likely to nostalgize than people in warmer rooms.
Not everyone in the cool room turned nostalgic during the experiment, but the ones who did reported feeling warmer. That mind-body link, Dr. Wildschut says, means that nostalgia might have had evolutionary value to our ancestors long before Odysseus.
“If you can recruit a memory to maintain physiological comfort, at least subjectively, that could be an amazing and complex adaptation,” he says. “It could contribute to survival by making you look for food and shelter that much longer.”
Finding a Sweet Spot
Of course, memories can also be depressing. Some researchers in the 1970s and ’80s suggested that nostalgia could worsen a problem that psychologists call self-discontinuity, which is nicely defined in “Suite: Judy Blue Eyes,” by Stephen Stills: “Don’t let the past remind us of what we are not now.” This sense of loss and dislocation has repeatedly been linked to both physical and mental ills.
But the feeling of discontinuity doesn’t seem to be a typical result of nostalgia, according to recent studies. In fact, people tend to have a healthier sense of self-continuity if they nostalgize more frequently, as measured on the scale developed at Southampton. To understand why these memories seem reassuring, Clay Routledge of North Dakota State University and other psychologists conducted a series of experiments with English, Dutch and American adults.
First, the experimenters induced nostalgia by playing hit songs from the past for some people and letting them read lyrics to their favorite songs. Afterward, these people were more likely than a control group to say that they felt “loved” and that “life is worth living.”
Then the researchers tested the effect in the other direction by trying to induce existential angst. They subjected some people to an essay by a supposed Oxford philosopher who wrote that life is meaningless because any single person’s contribution to the world is “paltry, pathetic and pointless.” Readers of the essay became more likely to nostalgize, presumably to ward off Sartrean despair.
Moreover, when some people were induced to nostalgia before reading the bleak essay, they were less likely to be convinced by it. The brief stroll down memory lane apparently made life seem worthwhile, at least to the English students in that experiment. (Whether it would work with gloomy French intellectuals remains to be determined.)
“Nostalgia serves a crucial existential function,” Dr. Routledge says. “It brings to mind cherished experiences that assure us we are valued people who have meaningful lives. Some of our research shows that people who regularly engage in nostalgia are better at coping with concerns about death.”
Feeding the Memory Bank
The usefulness of nostalgia seems to vary with age, according to Erica Hepper, a psychologist at the University of Surrey in England. She and her colleagues have found that nostalgia levels tend to be high among young adults, then dip in middle age and rise again during old age.
"Nostalgia helps us deal with transitions," Dr. Hepper says. "The young adults are just moving away from home or starting their first jobs, so they fall back on memories of family Christmases, pets and friends in school."
Dr. Sedikides, now 54, still enjoys nostalgizing about Chapel Hill, although his range has expanded greatly over the past decade. He says that the years of research have inspired strategies for increasing nostalgia in his own life. One is to create more moments that will be memorable.
“I don’t miss an opportunity to build nostalgic-to-be memories,” he says. “We call this anticipatory nostalgia and have even started a line of relevant research.”
Another strategy is to draw on his “nostalgic repository” when he needs a psychological lift or some extra motivation. At such moments, he tries to focus on the memories and savor them without comparing them with anything else.
“Many other people,” he explains, “have defined nostalgia as comparing the past with the present and saying, implicitly, that the past was better — ‘Those were the days.’ But that may not be the best way for most people to nostalgize. The comparison will not benefit, say, the elderly in a nursing home who don’t see their future as bright. But if they focus on the past in an existential way — ‘What has my life meant?’ — then they can potentially benefit.”
This comparison-free nostalgizing is being taught to first-year college students as part of a study testing its value for people in difficult situations. Other experiments are using the same technique in people in nursing homes, women recovering from cancer surgery, and prison inmates.
Is there anyone who shouldn’t be indulging in nostalgia? People who are leery of intimate relationships — “avoidant,” in psychological jargon — seem to reap relatively small benefits from nostalgia compared with people who crave closeness. And there are undoubtedly neurotics who overdo it. But for most others, Dr. Sedikides recommends regular exercises.
“If you’re not neurotic or avoidant, I think you’ll benefit by nostalgizing two or maybe three times a week,” he says. “Experience it as a prized possession. When Humphrey Bogart says, ‘We’ll always have Paris,’ that’s nostalgia for you. We have it, and nobody can take it away from us. It’s our diamond.”
This article has been revised to reflect the following correction:
Correction: July 8, 2013
An earlier version of this article incorrectly rendered the nickname of the University of North Carolina basketball team as one word. The team is the Tar Heels, not the Tarheels.