Friday, July 31, 2009

Hare and hounds

Nicholas Schmidle arrived in Pakistan in 2006 as a fellow of the Institute of Current World Affairs. He first appeared on Pakistani intelligence's radar when he wrote a piece on the Pakistani Taliban for the New York Times Magazine. To Live or to Perish Forever is an account of his two years in a country where another foreign journalist, Daniel Pearl, met a bitter end.

Hillary Clinton told reporters during her recent India visit that while she agreed that America faced challenges on Af-Pak, there was no doubt that the Pakistani authorities were working to root out terror. Schmidle's book reminds us how hollow that claim is and how self-destructive America's optimism on this score has proved.

As he roams the streets of Karachi, Gwadar and Quetta, Schmidle meets locals and elites who personify the stark contradictions of Pakistani society. So, while "ninety-nine point ninety-nine" per cent of Pakistanis, as one local puts it, rub their hands in glee at America's troubles in Afghanistan, a full one hundred per cent want nothing to do with the Taliban on Pakistani soil.

For India, the chapter on Pakistan's repression of Balochistan is the most revealing, as well as the most relevant. As the Manmohan Singh government frantically douses fires over the wording of the joint statement signed at Sharm el-Sheikh, Schmidle exonerates India of any involvement, laying the blame squarely on Pakistan and China for co-operating to smother the genuine Baloch demand for autonomy.

Timely and provocative, Schmidle's book perfectly captures the schizophrenic nature of a society where policymakers are always buying time to defuse crises and run a little longer with the hare even as they hunt with the hounds.

Thursday, July 30, 2009

Still searching for God’s dice

The battle between quantum theory and relativity to be crowned the most definitive discovery of the 20th century seemed to have ended with Time magazine naming Albert Einstein its Person of the Century. Yet doubts persist. In the real world, the quantum continues to have greater applicability than relativity: most modern appliances employ quantum principles, and some, such as the electron microscope, operate at scales where quantum effects score over classical phenomena.

Such was the extraordinariness of its precepts that when quantum theory first gained prominence at the beginning of the 20th century, Niels Bohr, one of its chief architects, proclaimed: “Those who are not shocked when they come across quantum theory cannot possibly have understood it.”

Manjit Kumar’s Quantum revisits an era when cutting-edge research yielded radically new ideas about the nature of reality. 1905, Einstein’s annus mirabilis, brought forth his Special Theory of Relativity, forever changing man’s views on space and time.

Around the same time, however, an arguably more earth-shaking theory was taking root, and it busied itself not with the motion of large objects but with the behaviour of the tiniest particle then known to man—the humble electron. Experiments at the sub-atomic level, interpreted by the holy trinity of Niels Bohr, Max Planck and Werner Heisenberg, showed that electrons do not follow the rules of behaviour that had been gospel truth since Newton first propounded his laws in the seventeenth century. Rather, they seemed to show the dual characteristics of wave and particle, depending on the mode of observation.

The need for quantum theory stemmed from the problem of black-body radiation. A black body is a hypothetical object that absorbs all electromagnetic radiation that falls on it. According to Wikipedia, “No electromagnetic radiation passes through it and none is reflected. Because no light (visible electromagnetic radiation) is reflected or transmitted, the object appears black when it is cold. However, a black body emits a temperature-dependent spectrum of light. This thermal radiation from a black body is termed black-body radiation.”

In 1862, Gustav Kirchhoff coined the term while studying the radiation emitted by a laboratory black body, whose spectrum classical physics could not account for. The question remained unanswered until 1900, when Planck postulated that electromagnetic radiation was emitted not as a continuous wave but in discrete packets of energy called quanta. In 1905, Einstein built on Planck’s idea to explain the photoelectric effect — the emission of electrons from metals when electromagnetic radiation falls on their surface — attributing it to the presence of discrete particles of light, later called photons.
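In symbols (a standard textbook summary, not anything quoted from Kumar’s book), the two ideas read:

```latex
% Planck's postulate: radiation of frequency \nu is exchanged in
% discrete packets (quanta), each carrying energy
E = h\nu
% where h is Planck's constant.

% Einstein's photoelectric equation: an ejected electron carries at most
% the photon's energy minus the metal's work function \phi
K_{\max} = h\nu - \phi
```

A photon below the threshold frequency (where hν equals φ) ejects no electron at all, no matter how intense the light, which is precisely the observation classical wave theory could not explain.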

But the real strangeness of quantum theory, which at its base postulates a wave-particle duality for all matter, lay in the discovery made by the young Heisenberg in 1927: it is impossible to measure simultaneously both the position and velocity of a microscopic particle with unlimited accuracy. That is, the measurement of one quantity plays havoc with the measurement of the other, and vice-versa. What this essentially entailed was a realisation, too shocking to be readily grasped, that reality is not a fixed entity but dependent on its observation.
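Stated precisely (in its modern textbook form, not Kumar’s words), the uncertainty principle puts a hard lower bound on the product of the two uncertainties:

```latex
% Heisenberg's uncertainty relation: the uncertainties in position
% (\Delta x) and momentum (\Delta p) cannot both be made arbitrarily small
\Delta x \, \Delta p \ge \frac{\hbar}{2}
% where \hbar = h/2\pi is the reduced Planck constant
```

Squeeze the uncertainty in position towards zero and the uncertainty in momentum balloons without limit, and vice-versa; no improvement in instruments can evade the bound.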

The discovery of uncertainty caused an uproar in the scientific community, since it shook to the core the deterministic principles on which the entire edifice of physics—and by extension, the prevailing patterns of thinking—was based. Einstein famously quipped that “God does not play dice with the universe” and hoped that some still-undiscovered deeper theory would arrive in due course to resolve the inherent contradictions of the quantum. Bohr and Heisenberg, on the other hand, fashioned the Copenhagen Interpretation, which embraced the central indeterminacy of quantum mechanics. In their view, quantum mechanics was the fundamental description of all motion, since for large bodies its predictions converge on those of classical physics.

Kumar’s Quantum builds on the many discoveries that launched the world of the quantum and the fraught relationships among the dramatis personae, many of whom moved in a rarefied circle where talk of force fields and particle physics was common over tea and biscuits. Indeed, Kumar’s real achievement lies not in throwing light on quantum mechanics per se, whose descriptions are often mired in thick scientific jargon, but on a time when the thrill of discovery was so palpable it could cut through that jargon like a hot knife through butter.

As for the quantum, the debate continues, and Kumar ends his book with the hope that a “Theory of Everything”, reconciling the deterministic outlook of Einstein with the thrilling uncertainty of Bohr, will come around in our lifetimes. Except that every new discovery opens new vistas, shifting the ever-changing contours not just of science but of epistemology.

Wednesday, July 29, 2009

Gritty portrait marred by lack of coherence

Nami Mun's debut novel is an interesting patchwork quilt of set pieces that stand well on their own. This story of Joon, the daughter of Korean immigrants to New York (much like Mun herself), has several strengths to its credit. In spite of them, however, it fails to cohere into a complete whole.

At the beginning, Joon has left the house that she and her mother share because her mother, who is mentally unstable, has decided to stop speaking. Before this, the father had abandoned the family and started living with another woman. Joon's guilt at abandoning her mother is muddied by her indecision over who really screwed the family.

On the streets, Joon tries a variety of odd jobs, from hustling to working as an escort. She is befriended by two oddball characters, Knowledge and Wink. As the three hop from one place to another, and from one calamity to the next, Joon forms real friendships for the first time in her life.

This, however, does not diminish her tiresome view of life. The narrative is peppered with Joon's cynical ditties on the fragility of relationships, the seduction of substance abuse, the possibilities of sexual deviancy, and so on. It does get to you at some point.

Redemption is promised towards the end with the fig leaf of Joon's decision to set her life in order and return to live with her mother. Given her know-it-all bluster, this sudden about-turn comes across as contrived.

Miles From Nowhere is not a bad book. Mun does possess a distinctive voice and a talent for characterization. However, the budding author needs to work on dovetailing disparate parts into a satisfying whole.

Saturday, July 11, 2009

'The handsomest young man in England'

Jill Dawson's latest, The Great Lover, is a fictionalised account of the latter years of the life of Rupert Brooke, the English poet who died tragically of septicemia during the First World War. He is best known for his poem "The Soldier", which contains the lines:

"If I should die, think only this of me:
That there's some corner of a foreign field

That is forever England."

The following is a brief biography of Brooke from a website:

Rupert Brooke was born in Rugby, Warwickshire, where his father taught classics and was a housemaster at Rugby School. In his childhood Brooke immersed himself in English poetry and twice won the school poetry prize. In 1906 he went to King's College, Cambridge, and became friends with G.E. Moore, Lytton Strachey, Maynard Keynes, Roger Fry, and Leonard Woolf, members of the future Bloomsbury Group. In 1910 Brooke's father died suddenly, and Brooke served for a short time as a deputy housemaster at Rugby. Thereafter Brooke lived on an allowance from his mother. In 1911 he worked on a thesis on the playwright John Webster and Elizabethan drama, and travelled in Germany and Italy. In England he was a leader of a group of young 'Neo-pagans', who slept outdoors, embraced a religion of nature, and swam naked - among others, Virginia Woolf joined the swimmers in Grantchester. Sex, however, was not part of the fun - "We don't copulate without marriage, but we do meet in cafes, talk on buses, go on unchaperoned walks, stay with each other, give each other books, without marriage," Brooke once told a friend.

It is this fertile period of Brooke's life that Dawson fictionalises in The Great Lover. Nell Golightly is a maid at the Old Vicarage in Grantchester. The house is famous for hosting artistic types, and one summer it plays host to Brooke. To Nell, the upright daughter of a bee-keeper, Brooke and his circle represent the waywardness of artists — a fundamental difference from her staid maidishness — which she resents even as she is drawn to it.

The book pays hearty tribute to Brooke's politics, his many left-leaning causes that ended with a whimper and yet made him overlook class divides and appreciate the repressed intelligence of Nell. Their romance, never spoken of, is a drip-drip patter of subdued tension that ends in a night of emotionally charged lovemaking.

But more than anything else, the novel is a tribute to a way of life, a freedom that encompasses magnanimous love yet refuses to be tied down, sometimes to shocking effect. The long line of Brooke's conquests—men and women—is ever-present in the background, hurtling in and out of Brooke's racing consciousness. Dawson has exonerated Brooke of any scheming, though. As Nell tells him tenderly at one point, yes, there are two kinds of people: those who marry and those who can't.

For, there could have been no scheming. Not from one whose pen produced these lines:

...Oh, never a doubt but, somewhere, I shall wake,
And give what's left of love again, and make
New friends, now strangers...
But the best I've known
Stays here, and changes, breaks, grows old, is blown
About the winds of the world, and fades from brains
Of living men, and dies.
Nothing remains.

O dear my loves, O faithless, once again
This one last gift I give: that after men
Shall know, and later lovers, far-removed,
Praise you, "All these were lovely"; say "He loved".

(From "The Great Lover" by Rupert Brooke)

Dawson's book has the pall of tragedy hanging over it, as the reader accompanies Brooke through his nervous breakdown, his visit to Tahiti, where he finds succour in Taatamata, a local woman he would immortalise in his poetry, and his subsequent return to England. But all this is known beforehand, and when the book ends with a haunting letter Brooke wrote to a friend a few days before his death, it is all a bit too much to bear.

A playful, highly affecting novel!

Thursday, July 09, 2009

A brief history of man

The narrative of human evolution, in spite of Darwin and his Origin of Species, is a discontinuous mishmash that gives us only a broad outline of who we are and where we come from. The evidence for the study of human evolution is derived primarily from fossils, which can give insights into the existence of in-between species — the ones that provide the missing links in the evolution of man.

Hamburg in Germany is the site of an annual fossil fair where scientists, private collectors, dealers and locals converge each December to peddle their pre-historic wares. Jorn Hurum, associate professor of paleontology at the University of Oslo's Natural History Museum, is a regular at the fair, always hoping to add to the museum's substantial collection.

In 2006, Hurum and a museum colleague were milling around the table of Thomas Perner, a prominent dealer. Hurum had had a long association with Perner, and so, when the latter, with a mischievous twinkle in his eye, asked him to meet for a drink later that day, Hurum readily agreed.

Over drinks, Perner explained to Hurum that a private collector seeking anonymity had given him six months to sell a rare find. Perner opened an envelope and showed Hurum a high-resolution colour photograph of a complete fossil skeleton. The photograph was of Ida (so named by Hurum later), fossilised after her death.

Ida is the world-renowned 47-million-year-old primate ancestor whose perfectly fossilised remains were shown to Hurum on that fateful December day. Her discovery is massively important to science because she could provide the crucial missing link in the evolution of primates. It was during the Eocene (56 million to 34 million years ago) that primates split into two distinct groups, a divergence that ultimately led to humankind.

Because of gaps in the fossil record, paleontologists have had to hypothesise about what happened after the earliest primitive primates. Their best guess so far had been that by 40 million years ago there were two distinct primate groups: those with wet noses—lemurs and lorises; and those with dry noses—tarsiers, apes, monkeys and humans. It was Ida that could explain the split in primate evolution.

Barely a year old at the time of her death, Ida died while drinking from a lake in what was then a tropical rain forest. A volcanic eruption engulfed the area surrounding the lake and the dense gas it released rendered Ida unconscious. Her limp body fell into the lake, settling in the sediment at the bottom, which over time, congealed into oily shale. A perfect accident had created the conditions for long-term preservation.

The site where this occurred is located near the village of Messel in Germany. Called the Messel Pit, it is a rich source of fossils from the Middle Eocene period. For much of its history, the Messel Pit was chiefly a quarry, mined by prospectors looking to convert its oil shale into raw petroleum.

However, beginning in 1966, formal excavations were undertaken in the Messel Pit by paleontologists and archaeologists. “Fossils of horses, fish, bats and crocodiles perfectly frozen in time were unearthed and preserved. In many cases, complete skeletons were preserved, along with bacterial imprints of hair, feathers, scales and even internal organs,” writes Colin Tudge, the book’s author.

By 1971, mining had ceased in the area, and it became open hunting ground for scientists and private collectors alike. Sometime in 1982, a private collector from Frankfurt, while splitting layers of shale, “stumbled on a fossil of what looked like an exotic monkey crushed to the thickness of a silver dollar.” He took it home and kept it, away from the eyes of science and the public, until twenty-five years later, when advancing age made him approach Perner.

The Link is the gripping account of how Hurum set about meeting the $1 million price tag on Ida—seeking the assistance of the Oslo museum whose director remarked, “We’re not a museum known around the world like the Louvre, but this could be our Mona Lisa”; authenticating the fossil by means of X-rays and CT scans; and clearing legal hurdles to enable the specimen to leave Germany.

Fans of Bill Bryson and Stephen Jay Gould may find the book lacking in flamboyance, but Tudge’s subject matter makes up for any deficit in flair. There are brilliant illustrations in the book, including three-dimensional images of Ida’s skeleton and close shots of her last meal. Tudge builds on the magnitude of the findings to argue for the need to preserve the environment—there is an amusing yet weighty comparison of timelines to drive home the scale of destruction that human beings have wreaked in their rather minuscule time on earth.

Thursday, July 02, 2009

Web-crazed zombies all?

Which is the single largest recruiting ground that terrorists use to lure gullible people into their nefarious dens? Mosques, I hear you say. Only that is wrong: it's websites, hundreds of thousands of them, says James Harkin, Director of Talks at the ICA in London, in this stimulating new book.

Even as we cheer on young Iranian students using Twitter and other Web 2.0 technologies to raise their collective voice against Iran's botched electoral outcome, Harkin cautions us to keep in mind the many dangerous side-effects that this openness has entailed.

He does so by pointing us to the virtual, albeit very real, domain of Cyburbia — "the place we go when we spend too much time hooked up to other people via a continuous loop of electronic information."

Harkin begins by introducing the concept of cybernetics, founded by the redoubtable Norbert Wiener of MIT, who famously declined to join the Manhattan Project. Closely related to control theory, cybernetics is the study of closed-loop systems: a system's output is fed back to it as input, and the system modifies its behaviour accordingly.
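To make the loop concrete, here is a minimal sketch of my own (nothing like it appears in Harkin's book): a thermostat-style controller that repeatedly measures its output, computes the error, and feeds the correction back in.

```python
# A minimal closed feedback loop: the controller measures the system's
# output, compares it to a goal, and feeds the error back as a correction.
# The setpoint, gain and update rule here are invented for illustration.

def run_feedback_loop(setpoint: float, gain: float = 0.5, steps: int = 10) -> None:
    temperature = 10.0  # the system's initial output
    for step in range(steps):
        error = setpoint - temperature  # feedback: how far off are we?
        temperature += gain * error     # the system modifies itself
        print(f"step {step}: temperature = {temperature:.2f}")

run_feedback_loop(setpoint=20.0)  # the output converges on the setpoint
```

Each pass around the loop shrinks the error, which is exactly the self-correcting behaviour Wiener generalised from machines to societies.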

During the Second World War, Wiener was distressed at the failure of British anti-aircraft gunners to shoot down German aircraft criss-crossing the British sky. The problem was the circuitous routes the German aircraft took to dodge detection: the British tracking system was simply not up to the task of factoring the bombers' zigzag motion into its calculations.

Wiener, working with complex mathematical models, concluded that the information feedback loop between the Luftwaffe bomber and the anti-aircraft gunner was not fast enough, resulting in repeated misses. If only the bomber's movement were suitably predicted, Wiener calculated, the accuracy of the gunner's aim would improve dramatically.
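The idea can be caricatured in a few lines (my own toy sketch; Wiener's actual work involved sophisticated statistical filtering of noisy time series): estimate the target's velocity from its recent positions and aim at where it will be, not where it is.

```python
# Toy illustration of predictive aiming: extrapolate the target's future
# position from its last two observed positions, assuming constant velocity.
# Wiener's real predictor was statistical; this linear guess is invented here.

def predict_position(prev: tuple[float, float],
                     curr: tuple[float, float],
                     lead_time: float) -> tuple[float, float]:
    """Return the aim point `lead_time` steps ahead of the current position."""
    vx = curr[0] - prev[0]  # estimated velocity per time step
    vy = curr[1] - prev[1]
    return (curr[0] + vx * lead_time, curr[1] + vy * lead_time)

# Two observations of a zigzagging bomber, one time step apart
print(predict_position((0.0, 0.0), (3.0, 1.5), lead_time=2.0))  # (9.0, 4.5)
```

The better the estimate of the target's motion, the tighter the loop between observation and correction, which was precisely Wiener's point.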

While Wiener's work would have little bearing on the British war effort, his ideas came to be rapidly accepted in the broader social sciences, especially among the countercultural idealists of the 1960s. These pioneers imagined the establishment of a global "electronic village of authentic information and perfect understanding" based on cybernetics.

One such pioneer was Marshall McLuhan, the man who coined the memorable phrase "The medium is the message." McLuhan, Harkin reminds us, was the progenitor of the idea of the internet, predicting the setting up of a giant electronic loop that would connect things and people in a smorgasbord of anytime connectivity.

Harkin ties the rise of the internet back to cybernetics by exploring the way Google search works: the results on the first few pages drive our knowledge of, and views on, any given topic. The more popular a site, the higher its chance of appearing on the first page of results, creating an endless loop in which a few highly-visited sites govern our consumption of ideas.
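That rich-get-richer loop is easy to simulate (a caricature of my own devising, not Google's actual ranking algorithm): sites that rank higher attract more clicks, and the clicks in turn push them higher.

```python
import random

# Caricature of the search-ranking feedback loop: a user clicks a site with
# probability proportional to its current popularity, and every click makes
# that site more popular still. The click model is invented for illustration.

def simulate_ranking_loop(num_sites: int = 5, rounds: int = 1000) -> list[int]:
    clicks = [1] * num_sites  # every site starts with a single click
    for _ in range(rounds):
        chosen = random.choices(range(num_sites), weights=clicks)[0]
        clicks[chosen] += 1  # the click feeds straight back into the ranking
    return sorted(clicks, reverse=True)

print(simulate_ranking_loop())  # a handful of sites hoover up most clicks
```

Run it a few times: whichever sites get lucky early end up dominating, regardless of merit, which is Harkin's complaint in miniature.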

It is in this vein that Harkin builds his central argument. The internet has engendered a herd-like instinct that dresses up McLuhan's original dictum in a less glamorous interpretation: the content, never all that important, matters even less today, so long as one feels connected to a wider community. Which is why, Harkin seems to chide, seemingly normal adults can waste hours playing childish games and scoring themselves against one another on Facebook.

Welcome to Cyburbia, where youngsters share music and movies illegally on peer-to-peer networks, even as governments struggle to contain newer, more blatant forms of piracy. "The peer-to-peer architecture started out as a hippie cri de coeur at the conformism of post-war American life, but the layout of Cyburbia encourages us to conform to the opinion of our electronic peers," Harkin laments.

The rise of Cyburbia has also meant the easy availability of porn, much of it free and user-generated. The other tragic manifestation of the internet, in Harkin's view, is the global rise of opinion-making, with sundry blogs bloviating on serious topics with no editorial control.

However, in Harkin's view, none of this compares with the curious case of so-called medieval terrorists using the latest technologies to spread their message of hate, or down-and-out lonely souls exploiting the internet's seductive anonymity to enter suicide pacts.

But is this the whole picture? Clearly not. The rise of the internet has brought about several positive transformations, and the abuse of any technology cannot be reason enough to decry it. If the internet allows terrorists to group, it also lets ordinary citizens in non-democratic societies get their views across. Why else does China set such great store by banning websites?

Perhaps, then, it is enough to take Harkin at his word on how the ubiquitous use of the internet is playing havoc with not just our attention spans and social lives, but also our freedom to know and choose. Indeed, a new question is already upon us: what next?