
[SOON , A TITLE HERE ] - Page 80

post #1186 of 1305
In Pursuit of Taste, en Masse
FEB. 11, 2013
AMERICANS didn’t always ask so many questions or expect so much in their quest for enjoyment. It was enough for them simply to savor a good cigar, a nice bottle of wine or a tasty morsel of cheese.

Not anymore. Driven by a relentless quest for “the best,” we increasingly see every item we place in our grocery basket or Internet shopping cart as a reflection of our discrimination and taste. We are not consumers. We have a higher calling. We are connoisseurs.

Connoisseurship has never been more popular. Long confined to the serious appreciation of high art and classical music, it is now applied to an endless cascade of pursuits. Leading publications, including The New York Times, routinely discuss the connoisseurship of coffee, cupcakes and craft beers; of cars, watches, fountain pens, lunchboxes, stereo systems and computers; of tacos, pizza, pickles, chocolate, mayonnaise, cutlery and light (yes, light, which is not to be confused with the specialized connoisseurship of lighting). And the Grateful Dead, of course.

This democratization of connoisseurship is somewhat surprising since as recently as the social upheavals of the 1960s and ’70s connoisseurship was a “dirty word” — considered “elitist, artificial, subjective and mostly imaginary,” said Laurence B. Kanter, chief curator of the Yale University Art Gallery. Today, it is a vital expression of how many of us want to see, and distinguish, ourselves.

As its wide embrace opens a window onto the culture and psychology of contemporary America, it raises an intriguing question: If almost anything can be an object of connoisseurship — and if, by implication, almost anyone can be a connoisseur — does the concept still suggest the fine and rare qualities that make it so appealing?

There were probably Neanderthals who tried to distinguish themselves through their exquisite taste in cave drawings. But the word connoisseur was not coined until the 18th century — in France, of course, as a symbol of the Enlightenment’s increasingly scientific approach to knowledge.

At a time when precious little was known about the provenance of many works of art, early connoisseurs developed evaluative tools — for example, identifying an artist’s typical subject matter, use of color and use of light — to authenticate works by revered masters and to debunk pretenders to the pedestal.

“Works of art do not carry a guarantee,” said Dr. Kanter. “It has always been the job of the connoisseur to question, investigate, refine the received wisdom of earlier generations.”

As the aristocracy declined and the bourgeoisie enjoyed new wealth, especially after the Napoleonic upheavals, the number of people who could afford art expanded, as did the types of art they were interested in. Connoisseurship grew in response to the need for authoritative guidance in a changing world. In the 19th century, connoisseurs helped reassess the works of forgotten artists, like Giotto, Fra Angelico and Botticelli, who are now considered canonical. They studied and appraised ignored forms like German woodcuts, French porcelain and English statuary.

Contemporary efforts to apply connoisseurship to a host of far-flung fields are consistent with this history. “Our definition of quality continues to expand and mature,” Dr. Kanter said, “so it makes sense that we can talk now about connoisseurs not just of art but also of rap music, comic books and Scotch. Connoisseurship is not about objects; it’s a process of thinking about and making distinctions among things.”

True connoisseurs — and this is what makes the label so appealing — do not merely possess knowledge, like scholars. They possess a sixth sense called taste. They are renowned for the unerring judgment of their discerning eye. They are celebrated because of their rare talent — their gift — for identifying and appreciating subtle, often hidden, qualities.

Despite its expanded applications, connoisseurship still revolves around art, if we define art broadly as things that are more than the sum of their parts because they offer the possibility of transcendence. We do not speak of connoisseurs of nature (which can transport us) or diapers (which are simply useful). But no one blinks when we apply the term to wine, food or literary forms like comic books, because these are believed to offer deeper experiences to those who can gain access to them. Generally speaking, almost anyone can become an expert, but connoisseurship means we’re special.

If connoisseurship is a way of thinking, its rising popularity reflects the fact that people have so many more things to think about. Robert H. Frank, a professor of economics at Cornell whose books include “Luxury Fever: Why Money Fails to Satisfy in an Era of Excess,” noted that the British economist John Maynard Keynes worried during the 1920s and ’30s that rising productivity would lead people to work less as it became easier to satisfy their basic needs.

“It’s funny,” Dr. Frank said, “that someone as smart as he was didn’t realize that we would invent a million new things to spend our money on and create higher and higher standards of quality for those products that would cost more and more.”

Hence the $5 cup of coffee and the $8 pickle.

In the dark ages before arugula, most supermarkets seemed to carry only one type of lettuce, iceberg, and apples were either green or red. In 1945, the average grocer carried about 5,000 products; today, that number is more than 40,000, according to Paul B. Ellickson, a professor of economics and marketing at the University of Rochester.

In addition, the Internet has made millions of other options just a mouse click away. Easy access to higher-quality products opens new avenues of connoisseurship — Gorau Glas cheese is more interesting, more provocative, than Velveeta. But it also presents us with a mind-numbing series of choices. In this context, connoisseurship is a coping strategy. When we say we want “the best,” we winnow our options, focusing our attention on a small sample of highly regarded items.

Put another way, rising connoisseurship is a response to life in an age of information shaped by consumerism. As ideas increasingly become the coin of the realm, people distinguish themselves by what they know. An important way to demonstrate this is through what they buy.

It is a form of conspicuous consumption that puts less emphasis on an item’s price tag — craft beers aren’t that expensive — than on its perceived cachet. In hoisting a Tripel brewed by Belgian monks, the drinker is telling the world: I know which ale to quaff. As, in all fairness, he enjoys a very tasty beverage.

Ironically, many items celebrated as examples of connoisseurship — handcrafted, small-batch, artisanal products — are themselves a reaction against the mass production trends of the global consumer society that shapes us. Just as art connoisseurs authenticate paintings, others seek wines and cheese and cupcakes that seem mystically authentic.

“A lot of what gets called connoisseurship is really just snobbery,” said Thomas Frank, who has dissected modern consumer culture in books like “Commodify Your Dissent,” which he edited with Matt Weiland, and “The Conquest of Cool.” “It’s not about the search for quality, but buying things that make you feel good about yourself. It’s about standing apart from the crowd, demonstrating knowledge, hipness.”

The rub is that, as access to knowledge through a Google search has become synonymous with possessing knowledge, fewer and fewer people seem to have the inclination or patience to become true connoisseurs. How many people, after all, have the time to make oodles of money and master the worlds of craft beer, cheese, wines and everything else people in the know must know?

In response, most people outsource connoisseurship, turning to actual connoisseurs for guidance. “Many people want the patina of connoisseurship on the cheap,” said Barry Schwartz, a professor of social theory and social action at Swarthmore College. “So they contract out the decision-making process. My guess is that a tiny fraction of people who are true connoisseurs of wine — and there are some — don’t make enough money to buy a $500 bottle of wine.”

As Steven Jenkins, an expert on cheese and other products at Fairway Market in New York, recently told a reporter for The New York Times: “The customer has no idea what he or she wants. The customer is dying to be told what they want.”

People have always relied on connoisseurs for guidance. What is different today is the idea — suggested by journalists and marketers intent on flattering their customers — that people can become paragons of taste simply by taking someone else’s advice.

Dr. Schwartz said this could be a wise strategy. Consumers may not get the pleasures of deep knowledge, but they also avoid the angst. “You get the benefits of discernment without paying the psychological price” of having to make difficult choices and distinctions, he said. “You’re happy because you’ve been told what to get and don’t know any better.”

This psychological dimension is essential to understanding connoisseurship, said Dan Ariely, a professor of psychology and behavioral economics at Duke University whose books include “Predictably Irrational.” While recognizing that a small handful of people are true connoisseurs, he said his experiments with people interested in wine revealed a startling lack of discernment.

In one experiment, Dr. Ariely asked people to taste and write descriptions of four wines. He waited 10 minutes and then gave them a blind taste test, asking them to match the wines to their descriptions. For the most part, they couldn’t.

In another experiment, he used food coloring to make white wine appear red. The participants, he said, “rated it highly in terms of tannins, complexity” and other general characteristics of red wine.

Dr. Ariely’s work dovetails with other experiments that have found, for instance, that many people cannot tell the difference between foie gras and dog food in blind taste tests.

Even connoisseurs have a hard time getting it right. Echoing a famous blind taste test of wines from California and France in 1976, known as the Judgment of Paris, nine wine experts gathered at Princeton University in 2012 to compare revered wines from France with wines from New Jersey that cost, on average, about 5 percent as much. Not only did the experts give vastly different scores for many of the wines, but they rated the Garden State wines on a par with their costly French counterparts.

Dr. Ariely said these results did not necessarily debunk the notion of connoisseurship. “Whether we can actually tell the difference between cheap and expensive wine may be less important than whether we think that we can,” he said. “We might actually experience more pleasure when drinking an expensive wine, enjoy it more, because we’re slowing down, savoring it, paying more attention to its qualities.”

Which, as it turns out, is a hallmark of connoisseurship.
post #1187 of 1305
The Mughal dynasty (1526–1858) was among the richest and longest ruling in India, and at its peak controlled large portions of the Indian subcontinent. The Mughals were Muslims of Central Asian origin, and Persian was their court language. Their intermarriage with Hindu royalty and establishment of strong alliances with the diverse peoples of the subcontinent led to profound cultural, artistic, and linguistic exchanges.

The Mughal dynasty claimed descent from the Mongols ("Mughal" is from the Arabized transliteration of "Moghol," or Mongol). The Mughal emperors were among India's greatest patrons of art, responsible for some of the country's most spectacular monuments, like the palaces at Delhi, Agra, and Lahore (in present-day Pakistan) and the famous mausoleum, the Taj Mahal (fig. 28).

Fig. 28. Taj Mahal, Agra, India, 1632–53. Commissioned by Shah Jahan

The tastes and patronage of the first six rulers, known as the Great Mughals, defined Mughal art and architecture, and their influence has endured to this day. The works of art featured in this chapter highlight artistic production during the reigns of Jahangir (1605–27) and his son Shah Jahan (1627–58).

Fig. 29. Portrait of Jahangir (detail), about 1615–20; India; ink, opaque watercolor, and gold on paper; 14 x 9 1/2 in. (35.6 x 24.1 cm); The Metropolitan Museum of Art, New York, Gift of Alexander Smith Cochran, 1913 (13.228.47)

Emperor Jahangir (fig. 29) remains best known as a connoisseur and patron of the arts. His memoirs, the Tuzuk-i Jahangiri, describe opulent court events and sumptuous gifts in great detail. They also reflect the emperor's intense interest in the natural world—most evident in the meticulous descriptions of the plants and animals he encountered in India and during his travels. Jahangir is notable for his patronage of botanical paintings and drawings. In addition to works made at his own court, botanical albums with beautifully drawn and scientifically correct illustrations were brought to India by European merchants (see fig. 33). These inspired many of the works in Jahangir's court.

Emperor Shah Jahan's court was unrivaled in its luxury. Like his father Jahangir, Shah Jahan (fig. 30) also had a strong interest in the natural world and a taste for paintings, jewel-encrusted objects (fig. 31), textiles, and works of art in other media. In spite of his large collection of portable works, Shah Jahan is best known for his architectural commissions, which include a huge palace in Delhi and the Taj Mahal (fig. 28), a mausoleum built for his favorite wife. Shah Jahan's architectural projects also reflect the Mughal love of botanical imagery; many of the Taj Mahal's walls are carved with intricate images of recognizable flowers and leaves (fig. 32).

Fig. 30. Shah Jahan on a Terrace, Holding a Pendant Set With His Portrait: Folio from the Shah Jahan Album (recto) (detail), dated 1627–28; artist: Chitarman (active about 1627–70); India; ink, opaque watercolor, and gold on paper; 15 5⁄16 x 10 1/8 in. (38.9 x 25.7 cm); The Metropolitan Museum of Art, New York, Purchase, Rogers Fund and The Kevorkian Foundation Gift, 1955 (

Fig. 31. Mango-shaped flask, mid-17th century; India; rock crystal, set with gold, enamel, rubies, and emeralds; H. 2 1/2 in. (6.5 cm); The Metropolitan Museum of Art, New York, Purchase, Mrs. Charles Wrightsman Gift, 1993 (1993.18).

This small flask typifies the Mughal interest in natural forms and their transformation into richly decorated objects. The realistic mango shape was carved from rock crystal and encased in a web of golden strands of wire punctuated by rubies and emeralds. This flask is a practical vessel—possibly used to hold perfume or lime, an ingredient in pan, a mildly intoxicating narcotic popularly used in India—as well as a jewel-like decorative object that would have displayed the wealth and refined taste of its owner.

Fig. 32. Detail of wall showing highly naturalistic floral decoration, Taj Mahal, Agra, India, 1632–53.

During the golden age of Mughal rule (approximately 1526–1707), the emperors had a marked interest in naturalistic depictions of people, animals, and the environment. They employed the most skilled artists, who documented courtiers and their activities as well as the flora and fauna native to India. Informed by Mughal patronage, a new style of painting emerged in illustrations made for books and albums, which combined elements of Persian, European, and native Indian traditions. These works demonstrate keen observation of the natural environment and the royal court. The emperors collected their favorite poetry, calligraphy, drawings, and portraits in extensive albums, which were among their most valued personal possessions and were passed down to successive generations.

In addition to works on paper, the decorative arts of the Mughal court engaged a broad range of natural forms in carpets, textiles, jewelry, and luxuriously inlaid decorative objects, and used precious materials ranging from gemstones and pearls to silk.

The Emperor Shah Jahan with His Son Dara Shikoh: Folio from the Shah Jahan Album (verso)
About 1620
Artist: Nanha (active 1605–27)
Ink, opaque watercolor, and gold on paper; margins: gold and opaque watercolor on dyed paper; 15 5/16 x 10 5/16 in. (38.9 x 26.2 cm)
The Metropolitan Museum of Art, New York, Purchase, Rogers Fund and The Kevorkian Foundation Gift, 1955 (

Mughal empire, courtly life, Emperor Shah Jahan, natural world, album, figural art, plants, birds, watercolor, ink

This painting demonstrates the Mughals' focus on portraiture as well as their love of precious objects (see fig. 30). It presents two realistic depictions of the Mughal royal family—the Mughal emperor Shah Jahan and his eldest son, Dara Shikoh, who are shown examining precious stones.

The painting comes from an album begun by Emperor Jahangir and continued by his son Shah Jahan. The album was created for private viewing and study by the emperor.

Two figures are seated on a golden throne furnished with luxurious cushions. Shah Jahan admires the large ruby clasped in his right hand, while his son—who is facing him—looks toward the bowl of precious stones resting in his father's left hand. The emperor is clad in a red and yellow striped turban with a plume, a white double-breasted gown called a jama, a richly embroidered sash, and a violet garment called a pajama. On his right thumb is a jeweled ring, which could be used to draw the string of a hunting bow. The handle of a jeweled dagger, signaling his supremely important position in the court, is visible just above his waist.

Prince Dara Shikoh is dressed in a yellow jama fastened with a sash. In one hand he holds a turban pin, in the other a fly whisk made from a peacock feather. Multiple strands of pearls adorn Dara Shikoh; under Mughal rule, pearls were a hallmark of nobility, and princes and princesses were almost always portrayed with them.

The patron of this painting was most likely Shah Jahan's father, Emperor Jahangir, who was interested in realistic and masterfully drawn depictions of people, animals, and plants. The wide border that frames the painting contains precisely rendered images of flowers and birds. In the upper right corner are flowers, including narcissus, roses, poppies, and crocus. The Mughal style of creating botanically accurate flowers was informed by the presence of European botanical prints in the court (fig. 33). Birds, such as chukar partridges, demoiselle cranes, pigeons, Indian peafowl, and birds of paradise (symbolizing royalty), are also depicted with skillful realism. All the birds are native to the Mughal territories and still exist in present-day India and Pakistan.

Fig. 33. Crocus, folio 61 of Le Jardin du Roy tres Chrestien Henry IV Roy de France et de Navare, 1608; designer: Pierre Vallet (French, about 1575–1657); The Metropolitan Museum of Art, New York, Harris Brisbane Dick Fund, 1935 (35.67.3)

This type of illustration, from a European botanical album, influenced the paintings produced at the Mughal court. Notice the crocus in the top right corner of the margin of image 30.

Dagger with hilt in the form of a blue bull (nilgai)
About 1640
The Metropolitan Museum of Art, New York, Gift of Alice Heeramaneck, in memory of Nasli Heeramaneck, 1985 (1985.58a)

Mughal empire, royal hunt, dagger, Emperor Shah Jahan, natural world, album, animals, nephrite (jade), steel

Mughal emperors were keen observers of animals, and the blue bull (nilgai) — a large antelope native to India — was among their favorites. The intimate familiarity with the features of the blue bull, as well as the fine quality of the carving, suggests that this dagger was made in the royal workshop by someone with access to the imperial zoo, which would have housed both native and foreign animals.

Finely carved daggers such as this were seldom used as weapons, but rather were part of the royal ceremonial costume of the Mughal court. Surviving daggers featuring animal heads are relatively rare, and were probably worn by those of the highest status at the Mughal court.

The head of the blue bull, which forms the handle of this dagger, features thin hollow ears, delicately carved facial features, and grooves along the neckline where the owner could rest his fingers. At the base of the hilt, a lotuslike flower rests in a leaf scroll, which bulges over the edge—a feature that prevents the hand from slipping from the smooth handle onto the sharp blade. The blue tone of the jade (nephrite) resembles the animal's coat, which was admired for its bluish gray hue.

The Mughal emperors' interest in animals might be considered paradoxical by today's standards. They admired animals for their beauty, enjoyed observing them in the wild and in the imperial zoo, but also were avid hunters and even held animal fights at the court where courtiers could place bets on their favorites. Court painters were often present during these fights and sketched the animals from life (fig. 34).

While the Mughals' Islamic faith informed their disapproval of large-scale figurative sculpture, India had a rich indigenous sculptural tradition, which influenced Mughal art. This figural tradition was transformed by the Mughals into objects such as this one—small in scale and finely executed. The genre of small-scale animal sculptures and depictions flourished in Mughal India, and the handle of this dagger, with its realistically carved head of a blue bull, is a prime example of this trend.

Fig. 34. Blue Bull (Nilgai): Folio from the Shah Jahan Album (verso), about 1620; artist: Mansur (active 1589–1629); India; ink, opaque watercolor, and gold on paper; 15 5/16 x 10 1/16 in. (38.9 x 25.6 cm); The Metropolitan Museum of Art, New York, Purchase, Rogers Fund and The Kevorkian Foundation Gift, 1955 (

According to Emperor Jahangir's memoirs, the blue bull (nilgai) was commonly encountered on royal hunts. This illustration is by the court artist Mansur, who often accompanied the ruler on his hunts. He had a special talent for observing and depicting nature, and shows how the bull would have appeared in the wild. Blue bulls still live in the grasslands of present-day India, Pakistan, and Nepal.

Red-Headed Vulture and Long-Billed Vulture: Folio from the Shah Jahan Album (verso)
About 1615–20
Artist: Mansur (active 1589–1629)
Ink, opaque watercolor, and gold on paper; 15 3/8 x 10 1/16 in. (39.1 x 25.6 cm)
The Metropolitan Museum of Art, New York, Purchase, Rogers Fund and the Kevorkian Foundation Gift, 1955 (
post #1188 of 1305
Here is a place I made to store my various creative things, anonymously, that fall outside of the professional realm. Since we don't have a 'Creations' thread I thought it might fit here among the other things that don't fit in.

post #1189 of 1305

Cool stuff man, are those waterfall photos Iceland?

post #1190 of 1305
yes indeed. and most importantly GLACIERS
post #1191 of 1305

More pictures must be posted, Iceland is my dream travel destination.

post #1192 of 1305
It's amazing. Highly recommend. Landscape is so alien, which is sublime (for a sci-fi dork). Also, snowmobiling on glaciers, snorkeling in glacier run-off streams, exploring lava caves, hot springs, etc. Standing on no-man's land which is slowly sinking as the tectonic plates move away from each other. I want to go back.
post #1193 of 1305
Gnomes, don't forget the gnomes
post #1194 of 1305

I love it. This is the most metal thing ever
post #1195 of 1305
It was nuts, there was an entire field of those. The guide told me they ship tons of them to developing areas that have a hard time finding a plentiful protein source. Apparently it's good for soup!
post #1196 of 1305
The one and only you
There are flaws in our intuitive beliefs about what makes us who we are. Who are we really, asks philosopher Jan Westerhoff
Think back to your earliest memory. Now project forward to the day of your death. It is impossible to know when this will come, but it will.

What you have just surveyed might be called your "self-span", or the time when this entity you call your self exists. Either side of that, zilch.

Which is very mysterious, and a little unsettling. Modern humans have existed for perhaps 100,000 years, and more than 100 billion have already lived and died. We assume that they all experienced a sense of self similar to yours. None of these selves has made a comeback, and as far as we know, neither will you.

What is it about a mere arrangement of matter and energy that gives rise to a subjective sense of self? It must be a collective property of the neurons in your brain, which have mostly stayed with you throughout life, and which will cease to exist after you die. But why a given bundle of neurons can give rise to a given sense of selfhood, and whether that subjective sense can ever reside in a different bundle of neurons, may forever remain a mystery. Graham Lawton

The self: meet the real you

We hold fundamental beliefs about what defines us, but they crumble under scrutiny

THERE appear to be few things more certain to us than the existence of our selves. We might be sceptical about the existence of the world around us, but how could we be in doubt about the existence of us? Isn't doubt made impossible by the fact that there is somebody who is doubting something? Who, if not us, would this somebody be?

While it seems irrefutable that we must exist in some sense, things get a lot more puzzling once we try to get a better grip of what having a self actually amounts to.

Three beliefs about the self are absolutely fundamental to our sense of who we are. First, we regard ourselves as unchanging and continuous. This is not to say that we remain forever the same, but that among all this change there is something that remains constant and that makes the "me" today the same person I was five years ago and will be five years in the future.

Second, we see our self as the unifier that brings it all together. The world presents itself to us as a cacophony of sights, sounds, smells, mental images, recollections and so forth. In the self, these are all integrated and an image of a single, unified world emerges.

Finally, the self is an agent. It is the thinker of our thoughts and the doer of our deeds. It is where the representation of the world, unified into one coherent whole, is used so we can act on this world.

All of these beliefs appear to be blindingly obvious and as certain as can be. But as we look at them more closely, they become less and less self-evident.

It would seem obvious that we exist continuously from our first moments in our mother's womb up to our death. Yet during the time that our self exists, it undergoes substantial changes in beliefs, abilities, desires and moods. The happy self of yesterday cannot be exactly the same as the grief-stricken self of today, for example. But we surely still have the same self today that we had yesterday.

There are two different models of the self we can use to explore this issue: a string of pearls and a rope. According to the first model, our self is something constant that has all the changing properties but remains itself unchanged. Like a thread running through every pearl on a string, our self runs through every single moment of our lives, providing a core and a unity for them. The difficulty with this view of the self is that it cannot be most of the things we usually think define us. Being happy or sad, being able to speak Chinese, preferring cherries to strawberries, even being conscious – all these are changeable states, the disappearance of which should not affect the self, as a disappearance of individual pearls should not affect the thread. But it then becomes unclear why such a minimal self should have the central status in our lives that we usually accord to it.

The second model is based on the fact that a rope holds together even though there is no single fibre running through the entire rope, just a sequence of overlapping shorter fibres. Similarly, our self might just be the continuity of overlapping mental events. While this view has a certain plausibility, it has problems of its own. We usually assume that when we think of something or make a decision, it is the whole of us doing it, not just some specific part. Yet, according to the rope view, our self is never completely present at any point, just like a rope's threads do not run its entire length.

It seems then as if we are left with the unattractive choice between a continuous self so far removed from everything constituting us that its absence would scarcely be noticeable, and a self that actually consists of components of our mental life, but contains no constant part we could identify with. The empirical evidence we have so far points towards the rope view, but it is by no means settled.

Even more important, and just as troublesome, is our second core belief about the self: that it is where it all comes together.

It is easy to overlook the significance of this fact, but the brain accomplishes an extremely complex task in bringing about the appearance of a unified world. Consider, for example, that light travels much faster than sound yet visual stimuli take longer to process than noises. Putting together these different speeds means that sights and sounds from an event usually become available to our consciousness at different times (only sights and sounds from events about 10 metres away are available at the same time). That means the apparent simultaneity of hearing a voice and seeing the speaker's lips move, for example, has to be constructed by the brain.
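The "about 10 metres" figure above falls out of simple arithmetic. This minimal sketch, assuming a speed of sound of 343 m/s and an illustrative ~30 ms visual-over-auditory processing lag (the article does not give this number), computes the distance at which the sound's travel delay cancels vision's slower processing:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

# Assumption: visual stimuli reach awareness roughly 30 ms later
# than auditory stimuli, once both have arrived at the senses.
# This figure is illustrative, not taken from the article.
PROCESSING_LAG = 0.030  # seconds

def simultaneity_distance(lag=PROCESSING_LAG, c=SPEED_OF_SOUND):
    """Distance at which sound's extra travel time equals vision's
    extra processing time, so sight and sound feel simultaneous.
    (Light's travel time over such distances is negligible.)"""
    return lag * c

print(f"{simultaneity_distance():.1f} m")  # prints "10.3 m"
```

Closer than this, the sound arrives "too early" relative to the processed image; farther away, "too late" — yet the brain presents events well inside this range as simultaneous, which is the constructive work the passage describes.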

Our intuitive view of the result of this process resembles a theatre. Like a spectator seated in front of a stage, the self perceives a unified world put together from a diverse range of sensory data. It would get confusing if these had not been unified in advance, just as a theatregoer would be confused if they heard an actor's lines before he was on stage. While this view is persuasive, it faces many difficulties.

Consider a simple case, the "beta phenomenon". If a bright spot is flashed onto the corner of a screen and is immediately followed by a similar spot in the opposite corner, it can appear as if there was a dot moving diagonally across the screen. This is easily explained: the brain often fills in elements of a scene using guesswork. But a tweak to this experiment produces a curious effect.

If the spots are different colours – for example a red spot followed by a green spot – observers see a moving spot that changes colour abruptly around the mid-point of the diagonal (see "Spotted trick", page 36). This is very peculiar. If the brain is filling in the missing positions along the diagonal for the benefit of the self in the theatre, how does it know before the green spot has been observed that the colour will switch?

One way of explaining the beta phenomenon is by assuming that our experience is played out in the theatre with a small time delay. The brain doesn't pass on the information about the spots as soon as it can, but holds it back for a little while. Once the green spot has been processed, both spots are put together into a perceptual narrative that involves one moving spot changing colour. This edited version is then screened in the theatre of consciousness.

Unfortunately, this explanation does not fit in well with evidence of how perception works. Conscious responses to visual stimuli can occur at a speed very close to the minimum time physically possible. If we add up the time it takes for information to reach the brain and then be processed, there is not enough time left for a delay of sufficient length to explain the beta phenomenon.

Perhaps there is something wrong with the notion of a self perceiving a unified stream of sensory information. Perhaps there are just various neurological processes taking place in the brain and various mental processes taking place in our mind, without some central agency where it all comes together at a particular moment, the perceptual "now" (see "When are you?", page 37). It is much easier to make sense of the beta phenomenon if there is no specific time when perceptual content appears in the theatre of the self – because there is no such theatre.

The perception of a red spot turning green arises in the brain only after the perception of the green spot. Our mistaken perception of the real flow of events is akin to the way we interpret the following sentence: "The man ran out of the house, after he had kissed his wife". The sequence in which the information comes in on the page is "running–kissing", but the sequence of events you construct and understand is "kissing–running". For us to experience events as happening in a specific order, it is not necessary that information about these events enters our brain in that same order.
The final core belief is that the self is the locus of control. Yet cognitive science has shown in numerous cases that our mind can conjure, post hoc, an intention for an action that was not brought about by us.

In one experiment, a volunteer was asked to move a cursor slowly around a screen on which 50 small objects were displayed, and to stop the cursor on an object every 30 seconds or so.


The computer mouse controlling the cursor was shared, ouija-board style, with another volunteer. Via headphones, the first volunteer would hear words, some of which related to the objects on screen. What this volunteer did not know was that their partner was one of the researchers who would occasionally force the cursor towards a picture without the volunteer noticing.

If the cursor was forced to the image of a rose, and the volunteer had heard the word "rose" a few seconds before, they reported feeling that they had intentionally moved the mouse there. The reasons why these cues combine to produce this effect are not what is interesting here; more important is what the effect reveals: the brain does not always display its actual operations to us. Instead, it can produce a post-hoc "I did this" narrative despite lacking any factual basis for it (American Psychologist, vol 54, p 480).

So, many of our core beliefs about ourselves do not withstand scrutiny. This presents a tremendous challenge for our everyday view of ourselves, as it suggests that in a very fundamental sense we are not real. Instead, our self is comparable to an illusion – but without anybody there that experiences the illusion.

Yet we may have no choice but to endorse these mistaken beliefs. Our whole way of living relies on the notion that we are unchanging, coherent and autonomous individuals. The self is not only a useful illusion, it may also be a necessary one.

Jan Westerhoff is a philosopher at the University of Durham, UK, and the University of London's School of Oriental and African Studies, and author of Reality: A very short introduction (Oxford University Press, 2011)

"Our self runs through our lives like a thread runs through a string of pearls"

For most people, most of the time, the sense of self is seamless and whole. But as with any construct of the brain, it can be profoundly disturbed by illness, injury or drugs. Anil Ananthaswamy and Graham Lawton reveal a few examples
Depersonalisation Disorder

Many people experience brief episodes of detachment, but for others "depersonalisation" is an everyday part of life. The Diagnostic and Statistical Manual of Mental Disorders IV defines it as "a feeling of detachment or estrangement from one's self… The individual may feel like an automaton or as if he or she is living in a dream or a movie. There may be a sensation of being an outside observer of one's mental processes, one's body, or parts of one's body." There is some evidence that this state is caused by a malfunction of the body's emotion systems (Consciousness and Cognition, vol 20, p 99).
The petrified self

A crucial building block of selfhood is the autobiographical self, which allows us to recall the past, project into the future and view ourselves as unbroken entities across time. Key to this is the formation of memories of events in our lives.
Autobiographical memory formation is one of the first cognitive victims of Alzheimer's disease. This lack of new memories, along with the preservation of older ones, may be what leads to the outdated sense of self – or "petrified self" – often seen in the early stages of the disease. It could also be what causes a lack of self-awareness of having the illness at all (Consciousness and Cognition, vol 8, p 989).

You think you live in the present?
Our brains create our own version of reality to help us make sense of things. But this means we're living outside time, says Jan Westerhoff

It seems obvious that we exist in the present. The past is gone and the future has not yet happened, so where else could we be? But perhaps we should not be so certain.

Sensory information reaches us at different speeds, yet appears unified as one moment. Nerve signals need time to be transmitted and time to be processed by the brain. And there are events – such as a light flashing, or someone snapping their fingers – that take less time to occur than our system needs to process them. By the time we become aware of the flash or the finger-snap, it is already history.

Our experience of the world resembles a television broadcast with a time lag; conscious perception is not "live". This on its own might not be too much cause for concern, but in the same way that the TV time lag makes last-minute censorship possible, our brain, rather than showing us what happened a moment ago, sometimes constructs a present that has never actually happened.

Evidence for this can be found in the "flash-lag" illusion. In one version, a screen displays a rotating disc with an arrow on it, pointing outwards (see "Now you see it...", below). Next to the disc is a spot of light that is programmed to flash at the exact moment the spinning arrow passes it. Yet this is not what we perceive. Instead, the flash lags behind, apparently occurring after the arrow has passed.

One explanation is that our brain extrapolates into the future. Visual stimuli take time to process, so the brain compensates by predicting where the arrow will be. The static flash – which it can't anticipate – seems to lag behind.

Neat as this explanation is, it cannot be right, as was shown by a variant of the illusion designed by David Eagleman of the Baylor College of Medicine in Houston, Texas, and Terrence Sejnowski of the Salk Institute for Biological Studies in La Jolla, California.

If the brain were predicting the spinning arrow's trajectory, people would see the lag even if the arrow stopped at the exact moment it was pointing at the spot. But in this case the lag does not occur. What's more, if the arrow starts stationary and moves in either direction immediately after the flash, the movement is perceived before the flash. How can the brain predict the direction of movement if it doesn't start until after the flash?

The explanation is that rather than extrapolating into the future, our brain is interpolating events in the past, assembling a story of what happened retrospectively (Science, vol 287, p 2036). The perception of what is happening at the moment of the flash is determined by what happens to the disc after it.
This seems paradoxical, but other tests have confirmed that what is perceived to have occurred at a certain time can be influenced by what happens later.

All of this is slightly worrying if we hold on to the common-sense view that our selves are placed in the present. If the moment in time we are supposed to be inhabiting turns out to be a mere construction, the same is likely to be true of the self existing in that present.

Trick yourself into an out-of-body experience
Your mind isn't as firmly anchored in your body as you think, says Anil Ananthaswamy. Time for some sleight of hand
The Self: how to wander outside your body

CLOSE your eyes and ask yourself: where am I? Not geographically, but existentially. Most of the time, we would say that we are inside our bodies. After all, we peer out at the world from a unique, first-person perspective within our heads – and we take it for granted.

We wouldn't be so sanguine if we knew that this feeling of inhabiting a body is something the brain is constantly constructing. But the fact that we live inside our bodies doesn't mean that our sense of self is confined to its borders – as these next examples show.

Sleight of (rubber) hand

By staging experiments that manipulate the senses, we can explore how the brain draws – and redraws – the contours of where our selves reside.

One of the simplest ways to see this in action is via an experiment that's now part of neuroscience folklore: the rubber hand illusion. The setup is simple: a person's hand is hidden from their view by a screen while a rubber hand is placed on the table in front of them. By stroking their hand while they see the rubber hand being stroked, you can make them feel that the fake hand is theirs (see diagram, below left).

Why does this happen? The brain integrates various senses to create aspects of our bodily self. In the rubber hand illusion, the brain is processing touch, vision and proprioception – the internal sense of the relative location of our body parts. Faced with conflicting information from these senses, the brain resolves the discrepancy by taking ownership of the rubber hand.

The implication is that the boundaries of the self sketched out by the brain can easily expand to include a foreign object. And the self's peculiar meanderings outside the body don't end there.

Body Integrity Identity Disorder

Imagine a relentless feeling that one of your limbs is not your own. That is the unenviable fate of people with body integrity identity disorder. They can feel it so intensely that some end up seeking amputation of the "foreign" part.

The disorder can be viewed as a perturbation of the bodily self caused by a mismatch between the internal map of one's own body and physical reality. Neuroimaging studies by Peter Brugger of University Hospital Zurich in Switzerland have shown that the network of brain regions responsible for creating a sense of bodily self is different in people with the condition (Brain, vol 136, p 318).


Psychedelic drugs

One of the most reliable – and reversible – ways to alter your sense of self is to ingest psychedelic drugs such as LSD or psilocybin, the active ingredient in magic mushrooms.

Alongside sensory distortions such as visual hallucinations, a common psychedelic experience is a feeling that the boundary between one's self and the rest of the world is dissolving. A team led by David Nutt of Imperial College London recently discovered why: psilocybin causes a reduction in activity in the anterior cingulate cortex, a part of the brain thought to be involved in integrating perception and the sense of self. It was assumed that psychedelics worked by increasing brain activity; it seems the opposite is true (PNAS, vol 109, p 2138).

Trading places

Ever wish you had someone else's body? The brain can make it happen. To show how, Henrik Ehrsson at the Karolinska Institute in Stockholm, Sweden, and colleagues transported people out of their own bodies and into a life-size mannequin.

The mannequin had cameras for eyes, and whatever it was "seeing" was fed into a head-mounted display worn by a volunteer. In this case, the mannequin's gaze was pointed down at its abdomen. When the researchers stroked the abdomens of both the volunteer and the mannequin at the same time, many identified with the mannequin's body as if it was their own.

In 2011, the team repeated the experiment, but this time while monitoring the brain activity of volunteers lying in an fMRI scanner. They found that activity in certain areas of the frontal and parietal lobes correlated with the changing sense of body ownership.

So what's happening? Studies of macaque monkeys show us that these brain regions contain neurons that integrate vision, touch and proprioception. Ehrsson thinks that in the human brain such neurons fire only when there are synchronous touches and visual sensations in the immediate space around the body, suggesting that they play a role in constructing our sense of body ownership. Mess with the information the brain receives, and you can mess with this feeling of body ownership.

Yet while Ehrsson's study manipulated body ownership, the person "inside" the mannequin still had a first-person perspective – their self was still located within a body, even if it wasn't their own. Could it be possible to wander somewhere there is no body at all?

Into thin air

Your self can even be tricked into hovering in mid-air outside the body. In 2011, Olaf Blanke at the Swiss Federal Institute of Technology (EPFL) in Lausanne and colleagues asked volunteers to lie on their backs and, via a headset, watch a video of a person of similar appearance being stroked on the back. Meanwhile, a robotic arm installed within the bed stroked the volunteer's back in the same way.

The experience that people described was significantly more immersive than simply watching a movie of someone else's body. Volunteers felt they were floating above their own body, and a few experienced a particularly strange effect. Despite the fact that they were all lying facing upwards, some felt they were floating face down so they could watch their own back (see "Leaving the body", left). "I was looking at my own body from above," said one participant. "The perception of being apart from my own body was a bit weak but still there."

"That was for us really exciting, because it gets really close to the classical out-of-body experience of looking down at your own body," says team member Bigna Lenggenhager, now at the University of Bern in Switzerland. Further support came from repeating the experiment inside an MRI scanner, which showed a brain region called the temporoparietal junction (TPJ) behaving differently when people said they were drifting outside their bodies. This ties in neatly with previous studies of brain lesions in people who reported out-of-body experiences, which also implicated the TPJ.

The TPJ shares a common trait with other brain regions that researchers believe are associated with body illusions: it helps to integrate visual, tactile and proprioceptive senses with the signals from the inner ear that give us our sense of balance and spatial orientation. This provides more evidence that the brain's ability to integrate various sensory stimuli plays a key role in locating the self in the body.

According to philosopher Thomas Metzinger of the Johannes Gutenberg University in Mainz, Germany, understanding how the brain performs this trick is the first step to understanding how the brain puts together our autobiographical self – the sense we have of ourselves as entities that exist from a remembered past to an imagined future. "These experiments are very telling, because they manipulate low-level dimensions of the self: self-location and self-identification," he says. The feeling of owning and being in a body is perhaps the most basic facet of self-consciousness, and so could be the foundation on which more complex aspects of the self are built. The body, it seems, begets the self.

Anil Ananthaswamy is a consultant for New Scientist

Why are you like you are?
You're so vain, you probably think your self is about you, says Michael Bond. The truth is slightly more complicated
THE first time a baby smiles, at around 2 months of age, is an intense and beautiful moment for the parents. It is perhaps the first sure sign of recognition for all their love and devotion. It might be just as momentous for the baby, representing their first step on a long road to identity and self-awareness.

Identity is often understood to be a product of memory as we try to build a narrative from the many experiences of our lives. Yet there is now a growing recognition that our sense of self may be a consequence of our relationships with others. "We have this deep-seated drive to interact with each other that helps us discover who we are," says developmental psychologist Bruce Hood at the University of Bristol, UK, author of The Self Illusion (Constable, 2012). And that process starts not with the formation of a child's first memories, but from the moment they first learn to mimic their parents' smile and to respond empathically to others.

The idea that the sense of self drives, and is driven by, our relationships with others makes intuitive sense. "I can't have a relationship without having a self," says Michael Lewis, who studies child development at the Robert Wood Johnson Medical School in New Brunswick, New Jersey. "For me to interact with you, I have to know certain things about you, and the only way I can get at those is by knowing things about me."

There is now evidence that this is the way the brain works. Some clues come from people with autism. Although the disorder is most commonly associated with difficulties in understanding other people's nonverbal social cues, it also seems to create some problems with self-reflection: when growing up, people with autism learn to recognise themselves in a mirror later than others do, and tend to form fewer autobiographical memories. Tellingly, the same brain regions – areas of the prefrontal cortex – seem to show reduced activity when autistic people try to perform these kinds of tasks, and when they try to understand another's actions. This supports the idea that the same brain mechanism underlies both types of skill.

Further support for the idea comes from the work of Antonio Damasio at the University of Southern California, who has found that social emotions such as admiration or compassion, which result from a focus on the behaviour of others, tend to activate the posteromedial cortices, another set of brain regions also thought to be important in constructing our sense of self (PNAS, vol 106, p 8021).

The upshot is that my own self is not so much about me as about those around me and how we relate to one another – a notion that Damasio calls "the social me". This has profound implications. If a primary function of self-identity is to help us build relationships, then it follows that the nature of the self should depend on the social environment in which it develops. Evidence for this comes from cultural psychology. In his book The Geography of Thought (Nicholas Brealey, 2003), Richard Nisbett at the University of Michigan presented lab experiments suggesting that Chinese and other east Asian people tend to focus on the context of a situation, whereas Westerners analyse phenomena in isolation – different outlooks that affect the way we think about ourselves.

Researchers examining autobiographical memory, for example, have found that Chinese people's recollections are more likely to focus on moments of social or historical significance, whereas people in Europe and America focus on personal interest and achievement. Other studies of identity, meanwhile, have found that Japanese people are more inclined to tailor descriptions of themselves depending on the situation at hand, suggesting they have a more fluid, less concrete sense of themselves than Westerners, whose accounts tend not to rely on context in this way.
Such differences may emerge at an early age. Lewis points to anthropological reports suggesting that the "terrible twos" –; supposedly the time when a child develops an independent will –; are not as dramatic in cultures less focused on individual autonomy, which would seem to show that culture sculpts our sense of self during our earliest experiences.

These disparities in outlook and thinking imply that our very identities – "what it is that I am" – are culturally determined. "I'm a male, I'm an academic, I'm a senior, I'm married, I'm a father and grandfather: all of these things that I define myself as are really cultural artefacts," says Lewis. Clearly there is no single pan-cultural concept of selfhood. Yet Hazel Markus, who studies the interaction of culture and self at Stanford University in California, points out that human personalities do share one powerful trait: the capacity to continually shape and be shaped by whatever social environment we inhabit.

While the evidence for "the social me" continues to mount, not everyone is convinced that it is always helpful for our well-being. To the writer and psychologist Susan Blackmore, the self may be a by-product of relationships. It may simply unfold "in the context of social interaction and learning to relate to others, which may inevitably lead you to this sense that I am in here" while bringing some unfortunate baggage along with it. She points out that the self can compel us to cling neurotically to emotions and thoughts that undermine our overall happiness.
Letting it all go, however, would mean undoing the habit of a lifetime.

Michael Bond is a New Scientist consultant in London

Cotard's delusion

Of all the disturbances of the self, the eeriest and least understood is Cotard's delusion. Symptoms of this rare syndrome range from claims that one's blood or internal organs have gone missing to a belief that one is dead or has otherwise ceased to exist. People with the delusion – who are often severely depressed or psychotic – have been known to plan their own funerals.

Your unique sense of "you" emerges in some of your earliest experiences
post #1197 of 1305
Originally Posted by Anil Ananthaswamy 
The Self: how to wander outside your body

CLOSE your eyes and ask yourself: where am I? Not geographically, but existentially. Most of the time, we would say that we are inside our bodies. After all, we peer out at the world from a unique, first-person perspective within our heads – and we take it for granted.

What would it take, I began to wonder, for me to transform into a man?


Towards the end of the afternoon, Diane gives me her best and most disturbing piece of advice.

"Don't look at the world from the surface of your eyeballs," she says. "All your feminine availability emanates from there. Set your gaze back in your head. Try to get the feeling that your gaze originates from two inches behind the surface of your eyeballs, from where your optic nerve begins in your brain. Keep it right there."

Immediately, I get what she's saying. I pull my gaze back. I don't know how I appear from the outside, but the internal effect is appalling. I feel – for the first time in my life – an immense barrier rise before my vision, keeping me at a palpable distance from the world, roping me off from the people in the room. I feel dead-eyed. I feel like a reptile. I feel my whole face change, settling into a hard mask.

– Elizabeth Gilbert, My Life As a Man (GQ, Aug. 2001)
post #1198 of 1305

Bump!  This thread got away from me, took me like a month to catch up.


Thick Fog Turns Dubai into a City Above the Clouds

Dubai-based German photographer Sebastian Opitz captures the surreal and mystical look of his adopted city as fog rolls in and out at sunrise. The photographer renames the cityscape Cloud City for the brief moments when the mist takes over and fills the empty space between the towering buildings. Opitz's images offer a serene and dreamy view of a bustling city, re-imagining it as a heavenly metropolis in the sky.

The photographer says, "I've been living in Dubai for over four years now and always dreamed of taking one of those rare shots from above the fog. This only happens on 4–6 days per year and when it happens it will be over by 9 AM. So one has to make sure to be up on the roof of a tower before sunrise and hope for the best." Luckily, Opitz was there to catch the magical event from high above the city on the 85th floor of the Princess Tower.

post #1199 of 1305

Reminds me of BioShock Infinite's Columbia.
