
Thursday, 30 October 2014

[guardian] The 10 greatest changes of the past 1,000 years


In Europe, the last millennium has been shaped by successive waves of change, but which shifts, in which centuries, have really shaped the modern world? Historian Ian Mortimer identifies the 10 leading drivers of change
Château de Loches
My castle is not your castle … the keep of Loches castle, shown here in a detail from Emmanuel Lansyer’s 1891 painting, was built in the 11th century. Photograph: World History Archive/Alamy

11th century: Castles

Most people think of castles as representative of conflict. However, they should be seen as bastions of peace as much as war. In 1000 there were very few castles in Europe – and none in England. This absence of local defences meant that lands were relatively easy to conquer – William the Conqueror’s invasion of England was greatly assisted by the lack of castles here. Over the 11th century, all across Europe, lords built defensive structures to defend themselves and their land. It thus became much harder for kings to simply conquer their neighbours. In this way, lords tightened their grip on their estates, and their masters started to think of themselves as kings of territories, not of tribes. Political leaders were thus bound to defend their borders – and govern everyone within those borders, not just their own people. That’s a pretty enormous change by anyone’s standards.

12th century: Law and order

A 12th-century illustration of men in the stocks
Banged up ... detail from a 12th-century illustration of men in the stocks. Photograph: Culture Club/Getty
If you consider visiting a foreign country, one of the most important aspects you bear in mind is how safe you will be while you are there. Indeed, probably no other factor deters people from visiting a place as much as an absence of law and order. So it follows that the introduction of the systematic application of law and order marks quite a turning point in European history. This happened through the compilation of law books, the development of jurisprudence, and, in England, the development of “justices in eyre” – the forerunners of circuit judges – together with the establishment of trial by jury.

13th century: Markets

A 13th-century gold coin
Market value ... a 13th-century gold coin. Photograph: Heritage Image/Alamy
As is well known, money has existed for thousands of years. However, that doesn’t mean it has always served the same function as it does today. At the start of the 13th century not many people used money in England. The vast majority lived in the country and bartered for the things that they could not make for themselves. Lords commanded the time of their peasants and allowed them to farm a few acres in return. The only people who regularly handled silver pennies were the inhabitants of market towns – and there were only 300 of those (and some had fewer than 500 people). However, over the course of the 13th century another 1,400 markets were founded in England. European countries saw a similar quadrupling of the number of towns. Not all of these new foundations succeeded but many did. The whole of Christendom shifted to a more mercantile economy, as you simply cannot operate a barter system efficiently in a marketplace. By 1300, several countries had begun minting large-denomination coins in gold, and credit was available from Italian banking companies, which had branches across the continent.

14th century: Plague

A contemporary illustration of Death strangling a victim of the plague
The greatest disaster to befall mankind ... a contemporary illustration of death strangling a victim of the plague. Photograph: Heritage Image/Alamy
The greatest disaster to befall mankind and the most important event in the history of the western world had absolutely nothing to do with technology. With roughly half the population of the country dying in the space of seven months, the mortality impact was about 200 times as great as that of the first world war. The socio-economic consequences were profound. The old feudal system was dealt a heavy blow as the paucity of survivors meant workers could charge more for their labour, and peasants could acquire assets and even set themselves up as manorial lords. Questions were raised about God’s relationship with mankind and the nature of disease – how could a benevolent deity kill so many innocent children? At the same time, people began to regard death in a new light, and the religious started to abase themselves, adopting a stance of abject humility in the eyes of God. Thus the plague not only killed people, it changed the ways people lived, as well as their expectations of death.

15th century: Columbus

Detail from Emile Lassalle's 1839 portrait of Christopher Columbus
Expanding horizons ... detail from Emile Lassalle’s 1839 portrait of Christopher Columbus. Photograph: Famoso/Alamy
The most important relationship in human history is between mankind and the land. Basically, the more land you have, the more natural resources you have. Columbus thus stands as one of the most important figures in history. With a great fanfare of his own achievement, he showed Europeans the way to vast territories of which no one had previously dreamed. No new technology empowered him: the compass was already at least three centuries old by the time he discovered Hispaniola in 1492. It was rather socio-economic pressure that drove him – together with his own desire to become a wealthy landowner. The consequences go far further than Spanish being the second-most widely spoken language in the world today (after Chinese). Until 1492 most people had believed the ancient Roman and Greek writers had reached an epitome of knowledge. However, there is no reference to the American continents in Ptolemy or Strabo. People quickly realised that, if the ancient writers could have missed two whole continents, they might have misunderstood many other things too. The crossing of the Atlantic was thus one of the two or three biggest causes for the re-evaluation of received wisdom in the last thousand years.

16th century: The decline of personal violence

A 16th-century illustration of a homeowner thwarting a burglary
Greater certainty of finding the guilty party ... 16th-century illustration of a homeowner thwarting a burglary. Photograph: Leemage/Getty
The pre-industrial past was, by our standards, incredibly violent. In the middle ages, the murder rate in Oxford occasionally hit the same level as Dodge City at the height of the American gun-slinging Wild West. But from 1500, the murder rates decreased rapidly, and not just in Oxford. In fact, across Europe, they more or less halved every 100 years, until they started to increase again in the late 20th century. The cause was better communication, through a massive increase in literacy and writing, allowing governments to act more regularly and with greater certainty of finding the guilty party. People started to think twice before drawing a knife in a brawl. Constables answering to the authorities pursued highwaymen and similar culprits far more rigorously than in previous centuries. As with many changes over past centuries, the development was so gradual that contemporaries did not comment on it; people also quickly took a safer society for granted. But that very thing – a safer society – is something not to be thrown away lightly.

17th century: The scientific revolution

Reflecting telescope, built by Isaac Newton in 1668
Understanding the world ... the world’s first reflecting telescope, built by Isaac Newton in 1668. Photograph: Royal Society/PA
One thing that few people fully appreciate about the witchcraft craze that swept Europe in the late 16th and early 17th centuries is that it was not just a superstition. If someone you did not like died, and you were accused of their murder by witchcraft, it would have been of no use claiming that witchcraft does not exist, or that you did not believe in it. Witchcraft was recognised as existing in law – and to a greater or lesser extent, so were many superstitions. The 17th century saw many of these replaced by scientific theories. The old idea that the sun revolved around the Earth was finally disproved by Galileo. People facing life-threatening illnesses, who in 1600 had simply prayed to God for health, now chose to see a doctor. But the most important thing is that there was a widespread confidence in science. Only a handful of people could possibly have understood books such as Isaac Newton’s Philosophiae Naturalis Principia Mathematica, when it was published in 1687. But by 1700 people had a confidence that the foremost scientists did understand the world, even if they themselves did not, and that it was unnecessary to resort to superstitions to explain seemingly mysterious things.

18th century: The French Revolution

The Tennis Court Oath in Versailles by Jacques-Louis David
Liberté, Égalité, Fraternité ... The Tennis Court Oath in Versailles by Jacques-Louis David. Photograph: De Agostini/Getty
There is no doubt that the French Revolution of 1789 was THE revolution for the western world. It was the first testing of the idea, nationally, that men should be equal in the eyes of the law. It forced thinkers all across Europe to reassess the ideas of human rights, political equality, and the rights of women. Although many governments were initially cautious of encouraging change, without the French Revolution, it is difficult to see how the great social reforms of the 19th century – the abolition of slavery, universal education, the rights of women to act as independent property owners, public health, and the diminution of capital punishment – would have proceeded as they did.

19th century: Communications

The first transatlantic telegraph cable is laid in 1858
‘Europe and America are united by telegraphy’ ... the first transatlantic telegraph cable is laid in 1858. Photograph: © Bettmann / Corbis
We think of the 20th century as undergoing a communications revolution. And for many people it did: most of our great-grandfathers did not have a private phone in 1900, but about 40% of us had a mobile phone by 2000. But the real communications revolution lay in the 19th century – in 1900 you could send a telegram. In 1805, news of the Battle of Trafalgar (21 October) was delivered to the Admiralty on 6 November. Just riding from Falmouth to London took Lieutenant Lapenotière 37 hours and 21 changes of horse. After the intercontinental telegraph cable was laid in 1872 it became possible to send a message to Australia immediately. The railways, telegraph and telephone made messaging much faster – in some cases almost instantaneous. This was just as significant as the modern communications revolution, if not more so. Governments trying to control their own countries and those overseas could now require that all important decisions be referred back to the capital; previously they had had to place trusted men in positions of responsibility all over the world – and hope for the best.

20th century: Invention of the future

Detail from Long Live the First Cosmonaut YA Gagarin! by Valentin Petrovich Viktorov
To infinity and beyond ... detail from Long Live the First Cosmonaut YA Gagarin! by Valentin Petrovich Viktorov (1961). Photograph: Heritage Images/Getty
There can be no doubt that technology hugely changed the ways in which we lived and died in the 20th century. However, it also masks changes that are arguably even more profound. In 1900 few people seriously considered the future. William Morris and a few socialists wrote utopian visions of the world they wanted to see, but there was little serious consideration of where we were going as a society. Today we predict almost everything: what the weather will be, what housing we will need, what our pensions will be worth, where we will dispose of our rubbish for the next 30 years and so on. The UN predicts world population levels up to the year 2300. Global warming reports are hot news. Novels about the future are 10 a penny. Newspapers and online newsfeeds are increasingly full of stories of what will happen, not what has happened. With limited resources on a limited planet, this is not a shift that is likely ever to change. In a thousand years or so, if society continues that long, the 20th century may well be viewed as the threshold when the modern world began – when humanity started to consider the future as well as the present and the past.
• Centuries of Change by Ian Mortimer is published by Bodley Head (£20)

Sunday, 6 July 2014

[the Guardian] The end of the hipster: how flat caps and beards stopped being so cool


Now that cocktails in jam jars have made it to EastEnders, what's next for those who would be 'alternative'?

London hipster
A hipster on the streets of London sports trendy tattoos. Photograph: Wayne Tippetts/Rex Features
Meet Josh. Josh is a 30-year-old artist/chef who lives in a converted warehouse in Hackney, east London. Josh has a beard, glasses and cares about the provenance of his coffee. He pays his tax, doesn't have a 9-to-5 job and, along with his five polymathic flatmates, shuns public transport, preferring to ride a bike.
On paper, Josh is the archetypal hipster – just don't call him one: "I don't hate the word hipster, and I don't hate hipsters, but being a hipster doesn't mean anything any more. So God forbid anyone calls me one."
At some point in the last few years, the hipster changed. Or at least its definition did. What was once an umbrella term for a counter-culture tribe of young creative types in (mostly) New York's Williamsburg and London's Hackney morphed into a pejorative term for people who looked, lived and acted a certain way. The Urban Dictionary defines hipsters as "a subculture of men and women, typically in their 20s and 30s, that value independent thinking, counter-culture, progressive politics". In reality, the word is now tantamount to an insult.

How to be a hipster

So what happened? Chris Sanderson, futurologist and co-founder of trend forecasting agency The Future Laboratory, thinks it's simple: "The hipster died the minute we called him a hipster. The word no longer had the same meaning."
Fuelling this was a report last month from researchers at the University of New South Wales who discovered that the hipster look was no longer "hip". In short: the more commonplace a trend – in one instance, beards – the less attractive they are perceived to be. And in 2014 we may have reached "peak beard". Could it be that the flat-white-drinking, flat-cap-wearing hipster will soon cease to exist?
Sanderson thinks it's more a case of evolving than dying. Talking to the Observer last week, he suggested there are now two types of hipster: "Contemporary hipsters – the ones with the beards we love to hate – and proto-hipsters, the real deal." And herein lies the confusion.
"Historically, proto-hipsters have been connoisseurs – people who deviate from the norm. Like hippies. Over the years, though, they inspired a new generation of young urban types who turned the notion of a hipster into a grossly commercial parody. These new hipsters want to appear a certain way, to be seen to be doing certain things, but without doing the research. So they appropriated the lifestyle and mindset of a proto-hipster."
It's a definition neatly summarised in the song Sunday, by Los Angeles rapper Earl Sweatshirt: "You're just not passionate about half the shit that you're into."
The problem is that it is now almost impossible to differentiate between the two. "Hipsters are more interested in following; proto-hipsters are more interested in leading. Yet they look the same, so how are people to know the difference?"
A fixed-gear rider in a yellow striped tank top and sunglasses poses
Fixed-gear bikes – handy for getting to your friend’s underground art show based on Mongolian barbecues. Photograph: Alamy
This lack of visual disparity has probably led to society's fondness for hipster-bashing. As Alex Miller, UK editor-in-chief of Vice, explains: "I couldn't define a hipster. I guess it's 'The Other'. But as a general term it's blown up because people finally realised they had a word to mock something cool and young which they didn't understand."
It's an age-old scenario. In Distinction, his 1979 report on the social logic of taste, French academic Pierre Bourdieu wrote that "social identity lies in difference, and difference is asserted against what is closest, which represents the greatest threat". So our inability to define a hipster merely fuels the enigma.
"And as you can imagine, this is greatly exasperating to proto-hipsters," says Sanderson.
It hasn't always been like this. While the definition of hipster hasn't altered vastly over the years, there was a time when it was considered to be something both meaningful and specific.
The word was coined in the 1940s to define someone who rejected societal norms – such as middle-class white people who listened to jazz. Then came a reactive literary subculture, realised through the work of beatniks such as Jack Kerouac and William Burroughs. It was Norman Mailer who attempted to define hipsters in his essay The White Negro as a postwar generation of white American rebels, disillusioned by war, who chose to "divorce oneself from society, to exist without roots, to set out on that uncharted journey into the rebellious imperatives of the self".
A decade later, we had the counter-culture movement – hippies who carried their torch in a fairly self-explanatory fashion, divorced from the mainstream. The word mostly vanished until the 1990s, when it was redefined so as to describe middle-class youths with an interest in "the alternative".
In the "noughties", hipsters became the stuff of parody, via Chris Morris and Charlie Brooker's satire Nathan Barley, which earmarked the "twats of Shoreditch". Nowadays, though, anyone can appear to be a hipster provided they buy the right jeans. From the twee Match.com adverts featuring hipster-style couples to the cocktails served in jam jars at the trendy incomer bar the Albert in EastEnders, "the idea of the hipster has been swallowed up by the mainstream", says Sanderson.
Luke O'Neil, a Boston-based culture writer for the online magazine Slate, says it is the same in the US. "I've even noticed what I call the meta-hipster: a person who sidesteps the traditional requirements and just wants to skip ahead to the status. Like putting on glasses and getting a tattoo somehow makes you a hipster," he says.
But while Miller agrees that hipster has morphed into a negative term, it is less about the word and more about what it represents: "Growing up, we just used other words – 'scenester' at university, 'trendies' at school – and they mean the same. Hipster has simply become a word which means the opposite of authentic."
Not everyone agrees. At Hoxton Bar and Grill in east London, 24-year-old graduate Milly identifies with hipsters: "I mean, that's why we all live in east London. It just feels so real, like something creative and cool is happening."
Manny, a 28-year-old singer who has lived in Dalston for more than five years, likes the sense of community: "Young people haven't got jobs or work and they need it. It's like a tribe, like goths. I hope hipsters aren't dead, because I just signed a year lease on my flat."
Miller adds: "We've never written about hipsters as a subculture at Vice because I don't think hipsters are a subculture. However, I do appreciate that people like the idea of belonging to something, so I suppose on that level the idea exists." As O'Neil explains: "Whoever said [hipsters] wanted to be unique? I think it's more about wanting to belong."
So what next? "I think hipsters will have an overhaul. There will be a downturn in this skinny-jean, long-haired feminised look over the next few years owing to the rise of the stronger female role model," says Chris Sanderson. And in its place? "A more macho look, almost to the point of caricature, in a bid for men to reinforce their identity."
A man makes coffee at a cafe in Brixton. Double filtered flat-white coffee — because single-filtering is for people who like Jim Davidson. Photograph: Carl Court/AFP
Perhaps this explains the phenomenon of "normcore", a term coined by New York trend agency K-Hole in their Youth Mode report last autumn. Though widely derided by the fashion world, this plain, super-normal style is arguably a reaction to the commodification of individuality, the idea that you can buy uniqueness off the peg in Topshop. "Normcore doesn't want the freedom to become someone," they say. "Normcore moves away from a coolness that relies on difference to a post-authenticity that opts into sameness."
It sounds like a joke but, says Sanderson, it might actually be a thing: "It's the opposite of what people think is hip now, but it's also very masculine – which ties in to the return to blokeiness."
But for many, including Josh, the desire to categorise people is infuriating. Arvida Byström is a Swedish-born, London-based artist, photographer and model. Though sometimes identified as a hipster aesthetically speaking, her work, which focuses on sexuality, self-identity and contemporary feminism, would suggest she is much more than that. Sanderson would describe her as "someone who leads not follows".
She balks at the idea of being a hipster: "I haven't been aware of people calling me a hipster. I certainly don't identify as one. What is a hipster, anyway? It is such a general term. I don't even know if they exist any more."
But as Josh says: "I don't see why you can't just be a guy in east London liking the stuff that's around without being branded as something."

Tuesday, 19 November 2013

[the Guardian] Swedish cinemas take aim at gender bias with Bechdel test rating


Movies need to pass test that gauges the active presence of women on screen in bid to promote gender equality
Jennifer Lawrence as Katniss Everdeen in The Hunger Games
Jennifer Lawrence as Katniss Everdeen in The Hunger Games, a film that would pass the Bechdel test and gain an A rating. Photograph: Murray Close
You expect movie ratings to tell you whether a film contains nudity, sex, profanity or violence. Now cinemas in Sweden are introducing a new rating to highlight gender bias, or rather the absence of it.
To get an A rating, a movie must pass the so-called Bechdel test, which means it must have at least two named female characters who talk to each other about something other than a man.
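The test reads like a simple rule check, which is part of its appeal to cinemas: it can be applied mechanically. As a minimal sketch in Python – where the `Film` and `Conversation` records are hypothetical illustrations, not any real movie-database schema:

```python
# A minimal sketch of the Bechdel criteria as a rule check.
# Film and Conversation are hypothetical records invented for
# illustration; real metadata would need annotated dialogue.
from dataclasses import dataclass, field

@dataclass
class Conversation:
    speakers: set        # names of the characters talking to each other
    about_a_man: bool    # True if the topic of the exchange is a man

@dataclass
class Film:
    title: str
    named_women: set     # named female characters in the film
    conversations: list = field(default_factory=list)

def passes_bechdel(film: Film) -> bool:
    """At least two named women talk to each other about something other than a man."""
    if len(film.named_women) < 2:
        return False
    return any(
        len(conv.speakers & film.named_women) >= 2 and not conv.about_a_man
        for conv in film.conversations
    )

# Example with assumed (not verified) records for an A-rated film:
hunger_games = Film(
    title="The Hunger Games",
    named_women={"Katniss", "Prim", "Effie"},
    conversations=[Conversation({"Katniss", "Prim"}, about_a_man=False)],
)
print(passes_bechdel(hunger_games))  # True under these assumed records
```

The check also makes the test's bluntness visible: a single qualifying exchange is enough to pass, regardless of how the rest of the film portrays women – the criticism raised later in the article.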
"The entire Lord of the Rings trilogy, all Star Wars movies, The Social Network, Pulp Fiction and all but one of the Harry Potter movies fail this test," said Ellen Tejle, the director of Bio Rio, an art-house cinema in Stockholm's trendy Södermalm district.
Bio Rio is one of four Swedish cinemas that launched the new rating last month to draw attention to how few movies pass the Bechdel test. Most filmgoers have reacted positively to the initiative. "For some people it has been an eye-opener," said Tejle.
Beliefs about women's roles in society are influenced by the fact that movie watchers rarely see "a female superhero or a female professor or person who makes it through exciting challenges and masters them", Tejle said, noting that the rating doesn't say anything about the quality of the film. "The goal is to see more female stories and perspectives on cinema screens," she added.
The state-funded Swedish Film Institute supports the initiative, which is starting to catch on. Scandinavian cable TV channel Viasat Film says it will start using the ratings in its film reviews and has scheduled an A-rated "Super Sunday" on 17 November, when it will show only films that pass the test, such as The Hunger Games, The Iron Lady and Savages.
The Bechdel test got its name from American cartoonist Alison Bechdel, who introduced the concept in her comic strip Dykes to Watch Out For in 1985. It has been discussed among feminists and film critics since then, but Tejle hopes the A rating system will help spread awareness among moviegoers about how women are portrayed in films.
In Bio Rio's wood-panelled lobby, students Nikolaj Gula and Vincent Fremont acknowledged that most of their favourite films probably would not get an A rating.
"I guess it does make sense, but to me it would not influence the way I watch films because I'm not so aware about these questions," said Fremont, 29.
The A rating is the latest Swedish move to promote gender equality by addressing how women are portrayed in the public sphere.
Sweden's advertising ombudsman watches out for sexism in that industry and reprimands companies seen as reinforcing gender stereotypes, for example by including skimpily clad women in their adverts for no apparent reason.
Since 2010, the Equalisters project has been trying to boost the number of women appearing as expert commentators in Swedish media through a Facebook page with 44,000 followers. The project has recently expanded to Finland, Norway and Italy.
For some, though, Sweden's focus on gender equality has gone too far.
"If they want different kind of movies they should produce some themselves and not just point fingers at other people," said Tanja Bergkvist, a physicist who writes a blog about Sweden's "gender madness".
The A rating has also been criticised as a blunt tool that does not reveal whether a movie is gender-balanced.
"There are far too many films that pass the Bechdel test that don't help at all in making society more equal or better, and lots of films that don't pass the test but are fantastic at those things," said Swedish film critic Hynek Pallas.
Pallas also criticised the state-funded Swedish Film Institute – the biggest financier of Swedish film – for vocally supporting the project, saying a state institution should not "send out signals about what one should or shouldn't include in a movie".
Research in the US supports the notion that women are under-represented on the screen and that little has changed in the past 60 years.
Of the top 100 US films in 2011, women accounted for 33% of all characters and only 11% of the protagonists, according to a study by the San Diego-based Centre for the Study of Women in Television and Film.
Another study, by the Annenberg Public Policy Centre at the University of Pennsylvania, showed that the ratio of male to female characters in movies has remained at about two to one for at least six decades. That study, which examined 855 top box-office films from 1950-2006, showed female characters were twice as likely to be seen in explicit sexual scenes as males, while male characters were more likely to be seen as violent.
"Apparently Hollywood thinks that films with male characters will do better at the box office. It is also the case that most of the aspects of movie-making – writing, production, direction, and so on – are dominated by men, and so it is not a surprise that the stories we see are those that tend to revolve around men," Amy Bleakley, the study's lead author, said in an email.