Kay has decided to ask other bloggers to write blogs based around the Seven Deadly Sins – one sin per week, and this week, it’s Lust. I had thought that I would write something about those college years in which it seemed that lust was one of the few comforts open to my pocket book.
But then last night happened.
And that doesn’t mean what you are probably thinking it means.
Kay’s sister is active in her high school’s drama program, and the school district has an annual “dramafest” program in which each high school in the district contributes students and performances to the evening’s entertainment. Kay was insistent that I attend, so off to Campbell I went.
Initially, this seemed to be the usual high school drama class fare – a couple of musical numbers opened up the show, both from the musical Side Show, performed better than one might expect, but nonetheless reinforcing to me the fact that we live in a sick culture that seems to think that everything, and I mean everything, MUST be turned into a Broadway-style musical, no matter the cost to innocent people such as the audience (the Onion not long back ran a story concerning musicals being made based on Alka Seltzer and Ajax Cleanser that, I am frightened to say, sounded eerily plausible).
After the musical numbers, we were treated to a scene in which a woman has an argument with her sexual fantasy – a bespectacled but good-natured guy who likes to spoon, or would if he existed, which he doesn’t, seeing as how he is a fantasy and all – and proceeds to go to the psychiatrist, to whom she describes her masturbation habits (though not graphically), and from whom she receives a prescription. Arriving home and taking the medication, the woman causes her sexual fantasy to vanish, only to be replaced by a new one – a charming Spanish rake. All in all, the scene was well-written, generally well performed, and extremely funny. It also made all of the adults in the audience squirm. Many of them were parents to the kids involved in the show, and others were siblings or (like myself) attached to either the parents or siblings, so the surprisingly frank treatment of lust and sex in this scene went beyond the awkwardness that most of us would feel in seeing teenagers act out such a literally adult scene (the people in it were pretty clearly written to be older than 18, probably much older, so it wasn’t just the sex that made it adult) and began to probe the borders of that particular territory of discomfort known to anyone who has had to hear a sibling describe their first blowjob.
Well, this was odd and awkward, but, we figured, it couldn’t possibly continue in this vein. After all, this skit might have slipped through the cracks, but it was over early on, and now we were in much safer, family-friendly territory.
Heh. Heh. Heh.
The next skit, also well performed and extremely funny, was a two-man condensed performance of Romeo and Juliet (originally written by the Reduced Shakespeare Company) in which the fellow who took on the role of Romeo made a point of wildly gesticulating towards his crotch whenever possible, and the fact that all roles were played by the two men led to some rather strange but very funny dueling homoerotic and homophobic sequences.
This was followed by a sequence in which a fellow is struggling with the question of whether or not to jump into bed with a woman whose boyfriend is out of the country (in which we saw one teenage girl shout “Hi, I have ovaries, come on in!” while gesturing towards her pelvis), a lesbian love song (from The Color Purple – again, I want to go to New York and forcibly demonstrate to some Broadway producers that not everything has to be a musical), and a young man holding a teddy bear while singing a song, admittedly veiled but nonetheless obvious, about how happy he is that someone has decided to join him “in his bed”.
When a scene from The Martian Chronicles came on stage, we were so beaten down with sexual subject matter that we began to look for a lesbian subtext where none probably existed. By the time that a scene from Alice in Wonderland was performed, we were wondering why Alice wasn’t wearing a dominatrix outfit, and really, why exactly was she hanging out with a rabbit rather than some other, less amorous animal anyway?
Underage kids exploring sex…it’s as if we had entered one of Warren Jeffs’s deepest fantasies.
In truth, none of this should have surprised us. Teenagers are attempting to do two things simultaneously: figure out what it is to be an adult, and figure out what their growing interest in sex means. As adults, we learn that these are, of course, part and parcel of the same thing, and we (usually) learn to navigate both somewhat successfully. But I remember being a teenager, and trying to be mature, while realizing that one of the things that seemed the most mature – sex – had the tendency to reduce us to blithering fools. What we saw on stage was, in many ways, a reflection of that exploration, the sort of thing that we all have to do at some point. It just happened to be rather amazingly public in this case.
In short, the high school dramafest was an explosion of teenage lust on stage. While many people, including myself, felt somewhat shocked or even scandalized by this, it was probably quite harmless. We saw nothing on stage that isn’t already going through the heads and social interactions of these kids. In truth, the scandal isn’t that teenagers experience lust, but that adults, myself included, are uncomfortable hearing about it, and this probably prevents us from being as helpful as we could in keeping the kids safe and responsible – if we can’t talk about it, they can’t listen and learn from our mistakes.
Although it doesn’t quite fit into the theme of lust, I do have to mention one last thing. The final item on the program was called “Twin Towers Bring Me Home: A Musical Tribute to 9/11.” Yep, it’s pretty much what you think if you are considering that this was a Bay Area school district with a strong arts program doing a musical number based on a terrorist attack. A group of students stood silhouetted by the lit backdrop and posed in various threatening ways (including as if they were pointing rifles at each other) while another student played his guitar and sang a song that we couldn’t quite make out (Kay thought it sounded like a Green Day song, I thought it sounded like it belonged in a commercial for a hair loss prevention product), interrupted by two young women singing songs from unrelated musicals, and a young man who kept yelling about having gone through chemotherapy. In other words, it was a weird, incomprehensible mess. I suppose some folks in the audience were moved, most were probably offended, but I had a hard time keeping from bursting into laughter.
The Not Quite Adventures of a Professional Archaeologist and Aspiring Curmudgeon
Friday, February 27, 2009
Thursday, February 26, 2009
Karen Armstrong's Short History of Myth
I have an ambivalent relationship with the writings of Karen Armstrong, and not just because of her name (she and I share a surname, and my mother’s name is Karen). Those books and articles of hers that I have read are full of fascinating ideas, and I think that one of her recurring themes – that modern religion has placed claims of factual accuracy of religious texts over the value of these texts as explanatory of what it means to be human – is quite correct. However, she also has a tendency to look towards past religion and mythology with rose-colored glasses, claiming that prior to the beginnings of science the question of whether or not Biblical stories were literally true was seen as irrelevant (no doubt there are many who did feel this way, but the history of the Church’s persecution of heretics indicates that they were concerned about the literal truth of church teachings) and that modern society has problems that past societies didn’t because of the lack of compelling myth (there may be some merit to the basic idea, but Armstrong draws conclusions from it that are demonstrably false).
First, a definition. The term “Myth” is used here, and in the book being reviewed, not in its popular way, to indicate a false story being propagated, but in the sense that anthropologists and folklorists tend to use it – a story that holds symbolic meaning beyond the literal meaning of the story. Myths can refer to events that never occurred (the Paiute culture hero Helldiver retrieving the Earth from the bottom of a cosmic sea) or they can refer to events that are real but have become imbued with meaning beyond their original significance (the Emancipation Proclamation being viewed as a symbol of the equality of all people, when its actual purpose was mired in the more mundane politics of the day).
In A Short History of Myth, Karen Armstrong provides the reader with an overview of the development of mythology from our ancient ancestors up through the present day. The book lives up to its name, and is very brief, meaning that Armstrong deals in broad themes over wide spans of time, and tends to generalize. In and of itself, this isn’t too much of a problem; Armstrong acknowledges it, and it does serve to provide a framework for studying the development of mythology, so that is all to the good.
However, Armstrong’s mistakes begin in the first section on our ancient ancestors. She over-generalizes, implying that archaeological evidence of particular customs indicates that they were universal when they weren’t necessarily, and she draws rather broad conclusions that, while consistent with the evidence, are not necessarily supported by it. For example, she discusses flexed burials (where a body is bunched into a fetal position) and the placement of grave goods as indicating that the body was being prepared for rebirth in the next life, and holds that this is evidence for a general belief in a particular sort of afterlife among ancient cultures. Such an interpretation is not unreasonable, but it ignores such facts as that burial in the flexed position requires the excavation of a smaller grave, requiring less work on the part of the survivors, that the burial of goods may represent something other than the need for these goods in the next life (among a mobile population, for example, it may represent the need to dispose of goods so that the survivors are not carrying extra weight during their yearly rounds), and among sedentary populations such graves with grave goods may have served to legitimize land holding (this is the land of my ancestors and I can prove it! Look in this hole!) rather than veneration of the dead or a hope for a better afterlife.
Additionally, type of inhumation varies over both time and place (flexed burials, burial of an outstretched individual, burial of different body parts in different locations, cremations, etc.), as does the practice of depositing grave goods (some cultures don’t leave grave goods, others leave an abundance, some leave only certain types, and some leave all manner of household and daily task tools). So, it’s best not to assume too much based on mortuary evidence.
Another problem is that Armstrong seems to assume that there are basic over-arching myths that will be found expressed in remarkably similar ways across all cultures. Again, this is not an unreasonable notion, but it is one that requires more exploration rather than simple assertion that all hunter/gatherer and early farmer mythology is essentially the same. There are common themes, even some common images, that show up time and again across time and space, BUT there are also definite differences, some of them quite important, that show up even between neighboring groups. Although I don’t think it is her intention, it is easy to read Ms. Armstrong’s books and come away with the impression that there was a single ancient religion practiced by all early peoples, a notion that is complete nonsense. In fact, the ancient religion that Armstrong discusses is heavily biased in favor of what many European archaeologists and folklorists have attempted to reconstruct for Europe and (to an extent) Africa, but it may have been quite alien to ancient Asia, Australia, and the Americas.
Still, to be fair, she is working with the accounts of a specific group of archaeologists and folklorists working in Europe, and the errors and mis-steps that she makes in this section of the book are likely due to the fact that the people she is relying on for sources tend to do the same. In other words, this is probably a problem more with her sources of information than with her reporting. Nonetheless, it would do her well to be more critical when examining this information: while some of it is quite solid, much of it is little more than conjecture, and in this book it is reported as fact. Of course, that assumes that she wants to be factually accurate, rather than writing a polemic disguised as a scholarly work, and this latter idea might be a more accurate description of her intention.
The next sections deal with changes to mythology that occur as people become sedentary, and begin to develop farming and then civilization. Although some of the same errors are present here as well, these sections are generally better, both because the quality of the archaeological data improves (it’s easier to see patterns in the remains of a long-term permanent settlement than in the remains of a hunter-gatherer encampment), and because the eventual invention of writing allows records of both the myths and the rituals surrounding these myths to be kept. Her discussion of the Axial age, in which mythology had to undergo a change and become more abstract and philosophical, as opposed to something that can be easily applied to everyday life, is interesting and does help to explain much about modern religions. However, I am not qualified to judge the accuracy of her assertions in this section, as my training and experience in prehistoric archaeology does not cover it, nor does my knowledge of history from the Roman Empire on. It’s a blind spot in my education, and it is possible that Armstrong commits the same types of errors here that she did in the earlier portions of the book.
As the book approaches the present, Armstrong laments that so many of the problems of our modern day – a great social ennui, genocidal regimes, and weapons of mass destruction – are due to our disconnect from myth. It’s not that we lack myth, but rather that we fail to understand it correctly, and this failure leads to violence on a mass scale and widespread social ills. This, she offers, is due to a commitment to scientific rationalism that both negates myth, thus alienating the rationalists from the meaning of the myth and causing destructive despair or misunderstandings, and creates a destructive literalist backlash from those who consider the myths sacred.
It is here that the author begins to spew bullshit in massive waves. First off, it is clear from the section on early hunter gatherers that Armstrong either doesn’t quite grasp the role of myth in the lives of these people, or else disregards what it was in favor of her pet hypothesis that humans are moving away from myth to their detriment. She is correct in stating that humans need myth, but they need it in ways both profound and profane, and early myth was as much about orienting humans to the world around them in a practical sense as in a psychological sense. In the practical sense, we now have much more trustworthy methods of orienting ourselves, and so we are left with the need of psychological orientation – what Joseph Campbell spent so much time dealing with.
Despite Armstrong’s claim, modern humans have not become detached from our use of myth for this purpose. Consider US history – we make ready use of the story of the founding of the nation, of events such as the Civil War and the Great Depression, of documents such as the Emancipation Proclamation not simply as historical facts, but as evidence of who we are as a people, what our destiny is, and how we should interact with the world around us. Armstrong may be right that many of us have lost the old myths, or in the case of Biblical literalists lost the point of the myths even if the myths themselves are retained, but we have formed new myths and use them in much the same way – and this is the process that is occurring at all times throughout the history of humans as a species.
Armstrong would likely concede that point, but in discussing groups such as the Nazis and their myths of Aryan superiority, she claims that we have become unmoored from our use and understanding of myths and that this is responsible for the evils of groups such as the Nazis and Stalin’s regime. But that this is absolutely wrong should be obvious to anyone who gives the idea a moment’s thought.
Genocidal war, hatred for the outgroup, irrational and homicidal claims of ethnic superiority, and internecine warfare even within ethnic groups are nothing new. Ethnographers have discovered this amongst hunter gatherers living isolated from the rest of the world, amongst primitive farmers only recently contacted by those outside of their culture, and among all other social and technological organizations up through and including industrial and post-industrial nations. Archaeological evidence indicates strongly that these tendencies have been present for much, if not all, of human history, and tend to increase as populations grow and come into more regular contact with each other. And, of course, the written record is filled with such atrocities – the destruction of Carthage, the wars that led to the fall of Nineveh, and of course the genocidal campaigns approved by God in the Bible (which Armstrong would know about, being a former nun and all) – all committed by groups who Armstrong argues would not have had such violent dysfunctions of behavior because they were in touch with their myths.
Really, it’s not the myths that have changed, or even our direct relationship to them, it’s the technology and population size that allows us to carry out atrocities of a level not possible, but certainly dreamed of, in the past. The notion that getting back to our mythological roots will somehow put an end to, or even reduce the severity of, this violence is absurd and shows either an ignorance of history or, more likely in the case of Karen Armstrong, an intentional distortion of the facts to support a pet hypothesis that simply doesn’t hold water.
Not surprisingly, this book has gotten bad reviews from historians and anthropologists, but generally good reviews from academics who do not deal as critically with the past and from many people in the general media.
If you want to hear what the professional reviewers have to say about this book, go here.
Tuesday, February 24, 2009
How to Spot Bullshit
Dr. Robert Park, a physics professor at the University of Maryland, has produced a list of seven warning signs that a scientific claim may be nonsense. While none of these signs proves that a claim is bullshit, they tend to be present when bullshit claims are made, and so each should raise suspicion. You can read the full article here.
1. The discoverer pitches the claim directly to the media.
2. The discoverer says that a powerful establishment is trying to suppress his or her work.
3. The scientific effect involved is always at the very limit of detection.
4. Evidence for a discovery is anecdotal.
5. The discoverer says a belief is credible because it has endured for centuries.
6. The discoverer has worked in isolation.
7. The discoverer must propose new laws of nature to explain an observation.
The list of warning signs is pretty good. The only thing that I can think to add is that the claimant relies on authority over data for their claim. Otherwise, I can think of nothing to add – but perhaps you can, and that’s what the comments section is for.
I especially like the explanation for #5:
5. The discoverer says a belief is credible because it has endured for centuries. There is a persistent myth that hundreds or even thousands of years ago, long before anyone knew that blood circulates throughout the body, or that germs cause disease, our ancestors possessed miraculous remedies that modern science cannot understand. Much of what is termed "alternative medicine" is part of that myth.
Anyway, check it out, it’s well worth reading and keeping in mind when you hear a claim made.
Saturday, February 21, 2009
Thinking About Morality
What is morality, and where does it come from? Seems like a simple question, doesn’t it? Most of us treat it as if it’s a simple thing, but it really isn’t. If you live in the U.S., you will most often hear morality discussed as a function of religion. In fact, I find that because I do not share the majority position regarding religion, I am often (in fact, typically) accused of being immoral or amoral – the basic idea being that if I do not believe in a supernatural source for morality, I must be “cut off from moral bearings” or, as Banana Man Ray Comfort puts it, a “moral free agent.”
The first problem with this particular claim is that, like most of the folks that I know who share my views on religion, I’m pretty boring on the whole sin front. For the most part, I don’t have any particular vices that aren’t also shared with the religious people I know, and I lack even many of those (I don’t drink, have never been unfaithful to a partner, don’t do drugs, and tend to be a goody-two-shoes where other people’s property is concerned). Where I tend to differ from them is that I do not view arbitrary things that don’t harm anyone as sins – I have no problem with gay people, I don’t really care if someone “blasphemes”, and so on.
In other words, I’m a good citizen, decent neighbor, and all without thinking that there is some sort of being hanging doom over my head if I do wrong. And I am not alone. Time and again, research into the relationship between violent crime, divorce, substance abuse, willingness to cheat others, and so on has shown that these things are not negatively correlated with religious belief. If the non-religious were truly adrift in a sea of immorality, the situation would be quite the opposite.
In fact, as a general rule*, high rates of religious belief in a nation or region correlate with higher rates of crime, drug use, divorce, unplanned pregnancy, abortion, poor social and personal health, etc. (see here and here). This is not necessarily to say that religion causes all of the strife – there are many factors at play (some of which have non-causal correlations with religion) – but it doesn’t prevent these problems and may bear some responsibility: by making some topics taboo (such as accurate information about sex education and STDs), by placing some legitimate solutions arbitrarily off-limits (such as the Catholic Church’s official refusal to accept that condoms may be of use in combating the spread of HIV), by placing belief ahead of action (such as the tendency amongst many Christian sects to argue that belief in Jesus is more important than behaving morally), or by sanctifying anti-social actions as moral (such as suicide bombings or the murder of people who leave Islam in many Middle-Eastern nations). That religion is not the bulwark of morality against a rising tide of social ills is further illustrated by the fact that the non-religious make up a smaller portion of the prison population than of the general population (see here and here). Independent of the question of whether or not religion causes social ills (a very complex question outside the scope of what I am writing here), it should be obvious to anyone with two brain cells to rub together that if religious belief were in fact the source of morality, then belief in gods would correlate with higher rates of moral behavior – but this proposition is demonstrably false.
There is a further problem with the notion that religion is the source of morality: most religious believers don’t actually follow the moral codes that they claim to believe – and, for the record, this reflects well on the believers. For example, in the recent Proposition 8 debate here in California, believers frequently stated that the Bible condemned homosexuality, and therefore gay marriage should not be allowed. They are, of course, correct that the Bible condemns homosexuality**, but they ignored that the Bible also calls for the death penalty for homosexuals – for that matter, most of them even feel that the mandatory prison sentences for homosexuality that were common up through the first half of the 20th century were overly harsh. And we see a similar rejection of harsh punishment for other religious “crimes” amongst most modern believers. In other words, the average believer today demonstrates a stronger sense of compassion and, well, morality than the authors of their holy texts did, and in demonstrating these commendable traits, they are, by the standards of the texts that they claim to follow, committing a sin***. And good for them I say, they are clearly better people, citizens, and neighbors than the original authors of the texts, and I think that this shows some degree of moral progress. But it also shows that even those who claim religion is the source of morality don’t actually behave as if it is.
So, if the majority view is wrong, and morality does not come from religion, where does it come from? How can we be moral? Why aren’t we doomed to nihilism and wickedness?
Well, the answer seems to come from a rather obvious place, really: our evolutionary origins. Put simply, we are social animals, and as such, we have had to evolve both biological and cultural traits that allow us to function in groups.
Let me phrase my argument as a hypothesis to be tested. If our sense of morality comes from our evolutionary origins, then it follows that other animals that are close to us either genetically or in their social organization will demonstrate similar traits to deal with social organization – constrained by their own biological capacities, that there would be evidence of moral behavior across all human societies and not just those with the “correct” religions, and that those traits that are universally “moral” should have adaptive use to mobile hunter-gatherers (our ancestors).
So, let’s start with other animals. When we look at other social animals, we see the development of social rules that allow these animals to interact successfully and with minimal conflict – even the fights observed amongst packs of dogs are aimed at determining the leader to be followed rather than violence for the sake of violence. As we come closer to humans, we see more and more recognizable traits. Chimpanzees, for example, show such human behaviors as warfare and out-group exclusion (both part of most human moral codes, interestingly), but also show our better traits, such as compassion and cooperation. As Jane Goodall puts it:
They kiss, embrace, hold hands, pat one another on the back, swagger, shake their fists, and throw rocks in the same context that we do these things. There are strong bonds of affection and support between family members. They help each other. And they have violent and brutal aggression, even a kind of primitive war. In all these ways, they’re very like us.
For more information, look into the work of Dr. Goodall, or the work of others researching the origins of morality.
Likewise, we do find certain universals amongst human populations, and I have seen these time and again in my studies and research as an anthropologist: the preservation of the in-group is seen as good; altruism is good; harm to the in-group is bad; harm to the out-group may be neutral, good, or bad, depending on the impact that this has on the in-group; individual compassion is used as a guide to correct treatment of others, but is influenced by the relation of the other to the individual acting; the definition of in-group and out-group is flexible and dependent upon the situation, but is generally correlated to the social and genetic relationship of the individual being acted upon to the individual doing the acting.
So, we do see universals – but they make sense for mobile, stone-age hunter-gatherers, not necessarily for modern humans. For example, look up the “Trolley Problem” (listen here) to see how the interjection of technology causes us to view logically equivalent situations as morally right or wrong (long story short – technological harm to an individual is seen as more “okay” than directly-caused harm, even if the resulting harm is identical), or check out how an action that would generally be considered evil can be made acceptable through the phenomenon of group absolution or the placement of those being acted upon into the out-group. These traits are not adaptive to a modern post-industrial society, in which we have the ability to impact masses of people both positively and negatively, but they make perfect sense in the context of stone-age hunter-gatherers.
Now, of course, religion is itself probably a result of our evolutionary histories, and so it is no surprise that it often becomes conflated with morality. But the difference is that when we drop the notion that religion is the source of morality, rather than something that evolved along with it, we can see that morality is a natural thing – that is, it is something that has come about because we need it, rather than being enforced on us by an external force. This has an important implication: we can use the needs of living people as guides to moral behavior, and we can see where there is wisdom to be gained from our evolutionary past, rather than continuing to claim arbitrary and silly traditions from bronze-age societies.
A lot of people find this idea of a changing and fluid morality uncomfortable, and as such they declare that such a notion is bad (some particularly bigoted individuals will then go on to claim that the non-religious are unable or unlikely to be moral – but this says more about the insecurities, and, let’s face it, immorality of the people who claim this than the immorality of non-religious people). However, even these people subscribe to the notion of a fluid and changing morality, whether they admit it or not. As noted above, most believers are not in favor of the execution of blasphemers and homosexuals, though that is what their religious texts call for and their ancestors would have demanded. The reason for this is that our society has changed – for the better – and these old ways are seen as harsh and maladaptive now. Religion has not tamed society and made it more moral, rather, culture has tamed religion and forced it to actually behave in a more moral way. Anyone who clings to the notion of an eternal and unchanging code of morality is lying, either to themselves or to you.
Some may claim that giving up even the illusion of an unchanging morality from the divine will lead to moral decay - gulags for the sick, eugenics, violence against those of a different intellectual bent, etc. Assuming that such a thing is true - which is a debatable point - this claim in favor of a religion-based morality still doesn't hold water, and is in fact rather dishonest (and, ironically, therefore probably immoral) for a simple reason: through most of human history, people have clung to models of morality either dictated by or justified through perceived divine revelation, and that has resulted in persecution of dissenters, genocidal wars, torture, suicide bombings, honor-killings, etc. etc. Even if an openly fluid non-religious moral ethos completely replaces religious ones, and even if it is the worst that all believers claim it might be, it would not really be any worse than the religion-based versions. At worst, it would be pretty much the same, and then we're in the same place that we've always been - except that now we're honest about it.
In truth, those who push the notion (even if they don’t actually subscribe to it) of an unchanging morality handed down by god are the ones who are unmoored from moral anchors. They are allowing arbitrary codes that they themselves only half-heartedly hold to take precedence over the very real needs of people. They are more concerned with having their own prejudices and psychological comforts unquestioned than with actually doing good. They will often try to misdirect you – claiming that you are entering into a dangerous moral relativism, when they are the ones holding that their own arbitrary a priori beliefs are somehow more important than the suffering or joy of others – and the assumption that arbitrary positions are somehow equal to verifiable facts is the very definition of relativism.
In other words, don’t buy it. Think about morality, consider that we do have moral impulses that we can sharpen and make use of for the good of ourselves and those around us, and don't allow yourself to be sold on something that may be doing more harm than good.
*I say “as a general rule” because there are, of course, some exceptions.
**Or, at least, most modern translations do – but what the original Hebrew and Greek said, and whether it was a blanket condemnation or rather a rejection of ritual homosexuality (common in the ancient Middle East), is a matter better discussed by people who know more about the subject than I currently do.
***There are, of course, many rationalizations that believers may give for ignoring these rather evil commandments – “that was a ritual requirement that Jesus did away with,” “that was specific to that time and place,” “God’s subsequent commandments show that this changed.” The problems with these rationalizations are twofold: 1) the same believers will still cling to commandments that can be easily dismissed in exactly the same way (such as the general condemnation of homosexuality), and 2) the same believers often claim that the moral codes of the Bible are “eternal and unchanging” while simultaneously admitting that they have changed.
Labels:
Anthropology,
Atheism,
Critical Thinking,
Evolution,
Morality,
Religion,
Science
Friday, February 20, 2009
The UK Takes Health Care Seriously
Sorry I've been slacking lately - I've been busier than usual. However, I have come across the linked article, one that gives me hope that the US might follow the British example in taking health care more seriously - they have a new chairman for regulating alternative medicine.
I think that his commitment to his job speaks for itself:
But the witch doctor stressed the therapists would be judged not on the effectiveness of their treatments but on the strength of their mogambo.
Limba said: "There are many frauds and not everyone has as strong a connection to the serpent god Demballa as they like to make out.
Truly, an inspiration to us all.
Saturday, February 14, 2009
Rock Porn
I have spent the last week in the field recording one of the coolest archaeological sites that I have ever had the good fortune to see. The site consisted of over seventy milling features (bedrock mortars – see a picture here – and milling slicks – smooth spots on rocks from grinding seeds into flour, etc.), a lot of debitage (waste products from making flaked stone tools such as arrowheads, spear-tips, and knives), and pieces of the tools used for grinding seeds against the milling features. Most interesting, though, is the rock art that is spread throughout the site.
This rock art runs the gamut from “cupules”* ground into the surfaces of boulders to large panels of human and animal figures, abstract images, and geometric designs painted with red, black, and white paints. Most of the art had eroded off of the rocks over time, but what remained was fantastic – I can only imagine how amazing it must have been, say, 200 years ago, when it was fresh.
One of the types of rock art found at the site was shaped ovoid inclusions. Inclusions are parts of the rock that have a different density than other parts of the rock and therefore tend to erode differently when exposed to the elements – in this case they eroded more slowly and therefore were exposed as darker bumps on the surface of the bedrock. These darker bumps were then ground and carved into different shapes – in one case the shape was pretty clearly intended to be an eye, in others it was more difficult to determine what the shapes were supposed to be.
When you read the various different books and articles on the subject, you find a lot of different possible interpretations for these features, but one of the most common is that they represent a vulva and are likely fertility symbols. So, as I was recording one, I called Kay and, when she answered her phone, I said:
“Hey! I just thought I’d let you know that I’m recording stone representations of women’s genitals.”
She was silent for a moment, and then said “you’re recording stone representations of women’s genitals?”
“Yep. It’s rock art at this archaeological site that I’m working on. I thought that you’d appreciate knowing that.”
“You know, I really do” she answered with some enthusiasm.
I then hung up the phone and went on about my work. She, however, went to her Twitter page and mentioned that I was out in the field recording rock art representations of human genitals. This, apparently, prompted a flurry of responses ranging from the incredulous to people thinking that I have the coolest job on Earth. It also prompted one of her friends to start referring to me as “porn rock.”
And, you know, if you are going to have a nick-name, you could definitely do worse than “porn rock.”
…but I digress. We finished recording the site on Friday, and while the rock art is really fantastic, I am still no closer to having a clue as to what any of it meant**. What I do know is that there is a lot of literature out there on the interpretation of rock art, and it ranges from possibly relevant work based on ethnographic interviews, to analysis of locations (is it hidden, implying secret rituals, or open, implying public use, and does the iconography differ between the locations?), to complete nonsense based on Freudian analysis (so, let me get this straight – you want to use a largely antiquated system of analyzing the dreams of 19th-century Europeans to figure out the meaning of stone-age hunter-gatherer rock art?), or, my recent favorite, color analysis – wherein a group of researchers have reached convoluted but ultimately arbitrary conclusions about what the colors in rock art mean, and attempt to use this to unlock the secrets of our ancestors.
Anyway, rock art can probably tell us a lot about the people who made it, if only we can figure out how to read it. Therein lies one of the biggest challenges in prehistoric archaeology – at once one of the most tantalizing and one of the most frustrating.
Rock Porn out!
*These cupules are common throughout California. Sometimes they may not be rock art, but may be used for grinding something such as seeds or ochre – they are sometimes found in direct connection with bedrock mortar cups indicating that they were used with the mortars, but they are sometimes found in places where they could not possibly have been used for grinding. These cupules are often interpreted as fertility symbols – for some reason fertility and hunting success are the most common interpretations of rock art – but while this is likely the case for many of the cupules, I suspect that there are other purposes as well.
**I would love to post pictures here, but for various professional ethical reasons it is generally considered a bad idea to post pictures of rock art when you are uncertain of the importance of the art to the local native groups.
This rock art runs the gamut from “cupules”* ground into the surfaces of boulders to large panels of human and animal figures, abstract images, and geometric designs painted with red, black, and white paints. Most of the art had eroded off of the rocks over time, but what remained was fantastic – I can only imagine how amazing it must have been, say, 200 years ago, when it was fresh.
One of the types of rock art found at the site was shaped ovoid inclusions. Inclusions are parts of the rock that have a different density than other parts of the rock and therefore tend to erode differently when exposed to the elements – in this case they eroded more slowly and therefore were exposed as darker bumps on the surface of the bedrock. These darker bumps were then ground and carved into different shapes – in one case the shape was pretty clearly intended to be an eye, in others it was more difficult to determine what the shapes were supposed to be.
When you read the various books and articles on the subject, you find many possible interpretations for these features, but one of the most common is that they represent a vulva and are likely fertility symbols. So, as I was recording one, I called Kay and, when she answered her phone, I said:
“Hey! I just thought I’d let you know that I’m recording stone representations of women’s genitals.”
She was silent for a moment, and then said “you’re recording stone representations of women’s genitals?”
“Yep. It’s rock art at this archaeological site that I’m working on. I thought that you’d appreciate knowing that.”
“You know, I really do,” she answered with some enthusiasm.
I then hung up the phone and went on about my work. She, however, went to her Twitter page and mentioned that I was out in the field recording rock art representations of human genitals. This, apparently, prompted a flurry of responses ranging from incredulity to declarations that I have the coolest job on Earth. It also prompted one of her friends to start referring to me as “porn rock.”
And, you know, if you are going to have a nickname, you could definitely do worse than “porn rock.”
…but I digress. We finished recording the site on Friday, and while the rock art is really fantastic, I am still no closer to having a clue as to what any of it meant**. What I do know is that there is a lot of literature out there on the interpretation of rock art, and it ranges from possibly relevant work based on ethnographic interviews to analysis of locations (is it hidden, implying secret rituals, or open, implying public use, and is there different iconography between the different locations?) to complete nonsense based on Freudian analysis (so, let me get this straight, you want to use a largely antiquated system for analyzing the dreams of 19th century Europeans to figure out the meaning of stone-age hunter-gatherer rock art?), or, my recent favorite, color analysis – wherein a group of researchers has reached convoluted but ultimately arbitrary conclusions about what the colors in rock art mean, and attempts to use this to unlock the secrets of our ancestors.
Anyway, rock art can probably tell us a lot about the people who made it, if only we can figure out how to read it. Therein lies one of the biggest challenges in prehistoric archaeology – both one of the most tantalizing and one of the most frustrating.
Rock Porn out!
*These cupules are common throughout California. Sometimes they may not be rock art at all, but may instead have been used for grinding something such as seeds or ochre – they are sometimes found in direct connection with bedrock mortar cups, indicating that they were used with the mortars, but they are sometimes found in places where they could not possibly have been used for grinding. These cupules are often interpreted as fertility symbols – for some reason fertility and hunting success are the most common interpretations of rock art – but while this is likely the case for many of the cupules, I suspect that there are other purposes as well.
**I would love to post pictures here, but for various professional ethical reasons it is generally considered a bad idea to post pictures of rock art when you are uncertain of the importance of the art to the local native groups.
Wednesday, February 11, 2009
Happy Darwin Day
Tomorrow (February 12th) is the 200th anniversary of the birth of Charles Darwin. Darwin, of course, was the first person to work out the basic principles of natural selection. Contrary to what many people believe, Charles Darwin was not, in fact, the first person to propose the idea of the evolution of species – the idea had been considered in scientific and philosophical circles for some time, in no small part due to the frequent discovery of fossils – but he was the first person to work out the basic mechanism for evolution, and in so doing was able to link the field observations together into a coherent explanation of life on Earth.
Also contrary to what many believe, the theory of evolution* was not cooked up by “godless scientists” or agreed upon rapidly by those who wanted to remove miracles from the world, or any of the other variations I have heard on that theme. Darwin didn’t publish his work at all until he met someone else who had reached the same conclusions and was about to publish them, and even then he felt as if he had done something wrong in publishing them. They were not adopted immediately, but were subject to hard scrutiny and harsh criticism – but they stood up to both, and, more importantly, provided testable predictions that consistently proved correct.
In the end, Darwin’s work proved a great boon not only to our understanding of how we, and every other species, came to be here, but also to all fields that rely on biology – for example, it’s Darwin’s work that provides us with the models we need to understand the development of viruses and bacteria in order to fight them with new medicines (which is why I am always amused when creationists use vaccines and antibiotics – if their view is correct, these things shouldn’t work).
So, take a chance to ponder how well the breakthroughs of this particular 19th century scientist have improved our lives and views of the world. Oh, and if you have some time, look up the works of Charles Darwin online.
*I’ve said it before and I’ll say it again, theory is one of the most misused words in the English language. Contrary to what many a middle school teacher tells their students, theories are not simply hypotheses that have not been tested enough to be considered laws. A theory is something completely different – it is the body of observations and linking arguments, and can be highly hypothetical, but can also be considered a “fact” – gravity, electricity, the notion that diseases are caused by viruses and bacteria, these are all theories, but nonetheless are absolutely real.
So, really, when someone says that they don’t believe something because it’s “only a theory,” they are making a statement that A) makes no sense at all, and B) shows their complete and utter ignorance of the subject that they have chosen to talk about.
Friday, February 6, 2009
What the hell? Pirate rap?
You ever see one of those things that seems both horribly bizarre and as if it actually makes perfect sense, all at the same time? That's how I felt when I saw an advertisement for a "pirate rap" album by Captain Dan and the Scurvy Crew. A description of the album contains the following sentence:
The subject matter of this album includes (but is not limited to) being a drunken sailor, theft, digging for gold, appreciation for the female form, and attacking Santa Claus.
I'm wondering whether this is due to someone seeing too many pirate movies, if this guy is a follower of the Flying Spaghetti Monster, if his sense of humor is like mine (hmmmm....rap and pirates, these are two things that shouldn't go together under any circumstances - let's see what happens), if he's intentionally drawing comparisons between "gangstas" and pirates (two types of criminal that get romanticized for no good reason), or some combination of all of the above.
Ah, fuck it, why am I analyzing something that's funny. It's funny, let's leave it at that.
Tuesday, February 3, 2009
Notes on Japan
I should probably put up some photos from this trip, but I would have to re-size them, and as such, I will have to do some work before I can post them. Nonetheless, three weeks later, I figure that I should write something about the trip to Japan – which was, it must be said, pretty damn cool. For the sake of brevity (and because it makes it easier for me to make cheap cracks about things that I should probably not be making cheap cracks about), I will divide this post into themed sections.
Food
I had been warned that food in Tokyo would be prohibitively expensive – stories of $20 (roughly ¥2,000) bowls of soup and cups of coffee abounded. I was prepared to have to take out a loan to buy ramen – even had the paperwork ready. So, you can imagine my surprise when I discovered that these prices were rare and limited to very high-end places, and that in general, food cost about the same as it does in a large city in the U.S., say San Francisco or Los Angeles.
The food was also generally of very high quality. Well, actually, what the hell do I know, really? It tasted good, and didn’t cause anyone to get sick. For all I know, I could have been eating in the Tokyo equivalent of Arby’s. But, regardless, I was very pleased and very much enjoyed my meals.
What did seem remarkable to me was how very few surprises there were. The food was very familiar to me from Japanese food that I have had or seen in the U.S. There was nothing really new or odd to me – and I was on the lookout. Much of it was better – the noodles, for example, seemed to be better prepared and were never soggy as they often are in the U.S. – but it was nonetheless very familiar. Interestingly, the only times that we had or saw something that was truly alien, it was invariably when the Tokyo chefs were attempting to make European or American foods – and we ended up with very strange stuffed omelets, watery orange scrambled eggs, and crepes that had either gone very, very wrong or were amazingly good (really, the Japanese seem to have developed a weird fascination with the crepe, and have taken it to a deliciously high level – though there have been some casualties along the way). Come to think of it, the number of mangled western foods that involved egg is really rather interesting.
One of the strangest moments for me came when Kay and I went to an allegedly “Tex Mex” restaurant (because, when you’re in Asia and see a “Tex Mex” place, you HAVE TO go in). In the course of the next hour, I had a curry prepared by a Japanese cook and served by a Nigerian man in a “Tex Mex” place that had an Indiana Jones movie playing in the monitor at the front. Very strange.
Transportation
The Tokyo subway system is both a marvel and a mess, though a well-articulated and highly efficient mess. There are numerous different companies running numerous different trains along numerous different lines. It is tempting to get day passes for one of the lines, but this often leaves you having to shell out more money when you decide to use another line. There were day passes available for most of the lines, but even these would occasionally not work.
And yet, somehow, the system manages to get by fairly well. Once we had learned the basic rules, we managed to get around Tokyo, and even outside of Tokyo, quite efficiently, and at less than the fuel and parking costs to travel equivalent distances in California. Nonetheless, especially in the first day or so, confusion about which train to get on, which entrance to use, and where to exit the train created a lot of trouble.
And then there are the subway stations themselves. Some of these spanned several city blocks, and contained numerous shops, restaurants, and resting places. I suspect some of them may even have had apartment buildings built over them, though I was not clear on that. Regardless, there was a virtual second city built underneath Tokyo consisting of the tunnels and hallways that made up the subway stations. This, naturally, made it easy to get lost. There were numerous occasions when we would walk around in circles looking for the correct exit or train, only to end up at the wrong place and spend another half hour trying to get our bearings.
Luckily, being the only tall and white people nearby, the locals often took pity on us and would help to direct us. And we did eventually get the hang of things. By the end of the trip, we were perfectly comfortable navigating the system.
Cultural Mish-Mash
People from the U.S. love to make fun of “Engrish” – the linguistic part of the perceived tendency for the Japanese to take elements from U.S. and European culture and mash them together into nonsensical or horribly garbled new forms. This tendency includes the seemingly random use of English words or phrases in everything from pop songs to shop names (indeed, it was truly bizarre to see shops with names like “Snobbery” or “Nudy Boy” or a bar named “Ooze Charm”), not to mention the hilarious and common mistranslations (even on government documents, where you’d think they’d at least spring for a native English speaker to help out – there are three continents that contain us, we’re not hard to find). However, it extends beyond language and into many other aspects of culture. As mentioned above, Japanese attempts at producing western foods ranging from pizza to omelets often resulted in strange creations that were foreign and either exotic or disgusting both to the Japanese and to the culture from which the food was allegedly borrowed.
At first, we would see these sorts of things, chuckle, and think to ourselves “wow, the Japanese are just not getting it.” And then I came across something that put the whole thing into context. In one of the guide books I read about a theme restaurant that is designed to look like the interior of a Catholic church (we didn’t go, not due to lack of interest, but rather due to lack of time), and we thought that this was simply beyond bizarre. And then we visited a Shinto shrine, and realized that many U.S.-run Japanese restaurants are designed to resemble these shrines, which, when you get down to it, is not at all different than the restaurant being designed to look like a Catholic church.
Since I returned, I have not been able to help noticing the number of places in the U.S. that either make nonsensical use of words from other languages (“Del Taco” anyone? And how many people have tattoos of Japanese writing elements without any real way of knowing what their tattoo says?) or else are themselves a rather silly mish-mash of different elements from the culture allegedly being paid tribute*. Regardless, this is one place where the outsider (in this case, us) tends to laugh without realizing that, really, we do the exact same thing**.
Regardless, when you are from the culture from which elements are being borrowed, it leads to a surreal experience.
People
Before I left, I had been told that the Japanese people were very formal, and that breaking with formality, even unintentionally, was likely to cause offense. I was delighted to find that this was not true. I suspect that we benefited from being in Tokyo – where folks are accustomed to visitors – and from being obviously white and non-native, thus signaling by our very appearance that we weren’t from ‘round them thar parts. Regardless, we found the people we dealt with were very friendly, willing to lend a hand, and would try to instruct us in the social niceties in as polite and friendly a manner as possible. In other words, they were pretty damn cool.
Now, perhaps they went home and complained about the crazy German guy they dealt with today, and I certainly would not blame them if they had, but at least to my face, they were never less than cordial.
That being said, as happens when one travels, we did have to get used to many customs that were easy to forget (especially when it comes to handling cash – there is a simple but important custom involving trays on which cash is placed).
However, I have heard that while the Japanese people tend to be very friendly to visitors, they are far less friendly towards foreigners who wish to become residents (and you can find some descriptions of this at the Tokyo Damage Report). This is made more difficult by the fact that Japan’s native population is shrinking due to low birth rates, and as such foreign workers are becoming increasingly important to the Japanese economy. Indeed, a similar trend is happening over much of the industrialized world, and will likely result in some major demographic shifts in the next couple of centuries***.
Architecture
The architecture was generally familiar – while there was the occasional pagoda or more traditional house, most of the buildings would not have looked out of place in San Francisco – except for one thing. Because real estate is at such a premium, most buildings had the smallest footprint that they could, resulting in very tall AND skinny buildings everywhere. Otherwise, we saw the range of buildings from bland and utilitarian to amazingly ugly to rather beautiful. Other than the more traditional structures, even the beautiful buildings themselves did not appear to have any particularly “Japanese” quality to them, but rather were what one would expect in any major city. Nonetheless, some of them were amazing.
I would write more here, but really, it’s probably better to just show pictures. So, once I get the photos resized, I’ll post some.
One interesting quirk to Japanese architecture – the concept of privacy in the restroom is a bit different in Japan than in the U.S. As a result, men’s rooms were often either open for all to see, or at least had a large window so that passers-by could see the urinals. In fact, our hotel room had a great view of the men’s room in the office building opposite us. Brings a whole new perspective to the desire to have a room with a view.
However, the stalls were more enclosed than they are in the U.S. Each of the walls and the door proceeded all the way to the ground, rather than having the large gap between wall and floor that we are familiar with. Given that the toilets were typically porcelain basins in the ground with a flushing mechanism attached, rather than the seat toilets with which we are familiar, the full walls are not surprising.
Home-Grown Weird
Like everywhere else, Japan has its own home-grown weird. By weird, I don’t mean a patronizing “oh, look at those cute little non-Americans and their quaint customs” kind of stuff; I mean that many of the Japanese seem to see much of this stuff as pretty damn weird. A lot of it is the general generation gap stuff that you see everywhere, but it is nonetheless of a particularly Japanese flavor and therefore noteworthy.
So, as I say, this home-grown weirdness was primarily found within youth culture (which, let’s face it, is where it’s found in many different societies). There is, for example, the tendency to see “cosplay” (or costume play) among teenagers – in one area of town, we ran across numerous teenagers dressed in all manner of clothing that would have done Liberace proud. It was rather fun to watch, and I expect that, provided that you have the time and patience necessary, fun to do. I didn’t get any particularly great photos, but I would again direct you to the Tokyo Damage Report for some wonderful examples.
There is one form of oddity that was not found primarily within youth culture (though it may have started there), and that was “cuteness.” This includes the cartoons, toys, and pop-culture stuff that is ubiquitous even outside of Japan (such as the Pokemon craze that hit the U.S., leaving millions of casualties in its wake), but it spreads much, much farther than that. For example, warning signs about not sticking your fingers into automatic doors all had a rather cute “crying face” symbol, which seemed to be universal, appearing in both public places and private businesses. Airplanes were painted with cute images. Cell phone charms (which are ubiquitous in Asia, but rather unusual in the U.S.) often sported cute animals or humanoid figures, and so on. Even the music played to indicate when the subway was about to depart was often “cute” in nature.
In Total
Anyway, on the whole it was a wonderful trip, and I hope to return someday – though I think that next time I’ll go to Kyoto to see more of the historical/cultural side of Japan (Tokyo is considered the economic/political capital, and Kyoto the cultural capital), and perhaps I’ll have the opportunity to travel about the country more next time. Regardless, I’m glad I went, and I fully recommend such a trip to anyone who has the means.
*Much of the Japan-o-phile anime sub-culture does this with Japanese culture. The same is true of the culturally pornographic fetishization of India and Tibet by many young white people, which often seems to owe more to an atavistic resurgence of the imperial/colonial-era obsession with the “Mystic Orient” than to a fair assessment of the regions, cultures, and people in question. All the funnier, or more offensive if you’re the sort who takes offense, when you consider that these are the same people who tend to protest against “imperialism.”
**I’ve been laughing at us as well ever since I returned, so I figure I’m just going equal opportunity in making fun of shops with names like “Nudy Boy.” Or I’m being a honky imperialist. Either way, the shop name is funny.
***Many right-wing people and organizations see this as being absolutely disastrous. While there are things that I would prefer not to see spread – any form of militant religion, for example – this sort of demographic change really is just part of the life of humans as a species and always has been, though it can be painful for those living through some of the major phases of it and current population dynamics give it a bit of a twist. It’s nothing new, and really, is going to happen even if everyone on the planet wanted to stop it. It’s just happening on a larger scale because of the concurrent growth of population and technological development. However, it’s how all of our current ethnic groups came to be, and no doubt will simply create new ethnic groups down the line, which will in turn face the same things themselves eventually – unless a comet hits the planet first.
Food
I had been warned that food in Tokyo would be prohibitively expensive – stories of $20 (or approximately $2000 Yen) bowls of soup and cups of coffee abounded. I was prepared to have to take out a line to buy ramen – even had the paperwork ready. So, you can imagine my surprise when I discovered that these prices were rare and limited to very high-end places, and that in general, food cost about the same as it does in a large city in the U.S., say San Francisco or Los Angeles.
The food was also generally of very high quality. Well, actually, what the hell do I know, really? It tasted good, and didn’t cause anyone to get sick. For all I know, I could have been eating in the Tokyo equivalent of Arbys. But, regardless, I was very pleased and very much enjoyed my meals.
What did seem remarkable to me was how very few surprises there were. The food was very familiar to me from Japanese food that I have had or seen in the U.S. There was nothing really new or odd to me – and I was on the lookout. Much of it was better – the noodles, for example, seemed to be better prepared and were never soggy as they often are in the U.S., but it was nonetheless very familiar. Interestingly, the only times that we had or saw something that was truly alien, it was invariably when the Tokyo chefs were attempting to make European or American foods – and we ended up with very strange stuffed omelets, watery orange scrambled eggs, and crepes that had either gone very, very wrong or were amazingly good (really, the Japanese seem to have taken a weird fascination with the crepe, and have developed it to a deliciously high level – though there have been some casualties along the way). Come to think of it, the number of mangled western foods that involved egg is really rather interesting.
One of the strangest moments for me came when Kay and I went to an allegedly “Tex Mex” restaurant (because, when you’re in Asia and see a “Tex Mex” place, you HAVE TO go in). In the course of the next hour, I had a curry prepared by a Japanese cook and served by a Nigerian man in a “Tex Mex” place that had an Indiana Jones movie playing in the monitor at the front. Very strange.
Transportation
The Tokyo subway system is both a marvel and a mess, though a well-articulated and highly efficient mess. There are numerous different companies running numerous different trains along numerous different lines. It is tempting to get day passes for one of the lines, but this often leaves you having to shell out more money when you decide to use another line. There were day passes available for most of the lines, but even these would occasionally not work.
And yet, somehow, the system manages to get by fairly well. Once we had learned the basic rules, we managed to get around Tokyo, and even outside of Tokyo, quite efficiently, and at less than the fuel and parking costs to travel equivalent distances in California. Nonetheless, especially in the first day or so, confusion about which train to get on, which entrance to use, and where to exit the train created a lot of trouble.
And then there is the subway stations themselves. Some of these spanned several city blocks, and contained numerous shops, restaurants, and resting places. I suspect some of them may even have had apartment buildings built over them, though I was not clear on that. Regardless, there was a virtual second city built underneath Tokyo consisting of the tunnels and hallways that made up the subway stations. This, naturally, made it easy to get lost. There were numerous occasions when we would walk around in circles looking for the correct exit or train, only to end up at the wrong place and spend another half hour trying to get our bearings.
Luckily, being the only tall and white people nearby, the locals often took pity on us and would help to direct us. And we did eventually get the hang of things. By the end of the trip, we were perfectly comfortable navigating the system.
Cultural Mish-Mash
People from the U.S. love to make fun of “Engrish” – the linguistic part of the perceived tendency for the Japanese to take elements from U.S. and European culture and mash them together into nonsensical or horribly garbled new forms. This tendency includes the seemingly random use of English words or phrases in everything from pop songs to shop names (indeed, it was truly bizarre to see shops with names like “Snobbery” or “Nudy Boy” or a bar named “Ooze Charm”), not to mention the hilarious and common mistranslation (even on government documents, where you think they’d at least spring for a native English Speaker to help out – there’s three continents that contain us, we’re not hard to find). However, it extends beyond language and into many other aspects of culture. As mentioned above, Japanese attempts at producing western foods ranging from pizza to omelets often resulted in strange creations that were foreign and either exotic or disgusting both to the Japanese and to the culture from which the food was allegedly borrowed.
At first, we would see these sorts of things, chuckle, and think to ourselves “wow, the Japanese are just not getting it.” And then I came across something that put the whole thing into context. In one of the guide books I read about a theme restaurant that is designed to look like the interior of a Catholic church (we didn’t go, not due to lack of interest, but rather due to lack of time), and we thought that this was simply beyond bizarre. And then we visited a Shinto shrine, and realized that many U.S.-run Japanese restaurants are designed to resemble these shrines, which, when you get down to it, is not at all different than the restaurant being designed to look like a Catholic church.
Since I returned, I have not been able to help noticing the number of places in the U.S. that either make nonsensical use of words from other languages (“Del Taco” anyone? And how many people have tattoos of Japanese writing elements without any real way of knowing what their tattoo says?) or else are themselves a rather silly mish-mash of different elements from the culture allegedly being paid tribute*. Regardless, this is one place where the outsider (in this case, us) tends to laugh without realizing that, really, they do the exact same thing**.
Regardless, when you are from the culture from which elements are being borrowed, it leads to a surreal experience.
People
Before I left, I had been told that the Japanese people were very formal, and that breaking with formality, even unintentionally, was likely to cause offense. I was delighted to find that this was not true. I suspect that we benefited from being in Tokyo – where folks are accustomed to visitors – and from being obviously white and non-native, thus signaling by our very appearance that we weren’t from ‘round them thar parts. Regardless, we found the people we dealt with were very friendly, willing to lend a hand, and would try to instruct us in the social niceties in as polite and friendly a manner as possible. In other words, they were pretty damn cool.
Now, perhaps they went home and complained about the crazy German guy they dealt with today, and I certainly would not blame them if they had, but at least to my face, they were never less than cordial.
That being said, as happens when one travels, we did have to get used to many customs that were easy to forget (especially when it comes to handling cash – there is a simple but important custom involving trays on which cash is placed).
However, I have heard that while the Japanese people tend to be very friendly to visitors, they are far less friendly towards foreigners who wish to become residents (and you can find some descriptions of this at the Tokyo Damage Report). This is made more difficult by the fact that Japan’s native population is shrinking due to low birth rates, and as such foreign workers are becoming increasingly important to the Japanese economy. Indeed, a similar trend is happening over much of the industrialized world, and will likely result in some major demographic shifts in the next couple of centuries***.
Architecture
The architecture was generally familiar – while there was the occasional pagoda or more traditional house, most of the buildings would not have looked out of place in San Francisco – except for one thing. Because real estate is at such a premium, most buildings had the smallest footprint that they could, resulting in very tall AND skinny buildings everywhere. Otherwise, we saw the range of buildings from bland and utilitarian to amazingly ugly to rather beautiful. Other than the more traditional structures, even the beautiful buildings did not appear to have any particularly “Japanese” quality to them, but rather were what one would expect in any major city. Nonetheless, some of them were amazing.
I would write more here, but really, it’s probably better to just show pictures. So, once I get the photos resized, I’ll post some.
One interesting quirk to Japanese architecture – the concept of privacy in the restroom is a bit different in Japan than in the U.S. As a result, men’s rooms were often either open for all to see, or at least had a large window so that passers-by could see the urinals. In fact, our hotel room had a great view of the men’s room in the office building opposite us. Brings a whole new perspective to the desire to have a room with a view.
The stalls, however, were more enclosed than they are in the U.S. Each of the walls and the door extended all the way to the ground, rather than having the large gap between wall and floor that we are familiar with. Given that the toilets were typically porcelain basins in the ground with a flushing mechanism attached, rather than the seat toilets with which we are familiar, the full walls are not surprising.
Home-Grown Weird
Like everywhere else, Japan has its own home-grown weird. By weird, I don’t mean a patronizing “oh, look at those cute little non-Americans and their quaint customs” kind of stuff – I mean that many of the Japanese seem to see much of this stuff as pretty damn weird themselves. A lot of it is the general generation-gap stuff that you see everywhere, but it is nonetheless of a particularly Japanese flavor and therefore noteworthy.
So, as I say, this home-grown weirdness was primarily found within youth culture (which, let’s face it, is where it’s found in many different societies). There is, for example, the tendency to see “cosplay” (or costume play) among teenagers – in one area of town, we ran across numerous teenagers dressed in all manner of clothing that would have done Liberace proud. It was rather fun to watch, and I expect that, provided that you have the time and patience necessary, fun to do. I didn’t get any particularly great photos, but I would again direct you to the Tokyo Damage Report for some wonderful examples.
There is one form of oddity that was not found primarily within youth culture (though it may have started there), and that was “cuteness.” This includes the cartoons, toys, and pop-culture stuff that is ubiquitous even outside of Japan (such as the Pokemon craze that hit the U.S., leaving millions of casualties in its wake), but it spreads much, much farther than that. For example, warning signs about not sticking your fingers into automatic doors all bore a rather cute “crying face,” which seemed to be universal, appearing in both public places and private businesses. Airplanes were painted with cute images. Cell phone charms (which are ubiquitous in Asia, but rather unusual in the U.S.) often sported cute animals or humanoid figures, and so on. Even the music played to indicate that the subway was about to depart was often “cute” in nature.
In Total
Anyway, on the whole it was a wonderful trip, and I hope to return someday – though I think that next time I’ll go to Kyoto to see more of the historical/cultural side of Japan (Tokyo is considered the economic/political capital, and Kyoto the cultural capital), and perhaps I’ll have the opportunity to travel about the country more. Regardless, I’m glad I went, and I fully recommend such a trip to anyone who has the means.
*Much of the Japan-o-phile anime sub-culture does this with Japanese culture. The same is true of the culturally pornographic fetishization of India and Tibet by many young white people, which often seems to owe more to an atavistic resurgence of the imperial/colonial-era obsession with the “Mystic Orient” than to a fair assessment of the regions, cultures, and people in question. All the funnier, or more offensive if you’re the sort who takes offense, when you consider that these are the same people who tend to protest against “imperialism.”
**I’ve been laughing at us as well ever since I returned, so I figure I’m just being equal opportunity in making fun of shops with names like “Nudy Boy.” Or I’m being a honky imperialist. Either way, the shop name is funny.
***Many right-wing people and organizations see this as absolutely disastrous. While there are things that I would prefer not to see spread – any form of militant religion, for example – this sort of demographic change really is just part of the life of humans as a species and always has been, though it can be painful for those living through its major phases, and current population dynamics give it a bit of a twist. It’s nothing new, and really, it is going to happen even if everyone on the planet wanted to stop it; it’s just happening on a larger scale because of the concurrent growth of population and technological development. It’s how all of our current ethnic groups came to be, and it will no doubt simply create new ethnic groups down the line, which will in turn face the same things themselves eventually – unless a comet hits the planet first.