When I give public talks, a very common question from the audience is "how come the Native Americans died of European diseases, but the Europeans didn't die from Native American diseases?"
It's a good question. It's a question that never has any real relation to my talks, but I am always glad to get it because it shows that the audience is intelligent and has been thinking about what looks like a puzzling matter. After all, shouldn't it go both ways - if a small number of European settlers, priests, and soldiers can inadvertently wipe out huge swaths of the population with communicable disease to which the natives are not immune, doesn't it follow that the colonists would likewise be vulnerable to native pathogens?
The simple fact of the matter is that nobody really knows why native pathogens (bacteria and viruses) didn't do much harm to the European colonists (or the African slaves that they would eventually bring, or the Asian settlers who would come later still), but we are not completely clueless either*. Most explanations are based on our knowledge of how pathogens spread through a population. Most Native American communities were less dense than those of Europe or Asia, and parts of Africa. This may have provided less opportunity for pathogens to evolve, and may simply mean that there were fewer, and less virulent, diseases in the Americas prior to the arrival of Europeans.
A different, though not mutually exclusive, explanation is that people of the Old World (Europe, Africa, Asia) had long lived in or near cities, and as a result evolution had produced stronger immune systems - those who are more vulnerable to illness typically died at a young age and consequently left no offspring. Likewise, the large number of people living in poor sanitary conditions within the cities and towns of the pre-modern and early modern Old World would have been exposed to a wide range of pathogens, potentially providing immune systems primed for partial protection against new infections.
A new study has shown evidence that people descended from populations with long histories of living in cities (such as Italians, whose ancestors took up city living before Rome) tend to have a genetic trait that improves the immune response to infection, and people descended from non-urban peoples tend to lack this trait. Note the "tend to" - the trait is present in some people descended from non-urban peoples, and lacking in some people who are descended from urbanites - it's a statistical difference, not a categorical one. Still, it is interesting.
Although this is being interpreted as evidence that city living led to selection for stronger immune systems, there may be other explanations for the finding. Still, it's interesting new information to address this old question.
*However, it should be noted that there is evidence, far from conclusive but there nonetheless, that syphilis may have come from the Americas.
The Not Quite Adventures of a Professional Archaeologist and Aspiring Curmudgeon
Thursday, September 30, 2010
Wednesday, September 29, 2010
Perilous Tome of Doom!
I remember coming home from school, I must have been 9 or 10, and finding my mother waiting for me in the living room. She had a book on her lap, and when she picked it up I could see that it was one that I had picked up from the school library. She looked at me, waved the book about, and said “I found this in your room.”
“Yeah?” I replied, not sure why she was bringing this up.
She looked at me and said, “I am disturbed that you have this book!”
I was a bit surprised by this. The book was, as I have already written, from the school library. It wasn’t anything racy, or anything that was not age appropriate. It wasn’t Catcher in the Rye or Fahrenheit 451 or even Huckleberry Finn for that matter. It contained no violence, no sex, and only the mildest language one can imagine.
It was a book aimed at older children about UFOs. Specifically, it argued that folkloric creatures such as elves and gnomes might be aliens. It made this argument as such:

A few different people in Europe claim to have been abducted by UFOs, and they say that the aliens wore what looked like weird hats that sound kind of like the ones that gnomes wear, and some other people said that the aliens are real, real short, so maybe these are like elves or fairies or something. Oh, what’s that? No, we can’t give you the names of the people who saw these things or even which city they live in, but it’s all ever so true, and you trust us, right?

The book was every bit as well-written, well-argued, and intellectually stimulating as you might imagine. It was so unbelievably trite that even to my decade-old brain it seemed absurd. In fact, the logical problems that I found in this book prepared me for seeing more sophisticated versions of the same fallacies in all manner of paranormal claims, and are part of the reason why I would eventually become rather skeptical of the UFO movement.
She stood up, walked towards me, and shook the book in my direction. “Books like this are dangerous!” She left the room, and eventually returned the book to the school library herself, not wanting me to get my hands on it again, I suppose.
The book wasn’t dangerous. At least, to me it wasn’t. It actually was rather helpful in the long run, and amusing in the short run. While I suppose it might help delude some people, I frankly have a hard time conceiving of the person gullible enough to think that this book was anything but drivel, even amongst the pre-teen set.
What I have never been able to figure out is my mother’s reaction. Why was she so concerned about a plainly stupid book? I have a hard time believing that she was concerned about the paranormal themes of the book – she had been the one who had exposed my sisters and me to most of these ideas. Although she has since grown out of it, my mother was very much interested in New-Agey stuff during the late 70s through the 80s. She read the Seth books, introduced us to In Search Of…, did small experiments in remote viewing using my sisters and playing cards, and would routinely watch television shows about alien abductions.
She had never shown a concern about my developing similar interests, and even seemed to encourage it on occasion.
My mother would definitely describe herself as a Christian, and I suppose that she might have been concerned about the potential implication that religious figures are aliens, but the book itself never actually wandered into that territory, and instead focused on claims that critters of European Medieval folklore were the aliens.
I have asked about this incident several times in the 25 or so years since it happened. My mother says that she has no recollection of it, and I suppose that it really isn’t important in the scheme of things. But it’s one of those odd episodes of my life that I have always wondered about.
Monday, September 27, 2010
Movin'
So, we have closed down my office. As noted previously, I still have a job and will be working from home when I am not in the field. However, today I pack up my computer for set-up back home. Tomorrow night, I fly down to southern California for a project.
Anyway, the point is that I'm all over the place, and I have no idea what my posting schedule will be like. However, I hope to get back to a three-times-a-week update schedule soon.
Thursday, September 23, 2010
If Your Only Tool is a Hammer...
If you should decide to delve into the "cutting edge" archaeology literature of the 1970s and 1980s (and unless you are an archaeologist, I don't recommend that you bother), you will come across a number of different books and papers promising that archaeology can and/or should study any number of things that archaeology is, to date, rather unsuited to studying.
These sorts of claims come in two basic themes, with each of these containing many different flavors. Theme #1 can be summed up by saying "all human activity leaves behind physical traces, and therefore the study of the physical remains of human activity (that is, archaeology) can examine all human behaviors, provided that we can figure out which questions to ask."
This is, at best, an overly-enthusiastic belief. It is not really true, but it may be a useful delusion. It's open to debate whether or not all human activity leaves physical traces behind, and even for those that do, the traces are often not meaningful. For example, a fist-fight would leave physical traces in the scuffling on the dirt, but even if, through some miracle, these mild surface disturbances were preserved for examination by future archaeologists, it's unlikely that it would be determined that this was the result of a fist fight and not any number of other activities.
However, there are plenty of cultural phenomena that it was long assumed could not be studied archaeologically, but that archaeologists, armed with the attitude that everything could be studied archaeologically, figured out ways to study. These range from prehistoric economics to socio-political organization to general religious cosmology. The degree to which these things could be studied, and the validity of some of the interpretations offered, is highly variable. But, nonetheless, there has been some success that would not have come without the "we can study anything, we just have to figure out which questions to ask" attitude, and poor explanations have frequently been the starting point down the road towards better explanations.
So, it may be useful for us to fool ourselves into thinking that we can study anything through the archaeological record. But just because it's useful doesn't mean that it is true.
The second theme can be summed up as "all human activities are about/partially about X, and therefore we can use the study of material culture to study X." X can be many things: power relations, gender politics, group or individual identity, etc., etc., etc. Again, there may be a good reason for encouraging this behavior - it encourages people to look at how our material culture and the distribution of materials across the landscape may reflect, and allow the study of, these more ethereal aspects of human behavior. However, it is based on a basic fallacy - the notion that all material remains of human behavior are marked by some ephemeral thing (power relations, gender politics, a desire to pour yogurt on all things), and that this ephemeral thing can be examined through a careful study of the material culture*. Even allowing that most of us don't appreciate the degree to which our material culture does reflect the cultural seas in which we live, and therefore something as seemingly innocuous as a box of breakfast cereal can actually tell quite a complicated tale about the culture that created it, it still doesn't follow that every piece of detritus left behind is useful for examining our lot as a culture.
I have always been amused by what I see as the over-reach of my colleagues. Again, I need to say that I do think that this over-reach can be useful as it may push us to find something new rather than write it off. However, it is still over-reach nonetheless.
Whenever I hear that anything and everything is open to study via material culture, I think of the old saying "when your only tool is a hammer, every problem looks like a nail." When studying people long gone, all that we have is their material culture, and we don't like that there may be things that will always be unknowable. So, we assert that we can study it all, even though we actually can't. It makes us feel better, even if it is false. And even if it is false, it does push us in some interesting directions.
*It should probably be noted that interest in these subjects is most common in what is often referred to as post-processualist or post-modern archaeology. Much of the research in these areas is quite solid, but some of it is performed by people who are very open about being more interested in forwarding a personal or political agenda than in an accurate reconstruction of the past.
Monday, September 20, 2010
Alleged Worst Case
When I worked in Santa Barbara County, I would frequently be called out to consult on City of Santa Barbara projects. These were public works projects such as the construction of roads, replacement of sidewalks, construction of sewer lines, etc. One of the planners, a fellow whose name I cannot recall (which is just as well, as my sense of ethics would prevent me from including it here without his permission), would always ask the same question:
"What is the worst case scenario, from a historic resources standpoint?"
Each time the conversation would play out the same way. He'd ask the question, and I would talk about what was likely to be found. For example, I was once asked to explain the "worst case scenario" for a new sidewalk installation. I explained that the proposed sidewalk was on a road that traversed a steep hillside with no rock outcrops or caves, meaning that the odds of there being an archaeological site were very, very low. Therefore, I couldn't conceive of a "worst case scenario."
"Well, what if we find an Indian cemetery while working?"
I then explained that this was rather unlikely, as people who don't have heavy equipment tend not to bury their dead on 60 degree slopes with no caves or rock outcrops.
"What if this was the exception? What if there was someone really important and special, and they made the effort? What if this was a group that thought it was religiously very important that everyone be buried in a steep hillside?"
And so it went.
This was typical: I would be asked to give a worst case scenario, and when I explained why the situation wasn't dire, I would be faced with a question about an absurd situation that I could pretty much guarantee would never happen, and then asked to provide a full plan for how to deal with it, complete with budget and schedule (seriously, I was sometimes asked for a budget and schedule for dealing with things that didn't even exist).
The problem is that if you are going to play the "we have to think everything through, no matter how unlikely" game you will never reach an end point.
What if you find the site that provides actual, legitimate proof that the Knights Templar fled France and settled in California? What if you encounter the remains of a Neanderthal who managed to migrate to the Americas? What if you find a site that contains clearly unearthly material, proving aliens landed? It's absurd, it's silly, it's stupid to waste time considering it, but if you are going to consider every possibility, no matter how far-fetched, you can't rule it out.
The problem is that, in archaeology, even things that are likely can't be worked out until they are certain. If we do find a burial ground, I can't say how it will be dealt with until the most likely descendants are contacted and consulted, the number of burials is known, and the potential for modifying the project to avoid impacts is assessed. In other words, asking for a "worst case scenario" before anything has been identified in an area is a bit like going to the doctor and asking for a worst-case scenario before he has been able to examine you or hear you describe your symptoms. And you know, I can kind-of forgive construction contractors or land developers when they ask these questions. They may not have dealt with this before, and they may simply be trying to wrap their minds around it. But a city planner who has dealt with countless archaeological consultations? Different story.
This is a situation that I don't find myself in very often, but it does happen, and it's always annoying.
Friday, September 17, 2010
Long Walks Arguing on the Beach
A few years ago, I was walking down the beach in Aptos, CA. I had gone out to take photos, and was generally having a good time minding my own business, when I heard, from behind me, "Pardon me, but what does this mean, 'The God Delusion'?"
I had forgotten which shirt I was wearing. It was one that I had picked up at a Richard Dawkins talk a couple of weeks earlier. So, now, I was of two minds on this. On the one hand, I know perfectly well that going around wearing a shirt that says "The God Delusion" is likely to get some people worked up, and that if I wear such a thing, I'd better be prepared to deal with the displeased. On the other hand, if I were wearing a pro-religion shirt (pro-Christianity, or in the area in which I live, pro-whateverasianorpseudonativeamericanreligion), then I'd likely be able to go about unmolested. And, really, I fail to see how my shirt (which essentially indicates that I believe that all religions are probably wrong) is different from a pro-Christian shirt (which, by the nature of Christianity, essentially indicates that the wearer believes that all other religions are definitely wrong) in its "insult anyone who doesn't agree with me" factor, and therefore I don't see any good reason why I should get any more guff than the guy with the "Sins washed by the blood of the lamb" or "Christ or Antichrist, there is no other choice" shirt (yes, I have seen this shirt, as well as a bumper sticker).
Still, there's the way that things should be, and the way that they are. So, I stood up a bit straighter, and turned to face the person addressing me, ready to have to argue.
"Well, it's a book written by a guy named Richard Dawkins. He argues that, as there is no real evidence for a god, and that the evidence that people tend to cite for a god doesn't really stand up to scrutiny, it is not reasonable to conclude that there is one."
"Oh," the fellow considered this for a moment, "I would argue that it is a bad idea to state that there is nothing simply because there is no evidence for something*."
So, I responded that simply not being able to disprove something doesn't make its existence as likely as its non-existence. And he responded quite intelligently to that, and we both began walking. In all, I spent the next hour walking down the beach with this complete stranger, debating religion in a pleasant way, and in the end, we shook hands and went our separate ways.
Did either of us change the other's mind as regards theism? Probably not. But this is, I think, an important thing to remember, for all of us. The fellow and I absolutely disagree on the existence of god, but he was not some bile-spewing idiot with a desire to smash all infidels. And he saw that I was not someone who was going to attack him for being foolish. I know that, when my fellow atheists make disparaging comments about people who believe in deities, I have been able to bring my conversation with this fellow up as an example of a theist who was smart, reasonable, and a decent guy. I hope that, when he is around fellow believers who make similar comments about atheists, he is able to bring me up as an example of an atheist who did not meet the negative stereotypes.
I am under no illusions. I doubt that supernatural beliefs will ever go away. I also doubt that there will ever be any religion that manages to convert the world. So, while there are times and places for argument and even fighting, it does us well to recognize that even our most vicious opponents are human. Persuading everyone on the planet of a belief or lack thereof will never happen, and there will always be those who wish to push their beliefs on others, but we can reduce conflict simply by humanizing the other, and learning to defend and argue for your position while not being a dick is a huge step on that path.
*Ahh, the ol' agnostic defense. The problem, of course, is that it assumes that the person with whom you are arguing is convinced that there is no god, rather than simply concluding that there is no reason to believe in one (a subtle but important difference). Also, it essentially assumes that if neither of two potential positions can be 100% proven, they must both be equally likely - a position shown to be absurd by Russell's Teapot analogy (updated as The Flying Spaghetti Monster).
I also like my own "Gillian Anderson Argument" - I cannot prove that, when I head home tonight, Gillian Anderson will not be there to serve me dinner. Neither can I prove that she will be there. However, this does not mean that there is a 50% chance of her being there. In fact, as there is no evidence supporting the claim that she will be there, and there are many logistical problems with the idea of her being there, I must rate the probability of such a thing occurring as very low. Likewise, there are many things about our universe that one would expect to be rather different if there were a god, and so simply not being able to prove that there is not one does not raise the possibility that there is one to 50%.
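For readers who like to see the reasoning made explicit: the point of the "Gillian Anderson Argument" can be sketched with Bayes' theorem. This is a minimal illustration only, and every number in it is invented for the example - the argument's force doesn't depend on the exact figures, just on the prior being low and the absence of evidence being far more expected if the claim is false.

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(claim | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Claim: "Gillian Anderson will be at my house tonight."
# The "evidence" is the total absence of any sign she would be.
prior = 0.001                  # made-up low starting credence
p_no_signs_if_true = 0.05      # if she were coming, we'd expect some sign
p_no_signs_if_false = 0.999    # if she isn't, seeing no sign is expected

p = posterior(prior, p_no_signs_if_true, p_no_signs_if_false)
print(p < 0.5)  # unprovability does not make the odds 50/50
```

The takeaway is that "you can't disprove it" leaves the posterior close to the prior (or pushes it lower when expected evidence is missing); it never drags an arbitrary claim up to even odds.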
I had forgotten which shirt I was wearing. It was one that I had picked up at a Richard Dawkins talk a couple of weeks earlier. So, now, I was of two minds on this. On the one hand, I know perfectly well that going around wearing a shirt that says "The God Delusion" is likely to get some people worked up, and that if I wear such a thing, I'd better be prepared to deal with the displeased. On the other hand, if I were wearing a pro-religion shirt (pro-Christianity, or in the area in which I live, pro-whateverasianorpsuedonativeamericanreligion), then I'd likely be able to go about unmolested. And, really, I fail to see how my shirt (which essentially indicates that I believe that all religions are probably wrong) is different from a pro-Christian shirt (which, by the nature of Christianity, essentially indicates that the wearer believes that all other religions are definitely wrong) in it's "insult anyone who doesn't agree with me" factor, and therefore I don't see any good reason why I should get any more guff than they guy with the "Sins washed by the blood of the lamb" or "Christ or Antichrist, there is no other choice" shirt (yes, I have seen this shirt, as well as a bumper sticker).
Still, there's the way that things should be, and the way that they are. So, I stood up a bit straighter, and turned to face the person addressing me, ready to have to argue.
"Well, it's a book written by a guy named Richard Dawkins. He argues that, as there is no real evidence for a god, and the evidence that people tend to cite for one doesn't really stand up to scrutiny, it is not reasonable to conclude that there is one."
"Oh," the fellow considered this for a moment, "I would argue that it is a bad idea to state that there is nothing simply because there is no evidence for something*."
So, I responded that simply not being able to disprove something doesn't make its existence as likely as its non-existence. And he responded quite intelligently to that, and we both began walking. In all, I spent the next hour walking down the beach with this complete stranger, debating religion in a pleasant way, and in the end, we shook hands and went our separate ways.
Did either of us change the other's mind as regards theism? Probably not. But this is, I think, an important thing to remember, for all of us. The fellow and I absolutely disagree on the existence of god, but he was not some bile-spewing idiot with a desire to smash all infidels. And he saw that I was not someone who was going to attack him for being foolish. I know that, when my fellow atheists make disparaging comments about people who believe in deities, I have been able to bring my conversation with this fellow up as an example of a theist who was smart, reasonable, and a decent guy. I hope that, when he is around fellow believers who make similar comments about atheists, he is able to bring me up as an example of an atheist who did not meet the negative stereotypes.
I am under no illusions. I doubt that supernatural beliefs will ever go away. I also doubt that there will ever be any religion that manages to convert the world. So, while there are times and places for argument and even fighting, it does us well to recognize that even our most vicious opponents are human. Persuading everyone on the planet of a belief or lack thereof will never happen, and there will always be those who wish to push their beliefs on others, but we can reduce conflict simply by humanizing the other, and learning to defend and argue for your position while not being a dick is a huge step on that path.
*Ahh, the ol' agnostic defense. The problem, of course, being that it assumes that the person with whom you are arguing is convinced that there is no god, rather than simply concluding that there is no reason to believe in one (a subtle but important difference). Also, it essentially assumes that if neither of two potential positions can be 100% proven, they must both be equally likely - a position shown to be absurd by Russell's Teapot analogy (updated as the Flying Spaghetti Monster).
I also like my own "Gillian Anderson Argument" - I cannot prove that, when I head home tonight, Gillian Anderson will not be there to serve me dinner. Neither can I prove that she will be there. However, this does not mean that there is a 50% chance of her being there. In fact, as there is no evidence supporting the claim that she will be there, and there are many logistical problems with the idea of her being there, I must rate the probability of such a thing occurring as very low. Likewise, there are many things about our universe that one would expect to be rather different if there were a god, and so simply not being able to prove that there is not one does not raise the probability that there is one to 50%.
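For those who like to see the arithmetic behind the footnote, here is a toy Bayesian sketch of the point - that failing to disprove a claim does not make it 50% likely. The prior and likelihood numbers are invented purely for illustration:

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule for a yes/no claim: P(claim | what we observed)."""
    joint_true = prior * p_evidence_if_true
    joint_false = (1 - prior) * p_evidence_if_false
    return joint_true / (joint_true + joint_false)

# "Gillian Anderson will be at my house tonight": logistics make the
# prior tiny, and "no supporting evidence" is exactly what we would
# observe whether or not she shows up, so observing nothing leaves the
# probability essentially where it started - nowhere near 50%.
print(posterior(1e-6, p_evidence_if_true=1.0, p_evidence_if_false=1.0))

# And if the world looks different from how it should look were the
# claim true, the probability drops even further.
print(posterior(1e-6, p_evidence_if_true=0.1, p_evidence_if_false=1.0))
```

Note that the inability to disprove the claim never enters the calculation; only the prior and the evidence do.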
Thursday, September 16, 2010
Conquistadors, Vikings, Indians, Skraelings, and Cloth Money?
The previous post made comments about the British Museum getting it wrong as regards money in the Americas. I was poking at that largely for comedic effect, and as a way to start talking about the introduction of Spanish beads into the Americas, but it's not quite accurate.
The shell beads that I discuss are known to have been in use some time after AD 1000, in which case the display was correct. There is a fair amount of debate as to when the shell beads came to be used as money, and even which types of shell beads were used as money. However, a growing body of evidence suggests that the trade networks found by the Europeans when they arrived may have been the latest form of a fluid system that had been evolving for nearly 2,000 years. The developments were regional - the people of California had no idea what the people in Colorado were up to, nor did the people in Colorado know what the people in Florida were doing, etc. - but there were common developments in many areas with the rise of socio-political complexity and exchange relationships (though it should be noted that there were other parts of North America where the people remained nomadic bands, and yet others where they became sedentary farmers; there was a lot of variation).
Myself, I am persuaded by the arguments of archaeologists who hold that there was some form of semi-formalized exchange, probably using beads as the medium of exchange, prior to AD 1000. The data is ambiguous, however, and my view is in the minority. Still, I would like to point to similarities that I have noted in two ethnohistoric sources (historic records that describe contact with prehistoric peoples) to illustrate my point.
The journals of Spanish explorers into California and Florida describe the use of "pacification gifts" to gain favor with the native peoples that they encountered. The Spanish brought a large number of items - tools, clothing, coins, etc. - but while the Native Americans were happy to take other items, they really wanted the beads. As noted in yesterday's post, this is because the beads fit in with existing economic practices, and were therefore more immediately valuable to the native peoples than the other items. The Spanish often remarked on their surprise and confusion at the popularity of the beads.
The Vinland Sagas describe Viking attempts to establish a colony in North America. The sagas are Norse historic epics, usually written in a way that is not quite larger-than-life. Usually, they are the stories of heroic figures, and as such can be expected to be somewhat "modified" from reality. However, historians have long used them as a source - provided that one doesn't take everything literally, they can be of great value.
At any rate, the Vinland Sagas tell of encounters with native North Americans, whom the Vikings called Skraelings (after their name for the native peoples of Greenland). The Skraelings established a complicated relationship with the Vikings - sometimes trading, sometimes fighting. When trading, the Skraelings routinely requested fragments of dyed cloth, sometimes turning down objects that the Vikings thought were more valuable in favor of the cloth.
The similarity here is rather striking - utilitarian items, some of them of great use, turned down in favor of a rather odd and cheap bauble. This implies that the bauble, whether a bead or a piece of cloth, somehow fit into the existing native culture in a way that the other items couldn't.
You can probably see where I'm going with this. I suspect that the cloth served a purpose in eastern Canada ca. AD 1000 similar to what the beads served in California ca. AD 1800. I think we may be seeing a native form of money here.
Now, it should be said that my evidence is not particularly strong. The cloth may have served some other type of use, and the similarities between it and the beads are superficial. But I am struck by the similarities nonetheless, and I think that this may be early evidence for a type of North American money.
Wednesday, September 15, 2010
Conquistadors, Beads, Inflation, and the British Museum
While visiting the British Museum I saw a display, sponsored by the Bank of England naturally, on the history of money. One display showed examples of early money from across the globe, including bead-based moneys from Africa and Australia. However, North America was dismissed with a statement to the effect of "we know that there must have been some form of trade item, or early money, but we haven't any idea as to what it could have been."
The offending display from the British Museum
A close-up of the offending display. Let your anger flow through you as you read the lies. Lies, I say! Okay, actually, there's some truth to it - we're not quite sure how money functioned prior to AD 1000, but it likely was a form similar to the bead system found in many other parts of the world.
I can only assume that whoever wrote the display copy had never bothered to read any books or magazine articles, or watch any television shows, on prehistoric North America. If he had, he'd have realized that shell bead money, much like that described for Africa and Australia, had been in use over much of North America. In fact, it's part of our American mythology that Rhode Island was purchased with beads that amounted to $20 (a rather huge embellishment based on a few semi-related true events).
In fact, the use of beads by many Native American groups was exploited by Spanish colonists in California during the 18th century.
Photo from bcartifacts.com
It's important to understand what money really is - we so often think of it as an object in and of itself, but it really isn't. It's essentially a way of storing wealth, much like a battery stores electricity*. In many societies, the creation of an early form of money acted as a sort of safety net. During times of plenty, a person or group could trade away their surplus for the currency, and then spend the currency during lean times to ensure that they had enough to survive. This allows the growth of complex exchange networks, and allows sedentary populations to form and grow in environments that are seasonally barren. If I can build up a stockpile of money during the summer, when local foods are plentiful, then I can spend that money to buy food during the winter, when local resources are more meager - but this requires that I maintain good relations with someone who has plenty in the winter but is not doing so well during the summer. This leads to cooperation across ecological zones, and may be a step in the formation of more complex state-type societies.
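For the more programming-inclined readers, this seasonal safety-net idea can be sketched as a toy exchange between two villages. Every name and number here is invented for illustration; the point is only that a durable currency lets surplus in one season pay for food in another:

```python
def trade(buyer, seller, food_amount, price_per_unit):
    """Buyer spends beads for the seller's surplus food."""
    cost = food_amount * price_per_unit
    if buyer["beads"] < cost or seller["food"] < food_amount:
        return False  # not enough beads, or not enough surplus
    buyer["beads"] -= cost
    buyer["food"] += food_amount
    seller["beads"] += cost
    seller["food"] -= food_amount
    return True

coast = {"beads": 10, "food": 0}    # lean season on the coast
valley = {"beads": 0, "food": 8}    # harvest season in the valley

trade(buyer=coast, seller=valley, food_amount=5, price_per_unit=2)

# The coast eats through its lean season, and the valley now holds
# beads it can spend when the seasons reverse.
print(coast, valley)
```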
Prior to the arrival of the Spanish, shell beads were used as the primary form of currency in California. Beads made from the shell of Olivella biplicata were ideal for this purpose because they took effort to manufacture and were based on a relatively limited resource, limiting the quantity likely to enter the system at any one time. Ethnohistoric data from the Santa Barbara Channel region indicates that the value of a string of shell beads depended on the length of the string, the quality of the beads themselves (both in terms of workmanship and in terms of how exotic the beads appeared), and the difficulty of manufacturing the beads.
The most valuable type of bead (to the point that it is considered "the money bead" by many anthropologists) is made from a part of the Olivella shell called the callus. This is a hard-to-get-at and hard-to-work thick piece of the shell (see the diagram below), and beads made from it were considered especially valuable and important.
Image from Benyhoff and Hughes Shell Bead and Ornament Exchange Networks Between California and the Great Basin
The Spanish used beads as gifts when meeting and trying to establish relations with the Native Californians. The hunger for the beads confused the Spanish, who thought that other goods that they had brought should be more desirable. However, as the native peoples had a rather complex economic system based on exchange of goods across ecological zones (coast to valleys to inland mountains, etc.) which used beads as the medium of exchange, and the Spanish beads would have been considered very high value in this system, the beads made for an ideal item to foster interaction.
Photo of European glass beads at the American Museum of Natural History
The beads also appear to have had a rather interesting effect, or at least I think that they had an interesting effect.
See, the types of shell beads made changed over time, and can be used to figure out how old sites are. The callus cup beads - the really valuable ones - were made during the Late Period (from approximately AD 1200 to AD 1770), a time when social, organizational, and economic complexity really took off. There is a fair amount of evidence indicating that the callus cup beads became common during this period specifically because they were hard to make and therefore could be used as a medium of exchange (if not everyone is going to bother to take the time to make them, then they are perceived as more valuable and can therefore serve as a way of storing and showing wealth). This is also a period during which populations grew, putting pressure on areas where food was abundant only during certain parts of the year, requiring that the people of most regions create and maintain relations with other regions to ensure a steady supply of foods and other materials.
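As an aside for the technically inclined, the way bead types act as time markers can be sketched in a few lines. The callus cup date range follows what I said above; the glass bead range is my own rough placeholder, and a real analysis would use the full published bead chronologies:

```python
# Approximate date ranges (AD) for two bead types; the callus cup
# range comes from the post, the glass range is a rough placeholder.
BEAD_PERIODS = {
    "callus cup": (1200, 1770),
    "glass (European)": (1770, 1900),
}

def date_range(bead_types):
    """Intersect the ranges of all bead types found together at a site."""
    starts = [BEAD_PERIODS[b][0] for b in bead_types]
    ends = [BEAD_PERIODS[b][1] for b in bead_types]
    start, end = max(starts), min(ends)
    return (start, end) if start <= end else None  # None = inconsistent mix

print(date_range(["callus cup"]))                      # (1200, 1770)
print(date_range(["callus cup", "glass (European)"]))  # (1770, 1770)
```

The second call shows why mixed assemblages are informative: a deposit with both bead types must date to right around the period of contact.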
The introduction of glass beads from Europe screwed the system. The glass beads, as noted, had a high value, and their introduction correlates with the disappearance of the callus cup beads. It appears that the glass beads were considered to be of such great value that the callus cup beads couldn't keep up, and the bead makers were essentially driven out of business.
But if you look at a more common type of Olivella bead, called a wall bead (again, see the diagram above), there's something strange a'brewin'. These types of beads had been well made prior to the historic period. The material to make them was clearly carefully removed from the Olivella shell, the holes were drilled carefully, the beads were made to a regular shape (either circular or oval), and the edges were ground smooth.
Photo of Chumash-made shell beads, from NPS.gov
But after the glass beads are introduced, the wall bead shapes become irregular, the holes are not drilled as carefully, and the edges are either not ground as smooth or, in many cases, not ground at all.
What was going on here? Well, there are a number of explanations available, but I think that we may be seeing inflation. The presence of the glass beads essentially destroyed the value of callus cup beads, resulting in their no longer being manufactured. I suspect - and keep in mind that this is my own interpretation, and there are many other archaeologists who would disagree with me - that the wall beads also had a place within the economic system (either as money or as prestige goods or both, as the callus cup beads had), and that the availability of higher-quality and more exotic glass beads resulted in the bead makers realizing that they couldn't compete in terms of quality, but they could compete in terms of quantity - and so lower-quality but more easily made beads became common.
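My quantity-over-quality interpretation can be put in toy numerical terms. All of these values are invented; the sketch only shows the logic of the shift, not real exchange rates:

```python
def daily_output_value(beads_per_day, value_per_bead):
    """Value a bead maker produces in one day of work."""
    return beads_per_day * value_per_bead

# Before contact: a few careful wall beads carry a quality premium.
fine_before = daily_output_value(beads_per_day=5, value_per_bead=4)
rough_before = daily_output_value(beads_per_day=20, value_per_bead=0.5)

# After glass beads arrive, fine workmanship no longer commands a
# premium, but quick, rough beads still trade at about their old value.
fine_after = daily_output_value(beads_per_day=5, value_per_bead=1)
rough_after = daily_output_value(beads_per_day=20, value_per_bead=0.5)

print(fine_before > rough_before)  # quality pays before contact
print(rough_after > fine_after)    # quantity pays after
```

Under these made-up numbers, a rational bead maker switches to cranking out rough beads - which is exactly the drop in workmanship seen in the archaeological record.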
Another explanation may be that the bead makers were specialists, and that the wall beads retained their importance into the Historic Period, but that the upset to the native economy due to the introduction of European goods meant that the bead makers had to find other ways of making a living, and therefore simply had less time to devote to bead making, hence the lower-quality goods.
Regardless of the explanation, it is interesting that, rather than simply eliminate a native industry, the introduction of European beads altered it.
Oh, and the people at the British Museum really need to do more homework when putting up displays dealing with the Americas.
*Much is made of the fact that much of the world's current money has no value behind it, and a minority of people demand that we MUST get back on a precious metal standard if we are to avoid calamity. Most money was once backed by gold or silver or other precious metals, but is now backed by nothing other than people all agreeing that it has value. However, this has always been the case for money - even gold and silver have little real use (outside of some relatively minor industrial uses) and are only of value because everyone agrees that they are.
In other words, contrary to what some people claim, if the economy collapsed tomorrow, having all of your finances tied up in gold and silver would do you no real good. Having all of your finances tied up in guns, bullets, and canned food, on the other hand, would put you in a position of power in the post-collapse world. Just sayin'.
Monday, September 13, 2010
Anthropology and Changing Myths vs. "Eternal" Scripture
Ethnographers working with the Klamath and Modoc Native American groups in northern California and southern Oregon during the early 20th century encountered a yearly celebration known as the Shaman's Dance. The celebration was held during the winter, and people would gather at the home of the village shaman for a several-day-long event in which the shaman, as well as others who held mystical ability, would demonstrate their powers. The show was part entertainment, part community gathering, and part reinforcement of the shaman's position as a person of power within the community. It also served as a time for people to gather together and discuss community business, an important part of which was sorting out the stories.
Gathered together in the Shaman's home, one person would start telling one of the stories - a story of the culture heroes, a story of the spirits, a story of the group's origins, etc. The teller would continue until somebody objected to some aspect of the story being told, at which point all gathered would debate the matter until a consensus could be reached and the story continued or completed. This would, obviously, result in the stories that served as the base of the religion being different after the Shaman's Dance than they had been before it. This confused many of the ethnographers, one of whom wrote in his final monograph that he had asked several people present whether the new or the old version was the true version, and he received the answer "the story used to go one way, now it goes another."
As ethnographers covered more ground over the course of the 20th century, it became clear both that the mythic stories of pre-literate societies changed over time, and that the people of these societies were generally aware that they changed and were not perturbed by this fact. The way in which the changes were acknowledged and incorporated varied across time and space. Amongst the Klamath and the Modoc, a frank admission that the stories had changed was noted, and these people saw nothing at all wrong with this. By contrast, amongst some northern Australian Aborigine groups, there existed a position known as the Law Man, whose job was to tell the Dream Time stories and to interpret them, often changing them (or creating new ones) in the process.
This phenomenon of changing stories provides a fascinating insight into the origins of religion. It is no secret to anyone that the stories that make up the mythic base of any religion provide the basic rules as to how to live as a member of society. When one looks at the myths of many groups, they also provide information regarding the locations of resources, allowing a "mythological map" to be held by anyone who knows the myths (I have often wondered if our very idea of a "sacred space" may be descended from, or at least related to, this use of myth). Those of us who live in literate societies with religions that either are (or at one point were) closely tied to the government (even within the U.S., we are descended from nations that had official churches) tend to think of myth as being unchanging; it is even common for people to talk about the "eternal truths" of their particular religion's stories.
And yet, conditions change. Social alliances shift, resource locations move, and behaviors that were once advantageous may become counter-productive (and vice-versa). And if you are basing your strategies for dealing with the world on myth, as most humans have for most of our history, then myths need to change to allow the people who make use of them to adapt to a changed world. Having a mechanism to create these changes allows the traditions, culture, and (importantly) people to better navigate a world that is not static. However, an acknowledgement of that change, which would have made perfect sense to the people who carried and lived with these myths, seems bizarre and counter-intuitive to those of us who live in literate societies where the myths have been written down and usually have been declared "eternal and inerrant."
But if religion has an evolutionary advantage, it likely is in allowing us to find a way of navigating the world - physically, socially, and psychologically. The shifting myths, with no "true" version, encountered by early ethnographers are probably much closer to religion when it served as a strong tool for our ancestors.
Indeed, for all of the talk of the "big three" modern monotheisms (Christianity, Judaism, Islam), these religions are also open to change. The Bible (inclusive of the Torah or Old Testament and the New Testament) and the Koran are both large, sprawling, and internally inconsistent. Churches and mosques choose to ignore certain passages, emphasize others, and downplay yet others. Religious authorities and interested writers of all of these religions produce volumes explaining why some religious passages allegedly means something very different from what they actually say. This is really an attempt to make a more rigid written mythology flex in the way that earlier mythologies did, even if people deny that this is what they are doing.
When we look at the Pope of the Catholic Church or the Mormon Prophet, we see roles that are large-scale adaptations of something like the Law Man of the Aborigines. They even have their own elder's councils. And other religions have similar individuals are groups who modify the doctrine to be in better keeping with the changing world. There are, of course, differences (many of them dictated by the sheer scale of modern world-spanning churches), but the same basic seed is present.
But even if the modern religions are able to adapt their written myths to the modern world, they suffer the basic problem that everyone can read the text and determine whether or not what the leadership says is in keeping. Without an open acknowledgement that the stories must change, schism is probably made inevitable.
Gathered together in the Shaman's home, one person would start telling one of the stories - a story of the culture heroes, a story of the spirits, a story of the group's origins, etc. The teller would continue until somebody objected to some aspect of the story being told, at which point all gathered would debate the matter until a consensus could be reached and the story continued or completed. This would, obviously, result in the stories that served as the base of the religion being different after the Shaman's Dance than they had been before it. This confused many of the ethnographers, one of whom wrote in his final monograph that he had asked several people present whether the new or the old version was the true version, and he received the answer "the story used to go one way, now it goes another."
As ethnographers covered more ground over the course of the 20th century, it became clear both that the mythic stories of pre-literate societies changed over time, and that the people of these societies were generally aware that they changed and were not perturbed by this fact. The way in which the changes were acknowledged and incorporated varied across time and space. Amongst the Klamath and the Modoc, ethnographers noted a frank admission that the stories had changed, and these people saw nothing at all wrong with this. By contrast, amongst some northern Australian Aborigine groups, there existed a position known as the Law Man, whose job was to tell the Dreamtime stories and to interpret them, often changing them (or creating new ones) in the process.
This phenomenon of changing stories provides a fascinating insight into the origins of religion. It is no secret to anyone that the stories that make up the mythic base of any religion provide the basic rules as to how to live as a member of society. When one looks at the myths of many groups, they also provide information regarding the locations of resources, allowing a "mythological map" to be held by anyone who knows the myths (I have often wondered if our very idea of a "sacred space" may be descended from, or at least related to, this use of myth). Those of us who live in post-literate societies with religions that either are (or at one point were) closely tied to the government (even within the U.S., we are descended from nations that had official churches) tend to think of myth as being unchanging; it's even common for people to talk about the "eternal truths" of their particular religion's stories.
And yet, conditions change. Social alliances shift, resource locations move, and behaviors that were once advantageous may become counter-productive (and vice-versa). And if you are basing your strategies for dealing with the world on myth, as most humans have for most of our history, then myths need to change to allow the people who make use of them to adapt to a changed world. Having a mechanism to create these changes allows the traditions, culture, and (importantly) people to better navigate a world that is not static. However, an acknowledgement of that change, which would have made perfect sense to the people who carried and lived with these myths, seems bizarre and counter-intuitive to those of us who live in literate societies where the myths have been written down and usually have been declared "eternal and inerrant."
But if religion has an evolutionary advantage, it likely lies in allowing us to find a way of navigating the world - physically, socially, and psychologically. The shifting myths, with no "true" version, encountered by early ethnographers are probably much closer to religion as it was when it served as a practical tool for our ancestors.
Indeed, for all of the talk of the "big three" modern monotheisms (Christianity, Judaism, Islam), these religions are also open to change. The Bible (inclusive of the Torah or Old Testament and the New Testament) and the Koran are both large, sprawling, and internally inconsistent. Churches and mosques choose to ignore certain passages, emphasize others, and downplay yet others. Religious authorities and interested writers of all of these religions produce volumes explaining why some religious passages allegedly mean something very different from what they actually say. This is really an attempt to make a more rigid written mythology flex in the way that earlier mythologies did, even if people deny that this is what they are doing.
When we look at the Pope of the Catholic Church or the Mormon Prophet, we see roles that are large-scale adaptations of something like the Law Man of the Aborigines. They even have their own elders' councils. And other religions have similar individuals or groups who modify the doctrine to be in better keeping with the changing world. There are, of course, differences (many of them dictated by the sheer scale of modern world-spanning churches), but the same basic seed is present.
But even if the modern religions are able to adapt their written myths to the modern world, they suffer the basic problem that everyone can read the text and determine whether or not what the leadership says is in keeping with it. Without an open acknowledgement that the stories must change, schism is probably inevitable.
Friday, September 10, 2010
Mosques, Community Centers, and Book Burnings
So, tomorrow is September 11, 2010. The pastor of a small church in Florida is threatening to burn Korans, and everyone is fighting over whether the Islamic equivalent to a YMCA (and yes, that's what it is, if you're going to start screaming that it's an "Extremist Mosque" then go look it up...somewhere other than Sean Hannity's website) should be built in New York.
There is one similarity between them: in both cases, people are doing something perfectly legal, and in both cases we have to worry about how ideologues, the frightened, and the angry are going to act. In New York, we have to worry about whether people who don't grasp that Al Qaeda is a fringe group within Islam will commit acts of violence and/or vandalism, and the Florida pastor is running the risk of giving yet more cause to uninformed people in the Middle East who - through the manipulations of their own media and political figures - may think that the Koran burning is a normal part of American and/or Christian life and not the act of a half-wit fringe group.
The reality is that both of these events would have passed by completely unnoticed were it not for media and political attention. In getting worked up over either of them, people are allowing themselves to be manipulated, or else giving in to a gut-level emotional reaction without stopping to consider what being a citizen of this country actually means. To be fair, most folks don't realize that it's happening - not because of any lack of intelligence on their part, but because the manipulators (politicians, ideologues, and media fixtures) are very good at their jobs.
Let's start with the "Ground Zero Mosque." It's not at Ground Zero - now some folks will point out that it is in a location where the building was damaged by debris from the attacks, which is true. However, Ground Zero is a specifically designated place, and this falls outside of it. You could call it "the Building Damaged by Aircraft Debris Mosque", but that doesn't have the same ring, and more importantly for the people who are making hay with it, it doesn't have the same "shut off thinking by getting people angry" emotional charge. It's also not a Mosque, but that rests on a technical distinction between what is and what is not a mosque that many Christians don't make regarding what is and what is not a church, so while it's technically not a mosque, I'm not going to argue too much about that, as it's close enough.
There are a lot of other claims going around - "They're going to let a mosque be built, but not a church be rebuilt", "the Imam refused offers to buy him out and give him additional financial incentives if he just relocates somewhere else", "the Imam receives money from the Kingdom Foundation, which is run by a guy who funds extremist groups around the world!" And so on. For the first two points - as someone who has been involved in building planning for several years now, neither of these things seems sinister. In New York, different authorities have control over the permits in different areas (the same is true in many cities in my own state). The area where the Islamic Center/Mosque is going to be built is not under the Port Authority, which actually does have permitting responsibilities for the area where an Eastern Orthodox church once stood, and with different agencies come different permitting processes, regulations to follow, etc., which means that something that would be permitted by one agency may not be permitted by another. This is perfectly normal, if frustrating if you happen to be the person trying to get permits. Also, once the permitting process has been started, it's normal for the project proponent, regardless of the project, to not accept offers to be bought out - I have seen it many, many times, and proponents tend to dig their heels in more when projects get political. So, these things that people seem to think are suspicious are actually a normal part of getting things built - sad, but true.
The guy behind the Kingdom Foundation, which funds some of this Imam's activities, and who is often referred to as "the Extremist" (not Joe Satriani), is Saudi Prince Al-Waleed bin Talal. He is also one of the major shareholders of News Corp, the organization that owns FOX News. So, if you are going to claim that this Imam must be an evil anti-American extremist because he accepts funding from Al-Waleed bin Talal, then it follows that Bill O'Reilly is also an evil anti-American extremist*.
There are numerous other things that people have brought up to try to show the alleged evils of the Imam behind the construction, but following them up routinely brings me to the same sorts of things once I find the source: either the statements are misconstruals of perfectly normal practices that all construction projects have the potential to entail, or they are distortions of statements actually made (sometimes through quote-mining, sometimes through very loose paraphrasing), or they are statements made by people with a political agenda that don't actually accuse the Imam of anything but are clearly intended to get people thinking that he's done something (in other words, they're lies carefully worked to avoid a libel suit). In other words, every time I have actually followed anything to the source, it turns out to either be mundane, or insulting but normal within the context of American life. The same is true for those who wish to paint him as the most wonderful guy in the world. In other words, whether you regard this Imam as a demon or an angel, you are looking at a media image and not what the guy has actually said or done.
Now, does this guy hold views that I, personally, would find abhorrent regarding issues of individual rights, women's rights, social roles, etc.? Probably. He is an Imam, after all. You know who else holds views that I find abhorrent? The Pope. The Prophet of the Mormon Church. Franklin Graham (Billy Graham's son). Pat Robertson. And the list could go on for many pages. Would I object to them building a church near Ground Zero, or in any other location? Not if they obtained permits and went about it in a legal manner.
Some people are going to be upset about this construction because it's an Islamic center, and Al Qaeda is an Islamic group. But the simple fact of the matter is that there is a very real distinction between Al Qaeda - an organization - and Islam - a religion**. Al Qaeda is an Islamic organization, to be certain. I'm not going to be one of those people who claims that they're not true Muslims, because as far as an honest assessment can be made, they are. But it should be kept in mind that in the 1930s, and possibly still today (I find a lot of contradictory information), the Ku Klux Klan was a Christian organization, but it wasn't Christianity. In the 1930s, the Ku Klux Klan used political rhetoric and Biblical citations to justify its agenda, its leadership was definitely Christian, and it was accepted, applauded, or at best ignored by many Protestant Christian churches in the United States in pretty much precisely the same way that Al Qaeda is by many Middle Eastern mosques today. And yet we all know that the KKK is not and never was the face of mainstream Christianity; it was a fringe group (if a widely accepted or even supported one).
If you bother to do some reading outside of the headlines and the more sensationalistic outlets, it quickly becomes apparent that much the same can be said of Al Qaeda. Indeed, Al Qaeda, for all of its "anti-Imperialist" rhetoric, has shown itself to be just as interested (or perhaps even more so) in killing other Muslims as in striking Europe and the United States - hardly the actions of the mainstream of a religion. Hell, the justification for suicide bombing had to be developed by Ayman al-Zawahiri using references to Medieval Christian martyrologies and then quote-mining the Koran***.
So, really, I see no legitimate reason to oppose the construction of this Islamic center if I'm not also going to oppose the construction of any other religious structure. I would not oppose the construction of a Catholic church next to an elementary school, or a Southern Baptist church in Selma, Alabama, or a synagogue near a Palestinian-American neighborhood either.
I have, however, heard one legitimate reason for not building it, and this is where it ties back to our friend in Florida. The reason is this: given that there has been such a furor whipped up over it, constructing it will likely result in further conflict, probably result in vandalism, and may result in violence. But let's be clear - this is a concern about the actions of people other than those who want to build the place. This is a concern about the actions of people who are able and willing to commit criminal acts because they fail to grasp that living in a society where both speech and religion are free means that you don't have a right to not be offended. This is a concern about the actions of people who fail to make a distinction between a religion and individuals within that religion.
By the same token, I don't really care if some imbecile in Florida wants to burn books. If the media wasn't paying him so much attention, just as with the Islamic Center/Mosque, then this wouldn't be any more noteworthy than any of the huge number of other things that imbeciles do every single day. I don't consider the Koran holy - that would require me believing that there was some mystical being who decreed things holy - and so when I first heard of this guy, I gave it the same eye roll that I gave when I heard that Mahmoud Ahmadinejad claimed that there were no homosexuals in Iran. It was an idiot spouting off, something that everyone has the right to do when within the borders of the United States.
Since then, though, this has gathered so much media attention that it has become a legitimate concern that groups such as Al Qaeda can use it as a recruiting tool ("evidence that 'the West' is against Islam, and not terrorism!") and that others with an axe to grind or media to sell in the Middle East may use it as a way to raise emotions and fan the ol' flames o' hatred, creating volatility where none exists.
But, again, let's be clear. The problem isn't that somebody has decided to legally burn something that he legally obtained. His actions and words show him to be a bigot and an idiot, but in the United States, we have the right to be bigots and idiots (even if I wish my fellow countrymen would exercise that right less often). The problem is that other people will use this to their advantage, and yet another group of people will fail to grasp that this guy is not representative of the U.S.A., the West in general, or even Christianity, and will engage in violent acts in response. What's more, this is a problem that arises from the fact that there are people in the world who fail to see the difference between the destruction of a symbol, and a violent attack against a religion's believers (a problem that is in no way unique to Islam).
"Gee, Mr. Armstrong, is there a point to this rant?"
Yeah. Over the past month, I have watched people, many of whom I respect and who are generally very smart and articulate, reduce themselves to ranting madmen over the alleged evil of either the construction of the Islamic Center/Mosque or the idiot with the book burning. But in both cases, this has been little more than giving in to emotion without reflection or, worse, allowing one's self to be manipulated by those with an agenda.
We have the ability to be better than this. If you oppose the Islamic Center, then think about why you oppose it. If it's simply because the 9/11 hijackers were Muslim, then you are being inconsistent if you don't also oppose Catholic churches having daycare centers - let's face it, most priests don't abuse children, and most Muslims don't care for ramming airplanes into buildings. If you have some other reason for opposing it, then you may have a good point, or you may not, but at least you're not just giving in to a knee-jerk reaction.
If you are going to oppose the guy burning the Koran, then stop and consider that he now has power only because you have paid attention to him.
I can't fault anyone for having an immediate emotional response. But I can fault many people for not considering their position carefully and acting in accordance with their own stated principles.
*Wait a minute...this is starting to make some sense...
**This also holds for those who argue that being in favor of or opposed to the actions of the Israeli government means being in favor of or opposed to Judaism. There is a great distinction between Israel - a nation state - and Judaism - a religion and ethnic group.
***It should, of course, be noted that suicide attacks have occurred throughout history. They are usually justified via legalistic interpretations of either religion or tradition after the fact.
Proper Language and Ebonics
As I drove to work this morning, I had the radio tuned to a discussion of the DEA's decision to hire translators for Ebonics (AKA Black English). A linguist was on the show explaining that Ebonics, while not a language in and of itself, is an internally consistent dialect of English*, largely derived from the dialects of English spoken by indentured servants who worked alongside slaves in early historic America. This led to one person calling in to the show and attempting to tell the linguist that Ebonics was not a dialect, but was simply "improper English."
But, well, for there to be a "proper language" you need two things that are rather new in human history: 1) Someone with the authority to dictate what is and is not proper (in the case of language, this is usually in the hands of academics or government ministers) which usually comes with the formation of a state (although it could come from other forms of organization), and 2) the ability to codify what is or is not proper, which until the 20th century could only be done with writing, and is still most practically done with writing.
For most of human history, we lived in communities where the exact dialect of a language would shift gradually from band to band or village to village, and two groups who lived quite distant from each other would speak dialects that were similar to each other, but different enough that they could arguably be called different languages (think of Spanish, Italian, and Portuguese in the modern world). And languages change over time as well as space - Elizabethan English (what you would find in one of Shakespeare's plays) is a different dialect than modern English in the very same part of England in which Shakespeare lived and wrote.
Language is a living thing - constantly changing, constantly being altered. We all recognize this, and yet to most of us the idea of a "proper" form for a language seems intuitive. We seem to be able to accept that different nation-states will have their own proper form of a language (Spanish from Spain being different from Spanish from Mexico, and there are differences between British English and American English) - in fact, the caller who insisted that Ebonics was not a dialect cited his experience as an infantryman traveling with the Marines, experiencing different dialects in different countries - but we seem to be unwilling to accept that there may be true dialects within each nation-state. We accept regional accents and slang, but we regard full dialects as simply "improper" language, when, in fact, they are simply evidence of the normal evolution of language.
Which is not to say that there aren't many advantages to artificially slowing language change, creating an agreed-upon proper language through both writing and education. I would say that the benefits outweigh any negatives that you could imagine, and it's an absolute necessity in modern nations. But, nonetheless, our adherence to an artificially glaciated language is anomalous. It's a tremendous innovation and a wonderful tool, but it's not the norm; it's a recent invention, and the development of dialects such as Ebonics is the norm.
*I know that somebody's going to try to get me to argue about whether or not Ebonics belongs in the classroom. So, I'll just say this - that's beside the point of what I'm trying to discuss here, and I'm disinterested in discussing that matter, so piss off.
Wednesday, September 8, 2010
Home Office
So, my office is closing down. Don't worry, I still have a job. But, my office is closing down. When I am in town, I'll be working from home, and when I am out of town, I'll be spending my time down in Los Angeles County working on a large transmission line project.
So, I still have a paycheck, I still have health insurance, and I still have a job. I just won't have an office after the end of this month. There are some upsides to this. On the one hand, working from home will be pleasant in that I will be able to have my workspace as I wish, without concerns about disturbing coworkers. It'll also be nice to not have to get up, get dressed, and drive to work - starting the day by walking down the hall in my pajamas is a nice idea. On the downside, I do like the structure of having a place to go, and a set number of hours a day when I am supposed to be there. I genuinely like associating with my coworkers, and so to not have daily interaction with them will be a loss. And I tend to be given to overworking anyway, so having the necessity of going home removed may increase my tendency to overwork*. So, it's got its upsides and its downsides.
The biggest thing about this for me, though, is the fact that over the last three and a half years, I have spent more time in my office than in my home (see the comment about overworking above). To see it going away is, frankly, weird. It would be one thing if it was going away because it was being replaced with a different office, or if it was staying here but I was going on to a different job. But the office is simply going away, and those of us associated with it are becoming independent in our own personal home offices. It's sort of a lonely feeling.
So, anyway, I'm still working, and all is well on that front. It's just a weird feeling to see what has become part of my home going away.
*For example, several years back, a friend of mine needed to get hold of me at 10 PM on a Friday night. Before they called my cell phone or my home phone number, they tried my office, and I answered. When I asked why they called that number first - pointing out that it was, after all, 10:00 on a Friday night - they said that, based on past experience, they felt fairly certain that I was more likely to be at my office than anywhere else at that time.
Monday, September 6, 2010
That Ol' Rugged Cross
This is just a quick one, but it's something that I have been thinking about, and that is of interest in archaeology, as it is about how symbols and their material expression change meaning over time.
In Daniel Radosh's book Rapture Ready, he discusses the world of Christian pop culture. In it, he describes a man who has developed a symbol that he calls The Smiling Cross. It's weird, the adaptation of a sign of torturous punishment into an anthropomorphic smiling figure.
But how did it get here? That's the interesting thing.
The cross began as a tool of the Roman state. It was used by the Romans to execute criminals*, a death that was long and agonizing. Moreover, the cross with the corpse could be left up for a while, if the authorities deemed it good that the people see the body on display (either as a deterrent, or to show the people that a hated criminal had been caught).
As the Roman Empire declined and eventually fell, the cross would come to be taken as a sign of the Christian religion. The cross took on meanings associated both with Roman law enforcement and with Christianity.
As crucifixion fell out of favor as a form of capital punishment, the association with execution began to be shed. Yes, the Bible and most history books speak of it, but crosses are no longer used for execution, and the only place where Westerners would see crosses was in churches. As such, the cross became a purely religious symbol for all but a few historians. At this time, though, the cross was still tied to the notion of Jesus' torture and death, important events in Christianity, as they also led to the redemption of humanity.
Over time, though, the image of Jesus on the Cross became less prominent than the cross itself. There are a number of really fascinating reasons why this happened, including an abhorrence on the part of many Protestant reformers of "graven images" and the rise of theologies that held that the notion of redemption was more important than "the passions" (Jesus' horrible death). At this point in time, the image of "Christ the Sufferer" is largely Catholic, while most Protestant sects focus on a more serene Christ in their iconography. And so, images of the cross in Protestant circles are usually generic symbols bereft of the image of Jesus:
And images of Jesus are usually free of any sign of passion, showing him as contemplative or devoted, but not suffering:
The end result is that the cross came to be seen as an abstract symbol of Jesus, rather than the sign of his execution. Yes, most Christians are aware of the fact that it is based on the method of execution, but that doesn't change the fact that the immediate mental connection is with Jesus and not the death penalty. And so, the Cross becomes a stand-in for Jesus. And as the cross has become a symbol for Jesus himself, rather than for Jesus' death, it has also become associated with the properties attributed to Jesus, and as "Christ the Lamb" has become ascendant in Christian theology, this means that the Cross is often seen as a friendly symbol rather than a symbol of suffering and redemption. And so, the Smiling Cross, as bizarre and even sacrilegious as it may seem, is really just a continuation of the trajectory that a symbol representing Roman capital punishment has been on for a long time.
I just feel bad for Kevin Smith. He thought that he was being really clever with Buddy Christ...
Only to be outdone by reality.
*One of my history professors in college always liked to point out that crucifixion was a Roman form of execution; the Jews in the Roman Empire used the old reliable method of stoning people to death. So, when people talk about the Jews "killing Jesus," they are factually incorrect - it was the Romans who did so. Assuming that the account in the Bible is correct, which is a dubious assumption, the rabbis did political maneuvering, but the Romans did the rest.
Friday, September 3, 2010
Hyena Vomit and Human Evolution
The 1950s through 1980s were a time of very active change in archaeology. New technologies, coupled with changes in the way that humans are viewed by archaeologists (less mythic "man the conqueror," more practical "humans as a species adapted to life on Earth"), spurred a number of advances in the way that archaeologists study our evolutionary past.
One of the first steps along the path to re-evaluating our evolutionary past was determining whether or not many of the archaeological sites that had been used to argue for a particular model of the human past were, in fact, archaeological sites at all. High on this list were caches of animal bones that litter the African landscape. Often found in caves or rock shelters, these caches of bone indicate selection - only certain pieces of bone are found, not entire skeletons, and the types of bones are fairly consistent across sites - and show signs of modification - marks on the bones and bone breakage. The selection of bone and the signs of modification had long been taken as evidence of hominid (the family of animals to which humans belong, and of which humans are the only surviving member) activity, and specifically of hunting and butchering.
This changed as archaeologists began to look at the behavior of other animals that are present in the parts of Africa where our ancestors wandered. Researchers discovered that the feeding habits of certain large cats likely created some of the bone caches that had been found, with the marks on the bone coming from teeth or post-depositional environmental factors.
One particularly important book, The Hunters or the Hunted?, by a fellow with the delightful name of C.K. Brain, discussed bone caches found in African caves with an eye towards evaluating what materials came from hominid activities, and what was due to animals, both large and small, that also selectively leave bones behind.
However, the study that really grabbed my attention as an undergraduate involved the examination of hyena vomit. Yes, the study was done in the 1970s, and everyone was on drugs then, but there actually was a valid reason to study hyena vomit.
You see, hyenas have been around for approximately 26 million years, meaning that they were wandering the African savannah along with our early ancestors. Hyenas are also scavengers, meaning that they get access to carcasses after other animals have already had a chance to take what they want from them. Hyenas are well-adapted to scavenging because one of the many things that they can do is swallow bone and vomit it back up on demand. "How does this help them?" you ask. Simple - they swallow the bone, and their stomachs begin to digest the grease and meat left on and in the bone, allowing them to gain nourishment from remains for which other animals would have little use. Once their systems have extracted what they can from the bone, they vomit it back up, leaving behind select pieces of bone in piles that might look like a hominid had gone about picking up choice cuts of meat and leaving the bone behind to be discovered by its descendants a couple of million years later.
With the results of this study, people studying hominids were better able to eliminate hyena puke sites from their lists o' potential archaeological sites. A useful thing, to be sure, and a tool that allowed the view of humanity's evolutionary past to be refined.
Perhaps the most interesting thing about the study of hominids during this period is that it reflected how a change in how we think about ourselves can allow data to be more fully examined. Prior to this period, it had been common to think about "man the hunter" and "man the conqueror" - whether this was glorified or vilified, we were seen as beings that could and did take the world by force and make it as we wished, and this was reflected in many of the models of our evolutionary past. With post-WWII social change, not to mention technological change, we began to look at ourselves and our ancestors in a different light, and began to consider that the data might be saying something different from what we had initially assumed. The image that began to emerge as the data was evaluated with better techniques was that our hominid ancestors did eventually become hunters, but that they likely spent quite a bit of time scavenging, and we were shaped by our environment at least as much as we would eventually shape it (probably more so).
Thursday, September 2, 2010
Thinking About Privatization and Government Control
I have a lot of fiscal conservatives, both of the Republican and of the Libertarian sort, in my family. What this means is that, ever since I became aware of such things as a teenager, I have routinely heard that private business is always more efficient than government, and that most government duties should be privatized.
This has always seemed very odd to me, as the evidence that most of these folks give for their beliefs is their own interactions with government agencies - time at the DMV, dealing with building permits through the county or city government, and so on. I fully agree that this sort of interaction absolutely does reveal a slothful and often bloated bureaucracy that typically seems more concerned with following draconian procedures than with getting anything done.
The problem is that private industry is often not much better. Have you ever dealt with a telephone company? A bank? An insurance company? Hell, even most large retailers? They are generally just as bad. And that's not even getting into the weird jungles of government responsibilities that get privatized - private jails, defense contractors, etc.
I also live in Santa Cruz, which means that I have regular interaction with people who want things that are currently private industries made into government functions (look at the recent health care shouting match - no, it wasn't a debate, because there was little actual discussion), and they will often cite the very same sorts of inefficiencies, draconian rules, and corruption that those who want everything privatized attribute to government.
A fair-minded observer has to ask if the problem isn't so much that one is government and one is private, but rather that you are dealing with large bureaucracies in either case. Proponents of privatization will point out that private companies are capable of making sudden changes that can improve efficiency - this is sometimes true, but often other factors (both within and outside the companies) work against it. Proponents of government agencies will often point to the corruption within large companies (due to the large sums of money taken in) as a problem that makes them untrustworthy - also fair, but also a problem with government agencies.
What is fascinating to me is that, whether one is pro-privatization or anti-privatization, all of the arguments are big on rhetoric and assumption, and pretty much empty of hard data. In fact, many researchers have looked into the benefits of having various functions be handled by private vs. government agencies, and when one brushes away the "studies" performed by political organizations on both sides, what is left is simply data indicating that some functions are most efficiently performed by private industry, while others are most efficiently performed by governments, and which function falls into which category depends both on the service or purpose in question and on the time and place where it is being executed.
So, I find myself really quite amused (and when they influence policy, quite disturbed) by people who are absolutely convinced that, say, government health care will unquestionably be better, or that privatizing police forces will lead to safer cities. Both groups seem equally convinced of their position based on just-so stories with no real data (aside from maybe a bit generated by someone with a political rather than reality-based agenda) to back them up. Both fall to the confirmation bias and fail to take into account disconfirming data.
Generally, when someone routinely insists on the privatization or nationalization of various different functions or industries, they probably haven't bothered to actually do their homework.
This has always seemed very odd to me, as the evidence that most of these folks give for their beliefs is their own interactions with government agencies - time at the DMV, dealing with building permits through the county or city government, and so on. I fully agree that this sort of interactions absolutely does reveal a slothful and often bloated bureaucracy that typically seems more concerned with following draconian procedures than with getting anything done.
The problem is that private industry is often not much better. Have you ever dealt with a telephone company? A bank? An insurance company? Hell, even most large retailers? They are generally just as bad. And that's not even getting into the weird jungles of government responsibilities that get privatized - private jails, defense contractors, etc.
I also live in Santa Cruz, which means that I have regular interaction with people who want things that are currently private industries made into government functions (look at the recent health care shouting match - no, it wasn't a debate, because there was little actual discussion), and they will often cite the very same sorts of inefficiencies, draconian rules, and corruption that those who want everything privatized cite.
A fair-minded observer has to ask if the problem isn't so much that one is government and one is private, but rather that you are dealing with large bureaucracies. Proponents of privatization will point out that private companies are capable of making sudden changes that can improve efficiency - this is sometimes true, but often other factors (both within and outside the companies) work against it. Proponents of government agencies will often point to the corruption within large companies (due to the large sums of money they take in) as a problem that makes them untrustworthy - also fair, but also a problem with government agencies.
Wednesday, September 1, 2010
Ghost Signs
A post from several months back at the Diary of a Bluestocking blog discussed "ghost signs", the long-abandoned and usually fading signs that advertise businesses no longer in existence, goods no longer sold, and/or advertising campaigns long since dead and gone*. The writer of that blog, who I'm pretty sure could single-handedly beat me in a knowledge-off (and who is a better writer than I am) routinely demonstrates a concern for the lack of appreciation that most of us have regarding the ever-present artifacts of our relatively recent past. It is a matter that I have had to address on a professional basis, as many of these types of objects are now old enough that we have to consider them for the National Register of Historic Places and California Register of Historic Resources.
Her discussion of ghost signs got me wondering about the role that these play in my home state of California. Generally, my fellow Californians seem to regard these old signs with something of a mixture of acceptance and affection. They serve as an active part of our social landscape - landmarks on routes, markers of our social territories, and reassuring objects that remain the same every time that we pass them. For many of the people who own businesses to which these signs are attached, they serve as a way of marking that the business is part of the community (even if the business isn't as old as the sign indicates), and for others they serve primarily for their kitsch value, creating a new meaning rather than tying in to the community.
It's interesting to me that, while we Californians generally seem ready to replace almost everything - from our consumer electronics to our homes - we rarely seem to be in any hurry to paint over or get rid of these signs. It seems to fit in with our usual attitude - once something is old enough, we consider it either venerable or kitsch, and we allow it to stay. If it's not old enough, we want to obliterate it before it gains sufficient age to gain new meaning.
And, really, as odd as it is for many people to consider them in such a context, they are part of our archaeological record. There is information in their placement, their preservation (or the occasional attempt to cover them up), and in the ways in which the use of the buildings on which they are emblazoned changes - information that is useful in evaluating our culture. In truth, they can be analyzed in the same way that we analyze the cave paintings of prehistoric peoples.
Although rather different from most ghost signs, there's this one, a reminder of times forgotten, that I took a picture of while in Seattle, Washington:
*And not the feces of ghosts. Sorry to disappoint you.