Archaeologists are notorious for, to paraphrase a T-shirt, stalking other disciplines down dark alleys, whacking them over the head, and then rifling through their pockets for loose theory.
There is, it should be said, some benefit to archaeology from this behavior. Ideas from fields as diverse as physics and literary criticism have found good employment in archaeology.
However, there are also many times when the result is a bizarre concoction of intellectual puree that makes little sense, but is championed by certain practitioners as if it were the height of human intellectual achievement.
Back in 1971, the archaeologist Kent Flannery wrote a perceptive and hilarious article titled Archaeology With a Capital "S", in which he was extremely critical of the tendency of many of the archaeological theorists active at that time to uncritically adopt concepts from physics, mathematics, and biology without thoroughly considering their applicability to the archaeological record. Unfortunately, I cannot find an online copy to which I could direct you - it is really worth a read.
Flannery's complaint was that the archaeology of the 60s and 70s was filled with sciencey-sounding buzzwords and claims, though he was writing ten years too early to see how many of the post-modern views of humanity would filter into archaeology and displace the sciencey-sounding buzzwords with philosophy-sounding ones. In both cases, some good came of it. The theoretical changes of the 50s through the 70s provided us with a fairly robust model for developing and testing hypotheses, and for checking our ideas against the real world. The post-modern ideas that began filtering in during the 70s and really came to the fore in the 80s provided ways of looking at behavior that isn't easily quantifiable, and reminded us of our own biases and of the subjective nature of our conclusions when dealing with something as convoluted and open to interpretation as human behavior. But there was also a whole lot of pseudo-intellectual posturing, and more than a few examples of archaeologists misapplying concepts because they simply did not comprehend them.
For example: one approach to studying changes in material culture is to look for similarities between the ways that artifact types change over time and the ways in which biological entities change over time. While there are some definite issues to be dealt with (people design tools, and can do so relatively quickly, while evolution works through a process of random mutation and decidedly non-random selection over many generations), there is some benefit to employing the concept to try to understand how the physical or social environment might result in the selection of certain tool forms over others by a tool's makers and users.
However, this becomes problematic when the archaeologist doesn't understand evolution, or the difference between biological evolution and the choices of toolmakers. This was thrown into stark relief for me one day in a theory seminar where we were discussing this approach. I commented that one way the concepts of biological evolution could be applied would be to see which changes survived and became more common among tool types, and which appear on only one or a small number of known specimens. The common tools would indicate either a tool well adapted to a variety of uses or a tool adapted to a narrow range of common uses (such as an arrowhead - it serves only one purpose, but that purpose is quite common in the life of a hunter/gatherer, so there's a butt-load of the things lying around archaeological sites); the less common tools would indicate either tools that ultimately didn't work, or didn't work as well as others, or else specialized tools for niche tasks that were relatively uncommon.
As soon as I said this, one of the other students declared, "Well, you're forgetting what any biologist could tell you. Evolution happens at the level of the individual!"
No. Any biologist could tell you (and many have told me) that mutation occurs at the level of the individual. Mutations only feed evolution if they spread through the population, which makes evolution a generational, population-level phenomenon. This matters for the archaeological application because it provides a loose framework for making sense of the relative frequencies both of different tool types and of different traits among similar tools. If you assume that evolution equals individual change, you get it backwards and can easily end up attributing more importance to each individual variation than is warranted.
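To make the population-level framing concrete, here is a toy sketch (entirely my own illustration, not anything from the seminar; the point-style names and "fitness" numbers are invented) of the idea that what evolves is a variant's frequency in the assemblage, not any single specimen:

```python
# Toy replicator-style model: each tool variant's share of the "population"
# grows or shrinks in proportion to how reliably it gets copied.
# All names and numbers below are hypothetical, chosen only for illustration.

def next_generation(freqs, fitness):
    """One generation of population-level selection on variant frequencies."""
    weighted = {v: freqs[v] * fitness[v] for v in freqs}
    total = sum(weighted.values())
    return {v: w / total for v, w in weighted.items()}

# Three imaginary projectile-point styles; "notched" is copied slightly
# more often than the others (fitness > 1 relative to "stemmed").
freqs = {"notched": 0.10, "stemmed": 0.45, "lanceolate": 0.45}
fitness = {"notched": 1.10, "stemmed": 1.00, "lanceolate": 0.95}

for _ in range(50):
    freqs = next_generation(freqs, fitness)

# After many "generations" the favored variant dominates the assemblage,
# even though no single toolmaker's choice decided the outcome.
print({v: round(f, 3) for v, f in freqs.items()})
```

The point of the sketch is that a rare variant on a single specimen tells you almost nothing by itself; it is the change in relative frequency across many generations of copying that carries the evolutionary signal.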
You see this sort of thing with all manner of ideas taken from other fields: resistance (from literature and history), identity theory (from history and sociology), carrying capacity models (from biology), and so on. Each of these ideas is useful, to an extent, but each tends to be at least somewhat misunderstood by many of its adherents in archaeology, and as a result tends to get abused and misused.
This is, it should be said, a bit of a shame. All of these are good ideas and can be applied to archaeology, but their misuse by some of the more fervent supporters leads other archaeologists to misunderstand them, and so good ideas get scoffed at thanks to the zeal of the enthusiastic and misguided.