Saturday, 19 April 2014

The novelty effect: a factor in mate choice


 
Series of facial images from clean-shaven to full beard (Janif et al., 2014)


For the past thirty years, the tendency has been to study sexual attractiveness from the observer's standpoint, i.e., we choose mates on the basis of what's good for us. We therefore unconsciously look for cues that tell us how healthy or fertile a potential mate may be. But what about the standpoint of the person being observed? If you want to be noticed on the mate market, it's in your interest to manipulate any mental algorithm that will make you noticeable, including algorithms that have nothing to do with mating and exist only to keep track of unusual things in the observer's surroundings. If you're more brightly colored or more novel in appearance, you will stand out and thus increase your chances of finding a mate.

We see this with hair color. In one study, men were shown pictures of attractive women and asked to choose the one they most wanted to marry. One series had equal numbers of brunettes and blondes, a second series 1 brunette for every 5 blondes, and a third 1 brunette for every 11 blondes. It turned out that the scarcer the brunettes were in a series, the likelier any one brunette would be chosen (Thelen, 1983). Another study likewise found that Maxim cover girls were disproportionately light blonde or dark brown, and much less often the more usual dark blonde or light brown (Anon, 2008). This novelty effect may be seen in sales of home interior colors over the past half-century: preference for one color rises until satiated, then falls and yields to preference for another (Stansfield & Whitfield, 2005).
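To make the mechanism concrete, here is a minimal sketch (my own illustration, not code from Thelen or Janif et al.) of a negative frequency-dependent preference. Each hair-color variant's reproductive success is weighted by a novelty bonus that shrinks as the variant becomes common, so a rare variant gains ground and no variant can sweep to fixation:

import random

def fitness(freq, strength=0.5):
    # Negative frequency dependence: the novelty bonus disappears at a frequency of 0.5
    # and turns into a penalty above it. strength=0 makes mate choice neutral.
    return 1.0 + strength * (0.5 - freq)

def simulate(generations=60, n=1000, strength=0.5, seed=1):
    random.seed(seed)
    # Start with roughly 1 brunette for every 11 blondes, as in Thelen's third series.
    pop = ['brunette'] * (n // 12) + ['blonde'] * (n - n // 12)
    history = []
    for _ in range(generations):
        freqs = {v: pop.count(v) / n for v in set(pop)}
        # An individual's chance of being chosen as a parent is scaled by the
        # frequency-dependent fitness of his or her own variant.
        weights = [fitness(freqs[v], strength) for v in pop]
        pop = random.choices(pop, weights=weights, k=n)
        history.append(freqs)
    return history

for gen, freqs in enumerate(simulate()):
    if gen % 15 == 0:
        print(gen, {v: round(f, 2) for v, f in freqs.items()})

Run over a few dozen generations, the rare variant climbs toward an equilibrium near 50%, which is the balancing effect that Frost (2006) invokes to explain the persistence of multiple hair colors.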

The novelty effect seems to apply not only to colors but also to other visible features. In a recent study, participants were shown a series of faces with different degrees of beardedness. A clean-shaven face was preferred to the degree that it was rare, being most appreciated when the other faces had beards. Heavy stubble and full beards were likewise preferred to the degree that they were rare (Janif et al., 2014).

The authors conclude:
 
Concordant effects of frequency-dependent preferences among men and women might reflect a domain-general effect of novelty. Frost [20] suggested the variation in female blond, brown and red hair between European populations spread, geographically, from where they first arose, via negative frequency-dependent preferences for novelty. There is some evidence that men's preferences increase for brown hair when it is rare [21] and for unfamiliar (i.e. novel) female faces [22]. (Janif et al., 2014)

 
The authors go on to suggest that the quest for novelty may drive the ups and downs of fashion trends. A new fashion will rise sharply in popularity when it is still unfamiliar to most people. As the novelty wears off, its popularity will peak and then decline, especially if it faces competition from a more recent fashion.

There are certainly limits to the novelty effect—something can be novel but also disgusting—but it seems to be more general than previously thought.
 

References

Anon. (2008). Maxim's audience prefers brunettes; distribution is bimodal. Gene Expression, July 6, 2008.  http://www.gnxp.com/blog/2008/07/maxims-audience-prefers-brunettes.php  

Frost P. (2006). European hair and eye color: a case of frequency-dependent sexual selection? Evolution & Human Behavior, 27, 85-103.

Frost, P. (2008). Sexual selection and human geographic variation, Special Issue: Proceedings of the 2nd Annual Meeting of the NorthEastern Evolutionary Psychology Society, Journal of Social, Evolutionary, and Cultural Psychology, 2(4),169-191. http://137.140.1.71/jsec/articles/volume2/issue4/NEEPSfrost.pdf  

Janif, Z.J., R.C. Brooks, and B.J. Dixson. (2014). Negative frequency-dependent preferences and variation in male facial hair, Biology Letters, 10, early view
http://rsbl.royalsocietypublishing.org/content/10/4/20130958

Little A.C., L.M. DeBruine, B.C. Jones. (2013). Sex differences in attraction to familiar and unfamiliar opposite-sex faces: men prefer novelty and women prefer familiarity, Archives of Sexual Behavior, early view

Stansfield, J., and Whitfield, T.W.A. (2005) Can future colour trends be predicted on the basis of past colour trends? An empirical investigation, Color Research & Application, 30(3), 235-242. 

Thelen, T.H. (1983). Minority type human mate preference, Social Biology, 30, 162-180.

Saturday, 12 April 2014

Compliance with moral norms: a partly heritable trait?


 
Election poster from the 1930s for Sweden’s Social Democratic Party (source). Is the welfare state more workable if the population is more predisposed to obey moral norms?
 

Do we differ genetically in our ability, or willingness, to comply with moral norms? Please note: I'm talking about compliance. The norms themselves can vary greatly from one historical period to another and from one society to another.

Apparently some people are more norm-compliant than others. This is the conclusion of a recent twin study from Sweden (Loewen et al., 2013). A total of 2,273 individuals from twin pairs were queried about the acceptability of four dishonest behaviors: claiming sick benefits while healthy (1.4% thought it totally or fairly acceptable), avoiding paying for public transit (2.8%), avoiding paying taxes (9.7%), and accepting bribes on the job (6.4%).

How heritable were the responses to the above questions? The heritabilities were as follows: 

Claiming sick benefits while healthy - 42.5%
Avoiding paying for public transit - 42.3%
Avoiding paying taxes - 26.3%
Accepting bribes on the job - 39.7%
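For readers unfamiliar with how twin studies produce such numbers, here is a minimal sketch of the classical Falconer decomposition. It is simpler than the structural-equation (ACE) models that papers like Loewen et al. typically fit, and the twin correlations below are made up for illustration, not taken from their study:

def falconer_estimates(r_mz, r_dz):
    # Identical (MZ) twins share all their genes; fraternal (DZ) twins share half on average.
    # The excess MZ resemblance is therefore attributed to genes.
    h2 = 2.0 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = r_mz - h2             # shared (family) environment
    e2 = 1.0 - r_mz            # unique environment plus measurement error
    return h2, c2, e2

# Hypothetical correlations chosen to give a heritability in the reported range.
h2, c2, e2 = falconer_estimates(r_mz=0.45, r_dz=0.24)
print(round(h2, 2), round(c2, 2), round(e2, 2))   # 0.42 0.03 0.55

Whatever the exact model, the logic is the same: the more the responses of identical twins resemble each other relative to those of fraternal twins, the larger the estimated genetic component.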

Do these results indicate a specific predisposition to obey moral norms? Or is the genetic influence something more general, like religiosity or risk-taking, both of which are known to be partly heritable? To answer this question, the authors ran correlations with other factors:


Significant correlations were exhibited for age (r=.10, p=.00), sex (r=.12, p=.00), religiosity (r=.06, p=.00), preferences for risk (r=-.09, p=.00) and fairness (r=-.10, p=.00), locus of control (r=-.03, p=.01), and charitable giving (r=.09, p=.00). However, these significant correlations were relatively weak, suggesting that our measure is not merely standing in for these demographic and psychological differences between individuals. There were no significant correlations with behavioral inhibition (r=-.00, p=.81) or volunteering (r=.01, p=.29). (Loewen et al., 2013)


The jury is still out, but it looks like compliance with moral norms has a specific heritable component.
 

Population differences

Does this heritable component vary from one population to another, just as it seems to vary from one individual to another? The authors have little to say, other than the following:


Replication in other countries should occur, as the exact role and extent of genetic and common environment-influence could change in different national and cultural contexts. Such a multi-country approach could thus offer some clues on the generalizability of our findings. (Loewen et al., 2013)


Swedes seem to be better than most people at obeying moral norms. Only 1.4% think it acceptable to claim sick benefits while healthy! Maybe that's why they've been so successful at creating a welfare state. So few of them want to be free riders on the gravy train:


Gunnar and Alva Myrdal were the intellectual parents of the Swedish welfare state. In the 1930s they came to believe that Sweden was the ideal candidate for a cradle-to-grave welfare state. First of all, the Swedish population was small and homogeneous, with high levels of trust in one another and the government. Because Sweden never had a feudal period and the government always allowed some sort of popular representation, the land-owning farmers got used to seeing authorities and the government more as part of their own people and society than as external enemies. Second, the civil service was efficient and free from corruption. Third, a Protestant work-ethic—and strong social pressures from family, friends and neighbors to conform to that ethic—meant that people would work hard, even as taxes rose and social assistance expanded. Finally, that work would be very productive, given Sweden's well-educated population and strong export sector. (Norberg, 2006)


This is not how most of the world works. While studying in Russia, I noticed that the typical Russian feels a strong sense of moral responsibility toward immediate family and longstanding friends, more so than we do in the West. Beyond that charmed circle, however, the general feeling seems to be distrust, wariness, or indifference. There was little of the spontaneous willingness to help strangers that I had taken for granted back home. People had the same sense of right and wrong, but this moral universe was strongly centered on their own families.

In sociology, the term is amoral familialism: family is everything and society is nothing, or almost nothing. It was coined by the American political scientist Edward Banfield:


In 1958, Banfield, with the assistance of his wife, Laura, published The Moral Basis of a Backward Society, in which they explained why a region in southern Italy was poor. The reason, they said, was not government neglect or poor education, but culture. People in this area were reluctant to cooperate outside of their families. This kind of "amoral familialism," as they called it, was the result of a high death rate, a defective system of owning land, and the absence of extended families. By contrast, in an equally forbidding part of southern Utah, the residents were engaged in a variety of associations, each busily involved in improving the life of the community. In southern Italy, people did not cooperate; in southern Utah, they scarcely did anything else. (Banfield, 2003, p. viii)
 

Where did Western societies get this desire to treat family and non-family the same way? To some extent, it seems to be a longstanding trait. English historian Alan Macfarlane sees a tendency toward weaker kinship ties that goes back at least to the 13th century. Children had no automatic rights to the family property. Parents could leave their property to whomever they liked and disinherit their children if they so wished (Macfarlane, 2012).

Indeed, Macfarlane argues that "Weber's de-familization of society" was already well advanced in Anglo-Saxon times (Macfarlane, 1992, pp. 173-174). This picture of relatively weak kinship ties is consistent with the Western European marriage pattern. If we look at European societies west of a line running from Trieste to St. Petersburg, we find that certain cultural traits predominate:

- relatively late marriage for men and women
- many people who never marry
- neolocality (children leave the family household to form new households)
- high circulation of non-kin among different households (typically young people sent out as servants) (Hajnal, 1965; see also hbd* chick)

Again, these characteristics go back at least to the 13th century and perhaps much farther back (Seccombe, 1992, p. 94).

Historians associate this model of society with the rise of the market economy. In other words, reciprocal kinship obligations were replaced with monetized economic obligations, and this process in turn led to a broader-based morality that applied to everyone equally. In reality, the arrow of causation seems to have been the reverse. Certain societies, notably those of northwestern Europe, were pre-adapted to the market economy and thus better able to exploit its possibilities when it began to take off in the late Middle Ages. The expansion of the market economy and, later, that of the welfare state were thus made possible by certain pre-existing cultural and possibly genetic characteristics, i.e., weaker kinship ties and a corresponding extension of morality from the familial level to the societal level.
 

References

Banfield, E.C. (2003). Political Influence, New Brunswick (N.J.): Transaction Pub.

Hajnal, John (1965). European marriage pattern in historical perspective. In D.V. Glass and D.E.C. Eversley. Population in History. Arnold, London. 

Loewen, P.J., C.T. Dawes, N. Mazar, M. Johannesson, P. Koellinger, and P.K.E. Magnusson. (2013). The heritability of moral standards for everyday dishonesty, Journal of Economic Behavior & Organization, 93, 363-366.
https://files.nyu.edu/ctd1/public/Moral.pdf  

Macfarlane, A. (1992). On individualism, Proceedings of the British Academy, 82, 171-199.
http://www.alanmacfarlane.com/TEXTS/On_Individualism.pdf  

Macfarlane, A. (2012). The invention of the modern world. Chapter 8: Family, friendship and population, The Fortnightly Review, Spring-Summer serial
http://fortnightlyreview.co.uk/2012/07/invention-8/  

Norberg, J. (2006). Swedish Models, June 1, The National Interest.
http://www.johannorberg.net/?page=articles&articleid=151  

Seccombe, W. (1992). A Millennium of Family Change. Feudalism to Capitalism in Northwestern Europe, London: Verso.

 

Saturday, 5 April 2014

The riddle of Microcephalin


 
World distribution of the recent Microcephalin allele. The prevalence is indicated in black and the letter 'D' refers to the 'derived' or recent allele (Evans et al., 2005)
 

Almost a decade ago, there was much interest in a finding that a gene involved in brain growth, Microcephalin, continued to evolve after modern humans had begun to spread out of Africa. The 'derived' allele of this gene (the most recent variant) arose some 37,000 years ago somewhere in Eurasia and even today is largely confined to the native populations of Eurasia and the Americas (Evans et al., 2005).

Interest then evaporated when no significant correlation was found between this derived allele and higher scores on IQ tests (Mekel-Bobrov et al., 2007; Rushton et al., 2007). Nonetheless, a later study did show that this allele correlates with increased brain volume (Montgomery and Mundy, 2010).

So what is going on? Perhaps the derived Microcephalin allele helps us on a mental task that IQ tests fail to measure. Or perhaps it boosts intelligence in some indirect way that shows up in differences between populations but not in differences between individuals.

The second explanation is the one favored in a recent study by Woodley et al. (2014). The authors found a high correlation (r = 0.79) between the incidence of this allele and a population's estimated mean IQ, using a sample of 59 populations from throughout the world. They also found a correlation with a lower incidence of infectious diseases, as measured by DALY (disability adjusted life years). They go on to argue that this allele may improve the body’s immune response to viral infections, thus enabling humans to survive in larger communities, which in turn would have selected for increased intelligence:

Bigger and more disease resistant populations would be able to produce more high intelligence individuals who could take advantage of the new cognitive opportunities afforded by the social and cultural changes that occurred over the past 10,000 years. (Woodley et al., 2014)

Bigger populations would also have increased the probability of “new intelligence-enhancing mutations and created new cognitive niches encouraging accelerated directional selection for the carriers of these mutations.” A positive feedback would have thus developed between intelligence and population density:

[…] the evolution of higher levels of intelligence during the Upper Paleolithic revolution some 50,000 to 10,000 ybp may have been necessary for the development of the sorts of subsistence paradigms (e.g. pastoralism, plant cultivation, etc.) that subsequently emerged. (Woodley et al., 2014)
 
 
What do I think?

I have mixed feelings about this study. Looking at the world distribution of this allele (see above map), I can see right away a much higher prevalence in Eurasia and the Americas than in sub-Saharan Africa. That kind of geographic distribution would inevitably correlate with IQ. And it would also correlate with the prevalence of infectious diseases.

Unfortunately, such correlations can be spurious. There are all kinds of differences between sub-Saharan Africa and the rest of the world. One could show, for instance, that per capita consumption of yams correlates inversely with IQ. But yams don't make you stupid.
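Here is a minimal sketch of that pitfall, with made-up numbers rather than anything from the study. If two blocs of populations differ in both traits, any correlation computed across the pooled sample will be large, whether or not the traits have anything to do with each other:

import random
from statistics import mean

def pearson_r(xs, ys):
    # Plain Pearson correlation coefficient.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# Two blocs of 30 populations each; both traits differ between blocs but are
# unrelated within them. The numbers are arbitrary.
trait_a = [random.gauss(70, 5) for _ in range(30)] + [random.gauss(90, 5) for _ in range(30)]
trait_b = [random.gauss(0.1, 0.05) for _ in range(30)] + [random.gauss(0.8, 0.05) for _ in range(30)]
print(round(pearson_r(trait_a, trait_b), 2))   # around 0.9 despite no causal link

A correlation of that size says only that both traits track the same geographic divide; it cannot, by itself, tell us which of them, if either, is doing the causal work.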

More seriously, one could attribute the geographic range of this allele to a founder effect that occurred when modern humans began to spread out of Africa to other continents. In that case, it could be junk DNA with no adaptive value at all. There is of course a bit of a margin between its estimated time of origin (circa 37,000 BP) and the Out of Africa event (circa 50,000 BP), but that difference could be put down to errors in estimating either date.

No, I don't believe that a founder effect was responsible. A more likely cause would be selection to meet the cognitive demands of the First Industrial Revolution, when humans had to create a wider range of tools to cope with seasonal environments and severe time constraints on the tasks of locating, processing, and storing food. This allele might have helped humans in the task of imagining a 3D mental “template” of whatever tool they wished to make. Or it might have helped hunters store large quantities of spatio-temporal information (like a GPS) while hunting over large expanses of territory. Those are my hunches.

I don't want to pooh-pooh the explanation proposed in this study. At times, however, the authors' reasoning seems more than a bit strained. Yes, this allele does facilitate re-growth of neural tissue after influenza infections, probably via repair of damaged DNA, but the evidence for a more general role in immune response seems weak. More to the point, the allele's time of origin (circa 37,000 BP) doesn't correspond to a time when humans began to live in larger, more sedentary communities. This was when they were still hunter-gatherers and just beginning to spread into temperate and sub-arctic environments with lower carrying capacities. Human population density was probably going down, not up. It wasn't until almost 30,000 years later, with the advent of agriculture, that it began to increase considerably.

The authors are aware of this last point and note it in their paper. So we come back to the question: what could have been increasing the risk of disease circa 37,000 BP? The authors suggest several sources of increased risk: contact with archaic hominins (Neanderthals, Denisovans), domestication of wolves and other animals, increasing population densities of hunter-gatherers, and contact by hunter-gatherers with new environments. Again, this reasoning seems to push the envelope of plausibility. Yes, Neanderthals were still around at 37,000 BP, but they had already begun to retreat and by 30,000 BP were extinct over most of their former range. Yes, we have evidence of wolf domestication as early as 33,000 BP, but livestock animals were not domesticated until much later. Yes, there was a trend toward increasing population density among hunter-gatherers, but this was not until after the glacial maximum, i.e., from 15,000 BP onward. Yes, hunter-gatherers were entering new environments, but those environments were largely outside the tropics in regions where winter kills many pathogens. So disease risk would have been decreasing.

I don’t wish to come down too hard on this paper. There may be something to it. My fear is simply that it will steer researchers away from another possible explanation: the derived Microcephalin allele assists performance on a mental task that is not measured by standard IQ tests.

 
References 

Evans, P. D., Gilbert, S. L., Mekel-Bobrov, N., Vallender, E. J., Anderson, J. R., Vaez-Azizi, L. M., et al. (2005). Microcephalin, a gene regulating brain size, continues to evolve adaptively in humans, Science, 309, 1717-1720.
http://www.fed.cuhk.edu.hk/~lchang/material/Evolutionary/Brain%20gene%20and%20race.pdf  

Mekel-Bobrov, N., Posthuma, D., Gilbert, S. L., Lind, P., Gosso, M. F., Luciano, M., et al. (2007). The ongoing adaptive evolution of ASPM and Microcephalin is not explained by increased intelligence, Human Molecular Genetics, 16, 600-608.
http://psych.colorado.edu/~carey/pdfFiles/ASPMMicrocephalin_Lahn.pdf  

Montgomery, S. H., and N.I. Mundy. (2010). Brain evolution: Microcephaly genes weigh in, Current Biology, 20, R244-R246.
http://www.sciencedirect.com/science/article/pii/S0960982210000862  

Rushton, J. P., Vernon, P. A., and Bons, T. A. (2007). No evidence that polymorphisms of brain regulator genes Microcephalin and ASPM are associated with general mental ability, head circumference or altruism, Biology Letters, 3, 157-160.
http://semantico-scolaris.com/media/data/Luxid/Biol_Lett_2007_Apr_22_3(2)_157-160/rsbl20060586.pdf  

Woodley, M. A., H. Rindermann, E. Bell, J. Stratford, and D. Piffer. (2014). The relationship between Microcephalin, ASPM and intelligence: A reconsideration, Intelligence, 44, 51-63.
http://www.sciencedirect.com/science/article/pii/S0160289614000312  

Saturday, 29 March 2014

A bird in a gilded cage


My second ebook has been published in the online journal Open Behavioral Genetics.
PDF version   Epub version
The following is a copy of the Foreword:
 

******************************************************


Luigi Luca Cavalli-Sforza is a complex figure. On the one hand, he has publicly backed those who assert that human races do not exist. On the other hand, by aggregating large volumes of genetic data, he has proven the existence of large continental races, as well as smaller regional and micro ones. By developing the theory of gene-culture co-evolution, he has also shown that humans did not stop evolving genetically when they began to evolve culturally. In fact, the two processes have fed into each other, with humans having to adapt not only to the natural portion of their environment (climate, vegetation, wildlife, etc.) but also to the portion they themselves have created (mode of subsistence, behavioral norms, gender roles, class structure, belief system, etc.).

This has led some to see a double game at work. While bowing to the mainstream taboos, Cavalli-Sforza has quietly amassed evidence that human races not only exist but also differ in ways that are more than skin deep. In time, his weighty tomes will speak louder than his official statements on race. This may indeed be how he sees himself, and it might explain certain contradictions between his public persona and his academic self. Oh, those naïve antiracists, if only they knew how they’re being outfoxed!

Time will tell who is outfoxing whom. To date, the results speak for themselves. When in 1994 Cavalli-Sforza published The History and Geography of Human Genes, academics and non-academics alike were talking more openly about race, as seen in the publication that same year of The Bell Curve and in the willingness of previously silent anthropologists, like Vincent Sarich, to step forward and speak out. That interval of glasnost soon ended, in no small part because of Cavalli-Sforza's apparent conversion, as attested in his book, to the view that human races do not exist in any meaningful sense.

Why did he convert? And did he really? I doubt there was any conversion. His change of heart was too rapid, and it happened while the zeitgeist was moving in the other direction. Perhaps he saw a chance to gain acceptance for his new tome. Or perhaps he received a letter one day, detailing his wartime record, the people he worked with, and the testing on human subjects …

Cavalli-Sforza had to remake his life when the war ended. He never denied the nature of his wartime research (the time it takes for anthrax to kill its host) but tried to create the impression that he had been doing pure research with no military implications. Yet this was Berlin, in 1943-1944. There was no money for pure research. Was he motivated by opportunism, the chance to gain experience in his field of study? Or did he feel loyalty to the Axis cause? It is difficult to say, and perhaps it doesn’t matter. It is enough to say that he later saw his wartime research as a stain on his record and tried to minimize it as much as possible. He was thus vulnerable to blackmail, or rather to his chronic fear of blackmail.

We will probably never know the full story. One thing is sure. If Cavalli-Sforza is playing a double game, he has been playing it far too long. Such a strategy is excusable for an academic who is young, untenured, poorly known, and far from retirement, but these excuses hardly apply to a professor emeritus like Cavalli-Sforza. The time is overdue to speak frankly and, if need be, pay the price. Anyway, what else can he do now with all of his public esteem? Take it with him to the next world?

 

References

Frost, P. (2014). L.L. Cavalli-Sforza. A bird in a gilded cage, Open Behavioral Genetics, March 28
http://openpsych.net/OBG/2014/03/l-l-cavalli-sforza-a-bird-in-a-gilded-cage/


 

Saturday, 22 March 2014

Kinder, gentler speech


 
A highwayman - by Glen Campbell (source). Before the rise of the State, and its pacification of social relations, the top man was the one who dominated the local group through a mixture of violence, bombast, and charisma.
 

Before the State came into being, men were organized into small, loosely defined groups where authority was wielded through a mixture of violence, bombast, and charisma. The more you had of these qualities, the likelier you would become the leader, "the big man." But such leadership could easily slip out of your hands. Power was something that all men held, and it was only through the consensus of the moment that one man held more of it than the others.

Thus, in pre-State societies, power is not a permanent structure that transcends the lifetime of any one leader. Power is the leader. It is highly personal and ephemeral, and these qualities extend to the tools of power, like speech. When describing Amerindian tribes in Paraguay, Pierre Clastres (1989, pp. 151-153) says:


To speak is above all to possess the power to speak. [...] the question to ask is not: who is your chief? but rather: who among you is the one who speaks? The master of words is what many groups call their chief.

[...] Indian societies do not recognize the chief's right to speak because he is the chief: they require that the man destined to be chief prove his command over words. Speech is an imperative obligation for the chief. The tribe demands to hear him: a silent chief is no longer chief.

This situation changes with the rise of the State, in particular with its monopoly on the use of violence. Social relations become more pacified, more structured, and less changeable, thus creating a culture of deference to authority. Speech is still manipulative but subtly so, as Rosen (1987) describes in Ethiopia:


For people who grow up speaking Amharic and Tigrinya, the idea of being precise with language is a foreign one. Ethiopians, perhaps Amharas more than Tigreans, are always on guard with others, suspicious about the motives of almost everyone, and on the alert for verbal assaults of one sort or another. The Amhara does not assume good intentions—he expects people to harbor disruptive inclinations. He deals with authority cautiously, always seeking to perfect his verbal means for giving vent to his criticisms and frustrations, but without incurring the wrath of powerful superiors.

[...] One must live a long time in the midst of Ethiopians, speaking with them in Amharic (or Tigrinya), in order to begin to appreciate how much calculation is invested in each phrase, each answer to a question, each overt response to a situation. That he who desires to do harm may always be polite, or that he who wishes to deliver an insult may include it in a finely-wrought compliment, is part of a general understanding of human nature. When a person speaks, he wants to do so subtly, being able to make his point effectively, yet not so directly that he might find himself involved in an altercation or worse with some equally sensitive opponent.

Social relations are still incompletely pacified in Ethiopia. This is partly because of recurring conflicts between central and peripheral sources of authority, but also because many people chose until recent times to be outlaws, i.e., those outside the sphere of State-imposed law:


In Ethiopia, an exceptionally fierce warrior could not always restrict himself to serving the common cause, or to being subservient to a particular chieftain. His alternative was to rebel, flee from the constraints of society and become a shifta. The dictionary defines this term to mean "outlaw, bandit, brigand, rebel". It was applied to anyone who committed a crime and then fled to the wilderness, thereafter living by stealth and cunning, if not, as was more than likely, by killing and highway robbery. As often as not, the shifta was also admired for being guabäz: for his courage and manliness, and, perhaps, most of all, for his daring in flouting the norms of the society. (Rosen, 1987)

Incomplete pacification also appears in the persistence of disruptive forms of speech, "when language is made into a weapon to attack or disrupt others":


One form of this is an Ethiopian penchant for backbiting, known in Amharic as chiqechiq. This appears when personal interests are asserted in the midst of group undertakings, often leading to the downfall of the community plan or project, and the disruption of joint undertakings. Another form is the studied use of hyperbole in order to magnify a case, or to gain attention for one's cause, even if this requires wild exaggeration of the truth. (Rosen, 1987)

Emergence of a free marketplace of ideas

I have argued elsewhere that the State's monopoly on violence created a new cultural environment that favored the survival of meeker and more submissive individuals (Frost, 2010). This environment also improved the prospects for individuals who used speech less aggressively. Because other individuals no longer posed a threat to life and property, and because trust had become the rule and not the exception, people were now freer to use speech simply for communication. It became possible to exchange ideas in good faith and judge them on their own merits. 

This development is analogous to the rise of the market economy. In a low-trust society, buyers and sellers can securely make their transactions only in small protected areas that are limited in space and time, i.e., shops and marketplaces. In a high-trust society, the market mechanism can spread beyond these isolated points of exchange to encompass the entire economy. Increased trust emancipated the marketplace of goods and services, and it had a similar effect on the marketplace of ideas.


Cultural or genetic evolution?

Selection acts on phenotypes, and only indirectly on genotypes. When speech began to be used in new ways, the old ways became a handicap for survival and reproduction. There was thus cultural selection for new speech patterns. But were these new patterns passed on only through learning? Or was there also selection for certain genetic predispositions?

There are predispositions that selection can act upon. Loudness of speech seems to have a heritable basis (Carmelli et al., 1988; Matthews et al., 1984). The same is true for deceitful behavior (Barker et al., 2009). Heritability is particularly high for Attention-Deficit/Hyperactivity Disorder (ADHD), which is characterized by certain speech differences:


Analysis of speech parameters during conversation, such as voice rhythm (rate and duration of pauses and vocalization, response latency), intensity, and frequency, has revealed marked differences in the timing and modulation of speech between children with ADHD and those with and without specific learning disabilities. They speak louder, fail to modulate their voice volume, speak for much longer at a stretch with many short pause durations during their talk, but take much longer to respond to the conversational partner. (Tannock, 2005)

This is not to say that ADHD became less prevalent with the pacification of social relations, but rather that this new cultural environment selected for certain heritable aspects of speech that are impaired by ADHD. Like many other genetic disorders, ADHD sheds light on the heritable variability that selection can act upon.  

In sum, when the State imposed a monopoly on the use of violence, it set in motion a process of gene-culture co-evolution with many consequences. Among other things, this process may have favored not only learned ways of speaking but unlearned ways as well.


References 

Barker, E.D., H. Larson, E. Viding, B. Maughan, F. Rijsdijk, N. Fontaine, and R. Plomin. (2009). Common genetic but specific environmental influences for aggressive and deceitful behaviors in preadolescent males, Journal of Psychopathology and Behavioral Assessment, 31, 299-308.
http://www.drru-research.org/data/resources/55/Barker-E.-et-al.-2009.PDF

Carmelli, D., R. Rosenman, M. Chesney, R. Fabsitz, M. Lee, and N. Borhani. (1988). Genetic heritability and shared environmental influences of type A measures in the NHLBI Twin Study, American Journal of Epidemiology, 127 (5), 1041-1052. 

Clastres, P. (1989). Society against the State, New York: Zone Books.

Frost, P. (2010). The Roman State and genetic pacification, Evolutionary Psychology, 8(3), 376-389. http://www.epjournal.net/filestore/EP08376389.pdf

Matthews, K.A., R.H. Rosenman, T.M. Dembroski, E.L. Harris, and J.M. MacDougall. (1984). Familial resemblance in components of the type A behavior pattern: a reanalysis of the California type A twin study, Psychosomatic Medicine, November, 46, 512-22.

Rosen, C. (1987). Core symbols of Ethiopian identity and their role in understanding the Beta Israel today, in M. Ashkenazi and A. Weingrod (eds.) Ethiopian Jews and Israel, pp. 55-62, New Brunswick (U.S.A.): Transaction Books. 

Tannock, R. (2005). Language and mental health disorders: the case of ADHD, in W. Ostreng (ed.) Convergence. Interdisciplinary Communications 2004/2005, 45-53.
http://www.cas.uio.no/Publications/Seminar/Convergence_Tannock.pdf

Saturday, 15 March 2014

Did Europeans become white in historic times?


 
Tătăroaice – Petre Iorgulescu-Yor (source). Today, the steppes north of the Black Sea lie within the European world—politically, culturally, and demographically. Not so long ago, they were home to nomads of Central Asian origin.
 

A new study shows that Europeans underwent strong selection for white skin, non-brown eyes, and non-black hair … during historic times!

Here we present direct estimates of selection acting on functional alleles in three key genes known to be involved in human pigmentation pathways—HERC2, SLC45A2, and TYR—using allele frequency estimates from Eneolithic, Bronze Age, and modern Eastern European samples and forward simulations. Neutrality was overwhelmingly rejected for all alleles studied, with point estimates of selection ranging from around 2-10% per generation. Our results provide direct evidence that strong selection favoring lighter skin, hair, and eye pigmentation has been operating in European populations over the last 5,000 y. (Wilde et al., 2014)
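To get a feel for what a selection coefficient of 2-10% per generation means, here is a minimal sketch of deterministic allele-frequency change for an additive allele. It is only an illustration, not the forward-simulation machinery used by Wilde et al., and the starting frequency is arbitrary. At 25-30 years per generation, 5,000 years is roughly 170-200 generations:

def next_freq(p, s):
    # One generation of selection on an additive allele with genotype fitnesses
    # 1 (non-carrier), 1 + s (heterozygote), 1 + 2s (homozygote).
    w_bar = 1.0 + 2.0 * s * p
    return p * (1.0 + s * (1.0 + p)) / w_bar

def trajectory(p0=0.05, s=0.03, generations=200):
    p, traj = p0, [p0]
    for _ in range(generations):
        p = next_freq(p, s)
        traj.append(p)
    return traj

traj = trajectory()
for g in range(0, 201, 50):
    print(g, round(traj[g], 2))   # climbs from 0.05 to roughly 0.95 by generation 200

Selection of that strength can move an allele from rarity to near-fixation within the stated 5,000-year window; the question raised below is whether the observed frequency change really reflects selection within a continuous population.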
 
If true, this finding would contradict other recent findings. Two studies have found a much earlier time frame for the whitening of European skin: 11,000 to 19,000 years ago (Beleza et al., 2013) and 7,600 to 19,200 years ago (Canfield et al., 2014). Two studies of ancient DNA indicate that non-brown eyes were already in existence 7,000 years ago in Spain (Olalde et al., 2014) and 8,000 years ago in Luxembourg (Lazaridis et al., 2013). Moreover, the genes responsible are the same as the ones in the above quote.

So who is right and who is wrong? All of these studies are probably right, but only for some early Europeans and not for all. In the latest study, the samples come from a very small part of Europe—the steppes north of the Black Sea [2]:

Ancient DNA was retrieved from 63 out of 150 Eneolithic (ca. 6,500-5,000 y ago) and Bronze Age (ca. 5,000-4,000 y ago) samples from the Pontic-Caspian steppe, mainly from modern-day Ukraine. […] We also genotyped the three pigmentation-associated SNPs in a sample of 60 modern Ukrainians (28) and observed an increase in frequency of all derived alleles between the ancient and modern samples from the same geographic region (Table 1 and Fig. S1). This implies that the pigmentation of the prehistoric population is likely to have differed from that of modern humans living in the same area.

[…] Inferring natural selection based on temporal differences in allele frequency requires the assumption of population continuity. To this end we compared the 60 mtDNA HVR1 sequences obtained from our ancient sample to 246 homologous modern sequences (29–31) from the same geographic region and found low genetic differentiation (FST = 0.00551; P = 0.0663) (32). Coalescent simulations based on the mtDNA data, accommodating uncertainty in the ancient sample age, failed to reject population continuity under a wide range of assumed ancestral population size combinations. (Wilde et al., 2014)

The authors are placing the burden of proof on the wrong null hypothesis when they state that their simulations “failed to reject population continuity.” The null hypothesis should be population discontinuity. For example, Swedes and Greeks differ in skin tone and eye color, and if we compare their autosomal DNA we get a comparable FST of 0.0084 (Genetic History of Europe, 2014). Admittedly, FST is different with mitochondrial DNA.
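For readers unfamiliar with the statistic, here is a minimal sketch of the simplest Wright-style FST for one biallelic locus and two equally sized populations. It is not the estimator Wilde et al. apply to mtDNA sequences; the point is only that FST stays small even when allele frequencies differ noticeably, so a small value is weak evidence of continuity:

def fst_two_pops(p1, p2):
    # Wright's FST: the variance of the allele frequency across the two populations,
    # divided by p_bar * (1 - p_bar), its maximum possible value given the mean frequency.
    p_bar = (p1 + p2) / 2.0
    var_p = ((p1 - p_bar) ** 2 + (p2 - p_bar) ** 2) / 2.0
    return var_p / (p_bar * (1.0 - p_bar))

# Illustrative frequencies only: a 10-point difference still gives an FST of about 0.01.
print(round(fst_two_pops(0.45, 0.55), 3))   # 0.01

Given that present-day European populations that plainly differ in pigmentation show autosomal FST values in the same range, a value of 0.0055 between the ancient and modern samples hardly settles the question of continuity.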

I suspect the authors ruled out population discontinuity because their FST seemed incompatible with a non-European population giving way to a European one. If so, they forgot one thing. They were comparing a population of the present with one that existed some 5,000 years ago. If you go farther and farther back in time, any human population will look more and more ancestral to a present-day population. This is especially so in northern Eurasia, where a population ancestral to both Europeans and Amerindians existed some 20,000 years ago. Yes, the FST does seem incompatible with a non-European population giving way to a European one, but this is because the ancient DNA comes from a non-European population that was closer to the time of common origin for all northern Eurasians.

This ancient DNA may come from a mixed European/Central Asian population or an intermediate and now extinct population, perhaps similar to the Lapps. If we look at the derived (European) alleles for the three genes in question (HERC2, SLC45A2, TYR), the frequencies fall halfway between those of Europeans and Asians (see Table 1 in the paper). In any case, this population does not have to be of non-European origin to be noticeably darker in skin color. As shown by the recent Mesolithic findings from Luxembourg and Spain, there used to be apparently native dark-skinned populations in the heart of Europe.
 

Historical background 

The hypothesis of population discontinuity becomes even more plausible if we look at the history of this region. Today, the steppes north of the Black Sea lie within the European world—politically, culturally, and demographically. Not so long ago, they were home to nomads of Central Asian origin. The latest of them, the Tatars, held sway until the 18th century.

The Tatars intermixed extensively with Slavic wives and concubines, so much so that they now look almost as fair as other Europeans. But they were originally quite swarthy, as attested by medieval sources. In a 14th-century romance, The King of Tars, a Tatar Khan converts to Christianity and turns white in the baptismal water. Two other chronicles of the same period describe how a Tatar Khan's Christian concubine bears him a son white on one side and black on the other. When baptized, the child emerges from the water white on both sides (Hornstein, 1941; Metlitzki, 1977, p. 137).

Medieval writers often noticed this difference in skin color. Genoese notaries usually described Tatar slaves as olive-skinned (Plazolles Guillen, 2012, p. 119). Florentine acts of sale give the following numerical breakdown of Tatar slaves by skin color: black 2, brown 18, olive 161, fair 11, reddish 5, white 45 (Epstein, 2001, p. 108). During a trial, a slave tried to regain her freedom by claiming to be Russian and, hence, Christian. Her owner rebuked her, saying: “You’ve lied to me. You look more like a Tatar, not at all like a Russian” (Plazolles Guillen, 2012, p. 119).

The Tatars were preceded by other nomads of Central Asian origin. The Scythians (8th to 2nd century BC) were likewise described as dark-skinned. Hippocrates wrote: “The Scythian race are tawny from the cold, and not from the intense heat of the sun, for the whiteness of the skin is parched by the cold, and becomes tawny” (Hippocrates).

One can find references to the contrary (Scythians, 2014). Keep in mind that the word “Scythian” was often used in the ancient world to encompass all northern peoples:

To the ancient Greeks the Scythians, Sarmatians, Germans, and Goths were the remote northern races of antiquity. Geographically near to one another, they were often grouped together under the term “Scythians,” which by the third century B.C.E. no longer had an ethnic or national connotation and had come to designate the peoples of the remote north. (Goldenberg, 2003, p. 43)

The term “Scythian” may also have subsumed different peoples north of the Black Sea, some of whom came from Central Asia and others from areas farther north and west.
 

Conclusion

Because this region is on the periphery of the European world and has been exposed to migrations from Central Asia, population change is a likelier explanation for the findings of Wilde et al.

These findings are nonetheless interesting. Together with the ancient DNA from Mesolithic hunter-gatherers in Spain and Luxembourg, we have further proof that many early Europeans were brown-skinned. Indeed, this seems to have been the physical appearance of all Europeans during their first 20,000 years in Europe. Only later, within the time frame of 20,000 to 10,000 years ago, did some of them become white.

This may seem surprising to those who believe that white skin is an adaptation to weak sunlight at high latitudes. It was thought that Europeans became white because their ancestors no longer needed dark pigmentation to protect themselves against sunburn and skin cancer. Meanwhile, light pigmentation became necessary to maintain synthesis of vitamin D. There was admittedly the example of dark-skinned peoples who have long lived at similar latitudes in Asia and North America, but that counterfactual was attributed to the availability of vitamin D from a marine diet, such as among the Inuit of northern Canada.
 
Wilde et al. do, in fact, address the apparent contradiction between their findings and the hypothesis that ancestral Europeans became white to maintain adequate production of vitamin D in their skin. In their Discussion section, they suggest that the shift from hunting and gathering to farming led to a decrease in dietary vitamin D (from fatty fish and animal liver). The main problem with this explanation is that farming came late to many parts of Europe: about 2,000 to 3,000 years ago for East Baltic peoples and less than 3,000 years ago for Finnish peoples (and incompletely at that). This leaves a very narrow time frame for evolution from brown skin to white skin. Ultimately, this question will be resolved with retrieval of ancient DNA from these populations.

 
Notes

1. Although Wilde et al. mention hair color, they did not study the main hair-color gene, MC1R.

2. Razib Khan has a great map of the ancient DNA samples.
 

References 

Beleza, S., Murias dos Santos, A., McEvoy, B., Alves, I., Martinho, C., Cameron, E., Shriver, M.D., Parra E.J., and Rocha, J. (2013). The timing of pigmentation lightening in Europeans. Molecular Biology and Evolution, 30, 24-35.
http://mbe.oxfordjournals.org/content/30/1/24.short 

Canfield, V.A., A. Berg, S. Peckins, S.M. Wentzel, K.C. Ang, S. Oppenheimer, and K.C. Cheng. (2014). Molecular phylogeography of a human autosomal skin color locus under natural selection, G3, 3, 2059-2067.
http://www.g3journal.org/content/3/11/2059.full 

Epstein, S.A. (2001). Speaking of Slavery. Color, Ethnicity, & Human Bondage in Italy, Ithaca: Cornell University Press. 

Genetic History of Europe. (2014). Wikipedia
http://en.wikipedia.org/wiki/Genetic_history_of_Europe

Goldenberg, D.M. (2003). The Curse of Ham. Race and Slavery in early Judaism, Christianity, and Islam, Princeton: Princeton University Press. 

Hippocrates. On Airs, Waters, and Places, part 20. Translated by Francis Adams.
http://classics.mit.edu/Hippocrates/airwatpl.20.20.html

Hornstein, L.H.  (1941). New analogues to the King of Tars, Modern Language Review, 36, 433-442. 

Khan, R. (2014). Descent and selection is a bugger: Black Kurgans, March 12, The Unz Review: An Alternative Media Selection
http://www.unz.com/gnxp/descent-and-selection-is-a-bugger/ 

Lazaridis, I., Patterson, N., Mittnik, A., Renaud, G., Mallick, S., et al. (2013). Ancient human genomes suggest three ancestral populations for present-day Europeans, BioRxiv, December 23.
http://biorxiv.org/content/early/2013/12/23/001552.full-text.pdf+html

Metlitzki, D. (1977). The Matter of Araby in Medieval England, New Haven and London, Yale University Press. 

Olalde, I., M.E. Allentoft, F. Sanchez-Quinto, G. Saintpere, C.W.K. Chiang, et al. (2014).  Derived immune and ancestral pigmentation alleles in a 7,000-year-old Mesolithic European, Nature, early view
http://www.nature.com/nature/journal/vaop/ncurrent/full/nature12960.html

Plazolles Guillen, F. (2012). “Negre e de terra de negres infels …”: Servitude de la couleur (Valence, 1479-1516), in R. Botte and A. Stella (eds.) Couleurs de l’esclavage sur les deux rives de la Méditerranée (Moyen Âge – xxe siècle), pp. 113-158, Paris: Karthala. 

Scythians. (2014). Wikipedia
http://en.wikipedia.org/wiki/Scythians  

Wilde, S., A. Timpson, K. Kirsanow, E. Kaiser, M. Kayser, M. Unterländer, N. Hollfelder, I.D. Potekhina, W. Schier, M.G. Thomas, and J. Burger. (2014). Direct evidence for positive selection of skin, hair, and eye pigmentation in Europeans during the last 5,000 y, Proceedings of the National Academy of Sciences, published ahead of print.
http://www.pnas.org/content/early/2014/03/05/1316513111.full.pdf+html