
Is there any reason the common housefly continues to return to an area?


This might come off as a really silly question. But I'm wondering (especially in the case of food) if there is any reason a fly would continue to try and sit on top of a piece of food even after I've swatted it away. I assume (it could be a misconception) that it is instinctive for animals and insects to leave an area that is harmful or dangerous to them after more than one close encounter. Is this not the same for the fly?

I have this question mainly because I recall waving a fly away several times while eating lunch, and I couldn't understand why the fly wouldn't just find another place where there is food or somewhere safer.


I don't think it's a silly question, but it is a common error to anthropomorphise animals.

Insects respond to cues which they have evolved to respond to, and this is how they 'make decisions'. They do not have free will or any more complex decision-making processes such as common sense. This is evident in lots of insect behaviour: flying repeatedly at a closed window; landing on brightly coloured clothes instead of flowers; and returning to a food source when they are in real danger of being swatted!

When a fly senses food (often by olfactory receptors), it is 'programmed' to fly towards it in response to some chemical cue, depending on the species and the food. Flies may not have evolved a response to swatting, or perhaps the food cue overrides other cues. In nature, swatting is not much of a threat to a fly. Some animals may brush flies away, but since the flies are not really doing any harm in feeding from another animal's food, they are mostly ignored.

CO2 traps are used to entice and kill mosquitoes. The mosquitoes are attracted to CO2 (as it is exhaled by the animals on which they blood-feed); they will only evolve to avoid the traps if there is another cue which they could eventually associate with a negative effect.

Another thing to remember when thinking about insect behaviour is that their life strategy is very different to ours. Insects are more r-selected than humans, meaning that each individual life has not had so much energy put into it as a more K-selected animal (such as humans), and to compensate for this, many more young are produced. This often results in more risks being taken by individuals since there will still be a viable population even after many deaths.

Chapter 4 of 'The Insects' by Gullan & Cranston gives a good introduction to the sensory responses of insect behaviour. There are other books on the subject, 'Introduction to insect behaviour' by Atkins looks like a good starting point, but I have not read it yet.


Ten reasons why immunity passports are a bad idea

Natalie Kofler is founder of the global initiative Editing Nature and adviser for the Scientific Citizenship Initiative, Harvard Medical School, Boston, Massachusetts, USA.


Françoise Baylis is a professor of philosophy with a specialization in bioethics at Dalhousie University, Halifax, Nova Scotia, Canada.


A woman in Beijing shows a health QR code on her phone to access a shopping area, as a security guard checks her temperature. Credit: Kevin Frayer/Getty

Imagine a world where your ability to get a job, housing or a loan depends on passing a blood test. You are confined to your home and locked out of society if you lack certain antibodies.

It has happened before. For most of the nineteenth century, immunity to yellow fever divided people in New Orleans, Louisiana, between the ‘acclimated’, who had survived yellow fever, and the ‘unacclimated’, who had not had the disease [1]. Lack of immunity dictated whom people could marry, where they could work, and, for those forced into slavery, how much they were worth. Presumed immunity concentrated political and economic power in the hands of the wealthy elite, and was weaponized to justify white supremacy.

Something similar could be our dystopian future if governments introduce ‘immunity passports’ in efforts to reverse the economic catastrophe of the COVID-19 pandemic. The idea is that such certificates would be issued to those who have recovered and tested positive for antibodies to SARS-CoV-2 — the coronavirus that causes the disease. Authorities would lift restrictions on those who are presumed to have immunity, allowing them to return to work, to socialize and to travel. This idea has so many flaws that it is hard to know where to begin.


On 24 April, the World Health Organization (WHO) cautioned against issuing immunity passports because their accuracy could not be guaranteed. It stated: “There is currently no evidence that people who have recovered from COVID-19 and have antibodies are protected from a second infection” (see go.nature.com/3cutjqz). Nonetheless, the idea is being floated in the United States, Germany, the United Kingdom and other nations.

China has already introduced virtual health checks, contact tracing and digital QR codes to limit the movement of people. Antibody test results could easily be integrated into this system. And Chile, in a game of semantics, says that it intends to issue ‘medical release certificates’ with three months’ validity to people who have recovered from the disease [2].

In our view, any documentation that limits individual freedoms on the basis of biology risks becoming a platform for restricting human rights, increasing discrimination and threatening — rather than protecting — public health. Here we present ten reasons why immunity passports won’t, can’t and shouldn’t be allowed to work.


References (This Section)

  1. Centers for Disease Control and Prevention. Hepatitis A outbreak associated with green onions at a restaurant – Monaca, Pennsylvania, 2003. MMWR 2003;52(47):1155–7.
  2. Cobb S, Miller M, Wald N. On the estimation of the incubation period in malignant disease. J Chron Dis 1959;9:385–93.
  3. Kelsey JL, Thompson WD, Evans AS. Methods in observational epidemiology. New York: Oxford University Press; 1986. p. 216.
  4. Lee LA, Ostroff SM, McGee HB, Johnson DR, Downes FP, Cameron DN, et al. An outbreak of shigellosis at an outdoor music festival. Am J Epidemiol 1991;133:608–15.
  5. White DJ, Chang H-G, Benach JL, Bosler EM, Meldrum SC, Means RG, et al. Geographic spread and temporal increase of the Lyme disease epidemic. JAMA 1991;266:1230–6.
  6. Centers for Disease Control and Prevention. Outbreak of West Nile-like viral encephalitis – New York, 1999. MMWR 1999;48(38):845–9.
  7. Centers for Disease Control and Prevention. Prevalence of overweight and obesity among adults with diagnosed diabetes — United States, 1988–1994 and 1999–2002. MMWR 2004;53(45):1066–8.
  8. National Center for Health Statistics [Internet]. Atlanta: Centers for Disease Control and Prevention [updated 2005 Feb 8]. Available from: https://www.cdc.gov/nchs/products/pubs/pubd/hestats/overwght99.htm.

Figure 1.21

Description: Epidemic curve (histogram) shows the presumed index case of Hepatitis A, followed 4 days later by a steep increase in cases which tapers off to 0. Cases among food handlers and secondary cases are also shown.

Figure 1.22

Description: Histogram shows the number of cases of diarrhea by date of onset. Arrows also show when water main breaks, a boil water order, and water chlorination occur. Bloody and nonbloody diarrheal illness is indicated by different colors. Overall increases and decreases in cases are easily seen.

Figure 1.23

Description: Histogram shows that the number of measles cases peaks around November 23 then declines. It peaks again on December 5 and declines until it peaks a third time.

Figure 1.24

Description: Histogram shows the number of Shigella cases among staff and attendees in stacked bars. The first case occurs in a staff member on day 1. The number of cases among staff and attendees is seen in relationship to the festival dates.

Figure 1.25

Description: Histogram shows a general increasing trend in the number of reported cases of Lyme disease.

Figure 1.26

Description: Histogram shows reported cases of West Nile encephalitis in New York City and other locations. In NYC, cases drop to 0 after mosquito control activities are begun in the city. Reported cases in other locations continue at about the same rate.


The unwelcome revival of ‘race science’

One of the strangest ironies of our time is that a body of thoroughly debunked “science” is being revived by people who claim to be defending truth against a rising tide of ignorance. The idea that certain races are inherently more intelligent than others is being trumpeted by a small group of anthropologists, IQ researchers, psychologists and pundits who portray themselves as noble dissidents, standing up for inconvenient facts. Through a surprising mix of fringe and mainstream media sources, these ideas are reaching a new audience, which regards them as proof of the superiority of certain races.

The claim that there is a link between race and intelligence is the main tenet of what is known as “race science” or, in many cases, “scientific racism”. Race scientists claim there are evolutionary bases for disparities in social outcomes – such as life expectancy, educational attainment, wealth, and incarceration rates – between racial groups. In particular, many of them argue that black people fare worse than white people because they tend to be less naturally intelligent.

Although race science has been repeatedly debunked by scholarly research, in recent years it has made a comeback. Many of the keenest promoters of race science today are stars of the “alt-right”, who like to use pseudoscience to lend intellectual justification to ethno-nationalist politics. If you believe that poor people are poor because they are inherently less intelligent, then it is easy to leap to the conclusion that liberal remedies, such as affirmative action or foreign aid, are doomed to fail.

There are scores of recent examples of rightwingers banging the drum for race science. In July 2016, for example, Steve Bannon, who was then Breitbart boss and would go on to be Donald Trump’s chief strategist, wrote an article in which he suggested that some black people who had been shot by the police might have deserved it. “There are, after all, in this world, some people who are naturally aggressive and violent,” Bannon wrote, evoking one of scientific racism’s ugliest contentions: that black people are more genetically predisposed to violence than others.

One of the people behind the revival of race science was, not long ago, a mainstream figure. In 2014, Nicholas Wade, a former New York Times science correspondent, wrote what must rank as the most toxic book on race science to appear in the last 20 years. In A Troublesome Inheritance, he repeated three race-science shibboleths: that the notion of “race” corresponds to profound biological differences among groups of humans; that human brains evolved differently from race to race; and that this is supported by different racial averages in IQ scores.

Wade’s book prompted 139 of the world’s leading population geneticists and evolutionary theorists to sign a letter in the New York Times accusing Wade of misappropriating research from their field, and several academics offered more detailed critiques. The University of Chicago geneticist Jerry Coyne described it as “simply bad science”. Yet some on the right have, perhaps unsurprisingly, latched on to Wade’s ideas, rebranding him as a paragon of intellectual honesty who had been silenced not by experts, but by political correctness.

“That attack on my book was purely political,” Wade told Stefan Molyneux, one of the most popular promoters of the alt-right’s new scientific racism. They were speaking a month after Trump’s election on Molyneux’s YouTube show, whose episodes have been viewed tens of millions of times. Wade continued: “It had no scientific basis whatever and it showed the more ridiculous side of this herd belief.”

Another of Molyneux’s recent guests was the political scientist Charles Murray, who co-authored The Bell Curve. The book argued that poor people, and particularly poor black people, were inherently less intelligent than white or Asian people. When it was first published in 1994, it became a New York Times bestseller, but over the next few years it was picked to pieces by academic critics.

As a frequent target for protest on college campuses, Murray has become a figurehead for conservatives who want to portray progressives as unthinking hypocrites who have abandoned the principles of open discourse that underwrite a liberal society. And this logic has prompted some mainstream cultural figures to embrace Murray as an icon of scientific debate, or at least as an emblem of their own openness to the possibility that the truth can, at times, be uncomfortable. Last April, Murray appeared on the podcast of the popular nonfiction author Sam Harris. Murray used the platform to claim his liberal academic critics “lied without any apparent shadow of guilt because, I guess, in their own minds, they thought they were doing the Lord’s work.” (The podcast episode was entitled “Forbidden knowledge”.)

Students in Vermont turn their backs to Charles Murray during a lecture in March last year. Photograph: Lisa Rathke/AP

In the past, race science has shaped not only political discourse, but also public policy. The year after The Bell Curve was published, in the lead-up to a Republican congress slashing benefits for poorer Americans, Murray gave expert testimony before a Senate committee on welfare reform. More recently, congressman Paul Ryan, who helped push the Republicans’ latest tax cuts for the wealthy, has claimed Murray as an expert on poverty.

Now, as race science leaches back into mainstream discourse, it has also been mainlined into the upper echelons of the US government through figures such as Bannon. The UK has not been spared this revival: the London Student newspaper recently exposed a semi-clandestine conference on intelligence and genetics held for the last three years at UCL without the university’s knowledge. One of the participants was the 88-year-old Ulster-based evolutionary psychologist Richard Lynn, who has described himself as a “scientific racist”.

One of the reasons scientific racism hasn’t gone away is that the public hears more about the racism than it does about the science. This has left an opening for people such as Murray and Wade, in conjunction with their media boosters, to hold themselves up as humble defenders of rational enquiry. With so much focus on their apparent bias, we’ve done too little to discuss the science. Which raises the question: why, exactly, are the race scientists wrong?

Race, like intelligence, is a notoriously slippery concept. Individuals often share more genes with members of other races than with members of their own race. Indeed, many academics have argued that race is a social construct – which is not to deny that there are groups of people (“population groups”, in the scientific nomenclature) that share a high amount of genetic inheritance. Race science therefore starts out on treacherous scientific footing.

The supposed science of race is at least as old as slavery and colonialism, and it was considered conventional wisdom in many western countries until 1945. Though it was rejected by a new generation of scholars and humanists after the Holocaust, it began to bubble up again in the 1970s, and has returned to mainstream discourse every so often since then.

In 1977, during my final year in state high school in apartheid South Africa, a sociology lecturer from the local university addressed us and then took questions. He was asked whether black people were as intelligent as white people. No, he said: IQ tests show that white people are more intelligent. He was referring to a paper published in 1969 by Arthur Jensen, an American psychologist who claimed that IQ was 80% a product of our genes rather than our environments, and that the differences between black and white IQs were largely rooted in genetics.

In apartheid South Africa, the idea that each race had its own character, personality traits and intellectual potential was part of the justification for the system of white rule. The subject of race and IQ was similarly politicised in the US, where Jensen’s paper was used to oppose welfare schemes, such as the Head Start programme, which were designed to lift children out of poverty. But the paper met with an immediate and overwhelmingly negative reaction – “an international firestorm,” the New York Times called it 43 years later, in Jensen’s obituary – especially on American university campuses, where academics issued dozens of rebuttals, and students burned him in effigy.

The recent revival of ideas about race and IQ began with a seemingly benign scientific observation. In 2005, Steven Pinker, one of the world’s most prominent evolutionary psychologists, began promoting the view that Ashkenazi Jews are innately particularly intelligent – first in a lecture to a Jewish studies institute, then in a lengthy article in the liberal American magazine The New Republic the following year. This claim has long been the smiling face of race science: if it is true that Jews are naturally more intelligent, then it’s only logical to say that others are naturally less so.

The background to Pinker’s essay was a 2005 paper entitled “Natural history of Ashkenazi intelligence”, written by a trio of anthropologists at the University of Utah. In their 2005 paper, the anthropologists argued that high IQ scores among Ashkenazi Jews indicated that they evolved to be smarter than anyone else (including other groups of Jews).

This evolutionary development supposedly took root between 800 and 1650 AD, when Ashkenazis, who primarily lived in Europe, were pushed by antisemitism into money-lending, which was stigmatised among Christians. This rapid evolution was possible, the paper argued, in part because the practice of not marrying outside the Jewish community meant a “very low inward gene flow”. This was also a factor behind the disproportionate prevalence in Ashkenazi Jews of genetic diseases such as Tay-Sachs and Gaucher’s, which the researchers claimed were a byproduct of natural selection for higher intelligence: those carrying the gene variants, or alleles, for these diseases were said to be smarter than the rest.

Pinker followed this logic in his New Republic article, and elsewhere described the Ashkenazi paper as “thorough and well-argued”. He went on to castigate those who doubted the scientific value of talking about genetic differences between races, and claimed that “personality traits are measurable, heritable within a group and slightly different, on average, between groups”.

In subsequent years, Nicholas Wade, Charles Murray, Richard Lynn, the increasingly popular Canadian psychologist Jordan Peterson and others have all piled in on the Jewish intelligence thesis, using it as ballast for their views that different population groups inherit different mental capacities. Another member of this chorus is the journalist Andrew Sullivan, who was one of the loudest cheerleaders for The Bell Curve in 1994, featuring it prominently in The New Republic, which he edited at the time. He returned to the fray in 2011, using his popular blog, The Dish, to promote the view that population groups had different innate potentials when it came to intelligence.

Sullivan noted that the differences between Ashkenazi and Sephardic Jews were “striking in the data”. It was a prime example of the rhetoric of race science, whose proponents love to claim that they are honouring the data, not political commitments. The far right has even rebranded race science with an alternative name that sounds like it was taken straight from the pages of a university course catalogue: “human biodiversity”.

A common theme in the rhetoric of race science is that its opponents are guilty of wishful thinking about the nature of human equality. “The IQ literature reveals that which no one would want to be the case,” Peterson told Molyneux on his YouTube show recently. Even the prominent social scientist Jonathan Haidt has criticised liberals as “IQ deniers”, who reject the truth of inherited IQ difference between groups because of a misguided commitment to the idea that social outcomes depend entirely on nurture, and are therefore mutable.

Defenders of race science claim they are simply describing the facts as they are – and the truth isn’t always comfortable. “We remain the same species, just as a poodle and a beagle are of the same species,” Sullivan wrote in 2013. “But poodles, in general, are smarter than beagles, and beagles have a much better sense of smell.”

The race “science” that has re-emerged into public discourse today – whether in the form of outright racism against black people, or supposedly friendlier claims of Ashkenazis’ superior intelligence – usually involves at least one of three claims, each of which has no grounding in scientific fact.

The first claim is that when white Europeans’ Cro-Magnon ancestors arrived on the continent 45,000 years ago, they faced more trying conditions than in Africa. Greater environmental challenges led to the evolution of higher intelligence. Faced with the icy climate of the north, Richard Lynn wrote in 2006, “less intelligent individuals and tribes would have died out, leaving as survivors the more intelligent”.

Set aside for a moment the fact that agriculture, towns and alphabets first emerged in Mesopotamia, a region not known for its cold spells. There is ample scientific evidence of modern intelligence in prehistoric sub-Saharan Africa. In the past 15 years, cave finds along the South African Indian Ocean coastline have shown that, between 70,000 and 100,000 years ago, biologically modern humans were carefully blending paint by mixing ochre with bone-marrow fat and charcoal, fashioning beads for self-adornment, and making fish hooks, arrows and other sophisticated tools, sometimes by heating them to 315C (600F). Those studying the evidence, such as the South African archaeologist Christopher Henshilwood, argue that these were intelligent, creative people – just like us. As he put it: “We’re pushing back the date of symbolic thinking in modern humans – far, far back.”

A 77,000-year-old piece of red ochre with a deliberately engraved design discovered at Blombos Cave, South Africa. Photograph: Anna Zieminski/AFP/Getty Images

A second plank of the race science case goes like this: human bodies continued to evolve, at least until recently – with different groups developing different skin colours, predispositions to certain diseases, and things such as lactose tolerance. So why wouldn’t human brains continue evolving, too?

The problem here is that race scientists are not comparing like with like. Most of these physical changes involve single gene mutations, which can spread throughout a population in a relatively short span of evolutionary time. By contrast, intelligence – even the rather specific version measured by IQ – involves a network of potentially thousands of genes, which probably takes at least 100 millennia to evolve appreciably.

Given that so many genes, operating in different parts of the brain, contribute in some way to intelligence, it is hardly surprising that there is scant evidence of cognitive advance, at least over the last 100,000 years. The American palaeoanthropologist Ian Tattersall, widely acknowledged as one of the world’s leading experts on Cro-Magnons, has said that long before humans left Africa for Asia and Europe, they had already reached the end of the evolutionary line in terms of brain power. “We don’t have the right conditions for any meaningful biological evolution of the species,” he told an interviewer in 2000.

In fact, when it comes to potential differences in intelligence between groups, one of the remarkable dimensions of the human genome is how little genetic variation there is. DNA research conducted in 1987 suggested a common, African ancestor for all humans alive today: “mitochondrial Eve”, who lived around 200,000 years ago. Because of this relatively recent (in evolutionary terms) common ancestry, human beings share a remarkably high proportion of their genes compared to other mammals. The single subspecies of chimpanzee that lives in central Africa, for example, has significantly more genetic variation than does the entire human race.

No one has successfully isolated any genes “for” intelligence at all, and claims in this direction have turned to dust when subjected to peer review. As the Edinburgh University cognitive ageing specialist Prof Ian Deary put it, “It is difficult to name even one gene that is reliably associated with normal intelligence in young, healthy adults.” Intelligence doesn’t come neatly packaged and labelled on any single strand of DNA.

Ultimately, race science depends on a third claim: that different IQ averages between population groups have a genetic basis. If this case falls, the whole edifice – from Ashkenazi exceptionalism to the supposed inevitability of black poverty – collapses with it.

Before we can properly assess these claims, it is worth looking at the history of IQ testing. The public perception of IQ tests is that they provide a measure of unchanging intelligence, but when we look deeper, a very different picture emerges. Alfred Binet, the modest Frenchman who invented IQ testing in 1904, knew that intelligence was too complex to be expressed in a single number. “Intellectual qualities … cannot be measured as linear surfaces are measured,” he insisted, adding that giving IQ too much significance “may give place to illusions.”

But Binet’s tests were embraced by Americans who assumed IQ was innate, and used it to inform immigration, segregationist and eugenic policies. Early IQ tests were packed with culturally loaded questions. (“The number of a Kaffir’s legs is: 2, 4, 6, 8?” was one of the questions in IQ tests given to US soldiers during the first world war.) Over time, the tests became less skewed and began to prove useful in measuring some forms of mental aptitude. But this tells us nothing about whether scores are mainly the product of genes or of environment. Further information is needed.

One way to test this hypothesis would be to see if you can increase IQ by learning. If so, this would show that education levels, which are purely environmental, affect the scores. It is now well-known that if you practise IQ tests your score will rise, but other forms of study can also help. In 2008, Swiss researchers recruited 70 students and had half of them practise a memory-based computer game. All 35 of these students saw their IQs increase, and those who practised daily, for the full 19 weeks of the trial, showed the most improvement.

Another way to establish the extent to which IQ is determined by nature rather than nurture would be to find identical twins separated at birth and subsequently raised in very different circumstances. But such cases are unusual, and some of the most influential research – such as the work of the 20th-century English psychologist Cyril Burt, who claimed to have shown that IQ was innate – has been dubious. (After Burt’s death, it was revealed that he had falsified much of his data.)

A genuine twin study was launched by the Minneapolis-based psychologist Thomas Bouchard in 1979, and although he was generously backed by the overtly racist Pioneer Fund, his results make interesting reading. He studied identical twins, who have the same genes, but who were separated close to birth. This allowed him to consider the different contributions that environment and biology played in their development. His idea was that if the twins emerged with the same traits despite being raised in different environments, the main explanation would be genetic.

The problem was that most of his identical twins were adopted into the same kinds of middle-class families. So it was hardly surprising that they ended up with similar IQs. In the relatively few cases where twins were adopted into families of different social classes and education levels, there ended up being huge disparities in IQ – in one case a 20-point gap; in another, 29 points, or the difference between “dullness” and “superior intelligence” in the parlance of some IQ classifications. In other words, where the environments differed substantially, nurture seems to have been a far more powerful influence than nature on IQ.

But what happens when you move from individuals to whole populations? Could nature still have a role in influencing IQ averages? Perhaps the most significant IQ researcher of the last half century is the New Zealander Jim Flynn. IQ tests are calibrated so that the average IQ of all test subjects at any particular time is 100. In the 1990s, Flynn discovered that each generation of IQ tests had to be more challenging if this average was to be maintained. Projecting back 100 years, he found that average IQ scores measured by current standards would be about 70.

Yet people have not changed genetically since then. Instead, Flynn noted, they have become more exposed to abstract logic, which is the sliver of intelligence that IQ tests measure. Some populations are more exposed to abstraction than others, which is why their average IQ scores differ. Flynn found that the different averages between populations were therefore entirely environmental.
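As a rough back-of-the-envelope illustration of that projection (the roughly three-points-per-decade rate below is an assumption commonly attached to the Flynn effect, not a figure given in this article):

$$\text{average IQ a century ago} \;\approx\; 100 \;-\; \underbrace{3\ \tfrac{\text{points}}{\text{decade}}}_{\text{assumed rate of rise}} \times 10\ \text{decades} \;=\; 70.$$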

This finding has been reinforced by the changes in average IQ scores observed in some populations. The most rapid has been among Kenyan children – a rise of 26.3 points in the 14 years between 1984 and 1998, according to one study. The reason has nothing to do with genes. Instead, researchers found that, in the course of half a generation, nutrition, health and parental literacy had improved.

So, what about the Ashkenazis? Since the 2005 University of Utah paper was published, DNA research by other scientists has shown that Ashkenazi Jews are far less genetically isolated than the paper argued. On the claims that Ashkenazi diseases were caused by rapid natural selection, further research has shown that they were caused by a random mutation. And there is no evidence that those carrying the gene variants for these diseases are any more or less intelligent than the rest of the community.

But it was on IQ that the paper’s case really floundered. Tests conducted in the first two decades of the 20th century routinely showed Ashkenazi Jewish Americans scoring below average. For example, the IQ tests conducted on American soldiers during the first world war found Nordics scoring well above Jews. Carl Brigham, the Princeton professor who analysed the exam data, wrote: “Our figures … would rather tend to disprove the popular belief that the Jew is highly intelligent”. And yet, by the second world war, Jewish IQ scores were above average.

A similar pattern could be seen from studies of two generations of Mizrahi Jewish children in Israel: the older generation had a mean IQ of 92.8, the younger of 101.3. And it wasn’t just a Jewish thing. Chinese Americans recorded average IQ scores of 97 in 1948, and 108.6 in 1990. And the gap between African Americans and white Americans narrowed by 5.5 points between 1972 and 2002.

No one could reasonably claim that there had been genetic changes in the Jewish, Chinese American or African American populations in a generation or two. After reading the University of Utah paper, Harry Ostrer, who headed New York University’s human genetics programme, took the opposite view to Steven Pinker: “It’s bad science – not because it’s provocative, but because it’s bad genetics and bad epidemiology.”

Ten years ago, our grasp of the actual science was firm enough for Craig Venter, the American biologist who led the private effort to decode the human genome, to respond to claims of a link between race and intelligence by declaring: “There is no basis in scientific fact or in the human genetic code for the notion that skin colour will be predictive of intelligence.”

Yet race science maintains its hold on the imagination of the right, and today’s rightwing activists have learned some important lessons from past controversies. Using YouTube in particular, they attack the left-liberal media and academic establishment for its unwillingness to engage with the “facts”, and then employ race science as a political battering ram to push forward their small-state, anti-welfare, anti-foreign-aid agenda.

These political goals have become ever more explicit. When interviewing Nicholas Wade, Stefan Molyneux argued that different social outcomes were the result of different innate IQs among the races – as he put it, high-IQ Ashkenazi Jews and low-IQ black people. Wade agreed, saying that the “role played by prejudice” in shaping black people’s social outcomes “is small and diminishing”, before condemning “wasted foreign aid” for African countries.

Similarly, when Sam Harris, in his podcast interview with Charles Murray, pointed out the troubling fact that The Bell Curve was beloved by white supremacists and asked what the purpose of exploring race-based differences in intelligence was, Murray didn’t miss a beat. Its use, Murray said, came in countering policies, such as affirmative action in education and employment, based on the premise that “everybody is equal above the neck … whether it’s men or women or whether it’s ethnicities”.

Race science isn’t going away any time soon. Its claims can only be countered by the slow, deliberate work of science and education. And they need to be – not only because of their potentially horrible human consequences, but because they are factually wrong. The problem is not, as the right would have it, that these ideas are under threat of censorship or stigmatisation because they are politically inconvenient. Race science is bad science. Or rather, it is not science at all.



Prevention

There are no vaccines available in the United States to protect against non-polio enteroviruses, which are the most common cause of viral meningitis. The best way to help protect yourself and others from non-polio enterovirus infections is to

  • Wash your hands often with soap and water for at least 20 seconds, especially after changing diapers or using the toilet
  • Avoid close contact, such as touching and shaking hands, with people who are sick
  • Clean and disinfect frequently touched surfaces
  • Stay home when you are sick and keep sick children out of school

Vaccines can protect against some diseases, such as measles, mumps, chickenpox, and influenza, which can lead to viral meningitis. Make sure you and your child are vaccinated on schedule.

Avoid bites from mosquitoes and other insects that carry diseases that can infect humans.

Control mice and rats. If you have a rodent in or around your home, follow appropriate cleaning and control precautions.


Treatment for Metastatic Cancer

There are treatments for most types of metastatic cancer. Often, the goal of treating metastatic cancer is to control it by stopping or slowing its growth. Some people can live for years with metastatic cancer that is well controlled. Other treatments may improve the quality of life by relieving symptoms. This type of care is called palliative care. It can be given at any point during treatment for cancer.

The treatment that you may have depends on your type of primary cancer, where it has spread, treatments you’ve had in the past, and your general health. To learn about treatment options, including clinical trials, find your type of cancer among the PDQ® Cancer Information Summaries for Adult Treatment and Pediatric Treatment.


The Top 10 Reasons to Major in Psychology

If you’ve studied psychology in college, or know someone who has, you’ve undoubtedly heard the claim that psychology majors can't get jobs. A recent investigation into this question, published in the report “Are There Too Many Psych Majors?”, reveals that this is not the case. The American Psychological Association and the Florida Psychology Department Chairs, responding to concerns about psychology being “too popular” as a major, prepared this white paper to examine the facts about the employability of psychology majors. The surprising results show that, in contrast to the view that it prepares students for very little of practical value, the undergraduate psychology major is one of the best choices a college student can make.

Before we answer the question of why psychology is such a sound choice for a major, let’s tackle the myths about the psychology major that many prospective students believe.

Based on the APA/Florida report, these are the four most common misbeliefs:

You can become a therapist with a bachelor’s degree. Although many students think they may be one or two courses away from being a “Dr. Phil,” the truth is that becoming a therapist does take further training. That said, you can pursue many jobs in the mental health field with four solid years of courses within the major, plus practicals or internships. However, it’s not true that…

You can’t get a job in an area of psychology with a bachelor’s degree. You can’t become a licensed psychologist unless you have graduate training plus additional hours of supervision, but the skills you gain as a psychology major translate well into many jobs, especially in entry-level positions. We’ll talk more about this shortly.

Psychology is an art, not a science. There may be applications of psychology to the arts, but the field is a science, one that is increasingly gaining recognition as a “STEM” (Science-Technology-Engineering-Mathematics) discipline.

Psychology is easy. Many people believe that psychology is nothing more than “common sense” and therefore is an easy field to master. In reality, the study of psychology involves rigorous training in topics ranging from statistics to neuroscience, and all psychology majors must complete a set of in-depth core courses based on this knowledge.

Now that we’ve laid these myths to rest, it’s time to examine the evidence showing the true advantages of majoring in psychology. Workforce analyses of psychology majors show that psychology graduates do get jobs using their degree. They may not be “psychologists” until they complete their graduate training, but they put their psychology skills to use. According to the report, over 40% of bachelor’s level psychology degree holders work in for-profit jobs in business and industry. The next largest group, between 20-30%, work in educational institutions. Approximately 15-20% work, respectively, in government and not-for-profit organizations. The remainder are self-employed. Although fewer than 25% of all psychology majors actually work in the field of psychology after graduation, they do qualify for entry-level positions in fields as diverse as marketing, sales, advertising, rehabilitation or psychiatric services, real estate, social work, child care, parole, and career counseling. According to the College Majors Handbook, and as reported in the white paper, the top 10 occupations that employ bachelor’s level psychology majors include management, sales, social work, personnel, health care, and financial specialists.

Unfortunately, those who work in psychology-related fields directly tend to earn less than those in other science disciplines. In 2006 terms, the median annual salary of bachelor’s level psychology majors was $30,000. In 2010, APA estimated that the starting salary for a psychology major is $36,400. Within related fields, the highest incomes are earned by psychology majors who become medical and health services managers, and the lowest by those who become preschool teachers and teachers' assistants. The median income is admittedly less than the $51,000 earned by a mechanical engineer, and less in fact than in other science-related disciplines. One reason for the lower earnings of psychology majors is not that the degree is worth any less, but that education and many social services fields are severely underfunded. If you want a job with a bachelor’s degree in teaching or human services, you will sacrifice your salary, no matter what your college major.

Fortunately, however, many psychology majors do go on to complete a graduate degree, which will benefit their career earnings potential. The white paper reports that 40% of psychology majors complete some form of graduate training, placing psychology among the highest of all undergraduate majors in post-graduate degree completion.

Psychology’s popularity has grown. In 2007, 90,000 of the over 1.5 million bachelor’s degrees awarded in the U.S. were claimed by psychology majors, nearly a doubling from the slightly over 50,000 psychology degrees earned in 1998.

Why do students want this major, even though the salary prospects in psychology itself are not particularly high? You guessed it. Psychology majors want to help people! As the white paper concluded, “Although the traditional introductory course design quickly disabuses students of the belief that psychology is solely about understanding and treating human abnormality, the breadth of introductory course content remains intrinsically interesting to most students” (p. 5). Apparently, we can't squelch the urge of psychology students to want to study psychology.

When all is said and done, psychology majors know a good thing when they see it. Psychology undergraduates gain valuable skills that give them an edge on the job market. The skills break down roughly to the types of courses that they take, which include rigorous training in statistics and research methods, neuroscience, individual and group processes, memory and learning, and psychopathology.

As outlined in the white paper, these are the 10 skills that psychology majors gain while earning their undergraduate degrees. They are better able to:

  1. predict and understand the behavior of individuals and groups
  2. understand how to use and interpret data
  3. evaluate the legitimacy of claims about behavior
  4. know how memory and learning function
  5. have insight into problematic behaviors
  6. demonstrate the capacity to adapt to change
  7. understand and operate effectively throughout the channels of an organization
  8. manage difficult situations and high stress environments
  9. start and carry out projects with limited information or experience
  10. show persistence in challenging circumstances

This is a formidable set of skills, and you might wonder how the average 22-year-old college graduate can carry these out so well. If you understand how the major works, the answer is not quite so mysterious.

Psychology is the science of behavior, and psychologists learn how to predict, understand, explain, and control behavior. Though not professional psychologists, undergraduates are taught how to look carefully at behavior and gain exposure to basic principles such as motivation, learning, thinking, sensation, and perception. The average college-level introductory psychology course surveys the field and provides students with the background to get them started in the major, no matter which specific area they want to pursue.

With regard to understanding and using data, the basic statistics courses we offer in the major expose students not only to such topics as probability theory, but also the use of statistical software packages that only 15 or 20 years ago required considerable programming mastery. Students can now understand the reasoning behind the statistics that they in fact compute on their own. Many students also complete their own independent research projects (under the advising of a professor or graduate student) or at least participate in a lab to get hands-on experience. This gives them an appreciation not only for how research gets done, but why research is needed to gain a scientific understanding of behavior.

The content areas in the undergraduate psychology major build on these basic scientific skills. The standards that accredited undergraduate programs adhere to ensure that students complete requirements that give them exposure to the major substantive areas, from neuroscience to social psychology. Many students also learn how to conduct their own independent library research. They can choose from thousands of research articles because online databases have become so sophisticated that within a few keystrokes, they can gain access to almost any article on almost any topic. Through required lab sections, students also learn the basic mechanics of collecting, analyzing, and writing up laboratory data.

Psychology majors also learn about the informal and formal workings of organizations through courses in organizational and social psychology, and by becoming involved in the activities of their own schools and communities. Many psychology students naturally gravitate toward campus organizations, community volunteer opportunities, and even involvement in the politics of their local and state governments.

In terms of managing stress and learning persistence, psychology students may not be all that different from their fellow students in other majors who must balance their school obligations with other responsibilities at home and at work. However, psychology students have an edge over their peers because the content of their courses often directly addresses the problems they confront in their daily lives. They are learning in the classroom about such topics as sleep deprivation, stress and coping, family relationships and the basis of addiction. Gaining mastery of the principles of memory, reinforcement, and behavior modification provides them with tools that they can use to manage their own academic and personal challenges.

Courses in developmental psychology, including the psychology of adolescence, help psychology majors learn about the development of their own identities. Gaining insight into identity development helps these young adults learn strategies to test their own values, priorities, and goals. Focusing their attention inward on their feelings and beliefs is a process that these development and adjustment courses foster, particularly when the courses allow students time for discussion and reflection.

What about the claim that psychology majors can start and execute projects with limited information or experience? Most undergraduates have limited information and experience. However, psychology majors learn the tools to cope with this situation because they learn about such organizational skills as time management and self-regulation (the ability to pace yourself). They also learn about motivation and managing emotions, which are two factors that contribute to the ability to get a job done even when the task is unclear at the outset.

Some potentially contradictory data about the benefits of the psych major came from a 2010 Wall Street Journal report. In a national survey, college alumni who graduated between 1999 and 2010 rated their satisfaction with their career path. Compared to the 54% of chemical engineers who said they were satisfied or very satisfied with their careers, only 26% of the over 10,000 former psychology majors felt positive about the direction their careers had taken them. The title of the article was “Psych Majors Aren’t Happy with Options.” However, the survey didn’t justify this conclusion. There was no direct evidence to show that the former psych majors felt that their college major was to blame for their lack of satisfaction. Had the question been worded differently, the surveyors might have directly assessed whether the alumni actually regretted their choice of major, rather than being dissatisfied with their careers more generally. We also don’t know if these students chose psychology because they couldn’t think of a better alternative, leading them to a less focused career path than students who majored in engineering, business, or computers.

The moral of the story is clear. Parents, guidance counselors, teachers, advisors, and most importantly, students, don’t fear the psych major. Call me biased, but it’s hard to imagine a field that is more intriguing and compelling. And according to the white paper, it’s also hard to imagine a field that gives you more valuable life skills.

Feel free to join my Facebook group, "Fulfillment at Any Age," to discuss today's blog, or to ask further questions about this posting.


Marine Biology

Marine biology is the study of marine organisms, their behaviors and their interactions with the environment. Because there are so many topics one could study within the field, many researchers select a particular interest and specialize in it. Specializations can be based on a particular species, organism, behavior, technique or ecosystem. For example, marine biologists may choose to study a single species of clams, or all clams that are native to a climate or region.

Some researchers get involved in a range of activities. Alex Almario, a laboratory and field operations technician profiled on this site, provides field support for scientists conducting estuarine research. He reports that his many duties include: "boat operation and maintenance; water quality data and sample collection; wetlands, mangrove, seagrass and coral research; scuba surface support and diving; fieldwork and wet lab ecotoxicology support."

One area of specialization, the field of marine biotechnology, offers great opportunity for marine biologists. Marine biotechnology research presents a wide range of possibilities and applications. One focus area is the biomedical field, where scientists develop and test drugs, many of which come from marine organisms.

Molecular biology is a related area of specialization in this field. Researchers apply molecular approaches and techniques to many environments, from coastal ponds to the deep sea, and many different organisms, from microscopic bacteria, plants, and animals to marine mammals. For example, molecular biology can be used to identify the presence of a specific organism in a water sample through the use of molecular probes. This is very useful when the organism in question is microscopic or similar to other organisms.

Aquaculture, the farming of finfish, shellfish and seaweeds, is another field that has been aided by marine biotechnology and molecular techniques. Aquaculture is gaining importance in this country as consumer demand for fish and shellfish becomes greater than can be met by traditional commercial fishing. At the same time, technological advances have made aquaculture more economically feasible.

Other popular areas within the field of marine biology are environmental biology and toxicology. Both of these areas have direct applications and implications for our society. Examples of specialities in environmental biology and toxicology include water quality research and the study of contaminants or pollutants in the coastal or marine environment. Laws, regulations and cleanup measures designed to protect the environment will ensure that marine and environmental biologists and consultants continue to play an important role in our society.

Another field of research within marine or aquatic biology involves organisms that have been around for billions of years: protists. Protists are single-celled organisms that include protozoa and microalgae. Their importance as a group lies in the fact that microscopic algae serve as food for animals in aquatic food webs, earning them the title "primary producers." And since primary producers are mostly microscopic species, the organisms that consume them are often single-celled, microscopic species as well. If something happens to somehow alter populations of primary producers, the entire food web could be affected.

Probably the topic most often asked about within marine biology is research involving marine mammals, including cetaceans (whales and dolphins) and pinnipeds (sea lions, seals and walruses). The reality is that research jobs involving marine mammals are extremely hard to come by for a number of reasons, including the popularity of the field, the fact that working with marine mammals is highly regulated (most research is done using tissue samples of sick, stranded or dead animals and not on live, healthy animals), and because funding is very competitive.

Two popular fields of research involving marine mammals are bioacoustics and vocalization (the study of marine mammal sounds) and population dynamics (studying marine mammalian behaviors and responses to environmental conditions as they impact population). As for non-research employment options involving marine mammals, most positions would exist at aquaria, museums, and national and international conservation groups, though these are also highly competitive.



Top Edema Related Articles

Cirrhosis (Liver)

Symptoms include yellowing of the skin (jaundice), itching, and fatigue.

The prognosis is good for some people with cirrhosis of the liver, and survival can be up to 12 years; however, the life expectancy is about 6 months to 2 years for people with severe cirrhosis with major complications.

High Blood Pressure (Hypertension)

High blood pressure (hypertension) is a disease in which pressure within the arteries of the body is elevated. About 75 million people in the US have hypertension (1 in 3 adults), and only half of them are able to manage it. Many people do not know that they have high blood pressure because it often has no warning signs or symptoms.

Systolic and diastolic are the two readings in which blood pressure is measured. The American College of Cardiology released new guidelines for high blood pressure in 2017. The guidelines now state that normal blood pressure is 120/80 mmHg. If either one of those numbers is higher, you have high blood pressure.

The American Academy of Cardiology defines high blood pressure slightly differently. The AAC considers 130/80 mmHg or greater (either number) stage 1 hypertension. Stage 2 hypertension is considered 140/90 mmHg or greater.

If you have high blood pressure, you are at risk of developing life-threatening diseases like stroke and heart attack.


What Are the Most Common F1 Visa Interview Questions?

The consular officers usually ask similar questions to every F1 visa candidate. This is in your favor since it helps you prepare in advance. Usually, the interviewer asks you questions related to your:

  • study plans
  • university choice
  • academic capability
  • financial status
  • post-graduation plans

The most common F1 visa interview questions are the following:

Why are you going to the United States? What will you specialize in for your degree? What will be your major?

The interviewer will ask you these questions one by one. This is just a ‘warm-up’ for the questions to come. You should tell him/her that you have been admitted to an educational institution in the United States. Do not talk too much. Give short (but not very short) answers, and try not to ramble, since the consular officer will not like that.

Where do you go to school now? What do you do for a living?

The interviewer wants to know why you are not joining the workforce but instead wish to continue your studies.

Questions like these enable the interviewer to learn more about you and your character before moving on to the topics he or she really wants to know about.

Why are you planning to continue your education? Can you not continue your education in your home country? Why choose the United States of America? Why not choose Canada or Australia?

He/she will ask about your choice of the US as a study destination instead of another country. Try to give more specific answers.

Avoid answers such as “the US is a powerful nation” or “because it has a strong, developed economy”; such clichéd answers will make the interviewer think that you admire the United States so much that you wish to live there even after the completion of your studies. Instead, try to talk more about the university/college you will be attending. You can mention professors who lecture at that institution and are well known in their field, and you can also highlight notable features such as its world ranking, research facilities, faculty profile, and alumni profile.

How many colleges did you apply to? How many schools did you get admitted to? How many schools rejected you?

The consular officer wants to shed light on your qualifications as a student and future professional. Keep in mind that students admitted to higher-caliber universities have better chances of getting a visa. However, you should be honest about how many colleges rejected you before you were admitted to this one. If you lie, the interviewer can easily find out, which may lead to the rejection of your visa application.

Do you know your professors at that university? What are their names? In what city is your school located?

If you know very little about the university you have been admitted to, it would be better to do some research before you attend your visa interview. The interviewer may ask you about the names of professors or other people in charge of the university. Take care to read about the most notable professors at the university, so you can mention their names along with any prize they have won, a book they have published, or any other achievement of theirs.

The consular officer might also mention some notable alumni to you, if they know any, or ask whether you know of any notable alumni of the university you have been admitted to. These questions simply check whether you are really interested in getting a proper education or are just using your studies as a way to enter and remain in the US.

Have you been to the United States before?

Answer honestly. Explain the reasons you have visited the United States before, e.g., tourism, training, or medical reasons. If you have never been to the United States, you can also say that this is not because you did not want to, but because you did not have the chance. Give the consular officer the impression that even if you do not get the chance to study there, you would still like to visit the country as a tourist.

What are your test scores (GRE, GMAT, SAT, TOEFL, IELTS)? What was your previous GPA?

Even if your university has admitted you, the consular officer will still want to know your likelihood of success at university.

How do you plan to fund the entire duration of your education?

With these questions, the interviewer wants to discover how you plan to fund your stay in the United States. If you have enough savings for the entire period you will be in the United States, present that to the consular officer. Otherwise, if you have sponsors such as parents, cousins, or a partner, you will have to show how they will fund your stay in the United States and that they are able to do so. If you have won a scholarship, present documents that prove it.

How much does your school cost? How will you meet these expenses?

Tell the consular officer how much your school costs and how much you will have to pay for accommodation and other expenses. Tell him/her how much money you will be receiving each month and try to show that it will be enough to cover your studies. Even if you are planning to take an on-campus student job, it is better not to mention it, because this could lead the interviewer to think you might become a burden on United States public funds.

Healthcare expenses in the United States may be unaffordable for many international students. Treating a broken leg or arm can cost around $2,500, while a stay in a US hospital may cost over $10,000 on average.

Although it is not a requirement and the interviewer may not ask you about health insurance, you could provide proof of health insurance to reassure your interviewer about your financial means during your time in the United States.

What is your sponsor’s occupation?

They want to know if your sponsor is really capable of covering your expenses.

Do you have any brothers or sisters?

If your parents will be your sponsors, the interviewer wants to know whether they are able to support you, or whether they will also have to support other people financially.

Have you got any loans? How do you plan on repaying your loan?

If you do not have any loans, simply say that you do not. Otherwise, honestly tell the interviewer the amount of the loan you have taken out and where you received it from.

You can also say that you will be able to find a good job in your home country upon graduation and repay the loan from there. Do not suggest by any means that you would pay off the loan by taking up odd jobs in the US.

Will you come back home during vacations/holidays?

Again, the visa officer wants to know about your ties to your home country and your family. Tell them that you will be going back home during the holidays to see family and friends, even if you do not plan to. If you plan to stay in the United States during summer or winter holidays and work, do not tell that to the interviewer; he/she will get the impression that you are going to the United States to earn money and that you might stay there even after the completion of your studies.

Do you have relatives or friends currently in the US?

Answer honestly. Even if you only have distant relatives whom you see every three or four years, tell the consular officer about them. The same applies if you have a friend you have only met once or twice; mention them as well.

What are your plans post-graduation? Do you have a job or career in mind after you graduate?

Since the F1 visa is a non-immigrant visa, you will have to convince the consular officer that you do not plan to remain in the US but rather intend to return to your home country. If you tell him/her more about what you plan to do after graduation, you will most likely convince him/her that you have no intention of staying in the US.

Do you plan on returning to your home country? Are you sure you won't stay in the US? Will you continue to work for your current employer after you graduate?

Try to convey to the interviewer that you have strong ties to your home country and that you will certainly return. Tell them about your family, closest friends, or a partner in your home country, if you really have them. If you have a pet, tell him/her about that too. Mention any property, business, organization, etc., that you have and because of which you will return.

Why should you be given a student visa?

This is the very last question you will be asked. Try to put forward a strong case for why you should be issued a visa, and be confident. Once again, do not ramble. Even while answering this question, try to convince the interviewer that you have no plans to remain in the United States and that you will return to your home country for sure.
