How to Lie with Statistics is the title of a classic book by Darrell Huff, published in 1954, which described how marketers, politicians, and others can manipulate statistics in order to mislead the public.
At the beginning of the Vallely report, in a section titled "Dimensions of the Crisis", the authors try to make the case that higher education and scientific research in Vietnam are in a terrible state, and that "it is difficult to overstate the seriousness of the challenges" they face. Their Table 1 purports to give the number of publications in peer-reviewed journals in 2007 in various countries and institutions in East Asia. According to the Vallely report, the data show that Vietnamese institutions "lag far behind even their undistinguished Southeast Asian neighbors" (which they identify as Thailand, the Philippines, and Indonesia).
Table 1 claims that in 2007 the total number of peer-reviewed publications by all researchers at the Vietnam Academy of Science and Technology (VAST) was only 44, while the figure it gives for the University of the Philippines is five times that number. If these numbers were correct, they would support Vallely's claim of a dismal state of scientific research in Vietnam, even in comparison with the Philippines.
However, someone with a little knowledge of science in Vietnam can immediately see that something is wrong with these numbers. VAST consists of 30 research institutes, one of which is the Hanoi Mathematical Institute, founded in 1970. Since the 1970s my closest friends and colleagues in Vietnam have been researchers at that Institute. In 2000 I was given a book detailing the achievements of the Institute's first thirty years. The list of publications by its members runs to 128 pages -- well over 2000 items, most in peer-reviewed journals. Over those thirty years that works out to an annual average of more than 50, and that average includes the early years, when the Institute was much smaller and far more isolated from the West than it is now.
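For a rough sense of scale, here is a back-of-the-envelope comparison in Python using the round figures just quoted; the inputs are approximations taken from the Institute's publication list as described above, not SCI counts.

```python
# Rough back-of-the-envelope check using the round figures quoted above;
# the inputs are approximations from the text, not SCI data.

institute_publications = 2000   # "well over 2000 items" in the Institute's first 30 years
years = 30                      # 1970 through 2000
vast_total_per_table_1 = 44     # Table 1's 2007 figure for ALL of VAST

average_per_year = institute_publications / years
print(f"Hanoi Math Institute, rough average: {average_per_year:.0f} papers/year")
print(f"Table 1's figure for all 30 VAST institutes combined: {vast_total_per_table_1}")
# A single institute's historical average already exceeds the figure that
# Table 1 assigns to the entire Academy for 2007.
```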
I emailed the director of the Hanoi Math Institute, Ngo Viet Trung, who confirmed my suspicions about Vallely's Table 1. He said that over the last ten years the average annual number of publications by Institute members in peer-reviewed journals has been about 70; the figures for the last three years are 62 in 2007, 77 in 2008, and 73 in 2009. According to N.V. Trung, the reason for the error in Table 1 is that Vallely's staff searched the Science Citation Index (SCI) for articles that gave VAST as the author's affiliation. However, most researchers at the Math Institute list only the Math Institute, and not the Vietnam Academy, as their place of work. So almost all of their publications (and similarly those of most of their colleagues at the other 29 institutes of VAST) were missed in Table 1. Trung said that some people at VAST had complained to one of the Vallely report's authors about this. But of course Vallely has not publicly apologized for his insulting comments about Vietnamese science, which were based in part on a flawed search of the SCI.
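To see concretely how an affiliation-keyed search produces this kind of undercount, here is a small illustrative sketch in Python. The affiliation strings and the matching rule are invented for the example; they are not the actual SCI records, and the sketch does not reproduce whatever query interface Vallely's staff used.

```python
# Illustrative only: invented affiliation strings, not real SCI records.
# The point: counting papers by the parent academy's name misses papers
# whose authors list only their individual institute.

records = [
    "Institute of Mathematics, Hanoi, Vietnam",
    "Institute of Physics, Hanoi, Vietnam",
    "Vietnam Academy of Science and Technology, Hanoi, Vietnam",
    "Inst. of Mathematics, Vietnam Academy of Science and Technology, Hanoi",
]

def count_by_label(affiliations, label):
    """Count records whose affiliation string contains the given label."""
    return sum(label.lower() in a.lower() for a in affiliations)

# Keying the search to the parent academy's name finds only 2 of the 4
# records, even though all 4 papers come from institutes of the academy.
print(count_by_label(records, "Vietnam Academy of Science and Technology"))  # 2
print(len(records))                                                          # 4
```

In other words, when researchers name only their own institute, a count keyed to the Academy's name systematically understates the Academy's output.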
Perhaps it is unfair for me to title this section "How to Lie with Statistics", since that implies that Vallely lied -- that is, that he knew that his table's statistics were wrong. More likely he genuinely believed that the ridiculously low number he gave for VAST's publications was correct. Because of his own lack of contact with the scientific community in Vietnam -- and because of his preconceived notions about Vietnam's backwardness -- Vallely evidently saw no reason to question the methodology used to construct Table 1. What explains the report's use of dramatically false data is probably not deliberate deception, but willful blindness and bias.
There is a long history of Eurocentric bias in evaluating the accomplishments of people in Third World countries. Since the days of the European empires, representatives of the wealthy countries have ignored or denigrated native knowledge and institutions. In that way they could justify their own power and domination over those countries. The iconic representation of this attitude was in Rudyard Kipling's 1899 poem The White Man's Burden. It seems to me that the attitudes toward Vietnamese intellectuals and institutions shown by the "experts" from Fulbright, Harvard, The New School, and the World Bank are a 21st century version of the imperialist arrogance expressed in Kipling's poem.
Even well-intentioned people can easily be led astray by cross-national comparisons of publication counts. Let us look, for example, at Vietnam and the Philippines. There are many reasons why the level of research by Vietnamese scientists could be higher than that of their Filipino counterparts even if the Science Citation Index does not show it. The Philippines has been part of the U.S. colonial and (later) neocolonial system for over a century. This has many consequences, of which I will mention two that have a major impact on SCI data:
Filipino and Japanese scientists tend to develop strong collaborative ties, especially among Filipino scientists who were trained in Japanese universities. This is not usually the case for Filipino scientists trained in the U.S. or Australia.... For Australian and U.S. scientists, there seem to be fewer real gains in collaborating with Filipino scientists.... One reason for the close collaboration between Filipino and Japanese scientists revealed to us during our exploratory visit in 2004 was the relatively higher level of English-language proficiency of Filipino students compared to their Japanese professors, which is seen by the latter as a valuable human capital resource when publishing in international scholarly journals, which are typically in English.
Thus, because of Vietnam's relative isolation from the West (until recently) and because of low levels of English proficiency, Vietnam could have SCI numbers similar to or even lower than those of the Philippines even if Vietnamese scientists are doing more innovative research than their Filipino colleagues.