A flawed system for judging research is leading to academic fraud
DISGUISED as employees of a gas company, a team of policemen
burst into a flat in Beijing on September 1st. Two suspects inside panicked and
tossed a plastic bag full of money out of a 15th-floor window. Red hundred-yuan
notes worth as much as $50,000 fluttered to the pavement below.
Money raining down on pedestrians was not as bizarre,
however, as the racket behind it. China is known for its pirated DVDs and fake
designer gear, but these criminals were producing something more intellectual:
fake scholarly articles which they sold to academics, and counterfeit versions
of existing medical journals in which they sold publication slots.
As China tries to take its seat at the top table of global
academia, the criminal underworld has seized on a flaw in its research
system: the fact that research grants and promotions are awarded on the basis
of the number of articles published, not on the quality of the original
research. This has fostered an industry of plagiarism, invented research and
fake journals that Wuhan University estimated in 2009 was worth $150m, a
fivefold increase on just two years earlier.
Chinese scientists are still rewarded for doing good
research, and the number of high-quality researchers is increasing. Scientists all
round the world also commit fraud. But the Chinese evaluation system is
particularly susceptible to it.
By volume the output of Chinese science is impressive.
Mainland Chinese researchers have published a steadily increasing share of scientific
papers in journals included in the prestigious Science Citation Index
(SCI—maintained by Thomson Reuters, a publisher). Their share grew from
almost nothing in 2001 to 9.5% in 2011, second in the world only to America,
according to a report published by the Institute of Scientific and Technical
Information of China. From 2002 to 2012, more than 1m Chinese papers were
published in SCI journals; they ranked sixth for the number of times cited by
others. Nature, a science journal, reported that in 2012 the number of
papers from China in the journal’s 18 affiliated research publications rose by
35% from 2011. The journal said this “adds to the growing body of evidence that
China is fast becoming a global leader in scientific publishing and scientific research”.
In 2010, however, Nature had also noted rising
concerns about fraud in Chinese research, reporting that in one Chinese
government survey, a third of more than 6,000 scientific researchers at six
leading institutions admitted to plagiarism, falsification or fabrication. The
details of the survey have not been publicly released, making it difficult to
compare the results fairly with Western surveys, which have also found that
one-third of scientists admit to dishonesty under the broadest definition, but
that a far smaller percentage (2% on average) admit to having fabricated or
falsified research results.
In 2012 Proceedings of the National Academy of Sciences, an
American journal, published a study of retractions by nation of
origin. Its authors found that among medical-journal articles in
PubMed, an American database maintained by the National Institutes of Health,
there were more retractions due to plagiarism from China and India together
than from America (which produced the most papers by far, and so the most
cheating overall). The study also found that papers from China led the world in
retractions due to duplication—the same papers being published in multiple
journals. On retractions due to fraud, China ranked fourth, behind America,
Germany and Japan.
“Stupid Chinese Idea”
Chinese scientists have urged their comrades to live up to
the nation’s great history. “Academic corruption is gradually eroding the
marvellous and well-established culture that our ancestors left for us 5,000
years ago,” wrote Lin Songqing of the Chinese Academy of Sciences, in an
article this year in Learned Publishing, a British-based journal.
In the 1980s, when China was only beginning to reinvest in
science, amassing publishing credits seemed a good way to use non-political
criteria for evaluating researchers. But today the statistics-driven standards
for promotion (even when they are not handed out merely on the basis of
personal connections) are as problematic as in the rest of the bureaucracy.
Xiong Bingqi of the 21st Century Education Research Institute calls it the
“GDPism of education”. Local government officials stand out with good
statistics, says Mr Xiong. “It is the same with universities.”
The most valuable statistic a scientist can tally up is SCI
journal credits, especially in journals with higher “impact
factors”—ones that are cited more frequently in other scholars’ papers.
SCI credits and impact factors are used to judge candidates for doctorates,
promotions, research grants and pay bonuses. Some ambitious professors amass
SCI credits at an astounding pace. Mr Lin writes that a professor at Ningbo
university, in eastern China, published 82 such papers in a three-year span.
A hint of the relative weakness of these papers is found in the fact that China
ranks just 14th in average citations per SCI paper, suggesting that many
Chinese papers are rarely quoted by other scholars.
The quality of research is not always an issue for those
evaluating promotions and grants. Some administrators are unqualified to
evaluate research, Chinese scientists say, either because they are bureaucrats
or because they were promoted using the same criteria themselves. In addition,
the administrators’ institutions are evaluated on their publication rankings,
so university presidents and department heads place a priority on publishing,
especially for SCI credits. This dynamic has led some in science circles to
joke that SCI stands for “Stupid Chinese Idea”.
Crystal unclear
The warped incentive system has created some big
embarrassments. In 2009 Acta Crystallographica Section E, a British
journal on crystallography, was forced to retract 70 papers co-authored by two
researchers at Jinggangshan university in southern China, because they had
fabricated evidence described in the papers. After the retractions the Lancet,
a British journal, published a broadside urging China to take more action to
prevent fraud. But many cases, when detected, are covered up to protect
the institutions involved.
The pirated medical-journal racket broken up in Beijing
shows that there is a well-developed market for publication beyond the
authentic SCI journals. The cost of placing an article in one of the
counterfeit journals was up to $650, police said. Purchasing a fake article
cost up to $250. Police said the racket had earned several million yuan
($500,000 or more) since 2009. Customers were typically medical researchers
angling for promotion.
Some government officials want to buy their way to academic
stardom as well: at his trial this month for corruption, Zhang Shuguang, a
former railway-ministry official, admitted to having spent nearly half of $7.8m
in bribes that he had collected trying to get himself elected to the Chinese
Academy of Sciences. Chinese reports speculated that he spent the money buying
votes and hiring teams of writers to produce books. Widely considered to be a
man of limited academic achievement, Mr Zhang ultimately fell just one vote
short of election. Less than two years later, he was in custody.

The Economist