Op-ed on Open Data to Fix Academic Fraud in SCMP

Last week the Open Science Working Group of ODHK had an op-ed in the South China Morning Post (SCMP) discussing the fight against academic fraud through the use of Open Data. This is a topical issue at the moment, with recent scandals implicating many academics in Mainland China in large-scale peer-review fraud, as covered in the Washington Post. With the kind permission of SCMP we are posting an updated and extended version of the piece here, and, being good Open Data purists, we include links to much of the source material discussed.

The scandal of scientific impact
The idealized view of science as the curiosity-driven pursuit of knowledge to understand and improve the world around us has been tarnished by recent news of systematic fraud and mass retraction of research papers from the Chinese academic system, and by allegations of attempts to game the peer-review system on an industrial scale. With much of our R&D funded through government, we all hope our tax dollars are spent as wisely as possible, and around the world research funders have developed methods of assessing the quality of their funded researchers' work. One of the most widely used metrics for assessing researchers is the Journal Impact Factor (JIF), a (proprietary, closed-access) service run by Thomson Reuters that ranks the academic journals scientists publish in to gain credit. While many countries have tried to broaden their assessment systems to take into account a more balanced view of a researcher's impact, in China the number of publications in JIF-ranked journals is currently the only measure researchers are judged by, and huge amounts of money change hands through this system (often hundreds of thousands of RMB for a single publication in the top-ranked journals).

This biased focus on one metric above all others has directly led to large-scale gaming of the system and a black market of plagiarism, invented research and fake journals. Following on from previous exposés of an "academic bazaar" where authorship on highly ranked papers can be bought, Scientific American in December uncovered a wider and more systematic network of Chinese "paper mills" producing ghostwritten papers and grant applications to order, linked to hacking of the peer-review system that is supposed to protect the quality and integrity of research. The first major fall-out from this came last month, when the publisher BioMed Central (BMC) retracted 43 papers for peer-review fraud, the biggest mass retraction carried out for this reason to date, increasing the total number of papers ever retracted for this reason by over a quarter. Many other major publishers have been implicated, with the publisher of the world's largest journal, PLOS, also issuing a statement that they are investigating linked submissions. It takes a great deal of time and effort, employing Chinese-speaking editorial teams, to investigate and contact all of the researchers and institutions implicated, and BMC should be applauded for doing this and correcting the scientific record so quickly [COI declaration: Scott Edmunds is an ex-employee of BMC, and he and Rob Davidson are collaborating with them through GigaScience Journal].

To get an idea of the types of research uncovered and implicated, it is possible to see the papers retracted last month, and Retraction Watch has covered the story in detail. The Committee on Publication Ethics has also issued a statement. Guillaume Filion has done some sterling detective work on his blog, providing insight into the types of papers written by these "paper mills" and into the companies still advertising "guaranteed publication in JIF journal" services. The production-line explosion of medical meta-analysis publications coming from China has been well known for a number of years, but the list of publications retracted by BMC in March shows a worrying spread into other research types such as network analysis.

As in J. B. Priestley's famous morality tale, An Inspector Calls, the evil here comes from the actions or inactions of everyone. On top of the need for better policing by publishers, funders and research institutions, there need to be fundamental changes to how we carry out research. Without a robust response and fundamental changes to its academic incentive systems there could be long-term consequences for Chinese science, with the danger that this loss of trust will lead to fewer opportunities to collaborate with institutions abroad, and potentially to such skepticism that people will stop using research from China.

While we are rightly proud of Hong Kong's highly regarded and highly ranked university system (with three universities in the world's top 50), we are not immune to the same pressures. While funders in Europe have moved away from using citation-based metrics such as the JIF in their research assessments, the Hong Kong University Grants Committee states in its Research Assessment Exercise guidelines that it may use them informally. In practice some of the universities do pay bonuses tied to the impact factor of the journals their researchers publish in, creating the same temptations and skewed incentives that have led to these corrupt practices in China. Fortunately, on this occasion no Hong Kong-based researchers appear on the list of retracted papers. But with our local institutions increasing their ties across the Pearl River through new joint research institutes and hospitals, and with these scandals likely to run and run, how much longer our universities can remain unblemished is an open question.

Can We Fix it? Yes We Can!
If the impact factor system is so problematic, what are the alternatives? Different fields have different types of outputs, but there are factors that should obviously be taken into account, such as the quality of teaching and the number of students going on to do bigger and better things. Impact can be about changing policy, producing open software or data that other research can build upon, or stimulating public interest and engagement through coverage in the media. Many of these measures can also be gamed, but a broader range of "alternative metrics" should be harder to manipulate. China is overtaking the US to become the biggest producer of published research, but ranks only ninth in citations, so there obviously needs to be a greater focus on quality rather than quantity.

The present lack of research data sharing has led to what is being called a 'reproducibility crisis', partly fuelled by fraudulent activity but very often caused by simple error. This has led some people to estimate that as much as 85% of research resources (funding, man-hours, etc.) are wasted. Science is often lauded as a worthy investment for any government because the return to the economy exceeds the money put in. What benefits could be gained if there were an 85% improvement in that return? How many more startups and innovative technologies could be produced if research was actually re-usable?

There is a growing movement among funders across the world to encourage and enforce data management and access, and we at Open Data Hong Kong are cataloguing the policies and experiences of Hong Kong's research institutions. Sadly, at this stage we seem to be far behind other countries, currently ranking 58th in the global Open Data Index (having just fallen from 54th earlier in the year). One of the main benefits of open data is transparency, which would have made the current peer-review scandal much harder to carry out. It is encouraging that the Hong Kong Government is already promoting the release of public sector data through the newly launched Data.Gov.HK portal, but it is clear that our research data needs to be treated the same way. ODHK is the first organization in Hong Kong (and the 555th overall) to sign the San Francisco Declaration on Research Assessment, which seeks to eliminate the use of journal-based metrics. To help change these skewed incentive systems, we encourage others to join us by signing at: http://am.ascb.org/dora/
Scott Edmunds, Rob Davidson and Waltraut Ritter; Open Data Hong Kong.
Naubahar Sharif; HKUST.

See SCMP for the published version of the op-ed here: http://www.scmp.com/comment/insight-opinion/article/1758662/china-must-restructure-its-academic-incentives-curb-research