license

Creative Commons License
Where the stuff on this blog is something I created, it is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License, so there are no requirements to attribute - but if you want to mention me as the source, that would be nice :¬)

Tuesday 7 April 2015

The 10 problems which mean the process to allocate scientific funding is broken (ht @LSEImpactblog )

This post summarises the arguments put forward in Mathias Binswanger's chapter "Excellence by Nonsense: The Competition for Publications in Modern Science" in Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing, edited by Sönke Bartling and Sascha Friesike.

Obviously, in creating this summary I have used some text from the original document and altered other text as I added my own understanding of the arguments.

The full citation is at the bottom of this article along with links to other posts you may find useful. 

Hat tip to a tweet from @LSEImpactblog for making me aware of the publication.






Summary

1) The peer review process for scientific articles is being "gamed" by reviewers and authors and has itself created some perverse incentives.

2) The ranking of an institution's or university's excellence (based on the number and quality of articles published and cited, AND third-party funded projects, AND networks with other institutions or universities) is also being "gamed".

3) The competitive system that allocates scientific funding - based on the ranking described in 2) and quality-controlled by the process described in 1) - is also being "gamed", and so does not lead to the best allocation of funds to create real scientific progress.

4) The bureaucracy of this process is crowding out unconventional people who do not perform well in standardized competitions. 

5) The process often rejects ideas that later turn out to be scientific breakthroughs whilst rarely discovering plagiarism, fraud and deception.

6) It is also probably crowding out true quality and knowledge creation, as it diverts more of scientists' time away from research and into reporting.

7) In the past, researchers who had nothing to say at least did not publish. Now they do, and non-performance has been replaced by the performance of nonsense.

8) All of this makes it increasingly difficult to find the truly interesting research in the mass of published articles.

9) Indeed, the system is working against real scientific progress. Some disciplines have degenerated into a kind of theology where heresy is no longer tolerated in established scientific journals.

10) It is doubtful whether Albert Einstein would be able to pursue a scientific career under the current system.




And in a bit more detail ...



Why scientists need to get published


1) To access funds, universities compete with each other for high rankings in terms of education and scientific research.


2) The way to impress those who distribute the funds is by increasing measurable output such as:

- number of articles published;

- number of times the article is cited in other articles (and the "quality" of such citations as determined by citation analysis);

- number of projects funded by third-party funds;

- number of networks with other institutes and universities.


3) The current system tries to ensure quality in articles by peer reviewing work before it is allowed to be published in professional journals.

(In peer review, journals give submitted manuscripts to one or several professors or other distinguished scientists - the so-called peers - who ideally work in the same field as the author and therefore should be able to assess the work's quality. In theory, reviewers do not know who the author of the article is, nor do the authors know who the reviewers are. At the end of the peer review process, reviewers inform the editor in writing whether they recommend acceptance [very rare], revision, or rejection [most common] of the article submitted to the journal in question. Top journals often pride themselves on high rejection rates - approximately 95% - which is seen as a reflection of the journal's quality.)



The 7 realities of reviewers in scientific peer review 


Experts who peer review are busy human beings, and therefore their reviews:


1) are sometimes ghost written by their assistants;

2) are highly subjective (since agreement between several expert judgements is usually low);

3) sometimes reject articles out of personal grudges ("they rejected my article, so....");

4) do reject articles that later turn out to be scientific breakthroughs;

5) rarely discover plagiarism, fraud and deception;

6) favour articles that are in accordance with their own work;

7) reject articles that contradict their own work.



The 12 behaviours this encourages in scientists who need to get published 


Given the points above, potential authors seeking to get their articles published will often:


1) strategically cite and praise already-published articles on the same or similar topics, which are likely to have been written by those doing the reviewing. (The critical debate on the peer review process in the journal Nature in 2007 clearly showed that, in practice, anonymity in the review process is rare for established scientists);

2) avoid criticizing the work of possible reviewers and those eminent in their field;

3) reveal successful tests and conceal negative results;

4) present simple ideas in complex ways to demonstrate the author's technical expertise and signal importance to the reader;

5) change the article according to the wishes of the reviewers;

6) cut up ideas into the smallest publishable units possible (as each can be the subject of a separate article, thus maximising the number of articles that can be put forward for publication);

7) try to get the same results published twice or even more often than that;

8) ensure they are included as honorary authors on all articles from their research team (if they have the power in the academic hierarchy to do so), as this increases the number of articles they can get published. Interestingly, actual research today rests largely on the shoulders of assistants and graduate students, whose low hourly compensation still allows them to improve scientific knowledge. In contrast, the opportunity costs of doing research are often too high for professors and research leaders, because they can contribute more to the measurable output of their institution by focusing on the organization and management of project acquisitions and publications. Yet thanks to honorary authorship, the publication lists of professors and research leaders still grow despite their lack of continued research;

9) produce articles with several other authors, as this approach - like 8) above - increases the number of articles put forward for publication and often also increases the number of citations of those articles. (The more authors an article has, the more the participating authors will cite it again, especially when they are involved as co-authors in other articles);

10) specialize in a sub-division of a research discipline understood only by very few insiders, and establish a scientific journal for this topic (to get published);

11) create large cooperative and long-range projects with a network of as many research partners as possible, bringing third-party funds to their institution. (Large anonymous institutions like the EU give money to other large anonymous institutions - e.g. an excellence cluster - where the individual researcher is crowded out and becomes a small wheel in a big research machine);

12) focus on application-oriented, "useful" research rather than "useless" basic research. (The competition for third-party funded projects drives scientists towards application-oriented research, as this rapidly leads to marketable innovations. In this way, both the humanities and basic research are gradually crowded out, because in these disciplines immediate usability can hardly be shown or postulated.)



4 consequences of all of this


1) The competitive system initiated to best allocate scientific funding has led to a steadily increasing number of published articles in scientific journals. But this often does not lead to new or original scientific insights that create real scientific progress.


2) Indeed, some disciplines (e.g. economics) have degenerated into a kind of theology where heresy is no longer tolerated in established scientific journals. Heresy takes place in books, working papers and a few marginal journals specializing in divergent theories, but these publications rarely contribute to a scientist's reputation.


3) In the past, researchers who had nothing to say at least did not publish. However, the competitive system forces even uninspired and mediocre scientists to publish all the time. Non-performance has been replaced by the performance of nonsense, and this makes it increasingly difficult to find the truly interesting research in the mass of insignificant publications.


4) The new bureaucracy and its competitive approach are crowding out unconventional people who do not perform well in standardized competitions. It may also be crowding out true quality and knowledge creation, as it diverts more of scientists' time away from research and into reporting. It is doubtful whether Albert Einstein would be able to pursue a scientific career under the current system.




Citation

The source material for this post is Mathias Binswanger's chapter "Excellence by Nonsense: The Competition for Publications in Modern Science" in Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing, edited by Sönke Bartling and Sascha Friesike, published under this Creative Commons Attribution-NonCommercial 3.0 Unported (CC BY-NC 3.0) licence, and found via a tweet from @LSEImpactblog.






other posts on RSA, TED, other lectures, conferences, others blog posts

2015
18 top tips and thoughts about using #social media to enable #community - source = an article by Anatoliy Gruzd PhD & Caroline Haythornthwaite PhD

2014
Data Protection & Privacy - 8 issues from an International Conference
escape your social horizon limit & understand more - source = a blog post summarising the work of  Jeffrey A. Smith, Miller McPherson & Lynn Smith-Lovin
social media & death - 10 things you may not have thought about - #DORS conference

2013
the development of the U2 spyplane - source = CIA historians Gregory Pedlow & Donald Welzenbach
considering culture and business process improvement  - source = an article by Schmiedel, Theresa, vom Brocke, Jan, & Recker 
ideas that may help you attract older volunteers - source = a paper by Brayley, Nadine, Obst, Patricia L., White, Katherine M., Lewis, Ioni M.,Warburton, Jeni, & Spencer, Nancy
physical factors which help people get better quicker - source = a paper by Salonen, Heidi & Morawska, Lidia 
a new approach to school and education - by Geetha Narayanan 
guiding principles on designing construction kits - by Mitchel Resnick & Brian Silverman
signs of overparenting - source = an article by Locke, Judith, Campbell, Marilyn A., & Kavanagh, David J
making ideas happen - source = a 99U conference

2012
how to spot a liar - by Pamela Meyer
measuring happiness - source = talk by Jim Clifton, Jim Harter, Ben Leedle








