alcedo.atthis
Well-known member
A Note Regarding the Science & Publication of Research Studies
In order to critically address the published works of any writer, it is important for the reader to have an understanding of the purpose of scientific research publication, the process of scientific study review, and the role peer review plays in the publication of scientific research.
According to An Introduction to Critical Analysis of Publications in Experimental Biomedical Sciences (Rangachari, 1994-2001), “Scientists publish research reports for a variety of reasons. Ideally, a research report is a free communication by a scientist or a group of scientists informing their peers about a set of novel findings that either provide answers to puzzling problems or raise issues that are of academic or practical interest. At the opposite extreme lie reports that serve merely to add to the curriculum vitae of the investigators and have little or nothing important to say. Most often, the situation lies between these two extremes.” The author goes on to provide a summary of the scientific review process and a checklist for the critical evaluation of a scientific paper. The checklist provided by P.K. Rangachari is as follows:
CHECKLIST
as provided by P.K. Rangachari in “An Introduction to Critical Analysis of Publications in Experimental Biomedical Sciences.” http://www.science.mcmaster.ca/biopharm/critanal.htm (accessed 10.31.03)
Introduction
- Did the authors indicate why the study was undertaken?
- Was the background information provided adequate to understand the aims of the study?
Methods
- Were the methods described in sufficient detail for others to repeat or extend the study?
- If standard methods were used, were adequate references given?
- If methods were modified, were the modifications described carefully?
- Have the authors indicated the reasons why particular procedures were used?
- Have the authors indicated clearly the potential problems with the methods used?
- Have the authors indicated the limitations of the methods used?
- Have the sources of the drugs been given?
- Have the authors specified the statistical procedures used?
- Are the statistical methods used appropriate?
Results
- Were the experiments done appropriate with respect to objectives of the study?
- Do the results obtained make sense?
- Do the legends to the figures describe clearly the data obtained?
- Are the data presented in tabular form clear?
- Are the legends to the tables clear?
- Has appropriate statistical analysis been performed on the data?
Discussion
- Were the objectives of the study met?
- Do the authors discuss their results in relation to available information?
- Do the authors indulge in needless speculation?
- If the results obtained were statistically significant, were they also biologically significant?
- If the objectives were not met, do the authors have any explanation?
- Do the authors adequately interpret their data?
- Do the authors discuss the limitations of the methods used?
- Do the authors discuss only data presented or do they refer consistently to unpublished work?
- Do the authors cite appropriate papers for comments made?
- Do the authors cite their own publications needlessly?
Abstract
- Is the abstract intelligible?
- Does the abstract accurately describe the objectives and results obtained?
- Does the abstract include data not presented in the paper?
- Does the abstract include material that cannot be substantiated?
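Two of the checklist items above — whether the statistical procedures are appropriate, and whether a statistically significant result is also biologically significant — can be illustrated with a short calculation. All the numbers below are hypothetical, chosen only to show the principle: with a large enough sample, even a trivially small difference becomes "statistically significant."

```python
import math

def one_sample_z(p_hat, p0, n):
    """z statistic for an observed proportion p_hat against a baseline p0."""
    se = math.sqrt(p0 * (1 - p0) / n)  # standard error under the baseline
    return (p_hat - p0) / se

# Hypothetical: a 1-percentage-point difference in some observed rate.
z_small_n = one_sample_z(0.30, 0.29, 400)      # modest sample size
z_large_n = one_sample_z(0.30, 0.29, 100_000)  # very large sample size

print(f"z with n=400:    {z_small_n:.2f}")  # below 1.96, not significant
print(f"z with n=100000: {z_large_n:.2f}")  # far above 1.96, "significant"
```

The identical 1-point difference is "not significant" in the small sample and "highly significant" in the large one — yet its biological importance is unchanged. A careful reader asks both questions separately, as the checklist does.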
To review scientific work, the first question should be: Is the study free of bias? According to “High School Students’ Critical Evaluation of Scientific Resources on the World Wide Web” (Nathan Bos, University of Michigan-Ann Arbor; accepted for publication in the Journal of Scientific Education Technology; http://www-personal.si.umich.edu/~serp/work/critical_evaluation.pdf, accessed 10.31.03), when analyzing any study it is important to begin by knowing who funded or sponsored it. Why? To determine whether the funding may have introduced bias. Organizations dedicated to a particular constituency or outcome are likely to fund studies, and research groups, whose past findings have been in line with the funding agency's goals. Frequently, a funding agency also reserves the final right of refusal over publication of the results, which can further bias the research.
The next important question is: Was the study published in a peer-reviewed scientific journal? This matters for validating the results. That is not to say that a published study has no flaws or that its conclusions are inherently correct. But, according to “Systematic Critique – the art of scientific reading” (PDF file, ©2002 Biomedical Scientist, February 2002; http://www.ibmsscience.org/reading/systematic_critique.pdf, accessed 10.31.03), when a study is submitted to a peer-reviewed journal, the entire study is subject to scrutiny: the original hypothesis (for instance, what effect cats have on bird populations), the materials and methods used to test it, and the results and conclusions. Each of these areas is carefully examined in the peer-review process. The authors are required to address every concern the reviewers raise, either by modifying the text to comply with the reviewers' comments (the reviewers examine both the methodology and the conclusions, and may not agree with the conclusions as the authors present them), or by arguing persuasively why their original statements and conclusions are correct. The journal's editors ultimately decide whether the authors have adequately addressed the reviewers' concerns. If they have, the article is published, usually with numerous modifications; it is very rare for a paper to be accepted in its first form. If they have not, the article is rejected. Likewise, if the reviewers believe the study lacked adequate controls or proper methodology, it will not be published. So when a research study is not published in a peer-reviewed journal, it has not gone through this rigorous screening by people in the same field.
Essentially, within the scientific community there is a reluctance to rely heavily on unpublished studies, because their validity or scientific rigor cannot be adequately addressed.
Also important to note in any review: Do the authors of the work cite their own material? Are proper reference citations used? Dr. P.K. Rangachari warns that needlessly citing your own studies to support your findings is generally unacceptable, for obvious reasons. We note one important caveat: if you are the only researcher working in a particular field. If you have been breaking new ground and yours are the only published studies in the area, there is no choice but to reference your own work. But citing your own work to support conclusions when other studies in the same field have been published is simply unacceptable. Nor is it acceptable to gesture vaguely at unnamed “other studies” that supposedly support your findings. Any credible work will properly cite all of its reference material.
Finally, was the methodology of the study sound? Are projections made on the basis of a single study, or on averages across multiple studies? One tactic frequently used by bird conservationists and other wildlife activists is to extrapolate the findings of a study on a small number of animals to large populations. This practice is deceptive, inaccurate, and statistically unacceptable. This is an extremely important point.
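A short sketch of why such extrapolation is unreliable: a rate estimated from a small sample carries a wide confidence interval, and multiplying the point estimate by a large population multiplies that uncertainty with it. The study size, event count, and population figure below are invented for illustration only.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Approximate 95% confidence interval for a proportion (Wald interval)."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Hypothetical small study: 30 of 50 animals showed the behaviour of interest.
lo, hi = wald_ci(30, 50)
population = 60_000_000  # hypothetical population the point estimate is scaled to

print(f"rate 95% CI: ({lo:.3f}, {hi:.3f})")
print(f"extrapolated range: {lo * population:,.0f} to {hi * population:,.0f}")
```

The point estimate of 0.6 scaled up gives one headline number, but the same data support a range of plausible population figures tens of millions wide. A headline built on the single point estimate hides that uncertainty entirely.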
For further reading on the subject of critical analysis of scientific research, please visit:
“An Introduction to Critical Analysis of Publications in Experimental Biomedical Sciences” ©1994-2001 P.K. Rangachari. If you have trouble with the above link, please copy and paste this web address directly into your browser: http://www.science.mcmaster.ca/biopharm/critanal.htm (accessed 10.31.03).
“Systematic Critique – the art of scientific reading” (PDF file) ©2002 Biomedical Scientist, February 2002, for further reading on the critique of scientific research. If you have trouble with the above link, please copy and paste this web address directly into your browser: http://www.ibmsscience.org/reading/systematic_critique.pdf (accessed 10.31.03).
“High School Students’ Critical Evaluation of Scientific Resources on the World Wide Web,” Nathan Bos, University of Michigan-Ann Arbor; accepted for publication in the Journal of Scientific Education Technology. If you have trouble with the above link, please copy and paste this web address directly into your browser: http://www-personal.si.umich.edu/~serp/work/critical_evaluation.pdf (accessed 10.31.03).
Regards
Malky