The concept of scientific dishonesty: ethics, value systems, and research

POVL RIIS

I have been interested in the problem of scientific dishonesty ever since the classic cases occurred in the USA and elsewhere from the mid-1980s onwards, and later through my involvement in the formation of the Danish central committee (undertaken before we had had any major recent case in the country). Here, however, I want to take a much broader look at the whole question, in particular trying to set it in the wider context of biomedical ethics.

The three concepts in my subtitle appear all the time in today's publications. Research is, of course, a well-known term; ethics have acquired linguistic citizenship in medicine in the last 30-50 years; but value systems are a "johnny-come-lately" in our vocabulary. Nevertheless, the meaning of each term is often taken to be self-evident, and all three are used with a variety of connotations. Any discussion of these key concepts in the context of scientific dishonesty needs, then, to start with definitions.

Definitions

Research

Research is defined here1 as an original endeavour comprising:

• An idea leading to the first ("original") attempt to link two masses of knowledge (already existing or arising out of previous research) with the aim of detecting causal relationships and not merely coincidences.

• The transfer of the idea to one or more precise questions, characterised by the existence of potential answers.

• A bias-controlling methodology intending to link the question(s) to potential answers. (Methodology is defined as the art of planning, technically carrying through, interpreting, and publishing scientific research.) Good scientific methodology not only reduces the number of

• The relation between ethics and the law is bimodal. Ethics, with its fundamental values, forms the basis of legislation. Nevertheless, it also comprises values that are not controlled by the law, but are still decisive elements in societal and personal life.

Value universes of biomedical research

Until recently science had an elite status. Scientists were considered more honest than ordinary citizens, and hence an idea was current that research dishonesty did not occur outside fiction (as in "Dr Jekyll and Mr Hyde"). Today, however, we know better, and so can deal with this aspect in theoretical terms. The value universes of biomedical research concern two main subgroups:

• those related to society in general - the external universe;

• those related to the research community itself - the internal universe.

The former is concerned with the safety and trust of patients (not only patients in general but also trial patients in particular as well as healthy volunteers).

Thus the first aspect is the ethics of the research so far as the safety of and respect for the citizens acting as subjects are concerned. The evaluation rests primarily with research ethics committees, but the necessary premises also depend on the honesty of the researchers - and hence on knowing the risks to the participants, the potential benefits of the expected results, and an up-to-date survey of the literature.

The second aspect of the honesty/dishonesty concept is how scientists recruit the trial subjects: do they fairly present all the undisputed facts to potential participants? The third is whether the results are interpreted entirely independently of any sponsors of the research. If the results are untrue for any reason, clinicians may be misled in their treatment, even to the extent that the criminal law becomes involved should patients' health or lives have been endangered. In this way, the societal value universe comes into close and serious contact with research activities.

Within the internal universe, scientists' curricula vitae are the most important premise for decisions on grants, academic appointments or promotions, travel to conferences, etc. Here, with the volume of scientific publications as the currency of the research market, any counterfeiting will have the same negative effects as in the monetary sphere. Values such as truth, justice, and responsibility are all at stake. The result may be that honest investigators sometimes lose out, because they have to spend much time on the project. Conversely, the fraudster can recruit patients faster; can work sloppily in the laboratory; or, most seriously, can fabricate the results or be a sleeping partner in several projects, but still an author in all the publications, thereby collecting much of the currency (here represented by authorship and co-authorship).

What is the driving force behind fraud?

The driving force that unites the motives into active dishonesty varies from a criminal element to more cautious attempts to buy valid currency on the black market (more publications on the CV). For obvious reasons we know very little about these intentions, because sanctions are often imposed in proven cases without the scientists disclosing their motives. Such a policy of "admit as little as possible" is well known from our ordinary courts of law, but it is a source of wonder how often intelligent people embark on dishonest research when they ought to "know better". My qualified bet is that they know very well about the consequences of such behaviour but think that they are too smart to be detected.1

Scientific dishonesty: its nature, prevalence, and consequences

The four classic examples of fraudulent behaviour are fabrication, falsification, plagiarism, and theft. All represent transgressions of laws and fundamental values known to the transgressor, and so are closely related to the crimes found in a country's penal code. Hence it is justifiable to speak of a general intention to deceive, whether or not the transgression is admitted when the facts come to light. The consequences of such dishonesty are gravest in clinical research dealing with life-threatening diseases, as, for instance, in a recent example of treating disseminated breast cancer with bone-marrow transplantation. An obvious parallel is set by the so-called alternative treatments marketed for serious disease without scientific evidence and addressed directly to lay people. Here, however, there has been no professional authorisation of the alternative methods, and hence a mixture of individual conceit and protective group insufficiency leads to a general blamelessness.

The next example of dishonest behaviour among scientists deals primarily with the way research results are evaluated and interpreted, and falls into the subgroup of biomedical ethics labelled "publication ethics". Data archaeology and the "cleaning" of results for outliers - that is, discarding results that, if included, would seriously lower r values and increase P values - occur when scientists work with their raw data. Data massage, or archaeology, means that scientists apply statistical test after statistical test until one produces a sufficiently low P value, without mentioning this multiple hypothesis testing in the subsequent publication. Such dishonest use of statistics is cognate with the exploitation of mass significance - for example, applying 20 significance tests at the 0.05 level to related data from the same group of subjects, not realising or forgetting that, by chance alone, at least one of them is quite likely to show a P value of < 0.05. If done by an experienced scientist, such a practice is fraudulent; if done by a tyro, it can be an honest mistake, caused by a lack of methodological insight.
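To make the arithmetic of mass significance concrete, the following sketch (not from the original text; the 20 endpoints, 30 subjects per group, and 10,000 simulated studies are illustrative assumptions) simulates repeated studies in which every tested difference is pure noise. For 20 independent tests at the 0.05 level, the chance of at least one spuriously "significant" result is 1 - 0.95^20, roughly 0.64.

```python
# Illustrative sketch only: the "mass significance" problem.
# All numbers (20 endpoints, 30 subjects per group, 10,000 studies)
# are arbitrary assumptions, not values from the text.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
n_studies = 10_000    # simulated studies
n_tests = 20          # unrelated endpoints tested per study
n_subjects = 30       # subjects per group

studies_with_false_positive = 0
for _ in range(n_studies):
    # Both groups are drawn from the same distribution, so any
    # "significant" difference is a false positive.
    group_a = rng.normal(size=(n_tests, n_subjects))
    group_b = rng.normal(size=(n_tests, n_subjects))
    p_values = stats.ttest_ind(group_a, group_b, axis=1).pvalue
    if (p_values < 0.05).any():
        studies_with_false_positive += 1

print(f"Fraction of studies with at least one P < 0.05: "
      f"{studies_with_false_positive / n_studies:.2f}")
# Expected for 20 independent tests: 1 - 0.95**20 ≈ 0.64
```

Reporting every test that was performed, or applying a correction for multiple comparisons (for example, a Bonferroni-adjusted threshold of 0.05/20), removes this spurious "significance"; highlighting the one low P value while concealing the other tests is exactly the dishonest practice described above.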

Another dishonest practice occurs in preparing the manuscript. The authors leave out references that point to previous original research, thereby indicating a spurious personal priority, even if this is not overtly claimed.

These types of scientific dishonesty are probably common but often undisclosed. They can distort results important for patients, and thereby have a societal perspective, but most cases are relevant only for the internal universe - that is, they affect competition between scientists.

Gross slovenliness may affect both the active data collection and the publication process. Examples include not searching the global literature, not testing batches of reagents, not ensuring the accuracy of the numerical analyses, not reading proofs properly, and so on. Again, in a young, inexperienced scientist such slovenliness might be accepted as venial: honest, but immature and unprofessional (though this excuse would not extend to any adviser). For an experienced scientist, however, such behaviour must be characterised as dishonest. It is often related to non-legitimate authorship, as when a senior scientist, often the boss, is a co-author of a paper reporting work that is later shown to have been fraudulent. Here the boss cannot be excused on the grounds of not having participated directly in the dishonesty.

Spurious authorship is the inflationary technique mentioned above. It may occur through a chief's supposed prerogative of co-authorship on all the publications coming from the department, or (at the opposite end of the institutional hierarchy) senior scientists may exclude the names of young, legitimate researchers from the article. Often this occurs after the juniors have left the department, their seniors regarding them as mere "water-carriers" despite their important contributions.

The same dishonest attitude is shown by authorships as exchangeable currency - for instance, gift authorship, ghost authorship, or barter authorship. Sometimes these take the form of production collectives ("I will make you a co-author if you do the same for me, given that we are not rivals").

Until formal regulations were introduced,2 duplicate publication was frequent: the same data published twice or more in different journals, without any cross-reference between the papers and without informing the editors. Other examples of inflating curricula vitae are the Salami technique (in which data are cut into thin slices and published separately), or its reverse, the Imalas technique, in which one, two, or more cases are added to an already published series without any statement that most of them have been described elsewhere. (Little is added, save another publication for the CV.)

All these examples of dishonest publication ethics overstep the values of truth and justice within the internal universe. Very rarely do they have an additional societal perspective or lower public trust in scientists. The prevalence of such transgressions is unknown, but one study in a national context showed that it was high.3 The preventive measures detailed below are probably insufficient to eliminate or even reduce the number of non-legitimate authorships. Instead, several other, more difficult, measures are being introduced, including a demand that authors specify their contributions in detail, while the ultimate decision on who is an author rests with the editor.4

The last subgroup of scientific dishonesty is more a matter of etiquette. An example is when a scientist presents, in slides and talks, the joint work of a group of which he or she is a member, but mentions only his or her own name and none of the co-workers'. Clearly such practices are an internal matter for the research group, but they are still important, because the resulting personal antagonisms waste much time and other research resources.

Common to almost all the disclosed cases of scientific dishonesty (irrespective of their position on the seriousness scale) are the two excuses also heard in our courtrooms: "I thought that everybody did it," and "I didn't know." Both have to be rejected in coming to a verdict, the first because it presupposes a judicial relativism influenced by the number of transgressions, the second because "ignorance of the law" is no excuse.

Good scientific practice (GSP)

Experience from national control systems has shown how important it is to create a new category between full-blown scientific dishonesty and full respect for all the relevant ethical values, a concept called good scientific practice (GSP). In this grey zone are the transgressions that cannot be classified as scientific dishonesty but are not GSP. Hence these are referred to as practices "not in accordance with GSP", where GSP represents the national consensus by scientific societies, research councils, independent scientific editors, and the like.5 In courses for young scientists, this intermediate category can be used to produce examples close to everyday research work. As a result, examples need not be drawn from the full-blown cases of criminal scientific dishonesty, which readily lose their effect because the common reaction is: "I could never become involved in such a scenario."

The social transfer of values

Values ought to underlie academic education, and not be part of any formal curriculum first met in the classroom. In other words, the values behind GSP have deep roots and can be reinforced during scientific training only by being made visible in everyday life. Hence there are several steps in attaining GSP:

• The first step is the visibility and priority given to fundamental human values (truth, reliability, justice, freedom) in children's upbringing. In other words, honest scientific behaviour is not an ethical subspecialty of a profession, but the projection of general ethical constituents onto a professional universe.

• The second step is the visibility of general values within the research society, and an accompanying respect for them in daily activity in research institutes and departments. Probably the strongest influence is personal example, with full correspondence between spoken principles and everyday practice. If the head of the department mouths the right views on authorship but consistently twists them in practice, expecting co-authorship of all the department's publications, then junior researchers find themselves working to double standards.

• The third step is to set up obligatory courses in GSP for all young scientists and would-be specialists. These will catalyse the development of widespread GSP, including the time-consuming change of the traditions of spurious authorship, so that there is a better correlation between scientific productivity and the number of papers in a CV.

• The fourth step is for the controlling body to publish selected cases from the full spectrum of scientific dishonesty in an annual report. If word of a possible case gets out before it has been adjudicated, then openness is the keyword: not details or premature conclusions, but confirmation that the case exists, with a promise that direct information will be supplied once it has been concluded.

Value conservation and the control of scientific dishonesty

For countries that have had no formally established bodies responsible for managing scientific dishonesty - and even for countries with unofficial mechanisms for examining allegations - it may be valuable to consider the different models and procedures.

The most common set-up is institutional, usually established unprepared and ad hoc when an allegation arises in a university or another scientific institution. The initial response is often official denial, or at least deep silence. If the case cannot be kept in the dark in the long run, the institution sometimes reacts fiercely, imposing sanctions to display its high moral standards despite its earlier downgrading of the case. Moreover, intra-institutional distributions of power between the scientists involved and the leadership may represent a strong bias against justice. Historically one may apply Montesquieu's triad, which demands that the three components - the legislature, the judiciary, and the executive - be kept independent of one another. Here, on the contrary, they are mixed to an unacceptable degree.

The alternative to the institutional set-up is the national, or regional, committee. This may be established in two principal ways - either on a legal basis, or created jointly by research councils, academies, and scientific societies. Further, its remit may be restricted to biomedical research or extended to cover all kinds of research, including the humanities, social sciences, and technical sciences. If such a body deals only with inquiries and investigations, and not with sanctions (which are left to the institutions), at least part of the triad is split into its individual components. If, further, definitions of scientific dishonesty (or, more importantly, of GSP) are promulgated widely, then another step has been taken to secure a high degree of fairness for both the accused and the whistleblowers.

Such a structure may, for instance, have the following action levels when a case is raised:

• Suspicion arises locally via a whistleblower or the independent committee's own channels (e.g. through the media).

• The committee decides whether the case should be considered.

• The involved parties are informed and asked for their comments, in accordance with judicial principles.

• If the committee decides on an investigation, an independent specialist group is formed once both parties have accepted the suggested membership.

• The ad hoc investigative group's report is open for comment from all the parties, and thereafter is evaluated by the committee. Does the case point to scientific dishonesty, non-accordance with GSP, or to an empty suspicion?

• The conclusion is forwarded to the parties and the institution. The latter decides on sanctions if scientific dishonesty has been substantiated, and reports back to the committee so that any disparity between sanctions taken by different institutions can be minimised.6

It may seem strange that scientific dishonesty and fraudulent behaviour within biomedicine have attracted so much attention, whereas very little has been written about ethical transgressions in other scientific disciplines - especially in those where the motives for dishonesty (such as strong competition) would make such behaviour just as feasible as in biomedicine. Nevertheless, few countries have extended their national control system to include all scientific sectors.7 Given the increasing number of transdisciplinary studies involving medicine and, for instance, the humanities, and often qualitative research disciplines, such an extension should be important not only in an overall scientific perspective but for biomedical research as well.

Finally, the preventive value of independent national or regional committees should be emphasised. Publicity about individual cases has a "vaccination effect" on the research community, and this is enhanced if national and international developments are commented on in a national committee's annual report.8 The didactic use of concrete cases is easier if it is based on the experience of a national overview rather than on sporadic anecdotes in the media. As I have already mentioned, the cases included in courses for young scientists have a much stronger impact when they originate in concrete - even anonymised - cases from a contemporary spectrum, collected in real life.

Conclusion

The field of scientific dishonesty has developed from casuistic, often serious, cases 30-40 years ago to a stage where the multitude of different transgressions seems to form a basis for a more systematic analysis of motives and methods. The traditional epidemiological figures - true incidences and prevalences - remain unknown, and are probably not even ascertainable. As the criminological literature shows, individual transgressions can be counted only if disclosed, and the same is true for deviant scientists with a relapsing tendency to act dishonestly in science. Nevertheless, data from the national control systems seem to show that serious cases of fraud and its societal effects are relatively rare.9

Instead, the spectrum is dominated by cases with internal perspectives for the research society - spurious authorship, lack of planning with well-defined shares for each member of a project group - and by cases with both internal and external (that is, societal) perspectives through dishonest methodology, such as data massage, removal of outliers, and the inflation of originality and personal priority. The number of sensational cases reported worldwide and managed by institutions or transferred to the law courts has been too small to suggest that these mechanisms should be widely applied. Instead, such cases have indicated the necessity of establishing independent systems - whether national or regional - based on principles long developed and tested in the common judicial systems. The important first step in creating these is to bring together editors of national journals and members of research councils, scientific academies, associations, universities, and scientific societies, to devise a system that will protect both the accused and the whistleblower against unfair procedures. The aim is to make such values the basis for a GSP that is both visible and respected.

Here, with such values - far away from applied statistics, techniques of randomisation, and methods for the polymerase chain reaction - the important societal perspective of scientific dishonesty is to be found. Immaterial values such as truth, justice, freedom, responsibility, and many others represent the essential grid that makes the greater society and the scientific one cohere, and that creates the trust and reliability citizens need in order to work together and to depend on each other's information and results. In other words, if a society unofficially accepts that speed limits can be ignored, that tax evasion is a kind of Olympic sport, and that fraudulent receipt of social security is venial, then young scientists will come to the demands of GSP less prepared and less receptive to them. In these circumstances, the only alternative is to thrust an ungrateful task onto an official body such as a National Committee on Scientific Dishonesty.
