Why the Journal's Structure is Different

An unexpectedly difficult question arose early in the development of the journal: what should we define as the standard structure of a manuscript? A more-or-less constant structure is part of what gives a journal its characteristic "look". The customary manuscript structure used by most research journals is Abstract > Introduction > Methods > Results > Discussion, but does this format fit papers in which the Methods are the main point, and not just the means to an end? I thought it would not.

A natural research presentation would introduce the question to be addressed, describe the approach used to address it and the implementation of that approach, present the results, and finally discuss the conclusions reached. This is how we tend to think, in linear fashion, and how we present our work in a seminar. In fact, this is how a conventionally arranged, very simple paper describing one experiment would be organized - but how many papers are ever that simple?

It seemed to me that this customary structure dissects complex research presentations into awkwardly artificial bins, putting all the methods together in one bin, all the results in another, and the discussion in a third. It works only because the effect (and intent) is to emphasize the discussion by setting aside the distracting descriptions of methods and results. An unfortunate consequence of this custom is that the development and discussion of methods is diminished in importance.

Thinking about this further, I decided that there's nothing particularly sacred or even necessary about the conventional format. Something different was needed, something that would focus on the methods and not on their application. My problem was to develop an alternative structure that was flexible enough to accommodate the description of a new laboratory analytical method, or an intercalibration of standard assay techniques, or a meta-analysis of published data, or a host of other sorts of manuscripts that could well come under the umbrella title of "a methods paper".

Even more important, I wanted to change the way methods papers are regarded and presented. Quoting from another editorial: "To meet this goal, I intend to push authors to increase the rigor with which methods are developed, tested and presented. The journal will absolutely require a thorough, thoughtful and detailed analysis of any new method. Every methods paper published in the journal will have an assessment section that will present the critical experiments or studies that were conducted in the process of methods testing, the results of those studies, and the proof of concept they provide. The assessment must provide the answers to such basic questions as: How do you know that your method really works? How well does your method work? What are its strengths and limitations? How difficult or expensive is your method to adopt and use?"

So, how to accomplish this noble goal? I became convinced that the assessment section was the key to promoting rigorous methods development. If the journal were to push authors to a higher standard, then it would have to abandon the standard format and do things a little differently.

The structure used in Limnology and Oceanography: Methods is not quite like that in other journals. This is important to remember, because authors are irresistibly drawn to type in the familiar headings "Results" and "Discussion", even after reading the journal's instructions. We do things a little differently around here.

Please read the instructions to authors very carefully. Take note of the general structure for manuscripts. Take special note of the following as you prepare your manuscript:

  1. Materials and Procedures: The Materials and Procedures section should be used to thoroughly describe the new analytical procedure you have developed, or the device you have created, or the criteria used to select literature data to analyze. You do not need to describe the experiments, analyses or other studies that you conducted to validate the method - yet!
  2. Assessment: The Assessment section combines elements of the traditional Methods, Results and Discussion sections. Its purpose is to assess and validate the methods presented in the previous section. Think of this section as reorganizing parts of the conventional Methods > Results > Discussion sequence, as illustrated in this abbreviated example:

    "I needed to know <a characteristic of the method, e.g. detection limits>. I used the materials and procedures just described to conduct the following test <describe it>, obtained the following results <describe them>, and concluded that <your experimental conclusion>."

    You may have conducted a number of assessments, for example to establish the detection limits, variability, and durability of an instrument. Each of these assessments can be self-contained and include a description of the intent of the test, the experiment conducted, the results, and the basic conclusion reached.
  3. Discussion: By the time you reach the section labeled Discussion, you should have finished with the presentation of your work. This is not the place to present the conclusions of each experiment! It's time to discuss their significance. What new insights will be obtained from a method with the properties you have just demonstrated? If you posed an important question in the introduction, have you now answered it, or is there more work that must be done? Based on your rigorous testing of the assumptions of a standard method, do past studies based on the method need to be revisited, or reinterpreted? How will the results of your intercalibration study change the interpretation of past work: does it need to be redone, or just recalculated? If your work is important, tell everyone how, and why, and what they should do differently from now on. And don't be concerned if the Discussion section is not the longest or most important part of your paper.

I understand that authors will find this structure strange and will be irresistibly tempted to submit manuscripts in the old familiar style. Be brave, be innovative, dare to think differently! This is your chance to focus on and highlight the methods work you have done, and to talk about its value and its promise.

Paul F. Kemp, Editor-in-Chief
October 10, 2002