The Methods section is where reviewers decide whether to trust your results. A strong Methods section does not just describe what you did; it convinces the reader that what you did was appropriate, reproducible, and free of obvious bias. When the Methods section is unclear, incomplete, or disorganized, reviewers begin questioning the results before they even read them.
These five mistakes appear across manuscripts in medical and life science research. Each one creates a specific type of doubt in the reviewer’s mind, and each has a clear fix.
1. Mixing Results Into the Methods Section
The Methods section describes what was planned and executed. The Results section reports what was found. When preliminary data, outcomes, or data-driven justifications appear in the Methods, the boundary between these sections collapses, and the reviewer loses confidence in the study’s logical structure.
This problem most commonly appears when authors justify a methodological choice by referencing a result: “Because the Shapiro-Wilk test showed non-normal distribution (p = 0.02), we used the Mann-Whitney U test.” The result of the normality test belongs in the Results section or in a supplementary statistical note. The Methods section should state the decision rule, not the outcome.
Typical original:
We performed the Shapiro-Wilk test to assess normality. Since the data were not normally distributed (p = 0.02), we used the Mann-Whitney U test for group comparisons. The median tumor volume in the treatment group was 2.3 cm³.
Revised:
Continuous variables were tested for normality using the Shapiro-Wilk test. Non-normally distributed variables were compared between groups using the Mann-Whitney U test; normally distributed variables were compared using the independent-samples t-test.
The revised version states the decision rule without revealing the data. The actual normality test results and the group comparisons belong in the Results section.
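For authors who share an analysis script alongside the manuscript, the decision rule above can be stated directly in code. The following is a minimal, illustrative sketch in Python using SciPy; the function name and the rule that both groups must pass the normality test are assumptions for illustration, not part of the original example:

```python
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """Apply the decision rule stated in the Methods: test normality with
    Shapiro-Wilk, then choose the t-test or the Mann-Whitney U test."""
    # Illustrative rule: use the t-test only if both samples pass Shapiro-Wilk.
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        test_name = "independent-samples t-test"
        result = stats.ttest_ind(a, b)
    else:
        test_name = "Mann-Whitney U test"
        result = stats.mannwhitneyu(a, b, alternative="two-sided")
    return test_name, result.pvalue
```

Encoding the rule this way also makes the Methods text easy to verify against the analysis: the script contains the decision rule, the Results section reports which branch the data took.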
2. Reporting Statistical Methods Without Sufficient Detail
Reviewers and statisticians need specific information to evaluate whether the analysis was appropriate. Three details are frequently missing: the software and version used, the exact statistical tests applied, and the threshold for statistical significance.
Vague reporting does not just frustrate reviewers; it prevents replication and raises concerns about whether the authors understood their own analysis.
Typical original:
Statistical analysis was performed using appropriate software. The significance level was set at p < 0.05.
Revised:
Statistical analyses were performed using R (version 4.3.1; R Foundation for Statistical Computing, Vienna, Austria). Continuous variables were compared between groups using the independent-samples t-test (for normally distributed data) or the Mann-Whitney U test (for non-normally distributed data). Categorical variables were compared using the chi-squared test or Fisher’s exact test when expected cell counts were below 5. All tests were two-tailed, and statistical significance was defined as p < 0.05. Multiple comparisons were adjusted using the Bonferroni correction.
The revised version specifies:
- The software and version (R 4.3.1)
- The exact tests and the conditions under which each was used
- The directionality of tests (two-tailed)
- The significance threshold
- The method for handling multiple comparisons
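Two of these reporting rules, the chi-squared/Fisher choice and the Bonferroni correction, are concrete enough to sketch in code. The following Python/SciPy sketch is illustrative only (function names are assumptions, and SciPy's Fisher's exact test is limited to 2×2 tables):

```python
from scipy import stats

def compare_categorical(table):
    """Choose between the chi-squared test and Fisher's exact test for a
    2x2 contingency table, using the expected-cell-count rule above."""
    chi2, p, dof, expected = stats.chi2_contingency(table)
    if (expected < 5).any():
        # Any expected count below 5: fall back to Fisher's exact test.
        _, p = stats.fisher_exact(table)
        return "Fisher's exact test", p
    return "chi-squared test", p

def bonferroni(p_values):
    """Bonferroni adjustment: multiply each p-value by the number of
    comparisons, capping the result at 1."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]
```

A Methods section that names the rule ("Fisher's exact test when expected cell counts were below 5") lets a reader reproduce exactly this branching logic.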
3. Omitting Key Reproducibility Details
A Methods section should contain enough information for another researcher to replicate the study. In practice, authors frequently omit concentrations, incubation times, equipment specifications, sample sizes per group, and inclusion or exclusion criteria. These omissions are not minor: they are the details that determine whether a result is reproducible.
Typical original:
Cells were treated with the drug and incubated for a period of time. Cell viability was then measured.
Revised:
HeLa cells (1 × 10⁵ cells per well, 6-well plates) were treated with doxorubicin at concentrations of 0.1, 0.5, 1.0, and 5.0 μM. After 48 hours of incubation at 37°C in a humidified atmosphere containing 5% CO₂, cell viability was assessed using the MTT assay (Sigma-Aldrich, Cat. No. M2128) according to the manufacturer’s protocol. Absorbance was measured at 570 nm using a microplate reader (BioTek Synergy H1).
The revised version includes the cell line, seeding density, plate format, drug concentrations, incubation duration and conditions, assay kit with catalog number, and the measurement instrument. Each of these details is necessary for replication.
4. Using Inconsistent Terminology for the Same Entity
When the same variable, procedure, or patient group is referred to by different names within the Methods section, reviewers cannot tell whether you are describing one thing or multiple things. This problem is especially common in multi-author manuscripts where different sections were drafted by different researchers.
Typical original:
Patients in the experimental group received the new treatment protocol. Blood samples were collected from the intervention cohort at baseline and at 12 weeks. The treatment arm was compared with the control group for all primary endpoints.
Revised:
Patients in the intervention group received the new treatment protocol. Blood samples were collected from the intervention group at baseline and at 12 weeks. The intervention group was compared with the control group for all primary endpoints.
The fix is straightforward: choose one term and use it consistently throughout the manuscript. If the term must change (for example, when referring to the same patients in different analytical contexts), define the relationship explicitly: “The intervention group (hereafter referred to as the treated cohort in the survival analysis).”
5. Presenting Methods in an Illogical Order
The Methods section should follow a sequence that mirrors either the experimental workflow or the order in which results will be presented. When methods appear in an arbitrary order, reviewers cannot reconstruct the experimental timeline, and they begin to wonder whether the authors themselves had a clear protocol.
The most common ordering problem is describing analytical methods before describing sample collection, or presenting subgroup analyses before defining how subgroups were assigned.
Typical original:
We performed Cox regression analysis to identify prognostic factors. Tumor samples were collected from patients who underwent surgery between 2018 and 2022. Immunohistochemistry was performed on formalin-fixed paraffin-embedded sections. Patients were divided into high- and low-expression groups based on the median H-score.
Revised:
Tumor samples were collected from patients who underwent surgical resection between January 2018 and December 2022 at [Institution]. Formalin-fixed paraffin-embedded sections were prepared and stained by immunohistochemistry for [target protein]. Patients were divided into high- and low-expression groups based on the median H-score. Cox proportional hazards regression was used to identify independent prognostic factors for overall survival.
The revised version follows the experimental timeline: sample collection, tissue preparation, staining, grouping, and analysis. Each step logically leads to the next, and the reviewer can follow the workflow without rearranging paragraphs mentally.
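One step in the revised passage, the median H-score split, is precise enough to implement directly, which is the standard a Methods sentence should meet. A minimal Python sketch (the function name, group labels, and the convention that values at the median go to the low-expression group are illustrative assumptions):

```python
def median_split(h_scores):
    """Assign each sample to a high- or low-expression group using the
    median H-score as the cut-off. Values equal to the median are
    assigned to the low-expression group (an illustrative convention
    that a real Methods section would need to state explicitly)."""
    scores = sorted(h_scores)
    n = len(scores)
    # Median of the sorted scores (mean of the middle two for even n).
    if n % 2:
        median = scores[n // 2]
    else:
        median = (scores[n // 2 - 1] + scores[n // 2]) / 2
    return ["high" if s > median else "low" for s in h_scores]
```

Note that even this trivial rule has an ambiguity (where do values exactly at the median go?) that the Methods text must resolve; "divided at the median" alone is not fully reproducible.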
Checklist Before Submitting Your Methods Section
- Does the Methods section contain any results, outcomes, or data values? If so, move them to the Results section.
- Have you specified the statistical software (with version), every test used, the conditions for choosing each test, and the significance threshold?
- Could another researcher in your field replicate your experiment using only the information in this section? Check for missing concentrations, time points, equipment, and sample sizes.
- Is every variable, group, and procedure referred to by the same name throughout the entire manuscript?
- Do the methods appear in an order that matches either the experimental workflow or the sequence of results?
A precise, well-organized Methods section tells reviewers that your research was conducted with the same rigor you expect from published work. If you want a professional review of whether your Methods section meets journal standards, ScholarMemory provides editing for medical and life science researchers. Contact us at contact@scholarmemory.com.