I have been refereeing and editing biochemistry papers for nearly 50 years, many dealing in some way with enzyme properties. However, checking that every detail in an experimental paper has been accurately and fully described requires more time and effort than one can reasonably ask referees to invest. Some faults are obvious enough, for example if the authors have used an acetate buffer at pH 7, but many are not so easily detected. The problem of transparency is not of course confined to enzyme experiments, and the editors of Nature (2018) recently discussed in a more general context the use of checklists for verifying that experiments are adequately described.
Everyone agrees that published experiments should be reproducible, not only by the original authors, but also by others who might want to check and extend them, or to incorporate the kinetic parameters in models of metabolic pathways. However, this is easier said than done. Is the enzyme studied clearly identified, including its source and method of purification? Is the method of assay fully described, including assay components, temperature and pH? Are the experiments described in sufficient detail, specifying ranges of substrate and effector concentrations, and so on? Is the type of buffer appropriate? Which kinetic parameters have been obtained from the data?
Ensuring that these requirements are met implies a large amount of work. So far as authors are concerned, it is their responsibility to describe their experiments fully and accurately, however laborious that may be. However, it is hardly realistic to expect a referee to check every detail in a paper and insist on changes being made if the description is misleading or incomplete, and in practice unclear papers are often published. The problem has always existed, but it has become particularly obvious with the growth of databases that seek to compile kinetic information for all enzymes. Curators of databases such as SABIO-RK have often faced an almost impossible task, as discussed by Wittig and colleagues (2014).
In the early 21st century the Beilstein Institute, famous for the Beilstein Database of organic compounds, became interested in the need for effective standards in enzymology, and set up the Strenda (STandards for Reporting ENzymology DAta) Commission to analyse the requirements and define standards. The Strenda Guidelines (Apweiler et al., 2005; Tipton et al., 2014) now form part of the Instructions to Authors of many journals in biochemistry.
However, ensuring manually that these guidelines are satisfied still implies a large amount of work for authors, and an unrealistic amount of work for referees. The next stage, alleviating as much of that burden as possible, has now arrived in the form of the automated Strenda DB database (Swainston et al., 2018). The submission tool incorporates the Guidelines, which specify the minimum information requested in the reporting of enzyme function data, including kinetic parameter values and the full experimental conditions under which they were acquired. Strenda DB then checks the manuscript data entered by the author for compliance with the Guidelines.
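The idea of an automated minimum-information check can be sketched very simply. The field names and logic below are purely illustrative assumptions, not the actual Strenda DB schema or implementation: a submission is treated as a set of key-value pairs and compared against a list of required fields.

```python
# Illustrative sketch of a minimum-information compliance check of the
# kind a tool like Strenda DB performs. Field names are hypothetical,
# not the real Strenda DB schema.

REQUIRED_FIELDS = {
    "enzyme_name", "ec_number", "source_organism",
    "assay_temperature_C", "assay_pH", "buffer",
    "substrate_concentration_range", "kinetic_parameters",
}

def check_compliance(entry):
    """Return the set of required fields missing from a submission."""
    return REQUIRED_FIELDS - entry.keys()

# Example submission (values invented for illustration).
entry = {
    "enzyme_name": "alcohol dehydrogenase",
    "ec_number": "1.1.1.1",
    "source_organism": "Saccharomyces cerevisiae",
    "assay_temperature_C": 25,
    "assay_pH": 7.5,
    "buffer": "0.1 M potassium phosphate",
    "substrate_concentration_range": "0.05-5 mM ethanol",
    "kinetic_parameters": {"kcat_per_s": 340, "Km_mM": 0.45},
}

missing = check_compliance(entry)
print("compliant" if not missing else f"missing: {sorted(missing)}")
```

In the real workflow the author would be prompted, through a series of dialogues, to supply any missing items before a registry number is issued; the sketch only shows why such a check is cheap to automate and expensive to do by hand.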
Publication in coordination with Strenda DB proceeds as follows: authors first submit the data to the database by supplying the information requested in a series of dialogues, and make any corrections or additions necessary. The data are then stored in the Strenda database, but remain private, accessible only by the authors. After a Strenda Registry No. has been assigned, the manuscript can be submitted to the journal, with editors secure in the knowledge that the Guidelines have been followed. Once the paper has been accepted after peer review it can go through the usual steps of DOI registration, PubMed ID assignment, and publication. The definitive data in the database then become public, accessible by anyone.
The database is now freely available at http://www.strenda.db.org. Journals are encouraged to urge all authors of papers on enzyme characterization to use it, as it will make it far easier to publish reproducible kinetic data.
References
Anonymous (2018) "Two documents for greater transparency" Nature 555, 6
Apweiler R, Cornish-Bowden A, Hofmeyr JH, Kettner C, Leyh TS, Schomburg D & Tipton K (2005) "The importance of uniformity in reporting protein-function data" Trends Biochem Sci 30, 11-12
Swainston N, Baici A, Bakker BM, Cornish-Bowden A, Fitzpatrick PF, Halling P, Leyh TS, O'Donovan C, Raushel FM, Reschel U, Rohwer JM, Schnell S, Schomburg D, Tipton KF, Tsai M-D, Westerhoff HV, Wittig U, Wohlgemuth R & Kettner C (2018) "STRENDA DB: enabling the validation and sharing of enzyme kinetics data" FEBS J. in press, DOI: 10.1111/febs.14427 (Open Access)
Tipton KF, Armstrong RN, Bakker BM, Bairoch A, Cornish-Bowden A, Halling PJ, Hofmeyr J-H, Leyh TS, Kettner C, Raushel FM, Rohwer J, Schomburg D & Steinbeck C (2014) "Standards for reporting enzyme data: the STRENDA consortium: what it aims to do and why it should be helpful" Perspect Sci 1, 131-137
Wittig U, Rey M, Kania R, Bittkowski M, Shi L, Golebiewski M, Weidemann A, Müller W & Rojas I (2014) "Challenges for an enzymatic reaction kinetics database" FEBS J. 281, 572-582, DOI: 10.1111/febs.12562