Editor Resources
Tools and templates to support consistent editorial decisions.
Journal at a Glance
ISSN: 2643-2811
Journal DOI: 10.14302/issn.2643-2811
License: CC BY 4.0
Peer-reviewed, open-access journal
Scope Alignment
Model-based research, simulation, digital twins, computational methods, systems engineering, and data-driven decision support. We prioritize validated models and reproducible workflows.
Publishing Model
Open access, single-blind peer review, and rapid publication after acceptance and production checks. Metadata validation and DOI registration are included.
JMBR provides editors with resources that support efficient review management and consistent decisions. These include templates, checklists, and policy guidance tailored to model-based research.
- Reviewer selection checklist
- Decision letter templates
- Ethics and conflict of interest guidance
- Model validation and reporting checklist
Editors should prioritize transparency, reproducibility, and clarity in decision-making. Encourage authors to provide data availability statements, validation results, and documentation of model assumptions.
For additional resources or workflow questions, contact [email protected].
JMBR is committed to rigorous, transparent publishing in model-based research. We emphasize reproducible methods, complete data statements, and ethical compliance across all article types.
The editorial office supports authors, editors, and reviewers with clear guidance and responsive communication. For questions about scope or workflow, contact [email protected].
We encourage continuous improvement in reporting practices and share updates that help the community maintain high standards in computational and simulation research.
Need Editorial Support?
Contact the editorial office for resources or workflow assistance.
Include a concise model summary that describes inputs, outputs, and key assumptions so readers can understand how the model operates before diving into equations. A short flow diagram or schematic in the manuscript can improve comprehension and review quality.
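For illustration, one way such a summary could be expressed as a small, structured object in code; the model name, variables, units, and assumptions below are hypothetical placeholders, not a required format:

```python
from dataclasses import dataclass, field

@dataclass
class ModelSummary:
    """Structured summary of a model's interface and key assumptions."""
    name: str
    inputs: dict        # variable -> units / meaning
    outputs: dict
    assumptions: list = field(default_factory=list)

    def report(self) -> str:
        lines = [f"Model: {self.name}", "Inputs:"]
        lines += [f"  {k}: {v}" for k, v in self.inputs.items()]
        lines.append("Outputs:")
        lines += [f"  {k}: {v}" for k, v in self.outputs.items()]
        lines.append("Assumptions:")
        lines += [f"  - {a}" for a in self.assumptions]
        return "\n".join(lines)

# Hypothetical example: a lumped-parameter heat-exchanger model.
summary = ModelSummary(
    name="Heat exchanger (lumped-parameter)",
    inputs={"T_in": "inlet temperature, K", "m_dot": "mass flow rate, kg/s"},
    outputs={"T_out": "outlet temperature, K"},
    assumptions=["steady state", "constant specific heat"],
)
print(summary.report())
```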
Provide details on computational resources, solver settings, and runtime considerations when they affect model performance or scalability. Reporting these elements helps reviewers assess feasibility and supports reproducibility across different research environments.
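A minimal sketch of capturing such details alongside results, using only the Python standard library; the field names and solver settings are illustrative assumptions, not a prescribed schema:

```python
import json
import platform
import sys
import time

def environment_report(solver_settings: dict) -> dict:
    """Gather runtime environment details and solver settings so they can be
    reported verbatim in the manuscript or its supplementary material."""
    return {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "recorded_at": time.strftime("%Y-%m-%d %H:%M:%S"),
        "solver_settings": solver_settings,
    }

# Hypothetical solver configuration for a stiff ODE integration.
settings = {"method": "BDF", "rtol": 1e-6, "atol": 1e-9, "max_step": 0.1}
print(json.dumps(environment_report(settings), indent=2))
```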
Benchmark new algorithms against established baselines and explain evaluation criteria, including accuracy, stability, and computational efficiency. Clear benchmarking strengthens the contribution and clarifies why the proposed method advances the field.
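A minimal benchmarking sketch, assuming a toy baseline and a toy "proposed" method; real submissions would substitute their own models, datasets, and evaluation criteria:

```python
import statistics
import time

def benchmark(fn, data, repeats=5):
    """Run fn on data several times; return the result and median runtime."""
    times, result = [], None
    for _ in range(repeats):
        start = time.perf_counter()
        result = fn(data)
        times.append(time.perf_counter() - start)
    return result, statistics.median(times)

def rmse(pred, truth):
    """Root-mean-square error between predictions and reference values."""
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

# Hypothetical stand-ins: both methods approximate y = 2x.
baseline = lambda xs: [2.0 * x for x in xs]
proposed = lambda xs: [2.0 * x + 0.01 for x in xs]

xs = [float(i) for i in range(1000)]
truth = [2.0 * x for x in xs]
for name, model in [("baseline", baseline), ("proposed", proposed)]:
    pred, t = benchmark(model, xs)
    print(f"{name}: RMSE = {rmse(pred, truth):.4f}, "
          f"median runtime = {t * 1e3:.2f} ms")
```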
Describe how uncertainty was quantified and how sensitivity analyses informed conclusions. If uncertainty was not evaluated, explain why and clarify the limitations this introduces for decision making or predictive accuracy.
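One common approach is Monte Carlo propagation paired with a one-at-a-time sensitivity check; the toy decay model and input distributions below are assumptions for illustration only:

```python
import math
import random
import statistics

def model(k, x0):
    """Toy exponential-decay model: remaining quantity after one time unit."""
    return x0 * math.exp(-k)

# Monte Carlo propagation: sample uncertain inputs, summarize the output spread.
random.seed(0)
samples = [model(random.gauss(0.5, 0.05), random.gauss(10.0, 0.5))
           for _ in range(10_000)]
print(f"output: {statistics.mean(samples):.3f} "
      f"+/- {statistics.stdev(samples):.3f} (1 sigma)")

# One-at-a-time sensitivity: perturb each input by +1% and record the effect.
base = model(0.5, 10.0)
for label, value in [("k", model(0.505, 10.0)), ("x0", model(0.5, 10.1))]:
    print(f"effect of +1% {label}: {value - base:+.4f}")
```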
Explain the decision context for applied models, such as engineering design, policy analysis, or operational planning. Reviewers look for a clear link between model outputs and real-world use cases.
Document preprocessing steps for input data, including normalization, filtering, or feature engineering, to ensure transparency and enable independent replication of results.
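A short sketch of documenting preprocessing as runnable code, here a median-absolute-deviation outlier filter followed by z-score normalization; the threshold and sample data are illustrative, not a recommended pipeline:

```python
import statistics

def preprocess(values, k=3.0):
    """Drop values farther than k median-absolute-deviations from the median,
    then z-score normalize the remaining values."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    kept = [v for v in values if abs(v - med) <= k * mad] if mad else list(values)
    mu, sd = statistics.mean(kept), statistics.stdev(kept)
    return [(v - mu) / sd for v in kept]

raw = [9.8, 10.1, 10.0, 9.9, 25.0, 10.2]  # 25.0 is an injected outlier
print(preprocess(raw))
```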
Define notation and acronyms consistently across the manuscript, especially when multiple models or scenarios are compared. Consistency reduces ambiguity during peer review.
Discuss limitations candidly, including boundary conditions, assumptions that may not generalize, and potential sources of bias in data or modeling choices.