Publication Strategy for ML Research

Publishing your machine learning research requires strategic planning beyond just good science. This guide covers venue selection, authorship agreements, timeline management, and navigating the peer review process.

Understanding Publication Venues

ML/AI Conferences (Tier 1)

NeurIPS (Conference on Neural Information Processing Systems)

  • Submission deadline: May (conference in December)
  • Acceptance rate: ~25%
  • Focus: Broad ML, emphasis on theory and algorithms
  • Review process: Double-blind, 3-4 reviewers
  • Timeline: Submit May → Decisions September → Camera-ready October → Conference December

ICML (International Conference on Machine Learning)

  • Submission deadline: January (conference in July)
  • Acceptance rate: ~25%
  • Focus: Machine learning theory and applications
  • Review process: Double-blind peer review
  • Timeline: Submit January → Decisions May → Conference July

ICLR (International Conference on Learning Representations)

  • Submission deadline: September (conference in April/May)
  • Acceptance rate: ~30%
  • Focus: Representation learning, deep learning
  • Open review process: Reviews visible on OpenReview
  • Timeline: Submit September → Public reviews October-January → Decisions February → Conference April

CVPR/ICCV/ECCV (Computer Vision)

  • Focus: Computer vision, visual recognition
  • CVPR: November submission, June conference (annual)
  • ICCV: March submission, October conference (odd years)
  • ECCV: March submission, October conference (even years)

ACL/EMNLP/NAACL (Natural Language Processing)

  • Focus: Computational linguistics, NLP
  • Acceptance rates: 20-25%
  • Multiple submission cycles per year (via ACL Rolling Review)

ML/AI Journals

Journal of Machine Learning Research (JMLR)

  • Rolling submissions
  • Rigorous peer review (often 6+ months)
  • High prestige, theory-focused

IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)

  • Computer vision and pattern recognition
  • Rigorous review process
  • High impact factor

Nature Machine Intelligence

  • Broad ML audience
  • Emphasis on applications and societal impact
  • Selective (low acceptance rate)

Healthcare AI Venues

See Healthcare AI Overview for domain-specific publication venues:

ML for Healthcare Conference (MLHC)

  • Dedicated healthcare ML conference
  • Clinical applications emphasis
  • Acceptance rate: ~30%

Medical Journals

  • Nature Medicine / Lancet Digital Health: High impact, clinical focus
  • JAMIA / JBI: Medical informatics
  • NEJM AI: New England Journal of Medicine AI

Publication Timeline Planning

Thesis-to-Publication Timeline (Example)

Assuming 12-month thesis program:

Month | Research Activities                   | Publication Activities
1-2   | Literature review, design experiments | Draft authorship agreement, select venues
3-4   | Conduct experiments, initial results  | Start drafting methodology paper
5-6   | Refine experiments, collect more data | Submit Paper 1 (methodology), draft Paper 2
7-8   | Complete experiments, write thesis    | Handle Paper 1 revisions, submit Paper 2
9-10  | Write thesis chapters                 | Incorporate reviews, finalize papers
11-12 | Thesis defense, final revisions       | Complete paper revisions, plan future work

Key Principles:

  • Submit early: Don’t wait for perfection; iterate through review cycles
  • Parallel writing: Draft papers while experiments run
  • Thesis integration: Papers become thesis chapters
  • Buffer time: Account for rejections and resubmissions

Conference Submission Cycles

Plan backwards from conference deadlines:

Example: NeurIPS (December conference)

  • May: Submission deadline
  • March-April: Finalize experiments, write draft
  • February: Preliminary results, start writing
  • December-January: Design study, run pilot experiments
  • September-October: Start 6-month runway
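
This backward plan is just date arithmetic. A minimal Python sketch, assuming a hypothetical May 15 deadline and illustrative milestone offsets (neither is an official NeurIPS date):

# Backward-planning sketch: the deadline and day offsets below are
# illustrative assumptions, not official conference dates.
from datetime import date, timedelta

deadline = date(2026, 5, 15)              # hypothetical submission deadline
milestones = {                            # days before the deadline
    "Start 6-month runway": 180,
    "Design study, run pilot experiments": 150,
    "Preliminary results, start writing": 90,
    "Finalize experiments, complete draft": 30,
    "Internal review and polish": 14,
    "Submission deadline": 0,
}

for name, days_before in milestones.items():
    print(f"{(deadline - timedelta(days=days_before)).isoformat()}  {name}")

Generating the schedule once at the start of a project makes it obvious whether the runway is realistic before committing to a deadline.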

Multiple Submissions Strategy:

  • Submit to Conference A (May)
  • If rejected (September), revise and submit to Conference B (November)
  • If rejected again, submit to Conference C (January) or pivot to journal

Authorship Agreements

ICMJE Authorship Criteria

International Committee of Medical Journal Editors (ICMJE) criteria - all authors must meet ALL FOUR:

  1. Substantial contributions to conception/design OR data acquisition/analysis/interpretation
  2. Drafting the work OR revising it critically for important intellectual content
  3. Final approval of the version to be published
  4. Accountability for all aspects of the work (accuracy, integrity)

Non-qualifying contributions (acknowledged, not authored):

  • Providing materials or patients
  • General supervision alone
  • Funding acquisition alone

Establishing Authorship Early

Draft agreement before starting thesis work:

  1. First Authorship

    • Student/researcher who performed the work is first author
    • Explicitly state: “Student will be first author on papers from thesis work”
  2. Coauthor Contributions

    • Supervisor: Provides guidance, critical review (typically last author in ML)
    • Collaborators: Specify expected contributions (data, methods, analysis)
    • Data providers: May warrant coauthorship if significant contribution
  3. Publication Rights

    • Student retains right to publish thesis work
    • Timeline for submission (can submit during thesis without approval delays)
    • IP ownership (findings belong to student for publication purposes)
  4. Dispute Resolution

    • Process for handling authorship disagreements
    • Department/university mediation if needed

Document Template:

Authorship Agreement for [Project Name]

Primary Researcher: [Student Name] - First Author
  - Design experiments, implement models, run analysis, write manuscript

Principal Investigator: [Supervisor Name] - Last Author
  - Provide guidance, critical review, intellectual contributions

Data Collaborator: [Name] - Middle Author (if applicable)
  - Provide dataset access, domain expertise, interpretation

All authors agree to meet ICMJE criteria. Student retains publication rights for thesis work.

Agreement signed: [Date]

Understanding Review Process

Double-Blind Review (Most ML Conferences):

  • Authors don’t know reviewers
  • Reviewers don’t know authors
  • Designed to minimize reviewer bias

Open Review (ICLR, some journals):

  • Reviews posted publicly on OpenReview
  • Authors can respond to reviewers
  • Community can comment

Common Rejection Reasons

  1. Insufficient novelty - “Incremental improvement over existing work”

    • Solution: Clearly articulate novel contributions, emphasize insights
  2. Weak baselines - “Authors didn’t compare to state-of-the-art”

    • Solution: Compare against strong, recent baselines and tune them fairly
  3. Limited evaluation - “Tested on only one dataset”

    • Solution: Test on multiple benchmarks, report comprehensive metrics
  4. Unclear writing - “Paper is difficult to follow”

    • Solution: Restructure around the core contribution and get feedback from fresh readers
  5. Overstated claims - “Results don’t support conclusions”

    • Solution: Match claims to evidence, be precise

Responding to Reviews

Rebuttal Strategy (for conferences with rebuttal period):

  1. Thank reviewers - Start positively
  2. Address each point - Number responses to match review comments
  3. Provide evidence - Additional experiments, clarifications, references
  4. Remain professional - Never defensive or argumentative
  5. Highlight changes - “We have added X to Section Y”

Example Rebuttal Structure:

We thank the reviewers for their thoughtful feedback. Below we address each concern:

## Reviewer 1

**Comment 1.1**: "Baseline comparison to [Method X] is missing"

We agree this is an important baseline. We have now implemented [Method X] and added results to Table 2 (see updated manuscript). Our method achieves 3.2% higher accuracy while using 50% fewer parameters.

**Comment 1.2**: "Clarity of Section 3.2"

We have restructured Section 3.2 with a concrete example (Figure 3) and simplified the notation following Reviewer 1's suggestion.

[Continue for each point...]

Revision and Resubmission

After Rejection:

  1. Take time to process - Don’t immediately resubmit elsewhere
  2. Analyze feedback objectively - What are valid criticisms?
  3. Improve the work - Address weaknesses, add experiments
  4. Revise thoroughly - Incorporate all actionable feedback
  5. Choose next venue - Different conference or pivot to journal

Typical Revision Checklist:

  • Address all major reviewer concerns
  • Add suggested baselines/ablations
  • Improve clarity (have someone read it fresh)
  • Update related work with recent papers
  • Check all figures are publication-quality
  • Proofread for typos and grammar

Venue Selection Strategy

Matching Work to Venue

Methodological Contribution → ML Conference

  • Novel architecture, training technique, optimization method
  • Target: NeurIPS, ICML, ICLR

Computer Vision Application → CV Conference

  • New dataset, task, or vision-specific method
  • Target: CVPR, ICCV, ECCV

Domain-Specific Validation → Domain Conference/Journal

  • Healthcare, robotics, climate science application
  • Target: MLHC, Nature Medicine, domain-specific journals

Interdisciplinary Work → Consider Multiple Paths

  • Could fit both ML conference (methodology) and domain journal (application)
  • Often submit two papers: one for each audience

Hedging Your Bets

Simultaneous Submissions (Generally NOT Allowed):

  • Most venues prohibit concurrent submissions
  • Exception: Workshops are often okay alongside main conference submissions

Sequential Submission Strategy:

  1. Submit to top-tier conference (NeurIPS, ICML, ICLR)
  2. If rejected, revise and submit to second-tier conference (AAAI, IJCAI)
  3. If rejected again, extend significantly and submit to journal (JMLR, TPAMI)

Workshop → Conference Path:

  • Submit extended abstract to workshop (informal, quick turnaround)
  • Get feedback from community
  • Extend to full paper for conference submission
  • Non-archival workshops typically don’t count as prior publication (check the target conference’s dual-submission policy)

Preprints and ArXiv

ArXiv Submission:

  • Post preprint before or concurrent with conference submission
  • Establishes priority, gets early feedback
  • Most ML conferences allow (even encourage) arXiv posting

Benefits:

  • Establish timestamp for ideas
  • Get feedback before formal review
  • Increase visibility and citations
  • Fits naturally with open-review venues such as ICLR, where submissions are already public on OpenReview

Timing:

  • Post to arXiv when you submit to conference
  • Update arXiv version after acceptance with camera-ready

Potential Concerns:

  • Some medical journals may consider arXiv as prior publication (check policies)
  • Your work is public and could be scooped (rare in practice)

Ethics and Best Practices

Research Ethics

Data Ethics:

  • Obtain proper IRB approval for human subjects research
  • Anonymize data, protect privacy (especially healthcare)
  • Report data characteristics and potential biases

Algorithmic Ethics:

  • Conduct fairness audits (see Interpretability)
  • Report limitations and failure modes
  • Consider societal impact

Reproducibility:

  • Release code (GitHub) when possible
  • Document hyperparameters and training details (see the sketch after this list)
  • Provide dataset access or clear description
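
The sketch referenced above: a minimal run record that fixes a seed and writes the configuration plus environment details next to the results. The model name and hyperparameter values are illustrative assumptions, not a prescribed set:

# Minimal run-record sketch: the model name and hyperparameter values are
# illustrative assumptions; record whatever your experiment actually uses.
import json
import platform
import random
import sys

config = {
    "model": "resnet18",
    "learning_rate": 3e-4,
    "batch_size": 64,
    "epochs": 50,
    "seed": 42,
}

random.seed(config["seed"])   # also seed numpy/torch/etc. if you use them

run_record = {
    "config": config,
    "python_version": sys.version,
    "platform": platform.platform(),
}

with open("run_record.json", "w") as f:
    json.dump(run_record, f, indent=2)

Committing this record (or attaching it to the released code) is usually enough for others to rerun the experiment under the same settings.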

Avoiding Predatory Publishers

Warning Signs:

  • Unsolicited email invitations to submit
  • Promises of rapid publication (< 1 month)
  • High publication fees with unclear review process
  • Journal not indexed in recognized databases (PubMed, Scopus)

Legitimate Venues:

  • Check Beall’s List of predatory publishers
  • Verify journal in DOAJ (Directory of Open Access Journals)
  • Ask your supervisor if unfamiliar with venue

Publication Checklist

Before submitting, verify:

  • Novelty: Clearly stated contributions
  • Baselines: Compared to strong, recent methods
  • Evaluation: Multiple datasets, comprehensive metrics
  • Writing: Clear, follows venue’s template
  • Figures: High-quality, publication-ready
  • References: Complete, formatted correctly
  • Code: Ready to release (if planning open source)
  • Ethics: IRB approval, data permissions obtained
  • Authorship: All authors reviewed and approved
  • Formatting: Meets venue requirements exactly
  • Supplementary: Code, data, additional results prepared

Summary

Key Takeaways:

  1. Plan early: Match your work to appropriate venues, plan timeline backwards from deadlines
  2. Authorship first: Draft authorship agreement before starting research
  3. Iterate: Expect rejections, use feedback to improve
  4. Ethics matter: Protect data privacy, report limitations, ensure reproducibility
  5. Build relationships: Collaborate with supervisors and domain experts
  6. Stay current: Follow conference deadlines, read recent papers

Publishing is a skill that improves with practice. Your first submission will be stressful, but each subsequent publication gets easier as you learn the process and build expertise.