
Feature: Add Perplexity metric to ignite.metrics.nlp #3742

@steaphenai

Description


🚀 Feature

I’d like to add a new NLP metric, Perplexity, under ignite.metrics.nlp, and expose it from the top-level ignite.metrics namespace.

Motivation

Perplexity is a standard metric for language modeling, and having it built into Ignite would make evaluation pipelines easier and more consistent with other built-in metrics.

Proposal

  • Add Perplexity metric implementation in ignite.metrics.nlp.perplexity.
  • Export it from:
    • ignite.metrics.nlp
    • ignite.metrics (top-level import path)
  • Include tests for:
    • correctness vs manual computation
    • token-weighted accumulation across updates
    • reset behavior
    • input shape validation
    • edge cases (single token / empty compute handling)
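To make the intended accumulation semantics concrete, here is a minimal standalone sketch in plain Python. It is illustrative only: the class name and method signatures are hypothetical and do not claim to match ignite's actual `Metric` base-class API; it only demonstrates the token-weighted accumulation, reset, and empty-compute behavior the test list above describes.

```python
import math

class PerplexitySketch:
    """Hypothetical sketch of the proposed metric (not the real ignite API).

    Accumulates token-weighted negative log-likelihood (in nats) across
    updates and exponentiates the weighted mean on compute().
    """

    def __init__(self):
        self.reset()

    def reset(self):
        # Running sum of per-token NLL and total token count.
        self._total_nll = 0.0
        self._num_tokens = 0

    def update(self, mean_nll, num_tokens):
        # Weight each batch's mean loss by its token count so that
        # batches of different lengths contribute proportionally.
        self._total_nll += mean_nll * num_tokens
        self._num_tokens += num_tokens

    def compute(self):
        if self._num_tokens == 0:
            # Mirrors the "empty compute" edge case from the test list.
            raise ValueError("Perplexity must see at least one update before compute()")
        return math.exp(self._total_nll / self._num_tokens)

metric = PerplexitySketch()
metric.update(2.0, 3)  # batch of 3 tokens, mean NLL 2.0
metric.update(4.0, 1)  # batch of 1 token, mean NLL 4.0
# Weighted mean NLL = (2.0*3 + 4.0*1) / 4 = 2.5, so perplexity = exp(2.5)
print(metric.compute())
```

Note the token weighting: simply averaging the two batch losses would give `exp(3.0)`, which is why correct accumulation across updates deserves its own test.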

Alternatives considered

Users can currently compute perplexity manually by exponentiating an averaged cross-entropy loss, but reimplementing the token-weighted bookkeeping in every project is repetitive and error-prone.

Additional context

I have a working branch and tests for this implementation and can open a PR right after this issue is created.
