MachineLearning/transformer library

🔁 Transformer (encoder-style minimal)

Minimal, well-documented transformer encoder block and a small sequence model inspired by modern transformer architectures. Provides clear constructor options (number of heads, model dimension), fit/predict signatures, and explanatory docstrings to ease extension toward production-grade variants.

Contract:

  • Input: token sequences as integer IDs (`List<List<int>>`) or pre-computed embeddings
  • Output: sequence embeddings or logits for downstream tasks
  • Errors: throws `ArgumentError` on invalid input (e.g. empty or malformed sequences)

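The input/error side of this contract can be sketched as a validation step. The helper below is hypothetical (the library's actual checks may differ), and it is written in Python for illustration, raising `ValueError` where the Dart library would throw `ArgumentError`:

```python
def validate_sequences(batch):
    """Check a batch of token-id sequences against the stated contract.

    Hypothetical helper for illustration; raises ValueError (standing in
    for the library's ArgumentError) when the input is invalid.
    """
    if not isinstance(batch, list) or not batch:
        raise ValueError("expected a non-empty list of sequences")
    for seq in batch:
        if not isinstance(seq, list) or not seq:
            raise ValueError("each sequence must be a non-empty list")
        if not all(isinstance(t, int) and t >= 0 for t in seq):
            raise ValueError("token ids must be non-negative integers")
    return True
```

A call like `validate_sequences([[1, 2, 3], [4]])` passes, while an empty batch or a sequence containing a negative id fails fast, matching the "throws on invalid input" behavior described above.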
Classes

Transformer
Minimal Transformer encoder wrapper providing a tiny, testable abstraction: token embeddings -> mean pooling -> MLP head (ANN).
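The embed -> mean-pool -> MLP pipeline can be sketched in a few lines. This is a minimal Python illustration of the technique, not the library's API; the weight layout (`embedding`, `w1`, `b1`, `w2`, `b2`) and the function name `encode` are assumptions made for the example:

```python
def encode(token_ids, embedding, w1, b1, w2, b2):
    """Sketch of the pipeline: embed -> mean-pool -> 2-layer MLP head.

    embedding: list of d-dim vectors indexed by token id (assumed layout).
    w1, b1:    first linear layer (hidden x d weights, hidden biases).
    w2, b2:    output layer (classes x hidden weights, class biases).
    """
    if not token_ids:
        raise ValueError("empty sequence")  # mirrors the ArgumentError contract
    # 1. Look up an embedding vector for each token id.
    vecs = [embedding[t] for t in token_ids]
    d = len(vecs[0])
    # 2. Mean-pool over the sequence dimension -> one d-dim vector.
    pooled = [sum(v[i] for v in vecs) / len(vecs) for i in range(d)]
    # 3. MLP head: linear -> ReLU -> linear, producing logits.
    hidden = [max(0.0, sum(pooled[i] * w1[j][i] for i in range(d)) + b1[j])
              for j in range(len(w1))]
    logits = [sum(hidden[j] * w2[k][j] for j in range(len(hidden))) + b2[k]
              for k in range(len(w2))]
    return logits
```

Mean pooling keeps the abstraction order-insensitive and trivially testable; swapping it for self-attention is the natural path toward the production-grade variants mentioned above.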