Train

Train models for creating synthetic data. This module is the primary entrypoint for creating a model. It depends on having created an engine-specific configuration and, optionally, a tokenizer to be used.

class gretel_synthetics.train.EpochState(epoch: int, accuracy: float | None = None, loss: float | None = None, val_accuracy: float | None = None, val_loss: float | None = None, batch: int | None = None, epsilon: float | None = None, delta: float | None = None)

Training state passed to the epoch callback on BaseConfig at the end of each epoch.
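
For example, a per-epoch logging callback might look like the sketch below. This is illustrative only: the callback is assumed to be attached through the config's epoch callback hook (e.g. an epoch_callback field on a BaseConfig subclass); check your version for the exact parameter name.

    from gretel_synthetics.train import EpochState

    def log_epoch(state: EpochState) -> None:
        # Receives the training state at the end of each epoch.
        print(
            f"epoch={state.epoch} loss={state.loss} accuracy={state.accuracy} "
            f"val_loss={state.val_loss} epsilon={state.epsilon}"
        )

    # Assumed wiring (illustrative): pass the callback when constructing the
    # engine-specific config, e.g. TensorFlowConfig(..., epoch_callback=log_epoch).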

class gretel_synthetics.train.TrainingParams(tokenizer_trainer: BaseTokenizerTrainer, tokenizer: BaseTokenizer, config: BaseConfig)

A structure that is created and passed into the engine-specific training entrypoint. All engine-specific training entrypoints should expect to receive this object and process accordingly.
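
A hypothetical engine-specific entrypoint that consumes this structure might look like the following sketch; the function name and body are illustrative, and only the TrainingParams fields come from this module.

    from gretel_synthetics.train import TrainingParams

    def example_engine_train(params: TrainingParams) -> None:
        # Unpack the bundle assembled by the train() facade.
        config = params.config                    # engine-specific BaseConfig subclass
        tokenizer = params.tokenizer              # tokenizer used to encode the training data
        tokenizer_trainer = params.tokenizer_trainer
        # ... run the engine's actual training loop with these objects ...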

gretel_synthetics.train.train(store: BaseConfig, tokenizer_trainer: BaseTokenizerTrainer | None = None)

Train a Synthetic Model. This is a facade entrypoint that invokes the engine-specific training operation based on the provided configuration.

Parameters:
  • store – A subclass instance of BaseConfig. This config is responsible for providing the actual training entrypoint for a specific training routine.

  • tokenizer_trainer – An optional subclass instance of a BaseTokenizerTrainer. If provided, this tokenizer trainer will be used to pre-process and create an annotated dataset for training. If not provided, a default tokenizer will be used.
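
Putting it together, a typical call might look like the sketch below, assuming the TensorFlow engine config and the SentencePiece tokenizer trainer shipped with gretel-synthetics; the file paths are placeholders and all other parameters are left at their defaults.

    from gretel_synthetics.config import TensorFlowConfig
    from gretel_synthetics.tokenizers import SentencePieceTokenizerTrainer
    from gretel_synthetics.train import train

    # Placeholder paths; point these at your own training data and model directory.
    config = TensorFlowConfig(
        input_data_path="training_data.csv",
        checkpoint_dir="model-checkpoints",
    )

    # Optional: provide an explicit tokenizer trainer; omit it to use the default.
    tokenizer_trainer = SentencePieceTokenizerTrainer(config=config)

    train(config, tokenizer_trainer)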

gretel_synthetics.train.train_rnn(store: BaseConfig)

Facade to support backwards compatibility for <= 0.14.x versions.
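
A minimal sketch of the legacy call, assuming the same placeholder config as above; on current versions this behaves like calling train() without an explicit tokenizer trainer.

    from gretel_synthetics.config import TensorFlowConfig
    from gretel_synthetics.train import train_rnn

    config = TensorFlowConfig(
        input_data_path="training_data.csv",
        checkpoint_dir="model-checkpoints",
    )
    train_rnn(config)  # legacy entrypoint kept for pre-0.15 callers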