Struct LearnerBuilder

pub struct LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    B: AutodiffBackend,
    M: AutodiffModule<B> + TrainStep<TI, TO> + Display + 'static,
    <M as AutodiffModule<B>>::InnerModule: ValidStep<VI, VO>,
    O: Optimizer<M, B>,
    S: LrScheduler,
    TI: Send + 'static,
    VI: Send + 'static,
    TO: ItemLazy + 'static,
    VO: ItemLazy + 'static,
{ /* private fields */ }
Struct to configure and create a learner.
The builder's generic parameters should generally not be set manually, as they are designed to be resolved by Rust type inference.
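Concretely, a training setup usually chains configuration calls on the builder and finishes with build and fit. A minimal sketch, assuming a model, optimizer, scheduler, and data loaders already exist and satisfy the trait bounds above (only the builder calls are taken from this page):

```rust
// Minimal sketch of the builder flow. `model`, `optim`, `lr_scheduler`,
// `dataloader_train`, and `dataloader_valid` are assumed to exist and
// satisfy the trait bounds listed above; `LossMetric` is assumed from
// burn's metric module.
let learner = LearnerBuilder::new("/tmp/my_experiment")
    .metric_train_numeric(LossMetric::new()) // track training loss
    .metric_valid_numeric(LossMetric::new()) // track validation loss
    .num_epochs(10)
    .summary() // display a training report after fitting
    .build(model, optim, lr_scheduler);

let model_trained = learner.fit(dataloader_train, dataloader_valid);
```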
Implementations

impl<B, M, O, S, TI, VI, TO, VO> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    B: AutodiffBackend,
    M: AutodiffModule<B> + TrainStep<TI, TO> + Display + 'static,
    <M as AutodiffModule<B>>::InnerModule: ValidStep<VI, VO>,
    O: Optimizer<M, B>,
    S: LrScheduler,
    TI: Send + 'static,
    VI: Send + 'static,
    TO: ItemLazy + 'static,
    VO: ItemLazy + 'static,
pub fn new(directory: impl AsRef<Path>) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>

Creates a new learner builder.
pub fn metric_loggers<MT, MV>(
    self,
    logger_train: MT,
    logger_valid: MV,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    MT: MetricLogger + 'static,
    MV: MetricLogger + 'static,
Replace the default metric loggers with the provided ones.

Arguments

- logger_train - The training logger.
- logger_valid - The validation logger.
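As a sketch, file-based loggers pointing at custom directories could be supplied like this (FileMetricLogger is assumed from burn's logger module; its constructor may differ across versions):

```rust
// Replace the default loggers with file loggers in custom directories.
// `FileMetricLogger` and its constructor are assumptions here.
let builder = builder.metric_loggers(
    FileMetricLogger::new("/tmp/experiment/train"),
    FileMetricLogger::new("/tmp/experiment/valid"),
);
```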
pub fn with_checkpointing_strategy<CS>(
    self,
    strategy: CS,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    CS: CheckpointingStrategy + 'static,
Replace the default checkpointing strategy.
pub fn renderer<MR>(
    self,
    renderer: MR,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    MR: MetricsRenderer + 'static,

Replace the default metrics renderer.
pub fn metrics<Me>(
    self,
    metrics: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    Me: MetricRegistration<B, M, O, S, TI, VI, TO, VO>,

Register all metrics as numeric for the training and validation sets.
pub fn metrics_text<Me>(
    self,
    metrics: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    Me: TextMetricRegistration<B, M, O, S, TI, VI, TO, VO>,

Register all metrics as text for the training and validation sets.
pub fn metric_train<Me>(
    self,
    metric: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Register a training metric.
pub fn metric_valid<Me>(
    self,
    metric: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Register a validation metric.
pub fn grads_accumulation(
    self,
    accumulation: usize,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Enable gradient accumulation.

Notes

When gradient accumulation is enabled, the gradients object passed to the optimizer is the sum of the gradients produced by each backward pass. The effect is similar to increasing both the batch size and the learning rate by the accumulation amount, so it may be a good idea to reduce the learning rate to compensate.
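For example, accumulating over 4 steps behaves roughly like a 4x larger batch with a 4x larger learning rate, so one might scale the base rate down accordingly. A sketch (the division is a heuristic, not a rule; it assumes a plain f64 is accepted as the scheduler, as noted under build):

```rust
// Accumulate gradients over 4 backward passes before each optimizer step.
let accumulation = 4;
let builder = builder.grads_accumulation(accumulation);

// Gradients are summed, so dividing the learning rate by the accumulation
// amount keeps the effective step size comparable to no accumulation.
let lr = 1e-3 / accumulation as f64;
let learner = builder.build(model, optim, lr);
```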
pub fn metric_train_numeric<Me>(
    self,
    metric: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>

Register a numeric training metric.
pub fn metric_valid_numeric<Me>(
    self,
    metric: Me,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>

Register a numeric validation metric.
pub fn num_epochs(
    self,
    num_epochs: usize,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
The number of epochs the training should last.
pub fn learning_strategy(
    self,
    learning_strategy: LearningStrategy<B>,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Select the learning strategy used to run the training loop.
pub fn checkpoint(
    self,
    checkpoint: usize,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
The epoch from which the training must resume.
pub fn interrupter(&self) -> Interrupter
Provides a handle that can be used to interrupt training.
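A sketch of interrupting training from another thread, assuming Interrupter exposes a stop() method (the method name is an assumption):

```rust
// Obtain the handle before starting the (blocking) fit call, then use it
// from another thread to request that training stops.
let interrupter = builder.interrupter();

std::thread::spawn(move || {
    // e.g. enforce a one-hour training budget
    std::thread::sleep(std::time::Duration::from_secs(3600));
    interrupter.stop(); // assumed method for signaling interruption
});
```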
pub fn with_interrupter(
    self,
    interrupter: Interrupter,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Override the training interruption handle with an externally provided one.
pub fn early_stopping<Strategy>(
    self,
    strategy: Strategy,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Register an early stopping strategy that stops the training when its conditions are met.
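One possible configuration, assuming burn's MetricEarlyStoppingStrategy, Aggregate, Direction, Split, and StoppingCondition types (the constructor arguments shown are assumptions and may differ between versions):

```rust
// Stop training when the mean validation loss has not improved
// for 3 consecutive epochs. All constructor details here are
// assumptions about the MetricEarlyStoppingStrategy API.
let builder = builder.early_stopping(MetricEarlyStoppingStrategy::new(
    &LossMetric::new(),
    Aggregate::Mean,
    Direction::Lowest,
    Split::Valid,
    StoppingCondition::NoImprovementSince { n_epochs: 3 },
));
```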
pub fn with_application_logger(
    self,
    logger: Option<Box<dyn ApplicationLoggerInstaller>>,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
By default, Rust logs are captured and written to experiment.log. Pass None to disable this, in which case standard Rust log handling applies.
pub fn with_file_checkpointer<FR>(
    self,
    recorder: FR,
) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
where
    FR: FileRecorder<B> + 'static + FileRecorder<<B as AutodiffBackend>::InnerBackend>,
    <O as Optimizer<M, B>>::Record: 'static,
    <M as Module<B>>::Record: 'static,
    <S as LrScheduler>::Record<B>: 'static,

Register a file checkpointer that uses the provided recorder to save and load checkpoints.
pub fn summary(self) -> LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Enable the training summary report.
The summary will be displayed after .fit(), when the renderer is dropped.
pub fn build(
    self,
    model: M,
    optim: O,
    lr_scheduler: S,
) -> Learner<LearnerComponentsMarker<B, S, M, O,
        AsyncCheckpointer<<M as Module<B>>::Record, B>,
        AsyncCheckpointer<<O as Optimizer<M, B>>::Record, B>,
        AsyncCheckpointer<<S as LrScheduler>::Record<B>, B>,
        AsyncProcessorTraining<FullEventProcessorTraining<TO, VO>>,
        Box<dyn CheckpointingStrategy>,
        LearningDataMarker<TI, VI, TO, VO>>>
Create the learner from a model and an optimizer. The learning rate scheduler can also be a simple learning rate.
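Since the scheduler parameter can be a simple learning rate, a constant-rate learner can be built directly. A sketch, assuming model and optim are constructed elsewhere and a plain f64 satisfies the LrScheduler bound:

```rust
// Build with a fixed learning rate of 1e-3 instead of a scheduler.
// `model` and `optim` are assumed to be constructed elsewhere.
let learner = builder.build(model, optim, 1e-3);
```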
Auto Trait Implementations
impl<B, M, O, S, TI, VI, TO, VO> Freeze for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
impl<B, M, O, S, TI, VI, TO, VO> !RefUnwindSafe for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
impl<B, M, O, S, TI, VI, TO, VO> !Send for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
impl<B, M, O, S, TI, VI, TO, VO> !Sync for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
impl<B, M, O, S, TI, VI, TO, VO> Unpin for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
impl<B, M, O, S, TI, VI, TO, VO> !UnwindSafe for LearnerBuilder<B, M, O, S, TI, VI, TO, VO>
Blanket Implementations

impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.