PyTorch Lightning Documentation, Release 1.3.8

on_epoch_end
ModelHooks.on_epoch_end()
Called when either of train/val/test epoch ends.
Return type: None
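A minimal sketch of overriding this hook in a LightningModule (the model body and the print statement are illustrative assumptions, not part of the reference):

    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def on_epoch_end(self):
            # Runs after every train, val, and test epoch.
            print(f"finished epoch {self.current_epoch}")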
on_train_epoch_start
ModelHooks.on_train_epoch_start()
Called in the training loop at the very beginning of the epoch.
Return type: None

on_train_epoch_end
ModelHooks.on_train_epoch_end(unused=None)
Called in the training loop at the very end of the epoch.
To access all batch outputs at the end of the epoch, either:
1. Implement training_epoch_end in the LightningModule, OR
2. Cache data across steps on the attribute(s) of the LightningModule and access them in this hook (a sketch of this pattern is shown below).

on_validation_batch_start
ModelHooks.on_validation_batch_start(batch, batch_idx, dataloader_idx)
Called in the validation loop before anything happens for that batch.
Parameters:
    batch (Any): The batched data as it is returned by the validation DataLoader.
    batch_idx (int): the index of the batch
    dataloader_idx (int): the index of the dataloader
Return type: None

on_validation_batch_end
ModelHooks.on_validation_batch_end(outputs, batch, batch_idx, dataloader_idx)
Called in the validation loop after the batch.
Parameters:
    outputs (Union[Tensor, Dict[str, Any], None]): The outputs of validation_step_end(validation_step(x))
    batch (Any): The batched data as it is returned by the validation DataLoader.
    batch_idx (int): the index of the batch
    dataloader_idx (int): the index of the dataloader
Return type: None
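For on_train_epoch_end above, option 2 (caching data across steps on an attribute of the LightningModule) might look like the following sketch; the attribute name, loss computation, and logged metric name are illustrative assumptions:

    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)
            self.training_step_losses = []  # cache filled during the epoch

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = torch.nn.functional.cross_entropy(self.layer(x), y)
            # Option 2: cache per-step data on an attribute of the module.
            self.training_step_losses.append(loss.detach())
            return loss

        def on_train_epoch_end(self, unused=None):
            # All cached batch outputs are available here at the end of the epoch.
            epoch_mean = torch.stack(self.training_step_losses).mean()
            self.log("train_loss_epoch", epoch_mean)
            self.training_step_losses.clear()  # reset for the next epoch

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)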
on_validation_epoch_start
ModelHooks.on_validation_epoch_start()
Called in the validation loop at the very beginning of the epoch.
Return type: None

on_validation_epoch_end
ModelHooks.on_validation_epoch_end()
Called in the validation loop at the very end of the epoch.
Return type: None

on_post_move_to_device
ModelHooks.on_post_move_to_device()
Called in the parameter_validation decorator after to() is called. This is a good place to tie weights between modules after moving them to a device. Can be used when training models with weight sharing properties on TPU.
Addresses the handling of shared weights on TPU: TROUBLESHOOTING.md#xla-tensor-quirks
Example:

    def on_post_move_to_device(self):
        self.decoder.weight = self.encoder.weight

Return type: None

on_validation_model_eval
ModelHooks.on_validation_model_eval()
Sets the model to eval during the val loop.
Return type: None

on_validation_model_train
ModelHooks.on_validation_model_train()
Sets the model to train during the val loop.
Return type: None
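As an illustration of overriding on_validation_model_eval / on_validation_model_train described above, the sketch below keeps dropout layers stochastic during the val loop (e.g. for Monte Carlo dropout); the dropout use case and the network definition are illustrative assumptions:

    import torch
    import pytorch_lightning as pl

    class MCDropoutModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.net = torch.nn.Sequential(
                torch.nn.Linear(32, 64),
                torch.nn.Dropout(p=0.2),
                torch.nn.Linear(64, 2),
            )

        def on_validation_model_eval(self):
            # Put the model in eval mode for the val loop (what the hook is for) ...
            self.eval()
            # ... but keep dropout layers sampling for Monte Carlo estimates.
            for module in self.modules():
                if isinstance(module, torch.nn.Dropout):
                    module.train()

        def on_validation_model_train(self):
            # Restore train mode after the val loop.
            self.train()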
on_test_model_eval
ModelHooks.on_test_model_eval()
Sets the model to eval during the test loop.
Return type: None

on_test_model_train
ModelHooks.on_test_model_train()
Sets the model to train during the test loop.
Return type: None

optimizer_step
LightningModule.optimizer_step(epoch=None, batch_idx=None, optimizer=None, optimizer_idx=None, optimizer_closure=None, on_tpu=None, using_native_amp=None, using_lbfgs=None)
Override this method to adjust the default way the Trainer calls each optimizer. By default, Lightning calls step() and zero_grad() as shown in the example once per optimizer.
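A rough sketch of overriding optimizer_step, reproducing the default per-optimizer step via the supplied closure (the model body and optimizer choice are illustrative assumptions; as noted above, Lightning's default also calls zero_grad() for you):

    import torch
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)

        def optimizer_step(self, epoch=None, batch_idx=None, optimizer=None,
                           optimizer_idx=None, optimizer_closure=None, on_tpu=None,
                           using_native_amp=None, using_lbfgs=None):
            # Run the closure (forward + backward) and take one step for this
            # optimizer; custom schedules (e.g. stepping an optimizer only on
            # some batches) can be implemented by adding conditions here.
            optimizer.step(closure=optimizer_closure)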
