Utils

Cache

labml.utils.cache.cache(name: str, loader: Optional[Callable[[], Any]] = None, *, file_type: str = 'json') → Any

Caches the result of a function. It can be used as a decorator, or you can pass it a lambda that takes no arguments.

It does not cache by arguments; the cached value is looked up by the cache name only.

Parameters
  • name (str) – name of the cache

  • loader (Callable[[], Any], optional) – the function that generates the data to be cached

Keyword Arguments

file_type (str, optional) – The file type used to store the data. Defaults to 'json'.
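
For example, the loader form computes an expensive value once and reuses it on later calls. A minimal sketch; the cache name and data below are made up.

    from labml.utils.cache import cache

    # Computed on the first call; loaded from the 'json' cache on later calls.
    stats = cache('dataset_stats', lambda: {'mean': 0.5, 'std': 0.25})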

labml.utils.cache.cache_get(name: str, file_type: str = 'json') → Any

Get cached data.

Parameters
  • name (str) – name of the cache

  • file_type (str, optional) – The file type the data was stored in. Defaults to 'json'.
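
A short sketch of reading back a previously cached value; the cache name is illustrative.

    from labml.utils.cache import cache_get

    # Returns the data previously stored under this name.
    stats = cache_get('dataset_stats')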

labml.utils.cache.cache_set(name: str, value: Any, file_type: str = 'json') → Any

Save data in cache.

Parameters
  • name (str) – name of the cache

  • value (Any) – the data to be cached

  • file_type (str, optional) – The file type used to store the data. Defaults to 'json'.
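
A short sketch of storing a value explicitly, as the counterpart to cache_get above; the name and value are illustrative.

    from labml.utils.cache import cache_set

    # Stored with the default 'json' file type.
    cache_set('dataset_stats', {'mean': 0.5, 'std': 0.25})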

Keyboard Interrupt

class labml.utils.delayed_keyboard_interrupt.DelayedKeyboardInterrupt

When used in a with block, this captures keyboard interrupts and re-raises them at the end of the block, after all the code inside has finished executing.
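
A minimal sketch of guarding a critical section; the slow write inside the block is a stand-in for real checkpoint saving.

    import time

    from labml.utils.delayed_keyboard_interrupt import DelayedKeyboardInterrupt

    with DelayedKeyboardInterrupt():
        # A Ctrl-C pressed during this block is held back,
        # so the (stand-in) checkpoint write is never cut short.
        time.sleep(2)
    # If an interrupt arrived, it is raised here, after the block has completed.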

Downloading

labml.utils.download.download_file(url: str, path: Path)

Download a file from a URL.

Parameters
  • url (str) – URL to download the file from

  • path (Path) – The location to save the downloaded file
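
For example (the URL and destination path below are placeholders):

    from pathlib import Path

    from labml.utils.download import download_file

    # Placeholder URL and destination
    download_file('https://example.com/datasets/train.csv', Path('data/train.csv'))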

labml.utils.download.extract_tar(tar_file: Path, to_path: Path)

Extract a .tar.gz file.

Parameters
  • tar_file (Path) – .tar.gz file

  • to_path (Path) – location to extract the contents
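
For example, extracting a previously downloaded archive (both paths are placeholders):

    from pathlib import Path

    from labml.utils.download import extract_tar

    extract_tar(Path('data/images.tar.gz'), Path('data/images'))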

PyTorch

labml.utils.pytorch.store_model_indicators(model: Module, *, model_name: str = 'model')

Track model parameters and gradients.

Parameters

model (torch.nn.Module) – PyTorch model

Keyword Arguments

model_name (str, optional) – name of the model
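
A minimal sketch, typically placed inside a training loop after the backward pass so that gradients exist; the model and loss below are toy stand-ins.

    import torch
    import torch.nn as nn

    from labml.utils.pytorch import store_model_indicators

    model = nn.Linear(10, 2)
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()  # gradients now exist to be tracked

    # Records parameter and gradient statistics under the given model name
    store_model_indicators(model, model_name='classifier')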

labml.utils.pytorch.store_optimizer_indicators(optimizer: Optimizer, *, models: Optional[Dict[str, Module]] = None, optimizer_name: str = 'optimizer')

Track optimizer stats such as moments.

Parameters

optimizer (Optimizer) – PyTorch optimizer

Keyword Arguments
  • models (Dict[str, torch.nn.Module], optional) – a dictionary of modules being optimized. This is used to get the proper parameter names.

  • optimizer_name (str, optional) – name of the optimizer
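
A sketch along the same lines, called after an optimizer step so that moment buffers exist; the model, optimizer, and names are illustrative.

    import torch
    import torch.nn as nn

    from labml.utils.pytorch import store_optimizer_indicators

    model = nn.Linear(10, 2)
    optimizer = torch.optim.Adam(model.parameters())

    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()  # Adam's moment buffers are populated after the first step

    # Passing the models dictionary lets the recorded stats use parameter names
    store_optimizer_indicators(optimizer, models={'model': model}, optimizer_name='adam')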

labml.utils.pytorch.get_modules(configs: BaseConfigs)

Get all the PyTorch modules in a configurations object.

Parameters

configs (labml.configs.BaseConfigs) – configurations object
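
A minimal sketch, assuming a configurations class that declares a module directly as a default value; the Configs class below is hypothetical.

    import torch.nn as nn

    from labml.configs import BaseConfigs
    from labml.utils.pytorch import get_modules

    class Configs(BaseConfigs):
        model: nn.Module = nn.Linear(10, 2)

    conf = Configs()
    modules = get_modules(conf)  # collects the torch.nn.Module values defined in conf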

labml.utils.pytorch.get_device(module: Module)

Get the device the module is on.

Parameters

module (torch.nn.Module) – PyTorch module
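
For example, this is handy for moving a batch to wherever the model's parameters live; a small sketch:

    import torch
    import torch.nn as nn

    from labml.utils.pytorch import get_device

    model = nn.Linear(10, 2)
    device = get_device(model)              # device of the model's parameters, e.g. CPU here

    batch = torch.randn(4, 10).to(device)   # place inputs on the same device as the model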