Monitor


%%capture
!pip install labml

Iterators & Enumerators

You can use monit.iterate() and monit.enum() with any iterable object. In this example we use a PyTorch DataLoader.

# Create a data loader for illustration
import time

import torch
from torchvision import datasets, transforms

from labml import logger, monit, lab, tracker

test_loader = torch.utils.data.DataLoader(
        datasets.MNIST(lab.get_data_path(),
                       train=False,
                       download=True,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))
                       ])),
        batch_size=32, shuffle=True)
for data, target in monit.iterate("Test", test_loader):
    time.sleep(0.01)
Test...[DONE]     7,044.82ms
for i, (data, target) in monit.enum("Test", test_loader):
    time.sleep(0.01)
Test...[DONE]     6,961.66ms

Sections

Sections let you monitor the time taken for different tasks and also help keep the code clean by separating different blocks of code.

with monit.section("Load data"):
    # code to load data
    time.sleep(2)
Load data...[DONE]        2,007.06ms
with monit.section("Load saved model"):
    time.sleep(1)
    monit.fail()
Load saved model...[FAIL] 1,009.20ms

You can also show progress while a section is running.

with monit.section("Train", total_steps=100):
    for i in range(100):
        time.sleep(0.1)
        # Multiple training steps in the inner loop
        monit.progress(i)
Train...[DONE]    10,605.45ms

Loop

This can be used for the training loop. monit.loop() keeps track of the time taken and the time remaining for the loop.

tracker.save() outputs the current status along with the global step.

for step in monit.loop(range(0, 400)):
    tracker.save()
# Reset the global step for the next example
tracker.set_global_step(0)

You can manually increment global step too.

for step in monit.loop(range(0, 400)):
    tracker.add_global_step(5)
    tracker.save()