
Remove __enter__ and __exit__ from MemorySnapshotProfiler
Summary: There is no need for MemorySnapshotProfiler to be a context manager since it conflicts with start_step and stop_step

Differential Revision: D51049497
Danielle Pintz authored and facebook-github-bot committed Nov 6, 2023
1 parent bba323a commit 30cdf74
Showing 1 changed file with 1 addition and 13 deletions.
14 changes: 1 addition & 13 deletions torchtnt/utils/memory_snapshot_profiler.py
@@ -6,8 +6,7 @@
 
 import logging
 from dataclasses import dataclass
-from types import TracebackType
-from typing import Optional, Type
+from typing import Optional
 
 import torch
 from torchtnt.utils.oom import attach_oom_observer, log_memory_snapshot
@@ -115,17 +114,6 @@ def __init__(
             f"Created MemorySnapshotProfiler with MemorySnapshotParams={self.params}."
         )
 
-    def __enter__(self) -> None:
-        self.start()
-
-    def __exit__(
-        self,
-        exc_type: Optional[Type[BaseException]],
-        exc_value: Optional[BaseException],
-        tb: Optional[TracebackType],
-    ) -> Optional[bool]:
-        self.stop()
-
     def start(self) -> None:
         if not torch.cuda.is_available():
             logger.warn("CUDA unavailable. Not recording memory history.")
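For context, here is a minimal usage sketch (not part of this commit) of how the profiler is driven once the context-manager protocol is removed: recording is started and stopped explicitly via start()/stop(), or via the step-based hooks mentioned in the summary. The constructor arguments and the output_dir value below are assumptions for illustration, not taken from this diff.

    from torchtnt.utils.memory_snapshot_profiler import (
        MemorySnapshotParams,
        MemorySnapshotProfiler,
    )

    # Assumed constructor arguments; check the class for the exact signature.
    profiler = MemorySnapshotProfiler(
        output_dir="/tmp/memory_snapshots",
        memory_snapshot_params=MemorySnapshotParams(),
    )

    # Before this commit the profiler could also be used as a context manager:
    #     with profiler:
    #         run_training()   # hypothetical workload
    #
    # After this commit, recording is controlled explicitly:
    profiler.start()
    # ... run the CUDA workload whose memory history should be captured ...
    profiler.stop()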
