visionsim.utils package¶
Submodules¶
visionsim.utils.color module¶
- visionsim.utils.color.srgb_to_linearrgb(img: torch.Tensor | npt.NDArray[np.floating]) torch.Tensor | npt.NDArray[np.floating][source]¶
Performs sRGB to linear RGB color space conversion by reversing gamma correction, recovering values proportional to the scene's intensities.
- Parameters:
img (torch.Tensor | npt.NDArray) – Image to un-tonemap.
- Returns:
Linear RGB image.
- visionsim.utils.color.linearrgb_to_srgb(img: torch.Tensor | npt.NDArray) torch.Tensor | npt.NDArray[source]¶
Performs linear RGB to sRGB color space conversion to apply gamma correction for display purposes.
- Parameters:
img (torch.Tensor | npt.NDArray) – Image to tonemap.
- Returns:
Tonemapped sRGB image.
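As an illustration of what these two utilities compute, here is a minimal NumPy-only sketch of the piecewise sRGB transfer functions defined in IEC 61966-2-1. This is a hypothetical reimplementation, not the library's actual code: visionsim's versions also accept torch tensors, and their handling of dtypes, clipping, or alpha channels may differ.

```python
import numpy as np


def srgb_to_linearrgb(img):
    """Invert the sRGB gamma curve to recover linear intensities (sketch)."""
    img = np.asarray(img, dtype=np.float64)
    return np.where(img <= 0.04045, img / 12.92, ((img + 0.055) / 1.055) ** 2.4)


def linearrgb_to_srgb(img):
    """Apply the sRGB gamma curve for display purposes (sketch)."""
    img = np.asarray(img, dtype=np.float64)
    return np.where(img <= 0.0031308, img * 12.92, 1.055 * img ** (1 / 2.4) - 0.055)


# The two functions are inverses: un-tonemapping then tonemapping is a no-op.
x = np.linspace(0.0, 1.0, 11)
assert np.allclose(linearrgb_to_srgb(srgb_to_linearrgb(x)), x)
```

Note the piecewise definition: the linear segment near zero avoids the infinite slope a pure power law would have at the origin, and the thresholds (0.04045 and 0.0031308) are chosen so the two branches meet continuously.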
visionsim.utils.pose module¶
- visionsim.utils.pose.T_wc_camframe_gl2world(T_wc_w_gl: npt.NDArray) npt.NDArray[source]¶
Fix the coordinate convention in the camera pose matrix in transforms.json.
- The coordinate convention for the camera view is OpenGL:
+x = right, +y = up, +z = out from the scene,
- while the coordinate convention for the world frame is:
+x = right, +y = into the scene/viewing direction, +z = up,
and the matrix in transforms.json maps directly from the former to the latter, so we cannot treat it as an [R | t] form. In turn, we must be careful when interpreting pose "derivatives" obtained directly from that matrix, such as when simulating IMU data. To remove this confusion, here we convert the matrix so that the camera frame also uses the world coordinate convention.
- Parameters:
T_wc_w_gl (npt.NDArray) – 4 x 4 matrix representing camera pose, but also mapping directly from the OpenGL coordinate system to the world
- Returns:
also 4 x 4, but uses world frame on both sides
- Return type:
T_wc_w_w (npt.NDArray)
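The convention fix described above can be sketched as a right-multiplication by a fixed change-of-basis matrix. The sketch below is a hypothetical implementation derived from the axis conventions stated in the docstring (OpenGL camera: +x right, +y up, +z out of the scene; world convention: +x right, +y forward, +z up), not visionsim's actual code; the library's matrix or composition order may differ.

```python
import numpy as np

# Maps a vector expressed in the world-convention camera frame
# (+x right, +y forward, +z up) into the OpenGL camera frame
# (+x right, +y up, +z out of the scene): x_gl = x, y_gl = z, z_gl = -y.
_GL_FROM_WORLDCONV = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, -1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])


def T_wc_camframe_gl2world(T_wc_w_gl):
    """Sketch: re-express the camera frame in the world convention.

    Right-composing first converts world-convention camera coordinates
    into OpenGL camera coordinates, which the original matrix then maps
    to the world; the camera's placement in the world is unchanged.
    """
    return np.asarray(T_wc_w_gl) @ _GL_FROM_WORLDCONV
```

A quick sanity check: with the identity pose, the camera's forward axis (+y in the world convention) maps to -z in OpenGL camera coordinates, i.e. into the scene, as expected.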
visionsim.utils.progress module¶
- class visionsim.utils.progress.ElapsedProgress(*columns: str | ProgressColumn, console: Console | None = None, auto_refresh: bool = True, refresh_per_second: float = 10, speed_estimate_period: float = 30.0, transient: bool = False, redirect_stdout: bool = True, redirect_stderr: bool = True, get_time: Callable[[], float] | None = None, disable: bool = False, expand: bool = False)[source]¶
Bases:
Progress
- class visionsim.utils.progress.PoolProgress(*args, auto_visible=True, description='[green]Total progress:', **kwargs)[source]¶
Bases:
Progress

Convenience wrapper around rich's Progress to enable progress bars when using multiple processes. All progress-bar updates are carried out by the main process, and worker processes communicate their state via a callback obtained when a task gets added.

Example

```python
import multiprocessing


def long_task(tick, min_len=50, max_len=200):
    import random, time

    length = random.randint(min_len, max_len)
    tick(total=length)
    for _ in range(length):
        time.sleep(0.01)
        tick(advance=1)


if __name__ == "__main__":
    with multiprocessing.Pool(4) as pool, PoolProgress() as progress:
        for i in range(25):
            tick = progress.add_task(f"Task: {i}")
            pool.apply_async(long_task, (tick,))
        progress.wait()
        pool.close()
        pool.join()
```
- __init__(*args, auto_visible=True, description='[green]Total progress:', **kwargs) None[source]¶
Initialize a PoolProgress instance.

Note

All other *args and **kwargs are passed as-is to rich.progress.Progress.
- Parameters:
auto_visible (bool, optional) – if True, automatically hides tasks that have not yet started or have already finished. Defaults to True.
description (str, optional) – text description for the overall progress. Defaults to “[green]Total progress:”.
- classmethod get_default_columns() tuple[ProgressColumn, ...][source]¶
Overrides rich.progress.Progress's default columns so that elapsed time is still shown once a task has finished.
- add_task(*args, **kwargs) UpdateFn[source]¶
Same as Progress.add_task, except it returns a callback to update the task instead of the task-id. The returned callback is roughly equivalent to Progress.update with its first argument (the task-id) already filled in, except that calling it does not immediately update the task's status: the main process performs the update asynchronously.