Abstract: A system that generates a 3D environment from data collected by depth sensors (such as LIDAR) and color sensors (such as video cameras) observing an area or activity, transmits versions of the 3D environment to various devices for display, and enables device users to dynamically alter the viewing angle of the 3D environment. The version of the 3D environment sent to each device may be optimized for the device's resolution and for the bandwidth of the connection to the device. Embodiments may enrich the 3D environment by detecting and tagging objects and their locations in the environment, and by calculating metrics related to the motion or actions of these objects. Object tags and metrics may be transmitted to devices and displayed, for example, as overlays on images rendered from user-selected viewing angles. Embodiments of the system also enable 3D printing of an object, for example as a memento.
Type:
Grant
Filed:
February 4, 2019
Date of Patent:
March 3, 2020
Assignee:
ALIVE 3D
Inventors:
Raymond Paul Marchak, Jr., Russell Neil Harlan, Jr., Hunter Laux
Abstract: A system that generates a 3D environment from data collected by depth sensors (such as LIDAR) and color sensors (such as video cameras) observing an area or activity, transmits versions of the 3D environment to various devices for display, and enables device users to dynamically alter the viewing angle of the 3D environment. The version of the 3D environment sent to each device may be optimized for the device's resolution and for the bandwidth of the connection to the device. Embodiments may enrich the 3D environment by detecting and tagging objects and their locations in the environment, and by calculating metrics related to the motion or actions of these objects. Object tags and metrics may be transmitted to devices and displayed, for example, as overlays on images rendered from user-selected viewing angles. Embodiments of the system also enable 3D printing of an object, for example as a memento.
Type:
Application
Filed:
February 4, 2019
Publication date:
February 13, 2020
Applicant:
ALIVE 3D
Inventors:
Raymond Paul Marchak, Jr., Russell Neil Harlan, Jr., Hunter Laux
Abstract: A system that generates a 3D environment from data collected by depth sensors (such as LIDAR) and color sensors (such as video cameras) observing an area or activity, transmits versions of the 3D environment to various devices for display, and enables device users to dynamically alter the viewing angle of the 3D environment. The version of the 3D environment sent to each device may be optimized for the device's resolution and for the bandwidth of the connection to the device. Embodiments may enrich the 3D environment by detecting and tagging objects and their locations in the environment, and by calculating metrics related to the motion or actions of these objects. Object tags and metrics may be transmitted to devices and displayed, for example, as overlays on images rendered from user-selected viewing angles. Embodiments of the system also enable 3D printing of an object, for example as a memento.
Type:
Grant
Filed:
August 9, 2018
Date of Patent:
April 23, 2019
Assignee:
ALIVE 3D
Inventors:
Raymond Paul Marchak, Jr., Russell Neil Harlan, Jr., Hunter Thomas Laux
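The abstracts above describe sending each device a version of the 3D environment optimized for its display resolution and connection bandwidth. The sketch below illustrates one way such a selection step could work; every name and threshold here is a hypothetical assumption for illustration, not the patented implementation.

```python
# Illustrative sketch: pick the finest available version (level of detail)
# of a 3D environment that a device's bandwidth and resolution justify.
# All class names, fields, and thresholds are assumptions, not drawn from
# the patents listed above.

from dataclasses import dataclass

@dataclass
class Device:
    pixels: int            # total display pixels (width * height)
    bandwidth_mbps: float  # estimated connection bandwidth, in Mbps

# Available versions, coarsest to finest.
# Each entry: (required bandwidth in Mbps, mesh triangle count).
VERSIONS = [
    (2.0, 50_000),      # low detail
    (8.0, 250_000),     # medium detail
    (25.0, 1_000_000),  # high detail
]

def select_version(device: Device) -> int:
    """Return the index of the finest version this device can handle.

    Falls back to the coarsest version (index 0) when even its bandwidth
    requirement is not met, so every device receives something displayable.
    """
    best = 0
    for i, (required_mbps, triangles) in enumerate(VERSIONS):
        # Sending far more triangles than the display has pixels adds no
        # visible detail, so cap by resolution as well as by bandwidth.
        if device.bandwidth_mbps >= required_mbps and triangles <= device.pixels * 2:
            best = i
    return best
```

A 1080p device on a 10 Mbps link would get the medium-detail version, while a 4K device on a fast link would get the high-detail one; the real system would presumably also adapt as measured bandwidth changes.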