Patents by Inventor Steven Hao
Steven Hao has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240079662
Abstract: An example method includes receiving, at a module management unit (MMU) associated with a battery cell, a battery cell parameter measurement associated with the battery cell. The MMU is in communication with a battery radio frequency module via a first link. The method further includes performing, by the MMU, a first function based at least in part on the battery cell parameter measurement to generate a first signal, and transmitting, from the MMU, the first signal to the battery radio frequency module via the first link. The method further includes performing, by the battery radio frequency module, a second function based at least in part on the first signal to generate a second signal. The method further includes transmitting, from the battery radio frequency module, the second signal to a controller, and performing, by the controller, a third function based at least in part on the second signal.
Type: Application
Filed: September 7, 2022
Publication date: March 7, 2024
Inventors: Lei Hao, Yue-Yun Wang, Steven Earl Muldoon, Michael P. Barker, Yongjie Zhu
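The claimed method chains three functions across two links: MMU to battery radio frequency module, then on to a controller. A minimal sketch of that pipeline, with all function names, bodies, and thresholds assumed purely for illustration (the patent does not specify what the three functions compute):

```python
# Hypothetical sketch of the claimed signal chain: an MMU transforms a cell
# measurement, a battery RF module transforms the result, and a controller
# acts on the final signal. All names and function bodies are illustrative.

def mmu_function(cell_measurement_mv: float) -> float:
    """First function: e.g. convert a raw millivolt reading to volts."""
    return cell_measurement_mv / 1000.0

def rf_module_function(first_signal: float) -> dict:
    """Second function: e.g. package the value for wireless transmission."""
    return {"cell_voltage_v": first_signal}

def controller_function(second_signal: dict) -> str:
    """Third function: e.g. flag an out-of-range cell voltage."""
    v = second_signal["cell_voltage_v"]
    return "ok" if 2.5 <= v <= 4.2 else "fault"

# The three functions are applied in sequence over the two links.
status = controller_function(rf_module_function(mmu_function(3700.0)))
```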
-
Patent number: 11625892
Abstract: One embodiment provides a user interface (UI) that permits users to select how point cloud colorings determined from multiple data sources are blended together in a rendering of a point cloud. The data sources may include photographic, label, and/or LIDAR intensity data. To improve frame rates, an aggregated point cloud may be generated using a spatial hash of a large set of points and sampling of each hash bucket based on the number of points therein and a user-configurable density. Sizes of points in the point cloud may decrease proportionally to distance from a viewer, but increase based on an activation function that enlarges points greater than a threshold distance from the viewer. In addition, luminance statistics for sub-regions of photographic data and dominant colors determined from photographic data may be used to automatically determine color properties to apply to a point cloud coloring.
Type: Grant
Filed: August 12, 2021
Date of Patent: April 11, 2023
Assignee: SCALE AI, INC.
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell, Akshat Bubna, Chiao-Lun Cheng, Samuel Jacob Clearman, Nathaniel John Herman, Guido Leandro Maliandi
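Two of the rendering ideas in this abstract can be sketched compactly. The following is an illustrative approximation, not the patented implementation: voxel sizes, the sampling rule, and the sigmoid activation are all assumptions.

```python
import math
from collections import defaultdict

def aggregate(points, cell=1.0, density=0.5):
    """Spatial-hash aggregation: hash points into voxel buckets, then sample
    each bucket based on its population and a user-configurable density."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(c // cell) for c in p)
        buckets[key].append(p)
    sampled = []
    for pts in buckets.values():
        keep = max(1, int(len(pts) * density))
        sampled.extend(pts[:keep])  # deterministic sample, for illustration
    return sampled

def point_size(distance, base=4.0, threshold=50.0, gain=0.05):
    """Size shrinks ~1/distance, but a sigmoid activation boosts points
    farther than `threshold` so distant geometry stays visible."""
    shrink = base / max(distance, 1e-6)
    boost = 1.0 / (1.0 + math.exp(-gain * (distance - threshold)))
    return shrink * (1.0 + boost)
```

The bucket sampling trades fidelity in dense regions for frame rate, while the activation counteracts the perspective shrink for far-away points.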
-
Patent number: 11488332
Abstract: Techniques for coloring a point cloud based on colors derived from LIDAR (light detection and ranging) intensity data are disclosed. In some embodiments, the coloring of the point cloud may employ an activation function that controls the colors assigned to different intensity values. Further, the activation function may be parameterized based on statistics computed for a distribution of intensities associated with a 3D scene and a user-selected sensitivity. Alternatively, a Fourier transform of the distribution of intensities or a clustering of the intensities may be used to estimate individual distributions associated with different materials, based on which the point cloud coloring may be determined from intensity data.
Type: Grant
Filed: February 26, 2021
Date of Patent: November 1, 2022
Assignee: SCALE AI, INC.
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell
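A minimal sketch of the intensity-coloring idea described here, assuming a sigmoid as the activation and grayscale output; the actual activation, its parameterization, and the color mapping in the patent may differ:

```python
import math
import statistics

def intensity_to_gray(intensity, mean, stdev, sensitivity=1.0):
    """Sigmoid activation mapping raw LIDAR intensity to [0, 1].
    Higher sensitivity concentrates contrast around the mean intensity."""
    z = (intensity - mean) / max(stdev, 1e-6)
    return 1.0 / (1.0 + math.exp(-sensitivity * z))

def color_cloud(intensities, sensitivity=1.0):
    """Parameterize the activation from the scene's intensity statistics,
    then color every point."""
    mean = statistics.fmean(intensities)
    stdev = statistics.pstdev(intensities) or 1.0
    return [intensity_to_gray(i, mean, stdev, sensitivity) for i in intensities]
```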
-
Patent number: 11222460
Abstract: One embodiment provides a user interface (UI) that permits users to select how point cloud colorings determined from multiple data sources are blended together in a rendering of a point cloud. The data sources may include photographic, label, and/or LIDAR intensity data. To improve frame rates, an aggregated point cloud may be generated using a spatial hash of a large set of points and sampling of each hash bucket based on the number of points therein and a user-configurable density. Sizes of points in the point cloud may decrease proportionally to distance from a viewer, but increase based on an activation function that enlarges points greater than a threshold distance from the viewer. In addition, luminance statistics for sub-regions of photographic data and dominant colors determined from photographic data may be used to automatically determine color properties to apply to a point cloud coloring.
Type: Grant
Filed: July 22, 2019
Date of Patent: January 11, 2022
Assignee: Scale AI, Inc.
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell, Akshat Bubna, Chiao-Lun Cheng, Samuel Jacob Clearman, Nathaniel John Herman, Guido Leandro Maliandi
-
Patent number: 10937202
Abstract: Techniques for coloring a point cloud based on colors derived from LIDAR (light detection and ranging) intensity data are disclosed. In some embodiments, the coloring of the point cloud may employ an activation function that controls the colors assigned to different intensity values. Further, the activation function may be parameterized based on statistics computed for a distribution of intensities associated with a 3D scene and a user-selected sensitivity. Alternatively, a Fourier transform of the distribution of intensities or a clustering of the intensities may be used to estimate individual distributions associated with different materials, based on which the point cloud coloring may be determined from intensity data.
Type: Grant
Filed: July 22, 2019
Date of Patent: March 2, 2021
Assignee: Scale AI, Inc.
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell
-
Publication number: 20210027501
Abstract: Techniques for coloring a point cloud based on colors derived from LIDAR (light detection and ranging) intensity data are disclosed. In some embodiments, the coloring of the point cloud may employ an activation function that controls the colors assigned to different intensity values. Further, the activation function may be parameterized based on statistics computed for a distribution of intensities associated with a 3D scene and a user-selected sensitivity. Alternatively, a Fourier transform of the distribution of intensities or a clustering of the intensities may be used to estimate individual distributions associated with different materials, based on which the point cloud coloring may be determined from intensity data.
Type: Application
Filed: July 22, 2019
Publication date: January 28, 2021
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell
-
Publication number: 20210027524
Abstract: One embodiment provides a user interface (UI) that permits users to select how point cloud colorings determined from multiple data sources are blended together in a rendering of a point cloud. The data sources may include photographic, label, and/or LIDAR intensity data. To improve frame rates, an aggregated point cloud may be generated using a spatial hash of a large set of points and sampling of each hash bucket based on the number of points therein and a user-configurable density. Sizes of points in the point cloud may decrease proportionally to distance from a viewer, but increase based on an activation function that enlarges points greater than a threshold distance from the viewer. In addition, luminance statistics for sub-regions of photographic data and dominant colors determined from photographic data may be used to automatically determine color properties to apply to a point cloud coloring.
Type: Application
Filed: July 22, 2019
Publication date: January 28, 2021
Inventors: Evan Moss, Steven Hao, Leigh Marie Braswell
-
Publication number: 20210027546
Abstract: Techniques are disclosed for facilitating the labeling of cuboid annotations in point cloud data. User-drawn annotations of cuboids in point cloud data can be automatically adjusted to remove outlier points, add relevant points, and fit the cuboids to points representative of an object. Interpolation and object tracking techniques are also disclosed for propagating cuboids from frames designated as key frames to other frames. In addition, techniques are disclosed for, in response to user adjustment of the size of a cuboid in one frame, automatically adjusting the sizes of cuboids in other frames while anchoring a set of non-occluded faces of the cuboids. The non-occluded faces may be determined as the faces that are closest to a LIDAR (light detection and ranging) sensor in the other frames.
Type: Application
Filed: July 22, 2019
Publication date: January 28, 2021
Inventors: Steven Hao, Leigh Marie Braswell, Evan Moss
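The face-anchored resize described in this abstract reduces, per axis, to resizing an interval while holding one endpoint fixed. A one-dimensional sketch, with the non-occluded face taken as the endpoint nearer a sensor at the origin; all names are hypothetical and the patent's actual occlusion test may differ:

```python
def resize_anchored(lo, hi, new_length, sensor=0.0):
    """Resize the interval [lo, hi] (one axis of a cuboid) to new_length,
    anchoring whichever endpoint is closer to the sensor, so the visible
    face stays put and the cuboid grows away from the sensor."""
    if abs(lo - sensor) <= abs(hi - sensor):
        return lo, lo + new_length   # near face is `lo`
    return hi - new_length, hi       # near face is `hi`
```

Applying this per axis in every non-key frame lets a single user resize propagate without shifting the faces the LIDAR actually observed.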
-
Patent number: D1019632
Type: Grant
Filed: April 28, 2021
Date of Patent: March 26, 2024
Assignee: Garmin International, Inc.
Inventors: Brent E. Barberis, Hans K. Fritze, Chuan-Hao Wen, Yu-An Lin, Todd P. Register, Benjamin D. Braun, Jonathan E. Reberry, Steven J. Christy
-
Patent number: D1019633
Type: Grant
Filed: April 28, 2021
Date of Patent: March 26, 2024
Assignee: Garmin International, Inc.
Inventors: Brent E. Barberis, Hans K. Fritze, Chuan-Hao Wen, Yu-An Lin, Todd P. Register, Steven J. Christy, Benjamin D. Braun, Jonathan E. Reberry