Patents by Inventor Kah Seng Tay

Kah Seng Tay has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11893682
    Abstract: One variation of a method includes: accessing a 2D color image recorded by a 2D color camera and a 3D point cloud recorded by a 3D depth sensor at approximately a first time, the 2D color camera and the 3D depth sensor defining intersecting fields of view and facing outwardly from an autonomous vehicle; detecting a cluster of points in the 3D point cloud representing a continuous surface approximating a plane; isolating a cluster of color pixels in the 2D color image depicting the continuous surface; projecting the cluster of color pixels onto the plane to define a set of synthetic 3D color points in the 3D point cloud, the cluster of points and the set of synthetic 3D color points representing the continuous surface; and rendering points in the 3D point cloud and the set of synthetic 3D color points on a display.
    Type: Grant
    Filed: October 10, 2022
    Date of Patent: February 6, 2024
    Inventors: Kah Seng Tay, Qing Sun, James Patrick Marion
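    Note: see the point-cloud colorization sketch after this listing for a simplified illustration of the plane-projection step described in this abstract.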
  • Patent number: 11592833
    Abstract: One variation of a method for updating a localization map for a fleet of autonomous vehicles includes: selecting a set of roads including a first subset of road segments associated with existing incomplete scan data and a second subset of road segments associated with a monitoring request from an external entity; during a passenger period at the autonomous vehicle, autonomously transporting passengers according to a series of ride requests; during a mapping period succeeding the passenger period at the autonomous vehicle, autonomously navigating along the set of road segments, recording a first series of scan data representing surfaces proximal the first subset of road segments, and recording a second series of scan data representing surfaces proximal the second subset of road segments; updating the localization map based on the first series of scan data; and serving the second series of scan data to the external entity.
    Type: Grant
    Filed: May 1, 2019
    Date of Patent: February 28, 2023
    Inventors: Kah Seng Tay, Xiaotian Chen, Xin Sun, Vishisht Gupta
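    Note: see the scan-routing sketch after this listing for a simplified illustration of the mapping-period workflow described in this abstract.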
  • Patent number: 11468646
    Abstract: One variation of a method includes: accessing a 2D color image recorded by a 2D color camera and a 3D point cloud recorded by a 3D depth sensor at approximately a first time, the 2D color camera and the 3D depth sensor defining intersecting fields of view and facing outwardly from an autonomous vehicle; detecting a cluster of points in the 3D point cloud representing a continuous surface approximating a plane; isolating a cluster of color pixels in the 2D color image depicting the continuous surface; projecting the cluster of color pixels onto the plane to define a set of synthetic 3D color points in the 3D point cloud, the cluster of points and the set of synthetic 3D color points representing the continuous surface; and rendering points in the 3D point cloud and the set of synthetic 3D color points on a display.
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: October 11, 2022
    Inventors: Kah Seng Tay, Qing Sun, James Patrick Marion
  • Patent number: 11163309
    Abstract: One variation of a method for autonomous navigation includes, at an autonomous vehicle: recording a first image via a first sensor and a second image via a second sensor during a scan cycle; calculating a first field of view of the first sensor and a second field of view of the second sensor during the scan cycle based on surfaces represented in the first and second images; characterizing a spatial redundancy between the first sensor and the second sensor based on an overlap of the first and second fields of view; in response to the spatial redundancy remaining below a threshold redundancy, disabling execution, by the autonomous vehicle, of a first navigational action (an action informed by the presence of external objects within a first region of a scene around the autonomous vehicle spanning the overlap); and autonomously executing navigational actions, excluding the first navigational action, following the scan cycle.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: November 2, 2021
    Inventors: Kah Seng Tay, Joel Pazhayampallil, Brody Huval
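    Note: see the sensor-redundancy sketch after this listing for a simplified illustration of the field-of-view overlap check described in this abstract.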
  • Patent number: 10854011
    Abstract: One variation of a method includes: accessing a 2D color image recorded by a 2D color camera and a 3D point cloud recorded by a 3D depth sensor at approximately a first time, the 2D color camera and the 3D depth sensor defining intersecting fields of view and facing outwardly from an autonomous vehicle; detecting a cluster of points in the 3D point cloud representing a continuous surface approximating a plane; isolating a cluster of color pixels in the 2D color image depicting the continuous surface; projecting the cluster of color pixels onto the plane to define a set of synthetic 3D color points in the 3D point cloud, the cluster of points and the set of synthetic 3D color points representing the continuous surface; and rendering points in the 3D point cloud and the set of synthetic 3D color points on a display.
    Type: Grant
    Filed: April 9, 2019
    Date of Patent: December 1, 2020
    Inventors: Kah Seng Tay, Qing Sun, James Patrick Marion
  • Patent number: 10663977
    Abstract: One variation of a method for dynamically querying a remote operator for assistance includes, at an autonomous vehicle: autonomously navigating along a roadway; at locations along the roadway, testing performance of a set of wireless networks; in response to degradation of the set of wireless networks, decreasing a sensitivity threshold for triggering remote operator events at the autonomous vehicle; in response to a condition at the autonomous vehicle exceeding the sensitivity threshold at a particular location along the roadway, triggering a remote operator event; during the remote operator event, transmitting sensor data to a remote computer system via a subset of wireless networks, in the set of wireless networks, based on performance of the set of wireless networks proximal the particular location; and executing a navigational command received from a remote operator via a wireless network in the set of wireless networks.
    Type: Grant
    Filed: May 1, 2019
    Date of Patent: May 26, 2020
    Inventors: Tim Cheng, Kah Seng Tay
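    Note: see the remote-operator triggering sketch after this listing for a simplified illustration of the threshold logic described in this abstract.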
  • Publication number: 20190354111
    Abstract: One variation of a method for dynamically querying a remote operator for assistance includes, at an autonomous vehicle: autonomously navigating along a roadway; at locations along the roadway, testing performance of a set of wireless networks; in response to degradation of the set of wireless networks, decreasing a sensitivity threshold for triggering remote operator events at the autonomous vehicle; in response to a condition at the autonomous vehicle exceeding the sensitivity threshold at a particular location along the roadway, triggering a remote operator event; during the remote operator event, transmitting sensor data to a remote computer system via a subset of wireless networks, in the set of wireless networks, based on performance of the set of wireless networks proximal the particular location; and executing a navigational command received from a remote operator via a wireless network in the set of wireless networks.
    Type: Application
    Filed: May 1, 2019
    Publication date: November 21, 2019
    Inventors: Tim Cheng, Kah Seng Tay
  • Publication number: 20190339709
    Abstract: One variation of a method for updating a localization map for a fleet of autonomous vehicles includes: selecting a set of roads including a first subset of road segments associated with existing incomplete scan data and a second subset of road segments associated with a monitoring request from an external entity; during a passenger period at the autonomous vehicle, autonomously transporting passengers according to a series of ride requests; during a mapping period succeeding the passenger period at the autonomous vehicle, autonomously navigating along the set of road segments, recording a first series of scan data representing surfaces proximal the first subset of road segments, and recording a second series of scan data representing surfaces proximal the second subset of road segments; updating the localization map based on the first series of scan data; and serving the second series of scan data to the external entity.
    Type: Application
    Filed: May 1, 2019
    Publication date: November 7, 2019
    Inventors: Kah Seng Tay, Xiaotian Chen, Xin Sun, Vishisht Gupta
  • Publication number: 20190311546
    Abstract: One variation of a method includes: accessing a 2D color image recorded by a 2D color camera and a 3D point cloud recorded by a 3D depth sensor at approximately a first time, the 2D color camera and the 3D depth sensor defining intersecting fields of view and facing outwardly from an autonomous vehicle; detecting a cluster of points in the 3D point cloud representing a continuous surface approximating a plane; isolating a cluster of color pixels in the 2D color image depicting the continuous surface; projecting the cluster of color pixels onto the plane to define a set of synthetic 3D color points in the 3D point cloud, the cluster of points and the set of synthetic 3D color points representing the continuous surface; and rendering points in the 3D point cloud and the set of synthetic 3D color points on a display.
    Type: Application
    Filed: April 9, 2019
    Publication date: October 10, 2019
    Inventors: Kah Seng Tay, Qing Sun, James Patrick Marion
  • Publication number: 20190196481
    Abstract: One variation of a method for autonomous navigation includes, at an autonomous vehicle: recording a first image via a first sensor and a second image via a second sensor during a scan cycle; calculating a first field of view of the first sensor and a second field of view of the second sensor during the scan cycle based on surfaces represented in the first and second images; characterizing a spatial redundancy between the first sensor and the second sensor based on an overlap of the first and second fields of view; in response to the spatial redundancy remaining below a threshold redundancy, disabling execution, by the autonomous vehicle, of a first navigational action (an action informed by the presence of external objects within a first region of a scene around the autonomous vehicle spanning the overlap); and autonomously executing navigational actions, excluding the first navigational action, following the scan cycle.
    Type: Application
    Filed: November 30, 2018
    Publication date: June 27, 2019
    Inventors: Kah Seng Tay, Joel Pazhayampallil, Brody Huval
  • Publication number: 20140281895
    Abstract: According to various embodiments, a user selection of a content portion of a content item displayed on an online webpage is received. A specific metadata portion indicating properties of the selected content portion of the online webpage is extracted from metadata indicating properties of the webpage. The specific metadata portion is modified, based on one or more quote format rules, to generate quote metadata. A preview pane of a quote content item generated based on the quote metadata is displayed; the quote content item corresponds to the selected content portion reformatted in accordance with a quote format.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Inventors: Kah Seng Tay, Kah Keng Tay, Hongping Lim, Kah Hong Tay
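    Note: see the quote-metadata sketch after this listing for a simplified illustration of the extraction and reformatting described in this abstract.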
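
The sketches below illustrate, in heavily simplified form, the techniques summarized in the abstracts above. They are not the patented implementations; every function, class, parameter, and value in them is an assumption made for illustration only.

Patents 11893682, 11468646, and 10854011 (and application 20190311546) describe detecting a planar cluster in a 3D point cloud, isolating the color pixels that depict that surface, and projecting those pixels onto the plane to create synthetic 3D color points. A minimal sketch of the projection step, assuming a pinhole camera at the origin of the point-cloud frame, a least-squares plane fit, and made-up intrinsics and pixel data (fit_plane and project_pixels_onto_plane are hypothetical names), might look like this:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a cluster of 3D points: returns (unit normal n, offset d)
    with n . x = d for points x on the plane."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                        # direction of least variance
    return normal, float(normal @ centroid)

def project_pixels_onto_plane(pixels, colors, K, normal, d):
    """Cast a camera ray through each color pixel and intersect it with the plane,
    yielding synthetic colored 3D points (assumes the camera sits at the origin)."""
    uv1 = np.hstack([pixels, np.ones((len(pixels), 1))])   # homogeneous pixel coordinates
    rays = (np.linalg.inv(K) @ uv1.T).T                    # one viewing ray per pixel
    t = d / (rays @ normal)                                # ray parameter at the plane
    synthetic_xyz = rays * t[:, None]                      # 3D intersection points
    return np.hstack([synthetic_xyz, colors])              # [x, y, z, r, g, b] per pixel

if __name__ == "__main__":
    K = np.array([[500.0, 0.0, 320.0],                     # assumed pinhole intrinsics
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    cluster = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.1],  # lidar points on a wall-like surface
                        [0.0, 1.0, 4.9], [1.0, 1.0, 5.0]])
    normal, d = fit_plane(cluster)
    pixels = np.array([[320.0, 240.0], [400.0, 260.0]])    # color pixels depicting that surface
    colors = np.array([[200.0, 180.0, 160.0], [190.0, 170.0, 150.0]])
    print(project_pixels_onto_plane(pixels, colors, K, normal, d))
```

In this toy example the two pixels land on the fitted wall plane at roughly z = 5 m, each carrying its RGB color, so they could be rendered alongside the original lidar points.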
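
Patent 11592833 and application 20190339709 describe a mapping period in which scans of road segments with incomplete map data refresh the fleet's localization map, while scans of segments under an external monitoring request are served to the requesting entity. A minimal sketch of that routing decision, with hypothetical class and field names, might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class RoadSegment:
    segment_id: str
    map_data_incomplete: bool = False      # first subset: existing scan data is incomplete
    monitoring_requested: bool = False     # second subset: an external entity asked to monitor it

@dataclass
class MappingPeriod:
    localization_map: dict = field(default_factory=dict)     # segment_id -> latest scan
    external_deliveries: list = field(default_factory=list)  # scans owed to the external entity

    def record_scan(self, segment: RoadSegment, scan_data: str) -> None:
        """Route a scan recorded while autonomously driving the segment."""
        if segment.map_data_incomplete:
            self.localization_map[segment.segment_id] = scan_data          # update the shared map
        if segment.monitoring_requested:
            self.external_deliveries.append((segment.segment_id, scan_data))  # serve to requester

if __name__ == "__main__":
    period = MappingPeriod()
    period.record_scan(RoadSegment("elm-st-0100", map_data_incomplete=True), "scan-A")
    period.record_scan(RoadSegment("oak-ave-0200", monitoring_requested=True), "scan-B")
    print(period.localization_map, period.external_deliveries)
```

Splitting the two destinations at record time mirrors the abstract's distinction between updating the shared localization map and serving data to the external requester.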
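
Patent 11163309 and application 20190196481 describe characterizing the spatial redundancy between two sensors from the overlap of their fields of view and disabling navigational actions that depend on the overlapped region when redundancy falls below a threshold. The sketch below reduces fields of view to horizontal angular intervals in degrees, which is an assumption; the patented method derives fields of view from surfaces represented in the sensor images.

```python
def overlap_deg(fov_a, fov_b):
    """Width, in degrees, of the overlap of two (start, end) angular intervals."""
    start = max(fov_a[0], fov_b[0])
    end = min(fov_a[1], fov_b[1])
    return max(0.0, end - start)

def permitted_actions(fov_a, fov_b, actions, threshold_deg=20.0):
    """Drop any action whose watched region lies in the overlap when redundancy is low."""
    redundancy = overlap_deg(fov_a, fov_b)
    if redundancy >= threshold_deg:
        return list(actions)                  # both sensors still cover the region: keep everything
    lo = max(fov_a[0], fov_b[0])
    hi = min(fov_a[1], fov_b[1])
    return [a for a in actions
            if not (lo <= a["region_center_deg"] <= hi)]  # disable actions tied to the thin overlap

if __name__ == "__main__":
    camera_fov = (-35.0, 35.0)   # camera coverage this scan cycle (e.g. partly occluded)
    lidar_fov = (25.0, 180.0)    # lidar coverage this scan cycle
    actions = [{"name": "unprotected_left_turn", "region_center_deg": 30.0},
               {"name": "lane_keep", "region_center_deg": 120.0}]
    print([a["name"] for a in permitted_actions(camera_fov, lidar_fov, actions)])
```

Running the example prints ['lane_keep']: the 10-degree overlap falls below the 20-degree threshold, so the left turn that depends on objects in that region is disabled for the cycle.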
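
Patent 10663977 and application 20190354111 describe lowering the sensitivity threshold for querying a remote operator when the tested wireless networks degrade, and streaming sensor data over the best-performing networks once an event is triggered. The threshold values, bandwidth cutoff, and network names below are invented for illustration:

```python
BASE_THRESHOLD = 0.8      # condition score needed to ask for help under good connectivity
DEGRADED_THRESHOLD = 0.5  # ask for help earlier when connectivity is degrading

def sensitivity_threshold(network_bandwidths_mbps, degraded_below_mbps=5.0):
    """Lower the trigger threshold when every tested network is degrading."""
    degraded = all(bw < degraded_below_mbps for bw in network_bandwidths_mbps.values())
    return DEGRADED_THRESHOLD if degraded else BASE_THRESHOLD

def maybe_trigger_remote_operator(condition_score, network_bandwidths_mbps):
    """Return the subset of networks to stream sensor data over, or None if no event fires."""
    if condition_score <= sensitivity_threshold(network_bandwidths_mbps):
        return None
    # During the event, pick the networks that performed best near this location.
    ranked = sorted(network_bandwidths_mbps, key=network_bandwidths_mbps.get, reverse=True)
    return ranked[:2]

if __name__ == "__main__":
    networks = {"carrier_a_lte": 3.2, "carrier_b_lte": 1.7, "municipal_wifi": 0.4}  # measured Mbps
    print(maybe_trigger_remote_operator(condition_score=0.6, network_bandwidths_mbps=networks))
```

With every tested network under 5 Mbps the threshold drops to 0.5, so a condition score of 0.6 triggers the event and the two best-performing networks are selected for streaming.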
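
Application 20140281895 describes extracting the metadata that pertains to a user-selected portion of a webpage, reshaping it with quote format rules into quote metadata, and displaying a preview of the resulting quote content item. The per-element metadata layout and format rules below are hypothetical:

```python
PAGE_METADATA = {               # assumed per-element metadata keyed by element id
    "p-17": {"text": "The quick brown fox jumps over the lazy dog.",
             "author": "A. Writer", "url": "https://example.com/article"},
}

QUOTE_FORMAT_RULES = {"max_chars": 40, "wrap": '"{}"', "attribution": "- {author} ({url})"}

def extract_selection_metadata(selected_element_id):
    """Pull only the metadata portion describing the selected content."""
    return dict(PAGE_METADATA[selected_element_id])

def to_quote_metadata(selection_meta, rules=QUOTE_FORMAT_RULES):
    """Apply quote-format rules: truncate, wrap in quotation marks, add attribution."""
    text = selection_meta["text"][: rules["max_chars"]].rstrip()
    return {"quote": rules["wrap"].format(text),
            "attribution": rules["attribution"].format(**selection_meta)}

def render_preview(quote_meta):
    """A plain-text stand-in for the preview pane of the reformatted quote content item."""
    return f'{quote_meta["quote"]}\n{quote_meta["attribution"]}'

if __name__ == "__main__":
    print(render_preview(to_quote_metadata(extract_selection_metadata("p-17"))))
```

The preview here is plain text standing in for the preview pane the application describes.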