Patents by Inventor JOEL PAZHAYAMPALLIL

JOEL PAZHAYAMPALLIL has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240094399
    Abstract: A method includes: accessing a depth map, generated by a depth sensor arranged on a vehicle, including a set of pixels representing relative positions and radial velocities of surfaces relative to the depth sensor; correlating a cluster of pixels exhibiting congruent radial velocities with an object in the field of view of the depth sensor; aggregating the cluster of pixels into a three-dimensional object representation of the object; classifying the object into an object class based on congruence between the three-dimensional object representation and a geometry of the object class; characterizing motion of the object based on positions and radial velocities of surfaces represented by the cluster of pixels; and generating a motion command based on the motion of the object and a set of motion characteristics of the object class.
    Type: Application
    Filed: November 28, 2023
    Publication date: March 21, 2024
    Inventors: Joel Pazhayampallil, Jeremy Templeton, Paul Shved
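
A minimal sketch of the clustering step this abstract describes: pixels are grouped when they are close in 3-D position and exhibit congruent radial velocities. The flood-fill scheme, the pos_eps/vel_eps tolerances, and the toy data are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def cluster_by_radial_velocity(points, velocities, pos_eps=0.5, vel_eps=0.2):
    """Greedy flood fill: a pixel joins a cluster when it is near an
    existing member in 3-D position AND its radial velocity is congruent.
    points: (N, 3) relative positions; velocities: (N,) radial velocities."""
    labels = -np.ones(len(points), dtype=int)
    next_label = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = next_label
        frontier = [seed]
        while frontier:
            j = frontier.pop()
            near = ((np.linalg.norm(points - points[j], axis=1) < pos_eps)
                    & (np.abs(velocities - velocities[j]) < vel_eps)
                    & (labels == -1))
            for k in np.flatnonzero(near):
                labels[k] = next_label
                frontier.append(k)
        next_label += 1
    return [np.flatnonzero(labels == c) for c in range(next_label)]

# Toy depth map: one static surface and one surface approaching at ~4 m/s.
rng = np.random.default_rng(0)
pts = np.vstack([rng.random((50, 3)), rng.random((50, 3)) + [5.0, 0.0, 0.0]])
vels = np.concatenate([np.zeros(50), np.full(50, -4.0)])
print([len(c) for c in cluster_by_radial_velocity(pts, vels)])  # ~[50, 50]
```
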
  • Publication number: 20230333252
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: estimating a stopping duration, for the autonomous vehicle to reach a full stop, based on a current speed of the autonomous vehicle; calculating a critical time, offset from the current time by the stopping duration; detecting an object in a scan image, of a field proximal the autonomous vehicle, captured by a sensor on the autonomous vehicle at the current time; based on the scan image, deriving a current location and motion of the object; calculating a future state boundary that represents a ground area accessible to the object up to the critical time based on the current location and motion of the object and a set of predefined motion limit assumptions for generic objects proximal public roads; and electing a navigational action to avoid entry into the future state boundary prior to the critical time.
    Type: Application
    Filed: June 16, 2023
    Publication date: October 19, 2023
    Inventor: Joel Pazhayampallil
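
The core quantities in this abstract, a stopping duration, a critical time, and a worst-case future state boundary, can be pictured as follows. The motion-limit constants, the disc-shaped boundary, and the sampled-path check are assumptions for illustration, not the patented method.

```python
import math

EGO_MAX_DECEL = 6.0      # m/s^2, assumed braking capability of the vehicle
MAX_OBJECT_SPEED = 30.0  # m/s, assumed limit for generic objects near roads
MAX_OBJECT_ACCEL = 9.0   # m/s^2, assumed worst-case object acceleration

def stopping_duration(ego_speed):
    """Time for the autonomous vehicle to reach a full stop from ego_speed."""
    return ego_speed / EGO_MAX_DECEL

def future_state_boundary(obj_pos, obj_speed, t):
    """Worst-case disc of ground area the object could reach within t
    seconds, accelerating at the assumed limit until the speed cap."""
    t_accel = max(0.0, min(t, (MAX_OBJECT_SPEED - obj_speed) / MAX_OBJECT_ACCEL))
    dist = (obj_speed * t_accel + 0.5 * MAX_OBJECT_ACCEL * t_accel ** 2
            + MAX_OBJECT_SPEED * max(0.0, t - t_accel))
    return obj_pos, dist                          # (center, radius)

def path_enters_boundary(ego_path, center, radius):
    """True if any sampled ego position up to the critical time falls
    inside the object's future state boundary."""
    return any(math.dist(p, center) < radius for p in ego_path)

t_crit = stopping_duration(20.0)                  # ego at 20 m/s -> ~3.3 s
center, radius = future_state_boundary((40.0, 5.0), obj_speed=2.0, t=t_crit)
ego_path = [(x, 0.0) for x in range(0, 41, 5)]    # straight-ahead samples
print(round(radius, 1), path_enters_boundary(ego_path, center, radius))
```
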
  • Patent number: 11719821
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: accessing a first scan image containing data captured by a sensor on the autonomous vehicle at a first time; identifying a first group of points in the first scan image representing an object in a field proximal the autonomous vehicle; characterizing a first motion of the object at the first time based on the first group of points; characterizing an uncertainty of the first motion of the object at the first time; calculating a predicted second uncertainty of a second motion of the object at a second time based on the first motion of the object and motion of the autonomous vehicle at the first time; and, in response to the predicted second uncertainty falling below the uncertainty, muting the object from braking consideration for object avoidance by the autonomous vehicle at the second time.
    Type: Grant
    Filed: February 22, 2021
    Date of Patent: August 8, 2023
    Assignee: BlueSpace.ai, Inc.
    Inventors: Joel Pazhayampallil, Christine Moon
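
The muting criterion above compares the object's current motion uncertainty with the uncertainty predicted for the next scan. A minimal sketch under an assumed one-dimensional Kalman-style variance model; the q and r noise values are invented for illustration and are not the patent's estimator.

```python
def predicted_motion_uncertainty(p, q, r):
    """One Kalman-style cycle ahead: prior variance p grows by process
    noise q, then contracts by the expected measurement (noise r).
    Returns the predicted posterior variance at the next scan time."""
    p_pred = p + q
    return p_pred * r / (p_pred + r)

def mute_from_braking(p_now, q=0.01, r=0.04):
    """Mute the object from braking consideration when its motion estimate
    is expected to be *more* certain by the next scan cycle."""
    return predicted_motion_uncertainty(p_now, q, r) < p_now

print(mute_from_braking(0.5))    # True: estimate will sharpen, defer braking
print(mute_from_braking(0.001))  # False: already sharp, keep considering
```
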
  • Patent number: 11714193
    Abstract: One variation of a method for registering distance scan data includes: accessing a first distance scan recorded, at a first time, by a first depth sensor defining a first field of view; accessing a second distance scan recorded, at approximately the first time, by a second depth sensor defining a second field of view overlapping a portion of the first field of view; calculating a first set of lines represented by points in a first portion of the first distance scan overlapping a second portion of the second distance scan; calculating a second set of lines represented by points in the second portion of the second distance scan; calculating skew distances between each pair of corresponding lines in the first and second sets of lines; and calculating an alignment transformation that aligns the first distance scan and the second distance scan to minimize skew distances between corresponding pairs of lines.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: August 1, 2023
    Inventors: Paul Foster, Joel Pazhayampallil, Lei Liu
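
The skew distance at the heart of this registration method is the minimum distance between two non-intersecting 3-D lines. A sketch with an assumed line representation (point plus unit direction) and a cost function an off-the-shelf optimizer could minimize; the alignment search itself is omitted.

```python
import numpy as np

def skew_distance(p1, d1, p2, d2):
    """Minimum distance between two 3-D lines, each given as a point p and
    a unit direction d; falls back to point-to-line distance when the
    lines are (nearly) parallel."""
    n = np.cross(d1, d2)
    nn = np.linalg.norm(n)
    if nn < 1e-9:                                  # parallel lines
        v = p2 - p1
        return np.linalg.norm(v - np.dot(v, d1) * d1)
    return abs(np.dot(p2 - p1, n)) / nn

def alignment_cost(transform, lines_a, lines_b):
    """Sum of skew distances between corresponding line pairs after a 4x4
    rigid transform of the first scan; an optimizer over the six pose
    parameters (e.g. scipy.optimize.minimize) would drive this to zero."""
    R, t = transform[:3, :3], transform[:3, 3]
    return sum(skew_distance(R @ p1 + t, R @ d1, p2, d2)
               for (p1, d1), (p2, d2) in zip(lines_a, lines_b))

line_a = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
line_b = (np.array([0.0, 1.0, 1.0]), np.array([0.0, 1.0, 0.0]))
print(skew_distance(*line_a, *line_b))                # -> 1.0
print(alignment_cost(np.eye(4), [line_a], [line_b]))  # identity: still 1.0
```
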
  • Publication number: 20230213635
    Abstract: A method includes: deriving a first absolute motion of the first optical sensor based on radial positions, azimuthal positions, radial distances, and radial velocities of points in a first cluster of points representing a first static reference surface in a first frame captured by the first optical sensor; deriving a second absolute motion of the second optical sensor based on radial positions, azimuthal positions, radial distances, and radial velocities of points in a second cluster of points representing the first static reference surface in a second frame captured by the second optical sensor; calculating a rotational offset between the first optical sensor and the second optical sensor based on the first absolute motion and the second absolute motion; and aligning a third frame captured by the first optical sensor with a fourth frame captured by the second optical sensor based on the rotational offset.
    Type: Application
    Filed: December 29, 2022
    Publication date: July 6, 2023
    Inventors: Joel Pazhayampallil, Jasprit S. Gill
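
A sketch of the two ingredients this abstract combines: recovering each sensor's absolute (ego) velocity from the radial velocities of static points by least squares, then reading the rotational offset off the two velocity estimates. The planar geometry and the measurement model r_i = -d_i . v are simplifying assumptions.

```python
import numpy as np

def ego_velocity_from_static_points(dirs, radial_vels):
    """Least-squares ego velocity: a static point's measured radial
    velocity is the sensor's own motion projected onto the line of sight,
    r_i = -d_i . v. dirs: (N, 2) unit line-of-sight directions."""
    v, *_ = np.linalg.lstsq(-dirs, radial_vels, rcond=None)
    return v

def rotational_offset(v_a, v_b):
    """Planar (yaw) offset between two sensors that both observe the same
    rigid ego motion, each expressed in its own frame."""
    return np.arctan2(v_b[1], v_b[0]) - np.arctan2(v_a[1], v_a[0])

rng = np.random.default_rng(0)
dirs = rng.normal(size=(100, 2))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
v_true = np.array([10.0, 0.0])           # ego motion in sensor A's frame
theta = np.deg2rad(5.0)                  # sensor B mounted 5 degrees off
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v_a = ego_velocity_from_static_points(dirs, -dirs @ v_true)
v_b = ego_velocity_from_static_points(dirs, -dirs @ (R @ v_true))
print(np.rad2deg(rotational_offset(v_a, v_b)))  # -> ~5.0 degrees
```
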
  • Patent number: 11235785
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: estimating a stopping duration, for the autonomous vehicle to reach a full stop, based on a current speed of the autonomous vehicle; calculating a critical time, offset from the current time by the stopping duration; detecting an object in a scan image, of a field proximal the autonomous vehicle, captured by a sensor on the autonomous vehicle at the current time; based on the scan image, deriving a current location and motion of the object; calculating a future state boundary that represents a ground area accessible to the object up to the critical time based on the current location and motion of the object and a set of predefined motion limit assumptions for generic objects proximal public roads; and electing a navigational action to avoid entry into the future state boundary prior to the critical time.
    Type: Grant
    Filed: February 22, 2021
    Date of Patent: February 1, 2022
    Assignee: BlueSpace.ai, Inc.
    Inventors: Joel Pazhayampallil, Christine Moon
  • Patent number: 11163309
    Abstract: One variation of a method for autonomous navigation includes, at an autonomous vehicle: recording a first image via a first sensor and a second image via a second sensor during a scan cycle; calculating a first field of view of the first sensor and a second field of view of the second sensor during the scan cycle based on surfaces represented in the first and second images; characterizing a spatial redundancy between the first sensor and the second sensor based on an overlap of the first and second fields of view; in response to the spatial redundancy remaining below a threshold redundancy, disabling execution by the autonomous vehicle of a first navigational action, an action informed by the presence of external objects within a first region of a scene around the autonomous vehicle spanning the overlap; and autonomously executing navigational actions, excluding the first navigational action, following the scan cycle.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: November 2, 2021
    Inventors: Kah Seng Tay, Joel Pazhayampallil, Brody Huval
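
The redundancy check can be pictured in one angular dimension: compare the overlap of the fields of view actually observed this scan cycle (which shrink under occlusion) against the designed overlap, and disable overlap-dependent actions when the ratio drops. The FOV intervals, threshold, and action names below are invented for illustration.

```python
NOMINAL_A = (-60.0, 20.0)   # deg, assumed design FOV of the first sensor
NOMINAL_B = (-20.0, 60.0)   # deg, assumed design FOV of the second sensor

def interval_overlap(a, b):
    """Length of the overlap of two angular intervals, in degrees."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def spatial_redundancy(actual_a, actual_b):
    """Observed overlap this scan cycle, as a fraction of the designed
    overlap of the two sensors."""
    return interval_overlap(actual_a, actual_b) / interval_overlap(NOMINAL_A, NOMINAL_B)

def allowed_actions(actual_a, actual_b, threshold=0.5):
    """Drop the action that depends on the overlap region when the two
    sensors no longer redundantly cover it."""
    actions = {"lane_keep", "unprotected_left_turn"}
    if spatial_redundancy(actual_a, actual_b) < threshold:
        actions.discard("unprotected_left_turn")  # assumed overlap-dependent
    return actions

print(allowed_actions((-60, 20), (-20, 60)))  # full redundancy: both allowed
print(allowed_actions((-60, -5), (10, 60)))   # occluded: left turn disabled
```
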
  • Publication number: 20210261158
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: for each scan cycle in a sequence of scan cycles, identifying a group of points, representing an object in a field proximal the autonomous vehicle, in a scan image captured by a sensor on the autonomous vehicle at a scan time; and calculating a function relating possible linear and angular motion of the object at the scan time based on a correlation between radial velocities and positions of points in the group of points. The method also includes: estimating current linear and angular motion of the object based on an intersection of a current function, derived from a scan image containing data captured at the current time, and a preceding function, derived from a scan image containing data captured prior to the current time; and electing a navigational action based on the current linear and angular motion of the object.
    Type: Application
    Filed: February 22, 2021
    Publication date: August 26, 2021
    Inventors: Joel Pazhayampallil, Christine Moon
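
Each scan cycle yields a linear constraint (the "function") on the object's planar motion [vx, vy, omega]; stacking two cycles' constraints and solving is one way to picture the "intersection" in this abstract. The 2-D rigid-body measurement model and the assumption that the object's reference point c is known are simplifications for illustration.

```python
import numpy as np

def motion_constraints(points, center, radial_vels):
    """One scan's linear system A @ [vx, vy, omega] = b for a rigid object:
    each point contributes d_i . (v + omega * perp(p_i - c)) = r_i, where
    d_i is the unit line-of-sight direction to point p_i."""
    dirs = points / np.linalg.norm(points, axis=1, keepdims=True)
    rel = points - center
    perp = np.column_stack([-rel[:, 1], rel[:, 0]])   # omega x r in 2-D
    A = np.column_stack([dirs, np.sum(dirs * perp, axis=1)])
    return A, radial_vels

def solve_motion(scans):
    """Intersect the constraint functions of several scan cycles by
    stacking them into one least-squares problem."""
    A = np.vstack([a for a, _ in scans])
    b = np.concatenate([b for _, b in scans])
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol                                        # [vx, vy, omega]

rng = np.random.default_rng(1)
v_true, omega_true, c = np.array([1.0, 0.5]), 0.2, np.array([10.0, 0.0])

def simulate_scan():
    pts = c + rng.normal(scale=1.0, size=(8, 2))
    dirs = pts / np.linalg.norm(pts, axis=1, keepdims=True)
    rel = pts - c
    vels = dirs @ v_true + omega_true * (-dirs[:, 0] * rel[:, 1]
                                         + dirs[:, 1] * rel[:, 0])
    return motion_constraints(pts, c, vels)

print(solve_motion([simulate_scan(), simulate_scan()]))  # ~[1.0, 0.5, 0.2]
```
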
  • Publication number: 20210261159
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: accessing a first scan image containing data captured by a sensor on the autonomous vehicle at a first time; identifying a first group of points in the first scan image representing an object in a field proximal the autonomous vehicle; characterizing a first motion of the object at the first time based on the first group of points; characterizing an uncertainty of the first motion of the object at the first time; calculating a predicted second uncertainty of a second motion of the object at a second time based on the first motion of the object and motion of the autonomous vehicle at the first time; and, in response to the predicted second uncertainty falling below the uncertainty, muting the object from braking consideration for object avoidance by the autonomous vehicle at the second time.
    Type: Application
    Filed: February 22, 2021
    Publication date: August 26, 2021
    Inventors: Joel Pazhayampallil, Christine Moon
  • Publication number: 20210261157
    Abstract: A method for autonomous navigation of an autonomous vehicle includes: estimating a stopping duration, for the autonomous vehicle to reach a full stop, based on a current speed of the autonomous vehicle; calculating a critical time from the current time by the stopping duration; detecting an object in a scan image, of a field proximal the autonomous vehicle, captured by a sensor on the autonomous vehicle at the current time; based on the scan image, deriving a current location and motion of the object; calculating a future state boundary that represents a ground area accessible to the object up to the critical time based on the current location and motion of the object and a set of predefined motion limit assumptions for generic objects proximal public roads; and electing a navigational action to avoid entry into the future state boundary prior to the critical time.
    Type: Application
    Filed: February 22, 2021
    Publication date: August 26, 2021
    Inventors: Joel Pazhayampallil, Christine Moon
  • Publication number: 20200400443
    Abstract: Systems and methods discussed herein involve localization of a device. In one implementation, a discrepancy flag associated with a particular geographic location is received from a device operating in a geographic region over a network. The discrepancy flag is generated based on a comparison of a local localization map of the device with scan data for a field of view of at least one sensor of the device. The scan data is received from the device over the network. A segment of a global localization map corresponding to the discrepancy flag is updated using the scan data. A plurality of devices operating in the geographic region is identified. The segment of the global localization map is selectively pushed to the plurality of devices based on a network connectivity of each of the plurality of devices and a location of each of the plurality of devices within the geographic region.
    Type: Application
    Filed: September 2, 2020
    Publication date: December 24, 2020
    Inventors: Joel Pazhayampallil, Sameep Tandon
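
A toy server-side flow for this abstract: ingest a discrepancy flag and its scan data, patch the flagged map segment, and push that segment only to connected devices in the affected region. All names and the data model are assumptions, not the patent's system.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    device_id: str
    region: str
    connected: bool

@dataclass
class MapServer:
    """Toy global-map service: segment payloads keyed by segment id."""
    segments: dict = field(default_factory=dict)
    devices: list = field(default_factory=list)

    def on_discrepancy_flag(self, segment_id, scan_data, region):
        # 1. Update the flagged segment of the global localization map.
        self.segments[segment_id] = scan_data
        # 2. Identify devices operating in the affected region, and
        # 3. selectively push only to those currently connected.
        return [d.device_id for d in self.devices
                if d.region == region and d.connected]

server = MapServer(devices=[Device("veh-1", "sf", True),
                            Device("veh-2", "sf", False),
                            Device("veh-3", "oak", True)])
print(server.on_discrepancy_flag("seg-42", b"<scan>", "sf"))  # ['veh-1']
```
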
  • Publication number: 20190196481
    Abstract: One variation of a method for autonomous navigation includes, at an autonomous vehicle: recording a first image via a first sensor and a second image via a second sensor during a scan cycle; calculating a first field of view of the first sensor and a second field of view of the second sensor during the scan cycle based on surfaces represented in the first and second images; characterizing a spatial redundancy between the first sensor and the second sensor based on an overlap of the first and second fields of view; in response to the spatial redundancy remaining below a threshold redundancy, disabling execution by the autonomous vehicle of a first navigational action, an action informed by the presence of external objects within a first region of a scene around the autonomous vehicle spanning the overlap; and autonomously executing navigational actions, excluding the first navigational action, following the scan cycle.
    Type: Application
    Filed: November 30, 2018
    Publication date: June 27, 2019
    Inventors: Kah Seng Tay, Joel Pazhayampallil, Brody Huval
  • Publication number: 20190137287
    Abstract: One variation of a method for detecting and managing changes along road surfaces for autonomous vehicles includes: at approximately a first time, receiving a first discrepancy flag from a first vehicle via a wireless network, the first discrepancy flag indicating a first discrepancy between a particular feature detected proximal a first geospatial location at the first time by the first vehicle and a particular known immutable surface—proximal the first geospatial location—represented in a first localization map stored locally on the first vehicle; receiving sensor data, representing the first discrepancy, from the first vehicle at approximately the first time; updating a first segment of a global localization map representing immutable surfaces proximal the first geospatial location based on the sensor data; and identifying a second vehicle currently executing a second route intersecting the first geospatial location.
    Type: Application
    Filed: June 27, 2018
    Publication date: May 9, 2019
    Inventors: Joel Pazhayampallil, Sameep Tandon
  • Patent number: 9970772
    Abstract: A method for localizing a vehicle in a digital map. GPS raw measurement data is retrieved from satellites. A digital map of a region traveled by the vehicle is retrieved from a database based on the raw measurement data. The digital map includes a geographic mapping of a traveled road and registered roadside objects. The registered roadside objects are positionally identified in the digital map by earth-fixed coordinates. Roadside objects are sensed in the region traveled by the vehicle using distance data and bearing angle data. The sensed roadside objects are matched to the digital map. A vehicle position is determined on the traveled road by fusing raw measurement data and sensor measurements of the identified roadside objects. The position of the vehicle is represented as a function of the linearized raw measurement data and the sensor measurement data, as derived by a Jacobian matrix and normalized measurements, respectively.
    Type: Grant
    Filed: June 8, 2016
    Date of Patent: May 15, 2018
    Assignee: GM Global Technology Operations LLC
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Joel Pazhayampallil, Kevin A. O'Dea, James N. Nickolaou, Mark E. Shields
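
The "linearizing ... as derived by a Jacobian matrix" language suggests an extended-Kalman-style update. A generic range/bearing EKF measurement update is sketched below as one plausible reading; the state layout, noise values, and measurement model are assumptions, not the patent's estimator.

```python
import numpy as np

def ekf_update(x, P, z, landmark, R):
    """One extended-Kalman update fusing a range/bearing measurement of a
    map-registered roadside object into the vehicle pose x = [px, py, yaw],
    linearizing the measurement model via its Jacobian H."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx**2 + dy**2
    r = np.sqrt(q)
    h = np.array([r, np.arctan2(dy, dx) - x[2]])   # predicted range, bearing
    H = np.array([[-dx / r, -dy / r,  0.0],
                  [ dy / q, -dx / q, -1.0]])       # Jacobian of h wrt x
    y = z - h
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi    # wrap bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P

x = np.array([0.0, 0.0, 0.0])         # GPS-seeded pose estimate
P = np.diag([4.0, 4.0, 0.1])          # loose prior from raw GPS
z = np.array([10.2, 0.05])            # sensed range/bearing to the object
x, P = ekf_update(x, P, z, landmark=(10.0, 0.0), R=np.diag([0.25, 0.01]))
print(x)                              # pose pulled toward the map landmark
```
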
  • Publication number: 20160282128
    Abstract: A method for localizing a vehicle in a digital map. GPS raw measurement data is retrieved from satellites. A digital map of a region traveled by the vehicle is retrieved from a database based on the raw measurement data. The digital map includes a geographic mapping of a traveled road and registered roadside objects. The registered roadside objects are positionally identified in the digital map by earth-fixed coordinates. Roadside objects are sensed in the region traveled by the vehicle using distance data and bearing angle data. The sensed roadside objects are matched to the digital map. A vehicle position is determined on the traveled road by fusing raw measurement data and sensor measurements of the identified roadside objects. The position of the vehicle is represented as a function of the linearized raw measurement data and the sensor measurement data, as derived by a Jacobian matrix and normalized measurements, respectively.
    Type: Application
    Filed: June 8, 2016
    Publication date: September 29, 2016
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Joel Pazhayampallil, Kevin A. O'Dea, James N. Nickolaou, Mark E. Shields
  • Patent number: 9435653
    Abstract: A method and system for localizing a vehicle in a digital map includes generating GPS coordinates of the vehicle on the traveled road and retrieving from a database a digital map of a region traveled by the vehicle based on the location of the GPS coordinates. The digital map includes a geographic mapping of a traveled road and registered roadside objects. The registered roadside objects are positionally identified in the digital map by longitudinal and lateral coordinates. Roadside objects in the region traveled are sensed by the vehicle. The sensed roadside objects are identified on the digital map. A vehicle position on the traveled road is determined utilizing coordinates of the sensed roadside objects identified in the digital map. The position of the vehicle is localized in the road as a function of the GPS coordinates and the determined vehicle position utilizing the coordinates of the sensed roadside objects.
    Type: Grant
    Filed: September 17, 2013
    Date of Patent: September 6, 2016
    Assignee: GM Global Technology Operations LLC
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Joel Pazhayampallil, Kevin A. O'Dea, James N. Nickolaou, Mark E. Shields
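
A deliberately simplified picture of fusing GPS with sensed roadside objects: transform each sensed object into the map frame using the GPS fix, associate it with the nearest registered object, and correct the fix by the mean residual of the matches. The nearest-neighbor gating and mean-offset correction are illustrative choices, not the patented fusion.

```python
import numpy as np

def localize_with_roadside_objects(gps_pos, sensed_local, map_objects, gate=3.0):
    """Correct a GPS fix using roadside objects registered in the digital
    map. sensed_local: object positions in the vehicle frame (heading
    assumed aligned with the map for simplicity); map_objects: (M, 2)."""
    corrections = []
    for s in sensed_local:
        world = gps_pos + s                       # sensed object, map frame
        d = np.linalg.norm(map_objects - world, axis=1)
        j = np.argmin(d)
        if d[j] < gate:                           # gate out spurious matches
            corrections.append(map_objects[j] - world)
    if not corrections:
        return gps_pos                            # no matches: GPS alone
    return gps_pos + np.mean(corrections, axis=0)

map_objs = np.array([[10.0, 2.0], [20.0, -3.0], [30.0, 2.5]])
sensed = [np.array([10.0, 2.0]) - [1.5, 0.5],     # each seen ~[1.5, 0.5] off
          np.array([20.0, -3.0]) - [1.5, 0.5]]
print(localize_with_roadside_objects(np.array([0.0, 0.0]), sensed, map_objs))
# -> ~[1.5, 0.5]: the roadside objects pull the GPS fix onto the road
```
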
  • Patent number: 9255988
    Abstract: A method of detecting and tracking objects using multiple radar sensors. Objects relative to a host vehicle are detected from radar data generated by a sensing device. The radar data includes Doppler measurement data. Clusters are formed, by a processor, as a function of the radar data. Each cluster represents a respective object. Each respective object is classified, by the processor, as stationary or non-stationary based on the Doppler measurement data of each object and a vehicle speed of the host vehicle. Target tracking is applied, by the processor, on an object using Doppler measurement data over time in response to the object classified as a non-stationary object; otherwise, updating an occupancy grid in response to classifying the object as a stationary object.
    Type: Grant
    Filed: January 16, 2014
    Date of Patent: February 9, 2016
    Assignee: GM Global Technology Operations LLC
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Kevin A. O'Dea, Joel Pazhayampallil, Mohannad Murad, James N. Nickolaou
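
The stationary/non-stationary test can be sketched from the Doppler physics: a stationary detection's Doppler is fully explained by the host's own speed projected onto the line of sight. The forward-mounted-radar geometry, tolerance, and median test are assumptions; the downstream routing (tracker versus occupancy grid) is indicated in the comments.

```python
import numpy as np

def classify_cluster(azimuths, dopplers, host_speed, tol=0.5):
    """For a forward-mounted radar on a host moving at host_speed, a
    stationary detection satisfies v_d ~= -host_speed * cos(azimuth);
    clusters whose median residual exceeds tol are non-stationary."""
    residual = np.abs(dopplers - (-host_speed * np.cos(azimuths)))
    return "stationary" if np.median(residual) < tol else "non-stationary"

az = np.deg2rad(np.array([-5.0, 0.0, 5.0]))
print(classify_cluster(az, -20.0 * np.cos(az), host_speed=20.0))
# 'stationary' -> would update the occupancy grid
print(classify_cluster(az, np.array([3.0, 3.1, 2.9]), host_speed=20.0))
# 'non-stationary' -> would be handed to the Doppler-aided target tracker
```
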
  • Publication number: 20150198711
    Abstract: A method of detecting and tracking objects using multiple radar sensors. Objects relative to a host vehicle are detected from radar data generated by a sensing device. The radar data includes Doppler measurement data. Clusters are formed, by a processor, as a function of the radar data. Each cluster represents a respective object. Each respective object is classified, by the processor, as stationary or non-stationary based on the Doppler measurement data of each object and a vehicle speed of the host vehicle. Target tracking is applied, by the processor, on an object using Doppler measurement data over time in response to the object classified as a non-stationary object; otherwise, updating an occupancy grid in response to classifying the object as a stationary object.
    Type: Application
    Filed: January 16, 2014
    Publication date: July 16, 2015
    Applicant: GM Global Technology Operations LLC
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Kevin A. O'Dea, Joel Pazhayampallil, Mohannad Murad, James N. Nickolaou
  • Publication number: 20150081211
    Abstract: A method and system for localizing a vehicle in a digital map includes generating GPS coordinates of the vehicle on the traveled road and retrieving from a database a digital map of a region traveled by the vehicle based on the location of the GPS coordinates. The digital map includes a geographic mapping of a traveled road and registered roadside objects. The registered roadside objects are positionally identified in the digital map by longitudinal and lateral coordinates. Roadside objects in the region traveled are sensed by the vehicle. The sensed roadside objects are identified on the digital map. A vehicle position on the traveled road is determined utilizing coordinates of the sensed roadside objects identified in the digital map. The position of the vehicle is localized in the road as a function of the GPS coordinates and the determined vehicle position utilizing the coordinates of the sensed roadside objects.
    Type: Application
    Filed: September 17, 2013
    Publication date: March 19, 2015
    Applicant: GM Global Technology Operations LLC
    Inventors: Shuqing Zeng, Jeremy A. Salinger, Bakhtiar B. Litkouhi, Joel Pazhayampallil, Kevin A. O'Dea, James N. Nickolaou, Mark E. Shields
  • Patent number: 8798841
    Abstract: A system and method designed to improve sensor visibility for a host vehicle operating in an autonomous driving mode when one or more forward-looking sensors are being occluded or obstructed. According to an exemplary embodiment, when a forward-looking object detection sensor is being obstructed by a target vehicle located closely ahead of the host vehicle, the method determines if lateral movement by the host vehicle within its own lane is appropriate to improve sensor visibility around the target vehicle. If lateral movement is deemed appropriate, the method generates lateral movement commands that dictate the direction and distance of the lateral movement by the host vehicle. This may enable the object detection sensors to at least partially see around the obstructing target vehicle and improve the preview distance of the sensors.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: August 5, 2014
    Assignee: GM Global Technology Operations LLC
    Inventors: James N. Nickolaou, Joel Pazhayampallil, Michael P. Turski
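
The in-lane nudge this abstract describes can be reduced to a small decision rule: if the forward sensors are sufficiently occluded by the lead vehicle, command a lateral offset, within the lane margin, toward the lead vehicle's less-occupied side to improve preview distance. Every constant below is an assumption for illustration.

```python
def lateral_nudge(occluded_fraction, target_offset, lane_width=3.7,
                  ego_width=1.9, threshold=0.6):
    """Return a signed lateral command in meters (negative = left), or 0.0
    when the sensor view is clear enough. target_offset is the lead
    vehicle's lateral position in the lane (positive = right of center)."""
    if occluded_fraction < threshold:
        return 0.0                        # sensors see enough; hold center
    margin = (lane_width - ego_width) / 2.0   # room to move within the lane
    # Move away from the side the lead vehicle occupies, so the forward
    # sensors can preview traffic past its nearer edge.
    direction = -1.0 if target_offset > 0 else 1.0
    return direction * margin

print(lateral_nudge(0.8, target_offset=0.3))  # -> -0.9: nudge left in lane
print(lateral_nudge(0.2, target_offset=0.3))  # -> 0.0: view already clear
```
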