Patents Assigned to CYNGN, INC.
  • Publication number: 20230066189
    Abstract: A method may include obtaining sensor data relating to an autonomous vehicle (AV) and a total measurable world around the AV. The method may include identifying an operating environment of the AV and determining a projected computational load for computing subsystems that facilitate a driving operation performable by the AV corresponding to the identified environment. The method may include off-loading first computing subsystems of the computing subsystems in which computations of the first computing subsystems may be processed by an off-board cloud computing system and processing computations associated with second computing subsystems of the computing subsystems by an on-board computing system.
    Type: Application
    Filed: August 25, 2022
    Publication date: March 2, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
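    Illustrative sketch (not the patented implementation) of the load-based partitioning this abstract describes: given a projected computational load per subsystem for the identified operating environment, subsystems are split between the on-board computer and an off-board cloud system. All names, thresholds, and the selection policy are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Subsystem:
        name: str
        projected_load: float   # estimated compute cost for this environment (arbitrary units)
        latency_critical: bool  # latency-critical work should stay on-board

    def partition_subsystems(subsystems, onboard_budget):
        """Split subsystems into (on-board, off-loaded) under an on-board compute budget."""
        onboard, offloaded = [], []
        # Keep latency-critical subsystems on-board first, then fill the remaining budget
        # with the heaviest of the rest; whatever does not fit is off-loaded to the cloud.
        for sub in sorted(subsystems, key=lambda s: (not s.latency_critical, -s.projected_load)):
            if sub.latency_critical or sub.projected_load <= onboard_budget:
                onboard.append(sub)
                onboard_budget -= sub.projected_load
            else:
                offloaded.append(sub)
        return onboard, offloaded

    if __name__ == "__main__":
        subsystems = [
            Subsystem("perception", 6.0, latency_critical=True),
            Subsystem("prediction", 3.0, latency_critical=False),
            Subsystem("hd_map_update", 4.0, latency_critical=False),
        ]
        onboard, offloaded = partition_subsystems(subsystems, onboard_budget=8.0)
        print([s.name for s in onboard], [s.name for s in offloaded])
    ```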
  • Publication number: 20230060383
    Abstract: An autonomous driving computing system may include an on-board computing system that is configured to perform first operations that include obtaining sensor data relating to an autonomous vehicle (AV). The first operations may include sending the obtained sensor data to an off-board cloud computing system. The autonomous driving computing system may include the off-board cloud computing system, which may be configured to perform second operations that include receiving the sensor data and performing computations relating to the driving operation of the AV based on the obtained sensor data. The second operations may include determining a control signal corresponding to a driving operation and sending the control signal to the on-board computing system. The first operations may involve the on-board computing system receiving, from the off-board cloud computing system, the control signal corresponding to the driving operation of the AV and performing the driving operation by implementing the control signal.
    Type: Application
    Filed: August 25, 2022
    Publication date: March 2, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
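    A minimal sketch of the on-board/off-board round trip the abstract outlines: the on-board system sends sensor data up, the cloud system computes a control signal, and the on-board system implements it. The queues, field names, and toy control logic are assumptions, not details from the publication.

    ```python
    import queue

    # Hypothetical message channels standing in for the vehicle-to-cloud link.
    uplink = queue.Queue()    # on-board -> cloud: sensor data
    downlink = queue.Queue()  # cloud -> on-board: control signals

    def onboard_send(sensor_data):
        """First operations: send obtained sensor data to the off-board system."""
        uplink.put(sensor_data)

    def cloud_step():
        """Second operations: compute a control signal from the received sensor data."""
        sensor_data = uplink.get()
        braking_needed = sensor_data["obstacle_ahead"] and sensor_data["speed"] > 0
        control = {"throttle": 0.0 if sensor_data["obstacle_ahead"] else 0.3,
                   "brake": 0.5 if braking_needed else 0.0}
        downlink.put(control)

    def onboard_apply():
        """First operations, continued: receive the control signal and implement it."""
        control = downlink.get()
        print("applying control:", control)

    onboard_send({"speed": 4.2, "obstacle_ahead": True})
    cloud_step()
    onboard_apply()
    ```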
  • Patent number: 11592565
    Abstract: A method may include obtaining first sensor data from a first sensor system and second sensor data from a second sensor system. The first and the second sensor systems may capture sensor data from a total measurable world. The method may include identifying a first object included in the first sensor data and a second object included in the second sensor data and determining first parameters corresponding to the first object and second parameters corresponding to the second object. The first parameters may be compared with the second parameters and whether the first object and the second object are a same object may be determined based on the comparing the first parameters and the second parameters. Responsive to determining that the first object and the second object are the same object, a set of objects representative of objects in the total measurable world including the same object may be generated.
    Type: Grant
    Filed: July 8, 2022
    Date of Patent: February 28, 2023
    Assignee: CYNGN, INC.
    Inventors: Biao Ma, Lior Tal
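    Illustrative sketch of the parameter-comparison fusion the abstract describes: detections from two sensor systems are compared on a few parameters, and objects judged to be the same are represented once in the fused set. The parameters and tolerances are hypothetical.

    ```python
    import math

    def object_parameters(obj):
        """Hypothetical parameters used for comparison: position, size, and class label."""
        return obj["position"], obj["size"], obj["label"]

    def is_same_object(first, second, pos_tol=1.0, size_tol=0.5):
        """Compare parameters of two detections and decide whether they are the same object."""
        (p1, s1, c1), (p2, s2, c2) = object_parameters(first), object_parameters(second)
        return c1 == c2 and math.dist(p1, p2) <= pos_tol and abs(s1 - s2) <= size_tol

    def fuse(first_objects, second_objects):
        """Build one set of objects representing the total measurable world."""
        fused = list(first_objects)
        for obj2 in second_objects:
            if not any(is_same_object(obj1, obj2) for obj1 in first_objects):
                fused.append(obj2)  # only add detections not already represented
        return fused

    lidar_obj  = {"position": (10.2, 3.1), "size": 1.8, "label": "pedestrian"}
    camera_obj = {"position": (10.6, 3.0), "size": 1.7, "label": "pedestrian"}
    print(len(fuse([lidar_obj], [camera_obj])))  # -> 1, the two detections are merged
    ```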
  • Publication number: 20230040017
    Abstract: A method may include obtaining input information relating to an environment in which an autonomous vehicle (AV) operates. The input information may describe a state of the AV, an operation of the AV within the environment, a property of the environment, or an object included in the environment. The method may include setting up a traffic rule profile for the AV that specifies societal traffic practices corresponding to a location of the environment. The method may include identifying a first traffic rule relevant to the AV based on the obtained input information and determining a first decision corresponding to the traffic rule profile and the first traffic rule. The method may include sending an instruction to a control system of the AV, the instruction describing a given operation of the AV responsive to the traffic rule profile and the first traffic rule according to the first decision.
    Type: Application
    Filed: August 5, 2022
    Publication date: February 9, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
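    A toy sketch of the rule-profile flow in this abstract: a profile is set up for the AV's location, a relevant traffic rule is identified from the input information, and a decision consistent with both is sent as an instruction. The profiles, rule names, and decision logic are illustrative only.

    ```python
    # Hypothetical traffic-rule profiles keyed by location.
    TRAFFIC_RULE_PROFILES = {
        "US": {"drive_side": "right", "stop_sign_full_stop": True},
        "UK": {"drive_side": "left",  "stop_sign_full_stop": True},
    }

    def identify_rule(input_info):
        """Pick the first traffic rule relevant to the current input information."""
        if input_info.get("approaching") == "stop_sign":
            return "stop_sign_full_stop"
        return None

    def decide(profile, rule):
        """Determine a decision consistent with both the profile and the rule."""
        if rule == "stop_sign_full_stop" and profile["stop_sign_full_stop"]:
            return {"operation": "stop", "target_speed": 0.0}
        return {"operation": "proceed"}

    profile = TRAFFIC_RULE_PROFILES["US"]            # profile for the AV's location
    rule = identify_rule({"approaching": "stop_sign"})
    instruction = decide(profile, rule)
    print(instruction)                               # sent to the AV's control system
    ```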
  • Publication number: 20230039556
    Abstract: A method may include obtaining input information relating to an environment in which an autonomous vehicle (AV) operates in which the input information describes at least one of: a state of the AV, an operation of the AV within the environment, a property of the environment, or an object included in the environment. The method may include identifying a region of interest that represents a section of the environment based on the input information and identifying a portion of the environment based on the identified region of interest. The portion of the environment may include an object that affects operation of the AV. The method may include determining a first decision that relates to the object and sending an instruction to a control system of the AV describing a given operation of the AV responsive to the object according to the first decision.
    Type: Application
    Filed: August 5, 2022
    Publication date: February 9, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
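    A minimal sketch of the region-of-interest filtering this abstract describes: a section of the environment is identified from the AV's state, objects inside it are treated as able to affect operation, and a decision is sent to the control system. The ROI shape and the yield/proceed policy are assumptions.

    ```python
    def region_of_interest(av_state):
        """Hypothetical ROI: a box ahead of the AV, longer at higher speed."""
        length = 20.0 + 2.0 * av_state["speed"]
        return {"x_min": 0.0, "x_max": length, "y_min": -3.5, "y_max": 3.5}

    def objects_in_roi(objects, roi):
        """Keep only objects inside the ROI; these can affect operation of the AV."""
        return [o for o in objects
                if roi["x_min"] <= o["x"] <= roi["x_max"]
                and roi["y_min"] <= o["y"] <= roi["y_max"]]

    def decide(relevant_objects):
        """Toy decision: yield if anything is inside the region of interest."""
        return {"operation": "yield"} if relevant_objects else {"operation": "proceed"}

    objects = [{"id": "ped_1", "x": 12.0, "y": 1.0}, {"id": "car_7", "x": 80.0, "y": -2.0}]
    roi = region_of_interest({"speed": 5.0})
    print(decide(objects_in_roi(objects, roi)))  # instruction sent to the control system
    ```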
  • Publication number: 20230040845
    Abstract: A method may include obtaining input information relating to an environment in which an autonomous vehicle (AV) operates, the input information describing at least one of: a state of the AV, an operation of the AV within the environment, a property of the environment, or an object included in the environment. The method may include identifying a first object in the vicinity of the AV based on the obtained input information. The method may include determining a first object rule corresponding to the first object, the first object rule indicating suggested driving behavior for interacting with the first object. The method may include determining a first decision that follows the first object rule and sending an instruction to a control system of the AV, the instruction describing a given operation of the AV responsive to the first object rule according to the first decision.
    Type: Application
    Filed: August 5, 2022
    Publication date: February 9, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
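    Illustrative sketch of an object-rule lookup like the one this abstract describes: each identified object class maps to a rule suggesting driving behavior, and a decision that follows the rule is turned into an instruction. The rule table and values are hypothetical.

    ```python
    # Hypothetical object-rule table: each rule suggests driving behavior for a class of object.
    OBJECT_RULES = {
        "pedestrian":    {"min_gap_m": 3.0, "max_speed_mps": 2.0},
        "forklift":      {"min_gap_m": 2.0, "max_speed_mps": 3.0},
        "static_pallet": {"min_gap_m": 0.5, "max_speed_mps": 5.0},
    }

    def decision_for(obj, current_speed):
        """Determine a decision that follows the rule for the identified object."""
        rule = OBJECT_RULES.get(obj["label"], {"min_gap_m": 1.0, "max_speed_mps": 5.0})
        if obj["distance_m"] < rule["min_gap_m"]:
            return {"operation": "stop"}
        if current_speed > rule["max_speed_mps"]:
            return {"operation": "slow_to", "target_speed": rule["max_speed_mps"]}
        return {"operation": "proceed"}

    print(decision_for({"label": "pedestrian", "distance_m": 5.0}, current_speed=4.0))
    # -> {'operation': 'slow_to', 'target_speed': 2.0}; sent to the AV's control system
    ```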
  • Publication number: 20230031829
    Abstract: A method may include obtaining one or more inputs in which each of the inputs describes at least one of: a state of an autonomous vehicle (AV) or a state of an object; and identifying a prediction context of the AV based on the inputs. The method may also include determining a relevancy of each object of a plurality of objects to the AV in relation to the prediction context; and outputting a set of relevant objects based on the relevancy determination for each of the plurality of objects. Another method may include obtaining a set of objects designated as relevant to operation of an AV; selecting a trajectory prediction approach for a given object based on context of the AV and characteristics of the given object; predicting a trajectory of the given object using the selected trajectory prediction approach; and outputting the given object and the predicted trajectory.
    Type: Application
    Filed: August 2, 2022
    Publication date: February 2, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
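    A sketch of the two methods in this abstract under assumed policies: first, objects are filtered for relevancy given a prediction context; second, a trajectory-prediction approach is selected per object and used to predict its trajectory. The relevancy rules, approach names, and the constant-velocity rollout are illustrative, not the patented logic.

    ```python
    def relevant_objects(objects, prediction_context, max_range=30.0):
        """Keep objects whose distance and motion make them relevant in this context."""
        relevant = []
        for obj in objects:
            moving = abs(obj["speed"]) > 0.1
            # In a 'parking_lot' context even slow, nearby objects matter; on a 'highway'
            # only moving objects within range are kept. (Illustrative policy only.)
            if obj["distance"] <= max_range and (moving or prediction_context == "parking_lot"):
                relevant.append(obj)
        return relevant

    def select_predictor(obj, prediction_context):
        """Choose a trajectory-prediction approach from object class and AV context."""
        if obj["label"] == "pedestrian":
            return "constant_velocity"        # short horizon, simple model
        if prediction_context == "highway":
            return "lane_following"           # vehicles assumed to track their lane
        return "constant_turn_rate"

    def predict(obj, approach, horizon_s=2.0):
        """Toy constant-velocity rollout; a real system would dispatch on the approach."""
        x, y = obj["position"]
        vx, vy = obj["velocity"]
        return (x + vx * horizon_s, y + vy * horizon_s)

    obj = {"label": "car", "distance": 18.0, "speed": 6.0,
           "position": (18.0, 0.0), "velocity": (-6.0, 0.0)}
    for o in relevant_objects([obj], "highway"):
        approach = select_predictor(o, "highway")
        print(o["label"], approach, predict(o, approach))
    ```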
  • Publication number: 20230030786
    Abstract: A method may include obtaining one or more inputs in which each of the inputs describes at least one of: a state of an autonomous vehicle (AV) or a state of an object; and identifying a prediction context of the AV based on the inputs. The method may also include determining a relevancy of each object of a plurality of objects to the AV in relation to the prediction context; and outputting a set of relevant objects based on the relevancy determination for each of the plurality of objects. Another method may include obtaining a set of objects designated as relevant to operation of an AV; selecting a trajectory prediction approach for a given object based on context of the AV and characteristics of the given object; predicting a trajectory of the given object using the selected trajectory prediction approach; and outputting the given object and the predicted trajectory.
    Type: Application
    Filed: August 2, 2022
    Publication date: February 2, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
  • Publication number: 20230025579
    Abstract: A method may include obtaining sensor data about a total measurable world around an autonomous vehicle. The sensor data may be captured by sensor units co-located with the autonomous vehicle. The method may include generating a mapping dataset including the obtained sensor data and identifying data elements that each represents a point in the mapping dataset. The method may include sorting the data elements according to a structural data categorization that is a template for a high-definition map of the total measurable world and determining a mapping trajectory of the autonomous vehicle. The mapping trajectory may describe a localization and a path of motion of the autonomous vehicle. The method may include generating the high-definition map based on the structural data categorization and relative to the mapping trajectory of the autonomous vehicle, and the high-definition map may be updated based on the path of motion of the autonomous vehicle.
    Type: Application
    Filed: July 26, 2022
    Publication date: January 26, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
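    A rough sketch of the mapping flow in this abstract, assuming a simple grid as the structural data categorization: points from each scan are placed relative to the current pose along the mapping trajectory, and the map grows as the vehicle moves. The grid template and pose handling are assumptions.

    ```python
    from collections import defaultdict

    def sort_into_map_structure(data_elements, cell_size=1.0):
        """Sort points into a grid-like structural categorization acting as a map template."""
        cells = defaultdict(list)
        for point in data_elements:
            key = (int(point["x"] // cell_size), int(point["y"] // cell_size))
            cells[key].append(point)
        return cells

    def update_map(hd_map, data_elements, pose):
        """Insert new points relative to the current mapping-trajectory pose."""
        shifted = [{"x": p["x"] + pose["x"], "y": p["y"] + pose["y"], "kind": p["kind"]}
                   for p in data_elements]
        for key, points in sort_into_map_structure(shifted).items():
            hd_map.setdefault(key, []).extend(points)
        return hd_map

    hd_map = {}
    trajectory = [{"x": 0.0, "y": 0.0}, {"x": 2.0, "y": 0.0}]   # localization along the path
    scan = [{"x": 1.2, "y": 0.4, "kind": "curb"}, {"x": 3.7, "y": -0.2, "kind": "lane_mark"}]
    for pose in trajectory:                                      # map updated as the AV moves
        update_map(hd_map, scan, pose)
    print(sorted(hd_map.keys()))
    ```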
  • Publication number: 20230020776
    Abstract: A method may include obtaining first sensor data from a first sensor system and second sensor data from a second sensor system. The first and the second sensor systems may capture sensor data from a total measurable world. The method may include identifying a first object included in the first sensor data and a second object included in the second sensor data and determining first parameters corresponding to the first object and second parameters corresponding to the second object. The first parameters may be compared with the second parameters and whether the first object and the second object are a same object may be determined based on the comparing the first parameters and the second parameters. Responsive to determining that the first object and the second object are the same object, a set of objects representative of objects in the total measurable world including the same object may be generated.
    Type: Application
    Filed: July 8, 2022
    Publication date: January 19, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
  • Patent number: 11555928
    Abstract: A method may include obtaining sensor data from one or more LiDAR units and determining a point-cloud corresponding to the sensor data obtained from each respective LiDAR unit. The method may include aggregating the point-clouds as an aggregated point-cloud and generating an initial proposal for a two-dimensional ground model made of multiple grid blocks. The method may include filtering out unrelated raw data points from each grid block of the plurality of grid blocks to generate a filtered point-cloud matrix. The method may include identifying one or more surface-points and one or more object-points included in the filtered point-cloud matrix and generating an array of extracted objects based on the object-points.
    Type: Grant
    Filed: June 21, 2022
    Date of Patent: January 17, 2023
    Assignee: CYNGN, INC.
    Inventors: Biao Ma, Lior Tal
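    Illustrative sketch of the grid-block ground separation this abstract describes: per-LiDAR point clouds are aggregated, bucketed into 2D grid blocks, and each block's points are split into surface-points and object-points. The local-minimum ground estimate and the band threshold are simplifying assumptions.

    ```python
    from collections import defaultdict

    def aggregate(point_clouds):
        """Merge per-LiDAR point clouds into one aggregated point cloud."""
        return [p for cloud in point_clouds for p in cloud]

    def grid_blocks(points, block_size=2.0):
        """Initial 2D ground-model proposal: bucket points into grid blocks by (x, y)."""
        blocks = defaultdict(list)
        for x, y, z in points:
            blocks[(int(x // block_size), int(y // block_size))].append((x, y, z))
        return blocks

    def split_surface_and_objects(blocks, ground_band=0.2):
        """Per block: the lowest points approximate the surface, higher points are object-points."""
        surface, objects = [], []
        for pts in blocks.values():
            ground_z = min(z for _, _, z in pts)              # crude local ground estimate
            for p in pts:
                (surface if p[2] - ground_z <= ground_band else objects).append(p)
        return surface, objects

    clouds = [[(0.5, 0.5, 0.02), (0.8, 0.4, 1.6)], [(4.1, 1.0, 0.05)]]
    surface, objects = split_surface_and_objects(grid_blocks(aggregate(clouds)))
    print(len(surface), len(objects))   # -> 2 surface-points, 1 object-point
    ```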
  • Publication number: 20230009736
    Abstract: A method may include obtaining sensor data describing a total measurable world around a motion sensor. The method may include processing the sensor data to generate a pre-compensation scan of the total measurable world around the motion sensor based on the sensor data. The method may include determining a delay between the obtaining the sensor data and the generation of the pre-compensation scan. The method may include obtaining motion data corresponding to motion of the motion sensor and generating a motion model of the motion sensor based on the motion data. The method may include generating an after-compensation scan of the motion sensor using the delay and the motion model to compensate for continued motion during the delay.
    Type: Application
    Filed: July 12, 2022
    Publication date: January 12, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
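    A minimal sketch of delay compensation in the spirit of this abstract: a motion model is built from recent motion data, the delay between capture and the pre-compensation scan is measured, and scan points are shifted to account for motion during that delay. The constant-velocity model and the direction of the shift are assumptions.

    ```python
    def motion_model(motion_samples):
        """Fit a constant-velocity model to recent motion data (vx, vy in m/s)."""
        vx = sum(s["vx"] for s in motion_samples) / len(motion_samples)
        vy = sum(s["vy"] for s in motion_samples) / len(motion_samples)
        return {"vx": vx, "vy": vy}

    def compensate_scan(pre_compensation_scan, delay_s, model):
        """Shift scan points to account for sensor motion during the processing delay."""
        dx, dy = model["vx"] * delay_s, model["vy"] * delay_s
        return [(x - dx, y - dy) for x, y in pre_compensation_scan]

    t_capture, t_scan_ready = 0.000, 0.080                 # measured timestamps (s)
    delay = t_scan_ready - t_capture                       # delay between capture and scan
    model = motion_model([{"vx": 5.0, "vy": 0.0}, {"vx": 5.2, "vy": 0.0}])
    after_compensation = compensate_scan([(10.0, 2.0)], delay, model)
    print(after_compensation)   # points adjusted for motion accumulated over the delay
    ```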
  • Publication number: 20230011829
    Abstract: A method may include obtaining first sensor data captured by a first sensor system and second sensor data captured by a second sensor system of a different type from the first sensor system. The method may include detecting a first object included in the first sensor data and a second object included in the second sensor data. The method may include assigning a first label to the first object and a second label to the second object after comparing the first and the second sensor data. The first and second labels may indicate degrees to which the first and the second objects match. Responsive to the first and second labels indicating that the first and the second objects match, the method may include designating a matched object representative of the first object and the second object and sending the matched object to a downstream computing system of an autonomous vehicle.
    Type: Application
    Filed: July 8, 2022
    Publication date: January 12, 2023
    Applicant: CYNGN, INC.
    Inventors: Biao MA, Lior TAL
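    Illustrative sketch of cross-modal matching with a degree-of-match label, as this abstract describes for detections from two different sensor types: a label scores how strongly the detections match, and a matched object is designated and passed downstream when the score is high enough. The scoring weights and threshold are hypothetical.

    ```python
    import math

    def match_degree(camera_obj, lidar_obj, max_dist=2.0):
        """Assign a label describing how strongly two detections match (0..1)."""
        dist = math.dist(camera_obj["position"], lidar_obj["position"])
        spatial = max(0.0, 1.0 - dist / max_dist)
        class_agreement = 1.0 if camera_obj["label"] == lidar_obj["label"] else 0.0
        return 0.5 * spatial + 0.5 * class_agreement

    def fuse_if_matched(camera_obj, lidar_obj, threshold=0.7):
        """Designate a single matched object when the labels indicate a match."""
        degree = match_degree(camera_obj, lidar_obj)
        if degree < threshold:
            return None
        mid = [(a + b) / 2 for a, b in zip(camera_obj["position"], lidar_obj["position"])]
        return {"position": tuple(mid), "label": camera_obj["label"], "match_degree": degree}

    camera_obj = {"position": (9.8, 1.1), "label": "pedestrian"}
    lidar_obj  = {"position": (10.1, 1.0), "label": "pedestrian"}
    matched = fuse_if_matched(camera_obj, lidar_obj)
    print(matched)   # matched object ready to send to a downstream AV computing system
    ```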
  • Publication number: 20220404503
    Abstract: A method may include obtaining sensor data from one or more LiDAR units and determining a point-cloud corresponding to the sensor data obtained from each respective LiDAR unit. The method may include aggregating the point-clouds as an aggregated point-cloud and generating an initial proposal for a two-dimensional ground model made of multiple grid blocks. The method may include filtering out unrelated raw data points from each grid block of the plurality of grid blocks to generate a filtered point-cloud matrix. The method may include identifying one or more surface-points and one or more object-points included in the filtered point-cloud matrix and generating an array of extracted objects based on the object-points.
    Type: Application
    Filed: June 21, 2022
    Publication date: December 22, 2022
    Applicant: CYNGN, INC.
    Inventors: Biao Ma, Lior Tal
  • Publication number: 20220406014
    Abstract: A method may include obtaining sensor data from one or more LiDAR units and determining a point-cloud corresponding to the sensor data obtained from each respective LiDAR unit. The method may include aggregating the point-clouds as an aggregated point-cloud. A number of data points included in the aggregated point-cloud may be decreased by filtering out one or more of the data points according to one or more heuristic rules to generate a reduced point-cloud. The method may include determining an operational granularity level for the reduced point-cloud. An array of existence-based objects may be generated based on the reduced point-cloud and the operational granularity level.
    Type: Application
    Filed: June 21, 2022
    Publication date: December 22, 2022
    Applicant: CYNGN, INC.
    Inventors: Biao Ma, Lior Tal
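    A small sketch of the point-cloud reduction this abstract describes: heuristic rules drop points unlikely to matter, then a chosen operational granularity level controls how aggressively the remainder is thinned. The specific heuristics (range and height limits) and the voxel-style thinning are assumptions.

    ```python
    def heuristic_filter(points, max_range=50.0, min_height=-0.1, max_height=3.0):
        """Drop points outside hypothetical range/height limits to reduce the cloud."""
        return [(x, y, z) for x, y, z in points
                if x * x + y * y <= max_range ** 2 and min_height <= z <= max_height]

    def downsample(points, granularity_m):
        """Keep one point per cell at the chosen operational granularity level."""
        seen, reduced = set(), []
        for x, y, z in points:
            key = (round(x / granularity_m), round(y / granularity_m), round(z / granularity_m))
            if key not in seen:
                seen.add(key)
                reduced.append((x, y, z))
        return reduced

    aggregated = [(1.00, 2.00, 0.50), (1.10, 2.05, 0.55), (80.0, 0.0, 1.0), (5.0, 5.0, -2.0)]
    reduced = downsample(heuristic_filter(aggregated), granularity_m=0.5)
    print(len(reduced))   # -> 1; near-duplicates, far points, and underground points removed
    ```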
  • Publication number: 20220404478
    Abstract: A method may include determining an alignment time based on a zero-crossing point corresponding to a LiDAR sensor and a horizontal field of view corresponding to an image-capturing sensor. The method may include determining a delay timing for initiating image capturing by the image-capturing sensor in which the delay timing is based on at least one of: the alignment time, a packet capture timing corresponding to the LiDAR sensor, and an average frame exposure duration corresponding to the image-capturing sensor. The method may include initiating data capture by the LiDAR sensor, and after the initiating of data capture by the LiDAR sensor and after the delay timing has elapsed, initiating data capture by the image-capturing sensor.
    Type: Application
    Filed: June 21, 2022
    Publication date: December 22, 2022
    Applicant: CYNGN, INC.
    Inventors: Biao Ma, Lior Tal
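    A worked example of the kind of timing calculation this abstract covers, under simplifying assumptions: the alignment time is the sweep time from the LiDAR's zero-crossing to the center of the camera's horizontal field of view, and the camera trigger delay combines that with the packet capture timing and half the average frame exposure. The exact formula is not given in the abstract, so this arithmetic is illustrative only.

    ```python
    def alignment_time_s(zero_crossing_deg, camera_center_deg, scan_period_s):
        """Time for the spinning LiDAR to sweep from its zero-crossing to the camera FOV center."""
        sweep_deg = (camera_center_deg - zero_crossing_deg) % 360.0
        return scan_period_s * sweep_deg / 360.0

    def camera_delay_s(align_s, packet_capture_s, frame_exposure_s):
        """Delay before triggering the camera so mid-exposure coincides with the LiDAR sweep."""
        return align_s + packet_capture_s - frame_exposure_s / 2.0

    # Hypothetical numbers: 10 Hz LiDAR, camera centered at 90 deg, 30 ms exposure.
    delay = camera_delay_s(
        alignment_time_s(zero_crossing_deg=0.0, camera_center_deg=90.0, scan_period_s=0.1),
        packet_capture_s=0.002,
        frame_exposure_s=0.030,
    )
    print(f"initiate LiDAR capture at t=0, trigger camera after {delay * 1000:.1f} ms")
    ```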
  • Patent number: 11372115
    Abstract: The localization of a vehicle is determined using less expensive and computationally robust equipment compared to conventional methods. Localization is determined by estimating the position of a vehicle relative to a map of the environment, and the process thereof includes using a map of the surrounding environment of the vehicle, a model of the motion of the frame of reference of the vehicle (e.g., ego-motion), sensor data from the surrounding environment, and a process to match sensory data to the map. Localization also includes a process to estimate the position based on the sensor data, the motion of the frame of reference of the vehicle, and/or the map. Such methods and systems enable the use of less expensive components while achieving useful results for a variety of applications, such as autonomous vehicles.
    Type: Grant
    Filed: May 24, 2019
    Date of Patent: June 28, 2022
    Assignee: CYNGN, INC.
    Inventors: I-Chung Joseph Lin, Elena Ramona Stefanescu, Dhivya Sukumar, Sumit Saxena
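    A minimal sketch of the predict-then-match localization loop this abstract outlines: an ego-motion model advances the pose estimate, and matching sensed landmarks against the map corrects it. The constant-velocity motion model and the averaged-offset matching step are simplifications, not the patented process.

    ```python
    import math

    def predict_pose(pose, ego_motion, dt):
        """Advance the pose with the vehicle's ego-motion model (speed and yaw rate)."""
        x, y, heading = pose
        heading += ego_motion["yaw_rate"] * dt
        x += ego_motion["speed"] * dt * math.cos(heading)
        y += ego_motion["speed"] * dt * math.sin(heading)
        return (x, y, heading)

    def match_to_map(predicted, sensed_landmarks, map_landmarks, weight=0.5):
        """Nudge the predicted pose toward the offset that aligns sensed landmarks with the map."""
        dx = sum(m[0] - s[0] for s, m in zip(sensed_landmarks, map_landmarks)) / len(map_landmarks)
        dy = sum(m[1] - s[1] for s, m in zip(sensed_landmarks, map_landmarks)) / len(map_landmarks)
        x, y, heading = predicted
        return (x + weight * dx, y + weight * dy, heading)

    pose = (0.0, 0.0, 0.0)
    pose = predict_pose(pose, {"speed": 5.0, "yaw_rate": 0.0}, dt=0.1)     # ego-motion step
    pose = match_to_map(pose, sensed_landmarks=[(9.4, 0.0)], map_landmarks=[(9.5, 0.0)])
    print(pose)   # estimated position relative to the map
    ```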
  • Publication number: 20220063515
    Abstract: Vehicle sensor systems include modular sensor kits having one or more pods (e.g., sensor roof pods) and/or one or more bumpers (e.g., sensor bumpers). The sensor roof pods are configured to couple to a vehicle. A sensor roof pod may be positioned atop a vehicle proximate a front of the vehicle, proximate a back of the vehicle, or at any position along a top side of the vehicle being coupled, for example, using a mounting shim or a tripod. The sensor roof pods can include sensors (e.g., LIDAR sensors, cameras, ultrasonic sensors, etc.), processing units, control systems (e.g., temperature and/or environmental control systems), and communication devices (e.g., networking and/or wireless devices).
    Type: Application
    Filed: November 10, 2021
    Publication date: March 3, 2022
    Applicant: CYNGN, INC.
    Inventors: Ain MCKENDRICK, Michael W. LOWE, Andrea MARIOTTI, Pranav BAJORIA, Akash JOSHI
  • Publication number: 20220057521
    Abstract: Virtual bumpers for autonomous vehicles improve effectiveness and safety as such vehicles are operated. One or more sensor systems having a Lidar sensor and a camera sensor determine proximity of objects around the vehicle and facilitate identification of the environment around the vehicle. The sensor systems are placed at various locations around the vehicle. The vehicle identifies an object and one or more properties of the identified object using the sensor systems. Based on the identified object and the properties of the object, a virtual bumper may be created for that object. For example, if the object is identified as another vehicle moving with a certain velocity, the vehicle may determine a minimum space to avoid the other vehicle, either by changing direction or reducing the velocity of the vehicle, with the minimum space constituting a virtual bumper.
    Type: Application
    Filed: November 3, 2021
    Publication date: February 24, 2022
    Applicant: CYNGN, INC.
    Inventors: Michael LOWE, Ain MCKENDRICK, Andrea MARIOTTI, Pranav BAJORIA, Biao MA
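    A worked example of the virtual-bumper idea in this abstract, using a standard stopping-distance argument rather than any formula from the publication: the minimum space around an identified object grows with the closing speed, and the vehicle slows or changes direction when the object intrudes on that space. The reaction time, deceleration, and margin values are assumptions.

    ```python
    def virtual_bumper_m(own_speed, object_speed, reaction_time_s=0.5, max_decel=3.0, margin=1.0):
        """Minimum space to keep around an identified object, based on its properties.

        Distance covered during the reaction time plus braking distance for the
        closing speed, plus a fixed safety margin.
        """
        closing_speed = max(0.0, own_speed - object_speed)
        braking = closing_speed ** 2 / (2.0 * max_decel)
        return margin + closing_speed * reaction_time_s + braking

    def control_action(distance_to_object, bumper_m):
        """Slow down or steer away whenever the object intrudes on its virtual bumper."""
        return "reduce_speed_or_change_direction" if distance_to_object < bumper_m else "maintain"

    bumper = virtual_bumper_m(own_speed=6.0, object_speed=2.0)   # another vehicle moving ahead
    print(round(bumper, 2), control_action(distance_to_object=4.0, bumper_m=bumper))
    ```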
  • Patent number: 11186234
    Abstract: Vehicle sensor systems include modular sensor kits having one or more pods (e.g., sensor roof pods) and/or one or more bumpers (e.g., sensor bumpers). The sensor roof pods are configured to couple to a vehicle. A sensor roof pod may be positioned atop a vehicle proximate a front of the vehicle, proximate a back of the vehicle, or at any position along a top side of the vehicle being coupled, for example, using a mounting shim or a tripod. The sensor roof pods can include sensors (e.g., LIDAR sensors, cameras, ultrasonic sensors, etc.), processing units, control systems (e.g., temperature and/or environmental control systems), and communication devices (e.g., networking and/or wireless devices).
    Type: Grant
    Filed: October 15, 2019
    Date of Patent: November 30, 2021
    Assignee: CYNGN, INC.
    Inventors: Ain McKendrick, Michael W. Lowe, Andrea Mariotti, Pranav Bajoria, Akash Joshi