Patents by Inventor David NEUHOF

David NEUHOF has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12112497
    Abstract: A navigation system for a host vehicle may include a processor programmed to: receive from a camera onboard the host vehicle at least one captured image representative of an environment of the host vehicle, wherein the camera is positioned at a first location relative to the host vehicle; receive point cloud information from a LIDAR system onboard the host vehicle, wherein the LIDAR system is positioned at a second location relative to the host vehicle; analyze the at least one captured image and the received point cloud information to detect one or more objects in a field of view region shared by the camera and the LIDAR system; and determine whether a vantage point difference between the first location of the camera and the second location of the LIDAR system accounts for the one or more detected objects being represented in only one of the at least one captured image or the received point cloud information. (An illustrative code sketch of this vantage point check follows this entry.)
    Type: Grant
    Filed: June 29, 2022
    Date of Patent: October 8, 2024
    Assignee: MOBILEYE VISION TECHNOLOGIES LTD.
    Inventors: Chaim Blau, Ofer Springer, Kevin Rosenblum, Alon Ziv, Erez Dagan, David Boublil, Nadav Shaag, David Neuhof, Jeffrey Moskowitz, Gal Topel, Yotam Stern
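    A minimal sketch of the vantage point check described in the abstract above, for illustration only and not the patented implementation. The data structures and names (DetectedObject, _occludes, vantage_point_explains_miss) are assumptions; the idea shown is simply that an object seen by the camera but missing from the LIDAR point cloud may be explained by an occluder that blocks the line of sight from the LIDAR's offset mounting position.

      # Illustrative only: a geometric test of whether the vantage point difference
      # between camera and LIDAR mounting positions explains an object appearing
      # in only one sensor's data. Names and thresholds are assumptions.
      from dataclasses import dataclass
      import math

      @dataclass
      class DetectedObject:
          x: float        # position in a common vehicle frame, meters
          y: float
          radius: float   # coarse footprint radius, meters

      def _occludes(sensor_xy, target, blocker) -> bool:
          """True if `blocker` sits between the sensor and `target` and lies close
          enough to the line of sight to hide the target from that vantage point."""
          sx, sy = sensor_xy
          tx, ty = target.x - sx, target.y - sy
          bx, by = blocker.x - sx, blocker.y - sy
          t_dist = math.hypot(tx, ty)
          b_dist = math.hypot(bx, by)
          if b_dist >= t_dist or (tx * bx + ty * by) <= 0:
              return False  # blocker is farther away, or behind the sensor
          # Perpendicular distance of the blocker from the sensor-to-target ray.
          off_ray = abs(tx * by - ty * bx) / max(t_dist, 1e-6)
          return off_ray <= blocker.radius

      def vantage_point_explains_miss(lidar_xy, camera_only_object, lidar_objects):
          """True if some LIDAR-detected object occludes `camera_only_object` from
          the LIDAR's mounting position, so the discrepancy between image and point
          cloud is accounted for by the vantage point difference."""
          return any(_occludes(lidar_xy, camera_only_object, blocker)
                     for blocker in lidar_objects)

      # Example: the camera sees a pedestrian that the LIDAR misses because a
      # parked van blocks the LIDAR's (offset) line of sight.
      pedestrian = DetectedObject(x=10.0, y=2.0, radius=0.3)
      van = DetectedObject(x=6.0, y=1.5, radius=1.2)
      print(vantage_point_explains_miss((0.0, 0.5), pedestrian, [van]))  # True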
  • Patent number: 11734848
    Abstract: A navigation system for a host vehicle may include a processor programmed to: receive from a center camera onboard the host vehicle a captured center image including a representation of at least a portion of an environment of the host vehicle, receive from a left surround camera onboard the host vehicle a captured left surround image including a representation of at least a portion of the environment of the host vehicle, and receive from a right surround camera onboard the host vehicle a captured right surround image including a representation of at least a portion of the environment of the host vehicle; provide the center image, the left surround image, and the right surround image to an analysis module configured to generate an output relative to the captured center image; and cause a navigational action by the host vehicle based on the generated output. (An illustrative code sketch of this pipeline follows this entry.)
    Type: Grant
    Filed: June 29, 2022
    Date of Patent: August 22, 2023
    Assignee: MOBILEYE VISION TECHNOLOGIES LTD.
    Inventors: Ofer Springer, David Neuhof, Jeffrey Moskowitz, Gal Topel, Nadav Shaag, Yotam Stern, Roy Lotan, Shahar Harouche, Daniel Einy
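    A minimal sketch of the three-camera pipeline described in the abstract above. It is not Mobileye's implementation; the analysis module is left abstract as a callable, and the frame keys and action callback are illustrative assumptions. The point shown is that all three views feed one module whose output is expressed relative to the center image and then drives a navigational action.

      # Illustrative only: center, left-surround, and right-surround frames are
      # passed to a single analysis module; its output, expressed in the center
      # camera's image coordinates, is used to trigger a navigational action.
      from typing import Any, Callable, Dict
      import numpy as np

      def navigate_from_surround_cameras(
          center_img: np.ndarray,
          left_img: np.ndarray,
          right_img: np.ndarray,
          analysis_module: Callable[[Dict[str, np.ndarray]], Dict[str, Any]],
          act: Callable[[Dict[str, Any]], None],
      ) -> None:
          frames = {
              "center": center_img,
              "left_surround": left_img,
              "right_surround": right_img,
          }
          # The module may use all three views, but reports detections, free space,
          # etc. relative to the center image (the "output relative to the center image").
          output = analysis_module(frames)
          act(output)  # e.g. brake, steer, or maintain course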
  • Publication number: 20220333927
    Abstract: A navigation system for a host vehicle may include a processor programmed to determine at least one indicator of ego motion of the host vehicle. The processor may also be programmed to receive, from a LIDAR system, a first point cloud including a first representation of at least a portion of an object and a second point cloud including a second representation of the at least a portion of the object. The processor may further be programmed to determine a velocity of the object based on the at least one indicator of ego motion of the host vehicle, and based on a comparison of the first point cloud, including the first representation of the at least a portion of the object, and the second point cloud, including the second representation of the at least a portion of the object. (An illustrative code sketch of this velocity estimate follows this entry.)
    Type: Application
    Filed: June 29, 2022
    Publication date: October 20, 2022
    Inventors: Kevin ROSENBLUM, Erez DAGAN, David BOUBLIL, Nadav SHAAG, David NEUHOF, Jeffrey MOSKOWITZ, Gal TOPEL
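    A minimal sketch of the velocity estimate described in the abstract above, under simplifying assumptions that are not taken from the application: a rigid object, centroid-based matching of the two representations, and a 4x4 ego-motion transform between the two LIDAR capture times.

      # Illustrative only: estimate an object's velocity from two LIDAR point
      # clouds plus the host vehicle's ego motion. The ego-motion format and
      # centroid matching are simplifying assumptions.
      import numpy as np

      def object_velocity(points_t0: np.ndarray,      # (N, 3) object points at time t0, vehicle frame
                          points_t1: np.ndarray,      # (M, 3) object points at time t1, vehicle frame
                          ego_pose_delta: np.ndarray, # 4x4 transform of the t0 vehicle frame into the t1 frame
                          dt: float) -> np.ndarray:
          # Express the earlier observation in the later vehicle frame so that the
          # host vehicle's own motion does not masquerade as object motion.
          ones = np.ones((points_t0.shape[0], 1))
          points_t0_in_t1 = (ego_pose_delta @ np.hstack([points_t0, ones]).T).T[:, :3]

          # Compare the two representations of the object; here simply via centroids.
          displacement = points_t1.mean(axis=0) - points_t0_in_t1.mean(axis=0)
          return displacement / dt   # meters per second, expressed in the t1 vehicle frame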
  • Publication number: 20220333932
    Abstract: A navigation system for a host vehicle may include a processor programmed to: receive, from an entity remotely located relative to the host vehicle, a sparse map associated with at least one road segment to be traversed by the host vehicle; receive point cloud information from a LIDAR system onboard the host vehicle, the point cloud information being representative of distances to various objects in an environment of the host vehicle; compare the received point cloud information with at least one of a plurality of mapped navigational landmarks included in the sparse map to provide a LIDAR-based localization of the host vehicle relative to at least one target trajectory; determine at least one navigational action for the host vehicle based on the LIDAR-based localization of the host vehicle relative to the at least one target trajectory; and cause the at least one navigational action to be taken by the host vehicle. (An illustrative code sketch of this localization step follows this entry.)
    Type: Application
    Filed: June 29, 2022
    Publication date: October 20, 2022
    Inventors: Kevin ROSENBLUM, Alon ZIV, Erez DAGAN, David BOUBLIL, Nadav SHAAG, David NEUHOF, Jeffrey MOSKOWITZ, Gal TOPEL
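    A minimal sketch of the localization-and-action loop described in the abstract above. The sparse map is reduced to an array of landmark positions, matching is done by nearest landmark, and the navigational action is a proportional steering correction toward the target trajectory; all of these are simplifying assumptions rather than the claimed method.

      # Illustrative only: localize the host vehicle by comparing LIDAR returns to
      # mapped landmarks from a sparse map, then derive a navigational action from
      # the lateral offset to a target trajectory. Data layouts are assumptions.
      import numpy as np

      def localize_and_act(point_cloud: np.ndarray,       # (N, 3) LIDAR returns, vehicle frame
                           landmarks: np.ndarray,         # (K, 3) mapped landmark positions, map frame
                           prior_pose: np.ndarray,        # 4x4 prior vehicle pose in the map frame
                           target_trajectory: np.ndarray  # (T, 2) x, y waypoints in the map frame
                           ) -> float:
          # Project LIDAR returns into the map frame using the prior pose estimate.
          ones = np.ones((point_cloud.shape[0], 1))
          pts_map = (prior_pose @ np.hstack([point_cloud, ones]).T).T[:, :3]

          # Compare measured points to mapped landmarks; the mean residual to the
          # nearest landmark acts as a coarse correction of the vehicle position.
          nearest_idx = np.argmin(
              np.linalg.norm(pts_map[:, None, :] - landmarks[None, :, :], axis=2), axis=1)
          correction = (landmarks[nearest_idx] - pts_map).mean(axis=0)
          position = prior_pose[:3, 3] + correction       # LIDAR-based localization

          # Signed lateral offset from the target trajectory drives the action.
          dists = np.linalg.norm(target_trajectory - position[:2], axis=1)
          i = int(np.argmin(dists))
          j = min(i + 1, len(target_trajectory) - 1)
          direction = target_trajectory[j] - target_trajectory[max(j - 1, 0)]
          direction = direction / (np.linalg.norm(direction) + 1e-9)
          offset = position[:2] - target_trajectory[i]
          lateral = direction[0] * offset[1] - direction[1] * offset[0]
          return -0.5 * lateral   # illustrative proportional steering correction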
  • Publication number: 20220324437
    Abstract: A navigation system for a host vehicle may include a processor programmed to: receive from a camera onboard the host vehicle at least one captured image representative of an environment of the host vehicle, wherein the camera is positioned at a first location relative to the host vehicle; receive point cloud information from a LIDAR system onboard the host vehicle, wherein the LIDAR system is positioned at a second location relative to the host vehicle; analyze the at least one captured image and the received point cloud information to detect one or more objects in a field of view region shared by the camera and the LIDAR system; and determine whether a vantage point difference between the first location of the camera and the second location of the LIDAR system accounts for the one or more detected objects being represented in only one of the at least one captured image or the received point cloud information.
    Type: Application
    Filed: June 29, 2022
    Publication date: October 13, 2022
    Inventors: Chaim BLAU, Ofer SPRINGER, Kevin ROSENBLUM, Alon ZIV, Erez DAGAN, David BOUBLIL, Nadav SHAAG, David NEUHOF, Jeffrey MOSKOWITZ, Gal TOPEL, Yotam STERN
  • Publication number: 20220327719
    Abstract: A navigation system for a host vehicle may include a processor programmed to: receive from a center camera onboard the host vehicle a captured center image including a representation of at least a portion of an environment of the host vehicle, receive from a left surround camera onboard the host vehicle a captured left surround image including a representation of at least a portion of the environment of the host vehicle, and receive from a right surround camera onboard the host vehicle a captured right surround image including a representation of at least a portion of the environment of the host vehicle; provide the center image, the left surround image, and the right surround image to an analysis module configured to generate an output relative to the captured center image; and cause a navigational action by the host vehicle based on the generated output.
    Type: Application
    Filed: June 29, 2022
    Publication date: October 13, 2022
    Inventors: Nadav SHAAG, David NEUHOF, Jeffrey MOSKOWITZ, Gal TOPEL, Ofer SPRINGER, Yotam STERN