Patents by Inventor Hasan Tafish

Hasan Tafish has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230413798
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may receive captured sensor data of an agricultural environment by one or more image capture devices, generate a first image from the captured sensor data, detect a first real-world agricultural object from the first image, determine a first pose of the first real-world agricultural object, identify the detected first real-world agricultural object as a new agricultural object or a preidentified agricultural object, and determine a treatment policy associated with the first real-world agricultural object.
    Type: Application
    Filed: September 11, 2023
    Publication date: December 28, 2023
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
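    A minimal sketch of the observe, identify, and treat flow summarized in this entry's abstract is given below. Every name, the simplified 2-D pose, the pose-distance matching tolerance, and the spray/skip policy are illustrative assumptions, not the patented implementation.
    ```python
    # Hypothetical sketch of the observe -> identify -> treat flow in the abstract
    # above; names, the 2-D pose, and the 5 cm match tolerance are illustrative.
    from dataclasses import dataclass, field
    from math import dist

    @dataclass
    class DetectedObject:
        object_id: str
        pose_xy: tuple[float, float]      # simplified 2-D pose of the detection

    @dataclass
    class ObjectRegistry:
        known: dict[str, tuple[float, float]] = field(default_factory=dict)

        def identify(self, det: DetectedObject, tol: float = 0.05) -> tuple[str, bool]:
            # Match against previously identified objects by pose proximity.
            for known_id, pose in self.known.items():
                if dist(pose, det.pose_xy) <= tol:
                    return known_id, False            # pre-identified object
            self.known[det.object_id] = det.pose_xy
            return det.object_id, True                # new object

    def treatment_policy(is_new: bool) -> str:
        # Placeholder policy: treat newly observed objects, skip known ones.
        return "spray" if is_new else "skip"

    if __name__ == "__main__":
        registry = ObjectRegistry()
        obj_id, is_new = registry.identify(DetectedObject("weed-001", (1.20, 3.40)))
        print(obj_id, treatment_policy(is_new))
    ```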
  • Publication number: 20230360392
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Application
    Filed: July 3, 2023
    Publication date: November 9, 2023
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
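    As a rough illustration of the geo-spatial lookup this entry's abstract describes, the sketch below geotags a captured frame's location, retrieves previously mapped and indexed frames recorded nearby, and uses their labels as a stand-in for the image-comparison step. The local metric coordinates, the 2 m radius, and the label-based matching are assumptions, not the patented method.
    ```python
    # Minimal sketch of the geo-spatial lookup described in the abstract above:
    # geotag a captured frame, retrieve previously indexed frames taken nearby,
    # and compare them to pick a treatment target. The local metric coordinates,
    # the 2 m radius, and the label-based "comparison" are assumptions.
    from dataclasses import dataclass
    from math import hypot
    from typing import Optional

    @dataclass
    class IndexedImage:
        location: tuple[float, float]   # local east/north coordinates in metres
        label: str                      # e.g. "weed" or "crop" from prior mapping

    def nearby_images(index: list[IndexedImage],
                      here: tuple[float, float],
                      radius_m: float = 2.0) -> list[IndexedImage]:
        # Indexed images whose recorded geo-spatial location is proximate to
        # the treatment system's current location.
        return [img for img in index
                if hypot(img.location[0] - here[0], img.location[1] - here[1]) <= radius_m]

    def choose_target(candidates: list[IndexedImage]) -> Optional[IndexedImage]:
        # Stand-in for the image-comparison step: treat the first mapped weed.
        return next((img for img in candidates if img.label == "weed"), None)

    if __name__ == "__main__":
        index = [IndexedImage((0.5, 1.0), "crop"), IndexedImage((1.2, 0.8), "weed")]
        target = choose_target(nearby_images(index, here=(1.0, 1.0)))
        if target is not None:
            print(f"emit fluid projectile toward {target.location}")
    ```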
  • Patent number: 11751558
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Grant
    Filed: August 2, 2021
    Date of Patent: September 12, 2023
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Patent number: 11694434
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Grant
    Filed: October 16, 2020
    Date of Patent: July 4, 2023
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Publication number: 20220121847
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Application
    Filed: October 16, 2020
    Publication date: April 21, 2022
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Publication number: 20220117213
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Application
    Filed: August 2, 2021
    Publication date: April 21, 2022
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Publication number: 20220117216
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Application
    Filed: November 20, 2020
    Publication date: April 21, 2022
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Publication number: 20220121848
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Application
    Filed: November 20, 2020
    Publication date: April 21, 2022
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Patent number: 11181929
    Abstract: Systems and methods for shared autonomy through cooperative sensing are described. According to one embodiment, a cooperative sensing system includes a rendezvous module that receives broadcast messages from a plurality of cooperating vehicles on the roadway. The rendezvous module also selects a subordinate vehicle from the plurality of cooperating vehicles based on the autonomy level of the subordinate vehicle as compared to an autonomy level of a principal vehicle. The cooperative sensing system also includes a positioning module that determines a cooperative position of the principal vehicle and the subordinate vehicle. The cooperative sensing system further includes a negotiation module that receives at least one cooperating parameter from the subordinate vehicle.
    Type: Grant
    Filed: July 31, 2018
    Date of Patent: November 23, 2021
    Assignee: HONDA MOTOR CO., LTD.
    Inventors: Paritosh Kelkar, Xue Bai, Samer Rajab, Hasan Tafish
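    The rendezvous step described in this entry's abstract can be pictured with the sketch below, which selects a subordinate vehicle from broadcast messages whose reported autonomy level is below the principal vehicle's. The message fields and the "most capable candidate" selection rule are assumptions rather than the patented design.
    ```python
    # Illustrative sketch of the rendezvous step described in the abstract above:
    # a subordinate vehicle is selected from broadcast messages whose reported
    # autonomy level is below the principal vehicle's. The message fields and
    # the "most capable candidate" rule are assumptions, not the patented design.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BroadcastMessage:
        vehicle_id: str
        autonomy_level: int             # e.g. SAE-style level 0-5
        position: tuple[float, float]

    def select_subordinate(principal_level: int,
                           messages: list[BroadcastMessage]) -> Optional[BroadcastMessage]:
        # Candidates are cooperating vehicles with a lower autonomy level than
        # the principal vehicle; pick the most capable of them.
        candidates = [m for m in messages if m.autonomy_level < principal_level]
        return max(candidates, key=lambda m: m.autonomy_level, default=None)

    if __name__ == "__main__":
        msgs = [BroadcastMessage("veh-A", 2, (10.0, 0.0)),
                BroadcastMessage("veh-B", 3, (12.0, 1.0))]
        subordinate = select_subordinate(principal_level=4, messages=msgs)
        print(subordinate.vehicle_id if subordinate else "no subordinate found")
    ```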
  • Patent number: 11163317
    Abstract: Systems and methods for shared autonomy through cooperative sensing are described. According to one embodiment, a cooperative sensing system includes a rendezvous module that receives broadcast messages from a plurality of cooperating vehicles on the roadway. The rendezvous module also selects a subordinate vehicle from the plurality of cooperating vehicles based on the autonomy level of the subordinate vehicle as compared to an autonomy level of a principal vehicle. The cooperative sensing system also includes a positioning module that determines a cooperative position of the principal vehicle and the subordinate vehicle. The cooperative sensing system further includes a negotiation module that receives at least one cooperating parameter from the subordinate vehicle.
    Type: Grant
    Filed: May 17, 2019
    Date of Patent: November 2, 2021
    Assignee: HONDA MOTOR CO., LTD.
    Inventors: Paritosh Kelkar, Xue Bai, Samer Rajab, Hasan Tafish
  • Publication number: 20210239853
    Abstract: A computer-implemented method and system for vehicle path estimation using a vehicular communication network. The method includes receiving a first set of position measurements of a first remote vehicle and a second set of position measurements of the first remote vehicle from messages transmitted using the vehicular communication network. The method includes determining a path shape of an initial path estimate of the first remote vehicle. The initial path estimate is based on the first set of position measurements. The method includes determining a corrected vehicle path estimate of the first remote vehicle by fitting the second set of position measurements to the path shape of the initial path estimate.
    Type: Application
    Filed: March 26, 2021
    Publication date: August 5, 2021
    Inventors: Samir K. Al-Stouhi, Paritosh Kelkar, Hasan Tafish
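    One hedged reading of the two-stage estimate in this entry's abstract is sketched below: a low-order polynomial path shape is fitted to the first set of position measurements, and only the heading and offset terms are refitted against the second set while the initial curvature is kept. The polynomial model and the choice of which terms to refit are assumptions, not the claimed method.
    ```python
    # Rough sketch of the two-stage path estimate described in the abstract above:
    # fit a path shape (here a 2nd-order polynomial) to the first set of position
    # measurements, then refit only the slope/offset terms against the second set
    # while keeping the curvature of the initial shape. The model is an assumption.
    import numpy as np

    def initial_path_shape(xs, ys, order: int = 2) -> np.ndarray:
        # Coefficients of y = c0*x^2 + c1*x + c2 from the first measurement set.
        return np.polyfit(xs, ys, order)

    def corrected_path(shape: np.ndarray, xs2, ys2) -> np.ndarray:
        # Keep the fitted curvature (c0) and re-estimate slope and offset so the
        # path passes through the second set of measurements.
        curvature = shape[0]
        residual = np.asarray(ys2) - curvature * np.asarray(xs2) ** 2
        slope, offset = np.polyfit(xs2, residual, 1)
        return np.array([curvature, slope, offset])

    if __name__ == "__main__":
        xs1, ys1 = [0, 1, 2, 3], [0.0, 0.1, 0.4, 0.9]       # first set (coarse)
        xs2, ys2 = [0, 1, 2, 3], [0.2, 0.3, 0.6, 1.1]       # second set (refined)
        shape = initial_path_shape(xs1, ys1)
        print(np.poly1d(corrected_path(shape, xs2, ys2)))
    ```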
  • Patent number: 11076589
    Abstract: Various embodiments of an apparatus, methods, systems and computer program products described herein are directed to an agricultural observation and treatment system and method of operation. The agricultural treatment system may determine a first real-world geo-spatial location of the treatment system. The system can receive captured images depicting real-world agricultural objects of a geographic scene. The system can associate captured images with the determined geo-spatial location of the treatment system. The treatment system can identify, from a group of mapped and indexed images, images having a second real-world geo-spatial location that is proximate to the first real-world geo-spatial location. The treatment system can compare at least a portion of the identified images with at least a portion of the captured images. The treatment system can determine a target object and emit a fluid projectile at the target object using a treatment device.
    Type: Grant
    Filed: October 16, 2020
    Date of Patent: August 3, 2021
    Assignee: Verdant Robotics, Inc.
    Inventors: Gabriel Thurston Sibley, Lorenzo Ibarria, Curtis Dale Garner, Patrick Christopher Leger, Andre Robert Daniel Michelin, John Phillip Hurliman, II, Wisit Jirattigalochote, Hasan Tafish
  • Patent number: 10757485
    Abstract: A method for controlling vehicle sensor data acquisition using a vehicular communication network includes receiving a global time signal at the first vehicle and at the second vehicle. The first vehicle synchronizes a local clock signal of the first vehicle with the global time signal, and the second vehicle synchronizes a local clock signal of the second vehicle with the global time signal. Further, the method includes determining a capture interval that minimizes a time between actuation of a sensor of the first vehicle and actuation of a sensor of the second vehicle. The method includes actuating, according to the capture interval, the sensor of the first vehicle and the sensor of the second vehicle. The first vehicle transmits its sensor data to the second vehicle, and the second vehicle transmits its sensor data to the first vehicle.
    Type: Grant
    Filed: August 25, 2017
    Date of Patent: August 25, 2020
    Assignee: Honda Motor Co., Ltd.
    Inventors: Samir K. Al-Stouhi, Paritosh Kelkar, Hasan Tafish
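    A hedged sketch of the shared capture interval described in this entry's abstract: once both vehicles follow the same global time base, an interval that is a common multiple of the two sensors' native periods lets their actuations coincide, minimizing the time between them. The least-common-multiple rule and millisecond units are assumptions, not the claimed procedure.
    ```python
    # Hedged sketch of the capture-interval idea described in the abstract above:
    # with a shared global time base, an interval that is a common multiple of the
    # two sensors' native periods lets their actuations line up, minimising the
    # gap between them. The LCM rule and millisecond units are assumptions.
    from math import lcm

    def shared_capture_interval_ms(period_a_ms: int, period_b_ms: int) -> int:
        # Smallest interval at which both sensors can be actuated together.
        return lcm(period_a_ms, period_b_ms)

    def next_actuation_ms(global_time_ms: int, interval_ms: int) -> int:
        # Next global-time tick, aligned to the shared interval, at which
        # both vehicles trigger their sensors.
        return ((global_time_ms // interval_ms) + 1) * interval_ms

    if __name__ == "__main__":
        interval = shared_capture_interval_ms(33, 50)   # e.g. ~30 Hz camera, 20 Hz radar
        print(interval, next_actuation_ms(1_000_000, interval))
    ```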
  • Publication number: 20200042013
    Abstract: Systems and methods for shared autonomy through cooperative sensing are described. According to one embodiment, a cooperative sensing system includes a rendezvous module that receives broadcast messages from a plurality of cooperating vehicles on the roadway. The rendezvous module also selects a subordinate vehicle from the plurality of cooperating vehicles based on the autonomy level of the subordinate vehicle as compared to an autonomy level of a principal vehicle. The cooperative sensing system also includes a positioning module that determines a cooperative position of the principal vehicle and the subordinate vehicle. The cooperative sensing system further includes a negotiation module that receives at least one cooperating parameter from the subordinate vehicle.
    Type: Application
    Filed: July 31, 2018
    Publication date: February 6, 2020
    Inventors: Paritosh Kelkar, Xue Bai, Samer Rajab, Hasan Tafish
  • Publication number: 20200042017
    Abstract: Systems and methods for shared autonomy through cooperative sensing are described. According to one embodiment, a cooperative sensing system includes a rendezvous module that receives broadcast messages from a plurality of cooperating vehicles on the roadway. The rendezvous module also selects a subordinate vehicle from the plurality of cooperating vehicles based on the autonomy level of the subordinate vehicle as compared to an autonomy level of a principal vehicle. The cooperative sensing system also includes a positioning module that determines a cooperative position of the principal vehicle and the subordinate vehicle. The cooperative sensing system further includes a negotiation module that receives at least one cooperating parameter from the subordinate vehicle.
    Type: Application
    Filed: May 17, 2019
    Publication date: February 6, 2020
    Inventors: Paritosh Kelkar, Xue Bai, Samer Rajab, Hasan Tafish
  • Patent number: 10338196
    Abstract: A method and system for controlling sensor data acquisition using a vehicular communication network is provided. An example method includes establishing an operable connection between a first vehicle and remote vehicles. The first vehicle and the remote vehicles operate based upon a common time base according to a global time signal. The method includes receiving capability data that includes a sensor actuation time slot of each of the remote vehicles indicating a time slot at which the sensors of each of the remote vehicles are actuating. The sensor actuation time slot of each of the remote vehicles is different. The method also includes dividing a clock cycle into a plurality of time slots based on the remote vehicles and controlling, according to the plurality of time slots and the sensor actuation time slot, sensor actuation of a sensor of the first vehicle and the sensors of the remote vehicles.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: July 2, 2019
    Assignee: Honda Motor Co., Ltd.
    Inventors: Samir Al-Stouhi, Paritosh Kelkar, Hasan Tafish
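    The slotted actuation scheme described in this entry's abstract can be illustrated as below: a shared clock cycle is divided into equal, non-overlapping time slots, one per cooperating vehicle, and each vehicle actuates its sensor only inside its own slot. The equal slot sizing and round-robin assignment are assumptions, not the claimed design.
    ```python
    # Illustrative time-slot assignment for the slotted actuation scheme described
    # in the abstract above: a shared clock cycle is divided evenly among the
    # cooperating vehicles, and each vehicle actuates its sensor only in its own
    # slot. Slot sizing and the round-robin assignment are assumptions.
    def assign_slots(cycle_ms: int, vehicle_ids: list[str]) -> dict[str, tuple[int, int]]:
        # Split one clock cycle into equal, non-overlapping slots per vehicle.
        slot_ms = cycle_ms // len(vehicle_ids)
        return {vid: (i * slot_ms, (i + 1) * slot_ms) for i, vid in enumerate(vehicle_ids)}

    def should_actuate(vehicle_id: str, slots: dict[str, tuple[int, int]],
                       global_time_ms: int, cycle_ms: int) -> bool:
        # A vehicle actuates only while the global clock is inside its slot.
        start, end = slots[vehicle_id]
        return start <= (global_time_ms % cycle_ms) < end

    if __name__ == "__main__":
        slots = assign_slots(100, ["ego", "rv-1", "rv-2", "rv-3"])
        print(slots, should_actuate("rv-2", slots, global_time_ms=1_057, cycle_ms=100))
    ```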
  • Publication number: 20190196025
    Abstract: A computer-implemented method and system for vehicle path estimation using a vehicular communication network. The method includes receiving a first set of position measurements of a first remote vehicle and a second set of position measurements of the first remote vehicle from messages transmitted using the vehicular communication network. The method includes determining a path shape of an initial path estimate of the first remote vehicle. The initial path estimate is based on the first set of position measurements. The method includes determining a corrected vehicle path estimate of the first remote vehicle by fitting the second set of position measurements to the path shape of the initial path estimate.
    Type: Application
    Filed: December 21, 2017
    Publication date: June 27, 2019
    Inventors: Samir K. Al-Stouhi, Paritosh Kelkar, Hasan Tafish
  • Patent number: 10334331
    Abstract: A method and system for controlling vehicle sensor data acquisition using a vehicular communication network, including synchronizing a local clock signal of a first vehicle and a local clock signal of a second vehicle with a global time signal. Further, determining a capture interval for a sensor data acquisition process time that maximizes a total number of data frames that can be captured by a sensor of the first vehicle and by a sensor of the second vehicle. Based on the capture interval, transmitting a first sensor trigger pulse to the sensor of the first vehicle and transmitting a second sensor trigger pulse to the sensor of the second vehicle. The first vehicle transmits the sensor data from the sensor of the first vehicle to the second vehicle, and the second vehicle transmits the sensor data from the sensor of the second vehicle to the first vehicle.
    Type: Grant
    Filed: August 25, 2017
    Date of Patent: June 25, 2019
    Assignee: Honda Motor Co., Ltd.
    Inventors: Samir K. Al-Stouhi, Paritosh Kelkar, Hasan Tafish
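    As a sketch of the trigger-pulse scheme described in this entry's abstract, offsetting the second vehicle's trigger by half the shared capture interval alternates the two sensors' frames and maximizes the combined frame count over an acquisition window. The half-interval offset is an assumption, not the claimed scheduling rule.
    ```python
    # Sketch of the interleaved trigger-pulse idea described in the abstract above:
    # with a shared global time base, offsetting the second vehicle's trigger by
    # half the capture interval alternates the two sensors' frames and maximises
    # the combined frame count. The half-interval offset is an assumption.
    def trigger_times_ms(window_ms: int, interval_ms: int, offset_ms: int) -> list[int]:
        # Global-time instants at which a sensor trigger pulse is sent.
        return list(range(offset_ms, window_ms, interval_ms))

    if __name__ == "__main__":
        interval = 50                                    # shared capture interval
        first = trigger_times_ms(500, interval, offset_ms=0)
        second = trigger_times_ms(500, interval, offset_ms=interval // 2)
        frames = sorted(first + second)                  # combined, alternating frames
        print(len(frames), frames[:6])
    ```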
  • Publication number: 20190072641
    Abstract: A method and system for controlling sensor data acquisition using a vehicular communication network is provided. An example method includes establishing an operable connection between a first vehicle and remote vehicles. The first vehicle and the remote vehicles operate based upon a common time base according to a global time signal. The method includes receiving capability data that includes a sensor actuation time slot of each of the remote vehicles indicating a time slot at which the sensors of each of the remote vehicles are actuating. The sensor actuation time slot of each of the remote vehicles is different. The method also includes dividing a clock cycle into a plurality of time slots based on the remote vehicles and controlling, according to the plurality of time slots and the sensor actuation time slot, sensor actuation of a sensor of the first vehicle and the sensors of the remote vehicles.
    Type: Application
    Filed: October 31, 2018
    Publication date: March 7, 2019
    Inventors: Samir Al-Stouhi, Paritosh Kelkar, Hasan Tafish
  • Publication number: 20190069051
    Abstract: A method and system for controlling vehicle sensor data acquisition using a vehicular communication network, including synchronizing a local clock signal of a first vehicle and a local clock signal of a second vehicle with a global time signal. Further, determining a capture interval for a sensor data acquisition process time that maximizes a total number of data frames that can be captured by a sensor of the first vehicle and by a sensor of the second vehicle. Based on the capture interval, transmitting a first sensor trigger pulse to the sensor of the first vehicle and transmitting a second sensor trigger pulse to the sensor of the second vehicle. The first vehicle transmits the sensor data from the sensor of the first vehicle to the second vehicle, and the second vehicle transmits the sensor data from the sensor of the second vehicle to the first vehicle.
    Type: Application
    Filed: August 25, 2017
    Publication date: February 28, 2019
    Inventors: Samir K. Al-Stouhi, Paritosh Kelkar, Hasan Tafish