Patents by Inventor Abhinav Yarlagadda

Abhinav Yarlagadda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250117766
    Abstract: Systems and methods include automatically identifying items from images of items positioned at a point-of-sale (POS) system. Cameras capture images of the items, with each camera capturing a different field of view (FOV) of the items and thereby capturing different item parameters of each item. The items are identified when their item parameters match item parameters previously identified, and fail to be identified when their item parameters do not match item parameters previously identified. Image pixels associated with each unknown item are extracted and mapped to the real-world coordinates of each unknown item as positioned at the POS system. A bounding polygon that encapsulates each unknown item is generated based on the image pixels mapped to the real-world coordinates. The bounding polygon is projected onto each unknown item, thereby providing feedback for each unknown item positioned at the POS system.
    Type: Application
    Filed: October 4, 2024
    Publication date: April 10, 2025
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
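The abstract above describes mapping an unknown item's image pixels to real-world coordinates and generating a bounding polygon that encapsulates the item. The publication does not spell out the geometry, so the following sketch makes two illustrative assumptions: a planar checkout surface and a known pixel-to-surface homography `H`; a convex hull stands in for the bounding polygon.

```python
import numpy as np

def pixels_to_world(pixels: np.ndarray, homography: np.ndarray) -> np.ndarray:
    """Map Nx2 pixel coordinates onto the checkout plane via a 3x3 homography."""
    homogeneous = np.hstack([pixels, np.ones((len(pixels), 1))])   # Nx3
    projected = homogeneous @ homography.T                          # Nx3
    return projected[:, :2] / projected[:, 2:3]                     # divide out w

def bounding_polygon(points: np.ndarray) -> np.ndarray:
    """Convex hull (monotone chain) of Nx2 points, standing in for the bounding polygon."""
    pts = sorted(map(tuple, points))
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def half(seq):
        hull = []
        for p in seq:
            while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
                hull.pop()
            hull.append(p)
        return hull
    lower, upper = half(pts), half(list(reversed(pts)))
    return np.array(lower[:-1] + upper[:-1])

# Hypothetical usage: identity homography and a few segmented pixels of one unknown item.
H = np.eye(3)
item_pixels = np.array([[120, 80], [180, 85], [175, 140], [125, 150], [150, 110]], float)
print(bounding_polygon(pixels_to_world(item_pixels, H)))
```

Projecting the polygon back onto the item (the feedback step in the abstract) would use the inverse homography; that step is omitted here.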
  • Publication number: 20250117765
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. UPC features included in the item parameters associated with each item are, when combined, indicative of the UPC of each item. The UPC features are analyzed to determine whether the UPC features associated with each item, when combined, match a corresponding combination of UPC features stored in a database. The database stores different combinations of UPC features, with each different combination associated with a corresponding item, thereby identifying each corresponding item based on the combination of UPC features associated with it. Each item positioned at the POS system is identified when the UPC features associated with the item, when combined, match a corresponding combination of UPC features as stored in the database.
    Type: Application
    Filed: October 4, 2024
    Publication date: April 10, 2025
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
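For publication 20250117765, the claimed idea is that partial UPC features observed across views, once combined, can be matched against stored combinations keyed to items. The filing does not define the feature representation; the sketch below assumes a (digit-position, digit-value) encoding and an exact subset-match rule purely for illustration.

```python
from typing import Dict, FrozenSet, Optional, Tuple

# A "UPC feature" is assumed here to be a (digit_position, digit_value) read
# recovered from one camera view; a full observation is the union across views.
UpcFeatures = FrozenSet[Tuple[int, int]]

# Hypothetical database: stored feature combinations mapped to item identifiers.
FEATURE_DB: Dict[UpcFeatures, str] = {
    frozenset({(0, 0), (1, 4), (2, 9), (3, 2)}): "sku-cola-12oz",
    frozenset({(0, 3), (1, 1), (2, 7), (3, 7)}): "sku-chips-2oz",
}

def combine_views(*per_view_features: UpcFeatures) -> UpcFeatures:
    """Union the partial UPC reads captured by each camera view."""
    combined: set = set()
    for features in per_view_features:
        combined |= features
    return frozenset(combined)

def identify(observed: UpcFeatures) -> Optional[str]:
    """Identify the item whose stored combination is covered by the observation."""
    for stored, item in FEATURE_DB.items():
        if stored <= observed:          # every stored feature was observed
            return item
    return None                          # no match: item remains unidentified

view_a = frozenset({(0, 0), (1, 4)})
view_b = frozenset({(2, 9), (3, 2)})
print(identify(combine_views(view_a, view_b)))   # prints "sku-cola-12oz"
```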
  • Publication number: 20250118175
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. Item parameters associated with each item, when mapped into a feature vector for each item, are indicative of the identity of the item. The feature vectors are analyzed to determine whether the item parameters, when combined and mapped into the feature vectors, match a corresponding feature vector stored in a database. The database stores different combinations of item parameters as mapped into different stored feature vectors, with different stored feature vectors associated with different items, thereby identifying each item based on the combination of item parameters as mapped into the stored feature vector associated with that item. Each item positioned at the POS system is identified when the feature vector associated with the item matches a corresponding stored feature vector as stored in the database.
    Type: Application
    Filed: October 4, 2024
    Publication date: April 10, 2025
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
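Publication 20250118175 (and granted patent 12272217 below, which shares this abstract) describes matching an item's feature vector against stored feature vectors in a database. A minimal nearest-neighbor sketch, assuming cosine similarity and a fixed acceptance threshold (neither is specified in the abstract):

```python
from typing import Optional
import numpy as np

# Hypothetical gallery: stored feature vectors (rows) and the items they identify.
STORED_VECTORS = np.array([
    [0.9, 0.1, 0.3, 0.0],   # sku-cola-12oz
    [0.1, 0.8, 0.0, 0.5],   # sku-chips-2oz
    [0.2, 0.2, 0.9, 0.1],   # sku-gum-mint
])
STORED_ITEMS = ["sku-cola-12oz", "sku-chips-2oz", "sku-gum-mint"]

def match_item(query: np.ndarray, threshold: float = 0.9) -> Optional[str]:
    """Return the item whose stored vector is most similar to the query vector,
    or None when nothing clears the (assumed) cosine-similarity threshold."""
    gallery = STORED_VECTORS / np.linalg.norm(STORED_VECTORS, axis=1, keepdims=True)
    similarities = gallery @ (query / np.linalg.norm(query))
    best = int(np.argmax(similarities))
    return STORED_ITEMS[best] if similarities[best] >= threshold else None

print(match_item(np.array([0.85, 0.15, 0.25, 0.05])))   # prints "sku-cola-12oz"
```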
  • Patent number: 12272217
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. Item parameters associated with each item, when mapped into a feature vector for each item, are indicative of the identity of the item. The feature vectors are analyzed to determine whether the item parameters, when combined and mapped into the feature vectors, match a corresponding feature vector stored in a database. The database stores different combinations of item parameters as mapped into different stored feature vectors, with different stored feature vectors associated with different items, thereby identifying each item based on the combination of item parameters as mapped into the stored feature vector associated with that item. Each item positioned at the POS system is identified when the feature vector associated with the item matches a corresponding stored feature vector as stored in the database.
    Type: Grant
    Filed: October 4, 2024
    Date of Patent: April 8, 2025
    Assignee: RadiusAI, Inc.
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Sri Priyanka Madduluri
  • Publication number: 20250069362
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. The item parameters associated with each item are indicative of the identity of each item, thereby enabling the identification of each item based on the item parameters. The item parameters are analyzed to determine whether they match item parameters stored in a database. The database stores different combinations of item parameters to thereby identify each item based on each different combination of item parameters associated with each item. Each item positioned at the POS system is identified when its item parameters match item parameters as stored in the database, and fails to be identified when its item parameters do not match. The item parameters associated with the items that fail to match are streamed to the database, thereby enabling the identification of each item that failed to match.
    Type: Application
    Filed: November 11, 2024
    Publication date: February 27, 2025
    Inventors: Abhinav Yarlagadda, Aykut Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
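Publication 20250069362 (and the related filings 20240249266 and 20240242470 below, which share this abstract) hinges on a match-or-stream flow: parameters that match the database identify the item, and parameters that fail to match are streamed to the database so the item can be identified later. A toy sketch of that flow, with the dictionary-backed store and the explicit labeling step being assumptions for illustration:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class ItemDatabase:
    """Toy stand-in for the database of known item-parameter combinations.
    Unmatched parameter sets are streamed into a pending queue so they can be
    labeled and used to identify the item on later encounters."""
    known: Dict[Tuple, str] = field(default_factory=dict)
    pending: List[Tuple] = field(default_factory=list)

    def identify(self, parameters: Tuple) -> Optional[str]:
        item = self.known.get(parameters)
        if item is None:
            self.pending.append(parameters)      # stream unmatched parameters
        return item

    def label_pending(self, parameters: Tuple, item: str) -> None:
        """Once a pending combination is labeled, future lookups succeed."""
        self.known[parameters] = item
        self.pending.remove(parameters)

db = ItemDatabase(known={("red-can", "355ml", "cylinder"): "sku-cola-12oz"})
print(db.identify(("red-can", "355ml", "cylinder")))   # "sku-cola-12oz"
print(db.identify(("green-bag", "57g", "pouch")))      # None, streamed to pending
db.label_pending(("green-bag", "57g", "pouch"), "sku-chips-2oz")
print(db.identify(("green-bag", "57g", "pouch")))      # "sku-chips-2oz"
```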
  • Patent number: 12236662
    Abstract: Assisted checkout devices, including point-of-sale stations, can use computer vision and machine learning to speed the checkout process while maintaining human verification, assistance, and customer interaction provided by human clerks. A plurality of optical sensors, including cameras, can be arranged with different views of a checkout plane upon which items being purchased by a buyer are placed. Moreover, one or more support towers can be utilized to elevate the optical sensors to vertical heights at which the checkout plane, and items placed thereon, is within the field of view of the optical sensors. The information captured by the plurality of optical sensors can be analyzed using machine learning models to detect and identify the items placed on the checkout plane.
    Type: Grant
    Filed: January 16, 2024
    Date of Patent: February 25, 2025
    Assignee: RadiusAI, Inc.
    Inventors: Abhinav Yarlagadda, Aykut Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
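Patent 12236662 (and application 20240242578 below, which shares this abstract) covers elevated optical sensors with differing views of a checkout plane and machine learning models that detect the placed items. The abstract does not describe how per-camera detections are combined, so the sketch below assumes a simple confidence-summing consensus across views:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Hypothetical per-camera output: (item_label, confidence) pairs produced by a
# detector running on each camera's view of the checkout plane.
Detection = Tuple[str, float]

def fuse_views(per_camera: List[List[Detection]]) -> List[str]:
    """Rank candidate item labels by summing detection confidences across views.
    This consensus rule is an illustrative assumption, not the patented method."""
    scores: Dict[str, float] = defaultdict(float)
    for detections in per_camera:
        for label, confidence in detections:
            scores[label] += confidence
    return sorted(scores, key=scores.get, reverse=True)

cam_left  = [("sku-cola-12oz", 0.94), ("sku-chips-2oz", 0.55)]
cam_right = [("sku-cola-12oz", 0.88)]
cam_top   = [("sku-chips-2oz", 0.91), ("sku-cola-12oz", 0.40)]
print(fuse_views([cam_left, cam_right, cam_top]))   # cola ranked first, then chips
```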
  • Publication number: 20240394779
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Application
    Filed: August 6, 2024
    Publication date: November 28, 2024
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Krishna Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
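This visual tracking abstract (repeated in several related filings below) centers on maintaining a coherent track identity across cameras by combining motion data with visual featurization data. A minimal association sketch, where the cost weights, the normalization, and the acceptance threshold are all assumptions rather than details from the filings:

```python
from typing import Dict, Optional
import numpy as np

# Hypothetical track state: last known position on a store floor plan (meters)
# and an appearance embedding for each tracked person.
TRACKS: Dict[int, dict] = {
    1: {"position": np.array([2.0, 5.0]), "embedding": np.array([0.9, 0.1, 0.4])},
    2: {"position": np.array([8.0, 1.0]), "embedding": np.array([0.1, 0.8, 0.2])},
}

def associate(position: np.ndarray, embedding: np.ndarray,
              w_motion: float = 0.3, w_appearance: float = 0.7,
              max_cost: float = 1.0) -> Optional[int]:
    """Assign a detection to the existing track with the lowest combined
    motion + appearance cost, or to no track (a new identity) if too costly."""
    best_id, best_cost = None, float("inf")
    for track_id, state in TRACKS.items():
        motion_cost = np.linalg.norm(position - state["position"]) / 10.0  # assumed scale
        appearance_cost = 1.0 - float(
            embedding @ state["embedding"]
            / (np.linalg.norm(embedding) * np.linalg.norm(state["embedding"]))
        )
        cost = w_motion * motion_cost + w_appearance * appearance_cost
        if cost < best_cost:
            best_id, best_cost = track_id, cost
    return best_id if best_cost <= max_cost else None

print(associate(np.array([2.3, 5.2]), np.array([0.88, 0.15, 0.35])))   # prints 1
```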
  • Publication number: 20240265434
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Application
    Filed: April 12, 2024
    Publication date: August 8, 2024
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Krishna Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
  • Patent number: 12056752
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Grant
    Filed: February 13, 2023
    Date of Patent: August 6, 2024
    Assignee: RadiusAI, Inc.
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Krishna Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
  • Publication number: 20240249266
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. The item parameters associated with each item are indicative of the identity of each item, thereby enabling the identification of each item based on the item parameters. The item parameters are analyzed to determine whether they match item parameters stored in a database. The database stores different combinations of item parameters to thereby identify each item based on each different combination of item parameters associated with each item. Each item positioned at the POS system is identified when its item parameters match item parameters as stored in the database, and fails to be identified when its item parameters do not match. The item parameters associated with the items that fail to match are streamed to the database, thereby enabling the identification of each item that failed to match.
    Type: Application
    Filed: January 16, 2024
    Publication date: July 25, 2024
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
  • Publication number: 20240242578
    Abstract: Assisted checkout devices, including point-of-sale stations, can use computer vision and machine learning to speed the checkout process while maintaining human verification, assistance, and customer interaction provided by human clerks. A plurality of optical sensors, including cameras, can be arranged with different views of a checkout plane upon which items being purchased by a buyer are placed. Moreover, one or more support towers can be utilized to elevate the optical sensors to vertical heights at which the checkout plane, and items placed thereon, is within the field of view of the optical sensors. The information captured by the plurality of optical sensors can be analyzed using machine learning models to detect and identify the items placed on the checkout plane.
    Type: Application
    Filed: January 16, 2024
    Publication date: July 18, 2024
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
  • Publication number: 20240242505
    Abstract: Visual analytics systems can use video data from cameras placed throughout a location (e.g., a store or a hospital) to determine poses, gestures, tracks, journeys, and actions, and thereby to provide broader analytics useful in realizing efficiencies and improving human behaviors. Data from visual analytics systems can be used to generate alerts, make useful predictions, and run simulations to test hypotheses.
    Type: Application
    Filed: January 16, 2024
    Publication date: July 18, 2024
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Alexander Magsam, Sharan Raja, Stephen McCracken, Xaio Liu, Raghavendra Nakka, Nathan Kelly, Tarik Temur, Siva Naga Raju Balusu, Uday Bhargav Reddy Murikinati, Girisha Lakshmi Nakka, Marek Niemyjski, Todd Ellering, Jeffrey Kershner, Dana Altier, Paul Mills
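Publication 20240242505 describes turning poses, tracks, journeys, and actions into alerts, predictions, and simulations. As one illustration of the alerting idea only, the sketch below assumes a per-zone dwell-time summary and a simple threshold rule; the zone names and thresholds are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackSample:
    """One tracked person's presence in a named zone, as a visual analytics
    pipeline might summarize it."""
    person_id: int
    zone: str
    dwell_seconds: float

def queue_alert(samples: List[TrackSample], zone: str = "checkout",
                max_dwell: float = 120.0, max_waiting: int = 3) -> bool:
    """Raise an alert when too many people have waited too long in a zone."""
    waiting = [s for s in samples if s.zone == zone and s.dwell_seconds > max_dwell]
    return len(waiting) >= max_waiting

samples = [
    TrackSample(1, "checkout", 150.0),
    TrackSample(2, "checkout", 180.0),
    TrackSample(3, "checkout", 95.0),
    TrackSample(4, "aisle-3", 30.0),
    TrackSample(5, "checkout", 200.0),
]
print(queue_alert(samples))   # True: three people exceed the dwell threshold
```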
  • Publication number: 20240242470
    Abstract: Systems and methods include extracting item parameters from images of items positioned at a POS system. The item parameters associated with each item are indicative of the identity of each item, thereby enabling the identification of each item based on the item parameters. The item parameters are analyzed to determine whether they match item parameters stored in a database. The database stores different combinations of item parameters to thereby identify each item based on each different combination of item parameters associated with each item. Each item positioned at the POS system is identified when its item parameters match item parameters as stored in the database, and fails to be identified when its item parameters do not match. The item parameters associated with the items that fail to match are streamed to the database, thereby enabling the identification of each item that failed to match.
    Type: Application
    Filed: January 16, 2024
    Publication date: July 18, 2024
    Inventors: Abhinav Yarlagadda, Enis Dengi, Sai Krishna Bashetty, Rahul Santhosh Kumar Varma, Daniel King, Kamalesh Kalirathinam, Nathan Kelly, Sri Priyanka Madduluri, Thomas Strich
  • Publication number: 20240029138
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Application
    Filed: February 13, 2023
    Publication date: January 25, 2024
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Krishna Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
  • Patent number: 11580648
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Grant
    Filed: May 3, 2021
    Date of Patent: February 14, 2023
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
  • Publication number: 20210304421
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Application
    Filed: May 3, 2021
    Publication date: September 30, 2021
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda
  • Patent number: 11024043
    Abstract: A visual tracking system for tracking and identifying persons within a monitored location, comprising a plurality of cameras and a visual processing unit, each camera produces a sequence of video frames depicting one or more of the persons, the visual processing unit is adapted to maintain a coherent track identity for each person across the plurality of cameras using a combination of motion data and visual featurization data, and further determine demographic data and sentiment data using the visual featurization data, the visual tracking system further having a recommendation module adapted to identify a customer need for each person using the sentiment data of the person in addition to context data, and generate an action recommendation for addressing the customer need, the visual tracking system is operably connected to a customer-oriented device configured to perform a customer-oriented action in accordance with the action recommendation.
    Type: Grant
    Filed: March 27, 2020
    Date of Patent: June 1, 2021
    Inventors: Abraham Othman, Enis Aykut Dengi, Ishan Agrawal, Jeff Kershner, Peter Martinez, Paul Mills, Abhinav Yarlagadda