Patents by Inventor Csaba Petre

Csaba Petre has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11113825
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: September 7, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
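The localize-then-difference optimization described in this abstract can be sketched as follows. This is a minimal illustration using a hypothetical voxel occupancy-grid representation; the patent itself builds its volumes from projected camera images using plane-sweep stereo or convolutional neural networks.

```python
import numpy as np

def change_volume(before, after, threshold=0.5):
    """Locate the axis-aligned bounding box of voxels that changed
    between two occupancy grids (the 'change volume')."""
    changed = np.abs(after - before) > threshold
    if not changed.any():
        return None
    idx = np.argwhere(changed)
    lo, hi = idx.min(axis=0), idx.max(axis=0) + 1
    return tuple(slice(a, b) for a, b in zip(lo, hi))

def volume_difference(before, after, threshold=0.5):
    """Compute the 3D volume difference only inside the localized
    change volume, avoiding a full-grid comparison."""
    box = change_volume(before, after, threshold)
    if box is None:
        return 0.0
    return float((after[box] - before[box]).sum())

# Example: one item (a 2x2x2 block of occupied voxels) removed from a shelf.
before = np.zeros((16, 16, 16))
before[4:6, 4:6, 4:6] = 1.0
after = before.copy()
after[4:6, 4:6, 4:6] = 0.0
removed = volume_difference(before, after)  # negative: volume was taken away
```

A negative difference indicates displaced volume, which (divided by a known per-item volume) could also suggest the quantity of items taken, as the abstract notes for vertical stacks.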
  • Patent number: 11049263
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: June 29, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Publication number: 20210049772
    Abstract: An automated store attached to or integrated into a site where vehicles park, such as a gas station, charging station, or parking lot. The store may obtain the identity of the vehicle automatically, for example in a message sent over a charging cable, or by scanning a license plate. An authorization linked to the vehicle identity may be extended to passengers who exit the vehicle, so that these passengers may take items from the store and have them automatically charged to the vehicle's account. Locked cases containing products may be unlocked automatically when a shopper who exited an authorized vehicle arrives at the case. As passengers move around the site and obtain items from the store, messages may be transmitted back to the vehicle, or to a mobile device of the vehicle owner or driver, showing the items that have been taken.
    Type: Application
    Filed: October 30, 2020
    Publication date: February 18, 2021
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10818016
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units at higher levels of the hierarchical structure are more abstract than data in lower-level associative memory units. The associative memory units can communicate with one another, supplying contextual data.
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: October 27, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
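The hierarchy described in this abstract can be sketched as two stacked units, where each level produces a lower-dimensional (hence more abstract) summary of its input and can accept top-down context. The class, dimensions, and random-projection weights here are all illustrative assumptions, not the patent's actual architecture.

```python
import numpy as np

class AssociativeMemoryUnit:
    """Toy associative memory unit: compresses its input (bottom-up
    abstraction) with a fixed random projection and accepts top-down
    context from a higher level. Illustrative only."""
    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.standard_normal((out_dim, in_dim)) / np.sqrt(in_dim)
        self.context = np.zeros(out_dim)

    def forward(self, x):
        # Output is lower-dimensional than input, hence "more abstract".
        return np.tanh(self.weights @ x + self.context)

    def receive_context(self, ctx):
        # Contextual data supplied by another unit in the hierarchy.
        self.context = ctx

# Two-level hierarchy: level 1 abstracts raw input, level 2 abstracts level 1.
level1 = AssociativeMemoryUnit(in_dim=8, out_dim=4, seed=1)
level2 = AssociativeMemoryUnit(in_dim=4, out_dim=2, seed=2)

x = np.ones(8)
h1 = level1.forward(x)   # 4-dimensional abstraction of the 8-dim input
h2 = level2.forward(h1)  # 2-dimensional abstraction at the top level
```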
  • Patent number: 10783491
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: September 22, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
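The trigger logic in this abstract, where a per-zone quantity change initiates camera-image analysis, can be sketched as a simple comparison of sensor readings. The zone names and counts below are hypothetical.

```python
def changed_zones(before, after):
    """Compare per-zone item counts from quantity sensors and return the
    zones whose quantity changed, with the signed delta. A nonzero delta
    is what would trigger analysis of camera images for that zone."""
    return {zone: after[zone] - before[zone]
            for zone in before
            if after.get(zone, 0) != before[zone]}

# Hypothetical shelf with three bins; a shopper takes two items from bin "B".
before = {"A": 5, "B": 4, "C": 7}
after  = {"A": 5, "B": 2, "C": 7}
deltas = changed_zones(before, after)  # only bin "B" changed, by -2
```

In the patented system the raw counts would come from distance sensors (e.g. LIDAR on pusher shelves) or strain gauges rather than being given directly.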
  • Publication number: 20200202288
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: February 28, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10586208
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: July 16, 2019
    Date of Patent: March 10, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Filip Piekniewski, Aleksander Bapst, Soheyl Yousefisahi, Chin-Chang Kuo
  • Publication number: 20200027218
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 23, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Publication number: 20200020113
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Publication number: 20200020112
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: May 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Publication number: 20200019921
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: July 16, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Filip PIEKNIEWSKI, Aleksander BAPST, Soheyl YOUSEFISAHI, Chin-Chang KUO
  • Patent number: 10535146
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: May 6, 2019
    Date of Patent: January 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Publication number: 20190244365
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Application
    Filed: March 19, 2019
    Publication date: August 8, 2019
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
  • Patent number: 10373322
    Abstract: An autonomous store system that analyzes camera images to track people and their interactions with items. A processor obtains a 3D model of a store that contains items and item storage areas, receives images from cameras captured over a time period, and analyzes the images and the 3D model of the store to detect a person in the store, calculate the trajectory of the person, and identify an item storage area proximal to the trajectory of the person during an interaction time period. Two or more images are then analyzed to identify an item within the item storage area that is moved during the interaction time period; these images are captured within or proximal in time to the interaction time period and contain views of the item storage area, allowing motion of the item to be attributed to the person. The system also enables calibration and placement algorithms for cameras.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: August 6, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre
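The proximity test in this abstract, identifying an item storage area near a person's trajectory, can be sketched as a point-to-box distance check against each storage area in the store's 3D model. The area names, coordinates, and radius below are illustrative assumptions.

```python
import math

def point_to_box_distance(p, box_min, box_max):
    """Euclidean distance from a 3D point to an axis-aligned box
    (zero if the point is inside the box)."""
    return math.sqrt(sum(
        max(lo - c, 0.0, c - hi) ** 2
        for c, lo, hi in zip(p, box_min, box_max)))

def proximal_areas(trajectory, areas, radius=0.5):
    """Return the storage areas that the trajectory passes within
    `radius` metres of, i.e. candidates for a shopper interaction."""
    hits = set()
    for p in trajectory:
        for name, (lo, hi) in areas.items():
            if point_to_box_distance(p, lo, hi) <= radius:
                hits.add(name)
    return hits

# Two shelf volumes from a hypothetical store model, and a short trajectory.
areas = {"shelf_1": ((0, 0, 0), (1, 0.4, 2)),
         "shelf_2": ((3, 0, 0), (4, 0.4, 2))}
path = [(2.0, 1.0, 1.0), (1.2, 0.5, 1.0), (1.1, 0.45, 1.0)]
near = proximal_areas(path, areas)  # the person approaches shelf_1 only
```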
  • Patent number: 10282852
    Abstract: A system that analyzes camera images to track a person in an autonomous store, and to determine when a tracked person takes or moves items in the store. The system may associate a field of influence volume around a person's location; intersection of this volume with an item storage area, such as a shelf, may trigger the system to look for changes in the items on the shelf. Items that are taken from, placed on, or moved on a shelf may be determined by a neural network that processes before and after images of the shelf. Person tracking may be performed by analyzing images from fisheye ceiling cameras projected onto a plane horizontal to the floor. Projected ceiling camera images may be analyzed using a neural network trained to recognize shopper locations. The autonomous store may include modular ceiling and shelving fixtures that contain cameras, lights, processors, and networking.
    Type: Grant
    Filed: January 23, 2019
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
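The field-of-influence trigger described in this abstract can be sketched as a sphere/box intersection test: when the volume around a tracked person touches a shelf volume, the system starts looking for item changes on that shelf. Modelling the field of influence as a sphere, and the specific dimensions, are assumptions made for illustration.

```python
def sphere_intersects_box(center, radius, box_min, box_max):
    """True if a sphere (the shopper's field-of-influence volume,
    modelled here as a sphere around the tracked position) touches an
    axis-aligned shelf volume."""
    d2 = 0.0
    for c, lo, hi in zip(center, box_min, box_max):
        gap = max(lo - c, 0.0, c - hi)  # per-axis distance outside the box
        d2 += gap * gap
    return d2 <= radius * radius

shelf = ((0.0, 0.0, 0.0), (1.0, 0.4, 1.8))
# Shopper 0.3 m from the shelf face with a 0.8 m reach: triggers a check.
triggers = sphere_intersects_box((1.3, 0.2, 1.0), 0.8, *shelf)
# A shopper 2 m away does not.
far_away = sphere_intersects_box((3.0, 0.2, 1.0), 0.8, *shelf)
```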
  • Patent number: 10282720
    Abstract: A system that analyzes camera images to track a person from a point where the person obtains an authorization to a different point where the authorization is used. The authorization may be extended in time and space from the point where it was initially obtained. Scenarios enabled by embodiments include automatically opening a locked door or gate for an authorized person and automatically charging items taken by a person to that person's account. Supports automated stores that allow users to enter, take products and exit without explicitly paying. An illustrative application is an automated, unmanned gas station that allows a user to pay at the pump and then enter a locked on-site convenience store or a locked case with products the user can take for automatic purchase. Embodiments may also extend authorization to other people, such as occupants of the same vehicle.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
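The idea of extending an authorization in time and space, obtained at one point (e.g. paying at a pump) and honored later at another (e.g. a locked store door), can be sketched with a simple validity check. The class, parameter names, and limits are hypothetical.

```python
import math

class Authorization:
    """Toy model of an authorization obtained at one point and extended
    in time and space across a site. Parameters are illustrative."""
    def __init__(self, origin, granted_at, max_distance=50.0, ttl=900.0):
        self.origin = origin              # (x, y) where it was obtained
        self.granted_at = granted_at      # timestamp, seconds
        self.max_distance = max_distance  # metres of spatial extension
        self.ttl = ttl                    # seconds of temporal extension

    def valid_at(self, point, now):
        within_time = (now - self.granted_at) <= self.ttl
        within_space = math.dist(self.origin, point) <= self.max_distance
        return within_time and within_space

# Authorization granted at the pump at t=1000 s.
auth = Authorization(origin=(0.0, 0.0), granted_at=1000.0)
at_store_door = auth.valid_at((20.0, 5.0), now=1300.0)  # still valid
too_late = auth.valid_at((20.0, 5.0), now=2500.0)       # expired
```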
  • Patent number: 10282849
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Grant
    Filed: June 19, 2017
    Date of Patent: May 7, 2019
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
  • Patent number: 10210452
    Abstract: Apparatus and methods for a high-level neuromorphic network description (HLND) framework that may be configured to enable users to define neuromorphic network architectures using a unified and unambiguous representation that is both human-readable and machine-interpretable. The framework may be used to define node types and node-to-node connection types, instantiate node instances of those types, and generate instances of connection types between these nodes. To facilitate framework usage, the HLND format may provide the flexibility required by computational neuroscientists and, at the same time, a user-friendly interface for users with limited experience in modeling neurons. The HLND kernel may comprise an interface to the Elementary Network Description (END) format, which is optimized for efficient representation of neuronal systems in a hardware-independent manner and enables seamless translation of an HLND model description into hardware instructions for execution by various processing modules.
    Type: Grant
    Filed: March 15, 2012
    Date of Patent: February 19, 2019
    Assignee: QUALCOMM Incorporated
    Inventors: Botond Szatmary, Eugene M. Izhikevich, Csaba Petre, Jayram Moorkanikara Nageswaran, Filip Piekniewski
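The HLND workflow described in this abstract, declaring node and connection types and then instantiating them, can be sketched as a tiny description API. The class, method names, and the Izhikevich-style parameters below are hypothetical; the actual HLND format is defined in the patent.

```python
class NetworkDescription:
    """Minimal sketch of a high-level network description: declare node
    types and connection types, then instantiate populations and connect
    them. Loosely modelled on the HLND workflow; the API is illustrative."""
    def __init__(self):
        self.node_types = {}
        self.nodes = {}
        self.connections = []

    def define_node_type(self, name, **params):
        self.node_types[name] = params

    def instantiate(self, type_name, tag, count):
        # Create a tagged population of `count` nodes of a defined type.
        self.nodes[tag] = {"type": type_name, "count": count}

    def connect(self, src_tag, dst_tag, conn_type):
        self.connections.append((src_tag, dst_tag, conn_type))

net = NetworkDescription()
net.define_node_type("izhikevich", a=0.02, b=0.2, c=-65.0, d=8.0)
net.instantiate("izhikevich", tag="excitatory", count=800)
net.instantiate("izhikevich", tag="inhibitory", count=200)
net.connect("excitatory", "inhibitory", conn_type="all_to_all")
```

A separate backend (the END interface in the abstract) would translate such a description into hardware-specific instructions.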
  • Publication number: 20180018775
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Application
    Filed: June 19, 2017
    Publication date: January 18, 2018
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
  • Patent number: 9860077
    Abstract: Computerized appliances may be operated by users remotely. A learning controller apparatus may be operated to determine an association between a user indication and an action by the appliance. The user indications, e.g., gestures, posture changes, or audio signals, may trigger an event associated with the controller. The event may be linked to a plurality of instructions configured to communicate a command to the appliance. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The sensory input may be used to determine the user indications. During operation, upon determining the indication from the sensory input, the controller may cause execution of the respective instructions in order to trigger an action by the appliance. The device animation methodology may enable users to operate computerized appliances using gestures, voice commands, posture changes, and/or other customized control elements.
    Type: Grant
    Filed: September 17, 2014
    Date of Patent: January 2, 2018
    Assignee: Brain Corporation
    Inventors: Patryk Laurent, Csaba Petre, Eugene M. Izhikevich
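The association step in this abstract, linking a learned user indication to a list of appliance instructions, can be sketched as a simple lookup that is populated during learning and consulted during operation. The class, gesture label, and command strings are illustrative assumptions; recognizing the gesture from raw sensory input is the hard part and is not shown.

```python
class LearningController:
    """Toy learning controller: associates a user indication (e.g. a
    gesture label derived from sensory input) with a list of appliance
    commands, then replays those commands when the indication recurs."""
    def __init__(self):
        self.associations = {}

    def learn(self, indication, instructions):
        # Store the association determined during training.
        self.associations[indication] = list(instructions)

    def handle(self, indication):
        # "Executing" here just means returning the commands to send.
        return self.associations.get(indication, [])

ctrl = LearningController()
ctrl.learn("wave_left", ["robot.turn_left", "robot.forward"])
commands = ctrl.handle("wave_left")  # the two learned commands
unknown = ctrl.handle("nod")         # no association: no commands
```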