Patents Assigned to ACCEL ROBOTICS
  • Patent number: 10783491
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: September 22, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
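The quantity-change trigger described in this abstract can be sketched as a simple comparison loop: each storage zone's latest sensor reading is checked against the last confirmed quantity, and any delta flags that zone for camera-image analysis. This is an illustrative sketch only; the names (`ZoneState`, `detect_changes`) are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ZoneState:
    zone_id: str
    quantity: int  # last confirmed item count in the bin or lane

def detect_changes(zones, new_readings):
    """Return (zone_id, delta) for every zone whose measured quantity moved.

    A negative delta suggests a take; a positive delta suggests a put-back.
    Each event would trigger analysis of camera images of that zone.
    """
    events = []
    for zone in zones:
        measured = new_readings[zone.zone_id]
        delta = measured - zone.quantity
        if delta != 0:
            events.append((zone.zone_id, delta))
            zone.quantity = measured  # accept the new reading
    return events

zones = [ZoneState("lane-1", 5), ZoneState("lane-2", 3)]
events = detect_changes(zones, {"lane-1": 4, "lane-2": 3})
print(events)  # one item taken from lane-1; lane-2 unchanged
```

In the full system, each emitted event would also carry a timestamp so the shopper nearest the shelf during that interval can be attributed the action.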
  • Publication number: 20200202177
    Abstract: A system having a bar containing distance sensors, such as LIDARs, that may be installed behind a shelf in an automated store. Shelves may be divided into storage zones, such as bins or lanes, and a distance sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. The distance sensor elements within the bar may be relocatable so they can be positioned correctly behind the corresponding storage zones. The bar may have mounting mechanisms on either side for attachment to shelving support structures. These mounting mechanisms may include security locks. The bar may rotate relative to the mounting mechanisms to provide access to the shelf from behind.
    Type: Application
    Filed: March 4, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, Aleksander BAPST, Filip PIEKNIEWSKI, Mark GRAHAM, Yohei YAMAMURO, Jose Miguel RODRIGUEZ
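For the pusher-fed lanes this abstract describes, a rear-mounted distance sensor can be converted to an item count because items of known depth are pushed to the shelf front: the gap from the sensor to the rearmost item grows by one item depth per item removed. A minimal sketch, with all dimensions and names assumed for illustration:

```python
def items_in_lane(distance_mm: float, lane_depth_mm: float,
                  item_depth_mm: float) -> int:
    """Estimate item count in a pusher lane from the measured gap.

    distance_mm: sensor reading from the back of the lane to the last item.
    """
    occupied = lane_depth_mm - distance_mm
    if occupied <= 0:
        return 0  # empty lane: the sensor sees all the way to the front
    return round(occupied / item_depth_mm)

print(items_in_lane(200.0, 500.0, 60.0))  # 300 mm occupied ≈ 5 items
```

A drop in this count between consecutive readings is the quantity change that triggers image analysis of the corresponding zone.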
  • Publication number: 20200202288
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: February 28, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10621472
    Abstract: System that facilitates rapid onboarding of an autonomous (cashier-less) store by capturing images of items in the store's catalog from different angles, with varying backgrounds and lighting conditions, and that automatically builds a classifier training dataset from these images. The system may have cameras in different positions, lights supporting variable illumination, and monitor screens that generate different background colors. It may have an input device such as a barcode reader, and an operator terminal that prompts operators to place items into the imaging system in the necessary orientations. Once an item is placed in the imaging system, a fully automated process may generate a sequence of background colors, a sequence of lighting conditions, and may capture and process images from all of the cameras to create training images. Training images for an item may be generated in seconds, compared to many minutes per item using manual image capture and processing.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: April 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Tanuj Pankaj, Chin-Chang Kuo
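The fully automated capture sequence above amounts to iterating the cross product of background colors, lighting conditions, and cameras, labeling every frame with the scanned barcode. The sketch below simulates camera I/O with a stand-in function; `build_training_set` and `capture_frame` are illustrative names, not the patented design.

```python
from itertools import product

def build_training_set(barcode, backgrounds, lightings, cameras, capture_frame):
    """Return one labeled record per (background, lighting, camera) combination."""
    records = []
    for bg, light, cam in product(backgrounds, lightings, cameras):
        frame = capture_frame(cam, bg, light)
        records.append({"label": barcode, "camera": cam,
                        "background": bg, "lighting": light, "image": frame})
    return records

# Simulated capture: returns a placeholder string instead of a real image.
fake = lambda cam, bg, light: f"img:{cam}:{bg}:{light}"
data = build_training_set("0123456789", ["white", "green"], [0.5, 1.0],
                          ["cam0", "cam1"], fake)
print(len(data))  # 2 backgrounds x 2 lightings x 2 cameras = 8 images
```

Because the loop is fully scripted, adding a background color or camera multiplies the dataset automatically, which is what makes seconds-per-item onboarding possible.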
  • Patent number: 10586208
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: July 16, 2019
    Date of Patent: March 10, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Filip Piekniewski, Aleksander Bapst, Soheyl Yousefisahi, Chin-Chang Kuo
  • Publication number: 20200027218
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 23, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
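The optimization this abstract describes, localizing a change volume before running the expensive 3D difference, can be illustrated in two dimensions: cheap per-cell before/after differences yield a bounding box, and only that box would receive plane-sweep stereo or neural-network analysis. Arrays are nested lists and the names are assumptions for the sketch.

```python
def change_bounding_box(before, after, threshold):
    """Return ((min_r, min_c), (max_r, max_c)) of changed cells, or None."""
    rows, cols = [], []
    for r, row in enumerate(before):
        for c, v in enumerate(row):
            if abs(after[r][c] - v) > threshold:
                rows.append(r)
                cols.append(c)
    if not rows:
        return None  # nothing displaced: skip the expensive analysis entirely
    return (min(rows), min(cols)), (max(rows), max(cols))

before = [[0, 0, 0, 0],
          [0, 5, 5, 0],
          [0, 5, 5, 0]]
after  = [[0, 0, 0, 0],
          [0, 0, 5, 0],
          [0, 0, 0, 0]]
box = change_bounding_box(before, after, 1)
print(box)  # only this region would get the full 3D volume difference
```

Restricting the heavy computation to the returned box is what yields the power and latency savings the abstract claims.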
  • Publication number: 20200020112
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: May 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Publication number: 20200019921
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: July 16, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Filip PIEKNIEWSKI, Aleksander BAPST, Soheyl YOUSEFISAHI, Chin-Chang KUO
  • Publication number: 20200020113
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10535146
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: May 6, 2019
    Date of Patent: January 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Patent number: 10373322
    Abstract: An autonomous store system that analyzes camera images to track people and their interactions with items. A processor obtains a 3D model of a store that contains items and item storage areas, receives images captured by cameras over a time period, and analyzes the images and the 3D model to detect a person in the store, calculate the person's trajectory, identify an item storage area proximal to that trajectory during an interaction time period, and analyze two or more images to identify an item within the storage area that is moved during that period. The images are captured within or proximal in time to the interaction time period and contain views of the item storage area, and motion of the item is attributed to the person. The system also enables calibration and placement algorithms for cameras.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: August 6, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre
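The attribution step above hinges on finding when a tracked trajectory passes close to a storage area. A hedged sketch: given timestamped 2D positions and an area center, return the time window during which the person was within interaction range. Coordinates, threshold, and names are made up for illustration.

```python
import math

def interaction_window(trajectory, area_center, max_dist):
    """Return (t_start, t_end) while the person is near the area, or None.

    trajectory: iterable of (t, x, y) samples from the person tracker.
    """
    times = [t for t, x, y in trajectory
             if math.hypot(x - area_center[0], y - area_center[1]) <= max_dist]
    if not times:
        return None
    return min(times), max(times)

path = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.5), (2.0, 2.0, 1.0), (3.0, 4.0, 1.0)]
print(interaction_window(path, area_center=(2.0, 1.0), max_dist=1.5))
```

Images captured within or near the returned window are the ones compared to identify which item moved, and the motion is then attributed to that person.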
  • Patent number: 10282720
    Abstract: A system that analyzes camera images to track a person from a point where the person obtains an authorization to a different point where the authorization is used. The authorization may be extended in time and space from the point where it was initially obtained. Scenarios enabled by embodiments include automatically opening a locked door or gate for an authorized person and automatically charging items taken by a person to that person's account. Supports automated stores that allow users to enter, take products and exit without explicitly paying. An illustrative application is an automated, unmanned gas station that allows a user to pay at the pump and then enter a locked on-site convenience store or a locked case with products the user can take for automatic purchase. Embodiments may also extend authorization to other people, such as occupants of the same vehicle.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
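The time-and-space extension of an authorization described above can be sketched as a small validity check: an authorization obtained at one point (e.g. a gas pump) stays valid for a continuously tracked person while it is young enough and the person remains near its origin. The `Authorization` class and its limits are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Authorization:
    person_id: str
    granted_at: float        # seconds since epoch, when payment was made
    origin: tuple            # (x, y) location where authorization was obtained
    max_age_s: float = 600.0
    max_radius_m: float = 50.0

    def valid_at(self, t: float, pos: tuple) -> bool:
        """True if the authorization still applies at time t and position pos."""
        dx, dy = pos[0] - self.origin[0], pos[1] - self.origin[1]
        within_time = (t - self.granted_at) <= self.max_age_s
        within_space = (dx * dx + dy * dy) ** 0.5 <= self.max_radius_m
        return within_time and within_space

auth = Authorization("shopper-7", granted_at=0.0, origin=(0.0, 0.0))
print(auth.valid_at(120.0, (10.0, 20.0)))  # inside both limits: unlock the door
print(auth.valid_at(900.0, (10.0, 20.0)))  # expired: keep it locked
```

Extending the authorization to other occupants of the same vehicle would amount to issuing additional `Authorization` records sharing the same origin and grant time.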
  • Patent number: 10282852
    Abstract: A system that analyzes camera images to track a person in an autonomous store, and to determine when a tracked person takes or moves items in the store. The system may associate a field of influence volume around a person's location; intersection of this volume with an item storage area, such as a shelf, may trigger the system to look for changes in the items on the shelf. Items that are taken from, placed on, or moved on a shelf may be determined by a neural network that processes before and after images of the shelf. Person tracking may be performed by analyzing images from fisheye ceiling cameras projected onto a plane horizontal to the floor. Projected ceiling camera images may be analyzed using a neural network trained to recognize shopper locations. The autonomous store may include modular ceiling and shelving fixtures that contain cameras, lights, processors, and networking.
    Type: Grant
    Filed: January 23, 2019
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
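The trigger this abstract describes, intersecting a field-of-influence volume with a shelf, reduces geometrically to a sphere-versus-box test: model the volume as a sphere around the person's position and each shelf as an axis-aligned box, and any contact flags that shelf for a before/after image comparison. Dimensions and names are illustrative.

```python
def sphere_intersects_box(center, radius, box_min, box_max):
    """True if a sphere touches an axis-aligned box (standard closest-point test)."""
    d2 = 0.0
    for c, lo, hi in zip(center, box_min, box_max):
        nearest = max(lo, min(c, hi))  # closest point of the box on this axis
        d2 += (c - nearest) ** 2
    return d2 <= radius * radius

# A shelf 2 m wide, 0.5 m deep, 1.8 m tall, with a shopper reaching nearby.
shelf_min, shelf_max = (0.0, 0.0, 0.0), (2.0, 0.5, 1.8)
print(sphere_intersects_box((2.3, 1.0, 1.0), 0.8, shelf_min, shelf_max))
```

Only shelves whose test returns true need their before/after images fed to the item-change neural network, keeping the per-frame workload bounded.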
  • Patent number: 9969080
    Abstract: A robot for capturing image frames of subjects in response to a request includes a base, a robot head, a camera, an angular positioning mechanism, a vertical positioning mechanism, and a control system. The base has a transport mechanism for controllably positioning the robot along a lateral surface. The angular positioning mechanism couples the camera to the robot head and controls a tilt of the camera. The vertical positioning mechanism couples the robot head to the base and adjusts a vertical distance between the robot head and the support surface. The control system controls image capture of the camera, the transport mechanism, the angular positioning mechanism, and the vertical positioning mechanism.
    Type: Grant
    Filed: August 2, 2016
    Date of Patent: May 15, 2018
    Assignee: ACCEL ROBOTICS
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky
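The control system above coordinates the vertical positioning mechanism and the camera tilt; one piece of that coordination can be sketched as an aiming computation: raise the head toward the subject's height within the mechanism's travel, then tilt the camera to cover the residual difference. Travel limits and names are assumptions for the sketch.

```python
import math

def aim_camera(subject_dist_m, subject_height_m,
               min_head_m=0.5, max_head_m=1.8):
    """Return (head_height_m, tilt_deg) to center a subject in the frame."""
    # Clamp the head height to the vertical mechanism's travel range.
    head = min(max(subject_height_m, min_head_m), max_head_m)
    # Tilt up (positive) or down (negative) for any height the head can't reach.
    tilt = math.degrees(math.atan2(subject_height_m - head, subject_dist_m))
    return head, tilt

head, tilt = aim_camera(2.0, 2.2)  # subject taller than the head's max travel
print(head, round(tilt, 1))        # head at its limit, camera tilting up
```

The base's transport mechanism would supply `subject_dist_m` by driving the robot to a suitable standoff before this computation runs.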