Patents Assigned to ACCEL ROBOTICS
  • Publication number: 20200334835
    Abstract: A sensor bar shelf monitor that may, for example, be added to an existing shelving system to convert a store to autonomous operation. The sensor bar may contain distance sensors to detect shoppers reaching towards items on a shelf, and cameras to determine which items shoppers have taken. It may be installed into shelf supports such as gondola shelving uprights. The sensor bar may be located at the front edge of a shelf, and may monitor the shelf below. Placing the sensor bar along the front edge prevents damage to electronics from spills or shelf cleaning, and prevents heat from the sensor bar electronics from damaging items on the shelf. The sensor bar may have a local sensor bar processor that collects sensor data; images may be analyzed locally or transferred to more powerful store processors. A sensor bar may also have controllable lights and controllable electronic labels.
    Type: Application
    Filed: June 30, 2020
    Publication date: October 22, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Aleksander BAPST, Christopher LAI, Jose Miguel RODRIGUEZ, Mark Alan GRAHAM, Michael Brandon MASEDA, Martin Alan CSEH
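The reach-detection idea in the abstract above can be illustrated with a small sketch. This is not Accel Robotics' implementation, only a hedged Python example assuming per-zone distance sensors with a calibrated empty-shelf baseline; all names and thresholds are hypothetical.

```python
# Hypothetical sketch: detecting a "reach" event from a sensor bar's
# downward-facing distance sensors. Each sensor reports the distance to the
# nearest obstruction below the shelf edge; a reading well below the
# calibrated baseline (nothing in front of the shelf) suggests a hand or arm
# entering that zone. Names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class ReachEvent:
    zone: int          # index of the sensor / shelf zone
    distance_mm: float # measured distance when the reach was detected

def detect_reaches(baseline_mm, readings_mm, margin_mm=80.0):
    """Compare current readings against per-zone baselines.

    baseline_mm -- distances measured with no shopper present, one per zone
    readings_mm -- current distances, one per zone
    margin_mm   -- how much closer than baseline a reading must be to count
    """
    events = []
    for zone, (base, reading) in enumerate(zip(baseline_mm, readings_mm)):
        if reading < base - margin_mm:
            events.append(ReachEvent(zone=zone, distance_mm=reading))
    return events

if __name__ == "__main__":
    baseline = [450.0, 450.0, 460.0, 455.0]   # empty-shelf calibration
    current  = [452.0, 210.0, 458.0, 300.0]   # zones 1 and 3 obstructed
    for event in detect_reaches(baseline, current):
        print(f"possible reach into zone {event.zone} at {event.distance_mm} mm")
```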
  • Patent number: 10783491
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: September 22, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
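The quantity-sensing scheme above (distance sensors behind pusher lanes, with count changes triggering camera analysis) might look roughly like the following sketch. It assumes a known lane depth and per-item depth; these are illustrative values, not figures from the patent.

```python
# Hypothetical sketch of the quantity-sensor idea: for a pusher lane that
# keeps items pressed toward the shelf front, a rear-mounted distance sensor
# (e.g. LIDAR) sees a depth that grows by one item depth each time an item is
# taken. A change in the estimated count can then trigger camera-image
# analysis of that zone. All names and numbers are illustrative.

def estimate_count(distance_mm: float, empty_lane_mm: float, item_depth_mm: float) -> int:
    """Convert a measured depth into an item count for one lane."""
    occupied = max(empty_lane_mm - distance_mm, 0.0)
    return round(occupied / item_depth_mm)

def on_new_reading(zone: str, distance_mm: float, state: dict,
                   empty_lane_mm: float, item_depth_mm: float,
                   analyze_images) -> None:
    """Update the per-zone count and trigger image analysis when it changes."""
    new_count = estimate_count(distance_mm, empty_lane_mm, item_depth_mm)
    old_count = state.get(zone)
    if old_count is not None and new_count != old_count:
        # A shopper appears to have taken or replaced items in this zone.
        analyze_images(zone, old_count, new_count)
    state[zone] = new_count

if __name__ == "__main__":
    state = {}
    def analyze_images(zone, before, after):
        print(f"zone {zone}: count changed {before} -> {after}, analyzing camera images")
    # Lane is 600 mm deep, each item is 75 mm deep.
    on_new_reading("lane-3", 150.0, state, 600.0, 75.0, analyze_images)  # 6 items
    on_new_reading("lane-3", 225.0, state, 600.0, 75.0, analyze_images)  # 5 items -> trigger
```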
  • Publication number: 20200202177
    Abstract: A system having a bar containing distance sensors, such as LIDARs, that may be installed behind a shelf in an automated store. Shelves may be divided into storage zones, such as bins or lanes, and a distance sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. The distance sensor elements within the bar may be relocatable so they can be positioned correctly behind the corresponding storage zones. The bar may have mounting mechanisms on either side for attachment to shelving support structures. These mounting mechanisms may include security locks. The bar may rotate relative to the mounting mechanisms to provide access to the shelf from behind.
    Type: Application
    Filed: March 4, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, Aleksander BAPST, Filip PIEKNIEWSKI, Mark GRAHAM, Yohei YAMAMURO, Jose Miguel RODRIGUEZ
  • Publication number: 20200202288
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: February 28, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10621472
    Abstract: A system that facilitates rapid onboarding of an autonomous (cashier-less) store by capturing images of items in the store's catalog from different angles, with varying backgrounds and lighting conditions, and automatically building a classifier training dataset from these images. The system may have cameras in different positions, lights supporting variable illumination, and monitor screens that generate different background colors. It may have an input device such as a barcode reader, and an operator terminal that prompts operators to place items into the imaging system in the necessary orientations. Once an item is placed in the imaging system, a fully automated process may generate a sequence of background colors, a sequence of lighting conditions, and may capture and process images from all of the cameras to create training images. Training images for an item may be generated in seconds, compared to many minutes per item using manual image capture and processing.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: April 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Tanuj Pankaj, Chin-Chang Kuo
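As a rough illustration of the automated onboarding sequence described above, the sketch below cycles through background colors and lighting conditions and captures a frame from each camera, labelling every image with the scanned barcode. The device interfaces (`read_barcode`, `set_background`, `set_lighting`, `capture`) are stand-ins, not the patented system's API.

```python
# Hypothetical capture loop for building a classifier training dataset.
# For each item scanned at the imaging station, iterate over background
# colors and lighting levels and grab a frame from every camera, tagging
# each image with the item's barcode. All names are illustrative.

import itertools

BACKGROUNDS = ["white", "black", "green", "blue"]
LIGHTING = ["low", "medium", "high"]
CAMERAS = ["cam0", "cam1", "cam2", "cam3"]

def capture_training_set(read_barcode, set_background, set_lighting, capture):
    """Capture one labelled training image per (background, lighting, camera)."""
    item_id = read_barcode()
    images = []
    for background, lighting in itertools.product(BACKGROUNDS, LIGHTING):
        set_background(background)
        set_lighting(lighting)
        for camera in CAMERAS:
            images.append({
                "item_id": item_id,
                "background": background,
                "lighting": lighting,
                "camera": camera,
                "frame": capture(camera),
            })
    return images

if __name__ == "__main__":
    # Stub hardware so the sketch runs standalone.
    dataset = capture_training_set(
        read_barcode=lambda: "0123456789012",
        set_background=lambda color: None,
        set_lighting=lambda level: None,
        capture=lambda cam: f"<frame from {cam}>",
    )
    print(f"captured {len(dataset)} training images")  # 4 * 3 * 4 = 48
```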
  • Patent number: 10603794
    Abstract: A robotics camera system includes a camera robot that is in wireless communication with a robot cloud server through a cloud network such as the Internet. A user having a mobile device desires to utilize a particular camera robot. The user's mobile device may wirelessly communicate with the camera robot either directly or indirectly through the cloud network. User interaction with the camera robot may initiate a photo session and/or have the camera robot follow the user through a venue. In some embodiments the user initiates the interaction through a social media platform. In some embodiments the user initiates the interaction by utilizing a designated application that runs upon the user's mobile device. In some embodiments the user initiates the interaction with the use of a two-dimensional machine-readable code element. In some embodiments the user initiates an interaction with a plurality of robots that work collaboratively.
    Type: Grant
    Filed: October 30, 2017
    Date of Patent: March 31, 2020
    Assignee: Accel Robotics Corporation
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky
  • Patent number: 10586208
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: July 16, 2019
    Date of Patent: March 10, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Filip Piekniewski, Aleksander Bapst, Soheyl Yousefisahi, Chin-Chang Kuo
  • Publication number: 20200027218
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 23, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
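The "localize the change volume first" optimization described above can be sketched as follows: cheaply bound the region where before/after images differ, then restrict the expensive reconstruction (plane-sweep stereo or a convolutional network, not shown here) to that region. This is an illustrative NumPy sketch under those assumptions, not the patented method.

```python
# Hypothetical sketch of the localize-then-reconstruct optimization: find
# where the before/after images differ, and run the expensive
# volume-difference step only inside that bounding region. NumPy arrays
# stand in for projected camera images; thresholds are illustrative.

import numpy as np

def changed_region(before: np.ndarray, after: np.ndarray, threshold: float = 25.0):
    """Return the bounding box (rmin, rmax, cmin, cmax) of changed pixels, or None."""
    diff = np.abs(after.astype(np.float32) - before.astype(np.float32))
    mask = diff > threshold
    if not mask.any():
        return None
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return rows[0], rows[-1], cols[0], cols[-1]

def volume_difference_in_region(before, after, region):
    # Placeholder for the expensive reconstruction step, restricted to `region`.
    rmin, rmax, cmin, cmax = region
    return np.abs(after[rmin:rmax + 1, cmin:cmax + 1].astype(np.float32)
                  - before[rmin:rmax + 1, cmin:cmax + 1].astype(np.float32))

if __name__ == "__main__":
    before = np.zeros((480, 640), dtype=np.uint8)
    after = before.copy()
    after[100:140, 200:260] = 200          # an item appears to have moved here
    region = changed_region(before, after)
    if region is not None:
        print("change volume localized to", region)
        diff = volume_difference_in_region(before, after, region)
        print("reconstruction restricted to", diff.shape, "pixels")
```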
  • Publication number: 20200019921
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: July 16, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Filip PIEKNIEWSKI, Aleksander BAPST, Soheyl YOUSEFISAHI, Chin-Chang KUO
  • Publication number: 20200020113
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: September 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Publication number: 20200020112
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Application
    Filed: May 6, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Michael Brandon MASEDA, Martin Alan CSEH
  • Patent number: 10535146
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: May 6, 2019
    Date of Patent: January 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Patent number: 10373322
    Abstract: An autonomous store system that analyzes camera images to track people and their interactions with items. A processor obtains a 3D model of a store that contains items and item storage areas, receives images from cameras captured over a time period, and analyzes the images and the 3D model to detect a person in the store, calculate the person's trajectory, and identify an item storage area proximal to that trajectory during an interaction time period. The processor then analyzes two or more images, captured within or proximal in time to the interaction time period and containing views of the item storage area, to identify an item within the area that is moved during that period, and attributes the motion of the item to the person. The system also enables calibration and placement algorithms for cameras.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: August 6, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre
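The trajectory/storage-area proximity test implied above might be sketched as below: given timestamped floor positions and a shelf footprint, find the interval during which the person is within reach. Coordinates and the reach distance are illustrative assumptions, not values from the patent.

```python
# Hypothetical proximity test between a tracked trajectory and an item
# storage area. The trajectory is a list of (time, (x, y)) floor positions;
# the storage area is an axis-aligned footprint. All numbers are illustrative.

import math

def distance_to_box(point, box):
    """Euclidean distance from a 2D point to an axis-aligned box (xmin, ymin, xmax, ymax)."""
    x, y = point
    xmin, ymin, xmax, ymax = box
    dx = max(xmin - x, 0.0, x - xmax)
    dy = max(ymin - y, 0.0, y - ymax)
    return math.hypot(dx, dy)

def interaction_window(trajectory, storage_box, reach_m=0.8):
    """Return (t_start, t_end) while the trajectory is within reach of the box, or None."""
    times = [t for t, pos in trajectory if distance_to_box(pos, storage_box) <= reach_m]
    return (min(times), max(times)) if times else None

if __name__ == "__main__":
    shelf = (2.0, 0.0, 3.0, 0.5)                      # shelf footprint in metres
    trajectory = [
        (10.0, (0.5, 3.0)),
        (11.0, (1.5, 1.5)),
        (12.0, (2.4, 0.9)),                           # within reach of the shelf
        (13.0, (2.5, 1.0)),
        (14.0, (4.0, 2.5)),
    ]
    window = interaction_window(trajectory, shelf)
    print("interaction window:", window)              # (12.0, 13.0)
```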
  • Patent number: 10282720
    Abstract: A system that analyzes camera images to track a person from a point where the person obtains an authorization to a different point where the authorization is used. The authorization may be extended in time and space from the point where it was initially obtained. Scenarios enabled by embodiments include automatically opening a locked door or gate for an authorized person and automatically charging items taken by a person to that person's account. The system supports automated stores that allow users to enter, take products, and exit without explicitly paying. An illustrative application is an automated, unmanned gas station that allows a user to pay at the pump and then enter a locked on-site convenience store or a locked case with products the user can take for automatic purchase. Embodiments may also extend authorization to other people, such as occupants of the same vehicle.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
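A minimal sketch of the authorization-extension idea above, assuming an authorization record that carries a validity period and a set of unlockable areas; field names and durations are hypothetical, and the identity check is assumed to come from the camera-based tracker.

```python
# Hypothetical authorization record extended in time and space: granted once
# (e.g. when paying at a pump), it unlocks a set of areas for a limited
# period, provided the tracking system reports the same tracked person at
# the door. Field names and the 15-minute window are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Authorization:
    person_id: str
    granted_at: float                 # seconds since some epoch
    valid_for: float = 15 * 60.0      # extend the authorization for 15 minutes
    areas: set = field(default_factory=lambda: {"store-entrance", "locked-case"})

    def allows(self, person_id: str, area: str, now: float) -> bool:
        return (person_id == self.person_id
                and area in self.areas
                and self.granted_at <= now <= self.granted_at + self.valid_for)

if __name__ == "__main__":
    auth = Authorization(person_id="track-42", granted_at=1000.0)
    print(auth.allows("track-42", "store-entrance", now=1200.0))  # True
    print(auth.allows("track-42", "locked-case", now=2500.0))     # False: expired
    print(auth.allows("track-99", "store-entrance", now=1100.0))  # False: different person
```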
  • Patent number: 10282852
    Abstract: A system that analyzes camera images to track a person in an autonomous store, and to determine when a tracked person takes or moves items in the store. The system may associate a field of influence volume around a person's location; intersection of this volume with an item storage area, such as a shelf, may trigger the system to look for changes in the items on the shelf. Items that are taken from, placed on, or moved on a shelf may be determined by a neural network that processes before and after images of the shelf. Person tracking may be performed by analyzing images from fisheye ceiling cameras projected onto a plane horizontal to the floor. Projected ceiling camera images may be analyzed using a neural network trained to recognize shopper locations. The autonomous store may include modular ceiling and shelving fixtures that contain cameras, lights, processors, and networking.
    Type: Grant
    Filed: January 23, 2019
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
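The "field of influence volume" trigger above can be sketched as a sphere-versus-box intersection test: when a sphere around the tracked shopper touches a shelf's bounding volume, the system would compare before/after images of that shelf. The geometry below is purely illustrative.

```python
# Hypothetical field-of-influence trigger: model the shopper's influence as
# a sphere around their estimated position and test it against each shelf's
# axis-aligned bounding box. Sizes and coordinates are illustrative.

import math

def sphere_intersects_box(center, radius, box):
    """True if a sphere intersects an axis-aligned box ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (xmin, ymin, zmin), (xmax, ymax, zmax) = box
    # Distance from the sphere centre to the nearest point of the box.
    dx = max(xmin - center[0], 0.0, center[0] - xmax)
    dy = max(ymin - center[1], 0.0, center[1] - ymax)
    dz = max(zmin - center[2], 0.0, center[2] - zmax)
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

if __name__ == "__main__":
    shelf = ((2.0, 0.0, 0.5), (3.0, 0.4, 1.8))   # shelf volume in metres
    shopper = (2.5, 1.0, 1.2)                    # estimated shopper position
    if sphere_intersects_box(shopper, radius=0.9, box=shelf):
        print("influence volume touches the shelf: check before/after images")
```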
  • Publication number: 20180241938
    Abstract: A robot for capturing image frames of subjects in response to a request includes a base, a robot head, a camera, a vertical positioning mechanism, and a control system. The base has a transport mechanism for controllably positioning the robot along a support surface. The vertical positioning mechanism couples the robot head to the base and adjusts a vertical distance between the robot head and the support surface. The control system controls image capture of the camera, the transport mechanism, and the vertical positioning mechanism.
    Type: Application
    Filed: April 19, 2018
    Publication date: August 23, 2018
    Applicant: Accel Robotics Corporation
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky
  • Patent number: 9969080
    Abstract: A robot for capturing image frames of subjects in response to a request includes a base, a robot head, a camera, an angular positioning mechanism, a vertical positioning mechanism, and a control system. The base has a transport mechanism for controllably positioning the robot along a support surface. The angular positioning mechanism couples the camera to the robot head and controls a tilt of the camera. The vertical positioning mechanism couples the robot head to the base and adjusts a vertical distance between the robot head and the support surface. The control system controls image capture of the camera, the transport mechanism, the angular positioning mechanism, and the vertical positioning mechanism.
    Type: Grant
    Filed: August 2, 2016
    Date of Patent: May 15, 2018
    Assignee: ACCEL ROBOTICS
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky
  • Publication number: 20180043543
    Abstract: A robotics camera system includes a camera robot that is in wireless communication with a robot cloud server through a cloud network such as the Internet. A user having a mobile device desires to utilize a particular camera robot. The user's mobile device may wirelessly communicate with the camera robot either directly or indirectly through the cloud network. User interaction with the camera robot may initiate a photo session and/or have the camera robot follow the user through a venue. In some embodiments the user initiates the interaction through a social media platform. In some embodiments the user initiates the interaction by utilizing a designated application that runs upon the user's mobile device. In some embodiments the user initiates the interaction with the use of a two-dimensional machine-readable code element. In some embodiments the user initiates an interaction with a plurality of robots that work collaboratively.
    Type: Application
    Filed: October 30, 2017
    Publication date: February 15, 2018
    Applicant: Accel Robotics Corporation
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky
  • Publication number: 20180036879
    Abstract: A robot for capturing image frames of subjects in response to a request includes a base, a robot head, a camera, an angular positioning mechanism, a vertical positioning mechanism, and a control system. The base has a transport mechanism for controllably positioning the robot along a support surface. The angular positioning mechanism couples the camera to the robot head and controls a tilt of the camera. The vertical positioning mechanism couples the robot head to the base and adjusts a vertical distance between the robot head and the support surface. The control system controls image capture of the camera, the transport mechanism, the angular positioning mechanism, and the vertical positioning mechanism.
    Type: Application
    Filed: August 2, 2016
    Publication date: February 8, 2018
    Applicant: Accel Robotics Corporation
    Inventors: Marius O. Buibas, Martin A. Cseh, Michael B. Maseda, Nicholas J. Morozovsky