Patents by Inventor Marius Buibas

Marius Buibas has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11743418
    Abstract: System that facilitates rapid onboarding of an autonomous (cashier-less) store by capturing images of the store's items from multiple angles, with varying background colors, and that builds a classifier training dataset from these images. Background surfaces may for example be coated with retroreflective tape or film, and variable-color incident light sources may generate the desired background colors. Embodiments may automatically rotate or otherwise reorient the item placed in the onboarding system, so that a relatively small number of cameras can capture views from multiple angles. When an item is placed in the system, a fully automated process may generate a sequence of item orientations and background colors, and may capture and process images from the cameras to create training images. Images of the item from multiple angles, under varying lighting conditions, may be captured without requiring an operator to move or reorient the item.
    Type: Grant
    Filed: February 17, 2021
    Date of Patent: August 29, 2023
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn
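The fully automated capture sequence this abstract describes — pairing item orientations with background colors so a small number of cameras can cover all views — can be sketched as below. This is a hypothetical illustration; the function and parameter names are not from the patent.

```python
from itertools import product

def capture_plan(rotation_steps, background_colors):
    """Enumerate (orientation, color) settings for one automated onboarding pass.

    rotation_steps: number of evenly spaced item orientations (e.g. produced by
    automatically rotating the item); background_colors: colors generated by the
    variable-color incident light sources on the retroreflective background.
    One image per camera would be captured at each setting.
    """
    angles = [i * 360.0 / rotation_steps for i in range(rotation_steps)]
    return list(product(angles, background_colors))

# Four orientations times three background colors: twelve capture settings.
plan = capture_plan(4, ["red", "green", "blue"])
```

Varying the background color per capture makes it straightforward to segment the item from the background when building the training images.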
  • Patent number: 11394927
    Abstract: System that allows devices of an autonomous store, such as sensors in product display areas, to receive power and communicate data over conductive rails of store fixtures. By using fixtures to transmit data and power, the need for cabling to these devices, or for batteries to power the devices, is eliminated or greatly reduced, thereby dramatically simplifying installation and maintenance. Illustrative fixtures over which devices can communicate include slatwalls, pegboards, and rectangular support bars. Power and data may be multiplexed onto the same pair of conductive rails. Embodiments may use device hubs to coordinate communication with devices and to act as gateways between devices and centralized store servers. Devices may communicate their identities and locations to store servers to facilitate installation; for example, they may display their identities on electronic labels that are imaged by store cameras, so that the store server can learn the location of each device automatically.
    Type: Grant
    Filed: October 23, 2020
    Date of Patent: July 19, 2022
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, Aleksander Bapst, Mark Graham, Christopher Lai, Jose Miguel Rodriguez
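Multiplexing addressed data onto a shared pair of conductive rails implies some framing scheme so a hub can talk to individual devices. The patent does not disclose a frame layout; the following is an illustrative sketch with an invented format (address byte, length byte, payload, modular checksum).

```python
def encode_frame(device_addr, payload):
    """Frame an addressed message for a shared two-conductor bus.

    Layout (illustrative, not from the patent): 1 address byte, 1 length
    byte, the payload, then 1 checksum byte (sum of all prior bytes mod 256).
    """
    body = bytes([device_addr, len(payload)]) + payload
    checksum = sum(body) % 256
    return body + bytes([checksum])

def decode_frame(frame):
    """Recover (address, payload) from a frame, verifying the checksum."""
    addr, length = frame[0], frame[1]
    payload = frame[2:2 + length]
    if sum(frame[:-1]) % 256 != frame[-1]:
        raise ValueError("checksum mismatch")
    return addr, payload
```

A hub acting as a gateway would encode downstream commands this way and decode device responses before forwarding them to the store servers.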
  • Patent number: 11205094
    Abstract: System that facilitates rapid onboarding of an autonomous (cashier-less) store by capturing images of the store's items from multiple angles, with varying backgrounds, and that builds a classifier training dataset from these images. The system may have cameras in different positions, and backgrounds that generate different background colors. It may have a platform for items that can be switched between a transparent and non-transparent state; cameras below the platform may therefore capture images of the bottom side of the item when the platform is transparent. When an item is placed in the imaging system, a fully automated process may generate a sequence of background colors and may capture and process images from all of the cameras to create training images. Images of the item from multiple angles, including views of the entire external surface of the item, may be captured without requiring an operator to move or reorient the item.
    Type: Grant
    Filed: April 14, 2020
    Date of Patent: December 21, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Tanuj Pankaj, Chin-Chang Kuo
  • Patent number: 11113825
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: September 7, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
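The optimization described above — first localizing a change volume, then computing the 3D difference only inside it — can be sketched with occupancy voxels. This is a hypothetical simplification (sets of occupied voxel coordinates standing in for the reconstructed volumes); the names are illustrative.

```python
def change_volume(before, after):
    """Bounding box of voxels whose occupancy changed between two states.

    before/after: sets of (x, y, z) occupied-voxel coordinates reconstructed
    from projected camera images. Localizing the changed region first lets
    the expensive volume difference run only inside this box.
    """
    changed = before ^ after  # symmetric difference: displaced voxels
    if not changed:
        return None
    xs, ys, zs = zip(*changed)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def volume_difference(before, after, box):
    """Voxels removed and added inside the localized change volume."""
    (x0, y0, z0), (x1, y1, z1) = box
    inside = lambda v: x0 <= v[0] <= x1 and y0 <= v[1] <= y1 and z0 <= v[2] <= z1
    removed = {v for v in before - after if inside(v)}
    added = {v for v in after - before if inside(v)}
    return removed, added
```

Counting removed voxels against a known per-item volume is one way the difference could also indicate the quantity of items displaced, e.g. from a vertical stack.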
  • Patent number: 11106941
    Abstract: A system having a bar containing distance sensors, such as LIDARs, that may be installed behind a shelf in an automated store. Shelves may be divided into storage zones, such as bins or lanes, and a distance sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. The distance sensor elements within the bar may be relocatable so they can be positioned correctly behind the corresponding storage zones. The bar may have mounting mechanisms on either side for attachment to shelving support structures. These mounting mechanisms may include security locks. The bar may rotate relative to the mounting mechanisms to provide access to the shelf from behind.
    Type: Grant
    Filed: March 4, 2020
    Date of Patent: August 31, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, Aleksander Bapst, Filip Piekniewski, Mark Graham, Yohei Yamamuro, Jose Miguel Rodriguez
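The quantity measurement described above follows from simple geometry: a distance sensor behind the shelf reads the distance to the rear of the item stack, and the occupied lane depth divided by the per-item depth gives the count. A minimal sketch, with illustrative names and units:

```python
def item_count(distance_mm, lane_depth_mm, item_depth_mm):
    """Estimate how many items remain in a storage zone from a rear-mounted
    distance sensor (e.g. a LIDAR element in the sensor bar).

    distance_mm: measured distance from the sensor to the rearmost item;
    lane_depth_mm: total depth of the zone; item_depth_mm: depth of one item.
    """
    occupied_mm = lane_depth_mm - distance_mm
    return max(0, round(occupied_mm / item_depth_mm))

def quantity_changed(prev_count, new_count):
    """A nonzero delta means a shopper took or placed items in the zone,
    which would trigger analysis of camera images to identify the items."""
    return new_count - prev_count
```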
  • Patent number: 11069070
    Abstract: An autonomous store that tracks shopper movements and actions, and performs store cleaning actions based on analysis of shopper activity. Cleaning actions may include disinfecting the store or regions within the store using radiation, fogging or spraying, or ventilation. Cleaning actions may be targeted; for example, zones where shoppers linger or congregate may be cleaned more frequently or intensively, or shelves or items that shoppers touch may be cleaned after these interactions. Shopper activity information may be used to limit the number of shoppers in a store at once, for example by denying entry when the store is at capacity. The density of shoppers in regions of the store may be communicated to shoppers so that they can limit their interactions with other shoppers. Shopper activity history may be used for contact tracing by identifying other shoppers that may have been exposed to an individual who was in the store.
    Type: Grant
    Filed: August 14, 2020
    Date of Patent: July 20, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, Michael Brandon Maseda, Martin Alan Cseh, Samir Singh, Matthew Walker
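Two of the policies described above — denying entry at capacity, and targeting cleaning at frequently touched zones — reduce to simple threshold checks. A hypothetical sketch (thresholds and names are illustrative, not from the patent):

```python
def admit_shopper(current_occupancy, capacity):
    """Deny entry when the store is at capacity."""
    return current_occupancy < capacity

def zones_to_clean(touch_counts, threshold):
    """Target cleaning actions at zones shoppers touched at least
    `threshold` times since the last cleaning pass.

    touch_counts: dict mapping zone name -> observed touch count.
    """
    return sorted(zone for zone, n in touch_counts.items() if n >= threshold)
```

The same shopper-activity history that drives these checks could also support contact tracing, by recording which shoppers occupied a zone during overlapping intervals.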
  • Patent number: 11049263
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: June 29, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Patent number: 10989521
    Abstract: Data streams from multiple image sensors may be combined to form, for example, an interleaved video stream that can be used to determine the distance to an object. The video stream may be encoded using a motion estimation encoder, and the encoder's output may be processed (e.g., parsed) to extract the motion information present in the encoded video. An adaptive controller may then use this motion information to determine the depth of the visual scene, for example from the binocular disparity between two or more images, in order to detect one or more objects salient to a given task. In one variant, the depth information is utilized during control and operation of mobile robotic devices.
    Type: Grant
    Filed: December 10, 2018
    Date of Patent: April 27, 2021
    Assignee: Brain Corporation
    Inventors: Micah Richert, Marius Buibas, Vadim Polonichko
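Once motion vectors between the interleaved left and right frames yield a disparity, depth follows from the standard stereo relation Z = f·B/d (a textbook formula, not text from the patent). A minimal sketch:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth from binocular disparity: Z = f * B / d.

    disparity_px: horizontal offset between the two views, here assumed to
    come from motion vectors a motion-estimation encoder produced for an
    interleaved left/right stream; focal_px: focal length in pixels;
    baseline_m: distance between the two sensors in meters.
    """
    if disparity_px <= 0:
        return float("inf")  # zero disparity: object effectively at infinity
    return focal_px * baseline_m / disparity_px
```

The appeal of the approach is that the encoder's motion search does the expensive correspondence work as a side effect of ordinary video compression.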
  • Patent number: 10909694
    Abstract: A sensor bar shelf monitor, for example one that may be added to an existing shelving system to convert a store to autonomous operation. The sensor bar may contain distance sensors to detect shoppers reaching towards items on a shelf, and cameras to determine which items shoppers have taken. It may be installed into shelf supports such as gondola shelving uprights. The sensor bar may be located at the front edge of a shelf, and may monitor the shelf below. Placing the sensor bar along the front edge prevents damage to electronics from spills or shelf cleaning, and prevents heat from the sensor bar electronics from damaging items on the shelf. The sensor bar may have a local sensor bar processor that collects sensor data; images may be analyzed locally or transferred to more powerful store processors. A sensor bar may also have controllable lights and controllable electronic labels.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: February 2, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Aleksander Bapst, Christopher Lai, Jose Miguel Rodriguez, Mark Alan Graham, Michael Brandon Maseda, Martin Alan Cseh
  • Patent number: 10783491
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: September 22, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
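Projecting images from multiple cameras onto a common vertical plane at the shelf front, as described above, amounts to intersecting each camera ray with that plane. A hypothetical sketch in camera-agnostic coordinates (the axis convention is illustrative):

```python
def project_to_shelf_plane(camera_pos, ray_dir, plane_y=0.0):
    """Intersect a camera ray with the vertical plane at the shelf front.

    camera_pos, ray_dir: (x, y, z) tuples, with the y axis perpendicular to
    the shelf front. Rays from several cameras projected onto this common
    plane can be compared directly, simplifying analysis of what changed.
    Returns the intersection point, or None if the ray misses the plane.
    """
    cx, cy, cz = camera_pos
    dx, dy, dz = ray_dir
    if dy == 0:
        return None  # ray parallel to the plane
    t = (plane_y - cy) / dy
    if t <= 0:
        return None  # plane is behind the camera
    return (cx + t * dx, plane_y, cz + t * dz)
```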
  • Patent number: 10621472
    Abstract: System that facilitates rapid onboarding of an autonomous (cashier-less) store by capturing images of items in the store's catalog from different angles, with varying backgrounds and lighting conditions, and that automatically builds a classifier training dataset from these images. The system may have cameras in different positions, lights supporting variable illumination, and monitor screens that generate different background colors. It may have an input device such as a barcode reader, and an operator terminal that prompts operators to place items into the imaging system in the necessary orientations. Once an item is placed in the imaging system, a fully automated process may generate a sequence of background colors, a sequence of lighting conditions, and may capture and process images from all of the cameras to create training images. Training images for an item may be generated in seconds, compared to many minutes per item using manual image capture and processing.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: April 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Tanuj Pankaj, Chin-Chang Kuo
  • Patent number: 10586208
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: July 16, 2019
    Date of Patent: March 10, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Filip Piekniewski, Aleksander Bapst, Soheyl Yousefisahi, Chin-Chang Kuo
  • Patent number: 10535146
    Abstract: A projected image item tracking system that analyzes projected camera images to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed on them may then be attributed to a shopper near the area. Projected images may be combined to generate a 3D volume difference between the state of the area before and after shopper interaction. The volume difference may be calculated using plane-sweep stereo, or using convolutional neural networks. Because these methods may be computationally intensive, the system may first localize a change volume where items appear to have been displaced, and then generate a volume difference only within that change volume. This optimization results in significant savings in power consumption and in more rapid identification of items. The 3D volume difference may also indicate the quantity of items displaced, for example from a vertical stack.
    Type: Grant
    Filed: May 6, 2019
    Date of Patent: January 14, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
  • Patent number: 10373322
    Abstract: An autonomous store system that analyzes camera images to track people and their interactions with items. A processor obtains a 3D model of a store that contains items and item storage areas, receives images captured by cameras over a time period, and analyzes the images and the 3D model to detect a person in the store and calculate that person's trajectory. The system identifies an item storage area proximal to the trajectory during an interaction time period, then analyzes two or more images — captured within or proximal in time to that period and containing views of the storage area — to identify an item that is moved during the interaction, and attributes the motion of the item to the person. The system also enables calibration and placement algorithms for the cameras.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: August 6, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre
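The step of identifying storage areas proximal to a person's trajectory reduces to a distance test between trajectory samples and storage-area locations. A hypothetical 2D sketch (the patent works with a full 3D store model; names and the radius parameter are illustrative):

```python
def storage_areas_near_trajectory(trajectory, areas, radius):
    """Names of storage areas that come within `radius` of the trajectory.

    trajectory: list of (t, x, y) samples of the tracked person's position;
    areas: dict mapping storage-area name -> (x, y) centroid.
    """
    near = set()
    for _, px, py in trajectory:
        for name, (ax, ay) in areas.items():
            if (px - ax) ** 2 + (py - ay) ** 2 <= radius ** 2:
                near.add(name)
    return sorted(near)
```

Only images of these proximal areas, captured around the interaction time period, then need to be analyzed for moved items.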
  • Publication number: 20190178631
    Abstract: Data streams from multiple image sensors may be combined to form, for example, an interleaved video stream that can be used to determine the distance to an object. The video stream may be encoded using a motion estimation encoder, and the encoder's output may be processed (e.g., parsed) to extract the motion information present in the encoded video. An adaptive controller may then use this motion information to determine the depth of the visual scene, for example from the binocular disparity between two or more images, in order to detect one or more objects salient to a given task. In one variant, the depth information is utilized during control and operation of mobile robotic devices.
    Type: Application
    Filed: December 10, 2018
    Publication date: June 13, 2019
    Inventors: Micah Richert, Marius Buibas, Vadim Polonichko
  • Patent number: 10282720
    Abstract: A system that analyzes camera images to track a person from a point where the person obtains an authorization to a different point where the authorization is used. The authorization may be extended in time and space from the point where it was initially obtained. Scenarios enabled by embodiments include automatically opening a locked door or gate for an authorized person and automatically charging items taken by a person to that person's account. The system supports automated stores that allow users to enter, take products, and exit without explicitly paying. An illustrative application is an automated, unmanned gas station that allows a user to pay at the pump and then enter a locked on-site convenience store, or a locked case with products the user can take for automatic purchase. Embodiments may also extend authorization to other people, such as occupants of the same vehicle.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
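Extending an authorization in time and space from the point where it was obtained can be modeled as a validity window around that point. The patent fixes no particular bounds; the limits below are purely illustrative.

```python
def authorization_valid(auth_time, auth_point, query_time, query_point,
                        max_seconds, max_distance):
    """Check whether an authorization obtained at (auth_time, auth_point),
    e.g. by paying at a gas pump, still covers a later use at
    (query_time, query_point), e.g. a locked convenience-store door.

    Points are (x, y) tuples; the authorization holds if the use occurs
    within max_seconds and within max_distance of where it was granted.
    """
    dt = query_time - auth_time
    dx = query_point[0] - auth_point[0]
    dy = query_point[1] - auth_point[1]
    return 0 <= dt <= max_seconds and dx * dx + dy * dy <= max_distance ** 2
```

Camera-based tracking is what ties the person at the query point back to the identity that obtained the authorization.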
  • Patent number: 10282852
    Abstract: A system that analyzes camera images to track a person in an autonomous store, and to determine when a tracked person takes or moves items in the store. The system may associate a field of influence volume around a person's location; intersection of this volume with an item storage area, such as a shelf, may trigger the system to look for changes in the items on the shelf. Items that are taken from, placed on, or moved on a shelf may be determined by a neural network that processes before and after images of the shelf. Person tracking may be performed by analyzing images from fisheye ceiling cameras projected onto a plane horizontal to the floor. Projected ceiling camera images may be analyzed using a neural network trained to recognize shopper locations. The autonomous store may include modular ceiling and shelving fixtures that contain cameras, lights, processors, and networking.
    Type: Grant
    Filed: January 23, 2019
    Date of Patent: May 7, 2019
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Michael Brandon Maseda, Martin Alan Cseh
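The trigger described above — intersecting a person's field-of-influence volume with an item storage area — can be sketched as a sphere-versus-box test, modeling the influence volume (illustratively; the patent does not prescribe a shape) as a sphere around the person and the shelf as an axis-aligned box.

```python
def influence_intersects_shelf(person_xyz, influence_radius, shelf_min, shelf_max):
    """True if the sphere of `influence_radius` around the tracked person
    intersects the shelf's axis-aligned bounding box (shelf_min/shelf_max
    are opposite corners). An intersection would trigger a before/after
    comparison of the shelf images to find items taken, placed, or moved.
    """
    dist_sq = 0.0
    for p, lo, hi in zip(person_xyz, shelf_min, shelf_max):
        nearest = min(max(p, lo), hi)  # clamp the point to the box
        dist_sq += (p - nearest) ** 2
    return dist_sq <= influence_radius ** 2
```

The clamp-and-distance form is the standard closest-point test between a point and a box, so the check is cheap enough to run per person, per shelf, per frame.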
  • Patent number: 10184787
    Abstract: Data streams from multiple image sensors may be combined to form, for example, an interleaved video stream that can be used to determine the distance to an object. The video stream may be encoded using a motion estimation encoder, and the encoder's output may be processed (e.g., parsed) to extract the motion information present in the encoded video. An adaptive controller may then use this motion information to determine the depth of the visual scene, for example from the binocular disparity between two or more images, in order to detect one or more objects salient to a given task. In one variant, the depth information is utilized during control and operation of mobile robotic devices.
    Type: Grant
    Filed: April 9, 2018
    Date of Patent: January 22, 2019
    Assignee: Brain Corporation
    Inventors: Micah Richert, Marius Buibas, Vadim Polonichko
  • Publication number: 20180299258
    Abstract: Data streams from multiple image sensors may be combined to form, for example, an interleaved video stream that can be used to determine the distance to an object. The video stream may be encoded using a motion estimation encoder, and the encoder's output may be processed (e.g., parsed) to extract the motion information present in the encoded video. An adaptive controller may then use this motion information to determine the depth of the visual scene, for example from the binocular disparity between two or more images, in order to detect one or more objects salient to a given task. In one variant, the depth information is utilized during control and operation of mobile robotic devices.
    Type: Application
    Filed: April 9, 2018
    Publication date: October 18, 2018
    Inventors: Micah Richert, Marius Buibas, Vadim Polonichko
  • Patent number: 9939253
    Abstract: Data streams from multiple image sensors may be combined to form, for example, an interleaved video stream that can be used to determine the distance to an object. The video stream may be encoded using a motion estimation encoder, and the encoder's output may be processed (e.g., parsed) to extract the motion information present in the encoded video. An adaptive controller may then use this motion information to determine the depth of the visual scene, for example from the binocular disparity between two or more images, in order to detect one or more objects salient to a given task. In one variant, the depth information is utilized during control and operation of mobile robotic devices.
    Type: Grant
    Filed: May 22, 2014
    Date of Patent: April 10, 2018
    Assignee: BRAIN CORPORATION
    Inventors: Micah Richert, Marius Buibas, Vadim Polonichko