Patents by Inventor Filip Piekniewski

Filip Piekniewski has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11106941
    Abstract: A system having a bar containing distance sensors, such as LIDARs, that may be installed behind a shelf in an automated store. Shelves may be divided into storage zones, such as bins or lanes, and a distance sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. The distance sensor elements within the bar may be relocatable so they can be positioned correctly behind the corresponding storage zones. The bar may have mounting mechanisms on either side for attachment to shelving support structures. These mounting mechanisms may include security locks. The bar may rotate relative to the mounting mechanisms to provide access to the shelf from behind.
    Type: Grant
    Filed: March 4, 2020
    Date of Patent: August 31, 2021
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, Aleksander Bapst, Filip Piekniewski, Mark Graham, Yohei Yamamuro, Jose Miguel Rodriguez
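The abstract above describes inferring item quantity in a storage zone from a distance reading taken behind the shelf: the occupied depth is the shelf depth minus the measured gap, and dividing by a nominal item depth gives a count. A minimal sketch of that arithmetic, with all function names and the per-zone geometry invented here for illustration (the patent does not specify an algorithm):

```python
def estimate_quantity(distance_mm, shelf_depth_mm, item_depth_mm):
    """Estimate the item count in one storage zone from a rear-mounted
    distance sensor reading (gap from the sensor to the rearmost item)."""
    occupied_mm = max(0.0, shelf_depth_mm - distance_mm)
    return round(occupied_mm / item_depth_mm)

def quantity_change(prev_mm, curr_mm, shelf_depth_mm, item_depth_mm):
    """Signed change in item quantity between two sensor readings;
    a negative value indicates items were taken from the zone."""
    return (estimate_quantity(curr_mm, shelf_depth_mm, item_depth_mm)
            - estimate_quantity(prev_mm, shelf_depth_mm, item_depth_mm))
```

A nonzero result from `quantity_change` would be the kind of event that, per the abstract, triggers camera-image analysis of the shelf.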
  • Patent number: 11042775
    Abstract: A data processing apparatus may utilize an artificial neuron network configured to reduce dimensionality of input data using a sparse transformation configured using the receptive field structure of network units. Output of the network may be analyzed for temporal persistency, which is characterized by a similarity matrix. Elements of the matrix may be incremented when a presently active unit was also active at a preceding frame. The similarity matrix may be partitioned based on a distance measure for a given element of the matrix and its closest neighbors. Stability of learning of temporally proximal patterns may be greatly improved as the similarity matrix is learned independently of the partitioning operation. Partitioning of the similarity matrix using the methodology of the disclosure may be performed online, e.g., contemporaneously with the encoding and/or similarity matrix construction, thereby enabling learning of new features in the input data.
    Type: Grant
    Filed: June 20, 2016
    Date of Patent: June 22, 2021
    Assignee: Brain Corporation
    Inventors: Micah Richert, Filip Piekniewski
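The similarity-matrix update in the abstract above is a simple counting rule: an element is incremented when a unit active in the current frame was preceded by an active unit in the previous frame. A toy sketch of that accumulation, assuming active units are given as sets of indices per frame (the sparse encoding and partitioning steps are not reproduced here):

```python
def update_similarity(sim, prev_active, curr_active):
    """Increment sim[i][j] whenever unit i is active in the current frame
    and unit j was active in the preceding frame (temporal persistency)."""
    for i in curr_active:
        for j in prev_active:
            sim[i][j] += 1
    return sim

# Accumulate over a short sequence of active-unit sets, one set per frame.
sim = [[0] * 4 for _ in range(4)]
frames = [{0, 1}, {1, 2}, {2, 3}]
for prev, curr in zip(frames, frames[1:]):
    update_similarity(sim, prev, curr)
```

Because the matrix is accumulated independently of any partitioning, a clustering pass over `sim` can run online, as the abstract notes, without disturbing the counts.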
  • Patent number: 10967519
    Abstract: Systems and methods for automatic detection of spills are disclosed. In some exemplary implementations, a robot can have a spill detector comprising at least one optical imaging device configured to capture at least one image of a scene containing a spill while the robot moves between locations. The robot can process the at least one image by segmentation. Once the spill has been identified, the robot can then generate an alert indicative at least in part of a recognition of the spill.
    Type: Grant
    Filed: September 25, 2019
    Date of Patent: April 6, 2021
    Assignee: Brain Corporation
    Inventors: Dimitry Fisher, Cody Griffin, Micah Richert, Filip Piekniewski, Eugene Izhikevich, Jayram Moorkanikara Nageswaran, John Black
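The spill-detection pipeline above reduces to segmenting an image and alerting when a segmented region is found. A minimal sketch using brightness thresholding as a stand-in segmentation (the patent does not commit to a specific segmentation method; the floor model and thresholds here are invented):

```python
def segment_spill(image, floor_value, tolerance):
    """Binary segmentation: mark pixels whose brightness deviates from the
    expected floor appearance by more than a tolerance."""
    return [[abs(px - floor_value) > tolerance for px in row] for row in image]

def spill_alert(mask, min_pixels=3):
    """Generate an alert when the segmented region is large enough
    to plausibly be a spill rather than sensor noise."""
    return sum(sum(row) for row in mask) >= min_pixels
```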
  • Patent number: 10860882
    Abstract: Apparatus and methods for detecting and utilizing saliency in digital images. In one implementation, salient objects may be detected based on analysis of pixel characteristics. Least frequently occurring pixel values may be deemed as salient. Pixel values in an image may be compared to a reference. Color distance may be determined based on a difference between reference color and pixel color. Individual image channels may be scaled when determining saliency in a multi-channel image. Multiple saliency maps may be additively or multiplicatively combined in order to improve detection performance (e.g., reduce the number of false positives). Areas of high saliency may be analyzed to determine object position, shape, and/or color. Methodologies described herein may enable robust tracking of objects utilizing fewer determination resources. Efficient implementation of the methods described below may allow them to be used, for example, on board a robot (or autonomous vehicle) or a mobile determining platform.
    Type: Grant
    Filed: July 23, 2018
    Date of Patent: December 8, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher
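The core idea in the abstract above, that the least frequently occurring pixel values are the most salient, can be sketched directly from a pixel histogram. A toy single-channel version, with the rarity measure (one minus relative frequency) chosen here for illustration rather than taken from the patent:

```python
from collections import Counter

def saliency_map(pixels):
    """Assign higher saliency to rarer pixel values: saliency is one minus
    the value's relative frequency over the whole image."""
    freq = Counter(px for row in pixels for px in row)
    total = sum(freq.values())
    return [[1.0 - freq[px] / total for px in row] for px in pixels] if False else \
           [[1.0 - freq[px] / total for px in row] for row in pixels]
```

For a multi-channel image, per the abstract, each channel's map could be scaled and the maps combined additively or multiplicatively before thresholding.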
  • Patent number: 10818016
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Grant
    Filed: March 19, 2019
    Date of Patent: October 27, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
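The abstract above describes a hierarchy of associative memory units where higher levels hold more abstract data and levels exchange contextual signals. A toy sketch of that data flow, with the "abstraction" reduced to down-sampling purely for illustration (class and method names are invented, not from the patent):

```python
class MemoryUnit:
    """Toy stand-in for an associative memory unit: it summarizes its
    input for the level above and stores context received from above."""
    def __init__(self, name):
        self.name = name
        self.context = None

    def forward(self, pattern):
        # Crude abstraction: halve the resolution at each level.
        return pattern[::2]

    def receive_context(self, context):
        self.context = context

def run_hierarchy(units, pattern):
    """Pass a pattern up the hierarchy, then feed each level's more
    abstract output back down as context to the level below it."""
    outputs = []
    for unit in units:
        pattern = unit.forward(pattern)
        outputs.append(pattern)
    for lower, upper_out in zip(units, outputs[1:]):
        lower.receive_context(upper_out)
    return outputs
```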
  • Patent number: 10810456
    Abstract: Apparatus and methods for detecting and utilizing saliency in digital images. In one implementation, salient objects may be detected based on analysis of pixel characteristics. Least frequently occurring pixel values may be deemed as salient. Pixel values in an image may be compared to a reference. Color distance may be determined based on a difference between reference color and pixel color. Individual image channels may be scaled when determining saliency in a multi-channel image. Multiple saliency maps may be additively or multiplicatively combined in order to improve detection performance (e.g., reduce the number of false positives). Areas of high saliency may be analyzed to determine object position, shape, and/or color. Methodologies described herein may enable robust tracking of objects utilizing fewer determination resources. Efficient implementation of the methods described below may allow them to be used, for example, on board a robot (or autonomous vehicle) or a mobile determining platform.
    Type: Grant
    Filed: January 15, 2018
    Date of Patent: October 20, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher
  • Patent number: 10728436
    Abstract: An optical object detection apparatus and associated methods. The apparatus may comprise a lens (e.g., a fixed-focal-length wide aperture lens) and an image sensor. The fixed focal length of the lens may correspond to a depth of field area in front of the lens. When an object enters the depth of field area (e.g., due to a relative motion between the object and the lens), the object representation on the image sensor plane may be in focus. Objects outside the depth of field area may be out of focus. In-focus representations of objects may be characterized by a greater contrast parameter compared to out-of-focus representations. One or more images provided by the detection apparatus may be analyzed in order to determine useful information (e.g., an image contrast parameter) of a given image. Based on the image contrast meeting one or more criteria, a detection indication may be produced.
    Type: Grant
    Filed: December 18, 2017
    Date of Patent: July 28, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Vadim Polonichko, Eugene Izhikevich
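The detection principle in the abstract above is that an object entering the fixed depth-of-field zone comes into focus and therefore raises image contrast above a criterion. A minimal sketch, using mean absolute horizontal gradient as an assumed contrast measure (the patent does not specify which contrast parameter is used):

```python
def contrast(image):
    """Mean absolute horizontal gradient: a simple proxy for image
    contrast. In-focus objects produce sharp edges and high values."""
    total, count = 0, 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
            count += 1
    return total / count

def object_in_depth_of_field(image, threshold=50):
    """Produce a detection indication when contrast meets the criterion."""
    return contrast(image) >= threshold
```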
  • Publication number: 20200202177
    Abstract: A system having a bar containing distance sensors, such as LIDARs, that may be installed behind a shelf in an automated store. Shelves may be divided into storage zones, such as bins or lanes, and a distance sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. The distance sensor elements within the bar may be relocatable so they can be positioned correctly behind the corresponding storage zones. The bar may have mounting mechanisms on either side for attachment to shelving support structures. These mounting mechanisms may include security locks. The bar may rotate relative to the mounting mechanisms to provide access to the shelf from behind.
    Type: Application
    Filed: March 4, 2020
    Publication date: June 25, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, Aleksander BAPST, Filip PIEKNIEWSKI, Mark GRAHAM, Yohei YAMAMURO, Jose Miguel RODRIGUEZ
  • Patent number: 10657409
    Abstract: Methods and apparatus for tracking and discerning objects using their saliency. In one embodiment of the present disclosure, the tracking of objects is based on a combination of object saliency and additional sources of signal about object identity. Under certain simplifying assumptions, the present disclosure allows for robust tracking of simple objects with limited processing resources. In one or more variants, efficient implementation of the methods described allows sensors (e.g., cameras) to be used on board a robot (or autonomous vehicle) on a mobile determining platform, such as to capture images to determine the presence and/or identity of salient objects. Such determination of salient objects allows for, e.g., adjustments to a vehicle's or other moving object's trajectory.
    Type: Grant
    Filed: March 4, 2019
    Date of Patent: May 19, 2020
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert
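The tracking approach above combines a saliency signal with an additional identity signal. A toy sketch of one such combination, filtering salient candidates by color distance from a tracked reference color and then taking the most salient survivor (the specific combination rule and tolerance are assumptions, not taken from the patent):

```python
def track_object(candidates, ref_color, max_color_dist=60):
    """candidates: (saliency, (r, g, b)) tuples. Keep only candidates
    whose color is close enough to the tracked object's reference color,
    then pick the most salient of those; None if nothing matches."""
    def color_dist(c1, c2):
        return sum(abs(a - b) for a, b in zip(c1, c2))
    matches = [c for c in candidates
               if color_dist(c[1], ref_color) <= max_color_dist]
    return max(matches, key=lambda c: c[0]) if matches else None
```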
  • Publication number: 20200086494
    Abstract: Systems and methods for automatic detection of spills are disclosed. In some exemplary implementations, a robot can have a spill detector comprising at least one optical imaging device configured to capture at least one image of a scene containing a spill while the robot moves between locations. The robot can process the at least one image by segmentation. Once the spill has been identified, the robot can then generate an alert indicative at least in part of a recognition of the spill.
    Type: Application
    Filed: September 25, 2019
    Publication date: March 19, 2020
    Inventors: Dimitry Fisher, Cody Griffin, Micah Richert, Filip Piekniewski, Eugene Izhikevich, Jayram Moorkanikara Nageswaran, John Black
  • Patent number: 10586208
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Grant
    Filed: July 16, 2019
    Date of Patent: March 10, 2020
    Assignee: ACCEL ROBOTICS CORPORATION
    Inventors: Marius Buibas, John Quinn, Kaylee Feigum, Csaba Petre, Filip Piekniewski, Aleksander Bapst, Soheyl Yousefisahi, Chin-Chang Kuo
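The system above attributes a quantity change in a storage zone to a shopper near the area. A minimal sketch of that attribution step, assuming shopper positions are already tracked in shelf coordinates (the event schema and nearest-shopper rule are invented here for illustration):

```python
def nearest_shopper(zone_xy, shoppers):
    """Attribute a quantity-change event in a storage zone to the
    shopper standing closest to it (squared Euclidean distance)."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(shoppers, key=lambda s: sq_dist(s["pos"], zone_xy))

def on_quantity_change(zone_xy, delta, shoppers):
    """Build an event record: who took (delta < 0) or replaced
    (delta > 0) how many items at which zone."""
    shopper = nearest_shopper(zone_xy, shoppers)
    action = "take" if delta < 0 else "put"
    return {"shopper": shopper["id"], "action": action, "count": abs(delta)}
```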
  • Publication number: 20200019921
    Abstract: A system that integrates camera images and quantity sensors to determine items taken from, placed on, or moved on a shelf or other area in an autonomous store. The items and actions performed may then be attributed to a shopper near the area. Shelves may be divided into storage zones, such as bins or lanes, and a quantity sensor may measure the item quantity in each zone. Quantity changes indicate that a shopper has taken or placed items in the zone. Distance sensors, such as LIDAR, may be used for shelves that push items towards the front. Strain gauges may be used for bins or hanging rods. Quantity changes may trigger analysis of camera images of the shelf to identify the items taken or replaced. Images from multiple cameras that view a shelf may be projected to a vertical plane at the front of the shelf to simplify analysis.
    Type: Application
    Filed: July 16, 2019
    Publication date: January 16, 2020
    Applicant: ACCEL ROBOTICS CORPORATION
    Inventors: Marius BUIBAS, John QUINN, Kaylee FEIGUM, Csaba PETRE, Filip PIEKNIEWSKI, Aleksander BAPST, Soheyl YOUSEFISAHI, Chin-Chang KUO
  • Patent number: 10464213
    Abstract: Systems and methods for automatic detection of spills are disclosed. In some exemplary implementations, a robot can have a spill detector comprising at least one optical imaging device configured to capture at least one image of a scene containing a spill while the robot moves between locations. The robot can process the at least one image by segmentation. Once the spill has been identified, the robot can then generate an alert indicative at least in part of a recognition of the spill.
    Type: Grant
    Filed: June 4, 2018
    Date of Patent: November 5, 2019
    Assignee: Brain Corporation
    Inventors: Dimitry Fisher, Cody Griffin, Micah Richert, Filip Piekniewski, Eugene Izhikevich, Jayram Moorkanikara Nageswaran, John Black
  • Publication number: 20190251386
    Abstract: Methods and apparatus for tracking and discerning objects using their saliency. In one embodiment of the present disclosure, the tracking of objects is based on a combination of object saliency and additional sources of signal about object identity. Under certain simplifying assumptions, the present disclosure allows for robust tracking of simple objects with limited processing resources. In one or more variants, efficient implementation of the methods described allows sensors (e.g., cameras) to be used on board a robot (or autonomous vehicle) on a mobile determining platform, such as to capture images to determine the presence and/or identity of salient objects. Such determination of salient objects allows for, e.g., adjustments to a vehicle's or other moving object's trajectory.
    Type: Application
    Filed: March 4, 2019
    Publication date: August 15, 2019
    Inventors: Filip Piekniewski, Micah Richert
  • Publication number: 20190244365
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Application
    Filed: March 19, 2019
    Publication date: August 8, 2019
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
  • Patent number: 10282849
    Abstract: Systems and methods for predictive/reconstructive visual object tracking are disclosed. The visual object tracking has advanced abilities to track objects in scenes, which can have a variety of applications as discussed in this disclosure. In some exemplary implementations, a visual system can comprise a plurality of associative memory units, wherein each associative memory unit has a plurality of layers. The associative memory units can be communicatively coupled to each other in a hierarchical structure, wherein data in associative memory units in higher levels of the hierarchical structure are more abstract than lower associative memory units. The associative memory units can communicate to one another supplying contextual data.
    Type: Grant
    Filed: June 19, 2017
    Date of Patent: May 7, 2019
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher, Patryk Laurent, Csaba Petre
  • Patent number: 10268919
    Abstract: Methods and apparatus for tracking and discerning objects using their saliency. In one embodiment of the present disclosure, the tracking of objects is based on a combination of object saliency and additional sources of signal about object identity. Under certain simplifying assumptions, the present disclosure allows for robust tracking of simple objects with limited processing resources. In one or more variants, efficient implementation of the methods described allows sensors (e.g., cameras) to be used on board a robot (or autonomous vehicle) on a mobile determining platform, such as to capture images to determine the presence and/or identity of salient objects. Such determination of salient objects allows for, e.g., adjustments to a vehicle's or other moving object's trajectory.
    Type: Grant
    Filed: September 21, 2015
    Date of Patent: April 23, 2019
    Assignee: Brain Corporation
    Inventors: Filip Piekniewski, Micah Richert
  • Publication number: 20190061160
    Abstract: Systems and methods for automatic detection of spills are disclosed. In some exemplary implementations, a robot can have a spill detector comprising at least one optical imaging device configured to capture at least one image of a scene containing a spill while the robot moves between locations. The robot can process the at least one image by segmentation. Once the spill has been identified, the robot can then generate an alert indicative at least in part of a recognition of the spill.
    Type: Application
    Filed: June 4, 2018
    Publication date: February 28, 2019
    Inventors: Dimitry Fisher, Cody Griffin, Micah Richert, Filip Piekniewski, Eugene Izhikevich, Jayram Moorkanikara Nageswaran, John Black
  • Patent number: 10210452
    Abstract: Apparatus and methods for a high-level neuromorphic network description (HLND) framework that may be configured to enable users to define neuromorphic network architectures using a unified and unambiguous representation that is both human-readable and machine-interpretable. The framework may be used to define node types, node-to-node connection types, instantiate node instances for different node types, and to generate instances of connection types between these nodes. To facilitate framework usage, the HLND format may provide the flexibility required by computational neuroscientists and, at the same time, provide a user-friendly interface for users with limited experience in modeling neurons. The HLND kernel may comprise an interface to the Elementary Network Description (END) that is optimized for efficient representation of neuronal systems in a hardware-independent manner and enables seamless translation of HLND model description into hardware instructions for execution by various processing modules.
    Type: Grant
    Filed: March 15, 2012
    Date of Patent: February 19, 2019
    Assignee: QUALCOMM Incorporated
    Inventors: Botond Szatmary, Eugene M. Izhikevich, Csaba Petre, Jayram Moorkanikara Nageswaran, Filip Piekniewski
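The HLND framework above separates node types, connection types, and their instantiation. The actual HLND syntax is not reproduced in the abstract, so the following is a purely hypothetical sketch of that separation using plain Python data, with every field name invented here; it illustrates only the kind of consistency checking a description kernel might perform before translating a model to hardware instructions:

```python
# Hypothetical network description in the spirit of HLND; all names
# and fields below are invented for illustration.
network = {
    "node_types": {
        "exc": {"model": "izhikevich", "params": {"a": 0.02, "b": 0.2}},
        "inh": {"model": "izhikevich", "params": {"a": 0.10, "b": 0.2}},
    },
    "populations": [
        {"name": "E", "type": "exc", "n": 800},
        {"name": "I", "type": "inh", "n": 200},
    ],
    "connection_types": [
        {"from": "E", "to": "I", "prob": 0.1, "weight": 0.5},
        {"from": "I", "to": "E", "prob": 0.2, "weight": -1.0},
    ],
}

def validate(net):
    """Check that populations reference declared node types and that
    connections reference declared populations."""
    pops = {p["name"] for p in net["populations"]}
    types_ok = all(p["type"] in net["node_types"] for p in net["populations"])
    conns_ok = all(c["from"] in pops and c["to"] in pops
                   for c in net["connection_types"])
    return types_ok and conns_ok
```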
  • Publication number: 20190043208
    Abstract: Apparatus and methods for detecting and utilizing saliency in digital images. In one implementation, salient objects may be detected based on analysis of pixel characteristics. Least frequently occurring pixel values may be deemed as salient. Pixel values in an image may be compared to a reference. Color distance may be determined based on a difference between reference color and pixel color. Individual image channels may be scaled when determining saliency in a multi-channel image. Multiple saliency maps may be additively or multiplicatively combined in order to improve detection performance (e.g., reduce the number of false positives). Areas of high saliency may be analyzed to determine object position, shape, and/or color. Methodologies described herein may enable robust tracking of objects utilizing fewer determination resources. Efficient implementation of the methods described below may allow them to be used, for example, on board a robot (or autonomous vehicle) or a mobile determining platform.
    Type: Application
    Filed: July 23, 2018
    Publication date: February 7, 2019
    Inventors: Filip Piekniewski, Micah Richert, Dimitry Fisher