Patents by Inventor Barry James O'Brien
Barry James O'Brien has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12300106
Abstract: An edge device generates an exit event for a vehicle exiting a parking facility. The edge device determines whether the exit event matches an entry event. Responsive to determining that the exit event does not match an entry event, the edge device inputs images of the vehicle into a supervised machine learning model and receives, as output from the model, an exit feature vector. The edge device retrieves entry feature vectors corresponding to hanging entry events. A hanging entry event is an entry event for a vehicle with an unknown vehicle identifier. The edge device inputs the exit feature vector and the entry feature vectors into an unsupervised machine learning model and receives, as output from the model, matching scores for each entry feature vector. The edge device matches the exit event to one of the hanging entry events based on the matching scores.
Type: Grant
Filed: June 6, 2023
Date of Patent: May 13, 2025
Assignee: Metropolis IP Holdings, LLC
Inventors: Ji Sung Hwang, Anil Kumar Nayak, Barry James O'Brien, Kaleb-John Seijin Loo, Alexander David Israel
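The abstract above describes scoring an unmatched exit feature vector against the feature vectors of hanging entry events and matching on the best score. As a minimal illustrative sketch only, not the patented method, the scoring step could be approximated with cosine similarity; the 128-dimensional embeddings, threshold value, and function names below are assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_exit_to_hanging_entries(exit_vec, hanging_entries, threshold=0.85):
    """Score an exit feature vector against each hanging entry's feature vector
    and return the best-matching entry id, or None if no score clears the
    (assumed) threshold."""
    scores = {entry_id: cosine_similarity(exit_vec, entry_vec)
              for entry_id, entry_vec in hanging_entries.items()}
    if not scores:
        return None, scores
    best_id = max(scores, key=scores.get)
    return (best_id if scores[best_id] >= threshold else None), scores

# Toy usage with random 128-d embeddings (the dimension is an assumption).
rng = np.random.default_rng(0)
entries = {"entry-17": rng.normal(size=128), "entry-42": rng.normal(size=128)}
exit_vec = entries["entry-42"] + 0.05 * rng.normal(size=128)  # noisy copy of one entry
best, scores = match_exit_to_hanging_entries(exit_vec, entries)
print(best, {k: round(v, 3) for k, v in scores.items()})
```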
-
Publication number: 20250003827
Abstract: An edge device receives sensor data from a sensor affixed to a moveable gate. The edge device determines the positional state of the moveable gate based on the sensor data by inputting the received data into a machine learning model or by comparing the sensor data to values associated with a positional state through a calibration process. The edge device stores a log that associates the positional state and sensor data. The edge device determines the health state of the moveable gate using a machine learning model that is trained to predict, based on input of a new log, the health state of the gate. Responsive to determining that the health state of the gate is unhealthy, the edge device triggers a remedial action.
Type: Application
Filed: September 4, 2024
Publication date: January 2, 2025
Inventors: Barry James O'Brien, Leah Sardone, Zachary James Thompson, Kyle Bradley Kufalk, Luis Felipe Rodriguez Herrera, Antonio Ortega, Alexander David Israel
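The gate-health determination above is driven by a trained model over logged positional states. Below is a minimal sketch of the kind of log-derived signal such a model might consume, assuming the log records timestamped positional states; the threshold heuristic stands in for the trained model and the field names are assumptions, not the patented design.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class GateLogEntry:
    timestamp: float        # seconds since some epoch
    positional_state: str   # "open", "closed", or "moving"

def cycle_durations(log):
    """Duration of each 'moving' phase (gate travelling between states)."""
    durations, start = [], None
    for prev, curr in zip(log, log[1:]):
        if curr.positional_state == "moving" and prev.positional_state != "moving":
            start = curr.timestamp
        elif prev.positional_state == "moving" and curr.positional_state != "moving" and start is not None:
            durations.append(curr.timestamp - start)
            start = None
    return durations

def health_state(log, nominal_travel_s=4.0, tolerance_s=1.5):
    """Label the gate 'unhealthy' if its average travel time drifts far from
    a nominal value; a trained model would replace this simple heuristic."""
    durations = cycle_durations(log)
    if not durations:
        return "unknown"
    return "unhealthy" if abs(mean(durations) - nominal_travel_s) > tolerance_s else "healthy"
```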
-
Publication number: 20240412314
Abstract: A device detects, using input from one or more sensors installed at a parking facility of a plurality of parking facilities, an infraction caused by a vehicle. Responsive to detecting the infraction, the device generates a vehicle fingerprint by inputting a depiction of the vehicle into a supervised machine learning model, the depiction derived from one or more images of the vehicle captured at the parking facility, and receiving a feature vector of the vehicle as output from the supervised machine learning model, the feature vector comprising a plurality of embeddings each describing a dimension of the vehicle. The device monitors for entry of the vehicle at each of the plurality of parking facilities using the vehicle fingerprint, and, responsive to detecting entry of the vehicle at a given one of the plurality of parking facilities, triggers a remediation action.
Type: Application
Filed: June 6, 2023
Publication date: December 12, 2024
Inventors: Ji Sung Hwang, Anil Kumar Nayak, Barry James O'Brien, Kaleb-John Seijin Loo, Owen Grace Wise Sanford, Alexander David Israel
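The abstract describes generating a fingerprint (an embedding feature vector) for an offending vehicle and monitoring for its re-entry across facilities. The sketch below, assuming normalized embeddings and an arbitrary similarity threshold, illustrates one way a shared watchlist lookup could work; it is not the patented implementation.

```python
import numpy as np

class InfractionWatchlist:
    """Hypothetical fingerprint watchlist shared across parking facilities."""

    def __init__(self, threshold: float = 0.9):
        self.threshold = threshold
        self.fingerprints = {}  # infraction id -> unit-norm embedding

    def add(self, infraction_id: str, fingerprint: np.ndarray) -> None:
        """Register the fingerprint recorded when the infraction was detected."""
        self.fingerprints[infraction_id] = fingerprint / np.linalg.norm(fingerprint)

    def check_entry(self, entry_fingerprint: np.ndarray):
        """Return the matching infraction id if the entering vehicle's fingerprint
        is close enough to a stored one, else None; the caller would then
        trigger the remediation action."""
        v = entry_fingerprint / np.linalg.norm(entry_fingerprint)
        for infraction_id, stored in self.fingerprints.items():
            if float(np.dot(v, stored)) >= self.threshold:
                return infraction_id
        return None
```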
-
Publication number: 20240412634
Abstract: An edge device generates an exit event for a vehicle exiting a parking facility. The edge device determines whether the exit event matches an entry event. Responsive to determining that the exit event does not match an entry event, the edge device inputs images of the vehicle into a supervised machine learning model and receives, as output from the model, an exit feature vector. The edge device retrieves entry feature vectors corresponding to hanging entry events. A hanging entry event is an entry event for a vehicle with an unknown vehicle identifier. The edge device inputs the exit feature vector and the entry feature vectors into an unsupervised machine learning model and receives, as output from the model, matching scores for each entry feature vector. The edge device matches the exit event to one of the hanging entry events based on the matching scores.
Type: Application
Filed: June 6, 2023
Publication date: December 12, 2024
Inventors: Ji Sung Hwang, Anil Kumar Nayak, Barry James O'Brien, Kaleb-John Seijin Loo, Alexander David Israel
-
Patent number: 12123809
Abstract: An edge device receives sensor data from a sensor affixed to a moveable gate. The edge device determines the positional state of the moveable gate based on the sensor data by inputting the received data into a machine learning model or by comparing the sensor data to values associated with a positional state through a calibration process. The edge device stores a log that associates the positional state and sensor data. The edge device determines the health state of the moveable gate using a machine learning model that is trained to predict, based on input of a new log, the health state of the gate. Responsive to determining that the health state of the gate is unhealthy, the edge device triggers a remedial action.
Type: Grant
Filed: March 3, 2023
Date of Patent: October 22, 2024
Assignee: Metropolis Technologies, Inc.
Inventors: Barry James O'Brien, Leah Sardone, Zachary James Thompson, Kyle Bradley Kufalk, Luis Felipe Rodriguez Herrera, Antonio Ortega, Alexander David Israel
-
Publication number: 20240295458
Abstract: An edge device receives sensor data from a sensor affixed to a moveable gate. The edge device determines the positional state of the moveable gate based on the sensor data by inputting the received data into a machine learning model or by comparing the sensor data to values associated with a positional state through a calibration process. The edge device stores a log that associates the positional state and sensor data. The edge device determines the health state of the moveable gate using a machine learning model that is trained to predict, based on input of a new log, the health state of the gate. Responsive to determining that the health state of the gate is unhealthy, the edge device triggers a remedial action.
Type: Application
Filed: March 3, 2023
Publication date: September 5, 2024
Inventors: Barry James O'Brien, Leah Sardone, Zachary James Thompson, Kyle Bradley Kufalk, Luis Felipe Rodriguez Herrera, Antonio Ortega, Alexander David Israel
-
Publication number: 20240185566
Abstract: A device captures a series of images over time in association with a gate, each image having a timestamp. For a vehicle approaching the entry side, the device determines a first data set from a subset of the series of images featuring the vehicle: the data set comprises a plurality of parameters that describe attributes of the vehicle, obtained by inputting the subset of images into a first machine learning model, and a vehicle identifier of the vehicle, obtained by inputting images of the subset featuring a depiction of a license plate of the vehicle into a second machine learning model. The device stores the first data set in association with one or more timestamps of the subset of images, determines a second data set for a second vehicle approaching the exit side, and, responsive to determining that the first data set and the second data set match, instructs the gate to move.
Type: Application
Filed: December 6, 2022
Publication date: June 6, 2024
Inventors: Edwin Thomas, June Guo, Ji Sung Hwang, Anil Kumar Nayak, Todd Merle Shipway, Barry James O'Brien, Alexander David Israel
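The entry/exit matching above combines a license-plate identifier with attribute parameters from a second model. A hedged sketch of one possible matching rule follows; the data-set fields, the attribute-overlap threshold, and the plate-wins-outright policy are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class VehicleDataSet:
    plate: Optional[str]                             # from the license-plate model; may be missing
    attributes: dict = field(default_factory=dict)   # e.g. {"color": "red", "body": "suv"}

def data_sets_match(entry: VehicleDataSet, exit_: VehicleDataSet,
                    min_attribute_overlap: float = 0.8) -> bool:
    """Illustrative matching rule: a plate read on both sides decides outright;
    otherwise require most shared attribute parameters to agree."""
    if entry.plate and exit_.plate:
        return entry.plate == exit_.plate
    keys = set(entry.attributes) & set(exit_.attributes)
    if not keys:
        return False
    agreement = sum(entry.attributes[k] == exit_.attributes[k] for k in keys) / len(keys)
    return agreement >= min_attribute_overlap

# Example: if data_sets_match(entry_set, exit_set): instruct the gate to move.
```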
-
Publication number: 20240185569
Abstract: A device captures a series of images over time in association with a gate, each image having a timestamp. For a vehicle approaching the entry side, the device determines a first data set from a subset of the series of images featuring the vehicle: the data set comprises a plurality of parameters that describe attributes of the vehicle, obtained by inputting the subset of images into a first machine learning model, and a vehicle identifier of the vehicle, obtained by inputting images of the subset featuring a depiction of a license plate of the vehicle into a second machine learning model. The device stores the first data set in association with one or more timestamps of the subset of images, determines a second data set for a second vehicle approaching the exit side, and, responsive to determining that the first data set and the second data set match, instructs the gate to move.
Type: Application
Filed: February 2, 2024
Publication date: June 6, 2024
Inventors: Edwin Thomas, June Guo, Ji Sung Hwang, Anil Kumar Nayak, Todd Merle Shipway, Barry James O'Brien, Alexander David Israel
-
Patent number: 11775930
Abstract: This disclosure describes a device and system for verifying the content of items in a bin within a materials handling facility. In some implementations, a bin content verification apparatus may pass by one or more bins and capture images of those bins. The images may be processed to determine whether the content included in the bins has changed since the last time images of the bins were captured. A determination may also be made as to whether a change to the bin content was expected and, if so, if the determined change corresponds with the expected change.
Type: Grant
Filed: July 13, 2022
Date of Patent: October 3, 2023
Assignee: Amazon Technologies, Inc.
Inventors: James Christopher Curlander, Jules Cook Graybill, Marshall Friend Tappen, Barry James O'Brien
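The bin verification flow above compares newly captured bin images against earlier ones and checks the detected change against an expected change. The sketch below assumes pre-aligned grayscale images and arbitrary difference thresholds; it illustrates the comparison logic rather than the patented apparatus.

```python
import numpy as np

def bin_changed(prev_image: np.ndarray, curr_image: np.ndarray,
                pixel_threshold: int = 25, change_fraction: float = 0.02) -> bool:
    """Flag a bin as changed when enough pixels differ between the last captured
    image and the current one (grayscale arrays of the same shape)."""
    diff = np.abs(curr_image.astype(np.int16) - prev_image.astype(np.int16))
    return float(np.mean(diff > pixel_threshold)) > change_fraction

def verify_bin(prev_image, curr_image, change_expected: bool) -> str:
    """Compare the detected change against the expected change."""
    changed = bin_changed(prev_image, curr_image)
    if changed == change_expected:
        return "ok"
    return "unexpected change" if changed else "expected change missing"
```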
-
Patent number: 11416814
Abstract: This disclosure describes a device and system for verifying the content of items in a bin within a materials handling facility. In some implementations, a bin content verification apparatus may pass by one or more bins and capture images of those bins. The images may be processed to determine whether the content included in the bins has changed since the last time images of the bins were captured. A determination may also be made as to whether a change to the bin content was expected and, if so, if the determined change corresponds with the expected change.
Type: Grant
Filed: March 12, 2020
Date of Patent: August 16, 2022
Assignee: Amazon Technologies, Inc.
Inventors: James Christopher Curlander, Jules Cook Graybill, Marshall Friend Tappen, Barry James O'Brien
-
Patent number: 11377232
Abstract: Described is an imaging component for use by an unmanned aerial vehicle ("UAV") for object detection. As described, the imaging component includes one or more cameras that are configured to obtain images of a scene using visible light that are converted into a depth map (e.g., stereo image) and one or more other cameras that are configured to form images, or thermograms, of the scene using infrared radiation ("IR"). The depth information and thermal information are combined to form a representation of the scene based on both depth and thermal information.
Type: Grant
Filed: July 29, 2019
Date of Patent: July 5, 2022
Assignee: Amazon Technologies, Inc.
Inventors: Scott Raymond Harris, Barry James O'Brien, Joshua John Watson
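The imaging component above fuses a visible-light depth map with an infrared thermogram into a single scene representation. A minimal sketch of such a fusion, assuming the two images are already registered and using nearest-neighbour resizing, is shown below; the channel layout and units are assumptions.

```python
import numpy as np

def fuse_depth_and_thermal(depth_map: np.ndarray, thermogram: np.ndarray) -> np.ndarray:
    """Stack a depth map (meters) and a thermogram (degrees C) into a single
    H x W x 2 array. Assumes the images are already registered; here the
    thermogram is nearest-neighbour resized to the depth map's shape."""
    h, w = depth_map.shape
    th, tw = thermogram.shape
    rows = np.arange(h) * th // h
    cols = np.arange(w) * tw // w
    thermal_resized = thermogram[np.ix_(rows, cols)]
    return np.stack([depth_map, thermal_resized], axis=-1)

# fused[y, x] -> (distance_m, temperature_c) for each pixel of the scene
```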
-
Patent number: 11370531
Abstract: A configurable unmanned aerial vehicle (UAV) may include swappable components that may be selectable to configure a customized UAV just prior to deployment of the UAV that is configured to deliver a package to a destination. The UAV may include a plurality of ports that may accept swappable components. The ports may be coupled to a logic board to enable control of the swappable components. The ports and swappable components may enable quick replacement of a malfunctioning component, such as an image sensor, which may avoid subjecting a UAV to significant downtime for service. The malfunctioning component may then be serviced after the UAV is readied for a subsequent flight or deployed on a subsequent flight.
Type: Grant
Filed: August 29, 2019
Date of Patent: June 28, 2022
Assignee: Amazon Technologies, Inc.
Inventors: Barry James O'Brien, Joshua John Watson, Scott Michael Wilcox
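The swappable-component design above centers on ports controlled by a logic board, with faulty components swapped before flight. The sketch below is a hypothetical port registry illustrating that bookkeeping; the class and field names are assumptions, not the patented logic-board interface.

```python
from dataclasses import dataclass

@dataclass
class Component:
    kind: str        # e.g. "image_sensor", "range_finder"
    serial: str
    healthy: bool

class PortRegistry:
    """Hypothetical logic-board view of the UAV's swappable ports."""

    def __init__(self, port_ids):
        self.ports = {port_id: None for port_id in port_ids}

    def attach(self, port_id: str, component: Component) -> None:
        self.ports[port_id] = component

    def ports_needing_swap(self):
        """Ports whose component is missing or reporting a fault."""
        return [p for p, c in self.ports.items() if c is None or not c.healthy]

registry = PortRegistry(["port-1", "port-2"])
registry.attach("port-1", Component("image_sensor", "SN123", healthy=False))
print(registry.ports_needing_swap())   # ['port-1', 'port-2']
```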
-
Patent number: 11317036
Abstract: A mobile calibration room may be used for calibrating one or more sensors used on unmanned aerial vehicles (UAVs). A system can include folding or collapsible walls to enable the system to be moved between a stowed position and a deployed position. In the deployed position, the system can comprise a calibration room including one or more 2D or 3D targets used to calibrate one or more sensors (e.g., cameras) on a UAV. The system can include a turntable to rotate the UAV about a first axis during calibration. The system can also include a cradle to rotate the UAV around, or translate the UAV along, a second axis. The turntable can include a frame to rotate the UAV around a third axis during calibration. The mobile calibration room can be coupled to a vehicle to enable the mobile calibration room to be moved between locations.
Type: Grant
Filed: June 11, 2019
Date of Patent: April 26, 2022
Assignee: Amazon Technologies, Inc.
Inventors: Sarah Graber, Martin Koestinger, Barry James O'Brien, Gerald Schweighofer, Mario Sormann, Joshua John Watson, Scott Michael Wilcox
-
Patent number: 11317077
Abstract: Methods and systems for collecting camera calibration data using wearable devices are described. An augmented reality interface may be provided at a wearable device. Three-dimensional virtual information may be presented at the augmented reality interface. The three-dimensional information may identify a field of view of a remote camera and may be associated with collection of calibration data for the remote camera. Calibration data collected by the remote camera viewing a calibration target in the field of view may be received. The camera may be calibrated based at least in part on the calibration data.
Type: Grant
Filed: December 18, 2019
Date of Patent: April 26, 2022
Assignee: Amazon Technologies, Inc.
Inventors: James Christopher Curlander, Gur Kimchi, Barry James O'Brien, Jason Leonard Peacock
-
Patent number: 11284056
Abstract: Described is an aerial vehicle, such as an unmanned aerial vehicle ("UAV"), that includes a plurality of sensors, such as stereo cameras, mounted along a perimeter frame of the aerial vehicle and arranged to generate a scene that surrounds the aerial vehicle. The sensors may be mounted in or on winglets of the perimeter frame. Each of the plurality of sensors has a field of view and the plurality of optical sensors are arranged and/or oriented such that their fields of view overlap with one another throughout a continuous space that surrounds the perimeter frame. The fields of view may also include a portion of the perimeter frame or space that is adjacent to the perimeter frame.
Type: Grant
Filed: September 9, 2020
Date of Patent: March 22, 2022
Assignee: Amazon Technologies, Inc.
Inventors: Taylor David Grenier, Louis Leroi LeGrand, III, Barry James O'Brien, Joshua John Watson, Ricky Dean Welsh
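The sensor arrangement above requires that the sensors' fields of view overlap to cover the continuous space around the perimeter frame. The sketch below checks full 360-degree horizontal coverage for a given set of sensor headings and a common field of view; the headings, the field-of-view value, and the one-degree resolution are assumptions for illustration.

```python
def full_coverage(sensor_headings_deg, fov_deg) -> bool:
    """Check that sensors pointing at the given headings, each with the given
    horizontal field of view, jointly cover all 360 degrees."""
    covered = [False] * 360
    for heading in sensor_headings_deg:
        half = int(fov_deg // 2)
        for offset in range(-half, half + 1):
            covered[int(heading + offset) % 360] = True
    return all(covered)

# Six sensors spaced 60 degrees apart with ~70 degree FOVs overlap fully.
print(full_coverage([0, 60, 120, 180, 240, 300], fov_deg=70))  # True
```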
-
Patent number: 11238603
Abstract: This disclosure describes a configuration of an aerial vehicle, such as an unmanned aerial vehicle ("UAV"), that includes a plurality of cameras that may be selectively combined to form a stereo pair for use in obtaining stereo images that provide depth information corresponding to objects represented in those images. Depending on the distance between an object and the aerial vehicle, different cameras may be selected for the stereo pair based on the baseline between those cameras and a distance between the object and the aerial vehicle. For example, cameras with a small baseline (close together) may be selected to generate stereo images and depth information for an object that is close to the aerial vehicle. In comparison, cameras with a large baseline may be selected to generate stereo images and depth information for an object that is farther away from the aerial vehicle.
Type: Grant
Filed: May 28, 2020
Date of Patent: February 1, 2022
Assignee: Amazon Technologies, Inc.
Inventors: Scott Raymond Harris, Barry James O'Brien, Joshua John Watson
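The stereo-pair selection above keys off the relationship between baseline, object distance, and disparity (disparity ≈ f · B / Z for focal length f in pixels, baseline B, and distance Z). The sketch below picks the camera pair whose baseline best fits an estimated distance; the focal length, target disparity, and camera positions are assumptions, not values from the patent.

```python
def pick_stereo_pair(camera_positions_m, estimated_distance_m,
                     focal_px=800.0, target_disparity_px=40.0):
    """Choose the pair of cameras whose baseline would put the object's
    disparity closest to a target value: disparity = f * B / Z, so the
    ideal baseline grows with distance."""
    ideal_baseline = target_disparity_px * estimated_distance_m / focal_px
    best_pair, best_err = None, float("inf")
    for i in range(len(camera_positions_m)):
        for j in range(i + 1, len(camera_positions_m)):
            baseline = abs(camera_positions_m[j] - camera_positions_m[i])
            err = abs(baseline - ideal_baseline)
            if err < best_err:
                best_pair, best_err = (i, j), err
    return best_pair

# Cameras at 0, 0.1, 0.4 and 0.8 m along the frame: a close object (2 m)
# selects a narrow pair, a distant one (40 m) selects the widest pair.
print(pick_stereo_pair([0.0, 0.1, 0.4, 0.8], 2.0))    # (0, 1)
print(pick_stereo_pair([0.0, 0.1, 0.4, 0.8], 40.0))   # (0, 3)
```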
-
Patent number: 10922984
Abstract: Techniques may be provided for verifying the location and identity of a landing marker to aid an unmanned aerial vehicle (UAV) in delivering a payload to a location. For example, upon receiving an indication that a UAV has arrived at a delivery location, a server computer may process one or more images of an area that are provided by the UAV and/or a user interacting with a user device. A landing marker may be identified in the image, and a representation of the landing marker along with instructions to guide the UAV to deliver the payload to the landing marker may be transmitted to the UAV and implemented by the UAV.
Type: Grant
Filed: November 30, 2018
Date of Patent: February 16, 2021
Assignee: Amazon Technologies, Inc.
Inventors: Scott Patrick Boyd, Chengwu Cui, Sarah Graber, Barry James O'Brien, Joshua John Watson, Scott Michael Wilcox
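The guidance step above transmits instructions that steer the UAV toward the identified landing marker. One simple, hypothetical way to derive such an instruction is to convert the marker's pixel offset from the image centre into an approximate ground offset, assuming a downward-facing camera with a known field of view; the field-of-view value and image size below are assumptions.

```python
import math

def guidance_offset(marker_center_px, image_size_px, altitude_m, fov_deg=60.0):
    """Convert the detected landing marker's pixel offset from the image centre
    into an approximate ground offset in meters, assuming a downward-facing
    camera with the given horizontal field of view."""
    cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0
    ground_width_m = 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))
    meters_per_px = ground_width_m / image_size_px[0]
    dx = (marker_center_px[0] - cx) * meters_per_px
    dy = (marker_center_px[1] - cy) * meters_per_px
    return dx, dy   # the UAV would translate by roughly (-dx, -dy)

print(guidance_offset((700, 420), (1280, 960), altitude_m=10.0))
```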
-
Publication number: 20210012518
Abstract: This disclosure describes a configuration of an aerial vehicle, such as an unmanned aerial vehicle ("UAV"), that includes a plurality of cameras that may be selectively combined to form a stereo pair for use in obtaining stereo images that provide depth information corresponding to objects represented in those images. Depending on the distance between an object and the aerial vehicle, different cameras may be selected for the stereo pair based on the baseline between those cameras and a distance between the object and the aerial vehicle. For example, cameras with a small baseline (close together) may be selected to generate stereo images and depth information for an object that is close to the aerial vehicle. In comparison, cameras with a large baseline may be selected to generate stereo images and depth information for an object that is farther away from the aerial vehicle.
Type: Application
Filed: May 28, 2020
Publication date: January 14, 2021
Inventors: Scott Raymond Harris, Barry James O'Brien, Joshua John Watson
-
Publication number: 20200413026
Abstract: Described is an aerial vehicle, such as an unmanned aerial vehicle ("UAV"), that includes a plurality of sensors, such as stereo cameras, mounted along a perimeter frame of the aerial vehicle and arranged to generate a scene that surrounds the aerial vehicle. The sensors may be mounted in or on winglets of the perimeter frame. Each of the plurality of sensors has a field of view and the plurality of optical sensors are arranged and/or oriented such that their fields of view overlap with one another throughout a continuous space that surrounds the perimeter frame. The fields of view may also include a portion of the perimeter frame or space that is adjacent to the perimeter frame.
Type: Application
Filed: September 9, 2020
Publication date: December 31, 2020
Inventors: Taylor David Grenier, Louis Leroi LeGrand, III, Barry James O'Brien, Joshua John Watson, Ricky Dean Welsh
-
Patent number: 10853942
Abstract: Camera calibration may be performed in a mobile environment. One or more cameras can be mounted on a mobile vehicle, such as an unmanned aerial vehicle (UAV) or an automobile. Because of the mobility of the vehicle, the one or more cameras may be subjected to inaccuracy in imagery caused by various factors, such as environmental factors (e.g., airflow, wind, etc.), impact by other objects (e.g., debris, vehicles, etc.), vehicle vibrations, and the like. To reduce the inaccuracy in imagery, the mobile vehicle can include a mobile camera calibration system configured to calibrate the one or more cameras while the mobile vehicle is traveling along a path. The mobile camera calibration system can cause the one or more cameras to capture an image of an imaging target while moving, and calibrate the one or more cameras based on a comparison between the image and imaging target data.
Type: Grant
Filed: August 29, 2016
Date of Patent: December 1, 2020
Assignee: Amazon Technologies, Inc.
Inventors: Scott Patrick Boyd, Chengwu Cui, Barry James O'Brien, Joshua John Watson, Scott Michael Wilcox
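The calibration step above compares a captured image of an imaging target against known target data. The abstract does not spell out the routine; the sketch below uses OpenCV's standard chessboard calibration as a stand-in, with the pattern size and square size as assumptions.

```python
import cv2
import numpy as np

def calibrate_from_target_images(images, pattern_size=(9, 6), square_size_m=0.025):
    """Estimate camera intrinsics from images of a known chessboard target
    (an illustrative stand-in for the imaging-target comparison; pattern
    size and square size are assumptions)."""
    # Known 3D corner positions of the target, in target coordinates.
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    objp *= square_size_m

    object_points, image_points = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            object_points.append(objp)
            image_points.append(corners)

    if not object_points:
        raise ValueError("target not detected in any image")

    h, w = images[0].shape[:2]
    rms_error, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        object_points, image_points, (w, h), None, None)
    return rms_error, camera_matrix, dist_coeffs  # re-capture if rms_error is too high
```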