Robotics Patents (Class 382/153)
  • Patent number: 10776652
    Abstract: Described herein are systems and methods that use motion-related data combined with image data to improve the speed and the accuracy of detecting visual features by predicting the locations of features using the motion-related data. In embodiments, given a set of features in a previous image frame and given a next image frame, localization of the same set of features in the next image frame is attempted. In embodiments, motion-related data is used to compute the relative pose transformation between the two image frames, and the image location of the features may then be transformed to obtain their location prediction in the next frame. Such a process greatly reduces the search space of the features in the next image frame, and thereby accelerates and improves feature detection.
    Type: Grant
    Filed: August 13, 2018
    Date of Patent: September 15, 2020
    Assignee: Baidu USA LLC
    Inventors: Yingze Bao, Mingyu Chen
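The prediction step the abstract describes can be illustrated with a minimal sketch: back-project each feature to 3D, apply the relative pose from the motion data, and re-project. This is not the patented implementation; it assumes a pinhole camera model with known per-feature depth, and all numbers are invented.

```python
import numpy as np

def predict_feature_locations(pts, depths, K, R, t):
    """Predict pixel locations of features in the next frame.

    pts    : (N, 2) pixel coordinates in the previous frame
    depths : (N,) feature depths in the previous camera frame
    K      : (3, 3) camera intrinsic matrix
    R, t   : relative pose (rotation, translation) from the previous
             frame to the next frame, e.g. from IMU integration
    """
    # Back-project pixels to 3D points in the previous camera frame.
    ones = np.ones((pts.shape[0], 1))
    rays = (np.linalg.inv(K) @ np.hstack([pts, ones]).T).T
    X_prev = rays * depths[:, None]
    # Transform into the next camera frame and re-project.
    X_next = (R @ X_prev.T).T + t
    proj = (K @ X_next.T).T
    return proj[:, :2] / proj[:, 2:3]

# With an identity relative pose the predictions equal the inputs; in
# general the detector only searches a small window around each prediction.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
pts = np.array([[100.0, 120.0], [300.0, 200.0]])
pred = predict_feature_locations(pts, np.array([2.0, 3.0]), K, np.eye(3), np.zeros(3))
```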
  • Patent number: 10748318
    Abstract: A system and method of generating a two-dimensional (2D) image of an environment are provided. The system includes a 2D scanner having a controller that determines a distance value to at least one of the object points. One or more processors are operably coupled to the 2D scanner, the one or more processors being responsive to nontransitory executable instructions for generating a plurality of 2D submaps of the environment based at least in part on the distance value, each submap generated from a different point in the environment. A map editor is provided that is configured to: select a subset of submaps from the plurality of 2D submaps; and generate the 2D image of the environment using the subset of 2D submaps. The method provides for realigning 2D submaps to improve the quality of a global 2D map.
    Type: Grant
    Filed: September 5, 2019
    Date of Patent: August 18, 2020
    Assignee: FARO TECHNOLOGIES, INC.
    Inventors: João Santos, Ahmad Ramadneh, Aleksej Frank, Oliver Zweigle
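The submap-realignment idea can be sketched in a few lines: each submap carries a pose, and the global map is the union of the submaps' points after applying those poses. This is an illustrative toy, not FARO's method; the point-cloud representation and pose parameterization are assumptions.

```python
import numpy as np

def align_submap(points, theta, tx, ty):
    """Apply a rigid 2D transform (rotation theta, translation tx, ty)
    to an (N, 2) array of submap points, e.g. after the map editor
    adjusts a submap's pose."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([tx, ty])

def merge_submaps(submaps, poses):
    """Build a global 2D point map from a selected subset of submaps
    and their (theta, tx, ty) poses."""
    return np.vstack([align_submap(p, *pose) for p, pose in zip(submaps, poses)])

a = np.array([[0.0, 0.0], [1.0, 0.0]])
b = np.array([[0.0, 0.0], [0.0, 1.0]])
global_map = merge_submaps([a, b], [(0.0, 0.0, 0.0), (np.pi / 2, 2.0, 0.0)])
```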
  • Patent number: 10719727
    Abstract: A method for determining at least one property related to at least part of a real environment comprises receiving a first image of a first part of a real environment captured by a first camera, wherein the first camera is a thermal camera and the first image is a thermal image and the first part of the real environment is a first environment part, providing at least one description related to at least one class of real objects, wherein the at least one description includes at least one thermal property related to the at least one class of real objects, receiving a second image of the first environment part and of a second part of the real environment captured by a second camera, wherein the second part of the real environment is a second environment part, providing an image alignment between the first image and the second image, determining, for at least one second image region contained in the second image, at least one second probability according to the image alignment, pixel information of the first image
    Type: Grant
    Filed: October 1, 2014
    Date of Patent: July 21, 2020
    Assignee: Apple Inc.
    Inventors: Darko Stanimirovic, Daniel Kurz
  • Patent number: 10710244
    Abstract: A method and a device for operating a robot are provided. According to an example of the method, information of a first gesture is acquired from a group of gestures of an operator, each gesture from the group of gestures corresponding to an operation instruction from a group of operation instructions. A first operation instruction from the group of operation instructions is obtained based on the acquired information of the first gesture, the first operation corresponding to the first gesture. The first operation instruction is executed.
    Type: Grant
    Filed: June 20, 2017
    Date of Patent: July 14, 2020
    Assignee: Beijing Airlango Technology Co., Ltd.
    Inventors: Yinian Mao, Xinmin Liu
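The gesture-to-instruction mapping the abstract describes amounts to a lookup followed by dispatch. A minimal sketch, with invented gesture names and instructions (not the patent's vocabulary):

```python
# Each gesture in the group corresponds to one operation instruction.
GESTURE_INSTRUCTIONS = {
    "palm_up": "take_off",
    "palm_down": "land",
    "wave": "follow_operator",
}

def handle_gesture(gesture, executor):
    """Look up the operation instruction for the recognized gesture and
    execute it via the supplied executor callable."""
    instruction = GESTURE_INSTRUCTIONS.get(gesture)
    if instruction is None:
        return None                 # unrecognized gesture: do nothing
    return executor(instruction)

executed = []
handle_gesture("palm_up", executed.append)
```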
  • Patent number: 10682764
    Abstract: A robotic system includes a robot having an associated workspace; a vision sensor constructed to obtain a 3D image of a robot scene including a workpiece located in the workspace; and a control system communicatively coupled to the vision sensor and to the robot. The control system is configured to execute program instructions to filter the image by segmenting the image into a first image portion containing substantially only a region of interest within the robot scene, and a second image portion containing the balance of the robot scene outside the region of interest; and by storing image data associated with the first image portion. The control system is operative to control movement of the robot to perform work on the workpiece based on the image data associated with the first image portion.
    Type: Grant
    Filed: July 30, 2017
    Date of Patent: June 16, 2020
    Assignee: ABB Schweiz AG
    Inventors: Remus Boca, Thomas A. Fuhlbrigge
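The segmentation step can be sketched as a simple spatial filter over the 3D image: keep points inside the region of interest, discard the balance of the scene. An axis-aligned box ROI is an assumption made here for brevity:

```python
import numpy as np

def filter_to_roi(points, roi_min, roi_max):
    """Split a 3D point cloud of the robot scene into the region of
    interest and the remainder, keeping only the ROI data for later use.

    points           : (N, 3) scene points from the vision sensor
    roi_min, roi_max : axis-aligned bounds of the region of interest
    """
    mask = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    return points[mask], points[~mask]

scene = np.array([[0.1, 0.2, 0.5],   # on the workpiece
                  [0.4, 0.1, 0.3],   # on the workpiece
                  [2.0, 2.0, 0.1]])  # background clutter
roi, background = filter_to_roi(scene, np.array([0, 0, 0]), np.array([1, 1, 1]))
```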
  • Patent number: 10675761
    Abstract: An improved method, system, and apparatus are provided to implement a general architecture for robot systems. A mode execution module is provided to universally execute execution modes on different robotic systems. A system includes an execution module that receives software instructions in a normalized programming language. The system also includes an interface having a translation layer that converts the software instructions from the normalized language into robot-specific instructions that operate in a particular robotic system. The system further includes a controller that is communicatively coupled to the interface, wherein the controller receives the robot-specific instructions. Moreover, the system includes a robotic device that is operatively controlled by the controller by execution of the robot-specific instructions.
    Type: Grant
    Filed: October 16, 2017
    Date of Patent: June 9, 2020
    Assignee: Magic Leap, Inc.
    Inventors: Frederick Dennis Zyda, Jeffrey Steven Kranski, Vikram Chauhan
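The translation-layer architecture described above is a classic adapter pattern: normalized instructions in, robot-specific commands out. A minimal sketch with an invented instruction set and robot dialect (nothing here is taken from the patent):

```python
class TranslationLayer:
    """Converts normalized instructions into robot-specific commands
    using a per-robot command vocabulary ("dialect")."""
    def __init__(self, dialect):
        self.dialect = dialect

    def translate(self, instruction, *args):
        native = self.dialect[instruction]
        return native.format(*args)

class Controller:
    """Receives robot-specific instructions and records their execution."""
    def __init__(self):
        self.log = []
    def execute(self, command):
        self.log.append(command)

# One normalized program; swapping the dialect retargets it to another robot.
arm_dialect = {"move": "MOVJ {0} {1} {2}", "grip": "GRIP_CLOSE"}
layer = TranslationLayer(arm_dialect)
controller = Controller()
for step in [("move", 0.1, 0.2, 0.3), ("grip",)]:
    controller.execute(layer.translate(step[0], *step[1:]))
```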
  • Patent number: 10661811
    Abstract: A vehicle information display control device includes: an automatic driving information obtaining unit obtaining automatic driving information including information indicating that each actuator of a vehicle is in a manual control mode or an automatic control mode; and a display controller causing a display to display an image based on the automatic driving information. The display controller causes the display to display: an image of a manual driving device corresponding to the actuator; a manual-operation recalling image that is superimposed on the image of the manual driving device corresponding to an actuator in the manual control mode and that recalls an operation performed by a person; and an automatic-operation recalling image that is superimposed on the image of the manual driving device corresponding to an actuator in the automatic control mode and that recalls an operation performed by a machine.
    Type: Grant
    Filed: October 30, 2015
    Date of Patent: May 26, 2020
    Assignee: MITSUBISHI ELECTRIC CORPORATION
    Inventors: Tadashi Miyahara, Mitsuo Shimotani, Satoru Inoue, Yoshio Sato, Yuki Sakai, Yuji Karita, Junichi Kimura
  • Patent number: 10657419
    Abstract: Machine vision methods and systems determine if an object within a work field has one or more predetermined features. Methods comprise capturing image data of the work field, applying a filter to the image data, in which the filter comprises an aspect corresponding to a specific feature, and based at least in part on the applying, determining if the object has the specific feature. Systems comprise a camera system configured to capture image data of the work field, and a controller communicatively coupled to the camera system and comprising non-transitory computer readable media having computer-readable instructions that, when executed, cause the controller to apply a filter to the image data, and based at least in part on applying the filter, determine if the object has the specific feature. Robotic installation methods and systems that utilize machine vision methods and systems also are disclosed.
    Type: Grant
    Filed: March 28, 2018
    Date of Patent: May 19, 2020
    Assignee: The Boeing Company
    Inventors: Tyler Edward Kurtz, Riley Harrison HansonSmith
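The filter-then-threshold determination can be sketched as a sliding correlation with a kernel that encodes the specific feature; a response above the threshold means the feature is present. This is an illustrative stand-in for whatever filter the patent actually uses:

```python
import numpy as np

def has_feature(image, kernel, threshold):
    """Slide a filter kernel over the image and report whether any
    response exceeds the threshold, i.e. whether the specific feature
    the kernel encodes appears in the work field."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    best = -np.inf
    for y in range(ih - kh + 1):
        for x in range(iw - kw + 1):
            best = max(best, np.sum(image[y:y+kh, x:x+kw] * kernel))
    return best >= threshold

# A 2x2 "corner" kernel responds strongly where the image contains the
# matching pattern.
corner = np.array([[1.0, -1.0], [-1.0, 1.0]])
image = np.zeros((5, 5))
image[1:3, 1:3] = corner  # embed the feature in the work field
```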
  • Patent number: 10656646
    Abstract: An example method includes determining a target area of a ground plane in an environment of a mobile robotic device, where the target area of the ground plane is in front of the mobile robotic device in a direction of travel of the mobile robotic device. The method further includes receiving depth data from a depth sensor on the mobile robotic device. The method also includes identifying a portion of the depth data representative of the target area. The method additionally includes determining that the portion of the depth data lacks information representing at least one section of the target area. The method further includes providing an output signal identifying at least one zone of non-traversable space for the mobile robotic device in the environment, where the at least one zone of non-traversable space corresponds to the at least one section of the target area.
    Type: Grant
    Filed: December 14, 2017
    Date of Patent: May 19, 2020
    Assignee: X Development LLC
    Inventors: Kevin William Watts, Kurt Konolige
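The core inference in this abstract (missing depth returns in the target area imply non-traversable space, such as a cliff or glass) can be sketched as follows; using NaN for missing returns and vertical zones is an assumption for illustration:

```python
import numpy as np

def non_traversable_zones(depth, target_rows, target_cols, n_zones=4):
    """Flag zones of the target ground area whose depth data is missing.

    depth       : 2D depth image; NaN marks pixels with no return
    target_rows : row slice covering the target area ahead of the robot
    target_cols : column slice covering the target area
    Returns indices of the n_zones vertical zones that contain gaps.
    """
    target = depth[target_rows, target_cols]
    zones = np.array_split(np.isnan(target), n_zones, axis=1)
    return [i for i, z in enumerate(zones) if z.any()]

depth = np.full((8, 8), 1.5)
depth[6:8, 0:2] = np.nan  # a hole or cliff returns no depth data
zones = non_traversable_zones(depth, slice(6, 8), slice(0, 8))
```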
  • Patent number: 10640211
    Abstract: An automated commissioning and floorplan configuration (CAFC) device can include a CAFC system having a transceiver and a CAFC engine, where the transceiver communicates with at least one device disposed in a volume of space, where the CAFC engine, based on communication between the transceiver and the at least one device, commissions the at least one device.
    Type: Grant
    Filed: November 25, 2019
    Date of Patent: May 5, 2020
    Assignee: Eaton Intelligent Power Limited
    Inventors: Jonathan Andrew Whitten, Michael Alan Lunn
  • Patent number: 10617857
    Abstract: The invention generally relates to a raster injector for micropigmentation and use thereof, and more particularly to a raster injector that precisely inserts pigment, medicine or other fluids into skin or other industrial materials. The raster injector is connected to a motion platform that enables the injector to inject pigment or other fluids into the layers of the skin or other substrates through micro, hollow point needles. The raster injector includes a disposable, interchangeable ink delivery assembly, an ink reservoir assembly and a non-disposable base assembly.
    Type: Grant
    Filed: November 22, 2017
    Date of Patent: April 14, 2020
    Assignee: Odd Pixel Tattooing Technologies, Inc.
    Inventors: Lisa L. Phillips, Drew T. Morgan
  • Patent number: 10620608
    Abstract: A pick-and-place machine module is provided. The pick-and-place machine module includes a nozzle and a collet disk. The nozzle includes a body, a head and a tubular element extending between the body and the head such that the head is communicative with the body via the tubular element to enable a pick-up of a component by the head. The collet disk is affixed to a surface of the body facing the head about the tubular element and is configured to reflect light incident thereon toward an area of the base surrounding the component.
    Type: Grant
    Filed: March 7, 2017
    Date of Patent: April 14, 2020
    Assignee: RAYTHEON COMPANY
    Inventors: Kristopher W. Carlson, Christopher S. Bender
  • Patent number: 10596706
    Abstract: A mechanism-parametric-calibration method for a robotic arm system is provided, including: controlling the robotic arm to perform a plurality of actions so that one end of the robotic arm moves toward corresponding predictive positioning-points; determining a predictive relative-displacement between each two of the predictive positioning-points; after each of the actions is performed, sensing three-dimensional positioning information of the end of the robotic arm; determining, according to the three-dimensional positioning information, a measured relative-displacement moved by the end of the robotic arm when each two of the actions are performed; deriving an equation corresponding to the robotic arm from the predictive relative-displacements and the measured relative-displacements; and utilizing a feasible algorithm to find the solution of the equation.
    Type: Grant
    Filed: March 9, 2018
    Date of Patent: March 24, 2020
    Assignee: DELTA ELECTRONICS, INC.
    Inventor: Cheng-Hao Huang
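The final solving step of such a calibration is typically a least-squares problem: the residuals between measured and predicted relative displacements, together with a Jacobian of the predictions with respect to the unknown kinematic parameters, yield a parameter correction. The Jacobian and residuals below are invented stand-ins, not a real arm model:

```python
import numpy as np

# J : d(predicted relative displacement) / d(kinematic parameters)
# residual : measured minus predicted relative displacements
J = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [1.0, 1.0]])
residual = np.array([0.30, 0.10, 0.35])

# Least-squares solve for the parameter correction.
delta, *_ = np.linalg.lstsq(J, residual, rcond=None)

# Apply the correction to the nominal parameters.
corrected = np.array([1.00, 2.00]) + delta
```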
  • Patent number: 10589421
    Abstract: The present invention relates to a mechanical energy transfer system that comprises a moving member that imparts energy to an engaging member as a friction force is applied to the moving member. The moving members may be rotating plates/disks, belts, or drums/cylinders. Upon engagement of the moving members by the engaging members, movement is transferred to the engaging members. Movement transfer may occur through the application of any frictional force, such as a magnetic force or a mechanical force.
    Type: Grant
    Filed: January 12, 2016
    Date of Patent: March 17, 2020
    Inventor: Douglas H. DeCandia
  • Patent number: 10594917
    Abstract: Techniques are described for controlling the process of capturing three-dimensional (3D) video content. For example, a controller can provide centralized control over the various components that participate in the capture and processing of the 3D video content. For example, the controller can establish connections with a number of components (e.g., running on other computing devices). The controller can receive state update messages from the components (e.g., comprising state change information, network address information, etc.). The controller can also broadcast messages to the components. For example, the controller can broadcast system state messages to the components where the system state messages comprise current state information of the components. The controller can also broadcast other types of messages, such as start messages that instruct the components to enter a start state.
    Type: Grant
    Filed: October 30, 2017
    Date of Patent: March 17, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Spencer G Fowers
  • Patent number: 10593065
    Abstract: A method includes acquiring a plurality of training images through a capturing component, acquiring a plurality of training camera poses of the capturing component corresponding to the training images through a pose sensor disposed corresponding to the capturing component, and training a camera pose estimation model according to the training images and the training camera poses of the capturing component.
    Type: Grant
    Filed: July 25, 2017
    Date of Patent: March 17, 2020
    Assignee: HTC Corporation
    Inventors: Che-Han Chang, Edward Chang
  • Patent number: 10571260
    Abstract: An automated rivet measurement system comprises a number of end effectors, a number of cameras, a processor, and a comparator. The number of end effectors is configured to perform drilling and riveting on a structure. The number of cameras is connected to the number of end effectors. The number of cameras is configured to take a first image of a hole in the structure and a second image of a rivet in the hole. The processor is configured to process the first image and the second image to identify a number of reference points in the first image and the second image. The comparator is configured to determine a rivet concentricity using the hole in the first image and the rivet in the second image, in which the first image and the second image are aligned using the number of reference points.
    Type: Grant
    Filed: September 6, 2017
    Date of Patent: February 25, 2020
    Assignee: The Boeing Company
    Inventors: Patrick L. Anderson, Stephen J. Bennison
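The concentricity determination can be sketched as: align the rivet image to the hole image using the shared reference points, then measure the offset between the two centers. A translation-only alignment is assumed here for brevity (the patent aligns via a number of reference points, which could equally correct rotation); the coordinates are invented:

```python
import numpy as np

def concentricity(hole_center, rivet_center, ref_pts_1, ref_pts_2):
    """Estimate rivet concentricity: align the second image to the first
    using shared reference points (translation-only), then measure the
    offset between the hole and rivet centers, in pixels."""
    shift = np.mean(np.asarray(ref_pts_1) - np.asarray(ref_pts_2), axis=0)
    rivet_aligned = np.asarray(rivet_center) + shift
    return float(np.linalg.norm(rivet_aligned - np.asarray(hole_center)))

# The second image is shifted by (2, -1) px relative to the first.
refs1 = [(10.0, 10.0), (50.0, 12.0)]
refs2 = [(8.0, 11.0), (48.0, 13.0)]
offset = concentricity((30.0, 20.0), (28.3, 21.4), refs1, refs2)
```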
  • Patent number: 10515291
    Abstract: A template creation device may include an acquisition unit configured to acquire a plurality of templates from a plurality of images of different poses of a single object, or a plurality of images for a plurality of objects. The template creation device may further include a clustering unit configured to divide the plurality of templates into a plurality of groups on the basis of a similarity score; and an integration unit configured to combine the templates in a group into an integrated template. A new template set may be created from the plurality of integrated templates corresponding to each group in the plurality of groups.
    Type: Grant
    Filed: September 22, 2017
    Date of Patent: December 24, 2019
    Assignee: OMRON Corporation
    Inventor: Yoshinori Konishi
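The cluster-then-integrate pipeline can be sketched with a greedy similarity clustering and per-group averaging. Normalized cross-correlation as the similarity score and mean-integration are assumptions for illustration, not OMRON's specific choices:

```python
import numpy as np

def cluster_templates(templates, min_similarity):
    """Greedy clustering: each template joins the first group whose
    representative it matches with at least min_similarity (normalized
    cross-correlation); otherwise it starts a new group."""
    def ncc(a, b):
        a, b = a - a.mean(), b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))
    groups = []
    for t in templates:
        for g in groups:
            if ncc(t, g[0]) >= min_similarity:
                g.append(t)
                break
        else:
            groups.append([t])
    return groups

def integrate(group):
    """Combine the templates of one group into a single integrated template."""
    return np.mean(group, axis=0)

t1 = np.array([[1.0, 0.0], [0.0, 1.0]])
t2 = np.array([[0.9, 0.1], [0.1, 0.9]])   # near-duplicate pose of t1
t3 = np.array([[0.0, 1.0], [1.0, 0.0]])   # a clearly different pose
groups = cluster_templates([t1, t2, t3], min_similarity=0.8)
new_set = [integrate(g) for g in groups]  # the new, smaller template set
```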
  • Patent number: 10508922
    Abstract: A road boundary detection system includes: an optical scanner configured to emit light to an object to acquire measurement data reflected from the object; and a processor configured to extract, based on the measurement data, a plurality of straight lines in order of priority, where a higher priority is given to a straight line including more of the contact points; to calculate a score for each straight line by assigning a score to the contact points included in the extracted straight lines and to non-contact points included in other straight lines having the same angle as the contact points; and to select a road boundary according to the priority of the calculated score of each straight line.
    Type: Grant
    Filed: August 10, 2016
    Date of Patent: December 17, 2019
    Assignees: Hyundai Motor Company, Kia Motors Corporation, University Industry Foundation, Yonsei University
    Inventors: Seongkeun Park, Hoon Lee, MinkYun Yoo, HyunJu Kim, YoungWon Kim, JuYun Ro, Euntai Kim, Beomseong Kim
  • Patent number: 10486812
    Abstract: An automated commissioning and floorplan configuration (CAFC) device can include a CAFC system having a transceiver and a CAFC engine, where the transceiver communicates with at least one device disposed in a volume of space, where the CAFC engine, based on communication between the transceiver and the at least one device, commissions the at least one device.
    Type: Grant
    Filed: March 25, 2019
    Date of Patent: November 26, 2019
    Assignee: Eaton Intelligent Power Limited
    Inventors: Jonathan Andrew Whitten, Michael Alan Lunn
  • Patent number: 10481610
    Abstract: A method and device are described for controlling a vehicle using a location-based dynamic dictionary. The method includes receiving, by a navigation device, an image from an image sensing device associated with an autonomous vehicle, wherein the image sensing device captures the image at a location of the autonomous vehicle. The method includes extracting, by the navigation device, navigation content from the image. The method further includes identifying, by the navigation device, the navigation content from the image by comparing the navigation content from the image with a dynamic dictionary. The dynamic dictionary is created based on the location of the autonomous vehicle. The method further includes controlling, by the navigation device, the autonomous vehicle based on the navigation content.
    Type: Grant
    Filed: September 29, 2017
    Date of Patent: November 19, 2019
    Assignee: Wipro Limited
    Inventors: Balaji Govindaraj, Bharath Kumar Muthupandian, Sujatha Jagannath, Ramachandra Budihal
  • Patent number: 10460446
    Abstract: A device includes software instructions for a circular plot analysis agent and at least one circular plot definition. The circular plot analysis agent obtains a digital image of a circular plot, detects a perimeter of the circular plot within the digital image, detects a plurality of edges within the perimeter, identifies a set of endpoints on the perimeter as a function of the plurality of edges, generates a plot descriptor from the set of endpoints, and initiates a transaction with a second device as a function of the plot descriptor.
    Type: Grant
    Filed: October 16, 2017
    Date of Patent: October 29, 2019
    Assignee: NANT HOLDINGS IP, LLC
    Inventors: Carla W. Balch, Nicholas J. Witchey
  • Patent number: 10456918
    Abstract: A technique is provided with which, even for an object whose shape is prone to be erroneously recognized and for which it is difficult to generate an initial value expected to produce a true value through fitting, it is possible to generate an appropriate initial value candidate through intuitive input and reduce erroneous recognition. An information processing apparatus includes a model acquiring unit configured to acquire a three-dimensional shape model of a target object, a display control unit configured to cause a display unit to display the three-dimensional shape model, a setting unit configured to set first and second orientations different from each other on the basis of the three-dimensional shape model displayed in the display unit, a parameter deriving unit configured to derive a transformation parameter that transforms the first orientation into the second orientation, and a storage unit configured to store the transformation parameter.
    Type: Grant
    Filed: March 10, 2015
    Date of Patent: October 29, 2019
    Assignee: Canon Kabushiki Kaisha
    Inventors: Daisuke Watanabe, Hiroshi Okazaki, Yutaka Niwayama
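The transformation parameter between two orientations has a closed form when orientations are represented as rotation matrices: the rotation mapping the first onto the second is R2 @ R1.T. A small sketch (the rotation-matrix representation is an assumption; the patent does not specify one):

```python
import numpy as np

def rot_z(angle):
    """Rotation matrix about the z axis."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def transformation_parameter(R1, R2):
    """Rotation that transforms the first orientation into the second:
    R_delta @ R1 == R2."""
    return R2 @ R1.T

R1 = rot_z(np.pi / 6)   # first orientation set by the user
R2 = rot_z(np.pi / 2)   # second orientation
R_delta = transformation_parameter(R1, R2)
```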
  • Patent number: 10453165
    Abstract: Systems and methods are provided for implementing a computer vision model execution service. A computer vision model execution service may maintain a library of machine learning models, and may provide a standard interface for accessing a model or models. Models may implement schemas that specify an input vector and an output vector, and the computer vision model execution service may obtain or determine workflows that process input vectors through multiple computer vision models. The service further provides an interface for adding, removing, or updating models, and may provide feedback to modelers regarding the usage and performance of various machine learning models.
    Type: Grant
    Filed: March 27, 2017
    Date of Patent: October 22, 2019
    Assignee: Amazon Technologies, Inc.
    Inventors: Anton Kostov, Yi Sun
  • Patent number: 10445913
    Abstract: One or more embodiments are described for generating a two dimensional map of an environment using a set of submaps that include point clouds of the environment that are captured using a scanner device that is moved from one position to another in the environment. An example method includes revising the two-dimensional map by editing at least one of the submaps from the set, independently of the other submaps from the set. In one or more examples, the editing may be based on one or more of the other submaps from the set. The one or more embodiments facilitate revising the two-dimensional map in an offline manner and without rescanning the environment.
    Type: Grant
    Filed: March 5, 2018
    Date of Patent: October 15, 2019
    Assignee: FARO TECHNOLOGIES, INC.
    Inventors: Joao Santos, Ahmad Ramadneh, Aleksej Frank, Oliver Zweigle
  • Patent number: 10402964
    Abstract: A system, device, and method for inspecting the cosmetic and operational features of electronic devices, including computing and telecommunications devices. The cosmetic inspection system includes an image capture unit for capturing the images of the electronic devices, and a user interface for processing the captured images and providing relevant information to the user of the system. Images of the external components such as external casing materials or touch screens of electronic devices are captured and the cosmetic inspection system uses baseline images to make determinations to identify defective components of the electronic devices. Based on these determinations, the system may conclude which, if any, replacement components of the devices are needed to restore the electronic device. In one embodiment, a user of the system may then be provided with information through a user interface about defective components and options for ordering replacement components.
    Type: Grant
    Filed: November 21, 2018
    Date of Patent: September 3, 2019
    Assignee: FEDEX SUPPLY CHAIN LOGISTICS & ELECTRONICS, INC.
    Inventors: Clark Humphrey, Brian Morris
  • Patent number: 10384348
    Abstract: A robot apparatus includes an output unit that displays an image including an object on a screen, an input unit that receives an operation performed by a user for specifying information relating to an approximate range including the object in the image, an object extraction unit that extracts information regarding a two-dimensional contour of the object on the basis of the specification received by the input unit, and a position and attitude estimation unit that estimates information regarding a three-dimensional position and attitude of the object on the basis of the information regarding the two-dimensional contour.
    Type: Grant
    Filed: March 28, 2018
    Date of Patent: August 20, 2019
    Assignee: Sony Corporation
    Inventors: Genta Kondo, Atsushi Miyamoto, Satoru Shimizu
  • Patent number: 10384707
    Abstract: A robot swing structure uses a swing bracket in combination with a curved rack, and a transmission gear of a power mechanism in engagement with the curved rack. Whereby, when a first end of the curved rack is positioned at the transmission gear, the curved rack and swing bracket are allowed to be swung to incline downward (as if the robot head lowers or bows); when a second end of the curved rack is positioned at the transmission gear, the curved rack and swing bracket are allowed to be in a transverse state; and when a stop section of the curved rack is positioned at the transmission gear, the curved rack and swing bracket are allowed to be in an upright state, thereby giving the curved rack and swing bracket three swing stopping states.
    Type: Grant
    Filed: June 22, 2017
    Date of Patent: August 20, 2019
    Assignee: ROBOTELF TECHNOLOGY CO., LTD.
    Inventor: Ling-Feng Chen
  • Patent number: 10356562
    Abstract: Traces are collected by multiple portable devices moving within a geographic area that includes an indoor region; each of the traces includes measurements of wireless signal sources at different times by the same device, and at least some of the traces include pseudorange measurements related to distances to respective satellites. Location estimates for the portable devices and the signal sources are generated using graph-based SLAM optimization of the location estimates. More particularly, constraints for the pseudorange measurements are generated and applied for the pseudorange measurements in graph-based SLAM optimization.
    Type: Grant
    Filed: September 10, 2018
    Date of Patent: July 16, 2019
    Assignee: GOOGLE LLC
    Inventors: Etienne Le Grand, Mohammed Khider, Luigi Bruno
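A pseudorange constraint in such a graph contributes a residual of roughly the following form, which the SLAM optimizer drives toward zero together with the wireless-signal constraints. This is the standard GNSS pseudorange model, not necessarily the exact formulation in the patent; the positions and bias are toy values:

```python
import numpy as np

def pseudorange_residual(device_pos, sat_pos, clock_bias, measured_range):
    """Residual of one pseudorange constraint: measured pseudorange minus
    (geometric distance to the satellite + receiver clock bias), in meters."""
    geometric = np.linalg.norm(sat_pos - device_pos)
    return measured_range - (geometric + clock_bias)

device = np.array([0.0, 0.0, 0.0])
satellite = np.array([3.0, 4.0, 0.0])   # 5 m away in this toy example
r = pseudorange_residual(device, satellite, clock_bias=2.0, measured_range=7.0)
```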
  • Patent number: 10347026
    Abstract: An information processing apparatus includes a display controller that, when position information on a movable object is changed, changes a display size of an image associated with the movable object and displays the image.
    Type: Grant
    Filed: September 8, 2017
    Date of Patent: July 9, 2019
    Assignee: FUJI XEROX CO., LTD.
    Inventor: Kengo Tokuchi
  • Patent number: 10334230
    Abstract: A system is described that allows for the tracking of an exercise occurring within a physical environment. In one or more implementations, the system includes a depth sensing device configured to obtain depth values associated with a plurality of pixels. The depth values indicate distances from one or more physical objects in a physical environment to the depth sensing device. The system also includes a computing device in communication with the depth sensing device. The computing device includes a memory and a processor. The processor is configured to execute the one or more modules to cause the processor to: identify a point corresponding to at least one pixel representing a portion of at least one physical object within the physical environment based upon the pixel depth values; track the point through a plurality of image frames; and determine whether at least one repetition has occurred based upon the tracked point.
    Type: Grant
    Filed: May 15, 2015
    Date of Patent: June 25, 2019
    Assignee: Nebraska Global Investment Company, LLC
    Inventors: Benjamin D. Rush, Joshua M. Brown-Kramer, Lucas A. Sabalka, Nathan H. Lowry
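The repetition determination can be sketched as a hysteresis counter over the tracked point's values: one repetition each time the point travels from one extreme of the movement to the other and back. The thresholds and numbers below are invented for illustration:

```python
def count_repetitions(track, low, high):
    """Count exercise repetitions from a tracked point's values: one
    repetition each time the value rises above `high` and then returns
    below `low` (a simple hysteresis counter that ignores jitter)."""
    reps, armed = 0, False
    for v in track:
        if v >= high:
            armed = True            # reached the top of the movement
        elif v <= low and armed:
            reps += 1               # returned to the bottom: one rep
            armed = False
    return reps

# Depth of a tracked hand during two push-ups (toy numbers, meters).
track = [0.2, 0.5, 0.9, 0.6, 0.2, 0.4, 0.9, 0.3]
```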
  • Patent number: 10307912
    Abstract: A robot cleaner includes a 3D sensor unit installed on a main body to sense nearby objects and output sensing information; a secondary sensor unit configured to sense nearby objects and output sensing information; a storage unit configured to set a diagnostic algorithm according to a diagnostic mode in advance; an input unit configured to input an execution command for the diagnostic mode; a control unit configured to auto-correct the diagnostic mode for the 3D sensor and a parameter of the 3D sensor unit using the diagnostic algorithm in response to the execution command; and an output unit configured to output an execution result of the diagnostic mode and a correction message.
    Type: Grant
    Filed: July 15, 2013
    Date of Patent: June 4, 2019
    Assignee: LG ELECTRONICS INC.
    Inventor: Yeonsoo Kim
  • Patent number: 10313650
    Abstract: An apparatus and method are provided for calculating a cost volume by controlling intensity, so as to receive relatively less influence from intensity conditions when capturing a stereo image, and by changing a parameter for each level according to distance. The apparatus includes an illuminator controller, a pixel expected ratio calculator, and a cost volume calculator; it controls intensity using the illuminator when capturing an object, and calculates a cost volume value so as to receive relatively less influence from distance and intensity when performing stereo matching by changing a parameter used for calculating the cost volume value according to distance to the object and intensity.
    Type: Grant
    Filed: March 24, 2017
    Date of Patent: June 4, 2019
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Eul Gyoon Lim, Ji Ho Chang, Jae Chan Jeong
  • Patent number: 10293486
    Abstract: A humanoid robot with a body joined to an omnidirectional mobile ground base, and equipped with: a body position sensor and a base position sensor to provide measures, actuators comprising at least 3 wheels located in the omnidirectional mobile base, extractors for converting the measures into useful data, a controller to calculate position, velocity and acceleration commands from the useful data using a robot model and pre-ordered position and velocity references, means for converting the commands into instructions for the actuators, wherein the robot model is a double point-mass model, and wherein the commands are based on a linear model predictive control law with a discretized time according to a sampling time period and a number of predicted samples, and expressed as a quadratic optimization formulation with: a weighted sum of objectives and a set of predefined linear constraints.
    Type: Grant
    Filed: April 17, 2015
    Date of Patent: May 21, 2019
    Assignees: SOFTBANK ROBOTICS EUROPE, INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET EN AUTOMATIQUE
    Inventors: Jory Lafaye, David Gouaillier, Pierre-Brice Wieber
  • Patent number: 10274325
    Abstract: Systems and methods for robotic mapping are disclosed. In some exemplary implementations, a robot can travel in an environment. From travelling in the environment, the robot can create a graph comprising a plurality of nodes, wherein each node corresponds to a scan taken by a sensor of the robot at a location in the environment. In some exemplary implementations, the robot can generate a map of the environment from the graph. In some cases, to facilitate map generation, the robot can constrain the graph to start and end at a substantially similar location. The robot can also perform scan matching on extended scan groups, determined from identifying overlap between scans, to further determine the location of features in a map.
    Type: Grant
    Filed: November 1, 2016
    Date of Patent: April 30, 2019
    Assignee: Brain Corporation
    Inventors: Jaldert Rombouts, Borja Ibarz Gabardos, Jean-Baptiste Passot, Andrew Smith
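The graph structure and the start/end constraint described above can be sketched in a few lines (all names and the relaxation scheme are hypothetical, not Brain Corporation's implementation): each node holds a scan and an estimated 2D location, and the discrepancy between the first and last node is spread linearly along the trajectory, the simplest possible loop-closure correction.

```python
# Hypothetical sketch of a pose graph with a start/end loop-closure constraint.
class Node:
    """One graph node: a sensor scan taken at an estimated 2D location."""
    def __init__(self, x, y, scan):
        self.x, self.y, self.scan = x, y, scan

def close_loop(nodes):
    """Constrain the graph to start and end at the same place by spreading
    the accumulated start/end error linearly over the trajectory."""
    ex = nodes[-1].x - nodes[0].x
    ey = nodes[-1].y - nodes[0].y
    n = len(nodes) - 1
    for i, node in enumerate(nodes):
        node.x -= ex * i / n
        node.y -= ey * i / n

# Robot travels a square but accumulates drift on the final leg:
path = [(0, 0), (1, 0), (1, 1), (0, 1), (0.2, 0.1)]  # last pose should be (0, 0)
nodes = [Node(x, y, scan=None) for x, y in path]
close_loop(nodes)
```

Real systems solve this as a nonlinear least-squares problem over all constraints at once; the linear spread above only conveys the idea of the constraint.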
  • Patent number: 10239201
    Abstract: A master-slave system (1) according to the present invention includes a slave actuator (As1 to As3) for generating a slave driving force (τs) to control a slave robot in terms of driving force, an effective driving force sensor (Fs1 to Fs3) for measuring a slave effective driving force (τsa) actually acting on a terminal output axis of the slave actuator (As1 to As3), and a slave target effective driving force calculating device (3) for calculating a slave target effective driving force (τsad) which is a target value for the slave effective driving force (τsa), on the basis of a master operating force (fm) applied to the master robot by an operator (U). The slave actuator (As1 to As3) generates the slave driving force (τs) on the basis of the slave target effective driving force (τsad) and the slave effective driving force (τsa).
    Type: Grant
    Filed: April 28, 2015
    Date of Patent: March 26, 2019
    Assignee: Muscle Corporation
    Inventor: Katsuya Kanaoka
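The control flow in the abstract above reduces to two steps, sketched here with assumed gains and an assumed master-to-slave scaling (neither is from the patent): derive the slave target effective driving force τsad from the master operating force fm, then correct the actuator command using the measured effective driving force τsa.

```python
# Hedged sketch of master-slave effective-force control; gains are illustrative.
K_SCALE = 2.0   # master-to-slave force scaling (assumed)
K_P = 0.5       # feedback gain on effective-force error (assumed)

def slave_target(f_master):
    """Slave target effective driving force from the master operating force."""
    return K_SCALE * f_master

def slave_command(tau_target, tau_measured):
    """Actuator driving force from the target and the measured effective force."""
    return tau_target + K_P * (tau_target - tau_measured)

# One control tick: operator pushes with 3 N; sensor reads 5 N effective force.
tau_sad = slave_target(3.0)
tau_s = slave_command(tau_sad, 5.0)
```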
  • Patent number: 10239612
    Abstract: An automated commissioning and floorplan configuration (CAFC) device can include a CAFC system having a transceiver and a CAFC engine, where the transceiver communicates with at least one device disposed in a volume of space, where the CAFC engine, based on communication between the transceiver and the at least one device, commissions the at least one device.
    Type: Grant
    Filed: July 19, 2017
    Date of Patent: March 26, 2019
    Assignee: Cooper Technologies Company
    Inventors: Jonathan Andrew Whitten, Michael Alan Lunn
  • Patent number: 10192113
    Abstract: The positional awareness techniques described herein employ sensory data gathering and analysis hardware, with reference to specific example implementations, to improve the use of sensors, techniques, and hardware design, enabling specific embodiments to provide positional awareness to machines with improved speed and accuracy. The sensory data are gathered from multiple operational cameras and one or more auxiliary sensors.
    Type: Grant
    Filed: July 5, 2017
    Date of Patent: January 29, 2019
    Assignee: PerceptIn, Inc.
    Inventors: Shaoshan Liu, Zhe Zhang, Grace Tsai
  • Patent number: 10162354
    Abstract: In one embodiment, motion planning and control data is received, indicating that an autonomous vehicle is to move from a first point to a second point of a path. The motion planning and control data describes a plurality of routes from the first point to the second point within the path. For each of the routes, a simulation of the route is performed in view of physical characteristics of the autonomous vehicle to generate a simulated route. A controlling error is calculated, the controlling error representing a discrepancy between the route and the simulated route. One of the routes is selected based on controlling errors between the routes and associated simulated routes. The autonomous vehicle is operated to move from the first point to the second point according to the selected route.
    Type: Grant
    Filed: July 21, 2016
    Date of Patent: December 25, 2018
    Assignee: BAIDU USA LLC
    Inventors: Qi Kong, Fan Zhu, Dong Li, Yifei Jiang, Li Zhuang, Guang Yang, Jingao Wang
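The selection step described above can be sketched as follows; the toy vehicle model, the lag constant, and the error metric are assumptions for illustration only. Each candidate route is simulated, the controlling error measures the discrepancy between the planned and simulated waypoints, and the route with the smallest error wins.

```python
# Hypothetical sketch: pick the route whose simulation tracks the plan best.
def simulate(route, lag=0.3):
    """Toy vehicle model: the simulated position lags each planned waypoint."""
    sim, pos = [], route[0]
    for wp in route:
        pos = pos + (wp - pos) * (1 - lag)
        sim.append(pos)
    return sim

def controlling_error(route):
    """Total discrepancy between the planned route and its simulation."""
    sim = simulate(route)
    return sum(abs(p - s) for p, s in zip(route, sim))

def select_route(routes):
    return min(routes, key=controlling_error)

routes = [
    [0.0, 1.0, 2.0, 3.0],   # gentle, evenly spaced waypoints
    [0.0, 3.0, 3.0, 3.0],   # aggressive jump, larger tracking error
]
best = select_route(routes)
```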
  • Patent number: 10163036
    Abstract: One or more image parameters of an image may be analyzed using a hierarchical set of models. Executing individual models in the set of models may generate outputs from analysis of different image parameters of the image. Inputs of one or more of the models may be conditioned on a set of outputs derived from one or more preceding model in the hierarchy.
    Type: Grant
    Filed: September 22, 2016
    Date of Patent: December 25, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Andreas Lehrmann, Leonid Sigal
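The conditioning of later models on earlier outputs can be sketched minimally (the model names, parameters, and threshold are all invented for illustration): a first model estimates one image parameter, and a second model receives that output as additional input.

```python
# Hypothetical sketch of a two-level hierarchical model set.
def brightness_model(image):
    """First model: analyse one image parameter (mean intensity)."""
    return {"brightness": sum(image) / len(image)}

def contrast_model(image, upstream):
    """Second model, conditioned on the preceding model's output."""
    mean = upstream["brightness"]
    return {"contrast": max(image) - min(image), "is_dark": mean < 0.5}

def run_hierarchy(image):
    out = brightness_model(image)
    out.update(contrast_model(image, out))
    return out

result = run_hierarchy([0.1, 0.2, 0.4, 0.3])
```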
  • Patent number: 10076842
    Abstract: Described are machine vision systems and methods for simultaneous kinematic and hand-eye calibration. A machine vision system includes a robot or motion stage and a camera in communication with a control system. The control system is configured to move the robot or motion stage to poses, and for each pose: capture an image of calibration target features and robot joint angles or motion stage encoder counts. The control system is configured to obtain initial values for robot or motion stage calibration parameters, and determine initial values for hand-eye calibration parameters based on the initial values for the robot or motion stage calibration parameters, the image, and joint angles or encoder counts. The control system is configured to determine final values for the hand-eye calibration parameters and robot or motion stage calibration parameters by refining the hand-eye calibration parameters and robot or motion stage calibration parameters to minimize a cost function.
    Type: Grant
    Filed: September 28, 2016
    Date of Patent: September 18, 2018
    Assignee: Cognex Corporation
    Inventors: Lifeng Liu, Cyril C. Marrion, Tian Gan
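The refinement step above — start from initial calibration values, then minimize a cost summing residuals over all captured poses — can be reduced to a one-parameter caricature. Everything here is an assumption: a single 1D hand-eye offset stands in for the full parameter set, and a brute-force scan stands in for the solver's iterative refinement.

```python
# Hypothetical 1D stand-in for hand-eye calibration refinement.
TRUE_OFFSET = 0.35                                # ground-truth camera offset
poses = [0.0, 1.0, 2.0, 3.0]                      # robot positions, one per pose
observed = [p + TRUE_OFFSET for p in poses]       # camera observations per pose

def calibration_cost(offset):
    """Cost function: squared residuals summed over all captured poses."""
    return sum((obs - (p + offset)) ** 2 for p, obs in zip(poses, observed))

def refine(initial, step=0.001, span=1.0):
    """Brute-force scan around the initial value, standing in for the
    iterative nonlinear refinement a real solver would perform."""
    candidates = [initial - span + step * i
                  for i in range(int(2 * span / step) + 1)]
    return min(candidates, key=calibration_cost)

final_offset = refine(initial=0.1)   # converges near the true offset
```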
  • Patent number: 10075818
    Abstract: Traces are collected by multiple portable devices moving within an area that includes an indoor region, with each of the traces including measurements of wireless signals at different times, including measurements of wireless signals from signal sources disposed within the area. A motion map for the geographic area is constructed by determining, for each of the cells that make up the motion map, respective probabilities of moving in various directions relative to each cell. Location estimates for the portable devices and the signal sources are generated using graph-based SLAM optimization of the location estimates. The graph-based SLAM optimization includes determining to which of the cells of the motion map the location estimate corresponds and applying the measurements of wireless signal sources and the set of probabilities of the cells as a first constraint and a second constraint, respectively, in the graph-based SLAM optimization.
    Type: Grant
    Filed: September 13, 2017
    Date of Patent: September 11, 2018
    Assignee: GOOGLE LLC
    Inventors: Etienne Le Grand, Mohammed Khider, Luigi Bruno
  • Patent number: 10062144
    Abstract: A set of spherical harmonics is defined that is an operationally optimal small finite subset of the infinite number of spherical harmonics allowed to exist mathematically. The composition of the subset differs depending on its position on the virtual hemisphere. The subsets are further divided into small spherical tesserae whose dimensions vary depending on the distance from the hemispherical center. The images of the outside visual scenes are projected on the flat surface of the webcam and from there are read and recalculated programmatically as if the images had been projected on the hemisphere; rotational invariants are then computed in the smallest tesserae using numerical integration, and invariants from neighboring tesserae are added to compute the rotational invariant of their union. Every computed invariant is checked against the library and stored there if there is no match. The rotational invariants are used solely for visual recognition, classification, and operational decision making.
    Type: Grant
    Filed: July 6, 2017
    Date of Patent: August 28, 2018
    Inventor: Alex Simon Blaivas
  • Patent number: 10028632
    Abstract: A method of controlling a mobile robot, the method including monitoring a first system of the mobile robot to detect a first error associated with the first system, monitoring a second system of the mobile robot to detect a second error associated with the second system, and when the first error and the second error are detected at the same time, determining that a third error has occurred.
    Type: Grant
    Filed: May 26, 2016
    Date of Patent: July 24, 2018
    Assignee: Dyson Technology Limited
    Inventors: Maximilian John Britain, William Matthew Wakeling, Christopher John Ord
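The claim's logic above maps almost directly onto code (the error names are illustrative, not Dyson's): two systems are monitored independently, and when both errors are detected at the same time, a distinct third error is reported instead.

```python
# Direct sketch of the two-monitor diagnosis rule; error names are invented.
FIRST_ERROR, SECOND_ERROR, THIRD_ERROR = "drive_fault", "vision_fault", "stuck"

def diagnose(first_error_active, second_error_active):
    """Report the third error when both component errors coincide."""
    errors = []
    if first_error_active and second_error_active:
        errors.append(THIRD_ERROR)       # simultaneous: combined diagnosis
    else:
        if first_error_active:
            errors.append(FIRST_ERROR)
        if second_error_active:
            errors.append(SECOND_ERROR)
    return errors
```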
  • Patent number: 9969089
    Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes an actuator system to move the body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
    Type: Grant
    Filed: July 27, 2016
    Date of Patent: May 15, 2018
    Assignee: iRobot Corporation
    Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
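The estimation idea above — compare what the wheels commanded against what the body actually did — can be sketched with assumed sensor models (the differential-drive odometry formula and the gyro stand-in are illustrative, not iRobot's estimator):

```python
# Hypothetical sketch of carpet drift estimation from two sensor types.
def heading_from_odometry(left_ticks, right_ticks, ticks_per_rad=100.0):
    """Heading change implied by differential wheel rotations (first sensor set)."""
    return (right_ticks - left_ticks) / ticks_per_rad

def estimate_carpet_drift(left_ticks, right_ticks, gyro_heading_change):
    """Drift = rotation the body actually performed (second sensor set)
    minus the rotation the odometry implies the wheels commanded."""
    return gyro_heading_change - heading_from_odometry(left_ticks, right_ticks)

# Wheels imply a 0.5 rad turn, but the gyro measured only 0.4 rad:
drift = estimate_carpet_drift(left_ticks=0, right_ticks=50,
                              gyro_heading_change=0.4)
```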
  • Patent number: 9966060
    Abstract: The method is performed at an electronic device with one or more processors and memory storing one or more programs for execution by the one or more processors. A first speech input including at least one word is received. A first phonetic representation of the at least one word is determined, the first phonetic representation comprising a first set of phonemes selected from a speech recognition phonetic alphabet. The first set of phonemes is mapped to a second set of phonemes to generate a second phonetic representation, where the second set of phonemes is selected from a speech synthesis phonetic alphabet. The second phonetic representation is stored in association with a text string corresponding to the at least one word.
    Type: Grant
    Filed: February 28, 2017
    Date of Patent: May 8, 2018
    Assignee: Apple Inc.
    Inventors: Devang K. Naik, Thomas R. Gruber, Liam Weiner, Justin G. Binder, Charles Srisuwananukorn, Gunnar Evermann, Shaun Eric Williams, Hong Chen, Lia T. Napolitano
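The mapping-and-storage step described above can be sketched with a toy table (the phoneme alphabets and the mapping entries here are invented, not Apple's): recognition-alphabet phonemes are mapped to synthesis-alphabet phonemes, and the result is stored against the word's text string.

```python
# Toy sketch: recognition phonemes -> synthesis phonemes, stored per word.
RECOG_TO_SYNTH = {"AH": "@", "P": "p", "L": "l"}   # hypothetical mapping table

def to_synthesis_phonemes(recognition_phonemes):
    """Map a first set of phonemes to the synthesis alphabet."""
    return [RECOG_TO_SYNTH[ph] for ph in recognition_phonemes]

lexicon = {}

def learn_pronunciation(word, recognition_phonemes):
    """Store the second phonetic representation against the word's text."""
    lexicon[word] = to_synthesis_phonemes(recognition_phonemes)

learn_pronunciation("apple", ["AH", "P", "L"])
```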
  • Patent number: 9939274
    Abstract: Providing a route guide to a destination using building information modeling (BIM) data. A request from a user for a route guide to a destination in a building is received. BIM data for the building, security information for a route, and a user profile of the user that requested the route guide to the destination is received. A route guide to the destination in the building is created, based at least on the BIM data, security information, and the user profile. Creating the route guide may be further based on use restriction information, including time, weight, operation, impassable information, information related to the method of using the facility equipment, or precaution information when using the facility equipment. The route guide may include a method of using the facility equipment in the building, a method of operating, a direction of operation, a method of unlocking or locking, or precaution information.
    Type: Grant
    Filed: February 13, 2014
    Date of Patent: April 10, 2018
    Assignee: International Business Machines Corporation
    Inventors: Yasutaka Nishimura, Masami Tada, Akihiko Takajo, Takahito Tashiro
  • Patent number: 9930252
    Abstract: Methods, systems, and robots for processing omni-directional image data are disclosed. A method includes receiving omni-directional image data representative of a panoramic field of view and segmenting, by one or more processors, the omni-directional image data into a plurality of image slices. Each image slice of the plurality of image slices is representative of at least a portion of the panoramic field of view of the omni-directional image data. The method further includes calculating a slice descriptor for each image slice of the plurality of image slices and generating a current sequence of slice descriptors. The current sequence of slice descriptors includes the calculated slice descriptor for each image slice of the plurality of image slices.
    Type: Grant
    Filed: December 6, 2012
    Date of Patent: March 27, 2018
    Assignee: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.
    Inventor: Joseph Maria Angelo Djugash
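The slicing-and-descriptor pipeline above can be sketched in one dimension (the descriptor choice, mean intensity, is an assumption; the patent does not specify it here): the panoramic image is segmented into slices, a descriptor is computed per slice, and the results form the current sequence of slice descriptors.

```python
# Hypothetical sketch: a 1-row panorama segmented into described slices.
def segment(panorama, n_slices):
    """Segment the omni-directional image into equal-width slices."""
    width = len(panorama) // n_slices
    return [panorama[i * width:(i + 1) * width] for i in range(n_slices)]

def slice_descriptor(image_slice):
    """Toy slice descriptor: mean intensity of the slice."""
    return sum(image_slice) / len(image_slice)

def descriptor_sequence(panorama, n_slices=4):
    """Current sequence of slice descriptors, one per image slice."""
    return [slice_descriptor(s) for s in segment(panorama, n_slices)]

seq = descriptor_sequence([0, 0, 10, 10, 20, 20, 30, 30])
```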
  • Patent number: 9824451
    Abstract: A system for determining a camera pose relative to an object including a unique feature may include a monocular camera configured to produce an image of the object, and a processing unit in operative communication with the monocular camera. The processing unit may be configured to: identify the unique feature of the object in the image produced by the monocular camera, synthesize at least four points along a contour of the object in the image using the identified unique feature as a starting point, synthesize a same number of points as synthesized in the image along a contour of the object in a reference model, the reference model preprogrammed into a memory of the processing unit, correlate the points from the reference model to the image, and determine a pose of the monocular camera based on the correlated points.
    Type: Grant
    Filed: April 7, 2016
    Date of Patent: November 21, 2017
    Assignee: The Boeing Company
    Inventors: Michelle Crivella, Philip Freeman
  • Patent number: 9818198
    Abstract: Motorized machinery, such as overhead cranes, is widely used in industries all over the world. It is not easy to move crane payloads without oscillation, which increases the likelihood of obstacle collisions and other accidents. One possible solution to such problems is aiding the operator with a dynamic map of the workspace that shows the current position of obstacles. This method discloses the use of a camera to take images of the workspace, image blurring to smooth the obtained images, and contour drawing to produce an individual, near real-time map of the workspace. In one or more embodiments, known obstacles may be tagged in a manner that is readable by the camera. This image and historical images of the same workspace are layered on top of one another to produce a map of obstacles on the workspace floor. This imaging and layering can produce a near real-time map of obstacles that can be used to guide heavy motorized machinery around a workspace without incident.
    Type: Grant
    Filed: October 19, 2015
    Date of Patent: November 14, 2017
    Assignee: University of Louisiana at Lafayette
    Inventors: Joshua Vaughan, Mohammad Sazzad Rahman
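The layering step described above can be sketched with assumed representations (binary occupancy grids and an exponential blend are illustrative choices, not the patent's): the newest grid is layered over the accumulated historical map so persistent obstacles stay strong while one-off detections fade.

```python
# Hypothetical sketch: layering the current occupancy grid over history.
def layer_maps(historical, current, decay=0.5):
    """Blend the newest grid on top of the accumulated map, cell by cell."""
    return [[decay * h + (1 - decay) * c for h, c in zip(hrow, crow)]
            for hrow, crow in zip(historical, current)]

history = [[0.0, 0.0], [0.0, 1.0]]   # obstacle previously seen at (1, 1)
current = [[0.0, 1.0], [0.0, 1.0]]   # new frame: obstacle also at (0, 1)
fused = layer_maps(history, current)
```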