Vision Sensor (e.g., Camera, Photocell) Patents (Class 700/259)
  • Patent number: 12237636
    Abstract: A method, system and computer program product are provided for determining wire contact insertion based on image analysis. Methods include: acquiring at least one image of a connector having a plurality of wire contact insertion holes; identifying a wire contact within a wire contact insertion hole of the plurality of wire contact insertion holes; determining that the wire contact insertion hole is a correct wire contact insertion hole; determining that the wire contact has been fully inserted into the correct wire contact insertion hole; and providing feedback indicating that the wire contact is fully inserted into the correct wire contact insertion hole. The feedback includes in some cases a first indicator indicating that the wire contact is in the correct wire contact insertion hole and a second indicator indicating that the wire contact is fully inserted into the correct wire contact insertion hole.
    Type: Grant
    Filed: April 22, 2022
    Date of Patent: February 25, 2025
    Assignee: The Boeing Company
    Inventors: Michael Cui, Heiko Hoffmann, Bradley J. Mitchell
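    A minimal Python sketch of the kind of two-indicator decision logic the abstract of patent 12237636 describes, assuming hypothetical upstream image-analysis outputs (a detected hole ID, the expected hole ID, and a measured insertion depth); none of these names or values come from the patent.

    ```python
    def check_wire_insertion(detected_hole: str, expected_hole: str,
                             insertion_depth_mm: float, full_depth_mm: float = 6.0):
        """Return two feedback indicators: correct-hole and fully-inserted."""
        # First indicator: the wire contact sits in the expected insertion hole.
        correct_hole = (detected_hole == expected_hole)
        # Second indicator: the measured depth reaches the nominal full depth
        # (6.0 mm is an illustrative value, not taken from the patent).
        fully_inserted = correct_hole and insertion_depth_mm >= full_depth_mm
        return correct_hole, fully_inserted

    # Example: hole "A3" detected where "A3" was expected, inserted 6.2 mm deep.
    print(check_wire_insertion("A3", "A3", 6.2))  # (True, True)
    ```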
  • Patent number: 12226920
    Abstract: Implementations described herein relate to training and refining robotic control policies using imitation learning techniques. A robotic control policy can be initially trained based on human demonstrations of various robotic tasks. Further, the robotic control policy can be refined based on human interventions while a robot is performing a robotic task. In some implementations, the robotic control policy may determine whether the robot will fail in performance of the robotic task, and prompt a human to intervene in performance of the robotic task. In additional or alternative implementations, a representation of the sequence of actions can be visually rendered for presentation to the human so that the human can proactively intervene in performance of the robotic task.
    Type: Grant
    Filed: August 11, 2023
    Date of Patent: February 18, 2025
    Assignee: GOOGLE LLC
    Inventors: Seyed Mohammad Khansari Zadeh, Eric Jang, Daniel Lam, Daniel Kappler, Matthew Bennice, Brent Austin, Yunfei Bai, Sergey Levine, Alexander Irpan, Nicolas Sievers, Chelsea Finn
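    A schematic sketch of the intervention loop described in the abstract of patent 12226920, under the assumption that the policy exposes both an action and a failure-probability estimate; all function names and the threshold are illustrative placeholders, not from the patent.

    ```python
    import random

    def predicted_failure_probability(observation):
        # Placeholder for the policy's own estimate of imminent task failure.
        return random.random()

    def policy_action(observation):
        return "continue"  # placeholder policy action

    def human_intervention(observation):
        return "corrective_action"  # placeholder for the human-provided action

    def run_episode(steps=10, failure_threshold=0.9):
        corrections = []  # (observation, corrective action) pairs for refinement
        for t in range(steps):
            obs = {"step": t}
            if predicted_failure_probability(obs) > failure_threshold:
                # Prompt the human to intervene; log the correction for retraining.
                action = human_intervention(obs)
                corrections.append((obs, action))
            else:
                action = policy_action(obs)
            # ...apply `action` to the robot here...
        return corrections

    print(len(run_episode()), "human corrections collected")
    ```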
  • Patent number: 12229875
    Abstract: A program causes a display data generation apparatus to function as a moving image acquirer that acquires imaging data indicating a moving image of an environment including a device, an estimator that estimates a position and an orientation of an imager capturing the moving image based on an object in the moving image, a log acquirer that acquires log data indicating a log of an operation of the device and an object position corresponding to the object, and a display controller that generates display data for displaying the moving image and a model moving image to be played synchronously. The model moving image is acquired by projecting, at the position and in the orientation, a three-dimensional model of the device placed with respect to the object position while changing the three-dimensional model based on the log.
    Type: Grant
    Filed: March 23, 2022
    Date of Patent: February 18, 2025
    Assignee: MITSUBISHI ELECTRIC CORPORATION
    Inventor: Takayuki Yamaoka
  • Patent number: 12210342
    Abstract: In various embodiments, communications latency and/or bandwidth of a communications connection between a device being controlled and a remote operator workstation being used to control the device is measured. One or more parameters of the system, e.g., operator control stations and/or the device, e.g., robotic device, being remotely controlled, e.g., teleoperated, are altered in response to one or more of: i) communications latency, ii) communications bandwidth, iii) a task to be performed; and/or iv) environmental conditions. By changing such parameters, quantities such as the maximum speed of device operation, a maximum acceleration, or a maximum rate of movement of a device element such as the forks of a forklift can be controlled or limited. The changing of parameters takes into consideration one, more or all of: i) communications latency, ii) communications bandwidth, iii) a task to be performed; and/or iv) environmental conditions.
    Type: Grant
    Filed: April 20, 2022
    Date of Patent: January 28, 2025
    Assignee: Third Wave Automation, Inc.
    Inventors: Katarina Bouma, Connor Schenck, Julian Mason
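    A minimal sketch of latency- and bandwidth-aware parameter limiting in the spirit of patent 12210342; the scaling formula, thresholds, and parameter names are assumptions for illustration only.

    ```python
    def limit_parameters(latency_ms: float, bandwidth_mbps: float,
                         base_max_speed: float = 2.0, base_max_accel: float = 1.0):
        """Scale down speed/acceleration limits as latency rises or bandwidth drops."""
        # Illustrative scaling: limits shrink linearly with latency, clamped at 10%,
        # and with available bandwidth relative to a nominal 10 Mbps.
        latency_factor = max(0.1, 1.0 - latency_ms / 400.0)
        bandwidth_factor = min(1.0, bandwidth_mbps / 10.0)
        scale = min(latency_factor, bandwidth_factor)
        return {"max_speed": base_max_speed * scale,
                "max_accel": base_max_accel * scale}

    print(limit_parameters(latency_ms=200, bandwidth_mbps=4))
    ```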
  • Patent number: 12190544
    Abstract: A method according to the present disclosure includes (a) estimating pixel coordinate values of a plurality of characteristic points set in advance in a robot arm from an image of the robot arm taken by a camera using a first machine learning model, (b) estimating first coordinate values of the plurality of characteristic points in a 3D camera coordinate system from the pixel coordinate values of the plurality of characteristic points using a second machine learning model, (c) calculating second coordinate values of the plurality of characteristic points in a 3D robot coordinate system using an encoder value of the robot arm, and (d) executing the steps (a) through (c) with respect to a plurality of postures of the robot arm to estimate calibration parameters including an external parameter of the camera using the first coordinate values and the second coordinate values of the plurality of characteristic points in the plurality of postures.
    Type: Grant
    Filed: June 21, 2023
    Date of Patent: January 7, 2025
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Akinobu Sato
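    The final calibration step of patent 12190544 amounts to fitting a rigid transform between the characteristic points expressed in the camera frame and the same points in the robot frame. Below is a minimal sketch using the standard Kabsch/SVD method; the machine-learning keypoint estimators of steps (a) and (b) are replaced by given point arrays, and the synthetic data is purely illustrative.

    ```python
    import numpy as np

    def fit_rigid_transform(cam_pts: np.ndarray, robot_pts: np.ndarray):
        """Return R, t such that robot_pts ~= cam_pts @ R.T + t (Kabsch/SVD)."""
        c_cam, c_rob = cam_pts.mean(axis=0), robot_pts.mean(axis=0)
        H = (cam_pts - c_cam).T @ (robot_pts - c_rob)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c_rob - R @ c_cam
        return R, t

    # Synthetic check: random points, known rotation about z and translation.
    rng = np.random.default_rng(0)
    P_cam = rng.random((20, 3))
    theta = 0.3
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    P_rob = P_cam @ R_true.T + np.array([0.1, 0.2, 0.3])
    R_est, t_est = fit_rigid_transform(P_cam, P_rob)
    print(np.allclose(R_est, R_true), np.round(t_est, 3))
    ```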
  • Patent number: 12162171
    Abstract: A robot control system according to one or more embodiments may include a robot that performs a task in relation to a workpiece, a coordinate measuring machine that measures a three-dimensional shape of the workpiece, a control device that controls the robot in accordance with a measurement result from the coordinate measuring machine, and an image capturing apparatus that captures an image of the workpiece. An image capture interval for the image capturing apparatus is shorter than a measurement interval for the coordinate measuring machine. In a period after the coordinate measuring machine conducts a measurement and until the robot performs the task, the control device is configured to compute a position of the workpiece by referring to an image capture result from the image capturing apparatus.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: December 10, 2024
    Assignee: JOHNAN Corporation
    Inventors: Kozo Moriyama, Shin Kameyama, Truong Gia Vu, Lucas Brooks
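    A small sketch of how a slow, accurate coordinate measurement can be combined with faster camera observations between measurements, as in patent 12162171; the class structure and 2D displacement representation are assumptions, not taken from the patent.

    ```python
    import numpy as np

    class WorkpieceTracker:
        """Correct a slow, accurate CMM position with fast camera-observed drift."""
        def __init__(self):
            self.cmm_position = None
            self.camera_offset = np.zeros(2)

        def on_cmm_measurement(self, position):
            self.cmm_position = np.asarray(position, dtype=float)
            self.camera_offset[:] = 0.0          # drift is re-anchored to the CMM

        def on_camera_frame(self, observed_displacement):
            self.camera_offset += np.asarray(observed_displacement, dtype=float)

        def current_position(self):
            return self.cmm_position + self.camera_offset

    tracker = WorkpieceTracker()
    tracker.on_cmm_measurement([100.0, 50.0])
    tracker.on_camera_frame([0.4, -0.1])   # workpiece nudged between measurements
    tracker.on_camera_frame([0.2, 0.0])
    print(tracker.current_position())       # [100.6  49.9]
    ```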
  • Patent number: 12167167
    Abstract: A mapping between environments and devices included therein may be maintained, such that a configuration of each environment is known. Upon detecting that a user is within an environment, and based on a current device state of devices within the environment, an application may be generated and presented to the user via a corresponding user device. The application may allow the user to activate and control the devices within the environment. In particular, the application may depict selectable controls that correspond to functions or operations associated with the different devices within the environment. The application may also be dynamically updated based on an updated current device state of the devices.
    Type: Grant
    Filed: October 31, 2023
    Date of Patent: December 10, 2024
    Assignee: Amazon Technologies, Inc.
    Inventors: Milo Oostergo, Gary Zhong
  • Patent number: 12165538
    Abstract: A telesurgical mentoring platform with a wheeled base, a lower rack mounted on the base, an upper rack extending vertically from the lower rack, a compactly foldable articulated arm that is configured to extend horizontally outward away from the upper rack and configured to connect to a connector piece holding an end effectuator at its distal end, and a tablet personal computer. The console is configured to be readily mobilized on the floor of an existing operating room and is capable of providing a connectivity point for communication, audiovisual, and data transfer services in an operating room.
    Type: Grant
    Filed: November 4, 2022
    Date of Patent: December 10, 2024
    Assignee: MENDAERA, INC.
    Inventors: David Paul Schultz, Adam Daniel John
  • Patent number: 12158756
    Abstract: An autonomous vehicle fleet may include multiple autonomous vehicles. The autonomous vehicles of the fleet may be configured to request remote operator input in response to encountering a situation internally or in the environment that the vehicle is unable to resolve. The autonomous vehicle of the fleet requests remote operator input through a fleet queue system that prioritizes the input requests and matches requests to available remote operators for processing and resolving the situations.
    Type: Grant
    Filed: November 12, 2021
    Date of Patent: December 3, 2024
    Assignee: Zoox, Inc.
    Inventor: Alexander Jacques Maria Mertens
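    A minimal sketch of a prioritized fleet queue matching assistance requests to available remote operators, in the spirit of patent 12158756; the priority scheme, field names, and matching policy are illustrative assumptions.

    ```python
    import heapq
    import itertools

    class RemoteAssistQueue:
        """Priority queue matching vehicle help requests to available operators."""
        def __init__(self):
            self._heap = []
            self._counter = itertools.count()   # tie-breaker keeps FIFO order

        def submit(self, vehicle_id: str, priority: int, situation: str):
            # Lower number = more urgent; counter breaks ties by arrival order.
            heapq.heappush(self._heap, (priority, next(self._counter),
                                        vehicle_id, situation))

        def assign_next(self, operator_id: str):
            if not self._heap:
                return None
            priority, _, vehicle_id, situation = heapq.heappop(self._heap)
            return {"operator": operator_id, "vehicle": vehicle_id,
                    "situation": situation, "priority": priority}

    q = RemoteAssistQueue()
    q.submit("av-7", priority=2, situation="blocked lane")
    q.submit("av-3", priority=1, situation="unrecognized obstacle")
    print(q.assign_next("op-12"))   # av-3 is served first (higher urgency)
    ```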
  • Patent number: 12151889
    Abstract: An automated packing system is disclosed for placing a plurality of objects into a shipping container. The system includes a supply bin receiving conveyor for receiving a supply bin at a supply station, a destination container location assessment system for positioning a destination container, a detection system for detecting a plurality of objects within the supply bin responsive to the position of the supply bin on the receiving conveyor as aligned by the alignment system, an object selection system for selecting a selected object from the plurality of objects to be placed into the shipping container, and a programmable motion device for grasping and acquiring the selected object from the plurality of objects at the supply station, and for placing the selected object into the shipping container in a selected orientation.
    Type: Grant
    Filed: October 29, 2021
    Date of Patent: November 26, 2024
    Assignee: Berkshire Grey Operating Company, Inc.
    Inventors: Benjamin Cohen, Christopher Geyer, Thomas Koletschka, Jay Link, Joshua Lurz, Matthew T. Mason, Richard Musgrave, Ryan O'Hern, Gene Temple Price, Joseph Romano, Prasanna Velagapudi, Thomas Wagner, Jeremy Saslaw
  • Patent number: 12145270
    Abstract: The present disclosure discloses an autonomous mobile grabbing method for a mechanical arm based on visual-haptic fusion under a complex illumination condition, which mainly includes approaching control over a target position and feedback control over environment information. According to the method, under the complex illumination condition, weighted fusion is conducted on visible light and depth images of a preselected region, identification and positioning of a target object are completed based on a deep neural network, and a mobile mechanical arm is driven to continuously approach the target object; in addition, the pose of the mechanical arm is adjusted according to contact force information of a sensor module, the external environment and the target object; and meanwhile, visual information and haptic information of the target object are fused, and the optimal grabbing pose and the appropriate grabbing force of the target object are selected.
    Type: Grant
    Filed: October 26, 2021
    Date of Patent: November 19, 2024
    Assignee: SOUTHEAST UNIVERSITY
    Inventors: Aiguo Song, Linhu Wei, Chaolong Qin, Yu Zhao
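    A tiny sketch of the weighted fusion of visible-light and depth images mentioned in the abstract of patent 12145270; normalizing by 255 and choosing the weight from illumination quality are assumptions for this sketch, not details from the patent.

    ```python
    import numpy as np

    def fuse_visible_depth(visible: np.ndarray, depth: np.ndarray, w_visible: float):
        """Weighted fusion of a visible-light image and a depth image (8-bit inputs)."""
        v = visible.astype(float) / 255.0
        d = depth.astype(float) / 255.0
        # In practice w_visible would be derived from illumination quality.
        return w_visible * v + (1.0 - w_visible) * d

    # Dim scene: lean more on depth than on the under-exposed visible image.
    visible = np.random.randint(0, 40, (4, 4))
    depth = np.random.randint(0, 255, (4, 4))
    fused = fuse_visible_depth(visible, depth, w_visible=0.3)
    print(fused.shape, round(float(fused.mean()), 3))
    ```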
  • Patent number: 12133625
    Abstract: A dirtiness level determining method, applied to a robot cleaner comprising an image sensor, comprising: capturing an image of a reference surface as a reference image; capturing a current image; calculating a fixed pattern according to a difference between the reference image and the current image; calculating a dirtiness level of the image sensor according to the fixed pattern; and generating a notifying message if the dirtiness level is higher than a dirtiness threshold. The dirtiness level of the image sensor can be automatically determined by the robot cleaner, thus the user can be notified before the robot cleaner can no longer operate normally.
    Type: Grant
    Filed: November 2, 2022
    Date of Patent: November 5, 2024
    Assignee: PixArt Imaging Inc.
    Inventor: Guo-Zhen Wang
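    A minimal sketch of a fixed-pattern dirtiness check in the spirit of patent 12133625: pixels whose values barely change between the reference image and a later image are treated as sensor dirt. The pixel-difference and dirtiness thresholds are illustrative, not the patent's values.

    ```python
    import numpy as np

    def dirtiness_level(reference: np.ndarray, current: np.ndarray,
                        pixel_delta: int = 10) -> float:
        """Fraction of pixels whose value stays (nearly) fixed between the frames."""
        diff = np.abs(reference.astype(int) - current.astype(int))
        fixed_pattern = diff < pixel_delta        # pixels that barely change
        return float(fixed_pattern.mean())

    reference = np.random.randint(0, 255, (120, 160))
    current = np.random.randint(0, 255, (120, 160))
    level = dirtiness_level(reference, current)
    if level > 0.8:                               # illustrative dirtiness threshold
        print("notify user: clean the image sensor")
    print(round(level, 3))
    ```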
  • Patent number: 12131529
    Abstract: A method for performing a task by a robotic device includes mapping a group of task image pixel descriptors associated with a first group of pixels in a task image of a task environment to a group of teaching image pixel descriptors associated with a second group of pixels in a teaching image based on positioning the robotic device within the task environment. The method also includes determining a relative transform between the task image and the teaching image based on mapping the plurality of task image pixel descriptors. The relative transform indicates a change in one or more of points of 3D space between the task image and the teaching image. The method also includes performing the task associated with the set of parameterized behaviors based on updating one or more parameters of a set of parameterized behaviors associated with the teaching image based on determining the relative transform.
    Type: Grant
    Filed: January 18, 2023
    Date of Patent: October 29, 2024
    Assignee: TOYOTA RESEARCH INSTITUTE, INC.
    Inventors: Jeremy Ma, Josh Petersen, Umashankar Nagarajan, Michael Laskey, Daniel Helmick, James Borders, Krishna Shankar, Kevin Stone, Max Bajracharya
  • Patent number: 12108926
    Abstract: A mobile cleaning robot system can include a mobile cleaning robot and processing circuitry. The mobile cleaning robot can include a camera and can be operable to clean a floor surface of an environment. The processing circuitry can be in communication with the mobile cleaning robot and the camera, the processing circuitry configured to produce an image output based on an optical field of view of the camera. The processing circuitry can also detect a visual fiducial in the image output and can determine a behavior modification based on the visual fiducial. The processing circuitry can modify movement of the mobile cleaning robot based on the behavior modification.
    Type: Grant
    Filed: August 21, 2023
    Date of Patent: October 8, 2024
    Assignee: iRobot Corporation
    Inventors: Eric J. Burbank, Oliver Lewis
  • Patent number: 12102403
    Abstract: A robotic surgical system with user engagement monitoring includes a surgeon console having a hand detection system and a tracking device including an image capture device configured to capture an image of a user position reference point, wherein information from the hand detection system and the tracking device are combined to control operation of the robotic surgical system.
    Type: Grant
    Filed: December 17, 2019
    Date of Patent: October 1, 2024
    Assignee: Covidien LP
    Inventors: William J. Peine, Steven J. Levine, Albert Dvornik, Mantena V. Raju, Chen Chen
  • Patent number: 12097609
    Abstract: The disclosure provides systems and methods for mitigating slip of a robot appendage. In one aspect, a method for mitigating slip of a robot appendage includes (i) receiving an input from one or more sensors, (ii) determining, based on the received input, an appendage position of the robot appendage, (iii) determining a filter position for the robot appendage, (iv) determining a distance between the appendage position and the filter position, (v) determining, based on the distance, a force to apply to the robot appendage, (vi) causing one or more actuators to apply the force to the robot appendage, (vii) determining whether the distance is greater than a threshold distance, and (viii) responsive to determining that the distance is greater than the threshold distance, the control system adjusting the filter position to a position, which is the threshold distance from the appendage position, for use in a next iteration.
    Type: Grant
    Filed: January 11, 2022
    Date of Patent: September 24, 2024
    Assignee: Boston Dynamics, Inc.
    Inventors: Stephen Berard, Alex Yu Khripin, Benjamin Swilling
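    A sketch of one control iteration of the slip-mitigation scheme in patent 12097609: a virtual filter position pulls on the appendage with a spring-like force, and when the appendage-to-filter distance exceeds a threshold, the filter is dragged back to exactly the threshold distance for the next iteration. The stiffness gain and threshold are illustrative values.

    ```python
    import numpy as np

    def slip_mitigation_step(appendage_pos, filter_pos, stiffness=800.0,
                             threshold=0.02):
        """One iteration: compute commanded force and the updated filter position."""
        appendage_pos = np.asarray(appendage_pos, dtype=float)
        filter_pos = np.asarray(filter_pos, dtype=float)
        error = filter_pos - appendage_pos
        distance = np.linalg.norm(error)
        force = stiffness * error                 # spring pulls foot toward filter
        if distance > threshold:
            # Clamp: move the filter onto the sphere of radius `threshold`
            # around the appendage so the commanded force stays bounded.
            filter_pos = appendage_pos + error * (threshold / distance)
        return force, filter_pos

    force, new_filter = slip_mitigation_step([0.0, 0.0, 0.0], [0.05, 0.0, 0.0])
    print(np.round(force, 1), np.round(new_filter, 3))
    ```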
  • Patent number: 12083688
    Abstract: Provided is an apparatus for arranging objects in such a manner as to permit the user to handle the objects more easily. The apparatus includes a robotic arm device configured to arrange one or more objects, and circuitry configured to determine one or more characteristics of a user, determine an arrangement position of each object of the one or more objects to be arranged based on the one or more determined characteristics of the user, and initiate control of the robotic arm device to arrange each object according to the determined arrangement position of the object.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: September 10, 2024
    Assignee: SONY CORPORATION
    Inventors: Kazuo Hongo, Toshimitsu Tsuboi, Shunsuke Yajima
  • Patent number: 12036675
    Abstract: A robot control device includes: a trained model built by being trained on work data; a control data acquisition section which acquires control data of the robot based on data from the trained model; base trained models built for each of a plurality of simple operations by being trained on work data; an operation label storage section which stores operation labels corresponding to the base trained models; a base trained model combination information acquisition section which acquires combination information when the trained model is represented by a combination of a plurality of the base trained models, by acquiring a similarity between the trained model and the respective base trained models; and an information output section which outputs the operation label corresponding to each of the base trained models which represent the trained model.
    Type: Grant
    Filed: December 27, 2019
    Date of Patent: July 16, 2024
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Hitoshi Hasunuma, Takeshi Yamamoto, Kazuki Kurashima
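    A sketch of describing a trained model as a combination of base trained models via a similarity score, loosely following patent 12036675; representing models as plain feature vectors and comparing them with cosine similarity is an assumption of this sketch, not the patent's method.

    ```python
    import numpy as np

    def describe_trained_model(model_vec, base_models, labels, top_k=2):
        """Return the operation labels of the most similar base trained models."""
        model_vec = np.asarray(model_vec, dtype=float)
        sims = []
        for name, base_vec in zip(labels, base_models):
            base_vec = np.asarray(base_vec, dtype=float)
            cos = base_vec @ model_vec / (np.linalg.norm(base_vec) *
                                          np.linalg.norm(model_vec))
            sims.append((cos, name))
        sims.sort(reverse=True)
        return [(name, round(cos, 2)) for cos, name in sims[:top_k]]

    base_models = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # one vector per simple operation
    labels = ["reach", "grasp", "place"]              # corresponding operation labels
    print(describe_trained_model([0.9, 0.4, 0.05], base_models, labels))
    ```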
  • Patent number: 12023808
    Abstract: A robot system includes two arms, each having a hand at an end thereof, and a controller to control operation of the arms. The hand has an openable and closable holder. The controller includes hand-number determining circuitry to determine the number of hands used to hold a holdable object based on the size of the holdable object, and hold controlling circuitry to control the holder of one of the hands to open so as to hold the holdable object by an inner surface of the holder, when the number of hands to be used is one, and control the holders of the two hands to close so as to hold the holdable object by outer surfaces of the two holders, when the number of hands to be used is two.
    Type: Grant
    Filed: March 25, 2021
    Date of Patent: July 2, 2024
    Assignee: KAWASAKI JUKOGYO KABUSHIKI KAISHA
    Inventors: Kentaro Azuma, Takayuki Ishizaki, Mitsunobu Oka, Masataka Yoshida
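    A tiny sketch of the hand-number decision described in the abstract of patent 12023808; the width threshold and return fields are illustrative only.

    ```python
    def hands_for_object(object_width_mm: float, single_hand_max_mm: float = 120.0):
        """Decide how many hands hold an object and how each holder is used."""
        if object_width_mm <= single_hand_max_mm:
            # One hand: the holder opens and grips the object by its inner surface.
            return {"hands": 1, "holder_action": "open", "grip": "inner surface"}
        # Two hands: both holders close and grip the object by their outer surfaces.
        return {"hands": 2, "holder_action": "close", "grip": "outer surfaces"}

    print(hands_for_object(80))    # small object: single-hand inner grip
    print(hands_for_object(300))   # large object: two-hand outer grip
    ```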
  • Patent number: 12017350
    Abstract: A robot may include a robot frame having a bottom plate, and a rear caster and a wheel module disposed on the bottom plate, wherein the wheel module may include a link base mounted in the bottom plate to be spaced apart from the rear caster, a rotation damper mounted in the link base and having an elastic member accommodated therein, a front link connected to the rotation damper through a connecting shaft, a front caster mounted in a front portion of the front link, a drive motor mounted in a rear portion of the front link, and a drive wheel configured to be rotated by the drive motor.
    Type: Grant
    Filed: September 30, 2021
    Date of Patent: June 25, 2024
    Assignee: LG ELECTRONICS INC.
    Inventors: Moonchan Kim, Taiwoo Kim, Wondong Lee, Sanghun Mun, Yoonhyouk Cheong
  • Patent number: 12012289
    Abstract: Laundry pieces are to be gripped at various locations in laundries. This is labor-intensive. For this purpose, efforts are being made toward automating the gripping of laundry pieces. A camera has previously been used for this purpose. The automatic gripping of laundry pieces is implemented by way of a plurality of cameras, in that a plurality of cameras simultaneously record image data of respective laundry pieces to be gripped. On account thereof, the cameras always provide exploitable image data of the relevant laundry piece even when one camera is momentarily obscured, for example, or is defective or the viewing direction of a camera is not directed toward the face of the laundry piece. Image data of the laundry piece can always be generated by virtue of the plurality of cameras, said image data enabling a handling installation to automatically grip the laundry piece in a reliable manner.
    Type: Grant
    Filed: March 22, 2021
    Date of Patent: June 18, 2024
    Assignee: Herbert Kannegiesser GmbH
    Inventors: Wilhelm Bringewatt, Engelbert Heinz
  • Patent number: 11998151
    Abstract: A cleaning roller mountable to a cleaning robot includes an elongate shaft extending from a first end portion to a second end portion along an axis of rotation. The first and second end portions are mountable to the cleaning robot for rotating about the axis of rotation. The cleaning roller further includes a core affixed around the shaft and having outer end portions positioned along the elongate shaft and proximate the first and second end portions. The core tapers from proximate the first end portion of the shaft toward a center of the shaft. The cleaning roller further includes a sheath affixed to the core and extending beyond the outer end portions of the core. The sheath includes a first half and a second half each tapering toward the center of the shaft.
    Type: Grant
    Filed: March 28, 2022
    Date of Patent: June 4, 2024
    Assignee: iRobot Corporation
    Inventors: William Goddard, Matthew Blouin
  • Patent number: 11998291
    Abstract: A robotic surgical system with user engagement monitoring includes a robot assembly, a surgeon console, and a tracking device. The robot assembly includes a robotic arm coupled to a surgical instrument. The surgeon console includes a display device and a handle communicatively coupled to at least one of the robot assembly, the robotic arm, or the surgical instrument. The tracking device includes an image capture device configured to capture an image of a user position reference point. At least one of the surgeon console or the tracking device is configured to compute a position of the user position reference point based on the image; determine whether a user is engaged with the surgeon console based on the computed position; and, in response to a determination that the user is disengaged from the surgeon console, cause the robotic surgical system to operate in a safe mode.
    Type: Grant
    Filed: January 2, 2019
    Date of Patent: June 4, 2024
    Assignee: COVIDIEN LP
    Inventor: William J. Peine
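    A minimal sketch of engagement monitoring as described in patent 11998291: the tracked user position reference point is checked against an engagement region, and the system drops into a safe mode when the user is disengaged. The bounding-box test and all names are assumptions for illustration.

    ```python
    def update_engagement(head_position, console_bounds, robot):
        """Switch the robot into a safe mode when the user appears disengaged."""
        x, y, z = head_position
        (xmin, xmax), (ymin, ymax), (zmin, zmax) = console_bounds
        engaged = xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
        robot["mode"] = "normal" if engaged else "safe"   # freeze motion when safe
        return engaged

    robot = {"mode": "normal"}
    bounds = ((-0.3, 0.3), (-0.2, 0.2), (0.4, 0.8))       # metres, illustrative
    print(update_engagement((0.0, 0.05, 0.6), bounds, robot), robot["mode"])
    print(update_engagement((0.9, 0.05, 0.6), bounds, robot), robot["mode"])
    ```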
  • Patent number: 11986965
    Abstract: A method for the continuous storage of internal operating states and visualization of past sequences of operations, as well as a robot and/or robot controller, wherein the robot is preferably mounted on or next to a processing machine, in particular an injection molding machine, and serves for the removal, handling, manipulation or further processing of injection molded parts which have just been produced. The robot controller records data, in particular changes of state, positions, internal parameters, time stamps, etc., and in case of occurrence of an error this most recently recorded information is linked to the error and stored, whereby the changes of state until the occurrence of the respective error are simulated and visually displayed for an analysis on the basis of a virtual model of the physical robot.
    Type: Grant
    Filed: June 24, 2019
    Date of Patent: May 21, 2024
    Assignee: Wittmann Kunststoffgeräte GmbH
    Inventors: Johann Rella, Peter Michael Wittmann
  • Patent number: 11974053
    Abstract: A stereoscopic imaging platform includes a stereoscopic camera configured to record left and right images of a target site. A robotic arm is operatively connected to the stereoscopic camera, the robotic arm being adapted to selectively move the stereoscopic camera relative to the target. The stereoscopic camera includes a lens assembly having at least one lens and defining a working distance. The lens assembly has at least one focus motor adapted to move the at least one lens to selectively vary the working distance. A controller is adapted to selectively execute one or more automatic focusing modes for the stereoscopic camera. The automatic focusing modes include a continuous autofocus mode adapted to maintain a focus of the at least one stereoscopic image while the robotic arm is moving the stereoscopic camera and the target site is moving along at least an axial direction.
    Type: Grant
    Filed: March 2, 2022
    Date of Patent: April 30, 2024
    Assignee: Alcon, Inc.
    Inventor: Patrick Terry
  • Patent number: 11969218
    Abstract: A system and method provide feedback to guide a user to arrange components of a surgical robotic system in a suitable arrangement for the surgical procedure to be performed. An image of a medical procedure site at which a robotic manipulator and a second object such as a second robotic manipulator, patient table, patient, and bedside staff are located. The image is displayed for a user. The system uses computer vision to recognize the robotic manipulator and second object in the image and determine their relative positions. Based on the type of surgical procedure that is to be performed, the system determines a target position for the robotic manipulator and/or the second object, and displays, as an overlay to the displayed image, a graphical indication of the target position.
    Type: Grant
    Filed: July 6, 2021
    Date of Patent: April 30, 2024
    Assignee: Asensus Surgical US, Inc.
    Inventors: Michael Bruce Wiggin, Kevin Andrew Hufford
  • Patent number: 11958182
    Abstract: A mobile body system for managing a mobile body that moves within a predetermined area includes a mobile body associated with a guest group including one or more guests, and a managing apparatus configured to manage an outer appearance of the mobile body. The managing apparatus includes an activity collector configured to collect activity information indicative of an activity of the guest within the predetermined area, and an outer appearance manager configured to change the outer appearance of the mobile body viewed by the guest based on the activity information collected by the activity collector.
    Type: Grant
    Filed: December 14, 2021
    Date of Patent: April 16, 2024
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventors: Mutsumi Kawashima, Tomoaki Miyazawa, Masato Noritake, Nobuhiro Nishikawa, Masato Kurima, Tokuyuki Nishikawa, Reiko Tomita, Takaaki Kato, Hiroyuki Tomita, Daisaku Kato
  • Patent number: 11957090
    Abstract: This disclosure includes a method for pruning a fruit plant. An exemplary method step includes obtaining an image of the fruit plant that has branches. Next, creating exclusion zones surrounding the branches. Then pruning the fruit plant based upon the exclusion zones.
    Type: Grant
    Filed: November 7, 2019
    Date of Patent: April 16, 2024
    Assignee: HarvestMoore, L.L.C.
    Inventor: Francis Wilson Moore
  • Patent number: 11950850
    Abstract: A virtual surgical robot being built from kinematic data is rendered to a display. A user input is received to effect a movement or a configuration of the virtual surgical robot. The kinematic data is modified based on evaluation of the movement or the configuration of the virtual surgical robot.
    Type: Grant
    Filed: September 27, 2021
    Date of Patent: April 9, 2024
    Assignee: Verb Surgical Inc.
    Inventors: Bernhard Fuerst, Eric Johnson, Pablo Garcia Kilroy
  • Patent number: 11950024
    Abstract: Each remote operation terminal includes an own robot sound pressure information transmission unit, an own robot sound pressure information reception unit, an another terminal sound pressure information transmission unit, an another terminal sound pressure information reception unit, a sound pressure information output unit configured to output sound pressure information received by the own robot sound pressure information reception unit or the other terminal sound pressure information reception unit, and a conversation control unit configured to execute an inter-robot conversation mode that establishes conversation with another user using the own robot sound pressure information transmission unit and the own robot sound pressure information reception unit and an inter-terminal conversation mode that establishes conversation with another user using the other terminal sound pressure information transmission unit and the other terminal sound pressure information reception unit in such a way that they can be switched.
    Type: Grant
    Filed: March 18, 2022
    Date of Patent: April 2, 2024
    Assignee: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventor: Masahiro Takahashi
  • Patent number: 11911921
    Abstract: Aspects of the disclosure are directed towards artificial intelligence-based modeling of target objects, such as aircraft parts. In an example, a system initially trains a machine learning (ML) model based on synthetic images generated based on multi-dimensional representation of target objects. The same system or a different system subsequently further trains the ML model based on actual images generated by cameras positioned by robots relative to target objects. The ML model can be used to process an image generated by a camera positioned by a robot relative to a target object based on a multi-dimensional representation of the target object. The output of the ML model can indicate, for a detected target, position data, a target type, and/or a visual inspection property. This output can then be used to update the multi-dimensional representation, which is then used to perform robotics operations on the target object.
    Type: Grant
    Filed: August 9, 2023
    Date of Patent: February 27, 2024
    Assignee: WILDER SYSTEMS INC.
    Inventors: Ademola Ayodeji Oridate, William Wilder, Spencer Voiss
  • Patent number: 11906324
    Abstract: A cleaning route determination system includes an analyzer that analyzes behavior of airflow and particles inside a facility, a map generator that generates a dust accumulation map indicating one or more dust accumulation areas inside the facility and one or more dust amounts corresponding to the one or more dust accumulation areas, and a route calculator that determines a first route from second routes. Each of the second routes is a route for a cleaner to pass through, within a certain period of time, at least one of the one or more dust accumulation areas. A total amount indicating a sum of dust amounts corresponding to dust accumulation areas included in the first route is largest among total amounts corresponding to the second routes, each of the total amounts indicating a sum of dust amounts corresponding to dust accumulation areas included in each of the second routes.
    Type: Grant
    Filed: May 26, 2021
    Date of Patent: February 20, 2024
    Assignee: Panasonic Intellectual Property Management Co., Ltd.
    Inventor: Tetsuya Takayanagi
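    A minimal sketch of the route selection in patent 11906324: among candidate routes that fit within the time budget, choose the one whose covered dust accumulation areas sum to the largest dust amount. The data layout and brute-force selection are assumptions for illustration.

    ```python
    def pick_route(candidate_routes, dust_amounts, time_budget_s):
        """Choose the candidate route that collects the most dust within the budget."""
        best_route, best_total = None, -1.0
        for areas, duration_s in candidate_routes:
            if duration_s > time_budget_s:
                continue                       # route cannot finish in time
            total = sum(dust_amounts[a] for a in areas)
            if total > best_total:
                best_route, best_total = areas, total
        return best_route, best_total

    dust = {"A": 5.0, "B": 1.5, "C": 3.0, "D": 4.0}       # from the accumulation map
    candidates = [(["A", "B"], 500), (["A", "C", "D"], 1100), (["C", "D"], 700)]
    print(pick_route(candidates, dust, time_budget_s=900))   # (['C', 'D'], 7.0)
    ```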
  • Patent number: 11887365
    Abstract: A method for producing and replaying courses based on virtual reality is provided. The method is used in an electronic device and the method includes steps of: receiving a 3D model; generating a model data package corresponding to the 3D model according to the 3D model, wherein the model data package at least includes several objects applied to the 3D model; recording, by several virtual cameras, actions of a user who manipulates the objects in virtual reality, and generating action videos corresponding to the objects; and generating a course data package, wherein the course data package includes the model data package and an animation package including action videos.
    Type: Grant
    Filed: December 30, 2020
    Date of Patent: January 30, 2024
    Assignee: DELTA ELECTRONICS, INC.
    Inventors: Yao-Han Yen, Wen-Hsin Lo
  • Patent number: 11860278
    Abstract: Embodiments herein describe a robotic system that uses range sensors to identify a vector map of an environment. The vector map includes lines that outline the shape of objects in the environment (e.g., shelves on the floor of a warehouse). The system identifies one or more line segments representing the boundary or outline of the objects in the environment using range data acquired by the range sensors. The robotic system can repeat this process at different locations as it moves in the environment. Because of errors and inaccuracies, line segments formed at different locations may not clearly align even when these line segments correspond to the same object. To account for this error, the robotic system matches line segments identified at a first location with line segments identified at a second location. The matched line segments can be merged into a line that is stored in the vector map.
    Type: Grant
    Filed: November 14, 2018
    Date of Patent: January 2, 2024
    Assignee: Amazon Technologies, Inc.
    Inventors: Samer Nashed, Jong Jin Park, Joseph Durham
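    A small sketch of matching and merging line segments for a vector map in the spirit of patent 11860278; the orientation/proximity matching criteria, tolerances, and the merge rule (keep the farthest-apart pair of endpoints) are assumptions of this sketch.

    ```python
    import math

    def _angle(seg):
        (x1, y1), (x2, y2) = seg
        return math.atan2(y2 - y1, x2 - x1) % math.pi   # direction-agnostic

    def segments_match(seg_a, seg_b, angle_tol=0.1, gap_tol=0.3):
        """Do two segments (each ((x1, y1), (x2, y2))) describe the same edge?"""
        d_angle = abs(_angle(seg_a) - _angle(seg_b))
        d_angle = min(d_angle, math.pi - d_angle)        # wrap around pi
        if d_angle > angle_tol:
            return False
        closest = min(math.hypot(a[0] - b[0], a[1] - b[1])
                      for a in seg_a for b in seg_b)
        return closest < gap_tol

    def merge(seg_a, seg_b):
        """Merge matched segments into the farthest-apart pair of endpoints."""
        pts = list(seg_a) + list(seg_b)
        return max(((p, q) for p in pts for q in pts),
                   key=lambda pq: math.hypot(pq[0][0] - pq[1][0],
                                             pq[0][1] - pq[1][1]))

    a = ((0.0, 0.0), (2.0, 0.0))
    b = ((1.9, 0.05), (4.0, 0.05))
    print(segments_match(a, b), merge(a, b))
    ```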
  • Patent number: 11839980
    Abstract: An image processing apparatus capable of analyzing a cause of an abnormality of a system efficiently. The image processing apparatus includes a monitoring unit and an analysis unit. The monitoring unit monitors a monitoring object to detect occurrence of an abnormality in the monitoring object by applying an image process to a first area in each of images of the monitoring object that are photographed at different time points. The analysis unit analyzes a cause of an abnormality detected by the monitoring unit by applying an image process to a second area that is different from the first area in an image of a time point preceding a time point at which the abnormality is detected.
    Type: Grant
    Filed: April 14, 2021
    Date of Patent: December 12, 2023
    Assignee: CANON KABUSHIKI KAISHA
    Inventors: Genki Cho, Hiroto Oka
  • Patent number: 11835713
    Abstract: An optical scanning apparatus includes a MEMS substrate, a substrate fixing section to which the MEMS substrate is fixed, and an environment detection sensor that detects an environment factor associated with the mirror. The environment detection sensor is disposed in a position where the environment detection sensor overlaps with or is adjacent to the substrate fixing section but does not overlap with the MEMS substrate in a plan view viewed in a direction perpendicular to a surface of the MEMS substrate.
    Type: Grant
    Filed: December 4, 2020
    Date of Patent: December 5, 2023
    Assignee: SEIKO EPSON CORPORATION
    Inventors: Kei Kamakura, Hirokazu Yamaga, Takeshi Shimizu
  • Patent number: 11813707
    Abstract: A metallurgical technology probe insertion calibration method employing visual measurement and an insertion system thereof are provided. A vision sensor (5), a cylindrical rod (1), and a metallurgical technology probe (2) are used to construct an agreed region (6). In the agreed region (6), the vision sensor (5) acquires relative positions and orientations of the cylindrical rod (1) and the metallurgical technology probe (2), and an acquired position and orientation result is used to control a driving device (3) to insert the cylindrical rod (1) into the metallurgical technology probe (2). To improve the accuracy and reliability of the insertion, a standard probe (7) and a fixing device (4) are used together to perform effective calibration on an initial position, orientation, and axis in the insertion.
    Type: Grant
    Filed: September 12, 2019
    Date of Patent: November 14, 2023
    Assignee: BAOSHAN IRON & STEEL CO., LTD.
    Inventors: Zhenhong Wei, Ruimin Wu, Xitao Song, Changhong Ye, Junjiang Liu, Guodong Xu
  • Patent number: 11775699
    Abstract: Grasping remains a complex topic for simulation. Embodiments provide a method to automatically determine grasping cues for tools. An example embodiment scans a CAD model representing a real-world tool to generate a series of sections from the CAD model. In turn, properties of each section are extracted and one or more regions of the CAD model are identified based upon the extracted properties and a tool family to which the tool represented by the CAD model belongs. To continue, a respective classification for each of the one or more identified regions is identified and grasping cues for the CAD model are generated based upon the determined respective classification for each of the one or more regions.
    Type: Grant
    Filed: May 1, 2020
    Date of Patent: October 3, 2023
    Assignee: DASSAULT SYSTEMES AMERICAS CORP.
    Inventors: Alexandre Macloud, Louis Rivest, Ali Zeighami, Pierre-Olivier Lemieux, Rachid Aissaoui
  • Patent number: 11762716
    Abstract: A system includes a memory storing computer-readable instructions and at least one processor to execute the instructions to receive a shot sheet comprising data and metadata associated with an animation project, parse the shot sheet to generate instructions associated with at least one shot in the animation project, send the instructions to an animation program using an application programming interface (API), generate the animation project based on the instructions using the animation program, render the animation project into a video, and store the video in a database and generate a uniform resource locator (URL) for the video.
    Type: Grant
    Filed: January 10, 2022
    Date of Patent: September 19, 2023
    Inventor: Jason Michael Rowoldt
  • Patent number: 11764093
    Abstract: A substrate transport apparatus including a transport chamber, a drive section, a robot arm having an end effector at a distal end configured to support a substrate and being connected to the drive section generating at least arm motion in a radial direction extending and retracting the arm, an imaging system with a camera mounted in a predetermined location to image at least part of the robot arm, and a controller connected to the imaging system to image the arm moving to a predetermined repeatable position, the controller effecting capture of a first image of the robot arm proximate to the repeatable position decoupled from encoder data of the drive axis, wherein the controller calculates a positional variance of the robot arm from comparison of the first image with a calibration image, and from the positional variance determines a motion compensation factor changing the extended position of the robot arm.
    Type: Grant
    Filed: August 10, 2021
    Date of Patent: September 19, 2023
    Assignee: Brooks Automation US, LLC
    Inventors: Alexander Krupyshev, Leigh F. Sharrock
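    A sketch of estimating a positional variance by comparing a captured image against a calibration image, loosely following patent 11764093; phase correlation is used here as one possible comparison method, and the pixel scale plus the sign mapping onto robot axes are assumptions.

    ```python
    import numpy as np

    def motion_compensation(calib_image, captured_image, mm_per_pixel=0.05):
        """Estimate the arm's positional variance by phase-correlating two images."""
        F_cal = np.fft.fft2(calib_image)
        F_cap = np.fft.fft2(captured_image)
        cross = F_cap * np.conj(F_cal)
        cross /= np.abs(cross) + 1e-12            # keep only phase information
        corr = np.fft.ifft2(cross)
        dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
        h, w = calib_image.shape
        dy = dy - h if dy > h // 2 else dy         # wrap large shifts to negative
        dx = dx - w if dx > w // 2 else dx
        return {"shift_px": (int(dy), int(dx)),
                "compensation_mm": (-dy * mm_per_pixel, -dx * mm_per_pixel)}

    rng = np.random.default_rng(1)
    calib = rng.random((64, 64))
    captured = np.roll(calib, (3, -2), axis=(0, 1))   # arm arrived slightly offset
    print(motion_compensation(calib, captured))        # shift_px == (3, -2)
    ```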
  • Patent number: 11753003
    Abstract: A LIDAR system includes a laser emitter configured to emit a laser pulse in a sample direction of a sample area of a scene. A sensor element of the LIDAR system is configured to sense a return pulse, which is a reflection from the sample area corresponding to the emitted laser pulse. The LIDAR system may compare a width of the emitted laser pulse to a width of the return pulse in the time-domain. The comparison of the width of the emitted pulse to the width of the return pulse may be used to determine an orientation or surface normal of the sample area relative to the sample direction. Such a comparison leads to a measurement of the change of pulse width, referred to as pulse broadening or pulse stretching, from the emitted pulse to the return pulse.
    Type: Grant
    Filed: February 7, 2020
    Date of Patent: September 12, 2023
    Assignee: Zoox, Inc.
    Inventors: Adam Berger, Ryan McMichael, Bertrand Robert Douillard
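    A back-of-the-envelope sketch of relating pulse broadening to surface tilt, in the spirit of patent 11753003. The geometric model used here (extra round-trip spread of 2 · footprint · tan(theta) / c) is a simplifying assumption, not the patent's formulation, and the beam divergence value is illustrative.

    ```python
    import math

    C = 299_792_458.0          # speed of light, m/s

    def incidence_angle_deg(emitted_width_ns: float, return_width_ns: float,
                            range_m: float, beam_divergence_rad: float = 3e-3):
        """Estimate surface incidence angle from LIDAR pulse broadening."""
        broadening_s = max(0.0, (return_width_ns - emitted_width_ns)) * 1e-9
        footprint_m = range_m * beam_divergence_rad
        # Tilted surface spreads the footprint over an extra 2*D*tan(theta)/c.
        tan_theta = broadening_s * C / (2.0 * footprint_m)
        return math.degrees(math.atan(tan_theta))

    # A 4 ns pulse returning 4.5 ns wide from a target 50 m away: ~26.6 degrees.
    print(round(incidence_angle_deg(4.0, 4.5, range_m=50.0), 1))
    ```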
  • Patent number: 11731271
    Abstract: Traditionally, robots may learn to perform tasks by observation in clean or sterile environments. However, robots are unable to accurately learn tasks by observation in real environments (e.g., cluttered, noisy, chaotic environments). Methods and systems are provided for teaching robots to learn tasks in real environments based on input (e.g., verbal or textual cues). In particular, a verbal-based Focus-of-Attention (FOA) model receives input, parses the input to recognize at least a task and a target object name. This information is used to spatio-temporally filter a demonstration of the task to allow the robot to focus on the target object and movements associated with the target object within a real environment. In this way, using the verbal-based FOA, a robot is able to recognize “where and when” to pay attention to the demonstration of the task, thereby enabling the robot to learn the task by observation in a real environment.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: August 22, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Naoki Wake, Kazuhiro Sasabuchi, Katsushi Ikeuchi
  • Patent number: 11724407
    Abstract: One embodiment provides a robotic system comprising: a robot, the robot further comprising a moveable robotic arm that moves within the robot's reference space; a depth-sensing camera, the camera having a reference frame that is in substantial view of the robot's reference space; a controller, the controller further comprising a processor and a computer readable memory that comprises instructions such that, when read by the controller, the controller inputs image data from the camera and sends signals to the moveable robotic arm, the instructions further comprising the steps of: calibrating the camera to the robot by instructing the robot to engage in a number of robot poses; extracting the location of the robot poses to obtain the robot poses in the camera reference frame; and creating a transformation that transforms robot points afterwards to camera points.
    Type: Grant
    Filed: January 18, 2021
    Date of Patent: August 15, 2023
    Assignee: Xerox Corporation
    Inventors: Robert Price, Kent Evans
  • Patent number: 11724388
    Abstract: A robot controller configured to assist an operation of a user, by effectively utilizing both techniques of augmented reality and mixed reality. The robot controller includes: a display device configured to display information generated by a computer so that the information is overlapped with an actual environment, etc.; a position and orientation obtaining section configured to obtain relative position and orientation between the display device and a robot included in the actual environment; a display controlling section configured to display a virtual model of the robot, etc., on the display device; an operation controlling section configured to operate the virtual model displayed on the display device; and a position and orientation determining section configured to determine the position and/or orientation of the robot by using the position and/or orientation of the operated virtual model and using the relative position and orientation between the robot and the display device.
    Type: Grant
    Filed: October 1, 2019
    Date of Patent: August 15, 2023
    Assignee: FANUC CORPORATION
    Inventor: Nobuhiro Yoshida
  • Patent number: 11724404
    Abstract: A surface evaluation system that includes one or more vision systems that generate target surface data during evaluation of a surface, the one or more vision systems comprising two or more of: at least one light, a camera, a structured light camera, a laser scanner and a 3D scanner.
    Type: Grant
    Filed: February 21, 2020
    Date of Patent: August 15, 2023
    Assignee: Canvas Construction, Inc.
    Inventors: Maria J. Telleria, Kevin B. Albert, Irene M. Davis, Henry Tonoyan, Gabriel F. Hein, Zelda Othenin-Girard, Jason De Alba
  • Patent number: 11696524
    Abstract: A robotic vehicle may include one or more functional components configured to execute lawn care function, a sensor network comprising one or more sensors configured to detect conditions proximate to the robotic vehicle, a positioning module configured to determine robotic vehicle position while the robotic vehicle traverses a parcel, and a boundary management module configured to enable the robotic vehicle to be operated within a bounded area. The bounded area may include a variable boundary, and the boundary management module may be configured to receive instructions from an operator to adjust the variable boundary.
    Type: Grant
    Filed: January 15, 2020
    Date of Patent: July 11, 2023
    Assignee: HUSQVARNA AB
    Inventor: Håkan Wahlgren
  • Patent number: 11667036
    Abstract: A workpiece picking device includes a sensor that measures the workpieces, a hand that grasps the workpieces, a robot that moves the hand, and a control device thereof. The control device has a position orientation calculation part that calculates position, orientation and the like of the workpieces, a grasping orientation calculation part that calculates a grasping orientation of the workpieces by the hand, a route calculation part that calculates a route through which the hand moves to the grasping orientation, a sensor control part, a hand control part, a robot control part, a situation determination part that determines the situation of the workpieces on the basis of measurement result or the like of the three-dimensional position, and a parameter modification part that modifies at least one of a measurement parameter and various calculation parameters, when the determination result of the situations of the workpieces satisfies a predetermined condition.
    Type: Grant
    Filed: March 11, 2019
    Date of Patent: June 6, 2023
    Assignee: OMRON Corporation
    Inventors: Norikazu Tonogai, Toshihiro Moriya, Takeshi Kojima, Haruka Fujii
  • Patent number: 11656628
    Abstract: Described herein are systems, devices, and methods for controlling a mobile cleaning robot to escape from a stuck state using a learned robot escape behavior model. The model is trained using reinforcement learning at a cloud-computing device or networked devices. A mobile cleaning robot comprises a drive system, a sensor circuit to collect sensor data associated with a detected stuck state, and a controller circuit that can receive the trained robot escape behavior model, and apply the sensor data associated with the detected stuck state to the trained robot escape behavior model to determine an escape policy. The drive system or one or more actuators of the mobile robot can remove the mobile robot from the stuck state according to the determined escape policy.
    Type: Grant
    Filed: September 15, 2020
    Date of Patent: May 23, 2023
    Assignee: iRobot Corporation
    Inventors: Laura V. Herlant, Aravindh Kuppusamy, Deepak Sharma, Kshitij Bichave, Shao Zhou, Cheuk Wah Wong
  • Patent number: 11651505
    Abstract: Systems and methods are provided for acquiring images of objects. Light of different types (e.g., different polarization orientations) can be directed onto an object from different respective directions (e.g., from different sides of the object). A single image acquisition can be executed in order to acquire different sub-images corresponding to the different light types. An image of a surface of the object, including representation of surface features of the surface, can be generated based on the sub-images.
    Type: Grant
    Filed: January 26, 2021
    Date of Patent: May 16, 2023
    Assignee: COGNEX CORPORATION
    Inventors: Torsten Kempf, Jens Rütten, Michael Haardt, Laurens Nunnink
  • Patent number: 11644826
    Abstract: A robot control apparatus includes a controller to control operation of a robot, a storage to store operation logs with different preservation periods for the operation of the robot, a collector to, when a specific event occurs, select and collect an information element corresponding to a type of the event from the operation logs, a record generator to create a record from the information element collected by the collector, and a record preserver to preserve the record.
    Type: Grant
    Filed: February 12, 2019
    Date of Patent: May 9, 2023
    Assignee: NIDEC CORPORATION
    Inventor: Takahiro Namikoshi
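    A minimal sketch of event-driven record creation from operation logs with different preservation periods, in the spirit of patent 11644826; the log names, retention values, and event-to-log mapping are illustrative assumptions.

    ```python
    from datetime import datetime, timezone

    # Operation logs with different preservation periods (days); illustrative only.
    OPERATION_LOGS = {
        "motion": {"retention_days": 7,   "entries": []},
        "io":     {"retention_days": 30,  "entries": []},
        "errors": {"retention_days": 365, "entries": []},
    }

    # Which log types are relevant for which event type (assumed mapping).
    EVENT_TO_LOGS = {
        "collision": ["motion", "errors"],
        "io_fault":  ["io", "errors"],
    }

    def log(log_name, message):
        OPERATION_LOGS[log_name]["entries"].append(
            (datetime.now(timezone.utc), message))

    def create_record(event_type):
        """Collect the information elements matching the event type into a record."""
        record = {"event": event_type, "created": datetime.now(timezone.utc),
                  "elements": {}}
        for name in EVENT_TO_LOGS.get(event_type, []):
            record["elements"][name] = list(OPERATION_LOGS[name]["entries"])
        return record

    log("motion", "joint 3 overshoot 2.1 deg")
    log("errors", "E-STOP asserted")
    print(list(create_record("collision")["elements"]))   # ['motion', 'errors']
    ```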