Optical Patents (Class 901/47)
  • Publication number: 20150094851
    Abstract: A robot control system detects the position and direction of each user with a plurality of range image sensors provided in an exhibition hall. A central controller records each user's inspection actions, from the time the user enters the exhibition hall until the user leaves, to generate an inspection action table. When the user attends again, the central controller reads the history of inspection actions from the inspection action table, chooses from an utterance content table an utterance containing a phrase that mentions an inspection action from the previous attendance, and makes a robot output the chosen utterance to the user.
    Type: Application
    Filed: September 26, 2014
    Publication date: April 2, 2015
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Koji Kawabe, Taro Yokoyama, Takayuki Kanda, Satoru Satake, Takamasa Iio, Kotaro Hayashi, Florent Ferreri
  • Patent number: 8996174
    Abstract: In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Rodney Brooks, Bruce Blumberg, Noelle Dye, Paula Long
  • Patent number: 8996228
    Abstract: Methods and systems for construction zone object detection are described. A computing device may be configured to receive, from a LIDAR, a 3D point cloud of a road on which a vehicle is travelling. The 3D point cloud may comprise points corresponding to light reflected from objects on the road. Also, the computing device may be configured to determine sets of points in the 3D point cloud representing an area within a threshold distance from a surface of the road. Further, the computing device may be configured to identify construction zone objects in the sets of points. Further, the computing device may be configured to determine a likelihood of existence of a construction zone, based on the identification. Based on the likelihood, the computing device may be configured to modify a control strategy of the vehicle; and control the vehicle based on the modified control strategy.
    Type: Grant
    Filed: September 5, 2012
    Date of Patent: March 31, 2015
    Assignee: Google Inc.
    Inventors: David Ian Ferguson, Dirk Haehnel, Ian Mahon
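The height-band filtering step this abstract describes (keeping points within a threshold distance of the road surface) can be sketched as follows. This is an illustrative sketch only, not Google's implementation; the function name, flat-road assumption, and 2 m threshold are my assumptions.

```python
def points_near_surface(points, road_height=0.0, threshold=2.0):
    """Keep 3D points (x, y, z) whose height above an assumed flat road
    surface lies within `threshold` meters -- candidate returns for
    construction zone objects such as cones and barrels."""
    return [p for p in points if 0.0 <= p[2] - road_height <= threshold]

cloud = [
    (1.0, 0.5, 0.3),   # cone-height return
    (2.0, 1.0, 0.9),   # barrel-height return
    (3.0, 0.0, 5.0),   # overhead sign, above the threshold band
    (4.0, 2.0, -0.2),  # below the road plane (noise)
]
print(len(points_near_surface(cloud)))  # 2
```

A real pipeline would fit the road surface from the point cloud rather than assume it flat, then run object identification only on the surviving points.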
  • Patent number: 8996175
    Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
  • Patent number: 8996167
    Abstract: In accordance with various embodiments, a user interface embedded into a robot facilitates robot training via direct and intuitive physical interactions. In some embodiments, the user interface includes a wrist cuff that, when grasped by the user, switches the robot into zero-force gravity-compensated mode.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: March 31, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Natan Linder, Rodney Brooks, Michael Sussman, Bruce Blumberg, Noelle Dye, Michael Caine, Elaine Y. Chen
  • Publication number: 20150088310
    Abstract: Devices, systems, and methods for social behavior of a telepresence robot are disclosed herein. A telepresence robot may include a drive system, a control system, an object detection system, and a social behaviors component. The drive system is configured to move the telepresence robot. The control system is configured to control the drive system to drive the telepresence robot around a work area. The object detection system is configured to detect a human in proximity to the telepresence robot. The social behaviors component is configured to provide instructions to the control system to cause the telepresence robot to operate according to a first set of rules when a presence of one or more humans is not detected and operate according to a second set of rules when the presence of one or more humans is detected.
    Type: Application
    Filed: November 21, 2014
    Publication date: March 26, 2015
    Inventors: Marco Pinter, Fuji Lai, Daniel Steven Sanchez, James Ballantyne, David Bjorn Roe, Yulun Wang, Charles S. Jordan, Orjeta Taka, Cheuk Wah Wong
  • Patent number: 8983776
    Abstract: A robotic apparatus for traversing a selected area autonomously that senses orientation relative to “environmental” signals. The robotic apparatus is provided in two models, a master that can record directive and “environmental signal” readings, or that can record received location information, to provide at least one command recorded on a machine-readable medium representing an instruction for traversing an area of interest, and a slave that lacks the recording capability. Both master and slave models can replay recorded commands, and compare the expected orientation from the command with an actual orientation sensed during autonomous operation. If an error exceeding a predetermined value is observed, a corrective action is taken. The robotic apparatus is able to utilize a tool to perform a task.
    Type: Grant
    Filed: April 26, 2007
    Date of Patent: March 17, 2015
    Inventor: Jason A. Dean
  • Publication number: 20150073596
    Abstract: A master slave robot receives force presentation according to a picture watched by the operator operating it. The control apparatus for the master slave robot causes a force information correcting unit to correct force information in accordance with magnification percentage information acquired by a displayed information acquiring unit, such that the force information is increased as the magnification percentage becomes larger. An operator can thus apply appropriate force while watching the picture projected on a display to perform a task.
    Type: Application
    Filed: September 3, 2014
    Publication date: March 12, 2015
    Inventors: Yudai FUDABA, Yuko TSUSAKA
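The magnification-proportional force correction described above reduces to a simple scaling rule. The sketch below is an assumed linear form (function name, gain parameter, and linearity are my assumptions; the patent does not specify the exact correction law):

```python
def corrected_force(measured_force_n, magnification, gain=1.0):
    """Scale the force presented to the operator in proportion to the
    on-screen magnification: a highly zoomed picture comes with
    proportionally amplified force feedback."""
    if magnification <= 0:
        raise ValueError("magnification must be positive")
    return measured_force_n * magnification * gain

print(corrected_force(0.5, 4.0))  # 2.0 N presented at 4x zoom
```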
  • Publication number: 20150073646
    Abstract: A mobile robot that includes a drive system, a controller in communication with the drive system, and a volumetric point cloud imaging device supported above the drive system at a height of greater than about one foot above the ground and directed to obtain a point cloud from a volume of space that includes a floor plane in the direction of movement of the mobile robot. The controller receives point cloud signals from the imaging device and issues drive commands to the drive system based at least in part on the received point cloud signals.
    Type: Application
    Filed: November 14, 2014
    Publication date: March 12, 2015
    Applicant: iRobot Corporation
    Inventors: Michael T. Rosenstein, Chikyung Won, Michael Halloran, Steven V. Shamlian, Mark Chiappetta
  • Publication number: 20150073595
    Abstract: A control apparatus for a master slave robot causes a force information correcting unit to correct force information in accordance with a feature of a target object on a screen from target object information calculated by a target object information calculation unit. An operator can thus apply appropriate force while watching a picture projected on a display to perform a task.
    Type: Application
    Filed: September 2, 2014
    Publication date: March 12, 2015
    Inventors: Yudai FUDABA, Yuko TSUSAKA
  • Publication number: 20150063637
    Abstract: An image recognition method according to one exemplary aspect of the present invention includes the steps of: acquiring a shooting image generated by capturing an image of an object using an image generating device; acquiring subject distance information indicating the distance from the object to the image generating device at a target pixel in the shooting image; extracting the image pattern corresponding to the acquired subject distance information from a plurality of image patterns created in advance for detecting one detection object, each associated with different distance information; and performing pattern matching against the shooting image using the extracted image pattern.
    Type: Application
    Filed: August 25, 2014
    Publication date: March 5, 2015
    Applicant: TOYOTA JIDOSHA KABUSHIKI KAISHA
    Inventor: Ayako AMMA
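The pattern-selection step above (choosing the stored template whose associated distance best matches the measured subject distance) could be sketched like this; the dictionary layout and nearest-distance rule are assumptions for illustration, not Toyota's method:

```python
def select_pattern(patterns_by_distance, subject_distance_m):
    """Pick the image pattern whose associated distance is closest to the
    measured subject distance; that pattern is then used for matching."""
    best = min(patterns_by_distance.items(),
               key=lambda kv: abs(kv[0] - subject_distance_m))
    return best[1]

patterns = {0.5: "near_template", 2.0: "mid_template", 5.0: "far_template"}
print(select_pattern(patterns, 1.7))  # mid_template
```

Selecting a distance-appropriate template keeps the matcher from comparing against patterns at the wrong apparent scale.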
  • Patent number: 8972060
    Abstract: An embodiment of the invention provides a control method of a cleaning robot with a non-omnidirectional light detector. The method includes the steps of: detecting a light beam via the non-omnidirectional light detector; stopping the cleaning robot and spinning the non-omnidirectional light detector when the non-omnidirectional light detector detects the light beam; stopping the spinning of the non-omnidirectional light detector and estimating a first spin angle when the non-omnidirectional light detector does not detect the light beam; and adjusting a moving direction of the cleaning robot according to the first spin angle.
    Type: Grant
    Filed: February 15, 2013
    Date of Patent: March 3, 2015
    Assignee: MSI Computer (Shenzhen) Co., Ltd.
    Inventors: You-Wei Teng, Shih-Che Hung, Yao-Shih Leng
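The final step of the method (adjusting the moving direction from the measured spin angle) might look like the following. The half-angle steering rule is purely an assumption for illustration; the abstract only states that the direction is adjusted "according to" the spin angle:

```python
def adjusted_direction(heading_deg, spin_angle_deg):
    """The detector stopped seeing the beam after spinning spin_angle_deg
    past its edge; steer by half that angle as a simple re-centering
    correction (an assumed rule, not specified in the abstract)."""
    return (heading_deg + spin_angle_deg / 2.0) % 360.0

print(adjusted_direction(90.0, 40.0))  # 110.0
```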
  • Publication number: 20150057550
    Abstract: A robotic imaging system has at least one robotic imaging arm that includes a free-space optics subsystem. The free-space optics is capable of conveying an excitation light signal through the robotic imaging arm to an optical end effector at the distal end thereof while maintaining coaxial alignment between the optical axis and the robotic skeleton. The free-space optics is also capable of maintaining linear polarization of the light signal.
    Type: Application
    Filed: August 22, 2014
    Publication date: February 26, 2015
    Inventors: Hyun Kim, Joan Savall, Jerome Anthony-Jean Lecoq, Mark Schnitzer
  • Publication number: 20150057801
    Abstract: A system and method for guidance of a moving robotic device through an approximated real time (ART) virtual video stream is presented. The system and method include at least one camera for collecting images of a terrain in a remote location, at least one terrain data collecting device for collecting data from a remote location, a memory for storing images from the plurality of cameras, a communication device for transmitting the images and data over a path, and a computer configured to calculate a delay between the cameras and the receiver. Based on the calculated delay, the computer retrieves images and data from the receiver and memory and generates an approximate real-time video and data stream displaying the terrain just ahead of the moving robotic device at a distance proportional to the calculated delay; the ART video and data stream is used to guide the moving robotic device.
    Type: Application
    Filed: August 20, 2013
    Publication date: February 26, 2015
    Inventor: Kenneth Dean Stephens, JR.
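The key quantity in this scheme, the distance ahead of the robot at which the approximated view is rendered, is proportional to the measured delay. A minimal sketch (function and parameter names are assumptions):

```python
def lookahead_distance(round_trip_delay_s, robot_speed_mps):
    """Distance ahead of the robot at which to render the approximated
    real-time view: proportional to the communication delay, so the
    operator sees terrain the robot will reach roughly when the
    operator's commands arrive."""
    return round_trip_delay_s * robot_speed_mps

print(lookahead_distance(1.25, 2.0))  # 2.5 m ahead
```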
  • Publication number: 20150057675
    Abstract: A system and a method for automating a medical process including a memory storing a software program, a computer connected to the memory for running the software program, a display connected to the computer for generating a visual representation of output data generated by the computer running the program, a user interface connected to the computer for obtaining image data representing a configuration of a patient treatment space and fixed markers in the treatment space and storing the image data in the memory, a robot arm connected to the computer, and a medical tool mounted on the robot arm wherein when a human inputs a selected treatment procedure into the computer, the computer runs the software program to generate a tool path based upon the treatment procedure and the image data, and the computer operates the robot arm to move the medical tool along the tool path without human guidance, and wherein the data generated during the treatment procedure is stored, analyzed, and shared among collaborating computers.
    Type: Application
    Filed: August 21, 2013
    Publication date: February 26, 2015
    Inventors: Hadi Akeel, Yaz Shehab, George Wong
  • Publication number: 20150052703
    Abstract: A robot cleaner having a monitoring function and minimizing power consumption and/or securing communication efficiency and a method of controlling a robot cleaner are provided. The robot cleaner may include at least one sound obtaining device; at least one image obtaining device; and a controller configured to determine whether a sound obtained through the at least one sound obtaining device is abnormal, sense a direction in which an abnormal sound is generated, and obtain images in the direction in which the abnormal sound is generated. The robot cleaner may automatically recognize a surrounding situation, and when necessary, the robot cleaner may rotate and/or move in a corresponding direction and/or position to obtain images or transmit the obtained images to a remote terminal, thereby minimizing power consumption of the robot cleaner with limited power.
    Type: Application
    Filed: August 22, 2014
    Publication date: February 26, 2015
    Applicant: LG ELECTRONICS INC.
    Inventors: Haesoo LEE, Seunghoe CHOE, Kangyoul LEE, Weonho SAE, Minjung KANG
  • Publication number: 20150057802
    Abstract: A robotic activity system, which includes a board and an autonomous robotic device, is described herein. The board may display a line and one or more color patterns. The robotic device may traverse the line using one or more integrated sensors. For example, sensor data may include light intensity data for visible light reflected or emitted by the board. The sensor data may be analyzed to 1) ensure the robotic device follows the line and/or 2) detect color sequences associated with color patterns shown on the board. Upon detection of a color sequence, the robotic device may attempt to match the color sequence with a known color pattern definition. The color pattern definition may be associated with a function to be performed by the robotic device. Using multiple sets of color patterns and associated functions allows the robotic device to move in a variable and potentially unpredictable fashion.
    Type: Application
    Filed: August 22, 2014
    Publication date: February 26, 2015
    Inventors: Armen Kroyan, Ondrej Stanek, Nader Hamda
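The color-sequence matching step this abstract describes (detect a sequence, match it against known pattern definitions, dispatch the associated function) could be sketched as follows; the tuple-keyed dictionary and the names are illustrative assumptions:

```python
def match_pattern(detected_sequence, pattern_definitions):
    """Match a detected color sequence against known color pattern
    definitions; return the associated function name, or None if the
    sequence matches no definition."""
    return pattern_definitions.get(tuple(detected_sequence))

definitions = {
    ("red", "green", "blue"): "spin",
    ("blue", "blue"): "reverse",
}
print(match_pattern(["blue", "blue"], definitions))  # reverse
```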
  • Patent number: 8965576
    Abstract: In accordance with various embodiments, a user-guidable robot appendage provides haptic feedback to the user.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: February 24, 2015
    Assignee: Rethink Robotics, Inc.
    Inventors: Elaine Y. Chen, Rodney Brooks, Christopher J. Buehler, Matthew M. Williamson, Bruce Blumberg, Noelle Dye, Joseph M. Romano, William A. Goodwin
  • Patent number: 8961094
    Abstract: The invention relates to a method and a device for aligning substrates (2) in an XY-plane. A polygonal, flat substrate (2), the substrate plane of which is parallel to the XY-plane or lies in the XY-plane, is aligned with respect to reference coordinates and a reference angular position in the XY-plane. A corner (12) of the substrate (2) is detected using image detecting means (9). In addition, the position coordinates of the substrate (2) are determined. Using evaluating means (10), the angular position of the corner (12) of the substrate (2) in the XY-plane is determined, and the position differences between the reference coordinates and the position coordinates as well as the angle difference between the reference angular position and the angular position of the substrate corner (12) are calculated. The substrate (2) is moved and/or rotated in the XY-plane according to the determined position difference or the angle difference.
    Type: Grant
    Filed: April 6, 2010
    Date of Patent: February 24, 2015
    Assignee: Singulus Technologies AG
    Inventors: Edgar Rüth, Wolfgang Becker, Marjan Filipovic, Reiner Rohrmann
  • Patent number: 8965579
    Abstract: A telepresence robot may include a drive system, a control system, an imaging system, and a mapping module. The mapping module may access a plan view map of an area and tags associated with the area. In various embodiments, each tag may include tag coordinates and tag information, which may include a tag annotation. A tag identification system may identify tags within a predetermined range of the current position and the control system may execute an action based on an identified tag whose tag information comprises a telepresence robot action modifier. The telepresence robot may rotate an upper portion independent from a lower portion. A remote terminal may allow an operator to control the telepresence robot using any combination of control methods, including by selecting a destination in a live video feed, by selecting a destination on a plan view map, or by using a joystick or other peripheral device.
    Type: Grant
    Filed: January 27, 2012
    Date of Patent: February 24, 2015
    Assignees: InTouch Technologies, iRobot Corporation
    Inventors: Yulun Wang, Charles S. Jordan, Tim Wright, Michael Chan, Marco Pinter, Kevin Hanrahan, Daniel Sanchez, James Ballantyne, Cody Herzog, Blair Whitney, Fuji Lai, Kelton Temby, Eben Christopher Rauhut, Justin H. Kearns, Cheuk Wah Wong, Timothy Sturtevant Farlow
  • Publication number: 20150045950
    Abstract: A transfer system includes a first station at which a workpiece is placed, a second station which receives the workpiece from the first station, a robot including a holder for holding the workpiece and for transferring the workpiece from the first station to the second station, an image capturing unit for capturing an image of the workpiece that reflects a position of the workpiece in the first station, a first memory unit that stores intended placement position information indicating an intended placement position of the workpiece in the first station, and a deviation calculator that calculates a deviation of the position of the workpiece in the first station relative to the intended placement position. The deviation calculator calculates the deviation based on the image of the workpiece and the intended placement position information.
    Type: Application
    Filed: August 7, 2014
    Publication date: February 12, 2015
    Inventor: Takumi KOBAYASHI
  • Publication number: 20150032243
    Abstract: A robot system according to an aspect of the embodiment includes at least one robot, a transporter, and a controller. The robot performs multi-axial operation based on an operation instruction by the controller. The transporter has a pair of guides arranged parallel to each other along a predetermined transport direction, the guides having a variable spacing therebetween, transports a workpiece to a working position of the robot while restricting the movement of the workpiece present in an area between the pair of guides toward the direction of the spacing, and sandwiches and holds the workpiece by the pair of guides at the working position. The controller instructs the robot to perform the operation to apply predetermined processing to the workpiece held at the working position.
    Type: Application
    Filed: October 15, 2014
    Publication date: January 29, 2015
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Takashi SHIINO, Keigo ISHIBASHI, Toshiaki IKEDA, Toshiyuki HARADA, Shoji KOYAKUMARU
  • Publication number: 20150032254
    Abstract: A communication draw-in system that enables robot-human communication to start smoothly is provided. The system is provided in a robot that communicates with a target human, and includes: a human specifying unit 200 for specifying the position of the target human; a light source control unit 201 for moving light toward the specified position of the target human; a draw-in control unit 203 for instructing the robot to perform a draw-in operation that makes the target human recognize the direction of the robot; and a human recognition specifying unit 204 for determining whether or not the target human has recognized the robot. The robot is instructed to start communicating with the target human when the target human is determined to have recognized the robot.
    Type: Application
    Filed: November 15, 2012
    Publication date: January 29, 2015
    Applicant: NEC CORPORATION
    Inventor: Shin Ishiguro
  • Publication number: 20150032252
    Abstract: A method and system for piece-picking or piece put-away within a logistics facility. The system includes a central server and at least one mobile manipulation robot. The central server is configured to communicate with the robots to send and receive piece-picking data which includes a unique identification for each piece to be picked, a location within the logistics facility of the pieces to be picked, and a route for the robot to take within the logistics facility. The robots can then autonomously navigate and position themselves within the logistics facility by recognition of landmarks by at least one of a plurality of sensors. The sensors also provide signals related to detection, identification, and location of a piece to be picked or put-away, and processors on the robots analyze the sensor information to generate movements of a unique articulated arm and end effector on the robot to pick or put-away the piece.
    Type: Application
    Filed: July 25, 2014
    Publication date: January 29, 2015
    Applicant: IAM Robotics, LLC
    Inventors: Thomas Galluzzo, Jean Sebastien Valois, Vladimir Altman
  • Patent number: 8942851
    Abstract: A device which permits two additional tools to be attached to the robot arm of a Talon® robot, with remote operation of those tools from the existing operator control unit. The invention permits an operator to carry just two tools down range while preserving full use of the robot's gripper, and then to use those two tools remotely. The tools do not increase the amount of space the robot occupies, and still allow continued use of existing equipment.
    Type: Grant
    Filed: March 7, 2013
    Date of Patent: January 27, 2015
    Assignee: The United States of America as represented by the Secretary of the Army
    Inventors: Gregory Maier, Joshua Lee, Michael Freeman
  • Publication number: 20150023748
    Abstract: Drilling apparatus and method, the apparatus comprising: a first robot (10); a first member (30) (e.g. a pressure foot) and a drilling tool (38) both coupled to the first robot (10); a second robot (12); and a second member (52) coupled to the second robot (12); wherein the apparatus is arranged to press the members (30, 52) against opposite sides of a part to be drilled (2, 100) (e.g. an aircraft panel) so as to hold the part (2, 100) and prevent deflection of at least a portion of the part (2, 100); and the first member (30) and the drilling tool (38) are arranged such that the drilling tool (38) may drill into the portion of the part (2, 100) of which deflection is opposed from the side of the part (2, 100) pressed against by the first member (30). The robots (10, 12) may be robotic arms.
    Type: Application
    Filed: January 29, 2013
    Publication date: January 22, 2015
    Inventors: Jonathan Michael Carberry, Austin James Cook
  • Publication number: 20150025683
    Abstract: A control apparatus calculates a calibration value based on a position in the robot coordinate system 41 and a position in the vision coordinate system 42, for at least three teaching points set within a calibration area. The markers 21 of two of the three teaching points have the same inclination in relation to the optical axis of a camera 3 serving as a visual sensor, and those two points are placed at different positions on the same plane normal to the optical axis. The remaining teaching point is set such that the inclination of its marker 21 in relation to the optical axis differs from that of the other two points. As a result, the influence of a large quantization error in the optical axis direction, a measurement error of the camera 3, can be reduced.
    Type: Application
    Filed: July 15, 2014
    Publication date: January 22, 2015
    Inventor: Shingo Amano
  • Patent number: 8938315
    Abstract: A method for verifying completion of a task is provided. In various embodiments, the method includes obtaining location coordinates of at least one location sensor within a work cell. The at least one sensor is affixed to a tool used to operate on a feature of a structure to be assembled, fabricated or inspected. The method additionally includes generating a virtual object locus based on the location coordinates of the at least one location sensor. The virtual object locus corresponds to a computerized schematic of the structure to be assembled and represents all possible locations of an object end of the tool within the work cell. The method further includes identifying one of a plurality of candidate features as the most likely to be the feature operated on by the tool. The identification is based on a probability calculation, for each candidate feature, that it is the feature operated on by the tool.
    Type: Grant
    Filed: January 13, 2014
    Date of Patent: January 20, 2015
    Assignee: The Boeing Company
    Inventors: Philip L. Freeman, Thomas E. Shepherd, Christopher K. Zuver
  • Publication number: 20150019014
    Abstract: A device for quality inspection of an automotive part includes i) a jig unit for securing and supporting an inspection object, ii) a rotary vision imager mounted to the fore end of a robot arm, rotatable turret-style, for vision photographing a plurality of processed portions of the inspection object, and iii) a controller for obtaining vision data from the rotary vision imager, and analyzing and processing the vision data to extract defects in the processed portions.
    Type: Application
    Filed: December 13, 2013
    Publication date: January 15, 2015
    Applicant: Hyundai Motor Company
    Inventor: Tae Ho Kim
  • Publication number: 20150019013
    Abstract: A displacement measuring cell may be used to measure linear and/or angular displacement. The displacement measuring cell may include movable and stationary electrodes in a conductive fluid. Electrical property measurements may be used to determine how far the movable electrode has moved relative to the stationary electrode. The displacement measuring cell may include pistons and/or flexible walls. The displacement measuring cell may be used in a touch-sensitive robotic gripper. The touch-sensitive robotic gripper may include a plurality of displacement measuring cells mechanically in series and/or parallel. The touch-sensitive robotic gripper may include a processor and/or memory configured to identify objects based on displacement measurements and/or other measurements. The processor may determine how to manipulate the object based on its identity.
    Type: Application
    Filed: September 12, 2014
    Publication date: January 15, 2015
    Inventors: Jeffrey A. Rose, James Adam Rose, Stephen D. Rose, Raymond Cooper, Jeffrey John Sweda
  • Publication number: 20150018842
    Abstract: A system and a method of performing a frameless image-guided biopsy uses imaging, a six-dimensional robotic couch system, a laser guidance system, an optical distance indicator, and a needle control apparatus. A planning CT scan is made of the patient with stereotactic fiduciary markers to localize and produce digitally reconstructed radiographs. Two stereoscopic images are generated using an imaging device to visualize and identify a target tumor. The images are fused with the digitally reconstructed radiographs of the planning CT scan to process tumor location. The tumor location data are communicated to the movable robotic couch to position the target tumor of the patient at a known isocenter location. A biopsy needle is guided with a laser alignment mechanism towards the isocenter at the determined depth using a needle positioning apparatus and an Optical Distance Indicator, and a biopsy sample of the target tumor is obtained.
    Type: Application
    Filed: June 23, 2014
    Publication date: January 15, 2015
    Inventors: Javad Rahimian, Amir Rombod Rahimian
  • Patent number: 8935006
    Abstract: A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may facilitate a teleconferencing session with a remote third party, or interact with the resident in a number of ways. For example, the robot may carry on a dialog with the resident, reinforce compliance with medication or other schedules, etc. In addition, the robot incorporates safety features for preventing collisions with the resident; and the robot may audibly announce and/or visibly indicate its presence in order to avoid becoming a dangerous obstacle. Furthermore, the mobile robot behaves in accordance with an integral privacy policy, such that any sensor recording or transmission must be approved by the resident.
    Type: Grant
    Filed: September 29, 2006
    Date of Patent: January 13, 2015
    Assignee: iRobot Corporation
    Inventors: Clara Vu, Matthew Cross, Tim Bickmore, Amanda Gruber, Tony L. Campbell
  • Patent number: 8934706
    Abstract: A device is provided having a robotic arm for handling a wafer, the robotic arm including one or more encoders that provide encoder data identifying a position of one or more components of the robotic arm. The device also having a processor adapted to apply an extended Kalman Filter to the encoder data to estimate a position of the wafer.
    Type: Grant
    Filed: January 20, 2014
    Date of Patent: January 13, 2015
    Assignee: Brooks Automation, Inc.
    Inventors: Christopher C. Kiley, Peter van der Meulen, Forrest T. Buzan, Paul E. Fogel
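The abstract applies an extended Kalman filter to encoder data; a scalar Kalman filter conveys the same predict/update idea in miniature. This is a simplified stand-in, not Brooks Automation's implementation; the static-state model and the noise values q and r are assumptions:

```python
def kalman_update(x, p, z, r, q=1e-4):
    """One predict/update step of a scalar Kalman filter: fuse a noisy
    encoder reading z (measurement variance r) into the state estimate x
    (variance p), assuming a static state with process noise q."""
    p = p + q                  # predict: add process noise
    k = p / (p + r)            # Kalman gain
    x = x + k * (z - x)        # correct with the measurement residual
    p = (1 - k) * p            # shrink the estimate variance
    return x, p

x, p = 0.0, 1.0  # uninformative prior
for z in [0.9, 1.1, 1.0, 0.95]:  # noisy encoder readings near 1.0
    x, p = kalman_update(x, p, z, r=0.1)
print(round(x, 2))  # 0.96
```

The real device extends this to a nonlinear, multi-dimensional model of the arm kinematics, linearized at each step (hence "extended").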
  • Patent number: 8930025
    Abstract: A work robot for operating on an object includes a robot body that captures an image including the object. During a teach mode, the captured image is correlated with operation content taught by an operator and stored. During a work mode, a captured image is acquired, an image similar to it is searched for among the stored images, and the object is operated on according to the operation content correlated with the past image found by the search.
    Type: Grant
    Filed: May 24, 2007
    Date of Patent: January 6, 2015
    Inventor: Takehiro Ishizaki
  • Publication number: 20150003685
    Abstract: There is provided with an information processing apparatus. A detection unit detects a first region and a second region from a captured image. The first region includes a first portion of the measurement target that reflects a larger amount of light toward an imaging unit. The second region includes a second portion of the measurement target that reflects a smaller amount of light toward the imaging unit than the first portion does. A generation unit generates a measurement pattern that has different amounts of irradiation light depending on regions, such that an amount of irradiation light in the first region is smaller than an amount of irradiation light in the second region.
    Type: Application
    Filed: June 20, 2014
    Publication date: January 1, 2015
    Inventors: Akira Ohno, Takahisa Yamamoto, Masakazu Matsugu
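    The idea of the abstract above, giving highly reflective regions less projector light so the captured image stays balanced, can be sketched minimally. The normalization and the 0.8 attenuation factor below are assumptions for illustration, not values from the patent:

    ```python
    import numpy as np

    def measurement_pattern(reflectance, max_power=1.0):
        """Generate an irradiation map inversely related to how much light
        each region reflects: bright (highly reflective) regions get less
        light, dark regions get more."""
        r = reflectance / reflectance.max()   # normalize to [0, 1]
        return max_power * (1.0 - 0.8 * r)    # keep a minimum floor of light

    # Toy 2x2 reflectance map: top-left is the most reflective region
    refl = np.array([[0.9, 0.1],
                     [0.5, 0.2]])
    pattern = measurement_pattern(refl)
    # The most reflective region (0.9) receives the least irradiation.
    ```
    
    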
  • Publication number: 20150001186
    Abstract: An all-in-one jigless projection loading system for a vehicle is adapted to load and assemble a body component to a vehicle body. The all-in-one jigless projection loading system may include: a fixing bracket fixed to an arm of a robot; a position adjusting member rotatably mounted to the fixing bracket; a gripper mounted to the fixing bracket to be movable backward and forward, and gripping the body component; an array unit mounted to the position adjusting member, and arraying the body component; and a welding unit mounted to the fixing bracket, and projection-welding the body component to a vehicle body.
    Type: Application
    Filed: December 30, 2013
    Publication date: January 1, 2015
    Applicant: Hyundai Motor Company
    Inventor: Sung Phil Ryu
  • Publication number: 20150005937
    Abstract: An action for execution by a robotic device may be selected. A robotic controller may determine that two or more actions are to be executed based on analysis of sensory and/or training input. The actions may comprise target approach and/or obstacle avoidance. Execution of individual actions may be based on a control signal and a separate activation signal generated by the controller. Control signal execution may be inhibited by the controller relay block. Multiple activation signals may compete with one another in a winner-take-all action selection network to produce a selection signal. The selection signal may temporarily pause inhibition of the portion of the relay block associated with the winning activation signal channel. The disinhibited portion of the relay block may provide the respective control signal for execution by a controllable element. Arbitration between individual actions may be performed based on evaluation of the activation signals.
    Type: Application
    Filed: June 27, 2013
    Publication date: January 1, 2015
    Inventor: Filip Ponulak
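    The winner-take-all arbitration described above can be illustrated with a minimal sketch. The channel names and control strings are hypothetical; the point is that only the winning activation channel's control signal passes the relay block:

    ```python
    def select_action(activations, controls):
        """Winner-take-all arbitration: the strongest activation signal
        disinhibits its relay channel; every other control signal stays
        inhibited (returned as None)."""
        winner = max(activations, key=activations.get)
        return {name: (controls[name] if name == winner else None)
                for name in controls}

    # Hypothetical competing actions: target approach vs. obstacle avoidance
    activations = {"approach": 0.8, "avoid": 0.3}
    controls = {"approach": "move_forward", "avoid": "turn_left"}
    out = select_action(activations, controls)
    # Only the "approach" channel is disinhibited.
    ```
    
    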
  • Publication number: 20150005923
    Abstract: A deburring device includes a deburring tool for removing burrs from an object, a robot for moving an object or the tool, a force sensor for detecting force acting on the tool, and a visual sensor for detecting a position of a burr portion of the object. According to the deburring device, information regarding shape data of the burr portion and a posture of the tool is obtained beforehand based on three-dimensional data of the object. Based on the shape data and the posture of the tool, a robot program is created. In accordance with an actual burr portion detected by the visual sensor, the robot program is replaced as necessary. During the deburring, the robot is controlled according to the force control by using a detected value from the force sensor.
    Type: Application
    Filed: June 25, 2014
    Publication date: January 1, 2015
    Applicant: FANUC CORPORATION
    Inventor: Yihua Gu
  • Patent number: 8924019
    Abstract: A cleaning robot, a dirt recognition device thereof, and a cleaning method of the robot are disclosed. The recognition device includes an image collecting module and an image processing module. The image collecting module may be used for collecting the image information of the surface to be treated by the cleaning robot and sending the image information to the image processing module. The image processing module may divide the collected image information of the surface to be treated into N blocks, extract the image information of each block, and process the image information in order to determine the dirtiest surface to be treated that corresponds to one of the N blocks. Through the solution provided by the present invention, the cleaning robot can actively recognize dirt such as dust, so that it can reach the working area accurately and rapidly.
    Type: Grant
    Filed: June 10, 2010
    Date of Patent: December 30, 2014
    Assignee: Ecovacs Robotics Suzhou Co., Ltd.
    Inventor: Jinju Tang
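    The N-block scheme above can be sketched as follows. The "dirt score" used here (mean darkness, assuming dirt appears darker than a clean floor) is an illustrative assumption; the patent does not specify its scoring metric:

    ```python
    import numpy as np

    def dirtiest_block(gray, n=4):
        """Split a grayscale floor image into an n x n grid and return the
        (row, col) index of the block with the highest dirt score."""
        h, w = gray.shape
        bh, bw = h // n, w // n
        scores = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                block = gray[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
                scores[i, j] = 255.0 - block.mean()   # darker -> dirtier
        return np.unravel_index(np.argmax(scores), scores.shape)

    floor = np.full((64, 64), 220.0)   # mostly clean, bright floor
    floor[16:32, 32:48] = 40.0         # one dark, "dirty" patch
    print(dirtiest_block(floor))        # -> (1, 2), the block with the patch
    ```

    The robot would then navigate toward the region of the image corresponding to the winning block.
    
    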
  • Publication number: 20140379198
    Abstract: The present invention provides a mobile object capable of stable movement and jumping. The mobile object includes two moving means attached to left and right sides under a body; a sensor to detect the attitude of the body; a controller to receive information from the sensor and perform calculation; two telescopic actuators attached between the body and the two moving means and configured to generate vertical forces; a rotary actuator provided at the center of the two telescopic actuators and configured to rotate around a moving direction of the body; a roll link connected with an output part of the rotary actuator; two suspensions connecting left and right ends of the roll link with the moving means; and foot frames attached between the suspensions and the moving means, wherein the controller controls the rotary actuator so that the tilt angle and angular velocity of the body detected by the sensor reach their target values.
    Type: Application
    Filed: December 12, 2011
    Publication date: December 25, 2014
    Applicant: Hitachi, Ltd.
    Inventors: Azusa Amino, Ryosuke Nakamura, Taishi Ueda
  • Patent number: 8918230
    Abstract: Described are systems and methods, including computer program products, for controlling an unmanned vehicle. A user controls one or more unmanned vehicles with a smart phone. The smart phone receives a video stream from the unmanned vehicles and overlays controls for the unmanned vehicle on the video. The smart phone and the unmanned vehicle communicate wirelessly.
    Type: Grant
    Filed: January 21, 2011
    Date of Patent: December 23, 2014
    Assignee: Mitre Corporation
    Inventors: Peter David Chen, Dennis Bushmitch
  • Publication number: 20140371910
    Abstract: A robot system includes a robot body; a camera mounted on the robot body and capable of photographing a work piece; and a control device for driving and controlling the robot body based on a trajectory to an instruction point set in advance. When, during this driving and controlling, the camera arrives at an area in which it can photograph the work piece, the control device drives and controls the robot body so that the camera moves linearly toward the work piece, takes an image of the work piece while the camera is moving linearly, and measures a position of the work piece from the taken image.
    Type: Application
    Filed: June 4, 2014
    Publication date: December 18, 2014
    Inventor: Tadashi Hayashi
  • Publication number: 20140371911
    Abstract: A solution for pre-screening an object for further processing is provided. A pre-screening component can acquire image data of the object and process the image data to identify reference target(s) corresponding to the object, which are visible in the image data. Additionally, the pre-screening component can identify, using the reference target(s), the location of one or more components of the object. The pre-screening component can provide pre-screening data for use in further processing the object, which includes data corresponding to the set of reference targets and the location of the at least one component. A reference target can be, for example, an easily identifiable feature of the object and the component can be relevant for performing an operation on the object.
    Type: Application
    Filed: June 17, 2014
    Publication date: December 18, 2014
    Applicant: International Electronic Machines Corporation
    Inventors: Zahid F. Mian, Ronald W. Gamache
  • Publication number: 20140364986
    Abstract: A robot system capable of accurately measuring assembly accuracy of a workpiece formed to include a rotation shaft is provided. To implement such a robot system, a robot system according to an aspect of the present embodiment includes a robot and an accuracy measurement device. The robot transfers a workpiece formed to include a rotation shaft. The accuracy measurement device holds the rotation shaft of the workpiece transferred by the robot so that the shaft is substantially parallel to the vertical direction, and measures assembly accuracy of the workpiece while rotating the rotation shaft to rotate the workpiece as a whole.
    Type: Application
    Filed: August 27, 2014
    Publication date: December 11, 2014
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Takeshi OKAMOTO, Kenichi MOTONAGA, Jun MATSUMURA, Teruhisa KITAGAWA, Ryoji NAGASHIMA
  • Publication number: 20140360420
    Abstract: The invention relates to remotely operated multi-component search robots for underwater search and rescue operations, particularly suited for searches under ice.
    Type: Application
    Filed: April 23, 2014
    Publication date: December 11, 2014
    Applicant: Natick Public Schools
    Inventors: Doug Scott, Adam Azanow, Daniel Carson, William Coburn, Nicholas Exarchos, Russell Forrest, Jason Geller, Ford Grundberg, Kimya Harper, Susan Haverstick, Larion Johnson, Kevin King, James Kinsey, Alex Krasa, Ilir Kumi, Douglas Laderman, Jonathan Magee, James McLean, Alex Petrovsky, Katelyn Sweeney, Nickolas Thorsen, Olivia Van Amsterdam, Jacob Wainer, Chris Williamson, Ethan Ziegler
  • Publication number: 20140355834
    Abstract: A system and method for tracking, identifying, and labeling objects or features of interest, such as follicular units, is provided. In some embodiments, tracking is accomplished using a unique signature of the follicular unit and image stabilization techniques. According to some aspects, pixel data of a region of interest in a first image is compared to pixel data of the regions of interest in a second image, and based on a result of the comparison of pixel data in the region of interest in the first and second images and the signature of the follicular unit, the follicular unit is located in the second image. In some embodiments, the follicular unit is searched for in the direction of a motion vector.
    Type: Application
    Filed: August 14, 2014
    Publication date: December 4, 2014
    Applicant: Restoration Robotics, Inc.
    Inventors: Shehrzad A. Qureshi, Kyle R. Breton, John A. Tenney
  • Publication number: 20140358162
    Abstract: Robotic platform for mini-invasive surgery comprising robotic arms (4a, 4b) suitable to be placed in the body of a patient, introduced through a single access port, which are an extension of external robotic manipulators (19). The continuity between the external (1) and internal (4a, 4b) robotic arms is guaranteed by a trans-abdominal magnetic connection (6) coupling each internal robotic arm to its external counterpart. The trans-abdominal magnetic coupling not only guarantees a stable anchoring but, above all, transfers degrees of freedom to the internal robotic arms, inducing the motion of internal magnets through the automatized motion of external magnets. It is also possible to reposition the internal robotic arms without additional incisions on the abdomen. Using the external robotic arms allows translating the internal ones across the entire abdomen, thus providing a working space not bound to the point of insertion and theoretically unlimited.
    Type: Application
    Filed: May 31, 2012
    Publication date: December 4, 2014
    Applicant: VANDERBILT UNIVERSITY
    Inventors: Pietro Valdastri, Christian Di Natali, Massimiliano Simi, Tommaso Ranzani, Arianna Menciassi, Paolo Dario
  • Patent number: 8903589
    Abstract: Techniques that optimize performance of simultaneous localization and mapping (SLAM) processes for mobile devices, typically a mobile robot. In one embodiment, erroneous particles are introduced to the particle filtering process of localization. Monitoring the weights of the erroneous particles relative to the particles maintained for SLAM provides a verification that the robot is localized and detection that it is no longer localized. In another embodiment, cell-based grid mapping of a mobile robot's environment also monitors cells for changes in their probability of occupancy. Cells with a changing occupancy probability are marked as dynamic and updating of such cells to the map is suspended or modified until their individual occupancy probabilities have stabilized.
    Type: Grant
    Filed: October 30, 2013
    Date of Patent: December 2, 2014
    Assignee: Neato Robotics, Inc.
    Inventors: Boris Sofman, Vladimir Ermakov, Mark Emmerich, Steven Alexander, Nathaniel David Monson
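    The second embodiment above, suspending map updates for cells whose occupancy probability keeps changing, can be sketched as follows. The change threshold and stabilization count are assumed values for illustration, not figures from the patent:

    ```python
    class OccupancyCell:
        """Grid cell that tracks its occupancy probability and is marked
        dynamic (map updates suspended) while that probability keeps
        changing between observations."""
        def __init__(self, threshold=0.2, stable_updates=3):
            self.p = 0.5                   # occupancy probability
            self.threshold = threshold     # change that flags the cell dynamic
            self.stable_updates = stable_updates
            self.stable_count = 0
            self.dynamic = False

        def observe(self, p_new):
            change = abs(p_new - self.p)
            self.p = p_new
            if change > self.threshold:
                self.dynamic = True        # suspend committing this cell
                self.stable_count = 0
            elif self.dynamic:
                self.stable_count += 1
                if self.stable_count >= self.stable_updates:
                    self.dynamic = False   # probability has stabilized

    cell = OccupancyCell()
    for p in [0.9, 0.2, 0.85]:    # flickering observations -> dynamic
        cell.observe(p)
    assert cell.dynamic
    for p in [0.86, 0.85, 0.86]:  # settles -> static again, updates resume
        cell.observe(p)
    assert not cell.dynamic
    ```
    
    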
  • Publication number: 20140350727
    Abstract: Methods and systems for providing functionality of an interface to control orientations of a camera on a device are provided. In one example, a method includes receiving an input on an interface indicating a command for an orientation of a camera on a robotic device, and the interface may be provided on a device remote from the robotic device. An indicator may be provided on the interface representing a location of the input, and the indicator may be representative of the command for the orientation of the camera on the robotic device. The method may also include determining that the location of the input on the interface is within a distance threshold to a pre-set location on the interface, and repositioning the indicator on the interface to be at the pre-set location.
    Type: Application
    Filed: August 7, 2014
    Publication date: November 27, 2014
    Inventors: Munjal Desai, Ryan Hickman, Thor Lewis, Damon Kohler
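    The distance-threshold repositioning described above amounts to a snap-to-preset rule. As a minimal sketch (coordinates and the 20-pixel threshold are hypothetical):

    ```python
    def snap_indicator(x, y, presets, threshold=20.0):
        """Reposition the input indicator to a pre-set camera-orientation
        location when the input lands within `threshold` pixels of it;
        otherwise keep the raw input location."""
        for px, py in presets:
            if ((x - px) ** 2 + (y - py) ** 2) ** 0.5 <= threshold:
                return (px, py)            # snap to the pre-set location
        return (x, y)

    presets = [(100, 100), (200, 50)]
    print(snap_indicator(110, 95, presets))   # within 20 px -> (100, 100)
    print(snap_indicator(150, 150, presets))  # no preset nearby -> (150, 150)
    ```
    
    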
  • Publication number: 20140350707
    Abstract: A robot system according to one aspect of an embodiment includes a robot and a control unit. The robot performs multi-axis operations based on instructions on the operations from the control unit. The control unit causes a power source to supply power to a partially fabricated item that includes a movable member that can be controlled by receiving power, and instructs the robot to perform an operation of attaching a predetermined member to the partially fabricated item while controlling the movable member.
    Type: Application
    Filed: August 5, 2014
    Publication date: November 27, 2014
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Kenji MATSUFUJI, Ken OKAWA, Takuya MURAYAMA, Kenichi MOTONAGA