Patents by Inventor Shouta Takizawa
Shouta Takizawa has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20180315210
Abstract: An object recognition apparatus includes a two-dimensional sensor that acquires two-dimensional information of an object at a first clock time, a three-dimensional sensor that acquires three-dimensional information of the object at a second clock time, a storage unit that associates and stores a first position of the two-dimensional sensor with the two-dimensional information and a second position of the three-dimensional sensor with the three-dimensional information, and an arithmetic operation unit that calculates the amount of change in orientation between the orientation of the two-dimensional sensor and that of the three-dimensional sensor based on the stored first and second positions, converts the three-dimensional information acquired at the second position into three-dimensional information acquired at the first position based on the calculated amount of change in orientation, and calculates the state of the object based on the converted three-dimensional information…
Type: Application
Filed: April 12, 2018
Publication date: November 1, 2018
Inventors: Shouta TAKIZAWA, Junichirou YOSHIDA, Fumikazu WARASHINA
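The core conversion step described above, re-expressing three-dimensional data measured at one sensor pose in the frame of another pose, amounts to a rigid-body transform. The following is a toy sketch of that general idea with an assumed rotation and translation, not the patented apparatus:

```python
import numpy as np

def transform_points(points, rotation, translation):
    """Apply a rigid-body transform (rotation, then translation) to Nx3 points."""
    return points @ rotation.T + translation

# Hypothetical pose change: a 90-degree rotation about z plus a small shift.
theta = np.pi / 2
Rz = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
t = np.array([0.1, 0.0, 0.0])  # assumed translation, metres

# Points measured at the second sensor pose, re-expressed in the first pose's frame.
cloud_second = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
cloud_first = transform_points(cloud_second, Rz, t)
```

Once both measurements live in the same coordinate frame, the object's state can be computed as if everything had been observed from a single position.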
-
Publication number: 20180197311
Abstract: A camera abnormality cause estimation system for estimating the causes of abnormalities in a camera in a production system in which the camera controls a robot. The production system includes a robot, a camera that detects visual information of the robot or its surroundings, and a controller that controls the robot based on an image signal obtained by the camera. The camera abnormality cause estimation system includes an environment information acquisition unit that acquires environment information of the camera, and an abnormality cause estimation unit that, using the acquired environment information, estimates for each of a plurality of predetermined abnormality cause items the probability that the item is the cause of an abnormality in the camera, and displays the estimated probability for each item on a display unit.
Type: Application
Filed: January 5, 2018
Publication date: July 12, 2018
Inventors: Shouta TAKIZAWA, Fumikazu WARASHINA
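The per-item probability estimation can be pictured as scoring each candidate cause from environment readings and normalising the scores. This is a toy sketch with invented cause items and sensor names, not the patented estimator:

```python
# Toy sketch: score each predetermined abnormality cause item from environment
# readings, then normalise the scores into a probability per item.
def estimate_cause_probabilities(env):
    # The cause items and the formulas below are illustrative assumptions.
    scores = {
        "lens contamination": env["dust_level"],
        "insufficient lighting": max(0.0, 1.0 - env["illuminance_lux"] / 500.0),
        "loose cabling": env["vibration_level"],
    }
    total = sum(scores.values()) or 1.0
    return {cause: s / total for cause, s in scores.items()}

probs = estimate_cause_probabilities(
    {"dust_level": 0.6, "illuminance_lux": 250.0, "vibration_level": 0.2}
)
```

A display unit would then show `probs` ranked per item, pointing maintenance staff at the most likely cause first.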
-
Publication number: 20180194008
Abstract: For calibration of a single camera or a stereo camera, a calibration range is set in advance in the image coordinate system, and calibration is performed over an arbitrary range. A visual sensor controller is a calibration device that associates the robot coordinate system of a robot with the image coordinate system of a camera by placing a target mark on the robot, moving the robot, and detecting the target mark at multiple points in the camera's field of view. The calibration device comprises: an image range setting unit that sets an image range in the image coordinate system of the camera; and a calibration range measurement unit that, before calibration is carried out, measures the robot operation range corresponding to the image range by moving the robot and detecting the target mark.
Type: Application
Filed: January 4, 2018
Publication date: July 12, 2018
Inventors: Yuuta NAMIKI, Fumikazu WARASHINA, Shouta TAKIZAWA
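Associating robot coordinates with image coordinates from target-mark detections at several robot positions can be sketched as fitting a simple mapping by least squares. This is an illustrative affine approximation with made-up measurements, not the patented device:

```python
import numpy as np

# Assumed target-mark detections: robot XY positions and the pixel
# coordinates at which the mark was detected in each pose.
robot_xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
image_uv = np.array([[50.0, 40.0], [250.0, 40.0], [50.0, 240.0], [250.0, 240.0]])

# Fit image_uv = [robot_xy, 1] @ params by least squares.
X = np.hstack([robot_xy, np.ones((len(robot_xy), 1))])
params, *_ = np.linalg.lstsq(X, image_uv, rcond=None)

def robot_to_image(xy):
    """Predict the pixel position of the mark for a robot XY position."""
    return np.append(xy, 1.0) @ params
```

Inverting such a mapping is what lets the device measure, before calibration proper, which robot operation range corresponds to a chosen image range.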
-
Publication number: 20180101962
Abstract: A projection pattern creation apparatus is configured to capture, with an imaging device, an image of a projection pattern projected from a pattern projection device in order to measure a three-dimensional position and/or shape of an object. The projection pattern creation apparatus includes: a projection pattern deformation unit configured to reproduce the deformation that a projected pattern undergoes when captured by the imaging device, based on the characteristics of the optical systems of the pattern projection device and the imaging device and/or the positional relation between the two devices, and to generate a deformed projection pattern; and a first projection pattern improvement unit configured to generate a second projection pattern that improves on a first projection pattern, based on a first deformed projection pattern generated when the first projection pattern is projected toward evaluation surfaces having different positions and inclinations.
Type: Application
Filed: September 29, 2017
Publication date: April 12, 2018
Inventors: Shouta TAKIZAWA, Junichirou YOSHIDA, Fumikazu WARASHINA
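One simple deformation of the kind the abstract describes is the non-uniform compression of a stripe pattern seen on a tilted surface. The sketch below simulates that with nearest-neighbour resampling along one axis; the resampling law is an arbitrary assumption for illustration, not the patented deformation model:

```python
import numpy as np

# A 1-D stripe projection pattern (alternating dark/bright).
pattern = np.tile([0, 255], 8).astype(float)

# Assumed deformation: sample positions bunch up toward one end, as if the
# stripe period shrank with distance on an inclined evaluation surface.
positions = np.linspace(0, len(pattern) - 1, num=len(pattern)) ** 1.2
positions = positions / positions.max() * (len(pattern) - 1)
deformed = pattern[np.round(positions).astype(int)]
```

An improvement unit would evaluate how well `deformed` survives such warps and adjust the source pattern accordingly.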
-
Patent number: 9844882
Abstract: A robot system is provided with a three-dimensional sensor that acquires three-dimensional information of an object, and a robot that includes a gripping device for gripping the object. The robot system acquires three-dimensional shape information of the object based on first three-dimensional information, relating to the state before the object is taken out, and second three-dimensional information, relating to the state after the object is taken out, and uses that shape information to calculate the position and posture of the robot for placing the object at a target site.
Type: Grant
Filed: February 3, 2016
Date of Patent: December 19, 2017
Assignee: FANUC CORPORATION
Inventors: Shouta Takizawa, Fumikazu Warashina, Kazunori Ban
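The before/after comparison can be pictured as subtracting two depth images: where material disappeared, the difference reveals the removed object's height map. A minimal sketch with assumed depth values, not the patented system:

```python
import numpy as np

# Assumed depth images (metres from sensor to scene) before and after
# the object is taken out; larger depth after removal = material was there.
before = np.array([[0.50, 0.50],
                   [0.42, 0.42]])
after = np.array([[0.50, 0.50],
                  [0.50, 0.50]])

# Positive where the removed object occupied the scene.
object_height = after - before
object_thickness = float(object_height.max())
```

Knowing the gripped object's thickness and footprint from this difference is what lets the robot compute a placement pose that sets the object down flush at the target site.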
-
Publication number: 20170274534
Abstract: A positioning system using a robot, capable of eliminating robot error factors such as thermal expansion and backlash, and of carrying out positioning with accuracy higher than the inherent positioning accuracy of the robot. The positioning system has a robot with a movable arm, visual feature portions provided on a robot hand, and vision sensors positioned at fixed locations outside the robot and configured to capture the feature portions. The hand is configured to grip an object on which the feature portions are formed, and the vision sensors are positioned and configured to capture the respective feature portions.
Type: Application
Filed: March 23, 2017
Publication date: September 28, 2017
Applicant: FANUC CORPORATION
Inventors: Yuutarou Takahashi, Shouta Takizawa, Fumikazu Warashina
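Positioning beyond the robot's inherent accuracy generally relies on closing the loop through an external sensor: measure the feature offset seen by the fixed camera, command a corrective move, repeat. A toy feedback loop in that spirit (the function names and simulated sensor are assumptions, not the patented system):

```python
# Toy visual-feedback positioning loop: drive the observed feature offset
# to zero, which cancels robot-side errors such as thermal drift or backlash.
def position_with_feedback(read_offset, move_by, tolerance=0.01, max_iter=20):
    for _ in range(max_iter):
        offset = read_offset()
        if abs(offset) < tolerance:
            return True          # feature is within tolerance of the target
        move_by(-offset)         # corrective move opposite the observed error
    return False

# Simulated camera and robot for illustration.
state = {"offset": 1.0}
converged = position_with_feedback(
    read_offset=lambda: state["offset"],
    move_by=lambda delta: state.__setitem__("offset", state["offset"] + delta),
)
```

Because the convergence criterion is what the camera sees rather than what the joint encoders report, the final accuracy is set by the vision sensor, not by the robot's kinematic model.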
-
Publication number: 20160229061
Abstract: A robot system is provided with a three-dimensional sensor that acquires three-dimensional information of an object, and a robot that includes a gripping device for gripping the object. The robot system acquires three-dimensional shape information of the object based on first three-dimensional information, relating to the state before the object is taken out, and second three-dimensional information, relating to the state after the object is taken out, and uses that shape information to calculate the position and posture of the robot for placing the object at a target site.
Type: Application
Filed: February 3, 2016
Publication date: August 11, 2016
Inventors: Shouta TAKIZAWA, Fumikazu WARASHINA, Kazunori BAN
-
Patent number: 9060114
Abstract: An imaging command unit commands an imaging device to image an object projected with a shaded image in which a predetermined shade value is set for each pixel. An image acquisition unit acquires the shaded image captured by the imaging device in accordance with that command. An image production unit then produces a second shaded image in which a shade value is set for each pixel so as to have a brightness distribution opposite to that of the captured shaded image. The imaging command unit commands the imaging device to image the object projected with this second shaded image, and the image acquisition unit acquires the resulting capture. An object information acquisition unit acquires information on the object based on the acquired shaded images.
Type: Grant
Filed: December 23, 2013
Date of Patent: June 16, 2015
Assignee: FANUC CORPORATION
Inventor: Shouta Takizawa
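The "opposite brightness distribution" step is the easy part to sketch: for an 8-bit image, each pixel of the second pattern is the complement of the captured value, so regions that came out dark are lit brightly on the second exposure. A minimal illustration (the shade values are assumptions):

```python
import numpy as np

def inverse_pattern(shaded, max_value=255):
    """Build a shade image with the opposite brightness distribution."""
    return max_value - shaded

# Assumed captured shaded image (8-bit shade values).
captured = np.array([[0, 64],
                     [192, 255]], dtype=np.uint8)
second = inverse_pattern(captured)
```

Combining captures under the original and complementary patterns gives usable object information even where one exposure alone was saturated or underexposed.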
-
Publication number: 20140176761
Abstract: An imaging command unit commands an imaging device to image an object projected with a shaded image in which a predetermined shade value is set for each pixel. An image acquisition unit acquires the shaded image captured by the imaging device in accordance with that command. An image production unit then produces a second shaded image in which a shade value is set for each pixel so as to have a brightness distribution opposite to that of the captured shaded image. The imaging command unit commands the imaging device to image the object projected with this second shaded image, and the image acquisition unit acquires the resulting capture. An object information acquisition unit acquires information on the object based on the acquired shaded images.
Type: Application
Filed: December 23, 2013
Publication date: June 26, 2014
Applicant: FANUC CORPORATION
Inventor: Shouta TAKIZAWA
-
Patent number: 8315735
Abstract: A production system in which a human and a robot may simultaneously perform a cooperative task in the same area while ensuring the human's safety. A robot is positioned at one side of a working table, and an operator at the other side; the operator's reachable area is limited by the working table. The area of the working table is divided into an area where only the operator may perform a task, an area where only the robot may perform a task, and an area that both the operator and the robot may enter. In a cooperation mode, the maximum movement speed of a robot component inside the cooperative task area is limited to a value lower than when the component is outside that area, and the motion of the robot is limited so that the robot does not enter a robot entry-prohibited area.
Type: Grant
Filed: January 26, 2010
Date of Patent: November 20, 2012
Assignee: Fanuc Ltd
Inventors: Ryo Nihei, Shinsuke Sakakibara, Kazunori Ban, Masahiro Morioka, Satoshi Adachi, Shouta Takizawa
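The zone-based limits can be sketched as a lookup on the target position: full speed outside the shared area, a reduced cap inside it, and an outright refusal for the operator-only area. The zone boundaries and speed values below are invented for illustration, not the patented controller:

```python
# Assumed one-dimensional zones along the working table, metres.
COOPERATIVE_AREA = (0.4, 0.8)   # both operator and robot may enter
PROHIBITED_AREA = (0.8, 1.2)    # operator-only: robot entry prohibited
MAX_SPEED = 1.0                 # normal speed limit, m/s
COOP_SPEED = 0.25               # reduced limit inside the shared area, m/s

def allowed_speed(x):
    """Speed limit for a robot component targeting position x (cooperation mode)."""
    if PROHIBITED_AREA[0] < x <= PROHIBITED_AREA[1]:
        raise ValueError("target inside robot entry-prohibited area")
    if COOPERATIVE_AREA[0] <= x <= COOPERATIVE_AREA[1]:
        return COOP_SPEED
    return MAX_SPEED
```

A real controller would enforce this in three dimensions and per robot link, but the principle is the same: the partition of the table, not a physical fence, is what keeps the operator safe.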
-
Patent number: 8132835
Abstract: A workpiece gripping device includes a primary catching mechanism having a suction pad adapted to hold a workpiece by suction power and a rod-like part having the suction pad mounted on one end thereof, gripping pawls sandwiching and gripping the workpiece, a drive unit for supplying drive power, and a transmission mechanism transmitting the drive power of the drive unit to the gripping pawls. The transmission mechanism moves the gripping pawls along the axis of the rod-like part between a gripping position, where the gripping pawls project beyond the suction pad, and a retreated position, where the gripping pawls are retreated from the suction pad toward the base plate. The gripping pawls are moved to the gripping position along the axis of the rod-like part while the distance between the gripping pawls is shortened.
Type: Grant
Filed: September 8, 2009
Date of Patent: March 13, 2012
Assignee: Fanuc Ltd
Inventors: Kazunori Ban, Fumikazu Warashina, Shouta Takizawa
-
Patent number: 8098928
Abstract: An apparatus for picking up objects, at least one part of each object having a curved shape, including a robot for picking up an object and having: a storing means for storing a gray gradient distribution model of the object; a recognizing means for recognizing a gray image of the object; a gradient extracting means for extracting a gray gradient distribution from the recognized gray image; an object detecting means for detecting a position, or position and posture, of the object in the gray image in accordance with the extracted gray gradient distribution and the stored gray gradient distribution model; a detection information converting means for converting the detected position or position and posture into a position or position and posture in a coordinate system regarding the robot; and a robot moving means for moving the robot to the position or position and posture converted by the detection information converting means.
Type: Grant
Filed: March 28, 2008
Date of Patent: January 17, 2012
Assignee: Fanuc Ltd
Inventors: Kazunori Ban, Ichiro Kanno, Hidetoshi Kumiya, Shouta Takizawa
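Extracting a gray gradient distribution from a gray image is the standard first step of gradient-based matching: compute per-pixel gradient magnitude and orientation. A minimal sketch of that step (the matching against the stored model is omitted, and the test image is an assumption):

```python
import numpy as np

def gray_gradients(image):
    """Per-pixel gray-gradient magnitude and orientation of a 2-D gray image."""
    gy, gx = np.gradient(image.astype(float))   # row-wise, column-wise derivatives
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)            # radians, 0 = brightening rightward
    return magnitude, orientation

# Assumed gray image: brightness ramps left to right, so the gradient points
# right with unit magnitude everywhere.
image = np.tile(np.arange(5.0), (5, 1))
mag, ori = gray_gradients(image)
```

Gradient distributions are attractive for curved, specular parts because gradient orientation is far more stable under lighting changes than raw gray values.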
-
Publication number: 20100191372
Abstract: A production system in which a human and a robot may simultaneously perform a cooperative task in the same area while ensuring the human's safety. A robot is positioned at one side of a working table, and an operator at the other side; the operator's reachable area is limited by the working table. The area of the working table is divided into an area where only the operator may perform a task, an area where only the robot may perform a task, and an area that both the operator and the robot may enter. In a cooperation mode, the maximum movement speed of a robot component inside the cooperative task area is limited to a value lower than when the component is outside that area, and the motion of the robot is limited so that the robot does not enter a robot entry-prohibited area.
Type: Application
Filed: January 26, 2010
Publication date: July 29, 2010
Applicant: FANUC LTD
Inventors: Ryo NIHEI, Shinsuke SAKAKIBARA, Kazunori BAN, Masahiro MORIOKA, Satoshi ADACHI, Shouta TAKIZAWA
-
Publication number: 20100078953
Abstract: A workpiece gripping device includes a primary catching mechanism having a suction pad adapted to hold a workpiece by suction power and a rod-like part having the suction pad mounted on one end thereof, gripping pawls sandwiching and gripping the workpiece, a drive unit for supplying drive power, and a transmission mechanism transmitting the drive power of the drive unit to the gripping pawls. The transmission mechanism moves the gripping pawls along the axis of the rod-like part between a gripping position, where the gripping pawls project beyond the suction pad, and a retreated position, where the gripping pawls are retreated from the suction pad toward the base plate. The gripping pawls are moved to the gripping position along the axis of the rod-like part while the distance between the gripping pawls is shortened.
Type: Application
Filed: September 8, 2009
Publication date: April 1, 2010
Applicant: FANUC LTD
Inventors: Kazunori BAN, Fumikazu WARASHINA, Shouta TAKIZAWA
-
Publication number: 20080240511
Abstract: An apparatus for picking up objects, at least one part of each object having a curved shape, including a robot for picking up an object and having: a storing means for storing a gray gradient distribution model of the object; a recognizing means for recognizing a gray image of the object; a gradient extracting means for extracting a gray gradient distribution from the recognized gray image; an object detecting means for detecting a position, or position and posture, of the object in the gray image in accordance with the extracted gray gradient distribution and the stored gray gradient distribution model; a detection information converting means for converting the detected position or position and posture into a position or position and posture in a coordinate system regarding the robot; and a robot moving means for moving the robot to the position or position and posture converted by the detection information converting means.
Type: Application
Filed: March 28, 2008
Publication date: October 2, 2008
Applicant: FANUC LTD
Inventors: Kazunori BAN, Ichiro KANNO, Hidetoshi KUMIYA, Shouta TAKIZAWA