Patents by Inventor Fumikazu Warashina
Fumikazu Warashina has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10596698
Abstract: A machine learning device that acquires state information from a robot control inspection system. The system has a robot hand to hold a workpiece or camera. The state information includes a flaw detection position of the workpiece, a movement route of the robot hand, an imaging point of the workpiece, and the number of images captured by the camera. A reward calculator calculates a reward value in reinforcement learning based on flaw detection information including the flaw detection position. A value function updater updates an action value function by performing the reinforcement learning based on the reward value, the state information, and the action.
Type: Grant
Filed: May 16, 2018
Date of Patent: March 24, 2020
Assignee: Fanuc Corporation
Inventors: Yuusuke Oota, Fumikazu Warashina, Hiromitsu Takahashi
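The reward-and-value-update loop this abstract describes can be sketched as plain Q-learning. This is an illustrative sketch, not the patented implementation: the reward shape and every name (`q_table`, `reward_from_flaws`, the state/action encoding) are assumptions invented for the example.

```python
from collections import defaultdict

ALPHA, GAMMA = 0.1, 0.9          # learning rate, discount factor

q_table = defaultdict(float)     # (state, action) -> action value

def reward_from_flaws(flaws_found, images_taken):
    # Assumed reward shape: detecting flaws is rewarded, extra imaging
    # (longer inspection cycle) is penalized.
    return flaws_found - 0.1 * images_taken

def update_q(state, action, reward, next_state, actions):
    # Standard Q-learning update of the action value function.
    best_next = max(q_table[(next_state, a)] for a in actions)
    q_table[(state, action)] += ALPHA * (
        reward + GAMMA * best_next - q_table[(state, action)]
    )
```

With an empty table, one update for a reward of 1.0 moves the stored action value by `ALPHA * reward`, i.e. to 0.1.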
-
Patent number: 10593053
Abstract: A projection pattern creation apparatus is configured to capture an image of a projection pattern projected from a pattern projection device by an imaging device to measure a three-dimensional position and/or a shape of an object. The projection pattern creation apparatus includes: a projection pattern deformation unit configured to reproduce the deformation of a projected projection pattern as it appears in an image captured by the imaging device, on the basis of characteristics of the optical systems of the pattern projection device and the imaging device and/or the positional relation between the two devices, and generate a deformation projection pattern; and a first projection pattern improvement unit configured to generate a second projection pattern obtained by improving a first projection pattern, on the basis of a first deformation projection pattern generated when the first projection pattern is projected toward evaluation surfaces having different positions and inclinations.
Type: Grant
Filed: September 29, 2017
Date of Patent: March 17, 2020
Assignee: FANUC CORPORATION
Inventors: Shouta Takizawa, Junichirou Yoshida, Fumikazu Warashina
-
Publication number: 20200061825
Abstract: An interference avoidance device is provided with: a three-dimensional sensor that is attached to a tip portion of a robot arm and acquires a distance image of an area around a robot; a position data creating portion that converts coordinates of a nearby object in the distance image to coordinates on a robot coordinate system and creates position data of the nearby object based on those coordinates; a storage portion that stores the position data; and a control portion that controls the robot based on the robot coordinate system. The control portion controls the robot to avoid interference of the robot with the nearby object, based on the position data stored in the storage portion.
Type: Application
Filed: July 23, 2019
Publication date: February 27, 2020
Applicant: Fanuc Corporation
Inventors: Fumikazu WARASHINA, Yuta NAMIKI, Kyouhei KOKUBO
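The coordinate conversion this entry describes, from the sensor frame at the arm tip into the robot coordinate system, amounts to applying a rigid transform to each measured point. A minimal sketch, assuming a known 4x4 sensor-to-base matrix (`T_base_sensor` and the example transform are illustrative assumptions, not values from the patent):

```python
import numpy as np

def to_robot_frame(point_sensor, T_base_sensor):
    """Convert a 3D point from sensor coordinates to robot coordinates."""
    p = np.append(point_sensor, 1.0)      # homogeneous coordinates
    return (T_base_sensor @ p)[:3]

# Example: sensor frame offset 0.5 m along the robot base's x axis.
T = np.eye(4)
T[0, 3] = 0.5
print(to_robot_frame(np.array([0.1, 0.2, 0.3]), T))  # -> [0.6 0.2 0.3]
```

In practice `T_base_sensor` would be composed from the robot's forward kinematics and a hand-eye calibration, since the sensor rides on the arm tip.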
-
Patent number: 10525598
Abstract: A positioning system using a robot, capable of eliminating error factors of the robot, such as thermal expansion or backlash, and of carrying out positioning of the robot with accuracy higher than the inherent positioning accuracy of the robot. The positioning system has a robot with a movable arm, visual feature portions provided on a robot hand, and vision sensors positioned at a fixed position outside the robot and configured to capture the feature portions. The hand is configured to grip an object on which the feature portions are formed, and the vision sensors are positioned and configured to capture the respective feature portions.
Type: Grant
Filed: March 23, 2017
Date of Patent: January 7, 2020
Assignee: FANUC CORPORATION
Inventors: Yuutarou Takahashi, Shouta Takizawa, Fumikazu Warashina
-
Patent number: 10500731
Abstract: The robot system includes a carriage for supporting a robot. The robot system includes a camera, and a mark disposed in a workspace. The control device includes a position acquisition part configured to acquire a position of the mark on the basis of an image captured by the camera, and a determination part configured to determine whether or not the robot is located at a position within a predetermined determination range. When the determination part determines that the position of the robot deviates from the determination range, the display device displays the direction and the movement amount in which the carriage is to be moved.
Type: Grant
Filed: October 3, 2017
Date of Patent: December 10, 2019
Assignee: FANUC CORPORATION
Inventors: Junichirou Yoshida, Fumikazu Warashina
-
Patent number: 10497146
Abstract: A camera abnormality cause estimation system for estimating the causes of abnormalities in a camera in a production system in which the camera controls a robot. The production system includes a robot, a camera that detects visual information of the robot or its surroundings, and a controller that controls the robot based on an image signal obtained by the camera. The camera abnormality cause estimation system includes an environment information acquisition unit that acquires environment information of the camera, and an abnormality cause estimation unit that estimates, for each of a plurality of predetermined abnormality cause items, a probability that the item is the cause of an abnormality in the camera, using the environment information acquired by the environment information acquisition unit, and displays the estimated probability for each item on a display unit.
Type: Grant
Filed: January 5, 2018
Date of Patent: December 3, 2019
Assignee: FANUC CORPORATION
Inventors: Shouta Takizawa, Fumikazu Warashina
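Estimating a per-item cause probability from environment information could be sketched as scoring each predetermined cause item and normalizing the scores. Everything here is an invented assumption for illustration (the cause items, the feature weights, and the softmax normalization are not from the patent):

```python
import math

# Hypothetical cause items and feature weights, invented for the sketch.
CAUSE_WEIGHTS = {
    "lens_contamination": {"dust_level": 2.0},
    "overheating":        {"temperature": 1.5},
    "cable_noise":        {"vibration": 1.8},
}

def estimate_cause_probabilities(env):
    # Score each predetermined abnormality cause item from the
    # environment information, then normalize with a softmax so the
    # displayed values sum to 1.
    scores = {
        cause: sum(w * env.get(feat, 0.0) for feat, w in feats.items())
        for cause, feats in CAUSE_WEIGHTS.items()
    }
    z = sum(math.exp(s) for s in scores.values())
    return {cause: math.exp(s) / z for cause, s in scores.items()}

probs = estimate_cause_probabilities({"temperature": 0.9, "dust_level": 0.1})
```

With the example environment, "overheating" receives the highest probability, which is the kind of ranked output the abstract says is shown on the display unit.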
-
Publication number: 20190333182
Abstract: An image management device includes: an image data acquisition unit that acquires image data snapped by a visual sensor; an image snap condition acquisition unit that acquires an image snap condition applied when the image data is snapped, from a manufacturing machine and the visual sensor; an archive data creation unit that creates archive data based on the image data and the image snap condition; and an archive data storage unit that stores the archive data. The archive data creation unit creates archive data after recording the image snap condition in a file. The image management device can easily reuse image data that is snapped by the visual sensor attached to the manufacturing machine and the image snap condition of the image data.
Type: Application
Filed: April 25, 2019
Publication date: October 31, 2019
Inventors: Yuta NAMIKI, Fumikazu WARASHINA
-
Patent number: 10451561
Abstract: An inspection system makes image inspection on an inspection target. The inspection system includes: an image capture device that captures an image of the inspection target; a blower with a blow nozzle from which clean gas is blown out to the inspection target; a robot with an arm tip to which the image capture device and the blow nozzle, or the inspection target, is attached; and an inspection device that makes image inspection on the inspection target based on an image captured by the image capture device. The inspection device generates an operation program for the robot based on the positions of the image capture device and the blow nozzle relative to each other, so as to move the blow nozzle ahead of the image capture device relative to the inspection target.
Type: Grant
Filed: September 11, 2018
Date of Patent: October 22, 2019
Assignee: FANUC CORPORATION
Inventors: Yuusuke Oota, Fumikazu Warashina
-
Patent number: 10434654
Abstract: A parameter for detecting a target mark 5 is not required to be set for each camera repeatedly while a stereo camera 2 is calibrated. A calibration device 1 associates position information in an image coordinate system at a first camera 21 of the stereo camera 2, position information in an image coordinate system at a second camera 22 of the stereo camera 2, and position information in a robot coordinate system at a robot 4. The calibration device comprises: a first parameter setting unit 102 that sets a first parameter for detecting a target mark 5 attached to the robot 4 from data about an image captured by the first camera 21; and a second parameter setting unit 104 that sets a second parameter for detecting the target mark 5 from data about an image captured by the second camera 22, based on the first parameter.
Type: Grant
Filed: January 4, 2018
Date of Patent: October 8, 2019
Assignee: FANUC CORPORATION
Inventors: Yuuta Namiki, Fumikazu Warashina
-
Publication number: 20190299403
Abstract: A robot system includes a target position calculation section which calculates, when a first feature can be detected from an image, a target position of a robot based on the calculated position of the first feature and a stored first positional relationship, and calculates, when the first feature cannot be detected from the image and a second feature can be detected from the image, a target position of the robot based on the calculated position of the second feature and the stored first positional relationship.
Type: Application
Filed: February 8, 2019
Publication date: October 3, 2019
Inventors: Kyouhei KOKUBO, Fumikazu WARASHINA
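The fallback logic in this abstract (use the first feature if detected, otherwise try the second, and combine the result with the stored positional relationship) can be sketched as follows. The detector callables and the 2D-offset form of the positional relationship are assumptions for illustration:

```python
def compute_target(image, detect_first, detect_second, offset):
    """Return a robot target position, or None if neither feature is found.

    detect_first / detect_second return an (x, y) feature position or
    None when the feature cannot be detected in the image.
    """
    pos = detect_first(image)
    if pos is None:
        pos = detect_second(image)   # fall back to the second feature
    if pos is None:
        return None
    # Apply the stored positional relationship (feature -> target).
    return (pos[0] + offset[0], pos[1] + offset[1])
```

A caller would pass the two trained detectors and the offset taught in advance; the function degrades gracefully when the primary feature is occluded.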
-
Publication number: 20190299405
Abstract: A machine learning device includes a state observation unit for observing, as state variables, an image of a workpiece captured by a vision sensor, and a movement amount of an arm end portion from an arbitrary position, the movement amount being calculated so as to bring the image close to a target image; a determination data retrieval unit for retrieving the target image as determination data; and a learning unit for learning the movement amount to move the arm end portion or the workpiece from the arbitrary position to a target position. The target position is a position in which the vision sensor and the workpiece have a predetermined relative positional relationship. The target image is an image of the workpiece captured by the vision sensor when the arm end portion or the workpiece is disposed in the target position.
Type: Application
Filed: March 22, 2019
Publication date: October 3, 2019
Inventors: Fumikazu WARASHINA, Yuutarou TAKAHASHI
-
Patent number: 10339668
Abstract: An object recognition apparatus includes a two-dimensional sensor for acquiring two-dimensional information of an object at a first clock time, a three-dimensional sensor for acquiring three-dimensional information of the object at a second clock time, a storage unit that associates and stores a first position of the two-dimensional sensor and the two-dimensional information, and a second position of the three-dimensional sensor and the three-dimensional information, and an arithmetic operation unit that calculates the amount of change in orientation between the orientation of the two-dimensional sensor and the orientation of the three-dimensional sensor based on the stored first position and second position, that converts the three-dimensional information acquired at the second position into three-dimensional information acquired at the first position based on the calculated amount of change in orientation, and that calculates the state of the object based on the converted three-dimensional information…
Type: Grant
Filed: April 12, 2018
Date of Patent: July 2, 2019
Assignee: FANUC CORPORATION
Inventors: Shouta Takizawa, Junichirou Yoshida, Fumikazu Warashina
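The conversion step in this abstract, re-expressing data captured at the second sensor pose in the first pose's frame, follows from the relative rigid transform between the two stored poses. A minimal sketch under the assumption that each pose is available as a 4x4 homogeneous matrix (the matrix representation and numpy usage are illustrative):

```python
import numpy as np

def relative_transform(T_first, T_second):
    # Amount of change in pose between the two stored sensor positions.
    return np.linalg.inv(T_first) @ T_second

def convert_points(points_second, T_first, T_second):
    """Express Nx3 points measured in the second frame in the first frame."""
    T_rel = relative_transform(T_first, T_second)
    pts = np.hstack([points_second, np.ones((len(points_second), 1))])
    return (T_rel @ pts.T).T[:, :3]
```

For example, if the second pose is the first pose shifted 1 m along x, a point at the second sensor's origin maps to (1, 0, 0) in the first frame, so 2D and 3D data taken at different times can be fused in one frame.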
-
Publication number: 20190197676
Abstract: An object inspection system enabling quick and easy registration of a master image and an inspection image when there is a displacement between the two. The object inspection system includes an imaging section, a movement machine configured to move a first object or a second object and the imaging section relative to each other, a positional data acquisition section configured to acquire positional data of the movement machine when the movement machine disposes the first object or the second object and the imaging section at a relative position, an image data acquisition section configured to acquire a first image of the first object and a second image of the second object, and an image registering section configured to register the first image and the second image.
Type: Application
Filed: December 7, 2018
Publication date: June 27, 2019
Inventors: Junichirou YOSHIDA, Fumikazu WARASHINA
-
Publication number: 20190120771
Abstract: An inspection system makes image inspection on an inspection target. The inspection system includes: an image capture device that captures an image of the inspection target; a blower with a blow nozzle from which clean gas is blown out to the inspection target; a robot with an arm tip to which the image capture device and the blow nozzle, or the inspection target, is attached; and an inspection device that makes image inspection on the inspection target based on an image captured by the image capture device. The inspection device generates an operation program for the robot based on the positions of the image capture device and the blow nozzle relative to each other, so as to move the blow nozzle ahead of the image capture device relative to the inspection target.
Type: Application
Filed: September 11, 2018
Publication date: April 25, 2019
Inventors: Yuusuke OOTA, Fumikazu WARASHINA
-
Publication number: 20190066287
Abstract: An inspection system includes a first imaging device provided in a first inspection device; a second imaging device provided in a second inspection device; a first controller; and a second controller. The first controller acquires a particular feature of a calibration jig, which is positioned in the first inspection device, from an image of the calibration jig obtained by the first imaging device as first feature data, and the second controller acquires the particular feature of the calibration jig, which is positioned in the second inspection device, from an image of the calibration jig obtained by the second imaging device as second feature data. The correction amount needed for correcting the image obtained by the second imaging device so that the second feature data matches the first feature data is acquired, and the second inspection device corrects an image of an inspection subject using this correction amount.
Type: Application
Filed: August 21, 2018
Publication date: February 28, 2019
Applicant: Fanuc Corporation
Inventors: Junichirou YOSHIDA, Fumikazu WARASHINA
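Deriving a correction amount from two observations of the same calibration jig can be sketched in its simplest form as a 2D offset between the feature positions. This is a simplifying assumption for illustration; the patent does not restrict the correction to a pure translation:

```python
def correction_amount(first_feature, second_feature):
    """Offset that maps the second device's feature onto the first's."""
    return (first_feature[0] - second_feature[0],
            first_feature[1] - second_feature[1])

def correct_point(point, corr):
    # Apply the stored correction to a point in the second device's image.
    return (point[0] + corr[0], point[1] + corr[1])

# Jig feature seen at (10, 5) by the first device, (8, 2) by the second:
corr = correction_amount((10.0, 5.0), (8.0, 2.0))   # (2.0, 3.0)
```

A fuller implementation would estimate an affine or projective transform from several jig features rather than a single offset, but the flow (measure both, store the difference, apply it to inspection images) is the same.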
-
Publication number: 20180370027
Abstract: A machine learning device that acquires state information from a robot control inspection system. The system has a robot hand to hold a workpiece or camera. The state information includes a flaw detection position of the workpiece, a movement route of the robot hand, an imaging point of the workpiece, and the number of images captured by the camera. A reward calculator calculates a reward value in reinforcement learning based on flaw detection information including the flaw detection position. A value function updater updates an action value function by performing the reinforcement learning based on the reward value, the state information, and the action.
Type: Application
Filed: May 16, 2018
Publication date: December 27, 2018
Inventors: Yuusuke OOTA, Fumikazu WARASHINA, Hiromitsu TAKAHASHI
-
Publication number: 20180315210
Abstract: An object recognition apparatus includes a two-dimensional sensor for acquiring two-dimensional information of an object at a first clock time, a three-dimensional sensor for acquiring three-dimensional information of the object at a second clock time, a storage unit that associates and stores a first position of the two-dimensional sensor and the two-dimensional information, and a second position of the three-dimensional sensor and the three-dimensional information, and an arithmetic operation unit that calculates the amount of change in orientation between the orientation of the two-dimensional sensor and the orientation of the three-dimensional sensor based on the stored first position and second position, that converts the three-dimensional information acquired at the second position into three-dimensional information acquired at the first position based on the calculated amount of change in orientation, and that calculates the state of the object based on the converted three-dimensional information…
Type: Application
Filed: April 12, 2018
Publication date: November 1, 2018
Inventors: Shouta TAKIZAWA, Junichirou YOSHIDA, Fumikazu WARASHINA
-
Publication number: 20180275073
Abstract: A device capable of easily defining an area other than a surface to be inspected of a workpiece. The device includes a drawing acquisition section for acquiring drawing data of the workpiece; a designation reception section for receiving specification of the surface to be inspected of the workpiece in the drawing data; and a non-inspection area calculation section for calculating, as a non-inspection area, an image area other than the surface to be inspected in an image in a view of the imaging section when the workpiece and the imaging section are positioned at an imaging position at which at least a part of the specified surface to be inspected falls within the view of the imaging section.
Type: Application
Filed: March 20, 2018
Publication date: September 27, 2018
Inventors: Junichirou YOSHIDA, Fumikazu WARASHINA
-
Publication number: 20180260628
Abstract: An image processing apparatus, which receives an input image and detects an image of a target object based on a detection algorithm, includes a machine learning device which performs learning by using a plurality of partial images cut out from at least one input image, based on a result of detection of the image of the target object, and calculates a likelihood of the image of the target object.
Type: Application
Filed: February 20, 2018
Publication date: September 13, 2018
Inventors: Yuuta NAMIKI, Fumikazu WARASHINA
-
Publication number: 20180231474
Abstract: An apparatus capable of quickly constructing an operation program that causes an inspection system to carry out an operation for imaging the surface to be inspected. This apparatus includes a drawing acquisition section configured to acquire drawing data of the workpiece, a designation receiving section configured to accept designation of the surface to be inspected in the drawing data, a target position acquisition section configured to acquire, as a target position, a position of the movement mechanism when the workpiece and the imaging section are positioned such that the surface to be inspected is within a field of view of the imaging section, and a program generation section configured to generate an operation program for controlling a movement operation of the movement mechanism and an imaging operation of the imaging section on the basis of the target position.
Type: Application
Filed: February 12, 2018
Publication date: August 16, 2018
Inventors: Junichirou YOSHIDA, Fumikazu WARASHINA