Patents by Inventor Koji SOKABE

Koji SOKABE has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
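Illustrative implementation sketches for several of the listed inventions follow the listing; they are hypothetical readings of the published abstracts, not the patented implementations.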

  • Patent number: 11931895
    Abstract: A robot control system includes circuitry configured to: determine a necessity of assisting a robot to complete an automated work, based on environment information of the robot; select a remote operator from candidate remote operators based on stored operator data in response to determining that it is necessary to assist the robot to complete the automated work; transmit the environment information to the selected remote operator via a communication network; receive an operation instruction based on the environment information from the selected remote operator via the communication network; and control the robot to complete the automated work based on the operation instruction.
    Type: Grant
    Filed: July 8, 2021
    Date of Patent: March 19, 2024
    Inventors: Hiroyuki Handa, Koji Sokabe, Keita Shimamoto, Masaru Adachi, Ryokichi Hirata
  • Publication number: 20230010302
    Abstract: A robot control system includes circuitry configured to: generate a command to a robot; receive a frame image in which a capture position changes according to a motion of the robot based on the command; extract a partial region from the frame image according to the command; superimpose a delay mark on the partial region to generate an operation image; and display the operation image on a display device, so as to represent a delay of the motion of the robot with respect to the command.
    Type: Application
    Filed: September 15, 2022
    Publication date: January 12, 2023
    Inventors: Keita SHIMAMOTO, Koji SOKABE, Ryokichi HIRATA, Masaru ADACHI
  • Publication number: 20220371203
    Abstract: A robot control system includes circuitry configured to: acquire an input command value indicating a manipulation of a robot by a subject user; acquire a current state of the robot and a target state associated with the manipulation of the robot; determine a state difference between the current state and the target state; acquire, from a learned model, a degree of distribution associated with a motion of the robot, based on the state difference, wherein the learned model is generated based on a past robot manipulation; set a level of assistance to be given during the manipulation of the robot by the subject user, based on the acquired degree of distribution; and generate an output command value for operating the robot, based on the input command value and the level of assistance.
    Type: Application
    Filed: August 2, 2022
    Publication date: November 24, 2022
    Inventors: Masayuki FUJITA, Takeshi HATANAKA, Junya YAMAUCHI, Kosei NODA, Keita SHIMAMOTO, Koji SOKABE, Ryokichi HIRATA, Masaru ADACHI
  • Patent number: 11446820
    Abstract: To generate a more appropriate path, provided is a robot path generation device including circuitry configured to: hold a track planning module learning data set, in which a plurality of pieces of path data generated based on a motion constraint condition of a robot, and evaluation value data, which corresponds to each of the plurality of pieces of path data and is a measure under a predetermined evaluation criterion, are associated with each other; and generate, based on a result of a machine learning process that is based on the track planning module learning data set, a path of the robot between a start point and an end point that are freely set.
    Type: Grant
    Filed: June 26, 2019
    Date of Patent: September 20, 2022
    Assignee: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Koji Sokabe, Masaru Adachi
  • Patent number: 11338435
    Abstract: A gripping system includes a hand that grips a workpiece, a robot that supports the hand and changes at least one of a position and a posture of the hand, and an image sensor that acquires image information from a viewpoint interlocked with at least one of the position and the posture of the hand. Additionally, the gripping system includes a construction module that constructs a model by machine learning based on collection data. The model corresponds to at least a part of a process of specifying an operation command of the robot based on the image information acquired by the image sensor and hand position information representing at least one of the position and the posture of the hand. An operation module computes the operation command of the robot based on the image information, the hand position information, and the model, and a robot control module operates the robot based on the command computed by the operation module.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: May 24, 2022
    Inventors: Shota Ishikawa, Koji Sokabe, Keisuke Nakamura, Masaru Adachi, Yuichi Sasaki, Antoine Pasquali, Thomas Wilmotte
  • Publication number: 20210339392
    Abstract: A robot control system includes circuitry configured to: determine a necessity of assisting a robot to complete an automated work, based on environment information of the robot; select a remote operator from candidate remote operators based on stored operator data in response to determining that it is necessary to assist the robot to complete the automated work; transmit the environment information to the selected remote operator via a communication network; receive an operation instruction based on the environment information from the selected remote operator via the communication network; and control the robot to complete the automated work based on the operation instruction.
    Type: Application
    Filed: July 8, 2021
    Publication date: November 4, 2021
    Inventors: Hiroyuki HANDA, Koji SOKABE, Keita SHIMAMOTO, Masaru ADACHI, Ryokichi HIRATA
  • Publication number: 20190314989
    Abstract: To generate a more appropriate path, provided is a robot path generation device including circuitry configured to: hold a track planning module learning data set, in which a plurality of pieces of path data generated based on a motion constraint condition of a robot, and evaluation value data, which corresponds to each of the plurality of pieces of path data and is a measure under a predetermined evaluation criterion, are associated with each other; and generate, based on a result of a machine learning process that is based on the track planning module learning data set, a path of the robot between a start point and an end point that are freely set.
    Type: Application
    Filed: June 26, 2019
    Publication date: October 17, 2019
    Inventors: Koji Sokabe, Masaru Adachi
  • Publication number: 20190152054
    Abstract: A gripping system includes a hand that grips a workpiece, a robot that supports the hand and changes at least one of a position and a posture of the hand, and an image sensor that acquires image information from a viewpoint interlocked with at least one of the position and the posture of the hand. Additionally, the gripping system includes a construction module that constructs a model by machine learning based on collection data. The model corresponds to at least a part of a process of specifying an operation command of the robot based on the image information acquired by the image sensor and hand position information representing at least one of the position and the posture of the hand. An operation module computes the operation command of the robot based on the image information, the hand position information, and the model, and a robot control module operates the robot based on the command computed by the operation module.
    Type: Application
    Filed: November 19, 2018
    Publication date: May 23, 2019
    Applicant: KABUSHIKI KAISHA YASKAWA DENKI
    Inventors: Shota ISHIKAWA, Koji SOKABE, Keisuke NAKAMURA, Masaru ADACHI, Yuichi SASAKI, Antoine PASQUALI, Thomas WILMOTTE
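
The remote-assistance invention (patent 11931895 and its published application 20210339392) can be read as a short decision loop: check whether the robot needs help, pick a remote operator from stored operator data, exchange environment information and an operation instruction over the network, and execute the instruction. The following is a minimal Python sketch of that reading; the operator record fields, the confidence threshold, and the skill-based selection rule are assumptions for illustration, not the patented implementation.

```python
# Hypothetical sketch of the assistance flow in patent 11931895 / publication
# 20210339392. All names, data structures, and the selection heuristic are
# illustrative assumptions, not the patented implementation.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class OperatorRecord:
    """Stored operator data used to rank candidate remote operators."""
    operator_id: str
    skill_score: float      # assumed proficiency measure
    available: bool


def needs_assistance(environment: dict) -> bool:
    # Assumed rule: ask for help when the robot reports low confidence
    # in completing the automated work on its own.
    return environment.get("task_confidence", 1.0) < 0.5


def select_operator(candidates: list[OperatorRecord]) -> Optional[OperatorRecord]:
    # Assumed policy: pick the available operator with the highest skill score.
    available = [c for c in candidates if c.available]
    return max(available, key=lambda c: c.skill_score) if available else None


def assist_robot(environment: dict,
                 candidates: list[OperatorRecord],
                 send_environment: Callable[[str, dict], None],
                 receive_instruction: Callable[[str], dict],
                 control_robot: Callable[[dict], None]) -> None:
    """One pass of the remote-assistance loop."""
    if not needs_assistance(environment):
        return
    operator = select_operator(candidates)
    if operator is None:
        return
    send_environment(operator.operator_id, environment)      # over the network
    instruction = receive_instruction(operator.operator_id)  # operator's command
    control_robot(instruction)                                # finish the work


if __name__ == "__main__":
    # Stand-in callables so the sketch runs end to end.
    log = []
    assist_robot(
        environment={"task_confidence": 0.3},
        candidates=[OperatorRecord("op-1", 0.9, True), OperatorRecord("op-2", 0.7, True)],
        send_environment=lambda op, env: log.append(("sent", op, env)),
        receive_instruction=lambda op: {"gripper": "close"},
        control_robot=lambda cmd: log.append(("executed", cmd)),
    )
    print(log)
```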
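
Publication 20230010302 describes cropping a camera frame according to the current command and overlaying a delay mark so the operator can see how far the robot's motion lags the command. A minimal sketch, assuming the command carries a crop box and that the mark is a border whose thickness scales with the measured delay (both assumptions), could look like this:

```python
# Minimal sketch of the delay-visualisation idea in publication 20230010302,
# using NumPy arrays as stand-ins for camera frames. The crop geometry and the
# way the delay mark is drawn are assumptions for illustration only.
import numpy as np


def extract_partial_region(frame: np.ndarray, command: dict) -> np.ndarray:
    """Crop the region the command is expected to affect (assumed geometry)."""
    x, y, w, h = command["region"]          # assumed: command carries a crop box
    return frame[y:y + h, x:x + w].copy()


def superimpose_delay_mark(region: np.ndarray, delay_s: float) -> np.ndarray:
    """Draw a simple border whose thickness grows with the measured delay."""
    marked = region.copy()
    thickness = max(1, int(delay_s * 10))   # assumed mapping from delay to pixels
    marked[:thickness, :] = 255
    marked[-thickness:, :] = 255
    marked[:, :thickness] = 255
    marked[:, -thickness:] = 255
    return marked


def build_operation_image(frame: np.ndarray, command: dict, delay_s: float) -> np.ndarray:
    """Operation image shown to the operator: cropped view plus delay mark."""
    return superimpose_delay_mark(extract_partial_region(frame, command), delay_s)


if __name__ == "__main__":
    frame = np.zeros((480, 640), dtype=np.uint8)   # placeholder camera frame
    command = {"region": (100, 100, 200, 150)}     # hypothetical crop box
    image = build_operation_image(frame, command, delay_s=0.3)
    print(image.shape)                             # (150, 200)
```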
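
Publication 20220371203 sets a level of assistance from the spread (the "degree of distribution") that a learned model associates with the current state difference, then uses that level together with the user's input command to form the output command. The sketch below is one hypothetical reading: the learned model is a stand-in function, and blending the user command with a move-toward-target command is an assumed way to generate the output command value.

```python
# Sketch of the shared-control idea in publication 20220371203. The "learned
# model" is a stand-in function, and the blending rule is an assumption about
# how the output command value could be formed.
import numpy as np


def degree_of_distribution(state_difference: np.ndarray) -> float:
    """Stand-in for the learned model: returns a spread (variance-like) value
    associated with past robot motions for this state difference."""
    return float(0.05 + 0.2 * np.linalg.norm(state_difference))


def assistance_level(spread: float, max_spread: float = 1.0) -> float:
    """Assumed rule: the narrower the spread of past motions, the stronger the
    assistance; a wide spread leaves the user mostly in control."""
    return float(np.clip(1.0 - spread / max_spread, 0.0, 1.0))


def output_command(user_command: np.ndarray,
                   current_state: np.ndarray,
                   target_state: np.ndarray) -> np.ndarray:
    diff = target_state - current_state
    level = assistance_level(degree_of_distribution(diff))
    autonomous_command = diff                  # assumed: move toward the target
    return (1.0 - level) * user_command + level * autonomous_command


if __name__ == "__main__":
    cmd = output_command(user_command=np.array([0.1, 0.0]),
                         current_state=np.array([0.0, 0.0]),
                         target_state=np.array([0.5, 0.2]))
    print(cmd)
```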
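
The path-generation invention (patent 11446820 and publication 20190314989) pairs candidate path data with evaluation values and runs a machine learning process over that data set to produce a path between freely set start and end points. A toy sketch under those assumptions, using a least-squares scorer over hand-picked path features (both illustrative choices, not the patented method), is shown below.

```python
# Toy sketch of learning-based path generation in the spirit of patent 11446820 /
# publication 20190314989: fit a scorer to (path, evaluation value) pairs, then
# return the best-scoring sampled candidate between the start and end points.
import numpy as np

rng = np.random.default_rng(0)


def sample_path(start: np.ndarray, end: np.ndarray, n_points: int = 10) -> np.ndarray:
    """Random candidate path: a straight line plus noise, endpoints fixed."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    line = (1 - t) * start + t * end
    noise = rng.normal(scale=0.1, size=line.shape)
    noise[0] = noise[-1] = 0.0
    return line + noise


def features(path: np.ndarray) -> np.ndarray:
    """Assumed path features: total length and a maximum-curvature proxy."""
    seg = np.diff(path, axis=0)
    length = np.linalg.norm(seg, axis=1).sum()
    curvature = np.abs(np.diff(seg, axis=0)).max() if len(seg) > 1 else 0.0
    return np.array([1.0, length, curvature])


def fit_scorer(paths: list, evaluations: np.ndarray) -> np.ndarray:
    """Least-squares fit from path features to the stored evaluation values."""
    X = np.stack([features(p) for p in paths])
    weights, *_ = np.linalg.lstsq(X, evaluations, rcond=None)
    return weights


def generate_path(start: np.ndarray, end: np.ndarray,
                  weights: np.ndarray, n_candidates: int = 50) -> np.ndarray:
    """Pick the sampled candidate with the best predicted evaluation."""
    candidates = [sample_path(start, end) for _ in range(n_candidates)]
    scores = [features(p) @ weights for p in candidates]
    return candidates[int(np.argmax(scores))]


if __name__ == "__main__":
    start, end = np.array([0.0, 0.0]), np.array([1.0, 1.0])
    # Toy "learning data set": shorter paths get higher evaluation values.
    training = [sample_path(start, end) for _ in range(100)]
    evals = np.array([-features(p)[1] for p in training])
    w = fit_scorer(training, evals)
    print(generate_path(start, end, w).round(2))
```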
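
The gripping-system invention (patent 11338435 and publication 20190152054) builds a model from collected image and hand-position data and uses it to compute the robot's operation command from the current view and hand pose. The sketch below substitutes a simple least-squares linear model and coarse image statistics for the learned model; those choices, and the toy data, are assumptions rather than the patented method.

```python
# Sketch of a learned grasping pipeline in the spirit of patent 11338435 /
# publication 20190152054: a model fitted on collected (image, hand pose) ->
# command data computes the next operation command. Features and the linear
# model are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)


def build_features(image: np.ndarray, hand_pose: np.ndarray) -> np.ndarray:
    """Assumed features: coarse 8x8 image block means plus the hand position/posture."""
    coarse = image.reshape(8, image.shape[0] // 8, 8, image.shape[1] // 8).mean(axis=(1, 3))
    return np.concatenate([coarse.ravel() / 255.0, hand_pose])


def construct_model(images, hand_poses, commands) -> np.ndarray:
    """'Construction module': least-squares fit from features to operation commands."""
    X = np.stack([build_features(i, p) for i, p in zip(images, hand_poses)])
    W, *_ = np.linalg.lstsq(X, np.stack(commands), rcond=None)
    return W


def operation_command(model: np.ndarray, image: np.ndarray, hand_pose: np.ndarray) -> np.ndarray:
    """'Operation module': compute the next command from the current view and pose."""
    return build_features(image, hand_pose) @ model


if __name__ == "__main__":
    # Toy collection data: 16x16 images, 6-DoF hand poses, 6-DoF commands.
    images = [rng.integers(0, 255, size=(16, 16)).astype(float) for _ in range(200)]
    poses = [rng.normal(size=6) for _ in range(200)]
    commands = [p * 0.1 for p in poses]          # synthetic target commands
    model = construct_model(images, poses, commands)
    print(operation_command(model, images[0], poses[0]).round(3))
```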