Patents by Inventor Hiroyuki Oyama
Hiroyuki Oyama has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240123614
Abstract: A learning device 1X mainly includes an optimization problem calculation means 51X and an executable state set learning means 52X. The optimization problem calculation means 51X calculates a function value that solves an optimization problem using an evaluation function for evaluating reachability to a target state, based on an abstract system model and a detailed system model of a system in which a robot operates. The executable state set learning means 52X learns an executable state set of an action of the robot to be executed by a controller, based on the function value.
Type: Application
Filed: February 26, 2021
Publication date: April 18, 2024
Applicant: NEC Corporation
Inventors: Rin Takano, Hiroyuki Oyama
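The abstract does not disclose the concrete evaluation function or system models, but the idea of learning an executable state set from an optimization over an evaluation function can be sketched. Everything below (the grid world, the Manhattan-distance evaluation, the threshold rule) is a hypothetical stand-in for illustration only.

```python
def evaluation(state, action, target):
    """Score reachability toward the target after taking `action` (higher is better)."""
    nxt = (state[0] + action[0], state[1] + action[1])
    return -abs(nxt[0] - target[0]) - abs(nxt[1] - target[1])

def learn_executable_set(states, actions, target, threshold):
    """Keep the states from which some action scores at or above `threshold`."""
    executable = set()
    for s in states:
        best = max(evaluation(s, a, target) for a in actions)
        if best >= threshold:
            executable.add(s)
    return executable

states = [(x, y) for x in range(3) for y in range(3)]
actions = [(1, 0), (-1, 0), (0, 1), (0, -1)]
exe = learn_executable_set(states, actions, target=(2, 2), threshold=-2)
```

Here only the corner farthest from the target fails the threshold, so the learned set contains the remaining eight states.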
-
Patent number: 11955772
Abstract: A semiconductor light emitting element includes an optical waveguide having first and second waveguides provided with a width that allows propagation of light in a second-order mode or higher, and a multimode optical interference waveguide provided with a wider width than the first and second waveguides and arranged at a position between them. The semiconductor light emitting element further includes a first optical loss layer facing the first waveguide in an active-layer crossing direction for causing a loss of light propagating in the first waveguide in the second-order mode or higher, and a second optical loss layer facing the second waveguide in the active-layer crossing direction for causing a loss of light propagating in the second waveguide in the second-order mode or higher, the active-layer crossing direction being orthogonal to a surface of an active layer.
Type: Grant
Filed: March 24, 2021
Date of Patent: April 9, 2024
Assignees: DENSO CORPORATION, KYUSHU UNIVERSITY, NATIONAL UNIVERSITY CORPORATION
Inventors: Yuki Kamata, Koichi Oyama, Hiroyuki Tarumi, Kiichi Hamamoto, Haisong Jiang
-
Publication number: 20230373093
Abstract: A proposition setting device 1X mainly includes an abstract state setting means 31X and a proposition setting means 32X. The abstract state setting means 31X sets an abstract state, which is a state abstracting each object in a workspace, based on a measurement result in the workspace where each robot works. The proposition setting means 32X sets a propositional area, which represents a proposition concerning each object by an area, based on the abstract state and relative area information concerning a relative area of each object.
Type: Application
Filed: October 9, 2020
Publication date: November 23, 2023
Applicant: NEC Corporation
Inventors: Hiroyuki Oyama, Rin Takano
-
Publication number: 20230364791
Abstract: The temporal logic formula generation device 1X mainly includes a target relation logical formula generation means 331X and a target relation logical formula integration means 332X. The target relation logical formula generation means 331X is configured to generate, based on object-to-object relation information representing relations between objects in a target state relating to a task of the robot, one or more target relation logical formulas: temporal logic formulas representing the relations, in the target state, between each pair of objects for which a relation is defined. The target relation logical formula integration means 332X is configured to generate a temporal logic formula into which the target relation logical formulas are integrated.
Type: Application
Filed: October 9, 2020
Publication date: November 16, 2023
Applicant: NEC Corporation
Inventors: Rin Takano, Hiroyuki Oyama
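The pattern described here (one temporal logic formula per defined object pair, then a single integrated formula) can be sketched as below. The patent does not give a concrete syntax, so the plain-string LTL notation, the "F" (eventually) operator usage, the relation name "on", and the object names are all assumptions for illustration.

```python
def target_relation_formulas(relations):
    """One 'eventually' formula per defined object pair relation (a, b, rel)."""
    return [f"F {rel}({a}, {b})" for (a, b, rel) in relations]

def integrate(formulas):
    """Conjoin the per-pair formulas into one temporal logic formula."""
    return " & ".join(f"({f})" for f in formulas)

# Hypothetical target state: block1 on the table, block2 stacked on block1.
relations = [("block1", "table", "on"), ("block2", "block1", "on")]
goal = integrate(target_relation_formulas(relations))
```

With the example relations, `goal` becomes a single conjunction of two "eventually" formulas.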
-
Publication number: 20230364792
Abstract: The operation command generation device 1Y mainly includes a skill information acquisition means 341Y, a skill tuple generation means 342Y, and a skill use operation command generation means 343Y. The skill information acquisition means 341Y is configured to acquire skill information relating to a skill to be used in the motion planning of a robot. The skill tuple generation means 342Y is configured to generate, based on the skill information, a skill tuple, which is a set of variables in the system model set in the motion planning, the variables being associated with the skill. The skill use operation command generation means 343Y is configured to generate a skill use operation command: a temporal logic command representing an operation corresponding to the skill tuple.
Type: Application
Filed: October 9, 2020
Publication date: November 16, 2023
Applicant: NEC Corporation
Inventors: Rin Takano, Hiroyuki Oyama
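One minimal way to picture a skill tuple and its temporal logic command is below. The patent does not specify the tuple's fields or the command syntax; the `SkillTuple` fields, the `grasp` skill, the proposition names, and the "G (… -> F …)" pattern are all invented for this sketch.

```python
from collections import namedtuple

# Hypothetical skill tuple: a skill name bound to system-model variables
# (here, a precondition proposition and an effect proposition).
SkillTuple = namedtuple("SkillTuple", ["name", "precondition", "effect"])

def to_temporal_command(skill):
    """'Whenever the precondition holds, the effect eventually holds.'"""
    return f"G ({skill.precondition} -> F {skill.effect})"

grasp = SkillTuple("grasp", "reachable(obj)", "holding(obj)")
cmd = to_temporal_command(grasp)
```

The resulting command ties the skill's variables together in a single temporal logic expression.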
-
Publication number: 20230364786
Abstract: A control device 1X mainly includes an abstract state setting means 31X, an environment map generation means 34X, an abstract model generation means 35X, and a control input generation means 36X. The abstract state setting means 31X sets an abstract state which abstractly represents the state of each object in a workspace where each robot works. The environment map generation means 34X generates an environment map, which is a map representing the accuracy of information in the workspace. The abstract model generation means 35X generates an abstract model which represents the dynamics of the abstract state and the time change of the environment map. The control input generation means 36X generates a control input for each robot based on the abstract model.
Type: Application
Filed: October 9, 2020
Publication date: November 16, 2023
Applicant: NEC Corporation
Inventors: Hiroyuki Oyama, Rin Takano
-
Publication number: 20230321827
Abstract: A determination device 1X mainly includes a proposition determination means 18X. The proposition determination means 18X performs a completion determination of a task, based on a first proposition representing the current state of the task and a second proposition representing the completion state of the task, both detected by a sensor, when the operation sequence concerning the task has completed or when a predetermined time length has elapsed from the start of the task.
Type: Application
Filed: September 7, 2020
Publication date: October 12, 2023
Applicant: NEC Corporation
Inventors: Masatsugu Ogawa, Nobuharu Kami, Hisaya Wakayama, Hiroyuki Oyama, Masumi Ichien
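The trigger-then-compare logic described here can be sketched in a few lines. The representation of propositions as sets of strings, the subset check, and the `None` return for "too early to judge" are assumptions for illustration; the patent does not specify them.

```python
def task_completed(current_props, goal_props, sequence_done, elapsed, timeout):
    """Judge completion only once the sequence ends or the timeout elapses."""
    if not (sequence_done or elapsed >= timeout):
        return None                      # too early to judge
    return goal_props <= current_props   # every goal proposition currently holds
```

For example, a task whose goal proposition is sensed among the current propositions is judged complete once its operation sequence finishes; one judged after timeout without the goal proposition holding is judged incomplete.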
-
Publication number: 20230321828
Abstract: An information collecting device 3X mainly includes an information acquisition means 35X and a task identifier setting means 36X. The information acquisition means 35X is configured to acquire work related information relating to the work of a robot. Examples of the information acquisition means 35X include the information acquisition unit 35 in the first example embodiment. The task identifier setting means 36X is configured to assign an identifier of a task executed by the robot to the work related information.
Type: Application
Filed: November 17, 2020
Publication date: October 12, 2023
Applicant: NEC Corporation
Inventors: Masatsugu Ogawa, Hiroyuki Oyama, Hisaya Wakayama, Masumi Ichien, Nobuharu Kami
-
Publication number: 20230241770
Abstract: The control device 200X functionally includes an operation policy acquisition means 21X and a policy combining means 23X. The operation policy acquisition means 21X is configured to acquire an operation policy relating to an operation of a robot.
Type: Application
Filed: July 14, 2020
Publication date: August 3, 2023
Applicant: NEC Corporation
Inventors: Takehiro Itou, Hiroyuki Oyama
-
Publication number: 20230104802
Abstract: A control device 1A mainly includes an operation sequence generation means 17A. The operation sequence generation means 17A is configured to generate an operation sequence Sa to be executed by a robot, based on recognition results Ra relating to the types and states of objects present in a workspace where the robot, which performs a task, and another working body perform cooperative work.
Type: Application
Filed: February 25, 2020
Publication date: April 6, 2023
Applicant: NEC Corporation
Inventors: Hiroyuki Oyama, Nobuharu Kami, Masatsugu Ogawa, Hisaya Wakayama, Mineto Satoh, Takehiro Itou
-
Publication number: 20230099683
Abstract: A control device 1B mainly includes a subgoal setting means 17B and an operation sequence generation means 18B. The subgoal setting means 17B is configured to set a subgoal "Sg" based on abstract states, in which states in the workspace where a robot works are abstracted; the subgoal Sg indicates an intermediate goal for achieving a final goal, or constraint conditions required to achieve the final goal. The operation sequence generation means 18B is configured to generate an operation sequence to be executed by the robot based on the subgoal.
Type: Application
Filed: February 28, 2020
Publication date: March 30, 2023
Applicant: NEC Corporation
Inventor: Hiroyuki Oyama
-
Publication number: 20230082482
Abstract: The control device 1A mainly includes a determination means 15A, an abstract state setting means 16A, and a sequence generation means 17A. The determination means 15A determines whether or not the objective task can be completed, based on at least one of environment information relating to the environment observed in the workspace of a controlled device, state information relating to the state of the controlled device, and stored information relating to the objective task to be executed by the controlled device. The abstract state setting means 16A sets, when it is determined that the objective task cannot be completed, an abstract state in the workspace based on at least one of the environment information or the stored information. The sequence generation means 17A generates, based on the abstract state and the objective task, a sequence of subtasks to be executed by the controlled device.
Type: Application
Filed: February 25, 2020
Publication date: March 16, 2023
Applicant: NEC Corporation
Inventors: Mineto Satoh, Nobuharu Kami, Masatsugu Ogawa, Hisaya Wakayama, Takehiro Itou, Hiroyuki Oyama
-
Publication number: 20230080565
Abstract: A control device 1A mainly includes a display control means 15A and an operation sequence generation means 16A. The display control means 15A is configured to transmit display information S2, relating to a task to be executed by a robot, to a display device 2A. The operation sequence generation means 16A is configured, when the display control means 15A has received from the display device 2A task designation information Ia (input information which schematically specifies the task), to generate an operation sequence to be executed by the robot based on the task designation information Ia.
Type: Application
Filed: February 25, 2020
Publication date: March 16, 2023
Applicant: NEC Corporation
Inventors: Masatsugu Ogawa, Hisaya Wakayama, Takehiro Itou, Mineto Satoh, Hiroyuki Oyama, Nobuharu Kami
-
Publication number: 20230072244
Abstract: A control device 1C includes an operation sequence generation means 37C. The operation sequence generation means 37C is configured to generate operation sequences Sra and Spa, indicating the operations to be executed by a robot and by peripheral equipment respectively, based on robot operation information Jr indicating the operation characteristics of the robot executing a task and peripheral equipment information Ip indicating the operation characteristics of the peripheral equipment which delivers an object relating to the task to the robot or receives it from the robot.
Type: Application
Filed: February 25, 2020
Publication date: March 9, 2023
Applicant: NEC Corporation
Inventors: Hisaya Wakayama, Masatsugu Ogawa, Takehiro Itou, Mineto Satoh, Hiroyuki Oyama, Nobuharu Kami
-
Publication number: 20230072442
Abstract: A control device 1A includes a task group generation means 16A and an operation sequence generation means 17A. The task group generation means 16A is configured to generate, when multiple tasks to be executed by one or more robots are designated, one or more task groups obtained by classifying the multiple tasks. The operation sequence generation means 17A is configured to generate one or more operation sequences of the one or more robots for completing the multiple tasks so as to bring the completion times of the tasks included in each task group close to one another.
Type: Application
Filed: February 25, 2020
Publication date: March 9, 2023
Applicant: NEC Corporation
Inventors: Hisaya Wakayama, Hiroyuki Oyama, Mineto Satoh, Takehiro Itou, Masatsugu Ogawa, Nobuharu Kami
-
Publication number: 20230069393
Abstract: A control device 1C mainly includes an operation sequence generation means 16C and a synchronization management means 17C. The operation sequence generation means 16C is configured to generate, based on an operation prediction result R2a of another working body which performs cooperative work with a robot executing a task, an operation sequence Sra to be executed by the robot. The synchronization management means 17C is configured to synchronize the operations executed by the robot during execution of the operation sequence Sra with the operations executed by the other working body.
Type: Application
Filed: February 25, 2020
Publication date: March 2, 2023
Applicant: NEC Corporation
Inventors: Masatsugu Ogawa, Hisaya Wakayama, Takehiro Itou, Mineto Satoh, Hiroyuki Oyama, Nobuharu Kami
-
Publication number: 20220355475
Abstract: The information processing device 1A mainly includes a logical formula conversion unit 322A, a constraint condition information acquisition unit 323A, and a constraint condition addition unit 324A. The logical formula conversion unit 322A is configured to convert an objective task, which is a task to be performed by a robot, into a logical formula based on a temporal logic. The constraint condition information acquisition unit 323A is configured to acquire constraint condition information I2 indicating a constraint condition to be satisfied in performing the objective task. The constraint condition addition unit 324A is configured to generate a target logical formula Ltag, which is obtained by adding a proposition indicating the constraint condition to the logical formula generated by the logical formula conversion unit 322A.
Type: Application
Filed: August 30, 2019
Publication date: November 10, 2022
Applicant: NEC Corporation
Inventor: Hiroyuki Oyama
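The constraint addition step (conjoining constraint propositions onto the task's temporal logic formula to obtain the target formula) can be sketched as below. The string-based LTL notation, the "G" (always) wrapping of each constraint, and the example propositions are assumptions; the patent does not specify them.

```python
def add_constraints(task_formula, constraints):
    """Conjoin each constraint, held at all times, onto the task formula."""
    parts = [f"({task_formula})"] + [f"G ({c})" for c in constraints]
    return " & ".join(parts)

# Hypothetical objective task and constraints.
ltag = add_constraints("F at(goal)", ["!collision", "battery > 0.1"])
```

The result is a single target formula in which the task must be achieved while every constraint holds throughout execution.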
-
Publication number: 20220299949
Abstract: The information processing device 1B mainly includes an abstract model information acquisition unit 34X, a measurement information acquisition unit 34Y, and an abstract model generation unit 34Z. The abstract model information acquisition unit 34X is configured to acquire abstract model information I5 regarding an abstract model in which the dynamics in a workspace 6, where a robot 5 performs an objective task, are abstracted. The measurement information acquisition unit 34Y is configured to acquire measurement information Im indicating a measurement result in the workspace 6. The abstract model generation unit 34Z is configured to generate an abstract model based on the abstract model information I5 and the measurement information Im.
Type: Application
Filed: August 30, 2019
Publication date: September 22, 2022
Applicant: NEC Corporation
Inventor: Hiroyuki Oyama
-
Publication number: 20220105632
Abstract: A control device includes a machine learning unit that performs machine learning of control for the operation of a control target device; an avoidance command value calculation unit that obtains an avoidance command value, i.e., a control command value for the control target device that satisfies constraint conditions (including a condition that the control target device not come into contact with an obstacle) and for which the evaluation value obtained by applying the control command value to an evaluation function satisfies a prescribed end condition; and a device control unit that controls the control target device on the basis of the avoidance command value. A parameter value obtained through the machine learning in the machine learning unit is reflected in at least one of the evaluation function and the constraint conditions.
Type: Application
Filed: January 30, 2019
Publication date: April 7, 2022
Applicant: NEC Corporation
Inventors: Hiroyuki Oyama, Takehiro Itou
-
Publication number: 20220097231
Abstract: An obstacle avoidance control device includes an avoidance command value calculation unit that obtains an avoidance command value, i.e., a control command value for control target equipment that satisfies constraint conditions (including a condition sufficient for the control target equipment not to come into contact with an obstacle) and for which the evaluation value obtained by applying the control command value to an evaluation function satisfies a prescribed end condition; and an equipment control unit that controls the control target equipment on the basis of the processing result of the avoidance command value calculation unit.
Type: Application
Filed: January 30, 2019
Publication date: March 31, 2022
Inventors: Hiroyuki Oyama, Takehiro Itou
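The search for a command value that satisfies both an obstacle-avoidance constraint and an end condition on an evaluation value can be illustrated in one dimension. The random-search strategy, the obstacle interval, the quadratic evaluation function, and the threshold are all assumptions made for this sketch; the patent does not disclose a concrete method.

```python
import random

def find_avoidance_command(target, obstacle, threshold, trials=1000, seed=0):
    """Search for a command value outside the obstacle interval whose
    evaluation value satisfies the end condition (cost <= threshold)."""
    rng = random.Random(seed)
    lo, hi = obstacle
    best = None
    for _ in range(trials):
        u = rng.uniform(-5.0, 5.0)          # candidate command value
        if lo <= u <= hi:                   # constraint: stay out of the obstacle
            continue
        cost = (u - target) ** 2            # evaluation function
        if cost <= threshold and (best is None or cost < best[1]):
            best = (u, cost)
    return best

cmd = find_avoidance_command(target=2.0, obstacle=(1.5, 2.5), threshold=1.0)
```

Because the target sits inside the obstacle interval here, the returned command value steers as close to the target as the constraint allows while still meeting the end condition.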