Patents by Inventor Hidetaka Koya

Hidetaka Koya has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240135300
    Abstract: An estimation device (10) includes a data acquisition unit (131), a first smoothing processing unit (132), a second smoothing processing unit (133), a calculation unit (134), and an estimation unit (135). The data acquisition unit (131) acquires time-series data of a pupil diameter during work of a subject of work performance estimation. The first smoothing processing unit (132) smooths the acquired time-series data of the pupil diameter of the subject by a first time window. The second smoothing processing unit (133) smooths the time-series data of the pupil diameter by a second time window that is larger than the first time window. The calculation unit (134) calculates a difference between the time-series data of the pupil diameter smoothed by the first time window and the time-series data of the pupil diameter smoothed by the second time window.
    Type: Application
    Filed: March 5, 2021
    Publication date: April 25, 2024
    Inventors: Jumpei YAMASHITA, Akira KATAOKA, Hidetaka KOYA
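    The two-window processing in this abstract is essentially a moving-average band-pass over the pupil time series: subtract a coarsely smoothed trace from a finely smoothed one. Below is a minimal sketch in Python, assuming numpy, uniformly sampled data, and placeholder window lengths; none of these values come from the application itself.

    ```python
    import numpy as np

    def smooth(series: np.ndarray, window: int) -> np.ndarray:
        """Moving-average smoothing with a boxcar window."""
        kernel = np.ones(window) / window
        return np.convolve(series, kernel, mode="same")

    def pupil_fluctuation(pupil: np.ndarray, short_win: int = 30, long_win: int = 300) -> np.ndarray:
        """Difference between short-window and long-window smoothings of the
        pupil-diameter time series (window sizes are illustrative)."""
        fast = smooth(pupil, short_win)   # first smoothing processing unit
        slow = smooth(pupil, long_win)    # second, larger time window
        return fast - slow                # calculation unit: the difference

    rng = np.random.default_rng(0)
    pupil = 3.0 + 0.1 * rng.standard_normal(3000)   # synthetic pupil diameters (mm)
    print("mean |difference|:", np.abs(pupil_fluctuation(pupil)).mean())
    ```

    The abstract as listed stops before describing the estimation unit (135), which presumably maps the magnitude of this difference signal to a work-performance estimate.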
  • Publication number: 20240126979
    Abstract: An information acquisition apparatus (100) includes an acquisition unit (121), a classification unit (122), a determination unit (123), and a registration unit (124). The acquisition unit (121) acquires tree information representing information of a system screen by a plurality of nodes having a tree structure. The classification unit (122) classifies the plurality of nodes having a tree structure into an operable component and a label component based on the tree information. The determination unit (123) determines whether the label component indicates a name of the operable component based on a distance between the operable component and the label component. The registration unit (124) registers a correspondence between a text corresponding to the label component and specifying information for specifying the operable component in a case where the determination unit (123) determines that the label component indicates a name of the operable component.
    Type: Application
    Filed: February 24, 2021
    Publication date: April 18, 2024
    Inventors: Hidetaka KOYA, Akira KATAOKA
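    One way to read the classification and distance test in this abstract is sketched below. The node fields, the tag sets used for classification, and the pixel threshold are assumptions for illustration, not details from the application.

    ```python
    from dataclasses import dataclass, field

    OPERABLE_TAGS = {"button", "input", "select", "a"}   # assumed classification rule
    LABEL_TAGS = {"label", "span"}                       # assumed classification rule

    @dataclass
    class Node:
        tag: str
        text: str = ""
        x: float = 0.0
        y: float = 0.0
        children: list["Node"] = field(default_factory=list)

    def flatten(node: Node) -> list[Node]:
        out = [node]
        for child in node.children:
            out.extend(flatten(child))
        return out

    def register_labels(root: Node, max_distance: float = 80.0) -> dict[str, str]:
        """Associate each operable component with its nearest label component
        when the two are close enough (max_distance is an assumed threshold)."""
        nodes = flatten(root)
        operables = [n for n in nodes if n.tag in OPERABLE_TAGS]
        labels = [n for n in nodes if n.tag in LABEL_TAGS and n.text]
        mapping: dict[str, str] = {}
        for op in operables:
            nearest = min(labels, default=None,
                          key=lambda l: (l.x - op.x) ** 2 + (l.y - op.y) ** 2)
            if nearest is None:
                continue
            if ((nearest.x - op.x) ** 2 + (nearest.y - op.y) ** 2) ** 0.5 <= max_distance:
                mapping[nearest.text] = f"{op.tag}@({op.x},{op.y})"   # registration unit
        return mapping
    ```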
  • Patent number: 11947770
    Abstract: A display control apparatus includes: a change detection unit that detects a change of a UI element on a target screen; a determination unit that determines a display update method among a plurality of display update methods for extended UIs based on the change of the UI element detected by the change detection unit; and a display update unit that updates display of the extended UIs by the display update method determined by the determination unit.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: April 2, 2024
    Assignee: Nippon Telegraph and Telephone Corporation
    Inventors: Makoto Komiyama, Akira Kataoka, Hidetaka Koya
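    This grant (whose published application, 20230297203, also appears later in this list) pairs each detected kind of UI-element change with one of several update methods for the extended UIs. A minimal dispatch-table sketch follows; the change kinds and update methods are invented for illustration.

    ```python
    from typing import Callable

    def redraw_all(extended_uis: list) -> None:
        print("redraw every extended UI")

    def reposition_only(extended_uis: list) -> None:
        print("move extended UIs, keep their contents")

    def remove_orphans(extended_uis: list) -> None:
        print("drop extended UIs whose anchor element vanished")

    # Determination unit: which display update method goes with which change.
    UPDATE_METHODS: dict[str, Callable[[list], None]] = {
        "resized": reposition_only,
        "content_changed": redraw_all,
        "removed": remove_orphans,
    }

    def on_ui_change(change_kind: str, extended_uis: list) -> None:
        """Change detected on the target screen: pick an update method and apply it."""
        UPDATE_METHODS.get(change_kind, redraw_all)(extended_uis)

    on_ui_change("resized", extended_uis=[])
    ```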
  • Publication number: 20240012981
    Abstract: A display control system (1) includes an acquisition unit (153) and a search unit (154). The acquisition unit (153) acquires a DOM tree of a screen into which an additional UI is inserted. The search unit (154) searches for UIs used for display control of the additional UI from a base point UI serving as a base point of display of the additional UI in the DOM tree on the basis of a display condition related to display of the additional UI.
    Type: Application
    Filed: November 4, 2020
    Publication date: January 11, 2024
    Inventors: Hidetaka Koya, Makoto Komiyama, Akira Kataoka
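    A rough reading of the search in this abstract: starting from the base-point UI node in the DOM tree, walk toward the ancestors (checking each ancestor and its children) and collect the UIs that satisfy the display condition. The tree representation, the direction of the walk, and the example condition are assumptions.

    ```python
    from dataclasses import dataclass, field
    from typing import Callable, Optional

    @dataclass
    class DomNode:
        tag: str
        attrs: dict = field(default_factory=dict)
        parent: Optional["DomNode"] = None
        children: list["DomNode"] = field(default_factory=list)

    def search_from_base_point(base: DomNode,
                               condition: Callable[[DomNode], bool]) -> list[DomNode]:
        """Collect UIs relevant to display control of the additional UI by
        walking up from the base-point UI (the breadth of the walk is assumed)."""
        found: list[DomNode] = []
        seen: set[int] = set()
        node: Optional[DomNode] = base
        while node is not None:
            for candidate in [node, *node.children]:
                if id(candidate) not in seen and condition(candidate):
                    found.append(candidate)
                    seen.add(id(candidate))
            node = node.parent
        return found

    # Example display condition: containers that can scroll the additional UI out of view.
    is_scroll_container = lambda n: n.attrs.get("overflow") == "scroll"
    ```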
  • Publication number: 20240004528
    Abstract: A user interface augmentation system includes a memory and a processor coupled to the memory. The processor is configured to perform operations including: searching a system screen for a base point UI serving as a base point for display of an additional user interface (UI) to be added to the system screen; displaying a new screen with a transparent background in such a way as to overlap the system screen; and displaying the additional UI at a position based on the base point UI on the new screen.
    Type: Application
    Filed: November 27, 2020
    Publication date: January 4, 2024
    Inventors: Hidetaka KOYA, Makoto KOMIYAMA, Akira KATAOKA, Kimio TSUCHIKAWA
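    The overlay technique here reduces to: find the bounding box of the base-point UI on the system screen, open a new borderless screen with a transparent background over it, and draw the additional UI at an offset from that box. The sketch below covers only the geometry; the offset rule and rectangle type are assumptions, and the transparent, always-on-top window itself is platform specific and omitted.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: int
        y: int
        width: int
        height: int

    def additional_ui_position(base_point: Rect,
                               offset_x: int = 0, offset_y: int = 4) -> tuple[int, int]:
        """Position the additional UI just below the base-point UI
        (the offsets are illustrative, not from the application)."""
        return base_point.x + offset_x, base_point.y + base_point.height + offset_y

    # The new screen would be created with a transparent background so the system
    # screen stays visible, and the additional UI drawn at this position on it.
    search_box = Rect(x=120, y=80, width=240, height=28)
    print(additional_ui_position(search_box))   # -> (120, 112)
    ```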
  • Patent number: 11861417
    Abstract: A peripheral information acquisition unit (121) acquires information relating to a first application (13a) running in a terminal, information relating to control of the terminal, or information that can be acquired from a sensor included in the terminal, as peripheral information. The peripheral information acquired by the peripheral information acquisition unit (121) is accumulated in a peripheral information accumulation unit (122). A dialogue interface unit (11) accepts input of information from a user and outputs information to the user. When the peripheral information accumulated in the peripheral information accumulation unit (122) and information input to the dialogue interface unit (11) satisfy a predetermined condition, a scenario control unit (123) causes the dialogue interface unit (11) to output information relating to execution of a second application (14) that is associated with the condition in advance.
    Type: Grant
    Filed: October 9, 2019
    Date of Patent: January 2, 2024
    Assignee: Nippon Telegraph and Telephone Corporation
    Inventors: Makoto Komiyama, Takeshi Masuda, Akira Kataoka, Masashi Tadokoro, Hidetaka Koya
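    This grant (whose published application, 20220342724, also appears later in this list) combines the accumulated peripheral information with the latest dialogue input and fires a pre-registered scenario when a condition matches. A minimal rule-engine sketch follows; the condition shape and the example rule are invented for illustration.

    ```python
    from typing import Callable, Optional

    Rule = tuple[Callable[[list[dict], str], bool], str]   # (condition, dialogue output)

    peripheral_log: list[dict] = []   # peripheral information accumulation unit (122)

    def accumulate(info: dict) -> None:
        peripheral_log.append(info)

    # Example rule (illustrative only): low battery reported by the first
    # application plus a mention of "meeting" triggers a calendar suggestion.
    RULES: list[Rule] = [
        (lambda log, text: any(e.get("battery", 100) < 20 for e in log) and "meeting" in text,
         "Battery is low - open the calendar app to check your next meeting?"),
    ]

    def scenario_control(user_input: str) -> Optional[str]:
        """When accumulated peripheral info plus the dialogue input satisfy a
        registered condition, return the output for the dialogue interface."""
        for condition, output in RULES:
            if condition(peripheral_log, user_input):
                return output
        return None

    accumulate({"app": "mail", "battery": 15})
    print(scenario_control("I have a meeting soon"))
    ```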
  • Publication number: 20230418451
    Abstract: An operation support device includes an acquisition unit configured to acquire information on a designated range based on operation of a user, an extraction unit configured to extract information on user interface (UI) elements included in the designated range from the information on the designated range acquired by the acquisition unit, a generation unit configured to generate a selector representing a search range including the designated range on the basis of the information on the UI elements extracted by the extraction unit, and a setting unit configured to set the search range from the selector representing the search range generated by the generation unit.
    Type: Application
    Filed: November 17, 2020
    Publication date: December 28, 2023
    Inventors: Makoto KOMIYAMA, Akira KATAOKA, Hidetaka KOYA
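    One plausible reading of the selector generation in this abstract: take the UI elements extracted from the user's designated range, find their lowest common ancestor, and emit a CSS-style selector for that ancestor as the search range. The element model and the selector format below are assumptions.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Element:
        tag: str
        elem_id: str = ""
        parent: Optional["Element"] = None

    def ancestors(el: Optional[Element]) -> list[Element]:
        chain: list[Element] = []
        while el is not None:
            chain.append(el)
            el = el.parent
        return chain

    def search_range_selector(selected: list[Element]) -> str:
        """Generate a selector representing the search range: the lowest common
        ancestor of the UI elements extracted from the designated range."""
        common = set(map(id, ancestors(selected[0])))
        for el in selected[1:]:
            common &= set(map(id, ancestors(el)))
        node = selected[0]
        while id(node) not in common:
            node = node.parent
        return f"#{node.elem_id}" if node.elem_id else node.tag

    form = Element("form", elem_id="order-form")
    name_field = Element("input", elem_id="name", parent=form)
    addr_field = Element("input", elem_id="addr", parent=form)
    print(search_range_selector([name_field, addr_field]))   # -> "#order-form"
    ```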
  • Publication number: 20230404393
    Abstract: An estimation device includes processing circuitry configured to acquire a pupil diameter of a subject whose work performance is to be estimated at a time of work and a luminance of a gaze target of the subject, calculate a variation amount of the pupil diameter of the subject from time-series data of the pupil diameter of the subject, determine whether or not the luminance of the gaze target of the subject is equal to or higher than a predetermined value, and determine that a correlation between a magnitude of the variation amount of the pupil diameter of the subject and a deterioration in the work performance of the subject is low in a case where it is determined that the luminance of the gaze target of the subject is equal to or higher than a predetermined threshold.
    Type: Application
    Filed: November 19, 2020
    Publication date: December 21, 2023
    Inventors: Jumpei YAMASHITA, Hidetaka KOYA, Akira KATAOKA
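    The determination in this abstract amounts to a guard: the pupil-diameter variation is only trusted as a performance signal when the gaze target is not too bright, since a bright target constricts the pupil on its own. A sketch, with the threshold value and the variation measure (standard deviation over a window) assumed.

    ```python
    import statistics

    LUMINANCE_THRESHOLD = 150.0   # assumed value, not from the publication

    def performance_signal(pupil_window: list[float], target_luminance: float):
        """Return the pupil-diameter variation amount as a performance indicator,
        or None when the gaze target is bright enough that its correlation with
        work performance is judged to be low."""
        if target_luminance >= LUMINANCE_THRESHOLD:
            return None                        # correlation judged low; discard
        return statistics.stdev(pupil_window)  # variation amount of the pupil diameter

    print(performance_signal([3.1, 3.2, 3.0, 3.3], target_luminance=90.0))
    print(performance_signal([3.1, 3.2, 3.0, 3.3], target_luminance=200.0))
    ```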
  • Patent number: 11836288
    Abstract: A distance estimation apparatus includes processing circuitry configured to acquire a sensor value output from a sensor configured to measure a relative motion of a head or eyeballs of a user who visually searches for a target on a plane and estimate a distance between the user and a search target plane using a maximum value of an amount of change when a rate of change in the sensor value acquired within a time period that is equal to or greater than a predetermined threshold value becomes a maximum.
    Type: Grant
    Filed: May 31, 2019
    Date of Patent: December 5, 2023
    Assignee: Nippon Telegraph and Telephone Corporation
    Inventors: Jumpei Yamashita, Hidetaka Koya, Hajime Nakajima
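    This grant (whose published application, 20220244778, closes this list) estimates viewing distance from the motion trace alone: among the moments where the rate of change in the sensor value meets the threshold, take the one where that rate peaks and map the amount of change there to a distance. A numpy sketch; the thresholding, peak selection, and linear mapping are assumptions.

    ```python
    import numpy as np

    def estimate_distance(sensor: np.ndarray, dt: float,
                          rate_threshold: float, scale: float = 1.0):
        """Estimate the distance to the search target plane from the largest
        swing in the head/eye sensor trace (all parameters are illustrative)."""
        rate = np.diff(sensor) / dt                              # rate of change
        candidates = np.where(np.abs(rate) >= rate_threshold)[0]
        if candidates.size == 0:
            return None
        peak = candidates[np.argmax(np.abs(rate[candidates]))]   # rate of change is maximal
        amount_of_change = abs(sensor[peak + 1] - sensor[peak])  # change at that moment
        return scale * amount_of_change                          # assumed linear mapping

    trace = np.array([0.0, 0.1, 0.1, 0.9, 1.0, 1.0])   # synthetic head-motion signal
    print(estimate_distance(trace, dt=0.01, rate_threshold=5.0))
    ```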
  • Publication number: 20230297203
    Abstract: A display control apparatus (10) includes: a change detection unit (131) that detects a change of a UI element on a target screen; a determination unit (132) that determines a display update method among a plurality of display update methods for extended UIs based on the change of the UI element detected by the change detection unit (131); and a display update unit (134) that updates display of the extended UIs by the display update method determined by the determination unit (132).
    Type: Application
    Filed: August 7, 2020
    Publication date: September 21, 2023
    Inventors: Makoto Komiyama, Akira Kataoka, Hidetaka Koya
  • Publication number: 20230260508
    Abstract: A task determination unit determines a type of processing being operated, on the basis of an operation situation of an existing system. A conversion unit converts voice data input during a predetermined operation of an HID into text data, and determines whether the text data is used for a command determination or a setting parameter according to operation content of the HID. A command determination unit determines a command by using the text data and the type of processing being operated, when it is determined that the text data is used for the command determination. An operation unit executes an operation corresponding to the determined command with respect to the existing system by using the text data as a parameter, when it is determined that the text data is used for the setting parameter.
    Type: Application
    Filed: July 1, 2020
    Publication date: August 17, 2023
    Inventors: Hidetaka KOYA, Makoto KOMIYAMA, Akira KATAOKA, Masashi TADOKORO
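    A minimal sketch of the routing described here: the HID operation that was in progress while the voice was captured decides whether the recognized text is treated as a command or as a setting parameter for the existing system. The operation names, the command table, and the parameter handling are illustrative.

    ```python
    # Assumed HID operations; the publication does not enumerate them.
    COMMAND_TRIGGER_OPS = {"right_click_hold"}
    PARAMETER_TRIGGER_OPS = {"field_focus"}

    # Command determination uses both the recognized text and the type of
    # processing currently being operated (the keys here are invented).
    COMMANDS = {
        ("save", "order_entry"): "SAVE_ORDER",
        ("cancel", "order_entry"): "CANCEL_ORDER",
    }

    def handle_voice(text: str, hid_operation: str, processing_type: str) -> str:
        """Route recognized text to command determination or to a setting
        parameter, depending on the HID operation in progress."""
        if hid_operation in COMMAND_TRIGGER_OPS:
            command = COMMANDS.get((text.lower(), processing_type), "UNKNOWN")
            return f"execute {command} on the existing system"
        if hid_operation in PARAMETER_TRIGGER_OPS:
            return f"set parameter to '{text}' on the existing system"
        return "ignored"

    print(handle_voice("Save", "right_click_hold", "order_entry"))
    print(handle_voice("Tokyo", "field_focus", "order_entry"))
    ```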
  • Publication number: 20230222420
    Abstract: An acquisition unit (15a) acquires operation information of a user on the screen that the user operates. An identification unit (15b) identifies a progress state in a predetermined work flow, using the acquired operation information. A display control unit (15c) causes the work flow and the identified progress state to be displayed.
    Type: Application
    Filed: May 15, 2020
    Publication date: July 13, 2023
    Inventors: Makoto Komiyama, Akira Kataoka, Masashi Tadokoro, Hidetaka Koya
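    The progress identification here can be sketched as a lookup from observed screen operations to steps of a predefined work flow, followed by rendering the furthest step reached. The flow and the operation-to-step mapping below are invented for illustration.

    ```python
    WORK_FLOW = ["open form", "enter customer", "confirm order", "submit"]

    # Assumed mapping from operation events on the screen to work-flow steps.
    OPERATION_TO_STEP = {
        "click:new-order": 0,
        "input:customer-name": 1,
        "click:confirm": 2,
        "click:submit": 3,
    }

    def identify_progress(operations: list[str]) -> int:
        """Identification unit (15b): the furthest work-flow step reached so far."""
        steps = [OPERATION_TO_STEP[op] for op in operations if op in OPERATION_TO_STEP]
        return max(steps, default=0)

    def render(progress: int) -> str:
        """Display control unit (15c): mark the current step in the work flow."""
        return " > ".join(f"[{s}]" if i == progress else s for i, s in enumerate(WORK_FLOW))

    print(render(identify_progress(["click:new-order", "input:customer-name"])))
    ```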
  • Publication number: 20230195280
    Abstract: An identification apparatus (10) includes a screen configuration comparison unit (1314) configured to determine equivalence of screen components between sample screen data and processing target screen data based on whether a screen component in a screen structure of the sample screen data and a screen component in a screen structure of the processing target screen data have similar relationships to other screen components in the respective screen structures.
    Type: Application
    Filed: May 29, 2020
    Publication date: June 22, 2023
    Inventors: Shiro OGASAWARA, Takeshi MASUDA, Fumihiro YOKOSE, Hidetaka KOYA, Yuki URABE
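    The equivalence test in this abstract compares how a screen component sits in its screen structure rather than the component's own attributes. A sketch that compares the parent tag and sibling tags follows; the choice of relationships and the similarity rule are assumptions.

    ```python
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Component:
        tag: str
        parent: Optional["Component"] = None
        children: list["Component"] = field(default_factory=list)

        def add(self, child: "Component") -> "Component":
            child.parent = self
            self.children.append(child)
            return child

    def relationships(c: Component) -> tuple[str, tuple[str, ...]]:
        """Describe a component by its parent tag and the tags of its siblings."""
        parent_tag = c.parent.tag if c.parent else ""
        siblings = c.parent.children if c.parent else []
        return parent_tag, tuple(sorted(s.tag for s in siblings if s is not c))

    def equivalent(sample: Component, target: Component) -> bool:
        """Screen configuration comparison unit (1314): equivalent when the two
        components have similar relationships to the other screen components."""
        return sample.tag == target.tag and relationships(sample) == relationships(target)

    form_a = Component("form")
    btn_a = form_a.add(Component("button"))
    form_a.add(Component("input"))
    form_b = Component("form")
    btn_b = form_b.add(Component("button"))
    form_b.add(Component("input"))
    print(equivalent(btn_a, btn_b))   # True
    ```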
  • Publication number: 20230185964
    Abstract: A masking apparatus identifies a screen and screen components included in individual screen data of first screen data serving as a reference and one or more pieces of second screen data serving as a processing target. In addition, the masking apparatus specifies third screen data on the basis of the screen and the screen components that are identified, the third screen data being data equivalent to the first screen data in the one or more pieces of second screen data. In addition, the masking apparatus determines necessity of masking of each screen component included in the first screen data on the basis of the third screen data.
    Type: Application
    Filed: May 29, 2020
    Publication date: June 15, 2023
    Inventors: Shiro OGASAWARA, Takeshi MASUDA, Yuki URABE, Fumihiro YOKOSE, Hidetaka KOYA
  • Publication number: 20230086261
    Abstract: The clustering apparatus constructs, on the assumption that sensor data is generated from a latent variable that is a continuous random variable with a number of dimensions suitable for handling by the shallow method, a model for estimating the latent variable from the sensor data, based on a generative model for generating the sensor data from the latent variable. Next, the clustering apparatus calculates, from the sensor data, an estimated value of the latent variable from which the sensor data is generated, by using the constructed model. The clustering apparatus then clusters the calculated estimated values of the latent variable by the shallow method, and identifies the optimum number of clusters. Thereafter, the clustering apparatus performs clustering of the sensor data by a neural network having three or more layers, by using the hyperparameter information of the constructed model and the identified optimum number of clusters.
    Type: Application
    Filed: February 25, 2020
    Publication date: March 23, 2023
    Inventors: Jumpei YAMASHITA, Hidetaka KOYA
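    The pipeline described here has three stages: estimate a low-dimensional latent variable from the sensor data with a model built on a generative assumption, cluster the latent estimates with a shallow method to identify the optimum number of clusters, then train a deep clustering network (three or more layers) with that number. The scikit-learn sketch below substitutes PCA for the latent-variable model and a BIC-selected Gaussian mixture for the shallow method; both substitutions are stand-ins, and the final deep stage is only noted in a comment.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    sensor_data = np.vstack([rng.normal(m, 0.3, size=(100, 20)) for m in (0.0, 2.0, 4.0)])

    # 1) Estimate a low-dimensional latent variable from the sensor data
    #    (PCA stands in for the generative-model-based estimator).
    latent = PCA(n_components=3).fit_transform(sensor_data)

    # 2) Shallow clustering of the latent estimates; identify the optimum
    #    number of clusters by BIC (one common shallow criterion).
    bics = {k: GaussianMixture(n_components=k, random_state=0).fit(latent).bic(latent)
            for k in range(1, 7)}
    best_k = min(bics, key=bics.get)
    print("optimum number of clusters:", best_k)

    # 3) A neural network with three or more layers would then cluster the raw
    #    sensor data using best_k and the latent model's hyperparameters; that
    #    deep stage is omitted from this sketch.
    ```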
  • Publication number: 20230089162
    Abstract: A learning apparatus (10) acquires a label corresponding to a variance not selectively explained by a latent variable, out of the variances in the characteristics of data. The learning apparatus (10) receives, as input data, real data or generated data output by a generator that generates data, discriminates whether the input data is the generated data or the real data, and adds, to a first neural network constituting a discriminator that estimates the latent variable, a path having two or more layers for estimating the label. The learning apparatus (10) performs learning for a second neural network obtained by adding the path so that, by multiplying by minus one the gradient of the error propagating backward into the first neural network at the first layer of the added path during backpropagation-based learning, the gradient is propagated so as to minimize the estimation error for the latent variable while maximizing the estimation error for the label.
    Type: Application
    Filed: February 14, 2020
    Publication date: March 23, 2023
    Inventors: Jumpei Yamashita, Hidetaka Koya
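    The sign flip described here is the gradient-reversal trick: the added label path trains its own layers normally, but the gradient it sends back into the shared first network is negated, so the shared features are pushed to carry as little label information as possible while the latent-variable estimation is trained as usual. A minimal PyTorch sketch of the reversal layer and label path; the surrounding generator and discriminator are not shown.

    ```python
    import torch
    from torch import nn

    class GradReverse(torch.autograd.Function):
        """Identity in the forward pass; multiplies the incoming gradient by
        -lam in the backward pass, so layers below it are updated to maximize
        the label-estimation error while the label head minimizes it."""
        @staticmethod
        def forward(ctx, x, lam):
            ctx.lam = lam
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lam * grad_output, None

    class LabelHead(nn.Module):
        """Path with two or more layers for estimating the label,
        attached behind the gradient reversal."""
        def __init__(self, feat_dim: int, n_labels: int, lam: float = 1.0):
            super().__init__()
            self.lam = lam
            self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(),
                                     nn.Linear(64, n_labels))

        def forward(self, features: torch.Tensor) -> torch.Tensor:
            return self.net(GradReverse.apply(features, self.lam))

    features = torch.randn(8, 32, requires_grad=True)     # shared discriminator features
    LabelHead(32, n_labels=5)(features).sum().backward()  # reversed gradient reaches `features`
    ```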
  • Publication number: 20230024249
    Abstract: A storage unit (21) stores, for each extended user interface, a work determination method (21a) that is a method for determining a work that is carried out by a user, and work data (21b) that has been input to the extended user interface in correspondence with the work. A work determination unit (12b) acquires the work determination method (21a) for an extended user interface that is displayed, from the storage unit (21), and determines the work that is carried out by the user on a terminal (10) that is operated by the user, according to the work determination method (21a). A work data processing unit (12c) acquires the work data (21b) corresponding to the determined work, from the storage unit (21), and displays it on the extended user interface.
    Type: Application
    Filed: October 9, 2019
    Publication date: January 26, 2023
    Inventors: Hidetaka Koya, Takeshi Masuda, Akira Kataoka, Masashi Tadokoro, Makoto Komiyama
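    The storage layout here amounts to a per-extended-UI registry: a work determination method that maps the terminal's current state to a work, plus work data keyed by that work. A small sketch with an invented rule and data.

    ```python
    from typing import Callable

    # Work determination methods (21a), one per extended UI (names are illustrative).
    WORK_DETERMINATION: dict[str, Callable[[dict], str]] = {
        "order_helper": lambda state: ("order_entry"
                                       if state.get("active_window") == "OrderForm"
                                       else "browsing"),
    }

    # Work data (21b) keyed by (extended UI, determined work).
    WORK_DATA: dict[tuple[str, str], dict] = {
        ("order_helper", "order_entry"): {"last_customer": "ACME Corp."},
    }

    def show_extended_ui(ui_name: str, terminal_state: dict) -> dict:
        """Work determination unit (12b) + work data processing unit (12c):
        determine the current work and return the data to display."""
        work = WORK_DETERMINATION[ui_name](terminal_state)
        return WORK_DATA.get((ui_name, work), {})

    print(show_extended_ui("order_helper", {"active_window": "OrderForm"}))
    ```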
  • Publication number: 20220342724
    Abstract: A peripheral information acquisition unit (121) acquires information relating to a first application (13a) running in a terminal, information relating to control of the terminal, or information that can be acquired from a sensor included in the terminal, as peripheral information. The peripheral information acquired by the peripheral information acquisition unit (121) is accumulated in a peripheral information accumulation unit (122). A dialogue interface unit (11) accepts input of information from a user and outputs information to the user. When the peripheral information accumulated in the peripheral information accumulation unit (122) and information input to the dialogue interface unit (11) satisfy a predetermined condition, a scenario control unit (123) causes the dialogue interface unit (11) to output information relating to execution of a second application (14) that is associated with the condition in advance.
    Type: Application
    Filed: October 9, 2019
    Publication date: October 27, 2022
    Inventors: Makoto Komiyama, Takeshi Masuda, Akira Kataoka, Masashi Tadokoro, Hidetaka Koya
  • Patent number: 11435822
    Abstract: An estimation method includes first acquiring information of an eyeball motion of a user on the basis of a measurement value of an eye potential of the user, second acquiring positional information of an interaction target on a screen that corresponds to an interaction performed on a device through an operation of the user, third acquiring information regarding a relative motion of sight of the user on the basis at least of the information of the eyeball motion of the user, and estimating a sight position of the user on the screen on the basis of the information regarding the relative motion of the sight of the user and the positional information of the interaction target on the screen, by processing circuitry.
    Type: Grant
    Filed: May 9, 2019
    Date of Patent: September 6, 2022
    Assignee: Nippon Telegraph and Telephone Corporation
    Inventors: Jumpei Yamashita, Hidetaka Koya, Hajime Nakajima
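    The estimation in this grant works with relative signals only: the eye-potential measurement gives the relative motion of sight, and each interaction (for example a click) supplies an absolute anchor, on the assumption that the user is looking at the interaction target at that moment. The sketch below integrates the relative motion and re-anchors it at every interaction; the signal model and the re-anchoring rule are assumptions.

    ```python
    import numpy as np

    def estimate_sight_positions(relative_motion: np.ndarray,
                                 interactions: list[tuple[int, tuple[float, float]]]) -> np.ndarray:
        """relative_motion: (T, 2) per-sample sight displacement derived from the
        eye potential; interactions: (sample index, on-screen target position).
        Integrate the relative motion and re-anchor the trajectory at each interaction."""
        positions = np.cumsum(relative_motion, axis=0)
        for idx, (tx, ty) in interactions:
            positions[idx:] += np.array([tx, ty]) - positions[idx]   # pass through the target
        return positions

    motion = np.tile([[1.0, 0.0]], (10, 1))             # drifting 1 px right per sample
    clicks = [(5, (100.0, 200.0))]                      # clicked a button at (100, 200)
    print(estimate_sight_positions(motion, clicks)[5])  # -> [100. 200.]
    ```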
  • Publication number: 20220244778
    Abstract: An acquisition unit (15a) acquires a sensor value output from a sensor that measures a relative motion of a head or eyeballs of a user who visually searches for a target on a plane. An estimation unit (15b) estimates a distance between the user and a search target plane using a maximum value of an amount of change when a rate of change in the sensor value acquired within a time period that is equal to or greater than a predetermined threshold value becomes a maximum.
    Type: Application
    Filed: May 31, 2019
    Publication date: August 4, 2022
    Inventors: Jumpei Yamashita, Hidetaka Koya, Hajime Nakajima