Patents by Inventor Tadayuki Yoshida
Tadayuki Yoshida has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11024198
Abstract: Provided are an apparatus and method for: acquiring information designating a plurality of menu items provided in a meal; reading a characteristic of each of the plurality of menu items provided in the meal, from a menu item database that stores a characteristic for each menu item; generating a classification model for classifying a specified menu item from among the plurality of menu items, based on characteristics of the menu items; acquiring measurement data measured by a sensor attached to a utensil during the meal; and classifying a menu item corresponding to the measurement data, from a characteristic corresponding to the measurement data, using the classification model.
Type: Grant
Filed: September 8, 2016
Date of Patent: June 1, 2021
Assignee: International Business Machines Corporation
Inventors: Kazuhiro Hara, Hiroki Nakano, Hiroki Nishiyama, Ai Okada, Tadayuki Yoshida, Mayumi Yoshitake
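The classification flow the abstract describes can be sketched with a toy nearest-centroid rule; the feature vectors, database contents, and distance metric below are all illustrative assumptions, not the patented model:

```python
import math

# Hypothetical menu item database: per-item characteristics
# (e.g. hardness, stickiness) as feature vectors. The patent does
# not disclose the feature set; these values are illustrative.
MENU_DB = {
    "rice":  (0.2, 0.8),
    "steak": (0.9, 0.1),
    "soup":  (0.1, 0.2),
}

def build_model(menu_items):
    """Read the characteristics of the items served in this meal."""
    return {item: MENU_DB[item] for item in menu_items}

def classify(model, measurement):
    """Return the menu item whose characteristics are nearest to
    the utensil-sensor measurement (Euclidean distance)."""
    return min(model, key=lambda item: math.dist(model[item], measurement))

model = build_model(["rice", "steak"])
print(classify(model, (0.85, 0.15)))  # nearest to "steak"
```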
-
Patent number: 10909328
Abstract: Methods, computer program products, and systems are presented. The methods, computer program products, and systems can include, for instance: examining communication data of a first human user to return one or more sentiment attributes of the communication data; processing the communication data to return sentiment-neutral adapted communication data, the processing being in dependence on the one or more sentiment attributes; presenting to a second human user a sentiment-neutral adapted communication, the sentiment-neutral adapted communication being based on the sentiment-neutral adapted communication data; augmenting second communication data to return adapted second communication data, the augmenting being in dependence on the one or more sentiment attributes; and presenting to the first human user an adapted second communication, the adapted second communication being based on the adapted second communication data.
Type: Grant
Filed: January 4, 2019
Date of Patent: February 2, 2021
Assignee: International Business Machines Corporation
Inventors: Sawa Takano, Tadayuki Yoshida
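A toy illustration of the sentiment-neutral adaptation pipeline described above; the word-list "sentiment detection" and the softening prefix are stand-ins for whatever analysis the claimed system actually performs:

```python
# Hypothetical sentiment lexicon mapping charged words to neutral
# replacements; the patent does not disclose its detection method.
NEGATIVE = {"terrible": "suboptimal", "hate": "dislike"}

def examine(text):
    """Return the sentiment attributes found in the communication data."""
    return [w for w in text.lower().split() if w in NEGATIVE]

def neutralize(text):
    """Replace charged words so the second user sees a neutral message."""
    return " ".join(NEGATIVE.get(w.lower(), w) for w in text.split())

def augment_reply(reply, attrs):
    """Adapt the second user's reply in dependence on the first user's
    original sentiment (here: a simple, assumed softening prefix)."""
    return ("I understand this is frustrating. " + reply) if attrs else reply

attrs = examine("I hate this terrible bug")
print(neutralize("I hate this terrible bug"))
print(augment_reply("Please restart the app.", attrs))
```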
-
Publication number: 20200218781
Abstract: Methods, computer program products, and systems are presented. The methods, computer program products, and systems can include, for instance: examining communication data of a first human user to return one or more sentiment attributes of the communication data; processing the communication data to return sentiment-neutral adapted communication data, the processing being in dependence on the one or more sentiment attributes; presenting to a second human user a sentiment-neutral adapted communication, the sentiment-neutral adapted communication being based on the sentiment-neutral adapted communication data; augmenting second communication data to return adapted second communication data, the augmenting being in dependence on the one or more sentiment attributes; and presenting to the first human user an adapted second communication, the adapted second communication being based on the adapted second communication data.
Type: Application
Filed: January 4, 2019
Publication date: July 9, 2020
Inventors: Sawa Takano, Tadayuki Yoshida
-
Patent number: 10674130
Abstract: An electronic system that dynamically creates video based on a structured document, by associating video clips with items in the document, includes a server that is connected to a user terminal. The user terminal sends to the server, as a selected item, at least one item in a structured document selected by the user. The server receives the item sent by the user terminal, identifies the item in the structured document selected by the user, identifies at least one dependent item having a dependent relationship with the selected item, dynamically creates a video on the basis of at least one video clip associated with each identified item and at least one video clip associated with each identified dependent item, and sends the video for playback on the user terminal.
Type: Grant
Filed: March 30, 2017
Date of Patent: June 2, 2020
Assignee: International Business Machines Corporation
Inventors: Satoshi Innami, Tadayuki Yoshida, Natsuki Zettsu
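The selection-and-dependency flow can be sketched as follows; the document structure, clip names, and breadth-first traversal order are illustrative assumptions, and clip "playback" is simulated by collecting clip identifiers:

```python
# Hypothetical structured document: each item carries a video clip
# and a list of dependent items.
DOC = {
    "overview": {"clip": "overview.mp4", "deps": ["install", "usage"]},
    "install":  {"clip": "install.mp4",  "deps": []},
    "usage":    {"clip": "usage.mp4",    "deps": []},
}

def create_video(doc, selected):
    """Collect the clip for the selected item plus the clips of all
    (transitively) dependent items, in breadth-first order."""
    clips, queue, seen = [], [selected], set()
    while queue:
        item = queue.pop(0)
        if item in seen:
            continue
        seen.add(item)
        clips.append(doc[item]["clip"])
        queue.extend(doc[item]["deps"])
    return clips

print(create_video(DOC, "overview"))
```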
-
Patent number: 10372827
Abstract: A method, system, and/or computer program product translates a phrase in a first language to an equivalent phrase in a second language. Electronic data includes image data, which describes one or more graphical user interface (GUI) elements on a GUI. The GUI includes a delineated area that is used by a GUI element. The GUI element is matched to one of a plurality of predefined GUI elements, where each of the plurality of predefined GUI elements is predefined by a set of data that comprises a conditional expression, which describes a structure of a particular GUI element, as well as a type of text data that is associated with that particular GUI element. Text data from the delineated area in the GUI is then translated from the first language into the second language according to the type of text data that is associated with a particular type of GUI element.
Type: Grant
Filed: October 3, 2013
Date of Patent: August 6, 2019
Assignee: International Business Machines Corporation
Inventors: Sawa Takano, Tadayuki Yoshida
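A minimal sketch of matching a GUI element against predefined element definitions and translating its text by type; the predicates, text types, and the tiny English-to-German glossary are invented for illustration and are not the patent's data set:

```python
# Hypothetical predefined GUI elements: each pairs a conditional
# expression (a predicate on the element's structure) with the type
# of text data it carries.
PREDEFINED = [
    {"match": lambda e: e["shape"] == "rect" and e["clickable"],
     "text_type": "button_label"},
    {"match": lambda e: e["shape"] == "rect" and not e["clickable"],
     "text_type": "static_text"},
]

# Per-text-type glossaries: the same source word translates
# differently depending on the element type ("open" as a command
# vs. "open" as an adjective).
GLOSSARY = {
    "button_label": {"open": "Öffnen"},
    "static_text":  {"open": "offen"},
}

def translate(element, text):
    """Match the element to a predefined element, then translate its
    text according to the associated text type."""
    for pre in PREDEFINED:
        if pre["match"](element):
            return GLOSSARY[pre["text_type"]].get(text, text)
    return text

print(translate({"shape": "rect", "clickable": True}, "open"))
print(translate({"shape": "rect", "clickable": False}, "open"))
```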
-
Patent number: 10038980
Abstract: A method comprising: storing, by a computer, a list of mobile devices; creating, by the computer, a plurality of groups of the mobile devices, each group including at least one mobile device, based on locations of the mobile devices; providing, by the computer, at least one mobile device of the plurality of groups with information of a target location; and handling, by the computer, information for a mutual interaction between at least two groups of the plurality of groups.
Type: Grant
Filed: May 17, 2016
Date of Patent: July 31, 2018
Assignee: International Business Machines Corporation
Inventors: Kenji Hamahata, Koichi Nishitani, Tadayuki Yoshida
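One way to sketch the grouping step is to bucket devices by grid cell; the cell size and the "one device per group" delivery rule are assumptions, since the abstract fixes neither:

```python
# Sketch: devices are grouped by rounding their coordinates to a
# grid cell. The cell size is an illustrative assumption.
def create_groups(devices, cell=10):
    groups = {}
    for name, (x, y) in devices.items():
        key = (x // cell, y // cell)
        groups.setdefault(key, []).append(name)
    return groups

def provide_target(groups, target):
    """Provide the target location to one device per group
    (assumed here to be the first member)."""
    return {key: (members[0], target) for key, members in groups.items()}

devices = {"a": (1, 2), "b": (3, 4), "c": (25, 30)}
groups = create_groups(devices)
print(sorted(groups.values()))
```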
-
Patent number: 10007648
Abstract: Embodiments relate to supporting creation of a manual of a program product. An aspect includes recording into a recording medium that can be accessed by the computer a screen character string, a translated character string where the screen character string has been translated to another language, or an identifier associated with the screen character string or the translated character string, displayed on a display device by the program product. Another aspect includes recording into the recording medium attribute information of the screen character string or the translated character string. Yet another aspect includes maintaining consistency between the screen character string or the translated character string and a character string that is displayed on a display device by an application for creating the manual, using the screen character string, the translated character string or identifier recorded on the recording medium and the attribute information.
Type: Grant
Filed: July 5, 2016
Date of Patent: June 26, 2018
Assignee: International Business Machines Corporation
Inventors: Kenji Hamahata, Shingo Kawai, Tadayuki Yoshida
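The consistency-maintenance idea can be sketched by having the manual reference recorded identifiers rather than literal text; the dictionary "recording medium" and the placeholder syntax are illustrative assumptions:

```python
# Sketch: screen character strings are recorded under identifiers;
# the manual stores identifiers, not literal text, so it stays
# consistent when a screen string (or its translation) changes.
RECORD = {}

def record(ident, screen_string, attrs=None):
    """Record a screen string and its attribute information."""
    RECORD[ident] = {"text": screen_string, "attrs": attrs or {}}

def render_manual(template):
    """Resolve {id} placeholders against the recording medium."""
    return template.format(**{k: v["text"] for k, v in RECORD.items()})

record("btn_save", "Save")
manual = "Click the {btn_save} button to store your work."
print(render_manual(manual))

record("btn_save", "Save As...")  # the screen string changes
print(render_manual(manual))      # the manual stays consistent
```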
-
Publication number: 20180068585
Abstract: Provided are an apparatus and method for: acquiring information designating a plurality of menu items provided in a meal; reading a characteristic of each of the plurality of menu items provided in the meal, from a menu item database that stores a characteristic for each menu item; generating a classification model for classifying a specified menu item from among the plurality of menu items, based on characteristics of the menu items; acquiring measurement data measured by a sensor attached to a utensil during the meal; and classifying a menu item corresponding to the measurement data, from a characteristic corresponding to the measurement data, using the classification model.
Type: Application
Filed: September 8, 2016
Publication date: March 8, 2018
Inventors: Kazuhiro Hara, Hiroki Nakano, Hiroki Nishiyama, Ai Okada, Tadayuki Yoshida, Mayumi Yoshitake
-
Publication number: 20170339523
Abstract: A method comprising: storing, by a computer, a list of mobile devices; creating, by the computer, a plurality of groups of the mobile devices, each group including at least one mobile device, based on locations of the mobile devices; providing, by the computer, at least one mobile device of the plurality of groups with information of a target location; and handling, by the computer, information for a mutual interaction between at least two groups of the plurality of groups.
Type: Application
Filed: May 17, 2016
Publication date: November 23, 2017
Inventors: Kenji Hamahata, Koichi Nishitani, Tadayuki Yoshida
-
Publication number: 20170206930
Abstract: An electronic system that dynamically creates video based on a structured document, by associating video clips with items in the document, includes a server that is connected to a user terminal. The user terminal sends to the server, as a selected item, at least one item in a structured document selected by the user. The server receives the item sent by the user terminal, identifies the item in the structured document selected by the user, identifies at least one dependent item having a dependent relationship with the selected item, dynamically creates a video on the basis of at least one video clip associated with each identified item and at least one video clip associated with each identified dependent item, and sends the video for playback on the user terminal.
Type: Application
Filed: March 30, 2017
Publication date: July 20, 2017
Inventors: Satoshi Innami, Tadayuki Yoshida, Natsuki Zettsu
-
Patent number: 9686525
Abstract: An electronic system that dynamically creates video based on a structured document, by associating video clips with items in the document, includes a server that is connected to a user terminal. The user terminal sends to the server, as a selected item, at least one item in a structured document selected by the user. The server receives the item sent by the user terminal, identifies the item in the structured document selected by the user, identifies at least one dependent item having a dependent relationship with the selected item, dynamically creates a video on the basis of at least one video clip associated with each identified item and at least one video clip associated with each identified dependent item, and sends the video for playback on the user terminal.
Type: Grant
Filed: January 27, 2015
Date of Patent: June 20, 2017
Assignee: International Business Machines Corporation
Inventors: Satoshi Innami, Tadayuki Yoshida, Natsuki Zettsu
-
Publication number: 20160314105
Abstract: Embodiments relate to supporting creation of a manual of a program product. An aspect includes recording into a recording medium that can be accessed by the computer a screen character string, a translated character string where the screen character string has been translated to another language, or an identifier associated with the screen character string or the translated character string, displayed on a display device by the program product. Another aspect includes recording into the recording medium attribute information of the screen character string or the translated character string. Yet another aspect includes maintaining consistency between the screen character string or the translated character string and a character string that is displayed on a display device by an application for creating the manual, using the screen character string, the translated character string or identifier recorded on the recording medium and the attribute information.
Type: Application
Filed: July 5, 2016
Publication date: October 27, 2016
Inventors: Kenji Hamahata, Shingo Kawai, Tadayuki Yoshida
-
Patent number: 9436680
Abstract: Embodiments relate to supporting creation of a manual of a program product. An aspect includes recording into a recording medium that can be accessed by the computer a screen character string, a translated character string where the screen character string has been translated to another language, or an identifier associated with the screen character string or the translated character string, displayed on a display device by the program product. Another aspect includes recording into the recording medium attribute information of the screen character string or the translated character string. Yet another aspect includes maintaining consistency between the screen character string or the translated character string and a character string that is displayed on a display device by an application for creating the manual, using the screen character string, the translated character string or identifier recorded on the recording medium and the attribute information.
Type: Grant
Filed: April 25, 2014
Date of Patent: September 6, 2016
Assignee: International Business Machines Corporation
Inventors: Kenji Hamahata, Shingo Kawai, Tadayuki Yoshida
-
Publication number: 20150213840
Abstract: An electronic system that dynamically creates video based on a structured document, by associating video clips with items in the document, includes a server that is connected to a user terminal. The user terminal sends to the server, as a selected item, at least one item in a structured document selected by the user. The server receives the item sent by the user terminal, identifies the item in the structured document selected by the user, identifies at least one dependent item having a dependent relationship with the selected item, dynamically creates a video on the basis of at least one video clip associated with each identified item and at least one video clip associated with each identified dependent item, and sends the video for playback on the user terminal.
Type: Application
Filed: January 27, 2015
Publication date: July 30, 2015
Inventors: Satoshi Innami, Tadayuki Yoshida, Natsuki Zettsu
-
Publication number: 20150046145
Abstract: Embodiments relate to supporting creation of a manual of a program product. An aspect includes recording into a recording medium that can be accessed by the computer a screen character string, a translated character string where the screen character string has been translated to another language, or an identifier associated with the screen character string or the translated character string, displayed on a display device by the program product. Another aspect includes recording into the recording medium attribute information of the screen character string or the translated character string. Yet another aspect includes maintaining consistency between the screen character string or the translated character string and a character string that is displayed on a display device by an application for creating the manual, using the screen character string, the translated character string or identifier recorded on the recording medium and the attribute information.
Type: Application
Filed: April 25, 2014
Publication date: February 12, 2015
Applicant: International Business Machines Corporation
Inventors: Kenji Hamahata, Shingo Kawai, Tadayuki Yoshida
-
Publication number: 20140122054
Abstract: A method, system, and/or computer program product translates a phrase in a first language to an equivalent phrase in a second language. Electronic data includes image data, which describes one or more graphical user interface (GUI) elements on a GUI. The GUI includes a delineated area that is used by a GUI element. The GUI element is matched to one of a plurality of predefined GUI elements, where each of the plurality of predefined GUI elements is predefined by a set of data that comprises a conditional expression, which describes a structure of a particular GUI element, as well as a type of text data that is associated with that particular GUI element. Text data from the delineated area in the GUI is then translated from the first language into the second language according to the type of text data that is associated with a particular type of GUI element.
Type: Application
Filed: October 3, 2013
Publication date: May 1, 2014
Applicant: International Business Machines Corporation
Inventors: Sawa Takano, Tadayuki Yoshida
-
Patent number: 8447586
Abstract: In a verification support apparatus, a content analysis section analyzes a content to divide the content into paragraphs, extract region/culture-specific data, and store the analysis results in an analysis result storage section. A first verification section verifies, based on the analysis results, the consistency between the content and locale of a paragraph and the consistency between the paragraph and locale of the region/culture-specific data. A second verification section verifies, based on the analysis results, the correspondence between a paragraph of language A and a paragraph of language B and the consistency between the region/culture-specific data of language A and the region/culture-specific data of language B. A content update section updates the content so that the results of verification by the first verification section or the second verification section can be displayed in a way a person in charge of verification can easily understand.
Type: Grant
Filed: February 9, 2010
Date of Patent: May 21, 2013
Assignee: International Business Machines Corporation
Inventors: Nozomu Aoyama, Kei Sugano, Tadayuki Yoshida, Natsuki Zettsu
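The first-verification step (paragraph locale vs. region/culture-specific data) can be sketched using date formats as the region-specific data; the locale rules below are simplified assumptions, and the patent's actual checks are broader:

```python
import re

# Hypothetical locale rules: the expected date format per locale.
DATE_PATTERNS = {
    "en-US": re.compile(r"\d{2}/\d{2}/\d{4}"),  # MM/DD/YYYY
    "ja-JP": re.compile(r"\d{4}/\d{2}/\d{2}"),  # YYYY/MM/DD
}

def extract_dates(paragraph):
    """Extract region/culture-specific data (here: date-like runs)."""
    return re.findall(r"\b[\d/]{8,10}\b", paragraph)

def verify(paragraph, locale):
    """First-verification step: return the dates in the paragraph
    that are inconsistent with its declared locale."""
    pattern = DATE_PATTERNS[locale]
    return [d for d in extract_dates(paragraph) if not pattern.fullmatch(d)]

print(verify("Released on 2021/06/01.", "ja-JP"))  # consistent
print(verify("Released on 2021/06/01.", "en-US"))  # flagged
```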
-
Patent number: 7860846
Abstract: An assigning unit included in a device assigns an ID to a verification-target character string. A mapping file creating unit creates a mapping file in which the ID is associated with location data that indicates a location of the verification-target character string. In addition, an encoding unit included in the device encodes the ID into a zero-width character string. A character-string concatenating unit concatenates the zero-width character string to the verification-target character string. Furthermore, a decoding unit included in the device decodes the zero-width character string to the ID in response to selection of the verification-target character string displayed by a verification-target character string output unit. An extracting unit extracts the location data from the mapping file on the basis of the decoded ID. A mapping data output unit included in the device displays the mapping data including the extracted location data.
Type: Grant
Filed: March 14, 2008
Date of Patent: December 28, 2010
Assignee: International Business Machines Corporation
Inventors: Michiko Takahashi, Tadayuki Yoshida
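The zero-width encoding scheme lends itself to a direct sketch: the ID's bits map to two zero-width characters appended to the displayed string, then decode back to look up location data in the mapping file. The bit width and the two chosen characters are assumptions, not the patent's specified encoding:

```python
# Two zero-width characters stand in for the bits of the ID; the
# marker is invisible when the verification-target string is shown.
ZERO = "\u200b"  # zero-width space      -> bit 0
ONE  = "\u200c"  # zero-width non-joiner -> bit 1

def encode(ident, width=16):
    """Encode an ID into a zero-width character string."""
    return "".join(ONE if (ident >> i) & 1 else ZERO
                   for i in reversed(range(width)))

def decode(zw):
    """Decode a zero-width character string back to the ID."""
    ident = 0
    for ch in zw:
        ident = (ident << 1) | (1 if ch == ONE else 0)
    return ident

# Hypothetical mapping file: ID -> location of the target string.
mapping = {42: "docs/ui/strings.xml:17"}
tagged = "Save" + encode(42)          # concatenate marker to the string
print(tagged == "Save")               # marker present but invisible
print(mapping[decode(tagged[len("Save"):])])
```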
-
Publication number: 20100211377
Abstract: In a verification support apparatus, a content analysis section analyzes a content to divide the content into paragraphs, extract region/culture-specific data, and store the analysis results in an analysis result storage section. A first verification section verifies, based on the analysis results, the consistency between the content and locale of a paragraph and the consistency between the paragraph and locale of the region/culture-specific data. A second verification section verifies, based on the analysis results, the correspondence between a paragraph of language A and a paragraph of language B and the consistency between the region/culture-specific data of language A and the region/culture-specific data of language B. A content update section updates the content so that the results of verification by the first verification section or the second verification section can be displayed in a way a person in charge of verification can easily understand.
Type: Application
Filed: February 9, 2010
Publication date: August 19, 2010
Applicant: International Business Machines Corporation
Inventors: Nozomu Aoyama, Kei Sugano, Tadayuki Yoshida, Natsuki Zettsu
-
Patent number: D871494
Type: Grant
Filed: May 24, 2018
Date of Patent: December 31, 2019
Assignee: Hoya Corporation
Inventors: Kazumi Yamada, Tadayuki Yoshida, Shinichi Sasaki