Patents by Inventor Xiaoyong Ye

Xiaoyong Ye has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11967087
    Abstract: To track certain facial features that are difficult to capture during speech, such as the corners of the mouth and the teeth, a camera sensor system generates RGB/IR images and also uses light intensity change signals from an event driven sensor (EDS), as well as voice analysis. In this way, the camera sensor system enables improved tracking performance (equivalent to using a very high-speed camera) at lower bandwidth and power consumption.
    Type: Grant
    Filed: October 14, 2022
    Date of Patent: April 23, 2024
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20240048799
    Abstract: Detection of whether a video is a fake video derived from an original video and then altered is undertaken using a block chain that either forbids adding altered copies of original videos to the block chain or indicates in the block chain that an altered video has been altered. Image fingerprinting techniques are described for determining whether a video sought to be added to the block chain has been altered.
    Type: Application
    Filed: July 2, 2023
    Publication date: February 8, 2024
    Inventors: Xiaoyong Ye, Warren Benedetto
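The filing above publishes no source code, so the following is only a loose illustration of the idea in the abstract: a block chain that refuses to admit near-duplicate (likely altered) copies of videos already on the chain. The mean-threshold fingerprint, Hamming-distance test, and threshold value are all hypothetical simplifications, not taken from the application.

```python
import hashlib

def frame_fingerprint(frame):
    """Coarse perceptual-style fingerprint: threshold each pixel of a
    grayscale frame against the frame's mean intensity, yielding a bit
    string that survives small uniform brightness shifts but changes
    when the content is edited."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p >= mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

class VideoChain:
    """Toy block chain: a video may be appended only if its fingerprint
    is not a near-duplicate (a likely altered copy) of one already on
    the chain; each block is hash-linked to its predecessor."""
    def __init__(self, threshold=2):
        self.threshold = threshold
        self.blocks = []  # list of (fingerprint, prev_hash)

    def _head_hash(self):
        if not self.blocks:
            return "0" * 64
        fp, prev = self.blocks[-1]
        return hashlib.sha256((fp + prev).encode()).hexdigest()

    def try_append(self, frame):
        fp = frame_fingerprint(frame)
        for existing_fp, _ in self.blocks:
            d = hamming(fp, existing_fp)
            if 0 < d <= self.threshold:
                return False  # close but not identical: likely altered copy
        self.blocks.append((fp, self._head_hash()))
        return True
```

A real system would fingerprint many frames per video with a robust perceptual hash; the gating logic, however, stays the same: compare against the chain before appending.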
  • Publication number: 20230398434
    Abstract: A position and orientation of a controller are determined from a known configuration of two or more light sources with respect to each other and with respect to a controller body, and from output signals from a dynamic vision sensor (DVS) generated in response to changes in light output from the light sources. The output signals indicate the times of events at corresponding light-sensitive elements in an array in the DVS and the array locations of those light-sensitive elements. A position and orientation of one or more objects are determined from signals generated by two or more light-sensitive elements resulting from other light reaching those elements.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20230398432
    Abstract: A tracking system uses a dynamic vision sensor (DVS) operably coupled to the processor. The DVS has an array of light-sensitive elements in a known configuration. The DVS outputs signals corresponding to two or more events at two or more corresponding light-sensitive elements in the array in response to changes in light output from two or more light sources mounted in a known configuration with respect to each other and with respect to a controller. The output signals indicate times of the events and locations of the corresponding light-sensitive elements. The processor determines an association between each event and two or more corresponding light sources and fits a position and orientation of the controller using the determined association, the known configuration of the light sources and the locations of the corresponding light-sensitive elements in the array.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20230398433
    Abstract: A DVS has an array of light-sensitive elements in a known configuration and outputs signals corresponding to events at corresponding light-sensitive elements in the array in response to changes in light output from two or more light sources in a known configuration with respect to each other and with respect to a controller body. The signals indicate times of the events and array locations of the corresponding light-sensitive elements. Filters selectively transmit light from the light sources to light-sensitive elements in the array and selectively block other light from reaching those elements and vice versa. A processor determines a position and orientation of the controller from times of the events, array locations of corresponding light-sensitive elements, and the known light source configuration and determines a position and orientation of one or more objects from signals generated by two or more light-sensitive elements resulting from other light.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
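The three DVS publications above all rely on an event-to-light-source association step before the pose fit. The filings do not disclose code; the sketch below is a minimal illustration under the assumption that each light source flashes on a known schedule and an event is attributed to the source whose scheduled flash time is nearest its timestamp. The tolerance value and data shapes are invented for the example.

```python
def associate_events(events, flash_schedule, tolerance=0.002):
    """Attribute each DVS event to the light source whose scheduled
    flash time is closest to the event timestamp, within a tolerance.

    events: list of (timestamp_seconds, (px, py)) from the sensor array.
    flash_schedule: dict mapping led_id -> list of flash times (seconds).
    Returns: dict mapping led_id -> list of associated pixel locations,
    which a downstream solver could use to fit controller pose.
    """
    associations = {led: [] for led in flash_schedule}
    for t, pixel in events:
        best_led, best_dt = None, tolerance
        for led, times in flash_schedule.items():
            dt = min(abs(t - ft) for ft in times)
            if dt <= best_dt:
                best_led, best_dt = led, dt
        if best_led is not None:          # events matching no flash
            associations[best_led].append(pixel)  # (noise) are dropped
    return associations
```

Given these per-source pixel observations and the known physical layout of the sources on the controller body, a standard perspective-n-point solver could then fit position and orientation, per the abstracts.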
  • Publication number: 20230400911
    Abstract: A tracking system includes a processor, a controller, two or more light sources and a dynamic vision sensor (DVS). The light sources are in a known configuration with respect to each other and to the controller, and turn on and off in a predetermined sequence. The DVS includes an array of light-sensitive elements of known configuration. The DVS outputs signals corresponding to events at corresponding light-sensitive elements in the array in response to changes in light from the light sources. The signals indicate the times of the events and the locations of the corresponding light-sensitive elements. The processor determines an association between each event and one or more of the light sources and, from that association, determines an occlusion of one or more of the light sources. The processor estimates a location of an object using the determined occlusion, the known light source configuration, and the locations of the corresponding light-sensitive elements in the array.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20230401723
    Abstract: A tracking system and method comprise a processor and a controller operably coupled to the processor. Two or more light sources are mounted in a known configuration with respect to each other and with respect to the controller body, and the light sources are configured to flash in a predetermined time sequence. A dynamic vision sensor is configured to output signals corresponding to two or more events at two or more corresponding light-sensitive elements in an array in response to changes in light output from the light sources; the signals indicate the times of the events and the locations of the events among the light-sensitive elements in the array. The processor determines a position and orientation of the controller from the times and locations of the events and the known configuration information.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20230400915
    Abstract: A tracking system uses one or more light sources to direct light toward one or more of a user's eyes. A dynamic vision sensor (DVS) operably coupled to a processor is configured to view the user's eyes. The DVS has an array of light-sensitive elements in a known configuration. The DVS outputs signals corresponding to events at corresponding light-sensitive elements in the array in response to changes in light from the light sources reflecting from a portion of the user's eyes. The output signals include information corresponding to locations of the corresponding light-sensitive elements in the array. The processor determines an association between each event and a corresponding particular light source and fits an orientation of the user's eyes using the determined association, the relative location of the light sources with respect to the user's eyes and the locations of the corresponding light-sensitive elements in the array.
    Type: Application
    Filed: June 10, 2022
    Publication date: December 14, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Patent number: 11736763
    Abstract: Detection of whether a video is a fake video derived from an original video and then altered is undertaken using a block chain that either forbids adding altered copies of original videos to the block chain or indicates in the block chain that an altered video has been altered. Image fingerprinting techniques are described for determining whether a video sought to be added to the block chain has been altered.
    Type: Grant
    Filed: June 15, 2021
    Date of Patent: August 22, 2023
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Xiaoyong Ye, Warren Benedetto
  • Patent number: 11717759
    Abstract: Methods and systems for enhancing the viewing experience of a spectator include receiving a request for viewing game play of a video game executing at a game cloud server. Responsive to the request, the spectator is initially provided with either a player camera view or an active spectator camera view generated for the game play. The system dynamically switches between the player camera view and the active spectator camera view based on the context of action occurring in each view. The dynamic switching correlates with the timing of action occurring in the game scene of the video game.
    Type: Grant
    Filed: March 1, 2022
    Date of Patent: August 8, 2023
    Assignee: Sony Interactive Entertainment Inc.
    Inventor: Xiaoyong Ye
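The patent does not publish its switching logic; as a rough illustration of context-driven view selection, the sketch below switches only when the other view's action score clearly exceeds the current one. The numeric "action score" inputs and the hysteresis margin are assumptions for the example, not details from the filing.

```python
def select_camera(player_action, spectator_action, current="player", margin=0.2):
    """Hysteresis-based view switcher: change to the other camera view
    only when its action score beats the current view's score by a
    margin, so the stream does not flicker on small score differences.

    player_action / spectator_action: context-of-action scores in [0, 1].
    Returns the name of the view to show next.
    """
    scores = {"player": player_action, "spectator": spectator_action}
    other = "spectator" if current == "player" else "player"
    if scores[other] > scores[current] + margin:
        return other
    return current
```

Calling this once per scoring interval yields switching that tracks where the action is, while the margin keeps brief score crossings from causing rapid back-and-forth cuts.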
  • Patent number: 11719850
    Abstract: A method includes using an electromagnetic (EM) tracking system to track a tangible object, detecting a presence of interference with a magnetic field generated by the EM tracking system, and compensating for the interference. A system includes an EM tracking transmitter, an EM tracking receiver, and a processor based apparatus in communication with the EM tracking transmitter and the EM tracking receiver. The processor based apparatus is configured to execute steps including using the EM tracking transmitter and the EM tracking receiver to implement an EM tracking system. A storage medium storing one or more computer programs is also provided.
    Type: Grant
    Filed: June 9, 2020
    Date of Patent: August 8, 2023
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Dennis D. Castleman, Xiaoyong Ye
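The filing above does not disclose its interference-detection math. One simple way to illustrate the idea is to compare the measured field magnitude against an ideal magnetic-dipole model, which falls off as 1/r³, and flag interference when the deviation is too large. The model constant, tolerance, and the dipole simplification are all assumptions for this sketch.

```python
def expected_magnitude(distance, k=1.0):
    """Ideal dipole field magnitude at a given distance: B ~ k / r^3."""
    return k / distance ** 3

def detect_interference(measured, distance, k=1.0, rel_tol=0.15):
    """Flag interference when the measured field magnitude deviates
    from the ideal dipole model by more than rel_tol (fractional
    error). A real system could respond by discounting the EM reading
    or blending in another tracking modality to compensate."""
    expected = expected_magnitude(distance, k)
    return abs(measured - expected) / expected > rel_tol
```

For example, a conductive object near the transmitter distorts the field, so the magnitude seen at a known distance no longer matches the model and the reading can be treated as suspect.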
  • Patent number: 11635802
    Abstract: To facilitate control of an AR HMD, a camera unit in a camera sensor system generates RGB/IR images, and the system also extrapolates images for future times based on light intensity change signals from an event detection sensor (EDS) for HMD pose tracking, hand tracking, and eye tracking. The future times are defined by an HMD application, and the RGB/IR images and extrapolated images are sent back to the application. In this way, the camera sensor system enables improved tracking performance (equivalent to using a very high-speed camera) at lower bandwidth and power consumption.
    Type: Grant
    Filed: January 13, 2020
    Date of Patent: April 25, 2023
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Jeffrey R. Stafford, Xiaoyong Ye, Yutaka Yokokawa
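The patent describes extrapolating tracking state to future times requested by the application; the actual method (driven by EDS intensity-change events) is not published here. As a much-simplified stand-in, the sketch below linearly extrapolates one tracked coordinate from its two most recent samples to the application's requested time.

```python
def extrapolate_pose(t_future, samples):
    """Linear extrapolation of a tracked coordinate to a future time.

    t_future: time (seconds) requested by the application.
    samples: list of (t, value) pairs in increasing t; the last two
    samples define the current rate of change.
    Returns the predicted value at t_future.
    """
    (t0, v0), (t1, v1) = samples[-2], samples[-1]
    rate = (v1 - v0) / (t1 - t0)          # most recent velocity estimate
    return v1 + rate * (t_future - t1)    # project forward to t_future
```

In practice one such prediction per pose dimension (and per tracked hand/eye quantity) would be returned alongside the RGB/IR frames, letting the application render against where the headset will be rather than where it was.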
  • Publication number: 20230068416
    Abstract: To track certain facial features that are difficult to capture during speech, such as the corners of the mouth and the teeth, a camera sensor system generates RGB/IR images and also uses light intensity change signals from an event driven sensor (EDS), as well as voice analysis. In this way, the camera sensor system enables improved tracking performance (equivalent to using a very high-speed camera) at lower bandwidth and power consumption.
    Type: Application
    Filed: October 14, 2022
    Publication date: March 2, 2023
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Patent number: 11551474
    Abstract: Detection of whether a video is a fake video derived from an original video and then altered is undertaken using both image analysis and frequency domain analysis of one or more frames of the video. The analysis may be implemented using neural networks.
    Type: Grant
    Filed: October 21, 2019
    Date of Patent: January 10, 2023
    Assignee: Sony Interactive Entertainment Inc.
    Inventor: Xiaoyong Ye
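The patent's frequency-domain analysis is performed by neural networks whose details are not given here. As an illustration of the kind of frequency-domain feature such a classifier might consume, the sketch below computes the fraction of spectral energy in the highest-frequency bins of a signal (synthesized imagery often carries an abnormal high-frequency signature). The DFT is naive and the cutoff fraction is an arbitrary choice for the example.

```python
import cmath

def dft(signal):
    """Naive discrete Fourier transform (fine for short signals)."""
    n = len(signal)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(signal))
            for k in range(n)]

def high_freq_ratio(signal, cutoff_frac=0.25):
    """Fraction of non-DC spectral energy in the top cutoff_frac of
    frequency bins. A smooth natural signal concentrates energy at low
    frequencies; heavy high-frequency energy can be a synthesis cue for
    a downstream detector to threshold or learn from."""
    spectrum = dft(signal)
    n = len(spectrum)
    energy = [abs(c) ** 2 for c in spectrum[1:n // 2 + 1]]  # skip DC
    cut = int(len(energy) * (1 - cutoff_frac))
    total = sum(energy)
    return sum(energy[cut:]) / total if total else 0.0
```

On real frames one would use a 2-D FFT of pixel intensities, but the feature is the same in spirit: summarize where in the spectrum the frame's energy lives.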
  • Patent number: 11475618
    Abstract: To track certain facial features that are difficult to capture during speech, such as the corners of the mouth and the teeth, a camera sensor system generates RGB/IR images and also uses light intensity change signals from an event driven sensor (EDS), as well as voice analysis. In this way, the camera sensor system enables improved tracking performance (equivalent to using a very high-speed camera) at lower bandwidth and power consumption.
    Type: Grant
    Filed: May 11, 2020
    Date of Patent: October 18, 2022
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20220184510
    Abstract: Methods and systems for enhancing the viewing experience of a spectator include receiving a request for viewing game play of a video game executing at a game cloud server. Responsive to the request, the spectator is initially provided with either a player camera view or an active spectator camera view generated for the game play. The system dynamically switches between the player camera view and the active spectator camera view based on the context of action occurring in each view. The dynamic switching correlates with the timing of action occurring in the game scene of the video game.
    Type: Application
    Filed: March 1, 2022
    Publication date: June 16, 2022
    Inventor: Xiaoyong Ye
  • Patent number: 11260307
    Abstract: Methods and systems for enhancing the viewing experience of a spectator include receiving a request for viewing game play of a video game executing at a game cloud server. Responsive to the request, the spectator is initially provided with either a player camera view or an active spectator camera view generated for the game play. The system dynamically switches between the player camera view and the active spectator camera view based on the context of action occurring in each view. The dynamic switching correlates with the timing of action occurring in the game scene of the video game.
    Type: Grant
    Filed: May 28, 2020
    Date of Patent: March 1, 2022
    Assignee: Sony Interactive Entertainment Inc.
    Inventor: Xiaoyong Ye
  • Publication number: 20210370186
    Abstract: Methods and systems for enhancing the viewing experience of a spectator include receiving a request for viewing game play of a video game executing at a game cloud server. Responsive to the request, the spectator is initially provided with either a player camera view or an active spectator camera view generated for the game play. The system dynamically switches between the player camera view and the active spectator camera view based on the context of action occurring in each view. The dynamic switching correlates with the timing of action occurring in the game scene of the video game.
    Type: Application
    Filed: May 28, 2020
    Publication date: December 2, 2021
    Inventor: Xiaoyong Ye
  • Publication number: 20210350602
    Abstract: To track certain facial features that are difficult to capture during speech, such as the corners of the mouth and the teeth, a camera sensor system generates RGB/IR images and also uses light intensity change signals from an event driven sensor (EDS), as well as voice analysis. In this way, the camera sensor system enables improved tracking performance (equivalent to using a very high-speed camera) at lower bandwidth and power consumption.
    Type: Application
    Filed: May 11, 2020
    Publication date: November 11, 2021
    Inventors: Xiaoyong Ye, Yuichiro Nakamura
  • Publication number: 20210312301
    Abstract: Methods and systems are provided for generating recommendations related to a game, for example a game executed by a cloud gaming service. The method includes receiving, by a server, sensor data captured during gameplay of the game by a plurality of users, where each item of sensor data includes intensity information related to reactions made by the respective user. The method includes processing, by the server, features from the sensor data and the interactive data generated when the users played the game. The features are classified and used to build an engagement model that identifies relationships between specific items of sensor data and the interactive data. The method includes processing, by the server, sensor data captured during a current gameplay by a user using the engagement model.
    Type: Application
    Filed: April 1, 2020
    Publication date: October 7, 2021
    Inventors: Mahdi Azmandian, Xiaoyong Ye, Todd Tokubo
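The filing does not publish its model-building code; as a bare-bones illustration of linking sensor reactions to interactive data, the sketch below aggregates mean reaction intensity per in-game event type across many users, then scores a new gameplay stream against that model. The event names, intensity scale, and the mean-per-event "model" are all invented simplifications for the example.

```python
def build_engagement_model(sessions):
    """Aggregate mean reaction intensity per in-game event type across
    many users' sessions, relating sensor data to interactive data.

    sessions: list of sessions; each session is a list of
    (event_type, reaction_intensity) pairs.
    Returns: dict mapping event_type -> mean reaction intensity.
    """
    totals, counts = {}, {}
    for session in sessions:
        for event, intensity in session:
            totals[event] = totals.get(event, 0.0) + intensity
            counts[event] = counts.get(event, 0) + 1
    return {event: totals[event] / counts[event] for event in totals}

def score_gameplay(model, live_events, default=0.0):
    """Score a current gameplay stream against the learned model: the
    predicted average engagement of the observed event mix, which could
    then drive recommendations."""
    if not live_events:
        return default
    return sum(model.get(e, default) for e in live_events) / len(live_events)
```

A production version would classify richer features with a learned model rather than a per-event mean, but the pipeline shape matches the abstract: aggregate many users' reactions, then apply the result to a current session.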