Patents by Inventor Targay Oskiper

Targay Oskiper has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170193710
    Abstract: A system and method for generating a mixed-reality environment are provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding the user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information that defines the user's real-world scene or environment and indicates the user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting them into the user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect, for the user, that the synthetic objects are present in the real world.
    Type: Application
    Filed: March 21, 2017
    Publication date: July 6, 2017
    Inventors: Rakesh Kumar, Targay Oskiper, Oleg Naroditsky, Supun Samarasekera, Zhiwei Zhu, Janet Kim
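The abstract above describes a three-stage pipeline: user-worn sensors estimate pose and location, a synthetic object module generates objects from that estimate, and the display renders them into the field of view. As a purely illustrative sketch of that data flow (all names, the averaging fusion, and the object-placement rule are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Hypothetical 6-DOF pose: position plus orientation angles."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def fuse_sensor_data(readings):
    """Stand-in for the user-worn sub-system: average per-sensor pose estimates."""
    n = len(readings)
    fields = ("x", "y", "z", "yaw", "pitch", "roll")
    return Pose(*[sum(getattr(r, f) for r in readings) / n for f in fields])

def generate_synthetic_objects(pose):
    """Stand-in for the synthetic object computer module: place one object
    a fixed offset in front of the user's fused position."""
    return [{"label": "synthetic_marker",
             "x": pose.x + 1.0, "y": pose.y, "z": pose.z}]

def render(objects, display):
    """Stand-in for the user-worn display: insert objects into the field of view."""
    display.extend(objects)

# One frame of the loop: sense -> generate -> render.
field_of_view = []
pose = fuse_sensor_data([Pose(0, 0, 0, 0, 0, 0), Pose(2, 0, 0, 0, 0, 0)])
render(generate_synthetic_objects(pose), field_of_view)
print(field_of_view)  # one synthetic object placed relative to the fused pose
```

The key structural point matching the abstract is the separation of concerns: pose estimation, object generation, and rendering are independent stages connected only by the pose and object data passed between them.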
  • Patent number: 9600067
    Abstract: A system and method for generating a mixed-reality environment are provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding the user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information that defines the user's real-world scene or environment and indicates the user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting them into the user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect, for the user, that the synthetic objects are present in the real world.
    Type: Grant
    Filed: October 27, 2009
    Date of Patent: March 21, 2017
    Assignee: SRI International
    Inventors: Rakesh Kumar, Targay Oskiper, Oleg Naroditsky, Supun Samarasekera, Zhiwei Zhu, Janet Kim
  • Publication number: 20100103196
    Abstract: A system and method for generating a mixed-reality environment are provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding the user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information that defines the user's real-world scene or environment and indicates the user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting them into the user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect, for the user, that the synthetic objects are present in the real world.
    Type: Application
    Filed: October 27, 2009
    Publication date: April 29, 2010
    Inventors: Rakesh Kumar, Targay Oskiper, Oleg Naroditsky, Supun Samarasekera, Zhiwei Zhu, Janet Kim