Patents by Inventor Michelle Tze Hiang Chua

Michelle Tze Hiang Chua has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10726637
    Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
    Type: Grant
    Filed: June 14, 2019
    Date of Patent: July 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
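The core flow in this abstract — capturing a VR snapshot with per-pixel depth, letting a user edit the flat image on another device, then translating the edit back into the 3-D environment — can be sketched roughly as follows. The function names, the pinhole-camera model, and the intrinsics tuple are illustrative assumptions, not details from the patent itself.

```python
def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a 2-D pixel with known depth into a 3-D camera-space
    point, assuming a simple pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def apply_edit_to_snapshot(edit_px, depth_map, intrinsics):
    """Translate a user edit made on the flat snapshot image back into 3-D,
    using the per-pixel depth captured with the environmental snapshot."""
    u, v = edit_px
    d = depth_map[v][u]          # depth stored alongside the snapshot image
    return unproject(u, v, d, *intrinsics)

# An edit at pixel (2, 3) of a snapshot with uniform depth 1.0 lands at the
# corresponding 3-D point, which can then be merged into the VR scene.
depth_map = [[1.0] * 10 for _ in range(10)]
point = apply_edit_to_snapshot((2, 3), depth_map, (1.0, 1.0, 0.0, 0.0))
```

The depth map is what makes the round trip possible: without it, a 2-D edit would be ambiguous along the viewing ray.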
  • Patent number: 10564915
    Abstract: A computing system is provided, including a plurality of display devices including at least a first display device and a second display device. The computing system may further include one or more sensors configured to detect a first positional state of the first display device relative to the second display device and at least one user. The first positional state may include an angular orientation of the first display device relative to the second display device. The computing system may further include a processor configured to receive the first positional state from the one or more sensors. The processor may be further configured to generate first graphical content based at least in part on the first positional state. The processor may be further configured to transmit the first graphical content for display at the first display device.
    Type: Grant
    Filed: March 5, 2018
    Date of Patent: February 18, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Aaron D. Krauss, Jamie R. Cabaccang, Jennifer J. Choi, Michelle Tze Hiang Chua, Priya Ganadas, Donna Katherine Long, Kenneth Liam Kiemele
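The positional-state pipeline this abstract describes — sensors report the angular orientation between two displays, and the processor chooses graphical content accordingly — can be sketched as below. The posture names, angle thresholds, and content labels are hypothetical stand-ins, not values from the patent.

```python
def classify_posture(hinge_angle_deg):
    """Map the sensed hinge angle between the first and second display to a
    device posture (thresholds are illustrative)."""
    if hinge_angle_deg < 60:
        return "closed"
    if hinge_angle_deg < 180:
        return "laptop"
    if hinge_angle_deg < 300:
        return "flat"
    return "tent"

def content_for(posture):
    """Generate (here: select) content for the first display based on the
    positional state."""
    return {"closed": None,
            "laptop": "primary-app",
            "flat": "shared-canvas",
            "tent": "presentation"}[posture]

# A 90-degree hinge reads as a laptop posture, so the first display
# would receive the primary application surface.
choice = content_for(classify_posture(90))
```

In the patent's terms, the sensor reading is the "first positional state" and the selected surface is the "first graphical content" transmitted to the first display.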
  • Publication number: 20190295330
    Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
    Type: Application
    Filed: June 14, 2019
    Publication date: September 26, 2019
    Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
  • Publication number: 20190272138
    Abstract: A computing system is provided, including a plurality of display devices including at least a first display device and a second display device. The computing system may further include one or more sensors configured to detect a first positional state of the first display device relative to the second display device and at least one user. The first positional state may include an angular orientation of the first display device relative to the second display device. The computing system may further include a processor configured to receive the first positional state from the one or more sensors. The processor may be further configured to generate first graphical content based at least in part on the first positional state. The processor may be further configured to transmit the first graphical content for display at the first display device.
    Type: Application
    Filed: March 5, 2018
    Publication date: September 5, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Aaron D. Krauss, Jamie R. Cabaccang, Jennifer J. Choi, Michelle Tze Hiang Chua, Priya Ganadas, Donna Katherine Long, Kenneth Liam Kiemele
  • Patent number: 10332317
    Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
    Type: Grant
    Filed: June 7, 2017
    Date of Patent: June 25, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
  • Patent number: 10142697
    Abstract: Various enhanced interactive TV experiences are supported by a real-time TV service that can interact over a network with applications running on interactive TV platforms. The real-time television service enables data, video, and communication user interface (UI) elements to be supported alongside the linear TV content in three discrete navigational states, each with a specific role aligning to user intent. In the case of one particular UI element—a focused companion panel—a live stream of personalized data is supported that enhances the user's comprehension and awareness of live events. The real-time TV service can further enable users to opt in to certain customizable events and be served notifications on the TV screen on which their attention is directed, and follow the notification into an application that lands them on a specific piece of content that satisfies specific conditions.
    Type: Grant
    Filed: December 8, 2014
    Date of Patent: November 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tyler Bielman, Helen Lam, Michelle Tze Hiang Chua, Shannon Yen Yun Lee, Matthew Nigel Carter, Matthew Davy Walsh, David Seymour, Preetinderpal S. Mangat, William Michael Mozell, Jesse William Wesson, Michael James Mahar, Raymond Alexander Chi-Yue Lum, Michael James Perzel, Cameron David James McRae, Darrin Adam Brown, Dashan Yue, Remus Radu, Eric C. Bridgwater, David T. Ferguson
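The opt-in notification mechanism in this abstract — users subscribe to customizable events, receive notifications on the screen they are watching, and follow them into an app landing on content that satisfies specific conditions — can be sketched as a small publish/subscribe service. The class names, condition format, and deep-link scheme are invented for illustration and are not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Subscription:
    user: str
    event_type: str      # e.g. "score-change"
    condition: dict      # attributes the event must match, e.g. {"team": "Seattle"}

@dataclass
class NotificationService:
    """Toy model of the real-time TV service's opt-in notification routing."""
    subs: list = field(default_factory=list)

    def opt_in(self, sub):
        self.subs.append(sub)

    def publish(self, event_type, attrs):
        """Return (user, deep_link) pairs for every subscription whose
        conditions the event satisfies; the deep link would land the user on
        the matching piece of content inside the application."""
        hits = []
        for s in self.subs:
            if s.event_type == event_type and all(
                    attrs.get(k) == v for k, v in s.condition.items()):
                hits.append((s.user, f"app://content?{event_type}"))
        return hits

svc = NotificationService()
svc.opt_in(Subscription("alice", "score-change", {"team": "Seattle"}))
matches = svc.publish("score-change", {"team": "Seattle", "score": "7-3"})
```

Only subscribers whose stated conditions match the live event are notified, which mirrors the abstract's "notifications ... that satisfy specific conditions."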
  • Publication number: 20180114372
    Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
    Type: Application
    Filed: June 7, 2017
    Publication date: April 26, 2018
    Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
  • Publication number: 20160066053
    Abstract: Various enhanced interactive TV experiences are supported by a real-time TV service that can interact over a network with applications running on interactive TV platforms. The real-time television service enables data, video, and communication user interface (UI) elements to be supported alongside the linear TV content in three discrete navigational states, each with a specific role aligning to user intent. In the case of one particular UI element—a focused companion panel—a live stream of personalized data is supported that enhances the user's comprehension and awareness of live events. The real-time TV service can further enable users to opt in to certain customizable events and be served notifications on the TV screen on which their attention is directed, and follow the notification into an application that lands them on a specific piece of content that satisfies specific conditions.
    Type: Application
    Filed: December 8, 2014
    Publication date: March 3, 2016
    Inventors: Tyler Bielman, Helen Lam, Michelle Tze Hiang Chua, Shannon Yen Yun Lee, Matthew Nigel Carter, Matthew Davy Walsh, David Seymour, Preetinderpal S. Mangat, William Michael Mozell, Jesse William Wesson, Michael James Mahar, Raymond Alexander Chi-Yue Lum, Michael James Perzel, Cameron David James McRae, Darrin Adam Brown, Dashan Yue, Remus Radu, Eric C. Bridgwater, David T. Ferguson
  • Patent number: D910055
    Type: Grant
    Filed: June 3, 2019
    Date of Patent: February 9, 2021
    Assignee: Apple Inc.
    Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge
  • Patent number: D944846
    Type: Grant
    Filed: February 5, 2021
    Date of Patent: March 1, 2022
    Assignee: Apple Inc.
    Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, Kyle Ellington Fisher, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge
  • Patent number: D980259
    Type: Grant
    Filed: February 28, 2022
    Date of Patent: March 7, 2023
    Assignee: Apple Inc.
    Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge