Patents by Inventor Michelle Tze Hiang Chua
Michelle Tze Hiang Chua has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10726637
Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
Type: Grant
Filed: June 14, 2019
Date of Patent: July 28, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
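The abstract's core step, translating a 2D edit on the snapshot image back into the 3D VR environment using the stored depth, can be illustrated with a short sketch. This is not code from the patent; it assumes a hypothetical pinhole camera model (`fx`, `fy`, `cx`, `cy`) and a per-pixel depth map to show how depth information lets a flat-image edit land at a definite 3D position:

```python
import numpy as np

def unproject_edit(pixel_xy, depth_map, fx, fy, cx, cy):
    """Map a 2D edit on the snapshot image to a 3D point in the
    VR environment, using the snapshot's per-pixel depth."""
    u, v = pixel_xy
    z = depth_map[v, u]        # depth stored with the snapshot at this pixel
    x = (u - cx) * z / fx      # back-project with a pinhole camera model
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Hypothetical snapshot: a 480x640 depth map with everything 2 m away.
depth = np.full((480, 640), 2.0)
point = unproject_edit((320, 240), depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
# An edit at the image center lands on the camera axis at depth 2.0.
```

The depth map is what makes the round trip well defined: without it, a pixel edit corresponds to an entire ray in the VR scene rather than a single point.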
-
Patent number: 10564915
Abstract: A computing system is provided, including a plurality of display devices including at least a first display device and a second display device. The computing system may further include one or more sensors configured to detect a first positional state of the first display device relative to the second display device and at least one user. The first positional state may include an angular orientation of the first display device relative to the second display device. The computing system may further include a processor configured to receive the first positional state from the one or more sensors. The processor may be further configured to generate first graphical content based at least in part on the first positional state. The processor may be further configured to transmit the first graphical content for display at the first display device.
Type: Grant
Filed: March 5, 2018
Date of Patent: February 18, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Aaron D. Krauss, Jamie R. Cabaccang, Jennifer J. Choi, Michelle Tze Hiang Chua, Priya Ganadas, Donna Katherine Long, Kenneth Liam Kiemele
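The "angular orientation" part of the positional state, and how a processor might branch on it to generate content, can be sketched as follows. This is an illustration, not the patented method: the 30-degree threshold and the two layout modes are invented for the example, and the display normals stand in for whatever the sensors actually report:

```python
import math

def angular_orientation(normal_a, normal_b):
    """Angle in degrees between two display surface normals; a stand-in
    for the angular component of the 'first positional state'."""
    dot = sum(a * b for a, b in zip(normal_a, normal_b))
    mag = math.sqrt(sum(a * a for a in normal_a)) * math.sqrt(sum(b * b for b in normal_b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def choose_content(angle_deg):
    """Illustrative policy: near-coplanar displays extend one canvas,
    while a folded posture splits content across the two screens."""
    return "extended-canvas" if angle_deg < 30.0 else "split-content"

angle = angular_orientation((0.0, 0.0, 1.0), (0.0, 1.0, 1.0))  # 45 degrees
mode = choose_content(angle)                                    # "split-content"
```

The point of the sketch is the data flow the abstract describes: sensors yield a relative positional state, and the generated graphical content is a function of that state.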
-
Publication number: 20190295330
Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
Type: Application
Filed: June 14, 2019
Publication date: September 26, 2019
Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
-
Publication number: 20190272138
Abstract: A computing system is provided, including a plurality of display devices including at least a first display device and a second display device. The computing system may further include one or more sensors configured to detect a first positional state of the first display device relative to the second display device and at least one user. The first positional state may include an angular orientation of the first display device relative to the second display device. The computing system may further include a processor configured to receive the first positional state from the one or more sensors. The processor may be further configured to generate first graphical content based at least in part on the first positional state. The processor may be further configured to transmit the first graphical content for display at the first display device.
Type: Application
Filed: March 5, 2018
Publication date: September 5, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Aaron D. Krauss, Jamie R. Cabaccang, Jennifer J. Choi, Michelle Tze Hiang Chua, Priya Ganadas, Donna Katherine Long, Kenneth Liam Kiemele
-
Patent number: 10332317
Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
Type: Grant
Filed: June 7, 2017
Date of Patent: June 25, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
-
Patent number: 10142697
Abstract: Various enhanced interactive TV experiences are supported by a real-time TV service that can interact over a network with applications running on interactive TV platforms. The real-time television service enables data, video, and communication user interface (UI) elements to be supported alongside the linear TV content in three discrete navigational states, each with a specific role aligning to user intent. In the case of one particular UI element—a focused companion panel—a live stream of personalized data is supported that enhances the user's comprehension and awareness of live events. The real-time TV service can further enable users to opt in to certain customizable events and be served notifications on the TV screen on which their attention is directed, and follow the notification into an application that lands them on a specific piece of content that satisfies specific conditions.
Type: Grant
Filed: December 8, 2014
Date of Patent: November 27, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Tyler Bielman, Helen Lam, Michelle Tze Hiang Chua, Shannon Yen Yun Lee, Matthew Nigel Carter, Matthew Davy Walsh, David Seymour, Preetinderpal S. Mangat, William Michael Mozell, Jesse William Wesson, Michael James Mahar, Raymond Alexander Chi-Yue Lum, Michael James Perzel, Cameron David James McRae, Darrin Adam Brown, Dashan Yue, Remus Radu, Eric C. Bridgwater, David T. Ferguson
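The opt-in notification flow in this abstract, where users subscribe to customizable events and a served notification carries them into an app on a specific piece of content, can be sketched as a small publish/subscribe service. The class, event names, and deep-link scheme below are all invented for illustration; only the flow (opt in, serve notification, follow link to content) comes from the abstract:

```python
from dataclasses import dataclass, field

@dataclass
class RealTimeTVService:
    """Toy sketch of the opt-in notification flow: users subscribe to
    events, and each served notification carries a deep link that lands
    the user on a specific piece of content in a companion app."""
    subscriptions: dict = field(default_factory=dict)  # event -> set of users

    def opt_in(self, user, event):
        """Register a user's interest in a customizable event."""
        self.subscriptions.setdefault(event, set()).add(user)

    def serve(self, event, content_id):
        """Build one notification per opted-in user, each carrying a
        deep link to the content that satisfies the event's conditions."""
        return [
            {"user": user, "event": event, "deep_link": f"app://content/{content_id}"}
            for user in sorted(self.subscriptions.get(event, ()))
        ]

svc = RealTimeTVService()
svc.opt_in("alice", "goal-scored")
notes = svc.serve("goal-scored", "highlight-42")
# One notification for alice, deep-linking to app://content/highlight-42
```

Users who never opted in receive nothing, which mirrors the abstract's emphasis on notifications being served only for events the user chose.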
-
Publication number: 20180114372
Abstract: The present disclosure provides approaches to facilitating virtual reality and cross-device experiences. In some implementations, an environmental snapshot is captured which includes an image of a virtual reality (VR) environment presented on a VR device and corresponding depth information of the VR environment. The image of the environmental snapshot is presented on a different device than the VR device. A user modification to content associated with the presented image is translated into the environmental snapshot based on the depth information. The environmental snapshot comprising the user modification is translated into the VR environment. The VR environment comprising the translated user modification is presented. The environmental snapshot may correspond to a personal space of a user and may be accessed by another user through a social networking interface or other user networking interface to cause the presentation of the image.
Type: Application
Filed: June 7, 2017
Publication date: April 26, 2018
Inventors: Scott Anderson Nagy, Jason Carter, Anatolie Gavriliuc, Daniel L. Osborn, Kathleen Mulcahy, Patrick C. Ryan, Michelle Tze Hiang Chua
-
Publication number: 20160066053
Abstract: Various enhanced interactive TV experiences are supported by a real-time TV service that can interact over a network with applications running on interactive TV platforms. The real-time television service enables data, video, and communication user interface (UI) elements to be supported alongside the linear TV content in three discrete navigational states, each with a specific role aligning to user intent. In the case of one particular UI element—a focused companion panel—a live stream of personalized data is supported that enhances the user's comprehension and awareness of live events. The real-time TV service can further enable users to opt in to certain customizable events and be served notifications on the TV screen on which their attention is directed, and follow the notification into an application that lands them on a specific piece of content that satisfies specific conditions.
Type: Application
Filed: December 8, 2014
Publication date: March 3, 2016
Inventors: Tyler Bielman, Helen Lam, Michelle Tze Hiang Chua, Shannon Yen Yun Lee, Matthew Nigel Carter, Matthew Davy Walsh, David Seymour, Preetinderpal S. Mangat, William Michael Mozell, Jesse William Wesson, Michael James Mahar, Raymond Alexander Chi-Yue Lum, Michael James Perzel, Cameron David James McRae, Darrin Adam Brown, Dashan Yue, Remus Radu, Eric C. Bridgwater, David T. Ferguson
-
Patent number: D910055
Type: Grant
Filed: June 3, 2019
Date of Patent: February 9, 2021
Assignee: Apple Inc.
Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge
-
Patent number: D944846
Type: Grant
Filed: February 5, 2021
Date of Patent: March 1, 2022
Assignee: Apple Inc.
Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, Kyle Ellington Fisher, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge
-
Patent number: D980259
Type: Grant
Filed: February 28, 2022
Date of Patent: March 7, 2023
Assignee: Apple Inc.
Inventors: Zachary Z. Becker, Adam James Bolton, Michelle Tze Hiang Chua, Peter Justin Dollar, David A. Lipton, Robin-Yann Joram Storm, Eric Gregory Thivierge