Patents by Inventor Sean Shiang-Ning Whelan
Sean Shiang-Ning Whelan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230152863
Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
Type: Application
Filed: January 20, 2023
Publication date: May 18, 2023
Applicant: Microsoft Technology Licensing, LLC
Inventors: Scott D. SCHENONE, Otso Joona Casimir TUOMI, Eduardo SONNINO, Spencer Lee DAVIS, Sergio Eduardo RODRIGUEZ VIRGEN, TJ RHOADES, Sean Shiang-Ning WHELAN, Tyler WHITE, Peter Eugene HAMMERQUIST, Panos Costa PANAY
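The routing logic this abstract describes (select a display for the camera app from a pose inferred from hinge-angle sensor data) can be illustrated with a minimal sketch. The threshold value, function names, and display labels below are assumptions for illustration, not details from the patent:

```python
def select_camera_display(hinge_angle_deg: float) -> str:
    """Route the camera app to a display based on inferred device pose.

    Hypothetical heuristic: when the two portions are folded far back
    (hinge angle near 360 degrees), the camera on the second portion
    points away from the user (world-facing), so the camera app is
    output to the first display; otherwise the camera is user-facing
    and the app is output to the second display.
    """
    WORLD_FACING_THRESHOLD_DEG = 270  # assumed, not from the patent
    if hinge_angle_deg >= WORLD_FACING_THRESHOLD_DEG:
        return "first display"   # camera is world-facing
    return "second display"      # camera is user-facing
```

For example, a fully folded-back device (`select_camera_display(350)`) would route the app to the first display, while a laptop-like posture (`select_camera_display(110)`) would keep it on the second display, next to the camera.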
-
Patent number: 11635874
Abstract: Methods for pen-specific user interface controls are performed by systems and devices. Users activate pen-specific menus at user devices by activating controls of a touch pen such as a tail button. A communication is received by a device, from a touch pen, that indicates an activation control of the touch pen has been physically activated. The user device selects a touch pen menu that includes selectable menu options for respective launching of separate pen applications from among available menus based at least on the received communication. The device determines menu presentation information specifying a location within a user interface (UI) from a state of the UI at a time associated with the communication. The touch pen menu is displayed via the UI according to the menu presentation information. A detection for selection of a selectable menu option causes the device to launch a pen application associated therewith.
Type: Grant
Filed: June 11, 2021
Date of Patent: April 25, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Sean Shiang-Ning Whelan, Scott David Schenone, Young Soo Kim
-
Patent number: 11561587
Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
Type: Grant
Filed: December 18, 2019
Date of Patent: January 24, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Scott D. Schenone, Otso Joona Casimir Tuomi, Eduardo Sonnino, Spencer Lee Davis, Sergio Eduardo Rodriguez Virgen, T J Rhoades, Sean Shiang-Ning Whelan, Tyler White, Peter Eugene Hammerquist, Panos Costa Panay
-
Publication number: 20220397988
Abstract: Methods for pen-specific user interface controls are performed by systems and devices. Users activate pen-specific menus at user devices by activating controls of a touch pen such as a tail button. A communication is received by a device, from a touch pen, that indicates an activation control of the touch pen has been physically activated. The user device selects a touch pen menu that includes selectable menu options for respective launching of separate pen applications from among available menus based at least on the received communication. The device determines menu presentation information specifying a location within a user interface (UI) from a state of the UI at a time associated with the communication. The touch pen menu is displayed via the UI according to the menu presentation information. A detection for selection of a selectable menu option causes the device to launch a pen application associated therewith.
Type: Application
Filed: June 11, 2021
Publication date: December 15, 2022
Inventors: Sean Shiang-Ning WHELAN, Scott David SCHENONE, Young Soo KIM
-
Patent number: 11093100
Abstract: A virtual reality device can implement varying interactive modes for document viewing and editing by displaying, at an application container level, a current mode view in a view frame of the virtual reality device; and in response to receiving an overview command trigger, determining context, including that the current mode view is at the application container level; expanding to a next level view of, e.g., a task level or an overview level; and displaying, at a next level, the next level view in the view frame of the virtual reality device. The current mode view of the application container level includes a container space of an application and an application container level rule for the container space. Conversely, the virtual reality device can adjust the next level view back to the application container level in response to a focused command trigger and identified region of interest.
Type: Grant
Filed: March 30, 2018
Date of Patent: August 17, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michael M. Bennett, Gregory C. Hitchcock, Jonathan S. Kaufthal, Akshay Bakshi, Sean Shiang-Ning Whelan
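The level transitions in this abstract (expand outward from the application container level toward an overview level on an overview command trigger, and return inward on a focused command trigger) can be modeled as movement along an ordered set of view levels. This is an illustrative sketch; the level names and function names are taken loosely from the abstract, and the ordering is an assumption:

```python
# Assumed ordering of view levels, innermost to outermost.
LEVELS = ["application container", "task", "overview"]

def expand(current_level: str) -> str:
    """Handle an overview command trigger: move one level outward."""
    i = LEVELS.index(current_level)
    return LEVELS[min(i + 1, len(LEVELS) - 1)]  # clamp at outermost

def focus(current_level: str) -> str:
    """Handle a focused command trigger: move one level back inward,
    e.g., toward an identified region of interest."""
    i = LEVELS.index(current_level)
    return LEVELS[max(i - 1, 0)]  # clamp at application container level
```

Starting at the application container level, one `expand` reaches the task level and a second reaches the overview level; a subsequent `focus` steps back toward the container view.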
-
Publication number: 20210096611
Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
Type: Application
Filed: December 18, 2019
Publication date: April 1, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Scott D. SCHENONE, Otso Joona Casimir TUOMI, Eduardo SONNINO, Spencer Lee DAVIS, Sergio Eduardo RODRIGUEZ VIRGEN, TJ RHOADES, Sean Shiang-Ning WHELAN, Tyler WHITE, Peter Eugene HAMMERQUIST, Panos Costa PANAY
-
Publication number: 20190278432
Abstract: A virtual reality device can implement varying interactive modes for document viewing and editing by displaying, at an application container level, a current mode view in a view frame of the virtual reality device; and in response to receiving an overview command trigger, determining context, including that the current mode view is at the application container level; expanding to a next level view of, e.g., a task level or an overview level; and displaying, at a next level, the next level view in the view frame of the virtual reality device. The current mode view of the application container level includes a container space of an application and an application container level rule for the container space. Conversely, the virtual reality device can adjust the next level view back to the application container level in response to a focused command trigger and identified region of interest.
Type: Application
Filed: March 30, 2018
Publication date: September 12, 2019
Inventors: Michael M. Bennett, Gregory C. Hitchcock, Jonathan S. Kaufthal, Akshay Bakshi, Sean Shiang-Ning Whelan