Patents by Inventor Sean Shiang-Ning Whelan

Sean Shiang-Ning Whelan has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230152863
    Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
    Type: Application
    Filed: January 20, 2023
    Publication date: May 18, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Scott D. Schenone, Otso Joona Casimir Tuomi, Eduardo Sonnino, Spencer Lee Davis, Sergio Eduardo Rodriguez Virgen, TJ Rhoades, Sean Shiang-Ning Whelan, Tyler White, Peter Eugene Hammerquist, Panos Costa Panay
  • Patent number: 11635874
    Abstract: Methods for pen-specific user interface controls are performed by systems and devices. Users activate pen-specific menus at user devices by activating controls of a touch pen such as a tail button. A communication is received by a device, from a touch pen, that indicates an activation control of the touch pen has been physically activated. The user device selects a touch pen menu that includes selectable menu options for respective launching of separate pen applications from among available menus based at least on the received communication. The device determines menu presentation information specifying a location within a user interface (UI) from a state of the UI at a time associated with the communication. The touch pen menu is displayed via the UI according to the menu presentation information. A detection for selection of a selectable menu option causes the device to launch a pen application associated therewith.
    Type: Grant
    Filed: June 11, 2021
    Date of Patent: April 25, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sean Shiang-Ning Whelan, Scott David Schenone, Young Soo Kim
  • Patent number: 11561587
    Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
    Type: Grant
    Filed: December 18, 2019
    Date of Patent: January 24, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Scott D. Schenone, Otso Joona Casimir Tuomi, Eduardo Sonnino, Spencer Lee Davis, Sergio Eduardo Rodriguez Virgen, T J Rhoades, Sean Shiang-Ning Whelan, Tyler White, Peter Eugene Hammerquist, Panos Costa Panay
  • Publication number: 20220397988
    Abstract: Methods for pen-specific user interface controls are performed by systems and devices. Users activate pen-specific menus at user devices by activating controls of a touch pen such as a tail button. A communication is received by a device, from a touch pen, that indicates an activation control of the touch pen has been physically activated. The user device selects a touch pen menu that includes selectable menu options for respective launching of separate pen applications from among available menus based at least on the received communication. The device determines menu presentation information specifying a location within a user interface (UI) from a state of the UI at a time associated with the communication. The touch pen menu is displayed via the UI according to the menu presentation information. A detection for selection of a selectable menu option causes the device to launch a pen application associated therewith.
    Type: Application
    Filed: June 11, 2021
    Publication date: December 15, 2022
    Inventors: Sean Shiang-Ning Whelan, Scott David Schenone, Young Soo Kim
  • Patent number: 11093100
    Abstract: A virtual reality device can implement varying interactive modes for document viewing and editing by displaying, at an application container level, a current mode view in a view frame of the virtual reality device; and in response to receiving an overview command trigger, determining context, including that the current mode view is at the application container level; expanding to a next level view of, e.g., a task level or an overview level; and displaying, at a next level, the next level view in the view frame of the virtual reality device. The current mode view of the application container level includes a container space of an application and an application container level rule for the container space. Conversely, the virtual reality device can adjust the next level view back to the application container level in response to a focused command trigger and identified region of interest.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: August 17, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael M. Bennett, Gregory C. Hitchcock, Jonathan S. Kaufthal, Akshay Bakshi, Sean Shiang-Ning Whelan
  • Publication number: 20210096611
    Abstract: One example provides a computing device including a first portion including a first display, a second portion including a second display and a camera, the second portion connected to the first portion via a hinge, a hinge angle sensing mechanism including one or more sensors, a logic device, and a storage device holding instructions executable by the logic device to execute a camera application and to receive sensor data from the one or more sensors, based at least in part on the sensor data received from the one or more sensors, determine a device pose, output the camera application to the first display when the device pose is indicative of the camera being world-facing, and output the camera application to the second display when the device pose is indicative of the camera being user-facing.
    Type: Application
    Filed: December 18, 2019
    Publication date: April 1, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Scott D. Schenone, Otso Joona Casimir Tuomi, Eduardo Sonnino, Spencer Lee Davis, Sergio Eduardo Rodriguez Virgen, TJ Rhoades, Sean Shiang-Ning Whelan, Tyler White, Peter Eugene Hammerquist, Panos Costa Panay
  • Publication number: 20190278432
    Abstract: A virtual reality device can implement varying interactive modes for document viewing and editing by displaying, at an application container level, a current mode view in a view frame of the virtual reality device; and in response to receiving an overview command trigger, determining context, including that the current mode view is at the application container level; expanding to a next level view of, e.g., a task level or an overview level; and displaying, at a next level, the next level view in the view frame of the virtual reality device. The current mode view of the application container level includes a container space of an application and an application container level rule for the container space. Conversely, the virtual reality device can adjust the next level view back to the application container level in response to a focused command trigger and identified region of interest.
    Type: Application
    Filed: March 30, 2018
    Publication date: September 12, 2019
    Inventors: Michael M. Bennett, Gregory C. Hitchcock, Jonathan S. Kaufthal, Akshay Bakshi, Sean Shiang-Ning Whelan
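
The hinge-pose camera routing described in patent 11561587 (and its related publications 20230152863 and 20210096611) can be illustrated with a short sketch. This is a minimal, hypothetical rendering of the abstract's logic only, not the patented implementation: the names (`DevicePose`, `infer_pose`, `route_camera_app`) and the 180-degree threshold are illustrative assumptions, not taken from the patent.

```python
from enum import Enum

class DevicePose(Enum):
    # The two poses the abstract distinguishes for the camera.
    CAMERA_WORLD_FACING = "world-facing"
    CAMERA_USER_FACING = "user-facing"

def infer_pose(hinge_angle_deg: float) -> DevicePose:
    """Determine device pose from hinge-angle sensor data.

    Assumption: past ~180 degrees the second portion (which holds the
    camera) folds away from the user, so the camera faces the world.
    """
    if hinge_angle_deg > 180:
        return DevicePose.CAMERA_WORLD_FACING
    return DevicePose.CAMERA_USER_FACING

def route_camera_app(hinge_angle_deg: float) -> str:
    """Pick the display for the camera application, per the abstract:
    world-facing camera -> first display (the one still visible to the
    user); user-facing camera -> second display (next to the camera).
    """
    pose = infer_pose(hinge_angle_deg)
    if pose is DevicePose.CAMERA_WORLD_FACING:
        return "first display"
    return "second display"
```

The point of the sketch is only the two-way mapping from sensed pose to output display; the patent itself covers the broader device (hinged dual displays, sensing mechanism, logic and storage devices) rather than any particular threshold or API.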