Patents by Inventor Jeffrey Cheng-yao Fong

Jeffrey Cheng-yao Fong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9669298
    Abstract: A method to indicate that a first device is in communication with a second device is disclosed. The first device may receive an indication activity from the second device. The indication activity may change the display and the illumination object on the first device, and the indications shown on the illumination object and on the display are similar.
    Type: Grant
    Filed: November 22, 2013
    Date of Patent: June 6, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anton Oguzhan Alford Andrews, David Walter Proctor, Jeffrey Cheng-Yao Fong, Thamer A. Abanami
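    Illustrative sketch: a minimal Python sketch, under assumed names, of the idea in this entry's abstract, where an indication received from a second device drives both the display and an illumination object so the two outputs appear similar; it is not the patented implementation.
      # Stub outputs standing in for a screen and an illumination object (e.g., an LED).
      class Display:
          def show(self, treatment):
              print("display:", treatment)

      class Illumination:
          def show(self, treatment):
              print("illumination:", treatment)

      def on_indication(display, illumination, indication):
          # Derive one visual treatment from the received indication activity...
          treatment = {"color": indication.get("color", "white"),
                       "pulse_hz": indication.get("pulse_hz", 1.0)}
          # ...and apply it to both outputs so they look similar.
          display.show(treatment)
          illumination.show(treatment)

      on_indication(Display(), Illumination(), {"color": "green", "pulse_hz": 2.0})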
  • Patent number: 9417787
    Abstract: Techniques and tools are described that relate to different aspects of a user interface in which visual distortion effects are presented to provide visual cues to a user to indicate a location in a movable data collection (e.g., a scrollable list, an email message, a content layer, etc.). For example, in response to a user gesture on a touchscreen, a user interface system presents a portion of a list or layer in a visually distorted state, such as a “squished,” squeezed or compressed state in which text, images or other content is shown to be smaller than normal in one or more dimensions, to indicate to a user that the end of a list has been reached.
    Type: Grant
    Filed: March 10, 2010
    Date of Patent: August 16, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Jeffrey Cheng-Yao Fong
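    Illustrative sketch: a minimal Python sketch, under assumed names (squished_heights, max_squish), of the "squish" cue described in this entry's abstract, where list items are compressed in one dimension in proportion to how far the user drags past the end of the list; not the patented implementation.
      def squished_heights(item_heights, overscroll_px, max_squish=0.35):
          """Return item heights scaled down in proportion to the overscroll."""
          # The further the drag goes past the end of the list, the stronger the
          # compression, clamped so items never shrink below (1 - max_squish).
          squish = min(max_squish, overscroll_px / 400.0)
          scale = 1.0 - squish
          return [h * scale for h in item_heights]

      print(squished_heights([40, 40, 40], overscroll_px=120))  # -> [28.0, 28.0, 28.0]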
  • Patent number: 9418464
    Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
    Type: Grant
    Filed: October 21, 2013
    Date of Patent: August 16, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Christopher A. Glein
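    Illustrative sketch: a minimal Python sketch, under assumed names (stagger_starts, min_gap), of the staggering idea in this entry's abstract: if several tile animations would start at nearly the same time, later ones are delayed so no two starts fall too close together; not the patented implementation.
      def stagger_starts(requested_starts, min_gap=0.25):
          """Return adjusted start times at least min_gap seconds apart."""
          adjusted = []
          for t in sorted(requested_starts):
              if adjusted and t - adjusted[-1] < min_gap:
                  t = adjusted[-1] + min_gap   # push this animation's start back
              adjusted.append(t)
          return adjusted

      print(stagger_starts([0.0, 0.0, 0.1, 1.0]))  # -> [0.0, 0.25, 0.5, 1.0]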
  • Patent number: 9317196
    Abstract: Disclosed herein are tools and techniques for using a single-finger single touch to zoom content. In one embodiment disclosed herein, a single-finger single touch on a touch screen displaying at least a page of content is detected. At least in response to the detecting the single-finger single touch, a page zoom is performed.
    Type: Grant
    Filed: August 10, 2011
    Date of Patent: April 19, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Liang Chen, Neil Kronlage
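    Illustrative sketch: a minimal Python sketch, under assumed names (page_zoom, factor), of a page zoom performed in response to a single-finger single touch, keeping the touched point fixed on screen; not the patented implementation.
      def page_zoom(viewport, touch_x, touch_y, factor=2.0):
          """Return a new (x, y, width, height) viewport zoomed about the touch point."""
          x, y, w, h = viewport
          new_w, new_h = w / factor, h / factor
          # Keep the touched content point at the same relative screen position.
          new_x = touch_x - (touch_x - x) / factor
          new_y = touch_y - (touch_y - y) / factor
          return (new_x, new_y, new_w, new_h)

      print(page_zoom((0, 0, 800, 1200), touch_x=400, touch_y=600))
      # -> (200.0, 300.0, 400.0, 600.0)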
  • Patent number: 9292161
    Abstract: A pointer tool for a touch-screen display is disclosed. The method includes activating the pointer tool in the touch screen display in response to contact with an area of the touch screen, and persisting the display of the pointer tool after the contact with the touch screen is removed. Once editing data is received, the pointer tool is removed from the touch screen display.
    Type: Grant
    Filed: March 24, 2010
    Date of Patent: March 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Benjamin F. Carter, Priyanka Singhal, Shawna Julie Davis, Tirthankar Sengupta, Jeffrey Cheng-Yao Fong, Ryan Terry Bickel, Peter Gregory Davis
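    Illustrative sketch: a minimal Python sketch, under assumed names, of the pointer-tool lifecycle in this entry's abstract: it appears on contact, persists after the finger lifts, and is removed once editing data arrives; not the patented implementation.
      class PointerTool:
          def __init__(self):
              self.visible = False

          def on_touch_down(self, x, y):
              self.visible = True     # activate the pointer tool at the contact point
              self.position = (x, y)

          def on_touch_up(self):
              pass                    # persist: stay visible after contact is removed

          def on_edit_input(self, text):
              self.visible = False    # remove the pointer tool once editing data arrives

      tool = PointerTool()
      tool.on_touch_down(120, 300); tool.on_touch_up()
      print(tool.visible)             # True: still shown after the finger lifts
      tool.on_edit_input("a")
      print(tool.visible)             # False: removed after editing input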
  • Publication number: 20150163341
    Abstract: Various technologies for managing mobile device communications can be offered to implement a virtual personal operator. Incoming calls and texts can be managed intelligently based on a rich network-stored context, allowing the network to make decisions and interact with callers. Because context is stored by the network, the virtual personal operator can function without contacting the called mobile phone, and can even provide helpful information to callers if the mobile phone is offline. Rich do-not-disturb functionality can be provided, and privileged callers can be given additional information or functionality based on their privileged status. Numerous other features that assist with communications management can be supported.
    Type: Application
    Filed: December 10, 2013
    Publication date: June 11, 2015
    Inventors: John Skovron, Krishnan Ananthanarayanan, Jeffrey Cheng-Yao Fong, Eric Jonathan Hull, Reid Kuhn, David E. Lemson, Ganapathy Raman, Mahendra Sekaran, Lavanya Vasudevan, Aaron Woo, Kerry D. Woolsey, Aaron Woodman
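    Illustrative sketch: a minimal Python sketch, under assumed names (handle_incoming_call, do_not_disturb, privileged), of a network-side decision made from stored context without contacting the called phone, as described in this entry's abstract; not the published implementation.
      def handle_incoming_call(caller, context):
          """Return an action chosen from network-stored context for the callee."""
          if context.get("do_not_disturb") and caller not in context.get("privileged", set()):
              return "send_to_voicemail"        # honor do-not-disturb for ordinary callers
          if not context.get("phone_online", True):
              return "play_status_message"      # phone offline: still inform the caller
          return "ring_phone"

      ctx = {"do_not_disturb": True, "privileged": {"alice"}, "phone_online": False}
      print(handle_incoming_call("bob", ctx))    # -> send_to_voicemail
      print(handle_incoming_call("alice", ctx))  # -> play_status_message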
  • Publication number: 20140098108
    Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
    Type: Application
    Filed: October 21, 2013
    Publication date: April 10, 2014
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Christopher A. Glein
  • Patent number: 8681149
    Abstract: Techniques and tools are described for rendering views of a map in which map metadata elements are layered in 3D space through which a viewer navigates. Layering of metadata elements such as text labels in 3D space facilitates parallax and smooth motion effects for zoom-in, zoom-out and scrolling operations during map navigation. A computing device can determine a viewer position that is associated with a view altitude in 3D space, then render for display a map view based upon the viewer position and metadata elements layered at different metadata altitudes in 3D space. For example, the computing device places text labels in 3D space above features associated with the respective labels, at the metadata altitudes indicated for the respective labels. The computing device creates a map view from points of the placed labels and points of a surface layer of the map that are visible from the viewer position.
    Type: Grant
    Filed: November 21, 2012
    Date of Patent: March 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Donald A. Barnett, Eric Neal Braff
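    Illustrative sketch: a minimal Python sketch, under assumed names, of why labels layered at different altitudes produce parallax: a simple perspective projection makes labels closer to the viewer shift more on screen for the same viewer movement; not the patented rendering pipeline.
      def project(point_x, point_alt, viewer_x, viewer_alt, focal=1.0):
          """Project a labeled point onto the view plane of a downward-looking viewer."""
          depth = viewer_alt - point_alt         # higher-altitude labels are closer
          return focal * (point_x - viewer_x) / depth

      # Two labels above the same map feature, layered at different altitudes:
      for label_alt in (0.0, 500.0):
          before = project(1000.0, label_alt, viewer_x=0.0, viewer_alt=2000.0)
          after = project(1000.0, label_alt, viewer_x=200.0, viewer_alt=2000.0)
          print(f"altitude {label_alt}: screen shift {before - after:.4f}")
      # The higher (closer) label shifts more, giving the parallax cue.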
  • Publication number: 20140080604
    Abstract: A method to indicate that a first device is in communication with a second device is disclosed. The first device may receive an indication activity from the second device. The indication activity may change the display and the illumination object on the first device, and the indications shown on the illumination object and on the display are similar.
    Type: Application
    Filed: November 22, 2013
    Publication date: March 20, 2014
    Applicant: Microsoft Corporation
    Inventors: Anton Oguzhan Alford Andrews, David Walter Proctor, Jeffrey Cheng-Yao Fong, Thamer A. Abanami
  • Patent number: 8650501
    Abstract: A user interface is described that temporarily displays portions of a page that reside outside of the viewable area of the screen. An animated transition creates a brief preview of at least one user interface feature. Additionally, the user interface feature is then animated out of the viewable area in a way to suggest a location of the feature. In one embodiment, a target page that is being opened controls the transition and animates features into and out of the viewable area to create the temporary preview. In another embodiment, the target page includes user interface elements that can asynchronously control the preview animation independent from the main content of the target page. In yet another embodiment, a transition coordinator can coordinate the timing between animating out a foreground application while animating in a target application.
    Type: Grant
    Filed: March 10, 2010
    Date of Patent: February 11, 2014
    Assignee: Microsoft Corporation
    Inventors: Jeff Arnold, Jeffrey Cheng-Yao Fong, Pritesh Bodalia, Jeffrey L. Bogdan, Lee Dicks Clark
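    Illustrative sketch: a minimal Python sketch, under assumed names (preview_offsets, peek_px), of the temporary preview in this entry's abstract: an off-screen feature briefly slides into the viewable area and then back out toward where it lives; not the patented implementation.
      def preview_offsets(screen_width, peek_px=80, frames=6):
          """Return x offsets that peek a feature in from the right edge, then return it."""
          peek_x = screen_width - peek_px
          path_in = [screen_width + (peek_x - screen_width) * i / frames
                     for i in range(frames + 1)]
          return path_in + list(reversed(path_in))[1:]   # slide in, then back out

      print([round(x) for x in preview_offsets(480)])
      # -> [480, 467, 453, 440, 427, 413, 400, 413, 427, 440, 453, 467, 480]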
  • Patent number: 8611962
    Abstract: A method to indicate that a first device is in communication with a second device is disclosed. The first device may receive an indication activity from the second device. The indication activity may change the display and the illumination object on the first device, and the indications shown on the illumination object and on the display are similar.
    Type: Grant
    Filed: June 29, 2007
    Date of Patent: December 17, 2013
    Assignee: Microsoft Corporation
    Inventors: Anton Oguzhan Alford Andrews, David Walter Proctor, Jeffrey Cheng-yao Fong, Thamer A. Abanami
  • Patent number: 8589815
    Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
    Type: Grant
    Filed: March 10, 2010
    Date of Patent: November 19, 2013
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Jeff Arnold, Chris Glein
  • Publication number: 20130263052
    Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
    Type: Application
    Filed: May 24, 2013
    Publication date: October 3, 2013
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
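    Illustrative sketch: a minimal Python sketch, under assumed names (apply_drag, rate), of the layered movement in this entry's abstract: a horizontal drag moves each layer at its own rate, while a vertical drag scrolls within a UI element on one layer; not the published implementation.
      def apply_drag(layers, dx, dy):
          """Update layer offsets for a drag of (dx, dy) pixels."""
          for layer in layers:
              layer["x"] += dx * layer["rate"]   # parallel movement at per-layer rates
          layers[0]["list_y"] += dy              # orthogonal scroll inside a list element

      layers = [{"name": "content", "rate": 1.0, "x": 0.0, "list_y": 0.0},
                {"name": "background", "rate": 0.3, "x": 0.0, "list_y": 0.0}]
      apply_drag(layers, dx=-100, dy=-20)
      print(layers[0]["x"], layers[1]["x"], layers[0]["list_y"])  # -> -100.0 -30.0 -20.0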
  • Patent number: 8473860
    Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
    Type: Grant
    Filed: June 25, 2010
    Date of Patent: June 25, 2013
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
  • Publication number: 20130053007
    Abstract: Because of the small size and mobility of smart phones, and because they are typically hand-held, it is both natural and feasible to use hand, wrist, or arm gestures to communicate commands to the electronic device as if the device were an extension of the user's hand. Some user gestures are detectable by electro-mechanical motion sensors within the circuitry of the smart phone. The sensors can sense a user gesture by detecting a physical change associated with the device, such as motion of the device or a change in orientation. In response, a voice-based or image-based input mode can be triggered based on the gesture. Methods and devices disclosed provide a way to select from among different input modes to a device feature, such as a search, without reliance on manual selection.
    Type: Application
    Filed: August 24, 2011
    Publication date: February 28, 2013
    Applicant: Microsoft Corporation
    Inventors: Stephen Cosman, Aaron Woo, Jeffrey Cheng-Yao Fong
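    Illustrative sketch: a minimal Python sketch, under assumed names and thresholds (pitch_deg, steady), of selecting an input mode from sensed motion instead of a manual choice, as this entry's abstract describes; not the published implementation.
      def select_input_mode(pitch_deg, steady):
          """Map a sensed orientation/motion state to an input mode for a search feature."""
          if pitch_deg > 60 and not steady:
              return "voice"      # phone lifted toward the ear while still moving
          if 30 <= pitch_deg <= 60 and steady:
              return "image"      # phone held up steadily, as if framing a photo
          return "text"           # no triggering gesture detected

      print(select_input_mode(pitch_deg=75, steady=False))  # -> voice
      print(select_input_mode(pitch_deg=45, steady=True))   # -> image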
  • Publication number: 20130042199
    Abstract: Disclosed herein are tools and techniques for using a single-finger single touch to zoom content. In one embodiment disclosed herein, a single-finger single touch on a touch screen displaying at least a page of content is detected. At least in response to the detecting the single-finger single touch, a page zoom is performed.
    Type: Application
    Filed: August 10, 2011
    Publication date: February 14, 2013
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Liang Chen, Neil Kronlage
  • Patent number: 8319772
    Abstract: Techniques and tools are described for rendering views of a map in which map metadata elements are layered in 3D space through which a viewer navigates. Layering of metadata elements such as text labels in 3D space facilitates parallax and smooth motion effects for zoom-in, zoom-out and scrolling operations during map navigation. A computing device can determine a viewer position that is associated with a view altitude in 3D space, then render for display a map view based upon the viewer position and metadata elements layered at different metadata altitudes in 3D space. For example, the computing device places text labels in 3D space above features associated with the respective labels, at the metadata altitudes indicated for the respective labels. The computing device creates a map view from points of the placed labels and points of a surface layer of the map that are visible from the viewer position.
    Type: Grant
    Filed: July 23, 2010
    Date of Patent: November 27, 2012
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett, Eric Neal Braff
  • Patent number: 8239783
    Abstract: A user interface can display active and passive content. For example, a camera viewfinder image can be displayed on a screen, as part of a strip, concatenated with one or more other images, for example, images that were previously taken with the camera. A user can cause the viewfinder image and the other images to move together across the screen. This can allow a user to easily examine the other images and the viewfinder image without, for example, switching between different screens in a user interface. Media captured with a device can be associated with a media category by positioning a user interface element near one or more other elements associated with the category.
    Type: Grant
    Filed: February 15, 2010
    Date of Patent: August 7, 2012
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett
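    Illustrative sketch: a minimal Python sketch, under assumed names (strip_layout, frame_width), of the strip in this entry's abstract, where the live viewfinder is concatenated after previously captured images and the whole strip scrolls together; not the patented implementation.
      def strip_layout(captured, frame_width, scroll_x):
          """Return (name, left_edge) for each element of the strip after scrolling."""
          elements = captured + ["viewfinder"]   # viewfinder concatenated at the end
          return [(name, i * frame_width - scroll_x) for i, name in enumerate(elements)]

      print(strip_layout(["img1.jpg", "img2.jpg"], frame_width=480, scroll_x=480))
      # -> [('img1.jpg', -480), ('img2.jpg', 0), ('viewfinder', 480)]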
  • Publication number: 20120185788
    Abstract: A method and system are disclosed for displaying a user interface text element in an East-Asian mode so that system-based text can be displayed vertically on a user interface. In one embodiment, a device can dynamically switch between a Latin-based layout (horizontally displayed text elements) and an East-Asian based layout (vertically displayed text elements) based on global device settings, such as a language setting or a locale setting. Such settings can be dynamically modified by the user to change the display modes.
    Type: Application
    Filed: June 1, 2011
    Publication date: July 19, 2012
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Kenji Nakamura, Daniel J. Hwang, Lee Dicks Clark, Jeffery G. Arnold
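    Illustrative sketch: a minimal Python sketch, under assumed names and an illustrative locale set, of switching UI text between horizontal and vertical layout from global device settings, as this entry's abstract describes; not the published implementation.
      VERTICAL_LOCALES = {"ja-JP", "zh-TW"}   # illustrative assumption, not an exhaustive list

      def text_orientation(settings):
          """Return 'vertical' for an East-Asian layout, 'horizontal' otherwise."""
          locale = settings.get("locale") or settings.get("language", "")
          return "vertical" if locale in VERTICAL_LOCALES else "horizontal"

      print(text_orientation({"locale": "ja-JP"}))  # -> vertical
      print(text_orientation({"locale": "en-US"}))  # -> horizontal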
  • Publication number: 20120019513
    Abstract: Techniques and tools are described for rendering views of a map in which map metadata elements are layered in 3D space through which a viewer navigates. Layering of metadata elements such as text labels in 3D space facilitates parallax and smooth motion effects for zoom-in, zoom-out and scrolling operations during map navigation. A computing device can determine a viewer position that is associated with a view altitude in 3D space, then render for display a map view based upon the viewer position and metadata elements layered at different metadata altitudes in 3D space. For example, the computing device places text labels in 3D space above features associated with the respective labels, at the metadata altitudes indicated for the respective labels. The computing device creates a map view from points of the placed labels and points of a surface layer of the map that are visible from the viewer position.
    Type: Application
    Filed: July 23, 2010
    Publication date: January 26, 2012
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett, Eric Neal Braff