Patents by Inventor Yao FONG
Yao FONG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 8636658
Abstract: A surgical retractor includes an outer ring unit, a tubular retraction membrane made of a resilient material and including an inner surface defining a passageway, and a lighting unit. The tubular retraction membrane has an outer tubular end connected to the outer ring unit and rollable about the outer ring unit. The lighting unit is connected to an inner tubular end of the retraction membrane and includes a light-emitting ring, and a heat-dissipation ring in contact with the light-emitting ring.
Type: Grant
Filed: July 10, 2012
Date of Patent: January 28, 2014
Assignee: Chi Mei Medical Center
Inventors: Ying-Chieh Su, Chao-Kun Chen, Yao Fong
-
Patent number: 8611962
Abstract: A method to indicate that a first device is in communication with a second device is disclosed. The first device may receive an indication activity from the second device. The indication activity may change the display and the illumination object on the first device, such that what is shown on the illumination object and on the display is similar.
Type: Grant
Filed: June 29, 2007
Date of Patent: December 17, 2013
Assignee: Microsoft Corporation
Inventors: Anton Oguzhan Alford Andrews, David Walter Proctor, Jeffrey Cheng-Yao Fong, Thamer A. Abanami
-
Patent number: 8589815
Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
Type: Grant
Filed: March 10, 2010
Date of Patent: November 19, 2013
Assignee: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Jeff Arnold, Chris Glein
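The staggering idea this abstract describes can be sketched in a few lines: if several icon animations are requested too close together, push the later ones back until their start times are a minimum gap apart. This is a hedged illustration only; the function name, the gap value, and the push-back policy are my assumptions, not anything taken from the patent itself.

```python
def stagger_starts(requested_starts, min_gap=0.25):
    """Return adjusted start times (seconds) so that no two animations
    begin within `min_gap` of each other; animation order is preserved."""
    adjusted = []
    for start in sorted(requested_starts):
        if adjusted and start - adjusted[-1] < min_gap:
            start = adjusted[-1] + min_gap  # push this animation back
        adjusted.append(start)
    return adjusted
```

For example, three animations requested at 0.0, 0.1, and 0.1 seconds would be rescheduled to 0.0, 0.25, and 0.5 seconds, so no two begin nearly simultaneously.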
-
Publication number: 20130263052
Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
Type: Application
Filed: May 24, 2013
Publication date: October 3, 2013
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
-
Patent number: 8473860
Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
Type: Grant
Filed: June 25, 2010
Date of Patent: June 25, 2013
Assignee: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
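The core of the multi-layer calculation described above is that one horizontal gesture delta drives every layer, each at its own rate, so the layers move parallel to one another but at different speeds (the parallax effect). A minimal sketch of that mapping, where the function name, the rate values, and the pixel units are illustrative assumptions rather than details from the patent:

```python
def layer_offsets(gesture_dx, rates=(1.0, 0.5, 0.25)):
    """Map one horizontal gesture delta (pixels) to a horizontal offset
    per layer; a rate of 1.0 tracks the finger, smaller rates lag it."""
    return [gesture_dx * rate for rate in rates]
```

A 100-pixel drag would then move the three layers by 100, 50, and 25 pixels respectively, with vertical movement inside a single layer handled separately as the abstract notes.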
-
Publication number: 20130053007
Abstract: Because of the small size and mobility of smart phones, and because they are typically hand-held, it is both natural and feasible to use hand, wrist, or arm gestures to communicate commands to the electronic device as if the device were an extension of the user's hand. Some user gestures are detectable by electro-mechanical motion sensors within the circuitry of the smart phone. The sensors can sense a user gesture by detecting a physical change associated with the device, such as motion of the device or a change in orientation. In response, a voice-based or image-based input mode can be triggered based on the gesture. Methods and devices disclosed provide a way to select from among different input modes to a device feature, such as a search, without reliance on manual selection.
Type: Application
Filed: August 24, 2011
Publication date: February 28, 2013
Applicant: Microsoft Corporation
Inventors: Stephen Cosman, Aaron Woo, Jeffrey Cheng-Yao Fong
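The idea of picking an input mode from motion-sensor signals rather than a menu can be illustrated with a toy classifier. Everything here is an assumption for illustration: the signal names, the thresholds, and the mode labels are mine, not the patent's.

```python
def select_input_mode(accel_z, near_ear):
    """Choose an input mode for a feature such as search from simple
    sensor signals: a device raised to the ear suggests voice input,
    a device held flat suggests image/camera input, and anything else
    falls back to the keyboard."""
    if near_ear:                 # e.g., proximity sensor covered
        return "voice"
    if abs(accel_z) > 9.0:       # gravity mostly on z-axis: held flat
        return "camera"
    return "keyboard"
```

The point of the sketch is only the dispatch structure: sensor readings in, an input mode out, with no manual selection step.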
-
Publication number: 20130042199
Abstract: Disclosed herein are tools and techniques for using a single-finger single touch to zoom content. In one embodiment disclosed herein, a single-finger single touch on a touch screen displaying at least a page of content is detected. At least in response to the detecting the single-finger single touch, a page zoom is performed.
Type: Application
Filed: August 10, 2011
Publication date: February 14, 2013
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Jeffery G. Arnold, Liang Chen, Neil Kronlage
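One common way a single touch drives a page zoom is a toggle: zoom the touched content block to fill the viewport, or return to the normal zoom if it already does. This is a hedged sketch of that general pattern, not the patent's embodiment; the function name, the toggle behavior, and the tolerance are my assumptions.

```python
def toggle_page_zoom(current_zoom, block_width, viewport_width):
    """Return the new zoom factor after a single-finger single touch:
    zoom so the touched block spans the viewport width, or drop back
    to 1.0 if the page is already (near) that zoom."""
    target = viewport_width / block_width
    if abs(current_zoom - target) < 0.01:
        return 1.0  # already zoomed to this block; zoom back out
    return target
```

A 320-pixel-wide text column in a 640-pixel viewport would zoom to 2x on the first touch and back to 1x on the second.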
-
Publication number: 20130018230
Abstract: A surgical retractor includes an outer ring unit, a tubular retraction membrane made of a resilient material and including an inner surface defining a passageway, and a lighting unit. The tubular retraction membrane has an outer tubular end connected to the outer ring unit and rollable about the outer ring unit. The lighting unit is connected to an inner tubular end of the retraction membrane and includes a light-emitting ring, and a heat-dissipation ring in contact with the light-emitting ring.
Type: Application
Filed: July 10, 2012
Publication date: January 17, 2013
Applicant: Chi Mei Medical Center
Inventors: Ying-Chieh Su, Chao-Kun Chen, Yao Fong
-
Patent number: 8319772
Abstract: Techniques and tools are described for rendering views of a map in which map metadata elements are layered in 3D space through which a viewer navigates. Layering of metadata elements such as text labels in 3D space facilitates parallax and smooth motion effects for zoom-in, zoom-out and scrolling operations during map navigation. A computing device can determine a viewer position that is associated with a view altitude in 3D space, then render for display a map view based upon the viewer position and metadata elements layered at different metadata altitudes in 3D space. For example, the computing device places text labels in 3D space above features associated with the respective labels, at the metadata altitudes indicated for the respective labels. The computing device creates a map view from points of the placed labels and points of a surface layer of the map that are visible from the viewer position.
Type: Grant
Filed: July 23, 2010
Date of Patent: November 27, 2012
Assignee: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett, Eric Neal Braff
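Why a label floating at a metadata altitude produces parallax follows from simple perspective: a label closer to the viewer appears larger and shifts more during scrolling than the ground surface below it. A minimal sketch of that geometry under a pinhole-style assumption (the formula and names are mine, not from the patent), for a viewer looking straight down from `viewer_alt` at a label at `label_alt` above a ground plane at altitude 0:

```python
def label_scale(viewer_alt, label_alt):
    """Apparent scale of a floating label relative to ground-plane
    content, for a top-down viewer; requires viewer_alt > label_alt."""
    return viewer_alt / (viewer_alt - label_alt)
```

From 1000 units up, a label at altitude 500 appears twice as large (and moves twice as fast on screen) as ground-level detail, which is the smooth parallax cue the abstract describes.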
-
Patent number: 8239783
Abstract: A user interface can display active and passive content. For example, a camera viewfinder image can be displayed on a screen, as part of a strip, concatenated with one or more other images, for example, images that were previously taken with the camera. A user can cause the viewfinder image and the other images to move together across the screen. This can allow a user to easily examine the other images and the viewfinder image without, for example, switching between different screens in a user interface. Media captured with a device can be associated with a media category by positioning a user interface element near one or more other elements associated with the category.
Type: Grant
Filed: February 15, 2010
Date of Patent: August 7, 2012
Assignee: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett
-
Publication number: 20120185788
Abstract: A method and system are disclosed for displaying a user interface text element in an East-Asian mode so that system-based text can be displayed vertically on a user interface. In one embodiment, a device can dynamically switch between a Latin-based layout (horizontally displayed text elements) and an East-Asian based layout (vertically displayed text elements) based on global device settings, such as a language setting or a locale setting. Such settings can be dynamically modified by the user to change the display modes.
Type: Application
Filed: June 1, 2011
Publication date: July 19, 2012
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Kenji Nakamura, Daniel J. Hwang, Lee Dicks Clark, Jeffery G. Arnold
-
Publication number: 20120019513
Abstract: Techniques and tools are described for rendering views of a map in which map metadata elements are layered in 3D space through which a viewer navigates. Layering of metadata elements such as text labels in 3D space facilitates parallax and smooth motion effects for zoom-in, zoom-out and scrolling operations during map navigation. A computing device can determine a viewer position that is associated with a view altitude in 3D space, then render for display a map view based upon the viewer position and metadata elements layered at different metadata altitudes in 3D space. For example, the computing device places text labels in 3D space above features associated with the respective labels, at the metadata altitudes indicated for the respective labels. The computing device creates a map view from points of the placed labels and points of a surface layer of the map that are visible from the viewer position.
Type: Application
Filed: July 23, 2010
Publication date: January 26, 2012
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett, Eric Neal Braff
-
Publication number: 20110239153
Abstract: A pointer tool in a touch-screen display is disclosed. The method includes activating a pointer tool in a touch screen display in response to contact with an area of the touch screen and persisting the displaying of the pointer tool in the touch screen display after the contact with the touch screen is removed. Once editing data is received, the pointer tool is removed from the touch screen display.
Type: Application
Filed: March 24, 2010
Publication date: September 29, 2011
Applicant: Microsoft Corporation
Inventors: Benjamin F. Carter, Priyanka Singhal, Shawna Julie Davis, Tirthankar Sengupta, Jeffrey Cheng-Yao Fong, Ryan Terry Bickel, Peter Gregory Davis
-
Publication number: 20110225543
Abstract: A user interface is described that temporarily displays portions of a page that reside outside of the viewable area of the screen. An animated transition creates a brief preview of at least one user interface feature. Additionally, the user interface feature is then animated out of the viewable area in a way to suggest a location of the feature. In one embodiment, a target page that is being opened controls the transition and animates features into and out of the viewable area to create the temporary preview. In another embodiment, the target page includes user interface elements that can asynchronously control the preview animation independent from the main content of the target page. In yet another embodiment, a transition coordinator can coordinate the timing between animating out a foreground application while animating in a target application.
Type: Application
Filed: March 10, 2010
Publication date: September 15, 2011
Applicant: Microsoft Corporation
Inventors: Jeff Arnold, Jeffrey Cheng-Yao Fong, Pritesh Bodalia, Jeffrey L. Bogdan, Lee Dicks Clark
-
Publication number: 20110225547
Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
Type: Application
Filed: March 10, 2010
Publication date: September 15, 2011
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Jeff Arnold, Chris Glein
-
Publication number: 20110202834
Abstract: Aspects of a user interface that provides visual feedback in response to user input. For example, boundary effects are presented to provide visual cues to a user to indicate that a boundary in a movable user interface element (e.g., the end of a scrollable list) has been reached. As another example, parallax effects are presented in which multiple parallel or substantially parallel layers in a multi-layer user interface move at different rates, in response to user input. As another example, simulated inertia motion of UI elements is used to provide a more natural feel for touch input. Various combinations of features are described. For example, simulated inertia motion can be used in combination with parallax effects, boundary effects, or other types of visual feedback.
Type: Application
Filed: May 4, 2010
Publication date: August 18, 2011
Applicant: Microsoft Corporation
Inventors: Luciano Baretta Mandryk, Jeffrey Cheng-Yao Fong
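Simulated inertia, one of the feedback effects this abstract mentions, is commonly implemented as a velocity that decays by a friction factor each frame after the finger lifts. A minimal sketch under that assumption; the constants and names are illustrative, not taken from the patent.

```python
def glide(position, velocity, friction=0.95, stop_speed=0.5):
    """Advance a scroll position frame by frame after touch release,
    multiplying the velocity by `friction` each frame until it drops
    below `stop_speed`; returns the resting position."""
    while abs(velocity) >= stop_speed:
        position += velocity
        velocity *= friction
    return position
```

With a strong friction of 0.5 and a release velocity of 100 px/frame, the list coasts to rest just under 200 px past the release point, which gives the gradual, natural slow-down the abstract aims for.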
-
Publication number: 20110202837
Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
Type: Application
Filed: June 25, 2010
Publication date: August 18, 2011
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
-
Publication number: 20110202859
Abstract: Techniques and tools are described that relate to different aspects of a user interface in which visual distortion effects are presented to provide visual cues to a user to indicate a location in a movable data collection (e.g., a scrollable list, an email message, a content layer, etc.). For example, in response to a user gesture on a touchscreen, a user interface system presents a portion of a list or layer in a visually distorted state, such as a "squished," squeezed or compressed state in which text, images or other content is shown to be smaller than normal in one or more dimensions, to indicate to a user that the end of a list has been reached.
Type: Application
Filed: March 10, 2010
Publication date: August 18, 2011
Applicant: Microsoft Corporation
Inventor: Jeffrey Cheng-Yao Fong
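The "squish" cue above amounts to a scale factor applied along the scroll axis that grows with how far the user drags past the end of the list. A hedged sketch of one plausible falloff; the cap, the falloff constant, and the names are my assumptions for illustration, not the patent's.

```python
def squish_factor(overscroll, max_squish=0.4, falloff=200.0):
    """Scale factor (<= 1.0) to apply to list content along the scroll
    axis; returns 1.0 with no overscroll and approaches 1 - max_squish
    asymptotically as the overscroll distance (pixels) grows."""
    if overscroll <= 0:
        return 1.0
    return 1.0 - max_squish * (overscroll / (overscroll + falloff))
```

Drawing the visible list items with this factor makes content near the boundary look progressively more compressed the harder the user pulls, signaling that the end has been reached without a hard stop.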
-
Publication number: 20110199318
Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. For example, the first movement rate can be substantially equal to the movement rate of a gesture made by a user's finger or other object on the touchscreen. The UI system calculates movements of other layers substantially parallel to the movement of the first layer, at movement rates that differ from the first movement rate.
Type: Application
Filed: June 25, 2010
Publication date: August 18, 2011
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
-
Publication number: 20110119619
Abstract: A user interface can display active and passive content. For example, a camera viewfinder image can be displayed on a screen, as part of a strip, concatenated with one or more other images, for example, images that were previously taken with the camera. A user can cause the viewfinder image and the other images to move together across the screen. This can allow a user to easily examine the other images and the viewfinder image without, for example, switching between different screens in a user interface. Media captured with a device can be associated with a media category by positioning a user interface element near one or more other elements associated with the category.
Type: Application
Filed: February 15, 2010
Publication date: May 19, 2011
Applicant: Microsoft Corporation
Inventors: Jeffrey Cheng-Yao Fong, Donald Allen Barnett