Patents by Inventor Steven Bathiche

Steven Bathiche has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150153855
    Abstract: Described embodiments include an apparatus and a method. In an apparatus, a tracking circuit detects a segment of a path defined by a user contact point moving across a touch sensitive display. A filter predicts a next contiguous segment of the path defined by the user contact point in response to an adaptively learned motion parameter. The adaptively learned motion parameter is based on at least two previously determined motion parameters, each descriptive of the motion of the user contact point during its movement across the touch sensitive display. A compensation circuit initiates a display by the touch sensitive display of the detected segment of the path and the predicted next contiguous segment of the path. An updating circuit updates the detected segment of the path and the predicted next contiguous segment of the path as the user contact point moves across the touch sensitive display.
    Type: Application
    Filed: December 3, 2013
    Publication date: June 4, 2015
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny A. Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, Jr., Victoria Y. H. Wood
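
The abstract describes a predict-and-correct loop for hiding touch latency. As a rough illustration only (not the patented circuits), the sketch below learns a velocity estimate from past path segments and extrapolates one step ahead; the blending weight ALPHA and the lookahead DT are assumed values.

```python
class PathPredictor:
    """Toy predictor: learn a velocity from past segments, extrapolate ahead."""

    ALPHA = 0.5   # assumed blending weight for the adaptively learned parameter
    DT = 0.016    # assumed lookahead, roughly one 60 Hz frame, in seconds

    def __init__(self):
        self.vx = self.vy = 0.0   # learned velocity estimate (units/s)
        self.last = None          # last observed contact point (x, y, t)

    def observe(self, x, y, t):
        """Feed one detected point of the path; update the learned parameter."""
        if self.last is not None:
            lx, ly, lt = self.last
            dt = t - lt
            if dt > 0:
                # Blend the newest instantaneous velocity into the running
                # estimate, so it reflects at least two previous instances.
                self.vx = (1 - self.ALPHA) * self.vx + self.ALPHA * (x - lx) / dt
                self.vy = (1 - self.ALPHA) * self.vy + self.ALPHA * (y - ly) / dt
        self.last = (x, y, t)

    def predict(self):
        """Predict the next contiguous point of the path, DT seconds ahead."""
        if self.last is None:
            return None
        x, y, _ = self.last
        return (x + self.vx * self.DT, y + self.vy * self.DT)
```

Calling observe() for each digitizer sample, drawing both the detected path and predict()'s output, and then revising on the next sample mirrors the compensation and updating steps the abstract names.
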
  • Publication number: 20150153898
    Abstract: Described embodiments include an apparatus and a method. In an apparatus, a tracking circuit detects a segment of a path defined by a user contact point moving across a touch sensitive display. An analysis circuit determines a parameter descriptive of a motion of the user contact point during the detected segment. A selection circuit selects a time-interval forecasted to improve a correspondence between a predicted next segment of the path and a subsequently detected next segment of the path. In response to the motion parameter and the selected time-interval, a filter predicts a next segment of the path. A compensation circuit initiates a display of the detected segment of the path and the predicted next segment of the path. An updating circuit initiates an update of the detected segment of the path and the predicted next segment of the path as the user contact point moves across the display.
    Type: Application
    Filed: December 3, 2013
    Publication date: June 4, 2015
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny A. Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, Jr., Victoria Y. H. Wood
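
What distinguishes this filing from the previous one is selecting the prediction horizon itself. One plausible reading, sketched below with assumed candidate intervals and a decayed error score (neither taken from the filing): keep a running accuracy record per candidate interval and predict with whichever has recently matched the detected path best.

```python
CANDIDATES = [0.008, 0.016, 0.032]   # assumed candidate time-intervals (s)
DECAY = 0.9                          # assumed smoothing for the error scores

def update_errors(errors, predictions, actual):
    """Score each candidate interval by how far the prediction it produced
    earlier landed from the point actually detected now."""
    ax, ay = actual
    for dt, (px, py) in predictions.items():
        miss = ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
        errors[dt] = DECAY * errors.get(dt, 0.0) + (1 - DECAY) * miss
    return errors

def choose_interval(errors):
    """Select the time-interval forecasted to improve the correspondence
    between predicted and subsequently detected segments."""
    return min(CANDIDATES, key=lambda dt: errors.get(dt, 0.0))

errors = update_errors({}, {0.008: (10, 10), 0.016: (11, 10), 0.032: (13, 11)},
                       (11, 10))
print(choose_interval(errors))   # -> 0.016, the interval that matched best
```
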
  • Patent number: 9047002
    Abstract: An electronic device may include a touch screen electronic display configured to offset and/or shift the contact locations of touch implements and/or displayed content based on one or more calculated parallax values. The parallax values may be associated with the viewing angle of an operator relative to the display of the electronic device. In various embodiments, the parallax value(s) may be calculated using three-dimensional location sensors, an angle of inclination of a touch implement, and/or one or more displayed calibration objects. Parallax values may be utilized to remap contact locations by a touch implement, shift and/or offset displayed content, and/or perform other transformations as described herein. Stereoscopically displayed content may be offset such that a default display plane is coplanar with a touch surface rather than a display surface. Contacts by a finger may be remapped using portions of the contact region and/or a centroid of the contact region.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: June 2, 2015
    Assignee: Elwha LLC
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Victoria Y. H. Wood, Lowell L. Wood, Jr.
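
The underlying correction is simple geometry: when the touch surface sits a small gap above the display plane, an oblique view displaces the apparent target by roughly gap × tan(viewing angle). A minimal sketch under that assumption; the filing also covers richer inputs (3D location sensors, stylus inclination, calibration objects) not modeled here.

```python
import math

GAP_MM = 1.2  # assumed distance between touch surface and display plane

def parallax_offset(azimuth_rad, inclination_rad):
    """Apparent (dx, dy) shift, in mm, between the pixel a user aims at and
    where their touch lands, for a given oblique viewing direction."""
    r = GAP_MM * math.tan(inclination_rad)   # grows as the view gets more oblique
    return (r * math.cos(azimuth_rad), r * math.sin(azimuth_rad))

def remap_contact(x_mm, y_mm, azimuth_rad, inclination_rad):
    """Shift a raw contact location back toward the intended target."""
    dx, dy = parallax_offset(azimuth_rad, inclination_rad)
    return (x_mm - dx, y_mm - dy)

# Viewing 30 degrees off-normal shifts the contact back about 0.7 mm:
print(remap_contact(50.0, 80.0, 0.0, math.radians(30)))
```
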
  • Publication number: 20150138059
    Abstract: Embodiments are disclosed that relate to operating a display illuminated by a backlight system configured to selectively emit light having two or more angular intensity profiles. For example, one disclosed embodiment provides a method comprising: illuminating the display with light having a first angular intensity profile; while illuminating the display with light having the first angular intensity profile, outputting an image; after outputting the image, illuminating the display with light having a second angular intensity profile different from the first angular intensity profile; and, while illuminating the display with light having the second angular intensity profile, outputting an inverse image of the image.
    Type: Application
    Filed: November 19, 2013
    Publication date: May 21, 2015
    Applicant: Microsoft Corporation
    Inventors: Timothy Large, Steven Bathiche
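
Operationally this is a per-frame schedule: pair the image with one angular intensity profile and its inverse with the other, plausibly so that viewers reached only by the second profile see the image and its inverse average out. A schematic loop; the backlight and panel objects are hypothetical stand-ins for hardware hooks.

```python
class StubBacklight:
    def set_profile(self, name):   # hypothetical device hook
        print(f"backlight -> {name} angular intensity profile")

class StubPanel:
    def show(self, frame):         # hypothetical device hook
        print(f"panel shows frame, total luminance {sum(map(sum, frame))}")

def invert(frame):
    """Per-pixel inverse of an 8-bit grayscale frame (list of rows)."""
    return [[255 - p for p in row] for row in frame]

def run_frames(frames, backlight, panel):
    for frame in frames:
        backlight.set_profile("first")    # e.g. a narrow, on-axis profile
        panel.show(frame)                 # output the image
        backlight.set_profile("second")   # a different angular profile
        panel.show(invert(frame))         # output the inverse image

run_frames([[[0, 128, 255]]], StubBacklight(), StubPanel())
```
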
  • Patent number: 9013564
    Abstract: An autostereoscopic 3D display system includes a display having a plurality of pixels, wherein each pixel is configured to display light rays representing a left-eye view and a right-eye view of an image. The autostereoscopic 3D display system further includes an optical-deflection system configured to control the light rays representing the left-eye view and the right-eye view. The optical-deflection system includes a separately controllable lenslet associated with each pixel, where the lenslet is configured to steer the light ray representing the left-eye view corresponding to the pixel, and steer the light ray representing the right-eye view corresponding to the pixel.
    Type: Grant
    Filed: May 7, 2013
    Date of Patent: April 21, 2015
    Assignee: Elwha LLC
    Inventors: Steven Bathiche, Alistair K. Chan, William Gates, Roderick A. Hyde, Edward K. Y. Jung, Jordin T. Kare, Jaron Lanier, John L. Manferdelli, Clarence T. Tegreene, David B. Tuckerman, Charles Whitmer, Lowell L. Wood, Jr.
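
Once eye positions are known (the abstract leaves the tracking method open), the per-pixel steering is plain geometry. A sketch with assumed millimetre coordinates:

```python
import math

def steer_angles(pixel_x_mm, eye_left_x_mm, eye_right_x_mm, view_dist_mm):
    """Horizontal deflection angles (radians) a lenslet at pixel_x needs so
    the left-eye and right-eye rays land on the respective eyes."""
    left = math.atan2(eye_left_x_mm - pixel_x_mm, view_dist_mm)
    right = math.atan2(eye_right_x_mm - pixel_x_mm, view_dist_mm)
    return left, right

# A pixel 100 mm right of center, eyes 64 mm apart, viewer at 600 mm:
print([round(math.degrees(a), 1) for a in steer_angles(100, -32, 32, 600)])
# -> [-12.4, -6.5]: both rays deflect left, by different amounts per eye
```
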
  • Publication number: 20150097928
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Application
    Filed: December 15, 2014
    Publication date: April 9, 2015
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
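
The brightness tradeoff the abstract names is easy to quantify: whatever slice of the frame period becomes the capture slot comes directly out of the projector's duty cycle. A back-of-envelope model assuming a 60 Hz frame rate:

```python
FRAME_MS = 1000 / 60   # one projected frame at an assumed 60 Hz refresh

def capture_slot(slot_ms):
    """Open a slot_ms capture gap by temporally compressing the projector's
    on-state; return (on_time_ms, relative_brightness)."""
    if not 0 <= slot_ms < FRAME_MS:
        raise ValueError("slot must fit inside one frame period")
    on_time = FRAME_MS - slot_ms
    return on_time, on_time / FRAME_MS   # brightness falls with duty cycle

# A 4 ms slot leaves ~12.7 ms of on-state and ~76% relative brightness:
print(capture_slot(4.0))
```

Temporal shifting of the remaining on-state (aligning it away from the camera's exposure) is what keeps the gap flicker-free and feedback-free; the model above captures only the brightness cost.
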
  • Publication number: 20150100926
    Abstract: Disclosed herein are techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. The techniques for scaling and translation may take the varying distances from which a user interacts with components of the gesture-based system, such as a computing environment or capture device, into consideration with respect to defining and/or recognizing gestures. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones. A set of gesture data may be associated with each virtual zone such that gestures appropriate for controlling aspects of the gesture-based system may vary throughout the physical space.
    Type: Application
    Filed: August 18, 2014
    Publication date: April 9, 2015
    Inventors: Otto G. Berkes, Steven Bathiche, John Clavin, Ian LeGrow, Joseph Reginald Scott Molnar
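
One concrete reading of the virtual zones, sketched below with made-up boundaries and scale factors: band the space by distance from the capture device and attach a gesture scale (or a distinct gesture set) to each band.

```python
ZONES = [          # (max_distance_m, gesture_scale) -- illustrative values
    (1.0, 1.0),    # near zone: use gestures as-is
    (2.5, 1.8),    # mid zone: amplify; the same motion spans fewer pixels
    (4.0, 3.0),    # far zone: amplify further
]

def scale_gesture(dx, dy, user_distance_m):
    """Scale a raw gesture displacement according to the user's zone, or
    return None when the user is outside the interaction space."""
    for max_dist, scale in ZONES:
        if user_distance_m <= max_dist:
            return (dx * scale, dy * scale)
    return None

print(scale_gesture(0.25, 0.5, 3.1))   # far zone -> (0.75, 1.5)
```
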
  • Patent number: 8964008
    Abstract: Various embodiments are disclosed that relate to the presentation of video images in a presentation space via a head-mounted display. For example, one disclosed embodiment comprises receiving viewer location data and orientation data from a location and orientation sensing system and, from the viewer location data and the viewer orientation data, locating a viewer in a presentation space, determining a direction in which the viewer is facing, and determining an orientation of the head-mounted display system. From the determined location, direction, and orientation, a presentation image is determined based upon a portion of and an orientation of a volumetric image mapped to the portion of the presentation space that is within the viewer's field of view. The presentation image is then sent to the head-mounted display.
    Type: Grant
    Filed: June 17, 2011
    Date of Patent: February 24, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Steven Bathiche
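
Reduced to 2D, the selection step is a pose-driven visibility test over the volume mapped into the presentation space; a real system would render a perspective image, but the sketch below shows only the culling, with the field of view as an assumed constant.

```python
import math

FOV_RAD = math.radians(90)   # assumed horizontal field of view

def visible_points(points, viewer_xy, facing_rad):
    """Keep the (x, y) points of the mapped volume that fall inside the
    viewer's field of view, given their location and facing direction."""
    vx, vy = viewer_xy
    kept = []
    for x, y in points:
        bearing = math.atan2(y - vy, x - vx)
        # smallest signed angle between the facing direction and the bearing
        diff = (bearing - facing_rad + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= FOV_RAD / 2:
            kept.append((x, y))
    return kept

# Facing along +x, only the point straight ahead survives the cull:
print(visible_points([(1, 0), (0, 1), (-1, 0)], (0, 0), 0.0))  # -> [(1, 0)]
```
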
  • Patent number: 8928735
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Grant
    Filed: June 14, 2011
    Date of Patent: January 6, 2015
    Assignee: Microsoft Corporation
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20140376785
    Abstract: A system for enhancing a facial expression includes a processing circuit configured to receive video of a user, generate facial data corresponding to a face of the user, analyze the facial data to identify a facial expression, enhance the facial data based on the facial expression, and output modified video including the enhanced facial data.
    Type: Application
    Filed: June 20, 2013
    Publication date: December 25, 2014
    Inventors: Steven Bathiche, Alistair K. Chan, William David Duncan, William Gates, Roderick A. Hyde, Edward K. Y. Jung, Jordin T. Kare, Jaron Lanier, John L. Manferdelli, Clarence T. Tegreene, Adrian Travis, Charles Whitmer, Victoria Y. H. Wood, Lowell L. Wood, Jr.
  • Patent number: 8843857
    Abstract: Disclosed herein are techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. The techniques for scaling and translation may take the varying distances from which a user interacts with components of the gesture-based system, such as a computing environment or capture device, into consideration with respect to defining and/or recognizing gestures. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones. A set of gesture data may be associated with each virtual zone such that gestures appropriate for controlling aspects of the gesture-based system may vary throughout the physical space.
    Type: Grant
    Filed: November 19, 2009
    Date of Patent: September 23, 2014
    Assignee: Microsoft Corporation
    Inventors: Otto G. Berkes, Steven Bathiche, John Clavin, Ian LeGrow, Joseph Reginald Scott Molnar
  • Publication number: 20140267184
    Abstract: A stylus for use as an input device automatically switches its mode of operation.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: Elwha LLC
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Lowell L. Wood, Jr., Victoria Y. H. Wood
  • Publication number: 20140267177
    Abstract: An electronic device may include a touch screen electronic display configured to offset and/or shift the contact locations of touch implements and/or displayed content based on one or more calculated parallax values. The parallax values may be associated with the viewing angle of an operator relative to the display of the electronic device. In various embodiments, the parallax value(s) may be calculated using three-dimensional location sensors, an angle of inclination of a touch implement, and/or one or more displayed calibration objects. Parallax values may be utilized to remap contact locations by a touch implement, shift and/or offset displayed content, and/or perform other transformations as described herein. Stereoscopically displayed content may be offset such that a default display plane is coplanar with a touch surface rather than a display surface. Contacts by a finger may be remapped using portions of the contact region and/or a centroid of the contact region.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Victoria Y. H. Wood, Lowell L. Wood, Jr.
  • Publication number: 20140267179
    Abstract: An electronic device may include a touch screen electronic display configured to offset and/or shift the contact locations of touch implements and/or displayed content based on one or more calculated parallax values. The parallax values may be associated with the viewing angle of an operator relative to the display of the electronic device. In various embodiments, the parallax value(s) may be calculated using three-dimensional location sensors, an angle of inclination of a touch implement, and/or one or more displayed calibration objects. Parallax values may be utilized to remap contact locations by a touch implement, shift and/or offset displayed content, and/or perform other transformations as described herein. Stereoscopically displayed content may be offset such that a default display plane is coplanar with a touch surface rather than a display surface. Contacts by a finger may be remapped using portions of the contact region and/or a centroid of the contact region.
    Type: Application
    Filed: May 15, 2013
    Publication date: September 18, 2014
    Inventors: Steven Bathiche, Jesse R. Cheatham, III, Paul H. Dietz, Matthew G. Dyor, Philip A. Eckhoff, Anoop Gupta, Kenneth P. Hinckley, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Craig J. Mundie, Nathan P. Myhrvold, Andreas G. Nowatzyk, Robert C. Petroski, Danny Allen Reed, Clarence T. Tegreene, Charles Whitmer, Victoria Y. H. Wood, Lowell L. Wood, Jr.
  • Publication number: 20140200079
    Abstract: A method of displaying visual information to different viewer-eyes includes receiving eye strength data indicative of a deficiency of a weak viewer-eye with respect to a dominant viewer-eye. The method further includes causing a 3D-display system to display a first perspective of an image to the weak viewer-eye and causing the 3D-display system to display a second perspective of the image to the dominant viewer-eye.
    Type: Application
    Filed: January 16, 2013
    Publication date: July 17, 2014
    Applicant: Elwha LLC
    Inventors: Steven Bathiche, Alistair K. Chan, William Gates, Roderick A. Hyde, Muriel Y. Ishikawa, Edward K. Y. Jung, Jordin T. Kare, Jaron Lanier, John L. Manferdelli, Clarence T. Tegreene, Adrian Travis, Charles Whitmer, Lowell L. Wood, Jr.
  • Publication number: 20140198297
    Abstract: A method for treating a weak viewer-eye includes the steps of receiving eye-strength data indicative of an eye-strength of the weak viewer-eye and causing a 3D display system to vary, in accordance with the eye-strength of the weak viewer-eye, display characteristics of a perspective that the 3D display system displays.
    Type: Application
    Filed: January 16, 2013
    Publication date: July 17, 2014
    Applicant: Elwha LLC
    Inventors: Steven Bathiche, Alistair K. Chan, William Gates, Roderick A. Hyde, Edward K. Y. Jung, Jordin T. Kare, Jaron Lanier, John L. Manferdelli, Clarence T. Tegreene, Adrian Travis, David B. Tuckerman, Charles Whitmer, Lowell L. Wood, Jr., Victoria Y. H. Wood
  • Publication number: 20140168096
    Abstract: A reduced-latency ink rendering system and method that reduces latency in rendering ink on a display by bypassing at least some layers of the operating system. “Ink” is any input from a user through a touchscreen device using the user's finger or a pen. Moreover, some embodiments of the system and method avoid the operating system and the central processing unit (CPU) on a computing device when initially rendering the ink by going directly from the digitizer to the display controller. Any correction or additional processing of the rendered ink is performed after the initial rendering of the ink. Embodiments of the system and method address ink-rendering latency through software embodiments, which include techniques to bypass the typical rendering pipeline and quickly render ink on the display, and hardware embodiments, which use hardware and techniques that locally change display pixels. These embodiments can be mixed and matched in any manner.
    Type: Application
    Filed: December 14, 2012
    Publication date: June 19, 2014
    Applicant: Microsoft Corporation
    Inventors: Steven Bathiche, Paul Henry Dietz, Hrvoje Benko, Andreas Georg Nowatzyk
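
The architecture amounts to a fast path plus a reconciling slow path. The sketch below is conceptual only; the digitizer, display, and pipeline objects are hypothetical stand-ins for the hardware and OS hooks the abstract refers to.

```python
def ink_loop(digitizer, display, full_pipeline):
    """Draw each digitizer sample immediately through a minimal path, then
    let the ordinary (slower) rendering pipeline reconcile afterward."""
    provisional = []
    for sample in digitizer.samples():    # raw (x, y) contact samples
        display.draw_dot(sample)          # immediate: bypasses the compositor
        provisional.append(sample)
        if full_pipeline.ready():
            # The slow path catches up: swap the provisional dots for the
            # smoothed, fully processed stroke, then accumulate again.
            display.replace(provisional, full_pipeline.render(provisional))
            provisional = []
```
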
  • Patent number: 8730212
    Abstract: An integrated vision and display system comprises a display-image forming layer configured to transmit a display image for viewing through a display surface; an imaging detector configured to image infrared light of a narrow range of angles relative to the display surface normal, including a reflection from one or more objects on or near the display surface; a vision-system emitter configured to emit the infrared light for illuminating the objects; and a visible- and infrared-transmissive light guide having opposing upper and lower faces, configured to receive the infrared light from the vision-system emitter, to conduct the infrared light via total internal reflection (TIR) from the upper and lower faces, and to project the infrared light onto the objects outside of the narrow range of angles relative to the display surface normal.
    Type: Grant
    Filed: November 19, 2009
    Date of Patent: May 20, 2014
    Assignee: Microsoft Corporation
    Inventors: Karlton Powell, Prafulla Masalkar, Timothy Large, Steven Bathiche
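
The TIR condition that traps the infrared light between the guide's faces follows from Snell's law: rays striking a face beyond the critical angle arcsin(n2/n1) cannot exit. Worked numbers under assumed materials (acrylic guide in air):

```python
import math

n_guide, n_air = 1.49, 1.00   # assumed indices: acrylic light guide, air
theta_c = math.degrees(math.asin(n_air / n_guide))
print(f"TIR critical angle ≈ {theta_c:.1f}°")   # ≈ 42.2° from the face normal
# Infrared rays meeting the faces beyond this angle stay trapped and are
# conducted along the guide until extracted toward the display surface.
```
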
  • Publication number: 20140132595
    Abstract: A display that renders realistic objects allows a designer to redesign a living space in real time based on an existing layout. A computer system renders simulated objects on the display such that the simulated objects appear to the viewer to be in substantially the same place as actual objects in the scene. The displayed simulated objects can be spatially manipulated on the display through various user gestures. A designer can visually simulate a redesign of the space in many ways, for example, by adding selected objects, or by removing or rearranging existing objects, or by changing properties of those objects. Such objects also can be associated with shopping resources to enable related goods and services to be purchased, or other commercial transactions to be engaged in.
    Type: Application
    Filed: November 14, 2012
    Publication date: May 15, 2014
    Applicant: Microsoft Corporation
    Inventors: Catherine N. Boulanger, Matheen Siddiqui, Vivek Pradeep, Paul Dietz, Steven Bathiche
  • Publication number: 20140022184
    Abstract: The recognition of user input to a computing device is enhanced. The user input is either speech, or handwriting data input by the user making screen-contacting gestures, or a combination of one or more prescribed words that are spoken by the user and one or more prescribed screen-contacting gestures that are made by the user, or a combination of one or more prescribed words that are spoken by the user and one or more prescribed non-screen-contacting gestures that are made by the user.
    Type: Application
    Filed: July 20, 2012
    Publication date: January 23, 2014
    Applicant: Microsoft Corporation
    Inventors: Steven Bathiche, Anoop Gupta
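
The combined input cases reduce to temporal fusion: a prescribed spoken word and a prescribed gesture must co-occur within some window. A toy sketch; the vocabulary and the one-second window are assumptions, not from the filing.

```python
WINDOW_S = 1.0                      # assumed co-occurrence window
COMMANDS = {                        # illustrative vocabulary
    ("delete", "swipe_left"): "delete_item",
    ("open", "tap"): "open_item",
}

def fuse(spoken, gestures):
    """spoken and gestures are lists of (label, timestamp) events; yield a
    command whenever a prescribed word and gesture arrive close together."""
    for word, t_word in spoken:
        for gesture, t_gesture in gestures:
            action = COMMANDS.get((word, gesture))
            if action and abs(t_word - t_gesture) <= WINDOW_S:
                yield action

print(list(fuse([("delete", 2.0)], [("swipe_left", 2.4)])))  # ['delete_item']
```
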