Patents by Inventor Vivek Pradeep

Vivek Pradeep has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140132595
    Abstract: A display that renders realistic objects allows a designer to redesign a living space in real time based on an existing layout. A computer system renders simulated objects on the display so that they appear to the viewer to be in substantially the same place as actual objects in the scene. The displayed simulated objects can be spatially manipulated on the display through various user gestures. A designer can visually simulate a redesign of the space in many ways, for example by adding selected objects, by removing or rearranging existing objects, or by changing properties of those objects. Such objects can also be associated with shopping resources so that related goods and services can be purchased or other commercial transactions carried out. (See the illustrative code sketch following this listing.)
    Type: Application
    Filed: November 14, 2012
    Publication date: May 15, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Catherine N. Boulanger, Matheen Siddiqui, Vivek Pradeep, Paul Dietz, Steven Bathiche
  • Publication number: 20130286223
    Abstract: Photos are shared among devices that are in close proximity to one another and for which there is a connection among the devices. The photos can be shared automatically, or alternatively based on various user inputs. Various different controls can also be placed on sharing photos to restrict the other devices with which photos can be shared, the manner in which photos can be shared, and/or how the photos are shared.
    Type: Application
    Filed: April 25, 2012
    Publication date: October 31, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Stephen G. Latta, Kenneth P. Hinckley, Kevin Geisner, Steven Nabil Bathiche, Hrvoje Benko, Vivek Pradeep
  • Publication number: 20130201276
    Abstract: Techniques for implementing an integrative interactive space are described. In implementations, video cameras that are positioned to capture video at different locations are synchronized such that aspects of the different locations can be used to generate an integrated interactive space. The integrated interactive space can enable users at the different locations to interact, such as via video interaction, audio interaction, and so on. In at least some embodiments, techniques can be implemented to adjust an image of a participant during a video session such that the participant appears to maintain eye contact with other video session participants at other locations. Techniques can also be implemented to provide a virtual shared space that can enable users to interact with the space, and can also enable users to interact with one another and/or objects that are displayed in the virtual shared space.
    Type: Application
    Filed: February 6, 2012
    Publication date: August 8, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Vivek Pradeep, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Alice Jane Bernheim Brush
  • Publication number: 20130201095
    Abstract: Techniques involving presentations are described. In one or more implementations, a user interface is output by a computing device that includes a slide of a presentation, the slide having an object that is output for display in three dimensions. Responsive to receipt of one or more inputs by the computing device, how the object in the slide is output for display in the three dimensions is altered.
    Type: Application
    Filed: February 7, 2012
    Publication date: August 8, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Paul Henry Dietz, Vivek Pradeep, Stephen G. Latta, Kenneth P. Hinckley, Hrvoje Benko, Alice Jane Bernheim Brush
  • Publication number: 20130131985
    Abstract: A wearable, electronic image acquisition and processing system (or visual enhancement system) guides visually impaired individuals through their environment, providing the user with information about nearby objects of interest, potentially dangerous obstacles, their location, and potential paths to their destination.
    Type: Application
    Filed: April 11, 2012
    Publication date: May 23, 2013
    Inventors: James D. Weiland, Mark S. Humayun, Gerard Medioni, Armand R. Tanguay, Jr., Vivek Pradeep, Laurent Itti
  • Publication number: 20120063638
    Abstract: A system and method are disclosed for estimating camera motion from a visual input scene using points and lines detected in the scene. The system includes a camera server comprising a stereo pair of calibrated cameras, a feature processing module, a trifocal motion estimation module and an optional adjustment module. The stereo pair of calibrated cameras and the corresponding stereo pair after camera motion form a first and a second trifocal tensor. The feature processing module is configured to detect points and lines in the visual input data, which comprises a plurality of image frames, and to find point correspondences between detected points and line correspondences between detected lines in different views. The trifocal motion estimation module is configured to estimate the camera motion using the detected points and lines associated with the first and second trifocal tensors. (See the illustrative code sketch following this listing.)
    Type: Application
    Filed: August 26, 2011
    Publication date: March 15, 2012
    Applicant: HONDA MOTOR CO., LTD.
    Inventors: Jongwoo Lim, Vivek Pradeep
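For the living-space redesign entry above (publication 20140132595), the following is a minimal, hypothetical sketch of one small piece of such a system: projecting a simulated object into a camera image with a pinhole camera model so that it appears at a fixed place in the real scene. The camera intrinsics, pose, and object geometry below are illustrative assumptions, not values from the application, and the patented system covers much more (gesture-based manipulation, shopping integration, and so on).

```python
import numpy as np
import cv2

# Assumed camera intrinsics (focal lengths and principal point are placeholders).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Assumed camera pose relative to the object: no rotation, camera 2 m in front of it.
rvec = np.zeros(3)                  # rotation as a Rodrigues vector
tvec = np.array([0.0, 0.0, 2.0])    # translation in metres

# Corners of a simulated 0.5 m x 0.5 m picture frame placed on a wall (object coordinates, metres).
object_corners = np.array([[-0.25, -0.25, 0.0],
                           [ 0.25, -0.25, 0.0],
                           [ 0.25,  0.25, 0.0],
                           [-0.25,  0.25, 0.0]], dtype=np.float64)

# Project the simulated object into the image so it lands where the real wall appears.
image_points, _ = cv2.projectPoints(object_corners, rvec, tvec, K, None)
print(image_points.reshape(-1, 2))
```

Re-running the projection with an updated pose (for example, after a user gesture moves the virtual object) would keep the rendered object registered to the scene.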
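For the camera motion estimation entry above (publication 20120063638), the following is a much-simplified, point-only sketch of frame-to-frame motion estimation using OpenCV's essential-matrix pose recovery. It does not implement the stereo trifocal-tensor formulation or the line features described in the application; the frame filenames and camera intrinsics are placeholder assumptions.

```python
import cv2
import numpy as np

def estimate_motion(prev_path, curr_path, K):
    """Estimate relative camera rotation R and translation direction t between two frames."""
    prev = cv2.imread(prev_path, cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread(curr_path, cv2.IMREAD_GRAYSCALE)

    # Detect and describe keypoints in both frames.
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(prev, None)
    kp2, des2 = orb.detectAndCompute(curr, None)

    # Find point correspondences between the two views.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Estimate the essential matrix with RANSAC, then decompose it into R and t.
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t

# Placeholder intrinsics and frame filenames, for illustration only.
K = np.array([[718.0,   0.0, 607.0],
              [  0.0, 718.0, 185.0],
              [  0.0,   0.0,   1.0]])
R, t = estimate_motion("frame_000.png", "frame_001.png", K)
print("Rotation:\n", R, "\nTranslation direction:\n", t)
```

Because a single camera pair of frames only recovers translation up to scale, a stereo rig and trifocal constraints (as in the application) are one way to fix the scale and improve robustness by also using line features.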