Patents by Inventor Dmitry Morozov

Dmitry Morozov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150070274
    Abstract: The technology described herein allows a wearable display device, such as a head-mounted display, to be tracked within a 3D space by dynamically generating 6DoF data associated with the orientation and location of the display device within the 3D space. The 6DoF data is generated dynamically, in real time, by combining 3DoF location information and 3DoF orientation information within a user-centered coordinate system. The 3DoF location information may be retrieved from depth maps acquired from a depth-sensitive device, while the 3DoF orientation information may be received from the display device, which is equipped with orientation and motion sensors. The dynamically generated 6DoF data can be used to provide a 360-degree virtual reality simulation, which may be rendered and displayed on the wearable display device.
    Type: Application
    Filed: November 10, 2014
    Publication date: March 12, 2015
    Inventor: Dmitry Morozov
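The core idea in the abstract above is fusing 3DoF location (from a depth sensor) with 3DoF orientation (from the headset's IMU) into a single 6DoF pose. A minimal sketch of that fusion step, assuming a unit quaternion for orientation and a head-centroid position in sensor coordinates (all names and values are illustrative, not from the patent):

```python
def quat_to_matrix(w, x, y, z):
    """Convert a unit quaternion (e.g. from the HMD's orientation
    sensors) into a 3x3 rotation matrix."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def fuse_6dof(position, quaternion):
    """Combine 3DoF location (e.g. a head centroid from a depth map)
    with 3DoF orientation (from the device IMU) into a 4x4 pose matrix
    in a user-centered coordinate system."""
    R = quat_to_matrix(*quaternion)
    px, py, pz = position
    return [
        R[0] + [px],
        R[1] + [py],
        R[2] + [pz],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Identity orientation, one meter in front of the depth sensor:
pose = fuse_6dof((0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 0.0))
```

Running this fusion once per frame, with fresh depth-map and IMU samples, yields the real-time 6DoF stream the abstract describes.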
  • Patent number: 8896522
    Abstract: A computer-implemented method and system for controlling various electronic devices by recognition of gestures made by a user within a particular space defined in front of the user are provided. An example method may comprise generating a depth map of a physical scene, determining that the head of the user is directed towards a predetermined direction, establishing a virtual sensing zone defined between the user and a predetermined location, identifying a particular gesture made by the user within the virtual sensing zone, and selectively providing to the electronic device a control command associated with the particular gesture. The particular gesture may be performed via one or more characteristic forms provided by the user while the virtual sensing zone is in an active state. Characteristic forms are forms that are reliably distinguishable from casual forms by means of computer vision and that have certain attributes which reliably reflect user intent.
    Type: Grant
    Filed: July 4, 2012
    Date of Patent: November 25, 2014
    Assignee: 3DiVi Company
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov, Alexander Argutin
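The method above gates gesture commands on two conditions: the gesture must occur inside the virtual sensing zone, and the zone must be active. A hypothetical sketch of that flow, with illustrative zone bounds and gesture-to-command mappings (none of these names come from the patent):

```python
# Illustrative mapping from recognized gestures to device commands.
GESTURE_COMMANDS = {"swipe_left": "PREV_CHANNEL", "swipe_right": "NEXT_CHANNEL"}

def in_zone(point, zone):
    """zone = ((xmin, ymin, zmin), (xmax, ymax, zmax)) in sensor coords."""
    lo, hi = zone
    return all(lo[i] <= point[i] <= hi[i] for i in range(3))

def command_for_gesture(gesture, hand_pos, zone, zone_active):
    """Return a control command only when the gesture happens inside the
    virtual sensing zone while the zone is in an active state."""
    if not zone_active or not in_zone(hand_pos, zone):
        return None
    return GESTURE_COMMANDS.get(gesture)

# A sensing zone roughly 60 cm wide, defined in front of the user:
zone = ((-0.3, -0.3, 0.5), (0.3, 0.3, 0.9))
cmd = command_for_gesture("swipe_left", (0.0, 0.1, 0.7), zone, True)
```

Activating the zone only when the user's head faces the predetermined direction (as the abstract describes) is what filters casual motions from intended commands.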
  • Patent number: 8823642
    Abstract: Provided are computer-implemented methods and systems for controlling devices using gestures, and a 3D sensor that enables implementing the above. In one embodiment, the method proposed herein may be based on defining at least one sensor area within the space surrounding a user of a controlled device; associating this sensor area with at least one user gesture; associating the combination of the user gesture and sensor area with an actionable command; identifying the direction of the line of sight of the user and a focal point of the line of sight of the user; and, if the line of sight of the user is directed at a sensor area, issuing an actionable command corresponding to the combination of the sensor area and the gesture that the user makes while looking at this sensor area.
    Type: Grant
    Filed: May 23, 2012
    Date of Patent: September 2, 2014
    Assignee: 3DiVi Company
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov
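The embodiment above keys commands on a (sensor area, gesture) pair, triggered only when the user's line of sight is directed at that area. A minimal sketch of the gaze test and command lookup, assuming a unit gaze direction vector and spherical sensor areas (all names, areas, and radii are illustrative):

```python
def ray_hits_area(origin, direction, area_center, radius):
    """Check whether the user's line of sight (origin + t * direction)
    passes within `radius` of a sensor area's center."""
    v = [area_center[i] - origin[i] for i in range(3)]
    # Project the offset onto the (unit) gaze direction.
    t = sum(v[i] * direction[i] for i in range(3))
    if t < 0:
        return False  # area is behind the user
    closest = [origin[i] + t * direction[i] for i in range(3)]
    d2 = sum((closest[i] - area_center[i]) ** 2 for i in range(3))
    return d2 <= radius * radius

# Illustrative (area, gesture) -> command associations.
COMMANDS = {("lamp_area", "wave"): "TOGGLE_LIGHT"}

def issue_command(gaze_origin, gaze_dir, areas, gesture):
    """Issue the command associated with the first sensor area the
    user is looking at, combined with the gesture being made."""
    for name, (center, radius) in areas.items():
        if ray_hits_area(gaze_origin, gaze_dir, center, radius):
            return COMMANDS.get((name, gesture))
    return None

areas = {"lamp_area": ((0.0, 0.0, 2.0), 0.25)}
cmd = issue_command((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), areas, "wave")
```

Because the same gesture can map to different commands in different sensor areas, the gaze check effectively multiplexes a small gesture vocabulary across many devices.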
  • Publication number: 20140009384
    Abstract: The present technology relates to methods for dynamically determining the location and orientation of a handheld device, such as a smartphone, remote controller, or gaming device, within a 3D environment in real time. To this end, there is provided a 3D camera for capturing a depth map of the 3D environment within which there is a user holding the handheld device. The handheld device acquires motion and orientation data in response to hand gestures, which data is further processed and associated with a common coordinate system. The depth map is also processed to generate motion data of the user's hands, which is then dynamically compared to the processed motion and orientation data obtained from the handheld device so as to determine the handheld device's location and orientation. The position and orientation data may be further used in various software applications to generate control commands or perform analysis of various gesture motions.
    Type: Application
    Filed: April 3, 2013
    Publication date: January 9, 2014
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov, Alexander Argutin
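The comparison step above, matching the device's self-reported motion against hand motion tracks extracted from the depth map, can be sketched as a similarity search: the hand whose motion trace best correlates with the device's IMU trace is the hand holding the device, so its depth-map position localizes the device. A hypothetical illustration (trace format, threshold, and hand IDs are assumptions, not from the patent):

```python
def similarity(a, b):
    """Cosine similarity between two flattened motion traces;
    1.0 means the motions are identical up to scale."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def locate_device(device_motion, hand_tracks, threshold=0.9):
    """hand_tracks: {hand_id: (last_position, motion_trace)} derived
    from the depth map. Returns the depth-map position of the hand
    whose motion best matches the device's reported motion."""
    best_id, best_score = None, threshold
    for hand_id, (pos, trace) in hand_tracks.items():
        score = similarity(device_motion, trace)
        if score > best_score:
            best_id, best_score = hand_id, score
    return hand_tracks[best_id][0] if best_id is not None else None

device = [0.1, 0.0, 0.2, 0.1]  # velocity samples reported by the device
tracks = {
    "left":  ((0.4, 1.1, 1.8), [0.0, 0.1, -0.1, 0.0]),
    "right": ((-0.3, 1.0, 1.7), [0.1, 0.0, 0.2, 0.1]),
}
loc = locate_device(device, tracks)
```

The threshold keeps the system from guessing when neither hand's motion resembles the device's, e.g. when the device is lying on a table.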
  • Publication number: 20130010071
    Abstract: Disclosed are methods for determining and tracking a current location of a handheld pointing device, such as a remote control for an entertainment system, on a depth map generated by a gesture recognition control system. The methods disclosed herein enable identifying a user's hand gesture and generating corresponding motion data. Further, the handheld pointing device may send motion data, such as acceleration or velocity, and/or orientation data, such as pitch, roll, and yaw angles. The motion data of the user's hand gesture and the motion (or orientation) data received from the handheld pointing device are then compared, and if they correspond to each other, it is determined that the handheld pointing device is in active use by the user and is held by a particular hand. Accordingly, a location of the handheld pointing device on the depth map can be determined.
    Type: Application
    Filed: July 4, 2012
    Publication date: January 10, 2013
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov, Alexander Argutin
  • Publication number: 20130009861
    Abstract: Provided are computer-implemented methods and systems for controlling devices using gestures, and a 3D sensor that enables implementing the above. In one embodiment, the method proposed herein may be based on defining at least one sensor area within the space surrounding a user of a controlled device; associating this sensor area with at least one user gesture; associating the combination of the user gesture and sensor area with an actionable command; identifying the direction of the line of sight of the user and a focal point of the line of sight of the user; and, if the line of sight of the user is directed at a sensor area, issuing an actionable command corresponding to the combination of the sensor area and the gesture that the user makes while looking at this sensor area.
    Type: Application
    Filed: May 23, 2012
    Publication date: January 10, 2013
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov
  • Publication number: 20130010207
    Abstract: Provided is a computer-implemented method for controlling one or more electronic devices by recognition of gestures made by a three-dimensional (3D) object. In one example embodiment, the method comprises capturing a series of successive 3D images in real time, identifying that the object has a predetermined elongated shape, identifying that the object is oriented substantially towards a predetermined direction, determining at least one qualifying action being performed by a user and/or the object, comparing the at least one qualifying action to one or more predetermined actions associated with the direction towards which the object is oriented, and, based on the comparison, selectively issuing to the one or more electronic devices a command associated with the at least one qualifying action.
    Type: Application
    Filed: May 23, 2012
    Publication date: January 10, 2013
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov
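The two geometric checks in the abstract above, that the object has an elongated shape and is oriented substantially towards a predetermined direction, can be sketched as follows. This is a crude stand-in using a bounding-box diagonal as the object's major axis (a real system would likely use PCA on the object's 3D points); all thresholds are illustrative assumptions:

```python
def principal_axis(points):
    """Approximate the object's major axis and length from the
    bounding-box diagonal of its 3D points (crude stand-in for PCA)."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    axis = [maxs[i] - mins[i] for i in range(3)]
    n = sum(c * c for c in axis) ** 0.5
    return [c / n for c in axis], n

def is_pointing(points, target_dir, min_length=0.2, min_align=0.9):
    """True if the object is elongated (>= min_length metres) and its
    axis is substantially aligned with target_dir (a unit vector)."""
    axis, length = principal_axis(points)
    if length < min_length:
        return False
    align = abs(sum(axis[i] * target_dir[i] for i in range(3)))
    return align >= min_align

# A roughly 30 cm object extended along the z axis, aimed at the screen:
wand = [(0.0, 0.0, 0.5), (0.01, 0.0, 0.65), (0.02, 0.01, 0.8)]
pointing = is_pointing(wand, (0.0, 0.0, 1.0))
```

Only once both checks pass would the system go on to evaluate qualifying actions and issue the associated command.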
  • Publication number: 20130009865
    Abstract: A computer-implemented method and system for controlling various electronic devices by recognition of gestures made by a user within a particular space defined in front of the user are provided. An example method may comprise generating a depth map of a physical scene, determining that the head of the user is directed towards a predetermined direction, establishing a virtual sensing zone defined between the user and a predetermined location, identifying a particular gesture made by the user within the virtual sensing zone, and selectively providing to the electronic device a control command associated with the particular gesture. The particular gesture may be performed via one or more characteristic forms provided by the user while the virtual sensing zone is in an active state. Characteristic forms are forms that are reliably distinguishable from casual forms by means of computer vision and that have certain attributes which reliably reflect user intent.
    Type: Application
    Filed: July 4, 2012
    Publication date: January 10, 2013
    Applicant: 3DIVI
    Inventors: Andrey Valik, Pavel Zaitsev, Dmitry Morozov, Alexander Argutin