Patents by Inventor Dov Katz

Dov Katz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10957059
    Abstract: A depth camera assembly (DCA) of a headset configured to determine distances between the headset and one or more objects in an area surrounding the headset. The DCA includes an imaging device, an illumination source, and a controller. The controller identifies objects in a portion of the local area, determines a depth zone for each object and corresponding structured light (SL) illumination parameters, including an SL pattern for each object based on its depth zone, instructs the illumination source to illuminate a scene comprising the one or more objects with the determined SL pattern, and instructs the imaging device to capture images of the illuminated objects. The controller determines depth information for the illuminated objects and updates the depth information associated with them.
    Type: Grant
    Filed: September 8, 2017
    Date of Patent: March 23, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Nadav Grossinger
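    A minimal sketch of the per-object pattern selection described in the abstract above, assuming hypothetical depth-zone boundaries and pattern names (none of these values come from the patent): each detected object is assigned a depth zone, and the zone selects the structured-light pattern used to illuminate it.
    ```python
    # Illustrative sketch only: zone boundaries, pattern names, and the object
    # representation are assumptions, not taken from the patent.
    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        name: str
        distance_m: float  # coarse distance estimate from earlier depth information

    # Hypothetical depth zones (upper bound in meters) mapped to SL patterns.
    SL_PATTERNS = [
        (1.0, "dense-dot-grid"),               # objects closer than 1 m
        (3.0, "medium-dot-grid"),              # objects between 1 m and 3 m
        (float("inf"), "coarse-bar-pattern"),  # everything farther away
    ]

    def select_sl_pattern(obj: DetectedObject) -> str:
        """Pick a structured-light pattern for an object based on its depth zone."""
        for zone_limit_m, pattern in SL_PATTERNS:
            if obj.distance_m <= zone_limit_m:
                return pattern
        return SL_PATTERNS[-1][1]  # unreachable, but keeps the function total

    scene = [DetectedObject("hand", 0.4), DetectedObject("far wall", 4.2)]
    for obj in scene:
        print(f"{obj.name}: illuminate with {select_sl_pattern(obj)}")
    ```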
  • Patent number: 10684674
    Abstract: A virtual reality system includes a head-mounted display (HMD) having one or more facial sensors and illumination sources mounted to a surface of the HMD. For example, the facial sensors are image capture devices coupled to a bottom side of the HMD. The illumination sources illuminate portions of a user's face outside of the HMD, while the facial sensors capture images of the illuminated portions of the user's face. A controller receives the captured images and generates a representation of the portions of the user's face by identifying landmarks of the user's face in the captured images and performing other suitable image processing methods. Based on the representation, the controller or another component of the virtual reality system generates content for presentation to the user.
    Type: Grant
    Filed: April 1, 2016
    Date of Patent: June 16, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
  • Patent number: 10532277
    Abstract: A hand-held controller for a virtual-reality system includes an antenna, a grip, a cage coupled to the grip, and a plurality of LEDs on an outer surface of the cage. Wireless signals are received from a remote camera via the antenna. The plurality of LEDs is duty-cycled, which includes turning on the plurality of LEDs at times determined using the wireless signals. The times when the plurality of LEDs is turned on correspond to times when the remote camera captures images of the hand-held controller.
    Type: Grant
    Filed: June 11, 2015
    Date of Patent: January 14, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Jason Andrew Higgins, Dov Katz, Paul Buckley
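    A small sketch of the duty-cycling described above, with invented timing constants: the controller uses sync information received via the antenna to work out when the remote camera will expose, and turns the LEDs on only around those capture windows.
    ```python
    # Illustrative sketch: all timing values and names are assumptions.
    FRAME_PERIOD_US = 16_667   # camera frame period (~60 Hz), assumed
    EXPOSURE_US = 1_000        # camera exposure window, assumed
    LED_GUARD_US = 100         # margin so the LEDs are lit slightly before/after exposure

    def led_on_windows(first_exposure_start_us, num_frames):
        """Compute (on, off) times so the LEDs are lit only while the remote
        camera is exposing, given a sync time received over the radio link."""
        windows = []
        for i in range(num_frames):
            start = first_exposure_start_us + i * FRAME_PERIOD_US
            windows.append((start - LED_GUARD_US, start + EXPOSURE_US + LED_GUARD_US))
        return windows

    # Example: the sync message says the first exposure starts 5 ms from now.
    for on_us, off_us in led_on_windows(5_000, 3):
        print(f"LEDs on at {on_us} us, off at {off_us} us")
    ```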
  • Patent number: 10463123
    Abstract: A wheeling frame for a suitcase which has its own rolling wheels includes a frame supporting a pair of leaning wheels arranged at ends of a pair of pivotable legs, with a first wheel arranged at the end of a first leg and a second wheel arranged at the end of a second leg. The leg positions are adjustable between a stowed position and a deployed, adjustable position extending at an angle to the frame. The legs have a movable pivoting axis and adjustable lengths. A handle system at a proximal end of the frame is adjustable in length and angle, and is operable in conjunction with the leaning wheels to roll the suitcase in a stable leaning position, to allow walking the suitcase on steps of a staircase, and to be converted into a suitcase table, to ease packing and unpacking.
    Type: Grant
    Filed: March 8, 2017
    Date of Patent: November 5, 2019
    Inventors: Max Moskowitz, Dov Katz, Bezalel Katz
  • Patent number: 10430988
    Abstract: A facial tracking system generates a virtual rendering of a portion of a face of a user wearing a head-mounted display (HMD). The facial tracking system illuminates portions of the face inside the HMD. The facial tracking system captures a plurality of facial data of the portion of the face using one or more facial sensors located inside the HMD. A plurality of planar sections of the portion of the face are identified based at least in part on the plurality of facial data. The plurality of planar sections are mapped to one or more landmarks of the face. Facial animation information is generated based at least in part on the mapping, the facial animation information describing a portion of a virtual face corresponding to the portion of the user's face.
    Type: Grant
    Filed: June 3, 2016
    Date of Patent: October 1, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
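    A rough sketch of the mapping step described in the abstract above, using made-up landmark names and coordinates: each planar section recovered from the facial sensors is associated with its nearest known facial landmark, and those assignments form the facial animation information for the corresponding regions of the virtual face.
    ```python
    # Illustrative sketch: the landmark set, coordinates, and nearest-neighbor
    # matching rule are assumptions, not taken from the patent.
    import math

    # Calibrated landmark positions on the user's face (sensor coordinates, assumed).
    LANDMARKS = {
        "left_eye_corner": (0.03, 0.01, 0.05),
        "right_eye_corner": (-0.03, 0.01, 0.05),
        "nose_bridge": (0.0, 0.0, 0.06),
    }

    def nearest_landmark(section_centroid):
        """Map a planar section (represented by its centroid) to the closest landmark."""
        return min(LANDMARKS, key=lambda name: math.dist(LANDMARKS[name], section_centroid))

    # Centroids of planar sections estimated from the facial-sensor data (assumed).
    sections = [(0.028, 0.012, 0.051), (0.001, -0.002, 0.058)]
    facial_animation_info = {nearest_landmark(c): c for c in sections}
    print(facial_animation_info)  # which virtual-face regions to animate, and where
    ```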
  • Patent number: 10290154
    Abstract: A virtual reality (VR) headset calibration system calibrates a VR headset, which includes a plurality of locators and an inertial measurement unit (IMU) generating output signals indicative of motion of the VR headset. The system comprises a calibration controller configured to receive a headset model of the VR headset that identifies expected positions of each of the locators. The controller controls cameras to capture images of the VR headset while the headset is moved along a predetermined path. The images detect actual positions of the locators during the movement along the predetermined path. Calibration parameters for the locators are generated based on differences between the actual positions and the expected positions. Calibration parameters for the IMU are generated based on the calibration parameters for the locators and differences between expected and actual signals output by the IMU. The calibration parameters are stored to the VR headset.
    Type: Grant
    Filed: September 12, 2018
    Date of Patent: May 14, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Jennifer Leigh Dolson, Simon Hallam, Kieran Tobias Levin, Eric Loren Vaughan, Klas Petter Ivmark, Andrew Melim
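    A minimal sketch of the locator-calibration step described above, with invented positions and a deliberately simple model: the calibration parameter for each locator is taken to be the difference between its position observed in the captured images and its expected position from the headset model.
    ```python
    # Illustrative sketch: the per-locator offset model and all coordinates
    # (headset-model frame, meters) are assumptions.
    EXPECTED = {"locator_0": (0.050, 0.020, 0.010),
                "locator_1": (-0.050, 0.020, 0.010)}

    OBSERVED = {"locator_0": (0.051, 0.019, 0.011),
                "locator_1": (-0.049, 0.021, 0.009)}

    def locator_calibration(expected, observed):
        """Per-locator calibration parameter = observed position minus expected position."""
        return {name: tuple(o - e for o, e in zip(observed[name], exp))
                for name, exp in expected.items()}

    params = locator_calibration(EXPECTED, OBSERVED)
    print(params)
    # These offsets would be stored to the headset; the abstract's IMU calibration
    # step would then build on them.
    ```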
  • Patent number: 10191561
    Abstract: A virtual reality (VR) system tracks the position of a controller. The VR system includes an image tracking system comprising a number of fixed cameras, and a headset worn by the user that includes an imaging device to capture images of the controller operated by the user. The controller includes a set of features disposed on the surface of the controller. The image tracking system provides a first view of the controller. The imaging device mounted on the headset provides a second view of the controller. Each view of the controller (i.e., from the headset and from the image tracking system) provides a distinct set of features observed on the controller. The first and second sets of features are identified from the captured images and a pose of the controller is determined using the first set of features and the second set of features.
    Type: Grant
    Filed: July 30, 2015
    Date of Patent: January 29, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Neil Konzen, Oskar Linde, Maksym Katsev
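    A sketch of how observations from the two views might be pooled ahead of pose estimation. The feature names, pixel coordinates, and the idea of simply concatenating 2D-3D correspondences are assumptions for illustration; an actual pose solver would also need each camera's own pose and intrinsics.
    ```python
    # Illustrative sketch: all data is invented and no pose solve is performed here.
    # Known positions of features on the controller surface, in controller
    # coordinates (meters), as calibrated ahead of time.
    CONTROLLER_FEATURES = {
        "led_a": (0.01, 0.00, 0.02),
        "led_b": (-0.01, 0.00, 0.02),
        "led_c": (0.00, 0.02, 0.00),
        "led_d": (0.00, -0.02, 0.00),
    }

    # Each view reports the image coordinates (pixels) of the features it can see.
    views = {
        "fixed_camera": {"led_a": (412.0, 230.5), "led_b": (398.2, 231.1)},
        "headset_camera": {"led_c": (120.4, 88.7), "led_d": (118.9, 140.2)},
    }

    def pooled_correspondences(views):
        """Combine the 2D observations from both views with the known 3D feature
        positions, giving a downstream pose solver more constraints than either
        view provides on its own."""
        pairs = []
        for camera, observed in views.items():
            for feature_id, pixel in observed.items():
                pairs.append((camera, CONTROLLER_FEATURES[feature_id], pixel))
        return pairs

    for camera, point_3d, pixel in pooled_correspondences(views):
        print(camera, point_3d, pixel)
    ```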
  • Patent number: 10127732
    Abstract: A virtual reality (VR) headset calibration system calibrates a VR headset, which includes a plurality of locators and an inertial measurement unit (IMU) generating output signals indicative of motion of the VR headset. The system comprises a calibration controller configured to receive a headset model of the VR headset that identifies expected positions of each of the locators. The controller controls cameras to capture images of the VR headset while the headset is moved along a predetermined path. The images detect actual positions of the locators during the movement along the predetermined path. Calibration parameters for the locators are generated based on differences between the actual positions and the expected positions. Calibration parameters for the IMU are generated based on the calibration parameters for the locators and differences between expected and actual signals output by the IMU. The calibration parameters are stored to the VR headset.
    Type: Grant
    Filed: September 29, 2017
    Date of Patent: November 13, 2018
    Assignee: Facebook Technologies, LLC
    Inventors: Dov Katz, Jennifer Leigh Dolson, Simon Hallam, Kieran Tobias Levin, Eric Loren Vaughan, Klas Petter Ivmark, Andrew Melim
  • Patent number: 10095031
    Abstract: A virtual reality (VR) headset includes a first camera and a second camera capturing image data of an environment of the VR headset. Each camera has a field of view, and a portion of the fields of view of the first and second cameras overlap while a portion of the fields of view do not overlap. A processor receiving the image data from the first and second cameras is configured to identify a first observation of a position of the VR headset in the environment and positions of a plurality of features based on the image data captured by the first camera. The processor also identifies a second observation of the position of the VR headset in the environment and the positions of the features based on the image data captured by the second camera. Based on the first and second observations, the processor determines a model of the environment.
    Type: Grant
    Filed: May 25, 2018
    Date of Patent: October 9, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Oskar Linde, Kjell Erik Alexander Sjoholm, Rasmus Karl Emil Simander
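    A compact sketch of combining the two cameras' observations into a single environment model; the feature names and positions are invented, and simple averaging stands in for whatever fusion the actual system performs.
    ```python
    # Illustrative sketch: features seen by both cameras are averaged, features
    # seen by only one camera are kept as-is. All positions are made up (meters).
    obs_camera_1 = {"corner_1": (1.00, 0.20, 2.50), "poster": (0.40, 0.10, 3.10)}
    obs_camera_2 = {"corner_1": (1.02, 0.21, 2.48), "lamp": (-0.80, 0.50, 1.90)}

    def merge_observations(a, b):
        """Build one model of the environment from two per-camera feature observations."""
        model = {}
        for name in set(a) | set(b):
            points = [p for p in (a.get(name), b.get(name)) if p is not None]
            model[name] = tuple(sum(coord) / len(points) for coord in zip(*points))
        return model

    print(merge_observations(obs_camera_1, obs_camera_2))
    ```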
  • Patent number: 10078218
    Abstract: A virtual reality (VR) headset includes a first camera and a second camera capturing image data of an environment of the VR headset. Each camera has a field of view, and a portion of the fields of view of the first and second cameras overlap while a portion of the fields of view do not overlap. A processor receiving the image data from the first and second cameras is configured to identify a first observation of a position of the VR headset in the environment and positions of a plurality of features based on the image data captured by the first camera. The processor also identifies a second observation of the position of the VR headset in the environment and the positions of the features based on the image data captured by the second camera. Based on the first and second observations, the processor determines a model of the environment.
    Type: Grant
    Filed: January 1, 2016
    Date of Patent: September 18, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Oskar Linde, Kjell Erik Alexander Sjoholm, Rasmus Karl Emil Simander
  • Patent number: 10073263
    Abstract: A virtual reality (VR) system includes a sparse projection system configured to generate a plurality of clusters using one or more diffractive optical elements. Each generated cluster has a unique configuration that corresponds to a unique location in a virtual mapping of a local area. The sparse projection system projects the generated clusters throughout the local area. A VR console receives a series of images of the local area from an imaging device, with at least one image including at least one cluster. The VR console determines a location of a VR headset within the virtual mapping of the local area based at least in part on a configuration of the at least one cluster included in the series of images. Content is generated based at least in part on the determined location of the VR headset and is provided to the VR headset for presentation.
    Type: Grant
    Filed: March 4, 2016
    Date of Patent: September 11, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Palmer Freeman Luckey, Samuel Redmond D'Amico, Evan M. Richards, Shizhe Shen, Jennifer Leigh Dolson
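    A tiny sketch of the localization idea above, under the assumption that each projected cluster's unique configuration can be reduced to a signature string: recognizing clusters in a captured image then amounts to a lookup into the virtual mapping, and the headset location is crudely approximated here as the centroid of the recognized clusters' map positions.
    ```python
    # Illustrative sketch: cluster signatures and map coordinates are invented;
    # a real system would derive signatures from the detected dot arrangement.
    CLUSTER_MAP = {
        "sig-3x3-A": (0.0, 0.0, 2.0),   # virtual-map coordinates (assumed)
        "sig-3x3-B": (1.5, 0.0, 2.0),
        "sig-4x2-C": (3.0, 0.5, 2.0),
    }

    def locate_headset(detected_signatures):
        """Approximate the headset location from the clusters recognized in an image."""
        positions = [CLUSTER_MAP[s] for s in detected_signatures if s in CLUSTER_MAP]
        if not positions:
            return None  # no known cluster visible in this image
        return tuple(sum(coord) / len(positions) for coord in zip(*positions))

    print(locate_headset(["sig-3x3-A", "sig-3x3-B"]))
    ```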
  • Patent number: 10001834
    Abstract: A virtual reality (VR) console receives slow calibration data from an imaging device and fast calibration data from an inertial measurement unit on a VR headset including a front and a rear rigid body. The slow calibration data includes an image where only the locators on the rear rigid body are visible. An observed position is determined from the slow calibration data and a predicted position is determined from the fast calibration data. If a difference between the observed position and the predicted position is greater than a threshold value, the predicted position is adjusted by a temporary offset until the difference is less than the threshold value. The temporary offset is removed by re-calibrating the rear rigid body to the front rigid body once locators on both the front and rear rigid body are visible in an image in the slow calibration data.
    Type: Grant
    Filed: January 10, 2017
    Date of Patent: June 19, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Jonathan Shine, Robin Michael Miller
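    A compact sketch of the offset logic in the abstract above, reduced to one dimension with invented numbers: when the observed and predicted positions of the rear rigid body disagree by more than a threshold, a temporary offset is introduced so the adjusted prediction matches the observation; re-calibrating the rear rigid body to the front rigid body would later remove that offset.
    ```python
    # Illustrative sketch: the 1-D simplification, threshold, and values are assumptions.
    THRESHOLD_M = 0.005  # maximum tolerated observed-vs-predicted difference, assumed

    def apply_temporary_offset(observed, predicted, current_offset=0.0):
        """Return (corrected_prediction, offset). The offset is introduced or grown
        only while the observation and the offset-corrected prediction disagree by
        more than the threshold; re-calibration would eventually remove it."""
        corrected = predicted + current_offset
        error = observed - corrected
        if abs(error) > THRESHOLD_M:
            current_offset += error               # pull the prediction onto the observation
            corrected = predicted + current_offset
        return corrected, current_offset

    corrected, offset = apply_temporary_offset(observed=1.232, predicted=1.220)
    print(corrected, offset)  # prediction now matches the observation via the offset
    ```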
  • Patent number: 9959678
    Abstract: A head mounted display (HMD) in a VR system includes sensors for tracking the eyes and face of a user wearing the HMD. The VR system records calibration attributes such as landmarks of the face of the user. Light sources illuminate portions of the user's face covered by the HMD. In conjunction, facial sensors capture facial data. The VR system analyzes the facial data to determine the orientation of planar sections of the illuminated portions of the face. The VR system aggregates planar sections of the face and maps the planar sections to landmarks of the face to generate a facial animation of the user, which can also include eye orientation information. The facial animation is represented as a virtual avatar and presented to the user.
    Type: Grant
    Filed: June 3, 2016
    Date of Patent: May 1, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
  • Patent number: 9854170
    Abstract: Embodiments relate to adjusting an image frame based on motion sensor data. An image frame including image data for a plurality of rows of the image frame is captured sequentially row-by-row. Motion sensor data corresponding to the image frame is also captured. Motion vectors describing motion between each row of the image frame are determined based on the motion sensor data. An image data translation is calculated for each row of the image frame based on the motion vector for the row. The image frame is adjusted based on the image data translation for each row of the image frame to generate an adjusted image frame.
    Type: Grant
    Filed: December 29, 2015
    Date of Patent: December 26, 2017
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Simon Hallam
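    A minimal sketch of the per-row adjustment described above, assuming a pure-translation model driven by gyroscope rates and made-up constants: each row's exposure time gives it a motion vector, which is converted into a small image translation that undoes the motion accumulated during the rolling-shutter readout.
    ```python
    # Illustrative sketch: the readout timing, pixels-per-radian scale, and the
    # pure-translation motion model are assumptions, not taken from the patent.
    ROW_READOUT_S = 30e-6     # time between consecutive row exposures, assumed
    PIXELS_PER_RAD = 800.0    # rough image-space scale for small rotations, assumed

    def row_translations(gyro_rate_rad_s, num_rows):
        """Per-row (dx, dy) pixel translations that undo motion accumulated while
        the rolling shutter read out each row."""
        wx, wy = gyro_rate_rad_s  # angular rates about the two image axes
        translations = []
        for row in range(num_rows):
            t = row * ROW_READOUT_S          # when this row was exposed, relative to row 0
            dx = -wy * t * PIXELS_PER_RAD    # undo the yaw-induced horizontal shift
            dy = -wx * t * PIXELS_PER_RAD    # undo the pitch-induced vertical shift
            translations.append((dx, dy))
        return translations

    # Example: headset yawing at 1 rad/s during a 480-row capture.
    print(row_translations((0.0, 1.0), 480)[::160])  # rows 0, 160, and 320
    ```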
  • Publication number: 20170352183
    Abstract: A head mounted display (HMD) in a VR system includes sensors for tracking the eyes and face of a user wearing the HMD. The VR system records calibration attributes such as landmarks of the face of the user. Light sources illuminate portions of the user's face covered by the HMD. In conjunction, facial sensors capture facial data. The VR system analyzes the facial data to determine the orientation of planar sections of the illuminated portions of the face. The VR system aggregates planar sections of the face and maps the planar sections to landmarks of the face to generate a facial animation of the user, which can also include eye orientation information. The facial animation is represented as a virtual avatar and presented to the user.
    Type: Application
    Filed: June 3, 2016
    Publication date: December 7, 2017
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
  • Publication number: 20170352178
    Abstract: A facial tracking system generates a virtual rendering of a portion of a face of a user wearing a head-mounted display (HMD). The facial tracking system illuminates portions of the face inside the HMD. The facial tracking system captures a plurality of facial data of the portion of the face using one or more facial sensors located inside the HMD. A plurality of planar sections of the portion of the face are identified based at least in part on the plurality of facial data. The plurality of planar sections are mapped to one or more landmarks of the face. Facial animation information is generated based at least in part on the mapping, the facial animation information describing a portion of a virtual face corresponding to the portion of the user's face.
    Type: Application
    Filed: June 3, 2016
    Publication date: December 7, 2017
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
  • Patent number: 9805512
    Abstract: A virtual reality (VR) headset calibration system calibrates a VR headset, which includes a plurality of locators and an inertial measurement unit (IMU) generating output signals indicative of motion of the VR headset. The system comprises a calibration controller configured to receive a headset model of the VR headset that identifies expected positions of each of the locators. The controller controls cameras to capture images of the VR headset while the headset is moved along a predetermined path. The images detect actual positions of the locators during the movement along the predetermined path. Calibration parameters for the locators are generated based on differences between the actual positions and the expected positions. Calibration parameters for the IMU are generated based on the calibration parameters for the locators and differences between expected and actual signals output by the IMU. The calibration parameters are stored to the VR headset.
    Type: Grant
    Filed: November 13, 2015
    Date of Patent: October 31, 2017
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Jennifer Leigh Dolson, Simon Hallam, Kieran Tobias Levin, Eric Loren Vaughan, Klas Petter Ivmark, Andrew Melim
  • Publication number: 20170287194
    Abstract: A virtual reality system includes a head-mounted display (HMD) having one or more facial sensors and illumination sources mounted to a surface of the HMD. For example, the facial sensors are image capture devices coupled to a bottom side of the HMD. The illumination sources illuminate portions of a user's face outside of the HMD, while the facial sensors capture images of the illuminated portions of the user's face. A controller receives the captured images and generates a representation of the portions of the user's face by identifying landmarks of the user's face in the captured images and performing other suitable image processing methods. Based on the representation, the controller or another component of the virtual reality system generates content for presentation to the user.
    Type: Application
    Filed: April 1, 2016
    Publication date: October 5, 2017
    Inventors: Dov Katz, Michael John Toksvig, Ziheng Wang, Timothy Paul Omernick, Torin Ross Herndon
  • Patent number: 9779540
    Abstract: A virtual reality (VR) console receives slow calibration data from an imaging device and fast calibration data from an inertial measurement unit on a virtual reality headset. Using a model of the VR headset, the VR console identifies model locators corresponding to locators on the VR headset and generates estimated positions for locators included in slow calibration data. The VR console adjusts calibration parameters so a relative distance between estimated positions of the locators and positions of their corresponding model locators is less than a threshold value. From the estimated positions, the VR console generates calibrated positions of a reference point on the VR headset associated with images from the slow calibration data. The VR console determines predicted positions of the reference point from the calibrated positions and adjusts calibration parameters so intermediate estimated positions of the reference point are within a threshold distance of the predicted positions.
    Type: Grant
    Filed: November 8, 2016
    Date of Patent: October 3, 2017
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Maksym Katsev, Neil Konzen, Steven LaValle, Michael Antonov
  • Publication number: 20170192232
    Abstract: A virtual reality (VR) headset includes a first camera and a second camera capturing image data of an environment of the VR headset. Each camera has a field of view, and a portion of the fields of view of the first and second cameras overlap while a portion of the fields of view do not overlap. A processor receiving the image data from the first and second cameras is configured to identify a first observation of a position of the VR headset in the environment and positions of a plurality of features based on the image data captured by the first camera. The processor also identifies a second observation of the position of the VR headset in the environment and the positions of the features based on the image data captured by the second camera. Based on the first and second observations, the processor determines a model of the environment.
    Type: Application
    Filed: January 1, 2016
    Publication date: July 6, 2017
    Inventors: Dov Katz, Oskar Linde, Kjell Erik Alexander Sjoholm, Rasmus Karl Emil Simander