Patents by Inventor Alexandru Octavian Balan

Alexandru Octavian Balan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11314321
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The head-mounted device further includes instructions executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Grant
    Filed: July 7, 2020
    Date of Patent: April 26, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
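    A minimal sketch of the interleaved-exposure idea described in the abstract above, assuming a hypothetical frame schedule and tracker objects (none of the names below come from the patent): frames alternate between an environment-tracking exposure, during which the controller's LEDs are dimmed, and a controller-tracking exposure, during which they are brightened, and each frame is routed to the tracker it was captured for.

    ```python
    # Illustrative sketch only; class and method names are assumptions.
    from dataclasses import dataclass
    from itertools import count

    @dataclass
    class Frame:
        kind: str             # "environment" or "controller"
        led_intensity: float  # commanded integrated LED intensity for this exposure
        image: object         # placeholder for the raw image data

    def exposure_schedule():
        """Yield the exposure kind and LED intensity for each frame in the sequence."""
        for i in count():
            if i % 2 == 0:
                yield "environment", 0.1   # lower integrated intensity: LEDs barely visible
            else:
                yield "controller", 1.0    # higher integrated intensity: LEDs easy to detect

    def process(frame: Frame, hmd_tracker, controller_tracker):
        """Route each exposure to the tracker it was captured for (hypothetical trackers)."""
        if frame.kind == "environment":
            features = hmd_tracker.detect_features(frame.image)
            return hmd_tracker.update_pose(features)        # pose of the head-mounted device
        leds = controller_tracker.detect_light_sources(frame.image)
        return controller_tracker.update_pose(leds)         # pose of the handheld object
    ```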
  • Patent number: 11042226
    Abstract: Techniques that modify traditional input devices (e.g., the traditional computer mouse form factor) to interact with virtual reality (VR) devices are described. Such modification allows the VR devices to track the position and orientation of the mouse in the three-dimensional (3D) VR environment without requiring the extensive, complex hardware typically included in VR motion controllers. Specifically, the traditional mouse form factor may be merged with 3D constellation-based tracking elements (e.g., LEDs) with minimal form-factor modifications. The constellation tracking elements may include a plurality of fiducial markers on the mouse that may be detected by an imaging sensor, thus allowing tracking with a single imaging sensor (e.g., either head-mounted or in a fixed position).
    Type: Grant
    Filed: August 9, 2018
    Date of Patent: June 22, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Erik Alan Holverson, Yijie Wang, Alexandru Octavian Balan
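    The constellation tracking described above reduces, in the simplest reading, to recovering a 6DOF pose from known 3D fiducial positions and their 2D detections in a single image. A hedged sketch using OpenCV's solvePnP, assuming the LED layout on the mouse is known and detections have already been matched to model points (both assumptions, not details from the patent):

    ```python
    # Illustrative pose recovery for a fiducial constellation seen by one camera.
    import numpy as np
    import cv2

    def estimate_mouse_pose(model_points_3d, detected_points_2d, camera_matrix):
        """Return (rvec, tvec) of the mouse relative to the camera, or None on failure.

        model_points_3d: (N, 3) LED positions on the mouse body, N >= 4
        detected_points_2d: (N, 2) matching image detections in pixels
        camera_matrix: 3x3 intrinsic matrix of the (head-mounted or fixed) sensor
        """
        ok, rvec, tvec = cv2.solvePnP(
            np.asarray(model_points_3d, dtype=np.float64),
            np.asarray(detected_points_2d, dtype=np.float64),
            camera_matrix,
            None,  # assume an already-undistorted image
        )
        return (rvec, tvec) if ok else None
    ```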
  • Patent number: 10839603
    Abstract: Interactive zones are created in a virtual environment by a head-mounted display device executing a virtual reality application. The interactive zones correspond to physical objects that are external to the user, such as keyboards, mice, notepads, cups, and other objects that the user may desire to interact with. Each interactive zone includes real-time video of the corresponding object so that the user can easily interact with the object, and view their interaction with the object, while still participating in the virtual reality application. If the user moves one of the physical objects, the corresponding interactive zone is also moved in the virtual environment.
    Type: Grant
    Filed: May 25, 2018
    Date of Patent: November 17, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yijie Wang, Yangming Chong, Alexandru Octavian Balan
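    A minimal sketch of the interactive-zone behavior described above, assuming a hypothetical data structure (the patent does not specify one): each zone is pinned to a tracked physical object, carries its real-time video feed, and is re-positioned in the virtual scene whenever the object moves.

    ```python
    # Illustrative only; field and method names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class InteractiveZone:
        object_id: str      # e.g. "keyboard", "cup"
        position: tuple     # placement of the zone in the virtual scene (x, y, z)
        video_feed: object  # handle to the real-time passthrough video of the object

        def on_object_moved(self, new_position):
            """Keep the zone co-located with the physical object it represents."""
            self.position = new_position
    ```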
  • Patent number: 10818092
    Abstract: Methods are described for disambiguation and tracking of two or more wireless hand-held controllers with passive optical and inertial tracking, within a system having a head-mounted virtual or augmented reality display device with a forward-facing optical sensor having a field of view, wherein the display device interfaces with wireless hand-held inertial controllers for providing user input to the display device. Each controller includes two passive optically reflective markers, one positioned at or adjacent to each end of the controller and separated by a known distance, and each controller also includes an onboard inertial measurement unit for providing inertial data corresponding to its orientation.
    Type: Grant
    Filed: March 29, 2019
    Date of Patent: October 27, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Constantin Dulu
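    One plausible reading of the disambiguation step above, sketched under the assumption that marker positions have already been triangulated to 3D: candidate marker pairs are matched to controllers by comparing each pair's measured separation to the controller's known marker distance. This is an illustrative reconstruction, not the patented method.

    ```python
    # Illustrative pairing of reflective markers by known separation distance.
    import itertools
    import math

    def pair_markers(marker_positions_3d, controller_separations, tolerance=0.01):
        """marker_positions_3d: list of (x, y, z) in meters.
        controller_separations: {controller_id: expected marker distance in meters}.
        Returns {controller_id: (marker_index_a, marker_index_b)}."""
        assignments = {}
        for a, b in itertools.combinations(range(len(marker_positions_3d)), 2):
            d = math.dist(marker_positions_3d[a], marker_positions_3d[b])
            for controller_id, expected in controller_separations.items():
                if controller_id not in assignments and abs(d - expected) < tolerance:
                    assignments[controller_id] = (a, b)
                    break
        # A fuller implementation would also ensure no marker is assigned twice
        # and would fuse the result with each controller's IMU orientation.
        return assignments
    ```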
  • Publication number: 20200333878
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The head-mounted device further includes instructions executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Application
    Filed: July 7, 2020
    Publication date: October 22, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Patent number: 10740924
    Abstract: Examples are disclosed that relate to tracking a pose of a handheld object used with a head-mounted display device. In one example, a method comprises: receiving image data from an image sensing system; detecting a plurality of feature points of the handheld object in a frame of the image data; receiving inertial measurement unit (IMU) data from an IMU of the handheld object; based on detecting the plurality of feature points and receiving the IMU data, determining a first pose of the handheld object; determining that at least a portion of the plurality of feature points is not detected in another frame of the image data; using the IMU data, updating the first pose of the handheld object to a second pose; and body-locking the second pose of the handheld object to a body location on a user wearing the head-mounted display device.
    Type: Grant
    Filed: April 16, 2018
    Date of Patent: August 11, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Yogeshwar Narayanan Nagaraj, Constantin Dulu, William Guyman, Ivan Razumenic
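    A hedged sketch of the fallback logic in the abstract above, with hypothetical fusion and body-anchor objects standing in for whatever the actual system uses: optical feature detections and IMU data are fused while enough features are visible; once they are lost, the pose is propagated from the IMU alone and body-locked to a location on the user.

    ```python
    # Illustrative control flow only; objects and method names are assumptions.
    def update_controller_pose(frame, imu_sample, fusion, body_anchor, min_features=4):
        features = fusion.detect_feature_points(frame)
        if len(features) >= min_features:
            # Enough optical evidence: fuse feature points with the IMU data.
            return fusion.fuse(features, imu_sample)
        # Feature points not detected in this frame: dead-reckon from the IMU...
        pose = fusion.propagate_with_imu(imu_sample)
        # ...and body-lock the result to a body location on the wearer so the
        # controller stays in a plausible place relative to the user.
        return body_anchor.attach(pose)
    ```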
  • Patent number: 10719125
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The head-mounted device further includes instructions executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: July 21, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Patent number: 10705598
    Abstract: One disclosed example provides a computing device configured to receive from an image sensor of a head-mounted device environmental tracking exposures and handheld object tracking exposures, determine a pose of the handheld object with respect to the head-mounted device based upon the handheld object tracking exposures, determine a pose of the head-mounted device with respect to a surrounding environment based upon the environmental tracking exposures, derive a pose of the handheld object relative to the surrounding environment based upon the pose of the handheld object with respect to the head-mounted device and the pose of the head-mounted device with respect to the surrounding environment, and output the pose of the handheld object relative to the surrounding environment for controlling a user interface displayed on the head-mounted device.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: July 7, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, Salim Sirtkaya, William Douglas Guyman, Vuk Jovanovic, Filip Panjevic
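    The derivation in the abstract above is, at its core, a composition of rigid transforms: the handheld object's pose in the surrounding environment is the head-mounted device's pose in the environment composed with the handheld object's pose relative to the head-mounted device. A small sketch with 4x4 homogeneous matrices (illustrative, not taken from the patent):

    ```python
    import numpy as np

    def controller_pose_in_world(world_from_hmd: np.ndarray,
                                 hmd_from_controller: np.ndarray) -> np.ndarray:
        """Compose two 4x4 homogeneous transforms into the controller's world pose."""
        return world_from_hmd @ hmd_from_controller

    # Example: HMD 1.6 m above the world origin, controller 0.4 m in front of the HMD.
    world_from_hmd = np.eye(4); world_from_hmd[2, 3] = 1.6
    hmd_from_controller = np.eye(4); hmd_from_controller[0, 3] = 0.4
    print(controller_pose_in_world(world_from_hmd, hmd_from_controller)[:3, 3])  # [0.4 0.  1.6]
    ```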
  • Patent number: 10679376
    Abstract: Examples are disclosed herein that relate to determining a pose of a handheld object. One example provides a computing system configured to determine a pose of a handheld object comprising a plurality of light sources by acquiring image data of a surrounding environment, detecting a subset of light sources of the plurality of light sources of the handheld object in the image data, and performing a search, without using previous pose data, to determine the pose of the handheld object relative to the computing system. The computing system is further configured to use the pose determined to perform a later search for an updated pose of the handheld object, and if the later search fails to find the updated pose, determine the updated pose by again performing the search without using previous pose data.
    Type: Grant
    Filed: April 24, 2018
    Date of Patent: June 9, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Ronald Boskovic, Filip Panjevic, Ivan Razumenic, Vuk Jovanovic
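    A sketch of the search strategy described above, with a hypothetical tracker object: the first pose comes from a search that uses no prior pose data, later frames reuse the previous pose as a seed, and a failed seeded search falls back to the prior-free search.

    ```python
    # Illustrative control flow; tracker methods are assumptions.
    def track(tracker, image):
        if tracker.last_pose is None:
            # No usable prior: search over candidate poses from scratch.
            tracker.last_pose = tracker.global_search(image)
            return tracker.last_pose
        # Cheaper search seeded with the previously determined pose.
        updated = tracker.local_search(image, seed=tracker.last_pose)
        if updated is None:
            # Seeded search failed to find an updated pose: search again without priors.
            updated = tracker.global_search(image)
        tracker.last_pose = updated
        return updated
    ```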
  • Patent number: 10628711
    Abstract: One disclosed example provides a method for determining a pose of a handheld object in a surrounding environment. Optical pose data is stored in an image queue of a first filter. IMU data is received from an IMU of the handheld object and stored in an IMU queue of the first filter. Using at least a portion of the optical pose data and the IMU data, an initial pose of the handheld object is determined and outputted. The method determines that either the image queue or the IMU queue is empty. A second filter comprising the one empty queue and the other non-empty queue is instantiated as a copy of the first filter. Using the data from the non-empty queue in the second filter, the initial pose of the handheld object is updated to an updated pose, and the updated pose is outputted.
    Type: Grant
    Filed: April 24, 2018
    Date of Patent: April 21, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Ruven Jaime Rivera, III, Michael Edward Samples, Ivan Razumenic
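    A minimal sketch of the queue handling described above, assuming a simple filter class (the real filter is unspecified here): the filter holds an image (optical pose) queue and an IMU queue, and when one of them runs empty a copy of the filter is advanced using only the non-empty queue so an updated pose can still be reported.

    ```python
    # Illustrative only; the update step is a placeholder, not the patented filter.
    import copy
    from collections import deque

    class PoseFilter:
        def __init__(self):
            self.image_queue = deque()   # optical pose measurements
            self.imu_queue = deque()     # inertial measurements
            self.pose = None

        def update(self, measurement):
            """Placeholder for the actual filter update (e.g. a Kalman-style step)."""
            self.pose = measurement

    def latest_pose(primary: PoseFilter):
        if primary.image_queue and primary.imu_queue:
            primary.update(primary.image_queue.popleft())
            primary.update(primary.imu_queue.popleft())
            return primary.pose
        # One queue is empty: advance a copy of the filter with whatever data exists.
        speculative = copy.deepcopy(primary)
        pending = speculative.image_queue or speculative.imu_queue
        while pending:
            speculative.update(pending.popleft())
        return speculative.pose
    ```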
  • Patent number: 10546426
    Abstract: A virtual reality scene is displayed via a display device. A real-world positioning of a peripheral control device is identified relative to the display device. Video of a real-world scene of a physical environment located behind a display region of the display device is captured via a camera. A real-world portal is selectively displayed via the display device that includes a portion of the real-world scene and simulates a view through the virtual reality scene at a position within the display region that tracks the real-world positioning of the peripheral control device.
    Type: Grant
    Filed: February 1, 2018
    Date of Patent: January 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Youding Zhu, Min shik Park
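    A thin sketch of the portal behavior, under the assumption that the system exposes a projection from tracked world positions to display coordinates and separate draw calls for the scene and the passthrough portal (all hypothetical): the portal is drawn at a display position that follows the tracked peripheral and shows the camera's view of that spot.

    ```python
    # Illustrative only; the callables passed in are assumptions.
    def render_frame(draw_scene, draw_portal, camera_frame,
                     peripheral_pos_world, world_to_display):
        draw_scene()                                        # the virtual reality scene
        portal_xy = world_to_display(peripheral_pos_world)  # follow the tracked peripheral
        draw_portal(camera_frame, at=portal_xy)             # real-world view "through" the scene
    ```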
  • Patent number: 10521026
    Abstract: Systems are provided that include a wireless hand-held inertial controller with passive optical and inertial tracking in a slim form-factor. These systems are configured for use with a head mounted virtual or augmented reality display device (HMD) that operates with six degrees of freedom by fusing (i) data related to the position of the controller derived from a forward-facing optical sensor located in the HMD with (ii) data relating to the orientation of the controller derived from an inertial measurement unit located in the controller.
    Type: Grant
    Filed: November 15, 2018
    Date of Patent: December 31, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Constantin Dulu, Christopher Douglas Edmonds, Mark James Finocchio
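    A hedged sketch of the fusion described above: the controller's position comes from the HMD's forward-facing optical sensor and its orientation from the controller's inertial measurement unit, and the two are combined into one six-degree-of-freedom pose. Data shapes and the output format are assumptions.

    ```python
    import numpy as np

    def fuse_6dof(optical_position_xyz, imu_orientation_quat_wxyz):
        """Combine an optically tracked 3D position with an IMU-derived unit quaternion."""
        position = np.asarray(optical_position_xyz, dtype=float)          # from the optical sensor
        orientation = np.asarray(imu_orientation_quat_wxyz, dtype=float)  # from the controller's IMU
        orientation = orientation / np.linalg.norm(orientation)           # keep it unit-length
        return {"position": position, "orientation_wxyz": orientation}

    pose = fuse_6dof([0.1, -0.2, 0.5], [1.0, 0.0, 0.0, 0.0])  # example usage
    ```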
  • Patent number: 10503247
    Abstract: One disclosed example provides a head-mounted device including a stereo camera arrangement, a logic device configured to execute instructions, and a storage device storing instructions executable by the logic device to, for each camera in the stereo camera arrangement, receive image data of a field of view of the camera, detect light sources of a handheld object in the image data, and based upon the light sources detected, determine a pose of the handheld object. The instructions are executable to, based upon the pose of the handheld object determined for each camera in the stereo camera arrangement, calibrate the stereo camera arrangement.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: December 10, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Edward Samples, Alexandru Octavian Balan, Salim Sirtkaya, Vuk Jovanovic, Filip Panjevic, Taras Khapko, Ruven Jaime Rivera, III
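    The calibration step above suggests a simple relation, sketched here with 4x4 homogeneous matrices (an illustrative reading, not the patent's procedure): if each camera in the stereo pair independently recovers the handheld object's pose, the transform between the cameras follows by composing one pose with the inverse of the other.

    ```python
    import numpy as np

    def stereo_extrinsics(left_from_object: np.ndarray,
                          right_from_object: np.ndarray) -> np.ndarray:
        """Return the left-camera-from-right-camera transform from two object poses."""
        # left_from_right = left_from_object @ object_from_right
        return left_from_object @ np.linalg.inv(right_from_object)
    ```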
  • Patent number: 10496157
    Abstract: Examples are disclosed herein related to tracking poses of a head-mounted display device that interfaces with handheld peripheral objects. One disclosed example provides a handheld object configured for providing user input to a head-mounted device, the handheld object including a body, a plurality of visible light sources arranged on the body in an arrangement trackable by a vision system of the head-mounted device, and a controller configured to control a brightness of one or more visible light sources of the plurality of visible light sources.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: December 3, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Alexandru Octavian Balan, Taras Khapko, Ivan Razumenic, Steven James Velat, Vladimir Carapic
  • Publication number: 20190333275
    Abstract: Interactive zones are created in a virtual environment by a head-mounted display device executing a virtual reality application. The interactive zones correspond to physical objects that are external to the user, such as keyboards, mice, notepads, cups, and other objects that the user may desire to interact with. Each interactive zone includes real-time video of the corresponding object so that the user can easily interact with the object, and view their interaction with the object, while still participating in the virtual reality application. If the user moves one of the physical objects, the corresponding interactive zone is also moved in the virtual environment.
    Type: Application
    Filed: May 25, 2018
    Publication date: October 31, 2019
    Inventors: Yijie Wang, Yangming Chong, Alexandru Octavian Balan
  • Publication number: 20190325274
    Abstract: One disclosed example provides a method for determining a pose of a handheld object in a surrounding environment. Optical pose data is stored in an image queue of a first filter. IMU data is received from an IMU of the handheld object and stored in an IMU queue of the first filter. Using at least a portion of the optical pose data and the IMU data, an initial pose of the handheld object is determined and outputted. The method determines that either the image queue or the IMU queue is empty. A second filter comprising the one empty queue and the other non-empty queue is instantiated as a copy of the first filter. Using the data from the non-empty queue in the second filter, the initial pose of the handheld object is updated to an updated pose, and the updated pose is outputted.
    Type: Application
    Filed: April 24, 2018
    Publication date: October 24, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Ruven Jaime Rivera, III, Michael Edward Samples, Ivan Razumenic
  • Publication number: 20190325600
    Abstract: Examples are disclosed herein that relate to determining a pose of a handheld object. One example provides a computing system configured to determine a pose of a handheld object comprising a plurality of light sources by acquiring image data of a surrounding environment, detecting a subset of light sources of the plurality of light sources of the handheld object in the image data, and performing a search, without using previous pose data, to determine the pose of the handheld object relative to the computing system. The computing system is further configured to use the pose determined to perform a later search for an updated pose of the handheld object, and if the later search fails to find the updated pose, determine the updated pose by again performing the search without using previous pose data.
    Type: Application
    Filed: April 24, 2018
    Publication date: October 24, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Ronald Boskovic, Filip Panjevic, Ivan Razumenic, Vuk Jovanovic
  • Publication number: 20190318501
    Abstract: Examples are disclosed that relate to tracking a pose of a handheld object used with a head-mounted display device. In one example, a method comprises: receiving image data from an image sensing system; detecting a plurality of feature points of the handheld object in a frame of the image data; receiving inertial measurement unit (IMU) data from an IMU of the handheld object; based on detecting the plurality of feature points and receiving the IMU data, determining a first pose of the handheld object; determining that at least a portion of the plurality of feature points is not detected in another frame of the image data; using the IMU data, updating the first pose of the handheld object to a second pose; and body-locking the second pose of the handheld object to a body location on a user wearing the head-mounted display device.
    Type: Application
    Filed: April 16, 2018
    Publication date: October 17, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alexandru Octavian Balan, Yogeshwar Narayanan Nagaraj, Constantin Dulu, William Guyman, Ivan Razumenic
  • Publication number: 20190302898
    Abstract: Techniques that modify traditional input devices (e.g., the traditional computer mouse form factor) to interact with virtual reality (VR) devices are described. Such modification allows the VR devices to track the position and orientation of the mouse in the three-dimensional (3D) VR environment without requiring the extensive, complex hardware typically included in VR motion controllers. Specifically, the traditional mouse form factor may be merged with 3D constellation-based tracking elements (e.g., LEDs) with minimal form-factor modifications. The constellation tracking elements may include a plurality of fiducial markers on the mouse that may be detected by an imaging sensor, thus allowing tracking with a single imaging sensor (e.g., either head-mounted or in a fixed position).
    Type: Application
    Filed: August 9, 2018
    Publication date: October 3, 2019
    Inventors: Erik Alan Holverson, Yijie Wang, Alexandru Octavian Balan
  • Publication number: 20190228584
    Abstract: Methods are described for disambiguation and tracking of two or more wireless hand-held controllers with passive optical and inertial tracking, within a system having a head-mounted virtual or augmented reality display device with a forward-facing optical sensor having a field of view, wherein the display device interfaces with wireless hand-held inertial controllers for providing user input to the display device. Each controller includes two passive optically reflective markers, one positioned at or adjacent to each end of the controller and separated by a known distance, and each controller also includes an onboard inertial measurement unit for providing inertial data corresponding to its orientation.
    Type: Application
    Filed: March 29, 2019
    Publication date: July 25, 2019
    Inventors: Alexandru Octavian Balan, Constantin Dulu