Patents by Inventor Nicholas Ferianc Kamuda
Nicholas Ferianc Kamuda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240037879
Abstract: In some implementations, the disclosed systems and methods can be blocked by an obstacle or object, such as wall(s), door(s), or other objects such that a user cannot view the area in a real-world environment. In some implementations, the disclosed systems and methods can interface with smart lights (and/or other smart devices controlling ambient lighting) to know where lights are and how much light they are putting out. In some implementations, the disclosed systems and methods can be animated via user tracking. Some user systems may lack video tracking and/or comprise inconsistent video tracking. In some implementations, the disclosed systems and methods can display content for a screen in a pass-through visualization.
Type: Application
Filed: September 6, 2023
Publication date: February 1, 2024
Inventors: Joseph GARDNER, Alexander FAABORG, Nicholas Ferianc KAMUDA
-
Patent number: 11755122
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Grant
Filed: May 23, 2022
Date of Patent: September 12, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
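The flow this abstract describes — tracked hand pose in, recognized gesture, emoji out — can be sketched roughly as below. The gesture names, the boolean finger-state representation, and the gesture-to-emoji table are all illustrative assumptions, not the patent's actual implementation.

```python
# Toy gesture-to-emoji mapping; names and heuristics are assumptions.
GESTURE_TO_EMOJI = {
    "thumbs_up": "👍",
    "victory": "✌️",
    "open_palm": "👋",
}

def recognize_gesture(hand_tracking_data):
    """Map tracked finger states (finger -> extended?) to a named gesture."""
    extended = {f for f, is_up in hand_tracking_data.items() if is_up}
    if extended == {"thumb"}:
        return "thumbs_up"
    if extended == {"index", "middle"}:
        return "victory"
    if len(extended) == 5:
        return "open_palm"
    return None

def emoji_for_pose(hand_tracking_data):
    """Recognize a gesture from the pose and look up its emoji, if any."""
    gesture = recognize_gesture(hand_tracking_data)
    return GESTURE_TO_EMOJI.get(gesture)
```

In the claimed method, the resulting emoji would then be presented locally and an instruction sent to the other display devices; the sketch stops at the lookup.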
-
Patent number: 11703994
Abstract: A computing system is provided. The computing system includes a head mounted display (HMD) device including a display, a processor configured to execute one or more programs, and associated memory. The processor is configured to display a virtual object at least partially within a field of view of a user on the display, identify a plurality of control points associated with the virtual object, and determine that one or more of the control points associated with the virtual object are further than a predetermined threshold distance from the user. The processor is configured to, based on the determination, invoke a far interaction mode for the virtual object and receive a trigger input from the user. In response to the trigger input in the far interaction mode, the processor is configured to invoke a near interaction mode and display a virtual interaction object within the predetermined threshold distance from the user.
Type: Grant
Filed: April 28, 2022
Date of Patent: July 18, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai Tang, Julia Schwarz, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Joshua Kyle Neff, Alton Kwok
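The mode logic above can be sketched as follows: test the object's control points against a threshold distance, enter far mode when any lies beyond it, and on a trigger input switch to near mode with a proxy interaction object placed within reach. The threshold value and the proxy-placement rule are assumptions; the patent only says "predetermined".

```python
import math

NEAR_FAR_THRESHOLD_M = 0.6  # assumed value; the patent leaves it unspecified

def select_interaction_mode(user_pos, control_points, threshold=NEAR_FAR_THRESHOLD_M):
    """Return 'far' if any control point lies beyond the threshold, else 'near'."""
    if any(math.dist(user_pos, p) > threshold for p in control_points):
        return "far"
    return "near"

def on_trigger(mode, user_pos, object_pos, threshold=NEAR_FAR_THRESHOLD_M):
    """In far mode, a trigger switches to near mode and places a proxy
    interaction object within the threshold distance of the user."""
    if mode != "far":
        return mode, object_pos
    direction = [o - u for u, o in zip(user_pos, object_pos)]
    norm = math.dist(object_pos, user_pos) or 1.0
    # Place the proxy halfway out along the threshold radius, toward the object.
    proxy = [u + d / norm * threshold * 0.5 for u, d in zip(user_pos, direction)]
    return "near", proxy
```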
-
Publication number: 20220345537
Abstract: In one embodiment, an AR/VR system includes a social-networking application installed on the AR/VR system, which allows a user to access an online social network, including communicating with the user's social connections and interacting with content objects on the online social network. The AR/VR system also includes an AR/VR application, which allows the user to interact with an AR/VR platform by providing user input to the AR/VR application via various modalities. Based on the user input, the AR/VR platform generates responses and sends the generated responses to the AR/VR application, which then presents the responses to the user at the AR/VR system via various modalities.
Type: Application
Filed: March 3, 2022
Publication date: October 27, 2022
Inventors: Safiya Samms, Ioana Adriana Vlad, Marcus Tanner, Pieter De Baets, Nicole Lundblad, Gregory Francis Mazurek, Nicholas Ferianc Kamuda, Kimberly Arnette, Azam Jiva, Anshul Raizada, Chandra Shekar Chetty, Bhabani Panda
-
Patent number: 11461955
Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
Type: Grant
Filed: August 23, 2021
Date of Patent: October 4, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai Tang, Julia Schwarz, Jason Michael Ray, Sophie Stellmach, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Ramiro S. Torres, Kevin John Appel, Jamie Bryant Kirschenbaum
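The targeting geometry described here — a ray whose direction runs from an inferred arm joint through the tracked hand, tested against an object's control points — can be sketched as below. Treating each control point as a small sphere and the specific radius are illustrative assumptions.

```python
import math

def normalize(v):
    """Scale a vector to unit length (guarding against zero vectors)."""
    n = math.sqrt(sum(c * c for c in v)) or 1.0
    return [c / n for c in v]

def cast_ray(joint_pos, hand_pos):
    """Ray originates at the hand; its direction runs from the inferred
    arm joint (e.g. the shoulder) through the hand, per the abstract."""
    direction = normalize([h - j for j, h in zip(joint_pos, hand_pos)])
    return hand_pos, direction

def ray_hits_point(origin, direction, point, radius=0.05):
    """Test the ray against a control point modeled as a small sphere."""
    to_point = [p - o for o, p in zip(origin, point)]
    t = sum(d * v for d, v in zip(direction, to_point))  # projection along ray
    if t < 0:
        return False  # control point is behind the ray origin
    closest = [o + d * t for o, d in zip(origin, direction)]
    return math.dist(closest, point) <= radius
```

A hit on any control point would then drive the "object is being targeted" indication to the user.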
-
Publication number: 20220283646
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Application
Filed: May 23, 2022
Publication date: September 8, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia SCHWARZ, Michael Harley NOTTER, Jenny KAM, Sheng Kai TANG, Kenneth Mitchell JAKUBZAK, Adam Edwin BEHRINGER, Amy Mun HONG, Joshua Kyle NEFF, Sophie STELLMACH, Mathew J. LAMB, Nicholas Ferianc KAMUDA
-
Publication number: 20220253199
Abstract: A computing system is provided. The computing system includes a head mounted display (HMD) device including a display, a processor configured to execute one or more programs, and associated memory. The processor is configured to display a virtual object at least partially within a field of view of a user on the display, identify a plurality of control points associated with the virtual object, and determine that one or more of the control points associated with the virtual object are further than a predetermined threshold distance from the user. The processor is configured to, based on the determination, invoke a far interaction mode for the virtual object and receive a trigger input from the user. In response to the trigger input in the far interaction mode, the processor is configured to invoke a near interaction mode and display a virtual interaction object within the predetermined threshold distance from the user.
Type: Application
Filed: April 28, 2022
Publication date: August 11, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai TANG, Julia SCHWARZ, Thomas Matthew GABLE, Casey Leon MEEKHOF, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Joshua Kyle NEFF, Alton KWOK
-
Patent number: 11340707
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Grant
Filed: May 29, 2020
Date of Patent: May 24, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
-
Patent number: 11320957
Abstract: A computing system is provided. The computing system includes a head mounted display (HMD) device including a display, a processor configured to execute one or more programs, and associated memory. The processor is configured to display a virtual object at least partially within a field of view of a user on the display, identify a plurality of control points associated with the virtual object, and determine that one or more of the control points associated with the virtual object are further than a predetermined threshold distance from the user. The processor is configured to, based on the determination, invoke a far interaction mode for the virtual object and receive a trigger input from the user. In response to the trigger input in the far interaction mode, the processor is configured to invoke a near interaction mode and display a virtual interaction object within the predetermined threshold distance from the user.
Type: Grant
Filed: March 25, 2019
Date of Patent: May 3, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai Tang, Julia Schwarz, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Joshua Kyle Neff, Alton Kwok
-
Patent number: 11294472
Abstract: A method for augmenting a two-stage hand gesture input comprises receiving hand tracking data for a hand of a user. A gesture recognition machine recognizes that the user has performed a first-stage gesture based on one or more parameters derived from the received hand tracking data satisfying first-stage gesture criteria. An affordance cueing a second-stage gesture is provided to the user responsive to recognizing the first-stage gesture. The gesture recognition machine recognizes that the user has performed the second-stage gesture based on one or more parameters derived from the received hand tracking data satisfying second-stage gesture criteria. A graphical user interface element is displayed responsive to recognizing the second-stage gesture.
Type: Grant
Filed: March 26, 2019
Date of Patent: April 5, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai Tang, Julia Schwarz, Thomas Matthew Gable, Casey Leon Meekhof, Chuan Qin, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Ramiro S. Torres, Joshua Kyle Neff, Jamie Bryant Kirschenbaum, Neil Richard Kronlage
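The two-stage flow reads naturally as a small state machine: the first-stage gesture surfaces an affordance cueing the second stage, and the second-stage gesture displays the GUI element. The sketch below assumes the gesture criteria are supplied as predicates over the tracked parameters; the concrete criteria are not specified in the abstract.

```python
class TwoStageGestureMachine:
    """Toy state machine for the two-stage gesture flow described above.
    The criteria callables are placeholders for the patent's gesture tests."""

    def __init__(self, first_criteria, second_criteria):
        self.first_criteria = first_criteria
        self.second_criteria = second_criteria
        self.state = "idle"
        self.affordance_visible = False
        self.ui_visible = False

    def update(self, params):
        """Advance the machine with one frame of hand-tracking parameters."""
        if self.state == "idle" and self.first_criteria(params):
            self.state = "cueing"
            self.affordance_visible = True   # cue the second-stage gesture
        elif self.state == "cueing" and self.second_criteria(params):
            self.state = "done"
            self.affordance_visible = False
            self.ui_visible = True           # show the GUI element
```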
-
Patent number: 11277652
Abstract: A system finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
Type: Grant
Filed: July 6, 2020
Date of Patent: March 15, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Cesare John Saretto, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Henry Hooper Somuah, Matthew John McCloskey, Douglas C. Hebenthal, Kathleen P. Mulcahy
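At its core this is a context-driven ranking step: score candidate items against the user's current context signals and surface the ones that match. A minimal sketch using tag overlap as the (assumed) relevance score:

```python
def aggregate_relevant(items, context_signals):
    """Rank stored items by overlap with the user's current context signals
    (location, activity, recent text, ...). Tag overlap is an illustrative
    stand-in for the system's actual relevance scoring."""
    def score(item):
        return len(set(item["tags"]) & set(context_signals))
    ranked = sorted(items, key=score, reverse=True)
    return [item for item in ranked if score(item) > 0]
```

The surfaced items would then be rendered on whichever of the user's devices is currently in use.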
-
Publication number: 20210383594
Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
Type: Application
Filed: August 23, 2021
Publication date: December 9, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai TANG, Julia SCHWARZ, Jason Michael RAY, Sophie STELLMACH, Thomas Matthew GABLE, Casey Leon MEEKHOF, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Ramiro S. TORRES, Kevin John APPEL, Jamie Bryant KIRSCHENBAUM
-
Publication number: 20210373672
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Application
Filed: May 29, 2020
Publication date: December 2, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia SCHWARZ, Michael Harley NOTTER, Jenny KAM, Sheng Kai TANG, Kenneth Mitchell JAKUBZAK, Adam Edwin BEHRINGER, Amy Mun HONG, Joshua Kyle NEFF, Sophie STELLMACH, Mathew J. LAMB, Nicholas Ferianc KAMUDA
-
Patent number: 11107265
Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
Type: Grant
Filed: March 11, 2019
Date of Patent: August 31, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai Tang, Julia Schwarz, Jason Michael Ray, Sophie Stellmach, Thomas Matthew Gable, Casey Leon Meekhof, Nahil Tawfik Sharkasi, Nicholas Ferianc Kamuda, Ramiro S. Torres, Kevin John Appel, Jamie Bryant Kirschenbaum
-
Patent number: 10955665
Abstract: A see-through head mounted display apparatus includes code performing a method of choosing an optimal viewing location and perspective for shared-view virtual objects rendered for multiple users in a common environment. Multiple objects and multiple users are taken into account in determining the optimal, common viewing location. The technology allows each user to have a common view of the relative position of the object in the environment.
Type: Grant
Filed: June 18, 2013
Date of Patent: March 23, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Tom G. Salter, Ben J. Sugden, Daniel Deptford, Robert L. Crocco, Jr., Brian E. Keane, Laura K. Massey, Alex Aben-Athar Kipman, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda
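One simple way to pick a single placement that accounts for all viewers is to minimize the summed squared distance to the users, which the centroid of their positions solves exactly. This is an illustrative stand-in for the patent's placement logic, not its actual method.

```python
def optimal_shared_position(user_positions):
    """Place a shared virtual object at the centroid of the viewers'
    positions, minimizing total squared viewer-to-object distance.
    Assumes positions are same-length coordinate tuples."""
    n = len(user_positions)
    dims = len(user_positions[0])
    return tuple(sum(p[i] for p in user_positions) / n for i in range(dims))
```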
-
Publication number: 20200336778
Abstract: A system finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
Type: Application
Filed: July 6, 2020
Publication date: October 22, 2020
Inventors: Cesare John SARETTO, Peter Tobias KINNEBREW, Nicholas Ferianc KAMUDA, Henry Hooper SOMUAH, Matthew John McCLOSKEY, Douglas C. HEBENTHAL, Kathleen P. MULCAHY
-
Patent number: 10735796
Abstract: A system finds and aggregates the most relevant and current information about the people and things that a user cares about. The information gathering is based on current context (e.g., where the user is, what the user is doing, what the user is saying/typing, etc.). The result of the context based information gathering is presented ubiquitously on user interfaces of any of the various physical devices operated by the user.
Type: Grant
Filed: May 10, 2018
Date of Patent: August 4, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Cesare John Saretto, Peter Tobias Kinnebrew, Nicholas Ferianc Kamuda, Henry Hooper Somuah, Matthew John McCloskey, Douglas C. Hebenthal, Kathleen P. Mulcahy
-
Publication number: 20200226814
Abstract: A head-mounted display comprises a display device and an outward-facing depth camera. A storage machine comprises instructions executable by a logic machine to present one or more virtual objects on the display device, to receive information from the depth camera about an environment, and to determine a position of the head-mounted display within the environment. Based on the position of the head-mounted display, a position of a joint of a user's arm is inferred. Based on the information received from the depth camera, a position of a user's hand is determined. A ray is cast from a portion of the user's hand based on the position of the joint of the user's arm and the position of the user's hand. Responsive to the ray intersecting with one or more control points of a virtual object, the user is provided with an indication that the virtual object is being targeted.
Type: Application
Filed: March 11, 2019
Publication date: July 16, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai TANG, Julia SCHWARZ, Jason Michael RAY, Sophie STELLMACH, Thomas Matthew GABLE, Casey Leon MEEKHOF, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Ramiro S. TORRES, Kevin John APPEL, Jamie Bryant KIRSCHENBAUM
-
Publication number: 20200225830
Abstract: A computing system is provided. The computing system includes a head mounted display (HMD) device including a display, a processor configured to execute one or more programs, and associated memory. The processor is configured to display a virtual object at least partially within a field of view of a user on the display, identify a plurality of control points associated with the virtual object, and determine that one or more of the control points associated with the virtual object are further than a predetermined threshold distance from the user. The processor is configured to, based on the determination, invoke a far interaction mode for the virtual object and receive a trigger input from the user. In response to the trigger input in the far interaction mode, the processor is configured to invoke a near interaction mode and display a virtual interaction object within the predetermined threshold distance from the user.
Type: Application
Filed: March 25, 2019
Publication date: July 16, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai TANG, Julia SCHWARZ, Thomas Matthew GABLE, Casey Leon MEEKHOF, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Joshua Kyle NEFF, Alton KWOK
-
Publication number: 20200225758
Abstract: A method for augmenting a two-stage hand gesture input comprises receiving hand tracking data for a hand of a user. A gesture recognition machine recognizes that the user has performed a first-stage gesture based on one or more parameters derived from the received hand tracking data satisfying first-stage gesture criteria. An affordance cueing a second-stage gesture is provided to the user responsive to recognizing the first-stage gesture. The gesture recognition machine recognizes that the user has performed the second-stage gesture based on one or more parameters derived from the received hand tracking data satisfying second-stage gesture criteria. A graphical user interface element is displayed responsive to recognizing the second-stage gesture.
Type: Application
Filed: March 26, 2019
Publication date: July 16, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sheng Kai TANG, Julia SCHWARZ, Thomas Matthew GABLE, Casey Leon MEEKHOF, Chuan QIN, Nahil Tawfik SHARKASI, Nicholas Ferianc KAMUDA, Ramiro S. TORRES, Joshua Kyle NEFF, Jamie Bryant KIRSCHENBAUM, Neil Richard KRONLAGE