Patents by Inventor Dan Kroymann
Dan Kroymann has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11524232
Abstract: In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object; or other distances.
Type: Grant
Filed: November 20, 2020
Date of Patent: December 13, 2022
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
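As an illustration of the proximity condition this abstract describes, the mode decision can be sketched as a pair of distance checks. This is a sketch only: the function name, the choice of which two distances to test, and the threshold value are assumptions for illustration, not details from the patent.

```python
import math

def interaction_mode(object_pos, owner_pos, viewer_pos, threshold=2.0):
    """Decide whether to present an object as interactive or limited-interaction.

    The permission condition here requires the owning user's avatar to be
    within `threshold` units of both the object and the viewing user's
    avatar. All positions are (x, y, z) tuples in world space.
    """
    owner_near_object = math.dist(owner_pos, object_pos) <= threshold
    owner_near_viewer = math.dist(owner_pos, viewer_pos) <= threshold
    return "interactive" if owner_near_object and owner_near_viewer else "limited"
```

Either distance check alone, or a different combination, would equally satisfy the abstract's "or other distances" language.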
-
Patent number: 11513592
Abstract: An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
Type: Grant
Filed: April 23, 2021
Date of Patent: November 29, 2022
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
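The gaze-assisted throw described above can be sketched as blending the raw throw direction toward the gaze vector, then integrating a ballistic flight path. The blend weight, the 90 Hz tick rate, and the function names are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

def adjust_throw(motion_velocity, gaze_vector, assist_weight=0.3):
    """Nudge a throw's direction toward the gaze direction, preserving speed.

    A minimal sketch of gaze-based path adjustment: the direction is a
    weighted blend of the motion-input direction and the gaze vector.
    """
    speed = np.linalg.norm(motion_velocity)
    motion_dir = np.asarray(motion_velocity, dtype=float) / speed
    gaze_dir = np.asarray(gaze_vector, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    blended = (1 - assist_weight) * motion_dir + assist_weight * gaze_dir
    blended = blended / np.linalg.norm(blended)
    return blended * speed

def simulate_flight(position, velocity, gravity=(0.0, -9.81, 0.0),
                    dt=1 / 90, steps=90):
    """Integrate a simple ballistic path at a fixed tick rate (explicit Euler)."""
    pos = np.asarray(position, dtype=float)
    vel = np.asarray(velocity, dtype=float)
    g = np.asarray(gravity, dtype=float)
    path = [pos.copy()]
    for _ in range(steps):
        vel = vel + g * dt
        pos = pos + vel * dt
        path.append(pos.copy())
    return path
```

The same blend could instead modify only velocity magnitude, or curve the path mid-flight; the abstract's "motion parameters such as trajectory and velocity" covers several variants.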
-
Patent number: 11146661
Abstract: An endpoint system including one or more computing devices receives user input associated with a first avatar in a shared virtual environment; calculates, based on the user input, motion for a portion of the first avatar, such as a hand; determines, based on the user input, a first gesture state for the first avatar; transmits first location change notifications and a representation of the first gesture state for the first avatar; receives second location change notifications and a representation of a second gesture state for a second avatar; detects a collision between the first avatar and the second avatar based on the first location change notifications and the second location change notifications; and identifies a collaborative gesture based on the detected collision, the first gesture state, and the second gesture state.
Type: Grant
Filed: June 28, 2017
Date of Patent: October 12, 2021
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
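The collision-plus-gesture-state logic above can be sketched as follows. The gesture-state names, the returned labels, and the sphere-overlap collision test are assumptions for illustration; the patent does not specify these particulars.

```python
import math

def detect_collaborative_gesture(hand_pos_a, state_a, hand_pos_b, state_b,
                                 collision_radius=0.15):
    """Identify a collaborative gesture from a hand collision and two gesture states.

    A collision is approximated as the two hand positions falling within a
    shared radius; the gesture is identified only when the collision and both
    gesture states agree. Returns None when no collaborative gesture applies.
    """
    if math.dist(hand_pos_a, hand_pos_b) > collision_radius:
        return None
    if state_a == "open_palm" and state_b == "open_palm":
        return "high_five"
    if state_a == "fist" and state_b == "fist":
        return "fist_bump"
    return None
```

In the described system each endpoint would run this check against the remote avatar's streamed location change notifications and gesture-state representation.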
-
Publication number: 20210240262
Abstract: An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
Type: Application
Filed: April 23, 2021
Publication date: August 5, 2021
Applicant: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
-
Patent number: 10990169
Abstract: An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
Type: Grant
Filed: June 28, 2017
Date of Patent: April 27, 2021
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
-
Patent number: 10962780
Abstract: One or more sensors of a virtual reality device track a pose of the virtual reality device. The virtual reality device requests a virtual image having a perspective corresponding to a future pose from a remote computer. After receiving the requested virtual image, the virtual reality device adjusts the virtual image to an adjusted virtual image having an updated perspective corresponding to an updated tracked pose of the virtual reality device. Then, a virtual reality display displays the adjusted virtual image.
Type: Grant
Filed: October 26, 2015
Date of Patent: March 30, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Anthony Ambrus, Dan Kroymann, Cameron Quinn Egbert, Jeffrey Wallace McGlynn, Michael Ey
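The predict-then-correct loop above (commonly known as late-stage reprojection for remote rendering) can be sketched in one dimension: request a frame rendered for a predicted pose, then shift it by the residual pose error at display time. The yaw-only motion model and the fixed pixels-per-radian scale are simplifying assumptions; real systems warp with a full homography from the 6-DoF pose delta.

```python
import numpy as np

def predict_pose(yaw, angular_velocity, latency_s=0.05):
    """Extrapolate a future yaw (radians) over the expected round-trip latency."""
    return yaw + angular_velocity * latency_s

def reproject(image, rendered_yaw, display_yaw, pixels_per_radian=500):
    """Shift a remotely rendered frame to compensate for late head motion.

    The frame was rendered for `rendered_yaw`; by display time the tracked
    pose is `display_yaw`. Translate the image horizontally by the residual
    yaw error so the world appears to stay still.
    """
    shift = int(round((display_yaw - rendered_yaw) * pixels_per_radian))
    return np.roll(image, -shift, axis=1)
```

The device would call `predict_pose` when issuing the render request and `reproject` just before scan-out, using the freshest tracked pose.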
-
Publication number: 20210069589
Abstract: In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object; or other distances.
Type: Application
Filed: November 20, 2020
Publication date: March 11, 2021
Applicant: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
-
Patent number: 10905956
Abstract: In some embodiments, a detecting endpoint system accessing a shared virtual environment detects a collision between a target avatar and an object within the shared virtual environment. The detecting endpoint system transmits a location change notification for a head of the target avatar. An observer endpoint system moves the head of the target avatar based on the location change notification. A target endpoint system associated with the target avatar does not move its viewpoint based on the location change notification. In some embodiments, this decoupling of viewpoint from the avatar allows for a more immersive experience for all users.
Type: Grant
Filed: June 28, 2017
Date of Patent: February 2, 2021
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
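The viewpoint decoupling described above can be sketched as a flag on the pose update: every endpoint moves the displayed head, but only non-local observers move their camera with it, so the target user's own view never jumps when their avatar is knocked. Class and method names here are illustrative assumptions.

```python
class AvatarView:
    """Track an avatar head pose alongside a possibly decoupled camera viewpoint."""

    def __init__(self, head_pos):
        self.head_pos = list(head_pos)
        self.viewpoint = list(head_pos)

    def apply_location_change(self, new_head_pos, is_local_user):
        """Apply a head location change notification.

        The displayed head always moves; the camera viewpoint follows only on
        observer endpoints, never on the target user's own endpoint.
        """
        self.head_pos = list(new_head_pos)
        if not is_local_user:
            self.viewpoint = list(new_head_pos)
```

On the target user's endpoint, `is_local_user=True` keeps the camera anchored to the tracked headset pose even while others see the avatar's head react to the collision.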
-
Patent number: 10874943
Abstract: In some embodiments of the present disclosure, endpoint systems participating in a shared virtual environment simulate objects locally that a user of the endpoint system is likely to interact with. In some embodiments, object authority is thus managed by the endpoint systems, and is not managed by a central server. In some embodiments, a subsequent endpoint system likely to interact with an object may be predicted, and object authority may be transferred to the subsequent endpoint system before the interaction in order to provide an immersive experience for a user of the subsequent endpoint system. In some embodiments, efficient techniques for transmitting notifications between endpoint systems are provided.
Type: Grant
Filed: June 28, 2017
Date of Patent: December 29, 2020
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Johnathan Bevis, Joshua Wehrly
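The speculative authority handoff above can be sketched as predicting the object's near-future position and choosing the closest endpoint's avatar as the next owner. The linear-extrapolation model, the dictionary layout, and the names are assumptions for illustration, not the patented prediction method.

```python
import math

def predict_next_owner(object_pos, object_vel, endpoints, horizon_s=0.5):
    """Pick the endpoint most likely to interact with a moving object next.

    `endpoints` maps an endpoint id to its avatar's (x, y, z) position. The
    object's position is extrapolated linearly over `horizon_s` seconds, and
    the endpoint whose avatar is nearest that predicted point wins authority.
    """
    future = tuple(p + v * horizon_s for p, v in zip(object_pos, object_vel))
    return min(endpoints, key=lambda eid: math.dist(endpoints[eid], future))
```

Transferring authority to the predicted endpoint before contact lets that user's endpoint simulate the object locally, hiding network latency at the moment of interaction.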
-
Patent number: 10843073
Abstract: In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object; or other distances.
Type: Grant
Filed: June 28, 2017
Date of Patent: November 24, 2020
Assignee: Rec Room Inc.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Publication number: 20190379765
Abstract: An endpoint system including one or more computing devices receives user input associated with a first avatar in a shared virtual environment; calculates, based on the user input, motion for a portion of the first avatar, such as a hand; determines, based on the user input, a first gesture state for the first avatar; transmits first location change notifications and a representation of the first gesture state for the first avatar; receives second location change notifications and a representation of a second gesture state for a second avatar; detects a collision between the first avatar and the second avatar based on the first location change notifications and the second location change notifications; and identifies a collaborative gesture based on the detected collision, the first gesture state, and the second gesture state.
Type: Application
Filed: June 28, 2017
Publication date: December 12, 2019
Applicant: Against Gravity Corp.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Publication number: 20190329129
Abstract: In some embodiments of the present disclosure, endpoint systems participating in a shared virtual environment simulate objects locally that a user of the endpoint system is likely to interact with. In some embodiments, object authority is thus managed by the endpoint systems, and is not managed by a central server. In some embodiments, a subsequent endpoint system likely to interact with an object may be predicted, and object authority may be transferred to the subsequent endpoint system before the interaction in order to provide an immersive experience for a user of the subsequent endpoint system. In some embodiments, efficient techniques for transmitting notifications between endpoint systems are provided.
Type: Application
Filed: June 28, 2017
Publication date: October 31, 2019
Applicant: Against Gravity Corp.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Publication number: 20190217192
Abstract: In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object; or other distances.
Type: Application
Filed: June 28, 2017
Publication date: July 18, 2019
Applicant: Against Gravity Corp.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Publication number: 20190160378
Abstract: In some embodiments, a detecting endpoint system accessing a shared virtual environment detects a collision between a target avatar and an object within the shared virtual environment. The detecting endpoint system transmits a location change notification for a head of the target avatar. An observer endpoint system moves the head of the target avatar based on the location change notification. A target endpoint system associated with the target avatar does not move its viewpoint based on the location change notification. In some embodiments, this decoupling of viewpoint from the avatar allows for a more immersive experience for all users.
Type: Application
Filed: June 28, 2017
Publication date: May 30, 2019
Applicant: Against Gravity Corp.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Publication number: 20190155384
Abstract: An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
Type: Application
Filed: June 28, 2017
Publication date: May 23, 2019
Applicant: Against Gravity Corp.
Inventors: Nicholas Fajt, Cameron Brown, Dan Kroymann, Omer Bilal Orhan, Jonathan Bevis, Joshua Wehrly
-
Patent number: 10008044
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
Type: Grant
Filed: December 23, 2016
Date of Patent: June 26, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam G. Poulos, Evan Michael Keibler, Arthur Tomlin, Cameron Brown, Daniel McCulloch, Brian Mount, Dan Kroymann, Gregory Lowell Alt
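The attach/detach behavior above can be sketched as a two-state toggle driven by triggers; the class shape and names are illustrative assumptions, not the disclosed implementation.

```python
class VirtualObject:
    """Toggle a virtual object between free-floating and surface-attached display."""

    def __init__(self):
        self.attached_surface = None  # None means the object is free-floating

    def on_trigger(self, surface=None):
        """Handle an attach or detach trigger.

        A trigger with a candidate surface attaches a free-floating object;
        a trigger on an attached object detaches it back to free-floating.
        """
        if self.attached_surface is None and surface is not None:
            self.attached_surface = surface
        elif self.attached_surface is not None:
            self.attached_surface = None

    @property
    def is_free_floating(self):
        return self.attached_surface is None
```

In practice the trigger might be a gaze dwell, gesture, or proximity event; the abstract leaves the trigger type open.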
-
Patent number: 9846968
Abstract: A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras at user-defined positions within the mixed reality environment. The system renders virtual objects in the mixed reality environment from the perspective of the one or more cameras. Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras on one or more external 2D monitors for viewing by others.
Type: Grant
Filed: June 2, 2015
Date of Patent: December 19, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Arthur Charles Tomlin, Evan Michael Keibler, Nicholas Gervase Fajt, Brian J. Mount, Gregory Lowell Alt, Jorge Tosar, Jonathan Michael Lyons, Anthony J. Ambrus, Cameron Quinn Egbert, Will Guyman, Jeff W. McGlynn, Jeremy Hance, Roger Sebastian-Kevin Sylvan, Alexander Georg Pfaffe, Dan Kroymann, Erik Andrew Saltwell, Chris Word
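Rendering virtual objects from a user-placed camera's perspective reduces to transforming world-space points into that camera's local frame. A yaw-only sketch follows; real mixed reality capture uses a full 6-DoF camera pose and a projection matrix, and the function name is an assumption.

```python
import numpy as np

def world_to_camera(point, cam_pos, cam_yaw):
    """Transform a world-space point into a camera's local frame (yaw only).

    Translate by the camera position, then rotate by the inverse of the
    camera's yaw about the vertical (y) axis.
    """
    c, s = np.cos(-cam_yaw), np.sin(-cam_yaw)
    rot = np.array([[c, 0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s, 0.0, c]])
    return rot @ (np.asarray(point, dtype=float) - np.asarray(cam_pos, dtype=float))
```

Compositing then overlays the virtual objects, rendered through this transform, onto the physical camera's video feed for the external monitor.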
-
Patent number: 9779512
Abstract: Methods for automatically generating a texture exemplar that may be used for rendering virtual objects that appear to be made from the texture exemplar are described. In some embodiments, a head-mounted display device (HMD) may identify a real-world object within an environment, acquire a three-dimensional model of the real-world object, determine a portion of the real-world object from which a texture exemplar is to be generated, capture one or more images of the portion of the real-world object, determine an orientation of the real-world object, and generate the texture exemplar using the one or more images, the three-dimensional model, and the orientation of the real-world object. The HMD may then render and display images of a virtual object such that the virtual object appears to be made from a virtual material associated with the texture exemplar.
Type: Grant
Filed: January 29, 2015
Date of Patent: October 3, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Arthur C. Tomlin, Roger Sebastian-Kevin Sylvan, Dan Kroymann, Cameron G. Brown, Nicholas Gervase Fajt
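The final step of the pipeline above, turning a captured patch into a reusable texture, can be caricatured as crop-and-tile. This deliberately ignores the 3-D model and orientation handling the abstract describes (which would rectify the patch before sampling), so treat it purely as an illustration with assumed names.

```python
import numpy as np

def make_texture_exemplar(captured, region, exemplar_shape):
    """Crop a region of a captured image and tile it into a texture exemplar.

    `region` is (row, col, height, width) in the captured image, assumed to be
    already rectified for the object's orientation. The patch is repeated to
    fill `exemplar_shape` (rows, cols).
    """
    r, c, h, w = region
    patch = captured[r:r + h, c:c + w]
    reps = (exemplar_shape[0] // h + 1, exemplar_shape[1] // w + 1)
    tiled = np.tile(patch, reps)
    return tiled[:exemplar_shape[0], :exemplar_shape[1]]
```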
-
Publication number: 20170115488
Abstract: One or more sensors of a virtual reality device track a pose of the virtual reality device. The virtual reality device requests a virtual image having a perspective corresponding to a future pose from a remote computer. After receiving the requested virtual image, the virtual reality device adjusts the virtual image to an adjusted virtual image having an updated perspective corresponding to an updated tracked pose of the virtual reality device. Then, a virtual reality display displays the adjusted virtual image.
Type: Application
Filed: October 26, 2015
Publication date: April 27, 2017
Inventors: Anthony Ambrus, Dan Kroymann, Cameron Quinn Egbert, Jeffrey Wallace McGlynn, Michael Ey
-
Publication number: 20170103583
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
Type: Application
Filed: December 23, 2016
Publication date: April 13, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Adam G. Poulos, Evan Michael Keibler, Arthur Tomlin, Cameron Brown, Daniel McCulloch, Brian Mount, Dan Kroymann, Gregory Lowell Alt