Patents by Inventor Savannah Niles
Savannah Niles has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240378831
Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable to use in various copresence environments where the user desires to maintain privacy with reference to the above-noted characteristics. Neutral avatars may be configured to convey, in real time, actions and behaviors of the corresponding user without using literal forms of the user's actions and behaviors.
Type: Application
Filed: July 24, 2024
Publication date: November 14, 2024
Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
-
Publication number: 20240338912
Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
Type: Application
Filed: June 18, 2024
Publication date: October 10, 2024
Inventors: James M. Powderly, Savannah Niles, Frank Alexander Hamilton, IV, Marshal Ainsworth Fontaine, Paul Armistead Hoover
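The contextual aperture adjustment this abstract describes can be sketched as a simple heuristic. The function name, the density metric, and the degree bounds below are illustrative assumptions, not the patent's actual method:

```python
def cone_aperture(object_density, min_deg=1.0, max_deg=15.0):
    """Pick a cone-cast aperture angle (degrees) from contextual information.

    Dense clusters of objects get a narrow cone for precise selection;
    sparse regions get a wide cone so the cast still hits something.
    `object_density` is a hypothetical objects-per-unit-solid-angle measure.
    """
    if object_density <= 0:
        return max_deg
    # Shrink the aperture as density grows, clamped to [min_deg, max_deg].
    return max(min_deg, min(max_deg, max_deg / (1.0 + object_density)))
```

An empty scene yields the widest cone, while a crowded one collapses toward the minimum, which matches the abstract's idea of sizing the cone from context rather than using a fixed aperture.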
-
Publication number: 20240329755
Abstract: Systems and methods for interacting with virtual objects in a three-dimensional space using a wearable system are disclosed. The wearable system can be programmed to permit user interaction with interactable objects in a field of regard (FOR) of a user. The FOR includes a portion of the environment around the user that is capable of being perceived by the user via the AR system. The system can determine a group of interactable objects in the FOR of the user and determine a pose of the user. The system can update, based on a change in the pose or a field of view (FOV) of the user, a subgroup of the interactable objects that are located in the FOV of the user and receive a selection of a target interactable object from the subgroup of interactable objects. The system can initiate a selection event on the target interactable object.
Type: Application
Filed: June 14, 2024
Publication date: October 3, 2024
Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Rony Abovitz, Alysha Naples
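A minimal sketch of the FOR-to-FOV narrowing step described above, assuming a simple view cone around the user's forward direction; all names and the 30° half-angle are illustrative, not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class Interactable:
    name: str
    position: tuple  # (x, y, z) world coordinates

def in_fov(user_pos, forward, obj_pos, half_angle_deg):
    """True if obj_pos falls inside the view cone along the unit vector `forward`."""
    v = tuple(o - u for o, u in zip(obj_pos, user_pos))
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0:
        return True  # an object at the user's own position is trivially "in view"
    cos_angle = sum(f * c for f, c in zip(forward, v)) / norm
    return cos_angle >= math.cos(math.radians(half_angle_deg))

def fov_subgroup(user_pos, forward, field_of_regard, half_angle_deg=30.0):
    """Narrow the field-of-regard group to the subgroup inside the current FOV."""
    return [obj for obj in field_of_regard
            if in_fov(user_pos, forward, obj.position, half_angle_deg)]
```

Recomputing the subgroup whenever head pose changes keeps the selectable set in sync with where the user is looking, which is the update step the abstract describes.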
-
Patent number: 12079943
Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable to use in various copresence environments where the user desires to maintain privacy with reference to the above-noted characteristics. Neutral avatars may be configured to convey, in real time, actions and behaviors of the corresponding user without using literal forms of the user's actions and behaviors.
Type: Grant
Filed: March 31, 2023
Date of Patent: September 3, 2024
Assignee: MAGIC LEAP, INC.
Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
-
Patent number: 12067209
Abstract: A light emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.
Type: Grant
Filed: December 29, 2022
Date of Patent: August 20, 2024
Assignee: Magic Leap, Inc.
Inventors: James M. Powderly, Savannah Niles, Christopher David Nesladek, Isioma Osagbemwenorue Azu, Marshal Ainsworth Fontaine, Haney Awad, William Wheeler, Brian David Schwab, Brian Edward Oliver Bucknor
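The event-to-light-pattern idea in this abstract could be sketched as a lookup table; the event names, colors, and animations below are invented for illustration and are not taken from the patent:

```python
# Hypothetical table mapping device events to halo light patterns.
PATTERNS = {
    "notification": {"color": "blue", "animation": "pulse"},
    "pairing": {"color": "green", "animation": "spin"},
    "scroll": {"color": "white", "animation": "sweep"},
    "object_nearby": {"color": "amber", "animation": "glow"},
}

IDLE = {"color": "off", "animation": "none"}

def light_pattern(event):
    """Return the light pattern the device's halo should show for an event."""
    return PATTERNS.get(event, IDLE)
```

A table like this keeps the visual vocabulary in one place, so pairing, calibration, and notification cues stay distinct and easy to extend.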
-
Patent number: 12056293
Abstract: Systems and methods for interacting with virtual objects in a three-dimensional space using a wearable system are disclosed. The wearable system can be programmed to permit user interaction with interactable objects in a field of regard (FOR) of a user. The FOR includes a portion of the environment around the user that is capable of being perceived by the user via the AR system. The system can determine a group of interactable objects in the FOR of the user and determine a pose of the user. The system can update, based on a change in the pose or a field of view (FOV) of the user, a subgroup of the interactable objects that are located in the FOV of the user and receive a selection of a target interactable object from the subgroup of interactable objects. The system can initiate a selection event on the target interactable object.
Type: Grant
Filed: July 3, 2023
Date of Patent: August 6, 2024
Assignee: MAGIC LEAP, INC.
Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Rony Abovitz, Alysha Naples
-
Patent number: 12051167
Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
Type: Grant
Filed: April 11, 2023
Date of Patent: July 30, 2024
Assignee: MAGIC LEAP, INC.
Inventors: James M. Powderly, Savannah Niles, Frank Alexander Hamilton, IV, Marshal Ainsworth Fontaine, Paul Armistead Hoover
-
Publication number: 20240211028
Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
Type: Application
Filed: March 5, 2024
Publication date: June 27, 2024
Inventors: James M. Powderly, Savannah Niles, Jennifer M.R. Devine, Adam C. Carlson, Jeffrey Scott Sommers, Praveen Babu J D, Ajoy Savio Fernandes, Anthony Robert Sheeder
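One way to picture the multimodal fusion this abstract describes: voice supplies the verb, eye gaze supplies the object, and a gesture can veto the command. This is a hypothetical sketch; the verb list, gesture names, and function signature are assumptions, not the patent's method:

```python
def resolve_command(gaze_target, gesture, voice_text):
    """Fuse multiple inputs into one (verb, object) command.

    Voice supplies the verb, eye gaze supplies the target object, and a
    'cancel' gesture vetoes the whole command. Returns None when the
    inputs do not combine into an executable command.
    """
    verbs = {"move", "delete", "open", "select"}
    verb = next((w for w in voice_text.lower().split() if w in verbs), None)
    if verb is None or gaze_target is None or gesture == "cancel":
        return None
    return (verb, gaze_target)
```

Splitting the command across modalities lets each input do what it is best at: speech disambiguates intent ("move that"), while gaze resolves the deictic "that" to a concrete object.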
-
Publication number: 20240201823
Abstract: A light-emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light-emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light-emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.
Type: Application
Filed: February 29, 2024
Publication date: June 20, 2024
Inventors: James M. Powderly, Savannah Niles, Christopher David Nesladek, Isioma Osagbemwenorue Azu, Marshal Ainsworth Fontaine, Haney Awad, William Wheeler, Brian David Schwab, Brian Edward Oliver Bucknor
-
Patent number: 11960636
Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
Type: Grant
Filed: December 22, 2021
Date of Patent: April 16, 2024
Assignee: MAGIC LEAP, INC.
Inventors: James M. Powderly, Savannah Niles, Jennifer M. R. Devine, Adam C. Carlson, Jeffrey Scott Sommers, Praveen Babu J D, Ajoy Savio Fernandes, Anthony Robert Sheeder
-
Publication number: 20240087261
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Application
Filed: November 17, 2023
Publication date: March 14, 2024
Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
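The handle-relative positioning in this abstract can be sketched for the translation-only case: the second user's position is expressed as an offset from their session handle, then re-applied at the first user's handle to place the avatar. Function names are illustrative, and a full implementation would also account for rotation:

```python
def to_handle_offset(handle_pos, world_pos):
    """Express a world position as an offset from a session handle
    (translation only for this sketch)."""
    return tuple(w - h for w, h in zip(world_pos, handle_pos))

def from_handle_offset(handle_pos, offset):
    """Re-apply a handle-relative offset at a different user's handle."""
    return tuple(h + o for h, o in zip(handle_pos, offset))
```

For example, a remote user standing at (11, 0, 9) relative to their handle at (10, 0, 10) would appear at (3, 0, -1) to a local user whose handle sits at (2, 0, 0), so both users see the same spatial relationship to the shared session.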
-
Patent number: 11861803
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Grant
Filed: September 13, 2022
Date of Patent: January 2, 2024
Assignee: Magic Leap, Inc.
Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
-
Publication number: 20230350504
Abstract: Systems and methods for interacting with virtual objects in a three-dimensional space using a wearable system are disclosed. The wearable system can be programmed to permit user interaction with interactable objects in a field of regard (FOR) of a user. The FOR includes a portion of the environment around the user that is capable of being perceived by the user via the AR system. The system can determine a group of interactable objects in the FOR of the user and determine a pose of the user. The system can update, based on a change in the pose or a field of view (FOV) of the user, a subgroup of the interactable objects that are located in the FOV of the user and receive a selection of a target interactable object from the subgroup of interactable objects. The system can initiate a selection event on the target interactable object.
Type: Application
Filed: July 3, 2023
Publication date: November 2, 2023
Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Rony Abovitz, Alysha Naples
-
Publication number: 20230298287
Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable to use in various copresence environments where the user desires to maintain privacy with reference to the above-noted characteristics. Neutral avatars may be configured to convey, in real time, actions and behaviors of the corresponding user without using literal forms of the user's actions and behaviors.
Type: Application
Filed: March 31, 2023
Publication date: September 21, 2023
Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
-
Publication number: 20230273431
Abstract: An apparatus for use with an image display device configured for head-worn by a user, includes: a screen; and a processing unit configured to assign a first area of the screen to sense finger-action of the user; wherein the processing unit is configured to generate an electronic signal to cause a change in a content displayed by the display device based on the finger-action of the user sensed by the assigned first area of the screen of the apparatus.
Type: Application
Filed: May 5, 2023
Publication date: August 31, 2023
Applicant: MAGIC LEAP, INC.
Inventors: Lorena Pazmino, Andrea Isabel Montoya, Savannah Niles, Alexander Rocha, Mario Antonio Bragg, Parag Goel, Jeffrey Scott Sommers, David Charles Lundmark
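The assigned screen area for sensing finger-action can be pictured as a rectangular region that converts touches inside it into a control signal and ignores everything else. The names and the normalized (u, v) output below are assumptions for illustration:

```python
def make_touch_region(x, y, w, h):
    """Assign a rectangular screen area to sense finger actions.

    Returns a handler that maps a touch inside the region to a normalized
    (u, v) control signal in [0, 1) and ignores touches outside the
    region (returns None).
    """
    def handle(tx, ty):
        if x <= tx < x + w and y <= ty < y + h:
            return ((tx - x) / w, (ty - y) / h)
        return None
    return handle
```

The normalized signal could then drive a change in the head-worn display's content, such as scrolling or cursor movement, independent of where the region sits on the companion screen.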
-
Patent number: 11741917
Abstract: Systems and methods for displaying a cursor and a focus indicator associated with real or virtual objects in a virtual, augmented, or mixed reality environment by a wearable display device are disclosed. The system can determine a spatial relationship between a user-movable cursor and a target object within the environment. The system may render a focus indicator (e.g., a halo, shading, or highlighting) around or adjacent objects that are near the cursor. The focus indicator may be emphasized in directions closer to the cursor and deemphasized in directions farther from the cursor. When the cursor overlaps with a target object, the system can render the object in front of the cursor (or not render the cursor at all), so the object is not occluded by the cursor. The cursor and focus indicator can provide the user with positional feedback and help the user navigate among objects in the environment.
Type: Grant
Filed: May 27, 2022
Date of Patent: August 29, 2023
Assignee: MAGIC LEAP, INC.
Inventors: John Austin Day, Lorena Pazmino, James Cameron Petty, Paul Armistead Hoover, Chris Sorrell, James M. Powderly, Savannah Niles
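A minimal sketch of the cursor-proximity emphasis described above, reduced to a distance-based halo opacity; the function name and falloff parameter are assumptions, and the per-direction emphasis in the abstract is omitted for brevity:

```python
import math

def halo_alpha(obj_pos, obj_radius, cursor_pos, falloff=2.0):
    """Opacity of a focus halo around an object near the cursor.

    Full strength when the cursor sits on the object's center, fading
    linearly with distance, and vanishing beyond `falloff` object radii.
    """
    d = math.dist(obj_pos, cursor_pos)
    return max(0.0, 1.0 - d / (falloff * obj_radius))
```

Rendering each nearby object's halo at this opacity gives the continuous positional feedback the abstract describes: halos brighten as the cursor approaches and fade as it moves away.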
-
Publication number: 20230266859
Abstract: Systems and methods for displaying a cursor and a focus indicator associated with real or virtual objects in a virtual, augmented, or mixed reality environment by a wearable display device are disclosed. The system can determine a spatial relationship between a user-movable cursor and a target object within the environment. The system may render a focus indicator (e.g., a halo, shading, or highlighting) around or adjacent objects that are near the cursor. When the cursor overlaps with a target object, the system can render the object in front of the cursor (or not render the cursor at all), so the object is not occluded by the cursor. The object can be rendered closer to the user than the cursor. A group of virtual objects can be scrolled, and a virtual control panel can be displayed indicating objects that are upcoming in the scroll.
Type: Application
Filed: December 21, 2022
Publication date: August 24, 2023
Inventors: John Austin Day, Lorena Pazmino, James Cameron Petty, Paul Armistead Hoover, Chris Sorrell, James M. Powderly, Savannah Niles, Richard St. Claire Bailey
-
Patent number: 11733786
Abstract: Systems and methods for interacting with virtual objects in a three-dimensional space using a wearable system are disclosed. The wearable system can be programmed to permit user interaction with interactable objects in a field of regard (FOR) of a user. The FOR includes a portion of the environment around the user that is capable of being perceived by the user via the AR system. The system can determine a group of interactable objects in the FOR of the user and determine a pose of the user. The system can update, based on a change in the pose or a field of view (FOV) of the user, a subgroup of the interactable objects that are located in the FOV of the user and receive a selection of a target interactable object from the subgroup of interactable objects. The system can initiate a selection event on the target interactable object.
Type: Grant
Filed: November 4, 2022
Date of Patent: August 22, 2023
Assignee: Magic Leap, Inc.
Inventors: James M. Powderly, Savannah Niles, Frank Hamilton, Marshal A. Fontaine, Rony Abovitz, Alysha Naples
-
Publication number: 20230245406
Abstract: A wearable system can comprise a display system configured to present virtual content in a three-dimensional space, a user input device configured to receive a user input, and one or more sensors configured to detect a user's pose. The wearable system can support various user interactions with objects in the user's environment based on contextual information. As an example, the wearable system can adjust the size of an aperture of a virtual cone during a cone cast (e.g., with the user's poses) based on the contextual information. As another example, the wearable system can adjust the amount of movement of virtual objects associated with an actuation of the user input device based on the contextual information.
Type: Application
Filed: April 11, 2023
Publication date: August 3, 2023
Inventors: James M. Powderly, Savannah Niles, Frank Alexander Hamilton, IV, Marshal Ainsworth Fontaine, Paul Armistead Hoover
-
Publication number: 20230221831
Abstract: A light emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.
Type: Application
Filed: December 29, 2022
Publication date: July 13, 2023
Inventors: James M. Powderly, Savannah Niles, Christopher David Nesladek, Isioma Osagbemwenorue Azu, Marshal Ainsworth Fontaine, Haney Awad, William Wheeler, Brian David Schwab, Brian Edward Oliver Bucknor