Patents by Inventor Karen Stolzenberg
Karen Stolzenberg has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11954243
Abstract: An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.
Type: Grant
Filed: July 11, 2022
Date of Patent: April 9, 2024
Assignee: MAGIC LEAP, INC.
Inventors: Amy Dedonato, James Cameron Petty, Griffith Buckley Hazen, Jordan Alexander Cazamias, Karen Stolzenberg
-
Publication number: 20240094865
Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
Type: Application
Filed: November 29, 2023
Publication date: March 21, 2024
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
-
Patent number: 11936733
Abstract: A host device having a first processor executes an application via the first processor. The host device determines a state of the application. A scenegraph is generated corresponding to the state of the application, and the scenegraph is presented to a remote device having a display and a second processor. The remote device is configured to, in response to receiving the scenegraph, render to the display a view corresponding to the scenegraph, without executing the application via the second processor.
Type: Grant
Filed: November 8, 2021
Date of Patent: March 19, 2024
Assignee: Magic Leap, Inc.
Inventors: Praveen Babu J D, Karen Stolzenberg, Jehangir Tajik, Rohit Anil Talwalkar, Colman Thomas Bryant, Leonid Zolotarev
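The host/remote split this abstract describes can be illustrated with a small sketch (a hypothetical illustration, not code from the patent — the node fields, draw commands, and JSON transport are all assumptions): the host serializes a scenegraph capturing application state, and the remote produces a view by walking the received graph without ever executing the application itself.

```python
import json

def serialize_scenegraph(node):
    """Host side: flatten a scenegraph (nested dicts) to a JSON payload."""
    return json.dumps(node)

def render_scenegraph(payload):
    """Remote side: walk the received graph and emit draw commands.
    The remote never runs application code; it only interprets the graph."""
    graph = json.loads(payload)
    commands = []

    def walk(node):
        commands.append(f"draw {node['type']} at {tuple(node['position'])}")
        for child in node.get("children", []):
            walk(child)

    walk(graph)
    return commands

# Host: application state captured as a scenegraph (hypothetical shape).
scene = {
    "type": "root", "position": [0, 0, 0],
    "children": [
        {"type": "cube", "position": [1, 0, 0], "children": []},
        {"type": "label", "position": [0, 2, 0], "children": []},
    ],
}

payload = serialize_scenegraph(scene)   # sent over the wire to the remote
print(render_scenegraph(payload))
```

Because only the serialized graph crosses the boundary, the remote device needs no copy of the application, which is the property the claim turns on.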
-
Patent number: 11935198
Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, the second device presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
Type: Grant
Filed: June 29, 2021
Date of Patent: March 19, 2024
Assignee: Snap Inc.
Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
-
Publication number: 20240087261
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Application
Filed: November 17, 2023
Publication date: March 14, 2024
Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
-
Patent number: 11928306
Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and in response to the content capture user input, capturing a new content item corresponding to the scene.
Type: Grant
Filed: September 20, 2021
Date of Patent: March 12, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
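The carousel navigation described in this abstract reduces to a small amount of state: a ring of icons and an offset marking which icon sits in the selection position. The sketch below is a hypothetical illustration of that flow (class name, icon names, and swipe granularity are all assumptions, not from the patent):

```python
class IconCarousel:
    """Minimal model of a wraparound icon carousel with one selection slot."""

    def __init__(self, icons):
        self.icons = list(icons)
        self.offset = 0  # index of the icon currently in the selection position

    def rotate(self, delta):
        """Horizontal touchpad input rotates the carousel left or right,
        wrapping around at either end."""
        self.offset = (self.offset + delta) % len(self.icons)

    def selected(self):
        """Tap input selects whichever icon sits in the selection position."""
        return self.icons[self.offset]

carousel = IconCarousel(["sparkle", "face-swap", "color-pop", "retro"])
carousel.rotate(+2)          # two swipes right
print(carousel.selected())   # -> "color-pop"
carousel.rotate(-3)          # three swipes left, wrapping past the start
print(carousel.selected())   # -> "retro"
```

The same state machine serves both the AR-effects carousel here and the recipient carousel in the content-sending entries above; only the icon payloads differ.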
-
Publication number: 20240061515
Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
Type: Application
Filed: November 3, 2023
Publication date: February 22, 2024
Inventors: Viktoria Hwang, Karen Stolzenberg
-
Patent number: 11880542
Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
Type: Grant
Filed: September 20, 2021
Date of Patent: January 23, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
-
Patent number: 11861803
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Grant
Filed: September 13, 2022
Date of Patent: January 2, 2024
Assignee: Magic Leap, Inc.
Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
-
Patent number: 11863596
Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The method may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
Type: Grant
Filed: December 7, 2021
Date of Patent: January 2, 2024
Assignee: Snap Inc.
Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
-
Patent number: 11861070
Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
Type: Grant
Filed: April 13, 2022
Date of Patent: January 2, 2024
Assignee: Snap Inc.
Inventors: Viktoria Hwang, Karen Stolzenberg
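The gesture-library matching this abstract describes can be sketched in a few lines (a hypothetical illustration only: the per-frame shape labels, gesture names, and contiguous-run matching rule are assumptions, not the patented classifier):

```python
# Each detected hand shape is reduced to a coarse per-frame label, and each
# predefined gesture in the library is an ordered sequence of such labels.
GESTURE_LIBRARY = {
    "open_menu":  ["fist", "half-open", "open-palm"],
    "close_menu": ["open-palm", "half-open", "fist"],
}

def match_gesture(frame_shapes, library=GESTURE_LIBRARY):
    """Return the first gesture whose label sequence appears in order as a
    contiguous run within the detected frames, else None."""
    for name, pattern in library.items():
        n = len(pattern)
        for i in range(len(frame_shapes) - n + 1):
            if frame_shapes[i:i + n] == pattern:
                return name
    return None

frames = ["fist", "fist", "half-open", "open-palm", "open-palm"]
print(match_gesture(frames))  # -> "open_menu"
```

On a match, the system would dispatch the associated action; an "open_menu" match, for instance, would start the element animation moving graphical elements away from the menu icon.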
-
Publication number: 20230412650
Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The method may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
Type: Application
Filed: August 25, 2023
Publication date: December 21, 2023
Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
-
Publication number: 20230315197
Abstract: A wearable computing system that includes a head-mounted display implements a gaze timer feature for enabling the user to temporarily extend the functionality of a handheld controller or other user input device. In one embodiment, when the user gazes at, or in the vicinity of, a handheld controller for a predetermined period of time, the functionality of one or more input elements (e.g., buttons) of the handheld controller is temporarily modified. For example, the function associated with a particular controller button may be modified to enable the user to open a particular menu using the button. The gaze timer feature may, for example, be used to augment the functionality of a handheld controller or other user input device during mixed reality and/or augmented reality sessions.
Type: Application
Filed: May 31, 2023
Publication date: October 5, 2023
Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day
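The gaze-timer idea amounts to a dwell-time accumulator gating a button remap. A minimal sketch, assuming an invented threshold, button action names, and update loop (none of which come from the application):

```python
# Dwell threshold (seconds) before the button's function is remapped;
# the value here is purely illustrative.
GAZE_THRESHOLD_S = 1.0

class GazeTimerRemap:
    def __init__(self):
        self.gaze_time = 0.0
        self.default_action = "grab"       # normal button function (assumed)
        self.extended_action = "open_menu" # temporarily remapped function (assumed)

    def update(self, dt, gazing_at_controller):
        """Accumulate gaze dwell time; reset when the gaze leaves the controller."""
        self.gaze_time = self.gaze_time + dt if gazing_at_controller else 0.0

    def trigger_action(self):
        """Button press: remapped only while the gaze timer is satisfied."""
        if self.gaze_time >= GAZE_THRESHOLD_S:
            return self.extended_action
        return self.default_action

remap = GazeTimerRemap()
remap.update(0.5, gazing_at_controller=True)
print(remap.trigger_action())  # -> "grab" (dwell too short)
remap.update(0.6, gazing_at_controller=True)
print(remap.trigger_action())  # -> "open_menu" (sustained gaze ~1.1 s)
```

Resetting the accumulator the moment the gaze moves away is what makes the extension "temporary": the button reverts to its default function as soon as attention leaves the controller.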
-
Publication number: 20230298287
Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable to use in various copresence environments where the user desires to maintain privacy with reference to the above-noted characteristics. Neutral avatars may be configured to convey, in real-time, actions and behaviors of the corresponding user without using literal forms of the user's actions and behaviors.
Type: Application
Filed: March 31, 2023
Publication date: September 21, 2023
Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
-
Publication number: 20230290031
Abstract: Examples of systems and methods for rendering an avatar in a mixed reality environment are disclosed. The systems and methods may be configured to automatically scale an avatar or to render an avatar based on a determined intention of a user, an interesting impulse, environmental stimuli, or user saccade points. The disclosed systems and methods may apply discomfort curves when rendering an avatar. The disclosed systems and methods may provide a more realistic interaction between a human user and an avatar.
Type: Application
Filed: May 19, 2023
Publication date: September 14, 2023
Inventors: Thomas Marshall Miller, IV, Josh Anon, Frank Alexander Hamilton, IV, Cole Parker Heiner, Victor Ng-Thow-Hing, Rodrigo Cano, Karen Stolzenberg, Lorena Pazmino, Gregory Minh Tran, Stephane Antoine Joseph Imbert, Anthony Marinello
-
Patent number: 11703943
Abstract: A wearable computing system that includes a head-mounted display implements a gaze timer feature for enabling the user to temporarily extend the functionality of a handheld controller or other user input device. In one embodiment, when the user gazes at, or in the vicinity of, a handheld controller for a predetermined period of time, the functionality of one or more input elements (e.g., buttons) of the handheld controller is temporarily modified. For example, the function associated with a particular controller button may be modified to enable the user to open a particular menu using the button. The gaze timer feature may, for example, be used to augment the functionality of a handheld controller or other user input device during mixed reality and/or augmented reality sessions.
Type: Grant
Filed: December 9, 2021
Date of Patent: July 18, 2023
Assignee: MAGIC LEAP, INC.
Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day
-
Patent number: 11699255
Abstract: Examples of systems and methods for rendering an avatar in a mixed reality environment are disclosed. The systems and methods may be configured to automatically scale an avatar or to render an avatar based on a determined intention of a user, an interesting impulse, environmental stimuli, or user saccade points. The disclosed systems and methods may apply discomfort curves when rendering an avatar. The disclosed systems and methods may provide a more realistic interaction between a human user and an avatar.
Type: Grant
Filed: June 6, 2022
Date of Patent: July 11, 2023
Assignee: Magic Leap, Inc.
Inventors: Thomas Marshall Miller, IV, Josh Anon, Frank Alexander Hamilton, IV, Cole Parker Heiner, Victor Ng-Thow-Hing, Rodrigo Cano, Karen Stolzenberg, Lorena Pazmino, Gregory Minh Tran, Stephane Antoine Joseph Imbert, Anthony Marinello
-
Publication number: 20230214015
Abstract: Disclosed are examples of systems and methods for interacting with content and updating the location and orientation of that content using a single controller. The system may allow a user to use the same controller for moving content around the room and interacting with that content by tracking a range of the motion of the controller.
Type: Application
Filed: March 15, 2023
Publication date: July 6, 2023
Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day, Eric Norman Yiskis
-
Publication number: 20230179641
Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The method may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
Type: Application
Filed: December 7, 2021
Publication date: June 8, 2023
Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
-
Publication number: 20230154445
Abstract: Disclosed is a method of providing a music creation interface using a head-mounted device, including displaying first and second geometric loops fixed relative to a location in the real world, the first and second geometric loops each including a plurality of beat indicators. The second geometric loop is spaced apart from the first geometric loop. An interface comprising a plurality of sound or note icons is displayed, and in response to receiving user selection to move a selected sound or note icon to a particular beat indicator on one of the geometric loops, the selected sound or note icon is displayed at the particular beat indicator. In use, the geometric loops are rotated relative to at least one play indicator, and the selected sound or note icon is rendered when it reaches the at least one play indicator.
Type: Application
Filed: November 15, 2021
Publication date: May 18, 2023
Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
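The playback model behind this interface is a step sequencer: sounds placed on beat indicators fire as the loop rotates past a fixed play indicator. The sketch below is a hypothetical illustration of one such loop (beat count, sound names, and the one-beat-per-step rotation are assumptions, not from the application):

```python
class GeometricLoop:
    """One rotating loop of beat indicators with a fixed play indicator."""

    def __init__(self, num_beats):
        self.num_beats = num_beats
        self.sounds = {}    # beat index -> sound/note icon placed there
        self.rotation = 0   # which beat currently faces the play indicator

    def place(self, beat, sound):
        """Drop a sound or note icon onto a particular beat indicator."""
        self.sounds[beat] = sound

    def step(self):
        """Rotate the loop one beat and render whatever sound (if any)
        has just reached the play indicator."""
        self.rotation = (self.rotation + 1) % self.num_beats
        return self.sounds.get(self.rotation)

loop = GeometricLoop(num_beats=8)
loop.place(2, "kick")
loop.place(5, "snare")
played = [loop.step() for _ in range(8)]  # one full revolution
print(played)  # -> [None, 'kick', None, None, 'snare', None, None, None]
```

The two spaced-apart loops in the abstract would simply be two such instances stepped in lockstep, each against its own (or a shared) play indicator.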