Patents by Inventor Karen Stolzenberg
Karen Stolzenberg has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12141367
Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
Type: Grant
Filed: November 3, 2023
Date of Patent: November 12, 2024
Assignee: Snap Inc.
Inventors: Viktoria Hwang, Karen Stolzenberg
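The matching step in this abstract — comparing a detected series of hand shapes against a library of predefined gestures and returning the associated action — can be sketched as follows. This is a minimal illustration, not code from the patent; the gesture names, library contents, and function names are all invented for the example.

```python
# Hypothetical gesture library: each key is a series of hand shapes, each
# value the action executed when the detected series matches that gesture.
GESTURE_LIBRARY = {
    ("open_palm", "spread_fingers"): "open_menu",    # opening gesture
    ("spread_fingers", "closed_fist"): "close_menu",  # closing gesture
}

def match_gesture(detected_shapes):
    """Return the action for the library gesture matching the detected
    series of hand shapes, or None when nothing matches."""
    series = tuple(detected_shapes)
    for gesture, action in GESTURE_LIBRARY.items():
        if series == gesture:
            return action
    return None

print(match_gesture(["open_palm", "spread_fingers"]))  # open_menu
```

A real system would match shape sequences approximately (per-frame classification plus temporal tolerance) rather than by exact tuple equality; the exact match here only illustrates the lookup structure.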
-
Patent number: 12141415
Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
Type: Grant
Filed: November 29, 2023
Date of Patent: November 12, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
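The recipient-carousel interaction described in this abstract — horizontal input scrolls the carousel, a tap selects whoever sits in the selection position — can be modeled with a small state machine. This is an illustrative sketch under assumed names, not Snap's implementation or API.

```python
class RecipientCarousel:
    """Toy model of a touchpad-driven recipient carousel."""

    def __init__(self, recipients):
        self.recipients = recipients
        self.index = 0  # recipient currently in the selection position

    def scroll(self, delta):
        # Positive delta scrolls right, negative scrolls left; wrap around
        # so the carousel behaves as a ring of recipients.
        self.index = (self.index + delta) % len(self.recipients)

    def tap(self):
        # A tap selects whichever recipient sits in the selection position.
        return self.recipients[self.index]

carousel = RecipientCarousel(["Ana", "Ben", "Chris"])
carousel.scroll(+1)   # horizontal touch input to the right
carousel.scroll(-2)   # scroll left past the start; wraps to the end
print(carousel.tap())  # Chris
```

The wrap-around via the modulo operator is one design choice; a carousel could equally clamp at the ends of the list.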
-
Patent number: 12135840
Abstract: Example systems, devices, media, and methods are described for controlling the presentation of one or more virtual or graphical elements on a display in response to bimanual hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a first hand and defines an input plane relative to a surface of the detected first hand. The image processing system also detects a series of bimanual hand shapes, including the detected first hand and at least one fingertip of a second hand. In response, the system presents a first movable element on the display at a location that is correlated with the current fingertip location. In addition, the image processing system determines whether the detected series of bimanual hand shapes matches a predefined hand gesture. In response to a matching gesture, the system executes a selecting action of an element nearest the current fingertip location.
Type: Grant
Filed: December 15, 2022
Date of Patent: November 5, 2024
Assignee: Snap Inc.
Inventor: Karen Stolzenberg
-
Publication number: 20240353919
Abstract: A wearable computing system that includes a head-mounted display implements a gaze timer feature for enabling the user to temporarily extend the functionality of a handheld controller or other user input device. In one embodiment, when the user gazes at, or in the vicinity of, a handheld controller for a predetermined period of time, the functionality of one or more input elements (e.g., buttons) of the handheld controller is temporarily modified. For example, the function associated with a particular controller button may be modified to enable the user to open a particular menu using the button. The gaze timer feature may, for example, be used to augment the functionality of a handheld controller or other user input device during mixed reality and/or augmented reality sessions.
Type: Application
Filed: June 28, 2024
Publication date: October 24, 2024
Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day
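The gaze-timer idea in this abstract — remap a button's function only after the user's gaze has dwelled on the controller for a threshold duration — reduces to a simple dwell check. The threshold value and the function names below are hypothetical, chosen only to make the sketch concrete.

```python
# Assumed dwell threshold, in seconds; the abstract only says
# "a predetermined period of time".
GAZE_THRESHOLD_S = 0.5

def button_function(gaze_on_controller_s,
                    default="primary_action",
                    remapped="open_menu"):
    """Return the function currently bound to the button: the default
    binding normally, the remapped one once the user's gaze has rested
    on (or near) the controller for at least the threshold duration."""
    if gaze_on_controller_s >= GAZE_THRESHOLD_S:
        return remapped
    return default

print(button_function(0.2))  # primary_action
print(button_function(0.8))  # open_menu
```

In a running system the dwell time would be accumulated per frame from eye-tracking data and reset when the gaze leaves the controller's vicinity; this sketch only shows the threshold decision itself.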
-
Patent number: 12079943
Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable to use in various copresence environments where the user desires to maintain privacy with reference to the above-noted characteristics. Neutral avatars may be configured to convey, in real-time, actions and behaviors of the corresponding user without using literal forms of the user's actions and behaviors.
Type: Grant
Filed: March 31, 2023
Date of Patent: September 3, 2024
Assignee: MAGIC LEAP, INC.
Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
-
Patent number: 12056271
Abstract: A wearable computing system that includes a head-mounted display implements a gaze timer feature for enabling the user to temporarily extend the functionality of a handheld controller or other user input device. In one embodiment, when the user gazes at, or in the vicinity of, a handheld controller for a predetermined period of time, the functionality of one or more input elements (e.g., buttons) of the handheld controller is temporarily modified. For example, the function associated with a particular controller button may be modified to enable the user to open a particular menu using the button. The gaze timer feature may, for example, be used to augment the functionality of a handheld controller or other user input device during mixed reality and/or augmented reality sessions.
Type: Grant
Filed: May 31, 2023
Date of Patent: August 6, 2024
Assignee: MAGIC LEAP, INC.
Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day
-
Publication number: 20240211029
Abstract: An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.
Type: Application
Filed: March 7, 2024
Publication date: June 27, 2024
Inventors: Amy Dedonato, James Cameron Petty, Griffith Buckley Hazen, Jordan Alexander Cazamias, Karen Stolzenberg
-
Patent number: 12013985
Abstract: Example systems, devices, media, and methods are described for controlling the presentation of a series of virtual or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether the series matches a predefined hand gesture. Each predefined hand gesture is associated with an action; for example, a leafing gesture is associated with a scrolling action. The system controls the display of a series of virtual elements in accordance with the associated action. In an example series of hand shapes that includes flexing and extending the fingers of a single hand severally and continually in a leafing motion, the matching predefined leafing gesture is associated with a scrolling action, which displays the series of items in a display order, sequentially and in accordance with the detected speed of the moving fingers.
Type: Grant
Filed: January 31, 2022
Date of Patent: June 18, 2024
Assignee: Snap Inc.
Inventors: Karen Stolzenberg, Ilteris Canberk
-
Publication number: 20240187488
Abstract: A host device having a first processor executes an application via the first processor. The host device determines a state of the application. A scenegraph is generated corresponding to the state of the application, and the scenegraph is presented to a remote device having a display and a second processor. The remote device is configured to, in response to receiving the scenegraph, render to the display a view corresponding to the scenegraph, without executing the application via the second processor.
Type: Application
Filed: February 16, 2024
Publication date: June 6, 2024
Inventors: Praveen BABU J D, Karen STOLZENBERG, Jehangir TAJIK, Rohit Anil TALWALKAR, Colman Thomas BRYANT, Leonid ZOLOTAREV
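The host/remote split in this abstract — the host derives a scenegraph from application state, and the remote device renders a view from the scenegraph alone, never executing the application — can be sketched as two independent functions. The node structure and the text rendering below are invented stand-ins for whatever the real system transmits and draws.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A minimal scenegraph node: a name plus child nodes."""
    name: str
    children: tuple = ()

def host_build_scenegraph(app_state):
    # Host side: map application state to a scenegraph. Only the
    # scenegraph crosses the wire; the state and the app stay here.
    return Node("root", tuple(Node(item) for item in app_state["items"]))

def remote_render(scenegraph, depth=0):
    # Remote side: walk the received scenegraph and produce a view,
    # with no knowledge of the application that generated it.
    lines = ["  " * depth + scenegraph.name]
    for child in scenegraph.children:
        lines.extend(remote_render(child, depth + 1))
    return lines

graph = host_build_scenegraph({"items": ["cube", "light"]})
print("\n".join(remote_render(graph)))
```

The point of the split is that `remote_render` depends only on the `Node` schema, so the remote device needs neither the application code nor its state to draw a correct view.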
-
Publication number: 20240168565
Abstract: Systems, devices, media, and methods are described for controlling the presentation of virtual or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects hand shapes in the video data and determines whether they match a predefined hand gesture. Each predefined hand gesture is associated with an action; for example, a leafing gesture is associated with a scrolling action. The system controls the display of virtual elements in accordance with the associated action. In an example series of hand shapes that includes flexing and extending the fingers of a single hand severally and continually in a leafing motion, the matching predefined leafing gesture is associated with a scrolling action, which displays the series of items in a display order, sequentially and in accordance with the detected speed of the moving fingers.
Type: Application
Filed: January 31, 2024
Publication date: May 23, 2024
Inventors: Karen Stolzenberg, Ilteris Canberk
-
Patent number: 11954243
Abstract: An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.
Type: Grant
Filed: July 11, 2022
Date of Patent: April 9, 2024
Assignee: MAGIC LEAP, INC.
Inventors: Amy Dedonato, James Cameron Petty, Griffith Buckley Hazen, Jordan Alexander Cazamias, Karen Stolzenberg
-
Publication number: 20240094865
Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
Type: Application
Filed: November 29, 2023
Publication date: March 21, 2024
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
-
Patent number: 11936733
Abstract: A host device having a first processor executes an application via the first processor. The host device determines a state of the application. A scenegraph is generated corresponding to the state of the application, and the scenegraph is presented to a remote device having a display and a second processor. The remote device is configured to, in response to receiving the scenegraph, render to the display a view corresponding to the scenegraph, without executing the application via the second processor.
Type: Grant
Filed: November 8, 2021
Date of Patent: March 19, 2024
Assignee: Magic Leap, Inc.
Inventors: Praveen Babu J D, Karen Stolzenberg, Jehangir Tajik, Rohit Anil Talwalkar, Colman Thomas Bryant, Leonid Zolotarev
-
Patent number: 11935198
Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, the second device presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
Type: Grant
Filed: June 29, 2021
Date of Patent: March 19, 2024
Assignee: Snap Inc.
Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
-
Publication number: 20240087261
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Application
Filed: November 17, 2023
Publication date: March 14, 2024
Inventors: Richard St. Clair BAILEY, Siddartha POTHAPRAGADA, Koichi MORI, Karen STOLZENBERG, Savannah NILES, Domingo NORIEGA-PADILLA, Cole Parker HEINER
-
Patent number: 11928306
Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and in response to the content capture user input, capturing a new content item corresponding to the scene.
Type: Grant
Filed: September 20, 2021
Date of Patent: March 12, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
-
Publication number: 20240061515
Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
Type: Application
Filed: November 3, 2023
Publication date: February 22, 2024
Inventors: Viktoria Hwang, Karen Stolzenberg
-
Patent number: 11880542
Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
Type: Grant
Filed: September 20, 2021
Date of Patent: January 23, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
-
Patent number: 11861070
Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
Type: Grant
Filed: April 13, 2022
Date of Patent: January 2, 2024
Assignee: Snap Inc.
Inventors: Viktoria Hwang, Karen Stolzenberg
-
Patent number: 11861803
Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
Type: Grant
Filed: September 13, 2022
Date of Patent: January 2, 2024
Assignee: Magic Leap, Inc.
Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
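The relative placement in this abstract — the second user's location is expressed as an offset from their own session handle, and the avatar is placed at that same offset from the first user's session handle — is a small coordinate calculation. The sketch below uses plain 3-vectors as a stand-in for the device's persistent coordinate frames; all names are illustrative.

```python
def place_avatar(first_handle, second_handle, second_user_pos):
    """Return the avatar position for the first user's view: compute the
    second user's offset from their session handle, then apply that
    offset to the first user's session handle."""
    offset = tuple(u - h for u, h in zip(second_user_pos, second_handle))
    return tuple(h + o for h, o in zip(first_handle, offset))

# Second user stands 1 m to the right of their session handle, so the
# avatar appears 1 m to the right of the first user's session handle.
print(place_avatar((0.0, 0.0, 0.0), (5.0, 0.0, 2.0), (6.0, 0.0, 2.0)))
```

Anchoring both computations to session handles derived from shared persistent coordinate data is what keeps the avatar's placement consistent for every participant, regardless of where each user's session handle sits in their own room.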