Patents by Inventor Karen Stolzenberg

Karen Stolzenberg has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230214015
    Abstract: Examples of systems and methods are disclosed for interacting with content, and for updating the location and orientation of that content, using a single controller. The system may allow a user to use the same controller both to move content around the room and to interact with that content by tracking the controller's range of motion.
    Type: Application
    Filed: March 15, 2023
    Publication date: July 6, 2023
    Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day, Eric Norman Yiskis
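    The single-controller idea above can be sketched as a simple mode classifier: small recent motion means the user is interacting with content in place, while a large sweep means they are moving content around the room. Everything here (the class name, yaw-only tracking, the 25-degree threshold, the window size) is an invented illustration, not the patented method:

```python
from dataclasses import dataclass, field

@dataclass
class ControllerModeTracker:
    """Classify single-controller input by the range of its recent motion.

    Small angular excursions are treated as in-place interaction with
    content; large sweeps are treated as a request to move content around
    the room. Threshold and window size are illustrative choices.
    """
    sweep_threshold_deg: float = 25.0
    window: int = 30
    _yaw_samples: list = field(default_factory=list)

    def update(self, yaw_deg: float) -> str:
        # Keep a sliding window of recent controller yaw samples.
        self._yaw_samples.append(yaw_deg)
        if len(self._yaw_samples) > self.window:
            self._yaw_samples.pop(0)
        # The range of motion inside the window decides the mode.
        motion_range = max(self._yaw_samples) - min(self._yaw_samples)
        return "move_content" if motion_range >= self.sweep_threshold_deg else "interact"
```

    A real system would track full 6-DoF pose rather than a single yaw angle; this collapses the idea to one dimension to show the range-of-motion test.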
  • Publication number: 20230179641
    Abstract: A head-worn device system includes one or more cameras, one or more display devices, and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The operations may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
    Type: Application
    Filed: December 7, 2021
    Publication date: June 8, 2023
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
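    The join-or-initiate flow in this abstract can be sketched as a small decision function: on user input, scan received short-range broadcasts for an advertisement of the same experience and offer to join it, otherwise start a new session. The payload format, field names, and return values below are all invented for illustration; a real device would use something like BLE advertising, which this does not model:

```python
import json
from dataclasses import dataclass

@dataclass
class SessionAdvertisement:
    """A payload a device might broadcast over a short-range link to
    announce an in-progress session of a visual computing experience."""
    experience_id: str
    session_id: str

    def encode(self) -> bytes:
        return json.dumps({"experience": self.experience_id,
                           "session": self.session_id}).encode()

def decide_join_option(user_wants_session: bool,
                       received: list,
                       experience_id: str) -> str:
    """On user input to start an experience, check monitored short-range
    transmissions; if one advertises a current session of the same
    experience, offer to join it, otherwise initiate a new session."""
    if not user_wants_session:
        return "idle"
    for payload in received:
        msg = json.loads(payload.decode())
        if msg.get("experience") == experience_id:
            return "offer_join:" + msg["session"]
    return "initiate_new"
```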
  • Publication number: 20230154445
    Abstract: Disclosed is a method of providing a music creation interface using a head-mounted device, including displaying first and second geometric loops fixed relative to a location in the real world, the first and second geometric loops each including a plurality of beat indicators. The second geometric loop is spaced apart from the first geometric loop. An interface comprising a plurality of sound or note icons is displayed, and in response to receiving user selection to move a selected sound or note icon to a particular beat indicator on one of the geometric loops, the selected sound or note icon is displayed at the particular beat indicator. In use, the geometric loops are rotated relative to at least one play indicator, and the selected sound or note icon is rendered when it reaches the at least one play indicator.
    Type: Application
    Filed: November 15, 2021
    Publication date: May 18, 2023
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
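    The rotating-loop sequencer described above reduces to a circular buffer: icons sit at fixed beat indicators, the loop advances one beat at a time, and whatever icon reaches the play indicator is rendered. The class below is a minimal sketch under that reading; beat counts, the single play indicator at slot zero, and the tick model are all assumptions:

```python
class GeometricLoop:
    """One rotating loop of beat indicators. A sound or note icon placed at
    a beat indicator is rendered each time rotation brings it to the play
    indicator (modeled here as slot 0)."""

    def __init__(self, n_beats: int):
        self.n_beats = n_beats
        self.placed = {}     # beat index -> sound/note name
        self.rotation = 0    # how many beats the loop has advanced

    def place(self, beat: int, sound: str) -> None:
        """User moves a selected sound icon onto a beat indicator."""
        self.placed[beat % self.n_beats] = sound

    def tick(self):
        """Advance the loop one beat; return the sound (if any) that has
        just reached the play indicator, else None."""
        self.rotation = (self.rotation + 1) % self.n_beats
        return self.placed.get(self.rotation)
```

    Two instances of this class, stepped by a shared clock, would mirror the first and second spaced-apart loops in the abstract.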
  • Patent number: 11645823
    Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable in various copresence environments where the user wishes to maintain privacy with respect to the above-noted characteristics. Neutral avatars may be configured to convey, in real time, actions and behaviors of the corresponding user without using literal forms of those actions and behaviors.
    Type: Grant
    Filed: May 27, 2022
    Date of Patent: May 9, 2023
    Assignee: Magic Leap, Inc.
    Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
  • Publication number: 20230117197
    Abstract: Example systems, devices, media, and methods are described for controlling the presentation of one or more virtual or graphical elements on a display in response to bimanual hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a first hand and defines an input plane relative to a surface of the detected first hand. The image processing system also detects a series of bimanual hand shapes, including the detected first hand and at least one fingertip of a second hand. In response, the system presents a first movable element on the display at a location that is correlated with the current fingertip location. In addition, the image processing system determines whether the detected series of bimanual hand shapes matches a predefined hand gesture. In response to a matching gesture, the system executes a selecting action of an element nearest the current fingertip location.
    Type: Application
    Filed: December 15, 2022
    Publication date: April 20, 2023
    Inventor: Karen Stolzenberg
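    Two pieces of the bimanual-gesture pipeline above lend themselves to a short sketch: projecting the second hand's fingertip onto an input plane defined by the first hand's surface, and matching a detected series of hand shapes against a predefined gesture. The geometry helpers and the shape labels are invented stand-ins; real hand tracking delivers far richer landmark data:

```python
def project_fingertip(fingertip, origin, normal):
    """Project the second hand's fingertip onto the input plane defined
    relative to the first hand's surface (origin + unit normal). The
    projected point drives the movable element's display location."""
    d = sum((f - o) * n for f, o, n in zip(fingertip, origin, normal))
    return [f - d * n for f, n in zip(fingertip, normal)]

def matches_gesture(shape_series, gesture):
    """Crude in-order match of a detected series of bimanual hand shapes
    against a predefined gesture (labels are invented for illustration)."""
    it = iter(shape_series)
    return all(any(s == g for s in it) for g in gesture)
```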
  • Patent number: 11619996
    Abstract: Examples of systems and methods are disclosed for interacting with content, and for updating the location and orientation of that content, using a single controller. The system may allow a user to use the same controller both to move content around the room and to interact with that content by tracking the controller's range of motion.
    Type: Grant
    Filed: February 16, 2022
    Date of Patent: April 4, 2023
    Assignee: Magic Leap, Inc.
    Inventors: Karen Stolzenberg, Marc Alan McCall, Frank Alexander Hamilton, IV, Cole Parker Heiner, John Austin Day, Eric Norman Yiskis
  • Publication number: 20230017752
    Abstract: Disclosed herein are systems and methods for enabling mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
    Type: Application
    Filed: September 13, 2022
    Publication date: January 19, 2023
    Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
  • Publication number: 20220405995
    Abstract: Neutral avatars are neutral with reference to physical characteristics of the corresponding user, such as weight, ethnicity, gender, or even identity. Thus, neutral avatars may be desirable in various copresence environments where the user wishes to maintain privacy with respect to the above-noted characteristics. Neutral avatars may be configured to convey, in real time, actions and behaviors of the corresponding user without using literal forms of those actions and behaviors.
    Type: Application
    Filed: May 27, 2022
    Publication date: December 22, 2022
    Inventors: Karen Stolzenberg, Lorena Pazmino, Savannah Niles, Ian Mankowski, Paul Kim, Christina Lee
  • Patent number: 11531402
    Abstract: Example systems, devices, media, and methods are described for controlling the presentation of one or more virtual or graphical elements on a display in response to bimanual hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a first hand and defines an input plane relative to a surface of the detected first hand. The image processing system also detects a series of bimanual hand shapes, including the detected first hand and at least one fingertip of a second hand. In response, the system presents a first movable element on the display at a location that is correlated with the current fingertip location. In addition, the image processing system determines whether the detected series of bimanual hand shapes matches a predefined hand gesture. In response to a matching gesture, the system executes a selecting action of an element nearest the current fingertip location.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: December 20, 2022
    Assignee: Snap Inc.
    Inventor: Karen Stolzenberg
  • Publication number: 20220374131
    Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and in response to the content capture user input, capturing a new content item corresponding to the scene.
    Type: Application
    Filed: September 20, 2021
    Publication date: November 24, 2022
    Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
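    The carousel navigation in this abstract is essentially a ring buffer with a fixed selection position: horizontal touchpad input rotates the ring, a tap selects whatever icon is at the selection position. The sketch below is a minimal illustration of that state machine; the icon names and method names are invented, and it ignores rendering entirely:

```python
class IconCarousel:
    """A carousel of AR-effects icons driven by touchpad input:
    horizontal swipes rotate it, a tap selects the icon currently in
    the selection position."""

    def __init__(self, icons):
        if not icons:
            raise ValueError("carousel needs at least one icon")
        self.icons = list(icons)
        self.selection_index = 0

    def horizontal_swipe(self, delta: int) -> None:
        """Rotate the carousel; positive delta right, negative left,
        wrapping at either end."""
        self.selection_index = (self.selection_index + delta) % len(self.icons)

    def tap(self) -> str:
        """Select the AR effect in the selection position."""
        return self.icons[self.selection_index]
```

    In the flow described by the abstract, the tapped effect would then be applied to the live camera scene, and a subsequent content-capture input would save the enhanced frame as a new content item.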
  • Publication number: 20220374132
    Abstract: Disclosed is a method of receiving and processing content-sending inputs received by a head-worn device system including one or more display devices, one or more cameras and a vertically-arranged touchpad. The method includes displaying a content item on the one or more display devices, receiving a touch input on the touchpad corresponding to a send instruction, displaying a carousel of potential recipients, receiving a horizontal touch input on the touchpad, scrolling the carousel left or right on the one or more display devices in response to the horizontal touch input, receiving a tap touch input on the touchpad to select a particular recipient, receiving a further touch input, and in response to the further touch input, transmitting the content item to the selected recipient.
    Type: Application
    Filed: September 20, 2021
    Publication date: November 24, 2022
    Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Sana Park, Tianyi Sun, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich
  • Publication number: 20220366626
    Abstract: Examples of systems and methods for rendering an avatar in a mixed reality environment are disclosed. The systems and methods may be configured to automatically scale an avatar or to render an avatar based on a determined intention of a user, an interesting impulse, environmental stimuli, or user saccade points. The disclosed systems and methods may apply discomfort curves when rendering an avatar. The disclosed systems and methods may provide a more realistic interaction between a human user and an avatar.
    Type: Application
    Filed: June 6, 2022
    Publication date: November 17, 2022
    Inventors: Thomas Marshall Miller, IV, Josh Anon, Frank Alexander Hamilton, IV, Cole Parker Heiner, Victor Ng-Thow-Hing, Rodrigo Cano, Karen Stolzenberg, Lorena Pazmino, Gregory Minh Tran, Stephane Antoine Joseph Imbert, Anthony Marinello
  • Publication number: 20220343612
    Abstract: An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.
    Type: Application
    Filed: July 11, 2022
    Publication date: October 27, 2022
    Inventors: Amy Dedonato, James Cameron Petty, Griffith Buckley Hazen, Jordan Alexander Cazamias, Karen Stolzenberg
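    The guided-capture behavior above can be reduced to one decision: given what the user has already scanned, where should the next guidance graphic be projected? The function below sketches that with a flat grid of cells standing in for the device's real environment map; the grid model and names are illustrative only:

```python
def next_scan_waypoint(cells_scanned, grid):
    """Pick the next designated location at which to project a guidance
    graphic: the first environment cell the user has not yet captured.
    Returns None when the environment is fully captured."""
    for cell in grid:
        if cell not in cells_scanned:
            return cell
    return None
```

    Visual, audible, or haptic cues would then steer the user toward the returned waypoint until its cell is marked scanned.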
  • Publication number: 20220334649
    Abstract: Example systems, devices, media, and methods are described for controlling one or more virtual elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a hand and presents a menu icon on the display in accordance with a detected current hand location. The image processing system detects a series of hand shapes in the captured frames of video data and determines whether the detected hand shapes match any of a plurality of predefined hand gestures stored in a hand gesture library. In response to a match, the method includes executing an action in accordance with the matching hand gesture. In response to an opening gesture, an element animation system presents one or more graphical elements incrementally moving along a path extending away from the menu icon. A closing hand gesture causes the elements to retreat along the path toward the menu icon.
    Type: Application
    Filed: April 13, 2022
    Publication date: October 20, 2022
    Applicant: Snap Inc.
    Inventors: Viktoria Hwang, Karen Stolzenberg
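    The open/close animation described above, where elements incrementally move along a path away from the menu icon and retreat back, can be captured by a single interpolation function over an animation progress value. The straight-line path, spacing constant, and parameter names here are invented for illustration:

```python
def element_positions(menu_xy, n_elements, progress, spacing=0.25):
    """Positions of graphical elements along a path extending away from
    the menu icon. `progress` runs from 0.0 (closed: all elements at the
    icon) to 1.0 (fully open). An opening gesture animates progress up;
    a closing gesture animates it back down, so elements retreat along
    the same path toward the menu icon."""
    x, y = menu_xy
    return [(x + progress * spacing * (i + 1), y) for i in range(n_elements)]
```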
  • Patent number: 11475644
    Abstract: Disclosed are systems and methods for mixed reality collaboration. A method may include receiving persistent coordinate data; presenting a first virtual session handle to a first user at a first position via a transmissive display of a wearable device, wherein the first position is based on the persistent coordinate data; presenting a virtual object to the first user at a second position via the transmissive display, wherein the second position is based on the first position; receiving location data from a second user, wherein the location data relates a position of the second user to a position of a second virtual session handle; presenting a virtual avatar to the first user at a third position via the transmissive display, wherein the virtual avatar corresponds to the second user, wherein the third position is based on the location data, and wherein the third position is further based on the first position.
    Type: Grant
    Filed: February 12, 2021
    Date of Patent: October 18, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Richard St. Clair Bailey, Siddartha Pothapragada, Koichi Mori, Karen Stolzenberg, Savannah Niles, Domingo Noriega-Padilla, Cole Parker Heiner
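    The chain of positions in this abstract (session handle anchored to persistent coordinate data, virtual object placed relative to the handle, remote avatar placed relative to the handle using the remote user's reported offset from their own handle) can be sketched as simple vector composition. All names and the 3-tuple coordinate model below are invented for illustration:

```python
def place_session_content(persistent_anchor, handle_offset, object_offset):
    """Compute the three positions from the abstract:
    first position  = session handle, based on persistent coordinate data;
    second position = virtual object, based on the first position;
    third position  = remote avatar, based on the first position plus the
    remote user's offset from their own session handle."""
    def add(a, b):
        return tuple(x + y for x, y in zip(a, b))

    handle = add(persistent_anchor, handle_offset)    # first position
    obj = add(handle, object_offset)                  # second position

    def avatar_position(remote_offset_from_handle):   # third position
        return add(handle, remote_offset_from_handle)

    return handle, obj, avatar_position
```

    Because both users resolve positions relative to their own session handle, the shared content stays spatially consistent across devices, which is the point of anchoring everything to the handle.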
  • Publication number: 20220326781
    Abstract: Example systems, devices, media, and methods are described for controlling the presentation of one or more virtual or graphical elements on a display in response to bimanual hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a first hand and defines a mapped region relative to a surface of the detected first hand. The system presents a virtual target icon at a position within or near the mapped region. The image processing system detects a series of bimanual hand shapes, including the detected first hand and at least one fingertip of a second hand. In response to determining whether the detected series of bimanual hand shapes matches a predefined hand gesture, the system executes a selecting action that includes presenting one or more graphical elements on the display.
    Type: Application
    Filed: April 6, 2022
    Publication date: October 13, 2022
    Inventors: Viktoria Hwang, Karen Stolzenberg
  • Patent number: 11423624
    Abstract: An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.
    Type: Grant
    Filed: November 16, 2020
    Date of Patent: August 23, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Amy Dedonato, James Cameron Petty, Griffith Buckley Hazen, Jordan Alexander Cazamias, Karen Stolzenberg
  • Patent number: D960176
    Type: Grant
    Filed: May 13, 2020
    Date of Patent: August 9, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Lorena Pazmino, Karen Stolzenberg, Tim Zurmoehle, Andrea Isabel Montoya
  • Patent number: D960914
    Type: Grant
    Filed: March 3, 2022
    Date of Patent: August 16, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Lorena Pazmino, Karen Stolzenberg, Brian Everett Meaney, Andrea Isabel Montoya, Gregory Minh Tran, Amy Dedonato
  • Patent number: D974383
    Type: Grant
    Filed: May 19, 2021
    Date of Patent: January 3, 2023
    Assignee: Snap Inc.
    Inventors: Kaveh Anvaripour, Kyle Goodrich, David Meisenholder, Daniel Moreno, Karen Stolzenberg, Mathieu Emmanuel Vignau