Patents by Inventor Shin Hwun Kang

Shin Hwun Kang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230292077
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and speakers. The eyewear device captures image information for an environment surrounding the device and identifies a match between objects in the image information and predetermined objects in previously obtained information for the same environment. The eyewear device then identifies a target location within the environment, which may be associated with a physical or a virtual object. The eyewear device monitors its orientation with respect to the target location and presents audio signals to guide the user toward the target location.
    Type: Application
    Filed: May 16, 2023
    Publication date: September 14, 2023
    Inventors: Ilteris Canberk, Shin Hwun Kang, James Powderly
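The guidance loop described in the entry above (monitor the device's orientation relative to a target, then steer the wearer with audio) can be illustrated with a short sketch. This is a minimal illustration under assumed conventions, not the patented implementation: the 2D geometry, the constant-power panning heuristic, and all function names are assumptions.

```python
import math

def bearing_to_target(device_xy, device_yaw_rad, target_xy):
    """Angle (radians) from the device's facing direction to the target,
    wrapped to [-pi, pi]; positive means the target is to the wearer's left."""
    dx = target_xy[0] - device_xy[0]
    dy = target_xy[1] - device_xy[1]
    world_angle = math.atan2(dy, dx)
    relative = world_angle - device_yaw_rad
    return math.atan2(math.sin(relative), math.cos(relative))

def stereo_gains(relative_angle_rad):
    """Constant-power pan: target straight ahead -> equal gains,
    target off to one side -> louder speaker on that side."""
    pan = max(-1.0, min(1.0, -relative_angle_rad / math.pi))  # -1 = full left, +1 = full right
    theta = (pan + 1.0) * math.pi / 4.0
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

if __name__ == "__main__":
    # Device at the origin facing +x; target ahead and to the wearer's left.
    angle = bearing_to_target((0.0, 0.0), 0.0, (1.0, 1.0))
    left, right = stereo_gains(angle)
    print(round(left, 3), round(right, 3))  # left gain exceeds right gain
```

With this convention, a target to the left raises the left-speaker gain, so turning toward the louder channel turns the wearer toward the target.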
  • Patent number: 11740852
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group task. During a shared group task session, each eyewear user can manipulate virtual objects displayed in a respective virtual scene that is viewable to each user, enabling collaboration. The virtual objects can include many different types of objects, such as a building structure that can be jointly created and edited by the eyewear users.
    Type: Grant
    Filed: August 23, 2022
    Date of Patent: August 29, 2023
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
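As a rough illustration of the shared group task described above, the sketch below keeps one scene state that two hypothetical users edit in turn. The class shape, field names, and last-write-wins behavior are assumptions for illustration, not the patented design.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    object_id: str
    position: tuple       # (x, y, z) in the shared scene's coordinates
    last_editor: str = "" # last user to move the object, for illustration

@dataclass
class SharedScene:
    """Toy shared-session state: every participant sees the same objects,
    and any participant may move them."""
    objects: dict = field(default_factory=dict)

    def add(self, obj: VirtualObject) -> None:
        self.objects[obj.object_id] = obj

    def move(self, user: str, object_id: str, new_position: tuple) -> None:
        obj = self.objects[object_id]
        obj.position = new_position
        obj.last_editor = user

if __name__ == "__main__":
    scene = SharedScene()
    scene.add(VirtualObject("block-1", (0.0, 0.0, 0.0)))
    scene.move("user_a", "block-1", (0.0, 1.0, 0.0))  # first wearer stacks the block
    scene.move("user_b", "block-1", (1.0, 1.0, 0.0))  # second wearer adjusts it
    print(scene.objects["block-1"])
```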
  • Publication number: 20230256297
    Abstract: Example systems, devices, media, and methods are described for evaluating movements and physical exercises in augmented reality using the display of an eyewear device. A motion evaluation application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) on the eyewear device. The method includes presenting virtual targets on the display, localizing the current eyewear device location based on the captured motion data, and presenting virtual indicators on the display. The virtual targets represent goals or benchmarks for the user to achieve using body postures. The method includes determining whether the eyewear device location represents an intersecting posture relative to the virtual targets, based on the IMU data. The virtual indicators display real-time feedback about user posture or performance relative to the virtual targets.
    Type: Application
    Filed: January 26, 2022
    Publication date: August 17, 2023
    Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
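A minimal sketch of the posture-intersection check described above follows. It assumes the IMU-derived device positions have already been resolved into 3D coordinates; the 0.15 m radius, sample values, and function names are hypothetical.

```python
import math

def within_target(device_pos, target_pos, radius=0.15):
    """True when the tracked eyewear position comes within `radius` metres
    of a virtual target -- a stand-in for an 'intersecting posture'."""
    return math.dist(device_pos, target_pos) <= radius

def evaluate_rep(positions, targets, radius=0.15):
    """For each virtual target, report whether any captured device
    position intersected it during the repetition."""
    return {
        i: any(within_target(p, t, radius) for p in positions)
        for i, t in enumerate(targets)
    }

if __name__ == "__main__":
    # Hypothetical head positions sampled during one squat repetition.
    samples = [(0.0, 1.7, 0.0), (0.0, 1.3, 0.1), (0.0, 1.0, 0.1), (0.0, 1.6, 0.0)]
    targets = [(0.0, 1.0, 0.1), (0.0, 1.7, 0.0)]  # bottom and top of the movement
    print(evaluate_rep(samples, targets))  # {0: True, 1: True}
```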
  • Publication number: 20230214082
    Abstract: Systems and methods for controlling an Internet of Things (IoT) device through interaction with an augmented reality (AR) camera include pairing an interactable AR overlay with the IoT device. The interactable overlay includes control information for generating a control signal for controlling the IoT device when the interactable overlay is interacted with by the AR camera. The interactable overlay is presented on a display of the AR camera when the IoT device is in a field of view of the AR camera. An indication is provided that the user has pointed the AR camera at the interactable overlay shown on the display, and a control signal associated with that overlay is provided to the IoT device paired with it.
    Type: Application
    Filed: December 30, 2021
    Publication date: July 6, 2023
    Inventor: Shin Hwun Kang
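The pairing of an interactable overlay with a control signal, as described above, can be sketched as a small data structure plus a callback. The dataclass fields, the `toggle_power` payload, and the print-based transport are assumptions for illustration only; a real system would publish over an IoT protocol.

```python
from dataclasses import dataclass

@dataclass
class InteractableOverlay:
    """Pairs an AR overlay with the control payload it should emit."""
    overlay_id: str
    device_id: str
    control_payload: dict

def on_camera_pointed_at(overlay: InteractableOverlay, send) -> None:
    """Called when the AR camera is pointed at the overlay; forwards the
    paired control payload to the device via the caller-supplied transport."""
    send(overlay.device_id, overlay.control_payload)

if __name__ == "__main__":
    lamp_overlay = InteractableOverlay(
        overlay_id="overlay-lamp-toggle",
        device_id="living-room-lamp",
        control_payload={"command": "toggle_power"},
    )
    # Stand-in transport; a real deployment might use MQTT, HTTP, etc.
    on_camera_pointed_at(lamp_overlay, lambda dev, msg: print(f"-> {dev}: {msg}"))
```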
  • Publication number: 20230217007
    Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
    Type: Application
    Filed: December 30, 2021
    Publication date: July 6, 2023
    Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
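As a loose illustration of the selective sharing described above, the sketch below models one session participant that toggles which of its streams are shared and collects annotations from other users. The class shape and the rule that annotations require a shared video stream are assumptions, not the patented design.

```python
from dataclasses import dataclass, field

@dataclass
class Participant:
    """One electronic eyewear device in an always-on session, tracking
    which of its streams are currently shared and any annotations other
    participants have attached to its shared video."""
    name: str
    share_audio: bool = False
    share_video: bool = False
    annotations: list = field(default_factory=list)

    def set_sharing(self, audio: bool, video: bool) -> None:
        self.share_audio, self.share_video = audio, video

    def receive_annotation(self, author: str, note: str) -> None:
        if not self.share_video:
            raise ValueError("annotations apply only to a shared video stream")
        self.annotations.append((author, note))

if __name__ == "__main__":
    alice = Participant("alice")
    alice.set_sharing(audio=False, video=True)   # share video only
    alice.receive_annotation("bob", "circle the left speaker")
    print(alice.annotations)
```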
  • Patent number: 11689877
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and speakers. The eyewear device captures image information for an environment surrounding the device and identifies a match between objects in the image information and predetermined objects in previously obtained information for the same environment. The eyewear device then identifies a target location within the environment, which may be associated with a physical or a virtual object. The eyewear device monitors its orientation with respect to the target location and presents audio signals to guide the user toward the target location.
    Type: Grant
    Filed: June 8, 2021
    Date of Patent: June 27, 2023
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, James Powderly
  • Publication number: 20230179641
    Abstract: A head-worn device system includes one or more cameras, one or more display devices, and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The operations may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and, based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
    Type: Application
    Filed: December 7, 2021
    Publication date: June 8, 2023
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
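The join-or-initiate decision described above can be sketched as a simple scan over decoded short-range advertisements. The advertisement dictionary layout and experience identifiers are hypothetical; real payloads (e.g., BLE broadcasts) would need parsing and validation.

```python
def decide_session_action(advertisements, experience_id):
    """Return ('join', session_id) if a nearby device is already running the
    experience, otherwise ('initiate', None) to start a new session."""
    for ad in advertisements:
        if ad.get("experience_id") == experience_id and "session_id" in ad:
            return ("join", ad["session_id"])
    return ("initiate", None)

if __name__ == "__main__":
    # Hypothetical payloads decoded from short-range data transmissions.
    nearby = [
        {"experience_id": "whiteboard", "session_id": "s-42"},
        {"experience_id": "chess", "session_id": "s-7"},
    ]
    print(decide_session_action(nearby, "chess"))      # ('join', 's-7')
    print(decide_session_action(nearby, "sculpting"))  # ('initiate', None)
```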
  • Publication number: 20230154445
    Abstract: Disclosed is a method of providing a music creation interface using a head-mounted device, including displaying first and second geometric loops fixed relative to a location in the real world, the first and second geometric loops each including a plurality of beat indicators. The second geometric loop is spaced apart from the first geometric loop. An interface comprising a plurality of sound or note icons is displayed, and in response to receiving user selection to move a selected sound or note icon to a particular beat indicator on one of the geometric loops, the selected sound or note icon is displayed at the particular beat indicator. In use, the geometric loops are rotated relative to at least one play indicator, and the selected sound or note icon is rendered when it reaches the at least one play indicator.
    Type: Application
    Filed: November 15, 2021
    Publication date: May 18, 2023
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
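A minimal sketch of the rotating-loop playback described above: each geometric loop is modeled as a ring of beat indicators, and a sound icon placed on an indicator is rendered when rotation brings it under the play indicator. The slot contents and the tick-based rotation are assumptions for illustration.

```python
def step_loop(slots, play_index, tick):
    """Return the sound (if any) whose beat indicator sits under the play
    indicator at this tick, for a loop with len(slots) beat positions."""
    current = (play_index + tick) % len(slots)
    return slots[current]

if __name__ == "__main__":
    # Eight beat indicators on one geometric loop; a few carry sound icons.
    loop = ["kick", None, "snare", None, "kick", None, "snare", "hat"]
    for tick in range(8):
        sound = step_loop(loop, play_index=0, tick=tick)
        if sound is not None:
            print(f"tick {tick}: play {sound}")
```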
  • Publication number: 20230082063
    Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker position for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a “spheroidal balancing” augmented reality experience.
    Type: Application
    Filed: November 16, 2022
    Publication date: March 16, 2023
    Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
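As a rough sketch of the registered game pieces described above, the code below keeps the user-controlled piece tethered to the eyewear position and awards a score when it nears the interaction piece. The fixed offset, the 0.2 m threshold, and the scoring rule are assumptions, not the patented scoring logic.

```python
import math

def update_user_piece(device_position, offset=(0.0, -0.1, 0.3)):
    """The user-controlled piece follows the eyewear position with a fixed
    offset (e.g., floating slightly in front of and below the device)."""
    return tuple(d + o for d, o in zip(device_position, offset))

def score_interaction(user_piece, interaction_piece, threshold=0.2):
    """Award a point when the user-controlled piece comes within
    `threshold` metres of the interaction piece."""
    return 1 if math.dist(user_piece, interaction_piece) <= threshold else 0

if __name__ == "__main__":
    device = (0.0, 1.6, 0.0)                # tracked eyewear position
    user_piece = update_user_piece(device)  # (0.0, 1.5, 0.3)
    interaction_piece = (0.0, 1.5, 0.45)
    print(score_interaction(user_piece, interaction_piece))  # 1
```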
  • Publication number: 20230076353
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device. A portable eyewear device includes a processor, a memory, and a display projected onto at least one lens assembly. The memory has programming stored therein that, when executed by the processor, captures information depicting an environment surrounding the device and identifies a match between objects in that information and predetermined objects in previously obtained information for the same environment. When the position of the eyewear device reaches a preselected location with respect to the matched objects, a physical output is provided to produce the immersive experience. The physical output changes as the position of the eyewear device moves to maintain the immersive experience.
    Type: Application
    Filed: November 16, 2022
    Publication date: March 9, 2023
    Inventors: Ilteris Canberk, Shin Hwun Kang, Doug Mead
  • Patent number: 11573632
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group object manipulation task. During the shared group task, each user of the eyewear controls movement of a respective virtual object in a virtual scene based on a portion of the virtual scene the user is gazing at. Each user can also generate a verbal command to generate a virtual object that interacts with one or more of the other virtual objects.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: February 7, 2023
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
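The gaze-driven control described above can be illustrated with a small sketch that maps a horizontal gaze angle to a portion of the virtual scene and moves only that portion's object. The 15-degree bands and the three-region scene are assumptions for illustration.

```python
def gaze_region(gaze_yaw_deg):
    """Map a horizontal gaze angle to a named portion of the virtual scene."""
    if gaze_yaw_deg < -15:
        return "left"
    if gaze_yaw_deg > 15:
        return "right"
    return "center"

def move_gazed_object(scene, gaze_yaw_deg, delta):
    """Move only the object in the region the user is currently gazing at."""
    region = gaze_region(gaze_yaw_deg)
    x, y, z = scene[region]
    scene[region] = (x + delta[0], y + delta[1], z + delta[2])
    return region

if __name__ == "__main__":
    scene = {"left": (-1.0, 0.0, 2.0), "center": (0.0, 0.0, 2.0), "right": (1.0, 0.0, 2.0)}
    moved = move_gazed_object(scene, gaze_yaw_deg=20.0, delta=(0.0, 0.5, 0.0))
    print(moved, scene[moved])  # right (1.0, 0.5, 2.0)
```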
  • Publication number: 20220405039
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group task. During a shared group task session, each eyewear user can manipulate virtual objects displayed in a respective virtual scene that is viewable to each user, enabling collaboration. The virtual objects can include many different types of objects, such as a building structure that can be jointly created and edited by the eyewear users.
    Type: Application
    Filed: August 23, 2022
    Publication date: December 22, 2022
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
  • Patent number: 11531390
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device. A portable eyewear device includes a processor, a memory, and a display projected onto at least one lens assembly. The memory has programming stored therein that, when executed by the processor, captures information depicting an environment surrounding the device and identifies a match between objects in that information and predetermined objects in previously obtained information for the same environment. When the position of the eyewear device reaches a preselected location with respect to the matched objects, a physical output is provided to produce the immersive experience. The physical output changes as the position of the eyewear device moves to maintain the immersive experience.
    Type: Grant
    Filed: March 31, 2020
    Date of Patent: December 20, 2022
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Doug Mead
  • Patent number: 11520399
    Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker position for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a “spheroidal balancing” augmented reality experience.
    Type: Grant
    Filed: May 26, 2020
    Date of Patent: December 6, 2022
    Assignee: Snap Inc.
    Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
  • Publication number: 20220374131
    Abstract: Disclosed is a method of receiving and processing navigation inputs, executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras, and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and, in response to the content capture user input, capturing a new content item corresponding to the scene.
    Type: Application
    Filed: September 20, 2021
    Publication date: November 24, 2022
    Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
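A minimal sketch of the carousel navigation described above, using a rotating ring of AR-effect icon names: a horizontal swipe rotates the ring and a tap selects whatever sits at the selection position. The icon names and the deque-based ring are assumptions for illustration.

```python
from collections import deque

class IconCarousel:
    """Minimal carousel: index 0 is the selection position; horizontal
    swipes rotate the ring, and a tap returns the selected icon."""
    def __init__(self, icons):
        self.icons = deque(icons)

    def swipe(self, direction: int) -> None:
        # direction: +1 for a rightward swipe, -1 for a leftward swipe.
        self.icons.rotate(direction)

    def tap(self) -> str:
        return self.icons[0]

if __name__ == "__main__":
    carousel = IconCarousel(["sparkles", "face_swap", "neon_trail", "confetti"])
    carousel.swipe(-1)     # rotate the ring one slot to the left
    print(carousel.tap())  # face_swap -> apply this AR effect to the camera scene
```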
  • Patent number: 11481177
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group task. During a shared group task session, each eyewear user can manipulate virtual objects displayed in a respective virtual scene that is viewable to each user, enabling collaboration. The virtual objects can include many different types of objects, such as a building structure that can be jointly created and edited by the eyewear users.
    Type: Grant
    Filed: July 24, 2020
    Date of Patent: October 25, 2022
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
  • Publication number: 20220292780
    Abstract: Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position, the marker positions defined with respect to the environment and associated with guidance markers, registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.
    Type: Application
    Filed: February 28, 2022
    Publication date: September 15, 2022
    Inventors: Shin Hwun Kang, Dmytro Kucher, Dmytro Hovorov, Ilteris Canberk
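The marker-threshold step described above (register only the guidance markers near the current position) can be sketched as a distance filter; the 5 m threshold and the marker names are hypothetical.

```python
import math

def markers_near(current_position, marker_positions, threshold=5.0):
    """Return the guidance markers whose registered positions lie within
    `threshold` metres of the eyewear device's current position."""
    return [
        name
        for name, pos in marker_positions.items()
        if math.dist(current_position, pos) <= threshold
    ]

if __name__ == "__main__":
    markers = {
        "turn_left_arrow": (2.0, 0.0, 1.0),
        "exit_sign": (12.0, 0.0, 3.0),
    }
    # Only nearby markers are registered and drawn into the overlay image.
    print(markers_near((0.0, 0.0, 0.0), markers))  # ['turn_left_arrow']
```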
  • Publication number: 20220206588
    Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
    Type: Application
    Filed: December 14, 2021
    Publication date: June 30, 2022
    Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
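The finger-scale calibration described above can be sketched as a projection of the thumb landmark onto the segment between the extended finger's base and tip, yielding a 0-to-1 slider value. The 2D landmark coordinates and function name are hypothetical; the patented system works from hand shapes detected in video frames.

```python
def slider_value(thumb_pos, finger_base, finger_tip):
    """Project the thumb position onto the segment from the extended
    finger's base to its tip and return a clamped 0..1 slider value."""
    bx, by = finger_base
    tx, ty = finger_tip
    px, py = thumb_pos
    seg = (tx - bx, ty - by)
    length_sq = seg[0] ** 2 + seg[1] ** 2
    t = ((px - bx) * seg[0] + (py - by) * seg[1]) / length_sq
    return max(0.0, min(1.0, t))

if __name__ == "__main__":
    # Hypothetical 2D landmarks (image coordinates) from one video frame.
    base, tip = (100.0, 200.0), (300.0, 200.0)
    print(slider_value((200.0, 210.0), base, tip))  # 0.5 -> slider at midpoint
```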
  • Publication number: 20220182777
    Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
    Type: Application
    Filed: November 17, 2021
    Publication date: June 9, 2022
    Inventors: Ilteris Canberk, Shin Hwun Kang
  • Publication number: 20220124295
    Abstract: Systems, devices, media, and methods are presented for producing an augmented reality (AR) experience for display on a smart eyewear device. The AR production system includes a marker registration utility for setting and storing markers, a localization utility for locating the eyewear device relative to a marker location and to the mapped environment, and a virtual object rendering utility for presenting one or more virtual objects having a desired size, shape, and orientation. A high-definition camera captures an input image of the environment. If the input image includes a marker, the system retrieves from memory a set of data including a first marker location expressed in terms relative to a marker coordinate system. The localization utility determines a local position of the eyewear device relative to the marker location.
    Type: Application
    Filed: December 31, 2021
    Publication date: April 21, 2022
    Inventors: Ilteris Canberk, Shin Hwun Kang, Kristina Marrero
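As a rough illustration of localizing the device relative to a marker location, the sketch below translates the device position into a marker-centered frame and rotates it by the marker's registered yaw. The 2D simplification, pose values, and function name are assumptions; a full system would use a 3D pose (rotation and translation) for the marker coordinate system.

```python
import math

def local_position(device_world_xy, marker_world_xy, marker_yaw_rad):
    """Express the eyewear device's position relative to a registered marker:
    translate to the marker's origin, then rotate into the marker's frame."""
    dx = device_world_xy[0] - marker_world_xy[0]
    dy = device_world_xy[1] - marker_world_xy[1]
    cos_a, sin_a = math.cos(-marker_yaw_rad), math.sin(-marker_yaw_rad)
    return (dx * cos_a - dy * sin_a, dx * sin_a + dy * cos_a)

if __name__ == "__main__":
    # Marker registered 2 m ahead of the world origin, rotated 90 degrees.
    device = (1.0, 0.0)
    marker = (2.0, 0.0)
    print(local_position(device, marker, math.pi / 2))  # approximately (0.0, 1.0)
```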