Patents by Inventor Shin Hwun Kang

Shin Hwun Kang is named as an inventor on the following patent filings. This listing includes patent applications that are still pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240144611
    Abstract: Eyewear presenting text corresponding to spoken words (e.g., in speech bubbles) and optionally translating from one language to another. In one example, an interactive augmented reality experience is provided between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be speech spoken by a remote second user of eyewear that is converted to text. The converted text can be displayed on a display of the eyewear of the first user proximate the viewed second user. The personal attribute may be displayed in a speech bubble proximate the second user, such as proximate the head or mouth of the second user. The language of the spoken words can be recognized by the second user's eyewear and translated into a language that is understood by the first user.
    Type: Application
    Filed: December 21, 2023
    Publication date: May 2, 2024
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
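The abstract above sketches a pipeline: capture a remote speaker's speech, convert it to text, detect and translate the language, and anchor the result as a speech bubble near the speaker's head. Below is a minimal illustrative Python sketch; `transcribe`, `translate`, and the bubble offset are hypothetical stand-ins, not the implementation claimed in the filing.

```python
from dataclasses import dataclass

@dataclass
class SpeechBubble:
    text: str
    anchor_xyz: tuple  # position near the remote speaker's head

def transcribe(audio_frames, lang_hint=None):
    """Placeholder speech-to-text; a real device would run an ASR model here.
    Returns (text, detected_language)."""
    return "hello", "en"

def translate(text, source_lang, target_lang):
    """Placeholder translation; swap in any on-device or cloud MT model."""
    return text if source_lang == target_lang else f"[{target_lang}] {text}"

def build_speech_bubble(audio_frames, speaker_head_xyz, viewer_lang):
    """Convert the remote user's speech to text, translate it into the viewing
    user's language, and anchor the bubble just above the speaker's head."""
    text, detected_lang = transcribe(audio_frames)
    display_text = translate(text, detected_lang, viewer_lang)
    x, y, z = speaker_head_xyz
    return SpeechBubble(text=display_text, anchor_xyz=(x, y + 0.25, z))

if __name__ == "__main__":
    print(build_speech_bubble(b"", (0.0, 1.6, 2.0), viewer_lang="es"))
```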
  • Publication number: 20240119679
    Abstract: Systems and methods are provided for performing operations on an augmented reality (AR) device using an external screen streaming system. The system establishes, by one or more processors of the AR device, a communication with an external client device. The system causes the AR device to overlay a first AR object on a real-world environment being viewed using the AR device. The system receives, by the AR device, a first image from the external client device. In response to receiving the first image from the external client device, the system overlays the first image on the first AR object at the AR device.
    Type: Application
    Filed: October 5, 2022
    Publication date: April 11, 2024
    Inventors: Ilteris Kaan Canberk, Bernhard Jung, Shin Hwun Kang, Daria Skrypnyk, Tianyi Sun, Lien Le Hong Tran
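As a rough illustration of the flow described above (connect to an external client device, anchor a virtual surface, then texture it with streamed frames), here is a minimal Python sketch; the class names and the byte-string "frame" are hypothetical, and the real transport and rendering layers are omitted.

```python
from dataclasses import dataclass

@dataclass
class AROverlaySurface:
    """A hypothetical AR object (e.g., a virtual screen) anchored in the room."""
    anchor_xyz: tuple
    texture: bytes = b""  # most recent frame streamed from the external device

@dataclass
class ScreenStreamSession:
    surface: AROverlaySurface
    connected: bool = False

    def connect(self, client_id):
        # Stand-in for the Bluetooth/Wi-Fi handshake with the external client device.
        self.connected = True

    def on_frame(self, image_bytes):
        """Overlay the streamed image on the first AR object."""
        if not self.connected:
            raise RuntimeError("no external client connected")
        self.surface.texture = image_bytes

if __name__ == "__main__":
    session = ScreenStreamSession(AROverlaySurface(anchor_xyz=(0.0, 1.5, 1.0)))
    session.connect("phone-1234")
    session.on_frame(b"<frame bytes>")
    print(len(session.surface.texture))
```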
  • Publication number: 20240112383
    Abstract: An augmented reality (AR) content system is provided. The AR content system may analyze audio input obtained from a user to generate a search request. The AR content system may obtain search results in response to the search request and determine a layout by which to display the search results. The search results may be displayed in a user interface within an AR environment according to the layout. The AR content system may also analyze audio input to detect commands to perform with respect to content displayed in the user interface.
    Type: Application
    Filed: October 4, 2022
    Publication date: April 4, 2024
    Inventors: Shin Hwun Kang, Lien Le Hong Tran
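A compact way to picture the voice-to-layout flow above: turn an utterance into a query, fetch results, and pick a grid or list layout depending on what came back. The heuristic, the "show me" prefix, and `SearchResult` are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    title: str
    kind: str  # e.g. "image", "video", "text"

def layout_for(results):
    """Illustrative heuristic: a grid for mostly visual results, otherwise a list."""
    visual = sum(r.kind in ("image", "video") for r in results)
    return "grid" if results and visual / len(results) > 0.5 else "list"

def handle_utterance(utterance, search_fn):
    """Turn a spoken utterance into a search request, fetch results, and decide
    how the AR panel should lay them out."""
    query = utterance.removeprefix("show me").strip()
    results = search_fn(query)
    return {"query": query, "layout": layout_for(results), "results": results}

if __name__ == "__main__":
    def fake_search(query):
        return [SearchResult("Sunset photo", "image"), SearchResult("Beach video", "video")]
    print(handle_utterance("show me beaches nearby", fake_search))
```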
  • Publication number: 20240107256
    Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
    Type: Application
    Filed: December 7, 2023
    Publication date: March 28, 2024
    Inventors: Ilteris Canberk, Shin Hwun Kang
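The proximity-alert behavior described above can be approximated with a simple stereo-panning rule: silence outside an alert radius, and louder, more strongly panned audio as the wearer gets closer to the virtual object. The radius, panning law, and coordinate convention below are assumptions, not the patented method.

```python
import math

def bearing_and_distance(device_xyz, device_yaw_rad, target_xyz):
    """Bearing of the target relative to where the wearer is facing (radians,
    positive = to the right) plus straight-line distance in the ground plane."""
    dx = target_xyz[0] - device_xyz[0]
    dz = target_xyz[2] - device_xyz[2]
    distance = math.hypot(dx, dz)
    bearing = math.atan2(dx, dz) - device_yaw_rad
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    return bearing, distance

def alert_gains(device_xyz, device_yaw_rad, target_xyz, alert_radius=3.0):
    """Left/right speaker gains for a proximity alert: silent outside the
    radius, louder and more strongly panned as the object gets closer."""
    bearing, distance = bearing_and_distance(device_xyz, device_yaw_rad, target_xyz)
    if distance > alert_radius:
        return 0.0, 0.0
    loudness = 1.0 - distance / alert_radius
    pan = math.sin(bearing)          # -1 = hard left, +1 = hard right
    left = loudness * (1.0 - pan) / 2.0
    right = loudness * (1.0 + pan) / 2.0
    return round(left, 3), round(right, 3)

if __name__ == "__main__":
    # Object 2 m ahead and slightly to the right of the wearer.
    print(alert_gains((0, 0, 0), 0.0, (0.5, 0.0, 2.0)))
```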
  • Patent number: 11934570
    Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker position for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a "spheroidal balancing" augmented reality experience.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: March 19, 2024
    Assignee: Snap Inc.
    Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
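Stripped to its essentials, the loop above is: move the user-controlled game piece with the eyewear's tracked pose and score when it comes close to the interaction piece. The sketch below uses a hypothetical `GameState` and an arbitrary scoring radius purely for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class GameState:
    """Hypothetical state for the two registered marker positions."""
    user_piece_xyz: tuple      # follows the eyewear's tracked pose
    target_piece_xyz: tuple    # fixed interaction virtual game piece
    score: int = 0

    def on_pose_update(self, eyewear_xyz, scoring_radius=0.3):
        """Move the user-controlled piece with the eyewear and score a point
        whenever it comes within the scoring radius of the interaction piece."""
        self.user_piece_xyz = eyewear_xyz
        if math.dist(self.user_piece_xyz, self.target_piece_xyz) < scoring_radius:
            self.score += 1

if __name__ == "__main__":
    state = GameState(user_piece_xyz=(0, 0, 0), target_piece_xyz=(0.1, 0.0, 0.2))
    state.on_pose_update((0.1, 0.05, 0.15))
    print(state.score)   # 1 -- the pieces are within 0.3 m of each other
```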
  • Publication number: 20240087237
    Abstract: Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position, the marker positions defined with respect to the environment and associated with guidance markers, registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Shin Hwun Kang, Dmytro Kucher, Dmytro Hovorov, Ilteris Canberk
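One frame of the guidance loop described above might look like the following: filter the registered marker positions to those within a threshold of the current position and emit a draw list for the display system. The marker dictionary shape and the 10 m default threshold are illustrative assumptions.

```python
import math

def markers_in_range(current_xyz, marker_positions, threshold_m=10.0):
    """Return only the guidance markers whose registered positions lie within
    the threshold distance of the eyewear's current position."""
    return [m for m in marker_positions
            if math.dist(current_xyz, m["xyz"]) <= threshold_m]

def build_overlay(current_xyz, marker_positions, threshold_m=10.0):
    """One pass of the guidance loop: pick nearby markers and emit a draw list
    that the display system could consume."""
    nearby = markers_in_range(current_xyz, marker_positions, threshold_m)
    return [{"label": m["label"], "xyz": m["xyz"]} for m in nearby]

if __name__ == "__main__":
    markers = [{"label": "Turn left", "xyz": (2.0, 0.0, 3.0)},
               {"label": "Exit",      "xyz": (40.0, 0.0, 5.0)}]
    print(build_overlay((0.0, 0.0, 0.0), markers))  # only "Turn left" is close enough
```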
  • Patent number: 11928306
    Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras, and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and in response to the content capture user input, capturing a new content item corresponding to the scene.
    Type: Grant
    Filed: September 20, 2021
    Date of Patent: March 12, 2024
    Assignee: SNAP INC.
    Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
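The touchpad interaction in the abstract above is essentially a small state machine: horizontal swipes rotate the carousel, a tap applies the icon in the selection position, and a capture input records the enhanced scene. A minimal sketch, with hypothetical icon names and event methods:

```python
from dataclasses import dataclass, field

@dataclass
class EffectsCarousel:
    """Minimal model of the touchpad-driven AR effects carousel."""
    icons: list
    selection_index: int = 0
    active_effect: str = ""
    captures: list = field(default_factory=list)

    def on_horizontal_swipe(self, delta):
        # Rotate the carousel; the icon at selection_index sits in the selection slot.
        self.selection_index = (self.selection_index + delta) % len(self.icons)

    def on_tap(self):
        # Apply the AR effect currently in the selection position to the camera scene.
        self.active_effect = self.icons[self.selection_index]

    def on_capture(self, scene_frame):
        # Capture a new content item of the scene with the active effect applied.
        self.captures.append((self.active_effect, scene_frame))

if __name__ == "__main__":
    ui = EffectsCarousel(icons=["sparkles", "face_paint", "rain"])
    ui.on_horizontal_swipe(+1)   # rotate one slot
    ui.on_tap()                  # select "face_paint"
    ui.on_capture(b"<frame>")    # capture the enhanced scene
    print(ui.active_effect, len(ui.captures))
```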
  • Patent number: 11914770
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group object manipulation task. During the shared group task, each user of the eyewear controls movement of a respective virtual object in a virtual scene based on a portion of the virtual scene the user is gazing at. Each user can also generate a verbal command to generate a virtual object that interacts with one or more of the other virtual objects.
    Type: Grant
    Filed: January 18, 2023
    Date of Patent: February 27, 2024
    Assignee: SNAP INC.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
  • Publication number: 20240053858
    Abstract: Systems and methods for controlling an Internet of Things (IoT) device through interaction with an augmented reality (AR) camera include pairing an interactable augmented reality (AR) overlay with the IoT device. The interactable overlay includes control information for generating a control signal for controlling the IoT device when the interactable overlay is interacted with by the AR camera. The interactable overlay is presented on a display of the AR camera when the IoT device is in a field of view of the AR camera. An indication is provided when the user points the AR camera at the interactable overlay associated with the IoT device on the display of the AR camera, and a control signal associated with the interactable overlay pointed at by the AR camera is provided to the IoT device associated with that overlay.
    Type: Application
    Filed: October 23, 2023
    Publication date: February 15, 2024
    Inventor: Shin Hwun Kang
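A minimal sketch of the pairing-and-pointing flow above: each interactable overlay carries the control payload for its IoT device, only overlays whose device is in the camera's field of view are drawn, and pointing at one sends its payload. `InteractableOverlay`, the payload shape, and `send_fn` are placeholders, not the claimed system.

```python
from dataclasses import dataclass

@dataclass
class InteractableOverlay:
    """Hypothetical pairing record: one AR overlay mapped to one IoT device."""
    device_id: str
    anchor_xyz: tuple
    control_payload: dict      # e.g. {"command": "toggle_power"}

def visible_overlays(overlays, camera_fov_check):
    """Overlays to draw this frame: only those whose paired device is in view."""
    return [o for o in overlays if camera_fov_check(o.anchor_xyz)]

def on_point_at(overlay, send_fn):
    """When the AR camera is pointed at an overlay, emit its control signal
    to the paired IoT device (send_fn stands in for the real transport)."""
    send_fn(overlay.device_id, overlay.control_payload)

if __name__ == "__main__":
    lamp = InteractableOverlay("lamp-01", (1.0, 1.2, 2.5), {"command": "toggle_power"})
    in_view = lambda xyz: xyz[2] > 0          # trivial stand-in for a FOV test
    for overlay in visible_overlays([lamp], in_view):
        on_point_at(overlay, lambda dev, payload: print("send", dev, payload))
```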
  • Publication number: 20240050831
    Abstract: An application enables users of electronic eyewear devices to bring a workout session anywhere and have aspects of the workout session (e.g., number of repetitions) recorded automatically. It also allows users to have always-on visibility of their workout instructor so that they can, for example, continue to maintain a visual guide for their form/pose. Virtual avatars perform demonstrations about activities such as yoga in an augmented reality environment using the display of an electronic eyewear device. The demonstration elements can be controlled using voice commands.
    Type: Application
    Filed: September 30, 2022
    Publication date: February 15, 2024
    Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
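The automatic repetition counting mentioned above can be pictured as a simple hysteresis counter over a one-dimensional pose signal (for example, a normalized joint angle): one rep per full down-up cycle. The thresholds and the signal itself are illustrative assumptions.

```python
class RepCounter:
    """Count workout repetitions from a 1-D pose signal, counting one rep for
    each full down-up cycle (hysteresis avoids double-counting jitter)."""
    def __init__(self, low=0.3, high=0.7):
        self.low, self.high = low, high
        self.phase = "up"
        self.reps = 0

    def update(self, value):
        if self.phase == "up" and value < self.low:
            self.phase = "down"
        elif self.phase == "down" and value > self.high:
            self.phase = "up"
            self.reps += 1
        return self.reps

if __name__ == "__main__":
    counter = RepCounter()
    for v in [0.9, 0.2, 0.8, 0.1, 0.9]:   # two full cycles
        counter.update(v)
    print(counter.reps)   # 2
```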
  • Publication number: 20240045494
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device. A portable eyewear device includes a processor, a memory, and a display projected onto at least one lens assembly. The memory has programming stored therein that, when executed by the processor, captures information depicting an environment surrounding the device and identifies a match between objects in that information and predetermined objects in previously obtained information for the same environment. When the position of the eyewear device reaches a preselected location with respect to the matched objects, a physical output is provided to produce the immersive experience. The physical output changes as the position of the eyewear device moves to maintain the immersive experience.
    Type: Application
    Filed: October 23, 2023
    Publication date: February 8, 2024
    Inventors: Ilteris Canberk, Shin Hwun Kang, Doug Mead
  • Patent number: 11869156
    Abstract: Eyewear presenting text corresponding to spoken words (e.g., in speech bubbles) and optionally translating from one language to another. In one example, an interactive augmented reality experience is provided between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be speech spoken by a remote second user of eyewear that is converted to text. The converted text can be displayed on a display of the eyewear of the first user proximate the viewed second user. The personal attribute may be displayed in a speech bubble proximate the second user, such as proximate the head or mouth of the second user. The language of the spoken words can be recognized by the second user's eyewear and translated into a language that is understood by the first user.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: January 9, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
  • Patent number: 11863963
    Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
    Type: Grant
    Filed: November 17, 2021
    Date of Patent: January 2, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang
  • Patent number: 11863596
    Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The method may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
    Type: Grant
    Filed: December 7, 2021
    Date of Patent: January 2, 2024
    Assignee: Snap Inc.
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
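The join-or-initiate decision described above reduces to scanning nearby short-range advertisements for an existing session of the same experience. The sketch below is a hypothetical decision function; the beacon fields and the "shared_canvas" experience name are made up for illustration.

```python
def decide_session_action(user_wants_joint_session, nearby_beacons):
    """Decide whether to start a new joint session or offer to join one that a
    nearby headset is already advertising over short-range radio."""
    if not user_wants_joint_session:
        return "solo"
    current = [b for b in nearby_beacons if b.get("experience") == "shared_canvas"]
    if current:
        # A session is already in progress nearby; surface a "join" option.
        return f"offer_join:{current[0]['session_id']}"
    return "initiate_new_session"

if __name__ == "__main__":
    beacons = [{"experience": "shared_canvas", "session_id": "abc123"}]
    print(decide_session_action(True, beacons))    # offer_join:abc123
    print(decide_session_action(True, []))         # initiate_new_session
```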
  • Patent number: 11854147
    Abstract: Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position, the marker positions defined with respect to the environment and associated with guidance markers, registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.
    Type: Grant
    Filed: February 28, 2022
    Date of Patent: December 26, 2023
    Assignee: Snap Inc.
    Inventors: Shin Hwun Kang, Dmytro Kucher, Dmytro Hovorov, Ilteris Canberk
  • Publication number: 20230412650
    Abstract: A head-worn device system includes one or more cameras, one or more display devices and one or more processors. The system also includes a memory storing instructions that, when executed by the one or more processors, configure the system to perform operations to initiate or join a joint visual computing session. The method may comprise receiving user input to initiate a joint session of a visual computing experience, monitoring for short-range data transmissions including data indicating the existence of a current session of the visual computing experience, and based on determining that a current session is in process, providing a user input option to join the current session of the visual computing experience.
    Type: Application
    Filed: August 25, 2023
    Publication date: December 21, 2023
    Inventors: Kristian Bauer, Tiago Rafael Duarte, Terek Judi, Shin Hwun Kang, Karen Stolzenberg
  • Patent number: 11809680
    Abstract: Systems and methods for controlling an Internet of Things (IoT) device through interaction with an augmented reality (AR) camera include pairing an interactable augmented reality (AR) overlay with the IoT device. The interactable overlay includes control information for generating a control signal for controlling the IoT device when the interactable overlay is interacted with by the AR camera. The interactable overlay is presented on a display of the AR camera when the IoT device is in a field of view of the AR camera. An indication is provided when the user points the AR camera at the interactable overlay associated with the IoT device on the display of the AR camera, and a control signal associated with the interactable overlay pointed at by the AR camera is provided to the IoT device associated with that overlay.
    Type: Grant
    Filed: December 30, 2021
    Date of Patent: November 7, 2023
    Assignee: Snap Inc.
    Inventor: Shin Hwun Kang
  • Patent number: 11803234
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device. A portable eyewear device includes a processor, a memory, and a display projected onto at least one lens assembly. The memory has programming stored therein that, when executed by the processor, captures information depicting an environment surrounding the device and identifies a match between objects in that information and predetermined objects in previously obtained information for the same environment. When the position of the eyewear device reaches a preselected location with respect to the matched objects, a physical output is provided to produce the immersive experience. The physical output changes as the position of the eyewear device moves to maintain the immersive experience.
    Type: Grant
    Filed: November 16, 2022
    Date of Patent: October 31, 2023
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Doug Mead
  • Publication number: 20230315383
    Abstract: Disclosed are systems and methods for voice-based control of augmented reality (AR) objects on a wearable device. The systems and methods perform operations comprising: instructing a display element of the AR wearable device to present a visual indicator representing a cursor; receiving voice input representing a first virtual object; determining a real-world position within a real-world environment being viewed through the AR wearable device based on a current position of the visual indicator; and instructing the display element of the AR wearable device to present the first virtual object within the real-world environment at the real-world position.
    Type: Application
    Filed: April 4, 2022
    Publication date: October 5, 2023
    Inventors: Ilteris Kaan Canberk, Shin Hwun Kang
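A compact model of the cursor-plus-voice placement loop above: the visual indicator tracks a position in the real-world environment, and a recognized voice command anchors the named virtual object at that position. The "place a ..." grammar and the session class are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class VoicePlacementSession:
    """Cursor-based placement loop: the cursor tracks where the wearer is
    looking; a voice command drops the named object at that spot."""
    cursor_xyz: tuple = (0.0, 0.0, 1.0)
    placed: list = field(default_factory=list)

    def on_cursor_update(self, raycast_hit_xyz):
        # The display element keeps the visual indicator at this position.
        self.cursor_xyz = raycast_hit_xyz

    def on_voice(self, utterance):
        # e.g. "place a chair" -> anchor a "chair" object at the cursor position.
        if utterance.startswith("place a "):
            obj = utterance.removeprefix("place a ").strip()
            self.placed.append({"object": obj, "xyz": self.cursor_xyz})

if __name__ == "__main__":
    s = VoicePlacementSession()
    s.on_cursor_update((0.4, 0.0, 2.1))
    s.on_voice("place a chair")
    print(s.placed)
```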
  • Publication number: 20230292077
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and speakers. The eyewear device captures image information for an environment surrounding the device, identifies a match between objects in the image information and predetermined objects in previously obtained information for the same environment. The eyewear device then identifies a target location within the environment, which may be associated with a physical or a virtual object. The eyewear device monitors its orientation with respect to the target location and presents audio signals to guide the user toward the target location.
    Type: Application
    Filed: May 16, 2023
    Publication date: September 14, 2023
    Inventors: Ilteris Canberk, Shin Hwun Kang, James Powderly
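The orientation-based audio guidance above can be caricatured as mapping the wearer's heading error relative to the target location into a left/right/both-ears cue; a real system would render fully spatialized audio instead. The 10-degree dead zone, coordinate convention, and cue names below are arbitrary assumptions.

```python
import math

def guidance_cue(device_yaw_rad, device_xyz, target_xyz):
    """Map the wearer's heading error relative to the target into a coarse
    audio cue (which earpiece to pulse)."""
    dx = target_xyz[0] - device_xyz[0]
    dz = target_xyz[2] - device_xyz[2]
    error = math.atan2(dx, dz) - device_yaw_rad
    error = (error + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    if abs(error) < math.radians(10):
        return "both_ears_soft_tick"      # roughly facing the target
    return "right_ear_tick" if error > 0 else "left_ear_tick"

if __name__ == "__main__":
    print(guidance_cue(0.0, (0, 0, 0), (-1.0, 0.0, 1.0)))  # target is to the left
```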