Patents by Inventor Shin Hwun Kang
Shin Hwun Kang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240397033
Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
Type: Application
Filed: August 7, 2024
Publication date: November 28, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
-
Publication number: 20240393887
Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
Type: Application
Filed: August 8, 2024
Publication date: November 28, 2024
Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
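The thumb-along-finger slider described above amounts to projecting the thumb position onto the finger axis and clamping the result to the calibrated scale. The sketch below is illustrative only and not code from the patent; the 2D landmark points and function name are hypothetical stand-ins for the output of a hand-tracking model.

```python
def slider_value(thumb_pos, finger_base, finger_tip):
    """Map a thumb position along an extended finger to a 0..1 slider value.

    All points are hypothetical (x, y) landmarks from a hand tracker.
    """
    fx, fy = finger_tip[0] - finger_base[0], finger_tip[1] - finger_base[1]
    tx, ty = thumb_pos[0] - finger_base[0], thumb_pos[1] - finger_base[1]
    finger_len_sq = fx * fx + fy * fy
    if finger_len_sq == 0:
        return 0.0
    # Project the thumb onto the finger axis (the "finger scale"),
    # then clamp to the calibrated graphical scale [0, 1].
    t = (tx * fx + ty * fy) / finger_len_sq
    return max(0.0, min(1.0, t))
```

A thumb halfway along a finger from (0, 0) to (10, 0) yields 0.5; positions past the fingertip or behind the base clamp to the ends of the scale.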
-
Patent number: 12151138
Abstract: Example systems, devices, media, and methods are described for evaluating movements and physical exercises in augmented reality using the display of an eyewear device. A motion evaluation application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) on the eyewear device. The method includes presenting virtual targets on the display, localizing the current eyewear device location based on the captured motion data, and presenting virtual indicators on the display. The virtual targets represent goals or benchmarks for the user to achieve using body postures. The method includes determining whether the eyewear device location represents an intersecting posture relative to the virtual targets, based on the IMU data. The virtual indicators display real-time feedback about user posture or performance relative to the virtual targets.
Type: Grant
Filed: January 26, 2022
Date of Patent: November 26, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
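The "intersecting posture" check above reduces to asking whether the localized device position falls within some radius of each virtual target. A minimal sketch, assuming 3D positions in meters and a hypothetical hit radius (neither the representation nor the radius comes from the patent):

```python
def intersects_target(device_pos, target_pos, radius=0.15):
    """True when the eyewear position is within `radius` meters of a virtual target."""
    dx = device_pos[0] - target_pos[0]
    dy = device_pos[1] - target_pos[1]
    dz = device_pos[2] - target_pos[2]
    return dx * dx + dy * dy + dz * dz <= radius * radius

def feedback(device_pos, targets, radius=0.15):
    # One real-time indicator per virtual target: "hit" when the
    # current posture intersects it, "miss" otherwise.
    return ["hit" if intersects_target(device_pos, t, radius) else "miss"
            for t in targets]
```

The squared-distance comparison avoids a square root per target, a common micro-optimization when this runs every IMU frame.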
-
Patent number: 12108011
Abstract: Systems, devices, media, and methods are presented for producing an augmented reality (AR) experience for display on a smart eyewear device. The AR production system includes a marker registration utility for setting and storing markers, a localization utility for locating the eyewear device relative to a marker location and to the mapped environment, and a virtual object rendering utility for presenting one or more virtual objects having a desired size, shape, and orientation. A high-definition camera captures an input image of the environment. If the input image includes a marker, the system retrieves from memory a set of data including a first marker location expressed in terms relative to a marker coordinate system. The localization utility determines a local position of the eyewear device relative to the marker location.
Type: Grant
Filed: December 31, 2021
Date of Patent: October 1, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Kristina Marrero
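The localization step above composes the device pose observed relative to the detected marker with the marker's stored location. As a deliberately simplified sketch: real marker localization uses full 6-DoF rigid transforms (rotation plus translation), but the translation-only version below shows the composition idea. All names are hypothetical.

```python
def localize(device_in_marker, marker_in_world):
    """Compose a device position expressed in the marker coordinate system
    with the marker's stored world location to get the device's world position.

    Positions are (x, y, z) translations only; rotation is omitted for brevity.
    """
    return tuple(d + m for d, m in zip(device_in_marker, marker_in_world))
```

With a full pose this composition would be a matrix (or quaternion) product rather than a per-axis sum.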
-
Patent number: 12088781
Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
Type: Grant
Filed: December 30, 2021
Date of Patent: September 10, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
-
Patent number: 12086324
Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
Type: Grant
Filed: December 14, 2021
Date of Patent: September 10, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
-
Publication number: 20240296633
Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device acquires text from a user input, a data input, or a speech recognition system. The eyewear device presents a visual text graphic at a predefined location with respect to the eyewear on the display by the display system. The eyewear device allows the user to manipulate the visual text graphic in a number of ways.
Type: Application
Filed: April 25, 2024
Publication date: September 5, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang, Daniel Moreno
-
Patent number: 12061842
Abstract: Disclosed are systems and methods for voice-based control of augmented reality (AR) objects on a wearable device. The systems and methods perform operations comprising: instructing a display element of the AR wearable device to present a visual indicator representing a cursor; receiving voice input representing a first virtual object; determining a real-world position within a real-world environment being viewed through the AR wearable device based on a current position of the visual indicator; and instructing the display element of the AR wearable device to present the first virtual object within the real-world environment at the real-world position.
Type: Grant
Filed: April 4, 2022
Date of Patent: August 13, 2024
Assignee: Snap Inc.
Inventors: Ilteris Kaan Canberk, Shin Hwun Kang
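The flow above pairs a voice-named object with the cursor's current real-world position. A toy sketch of that pairing, assuming a hypothetical "place <object>" command grammar and a scene represented as a plain list (none of which is specified by the patent):

```python
def handle_voice_command(command, cursor_world_pos, scene):
    """Place the object named in a voice command at the cursor's
    real-world position, appending it to the scene. Returns True on success."""
    words = command.lower().split()
    if len(words) >= 2 and words[0] == "place":
        scene.append({"object": words[1], "position": cursor_world_pos})
        return True
    return False
```

Saying "place chair" while the cursor rests at some world position would append a chair entry anchored at that position; unrecognized commands leave the scene untouched.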
-
Publication number: 20240202470
Abstract: An augmented reality (AR) translation system is provided. The AR translation system may analyze camera data to determine objects included in a field of view of a camera of a user device. Augmented reality content may be provided that includes a visual translation of an object included in the field of view from a primary language of the user to an additional language. An audible version of the translation may also be provided as part of the augmented reality content. Users may also add an object in the field of view to a listing of translated objects associated with the user based on at least one of touch input, audio input, or gesture input.
Type: Application
Filed: December 16, 2022
Publication date: June 20, 2024
Inventors: Ilteris Kaan Canberk, Shin Hwun Kang
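The visual-translation step above attaches a target-language label to each detected object. A minimal sketch, assuming detected objects arrive as name strings and using a hypothetical in-memory dictionary in place of a real translation service:

```python
def translate_labels(detected_objects, dictionary, target_lang):
    """Attach a target-language label to each detected object name,
    falling back to the original name when no translation is known."""
    lang_table = dictionary.get(target_lang, {})
    return [{"name": obj, "label": lang_table.get(obj, obj)}
            for obj in detected_objects]
```

The fallback keeps the overlay usable when the dictionary has gaps, which matters for open-vocabulary object detection.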
-
Publication number: 20240193875
Abstract: Systems, methods, and computer-readable media for an augmented reality (AR) shared screen space. Examples relate to a host AR device that shares a screen, and its location relative to that screen, with guest AR devices. Each guest AR device shares its own location relative to a copy of the screen displayed on its display, so that the users of the AR devices may see each other's locations as avatars around the shared screen and may add augmentations to the shared screen. The yaw, roll, and pitch of each avatar's head track the head movement of the corresponding AR wearable device user.
Type: Application
Filed: December 9, 2022
Publication date: June 13, 2024
Inventors: Ilteris Kaan Canberk, Bernhard Jung, Shin Hwun Kang, Daria Skrypnyk
-
Patent number: 12008153
Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a "spheroidal balancing" augmented reality experience.
Type: Grant
Filed: November 16, 2022
Date of Patent: June 11, 2024
Assignee: Snap Inc.
Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
-
Patent number: 11995774
Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device acquires text from a user input, a data input, or a speech recognition system. The eyewear device presents a visual text graphic at a predefined location with respect to the eyewear on the display by the display system. The eyewear device allows the user to manipulate the visual text graphic in a number of ways.
Type: Grant
Filed: December 18, 2020
Date of Patent: May 28, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Daniel Moreno
-
Publication number: 20240144611
Abstract: Eyewear that presents text corresponding to spoken words (e.g., in speech bubbles) and optionally translates the text from one language to another. In one example, an interactive augmented reality experience is provided between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be speech spoken by a remote second user of eyewear converted to text. The converted text can be displayed on a display of eyewear of the first user proximate the viewed second user. The personal attribute may be displayed in a speech bubble proximate the second user, such as proximate the head or mouth of the second user. The language of the spoken speech can be recognized by the second user eyewear, and translated to a language that is understood by the first user.
Type: Application
Filed: December 21, 2023
Publication date: May 2, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
-
Publication number: 20240119679
Abstract: Systems and methods are provided for performing operations on an augmented reality (AR) device using an external screen streaming system. The system establishes, by one or more processors of an AR device, communication with an external client device. The AR device overlays a first AR object on a real-world environment being viewed using the AR device. The AR device receives a first image from the external client device and, in response, overlays the first image on the first AR object.
Type: Application
Filed: October 5, 2022
Publication date: April 11, 2024
Inventors: Ilteris Kaan Canberk, Bernhard Jung, Shin Hwun Kang, Daria Skrypnyk, Tianyi Sun, Lien Le Hong Tran
-
Publication number: 20240112383
Abstract: An augmented reality (AR) content system is provided. The AR content system may analyze audio input obtained from a user to generate a search request. The AR content system may obtain search results in response to the search request and determine a layout by which to display the search results. The search results may be displayed in a user interface within an AR environment according to the layout. The AR content system may also analyze audio input to detect commands to perform with respect to content displayed in the user interface.
Type: Application
Filed: October 4, 2022
Publication date: April 4, 2024
Inventors: Shin Hwun Kang, Lien Le Hong Tran
-
Publication number: 20240107256
Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
Type: Application
Filed: December 7, 2023
Publication date: March 28, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang
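The proximity alert above implies some mapping from device-to-object distance to an audio cue. As an illustrative sketch only (the patent does not specify the attenuation model), a linear distance-based gain with a hypothetical maximum audible range:

```python
import math

def audio_gain(device_pos, object_pos, max_range=5.0):
    """Distance-based alert gain: louder as the wearer nears the virtual
    object's location, silent beyond max_range meters."""
    d = math.dist(device_pos, object_pos)
    if d >= max_range:
        return 0.0
    return 1.0 - d / max_range
```

Real spatial audio would also pan the signal by azimuth relative to head orientation; this sketch covers only the proximity component.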
-
Patent number: 11934570
Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a "spheroidal balancing" augmented reality experience.
Type: Grant
Filed: November 16, 2022
Date of Patent: March 19, 2024
Assignee: Snap Inc.
Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
-
Publication number: 20240087237
Abstract: Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position (the marker positions being defined with respect to the environment and associated with guidance markers), registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.
Type: Application
Filed: November 15, 2023
Publication date: March 14, 2024
Inventors: Shin Hwun Kang, Dmytro Kucher, Dmytro Hovorov, Ilteris Canberk
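The marker-selection step above filters registered guidance markers by distance from the current device position. A minimal sketch, assuming markers are stored as dictionaries with environment-relative positions and using a hypothetical threshold in meters:

```python
import math

def nearby_markers(current_pos, markers, threshold=2.0):
    """Return the guidance markers whose registered positions lie within
    `threshold` meters of the eyewear's current position."""
    return [m for m in markers
            if math.dist(current_pos, m["position"]) <= threshold]
```

Only the returned markers would be drawn into the overlay image, so guidance cues appear as the wearer approaches them rather than all at once.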
-
Patent number: 11928306
Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras, and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and in response to the content capture user input, capturing a new content item corresponding to the scene.
Type: Grant
Filed: September 20, 2021
Date of Patent: March 12, 2024
Assignee: Snap Inc.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
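The carousel rotation above is, at its core, a wrapping index update driven by horizontal swipe deltas. An illustrative sketch (the representation of icons and swipe input is hypothetical, not from the patent):

```python
def rotate_carousel(icons, selected, delta):
    """Advance the selection index by `delta` positions (positive for one
    swipe direction, negative for the other), wrapping around the carousel."""
    return (selected + delta) % len(icons)
```

Swiping left past the first icon wraps to the last, so the carousel feels circular regardless of its length.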
-
Patent number: 11914770
Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to perform a shared group object manipulation task. During the shared group task, each user of the eyewear controls movement of a respective virtual object in a virtual scene based on a portion of the virtual scene the user is gazing at. Each user can also generate a verbal command to generate a virtual object that interacts with one or more of the other virtual objects.
Type: Grant
Filed: January 18, 2023
Date of Patent: February 27, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher