Patents by Inventor Ilteris Canberk

Ilteris Canberk has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250110333
    Abstract: Interactive augmented reality experiences with an eyewear device including a virtual eyewear beam. The user can direct the virtual beam by orienting the eyewear device, the user's eye gaze, or both. The eyewear device may detect the direction of an opponent's eyewear device, the opponent's eye gaze, or both. The eyewear device may calculate a score based on hits of the virtual beam of the user and the opponent on respective target areas such as the other player's head or face.
    Type: Application
    Filed: December 13, 2024
    Publication date: April 3, 2025
    Inventors: Ilteris Canberk, Jacob Knipfing
  • Patent number: 12260015
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device. A portable eyewear device includes a processor, a memory, and a display projected onto at least one lens assembly. The memory has programming stored therein that, when executed by the processor, captures information depicting an environment surrounding the device and identifies a match between objects in that information and predetermined objects in previously obtained information for the same environment. When the position of the eyewear device reaches a preselected location with respect to the matched objects, a physical output is provided to produce the immersive experience. The physical output changes as the position of the eyewear device moves to maintain the immersive experience.
    Type: Grant
    Filed: October 23, 2023
    Date of Patent: March 25, 2025
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Doug Mead
  • Publication number: 20250097659
    Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
    Type: Application
    Filed: December 3, 2024
    Publication date: March 20, 2025
    Inventors: Ilteris Canberk, Shin Hwun Kang
  • Publication number: 20250093963
    Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device detects at least one of a hand gesture or movement of the user in the physical environment, and associates the hand gesture with a setting for a virtual object held within the memory of the eyewear device. The eyewear device may then change an attribute of the virtual object based on the detected hand gesture or movement. The eyewear device then provides an output corresponding to one or more attributes of the virtual object. The virtual object may be, for example, a music player or a virtual game piece.
    Type: Application
    Filed: December 5, 2024
    Publication date: March 20, 2025
    Inventor: Ilteris Canberk
  • Patent number: 12256211
    Abstract: Systems, devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and speakers. The eyewear device captures image information for an environment surrounding the device and identifies a match between objects in the image information and predetermined objects in previously obtained information for the same environment. The eyewear device then identifies a target location within the environment, which may be associated with a physical or a virtual object. The eyewear device monitors its orientation with respect to the target location and presents audio signals to guide the user toward the target location.
    Type: Grant
    Filed: May 16, 2023
    Date of Patent: March 18, 2025
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, James Powderly
  • Patent number: 12249036
    Abstract: Eyewear presenting text corresponding to spoken words (e.g., in speech bubbles) and optionally translating from one language to another. In one example, an interactive augmented reality experience is provided between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be speech spoken by a remote second user of eyewear converted to text. The converted text can be displayed on a display of the eyewear of the first user proximate the viewed second user. The personal attribute may be displayed in a speech bubble proximate the second user, such as proximate the head or mouth of the second user. The language of the speech can be recognized by the second user's eyewear and translated into a language understood by the first user.
    Type: Grant
    Filed: December 21, 2023
    Date of Patent: March 11, 2025
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Dmytro Kucher
  • Patent number: 12236535
    Abstract: Augmented reality guidance for guiding a user through an environment using an eyewear device. The eyewear device includes a display system and a position detection system. A user is guided through an environment by monitoring a current position of the eyewear device within the environment, identifying marker positions within a threshold of the current position, the marker positions defined with respect to the environment and associated with guidance markers, registering the marker positions, generating an overlay image including the guidance markers, and presenting the overlay image on a display of the eyewear device.
    Type: Grant
    Filed: November 15, 2023
    Date of Patent: February 25, 2025
    Assignee: Snap Inc.
    Inventors: Shin Hwun Kang, Dmytro Kucher, Dmytro Hovorov, Ilteris Canberk
  • Publication number: 20250041661
    Abstract: Example systems, devices, media, and methods are described for evaluating movements and physical exercises in augmented reality using the display of an eyewear device. A motion evaluation application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) on the eyewear device. The method includes presenting virtual targets on the display, localizing the current eyewear device location based on the captured motion data, and presenting virtual indicators on the display. The virtual targets represent goals or benchmarks for the user to achieve using body postures. The method includes determining whether the eyewear device location represents an intersecting posture relative to the virtual targets, based on the IMU data. The virtual indicators display real-time feedback about user posture or performance relative to the virtual targets.
    Type: Application
    Filed: October 22, 2024
    Publication date: February 6, 2025
    Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
  • Patent number: 12196954
    Abstract: Interactive augmented reality experiences with an eyewear device including a virtual eyewear beam. The user can direct the virtual beam by orienting the eyewear device, the user's eye gaze, or both. The eyewear device may detect the direction of an opponent's eyewear device, the opponent's eye gaze, or both. The eyewear device may calculate a score based on hits of the virtual beam of the user and the opponent on respective target areas such as the other player's head or face.
    Type: Grant
    Filed: July 18, 2023
    Date of Patent: January 14, 2025
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Jacob Knipfing
  • Patent number: 12189861
    Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device detects at least one of a hand gesture or movement of the user in the physical environment, and associates the hand gesture with a setting for a virtual object held within the memory of the eyewear device. The eyewear device may then change an attribute of the virtual object based on the detected hand gesture or movement. The eyewear device then provides an output corresponding to one or more attributes of the virtual object. The virtual object may be, for example, a music player or a virtual game piece.
    Type: Grant
    Filed: March 4, 2021
    Date of Patent: January 7, 2025
    Assignee: Snap Inc.
    Inventor: Ilteris Canberk
  • Patent number: 12192740
    Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
    Type: Grant
    Filed: December 7, 2023
    Date of Patent: January 7, 2025
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang
  • Patent number: 12169968
    Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be selected from a list of attributes, such as Bitmojis®, avatars, symbols, and text. In an example, the personal attribute can reflect the mood of the first user. The personal attribute received from the first user is displayed proximate a portion of the first user, such as a head or mouth, on the display of the first user's eyewear device. The personal attribute may be displayed in a speech bubble proximate the first user.
    Type: Grant
    Filed: December 17, 2020
    Date of Patent: December 17, 2024
    Assignee: Snap Inc.
    Inventor: Ilteris Canberk
  • Publication number: 20240393887
    Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
    Type: Application
    Filed: August 8, 2024
    Publication date: November 28, 2024
    Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
  • Publication number: 20240397033
    Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
    Type: Application
    Filed: August 7, 2024
    Publication date: November 28, 2024
    Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Patent number: 12151138
    Abstract: Example systems, devices, media, and methods are described for evaluating movements and physical exercises in augmented reality using the display of an eyewear device. A motion evaluation application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) on the eyewear device. The method includes presenting virtual targets on the display, localizing the current eyewear device location based on the captured motion data, and presenting virtual indicators on the display. The virtual targets represent goals or benchmarks for the user to achieve using body postures. The method includes determining whether the eyewear device location represents an intersecting posture relative to the virtual targets, based on the IMU data. The virtual indicators display real-time feedback about user posture or performance relative to the virtual targets.
    Type: Grant
    Filed: January 26, 2022
    Date of Patent: November 26, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
  • Patent number: 12108011
    Abstract: Systems, devices, media, and methods are presented for producing an augmented reality (AR) experience for display on a smart eyewear device. The AR production system includes a marker registration utility for setting and storing markers, a localization utility for locating the eyewear device relative to a marker location and to the mapped environment, and a virtual object rendering utility for presenting one or more virtual objects having a desired size, shape, and orientation. A high-definition camera captures an input image of the environment. If the input image includes a marker, the system retrieves from memory a set of data including a first marker location expressed in terms relative to a marker coordinate system. The localization utility determines a local position of the eyewear device relative to the marker location.
    Type: Grant
    Filed: December 31, 2021
    Date of Patent: October 1, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Kristina Marrero
  • Patent number: 12088781
    Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
    Type: Grant
    Filed: December 30, 2021
    Date of Patent: September 10, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Patent number: 12086324
    Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
    Type: Grant
    Filed: December 14, 2021
    Date of Patent: September 10, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
  • Publication number: 20240296633
    Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device acquires text from a user input, a data input, or a speech recognition system. The eyewear device presents a visual text graphic at a predefined location with respect to the eyewear on the display by the display system. The eyewear device allows the user to manipulate the visual text graphic in a number of ways.
    Type: Application
    Filed: April 25, 2024
    Publication date: September 5, 2024
    Inventors: Ilteris Canberk, Shin Hwun Kang, Daniel Moreno
  • Patent number: 12080261
    Abstract: Systems, devices, media, and methods are presented for playing audio sounds, such as music, on a portable electronic device using a digital color image of a note matrix on a map. A computer vision engine, in an example implementation, includes a mapping module, a color detection module, and a music playback module. The camera captures a color image of the map, including a marker and a note matrix. Based on the color image, the computer vision engine detects a token color value associated with each field. Each token color value is associated with a sound sample from a specific musical instrument. A global state map is stored in memory, including the token color value and location of each field in the note matrix. The music playback module, for each column, in order, plays the notes associated with one or more of the rows, using the corresponding sound sample, according to the global state map.
    Type: Grant
    Filed: May 2, 2023
    Date of Patent: September 3, 2024
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Donald Giovannini, Sana Park
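
The spatial-audio guidance described in patent 12256211 (presenting audio signals that guide the user toward a target location) can be illustrated with a minimal panning sketch. This is not the patented implementation; the coordinate convention, yaw angle, and constant-power panning law are all assumptions for illustration.

```python
import math

def stereo_gains(device_yaw_deg, device_pos, target_pos):
    """Return (left, right) speaker gains that pan audio toward a target.

    The world-frame bearing to the target is compared with the device's
    yaw; the sound is panned toward the side the wearer should turn to.
    """
    dx = target_pos[0] - device_pos[0]
    dy = target_pos[1] - device_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))          # bearing to target
    rel = (bearing - device_yaw_deg + 180) % 360 - 180  # signed offset in [-180, 180)
    pan = max(-1.0, min(1.0, rel / 90.0))               # clamp to full-left/full-right
    # Constant-power panning keeps perceived loudness steady while turning.
    angle = (pan + 1) * math.pi / 4                     # 0 -> all left, pi/2 -> all right
    return math.cos(angle), math.sin(angle)
```

When the target is dead ahead, both gains are equal; as the offset grows, the gain shifts entirely to one channel, giving the wearer a continuous directional cue.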
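
The guidance step in patent 12236535 (identifying marker positions within a threshold of the current position before registering them into the overlay) amounts to a proximity filter. The tuple layout of a marker record below is a hypothetical choice, not the patent's data model.

```python
import math

def nearby_markers(current_pos, markers, threshold):
    """Return names of markers within `threshold` of the current device
    position, nearest first, ready to be drawn into the overlay image.

    `markers` is an iterable of (name, x, y) records (hypothetical layout).
    """
    cx, cy = current_pos
    hits = []
    for name, mx, my in markers:
        d = math.hypot(mx - cx, my - cy)   # Euclidean distance to the marker
        if d <= threshold:
            hits.append((d, name))
    return [name for _, name in sorted(hits)]
```

Sorting nearest-first lets the renderer prioritize the guidance markers the user is most likely to need next.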
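
The thumb-on-finger slider of patent 12086324 (calibrating a graphical scale against a finger scale and mapping the thumb position onto it) reduces to projecting one tracked landmark onto a segment. The landmark names and 2-D geometry here are illustrative assumptions, not the patent's tracking pipeline.

```python
def slider_value(thumb_pos, finger_base, finger_tip, vmin=0.0, vmax=1.0):
    """Map the thumb's position along an extended finger to a slider value.

    `finger_base` and `finger_tip` are 2-D landmarks (e.g. from a hand
    tracker); the segment between them is the calibrated finger scale.
    """
    bx, by = finger_base
    tx, ty = finger_tip
    fx, fy = tx - bx, ty - by                # finger axis
    px, py = thumb_pos[0] - bx, thumb_pos[1] - by
    denom = fx * fx + fy * fy
    if denom == 0:
        return vmin                          # degenerate finger scale
    t = (px * fx + py * fy) / denom          # projection onto the finger axis
    t = max(0.0, min(1.0, t))                # clamp to the finger's extent
    return vmin + t * (vmax - vmin)
```

Clamping keeps the slider pinned at its endpoints when the thumb slides past the fingertip or the knuckle.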
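
The virtual-beam scoring of patent 12196954 (hits of the beam on target areas such as the head) can be modeled as a ray-sphere intersection test, with the beam cast along the device orientation or gaze direction. Modeling the target area as a sphere is an assumption for illustration.

```python
import math

def beam_hits(origin, direction, target_center, target_radius):
    """Ray-sphere test: does the virtual beam from `origin` along
    `direction` intersect the target area (modeled as a sphere)?"""
    ox, oy, oz = origin
    cx, cy, cz = target_center
    lx, ly, lz = cx - ox, cy - oy, cz - oz      # vector to target center
    norm = math.sqrt(sum(d * d for d in direction))
    dx, dy, dz = (d / norm for d in direction)  # normalize beam direction
    t = lx * dx + ly * dy + lz * dz             # projection onto the beam
    if t < 0:
        return False                            # target is behind the wearer
    # Squared distance from the target center to its closest point on the beam
    closest2 = (lx * lx + ly * ly + lz * lz) - t * t
    return closest2 <= target_radius ** 2
```

A score counter would then increment per frame in which the test returns True for the opponent's registered target area.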
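
The column-by-column playback in patent 12080261 (a global state map of token color values, each mapped to an instrument sample, played one column at a time) can be sketched as a flattening pass over a grid. The dictionary layout and sample names are hypothetical, not the patent's data structures.

```python
def playback_sequence(state_map, palette):
    """Flatten a global state map of token colors into a play sequence.

    `state_map` maps (row, col) -> detected token color; `palette` maps a
    color to an instrument sample name. Columns play in order, left to
    right, and each step lists the samples sounding together.
    """
    if not state_map:
        return []
    rows = max(r for r, _ in state_map) + 1
    cols = max(c for _, c in state_map) + 1
    sequence = []
    for col in range(cols):                     # one time step per column
        step = [palette[state_map[(row, col)]]
                for row in range(rows)
                if state_map.get((row, col)) in palette]
        sequence.append(step)
    return sequence
```

A real playback module would hand each step's samples to an audio mixer on a timer; here the sequence is just returned for inspection.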