Patents by Inventor Ilteris Canberk
Ilteris Canberk has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12196954
Abstract: Interactive augmented reality experiences with an eyewear device including a virtual eyewear beam. The user can direct the virtual beam by orienting the eyewear device or the user's eye gaze or both. The eyewear device may detect the direction of an opponent's eyewear device or eye gaze, or both. The eyewear device may calculate a score based on hits of the virtual beam of the user and the opponent on respective target areas, such as the other player's head or face.
Type: Grant
Filed: July 18, 2023
Date of Patent: January 14, 2025
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Jacob Knipfing
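The scoring idea in this abstract can be pictured as a ray test against an opponent's target area. The following is an illustrative sketch only, not code from the patent; the geometry, names, and values are assumptions.

```python
# Hypothetical sketch: scoring hits of a user-directed virtual beam on an
# opponent's target area, modeled as a ray-sphere proximity test.
import numpy as np

def beam_hits_target(origin, direction, target_center, target_radius):
    """Return True if a beam cast from `origin` along `direction` passes
    within `target_radius` of `target_center`."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)                       # unit beam direction
    to_target = np.asarray(target_center, dtype=float) - np.asarray(origin, dtype=float)
    t = float(np.dot(to_target, d))                 # distance along the beam
    if t < 0:                                       # target is behind the wearer
        return False
    closest = np.asarray(origin, dtype=float) + t * d
    return float(np.linalg.norm(np.asarray(target_center) - closest)) <= target_radius

score = 0
# The beam direction could come from head pose, eye gaze, or a blend of both.
if beam_hits_target(origin=[0, 0, 0], direction=[0, 0, -1],
                    target_center=[0.1, 0.0, -2.0], target_radius=0.25):
    score += 1
print("score:", score)
```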
-
Patent number: 12189861
Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device detects at least one of a hand gesture or movement of the user in the physical environment, and associates the hand gesture with a setting for a virtual object held within the memory of the eyewear device. The eyewear device may then change an attribute of the virtual object based on the detected hand gesture or movement. The eyewear device then provides an output corresponding to one or more attributes of the virtual object. The virtual object may be, for example, a music player or a virtual game piece.
Type: Grant
Filed: March 4, 2021
Date of Patent: January 7, 2025
Assignee: Snap Inc.
Inventor: Ilteris Canberk
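As a rough illustration of associating detected gestures with settings of a virtual object such as a music player, here is a minimal sketch; the gesture names and attributes are assumptions, not the patent's implementation.

```python
# Hypothetical sketch: a recognized hand gesture changes an attribute of a
# virtual object held in memory (here, a simple music player).
from dataclasses import dataclass

@dataclass
class VirtualMusicPlayer:
    volume: int = 5
    playing: bool = False

    def apply_gesture(self, gesture: str) -> str:
        # Gesture names below are illustrative assumptions.
        if gesture == "pinch":
            self.playing = not self.playing
        elif gesture == "swipe_up":
            self.volume = min(10, self.volume + 1)
        elif gesture == "swipe_down":
            self.volume = max(0, self.volume - 1)
        return f"playing={self.playing}, volume={self.volume}"

player = VirtualMusicPlayer()
for g in ["pinch", "swipe_up", "swipe_up"]:
    print(player.apply_gesture(g))   # output reflects the updated attributes
```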
-
Patent number: 12192740
Abstract: Devices, media, and methods are presented for an immersive augmented reality (AR) experience using an eyewear device with spatial audio. The eyewear device has a processor, a memory, an image sensor, and a speaker system. The eyewear device captures image information for an environment surrounding the device and identifies an object location within the same environment. The eyewear device then associates a virtual object with the identified object location. The eyewear device monitors the position of the device with respect to the virtual object and presents audio signals to alert the user that the identified object is in the environment.
Type: Grant
Filed: December 7, 2023
Date of Patent: January 7, 2025
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang
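One simple way to picture position-dependent audio cues of this kind is to derive a pan and loudness from the device pose relative to the virtual object. This is a sketch under assumed geometry, not the patent's method.

```python
# Illustrative sketch: a left/right pan and gain for an audio cue computed from
# the device position and heading relative to a virtual object in the scene.
import math

def spatial_audio_cue(device_pos, device_heading_rad, object_pos):
    dx = object_pos[0] - device_pos[0]
    dz = object_pos[1] - device_pos[1]
    distance = math.hypot(dx, dz)
    bearing = math.atan2(dx, dz) - device_heading_rad   # angle to the object
    pan = max(-1.0, min(1.0, math.sin(bearing)))         # -1 = left, +1 = right
    gain = 1.0 / (1.0 + distance)                         # quieter when farther away
    return pan, gain

pan, gain = spatial_audio_cue(device_pos=(0.0, 0.0),
                              device_heading_rad=0.0,
                              object_pos=(1.0, 2.0))
print(f"pan={pan:.2f}, gain={gain:.2f}")
```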
-
Patent number: 12169968
Abstract: Eyewear providing an interactive augmented reality experience between two users of eyewear devices to allow one user of an eyewear device to share a personal attribute of the user with a second user. The personal attribute can be selected from a list of attributes, such as Bitmojis®, avatars, symbols, and text. In an example, the personal attribute can reflect the mood of the first user. The personal attribute received by the first user is displayed proximate a portion of the first user, such as a head or mouth, on the display of the first user's eyewear device. The personal attribute may be displayed in a speech bubble proximate the first user.
Type: Grant
Filed: December 17, 2020
Date of Patent: December 17, 2024
Assignee: Snap Inc.
Inventor: Ilteris Canberk
-
Publication number: 20240393887
Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
Type: Application
Filed: August 8, 2024
Publication date: November 28, 2024
Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
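The thumb-slide calibration described above can be approximated by projecting the thumb tip onto the segment between the finger base and fingertip. The sketch below is hypothetical; the landmark names and coordinates are assumptions.

```python
# Hypothetical sketch: mapping the thumb tip position along an extended finger
# onto a calibrated slider value in [0, 1].
import numpy as np

def slider_value(finger_base, finger_tip, thumb_tip):
    """Project the thumb tip onto the finger segment (the "finger scale") and
    return a normalized position on the calibrated graphical scale."""
    base = np.asarray(finger_base, dtype=float)
    tip = np.asarray(finger_tip, dtype=float)
    thumb = np.asarray(thumb_tip, dtype=float)
    axis = tip - base
    t = np.dot(thumb - base, axis) / np.dot(axis, axis)
    return float(np.clip(t, 0.0, 1.0))

# Thumb roughly 60% of the way from the finger base to the fingertip.
print(slider_value((0, 0), (10, 0), (6, 1)))   # -> 0.6
```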
-
Publication number: 20240397033
Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
Type: Application
Filed: August 7, 2024
Publication date: November 28, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
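One possible way to model the session state described here is a small data structure tracking who shares what and where annotations should be displayed. This is a sketch under stated assumptions, not the patent's protocol; all names are hypothetical.

```python
# Hypothetical data model: an always-on session where each participant
# selectively shares audio and/or video, and annotations are routed back to
# the participant whose shared video they refer to.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Participant:
    name: str
    share_audio: bool = False
    share_video: bool = False

@dataclass
class Session:
    participants: Dict[str, Participant] = field(default_factory=dict)
    annotations: Dict[str, List[str]] = field(default_factory=dict)

    def join(self, p: Participant) -> None:
        self.participants[p.name] = p

    def annotate(self, target: str, note: str) -> None:
        # Only meaningful when the target is actually sharing a video stream.
        if target in self.participants and self.participants[target].share_video:
            self.annotations.setdefault(target, []).append(note)

session = Session()
session.join(Participant("alice", share_video=True))
session.join(Participant("bob", share_audio=True))
session.annotate("alice", "circle the cabinet on the left")
print(session.annotations)   # {'alice': ['circle the cabinet on the left']}
```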
-
Patent number: 12151138
Abstract: Example systems, devices, media, and methods are described for evaluating movements and physical exercises in augmented reality using the display of an eyewear device. A motion evaluation application implements and controls the capturing of frames of motion data using an inertial measurement unit (IMU) on the eyewear device. The method includes presenting virtual targets on the display, localizing the current eyewear device location based on the captured motion data, and presenting virtual indicators on the display. The virtual targets represent goals or benchmarks for the user to achieve using body postures. The method includes determining whether the eyewear device location represents an intersecting posture relative to the virtual targets, based on the IMU data. The virtual indicators display real-time feedback about user posture or performance relative to the virtual targets.
Type: Grant
Filed: January 26, 2022
Date of Patent: November 26, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Ivan Fekete, Shin Hwun Kang, Dmytro Kucher, Ihor Kuzin, Vernon James Carlos Manlapaz, Artur Sydoran
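A minimal way to picture the "intersecting posture" check is a distance test between the localized device position and a virtual target volume. The sketch below uses assumed geometry and numbers; it is not the patent's method.

```python
# Hypothetical sketch: does the IMU-localized eyewear position fall within a
# virtual target placed in the environment (an "intersecting posture")?
import math

def intersects_target(device_location, target_center, target_radius):
    return math.dist(device_location, target_center) <= target_radius

# e.g. a squat target placed below standing head height (coordinates in metres)
device_location = (0.02, 1.32, 0.05)    # from localization on the motion data
target_center = (0.0, 1.30, 0.0)
print(intersects_target(device_location, target_center, target_radius=0.15))  # True
```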
-
Patent number: 12108011
Abstract: Systems, devices, media, and methods are presented for producing an augmented reality (AR) experience for display on a smart eyewear device. The AR production system includes a marker registration utility for setting and storing markers, a localization utility for locating the eyewear device relative to a marker location and to the mapped environment, and a virtual object rendering utility for presenting one or more virtual objects having a desired size, shape, and orientation. A high-definition camera captures an input image of the environment. If the input image includes a marker, the system retrieves from memory a set of data including a first marker location expressed in terms relative to a marker coordinate system. The localization utility determines a local position of the eyewear device relative to the marker location.
Type: Grant
Filed: December 31, 2021
Date of Patent: October 1, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Kristina Marrero
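Localizing against a stored marker amounts to a change of coordinate frames: a pose observed relative to the marker is re-expressed in the mapped environment. Here is a simplified 2D sketch under assumed conventions, not the patent's algorithm.

```python
# Illustrative 2D sketch: express the eyewear device position in map
# coordinates given a stored marker pose and the device position observed in
# the marker coordinate system.
import math

def device_in_map(marker_pos_map, marker_yaw_map, device_pos_marker):
    """marker_pos_map: (x, y) of the marker in the map frame.
    marker_yaw_map: marker orientation in the map frame, radians.
    device_pos_marker: device position expressed in the marker frame."""
    c, s = math.cos(marker_yaw_map), math.sin(marker_yaw_map)
    dx, dy = device_pos_marker
    return (marker_pos_map[0] + c * dx - s * dy,
            marker_pos_map[1] + s * dx + c * dy)

# Device observed 1 m along the x-axis of a marker at (3, 2) rotated 90 degrees.
print(device_in_map((3.0, 2.0), math.pi / 2, (1.0, 0.0)))   # ~ (3.0, 3.0)
```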
-
Patent number: 12088781
Abstract: Systems and methods are described for selectively sharing audio and video streams amongst electronic eyewear devices. Each electronic eyewear device includes a camera arranged to capture a video stream in an environment of the wearer, a microphone arranged to capture an audio stream in the environment of the wearer, and a display. A processor of each electronic eyewear device executes instructions to establish an always-on session with other electronic eyewear devices and selectively shares an audio stream, a video stream, or both with other electronic eyewear devices in the session. Each electronic eyewear device also generates and receives annotations from other users in the session for display with the selectively shared video stream on the display of the electronic eyewear device that provided the selectively shared video stream. The annotation may include manipulation of an object in the shared video stream or overlay images registered with the shared video stream.
Type: Grant
Filed: December 30, 2021
Date of Patent: September 10, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Shin Hwun Kang, Sven Kratz, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
-
Patent number: 12086324
Abstract: Example systems, devices, media, and methods are described for controlling virtual elements or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined series of hand gestures. Each predefined series of hand gestures is associated with an action. The system controls movement of the virtual element, relative to the display, in accordance with the associated action. In an example hand shape that includes a thumb sliding along an extended finger, the system establishes a finger scale along the extended finger, calibrates a graphical scale with the finger scale, and controls movement of an interactive graphical element, such as a slider, according to the current thumb position relative to the calibrated graphical scale.
Type: Grant
Filed: December 14, 2021
Date of Patent: September 10, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Viktoria Hwang, Shin Hwun Kang, David Meisenholder, Daniel Moreno
-
Publication number: 20240296633
Abstract: Augmented reality experiences with an eyewear device including a position detection system and a display system are provided. The eyewear device acquires text from a user input, a data input, or a speech recognition system. The display system presents a visual text graphic on the display at a predefined location with respect to the eyewear. The eyewear device allows the user to manipulate the visual text graphic in a number of ways.
Type: Application
Filed: April 25, 2024
Publication date: September 5, 2024
Inventors: Ilteris Canberk, Shin Hwun Kang, Daniel Moreno
-
Patent number: 12080261
Abstract: Systems, devices, media, and methods are presented for playing audio sounds, such as music, on a portable electronic device using a digital color image of a note matrix on a map. A computer vision engine, in an example implementation, includes a mapping module, a color detection module, and a music playback module. The camera captures a color image of the map, including a marker and a note matrix. Based on the color image, the computer vision engine detects a token color value associated with each field. Each token color value is associated with a sound sample from a specific musical instrument. A global state map is stored in memory, including the token color value and location of each field in the note matrix. The music playback module, for each column, in order, plays the notes associated with one or more of the rows, using the corresponding sound sample, according to the global state map.
Type: Grant
Filed: May 2, 2023
Date of Patent: September 3, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Donald Giovannini, Sana Park
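The column-by-column playback over a stored global state map can be sketched as a simple sequencer loop. The colors, sample names, and tempo below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch: step through a note-matrix "global state map" column by
# column and trigger the sound sample associated with each token color value.
import time

SAMPLE_FOR_COLOR = {"red": "drum.wav", "green": "piano_c4.wav", "blue": "bass_e2.wav"}

# global_state_map[column][row] -> token color value ("" means an empty field)
global_state_map = [
    ["red", "", "blue"],
    ["", "green", ""],
    ["red", "green", "blue"],
]

def play_sample(path: str) -> None:
    print("playing", path)            # stand-in for a real audio playback call

def play(state_map, tempo_bpm=120):
    seconds_per_column = 60.0 / tempo_bpm
    for column in state_map:          # columns in order = time steps
        for color in column:          # rows = instruments or pitches
            if color:
                play_sample(SAMPLE_FOR_COLOR[color])
        time.sleep(seconds_per_column)

play(global_state_map)
```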
-
Patent number: 12066634
Abstract: The present application discloses examples of various apparatuses and systems that can be utilized for augmented reality. According to one example, a wearable device can optionally comprise: a frame configured for wearing by a user; one or more optical elements mounted on the frame; an array having a plurality of light emitting diodes coupled to the one or more optical elements, wherein the one or more optical elements and the array are mounted within a field of view of the user when the frame is worn by the user; and additional onboard electronic components carried by the frame including at least a battery that is configured to provide for electrically powered operation of the array.
Type: Grant
Filed: August 17, 2023
Date of Patent: August 20, 2024
Assignee: Snap Inc.
Inventors: Robert Matthew Bates, Ilteris Canberk, Brandon Carrillo, David G. Fliszar, Adam Douglas Greengard, Kenneth Kubala, David Meisenholder, Jonathan M Rodriguez, II, Amit Singh, Samuel Thompson
-
Publication number: 20240272432
Abstract: Augmented reality experiences of a user wearing an electronic eyewear device are captured by at least one camera on a frame of the electronic eyewear device, the at least one camera having a field of view that is larger than a field of view of a display of the electronic eyewear device. An augmented reality feature or object is applied to the captured scene. A photo or video of the augmented reality scene is captured and a first portion of the captured photo or video is displayed in the display. The display is adjusted to display a second portion of the captured photo or video with the augmented reality features as the user moves the user's head to view the second portion of the captured photo or video. The captured photo or video may be transferred to another device for viewing the larger field of view augmented reality image.
Type: Application
Filed: April 24, 2024
Publication date: August 15, 2024
Inventors: David Meisenholder, Dhritiman Sagar, Ilteris Canberk, Justin Wilder, Sumant Milind Hanumante, James Powderly
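Showing a narrower display window into a wider capture as the head turns can be pictured as choosing a crop offset from the head yaw. The field-of-view numbers and pixel sizes below are assumptions for illustration only.

```python
# Simplified sketch: pick which horizontal slice of a wider captured image to
# show on a narrower display as the wearer turns their head.
def crop_window(image_width_px, camera_fov_deg, display_fov_deg, head_yaw_deg):
    px_per_deg = image_width_px / camera_fov_deg
    window_px = int(display_fov_deg * px_per_deg)
    center = image_width_px / 2 + head_yaw_deg * px_per_deg
    left = int(max(0, min(image_width_px - window_px, center - window_px / 2)))
    return left, left + window_px

# 4000 px capture spanning 110 degrees, 45 degree display, head turned 20 degrees right
print(crop_window(4000, 110.0, 45.0, 20.0))   # -> (1909, 3545)
```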
-
Publication number: 20240249640
Abstract: Systems, devices, media, and methods are described for presenting a tutorial in augmented reality on the display of a smart eyewear device. The system includes a marker registration utility for setting a marker on a musical instrument, a localization utility, a virtual object rendering utility for presenting virtual tutorial objects on the display near the instrument, and a hand tracking utility for tracking the performer's finger locations in real time. The virtual tutorial objects, in one example, include graphical elements presented on a virtual scroll that appears to move toward the instrument at a speed correlated with the song tempo. The hand tracking utility calculates a set of expected fingertip coordinates based on a detected hand shape.
Type: Application
Filed: March 7, 2024
Publication date: July 25, 2024
Inventors: Ilteris Canberk, Dmytro Kucher
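Two pieces of this description lend themselves to a short sketch: advancing the tutorial scroll at a tempo-correlated speed, and comparing detected fingertip positions with expected ones. Everything below (units, tolerances, coordinates) is an assumption, not the patent's renderer.

```python
# Hypothetical sketch: tempo-driven scroll offset and a fingertip comparison.
import math

def scroll_offset_m(elapsed_s, tempo_bpm, metres_per_beat=0.05):
    """Distance the virtual scroll has moved toward the instrument."""
    return elapsed_s * (tempo_bpm / 60.0) * metres_per_beat

def fingertips_match(expected, detected, tolerance_m=0.02):
    """True if every detected fingertip is near its expected coordinate."""
    return all(math.dist(e, d) <= tolerance_m for e, d in zip(expected, detected))

print(f"{scroll_offset_m(elapsed_s=2.0, tempo_bpm=90):.2f} m")    # 0.15 m
expected = [(0.00, 0.10), (0.03, 0.10)]
detected = [(0.01, 0.10), (0.03, 0.11)]
print(fingertips_match(expected, detected))                        # True
```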
-
Patent number: 12039649
Abstract: A server machine modifies an augmented reality (AR) object in response to fulfillment of a condition. The machine provides, to a user device, object data that defines the AR object. The object data specifies a physical geolocation of the AR object, a presentation attribute of the AR object, a conditional modification program, and a trigger condition for execution of the conditional modification program. The object data causes the user device to present the AR object with a first appearance, located at the physical geolocation. The machine detects fulfillment of the trigger condition, and in response, the machine executes the conditional modification program. This modifies the object data by modifying the presentation attribute. The machine provides, to the user device, the modified object data, which causes the user device to present the AR object with a second appearance based on the modified presentation attribute.
Type: Grant
Filed: April 25, 2023
Date of Patent: July 16, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Andrés Monroy-Hernández, Rajan Vaish
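The object-data structure described here pairs a presentation attribute with a trigger condition and a modification to run when the condition is met. The sketch below shows that pattern with made-up condition and attribute names; it is not the patent's implementation.

```python
# Hypothetical sketch: AR object data with a trigger condition and a
# conditional modification of its presentation attribute.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ARObjectData:
    geolocation: tuple
    presentation: Dict[str, str]
    trigger: Callable[[dict], bool]
    modify: Callable[[Dict[str, str]], None]

    def check_and_modify(self, world_state: dict) -> bool:
        if self.trigger(world_state):
            self.modify(self.presentation)   # first appearance -> second appearance
            return True
        return False

obj = ARObjectData(
    geolocation=(40.7128, -74.0060),
    presentation={"color": "blue"},
    trigger=lambda state: state.get("visits", 0) >= 100,   # assumed condition
    modify=lambda p: p.update(color="gold"),
)
obj.check_and_modify({"visits": 120})
print(obj.presentation)   # {'color': 'gold'}
```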
-
Publication number: 20240211057
Abstract: Eyewear configured to identify movements of a remote physical object to provide a 3D painting interactive augmented reality experience. In an example, the eyewear uses the identified movements of the remote physical object as a virtual brush to create a three-dimensional (3D) virtual object on an eyewear display. The eyewear determines positional information of the remote physical object. The eyewear uses the positional information and responsively displays and edits the virtual object as a function of the relative 3D positioning of the remote physical object.
Type: Application
Filed: March 8, 2024
Publication date: June 27, 2024
Inventors: Ilteris Canberk, Daniel Moreno
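Using a tracked object as a virtual brush can be pictured as appending its successive 3D positions to a stroke while painting is active. This is a minimal sketch with assumed names, not the patent's implementation.

```python
# Hypothetical sketch: successive tracked positions of a physical object become
# points of a 3D virtual stroke while the "paint" control is active.
from typing import List, Tuple

Point3D = Tuple[float, float, float]

class VirtualStroke:
    def __init__(self) -> None:
        self.points: List[Point3D] = []

    def update(self, object_position: Point3D, painting: bool) -> None:
        if painting:
            self.points.append(object_position)   # extend the 3D drawing

stroke = VirtualStroke()
tracked = [((0.0, 0.0, 0.5), True), ((0.0, 0.1, 0.5), True), ((0.0, 0.2, 0.5), False)]
for position, painting in tracked:
    stroke.update(position, painting)
print(len(stroke.points))   # 2 points recorded while painting was active
```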
-
Patent number: 12013985
Abstract: Example systems, devices, media, and methods are described for controlling the presentation of a series of virtual or graphical elements on a display in response to hand gestures detected by an eyewear device that is capturing frames of video data with its camera system. An image processing system detects a series of hand shapes in the video data and determines whether it matches a predefined hand gesture. Each predefined hand gesture is associated with an action; for example, a leafing gesture is associated with a scrolling action. The system controls the display of a series of virtual elements, in accordance with the associated action. In an example series of hand shapes that includes flexing and extending the fingers of a single hand severally and continually in a leafing motion, the matching predefined leafing gesture is associated with a scrolling action, which displays the series of items in a display order, sequentially and in accordance with the detected speed of the moving fingers.
Type: Grant
Filed: January 31, 2022
Date of Patent: June 18, 2024
Assignee: Snap Inc.
Inventors: Karen Stolzenberg, Ilteris Canberk
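The leafing-to-scrolling mapping can be sketched as advancing through an item list at a rate tied to the detected finger speed. The thresholds and gesture label below are assumptions for illustration only.

```python
# Hypothetical sketch: a recognized "leafing" gesture scrolls a list of items,
# with faster finger motion advancing more items per update.
def scroll_items(items, start_index, gesture, finger_speed):
    if gesture != "leafing":
        return start_index
    step = 1 if finger_speed < 0.5 else 2 if finger_speed < 1.0 else 3
    return min(len(items) - 1, start_index + step)

items = [f"card {i}" for i in range(10)]
index = scroll_items(items, start_index=0, gesture="leafing", finger_speed=0.8)
print(items[index])   # 'card 2'
```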
-
Patent number: 12014645
Abstract: Systems, devices, media, and methods are described for presenting a tutorial in augmented reality on the display of a smart eyewear device. The system includes a marker registration utility for setting a marker on a musical instrument, a localization utility, a virtual object rendering utility for presenting virtual tutorial objects on the display near the instrument, and a hand tracking utility for tracking the performer's finger locations in real time. The virtual tutorial objects, in one example, include graphical elements presented on a virtual scroll that appears to move toward the instrument at a speed correlated with the song tempo. The hand tracking utility calculates a set of expected fingertip coordinates based on a detected hand shape.
Type: Grant
Filed: August 3, 2022
Date of Patent: June 18, 2024
Assignee: Snap Inc.
Inventors: Ilteris Canberk, Dmytro Kucher
-
Patent number: 12008153
Abstract: Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker for an interaction virtual game piece. The eyewear device monitors its position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a "spheroidal balancing" augmented reality experience.
Type: Grant
Filed: November 16, 2022
Date of Patent: June 11, 2024
Assignee: Snap Inc.
Inventors: Shin Hwun Kang, Ilteris Canberk, James Powderly, Dmytro Kucher, Dmytro Hovorov
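The interplay of the two game pieces can be pictured as updating the user-controlled piece from the tracked eyewear pose and scoring on proximity to the interaction piece. This is an illustrative sketch with assumed distances and rules, not the patent's game logic.

```python
# Hypothetical sketch: the user-controlled game piece follows the eyewear pose
# and a point is scored when it comes within range of the interaction piece.
import math

class BalancingGame:
    def __init__(self, user_marker, interaction_marker, hit_radius=0.1):
        self.user_piece = list(user_marker)                 # registered first marker
        self.interaction_piece = list(interaction_marker)   # registered second marker
        self.hit_radius = hit_radius
        self.score = 0

    def on_device_pose(self, position):
        self.user_piece = list(position)                    # piece tracks the eyewear
        if math.dist(self.user_piece, self.interaction_piece) <= self.hit_radius:
            self.score += 1

game = BalancingGame(user_marker=(0.0, 0.0, 0.0), interaction_marker=(0.0, 0.0, 0.5))
for pose in [(0.0, 0.0, 0.2), (0.0, 0.0, 0.45)]:
    game.on_device_pose(pose)
print(game.score)   # 1
```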