Patents by Inventor Yu Jiang Tham

Yu Jiang Tham has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11985175
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, method, and user interface to facilitate augmented reality based communication between multiple users over a network. Session configuration data including configuration parameters of a virtual interaction session with a first user is received from a first device. The configuration parameters include an identifier of a second user that is permitted to join the virtual interaction session and a micro-chat duration that defines a time limit for a real-time communication link between the first and second user during the virtual interaction session. The real-time communication link between the first and second user is established by causing display, by the second device, of a live camera feed generated at the first device. Upon expiration of the micro-chat duration, the real-time communication link between the first and second user is terminated.
    Type: Grant
    Filed: March 19, 2021
    Date of Patent: May 14, 2024
    Assignee: Snap Inc.
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish, Hemant Surale
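The micro-chat mechanism in the abstract above can be sketched as a simple session model. All names and fields here (`VirtualInteractionSession`, `open_link`, `link_active`) are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class VirtualInteractionSession:
    """Illustrative model of the session configuration described above."""
    host_user: str
    permitted_user: str          # identifier of the second user allowed to join
    micro_chat_duration: float   # time limit (seconds) for the real-time link
    _link_started_at: Optional[float] = field(default=None, repr=False)

    def open_link(self, joining_user: str, now: float) -> bool:
        """Open the real-time communication link if the joining user is permitted."""
        if joining_user != self.permitted_user:
            return False
        self._link_started_at = now
        return True

    def link_active(self, now: float) -> bool:
        """The link terminates once the micro-chat duration has elapsed."""
        if self._link_started_at is None:
            return False
        return (now - self._link_started_at) < self.micro_chat_duration


session = VirtualInteractionSession("alice", "bob", micro_chat_duration=30.0)
assert session.open_link("bob", now=0.0)
assert session.link_active(now=10.0)       # within the time limit
assert not session.link_active(now=30.0)   # duration expired, link terminated
```

Injecting the clock as a parameter (rather than calling `time.time()` inside) keeps the expiration logic trivially testable.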
  • Patent number: 11968460
    Abstract: A system including image capture eyewear, a processor, and a memory. The image capture eyewear includes a support structure, a selector connected to the support structure, a display system (e.g., LEDs or a display) connected to the support structure to distinctly display assignable recipient markers, and a camera connected to the support structure to capture an image of a scene. The processor executes programming in the memory to assign recipients to the assignable recipient markers, receive a captured image of the scene, receive an indicator associated with the assignable recipient markers distinctly displayed at the time the image of the scene was captured, and transmit the captured image to the recipient assigned to the distinctly displayed assignable recipient markers.
    Type: Grant
    Filed: July 27, 2022
    Date of Patent: April 23, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Mitchell Bechtold, Antoine Ménard
  • Patent number: 11963105
    Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
    Type: Grant
    Filed: February 10, 2023
    Date of Patent: April 16, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
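The power-state sequencing described in the abstract above (boot both subsystems, hand off assistance data, return the high-speed circuitry to low power before the fix completes) can be sketched as a hypothetical controller; the class, event names, and assistance-data contents are assumptions for illustration:

```python
class DeviceController:
    """Illustrative sketch of the location-fix power sequencing described above."""

    def __init__(self):
        self.high_speed_on = False
        self.location_on = False
        self.assistance_data = None
        self.events = []

    def initiate_location_fix(self):
        # Boot the high-speed and location circuitry from their low-power states.
        self.high_speed_on = True
        self.location_on = True
        self.events.append("boot")
        # Assistance data flows automatically on initiation of the fix,
        # reducing time to first fix.
        self.assistance_data = {"last_known_position": (0.0, 0.0)}
        self.events.append("assist")
        # High-speed circuitry returns to low power before the fix completes.
        self.high_speed_on = False
        self.events.append("hs_sleep")

    def complete_fix(self):
        self.location_on = False
        self.events.append("fix_done")


ctrl = DeviceController()
ctrl.initiate_location_fix()
ctrl.complete_fix()
assert ctrl.events == ["boot", "assist", "hs_sleep", "fix_done"]
```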
  • Patent number: 11954774
    Abstract: Systems and methods enable users to build augmented reality (AR) experiences with Internet of Things (IoT) devices. The system includes an AR object studio that includes a list of IoT devices and control signals for the respective IoT devices and a list of AR objects (e.g., an AR lens). The AR object studio receives selections from users and correlates at least one IoT device to at least one AR object in response to the user selections. During use, a server receives an indication that an AR object has been activated and interacted with on a display of an AR camera device and, in response, sends a control signal to a correlated IoT device. Conversely, the server may receive a signal from an IoT device and, in response, present and control a correlated AR object on the display of the AR camera device.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: April 9, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
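The correlation step in the abstract above (pair an AR object with an IoT device and its control signal, then dispatch on activation) can be sketched as a minimal registry. The class name and registry layout are illustrative assumptions, not the patent's API:

```python
class ARObjectStudio:
    """Illustrative sketch of correlating AR objects with IoT control signals."""

    def __init__(self):
        # AR object id -> (IoT device id, control signal)
        self._correlations = {}

    def correlate(self, ar_object, iot_device, control_signal):
        self._correlations[ar_object] = (iot_device, control_signal)

    def on_ar_object_activated(self, ar_object):
        """Server-side handler: an AR interaction triggers the correlated device."""
        device, signal = self._correlations[ar_object]
        return {"device": device, "signal": signal}


studio = ARObjectStudio()
studio.correlate("lamp_lens", iot_device="desk_lamp", control_signal="toggle_power")
command = studio.on_ar_object_activated("lamp_lens")
assert command == {"device": "desk_lamp", "signal": "toggle_power"}
```

The reverse direction the abstract mentions (IoT signal drives an AR object) would be a second lookup table keyed by device id.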
  • Publication number: 20240110807
    Abstract: Eyewear having a light array and vibration devices for indicating to a user when to turn left and right, such as when the eyewear is operating a navigation map application, improving the user experience of eyewear devices for users having partial blindness or complete blindness. To compensate for partial blindness, the front portion of the eyewear frame, such as the bridge, may include the light array, where the left portion of the light array is illuminated to indicate to the user to turn left. The right portion of the light array is illuminated to indicate to the user to turn right. To compensate for complete blindness, the eyewear has a vibration device on each side of the eyewear, such as in the temples, which selectively vibrate to indicate when the user should turn left and right.
    Type: Application
    Filed: December 6, 2023
    Publication date: April 4, 2024
    Inventors: Stephen Pomes, Yu Jiang Tham
  • Patent number: 11947198
    Abstract: A case for an eyewear device having a conductive interface includes a housing that receives the eyewear device. A multi-purpose interface, supported by the housing, includes at least one contact arranged to couple with the conductive interface of the eyewear device when the housing receives the eyewear device. Circuitry is coupled to the at least one contact and includes a processor that detects a connection of the conductive interface of the eyewear device to the multi-purpose interface of the case. The processor performs a charging process during a charge state of the case in which an electrical charge is provided at the multi-purpose interface of the case to the eyewear device. Data is exchanged with the eyewear device during a communication state of the case.
    Type: Grant
    Filed: March 9, 2022
    Date of Patent: April 2, 2024
    Assignee: Snap Inc.
    Inventors: Miran Alhaideri, Alex Bamberger, Gerald Nilles, Russell Douglas Patton, Yu Jiang Tham
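The charge/communication behavior in the abstract above reads naturally as a small state machine: the case detects the conductive connection, provides charge in its charge state, and exchanges data in its communication state. State names and methods here are illustrative assumptions:

```python
class EyewearCase:
    """Illustrative sketch of the case states described above."""

    def __init__(self):
        self.state = "idle"
        self.connected = False

    def on_connect(self):
        # Processor detects the eyewear's conductive interface coupling
        # with the case's multi-purpose interface contacts.
        self.connected = True
        self.state = "charge"

    def enter_communication_state(self):
        self.state = "communicate"

    def provide(self):
        """What the multi-purpose interface supplies in the current state."""
        if not self.connected:
            return None
        if self.state == "charge":
            return "electrical_charge"
        if self.state == "communicate":
            return "data_exchange"
        return None


case = EyewearCase()
case.on_connect()
assert case.provide() == "electrical_charge"
case.enter_communication_state()
assert case.provide() == "data_exchange"
```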
  • Patent number: 11941231
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: March 26, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
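The flow in the abstract above (pairing establishes which action identifiers control which device; context information modifies the resulting command) can be sketched as a lookup-and-merge. The pairing table layout and field names are assumptions for illustration:

```python
def build_command(pairings, action_id, context=None):
    """Return the command to send to the paired IoT device, or None.

    pairings: {action_id: (device_id, base_command_dict)} established at pairing.
    context: optional dict of context info that modifies the command
             (e.g. an intensity derived from gesture speed).
    """
    if action_id not in pairings:
        return None  # gesture not established for any paired device
    device, base_command = pairings[action_id]
    command = dict(base_command)
    if context:
        command.update(context)
    return {"device": device, "command": command}


pairings = {
    "raise_hand": ("ceiling_light", {"action": "set_brightness"}),
    "thumbs_up": ("thermostat", {"action": "temp_up"}),
}
cmd = build_command(pairings, "raise_hand", context={"level": 80})
assert cmd == {"device": "ceiling_light",
               "command": {"action": "set_brightness", "level": 80}}
assert build_command(pairings, "wave") is None
```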
  • Patent number: 11943192
    Abstract: The online co-location connection service is provided by a messaging system configured to selectively pair user profiles associated with respective client devices equipped with sensors that communicate with each other within a predetermined physical proximity range. The pairing is effectuated without requiring that the two client devices, at the time of pairing, are within a communication range permitted by their respective short range communication sensors and without requiring a communication between the first client device and the second client device via a short-range wireless communication technology. Subsequent to the pairing, the messaging system monitors physical proximity of the client devices based on the sensor data obtained by the co-location connection service from the respective messaging clients executing at the respective client devices.
    Type: Grant
    Filed: May 11, 2022
    Date of Patent: March 26, 2024
    Assignee: Snap Inc.
    Inventors: Andrés Monroy-Hernández, Yu Jiang Tham
  • Publication number: 20240097461
    Abstract: Methods and devices for wired charging and communication with a wearable device are described. In one embodiment, a symmetrical contact interface comprises a first contact pad and a second contact pad, and particular wired circuitry is coupled to the first and second contact pads to enable charging as well as receive and transmit communications via the contact pads as part of various device states.
    Type: Application
    Filed: November 27, 2023
    Publication date: March 21, 2024
    Inventors: Yu Jiang Tham, Nicolas Larson, Peter Brook, Russell Douglas Patton, Miran Alhaideri, Zhihao Hong
  • Patent number: 11935198
    Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, a second device associated with the first user presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: March 19, 2024
    Assignee: Snap Inc.
    Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
  • Publication number: 20240082697
    Abstract: Context-sensitive remote controls for use with electronic devices (e.g., eyewear device). The electronic device is configured to perform activities (e.g., email, painting, navigation, gaming). The context-sensitive remote control includes a display having a display area, a display driver coupled to the display, and a transceiver. The remote control additionally includes memory that stores controller layout configurations for display in the display area of the display by the display driver. A processor in the context-sensitive remote control is configured to establish, via the transceiver, communication with an electronic device, detect an activity currently being performed by the electronic device, select one of the controller layout configurations responsive to the detected activity, and present, via the display driver, the selected controller layout configuration in the display area of the display.
    Type: Application
    Filed: November 15, 2023
    Publication date: March 14, 2024
    Inventors: Simon Nielsen, Jonathan Rodriguez, Yu Jiang Tham
  • Publication number: 20240077933
    Abstract: A method for detecting full-body gestures by a mobile device includes a host mobile device detecting the tracked body of a co-located participant in a multi-party session. When the participant's tracked body provides a full-body gesture, the host's mobile device recognizes that there is a tracked body providing a full-body gesture. The host mobile device iterates through the list of participants in the multi-party session and finds the closest participant mobile device with respect to the screen-space position of the head of the gesturing participant. The host mobile device then obtains the user ID of the closest participant mobile device and broadcasts the recognized full-body gesture event to all co-located participants in the multi-party session, along with the obtained user ID. Each participant's mobile device may then handle the gesture event as appropriate for the multi-party session. For example, a character or costume may be assigned to a gesturing participant.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
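The host-side lookup in the abstract above (find the participant device closest in screen space to the gesturing body's head, then broadcast the gesture event with that user ID) can be sketched directly; the data shapes here are illustrative assumptions:

```python
import math


def closest_participant(head_xy, participants):
    """participants: {user_id: (x, y)} screen-space positions of devices."""
    return min(
        participants,
        key=lambda uid: math.dist(head_xy, participants[uid]),
    )


def broadcast_gesture(gesture, head_xy, participants):
    """Attach the closest participant's user ID and address all co-located devices."""
    uid = closest_participant(head_xy, participants)
    return {"event": gesture, "user_id": uid, "recipients": sorted(participants)}


participants = {"u1": (120, 80), "u2": (400, 90), "u3": (260, 300)}
event = broadcast_gesture("t_pose", head_xy=(390, 100), participants=participants)
assert event["user_id"] == "u2"
assert event["recipients"] == ["u1", "u2", "u3"]
```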
  • Publication number: 20240077934
    Abstract: Described are virtual AR interfaces for generating a virtual rotational interface for the purpose of controlling connected IoT devices using the inertial measurement unit (IMU) of a portable electronic device. The IMU control application enables a user of a portable electronic device to activate a virtual rotational interface overlay on a display and adjust a feature of a connected IoT product by rotating a portable electronic device. The device IMU moves a slider on the virtual rotational interface. The IMU control application sends a control signal to the IoT product which executes an action in accordance with the slider position. The virtual rotational interface is presented on the display as a virtual object in an AR environment. The IMU control application detects the device orientation (in the physical environment) and in response presents a corresponding slider element on the virtual rotational interface (in the AR environment).
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
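The mapping in the abstract above (IMU-reported device rotation moves a slider, and the slider position becomes the IoT control value) can be sketched as a clamp-and-scale. The angle range, value range, and product/feature names are assumptions for illustration:

```python
def roll_to_slider(roll_deg, min_deg=-90.0, max_deg=90.0):
    """Clamp the IMU roll angle and map it linearly to a 0.0-1.0 slider position."""
    roll = max(min_deg, min(max_deg, roll_deg))
    return (roll - min_deg) / (max_deg - min_deg)


def control_signal(roll_deg, product="lamp", feature="brightness"):
    """Build the control signal sent to the IoT product for the slider position."""
    level = round(roll_to_slider(roll_deg) * 100)
    return {"product": product, "feature": feature, "value": level}


assert roll_to_slider(-90) == 0.0
assert roll_to_slider(0) == 0.5
assert control_signal(90) == {"product": "lamp", "feature": "brightness", "value": 100}
```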
  • Publication number: 20240077983
    Abstract: Recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240079031
    Abstract: Described are authoring tools for creating interactive AR experiences. The story-authoring application enables a user with little or no programming skills to create an interactive story that includes recording voice commands for advancing to the next scene, inserting and manipulating virtual objects in a mixed-reality environment, and recording a variety of interactions with connected IoT devices. The story creation interface is presented on the display as a virtual object in an AR environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Daekun Kim, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240078759
    Abstract: Multi-player co-located AR experiences are augmented by assigning characters and costumes to respective participants (a.k.a. “users” of AR-enabled mobile devices) in multi-player AR sessions for storytelling, play acting, and the like. Body tracking technology and augmented reality (AR) software are used to decorate the bodies of the co-located participants with virtual costumes within the context of the multi-player co-located AR experiences. Tracked bodies are distinguished to determine which body belongs to which user and hence which virtual costume belongs to which tracked body so that corresponding costumes may be assigned for display in augmented reality. A host-guest mechanism is used for networked assignment of characters and corresponding costumes in the co-located multi-player AR session. Body tracking technology is used to move the costume with the body as movement of the assigned body is detected.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077935
    Abstract: A virtual interface application presented in augmented reality (AR) is described for controlling Internet of Things (IoT) products. The virtual interface application enables a user of a portable electronic device to activate a virtual control interface overlay on a display, receive a selection from the user using her hands or feet, and send a control signal to a nearby IoT product which executes an action in accordance with the selection. The virtual control interface is presented on the display as a virtual object in an AR environment. The virtual interface application includes a foot tracking tool for detecting an intersection between the foot location (in the physical environment) and the virtual surface position (in the AR environment). When an intersection is detected, the virtual interface application sends a control signal with instructions to the IoT product.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Youjean Cho, Lei Zhang, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
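The foot-tracking step in the abstract above (detect an intersection between the tracked foot position and a virtual control's position, then send the control signal) can be sketched with simplified 2D geometry on the virtual surface; the button layout and signal names are hypothetical:

```python
def foot_hits_button(foot_xy, button_center, button_radius):
    """True when the tracked foot position falls inside a circular button region."""
    dx = foot_xy[0] - button_center[0]
    dy = foot_xy[1] - button_center[1]
    return dx * dx + dy * dy <= button_radius * button_radius


def handle_frame(foot_xy, buttons):
    """buttons: {name: (center, radius)} -> control signal for the IoT product."""
    for name, (center, radius) in buttons.items():
        if foot_hits_button(foot_xy, center, radius):
            return {"signal": name}
    return None  # no intersection detected this frame


buttons = {"power_toggle": ((0.0, 0.0), 0.2), "volume_up": ((1.0, 0.0), 0.2)}
assert handle_frame((0.1, 0.1), buttons) == {"signal": "power_toggle"}
assert handle_frame((0.5, 0.5), buttons) is None
```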
  • Publication number: 20240077984
    Abstract: Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Ava Robinson, Daekun Kim, Youjean Cho, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Patent number: 11918888
    Abstract: Systems, methods, and computer readable media directed to multi-user visual experiences such as interactive gaming experiences and artistic media experiences. A viewing electronic device includes a camera configured to capture images, a display, and a processor coupled to the camera and the display. The processor is configured to capture, with the camera, images of a rotating marker where the rotating marker is presented on a monitor of a remote device and to present, on the display, a visual experience where the visual experience has an adjustable feature. The processor detects a parameter of the rotating marker from the captured images that corresponds to the adjustable feature and updates the visual experience responsive to the detected parameter. The parameter may be one or more of an angle, speed of rotation, direction of rotation, color, or pattern of the rotating marker.
    Type: Grant
    Filed: May 31, 2022
    Date of Patent: March 5, 2024
    Assignee: Snap Inc.
    Inventors: Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
  • Publication number: 20240058699
    Abstract: Systems and methods directed to generating an interactive graphical marker that includes a first region with a first indicator and a second region with a second indicator, the second region being around a circumference of the first region. The systems and methods are also directed to monitoring an animation of the interactive graphical marker to detect when the first indicator and the second indicator are aligned at a predetermined angle of rotation, and in response to detecting that the first indicator and the second indicator are aligned, initiating an interactive game application on a second computing device and a third computing device.
    Type: Application
    Filed: October 30, 2023
    Publication date: February 22, 2024
    Inventors: Yu Jiang Tham, Ava Robinson, Andrés Monroy-Hernández, Rajan Vaish
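The alignment check in the last entry above (detect when the first and second indicators reach a predetermined angle of rotation, then initiate the game) can be sketched as an angular comparison modulo 360 degrees; the target angle and tolerance are illustrative assumptions:

```python
def indicators_aligned(inner_deg, outer_deg, target_deg=0.0, tol_deg=2.0):
    """True when the indicators' angular difference reaches the target angle,
    within a small tolerance, handling wrap-around at 360 degrees."""
    diff = (inner_deg - outer_deg - target_deg) % 360.0
    return min(diff, 360.0 - diff) <= tol_deg


def maybe_start_game(inner_deg, outer_deg):
    """Monitor one animation frame; fire the game-start event on alignment."""
    if indicators_aligned(inner_deg, outer_deg):
        return "start_interactive_game"
    return None


assert indicators_aligned(359.0, 0.0)          # within tolerance across the wrap
assert not indicators_aligned(90.0, 0.0)
assert maybe_start_game(45.0, 44.5) == "start_interactive_game"
```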