Patents by Inventor Yu Jiang Tham

Yu Jiang Tham has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20210405747
    Abstract: A display system is disclosed. The display system includes a display, an eye-tracking sensor, and a processor communicatively coupled to the display and the eye-tracking sensor. The processor receives data from the eye-tracking sensor, determines, based on the eye-tracking data, whether a user's eye is looking at a display in the display system, and, based on whether the eye is determined to be looking at the display, controls the power consumption of the display.
    Type: Application
    Filed: September 13, 2021
    Publication date: December 30, 2021
    Inventor: Yu Jiang Tham
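    To make the power-control loop described in this abstract concrete, here is a minimal Python sketch, assuming a hypothetical GazeSample reading and a simple two-level brightness policy; it illustrates the idea, not the patented implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        """Hypothetical eye-tracking sample: a gaze point in display pixel coordinates."""
        x: float
        y: float
        valid: bool  # False when no eye is detected

    DISPLAY_W, DISPLAY_H = 1920, 1080

    def gaze_on_display(sample: GazeSample) -> bool:
        """Decide whether the tracked eye is looking at the display."""
        return sample.valid and 0 <= sample.x < DISPLAY_W and 0 <= sample.y < DISPLAY_H

    def update_display_power(sample: GazeSample, set_brightness) -> None:
        """Run the display at full power while it is being watched, dim it otherwise."""
        if gaze_on_display(sample):
            set_brightness(1.0)   # full brightness while the user looks at the display
        else:
            set_brightness(0.1)   # dim (or sleep) to reduce power consumption

    # The user glances away from the display, so it is dimmed.
    update_display_power(GazeSample(x=-50.0, y=200.0, valid=True),
                         lambda level: print(f"brightness={level}"))
    ```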
  • Publication number: 20210409954
    Abstract: A system to perform operations that include: detecting, at a first client device, a second client device in proximity with the first client device; generating a pairing code in response to the detecting the second client device in proximity of the first client device; establishing a communication pathway between the first client device and the second client device based on at least the pairing code; and presenting a collocation indicator at the first client device based on the establishing the communication pathway, according to certain example embodiments.
    Type: Application
    Filed: June 16, 2021
    Publication date: December 30, 2021
    Inventors: Savanah Frisk, Andrés Monroy-Hernández, Quan Thoi Minh Nguyen, Yu Jiang Tham, Michael Jing Xu
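    A rough Python sketch of the pairing flow described in this abstract, with a hypothetical Device class and a random six-digit pairing code standing in for the actual proximity detection and transport; it is illustrative only.

    ```python
    import secrets

    def generate_pairing_code(num_digits: int = 6) -> str:
        """Hypothetical pairing code: a short random numeric string."""
        return "".join(str(secrets.randbelow(10)) for _ in range(num_digits))

    class Device:
        """Toy client device that pairs with a nearby device and shows a collocation indicator."""
        def __init__(self, name: str):
            self.name = name
            self.channel = None       # communication pathway, once established
            self.collocated = False   # collocation indicator state

        def detect_nearby(self, other: "Device") -> str:
            """On detecting a device in proximity, generate a pairing code for it."""
            return generate_pairing_code()

        def pair(self, other: "Device", code: str, presented_code: str) -> None:
            """Establish the pathway and present the collocation indicator if the codes match."""
            if code == presented_code:
                self.channel = (self.name, other.name)
                self.collocated = True

    first, second = Device("first"), Device("second")
    code = first.detect_nearby(second)
    first.pair(second, code, presented_code=code)   # codes match
    print(first.collocated)                          # True
    ```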
  • Patent number: 11206615
    Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
    Type: Grant
    Filed: September 24, 2020
    Date of Patent: December 21, 2021
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
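    The sequencing described in this abstract could look roughly like the Python sketch below, with HighSpeedCircuitry and LocationCircuitry as hypothetical stand-ins for the two subsystems; the method names and timing are invented for illustration.

    ```python
    import time

    class HighSpeedCircuitry:
        """Stand-in for the application processor that holds location assistance data."""
        def boot(self): print("high-speed: boot")
        def sleep(self): print("high-speed: low-power state")
        def assistance_data(self): return {"ephemeris": "...", "last_fix": (34.05, -118.24)}
        def tag_content(self, content, fix): print(f"high-speed: tag {content} with {fix}")

    class LocationCircuitry:
        """Stand-in for the GNSS receiver; a real fix would take seconds, not a short sleep."""
        def boot_with_assistance(self, data): print("location: boot with assistance", data)
        def wait_for_fix(self): time.sleep(0.01); return (34.0522, -118.2437)

    def capture_with_location(content="photo.jpg"):
        hs, loc = HighSpeedCircuitry(), LocationCircuitry()
        hs.boot()                                       # wake both subsystems on content capture
        loc.boot_with_assistance(hs.assistance_data())  # assistance data shortens time to first fix
        hs.sleep()                                      # high-speed sleeps before the fix completes
        fix = loc.wait_for_fix()                        # low-power side waits for the fix
        hs.boot()                                       # reboot high-speed to update the content
        hs.tag_content(content, fix)

    capture_with_location()
    ```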
  • Publication number: 20210337351
    Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. One embodiment involves pairing a client device with a wearable device and capturing a first client location fix at a first time using the first application and location circuitry of the client device. The client device then receives content from the wearable device, where the content is associated with a content capture time and location state data. The client device then updates a location based on the available data to reconcile the different sets of location data. In some embodiments, additional sensor data, such as data from an accelerometer, is used to determine which location data is more accurate for certain content.
    Type: Application
    Filed: May 12, 2021
    Publication date: October 28, 2021
    Inventors: Yu Jiang Tham, John James Robertson, Antoine Ménard, Tamer El Calamawy
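    One way to picture the reconciliation step from this abstract is the Python sketch below; the motion threshold, the staleness allowance, and the tuple formats are assumptions made for illustration, not values from the patent.

    ```python
    def reconcile_location(client_fix, wearable_state, accel_magnitude, capture_time):
        """
        Pick the location to associate with captured content.

        client_fix:      (lat, lon, timestamp) from the client device's location circuitry
        wearable_state:  (lat, lon, timestamp) location state data sent with the content
        accel_magnitude: mean accelerometer magnitude around capture time; high values suggest
                         the user was moving, so an older fix is less trustworthy
        """
        client_age = abs(capture_time - client_fix[2])
        wearable_age = abs(capture_time - wearable_state[2])
        moving = accel_magnitude > 1.5  # hypothetical motion threshold in g
        # When moving, prefer whichever fix is closer in time to the capture;
        # when stationary, a slightly stale fix is still representative.
        if moving:
            return client_fix[:2] if client_age <= wearable_age else wearable_state[:2]
        return client_fix[:2] if client_age <= wearable_age + 60 else wearable_state[:2]

    print(reconcile_location((34.05, -118.24, 1000), (34.06, -118.25, 900),
                             accel_magnitude=2.0, capture_time=990))
    ```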
  • Patent number: 11157076
    Abstract: A display system is disclosed. The display system includes a display, an eye-tracking sensor, and a processor communicatively coupled to the display and the eye-tracking sensor. The processor receives data from the eye-tracking sensor, determines, based on the eye-tracking data, whether a user's eye is looking at a display in the display system, and, based on whether the eye is determined to be looking at the display, controls the power consumption of the display.
    Type: Grant
    Filed: September 26, 2018
    Date of Patent: October 26, 2021
    Assignee: Snap Inc.
    Inventor: Yu Jiang Tham
  • Publication number: 20210319612
    Abstract: A computer system receives user selection of an avatar story template. User-specific parameters relating to the user are determined and real-time data, based at least in part on the user-specific parameters, is retrieved. Specific media or digital assets are obtained based on at least one of the real-time data and the user-specific parameters. An avatar story is then generated by combining the avatar story template and the specific media or digital assets. The avatar story is then displayed on a display of a computing device.
    Type: Application
    Filed: June 24, 2021
    Publication date: October 14, 2021
    Inventors: Andrés Monroy-Hernández, Yu Jiang Tham
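    A small Python sketch of the template-plus-data assembly this abstract describes, using invented template, user-parameter, and asset names; the real system would pull real-time data and media from backend services.

    ```python
    def generate_avatar_story(template, user_params, fetch_realtime, fetch_assets):
        """
        Combine a story template with user-specific parameters, real-time data,
        and media assets to produce a rendered avatar story (a list of scene strings).
        """
        realtime = fetch_realtime(user_params)        # e.g. weather for the user's city
        assets = fetch_assets(realtime, user_params)  # e.g. an umbrella asset if it is raining
        story = []
        for scene in template["scenes"]:
            story.append(scene.format(**user_params, **realtime, **assets))
        return story

    template = {"scenes": ["{avatar} wakes up in {city}.",
                           "It is {weather}, so {avatar} grabs {prop}."]}
    user = {"avatar": "Alex", "city": "Seattle"}
    story = generate_avatar_story(
        template, user,
        fetch_realtime=lambda params: {"weather": "raining"},
        fetch_assets=lambda realtime, params: {"prop": "an umbrella"},
    )
    print("\n".join(story))
    ```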
  • Publication number: 20210304507
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, method, and user interface to facilitate context-based augmented reality communication between multiple users over a network. Virtual content item configuration data indicative of a selection by a first user of a virtual content item to apply to a real-world environment that is visible to a second user via a second device is received from a first device. The virtual content item configuration data also includes one or more criteria to trigger application of the virtual content item to the real-world environment. A triggering event is detected based on satisfaction of the one or more criteria determined from context data generated at the second device. The second device presents the virtual content item overlaid on the real-world environment that is visible to the second user based on the triggering event.
    Type: Application
    Filed: March 23, 2021
    Publication date: September 30, 2021
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
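    The trigger-criteria check from this abstract could be sketched in Python as follows; the criteria keys (location, min_hour) and the VirtualContentConfig structure are hypothetical examples, not the patent's data model.

    ```python
    from dataclasses import dataclass

    @dataclass
    class VirtualContentConfig:
        """Configuration sent by the first user: what to show and when to show it."""
        content_id: str
        criteria: dict   # e.g. {"location": "home", "min_hour": 18}

    def triggered(config: VirtualContentConfig, context: dict) -> bool:
        """Check the receiving device's context data against every trigger criterion."""
        c = config.criteria
        if "location" in c and context.get("location") != c["location"]:
            return False
        if "min_hour" in c and context.get("hour", 0) < c["min_hour"]:
            return False
        return True

    config = VirtualContentConfig("confetti", {"location": "home", "min_hour": 18})
    print(triggered(config, {"location": "home", "hour": 20}))  # True: overlay is presented
    print(triggered(config, {"location": "work", "hour": 20}))  # False: criteria not met
    ```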
  • Publication number: 20210306387
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, method, and user interface to facilitate augmented reality based communication between multiple users over a network. Session configuration data including configuration parameters of a virtual interaction session with a first user is received from a first device. The configuration parameters include an identifier of a second user that is permitted to join the virtual interaction session and a micro-chat duration that defines a time limit for a real-time communication link between the first and second user during the virtual interaction session. The real-time communication link between the first and second user is established by causing display, by the second device, of a live camera feed generated at the first device. Upon expiration of the micro-chat duration, the real-time communication link between the first and second user is terminated.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 30, 2021
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
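    A minimal Python sketch of the micro-chat time limit described in this abstract, using a timer thread and an "active" flag standing in for the real-time communication link; the class and method names are invented for illustration.

    ```python
    import threading

    class MicroChatSession:
        """Real-time link that is torn down automatically when the micro-chat duration expires."""
        def __init__(self, duration_seconds: float):
            self.active = False
            self._timer = threading.Timer(duration_seconds, self.terminate)

        def start(self):
            self.active = True    # e.g. start streaming the first device's camera feed
            self._timer.start()

        def terminate(self):
            self.active = False   # link is closed once the duration expires
            print("micro-chat ended")

    session = MicroChatSession(duration_seconds=0.1)  # very short duration just for this example
    session.start()
    session._timer.join()        # wait for expiry; prints "micro-chat ended"
    print(session.active)        # False
    ```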
  • Publication number: 20210306386
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, method, and user interface to facilitate augmented reality based communication between multiple users over a network. A first user of a first device is enabled to view a real-world environment that is visible to a second user via a second device by causing display, at the first device, of a live camera feed generated at the second device. The live camera feed comprises images of the real-world environment that is visible to the second user. Input data indicative of a selection by the first user of a virtual content item to apply to the real-world environment that is visible to the second user is received. The first device and second device present media objects overlaid on the real-world environment based on the input data.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 30, 2021
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
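    As a toy illustration of sharing a remote camera feed and a jointly rendered overlay list, consider the Python sketch below; the string-based "frames" and media objects are placeholders for real video and AR assets.

    ```python
    class SharedARSession:
        """Sketch of one-way feed sharing with an overlay list kept in sync on both devices."""
        def __init__(self):
            self.overlays = []    # media objects both devices render

        def frame_for_viewer(self, remote_frame: str) -> str:
            """The viewing device renders the remote camera frame plus the shared overlays."""
            if not self.overlays:
                return remote_frame
            return remote_frame + " + " + ", ".join(self.overlays)

        def apply_content(self, media_object: str) -> None:
            """Input from the viewing user adds a media object presented on both devices."""
            self.overlays.append(media_object)

    session = SharedARSession()
    session.apply_content("floating balloons")
    print(session.frame_for_viewer("kitchen_frame_042"))   # kitchen_frame_042 + floating balloons
    ```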
  • Publication number: 20210304450
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, method, and user interface to facilitate augmented reality based communication between multiple users over a network. Input data is received from a first device that is indicative of a selection by a first user of a virtual content item to apply to a real-world environment that is visible to a second user via a second device. The virtual content item may comprise one or more media objects. Based on receiving the input data, the second device is caused to present the one or more media objects overlaid on the real-world environment that is visible to the second user via the second device.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 30, 2021
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Patent number: 11132977
    Abstract: An eyewear device includes an image display and an image display driver coupled to the image display to control a presented image and adjust a brightness level setting of the presented image. The eyewear device includes a user input device including an input surface on a frame, a temple, a lateral side, or a combination thereof to receive from the wearer a user input selection. The eyewear device includes a proximity sensor to track a finger distance of a finger of the wearer to the input surface. The eyewear device controls, via the image display driver, the image display to present the image to the wearer. The eyewear device tracks, via the proximity sensor, the finger distance of the finger of the wearer to the input surface. The eyewear device adjusts, via the image display driver, the brightness level setting of the presented image on the image display based on the tracked finger distance.
    Type: Grant
    Filed: December 2, 2019
    Date of Patent: September 28, 2021
    Assignee: Snap Inc.
    Inventors: Ilteris Canberk, Jonathan M. Rodriguez, II, Yu Jiang Tham
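    The finger-distance-to-brightness mapping this abstract describes could look something like the Python sketch below; the near/far thresholds and the linear ramp are assumptions for illustration, not values taken from the patent.

    ```python
    def brightness_from_finger_distance(distance_mm: float,
                                        near_mm: float = 5.0,
                                        far_mm: float = 50.0) -> float:
        """
        Map the tracked finger-to-input-surface distance to a brightness level in [0.2, 1.0].
        A closer finger gives a brighter image; the thresholds here are illustrative only.
        """
        if distance_mm <= near_mm:
            return 1.0
        if distance_mm >= far_mm:
            return 0.2
        span = far_mm - near_mm
        return 1.0 - 0.8 * (distance_mm - near_mm) / span

    for d in (3, 20, 60):
        print(d, round(brightness_from_finger_distance(d), 2))   # 1.0, 0.73, 0.2
    ```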
  • Publication number: 20210289444
    Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
    Type: Application
    Filed: May 26, 2021
    Publication date: September 16, 2021
    Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
  • Publication number: 20210280155
    Abstract: An eyewear device includes an image display and an image display driver coupled to the image display to control a presented image and adjust a brightness level setting of the presented image. The eyewear device includes a user input device including an input surface on a frame, a temple, a lateral side, or a combination thereof to receive from the wearer a user input selection. The eyewear device includes a proximity sensor to track a finger distance of a finger of the wearer to the input surface. The eyewear device controls, via the image display driver, the image display to present the image to the wearer. The eyewear device tracks, via the proximity sensor, the finger distance of the finger of the wearer to the input surface. The eyewear device adjusts, via the image display driver, the brightness level setting of the presented image on the image display based on the tracked finger distance.
    Type: Application
    Filed: May 21, 2021
    Publication date: September 9, 2021
    Inventors: Ilteris Canberk, Jonathan M. Rodriguez, II, Yu Jiang Tham
  • Publication number: 20210240246
    Abstract: Systems and methods for detecting touch events with an accelerometer are disclosed. In one aspect, a method includes measuring first accelerometer data at a first rate, detecting a first touch event based on the first accelerometer data, in response to detecting the first touch event, measuring second accelerometer data at a second rate, determining whether a second touch event is detected based on the second accelerometer data, and measuring third accelerometer data at the first rate in response to an absence of the second touch event being detected in the second accelerometer data over a predetermined threshold period of time.
    Type: Application
    Filed: April 23, 2021
    Publication date: August 5, 2021
    Inventors: Yu Jiang Tham, Xing Mei
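    A compact Python sketch of the rate-switching logic in this abstract; the sampling rates, tap threshold, and idle timeout are illustrative guesses rather than the claimed values.

    ```python
    LOW_RATE_HZ, HIGH_RATE_HZ = 25, 400   # illustrative sampling rates
    IDLE_TIMEOUT_S = 2.0                  # revert to the low rate after this long without a tap

    def detect_tap(samples, threshold=2.5):
        """Treat any sample whose magnitude exceeds the threshold (in g) as a touch event."""
        return any(abs(s) > threshold for s in samples)

    def next_rate(current_rate, samples, last_tap_time, now):
        """Decide the accelerometer sampling rate for the next measurement window."""
        if detect_tap(samples):
            return HIGH_RATE_HZ, now              # first touch: sample faster
        if current_rate == HIGH_RATE_HZ and now - last_tap_time > IDLE_TIMEOUT_S:
            return LOW_RATE_HZ, last_tap_time     # no follow-up touch: back to low power
        return current_rate, last_tap_time

    rate, last = LOW_RATE_HZ, 0.0
    rate, last = next_rate(rate, [0.1, 3.0, 0.2], last, now=1.0)   # tap -> 400 Hz
    rate, last = next_rate(rate, [0.1, 0.1, 0.1], last, now=4.0)   # idle -> back to 25 Hz
    print(rate)  # 25
    ```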
  • Patent number: 11080917
    Abstract: A computer system receives user selection of an avatar story template. User-specific parameters relating to the user are determined and real-time data, based at least in part on the user-specific parameters, is retrieved. Specific media or digital assets are obtained based on at least one of the real-time data and the user-specific parameters. An avatar story is then generated by combining the avatar story template and the specific media or digital assets. The avatar story is then displayed on a display of a computing device.
    Type: Grant
    Filed: September 24, 2020
    Date of Patent: August 3, 2021
    Assignee: Snap Inc.
    Inventors: Andrés Monroy-Hernández, Yu Jiang Tham
  • Publication number: 20210218885
    Abstract: A system including image capture eyewear, a processor, and a memory. The image capture eyewear includes a support structure, a selector connected to the support structure, a display system (e.g., LEDs or a display) connected to the support structure to distinctly display assignable recipient markers, and a camera connected to the support structure to capture an image of a scene. The processor executes programming in the memory to assign recipients to the assignable recipient markers, receive a captured image of the scene, receive an indicator associated with the assignable recipient markers distinctly displayed at the time the image of the scene was captured, and transmit the captured image to the recipient assigned to the distinctly displayed assignable recipient markers.
    Type: Application
    Filed: March 31, 2021
    Publication date: July 15, 2021
    Inventors: Yu Jiang Tham, Mitchell Bechtold, Antoine Ménard
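    A small Python sketch of assigning recipients to markers and routing a captured image to the recipient whose marker was lit at capture time; the marker identifiers and API are hypothetical.

    ```python
    class RecipientMarkers:
        """Map LED-style markers on the eyewear to assigned recipients for captured images."""
        def __init__(self, marker_ids):
            self.assignments = {marker_id: None for marker_id in marker_ids}

        def assign(self, marker_id, recipient):
            self.assignments[marker_id] = recipient

        def send(self, image, lit_marker_id):
            """Route the captured image to whoever is assigned to the marker lit at capture time."""
            recipient = self.assignments.get(lit_marker_id)
            if recipient is None:
                raise ValueError(f"no recipient assigned to marker {lit_marker_id}")
            print(f"sending {image} to {recipient}")

    markers = RecipientMarkers(["led_red", "led_green", "led_blue"])
    markers.assign("led_green", "alice")
    markers.send("scene_001.jpg", lit_marker_id="led_green")   # sending scene_001.jpg to alice
    ```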
  • Publication number: 20210194178
    Abstract: Methods and devices for wired charging and communication with a wearable device are described. In one embodiment, a symmetrical contact interface comprises a first contact pad and a second contact pad, and particular wired circuitry is coupled to the first and second contact pads to enable charging as well as receive and transmit communications via the contact pads as part of various device states.
    Type: Application
    Filed: February 5, 2021
    Publication date: June 24, 2021
    Inventors: Yu Jiang Tham, Nicholas Larson, Peter Brook, Russell Douglas Patton, Miran Alhaideri, Zhihao Hong
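    At a very high level, the device states mentioned in this abstract might be modeled as in the Python sketch below; the state names and transition rules are assumptions for illustration and say nothing about the actual circuitry.

    ```python
    from enum import Enum, auto

    class PadState(Enum):
        """Illustrative states a two-pad symmetrical contact interface might move through."""
        DISCONNECTED = auto()
        CHARGING = auto()        # pads carry charge current only
        COMMUNICATING = auto()   # pads are used to receive and transmit data

    def next_state(state: PadState, cable_present: bool, data_requested: bool) -> PadState:
        """Pick the interface state from cable presence and whether the host wants a data link."""
        if not cable_present:
            return PadState.DISCONNECTED
        if data_requested:
            return PadState.COMMUNICATING
        return PadState.CHARGING

    state = PadState.DISCONNECTED
    state = next_state(state, cable_present=True, data_requested=False)   # CHARGING
    state = next_state(state, cable_present=True, data_requested=True)    # COMMUNICATING
    print(state.name)
    ```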
  • Publication number: 20210185616
    Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
    Type: Application
    Filed: September 24, 2020
    Publication date: June 17, 2021
    Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
  • Patent number: D928780
    Type: Grant
    Filed: December 12, 2019
    Date of Patent: August 24, 2021
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Gerald Nilles, Jonathan M. Rodriguez, II, Nir Daube
  • Patent number: D932429
    Type: Grant
    Filed: December 12, 2019
    Date of Patent: October 5, 2021
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Gerald Nilles, Jonathan M. Rodriguez, II, Nir Daube