Patents by Inventor Yu Jiang Tham
Yu Jiang Tham has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11893301
Abstract: Methods and systems are disclosed for creating a shared augmented reality (AR) session. The methods and systems perform operations comprising: receiving, by a client device, input that selects a shared augmented reality (AR) experience from a plurality of shared AR experiences; in response to receiving the input, determining one or more resources associated with the selected shared AR experience; determining, by the client device, that two or more users are located within a threshold proximity of the client device; and activating the selected shared AR experience in response to determining that the two or more users are located within the threshold proximity of the client device.
Type: Grant
Filed: May 16, 2022
Date of Patent: February 6, 2024
Assignee: Snap Inc.
Inventors: Ana Maria Cardenas Gasca, Ella Dagan Peled, Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
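The proximity-gated activation described in this abstract can be illustrated with a minimal sketch. All names here (`SharedARExperience`, `maybe_activate`, the 10-meter threshold) are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: activate a shared AR experience only once enough
# users are within a threshold distance of the client device.
THRESHOLD_METERS = 10.0  # assumed value; the patent does not specify one

@dataclass
class SharedARExperience:
    name: str
    resources: list = field(default_factory=list)
    active: bool = False

def users_within_threshold(user_distances, threshold=THRESHOLD_METERS):
    """Return the users whose distance from the client device is within the threshold."""
    return [u for u, d in user_distances.items() if d <= threshold]

def maybe_activate(experience, user_distances, min_users=2):
    """Activate the experience once at least `min_users` users are in proximity."""
    if len(users_within_threshold(user_distances)) >= min_users:
        experience.active = True
    return experience.active
```

The claimed "two or more users" condition maps to `min_users=2`; activation stays off until the count is reached.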
-
Publication number: 20240031782
Abstract: A method and a system include receiving a first signal value generated by a biosignal sensor coupled to the first client device, receiving one or more second signal values corresponding to a respective sensor reading or environmental condition associated with the first client device, determining a total score based on a first score and one or more second scores determined based on the first and the one or more second signal values, selecting a first state of the plurality of states based on a ranking of total scores of the plurality of states, and causing a display of a first notification associated with a first user-selectable element corresponding to the first state on the first client device.
Type: Application
Filed: September 27, 2023
Publication date: January 25, 2024
Inventors: Andrés Monroy-Hernández, Chunjong Park, Fannie Liu, Yu Jiang Tham
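The scoring-and-ranking step in this abstract can be sketched briefly. The combination rule (a plain sum) and all names are assumptions for illustration; the application itself does not disclose a specific formula here:

```python
# Hypothetical sketch: each candidate state receives a total score from a
# biosignal-derived first score plus context-derived second scores; the
# top-ranked state drives which notification is displayed.
def total_score(first_score, second_scores):
    """Combine the biosignal score with the context scores (assumed: simple sum)."""
    return first_score + sum(second_scores)

def select_state(state_scores):
    """state_scores: dict mapping state name -> (first_score, list_of_second_scores).
    Returns the state ranked highest by total score."""
    return max(state_scores, key=lambda s: total_score(*state_scores[s]))
```

A weighted sum or learned model would slot in the same place as `total_score` without changing the ranking step.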
-
Patent number: 11880946
Abstract: Systems and methods herein describe a system for context triggered augmented reality. The proposed systems and methods receive first user input indicative of a selection of a user interface element corresponding to a recipient user, generate an augmented reality content item based on second user input from the first computing device, generate a contextual trigger for the generated augmented reality content item, the contextual trigger defining a set of conditions for presenting the generated augmented reality content item on a second computing device, generate a multi-media message comprising audio data recorded at the first computing device, detect at least one condition of the set of conditions being satisfied, and in response to detecting at least one of the set of conditions being satisfied, causing presentation of the augmented reality content item and the multi-media message at the second computing device.
Type: Grant
Filed: September 15, 2021
Date of Patent: January 23, 2024
Assignee: Snap Inc.
Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
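The "at least one condition satisfied" trigger logic in this abstract has a natural one-class sketch. The predicates (`at_home`, `evening`) and the context-dict shape are hypothetical examples, not conditions named in the patent:

```python
# Hypothetical sketch: a contextual trigger fires when at least one of its
# conditions is satisfied by the recipient device's current context.
class ContextualTrigger:
    def __init__(self, conditions):
        # conditions: predicates over a context dict (e.g. location, time)
        self.conditions = conditions

    def is_satisfied(self, context):
        """True if any single condition in the set holds (matching the claim language)."""
        return any(cond(context) for cond in self.conditions)

# Example conditions, purely illustrative:
def at_home(ctx):
    return ctx.get("location") == "home"

def evening(ctx):
    return ctx.get("hour", 0) >= 18
```

Because the abstract requires only "at least one" condition, `any()` is the right combinator; an all-conditions variant would use `all()` instead.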
-
Patent number: 11876391
Abstract: Methods and devices for wired charging and communication with a wearable device are described. In one embodiment, a symmetrical contact interface comprises a first contact pad and a second contact pad, and particular wired circuitry is coupled to the first and second contact pads to enable charging as well as receive and transmit communications via the contact pads as part of various device states.
Type: Grant
Filed: February 5, 2021
Date of Patent: January 16, 2024
Assignee: Snap Inc.
Inventors: Yu Jiang Tham, Nicholas Larson, Peter Brook, Russell Douglas Patton, Miran Alhaideri, Zhihao Hong
-
Patent number: 11875022
Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
Type: Grant
Filed: August 29, 2021
Date of Patent: January 16, 2024
Assignee: Snap Inc.
Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
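The action-identifier-to-command mapping, including the context modification step, can be sketched as a lookup plus a filter. The gesture names, commands, and the `room_occupied` context key are all hypothetical:

```python
# Hypothetical sketch: translate a gesture recognized by the AR camera
# (an "action identifier") into a command for the paired IoT device,
# optionally modified by context information.
PAIRED_ACTIONS = {
    "point_up": "light_on",
    "point_down": "light_off",
    "wave": "toggle",
}

def command_for(action_id, context=None):
    """Return the IoT command for a recognized action, or None if unpaired.
    Context may modify the command before it is sent."""
    command = PAIRED_ACTIONS.get(action_id)
    if command is None:
        return None
    # Example context modification: suppress turning lights on in an empty room.
    if context and context.get("room_occupied") is False and command == "light_on":
        command = "noop"
    return command
```

In a real system the dispatch table would be built during pairing rather than hard-coded, but the recognize-then-dispatch shape is the same.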
-
Publication number: 20230418542
Abstract: Methods and systems are disclosed for creating a shared augmented reality (AR) session. The methods and systems perform operations comprising: receiving, by a client device, input that selects a shared augmented reality (AR) experience from a plurality of shared AR experiences; in response to receiving the input, determining one or more resources associated with the selected shared AR experience; determining, by the client device, that two or more users are located within a threshold proximity of the client device; and activating the selected shared AR experience in response to determining that the two or more users are located within the threshold proximity of the client device.
Type: Application
Filed: September 8, 2023
Publication date: December 28, 2023
Inventors: Ana Maria Cardenas Gasca, Ella Dagan Peled, Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
-
Publication number: 20230422176
Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
Type: Application
Filed: September 5, 2023
Publication date: December 28, 2023
Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
-
Patent number: 11852500
Abstract: Eyewear having a light array and vibration sensors for indicating to a user when to turn left and right, such as when the eyewear is operating a navigation map application, improving the experience of eyewear devices for users having partial or complete blindness. To compensate for partial blindness, the front portion of the eyewear frame, such as the bridge, may include the light array, where the left portion of the light array is illuminated to indicate to the user to turn left, and the right portion of the light array is illuminated to indicate to the user to turn right. To compensate for complete blindness, the eyewear has a vibration device on each side of the eyewear, such as in the temples, which selectively vibrates to indicate when the user should turn left or right.
Type: Grant
Filed: August 20, 2020
Date of Patent: December 26, 2023
Assignee: Snap Inc.
Inventors: Stephen Pomes, Yu Jiang Tham
-
Publication number: 20230409158
Abstract: An augmented reality (AR) device is interfaced to an Internet of Things (IoT) device by receiving IoT device information in a standardized schema from the IoT device. The IoT device information includes device inputs and available output information for the IoT device. A predetermined AR user interface widget to render for the IoT device is determined from the received IoT device information. The predetermined AR user interface widget converts user inputs to the AR device into the device inputs for the IoT device. Upon selection of the IoT device by the AR device, the predetermined AR user interface widget for the selected IoT device is provided to the AR device as an overlay on a display of the AR device. Device input received in response to user interaction with the AR user interface widget is provided to the IoT device in an input type expected by the IoT device.
Type: Application
Filed: June 15, 2022
Publication date: December 21, 2023
Inventors: Sven Kratz, Andrés Monroy-Hernández, Yu Jiang Tham
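The schema-to-widget determination step can be sketched as a mapping from declared input types to UI widgets. The schema shape, widget names, and the example bulb device are all hypothetical illustrations, not the schema the application defines:

```python
# Hypothetical sketch: choose an AR user-interface widget for each input a
# device declares in its self-describing schema, so user interaction can be
# converted into the input type the device expects.
WIDGETS = {
    "boolean": "toggle_switch",
    "range": "slider",
    "enum": "dropdown",
}

def widgets_for(device_info):
    """device_info: dict with an 'inputs' list of {'name': ..., 'type': ...} entries.
    Returns a widget name per device input, with a fallback for unknown types."""
    return {inp["name"]: WIDGETS.get(inp["type"], "text_field")
            for inp in device_info["inputs"]}

# Illustrative device description in the assumed schema:
smart_bulb = {"inputs": [{"name": "power", "type": "boolean"},
                         {"name": "brightness", "type": "range", "min": 0, "max": 100}]}
```

Keeping the type-to-widget table on the AR side is what lets new devices work without device-specific UI code, which is the point of the standardized schema.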
-
Publication number: 20230410437
Abstract: Interactive experiences are generated using an augmented reality (AR) system that leverages the native input and output capabilities of AR devices to generate immersive and interactive experiences in smart spaces that combine visually-represented programming logic, AR content, and IoT devices. The system connects IoT devices with each other in a native AR environment, provides virtual content and logic gates, scripts sequences and loops of actions, solves visibility issues for IoT devices that are far apart, and designs logic gates and flow circuits that best support user intentions when generating interactive experiences in smart spaces. The interactive experience is produced by generating a program of behaviors of selected IoT devices using logic gates and control circuits connected to the IoT devices in a visual programming interface of the AR device. Once completed, the program of behaviors is executed by a runtime module to control the IoT devices during the interactive experience.
Type: Application
Filed: June 15, 2022
Publication date: December 21, 2023
Inventors: Sven Kratz, Andrés Monroy-Hernández, Yu Jiang Tham
-
Patent number: 11842451
Abstract: Systems and methods herein describe a system for context triggered augmented reality. The proposed systems and methods receive first user input indicative of a selection of a user interface element corresponding to a recipient user, generate an augmented reality content item based on second user input from the first computing device, generate a contextual trigger for the generated augmented reality content item, the contextual trigger defining a set of conditions for presenting the generated augmented reality content item on a second computing device, generate a multi-media message comprising audio data recorded at the first computing device, detect at least one condition of the set of conditions being satisfied, and in response to detecting at least one of the set of conditions being satisfied, causing presentation of the augmented reality content item and the multi-media message at the second computing device.
Type: Grant
Filed: September 15, 2021
Date of Patent: December 12, 2023
Assignee: Snap Inc.
Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
-
Patent number: 11833427
Abstract: Systems and methods directed to generating an interactive graphical marker that includes a first region with a first indicator and a second region with a second indicator, the second region being around a circumference of the first region. The systems and methods are also directed to monitoring an animation of the interactive graphical marker to detect when the first indicator and the second indicator are aligned at a predetermined angle of rotation, and in response to detecting that the first indicator and the second indicator are aligned, initiating an interactive game application on a second computing device and a third computing device.
Type: Grant
Filed: September 16, 2022
Date of Patent: December 5, 2023
Assignee: Snap Inc.
Inventors: Yu Jiang Tham, Ava Robinson, Andrés Monroy-Hernández, Rajan Vaish
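The alignment-detection step described here reduces to comparing two rotation angles modulo 360 degrees. The tolerance value and function names below are assumptions for illustration; the patent only speaks of alignment "at a predetermined angle of rotation":

```python
# Hypothetical sketch: an outer-ring indicator rotates around an inner
# indicator; alignment is detected when their relative rotation is within a
# small tolerance of the predetermined target angle, with wrap-around handled.
def angles_aligned(inner_deg, outer_deg, target_deg=0.0, tolerance_deg=2.0):
    """All angles in degrees. True when the indicators' relative rotation is
    within `tolerance_deg` of `target_deg`, accounting for the 360° wrap."""
    relative = (outer_deg - inner_deg - target_deg) % 360.0
    return min(relative, 360.0 - relative) <= tolerance_deg
```

The `min(relative, 360 - relative)` step is what keeps 359° and 1° both counting as "1° away from aligned".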
-
Patent number: 11826635
Abstract: Context-sensitive remote controls for use with electronic devices (e.g., eyewear device). The electronic device is configured to perform activities (e.g., email, painting, navigation, gaming). The context-sensitive remote control includes a display having a display area, a display driver coupled to the display, and a transceiver. The remote control additionally includes memory that stores controller layout configurations for display in the display area of the display by the display driver. A processor in the context-sensitive remote control is configured to establish, via the transceiver, communication with an electronic device, detect an activity currently being performed by the electronic device, select one of the controller layout configurations responsive to the detected activity, and present, via the display driver, the selected controller layout configuration in the display area of the display.
Type: Grant
Filed: December 7, 2022
Date of Patent: November 28, 2023
Assignee: Snap Inc.
Inventors: Simon Nielsen, Jonathan Rodriguez, Yu Jiang Tham
-
Patent number: 11812347
Abstract: A method and a system include receiving a first signal value generated by a biosignal sensor coupled to the first client device, receiving one or more second signal values corresponding to a respective sensor reading or environmental condition associated with the first client device, determining a total score based on a first score and one or more second scores determined based on the first and the one or more second signal values, selecting a first state of the plurality of states based on a ranking of total scores of the plurality of states, and causing a display of a first notification associated with a first user-selectable element corresponding to the first state on the first client device.
Type: Grant
Filed: September 8, 2020
Date of Patent: November 7, 2023
Assignee: Snap Inc.
Inventors: Andrés Monroy-Hernández, Chunjong Park, Fannie Liu, Yu Jiang Tham
-
Publication number: 20230351701
Abstract: The users' experience of engaging with augmented reality (AR) technology that permits users to interact with their environment and with each other is enhanced by automatically selecting an AR experience that is suitable for use given the physical environment of the user. The physical environment of the user is the physical environment of the user's computing device. The physical environment may include objects and/or conditions present in close proximity to the user's computing device, such as other humans, animals, and smart devices.
Type: Application
Filed: April 28, 2022
Publication date: November 2, 2023
Inventors: Andrés Monroy-Hernández, Ava Marie Robinson, Yu Jiang Tham
-
Publication number: 20230350506
Abstract: Systems, devices, media, and methods are presented for selectively activating and suspending control of a graphical user interface by two or more electronic devices. A portable eyewear device includes a display projected onto at least one lens assembly and a primary touchpad through which the user may access a graphical user interface (GUI) on the display. A handheld accessory device, such as a ring, includes an auxiliary touchpad that is configured to emulate the primary touchpad. The eyewear processor temporarily suspends inputs from one touchpad when it detects an activation signal from the other touchpad.
Type: Application
Filed: July 6, 2023
Publication date: November 2, 2023
Inventors: Yu Jiang Tham, Stephen Pomes, Jonathan M. Rodriguez, II, Nir Daube
-
Publication number: 20230328479
Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. One embodiment involves pairing a client device with a wearable device, capturing a first client location fix at a first time using the first application and location circuitry of the client device. The client device then receives content from the wearable device, where the content is associated with a content capture time and location state data. The client device then updates a location based on the available data to reconcile the different sets of location data. In some embodiments, additional sensor data, such as data from an accelerometer, is used to determine which location data is more accurate for certain content.
Type: Application
Filed: June 7, 2023
Publication date: October 12, 2023
Inventors: Yu Jiang Tham, John James Robertson, Antoine Ménard, Tamer El Calamawy
-
Patent number: 11785549
Abstract: Systems, methods, devices, computer readable media, and other various embodiments are described for location management processes in wearable electronic devices. Performance of such devices is improved with reduced time to first fix of location operations in conjunction with low-power operations. In one embodiment, low-power circuitry manages high-speed circuitry and location circuitry to provide location assistance data from the high-speed circuitry to the low-power circuitry automatically on initiation of location fix operations as the high-speed circuitry and location circuitry are booted from low-power states. In some embodiments, the high-speed circuitry is returned to a low-power state prior to completion of a location fix and after capture of content associated with initiation of the location fix. In some embodiments, high-speed circuitry is booted after completion of a location fix to update location data associated with content.
Type: Grant
Filed: November 23, 2021
Date of Patent: October 10, 2023
Assignee: Snap Inc.
Inventors: Yu Jiang Tham, John James Robertson, Gerald Nilles, Jason Heger, Praveen Babu Vadivelu
-
Publication number: 20230298247
Abstract: Users of electronic eyewear devices can interact with each other by sharing 3D objects (e.g., 2D or 3D augmented reality (AR) objects or scanned 2D or 3D images of real-world objects) with each other via local objects (real or virtual) in each user's environment established as personalized anchor points for social connection. When a user receives an object from another user, the user has the option to generate a connected session with other users that are co-located (physically or virtually at the same location) with the user. The co-located group of users in this new connected session may view the received object either on their personal electronic devices (e.g., smartphones) or on their electronic eyewear devices and can modify and annotate the shared object using collaboration software and AR display tools that enable modification and manipulation of the shared object.
Type: Application
Filed: March 15, 2022
Publication date: September 21, 2023
Inventors: Yu Jiang Tham, Rajan Vaish, Sven Kratz
-
Patent number: 11765115
Abstract: A system and method for recommending emojis to send by a messaging application includes receiving user biosignals from at least one sensor, receiving context information from at least one context source, and selecting, using a rule-based heuristic state selector, a list of plausible user states from possible heuristic states in a heuristic state table based on the received user biosignals and received context information. A state predictor determines, from at least the list of plausible user states and a user model of emojis previously sent in different user states, a list of user states and probabilities that the user will send an emoji for each user state given the context information. The recommended emojis are displayed for selection for sending by the messaging application in order from the highest to the lowest probabilities that the user will send the recommended emojis based on a selected plausible user state.
Type: Grant
Filed: July 29, 2021
Date of Patent: September 19, 2023
Assignee: Snap Inc.
Inventors: Fannie Liu, Yu Jiang Tham, Tsung-Yu Tsai, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
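The final ordering step of this abstract, ranking recommended emojis by the probability the user sends them in the selected state, can be sketched directly. The example user model, its probabilities, and the shortcode emoji names are all hypothetical:

```python
# Hypothetical sketch: given the selected plausible user state and a model of
# emojis the user previously sent per state, order recommendations from
# highest to lowest send probability.
USER_MODEL = {
    # state -> {emoji shortcode: probability the user sends it in that state}
    "happy": {":smile:": 0.6, ":tada:": 0.3, ":thumbsup:": 0.1},
    "tired": {":sleepy:": 0.7, ":coffee:": 0.2, ":smile:": 0.1},
}

def recommend(selected_state, model=USER_MODEL, top_k=3):
    """Return up to `top_k` emojis for the state, sorted by descending probability."""
    probs = model.get(selected_state, {})
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    return [emoji for emoji, _ in ranked[:top_k]]
```

In the patented pipeline the probabilities would come from the state predictor conditioned on context, not a static table; only the sort-and-display step is shown here.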