Patents by Inventor Fannie Liu
Fannie Liu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 12361106
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device, can be used to provide an authenticating signature to the seal.
  Type: Grant
  Filed: August 31, 2022
  Date of Patent: July 15, 2025
  Assignee: Snap Inc.
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
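As an illustrative aside (not taken from the patent; the names, fields, and hashing scheme below are hypothetical), a minimal Python sketch of how user-specific data recorded with a stamping gesture might be combined into an authenticating signature, with the session completing only once every participant has stamped:

```python
import hashlib
import json
from dataclasses import dataclass, asdict


@dataclass
class StampData:
    """User-specific data captured when a participant 'stamps' the object."""
    user_id: str
    device_info: str          # e.g. model / OS string from the client device
    profile_summary: str      # participant profile information
    biometric_signal: str     # e.g. fingerprint hash or heart-rate sample


def seal_signature(object_id: str, stamp: StampData) -> str:
    """Derive a per-user authenticating signature for the collaborative object."""
    payload = json.dumps({"object": object_id, **asdict(stamp)}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def object_sealed(stamps: list[StampData], participants: set[str]) -> bool:
    """The session completes only after every participant has stamped the object."""
    return {s.user_id for s in stamps} >= participants


# Example: two participants stamp a shared "time capsule" object.
alice = StampData("alice", "PhoneModelX/OS 17", "alice-profile", "hr:72bpm")
bob = StampData("bob", "PhoneModelY/OS 14", "bob-profile", "fp:ab12cd")
print(seal_signature("capsule-001", alice))
print(object_sealed([alice, bob], {"alice", "bob"}))  # True once all have stamped
```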
- Publication number: 20250200201
  Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
  Type: Application
  Filed: March 3, 2025
  Publication date: June 19, 2025
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
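For illustration only, a toy Python sketch (not the claimed implementation; the class and method names are hypothetical) of the join-and-approve flow the abstract describes, where a new participant waits in a pending state until the host or any approved participant grants access:

```python
from dataclasses import dataclass, field


@dataclass
class Session:
    """Toy model of the host/approval flow described in the abstract."""
    host: str
    approved: set[str] = field(default_factory=set)
    pending: set[str] = field(default_factory=set)

    def __post_init__(self) -> None:
        self.approved.add(self.host)

    def join(self, user: str) -> None:
        # A joining participant is placed in a waiting state, and the
        # approved participants are notified that approval is needed.
        self.pending.add(user)
        print(f"notify {sorted(self.approved)}: {user} is waiting for access")

    def grant(self, approver: str, user: str) -> None:
        # The host or any approved participant can grant access
        # to the session's added virtual content.
        if approver in self.approved and user in self.pending:
            self.pending.discard(user)
            self.approved.add(user)

    def can_view_content(self, user: str) -> bool:
        return user in self.approved


session = Session(host="host")
session.join("guest")
session.grant("host", "guest")
print(session.can_view_content("guest"))  # True
```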
- Publication number: 20250173983
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for creating a handcrafted AR experience. The handcrafted AR experiences are created by capturing images of a scene, identifying an object receiving surface and corresponding surface coordinates, identifying a customizable AR primary object associated with at least one set of primary object coordinates, generating AR overlays including the customizable AR primary object for positioning adjacent the object receiving surface, presenting the AR overlays, receiving customization commands, generating handcrafted AR overlays including customizations associated with the customizable AR primary object responsive to the customization commands, presenting the handcrafted AR overlays, recording the handcrafted AR overlays, creating a handcrafted AR file including the recorded overlays, and transmitting the handcrafted AR file.
  Type: Application
  Filed: January 24, 2025
  Publication date: May 29, 2025
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
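A minimal sketch, assuming nothing beyond the abstract (the data shapes and function below are hypothetical), of how customization commands applied to a primary object positioned on a receiving surface could be recorded and packaged into a transmittable "handcrafted AR" file:

```python
from dataclasses import dataclass, field


@dataclass
class Overlay:
    """One recorded overlay state: the primary object plus its customizations."""
    object_name: str
    surface_coords: tuple[float, float, float]
    customizations: list[str] = field(default_factory=list)


@dataclass
class HandcraftedARFile:
    """Recorded overlays packaged for transmission, as the abstract outlines."""
    recorded_overlays: list[Overlay] = field(default_factory=list)


def build_handcrafted_ar(surface_coords, primary_object, commands) -> HandcraftedARFile:
    # 1) position the customizable primary object adjacent the receiving surface
    overlay = Overlay(primary_object, surface_coords)
    recording = [overlay]
    # 2) apply each customization command and record the resulting overlay state
    for command in commands:
        overlay = Overlay(primary_object, surface_coords,
                          overlay.customizations + [command])
        recording.append(overlay)
    # 3) package the recording into a file ready to transmit
    return HandcraftedARFile(recorded_overlays=recording)


ar_file = build_handcrafted_ar((0.0, 1.2, -0.5), "greeting_card", ["add_glitter", "recolor"])
print(len(ar_file.recorded_overlays))  # 3 recorded overlay states
```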
- Patent number: 12299825
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for capturing and presenting effort put into generating a handcrafted AR experience. AR object generation data is captured during the generation of a handcrafted AR object. The AR object generation data is then processed to generate proof of effort data for inclusion with the handcrafted AR object. Examples of proof of effort include a time lapse view of the steps taken during generation of the AR object and statistics such as total time spent, number of images or songs considered for selection, number of actions implemented, etc.
  Type: Grant
  Filed: August 16, 2022
  Date of Patent: May 13, 2025
  Assignee: Snap Inc.
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
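Purely as an illustration (the event names and summary fields are assumptions, not from the patent), a short Python sketch of turning captured generation events into the kind of "proof of effort" statistics and time-lapse the abstract mentions:

```python
from dataclasses import dataclass


@dataclass
class GenerationEvent:
    timestamp: float   # seconds since the editing session started
    action: str        # e.g. "place_object", "consider_image", "consider_song"


def proof_of_effort(events: list[GenerationEvent]) -> dict:
    """Summarize captured generation data into 'proof of effort' statistics."""
    if not events:
        return {"total_time_s": 0.0, "actions": 0,
                "images_considered": 0, "songs_considered": 0, "time_lapse": []}
    return {
        "total_time_s": events[-1].timestamp - events[0].timestamp,
        "actions": len(events),
        "images_considered": sum(e.action == "consider_image" for e in events),
        "songs_considered": sum(e.action == "consider_song" for e in events),
        # a time-lapse view could replay these same steps in order
        "time_lapse": [e.action for e in events],
    }


events = [GenerationEvent(0.0, "place_object"),
          GenerationEvent(12.5, "consider_image"),
          GenerationEvent(40.0, "consider_song")]
print(proof_of_effort(events))
```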
- Patent number: 12299150
  Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
  Type: Grant
  Filed: August 31, 2022
  Date of Patent: May 13, 2025
  Assignee: Snap Inc.
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
- Patent number: 12284324
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for generating AR experiences including interactions with virtual or physical prop objects. The AR experiences are generated by capturing images of a scene with a camera system, identifying an object receiving surface and corresponding surface coordinates within the scene, identifying an AR primary object and a prop object (physical or virtual), establishing a logical connection between the AR primary object and the prop object, generating AR overlays including actions associated with the AR primary object responsive to commands received via a user input system that position the AR primary object adjacent the object receiving surface responsive to the primary object coordinates and the surface coordinates within the scene and that position the AR primary object and the prop object with respect to one another in accordance with the logical connection, and presenting the generated AR overlays with a display system.
  Type: Grant
  Filed: August 16, 2022
  Date of Patent: April 22, 2025
  Assignee: Snap Inc.
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
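A rough sketch under stated assumptions (the offset-based "logical connection" below is one possible reading of the abstract, not the claimed method; all names are hypothetical) of positioning a primary object on a receiving surface and keeping a prop positioned relative to it:

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]


@dataclass
class PlacedObject:
    name: str
    position: Vec3


def place_on_surface(primary: PlacedObject, surface_point: Vec3) -> PlacedObject:
    """Position the primary object adjacent the identified receiving surface."""
    return PlacedObject(primary.name, surface_point)


def attach(primary: PlacedObject, prop: PlacedObject, offset: Vec3) -> PlacedObject:
    """Keep the prop positioned relative to the primary object per a logical connection."""
    x, y, z = primary.position
    dx, dy, dz = offset
    return PlacedObject(prop.name, (x + dx, y + dy, z + dz))


character = place_on_surface(PlacedObject("character", (0.0, 0.0, 0.0)), (0.2, 0.0, -1.0))
umbrella = attach(character, PlacedObject("umbrella", (0.0, 0.0, 0.0)), offset=(0.0, 0.3, 0.0))
print(umbrella.position)  # (0.2, 0.3, -1.0): the prop tracks the primary object
```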
- Publication number: 20250060869
  Abstract: Systems and methods for using stencils for multitouch interactive exploration are disclosed. In one embodiment, a method may include: (1) receiving, by a computer program executed by an electronic device, an identification of a stencil being used over a touch-sensitive interface associated with the electronic device, the stencil comprising a plurality of physical markers; (2) graphically presenting, by the computer program, a graphical representation of data on the touch-sensitive interface; (3) receiving, by the computer program, a user touch interaction with the touch-sensitive interface; (4) providing, by the computer program, feedback for the user touch interaction; (5) determining, by the computer program, a location of the user touch interaction on the stencil; (6) identifying, by the computer program, an element at the location; and (7) audibly providing, by the computer program, information on the element.
  Type: Application
  Filed: August 16, 2023
  Publication date: February 20, 2025
  Inventors: David Saffo, Ricardo Gonzalez, Fannie Liu, Blair MacIntyre
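To make steps (5)-(7) concrete, here is a minimal Python sketch (the data layout and spoken-text format are assumptions for illustration, not from the application) that maps a touch location on a stencil to the data element beneath it and produces the text that would be read aloud:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class StencilElement:
    label: str
    value: float
    # normalized bounds on the touch surface: (x_min, y_min, x_max, y_max)
    bounds: tuple[float, float, float, float]


def element_at(stencil: list[StencilElement], x: float, y: float) -> Optional[StencilElement]:
    """Map a touch location on the stencil to the chart element underneath it."""
    for element in stencil:
        x0, y0, x1, y1 = element.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return element
    return None


def on_touch(stencil: list[StencilElement], x: float, y: float) -> str:
    """Produce the text that would be spoken aloud for the touched element."""
    element = element_at(stencil, x, y)
    if element is None:
        return "no data under this part of the stencil"
    return f"{element.label}: {element.value}"


bar_chart = [StencilElement("Q1 sales", 12.0, (0.0, 0.0, 0.25, 1.0)),
             StencilElement("Q2 sales", 18.5, (0.25, 0.0, 0.5, 1.0))]
print(on_touch(bar_chart, 0.3, 0.4))  # "Q2 sales: 18.5"
```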
- Patent number: 12223602
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for creating a handcrafted AR experience. The handcrafted AR experiences are created by capturing images of a scene, identifying an object receiving surface and corresponding surface coordinates, identifying a customizable AR primary object associated with at least one set of primary object coordinates, generating AR overlays including the customizable AR primary object for positioning adjacent the object receiving surface, presenting the AR overlays, receiving customization commands, generating handcrafted AR overlays including customizations associated with the customizable AR primary object responsive to the customization commands, presenting the handcrafted AR overlays, recording the handcrafted AR overlays, creating a handcrafted AR file including the recorded overlays, and transmitting the handcrafted AR file.
  Type: Grant
  Filed: August 15, 2022
  Date of Patent: February 11, 2025
  Assignee: Snap Inc.
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
- Publication number: 20250036208
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for sending and receiving AR objects (e.g., customized AR objects) based on/responsive to interactions with the physical world. AR virtual delivery route overlays are generated responsive to selected virtual delivery routes and include the AR object and a delivery mode (air, tunnel, etc.) corresponding to the virtual delivery route. Physical world actions associated with the delivery mode (blowing adjacent an AR device or scratching a surface) result in sending a communication corresponding to the AR object for delivery to a receiver and generating AR sending overlays including the AR object moving in accordance with the delivery mode.
  Type: Application
  Filed: October 17, 2024
  Publication date: January 30, 2025
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
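As a hedged illustration (the trigger names and mode table are hypothetical; real detection of blowing or scratching would come from microphone and camera processing not shown here), a small Python sketch of gating the send on the physical-world action that matches the chosen delivery mode:

```python
from dataclasses import dataclass

# Hypothetical mapping of delivery modes to the physical-world action that triggers them.
TRIGGERS = {
    "air": "blow_detected",       # e.g. blowing adjacent the device's microphone
    "tunnel": "scratch_detected"  # e.g. scratching a surface seen by the camera
}


@dataclass
class ARDelivery:
    ar_object: str
    mode: str        # "air", "tunnel", ...
    receiver: str


def handle_physical_action(delivery: ARDelivery, action: str) -> bool:
    """Send the AR object only when the action matching its delivery mode occurs."""
    if TRIGGERS.get(delivery.mode) != action:
        return False
    # A full client would also render the AR sending overlay showing the object
    # moving according to the chosen delivery mode before transmitting it.
    print(f"sending '{delivery.ar_object}' to {delivery.receiver} via {delivery.mode}")
    return True


delivery = ARDelivery("paper_plane", "air", "friend@example")
print(handle_physical_action(delivery, "scratch_detected"))  # False: wrong action
print(handle_physical_action(delivery, "blow_detected"))     # True: sent
```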
- Patent number: 12169905
  Abstract: Systems and methods enable users to engage in meaningful, authentic, online interactions by extracting objects (virtual or real) from an image or video and transferring the extracted objects into a real-world environment of another user in three-dimensional augmented reality. An object to be sent from the first user to the second user is generated using a drawing application, extracted from an image or a video, or an AR object is captured. A video is recorded showing the object responding to an action of the first user and metadata is generated relating to the response of the object to the action of the first user. The video and metadata are sent to the second user. Upon receipt, the video is watched by the second user and the object is selected for presentation in a display environment of the second user. The state of the object is preserved by the metadata.
  Type: Grant
  Filed: November 13, 2023
  Date of Patent: December 17, 2024
  Assignee: Snap Inc.
  Inventors: Fannie Liu, Rajan Vaish
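For illustration only (the field names and packaging format are assumptions, not the patented format), a Python sketch of bundling the recorded video with metadata that preserves the object's state so the receiver can restore it exactly as the sender left it:

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class ObjectState:
    """State of the extracted object after it responded to the sender's action."""
    name: str
    position: tuple[float, float, float]
    orientation_deg: float
    last_action: str


def package_message(video_path: str, state: ObjectState) -> dict:
    """Bundle the recorded video with metadata that preserves the object's state."""
    return {"video": video_path, "metadata": asdict(state)}


def restore_object(message: dict) -> ObjectState:
    """On the receiving side, rebuild the object exactly as the sender left it."""
    return ObjectState(**message["metadata"])


sent = package_message("throw_clip.mp4",
                       ObjectState("paper_ball", (0.4, 1.1, -2.0), 35.0, "thrown"))
print(json.dumps(sent, indent=2))
print(restore_object(sent).last_action)  # "thrown"
```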
- Publication number: 20240393885
  Abstract: Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content by use of a hand gesture to produce an image segment that can be associated to the collaborative object. The hand gesture resembles a pair of scissors and the camera and processor of the client device track a path of the hand gesture to identify an object within a displayed image to create virtual content of the identified object. The virtual content created by the hand gesture is then associated to the collaborative object.
  Type: Application
  Filed: July 31, 2024
  Publication date: November 28, 2024
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
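To illustrate the idea (not the claimed pipeline; the detector output and point-in-polygon selection below are simplifying assumptions), a short Python sketch that treats the tracked "scissors" path as a closed outline and picks the detected object enclosed by it:

```python
from dataclasses import dataclass
from typing import Optional

Point = tuple[float, float]


@dataclass
class VirtualContent:
    label: str
    outline: list[Point]


def point_in_polygon(p: Point, polygon: list[Point]) -> bool:
    """Ray-casting test: is the point inside the path traced by the gesture?"""
    x, y = p
    inside = False
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside


def crop_with_gesture(gesture_path: list[Point],
                      detections: dict[str, Point]) -> Optional[VirtualContent]:
    """Pick the detected object whose center falls inside the traced path."""
    for label, center in detections.items():
        if point_in_polygon(center, gesture_path):
            return VirtualContent(label, gesture_path)
    return None


path = [(0.1, 0.1), (0.6, 0.1), (0.6, 0.6), (0.1, 0.6)]   # tracked gesture path
detections = {"mug": (0.3, 0.3), "lamp": (0.8, 0.8)}       # hypothetical detector output
print(crop_with_gesture(path, detections))                 # crops the mug, not the lamp
```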
- Patent number: 12148114
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object with an associated material and added virtual content is provided to users. In one example of the collaborative session, a user selects the associated material of the collaborative object. Physical characteristics are assigned to the collaborative object as a function of the associated material to be perceived by the participants when the collaborative object is manipulated. In one example, the material associated to the collaborative object is metal, wherein the interaction between the users and the collaborative object generates a response of the collaborative object that is indicative of the physical properties of metal, such as inertial, acoustic, and malleability.
  Type: Grant
  Filed: August 31, 2022
  Date of Patent: November 19, 2024
  Assignee: Snap Inc.
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
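A minimal sketch, assuming a simple property table (the material presets and response formula are hypothetical, not from the patent), of how a selected material could shape the object's inertial, acoustic, and malleability response when participants manipulate it:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Material:
    name: str
    inertia: float       # resistance to being pushed around
    acoustic: str        # sound played when the object is struck
    malleability: float  # 0 = rigid, 1 = easily deformed


# Hypothetical material presets a user might pick for the collaborative object.
MATERIALS = {
    "metal": Material("metal", inertia=0.9, acoustic="clang", malleability=0.1),
    "clay":  Material("clay",  inertia=0.3, acoustic="thud",  malleability=0.9),
}


def respond_to_push(material: Material, push_strength: float) -> dict:
    """Resolve how the object reacts when a participant manipulates it."""
    return {
        "displacement": push_strength * (1.0 - material.inertia),
        "deformation": push_strength * material.malleability,
        "sound": material.acoustic,
    }


capsule_material = MATERIALS["metal"]
print(respond_to_push(capsule_material, push_strength=1.0))
# metal barely moves or deforms but rings with a 'clang'
```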
- Patent number: 12141363
  Abstract: Augmented reality (AR) systems, devices, media, and methods are described for sending and receiving AR objects (e.g., customized AR objects) based on/responsive to interactions with the physical world. AR virtual delivery route overlays are generated responsive to selected virtual delivery routes and include the AR object and a delivery mode (air, tunnel, etc.) corresponding to the virtual delivery route. Physical world actions associated with the delivery mode (blowing adjacent an AR device or scratching a surface) result in sending a communication corresponding to the AR object for delivery to a receiver and generating AR sending overlays including the AR object moving in accordance with the delivery mode.
  Type: Grant
  Filed: August 15, 2022
  Date of Patent: November 12, 2024
  Assignee: Snap Inc.
  Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
- Publication number: 20240320353
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using an application such as lens cloud feature; available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
  Type: Application
  Filed: May 31, 2024
  Publication date: September 26, 2024
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
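Purely for illustration (the record fields and rendering as text lines are assumptions, not the patented presentation), a brief Python sketch of ordering each contribution by its recorded timestamp and replaying the collaborative object's history as a timelapse:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Contribution:
    author: str
    description: str
    added_at: datetime   # timestamp recorded when the virtual content was received


def timelapse(contributions: list[Contribution]) -> list[str]:
    """Replay the collaborative object's history in timestamp order."""
    ordered = sorted(contributions, key=lambda c: c.added_at)
    return [f"{c.added_at.isoformat()} {c.author}: {c.description}" for c in ordered]


capsule = [
    Contribution("bob", "added a voice note", datetime(2024, 3, 2, 18, 30)),
    Contribution("alice", "added a beach photo", datetime(2024, 3, 1, 9, 15)),
]
for frame in timelapse(capsule):
    print(frame)
```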
- Patent number: 12079395
  Abstract: Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content by use of a hand gesture to produce an image segment that can be associated to the collaborative object. The hand gesture resembles a pair of scissors and the camera and processor of the client device track a path of the hand gesture to identify an object within a displayed image to create virtual content of the identified object. The virtual content created by the hand gesture is then associated to the collaborative object.
  Type: Grant
  Filed: August 31, 2022
  Date of Patent: September 3, 2024
  Assignee: Snap Inc.
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
- Patent number: 12019773
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using an application such as lens cloud feature; available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
  Type: Grant
  Filed: August 31, 2022
  Date of Patent: June 25, 2024
  Assignee: Snap Inc.
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
- Publication number: 20240087257
  Abstract: Systems and methods enable users to engage in meaningful, authentic, online interactions by extracting objects (virtual or real) from an image or video and transferring the extracted objects into a real-world environment of another user in three-dimensional augmented reality. An object to be sent from the first user to the second user is generated using a drawing application, extracted from an image or a video, or an AR object is captured. A video is recorded showing the object responding to an action of the first user and metadata is generated relating to the response of the object to the action of the first user. The video and metadata are sent to the second user. Upon receipt, the video is watched by the second user and the object is selected for presentation in a display environment of the second user. The state of the object is preserved by the metadata.
  Type: Application
  Filed: November 13, 2023
  Publication date: March 14, 2024
  Inventors: Fannie Liu, Rajan Vaish
- Publication number: 20240070302
  Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
  Type: Application
  Filed: August 31, 2022
  Publication date: February 29, 2024
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
- Publication number: 20240070243
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device, can be used to provide an authenticating signature to the seal.
  Type: Application
  Filed: August 31, 2022
  Publication date: February 29, 2024
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
- Publication number: 20240070301
  Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using an application such as lens cloud feature; available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
  Type: Application
  Filed: August 31, 2022
  Publication date: February 29, 2024
  Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish