Patents by Inventor Rajan Vaish

Rajan Vaish has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972521
    Abstract: Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.
    Type: Grant
    Filed: August 31, 2022
    Date of Patent: April 30, 2024
    Assignee: Snap Inc.
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
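The abstract above describes storing device state data alongside volumetric content and re-applying it to network-connected devices when the content is presented. A minimal Python sketch of that idea follows; all names (`DeviceState`, `SmartLamp`, `present_volumetric_content`) are illustrative and do not come from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class DeviceState:
    """Snapshot of one network-connected device's settings."""
    device_id: str
    settings: dict = field(default_factory=dict)

class SmartLamp:
    """Stand-in for a network-connected device; set() would issue a network call."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.applied = {}

    def set(self, **settings):
        self.applied.update(settings)

def present_volumetric_content(content_id, state_store, devices):
    """Look up the device state recorded with the volumetric content and
    re-apply it to each matching device as the content is presented."""
    configured = []
    for state in state_store.get(content_id, []):
        device = devices.get(state.device_id)
        if device is not None:
            device.set(**state.settings)
            configured.append(device.device_id)
    # rendering of the volumetric overlay would happen here
    return configured
```

In this sketch the state store is keyed by content ID, so selecting a piece of volumetric content both triggers its display and restores the room's devices to the recorded state.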
  • Patent number: 11954774
    Abstract: Systems and methods enable users to build augmented reality (AR) experiences with Internet of Things (IoT) devices. The system includes an AR object studio that includes a list of IoT devices and control signals for the respective IoT devices and a list of AR objects (e.g., an AR lens). The AR object studio receives selections from users and correlates at least one IoT device to at least one AR object in response to the user selections. During use, a server receives an indication that an AR object has been activated and interacted with on a display of an AR camera device and, in response, sends a control signal to a correlated IoT device. Conversely, the server may receive a signal from an IoT device and, in response, present and control a correlated AR object on the display of the AR camera device.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: April 9, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
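The correlation described above (AR objects mapped to IoT devices, with events routed in both directions) can be sketched as a small lookup structure. The class and method names here are hypothetical, not from the patent:

```python
class ARObjectStudio:
    """Minimal sketch: correlate AR objects with IoT devices, then route events."""
    def __init__(self):
        self.ar_to_iot = {}   # AR object id -> IoT device id
        self.iot_to_ar = {}   # IoT device id -> AR object id

    def correlate(self, ar_object_id, iot_device_id):
        """Record a user-selected pairing between an AR object and an IoT device."""
        self.ar_to_iot[ar_object_id] = iot_device_id
        self.iot_to_ar[iot_device_id] = ar_object_id

    def on_ar_interaction(self, ar_object_id, send_control):
        """An AR object was activated on a camera device: signal the paired IoT device."""
        device_id = self.ar_to_iot.get(ar_object_id)
        if device_id is not None:
            send_control(device_id, "activate")
        return device_id

    def on_iot_signal(self, iot_device_id):
        """An IoT device signalled: return the AR object to present and control."""
        return self.iot_to_ar.get(iot_device_id)
```

The two dictionaries capture the bidirectional behavior in the abstract: AR interactions drive control signals to devices, and device signals drive presentation of the correlated AR object.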
  • Patent number: 11941231
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: March 26, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
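The abstract above maps recognized action identifiers to IoT commands and lets context information modify the command. A minimal sketch under assumed identifiers (the gesture names and context keys are invented for illustration):

```python
def build_command(action_id, context=None):
    """Map a recognized gesture/body-position identifier to an IoT command,
    letting context info (e.g., how far the hand moved) modify it."""
    base = {"point_up": "turn_on", "point_down": "turn_off", "swipe": "set_level"}
    command = {"action": base.get(action_id, "noop")}
    if context and command["action"] == "set_level":
        # scale a 0.0-1.0 swipe distance into a 0-100 level for the device
        command["level"] = round(100 * max(0.0, min(1.0, context.get("distance", 0.0))))
    return command
```

A simple discrete gesture yields a fixed command, while a continuous gesture uses its context (here, swipe distance) to parameterize the command sent to the paired device.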
  • Patent number: 11935198
    Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, a second device associated with the first user presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: March 19, 2024
    Assignee: Snap Inc.
    Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
  • Publication number: 20240087257
    Abstract: Systems and methods enable users to engage in meaningful, authentic, online interactions by extracting objects (virtual or real) from an image or video and transferring the extracted objects into a real-world environment of another user in three-dimensional augmented reality. An object to be sent from the first user to the second user is generated using a drawing application, extracted from an image or a video, or captured as an AR object. A video is recorded showing the object responding to an action of the first user, and metadata is generated relating to the response of the object to the action of the first user. The video and metadata are sent to the second user. Upon receipt, the video is watched by the second user and the object is selected for presentation in a display environment of the second user. The state of the object is preserved by the metadata.
    Type: Application
    Filed: November 13, 2023
    Publication date: March 14, 2024
    Inventors: Fannie Liu, Rajan Vaish
  • Publication number: 20240077933
    Abstract: A method for detecting full-body gestures by a mobile device includes a host mobile device detecting the tracked body of a co-located participant in a multi-party session. When a participant's tracked body performs a full-body gesture, the host mobile device recognizes the gesture. The host mobile device iterates through the list of participants in the multi-party session and finds the closest participant mobile device with respect to the screen-space position of the head of the gesturing participant. The host mobile device then obtains the user ID of the closest participant mobile device and broadcasts the recognized full-body gesture event to all co-located participants in the multi-party session, along with the obtained user ID. Each participant's mobile device may then handle the gesture event as appropriate for the multi-party session. For example, a character or costume may be assigned to a gesturing participant.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
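The gesture-attribution step described above (find the participant device closest to the gesturing head in screen space, then broadcast the event with that user's ID) can be sketched as follows; the function names and message format are illustrative, not from the publication:

```python
import math

def closest_participant(head_xy, participants):
    """participants: list of (user_id, (x, y)) screen-space device positions.
    Return the user ID whose device is nearest the gesturing head."""
    best_id, best_dist = None, math.inf
    for user_id, (x, y) in participants:
        d = math.hypot(x - head_xy[0], y - head_xy[1])
        if d < best_dist:
            best_id, best_dist = user_id, d
    return best_id

def broadcast_gesture(gesture, head_xy, participants, send):
    """Attribute the gesture to the closest participant, then broadcast the
    gesture event with that user ID to every co-located participant."""
    user_id = closest_participant(head_xy, participants)
    for recipient, _ in participants:
        send(recipient, {"gesture": gesture, "user_id": user_id})
    return user_id
```

Each receiving device can then handle the event however the session requires, such as assigning a character or costume to the attributed user.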
  • Publication number: 20240077934
    Abstract: Described are virtual AR interfaces for generating a virtual rotational interface for the purpose of controlling connected IoT devices using the inertial measurement unit (IMU) of a portable electronic device. The IMU control application enables a user of a portable electronic device to activate a virtual rotational interface overlay on a display and adjust a feature of a connected IoT product by rotating a portable electronic device. The device IMU moves a slider on the virtual rotational interface. The IMU control application sends a control signal to the IoT product which executes an action in accordance with the slider position. The virtual rotational interface is presented on the display as a virtual object in an AR environment. The IMU control application detects the device orientation (in the physical environment) and in response presents a corresponding slider element on the virtual rotational interface (in the AR environment).
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
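The virtual rotational interface described above maps device orientation from the IMU onto a slider that drives an IoT control signal. A minimal sketch, assuming a roll angle in degrees and an invented message format (none of these names come from the publication):

```python
def roll_to_slider(roll_degrees, min_roll=-90.0, max_roll=90.0):
    """Map the device's IMU roll angle onto a 0.0-1.0 slider position,
    clamping rotation outside the usable range."""
    clamped = max(min_roll, min(max_roll, roll_degrees))
    return (clamped - min_roll) / (max_roll - min_roll)

def control_signal(product_id, feature, roll_degrees):
    """Build the control message sent to the IoT product so it can execute
    an action in accordance with the slider position."""
    return {"product": product_id, "feature": feature,
            "value": round(100 * roll_to_slider(roll_degrees))}
```

Holding the device level puts the slider at the midpoint, and rotating toward either limit sweeps the controlled feature (e.g., brightness) across its full range.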
  • Publication number: 20240079031
    Abstract: Described are authoring tools for creating interactive AR experiences. The story-authoring application enables a user with little or no programming skills to create an interactive story that includes recording voice commands for advancing to the next scene, inserting and manipulating virtual objects in a mixed-reality environment, and recording a variety of interactions with connected IoT devices. The story creation interface is presented on the display as a virtual object in an AR environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Daekun Kim, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077935
    Abstract: A virtual interface application presented in augmented reality (AR) is described for controlling Internet of Things (IoT) products. The virtual interface application enables a user of a portable electronic device to activate a virtual control interface overlay on a display, receive a selection from the user using her hands or feet, and send a control signal to a nearby IoT product which executes an action in accordance with the selection. The virtual control interface is presented on the display as a virtual object in an AR environment. The virtual interface application includes a foot tracking tool for detecting an intersection between the foot location (in the physical environment) and the virtual surface position (in the AR environment). When an intersection is detected, the virtual interface application sends a control signal with instructions to the IoT product.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Youjean Cho, Lei Zhang, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
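The foot-tracking tool described above detects an intersection between the tracked foot location in the physical environment and a virtual surface in the AR environment. A minimal sketch, modeling the virtual surface as an axis-aligned rectangle at a fixed height (the geometry and parameter names are assumptions for illustration):

```python
def foot_intersects(foot_xyz, surface_center, surface_size, tolerance=0.05):
    """Return True if the tracked foot position lies on the virtual surface:
    within a vertical tolerance of the surface height and inside its bounds.
    Coordinates are (x, y, z) with y up; surface_size is (width, depth)."""
    fx, fy, fz = foot_xyz
    cx, cy, cz = surface_center
    width, depth = surface_size
    return (abs(fy - cy) <= tolerance
            and cx - width / 2 <= fx <= cx + width / 2
            and cz - depth / 2 <= fz <= cz + depth / 2)
```

When this check returns True for a surface bound to a control, the application would send the corresponding control signal to the IoT product.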
  • Publication number: 20240078759
    Abstract: Multi-player co-located AR experiences are augmented by assigning characters and costumes to respective participants (a.k.a. “users” of AR-enabled mobile devices) in multi-player AR sessions for storytelling, play acting, and the like. Body tracking technology and augmented reality (AR) software are used to decorate the bodies of the co-located participants with virtual costumes within the context of the multi-player co-located AR experiences. Tracked bodies are distinguished to determine which body belongs to which user and hence which virtual costume belongs to which tracked body so that corresponding costumes may be assigned for display in augmented reality. A host-guest mechanism is used for networked assignment of characters and corresponding costumes in the co-located multi-player AR session. Body tracking technology is used to move the costume with the body as movement of the assigned body is detected.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077983
    Abstract: Recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077984
    Abstract: Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Ava Robinson, Daekun Kim, Youjean Cho, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Patent number: 11918888
    Abstract: Systems, methods, and computer readable media directed to multi-user visual experiences such as interactive gaming experiences and artistic media experiences. A viewing electronic device includes a camera configured to capture images, a display, and a processor coupled to the camera and the display. The processor is configured to capture, with the camera, images of a rotating marker where the rotating marker is presented on a monitor of a remote device and to present, on the display, a visual experience where the visual experience has an adjustable feature. The processor detects a parameter of the rotating marker from the captured images that corresponds to the adjustable feature and updates the visual experience responsive to the detected parameter. The parameter may be one or more of an angle, speed of rotation, direction of rotation, color, or pattern of the rotating marker.
    Type: Grant
    Filed: May 31, 2022
    Date of Patent: March 5, 2024
    Assignee: Snap Inc.
    Inventors: Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
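The abstract above detects parameters of a rotating marker, such as speed and direction of rotation, from successive captured images. A minimal sketch of estimating those two parameters from two consecutive marker angles (the function and its conventions are invented for illustration):

```python
def marker_rotation(prev_angle, curr_angle, dt):
    """Estimate rotation speed (degrees/second) and direction from two
    successive marker angles (degrees) detected in frames dt seconds apart."""
    # shortest signed arc between the two angles, in (-180, 180]
    delta = (curr_angle - prev_angle + 180.0) % 360.0 - 180.0
    speed = abs(delta) / dt
    direction = "cw" if delta > 0 else "ccw" if delta < 0 else "none"
    return speed, direction
```

The wrap-around handling matters when the marker crosses 0°/360° between frames; the detected speed and direction would then drive the adjustable feature of the visual experience.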
  • Publication number: 20240070243
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device can be used to provide an authenticating signature to the seal.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070298
    Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070301
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (which may be recorded using an application such as a lens cloud feature, available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240069643
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a user interacts with the collaborative object using hand gestures. The virtual content associated with the collaborative object can be accessed with an opening hand gesture, and the virtual content can be hidden with a closing hand gesture. The hand gestures are detected by cameras of a client device used by the user. The collaborative object can be moved and manipulated using a pointing gesture, and its new position can be confirmed by tilting the client device of the user.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070300
    Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users by a collaborative system. The system receives a request from a user to join a session, and associates a timestamp with the user corresponding to receipt of the request.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240069637
    Abstract: The present disclosure relates to methods and systems for providing a touch-based augmented reality (AR) experience. During a capture phase, a first user may grip an object. An intensity of a force applied on the object in the grip and/or a duration of the grip may be recorded. A volumetric representation of the first user holding the object may also be captured. During an experience phase, a second user may touch the object, the object may provide haptic feedback (e.g., a vibration) to the second user at an intensity and a duration corresponding to an intensity of the force applied on the object and a duration of the grip of the object. If a volumetric representation of the first user holding the object is captured, touching the object may also cause a presentation of the first user's volumetric body that holds the object.
    Type: Application
    Filed: November 16, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
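The capture/experience phases described above record the force and duration of a grip and later replay them as haptic feedback. A minimal sketch of that mapping; the sample format, force scale, and function names are assumptions, not from the publication:

```python
def record_grip(samples):
    """samples: (timestamp_seconds, force_newtons) pairs captured while the
    first user grips the object. Returns duration and peak force to replay."""
    if not samples:
        return {"duration": 0.0, "peak_force": 0.0}
    times = [t for t, _ in samples]
    return {"duration": max(times) - min(times),
            "peak_force": max(f for _, f in samples)}

def haptic_playback(grip, max_force=50.0):
    """Map a recorded grip onto a vibration: intensity in 0.0-1.0 scaled by
    peak force, played for the recorded grip duration."""
    return {"intensity": min(1.0, grip["peak_force"] / max_force),
            "duration": grip["duration"]}
```

When the second user touches the object, the vibration parameters returned here would drive the haptic actuator, optionally alongside the volumetric presentation of the first user's grip.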
  • Publication number: 20240073404
    Abstract: A display device presents volumetric content comprising a volumetric video. The volumetric video comprises a volumetric representation of one or more elements of a three-dimensional space. Input indicative of a control operation associated with the presentation of the volumetric video is received. The presentation of the volumetric video by the display device is controlled by executing the control operation. While the control operation is being executed, the volumetric representation of the one or more elements of the three-dimensional space is displayed from multiple perspectives based on movement of a user.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith