Patents by Inventor Rajan Vaish

Rajan Vaish is a named inventor on the following patent filings. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230062433
    Abstract: Eyewear configured to control an unmanned aerial vehicle (UAV). In one example, a user interacts with the eyewear to generate control signals that are transmitted to the UAV to control its flight path, speed, and orientation, and to communicate other instructions to the UAV. The user controls the UAV through an input of the eyewear, such as a touchpad, a microphone, a head-movement tracker, or a camera. The user can also configure and customize the eyewear to send specific control signals to the UAV as a function of user actions, including specific head movements and head gestures, which allows the user to control the UAV in a more natural and convenient way.
    Type: Application
    Filed: August 30, 2021
    Publication date: March 2, 2023
    Inventors: Terek Judi, Rajan Vaish
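    Sketch (not from the filing): A minimal Python illustration of mapping head gestures reported by the eyewear to UAV control signals, in the spirit of the abstract above. All names (HeadGesture, UavCommand, command_for) and the gesture-to-command table are hypothetical; the filing does not disclose an implementation.
      from dataclasses import dataclass
      from enum import Enum, auto
      from typing import Optional

      class HeadGesture(Enum):        # gestures the eyewear's head-movement tracker might report
          NOD = auto()
          TILT_LEFT = auto()
          TILT_RIGHT = auto()
          LOOK_UP = auto()
          LOOK_DOWN = auto()

      @dataclass
      class UavCommand:               # a control signal transmitted to the UAV
          action: str
          value: float = 0.0

      # user-configurable mapping from head gestures to control signals
      GESTURE_MAP = {
          HeadGesture.NOD: UavCommand("take_photo"),
          HeadGesture.TILT_LEFT: UavCommand("yaw", -15.0),
          HeadGesture.TILT_RIGHT: UavCommand("yaw", 15.0),
          HeadGesture.LOOK_UP: UavCommand("climb", 1.0),
          HeadGesture.LOOK_DOWN: UavCommand("descend", 1.0),
      }

      def command_for(gesture: HeadGesture) -> Optional[UavCommand]:
          """Translate a recognized head gesture into the command the user configured."""
          return GESTURE_MAP.get(gesture)

      if __name__ == "__main__":
          print(command_for(HeadGesture.TILT_LEFT))    # UavCommand(action='yaw', value=-15.0)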
  • Publication number: 20230062366
    Abstract: Augmented reality (AR) systems, devices, media, and methods are described for creating a handcrafted AR experience. The handcrafted AR experiences are created by capturing images of a scene, identifying an object receiving surface and corresponding surface coordinates, identifying a customizable AR primary object associated with at least one set of primary object coordinates, generating AR overlays including the customizable AR primary object for positioning adjacent the object receiving surface, presenting the AR overlays, receiving customization commands, generating handcrafted AR overlays including customizations associated with the customizable AR primary object responsive to the customization commands, presenting the handcrafted AR overlays, recording the handcrafted AR overlays, creating a handcrafted AR file including the recorded overlays, and transmitting the handcrafted AR file.
    Type: Application
    Filed: August 15, 2022
    Publication date: March 2, 2023
    Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
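    Sketch (not from the filing): The abstract above walks through a pipeline (capture a scene, identify a receiving surface, place a customizable AR primary object, apply customization commands, record, and package a handcrafted AR file). A minimal Python outline of that flow; every function and type name here is hypothetical, and surface detection and rendering are stubbed out.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class AROverlay:
          primary_object: str
          surface_coords: tuple
          customizations: List[str] = field(default_factory=list)

      def identify_receiving_surface(images):
          return (0.0, 0.0, 0.0)       # placeholder: real surface detection is not shown

      def render(overlay):
          return f"{overlay.primary_object} with {list(overlay.customizations)}"

      def build_handcrafted_ar(images, primary_object, commands):
          """Outline of the handcrafted-AR flow: find a surface, place the primary
          object, apply each customization command, then package the recording."""
          surface_coords = identify_receiving_surface(images)
          overlay = AROverlay(primary_object, surface_coords)
          recorded_frames = []
          for cmd in commands:                      # customization commands from the user
              overlay.customizations.append(cmd)
              recorded_frames.append(render(overlay))
          return {"overlays": recorded_frames, "final": overlay}   # the "handcrafted AR file"

      if __name__ == "__main__":
          print(build_handcrafted_ar([], "gift_box", ["paint_red", "add_ribbon"])["final"])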
  • Publication number: 20230068730
    Abstract: Systems and methods are described for providing social connection between users of first and second electronic eyewear devices. A first object in an environment of the first user is paired with a second object in a remote environment of the second user to create a communication connection between the first object and the second object. When the first object is within a field of view of the first electronic eyewear device, a communication is sent that invokes a preselected augmented reality (AR) object for display to the second user adjacent the second object when the second object is in the field of view of the second user. When the first object is within the field of view of the first electronic eyewear device for a predetermined duration, a communication may be sent that invokes a snapshot of an object or a user-selected AR object for display to the second user.
    Type: Application
    Filed: August 17, 2022
    Publication date: March 2, 2023
    Inventors: Hanseul Jun, Sven Kratz, Joanne Leong, Xingyu Liu, Andrés Monroy-Hernández, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
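    Sketch (not from the filing): A minimal Python illustration of the pairing-and-trigger idea in the abstract above: when a paired local object enters the wearer's field of view, a communication invokes an AR object adjacent to the remote paired object, and a longer dwell sends a richer payload. The pairing table, dwell threshold, and payload names are hypothetical.
      import time

      # pairing of a local object to a remote object (hypothetical identifiers)
      PAIRINGS = {"my_coffee_mug": "partners_coffee_mug"}
      DWELL_SECONDS = 3.0              # the "predetermined duration" from the abstract

      def on_field_of_view(object_id, entered_at, now, send):
          """Send an AR communication for a paired object that is in the field of view."""
          remote = PAIRINGS.get(object_id)
          if remote is None:
              return
          if now - entered_at >= DWELL_SECONDS:
              send({"anchor": remote, "payload": "snapshot_or_selected_ar"})   # long gaze
          else:
              send({"anchor": remote, "payload": "preselected_ar"})            # brief glance

      if __name__ == "__main__":
          t0 = time.time()
          on_field_of_view("my_coffee_mug", t0, t0 + 4.0, print)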
  • Publication number: 20230069328
    Abstract: Systems and methods are described for sending a snapshot message using an electronic eyewear device by a first user capturing an image, identifying a physical marker in the captured image, and determining that the physical marker in the captured image has been within a field of view of the electronic eyewear device for a predetermined amount of time. A 3D snapshot including an object is captured and, each time the identified object appears in a field of view of the electronic eyewear device, a representation of the object is sent for display to a second user as if the real object were sent. The representation of the object may include the snapshot image, the object extracted from the snapshot image, a stored representation of the object, augmented reality content representing the object, etc.
    Type: Application
    Filed: August 18, 2022
    Publication date: March 2, 2023
    Inventors: Hanseul Jun, Sven Kratz, Joanne Leong, Xingyu Liu, Andrés Monroy-Hernández, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Publication number: 20230063944
    Abstract: Systems and methods for controlling Internet of Things (IoT) devices using an augmented reality (AR) camera are provided. The system includes a sensor and a server that receives an input from the sensor and presents an AR object on the display of the AR camera device that corresponds to the input from the sensor. A response to the displayed AR object from a user of the AR camera device is used to select a command to send to one or more IoT devices to perform an action corresponding to the user response to the displayed AR object. In an example, an AR smoke object is overlaid on the AR camera display in response to a smoke detection signal from a smoke detector. In response to the user swiping or gesturing to push away the AR smoke object, a window opening command is sent to one or more IoT enabled windows.
    Type: Application
    Filed: August 29, 2021
    Publication date: March 2, 2023
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
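    Sketch (not from the filing): The abstract above describes a loop in which a sensor event is surfaced as an AR object and the user's response to that object selects the IoT command (the smoke-detector and window example). A minimal Python version of that selection logic; the table contents and function names are hypothetical.
      # map a sensor input to the AR object shown on the AR camera display (hypothetical)
      SENSOR_TO_AR = {"smoke_detected": "ar_smoke_cloud"}

      # map (AR object, user response) to the IoT command that should be sent
      RESPONSE_TO_COMMAND = {
          ("ar_smoke_cloud", "swipe_away"): ("window_1", "open"),
      }

      def handle_sensor_event(sensor_event, user_response, send_iot):
          """Show an AR object for the sensor event, then act on the user's response."""
          ar_object = SENSOR_TO_AR.get(sensor_event)
          if ar_object is None:
              return
          command = RESPONSE_TO_COMMAND.get((ar_object, user_response))
          if command is not None:
              device_id, action = command
              send_iot(device_id, action)          # e.g., open the IoT-enabled window

      if __name__ == "__main__":
          handle_sensor_event("smoke_detected", "swipe_away",
                              lambda dev, act: print(f"IoT command: {dev} -> {act}"))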
  • Publication number: 20230060838
    Abstract: Systems and methods are described for providing scan-based imaging using an electronic eyewear device. The methods include scanning a scene using the electronic eyewear device to capture at least one image in an environment of a first user and identifying at least one physical marker in the scanned scene. Upon identification of the at least one physical marker in the scanned scene, a message (e.g., preselected AR content) is passively sent either directly to a second user or to at least one physical marker at a remote location for presentation to the second user. The message may be sent without use of the first user's hands to make a selection. The physical markers may be associated with a fixed object, a human face, a pet, a vehicle, a person, a logo, and the like.
    Type: Application
    Filed: August 17, 2022
    Publication date: March 2, 2023
    Inventors: Hanseul Jun, Sven Kratz, Joanne Leong, Xingyu Liu, Andrés Monroy-Hernández, Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Publication number: 20230069042
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Application
    Filed: August 29, 2021
    Publication date: March 2, 2023
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
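    Sketch (not from the filing): A minimal Python illustration of the dispatch described in the abstract above: a pairing table establishes which recognized actions may control which IoT device, an incoming action identifier selects the command, and context information from the AR camera can modify it. All identifiers and the context rule are hypothetical.
      # pairing established ahead of time: action identifier -> (device, base command)
      PAIRED_ACTIONS = {
          "point_at_lamp": ("lamp_livingroom", "toggle"),
          "raise_hand": ("thermostat_hall", "set_temperature"),
      }

      def dispatch(action_id, context, send):
          """Send the command paired with a recognized gesture or pose, adjusted by context."""
          pairing = PAIRED_ACTIONS.get(action_id)
          if pairing is None:
              return
          device_id, command = pairing
          payload = {"command": command}
          if command == "set_temperature" and "hand_height" in context:
              # context from the AR camera modifies the command (higher hand -> warmer)
              payload["value"] = 18 + round(context["hand_height"] * 10)
          send(device_id, payload)

      if __name__ == "__main__":
          dispatch("raise_hand", {"hand_height": 0.6}, lambda dev, p: print(dev, p))
          # thermostat_hall {'command': 'set_temperature', 'value': 24}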
  • Publication number: 20230066318
    Abstract: Augmented reality (AR) systems, devices, media, and methods are described for capturing and presenting effort put into generating a handcrafted AR experience. AR object generation data is captured during the generation of a handcrafted AR object. The AR object generation data is then processed to generate proof of effort data for inclusion with the handcrafted AR object. Examples of proof of effort include a time lapse view of the steps taken during generation of the AR object and statistics such as total time spent, number of images or songs considered for selection, number of actions implemented, etc.
    Type: Application
    Filed: August 16, 2022
    Publication date: March 2, 2023
    Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
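    Sketch (not from the filing): The abstract above describes capturing generation data while a handcrafted AR object is made and reducing it to "proof of effort" (a time-lapse of steps plus statistics such as total time and number of actions). A minimal Python sketch with a hypothetical event schema.
      from dataclasses import dataclass

      @dataclass
      class GenerationEvent:           # one recorded editing step (hypothetical schema)
          timestamp: float             # seconds since the editing session started
          kind: str                    # e.g. "image_considered", "song_considered", "action"

      def proof_of_effort(events):
          """Summarize AR-object generation data into proof-of-effort statistics."""
          if not events:
              return {"total_time_s": 0.0, "actions": 0, "media_considered": 0, "timelapse": []}
          return {
              "total_time_s": events[-1].timestamp - events[0].timestamp,
              "actions": sum(e.kind == "action" for e in events),
              "media_considered": sum(e.kind.endswith("_considered") for e in events),
              "timelapse": [e.kind for e in events],     # ordered steps for a time-lapse view
          }

      if __name__ == "__main__":
          log = [GenerationEvent(0.0, "image_considered"),
                 GenerationEvent(4.5, "action"),
                 GenerationEvent(9.0, "song_considered"),
                 GenerationEvent(20.0, "action")]
          print(proof_of_effort(log))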
  • Publication number: 20230063194
    Abstract: Systems and methods for controlling an Internet of Things (IoT) device through interaction with an augmented reality (AR) object include pairing an AR object with an IoT device, presenting the AR object on a display of an AR camera device of a user, receiving an interaction signal indicating that the user has interacted with the AR object on the display of the AR camera device, and sending a control signal to the IoT device paired with the AR object in response to the interaction signal. A second user may request presentation of the AR object on the AR display of the AR camera device of the user when the user's AR camera device is located at particular world coordinates. Also, the control signal may be sent when a particular series of interactions with the AR object has been completed, as during game play.
    Type: Application
    Filed: August 29, 2021
    Publication date: March 2, 2023
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
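    Sketch (not from the filing): A minimal Python version of the control loop in the abstract above, in which interacting with a displayed AR object sends a control signal to the IoT device it is paired with, released only after a required series of interactions has been completed (as during game play). The class and device names are hypothetical.
      class ARSwitch:
          """An AR object paired with an IoT device; completing the required
          interaction sequence releases the control signal."""

          def __init__(self, device_id, required_sequence, send):
              self.device_id = device_id
              self.required = list(required_sequence)
              self.progress = []
              self.send = send

          def interact(self, interaction):
              self.progress.append(interaction)
              if self.progress[-len(self.required):] == self.required:
                  self.send(self.device_id, "activate")   # control signal to the paired device
                  self.progress.clear()

      if __name__ == "__main__":
          switch = ARSwitch("lamp_livingroom", ["tap", "tap", "swipe"],
                            lambda dev, sig: print(f"{sig} -> {dev}"))
          for gesture in ["tap", "tap", "swipe"]:
              switch.interact(gesture)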
  • Publication number: 20230060150
    Abstract: Augmented reality (AR) systems, devices, media, and methods are described for sending and receiving AR objects (e.g., customized AR objects) based on, and responsive to, interactions with the physical world. AR virtual delivery route overlays are generated responsive to selected virtual delivery routes and include the AR object and a delivery mode (air, tunnel, etc.) corresponding to the virtual delivery route. Physical world actions associated with the delivery mode (e.g., blowing adjacent to an AR device or scratching a surface) result in sending a communication corresponding to the AR object for delivery to a receiver and generating AR sending overlays including the AR object moving in accordance with the delivery mode.
    Type: Application
    Filed: August 15, 2022
    Publication date: March 2, 2023
    Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
  • Publication number: 20230063386
    Abstract: Eyewear configured to capture a video image of an object and to control an unmanned aerial vehicle (UAV) to capture a video image of the same object, producing stitched video images. The eyewear has an eyewear camera generating the eyewear video image, and the UAV has a UAV camera generating the UAV video image, and a processor timestamps the frames of each video image. When the eyewear begins recording the object, it controls the UAV camera to also begin recording the object. The stitched video images are synchronized to each other using the timestamps and can be displayed on an eyewear display or a remote display. The processor compensates for any latency between when the eyewear begins recording and when the UAV begins recording.
    Type: Application
    Filed: September 2, 2021
    Publication date: March 2, 2023
    Inventors: Terek Judi, Rajan Vaish
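    Sketch (not from the filing): The abstract above describes timestamping frames from the eyewear camera and the UAV camera and using the timestamps to synchronize the stitched video while compensating for start-of-recording latency. A minimal Python illustration of that alignment, with hypothetical frame records and tolerance.
      def align_streams(eyewear_frames, uav_frames, tolerance=1 / 60):
          """Pair eyewear and UAV frames by timestamp; the offset between the first
          frames of each stream absorbs the start-of-recording latency."""
          if not eyewear_frames or not uav_frames:
              return []
          latency = uav_frames[0]["t"] - eyewear_frames[0]["t"]    # UAV started this much later
          pairs = []
          for ef in eyewear_frames:
              target = ef["t"] + latency
              nearest = min(uav_frames, key=lambda uf: abs(uf["t"] - target))
              if abs(nearest["t"] - target) <= tolerance:
                  pairs.append((ef["id"], nearest["id"]))          # frames to stitch together
          return pairs

      if __name__ == "__main__":
          eye = [{"id": f"e{i}", "t": i / 30} for i in range(5)]
          uav = [{"id": f"u{i}", "t": 0.25 + i / 30} for i in range(5)]   # 250 ms start latency
          print(align_streams(eye, uav))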
  • Publication number: 20230066686
    Abstract: Augmented reality (AR) systems, devices, media, and methods are described for generating AR experiences including interactions with virtual or physical prop objects. The AR experiences are generated by capturing images of a scene with a camera system, identifying an object receiving surface and corresponding surface coordinates within the scene, identifying an AR primary object and a prop object (physical or virtual), establishing a logical connection between the AR primary object and the prop object, generating AR overlays including actions associated with the AR primary object responsive to commands received via a user input system that position the AR primary object adjacent the object receiving surface responsive to the primary object coordinates and the surface coordinates within the scene and that position the AR primary object and the prop object with respect to one another in accordance with the logical connection, and presenting the generated AR overlays with a display system.
    Type: Application
    Filed: August 16, 2022
    Publication date: March 2, 2023
    Inventors: Tianying Chen, Timothy Chong, Sven Kratz, Fannie Liu, Andrés Monroy-Hernández, Olivia Seow, Yu Jiang Tham, Rajan Vaish, Lei Zhang
  • Patent number: 11593997
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, and a method and user interface to facilitate context-based augmented reality communication between multiple users over a network. Virtual content item configuration data indicative of a selection by a first user of a virtual content item to apply to a real-world environment that is visible to a second user via a second device is received from a first device. The virtual content item configuration data also includes one or more criteria to trigger application of the virtual content item to the real-world environment. A triggering event is detected based on satisfaction of the one or more criteria determined from context data generated at the second device. The second device presents the virtual content item overlaid on the real-world environment that is visible to the second user based on the triggering event.
    Type: Grant
    Filed: March 23, 2021
    Date of Patent: February 28, 2023
    Assignee: Snap Inc.
    Inventors: Brian Anthony Smith, Yu Jiang Tham, Rajan Vaish
  • Publication number: 20230019561
    Abstract: Systems and methods directed to generating an interactive graphical marker that includes a first region with a first indicator and a second region with a second indicator, the second region being around a circumference of the first region. The systems and methods are also directed to monitoring an animation of the interactive graphical marker to detect when the first indicator and the second indicator are aligned at a predetermined angle of rotation, and in response to detecting that the first indicator and the second indicator are aligned, initiating an interactive game application on a second computing device and a third computing device.
    Type: Application
    Filed: September 16, 2022
    Publication date: January 19, 2023
    Inventors: Yu Jiang Tham, Ava Robinson, Andrés Monroy-Hernández, Rajan Vaish
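    Sketch (not from the filing): The abstract above (and the granted patent 11452939 below) describes monitoring an animated marker with a first and a second indicator and initiating the game when the two are aligned at a predetermined angle of rotation. A minimal Python check of one plausible alignment condition; the target angle, tolerance, and function names are hypothetical.
      TARGET_ANGLE = 90.0      # predetermined angle of rotation (degrees)
      TOLERANCE = 2.0          # how close the indicators must be to count as aligned

      def angular_difference(a, b):
          """Smallest absolute difference between two angles in degrees."""
          return abs((a - b + 180.0) % 360.0 - 180.0)

      def indicators_aligned(inner_angle, outer_angle):
          """True when the first and second indicators meet at the target angle."""
          return (angular_difference(inner_angle, TARGET_ANGLE) <= TOLERANCE
                  and angular_difference(outer_angle, TARGET_ANGLE) <= TOLERANCE)

      def monitor(frames, start_game):
          """Watch the marker animation and start the game on the first aligned frame."""
          for inner, outer in frames:
              if indicators_aligned(inner, outer):
                  start_game()
                  return

      if __name__ == "__main__":
          monitor([(45.0, 300.0), (89.5, 91.0)], lambda: print("game initiated"))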
  • Publication number: 20220414989
    Abstract: A virtual mailbox is established for a first user based on a virtual mailbox definition specified by the first user. A message directed to the first user is received from a first device associated with a second user. A location within a real-world environment of the first user corresponding to the virtual mailbox is identified. A marker associated with the virtual mailbox is detected within the real-world environment of the first user. Based on detecting the marker, a second device associated with the first user presents the message overlaid on the real-world environment at the location corresponding to the virtual mailbox.
    Type: Application
    Filed: June 29, 2021
    Publication date: December 29, 2022
    Inventors: Rajan Vaish, Yu Jiang Tham, Brian Anthony Smith, Sven Kratz, Karen Stolzenberg, David Meisenholder
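    Sketch (not from the filing): A minimal Python illustration of the virtual-mailbox flow in the abstract above: a mailbox is defined for a recipient, incoming messages queue against it, and detecting the mailbox's marker in the recipient's surroundings presents the queued messages at the mailbox location. The class, field, and marker names are hypothetical.
      from collections import defaultdict

      class VirtualMailboxes:
          """Per-user virtual mailboxes anchored to a real-world marker."""

          def __init__(self):
              self.definitions = {}               # user -> (marker_id, location)
              self.pending = defaultdict(list)    # user -> queued messages

          def define(self, user, marker_id, location):
              self.definitions[user] = (marker_id, location)

          def deliver(self, recipient, message):
              self.pending[recipient].append(message)

          def on_marker_detected(self, user, marker_id, present):
              """When the mailbox marker is seen, present queued messages at its location."""
              definition = self.definitions.get(user)
              if definition and definition[0] == marker_id:
                  for message in self.pending.pop(user, []):
                      present(message, definition[1])

      if __name__ == "__main__":
          boxes = VirtualMailboxes()
          boxes.define("alice", "front_door_marker", (1.0, 0.5, 2.0))
          boxes.deliver("alice", "hello from bob")
          boxes.on_marker_detected("alice", "front_door_marker",
                                   lambda msg, loc: print(msg, "shown at", loc))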
  • Patent number: 11452939
    Abstract: Systems and methods directed to generating an interactive graphical marker that includes a first region with a first indicator and a second region with a second indicator, the second region being around a circumference of the first region. The systems and methods are also directed to monitoring an animation of the interactive graphical marker to detect when the first indicator and the second indicator are aligned at a predetermined angle of rotation, and in response to detecting that the first indicator and the second indicator are aligned, initiating an interactive game application on a second computing device and a third computing device.
    Type: Grant
    Filed: September 21, 2020
    Date of Patent: September 27, 2022
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Ava Robinson, Andrés Monroy-Hernández, Rajan Vaish
  • Publication number: 20220288487
    Abstract: Systems, methods, and computer readable media directed to multi-user visual experiences such as interactive gaming experiences and artistic media experiences. A viewing electronic device includes a camera configured to capture images, a display, and a processor coupled to the camera and the display. The processor is configured to capture, with the camera, images of a rotating marker where the rotating marker is presented on a monitor of a remote device and to present, on the display, a visual experience where the visual experience has an adjustable feature. The processor detects a parameter of the rotating marker from the captured images that corresponds to the adjustable feature and updates the visual experience responsive to the detected parameter. The parameter may be one or more of an angle, speed of rotation, direction of rotation, color, or pattern of the rotating marker.
    Type: Application
    Filed: May 31, 2022
    Publication date: September 15, 2022
    Inventors: Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
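    Sketch (not from the filing): The abstract above (and the granted patent 11383156 below) describes reading a parameter of a rotating marker (angle, rotation speed, direction, color, or pattern) from captured images and using it to update an adjustable feature of the visual experience. A minimal Python sketch estimating speed and direction from successive angle observations; the sample format and the mapping to "brush_size" are hypothetical.
      def rotation_parameters(samples):
          """Estimate angular speed (deg/s) and direction from (time, angle) samples
          of the rotating marker captured by the viewing device's camera."""
          if len(samples) < 2:
              return {"speed": 0.0, "direction": "none"}
          (t0, a0), (t1, a1) = samples[-2], samples[-1]
          delta = (a1 - a0 + 180.0) % 360.0 - 180.0        # shortest signed angle change
          speed = delta / (t1 - t0)
          return {"speed": abs(speed),
                  "direction": "clockwise" if speed < 0 else "counterclockwise"}

      def update_visual_experience(samples, apply):
          """Map the detected rotation speed onto an adjustable feature of the experience."""
          params = rotation_parameters(samples)
          apply({"brush_size": 1 + params["speed"] / 90.0})   # hypothetical mapping

      if __name__ == "__main__":
          update_visual_experience([(0.0, 10.0), (0.5, 55.0)], print)   # 90 deg/s -> brush_size 2.0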
  • Publication number: 20220277503
    Abstract: A server machine modifies an augmented reality (AR) object in response to fulfillment of a condition. The machine provides, to a user device, object data that defines the AR object. The object data specifies a physical geolocation of the AR object, a presentation attribute of the AR object, a conditional modification program, and a trigger condition for execution of the conditional modification program. The object data causes the user device to present the AR object with a first appearance, located at the physical geolocation. The machine detects fulfillment of the trigger condition, and in response, the machine executes the conditional modification program. This modifies the object data by modifying the presentation attribute. The machine provides, to the user device, the modified object data, which causes the user device to present the AR object with a second appearance based on the modified presentation attribute.
    Type: Application
    Filed: May 20, 2022
    Publication date: September 1, 2022
    Inventors: Ilteris Canberk, Andrés Monroy-Hernández, Rajan Vaish
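    Sketch (not from the filing): A minimal Python model of the object data described in the abstract above: an AR object carries a geolocation, presentation attributes, a trigger condition, and a conditional modification that the server applies when the condition is fulfilled, after which the updated data is pushed back to the device. All names and the example condition are hypothetical.
      from dataclasses import dataclass
      from typing import Callable, Dict

      @dataclass
      class ARObjectData:
          geolocation: tuple                               # physical geolocation of the AR object
          attributes: Dict[str, str]                       # presentation attributes (e.g., color)
          trigger: Callable[[dict], bool]                  # trigger condition checked against context
          modification: Callable[[Dict[str, str]], None]   # conditional modification program

      def check_and_modify(obj, context, push_to_device):
          """On the server: if the trigger condition is fulfilled, run the modification
          and provide the modified object data to the user device."""
          if obj.trigger(context):
              obj.modification(obj.attributes)
              push_to_device(obj)

      if __name__ == "__main__":
          statue = ARObjectData(
              geolocation=(37.78, -122.41),
              attributes={"color": "bronze"},
              trigger=lambda ctx: ctx.get("visits", 0) >= 100,
              modification=lambda attrs: attrs.update(color="gold"),
          )
          check_and_modify(statue, {"visits": 120},
                           lambda o: print("re-render with", o.attributes))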
  • Publication number: 20220276823
    Abstract: Methods and systems are disclosed for creating a shared augmented reality (AR) session. The methods and systems perform operations comprising: receiving, by a client device, input that selects a shared augmented reality (AR) experience from a plurality of shared AR experiences; in response to receiving the input, determining one or more resources associated with the selected shared AR experience; determining, by the client device, that two or more users are located within a threshold proximity of the client device; and activating the selected shared AR experience in response to determining that the two or more users are located within the threshold proximity of the client device.
    Type: Application
    Filed: May 16, 2022
    Publication date: September 1, 2022
    Inventors: Ana Maria Cardenas Gasca, Ella Dagan Peled, Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
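    Sketch (not from the filing): A minimal Python check of the activation rule in the abstract above: the selected shared AR experience is activated once its resources are resolved and two or more users are within a threshold proximity of the client device. The threshold value, names, and the flat 2D distance model are hypothetical.
      import math

      THRESHOLD_M = 10.0       # threshold proximity in meters (hypothetical value)

      def distance(a, b):
          """Planar distance between two (x, y) positions in meters."""
          return math.hypot(a[0] - b[0], a[1] - b[1])

      def maybe_activate(selected_experience, client_pos, other_user_positions, activate):
          """Activate the shared AR experience when two or more users are nearby."""
          nearby = [p for p in other_user_positions if distance(client_pos, p) <= THRESHOLD_M]
          if len(nearby) >= 2:
              resources = {"scene": f"{selected_experience}.bundle"}   # hypothetical resource lookup
              activate(selected_experience, resources, nearby)

      if __name__ == "__main__":
          maybe_activate("shared_mural", (0.0, 0.0),
                         [(3.0, 4.0), (6.0, 8.0), (30.0, 0.0)],
                         lambda name, res, users: print("activated", name, "with",
                                                        len(users), "nearby users"))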
  • Patent number: 11383156
    Abstract: Systems, methods, and computer readable media directed to multi-user visual experiences such as interactive gaming experiences and artistic media experiences. A viewing electronic device includes a camera configured to capture images, a display, and a processor coupled to the camera and the display. The processor is configured to capture, with the camera, images of a rotating marker where the rotating marker is presented on a monitor of a remote device and to present, on the display, a visual experience where the visual experience has an adjustable feature. The processor detects a parameter of the rotating marker from the captured images that corresponds to the adjustable feature and updates the visual experience responsive to the detected parameter. The parameter may be one or more of an angle, speed of rotation, direction of rotation, color, or pattern of the rotating marker.
    Type: Grant
    Filed: December 29, 2020
    Date of Patent: July 12, 2022
    Assignee: Snap Inc.
    Inventors: Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish