Patents by Inventor Andrés Monroy-Hernández

Andrés Monroy-Hernández has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240069642
    Abstract: Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content using a hand gesture to produce an image segment that can be associated with the collaborative object. The hand gesture resembles a pair of scissors, and the camera and processor of the client device track the path of the hand gesture to identify an object within a displayed image and create virtual content of the identified object. The virtual content created by the hand gesture is then associated with the collaborative object.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
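The scissor-gesture cropping described in the abstract above amounts to testing which pixels fall inside the closed path traced by the hand. A minimal sketch using a standard ray-casting point-in-polygon test; the function names and the list-of-lists image format are illustrative assumptions, not taken from the patent:

```python
def point_in_path(x, y, path):
    """Return True if (x, y) lies inside the closed polygon `path`
    (ray-casting test: count edge crossings of a horizontal ray)."""
    inside = False
    n = len(path)
    for i in range(n):
        x1, y1 = path[i]
        x2, y2 = path[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray at height y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def crop_segment(image, gesture_path):
    """Keep pixels inside the gesture path; zero out the rest."""
    return [
        [px if point_in_path(cx, ry, gesture_path) else 0
         for cx, px in enumerate(row)]
        for ry, row in enumerate(image)
    ]
```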
  • Publication number: 20240073404
    Abstract: A display device presents volumetric content comprising a volumetric video. The volumetric video comprises a volumetric representation of one or more elements of a three-dimensional space. Input indicative of a control operation associated with the presentation of the volumetric video is received. The presentation of the volumetric video by the display device is controlled by executing the control operation. While the control operation is being executed, the volumetric representation of the one or more elements of the three-dimensional space is displayed from multiple perspectives based on movement of a user.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071008
    Abstract: A two-dimensional element is identified from one or more two-dimensional images. A volumetric content item is generated based on the two-dimensional element identified from the one or more two-dimensional images. A display device presents the volumetric content item overlaid on a real-world environment that is within a field of view of a user of the display device.
    Type: Application
    Filed: February 16, 2023
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240070301
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., which may be recorded using an application such as a lens cloud feature, available from Snap Inc. of Santa Monica, California). A timestamp is associated with each item of received virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
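The timelapse described in the abstract above can be illustrated by ordering contributions on their timestamps and accumulating the object's state at each step. A minimal Python sketch; the (timestamp, content) pair structure is a hypothetical simplification of the patent's virtual content:

```python
def build_timelapse(contributions):
    """Return cumulative snapshots of the collaborative object,
    ordered by the timestamp attached to each contribution."""
    ordered = sorted(contributions, key=lambda c: c[0])
    snapshots, state = [], []
    for ts, content in ordered:
        state = state + [content]          # object state after this contribution
        snapshots.append((ts, list(state)))
    return snapshots
```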
  • Publication number: 20240070969
    Abstract: Input indicative of a selection of volumetric content for presentation is received. The volumetric content comprises a volumetric representation of one or more elements of a real-world three-dimensional space. In response to the input, device state data associated with the volumetric content is accessed. The device state data describes a state of one or more network-connected devices associated with the real-world three-dimensional space. The volumetric content is presented. The presentation of the volumetric content includes presentation of the volumetric representation of the one or more elements overlaid on the real-world three-dimensional space by a display device and configuring the one or more network-connected devices using the device state data.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
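Configuring network-connected devices from recorded device state data, as the abstract above describes, is essentially a replay of captured states onto live devices. A minimal sketch; the `configure` interface, the registry mapping, and the `Lamp` stand-in are hypothetical:

```python
class Lamp:
    """Minimal stand-in for a network-connected device."""
    def __init__(self):
        self.state = None
    def configure(self, state):
        self.state = state

def apply_device_states(device_state_data, registry):
    """Set each network-connected device recorded with the volumetric
    content back to the state it held at capture time. `registry`
    maps device IDs to live device objects; unknown IDs are skipped."""
    applied = []
    for device_id, state in device_state_data.items():
        device = registry.get(device_id)
        if device is not None:
            device.configure(state)
            applied.append(device_id)
    return applied
```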
  • Publication number: 20240071007
    Abstract: The present disclosure relates to methods and systems for providing a presentation of an experience (e.g., a journey) to a user using augmented reality (AR). During a capture phase, persons in the journey may take videos or pictures using their smartphones, GoPros, and/or smart glasses. A drone may also take videos or pictures during the journey. During an experience phase, an AR topographical rendering of the real-world environment of the journey may be rendered on a tabletop, highlighting/animating a path persons took in the journey. The persons may be rendered as miniature avatars/dolls overlaid on the representation of the real-world environment. When the user clicks on a point in the presentation of the journey, a perspective (e.g., the videos or pictures) at that point is presented.
    Type: Application
    Filed: February 15, 2023
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071006
    Abstract: A volumetric content presentation system includes a head-worn display device, which includes one or more processors, and a memory storing instructions that, when executed by the one or more processors, configure the display device to access AR content items that correspond to either real-world objects or virtual objects, mix and match these AR content items, and present volumetric content that includes these mixed and matched AR content items overlaid on a real-world environment to create a new AR scene that a user can experience.
    Type: Application
    Filed: November 22, 2022
    Publication date: February 29, 2024
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith, Rajan Vaish
  • Publication number: 20240070299
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a processor provides users with access to a collaborative object using respective physically remote devices, and associates virtual content received from the users with the collaborative object during a collaboration period. The processor maintains a timer including a countdown indicative of when the collaboration period ends for associating virtual content with the collaborative object. The processor provides the users with access to the collaborative object with associated virtual content at the end of the collaboration period.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
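The countdown that gates the collaboration period in the abstract above can be sketched as a small timer object. The interface and the injected clock are assumptions for illustration, not the patent's implementation:

```python
import time

class CollaborationPeriod:
    """Track a collaboration window; virtual content may be associated
    with the collaborative object only while the countdown is open."""
    def __init__(self, duration_seconds, now=time.time):
        self._now = now                     # injectable clock for testing
        self.ends_at = now() + duration_seconds
    def remaining(self):
        """Seconds left in the countdown, floored at zero."""
        return max(0.0, self.ends_at - self._now())
    def is_open(self):
        return self.remaining() > 0
```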
  • Publication number: 20240069627
    Abstract: A system monitors an environment via one or more sensors included in a computing device and applies a trigger to detect, based on the monitoring, that a memory experience is stored in a data store. The system creates an augmented reality memory experience, a virtual reality memory experience, or a combination thereof, based on the trigger if the memory experience is detected. The system additionally projects the augmented reality memory experience, the virtual reality memory experience, or the combination thereof, via the computing device.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240071020
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object with an associated material and added virtual content is provided to users. In one example of the collaborative session, a user selects the associated material of the collaborative object. Physical characteristics are assigned to the collaborative object as a function of the associated material and are perceived by the participants when the collaborative object is manipulated. In one example, the material associated with the collaborative object is metal, and interaction between the users and the collaborative object generates a response of the collaborative object that is indicative of the physical properties of metal, such as inertia, acoustics, and malleability.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240073402
    Abstract: The present disclosure relates to methods and systems for providing a multi-perspective augmented reality experience. A volumetric video of a three-dimensional space is captured. The volumetric video of the three-dimensional space includes a volumetric representation of a first user within the three-dimensional space. The volumetric video is displayed by a display device worn by a second user, and the second user sees the volumetric representation of the first user within the three-dimensional space. Input indicative of an interaction (e.g., entering or leaving) of the second user with the volumetric representation of the first user is detected. Based on detecting the input indicative of the interaction, the display device switches to a display of a recorded perspective of the first user. Thus, by interacting with a volumetric representation of the first user in a volumetric video, the second user views the first user's perspective of the three-dimensional space.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Rajan Vaish, Sven Kratz, Andrés Monroy-Hernández, Brian Anthony Smith
  • Publication number: 20240058699
    Abstract: Systems and methods directed to generating an interactive graphical marker that includes a first region with a first indicator and a second region with a second indicator, the second region being around a circumference of the first region. The systems and methods are also directed to monitoring an animation of the interactive graphical marker to detect when the first indicator and the second indicator are aligned at a predetermined angle of rotation, and in response to detecting that the first indicator and the second indicator are aligned, initiating an interactive game application on a second computing device and a third computing device.
    Type: Application
    Filed: October 30, 2023
    Publication date: February 22, 2024
    Inventors: Yu Jiang Tham, Ava Robinson, Andrés Monroy-Hernández, Rajan Vaish
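Detecting that the two indicators of the animated marker are aligned at a predetermined angle, as the abstract above describes, reduces to a circular-angle comparison. A minimal sketch; the degree units and the detection tolerance are illustrative assumptions:

```python
def indicators_aligned(first_angle, second_angle, target_offset, tolerance=2.0):
    """Return True when the rotating second indicator sits at the
    predetermined angular offset from the first indicator.
    Angles are in degrees; `tolerance` is an assumed detection margin."""
    offset = (second_angle - first_angle) % 360.0
    # shortest circular distance between the current and target offsets
    delta = abs((offset - target_offset + 180.0) % 360.0 - 180.0)
    return delta <= tolerance
```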
  • Publication number: 20240056524
    Abstract: An olfactory sticker is used in chats between electronic devices to make messaging more immersive, personalized, and authentic by integrating olfactory information with traditional text-based formats and AR messaging. Olfactory stickers indicate to a recipient that a message includes olfactory information. An olfactory sticker is a graphical representation of a particular scent that can be sent and received via a chat or an AR message. Olfactory stickers give the recipient control over when to access the transmitted scent, which is particularly useful since certain olfactory information, i.e., scents, can be very direct and intrusive. Olfactory stickers are activated (i.e., release their scent) when they are tapped or rubbed by the recipient.
    Type: Application
    Filed: October 26, 2023
    Publication date: February 15, 2024
    Inventors: Andrés Monroy-Hernández, Sven Kratz, Rajan Vaish
  • Patent number: 11893301
    Abstract: Methods and systems are disclosed for creating a shared augmented reality (AR) session. The methods and systems perform operations comprising: receiving, by a client device, input that selects a shared augmented reality (AR) experience from a plurality of shared AR experiences; in response to receiving the input, determining one or more resources associated with the selected shared AR experience; determining, by the client device, that two or more users are located within a threshold proximity of the client device; and activating the selected shared AR experience in response to determining that the two or more users are located within the threshold proximity of the client device.
    Type: Grant
    Filed: May 16, 2022
    Date of Patent: February 6, 2024
    Assignee: Snap Inc.
    Inventors: Ana Maria Cardenas Gasca, Ella Dagan Peled, Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
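The activation condition in the patent above, that two or more users be within a threshold proximity of the client device, can be sketched as a simple distance count. The (x, y) positions in a shared local frame and the function names are assumptions for illustration:

```python
import math

def should_activate(host, others, threshold_m, min_users=2):
    """Return True when at least `min_users` of the other users are
    within `threshold_m` metres of the host client device."""
    near = [u for u in others if math.dist(host, u) <= threshold_m]
    return len(near) >= min_users
```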
  • Publication number: 20240027394
    Abstract: An electronic device, such as a smartphone, smart eyewear, or a smart watch, including an olfactory detector with an array of olfactory sensors for detecting scents proximate the electronic device. Each sensor in the array of olfactory sensors is tuned to detect the presence and concentration of one or more specific chemical compounds or molecules. A fan creates airflow of ambient air across the olfactory sensors. An analog-to-digital (A/D) converter receives and processes the sensor outputs of the olfactory sensors and provides the processed sensor output to a processor for further processing. Scent type and intensity can be classified by using the information from the scent sensors as input to a machine learning model generated through supervised training on labeled example measurements from the sensor array.
    Type: Application
    Filed: July 19, 2022
    Publication date: January 25, 2024
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Rajan Vaish
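The patent above describes classifying scent type from the sensor-array readings with a supervised machine-learning model. As an illustrative stand-in only (not the patent's model), a nearest-centroid classifier over labeled scent profiles shows the shape of the problem; all names and profiles are hypothetical:

```python
def classify_scent(reading, centroids):
    """Label a sensor-array reading with the closest known scent
    profile (squared Euclidean distance to each labeled centroid)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(reading, centroids[label]))
```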
  • Publication number: 20240031782
    Abstract: A method and a system include receiving a first signal value generated by a biosignal sensor coupled to the first client device, receiving one or more second signal values corresponding to a respective sensor reading or environmental condition associated with the first client device, determining a total score based on a first score and one or more second scores determined based on the first and the one or more second signal values, selecting a first state of the plurality of states based on a ranking of total scores of the plurality of states, and causing a display of a first notification associated with a first user-selectable element corresponding to the first state on the first client device.
    Type: Application
    Filed: September 27, 2023
    Publication date: January 25, 2024
    Inventors: Andrés Monroy-Hernández, Chunjong Park, Fannie Liu, Yu Jiang Tham
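The scoring and ranking described in the abstract above, combining a biosignal value with other sensor or environmental values into a per-state total score, can be sketched as follows. The per-state scoring functions and signal names are hypothetical:

```python
def total_score(state, signal_values, score_fns):
    """Sum the per-signal scores a candidate state earns from the
    current signal values; `score_fns[state]` maps each signal name
    to a scoring function (a hypothetical structure)."""
    return sum(fn(signal_values[name]) for name, fn in score_fns[state].items())

def rank_states(states, signal_values, score_fns):
    """Return the candidate states ranked by total score, best first;
    the first state would drive the displayed notification."""
    return sorted(states,
                  key=lambda s: total_score(s, signal_values, score_fns),
                  reverse=True)
```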
  • Patent number: 11875022
    Abstract: Systems and methods for interfacing augmented reality (AR) camera devices to Internet of Things (IoT) devices include pairing an AR camera device to an IoT device to establish which actions recognized by the AR camera device may control the IoT device, receiving an action identifier representing a gesture or position of a body part that has been recognized by the AR camera device, and sending a command to the IoT device paired with the AR camera device to perform a requested action in response to the action identifier received from the AR camera device. Context information relating to the action identifier from the AR camera may be used to modify the command sent to the IoT device. The AR camera device may be paired with an IoT device pointed at by the AR camera device, selected through a user interface of the AR camera device, or determined from context information.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: January 16, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
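Translating a recognized action identifier into an IoT command, optionally modified by context information as the patent above describes, is essentially a lookup plus an override. A minimal sketch; the pairing table, command fields, and gesture names are hypothetical:

```python
def build_command(action_id, pairing, context=None):
    """Map a recognized gesture's action identifier to the command for
    the IoT device paired with the AR camera device, then let context
    information (e.g., a brightness derived from hand height) modify it."""
    command = dict(pairing[action_id])   # copy the base command for this gesture
    if context:
        command.update(context)
    return command
```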
  • Publication number: 20230418542
    Abstract: Methods and systems are disclosed for creating a shared augmented reality (AR) session. The methods and systems perform operations comprising: receiving, by a client device, input that selects a shared augmented reality (AR) experience from a plurality of shared AR experiences; in response to receiving the input, determining one or more resources associated with the selected shared AR experience; determining, by the client device, that two or more users are located within a threshold proximity of the client device; and activating the selected shared AR experience in response to determining that the two or more users are located within the threshold proximity of the client device.
    Type: Application
    Filed: September 8, 2023
    Publication date: December 28, 2023
    Inventors: Ana Maria Cardenas Gasca, Ella Dagan Peled, Andrés Monroy-Hernández, Ava Robinson, Yu Jiang Tham, Rajan Vaish
  • Publication number: 20230409158
    Abstract: An augmented reality (AR) device is interfaced to an Internet of Things (IoT) device by receiving IoT device information in a standardized schema from the IoT device. The IoT device information includes device inputs and available output information for the IoT device. A predetermined AR user interface widget to render for the IoT device is determined from the received IoT device information. The predetermined AR user interface widget converts user inputs to the AR device into the device inputs for the IoT device. Upon selection of the IoT device by the AR device, the predetermined AR user interface widget for the selected IoT device is provided to the AR device as an overlay on a display of the AR device. Device input received in response to user interaction with the AR user interface widget is provided to the IoT device in an input type expected by the IoT device.
    Type: Application
    Filed: June 15, 2022
    Publication date: December 21, 2023
    Inventors: Sven Kratz, Andrés Monroy-Hernández, Yu Jiang Tham
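Choosing a predetermined AR user-interface widget from the IoT device's declared inputs, as the abstract above describes, can be sketched as a mapping over the standardized schema. The schema keys and widget names here are assumptions for illustration, not the patent's schema:

```python
def pick_widget(device_info):
    """Select an AR user-interface widget from the first declared
    input type in the device's standardized schema description."""
    input_type = device_info["inputs"][0]["type"]
    return {
        "toggle": "ar_switch",   # on/off devices
        "range": "ar_slider",    # numeric inputs such as brightness
        "enum": "ar_picker",     # fixed option sets
    }.get(input_type, "ar_generic_panel")
```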
  • Publication number: 20230409852
    Abstract: The subject technology transforms a set of features into a three-dimensional mesh, the three-dimensional mesh including a set of vertices and a set of edges, wherein the set of features are determined based on analyzing a representation of a three-dimensional barcode. The subject technology extracts verification metadata from the three-dimensional mesh, the verification metadata including information for verifying whether a physical item is an authentic item. The subject technology receives manufacturer verification information based at least in part on the verification metadata. The subject technology receives provenance information associated with the physical item based at least in part on the manufacturer verification information and the verification metadata. The subject technology causes display of a media overlay including the physical item based at least in part on the provenance information, wherein the media overlay includes an indication of authenticity of the physical item.
    Type: Application
    Filed: September 1, 2023
    Publication date: December 21, 2023
    Inventor: Andrés Monroy-Hernández