Patents by Inventor Youjean Cho

Youjean Cho has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
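
Several of the abstracts below outline concrete interaction flows (gesture attribution, IMU-driven controls, timestamped collaboration, and the like); short illustrative code sketches for a selection of them appear after the listing.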

  • Publication number: 20240079031
    Abstract: Described are authoring tools for creating interactive AR experiences. The story-authoring application enables a user with little or no programming skills to create an interactive story that includes recording voice commands for advancing to the next scene, inserting and manipulating virtual objects in a mixed-reality environment, and recording a variety of interactions with connected IoT devices. The story creation interface is presented on the display as a virtual object in an AR environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Daekun Kim, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077984
    Abstract: Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Ava Robinson, Daekun Kim, Youjean Cho, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077933
    Abstract: A method for detecting full-body gestures by a mobile device includes a host mobile device detecting the tracked body of a co-located participant in a multi-party session. When the participant's tracked body provides a full-body gesture, the host's mobile device recognizes that there is a tracked body providing a full-body gesture. The host mobile device iterates through the list of participants in the multi-party session and finds the closest participant mobile device with respect to the screen-space position of the head of the gesturing participant. The host mobile device then obtains the user ID of the closest participant mobile device and broadcasts the recognized full-body gesture event to all co-located participants in the multi-party session, along with the obtained user ID. Each participant's mobile device may then handle the gesture event as appropriate for the multi-party session. For example, a character or costume may be assigned to a gesturing participant.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240078759
    Abstract: Multi-player co-located AR experiences are augmented by assigning characters and costumes to respective participants (a.k.a. “users” of AR-enabled mobile devices) in multi-player AR sessions for storytelling, play acting, and the like. Body tracking technology and augmented reality (AR) software are used to decorate the bodies of the co-located participants with virtual costumes within the context of the multi-player co-located AR experiences. Tracked bodies are distinguished to determine which body belongs to which user and hence which virtual costume belongs to which tracked body so that corresponding costumes may be assigned for display in augmented reality. A host-guest mechanism is used for networked assignment of characters and corresponding costumes in the co-located multi-player AR session. Body tracking technology is used to move the costume with the body as movement of the assigned body is detected.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Daekun Kim, Lei Zhang, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077935
    Abstract: A virtual interface application presented in augmented reality (AR) is described for controlling Internet of Things (IoT) products. The virtual interface application enables a user of a portable electronic device to activate a virtual control interface overlay on a display, receive a selection from the user using her hands or feet, and send a control signal to a nearby IoT product which executes an action in accordance with the selection. The virtual control interface is presented on the display as a virtual object in an AR environment. The virtual interface application includes a foot tracking tool for detecting an intersection between the foot location (in the physical environment) and the virtual surface position (in the AR environment). When an intersection is detected, the virtual interface application sends a control signal with instructions to the IoT product.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Youjean Cho, Lei Zhang, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077934
    Abstract: Described are virtual AR interfaces for generating a virtual rotational interface for the purpose of controlling connected IoT devices using the inertial measurement unit (IMU) of a portable electronic device. The IMU control application enables a user of a portable electronic device to activate a virtual rotational interface overlay on a display and adjust a feature of a connected IoT product by rotating a portable electronic device. The device IMU moves a slider on the virtual rotational interface. The IMU control application sends a control signal to the IoT product which executes an action in accordance with the slider position. The virtual rotational interface is presented on the display as a virtual object in an AR environment. The IMU control application detects the device orientation (in the physical environment) and in response presents a corresponding slider element on the virtual rotational interface (in the AR environment).
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240077983
    Abstract: Described are recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.
    Type: Application
    Filed: September 1, 2022
    Publication date: March 7, 2024
    Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
  • Publication number: 20240070300
    Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users by a collaborative system. The system receives a request from a user to join a session, and associates a timestamp with the user corresponding to receipt of the request.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070301
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant (the host) creates a new session and invites participants to join. The session creator (i.e., the host) and other approved participants can access the contents of a session (e.g., recorded using an application such as the Lens Cloud feature available from Snap Inc. of Santa Monica, California). A timestamp is associated with each received item of virtual content, and the users are provided with a timelapse of the collaborative object as a function of the timestamps.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070243
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device can be used to provide an authenticating signature to the seal.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240069643
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a user interacts with the collaborative object using hand gestures. The virtual content associated with the collaborative object can be accessed with an opening hand gesture and the virtual content can be hidden with a closing hand gesture. The hand gestures are detected by cameras of a client device used by the user. The collaborative object can be moved and manipulated using a pointing gesture, wherein the collaborative object can be confirmed at a new position by tilting the client device of the user.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070302
    Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070298
    Abstract: Collaborative sessions in which access to added virtual content is selectively made available to participants/users. A participant (the host) creates a new session and invites participants to join. The invited participants receive an invitation to join the session. The session creator (i.e., the host) and other approved participants can access the contents of a session. The session identifies a new participant when they join the session, and concurrently notifies the other participants in the session that a new participant is waiting for permission to access the added virtual content. The host or approved participants can set up the new participant with permissions for accessing added virtual content.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240071020
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object with an associated material and added virtual content is provided to users. In one example of the collaborative session, a user selects the associated material of the collaborative object. Physical characteristics are assigned to the collaborative object as a function of the associated material to be perceived by the participants when the collaborative object is manipulated. In one example, the material associated with the collaborative object is metal, wherein the interaction between the users and the collaborative object generates a response of the collaborative object that is indicative of the physical properties of metal, such as inertia, acoustics, and malleability.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240069642
    Abstract: Collaborative sessions in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a participant crops media content by use of a hand gesture to produce an image segment that can be associated with the collaborative object. The hand gesture resembles a pair of scissors, and the camera and processor of the client device track a path of the hand gesture to identify an object within a displayed image to create virtual content of the identified object. The virtual content created by the hand gesture is then associated with the collaborative object.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
  • Publication number: 20240070299
    Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, a processor provides users with access to a collaborative object using respective physically remote devices, and associates virtual content received from the users with the collaborative object during a collaboration period. The processor maintains a timer including a countdown indicative of when the collaboration period ends for associating virtual content with the collaborative object. The processor provides the users with access to the collaborative object with associated virtual content at the end of the collaboration period.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
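
The sketches below illustrate, in TypeScript, a few of the flows described in the abstracts above. They are minimal illustrations only: every type, field, and callback name is an assumption made for the example, and none of them reproduce the implementations claimed in the publications.

For publication 20240077984 (following behaviors), a minimal sketch of a follow link: until a connecting action is detected, a tether is drawn between the follower and its target; once connected, the follower eases toward the target each frame.

```typescript
// Minimal sketch of a "following" behavior, assuming a simple 3D vector type
// and a per-frame update loop. All names here are hypothetical illustrations,
// not the API described in the publication.

type Vec3 = { x: number; y: number; z: number };

const lerp = (a: Vec3, b: Vec3, t: number): Vec3 => ({
  x: a.x + (b.x - a.x) * t,
  y: a.y + (b.y - a.y) * t,
  z: a.z + (b.z - a.z) * t,
});

interface FollowLink {
  follower: { position: Vec3 };
  target: { position: Vec3 };
  connected: boolean;   // set true once a "connecting action" is detected
  followSpeed: number;  // fraction of the remaining distance closed per frame
}

// Called once per rendered frame. While not yet connected, a tether is drawn
// between the two objects; once connected, the follower tracks the target.
function updateFollowLink(link: FollowLink, drawTether: (a: Vec3, b: Vec3) => void): void {
  if (!link.connected) {
    drawTether(link.follower.position, link.target.position);
    return;
  }
  link.follower.position = lerp(link.follower.position, link.target.position, link.followSpeed);
}
```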
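
For publication 20240077933 (full-body gesture detection), a sketch of the host-side attribution step: find the participant whose screen-space position is closest to the gesturing body's head, then broadcast the gesture event with that participant's user ID. The Participant fields and the broadcast callback are assumptions for illustration.

```typescript
// Host-side attribution: map a gesturing tracked body to the nearest
// co-located participant in screen space and broadcast the event.

type Vec2 = { x: number; y: number };

interface Participant {
  userId: string;
  screenPosition: Vec2; // where this participant appears in the host's view
}

function closestParticipant(head: Vec2, participants: Participant[]): Participant | undefined {
  let best: Participant | undefined;
  let bestDist = Number.POSITIVE_INFINITY;
  for (const p of participants) {
    const dx = p.screenPosition.x - head.x;
    const dy = p.screenPosition.y - head.y;
    const dist = dx * dx + dy * dy; // squared distance is enough for comparison
    if (dist < bestDist) {
      bestDist = dist;
      best = p;
    }
  }
  return best;
}

function onFullBodyGesture(
  gesture: string,
  gesturingHead: Vec2,
  participants: Participant[],
  broadcast: (event: { gesture: string; userId: string }) => void,
): void {
  const match = closestParticipant(gesturingHead, participants);
  if (match) {
    // Every device in the session can then react, e.g. assign a character or costume.
    broadcast({ gesture, userId: match.userId });
  }
}
```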
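
For publication 20240077935 (foot-activated virtual controls), a sketch of the intersection check between the tracked foot position and a virtual button region, here simplified to an axis-aligned bounding box; when the foot falls inside a button, a control signal is sent. The button model and send callback are assumptions.

```typescript
// Detect an intersection between the tracked foot position (physical
// environment) and a virtual button region (AR environment), then send a
// control signal for the button's action.

type Vec3 = { x: number; y: number; z: number };

interface VirtualButton {
  action: string; // e.g. "light_on"
  min: Vec3;      // one corner of the button's bounding box
  max: Vec3;      // opposite corner
}

const inside = (p: Vec3, b: VirtualButton): boolean =>
  p.x >= b.min.x && p.x <= b.max.x &&
  p.y >= b.min.y && p.y <= b.max.y &&
  p.z >= b.min.z && p.z <= b.max.z;

function checkFootSelection(
  foot: Vec3,
  buttons: VirtualButton[],
  sendControlSignal: (action: string) => void,
): void {
  for (const button of buttons) {
    if (inside(foot, button)) {
      sendControlSignal(button.action); // the nearby IoT product executes the action
      return;                           // one selection per intersection event
    }
  }
}
```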
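
For publication 20240077934 (IMU-driven rotational interface), a sketch of mapping a device roll angle to a normalized slider position that both updates the on-screen slider and drives the connected product. The angle range, sample callback, and control value are assumptions.

```typescript
// Map an IMU roll angle (degrees) to a slider position in [0, 1], update the
// virtual rotational interface, and forward the value as a control signal.

interface RotationalSlider {
  minAngle: number; // device angle mapped to slider position 0, e.g. -45
  maxAngle: number; // device angle mapped to slider position 1, e.g. +45
}

const clamp01 = (v: number): number => Math.min(1, Math.max(0, v));

function sliderFromRoll(rollDegrees: number, s: RotationalSlider): number {
  return clamp01((rollDegrees - s.minAngle) / (s.maxAngle - s.minAngle));
}

// Called whenever a new IMU sample arrives.
function onImuSample(
  rollDegrees: number,
  slider: RotationalSlider,
  setSliderUi: (position: number) => void,
  sendControlSignal: (value: number) => void,
): void {
  const position = sliderFromRoll(rollDegrees, slider);
  setSliderUi(position);        // move the slider element on the virtual interface
  sendControlSignal(position);  // e.g. brightness level for the IoT product
}
```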
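
For publication 20240077983 (trigger recording), a sketch of a recorded trigger-to-reaction table keyed by either a spoken word or an apparent touch on a named virtual object. The trigger and reaction shapes are illustrative only.

```typescript
// A recorded mapping from triggers (spoken word or virtual touch) to reactions.

type Trigger =
  | { kind: "voice"; word: string }
  | { kind: "touch"; objectName: string };

interface Reaction {
  description: string; // e.g. "advance to scene 2" or "turn on the lamp"
  run: () => void;
}

const triggerKey = (t: Trigger): string =>
  t.kind === "voice" ? `voice:${t.word.toLowerCase()}` : `touch:${t.objectName}`;

class TriggerRecorder {
  private reactions = new Map<string, Reaction>();

  // Recording step: the author performs a trigger and pairs it with a reaction.
  record(trigger: Trigger, reaction: Reaction): void {
    this.reactions.set(triggerKey(trigger), reaction);
  }

  // Playback step: when the runtime detects a trigger, fire the paired reaction.
  handle(trigger: Trigger): void {
    this.reactions.get(triggerKey(trigger))?.run();
  }
}

// Example: speaking "next" starts the following scene.
const recorder = new TriggerRecorder();
recorder.record({ kind: "voice", word: "next" }, {
  description: "advance to the next scene",
  run: () => console.log("scene advanced"),
});
recorder.handle({ kind: "voice", word: "next" });
```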
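
For publications 20240070300 and 20240070301 (timestamped collaborative sessions), a sketch of storing a timestamp with each received contribution and replaying contributions in timestamp order as a timelapse. The field names are assumptions.

```typescript
// Each contribution to the collaborative object is stored with the time it
// was received; a "timelapse" replays contributions in timestamp order.

interface Contribution {
  userId: string;
  content: string;    // stand-in for a reference to the added virtual content
  receivedAt: number; // epoch milliseconds recorded on receipt
}

class CollaborativeObject {
  private contributions: Contribution[] = [];

  add(userId: string, content: string, now: number = Date.now()): void {
    this.contributions.push({ userId, content, receivedAt: now });
  }

  // Contributions in arrival order, which a client can step through to render
  // the collaborative object "growing" over time.
  timelapse(): Contribution[] {
    return [...this.contributions].sort((a, b) => a.receivedAt - b.receivedAt);
  }
}
```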
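
For publication 20240070243 (stamp-based authentication), a sketch of the completion rule that every participant must stamp the collaborative object, with some user-specific data recorded alongside each stamp. The data shapes are illustrative assumptions.

```typescript
// The capsule is sealed only once every participant has stamped it; each stamp
// carries user-specific data (e.g. a token derived from device, profile, or
// biometric information).

interface Stamp {
  userId: string;
  stampedAt: number;
  userSpecificData: string; // e.g. a hash derived from device or biometric info
}

class SealableCapsule {
  private stamps = new Map<string, Stamp>();

  constructor(private readonly participantIds: string[]) {}

  recordStamp(userId: string, userSpecificData: string): void {
    if (this.participantIds.indexOf(userId) === -1) return; // ignore non-participants
    this.stamps.set(userId, { userId, stampedAt: Date.now(), userSpecificData });
  }

  // The collaborative session is complete only when every participant has stamped.
  isSealed(): boolean {
    return this.participantIds.every((id) => this.stamps.has(id));
  }
}
```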
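
For publication 20240069642 (gesture-based cropping), a sketch that treats the traced fingertip path as a closed screen-space polygon and selects the detected objects whose centers fall inside it, using a standard ray-casting point-in-polygon test. The object-detector output shape is an assumption.

```typescript
// Turn a traced "scissors" path into a selection: objects whose screen-space
// centers lie inside the closed path are candidates to crop into new virtual
// content for the collaborative object.

type Point = { x: number; y: number };

// Classic even-odd ray-casting point-in-polygon test.
function insidePolygon(p: Point, polygon: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i];
    const b = polygon[j];
    const crosses =
      (a.y > p.y) !== (b.y > p.y) &&
      p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}

interface DetectedObject {
  label: string;
  center: Point; // screen-space center reported by an object detector
}

function objectsInsideGesturePath(path: Point[], objects: DetectedObject[]): DetectedObject[] {
  return objects.filter((o) => insidePolygon(o.center, path));
}
```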
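
For publication 20240070299 (collaboration period with countdown), a sketch of a deadline gate that accepts contributions only while the countdown is running and treats the object as read-only afterward.

```typescript
// Contributions are accepted only while the collaboration period is open; the
// assembled object becomes available to all users once the countdown ends.

class CollaborationPeriod {
  constructor(private readonly endsAt: number) {} // epoch ms deadline

  remainingMs(now: number = Date.now()): number {
    return Math.max(0, this.endsAt - now); // value shown in the countdown UI
  }

  isOpen(now: number = Date.now()): boolean {
    return now < this.endsAt;
  }
}

// Example: a one-hour capsule; contributions after the deadline are rejected.
const period = new CollaborationPeriod(Date.now() + 60 * 60 * 1000);

function tryContribute(addContent: () => void): boolean {
  if (!period.isOpen()) return false; // period over: contents are now read-only
  addContent();
  return true;
}
```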