Patents by Inventor Jesse Dylan MERRIAM

Jesse Dylan MERRIAM has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240104611
    Abstract: The techniques disclosed herein provide recommendations of content based on user selections of curated review responses. A system can provide a set of curated response candidates and a related question to users providing a review of a product. The set of curated response candidates is selected based on a category that is determined by one or more factors. For example, if a user is to provide a review on a video game, the category can be based on aspects of the video game, such as visual features, game play features, etc. A user can respond to the question by selecting at least one of the curated response candidates. The system can then analyze a data structure that associates the selected response with one or more characteristics to determine characteristics that are preferred by the user. The system can then use the characteristics to recommend other content.
    Type: Application
    Filed: September 22, 2022
    Publication date: March 28, 2024
    Inventors: Eric Warner TAYLOR, Aaron John Mayer BUCKLEY, Jesse Dylan MERRIAM
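    The response-to-characteristics lookup described in this abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the patented implementation; the response strings, characteristic tags, and catalog are all hypothetical:

    ```python
    # Sketch: map a user's selected curated responses to characteristics,
    # then recommend catalog items sharing the preferred characteristics.
    from collections import Counter

    # Assumed data structure: each curated response maps to characteristics.
    RESPONSE_CHARACTERISTICS = {
        "Loved the art style": ["visuals", "stylized"],
        "Great co-op battles": ["multiplayer", "combat"],
        "Relaxing exploration": ["open-world", "casual"],
    }

    # Hypothetical content catalog tagged with the same characteristics.
    CATALOG = {
        "Game A": ["visuals", "combat"],
        "Game B": ["multiplayer", "combat"],
        "Game C": ["casual", "puzzle"],
    }

    def recommend(selected_responses, top_n=1):
        """Tally the characteristics implied by the user's selections,
        then rank catalog items by how many preferred characteristics
        they share."""
        prefs = Counter()
        for response in selected_responses:
            prefs.update(RESPONSE_CHARACTERISTICS.get(response, []))
        scored = {
            title: sum(prefs[c] for c in tags)
            for title, tags in CATALOG.items()
        }
        return sorted(scored, key=scored.get, reverse=True)[:top_n]
    ```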
  • Patent number: 11386872
    Abstract: Described herein is a system and method for experiencing a virtual object at a plurality of sizes. During an AR session, the virtual object can be created at a first size based upon a first scale (e.g., miniature, tabletop size). Once created, information regarding the virtual object can be stored. Thereafter, the virtual object can be displayed in an AR session at a second size based upon a second scale (e.g., full size or life size). In some embodiments, functionality of at least portion(s) of the virtual object is different when experienced in an AR session at the second size than when experienced in an AR session at the first size.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: July 12, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew Cahill, Torfi Frans Olafsson, Jesse Dylan Merriam, Michael Meincke Persson, Bradley Reid Shuber
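    The two-scale behavior described above can be sketched as follows. This is a toy reconstruction under assumed conventions (a stored base size plus a per-session scale factor); the class, fields, and behavior rule are all hypothetical:

    ```python
    # Sketch: a virtual object authored at life size, displayed at a
    # tabletop scale first and at full scale later, with functionality
    # that differs by scale.
    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        name: str
        base_height_m: float   # dimensions authored at full (life) size
        scale: float = 1.0     # current display scale in the AR session

        def displayed_height(self):
            return self.base_height_m * self.scale

        def interactive_doors(self):
            # Illustrative behavior difference: doors only open when the
            # object is experienced at (near) life size.
            return self.scale >= 1.0

    castle = VirtualObject("castle", base_height_m=8.0, scale=0.05)
    tabletop_height = castle.displayed_height()  # miniature on a table
    castle.scale = 1.0                           # re-display at life size
    ```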
  • Publication number: 20210346810
    Abstract: Described herein is a system and method for sharing an AR game within a shared coordinate space created between devices with initially disjoint relative coordinate spaces. Once the shared coordinate space is created, an AR video game can provide a first mode in which the users engage in game play actions that have consequences according to pre-established game rules. The AR video game can provide a second mode (“sandbox mode”) in which users engage in non-destructive game play actions that do not have consequences once the second mode has been terminated. Further described herein is a system and method of using geolocation information within an AR session in which a virtual action can be initiated by a user that causes a corresponding virtual action to be displayed on a map of a virtual environment that parallels a physical environment displayed on a user gaming device of another user.
    Type: Application
    Filed: July 19, 2021
    Publication date: November 11, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Jesse Dylan MERRIAM, Torfi Frans OLAFSSON, Timothy James SCHUTZ, Michael Meincke PERSSON
  • Patent number: 11097194
    Abstract: Described herein is a system and method for sharing an AR game within a shared coordinate space created between devices with initially disjoint relative coordinate spaces. Once the shared coordinate space is created, an AR video game can provide a first mode in which the users engage in game play actions that have consequences according to pre-established game rules. The AR video game can provide a second mode (“sandbox mode”) in which users engage in non-destructive game play actions that do not have consequences once the second mode has been terminated. Further described herein is a system and method of using geolocation information within an AR session in which a virtual action can be initiated by a user that causes a corresponding virtual action to be displayed on a map of a virtual environment that parallels a physical environment displayed on a user gaming device of another user.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: August 24, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew Cahill, Jesse Dylan Merriam, Torfi Frans Olafsson, Timothy James Schutz, Michael Meincke Persson
  • Patent number: 11090561
    Abstract: Described herein is a system and method for creating a shared coordinate space in an augmented reality session between two devices with disjoint relative coordinate spaces. With AR tracking initiated, each device has its own relative coordinate space. A spatial alignment image can be recognized by a device joining an AR session, which records its location (e.g., six degrees of position). The joining device can utilize an obtained location of an inviting device at or about the time the spatial alignment image was recognized, and a spatial origin of the inviting device, to calculate an offset between the joining device and the inviting device, establishing a shared coordinate space between the joining device and the inviting device.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: August 17, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew Cahill, Torfi Frans Olafsson, Jesse Dylan Merriam, Michael Meincke Persson, Bradley Reid Shuber
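    The offset calculation in this abstract can be sketched roughly as below. This is an assumed, simplified reconstruction: both devices observe the same spatial alignment image in their own coordinate spaces, and the difference between those observations yields a translation mapping one space into the other (rotation is omitted for brevity; function names are hypothetical):

    ```python
    # Sketch: establish a shared coordinate space from the pose of a
    # common spatial alignment image seen by both devices.

    def compute_offset(image_pos_inviter, image_pos_joiner):
        """Translation that maps a point in the joining device's
        coordinate space into the inviting device's space."""
        return tuple(a - b for a, b in zip(image_pos_inviter, image_pos_joiner))

    def to_shared(point_joiner, offset):
        """Express a joiner-space point in the shared (inviter) space."""
        return tuple(p + o for p, o in zip(point_joiner, offset))
    ```

    With the offset established once, every subsequent point the joining device tracks can be translated into the shared space, so both devices agree on where virtual content sits.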
  • Publication number: 20200380259
    Abstract: Described herein is a system and method for providing a response (e.g., responsive action of a virtual character) within an augmented reality session. Positional data information regarding at least a portion of a human skeleton is received. A real world gesture is identified based, at least in part, upon the received positional data. A response to the identified real world gesture (e.g., responsive action of the virtual character) is determined based, at least in part, upon the identified real world gesture. The determined response is caused to be performed in the augmented reality session.
    Type: Application
    Filed: August 19, 2019
    Publication date: December 3, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Torfi Frans OLAFSSON, Jesse Dylan MERRIAM, Timothy James SCHUTZ, Craig Sean STEYN, Michael Meincke PERSSON
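    The gesture-to-response pipeline in this abstract can be sketched as follows. The classifier rule, joint names, and response strings are all hypothetical placeholders, not the patented method:

    ```python
    # Sketch: identify a real-world gesture from skeleton joint positions
    # and map it to a virtual character's responsive action.

    GESTURE_RESPONSES = {   # assumed gesture-to-response mapping
        "wave": "character_waves_back",
        "point": "character_looks_at_target",
    }

    def classify_gesture(joints):
        """Toy rule: a right hand raised above the head is a 'wave'.
        joints: dict of joint name -> (x, y, z), y pointing up."""
        if joints["right_hand"][1] > joints["head"][1]:
            return "wave"
        return "idle"

    def respond(joints):
        gesture = classify_gesture(joints)
        return GESTURE_RESPONSES.get(gesture, "no_response")
    ```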
  • Patent number: 10852906
    Abstract: Described herein is a system and method for identifying locations for virtual items within a physical environment. For each of a plurality of users, information is received regarding the user's interaction with one or more interactive virtual items presented on a map of a virtual environment that parallels at least portions of the physical environment. The information comprises the user's physical location when interacting with a particular interactive item. Each interactive virtual item has an associated physical location within the physical environment. The received information is aggregated over time to determine an associated physical location within the physical environment for the virtual item. Within the virtual environment, the virtual item is placed at the determined associated physical location.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: December 1, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew Cahill, Torfi Frans Olafsson, Irina Koulinitch, Stacy Jiayan Chen, Jesse Dylan Merriam, Michael Meincke Persson, Bradley Reid Shuber
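    The aggregation step described above can be sketched minimally as a centroid of the reported interaction locations. This is one plausible aggregation, assumed for illustration; the patent does not specify this particular formula:

    ```python
    # Sketch: aggregate users' physical locations at interaction time to
    # choose where a virtual item should be placed.

    def place_item(interactions):
        """interactions: list of (lat, lon) user positions recorded when
        users interacted with the item. Returns the centroid, or None
        when no interactions have been observed yet."""
        if not interactions:
            return None
        n = len(interactions)
        return (sum(p[0] for p in interactions) / n,
                sum(p[1] for p in interactions) / n)
    ```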
  • Patent number: 10848920
    Abstract: Described herein is a system and method of generating precise location information. Feature points of an image are received from a user device (e.g., mobile phone). Location information of the user device (e.g., GPS coordinates) and relative transform information from the user device are also received. An image anchor is identified based upon the received feature points, relative transform information, and the received location information. Precise location information of the identified image anchor is provided to the user device. The precise location information of the identified image anchor is based upon filtered feature points, location information and relative transform information received from a plurality of user devices over a period of time (e.g., hours, days, weeks, months). The precise location information of the identified image anchor is updated based upon the location information of the user device and the relative transform information received from the user device.
    Type: Grant
    Filed: September 17, 2019
    Date of Patent: November 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Torfi Frans Olafsson, Jason Matthew Cahill, Jesse Dylan Merriam, David Richard Reed, Timothy Arthur Gill, Gabriel Takacs, Bogdan Cosmin Bucur, Andrew Nicholas Bond
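    The filtering of crowd-sourced reports described in this abstract can be sketched as a running average with outlier rejection. This is an assumed refinement scheme for illustration; the class name, threshold, and update rule are hypothetical:

    ```python
    # Sketch: refine an image anchor's location from positions reported
    # by many devices over time, discarding implausible reports.

    class ImageAnchor:
        def __init__(self, initial_position):
            self.position = list(initial_position)
            self.reports = 1

        def update(self, reported_position, max_error=10.0):
            """Fold a new device report into the running average.
            Returns False when the report is rejected as an outlier."""
            dist = sum((a - b) ** 2 for a, b in
                       zip(self.position, reported_position)) ** 0.5
            if dist > max_error:
                return False
            self.reports += 1
            self.position = [a + (b - a) / self.reports
                             for a, b in zip(self.position, reported_position)]
            return True
    ```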
  • Publication number: 20200360823
    Abstract: Described herein is a system and method for sharing an AR game within a shared coordinate space created between devices with initially disjoint relative coordinate spaces. Once the shared coordinate space is created, an AR video game can provide a first mode in which the users engage in game play actions that have consequences according to pre-established game rules. The AR video game can provide a second mode (“sandbox mode”) in which users engage in non-destructive game play actions that do not have consequences once the second mode has been terminated. Further described herein is a system and method of using geolocation information within an AR session in which a virtual action can be initiated by a user that causes a corresponding virtual action to be displayed on a map of a virtual environment that parallels a physical environment displayed on a user gaming device of another user.
    Type: Application
    Filed: May 16, 2019
    Publication date: November 19, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Jesse Dylan MERRIAM, Torfi Frans OLAFSSON, Timothy James SCHUTZ, Michael Meincke PERSSON
  • Publication number: 20200360816
    Abstract: Described herein is a system and method of generating a self-portrait within an augmented reality session. A first camera of a device can capture an image of user(s) in a physical environment (e.g., real-world). A depth sensor can determine a distance of the user(s) from the device. A second camera of the device can determine an orientation and/or change of orientation of the device in the physical environment. Virtual object(s) in proximity to the user(s) can be determined based, at least in part, upon the determined physical location and determined orientation of the user gaming device. Each virtual object can have an associated physical location within the physical environment. A representation of the identified virtual object(s) can then be generated and an AR self-portrait can be generated based, at least in part, upon the captured representation of the user in the physical environment and the generated representation of the identified virtual object(s).
    Type: Application
    Filed: May 16, 2019
    Publication date: November 19, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Jesse Dylan MERRIAM, Torfi Frans OLAFSSON, Timothy James SCHUTZ
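    The proximity test used to pick which virtual objects appear in the self-portrait can be sketched as below. This is an assumed selection rule (objects roughly at the user's measured depth from the device); the function and margin are hypothetical:

    ```python
    # Sketch: select the virtual objects close enough to the photographed
    # user to include in an AR self-portrait composite.

    def objects_in_frame(user_distance_m, device_pos, objects, margin_m=2.0):
        """objects: dict of name -> (x, y, z) physical location.
        Include objects whose distance from the device is within
        margin_m of the user's depth, i.e. roughly in the user's plane."""
        selected = []
        for name, pos in objects.items():
            dist = sum((p - d) ** 2 for p, d in zip(pos, device_pos)) ** 0.5
            if abs(dist - user_distance_m) <= margin_m:
                selected.append(name)
        return selected
    ```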
  • Publication number: 20200261799
    Abstract: Described herein is a system and method for creating a shared coordinate space in an augmented reality session between two devices with disjoint relative coordinate spaces. With AR tracking initiated, each device has its own relative coordinate space. A spatial alignment image can be recognized by a device joining an AR session, which records its location (e.g., six degrees of position). The joining device can utilize an obtained location of an inviting device at or about the time the spatial alignment image was recognized, and a spatial origin of the inviting device, to calculate an offset between the joining device and the inviting device, establishing a shared coordinate space between the joining device and the inviting device.
    Type: Application
    Filed: February 15, 2019
    Publication date: August 20, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Torfi Frans OLAFSSON, Jesse Dylan MERRIAM, Michael Meincke PERSSON, Bradley Reid SHUBER
  • Publication number: 20200264757
    Abstract: Described herein is a system and method for identifying locations for virtual items within a physical environment. For each of a plurality of users, information is received regarding the user's interaction with one or more interactive virtual items presented on a map of a virtual environment that parallels at least portions of the physical environment. The information comprises the user's physical location when interacting with a particular interactive item. Each interactive virtual item has an associated physical location within the physical environment. The received information is aggregated over time to determine an associated physical location within the physical environment for the virtual item. Within the virtual environment, the virtual item is placed at the determined associated physical location.
    Type: Application
    Filed: February 15, 2019
    Publication date: August 20, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Torfi Frans OLAFSSON, Irina KOULINITCH, Stacy Jiayan CHEN, Jesse Dylan MERRIAM, Michael Meincke PERSSON, Bradley Reid SHUBER
  • Publication number: 20200265806
    Abstract: Described herein is a system and method for experiencing a virtual object at a plurality of sizes. During an AR session, the virtual object can be created at a first size based upon a first scale (e.g., miniature, tabletop size). Once created, information regarding the virtual object can be stored. Thereafter, the virtual object can be displayed in an AR session at a second size based upon a second scale (e.g., full size or life size). In some embodiments, functionality of at least portion(s) of the virtual object is different when experienced in an AR session at the second size than when experienced in an AR session at the first size.
    Type: Application
    Filed: February 15, 2019
    Publication date: August 20, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jason Matthew CAHILL, Torfi Frans OLAFSSON, Jesse Dylan MERRIAM, Michael Meincke PERSSON, Bradley Reid SHUBER