Patents by Inventor Amy Mun Hong

Amy Mun Hong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11824821
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Grant
    Filed: November 1, 2022
    Date of Patent: November 21, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
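The flow described in the abstract above — expressive effect data plus detected recipient state driving an avatar and modified message content — can be sketched roughly as follows. This is an illustrative toy only; every field name and value ("emotion", "gesture", "busy", "emphasis") is an assumption, not the patented implementation.

```python
def present_message(message, effect_data, recipient_state):
    """Choose an avatar emotion/gesture from sender effect data,
    react to detected recipient state, and optionally modify the text."""
    avatar = {
        "emotion": effect_data.get("emotion", "neutral"),
        "gesture": effect_data.get("gesture", "idle"),
    }
    # Presentation is reactive to detected state of the recipient,
    # e.g. tone the animation down when the recipient is busy.
    if recipient_state.get("busy"):
        avatar["gesture"] = "subtle_" + avatar["gesture"]
    text = message
    if effect_data.get("emphasis"):
        text = message.upper()  # example of modified message content
    return {"avatar": avatar, "text": text}
```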
  • Patent number: 11755122
    Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
    Type: Grant
    Filed: May 23, 2022
    Date of Patent: September 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
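The method in the abstract above — recognize a hand gesture from tracking data, map it to an emoji, present it locally, and instruct other display devices to present it — can be sketched as below. The gesture names, emoji table, and the assumption that upstream hand tracking has already classified the pose are all illustrative, not the claimed implementation.

```python
# Hypothetical table mapping recognized hand gestures to emoji.
GESTURE_EMOJI = {
    "thumbs_up": "👍",
    "victory": "✌️",
    "heart": "❤️",
}

def recognize_gesture(hand_pose):
    """Stand-in recognizer: assume the hand tracking pipeline has
    already labeled the pose with a gesture name (or None)."""
    return hand_pose.get("gesture")

def handle_hand_pose(hand_pose, local_display, remote_displays):
    """Recognize a gesture, present its emoji locally, and tell
    the other display devices to present it too."""
    gesture = recognize_gesture(hand_pose)
    emoji = GESTURE_EMOJI.get(gesture)
    if emoji is None:
        return None
    local_display.present(emoji)
    for display in remote_displays:
        display.send_present_instruction(emoji)
    return emoji
```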
  • Publication number: 20230111597
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Application
    Filed: November 1, 2022
    Publication date: April 13, 2023
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
  • Patent number: 11509612
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Grant
    Filed: December 15, 2020
    Date of Patent: November 22, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
  • Publication number: 20220283646
    Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
    Type: Application
    Filed: May 23, 2022
    Publication date: September 8, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
  • Patent number: 11429186
    Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user, and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
    Type: Grant
    Filed: November 18, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Mathew J. Lamb, Anthony James Ambrus, Amy Mun Hong, Jonathan Palmer, Sophie Stellmach
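The leaky-integrator attention value in the abstract above can be sketched as a simple exponential decay that accumulates while the user's gaze dwells on an entity and leaks away otherwise; the user input is then associated with the highest-scoring entity. The time constant, gain, and sample rate below are illustrative assumptions, not values from the patent.

```python
import math

def update_attention(value, gazed_at, dt, tau=0.5, gain=1.0):
    """One leaky-integrator step: decay toward 0, integrate gaze input."""
    decay = math.exp(-dt / tau)                  # leak term
    inp = gain * (1.0 - decay) if gazed_at else 0.0
    return value * decay + inp

def attention_over_samples(entities, samples, dt=1 / 60):
    """Run the integrator over eye tracking samples; each sample names
    the entity the gaze ray currently intersects (or None)."""
    values = {e: 0.0 for e in entities}
    for target in samples:
        for e in entities:
            values[e] = update_attention(values[e], e == target, dt)
    return values

def select_entity(values):
    """Associate a user input with the entity holding the most attention."""
    return max(values, key=values.get)
```

Because attention decays rather than vanishing, an entity gazed at recently can still win the association even if the last few samples landed elsewhere — one plausible reading of why a time-dependent value is used instead of the instantaneous gaze target.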
  • Publication number: 20220191157
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Application
    Filed: December 15, 2020
    Publication date: June 16, 2022
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
  • Patent number: 11340707
    Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
    Type: Grant
    Filed: May 29, 2020
    Date of Patent: May 24, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
  • Publication number: 20220155857
    Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user, and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
    Type: Application
    Filed: November 18, 2020
    Publication date: May 19, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Mathew J. Lamb, Anthony James Ambrus, Amy Mun Hong, Jonathan Palmer, Sophie Stellmach
  • Publication number: 20210373672
    Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
    Type: Application
    Filed: May 29, 2020
    Publication date: December 2, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
  • Patent number: D737526
    Type: Grant
    Filed: May 21, 2014
    Date of Patent: August 25, 2015
    Inventor: Amy Mun Hong