Patents by Inventor Amy Mun Hong
Amy Mun Hong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11824821
Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
Type: Grant
Filed: November 1, 2022
Date of Patent: November 21, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
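The abstract above describes mapping "expressive effect data" for a sender to an avatar gesture and to modified message content. A minimal, hypothetical sketch of that idea follows; the field names, gesture table, and text modification are illustrative assumptions, not the patent's actual design.

```python
# Illustrative mapping from a sender's expressive effect data to an avatar
# gesture and a modified message. All names and rules here are assumptions
# made for the sketch, not claims taken from the patent.
AVATAR_GESTURES = {"excited": "jump", "calm": "nod", "busy": "glance_away"}

def render_message(text, effect):
    """Return the avatar gesture and message content to present."""
    mood = effect.get("mood", "calm")              # expressive effect data
    gesture = AVATAR_GESTURES.get(mood, "nod")     # avatar conveys the mood
    shown = text + "!" if mood == "excited" else text  # modified content
    return {"avatar_gesture": gesture, "message": shown}

print(render_message("See you soon", {"mood": "excited"}))
```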
-
Patent number: 11755122
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Grant
Filed: May 23, 2022
Date of Patent: September 12, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
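The method in this abstract is a three-stage pipeline: recognize a gesture from hand tracking data, map it to an emoji, then present it locally and instruct peer displays to do the same. A minimal sketch, assuming a toy pose representation (which fingers are extended) and an invented gesture-to-emoji table; neither comes from the patent.

```python
# Sketch of the gesture -> emoji -> broadcast pipeline from the abstract.
# The finger-set encodings and the emoji mapping are illustrative only.
GESTURE_TO_EMOJI = {
    "thumbs_up": "👍",
    "victory": "✌️",
    "open_palm": "👋",
}

def recognize_gesture(extended_fingers):
    """Classify a hand pose, here reduced to the set of extended fingers."""
    fingers = frozenset(extended_fingers)
    if fingers == {"thumb"}:
        return "thumbs_up"
    if fingers == {"index", "middle"}:
        return "victory"
    if fingers == {"thumb", "index", "middle", "ring", "pinky"}:
        return "open_palm"
    return None  # unrecognized pose

def present_emoji(extended_fingers, peer_displays):
    """Recognize, map to an emoji, show it, and send it to peer displays."""
    gesture = recognize_gesture(extended_fingers)
    if gesture is None:
        return None
    emoji = GESTURE_TO_EMOJI[gesture]
    for display in peer_displays:  # instruct other displays to present it
        display.append(emoji)
    return emoji

peers = [[], []]
print(present_emoji(["index", "middle"], peers))  # shows ✌️ locally and on peers
```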
-
Publication number: 20230111597
Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
Type: Application
Filed: November 1, 2022
Publication date: April 13, 2023
Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
-
Patent number: 11509612
Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
Type: Grant
Filed: December 15, 2020
Date of Patent: November 22, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
-
Publication number: 20220283646
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Application
Filed: May 23, 2022
Publication date: September 8, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
-
Patent number: 11429186
Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
Type: Grant
Filed: November 18, 2020
Date of Patent: August 30, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Austin S. Lee, Mathew J. Lamb, Anthony James Ambrus, Amy Mun Hong, Jonathan Palmer, Sophie Stellmach
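A leaky integrator, as named in this abstract, accumulates a per-entity attention value each time the user's gaze lands on that entity and decays it exponentially between samples, so recent sustained gaze outweighs a brief glance. The sketch below is a hypothetical illustration of that mechanism; the entity names, time constant, and update rule are assumptions, not the patent's claimed implementation.

```python
import math

# Hypothetical leaky-integrator attention model: attention decays by
# exp(-dt/tau) every sample, and the currently gazed entity gains +1.
class AttentionTracker:
    def __init__(self, entities, time_constant_s=1.0):
        self.tau = time_constant_s
        self.attention = {e: 0.0 for e in entities}

    def update(self, gazed_entity, dt_s):
        """Decay every entity's attention, then credit the gazed entity."""
        decay = math.exp(-dt_s / self.tau)
        for entity in self.attention:
            self.attention[entity] *= decay
        if gazed_entity is not None:
            self.attention[gazed_entity] += 1.0

    def most_attended(self):
        """Entity to associate with a location-dependent user input."""
        return max(self.attention, key=self.attention.get)

tracker = AttentionTracker(["lamp", "door", "window"])
for _ in range(5):              # five 100 ms gaze samples on the lamp
    tracker.update("lamp", 0.1)
tracker.update("door", 0.1)     # one brief glance at the door
print(tracker.most_attended())  # the lamp's integrated attention still dominates
```

Because the value integrates over time rather than tracking only the latest sample, a user can look away momentarily (for example, at a voice prompt) without the system reassigning their intent.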
-
Publication number: 20220191157
Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
Type: Application
Filed: December 15, 2020
Publication date: June 16, 2022
Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
-
Patent number: 11340707
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Grant
Filed: May 29, 2020
Date of Patent: May 24, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
-
Publication number: 20220155857
Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
Type: Application
Filed: November 18, 2020
Publication date: May 19, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Austin S. Lee, Mathew J. Lamb, Anthony James Ambrus, Amy Mun Hong, Jonathan Palmer, Sophie Stellmach
-
Publication number: 20210373672
Abstract: Examples are disclosed that relate to hand gesture-based emojis. One example provides, on a display device, a method comprising receiving hand tracking data representing a pose of a hand in a coordinate system, based on the hand tracking data, recognizing a hand gesture, and identifying an emoji corresponding to the hand gesture. The method further comprises presenting the emoji on the display device, and sending an instruction to one or more other display devices to present the emoji.
Type: Application
Filed: May 29, 2020
Publication date: December 2, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Michael Harley Notter, Jenny Kam, Sheng Kai Tang, Kenneth Mitchell Jakubzak, Adam Edwin Behringer, Amy Mun Hong, Joshua Kyle Neff, Sophie Stellmach, Mathew J. Lamb, Nicholas Ferianc Kamuda
-
Patent number: D737526
Type: Grant
Filed: May 21, 2014
Date of Patent: August 25, 2015
Inventor: Amy Mun Hong