Patents by Inventor Arjmand Micheal Samuel

Arjmand Micheal Samuel has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have been granted by the United States Patent and Trademark Office (USPTO). The filings below share a single abstract; a brief, illustrative sketch of the annotation-and-search idea it describes follows the listing.

  • Patent number: 10678852
    Abstract: Among other things, one or more techniques and/or systems are provided for annotating content based upon user reaction data and/or for maintaining a searchable content repository. That is, a user may request and/or opt in for user reaction data to be detected while the user is experiencing content (e.g., watching a movie, walking through a park, interacting with a website, participating in a phone conversation, etc.). Metadata associated with the content may be used to determine when and/or what sensors to use to detect the user reaction data (e.g., metadata specifying an emotional part of a movie). The content may be annotated with a reaction annotation corresponding to the user reaction data, which may be used to organize, search, and/or interact with the content. A search interface may allow users to search for content based upon annotation data and/or aggregated annotation data of one or more users who experienced the content.
    Type: Grant
    Filed: June 20, 2017
    Date of Patent: June 9, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Emmanouil Koukoumidis, Brian Beckman, Nicholas Donald Atkins Lane, Arjmand Micheal Samuel
  • Publication number: 20170286538
    Abstract: Among other things, one or more techniques and/or systems are provided for annotating content based upon user reaction data and/or for maintaining a searchable content repository. That is, a user may request and/or opt in for user reaction data to be detected while the user is experiencing content (e.g., watching a movie, walking through a park, interacting with a website, participating in a phone conversation, etc.). Metadata associated with the content may be used to determine when and/or what sensors to use to detect the user reaction data (e.g., metadata specifying an emotional part of a movie). The content may be annotated with a reaction annotation corresponding to the user reaction data, which may be used to organize, search, and/or interact with the content. A search interface may allow users to search for content based upon annotation data and/or aggregated annotation data of one or more users who experienced the content.
    Type: Application
    Filed: June 20, 2017
    Publication date: October 5, 2017
    Inventors: Emmanouil Koukoumidis, Brian Beckman, Nicholas Donald Atkins Lane, Arjmand Micheal Samuel
  • Patent number: 9721010
    Abstract: Among other things, one or more techniques and/or systems are provided for annotating content based upon user reaction data and/or for maintaining a searchable content repository. That is, a user may request and/or opt in for user reaction data to be detected while the user is experiencing content (e.g., watching a movie, walking through a park, interacting with a website, participating in a phone conversation, etc.). Metadata associated with the content may be used to determine when and/or what sensors to use to detect the user reaction data (e.g., metadata specifying an emotional part of a movie). The content may be annotated with a reaction annotation corresponding to the user reaction data, which may be used to organize, search, and/or interact with the content. A search interface may allow users to search for content based upon annotation data and/or aggregated annotation data of one or more users who experienced the content.
    Type: Grant
    Filed: December 13, 2012
    Date of Patent: August 1, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Emmanouil Koukoumidis, Brian Beckman, Nicholas Donald Atkins Lane, Arjmand Micheal Samuel
  • Publication number: 20140172848
    Abstract: Among other things, one or more techniques and/or systems are provided for annotating content based upon user reaction data and/or for maintaining a searchable content repository. That is, a user may request and/or opt in for user reaction data to be detected while the user is experiencing content (e.g., watching a movie, walking through a park, interacting with a website, participating in a phone conversation, etc.). Metadata associated with the content may be used to determine when and/or what sensors to use to detect the user reaction data (e.g., metadata specifying an emotional part of a movie). The content may be annotated with a reaction annotation corresponding to the user reaction data, which may be used to organize, search, and/or interact with the content. A search interface may allow users to search for content based upon annotation data and/or aggregated annotation data of one or more users who experienced the content.
    Type: Application
    Filed: December 13, 2012
    Publication date: June 19, 2014
    Inventors: Emmanouil Koukoumidis, Brian Beckman, Nicholas Donald Atkins Lane, Arjmand Micheal Samuel
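
The filings above all describe the same general workflow: capture user reaction data from sensors while content is being experienced, attach it to the content as reaction annotations, and let users search and aggregate those annotations. The Python sketch below is only an illustration of that idea; every identifier in it (ReactionAnnotation, ContentRepository, annotate, search, aggregate) is a hypothetical name chosen for this example and does not come from the patents or from any Microsoft API.

# Minimal, illustrative sketch of the annotate-and-search idea in the shared abstract.
# All names here are assumptions made for this example, not terms from the patents.
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class ReactionAnnotation:
    """A single user reaction captured while experiencing a piece of content."""
    content_id: str
    user_id: str
    reaction: str          # e.g. "smile", "laugh", "frown"
    sensor: str            # e.g. "camera", "microphone"
    offset_seconds: float  # where in the content the reaction occurred


@dataclass
class ContentRepository:
    """Stores reaction annotations and supports searching and aggregating them."""
    annotations: list[ReactionAnnotation] = field(default_factory=list)

    def annotate(self, annotation: ReactionAnnotation) -> None:
        # Only reactions from users who opted in would be recorded here.
        self.annotations.append(annotation)

    def search(self, reaction: str) -> list[str]:
        """Return the ids of content annotated with the given reaction."""
        return sorted({a.content_id for a in self.annotations if a.reaction == reaction})

    def aggregate(self, content_id: str) -> Counter:
        """Aggregate reaction counts across all users for one piece of content."""
        return Counter(a.reaction for a in self.annotations if a.content_id == content_id)


if __name__ == "__main__":
    repo = ContentRepository()
    repo.annotate(ReactionAnnotation("movie-42", "u1", "laugh", "camera", 75.0))
    repo.annotate(ReactionAnnotation("movie-42", "u2", "laugh", "microphone", 76.5))
    repo.annotate(ReactionAnnotation("park-walk-7", "u1", "smile", "camera", 12.0))

    print(repo.search("laugh"))        # ['movie-42']
    print(repo.aggregate("movie-42"))  # Counter({'laugh': 2})

Running the example finds "movie-42" when searching for "laugh" annotations and reports two laugh reactions aggregated across the two users, mirroring the abstract's notion of searching content by annotation data and by aggregated annotation data from multiple users.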