Patents by Inventor Edward Sean Lloyd Rintel

Edward Sean Lloyd Rintel has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240121280
    Abstract: Systems, methods, and computer-readable storage devices are disclosed for simulated choral audio chatter in communication systems. One method includes: receiving audio data from each of a plurality of users participating in a first group of a plurality of groups for an event using a communication system; generating first simulated choral audio chatter based on the audio data received from each of the plurality of users in the first group; and providing the generated first simulated choral audio chatter to at least one user of a plurality of users of the event.
    Type: Application
    Filed: October 7, 2022
    Publication date: April 11, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Edward Sean Lloyd RINTEL, Amos MILLER, Andrew D. WILSON, Sasa JUNUZOVIC
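
The abstract above describes mixing the audio of one group of users into a single "choral chatter" track that is played to others at the event. A minimal sketch of that mixing step, assuming simple per-user PCM sample buffers and a fixed attenuation factor (both illustrative assumptions, not taken from the patent):

```python
# Illustrative sketch: mix per-user audio buffers from one group into a
# single "choral chatter" track. Buffer format and attenuation are assumptions.
from typing import List

def mix_choral_chatter(user_buffers: List[List[float]],
                       attenuation: float = 0.5) -> List[float]:
    """Average equally sized PCM sample buffers and scale them down so the
    combined chatter sits behind the primary event audio."""
    if not user_buffers:
        return []
    length = min(len(buf) for buf in user_buffers)
    mixed = []
    for i in range(length):
        sample = sum(buf[i] for buf in user_buffers) / len(user_buffers)
        mixed.append(sample * attenuation)
    return mixed

# Example: three users in the first group, each supplying a short buffer.
group_audio = [[0.1, 0.2, 0.3], [0.0, 0.1, 0.2], [0.2, 0.2, 0.1]]
chatter = mix_choral_chatter(group_audio)
```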
  • Publication number: 20230421722
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
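
A minimal sketch of the virtual presence described in the abstract above: a non-video participant's representation that moves with headset motion and exposes an explicit emotion control. The field names, angle units, and intensity range are assumptions for illustration only.

```python
# Illustrative sketch of a virtual presence for a non-video participant,
# driven by headset orientation and an explicit emotion setting.
from dataclasses import dataclass

@dataclass
class VirtualPresence:
    participant_id: str
    yaw: float = 0.0        # degrees, from headset motion sensing
    pitch: float = 0.0      # degrees, from headset motion sensing
    emotion: str = "neutral"
    intensity: float = 0.0  # 0.0 (subtle) .. 1.0 (strong)

    def apply_headset_motion(self, delta_yaw: float, delta_pitch: float) -> None:
        """Move the rendered presence when the headset reports movement."""
        self.yaw = (self.yaw + delta_yaw) % 360.0
        self.pitch = max(-90.0, min(90.0, self.pitch + delta_pitch))

    def set_emotion(self, emotion: str, intensity: float) -> None:
        """Explicit user input controlling the conveyed emotion type/intensity."""
        self.emotion = emotion
        self.intensity = max(0.0, min(1.0, intensity))

presence = VirtualPresence("non_video_participant_7")
presence.apply_headset_motion(delta_yaw=15.0, delta_pitch=-5.0)
presence.set_emotion("happy", 0.8)
```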
  • Patent number: 11792364
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: October 17, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Publication number: 20230266830
    Abstract: Aspects of the present disclosure relate to semantic user input for a computing device. In examples, user input is identified and processed to identify and automatically perform an associated semantic action. The semantic action may be determined based at least in part on an environmental context associated with the user input. Thus, an action determined for a given user input may change according to the environmental context in which the input was received. For example, an association between user input, an environmental context, and an action may be used to affect the behavior of a computing device as a result of identifying the user input in a scenario that has the environmental context. Such associations may be dynamically determined as a result of user interactions associated with manually provided input, for example to create, update, and/or remove semantic actions associated with a variety of user inputs.
    Type: Application
    Filed: February 22, 2022
    Publication date: August 24, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Eyal OFEK, Michel PAHUD, Edward Sean Lloyd RINTEL, Mar Gonzalez FRANCO, Payod PANDA
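
The abstract above associates a user input and an environmental context with a semantic action, so the same input can trigger different actions in different contexts, and associations can be created or removed dynamically. A minimal sketch of such a mapping; the example inputs, contexts, and actions are assumptions:

```python
# Illustrative sketch of the (user input, environmental context) -> action
# association. Example inputs and contexts are assumptions.
from typing import Callable, Dict, Tuple

SemanticKey = Tuple[str, str]  # (user input, environmental context)

class SemanticActionMap:
    def __init__(self) -> None:
        self._actions: Dict[SemanticKey, Callable[[], str]] = {}

    def associate(self, user_input: str, context: str,
                  action: Callable[[], str]) -> None:
        """Create or update an association from user interactions."""
        self._actions[(user_input, context)] = action

    def remove(self, user_input: str, context: str) -> None:
        self._actions.pop((user_input, context), None)

    def perform(self, user_input: str, context: str) -> str:
        """Resolve the same input to different actions depending on context."""
        action = self._actions.get((user_input, context))
        return action() if action else "no-op"

semantics = SemanticActionMap()
semantics.associate("nod", "in_meeting", lambda: "send thumbs-up reaction")
semantics.associate("nod", "listening_to_music", lambda: "skip to next track")
print(semantics.perform("nod", "in_meeting"))          # send thumbs-up reaction
print(semantics.perform("nod", "listening_to_music"))  # skip to next track
```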
  • Publication number: 20230259322
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: April 26, 2023
    Publication date: August 17, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
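
A minimal sketch of the headset-input pipeline in the abstract above: classify sensor data as implicit or explicit input, then combine the identified input with a context to produce a device action. The thresholds, event names, and action table are assumptions, not taken from the patent.

```python
# Illustrative sketch: classify headset sensor data as implicit or explicit
# input and turn it into a computing-device action based on context.
from typing import Dict, Optional, Tuple

def classify_headset_input(accel_magnitude: float,
                           button_pressed: bool) -> Optional[str]:
    """Explicit input (a button press) wins; strong head motion is treated
    as implicit input; small motion is ignored."""
    if button_pressed:
        return "explicit:button"
    if accel_magnitude > 2.0:
        return "implicit:head_shake"
    return None

def action_for(input_event: str, context: str) -> str:
    """Map an identified input plus its context to a device action."""
    table: Dict[Tuple[str, str], str] = {
        ("explicit:button", "incoming_call"): "answer call",
        ("implicit:head_shake", "incoming_call"): "decline call",
        ("implicit:head_shake", "media_playback"): "skip track",
    }
    return table.get((input_event, context), "ignore")

event = classify_headset_input(accel_magnitude=2.5, button_pressed=False)
if event:
    print(action_for(event, "incoming_call"))  # decline call
```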
  • Publication number: 20230236713
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Application
    Filed: March 17, 2023
    Publication date: July 27, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
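
The abstract above renders a per-attendee perspective that stays spatially consistent for the whole group. A minimal sketch of one way to do that with a shared seating order chosen in the establishing shot, rotated so each viewer sits at the front of their own view; the seating data is an assumption for illustration.

```python
# Illustrative sketch: one shared circular seat order, rotated per viewer so
# relative positions stay consistent for every attendee.
from typing import List

def perspective_view(seating: List[str], viewer: str) -> List[str]:
    """Rotate the shared seat order so the viewer is first; everyone's
    neighbours stay the same people on the same sides."""
    i = seating.index(viewer)
    return seating[i:] + seating[:i]

# Establishing shot: attendees placed themselves in these seats.
seats = ["Ana", "Ben", "Chen", "Dia"]
print(perspective_view(seats, "Ana"))   # ['Ana', 'Ben', 'Chen', 'Dia']
print(perspective_view(seats, "Chen"))  # ['Chen', 'Dia', 'Ana', 'Ben']
```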
  • Patent number: 11669294
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: June 6, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Patent number: 11656747
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Grant
    Filed: September 21, 2021
    Date of Patent: May 23, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John C. Tang, William Arthur Stewart Buxton, Andrew D. Wilson, Kori M. Inkpen, Sasa Junuzovic, Abigail J. Sellen, Edward Sean Lloyd Rintel
  • Publication number: 20230086906
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Application
    Filed: September 21, 2021
    Publication date: March 23, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
  • Publication number: 20220385855
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220382506
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Patent number: 11386620
    Abstract: A method of providing a geographically distributed live mixed-reality meeting is described. The method comprises receiving, from a camera at a first endpoint, a live video stream; generating a mixed reality view incorporating the received video stream; rendering the mixed reality view at a display at the first endpoint and transmitting the mixed reality view to at least one other geographically distant endpoint; receiving data defining a bounding area; calculating a real world anchor for the bounding area using the data defining the bounding area; rendering the bounding area in the mixed reality view at a real world position determined using the real world anchor; and applying different rule sets to content objects placed into the mixed reality view by users dependent upon the position of the content objects relative to the bounding area in real world space.
    Type: Grant
    Filed: May 18, 2018
    Date of Patent: July 12, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony Arnold Wieser, Martin Grayson, Kenton Paul Anthony O'Hara, Edward Sean Lloyd Rintel, Camilla Alice Longden, Philipp Steinacher, Dominic Roedel, Advait Sarkar, Shu Sam Chen, Jens Emil Krarup Gronbaek, Ding Wang
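
The final step of the method above applies different rule sets to content objects depending on where they sit relative to the real-world-anchored bounding area. A minimal sketch of that check, assuming an axis-aligned rectangle in floor coordinates and illustrative rule names (both assumptions):

```python
# Illustrative sketch: apply different rule sets to content objects depending
# on whether they fall inside a real-world-anchored bounding area.
from dataclasses import dataclass

@dataclass
class BoundingArea:
    # Anchored rectangle in real-world floor coordinates (metres).
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def rules_for(area: BoundingArea, x: float, y: float) -> str:
    """Objects placed inside the area follow one rule set (e.g. shared with
    all endpoints); objects outside follow another (e.g. local only)."""
    return "shared-editable" if area.contains(x, y) else "local-only"

whiteboard_zone = BoundingArea(0.0, 0.0, 2.0, 1.5)
print(rules_for(whiteboard_zone, 1.0, 0.5))  # shared-editable
print(rules_for(whiteboard_zone, 3.0, 0.5))  # local-only
```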
  • Patent number: 11310294
    Abstract: The present disclosure provides a number of techniques for enhancing a user's experience when joining teleconference sessions with multiple devices. When a user attempts to join a teleconference session using the same user identity for multiple devices, a system differentiates the devices as a primary device and at least one companion device. The primary device has a first control set for controlling a teleconference session and the at least one companion device has a companion control set for sharing content. In some embodiments, the primary device also has one set of selected streams, e.g., a stage view, and the companion device has a subset of those streams or other streams based on an activity level. In addition, the present disclosure provides a number of techniques for enabling users to share content using the companion devices.
    Type: Grant
    Filed: April 5, 2017
    Date of Patent: April 19, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Thomas Faulkner, Kenton O'Hara, Ewin Davis Kannuthottiyil, Edward Sean Lloyd Rintel, Kevin Morrison, Robert Corish, Anthony Wieser
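
The abstract above differentiates devices that join a session under the same user identity into a primary device and companion devices with different control sets. A minimal sketch of that join logic, where the first device to join becomes the primary; the role and control-set names are assumptions.

```python
# Illustrative sketch: first device joining under a user identity becomes the
# primary (full control set); later devices become companions (sharing only).
from typing import Dict, List

class Session:
    def __init__(self) -> None:
        self._devices: Dict[str, List[dict]] = {}

    def join(self, user_id: str, device_id: str) -> dict:
        devices = self._devices.setdefault(user_id, [])
        role = "primary" if not devices else "companion"
        controls = (["mute", "video", "hang_up", "share_content"]
                    if role == "primary" else ["share_content"])
        entry = {"device": device_id, "role": role, "controls": controls}
        devices.append(entry)
        return entry

session = Session()
print(session.join("rintel", "laptop"))  # primary, full control set
print(session.join("rintel", "phone"))   # companion, sharing controls only
```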
  • Patent number: 11212326
    Abstract: The present disclosure provides a number of techniques for enhancing a user's experience when joining teleconference sessions. When multiple users join a teleconference session using separate devices, a system identifies devices that are co-located. The co-located devices are identified by the use of a combination of data including, but not limited to, social signals, audio signals, and other location data. At least one device of the co-located devices has a first control set for controlling a teleconference session, and other devices of the co-located devices have a second control set for sharing content. The at least one device also has one set of streams and the other devices see a subset of those streams or other streams based on an activity level. In addition, the present disclosure provides a number of techniques for enabling users to use multiple devices to share content.
    Type: Grant
    Filed: April 5, 2017
    Date of Patent: December 28, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jason Thomas Faulkner, Kenton O'Hara, Ewin Davis Kannuthottiyil, Eric Randall Sexauer, Edward Sean Lloyd Rintel, Thaddeus Scott, Kevin Morrison, Robert Corish, Anthony Wieser
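
The abstract above identifies co-located devices by combining social, audio, and location data. A minimal sketch of one such combination as a weighted score with a threshold; the individual signals, weights, and threshold are assumptions, since the abstract only states that these kinds of data are combined.

```python
# Illustrative sketch of co-location detection from a combination of signals.
def colocated(audio_similarity: float,
              same_network: bool,
              calendar_overlap: bool,
              threshold: float = 0.6) -> bool:
    """Weighted combination of evidence that two devices share a room."""
    score = (0.5 * audio_similarity            # matching ambient audio
             + 0.3 * float(same_network)       # same network / location data
             + 0.2 * float(calendar_overlap))  # social signal
    return score >= threshold

# Two devices hearing the same room audio on the same meeting invite:
print(colocated(audio_similarity=0.9, same_network=True, calendar_overlap=True))   # True
# A remote attendee on the same invite but with unrelated audio:
print(colocated(audio_similarity=0.1, same_network=False, calendar_overlap=True))  # False
```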
  • Patent number: 10462421
    Abstract: A projection unit has a rotating capture module and a rotating projection module. The capture module has at least one color camera, at least one microphone and at least one depth camera and is configured to capture images of an environment. The rotating projection module is configured to project images onto at least one surface in the environment. The projection unit has a processor configured to use data captured by the rotating capture module to select the at least one surface in the environment, the selection being dependent on a field of view of at least one user in the environment and dependent on characteristics of surfaces in the environment. The processor is configured to control rotation of the rotating capture module such that the data captured by the rotating capture module is suitable for computing the field of view of the user and for determining characteristics of the surfaces.
    Type: Grant
    Filed: July 20, 2015
    Date of Patent: October 29, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Edward Sean Lloyd Rintel, Matthew Alastair Johnson
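
The abstract above selects a projection surface based on the user's field of view and on surface characteristics. A minimal sketch of that selection as a scoring function over candidate surfaces; the particular attributes and weights are assumptions for illustration.

```python
# Illustrative sketch: score candidate surfaces by whether they are in the
# user's field of view and by simple surface characteristics.
from dataclasses import dataclass
from typing import List

@dataclass
class Surface:
    name: str
    in_user_fov: bool   # derived from the captured colour/depth data
    flatness: float     # 0..1, from depth data
    brightness: float   # 0..1, ambient light on the surface (lower is better)

def select_projection_surface(surfaces: List[Surface]) -> Surface:
    def score(s: Surface) -> float:
        fov_bonus = 1.0 if s.in_user_fov else 0.0
        return 2.0 * fov_bonus + s.flatness + (1.0 - s.brightness)
    return max(surfaces, key=score)

candidates = [
    Surface("wall_behind_user", in_user_fov=False, flatness=0.9, brightness=0.2),
    Surface("wall_facing_user", in_user_fov=True, flatness=0.8, brightness=0.3),
    Surface("cluttered_desk", in_user_fov=True, flatness=0.3, brightness=0.6),
]
print(select_projection_surface(candidates).name)  # wall_facing_user
```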
  • Publication number: 20190287306
    Abstract: A method of providing a geographically distributed live mixed-reality meeting is described. The method comprises receiving, from a camera at a first endpoint, a live video stream; generating a mixed reality view incorporating the received video stream; rendering the mixed reality view at a display at the first endpoint and transmitting the mixed reality view to at least one other geographically distant endpoint; receiving data defining a bounding area; calculating a real world anchor for the bounding area using the data defining the bounding area; rendering the bounding area in the mixed reality view at a real world position determined using the real world anchor; and applying different rule sets to content objects placed into the mixed reality view by users dependent upon the position of the content objects relative to the bounding area in real world space.
    Type: Application
    Filed: May 18, 2018
    Publication date: September 19, 2019
    Inventors: Anthony Arnold WIESER, Martin GRAYSON, Kenton Paul Anthony O'HARA, Edward Sean Lloyd RINTEL, Camilla Alice LONGDEN, Philipp STEINACHER, Dominic ROEDEL, Advait SARKAR, Shu Sam CHEN, Jens Emil Krarup GRONBAEK, Ding WANG
  • Publication number: 20180124128
    Abstract: The present disclosure provides a number of techniques for enhancing a user's experience when joining teleconference sessions. When multiple users join a teleconference session using separate devices, a system identifies devices that are co-located. The co-located devices are identified by the use of a combination of data including, but not limited to, social signals, audio signals, and other location data. At least one device of the co-located devices has a first control set for controlling a teleconference session, and other devices of the co-located devices have a second control set for sharing content. The at least one device also has one set of streams and the other devices see a subset of those streams or other streams based on an activity level. In addition, the present disclosure provides a number of techniques for enabling users to use multiple devices to share content.
    Type: Application
    Filed: April 5, 2017
    Publication date: May 3, 2018
    Inventors: Jason Thomas Faulkner, Kenton O'Hara, Ewin Davis Kannuthottiyil, Eric Randall Sexauer, Edward Sean Lloyd Rintel, Thaddeus Scott, Kevin Morrison, Robert Corish, Anthony Wieser
  • Publication number: 20180124136
    Abstract: The present disclosure provides a number of techniques for enhancing a user's experience when joining teleconference sessions with multiple devices. When a user attempts to join a teleconference session using the same user identity for multiple devices, a system differentiates the devices as a primary device and at least one companion device. The primary device has a first control set for controlling a teleconference session and the at least one companion device has a companion control set for sharing content. In some embodiments, the primary device also has one set of selected streams, e.g., a stage view, and the companion device has a subset of those streams or other streams based on an activity level. In addition, the present disclosure provides a number of techniques for enabling users to share content using the companion devices.
    Type: Application
    Filed: April 5, 2017
    Publication date: May 3, 2018
    Inventors: Jason Thomas Faulkner, Kenton O'Hara, Ewin Davis Kannuthottiyil, Edward Sean Lloyd Rintel, Kevin Morrison, Robert Corish, Anthony Wieser
  • Patent number: 9819902
    Abstract: A telecommunications device sends and receives messages comprising data about telecommunications resources and resource state of proximate devices. The telecommunications device has a processor configured to determine a proximate resource pool using at least the telecommunication resources of the other devices, and the state of the telecommunications resources of the other devices, the proximate resource pool comprising a list of content streams being generated by, or potentially being generated by, specified ones of the other devices. The processor is configured to receive instructions from one of the other devices, the instructions being to add, divert or stop a specified content stream of the proximate resource pool within a telecommunications activity ongoing at the device. In various examples, the processor is configured to execute the instructions responsive to the instructions being from a trusted device, or responsive to user authorization.
    Type: Grant
    Filed: March 19, 2015
    Date of Patent: November 14, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Edward Sean Lloyd Rintel, Richard Harry Robert Harper, Kenton Paul Anthony O'Hara, Philip Charles Gosset, Daniel Wilkinson Gratiot, Ian James Ray
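
The abstract above builds a proximate resource pool from state messages exchanged with nearby devices and only executes add/divert/stop instructions from trusted devices. A minimal sketch of that bookkeeping; the message fields, verbs, and trust model are assumptions.

```python
# Illustrative sketch of a proximate resource pool: record which content
# streams nearby devices can generate and act only on trusted instructions.
from typing import Dict, List, Set

class ProximateResourcePool:
    def __init__(self, trusted_devices: Set[str]) -> None:
        self.trusted = trusted_devices
        self.streams: Dict[str, List[str]] = {}  # device -> content streams
        self.active: List[str] = []              # streams in the current activity

    def receive_state(self, device_id: str, streams: List[str]) -> None:
        """Record which streams a proximate device generates or could generate."""
        self.streams[device_id] = streams

    def handle_instruction(self, from_device: str, verb: str, stream: str) -> bool:
        """Execute add/stop instructions only when sent by a trusted device."""
        if from_device not in self.trusted:
            return False
        if verb == "add" and stream not in self.active:
            self.active.append(stream)
        elif verb == "stop" and stream in self.active:
            self.active.remove(stream)
        return True

pool = ProximateResourcePool(trusted_devices={"my_tablet"})
pool.receive_state("my_tablet", ["rear_camera", "screen_share"])
pool.handle_instruction("my_tablet", "add", "rear_camera")
print(pool.active)  # ['rear_camera']
```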
  • Publication number: 20170199639
    Abstract: A computing device is described comprising a navigation component configured to receive navigation data from a presenter host device. The navigation data is about navigation of a plurality of content views as part of a presentation of the content views being controlled by the presenter host device. The navigation component is configured to enter an audience interactive mode when it receives data from the presenter host device indicating availability of the audience interactive mode. The navigation component is configured, when in the audience interactive mode, to send instructions to the presenter host device to control the navigation of the plurality of content views on the basis of user input received at the computing device.
    Type: Application
    Filed: January 7, 2016
    Publication date: July 13, 2017
    Inventors: Kenton O'Hara, Samuel Gavin Smyth, Edward Sean Lloyd Rintel, Debaleena Chattopadhyay
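
The abstract above describes an audience device that mirrors the presenter's navigation, enters an interactive mode when the presenter advertises it, and then sends navigation instructions back. A minimal sketch of that navigation component; the message shapes and mode names are assumptions.

```python
# Illustrative sketch of the audience device's navigation component.
from typing import List

class AudienceNavigation:
    def __init__(self, total_views: int) -> None:
        self.total_views = total_views
        self.current_view = 0
        self.interactive = False
        self.outgoing: List[dict] = []  # instructions sent to the presenter host

    def on_presenter_message(self, message: dict) -> None:
        if "current_view" in message:  # navigation data from the presenter
            self.current_view = message["current_view"]
        if message.get("interactive_mode_available"):
            self.interactive = True

    def request_view(self, view_index: int) -> None:
        """Audience input only controls navigation while interactive mode is on."""
        if self.interactive and 0 <= view_index < self.total_views:
            self.outgoing.append({"navigate_to": view_index})

audience = AudienceNavigation(total_views=20)
audience.on_presenter_message({"current_view": 3, "interactive_mode_available": True})
audience.request_view(7)
print(audience.outgoing)  # [{'navigate_to': 7}]
```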