Patents by Inventor Geoff Stahl

Geoff Stahl has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10042536
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Grant
    Filed: May 10, 2017
    Date of Patent: August 7, 2018
    Assignee: Apple Inc.
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl
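
The avatar abstract above (patent 10042536, repeated in several related filings below) describes a flow in which trigger events are mapped to user states and each user state selects a pre-built, customized avatar instance. Below is a minimal sketch of that flow under stated assumptions; the state names, triggers, and types are illustrative, not Apple's implementation.

```swift
// Hypothetical user states and the trigger events that can indicate them.
enum UserState: Hashable {
    case happy, sad, busy, lowBattery
}

enum TriggerEvent {
    case text(String)          // user-entered textual data
    case emoticon(String)      // e.g. ":)" or ":("
    case batteryLevel(Double)  // a device state, 0.0 ... 1.0
}

struct AvatarInstance {
    let expression: String
    let accessories: [String]
    let presentationScheme: String
}

struct AvatarEngine {
    // One pre-built, customized avatar instance per user state.
    var instances: [UserState: AvatarInstance]
    var current: AvatarInstance?

    // Infer a user state from a trigger event, if any.
    func detectState(from event: TriggerEvent) -> UserState? {
        switch event {
        case .emoticon(":)"): return .happy
        case .emoticon(":("): return .sad
        case .text(let t) where t.lowercased().contains("in a meeting"): return .busy
        case .batteryLevel(let level) where level < 0.1: return .lowBattery
        default: return nil
        }
    }

    // When a trigger event indicates a user state, update the presented avatar.
    mutating func handle(_ event: TriggerEvent) {
        if let state = detectState(from: event), let instance = instances[state] {
            current = instance
        }
    }
}

var engine = AvatarEngine(instances: [
    .happy: AvatarInstance(expression: "smile", accessories: ["party hat"], presentationScheme: "bright"),
    .busy:  AvatarInstance(expression: "focused", accessories: ["headset"], presentationScheme: "muted"),
], current: nil)

engine.handle(.emoticon(":)"))
print(engine.current?.expression ?? "default")   // "smile"
```
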
  • Publication number: 20180088776
    Abstract: The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user.
    Type: Application
    Filed: October 2, 2017
    Publication date: March 29, 2018
    Inventors: Ricardo Motta, Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
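
Publication 20180088776 above, together with patents 9778815 and 9411413 later in this listing, describes fusing inertial cues into a 3D frame of reference for the device. The sketch below shows one simplified way to derive such a frame from accelerometer and compass readings alone; the vector type and the fusion step are assumptions for illustration, not the patented method (a real implementation would also fuse gyroscope data and filter noise).

```swift
// Minimal 3D vector math for the sketch.
struct Vec3 {
    var x, y, z: Double
    static func cross(_ a: Vec3, _ b: Vec3) -> Vec3 {
        Vec3(x: a.y * b.z - a.z * b.y,
             y: a.z * b.x - a.x * b.z,
             z: a.x * b.y - a.y * b.x)
    }
    var normalized: Vec3 {
        let len = (x * x + y * y + z * z).squareRoot()
        return Vec3(x: x / len, y: y / len, z: z / len)
    }
}

// Three orthonormal axes of a world-aligned frame, expressed in the device's own
// coordinate system; together they capture the device's 3D orientation.
struct FrameOfReference {
    let xAxis: Vec3   // east
    let yAxis: Vec3   // north
    let zAxis: Vec3   // up
}

// Derive a frame from gravity (accelerometer) and magnetic north (compass) readings.
func frameOfReference(gravity: Vec3, magneticNorth: Vec3) -> FrameOfReference {
    let down = gravity.normalized
    // Cross products project the (generally tilted) magnetic vector into the
    // horizontal plane, yielding an orthonormal east/north/up triad.
    let east = Vec3.cross(down, magneticNorth).normalized
    let north = Vec3.cross(east, down).normalized
    let up = Vec3(x: -down.x, y: -down.y, z: -down.z)
    return FrameOfReference(xAxis: east, yAxis: north, zAxis: up)
}

// Device lying flat, screen up: "up" coincides with the display's outward normal.
let frame = frameOfReference(gravity: Vec3(x: 0, y: 0, z: -9.81),
                             magneticNorth: Vec3(x: 0, y: 1, z: 0))
print(frame.zAxis)   // approximately (0, 0, 1)
```
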
  • Publication number: 20170357417
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Application
    Filed: May 10, 2017
    Publication date: December 14, 2017
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl
  • Patent number: 9800705
    Abstract: A user interface on a device allows a user to set their remote user status for viewing by other individuals on their devices. The user or an application can select from a number of predefined remote user status indicators representing remote user status, and the user can optionally include a text message to be displayed with the remote user status indicator. The selected remote user status indicator and optional text message can be stored on a network and made available to other devices that have a contact database that includes the user as a contact. In some implementations, the remote user status indicator can be displayed proximate the user's name in a user interface, such as a favorites list, e-mail interface, text messaging interface, chat room, or any other user interface associated with an application.
    Type: Grant
    Filed: June 2, 2010
    Date of Patent: October 24, 2017
    Assignee: Apple Inc.
    Inventors: Geoff Stahl, Michael Dale Lampell, Laurent Baumann, Thomas Goossens
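
Patent 9800705 above describes a user picking a predefined status indicator, optionally attaching a text message, publishing the pair to a network store, and other devices displaying it next to the user's name. The sketch below models that data flow with entirely hypothetical names and an in-memory stand-in for the network service; it is not Apple's implementation.

```swift
// A few predefined remote user status indicators.
enum RemoteUserStatus: String, Codable {
    case available = "🟢"
    case busy      = "🔴"
    case away      = "🌙"
    case gaming    = "🎮"
}

struct StatusUpdate: Codable {
    let userID: String
    let indicator: RemoteUserStatus
    let message: String?          // optional text shown with the indicator
}

// Stand-in for the network-side store that other devices query.
final class StatusService {
    private var store: [String: StatusUpdate] = [:]
    func publish(_ update: StatusUpdate) { store[update.userID] = update }
    func status(for userID: String) -> StatusUpdate? { store[userID] }
}

// How a receiving device might render the status proximate the contact's name.
func displayLine(contactName: String, userID: String, service: StatusService) -> String {
    guard let update = service.status(for: userID) else { return contactName }
    let suffix = update.message.map { " (\($0))" } ?? ""
    return "\(update.indicator.rawValue) \(contactName)\(suffix)"
}

let service = StatusService()
service.publish(StatusUpdate(userID: "geoff", indicator: .gaming, message: "Back at 6"))
print(displayLine(contactName: "Geoff", userID: "geoff", service: service))
// 🎮 Geoff (Back at 6)
```
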
  • Patent number: 9778815
    Abstract: The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user.
    Type: Grant
    Filed: July 13, 2016
    Date of Patent: October 3, 2017
    Assignee: Apple Inc.
    Inventors: Ricardo Motta, Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Patent number: 9652134
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Grant
    Filed: January 24, 2014
    Date of Patent: May 16, 2017
    Assignee: Apple Inc.
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl
  • Publication number: 20170115846
    Abstract: The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user.
    Type: Application
    Filed: July 13, 2016
    Publication date: April 27, 2017
    Inventors: Ricardo Motta, Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Patent number: 9417763
    Abstract: The techniques disclosed herein use a compass, MEMS accelerometer, GPS module, and MEMS gyrometer to infer a frame of reference for a hand-held device. This can provide a true Frenet frame, i.e., X- and Y-vectors for the display, and also a Z-vector that points perpendicularly to the display. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame-of-reference. Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame-of-reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user.
    Type: Grant
    Filed: December 15, 2014
    Date of Patent: August 16, 2016
    Assignee: Apple Inc.
    Inventors: Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
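
Patent 9417763 above (and the related 8913056, 20150106768, and 20120036433 entries) combines the continuous frame of reference with the eye position estimated by the front-facing camera to render more realistic virtual 3D depictions. The sketch below illustrates only the final rendering idea, a similar-triangles parallax shift for objects drawn "behind" the screen; the geometry and names are simplifying assumptions, not the patented algorithm.

```swift
struct Point2D { var x, y: Double }          // position on the display plane, in cm
struct EyePosition { var x, y, z: Double }   // offset from display center; z = viewing distance, in cm

// Shift an object drawn at `base` according to its virtual depth behind the screen.
// Similar-triangles parallax: the deeper the object, the more its on-screen position
// follows the eye's lateral offset, as if it were seen through a window.
func apparentPosition(of base: Point2D, depthBehindScreen d: Double,
                      eye: EyePosition) -> Point2D {
    let factor = d / (eye.z + d)   // 0 for on-screen objects, approaches 1 for very deep ones
    return Point2D(x: base.x + eye.x * factor,
                   y: base.y + eye.y * factor)
}

// Eye 10 cm to the right of screen center, 40 cm away; object 5 cm behind the screen.
let shifted = apparentPosition(of: Point2D(x: 0, y: 0),
                               depthBehindScreen: 5,
                               eye: EyePosition(x: 10, y: 0, z: 40))
print(shifted)  // x ≈ 1.11: the object stays anchored in virtual space as the viewer moves right
```
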
  • Patent number: 9411413
    Abstract: The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user.
    Type: Grant
    Filed: July 11, 2014
    Date of Patent: August 9, 2016
    Assignee: Apple Inc.
    Inventors: Ricardo Motta, Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Publication number: 20160092945
    Abstract: An apparatus, method, and computer-readable medium related to monitoring computer users to acquire information regarding use of application programs and device features as well as the context of such use. Computer users are monitored and data is collected to indicate the computer users' activities, including the use of any particular application program. A profile of each computer user may be created, where the profile is an aggregate of the collected information or a portion thereof. The profiles may be correlated to determine relationships between user behaviors. Various analytics regarding the relationship information may be employed to improve customer-oriented information such as ratings, recommendations, customer support, marketing, communications, and product feature design.
    Type: Application
    Filed: September 29, 2015
    Publication date: March 31, 2016
    Inventors: Geoff Stahl, Jacques P. Gasselin de Richebourg, Nate Begeman
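
Publication 20160092945 above describes collecting usage events, aggregating them into per-user profiles, and correlating profiles to relate user behaviors. The sketch below shows one way such aggregation and correlation could look; the event fields, the profile shape, and the use of cosine similarity are all assumptions for illustration.

```swift
// One monitored usage event: who used which app, for how long, and in what context.
struct UsageEvent {
    let userID: String
    let appID: String
    let secondsUsed: Double
    let context: String          // e.g. "commuting", "at home"
}

// A profile is an aggregate of the collected information: total seconds per app.
typealias Profile = [String: Double]   // appID -> total seconds

func buildProfiles(from events: [UsageEvent]) -> [String: Profile] {
    var profiles: [String: Profile] = [:]
    for e in events {
        profiles[e.userID, default: [:]][e.appID, default: 0] += e.secondsUsed
    }
    return profiles
}

// Cosine similarity between two profiles: a simple way to "correlate" behaviors
// when deciding, say, which recommendations to surface to whom.
func similarity(_ a: Profile, _ b: Profile) -> Double {
    let apps = Set(a.keys).union(b.keys)
    var dot = 0.0, normA = 0.0, normB = 0.0
    for app in apps {
        let x = a[app] ?? 0, y = b[app] ?? 0
        dot += x * y; normA += x * x; normB += y * y
    }
    guard normA > 0, normB > 0 else { return 0 }
    return dot / (normA.squareRoot() * normB.squareRoot())
}

let events = [
    UsageEvent(userID: "u1", appID: "Maps", secondsUsed: 900, context: "commuting"),
    UsageEvent(userID: "u1", appID: "Music", secondsUsed: 1200, context: "commuting"),
    UsageEvent(userID: "u2", appID: "Maps", secondsUsed: 600, context: "commuting"),
    UsageEvent(userID: "u2", appID: "Podcasts", secondsUsed: 1500, context: "at home"),
]
let profiles = buildProfiles(from: events)
print(similarity(profiles["u1"]!, profiles["u2"]!))   // partial overlap via Maps
```
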
  • Publication number: 20150106768
    Abstract: The techniques disclosed herein use a compass, MEMS accelerometer, GPS module, and MEMS gyrometer to infer a frame of reference for a hand-held device. This can provide a true Frenet frame, i.e., X- and Y-vectors for the display, and also a Z-vector that points perpendicularly to the display. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame-of-reference. Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame-of-reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user.
    Type: Application
    Filed: December 15, 2014
    Publication date: April 16, 2015
    Inventors: Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Publication number: 20150009130
    Abstract: The techniques disclosed herein may use various sensors to infer a frame of reference for a hand-held device. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track a Frenet frame of the device in real time to provide an instantaneous (or continuous) 3D frame-of-reference. In addition to—or in place of—calculating this instantaneous (or continuous) frame of reference, the position of a user's head may either be inferred or calculated directly by using one or more of a device's optical sensors, e.g., an optical camera, infrared camera, laser, etc. With knowledge of the 3D frame-of-reference for the display and/or knowledge of the position of the user's head, more realistic virtual 3D depictions of the graphical objects on the device's display may be created—and interacted with—by the user.
    Type: Application
    Filed: July 11, 2014
    Publication date: January 8, 2015
    Inventors: Ricardo Motta, Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Patent number: 8913056
    Abstract: The techniques disclosed herein use a compass, MEMS accelerometer, GPS module, and MEMS gyrometer to infer a frame of reference for a hand-held device. This can provide a true Frenet frame, i.e., X- and Y-vectors for the display, and also a Z-vector that points perpendicularly to the display. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame-of-reference. Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame-of-reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user.
    Type: Grant
    Filed: August 4, 2010
    Date of Patent: December 16, 2014
    Assignee: Apple Inc.
    Inventors: Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Patent number: 8886891
    Abstract: Accessing a shared buffer can include receiving an identifier associated with a buffer from a sending process, requesting one or more attributes corresponding to the buffer based on the received identifier, mapping at least a first page of the buffer in accordance with the one or more requested attributes, and accessing an item of data stored in the buffer by the sending process. The identifier also can comprise a unique identifier. Further, the identifier can be passed to one or more other processes. Additionally, the one or more requested attributes can include at least one of a pointer to a memory location and a property describing the buffer.
    Type: Grant
    Filed: March 26, 2008
    Date of Patent: November 11, 2014
    Assignee: Apple Inc.
    Inventors: Kenneth Christian Dyke, Jeremy Todd Sandmel, Geoff Stahl, John Kenneth Stauffer
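
Patent 8886891 above describes one process handing another a lightweight identifier for a shared buffer instead of copying the data: the receiver resolves the identifier to attributes, maps one or more pages, and reads what the sender stored. The sketch below is a toy, in-process simulation of that handshake (a real system would use shared memory or kernel IPC); every name here is hypothetical.

```swift
import Foundation   // for UUID

struct BufferAttributes {
    let byteCount: Int
    let pageSize: Int
}

// Stand-in for the kernel/service that resolves identifiers to buffers.
final class BufferRegistry {
    private var buffers: [UUID: (attrs: BufferAttributes, pages: [[UInt8]])] = [:]

    // Sending process: register a buffer and get back a unique identifier to pass around.
    func register(data: [UInt8], pageSize: Int = 4096) -> UUID {
        let pages = stride(from: 0, to: data.count, by: pageSize).map {
            Array(data[$0..<min($0 + pageSize, data.count)])
        }
        let id = UUID()
        buffers[id] = (BufferAttributes(byteCount: data.count, pageSize: pageSize), pages)
        return id
    }

    // Receiving process: request the attributes corresponding to the identifier...
    func attributes(for id: UUID) -> BufferAttributes? { buffers[id]?.attrs }

    // ...then "map" a page of the buffer and access the data the sender stored.
    func mapPage(_ index: Int, of id: UUID) -> [UInt8]? {
        guard let entry = buffers[id], entry.pages.indices.contains(index) else { return nil }
        return entry.pages[index]
    }
}

let registry = BufferRegistry()
let id = registry.register(data: Array("hello from the sending process".utf8), pageSize: 8)
if let attrs = registry.attributes(for: id), let firstPage = registry.mapPage(0, of: id) {
    print(attrs.byteCount, String(decoding: firstPage, as: UTF8.self))   // 30 "hello fr"
}
```
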
  • Publication number: 20140143693
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Application
    Filed: January 24, 2014
    Publication date: May 22, 2014
    Applicant: Apple Inc.
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl
  • Patent number: 8694899
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Grant
    Filed: June 1, 2010
    Date of Patent: April 8, 2014
    Assignee: Apple Inc.
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl
  • Patent number: 8300056
    Abstract: Exemplary embodiments of methods, apparatuses, and systems for seamlessly migrating a user visible display stream sent to a display device from one rendered display stream to another rendered display stream are described. For one embodiment, mirror video display streams are received from both a first graphics processing unit (GPU) and a second GPU, and the video display stream sent to a display device is switched from the video display stream from the first GPU to the video display stream from the second GPU, wherein the switching occurs during a blanking interval for the first GPU that overlaps with a blanking interval for the second GPU.
    Type: Grant
    Filed: October 13, 2008
    Date of Patent: October 30, 2012
    Assignee: Apple Inc.
    Inventors: Mike Nugent, Thomas Costa, Eve Brasfield, David Redman, Amanda Rainer, Tim Millet, Geoff Stahl, Adrian Sheppard, Ian Hendry, Ingrid Aligaen, Kenneth C. Dyke, Chris Niederauer, Michael Culbert
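
Patent 8300056 above describes switching the display from one GPU's mirrored stream to the other's only while the two GPUs' blanking intervals overlap, so the user never sees a partial frame. The sketch below simulates just that timing condition; the timing model, step size, and names are assumptions for illustration, not Apple's implementation, and it assumes the two blanking intervals do eventually overlap.

```swift
struct GPUTiming {
    let frameDuration: Double     // seconds per frame
    let blankingDuration: Double  // trailing portion of each frame with no pixels scanned out
    let phase: Double             // offset of this GPU's frame clock, in seconds

    // Is this GPU inside its blanking interval at time t?
    func isBlanking(at t: Double) -> Bool {
        let pos = (t - phase).truncatingRemainder(dividingBy: frameDuration)
        let inFrame = pos < 0 ? pos + frameDuration : pos
        return inFrame >= frameDuration - blankingDuration
    }
}

enum DisplaySource { case integratedGPU, discreteGPU }

// Step time forward until both blanking intervals overlap, then switch the source.
func migrate(from timingA: GPUTiming, to timingB: GPUTiming,
             startingAt t0: Double, step: Double = 0.0001) -> (switchTime: Double, source: DisplaySource) {
    var t = t0
    while !(timingA.isBlanking(at: t) && timingB.isBlanking(at: t)) {
        t += step
    }
    return (t, .discreteGPU)   // safe to flip the display mux here
}

// Two 60 Hz GPUs with ~1 ms blanking, slightly out of phase.
let integrated = GPUTiming(frameDuration: 1.0 / 60, blankingDuration: 0.001, phase: 0)
let discrete   = GPUTiming(frameDuration: 1.0 / 60, blankingDuration: 0.001, phase: 0.0004)
let result = migrate(from: integrated, to: discrete, startingAt: 0)
print(result.switchTime)   // first instant both GPUs are blanking simultaneously
```
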
  • Publication number: 20120036433
    Abstract: The techniques disclosed herein use a compass, MEMS accelerometer, GPS module, and MEMS gyrometer to infer a frame of reference for a hand-held device. This can provide a true Frenet frame, i.e., X- and Y-vectors for the display, and also a Z-vector that points perpendicularly to the display. In fact, with various inertial clues from accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame-of-reference. Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame-of-reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user.
    Type: Application
    Filed: August 4, 2010
    Publication date: February 9, 2012
    Applicant: Apple Inc.
    Inventors: Mark Zimmer, Geoff Stahl, David Hayward, Frank Doepke
  • Publication number: 20110298618
    Abstract: A user interface on a device allows a user to set their remote user status for viewing by other individuals on their devices. The user or an application can select from a number of predefined remote user status indicators representing remote user status, and the user can optionally include a text message to be displayed with the remote user status indicator. The selected remote user status indicator and optional text message can be stored on a network and made available to other devices that have a contact database that includes the user as a contact. In some implementations, the remote user status indicator can be displayed proximate the user's name in a user interface, such as a favorites list, e-mail interface, text messaging interface, chat room, or any other user interface associated with an application.
    Type: Application
    Filed: June 2, 2010
    Publication date: December 8, 2011
    Applicant: Apple Inc.
    Inventors: Geoff Stahl, Michael Dale Lampell, Laurent Baumann, Thomas Goossens
  • Publication number: 20110296324
    Abstract: Methods, systems, and computer-readable media for creating and using customized avatar instances to reflect current user states are disclosed. In various implementations, the user states can be defined using trigger events based on user-entered textual data, emoticons, or states of the device being used. For each user state, a customized avatar instance having a facial expression, body language, accessories, clothing items, and/or a presentation scheme reflective of the user state can be generated. When one or more trigger events indicating occurrence of a particular user state are detected on the device, the avatar presented on the device is updated with the customized avatar instance associated with the particular user state.
    Type: Application
    Filed: June 1, 2010
    Publication date: December 1, 2011
    Inventors: Thomas Goossens, Laurent Baumann, Geoff Stahl