Patents by Inventor Nathan L. Fillhardt

Nathan L. Fillhardt has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11922588
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device. (A brief Swift sketch of this host/client flow follows this entry.)
    Type: Grant
    Filed: July 1, 2022
    Date of Patent: March 5, 2024
    Assignee: Apple Inc.
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
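
The entry above describes a host device that shares only an origin (sent once) and a stream of virtual camera positions, rather than streaming rendered map imagery. Below is a minimal, self-contained Swift sketch of that flow; the types (MapOrigin, VirtualCameraPose, HostSession, ClientSession) and the in-process callback standing in for the network are illustrative assumptions, not APIs from the patent or from any Apple framework.

```swift
import Foundation

// A geographic anchor the host sends once, tying the shared 3D map to a real-world origin.
struct MapOrigin: Codable {
    let latitude: Double
    let longitude: Double
    let altitude: Double
}

// A lightweight camera pose sent repeatedly; far smaller than streaming rendered imagery.
struct VirtualCameraPose: Codable {
    let x: Double, y: Double, z: Double   // position relative to the origin (meters)
    let pitch: Double, yaw: Double        // orientation (radians)
}

// Host side: derive poses from the device's physical position (stubbed here) and emit them.
final class HostSession {
    let origin: MapOrigin
    var onPose: ((VirtualCameraPose) -> Void)?

    init(origin: MapOrigin) { self.origin = origin }

    // In a real system this would be driven by motion and position sensors.
    func publish(devicePose pose: VirtualCameraPose) {
        onPose?(pose)
    }
}

// Client side: it already downloaded the 3D map tiles from a server, so it only needs the
// origin (once) and the stream of camera poses to render the same view locally.
final class ClientSession {
    private let origin: MapOrigin

    init(origin: MapOrigin) { self.origin = origin }

    func render(using pose: VirtualCameraPose) {
        // Placeholder for looking up locally cached map tiles and rendering the scene.
        print("Render map near (\(origin.latitude), \(origin.longitude)) " +
              "with camera at (\(pose.x), \(pose.y), \(pose.z))")
    }
}

// Wire the two together: the origin is shared once, poses are streamed.
let origin = MapOrigin(latitude: 37.3349, longitude: -122.0090, altitude: 30)
let host = HostSession(origin: origin)
let client = ClientSession(origin: host.origin)   // stands in for a one-time network transfer
host.onPose = { client.render(using: $0) }

host.publish(devicePose: VirtualCameraPose(x: 0, y: 1.5, z: 0, pitch: -0.3, yaw: 0))
host.publish(devicePose: VirtualCameraPose(x: 2, y: 1.5, z: -1, pitch: -0.3, yaw: 0.4))
```
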
  • Publication number: 20230025576
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time. (A brief Swift sketch of this attribute-to-style mapping follows this entry.)
    Type: Application
    Filed: October 5, 2022
    Publication date: January 26, 2023
    Applicant: Apple Inc.
    Inventors: Justin M. Strawn, Adam B. Findley, Nathan L. Fillhardt, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada
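
As a rough illustration of the abstract above, the sketch below maps a set of navigation-context attributes to a virtual camera operating style that fixes the camera's positional parameters. The attribute names, style names, and numeric values are invented for illustration; the patent does not specify them.

```swift
import Foundation

// A simplified set of attributes describing the current navigation context.
struct NavigationContext {
    enum Road { case highway, cityStreet }
    enum Maneuver { case straightAhead, approachingTurn }
    let road: Road
    let maneuver: Maneuver
    let speedMetersPerSecond: Double
}

// A style that specifies the virtual camera's positional attributes.
struct CameraStyle {
    let name: String
    let heightMeters: Double        // altitude of the virtual camera above the route
    let pitchDegrees: Double        // downward tilt
    let leadDistanceMeters: Double  // how far ahead of the device the camera looks
}

// Map the context's attributes to an operating style; thresholds are illustrative only.
func style(for context: NavigationContext) -> CameraStyle {
    if context.maneuver == .approachingTurn {
        return CameraStyle(name: "maneuver-closeup", heightMeters: 120, pitchDegrees: 40, leadDistanceMeters: 80)
    }
    if context.road == .highway && context.speedMetersPerSecond > 20 {
        return CameraStyle(name: "highway-overview", heightMeters: 500, pitchDegrees: 55, leadDistanceMeters: 400)
    }
    return CameraStyle(name: "street-follow", heightMeters: 200, pitchDegrees: 48, leadDistanceMeters: 150)
}

// When the detected context changes, the selected style (and thus the rendered scene) changes.
let cruising = NavigationContext(road: .highway, maneuver: .straightAhead, speedMetersPerSecond: 30)
let turning = NavigationContext(road: .cityStreet, maneuver: .approachingTurn, speedMetersPerSecond: 8)
print(style(for: cruising).name)  // highway-overview
print(style(for: turning).name)   // maneuver-closeup
```
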
  • Publication number: 20220390242
    Abstract: In some implementations, a computing device can present alight notifications to a user when the user (e.g., the user device) is approaching a public transit stop where the user should depart the public transit vehicle the user is currently occupying. For example, a user may use a user device to generate a public transit route that includes a starting location, a sequence of public transit stops, and a destination location. While traversing the transit route, the user may need to exit the public transit vehicle (e.g., to reach the destination location, to switch transit lines, etc.). The user device can identify transit stops along the route where the user should exit a transit vehicle and present notifications prompting the user to exit the transit vehicle at the upcoming stop as the user device approaches the identified transit stops. (A brief Swift sketch of this proximity check follows this entry.)
    Type: Application
    Filed: May 31, 2022
    Publication date: December 8, 2022
    Applicant: Apple Inc.
    Inventors: Brian J. Andrich, Sara E. Marcus, Thomas Pajor, Nick Provenzano, Tim A. Aerts, Alex Benevento, Leo Emile Jean Pierre Vallet, Samantha J. Bennett, Sara A. Bergquist, Nathan L. Fillhardt, Yun Jae Kim, Scott G. Jackson, Jan F. Sanchez Dudus, Ishan Bhutani, Ray R. Chen, Christopher Y. Tremblay
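
A minimal sketch of the alight-notification idea described above: compare the device's position against the stops on the route where the user should exit, and produce a prompt when one is within a threshold distance. The types, the 400 m threshold, and the haversine helper are assumptions for illustration, not the patent's implementation.

```swift
import Foundation

struct Coordinate { let latitude: Double; let longitude: Double }

struct TransitStop {
    let name: String
    let location: Coordinate
    let userShouldExitHere: Bool   // e.g. a transfer point or the stop nearest the destination
}

// Rough great-circle distance in meters (sufficient for a threshold check).
func distanceMeters(_ a: Coordinate, _ b: Coordinate) -> Double {
    let r = 6_371_000.0
    let dLat = (b.latitude - a.latitude) * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let lat1 = a.latitude * .pi / 180, lat2 = b.latitude * .pi / 180
    let h = sin(dLat / 2) * sin(dLat / 2) + cos(lat1) * cos(lat2) * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * asin(sqrt(h))
}

// Check the device position against upcoming exit stops and emit a prompt when close enough.
func checkForAlightNotification(device: Coordinate,
                                route: [TransitStop],
                                thresholdMeters: Double = 400) -> String? {
    for stop in route where stop.userShouldExitHere {
        if distanceMeters(device, stop.location) <= thresholdMeters {
            return "Approaching \(stop.name): get ready to exit here."
        }
    }
    return nil
}

let route = [
    TransitStop(name: "Elm St", location: Coordinate(latitude: 37.784, longitude: -122.407), userShouldExitHere: false),
    TransitStop(name: "Civic Center", location: Coordinate(latitude: 37.779, longitude: -122.414), userShouldExitHere: true),
]
let device = Coordinate(latitude: 37.780, longitude: -122.412)
if let message = checkForAlightNotification(device: device, route: route) {
    print(message)   // would be surfaced as a local notification on the device
}
```
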
  • Patent number: 11486724
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: November 1, 2022
    Assignee: Apple Inc.
    Inventors: Justin M. Strawn, Adam B. Findley, Nathan L. Fillhardt, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada
  • Publication number: 20220335699
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device.
    Type: Application
    Filed: July 1, 2022
    Publication date: October 20, 2022
    Applicant: Apple Inc.
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
  • Patent number: 11393174
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device.
    Type: Grant
    Filed: September 4, 2020
    Date of Patent: July 19, 2022
    Assignee: Apple Inc.
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
  • Publication number: 20200402319
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device.
    Type: Application
    Filed: September 4, 2020
    Publication date: December 24, 2020
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
  • Patent number: 10816353
    Abstract: The disclosed methods generate a navigation presentation for a device navigating a route. Based on changes in the navigation context, a virtual camera modifies how it captures map portions to produce scenes in the presentation. The method identifies attributes describing the navigation context at different times during the presentation and uses these attributes to identify styles for operating the virtual camera. The method uses a style to specify the camera's positional attributes, which define the map portions the virtual camera renders as navigation scenes until the context changes (or until the presentation ends, if it never does). When the navigation context changes, the identified attributes may change, which in turn may cause a new virtual camera style to be selected. When the style changes, the method modifies the way the virtual camera captures the portion of the map to render. (A brief Swift sketch of this change-driven restyling follows this entry.)
    Type: Grant
    Filed: February 19, 2019
    Date of Patent: October 27, 2020
    Assignee: Apple Inc.
    Inventors: Justin M. Strawn, Nathan L. Fillhardt, Christopher F. Marrin, Thomas Davie
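
The abstract above emphasizes that the camera is only restyled when the identified attribute set changes. Below is a small hypothetical Swift controller illustrating that change-driven behavior; the attribute and style names are placeholders, not values from the patent.

```swift
import Foundation

// Re-styles the virtual camera only when the identified attribute set changes.
final class VirtualCameraController {
    private var currentAttributes: Set<String> = []
    private(set) var currentStyle = "default"

    // `attributes` is whatever set currently describes the navigation context
    // (e.g. ["highway", "high-speed"]); the names here are placeholders.
    func update(attributes: Set<String>) {
        guard attributes != currentAttributes else { return }  // context unchanged: keep the style
        currentAttributes = attributes
        currentStyle = selectStyle(for: attributes)
        applyStyle(currentStyle)
    }

    private func selectStyle(for attributes: Set<String>) -> String {
        if attributes.contains("approaching-maneuver") { return "maneuver-closeup" }
        if attributes.contains("highway") { return "highway-overview" }
        return "street-follow"
    }

    private func applyStyle(_ style: String) {
        // In a real renderer this would move the virtual camera and re-frame the map.
        print("Camera restyled to \(style)")
    }
}

let controller = VirtualCameraController()
controller.update(attributes: ["highway"])                       // Camera restyled to highway-overview
controller.update(attributes: ["highway"])                       // no change, no restyle
controller.update(attributes: ["city", "approaching-maneuver"])  // Camera restyled to maneuver-closeup
```
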
  • Publication number: 20200333160
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Application
    Filed: June 30, 2020
    Publication date: October 22, 2020
    Applicant: Apple Inc.
    Inventors: Justin M. Strawn, Adam B. Findley, Nathan L. Fillhardt, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada
  • Patent number: 10777007
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device.
    Type: Grant
    Filed: January 10, 2018
    Date of Patent: September 15, 2020
    Assignee: Apple Inc.
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
  • Patent number: 10739157
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Grant
    Filed: September 23, 2016
    Date of Patent: August 11, 2020
    Assignee: Apple Inc.
    Inventors: Adam B. Findley, Nathan L. Fillhardt, Justin M. Strawn, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada
  • Patent number: 10643373
    Abstract: Various embodiments of the disclosure pertain to an augmented or virtual reality interface for interacting with maps displayed from a virtual camera perspective on a mobile device. Instead of manipulating the position of the virtual camera using a touchscreen interface, some embodiments allow a spatial location of the mobile device to control the position of the virtual camera. For example, a user can tilt the mobile device to obtain different angles of the virtual camera. As another example, the user can move the mobile device vertically to change the height of the virtual camera, e.g., a higher altitude above the ground. (A brief Swift sketch of this device-to-camera mapping follows this entry.)
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: May 5, 2020
    Assignee: Apple Inc.
    Inventors: Nathan L. Fillhardt, Adrian P. Lindberg, Vincent P. Arroyo, Justin M. Strawn
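
A minimal Swift sketch of the interaction described above, assuming a hypothetical DevicePose fed by motion sensors: tilting the device sets the virtual camera's pitch, and lifting the device raises the camera's altitude over the map. The scaling constants are illustrative only, not taken from the patent.

```swift
import Foundation

// The device's physical pose, e.g. derived from motion and position sensors.
struct DevicePose {
    let pitchRadians: Double            // tilt of the device
    let heightAboveStartMeters: Double  // vertical displacement since the gesture began
}

// The map's virtual camera parameters derived from that pose.
struct MapCamera {
    var pitchDegrees: Double   // 0 = looking at the horizon, 90 = straight down
    var altitudeMeters: Double
}

func camera(for pose: DevicePose,
            baseAltitude: Double = 300,
            metersOfAltitudePerMeterOfLift: Double = 200) -> MapCamera {
    // Clamp tilt to a usable range and convert to degrees.
    let pitch = min(max(pose.pitchRadians, 0), .pi / 2) * 180 / .pi
    // Small physical lifts translate into large altitude changes over the map.
    let altitude = max(50, baseAltitude + pose.heightAboveStartMeters * metersOfAltitudePerMeterOfLift)
    return MapCamera(pitchDegrees: pitch, altitudeMeters: altitude)
}

let tiltedAndRaised = DevicePose(pitchRadians: 0.6, heightAboveStartMeters: 0.5)
let cam = camera(for: tiltedAndRaised)
print("pitch: \(cam.pitchDegrees)°, altitude: \(cam.altitudeMeters) m")  // ~34.4°, 400 m
```
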
  • Publication number: 20190178673
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Application
    Filed: February 19, 2019
    Publication date: June 13, 2019
    Applicant: Apple Inc.
    Inventors: Justin M. Strawn, Nathan L. Fillhardt, Christopher F. Marrin, Thomas Davie
  • Patent number: 10302446
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Grant
    Filed: September 23, 2016
    Date of Patent: May 28, 2019
    Assignee: Apple Inc.
    Inventors: Justin M. Strawn, Nathan L. Fillhardt, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada, Aaron A. Reiner
  • Publication number: 20190102943
    Abstract: To reduce the amount of bandwidth needed to share 3D map images between mobile devices, according to some embodiments, a user's mobile device (i.e., a host device) may identify its origin in a 3D map and a current virtual camera position relative to the origin based on the physical position of the mobile device. The mobile device may send both the origin and the virtual camera position to another mobile device (i.e., a client device) for use in rendering a corresponding image. Separately, the client device may download the 3D map images from a server, e.g., in preparation for a meeting. In this manner, the host device may send the origin to the client device once, as well as send a data stream of the current virtual camera position for use in accessing the corresponding 3D map images at the client device.
    Type: Application
    Filed: January 10, 2018
    Publication date: April 4, 2019
    Inventors: Nathan L. Fillhardt, Syed Mohsin Hasan, Adrian P. Lindberg
  • Patent number: 10247568
    Abstract: Some embodiments of the invention provide methods for generating navigation presentations that display a device navigating a route on a map. The method uses a virtual camera that, based on detected changes to the navigation context, dynamically modifies how it captures map portions to produce navigation scenes in the presentation. The method of some embodiments identifies attribute sets describing different navigation contexts and uses these attribute sets to identify virtual camera operating styles. The method uses an identified style to specify the virtual camera's positional attributes, which define the map portions identified for rendering to produce navigation scenes over time. During the navigation presentation, each time the navigation context changes, the identified attribute set may change. This, in turn, may cause the method to select a new virtual camera operating style. When the style changes, the method modifies the way the virtual camera captures the map portion to render.
    Type: Grant
    Filed: September 23, 2016
    Date of Patent: April 2, 2019
    Assignee: Apple Inc.
    Inventors: Nathan L. Fillhardt, Thomas Davie, Justin M. Strawn, Christopher F. Marrin
  • Publication number: 20180365883
    Abstract: Various embodiments of the disclosure pertain to an augmented or virtual reality interface for interacting with maps displayed from a virtual camera perspective on a mobile device. Instead of manipulating the position of the virtual camera using a touchscreen interface, some embodiments allow a spatial location of the mobile device to control the position of the virtual camera. For example, a user can tilt the mobile device to obtain different angles of the virtual camera. As another example, the user can move the mobile device vertically to change the height of the virtual camera, e.g., a higher altitude above the ground.
    Type: Application
    Filed: May 16, 2018
    Publication date: December 20, 2018
    Inventors: Nathan L. Fillhardt, Adrian P. Lindberg, Vincent P. Arroyo, Justin M. Strawn
  • Publication number: 20170356755
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Application
    Filed: September 23, 2016
    Publication date: December 14, 2017
    Inventors: Justin M. Strawn, Nathan L. Fillhardt, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada, Aaron A. Reiner
  • Publication number: 20170356753
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Application
    Filed: September 23, 2016
    Publication date: December 14, 2017
    Inventors: Adam B. Findley, Nathan L. Fillhardt, Justin M. Strawn, Brian J. Andrich, Usama M. Hajj, Tadayasu Sasada
  • Publication number: 20170356754
    Abstract: Some embodiments of the invention provide several novel methods for generating a navigation presentation that displays a device navigating a route on a map. The method of some embodiments uses a virtual camera that, based on detected changes in the navigation context, dynamically modifies the way it captures portions of the map to produce different navigation scenes in the navigation presentation. To generate the navigation scenes, the method of some embodiments (1) identifies different sets of attributes that describe the different navigation contexts at different times during the navigation presentation, and (2) uses these different sets of attributes to identify different styles for operating the virtual camera. In some embodiments, the method uses an identified style to specify the virtual camera's positional attributes, which, in turn, define the portions of the map that the virtual camera identifies for rendering to produce several navigation scenes for a period of time.
    Type: Application
    Filed: September 23, 2016
    Publication date: December 14, 2017
    Inventors: Nathan L. Fillhardt, Thomas Davie, Justin M. Strawn, Christopher F. Marrin