Patents by Inventor Patrick S. Piemonte

Patrick S. Piemonte has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10109255
    Abstract: Methods, systems and apparatus are described to dynamically generate map textures. A client device may obtain map data, which may include one or more shapes described by vector graphics data. Along with the one or more shapes, embodiments may include texture indicators linked to the one or more shapes. Embodiments may render the map data. For one or more shapes, a texture definition may be obtained. Based on the texture definition, a client device may dynamically generate a texture for the shape. The texture may then be applied to the shape to render a current fill portion of the shape. In some embodiments, the rendered map view is displayed.
    Type: Grant
    Filed: February 28, 2013
    Date of Patent: October 23, 2018
    Assignee: Apple Inc.
    Inventors: Marcel Van Os, Patrick S. Piemonte, Billy P. Chen, Christopher Blumenberg
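The texture-generation flow in the abstract above can be sketched in a few lines: a shape carries a texture indicator, the client resolves it to a texture definition, and a fill is generated procedurally at render time. This is an illustrative sketch only; all names (`generate_texture`, `render_shape`, the checker pattern) are assumptions, not from the patent.

```python
def generate_texture(definition, size=4):
    """Procedurally build a tiny texture (a 2D grid of color values)
    from a texture definition, here a two-color checker pattern."""
    fg, bg = definition["foreground"], definition["background"]
    return [[fg if (x + y) % 2 == 0 else bg for x in range(size)]
            for y in range(size)]

def render_shape(shape, texture_definitions):
    """Resolve the shape's texture indicator to a definition and
    dynamically generate the fill applied to the shape."""
    definition = texture_definitions[shape["texture_indicator"]]
    return {"path": shape["path"], "fill": generate_texture(definition)}

definitions = {"park": {"foreground": "green", "background": "darkgreen"}}
shape = {"path": [(0, 0), (10, 0), (10, 10)], "texture_indicator": "park"}
rendered = render_shape(shape, definitions)
```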
  • Publication number: 20180293771
    Abstract: Methods, hardware, and software create and transmit augmented reality in context with captured real world media, so as to replicate a similar augmented reality at a different instance. A computer processor in a communications device handles a combination of augmented reality information, anchor information that provides the context-matching, and captured real world media information. The computer processor determines if the real world subject matter has suitable anchor information to control how augmented reality elements should appear contextually with such media. A graphical user interface on the communications device may provide a user with several options for creation of augmented reality. The augmented reality, anchor, and any additional information are transmitted to a different device to identify triggering or context-matching media. The augmented reality is performed based on the triggering media as perceived on the different device.
    Type: Application
    Filed: September 5, 2017
    Publication date: October 11, 2018
    Inventors: Patrick S. Piemonte, Ryan P. Staake
  • Publication number: 20180283896
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, machine status information for the machine is received at a dedicated machine component. The machine status information is published onto a distributed node system network of the machine. The machine status information is ingested at a primary interface controller, and an interactive user interface is generated using the primary interface controller. The interactive user interface is generated based on the machine status information. In some implementations, input is received from the user at the primary interface controller through the interactive user interface, and a corresponding action is delegated to one or more subsystems of the machine using the distributed node system network.
    Type: Application
    Filed: September 21, 2016
    Publication date: October 4, 2018
    Inventors: Patrick S. Piemonte, Jason D. Gosnell, Kjell F. Bronder, Daniel De Rocha Rosario, Shaun D. Budhram, Scott Herz
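The publish/ingest flow in the abstract above — machine components publishing status onto a shared network, and a primary interface controller ingesting it to generate UI state — can be sketched as a minimal in-process bus. The class and method names here are illustrative assumptions, not taken from the patent.

```python
class StatusBus:
    """Stand-in for the distributed node system network: components
    publish status, subscribers receive it."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, status):
        for callback in self.subscribers:
            callback(status)

class PrimaryInterfaceController:
    """Ingests machine status and derives interactive UI state from it."""
    def __init__(self, bus):
        self.ui_state = {}
        bus.subscribe(self.on_status)

    def on_status(self, status):
        # Regenerate the UI state for the reporting subsystem.
        self.ui_state[status["subsystem"]] = status["value"]

bus = StatusBus()
controller = PrimaryInterfaceController(bus)
bus.publish({"subsystem": "battery", "value": "78%"})
```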
  • Publication number: 20180088324
    Abstract: Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system is provided that receives an input from a user of a mobile machine which indicates or describes an object in the world. In one example, the user may gesture to the object, which is detected by a visual sensor. In another example, the user may verbally describe the object, which is detected by an audio sensor. The system receiving the input may then determine which object near the location of the user is being indicated. Such a determination may include utilizing known objects near the geographic location of the user or the autonomous or mobile machine.
    Type: Application
    Filed: September 20, 2017
    Publication date: March 29, 2018
    Inventors: Patrick S. Piemonte, Wolf Kienzle, Douglas Bowman, Shaun D. Budhram, Madhurani R. Sapre, Vyacheslav Leizerovich, Daniel De Rocha Rosario
  • Publication number: 20180089899
    Abstract: An AR system that leverages a pre-generated 3D model of the world to improve rendering of 3D graphics content for AR views of a scene, for example an AR view of the world in front of a moving vehicle. By leveraging the pre-generated 3D model, the AR system may use a variety of techniques to enhance the rendering capabilities of the system. The AR system may obtain pre-generated 3D data (e.g., 3D tiles) from a remote source (e.g., cloud-based storage), and may use this pre-generated 3D data (e.g., a combination of 3D mesh, textures, and other geometry information) to augment local data (e.g., a point cloud of data collected by vehicle sensors) to determine much more information about a scene, including information about occluded or distant regions of the scene, than is available from the local data.
    Type: Application
    Filed: September 22, 2017
    Publication date: March 29, 2018
    Applicant: Apple Inc.
    Inventors: Patrick S. Piemonte, Daniel De Rocha Rosario, Jason D. Gosnell, Peter Meier
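The augmentation strategy in the abstract above — preferring local sensor data where it exists and filling occluded or distant regions from pre-generated 3D tiles — can be sketched as a per-tile merge. The tile keys, view-distance model, and fetch function are invented for illustration.

```python
def visible_tiles(camera_tile, view_distance):
    """Tile coordinates within view distance of the camera's tile."""
    cx, cy = camera_tile
    return {(cx + dx, cy + dy)
            for dx in range(-view_distance, view_distance + 1)
            for dy in range(-view_distance, view_distance + 1)}

def build_scene(local_points_by_tile, fetch_tile, camera_tile, view_distance=1):
    """Use local sensor points where available; fall back to
    pre-generated tile geometry for occluded or distant tiles."""
    scene = {}
    for tile in visible_tiles(camera_tile, view_distance):
        local = local_points_by_tile.get(tile)
        scene[tile] = local if local else fetch_tile(tile)
    return scene

# Stand-in for fetching a pre-generated 3D tile from cloud storage.
pregenerated = lambda tile: {"source": "cloud", "tile": tile}
local = {(0, 0): {"source": "sensor", "points": 1200}}
scene = build_scene(local, pregenerated, camera_tile=(0, 0))
```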
  • Patent number: 9754397
    Abstract: Methods, hardware, and software perform augmented reality created from a separate source in context with, such as synchronized and positioned in, captured media, so as to replicate a similar augmented reality at a different instance. A computer processor in a network of communications devices handles a combination of augmented reality information, anchor information that provides the context-matching, limitation information that controls whether such information is transmitted or acted upon, and captured media information. The computer processor compares the anchor information with the media to identify triggering media and how augmented reality elements should appear in context with such media. If successful, the augmented reality is performed on a communications device based on the media.
    Type: Grant
    Filed: April 7, 2017
    Date of Patent: September 5, 2017
    Assignee: MIRAGE WORLDS, INC.
    Inventors: Patrick S. Piemonte, Ryan P. Staake
  • Patent number: 9756172
    Abstract: Methods and apparatus for an environment analysis tool on a mobile device which may construct a model of the surrounding environment in order to determine whether or not characteristics of the model implicate a degradation in wireless signal quality. In response to an analysis of the constructed model to determine signal quality, the environment analysis tool may alter the behavior of any number of hardware or software functions to avoid or reduce efforts to receive or use the affected signal over the duration of the mobile device's presence within the environment with the signal-degrading characteristics.
    Type: Grant
    Filed: August 24, 2012
    Date of Patent: September 5, 2017
    Assignee: Apple Inc.
    Inventors: Patrick S. Piemonte, Billy P. Chen, Christopher Blumenberg
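The behavior described above — checking whether characteristics of the constructed environment model implicate signal degradation, and altering device behavior accordingly — can be sketched with a simple attenuation budget. The material attenuation values and the threshold are invented for this example, not from the patent.

```python
# Hypothetical per-feature attenuation, in dB, and a degradation cutoff.
ATTENUATION_DB = {"concrete_wall": 12.0, "glass": 4.0, "open_air": 0.0}
DEGRADATION_THRESHOLD_DB = 10.0

def predicted_attenuation(environment_features):
    """Sum the modeled attenuation of each environment characteristic."""
    return sum(ATTENUATION_DB.get(f, 0.0) for f in environment_features)

def should_defer_network_work(environment_features):
    """Alter behavior (e.g., postpone large transfers) when the modeled
    environment implies degraded signal quality."""
    return predicted_attenuation(environment_features) >= DEGRADATION_THRESHOLD_DB

indoors = ["concrete_wall", "glass"]
outdoors = ["open_air"]
```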
  • Publication number: 20170115871
    Abstract: A multitouch device can interpret and disambiguate different gestures related to manipulating a displayed image of a 3D object, scene, or region. Examples of manipulations include pan, zoom, rotation, and tilt. The device can define a number of manipulation modes, including one or more single-control modes such as a pan mode, a zoom mode, a rotate mode, and/or a tilt mode. The manipulation modes can also include one or more multi-control modes, such as a pan/zoom/rotate mode that allows multiple parameters to be modified simultaneously.
    Type: Application
    Filed: December 20, 2016
    Publication date: April 27, 2017
    Inventors: Patrick S. Piemonte, Bradford A. Moore, Billy P. Chen
  • Publication number: 20170052672
    Abstract: A device that includes at least one processing unit and stores a multi-mode mapping program for execution by the at least one processing unit is described. The program includes a user interface (UI). The UI includes a display area for displaying a two-dimensional (2D) presentation of a map or a three-dimensional (3D) presentation of the map. The UI includes a selectable 3D control for directing the program to transition between the 2D and 3D presentations.
    Type: Application
    Filed: May 23, 2016
    Publication date: February 23, 2017
    Inventors: Scott Forstall, Bradford A. Moore, Marcel van Os, Christopher Blumenberg, Emanuele Vulcano, Brady A. Law, Patrick S. Piemonte, Matthew B. Ball
  • Patent number: 9541417
    Abstract: Some embodiments provide a non-transitory machine-readable medium that stores a program which when executed on a device by at least one processing unit performs panning operations on a three-dimensional (3D) map. The program displays a first 3D perspective view of the 3D map. In response to input to pan the 3D map, the program determines a panning movement based on the input and a two-dimensional (2D) view of the 3D map. The program pans the first 3D perspective view of the 3D map to a second 3D perspective view of the 3D map based on the determined panning movement. The program renders the second 3D perspective view of the 3D map for display on the device.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: January 10, 2017
    Assignee: Apple Inc.
    Inventor: Patrick S. Piemonte
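The panning approach above — interpreting the drag against a flat 2D view of the map and then applying the resulting ground-plane translation to the 3D perspective camera — can be sketched with simplified geometry. The camera fields and conversion factor are illustrative assumptions.

```python
import math

def pan_camera(camera, drag_dx, drag_dy, meters_per_pixel):
    """Convert a screen-space drag into a 2D ground-plane movement,
    rotated by the camera heading, and translate the camera target."""
    heading = math.radians(camera["heading_deg"])
    # Movement in map (east, north) coordinates from the 2D interpretation.
    east = drag_dx * math.cos(heading) - drag_dy * math.sin(heading)
    north = drag_dx * math.sin(heading) + drag_dy * math.cos(heading)
    camera = dict(camera)
    camera["target_x"] += east * meters_per_pixel
    camera["target_y"] += north * meters_per_pixel
    return camera

cam = {"target_x": 0.0, "target_y": 0.0, "heading_deg": 0.0, "tilt_deg": 45.0}
panned = pan_camera(cam, drag_dx=10, drag_dy=0, meters_per_pixel=2.0)
```

Note that the 3D perspective parameters (tilt, heading) are untouched; only the 2D ground-plane target moves, which is what makes the pan feel consistent between 2D and 3D views.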
  • Patent number: 9536325
    Abstract: A device that provides a map and/or navigation application that displays items on the map and/or navigation instructions differently in different modes. The applications of some embodiments provide a day mode and a night mode. In some embodiments the application uses the day mode as a default and activates the night mode when the time is after sunset at the location of the device. Some embodiments activate night mode when multiple conditions are satisfied (for example, when (1) the time is after sunset at the location of the device and (2) the ambient light level is below a threshold brightness).
    Type: Grant
    Filed: October 17, 2013
    Date of Patent: January 3, 2017
    Assignee: Apple Inc.
    Inventors: Cédric Bray, Christopher D. Moore, Patrick S. Piemonte, Emanuele Vulcano, Marcel van Os, Billy P. Chen, Seejo K. Pylappan, Justin O'Beirne
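The conjunctive activation rule from the abstract above sketches directly: night mode turns on only when the local time is after sunset and the ambient light reading is below a brightness threshold. The threshold value is invented for the example.

```python
AMBIENT_LUX_THRESHOLD = 50.0  # hypothetical brightness cutoff

def use_night_mode(current_hour, sunset_hour, ambient_lux):
    """Default to day mode; activate night mode only when both
    conditions are satisfied."""
    after_sunset = current_hour >= sunset_hour
    dark_enough = ambient_lux < AMBIENT_LUX_THRESHOLD
    return after_sunset and dark_enough
```

A brightly lit street after sunset would therefore keep the application in day mode, matching the two-condition behavior described.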
  • Patent number: 9529440
    Abstract: A multitouch device can interpret and disambiguate different gestures related to manipulating a displayed image of a 3D object, scene, or region. Examples of manipulations include pan, zoom, rotation, and tilt. The device can define a number of manipulation modes, including one or more single-control modes such as a pan mode, a zoom mode, a rotate mode, and/or a tilt mode. The manipulation modes can also include one or more multi-control modes, such as a pan/zoom/rotate mode that allows multiple parameters to be modified simultaneously.
    Type: Grant
    Filed: July 15, 2013
    Date of Patent: December 27, 2016
    Assignee: Apple Inc.
    Inventors: Patrick S. Piemonte, Bradford A. Moore, Billy P. Chen
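Disambiguating a two-finger gesture into the manipulation modes named in the abstract above can be sketched with simple motion heuristics: pick a single-control mode when one motion parameter dominates, otherwise fall back to the combined mode. The thresholds and classification rules here are invented for illustration.

```python
def classify_gesture(distance_change, angle_change_deg, centroid_shift):
    """Map raw two-finger motion to a manipulation mode."""
    signals = {
        "zoom": abs(distance_change) > 20,     # pinch spread, in pixels
        "rotate": abs(angle_change_deg) > 10,  # twist, in degrees
        "pan": centroid_shift > 15,            # both fingers moving, pixels
    }
    active = [mode for mode, fired in signals.items() if fired]
    if len(active) == 1:
        return active[0]          # single-control mode
    return "pan/zoom/rotate" if active else "none"  # multi-control mode
```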
  • Patent number: 9423946
    Abstract: Techniques for performing context-sensitive actions in response to touch input are provided. A user interface of an application can be displayed. Touch input can be received in a region of the displayed user interface, and a context can be determined. A first action may be performed if the context is a first context and a second action may instead be performed if the context is a second context different from the first context. In some embodiments, an action may be performed if the context is a first context and the touch input is a first touch input, and may also be performed if the context is a second context and the touch input is a second touch input.
    Type: Grant
    Filed: August 12, 2013
    Date of Patent: August 23, 2016
    Assignee: Apple Inc.
    Inventors: Christopher D. Moore, Marcel Van Os, Bradford A. Moore, Patrick S. Piemonte, Eleanor C. Wachsman
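The context-sensitive dispatch described above can be sketched as routing a (context, input) pair to an action, so the same touch input triggers different actions in different contexts. The contexts and action names are placeholders, not from the patent.

```python
def handle_touch(context, touch_input):
    """Route (context, input) pairs to actions; the same input can map
    to different actions depending on the current context."""
    actions = {
        ("browsing", "tap"): "select_pin",
        ("navigating", "tap"): "show_route_overview",
        ("browsing", "long_press"): "drop_pin",
    }
    return actions.get((context, touch_input), "ignore")
```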
  • Patent number: 9418466
    Abstract: Some embodiments provide a non-transitory machine-readable medium that stores a mapping application which when executed on a device by at least one processing unit renders views of a three-dimensional (3D) map. The mapping application requests a first set of map tiles associated with a portion of the 3D map. In response to the request, the mapping application receives a second set of map tiles associated with the portion of the 3D map. The mapping application identifies a third set of map tiles included in the first set of map tiles but not included in the second set of map tiles. For each map tile in the third set of map tiles, the mapping application generates a replacement map tile comprising geospatial data. The mapping application renders the portion of the 3D map based on the second set of map tiles and the set of replacement map tiles.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: August 16, 2016
    Assignee: Apple Inc.
    Inventor: Patrick S. Piemonte
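The tile-replacement logic in the abstract above reads as a set difference: tiles that were requested but not returned get synthesized replacements so rendering can proceed. A compact sketch, with hypothetical tile IDs and placeholder contents:

```python
def resolve_tiles(requested, received):
    """Identify requested-but-missing tiles (the 'third set') and
    generate a replacement tile carrying basic geospatial data for each."""
    missing = set(requested) - set(received)
    return {tile: {"tile": tile, "geospatial_placeholder": True}
            for tile in missing}

requested = {(0, 0), (0, 1), (1, 0)}
received = {(0, 0), (1, 0)}
replacements = resolve_tiles(requested, received)
```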
  • Patent number: 9367959
    Abstract: A device that includes at least one processing unit and stores a multi-mode mapping program for execution by the at least one processing unit is described. The program includes a user interface (UI). The UI includes a display area for displaying a two-dimensional (2D) presentation of a map or a three-dimensional (3D) presentation of the map. The UI includes a selectable 3D control for directing the program to transition between the 2D and 3D presentations.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: June 14, 2016
    Assignee: Apple Inc.
    Inventors: Scott Forstall, Bradford A. Moore, Marcel van Os, Christopher Blumenberg, Emanuele Vulcano, Brady A. Law, Patrick S. Piemonte, Matthew B. Ball
  • Patent number: 9311750
    Abstract: A mapping program for execution by at least one processing unit of a device is described. The device includes a touch-sensitive screen and a multi-touch input interface. The program renders and displays a presentation of a map from a particular view of the map. The program generates an instruction to rotate the displayed map in response to a multi-touch input from the multi-touch input interface. In order to generate a rotating presentation of the map, the program changes the particular view while receiving the multi-touch input and for a duration of time after the multi-touch input has terminated in order to provide a degree of inertia motion for the rotating presentation of the map.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: April 12, 2016
    Assignee: Apple Inc.
    Inventors: Bradford A. Moore, Marcel van Os, Albert P. Dul, Patrick S. Piemonte, Erik Anders Mikael Adlers
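The inertia behavior above can be sketched as rotation that continues after the touch ends, with the angular velocity decaying each frame until it falls below a cutoff. The decay factor and cutoff are invented for illustration.

```python
def inertia_angles(initial_velocity_deg_per_frame, decay=0.9, cutoff=0.1):
    """Yield the per-frame rotation deltas applied after the
    multi-touch input terminates."""
    velocity = initial_velocity_deg_per_frame
    deltas = []
    while abs(velocity) >= cutoff:
        deltas.append(velocity)
        velocity *= decay  # exponential decay of the rotation rate
    return deltas

deltas = inertia_angles(5.0)
```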
  • Publication number: 20160055669
    Abstract: Some embodiments provide a mapping application for generating views of a three-dimensional (3D) map. The mapping application includes a geographic data module for identifying a set of geographic data that represents a portion of the 3D map. The set of geographic data includes a set of camera captured images that correspond to the portion of the 3D map. The mapping application includes an image processing module for rendering the view of the 3D map based on the geographic data by animating a type of map element in the view of the 3D map.
    Type: Application
    Filed: August 24, 2015
    Publication date: February 25, 2016
    Inventors: Patrick S. Piemonte, Erik Anders Mikael Adlers, Christopher Blumenberg
  • Patent number: 9269178
    Abstract: Some embodiments provide a non-transitory machine-readable medium that stores a mapping application which when executed on a device by at least one processing unit provides automated animation of a three-dimensional (3D) map along a navigation route. The mapping application identifies a first set of attributes for determining a first position of a virtual camera in the 3D map at a first instance in time. Based on the identified first set of attributes, the mapping application determines the position of the virtual camera in the 3D map at the first instance in time. The mapping application identifies a second set of attributes for determining a second position of the virtual camera in the 3D map at a second instance in time. Based on the identified second set of attributes, the mapping application determines the position of the virtual camera in the 3D map at the second instance in time.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: February 23, 2016
    Assignee: Apple Inc.
    Inventors: Patrick S. Piemonte, Aroon Pahwa, Christopher D. Moore
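The two-instant camera placement above suggests a simple interpolation scheme: evaluate the camera attributes at two instants along the route and blend between them for intermediate frames. The attribute names and linear blend are illustrative assumptions, not the patented method itself.

```python
def lerp(a, b, t):
    return a + (b - a) * t

def camera_at(t, t0, attrs0, t1, attrs1):
    """Interpolate each camera attribute between its values at the
    two time instants t0 and t1."""
    fraction = (t - t0) / (t1 - t0)
    return {key: lerp(attrs0[key], attrs1[key], fraction) for key in attrs0}

start = {"x": 0.0, "y": 0.0, "altitude": 300.0, "pitch_deg": 60.0}
end = {"x": 100.0, "y": 50.0, "altitude": 200.0, "pitch_deg": 45.0}
midpoint = camera_at(0.5, 0.0, start, 1.0, end)
```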
  • Publication number: 20160035121
    Abstract: Methods, systems and apparatus are described to provide visual feedback of a change in map view. Various embodiments may display a map view of a map in a two-dimensional map view mode. Embodiments may obtain input indicating a change to a three-dimensional map view mode. Input may be obtained through touch, auditory, or other well-known input technologies. Some embodiments may allow the input to specify a particular display position. In response to the input indicating a change to a three-dimensional map view mode, embodiments may then display an animation that moves a virtual camera for the map display to different virtual camera positions to illustrate that the map view mode has changed to a three-dimensional map view mode.
    Type: Application
    Filed: October 9, 2015
    Publication date: February 4, 2016
    Applicant: Apple Inc.
    Inventors: Billy P. Chen, Patrick S. Piemonte, Christopher Blumenberg
  • Patent number: 9218685
    Abstract: Systems and methods for rendering 3D maps may highlight a feature in a 3D map while preserving depth. A map tool of a mapping or navigation application that detects the selection of a feature in a 3D map (e.g., by touch) may perform a ray intersection to determine the feature that was selected. The map tool may capture the frame to be displayed (with the selected feature highlighted) in several steps. Each step may translate the map about a pivot point of the selected map feature (e.g., in three or four directions) to capture a new frame. The captured frames may be blended together to create a blurred map view that depicts 3D depth in the scene. A crisp version of the selected feature may then be rendered within the otherwise blurred 3D map. Color, brightness, contrast, or saturation values may be modified to further highlight the selected feature.
    Type: Grant
    Filed: December 7, 2012
    Date of Patent: December 22, 2015
    Assignee: Apple Inc.
    Inventors: Patrick S. Piemonte, Billy P. Chen, Christopher Blumenberg, Edward Kandrot
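The highlight-with-depth technique above can be sketched as rendering the scene several times, each slightly translated about the selected feature's pivot, and averaging the captures into a blurred backdrop before drawing the selected feature crisply on top. Frames here are tiny grayscale grids; the offsets and sizes are invented for the example.

```python
def blend_frames(frames):
    """Average per-pixel values across the captured frames to produce
    the blurred map view."""
    height, width = len(frames[0]), len(frames[0][0])
    count = len(frames)
    return [[sum(f[y][x] for f in frames) / count for x in range(width)]
            for y in range(height)]

# Four captures of a 1x2 image, nudged in different directions so the
# pixel values differ per capture (standing in for the map translations).
frames = [[[100, 0]], [[60, 0]], [[100, 40]], [[60, 40]]]
blurred = blend_frames(frames)
```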