Patents by Inventor Christopher D. Esposito
Christopher D. Esposito has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230342513
Abstract: A temporal multi-configuration model dataset system comprises a computer system configured to: compare a prior parts list for a vehicle at a point in time to a current parts list, resulting in a comparison; determine change lists for the parts that changed using the comparison; append models to a model dataset for the vehicle in response to parts added to the vehicle, wherein models are not removed from the model dataset in response to parts being removed from the vehicle; determine display parts in the model dataset present for a selected point in time in response to receiving a request to visualize the vehicle at the selected point in time, in which the display parts are determined using the set of change lists; and display a visualization of the vehicle using the display parts on a display system.
Type: Application
Filed: April 22, 2022
Publication date: October 26, 2023
Inventors: James J. Troy, James Edward Fadenrecht, Vladimir Karakusevic, Christopher D. Esposito, Rohan Jayantilal Rana, Robert Allan Brandt
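The append-only dataset with change lists described in this abstract can be sketched roughly as follows. This is an illustrative reconstruction, not code from the patent; all class and method names (`TemporalModelDataset`, `record_change`, `display_parts`) are invented:

```python
from dataclasses import dataclass, field

@dataclass
class ChangeList:
    """Parts added and removed at a given point in time (hypothetical sketch)."""
    time: int
    added: set = field(default_factory=set)
    removed: set = field(default_factory=set)

class TemporalModelDataset:
    """Append-only model dataset: models are added but never removed,
    so any historical vehicle configuration can be reconstructed."""
    def __init__(self):
        self.models = {}        # part id -> model data
        self.change_lists = []  # ordered ChangeList entries

    def record_change(self, time, prior_parts, current_parts):
        # Compare the prior parts list to the current one.
        added = current_parts - prior_parts
        removed = prior_parts - current_parts
        self.change_lists.append(ChangeList(time, added, removed))
        # Append models for newly added parts only; never delete models.
        for part in added:
            self.models.setdefault(part, f"model-for-{part}")

    def display_parts(self, selected_time):
        # Replay the change lists up to the selected point in time to
        # determine which parts were present then.
        present = set()
        for cl in sorted(self.change_lists, key=lambda c: c.time):
            if cl.time > selected_time:
                break
            present |= cl.added
            present -= cl.removed
        return present
```

Because models are only ever appended, visualizing an earlier configuration never requires reloading model data that was later removed from the parts list.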
-
Patent number: 10929670
Abstract: A method for providing just-in-time access to data for calibrating an augmented reality (AR) device relative to an operation environment and then using the calibration data to register virtual content generated by the AR device with a scene being viewed by a user in the operation environment. The AR device is calibrated by pairing marker identifiers of 2-D markers affixed to objects in a physical environment with the 3-D locations of those objects specified in a virtual environment containing 3-D models of those objects, from which a marker-to-model location pairing list is generated. The pairing list is used to align displayed virtual 3-D content with an object appearing in the visualized physical environment. A location correction is computed at run-time based on a current AR device-to-marker offset computed from an image of the 2-D marker and the 3-D location of the object retrieved from the pairing list.
Type: Grant
Filed: October 21, 2019
Date of Patent: February 23, 2021
Assignee: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito
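The marker-to-model pairing list above can be sketched as a simple mapping from marker identifiers to model-frame 3-D locations. This is a minimal illustration under assumed names and coordinates, not the patent's implementation:

```python
# Hypothetical sketch of the marker-to-model location pairing list.
# Marker IDs and coordinates are invented for illustration.

def build_pairing_list(scanned_marker_ids, model_locations):
    """Pair each scanned 2-D marker ID with the 3-D location of the
    corresponding object as specified in the virtual model."""
    return {mid: model_locations[mid]
            for mid in scanned_marker_ids if mid in model_locations}

def corrected_device_location(device_to_marker_offset, marker_id, pairing_list):
    """Run-time location correction: the device's location in the model
    frame is the marker's known 3-D location minus the measured
    device-to-marker offset (positions only; orientation omitted)."""
    mx, my, mz = pairing_list[marker_id]
    ox, oy, oz = device_to_marker_offset
    return (mx - ox, my - oy, mz - oz)
```

A full implementation would also carry orientation (e.g. as a rotation matrix or quaternion) so that virtual content can be aligned, not just positioned.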
-
Systems and methods for navigating within a visual representation of a three-dimensional environment
Patent number: 10839602
Abstract: In an example, a method for navigating within a visual representation of a three-dimensional (3D) environment is described. The method includes detecting a selection of a translational input trigger at a translational origin in a translational input region of a user interface. The method includes detecting that the translational input trigger has been moved from the translational origin to a point on a translational control border. The method includes detecting that the translational input trigger has been moved from within the position direct control zone to a location within the position rate control zone. The method includes, responsive to detecting that the translational input trigger has been moved from within the position direct control zone to the location within the position rate control zone, translating the field of view in the visual representation of the 3D environment at a velocity corresponding to the location within the position rate control zone.
Type: Grant
Filed: September 30, 2019
Date of Patent: November 17, 2020
Assignee: The Boeing Company
Inventor: Christopher D. Esposito
-
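The direct-control and rate-control zones of patent 10839602 can be illustrated with a small sketch: inside an inner zone the drag maps directly to position, and beyond its border the field of view moves at a velocity that grows with how far the input has penetrated the rate zone. The zone radius and gain constants are invented for illustration:

```python
# Rough sketch of a position direct control zone surrounded by a
# position rate control zone. Constants are illustrative assumptions.

DIRECT_ZONE_RADIUS = 50.0   # pixels: inside this, drag maps directly to position
RATE_GAIN = 0.2             # velocity units per pixel beyond the direct zone

def translation_command(drag_distance):
    """Return ('direct', offset) while the input trigger is inside the
    direct control zone, or ('rate', velocity) once it crosses the
    control border into the rate control zone, where the translation
    velocity corresponds to the penetration depth."""
    if drag_distance <= DIRECT_ZONE_RADIUS:
        return ("direct", drag_distance)
    penetration = drag_distance - DIRECT_ZONE_RADIUS
    return ("rate", penetration * RATE_GAIN)
```

This hybrid scheme lets short drags make precise adjustments while sustained drags past the border produce continuous travel through the 3D environment.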
Patent number: 10800550
Abstract: Apparatus and methods for displaying a three-dimensional model image of a portion of a target object. An imaging device is equipped with an inertial measurement unit (IMU) and a processor configured to execute a three-dimensional (3-D) visualization application. The IMU is used to track movement of the imaging device relative to a known initial location in a frame of reference of the target object. Imaging device position offsets are computed using relative position and orientation information acquired by a dead-reckoning process. The processor is configured to execute an algorithm that combines orientation data from the IMU with walking step information to produce a piecewise linear approximation for relative motion measurement. The resulting relative location data can then be used by the 3-D visualization application to provide an estimated 3-D viewpoint to display a 3-D model of a feature in the imaged area of interest.
Type: Grant
Filed: June 21, 2018
Date of Patent: October 13, 2020
Assignee: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito, Vladimir Karakusevic
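The piecewise-linear dead-reckoning idea above, combining IMU heading with detected walking steps, can be sketched in two dimensions as follows. The fixed step length and the per-step heading input are simplifying assumptions for illustration:

```python
import math

# Hypothetical sketch of piecewise-linear dead reckoning: each detected
# walking step contributes one straight segment oriented by the IMU
# heading at that step. The step length constant is illustrative.

STEP_LENGTH = 0.75  # metres per walking step (assumed)

def dead_reckon(start, headings_per_step):
    """Accumulate one straight segment per detected step, oriented by
    the IMU heading (radians) at that step, yielding a piecewise-linear
    approximation of the device's path relative to the start point."""
    x, y = start
    path = [(x, y)]
    for heading in headings_per_step:
        x += STEP_LENGTH * math.cos(heading)
        y += STEP_LENGTH * math.sin(heading)
        path.append((x, y))
    return path
```

The final point of the path gives the relative position offset the visualization application would use to estimate the current 3-D viewpoint.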
-
Patent number: 10607409
Abstract: Systems and methods for constructing and saving files containing computer-generated image data with associated virtual camera location data during 3-D visualization of an object (e.g., an aircraft). The process tags computer-generated images with virtual camera location and settings information selected by the user while navigating a 3-D visualization of an object. The virtual camera location data in the saved image file can be used later as a way to return the viewpoint to the virtual camera location in the 3-D environment from where the image was taken. For example, these tagged images can later be drag-and-dropped onto the display screen while the 3-D visualization application is running to activate the process of retrieving and displaying a previously selected image. Multiple images can be loaded and then used to determine the relative viewpoint offset between images.
Type: Grant
Filed: July 19, 2016
Date of Patent: March 31, 2020
Assignee: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito, Vladimir Karakusevic
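Tagging an image with its virtual camera state, and later using that tag to restore the viewpoint or compare two images, can be sketched as below. All field names and the flat-tuple camera representation are invented for illustration, not taken from the patent:

```python
# Illustrative sketch of tagging a saved image with virtual camera
# location data so the viewpoint can be restored later. Field names
# are hypothetical.

def tag_image(image_name, camera_position, camera_orientation, settings):
    """Bundle an image with the virtual camera state at capture time."""
    return {
        "image": image_name,
        "camera": {
            "position": camera_position,        # (x, y, z) in model coords
            "orientation": camera_orientation,  # (heading, pitch, roll)
            "settings": settings,               # e.g. field of view
        },
    }

def restore_viewpoint(tagged):
    """Return the camera state needed to jump back to where the image
    was taken, e.g. when the image is dropped onto the running app."""
    return tagged["camera"]["position"], tagged["camera"]["orientation"]

def viewpoint_offset(tagged_a, tagged_b):
    """Relative position offset between the viewpoints of two tagged images."""
    (ax, ay, az), _ = restore_viewpoint(tagged_a)
    (bx, by, bz), _ = restore_viewpoint(tagged_b)
    return (bx - ax, by - ay, bz - az)
```

In practice such metadata would likely be embedded in the saved image file itself (for instance in a metadata chunk) rather than kept as a separate dictionary.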
-
Publication number: 20190389600
Abstract: Apparatus and methods for displaying a three-dimensional model image of a portion of a target object. An imaging device is equipped with an inertial measurement unit (IMU) and a processor configured to execute a three-dimensional (3-D) visualization application. The IMU is used to track movement of the imaging device relative to a known initial location in a frame of reference of the target object. Imaging device position offsets are computed using relative position and orientation information acquired by a dead-reckoning process. The processor is configured to execute an algorithm that combines orientation data from the IMU with walking step information to produce a piecewise linear approximation for relative motion measurement. The resulting relative location data can then be used by the 3-D visualization application to provide an estimated 3-D viewpoint to display a 3-D model of a feature in the imaged area of interest.
Type: Application
Filed: June 21, 2018
Publication date: December 26, 2019
Applicant: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito, Vladimir Karakusevic
-
Patent number: 10380469
Abstract: Systems and methods for determining locations of a device in an environment where features are present. Passive code pattern markers are used as unique location landmarks to provide on-demand location information to the user of the device in an abstract, landmark-based reference system that can then be mapped into an underlying physical 3-D coordinate system to give location coordinates that can be used by other tools to determine a viewpoint. For example, a 3-D visualization system can be configured to set a viewpoint so that an image concurrently generated by a computer system presents a scene which approximates the scene being viewed by the user in the physical world at that moment in time.
Type: Grant
Filed: July 12, 2016
Date of Patent: August 13, 2019
Assignee: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito
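The landmark-based reference system above amounts to a lookup from a scanned marker's identifier to physical 3-D coordinates. A minimal sketch, with all landmark IDs and coordinates invented for illustration:

```python
# Sketch of mapping an abstract, landmark-based reference system into
# physical 3-D coordinates. IDs and coordinates are illustrative.

LANDMARK_MAP = {
    "LANDMARK-A": (0.0, 0.0, 0.0),
    "LANDMARK-B": (12.5, 3.0, 0.0),
}

def locate(scanned_landmark_id):
    """Map an on-demand landmark observation (e.g. a scanned passive
    code pattern marker) to 3-D coordinates that a visualization tool
    can use to set its viewpoint."""
    coords = LANDMARK_MAP.get(scanned_landmark_id)
    if coords is None:
        raise KeyError(f"unknown landmark {scanned_landmark_id!r}")
    return coords
```

Because the markers are passive, no powered infrastructure is needed: scanning any marker immediately anchors the user's abstract location to the underlying coordinate system.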
-
Patent number: 9892558
Abstract: Methods for identifying parts of a target object (e.g., an airplane) using geotagged photographs captured on site by a hand-held imaging device. The geotagged photographs contain GPS location data and camera setting information. The embedded image metadata from two or more photographs is used to estimate the location (i.e., position and orientation) of the imaging device relative to the target object, which location is defined in the coordinate system of the target object. Once the coordinates of the area of interest on the target object are known, the part number and other information associated with the part can be determined when the imaging device viewpoint information is provided to a three-dimensional visualization environment that has access to three-dimensional models of the target object.
Type: Grant
Filed: February 19, 2016
Date of Patent: February 13, 2018
Assignee: The Boeing Company
Inventors: James J. Troy, Vladimir Karakusevic, Christopher D. Esposito
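One geometric ingredient of using two or more geotagged photographs this way is intersecting the viewing rays from two known camera positions to estimate the coordinates of the area of interest. The 2-D ray-intersection sketch below is an illustrative simplification, not the patent's full position-and-orientation estimation:

```python
import math

# Illustrative 2-D sketch: each geotagged photo contributes a camera
# position and a bearing toward the area of interest; intersecting the
# two rays estimates that area's coordinates. A real implementation
# works in 3-D and also recovers camera orientation.

def intersect_bearings(p1, bearing1, p2, bearing2):
    """Intersect two 2-D rays given origins and bearings in radians."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

With the area of interest located in the target object's coordinate system, the viewpoint can be handed to the 3-D visualization environment to look up the part number at that location.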
-
Publication number: 20180025543
Abstract: Systems and methods for constructing and saving files containing computer-generated image data with associated virtual camera location data during 3-D visualization of an object (e.g., an aircraft). The process tags computer-generated images with virtual camera location and settings information selected by the user while navigating a 3-D visualization of an object. The virtual camera location data in the saved image file can be used later as a way to return the viewpoint to the virtual camera location in the 3-D environment from where the image was taken. For example, these tagged images can later be drag-and-dropped onto the display screen while the 3-D visualization application is running to activate the process of retrieving and displaying a previously selected image. Multiple images can be loaded and then used to determine the relative viewpoint offset between images.
Type: Application
Filed: July 19, 2016
Publication date: January 25, 2018
Applicant: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito, Vladimir Karakusevic
-
Publication number: 20170243399
Abstract: Methods for identifying parts of a target object (e.g., an airplane) using geotagged photographs captured on site by a hand-held imaging device. The geotagged photographs contain GPS location data and camera setting information. The embedded image metadata from two or more photographs is used to estimate the location (i.e., position and orientation) of the imaging device relative to the target object, which location is defined in the coordinate system of the target object. Once the coordinates of the area of interest on the target object are known, the part number and other information associated with the part can be determined when the imaging device viewpoint information is provided to a three-dimensional visualization environment that has access to three-dimensional models of the target object.
Type: Application
Filed: February 19, 2016
Publication date: August 24, 2017
Applicant: The Boeing Company
Inventors: James J. Troy, Vladimir Karakusevic, Christopher D. Esposito
-
Publication number: 20160321530
Abstract: Systems and methods for determining locations of a device in an environment where features are present. Passive code pattern markers are used as unique location landmarks to provide on-demand location information to the user of the device in an abstract, landmark-based reference system that can then be mapped into an underlying physical 3-D coordinate system to give location coordinates that can be used by other tools to determine a viewpoint. For example, a 3-D visualization system can be configured to set a viewpoint so that an image concurrently generated by a computer system presents a scene which approximates the scene being viewed by the user in the physical world at that moment in time.
Type: Application
Filed: July 12, 2016
Publication date: November 3, 2016
Applicant: The Boeing Company
Inventors: James J. Troy, Christopher D. Esposito