Patents by Inventor Stéphane Côté

Stéphane Côté has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11952109
    Abstract: A system for an aircraft comprising a controller configured to obtain a first piece of information regarding whether the aircraft has achieved lift-off; to obtain a second piece of information regarding whether the aircraft is in a flight phase in an initial part of lift-off; to determine, on the basis of at least the first piece of information, whether the gear is retractable; and, if the gear is retractable in an initial part of the lift-off, to trigger a landing-gear-retraction alarm taking an audio form and/or a visual form, and, if the gear is retractable after the initial part of the lift-off, to optionally trigger a second landing-gear-retraction alarm taking an audio form and/or a visual form. A method implemented by the system is also provided. Advantageously, the flight crew of an aircraft is notified of an omitted retraction of the landing gear.
    Type: Grant
    Filed: April 21, 2022
    Date of Patent: April 9, 2024
    Assignees: Airbus SAS, Airbus Operations SAS
    Inventors: Stéphane Cote, Christine Gris, Philippe Castaigns, Cesar Garcia Castilla
  • Patent number: 11521357
    Abstract: In one example embodiment, a software application obtains a set of images that include an aerial cable and generates a 3D model from the set of images. The 3D model initially excludes a representation of the aerial cable. The software application processes each image of the set of images to extract pixels that potentially represent cables and determines a position in 3D space of the 3D model of a pair of attachment points for the aerial cable. The software application defines a vertical plane in 3D space of the 3D model based on the pair of cable attachment points. For each of one or more images of the set of images, the software application projects at least some of the pixels that potentially represent cables onto the vertical plane. The software application then calculates a curve representation (e.g., a catenary equation) for the aerial cable based on the pixels projected onto the vertical plane, and adds a cable model defined by the curve representation to the 3D model to represent the aerial cable.
    Type: Grant
    Filed: November 3, 2020
    Date of Patent: December 6, 2022
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, William Guimont-Martin
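
The curve-fitting step described in the abstract above can be illustrated with a small sketch. The example below is illustrative only (the function and parameter names are not from the patent): it fits a catenary y = c + a·cosh((x − x0)/a) to 2-D points on the vertical plane by a coarse least-squares grid search, using only the Python standard library:

```python
import math

def catenary(x, a, x0, c):
    """Evaluate the catenary y = c + a*cosh((x - x0)/a)."""
    return c + a * math.cosh((x - x0) / a)

def fit_catenary(points, a_range, x0_range, c_range):
    """Brute-force least-squares fit of catenary parameters to (x, y) points."""
    best, best_err = None, float("inf")
    for a in a_range:
        for x0 in x0_range:
            for c in c_range:
                err = sum((catenary(x, a, x0, c) - y) ** 2 for x, y in points)
                if err < best_err:
                    best, best_err = (a, x0, c), err
    return best

def frange(lo, hi, n):
    """n evenly spaced values from lo to hi inclusive."""
    return [lo + (hi - lo) * i / (n - 1) for i in range(n)]

# Synthetic "projected cable pixels": samples of a known catenary.
true_a, true_x0, true_c = 5.0, 10.0, 2.0
points = [(x, catenary(x, true_a, true_x0, true_c)) for x in range(21)]
a, x0, c = fit_catenary(points, frange(4, 6, 21), frange(9, 11, 21), frange(1, 3, 21))
print(a, x0, c)  # recovers the generating parameters
```

In practice a gradient-based solver would replace the grid search, but the residual being minimized (squared vertical distance of projected pixels from the candidate curve) is the same.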
  • Publication number: 20220340268
    Abstract: A system for an aircraft comprising a controller configured to obtain a first piece of information regarding whether the aircraft has achieved lift-off; to obtain a second piece of information regarding whether the aircraft is in a flight phase in an initial part of lift-off; to determine, on the basis of at least the first piece of information, whether the gear is retractable; and, if the gear is retractable in an initial part of the lift-off, to trigger a landing-gear-retraction alarm taking an audio form and/or a visual form, and, if the gear is retractable after the initial part of the lift-off, to optionally trigger a second landing-gear-retraction alarm taking an audio form and/or a visual form. A method implemented by the system is also provided. Advantageously, the flight crew of an aircraft is notified of an omitted retraction of the landing gear.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 27, 2022
    Inventors: Stéphane COTE, Christine GRIS, Philippe CASTAIGNS, Cesar GARCIA CASTILLA
  • Patent number: 10930078
    Abstract: In one embodiment, techniques are provided for improving perception of representations of subsurface features (e.g., virtual paint markings) in augmented reality. An input image of a terrain surface is accessed. An augmentation stencil image aligned with the input image is created and represented in the HSL color space. The input image is converted to the HSL color space. The technique creates and displays an augmented image that, for each pixel that falls outside of the representation of the subsurface features, has pixel values based on a hue value, a saturation value and a lightness value of the input image, and, for each pixel that coincides with the representation of the subsurface features, has pixel values based on a hue value and a saturation value of the augmentation stencil image and a lightness value based on the input image.
    Type: Grant
    Filed: November 1, 2018
    Date of Patent: February 23, 2021
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Jade Marcoux-Ouellet
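
The per-pixel compositing rule described in the abstract above (hue and saturation from the stencil, lightness from the input image) can be sketched with the standard-library colorsys module. The function below is an illustrative reimplementation, not the patented code, and assumes RGB pixel values as floats in [0, 1]:

```python
import colorsys

def composite_pixel(input_rgb, stencil_rgb, in_marking):
    """Combine one input-image pixel with one stencil pixel.

    Outside a virtual marking the input pixel passes through unchanged;
    inside it, hue and saturation come from the stencil while lightness
    is retained from the input image, preserving shading cues from the
    real terrain surface.
    """
    if not in_marking:
        return input_rgb
    _, l_in, _ = colorsys.rgb_to_hls(*input_rgb)        # note: HLS order
    h_st, _, s_st = colorsys.rgb_to_hls(*stencil_rgb)
    return colorsys.hls_to_rgb(h_st, l_in, s_st)

# A mid-gray input pixel painted with a pure-red stencil takes the red
# hue and saturation but keeps its own lightness.
gray = (0.5, 0.5, 0.5)
red = (1.0, 0.0, 0.0)
print(composite_pixel(gray, red, in_marking=True))
```

Retaining the input image's lightness channel is what lets the "virtual paint" inherit real-world shadows and texture instead of appearing as a flat overlay.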
  • Patent number: 10930079
    Abstract: In one embodiment, an augmented reality application executing on an augmented reality device accesses a representation of the physical environment. The augmented reality application aligns information that describes near-ground features with the representation of the physical environment. After alignment, the augmented reality application generates an augmented reality view that is displayed on a display device by projecting the near-ground features onto the ground in the representation of the physical environment, generating a virtual plane above the ground that is parallel to the ground and separated therefrom by a distance, projecting the near-ground features onto the virtual plane above the ground, and showing an indicator of visual correspondence that connects at least a portion of the projection of near ground features on the ground and a corresponding portion of the projection of near-ground features on the virtual plane above the ground.
    Type: Grant
    Filed: January 15, 2019
    Date of Patent: February 23, 2021
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Marc-André Bouvrette, Danny Lebel
  • Patent number: 10755483
    Abstract: In one embodiment, techniques are provided for projecting information (e.g., describing subsurface features such as subsurface utilities) onto a surface (e.g., road surface) of the physical environment using pre-captured topography (e.g., determined by structure-from-motion (SfM) photogrammetry) and pre-projection of the information onto the pre-captured topography. The pre-projected information is subsequently combined with a live view of the physical environment to produce an augmented reality view that is displayed to a user.
    Type: Grant
    Filed: August 17, 2018
    Date of Patent: August 25, 2020
    Assignee: Bentley Systems, Incorporated
    Inventor: Stéphane Côté
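
The pre-projection step described in the abstract above (draping subsurface-utility linework onto pre-captured topography) can be sketched as sampling a heightfield along a 2-D polyline. The heightfield and all names below are illustrative, not from the patent:

```python
def drape_polyline(polyline_xy, height_at, samples_per_segment=4):
    """Project a 2-D polyline onto a terrain surface.

    Each segment is subdivided, and every sample point is dropped onto
    the surface by looking up the terrain height at its (x, y) position,
    yielding a 3-D polyline that conforms to the topography.
    """
    draped = []
    for (x0, y0), (x1, y1) in zip(polyline_xy, polyline_xy[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            draped.append((x, y, height_at(x, y)))
    x, y = polyline_xy[-1]
    draped.append((x, y, height_at(x, y)))
    return draped

# Illustrative terrain: a gentle slope rising along x.
slope = lambda x, y: 0.1 * x
pipe = drape_polyline([(0.0, 0.0), (10.0, 0.0)], slope)
print(pipe[0], pipe[-1])  # → (0.0, 0.0, 0.0) (10.0, 0.0, 1.0)
```

Because the draping is done against pre-captured topography rather than per frame, the live augmented view only needs to composite the already-conformed geometry with the camera image.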
  • Patent number: 10755484
    Abstract: In one embodiment, techniques are provided for capturing accurate information describing the location of subsurface features (e.g., subsurface utilities such as water pipes, sewer pipes, electrical conduits, etc.) usable in providing an augmented reality view. A set of images is captured with a camera rig coupled to a mobile portion (e.g., the boom) of a piece of heavy construction equipment (e.g., an excavator) being used by workers to conduct an excavation that exposes the subsurface features. The set of images is provided to a structure-from-motion (SfM) photogrammetry application that generates a 3D reality mesh. Relative and/or absolute locations of the subsurface features are calculated based on the 3D reality mesh and provided to an augmented reality application executing on an augmented reality device for use in providing an augmented reality view.
    Type: Grant
    Filed: August 17, 2018
    Date of Patent: August 25, 2020
    Assignee: Bentley Systems, Incorporated
    Inventor: Stéphane Côté
  • Patent number: 10602117
    Abstract: In one embodiment, at an initial time, a capture system records a 3D video stream and an audio stream of a first user interacting with the physical environment. A processing device receives the 3D video stream and the audio stream, isolates the first user from the physical environment, and stores at least a portion of the isolated representation and a portion of the audio stream. At a subsequent time, a second user uses a camera of an augmented reality device to capture a scene. The augmented reality device loads the portion of the isolated representation of the first user and the portion of the audio stream, aligns the 3D video stream with the captured scene, and calculates a pose of the augmented reality device. The augmented reality device then produces an augmented scene, which is displayed to the second user while the portion of the audio stream is played back.
    Type: Grant
    Filed: September 11, 2017
    Date of Patent: March 24, 2020
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Marc-Andre Bouvrette
  • Patent number: 10497177
    Abstract: In one embodiment, augmented reality is simulated by augmenting a pre-generated, physical environment-aligned 3D reality mesh. A camera system captures digital images of a site. A backend processing system generates a 3D reality mesh of the site based on the digital images, assigns metadata to infrastructure elements in the 3D reality mesh, and stores the 3D reality mesh and metadata. At a subsequent time, a mobile device accesses the 3D reality mesh and assigned metadata. A positioning sensor determines a live position of the mobile device. An orientation sensor determines a live orientation of the mobile device. Based on the live position and live orientation, a view of the 3D reality mesh aligned to the physical environment is produced and displayed on a display device of the mobile device. That view is then augmented based on the assigned metadata.
    Type: Grant
    Filed: September 19, 2017
    Date of Patent: December 3, 2019
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Maxime Ménard
  • Patent number: 10395427
    Abstract: An augmented reality application is provided that enhances on-site visualization and modeling using functional drawings (e.g., P&IDs). The augmented reality application may utilize a 3-D model as a bridge between symbols and lines in a functional drawing and objects (e.g., pieces of equipment) in the physical environment, to allow a user to rapidly locate a symbol or line in the functional drawing that represents a user-selected object (e.g., piece of equipment) in the physical environment or to rapidly locate an object (e.g., piece of equipment) in the physical environment that is represented by a user-selected symbol or line in the functional drawing. The augmented reality application may further allow a user to efficiently modify (e.g., add elements to) a 3-D model based on changes (e.g., additions) to a functional drawing.
    Type: Grant
    Filed: April 11, 2017
    Date of Patent: August 27, 2019
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Vincent Hamel
  • Patent number: 10185787
    Abstract: In one embodiment, accurate onsite model visualization is provided by utilizing a user's location and orientation in the physical environment to control a view of a CAD model (and optional aligned point cloud), enabling the user to perceive the relationship between model data and locations in the physical environment. Instead of augmenting a view of the physical environment with virtual features, the user navigates within a virtual environment (consisting of the CAD model and optional point cloud) based on their movements in the physical environment. Based on such navigation, the user may determine when they have moved to a location in the physical environment that they would like to interact with (e.g., a location to mark), and interact with such location based on related model data.
    Type: Grant
    Filed: April 6, 2016
    Date of Patent: January 22, 2019
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Antoine Girard-Vallée
  • Patent number: 9918204
    Abstract: In one embodiment, a technique is provided for tracking a mobile device within a building. A coarse position estimate of the mobile device is determined using a positioning system. The coarse position estimate indicates a room in which the mobile device is located. One or more sensors of the mobile device capture a live point cloud of the surroundings of the mobile device. Tracking software accesses a portion of a pre-captured point cloud of the interior of the building that serves as a reference, the portion corresponding to the room indicated by the coarse position estimate. An initial pose of the mobile device is determined by comparing the live point cloud to the portion of the pre-captured point cloud. Once the initial pose is determined, an updated pose of the mobile device is determined when the mobile device is moved, based on a further comparison of the live point cloud to the portion of the pre-captured point cloud.
    Type: Grant
    Filed: December 8, 2015
    Date of Patent: March 13, 2018
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Francois Rheault
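
The point-cloud comparison step described in the abstract above can be illustrated with a deliberately simplified sketch: recovering a 2-D translation between a live scan and the pre-captured reference from their centroids. A real tracker would use full 3-D registration (e.g., ICP) to recover rotation as well; all names here are illustrative:

```python
def centroid(points):
    """Mean position of a set of (x, y) points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def estimate_translation(reference, live):
    """Estimate the offset that maps the live scan onto the reference.

    With correspondences unknown but the clouds differing only by a
    translation, the difference of centroids recovers that translation,
    which here stands in for the device's pose offset within the room.
    """
    rx, ry = centroid(reference)
    lx, ly = centroid(live)
    return (rx - lx, ry - ly)

# Pre-captured room corners vs. a live scan shifted by (-2, +1):
# the estimated correction is the inverse offset.
reference = [(0, 0), (4, 0), (4, 3), (0, 3)]
live = [(x - 2, y + 1) for x, y in reference]
print(estimate_translation(reference, live))  # → (2.0, -1.0)
```

The coarse room-level estimate matters because it restricts the reference cloud to one room, keeping this kind of comparison tractable and unambiguous.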
  • Patent number: 9881419
    Abstract: In one embodiment, augmented reality is facilitated by a special initialization user interface that shows multiple views of a three-dimensional (3-D) model and a representation of a physical environment captured by a camera, where each of the multiple views is updated substantially in real-time as changes are made in another of the multiple views. An approximate lateral position of the representation of the physical environment with respect to the 3-D model is received through user interaction with a first of the views. An approximate orientation of the representation of the physical environment with respect to the 3-D model is received through user interaction with a second of the views. Correspondence is established between a plurality of portions of the 3-D model and a plurality of portions of the representation of the physical environment to anchor the 3-D model to the representation of the physical environment.
    Type: Grant
    Filed: February 2, 2012
    Date of Patent: January 30, 2018
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Stéphane Poirier
  • Patent number: 9824490
    Abstract: In one embodiment, an augmented view is generated that accounts for dynamically changing terrain surface at a site. A sensor captures live georeferenced terrain surface topography for the site. A camera captures an image of the site. Further, a tracking system determines a georeferenced camera pose of the camera. An augmented reality application aligns a georeferenced three-dimensional (3-D) model for the site with the live georeferenced terrain surface topography. Then, using at least the captured image, the georeferenced camera pose, the georeferenced 3-D model and live georeferenced terrain surface topography, the augmented reality application creates an augmented view of the site that shows graphical representations of subsurface features. At least a portion of the graphical representations are dynamically conformed to the contours of the terrain surface in the image based on the live georeferenced terrain surface topography.
    Type: Grant
    Filed: June 8, 2015
    Date of Patent: November 21, 2017
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Ian Létourneau, Jade Marcoux-Ouellet
  • Patent number: 9761045
    Abstract: In one embodiment, a three-dimensional (3-D) model is aligned with a view of a physical environment captured by a camera. An augmented reality view is generated by superposing elements of the 3-D model and the view of the physical environment. At least some of the elements of the 3-D model are initially hidden within the augmented reality view. A virtual window is imposed that penetrates a surface within the augmented reality view. The virtual window clips the view of the physical environment and elements of the 3-D model that intersect the virtual window to reveal initially-hidden elements of the 3-D model located beyond the virtual window. The augmented reality view with the virtual window is displayed on a display device.
    Type: Grant
    Filed: July 18, 2013
    Date of Patent: September 12, 2017
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Robert Snyder, Renaud Gervais
  • Patent number: 9715008
    Abstract: In one embodiment, an augmented reality application generates an augmented reality view that displays three-dimensional (3-D) ground penetrating radar (GPR) data on boundary surfaces of a virtual excavation. The augmented reality application calculates an intersection of the one or more boundary surfaces of the virtual excavation and the 3-D GPR data, and extracts data items of the 3-D GPR data that intersect the one or more boundary surfaces of the virtual excavation. The augmented reality application then projects two-dimensional (2-D) images based on the extracted data items onto the one or more boundary surfaces of the virtual excavation to show subsurface features in the augmented reality view that can be manipulated by a user (e.g., moved, rotated, scaled, depth-adjusted, etc.).
    Type: Grant
    Filed: March 20, 2013
    Date of Patent: July 25, 2017
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Renaud Gervais
  • Patent number: 9646571
    Abstract: In one embodiment, an augmented view is provided utilizing a panorama. A panorama of the physical environment is captured with a panoramic camera. The panorama has a panoramic field of view and is embodied as a sequence of video frames. An initial pose of the panoramic camera is determined. The panoramic camera is tracked to update the initial pose to subsequent poses to account for movement of the panoramic camera about the physical environment. The tracking utilizes features dispersed across the panoramic field of view of video frames. The panorama is augmented by merging computer-generated elements with the panorama based on the updated pose of the panoramic camera. A sub-portion of the panorama, along with the computer-generated elements, is displayed as an augmented view. The displayed augmented view may have a field of view that is less than the panoramic field of view.
    Type: Grant
    Filed: June 4, 2013
    Date of Patent: May 9, 2017
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Marc Antoine Desbiens
  • Patent number: 9536351
    Abstract: In one embodiment, a panoramic camera captures a panorama of the physical environment visible from a position within the physical environment. An orientation sensor determines an orientation of a handheld device. Based on the orientation of the handheld device, a backend augmentation application selects a portion of the captured panorama visible in a corresponding orientation from the position to produce a view of the physical environment. The view of the physical environment shows physical features of the physical environment and at least a portion of the user's body. The backend augmentation application augments the view of the physical environment to merge computer-generated features with the view of the physical environment to create an augmented reality view. The augmented reality view shows a relationship between the portion of the user's body, the physical features, and the computer-generated features. The augmented reality view is displayed on the handheld device.
    Type: Grant
    Filed: February 3, 2014
    Date of Patent: January 3, 2017
    Assignee: Bentley Systems, Incorporated
    Inventor: Stéphane Côté
  • Patent number: 9460561
    Abstract: In one embodiment, a two-dimensional (2-D) drawing is shown in an augmented reality view on a display screen of an electronic device. A three-dimensional (3-D) model is imposed within a view of the physical structure captured by a camera. The 2-D drawing is also imposed within the view of the physical structure. A portion of the 2-D drawing whose details correspond to internal features of the physical structure may be shown by sliding the 2-D drawing from a particular position to a different position, or by displaying the 2-D drawing within context of the 3-D model, which is in turn displayed within context of the view of the physical structure.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: October 4, 2016
    Assignee: Bentley Systems, Incorporated
    Inventors: Stéphane Côté, Rob Snyder, Phillippe Trudel
  • Patent number: 9454908
    Abstract: A method and system for managing automatic guidance of an aircraft during a complete engine failure. Said system includes means for monitoring engines so as to be able to detect a complete failure of the engines; means for detecting whether the aircraft is in flight; means for detecting whether the aircraft is in a different guidance mode from a guidance mode configured to make the aircraft descend with a reduced engine thrust and a fixed speed; and control means for automatically bringing guidance means of the aircraft into a guidance configuration compatible with the situation associated with the complete failure of the engines.
    Type: Grant
    Filed: October 9, 2013
    Date of Patent: September 27, 2016
    Assignee: AIRBUS OPERATIONS S.A.S.
    Inventors: Marie-Claire Moune, Jean Muller, Stéphane Cote