Patents Assigned to SPEED TECHNOLOGY CO., LTD.
  • Patent number: 11854145
    Abstract: The present invention provides an octree-based level-of-detail (LOD) method for three-dimensional building models, specifically comprising the following steps: S1, reading the three-dimensional building model data; S2, setting the depth parameter Depth of the octree; S3, merging the leaf-node bounding boxes of the octree layer by layer to establish coarse grid blocks; S4, merging the coarse grid blocks of the components of each layer; S5, performing triangulation; S6, calculating the normal vector of each coarse grid block; S7, merging the simplified components to form a coarse-grid-block building model; S8, deleting internal vertices; S9, setting materials or textures; and S10, outputting the three-dimensional building LOD model.
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: December 26, 2023
    Assignee: SPEED TECHNOLOGY CO., LTD.
    Inventors: Jun Li, Jianyong Fan, Biliang Zhu, Zhongjian Xu
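    The octree partition behind steps S2–S3 can be sketched as follows. This is a minimal illustration assuming an axis-aligned bounding box and a point-sampled model, not the patented implementation; the function names and the NumPy representation are my own:

    ```python
    import numpy as np

    def octree_leaf_index(points, bbox_min, bbox_max, depth):
        """Map each 3D point to its octree leaf cell at the given depth.

        At depth d the bounding box splits into 2**d cells per axis;
        the returned (i, j, k) integer triple identifies the leaf cell.
        """
        n = 2 ** depth
        extent = bbox_max - bbox_min
        # Normalize into [0, 1] and scale to the cell grid.
        idx = np.floor((points - bbox_min) / extent * n).astype(int)
        return np.clip(idx, 0, n - 1)  # points on the max face stay in range

    def merge_leaf_bboxes(points, depth):
        """Group points by leaf cell and return one merged axis-aligned
        bounding box per occupied cell -- a 'coarse grid block' in the
        abstract's terms."""
        bbox_min = points.min(axis=0)
        bbox_max = points.max(axis=0)
        cells = octree_leaf_index(points, bbox_min, bbox_max, depth)
        blocks = {}
        for p, c in zip(points, map(tuple, cells)):
            lo, hi = blocks.get(c, (p, p))
            blocks[c] = (np.minimum(lo, p), np.maximum(hi, p))
        return blocks
    ```

    At Depth = 1 the model's bounding box is split into 8 octants, and each occupied octant yields one merged block; increasing Depth refines the blocks toward the original geometry.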
  • Patent number: 11798228
    Abstract: A method for updating road signs and markings on the basis of monocular images comprises the following steps: acquiring street images of urban roads, together with the GPS phase-center coordinates and spatial attitude data corresponding to each image; extracting the image coordinates of the road signs and markings; constructing a sparse three-dimensional model and generating a streetscape depth map from it; calculating the spatial position of each road sign and marking from the image's semantic and depth values, the collinearity equation, and the spatial distance relation; if the same road sign or marking is visible in multiple views, solving for its position from the combined observations; and vectorizing the resulting position information and fusing it into the original data to update the road sign data.
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: October 24, 2023
    Assignee: SPEED TECHNOLOGY CO., LTD.
    Inventors: Zhongjian Xu, Biliang Zhu, Jun Li
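    The position calculation at the core of the method above (collinearity relation plus per-pixel depth) can be sketched as a back-projection. The function names, the pinhole intrinsics K, and the simple averaging used for the multi-view case are illustrative assumptions, not the patented algorithm:

    ```python
    import numpy as np

    def backproject(pixel, depth, K, R, t):
        """Back-project an image point with known depth into world
        coordinates (one form of the collinearity equation).

        K: 3x3 camera intrinsics; R, t: world-to-camera rotation and
        translation, so p_cam = R @ p_world + t.
        """
        u, v = pixel
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
        p_cam = ray_cam * depth        # point in the camera frame (z = depth)
        return R.T @ (p_cam - t)       # transform back to the world frame

    def fuse_multiview(observations):
        """Fuse per-view estimates of the same sign's position.

        observations: list of (pixel, depth, K, R, t) tuples; returns the
        mean world-space position as one simple multi-view solution."""
        pts = [backproject(*obs) for obs in observations]
        return np.mean(pts, axis=0)
    ```

    With the camera poses coming from the GPS phase-center coordinates and attitude data, each view yields one world-space estimate, and the multi-view case reduces to combining those estimates.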