Patents by Inventor Dimitris Zermas
Dimitris Zermas has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11280608
Abstract: Systems and methods for providing an accurate determination of above ground level (AGL) altitude for use in precision agriculture. A 3D computer model of an agricultural field is generated by analyzing frames of a video captured by a camera on an unmanned aerial vehicle flown over the field, producing the 3D points that make up the model. The 3D computer model is then scaled by fusing each 3D point with navigation sensor data associated with each frame. An AGL altitude for each 3D point of the scaled 3D computer model can then be determined. In another embodiment, a trajectory of the unmanned aerial vehicle can be determined and used to generate a georeferenced map from images captured by the camera.
Type: Grant
Filed: December 11, 2019
Date of Patent: March 22, 2022
Assignee: Sentera, Inc.
Inventor: Dimitris Zermas
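The scaling step described above can be sketched as follows. A monocular 3D reconstruction is known only up to scale, so the per-frame navigation data is used to recover metric units; the function names and the median-ratio approach here are illustrative assumptions, not taken from the patent itself.

```python
# Sketch of scale recovery: compare distances between consecutive
# estimated camera positions (model units) against the distances
# between the matching navigation-sensor positions (meters).
# All names are hypothetical, not from the patent.

def _dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def estimate_scale(model_positions, gps_positions):
    """Return a meters-per-model-unit scale factor for the 3D model.

    model_positions: camera positions from the reconstruction.
    gps_positions:   matching positions from the navigation sensor.
    """
    ratios = []
    for i in range(1, len(model_positions)):
        dm = _dist(model_positions[i], model_positions[i - 1])  # model units
        dg = _dist(gps_positions[i], gps_positions[i - 1])      # meters
        if dm > 1e-9:
            ratios.append(dg / dm)
    ratios.sort()
    return ratios[len(ratios) // 2]  # median is robust to sensor outliers
```

Once every 3D point is multiplied by this factor, the AGL altitude of a point is simply its metric height above the reconstructed ground surface.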
-
Patent number: 11275941
Abstract: Systems, techniques, and devices for detecting plant biometrics, for example, plants in a crop field. An imaging device of an unmanned vehicle may be used to generate a plurality of images of the plants, and the plurality of images may be used to generate a 3D model of the plants. The 3D model may define locations and orientations of leaves and stems of plants. The 3D model may be used to determine at least one biometric parameter of at least one plant in the crop. Such detection of plant biometrics may facilitate the automation of crop monitoring and treatment.
Type: Grant
Filed: March 8, 2019
Date of Patent: March 15, 2022
Assignee: Regents of the University of Minnesota
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Mike Bazakos
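As a toy illustration of reading a biometric parameter off such a 3D model: the dict layout below (labeled stem and leaf points) and the specific parameters (plant height, leaf count) are assumptions for the sketch, not the patent's actual model representation.

```python
# Hypothetical 3D plant model: labeled 3D points for stems and leaves.
# Two simple biometric parameters derived from it.

def plant_height(model):
    """Vertical extent of the stem points, in the model's units.

    model: {'stem': [(x, y, z), ...], 'leaves': [...]}
    """
    zs = [p[2] for p in model['stem']]
    return max(zs) - min(zs)

def leaf_count(model):
    """Number of leaf instances recorded in the model."""
    return len(model['leaves'])
```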
-
Patent number: 11188752
Abstract: Systems, techniques, and devices for detecting plant biometrics, for example, plants in a crop field. An imaging device of an unmanned vehicle may be used to generate a plurality of images of the plants, and the plurality of images may be used to generate a 3D model of the plants. The 3D model may define locations and orientations of leaves and stems of plants. The 3D model may be used to determine at least one biometric parameter of at least one plant in the crop. Such detection of plant biometrics may facilitate the automation of crop monitoring and treatment.
Type: Grant
Filed: March 8, 2019
Date of Patent: November 30, 2021
Assignee: Regents of the University of Minnesota
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Mike Bazakos
-
Patent number: 11127165
Abstract: An aerial imaging system is disclosed. The aerial imaging system includes an aerial vehicle; a camera system mounted on the aerial vehicle; and an image storage medium. The camera system includes a lens and a plurality of image sensors. The plurality of image sensors includes a reference image sensor and a plurality of fixed wavelength sensors. The reference image sensor is configured to capture images including light at a plurality of wavelength ranges and to output a multilayer image. The plurality of fixed wavelength sensors are each configured to capture images including light at a selected wavelength range and to each output a single layer image. The image storage medium is configured to store the multilayer image and the plurality of single layer images.
Type: Grant
Filed: December 2, 2019
Date of Patent: September 21, 2021
Assignee: Sentera, Inc.
Inventor: Dimitris Zermas
-
Patent number: 10599926
Abstract: Pixel color values representing an image of a portion of a field are received where each pixel color value has a respective position within the image. A processor identifies groups of the received pixel color values as possibly representing a Nitrogen-deficient plant leaf. For each group of pixel color values, the processor converts the pixel color values into feature values that describe a shape and the processor uses the feature values describing the shape to determine whether the group of pixel color values represents a Nitrogen-deficient leaf of a plant. The processor stores in memory an indication that the portion of the field is deficient in Nitrogen based on the groups of pixel color values determined to represent a respective Nitrogen-deficient leaf.
Type: Grant
Filed: December 15, 2016
Date of Patent: March 24, 2020
Assignee: Regents of the University of Minnesota
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Michael Bazakos, Daniel Kaiser
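The group-then-classify pipeline in this abstract can be sketched roughly as below. The color threshold and the area-based shape test are hypothetical stand-ins; the patent describes richer shape feature values and a real classifier.

```python
# Illustrative pipeline: find connected groups of yellowish pixels
# (chlorosis is a visual symptom of Nitrogen deficiency), then apply
# a shape test to each group. Thresholds here are invented.

def yellowish(rgb):
    r, g, b = rgb
    return r > 150 and g > 150 and b < 100  # crude chlorosis color test

def find_groups(image):
    """Connected components of yellowish pixels (4-connectivity).

    image: 2D list of (r, g, b) tuples.
    """
    h, w = len(image), len(image[0])
    seen, groups = set(), []
    for y in range(h):
        for x in range(w):
            if (y, x) in seen or not yellowish(image[y][x]):
                continue
            stack, group = [(y, x)], []
            seen.add((y, x))
            while stack:  # iterative flood fill
                cy, cx = stack.pop()
                group.append((cy, cx))
                for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                               (cy, cx + 1), (cy, cx - 1)):
                    if 0 <= ny < h and 0 <= nx < w \
                            and (ny, nx) not in seen \
                            and yellowish(image[ny][nx]):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            groups.append(group)
    return groups

def looks_like_leaf(group, min_area=3):
    # Placeholder shape test: the patent converts each group into
    # feature values describing its shape before classification.
    return len(group) >= min_area
```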
-
Publication number: 20190278988
Abstract: Systems, techniques, and devices for detecting plant biometrics, for example, plants in a crop field. An imaging device of an unmanned vehicle may be used to generate a plurality of images of the plants, and the plurality of images may be used to generate a 3D model of the plants. The 3D model may define locations and orientations of leaves and stems of plants. The 3D model may be used to determine at least one biometric parameter of at least one plant in the crop. Such detection of plant biometrics may facilitate the automation of crop monitoring and treatment.
Type: Application
Filed: March 8, 2019
Publication date: September 12, 2019
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Mike Bazakos
-
Publication number: 20190274257
Abstract: Systems, techniques, and devices for detecting plant biometrics, for example, plants in a crop field. An imaging device of an unmanned vehicle may be used to generate a plurality of images of the plants, and the plurality of images may be used to generate a 3D model of the plants. The 3D model may define locations and orientations of leaves and stems of plants. The 3D model may be used to determine at least one biometric parameter of at least one plant in the crop. Such detection of plant biometrics may facilitate the automation of crop monitoring and treatment.
Type: Application
Filed: March 8, 2019
Publication date: September 12, 2019
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Mike Bazakos
-
Patent number: 10366310
Abstract: An illustrative example object detection system includes a camera having a field of view. The camera provides an output comprising information regarding potential objects within the field of view. A processor is configured to select a portion of the camera output based on information from at least one other type of detector that indicates a potential object in the selected portion. The processor determines an Objectness of the selected portion based on information in the camera output regarding the selected portion.
Type: Grant
Filed: August 18, 2017
Date of Patent: July 30, 2019
Assignee: Aptiv Technologies Limited
Inventors: Dimitris Zermas, Izzat H. Izzat, Anuradha Mangalgiri
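The select-then-score flow can be sketched like this: a hint from another detector (e.g. a radar return projected into the image) picks a region, and only that region is scored. The gradient-density score below is a hypothetical stand-in for the Objectness measure the patent describes.

```python
# Sketch: crop a region indicated by another detector, then score it.
# Grayscale image as a 2D list of intensities; all thresholds invented.

def crop(image, box):
    """Extract the region (x0, y0, x1, y1) from a grayscale image."""
    x0, y0, x1, y1 = box
    return [row[x0:x1] for row in image[y0:y1]]

def objectness(patch):
    """Fraction of horizontal pixel pairs with a strong intensity
    change; a crude proxy for how 'object-like' the patch is."""
    edges = total = 0
    for row in patch:
        for a, b in zip(row, row[1:]):
            total += 1
            edges += abs(a - b) > 30
    return edges / total if total else 0.0
```

Scoring only detector-indicated regions, rather than sliding a window over the full frame, is what keeps this kind of fusion cheap.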
-
Patent number: 10101746
Abstract: A road-model-definition system suitable for an automated-vehicle includes a lidar-unit and a controller. The lidar-unit is suitable to mount on a host-vehicle. The lidar-unit is used to provide a point-cloud descriptive of an area proximate to the host-vehicle. The controller is in communication with the lidar-unit. The controller is configured to: select ground-points from the point-cloud indicative of a travel-surface, tessellate a portion of the area that corresponds to the travel-surface to define a plurality of cells, determine an orientation of each cell based on the ground-points within each cell, define a road-model of the travel-surface based on the orientation of the cells, and operate the host-vehicle in accordance with the road-model.
Type: Grant
Filed: August 23, 2016
Date of Patent: October 16, 2018
Assignee: Delphi Technologies, Inc.
Inventors: Izzat H. Izzat, Dimitris Zermas, Uday Pitambare
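The tessellate-and-orient steps can be sketched as follows: ground points are binned into a regular grid of cells, and each cell's orientation is taken as the normal of a plane fitted to the points inside it. The cell size and the eigen-decomposition fitting method are illustrative choices, not details from the patent.

```python
import numpy as np

def cell_orientations(ground_points, cell_size=1.0):
    """Map (i, j) cell index -> unit normal of the best-fit plane.

    ground_points: iterable of (x, y, z) ground points from the lidar.
    """
    # Tessellation: bin points into square cells in the x-y plane.
    cells = {}
    for p in ground_points:
        key = (int(p[0] // cell_size), int(p[1] // cell_size))
        cells.setdefault(key, []).append(p)

    # Orientation: the plane normal is the eigenvector of the cell's
    # point covariance with the smallest eigenvalue.
    normals = {}
    for key, pts in cells.items():
        if len(pts) < 3:
            continue  # too few points to orient this cell
        P = np.asarray(pts, dtype=float)
        cov = np.cov((P - P.mean(axis=0)).T)
        w, v = np.linalg.eigh(cov)   # eigenvalues in ascending order
        n = v[:, 0]
        if n[2] < 0:
            n = -n                   # make normals point upward
        normals[key] = n
    return normals
```

The road model is then the collection of oriented cells; abrupt changes in neighboring normals indicate curbs or slope changes.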
-
Patent number: 10031231
Abstract: An object-detection system suitable for an automated vehicle includes a lidar and a controller. The lidar is used to detect a point-cloud that is organized into a plurality of scan-lines. The controller is in communication with the lidar. The controller is configured to classify each detected point in the point-cloud as a ground-point or a non-ground-point, define runs of non-ground-points, where each run is characterized by one or more instances of adjacent non-ground-points in a scan-line separated from a subsequent run of one or more non-ground-points by at least one instance of a ground-point, and define a cluster of non-ground-points associated with the object. The cluster is characterized by a first run from a first scan-line being associated with a second run from a second scan-line when a first point from the first run is displaced less than a distance-threshold from a second point from the second run.
Type: Grant
Filed: September 12, 2016
Date of Patent: July 24, 2018
Assignee: Delphi Technologies, Inc.
Inventors: Dimitris Zermas, Izzat H. Izzat, Anuradha Mangalgiri
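The run-based clustering described above can be sketched as follows. This is a simplified illustration: it links each run only to the first matching run of the previous scan-line, whereas a full implementation must also merge labels when one run touches several earlier runs.

```python
# Sketch of scan-line run clustering: non-ground points in each
# scan-line form runs (maximal stretches of adjacent non-ground
# points), and a run joins a cluster from the previous scan-line when
# some point of one run lies within a distance threshold of some
# point of the other.

def _euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def runs(scan_line):
    """Split one scan-line into runs of consecutive non-ground points.

    scan_line: list of (point, is_ground) pairs, in scan order.
    """
    out, cur = [], []
    for point, is_ground in scan_line:
        if is_ground:
            if cur:
                out.append(cur)  # a ground-point ends the current run
                cur = []
        else:
            cur.append(point)
    if cur:
        out.append(cur)
    return out

def cluster(scan_lines, dist_threshold):
    """Group non-ground points into clusters; returns lists of points."""
    clusters = []   # each cluster: list of points
    prev = []       # (cluster_index, run) pairs from the previous line
    for line in scan_lines:
        cur = []
        for run in runs(line):
            label = None
            for idx, prun in prev:
                if any(_euclid(a, b) < dist_threshold
                       for a in run for b in prun):
                    label = idx  # inherit the nearby run's cluster
                    break
            if label is None:
                label = len(clusters)  # start a new cluster
                clusters.append([])
            clusters[label].extend(run)
            cur.append((label, run))
        prev = cur
    return clusters
```

Processing scan-line by scan-line this way avoids a full pairwise distance search over the point-cloud, which is what makes the approach fast enough for on-vehicle use.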
-
Publication number: 20180075320
Abstract: An illustrative example object detection system includes a camera having a field of view. The camera provides an output comprising information regarding potential objects within the field of view. A processor is configured to select a portion of the camera output based on information from at least one other type of detector that indicates a potential object in the selected portion. The processor determines an Objectness of the selected portion based on information in the camera output regarding the selected portion.
Type: Application
Filed: August 18, 2017
Publication date: March 15, 2018
Inventors: Dimitris Zermas, Izzat H. Izzat, Anuradha Mangalgiri
-
Publication number: 20180074203
Abstract: An object-detection system suitable for an automated vehicle includes a lidar and a controller. The lidar is used to detect a point-cloud that is organized into a plurality of scan-lines. The controller is in communication with the lidar. The controller is configured to classify each detected point in the point-cloud as a ground-point or a non-ground-point, define runs of non-ground-points, where each run is characterized by one or more instances of adjacent non-ground-points in a scan-line separated from a subsequent run of one or more non-ground-points by at least one instance of a ground-point, and define a cluster of non-ground-points associated with the object. The cluster is characterized by a first run from a first scan-line being associated with a second run from a second scan-line when a first point from the first run is displaced less than a distance-threshold from a second point from the second run.
Type: Application
Filed: September 12, 2016
Publication date: March 15, 2018
Inventors: Dimitris Zermas, Izzat H. Izzat, Anuradha Mangalgiri
-
Publication number: 20180059666
Abstract: A road-model-definition system suitable for an automated-vehicle includes a lidar-unit and a controller. The lidar-unit is suitable to mount on a host-vehicle. The lidar-unit is used to provide a point-cloud descriptive of an area proximate to the host-vehicle. The controller is in communication with the lidar-unit. The controller is configured to: select ground-points from the point-cloud indicative of a travel-surface, tessellate a portion of the area that corresponds to the travel-surface to define a plurality of cells, determine an orientation of each cell based on the ground-points within each cell, define a road-model of the travel-surface based on the orientation of the cells, and operate the host-vehicle in accordance with the road-model.
Type: Application
Filed: August 23, 2016
Publication date: March 1, 2018
Inventors: Izzat H. Izzat, Dimitris Zermas, Uday Pitambare
-
Publication number: 20170177938
Abstract: Pixel color values representing an image of a portion of a field are received where each pixel color value has a respective position within the image. A processor identifies groups of the received pixel color values as possibly representing a Nitrogen-deficient plant leaf. For each group of pixel color values, the processor converts the pixel color values into feature values that describe a shape and the processor uses the feature values describing the shape to determine whether the group of pixel color values represents a Nitrogen-deficient leaf of a plant. The processor stores in memory an indication that the portion of the field is deficient in Nitrogen based on the groups of pixel color values determined to represent a respective Nitrogen-deficient leaf.
Type: Application
Filed: December 15, 2016
Publication date: June 22, 2017
Inventors: Nikolaos Papanikolopoulos, Vassilios Morellas, Dimitris Zermas, David Mulla, Michael Bazakos, Daniel Kaiser