Patents by Inventor Oscar Alejandro De Lellis

Oscar Alejandro De Lellis has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11612097
    Abstract: A method of managing crops using an electronic device having an interface. Inputs of crop data are received, and each item of crop data is associated with a sample site location corresponding to one of a plurality of images captured by an image capturing device. A graph plotting one or more types of crop data, including data associated with the plurality of images, is generated in a first display region of the interface. Based on a selection within one of the plots on the graph, a subset of sample site locations requiring one of a predetermined set of actions is displayed on a map in a second display region of the interface.
    Type: Grant
    Filed: December 10, 2019
    Date of Patent: March 28, 2023
    Assignee: Canon Kabushiki Kaisha
    Inventors: Julie Rae Kowald, Dixon De Sheng Deng, Nicholas Grant Fulton, Oscar Alejandro De Lellis
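    A minimal Python sketch of the graph-to-map selection flow described in this abstract; the CropSample data model, the measurement names, the value-range selection and the action rules are illustrative assumptions, not details taken from the patent.

        from dataclasses import dataclass

        @dataclass
        class CropSample:
            site_id: str
            location: tuple      # (latitude, longitude) of the sample site
            image_id: str        # image captured at the site
            measurements: dict   # e.g. {"nitrogen": 1.2}

        def graph_series(samples, data_type):
            """Values plotted in the first display region (the graph) for one crop-data type."""
            return [(s.site_id, s.measurements[data_type])
                    for s in samples if data_type in s.measurements]

        def sites_for_selection(samples, data_type, selected_range, action_rules):
            """Given a selection on the graph (a value range), return the sample sites to
            highlight on the map in the second display region, each paired with the
            predetermined action its value triggers."""
            lo, hi = selected_range
            flagged = []
            for s in samples:
                value = s.measurements.get(data_type)
                if value is None or not (lo <= value <= hi):
                    continue
                for action, predicate in action_rules.items():
                    if predicate(value):
                        flagged.append((s.location, action))
                        break
            return flagged

        samples = [
            CropSample("A1", (-33.87, 151.21), "img_001", {"nitrogen": 1.2}),
            CropSample("A2", (-33.88, 151.22), "img_002", {"nitrogen": 3.4}),
        ]
        rules = {"fertilise": lambda v: v < 2.0, "monitor": lambda v: v >= 2.0}
        print(graph_series(samples, "nitrogen"))
        print(sites_for_selection(samples, "nitrogen", (0.0, 2.0), rules))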
  • Patent number: 11354350
    Abstract: A method of browsing images on a user interface displaying a map. A selection of a geographical feature within the map on the user interface is received based on selection criteria comprising a set of predefined gesture rules. A plurality of images is selected based on the proximity of each of the images to the selected geographical feature. A dynamic browsing widget is generated on the user interface having dimensions proportional to dimensions of the selected geographical feature. The selected plurality of images is browsed using the generated dynamic browsing widget.
    Type: Grant
    Filed: November 10, 2020
    Date of Patent: June 7, 2022
    Assignee: Canon Kabushiki Kaisha
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
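    A minimal Python sketch of the proximity test and widget sizing described in this abstract; the great-circle distance, the 1 km radius and the 0.5 scale factor are illustrative assumptions rather than values from the patent.

        import math

        def distance_km(a, b):
            """Great-circle distance between two (lat, lon) points in kilometres."""
            lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
            h = (math.sin((lat2 - lat1) / 2) ** 2
                 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * math.asin(math.sqrt(h))

        def images_near_feature(images, feature_centre, radius_km=1.0):
            """Select the images whose capture location lies within radius_km of the feature."""
            return [img for img in images
                    if distance_km(img["location"], feature_centre) <= radius_km]

        def browsing_widget(feature_bbox, scale=0.5):
            """Widget dimensions proportional to the selected feature's on-screen bounding box."""
            (x0, y0), (x1, y1) = feature_bbox
            return {"width": abs(x1 - x0) * scale, "height": abs(y1 - y0) * scale}

        images = [
            {"id": "img_101", "location": (-33.8568, 151.2153)},  # near the selected feature
            {"id": "img_102", "location": (-34.4278, 150.8931)},  # too far away
        ]
        print(images_near_feature(images, feature_centre=(-33.8572, 151.2150)))
        print(browsing_widget(((120, 80), (360, 240))))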
  • Patent number: 11138258
    Abstract: A system and method of grouping images captured using an image capture device. The method comprises receiving a plurality of images, each of the plurality of images having associated camera settings; and determining an inertial profile for the plurality of images based on acceleration data of the image capture device and an imaging entity over a pre-determined length of time before and after capture of each of the plurality of images. The method further comprises forming image groups from the received plurality of images based on the determined inertial profile and the associated camera settings.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: October 5, 2021
    Assignee: Canon Kabushiki Kaisha
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
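    A minimal Python sketch of grouping images by an inertial profile plus camera settings, as described in this abstract; the window length, the similarity threshold and the use of mean acceleration magnitude as the profile are illustrative assumptions.

        def inertial_profile(accel_samples, capture_time, window=1.0):
            """Mean acceleration magnitude in the window before and after capture."""
            in_window = [mag for t, mag in accel_samples if abs(t - capture_time) <= window]
            return sum(in_window) / len(in_window) if in_window else 0.0

        def group_images(images, accel_samples, profile_tolerance=0.5):
            """Start a new group whenever the inertial profile jumps or the settings change."""
            groups, current = [], []
            prev_profile, prev_settings = None, None
            for img in sorted(images, key=lambda i: i["capture_time"]):
                profile = inertial_profile(accel_samples, img["capture_time"])
                same_motion = prev_profile is not None and abs(profile - prev_profile) <= profile_tolerance
                same_settings = img["settings"] == prev_settings
                if current and same_motion and same_settings:
                    current.append(img["id"])
                else:
                    if current:
                        groups.append(current)
                    current = [img["id"]]
                prev_profile, prev_settings = profile, img["settings"]
            if current:
                groups.append(current)
            return groups

        accel = [(0.5, 0.1), (1.0, 0.1), (5.0, 2.3), (5.5, 2.4)]   # (time, |acceleration|)
        imgs = [
            {"id": "a", "capture_time": 1.0, "settings": {"iso": 100}},
            {"id": "b", "capture_time": 1.4, "settings": {"iso": 100}},
            {"id": "c", "capture_time": 5.2, "settings": {"iso": 400}},
        ]
        print(group_images(imgs, accel))   # [['a', 'b'], ['c']]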
  • Publication number: 20210081447
    Abstract: A method of browsing images on a user interface displaying a map. A selection of a geographical feature within the map on the user interface is received based on selection criteria comprising a set of predefined gesture rules. A plurality of images is selected based on the proximity of each of the images to the selected geographical feature. A dynamic browsing widget is generated on the user interface having dimensions proportional to dimensions of the selected geographical feature. The selected plurality of images is browsed using the generated dynamic browsing widget.
    Type: Application
    Filed: November 10, 2020
    Publication date: March 18, 2021
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
  • Patent number: 10866986
    Abstract: A method of browsing images on a user interface displaying a map. A selection of a geographical feature within the map on the user interface is received based on selection criteria comprising a set of predefined gesture rules. A plurality of images is selected based on the proximity of each of the images to the selected geographical feature. A dynamic browsing widget is generated on the user interface having dimensions proportional to dimensions of the selected geographical feature. The selected plurality of images is browsed using the generated dynamic browsing widget.
    Type: Grant
    Filed: October 9, 2018
    Date of Patent: December 15, 2020
    Assignee: Canon Kabushiki Kaisha
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
  • Publication number: 20200196516
    Abstract: A method of managing crops using an electronic device having an interface. Inputs of crop data are received, and each item of crop data is associated with a sample site location corresponding to one of a plurality of images captured by an image capturing device. A graph plotting one or more types of crop data, including data associated with the plurality of images, is generated in a first display region of the interface. Based on a selection within one of the plots on the graph, a subset of sample site locations requiring one of a predetermined set of actions is displayed on a map in a second display region of the interface.
    Type: Application
    Filed: December 10, 2019
    Publication date: June 25, 2020
    Inventors: Julie Rae Kowald, Dixon De Sheng Deng, Nicholas Grant Fulton, Oscar Alejandro De Lellis
  • Publication number: 20190188223
    Abstract: A system and method of grouping images captured using an image capture device. The method comprises receiving a plurality of images, each of the plurality of images having associated camera settings; and determining an inertial profile for the plurality of images based on acceleration data of the image capture device and an imaging entity over a pre-determined length of time before and after capture of each of the plurality of images. The method further comprises forming image groups from the received plurality of images based on the determined inertial profile and the associated camera settings.
    Type: Application
    Filed: December 14, 2018
    Publication date: June 20, 2019
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
  • Publication number: 20190121878
    Abstract: A method of browsing images on a user interface displaying a map. A selection of a geographical feature within the map on the user interface is received based on selection criteria comprising a set of predefined gesture rules. A plurality of images is selected based on the proximity of each of the images to the selected geographical feature. A dynamic browsing widget is generated on the user interface having dimensions proportional to dimensions of the selected geographical feature. The selected plurality of images is browsed using the generated dynamic browsing widget.
    Type: Application
    Filed: October 9, 2018
    Publication date: April 25, 2019
    Inventors: Dixon De Sheng Deng, Julie Rae Kowald, Nicholas Grant Fulton, Oscar Alejandro De Lellis
  • Patent number: 9721391
    Abstract: A method of displaying augmented reality content on a physical surface is disclosed. A surface complexity measure is determined for the physical surface from a captured image of the physical surface. A content complexity measure is determined for the augmented reality content to be applied to the physical surface. The content complexity measure represents an amount of fine detail in the augmented reality content. The method determines if the amount of fine detail in the augmented reality content is to be modified, based on a function of the surface complexity measure and said content complexity measure. A display attribute of the augmented reality content is adjusted to modify the fine detail in the augmented reality content. The modified augmented reality content is displayed on the physical surface.
    Type: Grant
    Filed: May 11, 2015
    Date of Patent: August 1, 2017
    Assignee: Canon Kabushiki Kaisha
    Inventors: David Robert James Monaghan, Belinda Margaret Yee, Oscar Alejandro De Lellis
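    A minimal Python sketch of the complexity comparison described in this abstract; edge density as the complexity measure, the combined-complexity threshold and the box blur as the adjusted display attribute are illustrative assumptions standing in for the patent's unspecified measures.

        def edge_density(gray, threshold=30):
            """Fraction of pixels whose horizontal or vertical gradient exceeds a threshold."""
            h, w = len(gray), len(gray[0])
            edges = 0
            for y in range(h - 1):
                for x in range(w - 1):
                    if (abs(gray[y][x + 1] - gray[y][x]) > threshold
                            or abs(gray[y + 1][x] - gray[y][x]) > threshold):
                        edges += 1
            return edges / ((h - 1) * (w - 1))

        def adjust_content(surface_img, content_img, max_combined=0.6):
            """Reduce fine detail in the AR content when the physical surface is already busy."""
            surface_complexity = edge_density(surface_img)
            content_complexity = edge_density(content_img)
            if surface_complexity + content_complexity <= max_combined:
                return content_img                      # no modification needed
            # Simple box blur as the adjusted display attribute removing fine detail.
            h, w = len(content_img), len(content_img[0])
            blurred = [[0] * w for _ in range(h)]
            for y in range(h):
                for x in range(w):
                    ys, xs = min(y + 1, h - 1), min(x + 1, w - 1)
                    blurred[y][x] = (content_img[y][x] + content_img[y][xs]
                                     + content_img[ys][x] + content_img[ys][xs]) // 4
            return blurred

        surface = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]   # busy patch on the wall
        content = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]     # fine checker detail
        print(adjust_content(surface, content))                 # returns the blurred content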
  • Patent number: 9633479
    Abstract: A method of displaying virtual content on an augmented reality device (101) is disclosed. The virtual content is associated with a scene. An image of a scene captured using the augmented reality device (101) is received. A viewing time of the scene is determined, according to a relative motion between the augmented reality device and the scene. Virtual content is selected, from a predetermined range of virtual content, based on the determined viewing time. The virtual content is displayed on the augmented reality device (101) together with the image of the scene.
    Type: Grant
    Filed: December 22, 2014
    Date of Patent: April 25, 2017
    Assignee: Canon Kabushiki Kaisha
    Inventors: Matthew John Grasso, Belinda Margaret Yee, David Robert James Monaghan, Oscar Alejandro De Lellis, Rajanish Calisa
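    A minimal Python sketch of viewing-time-based content selection as described in this abstract; the speed-based viewing-time estimate and the content tiers are illustrative assumptions.

        def estimate_viewing_time(scene_width_m, relative_speed_mps):
            """Rough time the scene stays in view, given the relative motion between
            the augmented reality device and the scene (infinite if there is none)."""
            if relative_speed_mps <= 0:
                return float("inf")
            return scene_width_m / relative_speed_mps

        def select_virtual_content(viewing_time_s, content_by_min_time):
            """Pick the richest content whose minimum required viewing time still fits."""
            best = None
            for min_time, content in sorted(content_by_min_time.items()):
                if viewing_time_s >= min_time:
                    best = content
            return best

        tiers = {0.5: "icon only", 3.0: "short caption", 10.0: "full description with media"}
        print(select_virtual_content(estimate_viewing_time(20.0, 5.0), tiers))   # "short caption"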
  • Publication number: 20150332507
    Abstract: A method of displaying augmented reality content on a physical surface is disclosed. A surface complexity measure is determined for the physical surface from a captured image of the physical surface. A content complexity measure is determined for the augmented reality content to be applied to the physical surface. The content complexity measure represents an amount of fine detail in the augmented reality content. The method determines if the amount of fine detail in the augmented reality content is to be modified, based on a function of the surface complexity measure and said content complexity measure. A display attribute of the augmented reality content is adjusted to modify the fine detail in the augmented reality content. The modified augmented reality content is displayed on the physical surface.
    Type: Application
    Filed: May 11, 2015
    Publication date: November 19, 2015
    Inventors: David Robert James Monaghan, Belinda Margaret Yee, Oscar Alejandro De Lellis
  • Publication number: 20150206353
    Abstract: A method of displaying virtual content on an augmented reality device (101) is disclosed. The virtual content is associated with a scene. An image of a scene captured using the augmented reality device (101) is received. A viewing time of the scene is determined, according to a relative motion between the augmented reality device and the scene. Virtual content is selected, from a predetermined range of virtual content, based on the determined viewing time. The virtual content is displayed on the augmented reality device (101) together with the image of the scene.
    Type: Application
    Filed: December 22, 2014
    Publication date: July 23, 2015
    Inventors: Matthew John Grasso, Belinda Margaret Yee, David Robert James Monaghan, Oscar Alejandro De Lellis, Rajanish Calisa