Patents by Inventor Pietro Perona

Pietro Perona has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9355167
    Abstract: Systems and methods for the crowdsourced clustering of data items in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a method for determining categories for a set of source data includes obtaining a set of source data, determining a plurality of subsets of the source data, where a subset of the source data includes a plurality of pieces of source data in the set of source data, generating a set of pairwise annotations for the pieces of source data in each subset of source data, clustering the set of source data into related subsets of source data based on the sets of pairwise annotations for each subset of source data, and identifying a category for each related subset of source data based on the clusterings of source data and the source data metadata for the pieces of source data in the group of source data.
    Type: Grant
    Filed: May 17, 2013
    Date of Patent: May 31, 2016
    Assignee: California Institute of Technology
    Inventors: Ryan Gomes, Peter Welinder, Andreas Krause, Pietro Perona
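The abstract above turns crowdsourced pairwise comparisons into clusters. As a rough illustration only, and not the patented algorithm, the sketch below aggregates workers' "same category?" votes over item pairs and links items whose agreement clears a threshold; the function name, vote format, and 0.5 threshold are assumptions made for this example.

```python
# Hedged illustration (not the patented method): aggregate crowdsourced
# pairwise "same category?" votes into clusters with a simple union-find.
from collections import defaultdict

def cluster_from_pairwise_votes(n_items, votes, threshold=0.5):
    """votes: list of (i, j, same) tuples, where `same` is a worker's
    True/False answer to "do items i and j belong to the same category?"."""
    tally = defaultdict(lambda: [0, 0])            # (i, j) -> [same_count, total]
    for i, j, same in votes:
        key = (min(i, j), max(i, j))
        tally[key][0] += int(same)
        tally[key][1] += 1

    parent = list(range(n_items))                  # union-find over items
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for (i, j), (same_count, total) in tally.items():
        if same_count / total > threshold:         # majority says "same category"
            parent[find(i)] = find(j)

    clusters = defaultdict(list)
    for item in range(n_items):
        clusters[find(item)].append(item)
    return list(clusters.values())

# Example: 4 items, noisy votes from several annotators.
votes = [(0, 1, True), (0, 1, True), (0, 1, False),
         (2, 3, True), (1, 2, False), (0, 3, False)]
print(cluster_from_pairwise_votes(4, votes))       # e.g. [[0, 1], [2, 3]]
```

The abstract leaves the actual clustering criterion and the use of source data metadata to the claimed method; the majority threshold here is an arbitrary stand-in.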
  • Patent number: 9355360
    Abstract: Systems and methods for determining annotator performance in the distributed annotation of source data in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a method for clustering annotators includes obtaining a set of source data, determining a training data set representative of the set of source data, obtaining sets of annotations from a set of annotators for a portion of the training data set, for each annotator determining annotator recall metadata based on the set of annotations provided by the annotator for the training data set and determining annotator precision metadata based on the set of annotations provided by the annotator for the training data set, and grouping the annotators into annotator groups based on the annotator recall metadata and the annotator precision metadata.
    Type: Grant
    Filed: June 19, 2013
    Date of Patent: May 31, 2016
    Assignee: California Institute of Technology
    Inventors: Peter Welinder, Pietro Perona
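For the annotator-grouping idea in the entry above, here is a minimal, hedged sketch: it scores each annotator's recall and precision against training-set ground truth and sorts annotators into rough tiers. The tier names and the 0.8 cutoffs are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch: per-annotator recall/precision against training ground
# truth, followed by a crude grouping into performance tiers.

def recall_precision(annotations, ground_truth):
    """annotations / ground_truth: sets of item ids marked positive by the
    annotator and by the ground truth, respectively."""
    tp = len(annotations & ground_truth)
    recall = tp / len(ground_truth) if ground_truth else 0.0
    precision = tp / len(annotations) if annotations else 0.0
    return recall, precision

def group_annotators(per_annotator, r_cut=0.8, p_cut=0.8):
    groups = {"reliable": [], "over-labels": [], "under-labels": [], "noisy": []}
    for name, (recall, precision) in per_annotator.items():
        if recall >= r_cut and precision >= p_cut:
            groups["reliable"].append(name)
        elif recall >= r_cut:
            groups["over-labels"].append(name)     # finds most positives, many false alarms
        elif precision >= p_cut:
            groups["under-labels"].append(name)    # precise but misses positives
        else:
            groups["noisy"].append(name)
    return groups

ground_truth = {1, 2, 3, 4, 5}
per_annotator = {
    "a1": recall_precision({1, 2, 3, 4, 5, 9}, ground_truth),
    "a2": recall_precision({1, 2}, ground_truth),
    "a3": recall_precision({6, 7, 8}, ground_truth),
}
print(group_annotators(per_annotator))
```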
  • Patent number: 9141878
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Grant
    Filed: August 29, 2013
    Date of Patent: September 22, 2015
    Assignees: AOL Inc., California Institute of Technology, Isis Innovation Limited
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
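The entry above ranks faces by human-perceived similarity rather than raw feature distance. As a loose illustration only, assuming face feature vectors and human "similar / not similar" pair judgments are available (none of this comes from the patent), one could fit a small logistic model over per-feature differences and rank candidates by its output:

```python
# Hedged sketch, not the patented system: learn which feature differences
# matter to human judgments of facial similarity with a tiny logistic
# regression, then rank candidate faces against a query face.
import numpy as np

def fit_similarity_weights(diffs, labels, lr=0.1, steps=2000):
    """diffs: (n_pairs, n_features) absolute feature differences per pair.
    labels: 1.0 if humans judged the pair similar, else 0.0."""
    w = np.zeros(diffs.shape[1]); b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(diffs @ w + b)))   # predicted P(judged similar)
        w -= lr * (diffs.T @ (p - labels)) / len(labels)
        b -= lr * np.mean(p - labels)
    return w, b

def rank_by_similarity(query, candidates, w, b):
    diffs = np.abs(candidates - query)
    scores = 1.0 / (1.0 + np.exp(-(diffs @ w + b)))
    return np.argsort(-scores)                        # most similar first

# Toy data: 2 features; humans judge pairs with small feature-0 gaps similar.
rng = np.random.default_rng(0)
pair_diffs = rng.uniform(0, 1, size=(200, 2))
labels = (pair_diffs[:, 0] < 0.3).astype(float)
w, b = fit_similarity_weights(pair_diffs, labels)
query = np.array([0.5, 0.5])
candidates = rng.uniform(0, 1, size=(5, 2))
print(rank_by_similarity(query, candidates, w, b))
```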
  • Patent number: 9092458
    Abstract: A system and method of sorting graphics files with a graphic server and image search engine are disclosed. The method preferably comprises the steps of: receiving a plurality of search results, where each of the search results comprises one or more associated graphics; using a general purpose computer to identify one or more groups of said graphics that depict or otherwise possess similar visual features; and returning the plurality of search results to a user in accordance with said identified groups. The preferred embodiment effectively clusters graphics files, particularly image files, based upon shared scale-invariant features, enabling the user to readily identify collections of images relevant to the user while skipping past collections of less relevant images.
    Type: Grant
    Filed: September 13, 2012
    Date of Patent: July 28, 2015
    Assignee: iRobot Corporation
    Inventors: Pietro Perona, Luis Goncalves, Enrico Di Bernardo
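The abstract above groups image search results by shared scale-invariant features. A minimal sketch, assuming OpenCV's SIFT implementation (available in recent OpenCV builds) as a stand-in for whatever features the claimed method uses, might group results whose descriptors match strongly:

```python
# Hedged sketch, not the patented method: group image search results whose
# local scale-invariant features overlap strongly.
import cv2
from itertools import combinations
from collections import defaultdict

def shared_feature_groups(paths, min_matches=40):
    sift = cv2.SIFT_create()
    descriptors = {}
    for path in paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if img is None:
            continue                                   # skip unreadable files
        _, des = sift.detectAndCompute(img, None)
        if des is not None:
            descriptors[path] = des

    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    parent = {p: p for p in descriptors}               # union-find over images
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in combinations(descriptors, 2):
        if len(matcher.match(descriptors[a], descriptors[b])) >= min_matches:
            parent[find(a)] = find(b)                  # many shared features -> same group

    groups = defaultdict(list)
    for path in descriptors:
        groups[find(path)].append(path)
    return list(groups.values())

# Usage (file names are hypothetical):
# print(shared_feature_groups(["result1.jpg", "result2.jpg", "result3.jpg"]))
```

The min_matches threshold and brute-force pairwise matching are purely illustrative choices; the abstract does not specify how groups are formed.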
  • Publication number: 20140289246
    Abstract: Systems and methods for the crowdsourced clustering of data items in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a method for determining categories for a set of source data includes obtaining a set of source data, determining a plurality of subsets of the source data, where a subset of the source data includes a plurality of pieces of source data in the set of source data, generating a set of pairwise annotations for the pieces of source data in each subset of source data, clustering the set of source data into related subsets of source data based on the sets of pairwise annotations for each subset of source data, and identifying a category for each related subset of source data based on the clusterings of source data and the source data metadata for the pieces of source data in the group of source data.
    Type: Application
    Filed: May 17, 2013
    Publication date: September 25, 2014
    Applicant: California Institute of Technology
    Inventors: Ryan Gomes, Peter Welinder, Andreas Krause, Pietro Perona
  • Publication number: 20140188879
    Abstract: Systems and methods for distributed data annotation in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a distributed data annotation server system includes a storage device configured to store source data, one or more annotators, and annotation tasks, and a processor, wherein a distributed data annotation application configures the processor to receive source data including one or more pieces of source data, select one or more annotators, create one or more annotation tasks for the selected annotators and source data, request one or more annotations for the source data using the annotation tasks, receive annotations, determine source data metadata for at least one piece of source data using the received annotations, generate annotator metadata for at least one annotator using the received annotations and the source data, and estimate the ground truth for the source data using the source data metadata and the annotator metadata.
    Type: Application
    Filed: March 6, 2014
    Publication date: July 3, 2014
    Applicant: California Institute of Technology
    Inventors: Peter Welinder, Pietro Perona
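The distributed-annotation abstract above ends with estimating ground truth from source data metadata and annotator metadata. A deliberately simplified stand-in, and not the patented estimator, is an accuracy-weighted vote in which each annotator's measured accuracy plays the role of the annotator metadata; the data layout below is invented for this example.

```python
# Hedged simplification: estimate each item's label by an accuracy-weighted
# vote over the labels its annotators provided.
from collections import defaultdict

def estimate_ground_truth(annotations, annotator_accuracy):
    """annotations: list of (item_id, annotator_id, label);
    annotator_accuracy: annotator_id -> accuracy in [0, 1] measured elsewhere."""
    scores = defaultdict(lambda: defaultdict(float))   # item -> label -> total weight
    for item, annotator, label in annotations:
        scores[item][label] += annotator_accuracy.get(annotator, 0.5)
    return {item: max(labels, key=labels.get) for item, labels in scores.items()}

annotations = [
    ("img1", "a1", "cat"), ("img1", "a2", "dog"), ("img1", "a3", "cat"),
    ("img2", "a1", "dog"), ("img2", "a2", "dog"),
]
accuracy = {"a1": 0.9, "a2": 0.6, "a3": 0.7}
print(estimate_ground_truth(annotations, accuracy))
# {'img1': 'cat', 'img2': 'dog'}
```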
  • Patent number: 8706729
    Abstract: Systems and methods for distributed data annotation in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a distributed data annotation server system includes a storage device configured to store source data, one or more annotators, and annotation tasks, and a processor, wherein a distributed data annotation application configures the processor to receive source data including one or more pieces of source data, select one or more annotators, create one or more annotation tasks for the selected annotators and source data, request one or more annotations for the source data using the annotation tasks, receive annotations, determine source data metadata for at least one piece of source data using the received annotations, generate annotator metadata for at least one annotator using the received annotations and the source data, and estimate the ground truth for the source data using the source data metadata and the annotator metadata.
    Type: Grant
    Filed: October 12, 2012
    Date of Patent: April 22, 2014
    Assignee: California Institute of Technology
    Inventors: Peter Welinder, Pietro Perona
  • Publication number: 20140050374
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Application
    Filed: October 23, 2013
    Publication date: February 20, 2014
    Applicant: AOL Inc.
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona
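The entry above folds explicit and inferred user feedback back into the face search results. One hedged way to picture that loop, assuming faces are represented as embedding vectors (an assumption, not something the abstract specifies), is a Rocchio-style update of the query vector followed by re-ranking:

```python
# Hedged sketch, not the patented system: user feedback nudges the query
# embedding, and the gallery is re-ranked by cosine similarity.
import numpy as np

def cosine_rank(query, gallery):
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return np.argsort(-(g @ q))                        # indices, most similar first

def apply_feedback(query, gallery, liked, disliked, alpha=1.0, beta=0.6, gamma=0.3):
    pos = gallery[liked].mean(axis=0) if liked else 0.0
    neg = gallery[disliked].mean(axis=0) if disliked else 0.0
    return alpha * query + beta * pos - gamma * neg    # Rocchio-style update

rng = np.random.default_rng(1)
gallery = rng.normal(size=(10, 128))                   # stand-in face embeddings
query = rng.normal(size=128)
first_pass = cosine_rank(query, gallery)
# Suppose the user marks the top result relevant and the second irrelevant.
updated = apply_feedback(query, gallery,
                         liked=[first_pass[0]], disliked=[first_pass[1]])
print(cosine_rank(updated, gallery)[:5])
```

The alpha/beta/gamma weights are the classic Rocchio defaults-style choice, not parameters from the patent.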
  • Publication number: 20130346409
    Abstract: Systems and methods for determining annotator performance in the distributed annotation of source data in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a method for clustering annotators includes obtaining a set of source data, determining a training data set representative of the set of source data, obtaining sets of annotations from a set of annotators for a portion of the training data set, for each annotator determining annotator recall metadata based on the set of annotations provided by the annotator for the training data set and determining annotator precision metadata based on the set of annotations provided by the annotator for the training data set, and grouping the annotators into annotator groups based on the annotator recall metadata and the annotator precision metadata.
    Type: Application
    Filed: June 19, 2013
    Publication date: December 26, 2013
    Inventors: Peter Welinder, Pietro Perona
  • Publication number: 20130346356
    Abstract: Systems and methods for the annotation of source data using confidence labels in accordance with embodiments of the invention are disclosed. In one embodiment of the invention, a method for determining confidence labels for crowdsourced annotations includes obtaining a set of source data, obtaining a set of training data representative of the set of source data, determining the ground truth for each piece of training data, obtaining a set of training data annotations including a confidence label, measuring annotator accuracy data for at least one piece of training data, and automatically generating a set of confidence labels for the set of unlabeled data based on the measured annotator accuracy data and the set of annotator labels used.
    Type: Application
    Filed: June 12, 2013
    Publication date: December 26, 2013
    Inventors: Peter Welinder, Pietro Perona
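For the confidence-label idea in the entry above, a minimal sketch (assuming discrete self-reported confidence levels and a small ground-truthed training set; the data layout is invented for this example, not taken from the patent) is to measure how often each annotator's stated confidence level is actually correct and attach that empirical rate to their labels on unlabeled data:

```python
# Hedged sketch, not the patented procedure: calibrate self-reported
# confidence levels against training-set accuracy, then attach the
# calibrated values to annotations on unlabeled data.
from collections import defaultdict

def calibrate(training_annotations, ground_truth):
    """training_annotations: (item, annotator, label, stated_confidence);
    returns (annotator, stated_confidence) -> empirical accuracy."""
    hits = defaultdict(lambda: [0, 0])
    for item, annotator, label, conf in training_annotations:
        key = (annotator, conf)
        hits[key][0] += int(label == ground_truth[item])
        hits[key][1] += 1
    return {key: correct / total for key, (correct, total) in hits.items()}

def confidence_labels(unlabeled_annotations, calibration, default=0.5):
    """Attach a calibrated confidence to each annotation on unlabeled data."""
    return [(item, label, calibration.get((annotator, conf), default))
            for item, annotator, label, conf in unlabeled_annotations]

ground_truth = {"t1": "bird", "t2": "bird", "t3": "plane"}
train = [("t1", "a1", "bird", "high"), ("t2", "a1", "plane", "high"),
         ("t3", "a1", "plane", "low"), ("t1", "a2", "bird", "high")]
calibration = calibrate(train, ground_truth)
print(confidence_labels([("u1", "a1", "bird", "high")], calibration))
# [('u1', 'bird', 0.5)]  -- a1's "high"-confidence labels were right half the time
```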
  • Patent number: 8542888
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Grant
    Filed: July 16, 2012
    Date of Patent: September 24, 2013
    Assignee: AOL Inc.
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
  • Publication number: 20130028522
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Application
    Filed: June 4, 2012
    Publication date: January 31, 2013
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona
  • Publication number: 20120281910
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Application
    Filed: July 16, 2012
    Publication date: November 8, 2012
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
  • Patent number: 8233679
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Grant
    Filed: February 10, 2011
    Date of Patent: July 31, 2012
    Assignee: AOL Inc.
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
  • Patent number: 8194939
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Grant
    Filed: August 3, 2010
    Date of Patent: June 5, 2012
    Assignee: AOL Inc.
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona
  • Publication number: 20110129145
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Application
    Filed: February 10, 2011
    Publication date: June 2, 2011
    Applicant: AOL Inc.
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
  • Publication number: 20110085710
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Application
    Filed: August 3, 2010
    Publication date: April 14, 2011
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona
  • Patent number: 7907755
    Abstract: Similar faces may be determined within images based on human perception of facial similarity. The user may provide an image including a query face to which the user wishes to find faces that are similar. Similar faces may be determined based on similarity information. Similarity information may be generated from information related to a human perception of facial similarity. Images that include faces determined to be similar, based on the similarity information, may be provided to the user as search result images. The user then may provide feedback to indicate the user's perception of similarity between the query face and the search result images.
    Type: Grant
    Filed: May 10, 2006
    Date of Patent: March 15, 2011
    Assignee: AOL Inc.
    Inventors: Sharon M. Perlmutter, Keren O. Perlmutter, Joshua Alspector, Alex Holub, Mark Everingham, Pietro Perona, Andrew Zisserman
  • Patent number: 7783085
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Grant
    Filed: May 10, 2006
    Date of Patent: August 24, 2010
    Assignee: AOL Inc.
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona
  • Publication number: 20090034805
    Abstract: Images are searched to locate faces that are the same as a query face. Images that include a face that is the same as the query face may be presented to a user as search result images. Images also may be sorted by the faces included in the images and presented to the user as sorted search result images. The user may provide explicit or implicit feedback regarding the search result images. Additional feedback may be inferred regarding the search result images based on the user-provided feedback, and the results may be updated based on the user-provided and inferred feedback.
    Type: Application
    Filed: May 10, 2006
    Publication date: February 5, 2009
    Applicant: AOL LLC
    Inventors: Keren O. Perlmutter, Sharon M. Perlmutter, Joshua Alspector, Mark Everingham, Alex Holub, Andrew Zisserman, Pietro Perona