Patents by Inventor Luke St. Clair

Luke St. Clair has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170103058
    Abstract: Particular embodiments may store, at a client device, information associated with nodes and edges of a social graph. A node may comprise a user node or a concept node. Each node may be connected by edges to other nodes of the social graph. A first user may be associated with a first user node of the social graph. The client device may receive a character string from the first user, and identify, as the first user inputs the character string, an edge-type based on the character string and one or more edges of the identified edge-type, wherein the edges are locally stored on the client device. The client device may display one or more predictive typeahead results based on the identified edge-type and the identified edges. The predictive typeahead results may correspond to nodes stored locally on the client device.
    Type: Application
    Filed: December 20, 2016
    Publication date: April 13, 2017
    Inventors: Shaheen Ashok Gandhi, Jasper Reid Hauser, Luke St. Clair, David Harry Garcia, Jenny Yuen
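
    The application above (20170103058) describes resolving typeahead suggestions from a social graph cached on the client rather than from a server. A minimal sketch of that idea in Python follows; the class names, the edge-type prefix matching, and the data layout are assumptions made for illustration, not details taken from the filing.

    ```python
    # Hypothetical sketch: client-side typeahead over a locally cached social graph.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Node:
        node_id: int
        name: str          # a user's or concept's display name

    @dataclass(frozen=True)
    class Edge:
        edge_type: str     # e.g. "friend", "likes", "checked_in"
        src: int
        dst: int

    class LocalTypeahead:
        def __init__(self, nodes, edges):
            self.nodes = {n.node_id: n for n in nodes}   # locally stored nodes
            self.edges = edges                           # locally stored edges

        def suggest(self, user_id: int, typed: str, limit: int = 5):
            typed = typed.lower()
            # 1. Infer an edge-type from the partial string (simple prefix match here).
            edge_type = next((t for t in {e.edge_type for e in self.edges}
                              if t.startswith(typed)), None)
            if edge_type is None:
                return []
            # 2. Collect locally stored edges of that type that originate at the user.
            matches = [e for e in self.edges
                       if e.edge_type == edge_type and e.src == user_id]
            # 3. Map edges back to locally cached nodes for display.
            return [self.nodes[e.dst].name for e in matches[:limit]]

    if __name__ == "__main__":
        nodes = [Node(1, "Alice"), Node(2, "Bob"), Node(3, "Coffee Shop")]
        edges = [Edge("friend", 0, 1), Edge("friend", 0, 2), Edge("likes", 0, 3)]
        print(LocalTypeahead(nodes, edges).suggest(user_id=0, typed="fr"))  # ['Alice', 'Bob']
    ```
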
  • Patent number: 9619694
    Abstract: In particular embodiments, one or more images associated with a primary user are received. The image(s) may comprise single images, a series of related images, or video frames. In each image, one or more faces are detected and/or tracked. For each face, a set of one or more candidates are selected who may be identified with the face. The primary user has a computed measure of affinity for candidates in the set through a social network, or the candidate in the set is otherwise known to the primary user. A facial recognition score is calculated for each candidate. A subset of candidates is selected, wherein each candidate in the subset has a facial recognition score above a predetermined threshold. A candidate score is calculated for each candidate based on the facial recognition score and the computed measure of affinity. A winning candidate is selected based on the candidate scores.
    Type: Grant
    Filed: June 18, 2015
    Date of Patent: April 11, 2017
    Assignee: Facebook, Inc.
    Inventors: David Harry Garcia, Luke St. Clair, Jenny Yuen
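
    Patent 9619694 above combines a facial-recognition score with the primary user's social-graph affinity to pick which candidate a detected face belongs to. The sketch below illustrates that scoring flow; the threshold, the weights, and the dictionary fields are illustrative assumptions, not values from the patent.

    ```python
    # Hedged sketch: shortlist candidates by facial-recognition score, then blend
    # that score with social-graph affinity to select a winning candidate.
    def pick_winning_candidate(candidates, threshold=0.6, w_face=0.7, w_affinity=0.3):
        """candidates: list of dicts with 'name', 'face_score' (0..1), 'affinity' (0..1)."""
        # Keep only candidates whose facial-recognition score clears the threshold.
        shortlist = [c for c in candidates if c["face_score"] >= threshold]
        if not shortlist:
            return None
        # Blend recognition confidence with the primary user's affinity for the candidate.
        def candidate_score(c):
            return w_face * c["face_score"] + w_affinity * c["affinity"]
        return max(shortlist, key=candidate_score)

    print(pick_winning_candidate([
        {"name": "Alice", "face_score": 0.82, "affinity": 0.4},
        {"name": "Bob",   "face_score": 0.75, "affinity": 0.9},
    ]))  # Bob wins once affinity is factored in
    ```
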
  • Patent number: 9619037
    Abstract: In one embodiment, a method includes identifying a touch input made by a user of a computing device on a touch screen of the computing device as a particular one of a plurality of custom touch gestures of the user stored on the computing device; determining the particular one of the user inputs corresponding to the particular one of the custom touch gestures identified as the touch gesture made by the user; and executing one or more actions based on the particular one of the user inputs.
    Type: Grant
    Filed: July 25, 2012
    Date of Patent: April 11, 2017
    Assignee: Facebook, Inc.
    Inventor: Luke St. Clair
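
    Patent 9619037 maps a user-defined touch gesture to a stored user input and then executes the actions tied to that input. The toy sketch below shows the lookup-and-dispatch structure; real gesture matching would compare stroke geometry rather than template names, and all identifiers here are hypothetical.

    ```python
    # Hypothetical sketch: custom gesture -> stored user input -> action dispatch.
    CUSTOM_GESTURES = {            # gesture template -> user input it stands for
        "circle": "open_camera",
        "zigzag": "mute_notifications",
    }
    ACTIONS = {                    # user input -> action to execute
        "open_camera": lambda: print("camera opened"),
        "mute_notifications": lambda: print("notifications muted"),
    }

    def handle_touch(gesture_template: str) -> None:
        user_input = CUSTOM_GESTURES.get(gesture_template)   # identify the custom gesture
        if user_input is not None:
            ACTIONS[user_input]()                             # execute the associated action

    handle_touch("circle")   # -> camera opened
    ```
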
  • Patent number: 9600167
    Abstract: In one embodiment, a method includes detecting one or more user interactions associated with a user of a computing device, each interaction occurring at a region associated with an input value, and determining, for at least one user interaction, that the user intended to provide a different input value. Adaptation information is generated for the user based on the at least one user interaction. The adaptation information is stored for the user. A user interaction is detected at a region. The user's intended input value is determined based on the user interaction and the adaptation information.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: March 21, 2017
    Assignee: Facebook, Inc.
    Inventors: Jasper Reid Hauser, Luke St. Clair, Jenny Yuen
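
    Patent 9600167 describes learning how a particular user's taps drift from their intended targets and using that adaptation information to reinterpret later touches. The sketch below captures that loop under the simplifying assumption that a backspace-and-retype signals the intended key; the class and method names are invented for illustration.

    ```python
    # Hypothetical sketch: per-user touch adaptation from observed corrections.
    class TouchAdapter:
        def __init__(self):
            self.offsets = []                      # adaptation information for this user

        def record_correction(self, touched_xy, intended_xy):
            # The user backspaced and retyped, so the original tap was a miss.
            dx = intended_xy[0] - touched_xy[0]
            dy = intended_xy[1] - touched_xy[1]
            self.offsets.append((dx, dy))

        def adapted_point(self, touched_xy):
            if not self.offsets:
                return touched_xy
            avg_dx = sum(o[0] for o in self.offsets) / len(self.offsets)
            avg_dy = sum(o[1] for o in self.offsets) / len(self.offsets)
            return (touched_xy[0] + avg_dx, touched_xy[1] + avg_dy)

    adapter = TouchAdapter()
    adapter.record_correction(touched_xy=(103, 40), intended_xy=(110, 40))
    print(adapter.adapted_point((100, 40)))   # tap shifted toward the user's habitual target
    ```
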
  • Publication number: 20170068842
    Abstract: In particular embodiments, one or more images associated with a primary user are received. The image(s) may comprise single images, a series of related images, or video frames. In each image, one or more faces are detected and/or tracked. For each face, a set of one or more candidates are selected who may be identified with the face. A candidate score is calculated for each candidate based on a computed measure of affinity of the primary user for a particular candidate, a facial recognition score comparing the candidate to the face, and a geographic proximity of the candidate to the primary user at a time when the one or more images were created. A winning candidate is selected based on the candidate scores.
    Type: Application
    Filed: November 16, 2016
    Publication date: March 9, 2017
    Inventors: David Harry Garcia, Luke St. Clair, Jenny Yuen
  • Publication number: 20170060257
    Abstract: In one embodiment, a method includes identifying a gesture made by a user of the computing device with respect to one or more surfaces of the computing device, the gesture comprising a single trajectory in three dimensions including: an earlier portion in a first direction along at least one of the surfaces; and immediately following the earlier portion of the single trajectory, a later portion in a second direction comprising a second series of points distant from the surfaces, wherein the second direction comprises a deflection from the first direction that follows through on the earlier portion of the single trajectory; determining a user input based at least in part on a speed of the gesture along the earlier portion of the single trajectory and a speed of the gesture along the later portion of the single trajectory; and executing one or more actions based on the user input.
    Type: Application
    Filed: November 9, 2016
    Publication date: March 2, 2017
    Inventor: Luke St. Clair
  • Patent number: 9582589
    Abstract: In one embodiment, a method includes identifying a content object for display based at least in part on one or more filtering criteria. The filtering criteria are a measure of suitability of each content object for presentation, based at least in part on social-graph information between a first user and one or more second users or on a current geo-location of the first user. The method also includes applying the filtering criteria to the content object and providing for display on a user interface (UI) the content object based on whether the content object is suitable for presentation under the filtering criteria.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: February 28, 2017
    Assignee: Facebook, Inc.
    Inventor: Luke St. Clair
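
    Patent 9582589 filters content objects for display using social-graph information between users and the viewer's current geo-location. A hedged sketch of such a filter follows; the affinity threshold, the distance cutoff, and the rough degrees-to-kilometres conversion are assumptions, not figures from the patent.

    ```python
    # Hypothetical sketch: keep a content object if the viewer's affinity to its
    # author is high enough or the object is geographically close to the viewer.
    from math import hypot

    def is_suitable(content, viewer, min_affinity=0.5, max_km=10.0):
        affinity_ok = content["author_affinity"] >= min_affinity
        distance_km = hypot(content["geo"][0] - viewer["geo"][0],
                            content["geo"][1] - viewer["geo"][1]) * 111  # rough deg -> km
        return affinity_ok or distance_km <= max_km

    def filter_for_display(contents, viewer):
        return [c for c in contents if is_suitable(c, viewer)]

    viewer = {"geo": (37.48, -122.15)}
    posts = [{"id": 1, "author_affinity": 0.8, "geo": (40.70, -74.00)},
             {"id": 2, "author_affinity": 0.1, "geo": (37.49, -122.14)}]
    print([p["id"] for p in filter_for_display(posts, viewer)])  # both pass, for different reasons
    ```
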
  • Patent number: 9575956
    Abstract: Particular embodiments may retrieve information associated with one or more nodes of a social graph from one or more data stores. A node may comprise a user node or a concept node. Each node may be connected by edges to other nodes of a social graph. A first user may be associated with a first user node of the social graph. Particular embodiments may detect that the first user is entering an input term. Predictive typeahead results may be provided as the first user enters the input term. The predictive typeahead results may be based on the input term. Each predictive typeahead result may include at least one image. Each predictive typeahead result may correspond to at least one node of the social graph.
    Type: Grant
    Filed: June 2, 2015
    Date of Patent: February 21, 2017
    Assignee: Facebook, Inc.
    Inventors: Shaheen Ashok Gandhi, Jasper Reid Hauser, Luke St. Clair, David Harry Garcia, Jenny Yuen
  • Patent number: 9535596
    Abstract: In one embodiment, a method includes identifying a three-dimensional gesture made by a user of a computing device with respect to one or more surfaces of the computing device, the three-dimensional gesture comprising a trajectory in three dimensions, a first portion of the trajectory comprising a touch of one or more of the surfaces, a second portion of the trajectory comprising a series of points in space distant from the surfaces; determining a user input based on the three-dimensional gesture; and executing one or more actions based on the user input.
    Type: Grant
    Filed: July 25, 2012
    Date of Patent: January 3, 2017
    Assignee: Facebook, Inc.
    Inventor: Luke St. Clair
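
    Patent 9535596 treats a gesture as a single trajectory whose first portion touches the device surface and whose second portion is a series of points in space above it. The toy sketch below separates those two portions by z-coordinate and maps the result to a user input; the input names are invented for illustration.

    ```python
    # Hypothetical sketch: split a 3D trajectory into on-surface and in-air
    # portions and derive a user input from the combination.
    def classify_gesture(trajectory, surface_z=0.0, eps=1e-3):
        """trajectory: ordered list of (x, y, z) points."""
        on_surface = [p for p in trajectory if abs(p[2] - surface_z) <= eps]
        in_air     = [p for p in trajectory if abs(p[2] - surface_z) > eps]
        if on_surface and in_air:
            # A stroke that starts on the screen and lifts off might mean "toss".
            return "toss"
        return "plain_touch" if on_surface else "hover"

    gesture = [(0, 0, 0.0), (1, 0, 0.0), (2, 0, 0.5), (3, 0, 1.2)]
    print(classify_gesture(gesture))   # -> toss
    ```
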
  • Publication number: 20160219006
    Abstract: In one embodiment, a computing device receives input from a user participating in a message session. The computing device detects an emoticon in the received input and identifies an image corresponding to the emoticon. The computing device accesses the image corresponding to the emoticon and replaces the emoticon with the image in the message session.
    Type: Application
    Filed: March 31, 2016
    Publication date: July 28, 2016
    Inventors: Jenny Yuen, Luke St. Clair
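
    Publication 20160219006 (and the granted patent 9331970 further down) describes detecting an emoticon in a message and swapping it for a corresponding image. A minimal sketch of that substitution follows; the emoticon-to-image table and the image markup are assumptions.

    ```python
    # Hypothetical sketch: replace emoticons in a message with image references.
    import re

    EMOTICON_IMAGES = {":)": "smile.png", ":(": "frown.png", "<3": "heart.png"}

    def replace_emoticons(message: str) -> str:
        pattern = re.compile("|".join(re.escape(e) for e in EMOTICON_IMAGES))
        # Swap each detected emoticon for a reference to its corresponding image.
        return pattern.sub(lambda m: f'<img src="{EMOTICON_IMAGES[m.group(0)]}">', message)

    print(replace_emoticons("great news :) see you soon <3"))
    ```
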
  • Publication number: 20160210280
    Abstract: In one embodiment, a method includes collecting a plurality of words from texts submitted by one or more users; for each of a plurality of communication categories, determining a usage frequency of each of one or more of the words within the communication category based on the texts; and constructing one or more customized dictionaries that each comprise a different blending of selected words.
    Type: Application
    Filed: March 30, 2016
    Publication date: July 21, 2016
    Applicant: Facebook, Inc.
    Inventors: Erick Tseng, Shaheen Ashok Gandhi, Adam D. I. Kramer, Luke St. Clair
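
    Publication 20160210280 (and the related grants below) builds per-category word-frequency counts from submitted texts and blends them into customized dictionaries. The sketch below shows one plausible way to do that; the category names, weights, and blending scheme are illustrative assumptions.

    ```python
    # Hypothetical sketch: per-category word frequencies blended into one dictionary.
    from collections import Counter

    def category_frequencies(texts_by_category):
        """texts_by_category: {category: [text, ...]} -> {category: Counter of words}."""
        return {cat: Counter(w.lower() for t in texts for w in t.split())
                for cat, texts in texts_by_category.items()}

    def blend_dictionary(freqs, weights, top_n=5):
        """Blend category frequencies with per-category weights into one word list."""
        blended = Counter()
        for cat, weight in weights.items():
            for word, count in freqs.get(cat, {}).items():
                blended[word] += weight * count
        return [w for w, _ in blended.most_common(top_n)]

    freqs = category_frequencies({
        "work": ["meeting at noon", "send the report"],
        "chat": ["lol see you at noon", "lol ok"],
    })
    print(blend_dictionary(freqs, {"work": 0.7, "chat": 0.3}))
    ```
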
  • Publication number: 20160132486
    Abstract: Particular embodiments determine that a textual term is not associated with a known meaning. The textual term may be related to one or more users of the social-networking system. A determination is made as to whether the textual term should be added to a glossary. If so, then the textual term is added to the glossary. Information related to one or more textual terms in the glossary is provided to enhance auto-correction, provide predictive text input suggestions, or augment social graph data. Particular embodiments discover new textual terms by mining information, wherein the information was received from one or more users of the social-networking system, was generated for one or more users of the social-networking system, is marked as being associated with one or more users of the social-networking system, or includes an identifier for each of one or more users of the social-networking system.
    Type: Application
    Filed: January 19, 2016
    Publication date: May 12, 2016
    Inventors: Jasper Reid Hauser, Luke St. Clair, David Harry Garcia, Jenny Yuen
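
    Publication 20160132486 (and patent 9280534 below) mines user-generated text for terms with no known meaning and adds them to a glossary that can feed auto-correction and predictive input. A hedged sketch follows; the acceptance rule, that several distinct users must have used the term, is an assumption chosen to keep one-off typos out.

    ```python
    # Hypothetical sketch: discover glossary candidates from users' posts.
    from collections import defaultdict

    KNOWN_WORDS = {"the", "party", "tonight", "is", "at"}   # stand-in for a real lexicon

    def discover_glossary(posts, min_users=2):
        """posts: list of (user_id, text). Returns terms to add to the glossary."""
        users_per_term = defaultdict(set)
        for user_id, text in posts:
            for word in text.lower().split():
                if word not in KNOWN_WORDS:               # term has no known meaning
                    users_per_term[word].add(user_id)
        # Add a term only if multiple distinct users rely on it, so typos stay out.
        return {t for t, users in users_per_term.items() if len(users) >= min_users}

    posts = [(1, "the afterparty is at noon"), (2, "afterparty tonight"), (3, "teh party")]
    print(discover_glossary(posts))   # -> {'afterparty'}
    ```
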
  • Patent number: 9330082
    Abstract: In one embodiment, a method includes constructing one or more customized dictionaries for a particular user, each of the customized dictionaries comprising a different blending of one or more frequently used words collected from texts submitted by one or more users; and, in response to the user inputting text to an electronic device, selecting one of the customized dictionaries and utilizing it to aid the particular user in inputting text.
    Type: Grant
    Filed: February 14, 2012
    Date of Patent: May 3, 2016
    Assignee: Facebook, Inc.
    Inventors: Erick Tseng, Shaheen Ashok Gandhi, Adam D. I. Kramer, Luke St. Clair
  • Patent number: 9330083
    Abstract: In one embodiment, a method includes collecting a plurality of words from texts submitted by one or more users; for each of a plurality of communication categories, determining a usage frequency of each of one or more of the words within the communication category based on the texts; and constructing one or more customized dictionaries that each comprise a different blending of selected words.
    Type: Grant
    Filed: February 14, 2012
    Date of Patent: May 3, 2016
    Assignee: Facebook, Inc.
    Inventors: Erick Tseng, Shaheen Ashok Gandhi, Adam D. I. Kramer, Luke St. Clair
  • Patent number: 9331970
    Abstract: In one embodiment, a computing device receives input from a user participating in a message session. The computing device detects an emoticon in the received input and identifies an image corresponding to the emoticon. The computing device accesses the image corresponding to the emoticon and replaces the emoticon with the image in the message session.
    Type: Grant
    Filed: December 5, 2012
    Date of Patent: May 3, 2016
    Assignee: Facebook, Inc.
    Inventors: Jenny Yuen, Luke St. Clair
  • Publication number: 20160110344
    Abstract: In one embodiment, a method includes constructing a set of customized dictionaries for a particular user, each of the customized dictionaries in the set comprising a different blending of one or more frequently used words collected from texts submitted by one or more users; and sending a copy of the set of customized dictionaries to each of a plurality of electronic devices associated with the particular user to be stored on the electronic device and to aid the particular user in inputting text to the electronic device.
    Type: Application
    Filed: December 1, 2015
    Publication date: April 21, 2016
    Applicant: Facebook, Inc.
    Inventors: Erick Tseng, Shaheen Ashok Gandhi, Adam D. I. Kramer, Luke St. Clair
  • Publication number: 20160098098
    Abstract: In one embodiment, a method includes, in response to a user entering a string of one or more characters on a computing device, determining a plurality of auto-suggestions for the string; displaying all or a portion of one of the auto-suggestions on the display with the string; and, in response to a particular swipe gesture on the display, causing the displayed auto-suggestion to be accepted.
    Type: Application
    Filed: December 9, 2015
    Publication date: April 7, 2016
    Inventor: Luke St. Clair
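
    Publication 20160098098 (and the related patent 9298295 below) shows auto-suggestions for a typed string and lets a particular swipe gesture accept the displayed suggestion. The toy sketch below illustrates that interaction; the suggestion list and gesture names are assumptions.

    ```python
    # Hypothetical sketch: a particular swipe accepts the displayed auto-suggestion.
    SUGGESTIONS = ["thanks", "thank you", "thankful"]

    def suggestions_for(prefix: str):
        return [s for s in SUGGESTIONS if s.startswith(prefix.lower())]

    def handle_gesture(prefix: str, gesture: str) -> str:
        candidates = suggestions_for(prefix)
        if gesture == "swipe_right" and candidates:
            return candidates[0]          # the particular swipe accepts the shown suggestion
        return prefix                     # any other gesture leaves the typed string alone

    print(handle_gesture("than", "swipe_right"))   # -> thanks
    ```
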
  • Patent number: 9298295
    Abstract: In one embodiment, a method includes, in response to a user entering a string of one or more characters on a computing device, displaying the string on a display of the computing device and determining an auto-suggestion for the string; displaying all or a portion of the auto-suggestion on the display with the string; and, in response to a swipe touch gesture on the display, terminating the display of the auto-suggestion.
    Type: Grant
    Filed: July 25, 2012
    Date of Patent: March 29, 2016
    Assignee: Facebook, Inc.
    Inventor: Luke St. Clair
  • Patent number: 9280534
    Abstract: Particular embodiments determine that a textual term is not associated with a known meaning. The textual term may be related to one or more users of the social-networking system. A determination is made as to whether the textual term should be added to a glossary. If so, then the textual term is added to the glossary. Information related to one or more textual terms in the glossary is provided to enhance auto-correction, provide predictive text input suggestions, or augment social graph data. Particular embodiments discover new textual terms by mining information, wherein the information was received from one or more users of the social-networking system, was generated for one or more users of the social-networking system, is marked as being associated with one or more users of the social-networking system, or includes an identifier for each of one or more users of the social-networking system.
    Type: Grant
    Filed: November 19, 2012
    Date of Patent: March 8, 2016
    Assignee: Facebook, Inc.
    Inventors: Jasper Reid Hauser, Luke St. Clair, David Harry Garcia, Jenny Yuen
  • Patent number: 9235565
    Abstract: Techniques for constructing a set of customized dictionaries for a particular user are described. Each of the customized dictionaries in the set may include a different blending of one or more frequently used words collected from texts submitted by one or more users. A copy of the set of customized dictionaries may be sent to each of a plurality of electronic devices associated with the particular user to be stored on the electronic device and to aid the particular user in inputting text to the electronic device.
    Type: Grant
    Filed: February 14, 2012
    Date of Patent: January 12, 2016
    Assignee: Facebook, Inc.
    Inventors: Erick Tseng, Shaheen Ashok Gandhi, Adam D. I. Kramer, Luke St. Clair