Patents by Inventor Robert Moton, Jr.

Robert Moton, Jr. has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240095854
    Abstract: The disclosed technology is directed towards enabling a property, such as via a device associated with a residential or commercial property, to identify an issue and arrange for remediation of the issue. For example, a service provider can be automatically scheduled to resolve a maintenance issue at a scheduled time, with controlled (e.g., key code-based) property access granted to the service provider in the corresponding timeframe. Sensor data obtained from a sensor associated with the property can be analyzed to determine when an incident related to the property has occurred that needs remediation. An action to remediate the incident is determined and taken, such as scheduling a repair. An automated action can also be taken, e.g., to temporarily halt damage resulting from the issue until the repair can take place. If needed, communication between the party responsible for the property and the service provider is facilitated.
    Type: Application
    Filed: September 20, 2022
    Publication date: March 21, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
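
Not from the publication itself, but as a rough picture of the workflow 20240095854 describes (detect an incident from sensor data, take an automated stopgap action, schedule a provider, and grant time-limited access), a minimal Python sketch might look like this; the sensor fields, thresholds, scheduling window, and key-code handling are all hypothetical placeholders.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Incident:
    kind: str          # e.g., "water_leak"
    severity: float    # 0.0 (minor) .. 1.0 (urgent)
    detected_at: datetime

def detect_incident(reading: dict) -> Incident | None:
    """Hypothetical rule: a moisture spike from a property sensor is treated as a leak."""
    if reading.get("moisture", 0.0) > 0.8:
        return Incident("water_leak", 0.9, datetime.now())
    return None

def remediate(incident: Incident) -> dict:
    """Take an automated stopgap action, then schedule a provider with a
    time-limited access code for the corresponding window."""
    start = incident.detected_at + timedelta(hours=4)
    return {
        "automated_action": "close_main_water_valve",                 # halt further damage
        "service_window": (start, start + timedelta(hours=2)),
        "access_code": "one-time code valid only for the window",     # placeholder
        "notify": ["responsible_party", "service_provider"],
    }

if __name__ == "__main__":
    incident = detect_incident({"moisture": 0.93})
    if incident is not None:
        print(remediate(incident))
```
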
  • Publication number: 20240095968
    Abstract: The disclosed technology is directed towards presenting, to a responding entity, communications data that are relevant to an emergency situation at an emergency location. The communications data can be obtained from communications (e.g., text messages, transmitted video, voice calls and the like) that involve at least one user device at the situation, including communications that do not involve the responding entity. Users may opt in to such an emergency service to allow access to their communications, whereby their device locations are tracked and known in the event of an emergency. Upon obtaining the communications data, a responder can receive a view of the scene augmented with the communications data. The view can include a three-dimensional and/or two-dimensional representation of the zone/area of the emergency situation. Filtering can be used to eliminate irrelevant communications, and summarization can be used to combine generally redundant communications.
    Type: Application
    Filed: September 16, 2022
    Publication date: March 21, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
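
For 20240095968, the filtering and summarization steps could be sketched roughly as below, assuming each opted-in communication carries a location; the radius, text normalization, and field names are illustrative assumptions only.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Message:
    text: str
    lat: float
    lon: float

def filter_relevant(messages: list[Message], scene: tuple[float, float],
                    radius_deg: float = 0.002) -> list[Message]:
    """Keep only messages sent from devices near the emergency location
    (a crude lat/lon radius stands in for a real geofence)."""
    return [m for m in messages
            if hypot(m.lat - scene[0], m.lon - scene[1]) <= radius_deg]

def summarize(messages: list[Message]) -> list[str]:
    """Collapse messages with identical normalized text into one annotated line,
    so generally redundant reports appear once in the responder's view."""
    counts: dict[str, int] = {}
    for m in messages:
        key = " ".join(m.text.lower().split())
        counts[key] = counts.get(key, 0) + 1
    return [f"{text} (x{n})" if n > 1 else text for text, n in counts.items()]
```
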
  • Publication number: 20240098472
    Abstract: The disclosed technology is directed towards facilitating communication with an unknown person via a device of the person having an unknown communications address. The presence of a visible person at a location can be determined, and a communications device at or near that location can be detected as a candidate device associated with that person. The device's current location, which can be regularly registered along with its communication address, is matched to the person's visible location. Communication via the device, such as by a responder, whether with the person and/or to obtain information from the device, is based on the registered communication address. The person's identity and other data may be obtainable. For a non-visible person, e.g., underneath an obstruction, the location of a detected device predicts the likely presence of a person near that location. A map may be generated showing locations of multiple detected devices.
    Type: Application
    Filed: September 13, 2022
    Publication date: March 21, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
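
The location-matching step in 20240098472 (pairing a person's observed location with the nearest regularly registered device location) might be approximated as in the sketch below; the 10-meter threshold and the (address, lat, lon) record layout are assumptions, not details from the application.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def candidate_device(person_loc: tuple[float, float],
                     registered: list[tuple[str, float, float]],
                     max_m: float = 10.0):
    """Return (communication_address, distance_m) for the registered device
    closest to the observed person, or None if nothing is within max_m meters.
    Each registered entry is (communication_address, lat, lon)."""
    best = None
    for address, lat, lon in registered:
        d = haversine_m(person_loc[0], person_loc[1], lat, lon)
        if d <= max_m and (best is None or d < best[1]):
            best = (address, d)
    return best
```
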
  • Publication number: 20240086463
    Abstract: The disclosed technology is directed towards generating an annotated visual playback of an incident. Upon obtaining notification of an incident, event data occurring in the incident zone and timeframe is obtained (e.g., collected) from sensors proximate the zone. The event data is used to annotate a visual playback captured by one or more cameras, e.g., within a timeframe ranging from some time before the incident occurred to the end of the incident. For example, a video can be presented with overlaid annotation data, each annotation describing an event as it occurred in time, in conjunction with an advancing timestamp overlay showing the time of the video frames. A simulated video, e.g., extended reality video, can also be generated, along with annotations and timeframe data, such as if captured by camera(s) from one or more different perspectives.
    Type: Application
    Filed: September 12, 2022
    Publication date: March 14, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
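
One way to picture the annotation step in 20240086463 is to align each sensor event with the nearest video frame timestamp, as in this sketch; the one-second matching window and the overlay format are assumed for illustration.

```python
from bisect import bisect_left

def annotate_frames(frame_times: list[float],
                    events: list[tuple[float, str]],
                    window_s: float = 1.0) -> dict[float, list[str]]:
    """Attach each (timestamp, description) sensor event to the closest frame
    timestamp, returning {frame_time: [overlay lines]} for rendering.
    frame_times must be sorted in ascending order."""
    overlays: dict[float, list[str]] = {t: [] for t in frame_times}
    for ts, description in events:
        i = bisect_left(frame_times, ts)
        neighbors = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
        if not neighbors:
            continue
        nearest = min(neighbors, key=lambda j: abs(frame_times[j] - ts))
        if abs(frame_times[nearest] - ts) <= window_s:
            overlays[frame_times[nearest]].append(f"{ts:.1f}s: {description}")
    return overlays
```
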
  • Publication number: 20240087431
    Abstract: The disclosed technology is directed towards providing guidance, e.g., point-to-point audio guidance, along an evacuation route. The presence of occupant(s) within a structure/area is monitored. If an evacuation event occurs, an evacuation route is determined for each occupant based on their current monitored starting location. Guidance is output to guide each occupant from point to point along the evacuation route, e.g., by various output devices along the route, which can be the same safety monitors that monitor occupant presence. For example, one monitor can be configured to hand off guidance responsibility to a next monitor along the evacuation route. The guidance can be customized for each known occupant based on their recognized features, or can be directed generally to an unknown occupant. Status information of the occupants can be provided to other occupants and to a responding entity. A responding entity can also be given directional guidance.
    Type: Application
    Filed: September 12, 2022
    Publication date: March 14, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
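
The route determination and monitor-to-monitor handoff in 20240087431 could be modeled as a shortest path over a graph of safety monitors. The sketch below uses Dijkstra's algorithm; the graph and exit representations are assumptions rather than the application's method.

```python
import heapq

def evacuation_route(graph: dict, start: str, exits: set[str]):
    """Shortest path (Dijkstra) from an occupant's current monitor to the
    nearest exit; graph maps node -> [(neighbor, cost), ...]."""
    dist = {start: 0.0}
    prev: dict[str, str] = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in exits:                      # reached the closest exit
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        if d > dist.get(node, float("inf")):
            continue
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(heap, (nd, neighbor))
    return None                                # no reachable exit

def handoff_plan(path: list[str]) -> list[tuple[str, str]]:
    """Pair each monitor with the next one it hands guidance responsibility to."""
    return list(zip(path, path[1:]))
```
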
  • Publication number: 20240089697
    Abstract: The disclosed technology is directed towards efficiently identifying a first entity (e.g., a user or sensor) as a potential witness to an event. A first dataset describing event time data and event location data of an event is obtained from a first device. A second dataset describing device time data and device location data is obtained via a device associated with a second entity, e.g., a user. By analyzing the first and second datasets, e.g., for time and location intersection, the second entity can be identified as a potential witness to the event, and notified of such. Factors such as line of sight, speed and direction, audio sensing range versus event volume, and the like of the second entity can be used in determining whether the second entity is a likely witness. A witness to a live event can be identified and asked to send in live (e.g., video) evidence.
    Type: Application
    Filed: September 12, 2022
    Publication date: March 14, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
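
The time-and-location intersection described in 20240089697 reduces, in its simplest form, to a proximity test over a device's location history; the 120-second and 50-meter thresholds below are invented for illustration, and a real system would also weigh line of sight, speed and direction, and sensing range.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Event:
    t: float   # event time (seconds since epoch)
    x: float   # event location on a local flat-plane grid, meters
    y: float

def is_potential_witness(event: Event,
                         device_samples: list[tuple[float, float, float]],
                         max_dt_s: float = 120.0,
                         max_range_m: float = 50.0) -> bool:
    """True if any (t, x, y) sample of the second entity's device falls within
    max_dt_s seconds of the event time and max_range_m meters of its location."""
    return any(
        abs(t - event.t) <= max_dt_s and hypot(x - event.x, y - event.y) <= max_range_m
        for t, x, y in device_samples
    )
```
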
  • Publication number: 20240078890
    Abstract: The disclosed technology is directed towards associating a rescue tag with a victim in need of rescue, in which the rescue tag collects condition (biological state) data associated with the victim and provides a location of the rescue tag. The location data is maintained in association with the condition data. A responder makes a request to output the condition data, and in response, the responder's device is presented with an augmented reality display that shows the victim condition data relative to the location data. Multiple victims can be efficiently triaged, with more urgent victims highlighted via augmented reality for more urgent treatment. Filtering as requested by a responder can present an augmented reality display for only a subset of the victims. Also described is the use of an aerial vehicle to assist the responders and/or a command center; the aerial vehicle can act as an edge node for efficient communication.
    Type: Application
    Filed: September 2, 2022
    Publication date: March 7, 2024
    Inventors: Adrianne Luu, Robert Moton, Jr., Ryan Schaub, Timothy Knezevich, Barrett Kreiner, Wei Wang, Ari Craine, Robert Koch
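
A toy sketch of the triage and filtering described in 20240078890; the vital-sign fields, the urgency formula, and the thresholds are invented for illustration and do not come from the publication.

```python
from dataclasses import dataclass

@dataclass
class RescueTag:
    tag_id: str
    lat: float
    lon: float
    heart_rate: int   # beats per minute
    spo2: int         # blood-oxygen saturation, percent

def urgency(tag: RescueTag) -> float:
    """Toy score: low SpO2 and an abnormal heart rate raise urgency."""
    score = max(0, 92 - tag.spo2) * 2.0
    score += max(0, abs(tag.heart_rate - 75) - 25) * 0.5
    return score

def triage_overlay(tags: list[RescueTag], min_urgency: float = 0.0) -> list[dict]:
    """Most-urgent victims first, filtered to those above the responder's
    threshold; each dict is what an AR client would render at the tag location."""
    ranked = sorted(tags, key=urgency, reverse=True)
    return [
        {"tag": t.tag_id, "lat": t.lat, "lon": t.lon, "urgency": round(urgency(t), 1)}
        for t in ranked
        if urgency(t) >= min_urgency
    ]
```
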
  • Patent number: 11798276
    Abstract: In one example, a method performed by a processing system including at least one processor includes identifying an environment surrounding a user of an augmented reality display, identifying a relative location of the user within the environment, determining a field of view of the augmented reality display, identifying an individual within the field of view, querying a data source for information related to the individual, and modifying the augmented reality display to present the information related to the individual.
    Type: Grant
    Filed: June 3, 2021
    Date of Patent: October 24, 2023
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: Robert Moton, Jr., Adrianne Luu, James Pratt, Barrett Kreiner, Walter Cooper Chastain, Robert Koch, Ari Craine
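
The steps recited in 11798276 map naturally onto a small pipeline. In the sketch below, every collaborator (environment, display, data_source and their methods) is a hypothetical stand-in, since the patent does not prescribe an API.

```python
def present_person_info(environment, user_pose, display, data_source) -> None:
    """Sketch of the recited steps: locate the user, compute the field of view,
    find an individual within it, query a data source, and modify the display."""
    location = environment.locate(user_pose)        # user's relative location
    fov = display.field_of_view(location)           # region currently visible
    person = environment.find_individual(fov)       # individual within the view
    if person is None:
        return                                      # nothing to annotate
    info = data_source.query(person.identifier)     # information related to them
    display.overlay(person, info)                   # present it on the AR display
```
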
  • Patent number: 11670081
    Abstract: In one example, a method performed by a processing system including at least one processor includes identifying an environment surrounding a user of an augmented reality display, identifying a relative location of the user within the environment, determining a field of view of the augmented reality display, identifying a room within the field of view, querying a data source for current information about the room, and modifying the augmented reality display to present the current information about the room.
    Type: Grant
    Filed: June 3, 2021
    Date of Patent: June 6, 2023
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: Walter Cooper Chastain, Barrett Kreiner, James Pratt, Adrianne Luu, Robert Moton, Jr., Robert Koch, Ari Craine
  • Publication number: 20220391619
    Abstract: In one example, a method performed by a processing system including at least one processor includes detecting an input from a user of an augmented reality display that is deployed within a window, the input indicating that the user is seeking additional information about a real world object that is viewable through the window, identifying the real world object for which the user is seeking additional information, based on the input from the user and a reference map, retrieving information about the real world object from a data source, and modifying the augmented reality display to present the information about the real world object.
    Type: Application
    Filed: June 3, 2021
    Publication date: December 8, 2022
    Inventors: Walter Cooper Chastain, Barrett Kreiner, James Pratt, Adrianne Luu, Robert Moton, Jr., Robert Koch, Ari Craine
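
For 20220391619, the step of identifying the object based on the user's input and a reference map could be as simple as matching the input direction against bearings to mapped objects; the map format and the angular tolerance below are assumed for illustration.

```python
from dataclasses import dataclass
from math import atan2, degrees

@dataclass
class MappedObject:
    name: str
    x: float   # position in the same local coordinate frame as the window, meters
    y: float

def identify_object(window_pos: tuple[float, float],
                    input_bearing_deg: float,
                    reference_map: list[MappedObject],
                    tolerance_deg: float = 5.0):
    """Return the mapped object whose bearing from the window best matches the
    user's input direction, or None if nothing falls within the tolerance."""
    best = None
    for obj in reference_map:
        bearing = degrees(atan2(obj.y - window_pos[1], obj.x - window_pos[0]))
        diff = abs((bearing - input_bearing_deg + 180) % 360 - 180)
        if diff <= tolerance_deg and (best is None or diff < best[1]):
            best = (obj, diff)
    return best[0] if best else None
```
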
  • Publication number: 20220391618
    Abstract: In one example, a method performed by a processing system including at least one processor includes identifying an environment surrounding a user of an augmented reality display, identifying a relative location of the user within the environment, determining a field of view of the augmented reality display, identifying an individual within the field of view, querying a data source for information related to the individual, and modifying the augmented reality display to present the information related to the individual.
    Type: Application
    Filed: June 3, 2021
    Publication date: December 8, 2022
    Inventors: Robert Moton, Jr., Adrianne Luu, James Pratt, Barrett Kreiner, Walter Cooper Chastain, Robert Koch, Ari Craine
  • Publication number: 20220391617
    Abstract: In one example, a method performed by a processing system including at least one processor includes identifying an environment surrounding a user of an augmented reality display, identifying a relative location of the user within the environment, determining a field of view of the augmented reality display, identifying a room within the field of view, querying a data source for current information about the room, and modifying the augmented reality display to present the current information about the room.
    Type: Application
    Filed: June 3, 2021
    Publication date: December 8, 2022
    Inventors: Walter Cooper Chastain, Barrett Kreiner, James Pratt, Adrianne Luu, Robert Moton, Jr., Robert Koch, Ari Craine
  • Publication number: 20180189909
    Abstract: A patentability search and analysis system can, based on a submission, perform a patentability search that finds prior art relevant to the submission. An example system can operate to indicate the relevancy of prior art documents discovered in the search, and categorize the art. The system can determine a patentability indicator (e.g., a patentability score) representative of the likelihood that a patent will be granted on a patent application representative of the submission. The system can, based on an analysis of filed patent applications, provide rejection and allowance rates for the claims of applications whose patentability indicator values are similar to that of the submission. The rejection and allowance rates can be for the patent office as a whole, the art unit, or the examiner. Additionally, the system can provide a “confidence level” representative of the strength of the patentability indicator.
    Type: Application
    Filed: November 9, 2017
    Publication date: July 5, 2018
    Inventors: Samuel N. Zellner, Scott Frank, Robert Moton, Jr., Ari Craine, Jason V. Chang
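
For 20180189909, the relationship between the patentability indicator and the comparable rejection/allowance rates can be sketched as follows; the scoring formula, the ±5-point band, and the history record layout are illustrative assumptions, not the system's actual method.

```python
def patentability_indicator(similarities: list[float]) -> float:
    """Toy indicator on a 0-100 scale: start from 100 and subtract a penalty
    driven by the closest prior-art similarity scores (each in [0, 1])."""
    top = sorted(similarities, reverse=True)[:3]
    if not top:
        return 100.0
    return max(0.0, 100.0 - 60.0 * (sum(top) / len(top)))

def comparable_rates(history: list[dict], score: float, band: float = 5.0) -> dict:
    """Allowance/rejection rates among past applications whose indicator fell
    within +/- band of this submission's score. Each history row looks like
    {"score": 72.0, "allowed": True, "art_unit": "3625"}; filtering the history
    by art unit or examiner before calling this yields the per-unit or
    per-examiner rates mentioned in the abstract."""
    peers = [h for h in history if abs(h["score"] - score) <= band]
    if not peers:
        return {"allowance_rate": None, "rejection_rate": None, "n": 0}
    allowed = sum(1 for h in peers if h["allowed"])
    return {
        "allowance_rate": allowed / len(peers),
        "rejection_rate": 1 - allowed / len(peers),
        "n": len(peers),
    }
```
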