Patents by Inventor Benedict Liang

Benedict Liang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240064110
    Abstract: Implementations set forth herein relate to conditionally delaying fulfillment of client update requests in order to preserve network bandwidth and other resources that may be consumed when an ecosystem of linked assistant devices is repeatedly pinging servers for updates. In some implementations, a server device can delay and/or bypass fulfillment of a client request based on one or more indications that certain requested data is currently, or is expected to be, expired. For example, a user that is modifying assistant settings via a cellular device can cause an update notification to be pushed to several other assistant devices before the user finishes editing the assistant settings. Implementations herein can limit fulfillment of update requests from the client devices according to certain criteria, such as whether the user is continuing to modify the assistant settings from their cellular device.
    Type: Application
    Filed: October 30, 2023
    Publication date: February 22, 2024
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo
  • Patent number: 11907214
    Abstract: Implementations set forth herein relate to conditionally caching responses to automated assistant queries according to certain contextual data that may be associated with each automated assistant query. Each query can be identified based on historical interactions between a user and an automated assistant, and, depending on the query, fulfillment data can be cached according to certain contextual data that influences the query response. Depending on how the contextual data changes, a cached response stored at a client device can be discarded and/or replaced with an updated cached response. For example, a query that users commonly ask prior to leaving for work can have a corresponding assistant response that depends on features of an environment of the users. This unique assistant response can be cached, before the users provide the query, to minimize latency that can occur when network or processing bandwidth is unpredictable.
    Type: Grant
    Filed: January 30, 2023
    Date of Patent: February 20, 2024
    Assignee: Google LLC
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo, Anarghya Mitra
  • Patent number: 11853381
    Abstract: Techniques of this disclosure are directed to enabling a computing device to process voice queries and provide query answers even when the computing device and vehicle do not have internet connectivity. According to the disclosed techniques, a computing device may detect a query via input devices of the computing device and output a query answer determined based on the detected query. Rather than directly querying a remote computing system, various aspects of the techniques of this disclosure may enable the computing device to use a query answer cache to generate the query answer. The query answer cache may include predicted queries and query answers retrieved from a query answer cache of a remote computing system, thereby enabling the computing device to respond to the detected queries while experiencing an unreliable internet connection.
    Type: Grant
    Filed: November 13, 2020
    Date of Patent: December 26, 2023
    Assignee: Google LLC
    Inventors: Xin Li, Yixin Wang, Benedict Liang, Dharminder Singh
  • Patent number: 11805068
    Abstract: Implementations set forth herein relate to conditionally delaying fulfillment of client update requests in order to preserve network bandwidth and other resources that may be consumed when an ecosystem of linked assistant devices is repeatedly pinging servers for updates. In some implementations, a server device can delay and/or bypass fulfillment of a client request based on one or more indications that certain requested data is currently, or is expected to be, expired. For example, a user that is modifying assistant settings via a cellular device can cause an update notification to be pushed to several other assistant devices before the user finishes editing the assistant settings. Implementations herein can limit fulfillment of update requests from the client devices according to certain criteria, such as whether the user is continuing to modify the assistant settings from their cellular device.
    Type: Grant
    Filed: February 26, 2021
    Date of Patent: October 31, 2023
    Assignee: Google LLC
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo
  • Publication number: 20230177050
    Abstract: Implementations set forth herein relate to conditionally caching responses to automated assistant queries according to certain contextual data that may be associated with each automated assistant query. Each query can be identified based on historical interactions between a user and an automated assistant, and, depending on the query, fulfillment data can be cached according to certain contextual data that influences the query response. Depending on how the contextual data changes, a cached response stored at a client device can be discarded and/or replaced with an updated cached response. For example, a query that users commonly ask prior to leaving for work can have a corresponding assistant response that depends on features of an environment of the users. This unique assistant response can be cached, before the users provide the query, to minimize latency that can occur when network or processing bandwidth is unpredictable.
    Type: Application
    Filed: January 30, 2023
    Publication date: June 8, 2023
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo, Anarghya Mitra
  • Patent number: 11567935
    Abstract: Implementations set forth herein relate to conditionally caching responses to automated assistant queries according to certain contextual data that may be associated with each automated assistant query. Each query can be identified based on historical interactions between a user and an automated assistant, and, depending on the query, fulfillment data can be cached according to certain contextual data that influences the query response. Depending on how the contextual data changes, a cached response stored at a client device can be discarded and/or replaced with an updated cached response. For example, a query that users commonly ask prior to leaving for work can have a corresponding assistant response that depends on features of an environment of the users. This unique assistant response can be cached, before the users provide the query, to minimize latency that can occur when network or processing bandwidth is unpredictable.
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: January 31, 2023
    Assignee: Google LLC
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo, Anarghya Mitra
  • Publication number: 20220318248
    Abstract: Implementations set forth herein relate to conditionally caching responses to automated assistant queries according to certain contextual data that may be associated with each automated assistant query. Each query can be identified based on historical interactions between a user and an automated assistant, and, depending on the query, fulfillment data can be cached according to certain contextual data that influences the query response. Depending on how the contextual data changes, a cached response stored at a client device can be discarded and/or replaced with an updated cached response. For example, a query that users commonly ask prior to leaving for work can have a corresponding assistant response that depends on features of an environment of the users. This unique assistant response can be cached, before the users provide the query, to minimize latency that can occur when network or processing bandwidth is unpredictable.
    Type: Application
    Filed: March 30, 2021
    Publication date: October 6, 2022
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo, Anarghya Mitra
  • Publication number: 20220272048
    Abstract: Implementations set forth herein relate to conditionally delaying fulfillment of client update requests in order to preserve network bandwidth and other resources that may be consumed when an ecosystem of linked assistant devices is repeatedly pinging servers for updates. In some implementations, a server device can delay and/or bypass fulfillment of a client request based on one or more indications that certain requested data is currently, or is expected to be, expired. For example, a user that is modifying assistant settings via a cellular device can cause an update notification to be pushed to several other assistant devices before the user finishes editing the assistant settings. Implementations herein can limit fulfillment of update requests from the client devices according to certain criteria, such as whether the user is continuing to modify the assistant settings from their cellular device.
    Type: Application
    Filed: February 26, 2021
    Publication date: August 25, 2022
    Inventors: Benedict Liang, Bryan Christopher Horling, Lan Huo
  • Publication number: 20220156340
    Abstract: Techniques of this disclosure are directed to enabling a computing device to process voice queries and provide query answers even when the computing device and vehicle do not have internet connectivity. According to the disclosed techniques, a computing device may detect a query via input devices of the computing device and output a query answer determined based on the detected query. Rather than directly querying a remote computing system, various aspects of the techniques of this disclosure may enable the computing device to use a query answer cache to generate the query answer. The query answer cache may include predicted queries and query answers retrieved from a query answer cache of a remote computing system, thereby enabling the computing device to respond to the detected queries while experiencing an unreliable internet connection.
    Type: Application
    Filed: November 13, 2020
    Publication date: May 19, 2022
    Inventors: Xin Li, Yixin Wang, Benedict Liang, Dharminder Singh
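The delayed-fulfillment idea in the abstracts above (patent 11805068 and its related application and publication) can be illustrated with a small sketch. This is not the patented implementation; all names here (UpdateServer, EDIT_QUIET_PERIOD_S, and so on) are hypothetical, and the example only shows the core behavior described in the abstract: a server defers serving an update while the user is still actively editing settings from another device, since the requested data is expected to be stale soon.

```python
import time

# Illustrative quiet period: fulfill update requests only after edits
# have been idle for this long. The value is an assumption, not from
# the patent.
EDIT_QUIET_PERIOD_S = 5.0

class UpdateServer:
    def __init__(self):
        self.last_edit_time = {}  # user_id -> time of most recent settings edit
        self.settings = {}        # user_id -> current settings snapshot

    def record_edit(self, user_id, settings, now=None):
        """Record that the user changed their settings (e.g. from a phone)."""
        now = time.time() if now is None else now
        self.last_edit_time[user_id] = now
        self.settings[user_id] = settings

    def handle_update_request(self, user_id, now=None):
        """Return current settings, or None to signal 'retry later'.

        While the user is still editing, the data is expected to expire
        shortly, so fulfillment is delayed to conserve bandwidth.
        """
        now = time.time() if now is None else now
        last = self.last_edit_time.get(user_id)
        if last is not None and (now - last) < EDIT_QUIET_PERIOD_S:
            return None  # delay/bypass fulfillment: edits still in progress
        return self.settings.get(user_id)
```

A linked device polling during active editing would receive the "retry later" signal instead of soon-to-be-stale settings, and would get the final snapshot once editing has gone quiet.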
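The context-keyed caching described in patent 11907214 and its related filings can be sketched similarly. Again, this is a hypothetical illustration rather than the patented method: a response is cached together with a snapshot of the contextual data that influenced it, and the cached entry is discarded when that context no longer matches.

```python
class ContextualResponseCache:
    """Caches an assistant response keyed by query, guarded by the
    contextual data that influenced it. Name and shape are illustrative."""

    def __init__(self):
        self._cache = {}  # query -> (context snapshot, response)

    def put(self, query, context, response):
        # Store the response along with the context it was computed under.
        self._cache[query] = (dict(context), response)

    def get(self, query, current_context):
        entry = self._cache.get(query)
        if entry is None:
            return None
        cached_context, response = entry
        if cached_context != dict(current_context):
            # Context changed since caching: the response may be wrong,
            # so discard it rather than serve stale data.
            del self._cache[query]
            return None
        return response
```

Pre-populating such a cache for queries a user habitually asks (for example, just before leaving for work) is what lets a client answer instantly even when network bandwidth is unpredictable.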
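Finally, the offline query answer cache from patent 11853381 and its related application can be sketched as a prefetch-then-fallback pattern. The class and callables below are assumptions for illustration only: while connected, the device pulls answers for predicted queries from a remote system; when connectivity drops, it serves matching answers from the local cache.

```python
class OfflineQueryAnswerer:
    """Hypothetical sketch: answer voice queries from a local cache
    when the vehicle/device has no internet connectivity."""

    OFFLINE_FALLBACK = "Sorry, that answer isn't available offline."

    def __init__(self, remote_lookup, is_online):
        self.remote_lookup = remote_lookup  # callable: query -> answer (remote system)
        self.is_online = is_online          # callable: () -> bool (connectivity check)
        self.cache = {}                     # local query answer cache

    def prefetch(self, predicted_queries):
        # While connected, retrieve answers for queries the remote system
        # predicts the user will ask, and store them locally.
        if not self.is_online():
            return
        for query in predicted_queries:
            self.cache[query] = self.remote_lookup(query)

    def answer(self, query):
        if self.is_online():
            answer = self.remote_lookup(query)
            self.cache[query] = answer  # keep the cache warm for later
            return answer
        # Offline: fall back to the prefetched cache.
        return self.cache.get(query, self.OFFLINE_FALLBACK)
```

The key design point the abstract describes is that prediction happens on the remote side, so the device only needs to sync a small set of likely query/answer pairs before losing connectivity.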