Patents by Inventor Francois Gervais

Francois Gervais has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11277347
    Abstract: Inference server and computing device for inferring an optimal wireless data transfer rate. The computing device determines parameters of a data transfer through a wireless communication interface of the computing device, and transmits the parameters of the data transfer to the inference server. The inference server receives the parameters of the data transfer, executes a neural network inference engine using a predictive model (generated by a neural network training engine) for inferring an optimal data transfer rate based on the parameters of the data transfer, and transmits the optimal data transfer rate to the computing device. The computing device receives the optimal data transfer rate, and configures its wireless communication interface to operate at the optimal data transfer rate. For example, the computing device is an environment control device (e.g., an environment controller, a sensor, a controlled appliance, or a relay).
    Type: Grant
    Filed: February 27, 2020
    Date of Patent: March 15, 2022
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
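    A minimal Python sketch of the round trip described in the abstract above. The transfer parameters, the heuristic standing in for the trained predictive model, and all names (TransferParameters, infer_optimal_rate, configure_wireless_interface) are illustrative assumptions, not the patented implementation.

      from dataclasses import dataclass

      @dataclass
      class TransferParameters:          # observed by the computing device
          bytes_sent: int
          bytes_received: int
          retransmissions: int
          signal_strength_dbm: float

      def infer_optimal_rate(params: TransferParameters) -> float:
          """Stand-in for the neural network inference engine on the inference server."""
          # Placeholder heuristic instead of a trained predictive model.
          base_rate_mbps = 54.0
          penalty = 0.5 * params.retransmissions - 0.1 * params.signal_strength_dbm
          return max(1.0, base_rate_mbps - penalty)

      def configure_wireless_interface(rate_mbps: float) -> None:
          # The real device would reconfigure its wireless radio here.
          print(f"configuring wireless interface at {rate_mbps:.1f} Mbps")

      # Computing device side: measure, "transmit" to the server, apply the answer.
      params = TransferParameters(bytes_sent=1_200_000, bytes_received=80_000,
                                  retransmissions=12, signal_strength_dbm=-67.0)
      configure_wireless_interface(infer_optimal_rate(params))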
  • Publication number: 20220044127
    Abstract: Method and environment controller for validating a predictive model of a neural network. The environment controller receives at least one environmental characteristic value and determines a plurality of input variables. At least one of the plurality of input variables is based on one among the environmental characteristic value(s). The environment controller executes an environment control software module for calculating at least one output variable based on the plurality of input variables. The environment controller transmits the plurality of input variables to a training server executing a neural network training engine using the predictive model; and receives at least one inferred output variable from the training server. Each inferred output variable corresponds to one of the at least one output variable calculated by the environment control software module.
    Type: Application
    Filed: October 26, 2021
    Publication date: February 10, 2022
    Inventor: Francois Gervais
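    A minimal sketch of the validation loop in the abstract above, assuming a simple rule-based control module, a local stand-in for the training server's inference, and an arbitrary tolerance; none of these are taken from the patent.

      def control_module(temperature, set_point):
          """Deterministic environment control logic (placeholder)."""
          return max(0.0, min(100.0, (set_point - temperature) * 20.0))  # heating command, %

      def training_server_inference(temperature, set_point):
          """Stand-in for the neural network executed by the training server."""
          return max(0.0, min(100.0, (set_point - temperature) * 19.0))

      inputs = (20.5, 22.0)                     # measured temperature, set point
      calculated = control_module(*inputs)      # output of the software module
      inferred = training_server_inference(*inputs)

      # Validation: the predictive model is accepted only if its inferred output
      # stays close to the deterministic output (the tolerance is an assumption).
      valid = abs(calculated - inferred) <= 5.0
      print(f"calculated={calculated}, inferred={inferred}, model valid: {valid}")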
  • Publication number: 20220022084
    Abstract: Method and computing device for inferring a predicted state of a communication channel. The computing device stores a predictive model generated by a neural network training engine. The computing device collects a plurality of data samples representative of operating conditions of the communication channel. The communication channel is associated with a communication interface of the computing device. The communication interface allows an exchange of data between the computing device and at least one remote computing device over the communication channel. Each data sample comprises the amounts of data respectively transmitted and received by the communication interface over the communication channel, and the connection status of the communication channel, during a period of time. The computing device further executes a neural network inference engine using the predictive model for inferring the predicted state of the communication channel based on the plurality of data samples.
    Type: Application
    Filed: September 30, 2021
    Publication date: January 20, 2022
    Inventor: Francois Gervais
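    A minimal sketch of the data-sample collection and prediction step above; the sample fields mirror the abstract, while the state labels and the heuristic replacing the trained model are assumptions.

      from dataclasses import dataclass
      from typing import List

      @dataclass
      class ChannelSample:
          bytes_tx: int          # data transmitted during the period
          bytes_rx: int          # data received during the period
          connected: bool        # connection status during the period

      def predict_channel_state(samples: List[ChannelSample]) -> str:
          """Stand-in for the neural network inference engine using the predictive model."""
          drops = sum(1 for s in samples if not s.connected)
          traffic = sum(s.bytes_tx + s.bytes_rx for s in samples)
          if drops > len(samples) // 2:
              return "failing"
          return "degraded" if traffic == 0 else "healthy"

      history = [ChannelSample(1500, 9000, True), ChannelSample(0, 0, False),
                 ChannelSample(2000, 500, True)]
      print(predict_channel_state(history))      # -> "healthy"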
  • Patent number: 11188832
    Abstract: Method and environment controller for validating a predictive model of a neural network. The environment controller receives at least one environmental characteristic value and determines a plurality of input variables. At least one of the plurality of input variables is based on one among the environmental characteristic value(s). The environment controller executes an environment control software module for calculating at least one output variable based on the plurality of input variables. The environment controller transmits the plurality of input variables to a training server executing a neural network training engine using the predictive model; and receives at least one inferred output variable from the training server. Each inferred output variable corresponds to one of the at least one output variable calculated by the environment control software module.
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: November 30, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
  • Publication number: 20210264276
    Abstract: Computing device and method for inferring a predicted number of data chunks writable on a flash memory before the flash memory wears out. The computing device stores a predictive model generated by a neural network training engine. A processing unit of the computing device executes a neural network inference engine, using the predictive model for inferring the predicted number of data chunks writable on the flash memory before the flash memory wears out based on inputs. The inputs comprise a total number of physical blocks previously erased from the flash memory, a size of the data chunk, and optionally an operating temperature of the flash memory. In a particular aspect, the flash memory is comprised in the computing device, and an action may be taken for preserving a lifespan of the flash memory based at least on the predicted number of data chunks writable on the flash memory.
    Type: Application
    Filed: May 10, 2021
    Publication date: August 26, 2021
    Inventor: Francois GERVAIS
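    An illustrative sketch of the prediction described above, assuming a fixed block size, an endurance budget, and a simple derating rule in place of the trained predictive model; all figures and names are assumptions.

      BLOCK_SIZE_BYTES = 128 * 1024          # assumed physical block size
      ENDURANCE_ERASES = 3_000_000           # assumed total erase budget of the part

      def predict_writable_chunks(blocks_erased, chunk_size, temperature_c=None):
          """Stand-in for the neural network inference engine."""
          remaining_erases = max(0, ENDURANCE_ERASES - blocks_erased)
          blocks_per_chunk = max(1, -(-chunk_size // BLOCK_SIZE_BYTES))  # ceiling division
          derate = 0.8 if temperature_c is not None and temperature_c > 60 else 1.0
          return int(remaining_erases / blocks_per_chunk * derate)

      remaining = predict_writable_chunks(blocks_erased=1_200_000,
                                          chunk_size=512 * 1024, temperature_c=45.0)
      if remaining < 100_000:                # possible action to preserve the lifespan
          print("reduce logging frequency to extend flash lifespan")
      print(f"predicted data chunks still writable: {remaining}")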
  • Publication number: 20210248442
    Abstract: Computing device and method using a neural network to predict values of an input variable of a software program. The computing device determines an initial series of n consecutive values of the input variable and then performs an iterative process, which includes using the neural network for inferring a next value of the input variable based at least on the series of n consecutive values of the input variable. The iterative process includes executing the software, using the next value of the input variable to calculate a corresponding next value of an output variable. The iterative process includes updating the series of n consecutive values by removing the first value of the series and adding the next value as the last value of the series. The iterative process may include determining that a condition is met based at least on the next value of the output variable.
    Type: Application
    Filed: February 11, 2020
    Publication date: August 12, 2021
    Inventor: Francois GERVAIS
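    A minimal sketch of the iterative, sliding-window process in the abstract above, assuming the "software" is a simple function and a naive extrapolation stands in for the neural network; the window length, functions, and stopping condition are illustrative.

      from collections import deque

      def predict_next(window):
          """Stand-in for the neural network: here, a naive linear extrapolation."""
          return 2 * window[-1] - window[-2]

      def software(x):
          """The software whose output variable depends on the input variable x."""
          return x * x - 3.0

      window = deque([1.0, 1.5, 2.0], maxlen=3)    # initial series of n = 3 values
      for _ in range(10):
          next_value = predict_next(window)        # infer the next input value
          output = software(next_value)            # compute the corresponding output
          window.append(next_value)                # drop the oldest value, keep the newest
          if output > 30.0:                        # stop once the condition is met
              print(f"condition met at input {next_value}, output {output}")
              break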
  • Patent number: 11079134
    Abstract: Computing device and method for inferring via a neural network a two-dimensional temperature mapping of an area. A predictive model is stored by the computing device. The computing device receives a plurality of temperature measurements transmitted by a corresponding plurality of temperature sensors located at a corresponding plurality of locations on a periphery of the area. The computing device executes a neural network inference engine, using the predictive model for inferring outputs based on inputs. The inputs comprise the plurality of temperature measurements. The outputs consist of a plurality of temperature values at a corresponding plurality of zones, the plurality of zones being comprised in a two-dimensional grid mapped on a plane within the area. For instance, the area is a room of a building, the periphery is an interface of a ceiling and walls of the room, and the plane is a horizontal plane within the room.
    Type: Grant
    Filed: December 19, 2018
    Date of Patent: August 3, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
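    A sketch of mapping peripheral temperature readings onto a two-dimensional grid of zones; inverse-distance interpolation is used here only as a placeholder for the trained network, and the sensor positions, readings, and grid are illustrative.

      import math

      # Temperature readings at four peripheral locations (illustrative values).
      sensors = {(0, 0): 20.0, (0, 4): 21.5, (4, 0): 19.0, (4, 4): 22.0}

      def infer_zone_temperature(zone_x, zone_y):
          """Placeholder for the per-zone output of the neural network inference engine."""
          weights, total = 0.0, 0.0
          for (sx, sy), reading in sensors.items():
              d = math.hypot(zone_x - sx, zone_y - sy) or 1e-6
              weights += 1.0 / d
              total += reading / d
          return total / weights

      # 3x3 grid of zones on a horizontal plane within the area.
      for y in (1, 2, 3):
          print([round(infer_zone_temperature(x, y), 1) for x in (1, 2, 3)])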
  • Patent number: 11041644
    Abstract: Method and environment controller using a neural network for bypassing a legacy environment control software module. The environment controller receives at least one environmental characteristic value and determines a plurality of input variables. At least one of the plurality of input variables is based on one among the at least one environmental characteristic value. The environment controller transmits the plurality of input variables to an inference server executing a neural network inference engine. The environment controller receives at least one inferred output variable from the inference server. The environment controller uses the at least one inferred output variable received from the inference server in place of at least one output variable calculated by the legacy environment control software module based on the plurality of input variables.
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: June 22, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
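    A sketch of the bypass described above: the controller uses the inferred output when the inference server answers and otherwise falls back to the legacy module. The function names, gains, and the reachability flag are illustrative assumptions.

      SERVER_REACHABLE = True     # flip to False to exercise the legacy fallback

      def legacy_control_module(temperature, set_point):
          return max(0.0, min(100.0, (set_point - temperature) * 25.0))  # valve command, %

      def query_inference_server(temperature, set_point):
          """Stand-in for the remote neural network inference engine."""
          if not SERVER_REACHABLE:
              return None
          return max(0.0, min(100.0, (set_point - temperature) * 22.0))

      temperature, set_point = 19.0, 21.5
      inferred = query_inference_server(temperature, set_point)
      command = inferred if inferred is not None else legacy_control_module(temperature, set_point)
      print(f"valve command: {command:.1f}% ({'inferred' if inferred is not None else 'legacy'})")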
  • Patent number: 11037056
    Abstract: Computing device and method for inferring a predicted number of data chunks writable on a flash memory before the flash memory wears out. The computing device stores a predictive model generated by a neural network training engine. A processing unit of the computing device executes a neural network inference engine, using the predictive model for inferring the predicted number of data chunks writable on the flash memory before the flash memory wears out based on inputs. The inputs comprise a total number of physical blocks previously erased from the flash memory, a size of the data chunk, and optionally an operating temperature of the flash memory. In a particular aspect, the flash memory is comprised in the computing device, and an action may be taken for preserving a lifespan of the flash memory based at least on the predicted number of data chunks writable on the flash memory.
    Type: Grant
    Filed: November 21, 2017
    Date of Patent: June 15, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
  • Publication number: 20210141540
    Abstract: Computing device and method for inferring a predicted number of physical blocks erased from a flash memory. The computing device stores a predictive model generated by a neural network training engine. A processing unit of the computing device executes a neural network inference engine, using the predictive model for inferring the predicted number of physical blocks erased from the flash memory based on inputs. The inputs comprise a total number of physical blocks previously erased from the flash memory, an amount of data to be written on the flash memory, and optionally an operating temperature of the flash memory. In a particular aspect, the flash memory is comprised in the computing device, and an action may be taken for preserving a lifespan of the flash memory based at least on the predicted number of physical blocks erased from the flash memory.
    Type: Application
    Filed: January 25, 2021
    Publication date: May 13, 2021
    Inventor: Francois GERVAIS
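    A companion sketch to the flash-wear example above, this time predicting the erased-block count; the block size, write-amplification factors, and threshold are assumptions replacing the trained predictive model.

      BLOCK_SIZE_BYTES = 128 * 1024          # assumed physical block size

      def predict_blocks_erased(total_blocks_erased, bytes_to_write, temperature_c=None):
          """Stand-in for the neural network inference engine."""
          write_amplification = 1.6 if temperature_c is not None and temperature_c > 50 else 1.3
          new_erases = bytes_to_write * write_amplification / BLOCK_SIZE_BYTES
          return total_blocks_erased + int(new_erases)

      projected = predict_blocks_erased(total_blocks_erased=900_000,
                                        bytes_to_write=64 * 1024 * 1024)
      if projected > 1_000_000:              # possible action to preserve the lifespan
          print("defer non-critical writes")
      print(f"projected total of erased physical blocks: {projected}")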
  • Publication number: 20210116142
    Abstract: Computing device and method using a neural network to adjust temperature measurements. The computing device comprises a temperature sensing module, one or more processors and a display. The neural network receives as inputs a plurality of consecutive temperature measurements performed by the temperature sensing module, a plurality of consecutive utilization metrics of the one or more processors, and a plurality of consecutive utilization metrics of the display. The neural network outputs an inferred temperature, which is an adjustment of the temperature measured by the temperature sensing module that takes into consideration the heat dissipated by the one or more processors and the display when the temperature sensing module is used for measuring the temperature in an area where the computing device is deployed. An example of such a computing device is a smart thermostat. A corresponding method for training a neural network to adjust temperature measurements is also disclosed.
    Type: Application
    Filed: October 22, 2019
    Publication date: April 22, 2021
    Inventors: Jean-Simon BOUCHER, Francois GERVAIS
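    A sketch of the adjustment idea above with a hand-written linear correction in place of the trained network; the coefficients, window length, and sample values are assumptions.

      def adjusted_temperature(raw_temps, cpu_loads, display_loads):
          """Placeholder for the neural network that outputs the inferred temperature."""
          raw = sum(raw_temps) / len(raw_temps)
          # Assumed linear self-heating offset driven by CPU and display activity.
          self_heating = (0.8 * (sum(cpu_loads) / len(cpu_loads))
                          + 0.4 * (sum(display_loads) / len(display_loads)))
          return raw - self_heating

      # Consecutive samples from a thermostat-like device (illustrative values).
      print(adjusted_temperature(raw_temps=[23.4, 23.5, 23.6],
                                 cpu_loads=[0.6, 0.7, 0.65],
                                 display_loads=[1.0, 1.0, 1.0]))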
  • Publication number: 20210096519
    Abstract: Method and environment controller for inferring via a neural network one or more commands for controlling an appliance. A predictive model generated by a neural network training engine is stored by the environment controller. The environment controller determines at least one room characteristic. The environment controller receives at least one environmental characteristic value and at least one set point. The environment controller executes a neural network inference engine, which uses the predictive model for inferring the one or more commands for controlling the appliance. The inference is based on the at least one environmental characteristic value, the at least one set point and the at least one room characteristic. The environment controller transmits the one or more commands to the controlled appliance.
    Type: Application
    Filed: December 10, 2020
    Publication date: April 1, 2021
    Inventor: Francois GERVAIS
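    A sketch of the inference step described above, with a simple rule in place of the predictive model; the room characteristic (floor area), the set point, and the command format are illustrative assumptions.

      def infer_commands(current_temp, set_point, room_area_m2):
          """Placeholder for the controller's neural network inference engine."""
          error = set_point - current_temp
          fan_speed = min(100, int(abs(error) * 20 + room_area_m2 / 2))
          mode = "heat" if error > 0 else "cool" if error < 0 else "idle"
          return {"mode": mode, "fan_speed_pct": fan_speed}

      # The controller would transmit this command set to the controlled appliance.
      print(infer_commands(current_temp=20.0, set_point=22.5, room_area_m2=30.0))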
  • Publication number: 20210088987
    Abstract: Inference server and environment controller for inferring via a neural network one or more commands for controlling an appliance. The environment controller determines at least one room characteristic. The environment controller receives at least one environmental characteristic value and at least one set point. The environment controller transmits the at least one environmental characteristic, set point and room characteristic to the inference server. The inference server executes a neural network inference engine using a predictive model (generated by a neural network training engine) for inferring the one or more commands for controlling the appliance. The inference is based on the received at least one environmental characteristic value, at least one set point and at least one room characteristic. The inference server transmits the one or more commands to the environment controller, which forwards the one or more commands to the controlled appliance.
    Type: Application
    Filed: December 9, 2020
    Publication date: March 25, 2021
    Inventor: Francois GERVAIS
  • Patent number: 10956048
    Abstract: Computing device and method for inferring a predicted number of physical blocks erased from a flash memory. The computing device stores a predictive model generated by a neural network training engine. A processing unit of the computing device executes a neural network inference engine, using the predictive model for inferring the predicted number of physical blocks erased from the flash memory based on inputs. The inputs comprise a total number of physical blocks previously erased from the flash memory, an amount of data to be written on the flash memory, and optionally an operating temperature of the flash memory. In a particular aspect, the flash memory is comprised in the computing device, and an action may be taken for preserving a lifespan of the flash memory based at least on the predicted number of physical blocks erased from the flash memory.
    Type: Grant
    Filed: November 21, 2017
    Date of Patent: March 23, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
  • Publication number: 20210063041
    Abstract: Interactions between a training server and a plurality of environment controllers are used for updating the weights of a predictive model used by a neural network executed by the plurality of environment controllers. Each environment controller executes the neural network using a current version of the predictive model to generate outputs based on inputs, modifies the outputs, and generates metrics representative of the effectiveness of the modified outputs for controlling the environment. The training server collects the inputs, the corresponding modified outputs, and the corresponding metrics from the plurality of environment controllers. The collected inputs, modified outputs and metrics are used by the training server for updating the weights of the current predictive model through reinforcement learning. A new predictive model comprising the updated weights is transmitted to the environment controllers to be used in place of the current predictive model.
    Type: Application
    Filed: November 27, 2019
    Publication date: March 4, 2021
    Inventors: Steve LUPIEN, Francois GERVAIS
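    A high-level sketch of the collect/update/redistribute cycle described above. The "model" here is a single gain value and the update rule is a toy stand-in for reinforcement learning on real network weights; every value and function is an assumption.

      import random

      current_gain = 1.0                        # stands in for the predictive model's weights

      def controller_episode(gain):
          """One environment controller: infer an output, modify it, report a metric."""
          temp = random.uniform(18.0, 24.0)                 # measured input
          inferred = gain * (22.0 - temp)                   # output of the neural network
          modified = inferred * random.uniform(0.9, 1.1)    # local heuristic adjustment
          metric = -abs(22.0 - (temp + 0.5 * modified))     # effectiveness (higher is better)
          return temp, modified, metric

      # Training server side: collect episodes from many controllers, nudge the
      # weights toward the best-performing behaviour, then redistribute the model.
      episodes = [controller_episode(current_gain) for _ in range(50)]
      best_temp, best_out, _ = max(episodes, key=lambda e: e[2])
      estimate = best_out / (22.0 - best_temp) if abs(22.0 - best_temp) > 0.5 else current_gain
      current_gain = 0.9 * current_gain + 0.1 * max(-5.0, min(5.0, estimate))
      print(f"updated model (gain) pushed back to the controllers: {current_gain:.3f}")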
  • Publication number: 20210064968
    Abstract: Interactions between a training server and a plurality of environment controllers are used for updating the weights of a predictive model used by a neural network executed by the plurality of environment controllers. Each environment controller executes the neural network using a current version of the predictive model to generate outputs based on inputs, modifies the outputs, and generates metrics representative of the effectiveness of the modified outputs for controlling the environment. The training server collects the inputs, the corresponding modified outputs, and the corresponding metrics from the plurality of environment controllers. The collected inputs, modified outputs and metrics are used by the training server for updating the weights of the current predictive model through reinforcement learning. A new predictive model comprising the updated weights is transmitted to the environment controllers to be used in place of the current predictive model.
    Type: Application
    Filed: November 27, 2019
    Publication date: March 4, 2021
    Inventors: Steve LUPIEN, Francois GERVAIS
  • Publication number: 20210034967
    Abstract: Methods and environment controller for validating an estimated number of persons present in an area. The controller determines a temperature measurement in the area, a carbon dioxide (CO2) level measurement in the area, a humidity level measurement in the area, and the estimated number of persons present in the area based on data generated by an occupancy sensor. The controller executes a neural network inference engine for generating outputs based on inputs, using a predictive model comprising weights of a neural network. The inputs include the temperature, CO2 level and humidity level measurements, and the estimated number of persons. The outputs include an inferred temperature, an inferred CO2 level, an inferred humidity level, and an inferred number of persons. The controller applies a validation algorithm to the inputs and outputs of the neural network inference engine to determine whether the estimated number of persons is accurate.
    Type: Application
    Filed: November 27, 2019
    Publication date: February 4, 2021
    Inventors: Patrice SOUCY, Francois GERVAIS
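    A sketch of the validation idea above; the reconstruction-style "network" (a CO2-per-person rule), the measured values, and the acceptance threshold are assumptions, not the patented algorithm.

      def reconstruct(temp, co2_ppm, humidity, persons):
          """Placeholder for the neural network that re-infers all four quantities."""
          inferred_persons = max(0.0, (co2_ppm - 400.0) / 150.0)   # assumed CO2 rise per person
          return temp, 400.0 + 150.0 * inferred_persons, humidity, inferred_persons

      measured = (22.1, 1000.0, 41.0, 7)      # temperature, CO2, humidity, occupancy estimate
      inferred = reconstruct(*measured)

      # Validation: the occupancy estimate is trusted only if the re-inferred
      # person count stays close to the sensor-based estimate.
      accurate = abs(measured[3] - inferred[3]) <= 1.5
      print(f"occupancy estimate of {measured[3]} persons accurate: {accurate}")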
  • Patent number: 10908561
    Abstract: Method and environment controller for inferring via a neural network one or more commands for controlling an appliance. A predictive model generated by a neural network training engine is stored by the environment controller. The environment controller determines at least one room characteristic. The environment controller receives at least one environmental characteristic value and at least one set point. The environment controller executes a neural network inference engine, which uses the predictive model for inferring the one or more commands for controlling the appliance. The inference is based on the at least one environmental characteristic value, the at least one set point and the at least one room characteristic. The environment controller transmits the one or more commands to the controlled appliance.
    Type: Grant
    Filed: December 12, 2017
    Date of Patent: February 2, 2021
    Assignee: DISTECH CONTROLS INC.
    Inventor: Francois Gervais
  • Publication number: 20210026313
    Abstract: Method and environment controller for inferring via a neural network one or more commands for controlling an appliance. A predictive model generated by a neural network training engine is stored by the environment controller. The environment controller receives at least one environmental characteristic value (for example, at least one of a current temperature, current humidity level, current carbon dioxide level, and current room occupancy). The environment controller receives at least one set point (for example, at least one of a target temperature, target humidity level, and target carbon dioxide level). The environment controller executes a neural network inference engine, which uses the predictive model for inferring the one or more commands for controlling the appliance based on the at least one environmental characteristic value and the at least one set point. The environment controller transmits the one or more commands to the controlled appliance.
    Type: Application
    Filed: October 13, 2020
    Publication date: January 28, 2021
    Inventors: Francois GERVAIS, Carlo MASCIOVECCHIO, Dominique LAPLANTE
  • Publication number: 20210026312
    Abstract: Inference server and environment controller for inferring one or more commands for controlling an appliance. The environment controller receives at least one environmental characteristic value (for example, at least one of a current temperature, current humidity level, current carbon dioxide level, and current room occupancy) and at least one set point (for example, at least one of a target temperature, target humidity level, and target carbon dioxide level); and forwards them to the inference server. The inference server executes a neural network inference engine using a predictive model (generated by a neural network training engine) for inferring the one or more commands based on the received at least one environmental characteristic value and the received at least one set point; and transmits the one or more commands to the environment controller. The environment controller forwards the one or more commands to the controlled appliance.
    Type: Application
    Filed: October 9, 2020
    Publication date: January 28, 2021
    Inventors: Francois GERVAIS, Carlo MASCIOVECCHIO, Dominique LAPLANTE