Patents by Inventor Owen Jones

Owen Jones has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240144006
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: January 8, 2024
    Publication date: May 2, 2024
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
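The encoder self-attention sub-layer this abstract describes (queries derived from each position's subnetwork input, attended over all input positions) can be sketched as scaled dot-product attention. This is a minimal NumPy illustration, not the patented implementation; all names, dimensions, and weight matrices are illustrative.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head encoder self-attention: for each input position, a query
    derived from that position attends over all input positions."""
    q = x @ w_q                                   # one query per position
    k = x @ w_k                                   # keys
    v = x @ w_v                                   # values
    scores = q @ k.T / np.sqrt(k.shape[-1])       # scaled dot products
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ v                            # weighted sum of values

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # one output per input position: (4, 8)
```

Each row of the attention-weight matrix sums to one, so every position's output is a convex combination of the value vectors across the whole sequence.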
  • Publication number: 20240109011
    Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or natural fibers and microfibrillated cellulose fibers.
    Type: Application
    Filed: October 12, 2023
    Publication date: April 4, 2024
    Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
  • Patent number: 11893483
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: February 6, 2024
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20240020491
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 18, 2024
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
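The multi-headed attention module in this abstract generates multiple attention context vectors for each encoding vector. A common way to realize that (assumed here, not taken from the patent text) is to split the model dimension across heads and let each head attend independently; everything below is an illustrative sketch.

```python
import numpy as np

def multi_head_context(encodings, query, num_heads=4):
    """Split the model dimension across heads; each head computes its own
    attention context vector over the encoder outputs (illustrative only)."""
    d_model = encodings.shape[-1]
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    contexts = []
    for h in range(num_heads):
        sl = slice(h * d_head, (h + 1) * d_head)
        q, k, v = query[sl], encodings[:, sl], encodings[:, sl]
        scores = k @ q / np.sqrt(d_head)      # one score per source position
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()              # softmax over source positions
        contexts.append(weights @ v)          # per-head context vector
    return contexts                           # num_heads vectors of size d_head

rng = np.random.default_rng(1)
enc = rng.normal(size=(6, 16))   # 6 source positions, model dim 16
q = rng.normal(size=16)
ctx = multi_head_context(enc, q)
print(len(ctx), ctx[0].shape)    # 4 heads, each a (4,) context vector
```

The decoder would consume the concatenation of these per-head contexts when producing each output vector.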
  • Patent number: 11819788
    Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or fiberglass fibers and microfibrillated cellulose fibers.
    Type: Grant
    Filed: December 23, 2020
    Date of Patent: November 21, 2023
    Assignee: Donaldson Company, Inc.
    Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
  • Patent number: 11809834
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Grant
    Filed: August 27, 2021
    Date of Patent: November 7, 2023
    Assignee: Google LLC
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Patent number: 11725788
    Abstract: Implementations are described herein for an adjustable recessed lighting apparatus (100) with a rotation ring (110). In various embodiments, a base (101) may be mounted to a surface and includes a light passage that generally directs light in a first direction (FD). The rotation ring (110) may be rotatably mounted to the base (101) such that the rotation ring (110) is rotatable about the light passage. At least one light source (140) may be mounted within the apparatus (100) to emit light through the light passage in a second direction (SD). A first drive (112) and a second drive (114) may be fixedly secured to the rotation ring (110). Accordingly, when torque is applied to the first drive (112), the rotation ring (110) may rotate relative to the base (101) about the light passage.
    Type: Grant
    Filed: June 8, 2020
    Date of Patent: August 15, 2023
    Assignee: SIGNIFY HOLDING B.V.
    Inventor: Mark Owen Jones
  • Publication number: 20220412983
    Abstract: A compound wherein X represents one of a nitrogen atom, an oxygen atom, a sulphur atom, a phosphorus atom, or a selenium atom; R represents an aromatic group and/or an aliphatic group; p is an integer of 1 or 2; q and s are independently integers of 1, 2, 3, or 4; Y1, Y2, and Y3 independently comprise, consist of, or represent a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a substituted or unsubstituted alkyl group, a substituted or unsubstituted aryl group, a polyether chain, a polyglycol group, an oxygen atom, a nitrogen atom, a cyano group, or a nitro group; two or more of Y1, Y2, and/or Y3 may combine together to form a condensed ring; wherein one or more of Y1, Y2, and/or Y3 comprises a spacing portion comprising a continuous chain of between 3 and 20 atoms, and further comprising a functional group capable of forming a covalent bond with a second species, the functional group being selected from one or more of a carboxylic acid, an ester, an azide, an amine, a maleimide, a thiol, an isothiocyanate.

    Type: Application
    Filed: September 24, 2020
    Publication date: December 29, 2022
    Inventors: Jon PREECE, Alex ROBINSON, Owen JONES, Michael BUTLIN, Zania STAMATAKI
  • Publication number: 20220357332
    Abstract: A composition for imaging a biological tissue or fluid comprising a compound of formula (A) and a biologically acceptable diluent or carrier, (A) wherein X represents one of a nitrogen atom, an oxygen atom, a sulphur atom, a phosphorus atom, or a selenium atom; R represents an aromatic group and/or an aliphatic group; p is an integer of 1 to 2; q and s are independently integers of 1, 2, 3, or 4; Y1, Y2, and Y3 independently represent a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a substituted or unsubstituted alkyl group, a substituted or unsubstituted aryl group, a polyglycol group, an oxygen atom, a nitrogen atom, a cyano group, a nitro group; and/or wherein two or more of Y1, Y2, and Y3 may combine together to form a condensed ring.
    Type: Application
    Filed: September 24, 2020
    Publication date: November 10, 2022
    Inventors: Jon PREECE, Alex ROBINSON, Owen JONES, Michael BUTLIN, Parvez IQBAL, Sareena SUND
  • Patent number: 11494561
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for training a machine learning model to perform multiple machine learning tasks from multiple machine learning domains. One system includes a machine learning model that includes multiple input modality neural networks corresponding to respective different modalities and being configured to map received data inputs of the corresponding modality to mapped data inputs from a unified representation space; an encoder neural network configured to process mapped data inputs from the unified representation space to generate respective encoder data outputs; a decoder neural network configured to process encoder data outputs to generate respective decoder data outputs from the unified representation space; and multiple output modality neural networks corresponding to respective different modalities and being configured to map decoder data outputs to data outputs of the corresponding modality.
    Type: Grant
    Filed: August 4, 2020
    Date of Patent: November 8, 2022
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Ashish Teku Vaswani
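The architecture in this abstract routes every modality through a shared body: per-modality input networks map into a unified representation space, a single encoder/decoder pair operates there, and per-modality output networks map back out. A toy sketch of that data flow, using plain linear maps as stand-ins for the neural networks (all sizes and names are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
D_UNIFIED = 8  # size of the shared representation space (illustrative)

# Per-modality input networks: map modality-specific features into the
# unified representation space (random linear maps stand in for networks).
input_nets = {
    "text":  rng.normal(size=(5, D_UNIFIED)),   # e.g. 5-dim token features
    "image": rng.normal(size=(12, D_UNIFIED)),  # e.g. 12-dim patch features
}
# Shared encoder/decoder operating only on unified-space vectors.
encoder = rng.normal(size=(D_UNIFIED, D_UNIFIED))
decoder = rng.normal(size=(D_UNIFIED, D_UNIFIED))
# Per-modality output networks: map decoder outputs back to a modality.
output_nets = {"text": rng.normal(size=(D_UNIFIED, 5))}

def run(modality_in, x, modality_out):
    unified = x @ input_nets[modality_in]       # into the unified space
    decoded = (unified @ encoder) @ decoder     # shared body, any modality
    return decoded @ output_nets[modality_out]  # out of the unified space

y = run("image", rng.normal(size=(3, 12)), "text")
print(y.shape)  # (3, 5): image-domain input, text-domain output
```

Because the encoder and decoder only ever see unified-space vectors, the same shared body serves every input/output modality pairing.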
  • Publication number: 20220299176
    Abstract: Implementations are described herein for an adjustable recessed lighting apparatus (100) with a rotation ring (110). In various embodiments, a base (101) may be mounted to a surface and includes a light passage that generally directs light in a first direction (FD). The rotation ring (110) may be rotatably mounted to the base (101) such that the rotation ring (110) is rotatable about the light passage. At least one light source (140) may be mounted within the apparatus (100) to emit light through the light passage in a second direction (SD). A first drive (112) and a second drive (114) may be fixedly secured to the rotation ring (110). Accordingly, when torque is applied to the first drive (112), the rotation ring (110) may rotate relative to the base (101) about the light passage.
    Type: Application
    Filed: June 8, 2020
    Publication date: September 22, 2022
    Inventor: Mark Owen Jones
  • Publication number: 20220083746
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Application
    Filed: August 27, 2021
    Publication date: March 17, 2022
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Publication number: 20220051099
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: September 3, 2021
    Publication date: February 17, 2022
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Patent number: 11175023
    Abstract: A mounting bracket (20) for a luminaire fixture frame (12) having one or more tabs (30) to engage a hat channel or the like. The one or more tabs (30) are positionable between an un-deployed position and a deployed position to operably engage the hat channel.
    Type: Grant
    Filed: July 28, 2017
    Date of Patent: November 16, 2021
    Assignee: SIGNIFY HOLDING B.V.
    Inventor: Mark Owen Jones
  • Patent number: 11138392
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Grant
    Filed: July 25, 2019
    Date of Patent: October 5, 2021
    Assignee: Google LLC
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Patent number: 11113602
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: July 17, 2020
    Date of Patent: September 7, 2021
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20210106934
    Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or fiberglass fibers and microfibrillated cellulose fibers.
    Type: Application
    Filed: December 23, 2020
    Publication date: April 15, 2021
    Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
  • Patent number: 10956819
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: March 23, 2021
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20210070720
    Abstract: Polycyclic aromatic hydrocarbons represented by the following general formula (I) wherein X is one of nitrogen, phosphorus, arsenic, antimony, bismuth, sulphur, selenium, tellurium; R independently represents an aromatic group and/or an aliphatic group; Q is one of a cyclic aliphatic hydrocarbon, a cyclic aromatic hydrocarbon, a polycyclic hydrocarbon, a polycyclic aromatic hydrocarbon, and/or a fused polycyclic aromatic hydrocarbon; wherein the substituents independently comprise one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g. an alkylated nitrogen atom), a cyano group, a nitro group, an alkyl group and/or an aryl group; p is an integer of 1 to 2; q is an integer of 1 to 4; Y1 and Y2 independently represent one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g.
    Type: Application
    Filed: March 21, 2019
    Publication date: March 11, 2021
    Inventors: Alex ROBINSON, Jon PREECE, Gregory O'CALLAGHAN, Karolis VIRZBICKAS, Owen JONES, Dennis ZHAO, Michael BUTLIN, Sareena SUND
  • Patent number: 10938357
    Abstract: Embodiments provide an audio amplifier circuit with integrated (built-in) filter (e.g., a digital-to-analog converter (DAC) filter). The audio amplifier circuit may have a non-flat (e.g., low-pass) closed loop frequency response. The audio amplifier circuit may include a low pass filter coupled between an input terminal that receives the input analog audio signal and the input of the gain stage of the amplifier. In some embodiments, additional impedance networks may be included to produce a desired low-pass filter response, such as a second order filter, a third order filter, and/or another suitable filter response. Other embodiments may be described and/or claimed.
    Type: Grant
    Filed: September 26, 2019
    Date of Patent: March 2, 2021
    Assignee: THX LTD.
    Inventor: Owen Jones
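The amplifier abstract above describes a deliberately non-flat (low-pass) closed-loop frequency response, optionally second or third order. As a rough numerical illustration of what a second-order low-pass magnitude response looks like, here is a sketch with an assumed Butterworth-style shape and an assumed corner frequency; neither value comes from the patent.

```python
import math

def second_order_lowpass_mag(f, f_c):
    """Magnitude of an ideal second-order (Butterworth-style) low-pass
    response with corner frequency f_c; a stand-in for the non-flat
    closed-loop response the abstract describes."""
    return 1.0 / math.sqrt(1.0 + (f / f_c) ** 4)

f_c = 40_000.0  # corner placed above the audio band (assumed value)
for f in (1_000.0, 40_000.0, 400_000.0):
    db = 20.0 * math.log10(second_order_lowpass_mag(f, f_c))
    print(f"{f / 1000:>6.0f} kHz: {db:7.2f} dB")
```

In-band frequencies pass nearly unattenuated, the response is 3 dB down at the corner, and it then rolls off at roughly 40 dB per decade, which is the filtering a DAC reconstruction stage needs.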