Patents by Inventor Owen Jones

Owen Jones has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250230935
    Abstract: The present invention relates to heating panels for underfloor heating, heated flooring elements, and a heating system comprising one or more heating panels. Methods for manufacturing said heating panels are also provided. The heating panels comprise a conductive layer of graphene particles dispersed in a polymer matrix material, wherein the graphene particles have an oxygen content of less than 4 at % and a nitrogen content of at least 3 at %. The heated flooring element comprises one or more heating panels in contact with, and optionally adhered to, at least a portion of a flooring layer.
    Type: Application
    Filed: October 20, 2022
    Publication date: July 17, 2025
    Inventors: John-Mark Seymour, Thomas Harry Howe, Elliot Owen Jones
  • Publication number: 20250217644
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: December 30, 2024
    Publication date: July 3, 2025
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
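
The encoder self-attention sub-layer recited in the abstract above derives, for each input position, a query from the encoder subnetwork input at that position and applies an attention mechanism over all of the encoder subnetwork inputs. A minimal sketch of that operation in Python/NumPy, assuming single-head scaled dot-product attention; the function and parameter names (`encoder_self_attention`, `w_q`, `w_k`, `w_v`) are illustrative, not taken from the patent:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def encoder_self_attention(x, w_q, w_k, w_v):
    """For each input position, derive a query from the subnetwork input
    at that position and attend over the inputs at all positions."""
    q = x @ w_q                               # queries, one per input position
    k = x @ w_k                               # keys derived from the same inputs
    v = x @ w_v                               # values derived from the same inputs
    scores = q @ k.T / np.sqrt(k.shape[-1])   # scaled dot-product compatibility
    return softmax(scores) @ v                # attention-weighted sum of values

# Toy usage: 5 input positions, model width 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = encoder_self_attention(x, w_q, w_k, w_v)  # shape (5, 8)
```

The same abstract recurs in several related filings below (patent 12217173, publications 20240144006 and 20220051099, and patents 11893483 and 11113602), so the sketch is not repeated for those entries.
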
  • Publication number: 20250126683
    Abstract: The present invention relates to heating pads, heatable garments, fabrics for making such garments and methods for making such heating pads and garments and fabrics. Also provided is heatable bedding. The heating pad comprises graphene particles dispersed in a polymer matrix material, wherein the graphene particles have an oxygen content of less than 4 at % and a nitrogen content of at least 3 at %. The heatable garment comprises a garment body and a heating pad adhered to at least a portion of the garment body.
    Type: Application
    Filed: October 20, 2022
    Publication date: April 17, 2025
    Inventors: Thomas Harry Howe, Elliot Owen Jones
  • Publication number: 20250079777
    Abstract: A connector stud includes a head component. The head component includes a head portion and a neck portion extending from a first side of the head portion. The head portion has a regular polygonal shape. The head component comprises an attachment slot defined in a second side of the head portion. The second side is opposed to the first side. The attachment slot extends radially into the head portion. The attachment slot is configured to receive a protrusion of a connector dock. The head component includes a detent arrangement configured to inhibit movement of a protrusion of a connector dock along the attachment slot.
    Type: Application
    Filed: August 30, 2024
    Publication date: March 6, 2025
    Inventor: Owen Jones
  • Patent number: 12217173
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: September 3, 2021
    Date of Patent: February 4, 2025
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Patent number: 12162847
    Abstract: Polycyclic aromatic hydrocarbons represented by the following general formula (I) wherein X is one of nitrogen, phosphorus, arsenic, antimony, bismuth, sulphur, selenium, tellurium; R independently represents an aromatic group and/or an aliphatic group; Q is one of a cyclic aliphatic hydrocarbon, a cyclic aromatic hydrocarbon, a polycyclic hydrocarbon, a polycyclic aromatic hydrocarbon, and/or a fused polycyclic aromatic hydrocarbon; wherein the substituents independently comprise one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g. an alkylated nitrogen atom), a cyano group, a nitro group, an alkyl group and/or an aryl group; p is an integer of 1 to 2; q is an integer of 1 to 4; Y1 and Y2 independently represent one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g.
    Type: Grant
    Filed: March 21, 2019
    Date of Patent: December 10, 2024
    Assignee: CHROMATWIST LIMITED
    Inventors: Alex Robinson, Jon Andrew Preece, Gregory O'Callaghan, Karolis Virzbickas, Owen Jones, Dennis Zhao, Michael Butlin, Sareena Sund
  • Patent number: 12041850
    Abstract: Polycyclic aromatic hydrocarbon derivatives represented by the following general formula (I) wherein R independently represents an aromatic group and/or an aliphatic group; Q is one of a cyclic aliphatic hydrocarbon, a cyclic aromatic hydrocarbon, a polycyclic hydrocarbon, a polycyclic aromatic hydrocarbon, and/or a fused polycyclic aromatic hydrocarbon; wherein the substituents independently comprise one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g. an alkylated nitrogen atom), a cyano group, a nitro group, an alkyl group and/or an aryl group; p is an integer of 1 to 2; q is an integer of 1 to 4; Y1 and Y2 independently represent one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g.
    Type: Grant
    Filed: March 21, 2019
    Date of Patent: July 16, 2024
    Assignee: CHROMATWIST LIMITED
    Inventors: Alex Robinson, Jon Preece, Gregory O'Callaghan, Karolis Virzbickas, Owen Jones, Dennis Zhao, Michael Butlin, Sareena Sund
  • Publication number: 20240144006
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: January 8, 2024
    Publication date: May 2, 2024
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20240109011
    Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or natural fibers and microfibrillated cellulose fibers.
    Type: Application
    Filed: October 12, 2023
    Publication date: April 4, 2024
    Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
  • Patent number: 11893483
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: February 6, 2024
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Publication number: 20240020491
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 18, 2024
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
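
The multi-headed attention module recited in the abstract above produces multiple attention context vectors for each encoding vector, which the decoder then uses to generate its output vectors. A minimal sketch in Python/NumPy of the per-head context computation for a single decoder step; a plain dot-product score stands in for whatever scoring function the patent claims, and all names (`multi_head_context`, `heads`) are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_context(decoder_state, encodings, heads):
    """Compute one attention context vector per head for a single
    decoder step, then concatenate the per-head contexts."""
    contexts = []
    for w_q, w_k in heads:                      # one projection pair per head
        q = decoder_state @ w_q                 # project the decoder state
        k = encodings @ w_k                     # project the source encodings
        weights = softmax(k @ q)                # attention over source positions
        contexts.append(weights @ encodings)    # weighted context vector
    return np.concatenate(contexts)             # multiple contexts, combined

# Toy usage: 6 source positions, width 8, 2 attention heads.
rng = np.random.default_rng(1)
enc = rng.normal(size=(6, 8))
state = rng.normal(size=(8,))
heads = [(rng.normal(size=(8, 8)), rng.normal(size=(8, 8))) for _ in range(2)]
ctx = multi_head_context(state, enc, heads)     # shape (16,)
```

The same abstract recurs in patent 11809834, publication 20220083746, and patent 11138392 below, so the sketch is not repeated for those entries.
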
  • Patent number: 11819788
    Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or fiberglass fibers and microfibrillated cellulose fibers.
    Type: Grant
    Filed: December 23, 2020
    Date of Patent: November 21, 2023
    Assignee: Donaldson Company, Inc.
    Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
  • Patent number: 11809834
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Grant
    Filed: August 27, 2021
    Date of Patent: November 7, 2023
    Assignee: Google LLC
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Patent number: 11725788
    Abstract: Implementations are described herein for an adjustable recessed lighting apparatus (100) with a rotation ring (110). In various embodiments, a base (101) may be mounted to a surface and includes a light passage that generally directs light in a first direction (FD). The rotation ring (110) may be rotatably mounted to the base (101) such that the rotation ring (110) is rotatable about the light passage. At least one light source (140) may be mounted within the apparatus (100) to emit light through the light passage in a second direction (SD). A first drive (112) and a second drive (114) may be fixedly secured to the rotation ring (110). Accordingly, when torque is applied to the first drive (112), the rotation ring (110) may rotate relative to the base (101) about the light passage.
    Type: Grant
    Filed: June 8, 2020
    Date of Patent: August 15, 2023
    Assignee: SIGNIFY HOLDING B.V.
    Inventor: Mark Owen Jones
  • Patent number: 11494561
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media for training a machine learning model to perform multiple machine learning tasks from multiple machine learning domains. One system includes a machine learning model that includes multiple input modality neural networks corresponding to respective different modalities and being configured to map received data inputs of the corresponding modality to mapped data inputs from a unified representation space; an encoder neural network configured to process mapped data inputs from the unified representation space to generate respective encoder data outputs; a decoder neural network configured to process encoder data outputs to generate respective decoder data outputs from the unified representation space; and multiple output modality neural networks corresponding to respective different modalities and being configured to map decoder data outputs to data outputs of the corresponding modality.
    Type: Grant
    Filed: August 4, 2020
    Date of Patent: November 8, 2022
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Ashish Teku Vaswani
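
The system recited in the abstract above shares one encoder-decoder body across tasks by mapping every modality into, and out of, a unified representation space through small per-modality networks. A schematic sketch in Python/NumPy under assumed dimensions; the linear modality nets and identity body below are stand-ins for illustration, not the patent's actual networks:

```python
import numpy as np

D = 8  # assumed width of the unified representation space

def make_linear(in_dim, out_dim, rng):
    """Return a linear map standing in for a modality network."""
    w = rng.normal(size=(in_dim, out_dim))
    return lambda x: x @ w

rng = np.random.default_rng(2)
input_modality_nets = {
    "text":  make_linear(16, D, rng),   # e.g. token features -> unified space
    "image": make_linear(32, D, rng),   # e.g. patch features -> unified space
}
output_modality_nets = {
    "text":  make_linear(D, 16, rng),   # unified space -> text-domain outputs
}

def encoder(x):   # shared body; identity stand-in for brevity
    return x

def decoder(x):   # shared body; identity stand-in for brevity
    return x

def run(task_input, in_modality, out_modality):
    unified = input_modality_nets[in_modality](task_input)  # into shared space
    decoded = decoder(encoder(unified))                     # one body, all tasks
    return output_modality_nets[out_modality](decoded)      # back out of it

# Toy usage resembling image captioning: image features in, text-domain output.
out = run(rng.normal(size=(5, 32)), "image", "text")        # shape (5, 16)
```

The design point is that only the thin modality nets are task-specific; the encoder and decoder parameters are shared by every machine learning domain the model handles.
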
  • Publication number: 20220083746
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Application
    Filed: August 27, 2021
    Publication date: March 17, 2022
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Publication number: 20220051099
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Application
    Filed: September 3, 2021
    Publication date: February 17, 2022
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
  • Patent number: 11175023
    Abstract: A mounting bracket (20) for a luminaire fixture frame (12) having one or more tabs (30) to engage a hat channel or the like. The one or more tabs (30) are positionable between an un-deployed position and a deployed position to operably engage the hat channel.
    Type: Grant
    Filed: July 28, 2017
    Date of Patent: November 16, 2021
    Assignee: SIGNIFY HOLDING B.V.
    Inventor: Mark Owen Jones
  • Patent number: 11138392
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
    Type: Grant
    Filed: July 25, 2019
    Date of Patent: October 5, 2021
    Assignee: Google LLC
    Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
  • Patent number: 11113602
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
    Type: Grant
    Filed: July 17, 2020
    Date of Patent: September 7, 2021
    Assignee: Google LLC
    Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani