Patents by Inventor Owen Jones
Owen Jones has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20250230935
  Abstract: The present invention relates to heating panels for underfloor heating, heated flooring elements, and a heating system comprising one or more heating panels. Methods for manufacturing said heating panels are also provided. The heating panels comprise a conductive layer of graphene particles dispersed in a polymer matrix material, wherein the graphene particles have an oxygen content of less than 4 at % and a nitrogen content of at least 3 at %. The heated flooring element comprises one or more heating panels in contact with, and optionally adhered to, at least a portion of a flooring layer.
  Type: Application
  Filed: October 20, 2022
  Publication date: July 17, 2025
  Inventors: John-Mark Seymour, Thomas Harry Howe, Elliot Owen Jones
- Publication number: 20250217644
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Application
  Filed: December 30, 2024
  Publication date: July 3, 2025
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
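The abstract above (shared by several related filings below) describes the encoder self-attention sub-layer of the Transformer architecture: queries derived from each input position attend over the inputs at every position. As a rough illustration only, not the patented implementation, here is a minimal numpy sketch of single-head scaled dot-product self-attention; all names, dimensions, and the single-head simplification are assumptions made for the sketch.

```python
import numpy as np

def encoder_self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention.

    For each input position, a query derived from that position's
    input attends over the inputs at every position, as in the
    abstract above. x has shape (seq_len, d_model); the weight
    matrices project inputs to queries, keys, and values.
    """
    q = x @ w_q                      # one query per input position
    k = x @ w_k                      # keys
    v = x @ w_v                      # values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # compatibility of each query with every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ v               # weighted sum of values per position

# Illustrative usage with random inputs and weights.
rng = np.random.default_rng(0)
seq_len, d_model = 5, 16
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = encoder_self_attention(x, w_q, w_k, w_v)  # shape (5, 16)
```

The patented systems stack several such sub-layers (with multiple heads, residual connections, and feed-forward sub-layers) inside each encoder subnetwork; this sketch shows only the attention step itself.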
- Publication number: 20250126683
  Abstract: The present invention relates to heating pads, heatable garments, fabrics for making such garments and methods for making such heating pads and garments and fabrics. Also provided is heatable bedding. The heating pad comprises graphene particles dispersed in a polymer matrix material, wherein the graphene particles have an oxygen content of less than 4 at % and a nitrogen content of at least 3 at %. The heatable garment comprises a garment body and a heating pad adhered to at least a portion of the garment body.
  Type: Application
  Filed: October 20, 2022
  Publication date: April 17, 2025
  Inventors: Thomas Harry Howe, Elliot Owen Jones
- Publication number: 20250079777
  Abstract: A connector stud includes a head component. The head component includes a head portion and a neck portion extending from a first side of the head portion. The head portion has a regular polygonal shape. The head component comprises an attachment slot defined in a second side of the head portion. The second side is opposed to the first side. The attachment slot extends radially into the head portion. The attachment slot is configured to receive a protrusion of a connector dock. The head component includes a detent arrangement configured to inhibit movement of a protrusion of a connector dock along the attachment slot.
  Type: Application
  Filed: August 30, 2024
  Publication date: March 6, 2025
  Inventor: Owen Jones
- Patent number: 12217173
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Grant
  Filed: September 3, 2021
  Date of Patent: February 4, 2025
  Assignee: Google LLC
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
- Patent number: 12162847
  Abstract: Polycyclic aromatic hydrocarbons represented by the following general formula (I) wherein X is one of nitrogen, phosphorus, arsenic, antimony, bismuth, sulphur, selenium, tellurium; R independently represents an aromatic group and/or an aliphatic group; Q is one of a cyclic aliphatic hydrocarbon, a cyclic aromatic hydrocarbon, a polycyclic hydrocarbon, a polycyclic aromatic hydrocarbon, and/or a fused polycyclic aromatic hydrocarbon; wherein the substituents independently comprise one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g. an alkylated nitrogen atom), a cyano group, a nitro group, an alkyl group and/or an aryl group; p is an integer of 1 to 2; q is an integer of 1 to 4; Y1 and Y2 independently represent one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g.
  Type: Grant
  Filed: March 21, 2019
  Date of Patent: December 10, 2024
  Assignee: CHROMATWIST LIMITED
  Inventors: Alex Robinson, Jon Andrew Preece, Gregory O'Callaghan, Karolis Virzbickas, Owen Jones, Dennis Zhao, Michael Butlin, Sareena Sund
- Patent number: 12041850
  Abstract: Polycyclic aromatic hydrocarbon derivatives represented by the following general formula: (I) wherein R independently represents an aromatic group and/or an aliphatic group; Q is one of a cyclic aliphatic hydrocarbon, a cyclic aromatic hydrocarbon, a polycyclic hydrocarbon, a polycyclic aromatic hydrocarbon, and/or a fused polycyclic aromatic hydrocarbon; wherein the substituents independently comprise one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g. an alkylated nitrogen atom), a cyano group, a nitro group, an alkyl group and/or an aryl group; p is an integer of 1 to 2; q is an integer of 1 to 4; Y1 and Y2 independently represent one or more of a hydrogen atom, a deuterium atom, a fluorine atom, a chlorine atom, a bromine atom, a carbon atom, an oxygen atom (e.g. an alkylated oxygen atom), a nitrogen atom (e.g.
  Type: Grant
  Filed: March 21, 2019
  Date of Patent: July 16, 2024
  Assignee: CHROMATWIST LIMITED
  Inventors: Alex Robinson, Jon Preece, Gregory O'Callaghan, Karolis Virzbickas, Owen Jones, Dennis Zhao, Michael Butlin, Sareena Sund
- Publication number: 20240144006
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Application
  Filed: January 8, 2024
  Publication date: May 2, 2024
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
- Publication number: 20240109011
  Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or natural fibers and microfibrillated cellulose fibers.
  Type: Application
  Filed: October 12, 2023
  Publication date: April 4, 2024
  Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
- Patent number: 11893483
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Grant
  Filed: August 7, 2020
  Date of Patent: February 6, 2024
  Assignee: Google LLC
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
- Publication number: 20240020491
  Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
  Type: Application
  Filed: September 28, 2023
  Publication date: January 18, 2024
  Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
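The abstract above (shared by the related grants and applications below) centers on a multi-headed attention module that produces multiple attention context vectors over the encoder outputs. As a hedged illustration only, here is a numpy sketch of such a module for a single decoder step; the surrounding bidirectional-RNN encoder and decoder networks are omitted, and every name, shape, and the head count are assumptions made for the sketch, not the patented design.

```python
import numpy as np

def multi_head_context(decoder_state, encodings, w_q, w_k, w_v, num_heads=4):
    """Sketch of a multi-headed attention module: for one decoder state,
    compute one attention context vector per head over the encoder
    outputs, then concatenate the heads.

    decoder_state: (d_model,); encodings: (src_len, d_model);
    w_q, w_k, w_v: (d_model, d_model) projection matrices.
    """
    d_model = decoder_state.shape[0]
    d_head = d_model // num_heads
    contexts = []
    for h in range(num_heads):
        # Each head uses its own slice of the projection matrices.
        q = decoder_state @ w_q[:, h * d_head:(h + 1) * d_head]
        k = encodings @ w_k[:, h * d_head:(h + 1) * d_head]
        v = encodings @ w_v[:, h * d_head:(h + 1) * d_head]
        scores = k @ q / np.sqrt(d_head)   # (src_len,) per-position scores
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()           # softmax over source positions
        contexts.append(weights @ v)       # this head's context vector
    return np.concatenate(contexts)        # (d_model,)

# Illustrative usage with random encoder outputs and decoder state.
rng = np.random.default_rng(0)
d_model, src_len = 16, 7
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
ctx = multi_head_context(rng.normal(size=d_model),
                         rng.normal(size=(src_len, d_model)),
                         w_q, w_k, w_v)    # shape (16,)
```

In the described systems, the decoder would consume such a context vector at each step to produce its distribution over target-language elements.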
- Patent number: 11819788
  Abstract: A nonwoven filtration medium that includes a fibrous base media including synthetic and/or fiberglass fibers and microfibrillated cellulose fibers.
  Type: Grant
  Filed: December 23, 2020
  Date of Patent: November 21, 2023
  Assignee: Donaldson Company, Inc.
  Inventors: Janelle M. Hampton, Derek Owen Jones, Suresh Laxman Shenoy
- Patent number: 11809834
  Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
  Type: Grant
  Filed: August 27, 2021
  Date of Patent: November 7, 2023
  Assignee: Google LLC
  Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
- Patent number: 11725788
  Abstract: Implementations are described herein for an adjustable recessed lighting apparatus (100) with a rotation ring (110). In various embodiments, a base (101) may be mounted to a surface and includes a light passage that generally directs light in a first direction (FD). The rotation ring (110) may be rotatably mounted to the base (101) such that the rotation ring (110) is rotatable about the light passage. At least one light source (140) may be mounted within the apparatus (100) to emit light through the light passage in a second direction (SD). A first drive (112) and a second drive (114) may be fixedly secured to the rotation ring (110). Accordingly, when torque is applied to the first drive (112), the rotation ring (110) may rotate relative to the base (101) about the light passage.
  Type: Grant
  Filed: June 8, 2020
  Date of Patent: August 15, 2023
  Assignee: SIGNIFY HOLDING B.V.
  Inventor: Mark Owen Jones
- Patent number: 11494561
  Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a machine learning model to perform multiple machine learning tasks from multiple machine learning domains. One system includes a machine learning model that includes multiple input modality neural networks corresponding to respective different modalities and being configured to map received data inputs of the corresponding modality to mapped data inputs from a unified representation space; an encoder neural network configured to process mapped data inputs from the unified representation space to generate respective encoder data outputs; a decoder neural network configured to process encoder data outputs to generate respective decoder data outputs from the unified representation space; and multiple output modality neural networks corresponding to respective different modalities and being configured to map decoder data outputs to data outputs of the corresponding modality.
  Type: Grant
  Filed: August 4, 2020
  Date of Patent: November 8, 2022
  Assignee: Google LLC
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Ashish Teku Vaswani
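To make the structure in that abstract concrete, here is a hedged toy sketch of the arrangement: per-modality input networks map inputs into a single unified representation space, a shared encoder/decoder processes them there, and per-modality output networks map the result back out. The real modality networks and encoder/decoder are far more elaborate; every class name, dimension, modality label, and the linear-layer stand-ins below are illustrative assumptions.

```python
import numpy as np

class MultiModalModel:
    """Toy sketch: per-modality input networks map inputs into one
    unified representation space, a shared encoder and decoder process
    them, and per-modality output networks map the result back out."""

    def __init__(self, input_dims, output_dims, d_unified=32, seed=0):
        rng = np.random.default_rng(seed)
        # One input modality network (here just a linear map) per modality.
        self.input_nets = {m: rng.normal(size=(d, d_unified))
                           for m, d in input_dims.items()}
        # Shared encoder and decoder operating in the unified space.
        self.encoder = rng.normal(size=(d_unified, d_unified))
        self.decoder = rng.normal(size=(d_unified, d_unified))
        # One output modality network per modality.
        self.output_nets = {m: rng.normal(size=(d_unified, d))
                            for m, d in output_dims.items()}

    def run(self, x, in_modality, out_modality):
        unified = np.tanh(x @ self.input_nets[in_modality])  # into unified space
        encoded = np.tanh(unified @ self.encoder)            # shared encoder
        decoded = np.tanh(encoded @ self.decoder)            # shared decoder
        return decoded @ self.output_nets[out_modality]      # back to modality space

# Illustrative usage: hypothetical 'image' features in, 'text' scores out.
model = MultiModalModel(input_dims={'image': 64, 'text': 48},
                        output_dims={'text': 100})
out = model.run(np.ones(64), 'image', 'text')  # shape (100,)
```

The point of the design, as the abstract frames it, is that the shared encoder and decoder see every task through the same unified representation space, so one model can be trained across tasks and domains.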
- Publication number: 20220083746
  Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
  Type: Application
  Filed: August 27, 2021
  Publication date: March 17, 2022
  Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
- Publication number: 20220051099
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Application
  Filed: September 3, 2021
  Publication date: February 17, 2022
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani
- Patent number: 11175023
  Abstract: A mounting bracket (20) for a luminaire fixture frame (12) having one or more tabs (30) to engage a hat channel or the like. The one or more tabs (30) are positionable between an un-deployed position and a deployed position to operably engage the hat channel.
  Type: Grant
  Filed: July 28, 2017
  Date of Patent: November 16, 2021
  Assignee: SIGNIFY HOLDING B.V.
  Inventor: Mark Owen Jones
- Patent number: 11138392
  Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for machine translation using neural networks. In some implementations, a text in one language is translated into a second language using a neural network model. The model can include an encoder neural network comprising a plurality of bidirectional recurrent neural network layers. The encoding vectors are processed using a multi-headed attention module configured to generate multiple attention context vectors for each encoding vector. A decoder neural network generates a sequence of decoder output vectors using the attention context vectors. The decoder output vectors can represent distributions over various language elements of the second language, allowing a translation of the text into the second language to be determined based on the sequence of decoder output vectors.
  Type: Grant
  Filed: July 25, 2019
  Date of Patent: October 5, 2021
  Assignee: Google LLC
  Inventors: Zhifeng Chen, Macduff Richard Hughes, Yonghui Wu, Michael Schuster, Xu Chen, Llion Owen Jones, Niki J. Parmar, George Foster, Orhan Firat, Ankur Bapna, Wolfgang Macherey, Melvin Jose Johnson Premkumar
- Patent number: 11113602
  Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for generating an output sequence from an input sequence. In one aspect, one of the systems includes an encoder neural network configured to receive the input sequence and generate encoded representations of the network inputs, the encoder neural network comprising a sequence of one or more encoder subnetworks, each encoder subnetwork configured to receive a respective encoder subnetwork input for each of the input positions and to generate a respective subnetwork output for each of the input positions, and each encoder subnetwork comprising: an encoder self-attention sub-layer that is configured to receive the subnetwork input for each of the input positions and, for each particular input position in the input order: apply an attention mechanism over the encoder subnetwork inputs using one or more queries derived from the encoder subnetwork input at the particular input position.
  Type: Grant
  Filed: July 17, 2020
  Date of Patent: September 7, 2021
  Assignee: Google LLC
  Inventors: Noam M. Shazeer, Aidan Nicholas Gomez, Lukasz Mieczyslaw Kaiser, Jakob D. Uszkoreit, Llion Owen Jones, Niki J. Parmar, Illia Polosukhin, Ashish Teku Vaswani