Patents by Inventor Allison Phuong HUYNH
Allison Phuong HUYNH has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12200108
Abstract: Techniques for updating blockchains using a proof of work determined serially include receiving a block of data for inclusion in a new block of a blockchain; deterministically determining an initial nonce, hashing a combination of the block of data and the initial nonce to create a hashed value; iteratively deterministically determining an updated nonce based on a combination of the hashed value and updating the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria; creating the new block based on the block of data, the initial nonce, and the updated hashed value that satisfies the proof of work criteria; and having the new block stored in the blockchain.
Type: Grant
Filed: May 19, 2023
Date of Patent: January 14, 2025
Assignee: MYDREAM INTERACTIVE, INC.
Inventors: Andrew Jonathan Leker, Matthew Drew Birder, Allison Phuong Huynh, Mark Thomas Wallace
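The serial proof-of-work search described in this abstract is straightforward to sketch. The Python fragment below is a minimal illustration only: SHA-256 as the hash, a leading-zeros rule as the proof-of-work criterion, and deriving the initial nonce from the block data are all assumptions not taken from the patent. Its point is that each nonce depends on the previous hashed value, so the hashing steps must be performed in order rather than in parallel.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def satisfies_pow(hashed: str, zero_prefix: int) -> bool:
    # Hypothetical proof-of-work criterion: a run of leading hex zeros.
    return hashed.startswith("0" * zero_prefix)

def mine_block_serial(block_data: bytes, zero_prefix: int = 4) -> dict:
    # Deterministically derive the initial nonce; here it is hashed from the
    # block data itself, since the abstract only says the derivation is deterministic.
    initial_nonce = sha256_hex(block_data)

    # Hash the block data combined with the initial nonce.
    hashed = sha256_hex(block_data + initial_nonce.encode())

    # Each updated nonce is derived from the previous hashed value, and the
    # hashed value is updated by hashing that nonce, so every iteration depends
    # on the one before it and the search cannot be split across workers.
    while not satisfies_pow(hashed, zero_prefix):
        nonce = sha256_hex(block_data + hashed.encode())
        hashed = sha256_hex(nonce.encode())

    return {"data": block_data, "initial_nonce": initial_nonce, "proof_hash": hashed}

# Example: new_block = mine_block_serial(b"pending transactions", zero_prefix=4)
```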
-
Patent number: 12175013
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: June 19, 2023
Date of Patent: December 24, 2024
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
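As a rough illustration of the head-tilt behavior this abstract describes, the sketch below maps tilt angle to a blending factor and alpha-blends the two feeds per pixel. The angle thresholds, linear mapping, and function names are illustrative assumptions; the abstract only says the factor is adjusted based on the amount of tilt.

```python
import numpy as np

def blend_factor_from_tilt(tilt_deg: float, min_tilt: float = 10.0, max_tilt: float = 45.0) -> float:
    # Map head tilt to a live-feed transparency factor in [0, 1].
    # The angle range and the linear mapping are assumptions for illustration.
    t = (abs(tilt_deg) - min_tilt) / (max_tilt - min_tilt)
    return float(np.clip(t, 0.0, 1.0))

def blend_views(live_rgb: np.ndarray, virtual_rgb: np.ndarray, factor: float) -> np.ndarray:
    # factor == 0.0 shows only the virtual feed; factor == 1.0 shows only the live feed.
    return factor * live_rgb + (1.0 - factor) * virtual_rgb

# Example with synthetic float32 frames:
# live = np.zeros((480, 640, 3), dtype=np.float32)
# virtual = np.ones((480, 640, 3), dtype=np.float32)
# blended = blend_views(live, virtual, blend_factor_from_tilt(30.0))
```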
-
Publication number: 20230412359
Abstract: Techniques for updating blockchains using a proof of work determined serially include receiving a block of data for inclusion in a new block of a blockchain; deterministically determining an initial nonce, hashing a combination of the block of data and the initial nonce to create a hashed value; iteratively deterministically determining an updated nonce based on a combination of the hashed value and updating the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria; creating the new block based on the block of data, the initial nonce, and the updated hashed value that satisfies the proof of work criteria; and having the new block stored in the blockchain.
Type: Application
Filed: May 19, 2023
Publication date: December 21, 2023
Inventors: Andrew Jonathan LEKER, Matthew Drew BIRDER, II, Allison Phuong HUYNH, Mark Thomas WALLACE
-
Publication number: 20230333637
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: June 19, 2023
Publication date: October 19, 2023
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH
-
Patent number: 11681360
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: May 9, 2022
Date of Patent: June 20, 2023
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Patent number: 11658804
Abstract: Techniques for updating blockchains using a proof of work determined serially include receiving a block of data for inclusion in a new block of a blockchain; deterministically determining an initial nonce, hashing a combination of the block of data and the initial nonce to create a hashed value; iteratively deterministically determining an updated nonce based on a combination of the hashed value and updating the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria; creating the new block based on the block of data, the initial nonce, and the updated hashed value that satisfies the proof of work criteria; and having the new block stored in the blockchain.
Type: Grant
Filed: May 11, 2021
Date of Patent: May 23, 2023
Assignee: MYDREAM INTERACTIVE, INC.
Inventors: Andrew Jonathan Leker, Matthew Drew Birder, Allison Phuong Huynh, Mark Thomas Wallace
-
Publication number: 20220261068
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: May 9, 2022
Publication date: August 18, 2022
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH
-
Patent number: 11327560
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: May 10, 2021
Date of Patent: May 10, 2022
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Publication number: 20210333867
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: May 10, 2021
Publication date: October 28, 2021
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH
-
Publication number: 20210266144
Abstract: Techniques for updating blockchains using a proof of work determined serially include receiving a block of data for inclusion in a new block of a blockchain; deterministically determining an initial nonce, hashing a combination of the block of data and the initial nonce to create a hashed value; iteratively deterministically determining an updated nonce based on a combination of the hashed value and updating the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria; creating the new block based on the block of data, the initial nonce, and the updated hashed value that satisfies the proof of work criteria; and having the new block stored in the blockchain.
Type: Application
Filed: May 11, 2021
Publication date: August 26, 2021
Inventors: Andrew Jonathan LEKER, Matthew Drew BIRDER, Allison Phuong HUYNH, Mark Thomas WALLACE
-
Patent number: 11038669
Abstract: A system and method for blockchains with serial proof of work includes a memory storing a blockchain, and a processor coupled to the memory. The processor is configured to receive a miner identifier, receive a block of data for inclusion in a new block of the blockchain, determine an initial nonce based on the miner identifier, hash a combination of the block of data and the initial nonce to create a hashed value, iteratively determine an updated nonce based on the hashed value and update the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria, create the new block based on the block of data, the miner identifier, and the updated hashed value that satisfies the proof of work criteria, and share the new block with one or more other computing devices hosting the blockchain.
Type: Grant
Filed: August 13, 2018
Date of Patent: June 15, 2021
Assignee: MYDREAM INTERACTIVE, INC.
Inventors: Andrew Jonathan Leker, Matthew Drew Birder, Allison Phuong Huynh, Mark Thomas Wallace
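This earlier formulation differs from the later filings mainly in how the serial search starts: the initial nonce is determined from a miner identifier. A minimal, assumed derivation might look like the following, with the rest of the search proceeding as in the serial proof-of-work sketch after the first entry above; the combination of inputs shown here is an illustration, not a detail taken from the patent.

```python
import hashlib

def initial_nonce_from_miner(miner_id: str, block_data: bytes) -> str:
    # Illustrative assumption: hash the miner identifier together with the block
    # data so each miner starts its serial search from a distinct, reproducible nonce.
    return hashlib.sha256(miner_id.encode() + block_data).hexdigest()
```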
-
Patent number: 11003241
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: January 28, 2019
Date of Patent: May 11, 2021
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Patent number: 10489978
Abstract: A system and method for displaying computer-based content in a virtual or augmented environment includes receiving a selection of content for display on a window of a computing system, rendering the content as a virtual screen independently orientable from other virtual screens corresponding to other windows of the computing system, compositing the virtual screen into a virtual or augmented reality environment, detecting a selection of the virtual screen using a virtual controller, based on a gaze of a user within the virtual or augmented reality environment, or both, and changing properties of the virtual screen based on manipulation of the virtual controller by the user. In some embodiments, the system and method further include one or more of rendering a pointing ray and changing the properties based on movement of the virtual controller as if the virtual controller and the virtual screen are connected by a rigid link.
Type: Grant
Filed: November 2, 2016
Date of Patent: November 26, 2019
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
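The "rigid link" behavior described at the end of this abstract can be sketched as a fixed relative transform captured when the virtual screen is selected and re-applied as the controller moves. The class below is an assumed, minimal illustration; the names, data layout, and math are not taken from the patent.

```python
import numpy as np

class VirtualScreen:
    # Minimal sketch of a virtual screen that follows a controller as if the two
    # were joined by a rigid link.
    def __init__(self, position: np.ndarray, rotation: np.ndarray):
        self.position = position      # 3-vector, world space
        self.rotation = rotation      # 3x3 rotation matrix, world space
        self._rel_pos = None          # screen pose expressed in the controller frame
        self._rel_rot = None

    def grab(self, ctrl_pos: np.ndarray, ctrl_rot: np.ndarray) -> None:
        # Record the screen's pose relative to the controller at the moment of selection.
        self._rel_rot = ctrl_rot.T @ self.rotation
        self._rel_pos = ctrl_rot.T @ (self.position - ctrl_pos)

    def follow(self, ctrl_pos: np.ndarray, ctrl_rot: np.ndarray) -> None:
        # Re-apply the fixed offset so the screen moves and rotates with the controller.
        self.rotation = ctrl_rot @ self._rel_rot
        self.position = ctrl_pos + ctrl_rot @ self._rel_pos

# Example:
# screen = VirtualScreen(np.array([0.0, 1.5, -2.0]), np.eye(3))
# screen.grab(np.zeros(3), np.eye(3))
# screen.follow(np.array([0.1, 0.0, 0.0]), np.eye(3))   # screen translates with the controller
```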
-
Publication number: 20190268142
Abstract: A system and method for blockchains with serial proof of work includes a memory storing a blockchain, and a processor coupled to the memory. The processor is configured to receive a miner identifier, receive a block of data for inclusion in a new block of the blockchain, determine an initial nonce based on the miner identifier, hash a combination of the block of data and the initial nonce to create a hashed value, iteratively determine an updated nonce based on the hashed value and update the hashed value by hashing the updated nonce until the updated hashed value satisfies a proof of work criteria, create the new block based on the block of data, the miner identifier, and the updated hashed value that satisfies the proof of work criteria, and share the new block with one or more other computing devices hosting the blockchain.
Type: Application
Filed: August 13, 2018
Publication date: August 29, 2019
Inventors: Andrew Jonathan LEKER, Matthew Drew BIRDER, Allison Phuong HUYNH, Mark Thomas WALLACE
-
Publication number: 20190155379
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: January 28, 2019
Publication date: May 23, 2019
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH
-
Patent number: 10191540
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: January 12, 2018
Date of Patent: January 29, 2019
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Publication number: 20180196526
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: January 12, 2018
Publication date: July 12, 2018
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Publication number: 20180033204
Abstract: A system and method for displaying computer-based content in a virtual or augmented environment includes receiving a selection of content for display on a window of a computing system, rendering the content as a virtual screen independently orientable from other virtual screens corresponding to other windows of the computing system, compositing the virtual screen into a virtual or augmented reality environment, detecting a selection of the virtual screen using a virtual controller, based on a gaze of a user within the virtual or augmented reality environment, or both, and changing properties of the virtual screen based on manipulation of the virtual controller by the user. In some embodiments, the system and method further include one or more of rendering a pointing ray and changing the properties based on movement of the virtual controller as if the virtual controller and the virtual screen are connected by a rigid link.
Type: Application
Filed: November 2, 2016
Publication date: February 1, 2018
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH
-
Patent number: 9870064
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Grant
Filed: November 30, 2016
Date of Patent: January 16, 2018
Inventors: Rouslan Lyubomirov Dimitrov, Allison Phuong Huynh
-
Publication number: 20170357327
Abstract: A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display. In some embodiments, the blending engine is further configured to detect an amount of head tilt of the user and adjust a blending factor controlling an amount of transparency of the live reality feed within the blended view based on the amount of head tilt. In some embodiments, the blending engine is further configured to detect manipulation of a controller by the user and adjust the blending factor based on the detected manipulation.
Type: Application
Filed: November 30, 2016
Publication date: December 14, 2017
Inventors: Rouslan Lyubomirov DIMITROV, Allison Phuong HUYNH