Patents by Inventor Denys Makoviichuk
Denys Makoviichuk has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12323478
Abstract: A method for triggering changes to real-time special effects included in a live streaming video starts with a processor transmitting in real-time a video stream captured by a camera via a network. The processor causes a live streaming interface that includes the video stream to be displayed on a plurality of client devices. The processor receives a trigger to apply one of a plurality of special effects to the video stream and determines that a first special effect of the plurality of special effects is associated with the trigger. The processor applies in real-time the first special effect to the video stream to generate a video stream having the first special effect and transmits in real-time the video stream having the first special effect via the network. The processor causes the live streaming interface that includes the video stream having the first special effect to be displayed on the plurality of client devices. Other embodiments are disclosed.
Type: Grant
Filed: June 6, 2023
Date of Patent: June 3, 2025
Assignee: Snap Inc.
Inventors: Artem Gaiduchenko, Artem Yerofieiev, Bohdan Pozharskyi, Gabriel Lupin, Oleksii Kholovchuk, Travis Chen, Yurii Monastyrshyn, Denys Makoviichuk
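To make the claimed flow concrete, here is a minimal Python sketch of the trigger-to-effect pipeline the abstract describes: a trigger is mapped to a registered special effect, the effect is applied to each frame in real time, and the processed frames are forwarded to the streaming interface. All names (`EFFECTS`, `stream_with_effects`, the placeholder effect functions) are hypothetical illustrations, not APIs from the patent or from Snap.

```python
# Minimal sketch of the trigger-to-effect flow described in the abstract.
# All identifiers here are hypothetical; none come from the patent text.
from typing import Callable, Dict, Iterable

Frame = bytes  # stand-in for a decoded video frame


def grayscale_effect(frame: Frame) -> Frame:
    """Placeholder effect; a real implementation would transform pixels."""
    return frame


def confetti_effect(frame: Frame) -> Frame:
    """Placeholder effect."""
    return frame


# Registry associating trigger identifiers with special-effect functions.
EFFECTS: Dict[str, Callable[[Frame], Frame]] = {
    "trigger/grayscale": grayscale_effect,
    "trigger/confetti": confetti_effect,
}


def stream_with_effects(frames: Iterable[Frame],
                        triggers: Dict[int, str],
                        send: Callable[[Frame], None]) -> None:
    """Apply the effect selected by the most recent trigger to each frame
    and forward the result to the live-streaming interface via `send`."""
    active: Callable[[Frame], Frame] = lambda f: f  # no effect initially
    for index, frame in enumerate(frames):
        trigger_id = triggers.get(index)  # trigger received at this frame?
        if trigger_id is not None:
            # Determine which special effect is associated with the trigger.
            active = EFFECTS.get(trigger_id, active)
        send(active(frame))  # transmit the processed frame downstream
```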
-
Publication number: 20240202869
Abstract: A neural light field (NeLF) that runs in real time on mobile devices for neural rendering of three-dimensional (3D) scenes, referred to as MobileR2L. The MobileR2L architecture runs efficiently on mobile devices with low latency and small size, and it achieves high-resolution generation while maintaining real-time inference for both synthetic and real-world 3D scenes on mobile devices. MobileR2L has a network backbone including a convolutional layer embedding an input image at a resolution, residual blocks uploading the embedded image, and super-resolution modules receiving the uploaded embedded image and rendering an output image having a higher resolution than the embedded image. The convolutional layer generates a number of rays equal to a number of pixels in the input image, where a partial number of the rays is uploaded to the super-resolution modules.
Type: Application
Filed: December 14, 2022
Publication date: June 20, 2024
Inventors: Jian Ren, Pavlo Chemerys, Vladislav Shakhrai, Ju Hu, Denys Makoviichuk, Sergey Tulyakov, Junli Cao
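The backbone the abstract outlines (convolutional embedding, residual blocks, super-resolution modules producing a higher-resolution output) can be sketched schematically in PyTorch. The layer counts, channel widths, 2x upscale factor, and the 6-channel ray input below are illustrative assumptions, not values taken from the publication.

```python
# Schematic sketch of a MobileR2L-style backbone as described in the abstract:
# a convolutional embedding, residual blocks, and super-resolution modules.
# Hyperparameters here are illustrative guesses, not the published design.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.GELU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)


class SuperResolutionModule(nn.Module):
    """Upsamples the feature map by 2x with a transposed convolution."""
    def __init__(self, channels: int):
        super().__init__()
        self.up = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, 4, stride=2, padding=1),
            nn.GELU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(x)


class TinyNeLFBackbone(nn.Module):
    def __init__(self, in_channels: int = 6, channels: int = 64,
                 num_res_blocks: int = 8, num_sr_modules: int = 3):
        super().__init__()
        self.embed = nn.Conv2d(in_channels, channels, 1)   # one ray per input pixel
        self.blocks = nn.Sequential(*[ResidualBlock(channels)
                                      for _ in range(num_res_blocks)])
        self.super_res = nn.Sequential(*[SuperResolutionModule(channels)
                                         for _ in range(num_sr_modules)])
        self.to_rgb = nn.Conv2d(channels, 3, 1)            # final RGB image

    def forward(self, rays: torch.Tensor) -> torch.Tensor:
        x = self.embed(rays)        # embed the input ray image
        x = self.blocks(x)          # process at the embedded resolution
        x = self.super_res(x)       # render at a higher resolution
        return torch.sigmoid(self.to_rgb(x))


# Example: a 100x100 ray image rendered to an 800x800 RGB image (3 SR modules = 8x).
# out = TinyNeLFBackbone()(torch.randn(1, 6, 100, 100))  # -> (1, 3, 800, 800)
```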
-
Publication number: 20240053959
Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for automatic quantization of a floating point model. The program and method provide for providing a floating point model to an automatic quantization library, the floating point model being configured to represent a neural network, and the automatic quantization library being configured to generate a first quantized model based on the floating point model; providing a function to the automatic quantization library, the function being configured to run a forward pass on a given dataset for the floating point model; causing the automatic quantization library to generate the first quantized model based on the floating point model; causing the automatic quantization library to calibrate the first quantized model by running the first quantized model on the function; and converting the calibrated first quantized model to a second quantized model.
Type: Application
Filed: June 8, 2023
Publication date: February 15, 2024
Inventors: Denys Makoviichuk, Jiazhuo Wang, Yang Wen
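The generate-calibrate-convert flow described above maps onto standard post-training static quantization. The sketch below uses PyTorch's eager-mode quantization API as a stand-in for the "automatic quantization library", with the user-provided calibration function playing the role of the "forward pass on a given dataset". This is an illustration under those assumptions, not the patent's implementation; `FloatModel` and `calibration_forward` are made-up names.

```python
# Sketch of the described workflow using PyTorch post-training static
# quantization as a stand-in for the "automatic quantization library".
import torch
import torch.nn as nn


class FloatModel(nn.Module):
    """Toy floating point model standing in for the neural network."""
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()
        self.net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(8, 8, 3, padding=1), nn.ReLU())
        self.dequant = torch.quantization.DeQuantStub()

    def forward(self, x):
        return self.dequant(self.net(self.quant(x)))


def calibration_forward(model: nn.Module, dataset) -> None:
    """Function handed to the library: runs a forward pass over the dataset."""
    model.eval()
    with torch.no_grad():
        for batch in dataset:
            model(batch)


float_model = FloatModel().eval()
float_model.qconfig = torch.quantization.get_default_qconfig("fbgemm")

# 1) Generate the first quantized model (observers inserted, weights still float).
prepared = torch.quantization.prepare(float_model)

# 2) Calibrate by running the provided function over representative data.
calibration_forward(prepared, [torch.randn(1, 3, 32, 32) for _ in range(8)])

# 3) Convert the calibrated model into the final (second) quantized model.
quantized = torch.quantization.convert(prepared)
print(quantized)
```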
-
Publication number: 20230319126
Abstract: A method for triggering changes to real-time special effects included in a live streaming video starts with a processor transmitting in real-time a video stream captured by a camera via a network. The processor causes a live streaming interface that includes the video stream to be displayed on a plurality of client devices. The processor receives a trigger to apply one of a plurality of special effects to the video stream and determines that a first special effect of the plurality of special effects is associated with the trigger. The processor applies in real-time the first special effect to the video stream to generate a video stream having the first special effect and transmits in real-time the video stream having the first special effect via the network. The processor causes the live streaming interface that includes the video stream having the first special effect to be displayed on the plurality of client devices. Other embodiments are disclosed.
Type: Application
Filed: June 6, 2023
Publication date: October 5, 2023
Inventors: Artem Gaiduchenko, Artem Yerofieiev, Bohdan Pozharskyi, Gabriel Lupin, Oleksii Kholovchuk, Travis Chen, Yurii Monastyrshyn, Denys Makoviichuk
-
Patent number: 11711414
Abstract: A method for triggering changes to real-time special effects included in a live streaming video starts with a processor transmitting in real-time a video stream captured by a camera via a network. The processor causes a live streaming interface that includes the video stream to be displayed on a plurality of client devices. The processor receives a trigger to apply one of a plurality of special effects to the video stream and determines that a first special effect of the plurality of special effects is associated with the trigger. The processor applies in real-time the first special effect to the video stream to generate a video stream having the first special effect and transmits in real-time the video stream having the first special effect via the network. The processor causes the live streaming interface that includes the video stream having the first special effect to be displayed on the plurality of client devices. Other embodiments are disclosed.
Type: Grant
Filed: November 30, 2021
Date of Patent: July 25, 2023
Assignee: Snap Inc.
Inventors: Artem Gaiduchenko, Artem Yerofieiev, Bohdan Pozharskyi, Gabriel Lupin, Oleksii Kholovchuk, Travis Chen, Yurii Monastyrshyn, Denys Makoviichuk
-
Publication number: 20230214639
Abstract: Techniques for training a neural network having a plurality of computational layers, with associated weights and activations for the computational layers in fixed-point formats, include determining an optimal fractional length for weights and activations for the computational layers; training a learned clipping level with fixed-point quantization using a PACT (parameterized clipping activation) process for the computational layers; and quantizing effective weights that fuse a weight of a convolution layer with a weight and running variance from a batch normalization layer. A fractional length for weights of the computational layers is determined from current values of the weights using the determined optimal fractional length for the weights of the computational layers. A fixed-point activation between adjacent computational layers is related using PACT quantization of the clipping level and an activation fractional length from a node in a following computational layer.
Type: Application
Filed: December 31, 2021
Publication date: July 6, 2023
Inventors: Sumant Milind Hanumante, Qing Jin, Sergei Korolev, Denys Makoviichuk, Jian Ren, Dhritiman Sagar, Patrick Timothy McSweeney Simons, Sergey Tulyakov, Yang Wen, Richard Zhuang
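Two of the ingredients the abstract names, the effective weight that fuses a convolution weight with batch-norm statistics and PACT-style clipping followed by fixed-point rounding at a chosen fractional length, can be written out as short formulas. The sketch below shows one plausible form; the shapes, bit widths, and the way the pieces are combined are assumptions for illustration, not the claimed procedure.

```python
# Worked sketch of (1) an "effective weight" fusing a conv weight with the
# batch-norm weight and running variance, and (2) PACT clipping followed by
# fixed-point quantization at a given fractional length. Illustrative only.
import torch


def effective_weight(conv_w: torch.Tensor, bn_gamma: torch.Tensor,
                     bn_running_var: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Fuse a conv weight (Cout, Cin, kH, kW) with gamma / sqrt(var + eps)."""
    scale = bn_gamma / torch.sqrt(bn_running_var + eps)   # per output channel
    return conv_w * scale.view(-1, 1, 1, 1)


def fixed_point_quantize(x: torch.Tensor, bits: int, frac_len: int) -> torch.Tensor:
    """Round to a signed fixed-point grid with `frac_len` fractional bits."""
    step = 2.0 ** (-frac_len)
    qmin, qmax = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return torch.clamp(torch.round(x / step), qmin, qmax) * step


def pact_activation(x: torch.Tensor, alpha: torch.Tensor,
                    bits: int, frac_len: int) -> torch.Tensor:
    """PACT: clip activations to a learned level alpha, then quantize."""
    return fixed_point_quantize(torch.clamp(x, 0.0, alpha.item()), bits, frac_len)


# Example: fuse a 16-channel conv with its BN statistics, then quantize both
# the effective weights and a batch of activations.
w = torch.randn(16, 8, 3, 3)
gamma, var = torch.rand(16) + 0.5, torch.rand(16) + 0.1
w_q = fixed_point_quantize(effective_weight(w, gamma, var), bits=8, frac_len=6)
a_q = pact_activation(torch.randn(1, 16, 8, 8), torch.tensor(4.0), bits=8, frac_len=4)
```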
-
Publication number: 20220166816
Abstract: A method for triggering changes to real-time special effects included in a live streaming video starts with a processor transmitting in real-time a video stream captured by a camera via a network. The processor causes a live streaming interface that includes the video stream to be displayed on a plurality of client devices. The processor receives a trigger to apply one of a plurality of special effects to the video stream and determines that a first special effect of the plurality of special effects is associated with the trigger. The processor applies in real-time the first special effect to the video stream to generate a video stream having the first special effect and transmits in real-time the video stream having the first special effect via the network. The processor causes the live streaming interface that includes the video stream having the first special effect to be displayed on the plurality of client devices. Other embodiments are disclosed.
Type: Application
Filed: November 30, 2021
Publication date: May 26, 2022
Inventors: Artem Gaiduchenko, Artem Yerofieiev, Bohdan Pozharskyi, Gabriel Lupin, Oleksii Kholovchuk, Travis Chen, Yurii Monastyrshyn, Denys Makoviichuk
-
Patent number: 11212331
Abstract: A method for triggering changes to real-time special effects included in a live streaming video starts with a processor transmitting in real-time a video stream captured by a camera via a network. The processor causes a live streaming interface that includes the video stream to be displayed on a plurality of client devices. The processor receives a trigger to apply one of a plurality of special effects to the video stream and determines that a first special effect of the plurality of special effects is associated with the trigger. The processor applies in real-time the first special effect to the video stream to generate a video stream having the first special effect and transmits in real-time the video stream having the first special effect via the network. The processor causes the live streaming interface that includes the video stream having the first special effect to be displayed on the plurality of client devices. Other embodiments are disclosed.
Type: Grant
Filed: January 31, 2019
Date of Patent: December 28, 2021
Assignee: Snap Inc.
Inventors: Artem Gaiduchenko, Artem Yerofieiev, Bohdan Pozharskyi, Gabriel Lupin, Oleksii Kholovchuk, Travis Chen, Yurii Monastyrshyn, Denys Makoviichuk