Patents by Inventor Lijun Yin
Lijun Yin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240047971
Abstract: A hydro-photovoltaic complementary operation chart application method for a clean energy base includes: dividing a hydro-photovoltaic complementary operation chart into two sub-operation charts by runoff probability and critical probability; predicting runoff and photovoltaic output during the operation cycle; selecting the sub-operation chart by the runoff probability; determining the hydropower output of a reservoir in the current month according to the water level of the reservoir at the beginning of the current month and the operation area; and obtaining the water level of the reservoir at the end of the current month through runoff calculation. The long-term hydropower output process and reservoir level process in the clean energy base are obtained once the hydropower output and water level of the reservoir have been calculated for all months of the operation cycle, and the hydropower generation probability is calculated to complete the operation.
Type: Application
Filed: September 13, 2023
Publication date: February 8, 2024
Inventors: Xu Li, Dacheng Li, Xianfeng Huang, Jian Zhou, Huawei Xiang, Feng Wu, Yun Tian, Di Wu, Chang Xu, Xinglin Duan, Yanqing Zhang, Yuan Zheng, Wenbo Huang, Min Xu, Hong Pan, Zhiyuan Wu, Hucheng Xianyu, Wennan Yuan, Lijun Yin
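The month-by-month loop described in the abstract can be sketched as follows. This is a hypothetical illustration only: the sub-operation charts are represented as simple lookup callables, and the output-to-release factor and water-balance update are toy assumptions, not the patented calculation.

```python
def simulate_operation(init_level, monthly_inflow, chart_a, chart_b,
                       runoff_prob, critical_prob):
    """Walk the operation cycle month by month.

    chart_a / chart_b: callables mapping a start-of-month water level to a
    prescribed hydropower output (the two sub-operation charts).
    Returns the monthly output process and the end-of-month level process.
    """
    # Select the sub-operation chart by comparing the runoff probability
    # with the critical probability.
    chart = chart_a if runoff_prob <= critical_prob else chart_b

    level = init_level
    outputs, levels = [], []
    for inflow in monthly_inflow:
        output = chart(level)               # output prescribed by the chart
        release = output * 2.0              # toy conversion: output -> release
        level += (inflow - release) * 0.1   # toy water-balance (runoff) update
        outputs.append(output)
        levels.append(level)
    return outputs, levels
```

Running the full cycle this way yields exactly the long-term output and reservoir-level processes the abstract refers to; the generation-probability statistics would then be computed over those series.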
-
Publication number: 20230351178
Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors that blindly apply deep learning are not effective at catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity; 99.39% accuracy in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and the corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
Type: Application
Filed: June 24, 2023
Publication date: November 2, 2023
Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
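The "signal map" idea above can be sketched in a few lines: the average green-channel intensity of each face region over time approximates a blood-flow (PPG-like) signal, and a grid of such traces forms a map that a classifier can consume. The region grid, the zero-mean normalization, and the downstream use are illustrative assumptions, not the patented transformations or CNN.

```python
import numpy as np

def signal_map(frames, grid=4):
    """frames: (T, H, W, 3) uint8 video of an aligned face crop.
    Returns a (grid*grid, T) map of per-region green-channel traces."""
    t, h, w, _ = frames.shape
    rows = []
    for i in range(grid):
        for j in range(grid):
            # Green channel of one spatial cell across all frames.
            region = frames[:, i*h//grid:(i+1)*h//grid,
                               j*w//grid:(j+1)*w//grid, 1]
            trace = region.reshape(t, -1).mean(axis=1)
            rows.append(trace - trace.mean())   # zero-mean per region
    return np.stack(rows)
```

A real pipeline would compute such maps over sliding windows of a tracked face and feed them to the trained classifier; spatial and temporal inconsistencies of these signals are what distinguish generated faces from authentic ones.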
-
Patent number: 11687778
Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors that blindly apply deep learning are not effective at catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity; 99.39% accuracy in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and the corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
Type: Grant
Filed: January 6, 2021
Date of Patent: June 27, 2023
Assignee: The Research Foundation for The State University of New York
Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
-
Publication number: 20210209388
Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors that blindly apply deep learning are not effective at catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity; 99.39% accuracy in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and the corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
Type: Application
Filed: January 6, 2021
Publication date: July 8, 2021
Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
-
Patent number: 10780468
Abstract: The present disclosure discloses a cleaning device and a cleaning method. The cleaning device includes a sweeping module and a washing module. The sweeping module includes a brush component and a steam generating component, the steam generating component and the brush component being arranged in sequence in the cleaning direction of the cleaning device. The washing module is configured to wash the parts to be cleaned; the washing module and the sweeping module are arranged in sequence in the cleaning direction of the cleaning device.
Type: Grant
Filed: May 31, 2018
Date of Patent: September 22, 2020
Assignees: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE DISPLAY TECHNOLOGY CO., LTD.
Inventors: Junwen Luo, Bin Chang, Shichao Fan, Lijun Yin, Hui Sun, Hongyang Tang
-
Patent number: 10500531
Abstract: Embodiments of the present invention provide a filtering element, filtering equipment, and a water circulation cleaning system. In an embodiment, the filtering element includes a filtering screen and filter particles adhered to one side of the filtering screen, the sizes of the filter particles gradually increasing in a direction from the one side toward the other side of the filtering screen. Filtering equipment including the above filtering element, and a water circulation cleaning system including that filtering equipment, are also provided.
Type: Grant
Filed: August 15, 2016
Date of Patent: December 10, 2019
Assignees: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE DISPLAY TECHNOLOGY CO., LTD.
Inventors: Donglei Wen, Zhenshan Lu, Bin Chang, Bo Bai, Shichao Fan, Lijun Yin
-
Patent number: 10335045
Abstract: Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured in face videos and, surprisingly, used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g., in the presence of spontaneous movements, facial expressions, or illumination changes). As opposed to previous approaches, which estimate the HR by processing all the skin pixels inside a fixed region of interest, a strategy is introduced to dynamically select face regions useful for robust HR estimation. The present approach, inspired by recent advances in matrix completion theory, predicts the HR while simultaneously discovering the best regions of the face to use for estimation.
Type: Grant
Filed: June 23, 2017
Date of Patent: July 2, 2019
Assignees: Universita degli Studi Di Trento, Fondazione Bruno Kessler, The Research Foundation for the State University of New York, University of Pittsburgh of the Commonwealth System of Higher Education
Inventors: Niculae Sebe, Xavier Alameda-Pineda, Sergey Tulyakov, Elisa Ricci, Lijun Yin, Jeffrey F. Cohn
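The dynamic region-selection idea can be sketched as below. Note the simplifying assumption: the patented approach builds on matrix completion, whereas this toy version merely ranks per-region traces by a spectral peakiness score and reads the HR off the dominant in-band frequency of the selected regions.

```python
import numpy as np

def estimate_hr(traces, fps, keep=0.5, lo=0.7, hi=4.0):
    """traces: (R, T) per-region skin-color signals. Returns HR in bpm.
    lo/hi bound the plausible heart-rate band in Hz (42-240 bpm)."""
    r, t = traces.shape
    freqs = np.fft.rfftfreq(t, d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)
    spec = np.abs(np.fft.rfft(traces - traces.mean(axis=1, keepdims=True),
                              axis=1))
    # Peakiness score: in-band peak relative to total in-band energy.
    score = spec[:, band].max(axis=1) / (spec[:, band].sum(axis=1) + 1e-9)
    best = np.argsort(score)[-max(1, int(keep * r)):]   # keep best regions
    mean_spec = spec[best][:, band].mean(axis=0)
    return 60.0 * freqs[band][np.argmax(mean_spec)]
```

Regions corrupted by motion or lighting changes produce flat, noisy spectra and are discarded, which is the intuition behind selecting regions jointly with the estimate.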
-
Publication number: 20190054509
Abstract: The present disclosure discloses a cleaning device and a cleaning method. The cleaning device includes a sweeping module and a washing module. The sweeping module includes a brush component and a steam generating component, the steam generating component and the brush component being arranged in sequence in the cleaning direction of the cleaning device. The washing module is configured to wash the parts to be cleaned; the washing module and the sweeping module are arranged in sequence in the cleaning direction of the cleaning device.
Type: Application
Filed: May 31, 2018
Publication date: February 21, 2019
Inventors: Junwen LUO, Bin CHANG, Shichao FAN, Lijun YIN, Hui SUN, Hongyang TANG
-
Patent number: 9953214
Abstract: A gaze direction determining system and method is provided. A two-camera system detects the face with a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based on topographic features, and directs an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped to the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
Type: Grant
Filed: March 9, 2016
Date of Patent: April 24, 2018
Assignee: The Research Foundation for the State University of New York
Inventors: Lijun Yin, Michael Reale
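The optical-axis step of the abstract reduces to a ray-sphere intersection, sketched below. The camera model, the calibrated eyeball parameters, and all names are illustrative assumptions; the patented method additionally maps the iris contour (the 3D iris disk) and applies the fovea offset to reach the visual axis.

```python
import numpy as np

def optical_axis(eyeball_center, eyeball_radius, ray_origin, ray_dir):
    """Intersect the camera ray through the observed iris center with the
    calibrated eyeball sphere; return the unit optical-axis direction
    (eyeball center -> iris point on the sphere)."""
    d = ray_dir / np.linalg.norm(ray_dir)
    oc = ray_origin - eyeball_center
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - eyeball_radius ** 2)
    if disc < 0:
        raise ValueError("ray misses the eyeball sphere")
    s = -b - np.sqrt(disc)                 # nearer of the two intersections
    iris_3d = ray_origin + s * d
    axis = iris_3d - eyeball_center
    return axis / np.linalg.norm(axis)
```

In the full system, rotating this axis by the calibrated fovea offset (angle kappa) yields the visual axis that is finally intersected with the viewing screen for PoG tracking.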
-
Publication number: 20170367590
Abstract: Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured in face videos and, surprisingly, used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g., in the presence of spontaneous movements, facial expressions, or illumination changes). As opposed to previous approaches, which estimate the HR by processing all the skin pixels inside a fixed region of interest, a strategy is introduced to dynamically select face regions useful for robust HR estimation. The present approach, inspired by recent advances in matrix completion theory, predicts the HR while simultaneously discovering the best regions of the face to use for estimation.
Type: Application
Filed: June 23, 2017
Publication date: December 28, 2017
Inventors: Niculae Sebe, Xavier Alameda-Pineda, Sergey Tulyakov, Elisa Ricci, Lijun Yin, Jeffrey F. Cohn
-
Publication number: 20170266595
Abstract: Embodiments of the present invention provide a filtering element, filtering equipment, and a water circulation cleaning system. In an embodiment, the filtering element includes a filtering screen and filter particles adhered to one side of the filtering screen, the sizes of the filter particles gradually increasing in a direction from the one side toward the other side of the filtering screen. Filtering equipment including the above filtering element, and a water circulation cleaning system including that filtering equipment, are also provided.
Type: Application
Filed: August 15, 2016
Publication date: September 21, 2017
Inventors: Donglei Wen, Zhenshan Lu, Bin Chang, Bo Bai, Shichao Fan, Lijun Yin
-
Publication number: 20160210503
Abstract: A gaze direction determining system and method is provided. A two-camera system detects the face with a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based on topographic features, and directs an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped to the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
Type: Application
Filed: March 9, 2016
Publication date: July 21, 2016
Inventors: Lijun YIN, Michael Reale
-
Patent number: 9372546
Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system based on two regular cameras is provided, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining the two views of the hand features, the 3D pointing direction is estimated.
Type: Grant
Filed: September 3, 2015
Date of Patent: June 21, 2016
Assignee: The Research Foundation for The State University of New York
Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
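The view-combination step can be sketched geometrically. As a simplifying assumption, suppose a calibrated top camera observes the hand axis as an angle in the x-z ground plane and a side camera observes it as an elevation angle in the y-z plane; the patented pipeline instead derives these 2D hand axes from the Active Appearance Model feature points along the contour.

```python
import numpy as np

def pointing_direction(top_angle, side_angle):
    """top_angle: hand-axis angle in the x-z plane (radians, from +z).
    side_angle: elevation angle in the y-z plane (radians, from +z).
    Returns a unit 3D pointing vector (x, y, z)."""
    # Each view constrains one lateral component per unit of depth;
    # combining both fixes the full 3D direction up to scale.
    v = np.array([np.tan(top_angle), np.tan(side_angle), 1.0])
    return v / np.linalg.norm(v)
```

A hand pointed straight ahead (both angles zero) maps to the +z axis; non-zero angles tilt the vector laterally and vertically.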
-
Patent number: 9311527
Abstract: A gaze direction determining system and method is provided. A two-camera system detects the face with a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based on topographic features, and directs an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped to the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
Type: Grant
Filed: November 10, 2014
Date of Patent: April 12, 2016
Assignee: The Research Foundation for The State University of New York
Inventors: Lijun Yin, Michael Reale
-
Publication number: 20150378444
Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system based on two regular cameras is provided, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining the two views of the hand features, the 3D pointing direction is estimated.
Type: Application
Filed: September 3, 2015
Publication date: December 31, 2015
Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
-
Patent number: 9128530
Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system based on two regular cameras is provided, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining the two views of the hand features, the 3D pointing direction is estimated.
Type: Grant
Filed: March 2, 2015
Date of Patent: September 8, 2015
Assignee: The Research Foundation for the State University of New York
Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
-
Publication number: 20150177846
Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system based on two regular cameras is provided, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining the two views of the hand features, the 3D pointing direction is estimated.
Type: Application
Filed: March 2, 2015
Publication date: June 25, 2015
Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
-
Patent number: 8971572
Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system based on two regular cameras is provided, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. Combining the two views of the hand features, the 3D pointing direction is estimated.
Type: Grant
Filed: August 10, 2012
Date of Patent: March 3, 2015
Assignee: The Research Foundation for The State University of New York
Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
-
Patent number: 8885882
Abstract: A gaze direction determining system and method is provided. A two-camera system detects the face with a fixed, wide-angle camera, estimates a rough location for the eye region using an eye detector based on topographic features, and directs an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped to the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
Type: Grant
Filed: July 16, 2012
Date of Patent: November 11, 2014
Assignee: The Research Foundation for The State University of New York
Inventors: Lijun Yin, Michael Reale