Patents by Inventor Lijun Yin

Lijun Yin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240047971
    Abstract: A hydro-photovoltaic complementary operation chart application method for a clean energy base includes: dividing a hydro-photovoltaic complementary operation chart into two sub-operation charts by runoff probability and critical probability; predicting the runoff and photovoltaic output over the operation cycle, selecting the sub-operation chart according to the runoff probability, determining the hydropower output of a reservoir in the current month from the reservoir's water level at the beginning of the month and the operation area, and obtaining the reservoir's water level at the end of the month through the runoff calculation; and repeating until the hydropower output and water level of the reservoir have been calculated for all months of the operation cycle, which yields the long-term hydropower output process and reservoir level process for the clean energy base, and calculating the hydropower generation probability to complete the operation.
    Type: Application
    Filed: September 13, 2023
    Publication date: February 8, 2024
    Inventors: Xu Li, Dacheng Li, Xianfeng Huang, Jian Zhou, Huawei Xiang, Feng Wu, Yun Tian, Di Wu, Chang Xu, Xinglin Duan, Yanqing Zhang, Yuan Zheng, Wenbo Huang, Min Xu, Hong Pan, Zhiyuan Wu, Hucheng Xianyu, Wennan Yuan, Lijun Yin
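For illustration only, the following is a minimal Python sketch of the monthly loop this abstract describes: pick one of the two sub-operation charts by comparing the runoff probability with the critical probability, read the month's hydropower output from the chart at the beginning-of-month water level, and update the level by a water balance. Every helper named here (the chart lookups, storage conversions, forecast series, and release model) is a hypothetical placeholder, not taken from the patent.

```python
# Hypothetical sketch of the monthly loop in publication 20240047971.
# All helpers passed in (chart lookups, storage/level conversions, release
# model) are illustrative placeholders, not the patented formulation.

def simulate_operation_cycle(initial_level, months, runoff_prob, critical_prob,
                             sub_chart_wet, sub_chart_dry, forecast_inflow,
                             level_to_storage, storage_to_level, release_for_output):
    """Return month-by-month hydropower outputs and reservoir levels."""
    level = initial_level
    outputs, levels = [], [level]
    for m in range(months):
        # Select the sub-operation chart by comparing the runoff probability
        # with the critical probability.
        chart = sub_chart_wet if runoff_prob[m] <= critical_prob else sub_chart_dry
        # The month's hydropower output is read from the chart's operation area
        # using the beginning-of-month reservoir level.
        output = chart(level, m)
        # Water balance ("runoff calculation"): end-of-month storage equals
        # beginning storage plus inflow minus the release needed for this output.
        storage = level_to_storage(level) + forecast_inflow[m] - release_for_output(output, level)
        level = storage_to_level(storage)
        outputs.append(output)
        levels.append(level)
    return outputs, levels
```

Running such a loop to the end of the operation cycle gives the long-term output and level processes from which a generation probability could then be computed.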
  • Publication number: 20230351178
    Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors blindly utilizing deep learning are not effective in catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity. An accuracy of 99.39% in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
    Type: Application
    Filed: June 24, 2023
    Publication date: November 2, 2023
    Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
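As a rough illustration of the "biological signal map plus CNN" idea summarized in the abstract above, the sketch below stacks per-region green-channel traces from a cropped face video into a 2-D map and feeds it to a tiny binary classifier. The region grid, normalization, and network layout are assumptions made for the example, not the patented design.

```python
# Illustrative only: per-region signal map + small CNN classifier.
import numpy as np
import torch
import torch.nn as nn

def signal_map(face_frames, grid=(8, 8)):
    """face_frames: (T, H, W, 3) uint8 array of a cropped face video.
    Returns a (regions, T) map of normalized mean green-channel intensities,
    a crude stand-in for spatially local biological signals."""
    t, h, w, _ = face_frames.shape
    rows, cols = grid
    cells = []
    for r in range(rows):
        for c in range(cols):
            patch = face_frames[:, r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols, 1]
            cells.append(patch.reshape(t, -1).mean(axis=1))
    m = np.stack(cells).astype(np.float64)
    return (m - m.mean(axis=1, keepdims=True)) / (m.std(axis=1, keepdims=True) + 1e-8)

class FakenessCNN(nn.Module):
    """Tiny CNN over the signal map; outputs real/fake logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(32, 2)

    def forward(self, x):                       # x: (batch, 1, regions, frames)
        return self.head(self.features(x).flatten(1))

# Usage sketch:
# logits = FakenessCNN()(torch.from_numpy(signal_map(video)).float()[None, None])
```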
  • Patent number: 11687778
    Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors blindly utilizing deep learning are not effective in catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity. An accuracy of 99.39% in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
    Type: Grant
    Filed: January 6, 2021
    Date of Patent: June 27, 2023
    Assignee: The Research Foundation for The State University of New York
    Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
  • Publication number: 20210209388
    Abstract: Detection of synthetic content in portrait videos, e.g., deep fakes, is achieved. Detectors blindly utilizing deep learning are not effective in catching fake content, as generative models produce realistic results. However, biological signals hidden in portrait videos, which are neither spatially nor temporally preserved in fake content, can be used as implicit descriptors of authenticity. An accuracy of 99.39% in pairwise separation is achieved. A generalized classifier for fake content is formulated by analyzing signal transformations and corresponding feature sets. Signal maps are generated, and a CNN is employed to improve the classifier for detecting synthetic content. Evaluation on several datasets produced superior detection rates against baselines, independent of the source generator or the properties of the available fake content.
    Type: Application
    Filed: January 6, 2021
    Publication date: July 8, 2021
    Inventors: Umur Aybars Ciftci, Ilke Demir, Lijun Yin
  • Patent number: 10780468
    Abstract: The present disclosure provides a cleaning device and a cleaning method. The cleaning device includes a sweeping module and a washing module; the sweeping module includes a brush component and a steam generating component, the steam generating component and the brush component being arranged in sequence along the cleaning direction of the cleaning device. The washing module is configured to wash the parts to be cleaned, and the washing module and the sweeping module are also arranged in sequence along the cleaning direction of the cleaning device.
    Type: Grant
    Filed: May 31, 2018
    Date of Patent: September 22, 2020
    Assignees: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE DISPLAY TECHNOLOGY CO., LTD.
    Inventors: Junwen Luo, Bin Chang, Shichao Fan, Lijun Yin, Hui Sun, Hongyang Tang
  • Patent number: 10500531
    Abstract: Embodiments of the present invention provide a filtering element, filtering equipment, and a water circulation cleaning system. In an embodiment, the filtering element includes a filtering screen and filter particles adhered to one side of the filtering screen, with the sizes of the filter particles gradually increasing in the direction from that side toward the other side of the filtering screen. Also provided are filtering equipment including the above filtering element and a water circulation cleaning system including the above filtering equipment.
    Type: Grant
    Filed: August 15, 2016
    Date of Patent: December 10, 2019
    Assignees: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE DISPLAY TECHNOLOGY CO., LTD.
    Inventors: Donglei Wen, Zhenshan Lu, Bin Chang, Bo Bai, Shichao Fan, Lijun Yin
  • Patent number: 10335045
    Abstract: Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured in face videos and, surprisingly, used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g., in the presence of spontaneous movements, facial expressions, or illumination changes). In contrast to previous approaches that estimate the HR by processing all the skin pixels inside a fixed region of interest, we introduce a strategy to dynamically select face regions useful for robust HR estimation. The present approach, inspired by recent advances in matrix completion theory, allows us to predict the HR while simultaneously discovering the best regions of the face to use for estimation.
    Type: Grant
    Filed: June 23, 2017
    Date of Patent: July 2, 2019
    Assignees: Universita degli Studi Di Trento, Fondazione Bruno Kessler, The Research Foundation for the State University of New York, University of Pittsburgh of the Commonwealth System of Higher Education
    Inventors: Niculae Sebe, Xavier Alameda-Pineda, Sergey Tulyakov, Elisa Ricci, Lijun Yin, Jeffrey F. Cohn
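A simplified illustration of the region-selection idea in the abstract above, standing in for the matrix-completion formulation the patent describes: score each candidate face region by how concentrated its color-signal spectrum is in the plausible heart-rate band, keep the best-scoring regions, and read the HR from their pooled spectrum. The scoring rule and all parameters here are assumptions made for the example.

```python
# Hypothetical sketch of region-selective heart-rate estimation in the spirit
# of patent 10335045; a spectral-quality score replaces the patent's
# matrix-completion machinery for the sake of brevity.
import numpy as np

def estimate_hr(region_signals, fps, band=(0.7, 4.0), keep=5):
    """region_signals: (n_regions, T) mean skin-color traces.
    Returns the estimated heart rate in beats per minute."""
    x = region_signals - region_signals.mean(axis=1, keepdims=True)
    t = x.shape[1]
    freqs = np.fft.rfftfreq(t, d=1.0 / fps)
    spectra = np.abs(np.fft.rfft(x, axis=1)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Score each region by how much of its energy sits in the plausible HR band.
    score = spectra[:, in_band].max(axis=1) / (spectra.sum(axis=1) + 1e-12)
    best = np.argsort(score)[-keep:]            # dynamically selected regions
    pooled = spectra[best][:, in_band].mean(axis=0)
    return 60.0 * freqs[in_band][np.argmax(pooled)]
```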
  • Publication number: 20190054509
    Abstract: The present disclosure provides a cleaning device and a cleaning method. The cleaning device includes a sweeping module and a washing module; the sweeping module includes a brush component and a steam generating component, the steam generating component and the brush component being arranged in sequence along the cleaning direction of the cleaning device. The washing module is configured to wash the parts to be cleaned, and the washing module and the sweeping module are also arranged in sequence along the cleaning direction of the cleaning device.
    Type: Application
    Filed: May 31, 2018
    Publication date: February 21, 2019
    Inventors: Junwen Luo, Bin Chang, Shichao Fan, Lijun Yin, Hui Sun, Hongyang Tang
  • Patent number: 9953214
    Abstract: A gaze direction determining system and method is provided. A two-camera system may detect the face with a fixed, wide-angle camera, estimate a rough location for the eye region using an eye detector based on topographic features, and direct an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped onto the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
    Type: Grant
    Filed: March 9, 2016
    Date of Patent: April 24, 2018
    Assignee: The Research Foundation for the State University of New York
    Inventors: Lijun Yin, Michael Reale
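To make the eyeball-sphere step in the abstract above concrete, here is a hypothetical numerical sketch: back-project the detected iris center onto the calibrated eyeball sphere and take the center-to-iris direction as the optical axis. The camera model and calibration inputs are assumptions for the example; in the patented approach the visual axis is then obtained by rotating the fovea accordingly.

```python
# Illustrative optical-axis computation in the spirit of patent 9953214.
import numpy as np

def optical_axis(iris_px, K, eyeball_center, eyeball_radius):
    """iris_px: (u, v) iris center in the zoomed eye image.
    K: 3x3 camera intrinsics. eyeball_center: calibrated 3-D center (camera frame).
    Returns a unit 3-D optical-axis vector."""
    ray = np.linalg.inv(K) @ np.array([iris_px[0], iris_px[1], 1.0])
    ray /= np.linalg.norm(ray)
    # Intersect the camera ray with the eyeball sphere: ||t*ray - c||^2 = r^2.
    b = -2.0 * ray @ eyeball_center
    c = eyeball_center @ eyeball_center - eyeball_radius ** 2
    t = (-b - np.sqrt(max(b * b - 4.0 * c, 0.0))) / 2.0   # nearer intersection
    iris_3d = t * ray
    axis = iris_3d - eyeball_center
    return axis / np.linalg.norm(axis)
```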
  • Publication number: 20170367590
    Abstract: Recent studies in computer vision have shown that, while practically invisible to a human observer, skin color changes due to blood flow can be captured in face videos and, surprisingly, used to estimate the heart rate (HR). While considerable progress has been made in the last few years, many issues remain open. In particular, state-of-the-art approaches are not robust enough to operate in natural conditions (e.g., in the presence of spontaneous movements, facial expressions, or illumination changes). In contrast to previous approaches that estimate the HR by processing all the skin pixels inside a fixed region of interest, we introduce a strategy to dynamically select face regions useful for robust HR estimation. The present approach, inspired by recent advances in matrix completion theory, allows us to predict the HR while simultaneously discovering the best regions of the face to use for estimation.
    Type: Application
    Filed: June 23, 2017
    Publication date: December 28, 2017
    Inventors: Niculae Sebe, Xavier Alameda-Pineda, Sergey Tulyakov, Elisa Ricci, Lijun Yin, Jeffrey F. Cohn
  • Publication number: 20170266595
    Abstract: Embodiments of the present invention provide a filtering element, filtering equipment, and a water circulation cleaning system. In an embodiment, the filtering element includes a filtering screen and filter particles adhered to one side of the filtering screen, with the sizes of the filter particles gradually increasing in the direction from that side toward the other side of the filtering screen. Also provided are filtering equipment including the above filtering element and a water circulation cleaning system including the above filtering equipment.
    Type: Application
    Filed: August 15, 2016
    Publication date: September 21, 2017
    Inventors: Donglei Wen, Zhenshan Lu, Bin Chang, Bo Bai, Shichao Fan, Lijun Yin
  • Publication number: 20160210503
    Abstract: A gaze direction determining system and method is provided. A two-camera system may detect the face with a fixed, wide-angle camera, estimate a rough location for the eye region using an eye detector based on topographic features, and direct an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped onto the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
    Type: Application
    Filed: March 9, 2016
    Publication date: July 21, 2016
    Inventors: Lijun Yin, Michael Reale
  • Patent number: 9372546
    Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. By combining the hand features from the two views, the 3D pointing direction is estimated.
    Type: Grant
    Filed: September 3, 2015
    Date of Patent: June 21, 2016
    Assignee: The Research Foundation for The State University of New York
    Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
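A toy illustration of fusing the two views described above into a 3-D pointing direction. The patent tracks 14 Active Appearance Model contour points per view; here only a hand-base and fingertip point per view are used, and it is assumed purely for the example that the top view images the x-z plane while the side view images the y-z plane.

```python
# Hypothetical two-view fusion sketch in the spirit of patent 9372546.
import numpy as np

def pointing_direction(top_base, top_tip, side_base, side_tip):
    """Each argument is a 2-D image point (pixels).
    top view  -> (x, z) coordinates of the hand base and fingertip
    side view -> (y, z) coordinates of the same two features.
    Returns a unit 3-D pointing direction."""
    dx, dz_top = np.subtract(top_tip, top_base)
    dy, dz_side = np.subtract(side_tip, side_base)
    dz = 0.5 * (dz_top + dz_side)          # the shared axis is averaged
    d = np.array([dx, dy, dz], dtype=float)
    return d / np.linalg.norm(d)

# Usage sketch:
# direction = pointing_direction((120, 80), (180, 60), (90, 82), (60, 58))
```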
  • Patent number: 9311527
    Abstract: A gaze direction determining system and method is provided. A two-camera system may detect the face with a fixed, wide-angle camera, estimate a rough location for the eye region using an eye detector based on topographic features, and direct an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped onto the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
    Type: Grant
    Filed: November 10, 2014
    Date of Patent: April 12, 2016
    Assignee: The Research Foundation for The State University of New York
    Inventors: Lijun Yin, Michael Reale
  • Publication number: 20150378444
    Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. By combining the hand features from the two views, the 3D pointing direction is estimated.
    Type: Application
    Filed: September 3, 2015
    Publication date: December 31, 2015
    Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
  • Patent number: 9128530
    Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. By combining the hand features from the two views, the 3D pointing direction is estimated.
    Type: Grant
    Filed: March 2, 2015
    Date of Patent: September 8, 2015
    Assignee: The Research Foundation for the State University of New York
    Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
  • Publication number: 20150177846
    Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. By combining the hand features from the two views, the 3D pointing direction is estimated.
    Type: Application
    Filed: March 2, 2015
    Publication date: June 25, 2015
    Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
  • Patent number: 8971572
    Abstract: Hand pointing has been an intuitive gesture for human interaction with computers. A hand pointing estimation system is provided, based on two regular cameras, which includes hand region detection, hand finger estimation, feature detection in two views, and 3D pointing direction estimation. The technique may employ a polar coordinate system to represent the hand region, and tests show good robustness to hand orientation variation. To estimate the pointing direction, Active Appearance Models are employed to detect and track, e.g., 14 feature points along the hand contour from a top view and a side view. By combining the hand features from the two views, the 3D pointing direction is estimated.
    Type: Grant
    Filed: August 10, 2012
    Date of Patent: March 3, 2015
    Assignee: The Research Foundation for The State University of New York
    Inventors: Lijun Yin, Shaun Canavan, Kaoning Hu
  • Patent number: 8885882
    Abstract: A gaze direction determining system and method is provided. A two-camera system may detect the face with a fixed, wide-angle camera, estimate a rough location for the eye region using an eye detector based on topographic features, and direct an active pan-tilt-zoom camera to focus in on this eye region. An eye gaze estimation approach employs point-of-regard (PoG) tracking on a large viewing screen. To allow for greater head pose freedom, a calibration approach is provided to find the 3D eyeball location, eyeball radius, and fovea position. Both the iris center and the iris contour points are mapped onto the eyeball sphere (creating a 3D iris disk) to obtain the optical axis; the fovea is then rotated accordingly and the final, visual-axis gaze direction is computed.
    Type: Grant
    Filed: July 16, 2012
    Date of Patent: November 11, 2014
    Assignee: The Research Foundation for The State University of New York
    Inventors: Lijun Yin, Michael Reale