Patents by Inventor Ying-Ko Lu

Ying-Ko Lu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10817072
    Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.
    Type: Grant
    Filed: April 24, 2019
    Date of Patent: October 27, 2020
    Assignee: CM HK Limited
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
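The patent family above centers on re-expressing device-frame sensor readings in a user-referenced global frame before classifying motion. As a rough illustration only (not the patented method), the sketch below rotates an accelerometer sample into a global frame using an orientation quaternion assumed to come from a separate fusion step; the gravity constant and axis convention are likewise assumptions.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # Rotation matrix from quaternion (standard formula).
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R @ v

def device_to_global(accel_device, q_orientation, g=9.81):
    """Convert a device-frame accelerometer sample to the global frame
    and remove gravity, leaving only user-motion acceleration.
    Assumes q_orientation rotates device-frame vectors into the global
    frame and that gravity points along global +Z."""
    accel_global = quat_rotate(q_orientation, np.asarray(accel_device, dtype=float))
    accel_global[2] -= g
    return accel_global

# Example: device rotated 90 degrees about X; the sensed acceleration is pure
# gravity on the device Y axis, so the global-frame motion residual is ~zero.
q = np.array([np.cos(np.pi/4), np.sin(np.pi/4), 0.0, 0.0])  # 90 deg about X
print(device_to_global([0.0, 9.81, 0.0], q))                # ~[0, 0, 0]
```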
  • Publication number: 20190250718
    Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.
    Type: Application
    Filed: April 24, 2019
    Publication date: August 15, 2019
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
  • Patent number: 10275038
    Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.
    Type: Grant
    Filed: June 5, 2017
    Date of Patent: April 30, 2019
    Assignee: CM HK Limited
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
  • Patent number: 10003485
    Abstract: A system for synchronizing the lighting effect patterns of interactive lighting effect devices at a remote location with those at a local location is disclosed herein. Synchronized lighting effects can be produced at the remote location while watching a lighting effect show in which other interactive lighting effect devices are illuminated according to a script at the event venue. The synchronized lighting effects obtained at the remote location create a virtual simulated perception of attending the same concert venue live while watching a live streaming video of it. Low latency between lighting effect changes at the remote location and those observed in the concert venue's live streaming video is achieved by the method of color control signal generation together with a color control pattern blending module that creates a blended video frame comprising a color control pattern, which allows for efficient lighting effect pattern generation at the remote location.
    Type: Grant
    Filed: October 23, 2017
    Date of Patent: June 19, 2018
    Assignee: LUMIC TECHNOLOGY INC.
    Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
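The key mechanism named in this abstract is a blended video frame that carries a color control pattern alongside the concert footage, so remote devices can be driven with low latency from the stream itself. Below is a minimal sketch of how such a pattern might be recovered from a frame; the strip layout, cell count, and cell height are invented for illustration and are not specified by the patent.

```python
import numpy as np

# Hypothetical layout: the broadcaster blends an N-cell color control pattern
# into the top rows of every video frame, one cell per lighting zone.
PATTERN_CELLS = 8   # number of zones encoded in the pattern (assumed)
CELL_HEIGHT = 4     # pixel height of the blended strip (assumed)

def extract_color_pattern(frame):
    """Average each cell of the blended strip to recover one RGB code per zone."""
    h, w, _ = frame.shape
    cell_w = w // PATTERN_CELLS
    strip = frame[:CELL_HEIGHT]
    codes = []
    for i in range(PATTERN_CELLS):
        cell = strip[:, i*cell_w:(i+1)*cell_w]
        codes.append(tuple(int(c) for c in cell.reshape(-1, 3).mean(axis=0)))
    return codes

# Demo frame: the strip alternates red/green cells, the rest is black footage.
frame = np.zeros((360, 640, 3), dtype=np.uint8)
for i in range(PATTERN_CELLS):
    frame[:CELL_HEIGHT, i*80:(i+1)*80] = (255, 0, 0) if i % 2 == 0 else (0, 255, 0)
print(extract_color_pattern(frame))
```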
  • Publication number: 20180088775
    Abstract: Systems and methods for controlling a display are provided, of which an example system includes: an electronic device having a motion sensor module configured to provide a sensed motion value, the motion sensor module having an accelerometer, a gyroscope, and a magnetometer; at least one processor configured to: estimate motion of the device to provide an estimated yaw and an estimated pitch, respectively, based on angular velocity sensed by the gyroscope; determine a control movement user input of the device if: a sum of the acceleration value on each axis corresponds to a force threshold; and both the estimated yaw and the estimated pitch do not correspond to a rotation threshold; determine a control rotation user input of the device based on a difference between magnetic parameters, including measured magnetic field values, and angular speed parameters, including estimated magnetic field values estimated based on angular speed values; and adjust information based on at least one of the control movement or the control rotation user input.
    Type: Application
    Filed: September 26, 2017
    Publication date: March 29, 2018
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu
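The decision logic in this abstract distinguishes a "movement" input (enough force, little rotation) from a "rotation" input (measured magnetic field disagreeing with the gyro-derived estimate). The sketch below is a loose, hedged reading of that logic; all threshold values and the exact comparison rules are assumptions, not the claimed method.

```python
import numpy as np

FORCE_THRESHOLD = 12.0      # m/s^2, assumed
ROTATION_THRESHOLD = 5.0    # degrees, assumed
MAG_DIFF_THRESHOLD = 4.0    # microtesla, assumed

def classify_user_input(accel, yaw_deg, pitch_deg, mag_measured, mag_estimated):
    """A 'movement' input needs enough summed force with little yaw/pitch;
    a 'rotation' input is flagged when the measured magnetic field disagrees
    with the field estimate propagated from angular speed."""
    force = float(np.sum(np.abs(accel)))
    is_movement = (force >= FORCE_THRESHOLD
                   and abs(yaw_deg) < ROTATION_THRESHOLD
                   and abs(pitch_deg) < ROTATION_THRESHOLD)
    mag_diff = float(np.linalg.norm(np.asarray(mag_measured, dtype=float)
                                    - np.asarray(mag_estimated, dtype=float)))
    is_rotation = mag_diff >= MAG_DIFF_THRESHOLD
    return is_movement, is_rotation

# A firm shake with almost no rotation: movement=True, rotation=False.
print(classify_user_input([14.0, 1.0, 0.5], 1.0, 2.0, [30, 0, -20], [30, 1, -20]))
```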
  • Patent number: 9913344
    Abstract: Interactive lighting effect devices configured by an interactive lighting effect control system in an automated wireless manner on a mass scale are provided. RF data bursts are captured to illuminate the interactive lighting effect devices selectively in accordance with matched data. The matched data is formed by combining pattern-related lighting effect data extracted from a QR code on the event ticket with an identification address extracted from a QR code on the interactive lighting effect device; the pattern-related lighting effect data includes a zone code. An improvisational illuminating color control change for any zone assignment can be generated and converted to a set of RGB color codes to be transmitted by the interactive lighting effect control system for broadcasting as data bursts to the interactive lighting effect devices. Different types of data acquisition interfaces are provided for obtaining the matched data.
    Type: Grant
    Filed: January 24, 2017
    Date of Patent: March 6, 2018
    Assignee: LUMIC TECHNOLOGY INC.
    Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
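This abstract's central object is the "matched data": zone-coded pattern data from the ticket's QR code combined with the device's identification address from its own QR code. The toy sketch below assumes simple key=value QR payloads (an invented format) to show how a device might match itself against a zone-addressed broadcast.

```python
# Hypothetical QR payload formats: the ticket QR carries pattern-related data
# including a zone code; the device QR carries the device's identification address.

def build_matched_data(ticket_qr, device_qr):
    """Combine ticket pattern data (with zone code) and the device address."""
    ticket = dict(field.split("=") for field in ticket_qr.split(";"))
    device = dict(field.split("=") for field in device_qr.split(";"))
    return {"zone": ticket["zone"], "pattern": ticket["pattern"], "addr": device["addr"]}

def device_should_light(matched, burst):
    """A broadcast RF burst addresses devices by zone code."""
    return burst["zone"] == matched["zone"]

matched = build_matched_data("zone=A3;pattern=wave", "addr=00:1B:44:11:3A:B7")
print(device_should_light(matched, {"zone": "A3", "rgb": (255, 0, 128)}))  # True
print(device_should_light(matched, {"zone": "B1", "rgb": (0, 255, 0)}))    # False
```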
  • Publication number: 20180049287
    Abstract: A system for synchronizing the lighting effect patterns of interactive lighting effect devices at a remote location with those at a local location is disclosed herein. Synchronized lighting effects can be produced at the remote location while watching a lighting effect show in which other interactive lighting effect devices are illuminated according to a script at the event venue. The synchronized lighting effects obtained at the remote location create a virtual simulated perception of attending the same concert venue live while watching a live streaming video of it. Low latency between lighting effect changes at the remote location and those observed in the concert venue's live streaming video is achieved by the method of color control signal generation together with a color control pattern blending module that creates a blended video frame comprising a color control pattern, which allows for efficient lighting effect pattern generation at the remote location.
    Type: Application
    Filed: October 23, 2017
    Publication date: February 15, 2018
    Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
  • Patent number: 9798395
    Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the portable electronic device, flipping, zooming, or displacing images/pages of the media content displayed on a display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a rotation of the yaw, pitch, or roll of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is responsively flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
    Type: Grant
    Filed: January 4, 2017
    Date of Patent: October 24, 2017
    Assignee: CM HK Limited
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
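The method in this family reduces to comparing a sensed rotation angle against a preset threshold before flipping, zooming, or displacing content. A few-line sketch under that reading follows; the threshold value is chosen arbitrarily.

```python
FIRST_THRESHOLD_ANGLE = 30.0  # degrees, an assumed preset value

def on_rotation_sensed(rotation_angle_deg, action="flip"):
    """Trigger the media action only when the sensed rotation angle
    exceeds the preset first threshold angle; otherwise do nothing."""
    if abs(rotation_angle_deg) > FIRST_THRESHOLD_ANGLE:
        return action  # flip, zoom, or displace the displayed page/image
    return None

print(on_rotation_sensed(42.0))   # 'flip'
print(on_rotation_sensed(10.0))   # None
```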
  • Publication number: 20170269701
    Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.
    Type: Application
    Filed: June 5, 2017
    Publication date: September 21, 2017
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
  • Patent number: 9763311
    Abstract: A portable light illuminating device with an RF receiver, along with an interactive lighting effect control system with an RF transmitter, communicating via wireless data transmissions, are provided. RF data bursts are captured to illuminate LEDs disposed in the portable light illuminating device selectively in accordance with illuminating color and zone assignment data and matching nested hierarchical zone codes, which can be assigned to a seating location within one seating zone, several seating zones, or a segment within one seating zone. An improvisational manual illuminating color control change for any zone assignment can be generated and converted to a set of RGB color codes; likewise, color control signals extracted from a sound track using color show control software on a PC/laptop can be encoded and sequenced using a lighting controller and sent to the wireless RF transmitter for broadcasting as data bursts. A DMX controller and PC/laptop can also be part of the interactive lighting effect control system.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: September 12, 2017
    Assignee: LUMIC TECHNOLOGY INC.
    Inventors: Ta-Wei Huang, Ying-Ko Lu
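The distinctive detail here is the nested hierarchical zone code, which lets one broadcast address a single seat, a segment of a zone, or several zones at once. One plausible encoding, dotted path prefixes, is sketched below; it is an assumption for illustration, not taken from the patent.

```python
def zone_matches(device_zone, target_zone):
    """Nested hierarchical match: a device coded 'venue.zoneB.seg2.seat17'
    responds to a broadcast targeting any enclosing level of the hierarchy
    ('venue', 'venue.zoneB', 'venue.zoneB.seg2', or the full seat code)."""
    device_parts = device_zone.split(".")
    target_parts = target_zone.split(".")
    return device_parts[:len(target_parts)] == target_parts

print(zone_matches("venue.zoneB.seg2.seat17", "venue.zoneB"))   # True: whole zone B
print(zone_matches("venue.zoneB.seg2.seat17", "venue.zoneA"))   # False: other zone
print(zone_matches("venue.zoneB.seg2.seat17", "venue"))         # True: whole venue
```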
  • Patent number: 9690386
    Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least one portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, to perform motion recognition based on the global coordinate system, in order to recognize the user's motion.
    Type: Grant
    Filed: February 8, 2013
    Date of Patent: June 27, 2017
    Assignee: CM HK Limited
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
  • Patent number: 9665984
    Abstract: A method to create a try-on experience wearing virtual 3D eyeglasses using 2D image data of eyeglasses is provided. The virtual 3D eyeglasses are constructed from a set of 2D images of the eyeglasses, configured onto a 3D face or head model, and simulated as being fittingly worn by the wearer. Each set of 2D images includes a pair of 2D lens images, a frontal frame image, and at least one side frame image. Upon detection of a movement of the wearer's face and head in real time, the 3D face or head model and the configuration and alignment of the virtual 3D eyeglasses are modified or adjusted accordingly. Features such as trimming off a portion of the glasses frame, shadow creation, and environment mapping are provided to the virtual 3D eyeglasses in response to translation, scaling, and posture changes of the wearer's head and face in real time.
    Type: Grant
    Filed: July 31, 2014
    Date of Patent: May 30, 2017
    Assignee: ULSee Inc.
    Inventors: Zhou Ye, Chih-Ming Chang, Ying-Ko Lu, Yi-Chia Hsu
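The abstract describes building virtual 3D eyeglasses from a handful of 2D layers (lenses, frontal frame, side frames) and re-fitting them as the tracked head moves. The toy calculation below covers only the frontal-layer placement under head yaw and roll; the 60% pupil-span ratio and the input conventions are illustrative assumptions, not the patented construction.

```python
import numpy as np

def place_glasses(frontal_img_size, head_pose_deg, eye_center_px, ipd_px):
    """Scale and position the frontal-frame 2D layer from tracked eye positions.
    head_pose_deg = (yaw, pitch, roll) from a face tracker (assumed input);
    pitch is ignored in this toy version."""
    yaw, _pitch, roll = head_pose_deg
    # Scale so the rendered frame spans the tracked interpupillary distance,
    # assuming the pupils sit at ~60% of the frame image width (illustrative).
    scale = ipd_px / (0.6 * frontal_img_size[0])
    # Foreshorten the layer horizontally as the head yaws; roll rotates it in-plane.
    width_px = frontal_img_size[0] * scale * abs(np.cos(np.radians(yaw)))
    height_px = frontal_img_size[1] * scale
    return {"center": eye_center_px, "size_px": (width_px, height_px), "rotation_deg": roll}

# A 400x150 px frontal frame image, head yawed 20 degrees, pupils 90 px apart.
print(place_glasses((400, 150), (20.0, 0.0, 5.0), (320, 240), 90.0))
```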
  • Publication number: 20170135165
    Abstract: Interactive lighting effect devices configured by an interactive lighting effect control system in an automated wireless manner on a mass scale are provided. RF data bursts are captured to illuminate the interactive lighting effect devices selectively in accordance with matched data. The matched data is formed by combining pattern-related lighting effect data extracted from a QR code on the event ticket with an identification address extracted from a QR code on the interactive lighting effect device; the pattern-related lighting effect data includes a zone code. An improvisational illuminating color control change for any zone assignment can be generated and converted to a set of RGB color codes to be transmitted by the interactive lighting effect control system for broadcasting as data bursts to the interactive lighting effect devices. Different types of data acquisition interfaces are provided for obtaining the matched data.
    Type: Application
    Filed: January 24, 2017
    Publication date: May 11, 2017
    Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
  • Patent number: 9642521
    Abstract: A method for automatically measuring pupillary distance includes extracting facial features from a face image; a head current-center indicator is displayed based on the facial feature extraction, an elliptical frame and a target center indicator are shown, and a first distance between the head current-center indicator and the target center indicator is calculated to check whether it is below a threshold range, after which the head current-center indicator, elliptical frame, and target center indicator are allowed to disappear. A card window based on the facial tracking result is shown, credit card band detection is performed to check whether the card is located within the card window, and the card window then disappears. An elliptical frame of the moving head and a target elliptical frame are shown; the elliptical frame of the moving head is aligned with the target elliptical frame while maintaining a correct head posture. Once the elliptical frame of the moving head is aligned with the target elliptical frame, both are allowed to disappear from view, and a pupillary distance measurement is performed.
    Type: Grant
    Filed: November 25, 2014
    Date of Patent: May 9, 2017
    Assignee: ULSee Inc.
    Inventors: Zhou Ye, Sheng-Wen Jeng, Ying-Ko Lu, Shih Wei Liu
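The credit card detected in this workflow plausibly serves as a metric scale reference, since ID-1 cards have a standardized 85.6 mm width; the abstract does not spell this out, so treat the sketch below as an assumption-laden illustration of the final measurement step only.

```python
import math

CREDIT_CARD_WIDTH_MM = 85.6  # ISO/IEC 7810 ID-1 card width, used as scale reference

def pupillary_distance_mm(pupil_left_px, pupil_right_px, card_width_px):
    """Convert the pixel distance between detected pupils to millimetres,
    using the detected card width in the same image as the metric scale."""
    dx = pupil_right_px[0] - pupil_left_px[0]
    dy = pupil_right_px[1] - pupil_left_px[1]
    pd_px = math.hypot(dx, dy)
    return pd_px * CREDIT_CARD_WIDTH_MM / card_width_px

# Pupils 210 px apart, card spans 300 px -> PD ~ 59.9 mm
print(round(pupillary_distance_mm((215, 240), (425, 240), 300.0), 1))
```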
  • Publication number: 20170115750
    Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the portable electronic device, flipping, zooming, or displacing images/pages of the media content displayed on a display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a rotation of the yaw, pitch, or roll of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is responsively flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
    Type: Application
    Filed: January 4, 2017
    Publication date: April 27, 2017
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
  • Publication number: 20170048951
    Abstract: A portable light illuminating device with an RF receiver, along with an interactive lighting effect control system with an RF transmitter, communicating via wireless data transmissions, are provided. RF data bursts are captured to illuminate LEDs disposed in the portable light illuminating device selectively in accordance with illuminating color and zone assignment data and matching nested hierarchical zone codes, which can be assigned to a seating location within one seating zone, several seating zones, or a segment within one seating zone. An improvisational manual illuminating color control change for any zone assignment can be generated and converted to a set of RGB color codes; likewise, color control signals extracted from a sound track using color show control software on a PC/laptop can be encoded and sequenced using a lighting controller and sent to the wireless RF transmitter for broadcasting as data bursts. A DMX controller and PC/laptop can also be part of the interactive lighting effect control system.
    Type: Application
    Filed: August 11, 2015
    Publication date: February 16, 2017
    Inventors: Ta-Wei Huang, Ying-Ko Lu
  • Patent number: 9564075
    Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the portable electronic device, flipping, zooming, or displacing images/pages of the media content displayed on a display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a rotation of the yaw, pitch, or roll of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is responsively flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
    Type: Grant
    Filed: December 14, 2010
    Date of Patent: February 7, 2017
    Assignee: CYWEEMOTION HK LIMITED
    Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
  • Patent number: 9317136
    Abstract: An image-based object tracking system and an image-based object tracking method are provided. The image-based object tracking system includes an object, a camera, a computing device, and a display device. A color stripe is disposed on the surface of the object and divides the surface into a first section and a second section. The camera is configured to capture real-time images of the object, and an object tracking algorithm is stored in the computing device. The display device includes a display screen, is electrically connected to the computing device, and is configured to display the real-time images of the object. By using the image-based object tracking system provided in the invention, the object can be tracked accurately and efficiently without interference from the background image.
    Type: Grant
    Filed: January 10, 2014
    Date of Patent: April 19, 2016
    Assignee: ULSee Inc.
    Inventors: Zhou Ye, Sheng-Weng Jeng, Chih-Ming Chang, Hsin-Wei Hsiao, Yi-Chia Hsu, Ying-Ko Lu
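Tracking an object by a color stripe on its surface suggests straightforward color segmentation. The sketch below, which is not taken from the patent, finds the stripe's centroid in an HSV frame by thresholding hue, saturation, and value; all thresholds are invented for illustration.

```python
import numpy as np

def find_stripe_center(frame_hsv, hue_lo, hue_hi, sat_min=80, val_min=60):
    """Segment pixels whose hue falls within the stripe's range (and that are
    saturated and bright enough), then return the centroid of the mask,
    or None when the stripe is not visible in the frame."""
    h, s, v = frame_hsv[..., 0], frame_hsv[..., 1], frame_hsv[..., 2]
    mask = (h >= hue_lo) & (h <= hue_hi) & (s >= sat_min) & (v >= val_min)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Demo: a synthetic HSV frame with a green stripe (hue ~60) at x = 100..119.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:140, 100:120] = (60, 200, 200)
print(find_stripe_center(frame, 50, 70))  # ~ (109.5, 119.5)
```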
  • Patent number: 9262869
    Abstract: A method of 3D morphing driven by facial tracking is provided. First, a 3D model is loaded. After that, facial feature control points and boundary control points are picked respectively. A configuration file "A.config", including the facial feature control point and boundary control point data picked for the 3D avatar, is saved. The facial tracking algorithm is started, and then "A.config" is loaded. After that, controlled morphing of the 3D avatar by facial tracking based on "A.config" is performed in real time by a deformation method with control points. Meanwhile, teeth and tongue tracking of the real-time face image, along with scaling, translation, and rotation of the real-time 3D avatar image, is also provided. In addition, a control point reassignment and reconfiguration method and a pupil movement detection method are also provided in the method of 3D morphing driven by facial tracking.
    Type: Grant
    Filed: July 12, 2013
    Date of Patent: February 16, 2016
    Assignee: ULSee Inc.
    Inventors: Zhou Ye, Ying-Ko Lu, Yi-Chia Hsu, Sheng-Wen Jeng
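The workflow above hinges on a saved configuration ("A.config") that binds tracked facial landmarks to control points on the 3D avatar, which then drive a control-point deformation in real time. A highly simplified sketch follows; the JSON layout, landmark indexing, and linear radial falloff are all assumptions standing in for the patent's deformation method.

```python
import json
import numpy as np

# A stand-in for "A.config": tracked landmark index -> controlled mesh vertex index.
config = json.loads('{"control_points": {"33": 210, "48": 845, "54": 861}}')

def morph(vertices, landmarks_now, landmarks_neutral, cfg, radius=0.3):
    """Displace each controlled vertex by its landmark's motion since the
    neutral pose, blending the displacement onto nearby vertices with a
    linear radial falloff (a toy deformation, not the patented one)."""
    out = vertices.copy()
    for lm_idx, v_idx in cfg["control_points"].items():
        delta = landmarks_now[int(lm_idx)] - landmarks_neutral[int(lm_idx)]
        dist = np.linalg.norm(vertices - vertices[v_idx], axis=1)
        weight = np.clip(1.0 - dist / radius, 0.0, 1.0)[:, None]
        out += weight * delta
    return out

# Demo: random avatar vertices and 68 tracked landmarks; move one mouth corner.
verts = np.random.default_rng(0).random((1000, 3))
neutral = np.random.default_rng(1).random((68, 3))
now = neutral.copy()
now[48] += (0.0, 0.05, 0.0)   # mouth corner moves up: a smile
print(np.abs(morph(verts, now, neutral, config) - verts).max())
```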
  • Publication number: 20160035133
    Abstract: A method to create a try-on experience wearing virtual 3D eyeglasses using 2D image data of eyeglasses is provided. The virtual 3D eyeglasses are constructed from a set of 2D images of the eyeglasses, configured onto a 3D face or head model, and simulated as being fittingly worn by the wearer. Each set of 2D images includes a pair of 2D lens images, a frontal frame image, and at least one side frame image. Upon detection of a movement of the wearer's face and head in real time, the 3D face or head model and the configuration and alignment of the virtual 3D eyeglasses are modified or adjusted accordingly. Features such as trimming off a portion of the glasses frame, shadow creation, and environment mapping are provided to the virtual 3D eyeglasses in response to translation, scaling, and posture changes of the wearer's head and face in real time.
    Type: Application
    Filed: July 31, 2014
    Publication date: February 4, 2016
    Applicant: ULSee Inc.
    Inventors: Zhou Ye, Chih-Ming Chang, Ying-Ko Lu, Yi-Chia Hsu