Patents by Inventor Ying-Ko Lu
Ying-Ko Lu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10817072
Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least a portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, so that motion recognition can be performed in the global coordinate system to recognize the user's motion.
Type: Grant
Filed: April 24, 2019
Date of Patent: October 27, 2020
Assignee: CM HK Limited
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
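The core of this abstract is a frame conversion: readings taken along the device's own axes are rotated into a user-fixed global frame before recognition runs. A minimal Python sketch of that step, assuming a simple yaw-only rotation for illustration (the abstract does not specify the fusion algorithm, so the function names and the single-axis rotation are assumptions):

```python
import math

def rotation_matrix_z(yaw_rad):
    """Rotation about the vertical (z) axis by the device's yaw angle."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def to_global_frame(accel_device, yaw_rad):
    """Rotate a device-frame acceleration vector into the global frame."""
    R = rotation_matrix_z(yaw_rad)
    return [sum(R[i][j] * accel_device[j] for j in range(3)) for i in range(3)]

# An acceleration along the device x-axis, with the device yawed 90 degrees,
# appears along the global y-axis after conversion.
g = to_global_frame([1.0, 0.0, 0.0], math.pi / 2)
```

In a full sensor-fusion pipeline the rotation would come from fusing gyroscope, accelerometer, and magnetometer data rather than a single yaw angle, but the frame change itself is this matrix product.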
-
Publication number: 20190250718
Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least a portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, so that motion recognition can be performed in the global coordinate system to recognize the user's motion.
Type: Application
Filed: April 24, 2019
Publication date: August 15, 2019
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
-
Patent number: 10275038
Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least a portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, so that motion recognition can be performed in the global coordinate system to recognize the user's motion.
Type: Grant
Filed: June 5, 2017
Date of Patent: April 30, 2019
Assignee: CM HK LIMITED
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
-
Patent number: 10003485
Abstract: A system for synchronizing the lighting effect patterns of interactive lighting effect devices at a remote location with those at a local location is disclosed herein. Synchronized lighting effects can be produced at the remote location while watching a lighting effect show, using interactive lighting effect devices illuminated according to the script in use at the event venue. The synchronized effects at the remote location give a viewer of the live streaming video a virtual, simulated perception of attending the same concert venue live. Low latency between the lighting effect changes produced at the remote location and those observed in the concert venue's live streaming video is achieved by the method of color control signal generation together with a color control pattern blending module, which creates a blended video frame comprising a color control pattern and thereby allows efficient lighting effect pattern generation at the remote location.
Type: Grant
Filed: October 23, 2017
Date of Patent: June 19, 2018
Assignee: LUMIC TECHNOLOGY INC.
Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
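The blending step the abstract describes embeds a color control pattern directly into a video frame, so the lighting cue travels in the same stream as the video and stays in sync with it. A hypothetical Python sketch of that idea, assuming the pattern is written as a row of color blocks along the bottom of the frame (the abstract does not specify where or how the pattern is placed):

```python
def blend_control_pattern(frame, pattern_rgb):
    """Embed a color-control pattern as equal-width color blocks along the
    bottom row of a video frame, so a remote decoder can recover the lighting
    cue in sync with the streamed video. `frame` is a list of rows of
    (r, g, b) tuples; the source frame is left untouched."""
    blended = [row[:] for row in frame]      # shallow copy of each row
    width = len(frame[0])
    block = width // len(pattern_rgb)        # pixels per pattern entry
    for x in range(width):
        color = pattern_rgb[min(x // block, len(pattern_rgb) - 1)]
        blended[-1][x] = color               # overwrite only the last row
    return blended

video_frame = [[(0, 0, 0)] * 8 for _ in range(4)]   # tiny 8x4 black frame
cue = [(255, 0, 0), (0, 255, 0)]                    # red zone, green zone
out = blend_control_pattern(video_frame, cue)
```

A decoder at the remote location would sample the same strip of each received frame and forward the recovered colors to the local lighting effect devices.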
-
Publication number: 20180088775
Abstract: Systems and methods for controlling a display are provided. An example system includes: an electronic device having a motion sensor module configured to provide a sensed motion value, the motion sensor module having an accelerometer, a gyroscope, and a magnetometer; and at least one processor configured to: estimate motion of the device to provide an estimated yaw and an estimated pitch, respectively, based on angular velocity sensed by the gyroscope; determine a control movement user input of the device if a sum of the acceleration value on each axis corresponds to a force threshold and both the estimated yaw and the estimated pitch do not correspond to a rotation threshold; determine a control rotation user input of the device based on a difference between magnetic parameters, including measured magnetic field values, and angular speed parameters, including estimated magnetic field values estimated based on angular speed values; and adjust information based on at least one of the control movement or the …
Type: Application
Filed: September 26, 2017
Publication date: March 29, 2018
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu
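The "control movement" test above combines two conditions: the summed per-axis acceleration must reach a force threshold while neither the estimated yaw nor the estimated pitch reaches a rotation threshold, which distinguishes a deliberate shake from a rotation gesture. A minimal sketch in Python, with threshold values and the function name assumed for illustration:

```python
def classify_input(ax, ay, az, yaw, pitch,
                   force_threshold=12.0, rotation_threshold=0.3):
    """Return 'movement' when the summed per-axis acceleration exceeds the
    force threshold while neither estimated yaw nor estimated pitch exceeds
    the rotation threshold; otherwise report no movement input."""
    force = abs(ax) + abs(ay) + abs(az)
    rotating = abs(yaw) >= rotation_threshold or abs(pitch) >= rotation_threshold
    if force >= force_threshold and not rotating:
        return "movement"
    return None

# A sharp shake with almost no rotation reads as a movement input...
assert classify_input(9.0, 4.0, 0.5, yaw=0.05, pitch=0.1) == "movement"
# ...while the same force during a visible rotation does not.
assert classify_input(9.0, 4.0, 0.5, yaw=0.8, pitch=0.1) is None
```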
-
Patent number: 9913344
Abstract: Interactive lighting effect devices that are configured by an interactive lighting effect control system in an automated, wireless manner on a mass scale are provided. RF data bursts are captured to illuminate the interactive lighting effect devices selectively in accordance with matched data. The matched data is formed by combining pattern-related lighting effect data extracted from a QR code on the event ticket with an identification address extracted from a QR code on the interactive lighting effect device; the pattern-related data includes a zone code. An improvised illuminating color change for any zone assignment can be generated, converted to a set of RGB color codes, and transmitted by the interactive lighting effect control system for broadcast as data bursts to the interactive lighting effect devices. Different types of data acquisition interfaces are provided for obtaining the matched data.
Type: Grant
Filed: January 24, 2017
Date of Patent: March 6, 2018
Assignee: LUMIC TECHNOLOGY INC.
Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
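The pairing step here merges two independently scanned QR payloads, the lighting-pattern data from the ticket and the identification address from the device, into one record that the control system can address. A hypothetical sketch, with the dictionary keys assumed (the abstract does not specify the QR payload layout):

```python
def build_matched_data(ticket_qr: dict, device_qr: dict) -> dict:
    """Combine the lighting-pattern data decoded from an event-ticket QR code
    with the identification address decoded from a device QR code into one
    matched record for the lighting effect control system."""
    return {
        "device_address": device_qr["address"],
        "zone_code": ticket_qr["zone_code"],   # zone code rides with the pattern
        "pattern": ticket_qr["pattern"],
    }

ticket = {"zone_code": "A-12", "pattern": "pulse-red"}   # decoded ticket QR
device = {"address": "0x3F2A"}                           # decoded device QR
matched = build_matched_data(ticket, device)
```

Once matched, a broadcast data burst carrying the zone code reaches every device whose record shares that zone, without per-device configuration.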
-
Publication number: 20180049287
Abstract: A system for synchronizing the lighting effect patterns of interactive lighting effect devices at a remote location with those at a local location is disclosed herein. Synchronized lighting effects can be produced at the remote location while watching a lighting effect show, using interactive lighting effect devices illuminated according to the script in use at the event venue. The synchronized effects at the remote location give a viewer of the live streaming video a virtual, simulated perception of attending the same concert venue live. Low latency between the lighting effect changes produced at the remote location and those observed in the concert venue's live streaming video is achieved by the method of color control signal generation together with a color control pattern blending module, which creates a blended video frame comprising a color control pattern and thereby allows efficient lighting effect pattern generation at the remote location.
Type: Application
Filed: October 23, 2017
Publication date: February 15, 2018
Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
-
Patent number: 9798395
Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the device, flipping, zooming, or displacing the images/pages of the media content shown in the display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a yaw, pitch, or roll rotation of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
Type: Grant
Filed: January 4, 2017
Date of Patent: October 24, 2017
Assignee: CM HK LIMITED
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
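The control loop in this abstract reduces to a single comparison: act on the media content only when the sensed rotation angle exceeds the preset threshold angle. A minimal sketch, with the threshold value and action names assumed for illustration:

```python
def respond_to_rotation(rotation_deg, threshold_deg=30.0, action="flip"):
    """Trigger a media-content action only when the sensed rotation angle
    exceeds the preset threshold angle; smaller rotations are ignored."""
    if abs(rotation_deg) > threshold_deg:
        return action
    return None

assert respond_to_rotation(45.0) == "flip"              # deliberate tilt: flip
assert respond_to_rotation(-50.0, action="zoom") == "zoom"
assert respond_to_rotation(10.0) is None                # small wobble: ignored
```

Taking the absolute value makes the gesture symmetric in both rotation directions; a real implementation would likely keep separate thresholds per axis (yaw, pitch, roll).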
-
Publication number: 20170269701
Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least a portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, so that motion recognition can be performed in the global coordinate system to recognize the user's motion.
Type: Application
Filed: June 5, 2017
Publication date: September 21, 2017
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
-
Patent number: 9763311
Abstract: A portable light illuminating device with an RF receiver, along with an interactive lighting effect control system with an RF transmitter, communicating via wireless data transmission, are provided. RF data bursts are captured to illuminate the LEDs disposed in the portable light illuminating device selectively, in accordance with illuminating color and zone assignment data and matching nested hierarchical zone codes, which can be assigned to a seating location within one seating zone, to several seating zones, or to a segment within one seating zone. An improvised manual illuminating color change for any zone assignment can be generated and converted to a set of RGB color codes; likewise, color control signals extracted from a sound track using color show control software on a PC/laptop can be encoded and sequenced by a lighting controller and transmitted to the wireless RF transmitter for broadcast as data bursts. A DMX controller and a PC/laptop can also be part of the interactive lighting effect control system.
Type: Grant
Filed: August 11, 2015
Date of Patent: September 12, 2017
Assignee: LUMIC TECHNOLOGY INC.
Inventors: Ta-Wei Huang, Ying-Ko Lu
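The nested hierarchical zone codes in this abstract let one broadcast address a whole seating zone, a segment of it, or a single seat. One plausible encoding, assumed here for illustration since the abstract does not give the code format, is a dot-separated hierarchy where a broadcast code covers every device code nested under it:

```python
def zone_matches(device_zone: str, broadcast_zone: str) -> bool:
    """True when a broadcast zone code covers a device's nested zone code:
    'A' covers every seat in zone A, 'A.3' covers only segment 3 of zone A,
    and 'A.3.17' addresses a single seating location."""
    device = device_zone.split(".")
    target = broadcast_zone.split(".")
    # The broadcast covers the device when it is a prefix of the device's path.
    return device[:len(target)] == target

assert zone_matches("A.3.17", "A")        # whole-zone broadcast
assert zone_matches("A.3.17", "A.3")      # segment broadcast
assert not zone_matches("B.1.02", "A.3")  # different zone stays dark
```

Each receiver would apply this check to the zone code carried in an incoming data burst and illuminate its LEDs only on a match.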
-
Patent number: 9690386
Abstract: A method and apparatus for performing motion recognition using motion sensor fusion, and an associated computer program product, are provided, where the method is applied to an electronic device. The method includes the steps of: utilizing a plurality of motion sensors of the electronic device to obtain sensor data respectively corresponding to the plurality of motion sensors, the sensor data being measured in a device coordinate system of the electronic device, wherein the plurality of motion sensors includes inertial motion sensors; and performing sensor fusion according to the sensor data by converting at least a portion of the sensor data and derivatives of the sensor data into converted data based on a global coordinate system of a user of the electronic device, so that motion recognition can be performed in the global coordinate system to recognize the user's motion.
Type: Grant
Filed: February 8, 2013
Date of Patent: June 27, 2017
Assignee: CM HK LIMITED
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Chin-Lung Lee
-
Patent number: 9665984
Abstract: A method for creating a try-on experience with virtual 3D eyeglasses, using 2D image data of the eyeglasses, is provided. The virtual 3D eyeglasses are constructed from a set of 2D eyeglass images, configured onto a 3D face or head model, and simulated as being fittingly worn by the wearer. Each set of 2D eyeglass images includes a pair of 2D lens images, a frontal frame image, and at least one side frame image. Upon detection of a movement of the wearer's face and head in real time, the 3D face or head model and the configuration and alignment of the virtual 3D eyeglasses are modified or adjusted accordingly. Features such as trimming off portions of the glasses frame, shadow creation, and environment mapping are applied to the virtual 3D eyeglasses in response to translation, scaling, and posture changes of the wearer's head and face in real time.
Type: Grant
Filed: July 31, 2014
Date of Patent: May 30, 2017
Assignee: ULSee Inc.
Inventors: Zhou Ye, Chih-Ming Chang, Ying-Ko Lu, Yi-Chia Hsu
-
Publication number: 20170135165
Abstract: Interactive lighting effect devices that are configured by an interactive lighting effect control system in an automated, wireless manner on a mass scale are provided. RF data bursts are captured to illuminate the interactive lighting effect devices selectively in accordance with matched data. The matched data is formed by combining pattern-related lighting effect data extracted from a QR code on the event ticket with an identification address extracted from a QR code on the interactive lighting effect device; the pattern-related data includes a zone code. An improvised illuminating color change for any zone assignment can be generated, converted to a set of RGB color codes, and transmitted by the interactive lighting effect control system for broadcast as data bursts to the interactive lighting effect devices. Different types of data acquisition interfaces are provided for obtaining the matched data.
Type: Application
Filed: January 24, 2017
Publication date: May 11, 2017
Inventors: Ying-Ko Lu, Ta-Wei Huang, Ta-Jen Lin, Chih-Ming Chang, Wen-Chih Wang
-
Patent number: 9642521
Abstract: A method for automatically measuring pupillary distance includes: extracting facial features from a face image; displaying a head current center indicator based on the facial feature extraction, together with an elliptical frame and a target center indicator; and calculating a first distance between the head current center indicator and the target center indicator to check whether it is below a threshold range, after which the head current center indicator, elliptical frame, and target center indicator disappear. A card window based on the facial tracking result is then shown, credit card band detection checks whether the card is located within the card window, and the card window then disappears. Next, an elliptical frame following the moving head and a target elliptical frame are shown, and the user aligns the moving elliptical frame with the target elliptical frame while maintaining a correct head posture. Once the two frames are aligned, they disappear from view and the pupillary distance measurement is performed.
Type: Grant
Filed: November 25, 2014
Date of Patent: May 9, 2017
Assignee: ULSee Inc.
Inventors: Zhou Ye, Sheng-Wen Jeng, Ying-Ko Lu, Shih Wei Liu
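The credit card in this workflow serves as a physical scale reference: its standardized width converts the pixel distance between the pupils into millimetres. A sketch of that final measurement step, assuming the pixel distances have already been obtained from the tracking stages described above:

```python
# A credit card has a standardized width of 85.60 mm (ISO/IEC 7810 ID-1),
# which turns a pixel distance between the pupils into millimetres.
CARD_WIDTH_MM = 85.60

def pupillary_distance_mm(pupil_px, card_px):
    """Scale the detected pupil-to-pupil pixel distance by the ratio of the
    card's physical width to its width in pixels in the same image."""
    return pupil_px * (CARD_WIDTH_MM / card_px)

# Pupils 310 px apart in an image where the card spans 428 px.
pd = pupillary_distance_mm(pupil_px=310.0, card_px=428.0)
```

This is valid only when the card and the pupils lie at roughly the same distance from the camera, which is why the method guides the head into a fixed posture first.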
-
Publication number: 20170115750
Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the device, flipping, zooming, or displacing the images/pages of the media content shown in the display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a yaw, pitch, or roll rotation of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
Type: Application
Filed: January 4, 2017
Publication date: April 27, 2017
Inventors: Zhou Ye, Shun-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
-
Publication number: 20170048951
Abstract: A portable light illuminating device with an RF receiver, along with an interactive lighting effect control system with an RF transmitter, communicating via wireless data transmission, are provided. RF data bursts are captured to illuminate the LEDs disposed in the portable light illuminating device selectively, in accordance with illuminating color and zone assignment data and matching nested hierarchical zone codes, which can be assigned to a seating location within one seating zone, to several seating zones, or to a segment within one seating zone. An improvised manual illuminating color change for any zone assignment can be generated and converted to a set of RGB color codes; likewise, color control signals extracted from a sound track using color show control software on a PC/laptop can be encoded and sequenced by a lighting controller and transmitted to the wireless RF transmitter for broadcast as data bursts. A DMX controller and a PC/laptop can also be part of the interactive lighting effect control system.
Type: Application
Filed: August 11, 2015
Publication date: February 16, 2017
Inventors: Ta-Wei Huang, Ying-Ko Lu
-
Patent number: 9564075
Abstract: An electronic control apparatus including motion sensors is integrated into a portable electronic device to responsively control media content stored in the device, flipping, zooming, or displacing the images/pages of the media content shown in the display field of its display in response to motion sensor signals. Accordingly, a responsive control method includes the steps of: presetting a first threshold angle; sensing a first rotation angle of the portable electronic device and sending out a first rotation sensing signal when a yaw, pitch, or roll rotation of the portable electronic device is detected by a sensing module including motion sensors; and receiving the first rotation sensing signal to calculate and determine whether the first rotation angle is greater than the first threshold angle, so that the media content stored in the electronic control apparatus is flipped, zoomed, or displaced when the first rotation angle is greater than the first threshold angle.
Type: Grant
Filed: December 14, 2010
Date of Patent: February 7, 2017
Assignee: CYWEEMOTION HK LIMITED
Inventors: Zhou Ye, Shan-Nan Liou, Ying-Ko Lu, Wen-Hao Chang, Tigran Tadevosyan
-
Patent number: 9317136
Abstract: An image-based object tracking system and an image-based object tracking method are provided. The image-based object tracking system includes an object, a camera, a computing device, and a display device. A color stripe is disposed on the surface of the object and divides the surface into a first section and a second section. The camera is configured to capture real-time images of the object, and the object tracking algorithm is stored in the computing device. The display device includes a display screen, is electrically connected to the computing device, and is configured to display the real-time images of the object. By using the image-based object tracking system provided in the invention, the object can be tracked accurately and efficiently without interference from the background image.
Type: Grant
Filed: January 10, 2014
Date of Patent: April 19, 2016
Assignee: UL See Inc.
Inventors: Zhou Ye, Sheng-Weng Jeng, Chih-Ming Chang, Hsin-Wei Hsiao, Yi-Chia Hsu, Ying-Ko Lu
-
Patent number: 9262869
Abstract: A method of 3D morphing driven by facial tracking is provided. First, a 3D model is loaded. Facial feature control points and boundary control points are then picked respectively, and a configuration file "A.config", containing the facial feature control point and boundary control point data picked for the 3D avatar, is saved. The facial tracking algorithm is started and "A.config" is loaded. Controlled morphing of the 3D avatar by facial tracking based on "A.config" is then performed in real time by a deformation method with control points. Meanwhile, teeth and tongue tracking of the real-time face image, and scaling, translation, and rotation of the real-time 3D avatar image, are also provided. In addition, a control point reassignment and reconfiguration method and a pupil movement detection method are provided as part of the method of 3D morphing driven by facial tracking.
Type: Grant
Filed: July 12, 2013
Date of Patent: February 16, 2016
Assignee: UL See Inc.
Inventors: Zhou Ye, Ying-Ko Lu, Yi-Chia Hsu, Sheng-Wen Jeng
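The control-point-driven deformation at the heart of this abstract moves each mesh vertex by a weighted sum of the displacements of the tracked control points. A hypothetical Python sketch of that core step; both the JSON layout assumed for "A.config" and the function names are illustrations, since the abstract does not describe the file format or the deformation weights:

```python
import json

def load_control_points(path):
    """Load the saved control-point configuration (the 'A.config' of the
    abstract); this JSON layout is an assumption for illustration."""
    with open(path) as f:
        cfg = json.load(f)
    return cfg["facial_feature_points"], cfg["boundary_points"]

def morph_vertex(vertex, control_deltas, weights):
    """Move one mesh vertex by a weighted sum of control-point displacements,
    the core step of control-point-driven deformation."""
    return tuple(
        v + sum(w * d[axis] for w, d in zip(weights, control_deltas))
        for axis, v in enumerate(vertex)
    )

# One vertex pulled by two tracked control points with different influence.
v = morph_vertex((0.0, 0.0, 0.0),
                 control_deltas=[(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)],
                 weights=[0.5, 0.25])
```

In practice the per-vertex weights would fall off with distance from each control point, so nearby control points dominate the deformation.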
-
Publication number: 20160035133
Abstract: A method for creating a try-on experience with virtual 3D eyeglasses, using 2D image data of the eyeglasses, is provided. The virtual 3D eyeglasses are constructed from a set of 2D eyeglass images, configured onto a 3D face or head model, and simulated as being fittingly worn by the wearer. Each set of 2D eyeglass images includes a pair of 2D lens images, a frontal frame image, and at least one side frame image. Upon detection of a movement of the wearer's face and head in real time, the 3D face or head model and the configuration and alignment of the virtual 3D eyeglasses are modified or adjusted accordingly. Features such as trimming off portions of the glasses frame, shadow creation, and environment mapping are applied to the virtual 3D eyeglasses in response to translation, scaling, and posture changes of the wearer's head and face in real time.
Type: Application
Filed: July 31, 2014
Publication date: February 4, 2016
Applicant: ULSee Inc.
Inventors: Zhou Ye, Chih-Ming Chang, Ying-Ko Lu, Yi-Chia Hsu