Patent Applications Published on July 9, 2020
-
Publication number: 20200218339
Abstract: A locomotion system for use with a virtual environment technology includes a platform configured to support a user, a harness support assembly coupled to the platform and extending upwardly from the platform, and a safety harness configured to be worn by the user. The harness support assembly includes a support halo positioned above the platform and extending about a vertical central axis. The safety harness includes an interface structure moveably coupled to the support halo.
Type: Application
Filed: March 9, 2020
Publication date: July 9, 2020
Applicant: Virtuix Holdings Inc.
Inventor: Jan GOETGELUK
-
Publication number: 20200218340
Abstract: The present disclosure provides a processing circuit of a display panel, a display method and a display device. The processing circuit includes: a line-of-sight acquisition module configured to track each eyeball to determine a concern region of the line of sight on the display panel and a region other than the concern region; a control module configured to acquire original image data of an image to be displayed on the display panel, process first original image data corresponding to the concern region and/or second original image data corresponding to the other region, and output first image generation data corresponding to the concern region and second image generation data corresponding to the other region and having a resolution lower than that of the first image generation data; and a display signal output module configured to output a display signal to the display panel in accordance with the first image generation data and the second image generation data.
Type: Application
Filed: August 10, 2017
Publication date: July 9, 2020
Applicants: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE OPTOELECTRONICS TECHNOLOGY CO., LTD.
Inventors: Tiankuo SHI, Xue DONG, Dong CHEN, Xiaomang ZHANG, Wei SUN, Hao ZHANG, Lingyun SHI, Xiaobo XIE, Zijiao XUE, Bo GAO, Yafei LI, Jinxing LIU, Yan LI, Yue LI, Xiangyi CHEN, Shuaishuai XU, Xiaochuan CHEN, Shengji YANG
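The foveated scheme this abstract describes (full resolution inside the gaze "concern" region, reduced resolution elsewhere) can be sketched in a few lines. This is a minimal illustration, not the patent's actual processing circuit: the function name, the block-averaging downsampling, and the grayscale list-of-lists image are all assumptions.

```python
def foveate(image, region, factor=2):
    """Return a new image: full resolution inside `region` (r0, r1, c0, c1),
    block-averaged by `factor` outside it."""
    r0, r1, c0, c1 = region
    out = []
    for r, row in enumerate(image):
        out_row = []
        for c, px in enumerate(row):
            if r0 <= r < r1 and c0 <= c < c1:
                out_row.append(px)  # concern region: keep the original pixel
            else:
                # other region: replace with the average of its factor x factor block
                br, bc = (r // factor) * factor, (c // factor) * factor
                block = [image[i][j]
                         for i in range(br, min(br + factor, len(image)))
                         for j in range(bc, min(bc + factor, len(row)))]
                out_row.append(sum(block) / len(block))
        out.append(out_row)
    return out

# 4x4 test image with pixel values 0..15; gaze fixed on the top-left quadrant
img = [[float(r * 4 + c) for c in range(4)] for r in range(4)]
low = foveate(img, region=(0, 2, 0, 2))
```

The downsampled half carries less image generation data per pixel, which is the bandwidth saving the abstract is after.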
-
Publication number: 20200218341
Abstract: Embodiments of the present disclosure relate to augmented reality (AR) safety enhancement. In embodiments, an eye-gaze time indicating a period in which a user using an AR application is viewing a screen of a mobile device running the AR application can be determined. The eye-gaze time can then be compared to an eye-gaze threshold. In response to a determination that the eye-gaze time exceeds the eye-gaze threshold, an alert can be issued to the mobile device running the AR application. In embodiments, a set of proximity data can be received. The set of proximity data can be analyzed to determine a number of nearby devices. A determination can be made whether the number of nearby devices exceeds a safety threshold. When a determination is made that the number of nearby devices exceeds the safety threshold, an alert can be issued to a device having a running AR application.
Type: Application
Filed: January 3, 2019
Publication date: July 9, 2020
Inventors: Rebecca D. Young, Stewart J. Hyman, Manvendra Gupta, Rhonda L. Childress
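The two alert conditions above reduce to simple threshold comparisons. A hedged sketch, where the function name and the default threshold values are illustrative assumptions (the publication does not specify them):

```python
def should_alert(eye_gaze_time, nearby_devices,
                 gaze_threshold=5.0, safety_threshold=10):
    """Return the list of alert reasons for an AR session."""
    alerts = []
    if eye_gaze_time > gaze_threshold:     # user has stared at the screen too long
        alerts.append("eye-gaze")
    if nearby_devices > safety_threshold:  # crowded area inferred from proximity data
        alerts.append("proximity")
    return alerts
```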
-
Publication number: 20200218342
Abstract: Provided are techniques for personalized adaptation of Virtual Reality (VR) content based on eye strain context. An initial eye strain context for a user while wearing a VR headset to view VR content in a User Interface (UI) is determined. A UI adaptation and an intensity of the UI adaptation are identified, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. Modified VR content is rendered in the UI by applying the UI adaptation based on the intensity of the UI adaptation. An updated eye strain context is determined. In response to determining that the updated eye strain context indicates that eye strain has decreased, a priority weight for the UI adaptation is increased, and the UI adaptation, the intensity of the UI adaptation, and the priority weight are saved in a user profile.
Type: Application
Filed: January 3, 2019
Publication date: July 9, 2020
Inventors: Srikanth K. Murali, Vijay Kumar Ananthapur Bache, Vijay Ekambaram, Padmanabha Venkatagiri Seshadri
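The feedback step described above (raise the priority weight of an adaptation when it lowered eye strain, and record it in the user profile) can be sketched as follows. The dict-based profile, the names, and the weight increment are assumptions for illustration only.

```python
def update_profile(profile, adaptation, intensity,
                   strain_before, strain_after, weight_step=0.1):
    """Record a UI adaptation in the user profile if it reduced eye strain."""
    if strain_after < strain_before:                  # eye strain decreased
        key = (adaptation, intensity)
        profile[key] = profile.get(key, 0.0) + weight_step  # raise priority weight
    return profile

profile = {}
update_profile(profile, "object_velocity", "low",
               strain_before=0.8, strain_after=0.5)
```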
-
Publication number: 20200218343
Abstract: A gaze point compensation method and apparatus in a display device, and a display device are provided. The method includes: obtaining a real-time eye image of a user; obtaining an eye-corner position coordinate in the real-time eye image; when the eye-corner position coordinate is within an initial calibration range, making first compensation for the eye-corner position coordinate according to a correspondence relationship between a pupil and an eye-corner position; otherwise, making second compensation for the eye-corner position coordinate, wherein the second compensation is configured to make the eye-corner position coordinate after the second compensation at least be within the initial calibration range; and compensating for a gaze point according to the first compensation or the second compensation.
Type: Application
Filed: October 24, 2019
Publication date: July 9, 2020
Inventors: Jiankang Sun, Lili Chen, Hao Zhang, Hongzhen Xue, Fuqiang Ma, Zehua Dong
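The branch the abstract describes is: apply a pupil-based correction when the eye-corner coordinate is inside the initial calibration range, otherwise force the coordinate back into that range. A one-dimensional sketch under stated assumptions (a fixed linear pupil offset and simple clamping stand in for the unspecified correspondence relationship):

```python
def compensate(eye_corner_x, calib_range=(0.0, 1.0), pupil_offset=0.02):
    """Return the compensated eye-corner coordinate."""
    lo, hi = calib_range
    if lo <= eye_corner_x <= hi:
        # first compensation: correct via the pupil/eye-corner relationship
        return eye_corner_x + pupil_offset
    # second compensation: bring the coordinate at least back into the range
    return min(max(eye_corner_x, lo), hi)
```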
-
Publication number: 20200218344
Abstract: A display system includes a display device and a rendering device having a plurality of individually-controllable illumination regions. The rendering device is to render a frame for display at the display device during a frame period, to determine a gaze position of a user relative to the display device for the frame period, and to set, for each illumination region, an illumination configuration to be applied by the display device for the illumination region during at least one of the frame period or a subsequent frame period based on a classification of the illumination region that is representative of a location of the gaze position relative to the illumination region.
Type: Application
Filed: October 31, 2019
Publication date: July 9, 2020
Inventors: Ed CALLWAY, David GLEN
-
Publication number: 20200218345
Abstract: In a calibration process for an eye-tracking application, a calibration mark is displayed on an instrument, and the user is instructed to keep the gaze focused on the calibration mark. Next, a dynamic image is displayed on the instrument, and the user is instructed to move his head or the instrument as indicated by the dynamic image while keeping the gaze focused on the calibration mark. The ocular information of the user is recorded during the head movement or the instrument movement for calibrating the eye-tracking application.
Type: Application
Filed: January 1, 2020
Publication date: July 9, 2020
Inventors: Kuan-Ling Liu, LIANG FANG, Po-Jung Chiu, Yi-Heng Wu, Ming-Yi Tai, Yi-Hsiang Chen, Chia-Ming Chang, Shao-Yi Chien
-
Publication number: 20200218346
Abstract: A system including a rendering engine to render a field texture for a field display and a foveal texture for a steerable foveal display, and a compositor including a field compositor to generate frames for the field display from the field texture and a foveal compositor to generate frames for the foveal display from the foveal texture. The system further includes a composition manager designed to sequence and select what is presented, including data in one or more of the field display and the foveal display.
Type: Application
Filed: January 6, 2020
Publication date: July 9, 2020
Inventors: Aaron Matthew Eash, Andrew John Gross, Baker Ngan, Edward Chia Ning Tang, Joseph Roger Battelle, Warren Cornelius Welch, III
-
Publication number: 20200218347
Abstract: A control system is provided. The control system includes an image capturing unit, an input interface and a processing unit. The image capturing unit is configured to capture a plurality of images of a user. The input interface is configured to receive an input signal from the user. The processing unit is configured to identify a facial feature from the captured images; calculate a gaze point of the user according to the facial feature; determine a target facility among multiple facilities according to the gaze point of the user; receive a confirmation signal; configure the target facility as a facility subject to control when the confirmation signal is received; and control the facility subject to control in response to a control signal received from the input interface.
Type: Application
Filed: January 6, 2020
Publication date: July 9, 2020
Inventors: Yu-Sian Jiang, Mu-Jen Huang
-
Publication number: 20200218348
Abstract: A system of eye tracking includes an infrared (IR) source to project IR light to an eye of a user, an IR holographic optical element (HOE) to change an angle of the IR light reflected from the eye of the user, and a sensor to receive the IR light. The system further includes a processor to use the IR light to determine a gaze vector of the user, in one embodiment.
Type: Application
Filed: January 8, 2020
Publication date: July 9, 2020
Inventors: Aaron Matthew Eash, Andrew John Gross, Christopher David Westra, Warren Cornelius Welch, III, Eric Richard David Frasch
-
Publication number: 20200218349
Abstract: An operating method in a virtual environment through a wearable device is disclosed, wherein the wearable device has a motion sensor, the virtual environment has an operated object and a virtual device corresponding to the wearable device, the corresponding virtual device has a first operational data constraint, and the operated object has a second operational data constraint. The operating method comprises the following steps of: using the motion sensor to generate a motion sensed data; causing the corresponding virtual device to generate a derived data according to the motion sensed data, wherein the derived data indicates an interaction relationship between the virtual device and the operated object; and when the virtual device is separated from the operated object under the interaction relationship, moving the operated object in accordance with the derived data.
Type: Application
Filed: January 2, 2020
Publication date: July 9, 2020
Applicant: J-MEX Inc.
Inventors: Chin-Ting Chu, Chia-Wei Lee, Chih-Hung Hsu, Te-Hsi Chen, Chi-Hung Chen, Meng-Yu Lee
-
Publication number: 20200218350
Abstract: A computer network implemented system for improving the operation of one or more biofeedback computer systems is provided.
Type: Application
Filed: March 13, 2020
Publication date: July 9, 2020
Inventors: Trevor CE COLEMAN, Christopher Allen AIMONE, Ariel Stephanie GARTEN, Locillo (Lou) Giuseppe PINO, Paul Harrison BARANOWSKI, Raul Rajiv RUPSINGH, Kapil Jay Mishra VIDYARTHI, Graeme MOFFAT, Samuel Thomas MACKENZIE
-
TACTILE PRESENTATION PANEL, TACTILE PRESENTATION TOUCH PANEL, AND TACTILE PRESENTATION TOUCH DISPLAY
Publication number: 20200218351
Abstract: Tactile electrodes include a plurality of first electrodes and a plurality of second electrodes that are alternately arranged with an interval therebetween on a transparent insulating substrate. A dielectric layer covers the tactile electrodes. A voltage supply circuit applies a voltage signal having a first frequency to those of the first electrodes that are located on at least a partial region of the transparent insulating substrate, and applies a voltage signal having a second frequency different from the first frequency to those of the second electrodes that are located on at least the partial region of the transparent insulating substrate.
Type: Application
Filed: April 16, 2018
Publication date: July 9, 2020
Applicant: Mitsubishi Electric Corporation
Inventors: Tae ORITA, Masafumi AGARI, Takeshi ONO
-
Publication number: 20200218352
Abstract: An audio-haptic signal generator for a haptic system including an amplifier coupled to a haptic actuator is described. The audio-haptic signal generator includes an audio input configured to receive an audio signal; a haptic input configured to receive a haptic signal; and a controller configured to receive at least one of the haptic signal, an amplifier state and a haptic actuator state. A mixer is coupled to the audio input and the haptic input. The mixer has an output configured to be coupled to a haptic actuator. The controller controls the mixer to process the audio signal dependent on at least one of a characteristic of the haptic signal, an amplifier state, and a haptic actuator state. The mixer is configured to mix the haptic signal and the processed audio signal and to output the mixed signal.
Type: Application
Filed: December 25, 2019
Publication date: July 9, 2020
Inventors: Christophe Marc Macours, Temujin Gautama
-
Publication number: 20200218353
Abstract: A foldable electronic device is provided, which includes a hinge structure, a first housing that is connected to the hinge structure, a second housing that is connected to the hinge structure, a first vibration element disposed in the first housing, a first motion sensor disposed adjacent to the first vibration element, a second motion sensor disposed in the second housing, a processor, and a memory. The memory stores instructions that, when executed, cause the processor to control vibration intensity of the first vibration element, based on a first value measured from the first motion sensor and a second value measured from the second motion sensor.
Type: Application
Filed: January 2, 2020
Publication date: July 9, 2020
Inventors: Kwonho SONG, Jeongseok LEE, Yonggu LEE, Jaehwan PARK, Changkwan YANG, Kihun EOM
-
Publication number: 20200218354
Abstract: Described is a method for instilling the haptic dimension of texture in virtual and holographic objects using mid-air ultrasonic technology. A set of features is extracted from imported images using their associated displacement maps. Textural qualities such as the micro and macro roughness are then computed and fed to a haptic mapping function together with information about the dynamic motion of the user's hands during holographic touch. Mid-air haptic textures are then synthesized and projected onto the user's bare hands. Further, mid-air haptic technology enables tactile exploration of virtual objects in digital environments. When the rendered tactile texture differs from a user's prior and current expectations, user immersion can break. A study aims at mitigating this by integrating user expectations into the rendering algorithm of mid-air haptic textures and establishes a relationship between visual and mid-air haptic roughness.
Type: Application
Filed: January 6, 2020
Publication date: July 9, 2020
Inventors: David Beattie, Rory Clark, Adam Harwood, Orestis Georgiou, Benjamin John Oliver Long, Thomas Andrew Carter
-
Publication number: 20200218355
Abstract: In various example embodiments, a system and method for simulating touch in a virtual environment is disclosed. In one example embodiment, a system includes one or more circuits configured to receive an indicator of a sensed touch in a virtual environment and to determine, based on the indicator, an area of the sensed touch. The one or more circuits are further configured to generate a simulated touch by applying a field to one or more touch simulators, the field actuating the one or more touch simulators by linearly displacing an element of the one or more touch simulators.
Type: Application
Filed: January 16, 2020
Publication date: July 9, 2020
Inventor: Clayton Gustin
-
Publication number: 20200218356
Abstract: One illustrative system disclosed herein includes a computing device that comprises a memory and a processor in communication with the memory. The system also includes an xPC target machine that is capable of achieving sampling rates of at least 100 kHz and in communication with the computing device and a user device that includes a sensor and a haptic output device. The processor generates a simulated reality environment and determines a haptic effect based on the simulated reality environment or a sensor signal from the sensor. The processor transmits data about a parameter of the haptic effect or the sensor signal to the xPC target machine, which determines the parameter of the haptic effect and generates, in substantially real time, a haptic signal. The xPC target machine transmits the haptic signal to the haptic output device, which is configured to receive the haptic signal and output the haptic effect.
Type: Application
Filed: January 22, 2020
Publication date: July 9, 2020
Applicant: Immersion Corporation
Inventors: Liwen Wu, Danny A. Grant, Juan Manuel Cruz-Hernandez
-
Publication number: 20200218357
Abstract: A haptic interface unit may include an application in the operating system of a device in communication with a driver layer. A plurality of sensors and actuators may be in communication with the driver layer. The driver layer analyzes information from the sensors to generate an output signal based on an interaction model stored in the driver layer. The application updates the interaction model in the driver layer.
Type: Application
Filed: March 19, 2020
Publication date: July 9, 2020
Inventor: David J. Meyer
-
Publication number: 20200218358
Abstract: Systems, devices, and methods for providing limited duration haptic effects are disclosed. Systems for providing limited duration haptic effects include sensors, control circuits, and vibration actuators arranged for closed-loop feedback control of the vibration actuators. The sensors are configured to measure motion characteristics induced by the vibration actuators. The control circuits are configured to receive motion characteristic information from the sensors and provide closed-loop feedback control of the vibration actuators. Closed-loop feedback control permits precise control of vibration actuator output during limited duration haptic effects.
Type: Application
Filed: March 20, 2020
Publication date: July 9, 2020
Inventors: Juan Manuel CRUZ HERNANDEZ, Danny A. GRANT, Christopher J. ULLRICH
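The closed loop described above (sensor measures the motion the actuator induces, control circuit corrects the drive) can be sketched with a toy simulation. The first-order actuator model, the proportional gain, and all names are assumptions for illustration; they are not the patent's control law.

```python
def run_effect(target, steps=50, gain=0.5):
    """Drive a simulated vibration actuator toward a target amplitude
    using proportional feedback from the motion sensor."""
    drive, measured = 0.0, 0.0
    for _ in range(steps):
        measured += 0.3 * (drive - measured)  # crude actuator/sensor dynamics
        error = target - measured             # feedback from the motion sensor
        drive += gain * error                 # closed-loop correction
    return measured

amp = run_effect(target=1.0)
```

The point of the closed loop is that the measured amplitude converges to the target regardless of the (here crude) actuator dynamics, which is what permits precise output during short effects.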
-
Publication number: 20200218359
Abstract: Embodiments described herein includes a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices. The simultaneous control includes automatically detecting a gesture of at least one object from gesture data received via the sensors. The detecting comprises identifying the gesture using only the gesture data. The computer applications translate the gesture to a gesture signal, and control the display devices in response to the gesture signal.
Type: Application
Filed: March 10, 2020
Publication date: July 9, 2020
Inventors: David Minnen, Paul Yarin
-
Publication number: 20200218360
Abstract: There is provided an information processing apparatus including: a process execution unit configured to execute a process relating to a user's gesture recognized on the basis of information from a sensor. The process execution unit determines, during a period after the gesture has been recognized, whether or not there is an input of operation information based on a user's operation, and refrains from executing the process relating to the recognized gesture when there is an input of the operation information during the period.
Type: Application
Filed: March 13, 2020
Publication date: July 9, 2020
Applicant: Sony Corporation
Inventors: Ryu Aoyama, Tomoo Mizukami, Yuji Hirose, Kei Takahashi, Ikuo Yamano
-
Publication number: 20200218361
Abstract: This document describes techniques and devices for non-line-of-sight radar-based gesture recognition. Through use of the techniques and devices described herein, users may control their devices through in-the-air gestures, even when those gestures are not within line-of-sight of their device's sensors. Thus, the techniques enable users to control their devices in many situations in which control is desired but conventional techniques do not permit effective control, such as to turn the temperature down in a room when the user is obscured from a thermostat's gesture sensor, turn up the volume on a media player when the user is in a different room than the media player, or pause a television program when the user's gesture is obscured by a chair, couch, or other obstruction.
Type: Application
Filed: March 18, 2020
Publication date: July 9, 2020
Applicant: Google LLC
Inventor: Ivan Poupyrev
-
Publication number: 20200218362
Abstract: Force sensitive input device and methods are disclosed. A force sensitive input device may include a button, an analog sensor, and circuitry. The button may be movable along a first axis between first and second end positions and biased toward the first end position. The analog sensor may output an analog signal that is a function of a displacement of the button along the first axis from the first end position. The circuitry may generate both analog and digital input data in response to the analog signal. The analog input data may include a range of values that are monotonically related to the displacement of the button, and the digital input data may include first and second binary values.
Type: Application
Filed: January 2, 2020
Publication date: July 9, 2020
Inventors: Lance William Madsen, Nikhil Bajaj
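The dual output described above (an analog value monotonically related to displacement, plus a binary value) can be sketched as follows. The 10-bit analog scale and the 50%-of-travel digital threshold are illustrative assumptions; the publication does not specify them.

```python
def read_button(displacement, travel=1.0, threshold=0.5):
    """Return (analog, digital) input data for a button displacement.

    analog:  0..1023, monotonically related to displacement
    digital: 1 once displacement reaches the threshold, else 0
    """
    d = min(max(displacement, 0.0), travel)  # clamp to the button's travel
    analog = int(1023 * d / travel)          # monotonic analog input data
    digital = 1 if d >= threshold else 0     # first/second binary value
    return analog, digital
```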
-
Publication number: 20200218363
Abstract: An input method for a mobile communication apparatus, and such a mobile communication apparatus comprising a processor and a touch-sensitive display, are disclosed. The invention particularly comprises displaying a touch keypad comprising a set of keys, detecting an object over one key of said set of keys, and displaying, upon detection of said object, a first sub-set of keys adjacent to said one key, wherein said sub-set of keys is associated with a first set of sub-functions of said one key.
Type: Application
Filed: November 29, 2019
Publication date: July 9, 2020
Inventor: Romel AMINEH
-
Publication number: 20200218364
Abstract: An electronic device includes a housing, a microphone exposed through a part of the housing, and at least one wireless communication circuitry detachably disposed inside the housing and configured to wirelessly connect with a stylus pen which includes a button. The electronic device also includes a processor and a memory for storing instructions. The instructions, when executed, cause the processor to receive a first radio signal transmitted based on a user input to the button from the stylus pen, activate a voice recognition function of the microphone in response to receiving the first radio signal, receive an audio signal from a user through the microphone, recognize the received audio signal using the activated voice recognition function, and execute a function indicated by the audio signal, based at least in part on the recognition result.
Type: Application
Filed: January 9, 2020
Publication date: July 9, 2020
Inventors: Jeonghoon KIM, Keunsoo KIM, Sangheon KIM, Jongwu BAEK
-
Publication number: 20200218365
Abstract: A method of motion capture includes: by multiple positioning devices located on a user, receiving scanning signals emitted by signal emitting devices to obtain detected coordinates, determining angular information, and generating and transmitting to a processor position signals that contain the angular information and the detected coordinates of the positioning devices; by the processor based on the position signals and data of a skeleton related to the user, determining estimated coordinates of a position of a body portion of the user; and generating an image of a virtual object based on the position signals, the estimated coordinates, the data of the skeleton related to the user and data of a skeleton related to a virtual object, and controlling a display to display the image.
Type: Application
Filed: December 31, 2019
Publication date: July 9, 2020
Inventors: Dobromir Todorov, Yi-Chi Huang, Ting-Chieh Lin, Chien-Hung Shih
-
Publication number: 20200218366
Abstract: A system and method for determining a position and orientation (i.e., pose) of a rigid body. The rigid body may be a position-enabled projector, a surveying rod, a power tool, a drill robot, etc., in a given space. The position of the rigid body is specified by a set of three coordinates and the orientation is specified by a set of three angles. As such, based on these six values, the position and orientation of the rigid body can be determined.
Type: Application
Filed: June 29, 2018
Publication date: July 9, 2020
Applicant: Hilti Aktiengesellschaft
Inventors: Kent KAHLE, Sascha KORL, Andreas WINTER, Scott GRAYBILL
-
Publication number: 20200218367
Abstract: An electronic pen and a tilt handwriting adjustment method thereof, a tilt handwriting adjustment system and an adjusting method thereof are provided. The electronic pen includes a pen body, a detector and a processor provided within the pen body. The detector is configured to detect a tilt angle of the pen body with respect to a writing plane, and output a corresponding detection parameter value; and the processor is configured to acquire the detection parameter value, and generate and output a driving signal according to the detection parameter value, and a characteristic parameter value of the driving signal corresponds to the tilt angle.
Type: Application
Filed: May 29, 2019
Publication date: July 9, 2020
Applicants: Hefei BOE Optoelectronics Technology Co., Ltd., BOE Technology Group Co., Ltd.
Inventors: Guanglei Yang, Zhixiang Fang, Jian He, Zhen Tang, Peng Ding, Meng Wang, Chunhua Li, Xuxu Hu
-
Publication number: 20200218368
Abstract: A mouse pad structure includes a bottom pad, a second cloth and a first cloth. The bottom pad is a flat rubber pad and has an upper surface and a lower surface located oppositely. The second cloth is attached to the upper surface. The first cloth is made of waterproof cloth and combined on the second cloth. Thus, the mouse pad is anti-fouling and waterproof.
Type: Application
Filed: February 12, 2019
Publication date: July 9, 2020
Inventor: Hung-Jen CHOU
-
Publication number: 20200218369
Abstract: A wireless charging mousepad and processes for making it are provided. The wireless charging mousepad includes an accommodation seat with at least one groove, and a first cloth is disposed on the accommodation seat. Then a coil layer is disposed on the first cloth and placed in the groove with the first cloth. In addition, a glue layer is disposed on the first cloth and covers the groove. At last, a second cloth is disposed on the glue layer. Thereby, the coil layer is embedded in the accommodation seat, and the wireless charging mousepad is completed.
Type: Application
Filed: February 20, 2019
Publication date: July 9, 2020
Inventor: Hung-Jen CHOU
-
Publication number: 20200218370
Abstract: A wireless charging mouse pad, including: a fiber layer; a first colloid adhered to a surface of the fiber layer; a second colloid provided on another surface of the fiber layer; and a first soft layer provided on the second colloid. The invention further relates to a manufacturing process for a wireless charging mouse pad. The wireless charging mouse pad is flexible and may be rolled up to reduce its volume, so that it occupies less space and is convenient to carry.
Type: Application
Filed: May 2, 2019
Publication date: July 9, 2020
Inventor: Ho Lung LU
-
Publication number: 20200218371
Abstract: An input touch pen includes a digitizer refill contained inside a shaft tube and including a contact tip and a stepped tip portion rearward thereof, and a knock member protruding from an opening at a rear end of the shaft tube. The contact tip can protrude and retract through an opening at a tip end of the shaft tube. The stepped tip portion has a larger diameter than the opening at the tip end. A relationship A>X>Y is satisfied, wherein X represents a knock stroke of the digitizer refill, A a distance from the stepped tip portion to an inner surface at the tip end side when the contact tip is sunken inside the opening, and Y a distance that the contact tip is moved by operation of the knock member from a state of protruding from the opening to a state of being sunken inside the opening.
Type: Application
Filed: August 30, 2018
Publication date: July 9, 2020
Applicant: MITSUBISHI PENCIL COMPANY, LIMITED
Inventors: Shinichi Ushiku, Tsuyoshi Nishida, Kyo Nakayama, Satoru Okabe
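The constraint the abstract states is the chained inequality A > X > Y among the three dimensions. A trivial helper makes the check explicit; the sample dimensions (in mm) are invented for illustration and are not from the publication.

```python
def valid_design(A, X, Y):
    """Check the pen-geometry constraint from the abstract.

    A: stepped-tip-portion-to-inner-surface distance (tip sunken)
    X: knock stroke of the digitizer refill
    Y: distance the contact tip moves from protruding to sunken
    """
    return A > X > Y

valid_design(A=3.2, X=2.5, Y=1.8)  # a design meeting the constraint
```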
-
Publication number: 20200218372
Abstract: In one embodiment, a stylus with one or more electrodes and one or more computer-readable non-transitory storage media embodying logic for transmitting signals wirelessly to a device through a touch sensor of the device has one or more sensors for detecting movement of the stylus.
Type: Application
Filed: March 20, 2020
Publication date: July 9, 2020
Inventors: Esat Yilmaz, Trond Jarle Pedersen, John Logan, Vemund Kval Bakken, Kishore Sundara-Rajan, Joo Yong Um, Igor Polishchuk
-
Publication number: 20200218373
Abstract: A method for improving flexibility of a circuit board, e.g., comprising a touch-based sensor, and reducing manufacturing costs by eliminating routing around a border of the touch-based sensor is presented herein. The method comprises forming an array of touch sensors on a first side of the circuit board, in which portions of the circuit board located between three edges of the circuit board and a border of the array of touch sensors exclude traces; and forming first traces between respective second traces in a singular direction on a second side of the circuit board, in which the first traces are electrically coupled, using a first group of vias, to respective rows of the array of touch sensors, and the second traces are electrically coupled, using a second group of vias, to respective columns of the array of touch sensors.
Type: Application
Filed: January 2, 2020
Publication date: July 9, 2020
Inventor: Ilya Daniel ROSENBERG
-
Publication number: 20200218374
Abstract: In various embodiments, electronic devices such as thin-film transistors and/or touch-panel displays incorporate bilayer capping layers and/or barrier layers.
Type: Application
Filed: January 3, 2020
Publication date: July 9, 2020
Inventors: Helia JALILI, Francois DARY, Barbara COX
-
Publication number: 20200218375
Abstract: Disclosed herein is a touch panel including: a sensor substrate and a cover substrate stuck to each other. The sensor substrate includes a sensor electrode, and plural signal wirings electrically connected to the sensor electrode and extending along a periphery of the sensor electrode. The cover substrate includes one or plural conductive layers extending along the periphery of the sensor electrode and the plural signal wirings within an area not facing the sensor electrode and the plural signal wirings.
Type: Application
Filed: February 5, 2020
Publication date: July 9, 2020
Inventors: Takeshi Kurashima, Shoji Hinata
-
Publication number: 20200218376
Abstract: An electronic device with a touch-sensitive surface displays a user interface of a first software application that is updated at a first display rate. While displaying a first frame of the user interface in accordance with the first display rate, the device detects respective movement of a touch input across the touch-sensitive surface. An application-independent touch processing module of the device selects a respective touch location of the touch input that was detected during the respective movement to identify as a representative touch location for the respective movement based on touch-processing criteria for the first software application, and sends to an application-specific portion of the first software application touch location information for the touch input that identifies the respective touch location as the representative touch location for the respective movement. The first software application updates the user interface in accordance with the touch location information.
Type: Application
Filed: March 18, 2020
Publication date: July 9, 2020
Inventors: Bruce D. Nilo, David Michael Chan, Jacob A. Xiao, Jason Clay Beaver
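The selection step above (an application-independent module picks one representative touch location from the locations detected during a frame's movement, per application-supplied criteria) can be sketched as follows. The two criteria names are illustrative assumptions; the publication does not enumerate its criteria.

```python
def representative_location(samples, criteria="latest"):
    """Pick one representative (x, y) touch location for a frame.

    samples:  list of (x, y) touch locations detected during the movement
    criteria: per-application touch-processing criteria (assumed names)
    """
    if not samples:
        return None
    if criteria == "latest":
        return samples[-1]   # most recent location detected in the frame
    if criteria == "first":
        return samples[0]    # earliest location detected in the frame
    raise ValueError(f"unknown criteria: {criteria}")
```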
-
Publication number: 20200218377. Abstract: Logic of a handheld controller can implement sensor fusion algorithms based on force data provided by a force sensing resistor (FSR) in combination with touch data or proximity data provided by a touch sensor or an array of proximity sensors, respectively. An example sensor fusion algorithm can be used to re-calibrate the FSR when an object contacts an associated control, as detected by the touch sensor. Another example sensor fusion algorithm can be used to ignore spurious inputs detected by the FSR when an object is in contact with an adjacent control. Another example sensor fusion algorithm can be used to detect a hand size of a hand grasping a handle of the controller, as detected by the array of proximity sensors, and to adjust the threshold force required to register an FSR input event according to the hand size. Type: Application. Filed: March 23, 2020. Publication date: July 9, 2020. Inventors: Scott Dalton, Jeffrey Peter Bellinghausen, Scott Douglas Nietfeld, Jeffrey George Leinbaugh, Ian Campbell, Cheang Tad Yoo, Lawrence Yang, Jeffrey Walter Mucha
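Two of the fusion ideas in this abstract — re-zeroing the FSR when the touch sensor first reports contact, and scaling the input-event force threshold by detected hand size — can be sketched as below. The class name, threshold constant, and scaling factor are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of FSR sensor fusion: re-calibration on the touch
# sensor's contact edge, and a hand-size-dependent force threshold.

BASE_THRESHOLD = 100.0  # raw FSR force units needed to register an input event


class FsrFusion:
    def __init__(self):
        self.offset = 0.0          # calibration offset subtracted from raw readings
        self.threshold = BASE_THRESHOLD
        self.was_touching = False

    def set_hand_size(self, proximity_count):
        """More proximity sensors covered -> larger hand grasping the handle.
        Larger hands grip harder at rest, so raise the threshold."""
        self.threshold = BASE_THRESHOLD * (1.0 + 0.05 * proximity_count)

    def update(self, raw_force, is_touching):
        # On the touch sensor's contact edge, treat the current raw FSR
        # reading as the new zero point (re-calibration).
        if is_touching and not self.was_touching:
            self.offset = raw_force
        self.was_touching = is_touching
        force = max(0.0, raw_force - self.offset)
        return force >= self.threshold  # True -> register an FSR input event
```

Re-calibrating at the moment of contact means that a resting grip detected by the touch sensor does not count toward the press force, which is the spurious-input problem the abstract is addressing.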
-
Publication number: 20200218378. Abstract: An operation detection device includes an operation unit including an operation surface to be operated on, a base portion to which the operation unit is attached, a plurality of load sensors arranged between the operation unit and the base portion to detect a load applied to the operation surface, and a plurality of elastic bodies attached to the base portion and to the operation unit to press the plurality of load sensors into contact with the operation unit by their elastic force. Type: Application. Filed: July 20, 2018. Publication date: July 9, 2020. Inventor: Toshihito TAKAI
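The abstract describes several load sensors between the operation unit and the base. A common way to use such readings — not stated in the abstract, and shown here only as an illustrative assumption — is to locate a press as the load-weighted centroid of the sensor positions:

```python
# Hypothetical sketch: estimating where the operation surface was pressed
# from per-sensor load readings. Sensor placement and values are made up.

def press_location(sensor_positions, loads):
    """Estimate (x, y) of an applied force from per-sensor load readings."""
    total = sum(loads)
    if total <= 0:
        return None  # nothing pressing on the operation surface
    x = sum(px * w for (px, _), w in zip(sensor_positions, loads)) / total
    y = sum(py * w for (_, py), w in zip(sensor_positions, loads)) / total
    return (x, y)


# Four sensors at the corners of a 100 x 60 mm operation surface:
corners = [(0, 0), (100, 0), (0, 60), (100, 60)]
print(press_location(corners, [1.0, 1.0, 1.0, 1.0]))  # (50.0, 30.0) — centered press
```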
-
Publication number: 20200218379. Abstract: A touch sensor includes a base layer, first touch sensor columns, second touch sensor columns, and sensing lines. The base layer includes a sensing region and a non-sensing region. The first touch sensor columns extend in a first direction. The first touch sensor columns include first touch electrodes. The first touch electrodes include sub-touch electrodes in the sensing region. The second touch sensor columns include second touch electrodes in the sensing region. The second touch sensor columns are alternately arranged with the first touch sensor columns. The sensing lines are in the non-sensing region. The sensing lines include first sensing lines electrically connected to the sub-touch electrodes, and second sensing lines electrically connected to the second touch electrodes. The sub-touch electrodes and the second touch electrodes have different widths. Type: Application. Filed: March 16, 2020. Publication date: July 9, 2020. Inventors: Soo Jung LEE, Hyoung Wook JANG, Gwang Bum KO, Jeong Yun HAN
-
Publication number: 20200218380. Abstract: A display device includes first electrodes, second electrodes, lines, and a controller, and includes a substrate, pixel electrodes, a display functional layer, and a common electrode stacked in this order. An insulating layer is between the common electrode and the first and second electrodes. The controller controls the pixel electrodes, the common electrode, the lines, and the first and second electrodes. During a display period, the pixel electrodes are supplied with a pixel signal through the lines, and the common electrode is supplied with a common signal. During a first sensing period, the lines are supplied with a first drive signal to generate a magnetic field. During a second sensing period, the lines are supplied with the first drive signal to generate the magnetic field, and an electromotive force corresponding to a distance between the lines and the first electrodes is generated in the first electrodes by the magnetic field. Type: Application. Filed: March 18, 2020. Publication date: July 9, 2020. Applicant: Japan Display Inc. Inventors: Hayato KURASAWA, Hiroshi MIZUHASHI, Tadayoshi KATSUTA
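The key point of this abstract is time multiplexing: the same lines carry a pixel signal during the display period and a field-generating drive signal during the two sensing periods. A minimal sketch of that scheduling follows; the enum names and signal labels are hypothetical.

```python
# Hypothetical sketch of period-multiplexed line driving: one set of lines,
# three roles depending on the current period.
from enum import Enum


class Period(Enum):
    DISPLAY = 0
    FIRST_SENSING = 1    # drive signal generates a magnetic field
    SECOND_SENSING = 2   # same drive signal; electromotive force is read back


def line_signal(period, pixel_value):
    """Return what the shared lines carry during each period."""
    if period is Period.DISPLAY:
        return ("pixel_signal", pixel_value)
    # Both sensing periods supply the first drive signal, which generates
    # the magnetic field used for electromagnetic sensing.
    return ("first_drive_signal", None)
```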
-
Publication number: 20200218381. Abstract: A touch panel is provided and includes a substrate having first and second face sides, the first face side being an inputting face side and the second face side being opposite to the first face side; a conductive member that is arranged on the substrate and that configures input position detecting electrodes in an inputting region; a mounting terminal in a peripheral region that is outside the inputting region in a plan view; peripheral wiring lines disposed in the peripheral region, each having a first end electrically connected to the mounting terminal and a second end connected to one of the input position detecting electrodes; a first light blocking layer arranged on the substrate so as to cover at least a part of the peripheral wiring lines when viewed from the first face side of the substrate; and a second light blocking layer arranged between the substrate and the peripheral wiring lines, wherein the first light blocking layer overlaps at least a part of the second light blocking layer so as to partly … Type: Application. Filed: March 19, 2020. Publication date: July 9, 2020. Inventor: Yoshikatsu IMAZEKI
-
Publication number: 20200218382. Abstract: A method of manufacturing an electronic device includes forming a first conductive pattern layer on a base layer; forming, on the base layer, an organic layer in which a plurality of contact holes exposing a portion of the first conductive pattern layer are defined; forming a resin pattern layer covering the contact holes on the organic layer; forming an insulating layer covering at least a portion of the resin pattern layer on the organic layer; removing the resin pattern layer, and with it at least the portion of the insulating layer covering the resin pattern layer, such that an index matching layer is formed; and forming a second conductive pattern layer on the index matching layer. An electronic device constructed according to the method of manufacturing is also disclosed. Type: Application. Filed: March 22, 2020. Publication date: July 9, 2020. Inventors: Seungrok LEE, Sunhaeng CHO
-
Publication number: 20200218383. Abstract: A display device with a touch detection device is provided and includes display elements arranged in a matrix in row and column directions and surrounded by scan lines each extending in a first direction and signal lines each extending in a second direction, the display elements including a red display element, a green display element, and a blue display element; light shielding members each extending in one direction and arranged between display elements adjacent to each other; and touch detection electrodes each extending in the one direction and arranged between the light shielding members and the display elements, the touch detection electrodes including metal wires that extend overlapping the light shielding members, wherein the metal wires are divided into metal wire parts that are arranged in the column direction, and the metal wire parts, each being longitudinal in the one direction, are disposed without crossing each other and spaced from each other in the one direction. Type: Application. Filed: March 23, 2020. Publication date: July 9, 2020. Inventors: Koji ISHIZAKI, Hayato KURASAWA, Kohei AZUMI
-
Publication number: 20200218384. Abstract: A touch substrate includes a common electrode, a piezoelectric material layer, and a touch electrode sequentially disposed on the substrate. The touch electrode includes a plurality of touch driving electrodes and a plurality of touch sensing electrodes, which cross each other and are insulated from one another. The touch driving electrode includes a plurality of first touch sub-electrodes that are electrically connected. An overlapping area of front projections of the common electrode and each of the first touch sub-electrodes on the substrate is smaller than an area of a pattern enclosed by a peripheral boundary of the first touch sub-electrodes. Type: Application. Filed: July 31, 2019. Publication date: July 9, 2020. Inventors: Yuzhen Guo, Yingming Liu, Haisheng Wang, Xiaoliang Ding, Xueyou Cao
-
Publication number: 20200218385. Abstract: A touch display panel is provided and includes a bending section and a non-bending section connected to two sides of the bending section; the bending section includes a bending centerline. The touch display panel includes a plurality of first metal lines and a plurality of second metal lines in the bending section and the non-bending section; each of the first metal lines is parallel to the bending centerline; each of the second metal lines includes a plurality of second metal line segments spaced apart from each other, and each of the second metal line segments is perpendicularly connected to one of the first metal lines; the first metal lines and the second metal lines form a grid-shaped first metal layer. The touch display panel further comprises an organic layer and a plurality of third metal lines in the bending section; the organic layer is disposed on the first metal layer. Type: Application. Filed: January 9, 2019. Publication date: July 9, 2020. Inventor: Xiaoliang FENG
-
Publication number: 20200218386. Abstract: A touch display device is provided. The touch display device comprises: a first substrate including a plurality of row wires, a plurality of column wires, and a plurality of pixel drive elements, wherein the row wires and the column wires are interleaved to form a pixel matrix and the pixel drive elements are disposed on pixels of the pixel matrix; a second substrate disposed opposite the first substrate; a display medium interposed between inner sides of the first substrate and the second substrate; a plurality of touch electrodes disposed on the inner side of the first substrate or the second substrate; and an electroconductive protection, which is disposed outside the pixel matrix on the first substrate, electrically isolated from the row wires and the column wires, and electrically connected to the touch electrodes. Type: Application. Filed: January 10, 2018. Publication date: July 9, 2020. Inventor: Beizhou HUANG
-
Publication number: 20200218387. Abstract: An in-cell touch display device is provided. The in-cell touch display device comprises: a first substrate including plural row wires, plural column wires, and plural pixel drive elements, wherein the row wires and the column wires are interleaved to form a pixel matrix and the pixel drive elements are disposed on pixels of the pixel matrix; a second substrate disposed opposite the first substrate; a display medium interposed between inner sides of the first substrate and the second substrate; plural touch electrodes disposed on the inner side of the first substrate or the second substrate; and a remaining short-circuit wiring, obtained after cutting, that is used as an antenna, is disposed outside the pixel matrix on the first substrate, is electrically isolated from the row wires and the column wires, and is at a distance from the row wires or the column wires. Type: Application. Filed: January 10, 2018. Publication date: July 9, 2020. Inventor: Beizhou HUANG
-
Publication number: 20200218388. Abstract: A touch display panel, a method for driving the same, and a display device. Type: Application. Filed: October 16, 2019. Publication date: July 9, 2020. Inventors: Junhui WU, Xin Bi, Jiandong Guo, Xun Pu, Zhongshan Wu