Patent Applications Published on March 7, 2024
-
Publication number: 20240077939
Abstract: An apparatus and a method are described herein related to the art of augmented reality type monitor virtualization. A monitor-virtualization system, such as a head-mountable device, an ophthalmic device or an intraocular implant, can render a virtual monitor in augmented reality. A liquid lens or an optical phased array can position the virtual monitor in space by optical means. A dimmable occlusion matrix can be additionally operated such as to make the image of the virtual monitor substantially opaque. A coordinator module can synchronize the activities of monitor positioning and occlusion masking. The virtual monitor can be anchored to real-world artifacts using bokode technology. Various dimming modes of the occlusion matrix reduce operator fatigue. The apparatus may operate in smart sunglass mode when the virtual monitor function is paused. The virtual monitor can be hidden or visualized differently when thresholds in terms of user geolocation or viewing angle are breached.
Type: Application
Filed: March 11, 2019
Publication date: March 7, 2024
Inventor: Maximilian Ralph Peter von und zu Liechtenstein
-
Publication number: 20240077940
Abstract: A head-mountable device including a display, a housing at least partially surrounding the display, a facial interface attached to the housing, and a cover positioned between the housing and the facial interface, the cover comprising a conductive fabric.
Type: Application
Filed: August 15, 2023
Publication date: March 7, 2024
Inventors: Javier C. Mendez, Nicholas C. Soldner, Darshan R. Kasar, Grant H. Mulliken
-
Publication number: 20240077941
Abstract: There is provided an information processing system to cause a performer who appears in content distributed in real time to give a performance according to a viewer's reaction in a remote location. The information processing system includes a control unit. The control unit acquires, from a terminal of a viewer on which content obtained by shooting a performance of a performer is being reproduced via a network in real time, a gaze parameter indicating a gaze of the viewer in a coordinate system of a space in which the viewer is present, the gaze parameter being acquired together with viewer identification information for identifying the viewer. Moreover, the control unit converts the acquired gaze parameter to a gaze parameter indicating a virtual gaze of the viewer in a coordinate system of a space in which the performer is present.
Type: Application
Filed: October 30, 2020
Publication date: March 7, 2024
Applicant: SONY GROUP CORPORATION
Inventor: Shunichi HOMMA
-
Publication number: 20240077942
Abstract: Systems and methods are described for extended reality environment interaction. An extended reality environment including an object is generated for display, and an eye motion is detected. Based on the detecting, it is determined whether the object is in a field of view for at least a predetermined period of time, and in response to determining that the object is in the field of view for at least the predetermined period of time, one or more items related to the object are generated for display in the extended reality environment.
Type: Application
Filed: September 8, 2023
Publication date: March 7, 2024
Inventor: Sakura Saito
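A minimal sketch (not taken from the filing) of the dwell-time logic this abstract describes; the two-second threshold and the DwellTracker/update names are assumptions for illustration:

```python
import time

DWELL_THRESHOLD_S = 2.0  # assumed "predetermined period of time"

class DwellTracker:
    """Tracks how long an object stays in the field of view and triggers
    display of related items once the dwell threshold is met."""

    def __init__(self, threshold_s=DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.entered_at = None        # when the object entered the field of view
        self.items_shown = False

    def update(self, object_in_fov: bool, now: float = None) -> bool:
        """Returns True exactly once, at the moment related items should be shown."""
        now = time.monotonic() if now is None else now
        if not object_in_fov:
            self.entered_at = None    # reset dwell when the object leaves the field of view
            self.items_shown = False
            return False
        if self.entered_at is None:
            self.entered_at = now
        if not self.items_shown and now - self.entered_at >= self.threshold_s:
            self.items_shown = True
            return True
        return False

# usage: call update() once per eye-tracking frame
tracker = DwellTracker()
for t, in_fov in [(0.0, True), (1.0, True), (2.5, True), (3.0, False)]:
    if tracker.update(in_fov, now=t):
        print(f"t={t}: generate related items for display")
```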
-
Publication number: 20240077943
Abstract: A display system can include a head-mounted display configured to project light to an eye of a user to display virtual image content at different amounts of divergence and collimation. The display system can include an inward-facing imaging system possibly comprising a plurality of cameras that image the user's eye and glints thereon, and processing electronics that are in communication with the inward-facing imaging system and that are configured to obtain an estimate of a center of rotation of the user's eye using cornea data derived from the glint images. The display system may render virtual image content with a render camera positioned at the determined position of the center of rotation of said eye.
Type: Application
Filed: November 7, 2023
Publication date: March 7, 2024
Inventors: David Cohen, Elad Joseph, Ron Nisim Ferens, Eyal Preter, Eitan Shmuel Bar-On, Giora Yahav
-
Publication number: 20240077944
Abstract: An information processing device estimates an attitude of a hand of a user holding a controller with the hand. The information processing device has an acquisition unit, a determination unit, and an estimation unit. The acquisition unit acquires inertial information from an inertial sensor provided in the controller. The determination unit determines whether a specific portion of the hand of the user is detected in a captured image acquired by imaging of an imaging unit. The estimation unit estimates the attitude of the hand of the user on a basis of the captured image and the inertial information in a case where the specific portion is detected in the captured image.
Type: Application
Filed: April 27, 2023
Publication date: March 7, 2024
Inventor: Masanao YOKOYAMA
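A hedged sketch of the conditional image/inertial fusion the abstract describes, written as a simple complementary-filter blend; the alpha weight, single-angle representation, and function name are illustrative assumptions, not the application's actual estimator:

```python
ALPHA = 0.98   # assumed weighting between inertial prediction and camera measurement

def estimate_hand_attitude(prev_angle_deg, gyro_rate_dps, dt_s,
                           image_angle_deg, hand_detected, alpha=ALPHA):
    """Blend an inertial prediction with a camera-derived angle, but only use the
    camera term when the specific portion of the hand was detected in the image."""
    predicted = prev_angle_deg + gyro_rate_dps * dt_s   # integrate the gyro rate
    if hand_detected and image_angle_deg is not None:
        return alpha * predicted + (1.0 - alpha) * image_angle_deg
    return predicted   # no detection: fall back to inertial information alone

# usage: one update step at 100 Hz
print(estimate_hand_attitude(10.0, 5.0, 0.01, 12.0, hand_detected=True))
```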
-
Publication number: 20240077945
Abstract: A controller is communicatively coupled to one or more user input devices, such as sensors or electrodes, a user interface device, and one or more switch-controlled devices. The controller presents a configuration user interface on the user interface device including selectable configurations for access modes. The controller assigns one or more of the user input devices according to a current selected configuration of an access mode. The controller detects a volitional user input corresponding to a change in a particular signal detected by a particular user input device and switches the switch-controlled device based on the detected user input. In one or more embodiments, the system includes one or more biosignal electrodes attachable to a user and/or mechanical, positional, or other switch technologies. In one or more embodiments, the system includes one or more sensors for detection of movement, gestures, eye tracking or other input.
Type: Application
Filed: November 7, 2023
Publication date: March 7, 2024
Applicant: Control Bionics Limited
Inventors: James E. Schorey, Peter S. Ford
-
Publication number: 20240077946
Abstract: A wearable device includes a wearable structure, an array of individually controlled electrohydraulic-controlled haptic tactors coupled to a portion of the wearable structure, a power source for providing a voltage, and circuitry configured to provide instructions for generating the haptic response. Each electrohydraulic-controlled haptic tactor is in fluid communication with an actuator pouch filled with a dielectric substance.
Type: Application
Filed: September 6, 2023
Publication date: March 7, 2024
Inventors: Priyanshu Agarwal, FNU Purnendu, Nicholas Colonnese
-
Publication number: 20240077947
Abstract: An embodiment for AI-based direction awareness during content engagement is provided. The embodiment may include receiving a query from a user on a primary device and data relating to an orientation of the primary device. The embodiment may also include presenting display content to the user on the primary device. The embodiment may further include, in response to determining directional input is received from the user, identifying a relative distance and direction of the one or more wearable devices to the primary device. The embodiment may also include detecting a reference object. The embodiment may further include identifying a scaled distance of the one or more wearable devices to the reference object. The embodiment may also include adjusting the presented display content to include the reference object in a center of a screen of the primary device.
Type: Application
Filed: September 7, 2022
Publication date: March 7, 2024
Inventors: Tushar Agrawal, Christian Compton, Jeremy R. Fox, Sarbajit K. Rakshit
-
Publication number: 20240077948
Abstract: A gesture-based display interface control method and apparatus, a device and a storage medium, belonging to the technical field of intelligent devices. The method includes: acquiring a first gesture of a user, and determining whether the first gesture is kept for more than a preset time; if the first gesture is kept for more than the preset time, displaying a target virtual gesture point on the current display interface; and acquiring a second gesture of the user, and moving the target virtual gesture point on the current display interface according to the second gesture. According to the method, the virtual gesture point is displayed on the display interface, and the virtual gesture point is randomly moved on the display interface, so that any interactive control on the current display interface can be operated, and gesture control is more intelligent and comprehensive while improving the accuracy of gesture control.
Type: Application
Filed: August 25, 2021
Publication date: March 7, 2024
Inventors: Xiaochen WANG, Ke DONG
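An illustrative sketch of the hold-then-move interaction described above; the one-second hold time and the class/field names are assumed for the example and are not taken from the application:

```python
from dataclasses import dataclass

HOLD_TIME_S = 1.0   # assumed "preset time" for the first gesture

@dataclass
class VirtualGesturePoint:
    x: float = 0.0
    y: float = 0.0
    visible: bool = False

class GestureController:
    def __init__(self):
        self.point = VirtualGesturePoint()
        self.hold_start = None

    def on_first_gesture(self, held: bool, now: float):
        """Show the virtual gesture point once the first gesture is held long enough."""
        if not held:
            self.hold_start = None
            return
        if self.hold_start is None:
            self.hold_start = now
        elif now - self.hold_start >= HOLD_TIME_S:
            self.point.visible = True

    def on_second_gesture(self, dx: float, dy: float):
        """Move the virtual gesture point on the display according to the second gesture."""
        if self.point.visible:
            self.point.x += dx
            self.point.y += dy
```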
-
Publication number: 20240077949
Abstract: A gesture and voice-controlled interface device comprising one or a plurality of gesture sensors for sensing gestures of a user; one or a plurality of audio sensors for sensing sounds made by the user; and a processor configured to obtain one or a plurality of sensed gestures from said one or a plurality of gesture sensors and to obtain one or a plurality of sensed sounds from said one or a plurality of audio sensors, to analyze the sensed gesture and sensed sounds to identify an input from the user, and to generate an output signal corresponding to the input to a controlled device.
Type: Application
Filed: November 9, 2023
Publication date: March 7, 2024
Applicant: Wearable Devices Ltd.
Inventors: Guy WAGNER, Leeor LANGER, Asher DAHAN
-
Publication number: 20240077950
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The technology disclosed includes determining from the motion information whether a motion of the control object with respect to the virtual control construct is an engagement gesture, such as a virtual mouse click or other control device operation. The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: November 9, 2023
Publication date: March 7, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
-
Publication number: 20240077951
Abstract: An input device includes a control surface for a color correction system. The control surface includes a housing with an upwardly facing control panel. The control panel has a proximal edge which is nearest a user and a distal edge that is furthest from a user in normal use. The control panel includes a plurality of controls. The plurality of controls includes: a plurality of trackballs, wherein each trackball comprises a ball and a control ring, said ball cooperating with at least one encoder to generate a multi-dimensional control signal based on motion of the ball, and said control ring cooperating with at least one encoder to generate a one dimensional control signal based on the rotational motion of the ring, wherein the ball of said trackball is mounted concentrically with said control ring; a plurality of control buttons; and a plurality of knobs coupled to respective rotary encoders.
Type: Application
Filed: September 5, 2023
Publication date: March 7, 2024
Inventors: Grant David Petty, Simon Milne Kidd, John Anthony Vanzella, Shannon Howard Smith, Andrew James Godin, Benjamin Hill, Lachlan James Karp
-
Publication number: 20240077952
Abstract: A method includes displaying a first screen of a keyboard area having an upper pseudo-image part, a lower pseudo-image part, and multiple image-pixel keys, changing a Unicode character in one of the upper pseudo-image part and the lower pseudo-image part into a first non-blank Unicode character in the first screen in response to a first image-pixel key input, displaying a second screen of the keyboard area different from the first screen, providing the upper pseudo-image part in a text input area in response to a first image-part key input, providing a word in the text input area in response to at least one letter key input, and providing the lower pseudo-image part in the text input area in response to a second image-part key input.
Type: Application
Filed: April 27, 2023
Publication date: March 7, 2024
Inventor: Bonggeun Kim
-
Publication number: 20240077953
Abstract: An apparatus, method and computer program is described including: generating a display pattern including features with spatial variations, wherein the spatial variations are configured to enable determination, by a first radar device, of at least one of an orientation of a first user device, position of the first user device, size of a display of the first user device, or one or more gestures associated with the display.
Type: Application
Filed: September 1, 2023
Publication date: March 7, 2024
Inventors: Rory Andrew Bruce McDonald, Christopher John WRIGHT, Harry Michael CRONIN
-
Publication number: 20240077954
Abstract: Disclosed are a human-computer interaction movement track detection method, device, apparatus, and computer-readable storage medium. The method comprises: segmenting a movement path of a movement target to obtain one or more sub paths; monitoring and recording movement of the movement target in each of the sub paths to obtain a human-computer interaction movement track of the movement target in each of the sub paths; identifying the human-computer interaction movement track of the movement target in each of the sub paths with a determination model corresponding to each of the sub paths; and detecting a human-computer interaction movement track of the movement target based on identification of the human-computer interaction movement track in each of the sub paths.
Type: Application
Filed: December 27, 2022
Publication date: March 7, 2024
Inventors: Qichao ZHAO, Ran YANG
-
Publication number: 20240077955
Abstract: An optical sensor, comprising: a plurality of pins, wherein the optical sensor is configured to sense optical data and configured to compute motions according to the optical data; wherein the optical sensor outputs the motions to a control circuit via a complex pin among the pins in a first mode; wherein the optical sensor outputs data other than the motions to the control circuit via the complex pin in a second mode.
Type: Application
Filed: October 30, 2023
Publication date: March 7, 2024
Applicant: PixArt Imaging Inc.
Inventor: Tong Sen Liew
-
Publication number: 20240077956
Abstract: An electronic device having a flexible display and a method for providing a control object, based on a user's gripping state of an electronic device are provided. The electronic device includes a display, a memory, and a processor. The processor may control the display such that an execution screen of an application is displayed in a designated state of the electronic device. The processor may detect a control object from the execution screen. The processor may determine a user's gripping state. The processor may identify at least one target control object from the control object, based on the designated state and the gripping state. The processor may provide a duplicate control object corresponding to a control object identified as the target control object to an optimization region corresponding to the gripping state.
Type: Application
Filed: October 17, 2023
Publication date: March 7, 2024
Inventors: Younsun LEE, Sangheon KIM, Juyoung KIM, Changhwan KIM, Hyunjung MOON, Sungchan BAE, Yeunwook LIM
-
Publication number: 20240077957
Abstract: A virtual reality (VR) control method includes displaying, by a VR device, a VR environment. The VR environment is presented from a perspective in the VR environment that is defined with respect to a reference frame in the VR environment. A graphical indicator of a current real-world position of the VR device with respect to a real-world reference position is displayed. The graphical indicator tracks the perspective during movement of the perspective to remain displayed during the movement of the perspective. The method includes determining a velocity of the reference frame based on a displacement between the current real-world position and the real-world reference position; and translating the reference frame in the VR environment at the velocity.
Type: Application
Filed: September 6, 2023
Publication date: March 7, 2024
Inventors: Eric Malafeew, Jason Warburg
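A rough sketch of the displacement-to-velocity mapping the abstract describes; the gain, dead zone, and linear mapping are assumptions, since the abstract only says the velocity is based on the displacement:

```python
import numpy as np

GAIN = 1.5          # assumed mapping from displacement (m) to velocity (m/s)
DEADZONE_M = 0.05   # assumed: ignore tiny displacements so the frame stays still at rest

def reference_frame_velocity(current_pos, reference_pos, gain=GAIN):
    """Velocity of the VR reference frame, derived from the displacement between
    the device's current real-world position and the real-world reference position."""
    displacement = np.asarray(current_pos, float) - np.asarray(reference_pos, float)
    if np.linalg.norm(displacement) < DEADZONE_M:
        return np.zeros(3)
    return gain * displacement

def translate_reference_frame(frame_origin, velocity, dt):
    """Integrate the velocity to translate the reference frame in the VR environment."""
    return np.asarray(frame_origin, float) + velocity * dt

# usage: lean 20 cm forward from the reference position, integrate one 100 ms frame
v = reference_frame_velocity([0.2, 0.0, 0.0], [0.0, 0.0, 0.0])
print(translate_reference_frame([0.0, 0.0, 0.0], v, dt=0.1))
```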
-
Publication number: 20240077958
Abstract: An information processing device includes at least one memory and at least one processor which function as: an acquisition unit configured to acquire operation information of a two-dimensional operation performed on a controller; and a control unit configured to control an object displayed on a display device based on the operation information and a correspondence relationship between an operation coordinate system for determining an operation direction of the two-dimensional operation performed on the controller and a display coordinate system for determining a display position of the display device.
Type: Application
Filed: August 31, 2023
Publication date: March 7, 2024
Inventor: Hiroaki KAMIJO
-
Publication number: 20240077959
Abstract: An active stylus includes a pin holder in a main body housing, a tip shell having non-conductive material, a first electrode having a first end fixed in the tip shell, a second electrode having a first end fixed in the main body housing, and a first insulating ring in the second electrode configured for isolating the first electrode. The main body housing includes a first opening and the tip shell includes a second opening, and the second opening is opposite to the first opening. Moreover, a second end of the first electrode protrudes through the second opening and a second end of the second electrode protrudes through the first opening. The second end of the first electrode protrudes through the second electrode and is detachably fixed in a holding portion of the pin holder; and the second end of the second electrode is detachably fixed in the tip shell.
Type: Application
Filed: August 28, 2023
Publication date: March 7, 2024
Inventors: Yeh Sen-Fan CHUEH, Tzu-Yu TING
-
Publication number: 20240077960
Abstract: A pen-shaped position indicator is configured to capacitively couple with a sensor surface of a position detection apparatus. The indicator includes a pen-shaped body; a coil; a driving power production circuit configured to produce a DC voltage from an induced signal in the coil by a wireless interaction with a charging device; a signal production circuit connected to the driving power production circuit and configured to generate a signal based on the DC voltage to form a capacitive relationship between the pen-shaped body and the position detection apparatus; a first electrode; and a second electrode. The first and second electrodes are configured to form first and second capacitive relationships with the sensor surface, respectively, to generate detection signals in the sensor surface from which a first detection signal and a second detection signal distinguishable from each other are extracted and used to obtain angle information of the pen-shaped position indicator.
Type: Application
Filed: November 14, 2023
Publication date: March 7, 2024
Inventors: Yasuo ODA, Sadao YAMAMOTO, Yoshihisa SUGIYAMA
-
Publication number: 20240077961
Abstract: An ergonomic hand support device reduces stress and helps to prevent strain when using a computer pointing device.
Type: Application
Filed: August 5, 2022
Publication date: March 7, 2024
Inventors: Timothy Frank Bernasch, Ahmed Abdullahi Omar
-
Publication number: 20240077962
Abstract: A control method, a computer-readable medium, and an electronic device. The control method includes: receiving a target control instruction; and controlling a real-time motion state of the retractable screen according to the target control instruction.
Type: Application
Filed: October 26, 2023
Publication date: March 7, 2024
Applicant: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD.
Inventor: Ping ZHANG
-
Publication number: 20240077963
Abstract: An electronic substrate and an electronic device are provided. The electronic substrate includes a first functional region, a second functional region and a peripheral region; the first functional region includes an opening, the second functional region is in the opening, and the peripheral region includes an opening peripheral region between the first functional region and the second functional region; the electronic substrate includes a base substrate and a detection trace structure on the base substrate; the detection trace structure includes a first conductive trace and a second conductive trace which are in the opening peripheral region, and the first conductive trace and the second conductive trace respectively extend from a first position reversely along an edge of the second functional region and respectively partially surround the second functional region, the first conductive trace and the second conductive trace are spaced apart from each other in the first position.
Type: Application
Filed: May 18, 2021
Publication date: March 7, 2024
Inventors: Cong FAN, Yu WANG, Kemeng TONG, Fan HE, Xiangdan DONG, Hongwei MA, Mengmeng DU, Xuwu HU
-
Publication number: 20240077964
Abstract: There are provided a conductive film, a touch sensor film, and an image display apparatus, in which disconnection of a metal wire hardly occurs even in a case where bending is repeated. The conductive film includes a base material, protruding parts disposed on the base material, each of which projects in one direction and a plurality of which are spaced apart to be disposed in a direction orthogonal to the one direction, and a metal wire that intersects the one direction and extends across the plurality of protruding parts, where in a case where a width of the protruding part in the direction orthogonal to the one direction in which the protruding part projects is denoted as Lj, and an interval between the plurality of protruding parts is denoted as Ld, 1 μm ≤ Lj < 100 μm and 1 μm ≤ Ld < 100 μm are satisfied, and in a case where a thickness of the metal wire is denoted as td, and a thickness of the protruding part is denoted as tj, tj ? td is satisfied.
Type: Application
Filed: July 19, 2023
Publication date: March 7, 2024
Applicant: FUJIFILM Corporation
Inventor: Tokuju OIKAWA
-
SYSTEMS AND METHODS FOR TOUCH SENSING ON DEVICES INCLUDING REGIONS WITH AND WITHOUT TOUCH ELECTRODES
Publication number: 20240077965
Abstract: Touch sensor panels/screens can include a first region having a plurality of touch electrodes and a second region without touch electrodes. In some examples, to improve touch sensing performance, a first algorithm or a second algorithm is applied to determine whether an object corresponding to the touch patch is in contact with the touch screen. Whether to apply the first algorithm or the second algorithm is optionally dependent on the location of the touch patch.
Type: Application
Filed: August 16, 2023
Publication date: March 7, 2024
Inventors: Dor SHAVIV, Behrooz SHAHSAVARI, David S. GRAFF, Baboo V. GOWREESUNKER, Nima FERDOSI, Yash S. AGARWAL, Sai ZHANG
-
Publication number: 20240077966
Abstract: A light emitting display device with an integrated touch screen includes: a substrate which includes a display area in which a plurality of display pixels is disposed and a non-display area around the display area; a light emitting diode in the display area; an encapsulation unit which covers the display area and the non-display area; a touch electrode line on the encapsulation unit; a touch routing line which is disposed in the non-display area and is connected to the touch electrode line; a plurality of blocking structures which is disposed in the non-display area and is configured to enclose the display area; and a step compensation layer disposed between the encapsulation unit and the touch routing line; wherein the step compensation layer reduces a step caused by the plurality of blocking structures to reduce irregularities of a surface of the encapsulation unit.
Type: Application
Filed: November 10, 2023
Publication date: March 7, 2024
Inventors: YeonGyeong Bae, Jonghyun Han
-
Publication number: 20240077967
Abstract: A touch panel and a display apparatus. The touch panel has a touch region and a fingerprint identification region and includes a first metal mesh layer, a second metal mesh layer, and a dielectric layer. The first metal mesh layer includes a plurality of touch electrodes disposed in the touch region and a plurality of fingerprint identification electrodes disposed in the fingerprint identification region. The second metal mesh layer includes a plurality of fingerprint lead wires. Each of the plurality of fingerprint lead wires is electrically connected to corresponding one of the plurality of fingerprint identification electrodes. The dielectric layer is disposed between the first metal mesh layer and the second metal mesh layer. The dielectric layer is provided with a plurality of contact holes via which the identification lead wires are electrically connected to the fingerprint identification electrodes.
Type: Application
Filed: November 14, 2023
Publication date: March 7, 2024
Applicant: Yungu (Gu’an) Technology Co., Ltd.
Inventors: Haofeng ZHANG, Rui GUO, Meng ZHANG, Ching Tung HSU
-
Publication number: 20240077968
Abstract: An electronic device has sensors. More particularly, the electronic device is a small form factor electronic device such as earbuds, styluses, or electronic pencils, earphones, and so on. In some implementations, one or more touch sensors and one or more force sensors are coupled to a flexible circuit. In various implementations, the touch sensor and the force sensor are part of a single module controlled by a single controller. In a number of implementations, the flexible circuit is laminated to one or more portions of an interior surface of the electronic device.
Type: Application
Filed: August 30, 2023
Publication date: March 7, 2024
Inventors: Zhiyuan Sun, Wei Lin, Ying-da Wang, Chun-Chih Chang, Nathan K. Gupta, Travis N. Owens, Karan S. Jain, Supratik Datta, Kyle J. Campiotti
-
Publication number: 20240077969
Abstract: Devices and related methods are disclosed herein that generally involve detecting and interpreting gestures made by a user to generate user input information for use by a digital data processing system. In one embodiment, a device includes first and second sensors that observe a workspace in which user gestures are performed. The device can be set to a keyboard input mode, a number pad input mode, or a mouse input mode based on the positioning of the user's hands. Subsequent gestures made by the user can be interpreted as keyboard inputs, mouse inputs, etc., using observed characteristics of the user's hands and various motion properties of the user's hands. These observed characteristics can also be used to implement a security protocol, for example by identifying authorized users by the anatomical properties of their hands or the behavioral properties exhibited by the user while gesturing.
Type: Application
Filed: July 21, 2023
Publication date: March 7, 2024
Inventor: Thomas J. MOSCARILLO
-
Publication number: 20240077970
Abstract: An electronic device is provided. The electronic device includes a touch sensor, a processor, and a memory. The processor may determine a touch input from a user as at least one of a force-touch input or a long-touch input, based on received touch data, determine whether a result of determining the touch data matches an intention of the user, store data that does not match the intention of the user as a result of determination among the touch data in the memory, and determine a type of an artificial intelligence (AI)-based pre-learning model to be used in the electronic device, based on touch input accuracy and the data that does not match the intention of the user.
Type: Application
Filed: November 13, 2023
Publication date: March 7, 2024
Inventors: Junhyuk LEE, Hyunbin PARK, Seungjin YANG, Jin CHOI
-
Publication number: 20240077971
Abstract: A method performed by at least one processor of synchronizing an active stylus pen with an electronic device, the method comprising: receiving, by the active stylus pen, a synchronization start signal from the electronic device via wireless communication between the active stylus pen and the electronic device; in response to receiving the synchronization start signal, calculating, by the active stylus pen, a plurality of transmission timings at which a position signal corresponding to a position of the active stylus pen is transmitted to the electronic device, the calculating being performed on the basis of a point in time at which the synchronization start signal is received; and transmitting, by the active stylus pen on the basis of the plurality of transmission timings, the position signal to the electronic device.
Type: Application
Filed: September 1, 2023
Publication date: March 7, 2024
Applicant: SAMSUNG ELECTRONICS CO., LTD.
Inventor: Kiup KIM
-
Publication number: 20240077972
Abstract: This application provides a touch substrate and a display device. The touch substrate comprises a touch area and a peripheral area surrounding the touch area. The touch substrate further comprises: a touch electrode and a touch signal line that are coupled to each other. At least a portion of the touch electrode is located in the touch area. At least a portion of the touch signal line is located in the peripheral area, the touch signal line comprises at least one corner portion, the at least one corner portion comprises a first part, a second part and a third part that are connected end to end in sequence, an extension direction of the first part is the same as an extension direction of the third part, and an extension direction of the second part intersects with the extension direction of the first part.
Type: Application
Filed: November 8, 2023
Publication date: March 7, 2024
Inventors: Fan HE, Kemeng TONG, Cong FAN, Yu WANG, Hongwei MA, Qian MA
-
Publication number: 20240077973
Abstract: A sensor device includes: a sensor panel including sensors arranged in a matrix form and sensor lines electrically connected to the sensors one-to-one; and a sensor driver configured to receive sensing signals from the sensors through the sensor lines, wherein the sensor driver is configured to simultaneously receive a first sensing signal from a first sensor using a first reference signal and a second sensing signal from a second sensor using a second reference signal, wherein the first reference signal and the second reference signal have a same waveform, a phase of the second reference signal is different from a phase of the first reference signal, and wherein a phase of the second sensing signal is different from a phase of the first sensing signal.
Type: Application
Filed: March 24, 2023
Publication date: March 7, 2024
Inventors: Jin Woo KIM, Ja Seung KU, Chang Bum KIM, Dong Chun LEE
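One way to read the phase-shifted-reference scheme is as correlation-based separation of the simultaneously received signals; the sketch below assumes orthogonal references and invented function names, and is not the sensor driver's actual circuit behavior:

```python
import numpy as np

def separate_sensing_signals(combined, ref1, ref2):
    """Recover two simultaneously received sensing signals by correlating the
    combined measurement with each reference. Assumes the two references are
    (near-)orthogonal over the window, e.g., the same waveform shifted by 90 degrees."""
    a1 = np.dot(combined, ref1) / np.dot(ref1, ref1)  # component aligned with reference 1
    a2 = np.dot(combined, ref2) / np.dot(ref2, ref2)  # component aligned with reference 2
    return a1, a2

# usage: two equal-waveform references, 90 degrees apart in phase
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
ref1 = np.sin(2 * np.pi * 5 * t)
ref2 = np.sin(2 * np.pi * 5 * t + np.pi / 2)
combined = 0.8 * ref1 + 0.3 * ref2               # what a shared sensing path might carry
print(separate_sensing_signals(combined, ref1, ref2))   # approximately (0.8, 0.3)
```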
-
Publication number: 20240077974
Abstract: A method to process touch input on a touch-screen display device having an electronic display layer arranged behind a series of column electrodes and behind a series of row electrodes. The method comprises: concurrently driving one or more row electrodes while leaving undriven one or more other row electrodes; sensing a row signal from the one or more other row electrodes; sensing a column signal from the series of column electrodes; and providing a corrected column output based at least partly on the column signal and on the row signal.
Type: Application
Filed: January 31, 2022
Publication date: March 7, 2024
Applicant: Microsoft Technology Licensing, LLC
Inventors: On HARAN, Eliyahu BAREL
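A simplified sketch of the correction step: the signal sensed on the undriven rows is treated as a noise reference and subtracted, with an assumed coupling gain, from the column signal; the abstract does not specify the actual correction, so the gain and function name are illustrative:

```python
import numpy as np

COUPLING_GAIN = 0.12   # assumed coupling coefficient between row and column signals

def corrected_column_output(column_signal, row_signal, k=COUPLING_GAIN):
    """Correct the sensed column signal using the signal picked up on the undriven
    rows (e.g., noise coupled from the display layer behind the electrodes)."""
    column_signal = np.asarray(column_signal, float)
    row_signal = np.asarray(row_signal, float)
    return column_signal - k * row_signal.mean()   # one shared correction term per frame

# usage: three column samples corrected by the average of two undriven-row samples
print(corrected_column_output([1.0, 1.2, 0.9], [0.5, 0.4]))
```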
-
Publication number: 20240077975
Abstract: An active pen adapted for use with a touch pad includes: a receiving circuit configured to generate an internal signal in response to modulating a received uplink signal, a first filter configured to generate a first sampling signal by sampling the internal signal at a first frequency, a second filter configured to generate a second sampling signal by sampling the internal signal at a second frequency unequal to the first frequency, and a timing information generating circuit configured to generate timing information associated with the uplink signal from the internal signal. The receiving circuit is also configured to receive the uplink signal at the first frequency in a first frame, and at the second frequency in a second frame. A noise power calculating circuit is also provided to calculate a noise power level of the uplink signal at each of the first and second frequencies, in response to the first and second sampling signals.
Type: Application
Filed: July 24, 2023
Publication date: March 7, 2024
Inventors: Choonghoon Lee, Kiup Kim, Seunghoon Baek, Yoonion Hwang
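A sketch of the per-frequency noise-power comparison the abstract implies, using a simple mean-square estimate; both the estimator and the follow-on frequency-selection step are assumptions for illustration, not the disclosed circuit:

```python
import numpy as np

def noise_power(samples):
    """Noise power estimate of a sampled signal: mean squared deviation from its mean."""
    s = np.asarray(samples, float)
    return float(np.mean((s - s.mean()) ** 2))

def select_uplink_frequency(samples_f1, samples_f2, f1_hz, f2_hz):
    """Compare the noise power measured in the first-frequency frame and the
    second-frequency frame, and prefer the quieter channel (assumed policy)."""
    p1, p2 = noise_power(samples_f1), noise_power(samples_f2)
    return (f1_hz, p1) if p1 <= p2 else (f2_hz, p2)

# usage with synthetic samples: the 300 kHz frame is quieter here
rng = np.random.default_rng(0)
print(select_uplink_frequency(rng.normal(0, 0.1, 256), rng.normal(0, 0.3, 256), 300e3, 500e3))
```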
-
Publication number: 20240077976
Abstract: A signal processing circuit includes a driving signal generator and an encoder. The driving signal generator is configured to generate a driving signal. The encoder includes a multiplexer, a plurality of driver/receiver circuits and a summation circuit. The multiplexer is configured to receive multiple sensing signals in response to the driving signal. Among the driver/receiver circuits, a first driver/receiver circuit is configured to receive at least one first sensing signal, and apply a first gain to the first sensing signal to generate a first encoded signal; and a second driver/receiver circuit is configured to receive at least one second sensing signal other than the first sensing signal, and apply a second gain different from the first gain to the second sensing signal to generate a second encoded signal. The summation circuit is configured to sum up the first encoded signal and the second encoded signal to generate encoded data.
Type: Application
Filed: September 5, 2022
Publication date: March 7, 2024
Applicant: NOVATEK Microelectronics Corp.
Inventor: Tsen-Wei Chang
-
Publication number: 20240077977
Abstract: Disclosed is a method and system for predicting a touch interaction position on a large display based on a binocular camera. The method includes: separately acquiring arm movement video frames of a user and facial and eye movement video frames of the user by a binocular camera; extracting a video clip of each tapping action from the arm movement video frames and the facial and eye movement video frames and obtaining a key frame by screening; marking the key frame of each tapping action with coordinates to indicate coordinates of a finger in a display screen; inputting the marked key frame to an efficient convolutional network for online video understanding (ECO)-Lite neural network for training to obtain a predictive network model; and inputting a video frame of a current operation to be predicted to the predictive network model and outputting a touch interaction position predicted for the current operation.
Type: Application
Filed: September 5, 2023
Publication date: March 7, 2024
Inventors: Gangyong JIA, Yumiao ZHAO, Huanle RAO, Ziwei SONG, Minghui YU, Hong XU
-
Publication number: 20240077978
Abstract: A flexible display device includes a display panel having an active area including a plurality of pixels displaying an image and a non-active area surrounding the active area, first data pads disposed in the non-active area adjacent to the active area of the display panel and electrically connected with the plurality of pixels, pads disposed in a direction away from the first data pads and the active area, second data pads disposed between the first data pads and the pads, and a protrusion pattern located between the second data pads and the third data pads and disposed on an upper part of the insulating layer.
Type: Application
Filed: November 2, 2023
Publication date: March 7, 2024
Applicant: LG DISPLAY CO., LTD.
Inventors: Kiyoung SUNG, SangHo KIM, Eunjin OH
-
Publication number: 20240077979
Abstract: A display device includes: a substrate; a display layer on the substrate; and a touch layer on the display layer, and including: a touch area; first sensor electrodes located along a first direction, and electrically connected to each other; second sensor electrodes located along a second direction crossing the first direction, and electrically connected to each other; and third sensor electrodes located along the first direction, and electrically insulated from the first and second sensor electrodes. In a first mode, the touch layer is to sense amounts of change in first capacitances between the first sensor electrodes and the second sensor electrodes, and in a second mode, the touch layer is to sense amounts of change in second capacitances between the first sensor electrodes and some of the second sensor electrodes, and amounts of change in third capacitances between the first sensor electrodes and the third sensor electrodes.
Type: Application
Filed: April 20, 2023
Publication date: March 7, 2024
Inventors: Jae Uk CHOI, Yun Ho KIM
-
Publication number: 20240077980
Abstract: The present invention provides a transparent conductive substrate, sequentially comprising: a first resist layer, a first transparent conductive layer, a transparent core, a second transparent conductive layer, and a second resist layer; wherein the first resist layer is composed of a UV-light sensitive composition (C1); and the second resist layer is composed of a visible-light sensitive composition (C2). The present invention provides a double-side photolithographic method for manufacturing transparent conductive laminates. The transparent conductive laminates manufactured by the inventive method may be incorporated into touch panels.
Type: Application
Filed: August 23, 2023
Publication date: March 7, 2024
Inventor: YI-TING CHEN
-
Publication number: 20240077981
Abstract: In some examples, a touch screen includes a first region corresponding to a region of the touch screen without touch electrodes; a second region corresponding to a region of the touch screen with a first conductive material (e.g., solid metal) with a first density in a first conductive layer; and a third region corresponding to a region of the touch screen with a second conductive material (e.g., metal mesh) with a second density, lower than the first density, in the first conductive layer. In some examples, the second region circumscribes the first region, and the third region circumscribes the second region. Some touch electrodes include a portion of the first conductive material in the second region and a portion of the second conductive material in the third region. Such touch electrodes can be routed using the first conductive material in the first conductive layer around the first region.
Type: Application
Filed: August 30, 2023
Publication date: March 7, 2024
Inventors: Ashray Vinayak GOGTE, Yufei ZHAO, Christophe BLONDIN, Yoann J. LANET
-
Publication number: 20240077982
Abstract: An application that is executed by a data processing apparatus serving as a mobile terminal and transmits a performance instruction (job performance request) to perform a job to an image forming apparatus displays a user interface (a stop button, a notification of a notification center) for receiving a stop operation of transmitting, to the image forming apparatus, a stop instruction (job stop request) to instruct the image forming apparatus to stop performing the job based on the performance instruction, regardless of a screen for the performance instruction by the application and a screen after the performance instruction.
Type: Application
Filed: September 1, 2023
Publication date: March 7, 2024
Inventor: SATOKI WATARIUCHI
-
Publication number: 20240077983
Abstract: Recording tools for creating interactive AR experiences. An interaction recording application enables a user with little or no programming skills to perform and record user behaviors that are associated with reactions between story elements such as virtual objects and connected IoT devices. The user behaviors include a range of actions, such as speaking a trigger word and apparently touching a virtual object. The corresponding reactions include starting to record a subsequent scene and executing actions between story elements. The trigger recording interface is presented on the display as an overlay relative to the physical environment.
Type: Application
Filed: September 1, 2022
Publication date: March 7, 2024
Inventors: Lei Zhang, Youjean Cho, Daekun Kim, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
-
Publication number: 20240077984
Abstract: Described are recording tools for generating following behaviors and creating interactive AR experiences. The following recording application enables a user with little or no programming skills to virtually connect virtual objects to other elements, including virtual avatars representing fellow users, thereby creating an interactive story in which multiple elements are apparently and persistently connected. The following interface includes methods for selecting objects and instructions for connecting a virtual object to a target object. In one example, the recording application presents on the display a virtual tether between the objects until a connecting action is detected. The following interface is presented on the display as an overlay, in the foreground relative to the physical environment.
Type: Application
Filed: September 1, 2022
Publication date: March 7, 2024
Inventors: Lei Zhang, Ava Robinson, Daekun Kim, Youjean Cho, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández
-
Publication number: 20240077985
Abstract: An electronic device may include one or more sensors that capture sensor data for a physical environment around the electronic device. The sensor data may be used to determine a scene understanding data set for an extended reality environment including the electronic device. The scene understanding data set may include information such as spatial information, information regarding physical objects in the extended reality environment, and information regarding virtual objects in the extended reality environment. When providing scene understanding data to one or more applications running on the electronic device, spatial and/or temporal restrictions may be applied to the scene understanding data set. Scene understanding data that is associated with locations within a boundary and that is associated with times after a cutoff time may be provided to an application.
Type: Application
Filed: June 21, 2023
Publication date: March 7, 2024
Inventors: Divya T. Ramakrishnan, Brandon J. Van Ryswyk, Reinhard Klapfer, Antti P. Saarinen, Kyle L. Simek, Aitor Aldoma Buchaca, Tobias Böttger-Brill, Robert Maier, Ming Chuang
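A minimal sketch of the spatial/temporal restriction described above, assuming an axis-aligned boundary and a timestamp cutoff; the SceneEntry structure and its field names are invented for the example:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SceneEntry:
    position: Tuple[float, float, float]   # location in the extended reality environment
    timestamp: float                        # when the data was captured
    payload: dict                           # e.g., object label or plane geometry

def filter_scene_data(entries: List[SceneEntry],
                      boundary_min, boundary_max, cutoff_time: float):
    """Return only entries inside the boundary and newer than the cutoff time,
    mirroring the restrictions applied before data reaches an application."""
    def inside(p):
        return all(lo <= c <= hi for c, lo, hi in zip(p, boundary_min, boundary_max))
    return [e for e in entries
            if e.timestamp >= cutoff_time and inside(e.position)]
```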
-
Publication number: 20240077986
Abstract: An Adaptive Tangible User Interface (ATUI) for use in an extended reality environment in which tangible interfaces are composed in real time based on identified affordances of existing objects in the physical environment and the input tasks of a user. An extended reality system can be provided with instructions executable by one or more processors to perform processing including: generating a representation of the real-world environment within a field of view of the user; identifying physical objects within the real-world environment; generating a set of possible performable gestures afforded by available object affordance factors; determining potential input tasks; composing performable gestures for the potential input tasks; and selecting a physical object for use as an adaptive tangible user interface. Techniques for designing a virtual user interface, overlaying the virtual user interface on a selected ATUI physical object, and maintaining alignment of the virtual user interface, are also provided.
Type: Application
Filed: September 1, 2023
Publication date: March 7, 2024
Applicant: Meta Platforms Technologies, LLC
Inventors: Stephanie Santosa, Frances Cin-Yee Lai, Michael Glueck, Daniel Clarke, Tovi Grossman, Weilun Gong
-
Publication number: 20240077987
Abstract: This application provides a widget display method applied to an electronic device, and the method includes: displaying, in a first area of a first page of a home screen, a first widget that has a first size; in response to detecting that an ongoing task exists in a first application, simultaneously displaying, in the first area, a widget of the first application and the first widget with the size reduced, where the widget of the first application displays first content, the widget of the first application has a second size, and the second size is less than the first size; and in response to the fact that the electronic device meets a preset condition, enlarging the size of the widget of the first application, where the enlarged widget displays more content than the first content. This application provides a widget display method and an electronic device, to improve user experience.
Type: Application
Filed: January 5, 2022
Publication date: March 7, 2024
Applicant: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Yanan Zhang, Hongjun Wang
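An illustrative reduction of the widget-layout rules in the abstract to a single decision function; the grid sizes, widget identifiers, and the boolean preset condition are assumptions, not values from the application:

```python
FIRST_SIZE = (4, 2)      # assumed grid size of the first widget
SECOND_SIZE = (2, 1)     # assumed smaller second size for the first application's widget
ENLARGED_SIZE = (4, 2)   # assumed enlarged size showing more than the first content

def layout_first_area(ongoing_task: bool, preset_condition_met: bool):
    """Decide what to show in the first area of the home-screen page."""
    if not ongoing_task:
        return [("first_widget", FIRST_SIZE)]
    if preset_condition_met:
        # enlarged application widget displays more content than the first content
        return [("app_widget_enlarged", ENLARGED_SIZE),
                ("first_widget_reduced", SECOND_SIZE)]
    return [("app_widget", SECOND_SIZE), ("first_widget_reduced", SECOND_SIZE)]

print(layout_first_area(ongoing_task=True, preset_condition_met=False))
```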
-
Publication number: 20240077988
Abstract: Embodiments of the present disclosure enable system(s) and method(s) for creating and deploying an electronic skill-based activity, including implementing a matchup tool to determine a projected performance score for participants in real-world events based at least in part on historical performance data of each participant. The matchup tool creates suggested matchups for inclusion in a skill-based game by selecting, for each suggested matchup, at least two components, formed from one or more participants, expected to produce substantially similar scores in the skill-based game based on the projected performance score of each participant. The matchup tool renders for display to a game operator the suggested matchups to enable the game operator to interactively select suggested matchups for inclusion within the skill-based game.
Type: Application
Filed: March 2, 2023
Publication date: March 7, 2024
Inventor: Daniel K. Orlow
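A toy sketch of the matchup logic: project a score per component from its history, then pair components whose projected scores are substantially similar; the mean-based projection and the similarity threshold are assumptions, not the disclosed method:

```python
from statistics import mean

def projected_score(history):
    """Projected performance score from historical performance data (simple mean here)."""
    return mean(history)

def suggest_matchups(components, max_gap=1.5):
    """Pair components whose projected scores are within max_gap of each other.
    `components` maps a component name to its participants' historical scores."""
    scored = sorted((projected_score(h), name) for name, h in components.items())
    matchups = []
    i = 0
    while i + 1 < len(scored):
        (s1, c1), (s2, c2) = scored[i], scored[i + 1]
        if abs(s1 - s2) <= max_gap:           # "substantially similar" threshold (assumed)
            matchups.append((c1, c2, round(abs(s1 - s2), 2)))
            i += 2                             # both components consumed by this matchup
        else:
            i += 1
    return matchups

# usage: only the two closely matched teams are suggested to the game operator
print(suggest_matchups({
    "Team A": [21, 24, 19], "Team B": [22, 20, 23],
    "Player C": [30, 28],   "Player D": [10, 12],
}))
```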