Patent Applications Published on December 15, 2016
-
Publication number: 20160363990
Abstract: A multi-dimensional measuring system is disclosed. The measuring system includes a dimensional measuring device that fits over an exterior portion of an item and provides reference measurements in three orthogonal directions. The measuring system also includes a camera to take an image of the item with the dimensional measuring device placed thereon and a hardware processor that receives and processes the image to determine the dimensions of the item based upon the reference measurements of the dimensional measuring device.
Type: Application. Filed: June 9, 2016. Publication date: December 15, 2016. Inventor: Wanda L. Key.
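The core arithmetic such a system relies on is a pixel-to-physical scale factor recovered from the reference device in the image. A minimal sketch of that step (function name, tuple layout, and units are illustrative, not from the patent):

```python
def estimate_dimensions(item_px, ref_px, ref_len_mm):
    """Estimate real-world item dimensions (mm) from pixel measurements,
    given a reference marker of known physical length visible in the image.

    item_px:    tuple of pixel extents of the item along each measured axis
    ref_px:     pixel extent of the reference marker along one axis
    ref_len_mm: known physical length of that reference marker
    """
    if ref_px <= 0:
        raise ValueError("reference extent must be positive")
    mm_per_px = ref_len_mm / ref_px  # scale factor recovered from the reference
    return tuple(px * mm_per_px for px in item_px)

# A 100 mm reference spanning 200 px gives 0.5 mm per pixel.
print(estimate_dimensions((400, 600, 240), 200, 100.0))  # (200.0, 300.0, 120.0)
```

A real system would apply one such factor per orthogonal direction, which is why the patent's device provides references along all three axes.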
-
Publication number: 20160363991
Abstract: There is provided a vehicle comprising a body having an exterior, an external display screen visible on the exterior of the body, a proximity sensor, a non-transitory memory storing an executable code and a vehicle information database including vehicle information, and a hardware processor executing the executable code to detect, using the proximity sensor, an individual within a proximity of the vehicle, activate the external display screen in response to detecting the individual within the proximity of the vehicle, and display, on the external display screen, the vehicle information from the vehicle information database stored in the memory.
Type: Application. Filed: June 8, 2016. Publication date: December 15, 2016. Inventors: Fairuz Jane Schlecht, Nikola Kostov Stefanov.
-
Publication number: 20160363992
Abstract: A wearable device worn on the head of a user determines the head orientation of the user. A tracking application executing on the wearable device determines the orientation of the wearable device relative to a frame of reference. A mobile application executing on a mobile device likewise determines the orientation of the mobile device relative to the frame of reference. The frame of reference may be magnetic north or an inertial reference frame shared between the wearable device and the mobile device. The tracking application estimates the head orientation of the user, relative to the mobile device, based on the relative orientations of the wearable device and the mobile device.
Type: Application. Filed: June 15, 2015. Publication date: December 15, 2016. Inventor: Todd WELTI.
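When both devices report a heading in the shared magnetic-north frame, the head orientation relative to the mobile device reduces to the wrapped difference of the two headings. A hedged one-axis sketch of that step (names and conventions are illustrative; the patent's full solution would handle all three rotation axes):

```python
def relative_heading(wearable_deg, mobile_deg):
    """Head orientation relative to the mobile device, assuming both devices
    report a heading in degrees clockwise from magnetic north (the shared
    frame of reference). The result is folded into (-180, 180]."""
    diff = (wearable_deg - mobile_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

# Wearable at 350 deg, phone at 10 deg: the user looks 20 deg to the left.
print(relative_heading(350.0, 10.0))  # -20.0
```

The modulo-then-fold pattern avoids the discontinuity at north, where a naive subtraction would report 340 degrees instead of -20.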
-
Publication number: 20160363993
Abstract: Embodiments of the present disclosure provide a transparent display system and vehicle equipment provided with the same.
Type: Application. Filed: September 12, 2014. Publication date: December 15, 2016. Inventor: Changlin Leng.
-
Publication number: 20160363994
Abstract: A display control method for a display control system includes: acquiring eye gaze information concerning eye gaze of a user; specifying, as a highlight target display element, among at least one display element to be displayed based on data indicating at least one item of information, a display element that has a predetermined positional relationship with the eye gaze of the user indicated by the eye gaze information and that satisfies a certain requirement as a highlight target candidate; and performing display control so that the specified highlight target display element is highlighted.
Type: Application. Filed: June 5, 2016. Publication date: December 15, 2016. Inventor: MAYU YOKOYA.
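One plausible reading of "predetermined positional relationship" plus "certain requirement" is nearest-element selection with a distance cutoff. The sketch below assumes exactly that interpretation; all names and thresholds are illustrative, not from the patent:

```python
import math

def pick_highlight(gaze_xy, elements, max_dist):
    """Choose the display element to highlight: the candidate closest to the
    gaze point, rejected if it lies farther away than max_dist.

    elements: list of (name, (x, y)) pairs. Returns a name, or None when no
    element satisfies the distance requirement."""
    best, best_d = None, float("inf")
    for name, (x, y) in elements:
        d = math.hypot(x - gaze_xy[0], y - gaze_xy[1])
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_dist else None

items = [("save", (100, 40)), ("open", (300, 40))]
print(pick_highlight((110, 40), items, 50))   # save
print(pick_highlight((500, 400), items, 50))  # None
```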
-
Publication number: 20160363995
Abstract: Embodiments of the present disclosure provide apparatuses, methods, and systems for a head-mounted eye gaze tracking system including: a camera for capturing images of a user's eyes; a light source for illuminating the user's eyes during the capturing of images; a light guide member for dispersing the light from the light source into a two-dimensional distribution for incidence onto the user's eyes; and a processor for processing the captured images to calculate the eye gaze direction of the user's eyes across a plurality of images by detecting a reflection of the two-dimensional distribution from the eyes.
Type: Application. Filed: June 8, 2016. Publication date: December 15, 2016. Inventor: Sebastian Rougeaux.
-
Publication number: 20160363996
Abstract: There is provided a hand-held controller for a virtual-reality system. The hand-held controller includes a grip extending from a proximal end to a distal end and a first user-input key mounted at least in part on the grip. The first user-input key includes a casing depressible by one or more fingers of a user, and a switch coupled to the casing. The switch includes a sensor configured to detect and distinguish between a range of pressures applied to the casing.
Type: Application. Filed: June 11, 2015. Publication date: December 15, 2016. Inventors: Jason Andrew Higgins, Benjamin E. Tunberg Rogoza.
-
Publication number: 20160363997
Abstract: A glove interface object is provided, comprising: a plurality of fluid channels disposed on a palmar side of the glove interface object, the fluid channels containing a magnetic fluid; a plurality of electromagnets positioned on the palmar side of the glove interface object, each of the plurality of electromagnets being configured when activated to generate a magnetic field that acts on at least a portion of the magnetic fluid; and a controller configured to control activation and deactivation of the electromagnets based on received haptic feedback data.
Type: Application. Filed: June 13, 2016. Publication date: December 15, 2016. Inventors: Glenn Black, Steven Osman.
-
Publication number: 20160363998
Abstract: A touchscreen includes conductive islands that are capacitively coupled to conductive traces, where they may be used to provide capacitive sensing of the position of one or more fingers in contact with a touchscreen and/or to exert haptic forces on one or more fingers in contact with a touchscreen.
Type: Application. Filed: June 9, 2016. Publication date: December 15, 2016. Inventors: James E. Colgate, Michael A. Peshkin.
-
Publication number: 20160363999
Abstract: Techniques are described herein to assist users to operate non-physical controls in situations where the users are unable to visually locate the controls. According to one embodiment, the device containing the non-physical controls is designed to give non-visual feedback to the user based, at least in part, on the distance between (a) the current position of user input, and (b) the location of the non-physical control. At least one characteristic of the non-visual feedback changes as that distance changes.
Type: Application. Filed: June 20, 2016. Publication date: December 15, 2016. Inventor: Amir C. Djavaherian.
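One way to make "a characteristic of the feedback changes as distance changes" concrete is a parking-sensor-style vibration pulse whose interval shrinks as the finger closes in on the control. The thresholds and names below are illustrative assumptions, not taken from the patent:

```python
def feedback_interval_ms(distance_px, control_radius_px=20.0,
                         near_ms=50.0, far_ms=800.0, max_px=400.0):
    """Map the distance between the current input position and a
    non-physical control to a vibration-pulse interval: pulses come faster
    as the finger approaches, and become continuous on the control itself."""
    if distance_px <= control_radius_px:
        return 0.0  # on the control: continuous feedback
    frac = min(distance_px, max_px) / max_px  # clip to the 0..1 range
    return near_ms + (far_ms - near_ms) * frac

print(feedback_interval_ms(10.0))   # 0.0   (finger is on the control)
print(feedback_interval_ms(400.0))  # 800.0 (far away: slow pulses)
```

Pitch, volume, or haptic amplitude could be substituted for the pulse interval with the same distance-to-characteristic mapping.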
-
Publication number: 20160364000
Abstract: A touch sensitive display assembly includes a touch screen and a button array. The touch screen is configured to display one or more input keys. The button array includes one or more buttons corresponding to the one or more input keys. The button array is formed by a substrate attached to a button membrane, thereby creating a set of button cavities corresponding to the input keys. The button cavities are configured to be inflated and deflated by a pump coupled to a fluid reservoir. The cavities can be inflated/deflated together, in subsets, and/or individually. In some embodiments, the button array is sandwiched between a touch sensing layer and a display of the touch screen. In other embodiments, the button array can be located either above or below the touch screen.
Type: Application. Filed: August 10, 2016. Publication date: December 15, 2016. Inventor: Craig M. Ciesla.
-
Publication number: 20160364001
Abstract: An electronic device includes a panel which a user touches, a detector that detects the touch of the user on the panel, a vibrator that vibrates the panel, and a signal generator. The signal generator generates a signal for driving the vibrator, the signal including a drive signal that generates vibration of the panel and a suppression signal that suppresses inertial vibration of the panel. The suppression signal has a degree of suppression of the inertial vibration that changes depending on a touch position detected by the detector.
Type: Application. Filed: August 29, 2016. Publication date: December 15, 2016. Inventors: Yoshifumi HIROSE, Shoichi ARAKI.
-
Publication number: 20160364002
Abstract: Various embodiments of the invention detect and analyze gestures, such as tapping and swiping patterns, that a user performs while interacting with a computing device, determine the user's mood therefrom, and initiate an appropriate response. In certain embodiments, this is accomplished, without requiring labeled training data, by monitoring a user-device interaction via sensors and analyzing the sensor data based on contextual data via a processor to determine a gesture and one or more properties associated with an emotional state of the user. A response is generated based on the identified emotional state.
Type: Application. Filed: June 9, 2015. Publication date: December 15, 2016. Applicant: DELL PRODUCTS L.P. Inventors: Carrie Elaine Gates, Gabriel Mauricio Silberman.
-
Publication number: 20160364003
Abstract: The holographic interface for manipulation includes a holographic display unit for constructing and displaying a hologram and a motion detector for detecting movement and location of a physical command object, such as a user's finger or hand, relative to the displayed hologram. The motion detector is in communication with a controller for converting the detected location of the physical command object relative to its position with respect to the displayed hologram into a command signal. The command signal is then transmitted to an external device, such as a robotic arm in a remote plant, or any other suitable external system. Alternatively, the hologram may be a holographic image of physical controls for an external system, for example, and the command signal may be a command for the external device to perform an act corresponding to manipulation of the holographic image of a physical control by the command object.
Type: Application. Filed: June 10, 2015. Publication date: December 15, 2016. Inventor: WAYNE PATRICK O'BRIEN.
-
Publication number: 20160364004
Abstract: In embodiments, apparatuses, methods and storage media (transitory and non-transitory) are described that receive sensor data from one or more sensor devices that depict a user gesture in three dimensional space, determine a flight path based at least in part on the sensor data, and store the flight path in memory for use to control operation of a drone. Other embodiments may be described and/or claimed.
Type: Application. Filed: June 11, 2015. Publication date: December 15, 2016. Inventors: Joshua Ekandem, Glen J. Anderson, Lenitra M. Durham, Richard T. Beckwith, Giuseppe Raffa, Jamie Sherman, Kathy Yuen.
-
Publication number: 20160364005
Abstract: A portable device includes micro-electro-mechanical systems ("MEMS") wind turbines integrated in the portable device. A gesture detection module receives signals generated by the MEMS wind turbines based on movement of the portable device that causes a wind force to be applied to the MEMS wind turbines. The gesture detection module then recognizes a gesture based on the signals generated from the wind force and initiates an action of the portable device that corresponds to the gesture (e.g., the device is awakened).
Type: Application. Filed: June 15, 2015. Publication date: December 15, 2016. Inventors: Francis Forest, Scott Debates.
-
Publication number: 20160364006
Abstract: An apparatus for performing gesture recognition and control based on ultrasonic positioning is disclosed. The apparatus includes an ultrasonic transmitter module for providing an original ultrasonic signal; an ultrasonic receiver for receiving a reflected ultrasonic signal corresponding to the original ultrasonic signal; a computing module for computing a frequency shift within a time period from the original ultrasonic signal being transmitted to the reflected ultrasonic signal being received, and converting the frequency shift into a gesture characteristic signal; a gesture database for providing reference gesture characteristic signals, and control signals corresponding to the reference gesture characteristic signals respectively; and a gesture recognition and control module for comparing the gesture characteristic signal with the reference gesture characteristic signals in the gesture database, and selecting a corresponding control signal from the database according to a comparison result.
Type: Application. Filed: January 24, 2016. Publication date: December 15, 2016. Applicant: AAC Technologies Pte. Ltd. Inventor: Xiaoyu Zhang.
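The frequency-shift-to-motion step rests on the standard two-way Doppler relation for a co-located transmitter and receiver, delta_f = 2 * v * f0 / c. A minimal sketch of recovering the hand's radial velocity (constants and names are illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def radial_velocity(f_transmit_hz, f_shift_hz, c=SPEED_OF_SOUND):
    """Recover the radial velocity of a reflecting hand from the Doppler
    shift of an ultrasonic signal. The factor 2 accounts for the round trip
    (signal out to the hand and back): delta_f = 2 * v * f0 / c."""
    return f_shift_hz * c / (2.0 * f_transmit_hz)

# A 40 kHz carrier shifted by about +116.6 Hz: hand approaching at ~0.5 m/s.
print(round(radial_velocity(40_000.0, 116.6), 3))  # 0.5
```

A sequence of such velocity estimates over time would form the "gesture characteristic signal" that gets compared against the database entries.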
-
Publication number: 20160364007
Abstract: Described herein is a method and a system for providing efficient and complementary natural multi-modal gesture based interaction with a computerized system which displays visual feedback information on a graphical user interface on an interaction surface. The interaction surface is within the frustum of an imaging device comprising a single sensing system. The system uses the single sensing system for detecting both touch gesture interactions with the interaction surface (120) and three-dimensional touch-less gesture interactions in areas or volumes above the interaction surface performed by hands of a user. Both types of interaction are associated contextually with an interaction command controlling the computerized system when the gesture has been detected.
Type: Application. Filed: January 30, 2015. Publication date: December 15, 2016. Inventors: ALIAKSANDR KAMOVICH, JULIEN THOLLOT, LAURENT GUIGUES.
-
Publication number: 20160364008
Abstract: Smart glasses, and a system and method for processing a hand gesture command using the smart glasses. According to an exemplary embodiment, the system includes smart glasses to capture a series of images including a hand gesture of a user and represent and transmit a hand image, included in each of the series of images, as hand representation data that is represented in a predetermined format of metadata; and a gesture recognition apparatus to recognize the hand gesture of a user by using the hand representation data of the series of images received from the smart glasses, and generate and transmit a gesture command corresponding to the recognized hand gesture.
Type: Application. Filed: June 10, 2016. Publication date: December 15, 2016. Applicants: INSIGNAL Co., Ltd., Industry-University Cooperation Foundation of Korea Aerospace University. Inventors: Sung Moon CHUN, Hyun Chul KO, Jea Gon KIM.
-
Publication number: 20160364009
Abstract: A/V recording and communication devices and methods that permit commands to be executed based on gestures recorded by the camera, and which may include automatic identification and data capture (AIDC) and/or computer vision. In one example, the camera receives an input comprising a user-generated gesture. The gesture is interpreted and, if it matches defined gesture information, a command associated with the gesture is executed.
Type: Application. Filed: August 24, 2016. Publication date: December 15, 2016. Inventor: Elliott Lemberger.
-
Publication number: 20160364010
Abstract: In general, a system can include an interface component configured to receive measurement data from a motion sensor unit physically coupled with a movable part of a body of a user. The measurement data can include sensor data of a sensor of the motion sensor unit that corresponds to a second derivative in time of a trajectory of the motion sensor unit. A data storage component can store technical profiles associated with characters and can include at least a plurality of predefined acceleration profiles. Each acceleration profile can include acceleration data characterizing a movement associated with a specific portion of a potential trajectory of the motion sensor unit in the context of at least a previous or subsequent portion of the potential trajectory. A decoding component can compare the received sensor data with the plurality of predefined acceleration profiles to identify a sequence of portions of the trajectory.
Type: Application. Filed: August 25, 2016. Publication date: December 15, 2016. Inventors: Christoph Amma, Tanja M. Schultz.
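The comparison step can be illustrated with a deliberately minimal nearest-profile match over fixed-length acceleration windows. A production decoder of this kind would use sequence alignment (e.g. DTW or HMMs) rather than point-wise comparison, so treat this purely as a sketch with made-up names and data:

```python
def closest_profile(sample, profiles):
    """Match a window of accelerometer readings against stored acceleration
    profiles by summed squared error and return the best-matching character.

    sample and each profile value: equal-length lists of acceleration values.
    Point-wise comparison assumes the window is already time-aligned with
    the profiles, which a real decoder would not assume."""
    def sse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(profiles, key=lambda ch: sse(sample, profiles[ch]))

profiles = {"L": [0.0, 1.0, 1.0, 0.0], "O": [1.0, 0.0, -1.0, 0.0]}
print(closest_profile([0.1, 0.9, 1.1, 0.0], profiles))  # L
```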
-
Publication number: 20160364011
Abstract: A handheld controller for controlling a computer video game that receives streaming video data defining images of regions of a virtual game environment that the computer transmits; projects the images defined by the video data along a projection axis to form the images on a surface that the projection axis intersects so that a user may view and interact with the images; transmits position and/or orientation (P/O) data that defines positions and/or orientations of the projection axis to control for which regions of the virtual environment the computer streams video; and comprises an actuator operable to disengage the P/O data so that video data received from the computer does not change responsive to changes in position and/or orientation of the controller.
Type: Application. Filed: June 15, 2015. Publication date: December 15, 2016. Inventor: David Bohn.
-
Publication number: 20160364012
Abstract: Apparatuses, methods, and storage media for adaptive provision of content are described. In one instance, the apparatus may comprise a processor and an adaptive content provision module to be operated by the processor. The adaptive content provision module may include a content provision module to generate and provide for display a first view of content captured by a first selected one or more of a plurality of cameras, to a plurality of user devices; an analysis module to receive and analyze user response information provided by the plurality of user devices in response to the displayed first view of content; and a control module to control the content provision module to generate a second view of content captured by a second selected one or more of the plurality of cameras, based at least in part on a result of the analysis of the user response information. Other embodiments may be described and claimed.
Type: Application. Filed: June 11, 2015. Publication date: December 15, 2016. Inventors: Yosi Govezensky, Raanan Yehezkel, Michal Jacob.
-
Publication number: 20160364013
Abstract: A virtual reality (VR) system tracks the position of a controller. The VR system includes an image tracking system comprising a number of fixed cameras, and a headset worn by the user that includes an imaging device to capture images of a controller operated by the user. The controller includes a set of features disposed on the surface of the controller. The image tracking system provides a first view of the controller. The imaging device mounted on the headset provides a second view of the controller. Each view of the controller (i.e., from the headset and from the image tracking system) provides a distinct set of features observed on the controller. The first and second sets of features are identified from the captured images and a pose of the controller is determined using the first set of features and the second set of features.
Type: Application. Filed: July 30, 2015. Publication date: December 15, 2016. Inventors: Dov Katz, Neil Konzen, Oskar Linde, Maksym Katsev.
-
Publication number: 20160364014
Abstract: A method for detecting an adjusting movement of an adjusting element located on a display area in a motor vehicle, the adjusting element serving for operating at least one function. At least part of the light emitted from the display area is used for detecting the adjusting movement of the adjusting element. This makes a reliable detection of the adjusting movement possible. A compact and robust device for carrying out the method is also disclosed.
Type: Application. Filed: June 8, 2016. Publication date: December 15, 2016. Inventors: Jan-Peter DIETZ, Jan NEUMANN.
-
Publication number: 20160364015
Abstract: A detector (118) for determining a position of at least one object (112) is disclosed, the detector (118) comprising: at least one longitudinal optical sensor (120), wherein the longitudinal optical sensor (120) has at least one sensor region (124), wherein the longitudinal optical sensor (120) is at least partially transparent, wherein the longitudinal optical sensor (120) is designed to generate at least one longitudinal sensor signal in a manner dependent on an illumination of the sensor region (124) by at least one light beam (126) traveling from the object (112) to the detector (118), wherein the longitudinal sensor signal, given the same total power of the illumination, is dependent on a beam cross-section of the light beam (126) in the sensor region (124); at least one illumination source (114) adapted to illuminate the object (112) with illumination light (115) through the longitudinal optical sensor (120); and at least one evaluation device (136), wherein the evaluation device (136) is designed to …
Type: Application. Filed: August 15, 2014. Publication date: December 15, 2016. Applicant: BASF SE. Inventors: Robert SEND, Ingmar BRUDER, Erwin THIEL, Stephan IRLE.
-
Publication number: 20160364016
Abstract: A display apparatus is provided. The display apparatus includes at least one speaker configured to output a sound signal; a display configured to display a pointing object; communication circuitry configured to communicate with a pointing apparatus for controlling a movement state of the pointing object, the pointing apparatus comprising a microphone to sense the sound signal; and a processor configured to display the pointing object at a reference position determined based on an oriented direction of the pointing apparatus determined based on information of time at which the pointing apparatus senses the sound signal, after a trigger signal for setting the reference position of the pointing object on a screen of the display is received from the pointing apparatus.
Type: Application. Filed: March 8, 2016. Publication date: December 15, 2016. Inventors: Seung-il YOON, Hyun-kyu YUN.
-
Publication number: 20160364017
Abstract: A screen content display method is disclosed. The method includes displaying screen content by using a display screen. A motion direction of the display screen on a plane parallel to the display screen, and a motion amplitude along the motion direction, are detected. The screen content is offset in a direction opposite the motion direction of the display screen according to the motion amplitude of the display screen.
Type: Application. Filed: August 24, 2016. Publication date: December 15, 2016. Inventor: Ling WANG.
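The compensation step described here reduces to negating the detected in-plane motion vector, optionally scaled, so the content appears to hold still while the screen moves. A tiny illustrative sketch (names and the gain parameter are assumptions, not from the patent):

```python
def stabilized_offset(motion_dx, motion_dy, gain=1.0):
    """Offset the screen content opposite the detected in-plane screen
    motion. gain scales how much of the motion amplitude is compensated;
    1.0 means full compensation."""
    return (-gain * motion_dx, -gain * motion_dy)

# Screen jolted 12 px right and 5 px up: shift content 12 px left, 5 px down.
print(stabilized_offset(12.0, -5.0))  # (-12.0, 5.0)
```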
-
Publication number: 20160364018
Abstract: A tilt command system for input peripherals is disclosed which allows for enhanced functionality for a peripheral device based on the peripheral's degree of tilt and direction of tilt.
Type: Application. Filed: August 29, 2016. Publication date: December 15, 2016. Inventor: David A. STEPHENSON.
-
Publication number: 20160364019
Abstract: A handheld operation device includes a casing, a circuit board, a socket, an encoder and a roller. The circuit board is disposed in the casing. The socket is disposed on the circuit board. The encoder is detachably disposed in the socket. The roller is rotatably connected to the encoder and exposed out of the casing. Since the encoder is detachably disposed in the socket, a user can detach the encoder from the socket directly without using any tools.
Type: Application. Filed: June 11, 2015. Publication date: December 15, 2016. Inventor: Yu-hsun Cheng.
-
Publication number: 20160364020
Abstract: A foldable mouse comprises a substrate, a thin-film battery, a touch module, a signal processing module, and a communication module. The substrate is a foldable structure. The thin-film battery is electrically connected to the touch module, the signal processing module, and the communication module; the signal processing module drives the touch module, receives signals from the touch module, and sends the signals to the communication module. The substrate comprises a main body structure sandwiched between a first support plate and a second support plate. The first support plate and the second support plate are symmetrically located at two opposite sides of the main body and foldably connected to the main body structure. The main body structure comprises a first surface and a second surface, and a minidisc is integrated on the second surface.
Type: Application. Filed: June 6, 2016. Publication date: December 15, 2016. Inventors: LEI DENG, YING-RI SUN, GUANG-HONG HAN, LING ZHANG, SHOU-SHAN FAN.
-
Publication number: 20160364021
Abstract: A system that incorporates teachings of the present disclosure may include, for example, a computer mouse having a tracking device to navigate in a graphical user interface presented by a software application or control aspects thereof, a plurality of buttons, a scroll wheel, a display, a memory, and a controller coupled to the tracking device, the plurality of buttons, the display, and the memory. The controller can be operable to present by way of the display a plurality of scrollable options to program a corresponding plurality of operational parameters of the computer mouse, detect from a tactile contact of at least one of one or more of the plurality of buttons and the scroll wheel a selection from the plurality of options, store in the memory the selection, and operate at least one of the plurality of operational parameters of the computer mouse according to the stored selection. Additional embodiments are disclosed.
Type: Application. Filed: August 23, 2016. Publication date: December 15, 2016. Inventors: Jacob Wolff-Petersen, Tino Soelberg, Jonas Bollack.
-
Publication number: 20160364022
Abstract: A position indicator includes: a receiving electrode to receive a driving signal from a position detector; an amplifying circuit to amplify the driving signal to generate an amplified signal; an oscillating circuit generating an output signal that oscillates for a time period upon satisfaction of a predetermined threshold condition associated at least with the amplified signal and a predetermined trigger level; and a transmitting electrode to transmit the output signal.
Type: Application. Filed: June 11, 2015. Publication date: December 15, 2016. Inventor: Shun-Pin LIN.
-
Publication number: 20160364023
Abstract: A stylus pen that can be used as an input device to a digitizer associated with a computer screen on a computing device, such as a computer, mobile device, tablet, etc. The stylus pen can include an end cap that has multiple pressure thresholds for implementing different user-input commands. To detect the pressure being applied to the end cap, the cap is movable relative to a stylus pen body so as to move a plunger in proximity or contact with a mechanical switch. The mechanical switch is a single-action switch that is converted to a dual-action switch by using the electrical conductivity of the switch to detect an electrical coupling between a plunger and the switch. The electrical coupling can be in the form of a capacitive coupling or a direct electrical connection. Further pressure can be detected through actuation of the mechanical switch.
Type: Application. Filed: June 15, 2015. Publication date: December 15, 2016. Applicant: MICROSOFT TECHNOLOGY LICENSING, LLC. Inventors: Steven Bathiche, Michael Jensen, Flavio Protasio Ribeiro, Gabriel Pirie, Vineet Thuvara.
-
Publication number: 20160364024
Abstract: A capacitive pointer is disclosed. The capacitive pointer comprises a conductive nib, a contact device, a device holder, a pressure sensor, a sensor board, a control circuit board and a power source. The device holder covers the pressure sensor and both are mounted on the sensor board. The contact device penetrates the device holder so that when the conductive nib applies a tip pressure against the pressure sensor through the contact device, the pressure sensor can detect the tip pressure applied upon the capacitive pointer. The conductive nib electrically connects the control circuit board. The sensor board electrically connects the control circuit board. The power source provides the capacitive pointer with an electrical power.
Type: Application. Filed: August 13, 2015. Publication date: December 15, 2016. Inventor: Chien-Chia Lien.
-
Publication number: 20160364025
Abstract: An electronic device with a touch-sensitive display and one or more sensors to detect signals from a stylus associated with the device: detects a positional state of the stylus, the positional state of the stylus corresponding to a distance, a tilt, and/or an orientation of the stylus relative to the touch-sensitive display; determines a location on the touch-sensitive display that corresponds to the detected positional state of the stylus; displays, in accordance with the positional state of the stylus, an indication on the touch-sensitive display of the determined location prior to the stylus touching the touch-sensitive display; detects a change in the distance, the tilt, and/or the orientation of the stylus, prior to the stylus touching the touch-sensitive display; and in response to detecting the change, updates the displayed indication on the touch-sensitive display.
Type: Application. Filed: September 21, 2015. Publication date: December 15, 2016. Inventors: Jeffrey Traer Bernstein, Linda L. Dong, Mark K. Hauenstein, Julian Missig.
-
Publication number: 20160364026
Abstract: An electronic device with a touch-sensitive display and one or more sensors to detect signals from a stylus associated with the device: while the electronic device is in a locked state: displays a lock screen user interface on the touch-sensitive display; while displaying the lock screen user interface, detects a first input from the stylus to draw on the touch-sensitive display; in response to detecting the first input, displays, over the lock screen user interface, one or more marks of a drawing that correspond to the first input; while displaying, on the lock screen user interface, the one or more marks of the drawing: detects a second input from the stylus to display a drawing application in a restricted mode; and, in response to detecting the second input, executes the drawing application in the restricted mode and displays the one or more marks of the drawing in the drawing application.
Type: Application. Filed: September 22, 2015. Publication date: December 15, 2016. Inventors: Jeffrey Traer Bernstein, Linda L. Dong, Mark K. Hauenstein, Julian Missig.
-
Publication number: 20160364027
Abstract: An electronic device: displays an electronic document; while displaying the electronic document, detects a first input from a stylus, including detecting an initial contact by the stylus on a touch-sensitive surface; determines a plurality of characteristics of the first input, including a tilt of the stylus; in accordance with a determination that the tilt meets one or more selection criteria for a first virtual drawing implement, selects the first virtual drawing implement for the stylus to emulate; in accordance with a determination that the tilt meets one or more selection criteria for a second virtual drawing implement, selects the second virtual drawing implement for the stylus to emulate; and, after selecting one of the first virtual drawing implement and the second virtual drawing implement for the stylus to emulate, generates a mark in the electronic document with the selected virtual drawing implement in response to detecting the first input.
Type: Application. Filed: September 22, 2015. Publication date: December 15, 2016. Inventors: Jeffrey Traer Bernstein, Linda L. Dong, Mark K. Hauenstein, Julian Missig.
-
Publication number: 20160364028
Abstract: An example stylus pen including a body includes a central axis, a first end, and a second end. In addition, the stylus pen includes an engagement tip disposed at the first end and including a resilient cover. Further, the stylus pen includes an adjustment mechanism at least partially disposed within the body and arranged such that manipulation of the adjustment mechanism by a user causes the cover of the engagement tip to actuate between a first shape and a second shape. The cover of the engagement tip is arranged to engage with a touch sensitive surface to cause a change in a computing device when in either one of the first shape and the second shape.
Type: Application. Filed: January 30, 2014. Publication date: December 15, 2016. Inventors: Kibok SONG, Ilchan LEE, Dong Ryual CHA.
-
Publication number: 20160364029Abstract: According to one or more aspects, a system for vehicle user interface (UI) management includes an interface component, an operation component, a presentation logic, and a rendering component. The interface component may include a display portion located at a first position and an input portion located at a second position different than the first position. The input portion may receive one or more user inputs and include a touch sensitive portion, a first button, and a second button. The operation component may select one or more modes for a UI based on one or more of the user inputs. The presentation logic may generate one or more objects based on a selected mode and presence information associated with one or more of the user inputs. The rendering component may render one or more of the objects to form a composite image at the display portion.Type: ApplicationFiled: June 11, 2015Publication date: December 15, 2016Inventors: Ross Cameron Miller, Matthew Conway, Richard Alan Ewing, JR.
-
Publication number: 20160364030Abstract: A touch user interface may be implemented at a display edge, such as a display included with a portable information handling system. An edge touch element may provide structural support for the display along with touch functionality. The edge touch element may be integrated with a touch panel included with the display. The edge touch element may wrap around to a back face of the display.Type: ApplicationFiled: June 11, 2015Publication date: December 15, 2016Inventors: Stefan Peana, David Michael Meyers, Deeder M. Aurongzeb
-
Publication number: 20160364031Abstract: An input operation accepting unit (111) of an exemplary information-processing device (100) accepts a pointing operation input using a pointing device to indicate a coordinate of a display area of a display device. A setting unit (112) sets a first stop position for a scroll of a content, a part of the content appearing in the display area, based on a positional relationship between a start coordinate and an end coordinate of the pointing operation accepted by the input operation accepting unit (111). A scrolling unit (113) scrolls the content to the first stop position set by the setting unit (112), based on the pointing operation accepted by the input operation accepting unit (111).Type: ApplicationFiled: October 20, 2015Publication date: December 15, 2016Inventors: Mai YAMAMOTO, Satoko OKADA, Yuji SAWATANI
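The setting unit above derives a scroll stop position from the positional relationship between the start and end coordinates of a pointing (drag) operation. A minimal sketch follows; snapping the stop to whole-page boundaries is an assumed policy, since the abstract does not state the concrete rule.

```python
# Illustrative sketch of the scroll stop-position logic: the stop
# position is derived from the start and end coordinates of a drag.
# Snapping to page boundaries is an assumed policy; the abstract only
# says the positional relationship between the two coordinates
# determines the first stop position.

PAGE_HEIGHT = 600  # assumed display-area height in pixels

def first_stop_position(current_offset, start_y, end_y):
    """Scroll by the drag distance, then snap to the nearest page boundary."""
    dragged = start_y - end_y            # upward drag scrolls content down
    target = current_offset + dragged
    pages = round(target / PAGE_HEIGHT)  # snap to a whole-page stop
    return pages * PAGE_HEIGHT
```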
-
Publication number: 20160364032Abstract: A touch screen and a preparation method thereof, and a touch display screen are provided for avoiding the problem of yellowing of a white frame of a prepared touch screen due to high temperature processes. The touch screen includes a light-transmissive region and a frame region surrounding the light-transmissive region, a base substrate in the light-transmissive region being provided with a touch functional layer, wherein, a base substrate in the frame region is provided with a white light-shielding layer, the white light-shielding layer is mixed with a fluorescent agent, and the fluorescent agent is capable of absorbing ultraviolet light and emitting blue light.Type: ApplicationFiled: April 16, 2015Publication date: December 15, 2016Inventors: Wenjie SHI, Ming HU, Taofeng XIE, Hongqiang LUO
-
Publication number: 20160364033Abstract: The present invention discloses a transparent conductive oxide pattern blanking structure, a touch panel, and a display device. A patterned transparent conductive oxide layer is stacked on a base plate, and a pattern blanking layer is adapted to make the pattern of the transparent conductive oxide layer invisible in visible light. The pattern blanking layer can be formed as a whole layer while the pattern of the transparent conductive oxide layer is fabricated, so no separate attaching or patterning process is needed. As a result, the pattern of the transparent conductive oxide layer is changed from clearly visible to invisible under visible light at a low fabricating cost.Type: ApplicationFiled: August 20, 2015Publication date: December 15, 2016Inventor: Wenjuan Yang
-
Publication number: 20160364034Abstract: Embodiments of the present disclosure disclose a touch panel including: a plurality of first electrodes; a plurality of second electrodes; a plurality of first electrode leads for leading the first electrodes, which are connected to each other, out of a touch area; and a plurality of second electrode leads for leading the second electrodes, which are connected to each other, out of the touch area. The touch panel further includes at least one transparent conductive layer which is formed on at least one surface of each of the first and/or second electrode leads and which is formed in the same layer as at least one of the first and second electrodes and the first and second connecting wires. The embodiments of the present disclosure may prevent oxidation of the electrode leads and increase adhesion without adding manufacturing processes.Type: ApplicationFiled: October 23, 2015Publication date: December 15, 2016Inventors: Chuanxiang Xu, Yonglian Qi
-
Publication number: 20160364035Abstract: An electronic device and a method for user track input detection are disclosed. One electronic device includes a processor having a first state and a second state. The processor consumes more power in the first state than in the second state. The electronic device also includes a track detection unit that detects a user track input. Operational information of the track detection unit is determined as the track detection unit detects the user track input, and, in response to the determined operational information of the track detection unit satisfying a predetermined condition, the processor switches from operating in the second state to operating in the first state.Type: ApplicationFiled: June 15, 2016Publication date: December 15, 2016Inventor: Bin Shi
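The two-state processor model above (wake from the low-power second state when the track detection unit's operational information satisfies a condition) can be sketched briefly. The "minimum track length" condition below is an illustrative assumption; the abstract only says a predetermined condition must be satisfied.

```python
# Minimal sketch of the two-state processor model: a track-detection
# unit reports a detected user track, and the processor switches from
# the low-power second state to the high-power first state when the
# track satisfies a condition. The minimum-length condition is an
# assumption for demonstration.

MIN_TRACK_LENGTH = 5  # assumed wake-up threshold (number of sampled points)

class Processor:
    def __init__(self):
        self.state = "low_power"  # second state: consumes less power

    def on_track_input(self, track_points):
        """Switch to the high-power first state if the condition is met."""
        if self.state == "low_power" and len(track_points) >= MIN_TRACK_LENGTH:
            self.state = "high_power"
```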
-
Publication number: 20160364036Abstract: The present invention discloses a mobile terminal including a touch screen, the touch screen including a touch cover plate. The touch screen further includes a sensing identification module covered by the touch cover plate. In the embodiments of the present invention, the sensing identification module is integrated into the touch screen, and a full touch panel is used. In this way, the visual and tactile inconsistency caused by an independently assembled sensing identification module is eliminated, the manufacturing procedure is simplified, production and utilization efficiencies are improved, and user satisfaction is enhanced.Type: ApplicationFiled: August 29, 2016Publication date: December 15, 2016Inventors: Gengchun Deng, Wei Long
-
Publication number: 20160364037Abstract: A touch display panel and a display device are provided. The touch display panel includes a first substrate and a second substrate disposed opposite to each other; a black matrix; and a plurality of light sensing elements, a plurality of scan lines and a plurality of signal lines formed on the second substrate. The black matrix is formed on the first substrate and only transmits light within a first wavelength range; the light sensing elements are shaded by the black matrix; each scan line is connected to a first end of the corresponding light sensing element, and each signal line is connected to a second end of the corresponding light sensing element. Each light sensing element is configured to generate a touch signal according to the received light within the first wavelength range and the scanning signal provided by the scan line, and to supply the touch signal to the corresponding signal line. The touch display panel has a simple structure and a relatively low cost.Type: ApplicationFiled: January 4, 2015Publication date: December 15, 2016Inventors: Jun XU, Hu LI
-
Publication number: 20160364038Abstract: The present invention provides an optical sensing electronic device including first, second, third and fourth image-sensing devices. The first, second, third and fourth image-sensing devices capture images of a rectangular area from different directions to produce a first, second, third and fourth image signal. The first, second, third and fourth image-sensing devices are disposed on a first side of a first edge of the rectangular area, the first and second image-sensing devices are disposed on a first horizontal line, the third and fourth image-sensing devices are disposed on a second horizontal line, and there is a first distance between the first horizontal line and the second horizontal line.Type: ApplicationFiled: October 22, 2015Publication date: December 15, 2016Inventor: Yu-Yen CHEN
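With two image-sensing devices on a shared horizontal line, an object in the rectangular area can be located by intersecting the two sight lines, which is the geometric core of schemes like the one above. The sensor placement and angle convention in this sketch are assumptions for illustration, not details from the patent.

```python
# Illustrative triangulation sketch: two image sensors on the same
# horizontal baseline (y = 0) each observe the direction toward an
# object in the sensing area (y > 0); the position follows from
# intersecting the two sight lines. Angle convention (measured from
# the baseline) and sensor coordinates are assumptions.

import math

def locate(sensor1_x, sensor2_x, angle1_deg, angle2_deg):
    """Intersect two sight lines cast from sensors at y = 0."""
    t1 = math.tan(math.radians(angle1_deg))
    t2 = math.tan(math.radians(angle2_deg))
    # Line 1: y = t1 * (x - sensor1_x); Line 2: y = t2 * (x - sensor2_x)
    x = (t1 * sensor1_x - t2 * sensor2_x) / (t1 - t2)
    y = t1 * (x - sensor1_x)
    return x, y
```

A second sensor pair on another horizontal line, as in the abstract, would yield an independent position estimate that can be cross-checked or averaged against the first.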
-
Publication number: 20160364039Abstract: An object is to increase the color variations of the bezel part of an optical film while maintaining the accuracy with which a reader reads position-information patterns in the bezel part. An optical film in accordance with the present disclosure includes: a display part; and a bezel part formed around the display part, wherein each of the display part and the bezel part has formed therein patterns indicating position information, and wherein the area occupied by the patterns per unit area in the bezel part is smaller than the area occupied by the patterns per unit area in the display part.Type: ApplicationFiled: August 29, 2016Publication date: December 15, 2016Inventors: Kazuhiro YAMADA, Takashi YAMADA