Patent Applications Published on March 2, 2017
-
Publication number: 20170060227
Abstract: A financial device may include: one or more modules each including a storage unit; and a main control unit configured to disconnect power to the one or more modules after operation information is stored in the storage unit, when entering a power saving mode. The storage unit may store the operation information of the corresponding module of the one or more modules, when entering the power saving mode.
Type: Application
Filed: September 2, 2016
Publication date: March 2, 2017
Applicant: LG CNS CO., LTD
Inventor: Maeng Cheol PARK
-
Publication number: 20170060228
Abstract: A power control subsystem for controlling the supply of power transmitted to at least one node over communication cabling, the power control subsystem comprising: a plurality of references; a plurality of comparators, each of the comparators being associated with a particular one of the plurality of references; and a current limiter in communication with the plurality of comparators and arranged to limit current of the power transmitted over communication cabling responsive to the plurality of comparators.
Type: Application
Filed: September 7, 2016
Publication date: March 2, 2017
Inventors: Amir Lehr, Ilan Atias, Dror Korcharz, David Pincu
-
Publication number: 20170060229
Abstract: One embodiment of the present invention provides a system that facilitates reducing static power consumption of a processor. During operation, the system receives a signal indicating that instruction execution within the processor is to be temporarily halted. In response to this signal, the system halts an instruction-processing portion of the processor, and reduces the voltage supplied to the instruction-processing portion of the processor. Full voltage is maintained to a remaining portion of the processor, so that the remaining portion of the processor can continue to operate while the instruction-processing portion of the processor is in reduced power mode.
Type: Application
Filed: November 11, 2016
Publication date: March 2, 2017
Inventor: Lynn R. Youngs
-
Publication number: 20170060230
Abstract: In a system for dynamic switching and merging of head, gesture and touch input in virtual reality, a virtual object may be selected by a user in response to a first input implementing one of a number of different input modes. Once selected, with focus established on the first object by the first input, the first object may be manipulated in the virtual world in response to a second input implementing another of the different input modes. In response to a third input, another object may be selected, and focus may be shifted from the first object to the second object if, for example, a priority value of the third input is higher than a priority value of the first input that established focus on the first object. If the priority value of the third input is less than the priority value of the first input that established focus on the first object, focus may remain on the first object.
Type: Application
Filed: August 26, 2015
Publication date: March 2, 2017
Inventors: Alexander James FAABORG, Manuel Christian CLEMENT, Chris McKENZIE
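The focus-switching rule described in the abstract above can be sketched as a small priority comparison. This is an illustrative interpretation only, not code from the application; the input modes, priority ordering, and all names (`FocusManager`, `request_focus`) are assumptions.

```python
# Hypothetical sketch of the focus rule in publication 20170060230:
# a new input takes focus away from the current holder only if its
# priority value is strictly higher. The priority ordering is assumed.
PRIORITY = {"gaze": 1, "gesture": 2, "touch": 3}

class FocusManager:
    def __init__(self):
        self.focused_object = None   # object currently holding focus
        self.focus_mode = None       # input mode that established focus

    def request_focus(self, obj, mode):
        """Grant focus to obj unless a higher- or equal-priority input
        mode already holds focus on another object."""
        if self.focused_object is None or PRIORITY[mode] > PRIORITY[self.focus_mode]:
            self.focused_object = obj
            self.focus_mode = mode
            return True
        return False  # focus remains on the first object
```

Under this reading, a touch input would steal focus from a gaze-selected object, but a gaze input would not steal focus from a touch-selected one.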
-
Publication number: 20170060231
Abstract: Disclosed are a function control method and an electronic device processing the method. An electronic device according to various embodiments may include a memory and a processor electronically connected to the memory. According to an embodiment, the processor may perform a control such that at least one function for the electronic device is performed, an emotion of a user for the performed function is determined, and a scheme for controlling the performed function according to the emotion of the user is output.
Type: Application
Filed: September 2, 2016
Publication date: March 2, 2017
Inventor: Kyung-Hwa Kim
-
Publication number: 20170060232
Abstract: Tracking eye movement during the completion of a form on a mobile computing device to determine possible errors and suggest changes to the form. To improve data quality, eye-tracking data is used to determine input fields on a form that cause issues for a user; based on the eye tracking data, suggestions are made to change a response or to modify the form.
Type: Application
Filed: August 25, 2015
Publication date: March 2, 2017
Inventors: Yoav Ben-Yair, Gil Fuchs, Itai Gordon, Ilan D. Prager
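One plausible reading of the abstract above is that fields attracting unusually long gaze dwell time are flagged as trouble spots. The sketch below is an assumption-laden illustration of that idea; the threshold, field names, and function name are invented, not taken from the application.

```python
# Illustrative sketch of publication 20170060232's idea: flag form
# fields whose total gaze dwell time (ms) exceeds a threshold, worst
# first. Threshold value and data shape are assumptions.
def flag_problem_fields(dwell_ms_by_field, threshold_ms=3000):
    """Return field names with dwell time above threshold_ms,
    sorted by dwell time in descending order."""
    flagged = [(field, ms) for field, ms in dwell_ms_by_field.items()
               if ms > threshold_ms]
    return [field for field, _ in sorted(flagged, key=lambda p: -p[1])]
```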
-
Publication number: 20170060233
Abstract: Tracking eye movement during the completion of a form on a mobile computing device to determine possible errors and suggest changes to the form. To improve data quality, eye-tracking data is used to determine input fields on a form that cause issues for a user; based on the eye tracking data, suggestions are made to change a response or to modify the form.
Type: Application
Filed: December 16, 2015
Publication date: March 2, 2017
Inventors: Yoav Ben-Yair, Gil Fuchs, Itai Gordon, Ilan D. Prager
-
Publication number: 20170060234
Abstract: A driver assistance apparatus can include a display unit, a camera configured to capture an image of an interior of a vehicle, and a processor configured to sense a direction of at least one of a gaze and a gesture of a driver based on images provided from the camera, detect a region corresponding to the direction among a plurality of predetermined regions of the vehicle, and control the display unit to output information related to the detected region in response to an occurrence of an event related to the detected region among a plurality of predetermined events.
Type: Application
Filed: August 9, 2016
Publication date: March 2, 2017
Applicant: LG Electronics Inc.
Inventor: Sinji Sung
-
Publication number: 20170060235
Abstract: The invention relates to a method for capturing fixations and viewing movements of the driver of a vehicle with a head-up display by way of observation using a camera (6) that receives an image of the head (2) of the driver which is reflected by a combiner (3) of the head-up display. According to the invention, at least some of the information displayed to the driver using the head-up display is selected depending on the fixations and viewing movements of the driver.
Type: Application
Filed: August 23, 2016
Publication date: March 2, 2017
Applicant: FORD GLOBAL TECHNOLOGIES, LLC
Inventors: Matus BANYAY, Marcus HAEFNER
-
Publication number: 20170060236
Abstract: Tracking eye movement during the completion of a form on a mobile computing device to determine possible errors and suggest changes to the form. To improve data quality, eye-tracking data is used to determine input fields on a form that cause issues for a user; based on the eye tracking data, suggestions are made to change a response or to modify the form.
Type: Application
Filed: November 1, 2016
Publication date: March 2, 2017
Inventors: Yoav Ben-Yair, Gil Fuchs, Itai Gordon, Ilan D. Prager
-
Publication number: 20170060237
Abstract: A control system includes an RFID device and an RFID reader antenna configured to receive a signal from the RFID device. The signal is associated with a command. A transmitter transmits the command to an electronic device to operate the electronic device.
Type: Application
Filed: November 14, 2016
Publication date: March 2, 2017
Inventor: Eric Pellaton
-
Publication number: 20170060238
Abstract: A warning apparatus for a computer comprises an input device and a wearable device. The wearable device is in wireless connection with the input device. The input device includes a sensor and a first microprocessor. The wearable device includes a body stimulating device. When the sensor detects the hand of the user, the sensor generates a sensing signal to the first microprocessor. If the first microprocessor continuously receives the sensing signal for a predetermined length of time, the first microprocessor generates a driving signal. In response to the driving signal, the body stimulating device generates a body stimulation signal to stimulate the user. Consequently, a warning function is achieved.
Type: Application
Filed: October 26, 2015
Publication date: March 2, 2017
Inventors: YING-CHE TSENG, CHENG-YI TSAI
-
Publication number: 20170060239
Abstract: Disclosed are a device and a method for providing tactile sensation by using electrostatic force between an electrode and a user. The device for providing tactile sensation includes a plurality of electrodes arranged on a substrate, and a dielectric substance layer formed on the substrate and the electrodes, wherein the electrodes electrify the dielectric substance layer with an electric charge according to a driving voltage, so as to generate electrostatic force that provides tactile stimuli to a user who comes into contact with the dielectric substance layer.
Type: Application
Filed: June 12, 2014
Publication date: March 2, 2017
Applicant: SAMSUNG ELECTRONICS CO., LTD
Inventors: Soo Chul LIM, Joonah PARK, Hyun Jeong LEE, Seung Ju HAN
-
Publication number: 20170060240
Abstract: An input device according to an embodiment includes an operation detection unit, at least one vibration element, a setting unit, and a vibration control unit. The operation detection unit detects a touch operation on an operation surface. The at least one vibration element vibrates the operation surface. The setting unit receives a setting where, at least, a content of the touch operation on the operation surface and a vibration parameter of the vibration element are associated with one another. The vibration control unit controls a vibration state of the vibration element based on the setting.
Type: Application
Filed: July 13, 2016
Publication date: March 2, 2017
Applicant: FUJITSU TEN LIMITED
Inventors: Masahiro IINO, Teruomi KUWANO
-
Publication number: 20170060241
Abstract: An input device according to a mode of an embodiment includes a detection unit, at least one vibration element, and a vibration control unit. The detection unit detects a contact position of a user on an operation surface. The vibration element vibrates the operation surface. The vibration control unit controls the vibration element in such a manner that a vibration state of the vibration element becomes a first vibration state when the contact position detected by the detection unit is in a predetermined region and that a vibration state of the vibration element becomes a second vibration state different from the first vibration state when the contact position is outside the predetermined region.
Type: Application
Filed: July 15, 2016
Publication date: March 2, 2017
Applicant: FUJITSU TEN LIMITED
Inventors: Shinsuke MATSUMOTO, Hitoshi TSUDA, Teru SAWADA, Yoshihiro NAKAO, Masahiro IINO
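The region-dependent vibration rule in the abstract above reduces to a point-in-region test selecting between two vibration states. The sketch below is only an illustration of that control flow; the rectangular region shape and the amplitude/frequency values are assumptions, not taken from the application.

```python
# Minimal sketch of publication 20170060241's rule: a first vibration
# state inside a predetermined region, a different second state
# outside it. Region is assumed rectangular; parameter values are made up.
def vibration_state(x, y, region):
    """region = (x0, y0, x1, y1). Return an assumed amplitude/frequency
    pair depending on whether (x, y) lies inside the region."""
    x0, y0, x1, y1 = region
    inside = x0 <= x <= x1 and y0 <= y <= y1
    if inside:
        return {"amplitude": 1.0, "hz": 200}  # first vibration state
    return {"amplitude": 0.3, "hz": 80}       # second vibration state
```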
-
Publication number: 20170060242
Abstract: A user interface includes both a touchscreen for tactile input and one or more lensless optical sensors for sensing additional, remote gestures. Users can interact with the user interface in a volume of space near the display, and are thus not constrained to the relatively small area of the touchscreen. Remote hand or face gestures can be used to turn on or otherwise alter the tactile user interface. Shared user interfaces can operate without touch, and thus avoid cross-contamination of, e.g., viruses and bacteria.
Type: Application
Filed: August 11, 2016
Publication date: March 2, 2017
Inventors: Patrick R. Gill, David G. Stork, Thomas Vogelsang
-
Publication number: 20170060243
Abstract: The present application relates generally to haptic feedback actuators and their construction and use in touch based systems. The haptic feedback actuators are suitably bilayer structures including at least two materials having different thermal coefficients, allowing the structure to deflect from a first position to a second position in response to heating and/or cooling of the structure.
Type: Application
Filed: August 18, 2016
Publication date: March 2, 2017
Inventor: Vahid KHOSHKAVA
-
Publication number: 20170060244
Abstract: The present application relates generally to haptic actuators. For example, the application is directed to high performance parallel plate actuators, and more particularly an actuator that can be used to provide haptic feedback in a variety of applications such as buttons, panels, track pads, touch panels, wearables, gaming devices and/or touch-sensitive surfaces.
Type: Application
Filed: August 18, 2016
Publication date: March 2, 2017
Inventors: Vahid KHOSHKAVA, Juan Manuel CRUZ-HERNANDEZ
-
Publication number: 20170060245
Abstract: An input device according to an embodiment includes an operation panel, a selection unit, a detector, a vibration element, a setting unit, and a vibration control unit. The selection unit selects a to-be-controlled object using the operation panel in accordance with the user's action. The detector detects touch operation on the operation panel. The vibration element vibrates the operation panel. The setting unit sets a vibration pattern of the vibration element appropriate to the touch operation detected by the detector, depending on the to-be-controlled object selected by the selection unit. The vibration control unit controls the vibration element to generate the vibration pattern set by the setting unit.
Type: Application
Filed: August 18, 2016
Publication date: March 2, 2017
Applicant: FUJITSU TEN LIMITED
Inventors: Osamu KUKIMOTO, Masahiro IINO, Yutaka MATSUNAMI, Hitoshi TSUDA, Teru SAWADA, Minoru MAEHATA
-
Publication number: 20170060246
Abstract: One variation of a method for remotely sharing touch includes: receiving a location of a touch input on a surface of a first computing device; receiving an image related to the touch input; displaying the image on a display of a second computing device, the second computing device comprising a dynamic tactile layer arranged over the display and defining a set of deformable regions, each deformable region in the set of deformable regions configured to expand from a retracted setting into an expanded setting; and transitioning a particular deformable region in the set of deformable regions from the retracted setting into the expanded setting, the particular deformable region defined within the dynamic tactile layer at a position corresponding to the location of the touch input.
Type: Application
Filed: November 9, 2016
Publication date: March 2, 2017
Applicant: Tactus Technology, Inc.
Inventor: Micah Yairi
-
Publication number: 20170060247
Abstract: The user interface system of the preferred embodiments includes a layer defining a tactile surface and including a first region and a particular region adjacent the first region; a substrate defining a fluid channel, cooperating with the layer at the particular region to define a cavity fluidly coupled to the fluid channel, coupled to the layer at the first region; a displacement device displacing fluid into the fluid channel into the cavity to transition the particular region from a retracted volume setting into an expanded volume setting, the particular region substantially level with the first region in the retracted volume setting and elevated above the first region in the expanded volume setting; and a sensor including a first conductor and a second conductor coupled to the substrate and adjacent the cavity, the first conductor offset from the second conductor and capacitively coupled to the second conductor.
Type: Application
Filed: November 10, 2016
Publication date: March 2, 2017
Applicant: Tactus Technology, Inc.
Inventors: Craig Michael Ciesla, Micah B. Yairi
-
Publication number: 20170060248
Abstract: A flexible device includes a bendable-foldable display that has bendable flaps connected by a hinge. The display has sensors for detecting a folding characteristic between the at least two flaps and for detecting a bending characteristic in at least one flap. The display has a haptic system with haptic output devices, where the haptic system receives input from the sensors indicating deformation of the bendable-foldable display device. A flexible device also includes bendable, foldable, or rollable displays that have sensors and actuators to augment user interaction with the device. Based on one or more measurements provided by the input, the haptic system interprets the input to determine deformation characteristics of the bendable-foldable display device. The haptic system generates haptic feedback based on the deformation characteristics.
Type: Application
Filed: November 14, 2016
Publication date: March 2, 2017
Inventors: Ali MODARRES, Vincent LEVESQUE, Danny GRANT, Juan Manuel CRUZ-HERNANDEZ
-
Publication number: 20170060249
Abstract: A system and method are disclosed for a morphable pad and display configured for tactile control. The system comprises a display for displaying a user interface comprising a layout of vehicle control features. The display is configured to highlight a portion of the layout associated with a received highlight input, and to update the layout based on a received selection input. A morphable pad is connected to the display and comprises an array of switches. Each switch is configured to receive highlight input and selection input. The switches are also configured to adjust in tactile feel to match the layout, and to reconfigure in tactile feel responsive to a change in the layout.
Type: Application
Filed: November 16, 2016
Publication date: March 2, 2017
Inventors: Nicholas A. Scheufler, Steven Feit, Dave Jaeyeong Choi, Ross C. Miller
-
Publication number: 20170060250
Abstract: A method, a system, and a non-transitory computer readable medium are disclosed for real-time interaction with a user interface recognizing a gesture.
Type: Application
Filed: August 31, 2015
Publication date: March 2, 2017
Applicant: KONICA MINOLTA LABORATORY U.S.A., INC.
Inventors: Nandita M. NAYAK, Ghassem TOFIGHI, Haisong GU
-
Publication number: 20170060251
Abstract: A mobile device responsive to hand gestures or hand motions detected by a camera. The mobile device comprises: i) transmit path circuitry and receive path circuitry configured to communicate with a wireless network; ii) a memory configured to store a plurality of application programs; iii) a digital camera configured to record an image and to generate a live video stream; and iv) processing circuitry configured to analyze the live video stream and to detect therein a gesture made by a person in the recorded image. In response to detection of the gesture, the processing circuitry performs an operation associated with the detected gesture, such as taking a picture of the image or playing music.
Type: Application
Filed: September 1, 2015
Publication date: March 2, 2017
Applicant: Samsung Electronics Co., Ltd.
Inventor: Sejin CHOI
-
Publication number: 20170060252
Abstract: An eyeglasses-type wearable device of an embodiment can handle various data inputs. The device includes right and left eye frames corresponding to positions of right and left eyes and nose pads corresponding to a position of a nose. Eye motion detection electrodes (sightline detection sensor electrodes) are provided with the nose pads to detect the eye motion of a user. Transmitter/receiver electrodes (capacitance sensor electrodes) of a gesture detector are provided with a part of the right and left eye frames to detect a gesture of the user. Various data inputs are achieved by a combination of input A corresponding to a gesture of the user detected by the gesture detector and input B corresponding to the eye motion of the user detected by the eye motion detector.
Type: Application
Filed: December 22, 2015
Publication date: March 2, 2017
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors: Hiroaki Komaki, Akira Tanaka, Kenichi Doniwa, Hiroki Kumagai, Takashi Sudo, Yasuhiro Kanishima, Nobuhide Okabayashi
-
Publication number: 20170060253
Abstract: A method for processing a three-dimensional (3D) object may include acquiring, based on an interaction of a user with at least one 3D object displayed on a 3D display, location information and depth information of pixels corresponding to the interaction. The method may include processing the at least one 3D object based on whether the location information and the depth information satisfy a depth continuity condition.
Type: Application
Filed: February 18, 2016
Publication date: March 2, 2017
Inventors: Dongwoo KANG, Hyoseok HWANG, YangHo CHO
-
Publication number: 20170060254
Abstract: An apparatus and method for gesture detection and recognition. The apparatus includes a processing element, a radar sensor, a depth sensor, and an optical sensor. The radar sensor, the depth sensor, and the optical sensor are coupled to the processing element, and the radar sensor, the depth sensor, and the optical sensor are configured for short range gesture detection and recognition. The processing element is further configured to detect and recognize a hand gesture based on data acquired with the radar sensor, the depth sensor, and the optical sensor.
Type: Application
Filed: March 3, 2016
Publication date: March 2, 2017
Inventors: Pavlo MOLCHANOV, Shalini GUPTA, Kihwan KIM, Kari PULLI
-
Publication number: 20170060255
Abstract: An object detection apparatus is provided. The object detection apparatus includes a storage configured to store a plurality of detectors respectively trained to detect an object from different viewpoints; an image receiver configured to receive an image captured by an image capturing apparatus from a viewpoint, wherein an object is captured within the image; and a controller configured to detect the object in the image by applying a detector corresponding to the viewpoint from which the image is captured from among the plurality of detectors.
Type: Application
Filed: July 19, 2016
Publication date: March 2, 2017
Applicant: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Mikiyas TESHOME, Nam-su HA
-
Publication number: 20170060256
Abstract: A facial gesture controller for controlling an electronic device is provided. The facial gesture controller includes a body, a plurality of sensors, and at least one processor. The body is configured to fit over the face of a user. The plurality of sensors are disposed on the body. The at least one processor is in electrical communication with the sensors. The sensors generate and transmit electrical signals to the at least one processor in response to movements of one or more facial muscles of the user. The at least one processor generates and transmits control signals corresponding to the electrical signals, received from the sensors, to the electronic device.
Type: Application
Filed: August 31, 2016
Publication date: March 2, 2017
Applicant: REACH BIONICS, INC.
Inventors: Sandy Lawrence Heck, Michael Rollin, Eric Conley
-
Publication number: 20170060257
Abstract: In one embodiment, a method includes identifying a gesture made by a user of the computing device with respect to one or more surfaces of the computing device, the gesture comprising a single trajectory in three dimensions including: an earlier portion in a first direction along at least one of the surfaces; and immediately following the earlier portion of the single trajectory, a later portion in a second direction comprising a second series of points distant from the surfaces, wherein the second direction comprises a deflection from the first direction that follows through on the earlier portion of the single trajectory; determining a user input based at least in part on a speed of the gesture along the earlier portion of the single trajectory and a speed of the gesture along the later portion of the single trajectory; and executing one or more actions based on the user input.
Type: Application
Filed: November 9, 2016
Publication date: March 2, 2017
Inventor: Luke St. Clair
-
Publication number: 20170060258
Abstract: Input devices having a thin profile and good comfort and feel are disclosed. In embodiments, an input device includes a sensor layer configured to effectuate a key press upon application of an activation force, and a collapsible layer coupled to the sensor layer. The collapsible layer may be configured to collapse in response to a collapsing force that is substantially equal to the activation force. When the collapsible layer collapses, the sensor layer may simultaneously effectuate the key press in response to application of the collapsing force.
Type: Application
Filed: August 31, 2015
Publication date: March 2, 2017
Inventors: Samuel Siegfried, Jean-Claude Dunant, Baptiste Merminod, Regis Croisonnier, Olivier Theytaz
-
Publication number: 20170060259
Abstract: An embodiment of the present disclosure discloses an information processing method and an electronic device. The method comprises: a fingerprint service entering a fingerprint-verification stage when the electronic device enters a fingerprint-input stage, so that a fingerprint texture collected by a collecting region in an input surface of a key of the electronic device can be verified; the fingerprint service setting a flag bit; a keystroke service obtaining the flag bit when the key is pressed; the keystroke service disabling a corresponding operation instruction in response to the pressing of the key, based on the flag bit; collecting the fingerprint texture by the collecting region in the input surface of the key when the key is pressed; and the fingerprint service verifying the fingerprint based on the fingerprint texture.
Type: Application
Filed: December 22, 2015
Publication date: March 2, 2017
Inventors: Liangyin Yang, Xuguo Liu, Weixian Guo
-
Publication number: 20170060260
Abstract: The present disclosure relates to methods and devices for connecting external equipment. The method may include, in response to detecting that the external equipment is connected with a smart terminal, acquiring description information of the external equipment. The method may further include determining a keyboard type of the external equipment, based upon the acquired description information of the external equipment. The method may further include generating an equipment acknowledgement of, and establishing a connection to, the external equipment, based upon the determined keyboard type.
Type: Application
Filed: June 13, 2016
Publication date: March 2, 2017
Applicant: Xiaomi Inc.
Inventors: Zhengan WANG, Laijun YAN, Junzhou WU
-
Publication number: 20170060261
Abstract: The present invention is a method to type phonetic languages utilizing an abbreviated keyboard with fewer keys available physically or virtually than required to accommodate each character. The method provides for the phonetic language to be typed with three or fewer keystrokes and without a timer.
Type: Application
Filed: August 29, 2016
Publication date: March 2, 2017
Inventor: Aberra Molla
-
Publication number: 20170060262
Abstract: A packaging system and method for providing information through a packaging system. A cover page of the packaging system is displayed. A determination of a page accessed with the packaging system is made. A display action associated with the page of the packaging system is performed in response to a user turning to the next page.
Type: Application
Filed: August 23, 2016
Publication date: March 2, 2017
Inventors: Nikolaj Hviid, Arne D. Loermann, Matthias Lackus
-
Publication number: 20170060263
Abstract: One example includes a display device. The display device includes an electronic paper display imageable by receiving charges on an imaging surface of the electronic paper display. The display device includes an embedded chip to enable writing to the electronic paper display based on a successful authentication.
Type: Application
Filed: July 29, 2014
Publication date: March 2, 2017
Inventors: Henryk Birecki, Omer Gila, Boris Balacheff, Napoleon Leoni, Steven J Simske
-
Publication number: 20170060264
Abstract: Systems, methods, and computer-readable media for enabling efficient control of a media application at a media electronic device by a user electronic device are provided.
Type: Application
Filed: August 24, 2015
Publication date: March 2, 2017
Inventors: Jacques P. Gasselin de Richebourg, Norman N. Wang, Bruno M. Sommer, Ross R. Dexter
-
Publication number: 20170060265
Abstract: A computer-implemented method manages displayed content on a reshaped flexible display. One or more processors detect a location of a bend in a flexible display, where the bend reshapes the flexible display to define at least two sections of the flexible display. One or more processors identify a type of application being used to generate content that is displayed on the flexible display, and then divide the content into a first content portion and a second content portion, based on the type of application being used. One or more processors then display the first content portion on a first section of the flexible display and the second content portion on a second section of the flexible display.
Type: Application
Filed: August 27, 2015
Publication date: March 2, 2017
Inventors: James E. Bostick, John M. Ganci, JR., Sarbajit K. Rakshit, Kimberly G. Starks
-
Publication number: 20170060266
Abstract: A screen control method is provided. The method includes: detecting, by a mobile terminal, a moving direction of the mobile terminal; detecting, by the mobile terminal, an orientation of a screen of the mobile terminal; receiving, by the mobile terminal, posture information sent by a wearable device, the posture information including a palm orientation of a user of the mobile terminal; sending, by the mobile terminal, a screen recognition instruction to smart glasses when each of the moving direction, the orientation of the screen, and the palm orientation is in a first direction; powering on the screen when screen recognition success information sent by the smart glasses is received; and powering off the screen when each of the moving direction, the orientation of the screen, and the palm orientation is in a second direction, the second direction being different from the first direction.
Type: Application
Filed: July 29, 2016
Publication date: March 2, 2017
Inventors: Yi Gao, Hongqiang Wang, Yunyuan Ge
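The screen-control method in the abstract above is, at its core, an agreement check across three direction signals plus a confirmation step from the glasses. The sketch below is a speculative rendering of that decision logic only; the direction labels, return values, and function name are all assumptions.

```python
# Hedged sketch of the decision logic in publication 20170060266:
# if moving direction, screen orientation, and palm orientation all
# agree on the "first direction", request screen recognition and power
# on upon confirmation; if all agree on the "second direction", power
# off. "up"/"down" stand in for the unspecified directions.
def screen_action(moving, screen, palm, glasses_confirmed=False):
    """Return 'on', 'off', 'recognize', or None depending on
    agreement of the three reported directions."""
    if moving == screen == palm == "up":        # assumed first direction
        return "on" if glasses_confirmed else "recognize"
    if moving == screen == palm == "down":      # assumed second direction
        return "off"
    return None  # directions disagree: no action
```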
-
Publication number: 20170060267
Abstract: A processor connected to one or more displays shaped to affix to a fingernail for displaying an image.
Type: Application
Filed: August 5, 2016
Publication date: March 2, 2017
Inventor: GREGORY A. PICCIONELLI
-
Publication number: 20170060268
Abstract: A device for integrating a position, an attitude, and a wireless transmission is disclosed. The device includes an electrical connection substrate, a processor unit, a wireless communication module, and a set of sensors. The wireless communication module is electrically coupled to the processor unit via the electrical connection substrate. The set of sensors is electrically coupled to the processor unit. The processor unit and the wireless communication module are packaged as a monolithic package structure on the electrical connection substrate. The device for integrating the position, the attitude, and the wireless transmission can be manufactured as a miniaturization device. Accordingly, the present invention can be applied to a wearable device and applied to a game in which an absolute positioning is required.
Type: Application
Filed: August 17, 2016
Publication date: March 2, 2017
Applicant: PRINCO CORP.
Inventors: Chen-Ping Chiu, Chih-Kuang Yang, Cheng-Yi Chang
-
Publication number: 20170060269
Abstract: An earpiece includes an earpiece housing, a processor disposed within the earpiece housing, a gesture based interface operatively connected to the processor and configured to detect changes in an energy field associated with user gestures, and at least one sensor operatively connected to the processor for determining positioning of the earpiece. The processor is configured to interpret the changes in the energy field to determine the user gestures. The processor is configured to activate the earpiece based on the positioning of the earpiece.
Type: Application
Filed: August 23, 2016
Publication date: March 2, 2017
Inventors: Friedrich Christian Förstner, Eric Christian Hirsch, Nikolaj Hviid
-
Publication number: 20170060270Abstract: Systems and methods for customizing behavior of a computing system based on details of interactions with the computing system by a user, such as a direction, intensity, or magnitude of a particular input from a user input device.Type: ApplicationFiled: November 9, 2016Publication date: March 2, 2017Inventor: Evan K. Fram
-
Publication number: 20170060271Abstract: An input device includes four coil bodies held by a holder and four movable magnetic pole formation portions held by a movable body. The coil bodies are arranged two-by-two in the x-axis and y-axis directions and arranged in a crisscross pattern in which a center region is surrounded on four sides by the four coil bodies. The magnetic pole formation portions are arranged two-by-two in the x-axis and y-axis directions so that the polarities alternately change. Each magnetic pole formation portion has a facing surface that faces two of the four coil bodies in a winding axis direction of the coil bodies. The movable body is arranged to be movable relative to the holder in response to receipt of an operation force.Type: ApplicationFiled: April 15, 2014Publication date: March 2, 2017Inventor: Shinsuke HISATSUGU
-
Publication number: 20170060272Abstract: An electronic smart pen is disclosed that comprises a housing with a twist ring and a marker that is configured to be in an exposed state or in a retracted state. In the exposed state a tip of the marker is exposed from the housing, while in the retracted state the tip is enclosed by the housing. The smart pen also comprises an internal power switch that toggles the electronics of the smart pen between an on-state and an off-state. Rotating the twist ring provides a combined mechanism to move the marker from the retracted state to the exposed state, while also toggling the power switch from the off-state to the on-state, so that the marker is automatically extended when the pen is turned on.Type: ApplicationFiled: May 2, 2016Publication date: March 2, 2017Inventors: Christopher Wheaton, Chi Kin Benjamin Leung, Gregory Robert Cerny, Kyle Aya Naydo
-
Publication number: 20170060273Abstract: Various embodiments provide an object recognition process that is configured to detect a passive stylus and reject non-passive stylus objects on a touch screen, including an edge portion of the touch screen. In one embodiment, the object recognition process includes receiving sense signals from sense elements of a sense array in response to a touch object being on the sense array, selecting three sense signals from three respective sense elements, calculating a first sum of the strengths of the three selected signals, calculating a second sum of the strengths of the two selected signals whose strengths exceed that of the remaining selected signal; and determining a type of the object (e.g., a passive stylus or a user hand's grip shadow) based on the first sum and the second sum.Type: ApplicationFiled: August 10, 2016Publication date: March 2, 2017Inventors: Oleksiy Savitskyy, Oleksandr Karpin, Igor Kravets
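The two-sum comparison in this abstract can be illustrated with a short sketch. The threshold values, the ratio-based decision rule, and the function name below are illustrative assumptions, not details taken from the published application:

```python
def classify_touch(signals, sum_threshold=300, ratio_threshold=0.9):
    """Classify a touch as a passive stylus or a hand's grip shadow.

    Sketch only: the thresholds and the decision rule are assumed,
    not specified by the abstract.
    """
    # Select the three strongest sense signals from the sense array.
    s = sorted(signals, reverse=True)[:3]
    first_sum = sum(s)          # sum of all three selected signals
    second_sum = s[0] + s[1]    # sum of the two stronger signals
    # A passive stylus tends to produce a narrow, low-energy peak:
    # the weakest of the three signals contributes little, so the
    # second sum accounts for most of the first.
    if first_sum < sum_threshold and second_sum / first_sum > ratio_threshold:
        return "passive_stylus"
    return "grip_shadow"
```

A broad, strong contact such as a gripping hand spreads its energy across all three elements, lowering the ratio and raising the total, which is what the two sums distinguish.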
-
Publication number: 20170060274Abstract: A method may be executed by one or more active capacitive styluses and a sensor controller connected to sensor electrodes. The method includes: a discovery step, executed by the sensor controller, of repeatedly sending out a discovery packet for detecting any of the active capacitive styluses; a discovery response step, executed by a first active capacitive stylus among the one or more active capacitive styluses, by which the discovery packet is detected, of returning a response packet to the discovery packet; a configuration step, executed by the sensor controller, of transmitting a configuration packet including time slot designation information that designates a first time slot to the first active capacitive stylus; and a data transmission step, executed by the first active capacitive stylus, of transmitting operation state data indicative of an operation state of the first active capacitive stylus using the designated first time slot.Type: ApplicationFiled: November 10, 2016Publication date: March 2, 2017Inventor: Naoki Watanabe
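The discovery, configuration, and time-slot steps above form a small handshake that can be sketched as follows. The class and method names, packet fields, and slot-assignment policy are assumptions for illustration; the abstract only fixes the sequence of packets:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Stylus:
    stylus_id: int
    time_slot: Optional[int] = None

    def on_discovery(self):
        # Discovery response step: a stylus that detects the
        # discovery packet returns a response packet.
        return {"stylus_id": self.stylus_id}

    def on_configuration(self, packet):
        # Configuration step: the packet carries time slot
        # designation information naming the slot to use.
        self.time_slot = packet["time_slot"]

@dataclass
class SensorController:
    next_slot: int = 0

    def discover(self, stylus):
        # The controller repeatedly broadcasts discovery packets;
        # one exchange with one responding stylus is modeled here.
        response = stylus.on_discovery()
        stylus.on_configuration({"time_slot": self.next_slot})
        self.next_slot += 1
        return response["stylus_id"]
```

After this exchange each stylus transmits its operation state data only in its designated slot, which is how multiple active styluses share one sensor without colliding.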
-
Publication number: 20170060275Abstract: A hand-held device with a sensor for providing a signal indicative of a position of the hand-held device relative to an object surface enables power to the sensor at a first time interval when the hand-held device is indicated to be in a position that is stationary and adjacent relative to the object surface, enables power to the sensor at a second time interval shorter than the first time interval when the hand-held device is indicated to be in a position that is moving and adjacent relative to the object surface, and enables power to the sensor at a third time interval when the hand-held device is determined to be in a position that is removed relative to the object surface.Type: ApplicationFiled: November 11, 2016Publication date: March 2, 2017Inventors: Stephen Brian Gates, Jeremy K. Black
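The three polling regimes described above reduce to a simple state-to-interval mapping. The concrete millisecond values below are illustrative assumptions; the abstract only requires that the moving interval be shorter than the stationary one, with a separate interval when the device is lifted off the surface:

```python
def sensor_poll_interval(adjacent, moving,
                         stationary_ms=100, moving_ms=10, removed_ms=500):
    """Return how often (in ms) to enable power to the position sensor.

    Sketch only: interval values are assumed, not taken from the patent.
    """
    if not adjacent:
        return removed_ms      # third interval: removed from the surface
    if moving:
        return moving_ms       # second, shorter interval: moving
    return stationary_ms       # first interval: stationary and adjacent
```

Polling slowly while stationary or removed saves power, while the short interval during movement preserves tracking fidelity when it matters.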
-
Publication number: 20170060276Abstract: An active stylus includes an electrode and a controller. The controller is configured to 1) generate a report including stylus information represented by a plurality of bits including a first subset of bits and a second subset of bits, 2) encode the first subset of bits differently than the second subset of bits to reduce a size of the report, and 3) excite the electrode with a carrier signal to form an electrostatic communication channel, the carrier signal being modulated to transmit the report via the electrostatic communication channel.Type: ApplicationFiled: September 1, 2015Publication date: March 2, 2017Inventors: Tianzhu Qiao, Jonathan Westhues
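One way to picture encoding two subsets of report bits differently is a delta scheme: button bits are always sent raw, while pressure bits are delta-encoded against the previous report when the change is small. The field widths, the 4-bit delta scheme, and the function name are assumptions for illustration, not the encoding claimed by the application:

```python
def encode_report(pressure, buttons, prev_pressure):
    """Build a compact stylus report as (bits, bit_count).

    Sketch only: pressure bits (first subset) are delta-encoded
    when possible; button bits (second subset) are sent raw.
    """
    delta = pressure - prev_pressure
    if -8 <= delta < 8:
        # Short form: 1 flag bit + 4-bit signed delta + 2 button bits.
        bits = (1 << 6) | ((delta & 0xF) << 2) | (buttons & 0x3)
        nbits = 7
    else:
        # Long form: flag 0 + 12-bit absolute pressure + 2 button bits.
        bits = ((pressure & 0xFFF) << 2) | (buttons & 0x3)
        nbits = 15
    return bits, nbits
```

When pressure changes slowly, most reports take the 7-bit short form, roughly halving the payload carried over the electrostatic channel.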