Display Peripheral Interface Input Device Patents (Class 345/156)
  • Patent number: 10692668
    Abstract: An accessory device having a first body and a second body is described. The second body may include a force feedback mechanism that responds to an input received at the first body by causing a force feedback. The force feedback mechanism includes coils distributed throughout the second body, with each coil configured to receive an electrical current such that each coil provides an external magnetic field to magnetically couple with one or more magnets disposed in the first body, causing the first body to move in a direction toward the second body. The first body may return to its original position when the external magnetic field is no longer applied. In this regard, the movement of the first body defines the force feedback. Also, the first body may include a keyboard or a touch screen functioning with the force feedback mechanism. The accessory device may be used in conjunction with an electronic device.
    Type: Grant
    Filed: July 20, 2016
    Date of Patent: June 23, 2020
    Assignee: Apple Inc.
    Inventor: James A. Stryker
  • Patent number: 10692338
    Abstract: A haptic output device includes an actuator that provides a haptic effect and a signal transmitter that transmits a driving signal and a braking signal to the actuator. The driving signal includes a first driving waveform in a first half cycle, a second driving waveform, following the first driving waveform, in a second half cycle, and a third driving waveform, following the second driving waveform, in a third half cycle. The braking signal includes a braking waveform following the third driving waveform. The polarities of the voltage values of the first driving waveform and second driving waveform are opposite to each other. The polarities of the voltage values of the second driving waveform and third driving waveform are opposite to each other. The braking waveform has a phase opposite to the phase of the driving signal. The third driving waveform has a higher frequency than the first driving waveform.
    Type: Grant
    Filed: March 27, 2019
    Date of Patent: June 23, 2020
    Assignee: NIDEC SEIMITSU CORPORATION
    Inventor: Naoki Kanai
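    A minimal Python sketch of the drive-then-brake waveform described in patent 10692338 above: three driving half-cycles with alternating polarity, the third at a higher frequency, followed by a braking waveform in opposite phase. The frequencies, amplitude, sample rate, and braking shape are illustrative assumptions, not values from the patent.
      import math

      def half_cycle(freq_hz, amplitude, sample_rate, invert=False):
          """One half sine cycle at the given frequency; `invert` flips polarity."""
          n = int(sample_rate / (2 * freq_hz))
          sign = -1.0 if invert else 1.0
          return [sign * amplitude * math.sin(math.pi * i / n) for i in range(n)]

      def drive_and_brake(sample_rate=8000, f_drive=170.0, f_third=220.0, amp=1.0):
          """Build a drive signal (three alternating-polarity half cycles, the
          third at a higher frequency) followed by an opposite-phase braking wave."""
          first  = half_cycle(f_drive, amp, sample_rate, invert=False)   # + polarity
          second = half_cycle(f_drive, amp, sample_rate, invert=True)    # - polarity
          third  = half_cycle(f_third, amp, sample_rate, invert=False)   # + polarity, higher frequency
          # Braking: a half cycle opposing the drive to damp residual vibration
          # quickly (illustrative choice of amplitude and duration).
          brake  = half_cycle(f_third, amp * 0.8, sample_rate, invert=True)
          return first + second + third + brake

      if __name__ == "__main__":
          samples = drive_and_brake()
          print(f"{len(samples)} samples, peak {max(abs(s) for s in samples):.2f}")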
  • Patent number: 10691215
    Abstract: An apparatus for manipulating an object includes first and second gesture controllers, each operatively connected to the object and structured and programmed such that, in a first-action active state, each can cause a first action to be carried out on the object by an appropriate first-action gesture made in the gesture controller. Only one of the first and second gesture controllers at any given time is capable of being in the first-action active state, and the first-action active state is transferable between the first and second gesture controllers upon detection of a first-action transfer gesture by one of said first gesture controller and said second gesture controller. Specific gesture control apparatus and methods for manipulating an object are also disclosed.
    Type: Grant
    Filed: February 2, 2015
    Date of Patent: June 23, 2020
    Assignee: NANOTRONICS IMAGING, INC.
    Inventors: Matthew C. Putman, John B. Putman, Paul Roossin
  • Patent number: 10692333
    Abstract: An electronic device receives an incoming communication and determines that the device is in a first use context. In response to receiving the incoming communication, the device provides first feedback that includes a first ongoing audio output that corresponds to the first use context and a first ongoing tactile output with a first tactile output profile that corresponds to the first use context. While providing the first ongoing audio output and the first ongoing tactile output, the device detects that the electronic device is in a second use context, different from the first use context. In response to detecting that the electronic device is in the second use context, the device provides second feedback that includes a second ongoing tactile output that has a second tactile output profile that corresponds to the second use context.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: June 23, 2020
    Assignee: Apple Inc.
    Inventors: Camille Moussette, Duncan R. Kerr, Joshua B. Kopin, Miao He, Jules K. Fennis, Hugo D. Verweij, Matthew I. Brown, Marcos Alonso Ruiz, Afrooz Family
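    A small Python sketch of the context-dependent feedback selection in patent 10692333 above: an incoming communication starts feedback whose audio and tactile output profiles are looked up from the current use context, and a detected context change swaps in the new context's profiles while the feedback is still ongoing. The context names and profile values are invented for the example.
      from dataclasses import dataclass

      @dataclass
      class FeedbackProfile:
          audio: str                 # ongoing audio output for this use context
          tactile_amplitude: float   # part of the tactile output profile
          tactile_pattern: str

      # Hypothetical per-context profiles (the patent's contexts are not named here).
      PROFILES = {
          "in_pocket": FeedbackProfile("ringtone_loud", 1.0, "long_buzz"),
          "in_hand":   FeedbackProfile("ringtone_soft", 0.3, "short_tick"),
      }

      class IncomingCallFeedback:
          """Keeps the ongoing feedback matched to the device's current use context."""

          def __init__(self, context: str):
              self.profile = PROFILES[context]       # first feedback for the first context

          def on_context_change(self, new_context: str) -> None:
              # Swap to the second context's profiles without stopping the feedback.
              self.profile = PROFILES[new_context]

      if __name__ == "__main__":
          fb = IncomingCallFeedback("in_pocket")
          print(fb.profile)
          fb.on_context_change("in_hand")            # user picks the ringing phone up
          print(fb.profile)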
  • Patent number: 10691744
    Abstract: Systems and methods are described herein to determine data associated with affiliated color palettes identified from keyword searches of color palettes. Color palettes may be searched by name or other data associated with the color palettes. Affiliated color palettes may be determined based at least in part on an input color. Furthermore, affiliated colors can be determined based at least in part on votes and/or rankings. The items and/or images associated with affiliated color palettes may be identified. Various user interfaces may be based at least in part on the keyword searches of color palettes and/or determination of affiliated color palettes.
    Type: Grant
    Filed: June 26, 2014
    Date of Patent: June 23, 2020
    Assignee: Amazon Technologies, Inc.
    Inventors: Charles Shearer Dorner, Jenny Ann Blackburn, Eva Manolis, Timothy Andrew Ong, Paul Barnhart Sayre, III
  • Patent number: 10692336
    Abstract: An encoder and encoding method map haptic effects for various haptic channels to a plurality of areas of a body model. The haptic channels represent temperature, vibration, or similar effects for use by haptic actuators, for example. Each channel comprises signals that represent a timestamp, face locations and identifiers, and spatial and temporal resolution values to control effect resolutions for various parts of the body for a particular effect controlled by that channel. The haptic channels are multiplexed together to form a composite signal that can control various effects at a plurality of body locations.
    Type: Grant
    Filed: June 24, 2016
    Date of Patent: June 23, 2020
    Assignee: INTERDIGITAL VC HOLDINGS, INC.
    Inventors: Julien Fleureau, Olivier Dumas, Bertrand Leroy, Fabien Danieau
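    A rough Python sketch of how per-channel haptic effects keyed to body-model faces might be packed into one composite stream, as described in patent 10692336 above. The field names (timestamp, face identifiers, spatial/temporal resolution) follow the abstract; the container layout itself is a hypothetical illustration.
      from dataclasses import dataclass, field
      from typing import List, Dict

      @dataclass
      class HapticChannel:
          """One effect channel (e.g. temperature or vibration) targeting body faces."""
          effect: str                      # "vibration", "temperature", ...
          timestamp_ms: int                # when the effect starts
          face_ids: List[int]              # faces of the body model receiving the effect
          spatial_resolution: int          # how finely the effect varies across faces
          temporal_resolution_hz: float    # how often the effect is updated
          samples: List[float] = field(default_factory=list)

      def multiplex(channels: List[HapticChannel]) -> List[Dict]:
          """Interleave channel records into one composite signal ordered by time."""
          records = []
          for ch_index, ch in enumerate(channels):
              records.append({"channel": ch_index, "t": ch.timestamp_ms,
                              "effect": ch.effect, "faces": ch.face_ids,
                              "sres": ch.spatial_resolution,
                              "tres": ch.temporal_resolution_hz,
                              "payload": ch.samples})
          return sorted(records, key=lambda r: (r["t"], r["channel"]))

      if __name__ == "__main__":
          vib  = HapticChannel("vibration", 0, [12, 13], 4, 200.0, [0.2, 0.8, 0.4])
          temp = HapticChannel("temperature", 50, [40], 1, 2.0, [0.1])
          for rec in multiplex([vib, temp]):
              print(rec["t"], rec["effect"], rec["faces"])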
  • Patent number: 10691199
    Abstract: Methods and systems are provided for enabling a user to perform a full body movement while viewing a virtual reality environment on a heads-up display without interfering with viewing of content on the heads-up display. Specifically, a full body movement of the user is detected. In response to detecting the full body movement, additional content is generated for display in a portion of the virtual reality environment corresponding to a foreground area of the user's visual field. The additional content assists the user in performing the full body movement.
    Type: Grant
    Filed: June 15, 2018
    Date of Patent: June 23, 2020
    Assignee: Rovi Guides, Inc.
    Inventor: Ajit Shanware
  • Patent number: 10691893
    Abstract: A method, system and computer program product are disclosed for recommending terms in a document based on a specified interaction with the document. In one embodiment, the method comprises displaying a document on a device; detecting a specified interaction with the document displayed on the device; identifying text associated with the specified interaction with the document displayed on the device; performing object recognition with the identified text to recognize one or more defined interested objects; and recommending terms in the document based on the recognized defined interested objects. In an embodiment, the object recognition includes identifying one or more objects from the text, and performing the object recognition with these objects to recognize the one or more interested objects. In an embodiment, one or more terms are extracted from the text, and these terms are partitioned to identify the one or more objects from the text.
    Type: Grant
    Filed: February 29, 2016
    Date of Patent: June 23, 2020
    Assignee: International Business Machines Corporation
    Inventors: Min Gong, Yuan Ni, Junchi Yan, Hui J. Zhu
  • Patent number: 10688388
    Abstract: An operation apparatus mountable to one hand of a user includes: a first operation section that is located on a front surface side and that can be operated by the thumb of the one hand; a second operation section that is located on a back surface side and that can be operated by a finger other than the thumb; and a first contact surface with which the palm of the one hand makes contact. At least one of the first contact surface and an extension plane of the first contact surface is inclined relative to a virtual plane in such a direction as to be spaced more therefrom in going in a first direction, the virtual plane being defined by the first direction from the front surface side toward the back surface side and a second direction being orthogonal to the first direction and directed from a bottom surface side toward a top surface side.
    Type: Grant
    Filed: December 7, 2016
    Date of Patent: June 23, 2020
    Assignee: SONY INTERACTIVE ENTERTAINMENT INC.
    Inventors: Kunihito Sawai, Yuichi Machida
  • Patent number: 10691226
    Abstract: An input device detection system includes a foundation, a cover plate and a trajectory detection plate. The foundation includes a base plate and plural first retractable rods. The base plate has a top surface, a bottom surface and a detection hole. The input device is pushed by the plural first retractable rods along a horizontal direction. The input device is pushed by plural second retractable rods of the cover plate along a vertical direction. The trajectory detection plate is located under the base plate. An optical sensing module of the input device emits a light beam. The light beam is transmitted through the detection hole and projected to the trajectory detection plate. When the trajectory detection plate is moved relative to the foundation and the light beam is reflected to the optical sensing module by the trajectory detection plate, a trajectory signal is generated.
    Type: Grant
    Filed: February 27, 2018
    Date of Patent: June 23, 2020
    Assignee: PRIMAX ELECTRONICS LTD
    Inventors: Cheng-Yi Tsai, Ying-Che Tseng
  • Patent number: 10691287
    Abstract: A point target can be indicated easily and accurately without disturbing the visibility of the point target and its neighboring display information. A touch-panel information terminal device comprises means for displaying, on a display screen, a touchable pointer comprising a pointer part for indicating a point target displayed on the display screen and an operation part for a user to perform a touch operation, and means for integrally moving the display positions of the operation part and the pointer part on the display screen in accordance with the user's touch operation on the operation part of the touchable pointer.
    Type: Grant
    Filed: July 25, 2016
    Date of Patent: June 23, 2020
    Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
    Inventors: Shinichi Doi, Takayuki Nakamura, Eiji Kobayashi, Hiroyuki Tanaka
  • Patent number: 10691204
    Abstract: A finger mounted computer input device is provided. The device includes a housing adapted to be worn on a finger, a pressure response unit configured to convert pressure into movement, and a movement sensing unit comprising a motion sensor capable of measuring the movement of the pressure response unit in response to the pressure. The device allows a user to control a cursor on a computer screen and input commands without the restrictive requirement of a rigid flat space, and is particularly suited to the anatomy of fingers, making it ergonomic for a user's hand.
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: June 23, 2020
    Inventor: Xiong Huang
  • Patent number: 10691280
    Abstract: The present disclosure provides a touch sensing structure, an operating method thereof, a touch substrate, a manufacturing method thereof, and a touch display device. The touch sensing structure includes: first signal lines; second signal lines; a plurality of sensing driving electrodes each connected to a corresponding first signal line and a corresponding second signal line; and a sensing layer in contact with the plurality of sensing driving electrodes. Rigidity of the sensing layer in contact with the sensing driving electrode changes along with a voltage difference between an electric signal applied to one of the first signal lines and an electric signal applied to the corresponding second signal line.
    Type: Grant
    Filed: February 27, 2018
    Date of Patent: June 23, 2020
    Assignee: BOE TECHNOLOGY GROUP CO., LTD.
    Inventors: Xiaoliang Ding, Haisheng Wang, Yingming Liu, Yanling Han, Chihjen Cheng
  • Patent number: 10684478
    Abstract: Aspects of the present disclosure relate to user interface systems and methods for use in head-worn computing systems.
    Type: Grant
    Filed: August 22, 2016
    Date of Patent: June 16, 2020
    Assignee: Mentor Acquisition One, LLC
    Inventor: Ralph F. Osterhout
  • Patent number: 10682038
    Abstract: The presently disclosed devices, methods, and systems involve direct and intuitive visualization using gaze control in robotic laparoscopy. For example, when the system detects a new visual interest, the robot may guide the scope to approach or move toward the target view. In order to achieve the disclosed control, a system coordinate transformation is developed. The disclosed devices, methods, and systems may translate gaze positions on the image (in pixels) to relative rotation angles and/or translation distances of the laparoscope. In most cases, this relationship, which may be built, in part, on the parameters of the laparoscope and the inserted laparoscope length in the patient's body, may allow the robot to directly follow the surgeon's gaze position on the monitor. The disclosed devices, methods, and systems may help to reduce cognitive and physical burdens on a laparoscopic surgeon.
    Type: Grant
    Filed: September 21, 2015
    Date of Patent: June 16, 2020
    Inventors: Xiaoli Zhang, Songpo Li
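    A simplified Python sketch of the coordinate translation idea in patent 10682038 above: a gaze position in image pixels is converted to small pan/tilt angles of the scope using the camera's focal length and the inserted scope length. The pinhole-camera assumption and all numeric values are illustrative, not the patent's calibration.
      import math

      def gaze_to_scope_motion(gaze_px, image_size, focal_px, insert_len_mm):
          """Translate a gaze point (pixels) into pan/tilt angles (radians) and an
          approximate lateral tip displacement (mm) for a pivoting laparoscope."""
          cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
          dx, dy = gaze_px[0] - cx, gaze_px[1] - cy        # offset from image centre
          pan  = math.atan2(dx, focal_px)                  # horizontal rotation
          tilt = math.atan2(dy, focal_px)                  # vertical rotation
          # With the scope pivoting at the trocar, the tip sweeps roughly
          # insert_len * tan(angle) laterally for small angles.
          tip_dx = insert_len_mm * math.tan(pan)
          tip_dy = insert_len_mm * math.tan(tilt)
          return pan, tilt, (tip_dx, tip_dy)

      if __name__ == "__main__":
          pan, tilt, tip = gaze_to_scope_motion(gaze_px=(1100, 500),
                                                image_size=(1920, 1080),
                                                focal_px=1400.0,
                                                insert_len_mm=120.0)
          print(f"pan {math.degrees(pan):.2f} deg, tilt {math.degrees(tilt):.2f} deg, tip move {tip}")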
  • Patent number: 10684692
    Abstract: Systems, devices, and methods adapt established concepts from natural language processing for use in gesture identification algorithms. A gesture identification system includes sensors, a processor, and a non-transitory processor-readable memory that stores data and/or instructions for performing gesture identification. A gesture identification system may include a wearable gesture identification device. The gesture identification process involves segmenting signals from the sensors into data windows, assigning a respective “window class” to each data window, and identifying a user-performed gesture based on the corresponding sequence of window classes. Each window class exclusively characterizes at least one data window property and is analogous to a “letter” of an alphabet. Under this model, each gesture is analogous to a “word” made up of a particular combination of window classes.
    Type: Grant
    Filed: December 22, 2017
    Date of Patent: June 16, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Idris S. Aleem, Pedram Ataee
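    An illustrative Python sketch of the "letters and words" model in patent 10684692 above: sensor data is segmented into windows, each window is assigned a window class (a "letter") by a simple nearest-centroid rule, and the resulting class sequence is matched against gesture "words". The centroids, window length, and gesture vocabulary are made-up placeholders, not the patent's classifier.
      from typing import List, Dict, Sequence

      def segment(signal: Sequence[float], window: int) -> List[List[float]]:
          """Split the raw sensor stream into fixed-length data windows."""
          return [list(signal[i:i + window]) for i in range(0, len(signal) - window + 1, window)]

      def window_class(win: List[float], centroids: Dict[str, float]) -> str:
          """Assign a 'letter': the class whose centroid is closest to the window mean."""
          mean = sum(win) / len(win)
          return min(centroids, key=lambda c: abs(centroids[c] - mean))

      def identify_gesture(signal: Sequence[float], window: int,
                           centroids: Dict[str, float],
                           vocabulary: Dict[str, str]) -> str:
          """Build the window-class sequence (the 'word') and look it up."""
          word = "".join(window_class(w, centroids) for w in segment(signal, window))
          return vocabulary.get(word, "unknown")

      if __name__ == "__main__":
          centroids  = {"a": 0.0, "b": 1.0, "c": -1.0}         # hypothetical letter centroids
          vocabulary = {"abba": "fist-clench", "acca": "wave"}  # hypothetical gesture words
          stream = [0.1, -0.1, 0.9, 1.1, 1.0, 0.8, 0.0, 0.2]    # 4 windows of length 2 -> "abba"
          print(identify_gesture(stream, window=2, centroids=centroids, vocabulary=vocabulary))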
  • Patent number: 10685643
    Abstract: A method and apparatus that dynamically adjust operational parameters of a text-to-speech engine in a speech-based system are disclosed. A voice engine or other application of a device provides a mechanism to alter the adjustable operational parameters of the text-to-speech engine. In response to one or more environmental conditions, the adjustable operational parameters of the text-to-speech engine are modified to increase the intelligibility of synthesized speech.
    Type: Grant
    Filed: June 28, 2017
    Date of Patent: June 16, 2020
    Assignee: Vocollect, Inc.
    Inventors: James Hendrickson, Debra Drylie Stiffey, Duane Littleton, John Pecorari, Arkadiusz Slusarczyk
  • Patent number: 10685487
    Abstract: Systems, apparatus and methods for limiting information on an augmented reality (AR) display based on various speeds of an AR device are presented. Often information forms a distraction when the wearer is driving, running or even walking. Therefore, the described systems, devices and methods aim to limit information displayed on an AR display based on three or more levels of movement (e.g., stationary, walking, driving) such that the wearer is less distracted when higher levels of concentration are needed for real world activities.
    Type: Grant
    Filed: March 6, 2013
    Date of Patent: June 16, 2020
    Assignee: QUALCOMM Incorporated
    Inventor: Jack Mandala
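    A small Python sketch of the three-or-more-level idea in patent 10685487 above: device speed is bucketed into movement levels, and each level caps how much information the AR display shows. The thresholds and the overlay items allowed at each level are assumptions for illustration.
      def movement_level(speed_mps: float) -> str:
          """Bucket device speed into movement levels (thresholds are illustrative)."""
          if speed_mps < 0.3:
              return "stationary"
          if speed_mps < 2.5:
              return "walking"
          return "driving"

      # Hypothetical policy: which overlay elements survive at each level.
      ALLOWED = {
          "stationary": {"notifications", "navigation", "media", "messages"},
          "walking":    {"navigation", "notifications"},
          "driving":    {"navigation"},
      }

      def filter_overlay(items, speed_mps: float):
          """Drop overlay items that the current movement level does not allow."""
          allowed = ALLOWED[movement_level(speed_mps)]
          return [it for it in items if it["kind"] in allowed]

      if __name__ == "__main__":
          overlay = [{"kind": "messages", "text": "hi"},
                     {"kind": "navigation", "text": "turn left 200 m"}]
          print([it["kind"] for it in filter_overlay(overlay, speed_mps=12.0)])  # driving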
  • Patent number: 10677562
    Abstract: Head up displays are provided. A head up display may have a base adapted for mounting to a firearm; a window positioned by the base so that a user of the firearm can observe a field of view through the window; an illuminator operable to generate a divergent illumination light; a light valve configured to modulate the divergent illumination light in accordance with an image provided by an image generator; and a collimating optic in an optical path between the light valve and the window that substantially collimates the image-modulated illumination light. The window reflects at least a portion of the image-modulated collimated light so that the observer observes the image substantially in focus with objects in the field of view. Images presented may include data and images, including but not limited to thermal images.
    Type: Grant
    Filed: April 22, 2019
    Date of Patent: June 9, 2020
    Assignee: LMD Power of Light Corporation
    Inventors: Brian L. Olmsted, Jeffrey W. Mock
  • Patent number: 10678340
    Abstract: A system includes one or more hardware processors, a head mounted display (HMD) configured to display a virtual environment to a user wearing the HMD, an input device configured to allow the user to interact with virtual objects presented in the virtual environment, and a virtual mini-board module executable by the one or more hardware processors. The virtual mini-board module is configured to perform operations including providing a virtual mini-board to the user within the virtual environment, the virtual mini-board including a representation of a region of the virtual environment, detecting a scroll operation performed by the user, modifying the region of the virtual environment based on the scroll operation, and updating one or more of (1) the virtual environment and (2) the representation of the region of the virtual environment on the virtual mini-board, based on the modifying.
    Type: Grant
    Filed: July 31, 2017
    Date of Patent: June 9, 2020
    Assignee: Unity IPR ApS
    Inventors: Timoni West, Amir Pascal Ebrahimi
  • Patent number: 10678336
    Abstract: An apparatus includes a display and an input member. A controller generates a user interface on the display and orients the displayed user interface to a side of the display in response to an activation of the input member.
    Type: Grant
    Filed: September 10, 2013
    Date of Patent: June 9, 2020
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventor: Ronald Andrew DeMena, III
  • Patent number: 10679378
    Abstract: A system configured to determine a six-degree-of-freedom pose of a physical object in a physical environment and to utilize the six-degree-of-freedom pose as an input within a virtual environment or mixed reality environment. In some cases, the system may utilize one or more cameras on a headset device to track the pose of a controller or other objects and one or more cameras on the controller itself to track the pose of the headset device or the user. In one example, the system may capture image data of a physical object having a constellation or pattern on its exterior. The system may analyze the image data to identify image points associated with the constellation or pattern and to determine the pose of the object based on a location of the points in the image.
    Type: Grant
    Filed: January 23, 2019
    Date of Patent: June 9, 2020
    Assignee: Occipital, Inc.
    Inventors: Vikas M. Reddy, Jeffrey Roger Powers, Paul Jakob Schroeder, Chris Slaughter, Ganesh Mallya
  • Patent number: 10678407
    Abstract: The invention relates to a method and apparatus for controlling one or more controllable devices (5, 6, 7, 38, 39, 40) in a system. In order to provide the apparatus with a more user-friendly interface to control the one or more controllable devices in the system, the method comprises capturing an image (50) comprising the one or more controllable devices, displaying the captured image (50), associating local areas (105, 106, 107, 108, 109, 110) in the captured image (50) with the one or more respective controllable devices (5, 6, 7, 38, 39, 40) on the basis of information associated with the controllable devices in the captured image, receiving a user input indicating a selected one of the local areas in the displayed image, determining a command for the one or more controllable devices from the user input associated with the selected local area, and communicating the command to the one or more controllable devices.
    Type: Grant
    Filed: August 6, 2013
    Date of Patent: June 9, 2020
    Assignee: SIGNIFY HOLDING B.V.
    Inventors: Bingzhou Chen, Jianping Zhang, Xiang Chen, Hong Ming Zheng, Jianlin Xu, Zhen Hua Zhou
  • Patent number: 10678373
    Abstract: Devices, methods, and computer-readable media are described for distinguishing user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. The devices, methods, and computer-readable media distinguish between input device gestures quickly and reliably by utilizing a gesture profile that includes a preferential input type, e.g., to preferentially recognize a received input as a first input type over a second input type.
    Type: Grant
    Filed: February 7, 2017
    Date of Patent: June 9, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alexander J. Zotov, Reed Townsend, Steven P. Dodge
  • Patent number: 10678903
    Abstract: Example implementations relate to authentication using a sequence of images. For example, a computing device may include an input device to receive data from a user and a processor in communication with the input device. The processor provides a request for a private key associated with the user, where the request includes a public key specifying a set of categories each relating to an object. The processor receives, via the input device, the private key in response to the request, where the private key includes a sequence of images based on the public key. The processor authenticates the user when each image in the sequence of images is associated with a particular object previously assigned to the user based on the set of categories.
    Type: Grant
    Filed: May 2, 2016
    Date of Patent: June 9, 2020
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Steven J. Simske, Tong Zhang
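    A toy Python sketch of the challenge/response flow in patent 10678903 above: the "public key" is an ordered set of object categories and the "private key" is a sequence of images, one per category, each of which must show the object previously assigned to the user. The image labels stand in for actual image recognition and are purely hypothetical.
      from typing import Dict, List

      # Objects previously assigned to the user, one per category (hypothetical data).
      USER_ASSIGNED = {"animal": "otter", "vehicle": "tram", "fruit": "fig"}

      # Label that an (unspecified) recognizer would produce for each image id.
      IMAGE_LABELS = {"img_014": "otter", "img_203": "tram",
                      "img_977": "fig", "img_402": "apple"}

      def authenticate(public_key: List[str], private_key: List[str],
                       assigned: Dict[str, str]) -> bool:
          """Accept only if, for every category in the public key, the image at the
          same position in the private key depicts the user's assigned object."""
          if len(public_key) != len(private_key):
              return False
          for category, image_id in zip(public_key, private_key):
              if IMAGE_LABELS.get(image_id) != assigned.get(category):
                  return False
          return True

      if __name__ == "__main__":
          challenge = ["animal", "vehicle", "fruit"]                  # public key
          print(authenticate(challenge, ["img_014", "img_203", "img_977"], USER_ASSIGNED))  # True
          print(authenticate(challenge, ["img_014", "img_203", "img_402"], USER_ASSIGNED))  # False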
  • Patent number: 10678327
    Abstract: The technology described herein splits the control focus of a user interface during a sustained user interaction. A sustained user interaction is an interaction lasting more than a threshold period of time. The sustained interaction is initiated when the user directs the control focus of an interface onto an interface object and begins an interaction with the object. Upon determining that a sustained interaction has begun, a control focus lock is executed at the point of control focus where the sustained interaction began, for example, the point where the cursor was located when the sustained interaction began. Upon termination of the sustained interaction, the primary control focus is snapped to the location of the secondary control focus and the secondary control focus is terminated until a subsequent sustained interaction is detected.
    Type: Grant
    Filed: January 27, 2017
    Date of Patent: June 9, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Robert Poerschke, Jonathan Hoof, Tommaso Checchi, Henry C. Sterchi
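    A condensed Python sketch of the split-focus behaviour in patent 10678327 above: once an interaction lasts longer than a threshold it counts as sustained, the primary control focus is locked while a secondary focus keeps following the pointer, and on release the primary focus snaps to the secondary location and the secondary focus is dropped. The timing threshold and coordinates are illustrative assumptions.
      from typing import Optional, Tuple

      Point = Tuple[float, float]
      SUSTAIN_THRESHOLD_S = 0.5      # interaction longer than this counts as sustained

      class SplitFocus:
          def __init__(self, start: Point):
              self.primary: Point = start             # where the UI applies the interaction
              self.secondary: Optional[Point] = None  # trailing focus during a sustained action
              self._press_time: Optional[float] = None

          def press(self, t: float, at: Point) -> None:
              self._press_time, self.primary = t, at

          def move(self, t: float, at: Point) -> None:
              if self._press_time is not None and t - self._press_time > SUSTAIN_THRESHOLD_S:
                  # Sustained interaction: lock the primary focus, track with the secondary.
                  self.secondary = at
              else:
                  self.primary = at

          def release(self, t: float) -> None:
              if self.secondary is not None:
                  self.primary = self.secondary       # snap primary to the secondary focus
                  self.secondary = None               # terminate the secondary focus
              self._press_time = None

      if __name__ == "__main__":
          f = SplitFocus(start=(0, 0))
          f.press(0.0, (10, 10))
          f.move(0.2, (12, 10))      # not yet sustained: primary still follows
          f.move(0.9, (60, 40))      # sustained: primary stays put, secondary tracks
          f.release(1.2)
          print(f.primary)           # -> (60, 40)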
  • Patent number: 10681317
    Abstract: In a color correction method, a reference area is selected in a document page, which is rendered to produce image data. The reference area can be a company logo, a letter or character in a line of text, or other graphic object. The reference area need not be a set calibration chart. The image data is used by a projector to display the document page on a screen. Ambient light may cause colors in the displayed document page to be noticeably different from the intended colors specified in the document page. A camera takes a picture of the displayed document page, and a target area is identified in the camera image based on a characteristic of the reference area that was previously selected. Corrected image data is generated based on a color difference between the reference area and the target area. The projector uses the corrected image data to display the document page, this time with colors that are closer to or the same as the intended colors.
    Type: Grant
    Filed: March 29, 2018
    Date of Patent: June 9, 2020
    Assignee: KONICA MINOLTA LABORATORY U.S.A., INC.
    Inventor: Kurt Nathan Nordback
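    A minimal Python sketch of the correction loop in patent 10681317 above: the average color of the selected reference area in the source image is compared with the matching target area found in the camera picture of the projection, and per-channel gains derived from that difference are applied to the image data before re-projection. The gain-based correction and the clamping are simplifying assumptions.
      from typing import List, Tuple

      Pixel = Tuple[float, float, float]   # RGB in 0..1

      def mean_color(pixels: List[Pixel]) -> Pixel:
          n = float(len(pixels))
          return tuple(sum(p[c] for p in pixels) / n for c in range(3))

      def correction_gains(reference: List[Pixel], captured_target: List[Pixel]) -> Pixel:
          """Per-channel gain that would map the captured target color back to the
          intended reference color (ambient light usually washes channels out)."""
          ref, cap = mean_color(reference), mean_color(captured_target)
          return tuple(ref[c] / cap[c] if cap[c] > 1e-6 else 1.0 for c in range(3))

      def apply_gains(image: List[Pixel], gains: Pixel) -> List[Pixel]:
          """Scale every pixel, clamping to the displayable range."""
          return [tuple(min(1.0, p[c] * gains[c]) for c in range(3)) for p in image]

      if __name__ == "__main__":
          logo_in_document = [(0.10, 0.30, 0.70)] * 4    # intended reference colors
          logo_in_camera   = [(0.18, 0.36, 0.66)] * 4    # what the camera saw on the screen
          gains = correction_gains(logo_in_document, logo_in_camera)
          page  = [(0.10, 0.30, 0.70), (0.9, 0.9, 0.9)]
          print(gains, apply_gains(page, gains))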
  • Patent number: 10678328
    Abstract: A system and method select a user interface. The method is performed by an imaging device including a gaze tracker. The method includes receiving captured data used to generate an image that is displayed, where the image includes identified areas. The method includes tracking a first viewing location on the image by a user of the imaging device. The method includes determining one of the identified areas in the image being viewed based upon the first viewing location. The method includes determining a first user interface to be provided based upon a first correlation to the determined identified area. The method includes displaying the first user interface for use by the user.
    Type: Grant
    Filed: November 30, 2015
    Date of Patent: June 9, 2020
    Assignee: KONINKLIJKE PHILIPS N.V.
    Inventors: Kongkuo Lu, Yuechen Qian, Eric Cohen-Solal
  • Patent number: 10679474
    Abstract: A tactile or haptic guidance device incorporates a LIDAR assembly and servo motor assembly or linear actuators to provide tactile feedback to the user. The LIDAR assembly determines obstacles in the path of the user and the micro-controller uses the LIDAR data to send a signal to the servo motor assembly or linear actuators, which will cause translational or rotational movement along one or more axes with appropriate amounts of force to provide tactile feedback to the user about objects in their path.
    Type: Grant
    Filed: September 4, 2019
    Date of Patent: June 9, 2020
    Inventor: Winston Yang
  • Patent number: 10678330
    Abstract: A head mounted display system includes a display device and an eyetracking device. The display device includes a liquid crystal (LC) panel comprising a plurality of rows of pixels, a back light unit (BLU), and a data driver. The BLU emits light during an illumination period of a frame period from an illumination start time and does not emit light for a remaining portion of the frame period. The eyetracking device determines an eye gaze area of a user in a pixel area of the display device. The illumination start time varies based on a location of the eye gaze area of the user. Liquid crystal material in a row of pixels of the LC panel outside the eye gaze area of the user transitions during the illumination period.
    Type: Grant
    Filed: August 15, 2019
    Date of Patent: June 9, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Evan M. Richards, Jianru Shi, Fenglin Peng
  • Patent number: 10678346
    Abstract: A system provides interaction between a target point arranged on a projection surface of a virtual desktop and a pointing device. A spatial coordinate of the target point may be determined by means of an image processing system from a two-dimensional optical code applied to a carrier plane associated with the pointing device, and may be transmitted as a control variable to a control unit of the projection surface. A spatial position of a normal vector perpendicular to the code surface at its center of gravity may be determined. The normal vector may be aligned with the target point by shifting and tilting the carrier plane. A rotational movement of the carrier plane about an axis of rotation perpendicular to the carrier plane may be detected, and an activation signal of the pointing device may be generated as a function of a detected rotational movement.
    Type: Grant
    Filed: April 25, 2019
    Date of Patent: June 9, 2020
    Inventor: Oliver Horst Rode
  • Patent number: 10678337
    Abstract: A movement recognition system includes a feature extraction circuit and a classification circuit that is orientation independent. The feature extraction circuit is configured to receive a plurality of acceleration or angular velocity measurements from a sensor or multiple sensors that include a three dimensional (3D) accelerometer and 3D gyroscope. Each of the measurements is taken at different times. The feature extraction circuit is also configured to determine a difference between a first of the measurements and each other of the measurements. The classification circuit is configured to classify a movement of an object attached to the sensor irrespective of an orientation of the sensor on the object utilizing a signal recognition technique based on the difference between the first of the measurements and each other of the measurements.
    Type: Grant
    Filed: January 4, 2017
    Date of Patent: June 9, 2020
    Assignee: The Texas A&M University System
    Inventors: Roozbeh Jafari, Jian Wu
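    A rough Python sketch of the difference-based feature idea in patent 10678337 above: each accelerometer or gyroscope sample is compared with the first sample of the motion, and only the magnitudes of those differences (which do not depend on how the sensor was mounted) feed a classifier. The two-class threshold rule at the end is a stand-in for whatever classifier the patent actually uses.
      import math
      from typing import List, Sequence

      Vec3 = Sequence[float]

      def diff_magnitudes(samples: List[Vec3]) -> List[float]:
          """|s_i - s_0| for every later sample; these magnitudes do not change if
          the sensor is mounted in a different orientation on the object."""
          x0 = samples[0]
          return [math.sqrt(sum((s[k] - x0[k]) ** 2 for k in range(3))) for s in samples[1:]]

      def classify(accel: List[Vec3]) -> str:
          """Toy classifier: a large, sustained deviation from the first sample is a
          'raise arm' movement, otherwise 'still' (threshold is illustrative)."""
          feats = diff_magnitudes(accel)
          return "raise arm" if sum(feats) / len(feats) > 0.5 else "still"

      if __name__ == "__main__":
          still  = [(0.0, 0.0, 1.0)] * 8
          moving = [(0.0, 0.0, 1.0), (0.2, 0.1, 1.0), (0.6, 0.3, 0.9),
                    (1.1, 0.5, 0.7), (1.4, 0.6, 0.6), (1.5, 0.7, 0.5)]
          print(classify(still), "/", classify(moving))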
  • Patent number: 10681325
    Abstract: A system creates an output image of a scene using two-dimensional (2D) images of the scene. For a pixel in the output image, the system identifies, in the output image, 2D fragments that correspond to the pixel. The system converts the 2D fragments into three dimensional (3D) fragments, creates volume spans for the pixel based on the 3D fragments, determines a color of a volume span based on color contribution of respective one or more of the 3D fragments for the volume span, and determines a color of the pixel for the output image from determined colors of the volume spans.
    Type: Grant
    Filed: May 16, 2016
    Date of Patent: June 9, 2020
    Assignee: Google LLC
    Inventors: Janne Kontkanen, Noah Snavely
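    A compact Python sketch of the per-pixel pipeline in patent 10681325 above: the fragments contributing to a pixel carry depth and opacity, are turned into depth-ordered volume spans, and the spans are composited front to back into the final pixel color. Treating each fragment as one thin span and using simple "over" compositing are simplifying assumptions.
      from dataclasses import dataclass
      from typing import List, Tuple

      Color = Tuple[float, float, float]

      @dataclass
      class Fragment:          # a 2D fragment contributing to one output pixel
          depth: float         # distance from the camera
          color: Color
          alpha: float         # opacity in 0..1

      @dataclass
      class VolumeSpan:        # a fragment promoted to a depth span
          near: float
          far: float
          color: Color
          alpha: float

      def build_spans(frags: List[Fragment], thickness: float = 0.05) -> List[VolumeSpan]:
          """Turn each fragment into a thin volume span, ordered front to back.
          (A fuller implementation would merge overlapping fragments per span.)"""
          spans = [VolumeSpan(f.depth, f.depth + thickness, f.color, f.alpha) for f in frags]
          return sorted(spans, key=lambda s: s.near)

      def composite(spans: List[VolumeSpan]) -> Color:
          """Front-to-back 'over' compositing of span colors into the pixel color."""
          out, transmittance = [0.0, 0.0, 0.0], 1.0
          for s in spans:
              w = transmittance * s.alpha
              for c in range(3):
                  out[c] += w * s.color[c]
              transmittance *= (1.0 - s.alpha)
          return tuple(out)

      if __name__ == "__main__":
          frags = [Fragment(2.0, (0.0, 0.0, 1.0), 0.6),   # farther, blue
                   Fragment(1.0, (1.0, 0.0, 0.0), 0.5)]   # nearer, semi-transparent red
          print(composite(build_spans(frags)))             # red over blue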
  • Patent number: 10681042
    Abstract: Embodiments of the invention are generally directed to systems, methods, devices, and machine-readable mediums for implementing gesture-based signature authentication. In one embodiment, a method may involve recording a first gesture-based signature and storing the recorded first gesture-based signature. Then the method compares the first gesture-based signature with a second gesture-based signature. Then the method verifies the first gesture-based signature as authentic when the first gesture-based signature is substantially similar to the second gesture-based signature.
    Type: Grant
    Filed: March 6, 2019
    Date of Patent: June 9, 2020
    Assignee: Intel Corporation
    Inventors: Carlos Carrizo, Claudio Ochoa
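    A toy Python sketch of the record/compare/verify flow in patent 10681042 above: a gesture-based signature is stored as a normalized sequence of 2D points and a new attempt is accepted when its average point-to-point distance from the enrolled signature is small. The normalization and the distance threshold are illustrative choices, not the patent's similarity test.
      import math
      from typing import List, Tuple

      Point = Tuple[float, float]

      def normalize(gesture: List[Point]) -> List[Point]:
          """Translate to the centroid and scale to unit size so the position and
          scale of the performed gesture do not matter, only its shape."""
          cx = sum(p[0] for p in gesture) / len(gesture)
          cy = sum(p[1] for p in gesture) / len(gesture)
          centered = [(x - cx, y - cy) for x, y in gesture]
          scale = max(math.hypot(x, y) for x, y in centered) or 1.0
          return [(x / scale, y / scale) for x, y in centered]

      def similar(enrolled: List[Point], attempt: List[Point], threshold: float = 0.15) -> bool:
          """Accept when the mean distance between corresponding points is small.
          Assumes both sequences were resampled to the same length beforehand."""
          a, b = normalize(enrolled), normalize(attempt)
          if len(a) != len(b):
              return False
          mean_dist = sum(math.hypot(p[0] - q[0], p[1] - q[1]) for p, q in zip(a, b)) / len(a)
          return mean_dist < threshold

      if __name__ == "__main__":
          enrolled = [(0, 0), (1, 1), (2, 0), (3, 1)]          # stored first signature
          attempt  = [(5, 5), (6.1, 6.0), (7, 5.1), (8, 6)]    # same shape, shifted
          print(similar(enrolled, attempt))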
  • Patent number: 10679052
    Abstract: A head mounted display device, an object tracking apparatus and an object tracking method thereof are provided. The object tracking apparatus includes a lens, a light splitting device, a programmable light source and an image extractor. The lens generates and projects a detection light beam to an object. The programmable light source has a plurality of sub-light sources. The sub-light sources respectively project a plurality of light beams to a plurality of positions of the light splitting device. The programmable light source receives a driving signal, and adjusts a light-on status of each of the sub-light sources according to the driving signal. The image extractor extracts a detection image from the object. The light splitting device receives at least one of the light beams and generates at least one reflection light beam to the lens, and the lens generates the detection light beam accordingly.
    Type: Grant
    Filed: February 2, 2018
    Date of Patent: June 9, 2020
    Assignee: HTC Corporation
    Inventors: Bo-Wen Xiao, Fu-Cheng Fan
  • Patent number: 10678329
    Abstract: A line-of-sight input device includes a detection means that detects a point of gaze or an eyeball rotational movement of a user, a movement control means that moves an input element, and an input determining means that determines an input of the input element near a predetermined position when a predetermined condition is satisfied, wherein the movement control means increases a movement speed of the input element as the point of gaze is farther from the predetermined position or a rotation angle of the eyeball rotational movement is larger, and makes the movement speed closer to zero as the point of gaze becomes nearer to the predetermined position or the rotation angle of the eyeball rotational movement becomes smaller.
    Type: Grant
    Filed: April 19, 2018
    Date of Patent: June 9, 2020
    Assignee: ORYLAB INC.
    Inventor: Kentaro Yoshifuji
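    A small Python sketch of the speed rule in patent 10678329 above: the input element moves faster the farther the point of gaze is from the target position, and its speed approaches zero as the gaze nears the target. The linear gain and the clamp are assumptions; the patent describes only the proportional behaviour.
      def element_speed(gaze_distance_px: float, gain: float = 0.8,
                        max_speed_px_s: float = 600.0) -> float:
          """Speed grows with the distance between gaze point and target position and
          tends to zero as that distance tends to zero (linear gain, clamped)."""
          return min(max_speed_px_s, gain * max(0.0, gaze_distance_px))

      def step_toward(element_x: float, target_x: float, gaze_x: float, dt: float) -> float:
          """Advance the input element toward the target by one control tick."""
          speed = element_speed(abs(gaze_x - target_x))
          direction = 1.0 if target_x > element_x else -1.0
          return element_x + direction * speed * dt

      if __name__ == "__main__":
          x, target = 0.0, 300.0
          for gaze in (60.0, 180.0, 240.0, 290.0, 300.0):   # gaze drifting toward the target
              x = step_toward(x, target, gaze_x=gaze, dt=0.05)
              print(f"gaze offset {abs(gaze - target):5.1f} px -> element at {x:6.1f}")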
  • Patent number: 10671160
    Abstract: Systems and methods for eye tracking calibration in a wearable system are described. The wearable system can present three-dimensional (3D) virtual content and allow a user to interact with the 3D virtual content using eye gaze. During an eye tracking calibration, the wearable system can validate that a user is indeed looking at a calibration target while the eye tracking data is acquired. The validation may be performed based on data associated with the user's head pose and vestibulo-ocular reflex.
    Type: Grant
    Filed: May 30, 2018
    Date of Patent: June 2, 2020
    Assignee: Magic Leap, Inc.
    Inventors: Benjamin Joseph Uscinski, Yan Xu, Bradley Vincent Stuart
  • Patent number: 10671207
    Abstract: A multi-display system and a driving method of the same are disclosed. The multi-display system includes a multi-display having a plurality of displays each including a sensor for sensing a user input pattern, a position detector configured to sequentially output extended display identification data (EDID) and time-position information together with display identification information of each of the displays based on user input pattern sensing information and display information input from each of the displays, an arrangement calculator configured to store the display identification information and the EDID and time-position information of each of the displays, and calculate arrangement positions of the displays based on the display identification information and the EDID and time-position information, and an image processor configured to divide and distribute an original image of an image source in accordance with the arrangement positions of the displays.
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: June 2, 2020
    Assignee: LG DISPLAY CO., LTD.
    Inventors: Dongwon Park, SungHoon Kim, Jongmin Park, JoonHee Lee, YongChul Kwon
  • Patent number: 10671164
    Abstract: A method for analyzing electroencephalogram (EEG) signals is disclosed. Information associated with two or more options is presented to a user. EEG signals from a sensor coupled to the user are received contemporaneously to the user receiving information associated with the two or more options. The EEG signals are processed in real time to determine which one of the options was selected by the user. In response to determining which one of the options was selected by the user, an action from one or more possible actions associated with the information presented to the user is selected. An output associated with the selected action is then generated.
    Type: Grant
    Filed: December 27, 2017
    Date of Patent: June 2, 2020
    Assignee: X Development LLC
    Inventors: Sarah Ann Laszlo, Gabriella Levine, Joseph Hollis Sargent, Phillip Yee
  • Patent number: 10670866
    Abstract: Disclosed herein are a method for providing a composite image based on optical transparency and an apparatus for the same. The method includes supplying first light of a first light source for projecting a virtual image and second light of a second light source for tracking eye gaze of a user to multiple point lights based on an optical waveguide; adjusting the degree of light concentration of any one of the first light and the second light based on a micro-lens array and outputting the first light or the second light, of which the degree of light concentration is adjusted; tracking the eye gaze of the user by collecting the second light reflected from the pupil of the user based on the optical waveguide; and combining an external image with the virtual image based on the tracked eye gaze and providing the combined image to the user.
    Type: Grant
    Filed: June 6, 2018
    Date of Patent: June 2, 2020
    Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
    Inventors: Ung-Yeon Yang, Ki-Hong Kim, Jin-Ho Kim
  • Patent number: 10671168
    Abstract: Described embodiments include a system and a method. A system includes a first ultrasound transmitter acoustically coupled to a conducting layer of a display surface and configured to deliver a first ultrasound wave to a selected delineated area. The first ultrasonic wave has parameters sufficient to induce a non-linear vibrational response in the conducting layer. A second ultrasound transmitter is acoustically coupled to the conducting layer and configured to deliver a second ultrasound wave to the selected delineated area. The second ultrasonic wave has parameters sufficient to induce a non-linear vibrational response in the conducting layer. A controller selects a delineated area in response to an indication of a touch to the display surface, and initiates delivery of the first and second ultrasonic waves. A convergence of the first and second ultrasonic waves at the selected delineated area produces a stress pattern perceivable or discernible by the human appendage.
    Type: Grant
    Filed: September 20, 2017
    Date of Patent: June 2, 2020
    Assignee: Elwha LLC
    Inventors: Jesse R. Cheatham, III, William Gates, Roderick A. Hyde, Muriel Y. Ishikawa, Jordin T. Kare, Nathan P. Myhrvold, Tony S. Pan, Robert C. Petroski, Lowell L. Wood, Jr., Victoria Y. H. Wood
  • Patent number: 10671925
    Abstract: A computing device and method for cloud-assisted perceptual computing is described. The computing device includes a sensor to collect data for perceptual computing. The computing device also includes an analytics determiner to calculate a disposition result based on the data collected by the sensor and to calculate a confidence level of the disposition result. The computing device compares the confidence level to a threshold and sends the data to a cloud computing device in response to the confidence level being below the threshold.
    Type: Grant
    Filed: December 28, 2016
    Date of Patent: June 2, 2020
    Assignee: Intel Corporation
    Inventor: Yen Hsiang Chew
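    A minimal Python sketch of the offload rule in patent 10671925 above: the device computes a disposition result and a confidence level locally, and only when the confidence falls below a threshold is the sensor data sent to a cloud computing device. `local_model` and `send_to_cloud` are hypothetical stand-ins, not APIs named by the patent.
      from typing import Callable, Tuple

      def classify_with_offload(sensor_data: bytes,
                                local_model: Callable[[bytes], Tuple[str, float]],
                                send_to_cloud: Callable[[bytes], str],
                                threshold: float = 0.85) -> str:
          """Use the on-device result when it is confident enough; otherwise defer
          the same sensor data to the cloud for a (presumably better) disposition."""
          disposition, confidence = local_model(sensor_data)
          if confidence >= threshold:
              return disposition
          return send_to_cloud(sensor_data)

      if __name__ == "__main__":
          # Hypothetical stand-ins for the on-device model and the cloud service.
          fake_local = lambda data: ("face:unknown", 0.42)
          fake_cloud = lambda data: "face:alice"
          print(classify_with_offload(b"frame-bytes", fake_local, fake_cloud))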
  • Patent number: 10672361
    Abstract: A method, apparatus and computer program product are disclosed. The method includes detecting ambient light around an electronic device, determining whether a user is present based on the ambient light, and controlling an operation of the electronic device. The apparatus includes a component that consumes electric power, a light sensor that detects ambient light, a frequency analyzing unit that specifies a frequency component, and a control unit that determines whether a user is present and controls an operation of the component. The computer program product includes code to perform detecting ambient light around an electronic device, determining whether a user is present based on the ambient light, and controlling an operation of the electronic device.
    Type: Grant
    Filed: May 18, 2017
    Date of Patent: June 2, 2020
    Assignee: Lenovo (Singapore) PTE LTD
    Inventors: Kazuhiro Kosugi, Takuroh Kamimura, Hiroki Oda, Atsushi Ohyama, Yuhsaku Sugai, Hideshi Tsukamoto, Hiroyuki Uchida
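    An illustrative Python sketch of the frequency-analysis idea in patent 10672361 above: ambient-light samples are transformed to the frequency domain and the presence decision is keyed to a frequency component of the light. Interpreting that component as slow fluctuations caused by a person moving nearby, and the band limits and threshold, are assumptions for the example, not the patent's criterion.
      import math
      from typing import Sequence

      def band_power(samples: Sequence[float], sample_rate: float,
                     f_lo: float, f_hi: float) -> float:
          """Naive DFT power in [f_lo, f_hi] Hz of a light-sensor sample window."""
          n = len(samples)
          mean = sum(samples) / n
          centered = [s - mean for s in samples]        # drop the steady lighting level
          power = 0.0
          for k in range(1, n // 2):
              f = k * sample_rate / n
              if f_lo <= f <= f_hi:
                  re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                  im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                  power += (re * re + im * im) / n
          return power

      def user_present(samples: Sequence[float], sample_rate: float = 20.0,
                       threshold: float = 1.0) -> bool:
          """Presence guess: enough slow (0.2-3 Hz) fluctuation of ambient light,
          as a person moving nearby would cause, on top of the steady level."""
          return band_power(samples, sample_rate, 0.2, 3.0) > threshold

      if __name__ == "__main__":
          rate, n = 20.0, 100
          steady   = [50.0] * n                                                   # empty room
          shadowed = [50.0 + 5.0 * math.sin(2 * math.pi * 1.0 * t / rate) for t in range(n)]
          print(user_present(steady, rate), user_present(shadowed, rate))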
  • Patent number: 10671231
    Abstract: In one embodiment, an apparatus includes an electrode that is coupled to a body of a user and is configured to receive a signal from the body. The received signal is based on an electromagnetic interference signal generated by an object that is external to the apparatus. The apparatus further includes one or more processors coupled to the electrode. The processors are configured to detect, based on the signal received by the electrode, one or more of: an interaction between the user and the object, an identity of the object, or a context surrounding the apparatus.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: June 2, 2020
    Assignee: Samsung Electronics Company, Ltd.
    Inventors: Olivier Bau, Bogdana Rakova, Mike Digman, Philipp Schoessler, Martin Salvador, Sergi Consul Pacareu
  • Patent number: 10671264
    Abstract: A system and method are described for delivering content to a mobile device using a companion device. The companion device acts as a proxy device to send and receive signals on behalf of other proxied devices. Once content is loaded onto the mobile device, a user can navigate through the content using a navigation path determined based on a user's item of focus. Various transitions and animations can be displayed along the navigation path. Moreover, a user can interact with the content when viewed in a specific layout using touch events or a rotation input device.
    Type: Grant
    Filed: August 31, 2018
    Date of Patent: June 2, 2020
    Assignee: Apple Inc.
    Inventors: James A. Howard, Jonathan R. Dascola
  • Patent number: 10671896
    Abstract: Systems and techniques are disclosed for improving machine learning systems based on enhanced training data. An example method includes providing a visual concurrent display of a set of images of features, the features requiring classification by a reviewing user. A user interface is provided to enable the reviewing user to assign classifications to the images, the user interface being configured to create, read, update, and/or delete classifications. The user interface receives a response from the reviewing user indicating at least two images that share a single classification, and the user interface is updated to represent the single classification.
    Type: Grant
    Filed: December 4, 2017
    Date of Patent: June 2, 2020
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Murray A. Reicher, Aviad Zlotnick
  • Patent number: 10671220
    Abstract: Disclosed is a touchpad system forming a human-machine interface, intended for a vehicle steering wheel. The touchpad system includes a light source illuminating the touchpad and a camera capturing light coming from the touchpad. The touchpad includes a first marker able to reflect the light coming from the light source, and the touchpad, with the exception of the first marker, being transparent to the light emitted by the light source. The system is notable in particular in that pressure on the touchpad causes the touchpad to move and in that, when the touchpad is in a first position, the first marker is not visible to the camera and, when the touchpad is in a second position, the first marker is visible to the camera.
    Type: Grant
    Filed: March 24, 2016
    Date of Patent: June 2, 2020
    Assignees: CONTINENTAL AUTOMOTIVE FRANCE, CONTINENTAL AUTOMOTIVE GMBH
    Inventors: Sebastien Champinot, Stephane Melou
  • Patent number: 10674141
    Abstract: A head-mounted-display system may include 1) a support assembly and 2) a pair of display assemblies moveably coupled to the support assembly such that the display assemblies are moveable between a plurality of positions corresponding to a plurality of user interpupillary distances, with the pair of display assemblies respectively defining separate viewing regions for a user's right and left eyes. Each of the pair of display assemblies may include a lens and a display screen. The head-mounted-display system may also include a detection subsystem having 1) at least one position sensor and 2) a determination module that determines a positional relationship between the pair of display assemblies based on measurements obtained by the at least one position sensor. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: January 23, 2018
    Date of Patent: June 2, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Rui Zhang, Dong Yang
  • Patent number: 10671270
    Abstract: Systems and methods for beverage dispensing based on inputs from a plurality of users include a beverage dispenser with a touch-sensitive graphical display. A computer of the beverage dispenser receives touch event data points and identifies one or more GUI sections of a plurality of GUI sections associated with the received touch event data points. The computer further operates to interpret a touch event input and provide a command responsive to the input to an associated dispensing unit.
    Type: Grant
    Filed: January 5, 2018
    Date of Patent: June 2, 2020
    Assignee: Cornelius, Inc.
    Inventors: Hector Abrach, Jeffrey Joray, Vincenzo DiFatta, Fernando Sanchez, E. Scott Sevcik
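    A small Python sketch of the touch routing described in patent 10671270 above: touch points are mapped to whichever GUI section of the display they fall in, and a command is issued to the dispensing unit associated with that section. The section layout and command names are hypothetical.
      from dataclasses import dataclass
      from typing import List, Optional, Tuple

      @dataclass
      class GuiSection:
          name: str
          rect: Tuple[int, int, int, int]     # x, y, width, height on the display
          dispensing_unit: str                # valve/unit this section controls

          def contains(self, x: int, y: int) -> bool:
              rx, ry, rw, rh = self.rect
              return rx <= x < rx + rw and ry <= y < ry + rh

      def handle_touch(x: int, y: int, sections: List[GuiSection]) -> Optional[str]:
          """Find the GUI section under the touch point and build its pour command."""
          for section in sections:
              if section.contains(x, y):
                  return f"POUR {section.dispensing_unit}"   # command to the associated unit
          return None                                        # touch outside any section

      if __name__ == "__main__":
          layout = [GuiSection("cola",  (0,   0, 200, 400), "valve_1"),
                    GuiSection("water", (200, 0, 200, 400), "valve_2")]
          print(handle_touch(250, 120, layout))   # -> POUR valve_2
          print(handle_touch(500, 120, layout))   # -> None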
  • Patent number: 10671179
    Abstract: A rollable and flexible input device includes a drum, a reel received in the drum, an input device body connected to the reel, and a battery. The reel is rotated relative to the drum so that the input device body is in a rolled-up state or a stretched state. The reel defines a battery cavity, and the battery is disposed in the battery cavity.
    Type: Grant
    Filed: December 30, 2015
    Date of Patent: June 2, 2020
    Assignee: SHENZHEN ROYOLE TECHNOLOGIES CO., LTD.
    Inventors: Xinyuan Xia, Linyu Yu, Songya Chen, Yu Zhou, Gang Chen