Including Orientation Sensors (e.g., Infrared, Ultrasonic, Remotely Controlled) Patents (Class 345/158)
  • Patent number: 10075010
    Abstract: In one embodiment, the tablet connection system includes an attachment station configured to couple to a tablet, and a tablet attachment system configured to couple to the attachment station and the tablet. The attachment station includes a charging mechanism such that the attachment station is configured to charge the tablet and hold the tablet such that the tablet is positioned for viewing by a user. In one embodiment, the charging mechanism of the attachment station is configured to wirelessly charge the tablet that is coupled thereto. The tablet attachment system is configured to provide coupling between the tablet and the attachment station.
    Type: Grant
    Filed: March 30, 2016
    Date of Patent: September 11, 2018
    Assignee: FARADAY & FUTURE INC.
    Inventors: Blake Rosengren, William Alan Beverley
  • Patent number: 10067576
    Abstract: An exemplary embodiment of the present disclosure provides a handheld pointer device and a tilt angle adjustment method thereof. The tilt angle adjustment method includes the following steps. Images corresponding to the position of a reference point are captured while the handheld pointer device points toward the reference point, to generate a plurality of frames. Whether the reference point has substantially moved is then determined based on the plurality of frames. When it is determined that the reference point has not substantially moved, an accelerometer unit of the handheld pointer device detects the accelerations thereof over various axes so as to update a first tilt angle currently in use to a second tilt angle. The handheld pointer device may thus accurately and efficiently calculate the relative position of the reference point using the appropriate tilt angle of the handheld pointer device.
    Type: Grant
    Filed: May 8, 2014
    Date of Patent: September 4, 2018
    Assignee: PIXART IMAGING INC.
    Inventors: Han-Ping Cheng, Chao-Chien Huang, Chia-Cheun Liang
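    Sketch (illustrative, not from the patent): a minimal Python rendering of the tilt-update step described above, assuming a three-axis accelerometer that reports gravity components and a simple pixel-motion check on the tracked reference point; the function names and the 2-pixel tolerance are assumptions.
```python
import math

def reference_point_moved(frames, pixel_tolerance=2.0):
    """True if the tracked reference point shifted between the first and
    last frame; the tolerance is an assumed value."""
    (x0, y0), (x1, y1) = frames[0], frames[-1]
    return math.hypot(x1 - x0, y1 - y0) > pixel_tolerance

def tilt_from_accelerometer(ax, ay, az):
    """Estimate device tilt (roll, pitch) in degrees from the direction of
    gravity measured while the device is static."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

def update_tilt(current_tilt, frames, accel_sample):
    """Replace the tilt angle currently in use only when the reference
    point is judged to be stationary, as the abstract describes."""
    if not reference_point_moved(frames):
        return tilt_from_accelerometer(*accel_sample)
    return current_tilt

# Device held still (reference point barely moved): tilt gets refreshed.
print(update_tilt((0.0, 0.0), frames=[(100, 100), (101, 100)],
                  accel_sample=(0.0, 0.5, 0.86)))
```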
  • Patent number: 10068362
    Abstract: A data processing apparatus includes a motion control unit that controls a movement of a character in a virtual space; a display unit that displays the character and a background, the background including an object; a detection unit that detects contact position information at which the object of the displayed background and a virtual body provided in the virtual space contact; and a specifying unit that specifies characteristic position information regarding a shape of the object of the background in accordance with the detected contact position information.
    Type: Grant
    Filed: August 25, 2016
    Date of Patent: September 4, 2018
    Assignee: KOEI TECMO GAMES CO., LTD.
    Inventor: Yuishin Takafuji
  • Patent number: 10062303
    Abstract: This invention concerns the tracking of objects in video data for artificial vision, for instance for a bionic eye. More particularly, the invention concerns a vision enhancement apparatus for a vision-impaired user. In other aspects, the invention concerns a method for enhancing vision and software to perform the method. The image processor operates to process video data representing images of a scene, automatically detect and track a user-selected object, such as a face, in the images, and automatically modify the video data by reserving a user-selected area of the displayed images for displaying the tracked object as a separate video tile within the scene. The separate video tile remains in the selected area despite movement of the camera relative to the scene, or movement of the user relative to the object or the scene.
    Type: Grant
    Filed: May 31, 2017
    Date of Patent: August 28, 2018
    Assignee: NATIONAL ICT AUSTRALIA LIMITED
    Inventors: Nick Barnes, Chunhua Shen
  • Patent number: 10049494
    Abstract: A method for performing interaction based on augmented reality is provided. The method includes obtaining an image of a real scene photographed by a first terminal; obtaining a first location of the first terminal and a second location of a second terminal, the first location including geographical coordinates and an orientation of the first terminal; matching the first location with the second location; displaying, on the image of the real scene photographed by the first terminal, information of the successfully matched second terminal; and interacting according to the displayed information of the second terminal. By the method, an image is formed by photographing a real scene, information of the successfully matched second terminal is displayed on the image of the real scene photographed by the first terminal, and interactions are then performed as needed. Furthermore, a system for performing interaction based on augmented reality is provided.
    Type: Grant
    Filed: July 11, 2014
    Date of Patent: August 14, 2018
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventors: Zhenwei Zhang, Ling Wang, Fen Xiao, Zhehui Wu
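    Sketch (illustrative, not from the patent): one way the location-matching step above could work, treating the second terminal as matched when it lies within an assumed distance of the first terminal and roughly along the first terminal's camera heading; the thresholds and dictionary keys are assumptions.
```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    d = math.acos(math.sin(p1) * math.sin(p2) +
                  math.cos(p1) * math.cos(p2) * math.cos(dlon)) * EARTH_RADIUS_M
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return d, (math.degrees(math.atan2(y, x)) + 360) % 360

def matches(first, second, max_distance_m=200, half_fov_deg=30):
    """Match when the second terminal is close enough and roughly in the
    direction the first terminal's camera is facing."""
    d, bearing = distance_and_bearing(first["lat"], first["lon"],
                                      second["lat"], second["lon"])
    heading_error = abs((bearing - first["orientation_deg"] + 180) % 360 - 180)
    return d <= max_distance_m and heading_error <= half_fov_deg

first = {"lat": 22.5431, "lon": 114.0579, "orientation_deg": 90.0}
second = {"lat": 22.5431, "lon": 114.0589}   # roughly 100 m due east
print(matches(first, second))                # True
```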
  • Patent number: 10048805
    Abstract: An apparatus can include a left optical trackpad that, responsive to object sensing, outputs a left optical trackpad signal; a right optical trackpad that, responsive to object sensing, outputs a right optical trackpad signal; assignment circuitry that assigns a status to the optical trackpads that is a right hand dominant status or a left hand dominant status; and control circuitry that outputs one or more commands based at least in part on the status and at least one of the right optical trackpad signal and the left optical trackpad signal.
    Type: Grant
    Filed: August 27, 2015
    Date of Patent: August 14, 2018
    Assignee: Lenovo (Singapore) Pte. Ltd.
    Inventors: Julie Anne Gordon, James S. Rutledge, Bradley Park Strazisar, Aaron Michael Stewart, Jay Wesley Johnson
  • Patent number: 10049497
    Abstract: There is provided a display control device including a matching section configured to match a first image or sensor data output from a first imaging device or a sensor worn on a head of a first user, to a second image output from a second imaging device worn on a part other than the head of the first user, a sight estimation section configured to estimate a region corresponding to a sight of the first user in the second image, on the basis of a result of the matching, and a display control section configured to generate an image expressing the sight of the first user using the second image on the basis of a result of the estimation of the sight, and display the image expressing the sight of the first user toward a second user that is different from the first user.
    Type: Grant
    Filed: July 11, 2016
    Date of Patent: August 14, 2018
    Assignee: SONY CORPORATION
    Inventors: Shunichi Kasahara, Junichi Rekimoto
  • Patent number: 10048808
    Abstract: An input operation detection device to detect an input operation applied to an image includes first and second imaging parts and a processor to detect the input operation based on data acquired by the first and second imaging parts. The image is divided into first and second images. The optical axes of the imaging optical systems of the first and second imaging parts intersect with the image at points on the same sides as the installation positions of the corresponding imaging parts with respect to the centers of the corresponding images.
    Type: Grant
    Filed: November 13, 2015
    Date of Patent: August 14, 2018
    Assignee: RICOH COMPANY, LTD.
    Inventors: Koji Masuda, Yasuhiro Nihei, Takeshi Ueda, Masahiro Itoh, Shu Takahashi, Takeshi Ogawa, Hiroaki Tanaka, Shiori Ohta
  • Patent number: 10044998
    Abstract: A projection apparatus includes a projection mode designation unit configured to designate one of a plurality of projection modes which designate at least one of a brightness and a color of the image which is projected by the projection unit; and a projection control unit configured to determine, based on the color information of the projection target surface acquired by the projection surface information acquisition unit and the projection mode designated by the projection mode designation unit, a gray scale range of each of colors constituting the image which is projected by the projection unit, and configured to cause the projection unit to project the image.
    Type: Grant
    Filed: September 28, 2017
    Date of Patent: August 7, 2018
    Assignee: Casio Computer Co., Ltd.
    Inventors: Akihide Takasu, Toru Takahama, Tetsuro Narikawa
  • Patent number: 10042443
    Abstract: A wearable touch device comprising a carrier, a pattern emitting unit, an image acquisition unit, an image monitoring unit and a processing unit, the carrier is wearable, the pattern emitting unit emits a scanning pattern to a touch surface capable of being touched by a touch end, the image acquisition unit is used for acquiring an image formed by projecting the scanning pattern on the touch surface and sending image information of the acquired image to the processing unit, the image monitoring unit is used for monitoring current light energy of the scanning pattern in regions of the touch surface and sending current light energy information to the processing unit, and the processing unit is used for processing the image information of the acquired image and the current light energy information to determine a touch position of the touch end on the touch surface and generate corresponding command information.
    Type: Grant
    Filed: August 20, 2014
    Date of Patent: August 7, 2018
    Assignee: BOE TECHNOLOGY GROUP CO., LTD.
    Inventors: Tianyue Zhao, Yanshun Chen, Qiushi Xu, Yaohui Li
  • Patent number: 10044712
    Abstract: A user may be authenticated to access an account, computing device, or other resource based on the user's gaze pattern and neural or other physiological response(s) to one or more images or other stimuli. When the user attempts to access the resource, a computing device may obtain login gaze tracking data and measurement of a physiological condition of the user at the time that the user is viewing an image or other stimulus. Based on comparison of the login gaze tracking data and the measurement of the physiological condition to a model, the computing device can determine whether to authenticate the user to access the resource.
    Type: Grant
    Filed: May 31, 2016
    Date of Patent: August 7, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John C. Gordon, Cem Keskin, Michael Betser
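    Sketch (illustrative, not from the patent): a minimal version of the comparison step above, scoring login gaze samples and one physiological measurement against a stored per-user model; the feature set, similarity measure, and 0.8 threshold are assumptions.
```python
import numpy as np

def authenticate(login_gaze, login_physio, model, threshold=0.8):
    """Accept the user if the login observation is close enough to the
    enrolled model; shapes and keys here are assumed for illustration."""
    gaze_error = np.linalg.norm(np.asarray(login_gaze, dtype=float).mean(axis=0)
                                - model["gaze_mean"])
    physio_error = abs(login_physio - model["physio_mean"])
    # Convert each error into a [0, 1] similarity using enrolled tolerances.
    gaze_score = max(0.0, 1.0 - gaze_error / model["gaze_tolerance"])
    physio_score = max(0.0, 1.0 - physio_error / model["physio_tolerance"])
    return 0.5 * gaze_score + 0.5 * physio_score >= threshold

model = {"gaze_mean": np.array([512.0, 384.0]), "gaze_tolerance": 80.0,
         "physio_mean": 72.0, "physio_tolerance": 15.0}
print(authenticate([[500, 380], [520, 390]], login_physio=75.0, model=model))
```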
  • Patent number: 10042421
    Abstract: Individual images for individual frames of an animation may be rendered to include individual focal areas. A focal area may include one or more of a foveal region corresponding to a gaze direction of a user, an area surrounding the foveal region, and/or other components. The foveal region may comprise a region along the user's line of sight that permits high visual acuity with respect to a periphery of the line of sight. A focal area within an image may be rendered based on parameter values of rendering parameters that are different from parameter values for an area outside the focal area.
    Type: Grant
    Filed: August 24, 2016
    Date of Patent: August 7, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Kenneth J. Mitchell, Darren Cosker, Nicholas T. Swafford
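    Sketch (illustrative, not from the patent): the core idea above, choosing different rendering-parameter values inside and outside a focal area centred on the gaze point; the parameter names, radius, and values are assumptions.
```python
import math

def rendering_parameters(pixel, gaze_point, foveal_radius_px=120):
    """Full-quality parameters inside the focal area around the gaze point,
    cheaper parameters in the periphery."""
    dist = math.hypot(pixel[0] - gaze_point[0], pixel[1] - gaze_point[1])
    if dist <= foveal_radius_px:
        return {"samples_per_pixel": 8, "lod_bias": 0}   # focal area
    return {"samples_per_pixel": 2, "lod_bias": 2}       # periphery

print(rendering_parameters((960, 540), gaze_point=(950, 530)))  # foveal region
print(rendering_parameters((100, 100), gaze_point=(950, 530)))  # periphery
```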
  • Patent number: 10031588
    Abstract: A tracking system generates a structured light pattern in a local area. The system includes an array of lasers that generate light. The array of lasers includes a plurality of lasers and an optical element. The plurality of lasers are grouped into at least two subsets of lasers, and each of the at least two subsets of lasers is independently switchable. The optical element includes a plurality of cells that are each aligned with a respective subset of the array of lasers. Each cell receives light from a corresponding laser of the array of lasers, and each cell individually applies a modulation to the received light passing through the cell to form a corresponding portion of the structured light pattern that is projected onto a local area.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: July 24, 2018
    Assignee: Facebook, Inc.
    Inventors: Nitay Romano, Nadav Grossinger, Yair Alpern, Emil Alon, Guy Raz
  • Patent number: 10032346
    Abstract: A haptic effect enabled apparatus is presented. The apparatus comprises a surface, a first haptic effect output device, a stretch sensor, a second haptic effect output device, and one or more processors. The surface has a stretch input area. The first haptic effect output device is a stretch haptic effect output device disposed at the stretch input area. The stretch sensor is coupled to the stretch haptic effect output device and to the stretch input area. The stretch sensor is configured to sense a stretch input, and the first haptic effect output device is at least coextensive with the stretch sensor. The second haptic effect output device is disposed at a location outside of the stretch input area. The one or more processors are configured to determine whether a haptic effect output response should occur in response to a stretch input signal received from the stretch sensor.
    Type: Grant
    Filed: October 14, 2016
    Date of Patent: July 24, 2018
    Assignee: IMMERSION CORPORATION
    Inventors: Robert W Heubel, Francis Jose
  • Patent number: 10025395
    Abstract: An image-capturing device configured for a 3D space optical pointing apparatus comprises a plurality of adjacently arranged image-sensing units configured to sense an image of a 3D space and generate successive plane frame images, each comprising a plurality of sensing signals, which are adapted to evaluate a velocity and a position of the optical pointing apparatus relative to a surface. The velocity and position information can be applied to adjust the resolution setting of the image-capturing device.
    Type: Grant
    Filed: July 8, 2016
    Date of Patent: July 17, 2018
    Assignee: PIXART IMAGING INC.
    Inventors: Hsin-Chia Chen, Yen-Min Chang
  • Patent number: 10026228
    Abstract: Scene modification is described for augmented reality using markers with parameters. In one example, a method includes capturing a scene by a camera, the scene having a marker, analyzing the captured scene to identify the marker, determining a location of the marker in the captured scene, determining an augmented reality parameter associated with the identified marker, modifying the captured scene at the marker location based on the augmented reality parameter, and rendering the modified scene.
    Type: Grant
    Filed: February 25, 2015
    Date of Patent: July 17, 2018
    Assignee: INTEL CORPORATION
    Inventors: Kathy Yuen, Glen J. Anderson
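    Sketch (illustrative, not from the patent): the flow described above, assuming markers have already been detected by a fiducial library; each identified marker is paired with the augmented-reality parameter registered for it so the renderer can modify the scene at the marker's location. All names and the registry contents are assumptions.
```python
from dataclasses import dataclass

@dataclass
class Marker:
    marker_id: int
    x: int   # pixel location of the marker in the captured scene
    y: int

# Assumed registry mapping marker ids to augmented-reality parameters.
AR_PARAMETERS = {
    7: {"effect": "overlay_model", "asset": "teapot.obj", "scale": 1.5},
}

def plan_modifications(markers):
    """Pair each identified marker's location with its registered parameter;
    a renderer would then apply these modifications to the captured frame."""
    plan = []
    for marker in markers:
        params = AR_PARAMETERS.get(marker.marker_id)
        if params is not None:
            plan.append({"at": (marker.x, marker.y), **params})
    return plan

# One marker with id 7 found at pixel (320, 240).
print(plan_modifications([Marker(7, 320, 240)]))
```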
  • Patent number: 10025400
    Abstract: A projector displays an image supplied by a PC on a screen using a projection unit, detects a pointed location on the screen using a location detection unit, calculates first coordinates as coordinates of the pointed location in a displayable area of the screen using a coordinate calculation part, converts the calculated first coordinates into second coordinates as coordinates in the supply image using a coordinate conversion part based on image location information indicating a location of the supply image on the screen, outputs the second coordinates obtained by the conversion from an output unit, and corrects the image location information by processing of displaying the image based on a correction image using a control unit.
    Type: Grant
    Filed: February 3, 2017
    Date of Patent: July 17, 2018
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Hiroyuki Ichieda
  • Patent number: 10019843
    Abstract: A method and a system for controlling a near eye display using a virtual navigation space are provided herein. The system may include: a wearable near eye display; a sensor having a field of view, attached to the wearable near eye display and configured to capture a scene; a transmitter attached to the wearable near eye display, the transmitter being configured to transmit a structured light pattern onto a navigation space, wherein the sensor is configured to capture reflections of the specified pattern coming from the navigation space; and a computer processor configured to analyze said reflections and control a visual indicator presented to a user over the wearable near eye display. The method implements the aforementioned logic without being limited to the architecture.
    Type: Grant
    Filed: August 7, 2014
    Date of Patent: July 10, 2018
    Assignee: Facebook, Inc.
    Inventors: Nadav Grossinger, Yair Alpern
  • Patent number: 10019782
    Abstract: Provided are a method and apparatus for displaying content. The method includes receiving content displayed on an external display apparatus; displaying only a partial region of an entire region of the content on a device; generating additional information corresponding to the partial region, based on a user input; and providing the additional information to the external display apparatus, and wherein the additional information comprises information provided to the external display apparatus to be applied to the content that is displayed on the external display apparatus.
    Type: Grant
    Filed: July 1, 2013
    Date of Patent: July 10, 2018
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Sin-oug Yeo, Sun Choi
  • Patent number: 10019068
    Abstract: A method for controlling a home device is provided. More specifically, a method of utilizing a user interface, a method of utilizing a voice, and a method of utilizing a gesture, such as a wrist snap made with a wearable device, are provided.
    Type: Grant
    Filed: January 6, 2015
    Date of Patent: July 10, 2018
    Assignee: Samsung Electronics Co., Ltd.
    Inventor: Bon-Hyun Koo
  • Patent number: 10013115
    Abstract: An electronic apparatus includes a housing, a display unit installed in the housing and having a display area to display an image or a menu icon, a sensing unit installed in the housing adjacent to the display unit, having a virtual area to sense an object which is disposed over the display area of the display unit, and having a first camera to photograph and display the virtual area in a photographing mode and a second camera to extract the object from the virtual area and display a sensed image of the extracted object in a sensing mode, and a controller to analyze status of the object according to the sensed image of the object, and to determine the analyzed status of the object as a user input in the sensing mode.
    Type: Grant
    Filed: February 23, 2017
    Date of Patent: July 3, 2018
    Inventor: Seungman Kim
  • Patent number: 10007359
    Abstract: A navigation trace calibrating method and a related optical navigation device are utilized to transform a first trace line generated by the optical navigation device into a second trace line suitable for user operation. The navigation trace calibrating method includes establishing a reference coordinate system, reading and analyzing the first trace line, calculating a first offset of the first trace line relative to the reference coordinate system, defining an offset between the first trace line and the second trace line as calibration weight to acquire a second offset of the second trace line relative to the reference coordinate system, and calculating a value of the calibration weight according to the second offset and a length of the first trace line.
    Type: Grant
    Filed: January 27, 2016
    Date of Patent: June 26, 2018
    Assignee: PixArt Imaging Inc.
    Inventors: Ching-Lin Chung, Chia-Fu Ke
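    Sketch (illustrative, not from the patent): one reading of the calibration steps above, measuring the first trace line's angular offset from a reference axis and expressing the calibration weight as the per-unit-length correction needed to reach the desired second offset; the formula itself is an assumption.
```python
import math

def trace_offset_deg(trace):
    """Angle (deg) of a trace line relative to the reference x-axis;
    `trace` is a list of (x, y) samples in the reference coordinate system."""
    (x0, y0), (x1, y1) = trace[0], trace[-1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

def trace_length(trace):
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(trace, trace[1:]))

def calibration_weight(first_trace, second_offset_deg):
    """Correction per unit trace length that maps the first trace's offset
    onto the desired second offset."""
    return (second_offset_deg - trace_offset_deg(first_trace)) / trace_length(first_trace)

# A trace that drifts upward by about 5.7 degrees over ~100 units of travel.
trace = [(0, 0), (50, 5), (100, 10)]
print(round(calibration_weight(trace, second_offset_deg=0.0), 4))
```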
  • Patent number: 10008038
    Abstract: A method of displaying augmented reality comprises identifying an object as a totem having at least one user input element, capturing at least one image of an interaction of at least one finger of a user of an augmented reality system with the at least one user input element of the totem, detecting at least one characteristic pertaining to the interaction of the finger(s) of the user with the user input element(s) of the totem based on the captured image(s), and determining a user input based at least in part on the at least one detected characteristic.
    Type: Grant
    Filed: May 7, 2015
    Date of Patent: June 26, 2018
    Assignee: Magic Leap, Inc.
    Inventor: Samuel A. Miller
  • Patent number: 10003732
    Abstract: An image processing device for simulating depth of field in a captured image, the image processing device comprising: a camera sensor configured to capture an image; an orientation sensor configured to determine an orientation of the device from which the direction of capture of the image by the camera sensor is derivable; and a processor configured to apply blur to a first area of the captured image in dependence on the orientation of the device.
    Type: Grant
    Filed: February 25, 2016
    Date of Patent: June 19, 2018
    Assignee: Foodim Ltd
    Inventors: Romi Kadri, Jim Ingle, Neville Kidd
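    Sketch (illustrative, not from the patent): one way orientation-dependent blur could be applied, using Pillow to blur a horizontal band whose height is derived from the device pitch; the band-selection rule and blur radius are assumptions.
```python
from PIL import Image, ImageFilter

def simulate_depth_of_field(image, pitch_deg, blur_radius=6):
    """Blur a band at the top of the image (the 'far' region when shooting
    downward at a plate, for example), sized from the device pitch."""
    w, h = image.size
    # pitch_deg: 0 = level, -90 = pointing straight down (assumed convention).
    fraction = min(max(-pitch_deg / 90.0, 0.0), 1.0) * 0.5
    band = (0, 0, w, int(h * fraction))
    if band[3] == 0:
        return image
    blurred = image.crop(band).filter(ImageFilter.GaussianBlur(blur_radius))
    out = image.copy()
    out.paste(blurred, band)
    return out

# Usage: simulate_depth_of_field(Image.open("plate.jpg"), pitch_deg=-45).save("out.jpg")
```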
  • Patent number: 10001806
    Abstract: A dynamic dual displays system is coupled with a computing device to provide at least two display panels for single user or multiple user applications. In one embodiment, two display panels are provided in a back-to-back configuration connected by a hinge to allow rotation motion of the display panels relative to each other. Depending on the relative positions of the display panels and the computing device, the combined system provides a hybrid device that may dynamically serve as a notebook or a tablet computer. The multiple display panels may present the same or different contents simultaneously, depending on application. In a multi-user application, the display panels may be presented to different users, one of which accepts input data through a touch-sensitive surface on the corresponding display panel.
    Type: Grant
    Filed: April 12, 2012
    Date of Patent: June 19, 2018
    Inventors: Shang-Che Cheng, Wei-Han Wu, Chia-Ming Lin
  • Patent number: 9998427
    Abstract: A system for remotely controlling an electronic device is provided. The system includes a first electronic device for reading a frame buffer to compress a screen and transmitting the compressed screen; a second electronic device connected to the first electronic device to repeatedly receive the compressed screen, receive a communication service related event generated in the first electronic device and output the communication service related event, and receive an event of copying a file between the first electronic device and the second electronic device and pasting the file; and a network for forming a communication channel between the first electronic device and the second electronic device according to an authentication result of authentication information input into at least one of the first electronic device and the second electronic device.
    Type: Grant
    Filed: May 22, 2014
    Date of Patent: June 12, 2018
    Assignee: Samsung Electronics Co., Ltd
    Inventors: Shinhyun Kim, Taeho Kim, Hongkyun Kim, Hyomin Oh, Yongwan Hwang
  • Patent number: 9989623
    Abstract: A detector for determining a position of at least one object. The detector includes: at least one optical sensor configured to detect a light beam traveling from the object towards the detector, the optical sensor including at least one matrix of pixels; and at least one evaluation device configured to determine an intensity distribution of pixels of the optical sensor that are illuminated by the light beam, the evaluation device further configured to determine at least one longitudinal coordinate of the object by using the intensity distribution.
    Type: Grant
    Filed: June 5, 2014
    Date of Patent: June 5, 2018
    Assignee: BASF SE
    Inventors: Robert Send, Ingmar Bruder, Stephan Irle, Erwin Thiel
  • Patent number: 9989924
    Abstract: A smart watch and motion gaming system are disclosed. The smart watch interacts with a motion-controlled game apparatus, and includes a controller, and a geomagnetic field sensor, a gravity sensor, a gyroscope, and a data transmission circuit, which are electrically connected to the controller. The geomagnetic field sensor detects an orientation of the smart watch and acquires the orientation data. The gravity sensor detects an inclination condition of the smart watch and acquires the inclination data. The gyroscope detects a rate of rotation of the smart watch to acquire the rotation rate data. The controller runs at least one of the geomagnetic field sensor, the gravity sensor, and the gyroscope in response to the user's selection, and accordingly collects the sensing data. The data transmission circuit transmits the collected sensing data to the motion-controlled game apparatus.
    Type: Grant
    Filed: September 1, 2016
    Date of Patent: June 5, 2018
    Assignee: JRD COMMUNICATION INC.
    Inventor: Yanfeng Huang
  • Patent number: 9982990
    Abstract: A method corrects the position and deviation of a syringe needle in a machine for automatically preparing intravenous medication. The machine includes an automatic actuator, together with a control system, in which are placed the syringe and a one-dimensional position sensor including a measuring plane. The correction method includes steps in which the position sensor obtains position coordinates of a first and a second point of the needle, and steps of correcting the position and deviation error of the needle, by the control system of the automatic actuator.
    Type: Grant
    Filed: November 3, 2016
    Date of Patent: May 29, 2018
    Assignee: GRIFOLS ENGINEERING, S.A.
    Inventors: Javier Rubio Aguilera, Oriol Casanova Montpeyo
  • Patent number: 9984285
    Abstract: An adaptive tracking system for spatial input devices provides real-time tracking of spatial input devices for human-computer interaction in a Spatial Operating Environment (SOE). The components of an SOE include gestural input/output; network-based data representation, transit, and interchange; and spatially conformed display mesh. The SOE comprises a workspace occupied by one or more users, a set of screens which provide the users with visual feedback, and a gestural control system which translates user motions into command inputs. Users perform gestures with body parts and/or physical pointing devices, and the system translates those gestures into actions such as pointing, dragging, selecting, or other direct manipulations. The tracking system provides the requisite data for creating an immersive environment by maintaining a model of the spatial relationships between users, screens, pointing devices, and other physical objects within the workspace.
    Type: Grant
    Filed: July 7, 2017
    Date of Patent: May 29, 2018
    Assignee: Oblong Industries, Inc.
    Inventors: Ambrus Csaszar, Dima Kogan, Paul Yarin
  • Patent number: 9978178
    Abstract: When sharing visible content on multiple display surfaces associated with different users, the hand of a user may be detected and analyzed when placed over the user's display surface. Characteristics of the user's hand, including things such as position, orientation, color, shape, and texture, may be shared with other users. Based on these characteristics, a representation of the user's hand may be depicted over the shared content on the display surfaces of the other users.
    Type: Grant
    Filed: October 25, 2012
    Date of Patent: May 22, 2018
    Assignee: Amazon Technologies, Inc.
    Inventor: Deborah May Shepard
  • Patent number: 9977531
    Abstract: An optical touch apparatus configured to sense a touch operation of an optical touch stylus is provided. The optical touch apparatus includes a touch operation surface, an optical sensor and a touch controller. The optical touch stylus performs the touch operation on the touch operation surface. The optical sensor is disposed on a side of the touch operation surface and senses the light beam from the touch operation to obtain sensing data. The sensing data include an image of the optical touch stylus and a mirror image generated on the touch operation surface by the optical touch stylus. The touch controller is electrically connected to the optical sensor and calculates a brightness threshold according to the sensing data. The touch controller determines a position of the touch indicator point of the optical touch stylus on the touch operation surface according to the sensing data and the brightness threshold.
    Type: Grant
    Filed: March 2, 2016
    Date of Patent: May 22, 2018
    Assignee: Wistron Corporation
    Inventor: Kuo-Hsien Lu
  • Patent number: 9979473
    Abstract: A system for determining a location of a user is provided that comprises a plurality of transmitters, each associated with a predefined physical location and each comprising a light source to provide an optical location signal; and at least one head-worn locator device with an optical receiver for receiving at least one of the optical location signals. To allow the determination of a location of a user that is wearing the head-worn locator device, each transmitter is configured to provide location information in the respectively provided optical location signal, wherein said location information corresponds to said predefined physical location of the respective transmitter.
    Type: Grant
    Filed: October 29, 2015
    Date of Patent: May 22, 2018
    Assignee: Plantronics, Inc.
    Inventor: Douglas K Rosener
  • Patent number: 9977506
    Abstract: A primary user input mechanism is recommended to an application that executes on a computing device which supports a plurality of different user input mechanisms that users of the computing device can utilize to input information into the computing device. The utilization of each of the user input mechanisms is monitored on an ongoing basis, where this monitoring includes weighting each of the user input mechanisms based on its frequency of use. Upon receiving an indication to launch the application on the computing device, a one of the user input mechanisms currently having the highest weight is recommended to the application as being the primary user input mechanism. The weighting of each of the user input mechanisms is also provided to the application.
    Type: Grant
    Filed: May 22, 2015
    Date of Patent: May 22, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Oren Freiberg, Brandon Walderman, Scott Sheehan
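    Sketch (illustrative, not from the patent): a minimal monitor that weights input mechanisms by frequency of use and recommends the highest-weighted one at application launch; using a raw use count as the weight is an assumption.
```python
from collections import Counter

class InputUsageMonitor:
    """Tracks how often each user input mechanism is used and recommends
    the currently highest-weighted one when an application launches."""

    def __init__(self, mechanisms):
        self.weights = Counter({m: 0 for m in mechanisms})

    def record_use(self, mechanism):
        self.weights[mechanism] += 1

    def recommend(self):
        """Return the recommended primary mechanism plus all weights, since
        the abstract says the weighting is also provided to the application."""
        mechanism, _ = self.weights.most_common(1)[0]
        return mechanism, dict(self.weights)

monitor = InputUsageMonitor(["touch", "pen", "keyboard", "voice"])
for event in ["touch", "pen", "touch", "voice", "touch"]:
    monitor.record_use(event)
print(monitor.recommend())
```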
  • Patent number: 9965027
    Abstract: An application control system and method is adapted for use with an entertainment system of a type including a display such as a monitor or TV and having display functions. A control device may be conveniently held by a user and employs an imager. The control system and method images the screen of the TV or other display to detect distinctive markers displayed on the screen. This information is transmitted to the entertainment system for control of an application or is used by the control device to control an application.
    Type: Grant
    Filed: June 22, 2016
    Date of Patent: May 8, 2018
    Assignee: I-INTERACTIVE LLC
    Inventor: David L. Henty
  • Patent number: 9965246
    Abstract: A method of outputting screen information using a sound, executed on an electronic device, is provided. The method includes creating a multi-dimensional space of two or more dimensions corresponding to a screen of the electronic device, setting sound information on respective coordinates of the multi-dimensional space, extracting location information of a focused point on the screen of the electronic device, determining coordinates of the multi-dimensional space corresponding to the location information, and outputting a sound according to the sound information set to the determined coordinates.
    Type: Grant
    Filed: September 11, 2015
    Date of Patent: May 8, 2018
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Sung Jae Cho, Dong Heon Kang
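    Sketch (illustrative, not from the patent): an assumed example of "sound information set on coordinates", mapping the focused point's horizontal position to stereo pan and its vertical position to pitch.
```python
def sound_for_point(x, y, screen_w, screen_h, low_hz=220.0, high_hz=880.0):
    """Map a focused screen point to sound information: left/right becomes
    pan, top/bottom becomes pitch; the mapping and ranges are assumptions."""
    pan = (x / screen_w) * 2.0 - 1.0                          # -1 left, +1 right
    pitch_hz = high_hz - (y / screen_h) * (high_hz - low_hz)  # top = high pitch
    return {"pan": round(pan, 3), "pitch_hz": round(pitch_hz, 1)}

# Focused point near the top-right corner of a 1920x1080 screen.
print(sound_for_point(1800, 100, 1920, 1080))
```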
  • Patent number: 9962561
    Abstract: Some embodiments are directed to a calibration method including a calibration phantom positioned on an adjustable table on the surface of a mechanical couch, with the phantom's center at an estimated location for the iso-center of a radio therapy treatment apparatus. The calibration phantom is then irradiated using the apparatus, and the relative location of the center of the calibration phantom and the iso-center of the apparatus is determined by analyzing images of the irradiation. The calibration phantom is then repositioned by the mechanical couch applying an offset corresponding to the determined relative location of the center of the calibration phantom and the iso-center of the apparatus to the calibration phantom. Images of the relocated calibration phantom are obtained, and the images are processed to set the co-ordinate system of a 3D camera system relative to the iso-center of the apparatus.
    Type: Grant
    Filed: July 11, 2014
    Date of Patent: May 8, 2018
    Assignee: VISION RT LIMITED
    Inventors: Ivan Meir, Martin Allen, Gideon Hale, Norman Smith, Robert Howe
  • Patent number: 9958946
    Abstract: User input in the form of image data is received from a user via a natural user interface. A vector difference between an adjustment start position and a current position of the user input is calculated. The vector difference includes a vector position and a vector length. The vector position is compared to stored rail data, and the vector length is compared to a stored threshold length. The rail data describes a plurality of virtual rails associated with an application. Based on the comparisons, the user input is matched to one of the plurality of virtual rails and a notification describing the matching is provided to the application. The application, thereupon, transitions from a first command to a second command corresponding to the matching virtual rail without receiving any explicit termination gesture for the first command from the user.
    Type: Grant
    Filed: June 6, 2014
    Date of Patent: May 1, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David William Bastien, Oscar P. Kozlowski, Mark D. Schwesinger
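    Sketch (illustrative, not from the patent): the matching step above, comparing the user input's vector difference against stored rail directions and a threshold length; the rail set, threshold, and angular tolerance are assumptions.
```python
import math

RAILS = {"horizontal": 0.0, "vertical": 90.0}   # assumed rail data (degrees)
THRESHOLD_LENGTH = 40.0                         # assumed threshold (pixels)
ANGLE_TOLERANCE = 20.0                          # assumed tolerance (degrees)

def match_rail(start, current):
    """Return the virtual rail the input matches, or None if the input is
    too short or not aligned with any rail."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    if math.hypot(dx, dy) < THRESHOLD_LENGTH:
        return None
    direction = math.degrees(math.atan2(dy, dx)) % 180   # rails act as axes
    for name, rail_angle in RAILS.items():
        error = abs((direction - rail_angle + 90) % 180 - 90)
        if error <= ANGLE_TOLERANCE:
            return name
    return None

print(match_rail((100, 100), (190, 104)))   # 'horizontal'
```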
  • Patent number: 9955341
    Abstract: A method for preventing call-up operation errors and a system using the same are provided. The method includes the steps of: (S1) a proximity sensor continuously detecting a proximity or distant state of an obstruction; (S2) when detecting that the obstruction approaches, uploading the proximity state in the driver program to an upper layer; (S3) when detecting that the obstruction moves far away, the proximity sensor first skipping reporting the distant state and increasing its transmission power; (S4) determining whether the reflection intensity signal exceeds a set threshold; and (S5) if the obstruction is near the proximity sensor of the mobile phone, the proximity sensor skipping uploading the distant state so that the display screen does not light up, and if the obstruction is far away from the proximity sensor, the proximity sensor uploading the distant state and then lighting up the display screen.
    Type: Grant
    Filed: May 7, 2014
    Date of Patent: April 24, 2018
    Assignee: GUANG DONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD.
    Inventors: Qiang Zhang, Lizhong Wang
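    Sketch (illustrative, not from the patent): one pass of the S1-S5 logic above; the reflection-intensity threshold and the return structure are assumptions.
```python
def handle_proximity_event(state, reflection_intensity, threshold=150):
    """Decide what to report upward and whether the screen may light up,
    given the driver-reported near/far state and the reflection intensity
    measured after raising the sensor's transmission power."""
    if state == "near":
        # S2: the obstruction approached; report the proximity state upward.
        return {"report": "near", "screen_on": False}
    # S3/S4: a 'far' event is not reported immediately; re-check the reflection.
    if reflection_intensity > threshold:
        # S5: something is still close to the sensor; keep the screen off.
        return {"report": None, "screen_on": False}
    # S5: the obstruction really is far away; report it and light the screen.
    return {"report": "far", "screen_on": True}

print(handle_proximity_event("far", reflection_intensity=40))    # screen lights up
print(handle_proximity_event("far", reflection_intensity=300))   # stays dark
```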
  • Patent number: 9946338
    Abstract: There is provided an information processing apparatus including a detection unit configured to detect a gaze point of a user in a display image displayed on a display unit, an estimation unit configured to estimate an intention of the user based on the gaze point detected by the detection unit, an image generation unit configured to generate a varying image that subtly varies from the display image to a final display image according to the intention estimated by the estimation unit, and a display control unit configured to control the display unit in a manner that the varying image generated by the image generation unit is displayed.
    Type: Grant
    Filed: January 30, 2014
    Date of Patent: April 17, 2018
    Assignee: SONY CORPORATION
    Inventors: Kazunori Hayashi, Yoichiro Sako, Takayasu Kon, Yasunori Kamada, Takatoshi Nakamura
  • Patent number: 9946356
    Abstract: Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user.
    Type: Grant
    Filed: February 4, 2016
    Date of Patent: April 17, 2018
    Assignee: InterDigital Patent Holdings, Inc.
    Inventor: Matthew G. Liberty
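    Sketch (illustrative, not from the patent): the standard tilt-compensation idea behind the abstract above, rotating motion sensed in the device's body frame by the roll angle estimated from gravity so cursor motion follows the user's frame of reference; the exact transform used in the patent may differ.
```python
import math

def roll_from_accelerometer(ay, az):
    """Roll angle (radians) about the pointing axis, estimated from the
    gravity components on the device's y and z axes."""
    return math.atan2(ay, az)

def body_to_user_frame(dx, dy, roll):
    """Rotate motion deltas from the device body frame into the user's
    frame so tilt in the hand does not skew cursor motion."""
    cos_r, sin_r = math.cos(roll), math.sin(roll)
    return (dx * cos_r - dy * sin_r, dx * sin_r + dy * cos_r)

# Device rolled ~90 degrees in the hand: sensed x-motion maps to user y-motion.
roll = roll_from_accelerometer(ay=1.0, az=0.0)
print(body_to_user_frame(10.0, 0.0, roll))
```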
  • Patent number: 9939915
    Abstract: An operation device and a method for operating functions of a vehicle involve a gesture detection device allowing detection of gestures carried out by a person in an interior of the vehicle for operating the functions. The gesture detection device has a multitude of operation zones in the interior, each zone being allocated to one of the functions and being selectable for operating the respective function by a gesture allocated to the respective operation zone.
    Type: Grant
    Filed: July 1, 2015
    Date of Patent: April 10, 2018
    Assignee: DAIMLER AG
    Inventor: Volker Entenmann
  • Patent number: 9939167
    Abstract: A user interface for an HVAC controller includes an electronic display and a proximity sensor for sensing a position of a user relative to the electronic display. A display controller is operably coupled to the electronic display and the proximity sensor and is configured to display one or more display elements on the electronic display. In some embodiments, a location of one or more of the display elements on the electronic display may be based, at least in part, on the position of the user sensed by the proximity sensor. In some embodiments, a size of one or more of the display elements on the electronic display may be based, at least in part, on the position of the user sensed by the proximity sensor.
    Type: Grant
    Filed: October 22, 2014
    Date of Patent: April 10, 2018
    Assignee: Honeywell International Inc.
    Inventors: Michael Hoppe, Daniel Murr, Patrick Hudson
  • Patent number: 9936257
    Abstract: The present disclosure discloses a method and terminal for displaying an application. According to an example, in the method, a terminal creates a user interface (UI) operation controller and a UI operation window for an application when receiving an operation command from a user for the application. When a video displaying command for a video application is received, the created UI operation controller is a video decoder, the created UI operation window is a video displaying window, and the video displaying window is displayed in a designated area in a UI provided by the terminal. When the UI operation window is created for another application, the entire UI operation window covers the entire designated area and has the same size as the designated area, or the entire UI operation window is within the designated area.
    Type: Grant
    Filed: September 2, 2014
    Date of Patent: April 3, 2018
    Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
    Inventor: Zhengyuan Qiu
  • Patent number: 9933833
    Abstract: A wearable computing device can detect device-raising gestures. For example, onboard motion sensors of the device can detect movement of the device in real time and infer information about the spatial orientation of the device. Based on analysis of signals from the motion sensors, the device can detect a raise gesture, which can be a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the device can activate its display and/or other components. Detection of a raise gesture can occur in stages, and activation of different components can occur at different stages.
    Type: Grant
    Filed: July 10, 2015
    Date of Patent: April 3, 2018
    Assignee: Apple Inc.
    Inventors: Xiaoyuan Tu, Anil K. Kandangath
  • Patent number: 9928660
    Abstract: Hybrid rendering is described for a wearable display that is attached to a tethered computer. In one example, a process includes determining a position and orientation of a wearable computing device, determining a rate of motion of the wearable computing device, comparing the rate of motion to a threshold, and, if the rate of motion is above the threshold, rendering a view of a scene at the wearable computing device using the position and orientation information, and displaying the rendered view of the scene.
    Type: Grant
    Filed: September 12, 2016
    Date of Patent: March 27, 2018
    Assignee: INTEL CORPORATION
    Inventors: Deepak Shashidhar Vembar, Paul Diefenbaugh, Vallabhajosyula S. Somayazulu, Atsuo Kuwahara, Kofi Whitney, Richmond Hicks
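    Sketch (illustrative, not from the patent): the threshold decision above; the threshold value, its units, and the fallback to the tethered computer for slow motion are assumptions.
```python
def choose_renderer(rate_of_motion, threshold=0.5):
    """Render locally on the wearable when motion is fast (low latency
    matters most); otherwise the tethered computer can supply the frame."""
    return "wearable" if rate_of_motion > threshold else "tethered_computer"

def render_frame(position, orientation, rate_of_motion):
    """Pick the renderer for this frame and pass along the current pose."""
    return {"renderer": choose_renderer(rate_of_motion),
            "position": position, "orientation": orientation}

print(render_frame((0.0, 1.6, 0.0), (0.0, 15.0, 0.0), rate_of_motion=0.9))
```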
  • Patent number: 9928653
    Abstract: Embodiments are disclosed for adjusting a presentation on a head-mounted display (HMD). In one or more example embodiments, a method of dynamically orienting a presentation of a HMD includes gathering HMD sensor data via at least one HMD sensor that is installed on an HMD worn by a driver of the vehicle and gathering vehicle sensor data via at least one vehicle mounted sensor mounted to the vehicle. The example method further includes performing an analysis of the HMD sensor data and of the vehicle sensor data to identify a difference between the HMD sensor data and the vehicle sensor data, and calculating, based on the difference, an orientation of the HMD device in relation to the vehicle. The method further includes adjusting a presentation of data on a display of the HMD device based on the orientation.
    Type: Grant
    Filed: April 13, 2015
    Date of Patent: March 27, 2018
    Assignee: Harman International Industries, Incorporated
    Inventor: Dan Atsmon
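    Sketch (illustrative, not from the patent): a simplified version of the sensor-difference analysis above, using only yaw; the HMD's orientation relative to the vehicle is the wrapped difference of the two readings, and the presentation is shifted by that amount.
```python
def hmd_yaw_in_vehicle(hmd_yaw_deg, vehicle_yaw_deg):
    """Yaw of the HMD relative to the vehicle, wrapped to [-180, 180)."""
    return (hmd_yaw_deg - vehicle_yaw_deg + 180) % 360 - 180

def adjust_presentation(content_bearing_deg, hmd_yaw_deg, vehicle_yaw_deg):
    """Shift displayed content so it stays fixed relative to the vehicle
    while the driver's head turns."""
    return content_bearing_deg - hmd_yaw_in_vehicle(hmd_yaw_deg, vehicle_yaw_deg)

# Driver's head is turned 20 degrees to the left of the vehicle's heading.
print(adjust_presentation(0.0, hmd_yaw_deg=70.0, vehicle_yaw_deg=90.0))  # 20.0
```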
  • Patent number: 9928571
    Abstract: A stretchable display device including a display unit configured to be stretched in at least one direction; a sensing unit configured to sense information on a stretching force applied to the display unit; and a controller configured to stretch the display unit an amount corresponding to the stretching force applied to the display unit, stretch an entire area of the display unit in response to the stretching force being applied to the display unit without an area designation input for designating a partial area of the display unit to be stretched, and stretch the partial area of the display unit in response to the stretching force being applied to the display unit with the area designation input being applied to the display unit.
    Type: Grant
    Filed: May 26, 2015
    Date of Patent: March 27, 2018
    Assignee: LG ELECTRONICS INC.
    Inventors: Jumin Chi, Sanghyun Eim
  • Patent number: 9921645
    Abstract: The present disclosure is directed to a retinal display projection device comprising a projection component arranged for projecting an image directly onto the retina of a user wearing the device. The projection device further comprises an eye gaze detection module arranged to take an eye image of a user's eyes and to activate the projection component if a pupil of the user is in a predetermined position in said eye image.
    Type: Grant
    Filed: June 29, 2015
    Date of Patent: March 20, 2018
    Assignee: Logitech Europe S.A.
    Inventors: Olivier Theytaz, Christophe Constantin, Arash Salarian, Maxim Vlassov, Daniel Bonanno
  • Patent number: 9922179
    Abstract: A method is provided including: detecting, by an electronic device, at least one signal that is generated while a predetermined content is displayed on a display; identifying a security-related object associated with the content based on the signal; comparing, by the electronic device, information relating to the object with specified security setting information; and releasing a restriction on access to a resource based on an outcome of the comparison.
    Type: Grant
    Filed: May 22, 2015
    Date of Patent: March 20, 2018
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Dong Il Son, Jong Chul Choi, Yang Wook Kim, Chi Hyun Cho, Pil Kyoo Han, Chang Ryong Heo