Gesture-based Patents (Class 715/863)
  • Patent number: 10592039
    Abstract: In some embodiments, an electronic device is provided for: displaying, on a display, a plurality of active applications; detecting, on a touch interface, a first-orientation gesture on at least one of the active applications; in response to the detection of the first-orientation gesture on the at least one of the active applications, performing a first operation including moving multiple of the active applications; detecting, on the touch interface, a second-orientation gesture on one or more of the active applications; in response to the detection of the second-orientation gesture on the one or more of the active applications, performing a second operation; detecting, on the touch interface, a particular gesture on a particular one of the active applications; after the detection of the particular gesture on the particular one of the active applications, performing different operations, that are different from the first and second operations, based on a duration of the particular gesture.
    Type: Grant
    Filed: October 9, 2018
    Date of Patent: March 17, 2020
    Assignee: P4TENTS1, LLC
    Inventor: Michael S Smith
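    Example sketch: a minimal, hypothetical Python illustration of dispatching operations by gesture orientation and, for one particular gesture, by its duration, as this abstract describes; the orientation labels, threshold value, and handler strings are assumptions, not details taken from the patent.

```python
# Illustrative dispatcher: orientation-based gestures plus a duration-dependent gesture.
from dataclasses import dataclass

LONG_PRESS_THRESHOLD_S = 0.5  # assumed cutoff separating the two duration-based operations

@dataclass
class Gesture:
    orientation: str      # "horizontal", "vertical", or "press"
    duration_s: float     # how long the contact lasted
    target_app: str       # which active application was touched

def handle_gesture(g: Gesture) -> str:
    if g.orientation == "horizontal":
        return f"first operation: move multiple active applications (from {g.target_app})"
    if g.orientation == "vertical":
        return f"second operation on {g.target_app}"
    if g.orientation == "press":
        # Different operations chosen by how long the particular gesture lasted.
        if g.duration_s >= LONG_PRESS_THRESHOLD_S:
            return f"long-duration operation on {g.target_app}"
        return f"short-duration operation on {g.target_app}"
    return "ignored"

print(handle_gesture(Gesture("press", 0.8, "Mail")))   # long-duration operation on Mail
```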
  • Patent number: 10592735
    Abstract: In one embodiment, a system includes a processor, and a memory to store data used by the processor, wherein the processor is operative to detect a personalized content request or a personalized content capture attempt from at least one image of a video captured by a camera of a collaboration end-point during a collaboration event, visually identify a participant making the personalized content request or the personalized content capture attempt based on an image of the participant in the at least one image, and issue an instruction to send a content item or a link to the content item to a personalized collaboration space of the identified participant, the content item being a response to the personalized content request or the personalized content capture attempt. Related apparatus and methods are also described.
    Type: Grant
    Filed: February 12, 2018
    Date of Patent: March 17, 2020
    Assignee: Cisco Technology, Inc.
    Inventors: Qiujun Zhao, Bingjun Lyu, Qunfeng Chai, Lianqi You, Damien McCoy
  • Patent number: 10591999
    Abstract: The embodiments of the present invention disclose a hand gesture recognition method, device, system, and computer storage medium; a hand gesture recognition method disclosed in the present invention comprising: a hand gesture recognition device publishing in a network a list of supported hand gestures and/or a list of instructions corresponding to said supported hand gestures. Another hand gesture recognition method disclosed by the present invention comprises: a hand gesture recognition control device obtaining by means of a network a list of hand gestures supported by a hand gesture recognition device, and/or a list of instructions corresponding to said supported hand gestures.
    Type: Grant
    Filed: October 22, 2014
    Date of Patent: March 17, 2020
    Assignee: ZTE CORPORATION
    Inventor: Haijun Liu
  • Patent number: 10585490
    Abstract: Embodiments described herein provide approaches for controlling inadvertent inputs to a mobile device. Specifically, at least one approach includes: detecting an operating mode of a mobile device by determining if a user is currently interacting with the mobile device; detecting an operating environment of the mobile device; receiving an input resulting from a physical gesture to an input area of a mobile device; comparing the input to a past history of inputs received by the mobile device; and determining whether the physical gesture is intended by the user based on the operating mode of the mobile device, the operating environment of the mobile device, and the past history of inputs (e.g., per device application). In one approach, an input controller selects logic to be applied in processing gestures based on a combination of user customization, interaction history, and environment characteristics. The selected logic is applied to subsequent gestures.
    Type: Grant
    Filed: March 19, 2018
    Date of Patent: March 10, 2020
    Assignee: International Business Machines Corporation
    Inventors: Swaminathan Balasubramanian, Andrew R. Jones, Brian M. O'Connell
  • Patent number: 10585582
    Abstract: A method and apparatus for disambiguating touch interactions is disclosed. Specifically, a plurality of touch events may be received from an input device. The touch events may each occur at a relative time and location with respect to the other touch events. Based at least on the relative locations and relative times of each touch event, a predetermined number of touch events may be grouped into combinations of touch events. For example, touch events may be grouped together into a first and second touch event set based on a largest time difference between consecutive touch events. An action may be determined based at least on the first touch event set. Accordingly, the first touch event set may be disambiguated from other touch interactions and an associated action may be selected from a broader set of possible actions.
    Type: Grant
    Filed: August 21, 2015
    Date of Patent: March 10, 2020
    Assignee: MOTOROLA SOLUTIONS, INC.
    Inventors: Pawel Jurzak, Maciej Kucia
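    Example sketch: a minimal, hypothetical Python version of the grouping step this abstract describes, splitting a sequence of touch events into two sets at the largest time gap between consecutive events; the field layout and sample data are assumptions.

```python
# Illustrative grouping of touch events into two sets at the largest time gap
# between consecutive events (field names and data are assumptions).
def split_at_largest_gap(events):
    """events: list of (time_s, (x, y)); returns the first and second touch event sets."""
    events = sorted(events, key=lambda e: e[0])
    gaps = [events[i + 1][0] - events[i][0] for i in range(len(events) - 1)]
    cut = gaps.index(max(gaps)) + 1          # split after the largest gap
    return events[:cut], events[cut:]

taps = [(0.00, (10, 10)), (0.08, (12, 11)), (0.95, (200, 40)), (1.02, (203, 42))]
first_set, second_set = split_at_largest_gap(taps)
print(len(first_set), len(second_set))  # 2 2 -> the first set can then select an action
```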
  • Patent number: 10579210
    Abstract: The present invention provides a display control method, a display control device and a terminal. The display control method includes: arranging, in a gap area between application icons on a terminal screen, an application icon not present on the current menu page of the terminal or an application icon not present on any of the menu pages of the terminal. The present invention fully utilizes the gap area between the application icons to arrange specific application icons, and, by means of a mode change or a specific gesture operation, switching between hiding and displaying these icons is realized, thereby enriching the display control modes of the application icons.
    Type: Grant
    Filed: July 19, 2016
    Date of Patent: March 3, 2020
    Assignee: YULONG COMPUTER TELECOMMUNICATION SCIENTIFIC (SHENZHEN) CO., LTD.
    Inventor: Xueying Jing
  • Patent number: 10579213
    Abstract: A method includes presenting a UI of a first application on a screen of a computing device and detecting a user input. For example, the detected user input may be an input tracing a continuous path on the screen of the computing device, and the path may include a first gesture extending from a first location to a second location on the screen followed by a second gesture extending from the second location to a third location on the screen. In response to detecting the first gesture, the computing device may display an interactive menu of the first application, the interactive menu comprising a plurality of menu options. In response to detecting the second gesture, the computing device may identify one of the menu options. In response to detecting user input indicating completion of the second gesture, the computing device may determine the selection of the identified menu option.
    Type: Grant
    Filed: July 20, 2015
    Date of Patent: March 3, 2020
    Assignee: Facebook, Inc.
    Inventor: Brendan Benjamin Aronoff
  • Patent number: 10573049
    Abstract: An intuitive interface may allow users of a computing device (e.g., children, etc.) to create imaginary three dimensional (3D) objects of any shape using body gestures performed by the users as a primary or only input. A user may make motions while in front of an imaging device that senses movement of the user. The interface may allow first-person and/or third person interaction during creation of objects, which may map a body of a user to a body of an object presented by a display. In an example process, the user may start by scanning an arbitrary body gesture into an initial shape of an object. Next, the user may perform various gestures using his body, which may result in various edits to the object. After the object is completed, the object may be animated, possibly based on movements of the user.
    Type: Grant
    Filed: February 5, 2018
    Date of Patent: February 25, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Xiang Cao, Yang Liu, Teng Han, Takaaki Shiratori, Nobuyuki Umetani, Yupeng Zhang, Xin Tong, Zhimin Ren
  • Patent number: 10573044
    Abstract: Saliency-based collage generation techniques are described. A collage generation module is executed by a computing device and receives multiple digital images. The collage generation module then generates multiple saliency maps from the digital images that describe a visual saliency of respective pixels in the digital images. The saliency maps are then used by the collage generation module to fit bounding boxes to portions of the digital images that are considered salient. Collage candidates are generated by the collage generation module based on predefined layouts and the bounding boxes that are fit to the portions of the digital images. The collage generation module then selects at least one collage from these collage candidates for output to a user based on the determined amounts of deviation.
    Type: Grant
    Filed: November 9, 2017
    Date of Patent: February 25, 2020
    Assignee: Adobe Inc.
    Inventors: Shailendra Singh Rathore, Anmol Dhawan
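    Example sketch: a hypothetical, heavily simplified Python pipeline echoing this abstract's flow (saliency map, bounding box over salient pixels, scoring candidates against predefined layouts); the saliency measure, layouts, and deviation score below are stand-in assumptions, not the patented method.

```python
# Illustrative saliency -> bounding box -> collage-candidate scoring pipeline.
import numpy as np

def saliency_map(image):
    """Placeholder saliency: per-pixel deviation from the mean intensity."""
    return np.abs(image - image.mean())

def salient_bbox(sal, quantile=0.8):
    """Fit a bounding box around the most salient pixels."""
    ys, xs = np.where(sal >= np.quantile(sal, quantile))
    return xs.min(), ys.min(), xs.max(), ys.max()   # x0, y0, x1, y1

def candidate_score(bbox, cell):
    """Deviation between a salient box and a layout cell (lower is better)."""
    return float(sum(abs(b - c) for b, c in zip(bbox, cell)))

image = np.zeros((100, 100)); image[20:60, 30:80] = 1.0   # toy image with a bright region
bbox = salient_bbox(saliency_map(image))
layouts = {"left-cell": (0, 0, 50, 100), "right-cell": (50, 0, 100, 100)}
best = min(layouts, key=lambda name: candidate_score(bbox, layouts[name]))
print(bbox, "->", best)
```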
  • Patent number: 10573141
    Abstract: A security system (10) includes an image information acquisition unit (11) that acquires input image information on an image taken of a person in a store, a tracking unit (12) that tracks an action of a hand of the person based on the input image information, and a suspicious action detection unit (13) that detects a suspicious action of the person based on the tracked action of the hand. A security system, a security method, and a security program capable of accurately detecting a suspicious action are thereby provided.
    Type: Grant
    Filed: September 10, 2018
    Date of Patent: February 25, 2020
    Assignee: NEC Corporation
    Inventors: Kaoru Uchida, Nobuyuki Yamashita
  • Patent number: 10551936
    Abstract: A gesture control system includes a processor, the processor in communication with a plurality of sensors. The processor is configured to perform the steps of detecting, using the plurality of sensors, a gesture in a volume occupied by a plurality of occupants, analyzing a prior knowledge to associate the gesture with one of the plurality of occupants, and generating an output, the output being determined by the gesture and the one of the plurality of occupants.
    Type: Grant
    Filed: October 17, 2018
    Date of Patent: February 4, 2020
    Assignees: Honda Motor Co., Ltd., Edge 3 Technologies, Inc.
    Inventors: Stuart Yamamoto, Matt Conway, Graeme Asher, Matt McElvogue, Churu Yun, Tarek El Dokor, Josh King, James E. Holmes, Jordan Cluster
  • Patent number: 10551408
    Abstract: A motion detecting device including an input section to which acceleration signals that express accelerations of the respective axes of three axes of a three-dimensional orthogonal coordinate system are respectively inputted; and a motion detecting section that sets each of plural axes, including two different axes selected from among the three axes, as a designated axis in a predetermined order, and outputs a motion detection signal if it is judged that the directions of motion detected on the basis of the acceleration signals of the respective axes inputted to the input section are the directions of the respective designated axes that have been set.
    Type: Grant
    Filed: May 18, 2017
    Date of Patent: February 4, 2020
    Assignee: LAPIS SEMICONDUCTOR CO., LTD.
    Inventor: Junpei Sato
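    Example sketch: a minimal, hypothetical Python reading of this abstract, checking that the motion directions inferred from the acceleration signals match a predetermined sequence of designated axes; the axis names, sign convention, and threshold are assumptions.

```python
# Illustrative check that motion directions match a predetermined sequence of designated axes.
ACCEL_THRESHOLD = 0.5  # g; minimum magnitude treated as motion along an axis

def dominant_direction(sample):
    """sample: dict of axis -> acceleration; returns e.g. '+x' or None."""
    axis, value = max(sample.items(), key=lambda kv: abs(kv[1]))
    if abs(value) < ACCEL_THRESHOLD:
        return None
    return ("+" if value > 0 else "-") + axis

def motion_detected(samples, designated_axes):
    """True if successive samples move along the designated axes in the set order."""
    directions = [dominant_direction(s) for s in samples]
    return directions == designated_axes

samples = [{"x": 0.9, "y": 0.1, "z": 0.0}, {"x": 0.0, "y": -0.8, "z": 0.1}]
print(motion_detected(samples, ["+x", "-y"]))  # True -> output a motion detection signal
```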
  • Patent number: 10545580
    Abstract: The embodiment of the present application discloses a 3D interaction method, device, computer equipment and storage medium. The method comprises the steps of acquiring motion trail data of a gesture; identifying hand joint position information of the gesture based on the motion trail data; and identifying, based on stay or movement of the hand joint position information within a preset time, a corresponding trigger gesture so as to realize an interaction operation in an interactive interface, wherein the gesture is within the acquisition scope of non-contact gesture control equipment. Thus, the usability of non-contact gesture recognition is improved and the misjudgment rate is reduced, with higher operational fluency, better compatibility and improved user experience.
    Type: Grant
    Filed: April 18, 2018
    Date of Patent: January 28, 2020
    Assignee: SHENZHEN STARFIELD INFORMATION TECHNOLOGIES CO., LTD.
    Inventors: Yantao Yang, Ting Zhang, Junqing Fang, Ting Cao
  • Patent number: 10546310
    Abstract: Embodiments of the invention relate generally to systems, methods, and apparatus for assessing consumer perception of business features, such as brands, products, and services. A graphical user interface presents a consumer with a prime associated with the business feature. The graphical user interface presents a target to be sorted by the consumer. An instruction from the consumer is received (via a user input device) to sort the target into a bin presented on the graphical user interface. Consumer response data associated with the instruction from the consumer is generated and, based on the consumer response data, the consumer's perception of the business feature is assessed.
    Type: Grant
    Filed: June 12, 2018
    Date of Patent: January 28, 2020
    Assignee: Sentient Decision Science, Inc.
    Inventors: Aaron Ashley Reid, Clinton Lee Taylor
  • Patent number: 10540647
    Abstract: A user terminal supporting mobile payment service is provided. The user terminal includes a display, a memory in which a payment application is stored, and a processor configured to run the payment application. If at least one specified user input occurs on the display while in a locked state, the processor runs the payment application without unlocking the locked state. Thus the payment application may be quickly launched from the locked state.
    Type: Grant
    Filed: February 21, 2019
    Date of Patent: January 21, 2020
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Yong Man Park, Han Na Kim, Mi Yeon Park, You Bi Seo, Hwa Youn Suh, Sae Ah Oh, Byung In Yu
  • Patent number: 10540039
    Abstract: A method is performed at an electronic device with a display, a touch-sensitive surface, and one or more sensors to detect intensity of contacts. The method includes displaying, on the display, a user interface for an application; detecting an edge input that includes detecting a change in a characteristic intensity of a contact proximate to an edge of the touch-sensitive surface; and, in response to detecting the edge input: in accordance with a determination that the edge input meets system-gesture criteria, performing an operation that is independent of the application, wherein: the system-gesture criteria include intensity criteria; the system-gesture criteria include a location criterion that is met when the intensity criteria for the contact are met while the contact is within a first region relative to the touch-sensitive surface; and the first region relative to the touch-sensitive surface is determined based on one or more characteristics of the contact.
    Type: Grant
    Filed: October 6, 2018
    Date of Patent: January 21, 2020
    Assignee: P4TENTS1, LLC
    Inventor: Michael S Smith
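    Example sketch: a minimal, hypothetical Python version of the system-gesture test this abstract describes, combining an intensity criterion with a first region near the edge whose extent depends on a characteristic of the contact; the threshold, region formula, and field names are assumptions.

```python
# Illustrative system-gesture test: intensity criteria must be met while the
# contact is inside an edge region whose width depends on the contact itself.
INTENSITY_THRESHOLD = 0.6          # normalized characteristic intensity

def edge_region_width(contact_size):
    # Wider region for larger (e.g. flatter) contacts -- an assumed characteristic.
    return 20 + 10 * contact_size   # pixels from the edge

def is_system_gesture(contact):
    """contact: dict with 'x', 'intensity', 'size'; the edge is assumed at x == 0."""
    in_first_region = contact["x"] <= edge_region_width(contact["size"])
    return contact["intensity"] >= INTENSITY_THRESHOLD and in_first_region

touch = {"x": 12, "intensity": 0.8, "size": 1.0}
print("system operation" if is_system_gesture(touch) else "pass to application")
```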
  • Patent number: 10534502
    Abstract: A device, method, and graphical user interface for positioning a selection and selecting text on a mobile computing device with a touch-sensitive display are described. This includes: displaying a selection having a selection start point and a selection end point within text content; displaying a control icon; detecting a contact on the touch-sensitive display; and in response to detecting a change in a horizontal and vertical position of the contact beginning anywhere on the control icon: changing a selection position, wherein a horizontal position of the selection start point is changed by an amount proportional to the change in the horizontal position of the contact and a vertical position of the selection start point is changed by an amount proportional to the change in the vertical position of the contact, and wherein the horizontal position of the selection start point with respect to the control icon is changed and a vertical position of the control icon is changed.
    Type: Grant
    Filed: February 10, 2016
    Date of Patent: January 14, 2020
    Inventor: David Graham Boyers
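    Example sketch: a minimal, hypothetical Python illustration of the proportional mapping this abstract describes, moving the selection start point by amounts proportional to the contact's horizontal and vertical movement; the gain factors and coordinates are assumptions.

```python
# Illustrative proportional mapping from contact movement to selection-start movement.
GAIN_X, GAIN_Y = 1.0, 1.0   # assumed proportionality constants

def move_selection_start(start, contact_from, contact_to):
    dx = contact_to[0] - contact_from[0]
    dy = contact_to[1] - contact_from[1]
    return (start[0] + GAIN_X * dx, start[1] + GAIN_Y * dy)

selection_start = (120, 300)    # pixel position of the selection start point in the text
new_start = move_selection_start(selection_start, contact_from=(40, 500), contact_to=(55, 470))
print(new_start)   # (135.0, 270.0): moved by the same horizontal and vertical deltas
```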
  • Patent number: 10534431
    Abstract: A system including: a first sensor module having an inertial measurement unit and attached to a palm of a hand of a user; a second sensor module having an inertial measurement unit and attached to a first bone of a finger (e.g., a middle or proximal phalange bone) on the palm; and a computing device coupled to the first sensor module and the second sensor module to calculate, based on the orientation of the palm and the orientation of the first bone, orientations of the second bones of the finger (e.g., a distal or proximal phalange bone, a metacarpal bone of the thumb) that have no separately attached inertial measurement unit, according to a predetermined ratio of rotation from a reference orientation along a same axis of rotation.
    Type: Grant
    Filed: October 24, 2017
    Date of Patent: January 14, 2020
    Assignee: FINCH TECHNOLOGIES LTD.
    Inventors: Viktor Vladimirovich Erivantcev, Rustam Rafikovich Kulchurin, Alexander Sergeevich Lobanov, Iakov Evgenevich Sergeev, Alexey Ivanovich Kartashov
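    Example sketch: a hypothetical, one-axis Python simplification of the idea in this abstract, estimating an unmeasured bone's orientation from the measured palm-to-bone rotation scaled by a predetermined ratio; the angle representation and ratio value are assumptions (the patent works with full orientations, not a single angle).

```python
# Illustrative estimate of an unmeasured phalange orientation from a fixed
# rotation ratio along a shared axis (angles in degrees; the ratio is assumed).
RATIO_DISTAL_TO_MIDDLE = 0.7   # assumed fraction of the measured bone's flexion

def estimate_unmeasured_bone(palm_angle, measured_bone_angle, ratio=RATIO_DISTAL_TO_MIDDLE):
    """The measured bone's rotation relative to the palm, scaled by the ratio,
    gives the unmeasured bone's additional rotation about the same axis."""
    relative_rotation = measured_bone_angle - palm_angle
    return measured_bone_angle + ratio * relative_rotation

palm, middle_phalange = 0.0, 40.0            # palm flat, middle bone flexed 40 degrees
print(estimate_unmeasured_bone(palm, middle_phalange))   # 68.0 -> estimated distal flexion
```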
  • Patent number: 10528786
    Abstract: An electronic apparatus is provided. The electronic device includes a display, a fingerprint sensor disposed under the display, and a processor electrically connected with the display and the fingerprint sensor. The processor is configured to set a fingerprint sensing region at a location, which corresponds to a location where the fingerprint sensor is disposed, on the display if the electronic apparatus enters a state for registering a fingerprint of a finger, to display a first guide on the display such that the first guide at least partly overlaps the fingerprint sensing region, to obtain first fingerprint information through the fingerprint sensing region if the finger touches the first guide, to display a second guide on the display such that the second guide at least partly overlaps the fingerprint sensing region, and to obtain second fingerprint information through the fingerprint sensing region if the finger touches the second guide.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: January 7, 2020
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Hee Kuk Lee, Dae Kyu Shin, Hyeong Wook Yang, Yu Min Jung, Pil Joo Yoon, Hae Dong Lee
  • Patent number: 10521101
    Abstract: A computing device is described which has a sensor operable to receive user input associated with a display area. The computing device has a renderer operable to render a content item to the display area, the content item having a length and width. The computing device has a processor operable to detect when the user input comprises a scroll mode action and to trigger, in response to the scroll mode action, a scroll mode in which a dimension of the display area is mapped to the length or the width of the content. The scroll mode action comprises a swipe from an edge of the display area in a direction towards the center of the display area.
    Type: Grant
    Filed: February 9, 2016
    Date of Patent: December 31, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Joseph Benjamin Phillips
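    Example sketch: a minimal, hypothetical Python illustration of the two pieces this abstract describes, detecting the edge-to-center swipe that triggers scroll mode and then mapping a display dimension linearly onto the content length; the margin, fraction, and dimensions are assumptions.

```python
# Illustrative scroll-mode mapping: the display height is mapped linearly onto the
# full content length so a touch position picks a scroll offset.
def scroll_offset(touch_y, display_height, content_length):
    fraction = min(max(touch_y / display_height, 0.0), 1.0)
    return fraction * content_length

def is_scroll_mode_action(start_x, end_x, display_width, edge_margin=10):
    """A swipe from the edge of the display area toward its center."""
    return start_x <= edge_margin and end_x > display_width * 0.25

print(is_scroll_mode_action(start_x=2, end_x=300, display_width=1080))         # True
print(scroll_offset(touch_y=540, display_height=1080, content_length=20000))   # 10000.0
```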
  • Patent number: 10515484
    Abstract: A system configured to facilitate interactions with virtual content in an interactive space may comprise one or more of a hand tracking device, a light source, an optical element, one or more physical processors, non-transitory electronic storage, and/or other components. The hand tracking device may be configured to generate output signals conveying ranges of surfaces and/or objects present in the real world. A hand of a user may be identified and tracked through three-dimensional space. A virtual cursor may be presented as positionally linked with the hand. The virtual cursor may be modified to reflect change in hand pose. As the hand comes close to virtual objects, an individual virtual object may be modified to reflect which virtual object is closest to the hand.
    Type: Grant
    Filed: October 20, 2017
    Date of Patent: December 24, 2019
    Assignee: Meta View, Inc.
    Inventors: Benjamin Lucas, Michael Stein, Mayan Shay May-Raz, Kharis O'Connell
  • Patent number: 10515625
    Abstract: Multi-modal natural language processing systems are provided. Some systems are context-aware systems that use multi-modal data to improve the accuracy of natural language understanding as it is applied to spoken language input. Machine learning architectures are provided that jointly model spoken language input (“utterances”) and information displayed on a visual display (“on-screen information”). Such machine learning architectures can improve upon, and solve problems inherent in, existing spoken language understanding systems that operate in multi-modal contexts.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: December 24, 2019
    Assignee: Amazon Technologies, Inc.
    Inventors: Angeliki Metallinou, Rahul Goel, Vishal Ishwar
  • Patent number: 10509659
    Abstract: A system tasked with processing inputs and generating outputs, such as a transaction processor, might have many users. Updates to a particular user's use case can require an update to the code of the system. Examples of this disclosure provide a mechanism by which configuration can be employed to update output logic rather than code, which can facilitate updating and roll-out.
    Type: Grant
    Filed: September 28, 2016
    Date of Patent: December 17, 2019
    Assignee: Amazon Technologies, Inc.
    Inventors: Douglas Dance, Mark Hjelm, Yu Liu, Vivek Mehta
  • Patent number: 10509538
    Abstract: An example information processing apparatus for selectively executing application programs includes a camera operable in a photographing-enabled state of the information processing apparatus; a touchscreen; and processing circuitry in communication with the camera and the touchscreen. The processing circuitry is configured to control the information processing apparatus to provide a selection screen allowing for input of a first selection for activating the photographing-enabled state of the information processing apparatus and of a second selection for proceeding to display of a main menu of the information processing apparatus. The main menu comprises scrollable touch images for launching respective application programs, and the processing circuitry controls the information processing apparatus to return to the display of the main menu after execution of a launched application program is terminated.
    Type: Grant
    Filed: August 8, 2015
    Date of Patent: December 17, 2019
    Assignee: NINTENDO CO., LTD.
    Inventors: Yoshihiro Matsushima, Yuki Onozawa
  • Patent number: 10500600
    Abstract: An aerosol delivery device is provided that includes a housing, motion sensor and microprocessor. The motion sensor is within the housing and configured to detect a defined motion of the aerosol delivery device caused by user interaction with the housing to perform a gesture. The motion sensor may be configured to convert the defined motion to an electrical signal. The microprocessor or motion sensor, then, may be configured to receive the electrical signal, recognize the gesture and an operation associated with the gesture based on the electrical signal, and control at least one functional element of the aerosol delivery device to perform the operation.
    Type: Grant
    Filed: December 9, 2014
    Date of Patent: December 10, 2019
    Assignee: RAI Strategic Holdings, Inc.
    Inventors: Raymond Charles Henry, Jr., Wilson Christopher Lamb, Mark Randall Stone, Glen Joseph Kimsey, Frederic Philippe Ampolini
  • Patent number: 10496808
    Abstract: The present disclosure generally relates to managing access to credentials. In some examples, an electronic device authorizes release of credentials for use in an operation for which authorization is required. In some examples, an electronic device causes display of one or more steps to be taken to enable an input device for user input. In some examples, an electronic device disambiguates between commands to change the account that is actively logged-in on the device and commands to cause credentials to be released from the secure element.
    Type: Grant
    Filed: October 12, 2017
    Date of Patent: December 3, 2019
    Assignee: APPLE INC.
    Inventors: Marcel Van Os, Peter D. Anton, Patrick L. Coffman, Elizabeth Caroline Furches Cranfill, Raymond S. Sepulveda, Chun Kin Minor Wong
  • Patent number: 10488922
    Abstract: A device may be configured to provide a graphical user interface that is specifically designed for use in a person's non-foveal vision. Via a graphical user interface for non-foveal vision, a user may interact with the device without focusing his or her foveal vision on a touchscreen of the device. Thus, the user may operate the device and its applications entirely and exclusively without using his or her foveal vision. For example, the user may operate the device exclusively using his or her peripheral vision or using no vision whatsoever.
    Type: Grant
    Filed: October 24, 2017
    Date of Patent: November 26, 2019
    Assignee: Drivemode, Inc.
    Inventors: Yokichi Koga, Jeffrey Allen Standard, Masato Miura
  • Patent number: 10481682
    Abstract: An electronic system generates at a display virtual writing corresponding to tracked motion of the tip of a pointer with respect to a surface based on proximity of the tip of the pointer to the surface and the gaze of a user's eye. The electronic system determines the location and motion of the tip of the pointer with respect to the surface based on images captured by scene cameras, and determines the focus and gaze direction of the user's eye based on images captured by a user-facing camera. By generating virtual writing at the display corresponding to tracked motion of the tip of the pointer based on proximity of the tip of the pointer to the surface and based on the focus and gaze direction of the user's eye, the electronic system can enable virtual writing and associated collaboration services without the need for a specialized writing surface or pointer.
    Type: Grant
    Filed: March 29, 2017
    Date of Patent: November 19, 2019
    Assignee: GOOGLE LLC
    Inventors: Lewis James Marggraff, Nelson G. Publicover, Spencer James Connaughton
  • Patent number: 10484501
    Abstract: A system for dynamic profile and persona management, comprising a profile and persona management server that receives device event information from a user device, and compares the event information to a feature bundle, the feature bundle corresponding to a set of feature configurations, and directs the operation of connected user services of a telephony control system, whether for the user device, for another subscribed user device, or in the cloud, based on the feature configurations. The system also comprises a database for storing feature configurations, feature configuration bundles and feature policies made up of multiple feature bundles.
    Type: Grant
    Filed: August 28, 2017
    Date of Patent: November 19, 2019
    Assignee: Broadsource Group Pty Ltd
    Inventors: Haydn Faltyn, Michael Gliana
  • Patent number: 10471301
    Abstract: A system for 3D online sports athletics includes a cloud server and several head-mounted devices, wherein each of the head-mounted devices is paired with an intelligent terminal having a step counting function. Each head-mounted device includes a 3D scene virtualizing unit that virtualizes a 3D online sports athletics scene to display the virtualized objects of all the head-mounted device users; a wirelessly connecting unit that wirelessly connects to the intelligent terminal that it is paired with and to the cloud server; and a processing unit that acquires step counting data of the intelligent terminal, uploads the step counting data to the cloud server, acquires the step counting data of other head-mounted devices from the cloud server, and drives the 3D scene virtualizing unit to display online the step counting data of each of the head-mounted device users in its 3D online running athletics scene, thereby realizing sports athletics in which a plurality of persons are simultaneously online.
    Type: Grant
    Filed: November 4, 2016
    Date of Patent: November 12, 2019
    Assignee: Beijing Pico Technology Co., Ltd.
    Inventors: Xiangchen Yang, Kai Liu
  • Patent number: 10466799
    Abstract: There is provided an information processing apparatus, information processing method, and program which can control display in a display region while preventing the visibility of display in the display region from deteriorating, the information processing apparatus including: a detection unit configured to detect a manipulation of displaying a display object related to an application or changing a display range of the display object in a display region; a determination unit configured to determine a manipulating user who has performed the manipulation when the manipulation has been detected; and a control unit configured to select a display object to be controlled and displayed based on information related to the manipulating user.
    Type: Grant
    Filed: August 25, 2015
    Date of Patent: November 5, 2019
    Assignee: SONY CORPORATION
    Inventor: Yuuji Takimoto
  • Patent number: 10469802
    Abstract: Embodiments disclosed herein provide systems, methods, and computer readable media for detecting disturbances in a media stream from a participant on a communication. In a particular embodiment, a method provides identifying disturbance criteria defining a plurality of audible disturbances, a plurality of visual disturbances, and a plurality of communication disturbances. The method further provides identifying one or more audible disturbances from an audio component of the media stream based on predefined disturbance criteria and identifying one or more visual disturbances from a video component of the media stream based on the disturbance criteria. Additionally, the method provides correlating the audible disturbances with the visual disturbances to determine one or more combined disturbances for the participant based on the disturbance criteria, wherein each of the combined disturbances comprises at least one of the audible disturbances and at least one of the visual disturbances.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: November 5, 2019
    Assignee: Avaya Inc.
    Inventors: John F. Buford, Mehmet C. Balasaygun, Keith Ponting, Wendy Holmes
  • Patent number: 10466796
    Abstract: A method comprising determining occurrence of a software event on a wrist worn apparatus, determining a notification based, at least in part, on the software event such that the notification comprises information that signifies the software event, causing rendering of the notification, determining that the wrist worn apparatus has been tilted from a user-facing direction to a non-user-facing direction, determining that the wrist worn apparatus has been tilted from the non-user-facing direction to another user-facing direction within a notification tilt actuation threshold duration from determining that the wrist worn apparatus has been tilted from the user-facing direction to the non-user-facing direction, and causing performance of at least one operation associated with the software event is disclosed.
    Type: Grant
    Filed: February 9, 2015
    Date of Patent: November 5, 2019
    Assignee: Nokia Technologies Oy
    Inventor: Apaar Tuli
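    Example sketch: a minimal, hypothetical Python illustration of the tilt-back actuation this abstract describes, where the device is tilted away from the user and back to a user-facing direction within a threshold duration; the threshold value and event representation are assumptions.

```python
# Illustrative tilt-back detection: the wrist device is tilted from user-facing to
# non-user-facing and back within a threshold duration.
NOTIFICATION_TILT_THRESHOLD_S = 2.0

def should_perform_operation(tilt_events):
    """tilt_events: list of (time_s, facing) with facing 'user' or 'away', in order."""
    away_time = None
    for t, facing in tilt_events:
        if facing == "away":
            away_time = t
        elif facing == "user" and away_time is not None:
            return (t - away_time) <= NOTIFICATION_TILT_THRESHOLD_S
    return False

events = [(0.0, "user"), (1.0, "away"), (2.2, "user")]
print(should_perform_operation(events))   # True -> act on the notified software event
```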
  • Patent number: 10460481
    Abstract: Shape building within a digital medium environment is described. In an implementation, a relationship is detected of a user input as drawn within a user interface with respect to at least one shape displayed within the user interface. Based on this relationship, a shape building operation is identified from a plurality of shape building operations. The relationship is also used to determine at least one shape that is to be the subject of the identified shape building operation. From this, the identified shape building operation is performed on the identified at least one shape, and a result of the performance of the shape building operation is output in the user interface.
    Type: Grant
    Filed: November 11, 2016
    Date of Patent: October 29, 2019
    Assignee: Adobe Inc.
    Inventors: Avadhesh Kumar Sharma, Ankit Phogat, Akhil Jindal
  • Patent number: 10459527
    Abstract: Techniques for a notebook hinge sensor are described. For example, a computing device may comprise a housing having a processor circuit and an input device, the input device arranged on a side of the housing, a lid having a digital display arranged on a side of the lid, a hinge arranged to couple the housing and the lid, and a sensor module coupled to the processor circuit, the sensor module arranged inside the hinge and operative to capture motion input outside of the computing device.
    Type: Grant
    Filed: January 9, 2018
    Date of Patent: October 29, 2019
    Assignee: Intel Corporation
    Inventor: James M. Okuley
  • Patent number: 10459579
    Abstract: A portable electronic device comprises means 4 for transmitting ultrasonic signals 8, means 6 for receiving ultrasonic signals 12 reflected from an input object 10, and means for processing received ultrasonic signals to determine an input to the device. The device is configured to reduce the transmission power of the ultrasonic signals transmitted by the transmitting means in the event that: the device determines that a reflection 16 from an object 14 other than the input object meets a predetermined criterion; or if an event is detected which indicates that the user is or will be using a function of the device which does not support touchless interaction.
    Type: Grant
    Filed: June 12, 2012
    Date of Patent: October 29, 2019
    Assignee: ELLIPTIC LABORATORIES AS
    Inventors: Tobias Gulden Dahl, Cato Syversrud
  • Patent number: 10452228
    Abstract: A method and a device for displaying information and for operating an electronic device in a vehicle, wherein a plurality of list elements are displayed on a display surface with a touch-sensitive surface; a contact of a list element on the touch-sensitive surface with an actuating object or an approach of the actuating object to the list element is detected, whereupon the list element is activated, and after the activation of the list element, a first swiping movement of the actuating object on or in the vicinity of the touch-sensitive surface is detected, and with or after the detection of the first swiping movement of the actuating object, at least the list elements which are not activated are removed from the display surface and a multimedia additional element associated with the activated list element is inserted into the display surface.
    Type: Grant
    Filed: September 11, 2012
    Date of Patent: October 22, 2019
    Assignee: VOLKSWAGEN AG
    Inventors: Mathias Kuhn, Jan Michaelis, Alexander Hahn
  • Patent number: 10437449
    Abstract: The system comprises the ability to detect certain gestures made by sliding a finger or stylus on a touch-sensitive screen on a handheld device, even when a so-called "screen lock" is active, where the gesture is used to unlock the device and trigger the desired function associated with the gesture.
    Type: Grant
    Filed: July 20, 2015
    Date of Patent: October 8, 2019
    Assignee: BlackBerry Limited
    Inventor: Karl-Anders Johansson
  • Patent number: 10437882
    Abstract: Methods and devices for initiating a search of an object are disclosed. In one embodiment, a method is disclosed that includes receiving video data recorded by a camera on a wearable computing device, where the video data comprises at least a first frame and a second frame. The method further includes, based on the video data, detecting an area in the first frame that is at least partially bounded by a pointing device and, based on the video data, detecting in the second frame that the area is at least partially occluded by the pointing device. The method still further includes initiating a search on the area.
    Type: Grant
    Filed: May 7, 2015
    Date of Patent: October 8, 2019
    Assignee: Google LLC
    Inventors: Thad Eugene Starner, Irfan Essa, Hayes Solos Raffle, Daniel Aminzade
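    Example sketch: a minimal, hypothetical Python illustration of the two-frame trigger this abstract describes, where an area is first partially bounded by a pointing device and then partially occluded by it; rectangles stand in for the detected regions, and the occlusion fraction is an assumption.

```python
# Illustrative two-frame trigger: an area bounded by a pointing device in frame 1
# and then partially occluded by it in frame 2 initiates a search.
def overlap(a, b):
    """Intersection area of two (x0, y0, x1, y1) rectangles."""
    w = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    h = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    return w * h

def area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def should_search(bounded_area, pointer_frame2, occlusion_fraction=0.3):
    return overlap(bounded_area, pointer_frame2) >= occlusion_fraction * area(bounded_area)

region = (100, 100, 200, 200)       # area the pointer traced around in the first frame
pointer = (120, 120, 220, 220)      # pointer's extent in the second frame
if should_search(region, pointer):
    print("initiate search on", region)
```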
  • Patent number: 10437442
    Abstract: Embodiments of the present invention relate to the field of terminal application technologies, and provide a method, an apparatus, and a terminal for processing notification information. Therefore, an efficient operation manner for processing notification information is provided, which can simplify operation steps and improve usability and controllability of a device. The method is: after a user display operation is detected, and a display instruction corresponding to the user display operation is triggered, determining a target application program corresponding to notification information; and when it is determined that a screen display status meets a condition, adjusting, according to a preset adjustment policy corresponding to the display instruction, a window corresponding to the target application program. The embodiments of the present invention are used to process notification information of an intelligent terminal device.
    Type: Grant
    Filed: January 4, 2015
    Date of Patent: October 8, 2019
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Hao Jing, Dian Fu, Yahui Wang, Xiaojuan Li
  • Patent number: 10440346
    Abstract: Disclosed is a medical video display system with improved operability in manipulation regarding display of medical videos, which includes a changeover device through which one, or two or more, surgical videos are entered; and a work station and a distribution device control unit that distribute one video entered through the changeover device and display it on each of a touch panel and a monitor, wherein the work station allows the touch panel, having the video displayed thereon, to display an operation screen or an operation icon for accepting an operation directed to the monitor.
    Type: Grant
    Filed: September 21, 2017
    Date of Patent: October 8, 2019
    Assignee: MEDI PLUS INC.
    Inventors: Naoya Sugano, Minsu Kwon
  • Patent number: 10430924
    Abstract: In general, embodiments of the invention relate to generating thumbnails representing files stored in a database or other repository. More specifically, embodiments of the invention are directed to creating snapshots of file content and using the snapshots to generate a thumbnail representation for each file. In addition, embodiments of the invention are directed to resizing thumbnails having thumbnail content from file snapshots, annotating on the thumbnails using an interactive device, and later synchronizing the annotations with the file content.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: October 1, 2019
    Assignee: QuirkLogic, Inc.
    Inventors: Alfonso Fabian de la Fuente, Michael Howatt Mabey, Nashirali Samanani
  • Patent number: 10427033
    Abstract: A gaming apparatus includes a position control section and a display control section. The position control section controls a position of an object arranged in a virtual 3D space based on a relative position between an input apparatus used by a user wearing a head-mounted display and the head-mounted display. The display control section generates an image in the virtual 3D space including the object and displays the image on the head-mounted display. When a distance between the input apparatus and the head-mounted display is equal to a first distance or more, the position control section linearly changes the object position in response to a change in position of the input apparatus.
    Type: Grant
    Filed: November 25, 2016
    Date of Patent: October 1, 2019
    Assignee: Sony Interactive Entertainment Inc.
    Inventors: Tomokazu Kake, Takayuki Ishida, Akira Suzuki, Yasuhiro Watari
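    Example sketch: a minimal, hypothetical Python illustration of the distance-gated position control this abstract describes, where the virtual object tracks the input apparatus linearly once the input apparatus is at least a first distance from the head-mounted display; the distance value, gain, and near-range behavior are assumptions.

```python
# Illustrative position control: beyond a first distance from the head-mounted
# display, the virtual object tracks the input apparatus linearly.
FIRST_DISTANCE_M = 0.3
LINEAR_GAIN = 1.0

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def object_position(prev_object_pos, controller_pos, controller_prev_pos, hmd_pos):
    if distance(controller_pos, hmd_pos) >= FIRST_DISTANCE_M:
        delta = [LINEAR_GAIN * (c - p) for c, p in zip(controller_pos, controller_prev_pos)]
        return [o + d for o, d in zip(prev_object_pos, delta)]
    return prev_object_pos   # assumed behavior inside the first distance

print(object_position([0, 1, 2], [0.5, 1.0, 0.2], [0.4, 1.0, 0.2], [0, 1.6, 0]))
```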
  • Patent number: 10429939
    Abstract: An apparatus for projecting an image, according to one embodiment of the present invention, projects an image, detects a user gesture operation in an area between an image screen area, where an image screen corresponding to the projected image is displayed, and the apparatus for projecting an image, and carries out a control operation corresponding to the detected gesture operation. The operation of detecting the user gesture operation includes distinguishing, into a plurality of areas, the area between the image screen area and the apparatus for projecting an image, and detecting the user gesture operation with respect to the distinguished plurality of areas.
    Type: Grant
    Filed: September 3, 2014
    Date of Patent: October 1, 2019
    Assignee: LG ELECTRONICS INC.
    Inventors: Kyoungjoung Kim, Hyunhee You, Yeonjoo Joo, Sooji Yeom
  • Patent number: 10423282
    Abstract: A projector includes a determining unit configured to determine whether the distance between an operation device and a distance measuring unit configured to measure a distance to the operation device is a first threshold or less, a switching unit configured to switch, when a mode of the projector is a first mode, the mode from the first mode to a second mode when it is determined that the distance between the operation device and the distance measuring unit is the first threshold or less, a detecting unit configured to detect the position indicated by the operation device, and a processing unit configured to, when indication of a first position on the screen is detected, perform first processing corresponding to the first position when the mode is the first mode and perform second processing different from the first processing corresponding to the first position when the mode is the second mode.
    Type: Grant
    Filed: February 29, 2016
    Date of Patent: September 24, 2019
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Toshiki Fujimori
  • Patent number: 10416874
    Abstract: Methods and apparatuses are provided for processing interface displays. The disclosed methods and apparatuses may detect a gesture operation on a current interface of a computing device. The current interface is pre-divided into a first region and a second region. The first region is configured to move in accordance with the detected gesture operation. The disclosed methods and apparatuses may switch from the current interface to a new interface that includes content of the first region when the gesture operation is determined to be corresponding to a predetermined gesture.
    Type: Grant
    Filed: March 23, 2016
    Date of Patent: September 17, 2019
    Assignee: Guangzhou UCWeb Computer Technology Co., Ltd.
    Inventors: Liang Rao, Yaohang Xiao
  • Patent number: 10417991
    Abstract: A system for modifying a user interface in a multi-display device environment described herein can include a processor and a memory storing instructions that cause the processor to detect a number of display screens coupled to the system. The plurality of instructions can also cause the processor to split an image to generate sub-images based on the number of display screens and a bezel size corresponding to each of the display screens, the sub-images to exclude portions of the image corresponding to the bezel size of each of the display screens. Additionally, the plurality of instructions can cause the processor to resize each of the sub-images based on a display size of each of the display screens and display the image by transmitting the sub-images to the display screens.
    Type: Grant
    Filed: August 18, 2017
    Date of Patent: September 17, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Matthias Baer, Bryan K. Mamaril, Kyle T. Kral, Kae-Ling J. Gurr, Ryan Whitaker
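    Example sketch: a minimal, hypothetical Python illustration of the bezel-aware split this abstract describes for two side-by-side displays, computing crop boxes that exclude the image portions corresponding to the inner bezels and then deriving per-display scale factors; the dimensions and the two-display layout are assumptions.

```python
# Illustrative bezel-aware split: compute crop boxes for two side-by-side displays,
# dropping the image columns that would fall "under" the inner bezels, then scale
# each crop to its display.
def split_for_two_displays(img_w, img_h, bezel_px):
    half = img_w // 2
    # Exclude a bezel-wide strip on the inner edge of each half, so content
    # appears continuous across the physical gap between the screens.
    left_box = (0, 0, half - bezel_px, img_h)
    right_box = (half + bezel_px, 0, img_w, img_h)
    return left_box, right_box

def scale_factor(box, display_w, display_h):
    w, h = box[2] - box[0], box[3] - box[1]
    return display_w / w, display_h / h

left, right = split_for_two_displays(img_w=3840, img_h=1080, bezel_px=40)
print(left, right)                      # (0, 0, 1880, 1080) (1960, 0, 3840, 1080)
print(scale_factor(left, 1920, 1080))   # resize each sub-image to its display
```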
  • Patent number: 10403402
    Abstract: A system permits a medical practitioner to interact with medically relevant information during a medical procedure. The system comprises: a projector for projecting a user interface menu image onto a projection surface; a three-dimensional optical imaging system for capturing three-dimensional location information for objects in a sensing volume which includes the projection surface; and a controller connected to receive the three-dimensional location information from the three-dimensional optical imaging system and configured to interpret one or more gestures made by the practitioner in a space between the three-dimensional optical imaging system and the projection surface based on a location of the gesture relative to the projected user interface menu image.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: September 3, 2019
    Assignee: The University of British Columbia
    Inventors: Nima Ziraknejad, Behrang Homayoon, Peter Lawrence, David Ming-Teh Liu
  • Patent number: 10402081
    Abstract: Methods and apparatuses are described for providing a thumb scroll user interface element in a computerized visual environment. A viewing device displays a graphical user interface within a three-dimensional space that includes surface planes each associated with a graphical thumb scroll element. A sensor device captures a location of the user's hand within the three-dimensional space and a gesture of the user's hand. A computing device detects that the location is within a defined zone around one of the graphical thumb scroll elements and activates the thumb scroll element associated with the defined zone. The computing device identifies the gesture, determines a scroll speed based upon a speed of the identified gesture, and determines a scroll direction based upon a direction of the identified gesture. The computing device changes characteristics of graphical content in the surface plane based upon at least one of: the scroll speed or the scroll direction.
    Type: Grant
    Filed: August 28, 2018
    Date of Patent: September 3, 2019
    Assignee: FMR LLC
    Inventors: James Andersen, Adam Schouela, Hangyu Wang
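    Example sketch: a minimal, hypothetical Python illustration of the thumb-scroll handling this abstract describes, activating the element whose defined zone contains the hand and deriving scroll speed and direction from the gesture; the zone geometry, gain, and sample values are assumptions.

```python
# Illustrative thumb-scroll handling: activate the element whose zone contains the
# hand, then derive scroll speed and direction from the gesture.
SPEED_GAIN = 2.0   # content units scrolled per unit of gesture speed

def zone_containing(hand_pos, zones):
    """zones: dict name -> (center, radius); returns the first zone hit, else None."""
    for name, (center, radius) in zones.items():
        if sum((h - c) ** 2 for h, c in zip(hand_pos, center)) ** 0.5 <= radius:
            return name
    return None

def scroll_command(gesture_velocity):
    vx, vy = gesture_velocity
    speed = SPEED_GAIN * (vx * vx + vy * vy) ** 0.5
    direction = "down" if vy > 0 else "up"
    return speed, direction

zones = {"news-pane": ((0.2, 0.5, 1.0), 0.15), "chart-pane": ((0.8, 0.5, 1.0), 0.15)}
active = zone_containing((0.25, 0.48, 1.0), zones)
print(active, scroll_command((0.0, -0.4)))   # 'news-pane' (~0.8, 'up')
```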
  • Patent number: 10402082
    Abstract: A mobile terminal and a method of selecting a lock function thereof are provided. The method of selecting a lock function of a mobile terminal having a touch screen includes: measuring, when at least one touch occurs on the touch screen, at least one of a pressure, a current, and a capacitance of an area of the touch screen in which the touch occurs; and selecting, if at least one of the measured pressure, current, and capacitance is greater than or equal to a preset value, a lock function. Thus, by touching the touch screen in such a manner that a high current or capacitance may be measured in a specific area of the mobile terminal, a user can easily perform a desired function, thereby improving user convenience.
    Type: Grant
    Filed: October 9, 2015
    Date of Patent: September 3, 2019
    Assignee: Samsung Electronics Co., Ltd.
    Inventor: Se Youp Chu
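    Example sketch: a minimal, hypothetical Python illustration of the selection step this abstract describes, comparing the measured pressure, current, and capacitance of a touch against preset values and selecting the lock function when any measurement meets its threshold; the preset values are assumptions.

```python
# Illustrative lock-function selection from measured touch quantities.
PRESET = {"pressure": 2.5, "current": 0.8, "capacitance": 1.2}

def select_lock_function(measurement):
    """measurement: dict with any of the PRESET keys; True selects the lock function."""
    return any(measurement.get(k, 0.0) >= v for k, v in PRESET.items())

touch = {"pressure": 3.1, "capacitance": 0.9}
print("lock function selected" if select_lock_function(touch) else "normal touch")
```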