Gesture-based Patents (Class 715/863)
-
Patent number: 9500524
Abstract: An ICP emission spectrometer is schematically configured to include an inductively coupled plasma generation unit, a light condensing unit, a spectroscope, a two-dimensional detection unit and a controller. The two-dimensional detection unit includes a CCD image sensor which has multiple pixels laid in a planar shape and detects emission light by causing the emission light emitted from the spectroscope to be imaged on the multiple pixels. Then, the controller determines a pixel used in detecting the emission light among the multiple pixels in accordance with an imaging shape of detection-targeted emission light.
Type: Grant
Filed: March 31, 2015
Date of Patent: November 22, 2016
Assignee: Hitachi High-Tech Science Corporation
Inventor: Osamu Matsuzawa
-
Patent number: 9501138
Abstract: A method for operating a real-time gesture based interactive system includes: obtaining a sequence of frames of data from an acquisition system; comparing successive frames of the data for portions that change between frames; determining whether any of the portions that changed are part of an interaction medium detected in the sequence of frames of data; defining a 3D interaction zone relative to an initial position of the part of the interaction medium detected in the sequence of frames of data; tracking a movement of the interaction medium to generate a plurality of 3D positions of the interaction medium; detecting movement of the interaction medium from inside to outside the 3D interaction zone at a boundary 3D position; shifting the 3D interaction zone relative to the boundary 3D position; computing a plurality of 2D positions based on the 3D positions; and supplying the 2D positions to control an application.
Type: Grant
Filed: May 5, 2015
Date of Patent: November 22, 2016
Assignee: Aquifi, Inc.
Inventors: Carlo Dal Mutto, Giuliano Pasqualotto, Giridhar Murali, Michele Stoppa, Amir hossein Khalili, Ahmed Tashrif Kamal, Britta Hummel
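The zone-shifting idea in the abstract above can be sketched in a few lines: keep an axis-aligned box around the hand's initial position, slide the box when the hand crosses its boundary, and normalize in-zone x/y to 2D screen coordinates. This is a minimal illustration, not the patented method; the class and all names are hypothetical.

```python
class InteractionZone:
    """Axis-aligned 3D interaction zone that shifts to follow the hand
    (a hypothetical simplification of the tracking scheme described above)."""

    def __init__(self, center, half_extent):
        self.center = list(center)
        self.half_extent = list(half_extent)

    def contains(self, p):
        # True when point p lies inside the box on every axis.
        return all(abs(pi - ci) <= h
                   for pi, ci, h in zip(p, self.center, self.half_extent))

    def shift_to_boundary(self, p):
        # Slide the zone so the exiting point lands on its surface.
        self.center = [pi - max(-h, min(h, pi - ci))
                       for pi, ci, h in zip(p, self.center, self.half_extent)]

    def to_2d(self, p):
        # Normalize x/y within the zone to [0, 1] "screen" coordinates.
        return tuple(min(1.0, max(0.0, (pi - (ci - h)) / (2 * h)))
                     for pi, ci, h in zip(p[:2], self.center[:2],
                                          self.half_extent[:2]))

zone = InteractionZone(center=[0.0, 0.0, 0.5], half_extent=[0.25, 0.25, 0.25])
p = (0.5, 0.0, 0.5)               # hand moved outside the zone in x
if not zone.contains(p):
    zone.shift_to_boundary(p)     # zone follows the hand
```

After the shift, `p` sits on the zone boundary and maps to the right edge of the normalized 2D range.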
-
Patent number: 9491438
Abstract: Provided is a communication method using a three-dimensional (3D) image display device. In the communication method, motion information is determined using a motion image obtained by photographing a user's motion indicating the user's request in relation to an opposite party, distance information indicating the distance between the user who is moving and the 3D image display device is determined, and then, the user's request is determined based on the motion information and the distance information.
Type: Grant
Filed: February 11, 2013
Date of Patent: November 8, 2016
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Hark-joon Kim, Pil-seung Yang
-
Patent number: 9491501
Abstract: A mobile terminal capable of giving an instruction for a device linkage by simple operations, without any special hardware. The mobile terminal, which is connected to a TV through a communication path, includes: a communication I/F unit for communicating with the TV; a storage unit which stores content to be displayed on the TV; a display unit having a touch pad and a display screen; and a control unit which displays, on the display screen, a content icon associated with the content. When detecting that the content icon is pressed and held and then is flicked on the display unit, the control unit reads the content from the storage unit and transmits it to the TV through the communication I/F unit, so that the content is displayed on the TV.
Type: Grant
Filed: August 14, 2015
Date of Patent: November 8, 2016
Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
Inventor: Shigeki Matsunaga
-
Patent number: 9477370
Abstract: A method of displaying a plurality of pages on a screen of a terminal is provided. The method includes detecting a user's gesture that requests movement of the plurality of pages, identifying a movement mode relating to the movement of the plurality of pages, and moving and displaying a first page displayed on the screen and a second page connected to the first page displayed on the screen according to the identified movement mode, wherein the movement mode is one of a discrete mode and a continuous mode.
Type: Grant
Filed: April 12, 2013
Date of Patent: October 25, 2016
Assignee: Samsung Electronics Co., Ltd.
Inventors: Kwang-won Sun, Kang-tae Kim, Doo-hwan Kim, Eun-young Kim, Chul-joo Kim, Duck-hyun Kim, Jong-won Han
-
Patent number: 9477404
Abstract: An electronic device detects a gesture on a touch-sensitive surface. In response to detecting the gesture on the touch-sensitive surface, when the gesture is a first swipe gesture in a first direction, the device displays at least a list of recent electronic notifications. When the gesture is a second swipe gesture in a second direction distinct from the first direction, the device displays one or more settings icons in a settings panel, wherein the settings panel includes a respective settings icon that, when selected, causes a partially transparent interface to be displayed over the settings panel, wherein the partially transparent interface is at least partially transparent so that at least a portion of the settings panel can be seen through the partially transparent user interface.
Type: Grant
Filed: February 18, 2014
Date of Patent: October 25, 2016
Assignee: APPLE INC.
Inventors: Imran Chaudhri, Lawrence Yang, Alessandro Sabatelli, Brian Schmitt
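The direction-dependent dispatch described in this abstract reduces to a small lookup: one swipe direction opens notifications, a distinct direction opens settings. A minimal sketch, assuming a down/up mapping that the abstract itself does not specify:

```python
def handle_swipe(direction):
    """Dispatch a swipe gesture to a panel, loosely following the
    two-direction scheme above (the down/up mapping is an assumption)."""
    panels = {
        "down": "notifications",  # first swipe direction -> recent notifications
        "up": "settings",         # second, distinct direction -> settings panel
    }
    return panels.get(direction, "ignored")
```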
-
Patent number: 9473612
Abstract: A method, an apparatus, and a computer program are provided. The apparatus comprises: at least one processor; and at least one memory storing computer program code configured, working with the at least one processor, to cause at least the following to be performed: responding to user input by switching a device from an unlocked state to a locked state; and setting a user input process for switching the device from the locked state to the unlocked state, in dependence upon the user input provided to switch the device from the unlocked state to the locked state.
Type: Grant
Filed: November 28, 2012
Date of Patent: October 18, 2016
Assignee: Nokia Technologies Oy
Inventors: Maokun Han, Chaobin Guo, Ziyan Chen, Guangjun Wang
-
Patent number: 9471153
Abstract: The amount of power and processing needed to process gesture input for a computing device can be reduced by utilizing a separate gesture sensor. The gesture sensor can have a form factor similar to that of conventional camera elements, in order to reduce costs by being able to utilize readily available low cost parts, but can have a lower resolution and adjustable virtual shutter such that fast motions can be captured and/or recognized by the device. In some devices, a subset of the pixels of the gesture sensor can be used as a motion detector, enabling the gesture sensor to run in a low power state unless there is likely gesture input to process. Further, at least some of the processing and circuitry can be included with the gesture sensor such that various functionality can be performed without accessing a central processor or system bus, thus further reducing power consumption.
Type: Grant
Filed: June 22, 2015
Date of Patent: October 18, 2016
Assignee: Amazon Technologies, Inc.
Inventor: Volodymyr V. Ivanchenko
-
Patent number: 9471150
Abstract: A first directional touch gesture on a display area and a second directional touch gesture on the display area are received simultaneously, wherein: the first directional touch gesture is on a left side of the display area; the second directional touch gesture is on a right side of the display area; and a first direction associated with the first directional touch gesture is identical to a second direction associated with the second directional touch gesture. Content is scaled within the display area based at least in part on the identical direction associated with the simultaneous directional touch gestures.
Type: Grant
Filed: September 27, 2013
Date of Patent: October 18, 2016
Assignee: EMC Corporation
Inventor: Chandramouli Addaguduru
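The key constraint in this abstract is that scaling happens only when the two simultaneous gestures share an identical direction. A minimal sketch of that gate; the up/down-to-zoom mapping and the step size are assumptions, not taken from the patent:

```python
def scale_factor(left_dir, right_dir, step=0.25):
    """Return a content scale factor for two simultaneous directional
    gestures: scale only when both share a direction, per the abstract
    above (the up/down -> zoom mapping is an assumption)."""
    if left_dir != right_dir:
        return 1.0                              # directions differ: no scaling
    return 1.0 + step if left_dir == "up" else 1.0 - step
```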
-
Patent number: 9467729
Abstract: A method for remote control of a smart TV. With this method, the user remote-controls the smart TV without looking at the remote controller's touchscreen, instead watching the motion of the cursor on the smart TV (the object under control) while manipulating the touchscreen. Therefore, the smart TV can be controlled freely and comfortably, as if it were held in the hand like a smart phone.
Type: Grant
Filed: December 31, 2013
Date of Patent: October 11, 2016
Inventors: Ik Soo Eun, Myung Jin Eun
-
Patent number: 9465450
Abstract: The invention describes a method of controlling a system (1) comprising one or more components (C1, C2, . . . , Cn), which method comprises the steps of aiming a pointing device (2) comprising a camera (3) in the direction of one or more of the components (C1, C2, . . . , Cn), generating image data (4) of a target area (A) aimed at by the pointing device (2), encompassing at least part of one or more of the components (C1, C2, . . . , Cn), and analyzing the image data (4) to determine position information (P) pertaining to the position of the user (5) relative to one or more of the components (C1, C2, . . . , Cn) at which the pointing device (2) is being aimed and/or to relative positions of the components. The system (1) is subsequently controlled according to the position information (P). Furthermore, the invention describes a corresponding control system (10), a home entertainment system, and a lighting system.
Type: Grant
Filed: June 28, 2006
Date of Patent: October 11, 2016
Assignee: Koninklijke Philips N.V.
Inventors: Hubertus Maria Rene Cortenraad, Jan Kneissler
-
Patent number: 9459699
Abstract: A control method and an electronic device are provided in this application. The control method includes: determining a first operating area where an operator performs a gesture operation according to sensing information obtained by the at least two sensing units, where sensing information from various spatial areas is obtained by the at least two sensing units, and the first operating area includes at least one spatial area of the various spatial areas; and generating and executing a first control instruction corresponding to the first operating area based on at least a configuration relationship between an operating area and an instruction, where the first control instruction is different from a second control instruction which is generated when the operator performs a gesture operation in a second operating area.
Type: Grant
Filed: December 29, 2014
Date of Patent: October 4, 2016
Assignees: Beijing Lenovo Software Ltd., Lenovo (Beijing) Co., Ltd.
Inventor: Li Su
-
Patent number: 9459794
Abstract: In some examples, a stylus may be used to make a first input including a first shape at a first location on a display. The first location may correspond to a location of a graphic element presented on the display, and a mark consistent with the shape may be presented on the display, such as overlying the graphic element. Additionally, the stylus may be used to make a second input including a second shape. For example, the second shape may correspond to a command for performing an operation. In response to the entry of the first shape and the second shape, the operation may be performed on or with respect to the graphic element. In some instances, various different shapes and combinations of shapes may be mapped to various different operations, designations, destinations, and so forth.
Type: Grant
Filed: March 24, 2014
Date of Patent: October 4, 2016
Assignee: Amazon Technologies, Inc.
Inventor: Orry Wijanarko Soegiono
-
Patent number: 9459789
Abstract: There is provided an information processing apparatus with a highly convenient user interface. A reference speed is set according to an amount of movement or a movement time period of a pointer of a stylus or a finger. It is determined, based on a movement speed of the pointer and the reference speed, that a flick operation with the pointer has occurred.
Type: Grant
Filed: October 28, 2011
Date of Patent: October 4, 2016
Assignee: Canon Kabushiki Kaisha
Inventor: Keiichi Yamamoto
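The flick test described above compares the pointer's release speed against a reference derived from the movement itself. A minimal sketch under stated assumptions: the scale factor `k` and floor `min_ref` are invented constants, and the exact formula for the reference speed is not given by the abstract.

```python
def is_flick(distance, duration, speed, k=2.0, min_ref=0.05):
    """Flick heuristic: derive a reference speed from the movement's
    distance and duration, then flag a flick when the observed speed
    exceeds it (constants k and min_ref are assumptions)."""
    reference = max(min_ref, k * distance / duration)
    return speed > reference
```

A short drag released at high speed registers as a flick; the same drag released slowly does not.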
-
Patent number: 9459697
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.Type: Grant
Filed: January 15, 2014
Date of Patent: October 4, 2016
Assignee: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian
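The ratio-based mapping in this abstract (actual gesture distance vs. on-screen movement scale) amounts to one multiplication. A minimal sketch; the function name and the mm/px units are assumptions:

```python
def display_delta(gesture_delta_mm, gesture_scale_mm, display_scale_px):
    """Map an actual gesture distance to an on-screen movement using the
    ratio between the identified gesture scale and the display scale,
    as sketched in the abstract above (units are assumptions)."""
    return gesture_delta_mm * (display_scale_px / gesture_scale_mm)
```

For example, with a 200 mm gesture range driving a 1000 px display range, a 50 mm hand movement maps to 250 px of cursor travel.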
-
Patent number: 9454220
Abstract: In one exemplary embodiment, a method includes the step of obtaining a digital image of an object with a digital camera. The object is identified. A user query is received. The method includes the step of matching, with at least one processor, the user query with at least one manipulatable portion of a virtual view of the digital image of the object. The method includes the step of obtaining the at least one manipulatable portion from a database of manipulatable portions of the object. The method includes the step of integrating the at least one manipulatable portion with the virtual view of the digital image of the object, wherein a manipulatable portion comprises a region of an augmented-reality element integrated into the virtual view of the digital image of the object, and wherein the augmented-reality element comprises a hyperlink to another augmented-reality element comprising a virtual interior view of the object, and wherein the camera obtains the predefined user gesture.
Type: Grant
Filed: January 23, 2014
Date of Patent: September 27, 2016
Inventor: Derek A. Devries
-
Patent number: 9456072
Abstract: Provided is an apparatus and method for managing an application in a wireless terminal, in which data of an application is managed and displayed according to an input type of a password for unlocking the wireless terminal, wherein the apparatus includes a memory for storing a plurality of passwords and a controller for classifying and managing data of each of a plurality of applications according to a password type.
Type: Grant
Filed: September 8, 2015
Date of Patent: September 27, 2016
Assignee: Samsung Electronics Co., Ltd.
Inventors: Se-Hong Kwon, Don-Gyo Jun, Soon-Shik Hwang
-
Patent number: 9456203
Abstract: An apparatus and method provide logic for processing information. In one implementation, an apparatus may include a determination unit configured to determine a first spatial position of a portion of an operating tool disposed within a threshold distance of a surface of the determination unit. The first spatial position may be determined relative to the determination unit surface in a depth direction. The apparatus may also include a control unit configured to generate a first signal to display a stereoscopic image to a user at a first display position. The first display position may be disposed within a predetermined distance of the first spatial position.
Type: Grant
Filed: March 20, 2015
Date of Patent: September 27, 2016
Assignee: Sony Corporation
Inventor: Takuro Noda
-
Patent number: 9448620
Abstract: An input apparatus for use in a portable device is provided. The input apparatus includes a camera for capturing an image; a storage unit which stores a key-hand mapping table for mapping segments of a hand to a plurality of keys, respectively, according to predetermined criteria; a display unit displaying the captured image during an input mode; and a control unit which activates the camera in the input mode, controls the display unit to display the captured image, assigns the plural keys to the segments of the hand based on mapping information of the key-hand mapping table, detects an image change in one of the segments displayed on the display unit, and inputs the key mapped to the segment at which the image change is detected on the display unit.
Type: Grant
Filed: August 21, 2013
Date of Patent: September 20, 2016
Assignee: Samsung Electronics Co., Ltd.
Inventor: Jeonghun Kim
-
Patent number: 9442575
Abstract: A free space input standard is instantiated on a processor. Free space input is sensed and communicated to the processor. If the free space input satisfies the free space input standard, a touch screen input response is invoked in an operating system. The free space input may be sensed using continuous implicit, discrete implicit, active explicit, or passive explicit approaches. The touch screen input response may be invoked through communicating virtual touch screen input, a virtual input event, or a virtual command to or within the operating system. In this manner free space gestures may control existing touch screen interfaces and devices, without modifying those interfaces and devices directly to accept free space gestures.
Type: Grant
Filed: May 15, 2015
Date of Patent: September 13, 2016
Assignee: ATHEER, INC.
Inventors: Shashwat Kandadai, Nathan Abercrombie, Yu-Hsiang Chen, Sleiman Itani
-
Patent number: 9443202
Abstract: There is disclosed a method including receiving sensor data extracted from one or more physical sensors, using the extracted sensor data and a context model to perform a first level context determination, and examining at least one condition. If the examining indicates that the at least one condition is fulfilled, the context model is adapted on the basis of the sensor data; otherwise, adaptation data formed on the basis of the sensor data is provided to a second level context determination. A corresponding apparatus and computer program product are also provided.
Type: Grant
Filed: July 1, 2010
Date of Patent: September 13, 2016
Assignee: Nokia Technologies Oy
Inventors: Jussi Leppanen, Rajasekaran Andiappan, Antti Eronen
-
Patent number: 9438543
Abstract: A user selects a content item that he wishes to send. He then performs a gesture to specify a direction toward which the content item is to be sent. For example, he pretends to "throw" his portable communications device toward the West. To receive the content item, potential recipients also make gestures that specify receiving directions. For example, a recipient pretends to "catch" a ball thrown from the East. If the sending and receiving directions are compatible, then the content item is sent from the sender's device to the recipient's device. Enhancements to the basic scheme described above can be made to, for example, specify a dispersion angle for the sending or receiving directions or to include other restrictions so that, for example, only intended recipients can receive the content even if other potential recipients are in the specified sending direction and make appropriate receiving gestures.
Type: Grant
Filed: March 4, 2013
Date of Patent: September 6, 2016
Assignee: Google Technology Holdings LLC
Inventors: Alex G. Goncalves, Edward Judalevitch
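The direction-compatibility check in this abstract (a "throw" toward the West matching a "catch" from the East) can be sketched as a bearing comparison with a dispersion angle. The opposite-bearing interpretation and the 30-degree default are my assumptions, not details from the patent:

```python
def directions_compatible(send_bearing, receive_bearing, dispersion=30.0):
    """A sender 'throws' toward send_bearing; a receiver 'catches' from
    receive_bearing (both compass degrees). The transfer matches when
    the bearings are roughly opposite, within a dispersion angle
    (interpretation and defaults are assumptions)."""
    # Smallest angular difference between send_bearing and the reversed
    # receive_bearing, normalized into [0, 180].
    diff = abs((send_bearing - (receive_bearing + 180.0) + 180.0) % 360.0 - 180.0)
    return diff <= dispersion
```

A throw toward the West (270) matches a catch from the East (90); a catch from the West does not.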
-
Patent number: 9430133
Abstract: Techniques and tools are described for facilitating user reflection on past decisions in order to determine trends and to assist in future decision-making. Technologies for administering a career history exercise and for visualizing results of the career history exercise are described. Visualizations include using stage cards representative of stages in a user's career history. User ratings of the stages in his or her career history are displayed on the stage cards using color-shaded ratings units.
Type: Grant
Filed: December 17, 2012
Date of Patent: August 30, 2016
Assignee: SAP SE
Inventors: Zsuzsanna Kovacs, Christoph Dobiasz, Simone Charlotte Holz, Nenad Dordevic, Aniko Zsofia Georgiev, Katalin Ocsai, Tamas Kirmer
-
Patent number: 9430065
Abstract: A method is provided for generating ink data including stroke objects. The method includes generally four steps. The first step receives device-dependent user-input data including either one of pen event data of Type 1, which includes indicator position data and indicator pressure data, or pen event data of Type 2, which includes indicator position data but does not include indicator pressure data. The second step determines whether the pen event data is Type 1 or Type 2. The third step derives one or both of radius data for defining a width of the stroke object and transparency data for defining a transparency of the stroke object, based on the pen event data of Type 1 or Type 2. The fourth step outputs the stroke object including said one or both of radius data and transparency data as device-independent common attribute value(s) of each of multiple points of the stroke object.
Type: Grant
Filed: September 4, 2015
Date of Patent: August 30, 2016
Assignee: Wacom Co., Ltd.
Inventors: Plamen Petkov, Branimir Angelov, Stefan Yotov, Heidi Wang, Boriana Mladenova
-
Patent number: 9430089
Abstract: The present invention provides an information processing apparatus configured to recognize a touch operation. The information processing apparatus includes an obtaining unit configured to obtain a plurality of touch positions touched on a display screen; a specifying unit configured to specify an object displayed at a position corresponding to each of the touch positions obtained by the obtaining unit on the display screen when at least one of the touch positions obtained by the obtaining unit is included in a predetermined range on the display screen; and a determination unit configured to determine, among the touch positions obtained by the obtaining unit, a touch position included in the predetermined range as an invalid input for a touch operation to the information processing apparatus when the object specified by the specifying unit is not a plurality of objects associated with each other.
Type: Grant
Filed: December 26, 2013
Date of Patent: August 30, 2016
Assignee: Canon Kabushiki Kaisha
Inventors: Hikaru Ito, Seiko Kono, Takeshi Yamazaki
-
Patent number: 9430070
Abstract: A display device, method, computer-readable storage medium and user interface, each of which detects contact to or proximity of an object with respect to a generated image, and responsive to detection of contact to or proximity of the object to the generated image, disables any operational functions associated with a first portion of the generated image. Additionally, operation associated with a second portion of the generated image is allowed responsive to the detection of contact to or proximity of the object to the generated image, where the second portion of the generated image is different from the first portion of the generated image. An indication corresponding to the second portion of the generated image for which operation is enabled may be displayed on the generated image.
Type: Grant
Filed: June 5, 2014
Date of Patent: August 30, 2016
Assignee: SONY CORPORATION
Inventors: Mitsuo Okumura, Hazime Matsuda, Shoji Imamura, Katsuji Miyazawa, Motoki Higashide, Kunihito Sawai
-
Patent number: 9423939
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Grant
Filed: November 12, 2012
Date of Patent: August 23, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
-
Patent number: 9424416
Abstract: A computing device can enable a user to navigate to an application or other digital object directly from a lock screen of the device. A user can specify a credential, such as a short code, that is associated with a specific application. If the credential is recognized, the device can be unlocked and the corresponding application displayed. The user can then be granted full or partial access to functionality and/or data of the device, as may depend at least in part upon the type of credential or a level of access specified for the credential. The credential can be based at least in part upon, or independent of, a general unlock credential for the device. In some embodiments, the user can specify the amount and/or type of access to be granted under a credential, such as access only to utilize the corresponding application.
Type: Grant
Filed: July 2, 2013
Date of Patent: August 23, 2016
Assignee: Amazon Technologies, Inc.
Inventor: Mihir Kumar Choudhary
-
Patent number: 9424444
Abstract: Systems, apparatus and methods are provided. In some embodiments, a computer-implemented method is provided. The computer-implemented method can include executing, using at least one processor, computer-readable instructions for performing various acts. The acts can include executing at least two applications, wherein at least one of the two applications is for providing social media information. The acts can also include receiving a signal in response to a detected input gesture indicative of a selection of an entity. The selection of the entity can be performed via another one of the two applications. The acts can also include integrating the two applications such that social media information is generated about the entity. The integration can be in response to receiving the signal, and the social media information can include information from one or more social networking websites. The acts can also include displaying the social media information about the entity.
Type: Grant
Filed: December 21, 2009
Date of Patent: August 23, 2016
Assignee: AT&T MOBILITY II LLC
Inventors: Cristy Swink, Jason Sikes, Benjamin Fineman, Jonathan Solis Snydal, Matthew James Schoenholz, Craig Alan Williams, David Merkoski, Claudia Knop, Alex Tam, Alison Rae Maiorano, Christopher Marshall Turitzin, Dhana Dhanasarnsombat, Eric Brady Burns, Hannah Regier, Jennifer Siu Bettendorff, Karin Maire Curkowicz, Katrin Asen, Harry Lawson Kight, Flora Elysia Howell, Megan Elisabeth Shia, Paul McDougall, Philip Foeckler, Ratna Desai, Sara Louise Todd, Thomas L Rohrer
-
Patent number: 9417778
Abstract: An application interface is provided that allows a user to interact with a work space. The application interface may include a defined display area that includes one or more discrete work spaces. In one implementation, a user interaction with the defined display area, such as an overscroll interaction, results in the defined display area being modified, such as to include additional or fewer discrete work spaces.
Type: Grant
Filed: June 18, 2015
Date of Patent: August 16, 2016
Assignee: Apple Inc.
Inventors: Donald R. Beaver, Gregory C. Langmead
-
Patent number: 9417706
Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: May 17, 2015
Date of Patent: August 16, 2016
Assignee: APPLE INC.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
-
Patent number: 9411423
Abstract: An electronic interactive device having a user interface with a flexible surface, a sensor configured to sense a flex gesture applied to the flexible surface, a haptic device configured to generate a haptic effect in response to the flex gesture, and a controller in signal communication with the sensor and the haptic device. The controller is configured to trigger the haptic device to provide haptic confirmation of the flex gesture and to trigger an application action associated with the flex gesture.
Type: Grant
Filed: February 8, 2012
Date of Patent: August 9, 2016
Assignee: Immersion Corporation
Inventor: Robert W. Heubel
-
Patent number: 9413950
Abstract: A method performed by a processing system includes determining a device identifier corresponding to a device from a series of captured images that include a light signal emitted by the device.
Type: Grant
Filed: January 25, 2013
Date of Patent: August 9, 2016
Assignee: Hewlett-Packard Development Company, L.P.
Inventor: Wei Hong
-
Patent number: 9411424
Abstract: An input device includes an input unit for inputting a predetermined motion image signal, a motion detector for detecting a motion on the basis of the motion image signal inputted into the input unit, a video signal processor for outputting a predetermined video signal, and a controller. The controller controls the video signal processor so that, when the motion detector detects a first motion, the video signal processor outputs a video signal that explains to the user a predetermined second motion to be detected next by the motion detector after detection of the first motion.
Type: Grant
Filed: September 29, 2015
Date of Patent: August 9, 2016
Assignee: HITACHI MAXELL, LTD.
Inventors: Takashi Matsubara, Tomochika Ozaki, Tatsuya Tokunaga
-
Patent number: 9412021
Abstract: An approach for controlling transmission of data based on gaze interaction. A data transmission determination platform determines one or more gaze metrics for one or more users. The one or more gaze metrics relate, at least in part, to a level of interaction of each user with an object. The data transmission determination platform then processes and/or facilitates a processing of the one or more gaze metrics, user preference information, propagation information from one or more entities associated with the object, or a combination thereof to determine data to transmit to one or more devices associated with at least a subset of the one or more users.
Type: Grant
Filed: November 29, 2013
Date of Patent: August 9, 2016
Assignee: NOKIA TECHNOLOGIES OY
Inventors: Debmalya Biswas, Matthew John Lawrenson, Julian Nolan
-
Patent number: 9405367
Abstract: An object execution method and apparatus is provided for executing an object based on an input pressure level with a haptic feedback indicative of an attribute of the object. An object execution apparatus may include a display unit, an input unit, and a control unit. The display unit may display at least one object. The input unit may detect a selection of an object and/or a pressure level input for executing the object. The control unit may check a reference pressure value preset for the selected object. The reference pressure value may be indicative of attribute information of the object. The control unit may compare an input pressure value of the pressure level input with the reference pressure value, and may determine whether to execute the object based on the comparison result. The input unit comprising a vibration generator may generate a vibration mapped to the reference pressure value.
Type: Grant
Filed: September 29, 2009
Date of Patent: August 2, 2016
Assignee: Samsung Electronics Co., Ltd.
Inventors: Han Chul Jung, O Jae Kwon, Chang Beom Shin
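The pressure-gated execution described above reduces to a threshold comparison against a per-object reference value. A minimal sketch; the dictionary field and return strings are hypothetical names, not from the patent:

```python
def maybe_execute(obj, input_pressure):
    """Execute an object only when the input pressure reaches the
    reference value preset for it, per the abstract above; otherwise
    give feedback (field and return names are hypothetical)."""
    reference = obj["reference_pressure"]  # encodes the object's attribute info
    if input_pressure >= reference:
        return "execute"
    return "feedback"                      # e.g. vibration mapped to reference
```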
-
Patent number: 9405377
Abstract: In many computing scenarios, a device comprises at least one sensor, and is configured to recognize a gesture performed by a user according to the sensor output of the sensor, and to perform a particular action upon recognizing the gesture. However, many devices are preconfigured with such gestures, and the recognition is specific to the sensors of the device, and is not specific to the manner in which a particular user performs the gesture. Presented herein are techniques for enabling a device to recognize a new gesture by monitoring the sensor output of any sensors provided by the device while the user performs the gesture, optionally requesting repeated gesture performances until reaching a recognition confidence. Once trained to recognize the gesture according to the sensor outputs of the particular sensors of the device, the device may subsequently recognize the gesture performed by the user and execute an associated action.
Type: Grant
Filed: March 15, 2014
Date of Patent: August 2, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventor: MengKe Li
-
Patent number: 9389779Abstract: Technologies for depth-based gesture control include a computing device having a display and a depth sensor. The computing device is configured to recognize an input gesture performed by a user, determine a depth relative to the display of the input gesture based on data from the depth sensor, assign a depth plane to the input gesture as a function of the depth, and execute a user interface command based on the input gesture and the assigned depth plane. The user interface command may control a virtual object selected by depth plane, including a player character in a game. The computing device may recognize primary and secondary virtual touch planes and execute a secondary user interface command for input gestures on the secondary virtual touch plane, such as magnifying or selecting user interface elements or enabling additional functionality based on the input gesture. Other embodiments are described and claimed.Type: GrantFiled: March 14, 2013Date of Patent: July 12, 2016Assignee: Intel CorporationInventors: Glen J. Anderson, Dror Reif, Barak Hurwitz, Gila Kamhi
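Assigning a depth plane as a function of measured depth could look like the sketch below. The plane names, the centimetre units, and the boundary values are assumptions for illustration; the patent does not give concrete thresholds.

```python
def assign_depth_plane(depth_cm, boundaries=(10.0, 30.0)):
    """Map a gesture's measured depth relative to the display to a
    named depth plane (boundary values are illustrative)."""
    primary_limit, secondary_limit = boundaries
    if depth_cm <= primary_limit:
        return "primary"    # primary virtual touch plane, closest to the display
    if depth_cm <= secondary_limit:
        return "secondary"  # secondary virtual touch plane
    return "ambient"        # beyond the tracked interaction planes
```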
-
Patent number: 9389891Abstract: Architecture that enables the capability to call user-defined functions (UDFs) within the hosting page, and from within a spreadsheet. UDFs can be used to add functionality to spreadsheet models. Spreadsheet calculations are spread across a backend calculation server and the web browser. Spreadsheet calculation state is maintained as browser functions are calculated. Moreover, the browser UDFs can be executed synchronously or asynchronously. The architecture also provides the capability to dynamically register/unregister UDFs at runtime, which can then be called from the spreadsheet model and executed remotely, and the capability to define a manifest that can include all of the UDF definitions. Infinite calculation loop situations are also prevented. The UDFs have full access to the page DOM (document object model).Type: GrantFiled: January 9, 2012Date of Patent: July 12, 2016Assignee: MICROSOFT TECHNOLOGY LICENSING, LLCInventors: Daniel C. Battagin, Shahar Prish
-
Patent number: 9390250Abstract: The present invention relates to a mobile terminal and a control method thereof. The mobile terminal displays a plurality of figures arranged in a first composition on a touch screen when a specific mode is locked, rearranges the plurality of figures in a second composition different from the first composition upon receiving a specific input, and determines whether to unlock the specific mode on the basis of a pattern which is input using the plurality of figures arranged in the first or second composition.Type: GrantFiled: December 7, 2012Date of Patent: July 12, 2016Assignee: LG ELECTRONICS INC.Inventors: Kanguk Kim, Taiyoung Choi, Keumsung Hwang, Byonggyo Lee
-
Patent number: 9389782Abstract: Disclosed herein are an electronic device and a control method thereof. The electronic device includes a touch device and a processor. The processor is electrically connected to the touch device. When at least one continuous back-and-forth moving touch trace is formed on the touch device during a predetermined period, the processor switches the electronic device from one of a plurality of modes to another.Type: GrantFiled: August 7, 2014Date of Patent: July 12, 2016Assignee: COMPAL ELECTRONICS, INC.Inventors: Feng-Yi Yu, Ming-Che Weng, Wei-Han Hu, Jui-Wen Hsu, Shih-Hung Lin
-
Patent number: 9384403Abstract: A system and method that is able to recognize a user's natural superimposed handwriting without any explicit separation between characters. The system and method is able to process single-stroke and multi-stroke characters. It can also process cursive handwriting. Further, the method and system can determine the boundaries of input words either by the use of a specific user input gesture or by detecting the word boundaries based on language characteristics and properties. The system and method analyzes the handwriting input through the processes of fragmentation, segmentation, character recognition, and language modeling. At least some of these processes occur concurrently through the use of dynamic programming.Type: GrantFiled: July 10, 2015Date of Patent: July 5, 2016Assignee: MYSCRIPTInventors: Zsolt Wimmer, Freddy Perraud, Pierre-Michel Lallican, Guillermo Aradilla
-
Patent number: 9383782Abstract: A mobile terminal includes detection section 30 that detects an orientation and moving direction of mobile terminal 1; vibration section 40 that generates vibrations corresponding to the orientation and moving direction detected by detection section 30; and control section 50 that executes an operation of mobile terminal 1 corresponding to the orientation and moving direction detected by detection section 30.Type: GrantFiled: August 13, 2013Date of Patent: July 5, 2016Assignee: NEC CORPORATIONInventor: Atsuhiko Kamijima
-
Patent number: 9384329Abstract: A system and method is disclosed for determining caloric burn via an HCI system. Using a capture device which is able to detect the thickness of a user's arms, legs, torso, etc., the system determines a mass for each of a user's body parts. Thereafter, in one example, the system measures caloric burn for a given body part as a function of how far the body part was displaced, a mass of the body part displaced and gravity.Type: GrantFiled: June 11, 2010Date of Patent: July 5, 2016Assignee: Microsoft Technology Licensing, LLCInventors: Andrew Wilson, Mark Stevenson, Nicholas Burton, William Bryan, James Thomas
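The stated function of displacement, mass, and gravity corresponds to the work done against gravity, which can be converted to kilocalories. This is only the underlying physics, not the patented measurement method; the function name and the vertical-displacement assumption are illustrative.

```python
G = 9.81  # standard gravitational acceleration, m/s^2

def caloric_burn_kcal(mass_kg, displacement_m):
    """Energy to displace a body part of the given mass through the
    given vertical distance, as kilocalories: work = m * g * d,
    with 1 kcal = 4184 J (illustrative physics sketch)."""
    joules = mass_kg * G * displacement_m
    return joules / 4184.0
```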
-
Patent number: 9377851Abstract: There is provided an information processing apparatus including an operation detection unit configured to detect an orientation of a user's face and operations performed by the user, and an area selection unit configured to, when the operation detection unit detects that the user has performed a first operation, select an area on a screen based on the orientation of the user's face during the first operation.Type: GrantFiled: February 4, 2013Date of Patent: June 28, 2016Assignee: Sony CorporationInventor: Tsutomu Sawada
-
Patent number: 9372561Abstract: An electronic device includes a touch-screen unit which includes a flat surface touch-screen region and a curved surface touch-screen region, and a body unit to which the touch-screen unit is attached. Here, the electronic device determines that a first user command is input when a border between the flat surface touch-screen region and the curved surface touch-screen region is traversed from a first direction, and determines that a second user command is input when the border between the flat surface touch-screen region and the curved surface touch-screen region is traversed from a second direction, where the second direction is opposite to the first direction.Type: GrantFiled: January 15, 2014Date of Patent: June 21, 2016Assignee: Samsung Display Co., Ltd.Inventor: Hwang-Keun Kim
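The direction-dependent border traversal above could be classified as in the following sketch; the region names and command labels are illustrative assumptions.

```python
def classify_border_crossing(prev_region, new_region):
    """Interpret a touch trace that traverses the border between the
    flat and curved touch-screen regions: the direction of traversal
    selects the user command (labels are illustrative)."""
    if prev_region == "flat" and new_region == "curved":
        return "first_command"   # border crossed from the first direction
    if prev_region == "curved" and new_region == "flat":
        return "second_command"  # border crossed from the opposite direction
    return None  # no border traversal occurred
```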
-
Patent number: 9372598Abstract: Provided are techniques for reference-based circular scrolling on a computer display. Techniques include generating a circular scrolling bar in a computer display, the scrolling bar comprising a plurality of segments; populating the circular scrolling bar with reference-based information corresponding to an information source, wherein each segment of the plurality of segments displays a unique reference corresponding to a corresponding discrete portion of the information source; presenting the corresponding discrete portion of the information source corresponding to a particular segment in response to contact with the particular segment.Type: GrantFiled: January 14, 2014Date of Patent: June 21, 2016Assignee: Lenovo Enterprise Solutions (Singapore) PTE. LTD.Inventors: Barry A. Kritt, Sarbajit K. Rakshit
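Mapping a contact point on the circular scrolling bar to the segment touched might be sketched as below, assuming equal-width segments with segment 0 starting at 0 degrees (a layout the abstract does not specify).

```python
def segment_for_angle(angle_deg, num_segments):
    """Map the angular position of a contact on the circular scrolling
    bar to the index of the segment touched, assuming num_segments
    segments of equal angular width starting at 0 degrees."""
    width = 360.0 / num_segments
    return int((angle_deg % 360.0) // width)
```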
-
Patent number: 9367207Abstract: A mobile terminal including a wireless communication unit configured to wirelessly communicate with at least one other terminal; a display unit; and a controller to receive a lock screen input command, display a lock screen on the display unit upon receiving the lock screen input command, receive a predetermined touch input on the lock screen, display a home screen in an overlapping manner with the lock screen on the display unit upon receiving the predetermined touch input on the lock screen, identify an icon displayed at a position on the home screen corresponding to a position of the predetermined touch input among a plurality of icons included in the home screen, and release the lock screen and execute an application corresponding to the identified icon when the predetermined touch input is released.Type: GrantFiled: July 12, 2013Date of Patent: June 14, 2016Assignee: LG Electronics Inc.Inventors: Ashesh Chagana Boyana, Rajesh Nagaraja Rao
-
Patent number: 9367216Abstract: A hand-held electronic device, method of operation and computer readable medium are disclosed. The device may include a case having one or more major surfaces. A visual display and a touch interface are disposed on at least one of the major surfaces. A processor is operably coupled to the visual display and touch interface.Type: GrantFiled: October 7, 2009Date of Patent: June 14, 2016Assignee: SONY INTERACTIVE ENTERTAINMENT INC.Inventors: Gary M. Zalewski, Charles Nicholson
-
Patent number: 9367236Abstract: A system and method for processing touch actions are provided. A plurality of sequentially performed touch actions including a first touch action and a second touch action are determined on a touch interface of an electronic device. An initiation location and a completion location of each of the first and second touch actions are determined on the touch interface. A command is selected based on the determined completion location of the first touch action, the second touch action, and the determined initiation location of the second touch action. The selected command is executed on the electronic device.Type: GrantFiled: March 15, 2013Date of Patent: June 14, 2016Assignee: GOOGLE INC.Inventors: Alexander Friedrich Kuscher, Stefan Kuhne, John Nicholas Jitkoff
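The selection step above — choosing a command from the completion location of the first touch action, the second touch action, and the initiation location of the second — could be modeled as a table lookup. The zone names, action names, and command labels below are illustrative assumptions.

```python
# Hypothetical command table keyed by (first-action completion zone,
# second action, second-action initiation zone).
COMMANDS = {
    ("top_edge", "swipe_down", "top_edge"): "open_notifications",
    ("app_icon", "tap", "app_icon"): "launch_app",
}

def select_command(first_completion_zone, second_action,
                   second_initiation_zone, table=COMMANDS):
    """Select a command based on where the first touch action completed,
    what the second touch action was, and where it was initiated;
    returns None when no command matches."""
    key = (first_completion_zone, second_action, second_initiation_zone)
    return table.get(key)
```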