Including Orientation Sensors (e.g., Infrared, Ultrasonic, Remotely Controlled) Patents (Class 345/158)
-
Patent number: 10872474
Abstract: A method and system for manipulating a 3D model during treatment planning and automatically adjusting the 3D model based on a localized area of the 3D model proximate to a location of said manipulation. The 3D model is automatically adjusted during the course of treatment planning such that a user has an unobstructed view of the surface. The 3D model may be, for example, a 3D model of a tooth, teeth, or dental anatomy.
Type: Grant
Filed: June 29, 2018
Date of Patent: December 22, 2020
Assignee: DENTSPLY SIRONA INC.
Inventors: Sascha Schneider, Evgenij Derzapf, Ravid Aloni
-
Patent number: 10869632
Abstract: Described herein is a system and method for ergonomic analysis including a sensorized glove having an inner glove including a plurality of extensometer sensors for detecting relative movements between parts of a worker's hand, and an outer glove including a plurality of pressure sensors distributed over a palmar surface for detecting pressure exerted in corresponding areas of said palmar surface; a wearable sensor network whose sensors are located so that they can be associated with corresponding joints of the human body; a unit for generating a sequence of images of a worker task; and a processing unit for receiving data and/or signals from the sensorized glove, from the wearable sensor network, and/or from the unit, and configured for processing said data and/or signals to estimate ergonomic indicators and/or to obtain local information on effort and/or posture.
Type: Grant
Filed: January 23, 2019
Date of Patent: December 22, 2020
Assignee: C.R.F. Società Consortile per Azioni
Inventors: Massimo Di Pardo, Giorgio Pasquettaz, Rossella Monferino, Francesca Gallo
-
Patent number: 10869009
Abstract: An imaging device for an interactive display includes at least one image projector to project an image onto a display area. The imaging device further includes a number of image capture devices to capture at least one image of the display area, and a number of sensors to detect the presence of an object within the field of view of the image capture devices.
Type: Grant
Filed: September 30, 2015
Date of Patent: December 15, 2020
Assignee: Hewlett-Packard Development Company, L.P.
Inventors: Joshua Hailpern, Murilo Juchem
-
Patent number: 10866650
Abstract: Provided is a method and device for recognizing a pointing location by using a radar. A device according to an embodiment of the present disclosure comprises: a weight value space generation unit for generating a weight value space comprising a plurality of unit spaces in a space in front of the screen, and setting weight values in proportion to intensity of clutter-removed reception signals of the plurality of radars to the unit spaces; a hand area detection unit for adding up the set weight values for each of the unit spaces of the weight value space; an effective radar selection unit for selecting, as an effective radar, a radar having a shortest distance to the detected hand area among the plurality of radars; and a finger position determination unit for detecting a first location (first path) exceeding a predetermined threshold value.
Type: Grant
Filed: February 13, 2018
Date of Patent: December 15, 2020
Assignee: WRT LAB CO., LTD.
Inventors: Sung Ho Cho, Dae Hyeon Yim
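The weight-value voting this abstract describes can be sketched roughly as follows. Everything here is illustrative: the unit-space grid, the `signal_intensity` callback (standing in for the clutter-removed signal processing), and the peak-picking step are assumptions, not the patented implementation.

```python
import math

def locate_hand(unit_spaces, radar_positions, signal_intensity):
    """Sketch: each radar contributes a weight to every unit space in
    proportion to its signal intensity there; weights are summed per unit
    space, the maximum is taken as the hand area, and the radar closest
    to that area is selected as the effective radar."""
    # Sum the per-radar weights for each unit space.
    totals = {
        cell: sum(signal_intensity(r, cell) for r in radar_positions)
        for cell in unit_spaces
    }
    # Hand area: the unit space with the largest accumulated weight.
    hand_cell = max(totals, key=totals.get)
    # Effective radar: shortest distance to the detected hand area.
    effective = min(radar_positions, key=lambda r: math.dist(r, hand_cell))
    return hand_cell, effective
```

A toy `signal_intensity` that favors one radar/cell pair is enough to exercise the selection logic.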
-
Patent number: 10866563
Abstract: One disclosed example provides a computing device comprising a logic subsystem comprising a processor, and memory storing instructions executable by the logic subsystem. The instructions are executable to display via a display system one or more holographic objects; receive depth image data from a depth image sensor; detect a user input setting a trajectory for a selected holographic object; in response to detecting the user input setting the trajectory for the selected holographic object, determine the trajectory for the selected holographic object set by the user input; determine, based upon the depth image data and the trajectory, a surface intersected by the trajectory of the selected holographic object; and display via the display system the selected holographic object as travelling along the trajectory and changing in form upon encountering the surface.
Type: Grant
Filed: February 13, 2019
Date of Patent: December 15, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Addison Kenan Linville, Jarod Wayne Lenz Erwin, Dong Yoon Park
-
Patent number: 10849776
Abstract: A handheld device includes a base comprising a handgrip for receiving a vibration movement, a gripping element linked to the base for releasably connecting the handheld device to an object, at least one inertia sensor for detecting an acceleration of the vibration movement to generate an acceleration signal, a processing unit for generating a cancellation decision according to the acceleration signal, and at least one actuator for controlling movement of the gripping element according to the cancellation decision, such that the acceleration is counteracted.
Type: Grant
Filed: July 23, 2015
Date of Patent: December 1, 2020
Assignee: INALWAYS CORPORATION
Inventors: Yung-Hsi Wu, Po-Chih Hung
-
Patent number: 10843077
Abstract: A system and method for creating, presenting and permitting concurrent interaction by players within multiple reality and virtual reality environments wherein a virtual representation of a non-virtual location is generated by a processor in which a set of all of the elements and a set of all of the players are located and represented. The system creates a first dimensionally precise replica of a physical environment and generates a virtual reality environment which is substantially identical to the physical environment. The physical environment has situated therein an ultra-precise indoor positioning system consisting of a series of networked antennas connected to a processor to record, generate and transmit positional and action data. The system and method then permit interaction between both the players within the physical environment and those within the virtual reality environment as if both were located within the same environment.
Type: Grant
Filed: June 10, 2019
Date of Patent: November 24, 2020
Inventors: Brian Deller, Michael Scott McCraw
-
Patent number: 10845922
Abstract: A mobile device includes a housing; an input and output unit including a panel having a display area on a first side of the housing and configured to display an image or menu to a user and to receive a user input from the user to perform a function of the mobile device; a sensing unit disposed between the panel and a second side of the housing to sense a first object, which is disposed to face the display area of the panel, through the panel; and a controller disposed in the housing and configured to perform a sensing mode which includes a sensing operation of the sensing unit to sense the first object through the panel and to generate a sensed image from the first object, and to determine a state of the sensed image as another user input to perform another function of the mobile device.
Type: Grant
Filed: August 1, 2019
Date of Patent: November 24, 2020
Inventor: Seungman Kim
-
Patent number: 10845376
Abstract: An aspect of the present invention more reliably prevents a false detection of lifting of an electronic device by an information processing device mounted on the electronic device. A first lifting determination section (64A) determines that a mobile terminal (1) has been lifted in a case where the following conditions (i) and (ii) are satisfied after a manner of change in acceleration detected by an acceleration sensor (11) over time has satisfied a predetermined acceleration condition: (i) a standstill determination section (63) determines that the mobile terminal (1) is in a standstill state; and (ii) a result of detection by a proximity sensor (14) indicates a transition from proximity to non-proximity within a predetermined time range, the predetermined time range being set with reference to a standstill determination completion time point at which the standstill determination section (63) has completed its determination.
Type: Grant
Filed: February 13, 2017
Date of Patent: November 24, 2020
Assignee: SHARP KABUSHIKI KAISHA
Inventors: Tohru Kashiwagi, Masatoshi Noma
-
Patent number: 10838514
Abstract: There is provided an optical navigation device including an image sensor, a processing unit, a storage unit and an output unit. The image sensor is configured to successively capture images. The processing unit is configured to calculate a current displacement according to the images and to compare the current displacement or an accumulated displacement with a threshold so as to determine an outputted displacement. The storage unit is configured to save the accumulated displacement. The output unit is configured to output the outputted displacement with a report rate.
Type: Grant
Filed: October 31, 2019
Date of Patent: November 17, 2020
Assignee: PIXART IMAGING INC.
Inventors: Yu-Hao Huang, Ming-Tsan Kao, Sen-Huang Huang
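The accumulate-and-threshold reporting scheme this abstract describes can be sketched as below. The reset-on-report behavior and the function shape are assumptions for illustration, not the patented logic.

```python
def report_displacement(frame_displacements, threshold):
    """Sketch: accumulate per-frame displacement; report it only once the
    accumulated value reaches the threshold, then reset the accumulator.
    Sub-threshold frames report zero movement."""
    accumulated = 0
    reports = []
    for d in frame_displacements:
        accumulated += d
        if abs(accumulated) >= threshold:
            reports.append(accumulated)  # large enough: emit and reset
            accumulated = 0
        else:
            reports.append(0)            # below threshold: suppress jitter
    return reports
```

One effect of such a scheme is that small per-frame jitter is suppressed while sustained motion is still reported in full.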
-
Patent number: 10839576
Abstract: Systems and methods for displaying a virtual reticle in an augmented or virtual reality environment by a wearable device are described. The environment can include real or virtual objects that may be interacted with by the user through a variety of poses, such as, e.g., head pose, eye pose or gaze, or body pose. The user may select objects by pointing the virtual reticle toward a target object by changing pose or gaze. The wearable device can recognize that an orientation of a user's head or eyes is outside of a range of acceptable or comfortable head or eye poses and accelerate the movement of the reticle away from a default position and toward a position in the direction of the user's head or eye movement, which can reduce the amount of movement by the user to align the reticle and target.
Type: Grant
Filed: October 25, 2018
Date of Patent: November 17, 2020
Assignee: Magic Leap, Inc.
Inventors: Paul Armistead Hoover, Sam Baker, Jennifer M. R. Devine
-
Patent number: 10825058
Abstract: Disclosed herein are systems and methods for presenting intelligent interactive content. The method may include receiving user input on a first device from a first user to present content to a second user on a second device. The method may include providing the second device with content data, presenting the content to the second user, monitoring the second user's reaction to the content using an eye-tracking sensor, and generating feedback data. The method may include providing the feedback data to the first device and presenting feedback to the first user.
Type: Grant
Filed: September 30, 2016
Date of Patent: November 3, 2020
Assignee: Massachusetts Mutual Life Insurance Company
Inventor: Michal Knas
-
Patent number: 10817079
Abstract: In some examples, a computing device may receive stylus sensor data from one or more sensors in a stylus that is used to interact with the computing device. The computing device may determine, based on the stylus sensor data, (i) a stylus height of a tip of the stylus from a display device and (ii) one or more stylus angles of the stylus relative to the display device. The computing device may display, on the display device, a virtual shadow of the stylus at a location based at least in part on the stylus height, the stylus angles, and a location of a virtual light source. The computing device may display, on the display device, one or more virtual shadows of content items corresponding to the stylus acting as a virtual light source.
Type: Grant
Filed: January 29, 2018
Date of Patent: October 27, 2020
Assignee: Dell Products L.P.
Inventors: Mark R. Ligameri, Michiel S. Knoppert, Jace W. Files
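One way to place such a tip shadow is simple projective geometry: cast a ray from the virtual light source through the stylus tip and intersect it with the display plane. This is a minimal sketch under assumed conditions (a point light source, a flat display plane at z = 0), not the patent's actual method, which also uses the stylus angles.

```python
def shadow_position(tip_xy, tip_height, light_pos):
    """Project the stylus tip onto the display plane (z = 0) along the ray
    from a virtual point light source through the tip.

    tip_xy: (x, y) of the tip over the display; tip_height: hover height;
    light_pos: (x, y, z) of the light source, with z > tip_height."""
    lx, ly, lz = light_pos
    tx, ty = tip_xy
    # Parameter t at which the light->tip ray crosses z = 0.
    t = lz / (lz - tip_height)
    return (lx + t * (tx - lx), ly + t * (ty - ly))
```

As expected from the geometry, the shadow coincides with the tip when the tip touches the display and slides away as the tip is lifted.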
-
Patent number: 10819952
Abstract: Implementations generally relate to virtual reality telepresence. In some implementations, a method includes performing projection mapping of a projection area. The method further includes collecting user information associated with a user. The method further includes positioning the user in a virtual environment based on the projection mapping and the user information. The method further includes determining a point of view of the user in the virtual environment. The method further includes projecting the virtual environment onto the projection area based on the point of view of the user.
Type: Grant
Filed: October 11, 2016
Date of Patent: October 27, 2020
Assignee: SONY INTERACTIVE ENTERTAINMENT LLC
Inventors: Allison Marlene Chaney, Anthony Rogers
-
Patent number: 10820061
Abstract: Systems and methods are operable to communicate information about media device operations to an electronic Braille device. An exemplary embodiment receives a media content event at a media device, wherein the media content event is presented by at least one component of a media content presentation system to at least one visually impaired user; generates a supplemental Braille text information message in response to an operation of the media device that affects the presentation of the media content event, wherein the supplemental Braille text information message is based on text that indicates at least one attribute of the operation of the media device; and communicates the supplemental Braille text information message to the electronic Braille device, wherein the supplemental Braille text information message is presented by the electronic Braille device as tactile information that is sensed by the visually impaired user.
Type: Grant
Filed: October 17, 2016
Date of Patent: October 27, 2020
Assignee: DISH Technologies L.L.C.
Inventors: Lawrence Moran, Brandon Halper
-
Patent number: 10809860
Abstract: A display device including: a plurality of pixels arranged in a matrix form and having a first side parallel to a row and a second side opposite to the first side; scanning lines arranged in each row of the pixels to supply a scanning signal to the pixels; signal lines arranged in each column of the pixels to supply an image signal to the pixels; a drive electrode circuit which is arranged along the first side; and a plurality of drive electrodes which are arranged in the direction of column of the pixels and to which a drive signal to detect an object is supplied from the drive electrode circuit, wherein the signal lines transmit the image signal in a display period, and at least one of the signal lines transmits a control signal to control the drive electrode circuit in a touch detection period.
Type: Grant
Filed: April 18, 2019
Date of Patent: October 20, 2020
Assignee: Japan Display Inc.
Inventors: Hiroshi Mizuhashi, Makoto Hayashi, Yasuyuki Teranishi, Daisuke Ito
-
Patent number: 10806996
Abstract: A light sphere display device is disclosed that provides a persistence of vision (POV) light field display. The device is preferably globe-shaped and has a transparent outer globe with an outer surface having touch-sensitive capability that enables user interaction and control of the device. In embodiments, the device provides a 360 degree POV display as seen on the outer surface, by rotating a support bar having an attached circuit board including LEDs about one or more axes of rotation within the device. The device can be held in the hands of a user, or placed upon one or more gaming accessories such as gaming guns, presentation bases, and action figures. The device communicates with the accessories to provide augmented virtual reality displays of the accessories and to provide interactive gaming displays, in examples. The device also displays images and receives interactive games sent over wireless connections from cell phones.
Type: Grant
Filed: January 15, 2018
Date of Patent: October 20, 2020
Inventors: Scott Frisco, Steve Strumpf
-
Patent number: 10805453
Abstract: The invention relates to devices, systems, and methods for controlling the audio input state of a device. The devices, systems, and methods include a band having an activator in communication with a device that mutes the device, the activator serving as a reversible temporary toggle to the input state of an incoming audio stream, unmuting the device when desired.
Type: Grant
Filed: August 26, 2019
Date of Patent: October 13, 2020
Inventor: Tico Blumenthal
-
Patent number: 10802593
Abstract: In the context of a user interface control, a gesture-recognition device: receives gyroscopic data representing a gesture executed with a dedicated instrument including a gyroscopic sensor; determines a correlation between the received gyroscopic data and gyroscopic data relating to a supervised learning process and pre-recorded into a database; recognizes the executed gesture, or not, according to said correlation, the only data representing the executed gesture taken into account being said gyroscopic data; and transposes each recognized gesture into a user interface command.
Type: Grant
Filed: December 9, 2013
Date of Patent: October 13, 2020
Assignee: SAGEMCOM BROADBAND SAS
Inventors: Etienne Evrard, Frédéric Colinet
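The correlation-then-accept step this abstract describes can be sketched as a nearest-template matcher. The normalized-dot-product similarity, the threshold value, and the template dictionary are all assumptions for illustration; the patent does not specify this criterion.

```python
def recognize_gesture(sample, templates, threshold=0.9):
    """Sketch: compare incoming gyroscopic data against pre-recorded
    templates and accept the best match only if its correlation is high
    enough; otherwise the gesture is not recognized (returns None)."""
    def correlation(a, b):
        # Normalized dot product of two equal-length sequences.
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    best_name, best_corr = None, 0.0
    for name, template in templates.items():
        c = correlation(sample, template)
        if c > best_corr:
            best_name, best_corr = name, c
    return best_name if best_corr >= threshold else None
```

Each recognized name would then be transposed into a user interface command by a lookup table, per the abstract's final step.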
-
Patent number: 10802596
Abstract: A display device, a self-luminous display panel and a gesture recognition method are provided. The display device includes a self-luminous display panel, an infrared light source, at least one camera device, and an image processing device. The image processing device is coupled to the at least one camera device and configured to perform gesture recognition based on a plurality of successive infrared images and a plurality of successive visible light images of an operation body taken within a same period of time.
Type: Grant
Filed: June 4, 2018
Date of Patent: October 13, 2020
Assignee: BOE TECHNOLOGY GROUP CO., LTD.
Inventors: Xueyou Cao, Xue Dong, Haisheng Wang, Xiaoliang Ding, Chih Jen Cheng, Pengpeng Wang, Wei Liu, Yanling Han, Ping Zhang
-
Patent number: 10801857
Abstract: A navigational information display device is provided, which may include a display configured to display a navigational image indicated using a geometrical figure related to first navigational information, and further display one or more characters selected from the group consisting of a first navigational information character indicative of the first navigational information, a second navigational information character indicative of second navigational information, and a device operation character related to operation of the display device; and processing circuitry configured to receive input of an instruction signal to instruct a change in a display orientation of the screen, and to change a display orientation of at least one of the one or more characters while maintaining a display orientation of the navigational image when the input of the instruction signal is received.
Type: Grant
Filed: July 18, 2018
Date of Patent: October 13, 2020
Assignee: Furuno Electric Co., Ltd.
Inventors: Masato Okuda, Takatoshi Morita, Timo Kostiainen, Jussi Suistomaa, Petri Turkulainen
-
Patent number: 10796494
Abstract: A method, medium, and virtual object for providing a virtual representation with an attribute are described. The virtual representation is generated based on a digitization of a real-world object. Properties of the virtual representation, such as colors, shape similarities, volume, surface area, and the like are identified, and an amount or degree of exhibition of those properties by the virtual representation is determined. The properties are employed to identify attributes associated with the virtual representation, such as temperature, weight, or sharpness of an edge, among other attributes of the virtual object. A degree of exhibition of the attributes is also determined based on the properties and their degrees of exhibition. Thereby, the virtual representation is provided with one or more attributes that instruct presentation and interactions of the virtual representation in a virtual world.
Type: Grant
Filed: July 20, 2011
Date of Patent: October 6, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Shawn C Wright, Jeffrey Jesus Evertt, Justin Avram Clark, Christopher Harley Willoughby, Mike Scavezze, Michael A Spalding, Kevin Geisner, Daniel L. Osborn
-
Patent number: 10795353
Abstract: Methods and systems are described for new paradigms for user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant or FDA) using a portable multifunction device (PMD) such as a smart phone. In some embodiments, a user may control image capture from an FDA by adjusting the position and orientation of a PMD. In other embodiments, a user may input a touch gesture via a touch display of a PMD that corresponds with a flight path to be autonomously flown by the FDA.
Type: Grant
Filed: September 24, 2019
Date of Patent: October 6, 2020
Assignee: SKYDIO, INC.
Inventors: Abraham Galton Bachrach, Adam Parker Bry, Matthew Joseph Donahoe
-
Patent number: 10789617
Abstract: A method and system for advertising and screen identification using an electronic mobile device transparent screen having 3D image processing and analysis capabilities is presented. The display may include one or more display screens varying in translucency on the rear display screen. Screens may be bendable, foldable, or flexible, or multi-sided transparent and/or non-transparent display screens at the front and rear of the mobile device. Advertisements delivered to the electronic mobile device by virtue of a mobile app are displayed on the electronic device screen outside the confines or borders of the mobile app. Advertisements are viewed on a rear display screen by non-users of the electronic mobile device, and they may be displayed based on GPS location, time, date, camera(s), 3D camera, 3D sensor, and information collected through sensors built into the electronic mobile device.
Type: Grant
Filed: October 25, 2018
Date of Patent: September 29, 2020
Inventor: Suk K. Kim-Whitty
-
Patent number: 10788904
Abstract: Provided is an in-vehicle information processing system that can be operated intuitively. An in-vehicle information processing system (10) includes a display (11), a touch operation interface (12), an imaging unit (13), and a controller (14). The display (11) includes at least one screen. The touch operation interface (12) detects contact by an operation hand of an operator. The imaging unit (13) captures an image of the touch operation interface (12) and at least a portion of the operation hand. The controller (14) associates position coordinates in an operation region (R2) on the screen with position coordinates in a predetermined region (R1) of the touch operation interface (12) and causes at least a portion of the operation hand to be displayed in overlap on the screen on the basis of the image captured by the imaging unit (13).
Type: Grant
Filed: March 30, 2017
Date of Patent: September 29, 2020
Assignee: CALSONIC KANSEI CORPORATION
Inventors: Haruhiko Satou, Teruo Yoshitomi
-
Patent number: 10782796
Abstract: Embodiments of the present disclosure provide for implementing a mixed reality system with less power. In some examples, a passive state of the mixed reality system can have a GPU render predictable content that does not need to be processed by a CPU. In such examples, the predictable content can be identified and rendered by the GPU while the CPU is in a low-power mode. Accordingly, embodiments of the present disclosure provide benefits not available with conventional techniques because a CPU may consume more power than a corresponding GPU. In some examples, the passive state can take advantage of the fact that predictable content can be identified and rendered without the use of the CPU.
Type: Grant
Filed: November 15, 2017
Date of Patent: September 22, 2020
Assignee: Magic Leap, Inc.
Inventor: Xintian Li
-
Patent number: 10780349
Abstract: A rhythm-based video game ("game") is disclosed. In the game, a player slashes blocks representing musical beats using a pair of energy blades resembling a lightsaber. A gaming console renders multiple digital objects, e.g., digital blocks, digital mines and digital obstacles, that are approaching a player in a virtual space. The gaming console also renders a digital representation of an instrument, e.g., a lightsaber ("digital saber"), using which the player slashes, cuts or otherwise interacts with the digital blocks to cause a digital collision between the digital saber and the digital blocks. The player can score by slashing the digital blocks, not hitting the digital mines and avoiding the digital obstacles. The game presents the player with a stream of approaching digital objects in synchronization with music, e.g., a song's beats, being played in the game. The pace at which the digital blocks approach the player increases with the beats.
Type: Grant
Filed: November 19, 2018
Date of Patent: September 22, 2020
Assignee: Facebook Technologies, LLC
Inventors: Vladimír Hrinčár, Ján Ilavský
-
Patent number: 10773169
Abstract: Systems and methods are described for providing co-presence in an augmented reality environment. The method may include controlling a first and second computing device to detect at least one plane associated with a scene of the augmented reality environment generated for a physical space; receiving, from the first computing device, a first selection of a first location within the scene and a first selection of a second location within the scene; generating a first reference marker corresponding to the first location and generating a second reference marker corresponding to the second location; receiving, from a second computing device, a second selection of the first location within the scene and a second selection of the second location within the scene; generating a reference frame; and providing the reference frame to the first computing device and to the second computing device to establish co-presence in the augmented reality environment.
Type: Grant
Filed: January 22, 2018
Date of Patent: September 15, 2020
Assignee: Google LLC
Inventors: Adam Leeper, John Ullman, Cheng Yang, Peter Tan
-
Patent number: 10775483
Abstract: Disclosed is a radar-based human motion recognition apparatus including a first radar for emitting a first radar signal for detecting a position of an object, a second radar for emitting a second radar signal for gesture recognition, an antenna for receiving a first reflected signal reflected from the object by the first radar signal and a second reflected signal reflected from the object by the second radar signal, and a controller for determining situation information based on second signal processing for the second reflected signal.
Type: Grant
Filed: December 9, 2019
Date of Patent: September 15, 2020
Assignee: H LAB CO., LTD.
Inventors: Hyoung Min Kim, Han June Kim, Jae Jun Yoon, Hyo Ryun Lee, Jong Hee Park
-
Patent number: 10775901
Abstract: Techniques and apparatus are described for obtaining user input via a stylus configured to serve as an interface for providing user input into a computing device. The computing device may obtain rotation-related information indicative of rotational position or rotational movement of the stylus about a longitudinal axis of the stylus. The computing device may identify an operation in response to the rotation-related information, and perform the identified operation.
Type: Grant
Filed: January 30, 2015
Date of Patent: September 15, 2020
Assignee: QUALCOMM Incorporated
Inventors: Virginia Walker Keating, Niccolo Andrew Padovani, Gilad Bornstein, Thien Lee, Nathan Altman
-
Patent number: 10769857
Abstract: A method of implementing a plurality of contextual applications within a mixed reality (MR) environment on an MR-capable device of a user is disclosed. At least one real-world object is identified in the MR environment by applying an object recognition algorithm to one or more attributes of the at least one real-world object that are captured by sensors of the MR-capable device. A first contextual application of the plurality of contextual applications is used to determine an association between a first set of contextual triggers and a second contextual application of the plurality of contextual applications. The second contextual application is initiated based on a satisfying of at least one contextual trigger. A function is invoked within the second contextual application based on an interaction of the user with at least one virtual object satisfying a second set of contextual triggers associated with the second contextual application.
Type: Grant
Filed: June 27, 2019
Date of Patent: September 8, 2020
Assignee: Unity IPR ApS
Inventors: Sylvio Herve Drouin, Gregory Lionel Xavier Jean Palmaro, Dioselin Alejandra Gonzalez Rosillo
-
Patent number: 10766366
Abstract: An operating method in a vehicle having a menu of an operating structure displayed on a display surface arranged in the vehicle. The operating structure includes a main menu and additional menus. Inputs can be detected, by which a change of the displayed menu can be caused and by which switching elements of the menus can be actuated. A special switching element is displayed at the edge of the display surface which can be actuated by a swipe gesture and, if the swipe gesture associated with the special switching element has been detected, the main menu is displayed. Also disclosed is an operating device for performing the operating method.
Type: Grant
Filed: August 20, 2013
Date of Patent: September 8, 2020
Assignee: Volkswagen AG
Inventor: Mi-Ran Jun
-
Patent number: 10768718
Abstract: A spherical input device 105 for navigating a virtual environment 102 is activated for touch sensitivity at any point on its surface 801 by a capacitive touch sensor 709 that includes first and second capacitance-sensing elements 710 and 711. A first variable capacitance 806 is formed between a first capacitance-sensing element and a first area of the user's hands through a first hemisphere 802. A second variable capacitance 807 is formed between a second capacitance-sensing element and a second area of the user's hands through the second hemisphere 803. A touch-responsive capacitance 805 includes the first variable capacitance in series with the second variable capacitance. Gestural data is derived from the touch-responsive capacitance and device rotations, and transmitted in gestural radio signals 108 to a receiver 109. One or both of the capacitance-sensing elements is configured to minimize attenuation of the gestural radio signals passing through the surface.
Type: Grant
Filed: November 1, 2018
Date of Patent: September 8, 2020
Inventor: Anthony Richard Hardie-Bick
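The touch-responsive capacitance described above is the series combination of the two variable capacitances, which follows the standard formula for capacitors in series. The helper below illustrates that formula only; it is not taken from the patent's signal chain.

```python
def touch_capacitance(c_first, c_second):
    """Combined value of two capacitances in series:
    C = C1 * C2 / (C1 + C2), in whatever unit the inputs share."""
    return (c_first * c_second) / (c_first + c_second)
```

A consequence worth noting: the series value is dominated by the smaller of the two capacitances, so touching either hemisphere alone changes the combined reading only up to that hemisphere's contribution.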
-
Patent number: 10768712
Abstract: A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by application of a computing device based on detected gestures. In one example, an input is detected using a three dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation is then recognized that corresponds to the gesture by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
Type: Grant
Filed: May 2, 2019
Date of Patent: September 8, 2020
Assignee: Google LLC
Inventors: Carsten C. Schwesig, Ivan Poupyrev
-
Patent number: 10761721
Abstract: A method for interactive image caricaturing by an electronic device is described. The method includes detecting at least one feature location of an image. The method further includes generating, based on the at least one feature location, an image mesh that comprises a grid of at least one horizontal line and at least one vertical line. The method additionally includes obtaining at least one gesture input. The method also includes determining at least one caricature action based on the at least one gesture input. The method further includes generating a caricature image based on the image mesh, the at least one caricature action and the image.
Type: Grant
Filed: May 23, 2019
Date of Patent: September 1, 2020
Assignee: QUALCOMM Incorporated
Inventors: Xuan Zou, Fan Ling, Ning Bi, Lei Zhang
-
Patent number: 10755482
Abstract: Provided is a mechanism that can prevent a user's sense of immersion or sense of reality with respect to a virtual object displayed in a display area from deteriorating. An information processing device includes a display control unit that controls display of a virtual object displayed to be moved in a display area, and changes an aspect of the virtual object when the virtual object moves across a boundary of the display area. An information processing method is performed using a processor, the information processing method including controlling display of a virtual object displayed to be moved in a display area, and changing an aspect of the virtual object when the virtual object moves across a boundary of the display area.
Type: Grant
Filed: February 1, 2017
Date of Patent: August 25, 2020
Assignee: SONY CORPORATION
Inventors: Tomohisa Tanaka, Tsubasa Tsukahara, Ryo Fukazawa, Akane Yano, Kenji Sugihara
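The boundary test that would trigger an aspect change can be sketched with axis-aligned rectangles. The `(x0, y0, x1, y1)` rectangle format and the three-way classification are assumptions for illustration; the patent does not specify this representation.

```python
def display_aspect(obj_bounds, display_bounds):
    """Classify a virtual object against the display-area boundary.

    Returns 'inside', 'crossing', or 'outside'. In the described system,
    a 'crossing' object would change aspect (e.g. fade or clip gracefully)
    to preserve the user's sense of immersion. Rects are (x0, y0, x1, y1).
    """
    ox0, oy0, ox1, oy1 = obj_bounds
    dx0, dy0, dx1, dy1 = display_bounds
    inside = ox0 >= dx0 and oy0 >= dy0 and ox1 <= dx1 and oy1 <= dy1
    outside = ox1 < dx0 or ox0 > dx1 or oy1 < dy0 or oy0 > dy1
    if inside:
        return "inside"
    if outside:
        return "outside"
    return "crossing"

print(display_aspect((90, 10, 110, 20), (0, 0, 100, 100)))  # -> crossing
```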
-
Patent number: 10755458
Abstract: A network access device is configured to display a first vertical line of a height proportionate to a range between a first high price and first low price from a first intratime period of OHLC data, and to display a second vertical line adjacent to the first vertical line, the second vertical line having a height proportionate to a range between a second high price and second low price from a second intratime period of OHLC data; generate a body from an open price of the first intratime period and a close price of the second intratime period; determine a highest intratime period having a highest price from each high price of each intratime period and a lowest intratime period having a lowest price from each low price of each intratime period; generate and display an enhanced candlestick by the display of the body overlayed upon the first vertical line and the second vertical line; and then generate at least one of an upper wick and lower wick by a removal of all portions of all vertical lines outside of the body.
Type: Grant
Filed: February 13, 2020
Date of Patent: August 25, 2020
Inventor: Eric Schneider
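The construction above is computable: the body spans from the first period's open to the second period's close, and the wicks are the portions of the combined high/low range left outside the body. The sketch below assumes plain OHLC dicts; it illustrates the geometry, not the patented rendering.

```python
def enhanced_candlestick(bar1, bar2):
    """Combine two intratime OHLC bars into one enhanced candlestick.

    bar1, bar2: dicts with 'open', 'high', 'low', 'close'. The body runs
    from bar1's open to bar2's close; wicks are the parts of the combined
    high/low range outside the body. Illustrative sketch only.
    """
    body_top = max(bar1["open"], bar2["close"])
    body_bottom = min(bar1["open"], bar2["close"])
    highest = max(bar1["high"], bar2["high"])   # highest intratime high
    lowest = min(bar1["low"], bar2["low"])      # lowest intratime low
    return {
        "body": (body_bottom, body_top),
        "upper_wick": max(0, highest - body_top),
        "lower_wick": max(0, body_bottom - lowest),
    }

bar1 = {"open": 100, "high": 106, "low": 99, "close": 104}
bar2 = {"open": 104, "high": 108, "low": 103, "close": 105}
print(enhanced_candlestick(bar1, bar2))
```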
-
Patent number: 10747342
Abstract: A hand-held device with a sensor for providing a signal indicative of a position of the hand-held device relative to an object surface enables power to the sensor at a first time interval when the hand-held device is indicated to be in a position that is stationary and adjacent relative to the object surface, enables power to the sensor at a second time interval shorter than the first time interval when the hand-held device is indicated to be in a position that is moving and adjacent relative to the object surface, and enables power to the sensor at a third time interval when the hand-held device is determined to be in a position that is removed relative to the object surface.
Type: Grant
Filed: November 11, 2016
Date of Patent: August 18, 2020
Assignee: Universal Electronics Inc.
Inventors: Stephen Brian Gates, Jeremy K. Black
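The three-tier power scheme reduces to picking a polling interval from the device's motion state. The interval values below are invented for illustration; the patent only requires the second interval to be shorter than the first.

```python
# Sketch of the three-tier sensor power scheme: the interval between
# sensor power-ups depends on whether the device is stationary-adjacent,
# moving-adjacent, or removed from the surface. Values are illustrative.
INTERVALS_MS = {
    "stationary_adjacent": 100,  # first (longest) interval: save power
    "moving_adjacent": 8,        # second, shorter interval: track motion
    "removed": 500,              # third interval: only watch for return
}

def next_poll_interval(state: str) -> int:
    """Return how long (ms) to leave the position sensor unpowered."""
    return INTERVALS_MS[state]

print(next_poll_interval("moving_adjacent"))  # -> 8
```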
-
Patent number: 10748331
Abstract: Techniques are disclosed for displaying a graphical element in a manner that simulates three-dimensional (3D) visibility (including parallax and shadowing). More particularly, a number of images, each captured with a known spatial relationship to a target 3D object, may be used to construct a lighting model of the target object. In one embodiment, for example, polynomial texture maps (PTM) using spherical or hemispherical harmonics may be used to do this. Using PTM techniques a relatively small number of basis images may be identified. When the target object is to be displayed, orientation information may be used to generate a combination of the basis images so as to simulate the 3D presentation of the target object.
Type: Grant
Filed: March 4, 2019
Date of Patent: August 18, 2020
Assignee: Apple Inc.
Inventors: Ricardo Motta, Lynn R. Youngs, Minwoong Kim
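The rendering step is a weighted linear combination of basis images, which is the core idea behind polynomial texture maps. The sketch below shows only that combination step; how the weights are derived from orientation (e.g. evaluating hemispherical harmonics) is omitted, and images are plain 2D lists for simplicity.

```python
def render_from_basis(basis_images, weights):
    """Weighted combination of basis images for a given orientation.

    basis_images: list of equally sized 2D lists (grayscale values).
    weights: one coefficient per basis image, e.g. harmonics evaluated
    at the current device orientation. Illustrative sketch of the PTM
    combination step only.
    """
    rows, cols = len(basis_images[0]), len(basis_images[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for w, img in zip(weights, basis_images):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * img[r][c]
    return out

a = [[1.0, 0.0], [0.0, 1.0]]
b = [[0.0, 1.0], [1.0, 0.0]]
print(render_from_basis([a, b], [0.5, 0.5]))  # -> [[0.5, 0.5], [0.5, 0.5]]
```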
-
Patent number: 10746898
Abstract: An electronic system includes a pixelated light source having a plurality of individually controllable pixels, a controller operable to control the pixelated light source, a photosensor configured to detect light signals emitted from the pixelated light source, and an analysis unit configured to recognize objects with different properties that pass in range of the pixelated light source and the photosensor, based on the light signals detected by the photosensor. Corresponding object recognition and material analysis methods are also described.
Type: Grant
Filed: August 1, 2018
Date of Patent: August 18, 2020
Assignee: Infineon Technologies AG
Inventors: Norbert Elbel, Thomas Beer, Georg Pelz
-
Patent number: 10737174
Abstract: An electronic gaming machine (EGM) includes a display having a display surface configured to provide stereoscopic 3D viewing of at least a portion of the game. The portion of the game includes a 3D interface element. The EGM further includes a locating sensor that generates an electronic signal based on a player's location in a sensing space. The EGM includes an ultrasonic emitter configured to emit an ultrasonic field. At least a portion of the ultrasonic field is located in the sensing space. The EGM also includes one or more processors configured to: identify a location of one or more player features based on the electronic signal from the locating sensor; determine that the location of the player features is associated with the 3D interface element; and control one or more ultrasonic emitters based on the identified location to provide tactile feedback to the player.
Type: Grant
Filed: December 31, 2018
Date of Patent: August 11, 2020
Assignee: IGT Canada Solutions ULC
Inventors: Fayez Idris, David Froy
-
Patent number: 10739854
Abstract: A terminal and a touch response method and device are provided. The terminal includes: a screen cover plate comprising a display area portion and a key area portion, with a touch key formed in the key area portion; a touch sensor arranged below the touch key; a pressure sensor arranged below the touch key; and a processing chip electrically connected with the touch sensor and the pressure sensor respectively.
Type: Grant
Filed: July 25, 2017
Date of Patent: August 11, 2020
Assignee: BEIJING XIAOMI MOBILE SOFTWARE CO., LTD.
Inventors: Zhe Liang, Yin Zhu, Zhenzhou Lu
-
Patent number: 10740924
Abstract: Examples are disclosed that relate to tracking a pose of a handheld object used with a head-mounted display device. In one example, a method comprises: receiving image data from an image sensing system; detecting a plurality of feature points of the handheld object in a frame of the image data; receiving inertial measurement unit (IMU) data from an IMU of the handheld object; based on detecting the plurality of feature points and receiving the IMU data, determining a first pose of the handheld object; determining that at least a portion of the plurality of feature points is not detected in another frame of the image data; using the IMU data, updating the first pose of the handheld object to a second pose; and body-locking the second pose of the handheld object to a body location on a user wearing the head-mounted display device.
Type: Grant
Filed: April 16, 2018
Date of Patent: August 11, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Alexandru Octavian Balan, Yogeshwar Narayanan Nagaraj, Constantin Dulu, William Guyman, Ivan Razumenic
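The fallback logic can be sketched as a per-frame choice of pose source: when enough feature points are visible the optically confirmed IMU pose is used; when they are lost, the pose is body-locked. The feature-count threshold, the hard switch, and all data shapes are illustrative simplifications of the patented pipeline.

```python
def update_pose(visible_features: int, imu_pose, body_anchor, min_features=4):
    """Choose the pose source for the current frame.

    visible_features: count of controller feature points detected in the
    camera frame; imu_pose: pose propagated from IMU data; body_anchor:
    a pose locked to a body location on the user. Returns the chosen
    pose and which source produced it. Illustrative sketch only.
    """
    if visible_features >= min_features:
        return imu_pose, "optical+imu"   # features confirm the IMU pose
    return body_anchor, "body_locked"    # features lost: fall back

print(update_pose(6, (0.1, 1.2, 0.4), (0.0, 1.0, 0.3)))
```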
-
Patent number: 10733803
Abstract: A method of implementing a plurality of contextual applications within a mixed reality (MR) environment on an MR-capable device of a user is disclosed. At least one real-world object is identified in the MR environment by applying an object recognition algorithm to one or more attributes of the at least one real-world object that are captured by sensors of the MR-capable device. A first contextual application of the plurality of contextual applications is used to determine an association between a first set of contextual triggers and a second contextual application of the plurality of contextual applications. The second contextual application is initiated when at least one contextual trigger of the first set is satisfied. A function is invoked within the second contextual application based on an interaction of the user with at least one virtual object satisfying a second set of contextual triggers associated with the second contextual application.
Type: Grant
Filed: June 27, 2019
Date of Patent: August 4, 2020
Assignee: Unity IPR ApS
Inventors: Sylvio Herve Drouin, Gregory Lionel Xavier Jean Palmaro, Dioselin Alejandra Gonzalez Rosillo
-
Patent number: 10725560
Abstract: An electronic device includes an input/output interface capable of being electrically connected to a 3-dimensional movable accessory and a processor electrically connected to the input/output interface. The processor is configured to sense an event generated in the electronic device and, in response to the event, to transmit a command to operate the accessory to the accessory through the input/output interface.
Type: Grant
Filed: October 5, 2016
Date of Patent: July 28, 2020
Assignee: Samsung Electronics Co., Ltd.
Inventors: Sang Soo Lee, Byung Soo Kwak, Li Yeon Kang, Jin Heung Kim, Min Jong Lim, Jae Hoon Choi, Jin Keun Park, Jong Chul Choi, Chang Ryong Heo
-
Patent number: 10728485
Abstract: A multi-mode remote control unit that has both infrared command and laser pointer capability. In some examples, the multi-mode remote control unit together with one or both of a television receiver and a television may implement simulated touch gestures that typically require a touch sensitive screen. The multi-mode remote control unit in such an implementation advantageously may not include logic to implement simulated multi-touch gestures. Rather, one or both of the television receiver and the television may include logic to implement simulated touch gestures.
Type: Grant
Filed: December 25, 2014
Date of Patent: July 28, 2020
Assignee: DISH Ukraine L.L.C.
Inventors: Igor Rylskyi Rylovnikov, Andrey Maznev
-
Patent number: 10719214
Abstract: Operating a computerized system includes presenting user interface elements on a display screen. A first gesture made in a three-dimensional space by a distal portion of an upper extremity of a user is detected while a segment of the distal portion thereof rests on a surface. In response to the first gesture, an area of the display screen selected by the user is identified, and a corresponding user interface element is displayed. After displaying the corresponding user interface element, a second gesture made by the distal portion is detected while the segment continues to rest on the surface so as to select one of the user interface elements that appears in the selected area.
Type: Grant
Filed: November 8, 2017
Date of Patent: July 21, 2020
Assignee: APPLE INC.
Inventors: Aviad Maizels, Alexander Shpunt, Shai Litvak
-
Patent number: 10719133
Abstract: Embodiments of the present invention provide a human-machine interaction method of determining an intended target of an object in relation to a user interface, comprising: determining a three-dimensional location of the object at a plurality of time intervals; determining a metric associated with each of a plurality of items of the user interface, the metric indicative of the respective item being the intended target of the object, wherein the metric is determined based upon a model and the location of the object in three dimensions at the plurality of time intervals; and determining, using a Bayesian reasoning process, the intended target from the plurality of items of the user interface based on the metric associated with each of the plurality of items.
Type: Grant
Filed: July 6, 2015
Date of Patent: July 21, 2020
Assignee: JAGUAR LAND ROVER LIMITED
Inventors: Robert Hardy, Lee Skrypchuk, Bashar Ahmad, Patrick Langdon, Simon Godsill
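The Bayesian step can be illustrated with a toy model: score each interface item by a Gaussian likelihood of the observed pointing trajectory, then normalize into a posterior under a uniform prior. The Gaussian-distance likelihood, the uniform prior, and all names are assumptions; the patent's models are richer.

```python
import math

def intended_target(items, trajectory, sigma=0.1):
    """Posterior over interface items given a 3D pointing trajectory.

    items: {name: (x, y, z)} item locations; trajectory: observed object
    locations over time intervals. Toy Gaussian likelihood, uniform
    prior. Illustrative sketch only.
    """
    log_lik = {}
    for name, (ix, iy, iz) in items.items():
        total = 0.0
        for (px, py, pz) in trajectory:
            d2 = (px - ix) ** 2 + (py - iy) ** 2 + (pz - iz) ** 2
            total += -d2 / (2 * sigma ** 2)
        log_lik[name] = total
    # Bayes' rule with a uniform prior: normalize the likelihoods.
    m = max(log_lik.values())
    unnorm = {k: math.exp(v - m) for k, v in log_lik.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

items = {"play": (0.0, 0.0, 0.0), "stop": (0.3, 0.0, 0.0)}
traj = [(0.25, 0.0, 0.1), (0.28, 0.0, 0.05), (0.3, 0.0, 0.02)]
posterior = intended_target(items, traj)
print(max(posterior, key=posterior.get))  # -> stop
```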
-
Patent number: 10720082
Abstract: A method includes capturing, by a camera coupled to a computing device, a video of an experimental platform device having a designated area for an experiment, and displaying, by the computing device, the video of the experimental platform device. The method further includes superimposing, in the video, an overlay animation on the designated area of the experimental platform device, the overlay animation corresponding to an environment of the experiment. The method further includes receiving, by the computing device from the experimental platform device, independent variable data corresponding to manipulations of a sensor of the experimental platform device by a user. The method further includes modifying, in the video, the overlay animation superimposed on the designated area based on the independent variable data.
Type: Grant
Filed: September 7, 2017
Date of Patent: July 21, 2020
Assignee: CTSKH, LLC
Inventors: Cyrillus Kunta Hutabarat, Conny Susan Karman
-
Patent number: RE48221
Abstract: Disclosed is a system comprising a handheld device and at least one display. The handheld device is adapted for performing at least one action in a physical 3D environment; wherein the at least one display is adapted for visually representing the physical 3D environment; and wherein the handheld device is adapted for remotely controlling the view with which the 3D environment is represented on the display.
Type: Grant
Filed: July 30, 2019
Date of Patent: September 22, 2020
Assignee: 3SHAPE A/S
Inventors: Henrik Öjelund, David Fischer, Karl-Josef Hollenbeck