Including Orientation Sensors (e.g., Infrared, Ultrasonic, Remotely Controlled) Patents (Class 345/158)
-
Patent number: 12386441
Abstract: A method, a computer readable medium, and an apparatus for wireless communication are provided. The apparatus may receive, via a radio frequency channel, a first data from a first human interface device at a first time slot of a frame. The apparatus may transmit the first data to a computing device. The apparatus may receive, via the radio frequency channel, a second data from a second human interface device at a second time slot of the frame. The apparatus may transmit the second data to the computing device. The apparatus may transmit, via the radio frequency channel at a third time slot of the frame, at least one of a first acknowledgment to the first data, a first lighting effect configuration data for the first human interface device, a second acknowledgment to the second data, or a second lighting effect configuration data for the second human interface device.
Type: Grant
Filed: October 18, 2023
Date of Patent: August 12, 2025
Assignee: Razer (Asia-Pacific) Pte. Ltd.
Inventors: Kah Yong Lee, Chee Oei Chan, Gui Mei Dai
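The abstract above describes a time-division frame in which each human interface device owns an uplink slot and the dongle answers in a shared downlink slot. The following is a minimal Python sketch of that slot structure, not Razer's implementation; all names, slot durations, and payloads are assumptions for illustration.

```python
# Illustrative sketch of the TDM frame described above (not the patented implementation):
# two human interface devices (HIDs) each own an uplink time slot, and the receiver
# answers in a shared third slot with acknowledgments and lighting-effect data.
from dataclasses import dataclass, field

@dataclass
class Frame:
    slot_us: int = 1000                           # hypothetical slot duration in microseconds
    uplink: dict = field(default_factory=dict)    # slot index -> HID payload
    downlink: list = field(default_factory=list)  # acks / lighting configs sent in slot 2

def forward_to_host(payload: bytes) -> None:
    print(f"forwarding {payload!r} to computing device")  # stand-in for the USB transfer

def run_frame(frame: Frame, hid1_data: bytes, hid2_data: bytes) -> list:
    # Slot 0: receive from HID 1, forward to the computing device.
    frame.uplink[0] = hid1_data
    forward_to_host(hid1_data)
    # Slot 1: receive from HID 2, forward to the computing device.
    frame.uplink[1] = hid2_data
    forward_to_host(hid2_data)
    # Slot 2: transmit acknowledgments and per-device lighting-effect configuration.
    frame.downlink = [
        ("ack", 1), ("lighting", 1, b"\x10\x20"),  # placeholder lighting config bytes
        ("ack", 2), ("lighting", 2, b"\x30\x40"),
    ]
    return frame.downlink

if __name__ == "__main__":
    print(run_frame(Frame(), b"mouse-report", b"keyboard-report"))
```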
-
Patent number: 12386430
Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object and in response to detecting a 2D sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3D solid control object model.
Type: Grant
Filed: June 28, 2024
Date of Patent: August 12, 2025
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Alex Marcolina, David S. Holz
-
Patent number: 12380698
Abstract: An information processing terminal including a communication device, which transmits a control command to a controlled device via the communication device to remotely control the controlled device, comprises: a virtual remote control generator configured to identify the controlled device and a desired control operation that is a control operation desired to be performed from a surrounding image which is an image around the information processing terminal, and generate an acceptance object for accepting an operation instruction to the desired control operation from a user; an operation acceptance section configured to accept the operation instruction from the user via the acceptance object; and a command transmitter configured to transmit the control command corresponding to the operation instruction accepted by the operation acceptance section to the controlled device.
Type: Grant
Filed: July 8, 2020
Date of Patent: August 5, 2025
Assignee: MAXELL, LTD.
Inventors: Mayumi Nakade, Tamotsu Ito
-
Patent number: 12381922
Abstract: An example may include identifying an initial context of a meeting based on one or more data inputs received from one or more participant devices operated by a plurality of meeting participants, querying remote data sources to identify one or more additional contexts associated with one or more of a specific period of time and a trending topic related to the initial context, forwarding a first of the one or more additional contexts to a first collaboration space of a virtual collaboration space, forwarding at least one additional context of the one or more additional contexts to a second collaboration space of the virtual collaboration space, identifying one or more sentiment actions associated with the one or more meeting participants, and overlaying the first collaboration space and the second collaboration space on an augmented reality display of a device based on the one or more sentiment actions.
Type: Grant
Filed: May 15, 2024
Date of Patent: August 5, 2025
Assignee: NCA Holding BV
Inventors: Daniel Erasmus, David Marvit, Rein Brune, Petrus Pelser, Ian van Wyk, Floris Fok
-
Patent number: 12381924
Abstract: The present disclosure generally relates to real-time communication user interfaces. A computer system displays a plurality of selectable options for sharing a link to a real-time communication session via a plurality of respective communication protocols. A computer system displays a selectable user interface element corresponding to a link to a real-time communication session that, when selected via user input, initiates a process to send the link to a user and displays a user interface for participating in the real-time communication session with the user. A computer system displays a visual representation of a user attempting to join a real-time communication session that includes an option that is selectable to determine whether the user is allowed to participate in the real-time communication session.
Type: Grant
Filed: December 19, 2023
Date of Patent: August 5, 2025
Assignee: Apple Inc.
Inventors: Marco Triverio, Jae Woo Chang, Lauren E. Tappana, Marcel Van Os
-
Patent number: 12373024
Abstract: Systems and techniques are described herein for generating and/or processing virtual content in a distributed system (e.g., a distributed extended reality system). For example, a first device (e.g., a client device) of a distributed extended reality system may transmit one or more sets of pose prediction parameters (e.g., prediction coefficients, prediction time associated with raw pose data, and/or the raw pose data) to a second device (e.g., a server device) of the distributed extended reality system. The second device may predict one or more poses of the first device based on the set(s) of pose prediction parameters, and may generate virtual content based on a pose of the predicted pose(s) of the first device. The second device may transmit, and the first device may receive, the virtual content. The first device may then display one or more virtual objects based at least in part on the received virtual content.
Type: Grant
Filed: March 27, 2024
Date of Patent: July 29, 2025
Assignee: QUALCOMM Incorporated
Inventors: Sriram Ajaykumar, Vinay Melkote Krishnaprasad, Arjun Sitaram
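As a rough illustration of the server-side prediction step described above, the sketch below extrapolates a client pose from raw pose data, prediction coefficients, and a prediction time. The patent does not specify the predictor; a first-order (linear) model and all variable names here are assumptions.

```python
# Sketch only: predict a future client pose from pose-prediction parameters
# (raw pose, prediction coefficients, prediction time). A linear model is assumed.
import numpy as np

def predict_pose(raw_pose: np.ndarray, coeffs: np.ndarray, dt: float) -> np.ndarray:
    """raw_pose: [x, y, z, yaw, pitch, roll]; coeffs: first-order rate for each term."""
    return raw_pose + coeffs * dt

raw = np.array([0.0, 1.6, 0.0, 10.0, 0.0, 0.0])     # position (m) + orientation (deg)
rates = np.array([0.1, 0.0, 0.0, 30.0, 0.0, 0.0])   # assumed per-second rates (coefficients)
predicted = predict_pose(raw, rates, dt=0.050)       # render for 50 ms into the future
print(predicted)   # the server would generate virtual content for this predicted pose
```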
-
Patent number: 12361106
Abstract: A collaborative session (e.g., a virtual time capsule) in which access to a collaborative object and added virtual content is selectively provided to participants/users. In one example of the collaborative session, authentication of the collaborative object is performed by all of the users to complete the collaborative session. Each user authenticates the collaborative object, such as using a stamping gesture on a user interface of a client device or in an augmented reality session. User specific data is recorded with the stamping gesture to authenticate the collaborative object and the associated virtual content. In an example, user specific data may include device information, participant profile information, or biometric signal information. Biometric signal information, such as a fingerprint from a mobile device or a heart rate received from a connected smart device, can be used to provide an authenticating signature to the seal.
Type: Grant
Filed: August 31, 2022
Date of Patent: July 15, 2025
Assignee: Snap Inc.
Inventors: Youjean Cho, Chen Ji, Fannie Liu, Andrés Monroy-Hernández, Tsung-Yu Tsai, Rajan Vaish
-
Patent number: 12361759
Abstract: A control apparatus for providing a user interface to a user is provided. The apparatus comprises a display control unit configured to display a selectable object on a display apparatus, a detection unit configured to analyze an image of a hand of the user to detect a position of the hand, a specifying unit configured to specify an indicated position in the display apparatus based on the position of the hand, and a processing unit configured to, based on the indicated position overlapping the selectable object, execute processing associated with the selectable object. The display control unit further displays, in the display apparatus, a cursor that indicates the indicated position.
Type: Grant
Filed: January 18, 2023
Date of Patent: July 15, 2025
Assignee: HONDA MOTOR CO., LTD.
Inventors: Cheree Anne Schepp, Helge Wagner
-
Patent number: 12355135
Abstract: An information handling system with a wireless charging system or wireless antenna may include a processor; a memory; a power management unit (PMU); an antenna controller to provide instructions to a radio to cause an antenna to transceive wirelessly with a network, or a wireless charging controller to schedule the wireless charging at a charging coil, where the antenna or charging coil are part of a twice-molded modular antenna including: the antenna or charging coil, an antenna holder onto which the antenna is insert molded, and an antenna cover that is molded onto the antenna or charging coil for interference at an antenna window edge surface of a chassis cover of the information handling system.
Type: Grant
Filed: April 28, 2022
Date of Patent: July 8, 2025
Assignee: DELL PRODUCTS LP
Inventors: Wu Chin-Chung, Ching Wei Chang
-
Patent number: 12353660
Abstract: A technique for reading the columns and rows of a touch sensor provides several advantages, including scalability to touch sensors with high column/row counts. Reading the touch sensor relies on combining the row or column outputs of the touch sensor, during column and/or row excitation of the touch sensor, with the resulting combined signal or signals transformed into the frequency domain and touch detection based on evaluation of the frequency-domain values corresponding to the excitation frequencies.
Type: Grant
Filed: March 21, 2024
Date of Patent: July 8, 2025
Assignee: Telefonaktiebolaget LM Ericsson (publ)
Inventors: Mohammed Zourob, Alexander Hunt, Andreas Kristensson, Mohammed Abdulaziz, Bryan Smith
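To make the frequency-domain readout concrete, here is a small sketch of the general idea, not Ericsson's circuit or algorithm: each column is excited at its own frequency, the row outputs are summed into one combined signal, and a touch is inferred where the FFT magnitude at an excitation frequency drops. The sample rate, frequencies, and threshold are assumptions.

```python
# Rough sketch of frequency-domain touch detection on a combined row signal.
import numpy as np

fs, n = 100_000, 1000                           # sample rate (Hz) and record length (assumed)
t = np.arange(n) / fs
excitation_hz = [5_000, 7_000, 9_000, 11_000]   # one excitation frequency per column

def combined_row_signal(touched_column: int) -> np.ndarray:
    sig = np.zeros(n)
    for col, f in enumerate(excitation_hz):
        amp = 0.4 if col == touched_column else 1.0   # a touch attenuates that column's coupling
        sig += amp * np.sin(2 * np.pi * f * t)
    return sig

def detect_touch(sig: np.ndarray, threshold: float = 0.7) -> list:
    spectrum = np.abs(np.fft.rfft(sig)) / (n / 2)              # amplitude spectrum
    bins = [int(round(f * n / fs)) for f in excitation_hz]     # bins at the excitation frequencies
    return [col for col, b in enumerate(bins) if spectrum[b] < threshold]

print(detect_touch(combined_row_signal(touched_column=2)))     # -> [2]
```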
-
Patent number: 12355712
Abstract: Aspects of the present disclosure involve a system and a method for performing operations comprising: receiving, by a messaging application, content from a given user; selecting a metric for measuring performance of the content on the messaging application; measuring performance of the content on the messaging application; computing a value of the performance of the content on the messaging application based on the selected metric; and updating a restricted use token wallet stored in a profile for the given user based on the computed value of the performance of the content.
Type: Grant
Filed: November 7, 2023
Date of Patent: July 8, 2025
Assignee: Snap Inc.
Inventors: John Jensen, Swetha Krishna Prabhakar
-
Patent number: 12353644
Abstract: A display control system according to the present disclosure includes a cross-section display which, when an operation device moves in a direction away from a display device and overlaps a virtual plane at a position separated from the display device by a predetermined distance, causes the display device to display the object image including a cut plane obtained by cutting the operation device at the virtual plane.
Type: Grant
Filed: April 4, 2023
Date of Patent: July 8, 2025
Assignee: SHARP KABUSHIKI KAISHA
Inventors: Naoki Tani, Yusuke Konishi, Hiroaki Okumoto
-
Patent number: 12346502
Abstract: An example method includes identifying, by a mobile computing device that includes a housing and a presence-sensitive display, and based on a first group of sensor signals provided at least by an inertial measurement unit included in one or more sensors, at least one first gesture that is performed at one or more portions of the housing, wherein the one or more portions are separate from the display, initiating an interaction mode, outputting at least one visual or audio indicator for the interaction mode that is associated with a particular function of the mobile computing device, identifying, based on a third group of sensor signals provided by the one or more sensors, at least one second gesture that is performed at the one or more portions of the housing to confirm a user selection of the particular function, and, responsive to identifying the at least one second gesture, performing the particular function.
Type: Grant
Filed: August 25, 2020
Date of Patent: July 1, 2025
Assignee: Google LLC
Inventors: Xuelin Huang, Shumin Zhai
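The arm-then-confirm flow described above can be summarized as a small state machine. The sketch below is an assumption-laden illustration, not Google's implementation; the gesture names, the indicator, and the bound function are hypothetical.

```python
# Minimal state-machine sketch: a first housing gesture (from IMU signals) arms an
# interaction mode and shows an indicator; a second housing gesture confirms and
# triggers the associated function; anything else cancels.
class EdgeGestureController:
    IDLE, ARMED = "idle", "armed"

    def __init__(self, function):
        self.state = self.IDLE
        self.function = function

    def on_gesture(self, gesture: str) -> None:
        if self.state == self.IDLE and gesture == "double_tap_housing":
            self.state = self.ARMED
            print("indicator: interaction mode active")   # visual/audio indicator
        elif self.state == self.ARMED and gesture == "squeeze_housing":
            self.state = self.IDLE
            self.function()                                # confirmed: perform the function
        elif self.state == self.ARMED:
            self.state = self.IDLE                         # any other gesture cancels

ctrl = EdgeGestureController(lambda: print("taking screenshot"))
ctrl.on_gesture("double_tap_housing")
ctrl.on_gesture("squeeze_housing")
```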
-
Patent number: 12296832
Abstract: A travel information sensor senses travel information of a host-vehicle. A biological information sensor senses biological information of a driver. A camera unit senses a facial expression of the driver. A communication unit acquires an agitating degree indicating a degree to which an other-vehicle agitates the driver of the host-vehicle, via a network. An agitated degree calculation unit calculates an agitated degree indicating a degree to which the driver of the host-vehicle is agitated by the other-vehicle. A danger degree determination unit determines a danger degree including whether the driver of the host-vehicle is agitated by the other-vehicle, based on the agitated degree and the agitating degree. A presentation unit warns the host-vehicle of the danger degree if it is determined that the driver of the host-vehicle is agitated by the other-vehicle.
Type: Grant
Filed: September 22, 2022
Date of Patent: May 13, 2025
Assignee: JVCKENWOOD CORPORATION
Inventor: Shuta Yufune
-
Patent number: 12282608
Abstract: A control method of a user interface may include: detecting an occupant from an image input from a camera; estimating a skeleton of the occupant; estimating a relative position of a finger of the occupant based on the estimated skeleton; determining a projection position based on the relative position of the finger; and projecting, by an image projector, a projection image onto the projection position and playing the projection image.
Type: Grant
Filed: August 28, 2023
Date of Patent: April 22, 2025
Assignees: Hyundai Motor Company, Kia Corporation
Inventors: Danghee Park, Sukmin Choi, Sunwook Kim, Juhee Park, Jangryul Rim, Sehyun Chang, Hongmin Kim
-
Patent number: 12283257
Abstract: A display device including a display unit having a display panel and a camera; a posture adjustment driving unit configured to adjust a posture of the display unit; and a controller configured to control the camera to capture a first image of a viewer viewing the display unit when the display unit is positioned at a first height, control the posture adjustment driving unit to move the display unit to a second height different than the first height, control the camera to capture a second image of the viewer viewing the display unit when the display unit is positioned at the second height, correct at least one of the first image and the second image based on orientation information of the display unit on capturing the first image and the second image, generate a 3D body mesh of the viewer using the first image and the second image, calculate a viewer-optimized display unit posture setting based on the 3D body mesh, and control the posture adjustment driving unit to move the display unit to a height corresponding…
Type: Grant
Filed: May 3, 2023
Date of Patent: April 22, 2025
Assignee: LG ELECTRONICS INC.
Inventors: Jinho Park, Hyunjae Jun, Junho Roh, Sangguel Oh, Taewoong Park
-
Patent number: 12277954
Abstract: A method for operating a voice trigger is provided. In some implementations, the method is performed at an electronic device including one or more processors and memory storing instructions for execution by the one or more processors. The method includes receiving a sound input. The sound input may correspond to a spoken word or phrase, or a portion thereof. The method includes determining whether at least a portion of the sound input corresponds to a predetermined type of sound, such as a human voice. The method includes, upon a determination that at least a portion of the sound input corresponds to the predetermined type, determining whether the sound input includes predetermined content, such as a predetermined trigger word or phrase. The method also includes, upon a determination that the sound input includes the predetermined content, initiating a speech-based service, such as a voice-based digital assistant.
Type: Grant
Filed: April 16, 2024
Date of Patent: April 15, 2025
Assignee: Apple Inc.
Inventors: Justin Binder, Samuel D. Post, Onur Tackin, Thomas R. Gruber
-
Patent number: 12272276
Abstract: A third controller of a control device executes obtaining a taken image obtained by imaging an operator, making a projector display a first display image including the taken image, detecting an operation gesture corresponding to processing to be executed by the control device from the taken image, generating a second display image based on the operation gesture, and making the projector display the second display image.
Type: Grant
Filed: June 16, 2023
Date of Patent: April 8, 2025
Assignee: SEIKO EPSON CORPORATION
Inventor: Tetsuo Mano
-
Patent number: 12270983
Abstract: A method of determination of a trajectory of a living tissue, in which subsequent frames representing images of living tissue are acquired with at least a first imaging device, and at least a first segment of a trajectory of the living tissue is determined with the use of relative displacements of at least a first subset of subsequent frames and recalculation thereof to coordinates corresponding to the time of frame acquisition, to form a vector Tm of elements tm, wherein element tm is at least one coordinate of the living tissue, and element tm of vector Tm is determined with the use of relative displacements pm,k of at least two preceding frames.
Type: Grant
Filed: January 29, 2021
Date of Patent: April 8, 2025
Assignee: Inoko Vision Spółka z ograniczoną odpowiedzialnością
Inventors: Maciej Szkulmowski, Krzysztof Dalasinski, Michal Meina, Maciej Nowakowski, Krystian Wrobel, Maciej Bartuzel, Szymon Tamborski, Patrycjusz Stremplewski
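A short sketch of the trajectory bookkeeping implied above: each trajectory element tm is built by accumulating relative displacements pm,k measured against preceding frames, indexed by acquisition order. Combining multiple reference frames by averaging is an assumption here; the patent does not fix how the estimates are merged.

```python
# Sketch: accumulate inter-frame displacements into a per-frame trajectory vector.
import numpy as np

def build_trajectory(displacements: dict, n_frames: int) -> np.ndarray:
    """displacements[(m, k)] = measured shift of frame m relative to an earlier frame k."""
    traj = np.zeros((n_frames, 2))                      # element tm as an (x, y) tissue coordinate
    for m in range(1, n_frames):
        refs = [k for (mm, k) in displacements if mm == m]
        estimates = [traj[k] + np.asarray(displacements[(m, k)]) for k in refs]
        traj[m] = np.mean(estimates, axis=0)            # combine estimates from >= 1 preceding frame
    return traj

disp = {(1, 0): (0.5, 0.0), (2, 1): (0.4, 0.1), (2, 0): (0.9, 0.1), (3, 2): (0.3, 0.0)}
print(build_trajectory(disp, n_frames=4))
```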
-
Patent number: 12265221
Abstract: Head-mounted augmented reality (AR) devices can track pose of a wearer's head to provide a three-dimensional virtual representation of objects in the wearer's environment. An electromagnetic (EM) tracking system can track head or body pose. A handheld user input device can include an EM emitter that generates an EM field, and the head-mounted AR device can include an EM sensor that senses the EM field. EM information from the sensor can be analyzed to determine location and/or orientation of the sensor and thereby the wearer's pose. The EM emitter and sensor may utilize time division multiplexing (TDM) or dynamic frequency tuning to operate at multiple frequencies. Voltage gain control may be implemented in the transmitter, rather than the sensor, allowing smaller and lighter weight sensor designs. The EM sensor can implement noise cancellation to reduce the level of EM interference generated by nearby audio speakers.
Type: Grant
Filed: August 24, 2022
Date of Patent: April 1, 2025
Assignee: MAGIC LEAP, INC.
Inventors: Brian Bucknor, Christopher Lopez, Michael Janusz Woods, Aly H. M. Aly, James William Palmer, Evan Francis Rynk
-
Patent number: 12265479
Abstract: In some examples, an electronic device includes a network interface to connect to a computer network, a display port operating as a display input, an input/output port, memory having a plurality of profiles, and a processor coupled to the memory. In some examples, the processor is to change an operation of the input/output port based on a profile of the plurality of profiles. In some examples, the processor is to change the display port from the display input to a display output based on the profile of the plurality of profiles.
Type: Grant
Filed: June 30, 2021
Date of Patent: April 1, 2025
Assignee: Hewlett-Packard Development Company, L.P.
Inventors: Peter Andrew Seiler, Byron A. Alcorn, Clifton Robin, John Michael Stahl, Patrick S. Anderson, Eric John Gressman, Douglas Allen Reynolds, Joseph-Jonathan Salzano, Gregory Mark Hughes
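To illustrate the profile-driven reconfiguration described above, here is a hypothetical sketch of how stored profiles might map to port modes. The field names, profile names, and modes are assumptions for illustration, not HP's firmware interface.

```python
# Hypothetical profile-driven port configuration: a selected profile decides whether
# the display port acts as a sink (input) or source (output) and how the I/O port behaves.
from dataclasses import dataclass

@dataclass
class PortProfile:
    name: str
    display_port_mode: str   # "input" (display sink) or "output" (display source)
    io_port_mode: str        # e.g. "usb_data", "charging_only", "disabled"

PROFILES = {
    "conference_display": PortProfile("conference_display", "input", "charging_only"),
    "workstation":        PortProfile("workstation", "output", "usb_data"),
}

def apply_profile(profile: PortProfile) -> None:
    # Stand-ins for the register writes / firmware calls that would select port behavior.
    print(f"display port -> {profile.display_port_mode}")
    print(f"I/O port     -> {profile.io_port_mode}")

apply_profile(PROFILES["workstation"])   # switches the display port to a display output
```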
-
Patent number: 12260023
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
Type: Grant
Filed: June 13, 2023
Date of Patent: March 25, 2025
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Patent number: 12243523
Abstract: This relates generally to intelligent automated assistants and, more specifically, to provide a handsfree notification management system. An example method includes displaying one or more notifications by the electronic device. In response to displaying the one or more notifications, the method includes detecting a visual interaction of a user with the one or more notifications, identifying a notification from the one or more notifications based on the visual interaction, receiving a speech input related to the notification from the user, determining one or more actions associated with the notification based on the speech input, and performing the one or more actions.
Type: Grant
Filed: September 13, 2022
Date of Patent: March 4, 2025
Assignee: Apple Inc.
Inventors: Andrea Valentina Simes, Daniel W. Loo, Felicia W. Edwards, Harry M. Simmonds, Tyler R. Calderone, Kevin D. Pitolin, Lorena S. Pazmino
-
Patent number: 12236052
Abstract: A position indicator that electrostatically interacts with a sensor includes: a casing having a pen shape; a conductive core body including a pen tip that protrudes from an opening on one end in the axial direction of the casing; a conductor surrounding the core body; a signal transmitting circuit which, in operation, generates a signal that electrostatically interacts with the sensor, and supplies the generated signal to the core body; and a control circuit which, in operation, produces different electrostatic interactions with the sensor by performing a first control operation that sets the conductor in a state of being grounded while the signal is sent out from the pen tip of the core body to the sensor, and performing a second control operation that sets the conductor in a state different from the state of being grounded.
Type: Grant
Filed: May 2, 2023
Date of Patent: February 25, 2025
Assignee: Wacom Co., Ltd.
Inventor: Sadao Yamamoto
-
Patent number: 12236524
Abstract: A data processing apparatus includes receiving circuitry to receive graphics data for at least a portion of a virtual environment, the graphics data including a plurality of virtual objects, a machine learning model trained to output data indicative of an object classification for one or more of the virtual objects in dependence upon the graphics data, assignment circuitry to assign one or more of the virtual objects to one or more virtual object groups in dependence upon the object classification for one or more of the virtual objects, where the virtual objects assigned to a same virtual object group have a same object classification, and control circuitry to share first interaction data associated with a first virtual object assigned to a virtual object group with a second virtual object assigned to the virtual object group to thereby permit use of the first interaction data for the second virtual object in response to a user interaction with the second virtual object in the virtual environment.
Type: Grant
Filed: October 18, 2022
Date of Patent: February 25, 2025
Assignee: Sony Interactive Entertainment Inc.
Inventors: Fabio Cappello, Mandana Jenabzadeh, Michael Lee Jones
-
Patent number: 12236016
Abstract: A coordinate reception unit receives time series spatial coordinates of an object to be detected. The spatial coordinates include a coordinate in a first direction, a coordinate in a second direction intersecting the first direction, and a coordinate in a third direction intersecting the first direction and the second direction. When a predetermined first condition related to a change in the coordinate in the first direction is satisfied at a first point of time and then a predetermined second condition related to a change in the coordinate in the first direction is satisfied at a second point of time, a function execution unit executes a process in electronic equipment that is a target of control, based on the coordinate in the second direction and the coordinate in the third direction at one or more points of time between the first point of time and the second point of time.
Type: Grant
Filed: October 31, 2023
Date of Patent: February 25, 2025
Assignee: PANASONIC AUTOMOTIVE SYSTEMS CO., LTD.
Inventor: Atsushi Hirai
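The two-condition trigger described above can be read as a push-style gesture: a change in the first-direction coordinate starts the gesture, an opposite change ends it, and the second- and third-direction coordinates sampled in between pick the target. The sketch below assumes specific axes, thresholds, and a fingertip as the detected object; none of that is fixed by the abstract.

```python
# Sketch of a push-style trigger over time-series (t, x, y, z) samples.
def detect_push(samples, dz_in=-0.05, dz_out=0.05):
    """samples: list of (t, x, y, z) for the detected object (e.g., a fingertip)."""
    start = end = None
    for i in range(1, len(samples)):
        dz = samples[i][3] - samples[i - 1][3]
        if start is None and dz <= dz_in:        # first condition: moving toward the panel
            start = i
        elif start is not None and dz >= dz_out: # second condition: moving away again
            end = i
            break
    if start is None or end is None:
        return None
    xs = [s[1] for s in samples[start:end]]
    ys = [s[2] for s in samples[start:end]]
    return sum(xs) / len(xs), sum(ys) / len(ys)  # where to apply the control action

trace = [(0, 0.20, 0.10, 0.30), (1, 0.21, 0.10, 0.22), (2, 0.21, 0.11, 0.15),
         (3, 0.21, 0.11, 0.16), (4, 0.22, 0.11, 0.25)]
print(detect_push(trace))   # -> averaged (x, y) between the two conditions
```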
-
Patent number: 12230262
Abstract: Presented herein are techniques in which a device detects a phrase spoken in an online collaboration session between a plurality of users, the phrase being spoken by a first user to one or more second users. The device determines that the phrase indicates an issue with a quality of user experience of the online collaboration session, labels a log of metrics associated with the online collaboration session with a time stamp corresponding to a time when the phrase was spoken, to provide a labeled log of metrics, and performs one or more actions to improve the user experience based on detecting the phrase.
Type: Grant
Filed: October 19, 2021
Date of Patent: February 18, 2025
Assignee: CISCO TECHNOLOGY, INC.
Inventors: Ali Mouline, Christopher Rowen, David Guoqing Zhang, Francis Anthony Kurupacheril
-
Patent number: 12232234
Abstract: There is provided a controller for receiving input from a user, including: a plurality of input elements, each operating to receive respective operative input from a user, the respective operative input being processed and transmitted to an external apparatus for manipulating an application program during execution; a memory having a plurality of sets of profile data stored therein, each set of profile data includes a respective profile ID in association with respective location information indicative of where a respective one of the input elements is located on the controller; a communication unit operating to receive a user-selected one of the plurality of sets of profile data; and a control unit operating to control the controller based on the selected profile data set received from the communication unit.
Type: Grant
Filed: July 18, 2023
Date of Patent: February 18, 2025
Assignee: Sony Interactive Entertainment Inc.
Inventors: Akinori Ito, Hiroshi Morita, Tomoe Ochiai, Takeshi Igarashi, Yuji Takeuchi
-
Patent number: 12231614
Abstract: A communication terminal device acquires audio including the voice of its own user using the device; acquires visual information for enabling formation of three-dimensional data of objects in a predetermined user space; identifies the viewpoint position of its own user on the basis of the visual information on the user space; transmits, to another device, image transmission information based on the visual information on the user space and audio transmission information based on the audio; displays, on the basis of image transmission information transmitted from the other device, on a display surface having a fixed relative position relative to a shared virtual space, in which a first virtual space in which the user space is fixedly arranged and a second virtual space in which the user space of the other device is fixedly arranged are arranged at predetermined relative positions and relative angles, an image of the shared virtual space which can be viewed from the viewpoint position of its own user; and outputs audio on the basis of audio transmission information transmitted from the other device.
Type: Grant
Filed: April 30, 2020
Date of Patent: February 18, 2025
Assignee: VIRTUAL WINDOW CO., LTD.
Inventor: Rui Sato
-
Patent number: 12222416
Abstract: According to various embodiments, an electronic device using a millimeter wave comprises: a first antenna array; a second antenna array; a communication circuit; and at least one processor, wherein the at least one processor may be configured to: control the communication circuit to output a first signal through the first antenna array; when a first reflected signal acquired from the first signal reflected by an object is received through the second antenna array, determine the range between the object and the electronic device on the basis of the first reflected signal; determine an output period of a second signal on the basis of the determined range; control the communication circuit to output the second signal through the first antenna array according to the determined output period; and when a second reflected signal acquired from the second signal reflected by the object is received through the second antenna array, identify an attribute of the object on the basis of the second reflected signal.
Type: Grant
Filed: February 10, 2022
Date of Patent: February 11, 2025
Assignee: Samsung Electronics Co., Ltd.
Inventors: Hyunkee Min, Chiho Kim, Junghun Lee, Taehun Lim, Junsu Choi
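As a rough illustration of the range-then-adapt flow described above, the sketch below estimates range from a round-trip delay and chooses an output period for the second signal from that range. The delay-based ranging formula is standard radar arithmetic; the period policy and thresholds are assumptions, not Samsung's algorithm.

```python
# Sketch: range from round-trip delay, then an output period chosen from that range.
C = 3.0e8   # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0            # one-way distance

def output_period_for_range(distance_m: float) -> float:
    # Assumed policy: probe nearby objects more often, distant ones less often.
    if distance_m < 0.3:
        return 0.02                           # 20 ms between second-signal transmissions
    if distance_m < 1.0:
        return 0.10
    return 0.50

d = range_from_delay(4.0e-9)                  # a 4 ns round trip -> 0.6 m
print(d, output_period_for_range(d))          # -> 0.6 0.1
```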
-
Patent number: 12223603
Abstract: Provided is a method of learning a target object implemented on a computer-aided design program of an authoring computing device using a virtual viewpoint camera, including displaying a digital model of a target object that is a target for image recognition, setting at least one observation area surrounding the digital model of the target object and having a plurality of viewpoints on the digital model, generating a plurality of pieces of image data obtained by viewing the digital model of the target object at the plurality of viewpoints of the at least one observation area, and generating object recognition library data for recognizing a real object implementing the digital model of the target object based on the generated plurality of pieces of image data.
Type: Grant
Filed: November 21, 2022
Date of Patent: February 11, 2025
Assignee: VIRNECT INC.
Inventors: Ki Young Kim, Thorsten Korpitsch
-
Patent number: 12223104
Abstract: Systems and methods for providing partial passthrough video to a user of a virtual reality device are disclosed herein. Providing the partial passthrough video can include detecting a hand passthrough trigger event and identifying a hand passthrough video feed. Providing partial passthrough video can further include aligning the hand passthrough video feed with a virtual environment presented to a user by the virtual reality device and, based on the aligning of the hand passthrough video feed with the virtual environment, overlaying the hand passthrough video feed on the virtual environment.
Type: Grant
Filed: October 11, 2021
Date of Patent: February 11, 2025
Assignee: Meta Platforms Technologies, LLC
Inventors: Michael James LeBeau, John Nicholas Jitkoff
-
Patent number: 12216813
Abstract: A processing method for a request for a digital service in an interactive environment, where the method includes: generating an enriched request based on an interpretation of signals received from a multimodal set of human-machine interactions resulting from human-machine interactions coming from coordinated actions of a plurality of distinct users in the interactive environment, where each human-machine interaction, considered individually, is only indicative of one simple request and where the enriched request triggers sending at least one command.
Type: Grant
Filed: May 30, 2023
Date of Patent: February 4, 2025
Assignee: Orange
Inventors: Chantal Guionnet, Hélène Joucla
-
Patent number: 12217371
Abstract: Techniques are disclosed, whereby graphical information for a first image frame to be rendered is obtained at a first device, the graphical information comprising at least depth information for at least a portion of the pixels within the first image frame. Next, a regional depth value may be determined for a region of pixels in the first image frame. Next, the region of pixels may be coded as either a "skipped" region or a "non-skipped" region based, at least in part, on the determined regional depth value for the region of pixels. Finally, if the region of pixels is coded as a non-skipped region, a representation of the region of pixels may be rendered and composited with any other graphical content, as desired, to a display of the first device; whereas, if the region of pixels is coded as a skipped region, the first device may avoid rendering the region.
Type: Grant
Filed: September 21, 2022
Date of Patent: February 4, 2025
Assignee: Apple Inc.
Inventor: Seyedkoosha Mirhosseini
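A minimal sketch of the skip/non-skip decision described above: aggregate depth per tile and skip tiles whose regional depth lies at or beyond a "far" threshold. The tile size, the choice of aggregation (minimum depth), and the threshold are assumptions, not the patented scheme.

```python
# Sketch: code fixed-size pixel regions as skipped or non-skipped from a regional depth value.
import numpy as np

def code_regions(depth: np.ndarray, tile: int = 8, far_threshold: float = 0.98):
    h, w = depth.shape
    coded = {}
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            region = depth[y:y + tile, x:x + tile]
            regional_depth = region.min()          # nearest depth in the tile (assumed aggregation)
            coded[(y, x)] = "skipped" if regional_depth >= far_threshold else "non-skipped"
    return coded

depth_buffer = np.ones((16, 16))                   # everything at the far plane...
depth_buffer[2:6, 2:6] = 0.3                       # ...except one nearby object in the top-left tile
print(code_regions(depth_buffer))                  # only the tile covering the object would render
```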
-
Patent number: 12216822
Abstract: System and methods for gesture-based control are described. In some embodiments, a system may include a wearable device having a biopotential sensor and a wrist motion sensor. The biopotential sensor may be configured to output a first data stream indicating actions of a person's hand. The system may further include a second device configured to output a second data stream, which may also indicate the actions of the person's hand. The system may be configured to analyze the first and second data streams to train a machine learning interpreter to classify actions of a person's hand based on at least biopotential data.
Type: Grant
Filed: October 17, 2022
Date of Patent: February 4, 2025
Assignee: Pison Technology, Inc.
Inventors: Dexter W. Ang, David O. Cipoletta
-
Patent number: 12208510
Abstract: This disclosure describes systems, methods, and devices related to robotic drive control device. A robotic device may receive an indication associated with pressing an actuator on a handheld device, wherein the handheld device controls a movement of an end effector of the robotic device. The robotic device may record a home location associated with where the actuator was pressed in space. The robotic device may determine an orientation of the handheld device. The robotic device may detect a movement of the handheld device from the home location to a second location in space. The robotic device may cause the end effector of the robot to move in the same orientation as the handheld device from a stationary position that is associated with the home location while continuing to move the end effector even when the handheld device stops moving at the second location.
Type: Grant
Filed: January 15, 2021
Date of Patent: January 28, 2025
Assignee: SISU DEVICES LLC
Inventors: Jacob Robinson, Russell Aldridge, Joshua Foss, Marc Christenson
-
Patent number: 12205192
Abstract: In an example, an apparatus comprises logic, at least partially comprising hardware logic, to receive an input from one or more detectors proximate a display to present an output from a graphics pipeline, determine that a user is not interacting with the display, and in response to a determination that the user is not interacting with the display, to reduce a frame rendering rate of the graphics pipeline. Other embodiments are also disclosed and claimed.
Type: Grant
Filed: August 10, 2021
Date of Patent: January 21, 2025
Assignee: INTEL CORPORATION
Inventors: Balaji Vembu, Nikos Kaburlasos, Josh B. Mastronarde
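The idea above, dropping the frame rendering rate while the detectors report no interaction, can be sketched in a few lines. The detector callback, the two rates, and the sleep-based pacing are assumptions for illustration; real graphics pipelines would throttle in the driver or scheduler.

```python
# Sketch: pace a render loop at a lower rate while no user interaction is detected.
import time

ACTIVE_FPS, IDLE_FPS = 60, 15

def render_loop(user_present, render_frame, seconds: float = 1.0) -> None:
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        fps = ACTIVE_FPS if user_present() else IDLE_FPS   # detector decides the rate
        render_frame()
        time.sleep(1.0 / fps)                              # pace the pipeline accordingly

frames = []
render_loop(user_present=lambda: False, render_frame=lambda: frames.append(1), seconds=0.2)
print(len(frames))   # roughly 0.2 s * 15 fps, i.e. a handful of frames while idle
```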
-
Patent number: 12204705
Abstract: A plural-panel touchpad and palm rest base chassis assembly for an information handling system may comprise a base chassis having an upper portion including a keyboard opening, a touchpad opening for insertion of a touchpad assembly having a touchpad surface panel comprising polycarbonate impregnated thermoplastic fiberglass, and palm rest support surfaces supporting palm rest surface panels. The touchpad surface panel and the palm rest surface panels may have front edges and side edges defining a front and side boundary of the base chassis respectively and also defining the keyboard opening. A lightguide layer may be joined to the touchpad surface panel and to a touchpad printed circuit board assembly (PCBA) to illuminate touch buttons. The touchpad assembly is mechanically and operatively coupled to the base chassis upper portion via top-mounted fasteners disposed through a support bracket to palm rest support surfaces and concealed by the palm rest surface panels.
Type: Grant
Filed: April 6, 2023
Date of Patent: January 21, 2025
Assignee: DELL PRODUCTS LP
Inventors: Priyank J. Gajiwala, Timothy M. Radloff
-
Patent number: 12198381
Abstract: Provided are a hand pose estimation method, a device and a computer storage medium. The method may include: determining a classification logic map corresponding to each of a plurality of key-points, where the plurality of key-points may represent key nodes of a skeleton of a target hand, and a first key-point may be any one of the plurality of key-points; determining, based on a preset classification map and the classification logic map corresponding to the first key-point, co-ordinate information of the first key-point; and obtaining a pose estimation result of the target hand, in response to determining the co-ordinate information corresponding to each of the plurality of key-points.
Type: Grant
Filed: May 19, 2022
Date of Patent: January 14, 2025
Assignee: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD.
Inventors: Yang Zhou, Jie Liu
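To illustrate how a per-key-point classification logit map can be turned into a coordinate using a preset coordinate map, here is a soft-argmax style readout. The soft-argmax choice is an assumption; the patent does not state the exact mapping.

```python
# Sketch: convert one key-point's classification logit map into (x, y) coordinates
# by weighting a preset coordinate grid with the softmax of the logits.
import numpy as np

def keypoint_coordinates(logit_map: np.ndarray) -> tuple:
    """logit_map: HxW classification logits for a single hand key-point."""
    h, w = logit_map.shape
    probs = np.exp(logit_map - logit_map.max())
    probs /= probs.sum()
    ys, xs = np.mgrid[0:h, 0:w]               # the preset coordinate (classification) map
    return float((probs * xs).sum()), float((probs * ys).sum())

logits = np.full((8, 8), -5.0)
logits[3, 5] = 4.0                             # strong response near (x=5, y=3)
print(keypoint_coordinates(logits))            # approximately (5.0, 3.0)
```

Repeating this readout for every key-point yields the coordinate set that constitutes the pose estimation result.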
-
Patent number: 12192631
Abstract: A camera integrates a magnet at one end of a cylindrical housing to magnetically attract and attach the camera to a front side of a peripheral display, such as to support a video conference through an information handling system interfaced with the display. Camera orientation at a display panel front face is detected, such as with an accelerometer or with analysis of a visual image captured by the camera module, so that the camera orientation is presented with an orientation indicator at the camera front face, such as an LED that illuminates with a predetermined color to indicate an upright vertical orientation. In one alternative embodiment, a gimble interfaced with the camera module and cylindrical housing actuates to adjust the camera module to the upright vertical orientation.
Type: Grant
Filed: January 18, 2022
Date of Patent: January 7, 2025
Assignee: Dell Products L.P.
Inventors: Jace W. Files, Andrew P. Tosh, John Trevor Morrison
-
Patent number: 12179094
Abstract: This disclosure describes a system that allows a user to communicate via social networking from a game controller. These social networking notifications may be displayed on a screen of the game controller. In addition to the game controller, the disclosed system may comprise an app, on a device external from the game controller, that is also able to display the social networking notifications.
Type: Grant
Filed: September 16, 2021
Date of Patent: December 31, 2024
Assignee: Voyetra Turtle Beach, Inc.
Inventors: Andrew Brian Young, Stephen Thomas Bright, Daniel Stuart-Cross
-
Patent number: 12182343
Abstract: A virtual reality tracker includes a first part and a second part. The first part includes a plurality of first light-emitting diodes (LEDs) and an inertial measurement unit (IMU). The inertial measurement unit is used for measuring the acceleration and the triaxial angular velocity of the first part. The second part includes a plurality of second light-emitting diodes. Moreover, the first part and the second part are connected by a flexible component.
Type: Grant
Filed: August 30, 2022
Date of Patent: December 31, 2024
Assignee: HTC CORPORATION
Inventors: Chun-Kai Huang, Chih-Chien Chen, Yan-Ru Chen
-
Patent number: 12164697
Abstract: This disclosure provides an in-vehicle mid-air gesture-based interaction method, an electronic apparatus, and a system, and relates to the field of intelligent vehicle technologies. The method includes: obtaining a first mid-air gesture detected by a camera; and starting, when a preset response operation corresponding to the first mid-air gesture matches a first user who initiates the first mid-air gesture, the preset response operation corresponding to the first mid-air gesture in response to the first mid-air gesture. The method can be used in an in-vehicle mid-air gesture-based interaction scenario, reduce a mid-air gesture operation rate, and improve driving safety and interaction experience.
Type: Grant
Filed: December 28, 2022
Date of Patent: December 10, 2024
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Qiuyuan Tang, Shuaihua Peng, Hao Li
-
Patent number: 12153728
Abstract: An optical system for a virtual retina display and a gesture detection of a user of the virtual retina display. The optical system includes a projector unit, an image source, and an image processing unit. The projector unit includes a first, second, and a third light source, and a first controllable deflection unit for scanning deflection of first, second, and third light beams. The optical system further includes a second deflection unit designed to transmit the first and second scanned light beams and to deflect the third light beam into a gesture detection area of the user. The optical system further includes a deflection unit, onto which the image content is projectable and which is configured to direct the projected image content and the second light beam onto an eye of a user.
Type: Grant
Filed: November 10, 2023
Date of Patent: November 26, 2024
Assignee: ROBERT BOSCH GMBH
Inventors: Johannes Fischer, Johannes Meyer
-
Patent number: 12154011
Abstract: A wrist-mounted system for tracking hand poses includes one or more cameras mounted to a wearable band. In some examples, the one or more cameras are low profile and may be located less than 15 mm, and preferably less than 10 mm, of the wrist of the user. In some examples, a system includes a wearable band and one or more imaging sensors. The one or more imaging sensors are disposed to have a field of view that is anatomically distal when the wearable band is coupled to an arm of the user. The one or more imaging sensors each define an optical axis spaced from the wearable band of less than 15 mm, and preferably less than 10 mm. The image data may come from a single camera mounted on the back of the wrist which captures images of the surface contours of the back of the hand to infer hand poses and gestures.
Type: Grant
Filed: April 26, 2021
Date of Patent: November 26, 2024
Assignees: Cornell University, Wisconsin Alumni Research Foundation
Inventors: Cheng Zhang, Fang Hu, Yin Li
-
Patent number: 12147981
Abstract: Systems and methods are disclosed for device movement-based authentication. One method comprises receiving contextual data from one or more sensors of a user device and determining a device movement pattern based on the received contextual data. The determined device movement pattern is compared with a device movement-based signature associated with a user of the user device. If the determined device movement pattern matches the device-movement based signature within a predetermined threshold, the user is authenticated for an electronic transaction. If the determined device movement pattern does not match the device-movement based signature, a notification indicating authentication failure is sent to the user device.
Type: Grant
Filed: May 15, 2019
Date of Patent: November 19, 2024
Assignee: Worldpay, LLC
Inventor: Daren L. Pickering
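The matching step described above reduces to comparing an observed movement pattern against an enrolled signature within a threshold. The sketch below uses a normalized Euclidean distance as the comparison; the distance metric, threshold, and feature choice are assumptions for illustration only.

```python
# Sketch: authenticate only if the observed movement pattern is close enough to the
# enrolled movement-based signature.
import numpy as np

def matches_signature(pattern: np.ndarray, signature: np.ndarray, threshold: float = 0.5) -> bool:
    """pattern/signature: equal-length sequences of accelerometer magnitudes."""
    distance = np.linalg.norm(pattern - signature) / np.sqrt(len(signature))
    return distance <= threshold

enrolled = np.array([0.1, 0.9, 1.4, 0.8, 0.2])   # signature captured at enrollment
observed = np.array([0.2, 1.0, 1.3, 0.7, 0.2])   # pattern derived from current sensor data
if matches_signature(observed, enrolled):
    print("authenticated: proceed with the electronic transaction")
else:
    print("authentication failed: notify the user device")
```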
-
Patent number: 12141381
Abstract: The present disclosure is directed to selective gesture recognition for handheld device gestures. An example method includes receiving, by a handheld interactive object, movement information descriptive of a gesture performed with the handheld interactive object. The method includes selecting a local and/or remote machine-learned model for processing the movement information. The movement information can be processed to identify a gesture action corresponding to the movement information. The local and/or remote machine-learned model can be selected based on user input data and/or a complexity of the movement information. In response to selecting the local machine-learned model, the method includes processing the movement information according to the local machine-learned model and communicating a message to a remote device based on the result.
Type: Grant
Filed: February 22, 2021
Date of Patent: November 12, 2024
Assignee: GOOGLE LLC
Inventors: Dev Bhargava, Alejandro Kauffmann
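The routing decision described above, local versus remote model based on user input and movement complexity, can be sketched as follows. The complexity proxy (variance of frame-to-frame motion) and the threshold are assumptions, not Google's criteria.

```python
# Sketch: route gesture classification to a local or remote machine-learned model.
from typing import Optional
import numpy as np

def select_model(movement: np.ndarray, complexity_threshold: float = 1.0,
                 prefer_local: Optional[bool] = None) -> str:
    if prefer_local is not None:                             # explicit user preference wins
        return "local" if prefer_local else "remote"
    complexity = float(np.var(np.diff(movement, axis=0)))    # crude motion-complexity proxy
    return "local" if complexity < complexity_threshold else "remote"

imu_trace = np.cumsum(np.random.default_rng(0).normal(size=(50, 3)), axis=0)
target = select_model(imu_trace)
print(f"classify gesture with the {target} machine-learned model")
```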
-
Patent number: 12135369
Abstract: Apparatus and associated methods relate to an array of individually readable distance sensors disposed along a first axis on a platform and configurable to detect penetration of a first plane containing the first axis, and an array of individually controllable light emitting indicators disposed on the platform along at least a second axis and configurable to emit visual indicia to a user out of the first plane. The visual indicia may, for example, be associated with the detected penetration. A reconfigurable predetermined detection window may, for example, be generated by associating adjacent sensors detecting input during a teaching operation. The detection window may, for example, be further generated by determining at least one distance threshold profile as a function of input received from the adjacent sensors during the teaching operation. Various embodiments may advantageously enable efficient configuration of generic sensing and indication units.
Type: Grant
Filed: January 20, 2021
Date of Patent: November 5, 2024
Assignee: BANNER ENGINEERING CORP.
Inventor: Charles Dolezalek
-
Patent number: 12131544
Abstract: A method for capturing motion of an object, the method comprising: installing at least one marker on the object; bringing the object having the at least one marker installed thereon in an acquisition volume; arranging at least two event-based light sensors such that respective fields of view of the at least two event-based light sensors cover the acquisition volume, wherein each event-based light sensor has an array of pixels; receiving events asynchronously from the pixels of the at least two event-based light sensors depending on variations of incident light from the at least one marker sensed by the pixels; and processing the events to position the at least one marker within the acquisition volume and capture motion of the object.
Type: Grant
Filed: May 14, 2020
Date of Patent: October 29, 2024
Assignee: PROPHESEE
Inventors: Nicolas Bourdis, Davide Migliore
-
Patent number: 12125219
Abstract: A system for performing synergistic object tracking and pattern recognition for event representation includes a computing platform having processing hardware and a system memory storing a software code. The processing hardware is configured to execute the software code to receive event data corresponding to one or more properties of an object, to generate, using the event data, a location data estimating a location of each of multiple predetermined landmarks of the object, and to predict, using one or both of the event data and the location data, a pattern corresponding to the properties of the object. The processing hardware is further configured to execute the software code to update, using the predicted pattern, the location data, and to merge the updated location data and the predicted pattern to provide merged data.
Type: Grant
Filed: April 4, 2023
Date of Patent: October 22, 2024
Assignee: Disney Enterprises, Inc.
Inventors: Keith Comito, Kevin Prince