Patents Examined by Ryan R Yang
  • Patent number: 10672167
    Abstract: In some implementations, a computing device can generate a synthetic group selfie. For example, a synthetic group selfie can be an arrangement or composition of individual selfies obtained from a plurality of computing devices into a single group image (e.g., synthetic group selfie). The individual selfie images can be still images, stored video images, or live streaming images. Thus, the synthetic group selfie can be a composition of still images, stored video images, or live streaming video images. The computing device can automatically arrange the individual selfies into the synthetic group selfie. The synthetic group selfie can be stored as a multi-resource object that preserves the individual selfie images so that the user who created the synthetic group selfie or a recipient of the synthetic group selfie can modify the arrangement of the individual selfies within the synthetic group selfie.
    Type: Grant
    Filed: July 11, 2018
    Date of Patent: June 2, 2020
    Assignee: Apple Inc.
    Inventor: Jean-Francois M Albouze
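A minimal sketch of how the "multi-resource object" described in the abstract of 10672167 might be modeled, keeping every original selfie alongside its placement so the arrangement can be changed later. The class and field names (SelfieResource, GroupSelfie, arrangement) are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SelfieResource:
    """One participant's selfie: a still image, stored video, or live stream."""
    device_id: str
    media_type: str          # "still" | "video" | "stream"
    uri: str                 # where the original media lives

@dataclass
class GroupSelfie:
    """Multi-resource object: preserves every original selfie plus its placement,
    so the creator or a recipient can rearrange the composition later."""
    resources: List[SelfieResource] = field(default_factory=list)
    arrangement: List[Tuple[int, int]] = field(default_factory=list)  # (x, y) per resource

    def add(self, resource: SelfieResource) -> None:
        self.resources.append(resource)
        self.arrangement.append(self.auto_position(len(self.resources) - 1))

    def auto_position(self, index: int, cell: int = 256, columns: int = 3) -> Tuple[int, int]:
        """Automatic grid placement; a real composition step would be smarter."""
        return (index % columns) * cell, (index // columns) * cell

    def move(self, index: int, x: int, y: int) -> None:
        """Rearrange one selfie without touching the underlying media."""
        self.arrangement[index] = (x, y)

group = GroupSelfie()
group.add(SelfieResource("phone-a", "still", "a.jpg"))
group.add(SelfieResource("phone-b", "stream", "rtsp://b"))
group.move(1, 300, 0)
print(group.arrangement)   # [(0, 0), (300, 0)]
```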
  • Patent number: 10672192
    Abstract: A display device and a method thereof are provided. The display device includes a display, a communicator, and a processor configured to display virtual reality (VR) content on the display and, in response to receiving from a user terminal device through the communicator a trigger signal and motion information for changing a display viewpoint area of the VR content, control the display to change the display viewpoint area of the VR content based on a first motion direction included in the motion information until a predetermined signal is received. In response to the predetermined signal being received from the user terminal device, the processor controls the display to stop changing the display viewpoint area.
    Type: Grant
    Filed: August 21, 2018
    Date of Patent: June 2, 2020
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Hee-seok Jeong, Sang-young Lee, Kyu-hyun Cho
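A sketch of the viewpoint-panning behavior in 10672192: after the trigger, the viewpoint keeps shifting in the first motion direction each step until the predetermined stop signal arrives. Function and signal names are invented for illustration.

```python
def pan_viewpoint(viewpoint, motion_direction, signals):
    """Shift the display viewpoint area in the first motion direction on every
    step until the predetermined (stop) signal arrives.  `signals` is an
    iterable of frame-by-frame events; names here are illustrative only."""
    dx, dy = motion_direction
    for signal in signals:
        if signal == "STOP":            # predetermined signal from the terminal
            break
        viewpoint = (viewpoint[0] + dx, viewpoint[1] + dy)
    return viewpoint

# Trigger received with motion direction (5, 0); panning stops at the third event.
print(pan_viewpoint((0, 0), (5, 0), [None, None, "STOP", None]))  # -> (10, 0)
```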
  • Patent number: 10657723
    Abstract: An image processing apparatus embeds, in an original image, additional information relating to an object that is displayed in a superimposing manner on a captured image. The image processing apparatus includes a determining unit configured to determine whether a direction of the original image is a landscape or a portrait, and an embedment unit configured to embed the additional information in the original image. The additional information at least specifies a type of the object and a display direction of the object with respect to a display screen when the object is displayed in a superimposing manner on the captured image. The embedment unit embeds the additional information based on the determination by the determining unit, such that the display direction changes according to whether the direction of the original image is the landscape or the portrait.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: May 19, 2020
    Assignee: Canon Kabushiki Kaisha
    Inventor: Koji Ito
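A sketch of the orientation-dependent choice described in 10657723: the display direction embedded with the additional information depends on whether the original image is landscape or portrait. The field names and the 0/90-degree mapping are illustrative assumptions; a real implementation would embed the data in the image itself rather than return a dictionary.

```python
def embed_overlay_info(width, height, object_type):
    """Decide the overlay's display direction from the original image's
    orientation and return the additional information to embed."""
    orientation = "landscape" if width >= height else "portrait"
    display_direction = 0 if orientation == "landscape" else 90  # degrees on screen
    return {
        "object_type": object_type,
        "display_direction": display_direction,
        "orientation": orientation,
    }

print(embed_overlay_info(1920, 1080, "caption"))   # landscape -> 0 degrees
print(embed_overlay_info(1080, 1920, "caption"))   # portrait  -> 90 degrees
```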
  • Patent number: 10657687
    Abstract: A system includes reception of a selection of a point of a first data visualization associated with a first measure value, reception of an instruction from the user to create a visualization based on the first measure value, determination of a first context of the first measure value, the first context comprising one or more dimension values, generation of a first numeric point visualization of the first measure value based on the first context, and presentation of a first interface comprising the first data visualization and the first numeric point visualization, where the first numeric point visualization is presented in association with the selected first measure of the first data visualization.
    Type: Grant
    Filed: November 15, 2018
    Date of Patent: May 19, 2020
    Assignee: SAP SE
    Inventors: Sarah Menard, Viren Kumar
  • Patent number: 10652041
    Abstract: An augmented-reality (AR) device detects a physical device located within a first predefined distance of the augmented-reality device. The AR device detects, using an optical sensor, a physical object located within a second predefined distance of the physical device, the physical object being electronically unconnected to the physical device. The AR device determines that the physical object is associated with the physical device and identifies a command based on an identification of both the physical object and the physical device. The physical device is configured to operate the command and display, in the display, virtual content as an overlay to the physical object.
    Type: Grant
    Filed: January 2, 2019
    Date of Patent: May 12, 2020
    Assignee: DAQRI, LLC
    Inventor: Frank Chester Irving, Jr.
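A sketch of the pairing logic in 10652041: a command is resolved only once a physical device is within one range of the AR device and an electronically unconnected physical object is within another range of that device. Positions are one-dimensional purely for brevity, and the command table and range values are invented for illustration.

```python
def resolve_command(ar_pos, device, objects, command_table,
                    device_range=2.0, object_range=0.5):
    """Pick a command keyed on the (device, object) pair once both are in range."""
    if abs(ar_pos - device["pos"]) > device_range:
        return None                                   # no physical device nearby
    for obj in objects:
        if abs(device["pos"] - obj["pos"]) <= object_range:
            return command_table.get((device["id"], obj["id"]))
    return None

commands = {("thermostat", "paper_note"): "show_setpoint_overlay"}
device = {"id": "thermostat", "pos": 1.0}
objects = [{"id": "paper_note", "pos": 1.2}]
print(resolve_command(0.5, device, objects, commands))  # -> show_setpoint_overlay
```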
  • Patent number: 10650239
    Abstract: Devices, computer-readable media, and methods for providing an enhanced indication of an object that is located via a visual feed in accordance with a user context are disclosed. For instance, in one example, a processing system including at least one processor may detect a user context from a visual feed, locate an object via the visual feed in accordance with the user context, and provide an enhanced indication of the object via an augmented reality display.
    Type: Grant
    Filed: July 25, 2018
    Date of Patent: May 12, 2020
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: David Crawford Gibbon, Eric Zavesky, Bernard S. Renger, Behzad Shahraray, Lee Begeja
  • Patent number: 10643388
    Abstract: The inventive concept relates to a system for analyzing a degree of interest in a VR image. The system allows a user to move freely within the VR image, view objects, and check information on the objects so that the user feels as if actually making a visit, and it analyzes the degree of interest of users by generating a hit map.
    Type: Grant
    Filed: February 19, 2019
    Date of Patent: May 5, 2020
    Assignee: DATAKING. INC
    Inventor: Sun Kyou Park
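A sketch of the "hit map" idea in 10643388: user view samples within the VR image are accumulated into a coarse grid that reveals which regions drew the most interest. The grid resolution and sample format are illustrative choices.

```python
def build_hit_map(view_samples, grid_size=8):
    """Accumulate where users looked in the VR image into a grid of hit counts.
    Samples are (x, y) coordinates normalized to [0, 1)."""
    grid = [[0] * grid_size for _ in range(grid_size)]
    for x, y in view_samples:
        col = min(int(x * grid_size), grid_size - 1)
        row = min(int(y * grid_size), grid_size - 1)
        grid[row][col] += 1
    return grid

samples = [(0.1, 0.1), (0.12, 0.11), (0.8, 0.5)]
hit_map = build_hit_map(samples)
print(hit_map[0][0], hit_map[4][6])   # 2 hits near the top-left, 1 mid-right
```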
  • Patent number: 10643573
    Abstract: Technologies for end-to-end display integrity verification include a computing device with a display controller coupled to a display by a physical link. The computing device generates pixel data in a data buffer in memory, and the display controller outputs a pixel signal on the physical link based on the pixel data using a physical interface. The display receives the pixel signal and displays a corresponding image. A splicer is connected to the physical link and repeats the pixel signal to an I/O port of the computing device. The I/O port may be a USB Type-C port. The computing device compares pixel data received by the I/O port to the pixel data in the data buffer. The computing device may calculate checksums of the pixel data. If the pixel data does not match, the computing device may indicate a display integrity failure. Other embodiments are described and claimed.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: May 5, 2020
    Assignee: Intel Corporation
    Inventors: Prashant D. Chaudhari, Michael N. Derr
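A sketch of the comparison step in 10643573: a checksum of the pixel data written to the data buffer is compared against a checksum of the pixel data repeated back through the I/O port, and a mismatch indicates a display integrity failure. CRC-32 is used here for brevity; the abstract only requires that checksums of the pixel data be compared.

```python
import zlib

def verify_display_integrity(framebuffer: bytes, looped_back: bytes) -> bool:
    """Return True when the looped-back pixel data matches the buffer contents."""
    return zlib.crc32(framebuffer) == zlib.crc32(looped_back)

rendered = bytes([255, 0, 0] * 4)        # pixel data placed in the data buffer
observed = bytes([255, 0, 0] * 4)        # pixel data repeated back by the splicer
tampered = bytes([0, 255, 0] * 4)

print(verify_display_integrity(rendered, observed))   # True  -> display intact
print(verify_display_integrity(rendered, tampered))   # False -> integrity failure
```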
  • Patent number: 10643363
    Abstract: Systems and methods for adaptive obfuscation may include a user interface device, a hardware processor, and memory communicatively connected to the hardware processor. The user interface device may include a first hardware display with a first active display area, a first mirrored display area, and a first input area. The hardware processor may be configured to perform operations including receiving a plurality of objects that include first and second objects having respective first and second character sets with a predefined character length, selecting an obfuscation scheme defining an obfuscation condition, obfuscating the first and second character sets according to the selected obfuscation scheme, and displaying the first character set in the first active display area and the second character set in the first mirrored display area.
    Type: Grant
    Filed: November 26, 2018
    Date of Patent: May 5, 2020
    Assignee: Curfuffles LLC
    Inventor: Thomas Hartl
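A sketch of the obfuscation step in 10643363, applying a selected scheme to fixed-length character sets before showing them in the active and mirrored display areas. The scheme name and its masking rule are invented for illustration; the abstract only requires that a selected scheme's condition drive the obfuscation.

```python
def obfuscate(chars: str, scheme: str = "mask_all_but_last") -> str:
    """Apply a simple obfuscation scheme to a fixed-length character set."""
    if scheme == "mask_all_but_last":
        return "*" * (len(chars) - 1) + chars[-1]
    return chars

first_set, second_set = "1234", "5678"
active_area   = obfuscate(first_set)    # shown in the active display area
mirrored_area = obfuscate(second_set)   # shown in the mirrored display area
print(active_area, mirrored_area)       # ***4 ***8
```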
  • Patent number: 10636191
    Abstract: A method and an apparatus for displaying a window border shadow, the method including the following steps: creating a slave window stitched to a border of a main window, wherein the main window is a non-Layered Window and the slave window is a Layered Window (S100); adjusting a size of the slave window according to a size of the main window, and adjusting screen coordinates of the slave window according to screen coordinates of the main window (S200); and calculating a pixel point transparency for the slave window, and displaying the slave window according to the calculated result (S300). The above method sets the main window, which presents the software interface, as a non-Layered Window and the slave windows stitched to the borders surrounding the main window as Layered Windows, thereby utilizing the slave windows to create the border shadow display effect while decreasing the memory footprint of the entire system.
    Type: Grant
    Filed: December 29, 2016
    Date of Patent: April 28, 2020
    Assignee: GUANGZHOU SHIRUI ELECTRONICS CO. LTD.
    Inventor: Yao Cheng
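A sketch of the per-pixel transparency calculation (step S300 of 10636191) for a slave shadow window: opaque at the main window's border and fading to transparent at the shadow's outer edge. The linear falloff, shadow width, and alpha range are illustrative assumptions; the abstract only says a pixel point transparency is calculated.

```python
def shadow_alpha(distance_px: int, shadow_width_px: int = 12, max_alpha: int = 160) -> int:
    """Alpha for one shadow pixel, given its distance from the main window border."""
    if distance_px >= shadow_width_px:
        return 0
    return int(max_alpha * (1 - distance_px / shadow_width_px))

# Alpha values sampled across one row of the shadow strip.
print([shadow_alpha(d) for d in range(0, 14, 2)])   # 160, 133, 106, ..., 0
```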
  • Patent number: 10621789
    Abstract: Embodiments provide for tracking location and resolving drift in Augmented Reality (AR) devices. The AR devices include computing devices having screens on a first face and cameras on a second, opposite face that project an image onto optical arrangements for viewing by wearers of the AR devices. The AR devices map locations for real objects in the environment to a virtual environment; anchor virtual objects at anchor locations within the virtual environment; capture station keeping images of the environment from a first Field of View via the camera; determine a second, different Field of View in the environment for the wearer of the AR device based on the relative locations of real objects present in the station keeping images; and output images at positions on the screen that depict the virtual objects in the physical environment at the anchor locations.
    Type: Grant
    Filed: October 24, 2018
    Date of Patent: April 14, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Randall S. Davis, Elliott H. Baumbach, Nathan D. Nocon, Todd Michael Graham
  • Patent number: 10602556
    Abstract: A wireless communication device may locate a proximate object in an environment, such as an electronic device or a resource. During this communication technique, the wireless communication device may receive a transmission that includes an identifier associated with the object. The wireless communication device may determine a range and/or a direction of the object from the wireless communication device. For example, the wireless communication device may determine the range and/or the direction, at least in part, using wireless ranging. Next, the wireless communication device may present output information that indicates the range and/or the direction. In particular, the wireless communication device may display a map of a proximate area with an indicator representative of the object shown on the map. Alternatively, the wireless communication device may display an image of the proximate area with the indicator representative of the object on the image.
    Type: Grant
    Filed: February 3, 2017
    Date of Patent: March 24, 2020
    Assignee: Apple Inc.
    Inventors: James H. Foster, Duncan R. Kerr
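A sketch of the ranging step in 10602556: a range is estimated from a round-trip time and combined with a direction to place an indicator for the object relative to the wireless communication device. The RTT-based range formula and the angle-of-arrival input are common ranging ingredients assumed here for illustration; the abstract only says a range and/or direction are determined, e.g. via wireless ranging.

```python
import math

def locate_object(rtt_seconds: float, angle_of_arrival_deg: float,
                  device_xy=(0.0, 0.0)):
    """Estimate the object's range and an indicator position relative to the device."""
    c = 299_792_458.0                       # speed of light, m/s
    rng = rtt_seconds * c / 2               # one-way distance in metres
    theta = math.radians(angle_of_arrival_deg)
    indicator = (device_xy[0] + rng * math.cos(theta),
                 device_xy[1] + rng * math.sin(theta))
    return rng, indicator

rng, indicator = locate_object(rtt_seconds=33.4e-9, angle_of_arrival_deg=45.0)
print(f"range ~{rng:.2f} m, indicator at {indicator}")   # ~5 m away, up and to the right
```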
  • Patent number: 10596478
    Abstract: In some embodiments, a head-mounted display apparatus with a visual display and one or more sensors may make navigation of virtual environments more natural. The invention enables a participant to pivot, tip and aim the apparatus to orient and move through virtual space hands-free.
    Type: Grant
    Filed: February 26, 2018
    Date of Patent: March 24, 2020
    Assignee: MONKEYmedia, Inc.
    Inventors: Eric Justin Gould Bear, Rachel M. Strickland, Jim McKee
  • Patent number: 10586393
    Abstract: A technique for generating one or more maneuver points includes determining a first location at which a maneuver is initiated by a vehicle and determining a difference between the first location and a stored location that corresponds to the maneuver. The technique further includes, in response to determining that the difference exceeds a threshold value, transmitting the first location to an update application. The update application modifies the stored location based on the first location to generate an updated location.
    Type: Grant
    Filed: December 18, 2017
    Date of Patent: March 10, 2020
    Assignee: HARMAN INTERNATIONAL INDUSTRIES, INCORPORATED
    Inventor: Axel Nix
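A sketch of the threshold test in 10586393: the location where the vehicle actually initiated the maneuver is compared with the stored maneuver point, and only when the difference exceeds a threshold is the observed location reported to the update application. The haversine formula, coordinates, and the 25 m threshold are illustrative assumptions.

```python
import math

def maybe_report_maneuver(observed, stored, threshold_m=25.0):
    """Return the observed location if it differs from the stored maneuver point
    by more than the threshold, otherwise None (nothing to report)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*observed, *stored))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    difference_m = 2 * 6_371_000 * math.asin(math.sqrt(a))   # haversine distance
    return observed if difference_m > threshold_m else None

stored_point = (37.7749, -122.4194)
observed_point = (37.7752, -122.4194)        # roughly 33 m north of the stored point
print(maybe_report_maneuver(observed_point, stored_point))  # exceeds 25 m -> reported
```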
  • Patent number: 10579207
    Abstract: A method of manipulating a three-dimensional image file including a virtual object includes obtaining image information in a processing device of a non-instrumented physical object manipulated by a user, such image information including movement information; and causing virtual movement of the virtual object based on the movement information. A method of shaping a virtual object includes obtaining image information including movement information; and determining a shape of the virtual object based on the movement information. A method of modifying a virtual object includes obtaining image information including movement information; and altering a virtual surface appearance of at least a part of the virtual object based on the movement information. Systems and computer-readable media are also described.
    Type: Grant
    Filed: May 14, 2015
    Date of Patent: March 3, 2020
    Assignee: Purdue Research Foundation
    Inventors: Cecil Piya, Vinayak Raman Krishnamurthy, Karthik Ramani
  • Patent number: 10573079
    Abstract: Hybrid rendering is described for a wearable display that is attached to a tethered computer. In one example, a process includes determining a position and orientation of a wearable computing device, determining a rate of motion of the wearable computing device, comparing the rate of motion to a threshold, if the rate of motion is above the threshold, then rendering a view of a scene at the wearable computing device using the position and orientation information, and displaying the rendered view of the scene.
    Type: Grant
    Filed: March 27, 2018
    Date of Patent: February 25, 2020
    Assignee: INTEL CORPORATION
    Inventors: Deepak Shashidhar Vembar, Paul Diefenbaugh, Vallabhajosyula S. Somayazulu, Atsuo Kuwahara, Kofi Whitney, Richmond Hicks
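A sketch of the hybrid-rendering decision in 10573079: when the wearable's rate of motion exceeds a threshold, the view is rendered on the wearable itself; otherwise the tethered computer can render it. The 30 deg/s threshold is an arbitrary illustrative value.

```python
def choose_renderer(rate_of_motion_deg_s: float, threshold_deg_s: float = 30.0) -> str:
    """Pick where to render: locally on the wearable during fast motion, else tethered."""
    return "wearable" if rate_of_motion_deg_s > threshold_deg_s else "tethered"

for rate in (5.0, 45.0):
    print(rate, "->", choose_renderer(rate))   # 5.0 -> tethered, 45.0 -> wearable
```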
  • Patent number: 10573084
    Abstract: Embodiments relate to using sensor data and location data from a device to generate augmented reality images. A mobile device pose (a geographic position, direction, and three-dimensional orientation of the device) can be determined within a location. A type of destination in the location can be identified and multiple destinations can be identified, with the mobile device receiving queue information about the identified destinations from a server. A first image can be captured. Based on the queue information, one of the identified destinations can be selected. The geographic position of each identified destination can be identified, and these positions can be combined with the mobile device pose to generate a second image. Finally, an augmented reality image can be generated by combining the first image and the second image, the augmented reality image identifying the selected one destination.
    Type: Grant
    Filed: October 8, 2018
    Date of Patent: February 25, 2020
    Assignee: Live Nation Entertainment, Inc.
    Inventor: James Paul Callaghan
  • Patent number: 10559087
    Abstract: An apparatus comprises a first acquisition unit which acquires a captured image of a real space from an image capturing unit provided for a display apparatus; a second acquisition unit which acquires data, from a measuring unit provided for the display apparatus, indicating a distance from the display apparatus to an object in the real space; a generating unit which generates, based on the data acquired by the second acquisition unit, an image by superimposing CG on the captured image; and a setting unit which sets a measurement frequency of the measuring unit to a first frequency if a specific object is included in the captured image, and sets the measurement frequency of the measuring unit to a second frequency lower than the first frequency if the specific object is not included in the captured image.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: February 11, 2020
    Assignee: CANON KABUSHIKI KAISHA
    Inventor: Seishi Miura
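A sketch of the frequency-setting rule in 10559087: the distance-measuring unit runs at the higher first frequency while the specific object appears in the captured image and drops to the lower second frequency otherwise. The actual Hz values are illustrative assumptions.

```python
def measurement_frequency(specific_object_detected: bool,
                          first_hz: float = 30.0, second_hz: float = 5.0) -> float:
    """Return the measuring unit's update rate based on whether the specific
    object is present in the captured image."""
    return first_hz if specific_object_detected else second_hz

print(measurement_frequency(True))    # 30.0 Hz while the object is in view
print(measurement_frequency(False))   # 5.0 Hz once it leaves the frame
```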
  • Patent number: 10559107
    Abstract: A system and method for presentation of computer vision (e.g., augmented reality, virtual reality) using user data and a user code is disclosed. A client device can detect an image feature (e.g., scannable code) in one or more images. The image feature is determined to be linked to a user account. User data from the user account can then be used to generate one or more augmented reality display elements that can be anchored to the image feature in the one or more images.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: February 11, 2020
    Assignee: Snap Inc.
    Inventors: Ebony James Charlton, Omer Cansizoglu, Kirk Ouimet, Nathan Kenneth Boyd
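A sketch of the flow in 10559107: a scannable code detected in the image is resolved to a user account, and the account's data is turned into augmented reality display elements anchored to that image feature. The account store, element fields, and anchor format are all invented for illustration.

```python
def build_ar_elements(detected_code: str, accounts: dict):
    """Resolve a scanned code to a user account and build display elements
    anchored to the code's position in the frame."""
    account = accounts.get(detected_code)
    if account is None:
        return []                                  # code not linked to any account
    return [{
        "text": f"{account['display_name']} · {account['score']} pts",
        "anchor": "image_feature",                 # follow the code as it moves
    }]

accounts = {"snapcode:42": {"display_name": "Ada", "score": 1200}}
print(build_ar_elements("snapcode:42", accounts))
```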
  • Patent number: 10553006
    Abstract: A method for guiding a user to manually apply cosmetics by overlaying graphical indicators on a real-time image of the user, the method including: receiving a real-time image of the area of skin; generating a registration layer containing a mapping between features of the real-time image and a graphical model of the area of skin; presenting, for display to a user, a control layer overlaid on the real-time image according to the registration layer, the control layer including at least one indicator of a location on the area of skin where a reflectance modifying agent (RMA) should be applied; determining, based on the real-time image, that the RMA has been applied to the location on the area of skin; and, in response to determining that the RMA has been applied to the location on the area of skin, modifying the at least one indicator included in the control layer.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: February 4, 2020
    Assignee: TCMS Transparent Beauty, LLC
    Inventors: David C. Iglehart, Albert Durr Edgar