Patents Examined by Jennifer Mehmood
-
Patent number: 11232532
Abstract: A split hierarchy graphics processor system including a master node executing a virtual reality (VR) application responsive to input from a client device received over a network to generate primitives for objects in a VR environment. The graphics processor system includes render nodes performing rendering based on the primitives for views into the VR environment taken from a location in the VR environment, the views corresponding to a grid map of the VR environment. Each of the render nodes renders, encodes, and streams a corresponding sequence of frames of a corresponding view to the client device. The processor system includes an asset library storing input geometries for the objects used for building the VR environment, wherein the objects in the asset library are accessible by the master node and the render nodes.
Type: Grant
Filed: May 30, 2018
Date of Patent: January 25, 2022
Assignee: Sony Interactive Entertainment LLC
Inventor: Torgeir Hagland
-
Patent number: 11217036
Abstract: An avatar personalization engine can generate personalized avatars for a user by creating a 3D user model based on one or more images of the user. The avatar personalization engine can compute a delta between the 3D user model and an average person model, which is a model created based on the average measurements from multiple people. The avatar personalization engine can then apply the delta to a generic avatar model by changing measurements of particular features of the generic avatar model by amounts specified for corresponding features identified in the delta. This personalizes the generic avatar model to resemble the user. Additional features matching characteristics of the user can be added to further personalize the avatar model, such as a hairstyle, eyebrow geometry, facial hair, glasses, etc.
Type: Grant
Filed: October 7, 2019
Date of Patent: January 4, 2022
Assignee: Facebook Technologies, LLC
Inventors: Elif Albuz, Chad Vernon, Shu Liang, Peihong Guo
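A minimal sketch of the delta-based personalization idea in this abstract, assuming simple per-feature measurement dictionaries; the feature names and values are illustrative, not from the patent:

```python
# Delta-based avatar personalization: measure how a user's 3D model
# deviates from an average-person model, then shift a generic avatar's
# features by that same deviation.

def compute_delta(user_model, average_model):
    """Per-feature difference between the user and the average person."""
    return {k: user_model[k] - average_model[k] for k in user_model}

def personalize(generic_avatar, delta):
    """Shift each matching feature of the generic avatar by the delta."""
    return {k: v + delta.get(k, 0.0) for k, v in generic_avatar.items()}

average_person = {"nose_length": 5.0, "jaw_width": 12.0, "eye_spacing": 6.0}
user = {"nose_length": 5.6, "jaw_width": 11.2, "eye_spacing": 6.3}
generic_avatar = {"nose_length": 4.0, "jaw_width": 10.0, "eye_spacing": 5.0}

delta = compute_delta(user, average_person)
avatar = personalize(generic_avatar, delta)
```

Because only the delta is applied, the result keeps the generic avatar's overall proportions while exaggerating exactly the ways the user differs from average.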
-
Patent number: 11217010
Abstract: The present invention discloses a sketch-based, shape-maintaining tree transformation animation method, comprising: creating a three-dimensional tree model; representing the three-dimensional tree model as a skeleton and a cluster of leaves; selecting an angle of interest and detecting the canopy silhouette of the 3D tree model from that angle; drawing the target canopy silhouette of the transformation animation as a sketch; gradually transforming the canopy silhouette of the 3D tree model into the target canopy silhouette, and transmitting the silhouette transformation to the branches to obtain a smooth transformation process for the tree; and recording the transformation of the canopy silhouette and the branches frame by frame to obtain the shape-maintaining tree transformation animation.
Type: Grant
Filed: June 14, 2018
Date of Patent: January 4, 2022
Assignee: ZHEJIANG UNIVERSITY
Inventors: Yutong Wang, Luyuan Wang, Xiaogang Jin
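The gradual silhouette transformation could be sketched as a frame-by-frame interpolation between the current canopy outline and the sketched target outline; representing a silhouette as radial samples is an assumption for illustration, not the patent's method:

```python
# Frame-by-frame silhouette morph: linearly blend each silhouette sample
# from the source canopy outline toward the target outline.

def morph_silhouette(src, dst, t):
    """Blend between source and target silhouette at parameter t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(src, dst)]

source = [1.0, 2.0, 1.5]   # e.g. canopy radius sampled at three angles
target = [2.0, 1.0, 2.5]
num_frames = 5
frames = [morph_silhouette(source, target, k / (num_frames - 1))
          for k in range(num_frames)]
```

The first frame equals the source silhouette and the last equals the target, giving the smooth recorded transformation the abstract describes.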
-
Patent number: 11209285
Abstract: A vehicular display device configured to display a guide route to a destination of a host vehicle includes a navigation device configured to calculate the guide route, a display controller configured to draw an arrow having a shaft with an arrowhead at one end of the shaft, as a guide figure for guidance along the guide route calculated by the navigation device, such that the arrow is shaped as a three-dimensional object in which one surface of the arrow is a continuous flat surface from the shaft to the arrowhead and the other surface of the arrow includes a first portion with a first width having a first height, and a second portion with a second width smaller than the first width and having a second height smaller than the first height, and a display configured to display an image drawn by the display controller in a display area.
Type: Grant
Filed: September 30, 2015
Date of Patent: December 28, 2021
Assignee: Nissan Motor Co., Ltd.
Inventors: Kenji Maruyama, Hiroshi Watanabe, Masayuki Shishido, Norio Kosaka
-
Patent number: 11210763
Abstract: An image processing apparatus of the technique of this disclosure includes image processing units, storage units, a control unit, dividing units which divide image data, and combining units which combine image data. The control unit specifies processing for which image data is divided according to a status of use of the storage units. The control unit causes one of the image processing units to process one of the parts of image data divided based on a dividing position, combines the processed part of image data with the other part of image data, causes the other of the image processing units to process the other part of image data, which is not processed by the first image processing unit, and combines that processed part of image data with the first part of image data.
Type: Grant
Filed: August 11, 2020
Date of Patent: December 28, 2021
Assignee: CANON KABUSHIKI KAISHA
Inventor: Kazunori Matsuyama
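The divide/process/combine flow could be sketched as follows, assuming row-wise image data and illustrative processing functions standing in for the two image processing units:

```python
# Divide image data at a dividing position, process each part with a
# different processing unit, then recombine the results.

def process_divided(image_rows, split_pos, unit_a, unit_b):
    """Split rows at split_pos, process each part separately, recombine."""
    top, bottom = image_rows[:split_pos], image_rows[split_pos:]
    return unit_a(top) + unit_b(bottom)

# Illustrative "processing units": one inverts pixel values, one passes through.
invert = lambda rows: [[255 - p for p in row] for row in rows]
identity = lambda rows: rows

image = [[0, 128], [255, 64], [32, 32]]
result = process_divided(image, 1, invert, identity)
```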
-
Patent number: 11210732
Abstract: Methods and apparatus for visualizing a surface covering on at least a portion of a surface in an image of a scene. The method comprises identifying, using at least one computer processor, a surface plane from the image of the scene, determining, for each pixel of a plurality of pixels corresponding to the surface plane, whether the pixel corresponds to at least a portion of the surface in the scene, and generating an updated image of the scene by overlaying on the surface plane, a visualization of a plurality of surface covering tiles on pixels along the surface plane determined to correspond to at least a portion of the surface in the scene.
Type: Grant
Filed: November 4, 2020
Date of Patent: December 28, 2021
Assignee: Wayfair LLC
Inventors: Shrenik Sadalgi, Christian Vázquez
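The per-pixel overlay step could be sketched like this, assuming a precomputed boolean surface mask and a small repeating tile pattern; the 2x2 images are illustrative only:

```python
# Overlay a repeating tile pattern on exactly those pixels the mask
# marks as belonging to the surface; leave all other pixels unchanged.

def overlay_tiles(image, surface_mask, tile_pattern, tile_size):
    """Replace surface pixels with a repeating tile pattern."""
    out = []
    for y, row in enumerate(image):
        new_row = []
        for x, pixel in enumerate(row):
            if surface_mask[y][x]:
                new_row.append(tile_pattern[y % tile_size][x % tile_size])
            else:
                new_row.append(pixel)
        out.append(new_row)
    return out

image = [[10, 10], [10, 10]]
mask = [[True, False], [False, True]]   # which pixels lie on the surface
tile = [[1, 2], [3, 4]]
result = overlay_tiles(image, mask, tile, 2)
```

The mask test is what keeps the tiles from spilling onto furniture or other objects occluding the surface plane.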
-
Patent number: 11212684
Abstract: Aspects of the embodiments are directed to systems, methods, and computer program products for displaying floorplans and electromagnetic (EM) emissions to facilitate EM emission design within the floorplan. The display of the floorplan and EM emissions can be performed on a wearable device, such as an augmented reality implement or virtual reality implement. The implement can also facilitate configuration, manipulation, and revision of floorplan and EM emitter positioning, to allow an operator to more precisely configure EM emitter placement and selection.
Type: Grant
Filed: July 27, 2017
Date of Patent: December 28, 2021
Inventor: Ryan Robert Hadley
-
Patent number: 11200637
Abstract: A method for profiling energy usage in graphics user interfaces (UI) in handheld mobile devices is disclosed, which includes quantifying the central processing unit (CPU) energy drain of each UI update, quantifying the graphics processing unit (GPU) energy drain of each UI update, quantifying the number of pixels changed due to each UI update, and identifying a UI update that consumes energy but results in no pixel changes to the displayed frame as a graphics energy bug.
Type: Grant
Filed: April 20, 2020
Date of Patent: December 14, 2021
Assignee: Purdue Research Foundation
Inventors: Yu Charlie Hu, Ning Ding
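The detection heuristic the abstract describes reduces to a simple filter; the record fields and energy values below are illustrative, not the patent's instrumentation:

```python
# "Graphics energy bug" heuristic: flag any UI update that drained
# CPU/GPU energy but changed zero pixels in the displayed frame.

def find_energy_bugs(ui_updates):
    """Return updates that consumed energy but changed no pixels."""
    return [u for u in ui_updates
            if (u["cpu_mj"] + u["gpu_mj"]) > 0 and u["pixels_changed"] == 0]

updates = [
    {"id": 1, "cpu_mj": 4.2, "gpu_mj": 1.1, "pixels_changed": 2048},
    {"id": 2, "cpu_mj": 3.7, "gpu_mj": 0.9, "pixels_changed": 0},  # wasted work
    {"id": 3, "cpu_mj": 0.0, "gpu_mj": 0.0, "pixels_changed": 0},  # idle, fine
]
bugs = find_energy_bugs(updates)
```

Update 2 is the bug: energy was spent rendering a frame that looks identical to the previous one.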
-
Patent number: 11200703
Abstract: An information processing device and method for enabling partial control of the resolution of a data group that can be turned into a tree structure. Data of an Octree pattern is encoded, so that a bit stream containing depth control information indicating that a leaf node is to be formed at a different level from the lowest level based on information specifying the depth of the Octree pattern is generated. Also, a bit stream is decoded, so that an Octree pattern including a leaf node at a different level from the lowest level is constructed, on the basis of depth control information indicating that the leaf node is to be formed at a different level from the lowest level based on information specifying the depth of the Octree pattern. The present disclosure can be applied to an information processing device, an image processing device, an electronic apparatus, an information processing method, a program, or the like, for example.
Type: Grant
Filed: September 21, 2018
Date of Patent: December 14, 2021
Assignee: SONY CORPORATION
Inventors: Tsuyoshi Kato, Satoru Kuma, Ohji Nakagami, Koji Yano
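The core idea, leaves allowed above the lowest level, could be sketched as a depth-controlled octree build; the nested-dict representation and the "early leaf paths" encoding of the depth control information are assumptions for illustration:

```python
# Depth-controlled octree: normally every leaf sits at max_depth, but
# depth-control information lets a chosen branch terminate early in a
# leaf, giving that region a coarser resolution.

def build_octree(depth, max_depth, early_leaf_paths, path=()):
    """Return a nested dict; leaves may appear above max_depth."""
    if depth == max_depth or path in early_leaf_paths:
        return {"leaf": True, "depth": depth}
    return {"leaf": False,
            "children": [build_octree(depth + 1, max_depth,
                                      early_leaf_paths, path + (i,))
                         for i in range(8)]}

# One branch (child 0 of the root) is cut off at depth 1;
# every other branch reaches the lowest level, depth 2.
tree = build_octree(0, 2, early_leaf_paths={(0,)})
```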
-
Patent number: 11200718
Abstract: A system and method for rendering augmented-reality (AR) enhanced images and information to potential home buyers by providing the AR-enhanced images over a local network. The system includes a computer loaded with an AR application configured to communicate with potential home buyers' personal devices, which have an AR application for communicating with that computer. The potential home buyers can then take a virtual tour of the house, with images that show the different rooms of the house, rendered by the AR application to display rooms populated with the home buyers' furniture.
Type: Grant
Filed: May 29, 2020
Date of Patent: December 14, 2021
Assignee: United Services Automobile Association (USAA)
Inventors: Gregory Brian Meyer, Mark Anthony Lopez, Ravi Durairaj, Nolan Serrao, Victor Kwak, Ryan Thomas Russell, Christopher Russell, Ruthie D. Lyle
-
Patent number: 11195335
Abstract: A controller outputs an image of the virtual space in correspondence with a posture of a first user wearing the mounted display, outputs an image of the virtual space to a touch panel display used by a second user, performs a first action in the virtual space in correspondence with a touch operation performed by the second user on the touch panel display, outputs an image of the virtual space reflecting the first action to the mounted display, and performs a second action in the virtual space reflecting the first action in correspondence with an operation performed by the first user on an operation unit.
Type: Grant
Filed: October 17, 2017
Date of Patent: December 7, 2021
Assignee: GREE, INC.
Inventor: Masashi Watanabe
-
Patent number: 11195250
Abstract: The present invention relates to a prediction system for determining a set of subregions to be used for rendering a virtual world of a computer graphics application, said subregions belonging to streamable objects to be used for rendering said virtual world, said streamable objects each comprising a plurality of subregions. The prediction system comprises a plurality of predictor units arranged for receiving from a computer graphics application information on the virtual world and each arranged for obtaining a predicted set of subregions for rendering a virtual world using streamable objects, each predicted set being obtained by applying a different prediction scheme, and a streaming manager arranged for receiving the predicted sets of subregions, for deriving from the predicted sets a working set of subregions to be used for rendering, and for outputting, based on the working set of subregions, steering instructions concerning the set of subregions to be actually used.
Type: Grant
Filed: July 31, 2020
Date of Patent: December 7, 2021
Assignee: Graphine NV
Inventors: Bart Pieters, Charles-Frederik Hollemeersch, Aljosha Demeulemeester
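The streaming manager's derivation step could be sketched as merging the predictors' outputs; the union merge policy and the subregion identifiers are illustrative assumptions (the patent leaves the exact derivation open):

```python
# Several predictors each propose a set of subregions; the streaming
# manager derives one working set from them (here, a simple union).

def derive_working_set(predicted_sets):
    """Union of all predictors' subregion sets."""
    working = set()
    for s in predicted_sets:
        working |= s
    return working

predictions = [
    {("terrain", 3), ("terrain", 4)},     # e.g. a motion-based predictor
    {("terrain", 4), ("building", 1)},    # e.g. a visibility-based predictor
]
working_set = derive_working_set(predictions)
```

A union is the conservative choice: any subregion that any prediction scheme expects to need is streamed in, trading bandwidth for fewer missing-detail artifacts.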
-
Patent number: 11195334
Abstract: A live video signal of a scene associated with a field of view of a user may be generated. The scene may include a casino gaming floor that includes multiple wagering stations. A location in the scene and a virtual element may be determined, based on the live video signal of the scene. The virtual element may be displayed to the user in the location in the scene so that the virtual element is in the field of view of the user using an augmented reality (AR) device. A user input that corresponds to the virtual element may be received and an action of the virtual element responsive to receiving the user input may be generated.
Type: Grant
Filed: August 3, 2018
Date of Patent: December 7, 2021
Assignee: IGT
Inventors: Dwayne Nelson, Patrick Danielson
-
Patent number: 11189077
Abstract: Techniques are described for deriving information, including graphical representations, based on perspectives of a 3D scene by utilizing sensor model representations of location points in the 3D scene. A 2D view point representation of a location point is derived based on the sensor model representation. From this information, a data representation can be determined. The 2D view point representation can be used to determine a second 2D view point representation. Other techniques include using sensor model representations of location points associated with dynamic objects in a 3D scene. These sensor model representations are generated using sensor systems having perspectives external to the location points and are used to determine a 3D model associated with a dynamic object. Data or graphical representations may be determined based on the 3D model. A system for obtaining information based on perspectives of a 3D scene includes a data manager and a renderer.
Type: Grant
Filed: July 21, 2014
Date of Patent: November 30, 2021
Assignee: Disney Enterprises, Inc.
Inventor: Gregory House
-
Patent number: 11189010
Abstract: Embodiments of the present application provide a method and an apparatus for image processing. The method includes the following steps: determining an object for adjustment in an image, the object being embedded in the image; determining, based on information corresponding to the object, a target adjustment strategy from a mirror strategy and a position translation strategy; and adjusting the object in the image based on the target adjustment strategy. In the technical solutions according to the embodiments of the present application, the information of the object embedded in the image is used so that a target adjustment strategy suitable for the object is determined from the mirror strategy and the position translation strategy, and the object is adjusted in the image based on the target adjustment strategy, thereby improving the degree of automation and the effectiveness of layout modification.
Type: Grant
Filed: December 28, 2020
Date of Patent: November 30, 2021
Assignee: ALIBABA GROUP HOLDING LIMITED
Inventor: Miaomiao Cui
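The strategy selection and the two adjustments could be sketched as below; the selection rule, object fields, and offsets are illustrative assumptions, since the abstract does not specify how the object's information maps to a strategy:

```python
# Choose between a mirror strategy and a position-translation strategy
# based on the embedded object's information, then apply it.

def choose_strategy(obj):
    """Illustrative rule: mirror symmetric objects, translate the rest."""
    return "mirror" if obj.get("symmetric") else "translate"

def adjust(obj, image_width):
    if choose_strategy(obj) == "mirror":
        # Mirror: reflect the object's x extent across the image's centre line.
        return {**obj, "x": image_width - obj["x"] - obj["w"]}
    # Translate: nudge the object toward the left margin.
    return {**obj, "x": max(0, obj["x"] - 10)}

logo = {"x": 70, "w": 20, "symmetric": True}
text = {"x": 25, "w": 40, "symmetric": False}
adjusted_logo = adjust(logo, 100)
adjusted_text = adjust(text, 100)
```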
-
Patent number: 11182965
Abstract: In one example, a method for generating and displaying markers in XR environments to enhance social engagement among users includes presenting, by a processing system, an extended reality environment to a first user, wherein the extended reality environment combines elements of a real world environment surrounding the first user with elements of a virtual world; inferring, by the processing system, a marker to be associated with a second user in the extended reality environment, wherein the marker indicates information about the second user; and modifying, by the processing system, the extended reality environment to incorporate the marker in a manner that is apparent to the first user.
Type: Grant
Filed: May 1, 2019
Date of Patent: November 23, 2021
Assignee: AT&T INTELLECTUAL PROPERTY I, L.P.
Inventors: Eric Zavesky, Nigel Bradley, Nikhil Marathe, James Pratt
-
Patent number: 11182953
Abstract: Mobile device integration with a virtual reality environment may include: determining a location of a mobile device relative to a head-mounted display displaying a virtual environment; receiving a video stream mirroring a display of the mobile device; rendering, in the virtual environment, based on the location of the mobile device relative to the head-mounted display, a representation of the mobile device comprising the video stream; and outputting, to the head-mounted display, a rendering of the virtual environment comprising the representation of the mobile device.
Type: Grant
Filed: January 8, 2019
Date of Patent: November 23, 2021
Assignee: Lenovo Enterprise Solutions (Singapore) Pte. Ltd.
Inventors: Jeffrey R. Hamilton, Ross L. Mickens, Markesha F. Parker
-
Patent number: 11182964
Abstract: The present disclosure relates to techniques for providing tangibility visualization of virtual objects within a computer-generated reality (CGR) environment, such as a CGR environment based on virtual reality and/or a CGR environment based on mixed reality. A visual feedback indicating tangibility is provided for a virtual object within a CGR environment that does not correspond to a real, tangible object in the real environment. A visual feedback indicating tangibility is not provided for a virtual representation of a real object within a CGR environment that corresponds to a real, tangible object in the real environment.
Type: Grant
Filed: April 4, 2019
Date of Patent: November 23, 2021
Assignee: Apple Inc.
Inventors: Alexis H. Palangie, Avi Bar-Zeev
-
Patent number: 11182931
Abstract: Methods are provided for generating a prescription map for the application of crop inputs. In one method, the user draws a boundary on a map within a user interface and the system identifies relevant soil data and generates a soil map overlay and legend for changing the application prescription for various soils and soil conditions. In another method, the user instead drives a field boundary which is recorded on a planter monitor using a global positioning receiver, and the system generates a soil map and legend for changing the application prescription.
Type: Grant
Filed: July 13, 2020
Date of Patent: November 23, 2021
Assignee: THE CLIMATE CORPORATION
Inventors: Derek A. Sauder, Timothy A. Sauder, Steven D. Monday
-
Patent number: 11176725
Abstract: Systems and methods for image retargeting are provided. Image data may be acquired that includes motion capture data indicative of motion of a plurality of markers disposed on a surface of a first subject. Each of the markers may be associated with a respective location on the first subject. A plurality of blendshapes may be calculated for the motion capture data based on a configuration of the markers. An error function may be identified for the plurality of blendshapes, and it may be determined that the plurality of blendshapes can be used to retarget a second subject based on the error function. The plurality of blendshapes may then be applied to a second subject to generate a new animation.
Type: Grant
Filed: July 9, 2020
Date of Patent: November 16, 2021
Assignees: Sony Interactive Entertainment America LLC, Soul Machines Limited
Inventors: Mark Andrew Sagar, Tim Szu-Hsien Wu, Frank Filipe Bonniwell, Homoud B. Alkouh, Colin Joseph Hodges
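The blendshape application and retargeting steps could be sketched as follows; the two-coordinate "shapes", the fitted weights, and the squared-error check are illustrative assumptions standing in for the patent's error function and capture pipeline:

```python
# Blendshape retargeting: reconstruct a captured frame on the first
# subject as neutral + sum_i w_i * shape_i, verify the reconstruction
# error, then reuse the same weights with the second subject's shapes.

def apply_blendshapes(neutral, shapes, weights):
    """neutral + weighted sum of blendshape offsets, per coordinate."""
    return [n + sum(w * s[j] for w, s in zip(weights, shapes))
            for j, n in enumerate(neutral)]

def reconstruction_error(target, reconstructed):
    """Sum of squared per-coordinate differences."""
    return sum((t - r) ** 2 for t, r in zip(target, reconstructed))

# First subject: this captured frame happens to equal neutral + 0.5*shape0.
neutral_a = [0.0, 0.0]
shapes_a = [[1.0, 0.0], [0.0, 1.0]]
weights = [0.5, 0.0]                 # weights fitted to the captured frame
frame_a = apply_blendshapes(neutral_a, shapes_a, weights)

# Retarget: apply the same weights to the second subject's blendshapes.
neutral_b = [10.0, 10.0]
shapes_b = [[2.0, 0.0], [0.0, 2.0]]
frame_b = apply_blendshapes(neutral_b, shapes_b, weights)
```

Because only the weights transfer, the second subject performs the same expression in its own geometry, which is the essence of retargeting.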