Simulation Patents (Class 345/952)
-
Patent number: 11074167
Abstract: Disclosed herein are techniques for visualizing and configuring controller function sequences.
Type: Grant
Filed: March 24, 2020
Date of Patent: July 27, 2021
Assignee: Aurora Labs Ltd.
Inventors: Zohar Fox, Carmit Sahar
-
Patent number: 10459710
Abstract: Systems and methods for demonstrating a replacement information management software for a computing system. The methods may include determining if existing information management software is installed in the computing system. The methods may include identifying computing devices of the computing system using information from the existing information management software. The methods may include simulating the replacement information management software with characteristics of the identified computing devices to enable a user to experience the replacement information management software prior to committing to installing the replacement information management software in the computing system. The methods may include automatically installing the replacement information management software. Other implementations are disclosed.
Type: Grant
Filed: September 1, 2017
Date of Patent: October 29, 2019
Assignee: Commvault Systems, Inc.
Inventor: Sanjay Harakhchand Kripalani
-
Patent number: 10424220
Abstract: Disclosed is a tutorial model including at least one frame provided with an assistance template for assisting in the recording of the frame, the recording of the frame being a rush that can be incorporated into a framework for a film or series of images or image; the invention also relates to a film or series of images or image which includes a framework and rushes incorporated into the framework, the rushes being produced from such a model; a method for designing a personalized film or series of images or image; a tutored coaching method for helping a user to record frames that can be incorporated into a framework; and a tutored coaching method for helping a user to learn professional gestures.
Type: Grant
Filed: June 16, 2015
Date of Patent: September 24, 2019
Inventor: Antoine Huet
-
Patent number: 10388071
Abstract: A method, system, computer readable media and cloud systems are provided for adjusting image data presented in a head mounted display (HMD). One method includes executing a virtual reality (VR) session for an HMD user. The VR session is configured to present image data to a display of the HMD. The image data is for a VR environment that includes a VR user controlled by the HMD user. The method further includes adjusting the image data presented on the display of the HMD with a cadence profile when the VR user is moved in the VR environment by the HMD user. The adjusting causes a movement of a camera view for the image data that is for the VR environment as presented on the display of the HMD. In some examples, the cadence profile substantially replicates a rhythmic movement of a person while moving in a real world environment.
Type: Grant
Filed: March 25, 2016
Date of Patent: August 20, 2019
Assignee: Sony Interactive Entertainment Inc.
Inventor: Javier Fernandez Rico
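A minimal sketch of the general idea of modulating an HMD camera view with a rhythmic "cadence" profile while the virtual user is moving. This is not the patented method; the function names, frequencies, and amplitudes are hypothetical placeholders.

```python
import math

def cadence_offset(t, steps_per_second=1.8, bob_amplitude_m=0.03, sway_amplitude_m=0.015):
    """Return a (sway, bob) camera offset that loosely mimics the rhythmic head
    motion of a walking person at time t (seconds). The numbers are illustrative;
    a real cadence profile would be derived from measured human movement."""
    phase = math.pi * steps_per_second * t
    bob = bob_amplitude_m * abs(math.sin(phase))   # one vertical bounce per step
    sway = sway_amplitude_m * math.sin(phase)      # one lateral cycle per stride (two steps)
    return sway, bob

def adjust_camera(camera_pos, t, user_is_moving):
    """Apply the cadence offset to the camera view only while the VR user moves."""
    if not user_is_moving:
        return camera_pos
    x, y, z = camera_pos
    sway, bob = cadence_offset(t)
    return (x + sway, y + bob, z)
```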
-
Patent number: 9943250
Abstract: A method and system for provoking gait disorders, such as freezing of gait; usable, for example, for diagnosing and/or treatment thereof. In an exemplary embodiment of the invention, displays of situations calculated to cause freezing of gait are presented to a subject, optionally using virtual reality displays. Optionally or alternatively, incipient freezing of gait is identified using changes in gait parameters, and optionally used to guide attempts at causing freezing of gait. Optionally or alternatively, a portable device is provided which detects incipient freezing of gait and generates a corrective signal to the subject.
Type: Grant
Filed: October 9, 2012
Date of Patent: April 17, 2018
Assignee: The Medical Research, Infrastructure and Health Services Fund of the Tel Aviv Medical Center
Inventors: Meir Plotnik-Peleg, Jeffrey M. Hausdorff, Nir Giladi, Anat Mirelman
-
Patent number: 9753844
Abstract: Systems and methods for demonstrating a replacement information management software for a computing system. The methods may include determining if existing information management software is installed in the computing system. The methods may include identifying computing devices of the computing system using information from the existing information management software. The methods may include simulating the replacement information management software with characteristics of the identified computing devices to enable a user to experience the replacement information management software prior to committing to installing the replacement information management software in the computing system. The methods may include automatically installing the replacement information management software. Other implementations are disclosed.
Type: Grant
Filed: March 25, 2015
Date of Patent: September 5, 2017
Assignee: Micron Technology, Inc.
Inventor: Sanjay Harakhchand Kripalani
-
Patent number: 8674837
Abstract: The present specification discloses systems and methods for patient monitoring using a multitude of display regions, at least two of which have the capability to simultaneously display real time patient waveforms and vital statistics as well as provide display for local and remote software applications. In one example, a primary display shows real time patient waveforms and vital statistics while a customizable secondary display shows trends, cumulative data, laboratory and radiology reports, protocols, and similar clinical information. Additionally, the secondary display can launch local and remote applications such as entertainment software, Internet and email programs, patient education software, and video conferencing applications. The dual display allows caregivers to simultaneously view real time patient vitals and aggregated data or therapy protocols, thereby increasing hospital personnel efficiency and improving treatment, while not compromising the display of critical alarms or other data.
Type: Grant
Filed: March 21, 2011
Date of Patent: March 18, 2014
Assignee: Spacelabs Healthcare LLC
Inventors: Jeffrey Jay Gilham, Patrick Jensen, Michael Brendel, Katherine Stankus
-
Patent number: 8260593
Abstract: According to one embodiment of the invention, a computerized method for simulating human movement includes storing a plurality of sets of data, in which each set of data is indicative of a measured movement of a first human, receiving a start point and an end point for a desired movement of a second human, and comparing the desired movement to the stored sets of data. The method further includes selecting, based on the comparison, a stored set of data that is representative of the desired movement and simulating the desired movement based on the start point, the end point, and the relative change in position of a first joint associated with the selected set of data from an empirical start point to an empirical end point.
Type: Grant
Filed: September 18, 2002
Date of Patent: September 4, 2012
Assignee: Siemens Product Lifecycle Management Software Inc.
Inventor: Ulrich Raschke
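A rough illustration of the selection idea the abstract describes: pick the recorded movement whose net joint displacement best matches the desired start-to-end displacement, then re-anchor it at the requested start point. The data layout and function names are assumptions, not the patented algorithm.

```python
import numpy as np

def select_and_simulate(motion_library, start, end):
    """motion_library: list of arrays of shape (frames, 3), each a recorded
    trajectory of one reference joint (hypothetical layout).

    Returns the best-matching trajectory shifted so it begins at `start`."""
    desired = np.asarray(end, dtype=float) - np.asarray(start, dtype=float)

    def mismatch(traj):
        # Empirical start -> empirical end displacement of the stored movement.
        empirical = traj[-1] - traj[0]
        return np.linalg.norm(empirical - desired)

    best = min(motion_library, key=mismatch)
    return best - best[0] + np.asarray(start, dtype=float)
```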
-
Patent number: 8139067
Abstract: Motion capture animation, shape completion and markerless motion capture methods are provided. A pose deformation space model encoding variability in pose is learnt from a three-dimensional (3D) dataset. A body shape deformation space model encoding variability in pose and shape is learnt from another 3D dataset. The learnt pose model is combined with the learnt body shape model. For motion capture animation, given a parameter set, the combined model generates a 3D shape surface of a body in a pose and shape. For shape completion, given a partial surface of a body defined as 3D points, the combined model generates a 3D surface model in the combined spaces that fits the 3D points. For markerless motion capture, given 3D information of a body, the combined model traces the movement of the body using the combined spaces that fit the 3D information, or reconstructs the body's shape or deformations to fit the 3D information.
Type: Grant
Filed: July 25, 2007
Date of Patent: March 20, 2012
Assignee: The Board of Trustees of the Leland Stanford Junior University
Inventors: Dragomir D. Anguelov, Praveen Srinivasan, Daphne Koller, Sebastian Thrun
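A deliberately simplified stand-in for the idea of combining a learnt pose-deformation space with a learnt body-shape space to generate a surface from a parameter set. A purely linear combination is assumed here for clarity; the actual learnt model in the abstract is not specified at this level of detail.

```python
import numpy as np

def generate_surface(mean_mesh, pose_basis, pose_params, shape_basis, shape_params):
    """Illustrative linear combination: vertices = mean + pose deformation + shape deformation.

    mean_mesh:   (V, 3) mean vertex positions
    pose_basis:  (P, V, 3) learnt pose-deformation directions (assumed layout)
    shape_basis: (S, V, 3) learnt body-shape directions (assumed layout)
    pose_params, shape_params: 1-D coefficient vectors of lengths P and S."""
    pose_def = np.tensordot(pose_params, pose_basis, axes=1)    # (V, 3)
    shape_def = np.tensordot(shape_params, shape_basis, axes=1)  # (V, 3)
    return mean_mesh + pose_def + shape_def
```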
-
Patent number: 8036863
Abstract: A method for customizing a bearing bore in a housing so that the bearing assembly will transmit load in a desired manner over a predetermined range of operating temperatures.
Type: Grant
Filed: January 30, 2009
Date of Patent: October 11, 2011
Assignee: American Axle & Manufacturing, Inc.
Inventors: David P Schankin, Suhui W Wang, Chih-Hung Chung, Zhaohui Sun
-
Patent number: 7916143
Abstract: Provided are a system and a method that automatically produce natural locomotion animation, without a discontinuous portion, for various moving distances and timings by using motion capture data. The system includes a motion capture data storage, a simulation calculator, and an animation calculator. The method includes defining a speed calculated from the moving motion capture data as a maximum moving speed of the simulation in order to calculate the entire moving distance, the stopped time when starting and arriving, and the stopped time before starting and after arriving for the respective characters; extracting a portion of the arriving motion capture data appropriate for the entire moving distance in order to produce the locomotion animation when the entire moving distance is less than the moving distance of the arriving motion capture data; and satisfying an entire time corresponding to the entire motion of the animation.
Type: Grant
Filed: July 20, 2007
Date of Patent: March 29, 2011
Assignee: Electronics and Telecommunications Research Institute
Inventors: Sung June Chang, Se Hoon Park, In Ho Lee
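A loose sketch of the timing decision the abstract hints at: when the total move is shorter than the "arriving" motion-capture clip, only a matching portion of that clip is used; otherwise the extra distance is covered at the clip's maximum moving speed. All names and the proportional-extraction rule are hypothetical.

```python
def plan_locomotion(total_distance_m, moving_clip_speed_mps,
                    arriving_clip_distance_m, arriving_clip_duration_s):
    """Return a rough playback plan for one character (illustrative only)."""
    if total_distance_m < arriving_clip_distance_m:
        # Move is shorter than the arriving clip: play only a matching fraction
        # of it (a crude stand-in for "extracting a portion" of the clip).
        fraction = total_distance_m / arriving_clip_distance_m
        return {"clip": "arriving", "play_seconds": fraction * arriving_clip_duration_s}
    # Otherwise cover the extra distance at the maximum moving speed first.
    extra = total_distance_m - arriving_clip_distance_m
    return {
        "clip": "moving + arriving",
        "moving_seconds": extra / moving_clip_speed_mps,
        "arriving_seconds": arriving_clip_duration_s,
    }
```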
-
Patent number: 7756722
Abstract: Patients with chronic illnesses resist using conventional automated healthcare management systems to supply necessary clinical data because such systems feel impersonal, preferring to actually visit a clinic where the patient interacts with various healthcare practitioners. In this invention, the patient interacts with a clinical management system via a series of initial GUI screens that replicate the experience of actually visiting the clinic. Additional screens allow the patient to submit clinical information, to communicate with that patient's healthcare practitioner and other healthcare practitioners, to access management information that aids the patient in managing that patient's chronic illness, and to access educational information regarding that chronic illness. The clinical management system may be used to manage a plurality of different chronic illnesses while providing a consistent look and feel to the screens.
Type: Grant
Filed: October 2, 2001
Date of Patent: July 13, 2010
Assignee: Georgetown University
Inventors: Betty A. Levine, Stephen C. Clement, Seong Ki Mun, Adil Alaoui, Tang Ming-Jye Hu
-
Patent number: 7512613
Abstract: The invention relates to enabling a user to log data of a block diagram without using a functional logging block within the block diagram. There is a first timing identifier for a first data set based on a timing characteristic of the first data set. There is also a first task identifier established by an execution engine that is associated with a first data set. The logging of the data associated with the first data set is based on the first timing identifier and the first task identifier.
Type: Grant
Filed: April 16, 2003
Date of Patent: March 31, 2009
Assignee: The MathWorks, Inc.
Inventor: Howard Taitel
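A toy illustration of routing logged samples by a (timing identifier, task identifier) pair, which is the organizing idea named in the abstract. This is not The MathWorks' logging API; the class and its methods are invented for illustration.

```python
from collections import defaultdict

class SignalLogger:
    """Store samples keyed by (timing_id, task_id), in the spirit of the abstract."""

    def __init__(self):
        # (timing_id, task_id) -> list of (time, value) samples
        self._logs = defaultdict(list)

    def log(self, timing_id, task_id, t, value):
        """Record one sample under its timing and task identifiers."""
        self._logs[(timing_id, task_id)].append((t, value))

    def series(self, timing_id, task_id):
        """Return every sample logged for this timing/task combination."""
        return self._logs[(timing_id, task_id)]
```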
-
Patent number: 7508393
Abstract: A system comprising a plurality of three dimensional artificially animated portraits for performing preprogrammed animations of voice and facial expressions in the form of a scripted dialogue orchestrated by a central source. The system is operable to prepare animations of recorded voice and selected depictions of facial expressions to be transferred to the animated portraits and performed by the animated portraits. The system is operable to combine prepared animations in a scripted dialogue to be performed so as to mimic an interactive conversation.
Type: Grant
Filed: June 6, 2006
Date of Patent: March 24, 2009
Inventors: Patricia L. Gordon, Robert E Glaser
-
Patent number: 7446772
Abstract: A spectator experience corresponding to an occurrence of one or more games or events is generated based on each associated occurrence. The occurrence of a game or event varies in response to contributions and/or interactions of one or more participants of the game or event. The spectator experience enables users thereof to observe an augmented version of the game or event, such as by implementing enhanced viewpoint controls and/or other spectator related effects. In a particular aspect, the spectator experience can provide an indication of the spectators' presence, which is made available to the spectators and/or to the participants of the game.
Type: Grant
Filed: December 19, 2005
Date of Patent: November 4, 2008
Assignee: Microsoft Corporation
Inventors: Curtis G. Wong, Steven Drucker, Michael F. Cohen, Steven D. De Mar, Asta L. Glatzer, Li-Wei He
-
Patent number: 7403202
Abstract: Embodiments of the present invention provide methods and apparatus wherein physics models are integrated with motion capture animation to allow for variability in animations and dynamic response, such as animating events different from those for which motion capture data was obtained, including changes in character purpose and collisions. The physical model may include sets of internal forces and/or external forces. To facilitate the integration of mo-cap animation data with physics models, mo-cap animation data is played back using forces on physical models rather than directly setting joint angles or positions in a graphical skeleton, which allows the animation to be dynamically altered in real-time in a physically realistic manner by external forces and impulses.
Type: Grant
Filed: August 1, 2005
Date of Patent: July 22, 2008
Assignee: Electronic Arts, Inc.
Inventor: Shawn P. Nash
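One common way to drive a joint toward motion-capture targets with forces rather than by setting angles directly is a proportional-derivative (PD) controller; the sketch below uses that technique as an illustration of the general idea, not as the patented implementation. Gains, inertia, and time step are arbitrary assumptions.

```python
def pd_joint_torque(target_angle, angle, angular_velocity, kp=200.0, kd=10.0):
    """PD torque pulling a simulated joint toward the mo-cap target angle.

    Because the pose is reached through forces on a physical model, external
    impulses (e.g. a collision) can still deflect the character realistically."""
    return kp * (target_angle - angle) - kd * angular_velocity

def step_joint(angle, angular_velocity, target_angle, inertia=1.0, dt=1.0 / 240.0):
    """Advance one joint by one physics step under the PD torque (semi-implicit Euler)."""
    torque = pd_joint_torque(target_angle, angle, angular_velocity)
    angular_velocity += (torque / inertia) * dt
    angle += angular_velocity * dt
    return angle, angular_velocity
```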
-
Patent number: 7382374
Abstract: In a visualization system a three-dimensional scene (43) is projected onto a camera's view projection plane (42) from a camera's defined viewpoint (41) and mapped onto a two-dimensional display. For positioning a pointer (45) in the three-dimensional scene (43), the view of the three-dimensional scene is animated automatically to provide to a user the view of the three-dimensional scene with a kinetic depth effect. The view of the three-dimensional scene is animated by applying a spatial transformation to the three-dimensional scene (43) or the camera. The transformation is applied to the three-dimensional scene (43) or the camera such that the projected view of the pointer (45) remains essentially static. The pointer (45) is positioned based on signals received from the user, while the view is animated.
Type: Grant
Filed: May 2, 2005
Date of Patent: June 3, 2008
Assignee: Bitplane AG
Inventors: Peter Majer, Christoph Laimer
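A small geometric sketch of one way to achieve the effect described: orbit the camera about an axis through the pointer so that, with the camera kept aimed at the pointer, the pointer's projection stays essentially static while the rest of the scene sweeps past. The geometry and axis choice are illustrative assumptions.

```python
import numpy as np

def orbit_camera_about_pointer(camera_pos, pointer_pos, angle_rad):
    """Rotate the camera position about a vertical axis through the pointer.

    In a full implementation the camera would also keep looking at the pointer,
    so the pointer's projected position remains (essentially) fixed, producing
    the kinetic-depth impression."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot_y = np.array([[c, 0.0, s],
                      [0.0, 1.0, 0.0],
                      [-s, 0.0, c]])
    offset = np.asarray(camera_pos, dtype=float) - np.asarray(pointer_pos, dtype=float)
    return np.asarray(pointer_pos, dtype=float) + rot_y @ offset
```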
-
Patent number: 7358972
Abstract: A system and method for capturing motion comprises a motion capture volume adapted to contain at least one actor having body markers defining plural body points and facial markers defining plural facial points. A plurality of body motion cameras and a plurality of facial motion cameras are arranged around a periphery of the motion capture volume. The facial motion cameras each have a respective field of view narrower than a corresponding field of view of the body motion cameras. The facial motion cameras are arranged such that all laterally exposed surfaces of the actor while in motion within the motion capture volume are within the field of view of at least one of the plurality of facial motion cameras at substantially all times. A motion capture processor is coupled to the plurality of facial motion cameras and the plurality of body motion cameras to produce a digital model reflecting combined body and facial motion of the actor.
Type: Grant
Filed: November 6, 2006
Date of Patent: April 15, 2008
Assignees: Sony Corporation, Sony Pictures Entertainment Inc.
Inventors: Demian Gordon, Jerome Chen, Albert Robert Hastings, Jody Echegaray
-
Patent number: 7116341
Abstract: An information presentation apparatus creates a three-dimensional animation of a specific object in a three-dimensional virtual space on the basis of the human characteristic of paying more attention to a moving object. A user's attention can be drawn to a specific object, such as a destination building, in the virtual space displayed on the screen. Irrespective of whether the specific object is selected by the user or designated at the system side as the object to which the user's attention is to be drawn, the user can easily detect the attention-drawing object.
Type: Grant
Filed: April 22, 2003
Date of Patent: October 3, 2006
Assignee: Sony Corporation
Inventor: Yasunori Ohto
-
Patent number: 7068277
Abstract: A system and method for animating facial motion comprises an animation processor adapted to generate three-dimensional graphical images and having a user interface and a facial performance processing system operative with the animation processor to generate a three-dimensional digital model of an actor's face and overlay a virtual muscle structure onto the digital model. The virtual muscle structure includes plural muscle vectors that each respectively define a plurality of vertices along a surface of the digital model in a direction corresponding to actual facial muscles. The facial performance processing system is responsive to an input reflecting selective actuation of at least one of the plural muscle vectors to thereby reposition corresponding ones of the plurality of vertices and re-generate the digital model in a manner that simulates facial motion.
Type: Grant
Filed: May 23, 2003
Date of Patent: June 27, 2006
Assignees: Sony Corporation, Sony Pictures Entertainment Inc.
Inventor: Alberto Menache
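A minimal sketch of displacing the vertices attached to one virtual muscle vector along the muscle's direction in proportion to an activation value. The linear displacement, optional falloff weights, and all parameter names are assumptions made for illustration, not the patented facial system.

```python
import numpy as np

def actuate_muscle(vertices, vertex_ids, muscle_direction, activation, falloff=None):
    """Reposition the vertices influenced by one muscle vector.

    vertices:         (V, 3) array of the face model's vertex positions
    vertex_ids:       indices of the vertices this muscle vector defines
    muscle_direction: (3,) unit vector for the muscle's pull direction
    activation:       scalar in [0, 1]
    falloff:          optional per-vertex weights (same length as vertex_ids)"""
    out = np.array(vertices, dtype=float, copy=True)
    weights = np.ones(len(vertex_ids)) if falloff is None else np.asarray(falloff, float)
    out[vertex_ids] += activation * weights[:, None] * np.asarray(muscle_direction, float)
    return out
```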
-
Patent number: 7012607
Abstract: A system and/or method that generates user interface output sequences controlled by a user interface output system. The user interface output system can provide event definitions to an application program that specify high-level actions to be performed by the sequence, and can issue low-level commands to direct the actions of the user interface output sequence. The user interface output system provides a user interface output controller, which acts as an interface between an application program and the low-level commands which specify tasks for the user interface output sequence to perform. The user interface output controller is generated from a specification, using a planning methodology.
Type: Grant
Filed: November 10, 1999
Date of Patent: March 14, 2006
Assignee: Microsoft Corporation
Inventors: David J. Kurlander, Daniel T. Ling
-
Patent number: 6958752
Abstract: Systems and methods for modifying a virtual object stored within a computer. The systems and methods allow virtual object modifications that are otherwise computationally inconvenient. The virtual object is represented as a volumetric representation. A portion of the volumetric model is converted into an alternative representation. The alternative representation can be a representation having a different number of dimensions from the volumetric representation. A stimulus is applied to the alternative representation, for example by a user employing a force-feedback haptic interface. The response of the alternative representation to the stimulus is calculated. The change in shape of the virtual object is determined from the response of the alternative representation. The representations of the virtual object can be displayed at any time for the user. The user can be provided a force-feedback response. Multiple stimuli can be applied in succession.
Type: Grant
Filed: December 14, 2001
Date of Patent: October 25, 2005
Assignee: SensAble Technologies, Inc.
Inventors: Ralph E. Jennings, Jr., Thomas Harold Massie, Bradley A. Payne, Walter C. Shannon, III
-
Patent number: 6791549
Abstract: Systems and methods are disclosed for providing interactive displays of complex virtual environments. Systems and methods consistent with embodiments of the invention may be implemented to generate virtual reality (VR) file(s) from a 3D model of the complex environment. The VR file(s) may include octree and collision detection information that is used to simulate and render frames of the complex environment. During simulation, moving objects may be evaluated to detect for collisions with other objects. Further, during rendering, objects or elements may be dynamically tessellated during run-time operations to actively control their appearance when displayed to a user. Memory management operations for facilitating the display of complex virtual environments are also disclosed, consistent with embodiments of the invention.
Type: Grant
Filed: December 21, 2001
Date of Patent: September 14, 2004
Assignee: VRcontext s.a.
Inventors: Alain Yves Nestor Hubrecht, Tom Nuydens
-
Patent number: 6343936
Abstract: A system and method for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system which allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ. An electronic biopsy can also be performed on an identified growth or mass in the visualized object. Improved fly-path generation and volume rendering techniques provide enhanced navigation through, and examination of, a region of interest.
Type: Grant
Filed: January 28, 2000
Date of Patent: February 5, 2002
Assignee: The Research Foundation of State University of New York
Inventors: Arie E. Kaufman, Zhengrong Liang, Mark R. Wax, Ming Wan, Dongqing Chen
-
Publication number: 20020008703
Abstract: An animation system provides synchronization services to synchronize actions of two or more interactive user interface characters that are displayed simultaneously. The animation services allow applications to make animation requests to control the actions of characters on the display. These actions include playing one of the character's animation sequences and generating speech output with lip-synched animation of the character's mouth. Accessible via script commands or an Application Programming Interface, the synchronization services allow an application to control interaction between two or more characters on the display. Applications can synchronize actions by invoking straightforward commands such as Wait, Interrupt, or Stop. In response to these commands, the animation server synchronizes scheduled actions by halting playback of a character until a specified action of another character completes or halting a specified action of one character after scheduled actions for another character are completed.
Type: Application
Filed: February 26, 1998
Publication date: January 24, 2002
Inventors: John Wickens Lamb Merrill, Tandy W. Trower, Mark Jeffery Weinberg
-
Patent number: 6234799
Abstract: A real-time IMU simulator for an Inertial Measurement Unit (IMU) of an installed avionics system of a vehicle includes a 6DOF (Degree of Freedom) flight simulator, an IMU computer, and a 6DOF interface. The 6DOF interface is connected between the 6DOF flight simulator and the IMU computer for transferring flight trajectory data from the 6DOF flight simulator to the IMU computer. The IMU computer is adapted for receiving flight state data, calculating IMU simulation data, and outputting the IMU simulation data to an IMU signal generation board. The IMU signal generation board is adapted for receiving the IMU simulation data, generating IMU signals, and injecting the IMU signals into the installed avionics system.
Type: Grant
Filed: March 30, 1999
Date of Patent: May 22, 2001
Assignee: American GNC Corporation
Inventor: Ching-Fang Lin
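A schematic sketch of one small piece of the pipeline described: deriving idealised accelerometer samples from a 6DOF trajectory's position history by finite differencing and removing gravity. A real IMU simulator would also use attitude to produce gyro outputs and body-frame accelerations; the gravity vector and fixed time step here are simplifying assumptions.

```python
import numpy as np

def trajectory_to_imu(positions, dt, gravity=(0.0, 0.0, -9.81)):
    """Convert position samples into idealised accelerometer readings.

    positions: (N, 3) array of positions at a fixed time step dt (seconds).
    Returns (N - 2, 3) specific-force samples = acceleration minus gravity."""
    velocities = np.diff(positions, axis=0) / dt
    accelerations = np.diff(velocities, axis=0) / dt
    return accelerations - np.asarray(gravity)
```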
-
Patent number: 6141035
Abstract: When the corresponding position of each pixel in a stereo video pair is searched for, a fixed pixel is set in one video and a corresponding pixel is set in the other video. A window is then set around the fixed pixel in the one video, and a window of the same size is set around the corresponding pixel in the other video; several selectable window sizes are prepared. Pixel values are compared with each other, with the MDL standard value adopted as the evaluation standard: the smaller the MDL standard value for a window size, the more suitable that window size. All pixel values in the window are used as parameters to calculate the MDL standard value, which contains a term equivalent to the error obtained by comparing each pixel in the windows and thus also indicates the resemblance of the video picture in the windows.
Type: Grant
Filed: May 20, 1998
Date of Patent: October 31, 2000
Assignee: Oki Data Corporation
Inventors: Kazuyo Kurabayashi, Nobuhito Matsushiro
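A toy illustration of choosing among candidate window sizes with a two-part cost (matching error plus a size-dependent model term), which is only loosely inspired by the MDL criterion named in the abstract rather than its actual formula. Window sizes, the cost terms, and the assumption that the pixel lies away from the image border are all placeholders.

```python
import numpy as np

def mdl_like_cost(window_a, window_b):
    """Data term (sum of squared pixel differences) plus a crude model term."""
    error = float(np.sum((window_a.astype(float) - window_b.astype(float)) ** 2))
    model_bits = window_a.size * np.log2(window_a.size + 1)
    return error + model_bits

def best_window_size(left, right, x, y, dx, sizes=(3, 5, 7, 9)):
    """Compare candidate window sizes around the fixed pixel (x, y) in the left
    image and the candidate corresponding pixel (x + dx, y) in the right image,
    returning the size with the smallest cost."""
    def crop(img, cx, cy, half):
        return img[cy - half:cy + half + 1, cx - half:cx + half + 1]
    costs = {}
    for s in sizes:
        h = s // 2
        costs[s] = mdl_like_cost(crop(left, x, y, h), crop(right, x + dx, y, h))
    return min(costs, key=costs.get)
```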
-
Patent number: 6050822
Abstract: The present invention, an Electromagnetic Locomotion Platform (ELP) system, is an electronically powered device which allows human-like activities to be performed in all three axes: x, y, & z, in a confined and localized area, for the purpose of attaining total immersion (both visual and physical) into Virtual Environments. The ELP system allows humans the ability to walk, jog, run, crawl, etc. in a stationary position; thus the user can "walk forever to nowhere", similar to the basic function of a treadmill. This feature can be enhanced by having the user wear a helmet mounted display (HMD) which is coupled to a computer generated Virtual Reality (VR) system. The VR system provides external environments to the HMD and, when synchronized with the movements of the individual on the ELP, it displays visual changes in the surrounding environment according to the movements generated by the user.
Type: Grant
Filed: October 1, 1997
Date of Patent: April 18, 2000
Assignee: The United States of America as represented by the Secretary of the Army
Inventor: Jim A. Faughn
-
Patent number: 6034692
Abstract: An interactive entertainment apparatus is provided having means (10, 14) for modelling a virtual environment populated by modelled characters, with each of the characters being controlled by respective rule-based agents. A camera control function (58) within the apparatus processor periodically monitors at least one compiled behavior per character agent, together with the respective locations within the virtual environment for each of the characters. The processor (10) generates clusters of adjacent characters within the virtual environment in accordance with predetermined clustering criteria, such as relative proximity and commonality of behavioral characteristics, and generates a respective cluster value derived from the current settings of the monitored behaviors within that cluster.
Type: Grant
Filed: August 1, 1997
Date of Patent: March 7, 2000
Assignee: U.S. Philips Corporation
Inventors: Richard D. Gallery, Dale R. Heron
-
Patent number: 5801713
Abstract: A data browsing apparatus displays data automatically in an automatic page-turning mode. The data browsing apparatus has dynamic picture display parameters comprising a frame display time (w), a frame display pitch (m), and a display priority (r). The dynamic picture display parameters may be calculated based on automatic page-turning options designated by users. Dynamic pictures are displayed with accuracy at the automatic page-turning interval.
Type: Grant
Filed: May 23, 1996
Date of Patent: September 1, 1998
Assignee: NEC Corporation
Inventors: Kaoru Endo, Mikio Sugiyama
-
Patent number: 5745126
Abstract: Each and any viewer of a video or a television scene is his or her own proactive editor of the scene, having the ability to interactively dictate and select--in advance of the unfolding of the scene and by high-level command--a particular perspective by which the scene will be depicted, as and when the scene unfolds. Video images of the scene are selected, or even synthesized, in response to a viewer-selected (i) spatial perspective on the scene, (ii) static or dynamic object appearing in the scene, or (iii) event depicted in the scene. Multiple video cameras, each at a different spatial location, produce multiple two-dimensional video images of the real-world scene, each at a different spatial perspective. Objects of interest in the scene are identified and classified by computer in these two-dimensional images. The two-dimensional images of the scene, and accompanying information, are then combined in the computer into a three-dimensional video database, or model, of the scene.
Type: Grant
Filed: June 21, 1996
Date of Patent: April 28, 1998
Assignee: The Regents of the University of California
Inventors: Ramesh Jain, Saied Moezzi, Arun Katkere