Patents by Inventor George Weising
George Weising has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20160214011
  Abstract: Methods, apparatus, and computer programs for controlling a view of a virtual scene with a handheld device are presented. In one method, images of a real world scene are captured using a device. The method further includes operations for creating an augmented view for presentation on a display of the device by augmenting the images with virtual reality objects, and for detecting a hand in the images as extending into the real world scene. In addition, the method includes operations for showing the hand on the screen as detected in the images, and for generating interaction data, based on an interaction of the hand with a virtual reality object, when the hand makes virtual contact in the augmented view with the virtual reality object. The augmented view is updated based on the interaction data, which simulates on the screen that the hand is interacting with the virtual reality object.
  Type: Application. Filed: April 1, 2016. Publication date: July 28, 2016. Inventors: George Weising, Thomas Miller
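The interaction loop this abstract describes can be sketched minimally: a detected hand position is tested for "virtual contact" against a virtual object, and the augmented view is updated from the resulting interaction data. The class names, the dictionary-based view, and the simple sphere-contact test are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass


@dataclass
class VirtualObject:
    x: float
    y: float
    z: float
    radius: float  # contact radius around the object's center


def in_contact(hand_pos, obj):
    """True when the hand makes virtual contact with the object."""
    dx = hand_pos[0] - obj.x
    dy = hand_pos[1] - obj.y
    dz = hand_pos[2] - obj.z
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= obj.radius


def update_augmented_view(hand_pos, obj, view):
    """Generate interaction data on contact and fold it into the view."""
    if in_contact(hand_pos, obj):
        interaction = {"hand": hand_pos, "event": "touch"}
        view = dict(view, last_interaction=interaction, highlight=True)
    return view


# A hand detected near the object's center triggers an interaction.
view = {"highlight": False}
view = update_augmented_view((0.1, 0.0, 0.05), VirtualObject(0, 0, 0, 0.2), view)
```

In a real system the contact test would run against tracked hand geometry every frame; the point here is only the flow from detection to interaction data to an updated view.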
- Patent number: 9372701
  Abstract: An interface for managing digital information is provided. Digital information including one or more digital files is stored in memory. An icon is associated with the digital information and rendered inside a translucent bubble. The bubble may be manipulated in the digital environment by a user.
  Type: Grant. Filed: May 12, 2010. Date of Patent: June 21, 2016. Assignee: Sony Interactive Entertainment America LLC. Inventor: George Weising
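A toy sketch of the bubble container described above: an icon for stored digital files lives inside a translucent bubble that the user can drag around the digital environment. The field names and the drag operation are assumptions for illustration only.

```python
class Bubble:
    def __init__(self, icon, files, opacity=0.5):
        self.icon = icon               # icon associated with the information
        self.files = list(files)       # digital information stored in memory
        self.opacity = opacity         # translucency of the bubble
        self.position = (0.0, 0.0)

    def drag(self, dx, dy):
        """User manipulation: move the bubble in the digital environment."""
        x, y = self.position
        self.position = (x + dx, y + dy)


bubble = Bubble("photos.png", ["img1.jpg", "img2.jpg"])
bubble.drag(30.0, 12.5)
```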
- Patent number: 9310883
  Abstract: Methods, apparatus, and computer programs for controlling a view of a virtual scene with a handheld device are presented. In one method, images of a real world scene are captured using a device. The method further includes operations for creating an augmented view for presentation on a display of the device by augmenting the images with virtual reality objects, and for detecting a hand in the images as extending into the real world scene. In addition, the method includes operations for showing the hand on the screen as detected in the images, and for generating interaction data, based on an interaction of the hand with a virtual reality object, when the hand makes virtual contact in the augmented view with the virtual reality object. The augmented view is updated based on the interaction data, which simulates on the screen that the hand is interacting with the virtual reality object.
  Type: Grant. Filed: April 23, 2014. Date of Patent: April 12, 2016. Assignee: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
- Patent number: 8954356
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Grant. Filed: October 9, 2013. Date of Patent: February 10, 2015. Assignee: Sony Computer Entertainment America LLC. Inventor: George Weising
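The "thought-based statement" idea above can be sketched as combining a learned idiosyncrasy from the knowledge base with current environmental context to produce a statement that invokes a user action. The template logic and the time-of-day context key are hypothetical, chosen only to make the flow concrete.

```python
def thought_based_statement(knowledge_base, environment):
    """Pick a learned idiosyncrasy relevant to the current environment.

    knowledge_base maps a learned habit to the context in which it occurs;
    environment carries information sensed from the user's surroundings.
    """
    for habit, context in knowledge_base.items():
        if context == environment.get("time_of_day"):
            # Statement that invokes a subsequent action by the user.
            return f"You usually {habit} around now. Want to start?"
    return "Anything I can help with?"


kb = {"play racing games": "evening", "check messages": "morning"}
msg = thought_based_statement(kb, {"time_of_day": "evening"})
```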
- Publication number: 20140235311
  Abstract: Methods, apparatus, and computer programs for controlling a view of a virtual scene with a handheld device are presented. In one method, images of a real world scene are captured using a device. The method further includes operations for creating an augmented view for presentation on a display of the device by augmenting the images with virtual reality objects, and for detecting a hand in the images as extending into the real world scene. In addition, the method includes operations for showing the hand on the screen as detected in the images, and for generating interaction data, based on an interaction of the hand with a virtual reality object, when the hand makes virtual contact in the augmented view with the virtual reality object. The augmented view is updated based on the interaction data, which simulates on the screen that the hand is interacting with the virtual reality object.
  Type: Application. Filed: April 23, 2014. Publication date: August 21, 2014. Applicant: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
- Publication number: 20140232652
  Abstract: Methods, systems, and computer programs are provided for generating an interactive space. One method includes operations for associating a first device to a reference point in 3D space, and for calculating by the first device a position of the first device in the 3D space based on inertial information captured by the first device and utilizing dead reckoning. Further, the method includes operations for capturing images with a camera of the first device, and for identifying locations of one or more static features in the images. The position of the first device is corrected based on the identified locations of the one or more static features, and a view of an interactive scene is presented in a display of the first device, where the interactive scene is tied to the reference point and includes virtual objects.
  Type: Application. Filed: April 23, 2014. Publication date: August 21, 2014. Applicant: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
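The two-step tracking described in this abstract can be illustrated as follows: propagate the position by dead reckoning from inertial data, then nudge the drifting estimate toward a position implied by static features observed in camera images. The function names and the fixed blend gain are assumptions; a real system would use a proper filter such as a Kalman filter.

```python
def dead_reckon(position, velocity, dt):
    """Propagate position from inertial data only (accumulates drift)."""
    return tuple(p + v * dt for p, v in zip(position, velocity))


def correct_with_features(estimate, feature_fix, gain=0.5):
    """Blend the drifting estimate toward the camera-derived position fix."""
    return tuple(e + gain * (f - e) for e, f in zip(estimate, feature_fix))


pos = (0.0, 0.0, 0.0)
pos = dead_reckon(pos, (1.0, 0.0, 0.0), dt=1.0)    # inertial step
pos = correct_with_features(pos, (0.8, 0.0, 0.0))  # static-feature correction
```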
- Patent number: 8730156
  Abstract: Ways for controlling a virtual-scene view in a portable device are presented. In one method, a signal is received and the device is synchronized to make the location of the device a reference point in a three-dimensional (3D) space. A virtual scene with virtual reality elements is generated around the reference point. The current position of the device in the 3D space, with respect to the reference point, is determined and a view of the virtual scene created. The view represents the virtual scene as seen from the current position of the device with a viewing angle based on the position of the device. The created view is displayed in the device, and the view of the virtual scene is changed as the device is moved within the 3D space. In another method, multiple players share the virtual reality and interact with each other in the virtual reality.
  Type: Grant. Filed: November 16, 2010. Date of Patent: May 20, 2014. Assignee: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
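A minimal sketch of the idea: the device's location at synchronization time becomes the origin (reference point) of the 3D space, and the view is derived from the device's current offset from that origin. The yaw computation here is a deliberately simple assumption standing in for full pose estimation.

```python
import math


class VirtualSceneView:
    def __init__(self, reference_point):
        # Set when the device receives the signal and synchronizes.
        self.ref = reference_point

    def view_for(self, device_pos):
        """Return the offset and yaw angle for the current device position."""
        dx = device_pos[0] - self.ref[0]
        dy = device_pos[1] - self.ref[1]
        dz = device_pos[2] - self.ref[2]
        yaw = math.degrees(math.atan2(dx, dz))  # angle around vertical axis
        return {"offset": (dx, dy, dz), "yaw": yaw}


scene = VirtualSceneView(reference_point=(0.0, 0.0, 0.0))
view = scene.view_for((1.0, 0.0, 1.0))  # device moved diagonally from origin
```

As the device moves, `view_for` is recomputed each frame, which is what makes the displayed view change with the device's motion through the 3D space.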
- Patent number: 8725659
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Grant. Filed: December 21, 2012. Date of Patent: May 13, 2014. Assignee: Sony Computer Entertainment America LLC. Inventor: George Weising
- Patent number: 8717294
  Abstract: Methods, systems, and computer programs for generating an interactive space, viewable through at least a first and a second handheld device, are presented. The method includes an operation for taking an image with a camera in the first device. In addition, the method includes an operation for determining a relative position of the second device with reference to the first device, based on image analysis of the taken image to identify a geometry of the second device. Furthermore, the method includes operations for identifying a reference point in a three-dimensional (3D) space based on the relative position, and for generating views of an interactive scene in corresponding displays of the first device and the second device. The interactive scene is tied to the reference point and includes virtual objects, and each view shows all or part of the interactive scene as observed from a current location of the corresponding device.
  Type: Grant. Filed: September 3, 2013. Date of Patent: May 6, 2014. Assignee: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
- Publication number: 20140040168
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Application. Filed: October 9, 2013. Publication date: February 6, 2014. Applicant: Sony Computer Entertainment America LLC. Inventor: George Weising
- Publication number: 20140002359
  Abstract: Methods, systems, and computer programs for generating an interactive space, viewable through at least a first and a second handheld device, are presented. The method includes an operation for taking an image with a camera in the first device. In addition, the method includes an operation for determining a relative position of the second device with reference to the first device, based on image analysis of the taken image to identify a geometry of the second device. Furthermore, the method includes operations for identifying a reference point in a three-dimensional (3D) space based on the relative position, and for generating views of an interactive scene in corresponding displays of the first device and the second device. The interactive scene is tied to the reference point and includes virtual objects, and each view shows all or part of the interactive scene as observed from a current location of the corresponding device.
  Type: Application. Filed: September 3, 2013. Publication date: January 2, 2014. Applicant: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
- Patent number: 8537113
  Abstract: Methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device are presented. The method includes an operation for detecting from the first device a location of the second device or vice versa. Further, synchronization data is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. Each device establishes the physical location of the other device in the 3D space when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. The view in the display shows the interactive scene as observed from the current location of the corresponding device.
  Type: Grant. Filed: December 20, 2010. Date of Patent: September 17, 2013. Assignee: Sony Computer Entertainment America LLC. Inventors: George Weising, Thomas Miller
- Patent number: 8504487
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Grant. Filed: September 21, 2010. Date of Patent: August 6, 2013. Assignee: Sony Computer Entertainment America LLC. Inventor: George Weising
- Patent number: 8484219
  Abstract: Developing a knowledgebase associated with a user interface is disclosed. Development of the knowledgebase includes cataloging local data associated with a user, collecting remote data associated with the user, recording information associated with verbal input received from the user, tracking acts performed by the user to determine user idiosyncrasies, and updating the knowledgebase with the cataloged local data, the collected remote data, the recorded information, and the user idiosyncrasies. The updated knowledgebase is then provided to a component of a user interface.
  Type: Grant. Filed: September 22, 2010. Date of Patent: July 9, 2013. Assignee: Sony Computer Entertainment America LLC. Inventor: George Weising
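The development steps this abstract enumerates (catalog local data, collect remote data, record verbal input, track acts to determine idiosyncrasies, then update) can be sketched as a small class. The dictionary-backed storage model and the act counter are illustrative assumptions, not the patented design.

```python
class Knowledgebase:
    def __init__(self):
        self.store = {"local": [], "remote": [], "verbal": [], "idiosyncrasies": {}}

    def catalog_local(self, item):
        """Catalog local data associated with the user."""
        self.store["local"].append(item)

    def collect_remote(self, item):
        """Collect remote data associated with the user."""
        self.store["remote"].append(item)

    def record_verbal(self, utterance):
        """Record information associated with verbal input from the user."""
        self.store["verbal"].append(utterance)

    def track_act(self, act):
        """Track acts; repeated acts become learned idiosyncrasies."""
        counts = self.store["idiosyncrasies"]
        counts[act] = counts.get(act, 0) + 1


kb = Knowledgebase()
kb.catalog_local("playlists.db")
kb.record_verbal("open my racing game")
kb.track_act("plays_at_night")
kb.track_act("plays_at_night")
```

The updated store is what would then be handed to a user-interface component, as the abstract's final step describes.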
- Publication number: 20130117201
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Application. Filed: December 21, 2012. Publication date: May 9, 2013. Inventor: George Weising
- Publication number: 20120072424
  Abstract: Developing a knowledgebase associated with a user interface is disclosed. Development of the knowledgebase includes cataloging local data associated with a user, collecting remote data associated with the user, recording information associated with verbal input received from the user, tracking acts performed by the user to determine user idiosyncrasies, and updating the knowledgebase with the cataloged local data, the collected remote data, the recorded information, and the user idiosyncrasies. The updated knowledgebase is then provided to a component of a user interface.
  Type: Application. Filed: September 22, 2010. Publication date: March 22, 2012. Inventor: George Weising
- Publication number: 20120072379
  Abstract: A user interface evolves based on learned idiosyncrasies and collected data of a user. Learned idiosyncrasies and collected data of the user can be stored in a knowledge base. Information from the surrounding environment of the user can be obtained during learning of idiosyncrasies or collection of data. Thought-based statements can be generated based at least in part on the knowledge base and the information from the environment surrounding the user during learning of idiosyncrasies or collection of data. The thought-based statements serve to invoke or respond to subsequent actions of the user. The user interface can be presented so as to allow for interaction with the user based at least in part on the thought-based statements. Furthermore, personality nuances of the user interface can be developed that affect the interaction between the user and the user interface.
  Type: Application. Filed: September 21, 2010. Publication date: March 22, 2012. Inventor: George Weising
- Publication number: 20110283238
  Abstract: An interface for managing digital information is provided. Digital information including one or more digital files is stored in memory. An icon is associated with the digital information and rendered inside a translucent bubble. The bubble may be manipulated in the digital environment by a user.
  Type: Application. Filed: May 12, 2010. Publication date: November 17, 2011. Inventor: George Weising
- Publication number: 20110281648
  Abstract: The generation, association, and display of in-game tags are disclosed. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play.
  Type: Application. Filed: May 11, 2010. Publication date: November 17, 2011. Inventor: George Weising
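The core of this abstract, tags placed at particular coordinates and retrievable by label-based search, can be sketched with a small in-memory store. The class and field names are hypothetical; a shipping game would persist tags server-side and filter content before display.

```python
class TagBoard:
    def __init__(self):
        self.tags = []

    def place(self, author, coords, content, labels):
        """Associate a user-generated tag with coordinates in the game space."""
        self.tags.append({"author": author, "coords": coords,
                          "content": content, "labels": set(labels)})

    def search(self, label):
        """Label-based search across the placed tags."""
        return [t for t in self.tags if label in t["labels"]]


board = TagBoard()
board.place("top_player", (10, 4), "Hidden chest behind the wall", ["secret"])
board.place("player2", (3, 7), "Boss weak to fire", ["boss", "hint"])
hits = board.search("secret")
```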
- Publication number: 20110260830
  Abstract: Methods and systems for applying biometric data to an interactive program executed by a portable device are provided. According to embodiments of the invention, raw bio-signal data is captured and filtered so as to determine the bio-signal of the user of the interactive program. The bio-signal is analyzed so as to determine biometrics of the user, which are applied as input to the interactive program. A setting or state of the interactive program is modified based on the biometrics. An updated state of the interactive program is rendered to the user, reflecting the modification of the setting or state of the interactive program.Type: Application. Filed: December 8, 2010. Publication date: October 27, 2011. Applicant: Sony Computer Entertainment Inc. Inventor: George Weising
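The pipeline in this last abstract can be illustrated end to end: raw bio-signal samples are filtered, a biometric is derived from the filtered signal, and a program setting is modified from it. The moving-average filter, the heart-rate interpretation, and the difficulty rule are all assumptions made for the sketch.

```python
def moving_average(samples, window=3):
    """Simple low-pass filter over raw bio-signal samples."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]


def adjust_difficulty(state, heart_rate):
    """Modify a program setting when the derived biometric shows stress."""
    new_state = dict(state)
    new_state["difficulty"] = "easy" if heart_rate > 100 else "normal"
    return new_state


raw = [98, 140, 96, 102, 160, 104]      # noisy raw bio-signal samples
filtered = moving_average(raw)          # filtered bio-signal
hr = sum(filtered) / len(filtered)      # derived biometric (mean rate)
state = adjust_difficulty({"difficulty": "normal"}, hr)
```

The updated `state` is what the program would render back to the user, closing the loop the abstract describes.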