Patents by Inventor Maribeth Joy Back
Maribeth Joy Back has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10811055
Abstract: A computer-implemented method is provided for coordinating sensed poses associated with real-time movement of a first object in a live video feed, and pre-recorded poses associated with movement of a second object in a video. The computer-implemented method comprises applying a matching function to determine a match between a point of one of the sensed poses and a corresponding point of the pre-recorded poses, and based on the match, determining a playtime and outputting at least one frame of the video associated with the second object for the playtime.
Type: Grant
Filed: June 27, 2019
Date of Patent: October 20, 2020
Assignee: FUJI XEROX CO., LTD.
Inventors: Donald Kimber, Laurent Denoue, Maribeth Joy Back, Patrick Chiu, Chelhwon Kim, Yanxia Zhang
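The matching step can be sketched as a nearest-neighbour search over one pre-recorded keypoint per video frame. This is a minimal illustration only: the patent does not specify the matching function, the pose representation, or the timing model, so the function name, the 2-D keypoints, and the frame-index-as-playtime convention are all assumptions.

```python
import math

def nearest_playtime(sensed_point, recorded_points):
    """Return the index (playtime frame) of the pre-recorded pose whose
    corresponding keypoint lies closest to the sensed keypoint.

    sensed_point: (x, y) keypoint from the live video feed.
    recorded_points: list of (x, y) keypoints, one per pre-recorded frame.
    """
    best_frame, best_dist = 0, float("inf")
    for frame, point in enumerate(recorded_points):
        dist = math.dist(sensed_point, point)  # Euclidean distance
        if dist < best_dist:
            best_frame, best_dist = frame, dist
    return best_frame

# The matched index selects which frame of the pre-recorded video to output.
playtime = nearest_playtime((0.52, 0.31),
                            [(0.1, 0.1), (0.5, 0.3), (0.9, 0.8)])
```

In practice a full pose has many keypoints, so the distance would be summed over all corresponding points rather than computed for a single one.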
-
Patent number: 10250813
Abstract: A method at a first device with a display includes: receiving image information from a camera device, the image information corresponding to a first field of view associated with the camera device; receiving sensory information from a second device, the sensory information corresponding to a second field of view associated with the second device; and displaying on the display the image information and a visualization of at least one of the first field of view and the second field of view.
Type: Grant
Filed: September 3, 2014
Date of Patent: April 2, 2019
Assignee: FUJI XEROX CO., LTD.
Inventors: Donald Kimber, Sven Kratz, Patrick F. Proppe, Maribeth Joy Back, Bee Y. Liew, Anthony Dunnigan
-
Patent number: 9437045
Abstract: A computer-implemented method for obtaining texture data for a three-dimensional model is disclosed. The method is performed at a portable electronic device including a camera, one or more processors, and memory storing one or more programs. The method includes obtaining an image of one or more objects with the camera. The method also includes concurrently displaying at least a portion of the image with one or more objects in the three-dimensional model. The one or more objects in the three-dimensional model at least partially overlay the one or more objects in the image. The method furthermore includes storing at least one or more portions of the image as texture data for one or more of the one or more objects in the image.
Type: Grant
Filed: July 3, 2013
Date of Patent: September 6, 2016
Assignee: FUJI XEROX CO., LTD.
Inventors: David Joseph Arendash, Maribeth Joy Back
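The storing step can be sketched as cropping the image region that a model object overlays and keeping it as that object's texture. This is a hedged sketch: the patent does not give the projection math or the image format, so the rectangular region, the row-of-pixels image layout, and the `extract_texture` name are assumptions.

```python
def extract_texture(image, region):
    """Crop the rectangular portion of the captured image that a model
    object overlays, returning it as texture data for that object.

    image: 2-D list of pixel values (a list of rows).
    region: (left, top, width, height) in pixel coordinates, assumed to
            come from projecting the model object into the image.
    """
    left, top, w, h = region
    return [row[left:left + w] for row in image[top:top + h]]

# Store the cropped portion keyed by the model object it textures.
textures = {}
image = [[(r, c) for c in range(8)] for r in range(6)]  # stand-in pixels
textures["wall"] = extract_texture(image, (2, 1, 3, 2))
```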
-
Patent number: 9317171
Abstract: Described is an approach to gesture interaction that is based on user interface widgets. In order to detect user gestures, the widgets are provided with hotspots that are monitored using a camera for predetermined patterns of occlusion. A hotspot is a region where a user interacts with the widget by making a gesture over it. The user's gesture may be detected without the user physically touching the surface displaying the widget. The aforesaid hotspots are designed to be visually salient and suggestive of the type of gestures that can be received from the user. Described techniques are advantageous in relation to conventional systems, such as systems utilizing finger tracking, in that they can better support complex tasks with repeated user actions. In addition, they provide better perceived affordance than conventional systems that attempt to use widgets that are not designed for gesture input, or in-the-air gesture detection techniques that lack any visual cues.
Type: Grant
Filed: April 18, 2013
Date of Patent: April 19, 2016
Assignee: FUJI XEROX CO., LTD.
Inventors: Patrick Chiu, Qiong Liu, Maribeth Joy Back, Sven Kratz
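The occlusion monitoring can be sketched as a brightness test over a hotspot's pixels: a hand covering the hotspot darkens it relative to an unoccluded baseline. The patent only says hotspots are monitored for predetermined occlusion patterns, so the brightness heuristic, the threshold, and the `is_occluded` name are assumptions standing in for the real pattern matcher.

```python
def is_occluded(frame, hotspot, baseline, threshold=0.5):
    """Report whether a hotspot region appears occluded in a camera frame.

    frame: dict mapping (x, y) pixel -> brightness in [0, 1].
    hotspot: iterable of (x, y) pixels belonging to the hotspot.
    baseline: mean brightness of the hotspot when nothing covers it.
    A gesture over the hotspot darkens it, so a mean brightness well
    below the baseline is treated as an occlusion event.
    """
    pixels = [frame[p] for p in hotspot]
    mean = sum(pixels) / len(pixels)
    return mean < baseline * threshold

# A sequence of such per-frame occlusion bits would then be matched
# against the widget's predetermined pattern (e.g. cover-then-uncover).
hotspot = [(0, 0), (0, 1)]
covered = is_occluded({(0, 0): 0.1, (0, 1): 0.2}, hotspot, baseline=0.8)
```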
-
Publication number: 20160065858
Abstract: A method at a first device with a display includes: receiving image information from a camera device, the image information corresponding to a first field of view associated with the camera device; receiving sensory information from a second device, the sensory information corresponding to a second field of view associated with the second device; and displaying on the display the image information and a visualization of at least one of the first field of view and the second field of view.
Type: Application
Filed: September 3, 2014
Publication date: March 3, 2016
Inventors: Donald Kimber, Sven Kratz, Patrick F. Proppe, Maribeth Joy Back, Bee Y. Liew, Anthony Dunnigan
-
Publication number: 20150177967
Abstract: In embodiments, a user interface provides for manipulating one or more physical devices for use in a conference room setting. The user interface includes a touch screen for presenting a variety of options to a user. The touch screen includes controllers, such as buttons, to enable the user to select any one of the options. Each of the controllers has goals-oriented information, enabling the user to select a goal, while insulating the user from the underlying complex processes required to carry out the goal through the selection of one of the controllers.
Type: Application
Filed: August 24, 2011
Publication date: June 25, 2015
Inventors: Maribeth Joy Back, Gene Golovchinsky, John Steven Boreczky, Anthony Eric Dunnigan, Pernilla Qvarfordt, William J. van Melle, Laurent Denoue
-
Publication number: 20150009206
Abstract: A computer-implemented method for obtaining texture data for a three-dimensional model is disclosed. The method is performed at a portable electronic device including a camera, one or more processors, and memory storing one or more programs. The method includes obtaining an image of one or more objects with the camera. The method also includes concurrently displaying at least a portion of the image with one or more objects in the three-dimensional model. The one or more objects in the three-dimensional model at least partially overlay the one or more objects in the image. The method furthermore includes storing at least one or more portions of the image as texture data for one or more of the one or more objects in the image.
Type: Application
Filed: July 3, 2013
Publication date: January 8, 2015
Inventors: David Joseph Arendash, Maribeth Joy Back
-
Patent number: 8909702
Abstract: A system is provided that coordinates the operation of hardware devices and software applications in support of specific tasks such as holding a meeting. The system includes one or more computers connected by a network, at least one configuration repository component, at least one room control component, and one or more devices and applications for each room control component. Meeting presenters can configure a meeting, or they may use a default configuration. A meeting includes one or more presenters' configurations of devices and applications to accommodate multiple presenters simultaneously. The meeting configurations are stored by the configuration repository component. Each presenter's configuration comprises a subset of the one or more devices and applications. The operation of devices and applications in the meeting is coordinated by the room control component based on the presenters' configurations for the meeting.
Type: Grant
Filed: September 14, 2007
Date of Patent: December 9, 2014
Assignee: Fuji Xerox Co., Ltd.
Inventors: Gene Golovchinsky, John Steven Boreczky, William J. van Melle, Maribeth Joy Back, Anthony Eric Dunnigan, Pernilla Qvarfordt
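The configuration repository can be sketched as a mapping from meeting to per-presenter device/application subsets, with a default configuration for presenters who store none. This is a data-structure sketch only: the class name, the dictionary layout, and the example device names are assumptions, and the patent's networked room control component is omitted.

```python
# Default used when a presenter has not stored a configuration.
DEFAULT_CONFIG = {"devices": ["projector"], "applications": ["slides"]}

class ConfigurationRepository:
    """Stores, per meeting, each presenter's subset of the room's
    devices and applications; the room control component would consult
    this to coordinate devices when that presenter takes over."""

    def __init__(self):
        self._meetings = {}  # meeting id -> {presenter -> configuration}

    def store(self, meeting_id, presenter, devices, applications):
        self._meetings.setdefault(meeting_id, {})[presenter] = {
            "devices": devices, "applications": applications}

    def lookup(self, meeting_id, presenter):
        # Presenters without a stored configuration get the default.
        return self._meetings.get(meeting_id, {}).get(presenter, DEFAULT_CONFIG)

repo = ConfigurationRepository()
repo.store("standup", "alice", ["projector", "room camera"], ["whiteboard app"])
```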
-
Publication number: 20140313363
Abstract: Described is an approach to gesture interaction that is based on user interface widgets. In order to detect user gestures, the widgets are provided with hotspots that are monitored using a camera for predetermined patterns of occlusion. A hotspot is a region where a user interacts with the widget by making a gesture over it. The user's gesture may be detected without the user physically touching the surface displaying the widget. The aforesaid hotspots are designed to be visually salient and suggestive of the type of gestures that can be received from the user. Described techniques are advantageous in relation to conventional systems, such as systems utilizing finger tracking, in that they can better support complex tasks with repeated user actions. In addition, they provide better perceived affordance than conventional systems that attempt to use widgets that are not designed for gesture input, or in-the-air gesture detection techniques that lack any visual cues.
Type: Application
Filed: April 18, 2013
Publication date: October 23, 2014
Applicant: FUJI XEROX CO., LTD.
Inventors: Patrick Chiu, Qiong Liu, Maribeth Joy Back, Sven Kratz
-
Patent number: 8243116
Abstract: A method is described for modifying behavior for social appropriateness in computer mediated communications. Data can be obtained representing the natural non-verbal behavior of a video conference participant. The cultural appropriateness of the behavior is calculated based on a cultural model and previous behavior of the session. Upon detecting that the behavior of the user is culturally inappropriate, the system can calculate an alternative behavior based on the cultural model. Based on this alternative behavior, the video output stream can be modified to be more appropriate by altering gaze and gesture of the conference participants. The output stream can be modified by using previously recorded images of the participant, by digitally synthesizing a virtual avatar display or by switching the view displayed to the remote participant. Once the user's behavior changes to be once again culturally appropriate, the modified video stream can be returned to an unmodified state.
Type: Grant
Filed: September 24, 2007
Date of Patent: August 14, 2012
Assignee: Fuji Xerox Co., Ltd.
Inventors: Pernilla Qvarfordt, Gene Golovchinsky, Maribeth Joy Back
-
Publication number: 20110307800
Abstract: In embodiments, a user interface provides for manipulating one or more physical devices for use in a conference room setting. The user interface includes a touch screen for presenting a variety of options to a user. The touch screen includes controllers, such as buttons, to enable the user to select any one of the options. Each of the controllers has goals-oriented information, enabling the user to select a goal, while insulating the user from the underlying complex processes required to carry out the goal through the selection of one of the controllers.
Type: Application
Filed: August 24, 2011
Publication date: December 15, 2011
Inventors: Maribeth Joy Back, Gene Golovchinsky, John Steven Boreczky, Anthony Eric Dunnigan, Pernilla Qvarfordt, William J. van Melle, Laurent Denoue
-
Patent number: 7991920
Abstract: System and method for controlling the presentation of information, such as dynamically displayed text, includes a computer with a display device and one or more sets of electrode plates and capacitive field sensors arranged facing each other on a substantially flat and substantially stationary surface, such as a table top. The method includes forming capacitive fields between the electrodes and sensors by electrically charging the electrode plates. The sensors monitor for gestural movements made by a user's hands within the fields by detecting changes in voltage levels of the fields. In response to detected gestural movements, the computer adjusts the manner in which the information is presented in the display device, such as the display rate, information source, font size and contrast control.
Type: Grant
Filed: December 18, 2003
Date of Patent: August 2, 2011
Assignee: Xerox Corporation
Inventors: Maribeth Joy Back, Margaret H. Szymanski
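The control loop can be sketched as mapping a sensed change in a field's voltage level to an adjustment of the display rate. This is a hedged sketch: the patent does not give the voltage-to-rate mapping, so the linear gain, the words-per-minute unit, and the `adjust_display_rate` name are assumptions.

```python
def adjust_display_rate(rate, voltage_delta, sensitivity=10.0):
    """Map a sensed change in a capacitive field's voltage level to a
    new text-display rate (here, words per minute).

    A hand moving within the field changes the measured voltage; the
    sign of the change decides whether the dynamically displayed text
    speeds up or slows down, scaled by a gain factor.
    """
    new_rate = rate + sensitivity * voltage_delta
    return max(0.0, new_rate)  # the rate never goes negative

# A gesture raising the voltage by 2 units speeds up a 120 wpm display.
faster = adjust_display_rate(120.0, 2.0)
```

The same pattern would apply to the other adjustable properties the abstract names (information source, font size, contrast), each driven by its own gesture-to-parameter mapping.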
-
Patent number: 7822765
Abstract: Described is a component-based control system involving the interface and middleware layers for collaborative exploratory search. The components include modules for multi-user input and display capabilities, and are individually configurable to allow simultaneous manipulation of multiple search parameters and algorithms. In a collaborative exploratory search, a team of people with a shared information need engage in exploratory search together. This search happens synchronously, leveraging realtime feedback in the search loop. The search team works together, finding patterns and information that each player individually might not have found, and doing so more efficiently than any single person could have. Each team member brings their own expertise and point of view to a shared problem. Distributing tasks and roles among team members leverages individual expertise and creates efficiencies of scale.
Type: Grant
Filed: October 23, 2007
Date of Patent: October 26, 2010
Assignee: Fuji Xerox Co., Ltd.
Inventors: Maribeth Joy Back, Jeremy Pickens, Gene Golovchinsky
-
Publication number: 20090079816
Abstract: A method is described for modifying behavior for social appropriateness in computer mediated communications. Data can be obtained representing the natural non-verbal behavior of a video conference participant. The cultural appropriateness of the behavior is calculated based on a cultural model and previous behavior of the session. Upon detecting that the behavior of the user is culturally inappropriate, the system can calculate an alternative behavior based on the cultural model. Based on this alternative behavior, the video output stream can be modified to be more appropriate by altering gaze and gesture of the conference participants. The output stream can be modified by using previously recorded images of the participant, by digitally synthesizing a virtual avatar display or by switching the view displayed to the remote participant. Once the user's behavior changes to be once again culturally appropriate, the modified video stream can be returned to an unmodified state.
Type: Application
Filed: September 24, 2007
Publication date: March 26, 2009
Applicant: FUJI XEROX CO., LTD.
Inventors: Pernilla Qvarfordt, Gene Golovchinsky, Maribeth Joy Back
-
Publication number: 20090024585
Abstract: Described is a component-based control system involving the interface and middleware layers for collaborative exploratory search. The components include modules for multi-user input and display capabilities, and are individually configurable to allow simultaneous manipulation of multiple search parameters and algorithms. In a collaborative exploratory search, a team of people with a shared information need engage in exploratory search together. This search happens synchronously, leveraging realtime feedback in the search loop. The search team works together, finding patterns and information that each player individually might not have found, and doing so more efficiently than any single person could have. Each team member brings their own expertise and point of view to a shared problem. Distributing tasks and roles among team members leverages individual expertise and creates efficiencies of scale.
Type: Application
Filed: October 23, 2007
Publication date: January 22, 2009
Applicant: FUJI XEROX CO., LTD.
Inventors: Maribeth Joy Back, Jeremy Pickens, Gene Golovchinsky
-
Publication number: 20090024581
Abstract: Described is a new framework that combines the best of both exploratory search and social search: collaborative exploratory search. This system allows a small group of focused information seekers to search through a collection of information in concert. The system provides exploratory feedback not only based on the individual's search behavior, but on the current, active search behavior of one's fellow search allies. The assumption is that the users who have gotten together to search collaboratively have the same information need, but differing perspectives and insights as to how to best express the queries to meet that need. The collaborative exploratory search system will therefore provide tools and algorithmic support to focus, enhance, and augment searcher activities. Searchers can, by interacting with each other through system-mediated information displays, help each other find all relevant information more efficiently and effectively.
Type: Application
Filed: July 20, 2007
Publication date: January 22, 2009
Applicant: FUJI XEROX CO., LTD.
Inventors: Jeremy Garner Pickens, Maribeth Joy Back
-
Publication number: 20080184115
Abstract: In embodiments, a user interface provides for manipulating one or more physical devices for use in a conference room setting. The user interface includes a touch screen for presenting a variety of options to a user. The touch screen includes controllers, such as buttons, to enable the user to select any one of the options. Each of the controllers has goals-oriented information, enabling the user to select a goal, while insulating the user from the underlying complex processes required to carry out the goal through the selection of one of the controllers.
Type: Application
Filed: July 19, 2007
Publication date: July 31, 2008
Applicant: FUJI XEROX CO., LTD.
Inventors: Maribeth Joy Back, Gene Golovchinsky, John Steven Boreczky, Anthony Eric Dunnigan, Pernilla Qvarfordt, William J. van Melle, Laurent Denoue
-
Publication number: 20080183820
Abstract: A system is provided that coordinates the operation of hardware devices and software applications in support of specific tasks such as holding a meeting. The system includes one or more computers connected by a network, at least one configuration repository component, at least one room control component, and one or more devices and applications for each room control component. Meeting presenters can configure a meeting, or they may use a default configuration. A meeting includes one or more presenters' configurations of devices and applications to accommodate multiple presenters simultaneously. The meeting configurations are stored by the configuration repository component. Each presenter's configuration comprises a subset of the one or more devices and applications. The operation of devices and applications in the meeting is coordinated by the room control component based on the presenters' configurations for the meeting.
Type: Application
Filed: September 14, 2007
Publication date: July 31, 2008
Applicant: FUJI XEROX CO., LTD.
Inventors: Gene Golovchinsky, John Steven Boreczky, William J. van Melle, Maribeth Joy Back, Anthony Eric Dunnigan, Pernilla Qvarfordt
-
Publication number: 20070271524
Abstract: The present invention relates to techniques for supporting organizational, labeling and retrieval tasks on an electronic tabletop, wall or large display. In various embodiments of the invention, a dynamic visualization is used to show a current working set of documents. In an embodiment of the invention, the rest of the collection is represented in the background as small dots. In an embodiment of the invention, when a user moves objects into groups or creates a label, relevant objects in the background are automatically retrieved and moved into the foreground. In an embodiment of the invention, retrieved objects along with relevant objects in the current set are highlighted and decorated with arrows pointing to their relevant groups. In an embodiment of the invention, the movement is animated to provide user feedback when objects must travel long distances on a large display.
Type: Application
Filed: May 19, 2006
Publication date: November 22, 2007
Applicant: FUJI XEROX CO., LTD.
Inventors: Patrick Chiu, Xiaohua Sun, Jeffrey Huang, Maribeth Joy Back, Wolfgang H. Polak
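The label-triggered retrieval can be sketched as partitioning background documents into those relevant to a newly created label and the rest. The publication does not specify the relevance computation, so the plain keyword match, the dict document format, and the `retrieve_for_label` name here are all assumptions standing in for it.

```python
def retrieve_for_label(label, background_docs):
    """When a user creates a group label, pull relevant background
    documents into the foreground and leave the rest in the background.

    background_docs: list of dicts with "id" and "text" keys.
    """
    retrieved = [d for d in background_docs
                 if label.lower() in d["text"].lower()]
    remaining = [d for d in background_docs if d not in retrieved]
    return retrieved, remaining

docs = [{"id": 1, "text": "Quarterly budget report"},
        {"id": 2, "text": "Holiday photos"},
        {"id": 3, "text": "Budget forecast 2007"}]
foreground, background = retrieve_for_label("budget", docs)
```

In the described system the retrieved objects would additionally be highlighted, decorated with arrows to their groups, and animated across the display.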
-
Publication number: 20040119684
Abstract: System and method for navigating information. The system includes an information presentation device having an output portion (e.g., display device or audio device) and one or more motion sensors. The method includes the motion sensors monitoring for at least one movement of the information display device while the device dynamically presents information at its output portion. Upon the sensors detecting movement of the device, the device adjusts the manner in which the information is dynamically presented to enable operators of the device to easily navigate to an appropriate location of the information and adjust the manner in which it is dynamically presented (e.g., the rate of the dynamic presentation).Type: Application
Filed: December 18, 2002
Publication date: June 24, 2004
Applicant: Xerox Corporation
Inventors: Maribeth Joy Back, Roy Want
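The navigation step can be sketched as mapping discrete sensed motions to moves through the presented information. The publication only says that detected movement adjusts the presentation, so the gesture vocabulary (tilt forward/back, shake), the frame-index position model, and the `navigate` name are illustrative assumptions.

```python
def navigate(position, length, motion):
    """Move the presentation point in response to a sensed device motion.

    position: current index into the dynamically presented information.
    length: total number of presentation units.
    motion: one of "tilt_forward", "tilt_back", "shake" (an assumed
            gesture vocabulary).
    """
    if motion == "tilt_forward":
        position += 1          # advance through the information
    elif motion == "tilt_back":
        position -= 1          # move back
    elif motion == "shake":
        position = 0           # jump to the start
    return max(0, min(length - 1, position))  # clamp to valid range

# Tilting forward advances one unit; shaking restarts the presentation.
here = navigate(3, 10, "tilt_forward")
```

A continuous variant would instead scale the presentation rate by the tilt angle, matching the abstract's "rate of the dynamic presentation" example.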