Patents by Inventor Belinda Margaret Yee
Belinda Margaret Yee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11062505
Abstract: A computer-implemented system and method of rendering an object in a virtual view. The method comprises determining a variation of an occlusion measure of the object over time, the occlusion measure being an evaluation of an occlusion of the virtual view by the object, the variation of the occlusion measure determined based on a trajectory of the object. The method also comprises determining a transition effect for the object based on the variation of the occlusion measure, the transition effect being a visual effect, and applying the determined transition effect to render the object, the determined transition effect being applied according to a position of the object as the object moves along the trajectory.
Type: Grant
Filed: December 7, 2017
Date of Patent: July 13, 2021
Assignee: Canon Kabushiki Kaisha
Inventors: Belinda Margaret Yee, Berty Jacques Alain Bhuruth
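The idea in this abstract can be illustrated with a minimal sketch: sample an occlusion measure along the object's predicted trajectory, and pick a transition effect from how the measure varies. All names and thresholds here (`occlusion_measure`, `choose_transition`, the 0.05 band) are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch: choose a fade-style transition for an object based on
# how much it occludes a virtual view along its trajectory.

def occlusion_measure(obj_pos, view_center, obj_radius=1.0):
    """Toy occlusion score: higher when the object is nearer the view axis."""
    dx = obj_pos[0] - view_center[0]
    dy = obj_pos[1] - view_center[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return max(0.0, 1.0 - dist / (10.0 * obj_radius))

def choose_transition(trajectory, view_center):
    """If occlusion rises along the trajectory, fade the object out; if it
    falls, fade it back in; otherwise leave it unchanged."""
    scores = [occlusion_measure(p, view_center) for p in trajectory]
    variation = scores[-1] - scores[0]
    if variation > 0.05:
        return "fade_out"
    if variation < -0.05:
        return "fade_in"
    return "none"

# An object moving toward the view centre raises the occlusion measure.
path = [(8.0, 0.0), (4.0, 0.0), (1.0, 0.0)]
print(choose_transition(path, (0.0, 0.0)))  # fade_out
```

The effect would then be applied per frame according to the object's current position on the trajectory, as the abstract describes.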
-
Publication number: 20200106967
Abstract: A computer-implemented method of configuring a virtual camera. The method comprises receiving, at an interface of an electronic device, a pointing operation identifying a location in a representation of a scene displayed in a first display region; and receiving, at the interface, a further operation in the first display region, the further operation comprising a continuous motion away from the location of the pointing operation. The method further comprises configuring the virtual camera based on the location of the pointing operation and at least a direction of the further operation, wherein an image corresponding to the configured virtual camera is displayed in a second display region, the second display region being different from the first display region.
Type: Application
Filed: May 31, 2018
Publication date: April 2, 2020
Inventor: Belinda Margaret Yee
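The two-gesture scheme described here — a tap fixing the camera location, then a drag fixing its direction — can be sketched as follows. The function name and pose representation are hypothetical stand-ins, not the patent's implementation.

```python
# Illustrative sketch: derive a virtual-camera pose from a tap (pointing
# operation) and a subsequent drag direction (further operation).
import math

def configure_camera_from_gesture(tap, drag_end):
    """Place the camera at the tapped scene location, oriented along the drag."""
    dx = drag_end[0] - tap[0]
    dy = drag_end[1] - tap[1]
    heading_deg = math.degrees(math.atan2(dy, dx))
    return {"position": tap, "heading_deg": heading_deg}

# Tap at (5, 5), then drag straight "up": camera at (5, 5) facing 90 degrees.
cam = configure_camera_from_gesture(tap=(5.0, 5.0), drag_end=(5.0, 8.0))
print(cam)
```

The resulting view would be rendered in the second display region while the gesture happens in the first, matching the abstract's two-region layout.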
-
Patent number: 10569172
Abstract: A method of controlling a virtual camera comprising displaying, when a device is in a first orientation, a view of a scene on the device. A location in the scene is determined while the device is in the first orientation, based on user input detected on the device. Controls for the virtual camera on the device are configured to control the virtual camera in response to detecting that an orientation of the device has changed from the first orientation to a second orientation, the configuration of the controls being based on the determined location. Commands for the configured controls are received to control the virtual camera.
Type: Grant
Filed: September 19, 2017
Date of Patent: February 25, 2020
Assignee: Canon Kabushiki Kaisha
Inventor: Belinda Margaret Yee
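A minimal sketch of this orientation-driven mode switch follows: one orientation selects a location in the scene, and rotating the device reconfigures the controls around that location. The orientation labels and mode names are assumptions for illustration only.

```python
# Illustrative sketch: reconfigure virtual-camera controls when the device
# orientation changes, keeping the location chosen in the first orientation.

def configure_controls(orientation, selected_location):
    if orientation == "flat":       # first orientation: e.g. top-down scene view
        return {"mode": "select_location", "location": selected_location}
    if orientation == "upright":    # second orientation: through-the-lens view
        return {"mode": "camera_control", "camera_at": selected_location}
    raise ValueError(f"unknown orientation: {orientation}")

# Pick a location while flat, then tilt the device upright to fly the camera.
state = configure_controls("flat", (12.0, 30.0))
state = configure_controls("upright", state["location"])
print(state["mode"])  # camera_control
```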
-
Patent number: 10460492
Abstract: A method of navigating a virtual camera using a navigation device is disclosed. An initial position of the virtual camera capturing at least a portion of a scene is received. The initial position of the virtual camera is associated with an initial camera movement characteristic. A candidate rail is generated based on the initial position of the virtual camera and a change in the scene. The generated candidate rail is associated with a navigation constraint defining movement of the virtual camera along the generated candidate rail. The virtual camera is navigated along the generated candidate rail using the navigation device based on the initial camera movement characteristic and a characteristic defined by the navigation constraint. The constrained navigation is enabled using the navigation device.
Type: Grant
Filed: September 13, 2017
Date of Patent: October 29, 2019
Assignee: Canon Kabushiki Kaisha
Inventors: Nikos James Andronikos, Belinda Margaret Yee, Berty Jacques Alain Bhuruth
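The rail idea can be sketched as: generate a candidate path from the camera's position toward a detected scene change, then move along it under a constraint. The straight-line rail and speed clamp below are illustrative assumptions; the patent's rail generation and constraints are not specified here.

```python
# Illustrative sketch: a candidate camera rail toward a scene change,
# navigated under a maximum-speed constraint.

def generate_rail(cam_pos, change_pos, steps=5):
    """Straight-line rail from the camera toward the scene change."""
    return [
        (cam_pos[0] + (change_pos[0] - cam_pos[0]) * t / steps,
         cam_pos[1] + (change_pos[1] - cam_pos[1]) * t / steps)
        for t in range(steps + 1)
    ]

def navigate(rail, speed, max_speed):
    """Advance along the rail, clamping the requested speed to the constraint."""
    step = min(speed, max_speed)  # the navigation constraint in action
    index = min(int(step), len(rail) - 1)
    return rail[index]

rail = generate_rail((0.0, 0.0), (10.0, 0.0))
# Requesting speed 8 is clamped to 3, landing on the rail's fourth point.
print(navigate(rail, speed=8, max_speed=3))  # (6.0, 0.0)
```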
-
Patent number: 10440313
Abstract: A method and system for spatially arranging a plurality of video frames for display on a layout region is provided. The plurality of video frames are selected from a video sequence based on a determination of motion of an object within the video sequence. An image layout path is determined for the selected video frames. An anchor point is determined for each selected video frame based on a determination of motion of the object depicted in the video frame, each said anchor point locating a selected video frame with respect to the layout path. The selected plurality of video frames are spatially arranged on the layout region relative to the determined image layout path and in accordance with the determined anchor points.
Type: Grant
Filed: November 23, 2016
Date of Patent: October 8, 2019
Assignee: Canon Kabushiki Kaisha
Inventors: Veena Murthy Srinivasa Dodballapur, Belinda Margaret Yee, Ian Robert Boreham
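The anchor-point idea can be sketched simply: offset each frame so that the detected object position, rather than the frame corner, sits on the layout path. The horizontal path, spacing, and toy "motion detection" below are illustrative assumptions.

```python
# Illustrative sketch: arrange video frames along a horizontal layout path,
# anchoring each frame at the moving object's position within it.

def anchor_point(frame):
    """Anchor is the object's position within the frame (a toy stand-in for
    real object-motion detection)."""
    return frame["object_pos"]

def arrange(frames, path_y=100, spacing=120):
    placements = []
    for i, frame in enumerate(frames):
        ax, ay = anchor_point(frame)
        # Offset each frame so its anchor lands exactly on the layout path.
        placements.append({"frame": frame["id"],
                           "x": i * spacing - ax,
                           "y": path_y - ay})
    return placements

frames = [{"id": 0, "object_pos": (10, 20)},
          {"id": 1, "object_pos": (30, 40)}]
print(arrange(frames))
```

With this offsetting, the object appears to travel smoothly along the path even though the frames themselves sit at different heights.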
-
Patent number: 10413803
Abstract: A method of displaying a video sequence of a scene captured using a video capture device, the video sequence having a limited field of view of the scene. A plurality of objects positioned in the scene outside limits of the field of view of the captured video sequence is determined. A representation of at least one of the objects is generated, a characteristic of the generated representation being determined from an object impact measure defining, at least in part, a confidence that the at least one object will enter the field of view. The generated object representation is displayed together with, and proximate to, a display of the captured video sequence.
Type: Grant
Filed: December 20, 2016
Date of Patent: September 17, 2019
Assignee: Canon Kabushiki Kaisha
Inventors: Berty Jacques Alain Bhuruth, Andrew Peter Downing, Belinda Margaret Yee
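As a sketch, the impact measure can be read as a confidence score computed from an off-screen object's position and velocity, which then drives a characteristic (here, size) of its on-screen indicator. The 1-D field of view and the speed-over-distance score are simplifying assumptions for illustration.

```python
# Illustrative sketch: score the confidence that an off-screen object will
# enter the field of view, and size an edge indicator accordingly.

def impact_measure(obj, fov_left, fov_right):
    """Confidence in [0, 1] that the object will enter the horizontal FOV."""
    x, vx = obj["x"], obj["vx"]
    heading_in = (x < fov_left and vx > 0) or (x > fov_right and vx < 0)
    if not heading_in:
        return 0.0
    distance = min(abs(x - fov_left), abs(x - fov_right))
    return min(1.0, abs(vx) / (distance + 1e-6))

def indicator_size(obj, fov=(0.0, 100.0), base=10):
    """Indicator drawn proximate to the video, scaled by the impact measure."""
    return base * (1 + impact_measure(obj, *fov))

fast = {"x": -5.0, "vx": 20.0}   # close to the frame edge, approaching fast
slow = {"x": -50.0, "vx": 1.0}   # far away, drifting in slowly
print(indicator_size(fast), indicator_size(slow))
```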
-
Publication number: 20190259199
Abstract: A computer-implemented system and method of rendering an object in a virtual view. The method comprises determining a variation of an occlusion measure of the object over time, the occlusion measure being an evaluation of an occlusion of the virtual view by the object, the variation of the occlusion measure determined based on a trajectory of the object. The method also comprises determining a transition effect for the object based on the variation of the occlusion measure, the transition effect being a visual effect, and applying the determined transition effect to render the object, the determined transition effect being applied according to a position of the object as the object moves along the trajectory.
Type: Application
Filed: December 7, 2017
Publication date: August 22, 2019
Inventors: Belinda Margaret Yee, Berty Jacques Alain Bhuruth
-
Patent number: 10389935
Abstract: A computer-implemented method of configuring a virtual camera. A first and second object in a scene are detected, each object having at least one motion attribute. An interaction point in the scene is determined based on the motion attributes of the first and second objects. A shape envelope of the first and second objects is determined, the shape envelope including an area corresponding to the first and second objects at the determined interaction point. The virtual camera is configured based on the determined shape envelope to capture, in a field of view of the virtual camera, the first and second objects.
Type: Grant
Filed: December 13, 2016
Date of Patent: August 20, 2019
Assignee: Canon Kabushiki Kaisha
Inventors: Belinda Margaret Yee, Andrew James Dorrell
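A toy version of this pipeline: extrapolate both objects' motion to a predicted interaction point, wrap their predicted positions in a bounding envelope, and aim the camera at it. The linear extrapolation, fixed horizon, and box envelope are illustrative assumptions.

```python
# Illustrative sketch: predict where two moving objects will interact, build a
# bounding "shape envelope" around them there, and frame a camera on it.

def interaction_point(a, b, t=1.0):
    """Midpoint of the two objects' linearly predicted positions after time t."""
    ax, ay = a["pos"][0] + a["vel"][0] * t, a["pos"][1] + a["vel"][1] * t
    bx, by = b["pos"][0] + b["vel"][0] * t, b["pos"][1] + b["vel"][1] * t
    return ((ax + bx) / 2, (ay + by) / 2), (ax, ay), (bx, by)

def frame_interaction(a, b, margin=2.0):
    centre, pa, pb = interaction_point(a, b)
    # Envelope: axis-aligned box covering both predicted positions plus margin.
    xs, ys = [pa[0], pb[0]], [pa[1], pb[1]]
    envelope = (min(xs) - margin, min(ys) - margin,
                max(xs) + margin, max(ys) + margin)
    return {"look_at": centre, "envelope": envelope}

# Two objects on a collision course along the x-axis meet near (3, 0).
a = {"pos": (0.0, 0.0), "vel": (2.0, 0.0)}
b = {"pos": (6.0, 0.0), "vel": (-2.0, 0.0)}
print(frame_interaction(a, b)["look_at"])  # (3.0, 0.0)
```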
-
Patent number: 10255690
Abstract: A system and head-mounted device modify display of augmented reality content during an interaction between a person and a user of a mixed reality system. The method comprises detecting an interaction between a person and a user of a mixed reality system, the mixed reality system displaying the augmented reality content to the user, the interaction being detectable by the mixed reality system when a gaze interaction of the person is detected as directed towards the user and accordingly a sensor of the mixed reality system; determining an urgency of the interaction according to further interaction between the person and the user; determining that an element of the display of the augmented reality content is obscuring the interaction; selecting a transition effect for modifying display of the element from a plurality of transition effects according to the determined urgency of the interaction; and modifying the display of the element according to the selected transition effect.
Type: Grant
Filed: December 20, 2016
Date of Patent: April 9, 2019
Assignee: Canon Kabushiki Kaisha
Inventors: Berty Jacques Alain Bhuruth, Belinda Margaret Yee, Julie Rae Kowald
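The selection step can be sketched as an urgency-to-effect lookup: the more urgent the interaction, the more abruptly the obscuring AR element is dismissed. The urgency values, thresholds, and effect names below are illustrative assumptions.

```python
# Illustrative sketch: pick a transition effect for an AR element that is
# obscuring a real-world interaction, based on the interaction's urgency.

TRANSITIONS = [          # (minimum urgency, effect), most urgent first
    (0.8, "hide_immediately"),
    (0.4, "fast_fade"),
    (0.0, "slow_slide_aside"),
]

def select_transition(urgency):
    for threshold, effect in TRANSITIONS:
        if urgency >= threshold:
            return effect
    return "none"

def handle_interaction(gaze_at_user, speaking, element_obscures):
    """Gaze toward the user makes the interaction detectable; further
    interaction (here, speech) raises its urgency."""
    if not (gaze_at_user and element_obscures):
        return "none"
    urgency = 0.9 if speaking else 0.5
    return select_transition(urgency)

print(handle_interaction(gaze_at_user=True, speaking=True, element_obscures=True))
# hide_immediately
```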
-
Publication number: 20190083885
Abstract: A method of controlling a virtual camera comprising displaying, when a device is in a first orientation, a view of a scene on the device. A location in the scene is determined while the device is in the first orientation, based on user input detected on the device. Controls for the virtual camera on the device are configured to control the virtual camera in response to detecting that an orientation of the device has changed from the first orientation to a second orientation, the configuration of the controls being based on the determined location. Commands for the configured controls are received to control the virtual camera.
Type: Application
Filed: September 19, 2017
Publication date: March 21, 2019
Inventor: Belinda Margaret Yee
-
Publication number: 20190080495
Abstract: A method of navigating a virtual camera using a navigation device is disclosed. An initial position of the virtual camera capturing at least a portion of a scene is received. The initial position of the virtual camera is associated with an initial camera movement characteristic. A candidate rail is generated based on the initial position of the virtual camera and a change in the scene. The generated candidate rail is associated with a navigation constraint defining movement of the virtual camera along the generated candidate rail. The virtual camera is navigated along the generated candidate rail using the navigation device based on the initial camera movement characteristic and a characteristic defined by the navigation constraint. The constrained navigation is enabled using the navigation device.
Type: Application
Filed: September 13, 2017
Publication date: March 14, 2019
Inventors: Nikos James Andronikos, Belinda Margaret Yee, Berty Jacques Alain Bhuruth
-
Publication number: 20180176502
Abstract: A method of displaying a video sequence of a scene captured using a video capture device, the video sequence having a limited field of view of the scene. A plurality of objects positioned in the scene outside limits of the field of view of the captured video sequence is determined. A representation of at least one of the objects is generated, a characteristic of the generated representation being determined from an object impact measure defining, at least in part, a confidence that the at least one object will enter the field of view. The generated object representation is displayed together with, and proximate to, a display of the captured video sequence.
Type: Application
Filed: December 20, 2016
Publication date: June 21, 2018
Inventors: Berty Jacques Alain Bhuruth, Andrew Peter Downing, Belinda Margaret Yee
-
Publication number: 20180167553
Abstract: A computer-implemented method of configuring a virtual camera. A first and second object in a scene are detected, each object having at least one motion attribute. An interaction point in the scene is determined based on the motion attributes of the first and second objects. A shape envelope of the first and second objects is determined, the shape envelope including an area corresponding to the first and second objects at the determined interaction point. The virtual camera is configured based on the determined shape envelope to capture, in a field of view of the virtual camera, the first and second objects.
Type: Application
Filed: December 13, 2016
Publication date: June 14, 2018
Inventors: Belinda Margaret Yee, Andrew James Dorrell
-
Patent number: 9972122
Abstract: A computer-implemented system and method of rendering an object in a virtual view. The method comprises determining a variation of an occlusion measure of the object over time, the occlusion measure being an evaluation of an occlusion of the virtual view by the object, the variation of the occlusion measure determined based on a trajectory of the object. The method also comprises determining a transition effect for the object based on the variation of the occlusion measure, the transition effect being a visual effect, and applying the determined transition effect to render the object, the determined transition effect being applied according to a position of the object as the object moves along the trajectory.
Type: Grant
Filed: December 20, 2016
Date of Patent: May 15, 2018
Assignee: Canon Kabushiki Kaisha
Inventors: Belinda Margaret Yee, Berty Jacques Alain Bhuruth
-
Publication number: 20180077345
Abstract: A computer-implemented method and system of selecting a camera angle is described. The method comprises determining a visual fixation point of a viewer of a scene using eye gaze data from an eye gaze tracking device; detecting, from the eye gaze data, one or more saccades from the visual fixation point of the viewer, the one or more saccades indicating one or more regions of future interest to the viewer; selecting, based on the detected one or more saccades, a region of the scene; and selecting a camera angle of a camera, the camera capturing video data of the selected region using the selected angle.
Type: Application
Filed: September 12, 2016
Publication date: March 15, 2018
Inventor: Belinda Margaret Yee
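The saccade-detection step can be illustrated with a toy gaze model: samples far from the fixation point are treated as saccade landing points, bucketed into regions, and the most frequently revisited bucket becomes the region of future interest. The distance threshold, bucketing, and sample format are illustrative assumptions, not the patent's method.

```python
# Illustrative sketch: pick a region of future interest from gaze samples that
# jump away from the viewer's fixation point (saccades).
from collections import Counter

def select_region(gaze_samples, fixation, min_jump=50.0):
    """Bucket saccade landing points to a coarse grid; the most frequent
    bucket is the region of future interest (None if no saccades occur)."""
    targets = []
    for x, y in gaze_samples:
        dist = ((x - fixation[0]) ** 2 + (y - fixation[1]) ** 2) ** 0.5
        if dist > min_jump:  # sample is a saccade away from the fixation point
            targets.append((round(x, -2), round(y, -2)))
    return Counter(targets).most_common(1)[0][0] if targets else None

# Fixation near (0, 0) with repeated glances toward (400, 400): that region
# is where the viewer's interest is heading, so a camera covering it is chosen.
samples = [(0, 0), (400, 410), (2, 1), (395, 400), (1, 3)]
print(select_region(samples, fixation=(0, 0)))  # (400, 400)
```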
-
Patent number: 9918057
Abstract: A method, apparatus and system of projecting text characters onto a textured surface are described. The method comprises determining, from a captured image of the textured surface, a measure of the texture on the surface for a region of the textured surface over which the text characters are to be projected; selecting, based on a function of the determined measure, a glyph set, each glyph in the glyph set having visually contrasting inner and outer portions, the outer portion being sized proportionally to the inner portion according to the determined measure; and projecting the text characters onto the region of the textured surface using the selected glyph set.
Type: Grant
Filed: September 29, 2016
Date of Patent: March 13, 2018
Assignee: Canon Kabushiki Kaisha
Inventors: David Robert James Monaghan, Belinda Margaret Yee, Rajanish Calisa
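A sketch of the selection function: compute a texture measure for the target region (here, simple pixel-intensity variance) and map busier textures to glyph sets with proportionally heavier contrasting outlines. The variance measure, thresholds, and outline ratios are illustrative assumptions.

```python
# Illustrative sketch: pick a glyph outline weight from a texture measure of
# the projection surface, so projected text stays legible on busy textures.

def texture_measure(region):
    """Variance of pixel intensities in the target region (0-255 grayscale)."""
    mean = sum(region) / len(region)
    return sum((p - mean) ** 2 for p in region) / len(region)

def select_glyph_set(measure):
    """Size the outer (contrasting) portion relative to the inner stroke in
    proportion to how busy the texture is."""
    if measure < 100:
        return {"outline_ratio": 1.1}   # near-uniform surface: thin outline
    if measure < 1000:
        return {"outline_ratio": 1.5}
    return {"outline_ratio": 2.0}       # busy texture: thick contrasting outline

smooth = [128, 130, 127, 129]       # e.g. a plain wall
busy = [0, 255, 10, 240]            # e.g. high-contrast patterned fabric
print(select_glyph_set(texture_measure(smooth)))  # thin outline
print(select_glyph_set(texture_measure(busy)))    # thick outline
```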
-
Patent number: 9721391
Abstract: A method of displaying augmented reality content on a physical surface is disclosed. A surface complexity measure is determined for the physical surface from a captured image of the physical surface. A content complexity measure is determined for the augmented reality content to be applied to the physical surface. The content complexity measure represents an amount of fine detail in the augmented reality content. The method determines if the amount of fine detail in the augmented reality content is to be modified, based on a function of the surface complexity measure and said content complexity measure. A display attribute of the augmented reality content is adjusted to modify the fine detail in the augmented reality content. The modified augmented reality content is displayed on the physical surface.
Type: Grant
Filed: May 11, 2015
Date of Patent: August 1, 2017
Assignee: Canon Kabushiki Kaisha
Inventors: David Robert James Monaghan, Belinda Margaret Yee, Oscar Alejandro De Lellis
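The decision step can be sketched as a comparison of the two measures: when the surface is at least as "busy" as the content's fine detail, a display attribute is adjusted to coarsen the content. The ratio test, threshold, and `detail_level` attribute are illustrative assumptions.

```python
# Illustrative sketch: decide whether AR content's fine detail should be
# simplified given the complexity of the surface it will be displayed on.

def should_simplify(surface_complexity, content_complexity, threshold=1.0):
    """Simplify when the surface is at least as busy as the content's detail."""
    return surface_complexity / max(content_complexity, 1e-6) >= threshold

def render_content(content, surface_complexity):
    if should_simplify(surface_complexity, content["complexity"]):
        # Adjust a display attribute to suppress fine detail.
        return {**content, "detail_level": "coarse"}
    return {**content, "detail_level": "full"}

diagram = {"name": "wiring diagram", "complexity": 0.4}
print(render_content(diagram, surface_complexity=0.9)["detail_level"])  # coarse
print(render_content(diagram, surface_complexity=0.1)["detail_level"])  # full
```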
-
Publication number: 20170178356
Abstract: A system and computer-implemented method of modifying display of augmented reality content are disclosed. The method comprises detecting an interaction between a person and a user of a mixed reality system, the mixed reality system displaying the augmented reality content to the user, the interaction being detectable by the mixed reality system when a gaze interaction of the person is detected as directed towards the user and accordingly a sensor of the mixed reality system; determining an urgency of the interaction according to further interaction between the person and the user; determining that an element of the display of the augmented reality content is obscuring the interaction; selecting a transition effect for modifying display of the element from a plurality of transition effects according to the determined urgency of the interaction; and modifying the display of the element according to the selected transition effect.
Type: Application
Filed: December 20, 2016
Publication date: June 22, 2017
Inventors: Berty Jacques Alain Bhuruth, Belinda Margaret Yee, Julie Rae Kowald
-
Publication number: 20170171499
Abstract: A method and system for spatially arranging a plurality of video frames for display on a layout region is provided. The plurality of video frames are selected from a video sequence based on a determination of motion of an object within the video sequence. An image layout path is determined for the selected video frames. An anchor point is determined for each selected video frame based on a determination of motion of the object depicted in the video frame, each said anchor point locating a selected video frame with respect to the layout path. The selected plurality of video frames are spatially arranged on the layout region relative to the determined image layout path and in accordance with the determined anchor points.
Type: Application
Filed: November 23, 2016
Publication date: June 15, 2017
Inventors: Veena Murthy Srinivasa Dodballapur, Belinda Margaret Yee, Ian Robert Boreham
-
Patent number: 9633479
Abstract: A method of displaying virtual content on an augmented reality device (101) is disclosed. The virtual content is associated with a scene. An image of a scene captured using the augmented reality device (101) is received. A viewing time of the scene is determined, according to a relative motion between the augmented reality device and the scene. Virtual content is selected, from a predetermined range of virtual content, based on the determined viewing time. The virtual content is displayed on the augmented reality device (101) together with the image of the scene.
Type: Grant
Filed: December 22, 2014
Date of Patent: April 25, 2017
Assignee: Canon Kabushiki Kaisha
Inventors: Matthew John Grasso, Belinda Margaret Yee, David Robert James Monaghan, Oscar Alejandro De Lellis, Rajanish Calisa
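As a final sketch, the viewing-time idea can be modeled as: estimate how long the scene will stay in frame from the relative motion, then pick content from a predetermined range that fits that dwell time. The time-to-edge estimate and the content tiers are illustrative assumptions.

```python
# Illustrative sketch: select virtual content sized to the estimated viewing
# time implied by the relative motion between device and scene.

CONTENT_BY_TIME = [          # (minimum viewing time in seconds, content)
    (10.0, "full article overlay"),
    (3.0, "one-line summary"),
    (0.0, "icon only"),
]

def estimate_viewing_time(distance_to_edge, relative_speed):
    """Seconds until the scene leaves the frame at the current motion."""
    if relative_speed <= 0:
        return float("inf")  # device is steady: effectively unlimited time
    return distance_to_edge / relative_speed

def select_content(viewing_time):
    for min_time, content in CONTENT_BY_TIME:
        if viewing_time >= min_time:
            return content
    return "icon only"

# A slow pan leaving about 4 s of viewing time gets the mid-length content.
t = estimate_viewing_time(distance_to_edge=4.0, relative_speed=1.0)
print(select_content(t))  # one-line summary
```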