Information processing apparatus and control method thereof
An information processing apparatus for inputting minutes data comprises an input unit which inputs the minutes data, a designation unit which designates, among a plurality of object images displayed on a display screen, at least one object image to be stored together with the minutes data, and a determination unit which determines, based on the minutes data input by the input unit, the plurality of object images to be displayed on the display screen.
This application is a division of application Ser. No. 13/188,041 filed Jul. 21, 2011.
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates to a conference support technique.
Description of the Related Art
Today, when a conference or meeting is held in companies, schools, and research institutions, it is common to summarize the contents of the conference or meeting using a whiteboard. Some whiteboards electronically display and store written contents using a projector and digitizer. These whiteboards are called digital whiteboards or interactive whiteboards, and have become widespread because of their great convenience. In this specification, these electronic whiteboards are simply referred to as whiteboards hereinafter.
To allow ready use of whiteboards, many of them have a function of pasting a half-finished character/graphic object, which is called a template or stencil. Note that a template is a relatively large character/graphic object, and is used to make the whole document conform to a predetermined layout. On the contrary, a stencil is a relatively small character/graphic object, and is used when the user creates an easy-to-understand document by arranging and connecting a plurality of stencils.
Furthermore, a conference support system for summarizing the minutes of a conference and managing an overall conference flow is becoming common. This system and the above whiteboard are often used in combination with each other. Japanese Patent Laid-Open No. 2006-099414 discloses a technique for automatically preparing graphic objects corresponding to conference participants on a whiteboard.
The technique described in Japanese Patent Laid-Open No. 2006-099414, however, does not automatically prepare, from conference information such as conference minutes, character/graphic objects derived from the contents of a conference. The conference participants have to create by handwriting character/graphic objects reflecting the contents of the conference on the whiteboard, which is cumbersome.
Moreover, even if the contents of a character/graphic object are corrected, the conference information such as conference minutes does not reflect the correction. It is, therefore, troublesome for the user to match the contents represented by the conference information with those on the whiteboard.
SUMMARY OF THE INVENTION
The present invention provides a technique for efficiently creating a document in a conference.
According to a first aspect of the present invention, an information processing apparatus for inputting minutes data, comprising: an input unit which inputs the minutes data; a designation unit which designates, among a plurality of object images displayed on a display screen, at least one object image to be stored together with the minutes data; and a determination unit which determines, based on the minutes data input by the input unit, the plurality of object images to be displayed on the display screen.
According to a second aspect of the present invention, a control method for an information processing apparatus for inputting minutes data, comprising: an input step of inputting the minutes data; a designation step of designating, among a plurality of object images displayed on a display screen, at least one object image to be stored together with the minutes data; and a determination step of determining, based on the minutes data input in the input step, the plurality of object images to be displayed on the display screen.
According to a third aspect of the present invention, a non-transitory storage medium storing a computer-readable program executable by a computer for inputting minutes data, the program comprising an input step of inputting the minutes data, a designation step of designating, among a plurality of object images displayed on a display screen, at least one object image to be stored together with the minutes data, and a determination step of determining, based on the minutes data input in the input step, the plurality of object images to be displayed on the display screen.
According to a fourth aspect of the present invention, an information processing apparatus for inputting minutes data, comprising: a detection unit which detects that minutes data displayed on a first screen of a plurality of screens which display the minutes data have been changed; and a changing unit which changes minutes data displayed on a second screen of the plurality of screens when the detection unit detects that the minutes data displayed on the first screen have been changed.
According to a fifth aspect of the present invention, a control method for an information processing apparatus for inputting minutes data, comprising: a detection step of detecting that minutes data displayed on a first screen of a plurality of screens which display the minutes data have been changed; and a changing step of changing minutes data displayed on a second screen of the plurality of screens when the minutes data displayed on the first screen are detected to have been changed in the detection step.
According to a sixth aspect of the present invention, a non-transitory storage medium storing a computer-readable program executable by a computer for inputting minutes data, the program comprising a detection step of detecting that minutes data displayed on a first screen of a plurality of screens which display the minutes data have been changed, and a changing step of changing minutes data displayed on a second screen of the plurality of screens when the minutes data displayed on the first screen are detected to have been changed in the detection step.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
Embodiments of the present invention will be described below with reference to the accompanying drawings. The embodiments to be described below are merely examples of how the present invention may be practically implemented, and are concrete examples of the arrangement set forth in the following claims.
The following embodiments assume use in a conference. In addition to a conference, the present invention is applicable to any event using a whiteboard, such as a class in a school, a training course, or a research presentation.
First Embodiment
A configuration example of a conference support apparatus 100 serving as an information processing apparatus according to this embodiment will be explained first using a block diagram in
A CPU 102 controls the overall operation of the conference support apparatus 100 using computer programs and data stored in a RAM 103, and also executes each of the processes described later as being performed by the conference support apparatus 100.
The RAM 103 has an area for temporarily storing a computer program loaded from a program storage area 104 and data (conference information 110, character/graphic data 120, and the like) loaded from a data storage area 105. The RAM 103 also has a work area used by the CPU 102 to execute various processes. That is, the RAM 103 can provide various areas as needed.
The program storage area 104 stores a computer program which causes the CPU 102 to execute each of the processes described later as being performed by the conference support apparatus 100. Under the control of the CPU 102, the computer program stored in the program storage area 104 is loaded into the RAM 103 as needed.
The data storage area 105 stores the conference information 110 and the character/graphic data 120, as described above. The conference information 110 is used to manage elements representing the contents of a conference in a hierarchical structure, as shown in, for example,
The character/graphic data 120 include template data and stencil data, and conform to a data format such as SVG (Scalable Vector Graphics). Under the control of the CPU 102, the above-mentioned various data stored in the data storage area 105 are loaded into the RAM 103, as needed.
Each of the program storage area 104 and the data storage area 105 is implemented by, for example, a large-capacity information storage device such as a hard disk drive. Referring to
The user operates a UI device 101 to input various instructions to the CPU 102. As long as this operation is possible, any apparatus may be used as the UI device 101. For example, a group of various buttons, or a pointing device such as a mouse or a digitizer, is applicable to the UI device 101.
A UI display unit 106 includes a CRT or liquid crystal screen, and can display a processing result of the CPU 102 using images and characters. A touch panel display device may be formed by integrating the UI device 101 with the UI display unit 106.
A display example of a conference support UI according to this embodiment which includes a screen (conference minutes screen) for displaying the minutes of a conference indicated by conference information and a whiteboard screen corresponding to this conference minutes screen will be described next with reference to
As shown in
A configuration example of the whiteboard screen 300 will be described next with reference to
As shown in
Furthermore, an action item owner 338 and action item due date 339 are displayed as graphic objects on the whiteboard screen 300. Both the owner 338 and due date 339 are read out from the conference information 110, and the graphic objects corresponding to the readout information are placed as shown in
The user can use the UI device 101 to change or edit any of the character objects and graphic objects as needed. Assume, for example, that the user operated the UI device 101 and selected a graphic object selection mode from the mode switching menu 320. In this case, the CPU 102 permits processing associated with graphic objects, such as processing of moving a graphic object using the UI device 101, and processing of changing the graphic object of the owner 338 to that of another owner.
Processing executed by the CPU 102 when the user changes the conference information 110 on the conference minutes screen 200 will be described with reference to
In step S101, the CPU 102 acquires the conference information 110 into the RAM 103. In step S102, the CPU 102 determines whether the conference information 110 has been updated. Although there are various determination methods, the determination method is not particularly limited as long as the updated portion of the conference information 110 can be recognized. As a result of the determination process, if the information has been updated, the process advances to step S103; otherwise, the process ends.
In step S103, the CPU 102 specifies an object (a graphic object and/or a character object) corresponding to the updated portion (that is, an updated element in the conference information 110).
In step S104, the CPU 102 updates the object specified in step S103 with an object corresponding to the updated element. Assume, for example, that the user updates the owner of the action item 206 in
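The flow of steps S101 through S104 can be sketched as follows. This is an illustrative sketch only; the function and element identifiers are hypothetical and not part of the disclosed apparatus, and the conference information is modeled as a flat dictionary for brevity.

```python
# Hypothetical sketch of steps S101-S104: when an element of the
# conference information is updated, the corresponding whiteboard
# object is located and updated to match.

def sync_whiteboard(conference_info, whiteboard, previous_info):
    """Propagate updated conference-information elements to whiteboard objects."""
    for element_id, value in conference_info.items():      # S101: acquire info
        if previous_info.get(element_id) == value:         # S102: updated?
            continue
        obj = whiteboard.get(element_id)                   # S103: specify object
        if obj is not None:
            obj["label"] = value                           # S104: update object
    return whiteboard

# Example: the owner of an action item is changed in the minutes.
previous = {"action_item_206.owner": "Suzuki"}
current = {"action_item_206.owner": "Tanaka"}
board = {"action_item_206.owner": {"kind": "graphic", "label": "Suzuki"}}
sync_whiteboard(current, board, previous)
print(board["action_item_206.owner"]["label"])  # prints "Tanaka"
```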
Processing executed by the CPU 102 when the user changes a character object or graphic object on the whiteboard screen 300 will be described with reference to
In step S201, the CPU 102 acquires character objects and graphic objects contained in a template (the template 330 in
In step S203, the CPU 102 specifies an element within the conference information 110 corresponding to the changed object. In step S204, the CPU 102 changes the specified element to contents corresponding to the changed object. According to the changed conference information 110, the display of the conference minutes screen 200 is also changed, as a matter of course.
Assume, for example, that the user used the UI device 101 to select the graphic object of the owner 338 in
In this embodiment, if one of an element of interest and an object corresponding to the element of interest is detected to have been updated, the other is also updated according to the update operation. As long as a UI with another configuration can implement this function, it may be applied instead of the conference support UI.
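The reverse direction (steps S203 and S204) can be sketched similarly. Again, the identifiers are hypothetical and the conference information is modeled as a flat dictionary for illustration.

```python
# Hypothetical sketch of steps S203-S204: a changed whiteboard object is
# mapped back to its element in the conference information, which is
# then rewritten to match the object's new contents.

def sync_conference_info(changed_objects, conference_info):
    """Reflect changed whiteboard objects into conference-information elements."""
    for element_id, obj in changed_objects.items():
        if element_id in conference_info:                # S203: specify element
            conference_info[element_id] = obj["label"]   # S204: change contents
    return conference_info

# Example: the owner graphic on the whiteboard was changed to another owner.
info = {"action_item.owner": "Suzuki", "action_item.due": "Dec 1"}
changed = {"action_item.owner": {"kind": "graphic", "label": "Sato"}}
sync_conference_info(changed, info)
print(info["action_item.owner"])  # prints "Sato"
```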
If the user uses the UI device 101 to add an arrow pointing to another action item, like a graphic object 350, the conference information 110 reflects information indicating that those action items are related to each other. Since the graphic object 350 has a single-headed arrow, a dependency relation is stored indicating that the December 1 action item needs to be completed after the November 22 action item is completed. On the conference minutes screen 200 of
If the user uses the UI device 101 to circle an action item like a graphic object 351, the conference information 110 reflects information indicating that this action item is important. On the conference minutes screen 200 of
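How such annotations might be reflected in the conference information can be sketched as follows. The data layout and names are illustrative assumptions, not the patent's own representation.

```python
# Illustrative sketch: an arrow drawn between action items records a
# dependency relation, and circling an action item marks it important.

def add_arrow(conference_info, from_item, to_item):
    """Single-headed arrow: to_item must wait until from_item is completed."""
    conference_info.setdefault("dependencies", []).append((from_item, to_item))

def circle_item(conference_info, item):
    """Circling an action item flags it as important."""
    conference_info.setdefault("important", set()).add(item)

annotated = {}
add_arrow(annotated, "item_nov22", "item_dec01")  # Dec 1 item depends on Nov 22 item
circle_item(annotated, "item_dec01")
print(annotated["dependencies"])  # prints [('item_nov22', 'item_dec01')]
```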
As shown in
In the example of
Processing executed by a CPU 102 when the user changes conference information 110 on a conference minutes screen 200 will be described with reference to
In step S301, the CPU 102 acquires the conference information 110 into the RAM 103. In step S302, the CPU 102 determines whether the conference information 110 has been updated. Steps S301 and S302 are performed similarly to steps S101 and S102, respectively. As a result of the determination process, if the information has been updated, the process advances to step S303; otherwise, the process ends.
In step S303, the CPU 102 estimates a conference type based on elements contained in the conference information 110, for example, appearing words such as words in conference minutes and the titles of conference participants. This estimation operation is done as follows. For each conference type, a dictionary of the appearing words which should be contained in the conference information 110 of that type is registered in a data storage area 105 in advance. Using these dictionaries, the CPU 102 estimates the conference type corresponding to the appearing words contained in the conference information 110.
In step S304, among stencil list data registered in advance in the data storage area 105 for each conference type, the CPU 102 reads out, into the RAM 103, the stencil list data corresponding to the type estimated in step S303. The CPU 102 arranges, from the top, the icons of stencils in the stencil list indicated by the readout stencil list data in descending order of frequency of use on the left side of the whiteboard screen 300. The display position and format of the stencil list are not particularly limited.
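Steps S303 and S304 can be sketched as follows, under illustrative assumptions: the per-type dictionaries are word sets, the estimated type is the one whose dictionary overlaps most with the words appearing in the minutes, and stencils are then sorted by a pre-registered frequency of use. The conference types, words, and stencil names here are hypothetical.

```python
# Hypothetical sketch of steps S303-S304: estimate the conference type
# from appearing words, then order that type's stencils by frequency of use.

TYPE_DICTIONARIES = {
    "design_review": {"specification", "drawing", "review"},
    "sales": {"customer", "quota", "order"},
}

STENCIL_LISTS = {  # (stencil name, frequency of use)
    "design_review": [("arrow", 12), ("checkbox", 30), ("note", 5)],
    "sales": [("graph", 20), ("table", 8)],
}

def estimate_type(minutes_words):
    """S303: pick the type whose dictionary shares the most appearing words."""
    words = set(minutes_words)
    return max(TYPE_DICTIONARIES, key=lambda t: len(TYPE_DICTIONARIES[t] & words))

def stencil_list(conference_type):
    """S304: stencil names in descending order of frequency of use."""
    stencils = sorted(STENCIL_LISTS[conference_type], key=lambda s: -s[1])
    return [name for name, _freq in stencils]

ctype = estimate_type(["review", "drawing", "deadline"])
print(ctype)                # prints "design_review"
print(stencil_list(ctype))  # prints ['checkbox', 'arrow', 'note']
```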
In addition to arranging stencils prepared in advance, it is also possible to place stencils, some of which have been changed in accordance with the contents of the conference, like a stencil list 410, as shown in
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2010-192710 filed Aug. 30, 2010, which is hereby incorporated by reference herein in its entirety.
Claims
1. An information processing apparatus, comprising:
- one or more hardware processors; and
- one or more memories which store instructions executable by the one or more hardware processors to cause the information processing apparatus to perform at least:
- acquiring minutes data;
- determining a meeting type based on the acquired minutes data;
- selecting, based on the determined meeting type, at least one object image to be displayed on a display screen among a plurality of object images;
- displaying, on a first display area in the display screen, the at least one selected object image;
- displaying, on a second display area in the display screen, an object image which is designated by a user operation among the at least one selected object image which is displayed on the first display area in the display screen; and
- storing the acquired minutes data with object data corresponding to the object image which is displayed on the second display area in the display screen.
2. The apparatus according to claim 1, wherein the instructions further cause the information processing apparatus to perform:
- determining, based on a designation frequency of the at least one object image which is selected based on the minutes data, a display position of the selected object image on the first display area in the display screen.
3. The apparatus according to claim 1, wherein the instructions further cause the information processing apparatus to perform:
- changing, based on further acquiring of minutes data, the at least one object image which has been displayed on the first display area in the display screen.
4. The apparatus according to claim 1, wherein the instructions further cause the information processing apparatus to perform:
- storing character information related to minutes data and an object image in correspondence with each other, and
- wherein the object image corresponding to the character information included in the acquired minutes data is selected as the at least one object image to be displayed on the display screen.
5. The apparatus according to claim 1, wherein, in a case where a place name is included in the acquired minutes data, at least one map object image corresponding to the place name is selected among the plurality of object images, and is displayed on the first display area in the display screen.
6. The apparatus according to claim 1, wherein the instructions further cause the information processing apparatus to perform:
- receiving user operation information related to a user operation to the display screen for designating the object image.
7. The apparatus according to claim 1, wherein information according to the acquired minutes data is displayed with the object image designated by the user operation on the second display area.
8. The apparatus according to claim 1, wherein the plurality of object images are stored in the information processing apparatus before the acquiring of the minutes data.
9. The apparatus according to claim 1, wherein the object image designated by the user operation is displayed at a position in the second display area designated by the user operation.
10. The apparatus according to claim 1, wherein the meeting type is determined based on information of conference participants included in the acquired minutes data.
11. An information processing method, comprising:
- acquiring minutes data;
- determining a meeting type based on the acquired minutes data;
- selecting, based on the determined meeting type, at least one object image to be displayed on a display screen among a plurality of object images;
- displaying, on a first display area in the display screen, the at least one selected object image;
- displaying, on a second display area in the display screen, an object image which is designated by a user operation among the at least one selected object image which is displayed on the first display area in the display screen; and
- storing the acquired minutes data with object data corresponding to the object image which is displayed on the second display area in the display screen.
12. The method according to claim 11, further comprising:
- determining, based on a designation frequency of the at least one object image which is selected based on the acquired minutes data, a display position of the selected object image on the first display area in the display screen.
13. The method according to claim 11, further comprising:
- changing, based on further acquiring of minutes data, the at least one object image which has been displayed on the first display area in the display screen.
14. A non-transitory computer-readable storage medium storing a computer program for causing a computer to perform the functions of:
- acquiring minutes data;
- determining a meeting type based on the acquired minutes data;
- selecting, based on the determined meeting type, at least one object image to be displayed on a display screen among a plurality of object images;
- displaying, on a first display area in the display screen, the at least one selected object image;
- displaying, on a second display area in the display screen, an object image which is designated by a user operation among the at least one selected object image which is displayed on the first display area in the display screen; and
- storing the acquired minutes data with object data corresponding to the object image which is displayed on the second display area in the display screen.
15. The medium according to claim 14, wherein the computer program further causes the computer to perform the function of:
- determining, based on a designation frequency of the at least one object image which is selected based on the minutes data, a display position of the selected object image on the first display area in the display screen.
16. The medium according to claim 14, wherein the computer program further causes the computer to perform the function of:
- changing, based on further acquiring of minutes data, the at least one object image which has been displayed on the first display area in the display screen.
5572728 | November 5, 1996 | Tada |
7447608 | November 4, 2008 | Poston |
20010034738 | October 25, 2001 | Cantwell et al. |
20030076353 | April 24, 2003 | Blackstock |
20060047816 | March 2, 2006 | Lawton |
20060200372 | September 7, 2006 | O'Cull |
20060218477 | September 28, 2006 | Shibata |
20070112926 | May 17, 2007 | Brett |
20070188654 | August 16, 2007 | Fuse |
20080091656 | April 17, 2008 | Charnock |
20090094532 | April 9, 2009 | Lyle et al. |
20090158173 | June 18, 2009 | Palahnuk |
1344397 | April 2002 | CN |
1619565 | May 2005 | CN |
1928859 | March 2007 | CN |
H03-177975 | August 1991 | JP |
H06-215095 | August 1994 | JP |
H08-163524 | June 1996 | JP |
2004-094833 | March 2004 | JP |
2005-278786 | October 2005 | JP |
2006-099414 | April 2006 | JP |
2008-059010 | March 2008 | JP |
2009-217653 | September 2009 | JP |
- Mar. 27, 2015 Japanese Official Action in Japanese Patent Appln. No. 2014-143628.
Type: Grant
Filed: Jan 27, 2016
Date of Patent: Jul 16, 2019
Patent Publication Number: 20160139749
Assignee: Canon Kabushiki Kaisha (Tokyo)
Inventor: Toshimizu Yamane (Tokyo)
Primary Examiner: Anil K Bhargava
Application Number: 15/007,751
International Classification: G06F 3/0482 (20130101); G06T 11/20 (20060101); G06Q 10/10 (20120101); G06F 3/0484 (20130101);