INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND A NON-TRANSITORY STORAGE MEDIUM
The present invention provides an apparatus comprising: a display control unit configured to cause a display screen to display a plurality of objects which include an object to be processed according to a moving operation; a selection unit configured to select at least one of the plurality of objects which are displayed, based on a position on the display screen; a determining unit configured to, in a case where an object different from the object to be processed is selected and the moving operation for the selected object is executed, determine whether to execute the predetermined processing to the object to be processed according to the moving operation, based on the selected object; and a processing unit configured to execute the predetermined processing for the object according to the moving operation, in a case where the determining unit determines that the predetermined processing is executed to the object.
1. Field of the Invention
The present invention generally relates to information processing and, more particularly, to an apparatus, a method and a non-transitory storage medium storing a program for executing a processing according to a user instruction.
2. Description of the Related Art
An information processing apparatus which receives a user instruction through a coordinate input device (such as a touch panel) and executes processing according to the instruction is known. In such an information processing apparatus, the apparatus judges an operation, such as a touch, a tap (touch and release), a drag (trace), or a flick, that a user performs on a touch panel, and executes processing according to the operation. For example, it is known that, when a user touches an object displayed on a display with a finger, the object is selected, and the user can move the object by moving the finger (drag).
In a case where a user's operation may correspond to several kinds of operations, there is a technique which determines the processing to be executed according to the user's operation. In Japanese Patent Application No. JP 2006-343856, even if a user touches an object during manual input, the information processing apparatus gives priority to the manual input processing as long as the length of the trace of a drag operation is larger than a predetermined value. In the above-mentioned technique, in a case where the trace of a drag operation starting from an object that a user touches is long, the processing is judged to be manual input. Therefore, in a case where a user executes a drag operation by mistake when selecting an object, the processing according to the drag operation (manual input) is executed against the user's intention.
SUMMARY OF THE INVENTION
The present disclosure provides an information processing apparatus, an information processing method and a storage medium which appropriately execute processing according to a moving operation, depending on the object selected by the user's moving operation.
According to an aspect of the present disclosure, an information processing apparatus includes a display control unit configured to cause a display screen to display a plurality of objects which include an object to be processed according to a moving operation by a user; a selection unit configured to select at least one of the plurality of objects which are displayed by the display control unit, based on a position on the display screen which is instructed by a user; a determining unit configured to, in a case where an object different from the object to be processed is selected by the selection unit and the moving operation for the selected object is executed, determine whether to execute the predetermined processing to the object to be processed according to the moving operation, based on the selected object; and a processing unit configured to execute the predetermined processing for the object according to the moving operation, in a case where the determining unit determines that the predetermined processing is executed to the object.
Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Various exemplary embodiments, features, and aspects of the disclosure will be described in detail below with reference to the drawings. The relative positions of constituent elements and the display screens set forth in the exemplary embodiments are merely examples and are not intended to limit the scope of the present disclosure.
CPU 102 controls the whole of the image processing apparatus 101. RAM 103 is a memory that stores data such as program control variables and provides a buffer area for temporarily storing data for various kinds of work. ROM 104 is a memory that stores a control program and an operating system (OS) program executed by CPU 102. CPU 102 executes various kinds of processing by executing the control program stored in ROM 104, using RAM 103 as a work memory.
Display portion 105 is comprised of a display output device such as a liquid-crystal display, and displays various kinds of information on a screen according to instructions from CPU 102. The operation portion 106 includes keys such as a power key and a stop key, and accepts instructions from a user. Card interface 108 reads image files and document files stored in the attached memory card 109, and also writes data to and deletes data from memory card 109.
Touch panel 107 is able to detect the coordinate of a pressed position. The touch panel is, for example, a capacitance touch panel. The touch panel 107 periodically detects whether or not it is touched by a user. When the touch panel 107 is touched by a user, coordinate information representing the touched coordinate on the touch panel 107 is input to CPU 102.
The CPU 102 identifies the position touched by the user from the input coordinate information. In a case where coordinate information continues to be input from the touch panel 107 without the touch position moving, CPU 102 can judge that a long touch operation is executed by the user. In a case where the touch position is moving, CPU 102 can judge that a drag operation is executed by the user. At that time, CPU 102 can identify the direction, distance and track of the drag operation from the touch positions. Furthermore, in a case where the input of the touch coordinate information from the touch panel 107 stops, CPU 102 can judge that the user has separated the finger or touch pen from the touch panel 107 (release).
The specific information of the object which exists at the point of origin when the user touches the touch panel 202 is stored as the object information in a selected state 305. The specific information of the object arranged at the coordinate of the current touch position on the touch panel is stored as the object information 306. Information necessary to change the display data of the display portion 105 based on an operation by the user is stored as the flag information 307.
In the present embodiment, the moving object 303 is an image object, and the moving instruction object 304 is an image changing object. The moving object and the moving instruction object are selected from among the objects to be displayed, and the specific information of the selected objects is stored in RAM 103 when CPU 102 causes the display portion 105 to display the objects.
Besides, the specific information of the object on the current coordinate 306 is stored in RAM 103 by CPU 102. CPU 102 stores, in RAM 103, the specific information of each object displayed on the display portion 105 and coordinate information representing the coordinate at which the object is arranged on the screen. When coordinate information is input from the touch panel 107, CPU 102 judges which object is arranged at the coordinate corresponding to the touch position and stores its specific information in the corresponding region of RAM 103.
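This correspondence between touch coordinates and displayed objects may be realized, for example, as in the following sketch in Python; the class name DisplayedObject, the helper object_at and the object identifiers are hypothetical names introduced only for illustration and are not part of the disclosed apparatus.

    # Hypothetical sketch: each displayed object is registered together with
    # the rectangle it occupies on the screen, and a touch coordinate is
    # mapped to the object arranged at that coordinate.
    from dataclasses import dataclass

    @dataclass
    class DisplayedObject:
        object_id: str   # specific information of the object
        x: int
        y: int
        width: int
        height: int

        def contains(self, px: int, py: int) -> bool:
            return (self.x <= px < self.x + self.width
                    and self.y <= py < self.y + self.height)

    def object_at(displayed_objects, px, py):
        # Returns the object arranged at the touch position, or None.
        # Objects are checked in front-to-back order.
        for obj in displayed_objects:
            if obj.contains(px, py):
                return obj
        return None

    # The object at the start point is stored as the object in a selected
    # state; the object under the current coordinate is updated each time
    # coordinate information is input from the touch panel.
    objects = [DisplayedObject("image_switching_object_410", 0, 116, 40, 40),
               DisplayedObject("image_object_401", 0, 0, 480, 272)]
    selected_object = object_at(objects, 10, 130)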
Next, an example of the display screen of the present embodiment is explained with reference to the drawings.
When a user drags the image object 401, CPU 102 executes moving control which moves the image so as to follow the drag operation.
As shown in the figure, a toolbar object 402 has two states, an opening state and a closing state, which can be switched by touching and immediately releasing (tapping) a predetermined region of the touch panel 202. Image switching objects 410 and 411 forward the image when tapped, and images can be switched continuously by a long touch on these objects. The high-speed display mode is explained later.
When the displayed image is switched by the drag operation or by the image switching objects 410 and 411, the switching is performed according to a list of the image files stored in memory card 109. This list includes, for each image file, an image file number and file-path information for accessing the image file, and is generated on RAM 103 by CPU 102. The image files are numbered, for example, in an order according to an attribute of the image files (for example, the photo date). CPU 102 selects the image file to be displayed on the display portion 105 from among the image files included in the list as candidates to be displayed, and displays the image on the display screen of the display portion by accessing it according to the file-path information of the image file.
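The list of image files described above may be represented, for example, as in the following sketch; build_image_list and switch_candidate are hypothetical helper names, and sorting by file modification time stands in for ordering by an attribute such as the photo date.

    import os

    def build_image_list(card_root):
        # Enumerate image files on the memory card and number them in the
        # order of an attribute (here, the file modification time is used
        # in place of the photo date).
        paths = [os.path.join(card_root, name)
                 for name in os.listdir(card_root)
                 if name.lower().endswith((".jpg", ".jpeg", ".png"))]
        paths.sort(key=os.path.getmtime)
        return [{"number": i + 1, "path": p} for i, p in enumerate(paths)]

    def switch_candidate(image_list, current_index, step):
        # Select the previous (step = -1) or next (step = +1) image file as
        # the new candidate to be displayed, and return its file path so
        # that the image can be accessed and shown on the display screen.
        new_index = max(0, min(len(image_list) - 1, current_index + step))
        return new_index, image_list[new_index]["path"]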
A number of print object 412 displays the number of prints. A number-of-print setting screen, which is not illustrated, is displayed by tapping the number of print object 412, and the number of prints can be changed on that screen. A print setting screen, which is not illustrated, is displayed by tapping a print setting object 413, and a sheet type or a sheet size can be set on that screen.
A mode which displays a plurality of images on one screen, which is not illustrated, can be selected by tapping a multi-image display switching object 414. The types and positions of the objects displayed on the screen are not limited to this example. Other objects may be displayed, and the display positions may be changed. Only a part of the objects may be displayed; for example, only the number of print object 412 may be displayed.
These objects can be switched between being displayed and not being displayed. For example, they can be switched by tapping a predetermined region of the screen or by dragging on another region.
Image counter 603 displays the total number of images stored in memory card 109 and the number of the image that is currently displayed, as in the example shown in the figure.
Next, the processing for detecting touch operations and issuing touch events is explained with reference to the flowchart.
In S702, CPU 102 reads the coordinate register of the touch panel 107 and judges whether or not a coordinate of a touch position is detected. With respect to the timing of detecting a coordinate, CPU 102 may read the coordinate register of the touch panel 107 at predetermined intervals. Alternatively, a hardware interrupt may occur when a coordinate is detected, and CPU 102 may read the coordinate register at that timing.
In a case where the coordinate of the touch position is detected in S702, the process proceeds to S703. In a case where the coordinate of the touch position is not detected in S702, the process proceeds to S712. In S703, CPU 102 judges whether or not the detected coordinate is the start point coordinate. If the detected coordinate is the start point coordinate, the process proceeds to S704. If not, the process proceeds to S706. In S706, the current coordinate stored in RAM 103 is set as the previous coordinate, and the process proceeds to S707.
In S704, the detected coordinate is set as the start point coordinate. In S705, a touch event indicating start point detection is issued. In S707, the detected coordinate is set as the current coordinate and is stored in RAM 103. In S708, a drag operation judgment is performed. Concretely, CPU 102 compares the current coordinate with the previous coordinate and calculates the difference between the coordinates. In a case where the difference is larger than a predetermined value in at least one of the X-axis and the Y-axis, CPU 102 judges that a drag operation is performed. If a drag operation is judged, a drag event is issued in S709. If the user continues the drag operation on the touch panel 202, the drag event is issued a plurality of times.
In a case where a drag operation is not judged, the process proceeds to S710. In S710, a judgment of a long touch operation is performed. Concretely, in a case where the touch state continues for a predetermined period and the detected coordinates fall within a predetermined region during that period, CPU 102 judges that a long touch operation is performed. If a long touch operation is judged, a long touch event is issued in S711. The long touch event is issued at predetermined intervals while the user continues to touch the predetermined region of the touch panel 202.
In S712, CPU 102 judges whether or not the touch by the user is released. Concretely, if the previous coordinate has been detected, CPU 102 judges that the user's finger was released from the touch panel 202 after the last detection of a coordinate, and judges a release. If the previous coordinate has not been detected, CPU 102 does not judge a release, because the user released the finger from the touch panel 202 before that time. In a case where a release is judged, the process proceeds to S713 and a release event is issued. In a case where a release is not judged, the process goes back to S701. Each event issued in S705, S709, S711, and S713 is used to change the display of the display portion 105.
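The flow from S701 to S713 may be summarized, for example, as in the following sketch; the threshold values, the read_coordinate_register function and the issue_event function are assumptions introduced only for illustration, and the step numbers of the flow are kept as comments.

    import time

    DRAG_THRESHOLD = 8        # assumed per-axis threshold used in S708
    LONG_TOUCH_REGION = 8     # assumed region size used in S710
    LONG_TOUCH_PERIOD = 1.0   # assumed period used in S710, in seconds

    def touch_event_loop(read_coordinate_register, issue_event, interval=0.02):
        start = None            # start point coordinate (S704)
        current = None          # current coordinate (S707)
        previous = None         # previous coordinate (S706)
        touched_at = None
        while True:                                          # S701
            detected = read_coordinate_register()            # S702
            if detected is not None:
                if start is None:                            # S703
                    start = detected                         # S704
                    touched_at = time.time()
                    issue_event("touch", detected)           # S705
                else:
                    previous = current                       # S706
                current = detected                           # S707
                if previous is not None and (                # S708
                        abs(current[0] - previous[0]) > DRAG_THRESHOLD
                        or abs(current[1] - previous[1]) > DRAG_THRESHOLD):
                    issue_event("drag", current)             # S709
                elif (touched_at is not None                 # S710
                        and time.time() - touched_at > LONG_TOUCH_PERIOD
                        and abs(current[0] - start[0]) <= LONG_TOUCH_REGION
                        and abs(current[1] - start[1]) <= LONG_TOUCH_REGION):
                    issue_event("long_touch", current)       # S711
            else:
                if current is not None:                      # S712
                    issue_event("release", current)          # S713
                start = current = previous = None
                touched_at = None
            time.sleep(interval)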
Next, the processing executed when each touch event is issued is explained with reference to the flowcharts.
The object put into a selected state in S902 may be an object in a non-displayed state. As mentioned above, objects such as the image switching objects 410 and 411 may be put into a non-displayed state after a predetermined period has passed. In that case, the predetermined period may have passed and the image switching objects may already be in the non-displayed state when the user tries to touch them. If only an object in a displayed state can be selected, the image forwarding cannot be performed in such a case, even though the user touches the image switching object in order to perform image forwarding. Therefore, for an object that is put into the non-displayed state after the passage of a predetermined period, the object may be put into the selected state even while it is in the non-displayed state.
S903 to S906 are steps to judge whether or not the operation by the user is an operation to close the toolbar object 402 shown in the figure.
S907 to S909 are steps to switch objects such as the toolbar object 402, the image switching object 410, and the number of print object 412 shown in the figure between the displayed state and the non-displayed state.
In S907, it is judged whether or not the start point coordinate is in an information display switching region. An example of the information display switching region is shown as region 1001 in the figure.
In S1102, the image fast-forward processing is performed. The image fast-forward processing has already been explained above.
In the present embodiment, in a case where the drag operation is detected, it is judged that the user's intention is not to switch images, because switching of images is performed by the tap operation. If the user taps the information display switching region while various kinds of information such as the number of prints are displayed, that information is put into the non-displayed state. By turning off the information display switching flag when the drag operation is judged in S1201, it is possible to prevent the information such as the number of prints from being put into the non-displayed state after images are forwarded.
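The handling of the information display switching flag described above may be sketched, for example, as follows; the class name InfoDisplayState and its method names are hypothetical and are introduced only for illustration.

    class InfoDisplayState:
        # Sketch of the flag that controls the toggling of information such
        # as the number of prints.
        def __init__(self):
            self.info_visible = True
            self.switching_flag = False

        def on_touch_start(self, in_switching_region: bool):
            # The flag is set when the touch starts in the information
            # display switching region.
            self.switching_flag = in_switching_region

        def on_drag(self):
            # Corresponds to S1201: a drag is judged, so the user's intent
            # is not to toggle the information display.
            self.switching_flag = False

        def on_release(self):
            # Corresponds to S1304: the display is toggled only when the
            # flag is still ON at the time of release.
            if self.switching_flag:
                self.info_visible = not self.info_visible
            self.switching_flag = False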
In S1202, the current coordinate stored in RAM 103 in S701 of the flow described above is acquired.
In S1205, the object on the current coordinate is acquired. In a case where the object on the current coordinate is the image object 401, the process proceeds to S1206. If it is the toolbar object 402 or 402′, the process proceeds to S1209. If it is another object, the process proceeds to S1215. In S1206, it is judged whether or not the image fast-forward processing is being executed. In a case where the image fast-forward processing is not being executed, the process proceeds to S1208.
In S1208, the object information in a selected state, which was stored in RAM 103 in S902 of the flow described above, is acquired and referred to.
In this way, in S1208, in a case where it is judged that not only the image object but also the image switching object 410 or 411 is selected as the selected object, the moving processing by the drag operation is performed. Even if the user touches the image switching object by mistake when performing the drag operation on the image, the image can be moved by the drag operation.
If another kind of object is touched at the start of the operation, it is judged that the user's drag operation does not express an intent to perform the moving processing of the image, and the moving processing of the image is not performed. That is, in a case where the user touches an object such as the toolbar object 402 or the number of print object 412 and then performs a drag operation toward the image object, the moving processing of the image is not performed. In this way, image forwarding performed against the user's intention can be prevented.
In the present embodiment, whether or not the moving processing of the image by the drag operation is performed is judged in this way, based on the object selected at the start of the operation.
However, the condition for judging whether or not the moving processing of the image by the drag operation is performed is not limited to the above example. For example, when the drag operation for the image object is performed after an object arranged at a predetermined coordinate is touched, the moving processing of the image by the drag operation may be performed.
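The judgment described above for S1205 to S1215 may be sketched, for example, as follows; the object identifiers and the move_by method of the image_view argument are hypothetical names introduced only for illustration.

    # Whether the moving processing of the image by the drag operation is
    # performed is judged from the object that was selected at the start
    # point of the touch.
    MOVABLE_SELECTIONS = {"image_object_401",
                          "image_switching_object_410",
                          "image_switching_object_411"}

    def should_move_image(selected_object_id: str) -> bool:
        return selected_object_id in MOVABLE_SELECTIONS

    def on_drag(selected_object_id, previous_x, current_x, image_view):
        if should_move_image(selected_object_id):
            # Corresponds to S1211 to S1213: the image is moved by the
            # distance between the previous and current X coordinates.
            image_view.move_by(current_x - previous_x)
        # When an object such as the toolbar object 402 or the number of
        # print object 412 was selected, the image is not moved, so that an
        # image forward against the user's intention is prevented.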
In S1211, the previous coordinate stored in RAM 103 in S710 is acquired, and the process proceeds to S1212. In S1212, the moving distance between the previous coordinate and the current coordinate in the X direction is calculated, and the process proceeds to S1213. In S1213, the image is moved by the distance calculated in S1212. In S1214, a flag indicating that the image is being dragged is set to ON, and the flag state is stored in RAM 103. This flag is referred to in the release processing described later.
On the other hand, in a case where the object on the current coordinate during the drag operation is the toolbar object in S1205, the process proceeds to S1209. In S1209, the effective flag of the toolbar closing operation, which was stored in S905 and S906 of the flow described above, is acquired.
In S1215, the flag indicating that the image is being dragged, which was stored in RAM 103 in S1214, is acquired. If the flag is ON, the process proceeds to S1216. If the flag is OFF, the process is completed. That is, in a case where an object other than the image object is touched during the image drag, the image forward processing in S1216 is performed. In this way, the image forward is performed even if the image switching objects 410 and 411 or the toolbar object 402 are touched. The image forward processing is explained later.
In S1302, the image forward processing is performed. It is explained later with reference to the drawings.
In S1305, it is judged whether or not the coordinate designated at the time of release exists on the selected object selected in S902. In a case where the coordinate exists on the selected object, a predetermined processing for the selected object is performed in S1306.
For example, in a case where the selected object is the image switching object, the image to be displayed is switched. In a case where the selected object is the number of print object, the screen on which the user instructs the number of prints is displayed on the display portion 105. The switching of the image is performed according to the list of image files. That is, in the list, the image previous or next to the image currently displayed becomes the new image to be displayed. In a case where the image switching object 410 is selected, the previous image is displayed. In a case where the image switching object 411 is selected, the next image is displayed.
In S1305, if the coordinate at the time of release is not on the selected object, the predetermined processing for the selected object is not performed, and the process is completed. That is, after the user touches an object, the selection of the object can be cancelled by the drag operation.
On the other hand, as explained for S1208, in a case where the image switching object is selected, the moving of the object by the drag operation is performed following the user's drag operation. Therefore, while a user can cancel the selection of an object by performing the drag operation after touching it, as mentioned above, the user can also perform the image forwarding by performing the drag operation after touching the image switching object. With respect to a touch on the image switching object, the image forward is performed by the drag operation, because such a touch is assumed to be a case where the user touched the image switching object by mistake when performing the drag operation for the image forward, as shown in the figure.
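The release processing of S1305 and S1306 may be sketched, for example, as follows; the object identifiers and the methods of the ui argument are hypothetical names introduced only for illustration.

    def on_release(selected_object, release_x, release_y, ui):
        # The processing assigned to the selected object is executed only
        # when the release coordinate is still on that object (S1305);
        # otherwise the selection is treated as cancelled by the drag.
        if selected_object is None:
            return
        if not selected_object.contains(release_x, release_y):
            return
        # S1306: a predetermined processing for the selected object.
        if selected_object.object_id == "image_switching_object_410":
            ui.show_previous_image()
        elif selected_object.object_id == "image_switching_object_411":
            ui.show_next_image()
        elif selected_object.object_id == "number_of_print_object_412":
            ui.open_number_of_print_setting_screen()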
S1405, S1406, and S1408 are steps for judging whether or not the image forward is performed, based on the X-direction vector at the end of the drag track.
In S1405, the X-coordinate of the current coordinate (Cx) and that of the start point coordinate (Sx) are compared. If Cx > Sx, the process proceeds to S1406. If not, the process proceeds to S1408.
S1406 and S1408 are steps for calculating the moving direction vector of the end of the drag operation and judging whether or not the user intends to cancel the image forward. In S1406, the X-coordinate of the current coordinate (Cx) and that of the previous coordinate (Px) are compared. If Cx <= Px, the process proceeds to S1409. If not, the process proceeds to S1410. This corresponds, for example, to a drag operation such as the one shown in the figure.
In S1407 the image is forwarded in the +X direction (right direction), in S1409 the image is forwarded in the −X direction (left direction), and in S1410 the image forward is canceled; the present flow is then completed.
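The comparisons of S1405 to S1410 may be transcribed, for example, as in the following sketch; Sx, Px and Cx denote the X coordinates of the start point, the previous coordinate and the current coordinate, and the branch taken inside S1408, which is not described above, is filled in here symmetrically as an assumption.

    def judge_image_forward(sx: int, px: int, cx: int) -> str:
        # Transcription of the judgment at the end of the drag track.
        if cx > sx:                      # S1405
            if cx <= px:                 # S1406
                return "forward -X"      # S1409
            return "cancel"              # S1410
        else:                            # S1408 (assumed mirror of S1406)
            if cx >= px:
                return "forward +X"      # S1407
            return "cancel"              # S1410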
In the above embodiment, if the object selected by the user's touch is the image switching object, the change of the image to be displayed is performed by a drag operation. Therefore, even if the user touches the image switching object by mistake when performing the image forward by the drag operation, the image forward is performed appropriately.
Moreover, even if the user intends to tap the image switching object but performs a drag operation by mistake, the image forward by the drag operation, which is similar to the processing assigned to the image switching object (switching of the image to be displayed), is performed. Therefore, a processing corresponding to the user's drag operation can be performed so as to reflect the user's intention.
On the other hand, even if the user performs the drag operation after touching the number of print object 412, for example, the image to be displayed is not changed. Therefore, in a case where the user performs the drag operation by mistake when giving an instruction through the number of print object 412, the image to be displayed is prevented from being changed against the user's intention.
Moreover, when an object other than the number of print object 412, the image object, or the image switching object is selected, the processing by the drag operation is not performed. Therefore, even if the user touches an object by mistake, the touch can be cancelled by the drag operation. That is, it is prevented that the image to be displayed is changed by a drag operation against the user's intention, or that the screen for setting the number of prints is displayed.
In a case where the drag operation is performed after touching and selecting an object other than the image object and the image switching object, a processing for the selected object may instead be performed.
Moreover, in the above embodiment, whether or not the image forward by a drag operation is performed is judged based on the position at which the object selected by the touch of the drag operation is arranged on the display screen. However, without being limited to this, the above judgment may be performed according to the kind of the object selected by the touch of the drag operation.
Furthermore, in the present embodiment, if the information display switching region 1001 is touched and released by the user, the display of information such as the number of print object 412 can be turned ON/OFF, as shown in S1304 of the flow described above.
However, as mentioned for S1201, in a case where a drag operation is judged, the information display switching flag is turned OFF, so the display of such information is not switched by the release after the drag operation.
As mentioned above for S1208, in a case where the image object or the image switching object is selected, the moving processing of the image by the drag operation is performed.
In the present embodiment, the image file is used as the example of the object to be displayed. However, the disclosure is not limited to this embodiment. The disclosure can be applied to various kinds of data; for example, it may be applied to document files such as those stored in memory card 109.
In the above embodiment, the drag operation is used as the example of a moving operation by a user. However, the disclosure is not limited to this embodiment. A flick operation, in which the user flicks a finger across the touch panel, may also be applied.
In the above embodiment, an example in which a user gives instructions, such as deciding the region to be processed in the image, by using the touch panel is explained. However, the disclosure is not limited to this embodiment. The disclosure can also be applied to a case where the object to be displayed is switched by using various kinds of pointing devices such as a mouse, a track ball, a touch pen, or a touch pad.
Embodiments of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., a non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present disclosure, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a CPU, micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
The processing may be executed by one computer (CPU or MPU), or a plurality of computers may cooperate for the execution. The processing is not limited to execution, by a computer (a CPU, an MPU or the like), of a program stored in a memory such as a ROM; hardware (a circuit or the like) may perform the processing described in the above embodiments. A part of the processing described in the above embodiments may be executed by a computer (a CPU, an MPU or the like), and the remaining part may be executed by hardware.
While the present disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims priority from Japanese Patent Application No. JP2012-203095, filed Sep. 14, 2012, which is hereby incorporated by reference herein in its entirety.
Claims
1. An apparatus comprising:
- a display control unit configured to cause a display screen to display a plurality of objects which include an object to be processed according to a moving operation by a user;
- a selection unit configured to select at least one of the plurality of objects which are displayed by the display control unit, based on a position on the display screen which is instructed by a user;
- a determining unit configured to, in a case where an object different from the object to be processed is selected by the selection unit and the moving operation for the selected object is executed, determine whether to execute the predetermined processing to the object to be processed according to the moving operation, based on the selected object; and
- a processing unit configured to execute the predetermined processing for the object according to the moving operation, in a case where the determining unit determines that the predetermined processing is executed to the object.
2. The apparatus according to claim 1,
- wherein the determining unit determines whether to execute the predetermined processing to the object to be processed according to the moving operation, based on a position that the selected object which is selected by the selection unit is arranged on the display screen.
3. The apparatus according to claim 1,
- wherein the processing unit executes a processing which changes a candidate to be displayed by the display control unit among a plurality of candidates of the objects to be processed, as the predetermined processing.
4. The apparatus according to claim 3,
- wherein the display control unit changes the display on the display screen of a part of the objects among the plurality of objects, in a case where a predetermined region is instructed by a user on the display screen.
5. The apparatus according to claim 4,
- wherein the display control unit does not execute the change of the display for the part of objects, even if the predetermined region is instructed, in a case where a continuous moving operation is executed by a user.
6. The apparatus according to claim 3,
- wherein the processing unit changes the object to be displayed, by moving the object which is displayed by the display control unit according to the moving operation performed by a user.
7. The apparatus according to claim 1,
- wherein, in a case where the determining unit determines that the predetermined processing is not executed to the object to be processed, a processing corresponding to the selected object selected by the selection unit is not executed for the selected object.
8. The apparatus according to claim 1,
- wherein the display screen has a touch panel, and the display control unit causes the display screen set up with the touch panel to display the plurality of objects which include the object to be processed according to the moving operation.
9. The apparatus according to claim 8,
- wherein the moving operation is a flick operation or a drag operation on the touch panel.
10. A method comprising:
- causing a display screen to display a plurality of objects which include an object to be processed according to a moving operation by a user;
- selecting at least one of the plurality of objects which are displayed, based on a position on the display screen which is instructed by a user;
- in a case where an object different from the object to be processed is selected by the selection and the moving operation for the selected object is executed, determining whether to execute the predetermined processing to the object to be processed according to the moving operation, based on the selected object which is selected; and
- executing the predetermined processing for the object according to the moving operation, in a case where the predetermined processing is executed to the object to be processed.
11. The method according to claim 10,
- wherein it is determined whether to execute the predetermined processing to the object to be processed according to the moving operation, based on a position that the selected object is arranged on the display screen.
12. The method according to claim 10,
- wherein a processing which changes a candidate to be displayed among a plurality of candidates of the objects to be processed is executed as the predetermined processing.
13. The method according to claim 12, wherein the display on the display screen of a part of the objects among the plurality of objects is changed, in a case where a predetermined region is instructed by a user on the display screen.
14. The method according to claim 13,
- wherein the change of the display in the part of objects is not executed, even if the predetermined region is instructed, in a case where a continuous moving operation is executed by a user.
15. The method according to claim 12,
- wherein the object to be displayed is changed, by moving the displayed object according to the moving operation by a user.
16. The method according to claim 10,
- wherein, in a case where it is determined that the predetermined processing is not executed to the object to be processed, a processing corresponding to the selected object is not executed for the selected object.
17. The method according to claim 10,
- wherein the display screen has a touch panel, and the plurality of objects which include the object to be processed are displayed on the display screen set up with the touch panel.
18. The method according to claim 10,
- wherein the moving operation is a flick operation or a drag operation.
19. A non-transitory recording medium storing a program for causing a computer to execute the method according to claim 10.
Type: Application
Filed: Sep 12, 2013
Publication Date: Mar 20, 2014
Applicant: CANON KABUSHIKI KAISHA (Tokyo)
Inventor: Kazunori Yamauchi (Yokohama-shi)
Application Number: 14/024,939
International Classification: G06F 3/041 (20060101);