METHOD OF USING A TOUCH SCREEN AND USER INTERFACE APPARATUS EMPLOYING THE SAME

- Samsung Electronics

A user interface apparatus and method of using a touch screen are provided. The apparatus includes a touch screen division unit for dividing a screen area of the touch screen into zones, an allocation unit for allocating a user interface unit to each of the zones, and a plurality of user interface units for sensing job commands input through the zones according to the allocation result of the allocation unit, wherein an operation corresponding to each of the sensed job commands is performed. Accordingly, since a plurality of users can simultaneously use an image-forming device by dividing the screen area of the touch screen into two or more zones, job processing time can be reduced, and the usage rate of a high-specification image-forming device can be increased.

Description
CROSS-REFERENCE TO RELATED PATENT APPLICATION

This application is a continuation of U.S. patent application Ser. No. 11/509,608, filed Aug. 25, 2006, which claims the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2005-0103424, filed on Oct. 31, 2005, in the Korean Intellectual Property Office, the entire disclosures of both of said applications being incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image-forming device. More particularly, the present invention relates to a user interface apparatus and method for allowing a plurality of users to simultaneously use an image-forming device having a touch screen.

2. Description of the Related Art

A touch screen is designed to execute a command or move a cursor by recognizing the contact point when a user touches the screen with a finger or a ballpoint-pen-shaped touch pen. Touch screens are used as a user interface in many fields: in devices such as personal digital assistants (PDAs), liquid crystal displays (LCDs), and cathode ray tubes (CRTs), and in applications such as banking, government offices, medical equipment, tourism, facility guides, and traffic guides.

Touch screen technology may use a pressure-sensing method or an electrostatic method. The pressure-sensing method calculates the coordinates of a pressed location using closely spaced sensors that detect pressure exerted on the surface of the touch screen and measure its degree. The pressure-sensing method is widely used, but its accuracy is low. The electrostatic method is more accurate: it calculates the coordinates of a touched location by charging the surface of the touch screen, installing sensors around the screen, and sensing the amount of charge lost when the screen is touched.
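
As a rough illustration of the pressure-sensing idea (not from the patent; the sensor grid, names, and arithmetic here are assumptions), the touch point can be estimated as the pressure-weighted centroid of the sensor readings:

```python
# Illustrative only: the patent describes the idea, not this arithmetic.
def touch_coordinates(readings: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Estimate a touch point from (sensor_x, sensor_y, pressure) triples.

    The pressed location is taken as the pressure-weighted centroid of
    the individual sensor readings.
    """
    total = sum(p for _, _, p in readings)
    x = sum(sx * p for sx, _, p in readings) / total
    y = sum(sy * p for _, sy, p in readings) / total
    return x, y

# Three sensors near a press; the heaviest reading pulls the estimate.
print(touch_coordinates([(0, 0, 0.2), (10, 0, 0.7), (5, 8, 0.1)]))  # (7.5, 0.8)
```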

Recently, medium or large sized image-forming devices, such as printers or multi-function peripherals (MFPs), generally use a touch screen as a user interface for users' convenience. In general, a plurality of users share such a medium or large sized image-forming device.

However, when operating an image-forming device through a touch screen, only a single user can use the touch screen at a given time. Thus, when a plurality of users want to use the image-forming device simultaneously, the other users must wait until the current user finishes his/her work.

Accordingly, there is a need for an improved user interface apparatus and method of using a touch screen which allows a plurality of users to employ the touch screen at the same time.

SUMMARY OF THE INVENTION

Exemplary embodiments of the present invention address at least the above problems and/or disadvantages and provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a user interface apparatus and method of using a touch screen, which allows a plurality of users to simultaneously use the touch screen by dividing a screen area of the touch screen into two or more zones.

According to an aspect of the present invention, there is provided a user interface apparatus of a touch screen, the apparatus comprising a touch screen division unit for dividing a screen area of the touch screen into zones, an allocation unit for allocating a user interface unit to each of the zones, and a plurality of user interface units for sensing job commands input through the zones according to the allocation result of the allocation unit, wherein an operation corresponding to each of the sensed job commands is performed.

According to another aspect of the present invention, there is provided a user interface method of using a touch screen, the method comprising dividing a screen area of the touch screen into zones, allocating a user interface unit to each of the zones, sensing job commands input through the zones and performing an operation corresponding to each of the sensed job commands.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings in which:

FIG. 1 is a block diagram of a user interface apparatus of a touch screen according to an exemplary embodiment of the present invention;

FIGS. 2A-D illustrate layouts for representing zones of a touch screen according to an exemplary embodiment of the present invention;

FIG. 3 illustrates user interfaces of the exemplary touch screen of FIG. 2A divided into two zones;

FIG. 4 illustrates a user interface displaying a job execution process for the exemplary touch screen of FIG. 3 divided into two zones;

FIG. 5 is a flowchart illustrating a user interface method of a touch screen according to an exemplary embodiment of the present invention; and

FIG. 6 is a detailed flowchart illustrating an operation of FIG. 5 according to an exemplary embodiment of the present invention.

Throughout the drawings, the same drawing reference numerals will be understood to refer to the same elements, features, and structures.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

The matters defined in the description such as a detailed construction and elements are provided to assist in a comprehensive understanding of the embodiments of the invention and are merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted for clarity and conciseness.

The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown.

FIG. 1 is a block diagram of a user interface apparatus of a touch screen according to an exemplary embodiment of the present invention. The user interface apparatus includes a mode selection input unit 90, a touch screen division unit 100, an allocation unit 110, and first through nth (n is a positive integer larger than 1) user interface units 120 through 140.

Using the mode selection input unit 90, a user inputs a mode selection for dividing the screen area of the touch screen. The user selects a mode for area division of the touch screen so that the screen area is divided into at least two zones.

The mode selection input unit 90 may be an operating panel that senses a mode selection signal generated by a key panel operation of the user.

The mode selection input unit 90 can receive a mode selection signal when an image-forming device is in an idle state or is performing a job, such as printing or copying. That is, although a job of a certain user may be proceeding, the same user or another user can input a mode selection for area division of the touch screen through the mode selection input unit 90.
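
As a minimal sketch of this behavior (the state names and handler below are assumptions, not the patent's design), the mode selection input simply accepts an area-division request regardless of the device state:

```python
from enum import Enum

class DeviceState(Enum):
    IDLE = "idle"
    PRINTING = "printing"
    COPYING = "copying"

def on_mode_selection(state: DeviceState) -> bool:
    """Accept a division-mode request in any state, per the description above."""
    print(f"area-division mode requested while {state.value}: accepted")
    return True

on_mode_selection(DeviceState.PRINTING)  # a job may already be proceeding
```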

The touch screen division unit 100 divides the screen area of the touch screen into zones and outputs the division result to the allocation unit 110. To do this, the touch screen division unit 100 includes a layout display unit 102, a layout sensing unit 104, and an area division unit 106.

The layout display unit 102 displays sample layouts of the zones of the touch screen. Each of the sample layouts of the touch screen may be obtained by dividing the touch screen into at least two zones.

FIGS. 2A-D illustrate sample layouts for representing zones of the touch screen according to exemplary embodiments of the present invention. FIG. 2A is an example in which the touch screen is divided into two zones, FIG. 2B is an example in which the touch screen is divided into three zones, FIG. 2C is another example in which the touch screen is divided into three zones, and FIG. 2D is an example in which the touch screen is divided into four zones. However, FIGS. 2A-D show only examples of the ways the touch screen may be divided.
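
For illustration only, the sample layouts of FIGS. 2A through 2D could be modeled as lists of fractional rectangles. This Python sketch is not from the patent; the names (`Rect`, `SAMPLE_LAYOUTS`) and the exact geometry of the three-zone layouts are assumptions:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    """A zone as fractions (x, y, width, height) of the full screen."""
    x: float
    y: float
    w: float
    h: float

SAMPLE_LAYOUTS: dict[str, list[Rect]] = {
    # FIG. 2A: two side-by-side zones (A left, B right).
    "2A": [Rect(0.0, 0.0, 0.5, 1.0), Rect(0.5, 0.0, 0.5, 1.0)],
    # FIG. 2B: three zones, read here as three vertical strips.
    "2B": [Rect(0.0, 0.0, 1 / 3, 1.0), Rect(1 / 3, 0.0, 1 / 3, 1.0),
           Rect(2 / 3, 0.0, 1 / 3, 1.0)],
    # FIG. 2C: three zones, read here as one wide zone over two halves.
    "2C": [Rect(0.0, 0.0, 1.0, 0.5), Rect(0.0, 0.5, 0.5, 0.5),
           Rect(0.5, 0.5, 0.5, 0.5)],
    # FIG. 2D: four zones in a 2x2 grid.
    "2D": [Rect(0.0, 0.0, 0.5, 0.5), Rect(0.5, 0.0, 0.5, 0.5),
           Rect(0.0, 0.5, 0.5, 0.5), Rect(0.5, 0.5, 0.5, 0.5)],
}
```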

The layout sensing unit 104 senses a layout selected by the user among the displayed layouts and outputs the sensing result to the area division unit 106.

The area division unit 106 divides the screen area of the touch screen into zones corresponding to the layout sensed by the layout sensing unit 104. For example, if the user selects the area division illustrated in FIG. 2A, the area division unit 106 divides the touch screen as illustrated in FIG. 2A.
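
Continuing the hypothetical sketch above, the division performed by the area division unit 106 could amount to scaling the sensed layout to the physical screen:

```python
def divide_screen(layout_key: str, screen_w: int, screen_h: int) -> list[Rect]:
    """Scale the selected fractional layout to pixel coordinates.

    Hypothetical helper; the patent specifies the behavior, not the units.
    """
    return [Rect(round(r.x * screen_w), round(r.y * screen_h),
                 round(r.w * screen_w), round(r.h * screen_h))
            for r in SAMPLE_LAYOUTS[layout_key]]

# Selecting the FIG. 2A layout on an 800x480 panel yields two 400x480
# zones: zone A on the left, zone B on the right.
zones = divide_screen("2A", 800, 480)
```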

The allocation unit 110 allocates the first through nth user interface units 120, 130, and 140 to corresponding zones of the touch screen and outputs the allocation result to the first through nth user interface units 120, 130, and 140.

For example, if the touch screen division unit 100 has divided the screen area of the touch screen into two zones as illustrated in FIG. 2A, the allocation unit 110 allocates the first user interface unit 120 to a zone A and outputs the allocation result to the first user interface unit 120. The allocation unit 110 also allocates the second user interface unit 130 to a zone B and outputs the allocation result to the second user interface unit 130.
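
The allocation step could then be a one-to-one mapping from zones to user interface units. `UserInterfaceUnit` below is a hypothetical stand-in for units 120 through 140, reusing the sketch above:

```python
class UserInterfaceUnit:
    """Hypothetical stand-in for the first through nth UI units (120-140)."""

    def __init__(self, name: str):
        self.name = name
        self.zone: Rect | None = None

    def allocate(self, zone: Rect) -> None:
        # Receive the allocation result and remember which zone to draw into.
        self.zone = zone
        print(f"{self.name}: displaying a user interface in zone {zone}")

def allocate_units(units: list[UserInterfaceUnit], zones: list[Rect]) -> None:
    # One unit per zone, in order: zone A first, then zone B, and so on.
    for unit, zone in zip(units, zones):
        unit.allocate(zone)

units = [UserInterfaceUnit("UI-1"), UserInterfaceUnit("UI-2")]
allocate_units(units, zones)  # zones from the FIG. 2A division above
```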

Each of the first through nth user interface units 120, 130, and 140 displays a user interface corresponding to each of the zones according to the allocation result.

FIG. 3 illustrates user interfaces of the touch screen divided into two zones.

For example, if the allocation unit 110 allocates the first user interface unit 120 to the divided zone A of FIG. 2A, the first user interface unit 120 displays a user interface on the divided zone A as illustrated in FIG. 3. Likewise, if the allocation unit 110 allocates the second user interface unit 130 to the divided zone B of FIG. 2A, the second user interface unit 130 displays a user interface on the divided zone B, as further illustrated in FIG. 3.

Each of the first through nth user interface units 120, 130, and 140 senses job commands input through its own zone.

For example, when a user selects any one of a printing job, a copy job, a scanning job, a fax job and the like through the user interface allocated to the divided zone A, the first user interface unit 120 senses the job command selected by the user; the second through nth user interface units 130 and 140 do not. When another user selects any one of a printing job, a copy job, a scanning job, a fax job and the like through the user interface of the divided zone B, the second user interface unit 130 senses the job command selected by that user; similarly, the remaining user interface units, including the first user interface unit 120, do not.
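
Under this reading, routing a touch to the right unit reduces to a hit test against the zone boundaries. Again a sketch under the assumed names above, not the patent's implementation:

```python
def hit_test(units: list[UserInterfaceUnit], x: int, y: int) -> UserInterfaceUnit | None:
    """Return the UI unit whose zone contains the touch point, if any."""
    for unit in units:
        z = unit.zone
        if z and z.x <= x < z.x + z.w and z.y <= y < z.y + z.h:
            return unit
    return None

# A touch at (120, 200) falls in zone A, so only UI-1 senses the command;
# a touch at (600, 200) falls in zone B and is routed to UI-2 instead.
assert hit_test(units, 120, 200) is units[0]
assert hit_test(units, 600, 200) is units[1]
```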

When a new job (for example, Scan, Fax, or Print) is registered for a device that is already executing a previously input job, the first through nth user interface units 120, 130, and 140 may deactivate the input buttons of jobs that would otherwise be performed immediately. The deactivated buttons are reactivated when the device becomes ready for use after the registered job is complete. In addition, the first through nth user interface units 120, 130, and 140 may keep active all input buttons of the displayed user interfaces except those associated with the currently proceeding job.

A new print job can be registered during printing, and the registered job is performed as soon as the previous job ends. For scanning, however, a new scanning job may not be registered while the device is scanning. For example, while a document is being scanned for copying, the input buttons “Copy,” “Scan,” and “ScanToEmail” are deactivated. When the scanning of that document ends, the input buttons that do not conflict with the currently proceeding job are reactivated. For faxing, whether a new job can be registered depends on the type of the currently proceeding job.
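
These conflicts suggest a resource-based rule: a job's button stays active only while the engines that job needs are free or queueable. A minimal sketch under that assumption; the patent names the behavior (which buttons go inactive), not this table or these names:

```python
# Hypothetical mapping from each job type to the hardware engines it occupies.
JOB_RESOURCES = {
    "Print": {"printer"},
    "Scan": {"scanner"},
    "ScanToEmail": {"scanner"},
    "Copy": {"scanner", "printer"},
    "Fax": {"scanner"},  # assumed: an outgoing fax must scan the document
}

# Engines whose jobs queue rather than block: a new print job can be
# registered during printing and runs as soon as the previous job ends.
QUEUEABLE = {"printer"}

def active_buttons(running_jobs: list[str]) -> set[str]:
    """Return the job buttons that should stay active on every zone's UI."""
    busy: set[str] = set()
    for job in running_jobs:
        # Non-queueable engines block any new job that needs them.
        busy |= JOB_RESOURCES[job] - QUEUEABLE
    return {job for job, needs in JOB_RESOURCES.items() if not (needs & busy)}

# While a copy job is scanning its document, "Copy", "Scan", "ScanToEmail"
# (and, under the assumption above, "Fax") are deactivated, but a print
# job can still be registered.
print(active_buttons(["Copy"]))  # -> {'Print'}
```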

FIG. 4 illustrates the divided touch screen of FIG. 3 when a user interface displays a job execution process. If a user selects a printing job command through the user interface displayed on the divided zone A of FIG. 3, an image indicating that a printing job is proceeding may be displayed on the divided zone A as illustrated in FIG. 4.

The user interface units 120, 130, and 140 of the touch screen described above may be included in an MFP. The MFP includes a printing device, a copy device, a scanning device, a facsimile device and the like.

A user interface method of a touch screen according to an exemplary embodiment of the present invention will now be described in detail with reference to the accompanying drawings.

FIG. 5 is a flowchart illustrating a user interface method of a touch screen according to an exemplary embodiment of the present invention.

Referring to FIG. 5, a mode selection for area division of the touch screen is input in operation 200. The mode selection can be input when an image-forming device is in an idle state or is performing a job, such as printing or copying.

A screen area of the touch screen is divided in operation 202.

FIG. 6 is a detailed flowchart illustrating operation 202 of FIG. 5 according to an exemplary embodiment of the present invention.

Possible layouts of the zones of the touch screen are displayed in operation 300. Each of the layouts of the touch screen may be obtained by dividing the touch screen into at least two zones. As illustrated in FIGS. 2A through 2D, various layouts can be displayed.

A layout selected by a user among the displayed layouts is sensed in operation 302.

The screen area of the touch screen is divided to correspond to the sensed layout in operation 304.

A corresponding number of user interface units are allocated to the zones in operation 204. For example, if the touch screen is divided into two zones in operation 202 as illustrated in FIG. 2A, the first user interface unit 120 is allocated to zone A and the second user interface unit 130 is allocated to zone B.

In operation 206, user interfaces corresponding to the zones are displayed, and job commands input through the user interfaces of the zones are sensed.

For example, if the first user interface unit 120 is allocated to the divided zone A of FIG. 2A, the first user interface unit 120 displays a user interface on the divided zone A as illustrated in FIG. 3. If the second user interface unit 130 is allocated to the divided zone B of FIG. 2A, the second user interface unit 130 displays a user interface on the divided zone B as illustrated in FIG. 3. In particular, when a new job (for example, Scan, Fax, or Print) is registered for a device that is already executing a previously input job, the first and second user interface units 120 and 130 may deactivate the input buttons of jobs that would otherwise be performed immediately. The deactivated buttons are reactivated when the device becomes ready for use after the registered job is complete. In addition, all input buttons of the displayed user interfaces except those associated with the currently proceeding job may be kept active.

Each of the user interface units senses job commands input through its respective user interface.

For example, when a user selects any one of a printing job, a copy job, a scanning job, a fax job and the like through the user interface of the divided zone A illustrated in FIG. 3, the first user interface unit 120 senses the job command selected by the user; the other user interface units do not. When another user selects any one of a printing job, a copy job, a scanning job, a fax job and the like through the user interface of the divided zone B illustrated in FIG. 3, the second user interface unit 130 senses the job command selected by that user; similarly, the remaining user interface units do not.

An operation corresponding to the sensed job command is performed in operation 208. For example, if the sensed job command is a printing job, a printing job is performed.
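
Putting the pieces together, the flow of FIG. 5 (operations 200 through 208) could read as follows; everything here reuses the hypothetical helpers sketched earlier and remains illustrative:

```python
def user_interface_method(layout_key: str, screen_w: int, screen_h: int,
                          touches: list[tuple[int, int, str]]) -> None:
    """Illustrative walk through operations 200-208 of FIG. 5."""
    # Operations 200-202: a mode selection is received and the screen is
    # divided according to the layout the user picked (FIG. 6).
    zones = divide_screen(layout_key, screen_w, screen_h)
    # Operation 204: allocate one user interface unit per zone.
    units = [UserInterfaceUnit(f"UI-{i + 1}") for i in range(len(zones))]
    allocate_units(units, zones)
    # Operations 206-208: sense job commands per zone and perform them.
    for x, y, job in touches:
        unit = hit_test(units, x, y)
        if unit is not None:
            print(f"{unit.name} sensed '{job}' in its zone; performing it")

# Two users at once: one prints from zone A while another scans from zone B.
user_interface_method("2A", 800, 480, [(120, 200, "Print"), (600, 200, "Scan")])
```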

The user interface method of using a touch screen described above may be performed by an MFP including a printing device, a copy device, a scanning device, a facsimile device and the like.

The exemplary embodiments of the present invention can be written as codes/instructions/programs and can be implemented in general-use digital computers that execute the codes/instructions/programs using a computer readable recording medium.

Examples of the computer readable recording medium include magnetic storage media (for example, ROM, floppy disks, hard disks, etc.), optical recording media (for example, CD-ROMs or DVDs), and transmission media such as carrier waves (for example, transmission through the Internet). The computer readable recording medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.

Also, functional programs, codes, and code segments for accomplishing the exemplary embodiments of the present invention can be easily construed by programmers skilled in the art to which the present invention pertains.

As described above, using a user interface apparatus and method of a touch screen according to exemplary embodiments of the present invention, a plurality of users can simultaneously use an image-forming device, and thus, job processing time can be reduced, and other users do not have to wait to use the image-forming device.

In addition, when a single user simultaneously performs a plurality of jobs, different jobs can be processed in parallel, increasing job efficiency.

In addition, since a plurality of users can simultaneously use a high specification image-forming device, a usage rate of the image-forming device can be increased.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims and the full scope of equivalents thereof.

Claims

1.-18. (canceled)

19. An apparatus having a touch screen, the apparatus comprising:

a touch screen division unit for dividing a screen area of the touch screen into a plurality of zones, if a screen area division function which divides the screen area of the touch screen into the plurality of zones is enabled; and
an allocation unit for allocating respective user interfaces (UIs) to each of the plurality of zones.

20. The apparatus of claim 19, further comprising:

a mode selection unit for receiving, via the touch screen, a mode selection that enables the screen area division function.

21. The apparatus of claim 19, wherein the plurality of zones are logically separated from each other so as to independently display a user interface (UI) and to independently receive a gesture.

22. The apparatus of claim 19, wherein the plurality of zones display, on the respective UIs, one or more selectable objects corresponding to one or more executable operations of the apparatus, the operations being executed by an operation unit included in the apparatus.

23. The apparatus of claim 22, wherein when a first operation from among the operations is being executed, the plurality of zones deactivate at least one of a first object and a second object from among the selectable objects on the respective UIs, the first object corresponding to the first operation and the second object corresponding to a second operation which conflicts with the first operation.

24. The apparatus of claim 22, wherein:

the plurality of zones detect a gesture of selecting a third object and a gesture of selecting a fourth object from among the selectable objects, the gestures being detected by different zones from each other; and
the operation unit executes a third operation corresponding to the third object and a fourth operation corresponding to the fourth object at the same time.

25. The apparatus of claim 24, wherein the zones that detected the gestures display results of executing the third operation and the fourth operation, respectively.

26. The apparatus of claim 19, wherein the touch screen division unit determines at least one of a number of the plurality of zones, sizes of the plurality of zones, shapes of the plurality of zones and an arrangement of the plurality of zones based on a user input.

27. The apparatus of claim 19, wherein the touch screen division unit displays, via the touch screen, at least one layout which represents a preview of the divided screen area.

28. The apparatus of claim 19, wherein the screen area is a part of the entire touch screen.

29. The apparatus of claim 19, wherein the apparatus is included in one of a mobile device, a displaying device, a medical device and an image forming device.

30. A method of using a touch screen, the method comprising:

dividing a screen area of the touch screen into a plurality of zones, if a screen area division function which divides the screen area of the touch screen into the plurality of zones is enabled; and
allocating respective user interfaces (UIs) to each of the plurality of zones.

31. The method of claim 30, further comprising receiving, via the touch screen, a mode selection that enables the screen area division function.

32. The method of claim 30, wherein the plurality of zones are logically separated from each other so as to independently display a user interface (UI) and to independently receive a gesture.

33. The method of claim 30, further comprising displaying, on the respective UIs, one or more selectable objects corresponding to one or more executable operations, by the plurality of zones.

34. The method of claim 33, wherein when a first operation from among the operations is being executed, the plurality of zones deactivate at least one of a first object and a second object from among the selectable objects on the respective UIs, the first object corresponding to the first operation and the second object corresponding to a second operation which conflicts with the first operation.

35. The method of claim 33, further comprising:

detecting, by the plurality of zones, a gesture of selecting a third object and a gesture of selecting a fourth object from among the selectable objects, the gestures being detected by different zones from each other; and
executing a third operation corresponding to the third object and a fourth operation corresponding to the fourth object at the same time.

36. The method of claim 35, further comprising displaying, by the zones that detected the gestures, results of executing the third operation and the fourth operation, respectively.

37. The method of claim 30, wherein the dividing the screen area of the touch screen into the plurality of zones comprises determining at least one of a number of the plurality of zones, sizes of the plurality of zones, shapes of the plurality of zones and an arrangement of the plurality of zones based on a user input.

38. The method of claim 30, wherein the method is performed in one of a mobile device, a displaying device, a medical device and an image forming device.

39. A non-transitory computer readable recording medium storing a computer readable program for executing the method of claim 30.

Patent History
Publication number: 20130163026
Type: Application
Filed: Feb 21, 2013
Publication Date: Jun 27, 2013
Applicant: Samsung Electronics Co., Ltd. (Suwon-si)
Application Number: 13/773,493
Classifications
Current U.S. Class: Emulation Or Plural Modes (358/1.13)
International Classification: H04N 1/00 (20060101);