SELF-SERVICE COMPUTER WITH DYNAMIC INTERFACE

A self-service computer with dynamic interface. The self-service computer includes a processor for receiving environment information, for selecting among different user interface features including different transaction screens and different transaction screen content based upon the environment information, and for providing selected user interface features during a transaction involving a customer.

Description
BACKGROUND

Self-service computer systems have replaced assisted-service computer systems in many business environments today. For example, self-service computer systems may be found in banking, retail, hospitality, travel, entertainment, medical, and other environments.

Self-service computer systems typically provide a consistent interface for completing transactions, regardless of customer type, queue length, or other factors.

Past approaches for offering customized user experiences have required a user to log in to a self-service computer. The self-service computer retrieves stored preferences associated with the user and changes the user interface accordingly. In addition, some self-service computers display buttons to change user interface parameters, such as volume or font size, on the fly.

Therefore, it would be desirable to provide a self-service computer with a dynamic interface that is not necessarily based upon knowing the identity of a user.

SUMMARY

A self-service computer with a dynamic interface is provided.

The self-service computer includes a processor for receiving environment information, for selecting among different user interface features including different transaction screens and different transaction screen content based upon the environment information, and for providing selected user interface features during a transaction involving a customer.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a block diagram of an example self-service computer.

FIG. 2 illustrates a block diagram of an example transaction system including the self-service computer.

DETAILED DESCRIPTION

Referring now to FIG. 1, an example self-service computer 10 includes processor 12 which executes self-service application 14 and environment information processing software 16.

Self-service application 14 provides user interface features from self-service application data 18 to customers in order to process transactions. User interface features may include transaction screens and transaction screen content. User interface features may additionally include multimedia features including audio and video features.

Self-service application data 18 may include one or more templates or skins populated with transaction items. For example, in a quick service restaurant, self-service application 14 may display a template populated with images of food items stored locally and/or obtained from a host server computer that distributes images of menu items to a plurality of quick service restaurants.
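
As a rough illustration, such a template might be represented as a simple data structure populated with item images; the names (MenuItem, Template) and the host URL below are hypothetical and not taken from the disclosure:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class MenuItem:
        """One transaction item placed into a template slot."""
        name: str
        price: float
        image_url: str  # image stored locally or obtained from a host server

    @dataclass
    class Template:
        """A skin or layout populated with transaction items."""
        template_id: str
        font_size: int = 14
        color_scheme: str = "default"
        items: List[MenuItem] = field(default_factory=list)

    # Example: a quick-service-restaurant template populated with food images
    qsr_template = Template(
        template_id="qsr-standard",
        items=[
            MenuItem("Cheeseburger", 4.99, "https://host.example/images/cheeseburger.png"),
            MenuItem("Fries", 1.99, "https://host.example/images/fries.png"),
        ],
    )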

Self-service computer 10 executes environment information processing software 16, which captures or receives environment information 20 and processes it for use by self-service application 14. Environment information 20 may include any information about the location of self-service computer 10.

Self-service computer 10 may receive environment information 20 from various sources. For example, self-service computer 10 may include one or more peripheral sensors, such as a camera, for capturing environment information 20. As another example, self-service computer 10 may receive environment information 20, such as outside weather information, from another computer via a network.

Example environment information 20 may include user type information, queue length, queuing time, and weather conditions where self-service computer 10 is located (or outside a building in which self-service computer 10 is located).

User type information may include demographic categories, including gender and age. Example age types include juveniles, adults, and seniors. Additional categories are envisioned, for example by further segmenting these categories.
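
For illustration only, environment information 20 of this kind might be collected into a structure such as the following; the field names and category labels are assumptions rather than part of the disclosure:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EnvironmentInfo:
        """Illustrative container for environment information 20."""
        age_group: Optional[str] = None      # e.g., "juvenile", "adult", "senior"
        gender: Optional[str] = None
        queue_length: Optional[int] = None   # customers waiting to order
        queuing_time_s: Optional[float] = None
        weather: Optional[str] = None        # e.g., "hot" or "cold", from a network source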

Self-service application 14 is dynamic in that it may provide a different user interface based upon different environment information 20 provided by environment information processing software 16. Thus, self-service application 14 may display different transaction screens and transaction screen content, and/or vary individual user interface features such as volume, font type and size, and color based upon the information in its environment.

Self-service application 14 may also rely on customer identification information, if provided via loyalty card or other input, but customer identification information is not necessarily required by self-service application 14 to select a user interface. Self-service application 14 is capable of selecting user interface features without customer identification information.

Self-service application 14 may rely on rules 22 for selecting user interface features. Rules 22 may be derived from previously collected customer interactions stored in purchase data.

For example, self-service application 14 may choose and display different templates based upon environment information 20 and rules 22. If environment information processing software 16 detects a female adult, self-service application 14 may display a template with user interface features tailored to women. If environment information processing software 16 detects a teenager, self-service application 14 may display a template with user interface features tailored to teenagers.

As another example, self-service application 14 may vary individual user interface features, such as font size, color, contrast, and audio volume based upon environment information 20 and rules 22. If environment information processing software 16 detects a senior citizen or a person who is likely above a certain age, self-service application 14 may adjust font size, color contrast, and audio volume to larger or higher levels.
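
A minimal sketch of how rules of this kind might be expressed, assuming a hypothetical select_ui_features function with made-up thresholds and feature values (the actual rules 22 are derived from previously collected purchase data):

    def select_ui_features(age_group: str, gender: str) -> dict:
        """Illustrative stand-in for rules 22: map a detected customer type
        to user interface features. All values here are invented for the example."""
        features = {"template": "default", "font_size": 14, "volume": 5, "contrast": "normal"}

        if age_group == "senior":
            # Larger text, higher contrast, louder audio prompts
            features.update(font_size=22, contrast="high", volume=8)
        elif age_group == "juvenile":
            features["template"] = "teen"

        if gender == "female" and age_group == "adult":
            features["template"] = "women"

        return features

    # Example: a detected female adult gets the template tailored to women
    print(select_ui_features(age_group="adult", gender="female"))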

In addition, self-service application 14 may follow different workflows for different demographics based upon environment information 20 and rules 22. Again, using the example of a senior citizen, self-service application 14 may provide a more straightforward, human-like interaction. For younger users who tend to be more comfortable with technology, self-service application 14 may provide a more streamlined workflow, using common text-messaging shortcuts or even a video-game motif.

As yet another example, self-service application 14 may choose and display different avatars based upon environment information 20 and rules 22. Self-service application 14 may choose an avatar suited to the demographic of the user, either one that resembles the demographic or one that complements it.

Any of self-service application data 18, environment information 20, and rules 22 may alternatively be stored at a server computer connected to self-service computer 10 via a network. Environment information processing software 16 may also be executed by a server computer instead of self-service computer 10.

Self-service computer 10 additionally includes memory, program and data storage, a display, and one or more user input devices. The display and user input device may be combined as a touch screen.

Self-service computer 10 may execute an operating system such as a Microsoft operating system, which can display screen information within one or more windows.

Self-service computer 10 additionally includes components and peripherals necessary to accomplish its purpose in the environment in which it is located. For example, self-service computer 10 may additionally include, but not be limited to, one or more payment peripherals and a receipt printer. Example payment peripherals include a card reader, a currency and/or coin acceptor, and a currency and/or coin dispenser.

The venue for self-service computer 10 may be any venue suited to completing a self-service transaction. For example, self-service computer 10 may be located at a retail store, airport, hotel, rental car facility, restaurant, health care facility, or other venue.

Referring now to FIG. 2, an example system 50 in a quick service or fast food venue includes one or more attendant computers 32 and one or more self-service kiosks 30.

In this example, self-service kiosk 30 includes camera 26. Camera 26 may be positioned to capture images of the entire queuing area adjacent to self-service kiosk 30.

Environment information processing software 16 includes image processing application 40. Image processing application 40 captures image data 42 using camera 26, compares captured image data 42 to reference facial characteristic data 44 to determine a customer type, and provides the customer type information to self-service application 14. Self-service application 14 selects self-service application data 18 for display based upon the determined customer type and rules 22.
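
As a rough sketch of the comparison step, assuming reference facial characteristic data 44 is reduced to small numeric feature vectors (a stand-in for a real face-analysis model, which the disclosure does not specify), classification might amount to a nearest-neighbor match:

    import math
    from typing import Dict, List

    # Hypothetical reference facial characteristic data 44: each customer type
    # is represented as a small feature vector; the numbers are invented.
    REFERENCE_FEATURES: Dict[str, List[float]] = {
        "adult_female": [0.2, 0.7, 0.4],
        "adult_male": [0.8, 0.6, 0.3],
        "senior": [0.5, 0.2, 0.9],
        "juvenile": [0.3, 0.9, 0.1],
    }

    def classify_customer(image_features: List[float]) -> str:
        """Return the customer type whose reference vector is closest (Euclidean)."""
        def distance(a: List[float], b: List[float]) -> float:
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        return min(REFERENCE_FEATURES,
                   key=lambda ctype: distance(image_features, REFERENCE_FEATURES[ctype]))

    # Example: features extracted from captured image data 42 (values made up)
    print(classify_customer([0.45, 0.25, 0.85]))  # -> "senior"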

For example, image processing application 40 may determine gender, approximate age, ethnicity, group status, presence of children, and other anonymous characteristics associated with a customer. Self-service application 14 uses this classification information to choose and tailor any of different templates, workflows, menu options, promotions, suggestive sell options, instructions, color contrasts and schemes, font sizes, audio prompts and levels, avatars, voices, and/or other user interface features from self-service application data 18 based upon rules 22.

As another example, image processing application 40 may capture image data tracking a customer's journey from the time the customer enters the field of view until the customer presses the first button on self-service kiosk 30. Self-service application 14 selects self-service application data 18 for display based upon the determined journey and rules 22.

Image processing application 40 may calculate actual customer queuing time along the journey. Self-service application 14 may track the following times: 1) “customer ordering time” (e.g., the time from a first screen touch until a “Pay Now” touch), 2) “customer payment time” (e.g., the time from a “Pay Now” touch until a receipt is printed), 3) “customer wait for food time” (e.g., the time from when the receipt is printed until an attendant or other employee first scans a printed order barcode indicating that the customer's order is ready for pickup), and 4) “customer leave counter area with food time” (e.g., the time from a first scanning of the order barcode until a second scanning of the order barcode indicating that the customer picked up the food order).
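
For illustration, assuming each of these events is timestamped (the event labels below are hypothetical), the four intervals might be computed as follows:

    from datetime import datetime

    def journey_times(events: dict) -> dict:
        """Compute the four tracked intervals (in seconds) from event timestamps."""
        return {
            "customer_ordering_time": (events["pay_now_touch"] - events["first_screen_touch"]).total_seconds(),
            "customer_payment_time": (events["receipt_printed"] - events["pay_now_touch"]).total_seconds(),
            "customer_wait_for_food_time": (events["first_barcode_scan"] - events["receipt_printed"]).total_seconds(),
            "customer_leave_counter_time": (events["second_barcode_scan"] - events["first_barcode_scan"]).total_seconds(),
        }

    # Example with made-up timestamps
    times = journey_times({
        "first_screen_touch": datetime(2009, 9, 29, 12, 0, 0),
        "pay_now_touch": datetime(2009, 9, 29, 12, 1, 30),
        "receipt_printed": datetime(2009, 9, 29, 12, 1, 45),
        "first_barcode_scan": datetime(2009, 9, 29, 12, 5, 0),
        "second_barcode_scan": datetime(2009, 9, 29, 12, 5, 40),
    })
    print(times)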

As yet another example, image processing application 40 may determine customer queue length, a total number of customers waiting to place orders using self-service kiosk 30, and a percentage of queuing area currently being occupied by customers waiting to use self-service kiosk 30.

The self-service ordering interface presented by self-service application 14 is dynamic depending on the state of the customer queues. An example rule 22 limits the number of upselling prompts and/or displays a different opening menu containing only top-selling items if the customer queue is greater than about eight customers and/or the queuing area in front of self-service kiosks 30 is more than 50% occupied. The objective is to improve service levels and to speed up the ordering process when the customer queue reaches a pre-defined critical level, reducing the likelihood of customers leaving the queue or the restaurant.
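
A minimal sketch of this example rule, using the thresholds from the text (about eight customers, 50% of the queuing area); the prompt cap and menu names are illustrative assumptions:

    def opening_menu_config(queue_length: int, area_occupied_pct: float) -> dict:
        """Simplify the ordering interface when the queue reaches the critical level."""
        busy = queue_length > 8 or area_occupied_pct > 50.0
        return {
            "menu": "top_sellers_only" if busy else "full_menu",
            "max_upsell_prompts": 0 if busy else 3,  # cap when not busy is an assumption
        }

    print(opening_menu_config(queue_length=10, area_occupied_pct=40.0))
    # {'menu': 'top_sellers_only', 'max_upsell_prompts': 0}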

Other environment information 20 is also envisioned. For example, real-time weather information for a given restaurant's location may dynamically influence the suggestive selling prompts shown to the customer by self-service application 14. During hot weather, self-service application 14 may offer ice cream, milkshakes, iced coffee, freeze pops, and other food items that customer purchase histories have shown are popular during hot weather. During cold weather, self-service application 14 may offer coffee, tea, chili, and other food items that customer purchase histories have shown are popular during cold weather.
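
As an illustrative sketch, assuming the weather feed is reduced to an outside temperature and that the temperature thresholds below are made up, the weather-driven suggestions might look like this:

    # Hypothetical mapping from weather conditions to suggestive-sell items,
    # of the kind that customer purchase histories might produce.
    SUGGESTIONS_BY_WEATHER = {
        "hot": ["ice cream", "milkshake", "iced coffee", "freeze pop"],
        "cold": ["coffee", "tea", "chili"],
    }

    def suggestive_sell_items(temperature_f: float) -> list:
        """Pick suggestive-sell items based on real-time outside temperature."""
        if temperature_f >= 85:   # threshold is an assumption
            return SUGGESTIONS_BY_WEATHER["hot"]
        if temperature_f <= 50:   # threshold is an assumption
            return SUGGESTIONS_BY_WEATHER["cold"]
        return []

    print(suggestive_sell_items(92.0))  # -> hot-weather items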

Attendant computer 32 and self-service kiosk 30 may be located in close proximity to one another so that an attendant may see and verbally interact with a customer.

Alternatively, attendant computer 32 and self-service kiosk 30 may be located separately from each other. For example, self-service kiosk 30 may be located in a drive-through lane or in a play area.

Self-service kiosk 30 allows a self-service customer to perform a transaction with or without assistance from an attendant at attendant computer 32. Self-service application 14 displays screens from self-service application data 18, which includes images of food items available for selection by a customer and food selections already made by the customer.

Self-service kiosk 30 additionally executes sharing application 24, which sends information to attendant computer 32, including the identity of a currently displayed screen and any selections made by a customer on that screen. Sharing application 24 further receives selections made by an attendant at attendant computer 32.

Self-service application 14 stores the selections as updates to self-service application data 18. Thus, sharing application 24 ensures that locally stored self-service application data 18 on self-service kiosk 30 is synchronized with self-service application data 18 stored by attendant computer 32. A customer at self-service kiosk 30 sees the same display information that is displayed by attendant computer 32. The customer can make selections and the attendant can watch the customer selections in real time as they are performed.
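
A simplified sketch of this mirroring, assuming hypothetical SharedTransactionState and SharingEndpoint names; the disclosure describes only that the screen identity and the selections are exchanged and stored on both sides:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SharedTransactionState:
        """State mirrored between the kiosk and the attendant computer."""
        current_screen: str = "welcome"
        selections: List[str] = field(default_factory=list)

    class SharingEndpoint:
        """One side of the sharing application (kiosk or attendant computer)."""
        def __init__(self) -> None:
            self.state = SharedTransactionState()
            self.peers: List["SharingEndpoint"] = []

        def link(self, peer: "SharingEndpoint") -> None:
            self.peers.append(peer)
            peer.peers.append(self)

        def apply_update(self, screen: str, selection: str) -> None:
            """Record a local update and push it to every linked peer."""
            self.state.current_screen = screen
            self.state.selections.append(selection)
            for peer in self.peers:
                peer.state.current_screen = screen
                peer.state.selections.append(selection)

    # Example: a customer selection made on the kiosk appears on the attendant side
    kiosk, attendant = SharingEndpoint(), SharingEndpoint()
    kiosk.link(attendant)
    kiosk.apply_update(screen="burgers", selection="Cheeseburger")
    assert attendant.state.selections == ["Cheeseburger"]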

Attendant computer 32 is coupled to one or more self-service kiosks 30. Attendant computer 32 allows an attendant to interact with a customer at any of self-service kiosks 30 during a transaction. Attendant computer 32 executes attendant application 28, which accesses local or remote copies of self-service application data 18 associated with each of the self-service kiosks 30 to obtain data defining a screen currently displayed by one or more self-service kiosks 30.

Attendant computer 32 further executes sharing application 24, which receives the identity of the screen currently displayed by self-service application 14, and any selections made by a customer on that screen.

Attendant application 28 stores the screen identity information and the selections in the locally stored copy of self-service application data 18. Thus, sharing application 24 ensures that locally stored self-service application data 18 on attendant computer 32 is synchronized with self-service application data 18 stored on self-service kiosk 30. An attendant at attendant computer 32 sees the same display information that is displayed by self-service kiosk 30. The attendant can make selections on behalf of the customer and the customer can watch the attendant selections as they are performed.

Attendant application 28 and self-service application 14 may optionally hand off payment processing to other transaction software on an in-store computer.

Although particular reference has been made to certain embodiments, variations and modifications are also envisioned within the spirit and scope of the following claims.

Claims

1. A self-service computer comprising:

a processor for receiving environment information, for selecting among different user interface features including different transaction screens and different transaction screen content based upon the environment information, and for providing selected user interface features during a transaction involving a customer.

2. The self-service computer of claim 1, wherein the processor also selects the user interface features based upon rules.

3. The self-service computer of claim 2, wherein the rules are based upon previously extracted information about customers.

4. The self-service computer of claim 1, wherein the environment information comprises customer type information.

5. The self-service computer of claim 4, wherein the customer type information comprises gender.

6. The self-service computer of claim 4, wherein the customer type information comprises age.

7. The self-service computer of claim 1, wherein the environment information comprises queue length.

8. The self-service computer of claim 1, wherein the environment information comprises queuing time.

9. The self-service computer of claim 1, wherein the environment information comprises weather conditions.

10. The self-service computer of claim 1, wherein the different user interface features comprise different templates.

11. The self-service computer of claim 1, wherein the different user interface features comprise different fonts and font sizes.

12. The self-service computer of claim 1, wherein the different user interface features comprise different colors and contrast.

13. The self-service computer of claim 1, wherein the different user interface features comprise different volume levels.

14. The self-service computer of claim 1, wherein the different user interface features comprise different avatars.

15. The self-service computer of claim 1, wherein the different user interface features comprise different transaction workflows.

16. A self-service computer comprising:

a camera for capturing images of a location in front of the self-service computer containing at least one customer; and
a processor for extracting environment information from the images including information about the one customer, for classifying the one customer into a customer type out of a plurality of different customer types, for selecting user interface features among a plurality of different user interface features including different transaction screens and different transaction screen content based upon the customer type, and for providing selected user interface features during a transaction involving the one customer.

17. A self-service method comprising:

receiving environment information by a self-service computer;
selecting among different user interface features including different transaction screens and different transaction screen content based upon the environment information by the self-service computer; and
providing selected user interface features during a transaction involving a customer by the self-service computer.
Patent History
Publication number: 20110078637
Type: Application
Filed: Sep 29, 2009
Publication Date: Mar 31, 2011
Patent Grant number: 9310880
Inventors: Michael Thomas Inderrieden (Lawrenceville, GA), Jennie Psihogios Johnson (Suwanee, GA), Nathaniel Christopher Herwig (Lawrenceville, GA)
Application Number: 12/569,283
Classifications
Current U.S. Class: Miscellaneous Customization Or Adaptation (715/866); 705/10
International Classification: G06F 3/00 (20060101); G06Q 10/00 (20060101);