ANALYZING SURVEY RESULTS
Systems and methods for analyzing results of an automated survey are disclosed herein. According to some implementations, a computer-implemented method comprises receiving survey result information, where the survey result information includes information extracted from an automated survey offered to a survey recipient. The computer-implemented method also comprises performing an analysis of the survey result information and determining if the analysis of the survey result information warrants one or more follow-up actions with a customer.
This application is a continuation of U.S. patent application Ser. No. 12/722,463, filed Mar. 11, 2010, which claims the benefit of U.S. Provisional Application No. 61/266,599, filed Dec. 4, 2009, the entire disclosures of which are hereby incorporated by reference herein.
This application is related to co-pending U.S. patent application Ser. No. 12/722,455, filed Mar. 11, 2010, and titled, “Triggering and Conducting an Automated Survey,” the entire disclosure of which is hereby incorporated by reference herein.
This application is also related to co-pending U.S. patent application Ser. No. 12/722,474, filed Mar. 11, 2010, and titled, “Performing Follow-up Actions Based on Survey Results,” the entire disclosure of which is hereby incorporated by reference herein.
TECHNICAL FIELD
The present disclosure generally relates to surveys, and more particularly relates to survey automation.
BACKGROUND
Businesses often use surveys to obtain feedback from customers. The survey responses can help a business understand the customer's level of satisfaction. Also, a business can use data from surveys to track patterns and trends in customer service. In response, the business can make changes as necessary in areas where improvements can be made. Businesses that keep operations running smoothly and focused on customer satisfaction typically have a better chance of long-term success.
SUMMARY
The present disclosure describes various systems and methods for analyzing results of an automated survey. According to some embodiments, a computer-readable medium may be encoded with computer-executable instructions, wherein the computer-executable instructions may include logic adapted to receive survey result information, the survey result information including information extracted from an automated survey that is offered to a survey recipient. The computer-executable instructions may also include logic adapted to perform an analysis of the survey result information and logic adapted to determine if the analysis of the survey result information warrants one or more follow-up actions with a customer.
According to some implementations, a computer-implemented method comprises receiving survey result information, where the survey result information includes information extracted from an automated survey offered to a survey recipient. The computer-implemented method also comprises performing an analysis of the survey result information and determining if the analysis of the survey result information warrants one or more follow-up actions with a customer.
Some implementations may include a survey result analysis system that comprises a processing device configured to execute a survey program and a memory device comprising a database and configured to store the survey program. The survey program may be configured to enable the processing device to retrieve survey result information from the database, the survey result information comprising information extracted from an automated survey offered to a survey recipient. The processing device may also be enabled to analyze the survey result information and determine if the survey result information warrants one or more follow-up actions with a customer.
Various implementations described in the present disclosure may include additional systems, methods, features, and advantages, which may not necessarily be expressly disclosed herein but will be apparent to one of ordinary skill in the art upon examination of the following detailed description and accompanying drawings. It is intended that all such systems, methods, features, and advantages be included within the present disclosure and protected by the accompanying claims.
The features and components of the following figures are illustrated to emphasize the general principles of the present disclosure. Corresponding features and components throughout the figures may be designated by matching reference characters for the sake of consistency and clarity.
The present disclosure describes systems and methods for conducting surveys in response to interactions between businesses and customers. Surveys may be created and utilized for obtaining feedback about products sold to customers and/or about services provided for the customers. Although various implementations of the present disclosure are described with respect to surveys conducted in response to a service, the survey systems and methods herein may also be configured to be conducted in response to products or other offerings by a company or business. In addition, various implementations herein describe many services as being delivery services, but it should be understood that the present disclosure also may include other types of services without departing from the principles described herein. Other features and advantages will be apparent to one of ordinary skill in the art upon consideration of the general principles described herein, and all such features and advantages are intended to be included in the present disclosure.
According to various implementations, the customer 12 may provide the business 10 with personal information, such as name, address, phone numbers, e-mail addresses, etc., which can be used for contacting the customer 12 to provide the intended services or for contacting the customer 12 as needed. Other ordering information may be exchanged or created, including special instructions for delivery, unpacking or assembly requests, and/or installation requests. Orders can usually be taken in any number of ways, including transactions in person, by phone, by mail, by e-mail, by the Internet, or by other ordering methods. The business 10 may provide some of this order information to the service group 14 in order that the service group 14 can perform the service properly. The order information can be provided by an automatic ordering system, by facsimile device, by e-mail, by phone, or in any other manner. The service group 14 may pick up products, as necessary, from the business's store, warehouse, supplier, etc., and deliver the products to one or more customers 12. In some embodiments, the customer 12 may schedule the service directly with the service group 14.
The service managers 26 may be field managers, regional managers, or local managers who manage one or more service providers 28, often in a particular region and/or for a specific client. The service manager may also manage one or more internal servicers 24. The service providers 28 manage a number of servicers 30, who may be employed by the service providers 28 or may be independent contractors. The servicer 30 may be the individual or team representing the service group 20 (or service group 14 shown in
The client systems 38 may represent any business, such as the businesses described with respect to
According to various embodiments of
After notification of service completion has been received, the automated survey system 36 waits for a short amount of time (e.g., to allow the customer to reflect upon the service received). After a configurable short delay, e.g., about 10 minutes, the automated survey system 36 launches an automated survey. In some implementations, the survey is conducted over the telephone using an IVR system, which is configured to call the customer's home telephone number using contact information obtained during the order process. The survey may be sent to the customer systems 42 using the PSTN or over other communication networks, such as an e-mail system, chat session, text message system, etc. In some cases, the customer may delegate another individual to interact with the servicers, such as if the customer wishes for a neighbor to handle the acceptance of the delivered items. In these cases, the survey recipient may be the neighbor, who may be in a better position to rate the delivery service.
In some implementations, the automated survey system 36 may include a processing system adapted to conduct the survey when the service is complete. The automated survey system 36 is further configured to analyze the results of the survey to determine if any follow-up actions with the customer are needed. For example, if the customer is dissatisfied with the service received, the customer can leave responses that can be analyzed for follow-up. In some situations, the customer may need immediate resolution, which the service group or client can provide through follow-up. Feedback may be received in the form of key strokes on a touch tone key pad of a telephone, voice messages left over the telephone, and/or by other communication means.
Some follow-up actions may involve a service manager, field manager, or other representative of the service group. The automated survey system 36 organizes the survey results in tables or charts to clearly communicate any issues that the customers may have. For example, if the customer indicates poor service, such as by providing low ratings on the survey or by explaining problems in a voice message, this information can be automatically or manually recorded and then provided directly to the service manager or other responsible person or team of the service group associated with the service group systems 40. In some cases, survey feedback can be directed to the client systems 38. In the case where follow-up actions may involve the client, the automated survey system 36 may send an automatic communication to the client systems 38 in order that the client can view the survey result information using a web-enabled browser via the Internet. Both the client and field managers of the service group can access survey result information and/or a digitized version of the voice message as needed to help resolve the customer's issues.
In some embodiments, each component of the automated survey system 36 as shown may include multiple components on multiple computer systems of a network. For example, the managed services 22 of the service group may comprise servers, such as application servers, file servers, database servers, web servers, etc., for performing various functions described herein. The servers of the automated survey system 36 may, for example, be physically separate servers or servers in a VMware ESX 4.0 virtual environment, among other implementations. In addition, the internal servicers 24, service managers 26, service providers 28, and/or servicers 30 may comprise laptop or desktop computer systems, which may form part of the automated survey system 36 and may be used for accessing the servers as needed.
The processing device 48 may be one or more general-purpose or specific-purpose processors or microcontrollers for controlling the operations and functions of the automated survey system 36. In some implementations, the processing device 48 may include a plurality of processors, computers, servers, or other processing elements for performing different functions within the automated survey system 36.
The memory device 50 may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units, each including a tangible storage medium. The various storage units may include any combination of volatile memory and non-volatile memory. For example, volatile memory may comprise random access memory (RAM), dynamic RAM (DRAM), etc. Non-volatile memory may comprise read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, etc. The storage units may be configured to store any combination of information, data, instructions, software code, etc. The order management program 52, survey program 54, and database 56 may be stored in one or more memory devices 50 and run on the same or different computer systems and/or servers.
The input/output devices 58 may include various input mechanisms and output mechanisms. For example, input mechanisms may include various data entry devices, such as keyboards, keypads, buttons, switches, touch pads, touch screens, cursor control devices, computer mice, stylus-receptive components, voice-activated mechanisms, microphones, cameras, infrared sensors, or other data entry devices. Output mechanisms may include various data output devices, such as computer monitors, display screens, touch screens, audio output devices, speakers, alarms, notification devices, lights, light emitting diodes, liquid crystal displays, printers, or other data output devices. The input/output devices 58 may also include interaction devices configured to receive input and provide output, such as dongles, touch screen devices, and other input/output devices, to enable input and/or output communication.
The interface devices 60 may include various devices for interfacing the automated survey system 36 with one or more types of communication systems, such as the communication networks 44. The interface devices 60 may include devices for communicating the automated survey from the automated survey system 36 to the customer systems 42. For example, when the survey is communicated via telephone, a telephone/voice interface device of the interface devices 60 can be used for controlling an IVR device and accessing a telephone network. Also, interface devices 60 may include various devices for interfacing with a data network, such as the Internet, to enable the communication of data. In some examples, the interface devices 60 may include Dialogic cards, Dialogic Diva softIP software, Envox, a voice over Internet protocol (VoIP) device, or other hardware or software interface elements.
The order management program 52 stored in the memory device 50 includes any suitable instructions for processing a customer order. For example, the order management program 52 may be Dispatch Office or other software for managing orders. In some implementations, the order management program 52 may include the capability of tracking deliveries. The order management program 52 may be omitted from the automated survey system 36 in some embodiments or placed in a separate processing system according to other embodiments.
The survey program 54, which is described in more detail below, includes instructions and templates for enabling a user to create an automated survey. The survey program 54 is also configured to detect a trigger event, such as the completion of a delivery service, and then launch the automated survey in response to the trigger. The survey program 54 also may automatically analyze the feedback from the survey recipient and enable a survey monitor person to review voice messages left by the survey recipient and enter notes, a summary, and/or a transcript of the voice message. When the analysis of the survey result information is made, the survey program 54 can determine if follow-up actions are warranted. For example, if a delivered product is damaged, the survey program 54 can communicate with the appropriate person or team that can resolve the issue. The survey program 54 utilizes, as needed, the database 56, which is configured to store order information, customer information, survey information, and other types of data and information. Other implementations may omit one or more of the functions disclosed herein.
The survey assembling module 62 is configured to record a survey script read by a professional speaker. The survey assembling module 62 can record the read script in digitized form in a wav file, vox file, and/or other audio file formats. A file naming convention can be used to help identify the properties of the survey scripts. For example, the file name may include an indication of the client, product, types of services, spoken language, store brand, and/or other information. When the scripts are recorded, the survey assembling module 62 enables a user to select different scripts to combine into a complete survey. In this respect, each script may be a single question, single statement, or other portion of an entire survey. The user may then arrange the selected scripts in a particular order. Also, the user is enabled to enter acceptable answers for each of the survey questions.
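The disclosure does not prescribe a particular file naming scheme or assembly format; the following Python sketch illustrates one way the naming convention and script ordering described above could work, with the field order and separator chosen purely for illustration:

```python
def script_filename(client, product, service_type, language, brand, ext="wav"):
    """Build an audio-script file name that encodes the script's
    properties (client, product, service type, spoken language, brand).
    The field order and underscore separator are illustrative."""
    parts = [client, product, service_type, language, brand]
    return "_".join(p.lower().replace(" ", "-") for p in parts) + "." + ext

def assemble_survey(script_names):
    """Arrange the user's selected script files into a complete survey:
    an ordered list of (position, filename) pairs for playback."""
    return list(enumerate(script_names, start=1))
```

For example, a delivery-survey greeting for a hypothetical client might be stored as `acmeco_sofa_delivery_en_storex.wav`, and a complete survey would simply be an ordered sequence of such files.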
The survey triggering module 64 detects when a trigger event occurs that warrants the conducting of a survey. For example, the trigger event may be the completion of a delivery service or other service. In some embodiments, the survey triggering module 64 may detect when an order case is closed or when the status of a customer's order has been closed or finished (e.g., when an order has been fulfilled and properly delivered). The survey triggering module 64 may detect the order status using a polling process in which the database 56 is polled. The polling process may be operated on a periodic schedule, e.g., about every 10 minutes. When the order case is detected as being closed, the survey triggering module 64 may create a new survey case to indicate that a survey is to be launched. According to some embodiments, the survey triggering module 64 may detect when a survey record has been created automatically or manually in the database 56.
In some embodiments, the survey triggering module 64 may be configured to receive indications when trigger events occur that warrant the initiation of surveys. For example, when a service is complete, the servicer may use a handheld device that prompts the servicer to provide input when the service job is finished. The handheld device may transmit a wireless signal to the automated survey system 36 via the interface devices 60 and this signal may be forwarded to the survey triggering module 64. Some embodiments may also include a purchased product (e.g., a mobile phone, smart phone, cable service, etc.) that may be configured to automatically communicate notification of a trigger event (e.g., installation, registration, initiation of phone service, etc.) to the survey triggering module 64. Other trigger events and other means of communicating a notification of the trigger events to the survey triggering module 64 may be used according to the particular design.
When the survey triggering module 64 determines that an authentic trigger event has occurred, the survey triggering module 64 may then set a flag stored in the memory device 50 or provide some other type of indication that the service job is complete (or other trigger event has occurred) and that the status of a new survey case associated with that service job is now opened. In some implementations, the survey triggering module 64 may enter the time that the trigger signal was received in order to allow multiple service jobs to be recorded chronologically according to completion time.
The survey triggering module 64 may also be configured to perform a polling process in which the database 56 is polled to determine which entries were recorded over a past predetermined time period. For example, if surveys are to be initiated every ten minutes, the polling process can determine which service jobs were completed in the last ten minutes. The survey triggering module 64 places the polled service jobs in the survey scheduling queue 86 in the order in which the service jobs were completed. The order in which the automated surveys are conducted is based in part on the list in the survey scheduling queue 86.
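The polling-and-queueing behavior described above can be sketched in Python as follows. The row fields (`status`, `closed_at`, `id`) and the in-memory queue are illustrative stand-ins for the database 56 and the survey scheduling queue; the disclosure does not specify an implementation:

```python
from collections import deque

def poll_closed_jobs(db_rows, window_start, window_end):
    """Return service jobs whose order case was closed within the
    polling window, sorted chronologically by completion time.
    db_rows stands in for entries in the database 56."""
    closed = [r for r in db_rows
              if r["status"] == "closed"
              and window_start < r["closed_at"] <= window_end]
    return sorted(closed, key=lambda r: r["closed_at"])

# Stand-in for the survey scheduling queue: surveys are conducted
# in the order the service jobs were completed.
survey_scheduling_queue = deque()

def enqueue_survey_cases(db_rows, window_start, window_end):
    """Open a new survey case for each job closed during the window."""
    for job in poll_closed_jobs(db_rows, window_start, window_end):
        survey_scheduling_queue.append({"job_id": job["id"],
                                        "survey_status": "open",
                                        "closed_at": job["closed_at"]})
```

Running the poll on a periodic schedule (e.g., about every ten minutes, as described above) keeps the queue populated in completion order.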
The survey triggering module 64 may also be configured to wait a predetermined amount of time before triggering the launch of the survey. The reason for the delay is to allow the customer to have time to observe the delivered product and try running it, for example, to determine if there are any defects. Also, the delay permits time for the servicer to leave the vicinity of the customer's residence to allow the customer to provide unbiased responses to the survey questions. When the predetermined lag time has elapsed, the survey triggering module 64 instructs the survey conducting module 66 to launch the survey.
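The lag check described above reduces to a simple time comparison; a minimal sketch, assuming timestamps are expressed in seconds and a configurable default lag of about ten minutes:

```python
def ready_to_launch(job_closed_at, now, lag_seconds=600):
    """True once the configured lag has elapsed since job completion,
    giving the customer time to inspect the delivered product and the
    servicer time to leave before the survey call is placed."""
    return (now - job_closed_at) >= lag_seconds
```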
In response to a trigger to launch, the survey conducting module 66 is configured to retrieve the appropriate survey script for the particular client, brand, product, service, customer, order, or other criteria. Also, the survey conducting module 66 retrieves the customer contact information, such as a home telephone number or mobile phone number. The survey conducting module 66 may be configured to control the IVR device to dial the customer's number and begin playing the survey scripts when the customer answers the phone. In some embodiments, other methods of contacting the customer may be used.
The survey conducting module 66 is also configured to capture the touch tone entries from the customer's telephone in response to the survey questions. Customer input can also be captured by the survey conducting module 66 using other input techniques, such as by e-mail, web-based inputs, spoken answers, etc. The survey conducting module 66 also gives the customer an option to leave a voice message, if desired. When a voice message is left, the survey conducting module 66 may also record the message in digital form. In some embodiments, the survey conducting module 66 may also be configured to give the customer the option of speaking with a live operator. If the customer wishes to speak with an operator, the survey conducting module 66 may redirect the call to an operator associated with the service group. The survey conducting module 66 may also be configured to give the customer the option to leave a message using text, such as typing a message in an e-mail, typing a message in a text message, typing a message on a smart phone, using a chat session, or other means of leaving a non-voice message.
When the survey is finished, the survey result information and voice messages can be analyzed to determine the customer's satisfaction with the service received. Some analysis of this information may be done automatically, while other analysis may require human involvement.
The automated survey result analyzing module 68 is configured to automatically analyze the feedback from the customer when the survey is completed. For example, the survey may include any number of questions, any of which may require numeric answers, such as answers on a numeric scale from 1 to 5, where 1 represents “completely dissatisfied” and 5 represents “completely satisfied.” Other scales can be used according to the particular design. The automated survey result analyzing module 68, according to some implementations, may be configured to calculate a score of the survey recipient's numeric answers.
All the scores on the five-point scale can be averaged together to determine an overall score for the survey. The automated survey result analyzing module 68 may be configured to use the overall score to determine if it is below a threshold that indicates that the customer was generally dissatisfied with the service. With a low average score, such as if the score is below 3.0 on a scale from 1 to 5, the automated survey result analyzing module 68 may set a flag to indicate that follow-up is warranted. Thresholds other than 3.0 may also be used according to the client's wishes or based on other factors. In some embodiments, the automated survey result analyzing module 68 may be configured to automatically send an e-mail or communicate in another manner to the field manager (or others) for follow up. The field manager may then respond by calling the customer to try to resolve any issues.
According to some embodiments, the automated survey result analyzing module 68 may detect if one or more answers indicate the lowest level of satisfaction on the part of the customer. In this case, the automated survey result analyzing module 68 may set the flag indicating the need for follow-up. Also, an automatic e-mail may be sent to the field manager (or others). The automated survey result analyzing module 68 may be configured to analyze the feedback from the survey in any suitable manner to determine if follow-up actions are warranted.
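The averaging, threshold comparison, and lowest-score checks described above can be captured in a short sketch. The 3.0 threshold and five-point scale follow the example in the text; the function name and return shape are illustrative:

```python
def analyze_survey_scores(answers, threshold=3.0, lowest=1):
    """Average the numeric answers and flag the survey for follow-up
    if the overall score falls below the threshold, or if any single
    answer indicates the lowest level of satisfaction."""
    overall = sum(answers) / len(answers)
    needs_follow_up = overall < threshold or lowest in answers
    return round(overall, 2), needs_follow_up
```

A generally satisfied customer (e.g., answers of 5, 5, 4) produces no flag, while a low average or even one lowest-level answer triggers follow-up, matching the two conditions described above.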
The survey result monitoring module 70 may be a web-based tool that can be accessed by a human operator (e.g., a survey monitor, service manager, field manager, or other authorized personnel of the service group). The survey result monitoring module 70 may provide a user interface enabling the user to access the survey result information, analyzed results from the automated survey result analyzing module 68, digitized voice messages, and/or other information. According to various implementations of the present disclosure, the survey result monitoring module 70 may enable the user to access and listen to the voice messages, enter a transcript of the voice message, enter a summary of the voice message, append notes to the survey result information, select one or more predefined classifications of customer issues, and/or select or recommend one or more follow-up actions. When follow-up actions are selected or recommended, the survey result monitoring module 70 can open a follow-up case for the purpose of monitoring the status of follow-up actions taken until the customer issues are resolved. As used herein, opening cases is understood to include the creation of one or more database records. In some embodiments, survey cases and follow-up cases for the same service may be monitored simultaneously. The survey result monitoring module 70 may provide a link or hyperlink to the survey information and/or voice messages. The input received from the user via the user interface can be stored along with the other information of the survey record and/or follow-up record.
The survey follow-up module 72 may be configured to track the follow-up actions that are taken to resolve customer issues. The survey follow-up module 72 may record and organize information related to the status of the follow-up case, such as, for example, the age of the follow-up case from the start of an opened follow-up case to the present. The survey follow-up module 72 enables access to this information and allows the user to use a searching tool associated with the survey follow-up module 72 to search for specific groups of follow-up cases, based on any factors, such as client, age, region, etc.
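The case-age tracking and multi-factor search described above might look like the following sketch, where the case fields (`client`, `region`, etc.) are illustrative rather than specified by the disclosure:

```python
from datetime import date

def case_age_days(opened_on, today):
    """Age of a follow-up case, from the day it was opened to the present."""
    return (today - opened_on).days

def search_follow_up_cases(cases, **criteria):
    """Filter follow-up cases on any combination of factors, such as
    client, age, or region, mirroring the searching tool described above."""
    return [c for c in cases
            if all(c.get(k) == v for k, v in criteria.items())]
```

For example, a service manager could request all open cases for a given client in a given region, or sort the results by `case_age_days` to surface the oldest unresolved issues first.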
When analysis of the survey result information has been done, a follow-up case can be opened if necessary. If the survey is flagged as needing follow-up, the survey follow-up module 72 is configured to initiate follow-up actions. For example, if the survey feedback contains certain scores or marks that fit the specified criteria for needing follow-up, the survey follow-up module 72 may automatically send an e-mail to the field manager responsible for that servicer or service team. In this way, the field manager is informed that follow-up is needed and is incentivized to act quickly to resolve the issues. Along with the e-mail, the survey follow-up module 72 can also transmit the survey result information and recorded voice messages and/or links to the information and voice messages. In some cases, the issues may require the involvement of the client. Depending on how the client decides to establish follow-up routines, the survey follow-up module 72 may communicate information to the client directly or to both the client and the field manager.
The survey follow-up module 72 may be configured to determine the age of a follow-up case and track the progress being made to resolve the issues. The survey follow-up module 72 may be monitored by the survey monitor person to determine if certain issues need to be revisited. The survey follow-up module 72 may enable the transmission or re-transmission of an e-mail as a reminder as necessary to notify the field manager or other responsible party for resolving an older issue. The reminder can be sent automatically by the survey follow-up module 72 based on predetermined conditions. In some embodiments, the survey follow-up module 72 may be further configured to calculate incentive payments based in part on survey scores, survey result information, compliments, or other information that is received with respect to the performance by a servicer or service team. Also, the survey follow-up module 72 may calculate bonuses for managers based on survey result numbers. In this respect, the servicers and managers can receive bonus compensation for high quality customer service.
The survey result reporting module 74 may be configured to send reports to one or more clients to inform them of the survey result information, types of issues encountered, overall scores, or other information or data. The reports may be sent automatically to the clients based in part on the client's preferences. Some reports may be communicated daily, monthly, quarterly, or for any time period. The survey result reporting module 74 may be configured to communicate with different groups of people who may be responsible for different aspects of a particular service. For example, when the results of surveys indicate defective products from a client, the survey result reporting module 74 may be configured to send a notice to an individual or department about the defective products.
The survey program 54 of the present disclosure may be implemented in hardware, software, firmware, or any combinations thereof. In the disclosed embodiments, the survey program 54 may be implemented in software or firmware that is stored on a memory device and that is executable by a suitable instruction execution system. The survey program 54 may be implemented as one or more computer programs stored on different memory devices or different computer systems of a network. If implemented in hardware, the survey program 54 may be implemented using discrete logic circuitry, an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), or any combinations thereof.
The order information 78 may include the store name, product purchases, type of services to be provided, date and time of order, etc. The customer information 80 may include the customer's name, mailing address, billing address, delivery address, telephone and mobile phone numbers, e-mail addresses, preferred means of contact, etc. The service information 82 (e.g., when related to a delivery service) may include the product ordered, shipping identification information of the product, the delivery driver, the carrier, the servicer, the promised delivery time, the actual arrival time, status of delivery, etc.
The survey scripts 84 may include digitized voice scripts of portions of one or more surveys, complete surveys, or other survey information. The survey scheduling queue 86 is a queue for recording the time when survey cases are open, a sequence of surveys to be conducted, etc. The survey result information 88 may include the results, feedback, responses, etc., provided by the customer during the survey. The survey result information 88 may also include results of the analysis by the automated survey result analyzing module 68, such as overall scores. The voice messages 90 may include digitized voice messages recorded during the survey. The voice messages 90 may be stored as files (e.g., on a separate file server) that may be accessed by hyperlinks via the network. The survey follow-up action information 91 may include a record of a classification of customer issues that warrant follow-up actions in addition to a record of follow-up actions to be taken to resolve the customer issues.
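One illustrative shape for a survey result entry combining the data described above (customer responses, an overall score from the analyzing module, a link to any voice message, and follow-up actions) is sketched below; all field names are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SurveyResultRecord:
    """Illustrative survey result entry (field names are hypothetical)."""
    order_id: str
    answers: List[int]                       # numeric responses, e.g. 1-5
    overall_score: Optional[float] = None    # filled in by analysis
    voice_message_url: Optional[str] = None  # hyperlink to stored audio
    follow_up_actions: List[str] = field(default_factory=list)
```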
Service Interaction 94 is the process in which a service of any kind is performed for the customer. For example, the service may be a delivery of goods or packages, building and/or installing a product, maintenance, repair, improvement, communication with a service manager or customer service representative, a product registration process, or other services. When the service is complete, it may be advantageous for the client or service group to conduct a survey to collect information about the customer's satisfaction with the service. The collected information can be used to help the service group improve the quality of its services.
When the status of the service case has changed due to the completion of the service job, a survey may be triggered. This is indicated by block 96. One way in which the survey is triggered may include a servicer calling into an IVR device indicating that the job is complete or closed. Another way of triggering a survey may include the servicer using a handheld device to close the job and the handheld device being configured to send a trigger signal to the automated survey system 36. Another way may include the servicer calling a support center to close the job using a landline telephone or mobile phone. When the job is recorded as being closed, the closed status may be detected in the database by a program that creates a survey call record that initiates the deployment of the survey.
After receiving notification of the Trigger Event 96, an Automated Survey may be conducted. The survey may be conducted automatically via a phone call to the customer using an IVR device, e-mail, chat, or other means of communication. The automated survey may include pre-recorded questions and may respond to the answers captured by a numeric keypad, an alphanumeric keyboard, touch screen device, or other data entry device on the customer's telephone, mobile phone, computer, or other device. Responses may be received via telephone, in a return e-mail or chat session, or by other digital entry device. Responses to survey questions may also be in the form of voice messages received via telephone, VoIP, or other voice recording device or system. In some embodiments, the customer may be given the option to wait for live customer care if desired. Also, an option may be given to allow the customer to enter a message other than a voice message, such as, for example, a text message, e-mail message, or other textual based message. According to some implementations, the survey may be started within about ten minutes of the trigger event and completed within about two minutes.
When the survey results are received, the automated survey system is configured to analyze the results. This analysis can be done automatically by the processing system and/or manually by a survey monitor person. The automated analysis may include analysis of the customer data, product data, survey responses, and/or other information. The survey responses may be collected using finite answers, such as an answer 1, 2, 3, 4, or 5 for a ranking in response to a specific survey question. In addition, the survey response may include a voice message, which can be manually analyzed and entered according to certain defined classifications.
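By way of illustration only, the automated portion of this analysis might be sketched as follows. The function name and the 1-to-5 answer scale are assumptions drawn from this disclosure for the purpose of a minimal sketch, not a required implementation:

```python
def average_survey_score(numeric_answers):
    """Average the survey recipient's finite numeric answers
    (e.g., rankings of 1-5) into an overall survey score."""
    if not numeric_answers:
        raise ValueError("no numeric answers to analyze")
    return sum(numeric_answers) / len(numeric_answers)
```

Voice-message responses, by contrast, would be analyzed manually and entered according to defined classifications, as described above.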
In many cases, the results of a survey do not require follow-up with the customer and these survey cases can be closed. However, in some cases, the customer may enter certain responses or leave a voice message that prompts the automated survey system to begin a follow-up process to resolve any issues that the customer may have. When the answers are analyzed, either automatically or manually, the issues may be identified. When these exceptions are identified, a follow-up process is opened to ensure that the issues are treated sensitively. The follow-up may include inquiries to gather additional information from the customer, if needed. Countermeasures may be followed as needed to resolve the issues.
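The determination of whether follow-up is warranted might be sketched as follows. This is a non-limiting illustration: the thresholds (an average score below 3.0, or any single answer of 1) are drawn from the examples in the claims below, and the function name is hypothetical:

```python
def follow_up_warranted(numeric_answers, has_voice_message):
    """Open a follow-up case when the average score is below 3.0,
    when any single answer is a 1, or when the customer left a
    voice message that may describe an issue."""
    average = sum(numeric_answers) / len(numeric_answers)
    return average < 3.0 or 1 in numeric_answers or has_voice_message
```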
Follow-up actions may be acted upon internally within the service group or, if necessary, reported to client management and/or client teams. Information from the analysis and the follow-up may be collected and reported to internal teams for future use, such as performance management, improving processes, services, and products, tracking costs and issues, billing, etc. Reports may include hyperlinks to the voice messages for easy access and review.
The closing of the service case, as illustrated in
When the survey case is closed, a follow-up case is opened to determine if follow-up to the survey is needed. Any issues fed back by the customer are analyzed to determine if follow-up actions are needed. If so, the appropriate people are contacted in order to resolve the issues. When the issues are resolved, the follow-up case is closed.
The user interface 118 also includes an add button 132, enabling the user to add a selected question or statement to the survey. A delete button 134 enables the user to delete one or more questions, and a save button 136 enables the user to save the survey when it is complete. The user interface 118 may also include a "sample playback" button allowing the user to listen to how the created survey might sound.
According to decision block 152, it is determined whether or not a periodic time for performing a polling function has arrived. For example, the polling function may be configured to operate every 10 or 15 minutes. If the proper time has not yet arrived, the flow path loops back to itself and block 152 is repeated until the time arrives. When it is time for polling, the database is polled to detect new survey records, as indicated in block 154. Block 156 indicates that the method includes conducting an automated survey. The order that the automated surveys are launched may be based in part on the sequence of survey records in the survey scheduling queue. The process of conducting the automated survey is described in more detail below. As indicated in block 158, survey result information is received. The survey result information may be choices entered by the survey recipient, voice messages, or other useful data.
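Once polling detects new survey records, the launch order described above might be sketched as follows. This is a minimal sketch assuming a simple first-in, first-out scheduling queue; the function and parameter names are hypothetical:

```python
from collections import deque

def run_scheduled_surveys(survey_queue, conduct_survey):
    """Launch automated surveys in the order they appear in the
    survey scheduling queue, collecting each survey's result
    information (blocks 156 and 158)."""
    results = []
    while survey_queue:
        record = survey_queue.popleft()  # oldest queued record first
        results.append(conduct_survey(record))
    return results
```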
According to decision block 166, it is determined whether or not the survey recipient is on a do-not-call list. If so, the method skips ahead to block 168, which indicates that the survey case is closed with a status of “no contact made—DNC.” If the survey recipient is not on the do-not-call list, the method flows to block 170, which indicates that an attempt is made to contact the survey recipient. According to decision block 172, it is determined whether or not contact is made with the survey recipient. If not, then the flow proceeds to decision block 184. If contact is made, the flow proceeds to block 174, which indicates that the automated survey is launched and responses by the survey recipient are captured.
During the automated survey, the survey recipient is given the option to speak with a live operator. If it is determined in decision block 176 that the survey recipient requests to speak to someone live, then the flow branches to block 178. As indicated in block 178, the survey recipient is connected with an operator, such as a customer service agent, for the completion of the survey. When the live survey is completed, the survey analysis status is set to “ready” as indicated in block 180. If in block 176 it is determined that the survey recipient does not wish to talk with a live operator, the flow proceeds to decision block 182. According to block 182, it is determined whether or not the survey was completed successfully. If so, the flow proceeds to block 180 to set the survey analysis status to “ready.” If the survey did not complete successfully, as determined in block 182, flow proceeds to decision block 184.
Block 184 is reached when the survey recipient could not be contacted (decision block 172) or when the survey was not completed successfully (decision block 182). At this point, it is determined whether or not the number of contact attempts is equal to a predetermined threshold. If the number of contact attempts is determined to be equal to the threshold, flow proceeds from block 184 to block 186 and the survey is closed with the status of "no contact made." If not, then the method goes to block 188, in which the survey is rescheduled for another attempt, and the flow then proceeds back to block 170.
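The contact flow of blocks 166 through 188 might be sketched as follows. This is an illustrative sketch only; the function names, the callable parameters, and the default attempt threshold are assumptions, and the returned strings mirror the statuses described above:

```python
def conduct_survey_call(recipient, do_not_call_list, try_contact, run_survey,
                        max_attempts=3):
    """Honor the do-not-call list, then retry contact up to a
    predetermined threshold of attempts before closing the case."""
    if recipient in do_not_call_list:
        return "no contact made - DNC"            # block 168
    for _ in range(max_attempts):                 # blocks 170-188
        if try_contact(recipient) and run_survey(recipient):
            return "ready"                        # block 180
    return "no contact made"                      # block 186
```

Note that an attempt in which contact is made but the survey does not complete successfully still counts toward the threshold, consistent with the flow from decision block 182 to block 184.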
After compliments are handled, the flow proceeds to decision block 220, which indicates that a determination is made whether the survey score warrants one or more follow-up actions. If not, then the flow skips to block 212 and the follow-up case is closed. However, if follow-up is warranted, the method flows on to decision block 222, which determines whether involvement by a field manager is needed. If so, the survey result information (which may include any of the survey answers, survey scores, and voice messages) is made available to the field manager, according to block 224. When the survey result information is received, the field manager may be enabled to add or edit follow-up information, as indicated in block 225. For example, the field manager may log any follow-up actions taken to resolve the issues. The field manager may also set classifications of issues and set follow-up actions that were not previously recorded. The field manager may also be enabled to mark when the follow-up case is closed, e.g., when all the issues have been resolved. The method also includes checking if client involvement is needed, as indicated in decision block 226. If so, the flow is directed to block 228 and the survey result information is made available to the client. As indicated in block 229, the client is enabled to add and/or edit follow-up information. In some embodiments, the client's name may be logged in during the modification process. The types of follow-up information that can be modified in this method may be different for the field manager, client, and others who may be given access to the information and authority to change the information, depending on the particular design.
The information made available to the client may be different than that made available to the field manager, depending on the particular design. The field managers and clients, when given the information, may be responsible for contacting the customer, service group members, or others by any available communication devices in order to help resolve the issues. Decision block 230 indicates that it is determined whether or not any issues remain. This determination may be made by the field manager, who may set a flag, mark an item on a checklist, enter a summary, or other operation that may be detectable by the survey program 54. These indications can be analyzed to determine that the issues are resolved. If no issues remain, the flow goes to block 212 and the follow-up case is closed. If issues still remain, the flow loops back to block 220 to repeat follow-up actions until the issues can be resolved.
The flow diagrams of
The survey program 54, which comprises an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by any combination of instruction execution systems or devices, such as computer-based systems, processor-controlled systems, etc. The computer-readable medium may include one or more suitable physical media components configured to store the software, programs, or computer code for a measurable length of time. The computer-readable medium may be any medium configured to contain, store, communicate, propagate, or transport programs for execution by the instruction execution systems or devices.
Section 244 of the user interface 238 enables the user to check certain listed items to define the customer's issues and categorize them into classification categories. The list of issues included in section 244 may be customized for the client based on the client's needs, based on the particular service provided, based on the particular product being delivered, or based on any other factors. Some non-limiting examples of customer issue items listed in section 244 may include a scheduling issue, an incorrect phone number, an issue with the contract carrier, a delivery fee issue, a schedule notification issue, poor service at the store, a damaged product, the product missing items, the wrong product delivered, the wrong address, a store or client issue, a voice message compliment, or any other service issues. In some embodiments, the selection of at least one of the classification items can be required before a case is closed. By listening to the voice message, the user may be able to determine the classification of issues described audibly.
Section 246 includes a list of possible ways to resolve the issues marked in section 244. This list may also be customized for the particular client depending on various factors. Some non-limiting examples of resolution items listed in section 246 may include the issuing of a gift card to the customer, passing the information on to the store or client, leaving a voice message for the client, recording a voicemail summary, or other ways of reaching resolution. Other items may also include the closure of the follow-up case based on a failure to contact the customer, a representative speaking with the customer to resolve some issue, addressing the issue with the delivery team, or the customer misunderstanding the survey. The user interface 238 enables the user to check the appropriate boxes of section 246 as needed. The user interface 238 may display certain additional information fields depending on the selections made in section 246. For example, if the user selects "Passed to Store/Client", the user interface 238 may prompt the user to enter the name of the person to whom the survey result information is passed. According to another example, if the user selects "Issue Gift Card", the user interface 238 may prompt the user to enter the monetary amount of the gift card to be issued.
If a voice message is left, the user may listen to the message by clicking on the link 241 and then may enter a summary of the voice message in window 248. The window 248 can also be used to record steps that were taken by different people of the service group to resolve issues or any other notes that may be necessary for understanding the issues of the case. The summaries entered in window 248 are displayed in section 250 when inserted by the user. The Actions selected in section 246 are also automatically displayed adjacent section 250. If the follow-up case is to be closed, the user may check the box 252.
The user interface 256 also includes a link 257 allowing the user to respond to the survey recipient. Also, the user interface 256 includes a link 264, which allows the user to listen to a recording of the voice message left by the survey recipient. For example, the voice message may include any file format, such as a .wav file, a vox file, etc.
The table 282 includes rows of different entries arranged with columns for the profit center, the customer receiving the service (“ship to”), the job number, the time and date the follow-up case was opened (“reported at”), the deadline, the age of the follow-up case, whether a low score was received in the survey, whether a voice message link is available, the number of responses, whether the follow-up case has been closed, and a details link linking to the details of the survey. The table 282 may list the follow-up cases in a sequence from the oldest case to the newest, ordered according to the age column. The age column may work with a suitable clock or timing device to update the age of opened cases every six minutes (0.1 hours). The age may be used by the service team to give priority to older issues.
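The age-based ordering of the table 282 might be sketched as follows. The function name and the case-record fields are hypothetical illustrations; ages are expressed in hours rounded to 0.1 hours (six minutes), matching the update granularity described above:

```python
def prioritize_follow_up_cases(cases, now):
    """Order open follow-up cases oldest first and report each
    case's age in hours, rounded to six-minute (0.1 hour) steps."""
    ordered = sorted(cases, key=lambda case: case["reported_at"])
    return [(case["job"], round((now - case["reported_at"]) / 3600, 1))
            for case in ordered]
```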
The daily survey result table 306 may include numbers broken out by region. The columns of the daily survey result table 306 include the number of service orders (e.g., deliveries), the number of surveys completed, the percentages of customers completing the survey, and an average score goal. The daily survey result table 306 also may include the particular questions of the survey, such as whether the customer would desire to have the delivery team back, the appearance of the delivery team, on-time success, call ahead success, whether the delivery team properly tested and demonstrated the product, the professional courtesy of the team, and the overall average score.
The month-to-date results table 308 may include the same column divisions as the daily report but for the longer time period from the first of the month to the present. The response classification matrix table 310 includes columns for each of a number of specific customer issues. For example, the table 310 may include scheduling issues, incorrect phone number issues, contract carrier issues, notification of delivery time issues, damaged product issues, address issues, store/client issues, etc.
According to various implementations, the survey result reporting module 74 (
The service managers may analyze the events surrounding the service problems and determine if a particular servicer is at fault or responsible. If it is determined that a particular servicer is responsible for the service problem, such as for arriving outside the promised time window, failing to complete the service, and/or other problems, then the automated survey system 36 may be configured to automatically subtract the gift card amount from the servicer's pay. In this respect, the amount that the managed services 22 pays for gift card issuance is charged back to the servicer. However, if the problem is not caused by the servicer but is caused by other operators or systems, then the servicer is not held responsible.
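The chargeback described above might be sketched as follows. This is a minimal illustration under the assumption that fault has already been determined by a service manager; the function and parameter names are hypothetical:

```python
def settle_gift_card(gift_card_amount, servicer_pay, servicer_at_fault):
    """Charge the gift card amount back to the servicer's pay only
    when the servicer is found responsible for the service problem;
    otherwise the servicer's pay is unaffected."""
    if servicer_at_fault:
        return servicer_pay - gift_card_amount
    return servicer_pay
```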
Many advantages might be gained by a service business or other entity by the use of the survey network system 34 and particularly the automated survey system 36 and survey program 54. For example, one benefit might be the ability to provide rapid follow-up actions to customers that have issues. In some cases, it may be possible to resolve the customer's issue within two hours, which is a desirable service goal for a service company. By responding to issues quickly, the overall customer satisfaction level of a service group can be high.
Another advantage might be the aspect of performing the survey using an automated system as opposed to a survey conducted by a live operator. An automated system may allow the survey recipient to answer more truthfully and may also lead to a high survey participation rate. For example, many surveys have a participation rate of under 10%. However, with the automated survey system 36 described herein, a participation rate of about 40% or higher can be achieved. In this respect, the service company can obtain a larger sample of data that may better define the satisfaction level of the customers. Also, by conducting the survey at an advantageous time, which is controlled by the automated survey system 36, customers are more likely to take the survey and more likely to answer accurately, because the service experience might still be fresh in their minds.
One should note that conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more particular embodiments or that one or more particular embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
It should be emphasized that the above-described embodiments are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the present disclosure. Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included in which functions may not be included or executed at all, may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the present disclosure. Further, the scope of the present disclosure is intended to cover any and all combinations and sub-combinations of all elements, features, and aspects discussed above. All such modifications and variations are intended to be included herein within the scope of the present disclosure, and all possible claims to individual aspects or combinations of elements or steps are intended to be supported by the present disclosure.
Claims
1. A computer-readable medium encoded with computer-executable instructions, the computer-executable instructions comprising:
- logic adapted to receive survey result information, the survey result information comprising information extracted from an automated survey that is offered to a survey recipient;
- logic adapted to perform an analysis of the survey result information;
- logic adapted to determine if the analysis of the survey result information warrants one or more follow-up actions with a customer.
2. The computer-readable medium of claim 1, wherein the logic adapted to perform the analysis is further adapted to calculate an average score based in part on the survey recipient's numeric answers to survey questions in the automated survey.
3. The computer-readable medium of claim 1, wherein the logic adapted to perform the analysis further comprises:
- logic adapted to determine if a voice message has been received from the survey recipient;
- logic adapted to provide access to the voice message when it is determined that a voice message has been received, access being provided to an authorized operator; and
- logic enabling the authorized operator to enter a summary of the voice message.
4. The computer-readable medium of claim 3, wherein the logic adapted to perform the analysis further comprises:
- logic adapted to provide classification categories enabling the operator to classify one or more customer issues extracted from the voice message;
- wherein the classification categories include one or more categories selected from the group consisting of a scheduling issue, an incorrect phone number, a contract carrier issue, a delivery fee issue, an issue regarding a schedule notification, poor service at the store, a damaged product, a defective product, missing parts of the product, a wrong delivery address, a store issue, and a compliment.
5. A computer implemented method comprising:
- receiving survey result information, the survey result information comprising information extracted from an automated survey offered to a survey recipient;
- performing an analysis of the survey result information;
- determining if the analysis of the survey result information warrants one or more follow-up actions with a customer.
6. The computer implemented method of claim 5, wherein:
- performing the analysis further comprises calculating a score based in part on the survey recipient's answers to survey questions in the automated survey; and
- the survey recipient's answers are numeric answers.
7. The computer implemented method of claim 6, wherein the numeric answers range from 1 to 5, where an answer of 1 represents “completely dissatisfied” and an answer of 5 represents “completely satisfied.”
8. The computer implemented method of claim 7, wherein:
- the score is an average of the numeric answers; and determining if the analysis of the survey result information warrants one or more follow-up actions further comprises determining if the average is below 3.0.
9. The computer implemented method of claim 7, wherein determining if the analysis of the survey result information warrants one or more follow-up actions further comprises determining if one or more answers are a 1.
10. The computer implemented method of claim 5, wherein analyzing the survey result information comprises determining if a voice message has been received from the survey recipient.
11. The computer implemented method of claim 10, further comprising:
- providing access to the voice message when it is determined that a voice message has been received, access being provided to an authorized operator; and
- enabling the authorized operator to enter a summary of the voice message.
12. The computer implemented method of claim 11, further comprising:
- providing classification categories enabling the operator to classify one or more customer issues extracted from the voice message.
13. The computer implemented method of claim 10, further comprising:
- placing the voice message in a queue with other voice messages, the voice messages being ordered with respect to the time when the voice message was received.
14. The computer implemented method of claim 5, wherein the survey recipient and the customer are the same.
15. A survey result analysis system comprising:
- a processing device associated with a computing system, the processing device configured to execute a survey program; and
- a memory device in communication with the processing device, the memory device comprising a database and configured to store the survey program, the survey program configured to enable the processing device to: retrieve survey result information from the database, the survey result information comprising information extracted from an automated survey offered to a survey recipient; analyze the survey result information; and determine if the survey result information warrants one or more follow-up actions with a customer.
16. The survey result analysis system of claim 15, wherein the processor device is further enabled to calculate scores based in part on the survey recipient's numeric answers to survey questions in the automated survey, wherein acceptable numeric answers range from 1 to 5, where an answer of 1 represents “completely dissatisfied” and an answer of 5 represents “completely satisfied.”
17. The survey result analysis system of claim 16, wherein the processor device is further enabled to determine if one or more follow-up actions are warranted if the average of the numeric answers is below 3.0.
18. The survey result analysis system of claim 16, wherein the processor device is further enabled to determine if one or more follow-up actions are warranted if one or more numeric answers are a 1.
19. The survey result analysis system of claim 15, wherein the processor device is further enabled to retrieve any voice message left by the survey recipient and provide access to the voice message to a member of a service group, enabling the member to enter a summary of the voice message.
20. The survey result analysis system of claim 15, wherein the survey program provides classification categories in a user interface enabling the user to classify one or more customer issues extracted from the voice message, wherein the classification categories include one or more categories selected from the group consisting of a scheduling issue, an incorrect phone number, a contract carrier issue, a delivery fee issue, an issue regarding a schedule notification, poor service at the store, a damaged product, a defective product, missing parts of the product, a wrong delivery address, and a store issue.
Type: Application
Filed: Sep 26, 2011
Publication Date: Jan 19, 2012
Applicant: 3PD, INC. (Marietta, GA)
Inventors: Karl Meyer (Atlanta, GA), Jonathan Turner (Marietta, GA)
Application Number: 13/245,858
International Classification: G06Q 10/00 (20060101);