SYSTEM AND METHOD FOR MODE-NEUTRAL COMMUNICATIONS WITH A WIDGET-BASED COMMUNICATIONS METAPHOR
Disclosed herein are systems, methods, and non-transitory computer-readable storage media for managing communications mode neutrally using widgets. The method includes presenting via a graphical user interface (GUI) a set of connected graphical elements representing a communication session comprising at least two communicating users, wherein each graphical element representing a user further comprises at least one graphical sub-element indicating user communication details, receiving user input associated with the set of connected graphical elements, the user input having an action associated with the communication session, and performing the action based on the received user input. The graphical sub-elements can indicate a communication mode through which an associated user connects to the communication session and/or available communication modes for an associated user. The graphical sub-elements can include a telephone, mobile phone, instant message, camera, video camera, microphone, text-message, document, headset, or email icon.
The present application is a continuation of U.S. patent application Ser. No. 12/749,122, filed Mar. 29, 2010, which claims priority to U.S. Provisional Application No. 61/164,753, filed Mar. 30, 2009, which is incorporated herein by reference in its entirety.
This application is related to Attorney Docket Numbers 509022-US1 (application Ser. No. 12/749,028), 509022-US2 (application Ser. No. 12/749,058), 509022-US3 (application Ser. No. 12/749,094), 509022-US4 (application Ser. No. 12/749,123), 509022-US5 (application Ser. No. 12/749,150), 509048-US (application Ser. No. 12/749,178), and 509049-US (application Ser. No. 12/749,103), filed on Mar. 29, 2010, each of which is herein incorporated by reference.
BACKGROUND

1. Technical Field
The present disclosure relates to telecommunications and more specifically to displaying and managing communication sessions via a graphical user interface (GUI). Communication sessions can exist in a variety of modes such as telephone calls, instant messaging sessions, email sessions, video conference sessions, multimedia sessions, and the like.
2. Introduction
Touchtone telephones have been supplemented over the years by the addition of feature buttons and menus. Interfaces for these features have evolved from simple buttons to hierarchical menus actuated by trackballs, quadrant style pointers, and the like. As the number of features increases, the interfaces add more buttons, sequences, and/or combination of button presses. This proliferation of features has led to a multitude of different interfaces with varying levels of complexity. Often users resort to rote memorization of key features, but that is not always practical or desirable. Recently, smartphones with touch-sensitive displays have begun to provide similar functionality. However, the touch-sensitive displays in such devices typically reproduce the feature buttons and menus, albeit on a touch-sensitive display.
Further, users are migrating to other communication forms, such as text messaging, instant messaging, email, chat sessions, video conferencing, and so forth. Incorporating the ability to handle these modes of communication into a traditional telephone greatly increases the complexity and difficulty of the interface. What is needed in the art is a more intuitive communication management interface.
In order to describe the manner in which the above-recited and other advantages and features of the disclosure can be obtained, a more particular description of the principles briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.
The present disclosure addresses the need in the art for improved communication session management. A companion case, U.S. patent application Ser. No. 12/749,028, filed on Mar. 29, 2010 (Attorney Docket Number 509022US1; 069-0011US1), discloses a graphical interface which enables a user to set up a communication session with various users and tear down or remove users from a communication session. A system, method, and non-transitory computer-readable media are disclosed which in each respective embodiment relate to graphical user interfaces for managing various types of communication sessions quickly and efficiently based on a graphical user interface having communication-related widgets. In the system embodiment, the system displays to the user on a graphical user interface a set of connected graphical elements representing a structure of a particular communication session or group of communication sessions. This disclosure focuses on a mode-neutral communications graphical interface in which icons or images of participants in a communication session can be connected by a user graphically adding a communications widget to the respective icons. The communication widgets can relate to various modes of communication, such as telephone, conference call, video conference, web conference, IM session, email, and so forth. The communication mode can also be deleted, changed, or otherwise modified by managing the widgets in the interface. A brief introductory description of a basic general-purpose computing system is provided first, followed by a more detailed discussion of the graphical interface embodiments.
The graphical interface 200 of FIG. 2 is described in detail below.
The communication session is also agnostic with respect to the mode of communication. The same metaphor of a connected user in a communication session being displayed on the graphical interface can represent a called/calling user, an instant messaging (IM) user, an email user, a user connecting via video conferencing, and so forth. The presentation of the graphical elements, how they are connected, and how the user interacts with the elements all vary depending on the needs and current active context of the communication session. For example, the elements can include text, titles, positions, data about each user, etc., and the connection metaphor between users can also represent information such as the type of connection (phone, video, web conference, etc.), the quality of the connection (low-band, high-band, etc.), a hierarchy of how participants are related to the primary user (friend, associate, acquaintance, un-trusted user, etc.), a status of the connection (active, inactive, on-hold, etc.), and so forth. For example, a user can select a contact and then use the same type of user input (drag and drop, flicking, gestures, etc.) to initiate any of the communication modes with a contact. The user does not have to know or learn different input mechanisms for different communication modes. These variations shall be discussed herein as the various embodiments are set forth. The disclosure now turns to the exemplary system embodiment of FIG. 1.
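The connection attributes described above (mode, quality, relationship hierarchy, and status) lend themselves to a simple data model. The following is an illustrative sketch only — the class, attribute names, and style mapping are assumptions for explanation, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class Connection:
    """One line linking a participant icon to the communication session."""
    participant: str
    mode: str                          # e.g. "phone", "video", "web_conference"
    quality: str = "high-band"         # low-band or high-band
    relationship: str = "associate"    # friend, associate, acquaintance, un-trusted
    status: str = "active"             # active, inactive, on-hold

    def render_style(self) -> dict:
        """Map connection state to hypothetical drawing hints for the line."""
        return {
            "thickness": 3 if self.status == "active" else 1,
            "dashed": self.status == "on-hold",
            "color": "gray" if self.quality == "low-band" else "black",
        }

# A held telephone connection renders as a thin dashed line in this sketch.
link = Connection("Frank Grimes", mode="phone", status="on-hold")
style = link.render_style()
```

The point of the sketch is that the same `Connection` record serves any modality; only the `mode` field differs.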
With reference to FIG. 1, an exemplary system 100 includes a general-purpose computing device 100 with a processing unit (CPU or processor) 120 and a system bus 110 that couples various system components, including the system memory 130 such as read only memory (ROM) 140 and random access memory (RAM) 150, to the processor 120.
The system bus 110 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS), stored in ROM 140 or the like, may provide the basic routine that helps to transfer information between elements within the computing device 100, such as during start-up. The computing device 100 further includes storage devices 160 such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive or the like. The storage device 160 can include software modules 162, 164, 166 for controlling the processor 120. Other hardware or software modules are contemplated. The storage device 160 is connected to the system bus 110 by a drive interface. The drives and the associated computer-readable storage media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing device 100. In one aspect, a hardware module that performs a particular function includes the software component stored in a non-transitory computer-readable medium in connection with the necessary hardware components, such as the processor 120, bus 110, display 170, and so forth, to carry out the function. The basic components are known to those of skill in the art and appropriate variations are contemplated depending on the type of device, such as whether the device 100 is a small, handheld computing device, a desktop computer, or a computer server.
Although the exemplary embodiment described herein employs the hard disk 160, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs) 150, read only memory (ROM) 140, a cable or wireless signal containing a bit stream and the like, may also be used in the exemplary operating environment. Non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
To enable user interaction with the computing device 100, an input device 190 represents any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, a keyboard, a mouse, motion input, and so forth. An output device 170 can also be one or more of a number of output mechanisms known to those of skill in the art. If the device includes a graphical display which also receives touch-sensitive input, the input device 190 and the output device 170 can be essentially the same element or display. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing device 100. The communications interface 180 generally governs and manages the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
For clarity of explanation, the illustrative system embodiment is presented as including individual functional blocks including functional blocks labeled as a “processor” or processor 120. The functions these blocks represent may be provided through the use of either shared or dedicated hardware, including, but not limited to, hardware capable of executing software and hardware, such as a processor 120, that is purpose-built to operate as an equivalent to software executing on a general-purpose processor. For example, the functions of one or more processors presented in FIG. 1 may be provided by a single shared processor or multiple processors; use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software.
The logical operations of the various embodiments are implemented as: (1) a sequence of computer-implemented steps, operations, or procedures running on a programmable circuit within a general-use computer; (2) a sequence of computer-implemented steps, operations, or procedures running on a specific-use programmable circuit; and/or (3) interconnected machine modules or program engines within the programmable circuits. The system 100 shown in FIG. 1 can practice all or part of the recited methods, can be a part of the recited systems, and/or can operate according to instructions in the recited non-transitory computer-readable storage media.
Having briefly discussed the exemplary system embodiment, the disclosure now turns to the exemplary graphical interface 200 shown in FIG. 2.
A benefit of this approach is that the system can add additional communication modes to the modes already existing, as is represented by the utility icons in FIG. 2.
The display 200 shows a communication session of three connected graphical elements 202, 204, 206. The displayed communication session 201 represents a real-time communication. In this case, the real-time communication is a three-way conference call between Frank Grimes 202, Max Power 204, and Karl 206, shown by connecting lines between their respective icons 202, 204, 206.
However, visualization of communication sessions and user controls are neutral with respect to the various communication modalities and treat each the same, even as users seek to join a call or other communication session. The only difference is the indicator used for the specific modality, such as the communication modality icons 208, 210, 212, 214, 216. For instance, in FIG. 2, each participant connects through a different modality, yet each participant is displayed and managed in the same way.
The user can then add additional parties to the communication session in a similar manner. The user can remove participants from a communication session by dragging them to a trash can icon 220, providing a flicking motion, clicking an X associated with that participant, highlighting a participant and shaking a device, if it is mobile with accelerometer capability, or clicking a physical or graphical disconnect button. In one aspect where the communication session is via telephone, the system 100 removes participants from the communication session when the user hangs up the telephone receiver. As participants leave the communication session, the system 100 removes their icon from the graphical representation of the communication session.
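The removal behavior above is itself mode-neutral: several distinct user inputs all resolve to the same "remove participant" action. The sketch below illustrates that mapping; the gesture names and session structure are assumptions for explanation, not the disclosed implementation.

```python
# Hypothetical session state: the participants of the three-way conference.
session = {"participants": ["Frank Grimes", "Max Power", "Karl"]}

# Any of these inputs removes a participant, mirroring the text above:
# drag to trash can 220, flick, click an X, shake the device, disconnect, hang up.
REMOVE_GESTURES = {
    "drag_to_trash", "flick", "click_x", "shake",
    "disconnect_button", "hang_up",
}

def handle_gesture(session: dict, gesture: str, participant: str) -> bool:
    """Remove a participant if the gesture is any recognized removal input."""
    if gesture in REMOVE_GESTURES and participant in session["participants"]:
        session["participants"].remove(participant)
        return True
    return False

# A flick on Karl's icon removes him from the graphical representation.
handle_gesture(session, "flick", "Karl")
```

Because every removal gesture funnels into one action, adding a new gesture requires only extending the set, not new per-mode logic.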
The graphical elements shown are icons, but can also include images, text, video, animations, sound, caricatures, and/or avatars. Users can personalize their own graphical elements or feed a live stream of images from a camera or video camera, for example. In addition, the graphical elements can have an associated string of text 202a, 204a, 206a. The string of text can include a name, a title, a position, a telephone number, email address, a current status, presence information, location, and/or any other available information. The string of text can be separate from but associated with the graphical element, as shown in FIG. 2.
The system 100 can include for each icon a graphical sub-element 202b, 204b, 206b that indicates the communication mode for each participant. For example, Max Power 204 is participating via an instant messaging (IM) client 204b; Frank Grimes 202 is participating via telephone 202b; Karl is participating via a video conference client 206b. The system 100 is mode-neutral, meaning that the system 100 treats each mode of communication the same, such as telephone, cellular phone, voice over IP (VoIP), instant messaging, e-mail, text messaging, screen sharing, file sharing, application sharing, and video conferencing. As a user changes from one mode to another, the sub-elements can change accordingly. For example, if Frank Grimes 202 changes from a landline to a cellular phone mid-conference, the telephone icon 202b can change to a mobile phone icon.
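The landline-to-mobile icon swap described above can be sketched as a simple lookup that refreshes a participant's sub-element whenever their modality changes. This is a minimal illustration with hypothetical icon names, not the system's actual rendering code.

```python
# Hypothetical mapping from communication mode to the sub-element icon shown.
MODE_ICONS = {
    "landline": "telephone_icon",
    "mobile": "mobile_phone_icon",
    "im": "im_icon",
    "video": "video_camera_icon",
}

# Frank Grimes 202 starts the conference on a landline (telephone icon 202b).
participants = {
    "Frank Grimes": {"mode": "landline", "icon": MODE_ICONS["landline"]},
}

def change_mode(name: str, new_mode: str) -> None:
    """Switch a participant's modality and refresh their displayed sub-icon."""
    entry = participants[name]
    entry["mode"] = new_mode
    entry["icon"] = MODE_ICONS[new_mode]

# Mid-conference, Frank changes from landline to cellular phone.
change_mode("Frank Grimes", "mobile")
```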
The graphical elements can also convey information about the conference call by changing type, size, color, border, brightness, position, and so forth. The lines, for example, can convey relationships between participants. A user can manually trigger the changes for their own icon or others' icons, or the system 100 can detect change events and change the graphical elements accordingly. Change events can be based on a contacted party, context, persona, connectivity status, and/or presence. For example, as one person is talking or typing a text message, the system 100 can enlarge their icon. As another example, the system 100 can track how much each person in the conference call is talking and move graphical elements up and down based on a total talk time in the conference call.
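The talk-time example above can be sketched as a running tally that orders icons by cumulative speaking time. The function names and structure are illustrative assumptions only.

```python
# Cumulative seconds of talk time per participant in the conference call.
talk_time = {"Frank Grimes": 0.0, "Max Power": 0.0, "Karl": 0.0}

def record_talk(name: str, seconds: float) -> None:
    """Accumulate detected speaking time for one participant."""
    talk_time[name] = talk_time.get(name, 0.0) + seconds

def icon_order() -> list:
    """Participants ordered for display: most total talk time first,
    so their graphical elements can be moved up accordingly."""
    return sorted(talk_time, key=talk_time.get, reverse=True)

record_talk("Karl", 42.0)
record_talk("Max Power", 10.0)
```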
In another variation, the system 100 modifies the links connecting the graphical elements 202, 204, 206 by changing their thickness, length, color, style, and/or animating the links. These modifications can represent a currently active party, shared resources, an active communication session, a held communication session, a muted communication session, a pending communication session, a connecting communication session, a multi-party line, a sidebar conversation, a monitored transfer, an unmonitored transfer, selective forwarding, and selective breakup of the communication session into multiple communication sessions, and so forth.
In one aspect, a user provides input, such as a gesture on a touch screen (a drag and drop, tap, or drag) or any other instructive user input, to manipulate and manage the conference call. For example, the user can click a call icon 208, a video conference icon 210, an IM icon 212, an email icon 214, or a social media icon 216 to invite another user to join the communication session. A user can drag these icons and drop them on a contact or on a participant in a current communication session. For example, if an incoming communication session is in one modality (IM, for example), the user can drag the call icon onto the incoming communication session to accept the incoming communication session but renegotiate it from IM to a call. A user can also initiate a communication session by dragging and dropping an appropriate icon onto a contact. Social media include web sites such as Facebook, Twitter, LinkedIn, MySpace, and so forth. Alternatively, the user can browse through a list of contacts 218, then drag and drop a desired contact to add the desired contact to the conference call. The system 100 then automatically contacts that person in their desired mode, a sender-preferred mode, a currently available mode based on presence information, or a mode commonly available to the participants, and joins that person to the conference call. The system 100 can display other information as well, such as a calendar, notes, memos, personal presence information, and time. The system 100 display can be user-configurable. Each participant in the communication session 201, or contact in a list of contacts, can have multiple associated addresses, phone numbers, or points of contact, such as a work phone, home phone, mobile phone, work email, home email, AIM address, Facebook chat address, and the like. Each point of contact may have an icon or a qualifier, such as a symbol, that indicates not only the party but also the contact mode.
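The renegotiation example above — dropping the call icon onto an incoming IM session to accept it as a call — can be sketched as a small drop handler. The session dictionary and field names are hypothetical, chosen only to illustrate the behavior.

```python
def drop_modality_icon(session: dict, dropped_mode: str) -> dict:
    """Accept an incoming session, renegotiating it to the dropped modality.

    In this sketch, dropping any modality icon (call, video, IM, email,
    social) on an incoming session both accepts it and switches its mode.
    """
    if session.get("state") == "incoming":
        session["state"] = "active"
        session["mode"] = dropped_mode
    return session

# An incoming IM request; the user drags the call icon 208 onto it.
incoming = {"peer": "Karl", "mode": "im", "state": "incoming"}
drop_modality_icon(incoming, "call")
```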
An incoming communication session icon can blink, bounce, pulse, grow, shrink, vibrate, change color, send an audible alert (such as a ringtone), and/or provide some other notification to the user of the incoming session. The user can interact with and manipulate this incoming request in the same manner as the other current communication sessions. The system 100 does not differentiate between an active communication session and a communication session representing an incoming request. For example, the user can drag and drop an incoming call on top of a communication session to add the incoming call directly to the communication session. As another example, the user can drag and drop an incoming session to a trash can icon to ignore, double click on the incoming session to send the incoming caller (if it is a call) to voicemail, or tap and hold to place the caller on hold.
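The incoming-session gestures above each map to a disposition, which can be sketched as a single dispatch table. Gesture and disposition names are assumptions for illustration.

```python
# Each user input on an incoming session maps to one disposition,
# mirroring the examples in the text above.
DISPOSITIONS = {
    "drag_to_trash": "ignored",         # drag to trash can to ignore
    "double_click": "voicemail",        # send an incoming caller to voicemail
    "tap_and_hold": "on_hold",          # place the caller on hold
    "drag_onto_session": "merged",      # add the call to an existing session
}

def dispose_incoming(gesture: str) -> str:
    """Resolve a gesture on an incoming session; unknown input keeps ringing."""
    return DISPOSITIONS.get(gesture, "ringing")

result = dispose_incoming("double_click")
```

Notably, the same table could serve an active session: the system does not differentiate between active sessions and incoming requests.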
In one aspect, user preferences guide the amount and type of information conveyed by the graphical elements and the associated text. User preferences can be drawn from a viewer's preferences and/or a source person's preferences. For example, a viewer sets preferences to show others' email addresses when available, but a source person sets preferences to never share an email address. The source person's preferences (or the preferences of the “owner” of the information) can override a third party's preferences.
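The override rule above — the information owner's preference wins over the viewer's — reduces to a per-field conjunction. The following sketch assumes a simple boolean-preference shape; it is illustrative only.

```python
def visible_fields(viewer_prefs: dict, owner_prefs: dict) -> set:
    """Return the detail fields a viewer may see of the owner's information.

    A field is shown only if the viewer wants it AND the owner allows it;
    the owner's explicit denial overrides the viewer's preference.
    """
    shown = set()
    for field_name, viewer_wants in viewer_prefs.items():
        owner_allows = owner_prefs.get(field_name, True)  # default: shareable
        if viewer_wants and owner_allows:
            shown.add(field_name)
    return shown

viewer = {"email": True, "phone": True}   # viewer wants to see both
owner = {"email": False}                  # owner: never share email address
fields = visible_fields(viewer, owner)
```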
One possible user input is to divide the communication session shown in FIG. 2 into two or more separate communication sessions.
Then the system presents a utility icon (not shown in FIG. 2) with which the user can confirm, modify, or cancel the division.
Having discussed several variations of FIG. 2, the disclosure now turns to the network architecture shown in FIG. 3.
In one aspect, a centralized entity controls the communication session. The centralized entity can reside in the network or communicate via the network. The centralized entity can operate as a centralized enterprise intelligence server. In another aspect, the communication session control and functionality is distributed among multiple server resources 314, 316, 318, 320 in the network or cloud. In addition to a centralized intelligence and distributed intelligence in the cloud, the network 302 can provide this functionality using a peer-to-peer approach with intelligence on the endpoints. Some variations include providing standardized functionality on a standards-compliant server and non-standardized functionality distributed across the endpoints.
The display of each communications device shows a different aspect or view of the same communication session. For example, the display of device 304 shows the same display of the same participants 202, 204, 206 as shown in FIG. 2.
For example, Max Power 204 is currently communicating via instant messaging. Frank Grimes 202 is currently communicating via telephone. Karl 206 is currently communicating via video conferencing. While each participant is shown connected via a single current modality, participants can change modalities or communicate through more than one modality at a time, as discussed below.
In addition to the current communication modality icons, each participant's icon in the communication session can have associated sub-icons or sub-elements indicating available and/or preferred communication modalities. For example, Max Power 204 includes sub-icons for telephone 408, IM 410, and email 412. Frank Grimes 202 includes sub-icons for telephone 414, video conferencing 416, and social media 418. Karl 206 includes sub-icons for video conferencing 420, IM 422, and email 424. A user can interact with other communication session participants via these sub-icons or sub-elements. For example, in order to set up a sidebar communication session with Karl 206, Max Power 204 can click on the IM sub-icon 422 associated with Karl 206. A user can modify his or her own set of sub-icons in the communication session. For example, Frank Grimes 202 can drag the social media sub-icon 418 out to remove it from the group of sub-icons. Alternatively, Frank Grimes 202 can click on the video conferencing sub-icon 416 to change from a telephone connection to a video conferencing connection while remaining connected to the communication session.
In some cases, a sub-icon can represent multiple related facets of functionality.
The display 500 can include additional sub-icons, not associated with any particular participant, representing all available modalities regardless of each participant's preferences. A user can drag one of these additional sub-icons onto a user to request interaction via a communication modality that is not currently preferred or available. For example, Max Power's 204 icon does not have an associated sub-icon for video conferencing. Frank Grimes can drag a video conferencing sub-icon from the additional sub-icons onto Max Power 204 to request a video conference.
The communication session view 602 depicts a communication session 604 with three participants, John 606, Moe 608, and Carly 610. In this example, the communication session view 602 includes a central hub or session manager 612 that links the participants. Each participant's icon can have a set of associated icons representing available or currently used communication modalities. For example, John 606 has a cellular phone icon 606a and a webcam icon 606b. Moe 608 has a webcam icon 608a, a telephone icon 608b, and a computer icon 608c. Carly 610 has a telephone icon 610a, a computer icon 610b, and a webcam icon 610c. A user can drag and drop graphical elements from the various portions of the user interface 600 to perform actions such as adding participants to the communication session 604, creating a new communication session, terminating a communication session, dividing a communication session, sharing information, and so forth.
A user can manipulate communication modalities of himself as well as others via the communication modality buttons 714. For example, the user can click and drag or otherwise move any one of the communication modality buttons 714 onto a participant icon, such as John 706. That action can trigger the system 100 to shift to the indicated communication modality or inquire of John 706 if he is willing to change to the indicated communication modality.
The basic input/output buttons 716 provide a user with basic functionality to manipulate the communication session, create a new communication session, or simply to answer system queries. For example, if the user wants to add a new participant to the communication session, the user can click the “new” button. The system 100 presents a dialog to the user to determine which contact to add to the communication session, and presents a confirmation dialog such as “Are you sure?” The user can then tap on the OK button to confirm, after which the system 100 adds the selected contact as a new participant.
A user can click and drag individual lines onto other contacts to establish additional communication sessions. For example, if participants 802, 804 are part of a larger communication session having more participants, participant 802 can establish a sidebar with participant 804 by dragging the IM line 812 and dropping it directly on participant 804. Then to terminate the sidebar, participant 804 can drag the IM line 812 back to the communication session hub 806.
In another aspect, the communication session includes multiple modalities for each participant. For example, the communication session can be a video conference where all participants have a video stream (if a video camera is available), one or more participants have an audio stream (if a microphone is available), and the session includes a text-based chat under the video stream. In this example, each participant in the graphical representation connects to the communication session via one, two, or three graphical links representing actual communication modalities. Users can individually control (e.g., terminate, add, mute, pause, and so forth) each modality's separate link to the communication session. For example, participant 802 can pause the video conference link 810 feeding a video stream to other participants 804 via the hub 806 but still maintain the other two modalities, telephone 808 and IM 812. Participant 802 can later resume the video conference link 810.
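Per-modality link control as described above can be sketched as independent state per link: pausing one modality leaves the others untouched. The state names below are assumptions used only for illustration.

```python
# One participant's separate links into the session hub, each with its
# own state, mirroring participant 802's telephone, video, and IM links.
links = {"phone": "active", "video": "active", "im": "active"}

def pause(modality: str) -> None:
    """Pause a single modality link without touching the others."""
    if links.get(modality) == "active":
        links[modality] = "paused"

def resume(modality: str) -> None:
    """Resume a previously paused modality link."""
    if links.get(modality) == "paused":
        links[modality] = "active"

# Pause the video stream; telephone and IM continue uninterrupted.
pause("video")
still_active = sorted(m for m, s in links.items() if s == "active")
```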
The disclosure now turns to the exemplary method embodiment shown in FIG. 9. For the sake of clarity, the method is discussed in terms of an exemplary system 100, as shown in FIG. 1, configured to practice the method. The system 100 presents, via a graphical user interface, a set of connected graphical elements representing a communication session comprising at least two communicating users, wherein each graphical element representing a user further comprises at least one graphical sub-element indicating user communication details (902).
The system 100 receives user input associated with the set of connected graphical elements, the user input having an action associated with the communication session (904). The user input can be a click of a mouse, a tap of a finger on a touch screen, or any other suitable input. The user can click, drag, drop, and otherwise move and locate icons as user input.
An example of applying user controls in a mode-neutral way is provided below. For example, in the display shown in FIG. 2, the same user input that adds a telephone participant to the session can add an IM or video participant; only the modality indicator differs.
The system 100 performs the action based on the received user input (906). The system 100 can also receive a first user input indicating a specific graphical sub-element, display a menu of options based on the specific graphical sub-element and its respective associated user, receive a second user input selecting an option in the menu of options, and manipulate the communication session based on the second user input. In one instance, the system 100 manipulates the communication session by creating a separate communication session, but other actions consistent with the disclosure are also possible.
As can be appreciated based on this disclosure, the interface treats all communication modalities exactly the same with respect to session control. For example, as discussed herein, communication sessions of any mode or type can be controlled and managed using the same user input modalities. Thus, if communication sessions are started, ended, or split, or if participants are added or removed, the same input modalities (drag and drop, speech, gesture input, tapping, etc.) perform the same functions across different communication modes. An IM chat session with four participants can be split into two sessions of two IM chat participants using the same modality as splitting a telephone conference with four participants into two separate conferences of two people each. The communication session may be a video conference or a screen sharing session over the web. The user operations with respect to session control are identical.
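The split example above — dividing a four-person IM chat exactly as one would divide a four-person telephone conference — can be sketched as one mode-neutral routine. The session shape is an assumption for this illustration.

```python
def split_session(session: dict, group_a: list, group_b: list) -> tuple:
    """Split one session into two, preserving the session's mode.

    The routine never inspects the mode beyond copying it, so the same
    code serves an IM chat, a telephone conference, or a video conference.
    """
    make = lambda members: {"mode": session["mode"], "participants": members}
    return make(group_a), make(group_b)

# A four-participant IM chat split into two sessions of two participants.
im_chat = {"mode": "im", "participants": ["A", "B", "C", "D"]}
left, right = split_session(im_chat, ["A", "B"], ["C", "D"])
```

A telephone conference would pass through the identical code path with `"mode": "phone"`, which is the mode-neutrality claim in miniature.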
Thus, the interfaces shown in FIGS. 2 and 4-8 can easily support a new mode of communication while enabling users to manage the new mode with the same communication modalities of the other modes. For example, to integrate a communication mode such as Google Wave in addition to the call, video, IM, email, and social media modes shown in FIG. 2, the system can simply add a corresponding utility icon, and users can manage the new mode with the same input mechanisms used for the existing modes.
Embodiments within the scope of the present disclosure may also include tangible and/or non-transitory computer-readable storage media for carrying or having computer-executable instructions or data structures stored thereon. Such non-transitory computer-readable storage media can be any available media that can be accessed by a general purpose or special purpose computer, including the functional design of any special purpose processor as discussed above. By way of example, and not limitation, such non-transitory computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computer-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or combination thereof) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of the computer-readable media.
Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Computer-executable instructions also include program modules that are executed by computers in stand-alone or network environments. Generally, program modules include routines, programs, components, data structures, objects, and the functions inherent in the design of special-purpose processors, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
Those of skill in the art will appreciate that other embodiments of the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination thereof) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
The various embodiments described above are provided by way of illustration only and should not be construed to limit the scope of the disclosure. Those skilled in the art will readily recognize various modifications and changes that may be made to the principles described herein without following the example embodiments and applications illustrated and described herein, and without departing from the spirit and scope of the disclosure.
Claims
1. A method comprising:
- presenting, via a graphical user interface controlled by a processor, a set of connected graphical elements, wherein each connected graphical element represents a respective user of a set of users participating in a communication session, and wherein each connected graphical element is unique to the respective user;
- overlaying, in the graphical user interface, a graphical sub-element on top of each connected graphical element, the graphical sub-element indicating a respective current communication modality of the respective user;
- receiving user input directed to the set of connected graphical elements, the user input specifying an action associated with the communication session; and
- performing the action according to the respective current communication modality.
2. The method of claim 1, wherein the graphical sub-element indicates an additional communication modality through which the respective user can communicate during the communication session.
3. The method of claim 2, wherein the graphical sub-element further indicates an availability of the additional communication modality.
4. The method of claim 1, wherein the graphical sub-element comprises one of a telephone icon, a mobile phone icon, an instant message icon, a camera icon, a video camera icon, a microphone icon, a text message icon, a document icon, a headset icon, a social media icon, and an email icon.
5. The method of claim 1, wherein the set of connected graphical elements is connected in the graphical user interface by one of a line, a shape, proximity, a common shape, a common color, and a common appearance.
6. The method of claim 1, further comprising:
- receiving changed communication details for a first user of the set of users;
- identifying a connected graphical element corresponding to the first user; and
- updating a corresponding graphical sub-element for the connected graphical element based on the changed communication details.
7. The method of claim 1, wherein the graphical sub-element is a communications widget.
8. The method of claim 7, further comprising:
- receiving a first user input indicating a specific graphical sub-element;
- displaying a menu of options based on the specific graphical sub-element and a respective associated user;
- receiving a second user input selecting an option in the menu of options; and
- manipulating the communication session based on the second user input.
9. The method of claim 8, wherein manipulating the communication session comprises creating a second communication session separate from the communication session.
10. The method of claim 1, wherein the graphical user interface accepts input via one of a mode-neutral user control, a mode-neutral button, and a mode-neutral gesture.
11. A system comprising:
- a processor; and
- a non-transitory computer-readable storage medium storing instructions which, when executed by the processor, cause the processor to perform operations comprising: presenting, via a graphical user interface, a set of connected graphical elements, wherein each connected graphical element represents a respective user of a set of users participating in a communication session, and wherein each connected graphical element is unique to the respective user; overlaying, in the graphical user interface, a graphical sub-element on top of each connected graphical element, the graphical sub-element indicating a respective current communication modality of the respective user; receiving user input directed to the set of connected graphical elements, the user input specifying an action associated with the communication session; and performing the action according to the respective current communication modality.
12. The system of claim 11, wherein the graphical sub-element indicates an additional communication modality through which the respective user can communicate during the communication session.
13. The system of claim 12, wherein the graphical sub-element further indicates an availability of the additional communication modality.
14. The system of claim 11, wherein the graphical sub-element comprises one of a telephone icon, a mobile phone icon, an instant message icon, a camera icon, a video camera icon, a microphone icon, a text message icon, a document icon, a headset icon, a social media icon, and an email icon.
15. The system of claim 11, wherein the set of connected graphical elements is connected in the graphical user interface by one of a line, a shape, proximity, a common shape, a common color, and a common appearance.
16. The system of claim 11, wherein the computer-readable storage medium stores additional instructions which, when executed by the processor, cause the processor to perform further operations comprising:
- receiving changed communication details for a first user of the set of users;
- identifying a connected graphical element corresponding to the first user; and
- updating a corresponding graphical sub-element for the connected graphical element based on the changed communication details.
17. The system of claim 11, wherein the graphical sub-element is a communications widget.
18. The system of claim 17, wherein the non-transitory computer-readable storage medium stores additional instructions which, when executed by the processor, cause the processor to perform further operations comprising:
- receiving a first user input indicating a specific graphical sub-element;
- displaying a menu of options based on the specific graphical sub-element and a respective associated user;
- receiving a second user input selecting an option in the menu of options; and
- manipulating the communication session based on the second user input.
19. The system of claim 18, wherein manipulating the communication session comprises creating a second communication session separate from the communication session.
20. A non-transitory computer-readable storage device storing instructions which, when executed by a processor, cause the processor to perform operations comprising:
- presenting, via a graphical user interface, a set of connected graphical elements, wherein each connected graphical element represents a respective user of a set of users participating in a communication session, and wherein each connected graphical element is unique to the respective user;
- overlaying, in the graphical user interface, a graphical sub-element on top of each connected graphical element, the graphical sub-element indicating a respective current communication modality of the respective user;
- receiving user input directed to the set of connected graphical elements, the user input specifying an action associated with the communication session; and
- performing the action according to the respective current communication modality.
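The claimed arrangement of per-user graphical elements, overlaid modality sub-elements, detail updates, and option menus can be illustrated with a minimal, non-limiting sketch. All class, function, and modality names below are hypothetical and chosen for illustration only; the claims are not limited to any particular data model or programming language.

```python
from dataclasses import dataclass, field

# Illustrative mapping of communication modalities to sub-element icons
# (cf. claims 4 and 14); names are assumptions, not claim limitations.
ICONS = {
    "telephone": "telephone icon",
    "instant_message": "instant message icon",
    "video": "video camera icon",
    "email": "email icon",
}

@dataclass
class GraphicalElement:
    """One connected graphical element, unique to a participating user."""
    user: str
    current_modality: str                       # modality the user is connected with
    available_modalities: list = field(default_factory=list)

    @property
    def sub_element(self) -> str:
        # Overlaid sub-element indicating the current communication modality.
        return ICONS[self.current_modality]

class CommunicationSessionView:
    """Mode-neutral view of one communication session (illustrative only)."""

    def __init__(self, participants):
        # Presenting step: one connected graphical element per user.
        self.elements = {p.user: p for p in participants}

    def update_details(self, user: str, new_modality: str) -> str:
        # Claims 6/16: identify the element for the user whose details
        # changed and update its sub-element accordingly.
        element = self.elements[user]
        element.current_modality = new_modality
        return element.sub_element

    def menu_for(self, user: str) -> list:
        # Claims 8/18: a menu of options based on the sub-element and the
        # associated user, e.g. switching to another available modality.
        element = self.elements[user]
        return [f"switch to {m}" for m in element.available_modalities
                if m != element.current_modality]

view = CommunicationSessionView([
    GraphicalElement("alice", "telephone", ["telephone", "video", "email"]),
    GraphicalElement("bob", "instant_message", ["instant_message", "email"]),
])
print(view.elements["alice"].sub_element)     # telephone icon
print(view.update_details("alice", "video"))  # video camera icon
print(view.menu_for("bob"))                   # ['switch to email']
```

The sketch keeps the session model separate from any one transport, mirroring the mode-neutral metaphor: the same `update_details` call handles a user moving from telephone to video without the view needing modality-specific code paths.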
Type: Application
Filed: Jan 14, 2015
Publication Date: May 14, 2015
Inventors: Birgit GEPPERT (Basking Ridge, NJ), Frank ROESSLER (Basking Ridge, NJ)
Application Number: 14/596,371
International Classification: H04L 12/58 (20060101); G06F 3/0484 (20060101);