UNIFIED MESSAGING PLATFORM FOR PROVIDING INTERACTIVE SEMANTIC OBJECTS

- Microsoft

A unified messaging platform is described which provides a comprehensive environment for collaboration, file sharing, and project management. In aspects, the unified messaging platform is organized based on one or more teams or projects, where each team or project is further organized by customizable categories. A user interface is provided for ready access to information related to each category (e.g., communications, files, tasks, work product, etc.), which information is automatically and seamlessly synchronized across the platform such that each team member remains abreast of the current progress and status of a project. Team collaboration and cooperation is facilitated by interactive semantic objects. Interactive semantic objects may act as access points to external services, centralized interfaces for team interaction, scheduling interfaces, and the like. Status updates, tallies, and/or selections are automatically synchronized and displayed for team members in a single version of an interactive semantic object.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application Ser. No. 62/165,856, entitled “SYSTEM AND METHODS FOR IMPLEMENTING UNIFIED MESSAGING PLATFORM,” filed on May 22, 2015, the entire disclosure of which is hereby incorporated herein by reference.

BACKGROUND

Numerous and diverse communications platforms are currently available. Some communications platforms, e.g., messaging and/or email platforms, allow for a certain amount of interoperability. However, these platforms fail to adequately address the needs and requirements of contemporary team environments. For example, traditional email applications are configured such that each message is addressed to one or more recipients by the sender. It is often difficult for the sender to know which recipients would be interested in receiving certain information, which leads to message forwarding and/or overlooking relevant or key individuals. In the case of message forwarding, the communication chain becomes fractured, which results in disparate information being provided to various members of a team. Moreover, when certain members are overlooked and/or excluded, information that would be useful to the whole team is archived and acted on by only a subset of the team. The above deficiencies are compounded by the fact that email messaging is overused for too many purposes—e.g., from messages as basic as requesting approval from the recipient to messages attaching critical vision documents for an organization—which leads to overloaded inboxes and overwhelmed recipients.

Other communication tools and mediums have been developed to fill the gaps, such as instant messaging, short message service (SMS), Yammer, Skype, SharePoint, etc., but these tools add complexity rather than providing an overarching solution. For instance, while these additional communications tools are useful as point solutions, they also create the need for users to visit multiple locations to obtain a complete picture of related information, tasks and obligations.

It is with respect to these and other general considerations that embodiments have been described. Also, although relatively specific problems have been discussed, it should be understood that the embodiments should not be limited to solving the specific problems identified in the background.

SUMMARY

The disclosure generally relates to methods and systems for providing a unified messaging platform. The unified messaging platform provides a comprehensive environment for collaboration, file sharing, and project management. In aspects, the unified messaging platform is organized based on one or more teams or projects, where each team or project is further organized by customizable categories. A user interface is provided for ready access to information related to each category (e.g., communications, files, tasks, work product, etc.), which information is automatically and seamlessly synchronized across the platform such that each team member remains abreast of the current progress and status of a project. For instance, collaboration and cooperation between team members is facilitated by interactive semantic objects. In aspects, an interactive semantic object may act as an access point to external services, may act as a centralized interface object for team interaction regarding a topic, may act as a scheduling interface for team meetings, and the like. Status updates, tallies, and/or selections are automatically synchronized and reflected in a single version of the interactive semantic object in a conversation tab between team members, as well as in an activity tab and/or a lists tab storing interactive semantic objects associated with a team and/or team member. These and other features will be detailed and described herein.

In aspects, a system including a processing unit and a memory is provided. The memory storing computer executable instructions that, when executed by the processing unit, cause the system to perform a method. The method including receiving a semantic object including an interactive control, receiving a selection of the interactive control, and performing an action associated with the interactive control. In response to performing the action, generating a result, and updating the semantic object with the result.

In further aspects, a system including a processing unit and a memory is provided. The memory storing computer executable instructions that, when executed by the processing unit, cause the system to perform a method. The method including creating a semantic object including an interactive control within a user interface of a unified messaging application. The method further comprising receiving a result of an action associated with the interactive control and updating the semantic object with the result.

In still further aspects, a method of creating a semantic object including an interactive control is provided. The method including creating the semantic object including the interactive control within an interface of a unified messaging application and receiving a selection of the interactive control, wherein the interactive control is linked to an operation. The method further including receiving a result of the operation and updating the semantic object with the result.
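The three aspects above describe a common lifecycle: a semantic object carrying an interactive control is created, a selection of the control triggers a linked operation, and the operation's result is written back into the object. The following Python sketch illustrates that lifecycle under stated assumptions; the names `SemanticObject`, `InteractiveControl`, and `handle_selection` are invented for illustration and do not appear in the disclosure.

```python
# Hypothetical sketch of the semantic-object update flow described above.
# All type and function names here are illustrative, not from the disclosure.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class InteractiveControl:
    label: str
    action: Callable[[], str]  # the operation linked to the control

@dataclass
class SemanticObject:
    topic: str
    control: InteractiveControl
    result: Optional[str] = None  # updated after the action runs

def handle_selection(obj: SemanticObject) -> SemanticObject:
    """Perform the action linked to the control and update the object with the result."""
    obj.result = obj.control.action()
    return obj

# Example: a poll-style control whose action tallies a vote for the team.
tally = {"yes": 0}

def cast_vote() -> str:
    tally["yes"] += 1
    return f"yes: {tally['yes']}"

poll = SemanticObject("Lunch on Friday?", InteractiveControl("Vote yes", cast_vote))
handle_selection(poll)
```

In a deployed platform the updated object would then be synchronized to every team member's view, so each user sees the same single version of the tally.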

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting and non-exhaustive examples are described with reference to the following Figures.

FIG. 1 illustrates an exemplary conceptual model for a unified messaging platform, according to an example embodiment.

FIG. 2A illustrates an exemplary interface for interacting with the unified messaging platform, according to a first example embodiment.

FIG. 2B illustrates an exemplary interface for interacting with the unified messaging platform, according to a second example embodiment.

FIG. 2C illustrates an exemplary interface for interacting with the unified messaging platform, according to a third example embodiment.

FIG. 2D illustrates an exemplary interface for interacting with the unified messaging platform, according to a fourth example embodiment.

FIG. 2E illustrates an exemplary interface for interacting with the unified messaging platform, according to a fifth example embodiment.

FIG. 2F illustrates an exemplary mobile interface for interacting with the unified messaging platform, according to an example embodiment.

FIG. 2G illustrates an exemplary mobile interface for interacting with the unified messaging platform, according to a second example embodiment.

FIG. 3 illustrates an exemplary system implemented on a computing device for message handling, according to an example embodiment.

FIG. 4 illustrates an exemplary method for creating a semantic object, according to an example embodiment.

FIG. 5 illustrates an exemplary method for embedding a semantic object into a message, according to an example embodiment.

FIG. 6 illustrates an exemplary method for receiving an update to a semantic object, according to an example embodiment.

FIG. 7A illustrates an exemplary semantic object embedded in a message, according to a first example embodiment.

FIG. 7B illustrates an exemplary semantic object embedded in a message, according to a second example embodiment.

FIG. 7C illustrates an exemplary semantic object embedded in a message, according to a third example embodiment.

FIG. 7D illustrates an exemplary updated semantic object, according to an example embodiment.

FIG. 8 illustrates an exemplary mobile interface for creating a semantic object, according to an example embodiment.

FIG. 9 illustrates an exemplary interface for displaying an object embedded message, according to an example embodiment.

FIG. 10 illustrates an exemplary interface for providing a portal to an external application, according to an example embodiment.

FIG. 11 is a block diagram illustrating example physical components of a computing device with which aspects of the disclosure may be practiced.

FIGS. 12A and 12B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.

FIG. 13 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.

FIG. 14 illustrates a tablet computing device for executing one or more aspects of the present disclosure.

DETAILED DESCRIPTION

In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the present disclosure. Embodiments may be practiced as methods, systems or devices. Accordingly, embodiments may take the form of a hardware implementation, an entirely software implementation, or an implementation combining software and hardware aspects. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims and their equivalents.

In particular, a unified messaging platform is described which provides a comprehensive environment for collaboration, file sharing, and project management. In aspects, the unified messaging platform is organized based on one or more teams or projects, with each team or project further organized by customizable categories, such as finance, engineering, launch readiness, debugging, catering, construction, general, random, and the like. A user interface is provided for ready access to information related to each category (e.g., communications, files, tasks, work product, etc.), which information is organized by pages or tabs for each category. Moreover, documents, project updates, tasks, and communications between team members are automatically and seamlessly synchronized across the platform such that each team member remains abreast of the current progress and status of a project. For instance, collaboration and cooperation between team members is facilitated by interactive semantic objects. In aspects, an interactive semantic object may act as an access point to external services, may act as a centralized interface object for team interaction regarding a topic, may act as a scheduling interface for team meetings, and the like. Status updates, tallies, and/or selections are automatically synchronized and reflected in a single version of the interactive semantic object in a conversation tab between team members, as well as in an activity tab and/or lists tab storing interactive semantic objects associated with a team and/or team member. It is with respect to these and other general considerations that embodiments have been made.

FIG. 1 illustrates an exemplary system for providing a unified messaging platform, according to an example embodiment.

In aspects, a unified messaging platform (UMP) 105 may be implemented via a client unified messaging application 104a executed on client computing device 104 in communication with a server unified messaging application executed on a server computing device 106. In some aspects, the client computing device 104 may comprise a client-side object model 107 in communication with a server-side object model 109 (e.g., implemented by middle tier 106b). In a basic configuration, the client computing device 104 is a personal or handheld computer having both input elements and output elements. For example, the client computing device 104 may be one of: a mobile telephone; a smart phone; a tablet; a phablet; a smart watch; a wearable computer; a personal computer; a desktop computer; a laptop computer; a gaming device/computer (e.g., Xbox); a television; and the like. This list is exemplary only and should not be considered as limiting. Any suitable client computing device for executing a messaging application may be utilized.

The unified messaging platform 105 is a communication system/service that provides a collaborative environment for users to communicate and collaborate. The unified messaging platform 105 is shown by a dashed line, illustrating that implementation of the unified messaging platform 105 may involve the front end 106a, middle tier 106b and/or the back end 106c of server computing device 106, among other examples. In aspects, server computing device 106 may include one or more server computing devices 106. In an example, the unified messaging platform 105 presents a configurable and extensible workspace for collaboration between users through a user interface (UI) that may comprise a plurality of different views. Users of the unified messaging platform 105 may include, but are not limited to: one or more persons, companies, organizations, departments, virtual teams, ad-hoc groups, vendors, customers, third-parties, etc. Users of the unified messaging platform 105 may have one or more user profiles that are customizable by the user. The unified messaging platform 105 enables visibility and communication between users including users who are organized in teams or groups as well as users/groups outside of a team/group. Policies may be set for teams/groups by one or more administrators of a team/group and by administrators of the unified messaging platform 105. Examples described throughout the present disclosure are designed to protect user privacy. Protection of sensitive information, including legally protected data and personally identifiable information, is a paramount consideration for implementing examples described herein. For instance, users may set privacy settings for what data can be displayed/shared, and examples described herein comply with such settings as well as laws related to distribution of data and protection of privacy.

As illustrated in FIG. 1, systems and/or services associated with the unified messaging platform 105 may be implemented as a front end 106a, a middle tier 106b, and a back end 106c on a server computing device 106. However, one skilled in the art will recognize that the unified messaging platform 105 may be implemented across one or more components of system examples described herein, including one or more client computing devices 104 and/or enterprise stack 110. In some aspects, the front end 106a of server computing device 106 may send information and commands via the client unified messaging application 104a to the client computing device 104. For instance, the middle tier 106b and/or the back end 106c of the server computing device 106 may receive information and commands from the client computing device 104 via the client unified messaging application 104a. In other aspects, the front end 106a may act as an intermediary between the client computing device 104 and the middle tier 106b. That is, front end 106a may exchange commands and information with the client computing device 104 and may also exchange the commands and information with middle tier 106b. In an example, the unified messaging platform 105 includes a server unified messaging application executing on server computing device 106 via front end 106a, middle tier 106b, and a back end 106c in communication with the client unified messaging application 104a.

In some aspects, the back end 106c may further comprise or be in communication with one or more application agents 106d to facilitate interoperability and communication with one or more external services 114. More specifically, application agents 106d may interface with external services 114 using webhooks 106e in order to facilitate integration between the unified messaging platform 105 and external services 114. External services 114 are services and/or websites that are hosted or controlled by third parties. For example, external services 114 may include line-of-business (LOB) management services, customer relationship management (CRM) services, debugging services, accounting services, payroll services, etc. External services 114 may further include other websites and/or applications hosted by third parties, such as social media or networking websites; photo sharing websites; video and music streaming websites; search engine websites; sports, news or entertainment websites, and the like. That is, some external services 114 may provide robust reporting, analytics, data compilation and/or storage service, etc., whereas other external services 114 may provide search engines or other access to data and information, images, videos, and the like.
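The webhook-based integration above can be sketched as an application agent that serializes a platform event and delivers it to a registered external-service endpoint. This is a minimal sketch under assumptions: the endpoint URLs, payload shape, and the `notify`/`post` helpers are all hypothetical, and a real agent would use an actual HTTP client rather than the stand-in transport shown here.

```python
# Illustrative application agent relaying a platform event to an external
# service via a webhook. URLs and payload fields are invented examples.
import json
from typing import Dict

# Registry mapping external-service names to webhook endpoints (hypothetical URLs).
WEBHOOKS: Dict[str, str] = {
    "crm": "https://crm.example.com/hooks/ump",
    "lob": "https://lob.example.com/hooks/ump",
}

def post(url: str, body: str) -> int:
    """Stand-in transport; a real agent would issue an HTTPS POST here."""
    return 200 if url.startswith("https://") else 400

def notify(service: str, event: Dict) -> int:
    """Serialize a platform event and deliver it to the named external service."""
    payload = json.dumps(event)
    return post(WEBHOOKS[service], payload)

status = notify("crm", {"type": "contact.created", "name": "A. Vendor"})
```

The same pattern would carry LOB work-order updates or CRM contact changes in the scenarios described in the following paragraph.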

In aspects, data or information may be shared between server computing device 106 and the one or more external services 114. For example, business contacts, sales, etc., may be input via a client computing device 104 in communication with server computing device 106, which is in communication with CRM software hosted by a third party. The third-party CRM software may track sales activity, marketing, customer interactions, etc., to provide analytics or other information for promoting business relations. Alternatively, a manufacturing work order may be input via a client computing device 104 in communication with server computing device 106, which is in communication with LOB management software hosted by a third party. The LOB management software may guide and track the work order by creating work flows such as tasks or alerts for scheduling manufacturing equipment, ordering raw materials, scheduling shipping, relieving inventory, etc. In some cases, the LOB management software may create requests for user approval or review at different stages of a work flow. In still further aspects, a user may issue a query to one or more of the external services 114, such as a request for business contacts, sales for the prior month, the status of a work order, or a search query or request for an image, etc.

As illustrated by FIG. 1, the server computing device 106 may communicate with external services 114 and client computing device 104 via a network 108. In one aspect, the network 108 is a distributed computing network, such as the Internet. In aspects, the unified messaging platform 105 may be implemented on more than one server computing device 106, such as a plurality of server computing devices 106. As discussed above, the server computing device 106 may provide data to and from the client computing device 104 through the network 108. The data may be communicated over any network suitable to transmit data. In some aspects, the network 108 is a computer network such as an enterprise intranet and/or the Internet. In this regard, the network 108 may include a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, and/or wireless and wired transmission mediums. In further aspects, server computing device 106 may communicate with some components of the system via a local network (e.g., an enterprise intranet), whereas server computing device 106 may communicate with other components of the system via a wide area network (e.g., the Internet).

According to further aspects, communication between the unified messaging platform 105 and other components of the system may require authentication 112. Authentication 112 refers to a process by which a device, application, component, user, etc., provides proof that it is “authentic” or that it is “authorized” to access or communicate with another device, application, component, user, etc. Authentication may involve the use of third-party digital certificates, authentication tokens, passwords, symmetric or asymmetric key encryption schemes, shared secrets, authentication protocols, or any other suitable authentication system or method either now known or developed in the future. In aspects, in response to authentication, access or communication may be allowed and data or information may be exchanged between the unified messaging platform 105 and various other components of the system. In some aspects, an environment or network linking various devices, applications, components, users, etc., may be referred to as a “trusted” environment. In a trusted environment, authentication between devices, applications, components, users, etc., may not be necessary.
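One of the authentication schemes the passage lists, shared secrets with keyed hashing, can be sketched as follows. This is an illustrative example only: the secret, the token format, and the helper names are assumptions, and the disclosure does not prescribe any particular scheme.

```python
# Illustrative authentication check using an HMAC shared secret, one of the
# schemes mentioned above. The secret and message values are invented.
import hashlib
import hmac

SECRET = b"shared-secret"  # provisioned out of band in a real deployment

def sign(message: bytes) -> str:
    """Produce an authentication token for a message using the shared secret."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def is_authentic(message: bytes, token: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message), token)

token = sign(b"client-104")
```

Within a trusted environment, as noted above, such a check could be skipped entirely.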

The unified messaging platform 105 executing operations on the server computing device 106 may further be in communication with one or more enterprise applications (e.g., enterprise stack 110). Enterprise stack 110 may include, for example, an active directory 110a, an enterprise messaging application 110b, a file sharing application 110c, a telemetry application 110d, and the like. The enterprise stack 110 may be stored and/or executed locally, e.g., within an enterprise intranet, or in distributed locations over the Internet. In some cases, enterprise stack 110 may be included within server computing device 106. For example, active directory 110a may be included as part of back end 106c of server computing device 106. In some instances, enterprise stack 110 may reside or communicate with the unified messaging platform 105 within a trusted environment. In aspects, information and/or messages received, sent or stored via the unified messaging platform 105 may be communicated to the enterprise stack 110. Moreover, information and/or messages received, sent or stored via the enterprise stack 110 may be communicated to the unified messaging platform 105.

Additionally, in some aspects, the unified messaging platform 105 executing on the server computing device 106 may be in communication with one or more third party messaging applications 116. Third party messaging applications 116 are messaging applications that are hosted or controlled by third parties, including third party email messaging applications, SMS applications, instant messaging applications, social networking applications, and the like. In aspects, some users who are members of a team may be registered with the unified messaging platform 105 (e.g., internal users), whereas other users who are members of the team may not be registered with the unified messaging platform 105 (e.g., external users) but may be registered with one or more third party messaging applications 116. In some aspects, users who are registered with an enterprise messaging application 110b, but not with the unified messaging platform 105, are considered external users. In this case, the unified messaging platform 105 may communicate with one or more third party messaging applications 116 and/or with one or more enterprise messaging applications 110b to exchange information and messages with external users. In some aspects, communication between the unified messaging platform 105 and the one or more third party messaging applications 116 and/or the one or more enterprise messaging applications 110b over network 108 may involve authentication 112. In other aspects, communication between the unified messaging platform 105 and, for example, the one or more enterprise messaging applications 110b, may not involve authentication 112.

As should be appreciated, the various devices, components, etc., described with respect to FIG. 1 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.

FIG. 2A illustrates an exemplary interface for interacting with the unified messaging platform, according to a first example embodiment.

In aspects, a user may interact with a unified messaging platform via a user interface 200, e.g., a graphical user interface. An exemplary unified messaging platform 105 is provided in the description of FIG. 1, and further described throughout the rest of the present disclosure such as in FIGS. 2A-2G, among other examples. In some aspects, the user interface 200 may involve one or more panes or windows for organizing the display of information and/or interactive controls. In one example, the user interface 200 may include three panes, e.g., a left rail 202, a center pane 204, and a right rail 206. In another example, the user interface 200 may include two panes, e.g., a left rail and a right rail. In still other examples, the user interface 200 may include one pane, four or more panes, and/or panes may be embodied in multiple browser or application windows.

As detailed above, each pane or window may display information in the form of text, graphics, etc., and/or one or more interactive controls or links. For example, a first pane, e.g., left rail 202, may display one or more teams 208, an email portal, etc. As used herein, a team refers to any group of two or more users formed for one or more purposes. A team may be formed for any conceivable purpose or purposes, e.g., a business purpose, a social purpose, a charitable purpose, and the like. Moreover, a team may comprise any type of user, e.g., co-workers, family members, classmates, business associates, and the like. In aspects, a team may be formed within the unified messaging platform 105 by creating a team title, e.g., leadership team, design team, event team, project team, etc., and adding users (e.g., members) to the team. For example, in a settings or administration pane (not shown), members may be added to the team by selecting an identifier of a user, e.g., a user icon, a user email, a user phone number, etc. In at least some aspects, each member of a team is granted access to a team portal or channel. Furthermore, any number of teams may be created within the unified messaging platform 105 and/or teams may be implicitly created based on communications between two or more users.
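The team-formation steps described above (create a team title, then add members by an identifier such as an email or phone number) can be sketched minimally as follows. The `Team` type and its behavior are illustrative assumptions, not the platform's actual data model.

```python
# Hedged sketch of team creation and member addition as described above;
# the Team class and its fields are invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Team:
    title: str
    members: List[str] = field(default_factory=list)  # user identifiers (e.g., emails)

    def add_member(self, identifier: str) -> None:
        """Add a user by any identifier (user icon, email, phone number, etc.)."""
        if identifier not in self.members:
            self.members.append(identifier)

design = Team("Design Team")
design.add_member("ada@example.com")
design.add_member("ada@example.com")  # duplicate additions are ignored
```

Each member added this way would then be granted access to the team portal, consistent with the passage above.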

A team portal may provide access to all communications, files, links, lists, hashtags, development tools, etc., shared by any member of a team. According to embodiments, in response to selection (e.g., by clicking) of a team 208 within a pane, e.g., the left rail 202, a team portal may be opened. A team portal refers to an access point through which team members can view and interact with shared information and other team members. In at least some cases, each member of a team is granted full access to the information and conversations shared within the team portal. In aspects, in response to selection of a team 208, general information regarding the team, project specifications, etc., may be displayed in a second pane, e.g., center pane 204. For example, member names, member contact information (e.g., email addresses, phone numbers, etc.), member usage time, project specifications, project time lines, project mission, and the like, may be displayed in the center pane 204.

A team portal may be further organized based on customizable categories 210 of information for a team 208. For example, any suitable category 210 for organizing team information may be created for a team portal, e.g., finance, engineering, launch readiness, debugging, catering, construction, general, random, and the like. In aspects, information related to a category 210 may be displayed in center pane 204 in response to selecting a category 210 of a team 208 within left rail 202. In some instances, each member of a team is granted full access to information associated with each category 210 of a team 208 within the team portal.

As noted above, a team portal may provide access to all communications, files, links, lists, hashtags, etc., shared by members of a team 208. In aspects, within each category 210, information may further be organized by tabs or pages. For example, each tab 212 may display a different type of information associated with a category 210 in the center pane 204. When selected, a tab 212 may be identified by highlighting, by a different font or font color, by outlining, by underlining, etc. As illustrated by FIG. 2A, in response to selection of a first tab (e.g., conversations tab 212a, denoted by underlining), communications 218 between team members may be displayed in center pane 204. As used herein, the term “communication” may be used interchangeably with the term “message.” In aspects, a conversation 216 entails two or more communications 218 of any type or mode between team members. In some cases, a conversation 216 may be displayed in ascending order with the most recent communication 218 displayed at the bottom of the center pane 204. Alternatively, a conversation 216 may be displayed in descending order with the most recent communication 218 displayed at the top of the center pane 204.
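The ascending versus descending display options just described amount to sorting a conversation's communications by timestamp. A minimal sketch, assuming a simple `(timestamp, text)` representation that is not the platform's actual message model:

```python
# Sketch of ascending vs. descending conversation display by timestamp;
# the Communication tuple is an assumed stand-in for a real message record.
from typing import List, Tuple

Communication = Tuple[int, str]  # (timestamp, text)

def order_conversation(msgs: List[Communication], ascending: bool = True) -> List[Communication]:
    """Ascending puts the most recent message last (bottom of the pane);
    descending puts it first (top of the pane)."""
    return sorted(msgs, key=lambda m: m[0], reverse=not ascending)

thread = [(3, "latest"), (1, "oldest"), (2, "middle")]
```

Either ordering operates on the same underlying conversation; only the presentation differs.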

In some cases, described further below, one or more communications 218 (e.g., communications 218a and 218b) may be grouped as a conversation thread 220. A communication 218 refers to a single message transmitted by a team member in any format (e.g., email, SMS, instant message, etc.) via any mode (e.g., via the unified messaging platform, or via any enterprise or third-party messaging application). That is, messages may be generated within the unified messaging platform between internal users or messages may be communicated to and from external users via enterprise messaging applications (e.g., enterprise messaging application 110b) and/or third party messaging applications (e.g., third party messaging applications 116).

As provided above, each pane or window may display information and/or interactive controls. For example, a third pane, i.e., right rail 206, may display context information, status information, recent activity, and the like. In some aspects, information displayed in the right rail 206 may be related to or associated with the category 210 selected in the left rail 202 and/or the tab 212 selected in the center pane. For instance, where the center pane 204 displays communications, files, links, lists, hashtags, etc., related to a category 210a entitled “New Product Launch,” the right rail 206 may display one or more recent files 222, recent links 224, tags 226, or active people 228 related to the New Product Launch. In some aspects, at least some of the information displayed in the right rail 206 may be specific to a particular user (e.g., the particular user accessing the team portal via a client computing device 104, “accessing user”). For example, the particular user accessing the team portal may be identified by a name, icon, or the like, within right rail 206, such as user name 230a or user icon 230b. That is, in some cases, the recent files 222 and/or recent links 224 related to the New Product Launch may have been recently accessed or uploaded by the accessing user. Moreover, the right rail 206 displayed for another user accessing the same category 210 may display a different set of recent files 222 or recent links 224. In further examples, additional or different information relevant to a category 210 and a particular user may be displayed in the right rail 206, e.g., user tasks, user alerts, user calendar, user notes, etc.

According to additional aspects, center pane 204 may include a search field 240. For example, search field 240 may allow a user to search within a team portal for any communication, file, link, list, hashtag, term, team member, calendar, task, event, and the like, related to a team 208. In aspects, search field 240 may allow for plain language searching, Boolean searching (e.g., searching using Boolean operators), or otherwise. In response to entering one or more search terms into the search field 240, any information related to the search terms within the team portal may be displayed as search results to the accessing user.
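The search behavior described above (plain-language terms with implicit AND, plus explicit Boolean operators) can be sketched as follows. This is a minimal illustration only; the item structure, field names, and operator handling are assumptions for the sketch, not the platform's actual implementation:

```python
def matches(item_text, query):
    """Evaluate a simple query against item text.

    Supports plain-language terms (implicit AND) and the Boolean
    operators AND / OR / NOT between terms.
    """
    text = item_text.lower()
    result, op = None, "AND"
    for tok in query.split():
        if tok.upper() in ("AND", "OR", "NOT"):
            op = tok.upper()
            continue
        hit = tok.lower() in text
        if op == "NOT":
            hit = not hit
            op = "AND"
        # Combine this term with the running result
        result = hit if result is None else (result and hit if op == "AND" else result or hit)
        op = "AND"
    return bool(result)

def search_portal(items, query):
    # items: list of dicts with a 'text' key (illustrative shape)
    return [it for it in items if matches(it["text"], query)]
```

In use, `search_portal(items, "launch OR budget")` would return every communication, file, link, etc., whose text mentions either term, mirroring the behavior of search field 240 described above.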

As should be appreciated, the various features and functionalities of user interface 200 described with respect to FIG. 2A are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2B illustrates an exemplary interface for interacting with the unified messaging platform, according to a second example embodiment.

As described above, the unified messaging platform may provide a user interface 200 including three panes, e.g., a left rail 202, a center pane 204, and a right rail 206. As illustrated by FIG. 2B, the unified messaging platform may provide a variety of options for generating communications. For example, the unified messaging platform may provide a new message input field, e.g., new message input field 232, for sending an instant message, SMS, or other “text-like” communication. In aspects, new message input field 232 may allow entry of text, entry of commands, entry of user callouts, entry of hashtags, entry of images, entry of rich web content, entry of rich interactive content, etc. New message input field 232 may further include controls 268 for attaching files, inserting emoticons, etc. However, in at least some aspects, new message input field 232 may not provide for entry of recipients or a subject line. In response to inputting a message into a new message input field 232 and hitting “enter,” a communication from a user may automatically post to a conversation as a new “text-like” message. According to further aspects, new message input field 232 may include optional controls 266 (denoted as an ellipsis) for expanding the new message input field 232 into an email interface object (e.g., email interface object 238 described below).

Alternatively, the unified messaging platform may provide a reply link 234 associated with each communication of a conversation. In some aspects, reply link 234 is displayed near each communication of a conversation, e.g., to the right of a sender or subject line for a communication (not shown), indented below a communication (shown), up and to the right of a communication (not shown), and the like. Alternatively, reply link 234 may not be displayed unless and until a communication is clicked, hovered over, touched or otherwise identified with an input device (e.g., mouse, pointer, etc.). Upon display and in response to selection of a reply link 234 associated with a particular communication, a message reply input field may be displayed (not shown). Similar to the new message input field 232, the message reply input field may allow entry of text, entry of commands, entry of hashtags, attachment of files, insertion of emoticons, etc. However, in this case, in response to inputting a message and hitting enter, a communication from the user may automatically post within a conversation thread 220 associated with the particular communication. In aspects, as illustrated by FIG. 2A, secondary communications 218b within a conversation thread 220 may be displayed as indented, bulleted, or otherwise offset below a primary or initial communication 218a (in above example, the “particular communication” may be referred to as a “primary communication”).

Alternatively still, the unified messaging platform may provide an email control 236 for accessing an email interface object, e.g., email interface object 238, to send “email-like” communications. In aspects, email interface object 238 may allow similar actions to new message input field 232, such as an input field 276 for entry of text, entry of commands, entry of hashtags, etc., and controls 268 for attachment of files, insertion of emoticons, etc. Additionally, email interface object 238 may provide controls 278 for altering text font and size, bulleting text, etc., and controls 270 for sending, saving a draft email, deleting, etc. Email interface object 238 may further provide a recipient field 272 for inputting or selecting recipients and a subject field 274 for inputting a subject line, and the like. In response to inputting a message into an email interface object 238 and hitting “send” or “enter,” a communication from the user may automatically post to the conversation as a new “email-like” message.

As should be appreciated, the various features and functionalities of user interface 200 described with respect to FIG. 2B are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2C illustrates an exemplary interface for interacting with the unified messaging platform, according to a third example embodiment.

As described above, the unified messaging platform may provide a user interface 200 including three panes, e.g., a left rail 202, a center pane 204, and a right rail 206. Moreover, as described above, each tab 212 may display a different type of information associated with a category 210a in the center pane 204. For example, as illustrated by FIG. 2C, a second tab (e.g., files tab 212b) may be selected (denoted by underlining) to display files 242 shared between team members. Files 242 may include any type of file, e.g., document files, spreadsheet files, presentation files, image files, video files, audio files, note files, and the like.

In some aspects, files 242 displayed in files tab 212b include files that were sent as attachments to communications 218 between team members. That is, the unified messaging application may extract files sent as attachments and automatically save them in files tab 212b. In other aspects, as illustrated by FIG. 2C, a file upload field 244 may be provided. In response to selecting file upload field 244, one or more files 242 may be saved to the files tab 212b by a user. For example, in response to selection of file upload field 244, a browsing box (not shown) may be activated for retrieving a file for upload. Alternatively, a command may be entered (e.g., “/file”) for retrieving a file for upload. Alternatively still, a file may be copied and pasted into file upload field 244. In aspects, any suitable method for uploading and saving a file to the files tab 212b may be implemented. In at least some aspects, a single version of a first file with a first file name exists in files tab 212b such that any annotations (e.g., revisions, comments, or other data) made to the first file are synchronized and stored within the single version. In some aspects, in response to saving the first file with a second file name, a second file can be created, attached, and/or uploaded to files tab 212b.
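The single-version behavior described above, where annotations merge into one stored copy and only a save under a new file name produces a second file, can be sketched as follows. The class and method names here are illustrative assumptions, not the platform's API:

```python
class FilesTab:
    """Single-version file store: annotations to an existing file name
    merge into that one version; saving under a second file name
    creates a separate entry. (Illustrative sketch only.)"""

    def __init__(self):
        self._files = {}  # file name -> {'content': ..., 'annotations': [...]}

    def upload(self, name, content):
        self._files.setdefault(name, {"content": content, "annotations": []})

    def annotate(self, name, annotation):
        # Revisions and comments are stored on the single version
        self._files[name]["annotations"].append(annotation)

    def save_as(self, name, new_name):
        # Saving with a second file name creates a second file
        src = self._files[name]
        self._files[new_name] = {"content": src["content"], "annotations": []}

    def versions(self, name):
        return 1 if name in self._files else 0
```

Any team member annotating "plan.docx" writes into the same single version, so every view of the file stays synchronized.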

According to further examples, a third tab (e.g., links tab 212c) may display links (e.g., hyperlinks) shared between team members. In some aspects, links displayed in the links tab 212c include links that were sent within the body of a communication or as attachments to a communication between team members. That is, the unified messaging application may extract links sent within or as attachments to communications and may automatically save them to the links tab 212c. In other aspects, a link upload field (not shown) may be provided. In response to selecting the link upload field, one or more links may be saved to the links tab 212c by a user. For example, in response to selection of a link upload field, a browsing box (not shown) may be activated for retrieving a link for upload. Alternatively, a command may be entered (e.g., “/link”) for retrieving a link for upload. Alternatively still, a link may be copied and pasted into the link upload field. In aspects, any suitable method for uploading and saving a link to the links tab 212c may be implemented.

A fourth tab (e.g., lists tab 212d) may display list objects and/or other information, data, files, images, etc., shared between team members. In aspects, list objects may include lists, tables, charts, or other organized forms of data. In some aspects, list objects displayed in lists tab 212d include list objects that were sent within the body of a communication 218 or as an attachment to a communication 218 between team members. That is, the unified messaging application may extract list objects sent as attachments or within a message body and automatically save them to lists tab 212d. As used herein, a message body refers to content displayed within a communication (e.g., excluding recipient, sender, time stamp, subject information, confidentiality disclaimer, etc.) that need not be activated or opened for viewing.

In other aspects, a list object may be created or uploaded by a user within lists tab 212d. For example, a list creation control (not shown) may be provided for creating a list object. In some cases, in response to selecting the list creation control, a list object may be created and inserted in a message body and/or attached to a message. Upon creating the list object, the list object may be automatically saved to the lists tab 212d. Alternatively, a list upload field (not shown) may be provided. In response to selecting a list upload field, one or more list objects may be selected, uploaded and saved to the lists tab 212d by a user, as described similarly above. In at least some cases, a single copy of each list object may exist such that if data is updated in any view, e.g., within the communications tab 212a or the lists tab 212d, the list object is automatically updated and synchronized across all other views.

According to aspects, any number of tabs 212 may be created for organizing and sequestering various forms of information related to a category 210a. For example, a hashtag tab may be included to store various hashtags created within communications between team members. In additional examples, custom or extensibility tabs may be created, e.g., a tab for a spreadsheet dashboard, a tab for a webpage, a tab for a custom application, a tab for a system plugin, and the like.

In further aspects, additional interactive controls or links (e.g., controls 246) may be provided, e.g., in left rail 202, for quickly and easily accessing communications, files, lists, links, tags, etc., related to a team 208. For example, people control 246a may access team members and/or conversations stored in the team portal, files control 246b may access files stored in the team portal, lists control 246c may access lists stored in the team portal, links control 246d may access links stored in the team portal, and hashtags control 246e may access hashtags stored in the team portal. In some aspects, selection of a control 246 may display a corresponding tab view within the center pane 204. In other aspects, selection of a control 246 may display results for all categories within a team portal, e.g., in the form of search results associated with a particular control 246.

As illustrated by FIG. 2C, in response to selection of a files tab 212b, the right rail 206 may display different information than when a different tab 212 is viewed in center pane 204. For example, selecting or highlighting a file 242a in center pane 204 may cause information related to file 242a to be displayed in the right rail 206. For instance, a file history 262 for the file 242a may be displayed in the right rail 206. The file history 262 may include information such as a user identifier for a user who uploaded the file 242a, a user who authored the file 242a, a user who edited the file 242a, a file creation date, a file revision date, and the like. The right rail 206 may further display recent comments 264 regarding file 242a. In aspects, any information related to file 242a may be displayed in right rail 206.

As should be appreciated, the various features and functionalities of user interface 200 described with respect to FIG. 2C are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2D illustrates an exemplary interface for interacting with the unified messaging platform, according to a fourth example embodiment.

As described above, the unified messaging platform may provide a user interface 200 including three panes, e.g., a left rail 202, a center pane 204, and a right rail 206. In further aspects, the left rail 202 may include an email portal 214. Unlike a team portal, email portal 214 may be an access point through which a particular user can view and interact with his or her email messages inside or outside of the context of a team. In aspects, in response to selection of email portal 214, a second pane, e.g., center pane 204, may display a user's email messages. Center pane 204 may further display a user identifier 248 as a header, e.g., a user email address, a user name, a user icon, and the like. Center pane 204 may provide one or more tabs 250 for organizing the user's email messages. Tabs 250 may include, for instance, an inbox tab 250a, a files tab 250b, a links tab 250c, a sent tab 250d, a drafts tab 250e, a deleted tab 250f, and the like. For example, a user's inbox of messages may be displayed in the center pane 204 in response to selection of inbox tab 250a (denoted by underlining). In some aspects, the user's inbox of messages may include all messages sent to the user, e.g., messages between team members, including internal and external users, as well as messages between entities and users that are not team members.

In some aspects, the user's email messages 280 in inbox tab 250a may be displayed in a summary list format (shown) in descending order based on a date the email message was received with the most recent email message displayed at the top of center pane 204. The summary list format may display a portion of each email message, e.g., a sender, a subject line, and a portion of text for each email message.

In alternative aspects, the user's email messages in inbox tab 250a may be displayed in a conversation thread format (not shown). A conversation thread format may display email messages which are replies to a primary email message as indented, bulleted, or otherwise offset below a primary email message. In at least some aspects, each conversation thread may be displayed in descending order based on a date the last email message in the conversation thread was received, with the most recent conversation thread displayed at the top of center pane 204. In this case, individual communications (e.g., communications that have not been replied to) may be interspersed among and between conversation threads in descending order based on a date the individual communication was received. In other aspects, each conversation thread may be displayed in ascending order based on a date the last email message in the conversation thread was received with the most recent conversation thread displayed at the bottom of center pane 204. In this case, individual communications may be interspersed among and between conversation threads in ascending order based on a date the individual communication was received.
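The ordering rule above, where each conversation thread is positioned by the date of its last message and individual communications are interspersed among threads by their own dates, can be sketched as follows. The tuple-based data shapes are assumptions for the sketch:

```python
def order_inbox(threads, individuals, descending=True):
    """Order conversation threads and individual communications together.

    threads: list of lists of (timestamp, message) tuples, where the
    first tuple is the primary message and the rest are replies.
    individuals: list of (timestamp, message) tuples for communications
    that have not been replied to.
    Each entry is keyed by the date of its most recent message.
    """
    entries = []
    for thread in threads:
        last = max(ts for ts, _ in thread)  # date of last message in thread
        entries.append((last, thread))
    for ts, msg in individuals:
        entries.append((ts, [(ts, msg)]))  # a lone message is its own entry
    entries.sort(key=lambda e: e[0], reverse=descending)
    return [e[1] for e in entries]
```

With `descending=True` the most recent thread appears at the top of center pane 204; with `descending=False` it appears at the bottom, matching the two display orders described above.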

In further aspects, email messages that have been opened or viewed may be displayed within the inbox tab 250a of center pane 204 with normal text, whereas email messages that have not been opened or viewed may be displayed within the center pane 204 with at least portions of the email message in bold text (e.g., a sender and/or a subject line may be displayed with bold text).

As should be appreciated, the various features and functionalities of user interface 200 described with respect to FIG. 2D are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2E illustrates an exemplary interface for interacting with the unified messaging platform, according to a fifth example embodiment.

As described above, the unified messaging platform may provide a user interface 200 including three panes, e.g., a left rail 202, a center pane 204, and a right rail 206. As described above, in response to selection of email portal 214, center pane 204 may display a user's email messages. In some aspects, as illustrated by FIG. 2E, a user's email messages may be organized based on conversations 252 between one or more users. For example, as shown in left rail 202, a conversation 252a between a first user and a second user (e.g., Rachel) may be displayed separately from a conversation 252b between the first user, a third user (e.g., Rob) and a fourth user (e.g., Sofia).

In aspects, by selecting a conversation 252 displayed in the left rail 202, communications between the one or more users may be displayed in center pane 204. As illustrated in FIG. 2E, conversation 252c has been selected and the communications 254 between the first user and the second user (e.g., Rachel), the third user (e.g., Rob), a fifth user (e.g., Jim), and a sixth user (e.g., Sophia) are displayed in center pane 204. In this example, the first user refers to the accessing user (e.g., Ping Li) identified by user name 256a and user icon 256b.

In aspects, communications 254 of conversation 252c may be displayed in descending order based on a date each communication 254 was received with the most recent communication 254 displayed at the top of center pane 204. In other aspects, communications 254 of conversation 252c may be displayed in ascending order based on a date each communication 254 was received with the most recent communication 254 displayed at the bottom of center pane 204.

In further aspects, information related to conversation 252c may be organized by tabs or pages. For example, each tab 258 may display a different type of information associated with conversation 252c in the center pane 204. When selected, a tab 258 may be identified by highlighting, with a different font or font color, by outlining, underlining, and the like. As illustrated by FIG. 2E, a first tab (e.g., conversation tab 258a) may display the communications 254 between the first user, second user, third user, fifth user and sixth user. Additional tabs, described in further detail above, may include a second tab (e.g., files tab 258b), a third tab (e.g., links tab 258c), a fourth tab (e.g., lists tab 258d), and the like, for displaying files, links, lists, etc., shared between participants in the conversation 252c. For example, as illustrated by FIG. 2E, a list object 260 was inserted in communication 254a from the second user (e.g., Rachel). In aspects, as described above, the list object 260 may be accessed from the conversation tab 258a or from the lists tab 258d.

As illustrated by FIG. 2E, when viewing a conversation 252c between the first user, second user, third user, fifth user and sixth user, the right rail 206 may display information associated with the conversation 252c and/or the users participating in the conversation 252c. For example, the right rail 206 may display group availability 282 for the users participating in the conversation 252c. The right rail 206 may further display common meetings 284 between the users participating in the conversation 252c. In aspects, any information related to conversation 252c and/or the participating users may be displayed in right rail 206.

As should be appreciated, the various features and functionalities of user interface 200 described with respect to FIG. 2E are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2F illustrates an exemplary mobile interface for interacting with the unified messaging platform, according to an example embodiment.

In aspects, a version of the unified messaging platform may provide a user interface 285 for mobile devices. The mobile user interface 285 may provide one or more panes or windows for viewing communications, files, lists, links, etc., associated with one or more teams of which a user is a member. In some aspects, a second pane may be displayed (e.g., second pane 288) in response to swiping a first pane (e.g., first pane 286) in a left-to-right direction or a right-to-left direction.

As illustrated, first pane 286 displays one or more teams (e.g., team 287) and one or more categories (e.g., categories 291). In aspects, a notification (e.g., notification 292) may be displayed near a category (e.g., category 291a) when a new communication, file, list, hyperlink, etc., has been received within the category 291. As further illustrated, second pane 288 displays one or more communications 289 (e.g., communications 289a and 289b), which are each associated with a sender (e.g., senders 290a and 290b).

As should be appreciated, the various features and functionalities of user interface 285 described with respect to FIG. 2F are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 2G illustrates an exemplary mobile interface for interacting with the unified messaging platform, according to a second example embodiment.

As described above, mobile user interface 285 may allow a user to view a conversation (e.g., conversation 293) in a conversation pane (e.g., conversation pane 294). The mobile user interface 285 may further provide a new message input field 295 and an input interface 296 for inputting and sending communications to participants of the conversation 293. In aspects, when a communication is sent to the participants of an ongoing conversation (e.g., conversation 293), new message input field 295 does not require recipient information but may provide a subject input field, e.g., subject input field 297, for inputting a subject of the communication, e.g., “New UX.” In some aspects, new message input field 295 may be similar to an instant, chat, SMS, or similar messaging interface. In other aspects, new message input field 295 may provide functionality similar to an email messaging interface (e.g., allowing for attaching documents, list objects, images, etc.). As illustrated, a communication 298 has been partially input into new message input field 295.

As should be appreciated, the various features and functionalities of user interface 285 described with respect to FIG. 2G are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 3 illustrates an exemplary system implemented on a computing device for message handling, according to an example embodiment.

In aspects, a client computing device 304 may implement a unified messaging application. In some aspects, client computing device 304 may implement a client application 310 for interfacing with unified messaging application 312 implemented on a server computing device 308. In a basic configuration, the client computing device 304 may be described similarly to client computing device 104. However, any suitable client computing device for implementing a unified messaging application 312, or client application 310 of such application, may be utilized.

In aspects, as illustrated in FIG. 3, the unified messaging application 312 may be implemented on a server computing device 308. In a basic configuration, the server computing device 308 may be described similarly to server computing device 106. The server computing device 308 may provide data to and from the client computing device 304 through a network 306, where network 306 is described similarly to network 108. In further aspects, the unified messaging application 312 may be implemented on more than one server computing device 308, such as a plurality of server computing devices 308. In some cases, a textual or voice input may be received at the client computing device 304 and transmitted over the network 306 for processing by unified messaging application 312 at the server computing device 308.

As illustrated in FIG. 3, the unified messaging application 312 may include a create component 314, a link component 316, a transform component 318, an interface component 320, an update component 322, and a synchronize component 324. The various components may be implemented using hardware, software, or a combination of hardware and software. The unified messaging application 312 may be configured to receive and process textual and/or voice input messages. In one example, a textual and/or voice input may include phrases, words, and/or terms in the form of a textual and/or spoken language input (e.g., a user text or voice message). In this regard, the unified messaging application 312 may be configured to receive the textual and/or spoken language input from user 302. In aspects, the unified messaging application 312 may be configured to convert spoken language input into a textual communication between team members. For example, the unified messaging application 312 may include standard speech recognition techniques known to those skilled in the art such as “automatic speech recognition” (ASR), “computer speech recognition”, and “speech to text” (STT). In some cases, the unified messaging application 312 may include standard text to speech techniques known to those skilled in the art such as “text to speech” (TTS).

As illustrated by FIG. 3, the client computing device 304 and the server computing device 308 may further be in communication with storage 326 that stores parameters, configuration information, communications, images, files, interactive semantic objects, or any other information accessed by unified messaging application 312. Storage 326 may be a local or remote database, within an enterprise intranet, or in distributed locations over the Internet. In aspects, storage 326 may include a plurality of files, including formatted, markup or plain text, in any file format such as digital word processing documents, spreadsheets, presentations, webpages, text messages, tweets, email messages, calendars, tasks, and the like.

In aspects, create component 314 may create and store semantic objects. In aspects, a semantic object may act as an access point for external services, may act as a centralized interface object for team interaction regarding a topic, may act as a scheduling interface for team meetings, and the like. Creating a semantic object that acts as an access point may involve identifying an action and adding a control for performing the action at an external service. In additional aspects, creating a semantic object that acts as a centralized interface object may involve identifying a topic, inviting one or more team members, and adding a control for responding to the topic. In still further aspects, creating a semantic object that acts as a scheduling interface may involve inviting one or more team members, scanning calendars of the one or more team members for two or more time periods of common availability, displaying the two or more time periods, and adding a control for selecting at least one of the two or more time periods. As should be understood, similar semantic objects are conceivable and may be similarly created within the scope of the present disclosure.
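The three kinds of semantic objects described above can be sketched as simple factory functions. The dictionary shapes, field names, and the calendar representation (a set of busy slots per member) are illustrative assumptions, not the create component's actual implementation:

```python
def create_access_point(action, service_url):
    """Semantic object acting as an access point to an external service:
    identify an action and add a control for performing it."""
    return {"kind": "access_point", "action": action,
            "controls": [{"label": action, "operation": service_url}]}

def create_topic_object(topic, members):
    """Centralized interface object: identify a topic, invite team
    members, and add a control for responding to the topic."""
    return {"kind": "topic", "topic": topic, "members": list(members),
            "controls": [{"label": "Respond", "operation": "tally"}],
            "responses": {}}

def create_scheduler(members, calendars, slots):
    """Scheduling interface: scan member calendars for time periods of
    common availability and add a control for selecting one.
    calendars maps each member to a set of busy slots (assumed shape)."""
    free = [s for s in slots
            if all(s not in calendars.get(m, set()) for m in members)]
    return {"kind": "scheduler", "members": list(members),
            "periods": free,
            "controls": [{"label": "Pick a time", "operation": "schedule"}]}
```

For instance, a scheduler built over two members' calendars would surface only the slots in which neither member is busy, matching the common-availability scan described above.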

Create component 314 may provide any suitable control or interface for creating a semantic object. For instance, an interface for creating a semantic object may be displayed in response to selection of a control within an email interface object, within a lists tab of the unified messaging platform, or otherwise. The interface may allow for entering text (e.g., entering a topic, entering a meeting, etc.) and may provide drop down boxes for selecting controls (e.g., for performing actions), selecting team members or other recipients, selecting external services, and the like. In further aspects, a “control” may be created within an interactive semantic object as a click button, hyperlink, drop down list box, option button, check box, date picker, or other active object for performing an action. In some aspects, the control may enable performance of the action (e.g., checking a box “yes” or “no” in response to a question posed by the interactive semantic object, selecting an option presented by the interactive semantic object by a click button, and the like). Alternatively, the control may be linked to an operation for performing the action (e.g., an adder/subtractor function, an external services application, and the like). In at least some aspects, when the semantic object is created, the semantic object is stored in an activity tab and/or a lists tab of the unified messaging platform.

Link component 316 may establish links between controls of a semantic object and one or more operations. An operation may be any function, application, system, and the like, for performing an action. In aspects, link component 316 may establish a link between a control and an external services application for performing an action, e.g., approving an expense report, entering payroll, denying a purchase order, entering a work order, and the like. In other aspects, link component 316 may establish a link between a control and an adder/subtractor function (or other function) for aggregating and/or compiling responses to a topic. In still further aspects, link component 316 may establish a link between a control and one or more calendar applications for scheduling a meeting.
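The linking described above, where a control is bound to an operation such as an external-services call or an adder/subtractor function, can be sketched as a registry that maps controls to callables. The class and naming are illustrative assumptions:

```python
class LinkComponent:
    """Links a semantic object's controls to operations (an
    external-services application, an adder/subtractor tally, a
    calendar application, etc.). Illustrative sketch only."""

    def __init__(self):
        self._links = {}  # (object_id, control_label) -> operation callable

    def link(self, object_id, control_label, operation):
        self._links[(object_id, control_label)] = operation

    def activate(self, object_id, control_label, *args):
        # Selecting a control invokes its linked operation
        return self._links[(object_id, control_label)](*args)

def make_tally():
    """Example operation: an adder function that aggregates and
    compiles responses to a topic."""
    counts = {}
    def tally(choice):
        counts[choice] = counts.get(choice, 0) + 1
        return dict(counts)
    return tally
```

Here, each time a team member activates the "Respond" control, the linked tally operation aggregates the response, so every member sees the same running totals.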

In some cases, a structure of a semantic object may be characterized by identifiers or otherwise indexed. For example, the semantic object may include one or more fields, e.g., a recipient field, a topic field, a meeting field, a time period field, an action field, an external services field, a tally field, a control field, and the like. In aspects, at least one identifier may be associated with each field. Additionally, one or more operations associated with the semantic object may be identified by hyperlinks or other identifiers, such as a uniform resource locator (URL) address, an internet protocol (IP) address, file storage location, email address, and the like.

As noted above, a link may be established between a control and one or more operations for performing an action, e.g., an adder/subtractor or other function for aggregating data, an external services application, a calendar application, and the like. In further examples, the one or more operations may be linked to an update component (e.g., update component 322) such that in response to performing the action, results, responses, aggregated data, etc., may be routed to the update component 322. In aspects, in response to establishing one or more links between a control and one or more operations for performing actions, and in response to establishing links between the one or more operations and update component 322, the semantic object may be referred to as an “interactive semantic object” or a “semantic object including an interactive control.” As should be appreciated, the above examples are offered for explanatory purposes and are not intended to be limiting. Accordingly, interactive objects may be linked to other operations without departing from the scope of the present application.

When a receiving application is not unified messaging application 312 or client application 310, transform component 318 may transform the interactive semantic object into a format and/or representation that is understood or readable by a receiving application, e.g., a third party messaging application, an enterprise messaging application, and the like. In some aspects, transform component 318 may reformat the interactive semantic object into a structure renderable by the receiving application. In further aspects, transform component 318 may translate the interactive semantic object into a representation readable by the receiving application. In further aspects, when a response or result is received from an operation, an external services application, adder/subtractor function, etc., the response or result may be transformed into a format and/or representation that is readable by the unified messaging application.

In some cases, interface component 320 may create and/or open a window or portal into an application for performing an action. For instance, interface component 320 may open a portal to an external services application, such as a line-of-business (LOB) management service, customer relationship management (CRM) service, debugging service, accounting service, payroll service, etc. In aspects, the portal may enable a user to interact with an application within a host environment, e.g., a website of an external service hosted by a third party. In aspects, in response to performing an action in the application, a result and/or update may be linked to update component 322.

Update component 322 may report, compile or present updates and/or results related to an interactive semantic object. For instance, update component 322 may create a tally interface object for providing updates and/or results related to the interactive semantic object. That is, as results are received from an operation or responses are received from users, the results and/or responses may be reported in the tally interface object. In other cases, as results are received from an operation or responses are received from users, the results and/or responses may be reported in an update field, a status field, a voting field, or otherwise, within an interactive semantic object. In some cases, a link established between the one or more operations and the update component 322 may automatically route results, responses, aggregated data, etc., from the one or more operations to the update component 322 for reporting.
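
The update component's tally behavior can be sketched as a counter that compiles routed responses for reporting, assuming responses arrive as simple labels. The names below are illustrative only.

```python
from collections import Counter

class UpdateComponent:
    """Compiles results/responses for an interactive semantic object."""

    def __init__(self):
        self.tally = Counter()

    def report(self, response):
        # Each routed result or response increments the tally.
        self.tally[response] += 1

    def summary(self):
        # Snapshot suitable for display in a tally interface object.
        return dict(self.tally)

updates = UpdateComponent()
for response in ["approve", "approve", "reject"]:
    updates.report(response)
```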

Synchronize component 324 may synchronize results, responses, aggregated data, selections, etc., related to an interactive semantic object such that a single version of the interactive semantic object (or a tally interface object) is provided on the unified messaging platform. That is, if the interactive semantic object is embedded in a communication shared between users in a conversation, a single version of the interactive semantic object may be viewable in a conversation tab. Users registered with the unified messaging application may interact directly with the single version of the interactive semantic object in the conversation tab, an activity tab, and/or a lists tab. Accordingly, any results, updates, and/or responses entered into one view (e.g., tab) may be synchronized across all views (e.g., all tabs).
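
A minimal way to model "a single version across all views" is to have each tab hold a reference to the same shared object, so an update entered through any view is immediately visible in every other view. This is a sketch under that assumption; a real implementation would synchronize across processes and devices.

```python
class SharedSemanticObject:
    def __init__(self, topic):
        self.topic = topic
        self.responses = []

class View:
    def __init__(self, name, obj):
        self.name = name   # e.g. "conversation", "activity", "lists"
        self.obj = obj     # reference to the single shared version

    def respond(self, response):
        self.obj.responses.append(response)

shared = SharedSemanticObject("Pick a name")
conversation = View("conversation", shared)
activity = View("activity", shared)

# A response entered in one view (tab) is visible in all views.
conversation.respond("Contoso")
```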

Alternatively, when responses, results, etc., are received from external users and/or third party applications or external services, the results, updates, and/or responses may be transformed into a format and/or representation readable by the unified messaging application and routed to update component 322. Update component 322 may compile and/or report the results, updates, and/or responses from external sources. Synchronize component 324 may then synchronize the results, updates, and/or responses for an interactive semantic object (and/or the tally interface object) across all views to provide a single version of the interactive semantic object (and/or tally interface object).

Traditionally, when collaborating on a topic, determining a meeting time, or cooperating on a decision, direction, or strategy, users may send numerous communications back and forth, which creates confusion, lack of consensus, and distraction among team members. In aspects disclosed herein, an interactive semantic object may be provided in a single version such that as results, updates, and/or responses are received, the single version of the interactive semantic object may be updated and provided on the unified messaging platform. For instance, in the case of making a collective decision regarding a direction or strategy, a user may create an interactive semantic object by selecting one or more recipients, inputting one or more options for the direction or strategy, adding at least one control for selecting at least one option, and providing an update field for reporting results of user selections. In this case, team members are able to view the progress of the polling in real time as selections are received and the single version of the interactive semantic object is updated.

According to further aspects, the client computing device 304 and/or server computing device 308 may be in communication with a third party computing device 328. Third party computing device 328 may be described similarly to server computing device 106 or server computing device 308. In aspects, third party computing device 328 may host one or more third party messaging applications, an enterprise messaging application, a word processing application, a collaborative authoring application, a calendar application, an external services application, etc. In at least some aspects, authentication (e.g., authentication 112) may be required to access third party computing device 328.

As should be appreciated, the various devices, components, etc., described with respect to FIG. 3 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.

FIG. 4 illustrates an exemplary method for creating a semantic object, according to an example embodiment.

Method 400 begins with create operation 402, where a semantic object is created in a unified messaging application. For instance, an interface for creating the semantic object may be displayed in response to selection of a control within an email interface object, within a lists tab of the unified messaging platform, or otherwise. The interface may allow for entering text (e.g., entering a topic, entering a meeting, etc.) and may provide drop down menus, etc., for selecting interactive controls (e.g., for performing actions), selecting team members or other recipients, selecting external services, and the like. Additionally, an “interactive control” may be created within a semantic object as a click button, hyperlink, drop down list box, option button, check box, date picker, or other active object for performing an action. In some aspects, the interactive control may enable performance of an action (e.g., checking a box “yes” or “no” in response to a question posed by the interactive semantic object, selecting an option presented by the semantic object using a click button, and the like). Alternatively, the interactive control may be linked to an operation for performing the action (e.g., an adder/subtractor or other function, an external services application, and the like). In at least some aspects, when the semantic object is created, the semantic object is stored in a conversation tab, an activity tab and/or a lists tab of the unified messaging platform.

In some aspects, one or more parameters may be associated with a semantic object. For instance, a semantic object may be associated with a duration period, a completion date, an end time, etc. That is, in some cases, a semantic object may be held open for a certain period of time or “duration period.” For instance, if a semantic object is created to vote on a destination for lunch, the semantic object may be associated with an end time of 11:30 am. In another example, if the semantic object is created to vote on a name for a new product line, the semantic object may be associated with a suitable completion date or an end time prior to a launch date for the product line. After the duration period, completion date, and/or end time, the semantic object may no longer be displayed, may no longer accept updates, may indicate that results are final, etc.
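
The duration-period parameter might be implemented as an end-time check: after the end time, the object stops accepting updates and marks results final. The times and behavior below are assumptions chosen to match the lunch-vote example.

```python
from datetime import time

class TimedSemanticObject:
    """Semantic object held open only until its end time."""

    def __init__(self, topic, end_time):
        self.topic = topic
        self.end_time = end_time    # e.g. time(11, 30) for a lunch vote
        self.votes = []
        self.final = False

    def vote(self, choice, now):
        if now >= self.end_time:
            self.final = True       # indicate that results are final
            return False            # no longer accept updates
        self.votes.append(choice)
        return True

lunch = TimedSemanticObject("Lunch?", end_time=time(11, 30))
lunch.vote("Thai", now=time(10, 0))   # accepted: before end time
lunch.vote("Pizza", now=time(12, 0))  # rejected: past end time
```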

At identify structure operation 404, a structure of the semantic object may be identified. In some cases, a structure of a semantic object may be characterized by identifiers or otherwise indexed. For example, the semantic object may include one or more fields, e.g., a recipient field, a topic field, a meeting field, a time period field, an action field, an external services field, a tally field, an interactive control field, and the like. In aspects, at least one identifier may be associated with each field. In aspects, an interactive control may be associated with the structure of the semantic object by an identifier or other indexing. Additionally, one or more operations associated with the interactive control may be identified by hyperlinks or other identifiers, such as a uniform resource locator (URL) address, an internet protocol (IP) address, file storage location, email address, and the like.

At link operation 406, a link may be established between the interactive control and one or more operations. An operation may be any function, application, system, and the like, for performing an action. In aspects, establishing a link between an interactive control and one or more operations may include establishing a link between the interactive control and an external services application for approving an expense report, entering payroll, denying a purchase order, entering a work order, and the like. In other aspects, establishing a link between the interactive control and one or more operations may include establishing a link between the interactive control and an adder/subtractor or other function for aggregating and/or compiling responses to a topic. In still further aspects, establishing a link between the interactive control and one or more operations may include establishing a link between the interactive control and one or more calendar applications for scheduling a meeting. As should be appreciated, the above examples are offered for explanatory purposes and are not intended to be limiting. Accordingly, interactive controls may be linked to other operations without departing from the scope of the present application.

At identify endpoint operation 408, an endpoint registered with a receiving application may be identified. In aspects, in response to creating an interactive semantic object, the interactive semantic object may be shared with one or more users at one or more endpoints. In some cases, one or more endpoints may be identified for each recipient, e.g., a personal computer, a mobile device, a tablet, a smart television, etc. Identifying an endpoint may include identifying a device type for the endpoint (e.g., mobile device, personal computer, tablet computer, etc.), a display type for the endpoint (e.g., monitor, television, touch enabled display, graphical display, alphanumeric display, etc.), applications registered with the endpoint (e.g., enterprise or third party email messaging applications, SMS messaging applications, social networking applications, instant messaging applications, voicemail applications, calendaring applications, etc.), and the like.
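
An endpoint record capturing the properties identified at this operation (device type, display type, registered applications) might look like the following sketch; the field names and application labels are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    device_type: str                  # "mobile", "pc", "tablet", ...
    display_type: str                 # "touch", "monitor", "alphanumeric", ...
    registered_apps: list = field(default_factory=list)

    def is_registered_with(self, app_name):
        # Used by the decision operation to pick share vs. transform.
        return app_name in self.registered_apps

phone = Endpoint("mobile", "touch", ["third_party_email", "sms"])
```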

At decision operation 410, it is determined whether an endpoint is registered with a unified messaging application. In aspects, whereas a semantic object may have been created at an endpoint registered with the unified messaging application (e.g., by an accessing user), the semantic object may be shared with recipients on endpoints that are not registered with the unified messaging application. If an endpoint is registered with the unified messaging application, the method proceeds to share operation 414. Alternatively, if an endpoint is not registered with the unified messaging application, the method proceeds to transform operation 412.

At transform operation 412, the semantic object may be transformed such that it is readable and/or renderable by one or more receiving applications registered with the one or more endpoints. In some aspects, for applications other than the unified messaging application, the semantic object may be altered (i.e., transformed) such that it can be provided to a team member who is not registered with the unified messaging application. Transforming the semantic object may involve translating the semantic object into a representation readable by a receiving application and may also involve reformatting the semantic object such that it is renderable by a receiving application registered with the recipient endpoint. Thus, transforming the semantic object may be described in terms of a translation process (e.g., providing the semantic object in a language or representation readable by a consuming application) and a reformatting process (e.g., providing the semantic object in a structure for rendering by a consuming application and/or by a particular endpoint). In some aspects, the transform operation may involve a single process that transforms the semantic object into a language or representation readable by a receiving application, where the receiving application performs any processing necessary for rendering or presenting the semantic object on a particular recipient endpoint.
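
The two-stage transform described above (translation into a readable representation, then reformatting into a renderable structure) can be sketched as two functions. The target representation and the HTML-like output are assumptions standing in for whatever a real receiving application consumes.

```python
def translate(obj, target):
    # Translation: express the object in the receiving app's representation.
    if target == "third_party_email":
        return {"subject": obj["topic"], "options": obj["options"]}
    raise ValueError("unknown receiving application")

def reformat(translated):
    # Reformatting: a structure the receiving application can render,
    # here a simple HTML fragment standing in for the real structure.
    items = "".join(f"<li>{o}</li>" for o in translated["options"])
    return f"<b>{translated['subject']}</b><ul>{items}</ul>"

obj = {"topic": "Strategy vote", "options": ["Plan A", "Plan B"]}
rendered = reformat(translate(obj, "third_party_email"))
```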

During transformation, the semantic object may be reformatted into different structures renderable by different receiving applications, e.g., a first structure renderable by a third party email messaging application and a second structure renderable by an enterprise messaging application. In some cases, aspects of the semantic object may not be renderable by a particular receiving application and may be removed or altered, e.g., an interactive control provided as a button object may be reformatted as a link or as a check box, etc. Additionally, the semantic object may be translated into different representations readable by different messaging applications, e.g., a first representation readable by a third party email messaging application and a second representation readable by an enterprise messaging application.

In further aspects, links established between the interactive control and one or more operations may also be translated into a representation readable by a receiving application. In some aspects, the receiving application may not have access to the one or more operations. For example, a receiving application may not have access to an external services application or a particular calendaring application. In this case, the link established between the interactive control and the external services application, for example, may be converted to a control for enabling an action (e.g., selecting “approve” or “reject”) and a response may be routed to the unified messaging application for interaction with the external services application. By way of example, the interactive control may be linked to the external services application for approving a purchase order. In this case, a user's approval for the purchase order may be received by the unified messaging application, which may in turn access the external services application and perform the action of approving the purchase order.
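
The purchase-order routing described above can be sketched as follows: the receiving application cannot reach the external services application directly, so the control's response is routed back to the unified messaging application, which performs the action on the user's behalf. All class and method names are illustrative assumptions.

```python
class ExternalService:
    """Stand-in for an external services application."""

    def __init__(self):
        self.approved = []

    def approve_purchase_order(self, po_id):
        self.approved.append(po_id)

class UnifiedMessagingApp:
    def __init__(self, service):
        self.service = service

    def handle_response(self, action, po_id):
        # The response ("approve"/"reject") arrives from the converted
        # control; the unified messaging app accesses the external
        # services application and performs the action.
        if action == "approve":
            self.service.approve_purchase_order(po_id)

service = ExternalService()
uma = UnifiedMessagingApp(service)
uma.handle_response("approve", "PO-1234")
```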

Transform operation 412 may consider additional factors, such as device type and display type, when transforming the semantic object. For instance, transform operation 412 may include reformatting the semantic object such that it is renderable by a particular device type having a particular display type. For example, while a personal computer may be capable of rendering the semantic object, a mobile device may not have such capability. In this case, the semantic object may be transformed into an electronic image (e.g., .jpeg) for rendering by the mobile device and the interactive control may be transformed into a link or other form renderable by the mobile device.

At share operation 414, the semantic object is shared with one or more recipient endpoints. In some aspects, e.g., when the one or more recipient endpoints are registered with the unified messaging application, the semantic object may not require transformation. That is, as described above, the semantic object may be embedded in a message and presented within a conversation tab, an activity tab, and/or a lists tab in a center pane of a user interface of the unified messaging platform. Moreover, the semantic object may be represented as a single synchronized version such that responses and/or updates received by the semantic object in any view (e.g., tab) are automatically synchronized across all tabs as a single version of the semantic object.

Alternatively, when a recipient endpoint is not registered with the unified messaging application, share operation 414 may share the semantic object with a receiving application outside of the unified messaging platform, such as a third party email messaging application or an enterprise messaging application. The receiving application may then render or present a copy of the semantic object to a user at the recipient endpoint. In this case, when a recipient endpoint is not registered with the unified messaging application, while the semantic object may include an interactive control for responding to or updating the semantic object and while the user may benefit from receiving a copy of the semantic object, the user may not be able to interact with a single synchronized version of the semantic object but with a copy. Moreover, the receiving application may not have access to one or more operations linked to the interactive control, resulting in responses and/or updates being routed through the unified messaging application or otherwise.

As should be appreciated, operations 402-414 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.

FIG. 5 illustrates an exemplary method for embedding a semantic object into a message, according to an example embodiment.

Method 500 begins with receive operation 502, where a message is received by a unified messaging application. The message may be of any type or format, including an email message, SMS message, instant message, and the like. In aspects, the message may be received as input by an “accessing user” at an endpoint registered with the unified messaging application. For example, the accessing user may input content into a new message input field, an email interface object, a message reply input field, etc., associated with the unified messaging application.

At create operation 504, a semantic object may be created in the unified messaging application. For instance, an interface for creating the semantic object may be displayed in response to selection of a control within an email interface object, within a lists tab of the unified messaging platform, or otherwise. The interface may allow for entering text (e.g., entering a topic, entering a meeting, etc.) and may provide drop down menus, etc., for selecting interactive controls (e.g., for performing actions), selecting team members or other recipients, selecting external services, and the like. In further aspects, an “interactive control” may be created within the semantic object as a click button, hyperlink, drop down list box, option button, check box, date picker, or other active object for performing an action. In some aspects, the interactive control may enable performance of an action (e.g., checking a box “yes” or “no” in response to a question posed by the interactive semantic object, selecting an option presented by the interactive semantic object using a click button, and the like). Alternatively, the control may be linked to an operation for performing the action (e.g., an adder/subtractor or other function, an external services application, and the like), as described above. In at least some aspects, when the semantic object is created, the semantic object is stored in a conversation tab, an activity tab and/or a lists tab of the unified messaging platform.

In some aspects, one or more parameters may be associated with a semantic object. For instance, a semantic object may be associated with a duration period, a completion date, an end time, etc. That is, in some cases, a semantic object may be held open for a certain period of time or “duration period.” For instance, if a semantic object is created to vote on a destination for lunch, the semantic object may be associated with an end time of 11:30 am. In another example, if the semantic object is created to vote on a name for a new product line, the semantic object may be associated with a suitable completion date or an end time prior to a launch date for the product line. After the duration period, completion date, and/or end time, the semantic object may no longer be displayed, may no longer accept updates, may indicate that results are final, etc.

At embed operation 506, the semantic object may be embedded in the message. Embedding the semantic object may be implemented by any suitable means. For instance, the semantic object may be pasted into the message and, in particular, may be embedded into a message body of the message. In further aspects, a location within the message body for embedding the semantic object may be determined by any suitable means. In some examples, a sender of the message may select a location for embedding the semantic object. In this case, for example, the sender may place a cursor in a desired location and access a control for creating and embedding a semantic object into the message. For example, in response to accessing the control (e.g., an “Insert” link), an interface may be displayed for creating and embedding the semantic object at the desired location. In at least some aspects, when the semantic object is created and embedded into the message, the semantic object is also stored in an activity tab and/or a lists tab of the unified messaging platform.
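
Embedding at a sender-selected cursor position can be sketched as a simple insertion into the message body. The placeholder markup is an assumption; a real implementation would embed a structured object rather than text.

```python
def embed_at_cursor(body, cursor, semantic_object_markup):
    # Insert the object's markup at the cursor index in the message body.
    return body[:cursor] + semantic_object_markup + body[cursor:]

body = "Team, please vote below.\n\nThanks!"
cursor = body.index("\n\nThanks!")   # sender-chosen insertion point
message = embed_at_cursor(body, cursor, "\n[SemanticObject: lunch-vote]")
```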

At identify endpoint operation 508, an endpoint registered with a receiving application may be identified. In some cases, one or more endpoints may be identified for each recipient, e.g., a personal computer, a mobile device, a tablet, a smart television, etc. Identifying an endpoint may include identifying a device type for the endpoint (e.g., mobile device, personal computer, tablet computer, etc.), a display type for the endpoint (e.g., monitor, television, touch enabled display, graphical display, alphanumeric display, etc.), applications registered with the endpoint (e.g., enterprise or third party email messaging applications, SMS messaging applications, social networking applications, instant messaging applications, voicemail applications, calendaring applications, etc.), and the like.

At decision operation 510, it is determined whether an endpoint is registered with a unified messaging application. In aspects, whereas a semantic object may have been created and embedded in a message at an endpoint registered with the unified messaging application (e.g., from an accessing user), the message may be transmitted and displayed to recipients on endpoints that are not registered with the unified messaging application. If an endpoint is registered with the unified messaging application, the method proceeds to share operation 514. Alternatively, if an endpoint is not registered with the unified messaging application, the method proceeds to transform operation 512.

At transform operation 512, the message including the semantic object (hereinafter “object embedded message”) may be transformed such that it is readable and/or renderable by one or more receiving applications registered with the one or more endpoints. In some aspects, for applications other than the unified messaging application, the object embedded message may be altered (i.e., transformed) such that it can be provided to a team member who is not registered with the unified messaging application. That is, transforming the object embedded message may involve translating the object embedded message into a representation readable by a receiving application and may also include reformatting the object embedded message such that it is renderable by a receiving application registered with the recipient endpoint. Thus, transforming the object embedded message may be described in terms of a translation process (e.g., providing the object embedded message in a language or representation readable by a consuming application) and a reformatting process (e.g., providing the object embedded message in a structure for rendering by a consuming application and/or by a particular endpoint). In some aspects, the transform operation may involve a single process that transforms the object embedded message into a language or representation readable by a receiving application, where the receiving application performs any processing necessary for rendering or presenting the object embedded message on a particular recipient endpoint.

During transformation, the object embedded message may be reformatted into different structures renderable by different receiving applications, e.g., a first structure renderable by a third party email messaging application and a second structure renderable by an enterprise messaging application. In some cases, aspects of the semantic object may not be renderable by a particular receiving application and may be altered or removed, e.g., an interactive control provided as a button object may be reformatted as a link or as a check box, etc. Additionally, the object embedded message may be translated into different representations readable by different messaging applications, e.g., a first representation readable by a third party email messaging application and a second representation readable by an enterprise messaging application.

In further aspects, links established between the interactive control and one or more operations may also be translated into a representation readable by a receiving application. In some aspects, the one or more operations may not be accessible to the receiving application. For example, a receiving application may not have access to an external services application or a particular calendaring application. In this case, the link established between the interactive control and the external services application, for example, may be converted to a control for enabling an action (e.g., selecting “approve” or “reject”) and a response may be routed to the unified messaging application for interaction with the external services application. By way of example, the interactive control may be linked to the external services application for approving a purchase order. In this case, a user's approval for the purchase order may be received by the unified messaging application, which may in turn access the external services application and perform the action of approving the purchase order.

Transform operation 512 may consider additional factors, such as device type and display type, when transforming the object embedded message. For instance, transform operation 512 may include reformatting the object embedded message such that it is renderable by a particular device type having a particular display type. For example, while a personal computer may be capable of rendering the object embedded message, a mobile device may not have such capability. In this case, the semantic object within the object embedded message may be transformed into an electronic image (e.g., .jpeg) for rendering by the mobile device and the interactive control may be transformed into a link or other form renderable by the mobile device.

At send operation 514, the object embedded message is sent to one or more recipient endpoints. In some aspects, e.g., when the one or more recipient endpoints are registered with the unified messaging application, the object embedded message may not require transformation. That is, as described above, the semantic object embedded in the message may be presented within a conversation tab, an activity tab, and/or a lists tab in a center pane of a user interface of the unified messaging platform. Moreover, the semantic object may be represented as a single synchronized version such that responses and/or updates received in the semantic object in any view (e.g., tab) are automatically synchronized across all tabs as a single version of the semantic object.

Alternatively, when a recipient endpoint is not registered with the unified messaging application, send operation 514 may send a copy of the object embedded message to a receiving application outside of the unified messaging platform, such as a third party email messaging application or an enterprise messaging application. The receiving application may then render or present the copy of the object embedded message to a user at the recipient endpoint. In this case, when a recipient endpoint is not registered with the unified messaging application, while the object embedded message may include an interactive control for responding to or updating the semantic object and while the user may benefit from receiving a copy of the semantic object, the user may not interact with a single synchronized version of the semantic object but with a copy, as described above. Moreover, the receiving application may not have access to one or more operations linked to the interactive control, resulting in responses and/or updates being routed through the unified messaging application or otherwise, as described above.

As should be appreciated, operations 502-514 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.

FIG. 6 illustrates an exemplary method for receiving an update to a semantic object, according to an example embodiment.

Method 600 begins with receive operation 602, where a semantic object is received by one or more receiving applications, such as a unified messaging application, a third party email messaging application, an enterprise messaging application, and the like.

At receive operation 604, a selection of an interactive control may be received. The interactive control may include a click button, hyperlink, drop down list box, option button, check box, date picker, or other active object for performing an action. In some aspects, the interactive control may enable performance of an action (e.g., checking a box “yes” or “no” in response to a question posed by the semantic object, selecting an option presented by the interactive semantic object using a click button, and the like). Alternatively, the control may be linked to an operation for performing the action (e.g., an adder/subtractor or other function, an external services application, and the like), as described above.

In response to receiving selection of the interactive control, at follow link operation 606, a link established between the interactive control and one or more operations may be followed. In some examples, following the link may involve authentication (e.g., authentication 112). As noted above, an operation may be any function, application, system, and the like, for performing an action. For instance, the link may be established between the interactive control and an external services application for approving an expense report, entering payroll, denying a purchase order, entering a work order, and the like. Alternatively, the link may be established between the interactive control and an adder/subtractor or other function for aggregating and/or compiling responses to a topic. In still further aspects, the link may be established between the interactive control and one or more calendar applications for scheduling a meeting.

Following the link may include accessing an external services application, accessing an adder/subtractor function, accessing a calendar application, and the like. In some aspects, following the link may open a window and/or portal into an external services application, such as a line-of-business (LOB) management service, customer relationship management (CRM) service, debugging service, accounting service, payroll service, etc. In aspects, the portal may enable a user to interact with an application within a host environment, e.g., a website of an external service hosted by a third party.
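The dispatch performed by follow link operation 606 might be sketched as a mapping from interactive controls to linked operations. The control names and operation stubs below are hypothetical placeholders, not actual platform identifiers:

```python
def follow_link(control):
    """Dispatch an interactive control to its linked operation (illustrative)."""
    operations = {
        "approve_expense": lambda: "expense report approved",
        "tally_option": lambda: "tally incremented",
        "schedule_meeting": lambda: "meeting time recorded",
    }
    if control not in operations:
        # e.g., a receiving application without access to the linked operation
        raise ValueError(f"no operation linked to control {control!r}")
    return operations[control]()
```

A real link might instead open a portal into an external services application or invoke an adder/subtractor function, as described above; the table-driven dispatch merely illustrates the control-to-operation association.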

At generate result operation 608, a result may be received in response to performing an action. In aspects, a result may comprise a response, an aggregation of responses, a selection of an option, an aggregation of selected options, a completed task, and the like. For instance, in response to selecting a button control and entering a portal for an external services application, an expense report may be approved, payroll may be entered, a purchase order may be denied, and/or a work order may be entered. Alternatively, in response to selecting an option in the semantic object (e.g., option for lunch, option for strategy, etc.), a link may be followed to an adder/subtractor function, and a tally of selected options may be generated. In further examples, in response to selecting an option of a meeting time, a link may be followed to an adder/subtractor or other function, and a tally of selected meeting times may be generated. As should be appreciated, additional or alternative actions may be performed to generate additional or alternative results.
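The adder/subtractor behavior of generate result operation 608 amounts to aggregating recipient selections into a tally. A minimal sketch, with function names assumed for illustration:

```python
from collections import Counter

def tally(selections):
    """Aggregate selected options (e.g., lunch choices or meeting times)."""
    return Counter(selections)

def most_frequent(selections):
    """Identify the most frequently selected option, e.g., a meeting time."""
    return tally(selections).most_common(1)[0][0]
```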

At update operation 610, the semantic object may be updated based on a result of performing an action. For instance, in response to approval of an expense report, entry of payroll, denial of a purchase order, and/or entry of a work order, an update to the semantic object may include completion of a task, e.g., “expense report approved,” “payroll complete,” “purchase order denied,” and the like. Alternatively, in response to generating a tally of selected options, the semantic object may be updated with the tally of selected options (e.g., in an update field). In further examples, in response to generating a tally of most frequently selected meeting times, an update may include automatically scheduling a meeting and updating the semantic object with the scheduled meeting. As described above, the semantic object may be stored in a conversation tab, an activity tab, a lists tab, etc. As further described above, a single version of the semantic object may be provided. Accordingly, in aspects, the update may be synchronized with the semantic object in all views (e.g., all tabs) of the unified messaging platform.
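The single-version synchronization described for update operation 610 can be illustrated by having every view reference the same object instance, so an update applied in one view is observed in all views. The class below is a hypothetical stand-in:

```python
class SemanticObject:
    """Minimal stand-in; only the shared-state behavior is illustrated."""
    def __init__(self, topic):
        self.topic = topic
        self.status = None

obj = SemanticObject("expense report")
# All tabs reference the same instance, i.e., a single version of the object.
views = {"conversation": obj, "activity": obj, "lists": obj}
views["activity"].status = "expense report approved"
# The update is immediately visible in every view without copying.
```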

At decision operation 612, it is determined whether a receiving endpoint is registered with a unified messaging application. In aspects, although a semantic object may have been updated at an endpoint registered with the unified messaging application (e.g., by an accessing user), the updated semantic object may be transmitted to and displayed at recipient endpoints that are not registered with the unified messaging application. If an endpoint is registered with the unified messaging application, the method proceeds to share operation 616. Alternatively, if an endpoint is not registered with the unified messaging application, the method proceeds to transform operation 614.

At transform operation 614, the updated semantic object may be transformed such that it is readable and/or renderable by one or more receiving applications registered with the one or more endpoints. In some aspects, for applications other than the unified messaging application, the updated semantic object may be altered (i.e., transformed) such that it can be provided to a team member who is not registered with the unified messaging application. That is, transforming the updated semantic object may involve translating the updated semantic object into a representation readable by a receiving application and may also include reformatting the updated semantic object such that it is renderable by a receiving application registered with the recipient endpoint, as described above.

In some aspects, rather than transforming the updated semantic object, transform operation 614 may merely transform the update. In this case, the update may be provided in a communication readable by the receiving application, e.g., “expense report approved,” “payroll complete,” “purchase order denied,” and the like. Alternatively, a tally of selected options or a tally of most frequently selected meeting times may be provided in a communication readable by the receiving application.
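Transforming only the update, as in this variant of transform operation 614, might look like rendering the update as plain text a receiving application can display. A sketch under the assumption that an update is either a status string or a tally:

```python
def transform_update(update):
    """Render an update as a plain communication for a non-registered endpoint."""
    if isinstance(update, str):
        return update  # e.g., "payroll complete" or "purchase order denied"
    if isinstance(update, dict):
        # A tally of selected options becomes a readable summary line.
        return ", ".join(f"{option}: {count}" for option, count in update.items())
    raise TypeError("unsupported update type")
```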

At share operation 616, the updated semantic object is shared with one or more recipient endpoints. In some aspects, e.g., when the one or more recipient endpoints are registered with the unified messaging application, the updated semantic object may not require transformation. That is, as described above, the updated semantic object may be embedded in a message and presented within a conversation tab, an activity tab, and/or a lists tab in a center pane of a user interface of the unified messaging platform. Moreover, the updated semantic object may be represented as a single synchronized version across all tabs.

Alternatively, when a recipient endpoint is not registered with the unified messaging application, share operation 616 may share a copy of the updated semantic object with a receiving application outside of the unified messaging platform, such as a third party email messaging application or an enterprise messaging application. The receiving application may then render or present the copy of the updated semantic object to a user at the recipient endpoint. In some cases, rather than sharing the updated semantic object, share operation 616 may merely share a copy of the update, as described above.

As should be appreciated, operations 602-616 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.

FIG. 7A illustrates an exemplary semantic object embedded in a message, according to a first example embodiment.

The unified messaging platform may provide an interface 700. As illustrated, the interface 700 includes a semantic object embedded in a message. The message may include textual content 702 and semantic object 704. Semantic object 704 includes a number of interactive controls 706 for providing access to an external services application 708. For example, in aspects, in response to selecting acknowledge control 706a, approve/decline control 706b, accept/reject control 706c, or custom control 706d, a portal or other access to the external services application 708 may be provided for performing such actions as "acknowledge," "approve or decline," "accept or reject," and "custom" within the external services application.

FIG. 7B illustrates an exemplary semantic object embedded in a message, according to a second example embodiment.

As detailed above, the unified messaging platform may provide an interface 700 that includes a semantic object 712 embedded in a message with textual content 710. Semantic object 712 includes a number of interactive controls 714 for selecting options for lunch, including a first control 714a (“Din Tai Fung”), a second control 714b (“PF Changs”), a third control 714c (“Cheesecake Factory”), and a fourth control 714d (“Tap House”). In response to receiving a selection of send “To All” button 716, the message with the embedded semantic object 712 may be sent to all recipients (not shown).

FIG. 7C illustrates an exemplary semantic object embedded in a message, according to a third example embodiment.

As detailed above, the unified messaging platform may provide an interface 700 that includes a semantic object 720 embedded in a message with textual content 718. Semantic object 720 includes a number of controls 722 for selecting times for a meeting, including a first control 722a (“12:30-1:00 PM”), a second control 722b (“1:00-1:30 PM”), a third control 722c (“2:30-3:00 PM”), and a fourth control 722d (“3:30-4:00 PM”). In response to receiving a selection of send “To: Megha” button 724, the message with the embedded semantic object 720 may be sent to a recipient “Megha.”

FIG. 7D illustrates an exemplary updated semantic object, according to an example embodiment.

As detailed above, the unified messaging platform may provide an interface 700 that includes an updated semantic object 728 embedded in a message including textual content 726. As illustrated, textual content 726 corresponds to textual content 710 (FIG. 7B). Updated semantic object 728 (e.g., a tally interface object) includes a tally of the lunch options selected in semantic object 712 (see FIG. 7B). For instance, as illustrated, updated semantic object 728 displays a tally of four selections 732a for a first lunch option 730a (“Din Tai Fung”) (corresponding to first control 714a) and a tally of two selections 732b for a fourth lunch option 730b (“Tap House”) (corresponding to fourth control 714d).

In some aspects, the semantic object may be created such that updates to the semantic object are hidden from recipients (e.g., until all results or responses are received, etc.). In this case, the creator of the semantic object may select a control (e.g., control 734) for hiding updates to the semantic object. In other aspects, updated semantic object 728 may be synchronized such that as selections are received, an updated tally of selections may be displayed to all recipients as a single version of the updated semantic object 728 in all views (e.g., conversation tab, activity tab, lists tab, etc.) across all endpoints. Where an endpoint is not registered with the unified messaging application, a copy of the updated semantic object 728 may be provided in a message to the endpoint.

As should be appreciated, the various features and functionalities of user interface 700 described with respect to FIGS. 7A-7D are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 8 illustrates an exemplary mobile interface for creating a semantic object, according to an example embodiment.

A version of the unified messaging platform may provide a mobile interface 800 for creating a semantic object. In an example, a first user (e.g., Manny Powell) may be communicating with a second user (e.g., Megan Thompson) regarding a topic (e.g., in a conversation pane of the mobile interface (not shown)) and may wish to set up a meeting to discuss the topic further. Upon selection of a control associated with Megan Thompson (e.g., a user icon, a user identifier, etc.), the mobile interface 800 may display information regarding Megan Thompson for arranging the meeting (e.g., group, email alias, phone number, common availability, etc.).

More specifically, a sending user 802 (e.g., first user “Manny Powell”) may wish to create a semantic object for selecting a meeting time with a recipient user 804 (e.g., second user “Megan Thompson”). In aspects, mobile interface 800 may display common availability 806 between the sending user 802 and the recipient user 804. However, the sending user 802 may wish to provide the recipient user 804 with optional times for the meeting.

In response to selecting control 808, a semantic object (not shown) may be generated. In some cases, the semantic object may be embedded in a message to the recipient user 804 and may include automatically generated textual content such as “Please select a meeting time.” Alternatively, the sending user 802 may manually enter textual content. The semantic object may further include a number of interactive controls for selecting a meeting time, e.g., a first control “12:30-1:00 PM,” a second control “1:00-1:30 PM,” a third control “2:30-3:00 PM,” a fourth control “3:00-3:30 PM,” and a fifth control “4:00-4:30 PM.” In aspects, the interactive controls may be generated based on time periods 810 associated with common availability 806.
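Generating interactive time controls from the time periods 810 of common availability 806 can be sketched as follows; the function name and the (start, end) slot representation are assumptions for illustration:

```python
def controls_from_availability(slots):
    """Each (start, end) period of common availability yields one control label."""
    return [f"{start}-{end}" for start, end in slots]
```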

In response to the recipient user 804 selecting one of the interactive controls within the semantic object, an update (e.g., selection) may automatically be sent to the sending user 802 and/or displayed in an updated semantic object (not shown). In some cases, e.g., where the recipient user 804 is the only recipient and consensus from multiple recipients is not required, such a selection may automatically schedule a meeting at the selected time between the sending user 802 and the recipient user 804. In other cases, e.g., for more than one recipient, a tally of selected times may be generated and displayed in the semantic object (or a tally interface object). In some cases, a most frequently selected time may be identified, and a meeting may automatically be scheduled at the most frequently selected time between the sending user and the recipient users.
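The single-recipient versus multi-recipient scheduling behavior described above can be sketched as one function; the name `schedule_meeting` and the `{recipient: selected time}` mapping are illustrative assumptions:

```python
from collections import Counter

def schedule_meeting(selections_by_recipient):
    """Map {recipient: selected time} to the time at which the meeting is set."""
    times = list(selections_by_recipient.values())
    if len(times) == 1:
        # Single recipient: no consensus needed, schedule at the selected time.
        return times[0]
    # Multiple recipients: tally selections and take the most frequent time.
    return Counter(times).most_common(1)[0][0]
```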

As should be appreciated, the various features and functionalities of mobile interface 800 described with respect to FIG. 8 are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 9 illustrates an exemplary interface for displaying an object embedded message, according to an example embodiment.

The unified messaging platform may provide a user interface 900 including three panes, e.g., a left rail 902, a center pane 904, and a right rail 906. In aspects, as described above, a category 908 may be selected in the left rail 902, and the conversation related to the category 908 may be displayed in a conversations tab 910 in center pane 904.

As illustrated by FIG. 9, a communication 912 was received from sending user 914. The communication 912 includes textual content 916 and an embedded semantic object 918. The embedded semantic object 918 includes a number of controls 922 associated with a file 920. For example, in aspects, in response to selecting approve control 922a or reject control 922b, the file 920 may be approved or rejected, respectively. Alternatively, in response to selecting approve control 922a or reject control 922b, the file 920 may be opened for performing actions such as “approve” or “reject.”

As should be appreciated, the various features and functionalities of user interface 900 described with respect to FIG. 9 are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIG. 10 illustrates an exemplary interface for providing a portal to an external application, according to an example embodiment.

As described above, the unified messaging platform may provide a user interface 1000 including three panes, e.g., a left rail 1002, a center pane 1004, and a right rail 1006. In aspects, as described above, in response to selecting a category (e.g., category 1008) in the left rail 1002, the category 1008 may be displayed in center pane 1004.

As illustrated by FIG. 10, a conversation regarding category 1008 is displayed in center pane 1004 in conversations tab 1010. As illustrated, a communication 1012 was received from Rachel Morrison that includes a portal 1014 providing access to an external services application 1016 (e.g., debugging services). In aspects, the portal 1014 may enable a user to interact with the external services application 1016 executing within a host environment, e.g., a website of a debugging service hosted by a third party.

In aspects, in response to performing an action in the external services application, a result and/or update may be linked to a semantic object and displayed in a communication (e.g., communication 1012) within the conversation. For instance, an update may be displayed (not shown), such as “Rachel Morrison completed debugging Launch API,” and the like.

As should be appreciated, the various features and functionalities of user interface 1000 described with respect to FIG. 10 are not intended to limit associated systems and methods to the particular features and functionalities described. Accordingly, additional features and functionalities may be associated with the systems and methods described herein and/or some features and functionalities described may be excluded without departing from the systems and methods described herein.

FIGS. 11-14 and the associated descriptions provide a discussion of a variety of operating environments in which aspects of the disclosure may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 11-14 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing aspects of the disclosure, described herein.

FIG. 11 is a block diagram illustrating physical components (e.g., hardware) of a computing device 1100 with which aspects of the disclosure may be practiced. The computing device components described below may have computer executable instructions for implementing a unified messaging application on a server computing device 106 (or server computing device 308), including computer executable instructions for unified messaging application 1120 that can be executed to employ the methods disclosed herein. In a basic configuration, the computing device 1100 may include at least one processing unit 1102 and a system memory 1104. Depending on the configuration and type of computing device, the system memory 1104 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 1104 may include an operating system 1105 and one or more program modules 1106 suitable for running unified messaging application 1120, such as one or more components described with respect to FIG. 3 and, in particular, create component 1111, link component 1113, transform component 1115, or synchronize component 1117. The operating system 1105, for example, may be suitable for controlling the operation of the computing device 1100. Furthermore, embodiments of the disclosure may be practiced in conjunction with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system. This basic configuration is illustrated in FIG. 11 by those components within a dashed line 1108. The computing device 1100 may have additional features or functionality. For example, the computing device 1100 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 11 by a removable storage device 1109 and a non-removable storage device 1110.

As stated above, a number of program modules and data files may be stored in the system memory 1104. While executing on the processing unit 1102, the program modules 1106 (e.g., unified messaging application 1120) may perform processes including, but not limited to, the aspects, as described herein. Other program modules that may be used in accordance with aspects of the present disclosure, and in particular for providing a unified messaging platform, may include create component 1111, link component 1113, transform component 1115, or synchronize component 1117, etc.

Furthermore, embodiments of the disclosure may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the disclosure may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 11 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality, described herein, with respect to the capability of a client to switch protocols may be operated via application-specific logic integrated with other components of the computing device 1100 on the single integrated circuit (chip). Embodiments of the disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the disclosure may be practiced within a general purpose computer or in any other circuits or systems.

The computing device 1100 may also have one or more input device(s) 1112 such as a keyboard, a mouse, a pen, a sound or voice input device, a touch or swipe input device, etc. The output device(s) 1114 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used. The computing device 1100 may include one or more communication connections 1116 allowing communications with other computing devices 1150. Examples of suitable communication connections 1116 include, but are not limited to, radio frequency (RF) transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.

The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 1104, the removable storage device 1109, and the non-removable storage device 1110 are all computer storage media examples (e.g., memory storage). Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 1100. Any such computer storage media may be part of the computing device 1100. Computer storage media does not include a carrier wave or other propagated or modulated data signal.

Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.

FIGS. 12A and 12B illustrate a mobile computing device 1200, for example, a mobile telephone, a smart phone, wearable computer (such as a smart watch), a tablet computer, a laptop computer, and the like, with which embodiments of the disclosure may be practiced. In some aspects, the client may be a mobile computing device. With reference to FIG. 12A, one aspect of a mobile computing device 1200 for implementing the aspects is illustrated. In a basic configuration, the mobile computing device 1200 is a handheld computer having both input elements and output elements. The mobile computing device 1200 typically includes a display 1205 and one or more input buttons 1210 that allow the user to enter information into the mobile computing device 1200. The display 1205 of the mobile computing device 1200 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 1215 allows further user input. The side input element 1215 may be a rotary switch, a button, or any other type of manual input element. In alternative aspects, mobile computing device 1200 may incorporate more or fewer input elements. For example, the display 1205 may not be a touch screen in some embodiments. In yet another alternative embodiment, the mobile computing device 1200 is a portable phone system, such as a cellular phone. The mobile computing device 1200 may also include an optional keypad 1235. Optional keypad 1235 may be a physical keypad or a “soft” keypad generated on the touch screen display. In various embodiments, the output elements include the display 1205 for showing a graphical user interface (GUI), a visual indicator 1220 (e.g., a light emitting diode), and/or an audio transducer 1225 (e.g., a speaker). In some aspects, the mobile computing device 1200 incorporates a vibration transducer for providing the user with tactile feedback. In yet another aspect, the mobile computing device 1200 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., a HDMI port) for sending signals to or receiving signals from an external device.

FIG. 12B is a block diagram illustrating the architecture of one aspect of a mobile computing device. That is, the mobile computing device 1200 can incorporate a system (e.g., an architecture) 1202 to implement some aspects. In one embodiment, the system 1202 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some aspects, the system 1202 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.

One or more application programs 1266 may be loaded into the memory 1262 and run on or in association with the operating system 1264. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 1202 also includes a non-volatile storage area 1268 within the memory 1262. The non-volatile storage area 1268 may be used to store persistent information that should not be lost if the system 1202 is powered down. The application programs 1266 may use and store information in the non-volatile storage area 1268, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 1202 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 1268 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 1262 and run on the mobile computing device 1200, including the instructions for providing a unified messaging platform as described herein (e.g., search engine, extractor module, relevancy ranking module, answer scoring module, etc.).

The system 1202 has a power supply 1270, which may be implemented as one or more batteries. The power supply 1270 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.

The system 1202 may also include a radio interface layer 1272 that performs the function of transmitting and receiving radio frequency communications. The radio interface layer 1272 facilitates wireless connectivity between the system 1202 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 1272 are conducted under control of the operating system 1264. In other words, communications received by the radio interface layer 1272 may be disseminated to the application programs 1266 via the operating system 1264, and vice versa.

The visual indicator 1220 may be used to provide visual notifications, and/or an audio interface 1274 may be used for producing audible notifications via the audio transducer 1225. In the illustrated embodiment, the visual indicator 1220 is a light emitting diode (LED) and the audio transducer 1225 is a speaker. These devices may be directly coupled to the power supply 1270 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 1260 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 1274 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 1225, the audio interface 1274 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present disclosure, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 1202 may further include a video interface 1276 that enables an operation of an on-board camera 1230 to record still images, video stream, and the like.

A mobile computing device 1200 implementing the system 1202 may have additional features or functionality. For example, the mobile computing device 1200 may also include additional data storage devices (removable and/or non-removable) such as magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 12B by the non-volatile storage area 1268.

Data/information generated or captured by the mobile computing device 1200 and stored via the system 1202 may be stored locally on the mobile computing device 1200, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio interface layer 1272 or via a wired connection between the mobile computing device 1200 and a separate computing device associated with the mobile computing device 1200, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 1200 via the radio interface layer 1272 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.

FIG. 13 illustrates one aspect of the architecture of a system for processing data received at a computing system from a remote source, such as a personal computer 1304, tablet computing device 1306, or mobile computing device 1308, as described above. Content displayed at server device 1302 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 1322, a web portal 1324, a mailbox service 1326, an instant messaging store 1328, or a social networking site 1330. The unified messaging application 1320 may be employed by a client that communicates with server device 1302, and/or the unified messaging application 1320 may be employed by server device 1302. The server device 1302 may provide data to and from a client computing device such as a personal computer 1304, a tablet computing device 1306 and/or a mobile computing device 1308 (e.g., a smart phone) through a network 1315. By way of example, the computer system described above with respect to FIGS. 1-12 may be embodied in a personal computer 1304, a tablet computing device 1306 and/or a mobile computing device 1308 (e.g., a smart phone). Any of these embodiments of the computing devices may obtain content from the store 1316, in addition to receiving graphical data useable to be either pre-processed at a graphic-originating system, or post-processed at a receiving computing system.

FIG. 14 illustrates an exemplary tablet computing device 1400 that may execute one or more aspects disclosed herein. In addition, the aspects and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which user interfaces and information of various types are projected. Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like.

Aspects of the present disclosure, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to aspects of the disclosure. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

The description and illustration of one or more aspects provided in this application are not intended to limit or restrict the scope of the disclosure as claimed in any way. The aspects, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed disclosure. The claimed disclosure should not be construed as being limited to any aspect, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate aspects falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claimed disclosure.
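To make the claimed flow concrete, the following is a minimal, purely illustrative sketch (not part of the patent text, and all class, method, and option names here are hypothetical) of an interactive semantic object: a selection of its interactive control performs an action, generates a result (here, a tally of selections), updates the object, and propagates that update so every endpoint sees a single synchronized version.

```python
# Hypothetical sketch of an interactive semantic object (e.g., a poll).
# A selection triggers an action, the result updates the object, and the
# updated object is re-rendered at every registered endpoint so that all
# endpoints share a single synchronized version.

class SemanticObject:
    """A shared object whose state is synchronized across endpoints."""

    def __init__(self, title, options):
        self.title = title
        self.tally = {option: 0 for option in options}  # result: tally of selections
        self.endpoints = []  # registered views of the single synchronized version

    def register_endpoint(self, endpoint):
        self.endpoints.append(endpoint)
        endpoint.render(self)

    def select(self, option):
        """Interactive control: perform the action, generate a result,
        update the object, and push the update to every endpoint."""
        if option not in self.tally:
            raise ValueError(f"unknown option: {option}")
        self.tally[option] += 1          # action produces a result
        for endpoint in self.endpoints:  # synchronize the single version
            endpoint.render(self)


class Endpoint:
    """A view of the semantic object at one client."""

    def __init__(self, name):
        self.name = name
        self.view = None

    def render(self, obj):
        # An endpoint not registered with the unified messaging application
        # might instead receive a transformed, read-only format (plain text,
        # for instance), echoing the idea in claim 5.
        self.view = dict(obj.tally)


poll = SemanticObject("Lunch?", ["pizza", "salad"])
alice, bob = Endpoint("alice"), Endpoint("bob")
poll.register_endpoint(alice)
poll.register_endpoint(bob)
poll.select("pizza")  # both endpoints now show the same updated tally
```

In this sketch, pushing the update through a shared object rather than copying it into each message is what keeps every endpoint on one version, mirroring the "single synchronized version" language of claims 6, 7, 15, and 16.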

Claims

1. A system comprising:

at least one processing unit; and
at least one memory storing computer executable instructions that, when executed by the at least one processing unit, cause the system to perform a method, the method comprising:
receiving a semantic object including an interactive control;
receiving a selection of the interactive control;
performing an action associated with the interactive control;
in response to performing the action, generating a result; and
updating the semantic object with the result.

2. The system of claim 1, wherein the semantic object is embedded in a message.

3. The system of claim 1, wherein the interactive control is associated with a link to an operation, and wherein the action is performed by following the link to the operation.

4. The system of claim 1, further comprising:

sending the updated semantic object to one or more endpoints.

5. The system of claim 4, further comprising:

determining that at least one of the one or more endpoints is not registered with a unified messaging application; and
transforming the updated semantic object into a format that is readable at the at least one endpoint.

6. The system of claim 1, further comprising:

updating the semantic object with the result across a plurality of access points within a user interface associated with a unified messaging application, wherein the updated semantic object is provided as a single synchronized version.

7. The system of claim 6, wherein the single synchronized version of the updated semantic object is viewable at a plurality of endpoints.

8. The system of claim 1, wherein the result comprises one or more of:

a tally of selections;
an update to data;
a selection of an option; and
a completed task.

9. A system comprising:

at least one processing unit; and
at least one memory storing computer executable instructions that, when executed by the at least one processing unit, cause the system to perform a method, the method comprising:
creating a semantic object including an interactive control within a user interface of a unified messaging application;
receiving a result of an action associated with the interactive control; and
updating the semantic object with the result.

10. The system of claim 9, further comprising:

embedding the semantic object into a message.

11. The system of claim 9, wherein the semantic object includes at least one parameter, comprising at least one of:

a duration period;
a completion date; and
an end time.

12. The system of claim 10, wherein the message embedded with the semantic object is sent to one or more endpoints.

13. The system of claim 11, further comprising:

determining that at least one of the one or more endpoints is not registered with the unified messaging application; and
transforming the semantic object into a format that is readable at the at least one endpoint.

14. The system of claim 9, wherein the updated semantic object is sent to one or more endpoints.

15. The system of claim 9, further comprising:

updating the semantic object with the result across a plurality of access points within the unified messaging application, wherein the updated semantic object is provided as a single synchronized version.

16. The system of claim 15, wherein the single synchronized version of the updated semantic object is viewable at a plurality of endpoints.

17. A method of creating a semantic object including an interactive control, the method comprising:

creating the semantic object including the interactive control within an interface of a unified messaging application;
receiving a selection of the interactive control, wherein the interactive control is linked to an operation;
receiving a result of the operation; and
updating the semantic object with the result.

18. The method of claim 17, further comprising:

embedding the semantic object into a message.

19. The method of claim 17, further comprising:

updating the semantic object with the result across a plurality of access points within the unified messaging application, wherein the updated semantic object is provided as a single synchronized version.

20. The method of claim 19, wherein the single synchronized version of the updated semantic object is viewable at a plurality of endpoints.

Patent History

Publication number: 20160344677
Type: Application
Filed: Jul 14, 2015
Publication Date: Nov 24, 2016
Applicant: MICROSOFT TECHNOLOGY LICENSING, LLC (Redmond, WA)
Inventors: Brian MacDonald (Bellevue, WA), Mira Lane (Bellevue, WA), Larry Waldman (Seattle, WA), Chad Voss (Seattle, WA), Diego Baca Del Rosario (Seattle, WA), Andrew Spiziri (Seattle, WA), William J. Bliss (Los Angeles, CA)
Application Number: 14/798,905

Classifications

International Classification: H04L 12/58 (20060101);