NOTIFICATIONS IN MULTI APPLICATION USER INTERFACES

In one general aspect, a method and system are described for generating notifications in a user interface. The method may include detecting an availability of at least one notification available for display in the user interface, generating a container for the at least one notification, generating, for the container, additional selectable actions and appending the additional selectable actions to the at least one selectable action, determining which display device type, of a plurality of display device types, is being used to access the user interface, and generating, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions, the container being arranged for display according to the display device type.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to and the benefit of U.S. Provisional Application No. 62/335,888, filed May 13, 2016, U.S. Provisional Application No. 62/335,892, filed May 13, 2016, U.S. Provisional Application No. 62/335,895, filed May 13, 2016, U.S. Provisional Application No. 62/335,897, filed May 13, 2016, U.S. Provisional Application No. 62/335,899, filed May 13, 2016, U.S. Provisional Application No. 62/335,873, filed May 13, 2016, U.S. Provisional Application No. 62/335,875, filed May 13, 2016, U.S. Provisional Application No. 62/335,879, filed May 13, 2016, U.S. Provisional Application No. 62/335,883, filed May 13, 2016, U.S. Provisional Application No. 62/335,886, filed May 13, 2016, and U.S. Provisional Application No. 62/335,887, filed May 13, 2016, each of which provisional application is incorporated by reference in its entirety.

TECHNICAL FIELD

This description generally relates to user interfaces and user experiences. The description, in particular, relates to systems and techniques for providing a user experience for accessing and viewing data and information related to multiple software applications on a computing device.

BACKGROUND

Users may utilize or interact with multiple software applications at the same time. The multiple applications may be hosted on the same or different types of computer platforms or systems and accessed from the users' client devices. In example implementations, the different types of computer platforms or systems may include, for example, SAP HANA, SAP ABAP, or other enterprise-type computer platforms or systems.

In example implementations, the suite of the multiple applications which an enterprise may deploy (and which users may need to use for their work) may be large. A sample of the large number of applications that may be deployed by an enterprise for its operations may, for example, include applications in the areas or domains of Finance, R&D, Engineering, Human Resources, Manufacturing, etc. Different subsets of these applications may be used in the work of enterprise personnel, who, for example, may have a variety of different roles. Each user may have a need to use a different respective subset of the multiple applications, based, for example, on the user's role in the enterprise.

Consideration is now given to a notification service for generating and providing a display of content and notifications in an expandable user interface.

SUMMARY

A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a computer-implemented method for generating notifications in a user interface. The method may include detecting, with a processor, an availability of at least one notification available for display in the user interface, generating, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action, generating, with the processor and for the container, additional selectable actions and appending the additional selectable actions to the at least one selectable action, the additional selectable actions being generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface, determining, with the processor, a display device type in which the user interface is being accessed, and generating, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions. The container may be arranged for display according to the display device type. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.

Implementations may include one or more of the following features. The computer-implemented method further including displaying, in the user interface in a display location, the container, the display location determined based on the display device and a role associated with the at least one user. The computer-implemented method where the display location is predefined for the context determined to be associated with the at least one notification. The computer-implemented method further including, in response to detecting one or more additional notifications, generating a container for each of the one or more additional notifications, and generating, for display in the user interface, the container for each of the one or more additional notifications, each container depicting a plurality of selectable actions, where each container is arranged for display according to a display device providing the user interface. The computer-implemented method where the one or more additional notifications are provided from a plurality of source applications associated with at least one user accessing the user interface, the additional selectable actions providing access to a plurality of applications hosted outside of the user interface. The computer-implemented method further including: merging each notification received in the user interface into a list, displaying the list in a viewport, and generating a plurality of actions that enable at least one bulk operation for the notifications in the list. The computer-implemented method where the additional actions are implemented upon selection within a respective notification. The computer-implemented method where each display device type is associated with a different set of notification rules, the display device type including a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.

In another general aspect, a system for generating a user interface is described. The system may include a shell container, executing in a web browser and providing a plurality of services for generating notifications in a user interface, an application container, executing in the web browser, the application container and at least one processor being programmed to obtain at least one notification, provide, for display in a display device, the user interface depicting the at least one notification, detect, with a processor, an availability of at least one notification available for display in the user interface, generate, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action, generate, with the processor and for the container, additional selectable actions and append the additional selectable actions to the at least one selectable action. The additional selectable actions may be generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface. The system may also determine, with the processor, a display device type in which the user interface is being accessed, and generate, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions, the container being arranged for display according to the display device type.

Implementations may include one or more of the following features. The system where the at least one processor is further programmed to display, in the user interface in a display location, the container, the display location determined based on the display device and a role associated with the at least one user. The system where the at least one processor is further programmed to, in response to detecting one or more additional notifications, generate a container for each of the one or more additional notifications, and generate, for display in the user interface, the container for each of the one or more additional notifications, each container depicting a plurality of selectable actions, wherein each container is arranged for display according to a display device providing the user interface.

The system where the at least one processor is further programmed to merge each notification received in the user interface into a list, display the list in a viewport, and generate a plurality of actions that enable at least one bulk operation for the notifications in the list. The system where the additional actions are implemented upon selection within a respective notification. The system where each display device type is associated with a different set of notification rules, the display device type including a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device.

Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.

The details of one or more implementations are set forth in the accompanying drawings and the description below. Further features of the disclosed subject matter, its nature and various advantages will be more apparent from the accompanying drawings, the following detailed description, and the appended claims.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1A is a screen shot of an example personalized user interface (UI) display, in accordance with the principles of the present disclosure.

FIG. 1B is an illustration showing an example login screen displayed in a shell main container.

FIG. 1C is an illustration showing an example launchpad displayed in a shell main container.

FIG. 1D is an illustration showing an example active application screen (an overview page) displayed in a shell main container.

FIG. 1E is an illustration showing an example object page displayed in a shell main container.

FIG. 1F is an illustration showing an example footer toolbar.

FIG. 1G is an illustration showing an example me area that can be displayed in a left container.

FIG. 1H is an illustration showing an example notification area that can be displayed in a right container.

FIG. 1I is an illustration showing an example copilot user interface.

FIG. 1J is an illustration of a timeline user interface that can display timeline entries.

FIG. 2 is a diagram of an example system that can implement the user interfaces and user experiences described herein.

FIG. 3 is a diagram of an example system that can implement the launchpad for the user interfaces and user experiences described herein.

FIGS. 4A-4C illustrate screenshots depicting examples of the viewport.

FIGS. 5A-5E illustrate screenshots of example user interfaces depicting viewports.

FIGS. 6A-6D illustrate screenshots of example user interfaces depicting notification aspects.

FIG. 7 is an example data model for a notification architecture.

FIG. 8 is an example block diagram depicting an integration of notification services.

FIG. 9 is an example swim lane diagram depicting provision of messages using a notification service described herein.

FIG. 10 is an example swim lane diagram depicting action processing on notifications.

FIG. 11 is an example swim lane diagram depicting a process for incrementing a badge counter.

FIG. 12 is an example swim lane diagram depicting a process for resetting a badge counter.

FIG. 13 is an example of a notification list.

FIG. 14 is an illustration of an example process for generating and displaying notifications.

Like reference symbols in the various drawings indicate like elements.

DETAILED DESCRIPTION

The present disclosure relates to graphical user interfaces of software applications that display content, referred to herein as the “main content,” together with notifications, functions, and other information besides the main content, i.e., supplemental content. Such applications may include, among other things, standalone software programs with a built-in display module that generates notifications to be depicted in a graphical user interface (e.g., a viewport), as described in the example embodiments herein. Alternatively, display of notifications may be provided as separate functionality, e.g., as an add-on package, a plug-in or through a separate program that communicates with a main content providing program via an Application Program Interface (API). The main content providing program and/or the display program may be executed locally on a user device and/or remotely, as a Web application, for example.

Example embodiments are described in which a number of notifications are generated, configured, and sent to reach an end-user on any number of devices. For example, such notifications may be received as mobile notifications on mobile devices. The notifications provided herein can be integrated into the native notification infrastructure of a respective mobile platform (e.g., the notification center in iOS, badge support, etc.). For example, an end-user can be navigated directly from the notification to a particular Fiori application that contains further information to process the notification (also referred to as "deep linking"). If supported by the native mobile infrastructure, actions may be exposed and executed directly within the native notification center (e.g., in iOS 8).
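
By way of a non-limiting illustration, the following TypeScript sketch shows one way a deep-linking notification might be modeled; the interface name, fields, and the hash-based link format are assumptions made for this example and are not prescribed by the embodiments described herein.

    // Hypothetical shape of a notification that supports deep linking into an
    // application via a navigation intent (semantic object + action + parameters).
    interface DeepLinkNotification {
      id: string;
      title: string;
      priority: 'low' | 'medium' | 'high';
      semanticObject: string;             // e.g., "PurchaseOrder"
      action: string;                     // e.g., "approve"
      parameters: Record<string, string>;
    }

    // Build the in-app navigation target ("deep link") from the intent so that
    // tapping the notification opens the relevant screen with its context.
    function buildDeepLink(n: DeepLinkNotification): string {
      const query = new URLSearchParams(n.parameters).toString();
      return `#${n.semanticObject}-${n.action}${query ? '?' + query : ''}`;
    }

    const example: DeepLinkNotification = {
      id: 'ntf-1',
      title: 'Purchase order awaiting approval',
      priority: 'high',
      semanticObject: 'PurchaseOrder',
      action: 'approve',
      parameters: { orderId: '4711' },
    };

    console.log(buildDeepLink(example)); // "#PurchaseOrder-approve?orderId=4711"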

In another example, notifications may be provided by means of email. Email support can be integrated into each application; existing email integrations need not be replaced with a new mandatory infrastructure, but should instead be integrated. In yet another example, exposure of notifications on an end-user's desktop may be provided via a launchpad configured to launch a plurality of applications on a client computing device. Additionally, native desktop notification centers can be leveraged if supported by a particular operating system (e.g., OS X or WINDOWS 10).

The systems and methods described herein can function together to integrate a notification center. The user can access the notification center from everywhere within SAP Fiori, for example, on any device. The notifications can be extended by the notification center to include actions and operations. This provides the user an advantage of being able to perform key actions (such as approvals) directly from the notification, without having to open a separate application.

In some implementations, the systems and methods described herein can merge and display new and existing notifications in one list. Users can see the read/unread status and can order and filter the list based on their preconfigured requirements. Display of such notifications can include banners, badges, and sounds to inform users when new notifications come in. Notifications of the same type may be grouped together. For faster interaction, users can perform bulk operations on all the notifications in a stack.
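
For illustration only, the following TypeScript sketch outlines merging notifications from several sources into one ordered list, grouping items of the same type, and applying a bulk operation to a group; the type and function names are assumptions for this example.

    // Merge notifications from several sources into one list sorted by arrival time.
    interface Notification {
      id: string;
      type: string;            // e.g., "LeaveRequest", "PurchaseOrder"
      receivedAt: Date;
      read: boolean;
    }

    function mergeAndSort(...sources: Notification[][]): Notification[] {
      return sources
        .flat()
        .sort((a, b) => b.receivedAt.getTime() - a.receivedAt.getTime());
    }

    // Group notifications of the same type together ("stacks").
    function groupByType(list: Notification[]): Map<string, Notification[]> {
      const groups = new Map<string, Notification[]>();
      for (const n of list) {
        const bucket = groups.get(n.type) ?? [];
        bucket.push(n);
        groups.set(n.type, bucket);
      }
      return groups;
    }

    // A bulk operation applied to every notification in a stack, e.g., "mark all read".
    function markAllRead(stack: Notification[]): void {
      for (const n of stack) n.read = true;
    }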

In some implementations, settings can be provided to access notifications for all of a user's apps in one location. Here, the user can activate notifications, choose the delivery channel, or set the notification priority—all at a glance. The systems can provide an If This Then That dialog for setting up subscription based notifications. This enables users to receive notifications on specific KPIs or business object values that are not offered in a settings dialog. For easier handling, the system automatically helps the user to set up the notification by prefilling the dialog with content, conditions and triggers from the current screen.
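
As a minimal, non-limiting sketch of such a subscription-based ("if this then that") rule, the following TypeScript example shows a condition and trigger on a KPI of a business object; the field names, channels, and threshold values are illustrative assumptions only.

    // A hypothetical subscription rule prefilled with content, a condition,
    // and a trigger taken from the current screen.
    interface SubscriptionRule {
      businessObject: string;          // e.g., "SalesOrder"
      kpi: string;                     // e.g., "openAmount"
      condition: '>' | '<' | '==';
      threshold: number;
      channel: 'in-app' | 'email' | 'mobile-push';
      priority: 'low' | 'medium' | 'high';
    }

    // "If this" part: does the current KPI value satisfy the rule's condition?
    function matches(rule: SubscriptionRule, currentValue: number): boolean {
      if (rule.condition === '>') return currentValue > rule.threshold;
      if (rule.condition === '<') return currentValue < rule.threshold;
      return currentValue === rule.threshold;
    }

    // "Then that" part: notify on the chosen channel when the rule matches.
    const rule: SubscriptionRule = {
      businessObject: 'SalesOrder',
      kpi: 'openAmount',
      condition: '>',
      threshold: 100000,
      channel: 'mobile-push',
      priority: 'high',
    };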

In general, notifications can originate from a variety of providers, out of different systems, both cloud and on-premise, and out of different technology stacks. Therefore, notifications are generally integrated in a standardized way using a central notification service for notification aggregation. When defining the corresponding contracts and APIs, open standards may be used to allow third-party solutions to integrate their notifications into the harmonized Fiori user experience. From the end-user perspective, notifications shall always be perceived as being pushed to the user. This also holds when utilizing the Fiori Launchpad (i.e., there is no refresh button to manually pull new notifications). Native mobile push notifications and email can also be used to handle notifications.
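
For illustration, the following TypeScript sketch shows one possible shape of a central aggregation step over heterogeneous notification providers; the interfaces below are assumptions for this example and do not represent a published contract or API.

    // A provider-neutral contract that each notification source could implement.
    interface ProviderNotification {
      id: string;
      title: string;
      sourceProvider: string;
    }

    interface NotificationProvider {
      readonly name: string;
      fetchForUser(userId: string): Promise<ProviderNotification[]>;
    }

    class NotificationAggregator {
      constructor(private providers: NotificationProvider[]) {}

      // Collect notifications from all registered providers; a failing provider
      // should not prevent the others from being delivered to the user.
      async collect(userId: string): Promise<ProviderNotification[]> {
        const results = await Promise.allSettled(
          this.providers.map((p) => p.fetchForUser(userId)),
        );
        return results
          .filter(
            (r): r is PromiseFulfilledResult<ProviderNotification[]> =>
              r.status === 'fulfilled',
          )
          .flatMap((r) => r.value);
      }
    }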

In some implementations, notifications are provided in particular display areas in a viewport based upon a context, a user setting, or a user role. For example, different styles and display areas may be provided to help users understand the context of a notification provided in a viewport before reading the notification. To do so, the display location itself may serve as an indication of the context. For example, approval request notifications may be set to appear at a left center portion of a viewport in all open applications for a user. When such a notification arrives in the left center portion of the viewport, the user understands that she can ignore the notification until it is convenient to read and take action on the notification.
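
A non-limiting TypeScript sketch of such context- and role-based placement follows; the contexts, roles, and viewport areas used here are illustrative assumptions.

    // Choose where in the viewport a notification appears based on its context
    // and the user's role; unmatched notifications default to the right-hand area.
    type ViewportArea = 'left-center' | 'right' | 'center-banner';

    interface DisplayRule {
      context: string;        // e.g., "approval-request"
      role?: string;          // optional role restriction, e.g., "manager"
      area: ViewportArea;
    }

    const displayRules: DisplayRule[] = [
      { context: 'approval-request', role: 'manager', area: 'left-center' },
      { context: 'system-alert', area: 'center-banner' },
    ];

    function placeNotification(context: string, role: string): ViewportArea {
      const rule = displayRules.find(
        (r) => r.context === context && (r.role === undefined || r.role === role),
      );
      return rule?.area ?? 'right';
    }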

In one example, there are two types of notification providers: (a) providers that are enabled to proactively push new notifications, and (b) providers that support reading notifications through an API but that do not include a push notification process. Notification providers that are not push-enabled can still be integrated into consumption channels that rely on an end-user session, such as the Fiori shell. In this case, the notifications can be pulled and subsequently short-polled for that given user.
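
For illustration only, the following TypeScript sketch shows a short-polling fallback for a provider that supports reading notifications through an API but cannot push; the interface and the polling interval are assumptions for this example.

    // A read-only provider is pulled once and then short-polled for the
    // duration of the end-user session; push-enabled providers need no polling.
    interface ReadOnlyProvider {
      read(userId: string): Promise<string[]>; // returns notification ids
    }

    function shortPoll(
      provider: ReadOnlyProvider,
      userId: string,
      onNew: (ids: string[]) => void,
      intervalMs = 30000,
    ): () => void {
      const seen = new Set<string>();
      const timer = setInterval(async () => {
        const ids = await provider.read(userId);
        const fresh = ids.filter((id) => !seen.has(id));
        fresh.forEach((id) => seen.add(id));
        if (fresh.length > 0) onNew(fresh);
      }, intervalMs);
      // Return a handle so polling stops when the user session ends.
      return () => clearInterval(timer);
    }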

As notifications are typically actionable, they may contain data that allows the end-user to decide on the appropriate action. As a consequence, the notifications potentially need to contain sensitive data. If notifications are exposed over insecure channels, such as email or mobile infrastructures not owned by SAP (e.g., APNS), they do not contain any sensitive data. In this case, the sensitive data is read by the receiver through an additional secured channel (e.g., from the mobile device). If that is not possible (e.g., typically for emails), the notification itself can be a trigger that lets the user navigate to a UI which contains all of the secure information.

In addition, the question of how to deal with sensitive data is also relevant when storing or caching the notification content within the notification service. Depending on the type of notifications and regulations on the customer side, the data may not be persisted.

Referring to FIG. 1A, an example display of a viewport 100 with a launchpad 101 is shown, in accordance with the principles of the present disclosure. Launchpad 101 may be included in a center container 120 (e.g., "Work") with content relevant to the user's work, domain, or role in the enterprise. A left side container 110 (e.g., "ME") with content personal to the user may pertain to content in the launchpad 101 or may be independent of content in the launchpad 101. A right side container 130 (e.g., "Notifications") may include notifications directed to the user that pertain to content in the launchpad or other content. In some implementations, these containers 110, 120, and 130 may be referred to individually herein as viewports. In some implementations, the combined areas 110, 120, and 130 (or other screens) are referred to collectively herein as a viewport. In accordance with the principles of the present disclosure, the personalized web interface may be presented as a uniquely integrated, multifaceted user interface which may, in effect, transform a single-screen view on the client computer device into three multifunctional screen areas (e.g., Left/Center/Right "viewports").

In some implementations, the viewport may function as an entry point to access software applications and associated content. The viewport may be configured to provide a single screen view that depicts three (or more) multifunctional screen areas. In one example, the three areas are displayed in parallel as a left panel, a center panel, and a right panel. The center panel may include a workspace area which can display a launchpad (e.g., home screen) or one or more active application areas (e.g., screens) that a user has launched from the launchpad (or tile/link in the launchpad). The left panel may include a Me Area that provides various generalized functionalities related to the user and operation and personalization of the environments described herein. The right panel may include a Notifications Area that displays a broad array of notification types (System Alerts, messages, reminders, tasks alerts, etc.) in a customizable listing format.
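
The three-panel arrangement can be modeled, for illustration only, as in the following TypeScript sketch; the names of the areas and of the state object are assumptions chosen to mirror the description above.

    // The three multifunctional screen areas of a viewport, with one area in focus.
    interface ViewportState {
      left: 'me-area';
      center: 'launchpad' | 'active-application';
      right: 'notifications-area';
      focused: 'left' | 'center' | 'right';
    }

    const initialState: ViewportState = {
      left: 'me-area',
      center: 'launchpad',
      right: 'notifications-area',
      focused: 'center',
    };

    // Shifting focus to an off-screen area mimics the "turn of the head"
    // described elsewhere herein.
    function focusArea(
      state: ViewportState,
      area: ViewportState['focused'],
    ): ViewportState {
      return { ...state, focused: area };
    }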

In an example embodiment, the functions and information described herein are assigned to at least one virtual extension of a viewport. That is, a portion of the display area can be displayed, while other portions are virtual extensions and only displayed as a user or algorithm scrolls to place one or more of the other portions into view. In one example, a virtual extension can include a first extension area to the left of the viewport and a second extension area to the right of the viewport. When the main content is selected, the extension area(s) are hidden from display.

In another example embodiment, the viewport is switched to display selected supplemental content by triggering a graphical icon inside a viewport. Alternatively, if the display is touch-sensitive, a viewport may be switched by a touch gesture such as a swiping motion towards or away from the corresponding extension area. A viewport may be switched back to the main content, e.g., by triggering a respective icon or using a gesture.

In another example embodiment, trigger icons indicate when new or unread information is available inside a respective extension area. The indication can be a numerical counter, a symbol, a special graphic or animation, etc. Thus, the user need not leave the current context, i.e., the main content, to be alerted to new information.
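
By way of example, a numerical counter on a trigger icon could be maintained as in the following TypeScript sketch (cf. the badge counter processes of FIGS. 11 and 12); the class and callback names are illustrative assumptions.

    // The counter increments as notifications arrive and resets when the user
    // opens the corresponding extension area.
    class BadgeCounter {
      private count = 0;

      constructor(private render: (count: number) => void) {}

      increment(by = 1): void {
        this.count += by;
        this.render(this.count);
      }

      reset(): void {
        this.count = 0;
        this.render(this.count);
      }
    }

    // Usage: update the icon label whenever the count changes.
    const badge = new BadgeCounter((c) =>
      console.log(c > 0 ? `Notifications (${c})` : 'Notifications'),
    );
    badge.increment(); // a new notification arrives
    badge.reset();     // the user opens the notification area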

In an example embodiment, the supplemental content is displayed by moving the corresponding extension area over to the viewport. The movement may be animated in the manner of a camera pan. However, other movements such as instantaneous display or fading in and out are also possible.

In yet another example embodiment, at least part of the main content remains on display in the viewport when the extension area is displayed. The main content may be shifted away from a central portion of the viewport and reduced in size (e.g., scaled down to 75% of its original size) to direct the user's attention to the supplemental content. In this display state, the main content may be partially cut off by the border of the viewport.

As shown in FIG. 1A, a Work viewport 120 is located in the center of the display screen. The Work viewport 120 may, for example, display either the launchpad 101 or an active application screen that was previously selected or opened from the launchpad tile array. The left Me viewport 110 may, for example, provide various generalized functionalities related to the user and their operation and personalization. The right Notifications viewport 130 may, for example, display one or more of a broad array of notification types (System Alerts, messages, reminders, tasks alerts, etc.) in a customizable listing format.

The launchpad or home screen in the viewport, which may be available at all times and in any application, may provide a clear screen orientation for accessing corresponding application information as well as generalized functionalities and navigations without ever disrupting a user's context of their current task. On a client computer device (e.g., a mobile device), which has a limited display screen area, a personalized UI display may be adapted to present fewer of the three multifunctional screen areas or viewports on the device's limited display screen area. For example, only the Center, Left/Center or Center/Right screen areas or viewports may be presented on a mobile device's display screen.
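
For illustration only, the following TypeScript sketch decides which of the three screen areas to present for a detected display device type; the breakpoints and the mapping are assumptions made for this example.

    // Map a screen width to a device type and choose the visible viewport areas.
    type DeviceType = 'phone' | 'tablet' | 'laptop' | 'desktop';

    function detectDeviceType(screenWidthPx: number): DeviceType {
      if (screenWidthPx < 600) return 'phone';
      if (screenWidthPx < 1024) return 'tablet';
      if (screenWidthPx < 1440) return 'laptop';
      return 'desktop';
    }

    function visibleAreas(device: DeviceType): Array<'left' | 'center' | 'right'> {
      switch (device) {
        case 'phone':
          return ['center'];                  // Center only on small screens
        case 'tablet':
          return ['center', 'right'];         // e.g., Center/Right
        default:
          return ['left', 'center', 'right']; // full three-panel layout
      }
    }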

For convenience in description, the terms “Work viewport”, “center viewport”, “launchpad”, “home screen” and “home page” may be used interchangeably herein because each may persist as a user-configured starting point in which to access content.

A client computer device structure or framework provides a viewport for a web interface for access to, or interaction with, a suite of multiple and diverse applications (or data sources), in accordance with the principles of the present disclosure. The viewport can be used for the multiple and diverse applications and may, for example, provide services to a user for application-to-application navigation, personalization, search, and incident creation. The Viewport may be designed to provide a common, same, or unified user experience (UX) to the user when launching, accessing, or interacting with one or more of the multiple applications. In an example implementation, a backend or gateway computer system (which may be connected to multiple applications or hosts) may generate the viewport. The Viewport may be delivered or presented as a web page on the client computer device and serve as a single web-based entry point for multiple applications and analytics across platforms and devices.

As indicated above, the content of the viewport may be organized in one or more containers (e.g., main or center “shell” container, left container, right container) for display on a display screen of a client computer device. The main container may contain the launchpad (e.g., home page), which may act as the starting or focal location for initiating application-to-application navigation, personalization, search, and incident creation, just to name a few examples.

Each of the multiple applications may be represented by, or delivered via, content (e.g., a graphical user interface (GUI) element, link, tile, factsheet, or other object) on the viewport (or within the launchpad). Further, the content of the launchpad may be customized or personalized to a user (e.g., based on user role, authorization level, user interests or needs, etc.) for access to, or interaction with, a selected subset of the multiple applications (or data sources). Each of the selected subset of multiple applications may be represented by a specific object (e.g., a tile or link) on the viewport (or within the launchpad). The specific object (e.g., tile or link) may be identified or labelled by a name, title, or icon indicating the specific application which the specific object represents. The tile or link (e.g., by a single click) may be used as an application launcher on the viewport (e.g., web interface) to launch the application that the tile or link represents.

The tiles corresponding to the specific applications represented on the launchpad may be organized as a group or array of tiles in a “tiles area” of the UI hosting the launchpad. Similarly, links corresponding to specific applications represented on the launchpad may be organized as a list of links in a “links area.” A Design Time Tool (e.g., available, for example, in a menu or via a tile or link on the launchpad) may allow users or administrators to define which applications should be displayed as links or tiles on the launchpad. Users/Administrators may personalize the tiles area and the link list area to a user.

One or more containers of the viewport may have adjustable amounts of displayed content (e.g., number of tiles) (and correspondingly adjustable display size or display area) so that the same viewport can be adapted for display on different-sized display screens of different client device types (e.g., smartphones, smart watches, laptops, workstations, tablets, desktop computers, etc.), and across all possible deployment options (e.g., on premise, cloud, as-a-service, etc.). Which ones of the one or more containers are displayed on the display screen at a given moment may depend, for example, on the status of tasks or activities of the user navigating the viewport, and also, for example, on the size of the display screen of the client computer device available for display.

In example implementations, a container (e.g., center container, launchpad) may be used to display main or core content for a user (e.g., application/tiles relevant to a user's work or role). The launchpad may serve as a shell container to access all content. Other containers may include different panels with different floorplans for different content corresponding to user interests or activities (e.g., a "ME" panel displaying information or personal data about a user, a "notifications center" displaying notifications (e.g., e-mail, text messages, alerts, etc.) for the user, a panel displaying discussion threads or boards, an Overview Page, an Object Page (e.g., a floorplan to view, edit and create objects), a panel displaying context and ad-hoc workflows, a panel displaying dynamic sidebar information, a dynamic side content panel, etc.). The dynamic side content is a layout control that allows additional content, such as a timeline, chat, or additional information, to be displayed in a way that flexibly adapts to different screen sizes. In some implementations, if no notifications are available, the launchpad may overtake space typically set aside for notifications. In some implementations, the launchpad may be placed with a visual effect, including sliding in from a top of a UI and bouncing into place in the UI.

In example implementations, the applications (which, for example, may be a set of applications implemented on HTML5/CSS/JS technology using SAPUI5 framework) delivered via launchpad 101 may adhere to a consistent, responsive design that allows users to seamlessly experience the applications across interaction channels—desktop, tablet, mobile, etc. Further, the applications delivered via the launchpad may include legacy applications implemented on traditional platforms using legacy UI technologies (e.g., FPM/WDA, SAPGUI for HTML, SAPGUI for Windows, etc.). Access to legacy applications may, for example, be provided via corresponding links in a links area of the personalized UI display.

In an example implementation of the personalized UI display, a start screen (e.g., main container, "launchpad" or home page) may present assigned applications as so-called "tiles" (e.g., tile 150, tile 151, tile 152, etc.). Tiles (which are user-activable UI elements) may only be used as application launchers for launching applications and presenting the applications on the launchpad. An app descriptor defines the navigation intent (i.e., semantic object plus action) used to launch the transaction; the title, subtitle, and icon for the application launcher (i.e., the text of the tile); and parameters (e.g., an order number).
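
As a non-limiting illustration of such an app descriptor, the following TypeScript sketch uses assumed field names; the icon reference and parameter values are examples only.

    // An illustrative app descriptor for a tile: navigation intent (semantic
    // object + action), launcher texts and icon, and optional parameters.
    interface AppDescriptor {
      semanticObject: string;               // e.g., "SalesOrder"
      action: string;                       // e.g., "display"
      title: string;                        // tile title
      subtitle?: string;                    // tile subtitle
      icon: string;                         // icon shown on the application launcher
      parameters?: Record<string, string>;  // e.g., { orderNumber: "4711" }
    }

    const salesOrderTile: AppDescriptor = {
      semanticObject: 'SalesOrder',
      action: 'display',
      title: 'Display Sales Order',
      subtitle: 'Orders awaiting review',
      icon: 'sap-icon://sales-order',
      parameters: { orderNumber: '4711' },
    };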

A user may use these tiles (e.g., tile 150, tile 151, tile 152, etc.) to launch or navigate to specific applications. Incorporated into the launchpad may be a launchpad Designer tool, which allows assignment of tiles to users and user groups for customization or personalization (e.g., based on user role) of launchpad 101. As a general rule, each of the multiple applications (for which launchpad 101 serves as an interface) may correspond to at least one tile. An exception to the general rule may be for factsheet applications, which need not be represented by tiles. However, factsheets may optionally still be saved as and represented by tiles on launchpad 101 if desired.

In accordance with the principles of the present disclosure, a tile that represents an application (e.g., on launchpad 101 or any other UI), apart from serving as a UI element or button for launching the application and displaying the application identifier, may be a container that displays different types of additional information or content. The additional information may include, for example, informative text, numbers, and charts. The displayed tile content may be static or dynamic. The displayed tile content may be dynamically updated and may include, for example, data (e.g., trends or key performance indicators (KPIs), application status, etc.) supplied by the backend systems or applications that the tile represents.
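
Dynamic tile content of this kind might be refreshed as in the following TypeScript sketch, which is illustrative only; the backend call and the content fields are assumptions.

    // A dynamic tile periodically refreshes the KPI value and status it displays
    // from the backend system or application the tile represents.
    interface TileContent {
      value: number;                  // e.g., a KPI such as open items
      trend: 'up' | 'down' | 'flat';
      status: 'ok' | 'warning' | 'critical';
    }

    async function refreshTile(
      fetchKpi: () => Promise<TileContent>,
      update: (content: TileContent) => void,
    ): Promise<void> {
      try {
        update(await fetchKpi());
      } catch {
        // Keep the last known tile content if the backend is temporarily unreachable.
      }
    }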

The multiple applications described herein may be hosted on the same or different types of computer platforms or systems (possibly including some applications hosted on the client device itself). In example implementations, the different types of computer platforms or systems may include, for example, SAP HANA, SAP ABAP, or other enterprise-type computer platforms or systems.

In example implementations, the suite of the multiple applications which an enterprise may deploy for its operations (e.g., in the areas or domains of Finance, R&D, Engineering, Human Resources, Manufacturing, etc.) may be large. Different subsets of these applications may be used in the work of enterprise personnel who may have a variety of different roles. Each user may have a need to use a different respective subset of the multiple applications, based, for example, on the user's role in the enterprise.

In general, viewports (e.g., viewports 110, 120, and 130) may each represent a partial view of a larger surface. By opening up this surface beyond the borders of a window (i.e., beyond the borders of the actual screen), the architecture described herein can be extended to larger screens and collaborative wall displays. For example, if a screen or window is too small, the user will only see the viewport that fits the screen or window. On the other hand, if the virtual screen is wider (e.g., multi-screen displays), the systems and methods described herein can provide an advantage of allowing a widening of the viewport to offer a panoramic view of the surface. While maintaining the promise to responsively support small devices, the systems and methods described herein offer the possibility to also target larger displays.

The viewport also provides the advantage of a natural user experience compared to the classical off-canvas designs that are common in mobile applications. As shown in FIG. 1A, two off-screen areas are shown, the Me area (e.g., viewport 110) with user-specific information and a Notifications area (e.g., viewport 130) on the right. Each off-screen area is populated using system-driven information. Users can access these areas through actions in a shell bar on the top left and top right corners. The transition that is shown upon accessing such content depicts a smoothly animated lateral move that mimics the user's head turning to the left and to the right in a panoramic view. User interaction with the content can be mapped to mimic natural user (e.g., human) gestures or input controls. The surface generated by the view therefore removes any screen limitations. Such a surface offers additional space for user-specific and system-driven data.

The Me Area can be found in the off-screen area to the left. Because this area is located off-screen, it is not permanently visible to the user. In order for the Me Area to slide into view, the user can click on the profile image located on the top left corner of the screen, an action that mimics the user turning his or her head to the left. This action will also trigger the viewport to move to the left and the main content area to zoom out. As the Me Area slides into view, the user will be able to access information relevant to both the user and his or her usage environment. This includes, for example, the user's profile picture and access to online state, settings and preferences, a catalog of available apps (App Finder), tools to personalize the current content in the main area, and objects and apps recently visited by the user.

The Me Area may be available from each screen in the main content area. On the background surface, the different areas co-exist and influence one another. While most actions in the Me Area are available independently of the current context, some of the actions will be directly tied to the content shown in the main content area. For example, settings will display the settings page for the specific app in the main content area (not yet available). Additionally, personalization options might only be available if the respective screen is visible in the main area. In some implementations, an option to allow users to view a list of their most recently visited items is provided. This is especially useful for those users who are used to working with a limited set of apps or objects as it significantly simplifies their navigation.

The right off-screen area is dedicated to providing system-driven information. This may include system-generated notifications of events to which a user has subscribed. The system may provide more live insights and actions, making a real-time push channel increasingly important.

A notification center can provide system-generated notifications from various sources such as the workflow inbox or chat notifications. Notifications can be prioritized and grouped into groups of similar items. Through these configurations, the user will be able to access more information about a notification and take immediate action.

Similar to the Me Area, the notification area is accessible from every app that is shown in the main content area. Here, too, the user can bring the notification area into focus through a virtual turn of the head—that is, by clicking on the notification icon on the top right corner of the screen.

The notification area exists independently of the application in the main content area. The big difference between this area and the notifications on the home page of the launchpad is that the launchpad home area displays notifications within the launch tiles. By separating the notifications from the tiles, the rationale is to guide users and make them aware of critical and actionable issues immediately. Other types of information may be suitable for display in the notification area, such as progress indicators for long-running tasks (for example, for a build or deployment process).

With the design of the viewport, the systems and methods described herein can concurrently manage different screen areas without sacrificing simplicity and responsiveness. The viewport offers a partial view of a potentially infinite surface on which content and functionality can be placed either in a fixed layout with the three main areas, or in a more flexible layout of multiple areas.

In one example, the Me Area slides into view from the left to offer users access to various user-related information including personalization, profile, settings and interaction history. Similarly, the notification area slides into view from the right to offer users access to system-driven information that helps them to become aware of critical, real-time information. The notification area may also offer other system-driven content.

FIG. 1B is an illustration showing an example login screen 110 displayed in the shell main container 104. The login screen 110 provides a UI that allows a user to enter credentials in order to log into and begin a personalized and customized UX. In the example shown in FIG. 1B, the login screen 110 appears to drop into the shell main container 104 from a virtual extension area located along a top of a display area. In some implementations, the virtual extension area can be placed along the bottom of the display area. In some implementations, the virtual extension area can be placed to the left and/or the right of the display area.

FIG. 1C is an illustration showing an example launchpad 101 displayed in the shell main container 104. The launchpad 101 can be a web-based entry point (or homepage) for enterprise applications that can execute (run) across multiple platforms and computing devices. In the example shown in FIG. 1C, the launchpad 101 appears to drop into the shell main container 104 from the top of a display area. In some implementations, the virtual extension area can be placed along the bottom of the display area. In some implementations, the virtual extension area can be placed to the left and/or the right of the display area.

The launchpad 101 can serve as a bracket around (or a base for) a set (or group) of enterprise applications, providing a single point of entry for the set of enterprise applications. In the example shown in FIG. 1C, the launchpad 101 presents (displays on a screen of a computing device of a user) each application represented by a tile. A tile can be a container that represents the application. Each tile can display different types of content. A user can interact with each tile to navigate to the specific enterprise application associated with the tile. In addition, when designing a tile to represent a specific application, a programmer can assign a tile to a specific user or group of users. The launchpad 101 can provide one or more services. The one or more services can include, but are not limited to, application-to-application navigation, personalization, role-based application assignments, search, and incident creation.

The launchpad 101 can be a role based, personalized, real-time and contextual aggregation point for business applications and analytics. The launchpad 101 can run (execute) on multiple computing devices including, but not limited to, desktop computers and mobile computing devices such as laptop computers, tablet computers, notebook computers, personal digital assistants (PDAs), smartphones, mobile phones, smart watches, etc.). In addition, the launchpad 101 can be deployed on multiple platforms (e.g., Linux, Windows, Windows Phone, MAC®, iOS®, OS X®, Android®, etc.).

The launchpad 101 includes tiles 114a-h. Each tile can display different types of content. For example, tile 114a can be a news and feeds tile that can enhance collaboration by providing a user with information about the enterprise. The tiles 114a-h can be individually color-coded. A color can represent a particular role (e.g., finance, human resources, supply chain management (SCM), customer relationship management (CRM), etc.). The tiles 114a-h can be associated with a group 116. Tile 114f can be a key performance indicator (KPI) tile. Tile 114b can be a basic launch tile. Tile 114d can be a monitoring tile. Tile 114g can display a comparison chart for specific content.

The launchpad 101 includes a link list area 118 that includes links 119a-f. The link list area 118 is an area on the launchpad 101 that can provide links to enterprise applications represented by the tiles 114a-h. For example, a user can select and drag a tile from the tile area on the launchpad 101 into the link list area 118 to create a link to the application associated with (represented by) the tile. In some implementations, the launchpad 101 can include a footer toolbar (e.g., footer toolbar 132 as shown in FIG. 1F). In some implementations, the footer toolbar can appear to float over the content displayed in the launchpad 101.

In some implementations, the shell toolbar 108 can display a search icon 111 and a copilot launch icon 113. A user can select (click on) the copilot launch icon 113 to launch a copilot UI. A copilot UI will be described in more detail with reference to FIG. 1I.

FIG. 1D is an illustration showing an example active application screen (overview page 120) displayed in the shell main container 104. The enterprise applications that can be accessed by a user by way of the launchpad 101 and then subsequently displayed in an active application screen (e.g., the overview page 120) can include, but are not limited to, transactional applications, analytical applications, and fact sheet applications (contextual navigation applications). Transactional applications can allow a user to create, change and/or approve processes with guided navigation. Analytical applications can provide a user with a visual overview of a dedicated topic for monitoring and tracking purposes to allow for further key performance indicator (KPI) related analysis. Fact sheet applications can allow a user to view essential information about an object and to allow navigation between related objects.

The overview page 120 can visualize all of the information a user may need for a specific business context (business domain) on a single page or screen. The information can be displayed in one or more variable content packages (VCPs) or cards 122a-i. Each card can be a container of content for organizing large amounts of information on an equal plane within the overview page 120. In some implementations, a user can rearrange the position of the cards 122a-i on the overview page 120. In some implementations, a user can define, add, or delete cards included in the overview page 120.

An overview page (e.g., the overview page 120) can be a selectable application (e.g., from the launchpad 101) providing an integrated gateway into enterprise applications and application content included in the launchpad 101. The UI of the overview page (e.g., the overview page 120) can provide a user with a visual summary of data, links, actions, and content that are relevant to a business domain of expertise of a user and relevant to a selected role of the user within the domain. The visual summary can be presented in one or more cards (e.g., the cards 122a-i) that display live content to a user at-a-glance without the user having to open multiple applications and perform multiple drill downs through application content to find and present the content.

In some implementations, the overview page 120 can include a footer toolbar (e.g., footer toolbar 132 as shown in FIG. 1F). In some implementations, the footer toolbar can appear to float over the content displayed in the overview page 120.

In some implementations, an enterprise system can determine content displayed on an overview page (e.g., the overview page 120). In addition or in the alternative, a selection of one or more business domains and one or more roles of a user in the business or enterprise can determine content displayed on an overview page (e.g., the overview page 120). In some implementations, a user can make the selection using a settings UI included in a launchpad (e.g., the launchpad 101). In some implementations, a user can select one or more business domains and/or one or more roles of the user in the enterprise by way of an overview page (e.g., the overview page 120). Selecting one or more business domains and/or one or more roles of the user in the enterprise by way of the overview page can maintain absolute relevance to the individual user and the way in which the user works.

In some implementations, the user can personalize the layout and placement of one or more cards (e.g., the cards 122a-i) included in a UI of an overview page (e.g., the overview page 120) and the display of content included in each card. The personalization can enhance the workplace productivity of the user.

FIG. 1E is an illustration showing an example object page (object page 124) displayed in the shell main container 104. An object page can be a floor-plan used to represent objects in a UI. An object page can be used to display, create, or edit an object. An object can represent a business entity (e.g., a customer, a sales order, a product, an account, etc.). Enterprise applications that reflect a specific scenario (e.g., a sales order, an account status) can be bundled using an object. The object page can include a header area 126, a navigation area 128, a content area 130, and, in some implementations, a footer toolbar (e.g., footer toolbar 132 as shown in FIG. 1F). In some implementations, the footer toolbar can appear to float over the content displayed in the object page 124. For example, referring to FIG. 1C, a user can select the tile 114f and an object page can be displayed to the user.

FIG. 1F is an illustration showing an example footer toolbar (e.g., footer toolbar 132). In some implementations, referring to FIG. 1A, the footer toolbar 132 can appear at the bottom of a screen displayed in the shell main container 104, the left container 102, and/or the right container 106. For example, as described herein with reference to FIGS. 1C-E, a footer toolbar (e.g., the footer toolbar 132) can be displayed at the bottom of the launchpad 101, the overview page 120, and the object page 124. The footer toolbar (e.g., the footer toolbar 132) can continue to appear at the bottom of the screen of the display area of the display device even as the displayed screen is scrolled. The footer toolbar (e.g., the footer toolbar 132) can appear to hover over or float over the content being displayed on the screen. The footer toolbar 132 can include buttons or controls 134a-k. The controls 134a-k can be selected by a user in order to perform one or more actions that can affect content included on the page being displayed on the screen. The controls 134a-k are examples of controls that can be included in a footer toolbar. In some implementations, the controls can be different, fewer than, or more than the controls 134a-k. The type and number of controls included in a footer toolbar can be based on the type of page being displayed and/or the content being displayed in the page.

FIG. 1G is an illustration showing an example me area (e.g., me area 136) that can be displayed in the left container 102. In some implementations, the me area 136 can be displayed in the right container 106. The me area 136 includes an upper section 138 and a lower section 140. The upper section 138 includes a user icon 142. Selecting (clicking on) the user icon 142 can provide a user profile. A dropdown indicator button 144 displays a status of the user and, if selected, allows the user to log out of an application. The upper section 138 includes navigation targets 146a-e. Selection of (clicking on) a navigation target by a user triggers a corresponding functionality (e.g., an application) associated with the navigation target. The me area 136 can provide various generalized functionalities related to a user.

The upper section 138 can include sort selections 146a-b. A user can select (click on) a sort selection (e.g., one of the sort selections 146a-b) to determine how the listing of the recent activities included in the lower section 140 will be sorted and displayed.

The lower section 140 of the me area 136 includes a list of recent activities 148a-c. The recent activities 148a-c can include links 156a-c, respectively, that when selected (clicked on) by a user can navigate the user back to the shell main container 104, opening an application (or function) that corresponds to the link in the shell main container 104. Recent activity items can include, but are not limited to, enterprise applications, triggered searches, co-pilot collections, and co-pilot drafts.

FIG. 1H is an illustration showing an example notification area (e.g., notification area 150) that can be displayed in the right container 106. In some implementations, the notification area 150 can be displayed in the left container 102. The notification area 150 includes notifications 152a-c. A user interacting with the UI in the notification area 150 can take immediate action on a notification. A notification item (e.g., notifications 152a-c) can have an indicator (e.g., notification indicators 154a-c) that can indicate the status of the notification. For example, a notification indicator can be color coded to indicate a particular status of the notification.

A user can reject a notification by selecting (clicking on) a reject selection (e.g., a reject selection 156a-b). For example, a user can reject the notification 152a by selecting (clicking on) the reject selection 156a. The rejection of the notification 152a (the notification status) can be indicated by content included in (e.g., a color of) the notification indicator 154a. A user can acknowledge a notification by selecting (clicking on) an acknowledge selection (e.g., an acknowledge selection 158a-b). For example, a user can acknowledge the notification 152b by selecting (clicking on) the acknowledge selection 158b. The acknowledgement of the notification 152b (the notification status) can be indicated by content included in (e.g., a color of) the notification indicator 154b.
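
For illustration only (cf. the action processing of FIG. 10), the following TypeScript sketch applies a reject or acknowledge action to a notification item and derives an indicator color from the resulting status; the status values and colors are assumptions for this example.

    // Acting on a notification item updates its status, which in turn drives
    // the color of its indicator.
    type NotificationStatus = 'new' | 'acknowledged' | 'rejected';

    interface NotificationItem {
      id: string;
      text: string;
      status: NotificationStatus;
    }

    function applyAction(
      item: NotificationItem,
      action: 'reject' | 'acknowledge',
    ): NotificationItem {
      return { ...item, status: action === 'reject' ? 'rejected' : 'acknowledged' };
    }

    function indicatorColor(status: NotificationStatus): string {
      switch (status) {
        case 'acknowledged':
          return 'green';
        case 'rejected':
          return 'grey';
        default:
          return 'red'; // 'new'
      }
    }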

A user can drill down into a relevant application by selecting (clicking on) a more info selection (e.g., a more info selection 160a-b). In some cases, a user may contact someone directly in response to a notification.

FIG. 1I is an illustration showing an example copilot UI (e.g., copilot UI 162). For example, referring to FIG. 1C, a copilot application can be launched from the launchpad 101 when a user selects (clicks on) the copilot launch icon 113. The copilot application can provide (generate and display) the copilot UI 162. In some cases, the copilot UI 162 can float over the UI included in the launchpad 101. As a floating UI control, the copilot UI 162 can be visually unobtrusive and flexible in its cross-functional omnipresent implementation across any device or application screen.

The example copilot UI 162 is an example copilot start page or start screen. The start screen (the copilot UI 162) can be an entry point for copilot functionality for an enterprise system.

The copilot UI 162 can provide shortcuts to different copilot features. For example, as shown in FIG. 1I, a collection can be represented by an entry in a collection list 164 that includes collection list entries 164a-d. A copilot collection can be a cluster of items in relation to a specific topic. For example, an item can be a note, a screenshot, a chat message, a copilot message, an object, or a quick create. In some implementations, the items included in the collection can be homogeneous (e.g., all of the items are of the same type). In some implementations, the items included in a collection can be non-homogeneous (e.g., the items can be of different types). Each collection list entry 164a-d can provide a representation of a collection that can include a title, a timestamp (e.g., last changed), a visual content summary, and a textual content preview. In some implementations, the collection list 164 can be searched and/or filtered.

For example, the selection of a copilot shortcut 166a-d can allow a user to create and navigate to a new collection with a specified intention. The selection of a copilot create icon 168 located in a copilot footer toolbar 170 can create and navigate to a new plain collection. The selection of a copilot settings icon 172 located in the copilot footer toolbar 170 can allow a user access to copilot settings (e.g., display a copilot settings UI, open a copilot settings application, etc.).

Copilot entries can be living, gradually growing artifacts and software entities that can accompany a user from the identification of an issue to a solution for the issue, while providing support in the form of relevant context and actions. Copilot entries can serve as memory aides while the copilot entries can incrementally evolve into valuable transactional tasks and collaborations as they mature in meaningful ways that bridge a gap between predefined application functionality and processes based on personal ways of working for a user. Though the example shown in FIG. 1I describes launching the copilot application from the launchpad 101, referring to FIG. 1A, the copilot application can be launched from other screens displayed in (included in) the shell main container 104, the left container 102, and/or the right container 106.

Copilot entries can be made ready for users to use when communicating, collaborating, and creating actionable transactions in desktop or mobile scenarios. For example, copilot text entries can be analyzed for recognizing and identifying relevant text related objects. Copilot text entries can emphasize displayed text, and a copilot application can recommend contextual entities for use in a current task. The copilot application can understand user context and can intelligently propose selections, auto-entries, and user options.

A smart template can provide a framework for generating user interfaces at runtime for an enterprise application. For example, a smart template can be used to generate the UI for the overview page 120 as shown in FIG. 1D. In another example, a smart template can be used to generate the UI for the object page 124, as shown in FIG. 1E. A smart template can provide a framework for generating the user interfaces based on metadata annotations and predefined templates for the most used application patterns. The use of smart templates can ensure design consistency by providing centralized high quality code by using predefined templates and controllers. The use of smart templates can keep applications up to date with evolving design guidelines. The use of smart templates can reduce an amount of front-end code used in building enterprise applications. The term “smart” can refer to annotations that add semantics and structures to provided data. The term “smart” can also refer to the way in which the templates understand the semantics.

FIG. 1J is an illustration of a timeline UI (e.g., the timeline 174). A timeline UI (e.g., the timeline 174) can display timeline entries 176a-e. For example, the entries can be events, objects, and/or posts listed and displayed in a chronological order. The timeline 174 includes nodes 178a-d that correspond to respective timeline entries 176a-d.

The timeline 174 can be used for collaborative communications. The timeline 174 can be configured in multiple different ways depending on use case implementations. For example, the timeline 174 can provide information about changes of an object or about events related to an object. The timeline 174 can provide information about generated entries (e.g., value XY changed from A to B) or about manual entries (e.g., comments from an individual). In some implementations, the latest entry is at the top of a list displayed by a timeline. In some implementations, the timeline 174 can be displayed along with a business object. In some cases, the timeline 174 can be displayed to the right of the business object.

Two example versions of a timeline can include a basic timeline and a social timeline. A basic timeline can be a read-only timeline. A social timeline can allow for interaction and collaboration among users.

FIG. 2 is a diagram of an example system 200 that can implement the user interfaces and user experiences described herein. The system 200 includes an enterprise computing system 202, a network 204, and client computing devices 206a-e.

For example, computing device 206a can be a mobile phone, a smartphone, a personal digital assistant, or other type of mobile computing device. The computing device 206a includes a display device 220. For example, computing device 206b can be a laptop or notebook computer. The computing device 206b includes a display device 222. For example, computing device 206c can be a tablet computer. The computing device 206c includes a display device 224. For example, the computing device 206d can be a wearable device such as a smartwatch. The computing device 206d includes a display device 226. For example, the computing device 206e can be a desktop computer. The computing device 206e can include a display device 228. A user of the computing devices 206a-e can use/interface with the display devices 220, 222, 224, 226, and 228, respectively, when interacting with the enterprise computing system 202. The computing devices 206a-e can display on the display devices 220, 222, 224, 226, and 228 any of the screens and UIs described herein.

The enterprise computing system 202 can include one or more computing devices such as a web management server 214, a frontend server 230, a backend server 208, and a mobile device management server 210. The enterprise computing system 202 can also include a database management computing system 212 that includes a database management server 212a and a database 212b. Though not specifically shown in FIG. 2, each server (the web management server 214, the frontend server 230, the backend server 208, the mobile device management server 210, and the database management server 212a) can include one or more processors and one or more memory devices. Each server can run (execute) a server operating system.

In some first implementations, the client computing devices 206a-d (e.g., the mobile computing devices) can communicate with the enterprise computing system 202 (and the enterprise computing system 202 can communicate with the client computing devices 206a-d) by way of the mobile device management server 210. The mobile device management server 210 includes one or more mobile device platform application(s) 216. By using the mobile device platform application(s) 216, the enterprise computing system 202 can deliver cross-platform, secure, and scalable applications to the computing devices 206a-d, independent of the mobile computing device type (e.g., laptop, notebook, smartwatch, mobile phone, PDA, etc.) and independent of the operating system running on the computing devices 206a-d. In these implementations, the mobile device management server 210 can then communicate with the web management server 214.

In some second implementations, the client computing devices 206a-e (both the mobile computing devices (computing devices 206a-d) and the desktop computing device 206e) can communicate with the enterprise computing system 202 (and specifically with the web management server 214), and the enterprise computing system 202 (and specifically the web management server 214) can communicate with each of the client computing devices 206a-e, using the network 204. The web management server 214 includes a web dispatcher application 218. In both the first implementations and the second implementations, the web dispatcher application 218 can act as a “software web switch” accepting or rejecting connections to the enterprise computing system 202.

In some implementations, the network 204 can be a public communications network (e.g., the Internet, cellular data network, dialup modems over a telephone network) or a private communications network (e.g., private LAN, leased lines). In some implementations, the computing devices 206a-e can communicate with the network 204 using one or more high-speed wired and/or wireless communications protocols (e.g., 802.11 variations, WiFi, Bluetooth, Transmission Control Protocol/Internet Protocol (TCP/IP), Ethernet, IEEE 802.3, etc.).

The frontend server 230 can include product specific UI Add-On Applications 232 and a UI infrastructure 234. The UI infrastructure 234 can include a design portion and a runtime portion. The frontend server 230 can decouple a lifecycle of a UI (e.g., design and runtime deployment) from the backend server 208. The decoupling can allow UI applications to interface with a plurality of different databases. The decoupling provides a single point of UI design, access, and maintenance allowing for theming, branding, configuring, and personalizing a UI without a need for development privileges to the backend server 208 (e.g., no need to have backend administrative rights). The decoupling can result in a more secure enterprise computing system. The decoupling can provide for rule-based dispatching of requests in a multi-system landscape (e.g., for approvals including aggregation).

The frontend server 230 includes a gateway 236. The gateway 236 can provide a way to connect devices, environments, and platforms to enterprise software based on market standards. The gateway 236 can enable the development of UIs for use in different environments (e.g., social and collaboration environments). The gateway 236 can enable the development of UIs for use on different types of client computing devices (e.g., client computing devices 206a-e). The gateway 236 can enable the development of UIs for use in internet-based applications.

The backend server 208 can include a bundle (a set) of business applications (e.g., business suite 238). The business applications can be transactional applications, analytical applications, and fact sheet and contextual navigation applications. Transactional applications can allow task-based access to tasks that can include create and change. In addition or in the alternative, transactional applications can allow access to entire processes with guided navigation. Analytical applications can provide a user with a visual overview of complex tasks for monitoring and tracking purposes. Fact sheet applications and contextual navigation applications involve search and explore activities. Fact sheet applications and contextual navigation can allow a user to view essential information about an object and can allow contextual navigation between related objects.

The database management computing system 212 includes a database management server 212a that can run (execute) applications that can manage a database 212b. For example, the database 212b can be an in-memory, column-oriented, relational database (e.g., SAP HANA®). The database management computing system 212 can include extended application services 240 that can embed a full featured application server, web server, and development environment within the database management computing system 212. The extended application services 240 can include application content 242 and reuse content 244 for use by the enterprise computing system 202 when providing a personalized, responsive, and simple UX across different types of computing devices and deployment options.

FIG. 3 is a diagram of an example system 300 that can implement the launchpad for the user interfaces and user experiences described herein. The launchpad acts as a runtime shell environment for the apps described herein, in which the personalized home page is one feature among many other services. The launchpad is based on a unified shell architecture. The guiding principle of the unified shell is to have a single, platform-independent, client-side runtime environment which can be hosted on different server platforms (e.g., SAP NetWeaver AS ABAP, SAP HANA XS, SAP HANA Cloud Platform).

In general, the framework described herein may provide support for modularizing comprehensive JavaScript applications. That means, instead of defining and loading one large bundle of JavaScript code, an application can be split into smaller parts which can then be loaded at runtime at the time when they are requested. These smaller individual files are called modules.

A module is a JavaScript file that can be loaded and executed in a browser. The module may include a name, a description, a dependency, and a declaration location. The content bundled in a module is up to the developer, but typically the content has a common topic, such as forming a JavaScript class or namespace, or addressing a specific topic, for example client-to-server communication or mathematical functions.

Modules have no predefined syntax or structure, but module developers can use the name, declaration, description, or dependency to identify such modules. The name identifies the module and is used with jQuery.sap.require to load the module. As human readers associate a module with the main JavaScript object declared in it, the module names by convention are a hierarchical sequence of dot-separated identifiers like sap.ui.core.Core. A developer can use all but the last identifier to group modules in a logical and/or organizational order, similar to packages in Java, and can use the last identifier to give the module a semantical name.

Modules can declare themselves and their location of content by calling the static jQuery.sap.declare function with their name. This helps SAPUI5 to check at runtime whether a loaded module contains the expected content by comparing the required name against the declared name. As a side effect, jQuery.sap.declare ensures that the parent namespace of the module name exists in the current global namespace (window). For modules without declaration, the framework assumes that the module has the expected content and declares it with the name that was used for loading. In some cases a module declaration is mandatory.

The description of a module is any JavaScript comment preceding the module's declaration statement and is intended to help to decide whether a module is useful for the intended purpose. The configuration UI displays the description next to the module name.

Modules can use the jQuery.sap.require method to load other modules they depend on. While jQuery.sap.require internally has the effect of a loadModule call, it can also be regarded as a dependency declaration. The dependency declarations can be evaluated at runtime, but can also be analyzed at build time or at runtime on the server.
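
By way of a non-limiting illustration, and not as part of the claimed subject matter, a module following the conventions described above might look like the following sketch; the module name some.example.MathUtils and its contents are hypothetical and chosen only for this example.

    // some/example/MathUtils.js -- hypothetical module; name and content are illustrative only.
    // Declare the module so the framework can compare the declared name with the
    // required name and create the parent namespace (some.example) on window.
    jQuery.sap.declare("some.example.MathUtils");

    // Declare and load a dependency; evaluated at runtime and usable as a
    // dependency declaration at build time.
    jQuery.sap.require("sap.ui.core.Core");

    some.example.MathUtils = {
        // Example of a module addressing a specific topic (mathematical functions).
        average: function (aValues) {
            var iSum = aValues.reduce(function (a, b) { return a + b; }, 0);
            return aValues.length ? iSum / aValues.length : 0;
        }
    };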

In one example, the unified shell offers unified services with platform-independent interfaces (APIs) (e.g., services 301) to the hosted apps and shell components. The implementations of these services can utilize different service adapters for the respective platform to carry out platform-specific behavior. The unified shell can be enabled using a shell container 302, shell services 304, and a shell renderer 306. In some implementations, the shell container may be independent of shell services 304 by utilizing the shell renderer 306.

Applications (e.g., apps) 308 may be embedded in an application container 310. As this is an independent re-use component, the embedding aspect is decoupled from the renderer 306. The application container 310 can, for example, host SAPUI5 components, Web Dynpro ABAP applications and SAP GUI for HTML transactions.

The shell services 304 and renderers 306 are managed by the central shell container 302. The shell container 302 utilizes a runtime configuration 312, which defines the concrete implementations for services 314, adapters 316, and shell renderer 306, as well as global settings like theme, language, system, and user data. The runtime configuration 312 is fed by a number of settings, including, but not limited to, static configuration settings in the hosting HTML page, dynamic configuration data read from the front-end server during startup, and/or dynamic settings passed as query parameters in the URL.

In some implementations, the JavaScript components shown in FIG. 3 are embedded into a single HTML page. The launchpad implementation of the SAP NetWeaver ABAP front-end server may contain a standard page called, for example, Fiorilaunchpad.html 318, or another URL directed to one or more viewports 320. Users may create custom start pages which utilize the shell with different static configurations.

The web browser can use http data and OData to access application backend systems 322 and UI front-end server 324 (e.g., service implementations 326 and UI contact 328) via web dispatcher 330.

Users can embed apps into the launchpad. When embedding applications into the launchpad, the system 300 differentiates between applications based on SAP GUI for HTML or Web Dynpro ABAP, which can be embedded using an iFrame (i.e., an inline frame), and applications based on SAPUI5. Because SAPUI5 applications have been implemented using the same UI technology as the launchpad, they can be embedded directly into the launchpad using DOM injection. This approach also allows smooth, animated UI transitions and the reuse of shared components at runtime. Therefore, applications have to be implemented as self-contained SAPUI5 components, as described below.

In a specific example, users can embed SAPUI5 Applications into the launchpad using the application container 310 configured with the following parameters: the URL (root path) of the application and the name of the SAPUI5 component. The root path is a path where the component controller for the SAPUI5 app (e.g., the Component.js file) is located. The application container 310 registers the component namespace as module path for the application URL.

The SAPUI5 component is defined with a file structure having a file named Component.js, which should be located in the root folder of the application being embedded. The definition of an SAPUI5 component includes the component metadata. The component metadata includes a config object containing additional information. The launchpad-specific configuration is defined in this config object.

The launchpad evaluates the following properties of the component configuration:

ResourceBundle—Path to the resource bundle that holds the translated app title. Example: i18n/i18n.properties.

TitleResource—Key of the app title text in the resource bundle. The title is typically displayed in the browser tab.

FavIcon—Path to the “favicon” (*.ico) file for the app, which is typically displayed in the address bar or next to the window title or tab title.

HomeScreenIconPhone, homeScreenIconPhone@2, homeScreenIconTablet, and/or homeScreenIconTablet@2—Paths to icons with different resolutions that are used when users add the (launchpad page containing the) app to their mobile devices' home screens. The properties with an @2 suffix enable referral to special icons for high-resolution devices.
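
For purposes of illustration only, a Component.js file using the configuration properties described above might resemble the following sketch; the namespace my.app, the file paths, and the resource bundle key APP_TITLE are assumptions, and the property casing follows common SAPUI5 usage.

    // Component.js (located in the root folder of the embedded application)
    jQuery.sap.declare("my.app.Component");
    jQuery.sap.require("sap.ui.core.UIComponent");

    sap.ui.core.UIComponent.extend("my.app.Component", {
        metadata: {
            config: {
                resourceBundle: "i18n/i18n.properties",        // holds the translated app title
                titleResource: "APP_TITLE",                    // key of the title text (assumed key)
                favIcon: "img/favicon.ico",
                homeScreenIconPhone: "img/launchicon/phone.png",
                "homeScreenIconPhone@2": "img/launchicon/phone@2.png",   // high-resolution variant
                homeScreenIconTablet: "img/launchicon/tablet.png",
                "homeScreenIconTablet@2": "img/launchicon/tablet@2.png"
            }
        }
    });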

The launchpad uses URL hashes for its own navigation. Direct manipulation of the location hash would interfere with the launchpad navigation. For cross-app navigation, use the Cross-Application Navigation service. For inner-app navigation, use the SAPUI5 routing API. Ensure that all controls created by your component are destroyed when the component is destroyed. Avoid using sap.ui.localResources inside your Component.js file. sap.ui.localResources registers a path relative to the main page (Fiorilaunchpad.html).
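
As a non-limiting sketch of the guidance above, cross-app navigation may be performed through the shell's cross-application navigation service and inner-app navigation through the SAPUI5 routing API; the semantic object/action pair, the parameter id, and the route name detail are assumptions for this example, and the snippet assumes it runs inside a controller of an app hosted by the launchpad.

    // Cross-app navigation via the shell service (hypothetical intent "SalesOrder-display"):
    var oCrossAppNav = sap.ushell.Container.getService("CrossApplicationNavigation");
    oCrossAppNav.toExternal({
        target: { semanticObject: "SalesOrder", action: "display" },
        params: { id: "12345" }
    });

    // Inner-app navigation via the SAPUI5 routing API (route name "detail" is assumed):
    this.getOwnerComponent().getRouter().navTo("detail", { id: "12345" });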

FIG. 4A is an example screenshot 400 of a scrollable screen area. The scrollable screen area may provide one or more viewports that a user can scroll through. For example, the entire screen area may be a viewport that can be scrolled onto and off of a display screen. In another example, each region (e.g., container) within the screenshot 400 may be a viewport that can be scrolled between other viewports. As shown, the screenshot 400 includes a left container 402, a shell main container 404, a right container 406, and a shell toolbar 408. In general, it may be possible for a user to scroll (e.g., pan) left and right across different regions (e.g., container 402, container 404, container 406, and container 408) on a display device screen.

In one example, the shell toolbar 408 can be used to toggle between viewports. As shown in FIG. 4B, a screenshot 410 includes a representation 411 of the Shell Toolbar 408. The representation 411 may be provided when the Shell Toolbar 408 is off the screen. For example, if the user chooses to view other viewports that do not include the toolbar 408, then a representation 411 of the toolbar can be provided. The representation 411 includes a Toggle Me Area control 412 that can toggle between viewports (e.g., container 402, container 404, container 406, and container 408). The representation 411 also includes a Back to launchpad control 414 to enable the user to return to their launchpad viewport. A Toggle Notifications control 416 is shown to enable the user to toggle the Notifications in and out of a view of the screen.

As shown in FIG. 4C, a screenshot 420 depicts a number of viewports 422, 424, and 426 viewable on a display screen of a computing device 428. The viewports 422-426 may be selected by a user to show additional data associated with each respective viewport. For example, if a user selects an item on the launchpad viewport 424, a scrollable overlay 430 can be presented in part within the screen of device 428. In one example, the overlay 430 may be a single viewport that is scrollable by the user. In another example, a number of viewports can be represented by overlay 430. If the user selects a portion of the overlay, any Me area or notification area viewports may be hidden to display additional overlay data.

The architecture described herein can also enable a viewport that can be translated, faded, zoomed, and/or scaled on a display screen. As shown in FIG. 5A, a screenshot 500 includes a left container 502, a main container 504, and a right container 506. The left container 502 may include a Me area while the right container 506 includes a notification area. More or fewer containers can be shown, and any of the containers may be presented in any position on the screen in one or more viewports (or virtually off of the screen). The viewports described herein can support parallax side-to-side scrolling. In one example, when scrolling content, the user can shrink and fade content (e.g., as shown at arrow 508) from the main container 504. In another example, the user can scale up content and move content to another container/viewport, as shown by arrow 512 in FIG. 5B. Users can also zoom into and out of a region on a container/viewport.

FIG. 5C illustrates a screenshot 520 of an example animation that can occur when a user interacts with a portion of a viewport. In particular, if the user is viewing a viewport (e.g., an open viewport) and selects a profile icon 522, the profile image is faded from a user profile picture into a cancel icon 524. The fade includes a gradual removal of the icon by scaling down (e.g., shrinking from larger to smaller) the profile image. When the user closes the viewport, the cancel icon 524 is faded into the profile icon. The fade in includes scaling the cancel icon 524 from smaller to larger. In one example, if the left container is open and the main container or notification container is clicked, the same procedure can occur as when closing the left viewport/container.

FIG. 5D illustrates a screenshot 530 that depicts a user moving content from a main container 504. Here, the user is moving a launchpad into left container 502 from main container 504. When the user begins to move the content, an animation is generated by the systems described herein to scale down (e.g., shrink) the launchpad element as the element is dragged to the left between containers/viewports 504 and 502.

FIG. 5E illustrates a screenshot 540 that depicts a user moving content from a notification container 506. Here, the user is moving notifications from right container 506 into main container 504. When the user begins to move the content, an animation is generated by the systems described herein to scale up (e.g., enlarge) the notification element as the element is dragged to the left between containers/viewports 506 and 504.

FIGS. 6A-6D illustrate screenshots of example user interfaces depicting notification aspects. As shown in FIG. 6A, a screenshot 600 depicts a laptop display 602A in which a notification 604A is being provided. The notification 604A can be provided in a sliding right-to-left motion into the display 602A. For example, the notification can be provided from viewports outside of the display 602A and into a main viewable content area. Similarly, a notification 604B (or the same notification 604A) can slide left to right (or right to left) outward away from the viewable area on display 602B. For example, the notification 604A may be provided as an alert and, upon a threshold amount of time, the notification 604A (or 604B) can slide out of the display and into a notification container in a right viewport (e.g., viewport 506), for example.

Referring to FIG. 6B, a screenshot 610 depicts a notification 612 indicating a number of different icons/links/actionable content that can be selected to carry out an action. For example, a first action area 614 is provided in which a user can look into additional information in the notification, decline the notification, or acknowledge the notification. Looking into additional information may function to expand the notification; declining may remove the notification from view and, in some examples, may delete the notification from a notification listing, trigger email receipts or decline notices, etc. Acknowledging the notification may trigger email receipts, or additional notifications to other users. In some implementations, acknowledging a notification may place the notification onto an additional list of items to carry out with respect to the notification. In some implementations, a notification can provide one or more links and/or icons to enable a user to navigate 616 from the notification to an application, website, or other area pertaining to details in the notification.

Referring to FIG. 6C, a screenshot 620 depicts a notification 622 provided to a user in a viewport. The notification includes buttons in the notification that the user can select. Here, the buttons include transparent icons with textual description below each button.

Referring to FIG. 6D, a screenshot 630 depicts three notifications 632, 634, and 636. Each notification indicates a status using color. For example, the first notification 632 includes a red status indicating an urgent status. The second notification 634 includes a green status indicating a status with a non-urgent date. The third notification 636 includes a white status indicating a non-urgent and non-date specific status.

FIG. 7 is an example data model 700 for a notification architecture. The data model 700 includes an origin system block 702, a notification block 704, a sensitive content block 706, a notification type 708, a recipient block 710, a consumption channel block 712, a configuration block 714, a template block 716, and an action block 718.

The origin system block 702 defines where a particular notification originates. Existing messaging/notification frameworks can be used to store this information. The origin system is also responsible for deciding which notification is sent to which user and which recipient list.

The notification block 704 is the central data entity containing the content from the backend and all administrative data. It is identified by a compound key of ID, Notification Type, and Origin System. The sensitive content block 706 is stored separately, so that it can be managed separately for security and data-privacy reasons. In order to ease the processing (e.g., in case of asynchronous queues), the notification may have a processing status (e.g., created, delivered, erroneous, etc.).

The notification type 708 identifies the type of the notification (e.g., Purchase Order Approval, Leave Request . . . ). It may be used to identify the notification and also to manage the delivery configuration on a type specific level. It also defines the behavior of sensitive content cache with respect to security and data privacy (different types can behave differently). The templates and the mass texts for stacked notifications are notification type specific as well.

The recipient block 710 indicates one or more users to whom the notification should be delivered. A notification can have multiple recipients. The recipients are typically identified throughout all systems in order to deliver the notification to the correct user. The recipients are determined by the notification provider.

The consumption channel block 712 defines the technical way (email, native mobile push, Fiori launchpad, etc.) in which a notification is delivered to a recipient. A recipient can have multiple consumption channels assigned, to specify how the notification is delivered. The detailed delivery information is part of the configuration.

The configuration block 714 specifies the details about the notification delivery for a specific delivery channel and a specific user (e.g., display as banner or notification, do not disturb times, etc.).

The template block 716 specifies templates detailing the way a notification is compiled into a human readable message format. The template originates from the system that also generates the notification. The template can be replicated during the configuration of the source system or later on by an administrator. Templates are specific to a notification type and a consumption channel.

The action block 718 specifies actions that represent one-click actions, which can be performed on the notification without further user interaction (e.g., approve/reject). An action is channeled to the backend system for execution without further processing on the notification service.

In operation, an origin system 702 is defined and a notification 704 is generated. The notification includes a notification type 708 and may also include actions 718, sensitive content 706, and configuration details 714. The notification type 708 can be dictated by a template 716 that is also available to the origin system 702. The notification may include one or more recipients 710 and each recipient may be associated with a consumption channel 712.
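
The following non-normative sketch illustrates how a single notification instance might be represented according to the data model 700; the field names and values are assumptions chosen to mirror the blocks described above, not a required schema.

    // Illustrative notification instance (field names are hypothetical).
    var oNotification = {
        id: "0000001",                         // compound key part 1
        notificationType: "LeaveRequest",      // compound key part 2 (notification type 708)
        originSystem: "ERP_CLNT100",           // compound key part 3 (origin system 702)
        processingStatus: "created",           // e.g., created, delivered, erroneous
        sensitiveContentRef: "SC-0000001",     // managed separately (sensitive content 706)
        recipients: ["USER_A", "USER_B"],      // recipient block 710
        consumptionChannels: ["FLP", "EMAIL"], // consumption channel block 712
        actions: [                             // one-click actions (action block 718)
            { key: "APPROVE", text: "Approve" },
            { key: "REJECT", text: "Reject" }
        ]
    };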

FIG. 8 is an example block diagram 800 depicting an integration of notification services. A frontend server 802 retrieves and displays notifications to a user via backend server 804 using a notification API 806, for example. A launchpad notification center 808, a native mobile notification center 810, and an email client 812 may be integrated to utilize notification services provided by server 802. Such access can be provided by a reverse proxy 814, an SMP HCP block 816 (e.g., via push hub 818), and/or email gateway 820, respectively for 808, 810, and 812.

The frontend server 802 includes a notification service 822. The notification service 822 may be accessible as a central service to all notification providers and consumers. It may also be integrated seamlessly into the Fiori infrastructure (e.g., FIGS. 1-3). Therefore, the notification service 822 may be deployed to the on-premise Fiori frontend server 802 and/or as a central service on the HCP 816. The notification service 822 may serve as a central aggregation point as well as a runtime and configuration place.

The notification service 822 includes one or more notification processors 824, inbound adapters 826, notification APIs 806, callback adapters 828, outbound adapters 830, notification stores 832, templates in a template cache 834, sensitive content cache 836, cache configurations 838, and configurations 840.

The notification processor 824 is one component responsible for processing and sending notifications to the configured consumers, including the template handling and processing. As such, the notification processor 824 includes a cache handler 842 to cache notifications and a template engine 844 to generate notifications using particular templates. The notification processor 824 can read (836) sensitive data from the notification provider when needed, handle the caching (838) of sensitive data, handle the consumption lifecycle (e.g., read, snoozed, etc.), handle the action processing (e.g., approved, rejected, etc.) towards the notification provider, and handle the configuration.

Inbound adapters 826 may be configured to receive the push notifications from the notification providers and store them in the notification store. There is typically one adapter per technical communication channel (e.g., RFC, OData). The channel specific data format is transformed into the internal storage format. Depending on the backend system type, there might also be a generic component.

Notification APIs 806 may be implemented on the system 800, which provides convenient functions for the notification provisioning developer. This component may not perform any implicit commits in order not to interfere with the commit logic of the caller. The Notification API is also responsible for queuing the calls towards the notification service.

Callback adapters 828 may provide backend type specific implementations to synchronously trigger the notification related functionality in the backend (e.g., triggering actions, reading un-cached sensitive data, reading templates, etc.). The inbound and callback adapters form a logical pair for communicating with the backend systems.

Outbound adapters 830 may be responsible for transferring the push notification to the consumption channels. There may be one adapter per technical channel (e.g., mobile, email). The adapter implementation can enrich the internal notification service data with additional channel specific data (e.g., form-factor configurations from the frontend server 802 in the mobile scenario).

Notification stores 832 store notifications that were received from the notification providers. The store 832 also contains lifecycle information (e.g., read, snoozed, etc.). It contains the notification content, which can safely be sent to non-secure communication channels like email or mobile push channels.

Templates may include text templates for generating the notification message. Templates may be stored in a template cache 834. Together with the notification data from the backend, the template engine may generate the messages for the different consumption channels. The templates (i.e., the way the text is generated) originate from the backend system. During the configuration time of the connected backend system, the relevant templates are replicated from the backend and cached in the template cache. The update can be triggered manually at a later time. In some implementations, the notification service may also implement a template editor (e.g., for backend systems where no access is available). In this case, the template cache is not solely a cache but a primary persistence. An administrator may invalidate the cache.
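
A minimal sketch of template processing is shown below; the placeholder syntax {0}, {1} and the function name fillTemplate are assumptions for illustration and do not reflect a particular template engine implementation.

    // Replace numbered placeholders in a channel-specific template with
    // notification parameters.
    function fillTemplate(sTemplate, aParams) {
        return sTemplate.replace(/\{(\d+)\}/g, function (sMatch, sIndex) {
            var vValue = aParams[parseInt(sIndex, 10)];
            return vValue === undefined ? sMatch : String(vValue);
        });
    }

    // Example usage: "Leave request 4711 from Jane Doe is awaiting your approval"
    var sMessage = fillTemplate(
        "Leave request {0} from {1} is awaiting your approval",
        ["4711", "Jane Doe"]
    );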

Sensitive content cache 836 stores information that is related to the notification store content. The sensitive content is decoupled from the non-sensitive content in order for the customer to be able to handle it differently (e.g., switching off caching for certain notification types or the lifetime of cached information). The content of the cache can be encrypted according to SEC97/106. The notification provider is responsible for classifying which parts of the notification have to be treated as sensitive and non-sensitive content. Cache configurations 838, and configurations 840 contain all relevant configurations for the notification service.

In order to enable an application as a service provider, the developer implements an interface from the Notification API 806 in order to process the actions and deliver the notification types and sensitive texts. The developer may also register the implementation in the Notification API 806 for the specific provider and call a provided functionality from the Notification API 806. The notification provider also may deliver the intent for the intent based navigation on the consumption channel. Configurations done by an administrator may function to configure the landscape and the infrastructure including, but not limited to, communication channels to/from the notification provider, service end points (notification service and Notification API), authentication, and protocols (e.g., inbound adapter 826/callback adapter 828).

The notification service may provide functions and features to allow administrators and those responsible for IT to reliably operate the system. This includes, but is not limited to, monitoring for delivered and stuck notifications, logging and tracing, manual deletion of notifications, resending of notifications, and/or invalidating caches.

FIG. 9 is an example swim lane diagram 900 depicting provision of messages using a notification service described herein. The architecture 800 may carry out swim lane diagram 900. The components may include, but are not limited to, the launchpad 808, the mail/mobile 810/812, the notification service 822, the backend 804 and notification processor 824.

A new notification is created by a notification provider, processed and sent 902 to various consumption channels (from the backend server 804). The notification is received 904 by the inbound adapter (e.g., notification service 822) belonging to the technical communication channel, then parsed by the adapter and provided to the notification processor 824.

After the inbound handling, the notification processor further processes 906 the notification. The non-sensitive content and the administrative data are stored in the notification store. The sensitive content is stored in the sensitive content cache (via the cache handler). During this step, the template engine also processes the notification to create a non-sensitive version to be sent 908 to the non-secure channels (e.g., email or mobile).

If an active frontend session with the Fiori Launchpad (FLP) 808 exists for a user, the FLP is notified 910 about new notifications. This enables the FLP 808 to update the notification list, by calling back to the notification service for a notification delta. Since the FLP 808 is a secured channel, this information can contain sensitive content. Again, the template engine is used to create the notification content. In case there is no sensitive content cached on the notification service, the cache handler calls 912 the backend (via the callback adapter) to get the sensitive content.
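
The inbound flow of FIG. 9 could be summarized, purely as an illustrative sketch, by the following function; the collaborating components are passed in as an object whose member names (adapter, store, cacheHandler, templateEngine, outbound, push) are assumptions that mirror the components described above.

    // High-level sketch of steps 902-910 (parse, store, cache, render, push).
    function onInboundNotification(oRawPayload, oServices) {
        var oNotification = oServices.adapter.parse(oRawPayload);                   // inbound adapter (904)
        oServices.store.save(oNotification.publicPart);                             // notification store (906)
        oServices.cacheHandler.put(oNotification.id, oNotification.sensitivePart);  // sensitive content cache
        var sMessage = oServices.templateEngine.render(oNotification, "EMAIL");
        oServices.outbound.send(oNotification.recipients, sMessage);                // non-secure channels (908)
        oServices.push.notify(oNotification.recipients);                            // active FLP sessions (910)
    }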

FIG. 10 is an example swim lane diagram depicting action processing 1000 on notifications. So that end users are not forced to always navigate to the corresponding application, actions for processing 1002 the notification can be exposed directly in the notification center. Even though the diagram only shows the FLP as a trigger, it could be a mobile or email client as well. The processing trigger is received by the notification processor. The processor delegates the call to the backend system via the callback adapter. After the action is processed 1004 in the backend, a (delete-) delta will be pushed 1006 to the notification service if processing the action results in the notification no longer being relevant.

In order to improve the performance and avoid delays in updating the clients, a Delete on Action Flag may be introduced in the notification protocol. With that flag, the notification provider could inform 1008 the notification service that the message can be deleted directly after the action was triggered.

In order to allow the customer to control the lifecycle of the sensitive content, it may be separated from the non-sensitive content and the administration data. With the separate handling of the sensitive data in a sensitive content cache, it is possible to provide notification type specific control over the caching.

The lifecycle of a notification is primarily controlled by the notification provider. That means, for example, that if a workflow item is deleted or unassigned, the notification provider has to send a subsequent delta notification to the notification service. For handling the lifecycle of notification type information, there may be two types of changes: incompatible changes and compatible changes. Incompatible changes may include, for example, adding parameters to the notification text. In this case, already delivered notifications do not have enough information stored on the notification service to completely assemble the notification text. Compatible changes may include, for example, correcting typos in the template text.

The template engine typically uses a template which harmonizes with the runtime information of a notification (i.e., the number and type of parameters). Therefore, rather than just overwriting a notification type on the notification service, a new version may be created in case of incompatible changes. Thus, the old notifications can continue to use their prior template versions and new notifications may be mapped to the latest version.

In case of compatible changes, the information just needs to be re-fetched from the backend and the active version needs to be updated from that information.

An invalidation will be triggered by an administrator via a report. That report typically supports (i.e., automatically determines) whether an incompatible or compatible change applies. This could be checked by calling back to the backend system and comparing the notification type with the active one cached on the notification service, based on a set of predefined rules.

The end user may have additional capabilities to control the presentation and the handling of a notification in the FLP. Even though these operations could be perceived as lifecycle operations, the changes are not driven by business logic or actions (e.g., approve, reject). The effect of the operations is handled between the consumer and the notification service (e.g., snooze, stack, mark read, etc.). The status is kept on the notification level in the notification service. The availability of operations can depend on the notification type or can be specified by the notification provider via attributes in the contract.

Stacking enables the user to perform mass actions on certain notification types. The backend may identify whether a notification type is stackable. If so, the backend has to deliver the “stacked headline text” (e.g., “{x} new Leave Requests to approve”) and the mass action texts (e.g., “Approve all”) in addition to the standard action texts (e.g., “Approve”).

The stacking itself (i.e., interaction, triggering of actions) is a frontend capability (like the Fiori launchpad). The notification service does not provide stacking specific operations. That means, for example, that the mass actions will result in single batched action calls for each notification item, rather than executing a “special” stack wrapper on the notification service. The notification service has to make sure, though, that enough meta information is delivered so that the frontend can handle the stacking properly.
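
The following sketch illustrates the frontend behavior described above, in which a mass action is translated into one action call per stacked notification item; the function executeAction is a placeholder for the consumer-side action call and is not part of the described notification service.

    // Translate a mass action (e.g., "Approve all") into single batched action calls.
    function executeMassAction(aStackedNotifications, sActionKey, executeAction) {
        return Promise.all(aStackedNotifications.map(function (oNotification) {
            return executeAction(oNotification.id, sActionKey); // e.g., "APPROVE"
        }));
    }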

The end user may have the possibility to navigate to an appropriate Fiori application when clicking/tapping a notification. The navigation will be based on the intent based navigation concept, which is used across channels.

The status of a notification should be synchronized across all consumption channels of a user. If a user, for example, removes a notification from his list on the mobile device, this should be reflected in the notification center on the FLP.

The infrastructure described herein supports badges on the native mobile app as well as within the Fiori Launchpad 808 to inform the user about new notifications. The UX design also distinguishes between read/unread notifications on the one hand and new notifications on the other hand. The read state of a notification is kept with a notification instance and is set to true if one of the following events occurs: the user performed an action on a notification, the user clicked the notification and triggered an intent based navigation, or the user performed a “Mark all as read” action if supported by the respective notification center (e.g., in the launchpad or native mobile).

The new state is the basis for the badge counting. It is not tracked with a notification instance, since the user is only interested in the number of new notifications (which he has not viewed yet). The notification service keeps a separate new-counter per user instead. This counter is incremented by one, as soon as a new notification for the user arrives and is set to zero (all viewed), as soon as the notification center in the Launchpad is opened. This number is shown as a badge on the notification center icon in FLP as well as a badge on the native mobile application.

FIG. 11 is an example swim lane diagram depicting a process 1100 for incrementing a badge counter. The following sequence describes the high level logic of incrementing the badge counter. When a notification is received 1102, the new-counter for every recipient is incremented by one. If the notification is pushed to a mobile device, the current new-counter is delivered 1106 per user to the mobile infrastructure 1104. This can be used to show the badge number on the native mobile app. The online clients (like the FLP) get notified (via a Web Socket based push) about the new notification and, in addition to the delta list of notifications, they read the current new-counter from the notification service via an OData request 1108.
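
The incrementing logic may be sketched as follows; the per-user map mNewCounters and the function name are assumptions used only for illustration.

    // Per-user new-counter bookkeeping (user id -> number of not-yet-viewed notifications).
    var mNewCounters = {};

    function onNotificationReceived(oNotification) {
        oNotification.recipients.forEach(function (sUserId) {
            mNewCounters[sUserId] = (mNewCounters[sUserId] || 0) + 1;
            // The current value would then be delivered to the mobile infrastructure (1106)
            // and pushed to online clients such as the FLP (1108).
        });
    }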

In some implementations, particular rules can be established for notifications. An example rule may define when a counter is set to zero. For example, the counter may be set to zero when a user clicks the notification icon to access the notification center; when the user is in the notification center, a new notification arrives, and the user navigates back to the FLP or clicks the notification icon; when the user is inside an app, a high priority banner is shown, and the user clicks the banner (setting the counter to zero or decrementing it by one); or when the user is on a mobile device, a native notification is shown, and a tap on the native notification sets the counter to zero or decrements it by one.

Another example rule may define when a notification is displayed and viewed. For example, a notification may be considered viewed when a user clicks the notification icon to access the notification center; when the user is in the notification center and a new notification arrives; when the user is inside an app, a high priority banner is shown, and the user clicks the banner; or when the user is on a mobile device, a native notification is shown, and the user taps the native notification to navigate, which sets the item as viewed.

Another example rule may define when a notification is considered read, for example, when the user clicks the notification to navigate to the app, when the user clicks a high priority banner, and/or when the user clicks a native banner/alert, etc.

FIG. 12 is an example swim lane diagram depicting a process 1200 for resetting a badge counter. Resetting the badge counter can include a resetCounter function being called 1202 on the notification service after the user opens 1204 the notification center (either on mobile or desktop). This may reset the new-counter for the user triggering the request. The information to delete the badge on the native app is sent 1206 to the mobile device as well. In addition, a notification that the new-counter changed is pushed 1208 to all online clients (like the FLP, via a WebSocket based push). This triggers re-reading of the new-counter from the notification service via a get request 1210. By this logic, the reset of the badge is synced to all online clients.
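
A corresponding reset sketch is shown below; mNewCounters refers to the per-user map from the preceding sketch, and oChannels bundles hypothetical channel adapters whose method names are assumptions.

    // Reset the new-counter for the requesting user and propagate the change (1202-1210).
    function resetCounter(sUserId, mNewCounters, oChannels) {
        mNewCounters[sUserId] = 0;                          // all notifications viewed
        oChannels.mobile.clearBadge(sUserId);               // remove the badge on the native app (1206)
        oChannels.webSocket.notifyCounterChanged(sUserId);  // online clients re-read via OData (1208, 1210)
    }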

The end-to-end notification experience may leverage the native notification infrastructure (e.g., APNS on iOS). For enabling notifications on mobile channels and abstracting the specifics of the different mobile platforms, the notification service uses SAP's mobile platform (on premise or in the cloud). In order to overcome the APNS “certificate issue”, the HCPms Push Hub will be used (see [HCPmsPushHub]). In case users request a non-cloud based solution for routing notifications to their own custom-specific packaged apps, one could also imagine that the SMP itself provides a solution for dealing with APNS and custom certificates in the future.

The SMP may be enabled to deliver a notification identified by the user name and intent based navigation target to the end user. The notification may be delivered if the target application is installed on the user's device. In general, a notification delivered to a mobile device is technically bound to a native mobile app on that device.

Since the notification pushed to a mobile device is usually processed by servers outside of an SAP network, the mobile push channel is considered not secure. Therefore, the push notification can only contain non-sensitive content. Instead, the sensitive content can be actively pulled from the notification service as soon as the user accesses the notification in the native notification center, if the platform supports it.

The integration of the backend with the notification service, as well as the integration of the frontend with the notification service should happen on the data level through services. That means, the notification provider does not provide any sort of UI snippets to be plugged into the notification center (FLP or native mobile). If further user interactions with the notification are requested, the user can navigate from the notification center to the registered Fiori app.

The Provider Interface covers the communication between a notification provider and the notification service in both directions: on the one hand, it allows the notification provider to push new notifications to the notification service; on the other hand, it ensures that the notification service can call back to the notification provider to get cross-notification-instance specific content (e.g., notification type specific templates, mass texts for stacked notifications) and to trigger the execution of actions (e.g., approve, reject).

The interface towards the consumer may provide pre-aggregated and processed notifications (e.g., language dependent, based on consumption channel specific templates). The interface may also contain links to operations and actions for the consumer to call back upon user interaction and should be prepared to support Web Socket based updates.

The consumer API may be consumed by the Fiori Launchpad, by mobile apps, and in the future potentially by non Fiori apps, email plugins, or by consumers provided by customers and partners. Therefore OData may be used as the protocol for the provisioning.

FIG. 13 is an example of a notification list view 1300. As shown, the notification list view 1300 can be arranged and/or organized by date 1302, by notification type 1304, or by priority 1306. Here, the by date 1302 option is selected. The notifications 1308, 1310, 1312, and 1314 in the list 1300 are organized in date order. Each notification pertains to a different application available and provided within a computing device. In general, a list of notifications is associated with a user (or a user role).

In the example notification 1308, a travel request notification is awaiting approval by the user associated with the notification 1308. The user can select approve 1316 or decline 1318. Other actions are possible. For example, an action to forward to another user may be provided. An action to make a change request to the timing of the travel request may also be provided.

Items can be added to the list 1300 based on internal events within apps or systems. The list can display various types of notifications, which can all differ in terms of appearance, content, and functionality. In some implementations, the list is ordered by a timestamp (e.g., most recent notification first). The time stamp for a notification group may be based on the most recent notification within the group.

In some implementations, the notifications can include a priority indicator, a tap/click option to navigate to a particular relevant app or source associated with the notification, a read/unread status, an actionable notification item (e.g., perform tasks directly from the notification), scrolling, and subscription-based notification alerts (e.g., to KPIs or following a business object).

In some implementations, the notification list 1300 can be displayed and arranged by type 1304. When the type is chosen, items in the list may appear as grouped notifications. Bulk actions can be hidden by an administrator for critical approvals.

In some implementations, on initial display, all items may be collapsed. The priority indicator may be presented at the notification group level. The highest priority of a single item defines group priority. In some implementations, a single item can appear as well when only one notification is in the list.

In some implementations, a priority 1306 sorting may be selected. The priority sorting may sort by reverse chronological order of when the message is received. Notifications may include attributes including, but not limited to, title, description, image/icon, author/source, timestamp, object status/priority, read/unread, action, operation, and a trigger to expand/contract/truncate the notification.
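
Sorting of the list 1300 by date or by priority could be sketched as follows; the attribute names timestamp and priority and the priority values are assumptions for this illustration.

    // Most recent notification first (reverse chronological order).
    function sortByDate(aNotifications) {
        return aNotifications.slice().sort(function (a, b) {
            return b.timestamp - a.timestamp;
        });
    }

    // Higher priority first; equal priorities fall back to reverse chronological order.
    function sortByPriority(aNotifications) {
        var mRank = { high: 0, medium: 1, low: 2 };
        return aNotifications.slice().sort(function (a, b) {
            return (mRank[a.priority] - mRank[b.priority]) || (b.timestamp - a.timestamp);
        });
    }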

FIG. 14 is an illustration of an example process 1400 for generating and displaying notifications, such as notification 1300, for example. The notifications may be generated and displayed within any number of applications. At block 1402, the process 1400 may include detecting, with a processor, an availability of at least one notification available for display in the user interface. At block 1404, the process 1400 may include generating, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action.

At block 1406, the process 1400 may include generating, with the processor and for the container, additional selectable actions and appending the additional selectable actions to the at least one selectable action. The additional selectable actions may be generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface. Example actions may include accept, reject, move, archive, respond, mark as low or high importance, elevate or de-elevate status, update document, etc.

At block 1408, the process 1400 may include determining, with the processor, which display device type of a plurality of display device types in which the user interface is being accessed. For example, a tablet, a laptop, a mobile phone, a desktop or other computing device may be configured to display particular applications and notifications according to the display size, type, and/or display features.

At block 1410, the process 1400 may include generating, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions. The container may be arranged for display according to the determined display device type.
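
For illustration only, the blocks of process 1400 could be condensed into the following sketch; the helper functions buildAdditionalActions, detectDeviceType, and layoutForDevice are hypothetical placeholders and not part of the described system.

    // Condensed sketch of blocks 1402-1410.
    function renderNotificationContainer(oNotification, oUser, oHelpers) {
        // Blocks 1402/1404: wrap the detected notification and its default action in a container.
        var oContainer = {
            notification: oNotification,
            actions: [oNotification.defaultAction]
        };
        // Block 1406: append additional actions derived from the notification/user context.
        var aAdditional = oHelpers.buildAdditionalActions(oNotification.context, oUser);
        oContainer.actions = oContainer.actions.concat(aAdditional);
        // Blocks 1408/1410: arrange the container for the detected display device type.
        var sDeviceType = oHelpers.detectDeviceType();  // e.g., "phone", "tablet", "desktop"
        return oHelpers.layoutForDevice(oContainer, sDeviceType);
    }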

In some implementations, the process 1400 also includes displaying, in the user interface in a display location, the container, the display location determined based on the determined display device and a role associated with the at least one user. For example, particular containers holding notifications may be displayed in one area of a device based on device size and based on the user role being associated with performing particular actions associated with similar notifications. In some implementations, the process 1400 also includes the display location being predefined for the context determined to be associated with the at least one notification.

In some implementations, the process 1400 also includes, in response to detecting one or more additional notifications, generating a container for each of the one or more additional notifications, and generating, for display in the user interface, the container for each of the one or more additional notifications. Each container may depict a plurality of selectable actions. Each container may be arranged for display according to a determined display device providing the user interface.

In some implementations, the process 1400 also includes the one or more additional notifications being provided from a plurality of source applications associated with at least one user accessing the user interface. The additional selectable actions may provide access to a plurality of applications hosted outside of the user interface.

In some implementations, the process 1400 also includes merging each notification received in the user interface into a list, displaying the list in a viewport and generating a plurality of actions that enable at least one bulk operation for the notifications in the list. In some implementations, the process 1400 also includes having the additional actions implemented upon selection within a respective notification.

In some implementations, the process 1400 also includes having each display device type be associated with a different set of notification rules. The display device type may include a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device.

The various systems and techniques described herein may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The various techniques may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine readable non-transitory storage device, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. A computer program, such as the computer program(s) described above, can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.

Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer also may include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of nonvolatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.

Implementations may be implemented in a computing system that includes a backend component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a frontend component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation, or any combination of such backend, middleware, or frontend components. Components may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet.

Claims

1. A computer-implemented method for generating notifications in a user interface, the method comprising:

detecting, with a processor, an availability of at least one notification available for display in the user interface;
generating, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action;
generating, with the processor and for the container, additional selectable actions and appending the additional selectable actions to the at least one selectable action, the additional selectable actions being generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface;
determining, with the processor, a display device type in which the user interface is being accessed; and
generating, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions, the container being arranged for display according to the display device type.

2. The computer-implemented method of claim 1, further comprising, displaying, in the user interface in a display location, the container, the display location determined based on the display device and a role associated with the at least one user.

3. The computer-implemented method of claim 2, wherein the display location is predefined for the context determined to be associated with the at least one notification.

4. The computer-implemented method of claim 1, further comprising:

in response to detecting one or more additional notifications, generating a container for each of the one or more additional notifications; and
generating, for display in the user interface, the container for each of the one or more additional notifications, each container depicting a plurality of selectable actions, wherein each container is arranged for display according to a display device providing the user interface.

5. The computer-implemented method of claim 4, wherein the one or more additional notifications are provided from a plurality of source applications associated with at least one user accessing the user interface, the additional selectable actions providing access to a plurality of applications hosted outside of the user interface.

6. The computer-implemented method of claim 1, further comprising:

merging each notification received in the user interface into a list;
displaying the list in a viewport; and
generating a plurality of actions that enable at least one bulk operation for the notifications in the list.

7. The computer-implemented method of claim 1, wherein the additional actions are implemented upon selection within a respective notification.

8. The computer-implemented method of claim 1, wherein each display device type is associated with a different set of notification rules, the display device type including a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device.

9. A system for generating a user interface, the system comprising:

a shell container, executing in a web browser and providing a plurality of services for generating notifications in a user interface;
an application container, executing in the web browser; and
at least one processor programmed to,
obtain at least one notification,
provide, for display in a display device, the user interface depicting the at least one notification,
detect, with a processor, an availability of at least one notification available for display in the user interface,
generate, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action,
generate, with the processor and for the container, additional selectable actions and append the additional selectable actions to the at least one selectable action, the additional selectable actions being generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface,
determine, with the processor, a display device type in which the user interface is being accessed, and
generate, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions, the container being arranged for display according to the display device type.

10. The system of claim 9, wherein the at least one processor is further programmed to display, in the user interface in a display location, the container, the display location determined based on the display device and a role associated with the at least one user.

11. The system of claim 9, wherein the at least one processor is further programmed to:

in response to detecting one or more additional notifications, generate a container for each of the one or more additional notifications; and
generate, for display in the user interface, the container for each of the one or more additional notifications, each container depicting a plurality of selectable actions, wherein each container is arranged for display according to a display device providing the user interface.

12. The system of claim 9, wherein the at least one processor is further programmed to:

merge each notification received in the user interface into a list;
display the list in a viewport; and
generate a plurality of actions that enable at least one bulk operation for the notifications in the list.

13. The system of claim 9, wherein the additional actions are implemented upon selection within a respective notification.

14. The system of claim 9, wherein each display device type is associated with a different set of notification rules, the display device type including a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device.

15. A computer program product for generating a plurality of notifications for display in a user interface, the computer program product being tangibly embodied on a non-transitory computer-readable storage medium and comprising instructions that, when executed by at least one computing device, are configured to cause the at least one computing device to:

obtain at least one notification;
provide, for display in a display device, the user interface depicting the at least one notification;
detect, with a processor, an availability of at least one notification available for display in the user interface;
generate, with the processor, a container for the at least one notification, the container being adapted to include the at least one notification and at least one selectable action;
generate, with the processor and for the container, additional selectable actions and append the additional selectable actions to the at least one selectable action, the additional selectable actions being generated based at least in part on a context determined to be associated with the at least one notification and at least one user accessing the user interface;
determine, with the processor, a display device type in which the user interface is being accessed; and
generate, for display in the user interface, the container depicting the at least one selectable action and the additional selectable actions, the container being arranged for display according to the display device type.

16. The computer program product of claim 15, wherein the instructions, when executed by the at least one computing device, are configured to cause the at least one computing device to:

display, in the user interface in a display location, the container, the display location determined based on the display device and a role associated with the at least one user.

17. The computer program product of claim 15, wherein the instructions, when executed by the at least one computing device, are configured to cause the at least one computing device to:

in response to detecting one or more additional notifications, generate a container for each of the one or more additional notifications; and
generate, for display in the user interface, the container for each of the one or more additional notifications, each container depicting a plurality of selectable actions, wherein each container is arranged for display according to a display device providing the user interface.

18. The computer program product of claim 15, wherein the instructions, when executed by the at least one computing device, are configured to cause the at least one computing device to:

merge each notification received in the user interface into a list;
display the list in a viewport; and
generate a plurality of actions that enable at least one bulk operation for the notifications in the list.

19. The computer program product of claim 15, wherein the additional actions are implemented upon selection within a respective notification.

20. The computer program product of claim 15, wherein each display device type is associated with a different set of notification rules, the display device type including a display on any one of a mobile phone device, a tablet device, a laptop device, and a desktop device.

Patent History
Publication number: 20170329614
Type: Application
Filed: May 10, 2017
Publication Date: Nov 16, 2017
Inventors: Jamila Schon (Heidelberg), Marc Arno Ziegler (Mauer), Kai Richter (Muehltal), Florian Jann (Heidelberg), Michael Krenkler (Wiesloch)
Application Number: 15/591,995
Classifications
International Classification: G06F 9/44 (20060101); G06F 3/0481 (20130101); G06F 3/0482 (20130101); G06F 17/21 (20060101);