INTEGRATED CLOUD SYSTEM FOR PREMISES AUTOMATION

A system comprises premises equipment including premises devices located at a premises. The system includes a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. The system includes a system server configured to interact with the premises devices. The system server is configured to interact with the partner device via a partner proxy corresponding to the partner device. The system includes automation rules coupled to the system server. The automation rules include actions and triggers for controlling interactions between at least one of the partner device and the premises devices. The system includes a user interface coupled to the system server and configured to interact with the premises devices and the partner device.

Description
RELATED APPLICATIONS

This application claims the benefit of U.S. Patent Application No. 62/186,696, filed Jun. 30, 2015.

This application claims the benefit of U.S. Patent Application No. 62/186,825, filed Jun. 30, 2015.

This application claims the benefit of U.S. Patent Application No. 62/186,857, filed Jun. 30, 2015.

This application is a divisional of U.S. patent application Ser. No. 15/196,281, filed Jun. 29, 2016.

This application is a continuation in part application of U.S. patent application Ser. No. 12/189,780, filed Aug. 11, 2008.

This application is a continuation in part application of U.S. patent application Ser. No. 13/531,757, filed Jun. 25, 2012.

This application is a continuation in part application of U.S. patent application Ser. No. 12/197,958, filed Aug. 25, 2008.

This application is a continuation in part application of U.S. patent application Ser. No. 13/334,998, filed Dec. 22, 2011.

This application is a continuation in part application of U.S. patent application Ser. No. 12/539,537, filed Aug. 11, 2009.

This application is a continuation in part application of U.S. patent application Ser. No. 14/645,808, filed Mar. 12, 2015.

This application is a continuation in part application of U.S. patent application Ser. No. 13/104,932, filed May 10, 2011.

This application is a continuation in part application of U.S. patent application Ser. No. 13/929,568, filed Jun. 27, 2013.

This application is a continuation in part application of U.S. patent application Ser. No. 14/628,651, filed Feb. 23, 2015.

This application is a continuation in part application of U.S. patent application Ser. No. 13/718,851, filed Dec. 18, 2012.

This application is a continuation in part application of U.S. patent application Ser. No. 12/972,740, filed Dec. 20, 2010.

This application is a continuation in part application of U.S. patent application Ser. No. 13/954,553, filed Jul. 30, 2013.

This application is a continuation in part application of U.S. patent application Ser. No. 14/943,162, filed Nov. 17, 2015.

This application is a continuation in part application of U.S. patent application Ser. No. 15/177,915, filed Jun. 9, 2016.

TECHNICAL FIELD

The embodiments described herein relate generally to networking and, more particularly, to premises automation systems and methods.

BACKGROUND

There is a need for systems and methods that integrate cloud services and internet-connected devices with a user interface and other components and functions of a service provider system. This integration would enable third party and/or other connected devices (e.g., smart door bells, door locks, garage door operators, cameras, thermostats, lighting systems, lighting devices, etc.), and third party services to control or trigger automations in the service provider system using components and functions of the service provider system. This would enable end-users to integrate and use their previously-standalone internet-connected devices with each other as well as with their service provider-based service.

INCORPORATION BY REFERENCE

Each patent, patent application, and/or publication mentioned in this specification is herein incorporated by reference in its entirety to the same extent as if each individual patent, patent application, and/or publication was specifically and individually indicated to be incorporated by reference.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of the Integrated Cloud System (ICS) or platform, under an embodiment.

FIG. 2 is a flow diagram for Service Association, under an embodiment.

FIG. 3 is a flow diagram for Service Disassociation, under an embodiment.

FIG. 4 is a flow diagram for Card UI Interactions, under an embodiment.

FIG. 5 is an example rules interface for controlling triggers and actions involving third party devices integrated in the ICS, under an embodiment.

FIG. 6 is another example of an actions portion of a rules interface for integrated third party devices, under an embodiment.

FIG. 7 is an example of a triggers portion of a rules interface for third party services integrated with the ICS, under an embodiment.

FIG. 8 is an example touchscreen display including numerous On states, under an embodiment.

FIG. 9 is an example touchscreen display during arming, under an embodiment.

FIG. 10 is an example touchscreen display including numerous Off states, under an embodiment.

FIG. 11 is an example touchscreen display of a rules list, under an embodiment.

FIG. 12 is an example touchscreen display in response to selection of the “Add Rule” icon, under an embodiment.

FIG. 13 is an example touchscreen displayed upon selection of the “Weather Event” icon, including a list of weather events, under an embodiment.

FIG. 14 is an example touchscreen displayed upon selection of the “Reports a temperature” icon, including selections for activating low and high temperature selections, under an embodiment.

FIG. 15 is an example touchscreen display for selecting a temperature limit (lower) for “Reports a temperature” (“choose low”), under an embodiment.

FIG. 16 is an example touchscreen display following selection of a temperature limit (lower) for “Reports a temperature”, under an embodiment.

FIG. 17 is an example touchscreen display for selecting a temperature limit (upper) for “Reports a temperature”, under an embodiment.

FIG. 18 is an example touchscreen display following selection of a temperature limit (upper) for “Reports a temperature”, under an embodiment.

FIG. 19 is an example touchscreen display for filtering the temperature reporting rule based on time or day, under an embodiment.

FIG. 20 is an example touchscreen display for selecting a time after selecting “any day” as a filtering parameter for the temperature reporting rule, under an embodiment.

FIG. 21 is an example touchscreen display for selecting system state as an event filter for the temperature reporting rule, under an embodiment.

FIG. 22 is an example touchscreen display for which two arming types are selected for system state as an event filter for the temperature reporting rule, under an embodiment.

FIG. 23 is an example touchscreen display presenting available actions for the temperature reporting rule, under an embodiment.

FIG. 24 is an example touchscreen display for selecting a type of device to control in response to choosing an action to control a device according to a temperature reporting rule, under an embodiment.

FIG. 25 is an example touchscreen display for displaying a list of device types corresponding to the selected device type to be controlled under the temperature reporting rule, under an embodiment.

FIG. 26 is an example touchscreen display showing selection of a particular ceiling fan device (“Patio”) under the temperature reporting rule, under an embodiment.

FIG. 27 is an example touchscreen display showing available actions of the selected device for control under the temperature reporting rule, under an embodiment.

FIG. 28 is an example touchscreen display showing options for creating a compound rule with additional actions, under an embodiment.

FIG. 29 is an example touchscreen display for saving a rule, under an embodiment.

FIG. 30 is an example touchscreen display of a rules list following creation of a new rule, under an embodiment.

FIG. 31 is an example touchscreen display of a rules list, under an embodiment.

FIG. 32 is an example touchscreen display in response to selection of the “Add Rule” icon, under an embodiment.

FIG. 33 is an example touchscreen displayed upon selection of the “Irrigation” icon, including a list of irrigation events, under an embodiment.

FIG. 34 is an example touchscreen displayed upon selection of the “Switches on” icon, including selections for a day for the switching event, under an embodiment.

FIG. 35 is an example touchscreen display for selecting an “on” time after selecting “any day” as a filtering parameter for the switching event, under an embodiment.

FIG. 36 is an example touchscreen display for selecting a time of day as a start time, under an embodiment.

FIG. 37 is an example touchscreen display following selection of a start time for the switching event rule, under an embodiment.

FIG. 38 is an example touchscreen display for selecting a time of day as an end time, under an embodiment.

FIG. 39 is an example touchscreen display following selection of a start time and an end time for the switching event rule, under an embodiment.

FIG. 40 is an example touchscreen display for selecting system state as an event filter for the switching event rule, under an embodiment.

FIG. 41 is an example touchscreen display presenting available actions for the switching event rule, under an embodiment.

FIG. 42 is an example touchscreen display for selecting a type of device to control in response to choosing an action for the switching event rule, under an embodiment.

FIG. 43 is an example touchscreen display for displaying a list of device types corresponding to the selected device type (“lights”) to be controlled under the switching event rule, under an embodiment.

FIG. 44 is an example touchscreen display showing selection of particular lighting devices (“Porch” and “Living Room”) under the switching event rule, under an embodiment.

FIG. 45 is an example touchscreen display showing available actions of the selected device (“Porch light”) for control under the switching event rule, under an embodiment.

FIG. 46 is an example touchscreen display showing available actions of another selected device (“Living Room light”) for control under the switching event rule, under an embodiment.

FIG. 47 is an example touchscreen display showing options for creating a compound rule with additional actions under the switching event rule, under an embodiment.

FIG. 48 is an example touchscreen display for selecting a type of device to control in response to choosing an additional control device for a compound switching event rule, under an embodiment.

FIG. 49 is an example touchscreen display for displaying a list of device types corresponding to the selected device type (“shades”) to be controlled under the compound switching event rule, under an embodiment.

FIG. 50 is an example touchscreen display showing available actions of the selected device (“Living room shades”) for control under the compound switching event rule, under an embodiment.

FIG. 51 is an example touchscreen display for saving a rule, under an embodiment.

FIG. 52 is an example touchscreen display of a rules list following creation of the new switching event rule, under an embodiment.

FIG. 53 is a flow diagram for local card development and unit testing, under an embodiment.

FIG. 54 is a flow diagram for card integration testing, under an embodiment.

FIG. 55 is a flow diagram for card production, under an embodiment.

FIG. 56 is an example card (e.g., thermostat, etc.) operating on a smart phone, under an embodiment.

FIG. 57 is an example small card (e.g., thermostat, etc.), under an embodiment.

FIG. 58 is an example card menu, under an embodiment.

FIG. 59 is a block diagram of the integrated security system, under an embodiment.

FIG. 60 is a block diagram of components of the integrated security system, under an embodiment.

FIG. 61 is a block diagram of the gateway software or applications, under an embodiment.

FIG. 62 is a block diagram of the gateway components, under an embodiment.

FIG. 63 is a block diagram of IP device integration with a premise network, under an embodiment.

FIG. 64 is a block diagram of IP device integration with a premise network, under an alternative embodiment.

FIG. 65 is a block diagram of a touchscreen, under an embodiment.

FIG. 66 is an example screenshot of a networked security touchscreen, under an embodiment.

FIG. 67 is a block diagram of network or premise device integration with a premise network, under an embodiment.

FIG. 68 is a block diagram of network or premise device integration with a premise network, under an alternative embodiment.

FIG. 69 is a flow diagram for a method of forming a security network including integrated security system components, under an embodiment.

FIG. 70 is a flow diagram for a method of forming a security network including integrated security system components and network devices, under an embodiment.

FIG. 71 is a flow diagram for installation of an IP device into a private network environment, under an embodiment.

FIG. 72 is a block diagram showing communications among IP devices of the private network environment, under an embodiment.

FIG. 73 is a flow diagram of a method of integrating an external control and management application system with an existing security system, under an embodiment.

FIG. 74 is a block diagram of an integrated security system wirelessly interfacing to proprietary security systems, under an embodiment.

FIG. 75 is a flow diagram for wirelessly ‘learning’ the gateway into an existing security system and discovering extant sensors, under an embodiment.

FIG. 76 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel wirelessly coupled to a gateway, under an embodiment.

FIG. 77 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel wirelessly coupled to a gateway, and a touchscreen, under an alternative embodiment.

FIG. 78 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel connected to a gateway via an Ethernet coupling, under another alternative embodiment.

FIG. 79 is a flow diagram for automatic takeover of a security system, under an embodiment.

FIG. 80 is a flow diagram for automatic takeover of a security system, under an alternative embodiment.

FIG. 81 is a general flow diagram for IP video control, under an embodiment.

FIG. 82 is a block diagram showing camera tunneling, under an embodiment.

DETAILED DESCRIPTION

Systems and methods comprise premises equipment including premises devices located at a premises. A partner device is located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. A system server is configured to interact with the premises devices and the partner device. A user interface is coupled to the system server and configured to interact with the premises devices. The user interface includes a partner user interface corresponding to the partner device. The partner user interface configures the user interface to interact with the partner device. The user interface is configured to control interactions between the premises devices and the partner device.

Embodiments also include systems and methods comprising premises equipment including premises devices located at a premises. The system includes a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. The system includes a system server configured to interact with the premises devices. The system server is configured to interact with the partner device via a partner proxy corresponding to the partner device. The system includes automation rules coupled to the system server. The automation rules include actions and triggers for controlling interactions between at least one of the partner device and the premises devices. The system includes a user interface coupled to the system server and configured to interact with the premises devices and the partner device.

FIG. 1 is a block diagram of the integrated cloud system or platform, under an embodiment. The integrated cloud system (ICS) of an embodiment comprises cloud-based components that include a Cloud Integration Service/Server (CIS) coupled to a system server (e.g., “Icontrol Server”, also referred to herein as the service provider server) via an internal event bus. The CIS, system server, and event bus are implemented by the service provider in data centers of the service provider's customers, but are not so limited.

The system server is coupled to customer-premises equipment (CPE) at corresponding subscriber premises of numerous subscribers. The CPE includes one or more of security panels, security systems, gateways, hubs, touchscreens, and Wi-Fi access points that operate as a gateway to the system servers and ICS. The CPE is described in detail in the Related Applications incorporated by reference herein.

The CIS is coupled to a partner's production server (“partner server”) via a Cloud Integration Adapter (CIA). The partner server interacts with the partner's products/services that users wish to integrate into the ICS platform. The Cloud Integration Adapter provides the system server and CIS with REST endpoints to call for checking the health of the adapter, associating with adapter cloud devices, and processing events coming from the CIS. Furthermore, the Cloud Integration Adapter is responsible for sending events to the CIS as acknowledgement of incoming system events, and serves as an endpoint through which Adapter-managed cloud device events are reported into the system servers.
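By way of illustration only, the following is a minimal Python sketch of a partner-side Cloud Integration Adapter exposing the health check and event callback endpoints described above, assuming the Flask web framework. The endpoint paths, port, and translation logic are hypothetical placeholders; the actual Health Check and Event Callback URLs are whatever the partner registers during onboarding.

from flask import Flask, request

app = Flask(__name__)

@app.route("/healthCheck", methods=["GET"])
def health_check():
    # The CIS polls this URL; an HTTP 200 means the adapter is available.
    return "", 200

@app.route("/eventCallback", methods=["POST"])
def event_callback():
    # Receives action and remove events from the CIS in IcEvent JSON format
    # and translates them into partner-proprietary calls (placeholder).
    ic_event = request.get_json(force=True)
    return "", 200

if __name__ == "__main__":
    app.run(port=8443)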

The ICS of an embodiment effects integration of cloud services and internet-connected devices with the user interface, Rules Engine and other components and functions of the service provider system. This integration enables third party and/or other connected devices (e.g., smart door bells (e.g., Doorbot, etc.), door locks, garage door operators (e.g., Chamberlain, etc.), cameras (e.g., Dropcam, etc.), thermostats (e.g., Nest, etc.), lighting systems (e.g., Philips Hue, etc.), lighting devices, lawn irrigation systems (e.g., Rachio, etc.), plant sensors, pet feeders, weather stations, rain sensors, pool controls, air quality sensors, music systems, remote controllers, internet user interfaces, connected systems, connected vehicles, etc.), and third party services (e.g., weather forecasting services and applications (e.g., Accuweather, etc.), family networking services and applications, partner or third party services, MSO digital assets such as voicemail, etc.), to control or trigger automations in the service provider system using the user interface, Rules Engine and other components and functions of the service provider system. This enables end-users to integrate and use their previously-standalone internet-connected devices with each other as well as with their service provider-based service.

The ICS of an embodiment as described in detail herein includes one or more components of the “integrated security system” described in detail in the Related Applications, which are incorporated by reference herein. An example of the “integrated security system” is available as one or more of the numerous systems or platforms available from iControl Networks, Inc., Redwood City, Calif. The ICS of an embodiment incorporates one or more components of the “integrated security system”. The ICS of an embodiment is coupled to one or more components of the “integrated security system”. The ICS of an embodiment integrates with one or more components of the “integrated security system”.

The system server includes or hosts a partner proxy and an integration REST application programming interface (API). The integration REST API is coupled to the CIS. The partner proxy is coupled to a corresponding partner server, and is also coupled to a Card UI (“REST Client”). The partner proxy is configured to proxy API calls from the Partner's Card UI (REST client) to the Partner Server, appending the appropriate OAuth token for a given user. This allows all client UIs to be enabled after a single OAuth pairing is completed (i.e., if one user authorizes the Partner's product, all users and clients on the same account will have it auto-enabled and populated). This also improves security by not storing the user's Partner server credentials in the client UI. The Card UI of an example embodiment is an HTML5-based user interface card, developed by the Partner or the service provider, that is embedded into the service provider user interface (e.g., mobile app, web portal).

The ICS of an embodiment includes Cloud Actions and Triggers (CAT), which enable third party connected devices and services to trigger automations in the service provider system, thereby enabling end-users to integrate and use their previously-standalone internet connected devices with their service provider-based service.

Devices and services that are hosted outside of the automation platform or network are referred to as ‘cloud objects’ and provide numerous use cases when integrated with the system of an embodiment. The description that follows includes details of aspects of the system including but not limited to server infrastructure required to support external cloud objects, data format definitions for actions and triggers across the event bus, the process of onboarding external cloud objects, integration of cloud objects with the CPE rules engine, common OAuth2 Support for Cloud Services, and card UI/SDK Support for Cloud Objects.

The CAT of an embodiment integrates partner services into the ICS platform, including support for rules on the CPE and partner-specific user interfaces based on the Card UI. The system of an embodiment includes a web API for the CIS for which partners develop Integration Adapters (also referred to as “adapters”) responsible for the translation of service provider events and operations into partner-proprietary calls. Partners also develop Cards with the Card SDK in order to get branded, partner-specific user interfaces. The partners of an embodiment host their Integration Adapters in their own environments; however, in an alternative embodiment the adapters are hosted by the ICS described herein.

While the rules engine of an embodiment is included and running on CPE, the embodiments herein are not so limited. In an alternative embodiment the rules engine is included and running on a system server or other component of the ICS platform.

In another alternative embodiment the rules engine is distributed between the CPE and the ICS platform so that a set of rules is included and running on the CPE while another set of rules is included and running on the ICS platform. For example, rules controlling actions and triggers limited to local devices in the premises, and not using any data or information from a device or service outside the premises, are included and running on the CPE. Conversely, rules controlling actions and triggers involving device(s) in the premises and also involving device(s) or service(s) outside the premises are included and running on the ICS platform.
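As an illustration only, the following Python sketch shows one way such a split could be decided. The Rule structure, the cloud-object naming convention, and the provider prefixes are assumptions introduced here for the example and are not defined by the embodiments above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Rule:
    rule_id: str
    # IDs referenced by the rule's triggers and actions, e.g. a local device
    # instance or a cloud object such as "rachio.1" (hypothetical format).
    referenced_ids: List[str] = field(default_factory=list)

def is_cloud_object(object_id: str) -> bool:
    # Assumption: cloud objects are identified by a partner provider prefix.
    return object_id.lower().startswith(("rachio", "nest", "accuweather"))

def rule_target(rule: Rule) -> str:
    """Return 'cpe' for local-only rules, 'ics' when any cloud object is involved."""
    return "ics" if any(is_cloud_object(i) for i in rule.referenced_ids) else "cpe"

# A local zone-to-light rule stays on the CPE; a weather-triggered rule runs
# on the ICS platform.
print(rule_target(Rule("r1", ["zone.18", "light.3781220513309696"])))      # cpe
print(rule_target(Rule("r2", ["accuweather", "light.3781220513309696"])))  # ics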

The CAT includes but is not limited to use cases comprising Service Association, Cloud Object Creation, Service Disassociation, Cloud Object Synchronization, Card UI Interactions, Rule Authoring, and Rule Execution. Each of the use cases is described in detail herein.

Upon startup, the Partner's Cloud Integration Adapter uses username, password and partnerKey to authenticate with the CIS. The username, password and partnerKey are provided by the service provider. The Partner's Event Callback URL and Health Check URL are defined initially as part of the partner onboarding process. The CIS provides two endpoints that allow the partner to optionally update these URLs at runtime.

The Register Event Callback URI allows the partner to update the Event Callback URL at runtime.

Endpoint: /cloudIntegration/[partnerName]/eventCallback/registerEventCallback?partnerUrl=[partnerUrl]
Description: Update the eventCallback URL for a partner
Method: POST
Header: x-login - username of the integration user
        x-password - password of the integration user
        x-partnerKey - unique key issued by Service provider to the partner
URL parameters: partnerName: The unique name of the partner provided by Service provider
                partnerUrl: The updated Event Callback URL
Result: HTTP response 200 if successful

An example payload includes but is not limited to the following:

curl -k -L -v -H “X-login: <username>” -H “X-password: test” -H “x-partnerKey: key” -X POST
“https://<server>/cloudIntegration/icontrol/cloudIntegrations/rachio/eventCallback/registerEventCallback?partnerUrl=https://rachioAdapter/updatedEventCallbackUrl”

The Register Health Check Callback URI allows the partner to update the Health Check URL at runtime.

Endpoint: /cloudIntegration/[partnerName]/healthCheckCallback/registerHealthCheckCallback?partnerHealthCheckUrl=[partnerHealthCheckUrl]
Description: Update the healthCheckCallback URL for a partner
Method: POST
Header: x-login - username of the integration user
        x-password - password of the integration user
        x-partnerKey - A unique key issued by Service provider to the partner
URL parameters: partnerName: The unique name of the partner
                partnerHealthCheckUrl: The updated health check URL
Result: HTTP response 200 if successful

An example payload includes but is not limited to the following:

curl -k -L -v -H “Content-Type: text/xml” -H “X-login: <username>” -H “X-password: test” -H “x-partnerKey: key” -X POST
https://<server>/cloudIntegration/icontrol/cloudIntegrations/rachio/healthCheckCallback/registerHealthCheckCallback?partnerHealthCheckUrl=https://rachioAdapter/updatedHealthcheckcallback.

For both the Event Callback Registration and Health Check Callback Registration, the CIS responds with an HTTP 200 if the POST is accepted. An appropriate HTTP error code is returned for error conditions.

The Health Check Callback service implemented by the Partner supports HTTP GET operations, and responds with HTTP 200 to indicate all systems are functioning properly. Any other response will be considered an indication that the adapter is not available. The CIS of an embodiment periodically checks availability of the Integration Adapter, and the periodicity is configurable.
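As a non-limiting illustration of this health check behavior, the following Python sketch shows a CIS-side periodic availability check using the requests library. The partner Health Check URL and the poll period are placeholders (the periodicity is configurable, as noted above).

import time
import requests

HEALTH_CHECK_URLS = {
    "rachio": "https://rachioAdapter/updatedHealthcheckcallback",  # registered by the partner
}
POLL_PERIOD_SECONDS = 60  # configurable

def adapter_available(url: str) -> bool:
    # HTTP 200 indicates the adapter is functioning properly; any other
    # response (or a network error) is treated as "not available".
    try:
        return requests.get(url, timeout=10).status_code == 200
    except requests.RequestException:
        return False

if __name__ == "__main__":
    while True:
        for partner, url in HEALTH_CHECK_URLS.items():
            if not adapter_available(url):
                print(f"Integration Adapter for {partner} is not available")
        time.sleep(POLL_PERIOD_SECONDS)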

The cloud integration user lifecycle of an embodiment embodies the core user experiences from a technical viewpoint (i.e., technical use cases). The following user lifecycle use cases are described in detail herein: Service Association (User Onboarding); Updating new user product(s)/service(s) on the Partner's server; Product/Service status updates; Controlling user's product(s)/service(s) from the service provider platform; User Offboarding of one or more product(s)/service(s).

Service Association (User Onboarding) is initiated by the user via a service provider user interface when the user selects a Partner device type from the list of devices available to pair to the user's Service provider system. FIG. 2 is a flow diagram for Service Association, under an embodiment. Service Association (Partner Onboarding) is initiated by a Card UI of an embodiment when the user selects a partner from a partner list. The list of all possible partners and their custom (partner specific) cards are built into each release of the Card UI (they are not dynamically downloaded from a server). However the list of enabled partners (and related metadata) is dynamically retrieved from Service provider server via an API.

Once the user selects a partner service for association, three-legged OAuth2 begins. A browser control is created, has its context populated with information identifying the user, and calls the Service provider OAuth Redirect servlet, which in turn opens the OAuth2 landing page URL with the required parameters (response_type, client_id and token). This page, served by the partner's web server, collects the user's ID and password and authenticates the user.

After the user is authenticated, the partner server issues an HTTP 302 redirect to the Service provider OAuth Callback servlet located in the portal server and includes an authorization code as well as the rest of the original browser context. The OAuth Callback servlet contacts the partner's service to exchange the authorization code for an access token, which it stores in the database.

Then, the OAuth Callback servlet will call the ‘Associate Account URL’ provided by the partner. The access token for the user account is attached to the request as the Authorization header to identify the user. The response to the call will include the user's account id in the partner system and a list of cloud devices owned by the user. After successful account association, HTTP 200 is returned to the browser indicating the completion of the service association process.

The service provider OAuth callback URL has the following format, but is not so limited: https://<servername>/oauth/oauthPartner/<partnerName>. It is recommended that a service provider deployment register this URL in the partner's system. As an alternate (less secure) option, this URL can be passed to the partner system as a parameter in the first leg of the OAuth process.
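For illustration, the following Python sketch (using the requests library) outlines the server-side steps of the callback flow described above: exchanging the authorization code for an access token and then calling the partner's Associate Account URL. The partner token endpoint, client credentials, and Associate Account URL shown here are placeholders, and the parameter names simply follow standard three-legged OAuth2; this is not a definitive description of any particular partner's API.

import requests

TOKEN_URL = "https://partner.example.com/oauth/token"            # hypothetical
ASSOCIATE_ACCOUNT_URL = "https://partner.example.com/associate"  # provided by the partner

def exchange_code_for_token(auth_code: str) -> dict:
    # Exchange the authorization code returned in the HTTP 302 redirect for an
    # access (and refresh) token, which is then stored in the database.
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": "<client_id>",
        "client_secret": "<client_secret>",
        "redirect_uri": "https://<servername>/oauth/oauthPartner/<partnerName>",
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"access_token": "...", "refresh_token": "..."}

def associate_account(access_token: str) -> dict:
    # Call the partner's Associate Account URL with the token as the
    # Authorization header; the response lists the partner account ID and the
    # user's cloud devices.
    resp = requests.post(
        ASSOCIATE_ACCOUNT_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()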

The system of an embodiment includes Cloud Object Synchronization as described herein. After a service has been associated for a user/account, the system server has the list of cloud devices owned by the user. If the user adds/removes a device in the partner's system, the partner server calls the Service provider Cloud Integration Service API to inform the Service provider of the change. Conversely, if the user removes a cloud device association in the Service provider Card UI, an event is sent to the partner's system via the ‘Event Callback URL’.

After completing user authentication, the OAuth token for the user account is attached to a request to associate the account (Associate Account in FIG. 2). The Partner Server's response to the call will include the user's Account ID in the Partner's system and a list of cloud-enabled devices owned by the user in that account. After successful account association, HTTP 200 is returned to the browser indicating the completion of the service association process. After a service has been associated for a user/account, the Service provider server will have the list of cloud devices owned by the user.

Account association with the CIS is the process by which the system server creates the relationship between the Service provider user and partner cloud devices. The Cloud Integration Service sends, via HTTP POST, a JSON Object containing the OAuth Access Token.

An example Associate Account Request follows but the embodiment is not so limited:

URL: The ‘Associate Account URL’ provided by the partner in the Cloud Integration Submission Form.
Description: Get the user's account/device info from the partner.
Method: POST
Header: Authorization - ‘Bearer xxxxx’ where xxxxx is the user's access token.
URL parameters: customerName: The name of the Service provider server.

When an account association request is received by the Cloud Integration Adapter, it responds with a JSON message in the following format:

Associate Account Response

{
  "virtualDevice.siteId": "acc_1234",
  "virtualDevice.instanceIds": [
    {
      "id": "device-inst001",
      "name": "Front Sprinkler"
    },
    {
      "id": "device-inst002",
      "name": "Backyard Sprinkler"
    }
  ]
}

Where:

Field Name and Description:

virtualDevice.siteId: The user ID in the Partner's system. This will be the global identifier used by Service provider to refer to the Partner's primary user.
virtualDevice.instanceIds: A list (JSON Array) of devices the Partner or user wishes Service provider to interact with.
id: The Device ID in the Partner's system.
name: A friendly display name for this device.

An HTTP 200 is expected along with this data. Error codes should include an HTTP 500 for errors, and an HTTP 401 for an improper OAuth token.
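To make the adapter-side contract concrete, the following Python sketch (assuming Flask) handles an Associate Account request and returns the response body and error codes described above. The token validation and device lookup functions are placeholders invented for this example.

from flask import Flask, jsonify, request

app = Flask(__name__)

def lookup_account_by_token(token):
    # Placeholder: resolve the partner account ID from the OAuth access token.
    return "acc_1234" if token else None

def list_cloud_devices(account_id):
    # Placeholder: enumerate the user's cloud devices in the partner system.
    return [{"id": "device-inst001", "name": "Front Sprinkler"},
            {"id": "device-inst002", "name": "Backyard Sprinkler"}]

@app.route("/associateAccount", methods=["POST"])
def associate_account():
    token = request.headers.get("Authorization", "").replace("Bearer ", "", 1).strip()
    account_id = lookup_account_by_token(token)
    if account_id is None:
        return "", 401  # improper OAuth token
    try:
        devices = list_cloud_devices(account_id)
    except Exception:
        return "", 500  # general error
    return jsonify({
        "virtualDevice.siteId": account_id,
        "virtualDevice.instanceIds": [
            {"id": d["id"], "name": d["name"]} for d in devices
        ],
    }), 200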

Updating the status of a partner product (events) involves the user interacting with the Partner's product/service through a Partner client (e.g., a Partner mobile app); alternatively, the user may interact with the device locally and change its state, mode, or otherwise affect the product/service's status. Events received from the Partner's Cloud Integration Adapter can be treated as a trigger for a rule in the Service provider system (e.g., when the backyard sprinkler system is running, lock the pet door).

An example payload description follows but the embodiment is not so limited:

Endpoint: /cloudIntegration/[partnerName]/events/submitCloudEvent
Description: Submit partner events to Service provider server.
Method: POST
Header: x-login - username of the integration user.
        x-password - password of the integration user.
        x-partnerKey - A unique key issued by Service provider to the partner.
URL parameters: partnerName: The unique name of the partner.
                externalAccountId: The user's account ID in the partner system.
Body: Events originated from the partner system in IcEvent(s) JSON format. Example:

{"icEvent":[{"metaData":[{"name":"virtualDevice.siteId","value":"acc_1234"},{"name":"virtualDevice.instanceId","value":"rachio-inst001"},{"name":"virtualDevice.providerId","value":"rachio"}],"mediaType":"sprinkler/on","ts":1409675025053,"value":"true"}]}

Fields:

    • mediaType: The event mediaTypes defined as part of the cloud object definition and approved by Service provider.
    • ts: The time when the event happened (in milliseconds).

Event Metadata:

    • virtualDevice.providerId: The name of the partner. Also referred to as Integration_ID in the Card SDK.
    • virtualDevice.siteId: The user's account ID in the partner system.
    • virtualDevice.instanceId: The device ID in the partner system.

Result: HTTP response 200 if successful.
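By way of example, the following Python sketch (using the requests library) shows an adapter submitting such a status event to the submitCloudEvent endpoint described above. The server name, integration credentials, and device identifiers are placeholders, and passing externalAccountId as a query parameter is an assumption based on the URL parameters listed above.

import time
import requests

def submit_cloud_event(server, partner_name, external_account_id,
                       media_type, value, site_id, instance_id):
    url = (f"https://{server}/cloudIntegration/{partner_name}"
           f"/events/submitCloudEvent")
    payload = {"icEvent": [{
        "metaData": [
            {"name": "virtualDevice.siteId", "value": site_id},
            {"name": "virtualDevice.instanceId", "value": instance_id},
            {"name": "virtualDevice.providerId", "value": partner_name},
        ],
        "mediaType": media_type,
        "ts": int(time.time() * 1000),  # event time in milliseconds
        "value": value,
    }]}
    resp = requests.post(
        url,
        params={"externalAccountId": external_account_id},
        headers={"x-login": "<username>", "x-password": "<password>",
                 "x-partnerKey": "<key>"},
        json=payload,
        timeout=10,
    )
    return resp.status_code  # 200 if successful

# e.g. report that the backyard sprinkler turned on:
# submit_cloud_event("<server>", "rachio", "acc_1234",
#                    "sprinkler/on", "true", "acc_1234", "rachio-inst001")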

In controlling a partner product via the rules engine (actions) of an embodiment, the CIS uses the partner's Event Callback URL to submit action events to the partner's system. Typically, an action event asks the partner's system to perform a specific function. The partner submits the result of the action back to the Service provider in the form of an event.

Payload Description

URL: The Event Callback URL for the partner
Description: Submit action events to partner server.
Method: POST
Header: Authorization - The value is ‘Bearer xxxxxx’ with xxxxxx being the user's OAUTH access token.
        externalAccountId: The user's account ID in the partner system (to be added on Padre release).
URL parameters: None.
Body: Action events originated from the Service provider system in IcEvent JSON format. Example event sent to Rachio:

{"icEvent":[{"ts":1409675025053,"instanceId":"181964.0","mediaType":"virtualDevice/pending","id":"1430834677258","instanceName":"Bedroom","value":null,"context":[],"metaData":[{"name":"virtualDevice.instanceId","value":"rachio-inst001"},{"name":"virtualDevice.siteId","value":"acc_1234"},{"name":"virtualDevice.providerId","value":"rachio"},{"name":"functionMediaType","value":"sprinkler/schedulePause"},{"name":"requestMessageId","value":"1430489036"}]}]}

Fields:

    • id: The event ID generated by Service provider server.
    • mediaType: All action events have ‘virtualDevice/pending’ as the event media type. The actual action is represented as ‘sprinkler/schedulePause’ in the metadata.
    • ts: The time when the event happened (in milliseconds).
    • instanceId: The ID of the device in the Service provider system.

Event Metadata:

    • virtualDevice.providerId: The name of the partner. Also referred to as Integration_ID in the Card SDK.
    • virtualDevice.siteId: The user's account ID in the partner system.
    • virtualDevice.instanceId: The device ID in the partner system.
    • functionMediaType: Identifies the action called by Service provider. The list of all possible function media types is defined at the time of partner onboarding.
    • requestMessageId: The ID of the action request. The partner should use this ID when sending the success/failure response.

Upon receiving the action event, the partner should send a success/failure response as an event to the Service provider server.

Action Event Response

Successful response:

{"icEvent":[{"metaData":[{"name":"virtualDevice.siteId","value":"acc_1234"},{"name":"virtualDevice.instanceId","value":"rachio-inst001"},{"name":"virtualDevice.providerId","value":"rachio"},{"name":"requestMessageId","value":"1430489036"}],"mediaType":"virtualDevice/success","ts":1409675025053,"value":"true"}]}

Failure response:

{"icEvent":[{"metaData":[{"name":"virtualDevice.siteId","value":"acc_1234"},{"name":"virtualDevice.instanceId","value":"rachio-inst001"},{"name":"virtualDevice.providerId","value":"rachio"},{"name":"requestMessageId","value":"1430489036"}],"mediaType":"virtualDevice/failed","ts":1409675025053,"errorCode":"500","value":"true"}]}

Event disposition is determined by the functionMediaType in the metaData array. In the above example, the functionMediaType has the value ‘sprinkler/schedulePause’, but depending on the function, there may be a parameter or value required in order to effect the desired control.
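As a non-limiting sketch of this action flow, the following Python code (assuming Flask and the requests library) receives an action event on the Event Callback URL, performs a placeholder partner call, and posts the success/failure event back to the Service provider server. The CIS event URL, credentials, and the perform_partner_action helper are hypothetical.

import time
import requests
from flask import Flask, request

app = Flask(__name__)

# Placeholder endpoint and credentials for reporting results back to the CIS.
CIS_EVENT_URL = "https://<server>/cloudIntegration/rachio/events/submitCloudEvent"
CIS_HEADERS = {"x-login": "<username>", "x-password": "<password>", "x-partnerKey": "<key>"}

def metadata_value(event, name):
    # Pull a named value out of the event's metaData array.
    return next(m["value"] for m in event["metaData"] if m["name"] == name)

def perform_partner_action(function_media_type):
    # Placeholder: translate the functionMediaType (e.g. sprinkler/schedulePause)
    # into a partner-proprietary API call and report whether it succeeded.
    return True

@app.route("/eventCallback", methods=["POST"])
def event_callback():
    event = request.get_json(force=True)["icEvent"][0]
    ok = perform_partner_action(metadata_value(event, "functionMediaType"))
    response_event = {
        "metaData": [
            {"name": "virtualDevice.siteId", "value": metadata_value(event, "virtualDevice.siteId")},
            {"name": "virtualDevice.instanceId", "value": metadata_value(event, "virtualDevice.instanceId")},
            {"name": "virtualDevice.providerId", "value": "rachio"},
            {"name": "requestMessageId", "value": metadata_value(event, "requestMessageId")},
        ],
        "mediaType": "virtualDevice/success" if ok else "virtualDevice/failed",
        "ts": int(time.time() * 1000),
        "value": "true",
    }
    if not ok:
        response_event["errorCode"] = "500"
    requests.post(CIS_EVENT_URL, headers=CIS_HEADERS,
                  json={"icEvent": [response_event]}, timeout=10)
    return "", 200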

FIG. 3 is a flow diagram for Service Disassociation, under an embodiment. If the user adds/removes a device in the Partner's system, the Partner Server calls the CIS API to inform Service provider about the change. A Cloud Service can be disassociated from a Service provider user through an API invocation on the Service provider server. This removes the cloud account and its associated Cloud devices from the Service provider server. A SMAP message is sent to the CPE to update its Cloud Object inventory, and the CIS calls the partner's ‘Event Callback URL’ to inform the Partner that the user has disassociated.

Payload Description

URL: The Event Callback URL for the partner
Description: Notify Partner Server that a user has “deleted” or removed one of their partner products from being controlled by the Service provider system.
Method: POST
URL parameters: None
Body:

{
  "icEvent": [
    {
      "metaData": [
        {
          "name": "virtualDevice.siteId",
          "value": "rachio-account-id-0001"
        },
        {
          "name": "virtualDevice.instanceId",
          "value": "rachio-userinst-0001"
        },
        {
          "name": "virtualDevice.providerId",
          "value": "rachio"
        }
      ],
      "id": "1409865500000",
      "mediaType": "virtualDevice/remove",
      "ts": 1409865500000,
      "href": "sites/1/network/instances/181002.0",
      "siteId": "1",
      "deviceId": "1002",
      "instanceId": "181002.0"
    }
  ]
}

Event metadata includes but is not limited to: virtualDevice.providerId (e.g., name of the partner, also referred to as Integration_ID in the Card SDK); virtualDevice.siteId (e.g., user's account ID in the partner system); virtualDevice.instanceId (e.g., device ID in the partner system). Fields include but are not limited to: mediaType (e.g., all remove events will have a mediaType of ‘virtualDevice/remove’); ts (e.g., time when the event happened (in milliseconds)); instanceId (e.g., ID of the device in Service provider system); id (e.g., event ID generated by Service provider server).

FIG. 4 is a flow diagram for Card UI Interactions, under an embodiment. The Card UIs that interact with the Cloud Objects will not depend on data stored in Service provider servers. Instead, the cards interact through the Partner Proxy Service, which handles authentication and logging, to make calls to the partner server. For example, a Nest card that needs to show a list of thermostats gets the list of thermostats and their metadata indirectly from Nest (through the Partner Proxy Service) instead of leveraging the Cloud Object data stored in the Service provider database. This is done primarily because the Card UI authors, who are expected to be the partners themselves, can then use their own APIs for easier development. Note that this does allow the two data sets (the Cloud Objects in the Service provider database and the list of devices provided by the partner's server) to get out of sync if bugs exist in the integrations. Normally, changes are synchronized as described above in Cloud Object Synchronization and the two data sets remain equivalent.

Cards will be oblivious to authentication with the Partner Server (except for service association where the authentication data is stored in our server). Invocations to the Partner Proxy Service cause it to attempt a ‘pass-through’ invocation on the Partner Server using the authentication credentials stored in the database. If the Partner Server responds with a 401 authentication failure, the Partner Proxy Service will attempt to refresh the token and re-attempt the invocation to the Partner Server with the updated token as shown in the diagram above. Authentication credentials are not made available to the Cards, so they perform authenticated requests through the Partner Proxy Service.
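The pass-through and token-refresh behavior above can be sketched as follows in Python (using the requests library). The token store and the refresh call against the partner's OAuth server are placeholders, not an actual implementation of the Partner Proxy Service.

import requests

def refresh_access_token(user_id):
    # Placeholder: exchange the stored refresh token for a new access token
    # with the partner's OAuth server and persist it in the database.
    return "<new-access-token>"

def proxy_call(user_id, method, partner_url, access_token, **kwargs):
    # Pass-through invocation on the Partner Server using the stored token.
    headers = dict(kwargs.pop("headers", {}))
    headers["Authorization"] = f"Bearer {access_token}"
    resp = requests.request(method, partner_url, headers=headers,
                            timeout=10, **kwargs)
    if resp.status_code == 401:
        # Token rejected: refresh and re-attempt the invocation once.
        headers["Authorization"] = f"Bearer {refresh_access_token(user_id)}"
        resp = requests.request(method, partner_url, headers=headers,
                                timeout=10, **kwargs)
    return resp

# Example: a Nest card asking the proxy for the user's thermostats; the card
# never sees the stored credentials.
# proxy_call("user-1", "GET", "https://partner.example.com/thermostats",
#            access_token="<stored-token>")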

The system of an embodiment includes files that form the Cloud Integration Metadata. As an example, an embodiment includes Cloud Integration Descriptor (CID) and Rules Template files that make up the Cloud Integration Metadata that defines a cloud integration.

The CID describes the capabilities of the devices and/or services provided by the Partner Provider Plugin including attributes, actions, events, and their associated parameters. This descriptor is used by the server to provide REST API access to the capabilities provided by the cloud service, but is not so limited.

An example CID XSD of an embodiment is as follows, but the embodiment is not so limited.

CID XSD  <xs:complexType name=“cloudObject”>   <xs:complexContent>    <xs:sequence>     <xs:element name=“name” type=“xs:token”/>     <xs:element name=“metaData” type=“metaData” minOccurs=“0” maxOccurs=“64”/>     <xs:element name=“point” type=“point” minOccurs=“0” maxOccurs=“64”/>     <xs:element name=“function” type=“function” minOccurs=“0” maxOccurs=“64”/>    </xs:sequence>    <xs:attribute name=“id” type=“xs:string” use=“required”/>    <xs:attribute name=“mediaType” type=“xs:token”    use=“optional”/>    <xs:attribute name=“href” type=“xs:anyURI”/>    <xs:attribute name=“tags” type=“xs:token”/>    <xs:attribute name=“status” type=“cloudObjectStatus”/>   </xs:complexContent>  </xs:complexType>  <xs:complexType name=“metaData”>   <xs:attribute name=“name” type=“xs:string”/>   <xs:attribute name=“value” type=“xs:string”/>   <xs:attribute name=“mediaType” type=“xs:token”/>  </xs:complexType>  <xs:complexType name=“point”>   <xs:attribute name=“mediaType” type=“xs:token” use=“required”/>   <xs:attribute name=“name” type=“xs:string”/>   <xs:attribute name=“href” type=“xs:anyURI”/>   <xs:attribute name=“value” type=“xs:string”/>   <xs:attribute name=“ts” type=“xs:long”/>   <xs:attribute name=“readOnly” type=“xs:boolean” use=“required”/>  </xs:complexType>  <xs:complexType name=“function”>   <xs:sequence>    <xs:element name=“input” type=“input” minOccurs=“0” maxOccurs=“unbounded”/>   </xs:sequence>   <xs:attribute name=“mediaType” type=“xs:token” use=“required”/>   <xs:attribute name=“name” type=“xs:token”/>   <xs:attribute name=“href” type=“xs:anyURI”/>   <xs:attribute name=“description” type=“xs:string”/>  </xs:complexType>  <xs:simpleType name=“cloudObjectStatus”>   <xs:restriction base=“xs:token”>    <xs:enumeration value=“ok”/>    <xs:enumeration value=“offline”/>    <xs:enumeration value=“unknown”/>    <xs:enumeration value=“missing”/>    <xs:enumeration value=“searching”/>    <xs:enumeration value=“configuration_failure”/>    <xs:enumeration value=“upgrading”/>    <xs:enumeration value=“configuring”/>   </xs:restriction>  </xs:simpleType>

An example CID of an embodiment is as follows, but the embodiment is not so limited.

Example CID

<Nest id="Nest343234345" mediaType="cloud/nest" tags="thermostat" status="ok">
  <name>My Nest</name>
  <metadata name="manufacturer" mediaType="cloud/nest/manufacturer" value="Nest"/>
  <metadata name="model" mediaType="cloud/nest/model" value="M1"/>
  <point name="temperature" mediaType="cloud/nest/temperature" value="2800" ts="23434535464557" readOnly="false"/>
  <point name="coolSetpoint" mediaType="cloud/nest/coolSetpoint" value="2400" ts="23434535464557" readOnly="false"/>
  <point name="heatSetpoint" mediaType="cloud/nest/heatSetpoint" value="2000" ts="23434535464557" readOnly="false"/>
  <function name="resetSetpoints" mediaType="cloud/nest/reset" description="reset heat/cool setpoint to factory default"/>
</Nest>

An example Rules XSD Changes of an embodiment is as follows, but the embodiment is not so limited.

Rules XSD Changes

<!--
 - Subclass of trigger for Cloud commands.
 -->
<xsd:complexType name="cloudTrigger">
  <xsd:complexContent>
    <xsd:extension base="trigger">
      <xsd:sequence>
        <!-- The specific cloud object ID. -->
        <xsd:element name="cloudObjectID" type="xsd:string" minOccurs="1" maxOccurs="1"/>
        <!-- the evaluation mechanism to apply to this trigger -->
        <xsd:choice>
          <xsd:element name="simpleEval" type="cloudSimpleTriggerEvaluation"/>
          <xsd:element name="comparisonEval" type="cloudComparisonTriggerEvaluation"/>
        </xsd:choice>
      </xsd:sequence>
    </xsd:extension>
  </xsd:complexContent>
</xsd:complexType>

<xsd:element name="cloudTrigger" type="cloudTrigger" substitutionGroup="trigger"/>

<!-- simple cloud trigger evaluation (just an event, no args) -->
<xsd:complexType name="cloudSimpleTriggerEvaluation">
  <xsd:sequence>
    <xsd:element name="eventName" type="xsd:string"/>
  </xsd:sequence>
</xsd:complexType>

<!-- cloud trigger evaluation that compares a value -->
<xsd:complexType name="cloudComparisonTriggerEvaluation">
  <xsd:sequence>
    <xsd:element name="attributeName" type="xsd:string"/>
    <xsd:element name="comparisonMethod" type="comparisonMethodEnum"/>
    <xsd:element name="comparisonValue" type="xsd:double"/>
  </xsd:sequence>
</xsd:complexType>

<!-- comparison methods -->
<xsd:simpleType name="comparisonMethodEnum">
  <xsd:restriction base="xsd:string">
    <!-- equality -->
    <xsd:enumeration value="eq"/>
    <!-- less than -->
    <xsd:enumeration value="lt"/>
    <!-- less than or equal -->
    <xsd:enumeration value="le"/>
    <!-- greater than -->
    <xsd:enumeration value="gt"/>
    <!-- greater than or equal -->
    <xsd:enumeration value="ge"/>
  </xsd:restriction>
</xsd:simpleType>

An example Master Action List Changes of an embodiment is as follows, but the embodiment is not so limited.

Master Action List Changes

<a:action actionID="137">
  <a:description>Invoke a Cloud Action</a:description>
  <a:parameterDef>
    <a:key>cloudObjectID</a:key>
    <a:type>string</a:type>
  </a:parameterDef>
  <a:parameterDef>
    <a:key>cloudActionID</a:key>
    <a:type>string</a:type>
  </a:parameterDef>
  <a:parameterDef>
    <a:key>parameters</a:key>
    <a:type>string</a:type> <!-- a JSONArray of JSONObjects that contain name/value/type triplets (type is optional) -->
  </a:parameterDef>
  <!-- does this type make sense? -->
  <a:type>workflow</a:type>
  <a:target>ruleAction_invokeCloud</a:target>
</a:action>

An example Rule XML Examples of an embodiment is as follows, but the embodiment is not so limited.

Rule XML Examples <rule ruleID=″1002351″>   <triggerList>    <cloudTrigger>     <description>Cloud Trigger</description>     <category>cloud</category>     <!-- just points to the global service, not to any particular instance -->     <cloudObjectID>AccuWeather</cloudObjectID>     <!-- it is assumed here that when the AccuWeather account is connected      that it is already filtering based on the user′s location / zipcode -->     <simpleEval>      <eventName>tornadoWarning</eventName>     </simpleEval>    </cloudTrigger>   </triggerList>   <action>    <actionID>70</actionID>    <parameter>     <key>lightID</key>     <value>3781220513309696</value>    </parameter>    <parameter>   <key>level</key>   <value>100</value>    </parameter>   </action>   <description>Turn on kitchen light when Tornado Warning</description>  </rule> <rule ruleID=″1008603″>   <triggerList>    <zoneTrigger>     <description>Zone Trigger</description>     <category>sensor</category>     <zoneState>open</zoneState>     <zoneID>18</zoneID>    </zoneTrigger>   </triggerList>   <action>    <actionID>137</actionID>    <parameter>     <key>cloudObjectID</key>     <value>nest.1</value> <!-- device 1 under the nest service associated with this account -->    </parameter>    <parameter>     <key>cloudActionID</key>     <value>configureThermostat</value>    </parameter>    <parameter>     <key>parameters</key>     <value>[ { ″name″: ″heatSetPoint″, ″value″: ″2200″, ″type″: ″nest/temperature″ }, { ″name″: ″coolSetPoint″, ″value″: ″2700″ } ] </value>    </parameter>   </action>   <description>Zone 1 Open Configure Nest Thermostat</description>  </rule>

In order to provide a dynamic list of available actions and triggers during rule authoring, templates describing the available functionality must be provided with the Cloud Integration Metadata. Some examples of trigger and action templates (e.g., Rachio Smart Sprinkler Controller trigger and action template, AccuWeather weather service trigger template, etc.) of an embodiment are as follows, but the embodiment is not so limited.

Example Rule Templates

<rules-core:triggerTemplates  xmlns:rules-core=″rules-core″ xmlns :xsi=″http://www.w3.org/2001/XMLSchema-instance″  xsi:schemaLocation=″rules-core ../../../../rules-core/src/main/resources/ rules-core.xsd″>  <rules-core:triggerTemplate id=″203″  description=″{STR.RULES.TEMPLATES.TRIGGER.DESC. INSTANCE.RACHIO}″   cvTriggerType=″cloudTrigger″ cvCategory=″cloud″  excludeActionIds=″10:11:15:16:17:18:100:101:103:120:121:122:135:: 136:137138:139″>   <rules-core:inputs>    <rules-core:input hidden=″false″  description=″{STR.RULES.TEMPLATES.TRIGGER. TARGETVALUES.DESC.RACHIO}″     name=″targetValues″ pattern=″eq″>     <option  description=″{STR.RULES.TEMPLATES.TRIGGER. TARGETVALUES.OPTION.DESC.RACHIO.ON}″      value=″1″ />     <option  description=″{STR.RULES.TEMPLATES.TRIGGER. TARGETVALUES.OPTION.DESC.RACHIO.OFF}″      value=″0″/>    </rules-core:input>    <rules-core:input hidden=″true″ name=″type″ value=″event″ />    <rules-core:input hidden=″false″ name=″instanceIds″ />    <rules-core:input hidden=″true″ name=″tags″ value=″rachio″ />    <rules-core:input hidden=″true″ name=″mediaTypes″     value″sprinkler/on″ />   </rules-core:inputs>  </rules-core:triggerTemplate> </rules-core:triggerTemplates> <rules-core:triggerTemplates  xmlns:rules-core=″rules-core″ xmlns:xsi=″http://www.w3.org/2001/XMLSchema-instance″  xsi:schemaI.ocation=″rules-core ../../../../rules-core/src/main/resources/ rules-core.xsd″>  <rules-core:triggerTemplate id=″200″  description=″ {STR.RULES.TEMPLATES.TRIGGER.DESC. INSTANCE.ACCUWEATHER}″   cvTriggerType=″cloudTrigger″ cvCategory=″cloud″ excludeActionIds=″10:11:15:16:17:18:135:136″>   <rules-core:inputs>    <rules-core:input hidden=″false″  description=″{STR.RULES.TEMPLATES.TRIGGER. TARGETVALUES.DESC.ACCUWEATHER}″     name=″targetValues″ pattern=″gt>     <option  description=″{STR.RULES.TEMPLATES.TRIGGER. TARGETVALUES.OPTION.DESC.ACCUWEATHER. TEMPERATURE.GT}″      value=″temperatureGt″/>    </rules-core:input>    <rules-core:input hidden=″true″ name=″type″ value=″event″ />    <rules-core:input hidden=″false″ name=″instanceIds″ />    <rules-core:input hidden=″true″ name=″tags″ value″accuWeather″ />    <rules-core:input hidden=″true″name=″mediaTypes″     value=″weather/temperature″ />   </rules-core:inputs>  </rules-core:triggerTemplate> </rules-core:triggerTemplates> <rules-core:actionTemplates xmlns:rules-core=″rules-core″  xmlns:xsi=″http://www.w3.org/2001/XML Schema-instance″  xsi:schemaLocation=″rules-core ../../../../rules-core/src/main/ resources/rules-core.xsd″>  <rules-core:actionTemplate id=″137″  description=″{STR.RULES.TEMPLATES.ACTION. DESC.RACHIO.OFF}″ cvActionId=″137″   cvType=″workflow″>   <rules-core:inputs>    <rules-core:input hidden=″ false″ description=″Which Rachio Object″     name=″instanceIds″ cvKey=″cloudObjectID″ cvType=″cloudObjectID″     cvRequired=″true″ />    <rules-core:input name=″mediaType″ value″sprinkler/scheduleStop″ />   </rules-core:inputs>  </rules-core:actionTemplate> </rules-core:actionTemplates>

Sample curl command for a Rachio cloud rule follows.

curl -k -v -L -H "Content-Type:application/json" -H "X-login:insight" -H "X-password:test" -H "X-AppKey: defaultKey" -X PUT "https://10.0.12.102/rest/icontrol/sites/420/rules" -d '{"description":"Rachio turns ON, Turn on Light","executionSource":"client","enabled":true,"valid":true,"default":false,"conditionals":{"conditional":[{"triggers":{"trigger":[{"description":"Rachio is ON","id":"0","templateId":"203","targetValues":"1","type":"event","mediaTypes":"sprinkler/systemOn","instances":"181002.0","targetComparisonTypes":"eq"}]},"actions":{"action":[{"id":"0","templateId":"70","inputs":"level=0&","instanceIds":"13000d6f00020a5d9a.1.0"}]}}]}}'

The user should pass targetComparisonTypes whenever a pattern is present in the cloudTrigger. In the above case, targetComparisonTypes is "eq" and targetValues is "1". Both of these values should be fetched from the triggerTemplate.

Sample curl command for an AccuWeather cloud rule follows.

curl -k -v -L -H "Content-Type:application/json" -H "X-login:insight" -H "X-password:test" -H "X-AppKey: defaultKey" -X PUT "https://10.0.12.102/rest/icontrol/sites/420/rules" -d '{"description":"Outside temperature is less than 100, Turn on light","executionSource":"client","enabled":true,"valid":true,"default":false,"conditionals":{"conditional":[{"triggers":{"trigger":[{"description":"Outside temperature is greater than 60 degrees, turn on light","id":"0","templateId":"200","targetValues":"70","type":"event","mediaTypes":"weather/temperature","instances":"181001.0","targetComparisonTypes":"gt"}]},"actions":{"action":[{"id":"0","templateId":"70","inputs":"level=0&","instanceIds":"13000d6f00020a5d9a.1.0"}]}}]}}'

In the above case, targetComparisonTypes is "gt" and targetValues is "70". Here targetComparisonTypes should be fetched from the triggerTemplate and the user should pass a user-defined value in targetValues.

SMAP Protocol Changes

SMAP is updated to allow the server to send external events to the CPE, and to allow the CPE to send external action events to the server.

 <xsd:complexType name=″cloudEvent″>    <xsd:complexContent>     <xsd:extension base=″smap:baseMessage″>      <xsd:sequence>       <xsd:element name=″metaData″ type=″smap:eventMetaData″ maxOccurs=″32″ minOccurs=″0″>        <xsd:annotation>         <xsd:documentation>Additional information about the event itself</xsd:documentation>        </xsd:annotation>       </xsd:element>       <xsd:element name=″context″ type=″smap:eventContext″ maxOccurs=″32″ minOccurs=″0″>        <xsd:annotation>         <xsd:documentation>Information about other aspects of the system at the time of the event</xsd:documentation>        </xsd:annotation>       </xsd:element>      </xsd:sequence>      <xsd:attribute name″ id″ type=″xsd:token″/>      <xsd:attribute name=″cloudObjectId″ type=″xsd:token″/>      <xsd:attribute name=″mediaType″ type=″xsd:token″ use=″required″/>      <xsd:attribute name=″ts″ type=″xsd:long″ use=″required″/>      <xsd:attribute name=″href″ type=″xsd:anyURI″/>      <xsd:attribute name=″errorCode″ type=″xsd:token″/>      <xsd:attribute name=″value″ type=″xsd:string″/>     </xsd:extension>    </xsd:complexContent>   </xsd:complexType>   <xsd:complexType name=″eventContext″>    <xsd:attribute name=″mediaType″ type=″xsd:token″ use=″required″/>    <xsd:attribute name=″value″ type=″xsd:string″ use=″required″/>    <xsd:attribute name=″href type=″xsd:anyURI″/> </xsd:complexType> <xsd:complexType name=″eventMetaData″>    <xsd:attributename=″name″ type=″xsd:token″ use=″required″/>    <xsd:attribute name=″value″ type=″xsd:string″ use=″required″/> </xsd:complexType> <xsd:complexType name=″cloudActionEvent″>    <xsd:complexContent>     <xsd:extension base=″smap:baseMessage″>      <xsd:sequence>       <xsd:element name=″ruleId″ type=″xsd:long″ minOccurs=″0″ maxOccurs=″1″>        <xsd:annotation>         <xsd:documentation>Id of the rule that triggered this action, if applicable.</xsd:documentation>        </xsd:annotation>       </xsd:element>       <xsd:element name=″eventId″ type=″xsd:string″ minOccurs=″0″ maxOccurs=″1″>        <xsd:annotation>         <xsd:documentation>The id of the event that triggered the rule.</xsd:documentation>        </xsd:annotation>       </xsd:element>       <xsd:element name=″cloudObjectId″ type=″xsd:token″ minOccurs=″1″ maxOccurs=″1″/>       <xsd:element name=″actionMediaType″ type=″xsd:token″ minOccurs=″1″ maxOccurs=″1″/>       <xsd:element name=″actionHref″ type=″xsd:anyURI″ minOccurs=″0″ maxOccurs=″1″/>       <xsd:element name=″actionInput″ type=″smap:input″ minOccurs=″0″ maxOccurs=″32″/>      </xsd:sequence>     </xsd:extension>    </xsd:complexContent>   </xsd:complexType>   <xsd:complexType name=″input″>    <xsd:attribute name=″name″ type=″xsd:token″ use=″required″/>    <xsd:attribute name=″mediaType″ type=″xsd:token″ use=″optional″/>    <xsd:attribute name=″value″ type=″xsd:string″ use=″required″/>   </xsd:complexType>

The ICS of an embodiment effects ICS platform integration with third party system and device functionality (e.g., Philips Hue lights, Chamberlain garage door openers, Nest thermostats, Dropcam cameras, Doorbot doorbell cameras, etc.), as described in detail herein. Using the same processes, other server-to-server (cloud) services (e.g., Accuweather, MSO digital assets such as voicemail, etc.) are also integrated into the ICS platform. FIG. 5 is an example rules interface for controlling triggers and actions involving third party devices integrated in the CAT, under an embodiment. FIG. 6 is another example of an actions portion of a rules interface for integrated third party devices, under an embodiment. FIG. 7 is an example of a triggers portion of a rules interface for third party services integrated with the CAT, under an embodiment. The rules automation actions and triggers of an embodiment include monitor/control functionality enabled via proprietary UIs, and with cards in the Card UI as described herein.

Cloud Action Triggers (CAT)

Cloud Actions and Triggers (CAT) of an embodiment enable cloud services and internet-connected devices to leverage the user interface, Rules Engine and other functions of the service provider system. This allows third party devices (e.g., smart door bells, door locks, garage door operators, cameras, thermostats, lighting systems, lighting devices, lawn irrigation systems, plant sensors, pet feeders, weather stations, rain sensors, pool controls, air quality sensors, music systems, remote controllers, internet user interfaces, connected systems, connected vehicles, etc.), third party services (e.g., weather forecasting services and applications, family networking services and applications, etc.), and others to trigger automations in the service provider system using the Rules Engine. This enables end-users to integrate and use their previously-standalone internet connected devices with their service provider-based service.

The app icons for security and devices that have states are configured to provide at-a-glance status to homeowners.

FIG. 8 is an example touchscreen display including numerous On states, under an embodiment. The touchscreen On states include, for example, but are not limited to at least one door unlocked, at least one light on, and thermostat on, cooling mode, current temperature (73 degrees).

FIG. 9 is an example touchscreen display during arming, under an embodiment.

FIG. 10 is an example touchscreen display including numerous Off states, under an embodiment. The touchscreen Off states include, for example, but are not limited to, all doors locked, all lights off, and thermostat off.

Embodiments include event-based CAT Rules comprising a Card UI configured to perform operations including but not limited to creating, editing and deleting a rule via the REST API, focusing exclusively on trigger events that lead to actions (e.g., weather trigger events (Accuweather) to invoke actions, irrigation system trigger events (Rachio) to invoke actions, etc.).

As an example, the triggers of an embodiment used to drive use case requirements for a weather service integration include but are not limited to current temperature (Celsius, Fahrenheit), current precipitation conditions (Rain yes/no, Snow yes/no), daily temperature forecast high (Celsius, Fahrenheit), daily temperature forecast low (Celsius, Fahrenheit), daily forecast for precipitation (Rain yes/no, Snow yes/no), and severe weather alert (Yes/No). Example use cases of the Accuweather configuration include, but are not limited to, forecast indicator (e.g., mornings between 6 am and 7 am, turn on/off light, or change light color to reflect daily weather forecast), outdoor temperature controls a Relays+ device such as a fan or pet door, turn on a vacation home thermostat only within a certain outdoor temperature range, get text messages when temperature rises above or falls below a certain threshold, and receive severe weather alerts over text messages.

FIGS. 11-30 are example implementations of the CAT rules and corresponding device displays involving the AccuWeather device application (app), under an embodiment. These example screens, which include mnemonic icons and state summaries to represent the rule names, differ from conventional displays presenting a list of lengthy sentences, thereby retaining the simplicity of IFTTT, but with greater detail to differentiate between rules.

FIG. 11 is an example touchscreen display of a rules list, under an embodiment.

FIG. 12 is an example touchscreen display in response to selection of the “Add Rule” icon, under an embodiment.

FIG. 13 is an example touchscreen displayed upon selection of the “Weather Event” icon, including a list of weather events, under an embodiment.

FIG. 14 is an example touchscreen displayed upon selection of the “Reports a temperature” icon, including selections for activating low and high temperature selections, under an embodiment. This display can support TEMP is ABOVE or BELOW settings, rather than range, but is not so limited.

FIG. 15 is an example touchscreen display for selecting a temperature limit (lower) for “Reports a temperature” (“choose low”), under an embodiment.

FIG. 16 is an example touchscreen display following selection of a temperature limit (lower) for “Reports a temperature”, under an embodiment.

FIG. 17 is an example touchscreen display for selecting a temperature limit (upper) for “Reports a temperature”, under an embodiment.

FIG. 18 is an example touchscreen display following selection of a temperature limit (upper) for “Reports a temperature”, under an embodiment.

FIG. 19 is an example touchscreen display for filtering the temperature reporting rule based on time or day, under an embodiment.

FIG. 20 is an example touchscreen display for selecting a time after selecting “any day” as a filtering parameter for the temperature reporting rule, under an embodiment.

FIG. 21 is an example touchscreen display for selecting system state as an event filter for the temperature reporting rule, under an embodiment.

FIG. 22 is an example touchscreen display for which two arming types are selected for system state as an event filter for the temperature reporting rule, under an embodiment.

FIG. 23 is an example touchscreen display presenting available actions for the temperature reporting rule, under an embodiment.

FIG. 24 is an example touchscreen display for selecting a type of device to control in response to choosing an action to control a device according to a temperature reporting rule, under an embodiment.

FIG. 25 is an example touchscreen display for displaying a list of device types corresponding to the selected device type to be controlled under the temperature reporting rule, under an embodiment.

FIG. 26 is an example touchscreen display showing selection of a particular ceiling fan device (“Patio”) under the temperature reporting rule, under an embodiment.

FIG. 27 is an example touchscreen display showing available actions of the selected device for control under the temperature reporting rule, under an embodiment.

FIG. 28 is an example touchscreen display showing options for creating a compound rule with additional actions, under an embodiment.

FIG. 29 is an example touchscreen display for saving a rule, under an embodiment.

FIG. 30 is an example touchscreen display of a rules list following creation of a new rule, under an embodiment.

As another example, the triggers of an embodiment are used to drive use case requirements for functions of an irrigation system (e.g., Rachio), including controlling irrigation On/Off and skipping of a watering schedule based on rain/weather. The triggers of an embodiment corresponding to the irrigation system include but are not limited to Sleep Mode on/off, temporarily disable all watering schedules, enable/disable a specific watering schedule (ensuring that if a schedule was previously enabled then Rachio disables it before enabling a different one), and run a specific watering schedule. Example use cases of the Rachio configuration include, but are not limited to, turn on a porch light when the irrigation system is on and off when the irrigation system is off, sleep/disable Rachio if the temperature dips below a specific temperature, when the home system is Armed (Away) then use a specific watering schedule, when the home system is Disarmed or Armed (Stay) then use a different watering schedule, notify the user when a watering schedule is skipped due to a rain forecast, manually start a watering schedule, and run a specific watering schedule when conditions are met (e.g., temperature is above 70 degrees in a specified time period).

FIGS. 31-52 are example implementations of the CAT rules and corresponding device displays involving the Rachio device application (app), under an embodiment. The screens described below provide examples of how the CAT rules implement a mobile demo using irrigation events. The example screens, which include mnemonic icons and state summaries to represent the rule names, differ from conventional displays presenting a list of lengthy sentences, thereby retaining the simplicity of IFTTT, but with greater detail to differentiate between rules.

FIG. 31 is an example touchscreen display of a rules list, under an embodiment.

FIG. 32 is an example touchscreen display in response to selection of the “Add Rule” icon, under an embodiment.

FIG. 33 is an example touchscreen displayed upon selection of the “Irrigation” icon, including a list of irrigation events, under an embodiment.

FIG. 34 is an example touchscreen displayed upon selection of the “Switches on” icon, including selections for a day for the switching event, under an embodiment.

FIG. 35 is an example touchscreen display for selecting an “on” time after selecting “any day” as a filtering parameter for the switching event, under an embodiment.

FIG. 36 is an example touchscreen display for selecting a time of day as a start time, under an embodiment.

FIG. 37 is an example touchscreen display following selection of a start time for the switching event rule, under an embodiment.

FIG. 38 is an example touchscreen display for selecting a time of day as an end time, under an embodiment.

FIG. 39 is an example touchscreen display following selection of a start time and an end time for the switching event rule, under an embodiment.

FIG. 40 is an example touchscreen display for selecting system state as an event filter for the switching event rule, under an embodiment.

FIG. 41 is an example touchscreen display presenting available actions for the switching event rule, under an embodiment.

FIG. 42 is an example touchscreen display for selecting a type of device to control in response to choosing an action for the switching event rule, under an embodiment.

FIG. 43 is an example touchscreen display for displaying a list of device types corresponding to the selected device type (“lights”) to be controlled under the switching event rule, under an embodiment.

FIG. 44 is an example touchscreen display showing selection of particular lighting devices (“Porch” and “Living Room”) under the switching event rule, under an embodiment.

FIG. 45 is an example touchscreen display showing available actions of the selected device (“Porch light”) for control under the switching event rule, under an embodiment.

FIG. 46 is an example touchscreen display showing available actions of another selected device (“Living Room light”) for control under the switching event rule, under an embodiment.

FIG. 47 is an example touchscreen display showing options for creating a compound rule with additional actions under the switching event rule, under an embodiment.

FIG. 48 is an example touchscreen display for selecting a type of device to control in response to choosing an additional control device for a compound switching event rule, under an embodiment.

FIG. 49 is an example touchscreen display for displaying a list of device types corresponding to the selected device type (“shades”) to be controlled under the compound switching event rule, under an embodiment.

FIG. 50 is an example touchscreen display showing available actions of the selected device (“Living room shades”) for control under the compound switching event rule, under an embodiment.

FIG. 51 is an example touchscreen display for saving a rule, under an embodiment.

FIG. 52 is an example touchscreen display of a rules list following creation of the new switching event rule, under an embodiment.

Card UI

The ICS of an embodiment includes a Card UI, as described in detail herein. The Card UI of an embodiment provides a clean, simple and easily navigable interface with UI elements grouped into a card-like configuration. This framework can then be customized by MSOs and augmented by third-party vendors with their own cards. The Card UI includes a software development kit (SDK) that enables third-party developers to integrate additional features into the application infrastructure. The third-party cards integrate into the existing Card UI framework, thus allowing developers to generate and add new functionality into one or more of the smartphone, tablet and subscriber portal. The SDK also provides a sandbox for the service provider's IP; the cards run in the sandbox to protect the apps and users, and to avoid exposing the developer to critical IP elements of the application (e.g., source code, etc.).

FIG. 53 is a flow diagram for local card development and unit testing, under an embodiment. FIG. 54 is a flow diagram for card integration testing, under an embodiment. FIG. 55 is a flow diagram for card production, under an embodiment. FIG. 56 is an example card (e.g., thermostat, etc.) operating on a smart phone, under an embodiment. FIG. 57 is an example small card (e.g., thermostat, etc.), under an embodiment. FIG. 58 is an example card menu, under an embodiment.

In a desktop browser, the sandbox is an iframe, and the "sandbox" attribute on the iframe provides extra isolation, which is implemented on all supported browsers. The main app provides a limited API to the sandbox via the postMessage function, and the cards of an embodiment are served from a different origin than the main app to prevent unrestricted access to the service provider server via shared cookies.
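A minimal sketch of this arrangement is shown below; the card origin, container element and message format are assumptions for illustration, since the embodiment specifies only the sandboxed iframe, the separate origin, and the postMessage channel:

// Main app (desktop browser): load a card in a sandboxed iframe served from a separate origin
var frame = document.createElement('iframe');
frame.setAttribute('sandbox', 'allow-scripts'); // no same-origin access; scripts only
frame.src = 'https://cards.example.com/com.mycompany.myfirstcard/index.html'; // assumed card origin
document.getElementById('cardContainer').appendChild(frame); // assumed container element

// Limited API surface: accept only known requests from the card's origin and reply via postMessage
window.addEventListener('message', function (e) {
  if (e.origin !== 'https://cards.example.com') return; // ignore messages from other origins
  if (e.data && e.data.api === 'getDeviceState') {      // assumed message format
    var state = { type: 'light', on: true, dimLevel: 50 }; // looked up from the app's own model
    frame.contentWindow.postMessage({ api: 'deviceState', state: state }, e.origin);
  }
});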

Regarding cards operating on a smartphone, each multiple-system operator (MSO) has limited control over the positioning and sizing of the third party cards. On a smartphone, the card is displayed full screen, except for a header provided and branded by the core app in the main webView. Native code sizes the card webview and layers it over the app's main webview. Tablet builds include an option to show “small mode” cards in a popup. A mini card on a view acts like a button that will trigger the popup. Alternatively if the card prefers “large mode”, it can occupy the whole tablet screen (except the header).

The cards of an embodiment do not control their own size, so they are configured to handle the various sizes used by the wide range of mobile devices, but the embodiment is not so limited. An embodiment includes different size modes, for example, a small mode (e.g., the size of a card or a phone screen) and a large mode (e.g., the full size of a tablet screen or full content window). All cards support small mode as this is the only mode on a phone. As part of their packaging the developer can indicate whether they prefer large mode. See below for some examples of how these will appear.

When displayed or presented in a smartphone, the cards of an embodiment appear “full screen” (apart from a header above it), and appear “full screen” on a tablet device and/or a browser if the card is configured to prefer large mode. The cards of an embodiment are displayed in a popup format on a tablet device if the cards are configured to not prefer the large mode.

The cards are alternatively displayed via a browser as mixed in with other cards in a single view if the cards are configured so as to not prefer the large mode. As an alternative to a full screen display, the cards may be displayed as mixed in with other cards in a single view on smartphones, and used instead of popups on tablet devices, but the embodiment is not so limited.

The Card UI Manage Views code is extended to enable users to add third-party cards to views. The Card packaging indicates to the Card UI what type of device it controls (if any), and whether it controls one or more of those devices. When controlling a single device, the Card UI Manage Views code stores that association and informs the Card of the device ID with which it is associated when it starts.

The SDK of an embodiment includes or exposes APIs (e.g., JavaScript) that enable the developer to interact with the service provider devices at the premises. One such API, for example, is as follows:

getDevices({lights: true}); // -> ["1234", "5678"]; list of deviceIds for lights
getDeviceState("1234"); // {type: "light", on: true, dimLevel: 50}
setDeviceState("1234", {on: true}); // switch a light on

// Register for light and thermostat state updates
// Command failures or device troubles will also be sent to this function
registerForUpdates({lights: true, thermostats: true}, myStateUpdateFunction);

// register for state updates for 2 devices
registerForUpdates({deviceIds: ["1234", "5678"]}, myStateUpdateFunction);

Additional API examples of an embodiment include but are not limited to the following (a usage sketch follows this list):

    • a. App lifetime: Events deviceready, pause, resume; informs card about app status; deviceready is simulated by the Card UI for browsers.
    • b. Card lifetime: Activate, deactivate; activate has a parameter for the device ID if the card is dealing with a single home device.
    • c. Preference storage: enables card developers to store user preferences or external server credentials; stored in the same location as for the rest of the app (e.g., keychain for IOS, encrypted file for Android, obfuscated localStore for browser, etc.).
    • d. Device APIs: For mobile devices, access to APIs including Device (hardware model, OS, unique app specific device ID), Network-information (offline/cellular/Wi-Fi), Globalization (locale+date/currency display functions).
    • e. Title update: Card uses this to update title bar appropriately for this card.
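The sketch below shows how a card might use these APIs together; the event registration style and the shape of the activate payload are assumptions for illustration, since only the API categories are listed above:

// Card-side lifecycle handling (sketch)
document.addEventListener('deviceready', function () {
  // App is up; on browsers this event is simulated by the Card UI
});

document.addEventListener('activate', function (e) {
  // When the card controls a single home device, activate carries its device ID
  var deviceId = e && e.deviceId;                               // assumed payload shape
  registerForUpdates({ deviceIds: [deviceId] }, onStateUpdate); // SDK call shown earlier
});

function onStateUpdate(update) {
  // Refresh the card UI; command failures and device troubles also arrive here
}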

The SDK of an embodiment enables developer access to the browser DOM APIs so that numerous frameworks are available (e.g., jQuery, Backbone, etc.). The SDK enables developers to manage their own access to external servers directly from their card's code, and to have full access via XHR, controlled by a whitelist on mobile apps.

External server APIs needing authentication are routed through the partner proxy API. Developers first register for OAuth onboarding with the service provider and, once onboarded and the OAuth process completed, the partner proxy API is available to the card. This API automatically appends an access token to the request, makes the call to the external server, and returns the raw response. An example of this partner proxy API is as follows:

    • cardsdk.partnerProxyCall(path, callback, method, parameters)
    • path: the endpoint of the external API, given either as an absolute or relative URL
    • callback: function(success, data){ }
      • success: returns true if the request completed successfully, else false
      • data: a JSON object containing either the response wrapped in a response object (on success) or detail of the error on failure
    • method: (optional) one of GET, PUT, POST, DELETE. Defaults to GET.
    • parameters: (optional) a URL encoded string of any parameters to include.

The partnerProxyCall function makes an authenticated call to the specified external server.

For example:

cardsdk.partnerProxyCall(
  "/1/public/device/dc345cf5-a6f7-4654-b8ce-4161d0445593/event",
  apiCallCallback,
  "GET",
  "startTime%3D1414818000000%26endTime%3D1415739608103");

function apiCallCallback(success, data) {
  if (success) // Success of the partner proxy API call
  {
    var myRes = data.response;
    if (myRes.statusCode == 200) // Success of my external server API call
      var someData = myRes.content; // Body of the external server response
  }
}

A data object on success:

{
  "response":
  {
    "headers": [
      {"Date":["Thu, 13 Nov 2014 23:22:11 GMT"]},
      {"Content-Length":["50"]},
      {"Connection":["keep-alive"]},
      {"Content-Type":["application\/json"]},
      {"Server":["Apache-Coyote\/1.1"]}
    ],
    "content": "{\"current\":\"0b881a86-6262-4c00-8d44-5f0b1f324c7e\"}",
    "statusCode": 200,
    "contentType": "application\/json",
    "contentEncoding": "UTF-8"
  }
}

And on failure: {“error”: “Some error message”}

The app of an embodiment controls the lifetime of cards and, as such, creates a card when it needs to be displayed, and destroys a card at any time when it is not displayed (e.g., depending on how many cards are in use and any memory pressure). When a user wishes to view a card again, the app re-creates the card transparently, and in time for any slide transition to display smoothly.

A card indicates in its packaging whether it needs to be running constantly when the app runs (e.g., long-lived third party server connection, need to be alerted to device status changes immediately upon occurrence, etc.). The app handles status updates from the server and updates its own internal memory model with the new status, which it shares with cards when needed. Whenever a card is started or restarted, the app feeds it the latest state of its associated device or devices, so there is generally no need for a card to be running continuously.

An embodiment includes a stub environment for testing and debugging a card. The stub environment enables developers to build, test and debug their cards in a self-contained environment, without access to service provider app source code and, as such, does not require a service provider server or account. Instead, the APIs described herein are simulated using local javascript only, making it trivial to simulate new features in the stub environment without waiting for those features to appear in the service provider apps or server.

For mobile apps, the stub environment is based around Cordova (aka PhoneGap). Developers use the base SDK JavaScript files, mix in their card files, and build, package and sign their own mobile apps for testing using Cordova and iOS/Android tools, but the embodiment is not so limited. The app includes one or more dummy cards, and the developer's own card or cards, so that developers can test card behavior on a wide range of mobile devices. Cards of an embodiment are debugged using "Web Inspector" (provided with iOS and Android dev tools) but are not so limited.

An embodiment provides the stub environment as HTML/JS/CSS files. Developers can include their card files, and run these locally on their development machine, or upload to a web server of their choice, and debugging is via the Web Inspector in their browser. The stub environment is configured with a JSON file that specifies the devices of the stub system, their initial states, and also allows state changes or troubles to be simulated after fixed periods of time (e.g., can request a stub system with 6 lights, 3 thermostats, 2 water sensors, and request a water sensor trip or trouble 1 minute after startup).
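A stub configuration file along these lines might look as follows; the property names are assumptions for illustration, since the embodiment specifies only what the JSON describes (the stub devices, their initial states, and simulated changes after fixed delays):

{
  "devices": [
    { "id": "light-1", "type": "lights", "state": { "on": false, "dimLevel": 0 } },
    { "id": "therm-1", "type": "thermostats", "state": { "on": true, "setPoint": 72 } },
    { "id": "water-1", "type": "water", "state": { "tripped": false } }
  ],
  "simulatedEvents": [
    { "afterSeconds": 60, "deviceId": "water-1", "state": { "tripped": true } }
  ]
}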

The system of an embodiment includes a Card Portal comprising a web site that enables developers to do one or more of create a developer account, download the Card SDK, register a card and its ID (e.g., com.mycompany.myfirstcard), upload a card (see later for packaging details), perform basic automatic validation of the card and package, run a card in the service provider app, and request review and certification of a card by the service provider.

The Card Portal of an embodiment is written in node.js, but is not so limited. After testing and debugging in the Stub Environment, the card is then tested inside the service provider app within the service provider system. To do so, the developer installs the service provider branded app from the app store. In "settings" of the service provider app, the developer selects a checkbox indicating they are a developer and, in response, the app prompts for the user's developer account and password, validates the user at the Card Portal, downloads a list of the user's Cards, displays the list of cards so that the user checks each one they want to have loaded at startup, and downloads the cards in the background.

Once downloaded, the developer is presented a message (e.g., "New cards have been downloaded. Restart the app to use them"). During a subsequent app start, the app automatically checks for updated cards, downloads them in the background, and then prompts the user/developer as before to restart. Unlike the stub environment, the developer will not have debug access inside the service provider app, but will receive log statements (e.g., any log statements from their card are routed to the regular device logs (Xcode/adb/browser console)), but the embodiment is not so limited.

Cards of an embodiment are uploaded to the Card Portal as a zip file that includes the HTML, CSS, JS, and resource files. A card.json file is also included with the cards, which has details of the corresponding card behavior as described herein. The card.json file includes an object with the following properties: id (id of the third party card); integrationId (id for the partner integration with the iControl server; this enables developers to develop multiple cards with one partner integration); version (three point version string of the form "<int>.<int>.<int>"); name (display name of the card); deviceType (e.g., other, info, irrigation, doors, thermostats, lights, cameras, dryContact, motion, co, water, etc.); appStateTypes (array of all device types for which a card would like to receive the state); deviceImg (filename of the image to be displayed in the onboarding section of the app); preferLargeMode (a card is loaded either full screen or inside a pop-up on tablet and smartphone; this flag enables "always full screen" if true); startFile (filename of the HTML file to load on card startup); runInBackground (if true, the app will try to keep the card in memory after a user navigates away from it, but the card may be destroyed at any time).
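An example card.json consistent with these properties might look as follows; the values are illustrative only:

{
  "id": "com.mycompany.myfirstcard",
  "integrationId": "com.mycompany",
  "version": "1.2.15",
  "name": "My First Card",
  "deviceType": "lights",
  "appStateTypes": ["lights", "thermostats"],
  "deviceImg": "myfirstcard.png",
  "preferLargeMode": false,
  "startFile": "index.html",
  "runInBackground": false
}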

Once the developer has tested their card in the service provider app, they use the Card Portal to request card certification by the service provider. Considerations for certification include that the card performs reliably and satisfactorily, does not damage performance or UX of the service provider app when integrated, uses external servers appropriately, and works correctly across a range of phones, tablets and browsers. A version of a card is initially certified by the service provider, and newly updated versions may undergo further review. Alternatively, trusted developers might be allowed to provide updates to their already certified cards without any further review.

The app build system is extended to pull certain certified cards from the Card Portal and include them in the MSO app and smartphone builds. Each MSO can have its own choice of cards, card versions and card languages, but the embodiment is not so limited. Alternatively, the Card Portal may be extended to distribute cards dynamically to real users, instead of baking the cards into an app.

Branding and internationalization of cards is enabled for third party developers wishing to project their own brand on their card, or use existing HTML5 code in their card. Furthermore, MSOs wishing to build their own cards have access to select JS and CSS files as part of the SDK that implement basic widgets, spinners and styles used throughout the mobile apps.

Card developers are able to extend their card IDs to identify particular variants that the app build system can pick up, for example:

// Developer registers this card ID:

    • com.superlights.multicolor

Example individual MSO and/or language variants

    • com.superlights.multicolor.comcast
    • com.superlights.multicolor.rogers.fr
    • com.superlights.multicolor.rogers.en
    • com.superlights.multicolor.fr

The card packaging parameters of an embodiment (in card.json) include Card ID (e.g., com.mycompany.myfirstcard), Card Version (e.g., 1.2.15), Server Whitelist (e.g., https://myserver.com, https://myotherserver.com), Prefers large mode (if true, the card prefers "full screen" on tablet and smartphone), Runs all the time (if true, the app should not kill the card to save resources), Name of startup file (e.g., index.html), Device types (e.g., "lights", "thermostats"), and Single device (if true, the card controls a single device).

Once certified, cards are onboarded by the service provider. The description of the onboarding process that follows is presented in the context of an example in which partner devices (partners) include a third party service (e.g., accuWeather), a thermostat (e.g., Nest), and a sprinkler control system (e.g., Rachio), but the embodiment is not so limited.

A list of available (within ICS) cloud partners is generated using the partnerNames API of the Card SDK to retrieve a comma-delimited list of partners.

https://qacluster-converge.icontrol.com/rest/icontrol/sites/291/partnerNames

// returns this

accuWeather,nest,rachio

An embodiment calls the cloudObjectByProvider API to get details or instances for a cloud partner. If this API returns a list of device instances, then the partner has already been on-boarded.
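A sketch of this check from app or card code is shown below; the query string form and the response handling are assumptions for illustration, based on the REST URL pattern used elsewhere in this section:

// Check whether a cloud partner (e.g., nest) has already been on-boarded (sketch)
function isOnboarded(siteId, providerName, done) {
  var xhr = new XMLHttpRequest();
  var url = '/rest/icontrol/sites/' + siteId +
            '/cloudObjectByProvider?providerName=' + providerName; // assumed query form
  xhr.open('GET', url);
  xhr.onload = function () {
    // If the API returns a list of device instances, the partner is already on-boarded
    var instances = JSON.parse(xhr.responseText || '[]');
    done(instances.length > 0);
  };
  xhr.send();
}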

A token is then generated to pass to the OAuth onboarding API, as a security measure, but the embodiment is not so limited.

https://qacluster-converge.icontrol.com/rest/icontrol/sites/291/generateTokenForCloudObjectOnboarding? providerName=nest

// 200: returns this

<some_token>

// 409: returns this

The account is not provisioned for the operation.

The OAuth login configuration is persisted in the database, thereby eliminating any need to pass it down to the mobile app or cards. The oauthRedirect API is used to onboard, and the server sends the appropriate requests to start the onboarding process.

https://qacluster-converge.icontrol.com/oauth/oauthRedirect/nest?token=<some_token> // nothing returned here but we wait for a redirect

Within the app, the request is redirected to /cui/onboardingcompleted.html, which calls postMessage to let the app know the onboarding was completed (shown below).

<!-- Onboarding Complete -->
<html>
<head>
<script type="text/javascript">
  var queryParams = location.search.slice(1).split('&').map(function(a){ return a.split('='); });
  var partnerId = (queryParams.filter(function(a){ return a[0] == "partner" })[0] || [])[1] || "";
  var accountId = (queryParams.filter(function(a){ return a[0] == "accountId" })[0] || [])[1] || "";
  // We should be in a popup window, so send the parent window a message
  window.opener && window.opener.postMessage(partnerId + "_onboardingcomplete", "*");
</script>
</head>
<body></body>
</html>

Once the “nest_onboardingcomplete” postMessage is called (e.g., Nest thermostat), the app can then display the card.
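On the app side, a minimal listener for this completion message might look as follows; the popup handling and the showCard function are assumptions for illustration:

// Open the OAuth onboarding flow in a popup and wait for the
// "<partner>_onboardingcomplete" message sent by onboardingcompleted.html
window.open('https://qacluster-converge.icontrol.com/oauth/oauthRedirect/nest?token=<some_token>');

window.addEventListener('message', function (e) {
  if (e.data === 'nest_onboardingcomplete') {
    showCard('nest'); // hypothetical app function that displays the Nest card
  }
});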

Integrated Security System

The ICS of an embodiment includes one or more components of an “integrated security system” as described in detail herein. An example of the “integrated security system” is available as one or more of the numerous systems or platforms available from iControl Networks, Inc., Redwood City, Calif. The ICS of an embodiment is coupled to, incorporates, and/or integrates with one or more components of the “integrated security system”.

As but one example, an integrated security system is described herein that integrates broadband and mobile access and control with conventional security systems and premise devices to provide a tri-mode security network (broadband, cellular/GSM, POTS access) that enables users to remotely stay connected to their premises. The integrated security system, while delivering remote premise monitoring and control functionality to conventional monitored premise protection, complements existing premise protection equipment. The integrated security system integrates into the premise network and couples wirelessly with the conventional security panel, enabling broadband access to premise security systems. Automation devices (cameras, lamp modules, thermostats, etc.) can be added, enabling users to remotely see live video and/or pictures and control home devices via their personal web portal or webpage, mobile phone, and/or other remote client device. Users can also receive notifications via email or text message when events occur, or do not occur, in their home.

In accordance with the embodiments described herein, a wireless system (e.g., radio frequency (RF)) is provided that enables a security provider or consumer to extend the capabilities of an existing RF-capable security system or a non-RF-capable security system that has been upgraded to support RF capabilities. The system includes an RF-capable Gateway device (physically located within RF range of the RF-capable security system) and associated software operating on the Gateway device. The system also includes a web server, application server, and remote database providing a persistent store for information related to the system.

The security systems of an embodiment, referred to herein as the iControl security system or integrated security system, extend the value of traditional home security by adding broadband access and the advantages of remote home monitoring and home control through the formation of a security network including components of the integrated security system integrated with a conventional premise security system and a premise local area network (LAN). With the integrated security system, conventional home security sensors, cameras, touchscreen keypads, lighting controls, and/or Internet Protocol (IP) devices in the home (or business) become connected devices that are accessible anywhere in the world from a web browser, mobile phone or through content-enabled touchscreens. The integrated security system experience allows security operators to both extend the value proposition of their monitored security systems and reach new consumers that include broadband users interested in staying connected to their family, home and property when they are away from home.

The integrated security system of an embodiment includes security servers (also referred to herein as iConnect servers or security network servers) and an iHub gateway (also referred to herein as the gateway, the iHub, or the iHub client) that couples or integrates into a home network (e.g., LAN) and communicates directly with the home security panel, in both wired and wireless installations. The security system of an embodiment automatically discovers the security system components (e.g., sensors, etc.) belonging to the security system and connected to a control panel of the security system and provides consumers with full two-way access via web and mobile portals. The gateway supports various wireless protocols and can interconnect with a wide range of control panels offered by security system providers. Service providers and users can then extend the system's capabilities with the additional IP cameras, lighting modules or security devices such as interactive touchscreen keypads. The integrated security system adds an enhanced value to these security systems by enabling consumers to stay connected through email and SMS alerts, photo push, event-based video capture and rule-based monitoring and notifications. This solution extends the reach of home security to households with broadband access.

The integrated security system builds upon the foundation afforded by traditional security systems by layering broadband and mobile access, IP cameras, interactive touchscreens, and an open approach to home automation on top of traditional security system configurations. The integrated security system is easily installed and managed by the security operator, and simplifies the traditional security installation process, as described below.

The integrated security system provides an open systems solution to the home security market. As such, the foundation of the integrated security system customer premises equipment (CPE) approach has been to abstract devices, allowing applications to manipulate and manage multiple devices from any vendor. The integrated security system DeviceConnect technology that enables this capability supports protocols, devices, and panels from GE Security and Honeywell, as well as consumer devices using Z-Wave, IP cameras (e.g., Ethernet, wifi, and Homeplug), and IP touchscreens. DeviceConnect is a device abstraction layer that enables any device or protocol layer to interoperate with integrated security system components. This architecture enables the addition of new devices supporting any of these interfaces, as well as the addition of entirely new protocols.

The benefit of DeviceConnect is that it provides supplier flexibility. The same consistent touchscreen, web, and mobile user experience operates unchanged on whatever security equipment is selected by a security system provider, with the system provider's choice of IP cameras, backend data center and central station software.

The integrated security system provides a complete system that integrates or layers on top of a conventional host security system available from a security system provider. The security system provider therefore can select different components or configurations to offer (e.g., CDMA, GPRS, no cellular, etc.) as well as have iControl modify the integrated security system configuration for the system provider's specific needs (e.g., change the functionality of the web or mobile portal, add a GE or Honeywell-compatible TouchScreen, etc.).

The integrated security system integrates with the security system provider infrastructure for central station reporting directly via Broadband and GPRS alarm transmissions. Traditional dial-up reporting is supported via the standard panel connectivity. Additionally, the integrated security system provides interfaces for advanced functionality to the CMS, including enhanced alarm events, system installation optimizations, system test verification, video verification, 2-way voice over IP and GSM.

The integrated security system is an IP centric system that includes broadband connectivity so that the gateway augments the existing security system with broadband and GPRS connectivity. If broadband is down or unavailable, GPRS may be used, for example. The integrated security system supports GPRS connectivity using an optional wireless package that includes a GPRS modem in the gateway. The integrated security system treats the GPRS connection as a higher cost though flexible option for data transfers. In an embodiment the GPRS connection is only used to route alarm events (e.g., for cost); however, the gateway can be configured (e.g., through the iConnect server interface) to act as a primary channel and pass any or all events over GPRS. Consequently, the integrated security system does not interfere with the current plain old telephone service (POTS) security panel interface. Alarm events can still be routed through POTS; however, the gateway also allows such events to be routed through a broadband or GPRS connection as well. The integrated security system provides a web application interface to the CSR tool suite as well as XML web services interfaces for programmatic integration with the security system provider's existing call center products. The integrated security system includes, for example, APIs that allow the security system provider to integrate components of the integrated security system into a custom call center interface. The APIs include XML web service APIs for integration of existing security system provider call center applications with the integrated security system service. All functionality available in the CSR Web application is provided with these API sets. The Java and XML-based APIs of the integrated security system support provisioning, billing, system administration, CSR, central station, portal user interfaces, and content management functions, to name a few. The integrated security system can provide a customized interface to the security system provider's billing system, or alternatively can provide security system developers with APIs and support in the integration effort.

The integrated security system provides or includes business component interfaces for provisioning, administration, and customer care to name a few. Standard templates and examples are provided with a defined customer professional services engagement to help integrate OSS/BSS systems of a Service Provider with the integrated security system.

The integrated security system components support and allow for the integration of customer account creation and deletion with a security system. The iConnect APIs provide access to the provisioning and account management system in iConnect and provide full support for account creation, provisioning, and deletion. Depending on the requirements of the security system provider, the iConnect APIs can be used to completely customize any aspect of the integrated security system backend operational system.

The integrated security system includes a gateway that supports the following standards-based interfaces, to name a few: Ethernet IP communications via Ethernet ports on the gateway, where standard XML/TCP/IP protocols and ports are employed over secured SSL sessions; USB 2.0 via ports on the gateway; 802.11b/g/n IP communications; GSM/GPRS RF WAN communications; and CDMA 1xRTT RF WAN communications (optional, can also support EVDO and 3G technologies).

The gateway supports the following proprietary interfaces, to name a few: interfaces including Dialog RF network (319.5 MHz) and RS485 Superbus 2000 wired interface; RF mesh network (908 MHz); and interfaces including RF network (345 MHz) and RS485/RS232bus wired interfaces.

Regarding security for the IP communications (e.g., authentication, authorization, encryption, anti-spoofing, etc), the integrated security system uses SSL to encrypt all IP traffic, using server and client-certificates for authentication, as well as authentication in the data sent over the SSL-encrypted channel. For encryption, integrated security system issues public/private key pairs at the time/place of manufacture, and certificates are not stored in any online storage in an embodiment.

The integrated security system does not need any special rules at the customer premise and/or at the security system provider central station because the integrated security system makes outgoing connections using TCP over the standard HTTP and HTTPS ports. Provided outbound TCP connections are allowed, no special firewall requirements are necessary.

FIG. 59 is a block diagram of the integrated security system 100, under an embodiment. The integrated security system 100 of an embodiment includes the gateway 102 and the security servers 104 coupled to the conventional home security system 110. At a customer's home or business, the gateway 102 connects and manages the diverse variety of home security and self-monitoring devices. The gateway 102 communicates with the iConnect Servers 104 located in the service provider's data center 106 (or hosted in integrated security system data center), with the communication taking place via a communication network 108 or other network (e.g., cellular network, internet, etc.). These servers 104 manage the system integrations necessary to deliver the integrated system service described herein. The combination of the gateway 102 and the iConnect servers 104 enable a wide variety of remote client devices 120 (e.g., PCs, mobile phones and PDAs) allowing users to remotely stay in touch with their home, business and family. In addition, the technology allows home security and self-monitoring information, as well as relevant third party content such as traffic and weather, to be presented in intuitive ways within the home, such as on advanced touchscreen keypads.

The integrated security system service (also referred to as iControl service) can be managed by a service provider via browser-based Maintenance and Service Management applications that are provided with the iConnect Servers. Or, if desired, the service can be more tightly integrated with existing OSS/BSS and service delivery systems via the iConnect web services-based XML APIs.

The integrated security system service can also coordinate the sending of alarms to the home security Central Monitoring Station (CMS) 199. Alarms are passed to the CMS 199 using standard protocols such as Contact ID or SIA and can be generated from the home security panel location as well as by iConnect server 104 conditions (such as lack of communications with the integrated security system). In addition, the link between the security servers 104 and CMS 199 provides tighter integration between home security and self-monitoring devices and the gateway 102. Such integration enables advanced security capabilities such as the ability for CMS personnel to view photos taken at the time a burglary alarm was triggered. For maximum security, the gateway 102 and iConnect servers 104 support the use of a mobile network (both GPRS and CDMA options are available) as a backup to the primary broadband connection.

The integrated security system service is delivered by hosted servers running software components that communicate with a variety of client types while interacting with other systems. FIG. 60 is a block diagram of components of the integrated security system 100, under an embodiment. Following is a more detailed description of the components.

The iConnect servers 104 support a diverse collection of clients 120 ranging from mobile devices, to PCs, to in-home security devices, to a service provider's internal systems. Most clients 120 are used by end-users, but there are also a number of clients 120 that are used to operate the service.

Clients 120 used by end-users of the integrated security system 100 include, but are not limited to, the following:

    • Clients based on gateway client applications 202 (e.g., a processor-based device running the gateway technology that manages home security and automation devices).
    • A web browser 204 accessing a Web Portal application, performing end-user configuration and customization of the integrated security system service as well as monitoring of in-home device status, viewing photos and video, etc. Device and user management can also be performed by this portal application.
    • A mobile device 206 (e.g., PDA, mobile phone, etc.) accessing the integrated security system Mobile Portal. This type of client 206 is used by end-users to view system status and perform operations on devices (e.g., turning on a lamp, arming a security panel, etc.) rather than for system configuration tasks such as adding a new device or user.
    • PC or browser-based “widget” containers 208 that present integrated security system service content, as well as other third-party content, in simple, targeted ways (e.g. a widget that resides on a PC desktop and shows live video from a single in-home camera). “Widget” as used herein means applications or programs in the system.
    • Touchscreen home security keypads 208 and advanced in-home devices that present a variety of content widgets via an intuitive touchscreen user interface.
    • Notification recipients 210 (e.g., cell phones that receive SMS-based notifications when certain events occur (or don't occur), email clients that receive an email message with similar information, etc.).
    • Custom-built clients (not shown) that access the iConnect web services XML API to interact with users' home security and self-monitoring information in new and unique ways. Such clients could include new types of mobile devices, or complex applications where integrated security system content is integrated into a broader set of application features.

In addition to the end-user clients, the iConnect servers 104 support PC browser-based Service Management clients that manage the ongoing operation of the overall service. These clients run applications that handle tasks such as provisioning, service monitoring, customer support and reporting.

There are numerous types of server components of the iConnect servers 104 of an embodiment including, but not limited to, the following: Business Components which manage information about all of the home security and self-monitoring devices; End-User Application Components which display that information for users and access the Business Components via published XML APIs; and Service Management Application Components which enable operators to administer the service (these components also access the Business Components via the XML APIs, and also via published SNMP MIBs).

The server components provide access to, and management of, the objects associated with an integrated security system installation. The top-level object is the “network.” It is a location where a gateway 102 is located, and is also commonly referred to as a site or premises; the premises can include any type of structure (e.g., home, office, warehouse, etc.) at which a gateway 102 is located. Users can only access the networks to which they have been granted permission. Within a network, every object monitored by the gateway 102 is called a device. Devices include the sensors, cameras, home security panels and automation devices, as well as the controller or processor-based device running the gateway applications.

Various types of interactions are possible between the objects in a system. Automations define actions that occur as a result of a change in state of a device. For example, take a picture with the front entry camera when the front door sensor changes to “open”. Notifications are messages sent to users to indicate that something has occurred, such as the front door going to “open” state, or has not occurred (referred to as an iWatch notification). Schedules define changes in device states that are to take place at predefined days and times. For example, set the security panel to “Armed” mode every weeknight at 11:00 pm.

The iConnect Business Components are responsible for orchestrating all of the low-level service management activities for the integrated security system service. They define all of the users and devices associated with a network (site), analyze how the devices interact, and trigger associated actions (such as sending notifications to users). All changes in device states are monitored and logged. The Business Components also manage all interactions with external systems as required, including sending alarms and other related self-monitoring data to the home security Central Monitoring System (CMS) 199. The Business Components are implemented as portable Java J2EE Servlets, but are not so limited.

The following iConnect Business Components manage the main elements of the integrated security system service, but the embodiment is not so limited:

    • A Registry Manager 220 defines and manages users and networks. This component is responsible for the creation, modification and termination of users and networks. It is also where a user's access to networks is defined.
    • A Network Manager 222 defines and manages security and self-monitoring devices that are deployed on a network (site). This component handles the creation, modification, deletion and configuration of the devices, as well as the creation of automations, schedules and notification rules associated with those devices.
    • A Data Manager 224 manages access to current and logged state data for an existing network and its devices. This component specifically does not provide any access to network management capabilities, such as adding new devices to a network, which are handled exclusively by the Network Manager 222.
    • To achieve optimal performance for all types of queries, data for current device states is stored separately from historical state data (a.k.a. “logs”) in the database. A Log Data Manager 226 performs ongoing transfers of current device state data to the historical data log tables.

Additional iConnect Business Components handle direct communications with certain clients and other systems, for example:

    • An iHub Manager 228 directly manages all communications with gateway clients, including receiving information about device state changes, changing the configuration of devices, and pushing new versions of the gateway client to the hardware it is running on.
    • A Notification Manager 230 is responsible for sending all notifications to clients via SMS (mobile phone messages), email (via a relay server like an SMTP email server), etc.
    • An Alarm and CMS Manager 232 sends critical server-generated alarm events to the home security Central Monitoring Station (CMS) and manages all other communications of integrated security system service data to and from the CMS.
    • The Element Management System (EMS) 234 is an iControl Business Component that manages all activities associated with service installation, scaling and monitoring, and filters and packages service operations data for use by service management applications. The SNMP MIBs published by the EMS can also be incorporated into any third party monitoring system if desired.

The iConnect Business Components store information about the objects that they manage in the iControl Service Database 240 and in the iControl Content Store 242. The iControl Content Store is used to store media objects like video, photos and widget content, while the Service Database stores information about users, networks, and devices. Database interaction is performed via a JDBC interface. For security purposes, the Business Components manage all data storage and retrieval.

The iControl Business Components provide web services-based APIs that application components use to access the Business Components' capabilities. Functions of application components include presenting integrated security system service data to end-users, performing administrative duties, and integrating with external systems and back-office applications.

The primary published APIs for the iConnect Business Components include, but are not limited to, the following:

    • A Registry Manager API 252 provides access to the Registry Manager Business Component's functionality, allowing management of networks and users.
    • A Network Manager API 254 provides access to the Network Manager Business Component's functionality, allowing management of devices on a network.
    • A Data Manager API 256 provides access to the Data Manager Business Component's functionality, such as setting and retrieving (current and historical) data about device states.
    • A Provisioning API 258 provides a simple way to create new networks and configure initial default properties.

Each API of an embodiment includes two modes of access: Java API or XML API. The XML APIs are published as web services so that they can be easily accessed by applications or servers over a network. The Java APIs are a programmer-friendly wrapper for the XML APIs. Application components and integrations written in Java should generally use the Java APIs rather than the XML APIs directly.

The iConnect Business Components also have an XML-based interface 260 for quickly adding support for new devices to the integrated security system. This interface 260, referred to as DeviceConnect 260, is a flexible, standards-based mechanism for defining the properties of new devices and how they can be managed. Although the format is flexible enough to allow the addition of any type of future device, pre-defined XML profiles are currently available for adding common types of devices such as sensors (SensorConnect), home security panels (PanelConnect) and IP cameras (CameraConnect).

The iConnect End-User Application Components deliver the user interfaces that run on the different types of clients supported by the integrated security system service. The components are written in portable Java J2EE technology (e.g., as Java Servlets, as JavaServer Pages (JSPs), etc.) and they all interact with the iControl Business Components via the published APIs.

The following End-User Application Components generate CSS-based HTML/JavaScript that is displayed on the target client. These applications can be dynamically branded with partner-specific logos and URL links (such as Customer Support, etc.). The End-User Application Components of an embodiment include, but are not limited to, the following:

    • An iControl Activation Application 270 that delivers the first application that a user sees when they set up the integrated security system service. This wizard-based web browser application securely associates a new user with a purchased gateway and the other devices included with it as a kit (if any). It primarily uses functionality published by the Provisioning API.
    • An iControl Web Portal Application 272 runs on PC browsers and delivers the web-based interface to the integrated security system service. This application allows users to manage their networks (e.g. add devices and create automations) as well as to view/change device states, and manage pictures and videos. Because of the wide scope of capabilities of this application, it uses three different Business Component APIs that include the Registry Manager API, Network Manager API, and Data Manager API, but the embodiment is not so limited.
    • An iControl Mobile Portal 274 is a small-footprint web-based interface that runs on mobile phones and PDAs. This interface is optimized for remote viewing of device states and pictures/videos rather than network management. As such, its interaction with the Business Components is primarily via the Data Manager API.
    • Custom portals and targeted client applications can be provided that leverage the same Business Component APIs used by the above applications.
    • A Content Manager Application Component 276 delivers content to a variety of clients. It sends multimedia-rich user interface components to widget container clients (both PC and browser-based), as well as to advanced touchscreen keypad clients. In addition to providing content directly to end-user devices, the Content Manager 276 provides widget-based user interface components to satisfy requests from other Application Components such as the iControl Web 272 and Mobile 274 portals.

A number of Application Components are responsible for overall management of the service. These pre-defined applications, referred to as Service Management Application Components, are configured to offer off-the-shelf solutions for production management of the integrated security system service including provisioning, overall service monitoring, customer support, and reporting, for example. The Service Management Application Components of an embodiment include, but are not limited to, the following:

    • A Service Management Application 280 allows service administrators to perform activities associated with service installation, scaling and monitoring/alerting. This application interacts heavily with the Element Management System (EMS) Business Component to execute its functionality, and also retrieves its monitoring data from that component (e.g., via SNMP and the MIBs published by the EMS).
    • A Kitting Application 282 is used by employees performing service provisioning tasks. This application allows home security and self-monitoring devices to be associated with gateways during the warehouse kitting process.
    • A CSR Application and Report Generator 284 is used by personnel supporting the integrated security system service, such as CSRs resolving end-user issues and employees enquiring about overall service usage. The push of new gateway firmware to deployed gateways is also managed by this application.

The iConnect servers 104 also support custom-built integrations with a service provider's existing OSS/BSS, CSR and service delivery systems 290. Such systems can access the iConnect web services XML API to transfer data to and from the iConnect servers 104. These types of integrations can complement or replace the PC browser-based Service Management applications, depending on service provider needs.

As described above, the integrated security system of an embodiment includes a gateway, or iHub. The gateway of an embodiment includes a device that is deployed in the home or business and couples or connects the various third-party cameras, home security panels, sensors and devices to the iConnect server over a WAN connection as described in detail herein. The gateway couples to the home network and communicates directly with the home security panel in both wired and wireless sensor installations. The gateway is configured to be low-cost, reliable and thin so that it complements the integrated security system network-based architecture.

The gateway supports various wireless protocols and can interconnect with a wide range of home security control panels. Service providers and users can then extend the system's capabilities by adding IP cameras, lighting modules and additional security devices. The gateway is configurable to be integrated into many consumer appliances, including set-top boxes, routers and security panels. The small and efficient footprint of the gateway enables this portability and versatility, thereby simplifying and reducing the overall cost of the deployment.

FIG. 61 is a block diagram of the gateway 102 including gateway software or applications, under an embodiment. The gateway software architecture is relatively thin and efficient, thereby simplifying its integration into other consumer appliances such as set-top boxes, routers, touch screens and security panels. The software architecture also provides a high degree of security against unauthorized access. This section describes the various key components of the gateway software architecture.

The gateway application layer 302 is the main program that orchestrates the operations performed by the gateway. The Security Engine 304 provides robust protection against intentional and unintentional intrusion into the integrated security system network from the outside world (both from inside the premises as well as from the WAN). The Security Engine 304 of an embodiment comprises one or more sub-modules or components that perform functions including, but not limited to, the following:

    • Encryption including 128-bit SSL encryption for gateway and iConnect server communication to protect user data privacy and provide secure communication.
    • Bi-directional authentication between the gateway and iConnect server in order to prevent unauthorized spoofing and attacks. Data sent from the iConnect server to the gateway application (or vice versa) is digitally signed as an additional layer of security. Digital signing provides both authentication and validation that the data has not been altered in transit.
    • Camera SSL encapsulation because picture and video traffic offered by off-the-shelf networked IP cameras is not secure when traveling over the Internet. The gateway provides for 128-bit SSL encapsulation of the user picture and video data sent over the internet for complete user security and privacy.
    • 802.11b/g/n with WPA-2 security to ensure that wireless camera communications always take place using the strongest available protection.
    • A gateway-enabled device is assigned a unique activation key for activation with an iConnect server. This ensures that only valid gateway-enabled devices can be activated for use with the specific instance of iConnect server in use. Attempts to activate gateway-enabled devices by brute force are detected by the Security Engine. Partners deploying gateway-enabled devices have the knowledge that only a gateway with the correct serial number and activation key can be activated for use with an iConnect server. Stolen devices, devices attempting to masquerade as gateway-enabled devices, and malicious outsiders (or knowledgeable but nefarious insiders) cannot affect other customers' gateway-enabled devices.

As standards evolve, as new encryption and authentication methods are proven useful, and as older mechanisms are shown to be breakable, the security manager can be upgraded “over the air” to provide new and better security for communications between the iConnect server and the gateway application, and locally at the premises to remove any risk of eavesdropping on camera communications.
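
As an illustration of the digital signing described in this section, the minimal Java sketch below signs a command payload on the sending side and verifies it on the receiving side. It is a sketch only: the key handling, payload text and class name are hypothetical and do not describe the Security Engine's actual implementation.

    import java.nio.charset.StandardCharsets;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.Signature;

    public class SignedMessageSketch {
        public static void main(String[] args) throws Exception {
            // In a deployed system one side holds the private key and the other the
            // matching public key; a key pair is generated locally here only so the
            // sketch runs end to end.
            KeyPair pair = KeyPairGenerator.getInstance("RSA").generateKeyPair();

            byte[] payload = "setState sensor=frontDoor value=ARMED".getBytes(StandardCharsets.UTF_8);

            // Sending side: sign the outgoing command.
            Signature signer = Signature.getInstance("SHA256withRSA");
            signer.initSign(pair.getPrivate());
            signer.update(payload);
            byte[] signature = signer.sign();

            // Receiving side: verify the signature before acting on the command.
            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(pair.getPublic());
            verifier.update(payload);
            System.out.println("signature valid: " + verifier.verify(signature));
        }
    }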

A Remote Firmware Download module 306 allows for seamless and secure updates to the gateway firmware through the iControl Maintenance Application on the server 104, providing a transparent, hassle-free mechanism for the service provider to deploy new features and bug fixes to the installed user base. The firmware download mechanism is tolerant of connection loss, power interruption and user interventions (both intentional and unintentional). Such robustness reduces down time and customer support issues. Gateway firmware can be remotely downloaded for one gateway at a time, for a group of gateways, or in batches.

The Automations engine 308 manages the user-defined rules of interaction between the different devices (e.g. when door opens turn on the light). Though the automation rules are programmed and reside at the portal/server level, they are cached at the gateway level in order to provide short latency between device triggers and actions.
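
A minimal sketch of the cached trigger-to-action pattern described above follows; the rule names, trigger strings and action handlers are hypothetical and stand in for the rules authored at the portal/server level and cached on the gateway.

    import java.util.List;
    import java.util.Map;
    import java.util.function.Consumer;

    public class AutomationsSketch {
        /** A cached rule: when the named trigger fires, run the named action. */
        record Rule(String trigger, String action) {}

        public static void main(String[] args) {
            // Rules are authored at the server/portal but cached locally so that
            // trigger-to-action latency stays low; all names here are hypothetical.
            List<Rule> cachedRules = List.of(new Rule("frontDoor.open", "hallLight.on"));

            Map<String, Consumer<String>> actions = Map.of(
                "hallLight.on", a -> System.out.println("executing action: " + a));

            String incomingTrigger = "frontDoor.open";  // e.g., reported by a door sensor
            for (Rule rule : cachedRules) {
                if (rule.trigger().equals(incomingTrigger)) {
                    actions.get(rule.action()).accept(rule.action());
                }
            }
        }
    }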

DeviceConnect 310 includes definitions of all supported devices (e.g., cameras, security panels, sensors, etc.) using a standardized plug-in architecture. The DeviceConnect module 310 offers an interface that can be used to quickly add support for any new device as well as enabling interoperability between devices that use different technologies/protocols. For common device types, pre-defined sub-modules have been defined, making supporting new devices of these types even easier. SensorConnect 312 is provided for adding new sensors, CameraConnect 316 for adding IP cameras, and PanelConnect 314 for adding home security panels.

The Schedules engine 318 is responsible for executing the user defined schedules (e.g., take a picture every five minutes; every day at 8 am set temperature to 65 degrees Fahrenheit, etc.). Though the schedules are programmed and reside at the iConnect server level they are sent to the scheduler within the gateway application. The Schedules Engine 318 then interfaces with SensorConnect 312 to ensure that scheduled events occur at precisely the desired time.
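
The following sketch illustrates local execution of a server-defined schedule using a standard Java scheduler; it assumes a hypothetical "take a picture every five minutes" task and does not represent the Schedules engine's actual internals.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class SchedulesSketch {
        public static void main(String[] args) {
            // Schedules are defined at the server but executed locally; the task body
            // is a placeholder. The program runs until externally terminated.
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            Runnable takePicture = () -> System.out.println("capturing camera snapshot");
            scheduler.scheduleAtFixedRate(takePicture, 0, 5, TimeUnit.MINUTES);
        }
    }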

The Device Management module 320 is in charge of all discovery, installation and configuration of both wired and wireless IP devices (e.g., cameras, etc.) coupled or connected to the system. Networked IP devices, such as those used in the integrated security system, require user configuration of many IP and security parameters—to simplify the user experience and reduce the customer support burden, the device management module of an embodiment handles the details of this configuration. The device management module also manages the video routing module described below.

The video routing engine 322 is responsible for delivering seamless video streams to the user with zero-configuration. Through a multi-step, staged approach the video routing engine uses a combination of UPnP port-forwarding, relay server routing and STUN/TURN peer-to-peer routing.
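
The staged routing selection described above can be illustrated by the sketch below, in which each strategy (standing in for UPnP port-forwarding, relay server routing and STUN/TURN peer-to-peer routing) is tried in order until one yields a route; the strategy stubs and relay address are hypothetical.

    import java.util.List;
    import java.util.Optional;
    import java.util.function.Supplier;

    public class VideoRoutingSketch {
        /** Tries each routing strategy in order and returns the first one that succeeds. */
        static Optional<String> selectRoute(List<Supplier<Optional<String>>> strategies) {
            for (Supplier<Optional<String>> strategy : strategies) {
                Optional<String> route = strategy.get();
                if (route.isPresent()) return route;
            }
            return Optional.empty();
        }

        public static void main(String[] args) {
            // Each supplier stands in for a real negotiation step; these stubs are illustrative.
            List<Supplier<Optional<String>>> strategies = List.of(
                () -> Optional.empty(),                         // UPnP mapping unavailable in this sketch
                () -> Optional.of("relay://relay.example.com")  // relay negotiation succeeds
            );
            System.out.println("selected route: " + selectRoute(strategies).orElse("none"));
        }
    }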

FIG. 62 is a block diagram of components of the gateway 102, under an embodiment. Depending on the specific set of functionality desired by the service provider deploying the integrated security system service, the gateway 102 can use any of a number of processors 402, due to the small footprint of the gateway application firmware. In an embodiment, the gateway includes, for example, a Broadcom BCM5354 as the processor. In addition, the gateway 102 includes memory (e.g., FLASH 404, RAM 406, etc.) and any number of input/output (I/O) ports 408.

Referring to the WAN portion 410 of the gateway 102, the gateway 102 of an embodiment can communicate with the iConnect server using a number of communication types and/or protocols, for example Broadband 412, GPRS 414 and/or Public Switched Telephone Network (PSTN) 416 to name a few. In general, broadband communication 412 is the primary means of connection between the gateway 102 and the iConnect server 104, and the GPRS/CDMA 414 and/or PSTN 416 interfaces act as back-up for fault tolerance in case the user's broadband connection fails for whatever reason, but the embodiment is not so limited.

Referring to the LAN portion 420 of the gateway 102, various protocols and physical transceivers can be used to communicate to off-the-shelf sensors and cameras. The gateway 102 is protocol-agnostic and technology-agnostic and as such can easily support almost any device networking protocol. The gateway 102 can, for example, support GE and Honeywell security RF protocols 422, Z-Wave 424, serial (RS232 and RS485) 426 for direct connection to security panels as well as WiFi 428 (802.11b/g) for communication to WiFi cameras.

The integrated security system includes couplings or connections among a variety of IP devices or components, and the device management module is in charge of the discovery, installation and configuration of the IP devices coupled or connected to the system, as described above. The integrated security system of an embodiment uses a “sandbox” network to discover and manage all IP devices coupled or connected as components of the system. The IP devices of an embodiment include wired devices, wireless devices, cameras, interactive touchscreens, and security panels to name a few. These devices can be wired (e.g., via Ethernet cable) or wireless (e.g., WiFi) devices, all of which are secured within the sandbox network. The “sandbox” network is described in detail below.

FIG. 63 is a block diagram 500 of network or premise device integration with a premise network 250, under an embodiment. In an embodiment, network devices 255-257 are coupled to the gateway 102 using a secure network coupling or connection such as SSL over an encrypted 802.11 link (utilizing for example WPA-2 security for the wireless encryption). The network coupling or connection between the gateway 102 and the network devices 255-257 is a private coupling or connection in that it is segregated from any other network couplings or connections. The gateway 102 is coupled to the premise router/firewall 252 via a coupling with a premise LAN 250. The premise router/firewall 252 is coupled to a broadband modem 251, and the broadband modem 251 is coupled to a WAN 200 or other network outside the premise. The gateway 102 thus enables or forms a separate wireless network, or sub-network, that includes some number of devices and is coupled or connected to the LAN 250 of the host premises. The gateway sub-network can include, but is not limited to, any number of other devices like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The gateway 102 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the gateway 102.

FIG. 64 is a block diagram 600 of network or premise device integration with a premise network 250, under an alternative embodiment. The network or premise devices 255-257 are coupled to the gateway 102. The network coupling or connection between the gateway 102 and the network devices 255-257 is a private coupling or connection in that it is segregated from any other network couplings or connections. The gateway 102 is coupled or connected between the premise router/firewall 252 and the broadband modem 251. The broadband modem 251 is coupled to a WAN 200 or other network outside the premise, while the premise router/firewall 252 is coupled to a premise LAN 250. As a result of its location between the broadband modem 251 and the premise router/firewall 252, the gateway 102 can be configured or function as the premise router routing specified data between the outside network (e.g., WAN 200) and the premise router/firewall 252 of the LAN 250. As described above, the gateway 102 in this configuration enables or forms a separate wireless network, or sub-network, that includes the network or premise devices 255-257 and is coupled or connected between the LAN 250 of the host premises and the WAN 200. The gateway sub-network can include, but is not limited to, any number of network or premise devices 255-257 like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The gateway 102 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the gateway 102.

The examples described above with reference to FIGS. 63 and 64 are presented only as examples of IP device integration. The integrated security system is not limited to the type, number and/or combination of IP devices shown and described in these examples, and any type, number and/or combination of IP devices is contemplated within the scope of this disclosure as capable of being integrated with the premise network.

The integrated security system of an embodiment includes a touchscreen (also referred to as the iControl touchscreen or integrated security system touchscreen), as described above, which provides core security keypad functionality, content management and presentation, and embedded systems design. The networked security touchscreen system of an embodiment enables a consumer or security provider to easily and automatically install, configure and manage the security system and touchscreen located at a customer premise. Using this system the customer may access and control the local security system, local IP devices such as cameras, local sensors and control devices (such as lighting controls or pipe freeze sensors), as well as the local security system panel and associated security sensors (such as door/window, motion, and smoke detectors). The customer premise may be a home, business, and/or other location equipped with a wired or wireless broadband IP connection.

The system of an embodiment includes a touchscreen with a configurable software user interface and/or a gateway device (e.g., iHub) that couples or connects to a premise security panel through a wired or wireless connection, and a remote server that provides access to content and information from the premises devices to a user when they are remote from the home. The touchscreen supports broadband and/or WAN wireless connectivity. In this embodiment, the touchscreen incorporates an IP broadband connection (e.g., Wifi radio, Ethernet port, etc.), and/or a cellular radio (e.g., GPRS/GSM, CDMA, WiMax, etc.). The touchscreen described herein can be used as one or more of a security system interface panel and a network user interface (UI) that provides an interface to interact with a network (e.g., LAN, WAN, internet, etc.).

The touchscreen of an embodiment provides an integrated touchscreen and security panel as an all-in-one device. Once integrated using the touchscreen, the touchscreen and a security panel of a premise security system become physically co-located in one device, and the functionality of both may even be co-resident on the same CPU and memory (though this is not required).

The touchscreen of an embodiment also provides an integrated IP video and touchscreen UI. As such, the touchscreen supports one or more standard video CODECs/players (e.g., H.264, Flash Video, MOV, MPEG4, M-JPEG, etc.). The touchscreen UI then provides a mechanism (such as a camera or video widget) to play video. In an embodiment the video is streamed live from an IP video camera. In other embodiments the video comprises video clips or photos sent from an IP camera or from a remote location.

The touchscreen of an embodiment provides a configurable user interface system that includes a configuration supporting use as a security touchscreen. In this embodiment, the touchscreen utilizes a modular user interface that allows components to be modified easily by a service provider, an installer, or even the end user. Examples of such a modular approach include using Flash widgets, HTML-based widgets, or other downloadable code modules such that the user interface of the touchscreen can be updated and modified while the application is running. In an embodiment the touchscreen user interface modules can be downloaded over the internet. For example, a new security configuration widget can be downloaded from a standard web server, and the touchscreen then loads the new widget into memory and inserts it in place of the old security configuration widget. The touchscreen of an embodiment is configured to provide a self-install user interface.

Embodiments of the networked security touchscreen system described herein include a touchscreen device with a user interface that includes a security toolbar providing one or more functions including arm, disarm, panic, medic, and alert. The touchscreen therefore includes at least one screen having a separate region of the screen dedicated to a security toolbar. The security toolbar of an embodiment is present in the dedicated region at all times that the screen is active.

The touchscreen of an embodiment includes a home screen having a separate region of the screen allocated to managing home-based functions. The home-based functions of an embodiment include managing, viewing, and/or controlling IP video cameras. In this embodiment, regions of the home screen are allocated in the form of widget icons; these widget icons (e.g., for cameras, thermostats, lighting, etc.) provide functionality for managing home systems. So, for example, a displayed camera icon, when selected, launches a Camera Widget, and the Camera widget in turn provides access to video from one or more cameras, as well as providing the user with relevant camera controls (take a picture, focus the camera, etc.).

The touchscreen of an embodiment includes a home screen having a separate region of the screen allocated to managing, viewing, and/or controlling internet-based content or applications. For example, the Widget Manager UI presents a region of the home screen (up to and including the entire home screen) where internet widget icons such as weather, sports, etc. may be accessed. Each of these icons may be selected to launch their respective content services.

The touchscreen of an embodiment is integrated into a premise network using the gateway, as described above. The gateway as described herein functions to enable a separate wireless network, or sub-network, that is coupled, connected, or integrated with another network (e.g., WAN, LAN of the host premises, etc.). The sub-network enabled by the gateway optimizes the installation process for IP devices, like the touchscreen, that couple or connect to the sub-network by segregating these IP devices from other such devices on the network. This segregation of the IP devices of the sub-network further enables separate security and privacy policies to be implemented for these IP devices so that, where the IP devices are dedicated to specific functions (e.g., security), the security and privacy policies can be tailored specifically for the specific functions. Furthermore, the gateway and the sub-network it forms enables the segregation of data traffic, resulting in faster and more efficient data flow between components of the host network, components of the sub-network, and between components of the sub-network and components of the network.

The touchscreen of an embodiment includes a core functional embedded system that includes an embedded operating system, required hardware drivers, and an open system interface to name a few. The core functional embedded system can be provided by or as a component of a conventional security system (e.g., security system available from GE Security). These core functional units are used with components of the integrated security system as described herein. Note that portions of the touchscreen description below may include reference to a host premise security system (e.g., GE security system), but these references are included only as an example and do not limit the touchscreen to integration with any particular security system.

As an example, regarding the core functional embedded system, a reduced memory footprint version of embedded Linux forms the core operating system in an embodiment, and provides basic TCP/IP stack and memory management functions, along with a basic set of low-level graphics primitives. A set of device drivers is also provided or included that offer low-level hardware and network interfaces. In addition to the standard drivers, an interface to the RS 485 bus is included that couples or connects to the security system panel (e.g., GE Concord panel). The interface may, for example, implement the Superbus 2000 protocol, which can then be utilized by the more comprehensive transaction-level security functions implemented in PanelConnect technology (e.g., SetAlarmLevel(int level, int partition, char *accessCode)). Power control drivers are also provided.
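
As an illustration of a transaction-level PanelConnect call layered over a low-level bus driver, the sketch below mirrors the SetAlarmLevel(level, partition, accessCode) signature quoted above. The frame layout and interface names are invented for the example and do not describe the Superbus 2000 protocol.

    public class PanelConnectSketch {
        /** Hypothetical low-level bus interface (e.g., an RS-485 driver). */
        interface PanelBus {
            void write(byte[] frame);
        }

        /** Transaction-level call mirroring the SetAlarmLevel(level, partition, accessCode)
            signature mentioned in the text; the frame layout below is purely illustrative. */
        static void setAlarmLevel(PanelBus bus, int level, int partition, String accessCode) {
            byte[] code = accessCode.getBytes(java.nio.charset.StandardCharsets.US_ASCII);
            byte[] frame = new byte[2 + code.length];
            frame[0] = (byte) level;
            frame[1] = (byte) partition;
            System.arraycopy(code, 0, frame, 2, code.length);
            bus.write(frame);
        }

        public static void main(String[] args) {
            // A logging stub stands in for the real panel bus.
            PanelBus loggingBus = frame -> System.out.println("frame bytes: " + frame.length);
            setAlarmLevel(loggingBus, 2, 1, "1234");  // e.g., arm level 2 on partition 1
        }
    }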

FIG. 65 is a block diagram of a touchscreen 700 of the integrated security system, under an embodiment. The touchscreen 700 generally includes an application/presentation layer 702 with a resident application 704, and a core engine 706. The touchscreen 700 also includes one or more of the following, but is not so limited: applications of premium services 710, widgets 712, a caching proxy 714, network security 716, network interface 718, security object 720, applications supporting devices 722, PanelConnect API 724, a gateway interface 726, and one or more ports 728.

More specifically, the touchscreen, when configured as a home security device, includes but is not limited to the following application or software modules: RS 485 and/or RS-232 bus security protocols to conventional home security system panel (e.g., GE Concord panel); functional home security classes and interfaces (e.g. Panel ARM state, Sensor status, etc.); Application/Presentation layer or engine; Resident Application; Consumer Home Security Application; installer home security application; core engine; and System bootloader/Software Updater. The core Application engine and system bootloader can also be used to support other advanced content and applications. This provides a seamless interaction between the premise security application and other optional services such as weather widgets or IP cameras.

An alternative configuration of the touchscreen includes a first Application engine for premise security and a second Application engine for all other applications. The integrated security system application engine supports content standards such as HTML, XML, Flash, etc. and enables a rich consumer experience for all ‘widgets’, whether security-based or not. The touchscreen thus provides service providers the ability to use web content creation and management tools to build and download any ‘widgets’ regardless of their functionality.

As discussed above, although the Security Applications have specific low-level functional requirements in order to interface with the premise security system, these applications make use of the same fundamental application facilities as any other ‘widget’, application facilities that include graphical layout, interactivity, application handoff, screen management, and network interfaces, to name a few.

Content management in the touchscreen provides the ability to leverage conventional web development tools, performance optimized for an embedded system, service provider control of accessible content, content reliability in a consumer device, consistency between ‘widgets’, and a seamless widget operational environment. In an embodiment of the integrated security system, widgets are created by web developers and hosted on the integrated security system Content Manager (and stored in the Content Store database). In this embodiment the server component caches the widgets and offers them to consumers through the web-based integrated security system provisioning system. The servers interact with the advanced touchscreen using HTTPS interfaces controlled by the core engine and dynamically download widgets and updates as needed to be cached on the touchscreen. In other embodiments widgets can be accessed directly over a network such as the Internet without needing to go through the iControl Content Manager.

Referring to FIG. 65, the touchscreen system is built on a tiered architecture, with defined interfaces between the Application/Presentation Layer (the Application Engine) on the top, the Core Engine in the middle, and the security panel and gateway APIs at the lower level. The architecture is configured to provide maximum flexibility and ease of maintenance.

The application engine of the touchscreen provides the presentation and interactivity capabilities for all applications (widgets) that run on the touchscreen, including both core security function widgets and third party content widgets. FIG. 66 is an example screenshot 800 of a networked security touchscreen, under an embodiment. This example screenshot 800 includes three interfaces or user interface (UI) components 802-806, but is not so limited. A first UI 802 of the touchscreen includes icons by which a user controls or accesses functions and/or components of the security system (e.g., “Main”, “Panic”, “Medic”, “Fire”, state of the premise alarm system (e.g., disarmed, armed, etc.), etc.); the first UI 802, which is also referred to herein as a security interface, is always presented on the touchscreen. A second UI 804 of the touchscreen includes icons by which a user selects or interacts with services and other network content (e.g., clock, calendar, weather, stocks, news, sports, photos, maps, music, etc.) that is accessible via the touchscreen. The second UI 804 is also referred to herein as a network interface or content interface. A third UI 806 of the touchscreen includes icons by which a user selects or interacts with additional services or components (e.g., intercom control, security, cameras coupled to the system in particular regions (e.g., front door, baby, etc.)) available via the touchscreen.

A component of the application engine is the Presentation Engine, which includes a set of libraries that implement the standards-based widget content (e.g., XML, HTML, JavaScript, Flash) layout and interactivity. This engine provides the widget with interfaces to dynamically load both graphics and application logic from third parties, and supports high-level data description languages as well as standard graphic formats. The set of web content-based functionality available to a widget developer is extended by specific touchscreen functions implemented as local web services by the Core Engine.

The resident application of the touchscreen is the master service that controls the interaction of all widgets in the system, and enforces the business and security rules required by the service provider. For example, the resident application determines the priority of widgets, thereby enabling a home security widget to override resource requests from a less critical widget (e.g. a weather widget). The resident application also monitors widget behavior, and responds to client or server requests for cache updates.

The core engine of the touchscreen manages interaction with other components of the integrated security system, and provides an interface through which the resident application and authorized widgets can get information about the home security system, set alarms, install sensors, etc. At the lower level, the Core Engine's main interactions are through the PanelConnect API, which handles all communication with the security panel, and the gateway Interface, which handles communication with the gateway. In an embodiment, both the iHub Interface and PanelConnect API are resident and operating on the touchscreen. In another embodiment, the PanelConnect API runs on the gateway or other device that provides security system interaction and is accessed by the touchscreen through a web services interface.

The Core Engine also handles application and service level persistent and cached memory functions, as well as the dynamic provisioning of content and widgets, including but not limited to: flash memory management, local widget and content caching, widget version management (download, cache flush new/old content versions), as well as the caching and synchronization of user preferences. As a portion of these services the Core engine incorporates the bootloader functionality that is responsible for maintaining a consistent software image on the touchscreen, and acts as the client agent for all software updates. The bootloader is configured to ensure full update redundancy so that unsuccessful downloads cannot corrupt the integrated security system.

Video management is provided as a set of web services by the Core Engine. Video management includes the retrieval and playback of local video feeds as well as remote control and management of cameras (all through iControl CameraConnect technology).

Both the high level application layer and the mid-level core engine of the touchscreen can make calls to the network. Any call to the network made by the application layer is automatically handed off to a local caching proxy, which determines whether the request should be handled locally. Many of the requests from the application layer are web services API requests; although such requests could be satisfied by the iControl servers, they are handled directly by the touchscreen and the gateway. Requests that get through the caching proxy are checked against a white list of acceptable sites, and, if they match, are sent off through the network interface to the gateway. Included in the Network Subsystem is a set of network services including HTTP, HTTPS, and server-level authentication functions to manage the secure client-server interface. Storage and management of certificates is incorporated as a part of the network services layer.
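
The white-list check performed by the caching proxy can be illustrated as follows; the host names and the shouldForward() helper are hypothetical and stand in for the provider-controlled white list described above.

    import java.util.Set;

    public class CachingProxySketch {
        private final Set<String> whiteList;

        CachingProxySketch(Set<String> whiteList) { this.whiteList = whiteList; }

        /** Returns true when the request may be forwarded through the network interface
            to the gateway; locally satisfied requests and non-whitelisted hosts are not forwarded. */
        boolean shouldForward(String host, boolean servedLocally) {
            if (servedLocally) return false;   // satisfied by the local cache or gateway
            return whiteList.contains(host);   // only acceptable sites pass through
        }

        public static void main(String[] args) {
            // Host names are placeholders for the provider-controlled white list.
            CachingProxySketch proxy = new CachingProxySketch(Set.of("content.example.com"));
            System.out.println(proxy.shouldForward("content.example.com", false)); // true
            System.out.println(proxy.shouldForward("unknown.example.org", false)); // false
        }
    }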

Server components of the integrated security system servers support interactive content services on the touchscreen. These server components include, but are not limited to the content manager, registry manager, network manager, and global registry, each of which is described herein.

The Content Manager oversees aspects of handling widget data and raw content on the touchscreen. Once created and validated by the service provider, widgets are ‘ingested’ to the Content Manager, and then become available as downloadable services through the integrated security system Content Management APIs. The Content manager maintains versions and timestamp information, and connects to the raw data contained in the backend Content Store database. When a widget is updated (or new content becomes available) all clients registering interest in a widget are systematically updated as needed (a process that can be configured at an account, locale, or system-wide level).

The Registry Manager handles user data and the provisioning of accounts, including information about the widgets the user has decided to install and the user preferences for those widgets.

The Network Manager handles getting and setting state for all devices on the integrated security system network (e.g., sensors, panels, cameras, etc.). The Network manager synchronizes with the gateway, the advanced touchscreen, and the subscriber database.

The Global Registry is a primary starting point server for all client services, and is a logical referral service that abstracts specific server locations/addresses from clients (touchscreen, gateway 102, desktop widgets, etc.). This approach enables easy scaling/migration of server farms.

The touchscreen of an embodiment operates wirelessly with a premise security system. The touchscreen of an embodiment incorporates an RF transceiver component that either communicates directly with the sensors and/or security panel over the panel's proprietary RF frequency, or the touchscreen communicates wirelessly to the gateway over 802.11, Ethernet, or other IP-based communications channel, as described in detail herein. In the latter case the gateway implements the PanelConnect interface and communicates directly to the security panel and/or sensors over wireless or wired networks as described in detail above.

The touchscreen of an embodiment is configured to operate with multiple security systems through the use of an abstracted security system interface. In this embodiment, the PanelConnect API can be configured to support a plurality of proprietary security system interfaces, either simultaneously or individually as described herein. In one embodiment of this approach, the touchscreen incorporates multiple physical interfaces to security panels (e.g. GE Security RS-485, Honeywell RF, etc.) in addition to the PanelConnect API implemented to support multiple security interfaces. The change needed to support this in PanelConnect is a configuration parameter specifying the panel type connection that is being utilized.

So for example, the setARMState( ) function is called with an additional parameter (e.g., Armstate=setARMState(type=“ARM STAY| ARM AWAY| DISARM”, Parameters=“ExitDelay=30 |Lights=OFF”, panelType=“GE Concord4 RS485”)). The ‘panelType’ parameter is used by the setARMState function (and in practice by all of the PanelConnect functions) to select an algorithm appropriate to the specific panel out of a plurality of algorithms.
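
The per-panel algorithm selection described above can be sketched as a simple dispatch on the panelType parameter; the two panel entries and their return values are illustrative only and do not represent the actual PanelConnect implementation.

    import java.util.Map;

    public class PanelTypeDispatchSketch {
        /** One algorithm per supported panel interface. */
        interface ArmAlgorithm {
            String setArmState(String type, String parameters);
        }

        public static void main(String[] args) {
            // Hypothetical dispatch table keyed by panelType; the lambdas stand in
            // for panel-specific protocol handling.
            Map<String, ArmAlgorithm> algorithms = Map.of(
                "GE Concord4 RS485", (type, params) -> "concord:" + type,
                "Honeywell Vista 20P", (type, params) -> "vista:" + type);

            String panelType = "GE Concord4 RS485";
            String armState = algorithms.get(panelType)
                    .setArmState("ARM STAY", "ExitDelay=30|Lights=OFF");
            System.out.println(armState);
        }
    }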

The touchscreen of an embodiment is self-installable. Consequently, the touchscreen provides a ‘wizard’ approach similar to that used in traditional computer installations (e.g. InstallShield). The wizard can be resident on the touchscreen, accessible through a web interface, or both. In one embodiment of a touchscreen self-installation process, the service provider can associate devices (sensors, touchscreens, security panels, lighting controls, etc.) remotely using a web-based administrator interface.

The touchscreen of an embodiment includes a battery backup system for a security touchscreen. The touchscreen incorporates a standard Li-ion or other battery and charging circuitry to allow continued operation in the event of a power outage. In an embodiment the battery is physically located and connected within the touchscreen enclosure. In another embodiment the battery is located as a part of the power transformer, or in between the power transformer and the touchscreen.

The example configurations of the integrated security system described above with reference to FIGS. 63 and 64 include a gateway that is a separate device, and the touchscreen couples to the gateway. However, in an alternative embodiment, the gateway device and its functionality can be incorporated into the touchscreen so that the device management module, which is now a component of or included in the touchscreen, is in charge of the discovery, installation and configuration of the IP devices coupled or connected to the system, as described above. The integrated security system with the integrated touchscreen/gateway uses the same “sandbox” network to discover and manage all IP devices coupled or connected as components of the system.

The touchscreen of this alternative embodiment integrates the components of the gateway with the components of the touchscreen as described herein. More specifically, the touchscreen of this alternative embodiment includes software or applications described above with reference to FIG. 61. In this alternative embodiment, the touchscreen includes the gateway application layer 302 as the main program that orchestrates the operations performed by the gateway. A Security Engine 304 of the touchscreen provides robust protection against intentional and unintentional intrusion into the integrated security system network from the outside world (both from inside the premises as well as from the WAN). The Security Engine 304 of an embodiment comprises one or more sub-modules or components that perform functions including, but not limited to, the following:

    • Encryption including 128-bit SSL encryption for gateway and iConnect server communication to protect user data privacy and provide secure communication.
    • Bi-directional authentication between the touchscreen and iConnect server in order to prevent unauthorized spoofing and attacks. Data sent from the iConnect server to the gateway application (or vice versa) is digitally signed as an additional layer of security. Digital signing provides both authentication and validation that the data has not been altered in transit.
    • Camera SSL encapsulation because picture and video traffic offered by off-the-shelf networked IP cameras is not secure when traveling over the Internet. The touchscreen provides for 128-bit SSL encapsulation of the user picture and video data sent over the internet for complete user security and privacy.
    • 802.11b/g/n with WPA-2 security to ensure that wireless camera communications always take place using the strongest available protection.
    • A touchscreen-enabled device is assigned a unique activation key for activation with an iConnect server. This ensures that only valid touchscreen-enabled devices can be activated for use with the specific instance of iConnect server in use. Attempts to activate touchscreen-enabled devices by brute force are detected by the Security Engine. Partners deploying touchscreen-enabled devices have the knowledge that only a touchscreen with the correct serial number and activation key can be activated for use with an iConnect server. Stolen devices, devices attempting to masquerade as touchscreen-enabled devices, and malicious outsiders (or knowledgeable but nefarious insiders) cannot affect other customers' touchscreen-enabled devices.

As standards evolve, as new encryption and authentication methods are proven useful, and as older mechanisms are shown to be breakable, the security manager can be upgraded “over the air” to provide new and better security for communications between the iConnect server and the gateway application, and locally at the premises to remove any risk of eavesdropping on camera communications.

A Remote Firmware Download module 306 of the touchscreen allows for seamless and secure updates to the gateway firmware through the iControl Maintenance Application on the server 104, providing a transparent, hassle-free mechanism for the service provider to deploy new features and bug fixes to the installed user base. The firmware download mechanism is tolerant of connection loss, power interruption and user interventions (both intentional and unintentional). Such robustness reduces down time and customer support issues. Touchscreen firmware can be remotely downloaded for one touchscreen at a time, for a group of touchscreens, or in batches.

The Automations engine 308 of the touchscreen manages the user-defined rules of interaction between the different devices (e.g. when door opens turn on the light). Though the automation rules are programmed and reside at the portal/server level, they are cached at the gateway level in order to provide short latency between device triggers and actions.

DeviceConnect 310 of the touchscreen includes definitions of all supported devices (e.g., cameras, security panels, sensors, etc.) using a standardized plug-in architecture. The DeviceConnect module 310 offers an interface that can be used to quickly add support for any new device as well as enabling interoperability between devices that use different technologies/protocols. For common device types, pre-defined sub-modules have been defined, making supporting new devices of these types even easier. SensorConnect 312 is provided for adding new sensors, CameraConnect 316 for adding IP cameras, and PanelConnect 314 for adding home security panels.

The Schedules engine 318 of the touchscreen is responsible for executing the user defined schedules (e.g., take a picture every five minutes; every day at 8 am set temperature to 65 degrees Fahrenheit, etc.). Though the schedules are programmed and reside at the iConnect server level they are sent to the scheduler within the gateway application of the touchscreen. The Schedules Engine 318 then interfaces with SensorConnect 312 to ensure that scheduled events occur at precisely the desired time.

The Device Management module 320 of the touchscreen is in charge of all discovery, installation and configuration of both wired and wireless IP devices (e.g., cameras, etc.) coupled or connected to the system. Networked IP devices, such as those used in the integrated security system, require user configuration of many IP and security parameters, and the device management module of an embodiment handles the details of this configuration. The device management module also manages the video routing module described below.

The video routing engine 322 of the touchscreen is responsible for delivering seamless video streams to the user with zero-configuration. Through a multi-step, staged approach the video routing engine uses a combination of UPnP port-forwarding, relay server routing and STUN/TURN peer-to-peer routing. The video routing engine is described in detail in the Related Applications.

FIG. 67 is a block diagram 900 of network or premise device integration with a premise network 250, under an embodiment. In an embodiment, network devices 255, 256, 957 are coupled to the touchscreen 902 using a secure network connection such as SSL over an encrypted 802.11 link (utilizing for example WPA-2 security for the wireless encryption), and the touchscreen 902 coupled to the premise router/firewall 252 via a coupling with a premise LAN 250. The premise router/firewall 252 is coupled to a broadband modem 251, and the broadband modem 251 is coupled to a WAN 200 or other network outside the premise. The touchscreen 902 thus enables or forms a separate wireless network, or sub-network, that includes some number of devices and is coupled or connected to the LAN 250 of the host premises. The touchscreen sub-network can include, but is not limited to, any number of other devices like WiFi IP cameras, security panels (e.g., IP-enabled), and IP devices, to name a few. The touchscreen 902 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the touchscreen 902.

FIG. 68 is a block diagram 1000 of network or premise device integration with a premise network 250, under an alternative embodiment. The network or premise devices 255, 256, 1057 are coupled to the touchscreen 1002, and the touchscreen 1002 is coupled or connected between the premise router/firewall 252 and the broadband modem 251. The broadband modem 251 is coupled to a WAN 200 or other network outside the premise, while the premise router/firewall 252 is coupled to a premise LAN 250. As a result of its location between the broadband modem 251 and the premise router/firewall 252, the touchscreen 1002 can be configured or function as the premise router routing specified data between the outside network (e.g., WAN 200) and the premise router/firewall 252 of the LAN 250. As described above, the touchscreen 1002 in this configuration enables or forms a separate wireless network, or sub-network, that includes the network or premise devices 255, 256, 1057 and is coupled or connected between the LAN 250 of the host premises and the WAN 200. The touchscreen sub-network can include, but is not limited to, any number of network or premise devices 255, 256, 1057 like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The touchscreen 1002 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the touchscreen 1002.

The gateway of an embodiment, whether a stand-alone component or integrated with a touchscreen, enables couplings or connections and thus the flow or integration of information between various components of the host premises and various types and/or combinations of IP devices, where the components of the host premises include a network (e.g., LAN) and/or a security system or subsystem to name a few. Consequently, the gateway controls the association between and the flow of information or data between the components of the host premises. For example, the gateway of an embodiment forms a sub-network coupled to another network (e.g., WAN, LAN, etc.), with the sub-network including IP devices. The gateway further enables the association of the IP devices of the sub-network with appropriate systems on the premises (e.g., security system, etc.). Therefore, for example, the gateway can form a sub-network of IP devices configured for security functions, and associate the sub-network only with the premises security system, thereby segregating the IP devices dedicated to security from other IP devices that may be coupled to another network on the premises.

FIG. 69 is a flow diagram for a method 1100 of forming a security network including integrated security system components, under an embodiment. Generally, the method comprises coupling 1102 a gateway comprising a connection management component to a local area network in a first location and a security server in a second location. The method comprises forming 1104 a security network by automatically establishing a wireless coupling between the gateway and a security system using the connection management component. The security system of an embodiment comprises security system components located at the first location. The method comprises integrating 1106 communications and functions of the security system components into the security network via the wireless coupling.

FIG. 70 is a flow diagram for a method 1200 of forming a security network including integrated security system components and network devices, under an embodiment. Generally, the method comprises coupling 1202 a gateway to a local area network located in a first location and a security server in a second location. The method comprises automatically establishing 1204 communications between the gateway and security system components at the first location, the security system including the security system components. The method comprises automatically establishing 1206 communications between the gateway and premise devices at the first location. The method comprises forming 1208 a security network by electronically integrating, via the gateway, communications and functions of the premise devices and the security system components.

In an example embodiment, FIG. 71 is a flow diagram 1300 for integration or installation of an IP device into a private network environment, under an embodiment. The IP device includes any IP-capable device that, for example, includes the touchscreen of an embodiment. The variables of an embodiment set at time of installation include, but are not limited to, one or more of a private SSID/Password, a gateway identifier, a security panel identifier, a user account TS, and a Central Monitoring Station account identification.

An embodiment of the IP device discovery and management begins with a user or installer activating 1302 the gateway and initiating 1304 the install mode of the system. This places the gateway in an install mode. Once in install mode, the gateway shifts to a default (Install) Wifi configuration. This setting will match the default setting for other integrated security system-enabled devices that have been pre-configured to work with the integrated security system. The gateway will then begin to provide 1306 DHCP addresses for these IP devices. Once the devices have acquired a new DHCP address from the gateway, those devices are available for configuration into a new secured Wifi network setting.

The user or installer of the system selects 1308 all devices that have been identified as available for inclusion into the integrated security system. The user may select these devices by their unique IDs via a web page, Touchscreen, or other client interface. The gateway provides 1310 data as appropriate to the devices. Once selected, the devices are configured 1312 with appropriate secured Wifi settings, including SSID and WPA/WPA-2 keys that are used once the gateway switches back to the secured sandbox configuration from the “Install” settings. Other settings are also configured as appropriate for that type of device. Once all devices have been configured, the user is notified and the user can exit install mode. At this point all devices will have been registered 1314 with the integrated security system servers.

The installer switches 1316 the gateway to an operational mode, and the gateway instructs or directs 1318 all newly configured devices to switch to the “secured” Wifi sandbox settings. The gateway then switches 1320 to the “secured” Wifi settings. Once the devices identify that the gateway is active on the “secured” network, they request new DHCP addresses from the gateway which, in response, provides 1322 the new addresses. The devices with the new addresses are then operational 1324 on the secured network.

In order to ensure the highest level of security on the secured network, the gateway can create or generate a dynamic network security configuration based on the unique ID and private key in the gateway, coupled with a randomizing factor that can be based on online time or other inputs. This guarantees the uniqueness of the gateway secured network configuration.
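
One way such a dynamic configuration could be derived, shown purely as an assumption-laden sketch, is to feed the gateway's unique ID, private key and a randomizing factor through an HMAC and slice the digest into an SSID and passphrase. The HMAC derivation, value formats and class name are assumptions of this sketch and are not taken from the embodiment.

    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;
    import java.nio.charset.StandardCharsets;
    import java.util.HexFormat;

    public class SandboxConfigSketch {
        public static void main(String[] args) throws Exception {
            // Inputs: the gateway's unique ID and private key plus a randomizing
            // factor (here, System.nanoTime()); all values below are placeholders.
            String gatewayId = "IHUB-000123";
            byte[] privateKey = "example-private-key".getBytes(StandardCharsets.UTF_8);
            long randomizer = System.nanoTime();

            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(privateKey, "HmacSHA256"));
            byte[] digest = mac.doFinal((gatewayId + ":" + randomizer).getBytes(StandardCharsets.UTF_8));

            // Slice the digest into a network name and a WPA-2 passphrase.
            String hex = HexFormat.of().formatHex(digest);
            System.out.println("sandbox SSID: icontrol-" + hex.substring(0, 8));
            System.out.println("WPA-2 passphrase: " + hex.substring(8, 40));
        }
    }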

To enable the highest level of performance, the gateway analyzes the RF spectrum of the 802.11x network and determines which frequency band/channel it should select to run.

An alternative embodiment of the camera/IP device management process leverages the local ethernet connection of the sandbox network on the gateway. This alternative process is similar to the Wifi discovery embodiment described above, except the user connects the targeted device to the ethernet port of the sandbox network to begin the process. This alternative embodiment accommodates devices that have not been pre-configured with the default “Install” configuration for the integrated security system.

This alternative embodiment of the IP device discovery and management begins with the user/installer placing the system into install mode. The user is instructed to attach an IP device to be installed to the sandbox Ethernet port of the gateway. The IP device requests a DHCP address from the gateway which, in response to the request, provides the address. The user is presented the device and is asked if he/she wants to install the device. If yes, the system configures the device with the secured Wifi settings and other device-specific settings (e.g., camera settings for video length, image quality etc.). The user is next instructed to disconnect the device from the ethernet port. The device is now available for use on the secured sandbox network.

FIG. 72 is a block diagram showing communications among integrated IP devices of the private network environment, under an embodiment. The IP devices of this example include a security touchscreen 1403, gateway 1402 (e.g., “iHub”), and security panel (e.g., “Security Panel 1”, “Security Panel 2”, “Security Panel n”), but the embodiment is not so limited. In alternative embodiments any number and/or combination of these three primary component types may be combined with other components including IP devices and/or security system components. For example, a single device that comprises an integrated gateway, touchscreen, and security panel is merely another embodiment of the integrated security system described herein. The description that follows includes an example configuration that includes a touchscreen hosting particular applications. However, the embodiment is not limited to the touchscreen hosting these applications, and the touchscreen should be thought of as representing any IP device.

Referring to FIG. 72, the touchscreen 1403 incorporates an application 1410 that is implemented as computer code resident on the touchscreen operating system, or as a web-based application running in a browser, or as another type of scripted application (e.g., Flash, Java, Visual Basic, etc.). The touchscreen core application 1410 represents this application, providing the user interface and logic for the end user to manage their security system or to gain access to networked information or content (Widgets). The touchscreen core application 1410 in turn accesses a library or libraries of functions to control the local hardware (e.g., screen display, sound, LEDs, memory, etc.), as well as specialized libraries to couple or connect to the security system.

In an embodiment of this security system connection, the touchscreen 1403 communicates with the gateway 1402, and has no direct communication with the security panel. In this embodiment, the touchscreen core application 1410 accesses the remote service APIs 1412 which provide security system functionality (e.g., ARM/DISARM panel, sensor state, get/set panel configuration parameters, initiate or get alarm events, etc.). In an embodiment, the remote service APIs 1412 implement one or more of the following functions, but the embodiment is not so limited: Armstate=setARMState(type=“ARM STAY|ARM AWAY|DISARM”, Parameters=“ExitDelay=30|Lights=OFF”); sensorState=getSensors(type=“ALL|SensorName|SensorNameList”); result=setSensorState(SensorName, parameters=“Option1, Option2, . . . Option n”); interruptHandler=SensorEvent( ); and interruptHandler=alarmEvent( ).
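For illustration, the remote service API calls listed above could be wrapped as follows; the Python class, the transport behind gateway_link, and the return types are hypothetical, while the underlying function names and parameters follow the text.

```python
# Illustrative wrapper around the remote service APIs 1412; the wrapper class,
# transport object, and return values are assumptions for illustration only.

class RemoteServiceAPI:
    def __init__(self, gateway_link):
        self.gateway_link = gateway_link  # e.g., HTTP or socket connection to the gateway

    def set_arm_state(self, arm_type="ARM AWAY", exit_delay=30, lights_off=True):
        params = {"ExitDelay": exit_delay, "Lights": "OFF" if lights_off else "ON"}
        return self.gateway_link.call("setARMState", type=arm_type, parameters=params)

    def get_sensors(self, sensor_filter="ALL"):
        return self.gateway_link.call("getSensors", type=sensor_filter)

    def set_sensor_state(self, sensor_name, *options):
        return self.gateway_link.call("setSensorState", sensor_name,
                                      parameters=list(options))

    def on_sensor_event(self, handler):
        # Register interrupt-style callbacks for sensor and alarm events.
        self.gateway_link.subscribe("SensorEvent", handler)
        self.gateway_link.subscribe("alarmEvent", handler)
```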

Functions of the remote service APIs 1412 of an embodiment use a remote PanelConnect API 1424 which resides in memory on the gateway 1402. The touchscreen 1403 communicates with the gateway 1402 through a suitable network interface such as an Ethernet or 802.11 RF connection, for example. The remote PanelConnect API 1424 provides the underlying Security System Interfaces 1426 used to communicate with and control one or more types of security panel via wired link 1430 and/or RF link 3. The PanelConnect API 1424 provides responses and input to the remote service APIs 1412, and in turn translates function calls and data to and from the specific protocols and functions supported by a specific implementation of a security panel (e.g., a GE Security Simon XT or Honeywell Vista 20P). In an embodiment, the PanelConnect API 1424 uses a 345 MHz RF transceiver or receiver hardware/firmware module to communicate wirelessly with the security panel and directly with a set of 345 MHz RF-enabled sensors and devices, but the embodiment is not so limited.

The gateway of an alternative embodiment communicates over a wired physical coupling or connection to the security panel using the panel's specific wired hardware (bus) interface and the panel's bus-level protocol.

In an alternative embodiment, the Touchscreen 1403 implements the same PanelConnect API 1414 locally on the Touchscreen 1403, communicating directly with the Security Panel 2 and/or Sensors 2 over the proprietary RF link or over a wired link for that system. In this embodiment the Touchscreen 1403, instead of the gateway 1402, incorporates the 345 MHz RF transceiver to communicate directly with Security Panel 2 or Sensors 2 over the RF link 2. In the case of a wired link, the Touchscreen 1403 incorporates the real-time hardware (e.g., a PIC chip and RS232-variant serial link) to physically connect to and satisfy the specific bus-level timing requirements of the Security Panel 2.

In yet another alternative embodiment, either the gateway 1402 or the Touchscreen 1403 implements the remote service APIs. This embodiment includes a Cricket device (“Cricket”) which comprises but is not limited to the following components: a processor (suitable for handling 802.11 protocols and processing, as well as the bus timing requirements of SecurityPanel1); an 802.11 (WiFi) client IP interface chip; and, a serial bus interface chip that implements variants of RS232 or RS485, depending on the specific Security Panel.

The Cricket also implements the full PanelConnect APIs such that it can perform the same functions as the case where the gateway implements the PanelConnect APIs. In this embodiment, the touchscreen core application 1410 calls functions in the remote service APIs 1412 (such as setArmState( )). These functions in turn couple or connect to the remote Cricket through a standard IP connection (“Cricket IP Link”) (e.g., Ethernet, Homeplug, the gateway's proprietary Wifi network, etc.). The Cricket in turn implements the PanelConnect API, which responds to the request from the touchscreen core application and performs the appropriate function using the proprietary panel interface. This interface uses either the wireless or wired proprietary protocol for the specific security panel and/or sensors.

FIG. 73 is a flow diagram of a method of integrating an external control and management application system with an existing security system, under an embodiment. Operations begin when the system is powered on 1510, involving at a minimum the power-on of the gateway device, and optionally the power-on of the connection between the gateway device and the remote servers. The gateway device initiates 1520 a software and RF sequence to locate the extant security system. The gateway and installer initiate and complete 1530 a sequence to ‘learn’ the gateway into the security system as a valid and authorized control device. The gateway initiates 1540 another software and RF sequence of instructions to discover and learn the existence and capabilities of existing RF devices within the extant security system, and store this information in the system. These operations under the system of an embodiment are described in further detail below.

Unlike conventional systems that extend an existing security system, the system of an embodiment operates utilizing the proprietary wireless protocols of the security system manufacturer. In one illustrative embodiment, the gateway is an embedded computer with an IP LAN and WAN connection and a plurality of RF transceivers and software protocol modules capable of communicating with a plurality of security systems each with a potentially different RF and software protocol interface. After the gateway has completed the discovery and learning 1540 of sensors and has been integrated 1550 as a virtual control device in the extant security system, the system becomes operational. Thus, the security system and associated sensors are presented 1550 as accessible devices to a potential plurality of user interface subsystems.

The system of an embodiment integrates 1560 the functionality of the extant security system with other non-security devices, including but not limited to IP cameras, touchscreens, lighting controls, and door locking mechanisms, which may be controlled via RF, wired, or powerline-based networking mechanisms supported by the gateway or servers.

The system of an embodiment provides a user interface subsystem 1570 enabling a user to monitor, manage, and control the system and associated sensors and security systems. In an embodiment of the system, a user interface subsystem is an HTML/XML/Javascript/Java/AJAX/Flash presentation of a monitoring and control application, enabling users to view the state of all sensors and controllers in the extant security system from a web browser or equivalent operating on a computer, PDA, mobile phone, or other consumer device.

In another illustrative embodiment of the system described herein, a user interface subsystem is an HTML/XML/Javascript/Java/AJAX presentation of a monitoring and control application, enabling users to combine the monitoring and control of the extant security system and sensors with the monitoring and control of non-security devices including but not limited to IP cameras, touchscreens, lighting controls, door locking mechanisms.

In another illustrative embodiment of the system described herein, a user interface subsystem is a mobile phone application enabling users to monitor and control the extant security system as well as other non-security devices.

In another illustrative embodiment of the system described herein, a user interface subsystem is an application running on a keypad or touchscreen device enabling users to monitor and control the extant security system as well as other non-security devices.

In another illustrative embodiment of the system described herein, a user interface subsystem is an application operating on a TV or set-top box connected to a TV enabling users to monitor and control the extant security system as well as other non-security devices.

FIG. 74 is a block diagram of an integrated security system 1600 wirelessly interfacing to proprietary security systems, under an embodiment. A security system 1610 is coupled or connected to a Gateway 1620, and Gateway 1620 is in turn coupled or connected to a plurality of information and content sources across a network 1630, including one or more web servers 1640, system databases 1650, and application servers 1660. While in one embodiment network 1630 is the Internet, including the World Wide Web, those of skill in the art will appreciate that network 1630 may be any type of network, such as an intranet, an extranet, a virtual private network (VPN), a mobile network, or a non-TCP/IP based network.

Moreover, other elements of the system of an embodiment may be conventional, well-known elements that need not be explained in detail herein. For example, security system 1610 could be any type of home or business security system, including but not limited to a standalone RF home security system or a non-RF-capable wired home security system with an add-on RF interface module. In the integrated security system 1600 of this example, security system 1610 includes an RF-capable wireless security panel (WSP) 1611 that acts as the master controller for security system 1610. Well-known examples of such a WSP include the GE Security Concord, Networx, and Simon panels, the Honeywell Vista and Lynx panels, and similar panels from DSC and Napco, to name a few. A wireless module 1614 includes the RF hardware and protocol software necessary to enable communication with and control of a plurality of wireless devices 1613. WSP 1611 may also manage wired devices 1614 physically connected to WSP 1611 with an RS232 or RS485 or Ethernet connection or similar such wired interface.

In an implementation consistent with the systems and methods described herein, Gateway 1620 provides the interface between security system 1610 and LAN and/or WAN for purposes of remote control, monitoring, and management. Gateway 1620 communicates with an external web server 1640, database 1650, and application server 1660 over network 1630 (which may comprise WAN, LAN, or a combination thereof). In this example system, application logic, remote user interface functionality, as well as user state and account are managed by the combination of these remote servers. Gateway 1620 includes server connection manager 1621, a software interface module responsible for all server communication over network 1630. Event manager 1622 implements the main event loop for Gateway 1620, processing events received from device manager 1624 (communicating with non-security system devices including but not limited to IP cameras, wireless thermostats, or remote door locks). Event manager 1622 further processes events and control messages from and to security system 1610 by utilizing WSP manager 1623.

WSP manager 1623 and device manager 1624 both rely upon wireless protocol manager 1626, which receives and stores the proprietary or standards-based protocols required to support security system 1610 as well as any other devices interfacing with gateway 1620. WSP manager 1623 further utilizes the comprehensive protocols and interface algorithms for a plurality of security systems 1610 stored in the WSP DB client database associated with wireless protocol manager 1626. These various components implement the software logic and protocols necessary to communicate with and manage devices and security systems 1610. Wireless transceiver hardware modules 1625 are then used to implement the physical RF communications link to such devices and security systems 1610. An illustrative wireless transceiver 1625 is the GE Security Dialog circuit board, implementing a 319.5 MHz two-way RF transceiver module. In this example, RF Link 1670 represents the 319.5 MHz RF communication link, enabling gateway 1620 to monitor and control WSP 1611 and associated wireless and wired devices 1613 and 1614, respectively.

In one embodiment, server connection manager 1621 requests and receives a set of wireless protocols for a specific security system 1610 (an illustrative example being that of the GE Security Concord panel and sensors) and stores them in the WSP DB portion of the wireless protocol manager 1626. WSP manager 1623 then utilizes such protocols from wireless protocol manager 1626 to initiate the sequence of processes detailed in FIG. 15 and FIG. 16 for learning gateway 1620 into security system 1610 as an authorized control device. Once learned in, as described with reference to FIG. 16 (and above), event manager 1622 processes all events and messages detected by the combination of WSP manager 1623 and the GE Security wireless transceiver module 1625.

In another embodiment, gateway 1620 incorporates a plurality of wireless transceivers 1625 and associated protocols managed by wireless protocol manager 1626. In this embodiment events and control of multiple heterogeneous devices may be coordinated with WSP 1611, wireless devices 1613, and wired devices 1614. For example a wireless sensor from one manufacturer may be utilized to control a device using a different protocol from a different manufacturer.

In another embodiment, gateway 1620 incorporates a wired interface to security system 1610, and incorporates a plurality of wireless transceivers 1625 and associated protocols managed by wireless protocol manager 1626. In this embodiment events and control of multiple heterogeneous devices may be coordinated with WSP 1611, wireless devices 1613, and wired devices 1614.

Of course, while an illustrative embodiment of an architecture of the system of an embodiment is described in detail herein with respect to FIG. 74, one of skill in the art will understand that modifications to this architecture may be made without departing from the scope of the description presented herein. For example, the functionality described herein may be allocated differently between client and server, or amongst different server or processor-based components. Likewise, the entire functionality of the gateway 1620 described herein could be integrated completely within an existing security system 1610. In such an embodiment, the architecture could be directly integrated with a security system 1610 in a manner consistent with the currently described embodiments.

FIG. 75 is a flow diagram for wirelessly ‘learning’ the Gateway into an existing security system and discovering extant sensors, under an embodiment. The learning interfaces gateway 1620 with security system 1610. Gateway 1620 powers up 1710 and initiates software sequences 1720 and 1725 to identify accessible WSPs 1611 and wireless devices 1613, respectively (e.g., one or more WSPs and/or devices within range of gateway 1620). Once identified, WSP 1611 is manually or automatically set into ‘learn mode’ 1730, and gateway 1620 utilizes available protocols to add 1740 itself as an authorized control device in security system 1610. Upon successful completion of this task, WSP 1611 is manually or automatically removed from ‘learn mode’ 1750.

Gateway 1620 utilizes the appropriate protocols to mimic 1760 the first identified device 1614. In this operation gateway 1620 identifies itself using the unique or pseudo-unique identifier of the first found device 1614, and sends an appropriate change of state message over RF Link 1670. In the event that WSP 1611 responds to this change of state message, the device 1614 is then added 1770 to the system in database 1650. Gateway 1620 associates 1780 any other information (such as zone name or token-based identifier) with this device 1614 in database 1650, enabling gateway 1620, user interface modules, or any application to retrieve this associated information.

In the event that WSP 1611 does not respond to the change of state message, the device 1614 is not added 1770 to the system in database 1650, and this device 1614 is identified as not being a part of security system 1610 with a flag, and is either ignored or added as an independent device, at the discretion of the system provisioning rules. Operations hereunder repeat 1785 operations 1760, 1770, 1780 for all devices 1614 if applicable. Once all devices 1614 have been tested in this way, the system begins operation 1790.
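A condensed sketch of this discovery loop (operations 1760-1785) is shown below; the gateway, WSP, database, and provisioning-rule interfaces are hypothetical stand-ins for the components described above.

```python
# Sketch of the discovery loop of FIG. 75: the gateway impersonates each found
# device and watches whether the wireless security panel (WSP) responds.
# Object methods and the database interface are hypothetical.

def classify_discovered_devices(gateway, wsp, devices, database, provisioning_rules):
    for device in devices:
        # Mimic the device: send a change-of-state message using its unique ID.
        gateway.send_change_of_state(device_id=device.unique_id)

        if wsp.responded_to(device.unique_id):
            # The panel recognizes the device, so it belongs to the security system.
            database.add_device(device, part_of_security_system=True)
            database.associate(device, zone_name=device.zone_name,
                               token=device.token_identifier)
        else:
            # Not part of the extant system: flag it, then ignore it or add it
            # as an independent device per the provisioning rules.
            device.flagged_not_in_system = True
            if provisioning_rules.add_independent_devices:
                database.add_device(device, part_of_security_system=False)
```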

In another embodiment, gateway 1620 utilizes a wired connection to WSP 1611, but also incorporates a wireless transceiver 1625 to communicate directly with devices 1614. In this embodiment, operations under 1720 above are removed, and operations under 1740 above are modified so the system of this embodiment utilizes wireline protocols to add itself as an authorized control device in security system 1610.

A description of an example embodiment follows in which the Gateway (FIG. 74, element 1620) is the iHub available from iControl Networks, Palo Alto, Calif., and described in detail herein. In this example the gateway is “automatically” installed with a security system.

The automatic security system installation begins with the assignment of an authorization key to components of the security system (e.g., gateway, kit including the gateway, etc.). The assignment of an authorization key is done in lieu of creating a user account. An installer later places the gateway in a user's premises along with the premises security system. The installer uses a computer to navigate to a web portal (e.g., integrated security system web interface), logs in to the portal, and enters the authorization key of the installed gateway into the web portal for authentication. Once authenticated, the gateway automatically discovers devices at the premises (e.g., sensors, cameras, light controls, etc.) and adds the discovered devices to the system or “network”. The installer assigns names to the devices, and tests operation of the devices back to the server (e.g., did the door open, did the camera take a picture, etc.). The security device information is optionally pushed or otherwise propagated to a security panel and/or to the server network database. The installer finishes the installation, and instructs the end user on how to create an account, username, and password. At this time the user enters the authorization key which validates the account creation (uses a valid authorization key to associate the network with the user's account). New devices may subsequently be added to the security network in a variety of ways (e.g., user first enters a unique ID for each device/sensor and names it in the server, after which the gateway can automatically discover and configure the device).

A description of another example embodiment follows in which the security system (FIG. 74, element 1610) is a Dialog system, the WSP (FIG. 74, element 1611) is a SimonXT available from General Electric Security, and the Gateway (FIG. 74, element 1620) is the iHub available from iControl Networks, Palo Alto, Calif., and described in detail herein. Descriptions of the install process for the SimonXT and iHub are also provided below.

GE Security's Dialog network is one of the most widely deployed and tested wireless security systems in the world. The physical RF network is based on a 319.5 MHz unlicensed spectrum, with a bandwidth supporting up to 19 Kbps communications. Typical use of this bandwidth—even in conjunction with the integrated security system—is far less than that. Devices on this network can support either one-way communication (either a transmitter or a receiver) or two-way communication (a transceiver). Certain GE Simon, Simon XT, and Concord security control panels incorporate a two-way transceiver as a standard component. The gateway also incorporates the same two-way transceiver card. The physical link layer of the network is managed by the transceiver module hardware and firmware, while the coded payload bitstreams are made available to the application layer for processing.

Sensors in the Dialog network typically use a 60-bit protocol for communicating with the security panel transceiver, while security system keypads and the gateway use the encrypted 80-bit protocol. The Dialog network is configured for reliability as well as low-power usage. Many devices are supervised, i.e., they are regularly monitored by the system ‘master’ (typically a GE security panel), while still maintaining excellent power usage characteristics. A typical door/window sensor has a battery life in excess of 5-7 years.

The gateway has two modes of operation in the Dialog network: a first mode of operation is when the gateway is configured or operates as a ‘slave’ to the GE security panel; a second mode of operation is when the gateway is configured or operates as a ‘master’ to the system in the event a security panel is not present. In both configurations, the gateway has the ability to ‘listen’ to network traffic, enabling the gateway to continually keep track of the status of all devices in the system. Similarly, in both situations the gateway can address and control devices that support setting adjustments (such as the GE wireless thermostat).

In the configuration in which the gateway acts as a ‘slave’ to the security panel, the gateway is ‘learned into’ the system as a GE wireless keypad. In this mode of operation, the gateway emulates a security system keypad when managing the security panel, and can query the security panel for status and ‘listen’ to security panel events (such as alarm events).

The gateway incorporates an RF Transceiver manufactured by GE Security, but is not so limited. This transceiver implements the Dialog protocols and handles all network message transmissions, receptions, and timing. As such, the physical, link, and protocol layers of the communications between the gateway and any GE device in the Dialog network are totally compliant with GE Security specifications.

At the application level, the gateway emulates the behavior of a GE wireless keypad utilizing the GE Security 80-bit encrypted protocol, and only supported protocols and network traffic are generated by the gateway. Extensions to the Dialog RF protocol of an embodiment enable full control and configuration of the panel, and iControl can both automate installation and sensor enrollment as well as direct configuration downloads for the panel under these protocol extensions.

As described above, the gateway participates in the GE Security network at the customer premises. Because the gateway has intelligence and a two-way transceiver, it can ‘hear’ all of the traffic on that network. The gateway makes use of the periodic sensor updates, state changes, and supervisory signals of the network to maintain a current state of the premises. This data is relayed to the integrated security system server (e.g., FIG. 2, element 260) and stored in the event repository for use by other server components. This usage of the GE Security RF network is completely non-invasive; there is no new data traffic created to support this activity.
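The passive monitoring described above might be sketched as follows; the transceiver and server-client interfaces and the message fields are assumptions, and the point is only that received traffic updates a local premises state and is relayed upstream without generating new RF traffic.

```python
# Sketch of the gateway's passive use of Dialog network traffic to keep a
# current premises state and relay it upstream. The transceiver, server
# client, and message fields shown here are illustrative assumptions.

def monitor_dialog_traffic(transceiver, server_client):
    premises_state = {}
    while True:
        # A periodic sensor update, state change, or supervisory signal.
        message = transceiver.receive()
        premises_state[message.sensor_id] = {
            "state": message.state,
            "last_seen": message.timestamp,
            "supervisory": message.is_supervisory,
        }
        # Relay to the integrated security system server's event repository;
        # no new traffic is generated on the GE Security RF network.
        server_client.post_event(sensor_id=message.sensor_id,
                                 state=message.state,
                                 timestamp=message.timestamp)
```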

The gateway can directly (or indirectly through the Simon XT panel) control two-way devices on the network. For example, the gateway can direct a GE Security Thermostat to change its setting to ‘Cool’ from ‘Off’, as well as request an update on the current temperature of the room. The gateway performs these functions using the existing GE Dialog protocols, with little to no impact on the network; a gateway device control or data request takes only a few dozen bytes of data in a network that can support 19 Kbps.

By enrolling with the Simon XT as a wireless keypad, as described herein, the gateway includes data or information of all alarm events, as well as state changes relevant to the security panel. This information is transferred to the gateway as encrypted packets in the same way that the information is transferred to all other wireless keypads on the network.

Because of its status as an authorized keypad, the gateway can also initiate the same panel commands that a keypad can initiate. For example, the gateway can arm or disarm the panel using the standard Dialog protocol for this activity. Other than the monitoring of standard alarm events like other network keypads, the only incremental data traffic on the network as a result of the gateway is the infrequent remote arm/disarm events that the gateway initiates, or infrequent queries on the state of the panel.

The gateway is enrolled into the Simon XT panel as a ‘slave’ device which, in an embodiment, is a wireless keypad. This provides the gateway with all functionality necessary for operating the Simon XT system remotely, as well as for combining the actions and information of non-security devices such as lighting or door locks with GE Security devices. The only resource taken up by the gateway in this scenario is one wireless zone (sensor ID).

The gateway of an embodiment supports three forms of sensor and panel enrollment/installation into the integrated security system, but is not limited to this number of enrollment/installation options. The enrollment/installation options of an embodiment include installer installation, kitting, and panel-based enrollment, each of which is described below.

Under the installer option, the installer enters the sensor IDs at time of installation into the integrated security system web portal or iScreen. This technique is supported in all configurations and installations.

Kits can be pre-provisioned using integrated security system provisioning applications when using the kitting option. At kitting time, multiple sensors are automatically associated with an account, and at install time there is no additional work required.

In the case where a panel is installed with sensors already enrolled (i.e. using the GE Simon XT enrollment process), the gateway has the capability to automatically extract the sensor information from the system and incorporate it into the user account on the integrated security system server.

The gateway and integrated security system of an embodiment use an auto-learn process for sensor and panel enrollment. The deployment approach of an embodiment can use additional interfaces that GE Security is adding to the Simon XT panel. With these interfaces, the gateway has the capability to remotely enroll sensors in the panel automatically. The interfaces include, but are not limited to, the following: EnrollDevice(ID, type, name, zone, group); SetDeviceParameters(ID, type, name, zone, group); GetDeviceParameters(zone); and RemoveDevice(zone).
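As an illustration of how these interfaces might be driven during an automated install, the sketch below enrolls a list of account sensors and verifies each zone; the panel object and the sensor record fields are hypothetical, while the interface names are those listed above.

```python
# Illustrative use of the panel interfaces named above to auto-enroll sensors.
# The `panel` object and sensor record fields are hypothetical stand-ins.

def auto_enroll_sensors(panel, account_sensors):
    for sensor in account_sensors:
        panel.EnrollDevice(ID=sensor["id"], type=sensor["type"],
                           name=sensor["name"], zone=sensor["zone"],
                           group=sensor["group"])

    # Verify each zone took the expected parameters; correct if needed.
    for sensor in account_sensors:
        current = panel.GetDeviceParameters(zone=sensor["zone"])
        if current["ID"] != sensor["id"]:
            panel.SetDeviceParameters(ID=sensor["id"], type=sensor["type"],
                                      name=sensor["name"], zone=sensor["zone"],
                                      group=sensor["group"])

def remove_sensor(panel, zone):
    panel.RemoveDevice(zone=zone)
```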

The integrated security system incorporates these new interfaces into the system, providing the following install process. The install process can include integrated security system logistics to handle kitting and pre-provisioning. Pre-kitting and logistics can include a pre-provisioning kitting tool provided by integrated security system that enables a security system vendor or provider (“provider”) to offer pre-packaged initial ‘kits’. This is not required but is recommended for simplifying the install process. This example assumes a ‘Basic’ kit is preassembled and includes one (1) Simon XT, three (3) Door/window sensors, one (1) motion sensor, one (1) gateway, one (1) keyfob, two (2) cameras, and ethernet cables. The kit also includes a sticker page with all Zones (1-24) and Names (full name list).

The provider uses the integrated security system kitting tool to assemble ‘Basic’ kit packages. The contents of different types of starter kits may be defined by the provider. At the distribution warehouse, a worker uses a bar code scanner to scan each sensor and the gateway as it is packed into the box. An ID label is created that is attached to the box. The scanning process automatically associates all the devices with one kit, and the new ID label is the unique identifier of the kit. These boxes are then sent to the provider for distribution to installer warehouses. Individual sensors, cameras, etc. are also sent to the provider installer warehouse. Each is labeled with its own barcode/ID.

An installation and enrollment procedure of a security system including a gateway is described below as one example of the installation process.

  • 1. Order and Physical Install Process
    • a. Once an order is generated in the iControl system, an account is created and an install ticket is created and sent electronically to the provider for assignment to an installer.
    • b. The assigned installer picks up his/her ticket(s) and fills his/her truck with Basic and/or Advanced starter kits. He/she also keeps a stock of individual sensors, cameras, iHubs, Simon XTs, etc. Optionally, the installer can also stock homeplug adapters for problematic installations.
    • c. The installer arrives at the address on the ticket, and pulls out the Basic kit. The installer determines sensor locations from a tour of the premises and discussion with the homeowner. At this point assume the homeowner requests additional equipment including an extra camera, two (2) additional door/window sensors, one (1) glass break detector, and one (1) smoke detector.
    • d. Installer mounts SimonXT in the kitchen or other location in the home as directed by the homeowner, and routes the phone line to the Simon XT if available. GPRS and phone numbers are pre-programmed in the SimonXT to point to the provider Central Monitoring Station (CMS).
    • e. Installer places gateway in the home in the vicinity of a router and cable modem. Installer installs an ethernet line from gateway to router and plugs gateway into an electrical outlet.
  • 2. Associate and Enroll gateway into SimonXT
    • a. Installer uses either his/her own laptop plugged into the router, or the homeowner's computer, to go to the integrated security system web interface and log in with an installer ID/pass.
    • b. Installer enters ticket number into admin interface, and clicks ‘New Install’ button. Screen prompts installer for kit ID (on box's barcode label).
    • c. Installer clicks ‘Add SimonXT’. Instructions prompt installer to put Simon XT into install mode, and add gateway as a wireless keypad. It is noted that this step is for security only and can be automated in an embodiment.
    • d. Installer enters the installer code into the Simon XT and learns the ‘gateway’ into the panel as a wireless keypad, as a group 1 device.
    • e. Installer goes back to Web portal, and clicks the ‘Finished Adding SimonXT’ button.
  • 3. Enroll Sensors into SimonXT via iControl
    • a. All devices in the Basic kit are already associated with the user's account.
    • b. For additional devices, Installer clicks ‘Add Device’ and adds the additional camera to the user's account (by typing in the camera ID/Serial #).
    • c. Installer clicks ‘Add Device’ and adds other sensors (two (2) door/window sensors, one (1) glass break sensor, and one (1) smoke sensor) to the account (e.g., by typing in IDs).
    • d. As part of Add Device, Installer assigns zone, name, and group to the sensor. Installer puts appropriate Zone and Name sticker on the sensor temporarily.
    • e. All sensor information for the account is pushed or otherwise propagated to the iConnect server, and is available to propagate to CMS automation software through the CMS application programming interface (API).
    • f. Web interface displays ‘Installing Sensors in System . . . ’ and automatically adds all of the sensors to the Simon XT panel through the GE RF link.
    • g. Web interface displays ‘Done Installing’ and all sensors show green.
  • 4. Place and Test Sensors in Home
    • a. Installer physically mounts each sensor in its desired location, and removes the stickers.
    • b. Installer physically mounts WiFi cameras in their location and plugs into AC power. Optional fishing of low voltage wire through wall to remove dangling wires. Camera transformer is still plugged into outlet but wire is now inside the wall.
    • c. Installer goes to Web interface and is prompted for automatic camera install. Each camera is provisioned as a private, encrypted Wifi device on the gateway secured sandbox network, and firewall NAT traversal is initiated. Upon completion the customer is prompted to test the security system.
    • d. Installer selects the ‘Test System’ button on the web portal—the SimonXT is put into Test mode by the gateway over GE RF.
    • e. Installer manually tests the operation of each sensor, receiving an audible confirmation from SimonXT.
    • f. gateway sends test data directly to CMS over broadband link, as well as storing the test data in the user's account for subsequent report generation.
    • g. Installer exits test mode from the Web portal.
  • 5. Installer instructs customer on use of the Simon XT, and shows customer how to log into the iControl web and mobile portals. Customer creates a username/password at this time.
  • 6. Installer instructs customer how to change Simon XT user code from the Web interface. Customer changes user code which is pushed to SimonXT automatically over GE RF.

An installation and enrollment procedure of a security system including a gateway is described below as an alternative example of the installation process. This installation process is for use for enrolling sensors into the SimonXT and integrated security system and is compatible with all existing GE Simon panels.

The integrated security system supports all pre-kitting functionality described in the installation process above. However, for the purpose of the following example, no kitting is used.

    • 1. Order and Physical Install Process
      • a. Once an order is generated in the iControl system, an account is created and an install ticket is created and sent electronically to the security system provider for assignment to an installer.
      • b. The assigned installer picks up his/her ticket(s) and fills his/her truck with individual sensors, cameras, iHubs, Simon XTs, etc. Optionally, the installer can also stock homeplug adapters for problematic installations.
      • c. The installer arrives at the address on the ticket, and analyzes the house and talks with the homeowner to determine sensor locations. At this point assume the homeowner requests three (3) cameras, five (5) door/window sensors, one (1) glass break detector, one (1) smoke detector, and one (1) keyfob.
      • d. Installer mounts SimonXT in the kitchen or other location in the home. The installer routes a phone line to Simon XT if available. GPRS and Phone numbers are pre-programmed in SimonXT to point to the provider CMS.
      • e. Installer places gateway in home in the vicinity of a router and cable modem, and installs an ethernet line from gateway to the router, and plugs gateway into an electrical outlet.
    • 2. Associate and Enroll gateway into SimonXT
      • a. Installer uses either his/her own laptop plugged into the router, or the homeowner's computer, to go to the integrated security system web interface and log in with an installer ID/pass.
      • b. Installer enters ticket number into admin interface, and clicks ‘New Install’ button. Screen prompts installer to add devices.
      • c. Installer types in ID of gateway, and it is associated with the user's account.
      • d. Installer clicks ‘Add Device’ and adds the cameras to the user's account (by typing in the camera ID/Serial #).
      • e. Installer clicks ‘Add SimonXT’. Instructions prompt installer to put Simon XT into install mode, and add gateway as a wireless keypad.
      • f. Installer goes to the Simon XT and enters the installer code into the Simon XT, then learns the ‘gateway’ into the panel as a wireless keypad, as a group 1 type sensor.
      • g. Installer returns to Web portal, and clicks the ‘Finished Adding SimonXT’ button.
      • h. Gateway now is alerted to all subsequent installs over the security system RF.
    • 3. Enroll Sensors into SimonXT via iControl
      • a. Installer clicks ‘Add Simon XT Sensors’—Displays instructions for adding sensors to Simon XT.
      • b. Installer goes to Simon XT and uses Simon XT install process to add each sensor, assigning zone, name, group. These assignments are recorded for later use.
      • c. The gateway automatically detects each sensor addition and adds the new sensor to the integrated security system.
      • d. Installer exits install mode on the Simon XT, and returns to the Web portal.
      • e. Installer clicks ‘Done Adding Devices’.
      • f. Installer enters zone/sensor naming from recorded notes into integrated security system to associate sensors to friendly names.
      • g. All sensor information for the account is pushed to the iConnect server, and is available to propagate to CMS automation software through the CMS API.
    • 4. Place and Test Sensors in Home
      • a. Installer physically mounts each sensor in its desired location.
      • b. Installer physically mounts Wifi cameras in their location and plugs into AC power. Optional fishing of low voltage wire through wall to remove dangling wires. Camera transformer is still plugged into outlet but wire is now inside the wall.
      • c. Installer puts SimonXT into Test mode from the keypad.
      • d. Installer manually tests the operation of each sensor, receiving an audible confirmation from SimonXT.
      • e. Installer exits test mode from the Simon XT keypad.
      • f. Installer returns to web interface and is prompted to automatically set up cameras. After waiting for completion cameras are now provisioned and operational.
    • 5. Installer instructs customer on use of the Simon XT, and shows customer how to log into the integrated security system web and mobile portals. Customer creates a username/password at this time.
    • 6. Customer and Installer observe that all sensors/cameras are green.
    • 7. Installer instructs customer how to change Simon XT user code from the keypad. Customer changes user code and stores in SimonXT.
    • 8. The first time the customer uses the web portal to Arm/Disarm system the web interface prompts the customer for the user code, which is then stored securely on the server. In the event the user code is changed on the panel the web interface once again prompts the customer.

The panel of an embodiment can be programmed remotely. The CMS pushes new programming to SimonXT over a telephone or GPRS link. Optionally, iControl and GE provide a broadband link or coupling to the gateway and then a link from the gateway to the Simon XT over GE RF.

In addition to the configurations described above, the gateway of an embodiment supports takeover configurations in which it is introduced or added into a legacy security system. A description of example takeover configurations follows in which the security system (FIG. 2, element 210) is a Dialog system and the WSP (FIG. 2, element 211) is a GE Concord panel (e.g., equipped with POTS, GE RF, and a Superbus 2000 RS485 interface) available from General Electric Security; in the case of a Lynx takeover, the Simon XT is used. The gateway (FIG. 2, element 220) in the takeover configurations is an iHub (e.g., equipped with a built-in 802.11b/g router, Ethernet hub, GSM/GPRS card, RS485 interface, and iControl Honeywell-compatible RF card) available from iControl Networks, Palo Alto, Calif. While components of particular manufacturers are used in this example, the embodiments are not limited to these components or to components from these vendors.

The security system can optionally include RF wireless sensors (e.g., GE wireless sensors utilizing the GE Dialog RF technology), IP cameras, a GE-iControl Touchscreen (the touchscreen is assumed to be an optional component in the configurations described herein, and is thus treated separately from the iHub; in systems in which the touchscreen is a component of the base security package, the integrated iScreen (available from iControl Networks, Palo Alto, Calif.) can be used to combine iHub technology with the touchscreen in a single unit), and Z-Wave devices to name a few.

The takeover configurations described below assume takeover by a “new” system of an embodiment of a security system provided by another third party vendor, referred to herein as an “original” or “legacy” system. Generally, the takeover begins with removal of the control panel and keypad of the legacy system. A GE Concord panel is installed to replace the control panel of the legacy system, along with an iHub with a GPRS modem. The legacy system sensors are then connected or wired to the Concord panel, and a GE keypad or touchscreen is installed to replace the keypad of the legacy system. The iHub includes the iControl RF card, which is compatible with the legacy system. The iHub finds and manages the wireless sensors of the legacy system, and learns the sensors into the Concord by emulating the corresponding GE sensors. The iHub effectively acts as a relay for legacy wireless sensors.

Once takeover is complete, the new security system provides a homogeneous system that removes the compromises inherent in taking over or replacing a legacy system. For example, the new system provides a modern touchscreen that may include additional functionality, new services, and supports integration of sensors from various manufacturers. Furthermore, lower support costs can be realized because call centers, installers, etc. are only required to support one architecture. Additionally, there is minimal install cost because only the panel is required to be replaced as a result of the configuration flexibility offered by the iHub.

The system takeover configurations described below include but are not limited to a dedicated wireless configuration, a dedicated wireless configuration that includes a touchscreen, and a fished Ethernet configuration. Each of these configurations is described in detail below.

FIG. 76 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel wirelessly coupled to an iHub, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and communicates with the panel via the 802.11 link, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.

FIG. 77 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel wirelessly coupled to an iHub, and a GE-iControl Touchscreen, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and communicates with the panel via the 802.11 link, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.

The GE-iControl Touchscreen can be used with either of an 802.11 connection or Ethernet connection with the iHub. Because the takeover involves a GE Concord panel (or Simon XT), the touchscreen is always an option. No extra wiring is required for the touchscreen as it can use the 4-wire set from the replaced keypad of the legacy system. This provides power, battery backup (through Concord), and data link (RS485 Superbus 2000) between Concord and touchscreen. The touchscreen receives its broadband connectivity through the dedicated 802.11 link to the iHub.

FIG. 78 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel connected to an iHub via an Ethernet coupling, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and wired to the panel using a 4-wire Superbus 2000 (RS485) interface, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.

The takeover installation process is similar to the installation process described above, except that the control panel of the legacy system is replaced; therefore, only the differences from the installation described above are provided here. The takeover approach of an embodiment uses the existing RS485 control interfaces that GE Security and iControl support with the iHub, touchscreen, and Concord panel. With these interfaces, the iHub is capable of automatically enrolling sensors in the panel. The exception is the use of an iControl RF card compatible with legacy systems to ‘take over’ existing RF sensors. A description of the takeover installation process follows.

During the installation process, the iHub uses an RF Takeover Card to automatically extract all sensor IDs, zones, and names from the legacy panel. The installer removes connections at the legacy panel from hardwired wired sensors and labels each with the zone. The installer pulls the legacy panel and replaces it with the GE Concord panel. The installer also pulls the existing legacy keypad and replaces it with either a GE keypad or a GE-iControl touchscreen. The installer connects legacy hardwired sensors to appropriate wired zone (from labels) on the Concord. The installer connects the iHub to the local network and connects the iHub RS485 interface to the Concord panel. The iHub automatically ‘enrolls’ legacy RF sensors into the Concord panel as GE sensors (maps IDs), and pushes or otherwise propagates other information gathered from HW panel (zone, name, group). The installer performs a test of all sensors back to CMS. In operation, the iHub relays legacy sensor data to the Concord panel, emulating equivalent GE sensor behavior and protocols.

The areas of the installation process particular to the legacy takeover include how the iHub extracts sensor info from the legacy panel and how the iHub automatically enrolls legacy RF sensors and populates Concord with wired zone information. Each of these areas is described below.

In having the iHub extract sensor information from the legacy panel, the installer ‘enrolls’ the iHub into the legacy panel as a wireless keypad (using the install code and house ID available from the panel). The iHub legacy RF Takeover Card is a compatible legacy RF transceiver. The installer uses the web portal to place the iHub into ‘Takeover Mode’, and the web portal then automatically instructs the iHub to begin extraction. The iHub queries the panel over the RF link to get all zone information for all sensors, wired and RF. The iHub then stores the legacy sensor information received during the queries on the iConnect server.

The iHub also automatically enrolls legacy RF sensors and populates Concord with wired zone information. In so doing, the installer selects ‘Enroll legacy Sensors into Concord’ (next step in ‘Takeover’ process on web portal). The iHub automatically queries the iConnect server, and downloads legacy sensor information previously extracted. The downloaded information includes an ID mapping from legacy ID to ‘spoofed’ GE ID. This mapping is stored on the server as part of the sensor information (e.g., the iConnect server knows that the sensor is a legacy sensor acting in GE mode). The iHub instructs Concord to go into install mode, and sends appropriate Superbus 2000 commands for sensor learning to the panel. For each sensor, the ‘spoofed’ GE ID is loaded, and zone, name, and group are set based on information extracted from legacy panel. Upon completion, the iHub notifies the server, and the web portal is updated to reflect next phase of Takeover (e.g., ‘Test Sensors’).
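A sketch of this enrollment step follows; the iHub, iConnect server, and Concord objects and their method names are illustrative assumptions, and only the sequence (download the extracted records, enter install mode, learn each ‘spoofed’ GE ID with its zone, name, and group, then notify the server) mirrors the text.

```python
# Sketch of the legacy-to-GE enrollment step: the iHub downloads the
# extracted legacy sensor records (with their 'spoofed' GE IDs) and learns
# them into the Concord over the Superbus 2000 link. All objects and calls
# are illustrative stand-ins.

def enroll_legacy_sensors(ihub, iconnect_server, concord):
    legacy_sensors = iconnect_server.download_extracted_sensors(ihub.account_id)

    concord.enter_install_mode()
    for sensor in legacy_sensors:
        # The server-side record carries the mapping: legacy ID -> spoofed GE ID.
        concord.superbus_learn_sensor(ge_id=sensor["spoofed_ge_id"],
                                      zone=sensor["zone"],
                                      name=sensor["name"],
                                      group=sensor["group"])
    concord.exit_install_mode()

    # Web portal advances to the next phase of the takeover ('Test Sensors').
    iconnect_server.notify_enrollment_complete(ihub.account_id)
```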

Sensors are tested in the same manner as described above. When a HW sensor is triggered, the signal is captured by the iHub legacy RF Takeover Card, translated to the equivalent GE RF sensor signal, and pushed to the panel as a sensor event on the SuperBus 2000 wires.

In support of remote programming of the panel, CMS pushes new programming to Concord over a phone line, or to the iConnect CMS/Alarm Server API, which in turn pushes the programming to the iHub. The iHub uses the Concord Superbus 2000 RS485 link to push the programming to the Concord panel.

FIG. 79 is a flow diagram for automatic takeover 2100 of a security system, under an embodiment. Automatic takeover includes establishing 2102 a wireless coupling between a takeover component running under a processor and a first controller of a security system installed at a first location. The security system includes some number of security system components coupled to the first controller. The automatic takeover includes automatically extracting 2104 security data of the security system from the first controller via the takeover component. The automatic takeover includes automatically transferring 2106 the security data to a second controller and controlling loading of the security data into the second controller. The second controller is coupled to the security system components and replaces the first controller.

FIG. 80 is a flow diagram for automatic takeover 2200 of a security system, under an alternative embodiment. Automatic takeover includes automatically forming 2202 a security network at a first location by establishing a wireless coupling between a security system and a gateway. The gateway of an embodiment includes a takeover component. The security system of an embodiment includes security system components. The automatic takeover includes automatically extracting 2204 security data of the security system from a first controller of the security system. The automatic takeover includes automatically transferring 2206 the security data to a second controller. The second controller of an embodiment is coupled to the security system components and replaces the first controller.
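The takeover flows of FIGS. 79 and 80 reduce to roughly the following sequence; the controller and takeover-component interfaces shown are assumptions for illustration.

```python
# Condensed sketch of the automatic takeover of FIGS. 79 and 80; the
# controller and takeover-component interfaces are hypothetical.

def automatic_takeover(takeover_component, legacy_controller, replacement_controller):
    # Establish a wireless coupling with the first (legacy) controller.
    takeover_component.pair_wireless(legacy_controller)

    # Automatically extract the security data (e.g., sensor IDs, zones, names, groups).
    security_data = takeover_component.extract_security_data(legacy_controller)

    # Transfer the data to the second controller, which replaces the first
    # controller and remains coupled to the existing security system components.
    takeover_component.load_security_data(replacement_controller, security_data)
    return security_data
```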

Components of the gateway of the integrated security system described herein control discovery, installation and configuration of both wired and wireless IP devices (e.g., cameras, etc.) coupled or connected to the system, as described herein with reference to FIGS. 1-4, as well as management of video routing using a video routing module or engine. The video routing engine initiates communication paths for the transfer of video from a streaming source device to a requesting client device, and delivers seamless video streams to the user via the communication paths using one or more of UPnP port-forwarding, relay server routing and STUN/TURN peer-to-peer routing, each of which is described below.

By way of reference, conventional video cameras have the ability to stream digital video in a variety of formats and over a variety of networks. Internet protocol (IP) video cameras, which include video cameras using an IP transport network (e.g., Ethernet, WiFi (IEEE 802.11 standards), etc.) are prevalent and increasingly being utilized in home monitoring and security system applications. With the proliferation of the internet, Ethernet and WiFi local area networks (LANs) and advanced wide area networks (WANs) that offer high bandwidth, low latency connections (broadband), as well as more advanced wireless WAN data networks (e.g. GPRS or CDMA 1×RTT), there increasingly exists the networking capability to extend traditional security systems to offer IP-based video. However, a fundamental reason for such IP video in a security system is to enable a user or security provider to monitor live or otherwise streamed video from outside the host premises (and the associated LAN).

The conventional solution to this problem has involved a technique known as ‘port forwarding’, whereby a ‘port’ on the LAN's router/firewall is assigned to the specific LAN IP address for an IP camera, or a proxy to that camera. Once a port has been ‘forwarded’ in this manner, a computer external to the LAN can address the LAN's router directly, and request access to that port. This access request is then forwarded by the router directly to the IP address specified, the IP camera or proxy. In this way an external device can directly access an IP camera within the LAN and view or control the streamed video.

The issues with this conventional approach include the following: port forwarding is highly technical and most users do not know how/why to do it; automatic port forwarding is difficult and problematic using emerging standards like UPnP; the camera IP address is often reset in response to a power outage/router reboot event; there are many different routers with different ways/capabilities for port forwarding. In short, although port forwarding can work, it is frequently less than adequate to support a broadly deployed security solution utilizing IP cameras.

Another approach to accessing streaming video externally to a LAN utilizes peer-to-peer networking technology. So-called peer-to-peer networks, which include networks in which a device or client is connected directly to another device or client, typically over a Wide Area Network (WAN) and without a persistent server connection, are increasingly common. In addition to being used for the sharing of files between computers (e.g., Napster and KaZaa), peer-to-peer networks have also been more recently utilized to facilitate direct audio and media streaming in applications such as Skype. In these cases, the peer-to-peer communications have been utilized to enable telephony-style voice communications and video conferencing between two computers, each enabled with an IP-based microphone, speaker, and video camera. A fundamental reason for adopting such peer-to-peer technology is the ability to transparently ‘punch through’ LAN firewalls to enable external access to the streaming voice and video content, and to do so in a way that scales to tens of millions of users without creating an untenable server load.

A limitation of the conventional peer-to-peer video transport lies in the personal computer (PC)-centric nature of the solution. Each of the conventional solutions uses a highly capable PC connected to the video camera, with the PC providing the advanced software functionality required to initiate and manage the peer-to-peer connection with the remote client. A typical security or remote home monitoring system requires multiple cameras, each with its own unique IP address, and only a limited amount of processing capability in each camera, such that the conventional PC-centric approach cannot easily meet the need. Instead of a typical PC-centric architecture with three components (a “3-way IP Video System”) that include a computer device with video camera, a mediating server, and a PC client with video display capability, the conventional security system adds a plurality of fourth components that are standalone IP video cameras (requiring a “4-way IP Video System”), another less-than-ideal solution.

In accordance with the embodiments described herein, IP camera management systems and methods are provided that enable a consumer or security provider to easily and automatically configure and manage IP cameras located at a customer premise. Using this system, IP camera management may be extended to remote control and monitoring from outside the firewall and router of the customer premise.

With reference to FIGS. 5 and 6, the system includes a gateway 253 having a video routing component so that the gateway 253 can manage and control, or assist in management and control, of video routing. The system also includes one or more cameras (e.g., WiFi IP camera 254, Ethernet IP camera 255, etc.) that communicate over the LAN 250 using an IP format, as well as a connection management server 210 located outside the premise firewall 252 and connected to the gateway 253 by a Wide Area Network (WAN) 200. The system further includes one or more devices 220, 230, 240 located outside the premise and behind other firewalls 221, 231, 241 and connected to the WAN 200. The other devices 220, 230, 240 are configured to access video or audio content from the IP cameras within the premise, as described above.

Alternatively, with reference to FIGS. 9 and 10, the system includes a touchscreen 902 or 1002 having a video routing component so that the touchscreen 902 or 1002 can manage and control, or assist in the management and control of, video routing. The system also includes one or more cameras (e.g., WiFi IP camera 254, Ethernet IP camera 255, etc.) that communicate over the LAN 250 using an IP format, as well as a connection management server 210 located outside the premise firewall 252 and connected to the gateway 253 by a Wide Area Network (WAN) 200. The system further includes one or more devices 220, 230, 240 located outside the premise and behind other firewalls 221, 231, 241 and connected to the WAN 200. The other devices 220, 230, 240 are configured to access video or audio content from the IP cameras within the premise, as described above.

FIG. 81 is a general flow diagram for IP video control, under an embodiment. The IP video control interfaces with, manages, and provides WAN-based remote access to a plurality of IP cameras in conjunction with a home security or remote home monitoring system. The IP video control allows for monitoring and controlling of IP video cameras from a location remote to the customer premise, outside the customer premise firewall, and protected by another firewall. Operations begin when the system is powered on 2310, involving at a minimum the power-on of the gateway, as well as the power-on of at least one IP camera coupled or connected to the premise LAN. The gateway searches 2311 for available IP cameras and associated IP addresses. The gateway selects 2312 from one or more possible approaches to create connections between the IP camera and a device external to the firewall. Once an appropriate connection path is selected, the gateway begins operation 2313, and awaits 2320 a request for a stream from one of the plurality of IP video cameras available on the LAN. When a stream request is present, the server retrieves 2321 the requestor's WAN IP address/port.

When a server relay is present 2330, the IP camera is instructed 2331 to stream to the server, and the connection is managed 2332 through the server. In response to the stream terminating 2351, operations return to gateway operation 2313, and the gateway awaits 2320 another request for a stream from one of the plurality of IP video cameras available on the LAN.

When a server relay is not present 2330, the requestor's WAN IP address/port is provided 2333 to the gateway or gateway relay. When a gateway relay is present 2340, the IP camera is instructed 2341 to stream to the gateway, and the gateway relays 2342 the connection to the requestor. In response to the stream terminating 2351, operations return to gateway operation 2313, and the gateway awaits 2320 another request for a stream from one of the plurality of IP video cameras available on the LAN. When a gateway relay is not present 2340, the IP camera is instructed 2343 to stream to an address, and a handoff 2344 is made resulting in direct communication between the camera and the requestor. In response to the stream terminating 2351, operations return to gateway operation 2313, and the gateway awaits 2320 another request for a stream from one of the plurality of IP video cameras available on the LAN.
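
As a non-limiting illustration of the routing decision of FIG. 81 (steps 2320 through 2344), the following Python sketch shows the first-match choice among server relay, gateway relay, and direct handoff. All names (e.g., route_stream, Camera, the example addresses) are hypothetical placeholders and are not part of the embodiments.

# Minimal sketch of the FIG. 81 stream-request routing decision (steps 2320-2344).
# All names and addresses here are hypothetical and illustrative only.

from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    def stream_to(self, address):
        print(self.name + ": streaming to " + address)

def route_stream(camera, requestor_addr, server_relay, gateway_relay,
                 server_addr="server.example.net:554", gateway_addr="192.168.1.1:554"):
    if server_relay:
        # 2331/2332: camera streams to the server; the server manages the connection
        camera.stream_to(server_addr)
        return "server-relay"
    if gateway_relay:
        # 2341/2342: camera streams to the gateway; the gateway relays to the requestor
        camera.stream_to(gateway_addr)
        return "gateway-relay"
    # 2343/2344: hand off for direct camera-to-requestor communication
    camera.stream_to(requestor_addr)
    return "direct"

print(route_stream(Camera("wifi-cam"), "203.0.113.7:4588", server_relay=False, gateway_relay=True))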

The integrated security system of an embodiment supports numerous video stream formats or types of video streams. Supported video streams include, but are not limited to, Motion Picture Experts Group (MPEG)-4 (MPEG-4)/Real-Time Streaming Protocol (RTSP), MPEG-4 over Hypertext Transfer Protocol (HTTP), and Motion Joint Photographic Experts Group (JPEG) (MJPEG).

The integrated security system of an embodiment supports the MPEG-4/RTSP video streaming method (supported by video servers and clients), which uses RTSP for the control channel and Real-time Transport Protocol (RTP) for the data channel. Here the RTSP channel is carried over Transmission Control Protocol (TCP) while the data channel uses User Datagram Protocol (UDP). This method is widely supported by both streaming sources (e.g., cameras) and stream clients (e.g., remote client devices, Apple QuickTime, VideoLAN, IPTV mobile phones, etc.).

Encryption can be added to the two channels under MPEG-4/RTSP. For example, the RTSP control channel can be encrypted using SSL/TLS. The data channel can also be encrypted.

If the camera or video stream source inside the home does not support encryption for either RTSP or RTP channels, the gateway located on the LAN can facilitate the encrypted RTSP method by maintaining separate TCP sessions with the video stream source device and with the encrypted RTSP client outside the LAN, and relay all communication between the two sessions. In this situation, any communication between the gateway and the video stream source that is not encrypted could be encrypted by the gateway before being relayed to the RTSP client outside the LAN. In many cases the gateway is an access point for the encrypted and private Wifi network on which the video stream source device is located. This means that communication between the gateway and the video stream source device is encrypted at the network level, and communication between the gateway and the RTSP client is encrypted at the transport level. In this fashion the gateway can compensate for a device that does not support encrypted RTSP.
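
As a non-limiting illustration of the relay behavior just described, the following Python sketch maintains a TLS-protected session toward the external client and a plain TCP session toward the camera, copying bytes between the two. It is a minimal sketch assuming a pre-provisioned certificate and a single client; the certificate paths, addresses, and ports are placeholders, not values from the embodiments.

# Sketch of a gateway relay that adds encryption toward the external RTSP client
# while speaking plain TCP to a camera on the LAN. Paths, ports, and addresses
# below are placeholders.

import socket, ssl, threading

CAMERA_ADDR = ("192.168.1.50", 554)     # unencrypted RTSP source on the LAN (placeholder)
LISTEN_ADDR = ("0.0.0.0", 8554)         # TLS-protected port offered to the WAN client

def pump(src, dst):
    # Copy bytes from one socket to the other until either side closes.
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    finally:
        dst.close()

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("gateway.crt", "gateway.key")         # placeholder credentials

with socket.create_server(LISTEN_ADDR) as listener:
    client_raw, _ = listener.accept()
    client = ctx.wrap_socket(client_raw, server_side=True)   # encrypted leg to the WAN client
    camera = socket.create_connection(CAMERA_ADDR)           # unencrypted leg to the camera
    threading.Thread(target=pump, args=(client, camera), daemon=True).start()
    pump(camera, client)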

The integrated security system of an embodiment also supports reverse RTSP. Reverse RTSP includes taking a TCP-based protocol like RTSP, and reversing the roles of client and server (references to “server” include the iControl server, also referred to as the iConnect server) when it comes to TCP session establishment. For example, in standard RTSP the RTSP client is the one that establishes the TCP connection with the stream source server (the server listens on a port for incoming connections). In Reverse RTSP, the RTSP client listens on a port for incoming connections from the stream source server. Once the TCP connection is established, the RTSP client begins sending commands to the server over the TCP connection just as it would in standard RTSP.
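
As a non-limiting illustration of the role reversal, the sketch below has the RTSP client listen for an incoming TCP connection from the stream source and then issue an ordinary RTSP request over that connection. The port, URL, and request shown are hypothetical placeholders.

# Minimal sketch of Reverse RTSP session establishment: the RTSP client listens,
# the stream source (or gateway) inside the firewall connects out, and the client
# then sends standard RTSP requests over the established connection.

import socket

LISTEN_PORT = 9554   # placeholder port on which the external RTSP client listens

with socket.create_server(("0.0.0.0", LISTEN_PORT)) as listener:
    # The camera or gateway inside the firewall initiates this TCP connection.
    conn, source_addr = listener.accept()
    print("stream source connected from", source_addr)

    # From here on, the exchange proceeds as in standard RTSP, client to server.
    conn.sendall(b"OPTIONS rtsp://placeholder/stream RTSP/1.0\r\nCSeq: 1\r\n\r\n")
    print(conn.recv(4096).decode(errors="replace"))
    conn.close()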

When using Reverse RTSP, the video stream source is generally on a LAN, protected by a firewall. Having a device on the LAN initiate the connection to the RTSP client outside the firewall enables easy network traversal.

If the camera or video stream source inside the LAN does not support Reverse RTSP, then the gateway facilitates the Reverse RTSP method by initiating separate TCP sessions with the video stream source device and with the Reverse RTSP client outside the LAN, and then relays all communication between the two sessions. In this fashion the gateway compensates for a stream source device that does not support Reverse RTSP.

As described in the encryption description above, the gateway can further compensate for missing functionalities on the device such as encryption. If the device does not support encryption for either RTSP or RTP channels, the gateway can communicate with the device using these un-encrypted streams, and then encrypt the streams before relaying them out of the LAN to the RTSP Reverse client.

Servers of the integrated security system can compensate for RTSP clients that do not support Reverse RTSP. In this situation, the server accepts TCP connections from both the RTSP client and the Reverse RTSP video stream source (which could be a gateway acting on behalf of a stream source device that does not support Reverse RTSP). The server then relays the control and video streams from the Reverse RTSP video stream source to the RTSP client. The server can further compensate for the encryption capabilities of the RTSP client; if the RTSP client does not support encryption then the server can provide an unencrypted stream to the RTSP client even though an encrypted stream was received from the Reverse RTSP streaming video source.

The integrated security system of an embodiment also supports Simple Traversal of User Datagram Protocol (UDP) through Network Address Translators (NAT) (STUN)/Traversal Using Relay NAT (TURN) peer-to-peer routing. STUN and TURN are techniques for using a server to help establish a peer-to-peer UDP data stream (they do not apply to TCP streams). The bandwidth consumed by the data channel of a video stream is usually many thousands of times larger than that used by the control channel. Consequently, when a peer-to-peer connection for both the RTSP and RTP channels is not possible, there is still a great incentive to use STUN/TURN techniques in order to achieve a peer-to-peer connection for the RTP data channel.

Here, a method referred to herein as RTSP with STUN/TURN is used by the integrated security system. The RTSP with STUN/TURN is a method in which the video streaming device is instructed over the control channel to stream its UDP data channel to a different network address than that of the other end of the control TCP connection (usually the UDP data is simply streamed to the IP address of the RTSP client). The result is that the RTSP or Reverse RTSP TCP channel can be relayed using the gateway and/or the server, while the RTP UDP data channel can flow directly from the video stream source device to the video stream client.
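
As a non-limiting illustration, the control-channel instruction described above can be expressed with the standard RTSP Transport header, whose destination and client_port parameters name where the RTP/UDP data should be sent. The sketch below only formats such a request; the URL, addresses, and ports are placeholders, and whether a given camera honors an alternate destination depends on its implementation.

# Sketch of an RTSP SETUP request that asks the stream source to send its RTP/UDP
# data to an address other than the far end of the control connection.

def build_setup_request(stream_url, cseq, rtp_dest, rtp_port_lo, rtp_port_hi):
    # Transport header naming an alternate UDP destination for the RTP data.
    transport = ("RTP/AVP;unicast;destination=%s;client_port=%d-%d"
                 % (rtp_dest, rtp_port_lo, rtp_port_hi))
    return ("SETUP %s RTSP/1.0\r\nCSeq: %d\r\nTransport: %s\r\n\r\n"
            % (stream_url, cseq, transport)).encode()

# The TCP control channel carrying this request may itself be relayed through the
# gateway and/or server, while the RTP/UDP data flows directly to rtp_dest.
print(build_setup_request("rtsp://placeholder/stream1", 2, "203.0.113.7", 4588, 4589).decode())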

If a video stream source device does not support RTSP with STUN/TURN, the gateway can compensate for the device by relaying the RTSP control channel via the server to the RTSP client, and receiving the RTP data channel and then forwarding it directly to the RTSP with STUN/TURN enabled client. Encryption can also be added here by the gateway.

The integrated security system of an embodiment supports MPEG-4 over HTTP. MPEG-4 over HTTP is similar to MPEG-4 over RTSP except that both the RTSP control channel and the RTP data channel are passed over an HTTP TCP session. Here a single TCP session can be used, splitting it into multiple channels using common HTTP techniques like chunked transfer encoding.

The MPEG-4 over HTTP is generally supported by many video stream clients and server devices, and encryption can easily be added to it using SSL/TLS. Because it uses TCP for both channels, STUN/TURN techniques may not apply in the event that a direct peer-to-peer TCP session between client and server cannot be established.

As described above, encryption can be provided using SSL/TLS taking the form of HTTPS. And as with MPEG-4 over RTSP, a gateway can compensate for a stream source device that does not support encryption by relaying the TCP streams and encrypting the TCP stream between the gateway and the stream client. In many cases the gateway is an access point for the encrypted and private Wifi network on which the video stream source device is located. This means that communication between the gateway and the video stream source device is encrypted at the network level, and communication between the gateway and the video stream client is encrypted at the transport level. In this fashion the gateway can compensate for a device that does not support HTTPS.

As with Reverse RTSP, the integrated security system of an embodiment supports Reverse HTTP. Reverse HTTP includes taking a TCP-based protocol like HTTP, and reversing the roles of client and server when it comes to TCP session establishment. For example, in conventional HTTP the HTTP client is the one that establishes the TCP connection with the server (the server listens on a port for incoming connections). In Reverse HTTP, the HTTP client listens on a port for incoming connections from the server. Once the TCP connection is established, the HTTP client begins sending commands to the server over the TCP connection just as it would in standard HTTP.

When using Reverse HTTP, the video stream source is generally on a LAN, protected by a firewall. Having a device on the LAN initiate the connection to the HTTP client outside the firewall enables easy network traversal.

If the camera or video stream source inside the LAN does not support Reverse HTTP, then the gateway can facilitate the Reverse HTTP method by initiating separate TCP sessions with the video stream source device and with the Reverse HTTP client outside the LAN, and then relay all communication between the two sessions. In this fashion the gateway can compensate for a stream source device that does not support Reverse HTTP.

As described in the encryption description above, the gateway can further compensate for missing functionalities on the device such as encryption. If the device does not support encrypted HTTP (e.g., HTTPS), then the gateway can communicate with the device using HTTP, and then encrypt the TCP stream(s) before relaying out of the LAN to the Reverse HTTP client.

The servers of an embodiment can compensate for HTTP clients that do not support Reverse HTTP. In this situation, the server accepts TCP connections from both the HTTP client and the Reverse HTTP video stream source (which could be a gateway acting on behalf of a stream source device that does not support Reverse HTTP). The server then relays the TCP streams from the Reverse HTTP video stream source to the HTTP client. The server can further compensate for the encryption capabilities of the HTTP client; if the HTTP client does not support encryption then the server can provide an unencrypted stream to the HTTP client even though an encrypted stream was received from the Reverse HTTP streaming video source.

The integrated security system of an embodiment supports MJPEG as described above. MJPEG is a streaming technique in which a series of JPG images are sent as the result of an HTTP request. Because MJPEG streams are transmitted over HTTP, HTTPS can be employed for encryption and most MJPEG clients support the resulting encrypted stream. And as with MPEG-4 over HTTP, a gateway can compensate for a stream source device that does not support encryption by relaying the TCP streams and encrypting the TCP stream between the gateway and the stream client. In many cases the gateway is an access point for the encrypted and private Wifi network on which the video stream source device is located. This means that communication between the gateway and the video stream source device is encrypted at the network level, and communication between the gateway and the video stream client is encrypted at the transport level. In this fashion the gateway can compensate for a device that does not support HTTPS.

The integrated system of an embodiment supports Reverse HTTP for MJPEG streams. Reverse HTTP includes taking a TCP-based protocol like HTTP and reversing the roles of client and server when it comes to TCP session establishment. For example, in standard HTTP the HTTP client is the one that establishes the TCP connection with the server (the server listens on a port for incoming connections). In Reverse HTTP, the HTTP client listens on a port for incoming connections from the server. Once the TCP connection is established, the HTTP client begins sending commands to the server over the TCP connection just as it would in standard HTTP.

When using Reverse HTTP, the video stream source is generally on a LAN, protected by a firewall. Having a device on the LAN initiate the connection to the HTTP client outside the firewall enables network traversal.

If the camera or video stream source inside the LAN does not support Reverse HTTP, then the gateway can facilitate the Reverse HTTP method by initiating separate TCP sessions with the video stream source device and with the Reverse HTTP client outside the LAN, and then relay all communication between the two sessions. In this fashion the gateway can compensate for a stream source device that does not support Reverse HTTP.

As described in the encryption description above, the gateway can further compensate for missing functionalities on the device such as encryption. If the device does not support encrypted HTTP (e.g., HTTPS), then the gateway can communicate with the device using HTTP, and then encrypt the TCP stream(s) before relaying out of the LAN to the Reverse HTTP client.

The servers can compensate for HTTP clients that do not support Reverse HTTP. In this situation, the server accepts TCP connections from both the HTTP client and the Reverse HTTP video stream source (which could be a gateway acting on behalf of a stream source device that does not support Reverse HTTP). The server then relays the TCP streams from the Reverse HTTP video stream source to the HTTP client. The server can further compensate for the encryption capabilities of the HTTP client; if the HTTP client does not support encryption then the server can provide an unencrypted stream to the HTTP client even though an encrypted stream was received from the Reverse HTTP streaming video source.

The integrated security system of an embodiment considers numerous parameters in determining or selecting one of the streaming formats described above for use in transferring video streams. The parameters considered in selecting a streaming format include, but are not limited to, security requirements, client capabilities, device capabilities, and network/system capabilities.

The security requirements for a video stream are considered in determining an applicable streaming format in an embodiment. Security requirements fall into two categories, authentication and privacy, each of which is described below.

Authentication as a security requirement means that stream clients must present credentials in order to obtain a stream. Furthermore, this presentation of credentials should be done in a way that is secure from network snooping and replays. An example of secure authentication is Basic Authentication over HTTPS. Here a username and password are presented over an encrypted HTTPS channel so snooping and replays are prevented. Basic Authentication alone, however, is generally not sufficient for secure authentication.

Because not all streaming clients support SSL/TLS, authentication methods that do not require it are desirable. Such methods include Digest Authentication and one-time requests. A one-time request is a request that can only be made by a client one time, and the server prevents a reuse of the same request. One-time requests are used to control access to a stream source device by stream clients that do not support SSL/TLS. An example here is providing video access to a mobile phone. Typical mobile phone MPEG-4 viewers do not support encryption. In this case, one of the MPEG-4 over RTSP methods described above can be employed to get the video stream relayed to a server. The server can then provide the mobile phone with a one-time request Uniform Resource Locator (URL) for the relayed video stream source (via a Wireless Application Protocol (WAP) page). Once the stream ends, the mobile phone would need to obtain another one-time request URL from the server (via WAP, for example) in order to view the stream again.
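
As a non-limiting illustration of the one-time request idea, the following Python sketch mints a single-use token, embeds it in the stream URL handed to the client, and refuses any reuse. The token format, storage, and URL layout are illustrative assumptions, not part of the embodiments.

# Minimal sketch of issuing and redeeming a one-time request URL.

import secrets

_issued_tokens = set()

def issue_one_time_url(base_url):
    # Mint a single-use token and embed it in the URL handed to the client.
    token = secrets.token_urlsafe(16)
    _issued_tokens.add(token)
    return "%s?otk=%s" % (base_url, token)

def redeem(token):
    # Returns True exactly once per issued token; any replay is refused.
    if token in _issued_tokens:
        _issued_tokens.discard(token)
        return True
    return False

url = issue_one_time_url("http://relay.example.net/stream/42")
token = url.split("otk=")[1]
print(redeem(token))   # True: the stream is served
print(redeem(token))   # False: the replayed request is rejected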

Privacy as a security requirement means that the contents of the video stream must be encrypted. This is a requirement that may be impossible to satisfy on clients that do not support video stream encryption, for example many mobile phones. If a client supports encryption for some video stream format(s), then the “best” of those formats should be selected. Here “best” is determined by the stream type priority algorithm.

The client capabilities are considered in determining an applicable streaming format in an embodiment. In considering client capabilities, the selection depends upon the supported video stream formats that include encryption, and the supported video stream formats that do not support encryption.

The device capabilities are considered in determining an applicable streaming format in an embodiment. In considering device capabilities, the selection depends upon the supported video stream formats that include encryption, the supported video stream formats that do not support encryption, and whether the device is on an encrypted private Wifi network managed by the gateway (in which case encryption at the network level is not required).

The network/system capabilities are considered in determining an applicable streaming format in an embodiment. In considering network/system capabilities, the selection depends upon characteristics of the network or system across which the stream must travel. The characteristics considered include, for example, the following: whether there is a gateway and/or server on the network to facilitate some of the more advanced video streaming types or security requirements; and whether the client is on the same LAN as the gateway, meaning that network firewall traversal is not needed.

Streaming methods with the highest priority are peer-to-peer because they scale best with server resources. Universal Plug and Play (UPnP) can be used by the gateway to open ports on the video stream device's LAN router and direct traffic through those ports to the video stream device. This allows a video stream client to talk directly with the video stream device or talk directly with the gateway which can in turn facilitate communication with the video stream device.
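
As a non-limiting illustration of opening a router port in this manner, the following Python sketch builds the UPnP Internet Gateway Device "AddPortMapping" SOAP action. In practice the router's control URL is discovered via SSDP; here it is a placeholder, as are all addresses and ports.

# Sketch of a UPnP IGD AddPortMapping request a gateway could use to direct an
# external router port to a camera on the LAN. All values below are placeholders.

import urllib.request

SOAP_BODY = """<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
 <s:Body>
  <u:AddPortMapping xmlns:u="urn:schemas-upnp-org:service:WANIPConnection:1">
   <NewRemoteHost></NewRemoteHost>
   <NewExternalPort>8554</NewExternalPort>
   <NewProtocol>TCP</NewProtocol>
   <NewInternalPort>554</NewInternalPort>
   <NewInternalClient>192.168.1.50</NewInternalClient>
   <NewEnabled>1</NewEnabled>
   <NewPortMappingDescription>camera stream</NewPortMappingDescription>
   <NewLeaseDuration>0</NewLeaseDuration>
  </u:AddPortMapping>
 </s:Body>
</s:Envelope>"""

req = urllib.request.Request(
    "http://192.168.1.1:49152/upnp/control/WANIPConn1",   # placeholder control URL
    data=SOAP_BODY.encode(),
    headers={"Content-Type": 'text/xml; charset="utf-8"',
             "SOAPAction": '"urn:schemas-upnp-org:service:WANIPConnection:1#AddPortMapping"'},
)
# urllib.request.urlopen(req)  # would submit the mapping request to the router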

Another factor in determining the best video stream format to use is the success of STUN and TURN methods for establishing direct peer-to-peer UDP communication between the stream source device and the stream client. Again, the gateway and the server can help with the setup of this communication.

Client bandwidth availability and processing power are other factors in determining the best streaming methods. For example, due to its bandwidth overhead an encrypted MJPEG stream should not be considered for most mobile phone data networks. Device bandwidth availability can also be considered in choosing the best video stream format. For example, consideration can be given to whether the upstream bandwidth capabilities of the typical residential DSL support two or more simultaneous MJPEG streams.

Components of the integrated security system of an embodiment, while considering various parameters in selecting a video streaming format to transfer video streams from streaming source devices and requesting client devices, prioritize streaming formats according to these parameters. The parameters considered in selecting a streaming format include, as described above, security requirements, client capabilities, device capabilities, and network/system capabilities. Components of the integrated security system of an embodiment select a video streaming format according to the following priority, but alternative embodiments can use other priorities.

The selected format is UPnP or peer-to-peer MPEG-4 over RTSP with encryption when both requesting client device and streaming source device support this format.

The selected format is UPnP or peer-to-peer MPEG-4 over RTSP with authentication when the requesting client device does not support encryption or UPnP or peer-to-peer MPEG-4 over RTSP with encryption.

The selected format is UPnP (peer-to-peer) MPEG-4 over HTTPS when both requesting client device and streaming source device support this format.

The selected format is UPnP (peer-to-peer) MPEG-4 over HTTP when the requesting client device does not support encryption or UPnP (peer-to-peer) MPEG-4 over HTTPS.

The selected format is UPnP (peer-to-peer) MPEG-4 over RTSP facilitated by gateway or touchscreen (including or incorporating gateway components) (to provide encryption), when the requesting client device supports encrypted RTSP and the streaming source device supports MPEG-4 over RTSP.

The selected format is UPnP (peer-to-peer) MPEG-4 over HTTPS facilitated by gateway or touchscreen (including or incorporating gateway components) (to provide encryption) when the requesting client device supports MPEG-4 over HTTPS and the streaming source device supports MPEG-4 over HTTP.

The selected format is UPnP (peer-to-peer) MJPEG over HTTPS when the networks and devices can handle the bandwidth and both requesting client device and streaming source device support MJPEG over HTTPS.

The selected format is Reverse RTSP with STUN/TURN facilitated by the server when the streaming source device initiates SSL/TLS TCP to server, the streaming source device supports Reverse RTSP over SSL/TLS with STUN/TURN, and the requesting client device supports RTSP with STUN/TURN.

The selected format is Reverse RTSP with STUN/TURN facilitated by server and gateway or touchscreen (including or incorporating gateway components) when the gateway initiates SSL/TLS TCP to the server and to the streaming source device, the streaming source device supports RTSP, and the requesting client device supports RTSP with STUN/TURN.

The selected format is Reverse MPEG over RTSP/HTTP facilitated by the server when the streaming source device initiates SSL/TLS TCP to server, the streaming source device supports Reverse RTSP or HTTP over SSL/TLS, and the requesting client device supports MPEG over RTSP/HTTP.

The selected format is Reverse MPEG over RTSP/HTTP facilitated by server and gateway or touchscreen (including or incorporating gateway components) when the gateway initiates SSL/TLS TCP to server and to streaming source device, the streaming source device supports MPEG over RTSP or HTTP, and the requesting client device supports MPEG over RTSP/HTTP.

The selected format is UPnP (peer-to-peer) MJPEG over HTTP when the networks and devices can handle the bandwidth and when the requesting client device does not support encryption and does not support MPEG-4.

The selected format is Reverse MJPEG over HTTPS facilitated by the server when the streaming source device initiates SSL/TLS TCP to server, the streaming source device supports Reverse MJPEG over SSL/TLS, and the requesting client device supports MJPEG.

The selected format is Reverse MJPEG over HTTPS facilitated by server and gateway or touchscreen (including or incorporating gateway components) when the gateway initiates SSL/TLS TCP to the server and to the streaming source device, the streaming source device supports MJPEG, and the requesting client device supports MJPEG.
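
As a non-limiting illustration, the priority ordering above can be implemented as a first-match scan over an ordered list of candidate formats, each guarded by the conditions recited for it. The Python sketch below shows only the shape of such a selector with a trimmed candidate list; the capability flag names are hypothetical and do not enumerate the full priority list.

# Sketch of a first-match selector over a priority-ordered list of streaming formats.

from dataclasses import dataclass

@dataclass
class Caps:
    encryption: bool
    mpeg4_rtsp: bool
    mpeg4_http: bool
    mjpeg: bool

def select_format(client, device, peer_to_peer_possible):
    candidates = [
        ("MPEG-4 over RTSP, peer-to-peer, encrypted",
         lambda: peer_to_peer_possible and client.encryption and device.encryption
                 and client.mpeg4_rtsp and device.mpeg4_rtsp),
        ("MPEG-4 over HTTPS, peer-to-peer",
         lambda: peer_to_peer_possible and client.encryption and device.encryption
                 and client.mpeg4_http and device.mpeg4_http),
        ("MJPEG over HTTPS, peer-to-peer",
         lambda: peer_to_peer_possible and client.encryption and device.encryption
                 and client.mjpeg and device.mjpeg),
        ("Reverse RTSP relayed by server", lambda: client.mpeg4_rtsp),
    ]
    for name, condition in candidates:
        if condition():
            return name
    return "no compatible format"

print(select_format(Caps(True, True, True, False), Caps(True, True, False, False), True))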

FIG. 82 is a block diagram showing camera tunneling, under an embodiment.

Additional detailed description of camera tunnel implementation details follow.

An embodiment uses XMPP for communication with a remote video camera as a lightweight (bandwidth) method for maintaining real-time communication with the remote camera. More specifically, the remote camera is located behind another NAT, so the communication involves NAT traversal.

An embodiment comprises a method for including a remotely located camera in a home automation system. For example, XMPP is used via a cloud XMPP server to couple or connect the camera to the home automation system. This can be used with in-car cameras, cell phone cameras, and re-locatable cameras (e.g., dropped in the office, the hotel room, the neighbor's house, etc.).

Components of an embodiment are distributed so that any one can be offline while the system continues to function (e.g., the panel can be down while the camera remains up, and motion detection from the camera, video clip upload, etc. continue to work).

Embodiments extend the PSIA in one or more of the following areas: wifi roaming configuration; video relay commands; wifi connectivity test; media tunnel for live video streaming in the context of a security system; motion notification mechanism and configuration (motion heartbeat) (e.g., helps with scalable server); XMPP for lightweight communication (helps with scalable server, reduced bandwidth, for maintaining persistent connection with a gateway); ping request sent over XMPP as health check mechanism; shared secret authentication bootstrapping process; asynchronous error status delivery by the camera for commands invoked by the gateway if the camera is responsible for delivering errors to the gateway in an asynchronous fashion (e.g., gateway requests a firmware update or a video clip upload).

Embodiments extend the home automation system to devices located on separate networks, and make them usable as general-purpose communication devices. These cameras can be placed in the office, a vacation home, or a neighbor's house, and the software can be put onto a cell phone, into a car, a navigation system, etc.

Embodiments use a global device registry for enabling a device/camera to locate the server and home to which it is assigned.

Embodiments include methods for bootstrapping and re-bootstrapping of authentication credentials. The methods include activation key entry by the installer into the cloud web interface. Activation key generation is based upon the MAC address and a shared secret between the manufacturer and the service provider. Embodiments of the system allow activation of a camera with a valid activation key that is not already provisioned in the global registry server.

Embodiments include a web-based interface for use in activating, configuring, remote firmware update, and re-configuring of a camera.

Embodiments detect local wifi access points and provide these as options during camera configuring and re-configuring. Embodiments generate and provide recommendations around choosing a best wifi access point based upon characteristics of the network (e.g., signal strength, error rates, interference, etc.). Embodiments include methods for testing and diagnosing issues with wifi and network access.

Embodiments include cameras able to perform this wifi test using only one physical network interface, an approach that enables the camera to dynamically change this physical interface from wired to wifi. Embodiments are able to change the network settings (wifi, etc.) remotely using the same process.

Cameras of an embodiment can be configured with multiple network preferences with priority order so that the camera can move between different locations and the camera can automatically find the best network to join (e.g., can have multiple ssid+bssid+password sets configured and prioritized).
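
As a non-limiting illustration of the prioritized network preferences just described, the Python sketch below joins the highest-priority configured network that is currently visible. The configuration format and the scan results are illustrative assumptions only.

# Sketch of choosing among prioritized (ssid, bssid, password) sets.

configured = [  # priority order: the first entry is preferred
    {"ssid": "HomeNet",   "bssid": "aa:bb:cc:dd:ee:01", "password": "***"},
    {"ssid": "OfficeNet", "bssid": "aa:bb:cc:dd:ee:02", "password": "***"},
    {"ssid": "HotelNet",  "bssid": None,                "password": "***"},
]

def pick_network(visible_ssids):
    # Join the highest-priority configured network that is currently visible.
    for net in configured:
        if net["ssid"] in visible_ssids:
            return net
    return None

print(pick_network({"OfficeNet", "CoffeeShop"}))   # selects the OfficeNet entry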

Regarding firmware download, embodiments include a mechanism to monitor the status of the firmware update, provide feedback to the end user and improve overall quality of the system.

Embodiments use RTSP over SSL to a cloud media relay server to allow live video NAT traversal to a remote client (e.g., PC, cell phone, etc.) in a secure manner where the camera provides media session authentication credentials to the server. The camera initiates the SSL connection to the cloud and then acts as an RTSP server over this connection.

Embodiments include methods for using NAT traversal for connecting to the cloud for remote management and live video access, which allows the integrated security components to avoid port forwarding on the local router(s) and, as a result, maintain a more secure local network and a more secure camera since no ports are required to be open.

Embodiments enable camera sensors (e.g., motion, audio, heat, etc.) to serve as triggers to other actions in the automation system. The capture of video clips or snapshots from the camera is one such action, but the embodiments are not so limited.

A camera of an embodiment can be used by multiple systems.

A detailed description of flows follows relating to the camera tunnel of an embodiment.

A detailed description of camera startup and installation follows as it pertains to the camera tunnel of an embodiment.

Activation Key

    • a. camera to follow the same algorithm as the ihub, where the activation key is generated from the serial based upon a one-way hash of the serial and a per-vendor shared secret (a hypothetical derivation sketch follows this list).
    • b. Use the com.icontrol.util.ops.activation.ActivationKeyUtil class to validate serialNo <-> activationKey.
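
As referenced above, the following Python sketch illustrates one way an activation key could be derived from a serial and a per-vendor shared secret with a one-way hash. The choice of HMAC-SHA-256, the truncation length, and the formatting are assumptions for illustration and are not the actual algorithm.

# Hypothetical sketch of activation key derivation and validation.

import hmac, hashlib

def activation_key(serial, vendor_secret, length=10):
    # One-way hash over the serial, keyed by the per-vendor shared secret (assumed HMAC-SHA-256).
    digest = hmac.new(vendor_secret, serial.encode(), hashlib.sha256).hexdigest()
    return digest[:length].upper()

def validate(serial, key, vendor_secret):
    # Validate a serialNo <-> activationKey pair.
    return hmac.compare_digest(activation_key(serial, vendor_secret), key.upper())

key = activation_key("CAM00012345", b"per-vendor-shared-secret")
print(key, validate("CAM00012345", key, b"per-vendor-shared-secret"))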

Registry Request

[partner]/registry/[device type]/[serial]

    • a. new column in existing registry table for id type; nullable but the application treats null as “gateway”.
    • b. rest endpoints allow adding with the new optional argument.
    • c. current serial and siteId uniqueness enforcement by application depends upon device type (for any device type, there should be uniqueness on serial; for gateway device type, there should be uniqueness on siteId; for other device types, there need not be uniqueness on siteId).
    • d. if no activation yet (e.g., no entry) then send dummy response (random but repeatable reply; may include predictable “dummy” marker so that subsequent steps can infer that the response is a dummy).
    • e. add/update registry server endpoints for adding/updating entries.

If Camera has No Password

Camera retrieves “Pending Key” via POST to

/<CredentialGatewayURL>/GatewayService/<siteID>/PendingDeviceKey.

    • a. pending key request (to get password) with serial and activation key.
    • b. server checks for dummy reply; if dummy then responds with retry backoff response.
    • c. server invokes pass-through API on gateway to get new pending key.
    • d. if device is found, then gateway performs validation of serial+activation key, returns error if mismatch.
    • e. if activation key checks out, then gateway checks pending key status.
    • f. if device currently has a pending key status, then a new pending password is generated.
    • g. gateway maintains this authorization information in a new set of variables on the camera device.
    • h. device-authorization/session-key comprises the current connected password.
    • i. device-authorization/pending-expiry comprises a UTC timestamp representing the time the current pending password period ends; any value less than the current time or blank means the device is not in a pending password state.
    • j. device-authorization/pending-session-key comprises the last password returned to the camera in a pending request; this is optional (device may choose to maintain this value in memory).
    • k. session-key and pending-session-key variables tagged with “encryption” in the device def which causes rest and admin to hide their value from client.

ConnectInfo Request

    • a. returns xmpp host and port to connect to (comes from config as it does for gateway connect info).
    • b. returns connectInfo with additional <xmpp> parameter.

Start Portal Add Camera Wizard

    • a. user enters camera serial, activation key.
    • b. addDevice rest endpoint on gateway called
    • c. gateway verifies activation key is correct.
    • d. gateway calls addDevice method on gapp server to add LWG_SerComm_iCamera_1000 with given serial to site.
    • e. Server detects the camera type and populates registry.
    • f. gateway puts device into pending password state (e.g., updates device-auth/pending-expiry point).
    • g. rest endpoints on gateway device for managing device pending password state.
    • h. start pending password state: POST future UTC value to device-auth/pending-expiry; device-auth/pending-expiry set to 30 minutes from time device was added.
    • i. stop pending password state: POST −1 to device-auth/pending-expiry.
    • j. check pending password state: GET device-auth/pending-expiry.
    • k. message returned with “Location” header pointing to relative URI.
    • l. user told to power on camera (or reboot if already powered on).
    • m. once camera connects, gateway updates device-auth/pending-expiry to −1 and device-auth/session-key with password and device/connection-status to connected
    • n. portal polls for device/connection-status to change to connected; if does not connect after X seconds, bring up error page (camera has not connected—continue waiting or start over).
    • o. user asked if wifi should be configured for this camera.
    • p. entry fields for wifi ssid and password.
    • q. portal can pre-populate ssid and password fields with picklist of any from other cameras on the site.
    • r. get XML of available SSIDs.
    • s. non-wifi option is allowed.
    • t. portal submits options to configure camera (use null values to specify non-wifi); upon success, message is returned with “Location” header pointing to relative URI.
    • u. checks configuration progress and extracts “status” and “subState” fields.
    • v. puts device state into “configuring”; upon error, puts device state into “configuration failure”.
    • w. performs firmware upgrade if needed, placing device state into “upgrading”; upon error, puts device state into “upgrade failure”.
    • x. upon configuration success, puts device state of “ok” and applies appropriate configuration for camera (e.g., resolutions, users, etc.).
    • y. if non-blank wifi parameters, automatically perform “wifi test” method to test wifi without disconnecting Ethernet.
    • z. portal wizard polls device status until it changes to “ok”, “upgrade failure”, or “configuration failure” in the “status” field, along with an error code reason in the “subState” field, if applicable; upon error, show details to user and provide options (start over, configure again, reboot, factory reset, etc.).
    • aa. notify user they can move camera to desired location.

Camera Reboots

    • a. gets siteId and server URL from registry.
    • b. makes pending key request to server specifying correct siteId, serial and activation key; gets back pending password.
    • c. makes connectInfo request to get xmpp server.
    • d. connects over xmpp with pending password.

If Camera Reboots Again

    • a. get siteId and server URL from registry.
    • b. already has password (may or may not be pending) so no need to perform pending key request.
    • c. make connectInfo request to get xmpp server.
    • d. connect over xmpp with password.

Xmpp Connect with Password

    • a. xmpp user is of the form [serial]@[server]/[siteId].
    • b. session server performs authentication by making passthrough API request to gateway for given SiteId.
    • c. Session xmpp server authenticates new session using DeviceKey received in GET request against received xmpp client credential.
    • d. If authentication fails or the GET receives a non-response, the server returns an XMPP connect retry backoff with a long backoff to the camera.
    • e. gateway device performs password management.
    • f. compares password with current key and pending key (if not expired); if matches pending, then update device-auth/session-key to be pending value, and clear out the device-auth/pending-expiry.
    • g. gateway device updates the device/connection-status point to reflect that camera is connected.
    • h. gateway device tracks the xmpp session server this camera is connected to via new point device/proxy-host and updates this info if changed.
    • i. if deviceConnected returns message, then session server posts connected event containing xmpp user to queue monitored by all session servers.
    • j. session servers monitor these events and disconnect/cleanup sessions they have for same user.
    • k. may use new API endpoint on session server for broadcast messages.

Xmpp Connect with Bad Password

    • a. Upon receiving a new connection request, session server performs authentication by making passthrough API request to gateway for given SiteId.
    • b. Session xmpp server authenticates new session using DeviceKey received in above GET request against received xmpp client credential.
    • c. If authentication fails or GET receives non-response from virtual gateway.
    • d. Session server rejects incoming connection (is there a backoff/retry XMPP response that can be sent here).
    • e. Session server logs event.
    • f. Gateway logs event.

Xmpp Disconnect

    • a. session server posts disconnected event to gateway (with session server name).
    • b. gateway updates the device/connected variable/point to reflect that camera is disconnected.
    • c. gateway updates the device/connection-status variable/point to reflect that camera is disconnected.
    • d. gateway clears the device/proxy-host point that contains the session host to which this camera is connected.

LWGW Shutdown

    • a. During LWGW shutdown, gateway can broadcast messages to all XMPP servers to ensure all active XMPP sessions are gracefully shutdown.
    • b. gateways use REST client to call URI, which will broadcast to all XMPP servers.

To Configure Camera During Installation

    • a. applies all appropriate configuration for camera (e.g., resolutions, users, etc).
    • b. returns message for configuration applied, wifi test passed, and all settings taken; returns other response code with error code description upon any failure.

To Reconfigure Wifi SSID and Key

    • a. returns message for wifi credentials set.
    • b. returns other response code with error code description upon any failure.

API Pass-Through Handling for Gateway Fail-Over Case

    • a. When performing passthrough for LWGW, the API endpoint handles the LWGW failover case (e.g., when gateway is not currently running on any session server).
    • b. passthrough functions in the following way: current session server IP is maintained on the gateway object; the server looks up the gateway object to get the session IP and then sends the passthrough request to that session server; if that request returns a gateway not found message, a server error message, or a network level error (e.g., cannot route to host, etc.), and the gateway is a LWGW, then the server should look up the primary/secondary LW Gateway group for this site; the server should then send a resume message to the primary, followed by the rest request; if that fails, then the server sends a resume message to the secondary followed by the rest request.
    • c. alternatively, passthrough functions in the following way: rather than looking up the session server IP on the gateway object, passthrough requests should be posted to a passthrough queue that is monitored by all session servers; the session server with the Gateway on it should consume the message (and pass it to the appropriate gateway); the server should monitor for expiry of these messages, and if the gateway is a LWGW then the server should look up the primary/secondary LW Gateway group for this site; the server should then send a resume message to the primary, followed by the rest request; if that fails, then the server sends a resume message to the secondary followed by the rest request.

A detailed description follows for additional flows relating to the camera tunnel of an embodiment.

Motion Detection

    • a. camera sends openhome motion event to session server via xmpp.
    • b. session server posts motion event to gateway via passthrough API.
    • c. gateway updates the camera motion variable/point to reflect the event.

Capture Snapshot

    • a. gateway posts openhome snapshot command to session server with camera connected.
    • b. gateway sends command including xmpp user id to xmpp command Queue monitored by all session servers.
    • c. session server with given xmpp user id consumes command and sends command to camera (command contains upload URL on gw webapp).
    • d. gateway starts internal timer to check if a response is received from camera (e.g., 5 sec wait window).
    • e. if broadcast RabbitMQ not ready, then gateway will use device/proxy-host value to know which session server to post command to.
    • f. session server sends command to camera (comprises upload URL on gw webapp)
    • g. Example XML body:

<MediaUpload>
  <id>1321896772660</id>
  <snapShotImageType>JPEG</snapShotImageType>
  <gateway_url>[gatewaysyncUrl]/gw/GatewayService/SPutJpg/s/[siteId]/[deviceIndex]/[varValue]/[varIndex]/[who]/[ts]/[HMM]/[passCheck]/</gateway_url>
  <failure_url>[gatewaysyncUrl]/gw/GatewayService/SPutJpgError/s/[siteId]/[deviceIndex]/[varValue]/[varIndex]/[who]/[ts]/[HMM]/[passCheck]/</failure_url>
</MediaUpload>
    • h. session server receives response to sendRequestEvent from camera and posts response to gateway.
    • i. camera uploads to upload URL on gw webapp.
    • j. passCheck can be verified on server (based upon gateway secret); alternatively, the OpenHome spec calls for Digest Auth here (a hypothetical passCheck sketch follows this list).
    • k. endpoint responds with message digest password if the URI is expected, otherwise returns non-response.
    • l. gw webapp stores snapshot, logs history event.
    • m. event is posted to gateway for deltas.
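
As referenced above, the following Python sketch shows one hypothetical way a passCheck could be computed over the upload-URL path components using a gateway secret and re-verified by the server. The actual passCheck construction is not specified in this description; the hash choice, truncation, and example component values are assumptions for illustration only.

# Hypothetical passCheck computation and verification over the URL path components.

import hmac, hashlib

def pass_check(secret, *components):
    # Keyed hash over the joined path components (assumed construction).
    msg = "/".join(components).encode()
    return hmac.new(secret, msg, hashlib.sha256).hexdigest()[:16]

def verify(secret, supplied, *components):
    return hmac.compare_digest(pass_check(secret, *components), supplied)

secret = b"gateway-secret"   # placeholder gateway secret
check = pass_check(secret, "site42", "3", "snapshot", "0", "gw", "1321896772", "0105")
print(verify(secret, check, "site42", "3", "snapshot", "0", "gw", "1321896772", "0105"))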

Capture Clip

    • a. gateway posts openhome video clip capture command to session server with camera connected.
    • b. gateway sends command including xmpp user id to xmpp command Queue monitored by all session servers.
    • c. session server with given xmpp user id consumes command and sends command to camera (command comprises upload URL on gw webapp).
    • d. gateway starts internal timer to check if a response is received from camera (e.g., 5 sec wait window).
    • e. session server sends command to camera (comprises upload URL on gw webapp).
    • f. Example URI from session server to camera:
      • /openhome/streaming/channels/1/video/upload
    • g. Example XML body:

<MediaUpload>
  <id>1321898092270</id>
  <videoClipFormatType>MP4</videoClipFormatType>
  <gateway_url>[gatewaysyncUrl]/gw/GatewayService/SPutMpeg/s/[siteId]/[deviceIndex]/[varValue]/[varIndex]/[who]/[ts]/[HMM]/[passCheck]/</gateway_url>
  <failure_url>[gatewaysyncUrl]/gw/GatewayService/SPutMpegFailed/s/[siteId]/[deviceIndex]/[varValue]/[varIndex]/[who]/[ts]/[HMM]/[passCheck]/</failure_url>
</MediaUpload>
    • h. session server receives response to sendRequestEvent from camera and posts response to gateway.
    • i. camera uploads to upload URL on gw webapp.
    • j. passCheck can be verified on server (based upon gateway secret).
    • k. alternatively, spec calls for Digest Auth here.
    • l. endpoint responds with message digest password if the URI is expected, otherwise returns non-response.
    • m. gw webapp stores video clip, logs history event.
    • n. event is posted to gateway for deltas.

Live Video (Relay)

    • a. Upon user login to portal, portal creates a media relay tunnel by calling relayAPImanager create.
    • b. RelayAPImanager creates relays and sends ip-config-relay variable (which instructs gateway to create media tunnel) to gateway.
    • c. Upon receiving media tunnel create ip-config-relay command, gateway posts openhome media channel create command to session server with camera connected.
    • d. session server sends create media tunnel command to camera (comprises camera relay URL on relay server).
    • e. Example URI from session server to camera:
      • /openhome/streaming/mediatunnel/create
    • f. Example XML body:

<CreateMediaTunnel>
  <sessionID>1</sessionID>
  <gatewayURL>TBD</gatewayURL>
  <failureURL>TBD</failureURL>
</CreateMediaTunnel>
    • g. GatewayURL is created from relay server, port, and sessionId info included within ip-config-relay variable.
    • h. camera creates a TLS tunnel to relay server via POST to <gatewayURL>.
    • i. When user initiates live video, portal determines user is remote and retrieves URL of Relay server from relayAPImanager.
    • j. Upon receiving a user pole connection on the relay server (along with a valid rtsp request), relay sends streaming command to camera; example: rtsp://openhome/streaming/channels/1/rtsp.
    • k. Upon user portal logout, portal calls relayAPImanager to terminate media tunnel.
    • l. RelayAPImanager sends ip-config-relay variable to terminate media tunnel.
    • m. Gateway sends destroy media tunnel command to camera via XMPP.

Camera Firmware Update

    • a. Gateway checks camera firmware version; if below minimum version, gateway sends command to camera (via session server) to upgrade firmware (command: /openhome/system/updatefirmware).
    • b. Gateway checks firmware update status by polling: /openhome/system/updatefirmware/status (a polling sketch follows this list).
    • c. Gateway informs portal of upgrade status.
    • d. Camera auto-reboots after firmware update and reconnects to Session server.
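
As referenced above, the following Python sketch illustrates the trigger-and-poll sequence using the two endpoints named in this flow. The camera base URL, the empty POST body, and the status-string checks are placeholders; in the embodiments the commands travel to the camera via the session server rather than by direct HTTP.

# Sketch of triggering a firmware update and polling its status endpoint.

import time, urllib.request

CAMERA = "http://192.168.1.50"   # placeholder; commands normally route via the session server

def trigger_update():
    # Corresponds to the /openhome/system/updatefirmware command named above.
    req = urllib.request.Request(CAMERA + "/openhome/system/updatefirmware", method="POST")
    urllib.request.urlopen(req)

def poll_status(interval=5.0, attempts=60):
    # Corresponds to polling /openhome/system/updatefirmware/status.
    for _ in range(attempts):
        with urllib.request.urlopen(CAMERA + "/openhome/system/updatefirmware/status") as resp:
            body = resp.read().decode()
        if "complete" in body.lower() or "fail" in body.lower():   # illustrative status checks
            return body
        time.sleep(interval)
    return "timed out"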

Camera First-Contact Configuration

    • a. After a camera is added successfully and is connected to the session server for the first time, gateway performs first contact configuration as follows.
    • b. Check firmware version.
    • c. Configure settings by: download config file using /openhome/system/configurationData/configFile; or configure each category individually (configure video input channel settings—/openhome/system/video/inputs/channels; configure audio input channel settings (if any)—/openhome/system/audio/inputs/channels; configure video streaming channel settings—/openhome/streaming/channels; configure motion detection settings—example: PUT /openhome/custom/motiondetection/pir/0; configure event trigger settings—example: PUT /openhome/custom/event).
    • d. Reboot camera (/openhome/system/factoryreset) if camera responds with reboot required.

Embodiments include a system comprising premises equipment comprising a plurality of premises devices located at a premises. The system includes a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. The system includes a system server configured to interact with the plurality of premises devices and the partner device. The system includes a user interface coupled to the system server and configured to interact with the plurality of premises devices. The user interface includes a partner user interface corresponding to the partner device. The partner user interface configures the user interface to interact with the partner device. The user interface is configured to control interactions between the plurality of premises devices and the partner device.

Embodiments include a system comprising: premises equipment comprising a plurality of premises devices located at a premises; a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment; a system server configured to interact with the plurality of premises devices and the partner device; a user interface coupled to the system server and configured to interact with the plurality of premises devices, wherein the user interface includes a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device, wherein the user interface is configured to control interactions between the plurality of premises devices and the partner device.

Embodiments include a system comprising premises equipment comprising a plurality of premises devices located at a premises. The premises equipment corresponds to a service provider. The system includes a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. The system includes a system server configured to interact with the plurality of premises devices. The system server is configured to interact with the partner device via a partner proxy corresponding to the partner device. The system includes a user interface coupled to the system server and configured to interact with the plurality of premises devices. The user interface includes a partner user interface corresponding to the partner device. The partner user interface configures the user interface to interact with the partner device.

Embodiments include a system comprising: premises equipment comprising a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider; a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment; a system server configured to interact with the plurality of premises devices, wherein the system server is configured to interact with the partner device via a partner proxy corresponding to the partner device; a user interface coupled to the system server and configured to interact with the plurality of premises devices, wherein the user interface includes a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.

The partner proxy is configured to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

The partner device at least one of controls and triggers automations in the plurality of premises devices via the partner server.

The system server is configured to include the partner proxy and to communicate with the partner server via the partner proxy.

The partner proxy is configured to proxy calls from the partner user interface to the partner server.

The partner user interface is embedded in the user interface.

The system includes an integration server coupled to the system server.

The system server includes an integration application programming interface (API).

The integration server is coupled to the integration API.

The system includes an event bus coupled to the system server and the integration server.

The integration server is coupled to the partner server.

The system includes an integration adapter coupled to the integration server.

The integration adapter is configured to translate between protocols of the system server and the partner server.

The integration adapter is coupled to the partner server.

The integration adapter is configured to provide endpoints for associating with the partner device.

The integration adapter is configured to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

The device data includes at least one of state data, control data, and command data.

The integration adapter is configured to send events to the integration server.

The integration adapter is configured to send events including acknowledgment of incoming system events.

The integration adapter is configured as an endpoint for managed partner device events to be reported to the system server.

The system includes a rules engine.

The rules engine is configured to run on the premises equipment.

The rules engine is configured to run on the system server.

The rules engine is configured to run on at least one of the system server and the premises equipment.

The system includes automation rules running on the rules engine, wherein the automation rules include actions and triggers for controlling interactions between at least one of the partner device and the plurality of premises devices.

The rules engine is configured to treat an event relating to the partner device as a trigger for at least one rule.

In response to the event the at least one rule triggers at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of devices.
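
As a non-limiting illustration of a rule whose trigger is a partner-device event and whose actions target other devices, the Python sketch below matches incoming events against rule triggers and dispatches the corresponding action events. The rule schema, event shape, and device names are illustrative assumptions, not the system's actual rule format.

# Minimal sketch of trigger/action automation rules.

from dataclasses import dataclass, field

@dataclass
class Rule:
    trigger: dict                      # e.g. a partner-device event pattern
    actions: list = field(default_factory=list)

def on_event(event, rules, dispatch):
    # Fire every rule whose trigger fields all match the incoming event.
    for rule in rules:
        if all(event.get(k) == v for k, v in rule.trigger.items()):
            for action in rule.actions:
                dispatch(action)       # action event sent to a partner or premises device

rules = [Rule(trigger={"device": "doorbell-1", "event": "button-pressed"},
              actions=[{"device": "porch-light", "command": "on"},
                       {"device": "front-camera", "command": "capture-clip"}])]

on_event({"device": "doorbell-1", "event": "button-pressed"}, rules, print)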

The partner user interface is configured to at least one of create and edit at least one rule of the automation rules.

The partner user interface is configured to delete at least one rule of the automation rules.

The partner user interface includes a list comprising a plurality of partners, wherein the plurality of partners correspond to a plurality of partner devices.

The partner user interface is configured with partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

The partner user interface is configured to authenticate with the partner server a user corresponding to the partner device.

The system server is configured with information of the partner device as a result of authentication of the user.

The partner user interface interacts with the partner server via the partner proxy.

The partner user interface receives data relating to the partner device from the partner server.

The system includes an integration descriptor, wherein the integration descriptor includes capabilities data of the partner device.

The capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

The integration descriptor is configured for use to provide access to capabilities of the system server.

The system includes a rules template file including a description of available functionality of at least one of the plurality of premises devices and the partner device.

The system includes a card user interface coupled to the system server, wherein the card user interface includes UI elements in a card-like configuration and configured to generate the partner user interface.

Embodiments include a method comprising configuring a system server to interact with premises equipment comprising a plurality of premises devices located at a premises. The premises equipment corresponds to a service provider. The method includes configuring the system server to interact with a partner device at the premises via a partner proxy corresponding to the partner device. The partner device is configured to use a partner protocol different from a protocol of the premises equipment. The method includes configuring a user interface to interact with the plurality of premises devices. The user interface includes a partner user interface corresponding to the partner device. The partner user interface configures the user interface to interact with the partner device.

Embodiments include a method comprising: configuring a system server to interact with premises equipment comprising a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider; configuring the system server to interact with a partner device at the premises via a partner proxy corresponding to the partner device, wherein the partner device is configured to use a partner protocol different from a protocol of the premises equipment; configuring a user interface to interact with the plurality of premises devices, wherein the user interface includes a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.
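
By way of non-limiting example, the following Python sketch illustrates the configuration described above: a system server that interacts with premises devices directly and with a partner device only through a corresponding partner proxy, and a user interface that embeds a partner user interface. All class and method names are hypothetical.

class SystemServer:
    def __init__(self) -> None:
        self.premises_devices = []      # devices using the service provider protocol
        self.partner_proxies = {}       # partner device id -> partner proxy

    def register_premises_device(self, device_id: str) -> None:
        self.premises_devices.append(device_id)

    def register_partner_proxy(self, partner_device_id: str, proxy) -> None:
        # The server does not speak the partner protocol itself; the proxy does.
        self.partner_proxies[partner_device_id] = proxy

class UserInterface:
    def __init__(self, server: SystemServer) -> None:
        self.server = server
        self.partner_uis = {}           # embedded partner user interfaces

    def embed_partner_ui(self, partner_device_id: str, partner_ui) -> None:
        # The embedded partner user interface configures this user interface
        # to interact with the corresponding partner device.
        self.partner_uis[partner_device_id] = partner_ui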

The method includes configuring the partner proxy to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

The method includes configuring the partner device to at least one of control and trigger automations in the plurality of premises devices via the partner server.

The method includes configuring the system server to include the partner proxy and to communicate with the partner server via the partner proxy.

The method includes configuring the partner proxy to proxy calls from the partner user interface to the partner server.

The method includes embedding the partner user interface in the user interface.

The method includes configuring an integration server to communicate with the system server.

The method includes configuring the system server to include an integration application programming interface (API).

The integration server is coupled to the integration API.

The integration server is coupled to the partner server.

The method includes configuring an integration adapter to communicate with the integration server.

The method includes configuring the integration adapter to translate between protocols of the system server and the partner server.

The method includes configuring the integration adapter to communicate with the partner server.

The method includes configuring the integration adapter to provide endpoints for associating with the partner device.

The method includes configuring the integration adapter to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

The device data includes at least one of state data, control data, and command data.

The method includes configuring the integration adapter to send events to the integration server.

The method includes configuring the integration adapter to send events including acknowledgment of incoming system events.

The method includes configuring the integration adapter as an endpoint for managed partner device events to be reported to the system server.
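
By way of non-limiting example, the following Python sketch illustrates one possible integration adapter that translates between a system server event format and a partner server format, processes events arriving from the integration server, and returns acknowledgment events; managed partner device events flow back through the same adapter. Both message formats shown are hypothetical.

class IntegrationAdapter:
    def to_partner_format(self, system_event: dict) -> dict:
        # Translate a system server event into the partner server's protocol.
        return {"deviceId": system_event["device_id"], "cmd": system_event["command"]}

    def to_system_format(self, partner_event: dict) -> dict:
        # Translate a managed partner device event for reporting to the system server.
        return {"device_id": partner_event["deviceId"], "state": partner_event.get("state")}

    def process_event(self, system_event: dict) -> dict:
        # Process an event coming from the integration server (state, control, or
        # command data for the partner device) and send back an acknowledgment event.
        translated = self.to_partner_format(system_event)
        # ... deliver `translated` to the partner server here ...
        return {"type": "ack", "device_id": system_event["device_id"]}

adapter = IntegrationAdapter()
print(adapter.process_event({"device_id": "thermostat-1", "command": "set_temperature"}))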

The method includes configuring a rules engine to run on at least one of the system server and the premises equipment.

The method includes configuring the rules engine to run on the premises equipment.

The method includes configuring the rules engine to run on the system server.

The method includes configuring automation rules to execute on the rules engine, wherein the automation rules include actions and triggers for controlling interactions between at least one of the partner device and the plurality of premises devices.

The method includes configuring the rules engine to treat an event relating to the partner device as a trigger for at least one rule.

The method includes, in response to the event, triggering with the at least one rule at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of premises devices.

The method includes configuring the partner user interface to at least one of create and edit at least one rule of the automation rules.

The method includes configuring the partner user interface to delete at least one rule of the automation rules.

The method includes configuring the partner user interface to include a list comprising a plurality of partners, wherein the plurality of partners correspond to a plurality of partner devices.

The method includes configuring the partner user interface to include partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

The method includes configuring the partner user interface to authenticate with the partner server a user corresponding to the partner device.

The method includes configuring the system server with information of the partner device as a result of authentication of the user.

The method includes configuring the partner user interface to interact with the partner server via the partner proxy.

The method includes configuring the partner user interface to receive data relating to the partner device from the partner server.

The method includes configuring an integration descriptor to include capabilities data of the partner device.

The capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

The method includes configuring the integration descriptor for use to provide access to capabilities of the system server.

The method includes configuring a rules template file to include a description of available functionality of at least one of the plurality of premises devices and the partner device.

Embodiments include a system comprising premises equipment comprising a plurality of premises devices located at a premises. The premises equipment corresponds to a service provider. The system includes a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment. The system includes a system server configured to interact with the plurality of premises devices. The system server is configured to interact with the partner device via a partner proxy corresponding to the partner device. The system includes automation rules coupled to the system server. The automation rules include actions and triggers for controlling interactions between at least one of the partner device and the plurality of premises devices. The system includes a user interface coupled to the system server and configured to interact with the plurality of premises devices and the partner device.

Embodiments include a system comprising: premises equipment comprising a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider; a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment; a system server configured to interact with the plurality of premises devices, wherein the system server is configured to interact with the partner device via a partner proxy corresponding to the partner device; automation rules coupled to the system server, wherein the automation rules include actions and triggers for controlling interactions between at least one of the partner device and the plurality of premises devices; a user interface coupled to the system server and configured to interact with the plurality of premises devices and the partner device.

The user interface includes a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.

The partner user interface is configured to at least one of create and edit at least one rule of the automation rules.

The partner user interface is configured to delete at least one rule of the automation rules.

The partner proxy is configured to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

The partner device at least one of controls and triggers automations in the plurality of premises devices via the partner server.

The system server is configured to include the partner proxy and to communicate with the partner server via the partner proxy.

The partner proxy is configured to proxy calls from the partner user interface to the partner server.

The partner user interface is embedded in the user interface.

The system includes an integration server coupled to the system server.

The system server includes an integration application programming interface (API).

The integration server is coupled to the integration API.

The system includes an event bus coupled to the system server and the integration server.
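
By way of non-limiting example, the following Python sketch illustrates one possible event bus coupling the system server and the integration server, so that either side may publish device events the other subscribes to. The EventBus class and topic names are hypothetical.

from collections import defaultdict
from typing import Callable, DefaultDict, List

class EventBus:
    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
bus.subscribe("partner.device.event", lambda e: print("integration server received:", e))
bus.publish("partner.device.event", {"device_id": "camera-1", "state": "motion_detected"})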

The integration server is coupled to the partner server.

The system includes an integration adapter coupled to the integration server.

The integration adapter is configured to translate between protocols of the system server and the partner server.

The integration adapter is coupled to the partner server.

The integration adapter is configured to provide endpoints for associating with the partner device.

The integration adapter is configured to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

The device data includes at least one of state data, control data, and command data.

The integration adapter is configured to send events to the integration server.

The integration adapter is configured to send events including acknowledgment of incoming system events.

The integration adapter is configured as an endpoint for managed partner device events to be reported to the system server.

The system includes a rules engine configured to run the automation rules.

The rules engine is configured to run on the premises equipment.

The rules engine is configured to run on the system server.

The rules engine is configured to run on at least one of the system server and the premises equipment.

The rules engine is configured to treat an event relating to the partner device as a trigger for at least one rule.

In response to the event, the at least one rule triggers at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of premises devices.

The partner user interface includes a list comprising a plurality of partners, wherein the plurality of partners corresponds to a plurality of partner devices.

The partner user interface is configured with partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

The partner user interface is configured to authenticate with the partner server a user corresponding to the partner device.

The system server is configured with information of the partner device as a result of authentication of the user.

The partner user interface interacts with the partner server via the partner proxy.

The partner user interface receives data relating to the partner device from the partner server.

The system includes an integration descriptor that includes capabilities data of the partner device.

The capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

The integration descriptor is configured for use to provide access to capabilities of the system server.

The system includes a rules template file including a description of available functionality of at least one of the plurality of premises devices and the partner device.

Embodiments include a method comprising configuring a system server to interact with premises equipment including a plurality of premises devices located at a premises. The premises equipment corresponds to a service provider. The method includes configuring the system server to interact with a partner device at the premises via a partner proxy corresponding to the partner device. The partner device is configured to use a partner protocol different from a protocol of the premises equipment. The method includes configuring automation rules to control interactions between at least one of the partner device and the plurality of premises devices using actions and triggers. The method includes configuring a user interface coupled to the system server to interact with the plurality of premises devices and the partner device.

Embodiments include a method comprising: configuring a system server to interact with premises equipment including a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider; configuring the system server to interact with a partner device at the premises via a partner proxy corresponding to the partner device, wherein the partner device is configured to use a partner protocol different from a protocol of the premises equipment; configuring automation rules to control interactions between at least one of the partner device and the plurality of premises devices using actions and triggers; configuring a user interface coupled to the system server to interact with the plurality of premises devices and the partner device.

The method includes configuring the user interface to include a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.

The method includes configuring the partner user interface to at least one of create and edit at least one rule of the automation rules.

The method includes configuring the partner user interface to delete at least one rule of the automation rules.

The method includes configuring the partner proxy to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

The method includes configuring the partner device to at least one of control and trigger automations in the plurality of premises devices via the partner server.

The method includes configuring the system server to include the partner proxy and to communicate with the partner server via the partner proxy.

The method includes configuring the partner proxy to proxy calls from the partner user interface to the partner server.

The method includes embedding the partner user interface in the user interface.

The method includes configuring an integration server to communicate with the system server.

The method includes configuring the system server to include an integration application programming interface (API).

The method includes configuring the integration server to communicate with the integration API.

The method includes configuring an event bus to communicate with the system server and the integration server.

The method includes configuring the integration server to communicate with the partner server.

The method includes configuring the integration server to communicate with an integration adapter.

The integration adapter is configured to translate between protocols of the system server and the partner server.

The integration adapter is coupled to the partner server.

The integration adapter is configured to provide endpoints for associating with the partner device.

The integration adapter is configured to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

The device data includes at least one of state data, control data, and command data.

The integration adapter is configured to send events to the integration server.

The integration adapter is configured to send events including acknowledgment of incoming system events.

The integration adapter is configured as an endpoint for managed partner device events to be reported to the system server.

The method includes configuring a rules engine to run the automation rules.

The method includes configuring the rules engine to run on the premises equipment.

The method includes configuring the rules engine to run on the system server.

The method includes configuring the rules engine to run on at least one of the system server and the premises equipment.

The method includes configuring the rules engine to treat an event relating to the partner device as a trigger for at least one rule.

The method includes, in response to the event, triggering with the at least one rule at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of premises devices.

The method includes configuring the partner user interface to include a list comprising a plurality of partners, wherein the plurality of partners correspond to a plurality of partner devices.

The method includes configuring the partner user interface with partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

The method includes configuring the partner user interface to authenticate with the partner server a user corresponding to the partner device.

The method includes configuring the system server with information of the partner device as a result of authentication of the user.

The method includes configuring the partner user interface to interact with the partner server via the partner proxy.

The method includes configuring the partner user interface to receive data relating to the partner device from the partner server.

The method includes configuring an integration descriptor to include capabilities data of the partner device.

The capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

The method includes configuring the integration descriptor for use to provide access to capabilities of the system server.

The method includes configuring a rules template file to include a description of available functionality of at least one of the plurality of premises devices and the partner device.

As described above, computer networks suitable for use with the embodiments described herein include local area networks (LANs), wide area networks (WANs), the Internet, or other connection services and network variations such as the world wide web, the public internet, a private internet, a private computer network, a public network, a mobile network, a cellular network, a value-added network, and the like. Computing devices coupled or connected to the network may be any microprocessor-controlled device that permits access to the network, including terminal devices, such as personal computers, workstations, servers, minicomputers, mainframe computers, laptop computers, mobile computers, palmtop computers, handheld computers, mobile phones, TV set-top boxes, or combinations thereof. The computer network may include one or more LANs, WANs, Internets, and computers. The computers may serve as servers, clients, or a combination thereof.

The system can be a component of a single system, multiple systems, and/or geographically separate systems. The system can also be a subcomponent or subsystem of a single system, multiple systems, and/or geographically separate systems. The system can be coupled to one or more other components (not shown) of a host system or a system coupled to the host system.

One or more components of the system and/or a corresponding system or application to which the system is coupled or connected includes and/or runs under and/or in association with a processing system. The processing system includes any collection of processor-based devices or computing devices operating together, or components of processing systems or devices, as is known in the art. For example, the processing system can include one or more of a portable computer, portable communication device operating in a communication network, and/or a network server. The portable computer can be any of a number and/or combination of devices selected from among personal computers, personal digital assistants, portable computing devices, and portable communication devices, but is not so limited. The processing system can include components within a larger computer system.

The processing system of an embodiment includes at least one processor and at least one memory device or subsystem. The processing system can also include or be coupled to at least one database. The term “processor” as generally used herein refers to any logic processing unit, such as one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASIC), etc. The processor and memory can be monolithically integrated onto a single chip, distributed among a number of chips or components, and/or provided by some combination of algorithms. The methods described herein can be implemented in one or more of software algorithm(s), programs, firmware, hardware, components, circuitry, in any combination.

The components of any system that includes the system herein can be located together or in separate locations. Communication paths couple the components and include any medium for communicating or transferring files among the components. The communication paths include wireless connections, wired connections, and hybrid wireless/wired connections. The communication paths also include couplings or connections to networks including local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), proprietary networks, interoffice or backend networks, and the Internet. Furthermore, the communication paths include removable and fixed media such as floppy disks, hard disk drives, and CD-ROM disks, as well as flash RAM, Universal Serial Bus (USB) connections, RS-232 connections, telephone lines, buses, and electronic mail messages.

Aspects of the systems and methods described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the systems and methods include: microcontrollers with memory (such as electronically erasable programmable read only memory (EEPROM)), embedded microprocessors, firmware, software, etc. Furthermore, aspects of the systems and methods may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. Of course the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.

It should be noted that any system, method, and/or other components disclosed herein may be described using computer aided design tools and expressed (or represented), as data and/or instructions embodied in various computer-readable media, in terms of their behavioral, register transfer, logic component, transistor, layout geometries, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, etc.). When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of the above described components may be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs.

Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words “herein,” “hereunder,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. When the word “or” is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list.

The above description of embodiments of the systems and methods is not intended to be exhaustive or to limit the systems and methods to the precise forms disclosed. While specific embodiments of, and examples for, the systems and methods are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the systems and methods, as those skilled in the relevant art will recognize. The teachings of the systems and methods provided herein can be applied to other systems and methods, not only for the systems and methods described above.

The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the systems and methods in light of the above detailed description.

Claims

1. A system comprising:

premises equipment comprising a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider;
a partner device located at the premises and configured to use a partner protocol different from a protocol of the premises equipment;
a system server configured to interact with the plurality of premises devices, wherein the system server is configured to interact with the partner device via a partner proxy corresponding to the partner device;
automation rules coupled to the system server, wherein the automation rules include actions and triggers for controlling interactions between at least one of the partner device and the plurality of premises devices;
a user interface coupled to the system server and configured to interact with the plurality of premises devices and the partner device.

2. The system of claim 1, wherein the user interface includes a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.

3. The system of claim 2, wherein the partner user interface is configured to at least one of create and edit at least one rule of the automation rules.

4. The system of claim 2, wherein the partner user interface is configured to delete at least one rule of the automation rules.

5. The system of claim 2, wherein the partner proxy is configured to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

6. The system of claim 5, wherein the partner device at least one of controls and triggers automations in the plurality of premises devices via the partner server.

7. The system of claim 6, wherein the system server is configured to include the partner proxy and to communicate with the partner server via the partner proxy.

8. The system of claim 5, wherein the partner proxy is configured to proxy calls from the partner user interface to the partner server.

9. The system of claim 5, wherein the partner user interface is embedded in the user interface.

10. The system of claim 5, comprising an integration server coupled to the system server.

11. The system of claim 10, wherein the system server includes an integration application programming interface (API).

12. The system of claim 11, wherein the integration server is coupled to the integration API.

13. The system of claim 11, comprising an event bus coupled to the system server and the integration server.

14. The system of claim 10, wherein the integration server is coupled to the partner server.

15. The system of claim 10, comprising an integration adapter coupled to the integration server.

16. The system of claim 15, wherein the integration adapter is configured to translate between protocols of the system server and the partner server.

17. The system of claim 15, wherein the integration adapter is coupled to the partner server.

18. The system of claim 17, wherein the integration adapter is configured to provide endpoints for associating with the partner device.

19. The system of claim 17, wherein the integration adapter is configured to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

20. The system of claim 19, wherein the device data includes at least one of state data, control data, and command data.

21. The system of claim 17, wherein the integration adapter is configured to send events to the integration server.

22. The system of claim 21, wherein the integration adapter is configured to send events including acknowledgment of incoming system events.

23. The system of claim 17, wherein the integration adapter is configured as an endpoint for managed partner device events to be reported to the system server.

24. The system of claim 5, comprising a rules engine, wherein the rules engine is configured to run the automation rules.

25. The system of claim 24, wherein the rules engine is configured to run on the premises equipment.

26. The system of claim 24, wherein the rules engine is configured to run on the system server.

27. The system of claim 24, wherein the rules engine is configured to run on at least one of the system server and the premises equipment.

28. The system of claim 24, wherein the rules engine is configured to treat an event relating to the partner device as a trigger for at least one rule.

29. The system of claim 28, wherein in response to the event the at least one rule triggers at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of premises devices.

30. The system of claim 5, wherein the partner user interface includes a list comprising a plurality of partners, wherein the plurality of partners correspond to a plurality of partner devices.

31. The system of claim 30, wherein the partner user interface is configured with partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

32. The system of claim 31, wherein the partner user interface is configured to authenticate with the partner server a user corresponding to the partner device.

33. The system of claim 32, wherein the system server is configured with information of the partner device as a result of authentication of the user.

34. The system of claim 5, wherein the partner user interface interacts with the partner server via the partner proxy.

35. The system of claim 34, wherein the partner user interface receives data relating to the partner device from the partner server.

36. The system of claim 5, comprising an integration descriptor, wherein the integration descriptor includes capabilities data of the partner device.

37. The system of claim 36, wherein the capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

38. The system of claim 36, wherein the integration descriptor is configured for use to provide access to capabilities of the system server.

39. The system of claim 36, comprising a rules template file including a description of available functionality of at least one of the plurality of premises devices and the partner device.

40. A method comprising:

configuring a system server to interact with premises equipment including a plurality of premises devices located at a premises, wherein the premises equipment corresponds to a service provider;
configuring the system server to interact with a partner device at the premises via a partner proxy corresponding to the partner device, wherein the partner device is configured to use a partner protocol different from a protocol of the premises equipment;
configuring automation rules to control interactions between at least one of the partner device and the plurality of premises devices using actions and triggers;
configuring a user interface coupled to the system server to interact with the plurality of premises devices and the partner device.

41. The method of claim 40, comprising configuring the user interface to include a partner user interface corresponding to the partner device, wherein the partner user interface configures the user interface to interact with the partner device.

42. The method of claim 41, comprising configuring the partner user interface to at least one of create and edit at least one rule of the automation rules.

43. The method of claim 41, comprising configuring the partner user interface to delete at least one rule of the automation rules.

44. The method of claim 41, comprising configuring the partner proxy to interact with a partner server and the partner device, wherein the partner server corresponds to the partner device and the partner proxy.

45. The method of claim 44, comprising configuring the partner device to at least one of control and trigger automations in the plurality of premises devices via the partner server.

46. The method of claim 45, comprising configuring the system server to include the partner proxy and to communicate with the partner server via the partner proxy.

47. The method of claim 44, comprising configuring the partner proxy to proxy calls from the partner user interface to the partner server.

48. The method of claim 44, comprising embedding the partner user interface in the user interface.

49. The method of claim 44, comprising configuring an integration server to communicate with the system server.

50. The method of claim 49, comprising configuring the system server to include an integration application programming interface (API).

51. The method of claim 50, comprising configuring the integration server to communicate with the integration API.

52. The method of claim 50, comprising configuring an event bus to communicate with the system server and the integration server.

53. The method of claim 49, comprising configuring the integration server to communicate with the partner server.

54. The method of claim 49, comprising configuring the integration server to communicate with an integration adapter.

55. The method of claim 54, wherein the integration adapter is configured to translate between protocols of the system server and the partner server.

56. The method of claim 54, wherein the integration adapter is coupled to the partner server.

57. The method of claim 56, wherein the integration adapter is configured to provide endpoints for associating with the partner device.

58. The method of claim 56, wherein the integration adapter is configured to process events coming from the integration server, wherein the events comprise device data corresponding to the partner device.

59. The method of claim 58, wherein the device data includes at least one of state data, control data, and command data.

60. The method of claim 56, wherein the integration adapter is configured to send events to the integration server.

61. The method of claim 60, wherein the integration adapter is configured to send events including acknowledgment of incoming system events.

62. The method of claim 56, wherein the integration adapter is configured as an endpoint for managed partner device events to be reported to the system server.

63. The method of claim 44, comprising configuring a rules engine to run the automation rules.

64. The method of claim 63, comprising configuring the rules engine to run on the premises equipment.

65. The method of claim 63, comprising configuring the rules engine to run on the system server.

66. The method of claim 63, comprising configuring the rules engine to run on at least one of the system server and the premises equipment.

67. The method of claim 63, comprising configuring the rules engine to treat an event relating to the partner device as a trigger for at least one rule.

69. The method of claim 68, comprising, in response to the event, triggering with the at least one rule at least one action event to at least one of the partner device, at least one other partner device, and at least one of the plurality of premises devices.

70. The method of claim 44, comprising configuring the partner user interface to include a list comprising a plurality of partners, wherein the plurality of partners correspond to a plurality of partner devices.

71. The method of claim 70, comprising configuring the partner user interface with partner data of the partner device received from the system server in response to selection of a corresponding partner from the plurality of partners.

72. The method of claim 71, comprising configuring the partner user interface to authenticate with the partner server a user corresponding to the partner device.

73. The method of claim 72, comprising configuring the system server with information of the partner device as a result of authentication of the user.

74. The method of claim 44, comprising configuring the partner user interface to interact with the partner server via the partner proxy.

75. The method of claim 74, comprising configuring the partner user interface to receive data relating to the partner device from the partner server.

76. The method of claim 44, comprising configuring an integration descriptor to include capabilities data of the partner device.

77. The method of claim 76, wherein the capabilities data includes at least one of attributes, actions, events, and associated parameters of the partner device.

78. The method of claim 76, comprising configuring the integration descriptor for use to provide access to capabilities of the system server.

79. The method of claim 76, comprising configuring a rules template file to include a description of available functionality of at least one of the plurality of premises devices and the partner device.

Patent History
Publication number: 20170118037
Type: Application
Filed: Jun 29, 2016
Publication Date: Apr 27, 2017
Inventors: Jim KITCHEN (Redwood City, CA), David PROFT (Redwood City, CA), Weiping GUO (Redwood City, CA), Kyle MCKENZIE (Redwood City, CA), Thomas LEA (Redwood City, CA), John ELDERTON (Redwood City, CA), Nathan YANG (Redwood City, CA)
Application Number: 15/196,646
Classifications
International Classification: H04L 12/28 (20060101); G05B 11/01 (20060101); G06F 3/14 (20060101); H04L 29/08 (20060101); G06F 3/0484 (20060101);