Relationship Based Trust Verification Schema

A computationally-implemented method, in accordance with certain example embodiments, may include, but is not limited to: receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other aspects are presented in the claims, drawings, and written description forming a part of the present disclosure.

CROSS REFERENCE TO RELATED APPLICATIONS

The present application is related to and claims the benefit of the earliest available effective filing date(s) from the following listed application(s) (the “Related Applications”) (e.g., claims earliest available priority dates for other than provisional patent applications or claims benefits under 35 USC §119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Related Application(s)). All subject matter of the Related Applications and of any and all parent, grandparent, great-grandparent, etc. applications of the Related Applications is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.

RELATED APPLICATIONS

For purposes of the USPTO extra-statutory requirements:

    • (1) the present application claims benefit of priority of U.S. Provisional Patent Application No. 61/632,836 (Atty. Docket No. SE1-0540-US), entitled “Behavioral Fingerprint Based Authentication”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed Sep. 24, 2011, which was filed within the twelve months preceding the filing date of the present application, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (2) the present application claims benefit of priority of U.S. Provisional Patent Application No. 61/572,309 (Atty. Docket No. SE1-0541-US), entitled “Network Acquired Behavioral Fingerprint for Authentication”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed Oct. 13, 2011, which was filed within the twelve months preceding the filing date of the present application, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (3) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/373,685 (Atty. Docket No. SE1-0542-US), entitled “Behavioral Fingerprint Device Identification”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Nov. 23, 2011, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (4) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/373,684 (Atty. Docket No. SE1-0543-US), entitled “Behavioral Fingerprint Controlled Automatic Task Determination”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Nov. 23, 2011, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (5) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/373,680 (Atty. Docket No. SE1-0544-US), entitled “Behavioral Fingerprint Controlled Theft Detection and Recovery”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Nov. 23, 2011, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (6) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/373,677 (Atty. Docket No. SE1-0545-US), entitled “Trust Verification Schema Based Transaction Authorization”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Nov. 23, 2011, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (7) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/373,682 (Atty. Docket No. SE1-0546-US), entitled “Social Network Based Trust Verification Schema”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Nov. 23, 2011, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
    • (8) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/475,564 (Atty. Docket No. SE1-0547-US), entitled “Behavioral Fingerprint Based Authentication”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on May 18, 2012, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date; and
    • (9) the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 13/538,385 (Atty. Docket No. SE1-0548-US), entitled “Network Acquired Behavioral Fingerprint for Authentication”, naming Marc E. Davis, Matthew G. Dyor, Daniel A. Gerrity, Xuedong (XD) Huang, Roderick A. Hyde, Royce A. Levien, Richard T. Lord, Robert W. Lord, Mark A. Malamud, Nathan Myhrvold, and Clarence T. Tegreene, as inventors, filed on Jun. 29, 2012, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.

The United States Patent Office (USPTO) has published a notice to the effect that the USPTO's computer programs require that patent applicants reference both a serial number and indicate whether an application is a continuation or continuation-in-part. Stephen G. Kunin, Benefit of Prior-Filed Application, USPTO Official Gazette Mar. 18, 2003, available at http://www.uspto.gov/web/offices/com/sol/og/2003/week11/patbene.htm. The present Applicant Entity (hereinafter “Applicant”) has provided above a specific reference to the application(s) from which priority is being claimed as recited by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as “continuation” or “continuation-in-part,” for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data entry requirements, and hence Applicant is designating the present application as a continuation-in-part of its parent applications as set forth above, but expressly points out that such designations are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).

FIELD OF INVENTION

This invention relates generally to relationship based trust verification schemas that are based on behavioral fingerprints of network accessible users.

SUMMARY

For certain example embodiments, a computationally-implemented method may include, but is not limited to: receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other example method aspects are described or included in the claims, drawings, and written description forming a part of the present disclosure.
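By way of example but not limitation, the following Python sketch illustrates one possible arrangement of the three operations recited above (receiving one or more behavioral fingerprints, receiving an authentication request, and transmitting a decision). The class names, field names, and scoring rule are assumptions of this sketch only and are not elements of the present disclosure.

```python
# Hypothetical sketch of the summarized method; names and data shapes are
# illustrative assumptions, not part of the disclosure.
from dataclasses import dataclass, field


@dataclass
class BehavioralFingerprint:
    user_id: str
    signals: dict = field(default_factory=dict)  # e.g., {"social": 0.9, "sensor": 0.7}


@dataclass
class AuthenticationRequest:
    user_id: str
    transaction: str


class TrustVerificationService:
    """Illustrative server-side flow: store fingerprints, answer requests."""

    def __init__(self):
        self.fingerprints: dict[str, BehavioralFingerprint] = {}

    def receive_fingerprint(self, fp: BehavioralFingerprint) -> None:
        # Operation 1: receive one or more behavioral fingerprints.
        self.fingerprints[fp.user_id] = fp

    def decide(self, request: AuthenticationRequest) -> str:
        # Operations 2-3: receive an authentication request and transmit a
        # decision based at least partially on a trust verification schema
        # derived from the stored fingerprints (here, a trivial average of
        # signal weights stands in for the schema).
        fp = self.fingerprints.get(request.user_id)
        if fp is None or not fp.signals:
            return "deny"
        score = sum(fp.signals.values()) / len(fp.signals)
        return "approve" if score >= 0.5 else "deny"
```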

In one or more various aspects, related systems may include, but are not limited to, circuitry and/or programming for effecting the herein referenced method aspects; the circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware in one or more machines or articles of manufacture configured to effect the herein-referenced method aspects depending upon the design choices of a system designer.

For certain example embodiments, a computationally-implemented system may include, but is not limited to: means for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; means for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and means for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other example system aspects are described or included in the claims, drawings, and written description forming a part of the present disclosure.

For certain example embodiments, a computationally-implemented system may include, but is not limited to: circuitry for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; circuitry for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other example system aspects are described or included in the claims, drawings, and written description forming a part of the present disclosure.

For certain example embodiments, with at least one processor-accessible medium bearing processor-executable instructions, the processor-executable instructions may include, but are not limited to: one or more instructions for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; one or more instructions for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and one or more instructions for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other example processor-accessible medium aspects are described or included in the claims, drawings, and written description forming a part of the present disclosure.

For certain example embodiments, a computer program product comprises an article of manufacture that may bear, among other instructions: one or more instructions for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; one or more instructions for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and one or more instructions for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. In addition to the foregoing, other example computer program product aspects are described or included in the claims, drawings, and written description forming a part of the present disclosure.

For certain example embodiments, a method may relate to handling an authentication request using at least one behavioral-fingerprint-generated trust verification schema, with the method including, but not being limited to: receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users; receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users, wherein at least one of the receiving the one or more behavioral fingerprints, the receiving the authentication request, or the transmitting the decision is performed via at least one of a machine, an article of manufacture, or a composition of matter.

The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to, e.g., the drawings, the claims, and the following detailed description.

BRIEF DESCRIPTION OF THE FIGURES

FIG. 1 shows a computer server 30 and a computing device 10 in an exemplary environment 100, in accordance with certain example embodiments.

FIG. 2a shows a particular example implementation of a computing device 10 of FIG. 1, in accordance with certain example embodiments.

FIG. 2b shows an example view of a level of authentication module 102/102a, in accordance with certain example embodiments.

FIG. 2c shows an example view of an access restricting module 104/104a, in accordance with certain example embodiments.

FIG. 2d shows various types of example sensors 120 that may be included in a computing device 10, in accordance with certain example embodiments.

FIG. 2e shows a particular example implementation of a computer server 30 of FIG. 1, in accordance with certain example embodiments.

FIG. 3a shows an example view of a behavioral fingerprint library 170, in accordance with certain example embodiments.

FIG. 3b shows an example view of a behavioral fingerprint module 106/106a, in accordance with certain example embodiments.

FIG. 3c shows an example implementation of a trust verification schema in accordance with certain example embodiments.

FIG. 4 is a high-level logic flowchart of an example process depicting an implementation for a computing device, in accordance with certain example embodiments.

FIG. 5a is a high-level logic flowchart of an example process depicting alternate implementations of the computing device operation 404 of FIG. 4, in accordance with certain example embodiments.

FIG. 5b is a high-level logic flowchart of an example process depicting alternate implementations of the computing device operation 404 of FIG. 4, in accordance with certain example embodiments.

FIG. 5c is a high-level logic flowchart of an example process depicting alternate implementations of the computing device operation 404 of FIG. 4, in accordance with certain example embodiments.

FIG. 6 is a high-level logic flowchart of an example process depicting alternate implementations of network level operations, in accordance with certain example embodiments.

FIG. 7a is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 604 of FIG. 6, in accordance with certain example embodiments.

FIG. 7b is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 604 of FIG. 6, in accordance with certain example embodiments.

FIG. 8 is a high-level logic flowchart of an example process depicting alternate implementations of network level operations, in accordance with certain example embodiments.

FIG. 9a is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 802 of FIG. 8, in accordance with certain example embodiments.

FIG. 9b is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 802 of FIG. 8, in accordance with certain example embodiments.

FIG. 9c is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 802 of FIG. 8, in accordance with certain example embodiments.

FIG. 9d is a high-level logic flowchart of an example process depicting alternate implementations of the computer server operation 802 of FIG. 8, in accordance with certain example embodiments.

FIG. 10 is a high-level logic flowchart of an example process depicting alternative implementations of operations for at least one device or at least one server, in accordance with certain example embodiments.

FIG. 11a is a high-level logic flowchart of an example process depicting alternative implementations of a computer device operation 1001 of FIG. 10, in accordance with certain example embodiments.

FIG. 11b is a high-level logic flowchart of an example process depicting alternative implementations of a computer device operation 1002 of FIG. 10, in accordance with certain example embodiments.

FIG. 11c is a high-level logic flowchart of an example process depicting alternative implementations of a computer device operation 1002 of FIG. 10, in accordance with certain example embodiments.

FIG. 11d is a high-level logic flowchart of an example process depicting alternative implementations of a computer device operation 1003 of FIG. 10, in accordance with certain example embodiments.

DETAILED DESCRIPTION

In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.

Advances in computing technologies and related technologies (e.g., visual display technology, battery technology, etc.) have resulted in the development of computing devices with tremendous processing power and relatively small form factors. Examples of such computing devices include laptops, Netbooks, tablet computers (i.e., “slate” computers), e-readers, smartphones, and so forth. Combining a small form factor with tremendous processing power presents numerous opportunities for developing applications that previously required desktop computers or other stationary devices. One problem presented by the numerous applications available on a small form factor is that authentication becomes paramount. For example, if an application enables a mobile phone, a smartphone, or another computing device, such as a key fob, to open doors to a home, it is important to determine that the user of the device/phone/fob is the true owner. For example, FIG. 1 illustrates a car 75, key fob 74, gate 71, safe 72, cage 73, and door 74 as locking devices 70, each of which can be an element in one or more embodiments herein.

Embodiments herein are directed to enabling authentication and verification to be determined based on a behavioral fingerprint of the true owner of a device.

In accordance with various embodiments, computationally implemented methods, systems, and articles of manufacture are provided that can determine a level of authentication of a first user of a computing device; and in response to determining the level of authentication, automatically enable one or more actions as a function of the level of authentication. In various embodiments, such computationally implemented methods, systems, and articles of manufacture may be implemented at the computing device and/or a computer server networked to a computing device.

Referring now to FIG. 1, the figure illustrates a computing device 10 connected via a network interface to a computer server 30 in an exemplary environment 100. Computing device 10 is shown being operated by a first user 20. As will be further described herein, the illustrated computing device 10 and computer server 30 may employ the computationally implemented methods, systems, and articles of manufacture in accordance with various embodiments. The computing device 10 and computer server 30, in various embodiments, may be endowed with logic that is designed to determine a level of authentication of a user of the computing device 10 and, in response to such a determination, automatically enable functions of the computing device 10. In other embodiments, behavioral fingerprints of network accessible users can be determined so that a relational mapping of the behavioral fingerprints can be created. The relational mapping can be implemented as a schema to enable authentication of transactions of the network accessible users.

Referring to FIG. 1, first user 20 may be the primary user, such as the owner, of the computing device 10, or could be a person given authority to use the computing device by the owner, or any network accessible user. As discussed below, the level of authentication associated with the first user 20, whether owner or not, is determined, at least partially based on a behavioral fingerprint of the owner of computing device 10. More particularly, a level of authentication associated with first user 20 of computing device 10 can be determined based on a behavioral fingerprint of the owner of computing device 10. The behavioral fingerprint of an owner of computing device 10 can be configured to be network accessible by computing device 10 via network 50 to server[s] 30. Server[s] 30 can be a cloud of connected network servers or can be a web server or the like. The behavioral fingerprint of an owner/authorized user of computing device 10 can be configured to override or be a determining factor for a level of authentication associated with computing device 10.

Although the computing device 10 illustrated in FIG. 1 is depicted as being a tablet computer, in alternative embodiments, the computationally implemented methods, systems, and articles of manufacture in accordance with various embodiments may be embodied in other types of computer systems having other form factors, including other types of portable computing devices such as, for example, mobile telephones, laptops, Netbooks, smartphones, e-readers, and so forth. For example, device[s] 60 are illustrated as smartphones, client computers, and the like as possible computing devices. As illustrated, the computing device 10 can include a display 12, such as a touchscreen, on the front side 17a of the computing device 10. Computing device 10 can further include a keyboard, either as a touch input/output keyboard or as an attached keyboard. As further depicted in FIG. 1, the display 12 displays an exemplary document 14 and a tool bar 15. As further depicted, the computing device 10 may also include a camera 16 (e.g., a webcam) disposed on the front side 17a of the computing device 10. In some embodiments, additional cameras may be included on the front side 17a and/or backside of the computing device 10.

The first user 20 can be an authorized user of computing device 10 or a person who has no connection to the computing device 10. In an embodiment, a level of authentication and/or a behavioral fingerprint can be determinative of the accessibility of computing device 10. In an embodiment, computing device 10 determines a level of authentication of first user 20 of the computing device 10. In an embodiment, computing device 10 uses the level of authentication to enable or disable automatic functions of the computing device 10. For example, computing device 10 can be configured to automatically open doors to a home, car, or other authorized user-designated item, depending on the level of authentication of the computing device at that time.

In accordance with an embodiment, the level of authentication determination relies at least in part on the behavioral fingerprint of one or more authorized users of computing device 10. The behavioral fingerprint can be determined based on statistical calculations on social network collected data, sensor-provided data, user input, and/or a combination of such data. Thus, the level of authentication can be affected by a behavioral fingerprint of an authorized user of computing device 10, which may include social network collected data. The level of authentication can also be affected by various aspects at the time computing device 10 is turned on, such as aspects surrounding computing device 10 and/or aspects of the computing device itself (e.g., movements or detected images). For example, when the computing device 10 of FIG. 1 is turned on by the first user 20, the first user 20 may input a password or pattern or other identifying input, such as a fingerprint, facial recognition, or the like. The level of authentication determination would then recognize the user as an authorized user and determine whether a behavioral fingerprint is established for that authorized user. Thus, the level of authentication and the behavioral fingerprint of an authorized user can be configured to work together to determine the accessibility of computing device 10 to first user 20. The level of authentication and the behavioral fingerprint can be directly correlated, or can be configured to enable the level of authentication to override the behavioral fingerprint or vice versa.

For example, a manufacturer of computing device 10 may be able to override a behavioral fingerprint of an authorized user of computing device 10 via the level of authentication, by entering a secret code, such as a manufacturer's accessibility code or the like in order to perform work on computing device 10.

In one or more embodiments, first user 20 can be a network-accessible user for which computing device 10 is just one of many network-accessible devices that network-accessible user 20 may use to access the internet, a cloud server, a mobile network or the like. A network-accessible user can be an owner and/or operator of computing device 10 and other devices. According to an embodiment, network-accessible user 20 can have a behavioral fingerprint that exists outside of computing device 10 and that can exist in a cloud computing system to which servers 30 are connected. Devices 60 can further have a presence in the cloud computing system to enable the embodiments described herein. For example, each of devices 60 can be a network-accessible device to which network-accessible user 20 could be connected. Thus, network-accessible user 20 could be a user of one or several devices simultaneously. Network-accessible user 20 could also be a user of a public computing device, for example, if none of devices 60 are available to network-accessible user 20.

Referring now to FIG. 2a, computing device 10 of FIG. 1 is illustrated as including a level of authentication module 102, an access restricting module 104, a behavioral fingerprint module 106, an alert generating module 108, a memory 114 (which may store one or more applications 160 and/or a library of behavioral fingerprints 170), one or more processors 116 (e.g., microprocessors, controllers, etc.), one or more sensors 120, a user interface 110 (e.g., a display monitor such as a touchscreen, a keypad, a mouse, a microphone, a speaker, etc.), and a network interface 112 (e.g., network interface card or NIC).

In various embodiments, the level of authentication module 102 of FIG. 2a is a logic module that is designed to determine a level of authentication associated with first user 20 of computing device 10. The access restricting module 104 is a logic module that is designed to restrict access to one or more items in response to the determination made by the level of authentication module 102. Alert generating module 108 is a logic module that is designed to generate an alert that causes the computing device 10 to communicate a variance to the level of authentication module to restrict capabilities of the computing device and access to the one or more items. The computing device 10 of FIG. 1 can include these logic modules (e.g., the level of authentication module 102, the access restricting module 104, and the alert generating module 108) using circuitry including components such as an application-specific integrated circuit (ASIC). Alternatively, logic modules including a level of authentication module 102/102a, access restricting module 104/104a, behavioral fingerprint module 106/106a, and alert generating module 108/108a can provide the same or similar functionality and correspond to the level of authentication module 102, the access restricting module 104, the behavioral fingerprint module 106, and the alert generating module 108. Logic modules level of authentication module 102a, the behavioral fingerprint module 106a, the access restricting module 104a, and the alert generating module 108a of the computing device 10 of FIG. 2a can be implemented by the one or more processors 116 executing computer readable instructions 152 (e.g., software and/or firmware) that may be stored in the memory 114. Instructions may comprise, by way of example but not limitation, a program, a module, an application or app (e.g., one that is native, that runs in a browser, that runs within a virtual machine, a combination thereof, etc.), an operating system, etc., or a portion thereof; operational data structures; processor-executable instructions; code; or any combination thereof; and so forth. At least one medium (e.g., memory 114) may comprise, by way of example but not limitation, processor-accessible or non-transitory media that is or are capable of bearing instructions, data, files, configuration settings, a combination thereof, and so forth.

Note that although FIG. 2a illustrates all of the logic modules (e.g., the level of authentication module 102, the access restricting module 104, the behavioral fingerprint module 106 and the alert generating module 108) being implemented using purely circuitry components such as ASIC, logic modules 102, 102a, 104, 104a, 106, 106a, 108, or 108a may be implemented using a combination of specifically designed circuitry such as ASIC and one or more processors 116 (or other types of circuitry such as field programmable gate arrays or FPGAs) executing computer readable instructions 152. For example, in some embodiments, at least one of the logic modules may be implemented using specially designed circuitry (e.g., ASIC) while a second logic module may be implemented using a processor 116 (or other types of programmable circuitry such as an FPGA) executing computer readable instructions 152 (e.g., software and/or firmware). System requirements or specifications may militate for a combination of software or firmware or circuitry to comport with example embodiments as described herein; for example, logic modules may be designed to use an efficient combination of software, hardware, or firmware in order to quickly or efficiently implement methods, systems, etc. that are within the scope of the present disclosure. For certain example embodiments, logic may comprise hardware, software, firmware, discrete/fixed logic circuitry, any combination thereof, etc. that is capable of performing or facilitating performance of methods, processes, operations, functionality, technology, or mechanisms, etc. that are described herein or illustrated in the accompanying drawings. Circuitry may comprise hardware, software, firmware, discrete/fixed logic circuitry, any combination thereof, etc. that is capable of performing or facilitating performance of methods, processes, operations, functionality, technology, or mechanisms, etc. that are described herein or illustrated in the accompanying drawings, wherein circuitry comprises at least one physical or hardware component or aspect.

In various embodiments, the memory 114 of the computing device 10 of FIG. 2a may comprise one or more of a mass storage device, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), cache memory such as random access memory (RAM), flash memory, static random access memory (SRAM), dynamic random access memory (DRAM), and/or other types of memory devices. In various embodiments, the one or more applications 160 stored in memory 114 may include, for example, an operating system 162, one or more productivity applications 164 such as a word processing application or a spreadsheet application, one or more communication applications 166 such as an email or IM application, one or more personal information manager applications 168 (e.g., Microsoft® Outlook™), and one or more social network applications such as Twitter™ and Facebook™.

FIG. 2b illustrates a particular implementation of the level of authentication module 102/102a of FIG. 2a. As illustrated, the level of authentication module 102/102a may include one or more sub-logic modules in various alternative implementations. For example, in various implementations, the level of authentication module 102/102a may include a behavioral fingerprint interaction module 210, which may further include an anomalous action detecting module 212, and a social network confirmation module 216. Level of authentication module 102/102a may further include a statistical level determination module 218, a visual cue detecting module 220, including a face detecting module 222, and an audio cue detecting module 226, including a voice pattern detecting module 227. Level of authentication module 102/102a may also include a geographic location determination module 230.

The behavioral fingerprint catalogue or library of anomalous actions may be stored as part of behavioral fingerprint library 170 stored in memory 114 (see FIG. 2a) of the computing device 10 of FIG. 1. Therefore, when anomalous changes that match catalogued anomalous changes (e.g., as stored in library 170 of the memory 114) have been detected, at least an inference may be made that the user of computing device 10 is not authenticated, that first user 20 is not an owner of computing device 10, or the like.

In some embodiments, the computing device 10 may include logic that is designed to determine data from a combination of sensors 120 (e.g., of FIG. 2d) that may be processed and analyzed. In some embodiments, computing device 10 determines, via one or more image capturing devices 204 (e.g., webcam or digital camera), one or more audio capturing devices 206 (e.g., microphones), and/or images received by computing device 10 via one or more networked devices and/or social networks, whether the computing device 10 is no longer under the control of first user 20, which would cause the level of authentication determined in level of authentication module 102 to be altered. For example, the computing device 10 in some cases may employ one or more movement sensors 202 to detect the actual movements of the computing device 10 and/or one or more image capturing devices 204 (possibly including a facial recognition system/application) to determine that a face associated with the first user 20 is not a face associated with an owner of computing device 10. Based on the data provided by the movement sensors 202 and/or the image capturing devices 204, at least an inference may be made that the computing device 10 requires an alteration to the level of authentication.

Alternatively or additionally, in some embodiments, the computing device 10 may be endowed with a facial recognition system (e.g., facial recognition software) that, when employed with one or more image capturing devices 204, may be used to determine the presence or absence of a face associated with an owner of computing device 10 and compare it to the face of first user 20. If the face associated with the owner of computing device 10 does not match that of first user 20, then a determination may be made to alter the level of authentication associated with first user 20. In addition to face recognition, other logic can include using the field of view of image capturing device 16 or the audio capturing devices of the computing device 10 to identify an authorized user of computing device 10 through other recognition processes, such as fingerprint, retina, or voice verification, global positioning system (GPS) locating of the owner of computing device 10, or other personal identification.

In various embodiments, the one or more items to which access may be restricted may be one or more electronic items that may have been open or running prior to a level of authentication change of the computing device 10 and/or electronic items that were accessible through the computing device 10 (e.g., electronic documents and files that were stored in the computing device 10) prior to an alteration of the level of authentication of the computing device 10.

Statistical level determination module 218 may be configured to apply statistical algorithms, comparative analysis, statistical probability functions, and the like to determine a statistical level of authentication for computing device 10. In one embodiment, statistical level determination module 218 may apply a weighting function that determines a level of authentication based on data received from scanners and other devices and on a behavioral fingerprint, with each item of received data having a predetermined weight regarding its relevance to authentication. Statistical level determination module 218 may additionally or alternatively analyze anomalous actions to determine or infer the level of authentication. To further determine or at least infer that the computing device 10 should have a low level of authentication, statistical examination/analysis of the detected anomalous action movements of the computing device 10 may involve comparing the detected anomalies of the computing device 10 with catalogued or library anomalous action movements (which may be stored in the memory 114 of the computing device 10) that are identified as being movements associated with, for example, a transfer of computing device 10, a dropping of computing device 10, an action incompatible with the stored predicted actions of an authorized user, or an alert received from a social network that an expected or previously possessory authorized user does not have possession of computing device 10.
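By way of example but not limitation, a weighting function of the kind attributed to statistical level determination module 218 might resemble the following Python sketch; the signal names, weights, and 0-to-1 scale are illustrative assumptions only and are not stated in the disclosure.

```python
# Illustrative weighting function for a statistical level determination;
# the signal names and weights are assumptions of this sketch.
SIGNAL_WEIGHTS = {
    "password_match": 0.30,
    "face_match": 0.25,
    "behavioral_fingerprint": 0.35,
    "geolocation_plausible": 0.10,
}


def statistical_level(signals: dict[str, float]) -> float:
    """Combine normalized signals (0.0-1.0) into one authentication level."""
    return sum(SIGNAL_WEIGHTS[name] * value
               for name, value in signals.items()
               if name in SIGNAL_WEIGHTS)


level = statistical_level({
    "password_match": 1.0,
    "face_match": 0.8,
    "behavioral_fingerprint": 0.9,
    "geolocation_plausible": 1.0,
})
# level == 0.915 on a 0-1 scale; a threshold would map it to access tiers.
```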

Computing device 10 may maintain in its memory 114 (see FIG. 2a) a behavioral fingerprint library 170 that may include a catalogue or library of actions, inputs, movements, and received network data, including data previously identified as anomalous, that may occur when, for example, a computing device 10 is stolen or used by another user, or when a social network query fails to return appropriate confirmatory data confirming that an authorized user is in control of computing device 10. Thus, when anomalous movements, inputs, or actions that match entries in the library have been detected, a determination or inference may be made that the level of authentication must be altered. The level of authentication can be lowered, such that first user 20 is determined to have a lowest level of authentication.
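By way of example but not limitation, the library matching just described might be sketched in Python as follows; the catalogued action names and the numeric level scale are assumptions of this illustration.

```python
# Minimal sketch of matching detected actions against a catalogued
# anomalous-action library; the entries are illustrative assumptions.
ANOMALOUS_ACTION_LIBRARY = {
    "rapid_device_transfer",
    "device_dropped",
    "unrecognized_input_pattern",
    "social_network_theft_alert",
}

LOWEST_LEVEL = 0.0


def adjust_level(current_level: float, detected_actions: list[str]) -> float:
    """Lower the level of authentication when any detected action matches
    the library of previously identified anomalous actions."""
    if any(action in ANOMALOUS_ACTION_LIBRARY for action in detected_actions):
        return LOWEST_LEVEL
    return current_level
```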

Behavioral fingerprint interaction module 210 may receive data from behavior fingerprint module 106/106a and/or behavioral fingerprint library 170. Behavioral fingerprint interaction module 210 can apply the data relating to one or more behavioral fingerprints of authorized users to determine a level of authentication. More particularly, level of authentication module 102/102a may be configured to receive a behavioral fingerprint as a list of activities, warnings, anomalous actions, and the like. Specific details related to the level of authentication module 102/102a as well as the above-described sub-modules of the level of authentication module 102 will be provided below with respect to the operations and processes to be described herein.

FIG. 2c illustrates a particular implementation of the access restricting module 104/104a of FIG. 2a. Access restricting module 104/104a of the computing device 10 of FIG. 2c can be configured to restrict access (e.g., hiding or disguising, denying viewing or editorial access, converting to read-only form, and so forth) via the computing device 10 to one or more items (e.g., documents, image or audio files, passwords, applications, and so forth) or to prevent one or more actions by computing device 10.

As illustrated, the access restricting module 104/104a may include one or more sub-logic modules in various alternative implementations. For example, in various implementations, the access restricting module 104/104a may include a partial access providing module 232, a no access module 234, a viewing access restricting module 236 (which may further include a visual hiding module 237 that may further include a visual replacing module 238), an audio access restricting module 240 (which may further include an audio hiding module 241 that may further include an audio replacing module 242), an editorial restricted format presenting module 245, a functional restricting format presenting module 250, an open item ascertaining module 252, a document access restricting module 254 (which may further include a productivity document access restricting module 255, a message access restricting module 256, an image document access restricting module 257, and/or an audio document access restricting module 258), and/or a password access restricting module 262. As further illustrated in FIG. 2c, the access restricting module 104/104a, in various implementations, may also include an application access restriction module 264 (which may further include a productivity application access restriction module 265, a communication application access restriction module 266, and/or a personal information manager application access restriction module 267), and/or an affiliation ascertaining module 270. As further illustrated in FIG. 2c, in various implementations, the affiliation ascertaining module 270 may further include one or more sub-modules including an identifier affiliation ascertaining module 271 (which may further include a name affiliation ascertaining module 272, an image affiliation ascertaining module 273, and/or a voice pattern affiliation ascertaining module 274), an address ascertaining module 276, a source ascertaining module 277, and/or a word/phrase/number affiliation ascertaining module 278.

An example of how access restricting module 104/104a operates includes determining whether one or more productivity documents are word processing documents; restricting access to such items may then involve hiding or disguising representations of the documents in a directory (e.g., deleting document names or subject headings in the directory or replacing them with pseudo-names or pseudo-subject headings). Alternatively, a non-editable form of the documents may be presented in order to restrict access to such documents. If, on the other hand, the one or more items are one or more software applications, then restricting access to such items may involve denying use of one or more functionalities associated with the items (e.g., applications). For example, if the one or more items include a word processing application, then restricting access to such an application may involve, while allowing general access to the application, disabling one or more of its editing functions.
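By way of example but not limitation, the level-dependent document restrictions described above (full access, a non-editable form, or a hidden/disguised representation) might be sketched as follows; the thresholds and the returned record shape are assumptions of this illustration.

```python
# Hypothetical mapping from authentication level to document restrictions,
# mirroring the hiding / read-only / full-access behaviors described above.
def restrict_document(doc_name: str, level: float) -> dict:
    if level >= 0.8:
        # High level of authentication: full viewing and editorial access.
        return {"name": doc_name, "editable": True}
    if level >= 0.5:
        # Intermediate level: present a non-editable form of the document.
        return {"name": doc_name, "editable": False}
    # Low level: hide or disguise the document with a pseudo-name
    # in the directory listing.
    return {"name": "untitled", "editable": False}
```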

FIG. 2d illustrates the various types of sensors 120 that may be included with the computing device 10 of FIG. 1. As illustrated, the sensors 120 that may be included with the computing device 10 may include one or more movement sensors 202, one or more image capturing devices 204 (e.g., a web cam, a digital camera, etc.), one or more audio capturing devices 206 (e.g., microphones), and/or a global positioning system (GPS) 208 (which may include any device that can determine its geographic location including those devices that determine its geographic location using triangulation techniques applied to signals transmitted by satellites or by communication towers such as cellular towers).

One way to monitor actions taken by first user 20 with respect to computing device 10 is to directly detect such actions using one or more of the sensors shown in FIG. 2d that are designed to directly detect/measure activities by user 20 of computing device 10. These sensors can be integrated with computing device 10 and may be used to directly detect the action taken with respect to the computing device 10 as the computing device 10 is being used by first user 20. For example, a fingerprint detection sensor or facial recognition sensors can detect whether first user 20 is an authorized user of computing device 10. Once first user 20 is associated with an authorized user of computing device 10, the behavioral fingerprint associated with that authorized user can be accessed. The behavioral fingerprint module 106/106a can then process data received from behavioral fingerprint library 170 and provide the behavioral fingerprint data to level of authentication module 102. In one embodiment, level of authentication module 102 receives the behavioral fingerprint data from behavioral fingerprint library 170 and determines the accessibility of computing device 10 based at least in part on the determined behavioral fingerprint.
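By way of example but not limitation, the device-side flow just described (associating first user 20 with an authorized user, retrieving that user's behavioral fingerprint from library 170, and feeding it to the level of authentication determination) might be sketched as follows; every function and variable name is an assumption of this sketch, not a disclosed API.

```python
# Illustrative device-side pipeline; the sensor-matching and library
# lookup functions are assumed stand-ins for sensors 120 and library 170.
FINGERPRINT_LIBRARY = {
    "owner_alice": {"typing_cadence": 0.92, "usual_locations": ["home", "office"]},
}


def match_authorized_user(sensor_reading: str) -> str | None:
    """Stand-in for fingerprint or facial recognition sensor matching."""
    return "owner_alice" if sensor_reading == "alice_fingerprint" else None


def determine_accessibility(sensor_reading: str) -> str:
    user = match_authorized_user(sensor_reading)
    if user is None:
        return "restricted"
    fingerprint = FINGERPRINT_LIBRARY[user]  # behavioral fingerprint data
    # A level of authentication module would weigh this data; here a
    # single behavioral signal gates full access for simplicity.
    return "full" if fingerprint["typing_cadence"] > 0.8 else "partial"
```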

Referring now to FIG. 2e, a computer server 30 of FIG. 1 may include at least a portion of functionality that is similar, analogous, comparable, etc. to that of computing device 10, or vice versa. As such, FIG. 2e illustrates a level of authentication module 102c, an access restricting module 104c, a behavioral fingerprint module 106c, an alert generating module 108c, a memory 114c (which may store one or more applications 160c, one or more computer-readable instructions, or a library of behavioral fingerprints 170c), one or more processors 116c (e.g., microprocessors, controllers, etc.), and a network interface 112c (e.g., network interface card or NIC). Although not explicitly referenced above, descriptions of level of authentication module 102/102a, access restricting module 104/104a, behavioral fingerprint module 106/106a, and alert generating module 108/108a may respectively apply to a level of authentication module 102c, an access restricting module 104c, a behavioral fingerprint module 106c, and an alert generating module 108c, and vice versa, unless context dictates otherwise.

In various embodiments, logic modules level of authentication module 102c, behavioral fingerprint module 106c, access restricting module 104c, and alert generating module 108c of the computer server 30 of FIG. 2e can be implemented by the one or more processors 116c executing computer readable instructions (e.g., software and/or firmware) that may be stored in the memory 114c.

Note that although FIG. 2e illustrates the logic modules (e.g., the level of authentication module 102c, the access restricting module 104c, the behavioral fingerprint module 106c, and the alert generating module 108c) being implemented using processor modules, these logic modules may alternatively be implemented using purely circuitry components such as an ASIC, or using a combination of specifically designed circuitry such as an ASIC and one or more processors 116c (or other types of circuitry such as field programmable gate arrays or FPGAs) executing computer readable instructions. For example, in some embodiments, at least one of the logic modules may be implemented using specially designed circuitry (e.g., an ASIC) while a second logic module may be implemented using a processor 116c (or other types of programmable circuitry such as an FPGA) executing computer readable instructions (e.g., software and/or firmware). System requirements could dictate a combination of software, firmware, and circuitry to meet the embodiments herein; for example, logic modules could be designed to use the most efficient combination of software/hardware/firmware in order to quickly implement methods and systems within the scope of the present disclosure.

In various embodiments, the memory 114c of the computer server 30 of FIG. 2e may comprise one or more of a mass storage device, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), cache memory such as random access memory (RAM), flash memory, static random access memory (SRAM), dynamic random access memory (DRAM), and/or other types of memory devices. In various embodiments, the one or more applications 160c stored in memory 114c may include, for example, an operating system 162c, one or more productivity applications 164c such as a word processing application or a spreadsheet application, or one or more communication applications 166c.

Referring now to FIG. 3a, an example behavioral fingerprint library 170/170c is shown with more particularity. Computing device 10 or computer server 30 may maintain in its memory 114/114c (e.g., see FIG. 2a or FIG. 2e) a behavioral fingerprint library 170/170c (e.g., see also FIG. 2a or FIG. 2e), which may comprise a catalog or library that identifies a plurality of actions by one or more users, including by way of example only network interactions (including social network interactions), alerts relating to one or more users, and the like, that when detected as occurring at least imply (e.g., intimate or lead to an inference) that computing device 10 is being used by an authorized user. FIG. 3a illustrates modules or functionalities that may be performed by either or both of a computing device 10 or a computer server 30. In the case of computer server 30, functionalities of the various modules may be replicated as needed for a plurality of computing devices or authorized users of one or more computing devices, as will be appreciated by one of ordinary skill in the art. For example, a computer server 30 may comprise one computer server of a computer server farm, which may exist in a cloud computing setting, and enable productivity applications 164c or communications applications 166c to be performed via cloud computing technologies. As such, appropriate replications may be included within the scope of the present application.

As shown, FIG. 3a includes a behavioral fingerprint library 170/170c, which may include a social network library 302, an authorized user library 304, an anomalous activity library 306, or a cryptographic library 308.

Social network library 302 may be configured to store interactions between authorized users and other entities. For example, one or more social networks may include Facebook™, Twitter™, Pinterest™, Google+™, Myspace™, Foursquare™, Flickr™, Classmates.com™, Match.com™, and so forth. A social network library 302 may be configured to store messages from one or more social networks such that a behavioral fingerprint module 106/106a may determine if action needs to be taken based on the messages. For example, an authorized user of a computing device 10 or another device via computer server 30 or over network 50 may post a message via a social network that computing device 10 is no longer under his/her control. Computing device 10 may automatically receive such a post over a network connection, e.g. from computer server 30 via network interface 112/112c, at a social network library 302, which may create a low level of authentication for first user 20, possibly before first user 20 attempts to use computing device 10. A higher level of authentication may be reestablished by an authorized user of computing device 10 after return of possession of computing device 10 in order for an authorized user to have full functionality of computing device 10 or to restore a prior level of authentication or the like.

A social network library 302 may identify any messages with indicative aspects relative to authentication. A social network library 302 may be configured to identify key words, such as “stolen” or “lost”, and pass on a warning notification to a behavioral fingerprint module 106/106a/106c or a level of authentication module 102/102a/102c for further processing. In an example embodiment, a social network library 302 may apply a search algorithm to identify key words to assist in determining behaviors that are authentication positive or authentication negative. For example, “stolen” or “lost” may be deemed to comprise authentication negative key words. Conversely, a current message from a current “friend” on Facebook™ and a response using computing device 10 may be deemed to comprise an authentication positive action or actions. Indications that an authorized user of computing device 10 is interacting with previously verified and identified “friends” on Facebook™ may be deemed authentication positive.
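By way of example but not limitation, the key word identification described above might be sketched in Python as follows; the key word set and the friend-verification flag are illustrative assumptions drawn from the examples in the text.

```python
# Illustrative keyword scan over social network messages; the key word
# list follows the "stolen"/"lost" examples given above.
AUTH_NEGATIVE = {"stolen", "lost"}


def classify_message(message: str, sender_is_verified_friend: bool) -> str:
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & AUTH_NEGATIVE:
        return "authentication-negative"  # pass a warning notification on
    if sender_is_verified_friend:
        return "authentication-positive"  # interaction with verified friend
    return "neutral"


classify_message("My tablet was stolen last night", False)
# -> "authentication-negative"
```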

For certain example embodiments, a social network library 302 may include at least one trust verification schema 303 as shown in FIG. 3a. By way of example but not limitation, FIG. 3c illustrates an example schema mapping including users A, B, or C as network accessible users. Each of users A, B, or C may comprise a first user 20 using a machine, such as a machine 320 that is used by User A, a machine 321 that is used by User B, and a machine 323 that is used by User C.

As illustrated in FIG. 3c, by way of example only, for a trust verification schema 303 that maps technical relationships, such as machine/service/resource/etc. relationships between or among people or other entities, User A may be using machine 320, which is shown connected on a first tiered relational portion to an infrastructure service 322, a file system 324, a software usage 326, a software license instance 328, a car computer 330, a laptop 332, a server 334, or a configuration item 336. On a second tiered relational portion of trust verification schema 303 with respect to User A, User A is shown connected to a LinkedIn™ server 340, a Twitter™ server 342, a Match.com™ server 344, a cloud server 346, a Facebook™ server 348, and a User B machine 321. On a third tier from machine 320, User A is shown connected to a User C machine 323; cloud servers 350, 352, or 354; and internal systems for User B machine 321, such as a file system 362, a software usage 364, or a software license instance 366. Also illustrated as comprising a part of or with respect to a third tier from machine 320, User A is shown connected to a server 368, a laptop 370, or a car computer 372.

For certain example embodiments, a trust verification schema 303 may be created via one or more behavioral fingerprints that include data for a first user 20 as a network accessible user. A schema may be created by mapping data from a behavioral fingerprint for user 20. Arrows 390 (only a portion of which are explicitly identified by reference number in FIG. 3c for the sake of visual clarity) may illustrate representative examples of a mapping of or to different connections within and outside of a computer machine 320 as used by User A. Information comprising a part of or available from a behavioral fingerprint that is associated with User A may be subject to choices, settings, restrictions, a combination thereof, etc. by User A. By way of example only, one or more sensors 120 may be set to sense things (e.g., actions, usage patterns, interactions, messages sent or received, data accessed or stored, a combination thereof, etc.) about a computer being used by User A, or sensed data may be shared with a behavioral fingerprint, e.g. subject to cryptographic sealing or the like. As shown, a User C machine 323 may be connected to User A, but a User C behavioral fingerprint may be set to reveal only connections to or via a LinkedIn™ server 340, a Twitter™ server 342, a Match.com server 344, or a cloud server 352. In contrast, a behavioral fingerprint for User B on a machine 321 may enable sharing, exposure, or revelation of a connection to a file system 362 of machine 321, a software usage 364, or a software license instance 366.

For certain example embodiments, a trust verification schema 303 may be used to authenticate transactions of any or each of User A, User B, or User C. In certain example implementations, for each behavioral fingerprint, a level of authentication may be associated therewith. Behavioral fingerprints may enable a social graph to be generated as shown by example schema 303. Additionally or alternatively, a level of authentication for each of Users A, B, or C may be linked based at least partially on schema 303. By way of example but not limitation, as shown by example schema 303 in FIG. 3c, User B and User A may be considered relatively closer in relation than are User A and User C. Therefore, a level of authentication of User A and User B may be combined or correlated. As a result, if users or a transaction are closely tied, a transaction authentication for User B may be correlated to a level of authentication for User A, or vice versa. As one of ordinary skill in the art with the benefit of this disclosure will appreciate, a trust verification schema may be used: for either or both of approving or denying transactions, for enabling further connections between users, for a combination thereof, and for other uses. For example, users that are closely related may use at least one behavioral fingerprint that is at least partially based on at least one social network: to generate a trust verification schema, such as schema 303; to share processing power; to share data; to further one or more purposes of cloud computing, a combination thereof, and so forth. As shown, cloud servers 350, 352, 354, or 346 may use resources of a User B machine 321 if a User A machine 320 is overloaded or can otherwise benefit from the resources because both are connected to at least one common cloud service and because they are closely related according to their behavioral fingerprints; hence, permissions for resource sharing may be automatically granted in such an example.
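By way of illustration only, the following non-limiting Python sketch shows one way a level of authentication might be combined or correlated according to relational closeness; the particular weighting rule is an illustrative assumption, not a requirement of this disclosure.

    def correlated_level(own_level, peer_level, tier, max_tier=3):
        """Blend a peer's level of authentication into a user's own level,
        weighting the peer more heavily the closer the relational tier."""
        if tier is None or tier > max_tier:
            return own_level                    # no mapped relation: no effect
        weight = (max_tier - tier + 1) / (max_tier + 1)  # tier 1 -> 0.75, tier 3 -> 0.25
        return (1 - weight) * own_level + weight * peer_level

    # User B is on User A's second tier, so User B's high level raises User A's.
    print(correlated_level(own_level=0.5, peer_level=0.9, tier=2))  # 0.7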

For certain example embodiments, a trust verification schema 303 that maps social relationships such as familial/friendship/professional/etc. relationships between or among people or other entities may comprise, by way of example only, a connected child and parent table that stores records, e.g. in a database. Stored records, which may include indicators as to e.g. family or other social connections, may be updated using behavioral fingerprints. With cryptographic protection, for example, different users may have their respective records accessed simultaneously as if they are effectively one record, e.g. in accordance with a mapping represented by or implemented using a schema 303. For certain example implementations, social connections that imply tiered/level relationships may be extracted from one or more behavioral fingerprints associated with one or more network accessible users, including but not limited to those behavioral fingerprints that are updated by monitoring interactions with or by polling/querying/scraping of at least one social network.
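By way of illustration only, one possible form of such a connected child and parent table is sketched below using a SQLite database; table and column names are illustrative assumptions rather than elements of this disclosure.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE social_connection (
            child_user   TEXT,
            parent_user  TEXT,
            relation     TEXT,     -- e.g., 'family', 'friend', 'professional'
            tier         INTEGER,  -- tiered/level relationship from fingerprints
            PRIMARY KEY (child_user, parent_user)
        )
    """)

    def update_from_fingerprint(connections):
        """Update stored records using connections extracted from one or more
        behavioral fingerprints (e.g., by monitoring social-network interactions)."""
        conn.executemany(
            "INSERT OR REPLACE INTO social_connection VALUES (?, ?, ?, ?)",
            connections,
        )

    update_from_fingerprint([("user_c", "user_a", "friend", 2)])
    print(conn.execute("SELECT * FROM social_connection").fetchall())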

FIG. 3a also includes authorized user library 304, which can include a library of authorized users of computing device 10. Computing device 10 and computer server 30 can be associated with one or more authorized users. The authorized users can include an owner or several owners, co-owners, and users with varying degree of permission for using computing device 10 or other computer devices. Authorized user library 304 can include profiles for each authorized user, including passwords. Behavior fingerprint module 106/106a/106c and level of authentication module 102/102a/102c can be associated with one or more authorized users, or associated with just one authorized user, in accordance with system requirements. For example, each authorized user can have a designated behavioral fingerprint. When first user 20 is identified as one of a plurality of authorized users, the behavioral fingerprint for that authorized user would be associated with first user 20, and a level of authentication can be then determined.

FIG. 3a further illustrates anomalous activity library 306. Anomalous activity library 306 can include stored data indicating that an anomalous activity has taken place. In one embodiment, an authorized user can store or log activities that the user has predetermined to be anomalous. For example, an authorized user may provide a list of area codes that the authorized user would consider anomalous when computing device 10 is operated as a phone. Such a list could include all foreign country phone numbers, specific area codes, or the like that the authorized user would not normally call from computing device 10. An authorized user could further identify actions that would be anomalous for that authorized user. Identified actions could include time-of-day usage, GPS-determined locations of computing device 10 that the authorized user considers anomalous, and application-specific actions identified as anomalous. An example of application-specific actions could include deletion of significant amounts of data, logging into a social network as a user that is not an authorized user of computing device 10, and the like. In an embodiment, anomalous activity library 306 further logs activities received via a network that are determined to be anomalous. For example, a social network entity can post a message, monitored by computing device 10 and/or computer server 30, that includes a warning or other indication of unsafe conditions associated with computing device 10. Anomalous activity library 306 could be configured to log the warning so that the behavioral fingerprint module can determine whether to associate the warning with an authorized user.
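By way of illustration only, the following non-limiting Python sketch shows one way predetermined anomalous area codes and usage times might be checked and logged; the particular lists and thresholds are illustrative assumptions.

    # Illustrative predetermined lists; an authorized user would supply these.
    ANOMALOUS_AREA_CODES = {"+7", "+86"}   # e.g., foreign prefixes never called
    ANOMALOUS_HOURS = range(2, 5)          # e.g., 2am-5am usage deemed anomalous

    def is_anomalous_call(number):
        return any(number.startswith(code) for code in ANOMALOUS_AREA_CODES)

    def is_anomalous_usage(hour):
        return hour in ANOMALOUS_HOURS

    def log_if_anomalous(activity, log):
        """Log an activity so the behavioral fingerprint module can decide
        whether to associate it with an authorized user."""
        kind, value = activity
        if (kind == "call" and is_anomalous_call(value)) or \
           (kind == "usage_hour" and is_anomalous_usage(value)):
            log.append(activity)

    log = []
    log_if_anomalous(("call", "+7 495 555 0100"), log)
    log_if_anomalous(("usage_hour", 3), log)
    print(log)  # both entries logged as anomalous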

FIG. 3a further illustrates cryptographic library 308, which can include data such as passwords, public/private key pair data, and cryptographic keys of the types used in block ciphers such as Triple DES or in substitution-permutation algorithms such as AES. As will be appreciated by those of skill in the art, in Triple DES, data is encrypted with a first key, decrypted with a second key, and finally encrypted again with a third key, resulting in an effective key length of up to 168 bits. AES encryption can use variable key lengths. For example, keys used in AES can have lengths of 128, 192, or 256 bits to encrypt blocks with a fixed length of 128 bits (the underlying Rijndael cipher additionally supports block lengths of 192 or 256 bits, making all nine combinations of key length and block length possible).

As will be appreciated by those of skill in the art with the benefit of the present application, key lengths can change over time as computing capabilities change and progress. As such, the key lengths described herein are exemplary only and not intended to be limiting in any way. Cryptographic library 308 can receive data from social networks or designated sources to create a key pair or to regenerate a key or key pair. For example, as part of an authorized user's behavioral fingerprint, the authorized user could assign parts of a key, either asymmetric or symmetric, to several “friends” on a social network. In the current state of the art, the “public key” of an asymmetric key pair would not need to be kept secret, whereas a “private key” or other “secret,” such as a symmetric key, would need to be protected. For purposes of the present application, in embodiments presented herein, the terms “asymmetric key,” “public key,” and “private key” contemplate possible changes in cryptography algorithms for which different types of asymmetric keys could require protection. Furthermore, embodiments herein contemplate the re-emergence and/or generation of cryptography systems wherein cryptographic keys may be made public and the specific cryptographic algorithms used to generate cryptographic keys may need to be kept secret. For example, in an attempt to thwart piracy, some computer gaming software systems now execute certain security code(s) on a remote server instead of the local device. In this case, the data may be known, but the code implementing the algorithm is kept secret. The use of the terms asymmetric, public, and private should not be interpreted as restricted to the current form of public/private key pair encryption, but rather to the general case of establishing a means of secure communication with some aspect being kept secret. For example, key encryption may be either symmetric or asymmetric, with some aspect being known and some aspect being kept secret. If an anomalous event occurs that causes the authorized user's behavioral fingerprint to be compromised, an authorized user can reestablish a behavioral fingerprint by notifying each designated “friend” in the social network to send a portion of a key, so that when the key is regenerated, the behavioral fingerprint is rebuilt.
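By way of illustration only, the following non-limiting Python sketch shows one way portions of a key might be distributed to designated “friends” and later regenerated; it uses a simple XOR split in which every portion is required (a threshold scheme such as Shamir's secret sharing could be substituted), and all identifiers are illustrative assumptions.

    import secrets
    from functools import reduce

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key, n_friends):
        """Split a key into n portions; all portions are required to rebuild."""
        shares = [secrets.token_bytes(len(key)) for _ in range(n_friends - 1)]
        shares.append(reduce(xor_bytes, shares, key))  # final share masks the key
        return shares

    def rebuild_key(shares):
        """Regenerate the key after gathering every portion back from friends."""
        return reduce(xor_bytes, shares)

    key = secrets.token_bytes(32)            # e.g., a 256-bit key
    portions = split_key(key, n_friends=5)   # one portion per designated friend
    assert rebuild_key(portions) == key      # key, and thus fingerprint, rebuilt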

Referring to FIG. 3b, behavioral fingerprint module 106/106a/106c is shown in more detail. Behavioral fingerprint module 106/106a/106c receives data from behavioral fingerprint library 170. Behavioral fingerprint module 106/106a/106c is shown including initialization module 312, fingerprint build/degradation module 314, and fingerprint generation module 316.

Initialization module 312 may be configured to determine an initial behavioral fingerprint associated with an authorized user. The initial behavioral fingerprint can be based on data entered by an authorized user, data received from behavioral fingerprint library 170, and data received from one or more sensors 120.

Fingerprint build/degradation module 314 may be configured to determine whether an initial behavioral fingerprint should be altered based on data received from behavioral fingerprint library 170 or from one or more sensors 120.

Fingerprint generation module 316 may be configured to determine a current behavioral fingerprint for a first user 20 determined to be an authorized user attempting to operate computing device 10. Fingerprint generation module 316 can also be configured to determine a behavioral fingerprint for an established authorized user based on network received data while computing device 10 is connected to a network connection. In the case of fingerprint generation module 316 existing in a cloud computing setting or computer server 30, fingerprint generation module 316 may be configured to determine a network-based behavioral fingerprint for a plurality of users when they first log into network 50 or, in a cloud computing context, log into computer server 30.
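By way of illustration only, the following non-limiting Python sketch shows one way fingerprint building and degradation might be tracked as a single confidence value; the update rules are illustrative assumptions.

    class BehavioralFingerprint:
        """Illustrative sketch: a fingerprint reduced to a confidence score."""
        def __init__(self):
            self.confidence = 0.5        # initial fingerprint strength

        def build(self, consistent_actions):
            """Strengthen the fingerprint as actions match prior behavior."""
            self.confidence = min(1.0, self.confidence + 0.05 * consistent_actions)

        def degrade(self, anomalous_actions):
            """Degrade the fingerprint when received data conflicts with it."""
            self.confidence = max(0.0, self.confidence - 0.2 * anomalous_actions)

    fp = BehavioralFingerprint()
    fp.build(consistent_actions=4)    # confidence rises to 0.7
    fp.degrade(anomalous_actions=1)   # confidence falls back to 0.5
    print(fp.confidence)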

A behavioral fingerprint can be determined before first user 20 handles computing device 10. In some embodiments, a manufacturer can set both a behavioral fingerprint and a level of authentication based on information received from first user 20 when ordering computing device 10 or when first handling computing device 10, for example, received passwords and the like. In a computer server 30 environment, a behavioral fingerprint can be transferred from another device, such as devices 60. Whether the level of authentication or the behavioral fingerprint controls the accessibility and actions available to first user 20 depends on system requirements and can be adjusted. For example, a behavioral fingerprint may indicate that computing device 10 has been stolen, and, in such a case, the behavioral fingerprint library 170 could be configured to notify level of authentication module 102 of exigent circumstances requiring a reduced access to computing device 10. Likewise, computer server 30 could hold the behavioral fingerprint library 170c and notify level of authentication modules 102 and 102c of exigent circumstances.

Also, a behavioral fingerprint module 106/106a/106c may be configured to rebuild an asymmetric key pair or a Triple DES or AES type key after an anomalous event, and to notify a level of authentication module 102/102a/102c that an authorized user should have a level of authentication that allows access.

Behavioral fingerprint module 106/106a/106c can receive data related to various types of movements, actions, and inputs related to computing device 10. For example, an initial behavioral fingerprint generated by behavioral fingerprint module 106/106a/106c could be configured to communicate, to level of authentication module 102/102a/102c, predetermined inputs to computing device 10 and/or computer server 30 that provide access.

Other examples of the types of movements, actions, and inputs that may be tracked for purposes of determining a behavioral fingerprint may include, for example, individually or in combination, those tracked using one or more sensors 120 that may be included with the computing device 10 as illustrated in FIG. 2d. For example, in various embodiments, one or more movement sensors 202 can directly detect movements, and/or other types of sensors (e.g., image capturing devices 204, audio capturing devices 206, etc.) that may be able to indirectly detect actions may be employed to confirm actions taken with respect to the computing device 10, as will be further described herein. Another type of sensor can determine a particular way in which the first user types on a keyboard of the computing device or applies pressure to the computing device. For example, a first user may repetitively use particular keys with a particular pressure or the like. The key pattern could be used in behavioral fingerprint module 106/106a to build on a behavioral fingerprint, as in fingerprint build/degradation module 314, for example.
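By way of illustration only, the following non-limiting Python sketch shows one way a typing pattern (inter-key timing and key pressure) might be extracted and compared against a stored profile; the chosen features and tolerance are illustrative assumptions.

    from statistics import mean, stdev

    def keystroke_features(events):
        """events: list of (timestamp_seconds, key, pressure) tuples."""
        gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
        return {
            "mean_gap": mean(gaps),
            "gap_stdev": stdev(gaps) if len(gaps) > 1 else 0.0,
            "mean_pressure": mean(e[2] for e in events),
        }

    def matches_profile(features, profile, tolerance=0.25):
        """Compare observed features to the authorized user's stored profile."""
        return all(abs(features[k] - profile[k]) <= tolerance * profile[k]
                   for k in profile)

    observed = keystroke_features([(0.00, "t", 0.6), (0.18, "h", 0.7),
                                   (0.35, "e", 0.65)])
    print(matches_profile(observed, {"mean_gap": 0.17, "mean_pressure": 0.64}))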

The type of access to be restricted in response to determining that the computing device 10 or computer server 30 has an altered level of authentication for first user 20 will depend on a number of factors including what types of actions are requested. For example, if the one or more items are one or more software applications (herein “applications”), then the access restriction may include restriction to one or more functionalities of the one or more applications. Alternatively, restricting or disabling the one or more applications may in some cases mean that access to the one or more applications is completely blocked or hidden. In contrast, if the one or more items are one or more electronic documents (e.g., productivity documents, image or audio files, etc.), then the access restriction that may be applied to such items may relate to editorial access restrictions (e.g., restrictions to the modifications, deletion, addition, and so forth of the items) of the items as a function of the level of authentication. Likewise, automatic actions and tasks may be restricted or disabled as a function of the level of authentication.
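By way of illustration only, the following non-limiting Python sketch maps levels of authentication to per-item restrictions of the kinds described above; the level names and rules are illustrative assumptions.

    RESTRICTIONS = {
        # level: (application access, document access, automatic tasks)
        "high":   ("full",    "read-write", "enabled"),
        "medium": ("limited", "read-only",  "disabled"),
        "low":    ("blocked", "hidden",     "disabled"),
    }

    def restrict(item_type, level):
        """Return the restriction applied to an item type at a given level."""
        apps, docs, tasks = RESTRICTIONS[level]
        return {"application": apps, "document": docs, "automatic_task": tasks}[item_type]

    print(restrict("document", "medium"))     # read-only editorial restriction
    print(restrict("automatic_task", "low"))  # automatic actions disabled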

In some cases, restricting access to the one or more items may mean restricting viewing access to the one or more items while in other cases it may mean restricting audio access to the one or more items. In some cases, restricting access to the one or more items may mean complete restriction to access of the one or more items and/or one or more actions, while in other cases, restricting access to the one or more items may mean only a partial restriction to access of the one or more items. In any event, a more detailed discussion related to the various types of access restrictions that may be applied to the one or more items will be provided below with respect to the operations and processes to be described herein.

In some embodiments, the computing device 10 in response to restricting access to the one or more items and preventing one or more automatic actions, may be designed to generate an alert that indicates that the computing device 10 has been reconfigured to restrict access to the one or more items and disable the one or more automatic actions. Note that in some embodiments, the alert can go back and forth between computer server 30 and computing device 10, depending on the source of the alert and the exigency of the alert.

A more detailed discussion related to the computing device 10 of FIGS. 1-3 will now be provided with respect to the processes and operations to be described herein. FIG. 4 illustrates an operational flow 400 representing example operations for, among other things, restricting access via a computing device to one or more items (e.g., software applications, electronic documents including productivity documents, audio or image files, electronic messages including emails, passwords, and so forth). In FIG. 4 and in the following figures that include various examples of operational flows, discussions and explanations will be provided with respect to the exemplary environment 100 described above and as illustrated in FIG. 1 and/or with respect to other examples (e.g., as provided in FIG. 2a) and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 2a, 2b, 2c, or 2d and FIG. 3a or 3b. Also, although the various operational flows are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in orders other than those illustrated, or may be performed concurrently.

Further, in FIG. 4 and in the figures to follow thereafter, various operations may be depicted in a box-within-a-box manner. Such depictions may indicate that an operation in an internal box may comprise an optional example embodiment of the operational step illustrated in one or more external boxes. However, it should be understood that internal box operations may be viewed as independent operations separate from any associated external boxes and may be performed in any sequence with respect to all other illustrated operations, or may be performed concurrently. Still further, these operations illustrated in FIG. 4 as well as the other operations to be described herein are performed by at least one of a machine, an article of manufacture, or a composition of matter unless indicated otherwise.

In any event, after a start operation, the operational flow 400 of FIG. 4 may move to an associative determination operation 402 for determining that a first user of a computing device is associated with the computing device. For instance, and as an illustration, the level of authentication module 102/102a of the computing device 10 of FIG. 1 determining that a first user 20 (e.g., an unknown user having inferior access rights or an authorized user of the computing device 10 of FIG. 1) has turned on and/or logged onto computing device 10. Note that in various implementations, the first user 20 may “use” the computing device 10 by logging onto the computing device 10 and/or by employing the computing device 10 to access one or more applications and/or content that may be accessible through the computing device 10. In addition to the association operation 402, operational flow 400 may also include a level of authentication operation 404 for determining a level of authentication associated with the first user via the computing device, the level of authentication at least partially based on a behavioral fingerprint as further illustrated in FIG. 4. For instance, level of authentication module 102/102a determining a level of authentication for first user 20. The level of authentication can be configured to restrict access to the one or more items/actions as a function of the level of authentication assigned to first user 20. If first user 20 is identified as an authorized user, level of authentication module 102/102a can be configured to take into account a behavioral fingerprint associated with that authorized user.

In addition to level of authentication operation 404, operational flow 400 includes operation 406, determining via the computing device that the first user has made a request for performance of a task, for example, computing device 10 user interface 110 receiving an input from first user 20 to access an application 160 or the like. Operation 406 is followed by operation 408, performing the task automatically without interference by the first user as a function of the level of authentication of the first user. For instance, the level of authentication module 102/102a of the computing device 10 of FIG. 1 determining automatically without interference (e.g., without prompting) that first user 20 is an authorized user and activating one of applications 160 to perform a task automatically.

As will be further described herein, the level of authentication operation 404 of FIG. 4 may be executed in a variety of different ways in various alternative implementations. FIGS. 5a, 5b, 5c, for example, illustrate at least some of the alternative ways that operation 404 of FIG. 4 may be executed in various alternative implementations. For example, in various implementations, operation 404 of FIG. 4 may include an operation 502 for determining the behavioral fingerprint via establishing a statistical predictability of one or more future actions of an authorized user of the computing device as depicted in FIG. 5a. For instance, behavioral fingerprint module 106/106a determining a behavioral fingerprint of first user 20 by establishing that first user 20 is an authorized user of computing device 10, and generating a behavioral fingerprint via fingerprint build/degradation module 314 and fingerprint generation module 316, which can include statistical calculations based on prior actions to predict future actions of an authorized user.
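By way of illustration only, the following non-limiting Python sketch shows one way statistical predictability of future actions might be established from prior sensed actions, using simple first-order transition counts; the model choice is an illustrative assumption.

    from collections import Counter, defaultdict

    class ActionPredictor:
        """Illustrative sketch: first-order action-transition statistics."""
        def __init__(self):
            self.transitions = defaultdict(Counter)

        def observe(self, actions):
            """Record prior actions of the authorized user (fingerprint build)."""
            for a, b in zip(actions, actions[1:]):
                self.transitions[a][b] += 1

        def predictability(self, prev_action, next_action):
            """Statistical value that next_action follows prev_action."""
            counts = self.transitions[prev_action]
            total = sum(counts.values())
            return counts[next_action] / total if total else 0.0

    p = ActionPredictor()
    p.observe(["unlock", "email", "browser", "unlock", "email", "calendar"])
    print(p.predictability("unlock", "email"))    # 1.0: strongly predicted
    print(p.predictability("unlock", "browser"))  # 0.0: anomalous next action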

As further illustrated in FIG. 5a, in some implementations, the level of authentication operation 502 may additionally or alternatively include an operation 503 for sensing the one or more actions of the authorized user. For instance, sensors 120 and level of authentication module 102/102a of the computing device 10 of FIG. 1 determining that first user 20 is an authorized user based, at least in part, on data provided by one or more sensors 120.

Data from various types of sensors 120 may be used in order to determine a level of authentication of the computing device 10. For example, and as further illustrated in FIG. 5a, operation 503 may be followed by an operation 504 applying a statistical value to the sensed one or more actions of the authorized user to establish the statistical predictability of one or more future actions of the authorized user. For instance, the level of authentication module 102/102a of the computing device 10 of FIG. 1 applying statistical level determination module 218 to actions taken by an authorized user with a behavioral fingerprint via sensors 120, and behavioral fingerprint library 170.

In some implementations, operation 504 may include an operation 505 for storing the sensed one or more actions of the authorized user as further depicted in FIG. 5a. For instance, memory 114, including library of behavioral fingerprints 170 of the computing device 10 of FIG. 1 storing one or more actions sensed by sensors 120 and actions over a network, such as social network interactions.

In the same or different implementations, operation 505 may include an operation 506 for detecting the one or more actions of the authorized user wherein the one or more actions of the authorized user include logging into one or more social networks. For instance, the level of authentication module 102/102a of the computing device 10 of FIG. 1 determining that first user 20 is operating computing device 10 as an authorized user and communication application 166 running a social network application with data being stored in behavioral fingerprint library 170.

In the same or alternative implementations, operation 503 may include an operation 507 for detecting one or more keystrokes on the computing device to determine a pattern of use associated with the authorized user. For instance, the level of authentication module 102/102a of the computing device 10 of FIG. 1 detecting via movement sensors 202 one or more keystrokes on computing device 10 to determine a pattern of use associated with an authorized user.

Operation 503 may also include an operation 508 for detecting one or more manners of swiping input on the computing device to determine a pattern of use associated with the authorized user as depicted in FIG. 5a. For instance, the level of authentication module 102/102a of the computing device 10 of FIG. 1 detecting via movement sensors 202 manners of swiping an input on computing device 10 to determine a pattern of use associated with an authorized user.

Operation 503 may also include an operation 509 for detecting one or more contacts frequently visited by the authorized user on the computing device to determine a visitation pattern associated with the authorized user as depicted in FIG. 5a. For instance, level of authentication module 102/102a of the computing device 10 of FIG. 1 detecting via social network library 302 a visitation pattern associated with an authorized user.

In some cases, operation 503 may, in turn, include an operation 510, which provides for comparing a stored image of the authorized user to a detected image of the first user via a camera connected to the computing device. For instance, computing device 10 using behavioral fingerprint library 170 and authorized user library 304 to store an image of an authorized user, and level of authentication module 102/102a and/or behavioral fingerprint module 106/106a comparing the stored image of the authorized user with a received image of first user 20 via sensors 120, such as image capturing device 204.

Referring to operation 504, operation 504 can include operation 511 altering the level of authentication of the first user as a function of the statistical predictability of the one or more future actions of the authorized user. For instance, computing device 10 altering a level of authentication using level of authentication module 102/102a as a function of a statistical predictability of one or more future actions of the authorized user determined via statistical level determination module 218.

In the same or different implementations, operation 511 may include an operation 512 for lowering the level of authentication of the first user when the one or more actions of the first user includes a detected anomalous action as further depicted in FIG. 5a. For instance, the anomalous action detecting module 212 of the computing device 10 detecting an anomalous action with respect to computing device 10 during use of the computing device 10 by the first user 20, and causing level of authentication module 102/102a to lower the level of authentication with respect to first user 20.

In various implementations, the operation 512 for lowering the level of authentication of the first user when the one or more actions of the first user includes a detected anomalous action may include operation 513 for detecting that the first user has performed an action uncharacteristic of the authorized user and/or that the first user has performed an action previously identified by the authorized user as being an action to cause lowering of the level of authentication. For instance, computing device 10, behavioral fingerprint library 170, and anomalous activity library 306 alerting level of authentication module 102/102a and behavioral fingerprint module 106/106a of an action anomalous relative to a stored activity of anomalous activity library 306.

Operation 511 can further include operation 514 alerting a predetermined set of contacts if the statistical predictability of the one or more future actions of the authorized user causes a predetermined level of authentication of the first user. For instance, computing device 10 alerting a predetermined set of contacts via social network library 302 and network interface 112 after statistical level determination module 218 determines that the statistical predictability of one or more future actions of an authorized user causes a predetermined level of authentication of the first user 20. The predetermined level of authentication determined for first user 20 could be a determination that first user 20 has stolen computing device 10, that first user 20 is on a list of users that are unauthorized, that first user 20 has entered several incorrect passwords, or the like, any of which would cause a lowered level of authentication.

Operation 511 can further include operation 515 disabling one or more devices of the authorized user if the level of authentication is lowered to a predetermined level. For instance, computing device 10 disabling one or more devices over which computing device 10 has control when a level of authentication determined by level of authentication module 102/102a is altered to a lower predetermined level. The one or more devices can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Operation 511 can further include operation 516 disabling a mobile device of the authorized user if the level of authentication is lowered to a predetermined level. For instance, computing device 10 disabling a mobile device when a level of authentication determined by level of authentication module 102/102a is altered to a lower predetermined level. The mobile device can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Referring now to FIG. 5b, operation 404, determining a level of authentication associated with the first user via the computing device, the level of authentication at least partially based on a behavioral fingerprint, can include operation 517 determining the level of authentication of the first user at least partially via a reconstructed key formed via gathered data from at least one social network. For instance, computing device 10, behavioral fingerprint library 170, and cryptographic library 308 receiving key data from at least one social network, such as social networks stored in social network library 302, to rebuild a cryptographic key, such as an asymmetric public/private key pair or a symmetric Triple DES or AES type key.

In some implementations, operation 517 may further include an operation 518 for generating a security certificate associated with the authorized user based on a cryptographic key. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES or AES key or an asymmetric key pair, such as a private/public key pair. In doing so, the computing device 10 may store either a private or a public portion of the public/private key pair.

In some embodiments operation 518 may be followed by an operation 519 altering the cryptographic key to enable distribution of one or more altered forms of the cryptographic key to enable rebuilding of the cryptographic key via the gathered data from the at least one social network. For instance, a cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends on at least one social network such as social networks stored via social network library 302 and the portions can later be gathered from the users/members/friends of the social network.

In various embodiments, operation 517 for determining the level of authentication of the first user at least partially via a reconstructed key formed via gathered data from at least one social network includes operation 525 determining a private/public key pair including a private key and a public key. For instance, cryptographic library 308 determining a private/public key pair with a private key and a public key.

Operation 525 can be followed by operation 526 altering the private key to enable distribution of one or more components of the private key, each of the one or more components of the private key required for the reconstructed key. For instance, a cryptographic key based on a public/private key pair could have the private key separated into components of the cryptographic key for distribution of the one or more components so that the one or more components, or a combination thereof, are required for the regenerated key.

Operation 526 can be followed by operation 527 distributing the one or more components of the private key to one or more members of a trusted group. For instance, cryptographic library 308 distributing via network interface 112 one or more components of the private key to one or more members of a trusted group, such as members of a group on one or more social networks stored on social network library 302.

In one implementation, operation 517 for determining the level of authentication of the first user at least partially via a reconstructed key formed via gathered data from at least one social network can further include operation 528 determining the gathered data from the at least one social network via retrieving one or more components of the private key required for the reconstructed key from one or more members of a trusted group via the at least one social network. For instance, cryptographic library 308 gathering, via network interface 112, one or more components of the private key from one or more members of a trusted group, such as members of a group of at least one social network stored on social network library 302.

In one implementation, operation 517 can further include operation 529 requesting each of the one or more members of the trusted group for the one or more components of the private key, each of the one or more members having a level of authentication previously granted by the authorized user. For instance, computing device 10 requesting via network interface 112 each of one or more members of a trusted group holding one or more components of the private key generated by cryptographic library 308, each of the one or more members being stored in social network library 302 and having a level of authentication previously granted by the authorized user.

In one embodiment, operation 517 can further include operation 530 determining one or more members of a trusted group from which to gather the gathered data, the one or more members of the trusted group belonging to the at least one social network, each of the one or more members capable of storing a component to enable forming the reconstructed key. For instance, computing device 10 determining one or more members of a trusted group via social network library 302, each of the one or more members being a member of a social network, and each of the one or more members capable of storing a component of a cryptographic key created via cryptographic library 308 such that the component can be gathered as gathered data to reconstruct the cryptographic key via cryptographic library 308.
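By way of illustration only, the following non-limiting Python sketch shows one way the gathering and reconstruction of operations 527-530 might proceed, reusing the XOR-split scheme sketched earlier; the stubbed request mechanism and identifiers are illustrative assumptions.

    from functools import reduce

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def gather_and_reconstruct(trusted_members, request_component):
        """Request each member's stored component via the social network,
        then form the reconstructed key (all components are required)."""
        components = []
        for member in trusted_members:
            component = request_component(member)  # e.g., a social-network call
            if component is None:
                raise RuntimeError(member + " did not return a component")
            components.append(component)
        return reduce(xor_bytes, components)

    # Stubbed usage: each "friend" returns the portion previously distributed.
    held = {"friend_1": b"\x01", "friend_2": b"\x02", "friend_3": b"\x10"}
    print(gather_and_reconstruct(list(held), held.get))  # b'\x13'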

As further illustrated in FIG. 5c, in some implementations, operation 404 may further include an operation 531 for restricting access via the computing device to one or more applications in response to the determining as depicted in FIG. 5c. For instance, the access restriction module 104/104a of the computing device 10 restricting access via the computing device 10 to one or more items (e.g., electronic documents including electronic messages and/or productivity documents such as word processing documents, image or audio files, applications, passwords, and so forth) in response to the determining by at least restricting access to the one or more items that were accessible by an authorized user (e.g., were visible, editable, and/or usable by the authorized user) when the authorized user was using the computing device 10. For instance, the application access restriction module 264 (see FIG. 2c) of the computing device 10 restricting access via the computing device 10 to one or more applications 160 (e.g., a productivity application such as a word processing application, a communication application such as an IM application, a gaming application, and so forth) in response to the determining. In some cases, such restrictions to one or more applications 160 may be related to restricting use of one or more functionalities of the one or more applications 160. In some embodiments, access restriction can be complete; for instance, the access restricting module 104/104a including the no access module 234 (see FIG. 2c) of the computing device 10 restricting access to the one or more items that would be accessible by the first user 20 when the first user 20 is an authorized user of computing device 10 by having the no access module 234 provide no access (e.g., completely hiding or erasing any indications of the existence of the one or more items) to the one or more items that were accessible when an authorized user was using the computing device 10.

As further illustrated in FIG. 5c, operation 531 may include one or more additional operations in various alternative implementations. For example, in some implementations, operation 531 may include an operation 532 for restricting access via the computing device to one or more productivity applications in response to the determining. For instance, the access restricting module 104/104a including the document access restricting module 254 (see FIG. 2c) of the computing device 10 restricting access to the one or more items that would be accessible by the first user 20 if first user 20 is determined to be an authorized user of the computing device 10 by having the productivity document access restricting module 255 provide restricted access (e.g., read-only access or limited functional access if the one or more items includes one or more applications 160) to the one or more items that were accessible by an authorized user using the computing device 10.

In some implementations, operation 532 may include an operation 533 for restricting access via the computing device to one or more communication applications in response to the determining. For instance, the communication application access restriction module 266 (see FIG. 2c) of the computing device 10 restricting access via the computing device 10 to one or more communication applications (e.g., email application, instant messaging or IM application, text messaging application, and so forth) in response to the determining.

In some cases, the access restricting operation 531 restricting access via the computing device to one or more applications in response to the determining may include an operation 534 for restricting access via the computing device to one or more personal information manager applications in response to the determining. For instance, the personal information manager application access restriction module 267 (see FIG. 2c) of the computing device 10 restricting access via the computing device 10 to one or more personal information manager applications (e.g., Microsoft® Outlook™) in response to the determining.

As further illustrated in FIG. 5c, operation 531 may include operation 535 restricting access via the computing device to automatic tasks that are associated with a predetermined level of authentication of an authorized user in response to the determining. For instance, the no automatic task functionality module 235 (see FIG. 2c) of the computing device 10 preventing, via the computing device 10 and in response at least in part to the determining of a level of authentication, the one or more automatic tasks (e.g., door opening, car starting) from being performed.

A more detailed discussion related to the computer server 30 of FIGS. 1-3 will now be provided with respect to the processes and operations to be described herein. Referring now to FIG. 6, a detailed discussion related to the computing device 10 of FIGS. 1-3 will now be provided with respect to alternative processes and operations to be described herein. FIG. 6 illustrates an operational flow 600 representing example operations for, among other things, developing a behavioral fingerprint. In FIG. 6 and in the following figures that include various examples of operational flows, discussions and explanations will be provided with respect to the exemplary environment 100 described above and as illustrated in FIG. 1 and/or with respect to other examples (e.g., as provided in FIG. 2a) and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 2a, 2b, 2c, or 2d and FIG. 3a or 3b. Also, although the various operational flows are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in orders other than those illustrated, or may be performed concurrently.

Further, in FIG. 6 and in the figures to follow thereafter, various operations may be depicted in a box-within-a-box manner. Such depictions may indicate that an operation in an internal box may comprise an optional example embodiment of the operational step illustrated in one or more external boxes. However, it should be understood that internal box operations may be viewed as independent operations separate from any associated external boxes and may be performed in any sequence with respect to all other illustrated operations, or may be performed concurrently. Still further, these operations illustrated in FIG. 6 as well as the other operations to be described herein are performed by at least one of a machine, an article of manufacture, or a composition of matter unless indicated otherwise.

In any event, after a start operation, the operational flow 600 of FIG. 6 includes an identification operation 602 for identifying a network connection via a computer server to a computing device. For instance, and as an illustration, the computer server 30 connecting via network 50 to the computing device 10 of FIG. 1. In addition to the identification operation 602, operational flow 600 may also include a transmission operation 604 for transmitting, via the network connection, a behavioral fingerprint associated with an authorized user of the computing device, the behavioral fingerprint providing a current status of the authorized user with respect to the computing device as further illustrated in FIG. 6. For instance, computer server 30 transmitting via network interface 112c a behavioral fingerprint associated with an authorized user of computing device 10, the behavioral fingerprint providing a current status of the authorized user with respect to computing device 10. FIG. 6 further shows operation 606 for transmitting, via the network connection, a level of authentication for network accessible functions associated with the behavioral fingerprint to the computing device. For instance, computer server 30 transmitting via network interface 112c a level of authentication for any network accessible functions shown in FIG. 2e associated with a behavioral fingerprint of computing device 10. The level of authentication can be configured to restrict access to one or more items/actions as a function of the level of authentication assigned to first user 20; if first user 20 is identified as an authorized user, level of authentication module 102/102a/102c can be configured to take into account a behavioral fingerprint associated with that authorized user. FIG. 6 further shows operation 608 for enabling one or more tasks to be performed automatically as a function of the level of authentication of the authorized user. For instance, computer server 30 enabling tasks associated with functions shown in FIG. 2e, such as communication applications 166c and productivity applications 164c, to be performed automatically.
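By way of illustration only, the following non-limiting Python sketch walks through operations 602-608 on the server side; the stubbed connection, store, and threshold are illustrative assumptions.

    class StubConnection:
        """Illustrative stand-in for a network connection to a computing device."""
        device_id = "device_10"
        def send(self, kind, payload):
            print("transmit", kind, payload)

    def operational_flow_600(connection, fingerprint_store, enable_task):
        device_id = connection.device_id                         # 602: identify connection
        fingerprint = fingerprint_store[device_id]
        connection.send("behavioral_fingerprint", fingerprint)   # 604: transmit fingerprint
        level = fingerprint["level_of_authentication"]
        connection.send("level_of_authentication", level)        # 606: transmit level
        if level >= 0.8:                                         # 608: enable tasks automatically
            enable_task("communication_applications")
            enable_task("productivity_applications")

    store = {"device_10": {"status": "authorized user present",
                           "level_of_authentication": 0.9}}
    operational_flow_600(StubConnection(), store,
                         enable_task=lambda t: print("enable", t))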

As will be further described herein, the behavioral fingerprint operation 604 of FIG. 6 may be executed in a variety of different ways in various alternative implementations. FIGS. 7a and 7b, for example, illustrate at least some of the alternative ways that operation 604 of FIG. 6 may be executed in various alternative implementations. For example, in various implementations, operation 604 of FIG. 6 may include an operation 702 for determining the behavioral fingerprint via confirming an internet presence of the authorized user of the computing device as depicted in FIG. 7a. For instance, behavioral fingerprint module 106/106a/106c determining a behavioral fingerprint of first user 20 by establishing that first user 20 is an authorized user of computing device 10, and generating a behavioral fingerprint via fingerprint build/degradation module 314 and fingerprint generation module 316, which can include statistical calculations based on prior actions to predict future actions of an authorized user.

As further illustrated in FIG. 7a, in some implementations, the behavioral fingerprint operation 702 may additionally or alternatively include an operation 703 for sensing one or more actions of the authorized user and two or more designated internet available entities. For instance, sensors 120 and level of authentication module 102/102a of the computing device 10 of FIG. 1 determining that first user 20 is an authorized user based, at least in part, on data provided by one or more sensors 120 and sensing activities of two or more designated internet available entities, such as via a cloud computing network, network 50, and/or device 60 shown in FIG. 1.

Data from various types of sensors 120 may be used in order to determine a behavioral fingerprint to be stored on computer server 30 and computing device 10. For example, and as further illustrated in FIG. 7a, operation 703 may be followed by an operation 704 applying reliability criteria to the sensed one or more actions of the authorized user and the two or more designated internet available entities to generate the behavioral fingerprint of the authorized user. For instance, the actions of the authorized user and the two or more designated internet available entities can be judged via statistical probabilities or other criteria to determine whether the actions are consistent with available data, and may then be used to generate, regenerate, or amend a behavioral fingerprint of the authorized user.
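By way of illustration only, the following non-limiting Python sketch shows one possible reliability criterion, in which a sensed action is used for the behavioral fingerprint only if a quorum of designated internet available entities reports consistent data; the quorum rule is an illustrative assumption.

    def reliable(action, entity_reports, quorum=2):
        """entity_reports: mapping of entity -> the action that entity observed."""
        agreeing = sum(1 for seen in entity_reports.values() if seen == action)
        return agreeing >= quorum

    reports = {"cloud_service": "login_seattle", "device_60": "login_seattle"}
    print(reliable("login_seattle", reports))  # True: consistent, used in fingerprint
    print(reliable("login_moscow", reports))   # False: inconsistent, not used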

In some implementations, operation 703 may include an operation 706 for storing the sensed one or more actions of the authorized user and the two or more designated internet available entities as further depicted in FIG. 7a. For instance, memory 114/114c, including library of behavioral fingerprints 170/170c in computing device 10/computer server 30 of FIG. 1, storing one or more actions sensed by sensors 120 and actions over a network, such as social network interactions.

In some implementations, operation 703 may include an operation 707 for detecting the one or more actions of the authorized user wherein the one or more actions of the authorized user include logging into one or more social networks as further depicted in FIG. 7a. For instance, memory 114c, including library of behavioral fingerprints 170c of the computer server 30 of FIG. 1, detecting one or more actions over a network, such as social network interactions. Detecting one or more actions can also include an authorized user using communication application 166c to run a social network application with data being stored in behavioral fingerprint library 170c.

In the same or different implementations, operation 703 may include an operation 708 for mapping one or more locations of the authorized user and the two or more designated internet available entities. For instance, the level of authentication module 102/102a/102c of the computing device 10/computer server 30 of FIG. 1 determining that first user 20 is operating computing device 10 via a network connection and using GPS-enabled applications, such as GPS 208 of computing device 10 shown in FIG. 2d, to locate the authorized user. Additionally, any designated internet available entities can be located via social network functionalities such as a “check in” function on a smart phone application running on devices 60 or the like.

In the same or alternative implementations, operation 703 may include an operation 709 for detecting a contact pattern between the authorized user and the two or more designated internet available entities. For instance, applications 160c running on computer server 30 of FIG. 1 (e.g., one or more cloud computing servers) detecting how often an authorized user of computing device 10 contacts other internet available entities and devices 60 to determine a pattern of use associated with an authorized user.

Operation 703 may also include an operation 710 for detecting one or more contacts frequently visited by the authorized user via one or more social networks to determine a visitation pattern associated with the authorized user as depicted in FIG. 7a. For instance, the level of authentication module 102/102a/102c of the computing device 10 and computer server 30 of FIG. 1 detecting contacts frequently visited via Facebook™ and/or Twitter™ and social network library 302 by an authorized user of device 10 to determine a pattern of visitation or frequently contacted persons associated with an authorized user.

Operation 703 may also include an operation 711 for storing, via the computer server, one or more locations visited by the authorized user, the one or more locations including one or more of physical locations and internet address-based locations as depicted in FIG. 7a. For instance, level of authentication module 102/102a/102c of the computing device 10 and computer server 30 of FIG. 1 storing, via social network library 302, GPS-enabled applications such as GPS 208, and the like, any physical locations and/or internet address-based locations visited by and/or associated with an authorized user.

Referring to operation 704, operation 704 can include operation 712 altering the behavioral fingerprint of the authorized user as a function of the sensed one or more actions of the authorized user and the two or more designated internet available entities. For instance, computer server 30 and/or computing device 10 altering a behavioral fingerprint using behavioral fingerprint module 106/106a/106c as a function of the sensed one or more actions of the authorized user and the two or more designated internet available entities.

In the same or different implementations, operation 712 may include an operation 713 for generating an alert as part of the behavioral fingerprint when the sensed one or more actions of the authorized user includes a detected anomalous action as further depicted in FIG. 7a. For instance, alert generating module 108c interacting with the anomalous action detecting module 212 of the computing device 10 and/or computer server 30 detecting an anomalous action with respect to computing device 10 or with respect to sensed one or more actions of an authorized user of computing device 10 during use of the computing device 10 or by using another computing device. For example, an authorized user can borrow or use a public computer to send an alert or create an anomalous action which indicates that any actions by the first user 20 could cause level of authentication module 102/102a to lower the level of authentication with respect to first user 20.

In various implementations, the operation 713 for generating an alert may include operation 714 for transmitting the alert to the computing device. For instance, computer server 30 sending an alert via network interface 112c to computing device 10, the alert being logged in behavioral fingerprint library 170 and anomalous activity library 306 and alerting level of authentication module 102/102a and behavioral fingerprint module 106/106a of an action anomalous relative to a stored activity of anomalous activity library 306.

In various implementations, the operation 713 for generating an alert may include operation 715 for transmitting the alert to one or more applications running on a cloud computing system. For instance, computer server 30 operating in a cloud computing environment receiving the alert via network interface 112c.

In various implementations, operation 715 may include operation 716 for transmitting an alert to the two or more internet available entities via the cloud computing system. For instance, computing device 10 or computer server 30, operating in a cloud environment, alerting a predetermined set of contacts via social network library 302 and network interface 112/112c after statistical level determination module 218 determines that the statistical predictability of one or more future actions of an authorized user indicates an anomaly.

Operation 712 can further include operation 717 for notifying a predetermined set of contacts if the alert is generated by the authorized user. For instance, computer server 30 notifying one or more devices 60 when an alert is generated by an authorized user. The one or more devices can be configured to be automatically notified without interference by first user 20 or the authorized user.

Operation 712 can further include operation 718 for disabling one or more devices of the authorized user if the behavioral fingerprint alteration indicates that the one or more devices of the authorized user have been compromised with respect to authentication. For instance, computing device 10 disabling a mobile device when a behavioral fingerprint determined via library of behavioral fingerprints 170c and behavioral fingerprint module 106c is altered to an untrustworthy level. The devices 60 can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Operation 712 can further include operation 719 for disabling, via the server, a mobile device of the authorized user if the behavioral fingerprint indicates that a level of authentication for the mobile device should be lowered to a predetermined level. For instance, computer server 30 disabling a mobile device or any device 60 when a behavioral fingerprint determined via library of behavioral fingerprints 170/170c and behavioral fingerprint module 106/106a/106c is altered to an untrustworthy level. The mobile device can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Referring now to FIG. 7b, operation 604 transmitting, via the network connection, a behavioral fingerprint associated with an authorized user of the computing device, the behavioral fingerprint providing a current status of the authorized user with respect to the computing device, can include operation 720 reconstructing the behavioral fingerprint of the authorized user at least partially via a reconstructed key at least partially formed via data gathered from at least one social network. For instance, computer server 30 using behavioral fingerprint library 170c and cryptographic library 308 receiving key data from at least one social network, such as social networks stored in social network library 302, to rebuild a cryptographic key, such as a public/private key pair or a Triple DES or AES type symmetric key.

In some implementations, operation 720 may further include an operation 721 for generating a security certificate associated with the authorized user based on a cryptographic key. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES or AES key or an asymmetric key pair, such as a private/public key pair. In doing so, the computer server 30 may store a private or a public portion of the public/private key pair.

In some embodiments, operation 721 may be followed by an operation 722 for altering the cryptographic key to enable distribution of one or more altered forms of the cryptographic key to enable rebuilding of the cryptographic key via the gathered data from the at least one social network. For instance, within computer server 30, a cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends on at least one social network, such as social networks stored via social network library 302, and the portions can later be gathered from the users/members/friends of the social network.

In various embodiments, operation 720 includes operation 728 for determining a private/public key pair including a private key and a public key. For instance, cryptographic library 308 determining a private/public key pair with a private key and a public key.

Operation 728 can be followed by operation 729 for altering the private key to enable distribution of one or more components of the private key, each of the one or more components of the private key required for the reconstructed key. For instance, a cryptographic key based on a public/private key pair could have its private key separated into components for distribution, such that each of the one or more components is required for the regenerated key.

Operation 729 can be followed by operation 730 for distributing the one or more components of the private key to one or more members of a trusted group. For instance, cryptographic library 308 distributing, via network interface 112c of computer server 30, one or more components of the private key to one or more members of a trusted group, such as members of a group on one or more social networks stored on social network library 302.
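
For purposes of illustration only, the following non-limiting sketch shows one way that operations 728, 729, and 730, together with a later reconstruction, could be realized in software. It uses an n-of-n XOR-based secret sharing as a simplified stand-in for whatever scheme cryptographic library 308 actually implements (a production system would more likely use a threshold scheme such as Shamir's secret sharing, so that fewer than all members suffice); the function names are hypothetical.

    import secrets
    from functools import reduce

    def _xor(a: bytes, b: bytes) -> bytes:
        # Byte-wise XOR of two equal-length byte strings.
        return bytes(x ^ y for x, y in zip(a, b))

    def split_private_key(private_key: bytes, n_members: int) -> list[bytes]:
        # Operation 729: alter the private key into components, each of
        # which is required to reconstruct it (n-of-n sharing).
        random_parts = [secrets.token_bytes(len(private_key))
                        for _ in range(n_members - 1)]
        final_part = reduce(_xor, random_parts, private_key)
        return random_parts + [final_part]

    def reconstruct_private_key(gathered_components: list[bytes]) -> bytes:
        # XOR-ing every gathered component together regenerates the key.
        return reduce(_xor, gathered_components)

    # Operation 730: one component would be distributed to each member of
    # the trusted group; all components must later be gathered back.
    private_key = secrets.token_bytes(32)   # stand-in for a real private key
    components = split_private_key(private_key, n_members=3)
    assert reconstruct_private_key(components) == private_key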

In one implementation, operation 720 for reconstructing the behavioral fingerprint of the authorized user at least partially via a reconstructed key at least partially formed via data gathered from at least one social network can further include operation 731 for determining the gathered data from the at least one social network via retrieving one or more components of the private key required for the reconstructed key from one or more members of a trusted group via the at least one social network. For instance, cryptographic library 308 gathering, via network interface 112c of computer server 30, one or more components of the private key from one or more members of a trusted group, such as members of a group of at least one social network stored on social network library 302.

In one implementation, operation 731 can further include operation 732 for requesting each of the one or more members of the trusted group for the one or more components of the private key, each of the one or more members previously identified by the authorized user. For instance, computer server 30 requesting, via network interface 112c, the one or more components of the private key generated by cryptographic library 308 from each of one or more members of a trusted group, each of the one or more members being stored in social network library 302 and having a level of authentication previously granted by the authorized user.

In one embodiment, operation 720 can further include operation 733 for determining one or more members of a trusted group from which to gather the gathered data, the one or more members of the trusted group belonging to the at least one social network, each of the one or more members capable of storing a component to enable forming the reconstructed key. For instance, computer server 30 determining one or more members of a trusted group via social network library 302, each of the one or more members being a member of a social network, and each of the one or more members capable of storing a component of a cryptographic key created via cryptographic library 308 such that the component can be gathered as gathered data to reconstruct the cryptographic key via cryptographic library 308.

A more detailed discussion related to the computer server 30 of FIGS. 1-3 will now be provided with respect to alternate processes and operations to be described herein. Referring now to FIG. 8, a detailed discussion related to the computing device 10 of FIGS. 1-3 will now be provided with respect to alternative processes and operations to be described herein. FIG. 8 illustrates an operational flow 800 representing example operations for, among other things, developing a behavioral fingerprint. In FIG. 8 and in the following figures that include various examples of operational flows, discussions and explanations will be provided with respect to the exemplary environment 100 described above and as illustrated in FIG. 1 and/or with respect to other examples (e.g., as provided in FIG. 2a) and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 2a, 2b, 2c, or 2d and FIG. 3a or 3b. Also, although the various operational flows are presented in the sequence(s) illustrated, it should be understood that the various operations may be performed in orders other than those which are illustrated, or may be performed concurrently.

Further, in FIG. 8 and in the figures to follow thereafter, various operations may be depicted in a box-within-a-box manner. Such depictions may indicate that an operation in an internal box may comprise an optional example embodiment of the operational step illustrated in one or more external boxes. However, it should be understood that internal box operations may be viewed as independent operations separate from any associated external boxes and may be performed in any sequence with respect to all other illustrated operations, or may be performed concurrently. Still further, these operations illustrated in FIG. 8 as well as the other operations to be described herein are performed by at least one of a machine, an article of manufacture, or a composition of matter unless indicated otherwise.

In any event, after a start operation, the operational flow 800 of FIG. 8 includes a behavioral fingerprint operation 801 for determining a behavioral fingerprint associated with a network accessible user of one or more devices, the behavioral fingerprint providing a current status of the network accessible user. For instance, and as an illustration, the computer server 30 connecting via network 50 to the computing device 10 of FIG. 1 can establish and/or determine a behavioral fingerprint associated with a network accessible user, which could be first user 20 of computing device 10; the device or a network can provide a current status of the network accessible user. In addition to the behavioral fingerprint operation 801, operational flow 800 may also include a controlling/disabling operation 802 for disabling the one or more devices automatically as a function of the determined behavioral fingerprint as further illustrated in FIG. 8. For instance, disabling via network interface 112c a current device of one or more devices such as computing device 10. The behavioral fingerprint can be configured to disable a device requiring a disabling signal as a function of the behavioral fingerprint of a network accessible user. If first user 20 is identified as the network accessible user, level of authentication module 102/102a can be configured to take a behavioral fingerprint into account and assist in determining whether the device should be disabled. FIG. 8 further shows operation 803 for transmitting to the one or more devices a level of authentication for network accessible functions associated with the behavioral fingerprint. For instance, computer server 30 transmitting via network interface 112c a level of authentication for any network accessible functions associated with a behavioral fingerprint of a network accessible user, as shown in FIG. 2e. FIG. 8 further shows operation 804 for disabling one or more tasks automatically as a function of the level of authentication of the network accessible user. For instance, computer server 30 automatically disabling functions via access restriction module 104/104a shown in FIG. 2c, and/or disabling server available tasks such as communication applications 166c and productivity applications 164c.
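
As a non-limiting sketch only, the controlling/disabling logic of operations 801 through 804 might be reduced to code along the following lines, assuming (as an illustrative simplification not recited above) that a behavioral fingerprint yields a numeric trust level; all names and thresholds are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class BehavioralFingerprint:
        user_id: str
        trust_level: float          # 0.0 (untrusted) .. 1.0 (fully trusted)
        anomaly_detected: bool = False

    DISABLE_THRESHOLD = 0.2         # illustrative cutoffs only
    RESTRICT_THRESHOLD = 0.5

    def level_of_authentication(fp: BehavioralFingerprint) -> str:
        # Operation 803: derive a level of authentication from the fingerprint.
        if fp.anomaly_detected or fp.trust_level < DISABLE_THRESHOLD:
            return "disabled"
        return "restricted" if fp.trust_level < RESTRICT_THRESHOLD else "full"

    def permitted_tasks(fp: BehavioralFingerprint, tasks: list[str]) -> list[str]:
        # Operations 802/804: disable the device or individual tasks
        # automatically as a function of the determined fingerprint.
        level = level_of_authentication(fp)
        if level == "disabled":
            return []                               # device disabled outright
        if level == "restricted":
            return [t for t in tasks if t.startswith("communication")]
        return tasks

    print(permitted_tasks(BehavioralFingerprint("user20", 0.35),
                          ["communication", "productivity"]))
    # -> ['communication']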

As will be further described herein, the controlling/disabling operation 802 of FIG. 8 may be executed in a variety of different ways in various alternative implementations. FIGS. 9a, 9b, 9c, and 9d, for example, illustrate at least some of the alternative ways that operation 802 of FIG. 8 may be executed in various alternative implementations. For example, in various implementations, operation 802 of FIG. 8 may include an operation 902 for transmitting, from a network accessible theft detection system, a disabling signal, the disabling signal promulgated over a network as depicted in FIG. 9a. For instance, behavioral fingerprint module 106/106a/106c determining a behavioral fingerprint of a network accessible user by establishing that first user 20 is the network accessible user, and generating a behavioral fingerprint via fingerprint build/degradation module 314 and fingerprint generation module 316, which can include statistical calculations based on prior actions to confirm a persistent internet presence of the network accessible user of computing device 10 and/or additional devices.

As further illustrated in FIG. 9a, in some implementations, an operation 902 may additionally or alternatively be followed by an operation 903 for transmitting, from the network accessible theft detection system, an alert signal to the network accessible user. For instance, transmitting using network interface 112c from server 30 to computing device 10 of FIG. 1 an alert signal generated by alert generating module 108c to first user 20 as a network accessible user.

Data from various types of sensors 120 may be used in order to determine a behavioral fingerprint to be stored on computer server 30 and computing device 10.

In some implementations, operation 903 may include an operation 904 for transmitting, from the network accessible theft detection system, an alert signal to at least one or more of a manufacturer of the one or more devices, a law enforcement agency, a trusted group identified by the network accessible user, and/or a social network, the alert signal including data identifying the one or more devices as further depicted in FIG. 9a. For instance, network interface 112c transmitting an alert signal generated via alert generating module 108c and transmitting the alert signal via network 50 to a manufacturer of the one or more devices, a law enforcement agency, a trusted group identified by first user 20 or another user, and/or a social network, such as Facebook or Twitter. The alert signal can include data identifying the one or more devices, such as devices requiring an alert, for example stolen devices or the like.

In some implementations, operation 802 may include an operation 905 for determining the behavioral fingerprint via confirming an internet presence of the network accessible user of the one or more devices as further depicted in FIG. 9a. For instance, memory 114/114c, including library of behavioral fingerprints 170/170c in computing device 10/computer server 30 of FIG. 1, storing one or more internet interactions sensed by sensors 120 and actions over a network, such as social network interactions.

In the same or different implementations, operation 905 may include operations 906, 907, and 908. Operation 906 includes sensing one or more actions of the network accessible user and two or more designated internet available entities. For instance, sensors 120 sensing actions of first user 20 as a network accessible user and sensing the actions of two or more designated internet available entities. In the same or alternative implementations, operation 905 may include an operation 907 for applying reliability criteria to the sensed one or more actions of the network accessible user and the two or more designated internet available entities to generate the behavioral fingerprint of the network accessible user. For instance, applications 160c running on computer server 30 of FIG. 1 (e.g., one or more cloud computer servers) applying reliability criteria to one or more actions, sensed via sensors 120, of first user 20 and two or more designated internet available entities to generate a behavioral fingerprint associated with first user 20.
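
The statistical treatment behind operations 906 and 907 is not limited to any particular computation; purely as an illustrative sketch, one plausible reliability criterion is an empirical-frequency predictability score over previously sensed actions, as below. The action encoding and the scoring rule are assumptions, not the claimed method.

    from collections import Counter

    def predictability(history: list[str], recent: list[str]) -> float:
        # Score how statistically predictable the recently sensed actions
        # are, given empirical frequencies from the prior action history.
        freq = Counter(history)
        total = sum(freq.values())
        if total == 0 or not recent:
            return 0.0
        # Mean empirical probability of each recent action; actions never
        # seen before contribute zero, lowering the score (an anomaly cue).
        return sum(freq.get(a, 0) / total for a in recent) / len(recent)

    history = ["login:facebook", "msg:alice", "login:facebook",
               "msg:bob", "login:twitter", "msg:alice"]
    print(predictability(history, ["login:facebook", "msg:alice"]))    # ~0.33
    print(predictability(history, ["wipe:device", "export:contacts"]))  # 0.0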

Operation 905 may also include an operation 908 for transmitting the behavioral fingerprint to a theft detection system as depicted in FIG. 9a. For instance, network interface 112 or 112c transmitting a behavioral fingerprint to theft detection module 167/167a or 167c, such as transmitting a behavioral fingerprint determined via behavioral fingerprint module 106/106a/106c to server 30, which can interface with or operate a theft detection system.

Referring now to FIG. 9b, operations 802, 905, and 906 continue, with operation 906 further illustrated via alternative and different implementations that can include operations 909, 910, 911, 912, 913, 914, and 915. Specifically, operation 906 can include operation 909 for storing the sensed one or more actions of the network accessible user and the two or more designated internet available entities in a network accessible location, the network accessible location accessible by the theft detection system to access the one or more actions. For instance, memory 114/114a/114c storing actions, sensed via sensors 120, of first user 20 and two or more designated internet available entities, wherein the network accessible location is accessible by theft detection module 167/167a and/or 167c.

Operation 906 can include operation 910 for detecting the one or more actions of the network accessible user wherein the one or more actions of the network accessible user include logging into one or more social networks. For instance, detecting via sensors 120 one or more actions of first user 20 wherein the actions of the first user include logging into Facebook, Twitter or another social network.

Operation 906 can further include operation 911 for transmitting the sensed one or more actions of the network accessible user and the two or more designated internet available entities to the theft detection system, wherein the theft detection system is a network accessible third-party system. For instance, referring to FIGS. 1 and 2a, computing device 10 transmitting, using network interface 112, actions sensed using sensors 120 to a theft detection system, such as a theft detection system implemented by theft detection module 167c in server 30 as in FIGS. 1 and 2e.

Operation 906 can further include operation 912 for detecting a contact pattern between the network accessible user and the two or more designated internet available entities. For instance, sensors 120 residing at computing device 10 and computer server 30 of FIG. 1 detecting contacts that can be two or more designated internet available entities, such as people frequently visited by a first user 20 of computing device 10 via Facebook™ and/or Twitter™ and social network library 302, to determine a pattern of visitation or a set of frequently contacted entities.

Operation 906 can further include operation 913 for detecting one or more contacts frequently visited by the network accessible user via one or more social networks to determine a visitation pattern associated with the network accessible user. For instance, memory 114c, including library of behavioral fingerprints 170c of the computer server 30 of FIG. 1, detecting one or more actions over a network, such as social network interactions, and computing device 10 and computer server 30 of FIG. 1 detecting contacts frequently visited by first user 20 of device 10 via Facebook™ and/or Twitter™ and social network library 302 to determine a pattern of visitation or frequently contacted persons associated with an authorized user.
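
One non-limiting way that the contact-pattern and visitation-pattern detection of operations 912 and 913 could be coded is sketched below; the event format and the frequency cutoff are illustrative assumptions.

    from collections import Counter

    def visitation_pattern(events: list[tuple[str, str]],
                           min_visits: int = 3) -> dict[str, int]:
        # events: (social_network, contact) pairs sensed for a user.
        # Returns contacts visited at least min_visits times, i.e., the
        # frequently contacted persons that form the visitation pattern.
        counts = Counter(contact for _network, contact in events)
        return {c: n for c, n in counts.items() if n >= min_visits}

    events = [("facebook", "alice"), ("twitter", "alice"),
              ("facebook", "alice"), ("facebook", "bob"),
              ("twitter", "carol"), ("facebook", "alice")]
    print(visitation_pattern(events))   # {'alice': 4}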

Operation 906 can also include operation 914 for transmitting the visitation pattern to the theft detection system. For instance, network interface 112 transmitting the visitation pattern to theft detection module 167c in computer server 30 over a network; or sensing the visitation pattern using sensors 120 and transmitting the pattern to theft detection module 167 or 167a within computing device 10.

Operation 906 may also include an operation 915 for transmitting one or more locations visited by the network accessible user to the theft detection system, the one or more locations including one or more of physical locations predicted as being appropriate for the network accessible user as depicted in FIG. 9b. For instance, computing device 10 and computer server 30 of FIG. 1 transmitting, via network interface 112 or 112c, one or more locations detected using sensors 120, such as a GPS or the like, via social network library 302 and GPS enabled applications 208, including any physical locations and/or internet address-based locations visited by and/or associated with first user 20.

Referring now to FIG. 9c, operation 802, operation 905, and operation 907 continue. As further illustrated in FIG. 9c, operation 907 for applying reliability criteria to the sensed one or more actions of the network accessible user and the two or more designated internet available entities to generate the behavioral fingerprint of the network accessible user is again illustrated. For instance, the actions of the authorized user and two or more designated internet available entities can be judged via statistical probabilities or other criteria to determine if the actions are consistent with available data and used to generate, regenerate, or amend a behavioral fingerprint of the network accessible user.

Operation 907 can include operations 916 and 917. In particular, operation 916 is for altering the behavioral fingerprint of the network accessible user as a function of the sensed one or more actions of the network accessible user and the two or more designated internet available entities. For instance, computer server 30 and/or computing device 10 altering a behavioral fingerprint using level of authentication module 102/102a/102c or behavioral fingerprint module 106/106a/106c as a function of the sensed one or more actions of the first user 20 and the two or more designated internet available entities. Operation 916 may be followed by an operation 917 for transmitting the altered behavioral fingerprint of the network accessible user to the theft detection system.

In the same or different implementations, operation 916 may include an operation 918 or operation 919. Operation 918 is for generating a disabling signal as part of the behavioral fingerprint when the sensed one or more actions of the network accessible user includes a detected anomalous action as further depicted in FIG. 9c. For instance, alert generating module 108c interacting with the anomalous action detecting module 212 of the computing device 10 and/or computer server 30 detecting an anomalous action with respect to computing device 10, or with respect to sensed one or more actions of first user 20 of computing device 10 during use of the computing device 10 or of another computing device. For example, a network accessible user can borrow or use a public computer to send an alert or create an anomalous action indicating that any actions by first user 20 could cause level of authentication module 102/102a to lower the level of authentication with respect to first user 20.

In one implementation, operation 918 may include operation 920 for transmitting the disabling signal to the one or more devices. For instance, computing device 10 or computer server 30 transmitting via network interface 112/112c a disabling signal to one or more devices such as a computing device 10 or devices 60 shown in FIG. 1. The disabling signal can be a signal that disables the device entirely, renders a portion of the device unusable, self-destructs all or a portion of the device, or the like. The disabling signal can include a specialized virus signal, a code that causes a preexisting application to self-instantiate, or the like.

In one implementation, operation 918 may include operation 921 for transmitting the disabling signal to one or more applications running on a cloud computing system. In one implementation, operation 921 may include operation 922 for transmitting the disabling signal to the two or more internet available entities via the cloud computing system. Operation 919, in one implementation, is for transmitting the disabling signal to the theft detection system. For instance, network interface 112/112c transmitting a disabling signal to theft detection module 167, 167a, or 167c as appropriate.

In various implementations, the operation 916 may include various operations such as operations 923, 924, or 925.

Specifically, in an implementation, operation 923 is for notifying a predetermined set of contacts if the disabling signal is generated by the network accessible user. For instance, computer server 30 sending to computing device 10, via network interface 112c, a disabling signal to behavioral fingerprint library 170 and anomalous activity library 306, alerting level of authentication module 102 and behavioral fingerprint module 106/106a of an action anomalous to a stored activity of anomalous activity library 306. In an embodiment, level of authentication module 102 can send out a disabling signal to one or more devices in accordance with a list of contacts stored in library 306. For instance, computer server 30 disabling a mobile device or any device 60 when a behavioral fingerprint determined via library of behavioral fingerprints 170c and behavioral fingerprint module 106c is altered to an untrustworthy level. The mobile device can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Operation 924, in an implementation, is for disabling one or more devices of the network accessible user if the behavioral fingerprint alteration indicates that the one or more devices of the network accessible user have been compromised with respect to authentication. For instance, computer server 30 disabling a mobile device or any device 60 when a behavioral fingerprint determined via library of behavioral fingerprints 170c and behavioral fingerprint module 106c is altered to an untrustworthy level. The mobile device can be configured to be automatically disabled without interference by first user 20 or the authorized user.

Operation 925, in an implementation, is for disabling one of the one or more devices, wherein the device is a mobile device of the network accessible user if the behavioral fingerprint indicates that a level of authentication for the mobile device should be lowered to a predetermined level.

Referring now to FIG. 9d, operation 802 continues in an implementation. As shown in FIG. 9d, operation 802 includes operation 926 for re-enabling the one or more devices as a function of a reconstructed behavioral fingerprint of the network accessible user at least partially via a reconstructed key formed via gathered data from at least one social network. For instance, assuming a network accessible user is identified, a device of devices 60 may need to be re-enabled if the behavioral fingerprint of the network accessible user was subject to an anomaly or otherwise vulnerable. For example, a mobile phone that is stolen, resulting in anomalous activities by a thief, would cause a behavioral fingerprint to lower a level of authentication related to all devices of the network accessible user. If the mobile phone is recovered, the network accessible user could contact members of a trusted group over one or more social networks so that a cryptographic key could be reconstructed. Reconstructing the cryptographic key could be directly tied to restoring a behavioral fingerprint to a trusted level, such as a level of authentication as it existed prior to the mobile phone being stolen.

Operation 926, in an embodiment, can include operations 927 and 928. Operation 927 includes an implementation for generating a security certificate associated with the network accessible user based on a cryptographic key. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES key, an AES key, or a private/public key pair. In doing so, the computer server 30 may store either a private or a public portion of the public/private key pair.

Operation 928 includes altering the cryptographic key to enable distribution of one or more altered forms of the cryptographic key to enable rebuilding of the cryptographic key via the gathered data from the at least one social network. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES key, an AES key, or a private/public key pair. The cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends of the network accessible user. Computer server 30 can determine one or more members of a trusted group via social network library 302, each of the one or more members being a member of a social network such as Facebook™ or the like, and each of the one or more members capable of storing a component of a cryptographic key created via cryptographic library 308 such that the component can be gathered as gathered data to reconstruct the cryptographic key via cryptographic library 308.

Operation 926 can further include, in one implementation, operations 929, 930, and 931. Operation 929 includes determining a private/public key pair including a private key and a public key. For instance, a network accessible user can generate a private/public key pair using an IMEI or other device-specific number, such as a serial number or the like.
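
As an illustrative sketch of operation 929 only, a key pair could be derived deterministically from a device-specific number. The example below assumes the third-party Python cryptography package and mixes in a secret salt, which is an added hardening assumption not recited in operation 929, since an IMEI or serial number alone has little entropy.

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    def key_pair_from_device_id(imei: str, secret_salt: bytes):
        # Derive a 32-byte seed from the device-specific number plus a
        # secret salt (the salt is an assumption, not part of operation 929).
        seed = hashlib.sha256(secret_salt + imei.encode("utf-8")).digest()
        private_key = Ed25519PrivateKey.from_private_bytes(seed)
        return private_key, private_key.public_key()

    private_key, public_key = key_pair_from_device_id(
        "356938035643809", b"user-held-secret")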

Operation 926 can include operation 930 for altering the private key to enable distribution of one or more components of the private key, each of the one or more components of the private key required for the reconstructed key. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES key, an AES key, or a private/public key pair. The cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends of the network accessible user on at least one social network, such as social networks stored via social network library 302, and the portions can later be gathered from the users/members/friends of the social network by requesting the one or more components from each of the members of the trusted group.

Operation 926 can include operation 931 for distributing, by a network accessible theft detection system, the one or more components of the private key to one or more members of a trusted group. For instance, cryptographic library 308 of computing device 10 generating a security certificate associated with the authorized user based on a cryptographic key such as a Triple DES key, an AES key, or a private/public key pair. The cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends of the network accessible user.

In one embodiment, operation 926 includes operation 932 for determining the gathered data from the at least one social network via retrieving, by the theft detection system, one or more components of the private key required for the reconstructed key from one or more members of a trusted group via the at least one social network. For instance, within computer server 30, a cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends on at least one social network such as social networks stored via social network library 302 and the portions can later be gathered from the users/members/friends of the social network.

Operation 932 can include operation 933 requesting, by the theft prevention system, each of the one or more members of the trusted group for the one or more components of the private key, each of the one or more members previously identified by the network accessible user. For instance, within computer server 30, a cryptographic key based on a public/private key pair could have the private key altered such that portions of the cryptographic key can be distributed to users/members/friends of the network accessible user on at least one social network such as social networks stored via social network library 302 and the portions can later be gathered from the users/members/friends of the social network by requesting from each of the members of the trusted group the one or more components.

Operation 926 can also include operation 934 for determining, by the theft prevention system, one or more members of a trusted group from which to gather the gathered data, the one or more members of the trusted group belonging to the at least one social network, each of the one or more members capable of storing a component to enable forming the reconstructed key. For instance, a network accessible user determining members of a trusted group of friends or persons belonging to Facebook™ or Twitter™ or the like, wherein each of the trusted members is network accessible such that, if necessary, a component of a private key can be stored and recovered when needed to reconstruct a key. For instance, computer server 30 determining one or more members of a trusted group via social network library 302, each of the one or more members being a member of a social network, and each of the one or more members capable of storing a component of a cryptographic key created via cryptographic library 308 such that the component can be gathered as gathered data to reconstruct the cryptographic key via cryptographic library 308.

A more detailed discussion related to computing device 10/computer server 30 of FIGS. 1-3 is now provided with respect to example alternative processes or operations that are described herein. Referring now to FIG. 10 in particular, FIG. 10 illustrates an operational flow 1000 representing example operations for, among other things, handling a proposed transaction in a context that includes at least one behavioral fingerprint or at least one trust verification schema. In FIG. 10 and in the following figures (e.g., FIGS. 11a-11d) that include various examples of operational flows, example descriptions and explanations are provided with respect to an exemplary environment 100 that is described herein above and that is illustrated in FIG. 1 and/or with respect to other example environments (e.g., as provided in FIG. 2a et seq.) or other example contexts. However, it should be understood that operational flows may be executed in a number of other environments or contexts, and/or in modified versions of FIG. 2a, 2b, 2c, or 2d and FIG. 3a, 3b, or 3c. Also, although various operational flows are presented in particular sequence(s) as illustrated in the drawings, it should be understood that the various operations may be performed in orders other than those that are illustrated or may be performed at least partially concurrently.

Further, in FIG. 10 and in figures to follow thereafter (e.g., FIGS. 11a-11d), various example operations may be depicted in a box-within-a-box manner. Such depictions may indicate that an operation in an internal box may comprise an optional example embodiment of an operation illustrated in one or more external boxes. However, it should be understood that internal box operations may be viewed as independent operations separate from any associated external boxes and may be performed in any sequence with respect to other illustrated operations, or may be performed fully or partially concurrently. Still further, operations that are illustrated in FIG. 10 as well as other operations that are described herein below may be performed by at least one of a machine, an article of manufacture, or a composition of matter, unless indicated otherwise.

For certain example embodiments, e.g. after a start operation, an operational flow 1000 of FIG. 10 may include a behavioral fingerprint receiving operation 1001 for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may receive (e.g., wirelessly or by wire, via at least one network 50, as part of an update, in response to a request, automatically, fully or partially, from a user or a behavioral fingerprint-related service, a combination thereof, etc.) one or more behavioral fingerprints associated with one or more network accessible users 20 (e.g., a user that is connected to or connects—e.g., occasionally, frequently, regularly, periodically, daily, a combination thereof, etc.—to an internet or an application or service that communicates using the internet). An example instance, as an illustration, may include a computer server 30 connecting via at least one network 50 to a computing device 10 (e.g., of FIG. 1); either can send or receive a behavioral fingerprint associated with one or more network accessible users, which may comprise a first user 20 of a computing device 10. A device or a network service or system may provide at least one behavioral fingerprint of a network accessible user.

For certain example embodiments, e.g. in addition to a behavioral fingerprint receiving operation 1001, an operational flow 1000 may include an operation 1002 for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users as further illustrated in FIG. 10. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may receive an authentication request (e.g., a request to authenticate an entity, or a request to provide a measure of assurance or confirmation that an entity is who the entity represents itself to be or otherwise has permission to conduct a transaction), with the authentication request associated with one or more proposed (e.g., requested, indicated, pending, non-final, unconsummated, non-executed, incomplete, suggested, a combination thereof, etc.) transactions (e.g., a purchase; a sale; an exchange of goods, services, money, or other consideration; a financial transaction; an internet-based transaction; a transaction including virtual goods or services; a physical retailer-based transaction; a transaction for a subscription; a transaction for access to a physical or virtual good or resource; a transaction for an entertainment object, such as a movie, TV show, or song; a combination thereof; etc.) of the one or more network accessible users 20. An example instance may include a server 30 via a network interface 112c receiving from a particular device of one or more devices 60, such as computing device 10, an authentication request for at least one proposed transaction.

For certain example embodiments, an operational flow 1000 of FIG. 10 may include an operation 1003 for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may transmit a decision (e.g., approval of a proposed transaction, disapproval of a proposed transaction, confirmation of authentication, denial of authentication, an indication of a user's identity, a combination thereof, etc.) associated with the authentication request, with the decision based at least partially on a trust verification schema 303 generated from a relational mapping (e.g., a determination, a graphing, a calculation, a storing, a combination thereof, etc. of technical, social, a combination thereof, etc. relational connections having one or more tiers or levels) of one or more behavioral fingerprints associated with one or more network accessible users 20. An example instance may include a computer server 30 transmitting via a network interface 112c a decision associated with an authentication request, with the decision based on a trust verification schema, such as a trust verification schema 303 as shown in FIG. 3a or 3c. Trust verification schema 303 (e.g., of FIG. 3c) illustrates a schema that may include authentication functions associated with a behavioral fingerprint (e.g., as shown in FIGS. 2a and 2e) that is associated with a network accessible user 20 such as a User A that is using a machine 320.
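
Purely as a non-limiting sketch, operations 1001 through 1003 might be strung together as below, assuming (as illustrative simplifications) that each behavioral fingerprint reduces to a trust score and that the trust verification schema supplies a relational boost from trusted relations; every name, weight, and threshold is hypothetical.

    from dataclasses import dataclass

    @dataclass
    class AuthenticationRequest:
        user_id: str
        transaction: str

    # Operation 1001: behavioral fingerprints received at the computer
    # device, reduced here to per-user trust scores in [0, 1].
    fingerprints = {"userA": 0.9, "userB": 0.4}

    # Relational mapping underlying the trust verification schema: users
    # vouched for by trusted relations receive a modest boost.
    relations = {"userB": ["userA"]}

    def decide(request: AuthenticationRequest, threshold: float = 0.6) -> str:
        # Operation 1003: a decision based at least partially on the
        # relational mapping of behavioral fingerprints.
        score = fingerprints.get(request.user_id, 0.0)
        for relative in relations.get(request.user_id, []):
            score += 0.25 * fingerprints.get(relative, 0.0)
        return "approve" if score >= threshold else "deny"

    # Operation 1002: an authentication request for a proposed transaction.
    print(decide(AuthenticationRequest("userB", "purchase")))   # approve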

As is described herein below, an example operational flow 1000 of FIG. 10 may be executed, performed, etc. in a variety of different ways in various alternative implementations. FIGS. 11a, 11b, 11c, and 11d, for example, illustrate at least some alternative ways that an operational flow 1000 of FIG. 10 may be executed, performed, etc. in various alternative implementations.

For certain example embodiments, in various implementations, an operation 1001 of FIG. 10 may include an operation 1102 for sensing at the computer device one or more actions of the one or more network accessible users as depicted in FIG. 11a. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may sense one or more actions (e.g., physical device interactions such as swipes, key presses, shakes, a combination thereof, etc.; virtual interactions such as web sites visited, people interacted with, social networks used, a combination thereof, etc.; movements such as physical locations visited; virtual or real-world transactions conducted; some combination thereof; etc.) of one or more network accessible users 20. An example instance may include a behavioral fingerprint module 106/106a/106c determining a behavioral fingerprint including sensed actions of a network accessible user by: establishing that a first user 20 is an authorized network accessible user and generating a behavioral fingerprint via fingerprint build/degradation module 314 or fingerprint generation module 316. Behavioral fingerprint generation may include, for example, statistical calculations based on prior actions to confirm a persistent internet presence of a network accessible user of a computing device 10 or additional/alternative devices 60.

For certain example embodiments, as further illustrated in FIG. 11a, a sensing operation 1102 may include an operation 1104 for detecting at the computer device one or more contacts frequently interacted with by at least one of the one or more network accessible users via one or more social networks to determine at least one interaction pattern associated with the at least one of the one or more network accessible users. For certain example implementations, a computer device may detect one or more contacts (e.g., people, companies, entities, family members, social network members, groups of people, some combination thereof, etc.) frequently (e.g., sufficiently repeated so as to serve as a meaningful check on or confirmation of possession or use of a device, such as hourly, daily, weekly, etc.) interacted with by at least one of the one or more network accessible users via one or more social networks to determine at least one interaction pattern (e.g., times of day; days of week; frequency; social network used; social network communication pathway utilized—such as posting, emailing, or tweeting; diction or grammar employed; other contacts similarly or simultaneously or as a group interacted with; a combination thereof; etc.) associated with the at least one of the one or more network accessible users. An example instance may include transmitting using network interface 112/112c from a computing device 10 or computer server 30 (e.g., of FIG. 1) a detected contact frequently interacted with by a first user 20 as a network accessible user and determining an interaction pattern associated with first user 20 from one or more of such transmissions.
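
By way of illustration only, the interaction pattern of operation 1104 — how often, and at which hours, a user interacts with each contact — might be summarized as below; the event fields are assumptions.

    from collections import defaultdict

    def interaction_pattern(events: list[tuple[str, str, int]]) -> dict:
        # events: (contact, social_network, hour_of_day) tuples sensed
        # for one network accessible user.
        pattern: dict = defaultdict(lambda: {"count": 0, "hours": defaultdict(int)})
        for contact, _network, hour in events:
            pattern[contact]["count"] += 1
            pattern[contact]["hours"][hour] += 1
        return {c: {"count": v["count"], "hours": dict(v["hours"])}
                for c, v in pattern.items()}

    events = [("alice", "facebook", 9), ("alice", "facebook", 9),
              ("bob", "twitter", 21)]
    print(interaction_pattern(events))
    # -> {'alice': {'count': 2, 'hours': {9: 2}},
    #     'bob': {'count': 1, 'hours': {21: 1}}}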

For certain example embodiments, an operation 1102 may include an operation 1105 for detecting one or more locations visited by at least one of the one or more network accessible users, the one or more locations including one or more of physical locations or internet address-based locations. For certain example implementations, a computer device may detect one or more locations visited by at least one of the one or more network accessible users, the one or more locations including one or more of physical locations (e.g., an address, GPS coordinates or similar, a store or commercial establishment name, an individual's house, a neighborhood, a city, a combination thereof, etc.) or internet address-based locations (e.g., a URL, a web service, a cloud entity, a social network, a virtual world, a location within a virtual world, a combination thereof, etc.). An example instance may include data from various types of sensors 120 used in order to determine one or more locations visited by a first user 20, which locations may be stored on a computer server 30 or a computing device 10. Stored locations may be physical locations, internet addresses, or both.

For certain example embodiments, an operation 1001 may include an operation 1103 for applying at the computer device one or more reliability criteria to the sensed one or more actions of the one or more network accessible users to update the one or more behavioral fingerprints associated with the one or more network accessible users as further depicted in FIG. 11a. For certain example implementations, a computer device may apply one or more reliability criteria (e.g., likelihood values or functions, statistical analysis tools, statistical values, probability values, probabilistic mechanisms, confidence levels, action comparison techniques, action generalization strategies, a combination thereof, etc.) to one or more sensed actions (e.g., physical device interactions such as swipes, key presses, shakes, a combination thereof, etc.; virtual interactions such as web sites visited, people interacted with, social networks used, a combination thereof, etc.; movements such as physical locations visited; virtual or real-world transactions conducted; some combination thereof; etc.) of one or more network accessible users to update one or more behavioral fingerprints associated with the one or more network accessible users. An example instance may include applying reliability criteria at computer server 30 to sensed actions of a user garnered via one or more sensors 120 to update at least one behavioral fingerprint.

For certain example embodiments, an operation 1103 may include an operation 1106 for altering at least one of the one or more behavioral fingerprints associated with the one or more network accessible users as a function of the sensed one or more actions of the one or more network accessible users and at least one internet available entity. For certain example implementations, a computer device may alter (e.g., change, update, add an action to, adjust a likelihood value of, issue or process an alert for, include a new contact in, increase a value representing a number of times an action has occurred for, a combination thereof, etc.) at least one behavioral fingerprint associated with one or more network accessible users as a function of one or more sensed actions of the one or more network accessible users and at least one internet available entity (e.g., another, different network accessible user; a cloud system; an email service provider; an interactive web site; a social network; an instant message participant; a texting participant; a social network member; a combination thereof; etc.). An example instance may include using one or more sensors 120 to sense actions of one or more network-accessible users and altering each behavioral fingerprint for each behavioral fingerprint module 106/106a/106c of each user. Altering may include determining a new behavioral fingerprint using sensed actions of one or more network-accessible users. Computer server 30 or computing device 10 may alter a behavioral fingerprint using a level of authentication module 102/102a/102c or a behavioral fingerprint module 106/106a/106c as a function of one or more sensed actions of a first user 20 and at least one internet available entity, which internet available entity may be specifically designated by a user.
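
As one non-limiting reading of operation 1106, altering a behavioral fingerprint "as a function of" sensed actions could take the form of an exponential moving average over per-action consistency scores, so that recent behavior dominates; the weighting is an assumption.

    def update_trust(current_trust: float, action_scores: list[float],
                     weight: float = 0.1) -> float:
        # Fold each sensed action's consistency score (0.0 anomalous ..
        # 1.0 fully expected) into the fingerprint's trust level via an
        # exponential moving average.
        trust = current_trust
        for score in action_scores:
            trust = (1 - weight) * trust + weight * score
        return trust

    print(update_trust(0.8, [1.0, 1.0, 0.0]))   # ~0.754: anomaly lowers trust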

Referring now to FIGS. 11b and 11c, for some example implementations, an operation 1002 may include an operation 1107, an operation 1113, an operation 1114, or an operation 1117. For some example implementations, an operation 1114 may follow an operation 1113, and an operation 1117 may follow an operation 1114.

For certain example embodiments, an operation 1002 may include an operation 1107 for relationally mapping the one or more behavioral fingerprints based at least partially on one or more relations between or among the one or more network accessible users as indicated by at least one social network. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may relationally (e.g., with respect to social relations such as family, friends, professional contacts, a combination thereof, etc.; with respect to technical relations such as machines, web/cloud/internet services, resources, a combination thereof, etc.; other forms or kinds of relations; some combination thereof; and so forth) map (e.g., discern, determine, record, a combination thereof, etc. a number of connections, types of connections, endpoints for connections, strengths of connections, a combination thereof, etc.) the one or more behavioral fingerprints based at least partially on one or more relations (e.g., family relations, friendship relations, professional relations, machine relations, web/cloud/internet service relations, resource relations, a combination thereof, etc.) between or among the one or more network accessible users as indicated by at least one social network (e.g., including, inter alia, at least one relational connection determinable via at least one social network). An example instance may include, with reference to FIG. 3c, a mapping of a trust verification schema 303 showing one or more relationships between users A, B, or C using behavioral fingerprints that employ various types of sensors 120 to determine behavioral fingerprint indications to be stored by a computer server 30 or a computing device 10.

For some example implementations, an operation 1107 may include any one or more of operation 1108, 1109, 1110, 1111, or 1112. For certain example embodiments, an operation 1107 may include an operation 1108 for receiving data at the computer device from the at least one social network, the received data indicating one or more relations between or among the one or more network accessible users. For certain example implementations, a computer device may receive data from at least one social network (e.g., by intercepting social network communications originating from or destined for a network accessible user, in response to a query to a social network, in response to a request in accordance with a specialized protocol or API offered by a social network for relationship data, by monitoring public social network feeds, by scraping or harvesting from a social network website, a combination thereof, etc.), with the received data (e.g., relationship data including explicit relational connections, relational connections derived from relationship data, relationship data that can be used to extract relational connections by processing it, a combination thereof, etc.) indicating one or more relations between or among the one or more network accessible users. An example instance may include, with reference to FIG. 3c or FIG. 1, data from Twitter™, LinkedIn™, Facebook™, Match.com™, a combination thereof, etc. being sent to other servers or services to identify relations between users A, B, or C. As shown in FIG. 3c, users A, B, or C may connect to the same or similar/related servers, and based at least partially on their behavioral fingerprints, further relations can be established or ascertained.

For certain example embodiments, an operation 1108 may be followed by an operation 1109 for mapping at the computer device one or more relationships that are extant between or among the one or more network accessible users based at least partially on the indicated one or more relations between or among the one or more network accessible users. For certain example implementations, a computer device may map (e.g., determine, discern, detect, identify, record, align, place in a data structure, graph, a combination thereof, etc. connections, linkages, associations, tiers, levels, common aspects, a combination thereof, etc. representing) one or more relationships (e.g., family, friendship, professional, machine, service, resource, a combination thereof, etc. relationships) that are extant between or among the one or more network accessible users based at least partially on the indicated one or more relations between or among the one or more network accessible users. An example instance may include mapping a trust verification schema 303 as shown in FIG. 3c based on identified relations between users A, B, or C.
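
A minimal sketch of operations 1108 and 1109 follows, folding relation records received from a social network into a bidirectional adjacency structure; the record format is an assumption.

    from collections import defaultdict

    def map_relationships(received: list[tuple[str, str, str]]) -> dict:
        # Each record is (user, related_user, relation_type), e.g., as
        # received from a social network (operation 1108).
        graph: dict = defaultdict(dict)
        for user, other, kind in received:
            graph[user][other] = kind      # operation 1109: record the extant
            graph[other][user] = kind      # relationship in both directions
        return dict(graph)

    received = [("userA", "userB", "family"),
                ("userB", "userC", "professional")]
    print(map_relationships(received))
    # -> {'userA': {'userB': 'family'},
    #     'userB': {'userA': 'family', 'userC': 'professional'},
    #     'userC': {'userB': 'professional'}}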

For certain example embodiments, an operation 1107 may include an operation 1110 for determining at the computer device via the at least one social network that at least one respective network accessible user of the one or more network accessible users has at least one corresponding behavioral fingerprint of the one or more behavioral fingerprints. For certain example implementations, a computer device may determine via at least one social network that at least one respective network accessible user of one or more network accessible users has (e.g., is associated with, is registered to, is capable of configuring, is capable of restricting, is capable of authorizing access to, is assigned to be characterized by, a combination thereof, etc.) at least one corresponding behavioral fingerprint of the one or more behavioral fingerprints. An example instance may include a computer server 30 receiving data via network interface 112c from one or more of Twitter™, Facebook™, LinkedIn™, or the like to confirm available behavioral fingerprint data via social network library 302.

For certain example embodiments, an operation 1110 may be followed by an operation 1111 for determining at the computer device if the at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user of the one or more network accessible users. For certain example implementations, a computer device may determine if at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user of the one or more network accessible users (e.g., determine if the at least one corresponding behavioral fingerprint is kept current, still made accessible, updated periodically, a combination thereof, etc.). An example instance may include a computer server 30 receiving data via a network interface 112c from one or more of Twitter™, Facebook™, LinkedIn™, or the like to confirm current behavioral fingerprint data via social network library 302. Each behavioral fingerprint of each network accessible user, such as a first user 20, may be checked to confirm a minimum level of currency or recency of updating/accessing.

For certain example embodiments, an operation 1111 may be followed by an operation 1112 for relationally mapping by the computer device at least a subset of the one or more network accessible users for which the at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user. For certain example implementations, a computer device may relationally map (e.g., identify, store, discover, ascertain, a combination thereof, etc. one or more social or technical connections for) at least a subset (e.g., at least a portion, at least a group, at least a sub-group, a combination thereof, etc.) of the one or more network accessible users for which the at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user. An example instance may include mapping connections between or among users A, B, or C as shown in FIG. 3c as arrows linking network accessible users via their machines, resources, services, combinations thereof, etc. or family/inter-personal relationships thereof.

For certain example embodiments, an operation 1002 may include an operation 1113 for identifying by the computer device one or more relations between or among the one or more network accessible users. For certain example implementations, a computer device may identify one or more relations (e.g., family relations, friendship relations, professional relations, machine relations, web/cloud/internet service relations, resource relations, a combination thereof, etc.) between or among the one or more network accessible users (e.g., relations linking two or more users that are at least occasionally coupled to a network, such as the internet, a social network, a combination thereof, etc.). An example instance may include using a trust verification schema 303 (e.g., as shown in FIG. 3c) to identify one or more relations (e.g., family relations such as parent-child, sibling, uncle-niece, grandchild-grandparent, cousins, first cousin once removed, great aunt-niece, a combination thereof, etc.) between or among users A, B, or C or producing a trust verification schema 303 from one or more indicated relations between or among users A, B, or C.

For some example implementations, an operation 1113 may include an operation 1118 or an operation 1119. For certain example embodiments, an operation 1118 may comprise identifying the one or more relations based at least partially on one or more social network data. For certain example implementations, a computer device may identify one or more relations (e.g., family relations, friendship relations, professional relations, machine relations, web/cloud/internet service relations, resource relations, a combination thereof, etc.) based at least partially on one or more social network data (e.g., explicit relational data offered or provided by a social network, explicit relational data obtainable from a social network via a specialized protocol or API, relational data that is inferred from social network communications, relational data that is ascertainable from social network settings or profile information, a combination thereof, etc.). An example instance may include identifying relations between or among users A, B, or C as shown in FIG. 3c by identifying which users are mapped to common servers, devices, or the like and confirming the identified relations using data from Twitter™, Facebook™, LinkedIn™, or the like.

For certain example embodiments, an operation 1113 may include an operation 1119 for identifying the one or more relations via identifying one or more common network accessible users as linked via one or more social networks. For certain example implementations, a computer device may identify one or more relations (e.g., social or technical relations) via identifying one or more common network accessible users as linked (e.g., connected through one or more tiers or levels) via one or more social networks (e.g., identifying network accessible users that are linked to a same one or more other network accessible users or social network members). An example instance may include identifying relations between users A, B, or C as shown in FIG. 3c by identifying users that are connected to the same servers or the same social network members or the like. Such relations may be confirmed using data from Twitter™, Facebook™, LinkedIn™, or the like.
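
One illustrative way to code the common-link test of operation 1119 — two users are related if the sets of members, servers, or services they connect to intersect — is sketched below; the input shape is an assumption.

    from itertools import combinations

    def relations_via_common_links(
            connections: dict[str, set[str]]) -> list[tuple[str, str, list[str]]]:
        # connections maps each network accessible user to the members,
        # servers, or services that user is linked to.
        related = []
        for (u1, s1), (u2, s2) in combinations(connections.items(), 2):
            common = s1 & s2
            if common:                     # a shared link implies a relation
                related.append((u1, u2, sorted(common)))
        return related

    links = {"userA": {"server1", "member:alice"},
             "userB": {"server1", "member:bob"},
             "userC": {"member:carol"}}
    print(relations_via_common_links(links))
    # -> [('userA', 'userB', ['server1'])]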

For certain example embodiments, as shown in FIG. 11c, an operation 1002 may include an operation 1114 for correlating by the computer device the one or more behavioral fingerprints associated with the one or more network accessible users based at least partially on the identified one or more relations. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may correlate (e.g., compare, contrast, ascertain similarities of, ascertain differences of, ascertain overlapping aspects of, determine connections for, a combination thereof, etc.) one or more behavioral fingerprints associated with the one or more network accessible users based at least partially on the identified one or more relations (e.g., by focusing at least partly on overlapping aspects, common social or technical aspects of linkages, common connections, interconnections between or among users, common social network member connections, a combination thereof, etc. of behavioral fingerprints for one or more network accessible users). An example instance may include identifying relations between users A, B, or C as shown in FIG. 3c by identifying users that are mapped to common servers, devices, people, entities, a combination thereof, etc. using behavioral fingerprints. Relations may be confirmed using data from Twitter™, Facebook™, LinkedIn™, or the like. Another example instance may include, for each arrow 390 illustrated in a trust verification schema 303 (e.g., as shown in FIG. 3c), correlating (e.g., linking, associating, comparing, sharing, a combination thereof, etc.) a level of authentication for each of users A, B, or C.
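As one hypothetical correlation measure (among many an implementation might choose), the overlap of set-valued behavioral fingerprints could be scored with a Jaccard index, as sketched below; the fingerprint contents shown are invented for the example.

    # Hypothetical sketch: correlate two set-valued behavioral fingerprints
    # by the fraction of shared aspects (Jaccard similarity).
    def correlate_fingerprints(fp_x, fp_y):
        union = fp_x | fp_y
        return len(fp_x & fp_y) / len(union) if union else 0.0

    fp_a = {"server1", "phone1", "store_x"}
    fp_b = {"server1", "tablet2", "store_x"}
    print(correlate_fingerprints(fp_a, fp_b))  # 0.5 -> two of four aspects shared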

For some example implementations, an operation 1114 may include an operation 1120 or an operation 1121. For certain example embodiments, an operation 1120 may comprise identifying the one or more behavioral fingerprints of the one or more network accessible users. For certain example implementations, a computer device may identify one or more behavioral fingerprints of the one or more network accessible users via at least one social network (e.g., by contacting a server of a social network, by contacting an app of a social network, by contacting a network accessible user via a social network communication capability, a combination thereof, etc.). An example instance may include, after identifying relations between users A, B, or C as shown in FIG. 3c by identifying users that are mapped to common servers, devices, social network members, or the like and confirming relations using data from a social network, using the identified relations to obtain identities of additional related network accessible users, e.g., via behavioral fingerprints of the additional related network accessible users. Another example instance may include using multiple behavioral fingerprints associated with users A, B, or C to establish linkages between them based on a sharing of or a willingness to share one or more servers, devices, data, accounts (e.g., financial accounts, cloud service accounts, social network accounts, debit/gift/store card accounts, retailer accounts, a combination thereof, etc.), a combination thereof, and so forth.

For certain example embodiments, an operation 1114 may include an operation 1121 for comparing the identified one or more behavioral fingerprints based at least partially on one or more relationships existing between or among the one or more network accessible users. For certain example implementations, a computer device may compare one or more identified behavioral fingerprints (e.g., known behavioral fingerprints, newly-discovered behavioral fingerprints from an analysis of known behavioral fingerprints or at least one trust verification schema, a combination thereof, etc.) based at least partially on one or more relationships existing between or among the one or more network accessible users (e.g., relationships determinable from known behavioral fingerprints, from connections of at least one trust verification schema, a combination thereof, etc.). An example instance may include using a trust verification schema 303 (e.g., as shown in FIG. 3c) to compare behavioral fingerprints of connected users A, B, or C. Another example instance may include calculating at least one correlation between or among users A, B, or C based on data from servers shared between or among the users. Parameters usable in a correlation analysis may include a time, a date, shared data, shared account access, a combination thereof, etc. with respect to one or more servers or devices shared by one or more of users A, B, or C.

For certain example embodiments, an operation 1002 may include an operation 1117 for generating the trust verification schema at least partially by mapping the correlated one or more behavioral fingerprints with the identified one or more relations. For certain example implementations, a computer device may generate a trust verification schema 303 (e.g., a data structure, a file, a matrix, a graph, a combination thereof, etc. that includes, represents, indicates, a combination thereof, etc. one or more connections or tiered levels of linkages between or among one or more network accessible users) at least partially by mapping (e.g., comparing, finding similarities, locating overlapping aspects, creating a graph, obtaining relations, identifying connections, marking/recording linkages, a combination thereof, etc. with respect to) one or more correlated behavioral fingerprints with one or more identified relations (e.g., family relations, friendship relations, professional relations, machine relations, web/cloud/internet service relations, resource relations, a combination thereof, etc.). An example instance may include generating a trust verification schema (e.g., a trust verification schema 303 as illustrated in FIG. 3c) using correlated levels of authentication (e.g., levels of authentication that are shared, compared, linked causally, associated as tending to exist or change together, associated as tending to impact one another, a combination thereof, etc.) of users A, B, or C or determining how close relations between or among users A, B, or C are to each other.
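One hypothetical sketch of such generation, assuming set-valued fingerprints and a Jaccard-style similarity as above, maps each identified relation to a weighted edge; the structure of the schema as an edge-to-weight dictionary is an assumption for illustration.

    # Hypothetical sketch: generate a trust verification schema by mapping
    # correlated fingerprints onto identified relations as weighted edges.
    def generate_schema(relations, fingerprints):
        def similarity(x, y):  # Jaccard overlap of set-valued fingerprints
            return len(x & y) / len(x | y) if (x | y) else 0.0
        return {
            (u, v): similarity(fingerprints[u], fingerprints[v])
            for (u, v) in relations
        }

    fingerprints = {"A": {"s1", "d1"}, "B": {"s1", "d2"}, "C": {"d2"}}
    schema = generate_schema({("A", "B"), ("B", "C")}, fingerprints)
    # {('A', 'B'): 0.33..., ('B', 'C'): 0.5}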

For certain example embodiments, an operation 1117 may include an operation 1125 for generating the trust verification schema using the correlated one or more behavioral fingerprints, wherein the correlated one or more behavioral fingerprints result in a particular level of authentication for one or more groups of related network accessible users of the one or more network accessible users as further depicted in FIG. 11c. For certain example implementations, a computer device may generate a trust verification schema using one or more correlated behavioral fingerprints (e.g., behavioral fingerprints that have been compared to one another, that have had shared aspects identified, that have had overlapping aspects identified, that have had similar or dissimilar aspects identified, that have had shared usage cataloged in terms of time or space, that have been determined to have connected users in common, that have been determined to have shared devices/machines/data, a combination thereof, etc.), wherein the one or more correlated behavioral fingerprints result in a particular level of authentication (e.g., a low level of authentication, an average level of authentication, a high level of authentication, a shared level of authentication, a combination thereof, etc.) for one or more groups (e.g., an identifiable or determinable listing of multiple network accessible users) of related network accessible users (e.g., related via social or technical relationships) of the one or more network accessible users. An example instance may include, with reference to a trust verification schema 303 as shown in FIG. 3c, generating a trust verification schema using correlated levels of authentication by determining/plotting shared servers or data between or among users A, B, or C and determining an average level of authentication applicable to each user of the group of users A, B, and C.
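For example, an average level of authentication for a related group might be computed as sketched below; the numeric levels and the choice of a simple mean are assumptions made for illustration.

    # Hypothetical sketch: derive a particular (here, average) level of
    # authentication for a group of related network accessible users.
    def group_level_of_authentication(levels, group):
        return sum(levels[user] for user in group) / len(group)

    levels = {"A": 0.9, "B": 0.7, "C": 0.8}
    print(group_level_of_authentication(levels, ["A", "B", "C"]))  # ~0.8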

For some example implementations, an operation 1125 may include an operation 1127 or an operation 1128, which may follow an operation 1127. For certain example embodiments, an operation 1127 may comprise determining at least one proximity of relation for the one or more network accessible users based at least partially on one or more social network linkages that are confirmed by at least a portion of the one or more network accessible users. For certain example implementations, a computer device may determine at least one proximity of relation for (e.g., a number of levels or tiers between or among) one or more network accessible users based at least partially on one or more social network linkages (e.g., arrows 390, family connections, friendship connections, professional connections, server connections, a combination thereof, etc.) that are confirmed (e.g., via a response to an explicit inquiry, by providing or authorizing the providing of a behavioral fingerprint, a combination thereof, etc.) by at least a portion of the one or more network accessible users. An example instance may include receiving from one or more network accessible users their behavioral fingerprint(s) and determining a proximity of relation via analyzing/plotting/graphing arrows 390 or other connections of a generated trust verification schema 303 (e.g., as illustrated in FIG. 3c). Arrows 390 that connect different machines, servers, devices, data sources, entities, or other aspects may also include or represent social network links that are confirmed using behavioral fingerprints of users A, B, or C.
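A proximity of relation expressed as a number of tiers could, under the graph representation assumed above, be computed with a breadth-first search, as in this illustrative sketch.

    # Hypothetical sketch: proximity of relation as the number of confirmed
    # linkage tiers (hops) between two users in a trust verification schema.
    from collections import deque

    def proximity_of_relation(graph, source, target):
        seen, queue = {source}, deque([(source, 0)])
        while queue:
            user, tiers = queue.popleft()
            if user == target:
                return tiers
            for neighbor in graph.get(user, ()):
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append((neighbor, tiers + 1))
        return None  # no confirmed linkage path exists

    graph = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
    print(proximity_of_relation(graph, "A", "C"))  # 2 -> second-tier relation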

For certain example embodiments, an operation 1128 may comprise determining at least one level of relation between or among the one or more network accessible users based at least partially on the determined at least one proximity of relation for the one or more network accessible users. For certain example implementations, a computer device may determine at least one level of relation (e.g., a family relation, an immediate family relation, an extended family relation, an in-law relation, a parent-child relation, a sibling relation, a living-in-the-same household relation, a roommate relation, a distant relative relation, a close friends relation, an acquaintances relation, a co-workers relation, a relation corresponding to being directly connected for a first tier, a relation corresponding to being connected via one intermediate linking server/device or person for a second tier, a relation corresponding to being connected via two intermediate linking servers/devices or persons for a third tier, a combination thereof, and so forth) between or among the one or more network accessible users based at least partially on the determined at least one proximity of relation for the one or more network accessible users. An example instance may include determining a level of relation or how close two or more network accessible users (such as users A, B, or C) are as illustrated in FIG. 3c, such as how many servers are shared between or among the users or how many people in common each user knows or is connected to. A proximity of relation may be determined based at least partly on how or when each of users A, B, or C is connected over different servers as shown in example trust verification schema 303 or with whom, how, or when each of users A, B, or C interacts with respect to one or more common entities. Another example instance may include determining a level of relation from explicit familial relationship indicators obtained (e.g., received, retrieved, observed, monitored, a combination thereof, etc.) via at least one social network.
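A tier count of the kind computed above might then be translated into a coarse level of relation; the particular labels below are illustrative assumptions only.

    # Hypothetical sketch: map a proximity (tier count) to a level of relation.
    def level_of_relation(tiers):
        if tiers is None:
            return "unrelated"
        labels = {1: "first tier (directly connected)",
                  2: "second tier (one intermediate link)",
                  3: "third tier (two intermediate links)"}
        return labels.get(tiers, "distant relation")

    print(level_of_relation(2))  # 'second tier (one intermediate link)'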

For some example implementations, an operation 1128 may include an operation 1129 or an operation 1130. For certain example embodiments, an operation 1129 may comprise altering the at least one level of relation between or among the one or more network accessible users based at least partially on one or more changes to relations indicated by at least one of the one or more network accessible users, the one or more changes indicated via at least one social network. For certain example implementations, a computer device may alter at least one level of relation (e.g., increase or decrease a recorded or mapped number of tier level or levels, change a relationship from being married to being unmarried or vice versa, change a relationship from being roommates to not being roommates or vice versa, change a relationship from being co-workers to being acquaintances or vice versa, change a relationship from being involved romantically to not being involved romantically or vice versa, any combination thereof, etc.) between or among one or more network accessible users based at least partially on one or more changes to relations (e.g., becoming closer friends, being married, getting divorced, ceasing being friends, starting to share machines or data, ceasing being co-workers, starting to date, having a greater or lesser number of contacts or social network members in common, communicating or otherwise interacting more or less frequently, increasing or decreasing a number of different social networks used to interact with a given individual or other entity, changing a number of common servers used, changing whether a connection exists on a given social network platform (e.g., friending or unfriending, following or un-following, a combination thereof, etc.), a combination thereof, etc.) indicated by at least one of the one or more network accessible users, the one or more changes indicated via at least one social network (e.g., by considering social network servers used, by querying a social network, by scraping a website of a social network, by utilizing a specialized protocol or API of a social network, by monitoring social network feeds, by monitoring public or private social network communications, by monitoring social network status updates, by monitoring social network connection updates, a combination thereof, etc.). An example instance may include updating a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c) by changing relations shown by arrows 390 with respect to social networks or servers represented by 340, 342, 346, or 348; additionally or alternatively, arrows from or to network accessible users A, B, or C may be added or removed.
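For illustration, social-network-indicated change events might adjust a stored level of relation as sketched below; the event names and adjustment magnitudes are invented for the example and are not drawn from the disclosure.

    # Hypothetical sketch: alter a level of relation in response to a change
    # indicated via at least one social network.
    def alter_level_of_relation(schema, user_x, user_y, event):
        key = tuple(sorted((user_x, user_y)))
        adjustments = {"friended": +1, "married": +2,
                       "unfriended": -1, "divorced": -2}
        schema[key] = schema.get(key, 0) + adjustments.get(event, 0)
        return schema

    schema = {("A", "B"): 2}
    alter_level_of_relation(schema, "A", "B", "unfriended")  # {('A', 'B'): 1}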

For certain example embodiments, an operation 1130 may comprise mapping one or more locations of the one or more network accessible users to confirm the at least one level of relation between or among the one or more network accessible users. For certain example implementations, a computer device may map one or more locations (e.g., determine, ascertain, a combination thereof, etc. at least one physical or virtual location; link coordinates to an address or place name; determine a distance between two or more locations; some combination thereof; etc.) of one or more network accessible users to confirm at least one level of relation (e.g., verify that at least two network accessible users have visited locations in common or are in proximity to each other) between or among the one or more network accessible users. An example instance may include, with reference to FIG. 3c illustrating a schema map showing activities of three network accessible users, A, B, and C, mapping locations of users A, B, or C if the devices used by each of them include sensors 120. In some example implementations, a location may comprise a physical location, and in other example implementations, a location may comprise an internet location or site.
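Where physical locations are involved, confirmation might reduce to a great-circle distance test, as in this sketch; the coordinates and the 1 km radius are assumptions for illustration.

    # Hypothetical sketch: confirm a level of relation by checking whether two
    # users' reported locations fall within a radius of one another.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        dlat = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = (math.sin(dlat / 2) ** 2
             + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
             * math.sin(dlon / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(a))  # Earth radius ~6371 km

    def co_located(loc_x, loc_y, radius_km=1.0):
        return haversine_km(*loc_x, *loc_y) <= radius_km

    print(co_located((47.610, -122.33), (47.615, -122.33)))  # True (~0.56 km apart)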

With reference to FIG. 11d, for certain example embodiments, an operation 1003 may include an operation 1131 for using the trust verification schema to automatically authenticate a proposed transaction associated with at least one of the one or more network accessible users based at least partly on a level of authentication associated with at least two of the one or more network accessible users. For certain example implementations, a computer device (e.g., a computing device 10, a computer server 30, another device 60, a combination thereof, etc.) may use a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c) to automatically authenticate (e.g., to authenticate without requiring any additional contemporaneous authentication credentials, such as a password or a biometric input, after the authentication request is received; to authenticate without sending a notification to a user; to authenticate without asking a user for approval; a combination thereof; etc.) a proposed (e.g., requested, indicated, pending, non-final, unconsummated, non-executed, incomplete, suggested, a combination thereof, etc.) transaction (e.g., a purchase; a sale; an exchange of goods, services, money, or other consideration; a financial transaction; an internet-based transaction; a transaction including virtual goods or services; a physical retailer-based transaction; a transaction for a subscription; a transaction for access to a physical or virtual good or resource; a transaction for an entertainment object, such as a movie, TV show, or song; a combination thereof; etc.) associated with at least one of the one or more network accessible users based at least partly on a level of authentication associated with at least two (e.g., a level of authentication shared by, jointly assigned to, bilaterally adopted, at least partially simultaneously belonging to, a combination thereof, etc. two or more) of the one or more network accessible users. An example instance may include trust verification schema 303 being used to authenticate a transaction by network accessible User A based at least partly on a level of authentication of User B or User C or a combined level of authentication of User A and at least one other user. Additionally or alternatively to using a trust verification schema, a proposed transaction may be automatically authenticated using, at least in part, a value (e.g., a dollar amount of a good or service being purchased, a time involved to complete an exchange, a loss incurred for incorrectly approving, a risk of authentication, a risk of non-authentication, a computer versus a cup of coffee, an entertainment item versus a safety-related item, a combination thereof, etc.) of the proposed transaction.
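One hypothetical decision rule combining a group level of authentication with a value of the proposed transaction is sketched below; the threshold scaling is an invented illustration of the costlier-transaction-requires-more-trust idea, not a formula from the disclosure.

    # Hypothetical sketch: automatically authenticate when a combined level of
    # authentication clears a threshold scaled by the transaction's value.
    def auto_authenticate(levels, users, value, base_threshold=0.6):
        combined = sum(levels[u] for u in users) / len(users)
        threshold = base_threshold + min(value / 10000.0, 0.3)  # costlier => stricter
        return combined >= threshold

    levels = {"A": 0.9, "B": 0.7}
    print(auto_authenticate(levels, ["A", "B"], value=50.0))    # True
    print(auto_authenticate(levels, ["A", "B"], value=9000.0))  # False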

For certain example embodiments, an operation 1003 may include an operation 1132 for authenticating a shared computer processing request via verification of at least one of the one or more network accessible users based at least partly on the trust verification schema. For certain example implementations, a computer device may authenticate a shared computer (e.g., a computing device 10, another device 60, a combination thereof, etc. that is being used by, that is configured to be usable by, that is associated with—such as via logon accounts or via service provider accounts—multiple users, a combination thereof, etc.) processing request (e.g., a request to conduct a transaction, a request to execute a particular application, a request to perform some function, a request to access some data or device capability, a request to make a purchase, a request to install an application, a combination thereof, etc.) via verification (e.g., confirmation of identity, consideration of current behavioral fingerprint information, acceptance of valid authentication credentials, a combination thereof, etc.) of at least one of the one or more network accessible users based at least partly on a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c indicating which user(s) may be able to vouch for, authenticate, authorize, share authentication level(s) with, a combination thereof, etc. which other user(s)). An example instance may include a trust verification schema 303 of FIG. 3c being used to authenticate a transaction that includes a shared computer processing request via verification of network accessible User A based at least partially on a level of authentication of User A, User B, or User C that is determined in combination across at least two of the users.

For certain example embodiments, an operation 1003 for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users may include an operation 1133 for authenticating an internet purchase transaction via verification of at least one of the one or more network accessible users based at least partly on the trust verification schema. For certain example implementations, a computer device may authenticate an internet purchase transaction (e.g., a transaction at least partially initiated, conducted, completed, effectuated, a combination thereof, etc. using the internet) via verification (e.g., confirmation of identity, consideration of current behavioral fingerprint information, acceptance of valid authentication credentials, a combination thereof, etc.) of at least one of the one or more network accessible users based at least partly on a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c indicating which user(s) may be able to vouch for, authenticate, authorize, share authentication level(s) with, a combination thereof, etc. which other user(s)). An example instance may include a trust verification schema 303 of FIG. 3c being used to authenticate a transaction with an internet retailer by network accessible User A, with trust verification schema 303 being accessed from or using a transaction server. User A or a related network accessible user may be verified.

For certain example embodiments, an operation 1003 may include an operation 1134 for authenticating a purchase by a first network accessible user of the one or more network accessible users based at least partially on a location of a second network accessible user of the one or more network accessible users and on the trust verification schema. For certain example implementations, a computer device may authenticate a purchase (e.g., verify that a person representing himself or herself as having a particular identity does indeed have that identity, approve a purchase, indicate that a person has permission to make a purchase, indicate that it is plausible that an identified person is indeed requesting a particular purchase, a combination thereof, etc.) by a first network accessible user of one or more network accessible users based at least partially (i) on a location (e.g., an address, a set of GPS coordinates, an establishment name, a person's home, a neighborhood, a geographic range or area based on an antenna's position, a combination thereof, etc.) of a second network accessible user of the one or more network accessible users and (ii) on a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c). An example instance may include a computing device 10 or a computer server 30 approving a purchase by a User A if a location of User B indicates that User B is proximate to (e.g., within a same store or walking distance thereto) where User A is attempting to make a purchase because a trust verification schema 303 indicates that User A and User B are sufficiently related such that if User B is in a particular location then it is likely that User A is in fact in the same location and may be trying to make a purchase.
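A minimal sketch of such a location-assisted approval, assuming locations are reported as simple place identifiers and the schema stores integer relation levels, follows; all names and values are illustrative.

    # Hypothetical sketch: approve a purchase by `buyer` when a sufficiently
    # related user is reported at the location of the attempted purchase.
    def authenticate_purchase(schema, buyer, other, buyer_store, other_location):
        related = schema.get(tuple(sorted((buyer, other))), 0) >= 1
        return related and buyer_store == other_location

    schema = {("A", "B"): 2}  # A and B are closely related
    print(authenticate_purchase(schema, "A", "B", "store_42", "store_42"))  # True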

For certain example embodiments, an operation 1003 may include an operation 1135 for denying at least one proposed transaction of the one or more proposed transactions that is attempted by at least one of the one or more network accessible users based at least partly on the trust verification schema. For certain example implementations, a computer device may deny at least one proposed (e.g., requested, indicated, pending, non-final, unconsummated, non-executed, incomplete, suggested, a combination thereof, etc.) transaction (e.g., a purchase; a sale; an exchange of goods, services, money, or other consideration; a financial transaction; an internet-based transaction; a transaction including virtual goods or services; a physical retailer-based transaction; a transaction for a subscription; a transaction for access to a physical or virtual good or resource; a transaction for an entertainment object, such as a movie, TV show, or song; a combination thereof; etc.) of one or more proposed transactions that is attempted by at least one user of one or more network accessible users based at least partly on a trust verification schema (e.g., a trust verification schema 303 of FIG. 3c). An example instance may include a trust verification schema 303 of FIG. 3c being used to deny a transaction by network accessible User A based on schema 303 if, for example, a level of authentication of User A, User B, or User C in combination indicates that a low level of authentication is appropriate for at least User A. By way of example only, a transaction for any of user A, B, or C may be denied if at least one of them has indicated that a device of theirs has been stolen. Additionally or alternatively to using a trust verification schema, a proposed transaction may be denied using, at least in part, a value (e.g., a dollar amount of a good or service proposed to be purchased, a time involved to complete a proposed exchange, a loss incurred for incorrectly approving, a risk of authentication or approval, a risk of non-authentication or non-approval, a computer versus a cup of coffee, an entertainment item versus a safety-related item, a combination thereof, etc.) of the proposed transaction.

For certain example embodiments, an operation 1135 may include an operation 1136 for denying the at least one proposed transaction based at least partially on a calculated combined level of authentication for related network accessible users and on a predetermined combined level of authentication that is indicated by the trust verification schema. For certain example implementations, a computer device may deny at least one proposed transaction based at least partially (i) on a calculated combined level of authentication for (e.g., a level of authentication shared by, jointly assigned to, bilaterally adopted, at least partially simultaneously belonging to, a combination thereof, etc. at least two users that are) related (e.g., via one or more social or technical relationships) network accessible users and (ii) on a predetermined combined level of authentication that is indicated by a trust verification schema (e.g., a level of authentication established or stored by a trust verification schema 303 that leads to joint transaction denials for each user of a combined group of users if any user of the group has a level of authentication reach or fall below the predetermined combined level of authentication). An example instance may include accessing multiple levels of authentication associated with users A, B, or C with reference to a trust verification schema 303 of FIG. 3c. If one of users A, B, or C has a low level of authentication, such as may result from an indication of a stolen phone, a transaction may be denied even if the phone of User A was not the one stolen. Depending on a proximity or level of relation between or among users A, B, or C, each of the users may be denied certain transactions if one reports a stolen or lost phone. A denial may protect the property or financial standing of each of users A, B, or C if their relations are close enough such that a problem with one user indicates a high likelihood of a problem with related network accessible users.
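For illustration, a group-wide denial triggered when any related user's level of authentication reaches or falls below a predetermined combined level might look like the following sketch; the numeric levels (e.g., 0.2 after a stolen-phone report) are assumptions.

    # Hypothetical sketch: deny transactions for an entire related group when
    # any member's level of authentication reaches the predetermined floor.
    def deny_if_compromised(levels, group, predetermined=0.5):
        return any(levels[user] <= predetermined for user in group)

    levels = {"A": 0.9, "B": 0.2, "C": 0.8}  # User B reported a stolen phone
    print(deny_if_compromised(levels, ["A", "B", "C"]))  # True -> deny for all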

Those having skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware in one or more machines or articles of manufacture), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation that is implemented in one or more machines or articles of manufacture; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware in one or more machines or articles of manufacture. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically-oriented hardware, software, and/or firmware in one or more machines or articles of manufacture.

The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuitry, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).

In a general sense, those skilled in the art will recognize that the various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.

Those having skill in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use engineering practices to integrate such described devices and/or processes into data processing systems. That is, at least a portion of the devices and/or processes described herein can be integrated into a data processing system via a reasonable amount of experimentation. Those having skill in the art will recognize that a typical data processing system generally includes one or more of a system unit housing, a video display device, a memory such as volatile and non-volatile memory, processors such as microprocessors and digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices, such as a touch pad or screen, and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A typical data processing system may be implemented utilizing any suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.

The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.

While particular aspects of the present subject matter described herein have been shown and described, it will be apparent to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from the subject matter described herein and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of the subject matter described herein. Furthermore, it is to be understood that the invention is defined by the appended claims.

It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations.

In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).

In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”

Claims

1. A computationally-implemented system comprising:

circuitry for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users;
circuitry for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and
circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users.

2. The computationally-implemented system of claim 1, wherein the circuitry for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users comprises:

circuitry for sensing at the computer device one or more actions of the one or more network accessible users; and
circuitry for applying at the computer device one or more reliability criteria to the sensed one or more actions of the one or more network accessible users to update the one or more behavioral fingerprints associated with the one or more network accessible users.

3. The computationally-implemented system of claim 2, wherein the circuitry for sensing at the computer device one or more actions of the one or more network accessible users comprises:

circuitry for detecting at the computer device one or more contacts frequently interacted with by at least one of the one or more network accessible users via one or more social networks to determine at least one interaction pattern associated with the at least one of the one or more network accessible users; and
circuitry for detecting one or more locations visited by at least one of the one or more network accessible users, the one or more locations including one or more of physical locations or internet address-based locations.

4. The computationally-implemented system of claim 2, wherein the circuitry for applying at the computer device one or more reliability criteria to the sensed one or more actions of the one or more network accessible users to update the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for altering at least one of the one or more behavioral fingerprints associated with the one or more network accessible users as a function of the sensed one or more actions of the one or more network accessible users and at least one internet available entity.

5. The computationally-implemented system of claim 1, wherein the circuitry for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users comprises:

circuitry for relationally mapping the one or more behavioral fingerprints based at least partially on one or more relations between or among the one or more network accessible users as indicated by at least one social network.

6. The computationally-implemented system of claim 5, wherein the circuitry for relationally mapping the one or more behavioral fingerprints based at least partially on one or more relations between or among the one or more network accessible users as indicated by at least one social network comprises:

circuitry for receiving data at the computer device from the at least one social network, the received data indicating one or more relations between or among the one or more network accessible users; and
circuitry for mapping at the computer device one or more relationships that are extant between or among the one or more network accessible users based at least partially on the indicated one or more relations between or among the one or more network accessible users.

7. The computationally-implemented system of claim 5, wherein the circuitry for relationally mapping the one or more behavioral fingerprints based at least partially on one or more relations between or among the one or more network accessible users as indicated by at least one social network comprises:

circuitry for determining at the computer device via the at least one social network that at least one respective network accessible user of the one or more network accessible users has at least one corresponding behavioral fingerprint of the one or more behavioral fingerprints;
circuitry for determining at the computer device if the at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user of the one or more network accessible users; and
circuitry for relationally mapping by the computer device at least a subset of the one or more network accessible users for which the at least one corresponding behavioral fingerprint is maintained by the at least one respective network accessible user.

8. The computationally-implemented system of claim 1, wherein the circuitry for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users comprises:

circuitry for identifying by the computer device one or more relations between or among the one or more network accessible users;
circuitry for correlating by the computer device the one or more behavioral fingerprints associated with the one or more network accessible users based at least partially on the identified one or more relations; and
circuitry for generating the trust verification schema at least partially by mapping the correlated one or more behavioral fingerprints with the identified one or more relations.

9. The computationally-implemented system of claim 8, wherein the circuitry for identifying by the computer device one or more relations between or among the one or more network accessible users comprises:

circuitry for identifying the one or more relations based at least partially on one or more social network data.

10. The computationally-implemented system of claim 8, wherein the circuitry for identifying by the computer device one or more relations between or among the one or more network accessible users comprises:

circuitry for identifying the one or more relations via identifying one or more common network accessible users as linked via one or more social networks.

11. The computationally-implemented system of claim 8, wherein the circuitry for correlating by the computer device the one or more behavioral fingerprints associated with the one or more network accessible users based at least partially on the identified one or more relations comprises:

circuitry for identifying the one or more behavioral fingerprints of the one or more network accessible users; and
circuitry for comparing the identified one or more behavioral fingerprints based at least partially on one or more relationships existing between or among the one or more network accessible users.

12. The computationally-implemented system of claim 8, wherein the circuitry for generating the trust verification schema at least partially by mapping the correlated one or more behavioral fingerprints with the identified one or more relations comprises:

circuitry for generating the trust verification schema using the correlated one or more behavioral fingerprints, wherein the correlated one or more behavioral fingerprints result in a particular level of authentication for one or more groups of related network accessible users of the one or more network accessible users.

13. The computationally-implemented system of claim 12, wherein the circuitry for generating the trust verification schema using the correlated one or more behavioral fingerprints, wherein the correlated one or more behavioral fingerprints result in a particular level of authentication for one or more groups of related network accessible users of the one or more network accessible users comprises:

circuitry for determining at least one proximity of relation for the one or more network accessible users based at least partially on one or more social network linkages that are confirmed by at least a portion of the one or more network accessible users; and
circuitry for determining at least one level of relation between or among the one or more network accessible users based at least partially on the determined at least one proximity of relation for the one or more network accessible users.

14. The computationally-implemented system of claim 13, wherein the circuitry for determining at least one level of relation between or among the one or more network accessible users based at least partially on the determined at least one proximity of relation for the one or more network accessible users comprises:

circuitry for altering the at least one level of relation between or among the one or more network accessible users based at least partially on one or more changes to relations indicated by at least one of the one or more network accessible users, the one or more changes indicated via at least one social network.

15. The computationally-implemented system of claim 13, wherein the circuitry for determining at least one level of relation between or among the one or more network accessible users based at least partially on the determined at least one proximity of relation for the one or more network accessible users comprises:

circuitry for mapping one or more locations of the one or more network accessible users to confirm the at least one level of relation between or among the one or more network accessible users.

16. The computationally-implemented system of claim 1, wherein the circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for using the trust verification schema to automatically authenticate a proposed transaction associated with at least one of the one or more network accessible users based at least partly on a level of authentication associated with at least two of the one or more network accessible users.

17. The computationally-implemented system of claim 1, wherein the circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for authenticating a shared computer processing request via verification of at least one of the one or more network accessible users based at least partly on the trust verification schema.

18. The computationally-implemented system of claim 1, wherein the circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for authenticating an internet purchase transaction via verification of at least one of the one or more network accessible users based at least partly on the trust verification schema.

19. The computationally-implemented system of claim 1, wherein the circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for authenticating a purchase by a first network accessible user of the one or more network accessible users based at least partially on a location of a second network accessible user of the one or more network accessible users and on the trust verification schema.

20. The computationally-implemented system of claim 1, wherein the circuitry for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users comprises:

circuitry for denying at least one proposed transaction of the one or more proposed transactions that is attempted by at least one of the one or more network accessible users based at least partly on the trust verification schema.

21. The computationally-implemented system of claim 20, wherein the circuitry for denying at least one proposed transaction of the one or more proposed transactions that is attempted by at least one of the one or more network accessible users based at least partly on the trust verification schema comprises:

circuitry for denying the at least one proposed transaction based at least partially on a calculated combined level of authentication for related network accessible users and on a predetermined combined level of authentication that is indicated by the trust verification schema.

22. A computationally-implemented system comprising:

means for receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users;
means for receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and
means for transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users.

23.-42. (canceled)

43. A method comprising:

receiving at a computer device one or more behavioral fingerprints associated with one or more network accessible users;
receiving an authentication request at the computer device, the authentication request associated with one or more proposed transactions of the one or more network accessible users; and
transmitting from the computer device a decision associated with the authentication request, the decision based at least partially on a trust verification schema generated from a relational mapping of the one or more behavioral fingerprints associated with the one or more network accessible users.

44.-63. (canceled)

Patent History
Publication number: 20130133054
Type: Application
Filed: Jul 18, 2012
Publication Date: May 23, 2013
Inventors: Marc E. Davis (San Francisco, CA), Matthew G. Dyor (Bellevue, WA), Daniel A. Gerrity (Seattle, WA), Xuedong Huang (Bellevue, WA), Roderick A. Hyde (Redmond, WA), Royce A. Levien (Lexington, MA), Richard T. Lord (Tacoma, WA), Robert W. Lord (Seattle, WA), Mark A. Malamud (Seattle, WA), Nathan P. Myhrvold (Bellevue, WA), Clarence T. Tegreene (Mercer Island, WA)
Application Number: 13/552,502
Classifications
Current U.S. Class: Usage (726/7)
International Classification: G06F 21/31 (20060101);