Hiding information in noise
A process of hiding a key or data inside random noise is introduced, whose purpose is to protect the privacy of the key or data. In some embodiments, the random noise is produced by quantum randomness, using photonic emission with a light emitting diode. When the data or key generation and random noise have the same probability distributions, and the key size is fixed, the security of the hiding can be made arbitrarily close to perfect secrecy, by increasing the noise size. The hiding process is practical in terms of infrastructure and cost, utilizing the existing TCP/IP infrastructure as a transmission medium, and using light emitting diode(s) and a photodetector in the random noise generator. In some embodiments, symmetric cryptography encrypts the data before the encrypted data is hidden in random noise, which substantially amplifies the computational complexity.
Description
1 RELATED APPLICATIONS
This application claims priority benefit of U.S. Provisional Patent Application Ser. No. 62/085,338, entitled “Hiding Data Transmissions in Random Noise”, filed Nov. 28, 2014, which is incorporated herein by reference; this application claims priority benefit of U.S. Provisional Patent Application Ser. No. 62/092,795, entitled “Hiding Data Transmissions in Random Noise”, filed Dec. 16, 2014, which is incorporated herein by reference.
2 BACKGROUND—FIELD OF INVENTION
The present invention relates broadly to protecting the privacy of information and devices. The processes and device are generally used to maintain the privacy of information transmitted through communication and transmission systems. For example, the hiding processes may be used to protect the metadata of a phone call; in some embodiments, the phone call may be transmitted via voice over IP (internet protocol) with a mobile phone. These processes and devices also may be used to hide passive data stored on a computer or another physical device such as a tape drive. In some embodiments, symmetric cryptographic methods and machines are also used to supplement the hiding process.
Typically, the information (data) is hidden by a sending agent, called Alice. Alice transmits the hidden data to a receiving agent, called Bob. The receiving agent, Bob, applies an extraction process or device. The output of this extraction process or device is the same information (data) that Alice gathered before hiding and sending it. Eve is the name of the agent who is attempting to obtain the information or data. One of Alice and Bob's primary objectives is to assure that Eve cannot capture the private information that was hidden and transmitted between them.
3 BACKGROUND—PRIOR ART
The subject matter discussed in this background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the Summary and some Advantages of Invention section represents different approaches, which in and of themselves may also be inventions, and various problems, which may have been first recognized by the inventor.
In information security, a fundamental problem is for a sender, Alice, to securely transmit a message M to a receiver, Bob, so that the adversary, Eve, receives no information about the message. In Shannon's seminal paper [2], his model assumes that Eve has complete access to a public, noiseless channel: Eve sees an identical copy of the ciphertext C that Bob receives, where C(M,K) is a function of the message M lying in the message space and the secret key K lying in the key space.
In this specification, the symbol P will express a probability. The expression P(E) is the probability that event E occurs, and it satisfies 0≤P(E)≤1. For example, suppose the sample space is the 6 faces of a die, E is the event of rolling a 1 or a 5 with that die, and each of the 6 faces is equally likely. Then P(E)=2/6=⅓. The expression P(A∩B) is the probability that event A occurs and event B also occurs. The conditional probability P(A|B) expresses the probability that event A will occur, under the condition that someone knows event B already occurred. The expression that follows the symbol “|” represents the conditional event. Events A and B are independent if P(A∩B)=P(A)P(B).
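These definitions can be checked numerically. The sketch below, using Python's fractions module for exact arithmetic, reproduces the die example above and tests independence for two illustrative events; the events A and B chosen here are our own examples, not from the specification:

```python
from fractions import Fraction

# Sample space: the 6 faces of a fair die.
faces = set(range(1, 7))

# Event E: rolling a 1 or a 5, so P(E) = 2/6 = 1/3.
E = {1, 5}
p_E = Fraction(len(E), len(faces))
assert p_E == Fraction(1, 3)

# Illustrative events: A = "roll is even", B = "roll is at most 4".
A = {2, 4, 6}
B = {1, 2, 3, 4}
p_A = Fraction(len(A), len(faces))
p_B = Fraction(len(B), len(faces))
p_A_and_B = Fraction(len(A & B), len(faces))   # P(A ∩ B)
p_A_given_B = p_A_and_B / p_B                  # P(A | B) = P(A ∩ B) / P(B)

# These two events happen to be independent: P(A ∩ B) = P(A) P(B).
assert p_A_and_B == p_A * p_B
assert p_A_given_B == Fraction(1, 2)
```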
Expressed in terms of conditional probabilities, Shannon [2] defined a cryptographic method to be perfectly secret if P(M)=P(M | Eve sees ciphertext C) for every ciphertext C and for every message M in the message space. In other words, Eve has no more information about what the message M is after Eve sees ciphertext C pass through the public channel. Shannon showed for a noiseless, public channel that the entropy of the key space must be at least as large as the entropy of the message space in order to achieve perfect secrecy.
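Shannon's definition can be verified exactly for a one-bit one-time pad by computing P(M | C) from the joint distribution of message and ciphertext; the message prior chosen in this sketch is arbitrary, and the computation is our own illustration of the definition:

```python
from fractions import Fraction
from itertools import product

# One-bit one-time pad: C = M XOR K, with K uniform on {0, 1}.
# Any (non-degenerate) prior on the message works; pick one.
p_M = {0: Fraction(1, 4), 1: Fraction(3, 4)}
p_K = {0: Fraction(1, 2), 1: Fraction(1, 2)}

# Joint distribution over (M, C).
p_MC = {}
for m, k in product((0, 1), repeat=2):
    c = m ^ k
    p_MC[(m, c)] = p_MC.get((m, c), Fraction(0)) + p_M[m] * p_K[k]

for c in (0, 1):
    p_C = sum(p_MC[(m, c)] for m in (0, 1))
    for m in (0, 1):
        # Perfect secrecy: P(M = m | C = c) equals the prior P(M = m).
        assert p_MC[(m, c)] / p_C == p_M[m]
```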
Shannon's communication secrecy model [2] assumes that the messages in the message space are finite and all of the same size. Shannon's model assumes that the transformations (encryption methods) on the message space are invertible and map a message of one size to the same size. Shannon's model assumes that the transformation applied to the message is based on the key. In the prior art, there is no use of random noise that is independent of the message or the key. In the prior art, there is no notion of being able to send a hidden or encrypted message inside the random noise, where the size of the message is not necessarily revealed to Eve. In the prior art, there is no notion of using random noise to hide the secret channel and transmitting a key inside this channel that is indistinguishable from the noise.
Quantum cryptography was introduced by Wiesner and eventually published by Bennett, Brassard, et al. [3, 4]. Quantum cryptography is based on the uncertainty principle of quantum physics: by measuring one component of the polarization of a photon, Eve irreversibly loses her ability to measure the orthogonal component of the polarization. Unfortunately, this type of cryptography requires an expensive physical infrastructure that is challenging to implement over long distances [5, 6]. Furthermore, Alice and Bob still need a shared authentication secret to successfully perform this quantum cryptography, in order to assure that Eve cannot corrupt messages about the polarization bases communicated on Alice and Bob's public channel.
4 SUMMARY AND SOME ADVANTAGES OF THE INVENTION(S)
In some parts of the prior art, conventional wisdom holds that hiding data in the open cannot provide adequate information security. The invention(s) described herein demonstrate that our process of hiding data inside noise is quite effective. A process for hiding data inside of random noise is demonstrated and described. In some embodiments, the data hidden is a key. In some embodiments, the data hidden is a public key. In some embodiments, the data hidden is encrypted data. In some embodiments, the data hidden is encrypted data that was first encrypted by a block cipher. In some embodiments, the data hidden is encrypted data that was first encrypted by a stream cipher. In some embodiments, the hidden data may be hidden metadata that is associated with the TCP/IP infrastructure [1] used to transmit information.
The invention(s) described herein are not bound to Shannon's limitations [2] because they use noise, rather than seek to eliminate noise. When the data generation and random noise have a uniform probability distribution, and the key size is fixed, the security of the key transmission can be made arbitrarily close to perfect secrecy, where arbitrarily close is defined in section 7.9, by increasing the noise size. The processes, devices and machines described herein are practical; they can be implemented with current TCP/IP infrastructure acting as a transmission medium and a random noise generator providing the random noise and key generation.
5 ADVANTAGES AND FAVORABLE PROPERTIES
Overall, our invention(s) that hide data and keys inside random noise exhibit the following favorable security properties.

 The hiding process is O(n).
 For a fixed key size m bits and ρ=n−m bits of random noise, as ρ→∞, the security of the hidden data can be made arbitrarily close to perfect secrecy. In some applications, the key size can also be kept secret and is not revealed to Eve.
 From the binomial distribution, the closeness to perfect secrecy can be efficiently computed.
 The scatter map can be reused when both the key generation and the noise generation have a uniform probability distribution and a new random key and new noise are created for each transmission.
 The reuse property enables a practical process of hiding data that is first encrypted by a block or stream cipher. The complexity of finding this hidden encrypted data can be substantially greater than the computational complexity of the underlying block or stream cipher. See section 7.12.
 Our hiding process uses a noiseless, public channel, which means it can be implemented with our current Transmission Control Protocol/Internet Protocol (TCP/IP) internet infrastructure. No expensive physical infrastructure is needed to create noisy channels or to transmit and maintain polarized photons, as is required by the prior art of quantum cryptography. Random noise generators are commercially feasible and inexpensive. A random noise generator that produces more than 10,000 random bits per second can be manufactured in high volume for less than three U.S. dollars per device.
 Alice and Bob possess their own sources of randomness. This system design decentralizes the security to each user. Decentralization helps eliminate potential single points of failure and backdoors in the transmission medium that may be outside the inspection and control of Alice and Bob.
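The counting argument behind several of the properties above can be sketched as follows. This is a simplified model, of our own construction, in which Eve must guess which m of the n = m + ρ transmitted bits carry the key; the function name and sizes are illustrative:

```python
from math import comb

def candidate_placements(m, rho):
    """Number of ways to choose which m of the n = m + rho
    transmitted bits carry the key.  Under uniform key and noise,
    every placement is consistent with what Eve observes."""
    return comb(m + rho, m)

# For a fixed key size m, the count grows without bound as the
# noise size rho increases, so Eve's chance of guessing the right
# placement tends to zero.
assert candidate_placements(8, 0) == 1             # no noise: one placement
assert candidate_placements(8, 8) == 12870         # C(16, 8)
assert candidate_placements(8, 24) == comb(32, 8)  # far more candidates
```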
6 DESCRIPTION OF FIGURES
In the following figures, although they may depict various examples of the invention, the invention is not limited to the examples depicted in the figures.
Table 1 shows probabilities after Eve observes a hidden key or hidden data inside random noise. The hidden key or hidden data is represented as .
7 DETAILED DESCRIPTION
7.1 Information System
In this specification, the term “data” is broad and refers to any kind of information. In some embodiments, data may refer to plaintext information. In some embodiments, data may refer to voice information, transmitted with a landline phone or mobile phone. In some embodiments, data may refer to metadata. In some embodiments, data may refer to email or other information available on the Internet. In some embodiments, data may refer to the information in a sequence of values. In some embodiments, data may refer to the information in a sequence of bit values. In some embodiments, data may refer to the information in a sequence of numbers. In some embodiments, data may refer to the information in a sequence or collection of physical values or physical measurements. In some embodiments, data may refer to the information in a physical location (e.g., GPS coordinates of an auto or a mailing address in Venezia, Italia) or to the information in an abstract location, for example, a computer memory address or a virtual address. In some embodiments, data may refer to the information contained in Shakespeare's King Lear or Dostoevsky's Grand Inquisitor or Euclid's Elements. In some embodiments, data may refer to the information in Kepler's astronomical measurements or a collection of geophysical measurements. In some embodiments, data may refer to the information in a sequence of times or a collection of times. In some embodiments, data may refer to the information in statistical data such as economic or insurance information. In some embodiments, data may refer to medical information (e.g., an incurable cancer diagnosis) or genetic information (e.g., that a person has the amino acid substitution causing sickle cell anemia). In some embodiments, data may refer to the information in a photograph of friends or family or satellite photos. In some embodiments, data may refer to the information in a code or sequence of codes.
In some embodiments, data may refer to the information in a sequence of language symbols for a language that has not yet been discovered or designed. In some embodiments, data may refer to financial information; for example, data may refer to a bid quote on a financial security, or an ask quote on a financial security. In some embodiments, data may refer to information about a machine or a collection of machines, for example, an electrical grid or a power plant. In some embodiments, data may refer to what electrical engineers sometimes call signal in information theory. In some embodiments, data may refer to a cryptographic key. In some embodiments, data may refer to a sequence or collection of computer program instructions (e.g., native machine instructions or source code information). In some embodiments, data may refer to a prime number or a mathematical formula or mathematical invariant information. In some embodiments, data may refer to an internet protocol address or internet traffic information. In some embodiments, data may refer to a combination or amalgamation or synthesis of one or more of these types of aforementioned information.
In this specification, the term “noise” is information that is distinct from data and has a different purpose. Noise is information that helps hide the data so that the noise hinders the adversary Eve from finding or obtaining the data. This hiding of the data helps maintain the privacy of the data. In some embodiments, hiding the data means rearranging or permuting the data inside the noise. An example of data is a key. Hiding a key inside noise helps protect the privacy of the key; the key may subsequently help execute a cryptographic algorithm by a first party (e.g., Alice) or a second party (e.g., Bob).
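A minimal sketch of hiding by rearranging data inside noise might look like the following. The function names, the sizes, and the particular list-of-positions representation of the scatter map are our own illustration under the description above, not the patented process itself:

```python
import random
import secrets

def hide(key_bits, noise_bits, scatter):
    """Place key bits at the positions named by the scatter map and
    fill every remaining position with noise bits, in order."""
    n = len(key_bits) + len(noise_bits)
    out = [None] * n
    for bit, pos in zip(key_bits, scatter):
        out[pos] = bit
    noise_iter = iter(noise_bits)
    for i in range(n):
        if out[i] is None:
            out[i] = next(noise_iter)
    return out

def extract(transmission, scatter):
    """Recover the hidden bits from the scatter-map positions."""
    return [transmission[pos] for pos in scatter]

# Illustrative sizes: an 8-bit key hidden among 24 noise bits.
m, rho = 8, 24
key = [secrets.randbits(1) for _ in range(m)]
noise = [secrets.randbits(1) for _ in range(rho)]

# The shared scatter map: m distinct positions in {0, ..., n-1}.
scatter = random.SystemRandom().sample(range(m + rho), m)

sent = hide(key, noise, scatter)
assert extract(sent, scatter) == key
```

When key and noise bits are drawn from the same uniform distribution, the transmitted sequence is itself uniform, which is what keeps the key's positions hidden from Eve.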
In this specification, the term “location” may refer to geographic locations and/or storage locations. A particular storage location may be a collection of contiguous and/or noncontiguous locations on one or more machine-readable media. Two different storage locations may refer to two different sets of locations on one or more machine-readable media in which the locations of one set may be intermingled with the locations of the other set.
In this specification, the term “machine-readable medium” refers to any non-transitory medium capable of carrying or conveying information that is readable by a machine. One example of a machine-readable medium is a computer-readable medium. Another example of a machine-readable medium is paper having holes that are detected that trigger different mechanical, electrical, and/or logic responses. The term machine-readable medium also includes media that carry information while the information is in transit from one location to another, such as copper wire and/or optical fiber and/or the atmosphere and/or outer space.
In this specification, the term “process” refers to a series of one or more operations. In an embodiment, “process” may also include operations or effects that are best described as nondeterministic. In an embodiment, “process” may include some operations that can be executed by a digital computer program and some physical effects that are nondeterministic, which cannot be executed by a digital computer program and cannot be performed by a finite sequence of processor instructions.
In this specification, the machine-implemented processes implement algorithms and nondeterministic processes on a machine. The formal notion of “algorithm” was introduced in Turing's work [8] and refers to a finite machine that executes a finite number of instructions with finite memory. In other words, an algorithm can be executed with a finite number of machine instructions on a processor. An “algorithm” is a deterministic process in the following sense: if the finite machine is completely known and the input to the machine is known, then the future behavior of the machine can be determined. However, there is quantum random number generator (QRNG) hardware [9, 10] and other embodiments that measure quantum effects from photons (or other physically nondeterministic processes), whose physical process is nondeterministic. The recognition of nondeterminism produced by quantum randomness and other quantum embodiments is based on many years of experimental evidence and statistical testing. Furthermore, quantum theory, derived from the Kochen-Specker theorem and its extensions [11, 12], predicts that the outcome of a quantum measurement cannot be known in advance and cannot be generated by a Turing machine (digital computer program). As a consequence, a physically nondeterministic process cannot be generated by an algorithm: namely, a sequence of operations executed by a digital computer program.
Some examples of physically nondeterministic processes are as follows. In some embodiments that utilize nondeterminism, photons strike a semi-transparent mirror and can take two or more paths in space. In one embodiment, if the photon is reflected by the semi-transparent mirror, then it takes on one bit value b ∈ {0, 1}; if the photon passes through the semi-transparent mirror, then the nondeterministic process produces another bit value 1−b. In another embodiment, the spin of an electron may be sampled to generate the next nondeterministic bit. In still another embodiment, a protein, composed of amino acids, spanning a cell membrane or artificial membrane, that has two or more conformations can be used to detect nondeterminism: the protein conformation sampled may be used to generate a nondeterministic value in {0, . . . , n−1}, where the protein has n distinct conformations. In an alternative embodiment, one or more rhodopsin proteins could be used to detect the arrival times of photons, and the differences of arrival times could generate nondeterministic bits. In some embodiments, a Geiger counter may be used to sample nondeterminism.
In this specification, the term “photodetector” refers to any type of device or physical object that detects or absorbs photons. A photodiode is an embodiment of a photodetector. A phototransistor is an embodiment of a photodetector. A rhodopsin protein is an embodiment of a photodetector.
In this specification, the term “key” is a type of information and is a value or collection of values on which one or more operations are performed. In some embodiments, one or more of these operations are cryptographic operations. {0, 1}^{n} is the set of all bit-strings of length n. When a key is represented with bits, mathematically an n-bit key is an element of the collection {0, 1}^{n}, which is the collection of strings of 0's and 1's of length n. For example, the string of 0's and 1's that starts after this colon is a 128-bit key: 01100001 11000110 01010011 01110001 11000101 10001110 11011001 11010101 01011001 01100100 10110010 10101010 01101101 10000111 10101011 00010111. In an embodiment, n=3000 so that a key is a string of 3000 bits.
In other embodiments, a key may be a sequence of values that are not represented as bits. Consider the set {A, B, C, D, E}. For example, the string that starts after this colon is a 40-symbol key selected from the set {A, B, C, D, E}: ACDEB AADBC EAEBB AAECB ADDCB BDCCE ACECB EACAE. In an embodiment, a key could be a string of length n selected from {A, B, C, D, E}^{n}. In an embodiment, n=700 so that the key is a string of 700 symbols where each symbol is selected from {A, B, C, D, E}.
In some embodiments, a key is a collection of one or more values that specifies how a particular encryption function will encrypt a message. For example, a key may be a sequence of 0's and 1's that are bitwise exclusive-or'ed with the bits that comprise a message to form the encrypted message.
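The bitwise exclusive-or encryption just described can be sketched directly; this illustration operates on bytes rather than individual bits for brevity, and the message is a placeholder:

```python
import secrets

def xor_encrypt(message: bytes, key: bytes) -> bytes:
    """Bitwise exclusive-or of message bytes with key bytes.
    The same function decrypts, since (m ^ k) ^ k == m."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh random key

ciphertext = xor_encrypt(message, key)
assert xor_encrypt(ciphertext, key) == message
```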
In some embodiments, hidden data (key) 109 in
In another embodiment, a key may be a sequence of values that a block cipher reads as input in order to encrypt a message with the block cipher's encryption algorithm. In another embodiment, a key may be a sequence of values that a block cipher reads as input in order to decrypt an encrypted message with the block cipher's decryption algorithm. If Eve does not know that key, then it is difficult for Eve to decrypt the encrypted message (K,M). AES [13] is a common block cipher algorithm that reads 256-bit keys as input. Serpent [14] is also a block cipher algorithm that reads 256-bit keys as input.
In other embodiments, the key may be a public key. In some embodiments, a key may refer to a public key for the RSA public-key algorithm [15]. In this case, a key is based on huge prime numbers. In some embodiments, random generator 128 generates a key that is subsequently hidden by scatter map instructions 130.
Information system 100 may be a system for transmitting hidden data. Data 104 refers to information that has a purpose and that has not been hidden yet. In some embodiments, data is intended to be delivered to another location, software unit, machine, person, or other entity.
In some embodiments, data 104 is voice metadata that has not yet been hidden. Voice metadata may contain the IP address of the sending (calling) phone and also the IP address of the receiving phone. Voice metadata may contain the time of the call and the date. Some embodiments of a mobile phone are shown in
In an embodiment, data may be unhidden information being transmitted wirelessly between satellites. Data may be represented in analog form in some embodiments and in digital form in others. In an embodiment, the sound waves transmitted from a speaker's mouth into a mobile phone microphone are data. The representation of this data before reaching the microphone is in analog form. Subsequently, the data may be digitally sampled so that it is represented digitally after being received by the mobile phone microphone. In general, data herein refers to any kind of information that has not been hidden or encrypted and that has a purpose.
In information system 100, noise helps hide the data. It may be desirable to keep the contents of data 104 private or secret. Consequently, it may be desirable to hide data 104, so that the transmitted information is expected to be unintelligible to an unintended recipient should the unintended recipient attempt to read and/or extract the hidden data transmitted. Data 104 may be a collection of multiple, not yet hidden information blocks, an entire message of data, a segment of data (information), or any other portion of a data.
Hiding process 106 may be a series of steps that are performed on data 104. In one embodiment, the term “process” refers to one or more instructions for sending machine 102 to execute the series of operations that may be stored on a machinereadable medium. Alternatively, the process may be carried out by and therefore refer to hardware (e.g., logic circuits) or may be a combination of instructions stored on a machinereadable medium and hardware that cause the operations to be executed by sending machine 102 or receiving machine 112. Data 104 may be input for hiding process 106. The steps that are included in hiding process 106 may include one or more mathematical operations and/or one or more other operations.
As a post-processing step, one-way hash function 948 may be applied to a sequence of random events such as quantum events (nondeterministic) generated by nondeterministic generator 942 in
In some embodiments, as shown in
In some embodiments, hiding process 106 requests random generator 128 to help generate one or more keys (shown in cipher instructions 129) for encrypting at least part of data 104. In an embodiment, nondeterministic generator 942 (
Sending machine 102 may be an information machine that handles information at or is associated with a first location, software unit, machine, person, sender, or other entity. Sending machine 102 may be a computer, a phone, a mobile phone, a telegraph, a satellite, or another type of electronic device, a mechanical device, or other kind of machine that sends information. Sending machine 102 may include one or more processors and/or may include specialized circuitry for handling information. Sending machine 102 may receive data 104 from another source (e.g., a transducer such as a microphone which is inside mobile phone 402 or 502 of
Sending machine 102 may implement any of the hiding processes described in this specification. Hiding process 106 may include any of the hiding processes described in this specification. For example, hiding process 106 may implement any of the embodiments of the hiding processes 1 or 2, as described in section 7.6; hiding process 106 may implement any of the embodiments of the hiding process 3, as described in section 7.10; hiding process 106 may implement any of the embodiments of the hiding processes 4 or 5, as described in section 7.11; hiding process 106 may implement any of the embodiments of the hiding processes 6 or 7, as described in section 7.13. In some embodiments, hidden data 132, shown in
Transmission path 110 is the path taken by hidden data 109 to reach the destination to which hidden data 109 was sent. Transmission path 110 may include one or more networks, as shown in
Receiving machine 112 may be an information machine that handles information at the destination of hidden data 109. Receiving machine 112 may be a computer, a phone, a telegraph, a router, a satellite, or another type of electronic device, a mechanical device, or other kind of machine that receives information. Receiving machine 112 may include one or more processors and/or specialized circuitry configured for handling information, such as hidden data 109. Receiving machine 112 may receive hidden data 109 from another source and/or reconstitute (e.g., extract) all or part of hidden data 109. Receiving machine 112 may implement any of the hiding processes described in this specification and is capable of extracting any message hidden by sending machine 102 and hiding process 106.
In one embodiment, receiving machine 112 only receives hidden data 109 from transmission path 110, while hiding process 106 is implemented manually and/or by another information machine. In another embodiment, receiving machine 112 implements extraction process 116 that reproduces all or part of data 104, referred to as extracted data 114 in
Receiving machine 112 may be identical to sending machine 102. For example, receiving machine 112 may receive data 104 from another source, produce all or part of data 104, and/or implement hiding process 106. Similar to sending machine 102, receiving machine 112 may create keys and random noise and random data. Receiving machine 112 may transmit the output of extraction process 116, via transmission path 110 to another entity and/or receive hidden data 109 (via transmission path 110) from another entity. Receiving machine 112 may present hidden data 109 for use as input to extraction process 116.
7.2 Processor, Memory and Input/Output Hardware
Information system 200 illustrates some of the ways in which information system 100 may be implemented. Sending machine 202 is one embodiment of sending machine 102. Sending machine 202 may be a secure USB memory storage device as shown in 3A. Sending machine 202 may be an authentication token as shown in
Sending machine 202 or sending machine 400 may communicate wirelessly with computer 204. In an embodiment, computer 204 may be a call station for receiving hidden data 109 from sending machine 400. A user may use input system 254 and output system 252 of sending machine (mobile phone) 400 to transmit hidden voice data or hidden metadata to a receiving machine that is a mobile phone. In an embodiment, input system 254 in
Computer 204 is connected to system 210, and is connected, via network 212, to system 214, system 216, and system 218, which is connected to system 220. Network 212 may be any one or any combination of one or more Local Area Networks (LANs), Wide Area Networks (WANs), wireless networks, telephone networks, and/or other networks. System 218 may be directly connected to system 220 or connected via a LAN to system 220. Network 212 and systems 214, 216, 218, and 220 may represent Internet servers or nodes that route hidden data (e.g., hidden voice data or hidden metadata) received from sending machine 400 shown in
In an embodiment, hiding process 106 and extraction process 116 execute in a secure area of processor system 258 of
In an embodiment, specialized hardware in processor system 258 may be embodied as an ASIC (application-specific integrated circuit) that computes SHA-1 and/or SHA-512 and/or Keccak and/or BLAKE and/or JH and/or Skein to help execute the HMAC function in process 4, named Hiding One or More Keys with Authentication, or to help execute process 5, named Hiding Encrypted Data Elements with Authentication. An ASIC chip can increase the execution speed and protect the privacy of hiding process 106 and extraction process 116.
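In software, the HMAC computation mentioned above is available in standard libraries. The following sketch is illustrative only; the key and message values are placeholders, not values from the specification:

```python
import hashlib
import hmac

# Placeholder values; a real deployment would use a shared secret
# established between Alice and Bob.
shared_secret = b"shared-authentication-secret"
transmission = b"hidden data bits"

# SHA-512 is one of the hash functions named in the text.
tag = hmac.new(shared_secret, transmission, hashlib.sha512).digest()

# The receiver recomputes the tag and compares in constant time.
recomputed = hmac.new(shared_secret, transmission, hashlib.sha512).digest()
assert hmac.compare_digest(tag, recomputed)
assert len(tag) == 64   # SHA-512 produces a 512-bit (64-byte) tag
```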
In an embodiment, input system 254 of
In an embodiment, memory system 256 of
7.3 NonDeterministic Generators
The emission times of the photons emitted by the LED experimentally obey the energy-time form of the Heisenberg uncertainty principle. This uncertainty contributes to the nondeterminism of random noise generator 142 because the photon emission times are unpredictable.
A photodiode is a semiconductor device that converts light (photons) into electrical current, which is called a photocurrent. The photocurrent is generated when photons are absorbed in the photodiode. Photodiodes are similar to standard semiconductor diodes except that they may be either exposed or packaged with a window or optical fiber connection to allow light (photons) to reach the sensitive part of the device. A photodiode may use a PIN junction or a p-n junction to generate electrical current from the absorption of photons. In some embodiments, the photodiode may be a phototransistor.
A phototransistor is a semiconductor device composed of three electrodes that are part of a bipolar junction transistor. Visible or ultraviolet light activates this bipolar junction transistor. Illumination of the base generates carriers which supply the base signal while the base electrode is left floating. The emitter junction constitutes a diode, and transistor action amplifies the incident light, inducing a signal current.
When one or more photons with high enough energy strike the photodiode, an electron-hole pair is created. This phenomenon is a type of photoelectric effect. If the absorption occurs in the junction's depletion region, or one diffusion length away from the depletion region, these carriers (the electron-hole pair) are swept from the PIN or p-n junction by the built-in electric field of the depletion region. The electric field causes holes to move toward the anode, and electrons to move toward the cathode; the movement of the holes and electrons creates a photocurrent. In some embodiments, the amount of photocurrent is an analog value, which can be digitized by an analog-to-digital converter. In some embodiments, the analog value is amplified before being digitized. The digitized value is what becomes the random noise. In some embodiments, a one-way hash function 948 or 958 may also be applied to post-process the random noise to produce the noise r_{1} r_{2} . . . r_{ρ} used by processes 1, 2, 3, 4 and 5. In some embodiments, a one-way hash function may be applied to the random noise to produce key(s) k_{1} k_{2} . . . k_{m}, used by processes 2 and 4.
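The one-way hash post-processing step described above might be sketched as follows. SHA-256 is assumed here for concreteness (the specification names several candidate hash functions), and the raw samples are hypothetical:

```python
import hashlib

def whiten(raw_samples: bytes) -> bytes:
    """Post-process raw digitized photocurrent samples with a
    one-way hash, compressing them into uniform-looking noise."""
    return hashlib.sha256(raw_samples).digest()

# Hypothetical raw samples from the analog-to-digital converter.
raw = bytes([137, 12, 250, 3, 77, 190, 21, 8])
noise = whiten(raw)
assert len(noise) == 32   # 256 bits of post-processed noise
```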
In an embodiment, the sampling of the digitized photocurrent values may be converted to threshold times as follows. A photocurrent threshold θ is selected as a sampling parameter. If a digitized photocurrent value i_{1} is above θ at time t_{1}, then t_{1} is recorded as a threshold time. If the next digitized photocurrent value i_{2} above θ occurs at time t_{2}, then t_{2} is recorded as the next threshold time. If the next digitized value i_{3} above θ occurs at time t_{3}, then t_{3} is recorded as the next threshold time.
After three consecutive threshold times are recorded, these three times determine a bit value as follows. If t_{2}−t_{1}>t_{3}−t_{2}, then the random noise generator produces a 1 bit. If t_{2}−t_{1}<t_{3}−t_{2}, then the random noise generator produces a 0 bit. If t_{2}−t_{1}=t_{3}−t_{2}, then no noise information is produced. To generate the next bit, random noise generator 942 or 952 continues the same sampling steps as before, and three new threshold times are produced and compared.
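The threshold-time comparison above can be sketched in Python. The non-overlapping grouping of times into triples follows the description; the function name and list representation are illustrative assumptions:

```python
def bits_from_threshold_times(times):
    """Convert a sequence of recorded threshold times into bits by
    comparing consecutive inter-arrival gaps.  Each non-overlapping
    triple (t1, t2, t3) yields one bit: 1 if t2 - t1 > t3 - t2,
    0 if t2 - t1 < t3 - t2, and no bit when the gaps are equal."""
    bits = []
    for i in range(0, len(times) - 2, 3):
        t1, t2, t3 = times[i], times[i + 1], times[i + 2]
        first_gap, second_gap = t2 - t1, t3 - t2
        if first_gap > second_gap:
            bits.append(1)
        elif first_gap < second_gap:
            bits.append(0)
        # equal gaps: no noise information is produced
    return bits
```

For example, the triples (0, 5, 7) and (10, 12, 17) produce a 1 bit and a 0 bit respectively, while (20, 23, 26) has equal gaps and produces nothing.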
In an alternative sampling method, a sample mean μ is established for the photocurrent while it is illuminated with photons. In some embodiments, the sampling method is implemented as follows. Let i_{1 }be the photocurrent value sampled at the first sampling time. i_{1 }is compared to μ. A parameter ϵ is selected for the sampling method that is a much smaller number than μ. If i_{1 }is greater than μ+ϵ, then a 1 bit is produced by the random noise generator 942 or 952. If i_{1 }is less than μ−ϵ, then a 0 bit is produced by random noise generator 942 or 952. If i_{1 }is in the interval [μ−ϵ, μ+ϵ], then NO bit is produced by random noise generator 942 or 952.
Let i_{2 }be the photocurrent value sampled at the next sampling time. i_{2 }is compared to μ. If i_{2 }is greater than μ+ϵ, then a 1 bit is produced by the random noise generator 942 or 952. If i_{2 }is less than μ−ϵ, then a 0 bit is produced by the random noise generator 942 or 952. If i_{2 }is in the interval [μ−ϵ, μ+ϵ], then NO bit is produced by the random noise generator 942 or 952. This alternative sampling method continues in the same way with photocurrent values i_{3}, i_{4}, and so on. In some embodiments, the parameter ϵ is selected as zero instead of a small positive number relative to μ.
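A sketch of this mean-comparison sampling in Python, with μ and ϵ as parameters (the function name is an illustrative assumption):

```python
def bits_from_samples(samples, mu, eps):
    """Mean-comparison sampling: emit a 1 bit for a photocurrent sample
    above mu + eps, a 0 bit for a sample below mu - eps, and no bit for
    samples inside the dead band [mu - eps, mu + eps]."""
    bits = []
    for current in samples:
        if current > mu + eps:
            bits.append(1)
        elif current < mu - eps:
            bits.append(0)
        # samples inside [mu - eps, mu + eps] produce no bit
    return bits
```

With eps set to zero, as in the last embodiment above, every sample not exactly equal to μ produces a bit.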
Some alternative hardware embodiments of nondeterministic generator 128 (
In some embodiments, the seek time of a hard drive can be used as random noise values as the air turbulence in the hard drive affects the seek time in a nondeterministic manner. In some embodiments, local atmospheric noise can be used as a source of random noise. For example, the air pressure, the humidity or the wind direction could be used. In other embodiments, the local sampling of smells based on particular molecules could also be used as a source of random noise.
In some embodiments, a Geiger counter may be used to sample nondeterminism and generate random noise. In these embodiments, the unpredictability is due to radioactive decay rather than photon emission, arrivals and detection.
7.4 Deterministic Generators
In an embodiment, a deterministic generator 962 (
In some embodiments, Φ and Ψ are the same one-way hash function. In other embodiments, Φ and Ψ are different one-way hash functions. In an embodiment, Φ is the one-way hash function SHA-512 and Ψ is the one-way hash function Keccak. In another embodiment, Φ is the one-way hash function Keccak and Ψ is the one-way hash function SHA-512.
In an embodiment, the ith generator Δ(i) is composed of N bits and updated with generator update instructions 966. The N bits of Δ(i) are represented as Δ_{i,0 }Δ_{i,1 }. . . Δ_{i,N−1}, where each bit Δ_{i,j }is a 0 or 1. In an embodiment, generator update instructions 966 are executed according to the following two steps described in machine 1:
In an embodiment, the size of the deterministic generator N may be 1024. In another embodiment, N may be fifty thousand. In another embodiment, N may be ten billion.
In an embodiment, one-way hash instructions 964 are performed by processor system 258 (
In an embodiment, the instructions that execute machine 1 and help execute deterministic generator 962 may be expressed in the C programming language before compilation. In an embodiment, the instructions that execute machine 1 and help execute deterministic generator 962 may be expressed in the native machine instructions of processor system 258. In an embodiment, the instructions that execute machine 1 may be implemented as an ASIC, which is part of processor system 258.
Machine 1. Generating Noise with a Machine
In an embodiment, machine 2 generates key(s) 970 as follows. Φ is a one-way hash function with digest size d and is executed with one-way hash instructions 964. In some embodiments, Ψ is a one-way hash function with a digest size of at least m bits (the size of one or more keys) and is executed with one-way hash instructions 968. In some embodiments, if m is greater than the digest size of Ψ, then the generator update steps in machine 2 may be called more than once to generate enough keys.
In some embodiments, Φ and Ψ are the same one-way hash function. In other embodiments, Φ and Ψ are different one-way hash functions. In an embodiment, Φ is the one-way hash function SHA-512 and Ψ is the one-way hash function Keccak. In another embodiment, Φ is the one-way hash function Keccak and Ψ is the one-way hash function SHA-512.
In an embodiment, the ith generator Δ(i) is composed of N bits and updated with generator update instructions 966. The N bits of Δ(i) are represented as Δ_{i,0 }Δ_{i,1 }. . . Δ_{i,N−1}, where each bit Δ_{i,j }is a 0 or 1. In an embodiment, generator update instructions 966 are executed according to the following two steps described in machine 2:
In an embodiment, the size of the deterministic generator N may be 1024. In another embodiment, N may be fifty thousand. In another embodiment, N may be ten billion.
In an embodiment, one-way hash instructions 964 are performed by processor system 258 (
In an embodiment, the instructions that execute machine 2 and help execute deterministic generator 962 may be expressed in the C programming language before compilation. In an embodiment, the instructions that execute machine 2 and help execute deterministic generator 962 may be expressed in the native machine instructions of processor system 258. In an embodiment, the instructions that execute machine 2 may be implemented as an ASIC, which is part of processor system 258. In an embodiment, memory system 956 may store one or more keys 970.
Machine 2. Generating One or More Keys with a Machine
7.5 One-Way Hash Functions
In
More details are provided on the notion of computationally intractable. In an embodiment, there is an amount of time T that encrypted information must stay secret. If encrypted information has no economic value or strategic value after time T, then computationally intractable means that the number of computational steps required, using all the world's computing power, will take more time to compute than time T. Let C(t) denote all the world's computing power at the time t in years.
Consider an online bank transaction that encrypts the details of that transaction. Then, in most embodiments, the number of computational steps that can be computed by all the world's computers over the next 30 years is likely to be computationally intractable, as that particular bank account is likely to no longer exist in 30 years, or to have a very different authentication interface.
To make the numbers more concrete, the 2013 Chinese supercomputer that broke the world's computational speed record computes about 33,000 trillion calculations per second [17]. Suppose T=1 year and assume that there are at most 1 billion of these supercomputers. (This can be inferred from economic considerations, based on a far too low 1 million dollar price for each supercomputer; these 1 billion supercomputers would then cost 1,000 trillion dollars.) Thus, C(2014)×1 year is less than 10^{9}×33×10^{15}×3600×24×365=1.04×10^{33 }computational steps.
As just discussed, in some embodiments and applications, computationally intractable may be measured in terms of how much the encrypted information is worth in economic value and what the current cost is of the computing power needed to decrypt that encrypted information. In other embodiments, economic computational intractability may be an inadequate measure. For example, suppose a family wishes to keep their child's whereabouts unknown to violent kidnappers. Suppose T=100 years, because it is about twice their expected lifetimes. Then 100 years×C(2064) is a better measure of computationally intractable for this application. In other words, for critical applications whose value is beyond economic value, one should strive for a good estimate of the world's computing power.
One-way functions that exhibit completeness and a good avalanche effect or the strict avalanche criterion [18] are preferable embodiments: these properties are favorable for one-way hash functions. The definitions of completeness and a good avalanche effect are quoted directly from [18]:

 If a cryptographic transformation is complete, then each ciphertext bit must depend on all of the plaintext bits. Thus, if it were possible to find the simplest Boolean expression for each ciphertext bit in terms of plaintext bits, each of those expressions would have to contain all of the plaintext bits if the function was complete. Alternatively, if there is at least one pair of n-bit plaintext vectors X and X_{i }that differ only in bit i, and ƒ(X) and ƒ(X_{i}) differ at least in bit j for all {(i,j):1≤i,j≤n}, the function ƒ must be complete.
 For a given transformation to exhibit the avalanche effect, an average of one half of the output bits should change whenever a single input bit is complemented. In order to determine whether an m×n (m input bits and n output bits) function ƒ satisfies this requirement, the 2^{m }plaintext vectors must be divided into 2^{m−1 }pairs, X and X_{i}, such that X and X_{i }differ only in bit i. Then the 2^{m−1 }exclusive-or sums V_{i}=ƒ(X)⊕ƒ(X_{i}) must be calculated. These exclusive-or sums will be referred to as avalanche vectors, each of which contains n bits, or avalanche variables.
 If this procedure is repeated for all i such that 1≤i≤m and one half of the avalanche variables are equal to 1 for each i, then the function ƒ has a good avalanche effect. Of course this method can be pursued only if m is fairly small; otherwise, the number of plaintext vectors becomes too large. If that is the case then the best that can be done is to take a random sample of plaintext vectors X, and for each value i calculate all avalanche vectors V_{i}. If approximately one half the resulting avalanche variables are equal to 1 for values of i, then we can conclude that the function has a good avalanche effect.
A hash function, also denoted as Φ, is a function that accepts as its input argument an arbitrarily long string of bits (or bytes) and produces a fixed-size output of information. The information in the output is typically called a message digest or digital fingerprint. In other words, a hash function maps a variable length m of input information to a fixed-sized output, Φ(m), which is the message digest or information digest. Typical output sizes range from 160 to 512 bits, but can also be larger. An ideal hash function is a function Φ whose output is uniformly distributed in the following way: suppose the output size of Φ is n bits. If the message m is chosen randomly, then for each of the 2^{n }possible outputs z, the probability that Φ(m)=z is 2^{−n}. In an embodiment, the hash functions that are used are one-way.
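The fixed-size digest and the avalanche behavior discussed above can be illustrated with SHA-512 from Python's standard hashlib module; the helper name and sample inputs are illustrative assumptions:

```python
import hashlib

def digest_bits(message: bytes) -> str:
    """Return the SHA-512 message digest of `message` as a 512-bit string."""
    digest = hashlib.sha512(message).digest()
    return ''.join(format(byte, '08b') for byte in digest)

# Inputs of any length yield a fixed-size 512-bit digest, and a small
# change to the input typically flips roughly half of the output bits
# (the avalanche effect described above).
d1 = digest_bits(b"hiding data in noise")
d2 = digest_bits(b"hiding data in noise!")
```

Comparing d1 and d2 bit by bit gives a rough, single-sample view of the avalanche effect; a proper test would average over many input pairs as described in the quoted procedure.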
A good one-way hash function is also collision resistant. A collision occurs when two distinct information elements are mapped by the one-way hash function Φ to the same digest. Collision resistant means it is computationally intractable for an adversary to find collisions: more precisely, it is computationally intractable to find two distinct information elements m_{1}, m_{2 }where m_{1}≠m_{2 }and such that Φ(m_{1})=Φ(m_{2}).
A number of one-way hash functions may be used to implement one-way hash function 148. In an embodiment, SHA-512, designed by the NSA and standardized by NIST [19], can implement one-way hash function 148. The message digest size of SHA-512 is 512 bits. Other alternative hash functions are of the type that conform with the standard SHA-384, which produces a message digest size of 384 bits. SHA-1 has a message digest size of 160 bits. An embodiment of one-way hash function 148 is Keccak [20]. An embodiment of one-way hash function 148 is BLAKE [21]. An embodiment of one-way hash function 148 is Grøstl [22]. An embodiment of one-way hash function 148 is JH [23]. Another embodiment of a one-way hash function is Skein [24].
7.6 Scatter Map Hiding
A scatter map is a function that permutes data (information) to a sequence of distinct locations inside the random noise. To formally define a scatter map, the location space is defined first.
Definition 1
Let m, n ϵ ℕ, where m≤n. The set _{m,n}={(l_{1}, l_{2}. . . l_{m})ϵ{1, 2, . . . n}^{m}: l_{j}≠l_{k }whenever j≠k} is called an (m,n) location space.
Remark 1.
The location space _{m,n }has n!/(n−m)!=n(n−1) . . . (n−m+1) elements.
Definition 2
Given a location element (l_{1}, l_{2}. . . l_{m})ϵ_{m,n}, the noise locations with respect to (l_{1}, l_{2}. . . l_{m}) are denoted as (l_{1}, l_{2}. . . l_{m})={1, 2, . . . n}−{l_{i}:1≤i≤m}.
Definition 3
An (m,n) scatter map is an element π=(l_{1}, l_{2}. . . l_{m})ϵL_{m,n }such that π: {0,1}^{m}×{0,1}^{nm}→{0,1}^{n }and π(d_{1}, . . . , d_{m}, r_{1}, r_{2}. . . r_{nm})=(s_{1}, . . . s_{n}) where the hiding locations s_{i }are selected as follows. Set s_{l}_{1}=d_{1 }s_{l}_{2}=d_{2}. . . s_{l}_{m}=d_{m}. For the noise locations, set s_{i}_{1}=r_{1 }for the smallest subscript i_{1}ϵ(π). Set s_{i}_{k}=r_{k }for the kth smallest subscript i_{k}ϵ(π).
Definition 3 describes how the scatter map selects the hiding locations of the parts of the key or data hidden in the noise. Furthermore, the scatter map process stores the noise in the remaining locations that do not contain parts of the key or data. Before the scatter map process begins, it is assumed that an element πϵ_{m,n }is randomly selected with a uniform distribution and Alice and Bob already have secret scatter map π=(l_{1}, l_{2}. . . l_{m}).
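The hiding and extraction steps of definition 3 and processes 1 and 2 can be sketched in Python. The function names and the list-of-bits representation are illustrative assumptions, not the patent's implementation:

```python
def scatter_hide(data_bits, noise_bits, locations):
    """Hide data_bits at the 1-based, distinct positions in `locations`
    (the secret scatter map) inside a scatter of length
    len(data_bits) + len(noise_bits).  Noise bits fill the remaining,
    unoccupied positions in increasing order, per definition 3."""
    n = len(data_bits) + len(noise_bits)
    assert len(set(locations)) == len(data_bits)
    scatter = [None] * n
    for bit, loc in zip(data_bits, locations):
        scatter[loc - 1] = bit              # s_{l_i} = d_i
    noise_iter = iter(noise_bits)
    for i in range(n):
        if scatter[i] is None:              # a noise location
            scatter[i] = next(noise_iter)
    return scatter

def scatter_extract(scatter, locations):
    """Recover the hidden data bits using the secret scatter map."""
    return [scatter[loc - 1] for loc in locations]
```

Only a holder of the scatter map can pick the data back out; an observer sees a single bit string in which, when the data and noise distributions match, nothing distinguishes data positions from noise positions.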
Hiding Process 1. Scatter Map Process Hides Data
 Alice retrieves data d_{1 }d_{2}. . . d_{m}.
 Alice generates noise r_{1 }r_{2}. . . r_{ρ }with her random noise generator.
 Per definition 3, Alice uses scatter map π to store her data s_{l}_{1}=d_{1}. . . s_{l}_{m}=d_{m}.
 Per definition 3, Alice stores the noise in the noise (unoccupied) locations of (s_{1}. . . s_{n}) so that the data d_{1}d_{2}. . . d_{m }is hidden in the noise.
 Alice sends (s_{1}. . . s_{n}) to Bob.
 Bob receives (s_{1}. . . s_{n}).
 Bob uses scatter map π to extract data d_{1}. . . d_{m }from (s_{1}. . . s_{n}).
In an embodiment of process 1, scatter map π is executed by scatter map instructions 130 (
In an embodiment of process 1, output system 252 in
Hiding Process 2. Scatter Map Process Hides One or More Keys
 Alice generates one or more keys k_{1 }k_{2 }. . . k_{m }with her random noise generator and random noise r_{1 }r_{2 }. . . r_{ρ}.
 Per definition 3, Alice stores one or more keys s_{l}_{1}=k_{1}. . . s_{l}_{m}=k_{m }using scatter map π.
 Per definition 3, Alice stores the noise r_{1 }r_{2}. . . r_{ρ }in the noise (unoccupied) locations of (s_{1}. . . s_{n}) so that the one or more keys k_{1}k_{2}. . . k_{m }are hidden in the noise.
 Alice sends (s_{1}. . . s_{n}) to Bob.
 Bob receives (s_{1}. . . s_{n}).
 Bob uses scatter map π to extract one or more keys k_{1}. . . k_{m }from (s_{1}. . . s_{n}).
In an embodiment of process 2, scatter map π is executed by scatter map instructions 130 (
In an embodiment of process 2, output system 252 is used during the step in which Alice sends (s_{1}. . . s_{n}) to Bob. Output system 252 is part of sending machine 102 in
When the scatter size is n, process 1 takes n steps to hide the data inside the noise. When the scatter size is n, process 2 takes n steps to hide one or more keys inside the noise. When the bitrate of a random noise generator is x bits per second, then a transmission with scatter size x bits is practical. When x=10,000, a key size of 2000 bits and a noise size of 8000 bits is feasible. When x=20,000, a data size of 5000 bits and a noise size of 15,000 bits is feasible. In some applications, Alice and Bob may also establish the key size or data size m as a shared secret, where m is not disclosed to Eve.
In the interests of being conservative about the security, the mathematical analysis in section 7.9 assumes that Eve knows the data or key size m. For applications where Eve doesn't know m, the security will be stronger than the results obtained in the upcoming sections.
7.7 Effective Hiding
This section provides the intuition for effective hiding. Effective hiding occurs when Eve obtains no additional information about scatter map σ after Eve observes multiple hidden key or hidden data transmissions. Section 7.8 provides mathematical analysis of this intuition.
The effectiveness of the hiding depends upon the following observation. Even after Eve executes a search algorithm for the data (signal) in the noise, Eve's search algorithm does NOT know when it has found the key or the data because her search algorithm CANNOT distinguish the signal from the noise. This is illustrated by
The pixel values in
possibilities for scatter map σ. Even if Eve's search method stumbles upon the correct sequence of locations, Eve's method has no basis for distinguishing the data from the noise because the key and noise probability distributions are equal. For
In
7.8 Multiple Scattered Data Transmissions
This section analyzes the mathematics of when a scatter map is safest to reuse for multiple, scattered transmissions. Suppose that scatter map πϵ_{m,n }is established with Alice and Bob, according to a uniform probability distribution, and adversary Eve has no information about π. Before Eve sees the first scatter transmission from Alice to Bob, from Eve's perspective, the probability that π=(l_{1}, l_{2}. . . l_{m}) equals 1/|_{m,n}| for each (l_{1}, l_{2}. . . l_{m}) in _{m,n}: in other words, Eve has zero information about π with respect to _{m,n}.
Next, two rules are stated whose purpose is to design embodiments that do not leak information to Eve. Section 7.11 shows some embodiments that authenticate the data or key(s) hidden in the noise. Embodiments that follow these rules help hinder Eve from actively sabotaging Alice and Bob into violating these rules.
Rule 1. New Noise and New Data
For each scattered transmission, described in process 1 or process 2, Alice creates new data d_{1 }. . . d_{m }or creates a new key k_{1 }. . . k_{m}, and Alice also creates new noise r_{1}. . . r_{n−m }from a random number generator that satisfies the no bias and history has no effect properties.
Rule 2. No Auxiliary Information
During the kth scattered transmission, Eve only sees scattered transmission (k); Eve receives no auxiliary information from Alice or Bob. Scattered transmission (k) represents the key(s) or data hidden in the noise.
Theorem 1.
When Eve initially has zero information about π w.r.t. _{m,n}, and rules 1 and 2 hold, then Eve still has zero information about π after she observes scattered transmissions (1), (2), . . . (k).
In a proof of theorem 1, the following terminology is used. i lies in π=(l_{1}, l_{2}. . . l_{m}) if i=l_{j }for some 1≤j≤m. Similarly, i lies outside π if i≠l_{j }for every 1≤j≤m. In this latter case, i is a noise location.
PROOF. Consider the ith bit location in the scattered transmission. Let x_{i}(k) denote the ith bit observed by Eve during the kth scattered transmission. The scatter map π is established before the first transmission based on a uniform probability distribution; rule 1 implies the data generation and noise generation obey the two properties of no bias and history has no effect. These rules imply the conditional probabilities P(x_{i}(k+1)=1 | x_{i}(k)=b)=½=P(x_{i}(k+1)=0 | x_{i}(k)=b) hold for bϵ{0,1}, independent of whether i lies in π or i lies outside π. Rule 2 implies that if Eve's observation of transmissions (1), (2), . . . (k) enabled her to obtain some information, better than
about whether i lies in π or i lies outside π, then this would imply that the probability distribution of the noise is distinct from the probability distribution of the data, which is a contradiction. □
Remark 2.
Theorem 1 is not true if the probability distribution of the noise is distinct from the probability distribution of the data.
In embodiments, remark 2 advises us not to let Alice violate rule 1: an example of what Alice should not do is to send the same data or the same key in multiple executions of process 1 or process 2, while the noise is randomly generated anew for each execution.
7.9 Single Transmission Analysis
The size of the location space is significantly greater than the data or key size. Even for values of n as small as 30,
The uniform distribution of the noise and data generation, together with a large enough noise size, poses Eve with the challenge that even after seeing the transmission (s_{1 }. . . s_{n}), she has almost no more information about the data or key(s) than before the creation of k_{1 }k_{2 }. . . k_{m}. The forthcoming analysis will make this notion of almost no more information more precise.
In some applications, Alice and Bob may also establish the data size m as a shared secret, where m is not disclosed to Eve. In the interests of being conservative about the security, it is assumed that Eve knows the data size m. For applications where Eve doesn't know m, the information security will be stronger than the results obtained in this section.
Processes 1 and 2 are analyzed with counting and asymptotic results that arise from the binomial distribution. First, some preliminary definitions are established.
For 0≤i≤n, define E_{i,n}={rϵ{0,1}^{n}:η_{1}(r)=i}. When n=4, E_{0,4}={0000}, E_{1,4}={0001, 0010, 0100, 1000}, E_{2,4}={0011, 0101, 0110, 1001, 1010, 1100}, E_{3,4}={0111, 1011, 1101, 1110} and E_{4,4}={1111}. Note that E_{i,n }has n!/(i!(n−i)!) elements.
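The sets E_{i,n} can be enumerated directly; a small Python check of the n=4 example above (the function name is an illustrative assumption):

```python
from itertools import product

def E(i, n):
    """E_{i,n}: all n-bit strings containing exactly i ones, in the
    increasing numeric order used by the text."""
    return [''.join(bits) for bits in product('01', repeat=n)
            if bits.count('1') == i]
```

Enumerating in lexicographic order of '0' and '1' reproduces the increasing-natural-number ordering used for "the ith element of E_{k,n}".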
The expression—ith element of E_{k,n}—refers to ordering the set E_{k,n }according to an increasing sequence of natural numbers that each binary string represents and selecting the ith element of this ordering. For example, the 3rd element of E_{2,4 }is 0110.
In table 1, event B_{i,j }refers to the ith data in E_{j,m}. Event R_{i }refers to the set of random noise elements which have i ones, and the noise size is ρ=n−m. Event A_{i }refers to a scatter (s_{1 }. . . s_{n}) which contains i ones.
Equation 7.1 follows from the independence of events R_{k }and B_{l,j}.
P(R_{k}∩B_{l,j})=P(R_{k})P(B_{l,j}) (7.1)
whenever 0≤k≤ρ and 0≤j≤m and
Equation 7.2 follows from the definitions in table 1; η_{1}(s_{1}. . . s_{n})=η_{1}(r_{1}. . . r_{ρ})+η_{1}(k_{1}. . . k_{m}); and the meaning of conditional probability.
whenever 0≤j≤min{k,m} and
A finite sample space and
imply that each event
Furthermore, B_{l}_{1}_{,j}_{1}∩B_{l}_{2}_{,j}_{2}=∅ whenever l_{1}≠l_{2 }or j_{1}≠j_{2}, such that 0≤j_{1}, j_{2}≤m and 1≤l_{1}≤|E_{j}_{1}_{,m}| and 1≤l_{2}≤|E_{j}_{2}_{,m}|. Thus, Bayes Law is applicable. Equation 7.3 follows from Bayes Law and the derivation below 7.3.
whenever 0≤j≤min{k,m} and
The mathematical steps that establish equation 7.3 are shown below.
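The displayed equations referenced as 7.2 and 7.3 are not reproduced in this text. Under the definitions in table 1, and assuming unbiased, independent key and noise bits, one consistent reconstruction, offered as a sketch rather than the patent's exact display, is:

```latex
% Probability that the scatter contains k ones, given a fixed data
% element with j ones (cf. equation 7.2), since the noise must then
% contribute k - j ones:
P(A_k \mid B_{l,j}) \;=\; P(R_{k-j}) \;=\; \binom{\rho}{k-j}\, 2^{-\rho}.

% Applying Bayes' Law with P(B_{l,j}) = 2^{-m} and |E_{i,m}| = \binom{m}{i}
% (cf. equation 7.3):
P(B_{l,j} \mid A_k)
  \;=\; \frac{\binom{\rho}{k-j}}
             {\sum_{i=0}^{m} \binom{m}{i} \binom{\rho}{k-i}}.
```

When k(ρ) stays within c standard deviations of ρ/2, each ratio of binomial coefficients in the denominator tends to 1 as ρ grows, so the expression tends to 2^{−m}, consistent with theorem 4.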
Definition 4
Let c be a positive integer. ƒ: ℕ → ℕ is called a binomial c-standard-deviations function if there exists N ϵ ℕ such that whenever ρ≥N, then
Define the function
The function h_{c }is a binomial c-standard-deviations function. Lemmas 2 and 3 may be part of the binomial distribution folklore; for the sake of completeness, they are proven below.
Lemma 2.
Let k: ℕ → ℕ be a binomial c-standard-deviations function. Then
PROOF. A simple calculation shows that
Since k(ρ) is a binomial c-standard-deviations function,
This implies
Thus,
(7.4)
Since
apply the squeeze theorem to equation 7.4. □
The work from lemma 2 helps prove lemma 3. Lemma 3 helps prove that equation 7.3 converges to 2^{−m }when k(ρ) is a binomial cstandard deviations function.
Lemma 3.
Fix m ϵ ℕ. Let k: ℕ → ℕ be a binomial c-standard-deviations function. For any b,j such that 0≤b, j≤m, then
PROOF. Using a computation similar to equation 7.4, inside of c+1 standard deviations instead of c, ρ can be made large enough so that k(ρ)−b and k(ρ)−j lie within c+1 standard deviations, so that
where 0≤i≤m. W.L.O.G., suppose j<b. Thus,
Theorem 4.
Fix data size m ϵ ℕ. Let c ϵ ℕ. Let k: ℕ → ℕ be a binomial c-standard-deviations function. Then
PROOF.
Remark 3.
Theorem 4 is not true when k(ρ) stays on or near the boundary of Pascal's triangle. Consider
The math confirms common sense: namely, if Eve sees event A_{0}, then Eve knows that Alice's data is all zeroes. A practical and large enough noise size enables process 1 or process 2 to effectively hide the data transmission so that outlier events such as A_{0}, A_{1 }do not occur in practice. For example, when n=2048, P(A_{0})=2^{−2048 }and P(A_{1})=2048·2^{−2048}=2^{−2037}.
Definitions 5, 6 and theorems 5, 6 provide a basis for calculating how big the noise size should be in order to establish an extremely low probability that Eve will see outlier events such as A_{0}.
Definition 5
f: ℕ → ℕ is a binomial ϵ-tail function if there exists N ϵ ℕ such that n≥N implies that
The area under the standard normal curve from −∞ to x is expressed as
Theorem 5.
For each c ϵ ℕ, set ϵ_{c}=4Φ(−c). The function
is a binomial ϵ_{c}tail function.
PROOF. This is an immediate consequence of the central limit theorem [26, 27], applied to the binomial distribution. Some details are provided.
Define
In [28] DeMoivre proved for each fixed x that
Thus,
Now ϵ_{c }is four times the value of
which verifies that g_{c }is a binomial ϵ_{c}tail function. □
EXAMPLE 1
This example provides some perspective on some ϵ-tails and Eve's conditional probabilities. For n=2500, the scatter mean μ is 1250 and the standard deviation σ=25.
Set c=20, so μ−cσ=750. A calculation shows that
For n=4096, the scatter mean is 2048 and the standard deviation σ=32. Set c=50 standard deviations, so μ−cσ=448. A calculation shows that
Some of Eve's conditional probabilities are calculated for n=2500 and data size m=576. The average number of 1's in a key is μ_{key}=288 and the standard deviation σ_{key}=12.
A typical case is when j=300 and k=1275, which are both one standard deviation to the right of the data and scatter mean, respectively. When Eve's conditional probability equals 2^{−m}, the secrecy ratio is exactly 1. Using equation 7.3, a computer calculation shows that the secrecy ratio is
so 2^{−576}<P(B_{l,300}|A_{1275})<2^{−575}.
A rare event is when j=228 and k=1225. That is, j=228 is five standard deviations to the left of μ_{key }and k=1225 is one standard deviation to the left of the scatter mean. A calculation shows that
Thus, 2^{−577}<P(B_{l,228}|A_{1225})<2^{−576}.
An extremely rare event occurs when j=228 and k=1125. Event A_{1125 }is 4 standard deviations to the left.
Thus, 2^{−565}<P(B_{l,228}|A_{1125})<2^{−564}. While a secrecy ratio of 3840 is quite skew, it still means that even if Eve sees a scatter transmission 4 standard deviations to the left, there is still a probability in the interval [2^{−565}, 2^{−564}] of Alice's data element being the event B_{l,228}.
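Eve's conditional probabilities can be checked numerically. The closed form below is a reconstruction of equation 7.3 via Bayes' Law from the events in table 1 (an assumption, since the displayed equation is not reproduced here); a brute-force enumeration over all equally likely (data, noise) pairs confirms it for small parameters:

```python
from fractions import Fraction
from itertools import product
from math import comb

def cond_prob(m, rho, j, k):
    """Reconstructed equation 7.3: probability that the hidden data is a
    particular m-bit string with j ones, given that the n-bit scatter
    contains k ones (n = m + rho)."""
    denom = sum(comb(m, i) * comb(rho, k - i)
                for i in range(m + 1) if 0 <= k - i <= rho)
    return Fraction(comb(rho, k - j), denom)

def cond_prob_brute(m, rho, j, k):
    """Brute-force check: count equally likely (data, noise) pairs that
    are consistent with the conditioning event A_k.  Only the number of
    ones matters, so the scatter map locations can be ignored here."""
    target = ['1'] * j + ['0'] * (m - j)     # one fixed data string with j ones
    matching = total = 0
    for d in product('01', repeat=m):
        for r in product('01', repeat=rho):
            if (d + r).count('1') == k:      # scatter contains k ones
                total += 1
                if list(d) == target:
                    matching += 1
    return Fraction(matching, total)
```

For m=2, ρ=3, j=1, k=2 both methods give 3/10; such agreement on small cases lends confidence when the closed form is then evaluated at the large parameters of example 1.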
Even when Eve sees a highly skewed, scattered transmission and obtains some information about the current hidden data element, Eve's observation provides her with no information about the next data element hidden in a subsequent transmission. The secrecy ratio calculations in example 1 provide the motivation for definition 6.
Definition 6
Let ϵ>0. Eve's conditional probabilities P(B_{l,j}|A_{k(ρ)}) are ϵ-close to perfect secrecy if there exists a binomial ϵ-tail function ƒ such that for any function k: ℕ → ℕ satisfying ƒ(ρ)≤k(ρ)≤ρ−ƒ(ρ), then
Theorem 6.
For any ϵ>0, there exists M ϵ ℕ such that ϵ_{c}<ϵ for all c≥M and c ϵ ℕ. Furthermore, function g_{c }is a binomial ϵ_{c}-tail function that makes Eve's conditional probabilities P(B_{l,j}|A_{k(ρ)}) ϵ_{c}-close to perfect secrecy, where g_{c}(ρ)≤k(ρ)≤ρ−g_{c}(ρ).
PROOF. Since
there exists M ϵ ℕ such that ϵ_{c}<ϵ for all c≥M. Recall that
For all ρ ϵ ℕ, g_{c}(ρ)−h_{c}(ρ)≤1 and g_{c}(4ρ^{2})−h_{c}(4ρ^{2})=0. This fact, together with h_{c }being a binomial c-standard-deviations function, implies that lemma 3 and hence theorem 4 also hold for function g_{c}. That is,
Whenever function k satisfies g_{c}(ρ)≤k(ρ)≤ρ−g_{c}(ρ), k is a binomial (c+1)-standard-deviations function. Thus, this theorem immediately follows from theorems 4, 5 and from definition 6. □
7.10 Data Transformations
In some embodiments, the key or data may be transformed by the sender (Alice) before being scattered and subsequently transmitted to the receiver (Bob). In an embodiment, each bit of the key or the data may be transformed according to the map Φ:{0,1}→{01,10} where Φ(0)=01 and Φ(1)=10. Suppose the data is K=010010000. Φ^{−1 }denotes the inverse of Φ. The inverse of Φ is used by Bob to reconstruct the data d_{1 }d_{2 }. . . d_{m }from the transformed data t_{1 }t_{2 }. . . t_{2m}, after Bob extracts t_{1 }t_{2 }. . . t_{2m }from the scattered transmission received from Alice. Note that Φ^{−1}(01)=0 and Φ^{−1}(10)=1. In some embodiments, data transformation instructions 126, shown in
After applying Φ to each bit of K, the transformation is Φ(0)Φ(1)Φ(0)Φ(0)Φ(1)Φ(0)Φ(0)Φ(0)Φ(0)=01 10 01 01 10 01 01 01 01. After this transformation by Φ, each of these 18 bits is scattered inside random noise. Suppose K is scattered inside of 130 bits of noise, then the location space will be _{18,148}. A scatter map π in _{18,148 }has 18 locations. That is, π=(l_{1}, l_{2}, . . . l_{18}) and each l_{i }satisfies 1≤l_{i}≤148.
In alternative embodiments, the map Ψ:{0,1}→{01,10}, where Ψ(0)=10 and Ψ(1)=01, may be used to transform the data before scattering (hiding) the data inside the noise. In an embodiment, the map Ψ transforms the 16 bits of data 0100 1110 1001 0110 to a 32-bit transformation 10011010 01010110 01101001 10010110, before this 32-bit transformation is scattered by the sender (Alice) inside of random noise. After Bob extracts the transformed data 10011010 01010110 01101001 10010110 from the scattered transmission, Bob applies the inverse of Ψ to each substring of two bits. For the first two bits, Ψ^{−1}(10)=0, so d_{1}=0. For bits 3 and 4, Bob computes Ψ^{−1}(01)=1, so Bob reconstructs d_{2}=1. For bits 5 and 6, Bob computes Ψ^{−1}(10)=0, so his third reconstructed data bit d_{3}=0. Bob continues this reconstruction up to the 16th bit of data, computed from bits 31 and 32: Ψ^{−1}(10)=0, reconstructing bit d_{16}=0. In some embodiments, data transformation instructions 126, shown in
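The transformation Ψ and its inverse can be sketched in a few lines of Python; the bit-string representation and function names are illustrative assumptions:

```python
def psi(bits: str) -> str:
    """Transformation Ψ: each 0 bit becomes '10' and each 1 bit becomes '01'."""
    return ''.join('01' if b == '1' else '10' for b in bits)

def psi_inverse(transformed: str) -> str:
    """Inverse of Ψ: map each two-bit block back to one data bit."""
    assert len(transformed) % 2 == 0
    out = []
    for i in range(0, len(transformed), 2):
        pair = transformed[i:i + 2]
        assert pair in ('01', '10')   # '00' or '11' would signal corruption
        out.append('1' if pair == '01' else '0')
    return ''.join(out)
```

Applying psi to the worked example 0100111010010110 reproduces the 32-bit transformation given above, and psi_inverse recovers the original 16 bits. A side effect of Ψ (and of Φ) is that the transformed string always has exactly one 1 per two-bit block, which changes its statistics relative to unbiased noise.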
Before the scatter map process using a data transformation is started, an element πϵ_{2m,n }is randomly selected and securely distributed to Alice and Bob. Note 2m<n.
Hiding Process 3. Scatter Map Method using a Data Transformation Ψ

 Alice and Bob already have secret scatter map π=(l_{1}, l_{2}. . . l_{2m}).
 A. Alice generates data d_{1 }d_{2}. . . d_{m }and noise r_{1 }r_{2 }. . . r_{n−2m }from her random noise generator.
 B. Alice transforms data d_{1 }d_{2}. . . d_{m }to t_{1 }t_{2}. . . t_{2m }with transformation Ψ.
 C. According to definition 3, Alice uses π to set s_{l}_{1}=t_{1}. . . s_{l}_{2m}=t_{2m}.
 D. Per definition 3, Alice stores the noise at the noise (unoccupied) locations in (s_{1}. . . s_{n}) so that the transformed data is hidden inside the noise.
 E. Alice sends (s_{1}. . . s_{n}) to Bob.
 F. Bob uses scatter map π to extract the transformed data t_{1}. . . t_{2m }from (s_{1}. . . s_{n}).
 G. Bob applies the inverse of Ψ to t_{1}. . . t_{2m }and reads data d_{1 }d_{2}. . . d_{m}.
7.11 Hiding Data Elements with Authentication
It is assumed that Alice and Bob have previously established secret scatter map σ=(l_{1}, l_{2}. . . l_{m}) and authentication key κ. In some embodiments, Alice and Bob may establish scatter map σ and authentication key κ with a Diffie-Hellman-Merkle exchange [29, 30], where their public keys are signed in a secure computing or private manufacturing environment; alternatively, in other embodiments, Alice and Bob may establish σ and κ via a different channel, or in the same physical location by a face-to-face exchange, or using physical delivery by a mutually trusted courier.
Let h_{κ }denote a MAC (e.g., HMAC [31] or [32]) function which will be used to authenticate the scattered transmission. The use of h_{κ }helps hinder the following attack by Eve. An active Eve could flip a bit at bit location l in the scattered transmission. If no authentication occurs on the noise and the hidden key bits, then upon Alice resending a scattered transmission, due to Alice and Bob not arriving at the same session key secret, Eve gains information that l lies in σ. If the scattered transmission is not authenticated, Eve's manipulation of the bits in the transmission helps her violate rule 2.
Hiding Process 4. Hiding One or More Keys with Authentication
 Alice's random noise generator creates one or more keys k_{1} k_{2} . . . k_{m} and random noise r_{1} r_{2} . . . r_{ρ}.
 Per the scatter map definition, Alice uses scatter map σ to set s_{l_1}=k_{1} . . . s_{l_m}=k_{m}.
 Alice stores the noise r_{1} r_{2} . . . r_{ρ} at the noise (unoccupied) locations in S=(s_{1} . . . s_{n}) so that her one or more keys k_{1} k_{2} . . . k_{m} are hidden inside the noise.
 Alice sends S and h_{κ}(S) to Bob.
 Bob receives S′ and h_{κ}(S) from Alice. Bob computes h_{κ}(S′) and checks it against h_{κ}(S).
 If h_{κ}(S′) is valid, Bob uses scatter map σ to extract the one or more keys k_{1} . . . k_{m} from S′; else Bob rejects S′ and asks Alice to resend S.
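Process 4 can be sketched with Python's standard `hmac` module playing the role of h_κ (HMAC [31] is one of the options the text names). The bit-per-byte layout and the function names are simplifications invented for this illustration.

```python
import hashlib
import hmac
import secrets

def hide_keys(key_bits, sigma, n, auth_key):
    """Scatter key bits at sigma's locations, fill the rest with noise,
    and tag the whole transmission with HMAC-SHA256 (one choice of h_kappa)."""
    s = bytearray(secrets.randbits(1) for _ in range(n))  # noise at every location
    for bit, loc in zip(key_bits, sigma):                 # keys at sigma's locations
        s[loc] = bit
    tag = hmac.new(auth_key, bytes(s), hashlib.sha256).digest()
    return bytes(s), tag

def receive_keys(s_received, tag, sigma, auth_key):
    """Bob recomputes the tag; on mismatch he rejects and asks for a resend.
    This is the check that stops Eve's bit-flip probe of the scatter map."""
    expected = hmac.new(auth_key, s_received, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None                                       # reject, request resend
    return [s_received[loc] for loc in sigma]
```

Flipping even one bit of the transmission makes the tag check fail, so Bob rejects it rather than leaking which locations lie in σ.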
In some embodiments of process 4, scatter map σ is executed by scatter map instructions 130 (
In some embodiments of process 4, the probability distribution of the data elements is biased and the probability distribution of the noise is biased. In preferred embodiments, the probability distribution of the data elements is the same as the probability distribution of the noise, even though they are both biased. In some embodiments, the probability distribution of the data elements is almost the same as the probability distribution of the noise. Almost the same probability distribution means that an average hacker who is eavesdropping on the hidden data transmissions would not be able to find where the data is being hidden after seeing the hidden transmissions for a reasonable amount of time. In an embodiment, a reasonable amount of time is 3 months. In another embodiment, a reasonable amount of time is 1 year. In another embodiment, a reasonable amount of time is 5 years.
In other embodiments, Alice encrypts plaintext data d_{1}, . . . d_{m }with a block or stream cipher before the encrypted data e_{1}, . . . e_{m }is hidden in random noise; this is described in process 5 below.
Hiding Process 5. Hiding Encrypted Data Elements with Authentication
 Alice uses encryption algorithm E and key K to encrypt data M=d_{1}d_{2} . . . d_{m} as E(M, K)=e_{1}e_{2} . . . e_{m}.
 Per the scatter map definition, Alice uses scatter map σ to set s_{l_1}=e_{1} . . . s_{l_m}=e_{m}.
 Alice's random noise generator creates noise r_{1} r_{2} . . . r_{ρ}.
 Alice stores the noise r_{1} r_{2} . . . r_{ρ} at the noise (unoccupied) locations in S=(s_{1} . . . s_{n}) so that the encrypted data e_{1}e_{2} . . . e_{m} is hidden inside the noise.
 Alice sends S and h_{κ}(S) to Bob.
 Bob receives S′ and h_{κ}(S) from Alice. Bob computes h_{κ}(S′) and checks it against h_{κ}(S).
 If h_{κ}(S′) is valid, Bob uses scatter map σ to extract e_{1} . . . e_{m} from S′ and subsequently uses decryption algorithm D and key K to decrypt e_{1} . . . e_{m} and obtain d_{1} . . . d_{m};
 else Bob rejects S′ and asks Alice to resend S.
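The encrypt-then-hide ordering of process 5 can be sketched as follows. This sketch substitutes a toy SHA-256 counter-mode keystream for the patent's cipher (e.g. Serpent), scatters whole ciphertext bytes rather than bits, and invents all function names.

```python
import hashlib
import hmac
import secrets

def toy_stream_cipher(data: bytes, key: bytes) -> bytes:
    """XOR with a SHA-256 counter-mode keystream. A stand-in for the patent's
    cipher E; encryption and decryption are the same operation."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

def hide_encrypted(data, enc_key, sigma, n, auth_key):
    """Process 5, byte-granular: encrypt, scatter the ciphertext into noise, tag."""
    e = toy_stream_cipher(data, enc_key)
    s = bytearray(secrets.token_bytes(n))        # noise at every location
    for byte, loc in zip(e, sigma):              # ciphertext at sigma's locations
        s[loc] = byte
    tag = hmac.new(auth_key, bytes(s), hashlib.sha256).digest()
    return bytes(s), tag

def recover(s, tag, sigma, enc_key, auth_key):
    """Bob's side: authenticate, extract at sigma's locations, then decrypt."""
    if not hmac.compare_digest(tag, hmac.new(auth_key, s, hashlib.sha256).digest()):
        return None
    e = bytes(s[loc] for loc in sigma)
    return toy_stream_cipher(e, enc_key)         # same keystream decrypts
```

Because the ciphertext is already pseudorandom, its bytes are indistinguishable from the surrounding noise bytes, which is what makes the hiding step effective.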
In some embodiments of process 5, scatter map σ is executed by scatter map instructions 130 (
In some embodiments of process 5, encryption algorithm E is the block cipher Serpent [14] and is executed with cipher instructions 129 as shown in
In some embodiments of process 5, encryption algorithm E is a block cipher and also uses the cipher block chaining mode. In some embodiments of process 5, encryption algorithm E is a stream cipher.
7.12 Some Complexity Analysis of Hidden Encrypted Data
Suppose that the encrypted data element e_{1}e_{2}. . . e_{128 }has 128 bits and these bits are hidden inside of 128 bits r_{1}r_{2 }. . . r_{128 }of random noise. In an embodiment following process 5, block cipher Serpent is executed with cipher instructions 126 to encrypt the data element as e_{1}e_{2 }. . . e_{128 }before scatter map instructions 130 are applied to hide encrypted bits e_{1}e_{2 }. . . e_{128 }in random noise r_{1}r_{2 }. . . r_{128 }produced by random number generator 128.
The hiding of encrypted bits e_{1}e_{2 }. . . e_{128 }by scatter map instructions 130 is shown in
When Eve does not receive any auxiliary information (that is, rule 2 holds), it is extremely unlikely that Eve can extract any information about the bit locations even after Eve observes 625,000 encrypted data elements, each hidden in 128 bits of noise. Even if Eve has the computing power to brute-force search through each scatter map σ ϵ Σ_{128,256} and subsequently find data element e_{1} . . . e_{128}, Eve still has no way of knowing whether this particular σ is the one that Alice used to hide encrypted bits e_{1}e_{2} . . . e_{128}. Eve needs some auxiliary information.
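The size of the search space Eve faces can be computed directly. Under the simplifying assumption that a scatter map is an unordered choice of which 128 of the 256 transmitted bits carry data, the count is the central binomial coefficient:

```python
import math

# Number of ways to choose which 128 of 256 bit positions hold the
# encrypted data (unordered choice); roughly 5.8 x 10^75.
subsets = math.comb(256, 128)
print(f"{subsets:.3e}")

# If the scatter map also fixes the order of those 128 positions,
# multiply by 128! -- a far larger count still.
ordered_maps = subsets * math.factorial(128)
```

Even the smaller, unordered count dwarfs any feasible brute-force search, which is why auxiliary information is essential to Eve.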
7.13 The Scatter Map Process Hides One-Time Locks
Consider the following cryptographic method. Alice places her one-time lock a on message m and transmits m⊕a to Bob. Bob applies his one-time lock b and sends m⊕a⊕b back to Alice. Alice removes her lock by applying a to m⊕a⊕b and sends m⊕b back to Bob. Bob removes lock b from m⊕b to read message m. This method of one-time locks is vulnerable if Eve can see the three transmissions m⊕a, m⊕a⊕b and m⊕b, because Eve can compute m=(m⊕a)⊕(m⊕a⊕b)⊕(m⊕b).
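This three-pass exchange, and Eve's recovery of m from the three transmissions, can be reproduced directly in Python (`secrets` supplies the one-time locks; the message is arbitrary):

```python
import secrets

def xor(x: bytes, y: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(p ^ q for p, q in zip(x, y))

m = b"attack at dawn"
a = secrets.token_bytes(len(m))   # Alice's one-time lock
b = secrets.token_bytes(len(m))   # Bob's one-time lock

t1 = xor(m, a)        # Alice -> Bob:   m XOR a
t2 = xor(t1, b)       # Bob -> Alice:   m XOR a XOR b
t3 = xor(t2, a)       # Alice -> Bob:   m XOR b (her lock removed)
assert xor(t3, b) == m            # Bob reads the message

# Eve, having observed all three transmissions, needs no key at all:
assert xor(xor(t1, t2), t3) == m
```

The final assertion is exactly the attack in the text: XORing the three observed transmissions cancels both locks and exposes m.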
In an embodiment, process 6 protects these one-time locks by using two distinct and independent scatter maps π_{A}, π_{B} to hide each transmission inside a new generation of random noise. Independent means that any information given to Eve about π_{B} tells Eve nothing about π_{A} and vice versa. In terms of conditional probabilities, independence means P(π_{A}=(l_{1} . . . l_{κ}) ϵ Σ_{κ,n} | π_{B}=(j_{1} . . . j_{κ})) = P(π_{A}=(l_{1} . . . l_{κ}) ϵ Σ_{κ,n}). Using these independent scatter maps, Eve is no longer able to see the three transmissions m⊕a, m⊕a⊕b and m⊕b, because the encrypted data m⊕a, the twice encrypted data m⊕a⊕b and the second party encrypted data m⊕b are each hidden inside a new generation of random noise.
Hiding Process 6. Scattered One-Time Locks
In an alternative embodiment, Alice and Bob use a third, distinct scatter map π_{C}, created independently from π_{A }and π_{B}. Scatter map π_{C }helps Alice scatter b_{1}⊕m_{1 }. . . b_{κ}⊕m_{κ} after removing her lock. This alternative embodiment is shown in process 7.
Hiding Process 7. Scattered One-Time Locks with 3 Scatter Maps
In an embodiment of process 7, scatter maps π_{A}, π_{B }and π_{C }are executed by scatter map instructions 130 (
In an embodiment of process 7, output system 252 in
In other alternative embodiments, the message size κ is known to Eve.
In preferred embodiments, each scattered transmission should use a new lock and new noise. For example, if due to a failed transmission, Alice or Bob generated new noise but transmitted the same values of a_{1}⊕m_{1} . . . a_{κ}⊕m_{κ}, b_{1}⊕m_{1} . . . b_{κ}⊕m_{κ}, and b_{1}⊕a_{1}⊕m_{1} . . . b_{κ}⊕a_{κ}⊕m_{κ}, then Eve could run a matching or correlation algorithm between the scattered transmissions in order to extract a permutation of message m_{1} . . . m_{κ}. During any kind of failed transmission, Alice and Bob should generate new locks from their respective random noise generators, just as they have to do for every iteration of the while loop in process 6.
In process 6, Alice's lock a_{1}. . . a_{κ} is generated from her random noise generator. Hence, for every (x_{1}, . . . , x_{κ})ϵ{0,1}^{κ}, the probability P(a_{1}⊕m_{1}=x_{1}, . . . a_{κ}⊕m_{κ}=x_{κ})=2^{−κ}. Similarly, Bob's lock b_{1}. . . b_{κ} is generated from his random noise generator, so the probability P(b_{1}⊕m_{1}=x_{1}, . . . b_{κ}⊕m_{κ}=x_{κ})=2^{−κ} for every (x_{1}, . . . , x_{κ})ϵ{0,1}^{κ}.
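The uniformity claim, that each locked bit is an unbiased coin regardless of the underlying message bit, can be checked empirically with a short Monte Carlo experiment (illustrative only, not part of the processes above):

```python
import secrets

# For either fixed message bit, XOR with a fresh random lock bit behaves
# like a fair coin, so the locked value carries no information about m.
trials = 100_000
for m_bit in (0, 1):
    ones = sum(secrets.randbits(1) ^ m_bit for _ in range(trials))
    assert abs(ones / trials - 0.5) < 0.02   # close to 1/2 for both message bits
print("locked bits are indistinguishable from fair coin flips")
```

This is the empirical counterpart of P(a_{1}⊕m_{1}=x_{1}, . . . , a_{κ}⊕m_{κ}=x_{κ})=2^{−κ}: the observed frequency of 1s stays near 1/2 whether the message bit is 0 or 1.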
Although the invention(s) have been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the true spirit and scope of the invention. In addition, modifications may be made without departing from the essential teachings of the invention.
REFERENCES
 [1] Wikipedia. Transmission Control Protocol/Internet Protocol. en.m.wikipedia.org/wiki/TCP_IP
 [2] Claude Shannon. Communication Theory of Secrecy Systems. 1949.
 [3] Charles H. Bennett and Gilles Brassard. Quantum cryptography: Public key distribution and coin tossing. In Proceedings of IEEE International Conference on Computers, Systems and Signal Processing. 175-179. New York, 1984.
 [4] Charles H. Bennett, Francois Bessette, Gilles Brassard, Louis Salvail and John Smolin. Experimental Quantum Cryptography. Journal of Cryptology. 5, No. 1, 3-28, 1992.
 [5] P.D. Townsend, J.G. Rarity and P.R. Tapster. Single photon interference in a 10 km long optical fibre interferometer. Electronics Letters. 29, No. 7, 634-635, April 1993.
 [6] P.D. Townsend, J.G. Rarity and P.R. Tapster. Enhanced single photon fringe visibility in a 10 km long prototype quantum cryptography channel. Electronics Letters. 29, No. 14, 1291-1293, July 1993.
 [7] Gilbert Vernam. Cipher printing telegraph systems for secret wire and radio telegraphic communications. J. Amer. Inst. Elect. Eng. 55, 109-115, 1926.
 [8] Alan M. Turing. On computable numbers, with an application to the Entscheidungsproblem. Proc. London Math. Soc. Series 2, 42 (Parts 3 and 4), 230-265, 1936. A correction, ibid. 43, 544-546, 1937.
 [9] Andre Stefanov, Nicolas Gisin, Olivier Guinnard, Laurent Guinnard, and Hugo Zbinden. Optical quantum random number generator. Journal of Modern Optics. 47(4), 595-598, 2000.
 [10] Mario Stipcevic and B. Medved Rogina. Quantum random number generator based on photonic emission in semiconductors. Review of Scientific Instruments. 78, 045104, 1-7, 2007.
 [11] Simon Kochen and Ernst P. Specker. The Problem of Hidden Variables in Quantum Mechanics. Journal of Mathematics and Mechanics (now Indiana Univ. Math Journal). 17, No. 1, 59-87, 1967.
 [12] John Conway and Simon Kochen. The Strong Free Will Theorem. Notices of the American Mathematical Society. 56(2), 226-232, February 2009.
 [13] NIST. Advanced Encryption Standard (AES), FIPS 197. November 2001. csrc.nist.gov/publications/fips/fips197/fips197.pdf
 [14] Ross Anderson, Eli Biham, Lars Knudsen. Serpent: A Proposal for the Advanced Encryption Standard. www.cl.cam.ac.uk/~rja14/Papers/serpent.pdf www.cl.cam.ac.uk/~rja14/serpent.html
 [15] R.L. Rivest, A. Shamir, and L. Adleman. A method for obtaining digital signatures and public-key cryptosystems. Communications of the ACM. 21, 120-126, 1978.
 [16] Stephen Cook. The P vs NP Problem. www.claymath.org/sites/default/files/pvsnp.pdf
 [17] Klint Finley. Chinese Supercomputer Is Still the World's Most Powerful. Wired Magazine. Nov. 18, 2013.
 [18] A.F. Webster and S.E. Tavares. On the Design of S-Boxes. Advances in Cryptology. CRYPTO '85 Proceedings. LNCS 218. Springer, 523-534, 1986.
 [19] NIST. FIPS 180-2: Secure Hash Standard. August 2002. www.itl.nist.gov/fipspubs/
 [20] Guido Bertoni, Joan Daemen, Michael Peeters, Gilles Van Assche. Keccak Reference 3.0. 2011. keccak.noekeon.org en.wikipedia.org/wiki/Keccak
 [21] Jean-Philippe Aumasson, Samuel Neves, Zooko Wilcox-O'Hearn, Christian Winnerlein. BLAKE. 131002.net/blake/ en.wikipedia.org/wiki/BLAKE_(hash_function)
 [22] Praveen Gauravaram, Lars Knudsen, Krystian Matusiewicz, Florian Mendel, Christian Rechberger, Martin Schläffer, and Søren S. Thomsen. Grøstl, a SHA-3 candidate. www.groestl.info www.groestl.info/Groestl.pdf
 [23] Hongjun Wu. The Hash Function JH. 2011. ehash.iaik.tugraz.at/wiki/JH www3.ntu.edu.sg/home/wuhj/research/jh/jh_round3.pdf
 [24] Niels Ferguson, Stefan Lucks, Bruce Schneier, Doug Whiting, Mihir Bellare, Tadayoshi Kohno, Jon Callas, Jesse Walker. The Skein Hash Function Family. 2010. www.schneier.com/skein1.3.pdf en.wikipedia.org/wiki/Skein_(hash_function)
 [25] Thomas Bayes. An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London. 53, 370-418, 1764.
 [26] William Feller. An Introduction to Probability Theory and Its Applications. Volume II. John Wiley. 1966.
 [27] Harald Cramer. Mathematical Methods of Statistics. Princeton University Press. 1946.
 [28] Abraham De Moivre. The Doctrine of Chances: or, A Method of Calculating the Probabilities of Events in Play. 1st edition, London, 1718; 2nd edition, 1738; 3rd edition, 1756.
 [29] Ralph C. Merkle. Secure Communications over Insecure Channels. Communications of the ACM. 21(4), 294-299, April 1978.
 [30] Whitfield Diffie and Martin Hellman. New directions in cryptography. IEEE Transactions on Information Theory. 22, 644-654, 1976.
 [31] Mihir Bellare, Ran Canetti and Hugo Krawczyk. Keying Hash Functions for Message Authentication. Advances in Cryptology, Crypto '96 Proceedings. LNCS 1109, N. Koblitz ed., Springer, 1996.
 [32] Mark Wegman and J. Lawrence Carter. New Hash Functions and Their Use in Authentication and Set Equality. Journal of Computer and System Sciences. 22, 265-279, 1981.
Claims
1. A process comprising:
 generating noise from a nondeterministic generator of a machine of a first party, the machine of the first party having a processor system and a memory system, the processor system including one or more processors;
 generating one or more keys from the nondeterministic generator of the machine of the first party; and
 hiding the one or more keys inside the noise, by the machine of the first party, by at least placing the one or more keys at locations based on a shared secret map, and the noise at other locations based on the shared secret map; and
 transmitting, by the machine of the first party, the one or more keys while hidden in the noise, to a machine of a second party that is capable of generating the shared secret map.
2. The process of claim 1, wherein each of the one or more keys has a plurality of parts, the process further comprising:
 selecting, by the machine, a hiding location for each part of the plurality of parts of the one or more keys; storing each part of the plurality of parts of the one or more keys in the hiding location that was selected;
 storing, by the machine, the noise in remaining locations that are unoccupied by parts of the one or more keys.
3. The process of claim 1 further comprising: generating, by the machine, the noise based at least on a behavior of photons.
4. The process of claim 3 further comprising: emitting said photons from a light emitting diode.
5. The process of claim 3 further comprising:
 generating, by the machine, the noise based at least on arrival times of emitted photons.
6. The process of claim 3 further comprising:
 generating, by the machine, the noise based at least on threshold times, wherein if a photon of the photons arrives prior to the threshold time, a bit is set to one value and if the photon of the photons arrives after the threshold time the bit is set to another value.
7. The process of claim 1 further comprising:
 generating, by the machine, the one or more keys based at least on a behavior of photons.
8. The process of claim 7 further comprising: emitting the photons from a light emitting diode.
9. The process of claim 7, wherein the generating of the one or more keys at least partly depends on at least photocurrent values in a photodiode compared to an average value of the photocurrent.
10. The process of claim 7, wherein said photons are absorbed by a photo detector.
11. The process of claim 10, wherein said photons are absorbed by a phototransistor.
12. The process of claim 10, wherein said photons are absorbed by a photodiode.
13. The process of claim 1 further comprising:
 the second party receives the one or more keys, which were hidden inside the noise;
 the second party computes the shared secret map from a secret to find the hiding locations of the parts of the one or more keys;
 the second party extracts the one or more keys from the noise with the hiding locations.
14. The process of claim 1 further comprising:
 the first party having a distinct authentication key extracted from the hidden one or more keys that were hidden in the noise;
 the first party applying, by the machine, a one-way hash function to a combination of the authentication key and the one or more keys that were hidden in the noise;
 the first party also transmitting the output of the hash function to the second party.
15. The process of claim 1, wherein there are at least twice as many noise bits as bits storing the one or more keys.
16. An information process comprising:
 generating noise from a nondeterministic generator of a machine of a first party, the noise having a probability distribution; encrypting data to form encrypted data;
 hiding the encrypted data inside the noise with the machine of the first party, the machine having a processor system and a memory system, by at least
 placing, by the machine of the first party, the encrypted data at a first set of locations based on a shared secret map;
 placing, by the machine of the first party, the noise at a second set of locations that is different from the first set of locations, the second set of locations being based on the shared secret map; the encrypted data having a probability distribution, wherein the probability distribution of the encrypted data is close enough to the probability distribution of the noise, that a hacker would not be able to distinguish the noise from the encrypted data; and
 transmitting, by the machine of the first party, the encrypted data while hidden in the noise, to a machine of a second party that has the shared secret map.
17. The information process of claim 16, wherein there are at least twice as many noise bits as bits storing the one or more keys.
18. An information process comprising:
 generating noise from a nondeterministic generator, the noise having a probability distribution; hiding data inside the noise with a machine, the machine having a processor system and a memory system, the hiding of the data being performed by at least
 placing, by the machine of the first party, the data at a first set of locations being based on a shared secret map, and
 placing, by the machine of the first party, the noise at a second set of locations that is different from the first set of locations, the second set of locations being based on the shared secret map; the data having a probability distribution, wherein the probability distribution of the data is the same as the probability distribution of the noise; and
 transmitting, by the machine of the first party, the data while hidden in the noise, to a machine of a second party that is capable of generating the shared secret map.
19. The process of claim 18 further comprising:
 hiding, by the machine, metadata in the noise, the metadata pertains to the hidden data;
 wherein said metadata contains at least some internet protocol information.
20. The process of claim 18 further comprising:
 generating the noise based at least on a behavior of photons.
21. The process of claim 20 further comprising: emitting the photons from a light emitting diode.
22. The process of claim 20 further comprising: generating the noise at least partly depends on photocurrent values in a photodiode compared to an average value of the photocurrent.
23. The process of claim 20 wherein said photons are absorbed by a photodetector.
24. The process of claim 18 further comprising:
 the first party selecting a hiding location for each part of the data;
 the first party storing, by the machine each part of the data in the hiding location that was selected;
 the first party storing, by the machine, the noise in remaining locations that are unoccupied by parts of the data.
25. The process of claim 24 further comprising:
 the first party transmitting, by the machine, the hidden data inside the noise to the second party;
 the second party computing the secret map function, based on a secret, to find the hiding locations of the parts of the data;
 the second party extracting the data from the noise based on the map.
26. The information process of claim 18, wherein there are at least twice as many noise bits as bits storing the one or more keys.
27. A process comprising:
 generating noise by a machine-implemented nondeterministic generator;
 encrypting data with a machine, therein creating encrypted data;
 the machine having a processor system and a memory system, the processor system including one or more processors;
 hiding, by the processor system, the encrypted data inside the noise by at least a first party selecting, by the processor system, a hiding location for each part of the encrypted data based on a shared secret map; the first party storing, by the processor system, each part of the encrypted data in the hiding location that was selected; the first party storing, by the processor system, the noise in remaining locations that are unoccupied by parts of the encrypted data, the remaining locations being based on the shared secret map; and
 transmitting, by the machine of the first party, the encrypted data while hidden in the noise, to a machine of a second party that has the shared secret map.
28. The process of claim 27 wherein a block cipher encrypts said data.
29. The process of claim 27 further comprising:
 the first party transmitting, by the machine, the encrypted data that was hidden inside the noise to a second party;
 the second party computes the shared secret map from a secret to find the hiding locations of the parts of the encrypted data;
 the second party extracting the encrypted data from the noise based on the shared secret map.
30. The process of claim 29 further comprising:
 the second party further encrypting the extracted encrypted data;
 wherein further encrypting the extracted encrypted data is called twice encrypted data;
 the second party selecting a hiding location for each part of the twice encrypted data;
 the second party storing each part of the twice encrypted data in the selected hiding location;
 the second party storing new noise in the remaining locations that are unoccupied by parts of the twice encrypted data.
31. The process of claim 30 further comprising:
 the second party transmitting the twice encrypted data that was hidden inside the new noise to the first party;
 the first party computes, by the machine, a shared secret map to find the hiding locations of the parts of the twice encrypted data;
 the first party, by the machine, extracting the twice encrypted data from the new noise based on the map;
 the first party removing the first party's encryption from the twice encrypted data;
 wherein the removed first party's encryption from the twice encrypted data is called second party encrypted data.
32. The process of claim 31 further comprising:
 the first party selecting, by the machine, a hiding location for each part of the second party encrypted data;
 the first party storing, by the machine, each part of the second party encrypted data in the selected hiding location;
 the first party storing, by the machine, new noise in the remaining locations that are unoccupied by parts of the second party encrypted data;
 the first party transmitting, by the machine, the second party encrypted data, hidden inside the new noise, to the second party.
33. The process of claim 27, wherein a processor executes a one-way hash function.
34. The process of claim 27 further comprising:
 generating the noise based at least on a behavior of photons.
35. The process of claim 34 further comprising:
 generating the noise based at least on threshold times.
36. The process of claim 34 further comprising:
 comparing photocurrent values in a photodetector to an average value of the photocurrent.
37. The process of claim 34 further comprising: emitting the photons from a light emitting diode.
38. The process of claim 27, wherein there are at least twice as many noise bits as bits storing the one or more keys.
39. A machine-implemented method comprising:
 a first machine receiving a communication having one or more keys that are hidden amongst noise at predetermined locations based on a shared secret map, and the noise being stored in predetermined locations, the communication having originated from a second machine having the shared secret map, the noise having been generated by a nondeterministic generator;
 the first machine having a processor system and a memory system, the processor system including one or more processors;
 computing, at the first machine, the shared secret map, based on stored machine instructions, to find the predetermined locations where parts of the one or more keys, were previously stored, by the sender second machine, and currently are located amongst the predetermined locations storing noise;
 extracting, by the first machine, the one or more keys from the predetermined locations that store the one or more keys, from amongst the predetermined locations that store the noise, based on the shared secret map.
40. The machine-implemented method of claim 39, wherein there are at least twice as many noise bits as bits storing the one or more keys.
41. A machine-implemented method comprising:
 encrypting data with a machine of a first party, the machine having a processor system and a memory system, the processor system including one or more processors;
 generating noise with a nondeterministic generator of the machine;
 and hiding, by the processor system, the encrypted data inside the noise, by at least designating, by the processor system, some locations within a set of data for storing noise, based on a shared secret map, and designating, by the processor system, some locations within the set of noise for storing the encrypted data, based on the shared secret map; and storing by the processor system the encrypted data in the locations designated for the encrypted data and storing, by the processor system, the noise in locations designated for noise, when the set of data is assembled, the locations designated for the encrypted data being mixed within the locations designated for noise, so as to obscure the encrypted data; and
 transmitting, by the machine of the first party, the encrypted data while hidden in the noise, to a machine of a second party that is capable of generating the shared secret map.
42. The method of claim 41, wherein said one or more processors execute a one-way hash function.
43. The machine-implemented method of claim 41, wherein there are at least twice as many noise bits as bits storing the one or more keys.
44. A machine-implemented method comprising:
 generating noise with a nondeterministic generator of a machine of a first party;
 the machine having a processor system and a memory system, the processor system including one or more processors;
 encrypting data with the machine of the first party;
 the machine of the first party hiding, by the processor system, the encrypted data inside the noise, by at least designating some locations, based on a shared secret map, within a set of data for storing noise, and
 designating some locations, by the machine of the first party, based on the shared secret map, within the set of data for storing the encrypted data, and
 storing by the processor system the encrypted data in the locations designated for the encrypted data and
 storing by the processor system the noise in locations designated for noise,
 when the set of data is assembled, the locations designated for the encrypted data being mixed within the locations designated for noise, so as to obscure the encrypted data;
 wherein the noise has a probability distribution and the encrypted data has the same probability distribution as the noise; and
 transmitting, by the machine of the first party, the set of data after being assembled, to a machine of a second party that has the shared secret map.
45. The method of claim 44 wherein a block cipher encrypts said data.
46. The method of claim 44 further comprising:
 the first party selecting, by the machine, a hiding location for each part of the encrypted data;
 the first party storing, by the machine, each part of the encrypted data in the hiding location that was selected;
 the first party storing, by the machine, the noise in the remaining locations that are unoccupied by parts of the encrypted data.
47. The machine-implemented method of claim 44, wherein there are at least twice as many noise bits as bits storing the one or more keys.
48. An information system comprising:
 a processor system, the processor system including one or more processors and
 a memory system storing one or more machine instructions, which when invoked cause the processor system to implement a method including at least, the information system being associated with a first party
 generating noise from a nondeterministic generator of the information system associated with the first party;
 generating one or more keys from the nondeterministic generator of the information system associated with the first party;
 and
 hiding the one or more keys inside the noise by the machine, the machine having a processor system and a memory system, the processor system including one or more processors, the one or more keys being located in the noise at locations based on a shared secret map; and
 transmitting, by the information system associated with the first party, the one or more keys while hidden in the noise, to a machine of a second party that has the shared secret map.
49. The system of claim 48 wherein the generating of the noise has a probability distribution and the generating of the one or more keys has a probability distribution, which is the same as the probability distribution of the generating of the noise.
50. The system of claim 49, wherein each of the one or more keys has a plurality of parts, the system further comprising:
 selecting, by the machine, a hiding location for each part of the plurality of parts of the one or more keys;
 storing, by the machine, each part of the plurality of parts of the one or more keys in the hiding location that was selected;
 storing, by the machine, the noise in remaining locations that are unoccupied by parts of the one or more keys.
51. The system of claim 48 further comprising: generating the noise based at least on a behavior of photons.
52. The system of claim 51 further comprising: emitting said photons from a light emitting diode.
53. The system of claim 48 further comprising:
 generating the one or more keys based at least on a behavior of photons.
54. The system of claim 53 further comprising: emitting the photons from a light emitting diode.
55. The system of claim 53, wherein said photons are absorbed by a photodetector.
56. The information system of claim 48, wherein there are at least twice as many noise bits as bits storing the one or more keys.
57. An information system comprising:
 a machine of a first party, the machine of the first party including an interface that receives a communication having one or more keys that are hidden inside noise at predetermined locations for storing the one or more keys, where the noise is stored at predetermined locations for storing noise, the predetermined locations for storing the key being chosen at a machine of a second party, so as to be deterministically determinable based on a shared secret map that is shared by the first party and the second party, the noise having been generated from a machine-implemented nondeterministic generator;
 the machine of the first party having a processor system and a memory system, the processor system including one or more processors;
 the machine of the first party storing machine instructions, which when implemented compute a map to find the predetermined locations for storing the key, where parts of the one or more keys are located amongst the predetermined locations within which the noise is stored, based on the shared secret map;
 the machine of the first party extracting the predetermined locations for storing the one or more keys from amongst the predetermined locations for storing the noise, based on the shared secret map, the communication originating from the machine of the second party having the shared secret map;
 the one or more keys having a probability distribution, wherein the probability distribution of the one or more keys is close enough to the probability distribution of the noise, that a hacker would not be able to distinguish the noise from the one or more keys.
58. The information system of claim 57, wherein there are at least twice as many noise bits as bits storing the one or more keys.
59. A machine-implemented method comprising:
 generating noise with a nondeterministic generator of a machine of a first party;
 the machine of the first party having a processor system and a memory system, the processor system including one or more processors;
 encrypting data with the machine;
 the machine of the first party hiding, by the processor system, the encrypted data inside the noise, by at least designating some locations, based on a shared secret map, within a set of data for storing noise, and
 designating some locations, by the machine of the first party, based on the shared secret map, within the set of data for storing the encrypted data, and
 storing by the processor system the encrypted data in the locations designated for the encrypted data and
 storing by the processor system the noise in locations designated for noise,
 when the set of data is assembled, the locations designated for the encrypted data being mixed within the locations designated for noise, so as to obscure the encrypted data; and
 transmitting the set of data, after the set of data is assembled, to a second party capable of generating the shared secret map.
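Claims 57 and 59 describe two ends of the same protocol: a sender designates payload and noise locations inside a transmitted set of data using a shared secret map, and the receiver recovers the payload from those same locations. The sketch below is a hypothetical byte-level rendering of that idea; the claims do not specify the construction of the shared secret map, so a SHA-256-seeded pseudorandom position pick stands in for it here, and `secrets.token_bytes` stands in for the quantum noise source:

```python
import hashlib
import random
import secrets

def payload_positions(shared_secret: bytes, total_len: int, payload_len: int):
    # Both parties derive identical positions from the shared secret
    # (hypothetical stand-in for the claims' "shared secret map").
    seed = int.from_bytes(hashlib.sha256(shared_secret).digest(), "big")
    return random.Random(seed).sample(range(total_len), payload_len)

def hide(payload: bytes, shared_secret: bytes, noise_factor: int = 2) -> bytes:
    # Claims 56/58: at least twice as many noise bytes as payload bytes.
    total_len = len(payload) * (1 + noise_factor)
    block = bytearray(secrets.token_bytes(total_len))   # fill with noise first
    positions = payload_positions(shared_secret, total_len, len(payload))
    for byte, pos in zip(payload, positions):
        block[pos] = byte   # overwrite the designated positions with payload
    return bytes(block)

def extract(block: bytes, shared_secret: bytes, payload_len: int) -> bytes:
    positions = payload_positions(shared_secret, len(block), payload_len)
    return bytes(block[pos] for pos in positions)

secret = b"shared secret map seed"
key = secrets.token_bytes(16)            # e.g. a uniformly random session key
block = hide(key, secret)                # 48 bytes: 16 of key, 32 of noise
assert extract(block, secret, len(key)) == key
```

Because a uniformly random key and uniformly random noise share the same distribution, an observer without the shared secret sees only a uniform 48-byte block, which is the indistinguishability condition stated in claim 57; raising `noise_factor` enlarges the noise relative to the payload, as in claims 56 and 58.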
Referenced Cited
U.S. Patent Documents
8615087  December 24, 2013  DiCrescenzo 
9306739  April 5, 2016  Troupe 
9461987  October 4, 2016  Panging 
9882879  January 30, 2018  Dotan 
20050152540  July 14, 2005  Barbosa 
20070079123  April 5, 2007  Iwamura 
20070099597  May 3, 2007  Arkko 
20090003701  January 1, 2009  Rekhi 
20090161870  June 25, 2009  Rosenberg 
20090323718  December 31, 2009  Oren-Dahan 
20100034377  February 11, 2010  Kamel Ariffin 
20100046755  February 25, 2010  Fiske 
20100067701  March 18, 2010  Patwari 
20100080386  April 1, 2010  Donnangelo 
20110055585  March 3, 2011  Lee 
20110085666  April 14, 2011  Hicks 
20110274273  November 10, 2011  Fiske 
20110280397  November 17, 2011  Patwari 
20110280405  November 17, 2011  Habif 
20120045053  February 23, 2012  Qi 
20120121080  May 17, 2012  Kerschbaum 
20120195428  August 2, 2012  Wellbrock 
20120221615  August 30, 2012  Cerf 
20120300925  November 29, 2012  Zaverucha 
20130042111  February 14, 2013  Fiske 
20130089204  April 11, 2013  Kumar 
20130132723  May 23, 2013  Gaborit 
20130163759  June 27, 2013  Harrison 
20130251145  September 26, 2013  Lowans 
20130315395  November 28, 2013  Jacobs 
20130329886  December 12, 2013  Kipnis 
20140025952  January 23, 2014  Marlow 
20140098955  April 10, 2014  Hughes 
20140126766  May 8, 2014  Crisan 
20140201536  July 17, 2014  Fiske 
20140270165  September 18, 2014  Durand 
20140331050  November 6, 2014  Armstrong 
20140372812  December 18, 2014  Lutkenhaus 
20150106623  April 16, 2015  Holman 
20150188701  July 2, 2015  Nordholt 
20150295707  October 15, 2015  Howe 
20150295708  October 15, 2015  Howe 
20150326392  November 12, 2015  Cheng 
20160034682  February 4, 2016  Fiske 
20160112192  April 21, 2016  Earl 
20160117149  April 28, 2016  Caron 
20160234017  August 11, 2016  Englund 
20160295403  October 6, 2016  Hwang 
20160380765  December 29, 2016  Hughes 
Other references
 Wikipedia, Hardware Random Number Generator, 2018, Wikipedia, pp. 1-9.
Patent History
Type: Grant
Filed: Nov 28, 2015
Date of Patent: Jul 23, 2019
Patent Publication Number: 20160154966
Assignee: Fiske Software, LLC (San Francisco, CA)
Inventor: Michael Stephen Fiske (San Francisco, CA)
Primary Examiner: Joseph P Hirl
Assistant Examiner: Stephen T Gundry
Application Number: 14/953,300
Classifications
International Classification: H04L 9/06 (20060101); G06F 21/60 (20130101); H04L 9/32 (20060101); H04L 29/06 (20060101); H04L 9/08 (20060101); G06F 21/62 (20130101);