PUF: Oxymoron?

Objects can be copy-resistant, but with adequate measurement and modification capability, there is no such thing as "physical unclonability". A so-called PUF is an artifact that contains a finite number of analog values, which are measured and converted to digital values, perhaps under conditions modified by other digital input values. The measurements are also modified by environmental conditions such as power supply voltage, temperature, ambient noise, clock rate, etc. Real physical artifacts are also modified by aging and stress. So the analog values, treated as messages, contain only so many bits of information. If there are many analog values and they correlate with each other, the total information is less than the sum of the information in each value considered alone.

Because of aging, stress, and variation of measurement conditions, the analog values will drift. A well-designed system will not have common-mode drift; that is, the average drift of all of the values will tend towards zero as the number of values increases. Failing that, the average drift should be easily characterized and subtracted from each analog measurement. Otherwise, the effective number of bits is decreased, and cryptographic strength (if desired) is strongly diminished.
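A toy numeric sketch of that subtraction (pure Python; the cell count, drift magnitudes, and zero-mean assumption are all illustrative, not taken from any real PUF):

```python
import random

random.seed(2)
n_cells = 1000
values = [random.gauss(0.0, 1.0) for _ in range(n_cells)]        # fabrication-time analog values
common_drift = 0.3                                               # shared shift, e.g. a temperature change
local_drift = [random.gauss(0.0, 0.02) for _ in range(n_cells)]  # small uncorrelated per-cell drifts

measured = [v + common_drift + d for v, d in zip(values, local_drift)]

# For zero-mean cells, the measurement-time average estimates the common-mode
# drift; subtracting it leaves only the small uncorrelated per-cell drifts.
avg = sum(measured) / n_cells
corrected = [m - avg for m in measured]

err_before = sum(abs(m - v) for m, v in zip(measured, values)) / n_cells
err_after = sum(abs(c - v) for c, v in zip(corrected, values)) / n_cells
assert err_after < err_before / 2   # common-mode component largely removed
```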

If analog values with uncorrelated drifts are added together, the analog range increases, but the drift of the sum increases by at least the root-mean-square sum of the uncorrelated drifts. If the drifts correlate, then the drift of the sum is larger than the root-mean-square sum. Different values have different drifts, and the drift sum is dominated by the cell with the largest drift. If the value distribution is Gaussian around some mean, and the drift distribution is as well, then there is no information advantage to adding values; on average, the ratio of the value range to the drift range stays constant, so the number of extractable bits is the same for one value as for two added values.
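Under that Gaussian assumption the arithmetic is immediate; a minimal sketch with invented spreads (the crude bit count here is just the log of the value-to-drift ratio, not a rigorous channel capacity):

```python
import math

sigma_value = 1.0    # hypothetical spread of one cell's analog value
sigma_drift = 0.05   # hypothetical spread of that cell's drift

def extractable_bits(sig_v, sig_d):
    # Crude bit count: log2 of (value spread / drift spread).
    return math.log2(sig_v / sig_d)

one_cell = extractable_bits(sigma_value, sigma_drift)
# Adding two independent Gaussian cells scales value and drift spreads by sqrt(2):
two_cells = extractable_bits(sigma_value * math.sqrt(2), sigma_drift * math.sqrt(2))
assert abs(one_cell - two_cells) < 1e-9   # ratio unchanged, so same bit count
```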

The process of turning analog values into digital measurements is noisy. Some analog values will be close to decision thresholds, and noise and drift can cause bit values to change between measurements. Nature does not provide error correction on these measurements, so the identification, authentication, or key-generation system using these bits must tolerate or mask out these unexpected bit changes or suffer from an unreasonably high field failure rate. The mechanisms that do this can leak information to an attacker.
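A quick simulation of one such marginal cell (the threshold margin and noise level are invented; what matters is that they are comparable):

```python
import random

random.seed(1)
threshold = 0.0
cell_value = 0.01      # hypothetical cell sitting just above the decision threshold
noise_sigma = 0.05     # measurement noise comparable to that margin

def measure():
    # One noisy digitization of the analog cell against the threshold.
    return 1 if cell_value + random.gauss(0.0, noise_sigma) > threshold else 0

bits = [measure() for _ in range(10_000)]
flip_rate = 1 - sum(bits) / len(bits)   # fraction of reads that came out 0
# A cell this close to the threshold flips on a large fraction of measurements:
assert 0.3 < flip_rate < 0.5
```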

If the bit extraction process is iterative, such that future value measurements are dependent on previous ones, then the bit drift is cumulative, and the reliability is reduced. Again, correction systems will leak information. Digital vectors can be chosen that condition the circuit to avoid drift, but the choice of these vectors can be observed by an attacker, and those observations also convey information.
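The cumulative effect is easy to see in a toy chained extractor (SHA-256 stands in for the deterministic mixing step purely for illustration): one drift-induced flip in the first raw measurement changes essentially every derived value downstream.

```python
import hashlib

def iterative_extract(raw_bits):
    # Toy chained extractor: each output depends on every previous raw bit,
    # so a single early measurement error corrupts the rest of the stream.
    state = b"seed"
    out = []
    for bit in raw_bits:
        state = hashlib.sha256(state + bytes([bit])).digest()
        out.append(state[0])          # one derived byte per step
    return out

clean   = [1, 0, 1, 1, 0, 0, 1, 0]    # hypothetical raw measurements
flipped = [0] + clean[1:]             # the same run with one drift-induced flip
a, b = iterative_extract(clean), iterative_extract(flipped)
differing = sum(x != y for x, y in zip(a, b))
assert a != b and differing >= 4      # nearly every downstream value differs
```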

If an example of the so-called PUF is available to an attacker for a complete physical teardown, or similar structures are used on many different products, the deterministic digital portions of the system can be characterized. While these deterministic portions may behave in an "unpredictable" way - for example, they may be cryptographic one-way functions - we can rely on them to produce identical outputs for identical inputs. The first so-called PUF would probably end up destroyed, but the knowledge gained could be used to attack all identically-manufactured PUFs.

If deterministic digital vectors are conditioning the PUF measurement system - say, selecting analog values to be added together - it does not matter that we cannot produce a particular desired value, as long as we can produce enough analog measurements. Given drift and noise, we can keep trying digital vectors until we see noisy responses on a measurable digital node - say an authenticated/not-authenticated signal. That tells us that the analog sum is very close to a measurement decision threshold, and is being dithered around it by noise. We can even perform a crude measurement of the analog value based on the characterized gain of the measurement circuit, and the known characteristics of thermal noise. With enough probes, probably a small multiple of (value range)/(noise range), we can find many noisy outputs with different digital vectors driving the analog selection process, and develop a system of equations, expressible as a matrix, that can be inverted to yield the values being measured as a function of the known noise value.
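The end of that attack, recovering the analog values from a set of selection vectors and their threshold-level sums, is just linear algebra. A minimal noiseless sketch (the cell count and selection vectors are invented for illustration and chosen to be invertible):

```python
import random

random.seed(0)
n = 3
true_values = [random.uniform(-1.0, 1.0) for _ in range(n)]   # hidden analog cells

# Each digital vector selects a subset of cells whose values are summed; finding
# a vector whose sum is pinned at the decision threshold gives one linear equation.
vectors = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]]
sums = [sum(v * x for v, x in zip(vec, true_values)) for vec in vectors]

def solve(a, b):
    # Plain Gaussian elimination with partial pivoting on the augmented matrix.
    m = len(a)
    aug = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(m):
        pivot = max(range(col, m), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(col + 1, m):
            f = aug[r][col] / aug[col][col]
            for c in range(col, m + 1):
                aug[r][c] -= f * aug[col][c]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        x[r] = (aug[r][m] - sum(aug[r][c] * x[c] for c in range(r + 1, m))) / aug[r][r]
    return x

recovered = solve(vectors, sums)
assert all(abs(r - t) < 1e-9 for r, t in zip(recovered, true_values))
```

With real noisy measurements one would overdetermine the system and least-squares fit rather than invert exactly, but the principle is the same.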

So we now know the values, and we know the circuit, and we know the behavior of the one-way function (even though we can't reverse it). That is all we need to know to build a physical clone, perhaps implemented with a table of values and a fast DSP, which would be a functional clone of the original PUF. With more effort, we can take another identically manufactured PUF, with a different "native" behavior, and modify its analog values (typically having separate physical locations) with ion implantation and ion beam milling until our copy matches the target.
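In software, such a functional clone is nothing more than the recovered values plus the known deterministic logic. A hypothetical sketch, with SHA-256 standing in for the characterized one-way function and made-up recovered values:

```python
import hashlib

recovered_values = (0.421, -0.173, 0.908)  # hypothetical values recovered from the target

def clone_response(challenge: bytes) -> bytes:
    # The deterministic digital portion (SHA-256 here, purely as a stand-in) maps
    # the analog values plus the challenge to a response. With the values known,
    # the clone reproduces the target exactly at its digital interface.
    material = challenge + ",".join(f"{v:.3f}" for v in recovered_values).encode()
    return hashlib.sha256(material).digest()

# Identical inputs give identical outputs, so the software clone is
# indistinguishable from the original at the digital interface:
assert clone_response(b"challenge-1") == clone_response(b"challenge-1")
```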

The attacker's cost/value ratio for all this effort will be too high if the result is one compromised credit card, or one damaged e-commerce transaction. It might be worthwhile as part of a batch process to attack a large number of low-value PUFs, and it would definitely be worthwhile if the guarded information was worth hundreds of thousands of dollars or more.

In other words, the cost of attacking a so-called PUF can be as low as attacking a symmetric cryptosystem with a few tens of bits of key strength. Some architectures are more copy-resistant than others, but none will ever have copy resistance approaching modern cryptographic key strength. The term "Physically Unclonable Function" is deceptive, and may lull users of such technologies into a false sense of security, with horrendously expensive consequences.

I suggest reading Bruce Schneier's "Secrets and Lies". While Schneier's books will not teach anyone how to reliably design high-strength cryptosystems, they do teach us to spot some of the characteristics of bad ones. But keep in mind that they are books about deterministic digital structures. Analog structures contain many vulnerabilities that digital designers are unaware of, and require additional skills to analyze. You may not have those skills, but be assured that your adversary does.


Note: I am not a disinterested spectator. My company SiidTech licenses and sells copy-resistant bit-generator cells for integrated circuits, and has some of the fundamental patents (e.g. US6161213) in this area. While our products are copy-resistant, and can be used to generate cryptographic-strength keys, we do not fool ourselves or our customers with the idea that they cannot be functionally duplicated given access to the physical artifacts. They are good for protecting low-value assets, or higher-value assets if physical access is controlled, just like any other kind of hardware.


KeithLofstrom (talk) 04:55, 18 January 2011 (UTC)


PUF is not a function

A Physical Unclonable Function or PUF is not a function in the mathematical sense, because a function maps a given input to the same output every time. A PUF is usually noisy, so the same input does not always produce the same output. — Preceding unsigned comment added by 129.27.137.157 (talk) 14:41, 26 January 2012 (UTC)

The objects discussed in this article are named Physical Unclonable Function, and the occurrence of Function in the name is not a problem. On the other hand, using the term function to explain what a PUF is might be problematic. Is there any such usage? — Preceding unsigned comment added by 80.215.172.99 (talk) 12:42, 8 March 2016 (UTC)

Wiki Education assignment: National and International Cybersecurity Policy

  This article was the subject of a Wiki Education Foundation-supported course assignment, between 17 January 2023 and 15 May 2023. Further details are available on the course page. Student editor(s): Jdrake69002 (article contribs).

— Assignment last updated by Jdrake69002 (talk) 14:22, 25 February 2023 (UTC)