Eugene Leitl wrote,
> Of course it's possible: you have to just substitute the noise in a given
> source by a pseudorandom source bearing the same signature. Preparing
> proper signature is not trivial, but it's not rocket science, either.
You are aware that most experts and FAQs on steganography discuss the fact
that being undetectable may not be possible? This is not just an arbitrary
assumption on my part. It really is a standard steganography topic, and
right now most of the experts seem to agree that it may not be possible. If
anybody really invented undetectable steganography, they would probably get
a lot of attention.
I don't want to discuss too many of the specific details before I get your
pictures. However, I don't want this conversation to get stuck at the "can
too" "can not" level. So without revealing too many details of what I plan
to do, let me describe some of my ideas for detecting steganography. There are
three main points that make my part of the challenge easier than might be
expected:
1. I don't really have to detect or predict complex random tampering. All
I have to predict is standard JPEG output. Given a known image and known
JPEG algorithms, it is pretty easy to predict what the output should be at
various compressions. Anything that deviates from the predicted results
would be suspect. 99% of the pictures out on the Internet are optimized,
compressed, perfect JPEGs. Only a small subset of them would be unusual
enough to merit further investigation.
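Python's standard library has no JPEG encoder, so as a stand-in for the "predict the canonical output" idea, here is a minimal sketch using zlib (which is deterministic for fixed settings) as a hypothetical known compressor. The function names are illustrative assumptions, not anything from an existing tool:

```python
import zlib

def canonical_compress(data: bytes, level: int = 9) -> bytes:
    """Deterministically compress with fixed, known settings.

    Stands in for a known JPEG pipeline: same input and settings
    always yield the same bytes."""
    return zlib.compress(data, level)

def is_suspect(observed: bytes, source: bytes, level: int = 9) -> bool:
    """Flag any output that deviates from the predicted canonical output."""
    return observed != canonical_compress(source, level)

source = b"a known image, byte for byte" * 100
clean = canonical_compress(source)

# An innocent file matches the prediction exactly.
assert not is_suspect(clean, source)

# Flipping even one low-order bit makes it deviate from the prediction.
tampered = bytearray(clean)
tampered[-1] ^= 1
assert is_suspect(bytes(tampered), source)
```

The point of the sketch is only the comparison step: with a deterministic, known algorithm, any byte-level deviation from the predicted output is automatically suspect.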
2. I probably have your encoding algorithms. There is currently only one
program that even claims to be able to produce undetectable steganography.
Unless you have developed your own secret algorithms, I am pretty sure that
you are planning to use outguess. This means that I don't have to detect
any possible random tampering. I only have to use outguess itself to see
what kind of stuff it produces. I can compare known JPEG outputs and known
outguess outputs. I can even look into the source code for outguess if I
need more specific details about what outguess will do or will not do.
Having the known algorithms and the known data output makes it a large but
tractable problem to determine whether the algorithm could have produced the
data.
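One standard kind of statistical comparison from that era is the chi-square test on "pairs of values": LSB embedding tends to equalize the counts of values 2k and 2k+1, which natural quantized image data rarely does. This is a generic textbook sketch, not outguess's actual algorithm (outguess was specifically designed to resist the simplest versions of this test):

```python
import random

def chi_square_pov(samples):
    """Chi-square statistic over pairs of values (2k, 2k+1).

    Full-capacity LSB embedding drives the two counts in each pair
    toward equality, so stego data scores much lower than cover data
    whose low bits are naturally biased."""
    counts = [0] * 256
    for s in samples:
        counts[s] += 1
    stat = 0.0
    for k in range(128):
        expected = (counts[2 * k] + counts[2 * k + 1]) / 2
        if expected > 0:
            stat += (counts[2 * k] - expected) ** 2 / expected
    return stat

random.seed(1)

# Simulated cover data with biased low bits, as quantized image data
# often has: mostly even values, occasionally odd.
cover = []
for _ in range(20000):
    v = 2 * min(127, max(0, int(random.gauss(64, 20))))
    if random.random() < 0.2:
        v += 1
    cover.append(v)

# Simulated full-capacity LSB embedding: the low bit becomes random.
stego = [(v & 0xFE) | random.getrandbits(1) for v in cover]

# Embedding flattens the pair counts, collapsing the statistic.
assert chi_square_pov(stego) < chi_square_pov(cover)
```

This is the "compare known outputs" idea in miniature: you do not need to decode anything hidden, only to notice that the statistics no longer match what the known source process produces.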
3. I developed a security reverse-engineering methodology for IBM for exactly
this kind of problem. They are considering pursuing a patent based on my work
(aided by
my business partner and two of my IBM colleagues). Although they sold this
to the U.S. government, I am not directly restricted from discussing this
invention of mine. For once, one of my security inventions is not
classified top secret by the U.S. government as a matter of national
security. This was my first big security invention since moving to the
private sector. Although it eventually became the property of the U.S.
Government, and is probably still regarded by IBM as one of their trade
secrets, it is the first time I was able to directly profit from one of my
security methodologies. It also is the first major achievement of mine that
I can discuss in public. I still do not want to reveal too many details,
since IBM still is the only private-sector company with knowledge of this
technique, and I do not want to diminish its value for them.
The problem I solved for IBM was meant to detect changes in program
binaries. Unfortunately, we could not simply recompile the code to make a
trusted binary, and we did not have access to checksums or information about
what the original binary should have looked like. My method analyzed the
compiler itself to determine what output binaries were possible. I produced
a constraint map that could be applied to the binary. Anything outside the
known constraints must have been added later and was not part of the original
compilation. The added binary was not invalid, and was not analyzed
directly, but was detected as being outside the range of what would have
been normally produced. Similarly, I think the same method will apply here.
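The constraint-map idea can be sketched in miniature. Everything below is a toy assumption: a real compiler model is far richer than a set of allowed bytes, and these opcode values are illustrative only:

```python
# Toy "constraint map": the set of instruction bytes a hypothetical
# compiler is known to emit. A real model would constrain sequences,
# layouts, and relocations, not single bytes.
ALLOWED_OPCODES = {0x55, 0x89, 0x8B, 0x83, 0xC3, 0x90}

def find_violations(binary: bytes) -> list:
    """Return offsets of bytes outside the compiler's known output range."""
    return [i for i, b in enumerate(binary) if b not in ALLOWED_OPCODES]

# A binary composed only of expected compiler output passes cleanly.
clean = bytes([0x55, 0x89, 0x8B, 0xC3])
assert find_violations(clean) == []

# An appended payload may be perfectly valid code, but it falls outside
# the constraint map, so it is flagged without being analyzed directly.
patched = clean + bytes([0xDE, 0xAD])
assert find_violations(patched) == [4, 5]
```

Note that the check never interprets the added bytes; it only establishes that the known toolchain could not have produced them, which mirrors the argument being made about JPEG output.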
I can predict what a normal JPEG encoder would produce for the image. Although
random fluctuations in low-order bits are valid and might appear as noise,
they never would really be output by the JPEG algorithm itself. It actually
works to reduce noise, not add it. Therefore, these extraneous bits would
appear alien even though they are defined in a valid manner. Statistical
analysis on the image won't find these bits outside the norm from a visual
perspective. However, analysis on the JPEG and outguess programs themselves
would show that these bits would not be produced in the absence of a hidden
--
Harvey Newstrom <www.HarveyNewstrom.com>
Principal Security Consultant, Newstaff Inc. <www.Newstaff.com>
Board of Directors, Extropy Institute <www.Extropy.org>
Cofounder, Pro-Act <www.ProgressAction.org>
This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:11 MDT