Why is it more secure?

Jun 10, 2014 at 1:28 PM
Hi,

Why does adding that many iterations make the algorithm "stronger"?
It just reuses the same algorithm, and it actually has a chance of corrupting the original file.

If a single bit gets corrupted during the process for a 1 GB file, repeat it 1000 times and up to 1 KB could be wrong.
If you repeat it 322761 times, that could end up causing, say, up to 327 KB of errors. Is that likely? No. Possible? Yes.

The randomization curve of the algorithm is fine, and the key length of the algorithms used makes it unnecessary to add some arbitrary number of iterations. If you actually wanted to accomplish something, then at the time you are prompted for a password you should also be allowed to choose the number of iterations to use, some number between 1,000 and 10,000. That would be a nice change, adding a level of randomness to the algorithm.

Correct me if I'm wrong.
Regards.
Coordinator
Jun 10, 2014 at 8:49 PM
Hi,

I think you are misunderstanding the concept of security for cryptographic key derivation. We are not dealing with error correction or error prevention during data transmission or storage, but rather with data protection through encryption using a symmetric key derived from the user password.
In order to protect against offline attacks (for example, someone stealing your laptop), we have to ensure that brute-forcing the derived encryption key would be too costly for the adversary.
TrueCrypt was using 1000 or 2000 iterations, which is weak (even if a random salt is created for each volume header) given the rapid development of distributed brute-force attacks.
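To make the work-factor point concrete, here is a minimal Python sketch of iterated key derivation using the standard library's hashlib.pbkdf2_hmac. The password, hash choice and salt size are illustrative assumptions, and the iteration counts are simply the numbers quoted in this thread; this is not VeraCrypt's actual code or exact parameters.

```python
import hashlib
import os
import time

password = b"correct horse battery staple"  # illustrative user password
salt = os.urandom(64)                        # random per-volume salt

def derive_key(pw, salt, iterations):
    # Iterated PBKDF2-HMAC-SHA512: each extra iteration multiplies the
    # work an attacker must spend on every single password guess.
    return hashlib.pbkdf2_hmac("sha512", pw, salt, iterations, dklen=64)

for iters in (1000, 2000, 655331):
    t0 = time.perf_counter()
    derive_key(password, salt, iters)
    print(f"{iters:>7} iterations: {time.perf_counter() - t0:.3f} s per guess")
```

The per-guess time scales roughly linearly with the iteration count, which is exactly the cost an offline brute-force attacker has to pay for every candidate password.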

Encryption doesn't protect against errors. No encryption solution does. This must be handled at another layer.

Concerning your idea about letting the user choose the iteration count, it certainly adds another unknown variable for the attacker, but it makes the software more complex to use, and from a pure security point of view it doesn't add any strength. No encryption solution on the market today implements this, and VeraCrypt is no exception. Of course, since this is open source, anybody can make modifications and send patches, so any developer interested in this is welcome to do so. As far as VeraCrypt is concerned, there are other, more pressing features on the waiting list.
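As a rough illustration of why an unknown, user-chosen count adds little strength: an attacker who does not know the count can simply try every allowed value for each password guess, so the extra cost is bounded by the size of the allowed range. This is a hypothetical sketch; attacker_guesses, verify_key and the 1,000 to 10,000 range are assumptions taken from the proposal above, not anything VeraCrypt implements.

```python
import hashlib

# Hypothetical attacker loop when the per-volume iteration count is a
# user-chosen secret somewhere in the proposed 1,000 to 10,000 range.
CANDIDATE_COUNTS = range(1000, 10001, 1000)

def attacker_guesses(wordlist, salt, verify_key):
    # wordlist: iterable of byte-string password guesses.
    # Trying every allowed count per guess multiplies the attacker's work
    # by at most len(CANDIDATE_COUNTS); it does not enlarge the key space.
    for guess in wordlist:
        for iters in CANDIDATE_COUNTS:
            key = hashlib.pbkdf2_hmac("sha512", guess, salt, iters, dklen=64)
            if verify_key(key):
                return guess, iters
    return None
```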

Regards,
Marked as answer by IronHD on 6/10/2014 at 3:33 PM
Dec 26, 2016 at 9:34 AM
Edited Dec 26, 2016 at 9:37 AM
The same question, but in a different flavor. Nothing about "error correction", only about truly meaningful reasons, or the absence of them.

The homepage states:
...at most 2000 iterations but VeraCrypt uses 655331...
idrassi wrote:
TrueCrypt was using 1000 or 2000 iterations, which is weak (even if a random salt is created for each volume header) given the rapid development of distributed brute-force attacks.
Concerning your idea about letting the user choose the iteration count, it certainly adds another unknown variable for the attacker, but it makes the software more complex to use, and from a pure security point of view it doesn't add any strength
Well,

First, let's look at real life. Plenty of well-working non-digital cryptanalysis methods are around, from the Guantanamo-proven ones to the widely available "thermorectal" one (a soldering iron). So any algorithm and any iteration count becomes meaningless past a certain level of interest; in that case, plausibly deniable encryption means much, much more. Unfortunately these methods are out of our control, so let's return to the numbers.

Second, every encryption iteration consumes processing power and time, and the same goes for decryption. So both sides seek a balance between inevitable resource spending and the desired probability of a result. A popular measurement "unit" is the timeframe needed to decrypt (find the key); the equivalent for encryption usually stays hidden under "performance"-related terminology.
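That trade-off can be written as a rough formula: time_to_crack ≈ guesses × iterations × time_per_iteration / parallel_units. Here is a back-of-envelope sketch in Python where every number is an assumption chosen only for illustration:

```python
# Back-of-envelope cost model; every number below is an assumption.
guesses        = 2 ** 40      # assumed password entropy of ~40 bits
iterations     = 655_331      # the count quoted from the homepage
per_iteration  = 1e-7         # assumed 100 ns per hash iteration per unit
parallel_units = 10_000       # assumed size of the attacker's farm

seconds = guesses * iterations * per_iteration / parallel_units
print(f"~{seconds / (3600 * 24 * 365):.1f} years per full search")
```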

Third, most users, even the half-devils among them, will never attract enough interest for anyone to start decrypting their crypto with IBM's Deep Blue or similar hardware.

Now, assuming there are no known flaws in the algorithms themselves, what difference do these "improvements" (the above-quoted "2000 vs 655331 iterations", the other counts mentioned in "What does VeraCrypt bring to you", and the others in use) make in terms of:
  1. resources to encrypt;
  2. resources to crack?
Anything like "MIPS" or "CPU cycles" or "time on an i386DX-100" or anything else, just for a brief estimated overview.
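For a first-order comparison (not an authoritative benchmark): the iteration count applies to deriving the header key when a volume is mounted or a password is verified, not to the bulk encryption of data, and with the same hash and hardware both the legitimate unlock time and the per-guess attack cost scale roughly linearly with it, so the ratio is essentially the whole story. A one-line sketch:

```python
# Linear scaling assumption: same hash, same hardware on both sides.
print(655331 / 2000)  # ≈ 327.7, i.e. roughly 328x more work per password guess
```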