0xbadcafebee
a year ago
He asks how it's possible, but avoids the obvious?
PORT_Memcpy(cx->u.buffer, sig->data, sigLen);
break;
Fig 3. The signature size must match the size of the key, but there are no other limitations. cx->u is a fixed-size buffer, and sig is an arbitrary-length, attacker-controlled blob.
Dude. This function just copied into an object blindly. It didn't interrogate the object to determine the object's maximum size (functionality you can add with a custom class). You don't have a preprocessor, postprocessor, compiler, etc. that enforces object boundaries (afaik). This copy probably didn't even require similar object types to copy memory from one place to another.

The failure of C/C++ coding like this is bad design, bad process, and bad practice. This could have been prevented by merely writing simple object-oriented code that enforces boundaries when manipulating data. Sure, C programmers love to play fast and loose, but that's no justification for writing code that refuses to enforce correct behavior.
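A minimal sketch of what I mean (my own made-up type, not anything from NSS): carry the capacity alongside the data, and make every copy check it:

    #include <stdbool.h>
    #include <stddef.h>
    #include <string.h>

    /* A buffer that knows its own capacity. */
    typedef struct {
        unsigned char *data;
        size_t len;   /* bytes currently in use */
        size_t cap;   /* maximum bytes the buffer can hold */
    } SizedBuf;

    /* Checked copy: fails instead of overflowing. */
    static bool sizedbuf_copy(SizedBuf *dst, const unsigned char *src,
                              size_t srcLen)
    {
        if (srcLen > dst->cap)
            return false;          /* caller has to handle the error */
        memcpy(dst->data, src, srcLen);
        dst->len = srcLen;
        return true;
    }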
It's not even hard. It's just more lines of code. My theory is that this kind of thing persists because C/C++ programmers tend to lean on packaged shared libraries rather than a "programming language package manager", and so the ecosystem of layers upon layers upon layers of abstractions never became fashionable the way it did for higher-level languages. That mess of abstractions could have added some really basic safety frameworks. So really, I think the root cause is culture.
It's not like C is so primitive that it's impossible to write code that checks bounds when copying data. You definitely can; you just have to actually do it. But there was never a culture of rigor. Even the Linux kernel is rife with gotos, almost a middle finger to the general consensus on good practice. "But we're so good, it's perfectly fine for us!" Sure, buddy. Tell me that in a month when the next Linux 0day comes out.
olliej
a year ago
> He asks how it's possible, but avoids the obvious?
He’s not asking “how does this cause corruption”, he’s asking “how is it possible that a bug like this can occur in a code base like this, and not be caught earlier”.
He then enumerates all the myriad “correct” things that Mozilla do (did?), including code reviews, fuzzing, static analysis, bug bounties, etc., and yet something as utterly trivial as copying an arbitrarily large amount of data into a buffer without verifying it would fit went unnoticed.
Personally, I think it’s a good example of how overvalued static analysis is when something this trivial goes unreported. (I suspect the issue is that SA tools have to avoid too many false positives, and flagging every memcpy whose length is checked against only one of the two sizes involved could be too “noisy”.)
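To illustrate the shape of it (a sketch with made-up names and a made-up size limit, not the actual NSS code): the length gets validated against the key, but never against the fixed-size destination:

    #include <string.h>

    #define SIG_BUF_LEN 2048   /* made-up fixed capacity, for illustration */

    struct VerifyContext {
        unsigned char buffer[SIG_BUF_LEN];   /* fixed-size destination */
    };

    /* sigLen is checked against the key's size, never against
       sizeof(ctx->buffer); a key larger than SIG_BUF_LEN overflows. */
    int copy_signature(struct VerifyContext *ctx,
                       const unsigned char *sig, size_t sigLen,
                       size_t keyLen)
    {
        if (sigLen != keyLen)                 /* the only check performed */
            return -1;
        memcpy(ctx->buffer, sig, sigLen);     /* capacity never consulted */
        return 0;
    }

From the tool’s point of view there is a size check guarding the memcpy, just not the right one, and that’s presumably exactly the kind of case that gets suppressed to keep the false-positive rate down.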