Judging from the article, Zig would have prevented the CVE.
> This includes memory allocations of type NV01_MEMORY_DEVICELESS which are not associated with any device and therefore have the pGpu field of their corresponding MEMORY_DESCRIPTOR structure set to null
This does look like the type of null deref that Zig does prevent.
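To make that concrete, here's a minimal Zig sketch (the names Gpu, MemoryDescriptor and pGpu are just stand-ins for the driver structures the article describes, not the real definitions): the field is an optional pointer, so the compiler refuses a plain dereference and forces the null case to be handled.

const std = @import("std");

const Gpu = struct { id: u32 };
const MemoryDescriptor = struct {
    pGpu: ?*Gpu = null, // deviceless allocations have no GPU attached
};

fn gpuId(desc: *const MemoryDescriptor) !u32 {
    // desc.pGpu.id does not compile; the optional has to be unwrapped first
    const gpu = desc.pGpu orelse return error.NoDevice;
    return gpu.id;
}

pub fn main() void {
    const deviceless = MemoryDescriptor{};
    const id = gpuId(&deviceless) catch 0;
    std.debug.print("gpu id: {}\n", .{id});
}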
Looking at the second issue in the chain, I believe standard Zig would have prevented that as well.
The C code had an error that caused the call to free to be skipped:
threadStateInit(&threadState, THREAD_STATE_FLAGS_NONE);
status = rmapiMapWithSecInfo(/*…*/); // null deref here
threadStateFree(&threadState, THREAD_STATE_FLAGS_NONE);
Zig’s use of ‘defer’ would ensure that free is called even if an error occurred:
threadStateInit(&threadState, THREAD_STATE_FLAGS_NONE);
defer threadStateFree(&threadState, THREAD_STATE_FLAGS_NONE);
status = try rmapiMapWithSecInfo(/*…*/); // null deref here
Assuming the user wouldn't have forgotten to type the defer line, and would have typed it correctly as well, like all great coders do.
Followed by never touching the variable ever again.
Nothing can prevent a sufficiently belligerent programmer from writing bad code. Not even Rust—which I assume you’re advocating for without reading the greater context of this thread.
Especially in the case of the GP, I'd say Rust is not the main recommendation, although it is one. I would concur that Rust is only one of many decent languages (for memory safety or otherwise).
Still, there are languages with guardrails, and then there are languages with *guardrails*, and the order for memory safety is probably something like C < C++ < Zig < Rust < managed (GC) languages.
We're literally in a thread about the famous recurring Onion story:
"'No Way to Prevent This,' Says Only Nation Where This Regularly Happens"
I literally replied with “A Way to Prevent This”?
"Don't make mistakes" isn't a way to prevent this. We know that doesn't work.
No, the solutions I spoke about were language features that make the mistakes either trivial to avoid or impossible to make.
If your bar for mistakes is “what if you forget to add literally the next line of code in the incredibly common pattern”, I don’t really care to have a discussion about programming languages anymore.
You can forget to increment a loop counter and have your program never terminate, so why don't you program in a language of exclusively primitive recursive functions?
You won't get anywhere with people who just like to argue.
Note that the mention of Zig that I responded to was in reference to Tony Hoare's "billion dollar mistake", which was making null a valid value of a pointer type, not free after use, which is a quite different issue. As I noted, the mistake doesn't occur in Zig because null is not a valid value for a pointer, only an optional pointer, which must be unwrapped with an explicit null test.
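For anyone who hasn't written Zig, a tiny hypothetical example of what that explicit null test looks like:

const std = @import("std");

fn readOrZero(ptr: ?*const u32) u32 {
    // *const u32 can never be null in Zig; only ?*const u32 can, and it
    // must be null-tested before it can be dereferenced
    if (ptr) |p| return p.*; // p is a guaranteed non-null *const u32 here
    return 0;
    // ptr.* is a compile error; ptr.?.* is a checked unwrap that panics
    // on null rather than dereferencing address zero
}

pub fn main() void {
    const x: u32 = 42;
    std.debug.print("{} {}\n", .{ readOrZero(&x), readOrZero(null) });
}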
I do think it's a bit too easy to forget a deferred free, although it's possible for tools to detect that. Unfortunately Andrew Kelley is prone to being extremely opinionated about language design (GingerBill is another of that sort), and so macros are forever banned from Zig, even though macros are the only mechanism for encapsulating a scoped feature like defer.
> You won't get anywhere with people who just like to argue.
Yeah not really sure why I bother. I think I just get bothered that Rust gets touted everywhere as a silver bullet.
> Tony Hoare's "billion dollar mistake", which was making null a valid value of a pointer type
It's funny how we got stuck with his biggest mistake for decades, while his (probably not entirely his) algebraic types / tagged unions have only just started to get first-class support.
You were correct about the lack of the billion dollar mistake in Zig. Once I'd decided to list some "C replacement" languages, not just C and C++, I should have either checked that they all make exactly this mistake (Odin does, Zig does not) or removed that part of my comment.
However, in practice Zig's "defer" is just irrelevant for this nVidia bug, which is why nVidia's "fix" doesn't attempt the nearest C equivalent of that strategy and instead now performs a heap allocation (and thus a free) on the happy path.
There's a kernel Oops, likely in someone else's code. When that happens our stack goes away. In Rust you can (I don't happen to know whether Rust for Linux does this, but it is commonly used in some types of application) recover from such disasters and unwind the stack before it's gone, doing cleanup such as removing the threadState from that global state. In Zig that's prohibited by the language design: all panics are fatal.
What a crap, disingenuous argument.
A kernel oops isn't a panic, at least not as either Zig or Rust defines a panic, so what Zig says about panics doesn't apply here.
Rust fails here in the exact same way if drop semantics aren't upheld (they aren't, afaik). Also, Rust's soundness goes immediately out the window if UB happens in unsafe code. So the moment a kernel Oops happens, safety is a moot point.
I'm not sure if Zig has a clean way to kill a thread, unwind the stack, and run deferred code. Zig is a pre-1.0 language after all, so it's allowed to be missing features.
> I’m not sure if Zig has a clean way to kill a thread, unwind the stack, and run deferred code.
Zig deliberately only has fatal panics. This isn't a "missing feature", it's intentional.