I’m not sure why it’s pushed so hard, unless my suspicion is right that the compiler has been compromised by some alphabet agency.
If that’s not it, I’m about to get real curmudgeonly. People tout it as “safe by design” and “better than C” because memory safety is built in, etc.
I’m no Rust expert, though I’m arguably a C expert (embedded, at least), which biases me to some extent.
My take: in the situations where memory safety was already critical, Rust’s safety mechanisms would have to be bypassed (via `unsafe`) anyway, while the safety of C is ensured by processes proven over decades…
So basically it feels like the CISA people pushing “modern languages with modern safety” either don’t understand how we actually do things, or they want us to use it for another reason… Both are equally believable to me.
What does it mean for a compiler to be compromised?
Theoretically speaking… it would be possible for a compiler to recognize certain patterns in the source and inject arbitrary instructions into the compiled output. If it were really smart, it would probably limit itself to specific platforms of interest and emit otherwise harmless-looking instructions that enable consistent exploitation under specific circumstances. I’m just spitballing here; I haven’t put much thought into it past “I’m sure there’s some nasty shit you could do if you wanted to.”
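This class of attack has a name, by the way: it’s exactly what Ken Thompson described in his 1984 Turing Award lecture, “Reflections on Trusting Trust.” Here’s a toy sketch of the idea — the function names, the trigger string, and the string-replacement “compiler” are all made up for illustration; a real attack would live down in the code generator where source review can’t see it:

```python
# Toy illustration (NOT a real compiler): a "compilation" step that watches
# for a pattern in the source and quietly splices in extra logic.

def trojaned_compile(source: str) -> str:
    """Pretend compilation: normally a pass-through, but if the source
    looks like a password check, inject a master-password bypass."""
    if "def check_password(" in source:
        # Inject a line right after the function header. A reviewer of
        # the *source* sees nothing wrong -- the tampering exists only
        # in the compiler's output.
        return source.replace(
            "def check_password(entered, real):",
            "def check_password(entered, real):\n"
            '    if entered == "letmein-backdoor": return True',
        )
    return source

victim_source = (
    "def check_password(entered, real):\n"
    "    return entered == real\n"
)

namespace = {}
exec(trojaned_compile(victim_source), namespace)  # stand-in for running the binary

print(namespace["check_password"]("hunter2", "hunter2"))          # True (normal behavior)
print(namespace["check_password"]("letmein-backdoor", "secret"))  # True (injected bypass)
```

The point is that auditing `victim_source` turns up nothing; only the compiled artifact (or the compiler itself) would reveal the backdoor.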
Another option might be hiding information about the author and the machine that did the compilation inside the binaries.
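A toy sketch of that watermarking idea — the `BUILD:` marker and the helper function are invented for illustration, though real toolchains already embed some of this legitimately (e.g. GCC’s version string in the ELF `.comment` section, or debug paths):

```python
import os
import platform
import re

def compile_with_watermark(code_bytes: bytes) -> bytes:
    # Hypothetical compiler pass: append tagged build metadata after the
    # executable payload. Past the end of the code it changes nothing at
    # runtime, and only someone who knows the marker would look for it.
    user = os.environ.get("USER", "unknown")
    tag = f"BUILD:{user}@{platform.node()}".encode()
    return code_bytes + b"\x00" + tag

binary = compile_with_watermark(b"\x90\x90\xc3")  # stand-in "machine code"
marker = re.search(rb"BUILD:(.+)", binary)
print(marker.group(1))  # who built it, and on what machine
```

Every copy of the program then carries an identifier tying it back to the build environment, invisible to anyone who only runs it.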
You’re trusting the compiler to convert human-readable code into machine code. I suspect you could sneak some “unreachable” code in there, and if it doesn’t look scary it’d be easy to write off as a quirk of optimization or linker padding.
Edit: I have no evidence this is being done or has ever been done. I’m just saying it’s theoretically possible.