Buffer overflows come from poor assumptions about the data. For example: allocating 256 bytes of memory, then loading a text string into it until you hit a null character, even if the string is longer than 256 bytes. Since we don't expect any data format from the aliens (or from the random noise of space), we don't make faulty assumptions about what that format will be.
Ah, C-isms. Use Ada, or a LISP-like language, and this problem goes away (unless you specifically tell the compiler not to do bounds-checking), precisely because both were designed around an array/list concept whose implementation requires bounds. {In the case of LISP, the bound is in theory the memory of the computer/VM [but it wouldn't overflow in any case].}
Since we don't expect any data format from the aliens (or from the random noise of space), we don't make faulty assumptions about what that format will be.
I'm not so sure about that. The last few programs I've worked on could be called horrid chimeras, created and fostered by cut-and-paste. -- Sometimes assumptions made in one section of code are valid there but invalid in another.