An article in ACM Queue by Poul-Henning Kamp claims that the decision to null-terminate strings was the most expensive one-byte mistake. Not even.
Null-terminating strings was a great design decision, especially compared to the alternative of preceding each string with a length. First, a length "byte" would be just as prone to corruption as a null terminator, maybe more so. Second, who's to say one byte is enough? 255 characters isn't very many, and if you need more than one byte, how many more? Always two? Always four? If the count is variable, how do you know how many? Big-endian or little? Or both? Third, how do you handle variable-length strings that grow - do you reallocate the whole string just to make the length field bigger? The problems go on and on.
The real most expensive one-byte mistake was using backslashes as path delimiters in DOS - a mistake the same article even mentions.