Is it a conceptual break? Yes. But the "conceptual break" is that for all intents and purposes, C does now have "built-in functions". They're specified in the Standard, their names are all reserved (unless you're in a freestanding environment), and compilers are special-casing them in all sorts of ways.
For example, if I call `printf` without properly including `<stdio.h>`, I'll typically get a warning like "incompatible implicit declaration of built-in function ‘printf’" (gcc), or "implicitly declaring library function 'printf' with type 'int (const char *, ...)'" (clang). Both messages prove that the compiler knew about `printf` all along, whether or not I explicitly included a header file with an external declaration that told the compiler something.
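A minimal translation unit like the following reproduces those diagnostics (the exact wording and severity vary by compiler version and language mode; newer compilers in C99-or-later modes may treat the implicit declaration as a hard error rather than a warning):

```c
/* note: deliberately no #include <stdio.h> */

int main(void)
{
    /* gcc:   "incompatible implicit declaration of
               built-in function 'printf'"
       clang: "implicitly declaring library function 'printf'
               with type 'int (const char *, ...)'" */
    printf("hello, world\n");
    return 0;
}
```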
Given that compilers do know about library functions, it's perfectly appropriate that one of the things they do with this knowledge, in the case of printflike functions, is to double-check the actual arguments.
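With gcc or clang, that double-check is what `-Wformat` does (it's included in `-Wall`). A minimal sketch of the kind of mistake it catches; the diagnostic text in the comment is paraphrased and varies slightly between compilers:

```c
#include <stdio.h>

int main(void)
{
    /* gcc/clang with -Wall (or just -Wformat) warn here, roughly:
       "format '%d' expects argument of type 'int',
        but argument 2 has type 'double'" */
    printf("%d\n", 3.14);
    return 0;
}
```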
Yes, I do remember the days when library functions truly weren't built in, in any way, and there are things about those days that I miss. But really, the fact that compilers now know about library functions isn't causing me any problems or costing me any sleep.
And I firmly believe that for a modern compiler to check the number and types of arguments handed to printflike functions is more or less mandatory. Back in the good old days, back when library functions weren't part of the language, it was also the case that function prototypes didn't exist. If a programmer wanted to ensure that a function call matched its definition, it was up to the programmer either to be careful, or to run lint. But prototypes changed that, and today's programmers know that it's fine to call, say, `sqrt(144)`, because the prototype quietly converts the `int` argument to `double`. But for the same reason, today's programmers don't understand why they can't call `printf("%f\n", 144)`: the `...` in `printf`'s prototype gives the compiler no types to convert to, so no such conversion happens. And I really can't blame today's programmers for that.
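A sketch of the contrast (the diagnostics in the comments are paraphrased, and on some systems you'll need `-lm` to link the math library):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Fine: sqrt's prototype, double sqrt(double), converts
       the int 144 to 144.0 before the call. */
    double r = sqrt(144);

    /* Not fine: printf's prototype, int printf(const char *, ...),
       says nothing about the trailing arguments, so 144 is passed
       as an int while %f expects a double. Only format checking
       catches this:
       "format '%f' expects argument of type 'double',
        but argument 2 has type 'int'" */
    printf("%f\n", 144);

    printf("%f\n", r);   /* correct: prints 12.000000 */
    return 0;
}
```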