I'm using gcc (4.6.1 on Ubuntu, 4.6.2 on SUSE) with the following command
gcc gets_s.c
My source code is
// Read and Display Lines
// gets_s.c
#define __STDC_WANT_LIB_EXT1__ 1  /* request the Annex K declarations */
#include <stdio.h>

int main(void)
{
    char first_name[11];
    char last_name[11];

    printf("First Name : ");
    gets_s(first_name, 11);
    printf("Last Name : ");
    gets_s(last_name, 11);
    puts(first_name);
    puts(last_name);
    return 0;
}
Elaborating on my question:
The principal issue for me is one-to-one correspondence between lines input and lines saved.
On success, the difference between fgets and gets_s is that fgets keeps the newline character in the buffer, while gets_s reads and discards it (both null-terminate the result). Discarding the newline is what lets gets_s maintain a one-to-one correspondence between lines of input and successful calls to gets_s.
For input that overflows the buffer length, fgets stores as many characters as fit in the buffer and leaves the rest in the stream for the next call to fgets.
The standard (K.3.5.4.1) states that gets_s (unlike gets) requires a new-line, end-of-file, or read error within reading n-1 characters; overflow is therefore a runtime-constraint violation. If there is a runtime-constraint violation, the first character in the buffer is set to the null character, and characters are read and discarded from stdin until a new-line character is read, end-of-file is reached, or a read error occurs.
Accordingly on success, I expected:
>fgets
First Name : Chris
Last Name : Szalwinski
Chris
Szalwinski
>
>gets_s
First Name : Chris
Last Name : Szalwinski
Chris
Szalwinski
>
On overflow, I expected different behavior from fgets and gets_s. In other words,
>fgets
First Name : Christopher
Last Name : Christophe
r
>
>gets_s
First Name : Christopher
Last Name : Szalwinski
Szalwinski
>
Note how I expected gets_s to remove the contents of the first line of input altogether.
If the principal issue is the one-to-one correspondence between lines input and lines saved, which is important in debugging, we still need to write our own function (similar to K&R's getline):
char *gets_s(char *s, int n)
{
    int i, c = 0;                      /* initialize c in case n <= 1 */

    /* copy up to n-1 characters of the line into s */
    for (i = 0; i < n - 1 && (c = getchar()) != EOF && c != '\n'; i++)
        s[i] = (char)c;
    s[i] = '\0';

    /* discard the rest of an over-long line, so the next call
       starts at the next line of input */
    while (n > 1 && c != EOF && c != '\n')
        c = getchar();

    return c != EOF ? s : NULL;
}
With such a function the one-to-one correspondence is maintained, the buffer is saturated and there is no runtime-constraint violation.
Am I correct in drawing this conclusion?