
I just compiled the NIST RS274NGC G-Code Interpreter and saw an unbelievable 890 warnings from gcc.

200 of them were caused by this array:

char * _rs274ngc_errors[] = {
/*   0 */ "No error",
/*   1 */ "No error",
/*   2 */ "No error",
/*   3 */ "No error",
/*   4 */ "A file is already open", // rs274ngc_open
<...>

which, according to my basic understanding, should be const char *.
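If I understand the warning correctly, a const-qualified declaration along these lines would compile cleanly (my sketch, not the actual code):

const char * _rs274ngc_errors[] = {
    "No error",
    "A file is already open",  /* rs274ngc_open */
    /* ... and so on ... */
};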

Then I saw these macros (they actually appear several times in different .cc files):

#define AND              &&
#define IS               ==
#define ISNT             !=
#define MAX(x, y)        ((x) > (y) ? (x) : (y))
#define NOT              !
#define OR               ||
#define SET_TO           =

Then I saw a lot of warnings suggesting braces around an empty body in an 'else' statement [-Wempty-body], caused by really strange control-flow-altering macros like this one (yes, with a dangling else!):

#define PRINT0(control) if (1)                        \
          {fprintf(_outfile, "%5d \n", _line_number++); \
           print_nc_line_number();                    \
           fprintf(_outfile, control);                \
          } else
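As far as I can tell, writing PRINT0(something); makes the trailing semicolon become the empty body of that else, which is exactly what -Wempty-body complains about. My understanding is that the usual way to get a statement-like macro is the do { ... } while (0) idiom, roughly like this (my sketch, not the original):

#define PRINT0(control) do {                             \
          fprintf(_outfile, "%5d \n", _line_number++);   \
          print_nc_line_number();                        \
          fprintf(_outfile, control);                    \
          } while (0)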

Meanwhile, the accompanying report states:

A.5 Interpreter Bugs

The Interpreter has no known bugs

All of that makes me wonder: why is it written so strangely? I can understand macros like PRINT0 (error handling in C can be a real pain), but why would anyone use SET_TO instead of =?

I can believe that all this code was generated, but couldn't it have been generated in a warning-free way?

I'm not an expert in any way, I'm just really curious.

Amomum
  • It is just a very old chunk of code; RS-274 goes back to 1980. The author(s) were not yet using a C compiler that was that picky; none were back then. I recognize the programming style: the authors had a background in Algol or Pascal. It is not quite gone yet; today the type of a string literal is still char*, and the macros are still partly covered by iso646.h. Pascal did not have a dangling-else problem: if-then-else was a single statement that was terminated by a semicolon. – Hans Passant Sep 12 '17 at 22:39
  • I don't know what settings you enabled, but literal strings have the type `char *`. – Antti Haapala -- Слава Україні Sep 12 '17 at 22:45
  • @AnttiHaapala I think I used g++ to compile that bit, so it's actually a C++ warning. – Amomum Sep 12 '17 at 22:55
  • @HansPassant can you please make your comment an answer so I can accept it? – Amomum Sep 12 '17 at 22:57
  • @HansPassant interestingly enough, report that I mentioned, is dated year 2000. – Amomum Sep 13 '17 at 21:16

2 Answers


As Hans and Foad point out, this was the norm back then. It's referred to as K&R C, after Brian Kernighan and Dennis Ritchie, authors of the first book about the language. (Note that K&R C can also refer to a formatting style they popularized in that book.)

K&R C was quite forgiving of things that a compiler in C99 or C11 mode would treat as UB (undefined behavior) or a flat-out syntax error, such as defining a function without specifying the types of its arguments. Back then, such parameters were simply assumed to be int.
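For illustration (a hedged sketch, not code from the interpreter), an old-style definition like the first one below relies on implicit int; a modern compiler will warn about it or reject it outright, while the second, prototyped form is what C99/C11 expect:

/* K&R-style definition: no return type, no parameter types, everything defaults to int */
add_old(a, b)
{
    return a + b;
}

/* Modern equivalent with a full prototype */
int add_new(int a, int b)
{
    return a + b;
}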

IIRC the first major overhaul of the language was ANSI C '89; computer power, compilers, and the popularity of the language had changed drastically between its invention and that standardization.

If antiquated C interests you, you may wish to look at the source of the Bourne shell. Quoting Wikipedia's Bourne Shell article:

Stephen Bourne's coding style was influenced by his experience with the ALGOL 68C compiler ...

... Bourne took advantage of some macros to give the C source code an ALGOL 68 flavor. These macros (along with the finger command distributed in Unix version 4.2BSD) inspired the IOCCC – International Obfuscated C Code Contest.

An actual example of the code can be found on rsc's site, and (what I assume is) complete source can be found here.
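From memory, the macros in Bourne's mac.h looked roughly like the following; treat this as a paraphrase and check the links above for the real source:

#define IF      if(
#define THEN    ){
#define ELSE    } else {
#define ELIF    } else if (
#define FI      ;}
#define WHILE   while(
#define DO      ){
#define OD      ;}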


By the way, if you think the number of warnings you got is a lot, you should try turning on additional warnings. There are plenty that are reliable and useful but aren't on by default, though I don't have an up-to-date list of them. -Wall -Wextra would be a decent start.
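For instance, something along these lines (the file name here is just a placeholder):

g++ -Wall -Wextra -Wpedantic -c interp.cc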

Mark
  • I just wasn't expecting such an old piece of code from the document dated 'Aug. 17, 2000' :) And I usually enable `-Wall -Wextra -Wpedantic`. – Amomum Nov 30 '19 at 11:20
  • Ah, yeah. I've seen some other code from NIST that also used dated techniques. Staying up-to-date on best practices doesn't seem to be a priority. – Mark Jan 01 '20 at 17:19

Well, actually, if you look at this from the computer science or mathematical point of view of its time, it makes sense. C syntax is nowadays so mainstream that we don't even think about it, but back then = meant equality, and it still does in a mathematical context. Using = as the assignment operator is a bizarre and rather confusing choice; novice programmers sometimes confuse it with the mathematical equals. In contrast, R uses <- or ->, and Maxima and Modelica use :=; all of these are languages designed by mathematicians. and, or, not, and is are also pretty good choices, used for example in Python. From a programming point of view, using preprocessor macros in this way is a horrible idea, but if you think about the rationale, it is actually C that is to blame.
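For example (a classic illustration, not code from the interpreter), the assignment-in-condition mistake compiles silently in C, and spelling the operators out the way the interpreter's macros do makes the intent much harder to get wrong:

#include <stdio.h>

#define IS      ==
#define SET_TO  =

int main(void)
{
    int x;
    x SET_TO 0;        /* expands to: x = 0; */

    if (x = 5)         /* assigns 5 to x and tests the result: almost certainly a bug */
        printf("always taken\n");

    if (x IS 5)        /* expands to: if (x == 5) */
        printf("taken only when x equals 5\n");

    return 0;
}

(gcc with -Wall will at least suggest parentheses around the first if, which hints at how common that mistake is.)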

Foad S. Farimani