Louis I. answered 04/18/19
Computer Science Instructor/Tutor: Real World and Academia Experienced
Hi - great question.
So let's begin with enum - to be used when we want to define a type that can take on one of a fixed set of named values - such as Direction {North, South, East, West} ...
We either have a need for an enum, or we don't.
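For example, here's a minimal sketch of that Direction type (the Direction/heading names are just mine for illustration):

#include <stdio.h>

/* An enum defines a type whose values come from a small, fixed, named set. */
enum Direction { NORTH, SOUTH, EAST, WEST };

int main(void) {
    enum Direction heading = EAST;
    if (heading == EAST)
        printf("Heading east\n");
    return 0;
}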
Sounds like you're really asking about how best to define a constant - a value held in what's sometimes called WORM (Write-Once, Read-Many) memory.
What's the difference between
#define MAX 100
and
const int MAX = 100;
The #define notation does not reserve a memory location ... it's sometimes referred to as a "macro".
Names created with #define get substituted with the specified text before the compilation step proper - the C/C++ preprocessor program (cpp) runs first and performs the replacement.
In this example, every reference to MAX gets replaced with 100.
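For instance, in a small hypothetical snippet like this (the scores array is just for illustration):

#include <stdio.h>

#define MAX 100   /* no type, no storage - just text for the preprocessor to paste in */

int main(void) {
    int scores[MAX];   /* the preprocessor rewrites this line as: int scores[100]; */
    printf("%zu\n", sizeof scores / sizeof scores[0]);   /* prints 100 */
    return 0;
}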
const <datatype> <name> = <value>, on the other hand, defines a real, typed object - it reserves a location in that WORM-style (read-only) memory, and the compiler type-checks every use of the name.
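A quick sketch of that difference - because a const object actually exists in memory, you can even take its address, which you can't do with a #define'd literal:

#include <stdio.h>

const int MAX = 100;   /* a real, typed object: the compiler sets aside a read-only location */

int main(void) {
    printf("MAX = %d, stored at %p\n", MAX, (void *)&MAX);   /* &MAX works; &100 would not compile */
    return 0;
}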
As it relates to "best practices", since ANSI C incorporated "const", I've preferred it over the C preprocessor / #define approach.
Most of the time, it really doesn't matter which approach you use, but sometimes using const can be more space efficient, since the value lives in one place instead of being copied into every spot where the name appears. For that reason alone, I typically make a habit of using "const".
It also more closely resembles how constants are defined in other high-level languages (such as the "final" keyword in Java).
The #define / macro mechanism, by contrast, is pretty much unique to C/C++.
Clear?
Enjoy
--Lou