Would there have been a difference in performance if C#'s built-in data types were implemented the same way as Java's primitive data types?
For example, in C#, if we write the following code:
int foo = 1;
we are declaring and instantiating an object, right?
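For example, as far as I understand, int in C# is just an alias for the System.Int32 struct, so in code it behaves like any other object (a rough sketch of what I mean; the variable names are made up):

using System;

int foo = 1;

// we can call instance methods on it, just like on any other object:
Console.WriteLine(foo.ToString());

// and we can assign it to an object reference (boxing):
object boxed = foo;
Console.WriteLine(boxed);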
However, in Java, when we write the same piece of code, we are not instantiating an object, since int is "implemented directly by the Java compiler and the JVM."
So I was wondering whether it would have made a difference if C# had used the same implementation as Java, i.e. not creating an object every time a primitive data type like int is used in C#.