In the pre-.NET world I always assumed that int is faster than byte, since that is the word size the processor works with natively.
Now it's a matter of habit to use int even when byte would work, for example when a byte is what is stored in the database.
Question: How does .NET handle the byte type versus int, from the point of view of performance and memory?
Update: Thanks for the input. Unfortunately, nobody really answered the question: how does .NET handle byte vs. int?
And if there is no difference in performance, then I like how chills42 put it: "int for arithmetic, bytes for binary," which is what I will continue to do.
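One concrete illustration of the "int for arithmetic" side: the C# language defines no arithmetic operators on byte operands, so both sides are promoted to int and the result is an int (and at the CLR level, values smaller than 32 bits are widened to int32 on the evaluation stack anyway). A minimal sketch:

```csharp
byte a = 100, b = 100;

// a + b is evaluated as int + int; the result type is int.
int total = a + b;          // no cast needed: "int for arithmetic"

// Storing the result back in a byte requires an explicit narrowing cast.
byte sum = (byte)(a + b);

// So byte arithmetic saves no CPU work; byte is mainly useful for
// saving memory in large arrays and for matching binary/database formats.
```

This is why sticking with int for calculations and reserving byte for raw binary data (buffers, I/O, database columns) is a reasonable default.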