
I have a function that takes an iterator into a char * buffer (the iterator is itself a char *). The function needs to increment the iterator. I found that a good way of passing the iterator into the function is to pass the pointer by reference:

bool myFunction(unsigned char * &iter)
{
   ...

However, I've heard that this is bad form and could cause problems. Here is the method that my coworker suggested I use:

typedef unsigned char * unsignedcharptr;
bool myFunction(unsignedcharptr &iter)
{
   ...

It looks to me like they're both doing the same thing. Is there a compiler difference between these two methods (I'm using Visual Studio 2005)? Which is correct?

Joe Lyga
  • I would rather consider typedefs for pointer types to be the bad form that could cause problems, but for the compiler there isn't any difference. – Christian Rau Feb 10 '12 at 16:53
  • Why not return a distance to increment by instead of modifying the pointer? (A sketch of that alternative follows these comments.) – Anycorn Feb 10 '12 at 16:54
  • Semantically there is no difference, but yes, the second one is generally preferred; typedefs are easier to read. – Mr.Anubis Feb 10 '12 at 16:55
  • `typedef` creates an alias for an existing type, and is used mostly for readability purposes. – a1ex07 Feb 10 '12 at 16:55
  • @Mr.Anubis Well, just until you forget that `iter` is a pointer and not an object. A `*` just says so much more than a `ptr` inside a long type name. – Christian Rau Feb 10 '12 at 17:03
  • @ChristianRau I didn't get what you mean. – Mr.Anubis Feb 10 '12 at 17:05
  • @Mr.Anubis I meant that it is *IMHO* not a good idea to incorporate pointers into typedefs, because that hides the *pointerness* of the type, making it look like an ordinary (non-pointer) variable type, until you finally see the `ptr` phrase appended to the type name, which is *IMHO* much harder to parse (for a human) than a simple and obvious `*`, which signals *"it's a pointer type!"* immediately when you see it. But then, at least with smart pointers you don't get around that problem (and I don't like typedefs for those either). – Christian Rau Feb 10 '12 at 17:12
  • @ChristianRau Well, at the end of the day it's your choice whether you want to use typedefs or not, but in some cases, e.g. conversion operators involving a pointer to some type or other harder constructs, you have to use typedefs. – Mr.Anubis Feb 10 '12 at 17:17
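A minimal sketch of the alternative Anycorn suggests above: the function reports how far to advance, and the caller moves the pointer itself. The name `distanceToAdvance` and the stop condition are made up purely for illustration.

#include <cstddef>

// Hypothetical alternative: report how far to advance instead of
// modifying the caller's pointer in place.
std::size_t distanceToAdvance(const unsigned char *iter, std::size_t available)
{
    std::size_t n = 0;
    while (n < available && iter[n] != ' ')  // made-up stop condition
        ++n;
    return n;
}

// Caller side:
//   iter += distanceToAdvance(iter, end - iter);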

2 Answers


I don't think there's any difference between the two. You should ask your coworker why he believes there is.
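If it helps, here is a small, self-contained illustration (the function names are made up): because the typedef does not introduce a new type, both declaration styles produce functions of exactly the same type.

typedef unsigned char * unsignedcharptr;

bool withRawSpelling(unsigned char * &iter) { ++iter; return true; }
bool withTypedef(unsignedcharptr &iter)     { ++iter; return true; }

int main()
{
    // One function-pointer variable can hold either function,
    // which only works because their types are identical.
    bool (*fp)(unsigned char * &) = &withRawSpelling;
    fp = &withTypedef;

    unsigned char buffer[] = { 'a', 'b' };
    unsigned char *iter = buffer;
    fp(iter);   // advances iter to buffer + 1
    return 0;
}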

The typedef might be intended for maintainability: if you ever wanted to switch from unsigned char * to char *, you would only have to change one place. But even that, I think, could lead to other problems.
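For example (purely an illustration; `someOtherFunction` is a hypothetical second user of the alias), the switch would then happen on exactly one line:

typedef unsigned char * unsignedcharptr;        // switch to char * here, in one place

bool myFunction(unsignedcharptr &iter);
bool someOtherFunction(unsignedcharptr &iter);  // hypothetical: also picks up the change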

Luchian Grigore
  • @Adrian that's subjective though. I find `unsigned char*` more readable than `unsignedcharptr`, while some comments suggest otherwise. What do you think? – Luchian Grigore Feb 10 '12 at 17:07
  • Nah, it wasn't about readability. I think she thinks that it would cause memory issues. – Joe Lyga Feb 10 '12 at 17:09
  • @JoeLyga I don't know exactly what either of you had in mind, but she got that wrong as far as the particular code example above is concerned. – Mr.Anubis Feb 10 '12 at 17:20

Is there a compiler difference between these two methods (I'm using Visual Studio 2005)?

As others have correctly noted, "no".

Which is correct?

Between the two alternatives, it comes down entirely to the "should I hide pointers behind typedefs" debate. There are valid arguments for either position.

However, I think that both of your code snippets suffer from over-specialization. I prefer to code algorithms as template functions so that I Don't Repeat Myself.

If your design supports it, you could generalize your code to accept any iterator:

#include <iostream>

template <class InputIterator>
bool myFunction(InputIterator &iter)
{
  std::cout << *iter;  // use the element the iterator currently refers to
  ++iter;              // advance the caller's iterator
  return true;
}
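For instance (the caller code below is not part of the original answer, just a sketch), the same template then compiles unchanged for a raw unsigned char * and for a std::vector iterator:

#include <vector>

int main()
{
    unsigned char buffer[] = { 'x', 'y', 'z' };

    unsigned char *p = buffer;
    myFunction(p);                   // InputIterator = unsigned char *

    std::vector<unsigned char> v(buffer, buffer + 3);
    std::vector<unsigned char>::iterator it = v.begin();
    myFunction(it);                  // InputIterator = std::vector<unsigned char>::iterator
    return 0;
}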
Robᵩ