1

I need to use an (unmanaged) C++ library. One of its methods takes a `wchar_t*` parameter. I tried calling it from C#, but all my attempts resulted in an 'invalid argument' error code.

I wrote a managed C++ wrapper for it and got the same problem. I then compared the argument values from my C++ wrapper with those from the native C++ example that came with the library. The only significant difference I see is that the NUL character in my managed C++ shows as "0 L''" in the Visual Studio watch window, while in the unmanaged example it shows as simply "0". They both have the same value: 0.

Can this really be the issue? (I tried manually setting that character to '0', but got the same results.) If so, how do I solve it?

EDIT: Image: http://img6.imageshack.us/img6/5977/comparisonofvalues.png On the left side is my code (managed C++); on the right side is the example (unmanaged C++). As it stands, the right one works and the left one doesn't (the function rejects the arguments). I think the issue is in the 17th character, the NUL. Any further thoughts?
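For reference, a minimal C++/CLI sketch of the usual way to hand a managed string to a `const wchar_t*` parameter (the call `library_function` below is a placeholder, not this library's actual API). The pinned buffer of a `System::String` is already NUL-terminated, so no terminator needs to be appended by hand:

```cpp
// Sketch only: pinning a System::String^ so its internal, NUL-terminated
// buffer can be passed to unmanaged code as a const wchar_t*.
#include <vcclr.h>   // PtrToStringChars

void CallLibrary(System::String^ ip)
{
    // Pin the string so the GC cannot move it while unmanaged code reads it.
    pin_ptr<const wchar_t> nativeIp = PtrToStringChars(ip);
    // library_function(nativeIp);   // hypothetical unmanaged call
}
```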

Stonehead
  • 366
  • 2
  • 12
  • 2
    `wchar_t*` is a null-terminated string. Other than that I can't help until you show some code and give details of precisely what the unmanaged library is expecting to be passed. – David Heffernan Sep 23 '11 at 16:18
  • C++/CLI doesn't have NUL or NULL; it has *nullptr*. Your watch doesn't make much sense, but it looks like an empty string L"", which is not the same thing as a null pointer. You'll have to use CharSet.Unicode in your C# [DllImport] declaration. – Hans Passant Sep 23 '11 at 16:56
  • How does `sec` get its value? – wimh Sep 23 '11 at 20:27
  • `sec` is just a pointer; it is given at initialization. I am using it without problems with other functions. This is the one that's giving me trouble (incidentally, it's the only one with a `const wchar_t*` argument). – Stonehead Sep 23 '11 at 21:11
  • Ok, I should point out that there probably was no issue with my code. The emulator I was using behaves differently from the actual hardware. So, basically, this is a bad question. Sorry to have wasted your time, guys, and thank you for trying to help. – Stonehead Oct 04 '11 at 23:53

2 Answers

1

The difference you see in the debugger is just a matter of display. You'll note that the debugger generally shows two values: the binary value and the matching Unicode character. But there is no Unicode character to show for the binary value 0. The two debuggers deal with it in slightly different ways (showing L'' versus showing nothing), but the bits in memory are the same.

On the other hand, your `ip` string is garbage.
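A common way an argument like `ip` ends up as garbage is that the buffer behind the converted pointer is destroyed before the unmanaged call is made. Here is a minimal sketch, assuming the wrapper converts a `System::String^` on the fly (`library_function` is a placeholder, not the real API), that uses msclr's `marshal_context` to keep the buffer alive for the duration of the call:

```cpp
// Sketch only: marshal_context owns the converted native buffer, so the
// pointer stays valid for as long as the context is in scope.
#include <msclr/marshal.h>
using namespace msclr::interop;

void CallLibrary(System::String^ ip)
{
    marshal_context ctx;
    const wchar_t* nativeIp = ctx.marshal_as<const wchar_t*>(ip);
    // library_function(nativeIp);   // hypothetical unmanaged call
}   // ctx is destroyed here; nativeIp must not be used after this point
```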

MSalters
  • 173,980
  • 10
  • 155
  • 350
0

You might check your project properties. There is a compiler option ("Treat WChar_t As Built In Type", /Zc:wchar_t) that controls whether wchar_t is treated as a distinct built-in type or not. Setting it to No (/Zc:wchar_t-) falls back to the old header definition of wchar_t, a typedef for unsigned short, and may fix your problem.
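If it helps, here is a quick compile-time check of which mode is currently in effect (a sketch, assuming a compiler that supports `static_assert`; it is not part of the library):

```cpp
// Sketch only: under /Zc:wchar_t (built-in type) this assertion passes;
// under /Zc:wchar_t- wchar_t is a typedef for unsigned short, so it fails.
#include <cwchar>
#include <type_traits>

static_assert(!std::is_same<wchar_t, unsigned short>::value,
              "wchar_t is a typedef for unsigned short (/Zc:wchar_t- in effect)");
```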

ScottTx
  • 1,483
  • 8
  • 12
  • No difference. I was using that setting to begin with (same as the example, mind you). Thanks for the suggestion, though. – Stonehead Sep 23 '11 at 20:05