
I have been writing some code that passes delegates around to perform the actual work, but now need to deduce the delegate's parameterised type. I used overloaded functions for the types I'm interested in, and was surprised to hit the ambiguity shown in the example below.

I realise I could cast the delegate parameter to resolve the ambiguity, but why is the ambiguity there, and is there any other way to resolve it without explicitly instantiating/casting a delegate prior to the call to CallGotTParam?

using System;

namespace ConversionAmbiguous
{
    class IntShortDelegate
    {
        delegate void GotTParam<T> (T t);

        static void GotInt32Param(System.Int32 t)
        {
            Console.WriteLine("GotInt32Param: {0}", t);
        }
        static void GotInt16Param(System.Int16 t)
        {
            Console.WriteLine("GotInt16Param: {0}", t);
        }

        void CallGotTParam(GotTParam<System.Int32> dcall)
        {
            dcall(1);
        }

        void CallGotTParam(GotTParam<System.Int16> dcall)
        {
            dcall(2);
        }

        public static void Test()
        {
            IntShortDelegate test = new IntShortDelegate();
            // error CS0121: The call is ambiguous between the following methods or properties: 
            // 'ConversionAmbiguous.IntShortDelegate.CallGotTParam(ConversionAmbiguous.IntShortDelegate.GotTParam<int>)' and
            // 'ConversionAmbiguous.IntShortDelegate.CallGotTParam(ConversionAmbiguous.IntShortDelegate.GotTParam<short>)'
            test.CallGotTParam(IntShortDelegate.GotInt32Param);

            GotTParam<System.Int32> d32 = IntShortDelegate.GotInt32Param;
            test.CallGotTParam(d32); // This is fine
            test.CallGotTParam(IntShortDelegate.GotInt16Param); // This is fine
        }
    }  // class IntShortDelegate
} // Ends namespace ConversionAmbiguous

Compiled against .NET 3.5.

I had a brief look at the C# Language Specification version 4.0, sections 7.5.2 (Type Inference) and 7.5.3 (Overload Resolution), but haven't had time to study them.
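For reference, here is a minimal sketch of the cast-based workaround mentioned above. The class name `CastWorkaround` is my own; the delegate, methods and overloads mirror the example. Both call forms pick the `GotTParam<Int32>` overload at the call site without declaring a separate local:

```csharp
using System;

namespace ConversionAmbiguous
{
    class CastWorkaround
    {
        delegate void GotTParam<T>(T t);

        static void GotInt32Param(Int32 t)
        {
            Console.WriteLine("GotInt32Param: {0}", t);
        }

        void CallGotTParam(GotTParam<Int32> dcall) { dcall(1); }
        void CallGotTParam(GotTParam<Int16> dcall) { dcall(2); }

        static void Main()
        {
            CastWorkaround test = new CastWorkaround();

            // An explicit cast selects the overload, so no named local is needed:
            test.CallGotTParam((GotTParam<Int32>)GotInt32Param);

            // The delegate-creation expression form resolves it the same way:
            test.CallGotTParam(new GotTParam<Int32>(GotInt32Param));
        }
    }
}
```

Both calls invoke the `GotTParam<Int32>` overload, which calls the delegate with 1.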

WaffleSouffle
    You should use C# specific type names (int, short...) instead of CLS type names, since these could change overtime, while language-specific synonyms should maintain "as is" in the long-term. – Matías Fidemraizer Jan 20 '12 at 10:07
    @MatíasFidemraizer I don't believe that type names like `Int32` will change meaning any time soon! In fact, `int` could change its number of bits sooner than that. – Mr Lister Jan 20 '12 at 10:14
  • @MrLister I never understood that fear either. – Louis Kottmann Jan 20 '12 at 10:15
  • @MatíasFidemraizer It's for a library which handles different primitive types. i.e. 32 bit integer will be treated one way, 16 bit integer another. Changing to int/short would break if they didn't map exactly. – WaffleSouffle Jan 20 '12 at 10:18
  • @WaffleSouffle You're wrong, since int = System.Int32, short = System.Int16. It's exactly the same, but using language-specific aliases. – Matías Fidemraizer Jan 20 '12 at 10:58
  • @MrLister Int32, Single or any of these types (even _String_ instead of **string**) are the actual structs/classes for int, short, decimal, long and so on. But these identifiers are the common ones for all CLS-compatible .NET languages, including VB, C# and others, and any of these languages have their own semantics, so these types are called in a different way depending on language. If you use CLS/CLR names, if someday (who knows) there's some change in .NET specification, your code will be broken and you need to refactor your code, but an integer with 32bit precision will be always Int32... – Matías Fidemraizer Jan 20 '12 at 11:07
  • @MatíasFidemraizer So are you saying there's no way that in the future int could change to be System.Int64 ? That would surprise me. – WaffleSouffle Jan 20 '12 at 11:16
  • @MrLister ... so, at the end of the day, you're coding C#, not "pure .NET bytecode", and in C# language, a 32bit integer is **int**. In other words, the more predictable code will be one using **int**, **string**, **decimal**... – Matías Fidemraizer Jan 20 '12 at 11:26
  • @WaffleSouffle I doubt it, because Int64 is **long**. – Matías Fidemraizer Jan 20 '12 at 11:27
  • Glancing at wikipedia and the .net 4 specification, in general **and** in c# _int_ is 4 bytes. That doesn't seem to be set in stone, but is conventional. At the time of writing in general **and** in c# _long_ is 64-bit. However this is more susceptible to change outside of c#. The .net 4 spec kinda lays down the law 4.1.5 Integral Types: "_The int type represents signed 32-bit integers_" and "_The long type represents signed 64-bit integers_". Overall this differs across languages (although it would be nice if a c-like language was c-like). People also draw a managed/unmanaged distinction... – WaffleSouffle Jan 20 '12 at 15:22
  • [Good stack overflow answer on this](http://stackoverflow.com/questions/384502/what-is-the-bit-size-of-long-on-64-bit-windows), [wikipedia 64-bit article](http://en.wikipedia.org/wiki/Integer_(computer_science)#Long_integer) – WaffleSouffle Jan 20 '12 at 15:34
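The alias claim debated in the comments can be checked directly. This quick sketch (my addition, not from the thread) shows that the C# keywords are compile-time aliases for the CLR struct types, and that the C# specification fixes `int` at 32 bits regardless of platform:

```csharp
using System;

class AliasCheck
{
    static void Main()
    {
        // int and long are exact aliases for the CLR types:
        Console.WriteLine(typeof(int) == typeof(Int32));   // True
        Console.WriteLine(typeof(long) == typeof(Int64));  // True

        // sizeof(int) is allowed outside unsafe code for primitive types;
        // the C# spec defines int as a signed 32-bit integer:
        Console.WriteLine(sizeof(int) * 8);                // 32
    }
}
```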

1 Answer


Not an answer per se, but if the following code is fine

GotTParam<System.Int32> d32 = IntShortDelegate.GotInt32Param;             
test.CallGotTParam(d32); // This is fine 

could you also try this?

// Note: before C# 10, this line does not compile; method groups have no
// natural type, so it fails with CS0815 ("Cannot assign method group to an
// implicitly-typed local variable").
var typeDelegate = IntShortDelegate.GotInt32Param;
test.CallGotTParam(typeDelegate);

If this works, you could maintain the generic, type-agnostic aspect of your code while working around a compiler quirk.

Dr. Andrew Burnett-Thompson