In VS2012 (and previous versions...), you can specify the target platform when building a project. My understanding, though, is that C# gets "compiled" to CIL and is then JIT compiled when running on the host system.

Does this mean that the only reasons for specifying a target platform are to deliberately restrict users from running the software on certain architectures or to force the application to run as 32 bit on a 64 bit machine? I can't see that it would be to do with optimization, as I guess that happens at the CIL-->Native stage, which happens Just-In-Time on the host architecture?

This MS Link does not seem to offer any alternative explanation, and I can find no suggestion that you should, for example, release separate 32 / 64 bit versions of the same application - it would seem logical that something compiled for "anycpu" should run just as well on either, with optimizations again applied at the JIT stage.

AJ.
  • I don't know if it has changed in VS2012, but the "edit & continue" feature doesn't work while debugging apps in 64bit mode under VS2010. – David Oct 09 '12 at 05:07
  • @David: It hasn't. (but you can now use E&C to edit method bodies that contain anonymous methods or lambdas, minus the actual statements where they appear) – Allon Guralnek Oct 14 '12 at 13:38

2 Answers

Does this mean that the only reasons for specifying a target platform are to deliberately restrict users from running the software on certain architectures or to force the application to run as 32 bit on a 64 bit machine?

Yes, and this is critical if you're using native code mixed with your managed code. It does not change what gets optimized at runtime, however.

If your code is 100% managed, then AnyCPU (or the new AnyCPU Prefer 32-Bit) is likely fine. The compiler optimizations will be the same, and the JIT will optimize at runtime based on the current executing platform.
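One way to observe this at runtime is to check the bitness the JIT actually chose for the process. A minimal sketch (any .NET 4.0+ console app; nothing here is specific to VS2012):

```csharp
using System;

class BitnessDemo
{
    static void Main()
    {
        // The same AnyCPU assembly reports different values depending
        // on the process it is JIT-compiled into: IntPtr.Size is 8 in
        // a 64-bit process and 4 in a 32-bit (or WOW64) process.
        Console.WriteLine("IntPtr.Size:    {0}", IntPtr.Size);
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
    }
}
```

Built as AnyCPU, this prints 8/True on a 64-bit OS; built as x86 (or AnyCPU Prefer 32-Bit), the same source prints 4/False on the same machine - the IL is identical, only the flag and the resulting runtime differ.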

I can find no suggestion of the fact that you should, for example, release separate 32 / 64 bit versions of the same application

There is no reason to do this unless you're performing interop with non-managed code, in which case, that will require separate 32 and 64 bit DLLs.
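If you do need interop but still want a single AnyCPU build, one common workaround is to ship both native DLLs and pre-load the matching one at startup. A hedged sketch - the `NativeLib.dll` name and the `x86`/`x64` folder layout are assumptions, not anything from the original answer:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

static class NativeLoader
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr LoadLibrary(string path);

    // Call once before any [DllImport("NativeLib.dll")] method is hit.
    public static void Preload()
    {
        // Pick the folder matching the current process bitness, then let
        // the normal DllImport resolution find the already-loaded copy.
        string arch = (IntPtr.Size == 8) ? "x64" : "x86";
        string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory,
                                   arch, "NativeLib.dll");
        if (LoadLibrary(path) == IntPtr.Zero)
            throw new DllNotFoundException(path);
    }
}
```

The alternative - simply flagging the EXE as x86 or x64 and shipping one native DLL per build - is the scenario Reed describes, where separate 32 and 64 bit releases become necessary.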

Reed Copsey
  • Does the P/Invoke or "unsafe" stuff depend on this setting? (I suppose those could be considered not 100% managed ..) –  Oct 08 '12 at 19:27
  • @pst P/Invoke is going against native, so it matters there (unless you have some way, at runtime, to make sure you're only hitting the right platform DLLs), but it's not an optimization issue - it's a correctness issue. – Reed Copsey Oct 08 '12 at 19:28
  •
    @pst - Yes. Because setting this platform flag (if used on the entry executable responsible for initiating the runtime) will result in the code being loaded into either a 32 or 64 bit memory space. If your p/invoke is written to expect pointers and other memory specific types to be of 32-bit dimensions, and it's loaded into a 64-bit runtime (or vice-versa) things will fail. – Adam Oct 08 '12 at 19:30
  •
    I'd another valid (albeit minor) reason to why you'd want to limit the platform. In certain scenarios, memory consumption can be reduced by running in 32-bit compared to 64-bit, as all object references are only 4 bytes instead of 8. Forcing 32-bit can decrease a program's memory footprint if you know you'll never consume more than 2 GB. – Allon Guralnek Oct 08 '12 at 19:52
  • @AllonGuralnek Yeah - in general, unless you need 64 bit, 32 bit is often better. I believe that's part of why the default in VS 2012 is now "AnyCPU Prefer 32-bit" instead of just AnyCPU. – Reed Copsey Oct 08 '12 at 19:55
  • @Reed: Huh, it's the first time I've heard of this new AnyCPU variant. I think it should have been called just "32-bit", because according to [this article](http://blogs.microsoft.co.il/blogs/sasha/archive/2012/04/04/what-anycpu-really-means-as-of-net-4-5-and-visual-studio-11.aspx), it will only ever run as 32-bit. Or perhaps it will run as 64-bit on Itanium (but who cares)? – Allon Guralnek Oct 09 '12 at 09:12
  • Excellent answer and comments, thank you. @Allon Guralnek: in my case, memory consumption is not a problem as I'm only tending to run on machines with 8GiB of RAM, but I guess if I released this more generally including to people running 32 bit machines (hence the question), that could be a problem. I wonder how much of a tradeoff this is, though? Presumably running 32 bit JIT'd code on a 64 bit machine running 64 bit Windows also implicates WOW64 and additional CPU mode changes from Long to Compatibility modes? – AJ. Oct 09 '12 at 22:20
  • @AJ: Actually, running in 32-bit mode in 64-bit Windows can yield performance improvements. WOW64 is so thin it [doesn't affect performance](http://goo.gl/znyop), but the decreased memory usage by a 32-bit app means that more can fit on the CPU cache (and other caches) and memory takes less time to allocate, read and write. These performance gains can be noticeable (or even significant) for certain kinds of applications. There's no downside, therefore you should always prevent your applications running in 64-bit unless you suspect you're going to exhaust the 2GB address space. – Allon Guralnek Oct 10 '12 at 20:59

Reed has a good answer here. However, I think it's also important to point out that this setting is just a flag in the assembly - in most situations it has no effect whatsoever. It is the runtime loader's (the bit of native code that starts the .NET runtime) responsibility to look at this flag and direct the appropriate version of the .NET runtime to be started up.

Because of this, the flag really only matters when it is set on an EXE file - it has no effect when set on a DLL. For example, if you have a '32-bit-flagged .NET DLL' which is used by either a 64-bit-flagged .NET EXE or an any-cpu-flagged .NET EXE, and you run the EXE on a 64-bit machine, then the loader will start the 64-bit runtime. When it comes time to load the 32-bit DLL, it's too late - the 64-bit runtime has already been chosen, so your program will fail (I believe it's a BadImageFormatException that you will receive).
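The failure described above is easy to reproduce. A sketch, where `Lib32.dll` is a placeholder for any x86-flagged managed assembly loaded from a 64-bit process:

```csharp
using System;
using System.Reflection;

class LoadDemo
{
    static void Main()
    {
        try
        {
            // Fails in a 64-bit process if Lib32.dll was built
            // with the x86 platform target.
            Assembly.LoadFrom("Lib32.dll");
        }
        catch (BadImageFormatException ex)
        {
            // The runtime's bitness was fixed by the EXE's flag at
            // startup and can no longer match the DLL's requirement.
            Console.WriteLine("Load failed: " + ex.Message);
        }
    }
}
```

You can also inspect the flag on an existing assembly with the SDK's CorFlags.exe tool (e.g. `CorFlags MyLib.dll`), which reports whether the 32-bit-required bit is set, without needing to load it.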

Adam
  • +1 Very good complement to Reed's answer. And yes, you are right about the `BadImageFormatException` error. – Icarus Oct 08 '12 at 19:41