
I have a handful of applications that use the .NET 1.1 framework. We are upgrading our servers to 2008 R2 64-bit, or maybe even 2012 64-bit. Our client-side apps will run on Citrix, which will also be on a 2008 R2 64-bit or 2012 64-bit box.

Now, am I correct in thinking that if .NET 1.1 isn't installed, applications will automatically use a higher version of the framework? And what problems are there with this strategy: try the 1.1 apps on the new servers, and if they work, I can delay upgrading them.

paul rockerdale

1 Answer


No, they won't run unless .NET 1.1 is installed. Applications can only use the framework version they were compiled for. (They may be able to be upgraded if the source code is available.)

.NET 1.1 is available for Windows Server 2008 but not for 2008 R2 or Server 2012, so you will have to upgrade the applications.

See here, though: http://support.microsoft.com/kb/2489698 (it indicates installing 1.1 may be possible).

See this for Server 2012: http://msdn.microsoft.com/en-us/library/hh925570%28v=vs.110%29.aspx
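If the source isn't available for recompiling, one workaround sometimes suggested is adding a `<supportedRuntime>` element to the application's `.exe.config` so the loader binds the app to a newer CLR. This is not guaranteed to work for every 1.1 app (breaking changes can still bite); a minimal sketch, assuming a hypothetical `MyLegacyApp.exe`:

```xml
<!-- MyLegacyApp.exe.config (hypothetical filename), placed next to the .exe -->
<configuration>
  <startup>
    <!-- Runtimes are tried in the order listed; with 1.1 absent,
         the app is asked to run on the CLR 2.0 runtime instead. -->
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```

Even with this in place, test the application thoroughly: binding to a newer runtime only changes which CLR loads, it does not fix code that relies on behavior that changed between framework versions.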

Dreamwalker
  • http://msdn.microsoft.com/en-us/library/hh925570(v=vs.110).aspx. This article ends with "You should always try to install the application first to determine if it will automatically be updated to a later version of the .NET Framework. If it does not, contact your ISV for an application update." – paul rockerdale Apr 10 '14 at 12:27
  • http://msdn.microsoft.com/en-us/library/ff962563(v=vs.110).aspx. This article shows a way to retarget a framework in the config file – paul rockerdale Apr 10 '14 at 12:32
  • You may get lucky and the app doesn't use a feature that has a breaking change. If the app uses an installer, that may not work; if so, you may be able to copy the installation from an old machine. If you really have to, you could always use Hyper-V and run the application on a Windows version it supports. – Dreamwalker Apr 10 '14 at 13:02