
I have an interface which implementing classes export via MEF in my application. The implementing classes are in separate assemblies and aren't known about at compile-time (think plug-ins).

The interface basically consists of a call which says 'here is a bunch of key-value pairs, now initialise your licensing state'. I.e.

public LicensingInfo InitialiseLicense(IEnumerable<KeyValuePair<string, string>> keys)

What I would like to know is - is there any way of protecting that interface from a 'middle man' implementation? I.e. one that receives the call from my application, then calls the same method on the plug-in assembly with a different bunch of key-value pairs, basically saying 'yes - here you go - have everything'.
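
To make that concrete, the kind of 'middle man' I'm worried about would look something like this (the interface and class names are made up, and LicensingInfo is stubbed out, purely for illustration):

    using System.Collections.Generic;

    // Stand-ins for my real types, purely for illustration.
    public class LicensingInfo { }

    public interface ILicensedPlugin
    {
        LicensingInfo InitialiseLicense(IEnumerable<KeyValuePair<string, string>> keys);
    }

    // The 'middle man': an attacker arranges for this to be composed in place
    // of (or in front of) the real plug-in, throws away whatever keys my
    // application passed in, and forwards a forged set instead.
    public class InterceptingPlugin : ILicensedPlugin
    {
        private readonly ILicensedPlugin realPlugin;

        public InterceptingPlugin(ILicensedPlugin realPlugin)
        {
            this.realPlugin = realPlugin;
        }

        public LicensingInfo InitialiseLicense(IEnumerable<KeyValuePair<string, string>> keys)
        {
            // 'Yes - here you go - have everything'
            var forged = new[]
            {
                new KeyValuePair<string, string>("AllFeatures", "Enabled")
            };
            return realPlugin.InitialiseLicense(forged);
        }
    }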

I did try and think of it a different way, in that the application would call the plug-in assembly and pass in an object which could be queried. That method might look something like this:

public LicensingInfo InitialiseLicense(ILicenseQueryProvider provider)
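
Roughly, the provider might expose members like these (the member names here are invented just to illustrate the idea):

    // Invented member names - the real interface would expose whatever
    // licence details the plug-ins need to query.
    public interface ILicenseQueryProvider
    {
        // e.g. 'is this feature licensed for the current installation?'
        bool IsFeatureLicensed(string featureName);

        string GetLicenseValue(string key);
    }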

However, again with this approach I think that an intercepting object could simply pass a different provider to the plug-in assembly.

So, is there a way I could prevent such interface interception, or should I refactor it such that the plug-in assembly is entirely responsible for licence loading etc. within its own code? Or is there another way, perhaps, that I could refactor it that I haven't considered?

Matt Whitfield
    If the attacker can put code on the machine, he has already defeated you. – SLaks Jun 12 '12 at 19:12
  • Is there a trust relationship? For example, the plug-in developer trusts you to pass him a set of keys that represent the user, and he determines what the user has access to? Or are you developing both sides, but you want to make sure that your user hasn't managed to intercept your call and grant himself access to all data? – Chris Shain Jun 12 '12 at 19:15
  • @ChrisShain - it would be the latter. It's an app that our company is making, that will come with plug-ins that we also develop. We just don't want to make it easy to steal. I mean, it's probably not the easiest thing to do anyway, but I'd rather it was as close to 'really really painful to achieve' as possible. – Matt Whitfield Jun 12 '12 at 19:20
  • @SLaks - seeing as the attacker owns the machine, and our software is being installed on it, I don't think that would count as much of a 'win' on the attacker's part :) – Matt Whitfield Jun 12 '12 at 19:21
  • Wow, downvote on the question? Anyone like to enlighten me as to why? – Matt Whitfield Jun 12 '12 at 21:39
  • Slaks is correct though. Because the hacker has control over his machine and the software installed and running on it, including your software, there really is nothing you can do to stop him. Well it all depends on the economics. If it's cheaper to buy than it is to hack, they'll buy. Even better: if they get something valuable from buying they can't get from hacking, they'll buy. Better support for example, or access to your web services. See my answer. – Kris Vandermotten Jun 12 '12 at 21:42
  • @KrisVandermotten - I know, totally. But it's a bit of a puerile point really. Most security, whether software-based or not, aims to stop *most* people, but not all. There will always be a hardcore set of people who, given enough time / inclination, will break the law. That's a fact of life, rather than computer science. So, given that 'casual hackers' who are looking for a 5-minute 'can I break it' pass would be deterred by something relatively straightforward, that's good enough, really. – Matt Whitfield Jun 12 '12 at 21:49

2 Answers


I believe you can make it harder to break, but not with an interface.

Here's what you do:

You need 2 + n projects: one for the exe (let's call it program.exe), one for the contracts (contracts.dll), and one for each of your n plugins (plugin.dll).

program.exe has a hard reference to contracts.dll, as does plugin.dll.

Sign all of them with a strong name key. See http://msdn.microsoft.com/en-us/library/xc31ft41.aspx

Instead of an interface ILicenceQueryProvider, create a sealed class LicenceQueryProvider in contracts.dll. Make sure it has no public constructors, only an internal one, and no methods to modify the object (initialized on construction, immutable and with readonly fields).
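
A rough sketch of what that class could look like (the member names are only an example):

    using System.Collections.Generic;

    // Lives in contracts.dll: sealed, immutable, and only constructible from
    // inside the assembly (or from assemblies granted access to its internals).
    public sealed class LicenceQueryProvider
    {
        private readonly Dictionary<string, string> keys;

        // Internal constructor: only program.exe (via InternalsVisibleTo,
        // see below) can create instances.
        internal LicenceQueryProvider(IEnumerable<KeyValuePair<string, string>> keys)
        {
            this.keys = new Dictionary<string, string>();
            foreach (var pair in keys)
            {
                this.keys.Add(pair.Key, pair.Value);
            }
        }

        // Read-only access for the plug-ins; nothing here mutates state.
        public string GetValue(string key)
        {
            string value;
            return this.keys.TryGetValue(key, out value) ? value : null;
        }
    }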

Mark contracts.dll with an InternalsVisibleToAttribute, granting program.exe access to the internal constructor. See http://msdn.microsoft.com/en-us/library/System.Runtime.CompilerServices.InternalsVisibleToAttribute.aspx
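
The attribute goes into contracts.dll and names program.exe's assembly together with its full public key, since everything is strong-named (the key below is only a placeholder):

    // In contracts.dll, e.g. in AssemblyInfo.cs. The public key shown here is
    // a placeholder - use the actual public key of program.exe.
    [assembly: System.Runtime.CompilerServices.InternalsVisibleTo(
        "program, PublicKey=00240000048000009400000006020000002400005253413100...")]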

This way, program.exe can call the constructor on this object, and plugin.dll can read from it.

plugin.dll "knows" the object class has not been modified because of the strong name signature. And because it's sealed, a man in the middle cannot substitute in another implementation.

Now remember, I said you can make it harder to break, but breaking it is not impossible and never will be, especially if you're using managed code.

For example, a man in the middle can use reflection to instantiate the object with the internal constructor.
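
For example, assuming a constructor shaped like the sketch above, this is all it takes:

    using System.Collections.Generic;
    using System.Reflection;

    static class Attack
    {
        // A man in the middle does not need the internal constructor to be visible.
        static LicenceQueryProvider ForgeProvider()
        {
            var ctor = typeof(LicenceQueryProvider).GetConstructor(
                BindingFlags.Instance | BindingFlags.NonPublic,
                null,
                new[] { typeof(IEnumerable<KeyValuePair<string, string>>) },
                null);

            var forgedKeys = new[]
            {
                new KeyValuePair<string, string>("AllFeatures", "Enabled")
            };

            return (LicenceQueryProvider)ctor.Invoke(new object[] { forgedKeys });
        }
    }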

Even worse, in your plugin there is code that reads from this object, and makes a decision based on licence information. A hacker can decompile your plugin.dll to IL, and replace that code with code that always grants all privileges.

Obfuscation would help just a little bit, but not against the reflection attack. Native code would make it somewhat more difficult, but native code can be patched too.

Ultimately, the code is on the hacker's machine, and the hacker can do what he wants. He can even run it under a debugger, and modify the data in memory. This is the problem that all copy protection and licencing mechanisms face. In my opinion, licences make it harder on your clients to use your software, and will not stop a determined hacker. Do you (or your company) want to make it hard on your clients to use your software?

Now this doesn't mean there is no solution. In fact there is: a hacker cannot modify code that is not on his machine. Have the code run on a server under your control. The client app accesses it through a web service. The web service authenticates the user (not the calling code, that is impossible). Knowing the user, the service can validate the user's licence. This is the only solution.
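
As a rough illustration of that shape (the service address, endpoint and use of basic credentials are placeholders, not a prescription):

    using System;
    using System.Net;

    // Sketch: the valuable logic runs on the vendor's server; the client only
    // sends an authenticated request and consumes the result.
    public class LicensedServiceClient
    {
        private readonly Uri serviceUri;
        private readonly ICredentials credentials;

        public LicensedServiceClient(Uri serviceUri, string user, string password)
        {
            this.serviceUri = serviceUri;
            this.credentials = new NetworkCredential(user, password);
        }

        public string RunValuableOperation(string input)
        {
            using (var client = new WebClient { Credentials = credentials })
            {
                // The server authenticates the user, validates that user's
                // licence, and only then executes the real logic.
                return client.UploadString(serviceUri, input);
            }
        }
    }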

UPDATE

Just to be clear: such a service needs to run the actual code that has value for the user, not just a licence check. In the latter case, a hacker could modify the client to simply not make the call, or even substitute a fake licence server. However, the assumption is that buying a licence is cheaper than recreating the actual logic living in the service. In that case, even hackers will prefer to buy over recreating the code.

Kris Vandermotten
  • Yeah, I agree with pretty much everything you say - but I do have a question - why would you put the sealed provider in a third DLL and not in the main application? I can definitely see the benefit - and I did consider that approach, but I figured the reflection constructor call was quite easy. However, it's the 10-80-10 principle, and I want to catch the 80, rather than the hardcore 10... And I don't necessarily agree licensing makes it hard on users unless your implementation sucks. And I'll be trying my best to make sure it doesn't :). Thanks. – Matt Whitfield Jun 12 '12 at 21:44
  • I would put it in a third assembly, just like the other contracts (i.e. interfaces for MEF), so that I can update my exe without requiring recompilation of the plugins. Though I must admit that this is more important if plugins are developed by third parties. BTW, note that the internal constructor through reflection problem exists when the class sits in program.exe too. – Kris Vandermotten Jun 12 '12 at 22:19

There isn't a bulletproof way to secure your software.

We once used (advertisements ahead) Safenet Inc's Sentinel hardware keys, Dotfuscator Pro and SmartAssembly to protect some of our applications.

The hardware keys can be used to store licenses (i.e. each feature/plugin has its own license that can be enabled on the hardware key and queried by the application). Optionally their product can be used to encrypt your application - the application gets encrypted and can then only be decrypted in memory and started when the right hardware key is attached to the system. There are anti-debugging mechanisms in place to make it harder for someone to use a debugger, wait for the application to get decrypted in memory and then copy it.

Dotfuscator and SmartAssembly can be used to "obfuscate" the code of applications to make decompiling using tools like Reflector harder.

None of these tools is bulletproof though. But "really really painful to achieve/steal"? I'd say so, but that comes at a price... you sure can throw a lot of money at these tools.

stmax
  • All true. And unfortunately, all these things make it harder on your legitimate customers to use your software. And they make it harder for you to sell it: now you have to physically ship a dongle! And you need a help desk to take the calls from users who cannot get it to work for whatever reason ("I have no USB ports on my machine"). And you need to replace broken dongles. And so on. It's a real nightmare, and an expensive one too, for your customer and for you. – Kris Vandermotten Jun 12 '12 at 20:35
  • Hardware keys make sense for a few products (products with low volume, high price, direct support at the customer site) but that's about it. Obfuscation tools are totally unobtrusive to the user and better than nothing.. they sure make decompiling quite a bit harder compared to starting Reflector to copy/paste your C# code. – stmax Jun 12 '12 at 20:45
  • Software that requires direct support at the client site doesn't really need dongles. In fact "requiring direct support at the client site" is probably as good as "requiring a dongle at the client site", maybe even better. – Kris Vandermotten Jun 12 '12 at 21:36
  • Hardware keys wouldn't really be an option in this case. I do know about obfuscation etc. I've used EZiriz .NET Reactor in the past - which is a good obfuscator. When it works. And when it doesn't, the support is shocking. – Matt Whitfield Jun 12 '12 at 21:46
  • @KrisVandermotten You install a 20k EUR license on your customers system and train their people. Then you leave. How do you keep them from copying it to a dozen other systems without paying you? – stmax Jun 12 '12 at 21:46