
I'm looking for the right approach to verify a currently running executable from within that executable. I've already found a way to compute a (SHA256) hash for the file that is currently running.

The problem is: where do I safely store this hash? If I store it in a config file, a malicious user can just calculate his own hash and replace it. If I store it in the executable itself, it can probably be overwritten with a hex editor.

A suggestion I read was to use asymmetric encryption (or was it decryption?), but how would I go about this?

A requirement is that the executable code hashes and en/decrypts identically on different computers; otherwise I can't verify correctly. The computers will all be running the same OS, which is Windows XP (Embedded).

I'm already signing all of my assemblies, but I need some added security to successfully pass our Security Target.

For those who know, it concerns FPT_TST.1.3: The TSF shall provide authorised users with the capability to verify the integrity of stored TSF executable code.
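For context, the hashing step described above can be sketched roughly like this (a minimal sketch, assuming .NET on the XP Embedded targets; the class and method names are illustrative, not from the actual code):

```csharp
using System;
using System.IO;
using System.Reflection;
using System.Security.Cryptography;

static class SelfHash
{
    // Computes the SHA-256 hash of the currently executing assembly's file.
    // Reading our own image is allowed because the loader opens the file
    // with read sharing; only writes are locked.
    public static byte[] ComputeOwnHash()
    {
        string path = Assembly.GetExecutingAssembly().Location;
        using (var sha = SHA256.Create())
        using (var fs = File.OpenRead(path))
        {
            return sha.ComputeHash(fs);
        }
    }
}
```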

Davio
  • 4,609
  • 2
  • 31
  • 58
  • 9
    If it is suspected that the executing executable might have been edited, doesn't that also mean they could have removed (NOP'd) the verification step itself? – Marc Gravell Jan 04 '13 at 08:41
  • 1
    You should figure out why strong signing is not enough and then try to find an approach to solve it. Signing the same files twice with different keys is not going to give you much more security than signing with one... – Alexei Levenkov Jan 04 '13 at 08:42
  • @AlexeiLevenkov of course, a malicious user with enough access can simply use `SN -Vr` at the command line to disable that – Marc Gravell Jan 04 '13 at 08:46
  • 1
    You can't store the checksum in the executable itself because if you do so, the checksum changes. This is recursive. I recommend putting the hash and the RSA-signed hash into an external file. The RSA signature blocks user-created hashes. Of course, a malicious user can still NOP the checksum validation. So there is no bullet-proof way to make it absolutely safe. – nikeee Jan 04 '13 at 08:59
  • 1
    @MarcGravell I understand the problem, but it is something required by the security target, that's why I need to put it in there. I just need to be able to point out: Look, we have it, it's there, so it can be crossed off the list. – Davio Jan 04 '13 at 08:59
  • @nikeee13 Indeed, a user could always change an entire folder with his own files if he gets access to it. We use multiple layers of security and this is one that is required. – Davio Jan 04 '13 at 09:01
  • Can't you just verify the public key? – leppie Jan 04 '13 at 10:39
  • @MarcGravell, disabling strong name verification does not make the signature valid or invalid - it only skips the check (one can still validate it, e.g. http://blogs.msdn.com/b/shawnfa/archive/2004/06/07/150378.aspx). To my knowledge it is the same kind of signature as Authenticode (and over about the same bytes), so the fact that signing the assembly is not acceptable rules out other similar approaches, in my opinion. – Alexei Levenkov Jan 04 '13 at 17:04
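The RSA-signed-hash scheme suggested in the comments can be sketched as follows (a minimal illustration, not a hardened implementation; the class name is hypothetical, only the public key would ship with the application, and on XP the AES-enhanced CSP, provider type 24, may be needed for SHA-256 signatures):

```csharp
using System;
using System.Security.Cryptography;

static class HashSignature
{
    // Signing side (build machine): sign the SHA-256 hash with the private key.
    // On Windows XP, constructing the provider with CspParameters(24) may be
    // required for SHA-256 support.
    public static byte[] SignHash(byte[] sha256Hash, RSAParameters privateKey)
    {
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.ImportParameters(privateKey);
            return rsa.SignHash(sha256Hash, CryptoConfig.MapNameToOID("SHA256"));
        }
    }

    // Verifying side (deployed executable): only the public key is embedded,
    // so a user cannot forge a signature over a recalculated hash.
    public static bool VerifyHash(byte[] sha256Hash, byte[] signature, RSAParameters publicKey)
    {
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.ImportParameters(publicKey);
            return rsa.VerifyHash(sha256Hash, CryptoConfig.MapNameToOID("SHA256"), signature);
        }
    }
}
```

As the comments note, this only stops users from substituting their own hash; it does not stop anyone from patching the verification call out of the binary.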

2 Answers


All the comments, especially the one from Marc, are valid.

I think your best bet is to look at Authenticode signatures - that's kind of what they're meant for. The point being that the exe or dll is signed with a certificate (stamping your organisation's information into it, much like an SSL request) and a modified version cannot (in theory, plus with all the normal security caveats) be re-signed with the same certificate.

Depending upon the requirement (I say this because this 'security target' is a bit woolly - the ability to verify the integrity of the code can just as easily be a walkthrough on how to check a file in Windows Explorer), this is either enough in itself (Windows has built-in capability to display the publisher information from the certificate) or you can write a routine to verify the Authenticode certificate.

See this SO question: Verify whether an executable is signed or not (signtool used to sign that exe); the top answer links to an (admittedly old) article on how to programmatically check the Authenticode certificate.
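As a starting point, the certificate can be pulled out of a signed file with the built-in `X509Certificate.CreateFromSignedFile` API. A minimal sketch (note the caveat in the comment: this inspects the embedded certificate and its chain, but a full Authenticode verification of the file hash against the signature requires the native WinVerifyTrust API):

```csharp
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

static class AuthenticodeCheck
{
    // Extracts the Authenticode signer certificate from a signed PE file and
    // checks that it chains to a trusted root. This does NOT verify that the
    // file contents still match the signature; use WinVerifyTrust for that.
    public static bool HasValidSignerCertificate(string path)
    {
        try
        {
            var cert = new X509Certificate2(X509Certificate.CreateFromSignedFile(path));
            var chain = new X509Chain();
            return chain.Build(cert);
        }
        catch (CryptographicException)
        {
            return false; // the file carries no Authenticode signature at all
        }
    }
}
```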

Update

To follow on from what Marc suggested - even this won't be enough if a programmatic self-check is required. The executable can be modified to remove the check and then deployed without the certificate, thus killing it.

To be honest - the host application/environment really should have its own checks in place (for example, requiring a valid Authenticode certificate) - if it's so important that code isn't modified, then the host should have its own steps for verifying that. I think you might actually be on a wild goose chase.

Just put in whatever check will take the least amount of effort on your behalf, without worrying too much about the actual security it apparently provides - because I think you're starting from an impossible point. If there is any genuine reason why someone would want to hack the code you've written, then it won't just be a schoolboy who tries to hack it. Therefore any solution available to you (those mentioned in the comments etc.) will be subverted easily.

Rent-a-quote final sentence explaining my 'wild goose chase' comment

Following the weakest link principle - the integrity of an executable file is only as valid as the security requirements of the host that runs that executable.

Thus, on a modern Windows machine that has UAC and all other security features switched on, it's quite difficult to install or run code that isn't signed, for example. The user must really want to run it. If you turn all that stuff down to zero, then it's relatively simple. On a rooted Android phone it's easy to run stuff that can kill your phone. There are many other examples of this.

So if the XP Embedded environment your code will be deployed into has no runtime security checks on what it actually runs in the first place (e.g. a policy requiring Authenticode certs for all applications), then you're starting from a point where you've inherited a lower level of security than you're actually supposed to be providing. No amount of security primitives and routines can restore that.

Andras Zoltan
  • 41,961
  • 13
  • 104
  • 160
  • I'd seen that SO question, but it still wasn't exactly what I was looking for. The thing is, user authentication is only done in the program itself (with a custom-made smartcard), so the verification has to be accessible from within the program as well. And it doesn't have to be critically secure, it just has to be possible under normal circumstances I think. – Davio Jan 04 '13 at 09:14
  • Sorry, I think it's either all or nothing. Anything you do in-assembly to check security can be subverted by the very process you're trying to protect against. Authenticode cannot. To be honest with you, it's bizarre that the target application doesn't have its own verification step - because even a baked-in Authenticode check can be no-oped as Marc suggested. – Andras Zoltan Jan 04 '13 at 09:25
  • Well, Andras, thing is that it's a requirement for a checklist, no matter how silly it is. Without it, we can't pass our certification and start deploying. Of course we will have OS level checks as well. The device even has pressure sensitive switches to see if it has been opened. – Davio Jan 04 '13 at 10:03
  • Well - at the risk of being flippant - just come up with the bare minimum that'll pass. The hash check on the file contents as you've suggested in your question should do that. It doesn't even qualify as 'security' but if it satisfies the requirement that doesn't matter. – Andras Zoltan Jan 04 '13 at 10:06
  • I've added another block to my answer at the end to explain why I might appear to be unhelpful here. I hope it adequately explains my position. – Andras Zoltan Jan 04 '13 at 10:22

Since .NET 3.5 SP1, the runtime no longer checks the strong name signature when loading assemblies. Since your assemblies are strong-named, I suggest checking the signature in code, using the native mscoree.dll via P/Invoke.

private static class NativeMethods
{
    // P/Invoke declaration for the strong name verification API in mscoree.dll.
    [DllImport("mscoree.dll")]
    public static extern bool StrongNameSignatureVerificationEx(
        [MarshalAs(UnmanagedType.LPWStr)] string wszFilePath,
        byte fForceVerification,
        ref byte pfWasVerified);
}

Then you can use the AssemblyLoad event and check every assembly that is loaded into your (current) app domain:

AppDomain.CurrentDomain.AssemblyLoad += CurrentDomain_AssemblyLoad;

private static void CurrentDomain_AssemblyLoad(object sender, AssemblyLoadEventArgs args)
{
    Assembly loadedAssembly = args.LoadedAssembly;

    if (!VerifyStrongNameSignature(loadedAssembly))
    {
        // Do whatever you want when the signature is broken.
    }
}

private static bool VerifyStrongNameSignature(Assembly assembly)
{
    byte wasVerified = 0;

    // Second argument = 1 forces verification even if the assembly is
    // registered for verification skipping (e.g. via "sn -Vr").
    return NativeMethods.StrongNameSignatureVerificationEx(assembly.Location, 1, ref wasVerified);
}

Of course, someone with enough experience can patch the check code out of your assembly, or simply strip the strong name from your assembly...

CodeTherapist
  • 2,776
  • 14
  • 24
  • 1
    +1 - this may actually be enough to pass checklist requirement. This approach is discussed in detail in [Checking For A Valid Strong Name Signature](http://blogs.msdn.com/b/shawnfa/archive/2004/06/07/150378.aspx). – Alexei Levenkov Jan 04 '13 at 17:08