From the comments it becomes clear that this is really about finding the maximum clique, so I will focus on that instead of on the literal question.
First some simple tricks. With backtracking, many branches of the search space can be pruned. The rough idea looks like this:
findMaxClique(clique, banned, graph):
    if clique ∪ banned is everything:
        if isclique(clique) and size(clique) > size(bestFound):
            bestFound = clique
        return
    for each node n not in clique ∪ banned:
        try findMaxClique(clique + n, banned, graph)
        try findMaxClique(clique, banned + n, graph)
This is still the naive version, trying every possible subset. But there are some obvious improvements. For example, there's no point in waiting until the last moment to test whether the potential clique actually is a clique: every subset of a clique is itself a clique, so adding a node that breaks the clique property is pointless. Call this #1. This prunes a lot.
Also, if the current clique plus all nodes that could possibly still be added to it is smaller than the best clique found so far, nothing better can be found in this branch of the search space. There are several levels of effort you can put in here: the simplest is just counting the leftover set, but you could also go for the biggest clique in that set, or something in between. Anyway, I'll show the simple one; call it #2. Now we have something like this:
findMaxClique(clique, banned, graph):
    if clique ∪ banned is everything:
        if size(clique) > size(bestFound):
            bestFound = clique
        return
    for each node n not in clique ∪ banned:
        if isclique(clique + n):
            try findMaxClique(clique + n, banned, graph)
        notbanned = graph - (banned ∪ n)
        if size(notbanned) >= size(bestFound):
            try findMaxClique(clique, banned + n, graph)
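Here is a minimal runnable sketch of that pruned search in Python (the graph representation, a dict mapping each node to its set of neighbours, and the names are just my choices). It branches once per call on a single undecided node instead of looping over all of them, so each subset is considered only once; the two prunings are #1 and #2 from above.

def max_clique(graph):
    # graph: dict mapping node -> set of adjacent nodes
    best = set()

    def search(clique, banned):
        nonlocal best
        undecided = set(graph) - clique - banned
        if not undecided:
            if len(clique) > len(best):
                best = set(clique)
            return
        n = next(iter(undecided))
        # pruning #1: only extend the clique if it stays a clique,
        # i.e. n is adjacent to every node already picked
        if all(m in graph[n] for m in clique):
            search(clique | {n}, banned)
        # pruning #2: banning n is only worth exploring if the nodes that
        # are still not banned could match or beat the best clique so far
        if len(graph) - len(banned) - 1 >= len(best):
            search(clique, banned | {n})

    search(set(), set())
    return best

For example, max_clique({1: {2, 3}, 2: {1, 3}, 3: {1, 2}, 4: set()}) returns {1, 2, 3}.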
Another option for estimating the size of the clique you could still build is to use linear programming.
For example, this is an ILP model for max clique:
maximize    sum x[i]
s.t.
    for each i, j that are not adjacent:  x[i] + x[j] ≤ 1
    x[i] ∈ {0, 1}
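For what it's worth, that model can be written down almost verbatim with SciPy's MILP interface; here is a small sketch, assuming SciPy ≥ 1.9 is available (the function name and graph representation are mine):

import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def max_clique_ilp(graph):
    # graph: dict mapping node -> set of adjacent nodes
    nodes = sorted(graph)
    index = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    # one row x[i] + x[j] <= 1 per non-adjacent pair i < j
    rows = []
    for i in nodes:
        for j in nodes:
            if i < j and j not in graph[i]:
                row = np.zeros(n)
                row[index[i]] = row[index[j]] = 1
                rows.append(row)
    if rows:
        constraints = [LinearConstraint(np.array(rows), -np.inf, 1)]
    else:
        constraints = []  # complete graph: no pairwise constraints needed
    # milp minimizes, so negate the objective to maximize sum x[i]
    res = milp(c=-np.ones(n), constraints=constraints,
               integrality=np.ones(n), bounds=Bounds(0, 1))
    return [v for v in nodes if res.x[index[v]] > 0.5]

Solving the model exactly like this is of course no easier than the original problem; the interesting part for the search is the relaxation, which comes next.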
The linear relaxation of that model (i.e. replacing the last constraint by 0 ≤ x[i] ≤ 1) is easy to compute, and you can add the non-adjacency constraints lazily if you want. Obviously there are also constraints coming from the clique/banned sets, forcing certain x's to be 1 or 0 respectively. If the objective value of the linear relaxation is not better than your biggest clique found so far, then you can prune the current branch.
There is another fun property of that model: if the resulting x has all its entries in {0, 0.5, 1}, then you can immediately pick all nodes with x[i] = 1 to be in your clique, so you can skip a lot of branching in that case. This is probably uncommon high up in the search tree, but you can add some Gomory cuts to encourage integrality. Your favourite LP solver may have them built in.
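Here is a rough sketch of how that bound could be computed with scipy.optimize.linprog; the clique/banned sets are encoded through the variable bounds, and the names are again mine:

import numpy as np
from scipy.optimize import linprog

def lp_bound(graph, clique, banned):
    # graph: dict node -> set of neighbours; assumes `clique` really is a clique
    nodes = sorted(graph)
    index = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    # x[i] + x[j] <= 1 for every non-adjacent pair i < j
    rows = []
    for i in nodes:
        for j in nodes:
            if i < j and j not in graph[i]:
                row = np.zeros(n)
                row[index[i]] = row[index[j]] = 1
                rows.append(row)
    # clique/banned force x's to 1 or 0; everything else is relaxed to [0, 1]
    bounds = [(1, 1) if v in clique else ((0, 0) if v in banned else (0, 1))
              for v in nodes]
    res = linprog(c=-np.ones(n),
                  A_ub=np.array(rows) if rows else None,
                  b_ub=np.ones(len(rows)) if rows else None,
                  bounds=bounds, method="highs")
    return -res.fun, res.x

# usage inside the search: prune the branch if the bound is not better than
# the best clique found so far, and if every x[i] is 0, 0.5 or 1, take the
# nodes with x[i] == 1 straight away:
#   bound, x = lp_bound(graph, clique, banned)
#   if bound <= len(bestFound): prune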
There are more clever tricks here; check the literature.
A completely different way to attack the problem is with SAT. SAT does not optimize at all, but you can try every size of clique and, for each size, ask a SAT solver whether a clique of that size exists (and if it does, what it looks like). Actually it's easiest to state as a pseudo-Boolean model:
    for each i, j that are not adjacent:  ¬i + ¬j ≥ 1
    sum of all variables = k
The usual adjacency constraint is trivial to state in pure SAT; the sum constraint is annoying, requiring an addition circuit. That's easy enough to generate, but hard to write out here, not least because it depends on k.
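For completeness, here is a sketch of that per-size query using the python-sat (PySAT) package, which generates the cardinality circuit for you (the function name and graph representation are my own, and it assumes the nodes are numbered 1..n so they can double as SAT variables):

from pysat.card import CardEnc, EncType
from pysat.formula import CNF
from pysat.solvers import Solver

def clique_of_size(graph, k):
    # graph: dict mapping node -> set of neighbours, nodes numbered 1..n
    nodes = sorted(graph)
    n = len(nodes)
    cnf = CNF()
    # adjacency constraints: two non-adjacent nodes can't both be picked
    for i in nodes:
        for j in nodes:
            if i < j and j not in graph[i]:
                cnf.append([-i, -j])
    # "sum of all variables = k", encoded as a sequential-counter circuit
    card = CardEnc.equals(lits=nodes, bound=k, top_id=n,
                          encoding=EncType.seqcounter)
    cnf.extend(card.clauses)
    with Solver(bootstrap_with=cnf.clauses) as solver:
        if solver.solve():
            return [v for v in solver.get_model() if 0 < v <= n]
    return None  # no clique of size k

Loop k upward (or binary-search it) until this returns None; the largest k that still succeeds is the maximum clique size.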