I am reading Jang's book *Neuro-Fuzzy and Soft Computing*, and in the 2nd chapter the author talks about the Schweizer and Sklar T-norm, which is given by this equation:
$$T_{ss}(a,b,p) = \left[\max\left\{0,\; a^{-p} + b^{-p} - 1\right\}\right]^{-1/p}$$
It's a handy T-norm. One of the exercises (#20, page 45) asks what happens to $T_{ss}(a,b,p)$ as $p \to 0$. Specifically, it asks to show that in this limit the whole expression reduces to just $ab$.
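As a quick numerical sanity check (my own experiment, not from the book; the `max`-based form is my reading of the Ch. 2 definition), evaluating the formula for shrinking $p$ does seem to approach $ab$:

```python
import math

def t_ss(a, b, p):
    # Schweizer-Sklar T-norm, assuming the form with max{0, ...}
    # as given in Jang's Chapter 2; valid for p != 0.
    return max(0.0, a ** (-p) + b ** (-p) - 1.0) ** (-1.0 / p)

a, b = 0.6, 0.3
for p in (1.0, 0.1, 0.01, 0.001):
    print(f"p = {p:<6}  T_ss = {t_ss(a, b, p):.6f}")
print(f"ab = {a * b:.6f}")
```

The printed values creep toward $ab = 0.18$ as $p$ shrinks, which is consistent with what the exercise claims, but of course this is no substitute for the actual limit argument I'm stuck on.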
I tried different things, and in the end I took the natural logarithm, which gives
$$\ln T_{ss}(a,b,p) = -\frac{1}{p} \ln\left(a^{-p} + b^{-p} - 1\right),$$
but I have no idea where to go from here!
Can anybody suggest anything? Thanks for your help.
P.S.: Is there any simple way of expanding $\ln(x+y)$ in general?