I am getting an error which tells me I can't multiply two variables of a specific value.
TypeError: can't multiply sequence by non-int of type 'str'
I am trying to make a Pythagorean theorem inside of python for school. I need to have it inside of a float to get a decimal number.
I have already tried a couple of different things,
- I've put it inside of a couple values, int, string, float etc.
- I just tried a lot of different things, this is the best I got this far.
l_1 = float(input())
l_1 = float(l_1)
l_1 = str(l_1)
print ("The long side is: " + l_1)
l_2 = float(input())
l_2 = float(l_2)
l_2 = str(l_2)
print ("The short side is: " + l_2)
l_2 = int(l_2)
l_1 = int(l_1)
l_1 = int(l_1)
l_2 = int(l_2)
wor1 = math.sqrt(l_1 * l_1 - l_2 * l_2)
print (wor1)
I expect the output to actual be the answer without any error codes, I just need it to calculate with the variables it is given.