ValueError: invalid literal for int() with base 10: '-' at m[i][i] = int(d[i])
Question

I'm working on code that finds the maximum value of an arithmetic expression, ignoring operator precedence. For example:

```
2 5+ 3*2 -1 + -4 * -3
Output: 16 40000000000
```

I have the following code, which raises this error:

```
ValueError: invalid literal for int() with base 10: '-'
```

at the line `m[i][i] = int(d[i])`.

Note: I have looked through the other questions with a similar error, but this one is different from each of them. I'm using Python 3 and it doesn't work.

```python
def get_maximum_value(expression):
    expression = "".join(expression.split())
```
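The error itself means `int()` was handed an operator character, so whatever loop fills `d` is treating `-` as if it were a digit. As a minimal sketch (the names `d` and `op` follow the question's code; the helper `tokenize` and everything else is an assumption, since the full function isn't shown), the expression can be split so that digits and operators land in separate lists and `int()` only ever sees digit characters:

```python
def tokenize(expression):
    # Strip all whitespace first, as the question's code does.
    expression = "".join(expression.split())
    d = []   # operands (single digits, as in the classic form of this problem)
    op = []  # operators
    for ch in expression:
        if ch.isdigit():
            d.append(int(ch))  # safe: int() only ever sees digit characters
        else:
            op.append(ch)
    return d, op

digits, ops = tokenize("5 + 3 * 2 - 1")
print(digits)  # [5, 3, 2, 1]
print(ops)     # ['+', '*', '-']
```

Note that this sketch assumes single-digit, non-negative operands; if the input really contains unary minus (as in `-4 * -3`), the tokenizer would need to decide whether a `-` starts a negative number or is a binary operator, which the simple character loop above does not do.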