Why do I get different results when using a function versus a macro?

Submitted by 十年热恋 on 2019-12-11 08:07:51

Question


I'm using the DevCPP IDE, and I found that while programming in C,

the value returned by:

float f(float x)
{
      return 1/(1+x*x);
}

and the value returned by f(x) if it is instead defined as the macro:

#define f(x) 1/(1+x*x)

are different.

Why am I getting different results in these cases?

EDIT:

Here's the code where I'm getting the anomaly:

#include <stdio.h>
#include <conio.h>   // for getch(); non-standard, but available in DevCPP

// f here is either the function or the macro defined above

int main()
{
      int i;
      float a=0, b=1, h, n=12, s1, s2=0;

      h=(b-a)/n; //step length

      s1=f(a)+f(b);

      for(i=1;i<=n-1;i++)
      {
        s2+=f(a+(i*h));
      }

      s1=(s1+2*s2)*h/2; //composite trapezoidal rule

      printf("sum: %f", s1);

      getch();
      return 0;
}

OUTPUT 1: 0.693581 (using MACRO)

OUTPUT 2: 0.785109 (using function)


Answer 1:


There are several differences at play here. Here are a few.

In the function version, the argument is explicitly typed as a float. This means that if you call

f(1);

then 1 is converted to the float 1.0f as the argument. Then, when 1 / (1 + x * x) is computed, it evaluates to 1 / 2.0f, which comes out to 0.5f. However, in the macro version, f(1) would be evaluated as 1 / 2 using integer division, yielding the value 0.
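For illustration, here is a minimal, self-contained sketch of that difference (the names f_macro and f_func are mine, not from the question):

#include <stdio.h>

#define f_macro(x) 1/(1+x*x)        /* integer division when the argument is an int */

static float f_func(float x)        /* argument is converted to float */
{
    return 1 / (1 + x * x);         /* float division */
}

int main(void)
{
    printf("macro:    %d\n", f_macro(1));  /* prints 0 (1 / 2 in integer arithmetic) */
    printf("function: %f\n", f_func(1));   /* prints 0.500000 */
    return 0;
}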

Second, in the function version, the argument is evaluated only once. This means that

int x = 0;
f(x++);

will increment x to 1, then pass in the value 0. The result will then be 1.0f. In the macro version, however, the code expands to

1 / (1 + x++ * x++)

This has undefined behavior because there is no sequence point between the two evaluations of x++. This expression could evaluate to anything, or it could crash the program outright.
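One way to see the double evaluation without invoking undefined behavior is to give the argument a visible side effect. This is my own sketch, not part of the original answer:

#include <stdio.h>

#define f(x) 1/(1+x*x)

/* returns its argument unchanged, but logs every call */
static float traced(float v)
{
    puts("traced() called");
    return v;
}

int main(void)
{
    float r = f(traced(2.0f));  /* expands to 1/(1+traced(2.0f)*traced(2.0f)) */
    printf("%f\n", r);          /* prints 0.200000, and "traced() called" appears twice */
    return 0;
}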

Finally, the function version respects operator precedence while the macro does not. For example, in the function version, calling

f(1 - 1)

will call f(0), evaluating to 1.0f. In the macro version, this expands to

  1 / (1 + 1 - 1 * 1 - 1)
= 1 / (1 + 1 - 1 - 1)
= 1 / 0

This causes undefined behavior because of a divide-by-zero error.

The simple way to avoid this is to not use macros to define functions. Way back in the Bad Old Days this was a standard practice, but now that C has inline functions and compilers are way smarter, you should prefer functions to macros. They're safer, easier to use, and harder to mess up. They're also type aware and don't evaluate arguments multiple times.
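For reference, a minimal sketch of the inline-function form this answer recommends (static inline assumes C99 or later; in C89 a plain function works just as well):

/* typed argument, evaluated exactly once, no parenthesization pitfalls */
static inline float f(float x)
{
    return 1 / (1 + x * x);
}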

Hope this helps!




Answer 2:


#define f(x) 1/(1+x*x)

should be defined as

#define f(x) 1/(1+(x)*(x))
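As a quick check (my own example, not part of this answer) of why the inner parentheses matter, compare the f(1 - 1) case from answer 1:

#include <stdio.h>

#define f(x) 1/(1+(x)*(x))

int main(void)
{
    /* expands to 1/(1+(1 - 1)*(1 - 1)) == 1/(1+0) == 1,
       instead of the 1/0 expansion shown in answer 1 */
    printf("%d\n", f(1 - 1));   /* prints 1 */
    return 0;
}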



Answer 3:


Try this:

#define f(x) (1/(1+x*x))

A #define simply pastes its replacement text wherever the name appears, so the surrounding operator precedence applies to the expanded expression.
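The outer parentheses matter when the macro call sits inside a larger expression. A small sketch of my own (not from the answer), comparing the two definitions:

#include <stdio.h>

#define f_bare(x)   1/(1+x*x)    /* no outer parentheses */
#define f_paren(x) (1/(1+x*x))   /* this answer's version */

int main(void)
{
    /* 100 / f(3.0f) should be 100 / 0.1, i.e. 1000 */
    printf("%f\n", 100 / f_bare(3.0f));   /* 100 / 1/(1+3.0f*3.0f)   -> 10.000000   */
    printf("%f\n", 100 / f_paren(3.0f));  /* 100 / (1/(1+3.0f*3.0f)) -> 1000.000000 */
    return 0;
}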



Source: https://stackoverflow.com/questions/20182897/why-do-i-get-different-results-when-using-a-function-versus-a-macro
