I need some advice with this strange behavior – let's say we have this code:
int ** p;
This compiles without any trouble:
p++;
But this does not:
((int **) p)++;
When you typecast an expression, the result of that expression is an rvalue rather than an lvalue. Intuitively, a typecast says "give me the value that this expression would have if it had some other type," so typecasting a variable to its own type still produces an rvalue and not an lvalue. Consequently, it's not legal to apply the ++ operator to the result of a typecast, since ++ requires an lvalue and you're providing an rvalue.
That said, it would in principle be possible to define the language so that casting an expression to its own type yields an lvalue whenever the original expression is an lvalue, but I suppose the language designers left this out for simplicity's and consistency's sake.
Hope this helps!