The benefit of that notation in C/C++ was to make it easier to see a symbol's type without having to go search for the declaration. These styles appeared before the arrival of IntelliSense and "Go to Definition" - we often had to go on a wild-goose chase hunting for the declaration across who knows how many header files. On a large project this was a significant annoyance: bad enough when reading C source code, and even worse when doing forensics with mixed assembly+source code and a raw call stack.
Faced with those realities, using m_ and all the other Hungarian rules started to make some sense, even with the maintenance overhead, simply because of how much time it saved when looking up a symbol's type in unfamiliar code. Now, of course, we have IntelliSense and "Go to Definition", so the main time-saving motivation for that naming convention is gone. I don't think there's much point in doing it any more, and I generally try to follow the .NET library guidelines, just to be consistent and possibly gain a little more tool support. A quick before/after sketch of what that difference looks like is below.
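Just to illustrate the contrast, here is a minimal, made-up C++ example (the class and member names are hypothetical, not from any real codebase). The first version encodes scope and type into each name the way the old prefixes did; the second leaves that job to the IDE and names things by meaning only:

```cpp
#include <string>

// Old style: prefixes carry the lookup information.
// m_ = member, p = pointer, sz = zero-terminated string, n = integer count.
class ConnectionOld {
    char* m_pszHostName;   // reader sees "member, pointer to C string" at a glance
    int   m_nRetryCount;   // reader sees "member, integer"
};

// Newer guideline style: names describe purpose, and "Go to Definition"
// or hover tooltips answer the "what type is this?" question.
class ConnectionNew {
    std::string hostName;
    int retryCount = 0;
};
```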