I am trying to convert a decimal number to binary, such as 192 to 11000000. I just need some simple code to do this, but the code I have so far doesn't work:
void de
Perhaps understanding the algorithm would allow you to write or modify your own code to suit what you need. I do see that your char array isn't long enough to hold the binary value of 192, though: you need 8 binary digits, but your code only allows for 5.
Here's a page that clearly explains the algorithm.
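In short: repeatedly divide the number by 2 and record each remainder; the remainders, read in reverse order of how they were produced, are the binary digits. For 192:

192 / 2 = 96, remainder 0
 96 / 2 = 48, remainder 0
 48 / 2 = 24, remainder 0
 24 / 2 = 12, remainder 0
 12 / 2 =  6, remainder 0
  6 / 2 =  3, remainder 0
  3 / 2 =  1, remainder 1
  1 / 2 =  0, remainder 1

Reading the remainders from last to first gives 11000000.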
I'm not a C/C++ programmer, so here's my C# contribution based on that algorithm.
int I = 0;
int Q = 95;   // value to convert (95 matches the linked example; use 192 for your case)
string B = "";
while (Q != 0)
{
    Debug.Print(I.ToString());
    // Prepend the remainder; appending would leave the digits in reverse order.
    B = (Q % 2) + B;
    Q = Q / 2;
    Debug.Print(Q.ToString());
    I++;
}
Debug.Print(B);   // for 95 this prints 1011111
All the Debug.Print calls are just there to show the intermediate output.
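Since the question asked for C, here's a rough sketch of the same repeated-division approach in C. Treat it as a starting point rather than a drop-in answer (C isn't my language, and the function name, buffer size, and use of unsigned int are just my assumptions):

#include <stdio.h>

/* Writes the binary digits of n into buf.
   buf must have room for at least 33 chars (32 bits plus the terminator). */
void dec_to_bin(unsigned int n, char *buf)
{
    char tmp[33];
    int i = 0;

    if (n == 0)
        tmp[i++] = '0';             /* the loop below would skip zero entirely */

    while (n != 0)
    {
        tmp[i++] = (n % 2) + '0';   /* the remainder of each division is one digit */
        n /= 2;
    }

    /* The remainders come out least-significant first, so copy them reversed. */
    for (int j = 0; j < i; j++)
        buf[j] = tmp[i - 1 - j];
    buf[i] = '\0';
}

int main(void)
{
    char buf[33];
    dec_to_bin(192, buf);
    printf("%s\n", buf);   /* prints 11000000 */
    return 0;
}

The temporary array plus the reversing copy plays the same role as prepending to the string in the C# version above.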