Maybe it's a stupid question, but how can I cast a digit of type int to a digit of type char?
Standard conversion operators don't do this:
int x = 5;
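A plain cast compiles, but it reinterprets the int as a UTF-16 code unit instead of producing the digit character (a minimal sketch of the problem):
char c = (char)x;            // c is the control character U+0005, not '5'
Console.WriteLine(c == '5'); // prints: False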
If you know for sure that the values are in the 0..9 range (both inclusive), you can slightly improve efficiency by using a bitwise OR instead of addition:
(char)('0' | i);
This works because '0' has the binary code point 0011 0000, so the ten digit characters all lie between 0011 0000 and 0011 1001; OR-ing in a value from 0 to 9 only sets the low four bits.
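For example, with the bit pattern spelled out (a small sketch):
// '0' is 0x30 (0011 0000); OR-ing with 7 (0000 0111) gives 0x37 (0011 0111), i.e. '7'
Console.WriteLine((char)('0' | 7)); // prints: 7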
You can thus convert an IEnumerable<int> to an IEnumerable<char> with:
data.Select(i => (char) ('0' | i));
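For instance, assuming data is a small array of single digits (a minimal, self-contained sketch):
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var data = new[] { 1, 2, 3 };
        var chars = data.Select(i => (char)('0' | i)).ToArray();
        Console.WriteLine(new string(chars)); // prints: 123
    }
}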
Alternatively, convert each value via ToString and take its first character:
var yourInts = new int[] { 1, 2, 3 };
var result = yourInts.Select(x => x.ToString()[0]).ToArray(); // yields { '1', '2', '3' }
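Note that ToString()[0] takes only the first character of the formatted value, so inputs outside 0..9 silently produce surprising results instead of failing (a small sketch of the edge cases):
Console.WriteLine(12.ToString()[0]);   // prints: 1 (first digit of "12")
Console.WriteLine((-5).ToString()[0]); // prints: - (the sign character)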
Add 48 to your int value before converting to char:
char c = (char)(i + 48);
Or for an int[] -> char[] conversion:
var source = new int[] { 1, 2, 3 };
var results = source.Select(i => (char)(i + 48)).ToArray();
It works because the '0' character has the value 48 in the ASCII table (and in Unicode). But it will only work if your int values are between 0 and 9.
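If the inputs might fall outside that range, a small guard makes the assumption explicit (a sketch; DigitToChar is a hypothetical helper name, and '0' + i is equivalent to i + 48 but more self-documenting):
static char DigitToChar(int i)
{
    if (i < 0 || i > 9)
        throw new ArgumentOutOfRangeException(nameof(i), "Expected a single digit in 0..9.");
    return (char)('0' + i); // same as (char)(i + 48)
}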