Say I have a value of type uint64_t, viewed as a sequence of octets (1 octet = 8 bits). The uint64_t value is known to contain exactly one set bit.
If you can use POSIX, use the ffs() function from strings.h (not string.h!). It returns the position of the least significant set bit (one-indexed), or zero if the argument is zero. On most implementations, a call to ffs() is inlined and compiled into the corresponding machine instruction, such as bsf on x86. glibc also provides ffsll() for long long arguments, which is even better suited to a 64-bit value if it is available.
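A minimal sketch of this approach, assuming glibc (where ffsll() is a GNU extension declared in string.h) and an example input value chosen here for illustration:

```c
#define _GNU_SOURCE           /* needed on older glibc for ffsll() */
#include <stdint.h>
#include <stdio.h>
#include <string.h>           /* ffsll() (GNU extension) */
#include <strings.h>          /* ffs()   (POSIX)         */

int main(void)
{
    uint64_t value = UINT64_C(1) << 37;   /* hypothetical input with exactly one set bit */

    /* ffsll() returns the 1-based index of the least significant set bit,
       or 0 if the argument is 0. With exactly one bit set, that is simply
       the position of the bit. */
    int pos = ffsll((long long)value);

    if (pos == 0)
        printf("no bit set\n");
    else
        printf("bit %d is set (0-based: %d), in octet %d\n",
               pos, pos - 1, (pos - 1) / 8);

    return 0;
}
```

If ffsll() is not available on your platform, you can fall back to calling ffs() on the low and high 32-bit halves of the value, or use a compiler builtin such as GCC/Clang's __builtin_ctzll() (undefined for a zero argument, so guard that case).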