uint

How to convert char to hex stored in uint8_t form?

纵饮孤独 submitted on 2019-12-20 03:56:04
Question: Suppose I have these variables:

const uint8_t ndef_default_msg[33] = { 0xd1, 0x02, 0x1c, 0x53, 0x70, 0x91, 0x01, 0x09, 0x54, 0x02, 0x65, 0x6e, 0x4c, 0x69, 0x62, 0x6e, 0x66, 0x63, 0x51, 0x01, 0x0b, 0x55, 0x03, 0x6c, 0x69, 0x62, 0x6e, 0x66, 0x63, 0x2e, 0x6f, 0x72, 0x67 };
uint8_t *ndef_msg;
char *ndef_input = NULL;

How can I convert ndef_input (which is just plain text, like "hello") to hex and save it into ndef_msg? As you can see, ndef_default_msg is in hex form. Data inside ndef_msg should be
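The question is cut off above, but a minimal sketch of the usual approach follows, assuming the goal is simply to copy the string's bytes into a freshly allocated uint8_t buffer; the helper name bytes_from_string and the out_len parameter are illustrative, not from the original post.

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

// Sketch: copy the raw bytes of a C string into a uint8_t buffer.
// "hello" is already a sequence of bytes (0x68, 0x65, 0x6c, 0x6c, 0x6f),
// so nothing needs to be "converted to hex" beyond copying those bytes.
uint8_t *bytes_from_string(const char *ndef_input, size_t *out_len)
{
    size_t len = strlen(ndef_input);
    uint8_t *ndef_msg = (uint8_t *)malloc(len);
    if (ndef_msg == NULL)
        return NULL;
    memcpy(ndef_msg, ndef_input, len);   // each char becomes one uint8_t
    *out_len = len;
    return ndef_msg;
}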

When trying to print UINT_MAX I get -1 [duplicate]

血红的双手。 submitted on 2019-12-13 10:02:47
Question: This question already has answers here: How can I print maximum value of an unsigned integer? (8 answers) Closed 3 years ago. When trying to print UINT_MAX, all I get is -1. Why is this? It's the only thing I have in my main(): a printf statement that prints UINT_MAX. Answer 1: You used the %d format code, which interprets its argument as a signed int. You're on a two's complement system, so UINT_MAX (0xFFFFFFFF), interpreted as a signed int, is equal to -1. If you want to print it as an unsigned value, use the %u conversion specifier instead.
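A minimal sketch of the fix, assuming an ordinary hosted environment where unsigned int is 32 bits:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* %d treats its argument as a signed int, so the all-ones bit pattern of
       UINT_MAX shows up as -1; %u is the matching conversion for unsigned int. */
    printf("%u\n", UINT_MAX);   /* prints 4294967295 when unsigned int is 32 bits */
    return 0;
}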

Deserialize XML string to uint property

只愿长相守 submitted on 2019-12-13 06:31:47
Question: Is it possible to deserialize an XML file where MyValue is 0x0001 into a uint property? What's the best way to implement this?

public class MyClass
{
    private string myValue;
    public uint MyValue
    {
        // checkValue checks whether myValue is a decimal or hex number (returns a uint value).
        get { return checkValue(myValue); }
        set { myValue = value.ToString(); }
    }
}

Answer 1: How about something like: public class MyClass { private uint? _myValue; [XmlIgnore] public uint MyValue { get

UInt8 EXC_BAD_ACCESS

纵饮孤独 submitted on 2019-12-13 04:50:16
Question: I have a method that adds a filter to an image. This worked fine until a couple of months ago; now when I try to use this method, the application crashes on the image's buffer. I create the buffer and set it to the image's data, and accessing a specific index later causes a bad-access crash. I have looked for the past hour or two, and now I am convinced there is something I'm overlooking. I think something is being released that should not be. I am using the iOS DP 4 preview of Xcode, and I

Swift: How to convert String to UInt?

夙愿已清 submitted on 2019-12-12 07:51:52
Question: According to Swift - Converting String to Int, there's a String method toInt(). But there's no toUInt() method. So, how do you convert a String to a UInt? Answer 1: Update for Swift 2/Xcode 7: As of Swift 2, all integer types have a (failable) initializer init?(_ text: String, radix: Int = default) which replaces the toInt() method of String, so no custom code is needed anymore for this task: print(UInt("1234")) // Optional(1234) // This is UInt.max on a 64-bit platform: print(UInt(

Are reads and writes for uint8 in golang atomic?

折月煮酒 submitted on 2019-12-10 15:38:46
Question: As in the title, are read and write operations on uint8 atomic? Logically, reading or writing an 8-bit variable should be a single CPU instruction, but two cores could still read and write the same memory simultaneously. Is it possible to end up with stale data this way? Answer 1: There's no guarantee that accesses to native types are atomic on any platform. This is why there is sync/atomic. See also the advice in the memory model documentation. Example for a generic way of

How to convert HEX string to UInt16?

徘徊边缘 submitted on 2019-12-08 02:03:52
Question: I have a HEX string d285 and I want to convert it to UInt16; please guide me on how to do this. I tried this: let buffer = UInt16("\(UInt8(text, radix: 16)!)") return Data(bytes: (buffer?.bigEndian.toBytes)!) but it's not working.

Answer 1: UInt16 has an initializer that takes a string and a radix value. This can be used to create a UInt16 from a string.

let hexString = "d285"
let hexToInt = UInt16(hexString, radix: 16) // prints 53893

Answer 2:

let hexStr = "d285"
var byteArr = [UInt8]()
byteArr += hexStr

Generate random uint

天大地大妈咪最大 submitted on 2019-12-06 17:39:22
Question: I need to generate random numbers within a range for byte, ushort, sbyte, short, int, and uint. I am able to generate them for all of those types using the Random class in C# (e.g. values.Add((int)(random.Next(int.MinValue + 3, int.MaxValue - 2)));) except for uint, since Random.Next accepts only int values. Is there an easy way to generate a random uint? Answer 1: The simplest approach would probably be to use two calls: one for 30 bits and one for the final two. An earlier version of this answer

Using a typedef'd uint causes error, while “unsigned int” does not…?

时间秒杀一切 submitted on 2019-12-05 18:20:47
Question: For some reason, when I define a variable as "uint" instead of "unsigned int" in my program, it produces an error. This seems strange, because uint is typedef'd as: typedef unsigned int uint; ...so I would think that I could use the two interchangeably. To be more exact, I am assigning the result of a function which returns "unsigned int" into a uint variable, then using that uint in a vector resize call... at which point it errors. I.e., my code looks something like this: unsigned int getUInt() { return
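The error message and the rest of the code are cut off above. For reference, a self-contained version of the described pattern compiles cleanly (sketch below), so one plausible culprit is a conflicting or differently scoped declaration of uint coming from elsewhere (for example, a system header or another library); that is an assumption, since the original error is not shown.

#include <vector>

typedef unsigned int uint;      // the typedef described in the question

unsigned int getUInt() { return 42u; }  // stand-in for the asker's function

int main()
{
    uint n = getUInt();         // assigning an unsigned int into a uint is fine
    std::vector<int> v;
    v.resize(n);                // resize takes a size_type; uint converts implicitly
    return 0;
}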

uint8_t does not take two-digit inputs

安稳与你 submitted on 2019-12-04 23:47:33
Question: This is my test code:

#include <iostream>
using namespace std;

int main()
{
    uint8_t a;
    while (1)
    {
        cin >> a;
        if (a == 0)
            break;
        cout << "Input is " << a << endl;
    }
}

When I execute it (with my inputs), this is what I get:

1
Input is 1
2
Input is 2
12
Input is 1
Input is 2
0
Input is 0
3
Input is 3

Problem 1: It takes the input 12 as two separate inputs. Problem 2: The condition if (a == 0) doesn't work. What might be the problems? Answer 1: uint8_t is a typedef for an unsigned char. This means that one character will be read
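The answer above is cut off, but the gist is that operator>> treats uint8_t as a character type, so "12" is read as the two characters '1' and '2', and entering 0 yields the character '0' (value 48), which never compares equal to 0. A minimal sketch of the usual workaround, assuming the intent is to read small numbers rather than single characters: read into a wider integer type and narrow afterwards (the 0-255 range check is illustrative).

#include <iostream>
#include <cstdint>

int main()
{
    unsigned int tmp;                     // operator>> parses this as a number, not a character
    while (std::cin >> tmp)
    {
        if (tmp > 255) {                  // keep only values that fit in a uint8_t
            std::cout << "Out of range\n";
            continue;
        }
        std::uint8_t a = static_cast<std::uint8_t>(tmp);
        if (a == 0)                       // now 0 means the number zero, not the character '0'
            break;
        // +a promotes to int so the value prints as a number rather than a character
        std::cout << "Input is " << +a << std::endl;
    }
}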