Casting

F#: Casting to generic type

Submitted by 魔方 西西 on 2019-12-24 02:00:57
Question: I'm fairly new to F# and coming from a C++ background. I am trying to write a simple vector class which can be of a generic type (int, float, etc.), but I run into trouble with the default constructor. I want to initialize the values to zero, but to do this I need to somehow cast a concrete zero to the generic type, and I'm not sure how to do this. Perhaps some code might help. Here's what I have so far:

    type Vector3D<'T> (x: 'T, y: 'T, z: 'T) =
        member this.x = x
        member this.y = y
        member this.z = …
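One common approach, sketched here rather than taken from the accepted answer: F#'s LanguagePrimitives.GenericZero produces a zero for any numeric type, and an inline helper lets its member constraint be resolved separately at each call site.

    type Vector3D<'T>(x: 'T, y: 'T, z: 'T) =
        member _.X = x
        member _.Y = y
        member _.Z = z

    module Vector3D =
        // Inline so GenericZero's constraint is resolved at each
        // call site for the concrete numeric type in use.
        let inline zero () =
            Vector3D(LanguagePrimitives.GenericZero,
                     LanguagePrimitives.GenericZero,
                     LanguagePrimitives.GenericZero)

    let v : Vector3D<float> = Vector3D.zero ()   // (0.0, 0.0, 0.0)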

How to check conversion from C++ string to unsigned int

Submitted by 南笙酒味 on 2019-12-24 01:54:45
Question: I need to: 1) Find the maximum unsigned int value on my current system; I didn't find it in limits.h. Is it safe to write unsigned int maxUnsInt = 0 - 1;? I also tried unsigned int maxUnsInt = MAX_INT * 2 + 1, which returns the correct value, but the compiler shows a warning about an int overflow operation. 2) Once found, check whether a C++ string (which I know is composed only of digits) exceeds the maximum unsigned int value on my system. My final objective is to convert the string to a …
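A minimal sketch of one way to do both steps (the input string below is just an illustration): std::numeric_limits gives the maximum portably, and std::stoul plus a range check detects overflow without any arithmetic tricks.

    #include <iostream>
    #include <limits>
    #include <stdexcept>
    #include <string>

    int main() {
        // 1) Portable maximum; equivalent to UINT_MAX from <climits>.
        const unsigned int maxUnsInt = std::numeric_limits<unsigned int>::max();

        // 2) Hypothetical digits-only input to range-check.
        std::string s = "4294967296";
        try {
            unsigned long v = std::stoul(s);  // throws std::out_of_range if too large for unsigned long
            if (v > maxUnsInt)
                std::cout << "exceeds unsigned int\n";
            else
                std::cout << "fits: " << static_cast<unsigned int>(v) << '\n';
        } catch (const std::out_of_range&) {
            std::cout << "exceeds even unsigned long\n";
        }
    }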

JAVA extended ASCII table usage

Submitted by …衆ロ難τιáo~ on 2019-12-24 01:51:11
Question: In my app I need to get characters from the extended ASCII table shown in the image. But when I cast the decimal values to char, I get different characters. What is the real value of these characters in Java? I don't write the characters to the console or to a file, just into the image.

    private void generateAsciiMatrix() {
        // 32 - 255 are visible characters in the ASCII table
        for(int i = 32; i < 256; i++) {
            this.generateAsciiMatrix((char)i);
        }
    }

    private void generateAsciiMatrix(char letter) { …
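The likely mismatch: casting an int to char interprets it as a Unicode code point, whereas "extended ASCII" tables are usually a specific code page such as CP437. A sketch of the difference, with 176 as an assumed value from such a table:

    import java.nio.charset.Charset;

    public class ExtendedAscii {
        public static void main(String[] args) {
            int code = 176;                  // a shade block in CP437 tables
            char viaCast = (char) code;      // Unicode U+00B0, the degree sign
            // Decode the raw byte with the table's code page instead
            // ("Cp437" may be listed as "IBM437" depending on the JRE).
            String viaCodePage = new String(new byte[] { (byte) code },
                                            Charset.forName("Cp437"));
            System.out.println(viaCast);     // °
            System.out.println(viaCodePage); // ░
        }
    }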

Why Convert.ToInt32(1.0/0.00004) != (Int32)(1.0/0.00004)

Submitted by 元气小坏坏 on 2019-12-24 01:44:54
Question: Why does this code (http://ideone.com/YRcICG)

    void Main() {
        double a = 0.00004;
        Int32 castToInt = (Int32)(1.0/a);
        Int32 convertToInt = Convert.ToInt32(1.0/a);
        Console.WriteLine("{0} {1:F9} {2:F9}", castToInt == convertToInt, castToInt, convertToInt);
        Console.WriteLine((((int)(1.0/(1.0/25000))) == 24999));
    }

print

    False 24999,000000000 25000,000000000
    True

in the context of the CLR/C# implementation?

Answer 1: The trick lies in the way the double is represented, so (1.0/a) will be represented in the following …
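To make the cut-off answer concrete: 0.00004 has no exact double representation, so 1.0/a lands just below 25000; the C-style cast truncates toward zero while Convert.ToInt32 rounds to the nearest integer. A minimal repro:

    using System;

    class CastVsConvert {
        static void Main() {
            double d = 1.0 / 0.00004;               // slightly less than 25000
            Console.WriteLine(d.ToString("R"));     // e.g. 24999.999999999996
            Console.WriteLine((Int32)d);            // 24999: truncates toward zero
            Console.WriteLine(Convert.ToInt32(d));  // 25000: rounds to nearest
        }
    }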

Casting an anonymous array initializer list

Submitted by 余生颓废 on 2019-12-24 00:53:21
Question: I can successfully do a C cast of an initializer list for an array of char strings, but I can't seem to get it to work with a C++ cast (static_cast):

    int main() {
        char x[] = "test 123";

        // This works fine:
        char **foo = (char *[]) { "a", x, "abc" };
        std::cout << "[0]: " << foo[0] << " [1]: " << foo[1] << " [2]: " << foo[2] << std::endl;

        // This will not compile ("expected primary-expression before '{' token"):
        //char **bar = static_cast<char *[]>( { "a", x, "abc" } );
        //std::cout << "[0]: " << …
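For context: (char *[]){ ... } is a C99 compound literal, which GCC accepts in C++ only as an extension; static_cast cannot produce an array at all, so standard C++ simply names the array instead. A sketch:

    #include <iostream>

    int main() {
        char x[] = "test 123";
        // Standard C++: declare the array directly; no cast exists for this.
        const char* bar[] = { "a", x, "abc" };
        for (const char* s : bar)
            std::cout << s << '\n';
    }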

C - What does *(long *)(host->h_addr); do?

Submitted by 会有一股神秘感。 on 2019-12-24 00:48:25
Question: I found the following code in this example:

    addr.sin_addr.s_addr = *(long *)(host->h_addr);

h_addr is a char pointer and host is a pointer to a struct of type hostent. addr is a struct of type sockaddr_in and sin_addr is a struct of type in_addr. s_addr is a uint32. Most of this information can be found here: http://man7.org/linux/man-pages/man7/ip.7.html I'm pretty sure (long) casts the char to a long, but I don't know what the extra asterisks do, especially because s_addr is not a …
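Reading the expression inside out: (long *) reinterprets the char pointer as a pointer to long, and the outer * dereferences it, so the address bytes at h_addr are read as a single integer. A sketch, with a memcpy variant that avoids the alignment and size assumptions of the original:

    #include <string.h>
    #include <netdb.h>
    #include <netinet/in.h>

    void set_addr(struct sockaddr_in *addr, struct hostent *host) {
        /* Original style: reinterpret the bytes, then dereference. */
        addr->sin_addr.s_addr = *(long *)(host->h_addr);

        /* Safer equivalent: copy exactly sizeof(s_addr) bytes. */
        memcpy(&addr->sin_addr.s_addr, host->h_addr,
               sizeof addr->sin_addr.s_addr);
    }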

Pandas long to wide

Submitted by 喜欢而已 on 2019-12-24 00:45:13
Question: Using pandas, I want to convert a long data frame to wide, but the usual pivot method is not as flexible as I need. Here is the long data:

    raw = {
        'sample': [1, 1, 1, 1, 2, 2, 3, 3, 3, 3],
        'gene':   ['G1', 'G2', 'G3', 'G3', 'G1', 'G2', 'G2', 'G2', 'G3', 'G3'],
        'type':   ['HIGH', 'HIGH', 'LOW', 'MED', 'HIGH', 'LOW', 'LOW', 'LOW', 'MED', 'LOW']}
    df = pd.DataFrame(raw)

which produces

    gene  sample  type
    G1    1       HIGH
    G2    1       HIGH
    G3    1       LOW
    G3    1       MED
    G1    2       HIGH
    G2    2       LOW
    G2    3       LOW
    G2    3       LOW
    G3    3       MED
    G3    3       LOW

What I …
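The question is cut off before the desired output, but the data already shows why plain pivot fails: sample 1 has two rows for G3, and pivot raises on duplicate index/column pairs. One guess at a fix, using pivot_table with an aggregator that keeps all values:

    import pandas as pd

    raw = {
        'sample': [1, 1, 1, 1, 2, 2, 3, 3, 3, 3],
        'gene':   ['G1', 'G2', 'G3', 'G3', 'G1', 'G2', 'G2', 'G2', 'G3', 'G3'],
        'type':   ['HIGH', 'HIGH', 'LOW', 'MED', 'HIGH', 'LOW', 'LOW', 'LOW', 'MED', 'LOW']}
    df = pd.DataFrame(raw)

    # pivot() would raise "Index contains duplicate entries";
    # pivot_table aggregates the duplicates instead.
    wide = df.pivot_table(index='sample', columns='gene', values='type',
                          aggfunc='/'.join)
    print(wide)
    # gene      G1       G2       G3
    # sample
    # 1       HIGH     HIGH  LOW/MED
    # 2       HIGH      LOW      NaN
    # 3        NaN  LOW/LOW  MED/LOW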

Casting to one class and calling function from sibling class?

Submitted by 十年热恋 on 2019-12-24 00:34:32
Question: I'm getting a pointer to a base class (which is actually a pointer to some derived class). Then I want to call a function on that derived class, but I don't know which one it is.

    class Base { };

    class DerivedOne : public Base {
    public:
        void functionA() { int x = 0; }
    };

    class DerivedTwo : public Base {
    public:
        void functionA() { int x = 0; }
    };

    int main() {
        Base* derivedTwoPtr = new DerivedTwo();
        reinterpret_cast<DerivedOne*>(derivedTwoPtr)->functionA();
        return 0;
    }

This works as I want, but …
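Calling through the wrong class with reinterpret_cast is undefined behavior even when it appears to work. The conventional fix is to let the base class dispatch; a sketch:

    #include <iostream>

    class Base {
    public:
        virtual ~Base() = default;
        virtual void functionA() = 0;   // each derived class supplies its own
    };

    class DerivedOne : public Base {
    public:
        void functionA() override { std::cout << "DerivedOne\n"; }
    };

    class DerivedTwo : public Base {
    public:
        void functionA() override { std::cout << "DerivedTwo\n"; }
    };

    int main() {
        Base* p = new DerivedTwo();
        p->functionA();                 // prints "DerivedTwo", no cast needed
        delete p;
    }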

Casting generic delegates to another type with interfaces

Submitted by 流过昼夜 on 2019-12-24 00:22:58
Question: (Using .NET 4.0) OK, so I have

    private Dictionary<int, Action<IMyInterface, IMyInterface>> handler { get; set; }

    public void Foo<T, U>(Action<T, U> myAction)
        where T : IMyInterface
        where U : IMyInterface
    {
        // | This line fails
        // V
        Action<IMyInterface, IMyInterface> anotherAction = myAction;
        handler.Add(someInt, anotherAction);
    }

I'm trying to store the delegate in a generic collection, so I can pull it back out later to invoke it. How do I properly cast it?

Answer 1: The generic parameters to the …
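The assignment fails because Action<in T1, in T2> is contravariant: an Action<IMyInterface, IMyInterface> converts to an Action<T, U>, not the other way around. A sketch of the usual workaround, wrapping the delegate in a lambda that casts back at invocation time (someInt is the asker's key):

    // Not a direct cast but a new delegate; the (T)/(U) casts
    // throw at invoke time if the runtime types don't match.
    handler.Add(someInt, (a, b) => myAction((T)a, (U)b));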

LLVM 3.0 compiler error: cast of C pointer type to Objective-C pointer type 'id' requires a bridged cast

Submitted by 梦想与她 on 2019-12-23 23:47:00
Question: I am trying to compile an old iPhone application project using the new LLVM 3.0 compiler. I am getting this error:

    Automatic Reference Counting Issue: cast of C pointer type 'CGColorRef' (aka 'struct CGColor *') to Objective-C pointer type 'id' requires a bridged cast [4]

for this code:

    UIColor *color1, *color2, *color3, *color4;
    ....
    NSArray *colors = [NSArray arrayWithObjects:(id)color1.CGColor,
                       color2.CGColor, color3.CGColor, nil];

This code compiles without problems in the older LLVM GCC 4.2 compiler.
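Under ARC the compiler needs to know who owns a pointer crossing the C/Objective-C boundary; since the UIColor objects keep owning their CGColors here, a no-ownership-transfer bridge is the usual fix (a sketch with the same variables):

    NSArray *colors = [NSArray arrayWithObjects:(__bridge id)color1.CGColor,
                                                (__bridge id)color2.CGColor,
                                                (__bridge id)color3.CGColor,
                                                nil];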