implicit-conversion

Char to int implicit cast Behind the Scenes

自闭症网瘾萝莉.ら submitted on 2019-12-23 18:16:09
Question: The following is valid in C#, as char can be implicitly cast to int:

    int i = 'a';

I am just curious about what the .NET Framework does behind the scenes. I looked into the char and int types' source code but was unable to find where it is defined. Can anybody explain what happens behind the scenes?

Answer 1: In C# we have something that is called a single-rooted unified type system. That means every existing type is a subtype of one root type: Object in C#. So char and int are only short names (aliases) for System.Char and System.Int32.
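
A minimal sketch of what this looks like in practice (the class name is illustrative): the char-to-int conversion is one of C#'s built-in implicit numeric conversions, not a user-defined operator you would find in the source of System.Char or System.Int32:

    // Hypothetical demo: char widens implicitly to int because a char is a
    // 16-bit UTF-16 code unit, so every char value fits in an int.
    using System;

    class CharToIntDemo
    {
        static void Main()
        {
            char c = 'a';
            int i = c;            // implicit conversion, no cast needed
            Console.WriteLine(i); // prints 97, the UTF-16 code of 'a'
        }
    }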

Why does one need to specify the row while assigning a pointer to a 2D Array?

泄露秘密 submitted on 2019-12-23 16:06:41
Question: The compiler reports "assignment from incompatible pointer type" when the row of the 2D array is not mentioned. I always thought an array name without brackets means the address of the first element, in this case the address of the element twodstring[0][0]. The compiler does not report an error when the row is mentioned; why is this the case?

    #include <stdio.h>
    int main() {
        char onedstring[] = {"1D Array"};
        char twodstring[][5] = {"2D", "Array"};
        char *p1, *p2;
        p1 = onedstring;
        p2 = twodstring;
        p2
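
A minimal sketch of the distinction the warning is about: an array decays to a pointer to its first element, and for a 2D array the first element is a whole row, so the compatible pointer type is a pointer to an array:

    #include <stdio.h>

    int main(void) {
        char twodstring[][5] = {"2D", "Array"};
        char (*rows)[5] = twodstring;  /* OK: the first element is a char[5] row */
        char *elem = twodstring[0];    /* OK: row 0 decays to char*              */
        /* char *bad = twodstring;        incompatible: char (*)[5] vs char*     */
        printf("%c%c\n", rows[1][0], elem[0]); /* prints "A2" */
        return 0;
    }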

Multiple user-defined conversions on initialization

我是研究僧i submitted on 2019-12-23 12:25:58
Question: I am aware of the fact that C++ allows only a single user-defined implicit conversion when converting between types. However, I recently came across a situation where it seems like double user-defined implicit conversions are allowed on initialization. Consider the following classes:

    // fractions
    class Rational {
    public:
        int num, den;
        // default constructor, etc.
        Rational(int n) : num(n), den(1) {}  // NOT explicit
        // arithmetic and compound assignment defined between two Rationals.
    };
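
A minimal sketch of the rule in question, using the Rational above plus a hypothetical Celsius class added for illustration; one user-defined conversion per implicit conversion sequence is allowed, two are not:

    class Rational {
    public:
        int num, den;
        Rational(int n) : num(n), den(1) {}  // NOT explicit: int -> Rational
    };

    class Celsius {
    public:
        double degrees;
        Celsius(const Rational& r)           // NOT explicit: Rational -> Celsius
            : degrees(static_cast<double>(r.num) / r.den) {}
    };

    int main() {
        Rational r = 3;     // OK: one user-defined conversion (int -> Rational)
        Celsius  c = r;     // OK: one user-defined conversion (Rational -> Celsius)
        // Celsius bad = 3; // error: would chain two user-defined conversions
        (void)c;
    }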

Internal compiler error - Templated conversion operator in switch expression

不打扰是莪最后的温柔 submitted on 2019-12-23 09:48:19
Question: The following code crashes the Microsoft compiler:

    class Var {
    public:
        template <typename T>
        operator T() const {}
    };

    int main() {
        Var v;
        switch (v) {
        }
    }

My question: Is the code correct, or should the compiler give an appropriate error? Is an unambiguous conversion to an integral type possible?

Answer 1: The compiler crashing is always a bug. This code does not compile on either gcc or clang, but both provide an error without crashing. For clang the error is: error: statement requires expression
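
On the second question, a minimal sketch of one way to make the conversion unambiguous (this swaps the templated operator for a single integral one; it is an illustration, not the original poster's code):

    class Var {
    public:
        operator int() const { return 1; }  // one unambiguous integral conversion
    };

    int main() {
        Var v;
        switch (v) {  // OK: the switch condition converts to int
            case 1:  break;
            default: break;
        }
    }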

Why does Assert.AreEqual on custom struct with implicit conversion operator fail?

不羁的心 submitted on 2019-12-23 08:59:25
Question: I've created a custom struct to represent an amount. It is basically a wrapper around decimal. It has an implicit conversion operator to cast it back to decimal. In my unit test, I assert that the Amount equals the original decimal value, but the test fails.

    [TestMethod]
    public void AmountAndDecimal_AreEqual()
    {
        Amount amount = 1.5M;
        Assert.AreEqual(1.5M, amount);
    }

When I use an int, though (for which I did not create a conversion operator), the test does succeed.

    [TestMethod]
    public void
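
A minimal sketch of why the assert fails (the Amount definition here is an assumption based on the question's description): Assert.AreEqual falls back to the object overload, which compares boxed values with Equals, and conversion operators are compile-time constructs that play no part in that runtime comparison:

    using System;

    public struct Amount
    {
        private readonly decimal value;
        public Amount(decimal value) { this.value = value; }

        public static implicit operator decimal(Amount a) => a.value;
        public static implicit operator Amount(decimal d) => new Amount(d);
    }

    class Demo
    {
        static void Main()
        {
            Amount amount = 1.5M;
            // Roughly what Assert.AreEqual(1.5M, amount) ends up evaluating:
            Console.WriteLine(Equals(1.5M, amount)); // False: different boxed types
            Console.WriteLine(1.5M == amount);       // True: conversion applied at compile time
        }
    }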

Using typecasting to remove gcc compiler warnings

家住魔仙堡 submitted on 2019-12-23 07:28:56
Question: I am doing embedded ARM programming with gcc 4.9. I've been using the -Wconversion switch because it's in my company's default dev tool configuration. I'm using the stdint.h types (uint8_t, uint32_t, etc.). The compiler creates warnings every time I perform a compound assignment or even simple addition. For example:

    uint8_t u8 = 0;
    uint16_t u16;

    // These cause warnings:
    u8 += 2;
    u8 = u16 >> 8;

The "common method" to fix this is to use casts, as discussed here and here:

    u8 = (uint8_t)(u8 + 2);
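
A minimal sketch of why -Wconversion fires in the first place: the integer promotions evaluate both expressions as int, so storing the result back into a uint8_t is a narrowing conversion, and the cast documents that the narrowing is intentional:

    #include <stdint.h>

    void example(void)
    {
        uint8_t u8 = 0;
        uint16_t u16 = 0x1234;

        u8 = (uint8_t)(u8 + 2);   /* u8 + 2 is computed as int, then narrowed   */
        u8 = (uint8_t)(u16 >> 8); /* u16 >> 8 is int too; value fits in 8 bits  */
        (void)u8;
    }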

Implicit conversion from char to single character string

风格不统一 submitted on 2019-12-23 07:25:40
Question: First of all: I know how to work around this issue. I'm not searching for a solution. I am interested in the reasoning behind the design choices that led to some implicit conversions and didn't lead to others. Today I came across a small but influential error in our code base, where an int constant was initialised with the char representation of that same number. This results in an ASCII conversion of the char to an int. Something like this:

    char a = 'a';
    int z = a;
    Console.WriteLine(z); //
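
A minimal sketch of the asymmetry the question is about (names are illustrative): char converts to int implicitly because it is a lossless numeric widening, while turning a char into a single-character string allocates a new object and therefore requires an explicit call:

    using System;

    class CharConversions
    {
        static void Main()
        {
            char a = '1';
            int z = a;               // implicit: z is 49, the character's code, not 1
            string s = a.ToString(); // explicit: "1"; `string s = a;` does not compile
            Console.WriteLine($"{z} {s}");
        }
    }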

How does implicitly work in this example from Scala in Depth

假装没事ソ submitted on 2019-12-23 02:48:28
Question: In the book Scala in Depth, there's this example of implicit scoping:

    scala> object Foo {
         |   trait Bar
         |   implicit def newBar = new Bar {
         |     override def toString = "Implicit Bar"
         |   }
         | }
    defined module Foo

    scala> implicitly[Foo.Bar]
    res0: Foo.Bar = Implicit Bar

My question here is: how did implicitly find the implementation of the trait Bar in the above example? I think I am a little confused by how implicitly works.

Answer 1: Apparently, for Foo.Bar, it works like Foo#Bar, i.e., if
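
A minimal sketch of the lookup rule at work (standard Scala 2 behaviour, not a quote from the book): the implicit scope of a type includes the companion-style objects of all parts of the type, and for Foo.Bar the enclosing object Foo is such a part, so its member newBar is found without any import:

    object Foo {
      trait Bar
      implicit def newBar: Bar = new Bar {
        override def toString = "Implicit Bar"
      }
    }

    object Demo extends App {
      // No `import Foo._` needed: Foo belongs to Foo.Bar's implicit scope.
      println(implicitly[Foo.Bar]) // prints: Implicit Bar
    }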

Scala delegate import of implicit conversions

纵饮孤独 submitted on 2019-12-22 10:00:00
Question: In Scala, how can I delegate the importing of implicit conversions into my scope, such that I don't have to have a big "environment" class which provides both library functions/values (for a DSL I am creating) and implicit conversions? In short, can I move my implicit conversions out of an object and still have them imported when I write:

    import MyDslEnvironment._

The goal of this is to make the importing and use of my framework simple and lightweight, in the sense that only a single
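
A minimal sketch of one way this can be arranged (the trait and type names are hypothetical): define the conversions and the DSL functions in separate traits and mix them into the environment object, so a single wildcard import still brings everything into scope:

    case class Meters(value: Int)

    trait Conversions {
      implicit def intToMeters(i: Int): Meters = Meters(i)
    }

    trait DslFunctions {
      def midpoint(a: Meters, b: Meters): Meters = Meters((a.value + b.value) / 2)
    }

    // One object aggregates both traits; the implicits stay defined elsewhere.
    object MyDslEnvironment extends Conversions with DslFunctions

    object Usage extends App {
      import MyDslEnvironment._
      println(midpoint(2, 8)) // 2 and 8 convert implicitly; prints Meters(5)
    }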