How to create 2 incompatible number-like types in TypeScript?

Submitted by 寵の児 on 2020-12-25 02:01:50

Question


I've been trying to figure out how to create 2 mutually-incompatible number-like/integer types in TS.

For example, in the code below, height and weight are both number-like, but the concept of adding them together or treating them equivalently is nonsensical and should be an error.

type height = number; // inches
type weight = number; // pounds
var a: height = 68;
var b: weight = measureScale(); // Call to an external, non-TS function that returns a weight.
console.log(a+b); // Should be an error.
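For what it's worth, the aliases don't prevent cross-assignment either, since both sides are just number (c here is an illustrative extra variable):

var c: height = b; // no error: height and weight are both aliases for number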

Is there a way to create 2 types which are both numbers, but are not compatible with each other?


EDIT: As mentioned in the comments, what I'm after appears to be analogous to Haskell's newtype.


EDIT2: After several hours of poking the problem with a pointy stick, I managed to reach an answer, which I have posted below.


Answer 1:


The closest thing to a newtype in TypeScript is to create a new "nominal" type (TypeScript doesn't have true nominal types, but there are workarounds like branding) along with a value constructor and a field accessor function that just use type assertions in their implementations. For example:

interface Height { 
  __brand: "height"
}
function height(inches: number): Height {
  return inches as any;
}
function inches(height: Height): number {
  return height as any;
}

interface Weight { 
  __brand: "weight"
}
function weight(pounds: number): Weight {
  return pounds as any;
}
function pounds(weight: Weight): number {
  return weight as any;
}

const h = height(12); // one foot
const w = weight(2000); // one ton

The types Height and Weight (sorry, I can't bring myself to give new types a lowercase name) are treated as distinct types by the compiler. The height() function is a Height value constructor (it takes a number and returns a Height), the inches() function is its associated field accessor (it takes a Height and returns a number), and weight() and pounds() are the analogous functions for Weight. All of those functions are just identity functions at runtime, so JavaScript still treats the values as plain numbers, with a bit of function-call overhead that's hopefully optimized away by a good compiler. If you're really worried about that overhead, you can do the assertions yourself:

const h = 12 as any as Height;
const w = 2000 as any as Weight;
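For illustration, here's a quick sketch (reusing the declarations above) of what the compiler now rejects:

const x: Weight = h;  // error: Height is not assignable to Weight
const p = pounds(h);  // error: pounds() expects a Weight, not a Height
const s = h + 1;      // error: Height is not a number, so arithmetic is rejected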

Now you have distinct named types, so you can't accidentally use a Height where a Weight is needed or vice versa. But, just like with a newtype, the compiler will not treat these as numbers. Yes, you could make Height and Weight subtypes of number (via intersection types), but that's probably a mistake: the arithmetic operators like + operate on number values, so if both h and w are subtypes of number, then h + w will not be an error. And if h and w are not subtypes of number, then h + h will be an error. You can't change that, since TypeScript does not let you alter the type declarations of operators the way it does with functions.

I prefer to prevent both h + h and h + w from compiling, so Height and Weight are not numbers here. Instead, let's create our own add() function that behaves the way you want:

type Dimension = Height | Weight;

function add<D extends Dimension>(a: D, b: D): D {
  return ((a as any) + (b as any)) as any;
}
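With h and w from above, this generic version mostly behaves as intended, though (as noted just below) an explicit type argument can still sneak a mixed addition through; a quick sketch:

const hh = add(h, h);                    // okay: D is inferred as Height
const ww = add(w, w);                    // okay: D is inferred as Weight
const oops = add<Height | Weight>(h, w); // compiles: Height | Weight also satisfies D extends Dimension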

The add() function accepts either two Height or two Weight parameters, and returns a value of the same type. Actually, with the above it's still possible to pass in something like Height | Weight as D, so if you're really serious about locking it down you can use overloads instead:

function add(a: Height, b: Height): Height;
function add(a: Weight, b: Weight): Weight;
function add(a: any, b: any): any {
  return a+b;
}

And, behold:

const twoH = add(h, h); // twoH is a Height
const twoW = add(w, w); // twoW is a Weight
const blah = add(h, w); // error, Height and Weight don't mix

So we're almost done. For your external measureScale() function you'd just declare the return type to be a Weight:

declare function measureScale(): Weight;

var a = height(68);
var b = measureScale();

And verify the intended result:

console.log(add(a,b)); // err
console.log(add(a,a)); // okay

Hope that helps; good luck!




Answer 2:


So, after banging my head against the wall for several hours, I managed to come up with this:

class height extends Number {}
class weight extends Number {}

By subclassing the Number class, TypeScript lets you create distinct numeric types.

And then you can go and use the variables as specified above.

var a: height = 68;
var b: weight = 184;
console.log(a+b); // Should be an error.

The issue I ran into is that this also produces an error:

console.log(a+a); // Should NOT be an error.
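One way around that, if you stay with the class-based approach, is to unwrap the values explicitly before doing arithmetic; valueOf() on a Number returns a plain number. A minimal sketch:

console.log(a.valueOf() + a.valueOf()); // okay: number + number
console.log(a.valueOf() + b.valueOf()); // note: this compiles too, so the unwrapped values lose their distinction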


Source: https://stackoverflow.com/questions/48054767/how-to-create-2-incompatible-number-like-types-in-typescript
