
I created some custom types, which are really just named primitive types, to make it clear what kind of number or string a value represents. For example:

export declare type AmountOfCredits = number;
export declare type AmountOfMoney = number;
export declare type AmountOfProduct = number;
export declare type BankAccountNumber = string;
export declare type CompanyName = string;
export declare type CountryName = string;
export declare type DayOfWeek = 0 | 1 | 2 | 3 | 4 | 5 | 6;
export declare type Days = number;
export declare type ID = number;
export declare type Month = number;
// ...etc

Now I can use them in an interface:

export interface EmployeeInterface {
  id: ID;
  user_id: ID;
  mother_name: PersonName;
  place_of_birth: SettlementName;
  date_of_birth: ISODate;
  name: PersonName;
}

But the implementing class also compiles when it falls back to the plain primitive types:

export class Employee implements EmployeeInterface {
  id: number;
  user_id: number;
  mother_name: string;
  place_of_birth: string;
  date_of_birth: string;
  name: string;
}
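
For reference, here is a minimal sketch of why the class above compiles (only the ID alias is reused; HasId, EmployeeRecord and employeeId are illustrative names): a type alias is just another name for the same type, so the compiler treats ID and number as fully interchangeable in both directions.

type ID = number;

// An alias is transparent: a plain number is accepted where ID is expected,
// and an ID is accepted where a plain number is expected.
const employeeId: ID = 42;
const plainNumber: number = employeeId;

interface HasId {
  id: ID;
}

// The class may declare the property as `number` and still satisfy the
// interface, because the compiler compares the underlying types structurally.
class EmployeeRecord implements HasId {
  id: number = 0;
}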

Can I somehow force the class to accept only the custom declared types instead of the number or string primitives?
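
For context, one widely used pattern for this is "branding" (a form of nominal typing): intersect the primitive with a phantom marker so a bare number or string no longer satisfies the alias. This is only a sketch, not part of the original question; the __brand property and the asId/asPersonName helpers are illustrative names, and the aliases are redeclared to keep the example self-contained.

type ID = number & { readonly __brand: 'ID' };
type PersonName = string & { readonly __brand: 'PersonName' };

// The only way to obtain a branded value is through an explicit cast,
// typically wrapped in a small helper function.
const asId = (value: number): ID => value as ID;
const asPersonName = (value: string): PersonName => value as PersonName;

interface EmployeeInterface {
  id: ID;
  name: PersonName;
}

class Employee implements EmployeeInterface {
  // Declaring `id: number` or `name: string` here is now a compile error,
  // because the primitives are no longer assignable to the branded aliases.
  id: ID;
  name: PersonName;

  constructor(id: ID, name: PersonName) {
    this.id = id;
    this.name = name;
  }
}

// new Employee(1, 'Alice');                            // rejected by the compiler
const e = new Employee(asId(1), asPersonName('Alice')); // accepted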

netdjw
