What are decimals called in JavaScript?

JavaScript has only one type of number. Numbers can be written with or without decimals.

Example

let x = 3.14;    // A number with decimals
let y = 3;       // A number without decimals

Extra large or extra small numbers can be written with scientific (exponent) notation:
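
let x = 123e5;     // 12300000
let y = 123e-5;    // 0.00123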

JavaScript Numbers are Always 64-bit Floating Point

Unlike many other programming languages, JavaScript does not define different types of numbers, such as integer, short, long, or floating-point.

JavaScript numbers are always stored as double precision floating point numbers, following the international IEEE 754 standard.

This format stores numbers in 64 bits, where the number (the fraction) is stored in bits 0 to 51, the exponent in bits 52 to 62, and the sign in bit 63:

Value (aka Fraction/Mantissa)    Exponent                  Sign
52 bits (bits 0 - 51)            11 bits (bits 52 - 62)    1 bit (bit 63)

Integer Precision

Integers (numbers without a period or exponent notation) are accurate up to 15 digits.

Example

let x = 999999999999999;   // x will be 999999999999999
let y = 9999999999999999;  // y will be 10000000000000000

The maximum number of significant digits is 17.

Floating Precision

Floating point arithmetic is not always 100% accurate:

let x = 0.2 + 0.1;    // x will be 0.30000000000000004

To solve the problem above, it helps to multiply and divide:

let x = (0.2 * 10 + 0.1 * 10) / 10;    // x will be 0.3

Adding Numbers and Strings

WARNING !!

JavaScript uses the + operator for both addition and concatenation.

Numbers are added. Strings are concatenated.

If you add two numbers, the result will be a number:
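
let x = 10;
let y = 20;
let z = x + y;    // z will be 30 (a number)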

If you add two strings, the result will be a string concatenation:
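
let x = "10";
let y = "20";
let z = x + y;    // z will be "1020" (a string)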

If you add a number and a string, the result will be a string concatenation:
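
let x = 10;
let y = "20";
let z = x + y;    // z will be "1020" (a string)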

If you add a string and a number, the result will be a string concatenation:
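
let x = "10";
let y = 20;
let z = x + y;    // z will be "1020" (a string)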

A common mistake is to expect this result to be 30:
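
let x = 10;
let y = "20";
let z = x + y;    // z will be "1020", not 30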

A common mistake is to expect this result to be 102030:

Example

let x = 10;
let y = 20;
let z = "30";
let result = x + y + z;    // result will be "3030"

The JavaScript interpreter works from left to right.

First 10 + 20 is added because x and y are both numbers.

Then 30 + "30" is concatenated because z is a string.

Numeric Strings

JavaScript strings can have numeric content:

let x = 100;         // x is a number

let y = "100";       // y is a string

JavaScript will try to convert strings to numbers in all numeric operations:

This will work:
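
let x = "100";
let y = "10";
let z = x / y;    // z will be 10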

This will also work:
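
let x = "100";
let y = "10";
let z = x * y;    // z will be 1000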

And this will work:
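
let x = "100";
let y = "10";
let z = x - y;    // z will be 90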

But this will not work:
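
let x = "100";
let y = "10";
let z = x + y;    // z will be "10010" (string concatenation)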

In the last example JavaScript uses the + operator to concatenate the strings.

NaN - Not a Number

NaN is a JavaScript reserved word indicating that a number is not a legal number.

Trying to do arithmetic with a non-numeric string will result in NaN (Not a Number):
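
let x = 100 / "Apple";    // x will be NaN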

However, if the string contains a numeric value, the result will be a number:
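
let x = 100 / "10";    // x will be 10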

You can use the global JavaScript function isNaN() to find out if a value is not a number:
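
let x = 100 / "Apple";
isNaN(x);    // returns true because x is NaN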

Watch out for NaN. If you use NaN in a mathematical operation, the result will also be NaN:
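
let x = NaN;
let y = 5;
let z = x + y;    // z will be NaN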

Or the result might be a concatenation like NaN5:
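
let x = NaN;
let y = "5";
let z = x + y;    // z will be "NaN5"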

NaN is a number: typeof NaN returns "number":
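
typeof NaN;    // returns "number"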

Infinity

Infinity (or -Infinity) is the value JavaScript will return if you calculate a number outside the largest possible number.

Example

let myNumber = 2;
// Execute until Infinity
while (myNumber != Infinity) {
  myNumber = myNumber * myNumber;
}

Division by 0 (zero) also generates Infinity:
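
let x = 2 / 0;     // x will be Infinity
let y = -2 / 0;    // y will be -Infinity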

Infinity is a number: typeof Infinity returns "number".

Hexadecimal

JavaScript interprets numeric constants as hexadecimal if they are preceded by 0x.
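
Example

let x = 0xFF;    // x will be 255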

Never write a number with a leading zero (like 07).
Some JavaScript versions interpret numbers as octal if they are written with a leading zero.

By default, JavaScript displays numbers as base 10 decimals.

But you can use the toString() method to output numbers from base 2 to base 36.

Hexadecimal is base 16. Decimal is base 10. Octal is base 8. Binary is base 2.

Example

let myNumber = 32;
myNumber.toString(32);    // returns "10"
myNumber.toString(16);    // returns "20"
myNumber.toString(12);    // returns "28"
myNumber.toString(10);    // returns "32"
myNumber.toString(8);     // returns "40"
myNumber.toString(2);     // returns "100000"

JavaScript Numbers as Objects

Normally JavaScript numbers are primitive values created from literals:
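
let x = 123;    // x is a primitive number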

But numbers can also be defined as objects with the keyword new:
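
let y = new Number(123);    // y is a Number object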

Do not create Number objects.

The new keyword complicates the code and slows down execution speed.

Number Objects can produce unexpected results:

When using the == operator, x and y are equal:

let x = 500;
let y = new Number(500);
// (x == y) is true, because == compares the values

When using the === operator, x and y are not equal.

let x = 500;
let y = new Number(500);
// (x === y) is false, because x and y have different types (number and object)

Note the difference between (x == y) and (x === y).

(x == y) true or false?

let x = new Number(500);
let y = new Number(500);
// (x == y) is false, because x and y are different objects

(x === y) true or false?

let x = new Number(500);
let y = new Number(500);
// (x === y) is false, because x and y are different objects

Comparing two distinct JavaScript objects always returns false, because objects are compared by reference, not by value.

Complete JavaScript Number Reference

For a complete Number reference, visit our:

Complete JavaScript Number Reference.

The reference contains descriptions and examples of all Number properties and methods.


How are numbers represented in JavaScript?

In JavaScript, numbers are implemented in double-precision 64-bit binary format IEEE 754 (i.e., a number between ±2^−1022 and ±2^+1023, or about ±10^−308 to ±10^+308, with a numeric precision of 53 bits). Integer values up to ±2^53 − 1 can be represented exactly.

What type of variable is a decimal number?

In JavaScript, a decimal number is not a separate type: it is simply a Number, stored in the same 64-bit IEEE 754 double-precision floating point format as every other number. (Some other languages do offer a dedicated Decimal type, stored as a 96-bit integer together with a scaling factor and a sign, but JavaScript has no built-in equivalent.)

What is the type of number in JavaScript?

The JavaScript Number type is a double-precision 64-bit binary format IEEE 754 value, like double in Java or C#. This means it can represent fractional values, but there are some limits to the stored number's magnitude and precision.

What is the output of 0.1 + 0.2 in JavaScript?

0.1 + 0.2 evaluates to 0.30000000000000004 in JavaScript because of binary floating point math. This is not a bug: the ECMAScript specification requires IEEE 754 arithmetic, so 0.1 + 0.2 is not exactly equal to 0.3. The usual workaround is to scale to integers before adding (as shown in the Floating Precision section above) or to round the result.
