# Natural Numbers

In mathematics, the natural numbers are the ordinary whole numbers used for counting ("there are 6
coins on the table") and ordering ("this is the 3rd largest city in the country"). These purposes are
related to the linguistic notions of cardinal and ordinal numbers, respectively (see English numerals). A
later notion is that of a nominal number, which is used only for naming. Properties of the natural
numbers related to divisibility, such as the distribution of prime numbers, are studied in number theory.
Problems concerning counting and ordering, such as partition enumeration, are studied in
combinatorics. There is no universal agreement about whether to include zero in the set of natural
numbers: some define the natural numbers to be the positive integers {1, 2, 3, ...}, while for others the
term designates the non-negative integers {0, 1, 2, 3, ...}.
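As a minimal illustration (not from the source), the two conventions can be sketched in Python; the generator name `naturals` and its flag are ours:

```python
from itertools import count, islice

def naturals(include_zero=True):
    """Generate the natural numbers under either convention.

    include_zero=True  -> 0, 1, 2, 3, ... (the non-negative integers)
    include_zero=False -> 1, 2, 3, ...    (the positive integers)
    """
    return count(0 if include_zero else 1)

# The first five naturals under each convention.
with_zero = list(islice(naturals(True), 5))      # starts at 0
without_zero = list(islice(naturals(False), 5))  # starts at 1
```

Both streams are infinite; only the starting point differs, which is exactly the disagreement described above.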

The former definition is the traditional one, with the latter definition first appearing in the 19th century. Some authors use the term "natural number" to exclude zero and "whole number" to include it; others use "whole number" in a way that excludes zero, or in a way that includes both zero and the negative integers.

The first major advance in abstraction was the use of numerals to represent numbers. This allowed systems to be developed for recording large numbers. The ancient Egyptians developed a powerful system of numerals with distinct hieroglyphs for 1, 10, and all the powers of 10 up to over one million.


Math.Edurite.com                                                               Page : 1/3
A stone carving from Karnak, dating from around 1500 BC and now at the Louvre in Paris, depicts 276
as 2 hundreds, 7 tens, and 6 ones; and similarly for the number 4,622. The Babylonians had a place-
value system based essentially on the numerals for 1 and 10. A much later advance was the
development of the idea that zero can be considered as a number, with its own numeral. The Babylonians used a zero digit in place-value notation (within other numbers) as early as 700 BC, but they omitted such a digit when it would have been the last symbol in the number.[1]
The Olmec and Maya civilizations used zero as a separate number as early as the 1st century BC, but
this usage did not spread beyond Mesoamerica. The use of a numeral zero in modern times originated
with the Indian mathematician Brahmagupta in 628. However, zero had been used as a number in the
medieval computus (the calculation of the date of Easter), beginning with Dionysius Exiguus in 525,
without being denoted by a numeral (standard Roman numerals do not have a symbol for zero); instead
nulla or nullae, genitive of nullus, the Latin word for "none", was employed to denote a zero value.[2]
The first systematic study of numbers as abstractions (that is, as abstract entities) is usually credited to
the Greek philosophers Pythagoras and Archimedes.

Independent studies also occurred at around the same time in India, China, and Mesoamerica. Several
set-theoretical definitions of natural numbers were developed in the 19th century. With these definitions
it was convenient to include 0 (corresponding to the empty set) as a natural number. Including 0 is now
the common convention among set theorists, logicians, and computer scientists. Many other
mathematicians also include 0, although some have kept the older tradition and take 1 to be the first
natural number.[4] Sometimes the set of natural numbers with 0 included is called the set of whole
numbers or counting numbers. On the other hand, since integer is Latin for "whole", the term integers usually refers to the negative and positive whole numbers together with zero. Mathematicians use N or ℕ (an N in blackboard bold, displayed as ℕ in Unicode) to refer to the set of all natural numbers. This set is countably infinite: it is infinite, yet its elements can be listed in a single sequence. This is also expressed by saying that the cardinal number of the set is aleph-null (ℵ₀). To be unambiguous about whether zero is included or not, sometimes an index (or subscript) "0" is added in the former case, and a superscript "*" or "+" is added in the latter case: ℕ₀ = {0, 1, 2, ...} and ℕ* = {1, 2, ...}. Some authors who exclude zero from the naturals use the terms natural numbers with zero, whole numbers, or counting numbers, denoted W, for the set of nonnegative integers. Others use the notation P for the positive integers if there is no danger of confusing this with the prime numbers.
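The set-theoretic definition mentioned above, in which 0 corresponds to the empty set, is usually carried out with the von Neumann construction: 0 is the empty set, and each successor n + 1 is n ∪ {n}. A minimal sketch (our own, using Python frozensets to stand in for sets):

```python
def von_neumann(n):
    """Return the von Neumann encoding of n as a nested frozenset.

    0 is the empty set; each successor n + 1 is n ∪ {n}.
    """
    s = frozenset()       # 0 := {}
    for _ in range(n):
        s = s | {s}       # successor step: n + 1 := n ∪ {n}
    return s

# Under this encoding, the set representing n has exactly n elements,
# and m < n holds exactly when the set for m is a member of the set for n.
assert len(von_neumann(3)) == 3
assert von_neumann(2) in von_neumann(5)
```

This makes concrete why including 0 is convenient for set theorists: the whole number system starts from the empty set with no extra machinery.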
