Programmed Math/Fractions


A fraction is most easily thought of as representing some portion of a unit.

The simplest fractions to think about are numbers like "one half" or "one tenth." For example, one third is the number you get when you divide one whole into three equal parts and take one of them.

We write one third as $\frac{1}{3}$.

Every fraction looks like that: an integer on top, which we call the numerator, and an integer on the bottom, which we call the denominator. This fractional notation represents the number we get when we divide the numerator by the denominator. In our simple examples so far, the numerator has been 1.
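
If you like to experiment, this definition is easy to play with in code. Here is a minimal sketch using Python's standard fractions module (a tool we choose purely for illustration; the book itself assumes no programming):

```python
from fractions import Fraction

one_third = Fraction(1, 3)    # the fraction with numerator 1 and denominator 3

print(one_third.numerator)    # 1, the integer on top
print(one_third.denominator)  # 3, the integer on the bottom
print(1 / 3)                  # 0.3333333333333333, the numerator divided by the denominator
```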

So, what do we mean by the expression $\frac{2}{3}$?

When we read this in English, we say "two thirds." That sounds like we have "two" of something, namely two "thirds," or two of $\frac{1}{3}$, which we naturally think of as $\frac{1}{3} + \frac{1}{3}$ or $2 \times \frac{1}{3}$. But in our more technical definition of a fraction we said it was the number we get by dividing a pair of integers, so that $\frac{2}{3}$ is defined as $2 \div 3$.

Well, fortunately for our intuitions, it turns out that $2 \div 3 = \frac{1}{3} + \frac{1}{3} = 2 \times \frac{1}{3}$.
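
We can check this claim with exact arithmetic. A short sketch, again assuming Python's fractions module:

```python
from fractions import Fraction

third = Fraction(1, 3)

print(Fraction(2, 3) == third + third)  # True: 2 divided by 3 equals 1/3 + 1/3
print(Fraction(2, 3) == 2 * third)      # True: 2 divided by 3 equals 2 times 1/3
```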

Thinking about a fraction as some portion of a unit works nicely so long as the numerator is less than the denominator. But what do we mean by $\frac{4}{3}$? Well, intuitively, we think of having four of $\frac{1}{3}$, which, thinking about huge slices of cake, must surely be equal to $1\frac{1}{3}$: one whole cake and one third of another. And, sure enough, if you do the division of 4 by 3 you'll end up with $1.3333\ldots$, which is $1 + 0.3333\ldots$, or $1\frac{1}{3}$.
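
The same kind of check works here (same hypothetical Python setup as above):

```python
from fractions import Fraction

four_thirds = Fraction(4, 3)

print(four_thirds == 1 + Fraction(1, 3))  # True: 4/3 is one whole plus one third
print(4 / 3)                              # 1.3333333333333333, i.e. 1 + 0.3333...
```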

Fractions can represent whole numbers, too. Nothing but our boring desire for simplicity and clarity keeps us from writing 12 as $\frac{12}{1}$ whenever we like.
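
One last check, in the same sketch style: a fraction whose denominator is 1 compares equal to the whole number itself.

```python
from fractions import Fraction

print(Fraction(12, 1) == 12)  # True: 12/1 is just the whole number 12
```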

Further topics in fractions: