What Is an IQ?
Originally, intelligence testing was used to identify children of lower intelligence so that they could be placed in special education programs. The first IQ tests were designed to compare a child’s intelligence to what it “should be” for the child’s age: a child who performed well above the expectation for his or her age received a higher score, and a child who performed below it received a lower score.
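The age-based scoring described above is often summarized by the classic ratio formula: mental age divided by chronological age, multiplied by 100. The function below is a minimal illustrative sketch of that idea (the name `ratio_iq` is made up for this example):

```python
def ratio_iq(mental_age, chronological_age):
    """Classic ratio IQ: mental age / chronological age * 100."""
    return 100.0 * mental_age / chronological_age

# A 10-year-old performing like a typical 12-year-old scores above 100:
print(ratio_iq(12, 10))  # 120.0

# A 10-year-old performing like a typical 8-year-old scores below 100:
print(ratio_iq(8, 10))  # 80.0
```

A score of exactly 100 means the child performs exactly as expected for his or her age.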
This method of determining mental age doesn’t work well when testing adults. Today, we attempt to write tests that measure a person’s true mental potential, unbiased by culture, and we compare each person’s score to the scores of others who have taken the same test.
So, we compare a person’s objective results to the objective results of other people, and determine how intelligent each test taker is relative to all other test takers, rather than comparing test takers to an arbitrary age-related standard.
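This norm-referenced approach can be sketched in a few lines of code. The sketch below assumes the common convention of rescaling raw scores so that the average score maps to an IQ of 100 with a standard deviation of 15; the function name and the sample scores are hypothetical:

```python
import statistics

def deviation_iq(raw_score, all_scores, mean_iq=100, sd_iq=15):
    """Rescale a raw test score relative to everyone who took the same test."""
    mean = statistics.mean(all_scores)
    sd = statistics.stdev(all_scores)
    return mean_iq + sd_iq * (raw_score - mean) / sd

sample = [40, 45, 50, 55, 60]  # hypothetical raw scores from other test takers
print(deviation_iq(50, sample))  # an average raw score maps to IQ 100.0
```

Notice that the IQ produced depends entirely on how the raw score compares to the other scores, not on the test taker’s age.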
The first step to understanding IQ testing is to understand standard deviation.
To understand this concept, it helps to learn about what statisticians call a normal distribution of data.
A normal distribution of data means that most of the examples in a data set are close to the average, while relatively few examples tend toward one extreme or the other.
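This clustering around the average can be demonstrated with simulated data. The sketch below generates a made-up normally distributed data set and checks what fraction of values fall within one standard deviation of the mean; for a normal distribution this fraction is roughly 68%:

```python
import random
import statistics

random.seed(1)
# Simulate a normally distributed data set (mean 100, standard deviation 15)
data = [random.gauss(100, 15) for _ in range(10000)]

mean = statistics.mean(data)
sd = statistics.stdev(data)

# Fraction of values within one standard deviation of the mean
within_one_sd = sum(1 for x in data if abs(x - mean) <= sd) / len(data)
print(round(within_one_sd, 2))  # close to 0.68: most values cluster near the mean
```

The few values far from the mean, in either direction, are the "extremes" of the distribution.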
Let’s say you are writing a story…