This introductory chapter provides an overview of Benford's law. Benford's law, also known as the First-digit or Significant-digit law, is the empirical gem of statistical folklore that in many naturally occurring tables of numerical data, the significant digits are not uniformly distributed, as might be expected, but instead follow a particular logarithmic distribution. In its most common formulation, the special case of the first significant (i.e., first non-zero) decimal digit, Benford's law asserts that the leading digit is not equally likely to be any one of the nine possible digits 1, 2, … , 9, but is 1 more than 30 percent of the time and 9 less than 5 percent of the time, with the probabilities decreasing monotonically in between. The remainder of the chapter covers the history of Benford's law, empirical evidence, early explanations, and the mathematical framework of Benford's law.
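The first-digit probabilities described above follow from the standard logarithmic formulation of Benford's law, P(d) = log₁₀(1 + 1/d) for d = 1, …, 9. A minimal sketch (not from the chapter itself, but the standard formula it refers to):

```python
import math

# Benford's first-digit law: P(d) = log10(1 + 1/d) for leading digit d in 1..9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford.items():
    print(f"digit {d}: {p:.4f}")

# Digit 1 occurs just over 30% of the time, digit 9 under 5%,
# the probabilities decrease monotonically, and they sum to 1.
```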