Given a point set $P = \{x_1, \ldots, x_N\}$ of $N$ points in the $d$-dimensional unit cube $[0,1]^d$, the star discrepancy is defined as
$$
D_N^*(P) = \sup_{J \in \mathcal{J}^*} \bigl| \Delta(J; P) \bigr|, \tag{1}
$$
where the local discrepancy is defined as
$$
\Delta(J; P) = \frac{A(J; P)}{N} - \lambda_d(J), \tag{2}
$$
$A(J; P)$ denotes the number of points of $P$ contained in $J$, $\lambda_d(J)$ is the content (Lebesgue measure) of $J$, and $\mathcal{J}^*$ is the class of all $d$-dimensional subintervals of $[0,1]^d$ of the form
$$
J = \prod_{j=1}^{d} [0, u_j) \tag{3}
$$
with $0 < u_j \le 1$ for $1 \le j \le d$. Here, the term "star" refers to the fact that the $d$-dimensional subintervals have a vertex at the origin.
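For small point sets, definition (1) can be evaluated exactly, since the supremum over $\mathcal{J}^*$ reduces to a finite search: it is attained when each coordinate $u_j$ of the box's upper corner is a coordinate of some point or equals $1$, provided points are counted both strictly and non-strictly to capture both one-sided limits of $\Delta(J; P)$ across the half-open boundary. The following is a minimal brute-force sketch (the function name `star_discrepancy` and the use of NumPy are illustrative choices, not from the source); its cost grows like $(N+1)^d$ candidate corners, so it is practical only for small $N$ and $d$.

```python
import itertools
import numpy as np

def star_discrepancy(points):
    """Brute-force star discrepancy of a small point set in [0,1]^d.

    The supremum in (1) over anchored boxes J = prod_j [0, u_j) is
    attained when every u_j is one of the points' j-th coordinates
    or 1, so scanning that finite grid suffices.  At each candidate
    corner u, points are counted both strictly (x < u, the half-open
    box of (3)) and non-strictly (x <= u, its closure) to capture the
    limits of Delta(J; P) from both sides of the box boundary.
    """
    pts = np.asarray(points, dtype=float)
    n, d = pts.shape
    # Candidate values for u_j: the j-th coordinates of the points, plus 1.
    grids = [np.unique(np.append(pts[:, j], 1.0)) for j in range(d)]
    best = 0.0
    for u in itertools.product(*grids):           # O((N+1)^d) corners
        u = np.asarray(u)
        vol = u.prod()                            # content lambda_d([0, u))
        a_open = np.all(pts < u, axis=1).sum()    # A(J; P) for the open box
        a_closed = np.all(pts <= u, axis=1).sum() # count for the closed box
        best = max(best, abs(a_open / n - vol), abs(a_closed / n - vol))
    return best

# One-dimensional sanity check against the known closed form
# D_N^* = 1/(2N) + max_n |x_(n) - (2n-1)/(2N)|:
print(star_discrepancy([[0.25], [0.55], [0.80]]))  # both give 0.25
```

For the three points above, the closed-form expression for $d = 1$ gives $\tfrac{1}{6} + \max\{0.0833, 0.05, 0.0333\} = 0.25$, matching the brute-force value.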