The average (or "mean") of a set of values is equal to the sum of the values divided by the number of values there are.
The average of real numbers x, y, and z is: (x+y+z)/3.
The average of 0 and 14.76 is: (0 + 14.76) / 2 = 14.76 / 2 = 7.38.
If you take a uniform distribution of values (that is, equally spaced values) between 0 and 14.76, then by symmetry half of them will lie above the midpoint (7.38) and half will lie below it. In other words, the mean of all the values between 0 and 14.76 is the same as the mean of 0 and 14.76 alone.
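This symmetry is easy to check numerically. A minimal sketch in Python (the language and the choice of 1001 sample points are mine, for illustration):

```python
# Mean of equally spaced values between 0 and 14.76, endpoints included.
n = 1001  # number of equally spaced sample points (an arbitrary choice)
values = [14.76 * i / (n - 1) for i in range(n)]
mean = sum(values) / len(values)
print(round(mean, 2))  # matches the midpoint, 7.38
```

The result is the same for any number of equally spaced points, since each value above 7.38 is mirrored by one equally far below it.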
Thus, if dates are distributed uniformly at random between the new moon date ("0 days") and the full moon date ("14.76 days"), we expect a randomly chosen date to fall, on average, midway between these two dates: at 7.38 days.
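This expectation can also be confirmed by simulation. A hedged sketch, again in Python, drawing a large number of uniformly random dates (the sample size and seed are my choices, not the source's):

```python
import random

random.seed(0)       # fixed seed for reproducibility
N = 1_000_000        # number of simulated random dates
dates = [random.uniform(0, 14.76) for _ in range(N)]
avg = sum(dates) / N  # sample mean; should be close to 7.38
```

With a million samples, the sample mean lands within a few hundredths of a day of the expected 7.38.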