While studying computer networks, I came across a performance (transmission-rate) calculation. Data volume is usually counted in binary units, so 15 GB of data is 15 × 2^30 × 8 bits. If it is transmitted at a rate of 10 Gbps, why is the transmission time (15 × 2^30 × 8) / (10 × 10^9) rather than (15 × 2^30 × 8) / (10 × 2^30)? In other words, why is the data volume (the numerator) treated as binary while the rate (the denominator) is treated as decimal?
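For concreteness, here is a minimal sketch of the two readings being compared, assuming (as the question does) that "15 GB" means 15 × 2^30 bytes and "10 Gbps" means 10 × 10^9 bit/s; the variable names and printout are just for illustration:

```python
# Transmission-time calculation: data volume in binary units, link rate in two conventions.
# Assumption: "15 GB" = 15 * 2**30 bytes and "10 Gbps" = 10 * 10**9 bit/s,
# which is the convention used in the question's textbook formula.

data_bits = 15 * 2**30 * 8        # 15 GB counted in binary bytes, converted to bits
rate_decimal = 10 * 10**9         # 10 Gbps read with the decimal (SI) convention: 10^9 bit/s
rate_binary = 10 * 2**30          # alternative reading: 10 * 2^30 bit/s (binary convention)

time_decimal = data_bits / rate_decimal  # the textbook's calculation
time_binary = data_bits / rate_binary    # the calculation the question expected

print(f"time with decimal rate: {time_decimal:.3f} s")  # ~12.885 s
print(f"time with binary rate:  {time_binary:.3f} s")   # 12.000 s
```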