Most of us use bit rate and data rate interchangeably, where a bit is either a 0 or a 1, while data comes in chunks of 0s and 1s. Now let's come to the definitions -
Data rate is the amount of useful data delivered per second, i.e. only the unique payload bits, expressed in bits per second.
Bit rate is the total number of bits transmitted per second, including retransmissions and overhead, also expressed in bits per second.
From the definitions both look the same, but consider the following hypothetical example -
Say an eNodeB is transferring 100 frames per second, each frame is 100 bits, and every 10th frame is re-transmitted because of an error (so 10 extra frames per second go over the air).
Now
Bit-Rate: (100 + 10) * 100 = 11,000 bps
Data-Rate: 100 * 100 = 10,000 bps
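The arithmetic above can be sketched in a few lines of Python. This is just an illustration of the hypothetical example; the constant names are my own, not anything from the LTE specs.

```python
# Hypothetical numbers from the example above: an eNodeB sends
# 100 frames/s, each frame is 100 bits, every 10th frame is re-sent.
FRAME_SIZE_BITS = 100
FRAMES_PER_SEC = 100
RETRANSMIT_EVERY = 10  # every 10th frame is transmitted again

retransmitted = FRAMES_PER_SEC // RETRANSMIT_EVERY  # 10 extra frames/s

# Bit rate counts every bit sent on the wire, retransmissions included.
bit_rate = (FRAMES_PER_SEC + retransmitted) * FRAME_SIZE_BITS

# Data rate counts only the useful (unique) payload bits.
data_rate = FRAMES_PER_SEC * FRAME_SIZE_BITS

print(bit_rate, data_rate)  # 11000 10000
```

The gap between the two numbers is exactly the bits wasted on retransmission, which is why the bit rate is always greater than or equal to the data rate.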
I hope this clears up the doubt.