http://www.tomshardware.com/forum/19951-42-detecting-length-ethernet-frame
The encoding is such that there *must* be a low-to-high
or high-to-low transition in the middle of each bit period
(this is Manchester encoding, as used by 10 Mbit Ethernet).
The direction of that mid-bit transition is how the system
differentiates between 0 and 1 bits: low-to-high is a 1,
high-to-low is a 0. The receiver clocks in a bit whenever a
transition occurs, so the guaranteed transition also carries
the transmitter's clock.
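
To make that concrete, here is a minimal sketch of a Manchester
encoder in C, assuming the IEEE 802.3 convention described above
(1 = rising mid-bit transition, 0 = falling). Every bit period
produces two opposite half-bit levels, so a transition always
occurs in the middle:

    #include <stdint.h>
    #include <stdio.h>

    /* Encode one byte as 16 half-bit line levels, LSB first
     * (Ethernet transmits the least significant bit of each octet
     * first). IEEE 802.3 convention: a 1 is low-then-high (rising
     * mid-bit transition), a 0 is high-then-low (falling). */
    static void manchester_encode(uint8_t byte, uint8_t halves[16])
    {
        for (int i = 0; i < 8; i++) {
            int bit = (byte >> i) & 1;
            halves[2 * i]     = bit ? 0 : 1;  /* first half of bit period        */
            halves[2 * i + 1] = bit ? 1 : 0;  /* second half: always the opposite */
        }
    }

    int main(void)
    {
        uint8_t halves[16];
        manchester_encode(0xD5, halves);  /* 0xD5: the Ethernet start-frame delimiter */
        for (int i = 0; i < 16; i++)
            printf("%u", halves[i]);
        putchar('\n');
        return 0;
    }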
After the transmitter has sent its last bit,
the line returns to its idle state. There are no more transitions,
and hence the receiver clocks no more bits into its input buffer;
the number of bits it clocked in before the line went idle is
what determines the frame's length.
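
The receive side can be sketched the same way: decode half-bit
pairs back into bits, and treat two equal halves (no mid-bit
transition) as the line having gone idle. This is an illustrative
sketch, not a real PHY implementation:

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Decode sampled half-bit levels back into bits until the
     * mid-bit transition disappears, i.e. the line has gone idle.
     * Returns the number of bits recovered; a whole frame's length
     * in octets is simply that count divided by 8. */
    static size_t manchester_decode(const uint8_t *halves, size_t n_halves,
                                    uint8_t *bits)
    {
        size_t n = 0;
        for (size_t i = 0; i + 1 < n_halves; i += 2) {
            if (halves[i] == halves[i + 1])  /* no transition: idle, stop clocking */
                break;
            bits[n++] = (halves[i] == 0) ? 1 : 0;  /* rising = 1, falling = 0 */
        }
        return n;
    }

    int main(void)
    {
        /* two encoded bits (1, then 0) followed by an idle, constant-low line */
        uint8_t halves[] = { 0, 1, 1, 0, 0, 0, 0, 0 };
        uint8_t bits[4];
        size_t n = manchester_decode(halves, sizeof halves, bits);
        printf("clocked in %zu bits\n", n);  /* prints: clocked in 2 bits */
        return 0;
    }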
When the interface hands a received frame to the network device driver,
it supplies the frame's length along with the data.
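
At the driver boundary, that handoff might look like the sketch
below. The names rx_frame and handle_rx are hypothetical, not a
real driver API; the 64-byte example value is just the minimum
Ethernet frame size.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical handoff structure: the interface delivers the
     * frame data together with the length it measured, so the
     * driver never has to work the length out for itself. */
    struct rx_frame {
        const uint8_t *data;  /* frame contents, destination MAC onward */
        size_t         len;   /* byte count supplied by the interface   */
    };

    static void handle_rx(const struct rx_frame *f)
    {
        printf("received a %zu-byte frame\n", f->len);
    }

    int main(void)
    {
        uint8_t buf[64] = { 0 };  /* 64 bytes: minimum Ethernet frame */
        struct rx_frame f = { buf, sizeof buf };
        handle_rx(&f);
        return 0;
    }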