As the Mega Drive's real refresh rate falls outside the HDMI frame rate specification (it's 59.something Hz, just outside the allowed range), some TVs might have issues with it, so the buffering modes bring the frame rate into spec in three ways:
Zero delay. It speeds the frame rate up slightly to spec (59.97 Hz, I think), which means the console clock runs slightly fast. That's fast enough for A/V sync to fail in long cutscenes in some CD games: error slowly accumulates, because video frames play faster than they should while audio plays at the normal CD hardware speed.
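A rough sketch of why that drift adds up, in Python. The native ~59.92 Hz figure is my assumption for the NTSC Mega Drive; the 59.97 Hz output rate is the number quoted above:

```python
NATIVE_HZ = 59.92   # approximate NTSC Mega Drive refresh (assumption)
OUTPUT_HZ = 59.97   # sped-up "zero delay" output rate quoted above

# Fraction by which video runs faster than real time (audio stays at 1.0x)
drift_per_second = OUTPUT_HZ / NATIVE_HZ - 1.0

# Milliseconds of A/V desync accumulated per minute of playback
drift_ms_per_min = drift_per_second * 60 * 1000
print(f"{drift_ms_per_min:.1f} ms of drift per minute")
```

That works out to roughly 50 ms of desync per minute, so a cutscene several minutes long ends up visibly out of sync.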
Fully buffered. The console output (I suppose) goes through a triple buffer: when a new console frame finishes rendering it's added to the queue, and the current one is marked as free. The extra buffer exists to avoid tearing, so completed frames are always swapped in during vblank (3 buffers: display buffer, anti-tearing buffer, render buffer), at the cost of extra input lag.
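The swap logic described above can be sketched like this (a toy model, assuming the three-buffer arrangement I guessed at; names like `frame_done` and `vblank` are mine, not anything from the real hardware):

```python
class TripleBuffer:
    """Toy model of the 'fully buffered' mode: swaps happen only at vblank."""

    def __init__(self):
        self.display = None    # frame currently being scanned out
        self.pending = None    # completed frame parked until vblank (anti-tearing)

    def frame_done(self, frame):
        # Console finished rendering a frame: park it; the render buffer
        # it occupied is now free for the next frame.
        self.pending = frame

    def vblank(self):
        # Safe point: promote the pending frame to the display, so the
        # visible image never changes mid-scanout (no tearing).
        if self.pending is not None:
            self.display = self.pending
            self.pending = None
        return self.display


tb = TripleBuffer()
tb.frame_done("frame 1")
print(tb.display)   # still None: the new frame waits for vblank (extra lag)
tb.vblank()
print(tb.display)   # now "frame 1"
```

The wait for vblank is exactly where the extra input lag comes from: a finished frame can sit in the pending buffer for up to one full refresh before anyone sees it.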
The other mode (whose name I can't remember): instead of 3 buffers this mode only has 2 (display buffer, render buffer), so it has less lag. As soon as a buffer finishes rendering, it's swapped with the display buffer, sometimes mid-frame, so tearing can appear.
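For contrast, the two-buffer mode swaps immediately instead of waiting for vblank. Another toy sketch under the same assumptions (the vblank line window here is a rough NTSC guess, not a real measurement):

```python
class DoubleBuffer:
    """Toy model of the low-lag two-buffer mode: swap as soon as a frame is ready."""

    VBLANK_LINES = range(224, 262)  # rough NTSC vblank window (assumption)

    def __init__(self):
        self.display = None

    def frame_done(self, frame, scanline):
        # Swap immediately, even mid-scanout. If the swap lands outside
        # vblank, the screen shows parts of two frames: tearing.
        tears = scanline not in self.VBLANK_LINES
        self.display = frame
        return tears


db = DoubleBuffer()
print(db.frame_done("frame 1", scanline=112))  # True: mid-frame swap, tearing
print(db.frame_done("frame 2", scanline=230))  # False: swap landed in vblank
```

Same trade-off as on PC: immediate swaps minimize lag, vblank-gated swaps eliminate tearing; you can't have both with only two buffers.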