Hi,
to me this looks more like a hardware problem.
But since you are thinking along software lines, let's try this first:
1. Is the resolution you are sending equal to the native resolution of the display? I assume it is, but it never hurts to ask.
2. If it is not, is there proper EDID communication between the display and the graphics card? This is pretty tricky to check and you will have to dig through a lot of information to verify it, so consider this step optional.
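For step 2, on Linux the raw EDID is usually exposed under /sys/class/drm/<connector>/edid, and tools like read-edid/parse-edid can decode it for you. Just as a rough illustration of what those tools look at, here is a minimal sketch (the function name and the synthetic byte patching are my own, not from any tool) that pulls the native resolution out of the first detailed timing descriptor of a base EDID block:

```python
# Minimal EDID sketch: assumes a valid 128-byte base EDID block.
def native_resolution(edid: bytes) -> tuple[int, int]:
    """Native resolution from the first detailed timing descriptor (DTD)."""
    dtd = edid[54:72]                    # first DTD occupies bytes 54-71
    h = dtd[2] | ((dtd[4] & 0xF0) << 4)  # horizontal active pixels (8 low + 4 high bits)
    v = dtd[5] | ((dtd[7] & 0xF0) << 4)  # vertical active lines (8 low + 4 high bits)
    return h, v

# Synthetic example: a zeroed block with a 1920x1080 DTD patched in.
edid = bytearray(128)
edid[56] = 0x80; edid[58] = 0x70         # 1920 = 0x780
edid[59] = 0x38; edid[61] = 0x40         # 1080 = 0x438
print(native_resolution(bytes(edid)))    # (1920, 1080)
```

If the resolution decoded here does not match what your graphics driver reports as native, the EDID handshake over HDMI is a likely suspect.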
But as mentioned, I think this problem is caused by the display itself, or rather by the HDMI connection.
It is pretty common with current display generations that they automatically assume signals on the HDMI input to be video signals. In video technology, a feature called "overscan" is exactly what you describe as the 5% "crop"; it is a historical relic from the days of CRTs and analog signals. Your display may have an option to disable it. You would probably find it in the advanced menu, or in the "sizing" or "input" section of the display's OSD. If your display offers no such option, maybe your graphics card's driver does; I think ATI's Catalyst Control Center (CCC) has a slider to scale the output signal.
Otherwise you might want to try an HDMI-to-DVI adapter: DVI will be recognized as a data signal by the display, and the picture will be shown pixel for pixel.
The background to this is that DVI has evolved into more of a data standard, while HDMI is a consumer standard mostly used by video devices like camcorders and DVD/Blu-ray players. In the near future, DisplayPort will become the small-sized equivalent to DVI.