If there were some way for the X11 server to send the application a hint that performance was low, the GUI could automatically scale down to simpler graphics. (Of course, that approach violates network transparency, but it could be an easy path to higher performance.)
It depends on which school of thought you come from. Some people think that true transparency in a client/server abstraction means the application should be completely ignorant of the implementation behind the abstraction. I can think of several instances where a client would benefit from being able to query for situational details and make local optimizations. You mentioned one. Another that immediately comes to mind is OpenGL, which attempts to hide the hardware completely. Unfortunately, this means the programmer has no way to know whether he is hitting software fallbacks, or whether the hardware implementation of a given function is actually faster than software.

I really like the idea of hiding the implementation, but not to the extent that the application is made too dumb: having access to information about the current state inside the implementation could allow the application to make better decisions about various aspects of its usability. This would include an X11 toolkit knowing whether it is running locally or remotely.

Hmm. Perhaps checking whether DISPLAY contains no hostname would suffice?
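That heuristic could be sketched roughly like this (a minimal Python illustration, assuming the standard `hostname:display.screen` syntax of DISPLAY; the function name is made up for the example):

```python
import os

def display_is_local(display=None):
    """Guess whether the X display is local by looking at DISPLAY.
    An empty hostname (":0", ":0.0") or the "unix" pseudo-host both
    mean a local Unix-domain socket connection."""
    if display is None:
        display = os.environ.get("DISPLAY", "")
    host, sep, _ = display.partition(":")
    if not sep:
        # Malformed or unset DISPLAY; assume nothing about locality.
        return False
    return host in ("", "unix")

# display_is_local(":0")                    -> True
# display_is_local("unix:0.0")              -> True
# display_is_local("remote.example.com:0")  -> False
```

One caveat: SSH X forwarding typically sets DISPLAY to something like `localhost:10.0`, which this check classifies as remote. That happens to be the right answer for performance purposes, since the traffic is still going over a tunnel, but it shows that the heuristic is telling you about the socket, not the actual round-trip cost.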