People with Nvidia GPUs are getting 100% GPU usage on the map screen as well. What's the difference between FurMark pushing 100% and something else pushing 100%? They're both 100% load.
Official word: "FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that's why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card." In other words, 100% utilization only means the GPU is busy every cycle; how much heat that produces depends on what work fills those cycles. It's the same reason Prime95 doesn't produce as much heat as IntelBurnTest: they use different algorithms and stress different execution units.
Both of those are benchmarks, and benchmarks are designed to load a GPU or CPU either to produce a score or, in FurMark's case, to generate heat. Whether one generates somewhat more heat than the other is beside the point, since both generate a significant amount of heat simply by loading the GPU or CPU to 100%. Nobody wants to run the equivalent of a constant benchmark, pouring out heat for no reason, just by looking at a 2D map screen, so they ought to fix whatever is causing this.
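The usual cause of this kind of behavior is a menu or map screen rendered in an uncapped loop, so the GPU redraws a static image as fast as it can. A common fix (this is a general sketch, not how this particular game is implemented) is a frame limiter that sleeps out the remainder of each frame's time budget:

```python
import time

def frame_limiter(target_fps: float):
    """Generator that sleeps each frame so the loop never exceeds target_fps.

    Capping a menu/map render loop like this keeps the GPU from redrawing a
    static 2D screen thousands of times per second at 100% load.
    """
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    while True:
        yield
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        else:
            # Fell behind schedule; reset instead of bursting to "catch up".
            next_deadline = time.perf_counter()

# Usage sketch: 30 frames capped at 60 FPS should take roughly half a second,
# instead of completing almost instantly as an uncapped loop would.
limiter = frame_limiter(60)
start = time.perf_counter()
for _ in range(30):
    next(limiter)  # a hypothetical render_map_frame() call would go here
elapsed = time.perf_counter() - start
```

Games more often achieve the same effect by enabling vsync or a driver-level FPS cap on non-gameplay screens, but the principle is identical: bound the frame rate and the "benchmark" load disappears.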