Comparison: 80w vs 90w RTX 2080 Max-Q
Nvidia RTX GPU laptops have arrived. Before the new GPUs launched, there was a lot of discussion regarding the speed and wattage of the GPUs in comparison to their desktop counterparts, especially the “Max-Q” versions. Per Nvidia, Max-Q laptop GPUs are more efficient models that have the same number of CUDA cores as the desktop models but run at greatly reduced clock speeds. This trade-off lowers performance but also reduces power draw and heat output. Nvidia made these GPUs for thin and light gaming notebooks such as the MSI GS65 that I’ll be testing today.
Nvidia also made two variants of some Max-Q GPUs: an 80-watt and a 90-watt version. In this article, we’ll compare the performance of the 2080 Max-Q at these two wattages.
2019 MSI GS65 (Review Here)
Nvidia 2080 Max-Q (Driver version – 418.91)
Intel 8750H (Undervolted, running at stock clock speeds)
32GB DDR4 2666MHz (Dual channel)
*Conductonaut Liquid Metal (LM) repaste on CPU and GPU by HIDevolution to improve cooling and thermal performance
My 2019 GS65 comes with the 80w variant of the 2080 Max-Q. MSI doesn’t disclose which variant you’ll receive, nor does it give you the option to choose. To get the 90w results, I had to find someone with a 2080 Max-Q running a 990MHz base clock, confirmed through GPU-Z. After some searching, I reached out to someone on Reddit (who wants to remain anonymous) who provided a GPU-Z screenshot and the GPU BIOS from their Razer Blade Advanced. To confirm that the BIOS would match my GPU, I compared their GPU-Z screenshot with mine; both systems had the same GPU ID and revision number (IE90, Rev A1). Finally, with some help from Zill from NotebookReview, I successfully flashed my GPU!
It was actually a really easy but nerve-wracking process because my laptop could’ve become an expensive paperweight if something went wrong.
Below are the specs for the 80w and 90w 2080 Max-Q via GPU-Z:
These variants look identical until you compare their clock speeds. The 90w version’s base clock is ~35% higher than the 80w’s; at boost clocks, that gap narrows to ~11%. Memory speed is the same on both versions. Finally, the jump from 80w to 90w is a 12.5% increase in TDP, which suggests the higher-wattage unit should deliver a performance increase.
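The percentages above come from simple percent-increase math. Here’s a quick sketch of that calculation; the 990MHz base clock for the 90w BIOS is confirmed via GPU-Z, while the 735MHz base clock for the 80w variant is the commonly listed spec (an assumption here, so check your own GPU-Z readout):

```python
def pct_increase(old, new):
    """Percentage increase going from `old` to `new`."""
    return (new - old) / old * 100

# Base clocks in MHz: 990 for the 90w BIOS (confirmed via GPU-Z),
# 735 for the 80w variant (commonly listed spec -- an assumption).
print(f"Base clock uplift: {pct_increase(735, 990):.1f}%")  # ~35%
print(f"TDP uplift:        {pct_increase(80, 90):.1f}%")    # 12.5%
```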
Here’s the link to NotebookCheck.net covering the different clock speeds of the 2080 Max-Q based on wattage.
Let’s see what the performance difference is between the 80w and 90w BIOS.
At stock clock speeds, the increase in performance was smaller than I expected. The 80w version was already performing quite well, especially with the latest drivers, which improved performance by ~5% overall compared to the original drivers. The biggest gains came in the most graphically intensive benchmarks. Across the games and benchmarks tested, the 90w variant is 4.32% faster at stock speeds than the 80w variant.
I overclocked both cards and reran my tests. Interestingly, my GPU was stable at the same +140MHz core and +500MHz memory overclock with both the 80w and 90w BIOS. Anything higher led to performance degradation, artifacts, and eventually a BSOD. With both GPUs overclocked, the average performance increase is slightly higher at 4.75%.
If we compare an overclocked 80w to a stock 90w, there is no real performance difference. In fact, the overclocked 80w matched and sometimes BEAT a stock 90w in my testing.
On our last chart, there is a 9.87% increase in performance when comparing an overclocked 90w to a stock 80w.
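For readers curious how the averaged uplift figures are derived, the sketch below shows the method: compute the percent uplift per title, then average them. The FPS pairs here are hypothetical placeholders, not the actual results from my charts:

```python
# Hypothetical (80w FPS, 90w FPS) pairs -- placeholder numbers only,
# NOT the actual benchmark results from this article.
results = {
    "Grand Theft Auto V": (95.0, 99.0),
    "Battlefield 5":      (88.0, 92.5),
    "Overwatch":          (142.0, 147.0),
}

# Percent uplift per title, then a simple (unweighted) average.
uplifts = [(new - old) / old * 100 for old, new in results.values()]
average_uplift = sum(uplifts) / len(uplifts)
print(f"Average uplift: {average_uplift:.2f}%")
```

A simple unweighted average treats each title equally; a geometric mean would be slightly more robust to outliers, but for uplifts this small the two agree closely.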
One thing not shown on the charts is GPU utilization. Shadow of the Tomb Raider records GPU usage during its benchmark test. The 80w version ran at 96% utilization while the 90w sat at 75%. I repeated the test multiple times to make sure it wasn’t a fluke, but the results remained the same. Since SotTR is a CPU-intensive game that uses AVX instructions, I was likely starting to get bottlenecked by my CPU.
Also, I looked at the minimum FPS for the games that provided that data, and there was no tangible difference between the 80w and 90w, whether stock or overclocked.
Overall, flashing the 2080 Max-Q to the 90w BIOS provided a “free” bump in performance.
*Note for benchmarks and games
Grand Theft Auto V – Ran the built-in benchmark and averaged all runs
Battlefield 5 – Observed FPS in the 2nd War Stories mission, measured from the spawn point
Overwatch – Observed FPS in the training area from the same vantage point, looking at a bot
Cooling and wattage
Along with the increase in performance comes more heat. My custom fan curve remained the same across both versions. Temperatures also held steady when overclocking the GPUs, which is a plus. With my custom fan curve and a laptop stand, I had no thermal throttling issues during these tests.
Looking at the HWiNFO screenshots below, you can see that the 80w version peaked at 92w of power draw while the 90w version pulled up to 108w.
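Notably, both BIOS versions peak above their nominal TDP. A quick calculation of the overshoot, using the peak draws from my HWiNFO logs:

```python
def overshoot(rated_w, peak_w):
    """How far peak power draw exceeds the rated TDP, in percent."""
    return (peak_w - rated_w) / rated_w * 100

# Peak draws from my HWiNFO logs: 92w on the 80w BIOS, 108w on the 90w BIOS.
print(f"80w BIOS: {overshoot(80, 92):.0f}% over TDP")   # 15% over
print(f"90w BIOS: {overshoot(90, 108):.0f}% over TDP")  # 20% over
```

So the 90w BIOS doesn’t just raise the ceiling by 10w on paper; under load it draws proportionally even further past its rating, which lines up with the extra heat discussed below.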
*Note – Don’t mind my CPU wattage. I made BIOS tweaks that cause the reported CPU wattage to read lower than it actually is. Read my review HERE if you want to know more about it.
A 90w 2080 Max-Q improves performance over the 80w variant, but not by much. Personally, I’ve reverted to my 80w BIOS because I don’t like the increased heat generated by the GPU. Due to the design of the GS65’s cooling system, the CPU’s heatpipe and fan help dissipate heat from the GPU. The 8750H is already a hard component to keep cool, and I’d rather have its heatpipe and fan focus on cooling the CPU rather than the GPU. Plus, the performance jump isn’t as big as I expected, and I don’t think it’s worth the extra heat and fan noise. If I want performance similar to a stock 90w, I’ll simply overclock my 80w GPU, which doesn’t generate any extra heat.
Also, a native 90w 2080 Max-Q might perform better than one simply flashed from 80w to 90w. However, since the 90w BIOS shares the same device ID and revision as mine, I doubt it would.
If you want to try flashing the BIOS yourself, see below. I hope this comparison helps!
-The Everyday Enthusiast
There is a chance you may brick your system. Keep in mind that I am not responsible for anything that goes wrong if you decide to try this out!