
GPU at 92% load on 3DMark

3 REPLIES
thorpyUK
Beginner
Message 1 of 4
1,738 Views

GPU at 92% load on 3DMark

Hi All, first post - what a wonderful resource!
I've recently purchased the Erazer X10 Guardian gaming laptop - link here to the exact model and spec:
https://www.overclockers.co.uk/medi...-intel-i7-10750h-gaming-laptop-lt-00v-md.html

Specs again here are:
Intel i7-10750H CPU
Nvidia RTX2070 Super (mobile version but NOT MAX-Q)
16GB DDR4 2666MHz RAM (my only criticism - it comes with one stick in single channel, so I've upgraded to 32GB in dual channel)
1TB SSD

Overall I'm really happy with it - the specs are great for the price.
BUT I can't help but feel it's holding out on me a little. If I look at my 3DMark scores vs other near-identical machines, mine are lower - by enough to be significant (all of these are at stock speeds):
Fire Strike avg scores:
Mine: overall 17,991 / graphics 20,564
MSI GE75: overall 19,456 / graphics 22,879 - https://www.3dmark.com/fs/22809260
MSI GE66: overall 18,880 / graphics 22,258 - https://www.reddit.com/r/GamingLaptops/comments/il59wt/msi_ge66_benchmark_questions/
Medion Erazer X10 Beast: overall 18,534 / graphics 21,446 - https://www.notebookcheck.net/Medio...g-laptop-with-good-battery-life.506266.0.html

Even the other Medion laptop reviewed has almost identical specs (and although the review states it's a Max-Q design, it is in fact the same 'full-fat' RTX 2070 Super (Mobile) as in mine - that one is just the 17" version).
I think I've narrowed it down to a GPU utilisation issue. Temps are cool (the GPU doesn't go above around 72-73°C, the CPU about 84°C), but while running the Fire Strike graphics tests, for example, my GPU utilisation seems to hover around 92-93% on average, in a range of 89-98%.
I'm on the latest Nvidia drivers (driver only, no GeForce Experience) with an up-to-date Windows 10 install (all the latest updates applied).
Any help with getting my GPU to run at 99-100% would be great! Thanks... image of 3DMark monitoring data below:

[Image: 3dmark firestrike.jpg]
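For anyone wanting to capture the same utilisation numbers outside 3DMark's overlay, here is a rough Python sketch that polls the Nvidia driver's nvidia-smi tool (assumed to be on the PATH; the helper names are mine, not part of any tool):

```python
import statistics
import subprocess

def read_gpu_util():
    """Query instantaneous GPU utilisation (%) via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    # One line per GPU; take the first (and on a laptop, only) one.
    return int(out.strip().splitlines()[0])

def summarize(samples):
    """Reduce a list of utilisation samples to avg/min/max."""
    return {"avg": statistics.mean(samples),
            "min": min(samples),
            "max": max(samples)}

# Usage while the benchmark runs (poll once a second for 60 s):
# import time
# samples = []
# for _ in range(60):
#     samples.append(read_gpu_util())
#     time.sleep(1)
# print(summarize(samples))
```

Logging over a whole Fire Strike run gives an average rather than eyeballing the overlay, which makes comparing runs before and after driver or power-plan changes easier.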

sweetpoison
Mentor
Message 2 of 4
1,730 Views

Hi @thorpyUK, let me be the first to welcome you to the forum.

 

My reply might not be the one you wish for, but a bad reply is better than no reply at all.

First, congratulations on your choice of laptop. Although you might be having second thoughts, you have to consider that this is a budget gaming laptop, so expect some letdowns here and there (mostly on the cosmetic side).

 

Regarding your post, there is much to say about 3DMark or any other benchmark program. You can't be sure about the results from other machines. For example, if you gain access to a machine that is on the score list and run the benchmark yourself, you will always get a different result. Consider the personalisation of the OS a main factor. Another factor would be the sponsorship status of that machine, where a score can be altered accordingly (3DMark did something like that in the past). So if you want to compare your laptop with another one, make sure you have configured both by hand and run the benchmark yourself.

 

Speaking of RAM, you have to do some research to see whether the one module fitted in your laptop is running single or dual channel. Another important aspect is latency; so far I've noticed Medion prefers single channel with low latency, which in some cases is a better choice than dual channel with high latency. With the fitted M.2 NVMe SSD hosting your OS, there will be almost no difference between 16GB and 32GB. You can order a 2x16GB kit and, after running tests (in games, not in 3DMark), see if it's worth the investment.
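One way to check what's actually fitted before buying anything: on Windows, WMIC can list the populated memory slots. A minimal sketch, assuming a Windows machine with WMIC available (the helper names are mine, and populated-slot count is only a rough proxy for channel mode):

```python
import subprocess

def populated_memory_slots():
    """List populated RAM slots via WMIC (Windows only)."""
    out = subprocess.run(
        ["wmic", "memorychip", "get", "DeviceLocator,Capacity,Speed"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First line is the column header; the rest are populated slots.
    return [row.strip() for row in out.splitlines()[1:] if row.strip()]

def likely_channel_mode(module_count):
    """Rough proxy: one stick can only run single channel;
    two sticks usually run dual if the slots are paired."""
    return "single" if module_count < 2 else "dual (if slots are paired)"
```

This only tells you what is installed; a tool like CPU-Z or the BIOS will report the actual channel mode the memory controller has negotiated.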

As a word of warning, you might want to consider the warranty when it comes to opening the covers.

 

About GPU load: 3DMark really is a stress test, and in games, if you have that heavy a load most of the time, you will notice micro-stuttering. A 99% load on the GPU will make any game unplayable because of stuttering, and the only way to make the GPU work that hard is to use an exclusive-access rendering program which, during execution, will freeze any background process (including mouse movement).

As an example: in order to render a scene, the GPU has to build the shaders based on the 3D model and textures. Once the model is built, the GPU will pause until new shaders are required, hence some unused percentage.

 

Max-Q Design on a video card means somewhat lower performance. Here is a reference, although I couldn't find the RTX 2070 Super mobile version. Overall you can see that wherever Max-Q Design is in effect, the performance is lower than the plain version.

 

Cheers

 

thorpyUK
Beginner
Message 3 of 4
1,725 Views

Hi sweetpoison, thanks for your reply - some of which I agree with, some of which is incorrect.

I know it's towards the lower-middle end of the gaming laptop price range, but £1500 (around 1,700 euros) isn't so cheap that I shouldn't expect everything to work flawlessly!

I agree you'll see variation between identical hardware due to manufacturing tolerances etc., but the score differences are far in excess of what you would expect from natural margin of error, hence why something seems amiss.

On RAM, you cannot have a one-module dual-channel configuration - nor can you buy a 'dual-channel' memory module - you need two identical modules to run dual channel, and yes, the performance benefit in some parts of some games is around +30% after the upgrade. The point on warranty is absurd - the laptop is sold as upgradeable with 'additional expansion slots' - so how can using a feature it's sold with void the warranty?

The point about games micro-stuttering at 99% load is also completely incorrect - GPUs are designed to run at full load for maximum FPS, which is why most people see 100% GPU load in 3DMark's graphics tests.

Something (power maybe? BIOS? voltage? drivers?) is holding this back, hence the question. I agree Max-Q is lower performance because of lower thermal limits.

sweetpoison
Mentor
Message 4 of 4
1,723 Views

Two things I said are based on experience, and it turns out I know what I'm talking about (actually all of them, but let's focus on two only).

 

1. You can use a single memory module on a motherboard that supports dual channel. It will work in single channel, with the emphasis on "it will work". I didn't say there won't be any difference; I suggested testing to see if it's worth the money, considering that if what you have is a single module you might need to buy a 2x16GB kit. An in-game test will tell if it's worth investing in 2x16GB vs 1x16GB in terms of cost vs gain.

 

2. There is no such thing as identical configurations when it comes to laptops from different brands. Each laptop uses a specific motherboard and a specific heat-management system, both of which count towards performance.

 

More than that, it's pointless to bring additional proof, although I can give specific info about a game where, if I leave the video card to work for maximum frame rate (>200 fps), it creates stutter every time I move the character or rotate the camera. Limiting the FPS to 100 solves all the problems and also keeps the GPU at a decent temperature. But that is for you to experience, and maybe one day you will think about what I said.
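The mechanism behind an FPS cap can be illustrated with a minimal frame-pacing loop: instead of rendering as fast as possible, each frame sleeps out the remainder of its time budget. This is only a sketch of the general technique, not how any particular game or limiter implements it:

```python
import time

def frame_budget(fps_cap):
    """Seconds each frame may take under a given FPS cap."""
    return 1.0 / fps_cap

def pace_frame(frame_start, budget):
    """Sleep out whatever is left of the frame's time budget."""
    elapsed = time.perf_counter() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)

# Usage in a render loop capped at 100 FPS:
# budget = frame_budget(100)
# while running:
#     start = time.perf_counter()
#     render_frame()              # hypothetical per-frame work
#     pace_frame(start, budget)   # GPU idles here instead of racing ahead
```

With the cap in place the GPU spends part of every frame idle, which is why utilisation drops below 100% and temperatures fall.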

 

Anyway, I will keep quiet from here on and hope you get the answers you are looking for.

 

Take care
