
Maxed-Out Marvel Rivals On RTX 5080 Aorus Master

Affiliate disclosure: as an Amazon Associate, we may earn commissions from qualifying purchases from Amazon. 


We finally got our hands on the RTX 5080, and instead of drowning you in endless charts, we're doing things differently. With limited time and cutting-edge tech under the hood, we're diving deep into a real-world gaming scenario - pushing Marvel Rivals to its limits while keeping a close eye on thermals and acoustics of this Gigabyte Master card. Let’s get straight into it! 



First up, let’s talk about the card. And also - if you’re looking for something specific, we have chapters in the video, so jump ahead anytime. This Gigabyte Master comes in a hefty box packed with accessories, including all the usual paperwork, a quick setup guide, a 12V PCIe power adapter, a GPU mount, and – surprise - a bonus fan for the GPU. Spoiler alert: it’s not entirely useless, but it’s not exactly a game-changer either. More on that in the thermal tests later on. 


Let’s dive straight into the real testing – I pushed this card as hard as I could with the time I had. For reference, the test bench is running an AMD Ryzen 7 9800X3D, paired with DDR5-6400 memory on a Gigabyte X870 Elite motherboard. For the monitor, we're using the ROG Strix PG27UCDM, which is probably one of the highest-end 4K 240Hz displays on the market right now. And yes, a full review is coming to the channel soon. High-quality 4K gaming might finally be here! 


Anyway, in Marvel Rivals, I immediately turned up settings to the max in 4K mode to really push the card. Out of the box, without any NVIDIA magic, the framerates were decent but not mind-blowing - hovering in the high 80s. That’s not exactly game-changing, and for a competitive title, it’s honestly not great. If upscaling isn’t your thing, dialling down the settings is probably the way to go. 

 

But I wanted to squeeze out the best possible performance, so I tested different upscaling levels, and here is a quick look at the results. With DLSS set to Performance mode and Frame Generation enabled, the average framerate shot up to around 200 FPS, with 1% lows hovering around 140. The difference was instantly noticeable - everything felt significantly smoother, and honestly, I think my gameplay even improved. Maybe it’s placebo, maybe it’s just better responsiveness - but either way, it made the experience way more enjoyable. 

 


Cranking up the new Frame Gen to x3 and x4 drastically increased framerates, hitting 300 to 350 FPS on average, with 1% lows hovering between 210 and 240. There’s a slight difference in how it feels, but it’s nowhere near as dramatic as jumping from 80 to 200 FPS. 

 

For those who prefer less upscaling while still maintaining high framerates, the Quality preset with FG x4 still delivers around 300 FPS - very close to Performance mode at x3. Compared to the 4080 Super, the raw rasterization and even DLSS gains aren’t massive, but as you climb the Frame Generation ladder, the performance gap becomes much more noticeable. 

 

Now, the real question is image quality versus raw FPS. This will likely vary from game to game, but in Marvel Rivals, I honestly struggled to spot any major flaws or noticeable issues. Take a look at these two comparisons. 

 

I’ve lined up all the modes side by side, keeping things as consistent as possible. In this first example, we have a side-view run through the map - and to be honest, the differences are minimal, with maybe a little more texture detail and, ironically, a few extra artifacts with the stock settings.  


In the second clip, it’s more of the same - maybe a slight improvement in texture detail, particularly around the hair, but there are still some minor artifacts. That said, to even notice these differences, I had to record, export side-by-side footage, and analyse it frame by frame. When actually playing such a fast-paced game, I couldn’t spot anything - just a smoother experience overall. If you did notice anything I missed, drop a comment below - I’d love to hear your thoughts. 

 


But there’s one big catch with frame generation – latency. Normally, when your system renders more frames per second, the delay between frames gets shorter. For example, at 60 FPS, a new frame appears every 16.67ms, while at 120 FPS, it’s every 8.33ms - cutting input lag in half. 

  

However, frame generation doesn’t actually speed up real frame rendering - it simply inserts AI-generated frames between them. So, if your game is running at 60 FPS and Frame Generation boosts it to 120 FPS, every other frame is fake. The real frames still arrive every 16.67ms, meaning input lag stays the same. 
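The arithmetic above can be sketched in a few lines of Python. This is purely illustrative frame-interval maths, not how NVIDIA's driver actually works:

```python
# Purely illustrative frame-interval math - not how the driver actually works.
def frame_time_ms(fps: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / fps

# Doubling the *rendered* framerate halves the frame interval:
print(round(frame_time_ms(60), 2))   # 16.67
print(round(frame_time_ms(120), 2))  # 8.33

def real_frame_time_ms(displayed_fps: float, fg_multiplier: int) -> float:
    """Interval between real (rendered) frames when frame generation
    multiplies the displayed framerate - generated frames don't carry input."""
    rendered_fps = displayed_fps / fg_multiplier
    return 1000.0 / rendered_fps

# 120 FPS on screen with 2x frame generation: input still updates at a 60 FPS pace.
print(round(real_frame_time_ms(120, 2), 2))  # 16.67
```

In other words, the displayed framerate doubles but the interval between frames that actually sample your input stays at 16.67ms.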

  

This makes motion look smoother but doesn’t actually improve responsiveness. That’s why technologies like NVIDIA Reflex are often used alongside Frame Generation to help reduce input delay.  

 

Now, let’s shift focus back to Gigabyte’s card and dive into thermal and noise testing. To get a well-rounded assessment, we ran multiple test setups, all using FurMark to push the GPU to its absolute limit. These tests were conducted on an open-air test bench at a 27°C ambient temperature.  

 

We started with the card at its stock settings, then added the extra fan - connected to the motherboard - to see its impact. If you’re using this fan, we recommend setting a custom fan curve linked to GPU temperature for better control.  

 

If that’s not an option, setting a fixed speed that you’re comfortable with works too. In our case, we had it ramp up to 1200 RPM when the GPU hit 65°C. And in the last test, we cranked all fans to max speed to see if it delivered any extra performance. 
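A custom curve like this is essentially linear interpolation between temperature/RPM points. The sketch below is a hypothetical example - only the 65°C → 1200 RPM point comes from our setup; the other points are assumptions you'd tune in your motherboard's fan-control software:

```python
# Hypothetical fan curve: (GPU temp in °C, target fan RPM).
# Only the 65°C -> 1200 RPM point matches our actual setup.
CURVE = [
    (40, 600),
    (55, 900),
    (65, 1200),
]

def fan_rpm(gpu_temp: float) -> int:
    """Linearly interpolate the target RPM between curve points."""
    if gpu_temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, r0), (t1, r1) in zip(CURVE, CURVE[1:]):
        if gpu_temp <= t1:
            frac = (gpu_temp - t0) / (t1 - t0)
            return int(r0 + frac * (r1 - r0))
    return CURVE[-1][1]  # past the last point, hold max RPM

print(fan_rpm(65))  # 1200
```

Most motherboard software does this interpolation for you - the point is just that tying the extra fan's speed to GPU temperature, rather than CPU temperature, is what makes it useful here.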

  

For accuracy, we used NVIDIA's PCAT and FrameView to capture as much power and performance data as possible. 

 



Looking at the power graph, the stock settings perform well, but we can see brief fluctuations in power draw, spiking up and down over a few milliseconds. This happens far less in the other setups, suggesting that better cooling helps maintain slightly more stable performance. 

 

As for temperatures, I’d love to say the difference is dramatic - but in reality, it’s not. Even at stock settings, the GPU only reaches around 66°C, while running all fans at max brings it down to about 60°C. So, while extra cooling helps, the gains aren’t ground-breaking. 

 

Now onto acoustics - the difference between stock and 100% fan speed is very noticeable. At stock settings, we recorded 41 dBA, while adding the extra fan with a moderate curve barely raised it to 41.2 dBA. However, cranking all fans to max pushed noise levels up to a loud 54 dBA. Based on this, the additional fan works well as a middle ground - with some fine-tuning of the fan curves on it and the GPU, you should be able to achieve a decent overclock without excessive noise if you wanted to. 

 

To wrap things up - I was hoping for a bit more raw performance uplift from the RTX 5080. Our testing was focused on specific scenarios, so I’d recommend checking out other reviews for a broader perspective on this launch. That said, I was genuinely impressed by the new MFG technology. I couldn’t really tell the difference between the different modes, but it definitely made the game feel smoother, so I’m pretty set on keeping it enabled in my own gaming. 

 

 

What do you think? Do the performance gains make it worth upgrading? Let me know in the comments! 
