Nvidia might be revamping a past-gen laptop GPU with a current-gen chip, or at least this is what some clues that have just been aired suggest.
The theory is that there’s a new ‘RTX 3050 A’ mobile graphics card, with the ‘A’ suffix indicating a fresh flavor of the GPU that uses an AD106 chip – that’s a Lovelace GPU (the silicon behind the desktop RTX 4060 Ti and the RTX 4070 mobile). This replaces the Ampere GA107 chip seen in the current RTX 3050 mobile.
As Tom’s Hardware observes, this RTX 3050 A variant has been listed in the PCI-ID database, and it’s also included as a GPU in the latest graphics driver from Nvidia (a pretty strong suggestion that it exists, of course).
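This is how unannounced GPUs usually surface: a fresh PCI device ID shows up in driver files before Nvidia says a word. As a rough illustration, here’s a minimal Python sketch that pulls PCI vendor/device IDs out of INF-style hardware ID strings of the kind found in graphics driver packages – note the sample IDs below are purely illustrative, not the real RTX 3050 A’s:

```python
import re

# Hypothetical sketch: extracting PCI vendor/device IDs from driver text.
# Windows driver INF files list hardware IDs like PCI\VEN_10DE&DEV_XXXX
# (vendor 10DE is Nvidia). The device IDs in the sample are made up.

HWID_PATTERN = re.compile(r"PCI\\VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})",
                          re.IGNORECASE)

def extract_pci_ids(driver_text: str) -> list[tuple[str, str]]:
    """Return (vendor_id, device_id) pairs found in INF-style hardware IDs."""
    return [(m.group(1).upper(), m.group(2).upper())
            for m in HWID_PATTERN.finditer(driver_text)]

sample = r"""
%NVIDIA_DEV.1234% = Section1, PCI\VEN_10DE&DEV_1234
%NVIDIA_DEV.ABCD% = Section2, PCI\VEN_10DE&DEV_ABCD
"""
print(extract_pci_ids(sample))  # → [('10DE', '1234'), ('10DE', 'ABCD')]
```

Diffing the ID list between two driver releases is, in essence, how new variants like this get spotted before launch.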
So, not only is this rejigged laptop graphics card using a chip from a newer generation, but it’s a step up in terms of the tier (from 107 to 106), and naturally, AD106 is a much peppier GPU than GA107.
As a result, are we going to get a more powerful, faster budget graphics card for affordable gaming laptops? The short answer is no, at least in terms of performance – but as we’ll discuss next, there could be a hidden benefit here in terms of battery life.
Analysis: Power efficiency FTW
Okay, first off, we should underline that Nvidia hasn’t officially announced the RTX 3050 A, and we won’t know for sure that it exists until Team Green says so. Well, we say that, but in all likelihood a spin on a last-gen GPU like this won’t be formally announced – it’ll just quietly appear in laptops. Still, the point stands: until we have actual evidence that it’s out there, this supposed RTX 3050 A could come to nothing.
It’s likely Nvidia does have the revamped graphics card in the wings, mind, given its presence in the new graphics driver, and the fact that the company quite commonly pulls this kind of trick. And it isn’t done because Team Green suddenly has a change of heart about an old graphics card, and decides it needs to be more powerful. Rather, it’s about using available resources in a sensible way.
By which we mean there are always chips that don’t quite make the grade, found to have duff cores on board when tested. What Nvidia then does is disable a chunk of the GPU’s cores – obviously including the faulty ones, so they’re taken out of the equation – and put that chip in a lesser card with a lower core-count spec.
So, these are AD106 chips that didn’t make the grade, which Nvidia is using up rather than throwing away – simply good practice. It also means the AD106 chip used will likely be cut down to the same core count as the GA107-toting RTX 3050 mobile, and it’s very likely Nvidia will make the spec essentially identical.
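The binning process described above can be boiled down to a toy model: if enough cores on a die survive testing to hit a lower tier’s spec, the faulty ones (plus any surplus good ones) get fused off and the die ships as the cut-down part. The core counts here are illustrative stand-ins, not confirmed RTX 3050 A specs:

```python
# Toy sketch of GPU die binning (illustrative numbers, not Nvidia's specs).
# Dies with a few defective cores aren't discarded: cores are fused off
# (including all faulty ones) down to a lower product tier's core count.

FULL_CORES = 4608   # hypothetical full-die core count (AD106-class)
TIER_CORES = 2048   # hypothetical cut-down target (RTX 3050-class)

def bin_die(defective_cores: int) -> str:
    """Decide which product a die lands in, given how many cores failed test."""
    good = FULL_CORES - defective_cores
    if defective_cores == 0:
        return "full-spec part"
    if good >= TIER_CORES:
        # Enough working cores remain: fuse off the faulty ones, plus
        # extra good ones, until the die matches the lower tier's spec.
        return "cut-down part"
    return "scrap"

print(bin_die(0))     # → full-spec part
print(bin_die(300))   # → cut-down part
print(bin_die(3000))  # → scrap
```

The upshot for the RTX 3050 A rumor: even a partially defective AD106 die has plenty of working cores left to match a GA107-based RTX 3050’s spec.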
In short, there may be no difference between the RTX 3050 vanilla and this ‘A’ variant – but sometimes, there are slight spec boosts with these fresh takes on a chip. So, we can still hope that maybe the spec will be a touch pepped up in some respects.
That probably won’t happen, on balance, but as Tom’s points out, there is an important upgrade here just in the switch from Ampere to Lovelace. AD106 is built on a more advanced process (TSMC’s 4N node, versus Samsung 8nm for Ampere), so there will absolutely be inherent benefits in terms of efficiency – much better power efficiency, in fact. And remember, we’re talking about gaming laptops, so this could be a big plus point for battery life.
If anything ever comes of this RTX 3050 A, that is – and even if it does appear in laptops on shelves, who’s to say it might not be restricted to certain regions (say, Asia). Everything is guesswork about this laptop GPU at the moment, really, but nonetheless, we’ve had a tantalizing glimpse of what could be a nice boost for budget gaming laptops on the battery front (and maybe a touch of a performance bump, too – if we’re really lucky).
The post Nvidia looks to be revamping RTX 3050 mobile GPU with current-gen tech – and that might mean better battery life for budget gaming laptops first appeared on www.techradar.com