
NVIDIA Will Be Announcing 7nm Ampere GPUs At GTC 2020 In March

So I received a tip-off from a source we have not used before, which is why we will be tagging this as a rumor for now. While the source is new, it has proven right at least once in the past, and we consider this worth publishing at this time. Our source states that NVIDIA is going to announce its Ampere graphics cards during GTC in March 2020. While our source did not mention 7nm, all evidence so far has indicated that Ampere will be manufactured on a 7nm process.

NVIDIA announcing Ampere-based graphics cards at GTC 2020 in March
This is obviously not the first time we have heard of Ampere graphics cards or even a 2H 2020 launch date, but it is the first time we are hearing of GTC 2020 as the subject of a leak. GTC is NVIDIA's premier GPU technology conference, and it would make a lot of sense to reveal the brand-new Ampere lineup there. While we have not heard anything from TSMC (in fact, the recent report from Taiwan doesn't even mention NVIDIA as one of the clients for 7nm), use of Samsung's 7nm EUV node is not out of the question for the company and may in fact be utilized for Ampere graphics cards.

Alternatively, and I add this just for the sake of covering all bases, NVIDIA could choose to push forward with architectural improvements while staying on the same process node. This is something the company is exceptionally good at, but it will depend on how it chooses to tackle the ray-traced elephant in the room. The company would have a lot of spare room on the die if it got rid of the RT engines while remaining on the same process, but it will not have much room for growth if it does not.

NVIDIA has historically been very good at keeping launches secret (with the major exception of the Turing launch) and regardless of which node it decides to go with on the Ampere GPU front, we are sure that the next generation of graphics cards will represent a significant step up from the older series.

Analysts have previously pointed to a 2H 2020 launch as well:

On Tuesday, Susquehanna Financial Group analyst Christopher Rolland reiterated his Positive rating for Nvidia shares, citing strong sales of the Nintendo videogame console. Nvidia makes chips for the gaming device.
“We think Nvidia faces the most reasonable Street expectations in quite some time, with potential tailwinds from improving DC [data center], Switch [Nintendo console], and high GPU [graphics processing unit] attach rates (laptops/desktops), all in front of a litany of upcoming 7nm [nanometer] launches over the next nine months,” he wrote. Source.
NVIDIA, it seems, has finally decided to play catch-up and will be shifting to 7nm by 2020. According to what we have heard so far, this will be Samsung's 7nm EUV process and should offer a significant step up in performance over previous generations (and even over TSMC's non-EUV 7nm process). Nine months amounts to roughly three quarters, and with a launch in 2020, you will first start to feel the impact in the third-quarter earnings (exactly a year from now). In other words, NVIDIA is slowly but surely working its way back up to getting Jensen his coveted record quarters.

What we know so far
We previously heard of NVIDIA's Ampere GPUs when they passed EEC certification, but nothing more has come up since then. Now, however, we have a tentative timeline: they will be launching in 2020. It is highly likely that NVIDIA will continue with its RTX philosophy and take it to the next level with Ampere. Right now, the Turing GPU is capable of ray tracing at 1080p 30 fps for light to moderate workloads. The Ampere GPU will be able to go further.
[Image: NVIDIA Ampere GA104]


The fact that it is based on Samsung's 7nm EUV process means we are looking at a performance advantage as well as a power-efficiency advantage. Not only that, but believe it or not, 7nm EUV is actually supposed to be easier to fab than standard DUV multi-patterning efforts. Think of EUV as a sort of reset of the difficulty curve as the industry moves to a new light source. It will, however, require extensive re-tooling, but the economies of scale will almost certainly prove to be worth it. At a bare minimum, you are looking at roughly a 50% increase, all things considered and watt for watt.

Here's the thing: NVIDIA is one of TSMC's biggest customers and has been a loyal patron since pretty much the dawn of modern gaming tech. If it is actually planning to shift to Samsung's 7nm technology, that will have repercussions not only for the company but for TSMC as well. Two things could be going on here: either Samsung is offering a better deal financially, or NVIDIA has reason to believe Samsung's 7nm tech is better. We can guess one reason why this might be the case. Right now, TSMC's 7nm process is not based on EUV, although it does have an EUV node planned. The process NVIDIA is planning to shift to at Samsung is EUV.

That would imply NVIDIA has reason to believe Samsung's EUV process is better positioned to help it achieve its goals than TSMC's. Another potential reason is that TSMC cannot offer enough volume and will never prioritize NVIDIA over the likes of Apple Inc. At the same time, AMD is using up a lot of TSMC's capacity, and things are getting too cramped in there for NVIDIA's liking. Samsung's foundries, on the other hand, have ample capacity, and considering the giant that Samsung Electronics is, it can simply throw money at yield problems to make them disappear.

All things considered, Samsung is the logical partner for a company as ambitious as NVIDIA. With a big question mark over Intel's foundry capabilities right now and TSMC bogged down, Samsung remains the only leading-edge foundry for NVIDIA to tap into. GlobalFoundries dropped out of the leading-edge race earlier this year - not that it would have been considered to begin with.
 

Celcius

°Temp. member
This reminds me of 2018 when Pascal had been on the market for a while and everyone was waiting for Turing to come out. The whole year every time a conference was coming up there would be rumors of the 2000 series cards being unveiled at the event. CES, GTC, E3, etc... eventually they were announced in late August and released in September if I remember correctly.

This could be true, but all I'm saying is I'll believe it when nvidia themselves either say they have something to announce or they start teasing things on their social media accounts.
 

DESTROYA

Member
I’m interested to see how this translates to mobile GPUs; hopefully the power gap between desktop and mobile won't be as lopsided.
 

John117

Member
The core will be RTX technology. I'm very excited to see something running on the new Ampere architecture :) I remember the Star Wars demo made to show off RTX tech, but it ran on a DGX Station :( Used prices (RTX 2080 Ti, 2080) should come down, but maybe I'll wait for the new Titan GPU. It will be curious to see how AMD responds to Ampere.
 

PhoenixTank

Member
This reminds me of 2018 when Pascal had been on the market for a while and everyone was waiting for Turing to come out. The whole year every time a conference was coming up there would be rumors of the 2000 series cards being unveiled at the event. CES, GTC, E3, etc... eventually they were announced in late August and released in September if I remember correctly.

This could be true, but all I'm saying is I'll believe it when nvidia themselves either say they have something to announce or they start teasing things on their social media accounts.
On top of that, this rumour seems to ignore that Jensen himself apparently said after GTC 2019 (China) late last year that they'd be using TSMC 7nm for most wafer orders.
Google Translate is not perfect (and this is second-hand from SK media?), so there is room for misinterpretation, but I don't buy what wccftech are selling here.
 

base

Banned
I highly suggest you wait for AMD's offering.
No thanks. Bought a 5700 two weeks ago, my third ATI/AMD card in the last, let's say, 25 years, and it sucks. Driver problems - Freesync flickering with my monitor, stuttering/fps drops, throttling. Ordered a 2060S to replace it. Had plenty of NVIDIA cards and never experienced such problems.
 
No thanks. Bought a 5700 two weeks ago, my third ATI/AMD card in the last, let's say, 25 years, and it sucks. Driver problems - Freesync flickering with my monitor, stuttering/fps drops, throttling. Ordered a 2060S to replace it. Had plenty of NVIDIA cards and never experienced such problems.
I understand, but you should still wait for AMD in case there are improvements; if not, get NVIDIA at a lower price, because AMD made them lower it.
 

base

Banned
I understand, but you should still wait for AMD in case there are improvements; if not, get NVIDIA at a lower price, because AMD made them lower it.
Bought their Ryzen 2600X. Their CPUs are really good, but their GPUs need more improvement.

P.S.: My last AMD CPU was an Athlon 64 3400+. Then I moved to laptops, where Intel ruled for a long time.
 

Mecha Meow

Member
I just think you're getting really unlucky, because I've never had issues with any of my AMD GPUs and always had a couple of issues with my 560 Ti and 770, and my 1070 died right after I gave it to a friend. Luckily it still had a year of warranty on it. My cousin's 580 (Nvidia) also just went to shit randomly one day back in 2012.

Never had issues with drivers with either brand so I still think that complaint is largely exaggerated.

Edit: I had to RMA two Fury cards but that was more Sapphire's fault.

TLDR, I don't let small issues sour me on a brand.
 

kiphalfton

Member
This reminds me of 2018 when Pascal had been on the market for a while and everyone was waiting for Turing to come out. The whole year every time a conference was coming up there would be rumors of the 2000 series cards being unveiled at the event. CES, GTC, E3, etc... eventually they were announced in late August and released in September if I remember correctly.

This could be true, but all I'm saying is I'll believe it when nvidia themselves either say they have something to announce or they start teasing things on their social media accounts.

I remember that. And that's also why I stopped reading any articles on their website. With each passing conference, I just lost more and more faith.

Also wccftech is awful. They may as well have created a text template that started with "Take this news with a grain of salt", as I seem to have seen that in pretty much every article they wrote.
 

Dr.D00p

Member
Sorry Nvidia, I'm a PC gamer but my (discretionary) spending on new tech this year has already been allocated:

1. LG OLED TV
2. PS5
3. Switch Pro (if it turns up)

My RTX 2080 will have to suffice until the RTX 4xxx cards come along in 2022.

Besides, I don't expect my self-imposed metric of getting (at least) double the performance for the same cost as my current GPU before upgrading to be met until the 4070/4080 cards anyway.
 

llien

Member
Huang has likely missed the TSMC train, with Apple and AMD (remember, consoles with oversized chips) getting in early.

Samsung 10nm node is low power, as far as I know.
 
It's just a rumor (10nm). I'm not sure it's wise for them to skip 7nm; they can't afford AMD taking the lead like it did with Intel.
10nm Ampere will most likely be on par with 7nm AMD Big Navi, but why would they risk that?
It makes no sense to go with 10nm.
 
It's just a rumor (10nm). I'm not sure it's wise for them to skip 7nm; they can't afford AMD taking the lead like it did with Intel.
10nm Ampere will most likely be on par with 7nm AMD Big Navi, but why would they risk that?
It makes no sense to go with 10nm.

AMD took the lead from Intel? Maybe in video processing.

Lol at RTX for everyone, but SLI limited to 102 only. You're not running anything with RTX enabled on 106/7.
 

01011001

Banned
Hope AMD has something good, really don't want to buy another crazy expensive GPU from Nvidia.

well the thing is, you don't know if AMD will stay as cheap as they are once they can compete head to head with Nvidia. It's hard to tell where prices will go once that's the case.
AMD has had to offer lower prices for a while now because they literally couldn't compete with Nvidia in terms of power, so they could only compete as the cheaper alternative.

I really hope they will kick Nvidia's asses tho, so we finally have some competition at the high end again, and hopefully lower prices throughout
 

Kenpachii

Member
well the thing is, you don't know if AMD will stay as cheap as they are once they can compete head to head with Nvidia. It's hard to tell where prices will go once that's the case.
AMD has had to offer lower prices for a while now because they literally couldn't compete with Nvidia in terms of power, so they could only compete as the cheaper alternative.

I really hope they will kick Nvidia's asses tho, so we finally have some competition at the high end again, and hopefully lower prices throughout

AMD is going nowhere without massively cheaper prices to compete with Nvidia on the same level, or far faster performance, which is highly unlikely.
That is the only reason people moved to Ryzen too; if they had price-matched Intel, nobody would care about those chips.
 

01011001

Banned
AMD is going nowhere without massively cheaper prices to compete with Nvidia on the same level, or far faster performance, which is highly unlikely.
That is the only reason people moved to Ryzen too; if they had price-matched Intel, nobody would care about those chips.

I hope so, and I hope it stays that way. because right now it seems like AMD is literally (excuse my words) buttfucking Intel with an iron-spiked condom lol... so if this keeps on being the case, I could see them slowly raising their prices generation to generation.
 
AMD took the lead from Intel? Maybe in video processing.

Lol at RTX for everyone, but SLI limited to 102 only. You're not running anything with RTX enabled on 106/7.

AMD took the lead from Intel in:

- multithreaded performance
- IPC
- Power draw
- Performance per watt
- Performance per dollar
- Efficiency

And is only about 3-5% behind Intel in gaming BUT only when you set res to 1080p AND use a 2080 Ti...
 
I started saving for a 3080 Ti as soon as they unveiled the RTX 2080, as I knew ray-tracing round one would be everyone figuring out what actually works, and nothing would be as refined. Can't wait for a 3080 Ti. The 1080 Ti has been an amazing card and has made the waiting incredibly easy. And seeing ray tracing advance so much in such a short time is very exciting.
 

Kenpachii

Member
Lol, if that is what they have to offer, yeah, I won't be upgrading. 12GB of VRAM, fucking lol. SLI only on 102 chips means SLI is dead.
 

PhoenixTank

Member
It's just a rumor (10nm). I'm not sure it's wise for them to skip 7nm; they can't afford AMD taking the lead like it did with Intel.
10nm Ampere will most likely be on par with 7nm AMD Big Navi, but why would they risk that?
It makes no sense to go with 10nm.
It is definitely very strange. Only thing that might make sense is if Samsung's 10nm can offer bigger chips at good yields. A trade off for better power usage compared to the current custom process while retaining the big die sizes. Could just be a BS rumour, because the "what process?" answer seems to change a lot!
 

Celcius

°Temp. member
Sigh, I saw this thread and thought something official was teased. Thread should be marked as rumor instead of news.
 

skneogaf

Member
What kind of specs are we likely to see?

I will either sell my RTX 2080 Ti or my GTX 1080 Ti to fund one, as I would really like a GPU with HDMI 2.1 for my LG C9 and its ability to do 4K@120Hz with G-Sync technology.
 

diffusionx

Gold Member
What kind of specs are we likely to see?

I will either sell my RTX 2080 Ti or my GTX 1080 Ti to fund one, as I would really like a GPU with HDMI 2.1 for my LG C9 and its ability to do 4K@120Hz with G-Sync technology.

I think 50%+ improvement over prior gen is very likely.
 

Dontero

Banned
I don't want to burst any bubbles, but this suggests Ampere will be released first for enterprise products, not necessarily for consumers.
 