Will you be Buying an RTX 3080?

I agree. The whole launch continues to be a big mess with none of us really knowing what's going to happen next :thumbsdown:

I'm in a queue myself but have no idea how far back I am. There's a little part of me that hopes it won't arrive before the AMD announcement, so that I'll have a chance to choose between them.


Yes, not only do you have to worry about whether you can get your hands on a 3080, but now you also have to be sure you have the right AIB board... and then still maybe the AMD card is faster.
 
and then still maybe the AMD card is faster

I seriously doubt it will be faster. It might nearly match, it might be a better value, but faster would be an unlikely first for AMD. Every new generation of graphics cards in the past decade has carried the hope that AMD will be faster, only for it to be dashed on the shores of reality.
 
"The crashing with the RTX 3080 cards doesn’t appear to be down to the caps used, which is why we haven’t made a video yet; we don’t know the issue. What we do know is the FE and TUF Gaming models crash just as much as other models, and they use MLCCs." Source: Hardware Unboxed on Twitter.

So if it turns out that my soon-to-be-incoming 3080 won't boost above a certain limit, I'm totally OK with that. I bought according to the specs, and if the card delivers those specs I will be happy.

And as for the "don't be an early adopter" thing, why not? If nobody buys first, nobody buys anything. (All hail the early adopters. ;))
 
I am new to PC gaming - I just built my first machine a few months ago and bought a used 1080Ti to hold me over to the 3000 series cards.

For those who have been doing this for years, is this entire botched GPU release a rarity? Or does some version of this happen every two years?
 
This seems a bit worse than typical, but paper launches and/or beta testing being pushed onto users are not unheard of. Along with cost, availability, and future support, it's one of the main risks of being an early adopter.
 
For those who have been doing this for years, is this entire botched GPU release a rarity? Or does some version of this happen every two years?

There is always a supply shortage at every generational launch. This year is probably a little worse for a variety of reasons (politics, COVID, natural disasters). Sometimes the bitcoin miners are to blame, sometimes it's new display tech (3D monitors, VR), and sometimes new software (Crysis, raytracing) creates lingering high demand, so supplies aren't back to normal for a year or more.

Teething problems always pop up during generational launches; I don't think this year is all that different, because sometimes it's driver problems and sometimes it's manufacturing. The technical problems are always sorted out within six months... can't say the same for supply problems.
 
  • Deleted member 197115

But can 3080 run Crysis?
Surprisingly, yes, at 25fps. :roflmao:
Crytek has released a remastered version of the original Crysis (Epic only) to keep the meme relevant even on modern hardware.
 
I am new to PC gaming - I just built my first machine a few months ago and bought a used 1080Ti to hold me over to the 3000 series cards.

For those who have been doing this for years, is this entire botched GPU release a rarity? Or does some version of this happen every two years?

Well, the availability issue in particular has definitely been the worst ever. Issues with the cards at launch are, unfortunately, not so rare. If it's not something on the cards themselves, it's the drivers.
 
Lol.. I’m happy/lucky I didn’t sell my 2080 Ti (lol, I put it up for sale asking only $500). After seeing all these 3080 launch issues I decided to keep it, and I will wait 4-6 months before jumping to the 3xxx series.
I think the 2080 Ti will do a good job with the Reverb G2.
 
Lol.. I’m happy/lucky I didn’t sell my 2080 Ti (lol, I put it up for sale asking only $500). After seeing all these 3080 launch issues I decided to keep it, and I will wait 4-6 months before jumping to the 3xxx series.
I think the 2080 Ti will do a good job with the Reverb G2.
I never buy GPUs in that price range, but if I had an RTX 2080 Ti and hadn't sold it prior to the RTX 3080 release, there's no way I'd be selling it right now.
Strange as it sounds, you'll probably get a better deal a few weeks from now, once all the Nvidia and AMD release hype has died down.
It's a rather strange phenomenon, but one that happens... not sure why.
 
It seems the new drivers fix the issues for most 3080 users; the boost clocks were too high, leading to instability. The new drivers keep the boost range tighter and, in my own tests, actually perform a bit better than the old drivers.
 
I seriously doubt it will be faster.

Agree, but I think the word we're after is disruptive.

What I think will happen, for the first time in a long time, is that AMD will introduce a competitor to every Nvidia card, which will change the mind share dramatically: suddenly it will be very clear that there is a good alternative for every product. With Ryzen they matched Intel's i5, i7, and i9 with an R5, R7, and R9; Radeon in return will have 600, 700, 800, and 900 series cards vs Nvidia's 60, 70, 80, and 90, and XT vs Ti, and so on.
I think this is the clever strategy Lisa Su is taking, moving AMD away from previous mistakes and random product names.

Will it be faster? AMD has proven they can match performance spec for spec with the 5700 XT against the 2070 Super: same core count, memory bandwidth, clock speed, etc. They now need to prove it scales up to a flagship with double the number of compute units; that much seems plausible judging by recent leaks of an 80 CU chip vs the 40 on the 5700 XT, but there will always be doubt with Radeon at launch.

Finally, pairing with the new Zen 3 processors, which promise higher performance than Intel, will make Radeon's 1080p and 1440p gaming numbers look much better against Nvidia at launch; I think this is why Radeon is launching after Zen 3.
 
VR on RTX 3090 in Project CARS 2, Ultra (avg 85 fps)



Jump to 3:17 for Project CARS 2 footage


There's definitely room to tweak settings as he averages 141 fps on low graphics settings. Jump to 2:30 for Project CARS 2 footage
 
Feels to me like Nvidia pushed the envelope a bit too far on these cards, worried they might not live up to the marketing hype or beat AMD, hoping to deliver the death knell to the competition before any 'problems' arose. Instead it's backfired quite badly; it looks more like an AMD launch from 5 or 6 years ago...
 
For those who have been doing this for years, is this entire botched GPU release a rarity? Or does some version of this happen every two years?
It's only botched in the sense that people couldn't get the thing they wanted immediately. If you are one of the people that was prepared to wait and see what people said about the cards before buying, then all this drama means nothing.
 
Feels to me like Nvidia pushed the envelope a bit too far on these cards, worried they might not live up to the marketing hype or beat AMD, hoping to deliver the death knell to the competition before any 'problems' arose. Instead it's backfired quite badly; it looks more like an AMD launch from 5 or 6 years ago...

Nvidia have f*cked up basically because they couldn't get TSMC 7nm capacity and had to make do with Samsung 8nm, which has to be pushed very hard and doesn't quite reach the clock speeds they hoped for.

When was the last time Radeon genuinely had a matching flagship? Was it the 290X in 2013 or the 7970 in 2011? I had an Nvidia 8800 GT between 2009 and 2015, when I then got a 970, so I missed a few years.
 
I just don't get why the chips aren't simply limited to 1850 MHz. If you want to manually OC, fine. But IMO squeezing 1% of performance out of the cards while risking instability is pretty stupid.
It's not like 1950 MHz or 2 GHz is listed in any specs, AFAIK, so why not limit the boost...?

It also might be happening because of the big coolers on the 3080: with 350+ watts to dissipate, the coolers need to be pretty big and efficient, so a cold GPU boosts higher than it can sustain.
When my 1070 is cold and I simply overclock it via "+100", for example, it will shoot up to 2100 MHz and show artifacts.
Once it heats up after a few minutes, it settles at around 1900 MHz.

I solved this by using the OC curve in Afterburner: I set 2012 MHz for everything from the minimum voltage seen during gameplay up to the maximum voltage.
This way I always get 2000-2024 MHz no matter what voltage the BIOS throws at the chip.
When the GPU load drops, it clocks down to as low as 1500 MHz (3D mode) at the default voltage.
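Afterburner does this through its curve editor GUI, but for anyone unfamiliar with the trick, the effect of the edit can be sketched in a few lines. This is a rough illustration only; the voltage/frequency points below are made up, not read from any real GPU.

```python
# Sketch of the Afterburner curve edit described above: pin every point in
# the gameplay voltage range to one target clock so the boost stays flat.
# All numbers here are hypothetical example values, not real GPU readings.

def flatten_curve(curve, v_min, v_max, target_mhz):
    """Return a new V/F curve where points between v_min and v_max
    (the voltages seen during gameplay) are pinned to target_mhz;
    points below v_min keep their stock clocks for idle downclocking."""
    return [(v, target_mhz if v_min <= v <= v_max else mhz)
            for v, mhz in curve]

# Hypothetical stock curve: (millivolts, MHz)
stock = [(650, 1500), (800, 1900), (900, 2000), (1000, 2050), (1093, 2100)]

# Pin everything from the minimum to the maximum gameplay voltage to 2012 MHz
flat = flatten_curve(stock, v_min=800, v_max=1093, target_mhz=2012)
print(flat)  # points at 800 mV and above are all pinned to 2012 MHz
```

The low-voltage points are left untouched on purpose, matching the behaviour described above: under light load the card can still drop to its stock low clocks at default voltage.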
 
