A 1080TI still plays every release at medium or higher settings. /shrug
Unless you’re worried about 4k or VR, I wouldn’t upgrade anyway.
If you care about refresh rate it matters; not many people can stand 30-40fps with hard drops to single digits just to be able to play a game.
I’m curious. What game do you think drops to single digits fps on medium settings with a 1080TI?
I was playing Darktide on a 1060 with minimum 30fps recently, and that game is optimized like absolute trash.
Starfield.
The 1080 is the minimum card, and the Ti is decently more powerful (~30%), but you’ve got to make concessions on medium to get 30fps, and there are still drops.
To be fair though, that’s a VERY new game and they cared so little about optimizing it that they went out and said “you’re probably going to need a new computer to play this” …
I hear the 1080ti runs Doom just fine 😛
My 1070 handled Doom Eternal just fine with pretty high settings. I’m sure it helps that I only use a single 1080p monitor for games, but it was still pretty enough for me.
*Edit - I picked up on the sarcasm after posting this reply. Oh well.
Every release means every release, and the requirements aren’t going to get lower. It’s a great card and I know people hate losing it, but it’s on its last legs and likely won’t be able to play new releases at all next year.
That’s Bethesda’s fault. There is no fucking reason that game can’t run well on a 1080ti for how mediocre it looks.
Sounds like I’ll never want to even play it lol
CoD MW2/MW3 are total crapshoots with frame rates; even on a 3080 set to performance it can still just turn to crap. It seems to run more stably, and on higher settings, on a 2070 laptop. I don’t understand. (I tried to get as much hardware running DMZ as possible for friends and family; lots of machines.)
That’s how I always played games :(
Have a 1060 for VR, actually works fine.
I actually prefer the crisp edges without as much post-processing effects sometimes. Source engine games look great to me, just minimal crisp and clean geometry. I find a lot of modern graphics distracting, but it depends on the game. I do love really pushing graphics for a game like Skyrim.
Modern game engines don’t use the amazing SSAA (super-sampling anti-aliasing). Most use post-processing anti-aliasing like FXAA or TXAA, which always makes edges look fuzzy. Source engine is one of those that still supports super sampling.
Yeah, that’s exactly it. MSAA isn’t too bad, but FXAA makes edges look pretty blurry. Temporal anti-aliasing also looks really blurry sometimes, but gives the impression that the edges could be crisp.
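If anyone wants to see the difference in miniature: here’s a toy sketch (not a real renderer, all function names are made up for illustration) of why supersampling keeps edges sharper than a post-process blur. SSAA only softens the pixels the true edge actually crosses, while a blur pass smears pixels well away from the edge too.

```python
# Toy illustration of SSAA vs a post-process blur (crude stand-in for FXAA).
import numpy as np

def render_edge(size):
    """'Render' a hard diagonal edge: 1.0 above the diagonal, 0.0 below."""
    img = np.zeros((size, size))
    for y in range(size):
        for x in range(size):
            if x > y:
                img[y, x] = 1.0
    return img

def ssaa(scale, out_size):
    """Supersample: render at scale x resolution, then box-filter down."""
    hi = render_edge(out_size * scale)
    # Average each scale x scale block into one output pixel.
    return hi.reshape(out_size, scale, out_size, scale).mean(axis=(1, 3))

def postprocess_blur(img):
    """Post-process 'AA': a 3x3 box blur of the already-aliased image."""
    padded = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

aliased = render_edge(8)
smooth  = ssaa(4, 8)          # fractional values only where the edge really is
blurred = postprocess_blur(aliased)  # fractional values spread around the edge

# Count in-between (softened) pixels produced by each approach.
print(np.count_nonzero((smooth  > 0) & (smooth  < 1)))
print(np.count_nonzero((blurred > 0) & (blurred < 1)))
```

The SSAA result only has partial-coverage values along the one-pixel diagonal, while the blur produces a wider fuzzy band, which is exactly the "fuzzy edges" complaint about post-process AA.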
I do 4k and VR on my 1080Ti with no issues, on the highest settings, too. That said, I don’t do a lot of AAA gaming, so take that as you will.
My 980ti is still a toss-up between amazing and mediocre performance. The big issue is that I bought it for £600, which is a lot of money to me, and new GPUs are three times that, or more.
I’ve said it before and I’ll say it again, when humanity is wiped out future species will find a Nokia on half battery and a fully working 1080Ti.
1070ti gang here. no problems playing anything on medium/high.
I don’t do 4k though but VR is fine.
I feel like I’d need a display that can do 4K before I upgrade
Same, but I also don’t really care about 4k; 1440 is just fine for me.
I have a 3060, which is a bit faster overall, and it struggles sometimes with my reverb g2. It’s a pretty high resolution headset though.
Yeah, my VR headset is a couple years old as well and not the highest quality; it’s the Odyssey.
I guess I practice good enough gaming lol.
1080ti is still a beast
…
…
I have never felt more called out.
If my 1080 gives up in the near future, I’ll probably just give up AAA gaming. BG3 is literally the only game in the last 5 years I have loved which would require more than a potato to run.
Honestly, I have a GTX 1080 and I can run BG3 at close to ultra settings at around 30-60fps. I think my actual issue is my CPU; I have an i7 3770, which came out in like 2012. And I have DDR3 RAM from 10 years ago as well.
My SO upgraded from a 1050TI and some old CPU to a Ryzen 5600 and a 6600 GPU and saw massive improvements in BG3. I don’t know which CPU it was, but I think that made the difference for this game. Loading in went from crashes or naked people to no issue at all.
Same here, but with an i7 6700K. If I happen to find a decent deal on something like a 12th-gen i5 or i7 and chuck in an extra 16GB of RAM, I think I’d be all set for another couple of years. Although I still don’t desperately need an upgrade; everything works well enough, except maybe that one nightclub map in Ready or Not with more NPCs than my CPU can keep up with.
They can pry my 1080 out of my cold dead 12fps hands!
1070 gang
1070 is comfy AF, been on one for so long now and I haven’t seen any reasons to switch yet
I finally bit the bullet and upgraded to a 3060ti. I’ll move the 1070 into a different PC.
>me still running a 970
My 960 runs Unity games like Overcooked at 4k, so I probably won’t be upgrading any time soon. With a toddler I don’t have time for AAA games anymore, but I’m guessing the frame rate would be painful.
970 here too. A true hero o7
Having to run -400MHz on the VRAM to prevent mine from crashing all the time, but I’m hanging in there 🥲👍
Oh, that’s an odd problem. What do you think caused that?
I thought it was driver problems at first because it happened so infrequently, but it has gotten worse this year.
It took ages to narrow it down to memory faults since I’ve been running stock settings forever. Stumbled upon this tool: https://github.com/GpuZelenograd/memtest_vulkan/
Found a load of errors, which went away after the downclock, and it’s been stable since. It must just be age-related degradation; temps were never high, but I know the repeated temperature changes have an impact.
Couldn’t this be related to the PSU?
Same problem as mine! I thought I was the only one. I’m using MSI Afterburner and turning the power limit down to 90%, sigh.
I bumped the power up on mine. I might be wrong, but I assumed more power with less speed would let it stay stable, and it seems to be working so far.
https://github.com/GpuZelenograd/memtest_vulkan/ - was the tool I used to confirm the memory issues. Just need it to hold on till the 50xx series!
1060 gang
1080 here, got it used for like 220 before the graphics card nightmare. I may be looking to try to get a 2080 or 2070. I wonder if I can do that on a 550 to 600 watt PSU.
It’s still a decent card, probably can still do well at 1080p max settings in most games. Very similar to a 3060 in terms of performance, which is the card I have.
I’m playing D4 at 4K on medium settings. No complaints. This is on a 3yo laptop with 1080ti in an external enclosure hooked up via Thunderbolt.
Me and my 1060 ain’t going anywhere
I had a 1060 3G version and it just couldn’t hack it anymore. Picked up a 20 series this year and it was such an improvement.
1060 6g still flying
Yeah. I’m about 90% sure that if I had gotten the 1060 6G and I hadn’t gotten a really good deal on a 2080, I’d probably still be using the 1060.
For now though, I don’t suspect I’ll be replacing the 2080 anytime soon… So when the 50 series comes out, this meme will be me with my 2080.
Anyway, it moved you to the better card! Congrats :)
I just upgraded from my 1060 6g. Got a 7800xt.
Same here. Works great and does all I need it to. It would be nice to have a new GPU but I’m driving this one until the wheels fall off.
My 970 seems as old as the wheel if you read these kinds of threads.
I really need to upgrade my rig…
Depends on the games. My 980TI can still rock 3440x1440 in most of the games I play.
The fact that what I play is mostly metroidvania shouldn’t be an issue, right? 😅
My 980ti still holds up pretty well at 1920x1440 (high-end CRT monitors were beautiful things, restart production you cowards) for most 3D games I play on Linux, but it is starting to have performance issues in some games, and I’m getting real sick and tired of the dumb shit Nvidia keeps pulling with their Linux drivers. The current driver gives me horrible black flickering in a lot of games, and of course it arbitrarily locks me out of maxing out my CRT monitor (which doesn’t have a fixed resolution, only a balance of resolution vs refresh rate), blocking me from a whole range of refresh rate/resolution combinations. So I confess I am starting to eye the higher-end AMD 6xxx GPUs, and I would definitely try to grab one as cheaply as I could if I ever got a 3440x1440 ultrawide.
Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?
> Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?
Awesome. For work (even if I am a Linux system engineer) I need to use W11 due to corporate policy. I have two 34" in landscape and a 27" in portrait. I split the screens with FancyZones.
Time for my bad drawing skills, lol.
In order:
- SSH
- SSH
- SSH
- Outlook
- Edge for work
- Teams
- Firefox with YouTube running. Firefox is the only browser that allows for in-window full screen.
I see
I’m debating getting a 3440x1440 monitor for coding, and because I hear they work well with tiling window managers (hence the question). It’s just annoying that I have almost no chance to try them out for free, and the cost is enough that I wouldn’t get one without serious consideration first. Although you have nudged me a bit closer to “maybe I could get one without testing first, if it’s second-hand and cheap(er)”.
Also, I’d be replacing my existing 27-inch LCD with it and keeping the 21-inch 4:3 CRT, for a highly cursed monitor setup where everything gets letterboxed or pillarboxed. And then, to make things worse, I could grab a 16:10 monitor to put in portrait beside one of the other two, for maximum “what is 16:9 and why do I have black bars on everything”.