GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I'm guessing the winning play is to get a cast-off Dell Optiplex off Ebay and stick a GPU in it.
Cheapest is getting one of the AMD BC-250 boards with a cut-down PS5 APU and 16 GB of GDDR6 built in (said to be "$100", probably closer to $150) and 3D printing a case for it.



 
The problem you're going to run into is that console games don't copy data from main RAM to VRAM, owing to the unified memory architecture, so while 16 GB total is enough for the console version of a game, it isn't for the PC version.
 
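The split can be sketched with some back-of-the-envelope numbers (every figure below is an illustrative assumption, not a measurement, just to show why a unified 16 GB doesn't map cleanly onto a PC port):

```python
# Back-of-the-envelope memory budget: console (unified) vs. PC port (split pools).
# All numbers are illustrative assumptions, not measurements of any real game.

def fits_console(total_gb: float, unified_pool_gb: float = 16.0) -> bool:
    """On a console, assets and game state share one GDDR6 pool."""
    return total_gb <= unified_pool_gb

def fits_pc(assets_gb: float, state_gb: float, vram_gb: float, ram_gb: float) -> bool:
    """On PC, textures/geometry live in VRAM while game state lives in system RAM,
    and some assets get duplicated in RAM as a staging copy."""
    staged = assets_gb * 0.25  # assume ~25% of assets mirrored in system RAM
    return assets_gb <= vram_gb and (state_gb + staged) <= ram_gb

# A game using 10 GB of assets + 5 GB of state fits the console's 16 GB...
print(fits_console(10 + 5))                    # True
# ...but carving the BC-250's 16 GB into e.g. 12 GB "VRAM" + 4 GB "RAM"
# leaves the CPU side short once staging copies are counted.
print(fits_pc(10, 5, vram_gb=12, ram_gb=4))    # False
```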
It seems to be OK for a 1080p gaming target, and you're spending maybe $200-250 total after PSU, storage, fan, etc.


Left 4 Dead 2 - 1080p max = >150 FPS
Spiderman 2 - 1080p medium FSR balanced = >50 FPS
Spiderman 2 - with frame gen = >100 FPS average
Witcher 3 - 1080p high = >60 FPS
Forza Horizon 5 - 1080p medium = >60 FPS
Cyberpunk 2077 - 1080p high FSR balanced = >50 FPS
Cyberpunk 2077 - with frame gen = >90 FPS average
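For what it's worth, the $200-250 figure pencils out; here's one way the build could break down (every price below is a placeholder guess, not a quote):

```python
# Rough BC-250 build budget; every price is a placeholder guess, not a quote.
parts = {
    "BC-250 board":      150,  # "said to be $100", probably closer to $150
    "PSU":                35,
    "NVMe SSD":           30,
    "Fan":                10,
    "3D-printed case":    10,  # filament cost if you print it yourself
}
total = sum(parts.values())
print(f"Total: ${total}")  # Total: $235
```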
 
China's the go-to: RTX 3070 16GB, RTX 4090 48GB, or MI50 32GB (if you're a cheap bastard). Choose well; no refunds (or free shipping).

btw shouldn't an Arc Pro B50/B60 do the job well? The B50 has 16 GB, and while the B60 has 24 GB it's kinda rare to find. It's Intel Arc, not Nvidia, but B50s are plentiful for some reason.

Spending new money for AI only to avoid Nvidia carries some risk of future self-ownage. Already did that by going AMD with the 6xxx series with 16 GB VRAM years ago (still a great value).

The Dual B60 looks really good. But the regular B60 is equally inconvenient to obtain, and shipping from euroland means getting gouged on duty/shipping/brokerage/taxes.

While Nvidia cards are (for now) within walking distance, or next-day via Prime.
 
https://youtube.com/watch?v=q_CxcbS5HI8
Surprised that it handles any PS5-era games that well (L4D2 and witcher 3 are ancient).
 
The NextGen update brings its hardware demands up quite a bit; close to Cyberpunk 2077 levels, I'd say.
Sys reqs are still potato-tier:

(attached: system requirements screenshot)

Unless they forgot to update them, which is likely, since they added a raytracing option.
 
Yep. But even then:
https://youtube.com/watch?v=v1Do1oHUV9I
The old version still looked pretty damn good for those system requirements. Same with GTA V. 2015 was the 80/20 breaking point for game graphics: now we have to use 80% more computing power to get 20% more graphical fidelity.
 
The requirements for "high" settings are basically PS4 Pro tier in terms of the GPU, plus 8 GB VRAM and 8 GB main RAM. No real surprise it runs on that PS5 hacked into a PC, then.
 
welp, got the cougar and it's a bit banged up with some rust in a few places.
but i've also managed to bag a corsair R100 too that is well preserved
(attached: photos of the Cougar and the Corsair R100)
the owners decided to leave the case fans as is, so both the cougar and R100 have 3 fans each. total cost 220 monkey money ($40), so i'd say it's kind of alright.
it's a tad late so i gotta sleep before cleaning them, the cougar smells of cat piss which is kind of funny when you think about it.
 
a problem with them is the frequency: some CPUs are heavily sensitive to memory speed, and most of the adapters will run the sticks at the lowest speed possible.
that's something a custom PCB with all of the kajiggers to control voltage, timings and frequency won't suffer from. godspeed to them ruskies.
Both of these ideas are dumb as fuck anyway, because DDR5 SODIMMs are stupid expensive, just like desktop DIMMs!

Pay $480 for two 16 GB SODIMMs and two adapters just to run at DDR5-4800, or $410 for desktop DDR5 at 6000 MT/s CL32 with RGB.

It's just a poor idea
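The math backs that up. A quick sketch of price-per-GB and theoretical peak bandwidth, assuming the desktop kit is also 2x16 GB (the thread doesn't say) and using DDR5's 64-bit channel width in dual channel:

```python
# Price-per-GB and theoretical peak bandwidth for the two options above.
# Assumes the desktop kit is also 2x16 GB; prices are the ones quoted in-thread.

def price_per_gb(price_usd: float, capacity_gb: int) -> float:
    return price_usd / capacity_gb

def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bits: int = 64) -> float:
    """Theoretical peak: transfers/s x bytes per transfer x channels."""
    return mt_per_s * (bus_bits / 8) * channels / 1000

sodimm  = {"price": 480, "gb": 32, "mts": 4800}
desktop = {"price": 410, "gb": 32, "mts": 6000}

print(price_per_gb(sodimm["price"], sodimm["gb"]))    # 15.0 $/GB
print(price_per_gb(desktop["price"], desktop["gb"]))  # ~12.8 $/GB
print(peak_bandwidth_gbs(sodimm["mts"]))              # 76.8 GB/s
print(peak_bandwidth_gbs(desktop["mts"]))             # 96.0 GB/s
```

So the SODIMM route costs more per GB and gives up roughly 20% of peak bandwidth.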
 
The techtubers don't want you to know this, but you can literally play almost every game in existence with reasonably okay performance on a $50 GPU. They've gotta keep you hooked with another video about how PC gaming is impossible in 2025.
 
NO! If I can't get 240 fps, the game is UNPLAYABLE.

I saw one of those YTers unironically say you need Hitman at 200+ fps to really be able to play it.
 
Unironically this; once you get a monitor that's at least 120 Hz, you'll feel like 60 fps is bad performance. Then again, frame gen is meant to bridge that gap, but alas, Jensen just lied to people that it's to make a 5070 perform like a 4090.
Yep once you get used to higher refresh rates, basic bitch 60fps looks like trash.

I still remember when my wife had a 30hz monitor and was playing Hogwarts on it. God.
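The gap is easier to reason about in frame times than in Hz; a trivial conversion shows how much less time each frame stays on screen as refresh rate climbs:

```python
# Refresh rate vs. per-frame budget: Hz to milliseconds.
def frame_time_ms(hz: float) -> float:
    return 1000 / hz

for hz in (30, 60, 120, 165, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
# 30 Hz -> 33.33 ms, 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms,
# 165 Hz -> 6.06 ms, 240 Hz -> 4.17 ms
```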
 
My monitor is 165 Hz, and has been for a couple years now. I have no problem playing games that run at 60 fps. Even played some 30 fps games on the PS2 recently, also the OG version of Doom II, which runs at 35 fps.
 
VRR is what you will truly miss once you lose it.

Most people (including past me) tend to see high framerate + VRR as a luxury, but on the contrary it's what really makes the low end viable. Suddenly your 50-to-70 FPS game doesn't need to be uglified further until you're almost always above 60; you just tell your PC to output whatever it can handle and the monitor does the rest. So liberating to be done with that tedium.

High refresh monitors are the only major PC peripheral that keeps getting cheaper, you can right now get a 32" 1440p 165+Hz for about $200. No excuse.
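The effect is easy to see with a toy frame-pacing model (deliberately simplified: steady frame times, vsync with no tearing, no LFC):

```python
# Toy frame-pacing model: a steady 50 FPS game on a fixed 60 Hz display vs. VRR.
# Deliberately simplified; real pacing adds frame-time variance, LFC, etc.
import math

def fixed_refresh_display_times(game_fps: float, refresh_hz: float, frames: int):
    """Each frame is held until the NEXT vblank after it's ready (vsync)."""
    frame_ms = 1000 / game_fps
    tick_ms = 1000 / refresh_hz
    shown = []
    for i in range(1, frames + 1):
        ready = i * frame_ms
        shown.append(math.ceil(ready / tick_ms) * tick_ms)  # snap to vblank grid
    # How long each frame actually stays on screen:
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

# Fixed 60 Hz: mostly ~16.7 ms holds with a ~33.3 ms double-frame stutter
# every few frames -> visible judder.
print(fixed_refresh_display_times(50, 60, 6))  # [16.7, 16.7, 16.7, 16.7, 33.3]
# VRR: the panel refreshes when the frame arrives, so every hold is a uniform 20 ms.
print([round(1000 / 50, 1)] * 5)               # [20.0, 20.0, 20.0, 20.0, 20.0]
```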
 
The only thing that really bothers me is screen tearing. I keep hearing that there is some kind of visual experience where, once I've experienced it, booting up the old Xbox and running through Halo's campaign again will induce headaches and vomiting, but that's never happened yet.
 
I've been sitting here for legit like an hour, kind of getting MATI, typing a 200 word essay about why low FPS/hz is dogshit, deleting it, and retyping it over and over. I've also had a guy playing GoldenEye 007 on my 2nd monitor the entire time, a game that quite often dipped below 15 FPS.

Playable is subjective. I can't stand anything below around 90 Hz; it feels like I'm moving my mouse pointer through glue. Playing stuff on a console with imprecise thumbsticks, on an LCD TV with massive input lag and shit response times, will feel different than using mouse and keyboard on a decent monitor.
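That difference stacks up as an end-to-end latency budget. A sketch with entirely made-up illustrative numbers (none of these are measurements of real hardware):

```python
# Toy end-to-end input-latency budget (input to photon).
# Every number below is a made-up illustrative figure, not a measurement.
def total_latency_ms(stages: dict) -> int:
    return sum(stages.values())

tv_console = {
    "wireless pad polling": 8,
    "game logic @30 FPS":  33,
    "TV processing":       50,   # TVs outside game mode can be much worse
    "panel response":      15,
}
pc_monitor = {
    "USB mouse polling":    1,
    "game logic @90 FPS":  11,
    "monitor processing":   5,
    "panel response":       3,
}
print(total_latency_ms(tv_console))  # 106
print(total_latency_ms(pc_monitor))  # 20
```

Roughly a 5x gap under these assumptions, which is why the same framerate can feel fine on one setup and like glue on the other.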
 