Help me build my PC: 2024 Catch-All

Looks like 13th gen Intel CPUs are going to be power consumption monsters.

Bad enough the GPU market is cranking up power draw, but at least they seem to be promising significant performance gains. Intel seems to be promising a fairly meh incremental boost (8-10% IPC gain, ~15% total gain once you add in the higher clock speeds) in exchange for making your power meter go whirly-whirly-whirl.
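Back-of-the-envelope (my own arithmetic, not Intel's slide math): total performance scales roughly as IPC times clock speed, so those two claims together imply only a modest clock bump:

```python
# Rough sanity check on the quoted figures (my illustration, not Intel's numbers):
# total performance gain ~= IPC gain x clock-speed gain.
ipc_gain = 1.08      # low end of the quoted 8-10% IPC uplift
total_gain = 1.15    # the quoted ~15% overall gain
clock_gain = total_gain / ipc_gain
print(f"Implied clock-speed uplift: {clock_gain - 1:.1%}")  # ~6.5%
```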

I need to stop talking about Intel, it makes me feel like an AMD fanboy. But if Intel were pushing out the better, more efficient CPU process, I would be here saying so.

Yeah. AMD seems to be winning on a lot of fronts these days. Their escalating efficiency gains are on the cusp of being legendary. Generation-over-generation 25% efficiency gains are an incredible achievement.

And then you have the numerous features introduced and the execution on a die shrink.

*Legion* wrote:

I need to stop talking about Intel, it makes me feel like an AMD fanboy. But if Intel were pushing out the better, more efficient CPU process, I would be here saying so.

When I upgraded last year I went with an AMD CPU. That's the first time I've not picked Intel - *ever*.

If they pushed even me away, they've seriously screwed the pooch.

Moggy wrote:
*Legion* wrote:

I need to stop talking about Intel, it makes me feel like an AMD fanboy. But if Intel were pushing out the better, more efficient CPU process, I would be here saying so.

When I upgraded last year I went with an AMD CPU. That's the first time I've not picked Intel - *ever*.

If they pushed even me away, they've seriously screwed the pooch.

I have been going Intel for a long, long time. The last time I went AMD was when I got the AMD 5x86 DX4-133. I was working at a computer store back then, and after their 486 line I saw so many AMD systems come in for repair with weird issues that I avoided them like the plague. This last upgrade was the first time I went back to AMD, and I have no complaints.

The great Ethereum Merge of myth and legend has finally happened. As of today, you can no longer mine Ethereum, completely eliminating the demand for GPUs in the Ethereum world.

It's a little anti-climactic given that the tanking of the crypto markets has depressed mining GPU demand already, and of course ETH miners who have been paying attention were already executing their exit strategies.

Bitcoin remains proof-of-work (i.e. mining), but efficient Bitcoin mining left GPUs behind long ago, and instead requires purpose-built ASICs to mine profitably. Still power-sucking leeches, just not ones that eat up the PC GPU market.

Hoping for the entire crypto world to die a horrible death is probably too much to wish for, so I at least hope Ethereum eats the lunch of all the remaining proof-of-work coins. If people are gonna run these "greater fool" scams, let's at least move everyone to the one that is no longer an energy parasite.

Pyramid scam for math nerds...
DIAF all of them.

Have you checked eBay today? I wonder when the panic will set in? I mean, miners can wait as long as Nvidia is still manipulating the market :(

fangblackbone wrote:

Have you checked eBay today? I wonder when the panic will set in? I mean, miners can wait as long as Nvidia is still manipulating the market :(

I have no insight into the Ethereum community, so I have been wondering what percentage of the mining community had already gotten out long ago, what percentage has been running out the clock and squeezing every drop they could, and what percentage woke up this morning completely blindsided by the fact that their mining setups now do nothing. My desire for schadenfreude makes me hope the last group is a large percentage, but it might be a small one.

I could see a large group being surprised. The merge has been delayed several times, plus there is a lot of money with a vested interest in the merge not happening.
The ensuing panic would be very welcome!

*Legion* wrote:

The great Ethereum Merge of myth and legend has finally happened. As of today, you can no longer mine Ethereum, completely eliminating the demand for GPUs in the Ethereum world.

It's a little anti-climactic given that the tanking of the crypto markets has depressed mining GPU demand already, and of course ETH miners who have been paying attention were already executing their exit strategies.

Bitcoin remains proof-of-work (i.e. mining), but efficient Bitcoin mining left GPUs behind long ago, and instead requires purpose-built ASICs to mine profitably. Still power-sucking leeches, just not ones that eat up the PC GPU market.

Hoping for the entire crypto world to die a horrible death is probably too much to wish for, so I at least hope Ethereum eats the lunch of all the remaining proof-of-work coins. If people are gonna run these "greater fool" scams, let's at least move everyone to the one that is no longer an energy parasite.

I have a neighbor who I'd hoped was mining Ethereum but is likely mining BTC with ASICs since they are still running... He runs them in his garage and vents them directly out the garage. The fans are... ridiculously, obnoxiously loud. Really ruins the peace and quiet we'd otherwise have.

A little different than the normal PC build recommendations, but I want to shout out these little barebones mini-PC devices made for wired routers/firewalls.

I did a build with one of these (pairing my own M.2 drive and RAM with the fully barebones model) to run pfSense. The Intel J4125 CPU is quite speedy for a 10W, passively-cooled CPU (see, I can say nice things about Intel chips), and it has AES-NI hardware crypto support, making it quite viable to run a VPN server in tandem with the router/firewall. I also got rid of Pi-Hole on my network and replaced it with pfBlockerNG right there inside pfSense.
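If you're eyeing a similar barebones box, here's a quick way to sanity-check the AES-NI claim before you commit to running VPN crypto on it. This is a minimal sketch of my own, assuming you're testing from a Linux machine; on pfSense itself (FreeBSD) you'd look for AESNI in the boot messages instead:

```python
# Minimal sketch (my addition, not from the build itself): check for the
# AES-NI feature flag by reading /proc/cpuinfo on Linux.
def has_aes_ni(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                # The "flags" line lists CPU features; "aes" means AES-NI.
                if line.startswith("flags") and "aes" in line.split():
                    return True
    except OSError:
        return False  # no /proc here (not Linux); check dmesg instead
    return False

if __name__ == "__main__":
    print("AES-NI supported" if has_aes_ni() else "no AES-NI flag found")
```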

You can find these barebones kits even a bit cheaper than my Amazon link if you hit up AliExpress, though obviously you'll be waiting longer for it to arrive. But I think it's a great piece of kit for more technically minded users. I'm a big fan of running a separate wired router/firewall, and making my wireless "routers" act purely as access points (makes it easier to place them when they don't need to be next to your modem). I think even if pfSense is a little much for you, you could run OpenWRT on one of these.

If you're the type to flash DD-WRT or OpenWRT on your wireless router/AP combo devices, highly recommend one of these as your next step up.

EDIT: Since my box is running pfSense, I decided to give it a name that starts with "pf". I thought about Pfeiffer or Pfizer, but in honor of my time living in Austin/Round Rock, I named it Pflugerville.

So, am I going to get greedy?
6700 XT for $359 after $20 mail-in rebate at Newegg...
https://www.newegg.com/msi-radeon-rx...

The 6800s haven't come down yet, or if they have, they're $175 more than a 6700 XT, i.e. totally not worth it.
I can wait another week I guess. That card isn't the first and won't be the last, as there are several models sitting at $399 right now...

Plus if I was going to pay $529 for a 6800, I'd spend the extra $$$ and get a 6800 XT for $559.

Intel's lower-end CPU product lines, the Pentium and Celeron brandings, will soon be rebranded simply as "Intel Processors".

I have jokes. So many jokes. But I'm going to behave.

That J4125 I was raving about in the router build uses the Celeron branding. That (or rather its replacement) will now be an Intel Processor, with a capital "P". For you see, all Intel Processors are Intel processors, but not all Intel processors are Intel Processors.

I always thought it was a missed opportunity: with everything now being branded e-this and e-that, Intel should have rebranded the Celeron as the Celer-e ;P

fangblackbone wrote:

I always thought it was a missed opportunity: with everything now being branded e-this and e-that, Intel should have rebranded the Celeron as the Celer-e ;P

Well now that they're "Intel Processors", the performance cores can be called IP-P.

If the Intel engineers are fans of KISS, it could be IP Freeley...

jonnypolite wrote:

EVGA exits the graphics card business.

Wow.

Something really wacky is going on with nVidia's business relationships behind the scenes.

Seems worth noting that Microsoft's experience with nVidia in the OG Xbox era led to nVidia getting frozen out of the console space for the next 20+ years.

Yup, that is all the buzz. I find it questionable that they won't work with AMD. Okay, I guess mass layoffs are better.

I think that, first, it is 80% of the company's revenue.
Second, if they think they will hold the same clout as a company without any GPU business, they are sorely mistaken.
Third - it seems knee-jerk, considering the unique market conditions over the last couple of years. I'm not implying they need to take Nvidia's abuse. But rage-quitting the industry seems just as bad.

It's also an issue because Nvidia is competing against partners with its own Founders Edition cards, both videos say, because Nvidia, as the supplier and manufacturer, doesn't have to worry about profit margins as much.

When NVIDIA redesigned the cooler for the 3000 series Founders Edition, and really started emphasizing it as a product in their presentations as opposed to it being a reference board, I interpreted that as the first step towards trying to recapture a much bigger piece of the board seller pie for themselves.

I'm not at all shocked to see a major board partner call it quits.

fangblackbone wrote:

Yup, that is all the buzz. I find it questionable that they won't work with AMD. Okay, I guess mass layoffs are better.

I think that, first, it is 80% of the company's revenue.
Second, if they think they will hold the same clout as a company without any GPU business, they are sorely mistaken.
Third - it seems knee-jerk, considering the unique market conditions over the last couple of years. I'm not implying they need to take Nvidia's abuse. But rage-quitting the industry seems just as bad.

80% of their revenue, but approximately half of their profit. The CEO stated that their PSU margins were 300% greater. And with Nvidia feeling free to undercut their price by not having to pay list for the GPU, it'll likely only get worse from here.
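To put hypothetical round numbers on that (my own illustration, not EVGA's actual financials): if GPUs are 80% of revenue but only half of profit, the rest of the business is running at roughly 4x the GPU margin, which is exactly a "300% greater" margin:

```python
# Hypothetical round numbers, just to illustrate the 80%-of-revenue /
# half-of-profit split described above (not EVGA's real books).
gpu_revenue, other_revenue = 800.0, 200.0    # GPUs = 80% of revenue
gpu_profit, other_profit = 50.0, 50.0        # but only half of total profit

gpu_margin = gpu_profit / gpu_revenue        # 6.25%
other_margin = other_profit / other_revenue  # 25%

# The rest of the business runs at 4x the GPU margin, i.e. "300% greater",
# which lines up with the CEO's PSU comparison.
print(f"GPU margin:   {gpu_margin:.2%}")
print(f"Other margin: {other_margin:.2%}")
```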

Some of the quotes from Jensen are just f*cked. "Why are the board partners making money when they don't do anything?"

So it is a sound situation to throw away 50% of your profit because other parts of your business have higher margins?

If you expect that 50% to get squeezed harder and harder going forward while consuming a lot of your company's energy, yes.

My company deliberately walked away from a significant fraction of our business a while ago because our much bigger partner was becoming progressively more unreasonable. Not having to deal with their BS allowed space to develop other business relationships and do more internal development. Having our best people constantly dealing with their bullsh*t was not in the best interests of the company or those employees.

Edit:
Perhaps the first chart in this article will provide context: https://arstechnica.com/gaming/2022/...

I think EVGA sees this as an opportunity to refocus their business into areas where they could possibly grow their market share and recoup that loss in profit without having to deal with all the NVIDIA bullsh*t.

Craazy!

Yeah, making $3 or so selling a $1,000 video card is... stupid. And that's apparently a thing that is happening right now.

I'd abandon ship too. Don't blame them at all.

fangblackbone wrote:

Yup, that is all the buzz. I find it questionable that they won't work with AMD. Okay, I guess mass layoffs are better.

I think that, first, it is 80% of the company's revenue.
Second, if they think they will hold the same clout as a company without any GPU business, they are sorely mistaken.
Third - it seems knee-jerk, considering the unique market conditions over the last couple of years. I'm not implying they need to take Nvidia's abuse. But rage-quitting the industry seems just as bad.

Quite a hot take to claim that a company with the longevity, experience, and reputation of EVGA hasn't taken the time to think this through carefully. I'm not saying you're wrong, but there are far safer claims than that which I still wouldn't make without evidence.

You can exist in the PC market without making GPUs, for sure... there are plenty of examples right now. The GPU market is wacky right now... the lion's share of profits is in the mid-range, but consumers won't touch you unless you also have a halo product. I've been running an AMD 6800 XT and a 6900 XT for most of the pandemic and they are just fine... but gamers have a long memory and for the most part won't touch AMD.

For the longest time I only used Radeon GPUs; I think my first Nvidia card was a 970. The biggest difference to me was the stability of the drivers. I had so many driver-crashing issues, across GPU generations, on the Radeon cards. That is the main thing that has kept me away from AMD cards since, unfair as it might be.

My last Nvidia card was an 8800 GT.
Since then I've had a 6770, a 7870, and currently a 580. I've had zero issues with AMD GPUs for... 10 years?

CPU-wise I've been even more stingy about upgrading, but I will note that my current Ryzen 3600 is the biggest boost I've seen in gaming performance. I switched from an FX-8300; I had no idea how much a CPU could dog performance. Switching gave so much more life to the 580. Before the FX-8300 I believe I had a Celeron. (The E2160, I think it was?)

I had a bumpy experience with EVGA's GPUs. The first one I remember buying, about 14 years ago, was a 7600 GT that didn't last long; I got it to be able to play Team Fortress 2 with you lot back then, but it was a fairly simple card with a tiny fan. It died on me about a year and a half later, and its demise signaled the time to upgrade.

Then I got a GTS 450 that survived a little longer, about 4-5 years before it started acting up and coil whining. I upgraded to a GTX 750 Ti that still runs with its infernal single fan at full throttle; it's in my old rig that's currently out of service, but not because of the card.

After my latest rig upgrade I went APU without regrets; you can game to your heart's content with Vega 11 up to 1080p. But for 3D work it's become clear I need more video memory, so I am leaning toward getting a 6600 or 6650 XT soon. I will certainly miss having the option of the classic, reliable, and elegant EVGA line.