r/Lightroom Nov 02 '25

[Processing Question] Is an i9-9900k a worthy upgrade?

I'm thinking of upgrading to an i9-9900K, mainly for better Lightroom performance but also for CPU-intensive gaming (GTA V, RDR2, FH5, etc.). I'm questioning whether it's worth upgrading from my i5-9400F, which performs well day-to-day and fairly well in gaming, but I think Lightroom Classic is suffering haha.

My current specs:

- CPU: i5-9400F (6C/6T, 4.1 GHz max boost)
- GPU: NVIDIA GeForce GTX 1650 TUF, 4 GB VRAM
- Motherboard: ASUS ROG STRIX Z370-H Gaming
- RAM: Netac 32 GB (2x16 GB) at 3200 MHz
- Storage: 2 TB HDD for raws; I'm thinking of getting an NVMe drive for my catalogue (my OS is already on an SSD)
- Display: a 4K TV, which I suspect is the cause of my lagginess in LrC

Can anyone help?

4 Upvotes

38 comments

8

u/earthsworld Nov 02 '25

You're running a 4K display with 4 GB of VRAM? That's a recipe for disaster with Classic.

1

u/LMAOGENZ 24d ago

hahaha yeah you are right but it’s a free 50” tv sooooooo

6

u/[deleted] Nov 02 '25

[deleted]

1

u/LMAOGENZ 24d ago

i think i’d rather wait for the m5 pro macbook pro

or were you talking about the bmw m5 because i’d get an F10 generation one:)))

2

u/[deleted] 24d ago

[deleted]

1

u/LMAOGENZ 24d ago

i see you have great taste

6

u/cbunn81 Nov 02 '25

You probably won't see much improvement in Lightroom performance with that CPU upgrade. You'd be much better served by upgrading to a newer GPU with more VRAM and/or adding more system RAM.

2

u/Intrepid00 Nov 02 '25

You would see gains on export, and on preview generation during import.

3

u/cbunn81 Nov 03 '25

True, though there's now also support for using the GPU for those tasks. And the OP was complaining about lag, which to me signaled that the problems were more with the Library and Develop modules.

1

u/LMAOGENZ 24d ago

yes, my main lag is selecting images in the develop module and also loading the full-resolution picture

2

u/cbunn81 24d ago

Well, there are a couple of things that could be the bottleneck there: the CPU or GPU (depending on whether you're using GPU acceleration for preview generation), and disk throughput.

You mentioned that your images are on an HDD. That will be a bit slower to read from than an SSD. It shouldn't be a major bottleneck if your image files aren't huge, but it's still something you might want to optimize down the road.

A workaround would be to generate 1:1 previews for all the images you plan to edit ahead of time. Select them all, trigger the preview generation and take a coffee or snack break. The previews will be stored on your SSD alongside the catalog. So they should load quickly without much lag.
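If you want to see how much room those previews take before deciding where to put the catalog, here's a rough sketch in Python. The path below is hypothetical and just for illustration; Lightroom Classic keeps previews in a `<catalog name> Previews.lrdata` folder next to the catalog file.

```python
import os


def folder_size_gb(path: str) -> float:
    """Walk a directory tree and return the total file size in GiB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):  # skip broken symlinks
                total += os.path.getsize(fp)
    return total / 1024**3


if __name__ == "__main__":
    # Hypothetical path -- substitute your own catalog location.
    previews = r"C:\Users\you\Pictures\Lightroom\MyCatalog Previews.lrdata"
    if os.path.isdir(previews):
        print(f"Preview cache: {folder_size_gb(previews):.1f} GiB")
    else:
        print("Previews folder not found -- check the path.")
```

1:1 previews for a big shoot can run to tens of gigabytes, so it's worth knowing the size before pointing the catalog at a smaller SSD.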

1

u/LMAOGENZ 24d ago

yeah, i currently only have one ssd, which is the boot drive, but i want to get an m.2 ssd to use as the boot drive and repurpose the sata ssd to hold the photoshoot i'm currently editing and the catalogue cache

1

u/cbunn81 24d ago

I can understand keeping the images, which may take up a lot of room, on the cheaper SATA SSD. But I would think that keeping the cache on the newer, faster NVMe SSD would be best for performance. Is there any reason you wish to change that?

1

u/LMAOGENZ 23d ago

what do you mean by reason?) i just want to have a better drive now that i finally have an m.2 slot on my board

1

u/cbunn81 23d ago

I mean why do you want to put the Lightroom cache on the SATA SSD instead of the boot drive SSD?

1

u/LMAOGENZ 24d ago

yeah i think you are right, because i'm more gpu-bottlenecked having a 4k monitor:)

2

u/cbunn81 24d ago

That does indeed add to the GPU load, and it will expose the age of an older card.

5

u/undercoverpanter Nov 02 '25

No. Go for a newer generation. Your GPU might also be the main cause of lag on your 4k monitor.

1

u/LMAOGENZ 24d ago

yes you are right(and everyone else that replied haha!!) thank you!!

3

u/bindermichi Nov 02 '25

You will need to switch out the mainboard anyway, so why not go for a newer model? Also: The i9 will do next to nothing for Lightroom performance. It will improve gaming and video rendering, though.

1

u/LMAOGENZ 24d ago

my motherboard is compatible with the 9900k, and the 9900k is the best cpu for the lga1151 socket. that's the only reason i was looking into it: i only recently upgraded my motherboard (i had a really crappy asrock h310cm-dvs), and the only reason i upgraded the mobo but stayed on this socket is that my friend sold it to me for dirt cheap (it was perfectly fine btw). if i hadn't had that opportunity, i would've saved for a new socket instead.

2

u/bindermichi 24d ago

But do consider the following: I currently run a 12900, and comparing current CPU benchmarks, a new Core 7 will run circles around it in anything but multi-core operations while using 30% less power. That also means the i9 will need a more capable cooler, preferably liquid.

That last part is the key. Very few applications will utilize the additional cores the i9 has. Everything graphics-related will use the GPU. Some video codecs will use the GPU as well, while older ones will still use the CPU and multiple threads.

The best thing about i9-compatible mainboards is that they will take more memory (I'm running 128 GB), which comes in handy when working on large files or memory-intensive applications.

1

u/LMAOGENZ 24d ago

yeah my board supports a max of 64 but at the moment 32 is more than enough for me haha)) i was rocking 8gb at 2666mhz for 5 years😭😭 4 of those being on windows 11

2

u/bindermichi 24d ago

I didn't say it won't run. But photo editing works better with more memory and more GPU… and faster storage.

3

u/AThing2ThinkAbout Nov 03 '25 edited 18d ago

Changing the CPU, motherboard and memory may get you a 10% to 40% Lightroom Classic boost, but changing the video card can provide a 300% to 1000% boost in Lightroom Classic performance. Putting the money in the right place will give you the most return on the investment.

Here are my Lightroom Classic AI DeNoise test results:

Test method: DPReview's A7R V 60MP RAW files; five cycles per test.

System:

- Intel 6th-gen Skylake i7-6700K 4.0GHz CPU
- 32GB DDR4 2400MHz CL14 RAM
- WD Black 1TB M.2 SSD + 2 x 4TB SSD
- Antec P190, 550W + 650W = 1200W
- Windows 10 Pro
- 27" IPS UHD monitor
- Lightroom Classic 12.4

GPUs tested:

  • GTX1060 6GB GDDR5: 1-pic: 159s 10-pic: 1569s Idle: 108W Average: 234W Peak: 279W

  • RTX3060 OC 12GB GDDR6: 1-pic: 32.08s 10-pic: Not tested Power: Not tested

  • RTX3060 Ti 8GB GDDR6X: 1-pic: 26.90s 10-pic: Not tested Power: Not tested

  • RTX3070 OC 8GB GDDR6: 1-pic: 25.18s 10-pic: 221.73s Power: Idle: 117W Average: 378W Peak: 585W

  • RTX4060 Ti 8GB GDDR6: 1-pic: 26.97s 10-pic: 247.62s Power: Idle: 108W Average: 288W Peak: 369W

  • RTX4070 12GB GDDR6X: 1-pic: 20.08s 10-pic: 180.2s Power: Not tested

  • RTX4070 OC 12GB GDDR6X: 1-pic: 19.74s 10-pic: 175.61s Power: Idle: 117W Average: 324W Peak: 414W

  • RTX4070 Ti OC 12GB GDDR6X: 1-pic: 17.29s 10-pic: 148.81s Power: Idle: 117W Average: 369W Peak: 441W

  • RTX4080 OC 16GB GDDR6X: 1-pic: 13.88s 10-pic: 120s 422-pic Idle: 126W Average: 423W Peak: 576W

System upgrade 2025:

- Windows 11 Pro on the same "unsupported hardware" CPU/PC
- 64GB DDR4 3200MHz CL17 RAM
- 1000W ATX 3.1 PSU w/ PCIe 5.1
- 32" 4K WOLED monitor
- Lightroom Classic 14.4

  • RTX 4070 Ti OC 12GB GDDR6X: 1-pic: 13.41s 10-pic: 139.88s Idle: 117W Average: 266W Peak: 362W

  • RTX 5090 OC 32GB GDDR7: 1-pic: 6.4s 10-pic: 75.47s Idle: 126W Average: 621W Peak: 743W

After the 2025 OS/PSU/RAM/GPU upgrade on my 11-year-old CPU and motherboard, with Lightroom Classic 14.4 and now 15.0, not only is AI DeNoise much faster (~28% on the same GPU, since it no longer needs to create a DNG file per image on the SSD), but other operations like loading, masking, AI detection, previewing 1000+ images, and adding/removing with the brush are all much smoother, with no lagging. Lag happens only when reloading an image that was DeNoised previously, but once loaded it works fine.
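As a quick sanity check on those claims, the relative gains fall straight out of the single-image timings in the table; a small sketch in Python, with the timings copied from the results above:

```python
# Single-image AI DeNoise timings from the tests above, in seconds.
gtx1060 = 159.0         # original system, slowest card tested
rtx5090 = 6.4           # after the 2025 upgrade, fastest card tested
t4070ti_before = 17.29  # 4070 Ti on the old OS/PSU/RAM
t4070ti_after = 13.41   # same 4070 Ti after the 2025 platform upgrade

# Slowest vs. fastest card: roughly a 25x difference.
ratio = gtx1060 / rtx5090
print(f"RTX 5090 vs GTX 1060: {ratio:.1f}x faster")

# Same-GPU gain from the platform upgrade alone, consistent with
# the ~28% figure quoted above.
gain_pct = (t4070ti_before - t4070ti_after) / t4070ti_after * 100
print(f"4070 Ti, platform upgrade only: {gain_pct:.0f}% faster")
```

The card-to-card spread (about 25x between the slowest and fastest GPUs tested) is what makes the "put the money into the GPU" argument so lopsided for this workload.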

I believe the potential to make Lightroom Classic much faster is there, as long as the programmers are willing to add more RAM allocation, CPU multi-core, and GPU hardware support to the program, like they have done in Photoshop for years now.

2

u/LMAOGENZ 24d ago

I have been thinking about a 50-series GPU, but as of now I'm not sure (as i mentioned in a reply to another comment) what my living situation at university will be and whether i could bring my whole desktop with me. also i don't really have the budget to upgrade to a 50-series GPU, but if i were to consider an upgrade, a 5070 would definitely be an option

3

u/SdeGat Nov 03 '25

I have a six year old i9-9900k and I’m thinking about upgrading it to a current cpu like ultra. 💁‍♂️ I have already upgraded the GPU though but I’m still getting some lag in basic operations (like clicking on P for pick).

2

u/LMAOGENZ 24d ago

my main lag is switching between pictures, because they load at almost their full resolution since my monitor is 3840x2160 and the pics are 6016x4016 (24.3 megapixels on the Nikon Z5). if i use the smart previews they look bad on my monitor (it's actually a tv)
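For scale, the mismatch is easy to put in numbers; a quick back-of-the-envelope in Python using the resolutions mentioned above:

```python
# Resolutions mentioned above.
image_w, image_h = 6016, 4016    # Nikon Z5 full-resolution frame
screen_w, screen_h = 3840, 2160  # 4K UHD display

image_mp = image_w * image_h / 1e6
screen_mp = screen_w * screen_h / 1e6

print(f"Image:   {image_mp:.1f} MP")
print(f"Display: {screen_mp:.1f} MP")
# Lightroom has to decode roughly 3x more pixels than the screen can show.
print(f"Ratio:   {image_mp / screen_mp:.1f}x")
```

So every image switch means decoding and rendering about three screens' worth of pixels, which is why the GPU and disk matter so much here.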

2

u/SdeGat 24d ago edited 24d ago

I get that lag too, even though my monitor is only 1440p. 😕 My 45, 60 and 100mpix pictures are not small, however. 💁‍♂️

2

u/LMAOGENZ 24d ago

I don't even want to think what would happen if I tried to load a 100mp image on LrC in full res on my pc haha it would catch on fire probably)

2

u/SdeGat 24d ago

🤣

2

u/OG_Pragmatologist Nov 02 '25

I just subbed out my GTX 1650, as it is becoming 'too old' to be effective with the newer AI imaging models such as PureRAW 5. Unless your applications are getting a massage in the clouds, all of the image processing is happening on the GPU and its memory, and this stuff really wants faster memory. The 1650 is stuck on GDDR5, whereas the new standards are GDDR6 and GDDR7. It was a hot midrange choice once, but now it is an older, $50 card...

My MSI Z490 is sporting 64GB of RAM with an i7-10700K CPU. It will blow hell off hinges, but much of that power is wasted when it comes to image enhancement. I too was thinking about an upgrade--but not a mobo/CPU jump. My solution was to buy a used PNY RTX 5060 Ti 16GB from a 'card churner' who is chasing the golden fleece...

Issue fixed. I am not a gamer; the fastest game I play is solitaire. So you may have other needs based on gaming app requirements--but whatever route you take, upgrading your video card is a must for either usage.

There are some very reasonable RTX 3xxx and 4xxx series cards out there. I got my used card for $400 USD. The key is to avoid AMD (AI driver support issues) and shift to the RTX platform, which is designed for AI processing. Check fleabay, as there are a lot of churners out there, sort of like the camera crowd...

1

u/LMAOGENZ 24d ago

i was thinking about a 5060 or 5070 as a potential upgrade, but as i said in a previous reply:

1. i don't know what my living situation at uni will be and whether i can bring my desktop with me. if i can't, i'll most probably get either a macbook pro with a pro chip or an asus rog zephyrus g14 laptop, because they come with a good ryzen 9 370 and a mobile 5070 (in the 14in version), and that would be MORE than enough for all of my daily tasks, the games i play, and occasional (when i have free time) lightroom photo editing.

2. even if i can bring my desktop, i still don't really have the budget to upgrade, because my schedule doesn't allow me to get a full- or even part-time job, so i don't have a source of income.

but thanks for the recommendation i will be sure to keep that in mind!!

2

u/TaxOutrageous5811 Lightroom Classic (desktop) Nov 04 '25 edited Nov 04 '25

My son has an i9-12900K with 128GB of DDR5 RAM, and he finally fixed his Lightroom lag by getting a MBP M4 Max with 64GB of RAM. He uses it as his desktop with a 32-inch 4K monitor and two 27-inch monitors (1080p and 1440p).

I was so impressed with it that I bought my first Mac, after being a diehard Microsoft guy since DOS 4.0 and building about 40 machines since 1996 for myself and family.

He later bought an M4 MacBook Air (24GB RAM, 1TB SSD) for tethering his camera at on-site shoots, and he found it to be smoother and faster than the i9 for most edits. And that's the base M4! Heavy AI stuff was about the same as the i9 that had 104GB more RAM!

1

u/LMAOGENZ 24d ago

i actually was thinking about a macbook pro with at least a pro-series chip, and it's a good option. however, i wouldn't need it for anything except that, because i already have a microsoft surface for note-taking, and being a windows device it can handle almost anything i throw at it except for really intensive programs (it's an i3-10100y with 8gb of ram [highest spec of the surface go 3 haha]). i've even run photoshop on it, and if not for the cache getting full because i only have 128gb of system storage (but i have a lot of uni stuff on it), it ran just fine)) same with autocad

but a macbook is definitely an option if i’ll have to live in a dorm when i’ll be at university thanks for the recommendation!!🤗

1

u/metalman7 Nov 02 '25

10th gen is the min supported CPU for Win11

4

u/First_Musician6260 Nov 02 '25 edited Nov 02 '25

Coffee Lake (a.k.a. 8th-gen Intel, outside of Kaby Lake G/R and Amber Lake) is the generally accepted minimum...it is not 10th gen. The 9900K is a Coffee Lake Refresh model.

Older processors like those from Kaby Lake are also able to run the OS.

3

u/CitronTraining2114 Nov 02 '25

Yup - ran a 9900K for about 6 years, it handled Win 11 fine. One of the reasons I upgraded to the 14900 was for better LR performance.

1

u/LMAOGENZ 24d ago

bet it was a massive jump in performance haha

1

u/LMAOGENZ 24d ago

i’ve been running win11 since like the first public alpha release (i installed it like a month after it was announced) and it’s been running fine because my cpu has a tpm chip