Folks: I’m dredging the internet for non-LHR cards, but not knowing anything, what is the telltale marking or two that will tell me it’s an LHR card? I assume if it says gaming = LHR, but I’ve come across some that I can’t tell, and so far 2 vendors are pleading ignorant but they think it is non-LHR… not buying it… what’s the thing or two to screen for? Thanks in advance…
It is usually listed on the box. Otherwise it comes down to knowing the models and checking model/serial numbers. If the card says v2, it’s LHR. No 3090s, I believe, are LHR. Almost all of the Ti cards are LHR. If you know the model number, just google it.
Perfect… this is what I was looking for!
The reason Nvidia created the LHR graphics cards is that GPU miners were hoarding and buying up all the inventory. To help gamers and computer companies, starting in July 2021 Nvidia began making Low Hash Rate (LHR) GPUs to keep them out of the miners’ grubby mitts. The idea is that LHR GPUs won’t be profitable for mining and will therefore increase the supply in stores. LHR does not affect gaming or video quality; it just lowers the mining capability. There may be software fixes to bypass the LHR, but they will probably void the warranty.
I understand the rationale; it’s the identification I was trying to streamline… appreciate you chipping in. Cheers
As the post above says, the high-end 3090 is one of the only non-LHR cards Nvidia makes. You can pretty much assume it’s an LHR card if it’s below a 3090 and was manufactured from July 2021 to present.
I’m not sure about AMD; I think this only applies to Nvidia.
Hi, it’s a very interesting topic. Before you purchase a non-LHR card, run a simple ROI calculation against its hashrate. LHR only throttles ETH mining, not other algorithms, so if ETH 2.0 arrives and we can no longer mine it, the cards are pretty much the same. At the prices available to me, I figured I’d need 10–11 months of mining ETH to break even, and I’m not sure when ETH 2.0 is coming.
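For anyone who wants to run the same break-even math, here’s a minimal sketch in Python. The function name and all the example numbers (card price, hashrate, revenue per MH/s, power draw, electricity rate) are hypothetical placeholders — plug in your own figures, and remember ETH revenue per MH/s swings with price and difficulty:

```python
def months_to_roi(card_price_usd, hashrate_mhs, usd_per_mhs_per_day,
                  power_watts, electricity_usd_per_kwh):
    """Rough months until a card pays for itself.

    Ignores difficulty changes, coin-price swings, pool fees, and resale
    value -- it's a back-of-the-envelope estimate, not a forecast.
    """
    daily_revenue = hashrate_mhs * usd_per_mhs_per_day
    daily_power_cost = (power_watts / 1000) * 24 * electricity_usd_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # never pays off at these numbers
    return card_price_usd / daily_profit / 30

# Hypothetical example: $1200 card, 120 MH/s (full, non-LHR rate),
# $0.03 per MH/s per day, 290 W draw at $0.12/kWh.
print(round(months_to_roi(1200, 120, 0.03, 290, 0.12), 1))
```

An LHR card hashing ETH at roughly half rate would double the revenue-side denominator, which is why the break-even window stretches so much on throttled cards.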