  • Nvidia RTX 3050 is here to save budget gamers as it's purportedly really bad at mining


    Karlston

    • 1 comment
    • 537 views
    • 2 minutes

    The upcoming GeForce RTX 3050, the budget gaming graphics card from Nvidia, is reportedly a really poor Ethereum (ETH) miner, according to a report from fellow outlet VideoCardz. The site says its sources in China provided the information, which gamers will certainly be pleased to hear.

     

    Although it is not officially billed as a Lite Hash Rate (LHR) GPU, it looks like the RTX 3050 will run the LHR algorithm under the hood: the card's ETH mining hash rate reportedly drops from around 20 MH/s to just 12.5 MH/s within seconds, as soon as the algorithm detects the mining software. The site has provided the following image to show this.

     

    [Image: RTX 3050 mining hash rate dropping under LHR (source: VideoCardz)]

     

    The purported RTX 3050 is also shown consuming around 73W of power, making it pretty inefficient for mining.

     

    Chinese Twitter user @wxnod has also shared a screenshot of mining output that purportedly belongs to the GeForce RTX 3050. In this case, the GPU does 13.66 MH/s at 57W after hardware tweaking and optimization, says VideoCardz.

     

    [Image: RTX 3050 mining output after tuning (source: @wxnod on Twitter)]
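    For a rough sense of what these figures mean for miners, the metric that matters is hash rate per watt. The following is a minimal sketch of that arithmetic in Python, using only the purported readings cited above (12.5 MH/s at roughly 73W and 13.66 MH/s at 57W); these are screenshot figures, not confirmed specifications.

    # Hash-rate-per-watt for the two purported RTX 3050 readings reported above.
    # Figures come from the VideoCardz and @wxnod screenshots; nothing else is assumed.
    readings = {
        "LHR-limited (VideoCardz)": (12.5, 73),   # (MH/s, watts)
        "Tuned (@wxnod)": (13.66, 57),            # (MH/s, watts)
    }

    for label, (mhs, watts) in readings.items():
        print(f"{label}: {mhs / watts:.3f} MH/s per watt")

    # Prints roughly:
    #   LHR-limited (VideoCardz): 0.171 MH/s per watt
    #   Tuned (@wxnod): 0.240 MH/s per watt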

     

    So, despite having 8GB of VRAM, the GeForce RTX 3050 could be unattractive to miners, potentially making it more readily available for gamers, assuming scalpers don't buy it up anyway. The card carries an MSRP of $249, and its poor mining performance could make the Radeon RX 6500 XT, with that card's hardware limitations, a tough sell in comparison.

     

    Source and image: VideoCardz

     

     


    User Feedback

    Recommended Comments

    • Administrator

    If the algorithm is not circumvented, that is. It is possible to change the graphics card's software, to put it simply, to do that.

     

    However, if it is done at a hardware level that cannot be circumvented, then I see it as good work by Nvidia.
