Showing results for tags 'gpu'.

  1. AMD RDNA 3 RX 7900 XTX, 7900 XT, 7800 XT full alleged spec details and die shot have leaked

     An alleged die shot of the GPU powering AMD's upcoming RX 7000 series graphics cards, based on the RDNA 3 architecture and launching later tonight, has leaked. While it is by no means a perfectly clear capture in glorious 4K, it gives a fair idea of what the Navi 31 die looks like. First, do not be confused by the "A"-shaped logo at the top of the image: that is the watermark of Angstronomics, the blog that leaked the photo.

     What is actually interesting about the die shot is the layout of the chip and its Infinity Cache components. We know that AMD is moving to chiplet-based packaging on its GPUs too. While the core will be based on an alleged 308mm2 GCD (5nm), the Infinity Cache will be spread across six MCDs (6nm), each measuring 37.5mm2. Although prior rumors suggested that each MCD would carry 64MB of stacked cache, more recent reports point to 16MB per MCD on first-gen Navi 31, for a total of 96MB of Infinity Cache. There is reportedly also the possibility of 32MB MCDs, or a total of 192MB of Infinity Cache on Navi 31; however, that is apparently planned for the Navi 31 refresh in the RX 7x50 series sometime in the future.

     Tonight, AMD is expected to unveil two premium SKUs, the RX 7900 XTX and the RX 7900 XT. The chiplet design approach allows AMD to use a lower-cost process for parts like the I/O and memory, while the cutting-edge lithography is reserved for the core itself.

     RX 7900 XTX

     The flagship RX 7900 XTX is rumored to pack 48 Work Group Processors (WGPs), or 96 Compute Units (CUs), for a total of 12,288 Stream Processors (SPs). It will purportedly have 24GB of 20Gbps GDDR6 memory across a 384-bit memory bus, for a bandwidth of 960GB/s. This is not fully confirmed, though, and AMD may even use the slightly slower but more abundant 18Gbps memory (864GB/s bandwidth) that it currently deploys on its RX 6x50 refresh SKUs like the RX 6950 XT, 6750 XT, and 6650 XT.

     RX 7900 XT

     Meanwhile, the slightly cut-down RX 7900 XT will purportedly feature 42 WGPs, or 84 CUs, for a total of 10,752 SPs. The memory subsystem is also cut down, to 20GB across a 320-bit bus. Once again, depending on the speed of the memory chips used, the bandwidth could be 800GB/s (for 20Gbps) or 720GB/s (for 18Gbps). The Infinity Cache also decreases from 96MB to 80MB.

     RX 7800 XT

     Finally, we have the RX 7800 XT, which will reportedly have 30 WGPs, or 60 CUs, for 7,680 SPs in total. It will be based on the Navi 32 die instead of Navi 31. In terms of memory, the 7800 XT is expected to have 16GB of 20Gbps or 18Gbps GDDR6, for a total bandwidth of 640GB/s or 576GB/s respectively. As for Infinity Cache, the 7800 XT will allegedly come with 64MB of the on-die cache.

     Expect to know more later tonight when AMD officially takes the wraps off RDNA 3.

     Source and image: Angstronomics (1), (2) via chi11eddog (Twitter)
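     The memory-bandwidth figures quoted above follow directly from bus width and per-pin data rate. A minimal sketch of the arithmetic (the helper name is mine, not from the article):

```python
def gddr6_bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: each bus wire carries `speed_gbps`
    gigabits per second, and 8 bits make a byte."""
    return bus_width_bits * speed_gbps / 8

# Rumored RX 7900 XTX: 384-bit bus with 20 Gbps GDDR6
print(gddr6_bandwidth_gbs(384, 20))  # 960.0 GB/s
# Rumored RX 7900 XT: 320-bit bus, 20 Gbps or 18 Gbps chips
print(gddr6_bandwidth_gbs(320, 20))  # 800.0 GB/s
print(gddr6_bandwidth_gbs(320, 18))  # 720.0 GB/s
```

     The same formula reproduces every bandwidth figure in the article, including the 864GB/s quoted for a hypothetical 18Gbps 384-bit configuration.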
  2. The GPU shortage is over. The GPU surplus has arrived!

     Nvidia says that "excess inventory" is dragging down its balance sheet.

     How quickly things change: a year ago, it was nearly impossible to buy a GeForce GPU at its intended retail price. Now, the company has the opposite problem. Nvidia CEO Jensen Huang said during the company's Q2 2023 earnings call yesterday that the company is dealing with "excess inventory" of RTX 3000-series GPUs ahead of its next-gen RTX 4000 series release later this year. To deal with this, according to Huang, Nvidia will reduce the number of GPUs it sells to manufacturers of graphics cards and laptops so that those manufacturers can clear out their existing inventory.

     Huang also says Nvidia has "instituted programs to price position our current products to prepare for next-generation products." Translated from C-suite to English, this means the company will be cutting the prices of current-generation GPUs to make room for next-generation ones. Those price cuts should theoretically be passed along to consumers somehow, though that will be up to Nvidia's partners.

     Nvidia announced earlier this month that it would miss its quarterly projections by $1.4 billion, mainly due to decreased demand for its gaming GPUs. Huang said that "sell-through" of GPUs, or the number of cards being sold to users, had still "increased 70 percent since pre-COVID," though the company still expects year-over-year revenue from GPUs to decline next quarter. Demand for PCs and PC components is mostly down across the board, partly because of recession fears and partly because people bought so much PC hardware early in the pandemic. Nvidia is also selling fewer GPUs to cryptocurrency miners, both because of falling prices and because of Ethereum's long-anticipated transition away from GPU mining.

     Nvidia CFO Colette Kress insisted that the company is "unable to accurately quantify the extent to which reduced crypto mining contributed to the decline in gaming demand," not least because Nvidia has gotten into trouble with the SEC in the past for obscuring the number of GPUs it was selling to cryptocurrency miners.

     The upshot for consumers is that new and used GPU prices should continue to fall, as they have been for most of the year, and that we can expect at least a few RTX 3000-series cards to hang around well after the launch of the RTX 4000 series. People who want the best performance they can get should probably wait for those cards to launch (AMD and Intel are planning new GPU launches soon, too). For anyone still getting by with an older GPU and wanting to upgrade, however, it's the best time in years to overhaul or replace a gaming desktop.
  3. Intel Arc A380 desktop GPU officially arrives in the US at just $139

     Buyers who need an inexpensive graphics card in the US now have one more option, and this time it comes from Intel, not AMD or Nvidia. More than two months after the official unveiling, Intel has finally brought its entry-level desktop graphics card to the United States. The Intel Arc A380 is currently available only from ASRock. It features a single fan with zero-dB support (the fan turns itself off under light loads), 6 GB of GDDR6 memory, and PCIe 4.0 connectivity. Ports include three DisplayPort 2.0 and a single HDMI 2.0.

     ASRock Challenger A380
       GPU:                   Intel ACM-G11 "Alchemist"
       Core Clock:            2250 MHz
       Xe Cores / FP32 Cores: 8 / 1024
       Memory:                6 GB GDDR6, 96-bit, 15.5 Gbps
       Ports:                 3x DisplayPort 2.0, 1x HDMI 2.0
       Power:                 75W
       Cooling:               Single fan with 0dB support
       Price:                 $139

     The ASRock Challenger Arc A380 is a low-end graphics card with a price tag of $139 at Newegg, which means users should not expect much in terms of high-fidelity gaming. This GPU rivals other affordable graphics cards from AMD and Nvidia, such as the Radeon RX 6400 and GeForce GTX 1650. You can see how the Arc A380 compares to its competition in our dedicated coverage.

     Those keeping an eye on the Intel Arc A380 should also consider that it requires a CPU and motherboard with Resizable BAR support. That means the graphics card will work on systems with Intel 10th Gen CPUs and newer (Intel 400, 500, and 600 Series motherboards) and AMD Ryzen 3000 CPUs and newer (AMD X570 motherboards). If you need something more capable than an entry-level GPU from Intel, EVGA currently offers Nvidia RTX 30 Series graphics cards at massive discounts.
  4. NVIDIA lost a billion dollars which AMD may have picked up from gaming GPU sales

     NVIDIA as a company is on solid ground, but its revenue from the sale of graphics cards isn't great. The company made a billion dollars less in gaming revenue in the first half of 2022 compared to the same period last year. Incidentally, AMD's revenues climbed by roughly the same amount, suggesting team red picked up what team green lost.

     According to preliminary numbers released by NVIDIA, the company reported $2.04 billion in gaming revenue for the first half of 2022, down from the $3.06 billion it made in 2021. Simply put, NVIDIA's revenues from the sales of graphics cards for gaming have slid 33%. While NVIDIA founder and CEO Jen-Hsun "Jensen" Huang acknowledged the concerning downturn, he blamed the slowdown on component shortages and "ongoing macroeconomic uncertainty."

     It is interesting to note that these factors seem to have had no negative effect on AMD. The company's financial report puts AMD's revenue at $6.6 billion for Q2 2022, with $1.7 billion coming from gaming; the sector is now the company's second-biggest source of revenue, behind computer processors, which earned $2.2 billion. While its rival's gaming revenues slid, the CEO of AMD claimed the company is growing steadily:

       We delivered our eighth-straight quarter of record revenue based on our strong execution and expanded product portfolio. We see continued growth in the back half of the year highlighted by our next-generation 5nm product shipments.

     NVIDIA's CEO has indicated the company could prioritize its efforts to scale up in the exponentially growing field of Artificial Intelligence (AI):

       Our gaming product sell-through projections declined significantly as the quarter progressed. As we navigate these challenges, we remain focused on the once-in-a-generation opportunity to reinvent computing for the era of AI.

     NVIDIA's GPUs continue to dominate the high-end GPU market. But the company's graphics cards still command a hefty premium, despite the recent turmoil in the cryptocurrency market. This often makes AMD's products a popular choice for budget gaming rig builders. Additionally, AMD's GPUs are embedded in the Microsoft Xbox Series X|S as well as the Sony PlayStation 5. These factors could help explain why AMD's gaming revenues are growing.
  5. Next-Gen GPUs Using Crazy Amounts Of Power Is Worrying

     Whether it's Nvidia or AMD, the trend of next-generation graphics cards using extremely high amounts of power is worrying. As microchip technology improves, the performance of these chips increases; at the same time, their power usage decreases. So we usually see newer generations of processors either deliver double the performance, or consume half the power at the same performance, compared to their predecessors. This progressive improvement in performance and power usage is always welcome, and companies sometimes balance the two properly so that neither performance nor power efficiency is compromised. But it looks like one important part of that duo is now being thrown out completely. Going by the reports, in the race for performance supremacy, power efficiency is going to take a hit.

     Power Usage Of Next-Gen Graphics Cards To Increase

     If reports are to be believed, Nvidia is going to massively increase the power usage of its graphics cards. The Nvidia GeForce RTX 3090 Ti has a TDP (base power usage) of 350 Watts, and the maximum power it draws does not exceed 450 Watts in almost all cards. However, the Nvidia GeForce RTX 4000 series cards are said to draw more than 800 Watts, which is a massive jump; some reports even suggest 900W. The card most likely to use this much power is the RTX 4090 Ti, which may end up being named the RTX 4090 Titan.

     AMD Radeon Joins The Race

     In an interview with Tom's Hardware, AMD has already confirmed that its next-gen graphics cards, most likely called the Radeon 7000 series, will also see an increase in power usage. That is no surprise: why would AMD want to stay behind when its rival is trying to race away with performance gains at the cost of power usage? AMD, however, might be more conservative with its increase. As reported by Twitter user @Kepler_L2, the AMD Radeon RX 7000 series might have a max TDP of anywhere between 400 and 450 Watts, with the 450 Watts reserved for an entirely new card named the RX 7970 XT3D. The "3D" here is possibly AMD's GPU version of the 3D stacking found in the Ryzen 7 5800X3D CPU. This figure, it must be mentioned, is TDP, not max power; the max power could be closer to Nvidia's offerings, though not as high.

     Outcomes Of Such A Power Usage Increase

     This increase in graphics card power consumption poses two big problems: one environmental and one practical. As far as the environment is concerned, the world has faced massive fuel and power shortages this year. Many countries lacked the fuel required to generate electricity and faced massive power cuts in various places. In Europe, too, many houses will lack access to heating fuel in the later part of this year thanks to the war. These cards should not become replacement room heaters; their higher power consumption leads to higher heat output, which is bad for the environment and for the product itself.

     Across The Board Hike In Power Usage

     Coming back to the more practical, technological problem at hand: the power increase is not limited to top-level flagship graphics cards. It extends to all models, including the mainstream ones many people can actually afford. This is a huge problem most seem to have ignored.

     [Chart: Nvidia RTX 4000 and AMD RX 7000 rumored power figures. Credit: @Kepler_L2]

     As the chart shows, the power increase has happened across the board. For example, famous Twitter leaker kopite7kimi has reported that the RTX 4060 is going to consume more power than the RTX 3070. For the record, the RTX 3070 has a TDP of 220 Watts; compared to the RTX 3060, which has a TDP of 170 Watts, that would be an increase of 50 Watts. AMD's mainstream offerings don't seem far behind either: while the RX 6600 came with a TDP of 132 Watts, the RX 7600 is expected to have a TDP of 200W, a massive 68 Watt increase.

     Worrying Trend

     This trend of increasing power consumption to increase performance has to stop. In recent years we were starting to see more power-efficient cards, but the trend now seems to have reversed. Not only is this bad, but the question arises: when will it stop? Later generations of cards might use even more power. Most users would be least concerned as long as they have the latest and fastest performance at hand. Let's also not forget that people will need newer, more powerful power supplies to support these cards. 1200W or 1600W PSUs might not be enough; flagship cards might require even a 2000W PSU when paired with a similarly powerful CPU and other components. Whether people agree or not, all this is going to cause problems for users. Maybe not outright and visible, but a worrying trend to say the least.
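     To put the generational jumps above in relative terms, a quick sketch (the TDP figures are the official and rumored numbers quoted in the article):

```python
# Official and rumored TDPs (Watts) quoted in the article
tdp = {
    "RTX 3060": 170, "RTX 3070": 220,   # current-gen Nvidia
    "RX 6600": 132,  "RX 7600": 200,    # AMD, current vs rumored next-gen
}

def pct_increase(old_w: float, new_w: float) -> float:
    """Percentage increase in power draw between two cards."""
    return (new_w - old_w) / old_w * 100

# If the rumored RTX 4060 really lands at or above the RTX 3070's 220W,
# that is at least this much over the RTX 3060:
print(round(pct_increase(tdp["RTX 3060"], tdp["RTX 3070"]), 1))  # 29.4
# AMD's rumored mainstream jump is even steeper in relative terms:
print(round(pct_increase(tdp["RX 6600"], tdp["RX 7600"]), 1))    # 51.5
```

     Framed this way, the rumored mainstream AMD jump (over 50%) is proportionally larger than the Nvidia one, even though the absolute wattages are lower.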
  6. Following Nvidia RTX 4000, AMD allegedly bumping up power draw for RX 7000 (RDNA 3) too

     We are pretty late into the current GPU cycle, which means both AMD and Nvidia's next-gen cards are nearly here. For a while now, there have been rumors that Nvidia is purportedly increasing the power consumption of its upcoming RTX 4000 Ada Lovelace GPUs, with some whispers of it reaching up to an insane 900W. Following that, it looks like AMD too is allegedly increasing the power draw on its upcoming RX 7000 series cards based on the RDNA 3 architecture (Navi 3X).

     The report comes via YouTuber RedGamingTech (RGT), who says that the Radeon team is increasing the total graphics power (TGP) of the top-end Navi 31 chip from 375W to 405W, a bump of 8%. Meanwhile, there will apparently also be another, more efficient SKU based on this chip that retains the 375W TGP.

     RGT has also provided purported specification details for the flagship SKU as well as others below it. Apparently it will come with 84 Compute Units (CUs), or 42 Workgroup Processors (WGPs). However, the flagship is apparently named the 7950 XT this time around instead of the expected 7900 XT. Meanwhile, previous rumors have alleged that the top SKU could have an even higher core count, coming in at 60 WGPs or 120 CUs. Regardless, the performance is expected to be very impressive, as Radeon is expected to move from two SIMD32 units per CU up to four with RDNA 3. Alongside that, the memory subsystem is also expected to get a big overhaul, with V-Cache coming into the mix and allegedly taking the Infinity Cache amount of the top chip up from 128MB to a massive 384MB.

     Overall, the performance of the RX 7000 series is expected to be around double that of the RX 6950 XT, the current AMD flagship, and should put up a good fight against the Nvidia RTX 4090, which is also expected to be a behemoth. AMD confirmed recently that the RDNA 3 GPUs are launching next quarter, and the launch of the Nvidia RTX 4000 series is also not too far away.

     Source and image: RGT (YouTube)
  7. Next-gen Nvidia and AMD graphics cards are one step closer to release

     New EEC filings bring us closer to seeing the RTX 4090. Two new filings with the Eurasian Economic Commission (EEC) revealed that the next-gen Nvidia and AMD graphics cards will most likely launch soon. These include the oft-rumored RTX 4090 and RTX 4090 Ti models. According to the filings, which were reported by Digital Trends, not only are the Nvidia Lovelace RTX 40-series listed in many variations, but there are also more listings for the 30-series, particularly the RTX 30 Super series. Meanwhile, the AMD RX 7000 variations are also listed on their own page.

     The EEC is a trademark registry that often reliably reveals product details, but with that said, there are still no guarantees these models will all reach the market. At the very least, we know that these trademarks indicate the manufacturers are either developing these products or hedging their bets and securing the trademarks in case they want to release one at a later date. The registration is from manufacturer Afox, a Foxconn subsidiary based in Taiwan.

     Analysis: What does this mean for Team Red?

     AMD and Nvidia graphics cards debuting on the EEC usually indicate a rapidly approaching release date, which corresponds to the rumored October release date for the RTX 4090. Furthermore, the fact that both the AMD and Nvidia graphics cards were registered at the same time most likely means they'll launch around the same time in October too. This would be excellent news for buyers, since the market would be especially competitive and prices could be driven down. Considering that AMD trailed Nvidia by months in its last graphics card launch, which might have contributed to it grabbing a far smaller chunk of the market share this generation, it's honestly a given that this time around the tech giant will want to avoid a late launch at all costs.
  8. Intel shares official Arc A750 GPU benchmarks showing better than RTX 3060 performance

     While the Intel CPU division works on the launch of its 13th Gen Raptor Lake CPUs, its graphics division is perhaps working on the far more daunting task of getting the first-gen Intel Arc "Alchemist" GPUs out. So far, the company has managed to release its entry-level Arc A380 desktop card, albeit only in China. Now, the company has also begun teasing its higher-end A750 offering. On paper, this one could be up to three times faster than the Arc A380, as the A750 purportedly features 24 Xe-cores while the A380 has 8.

     The company has also shared performance numbers for the Arc A750 GPU, and according to Intel, the A750 is often significantly faster than Nvidia's RTX 3060. Although numbers compared to AMD weren't released, this performance looks to be roughly equivalent to the Radeon RX 6650 XT. Performance across only five titles was shown, and the company will likely share more numbers later. Although only the percentage advantage for the Arc A750 was shown here, Intel did say that in Cyberpunk 2077, using these same settings, the A750 was able to achieve close to 60 fps on average.

     Source: Intel Graphics (YouTube)
  9. Nvidia launches GeForce GTX 1630, its cheapest modern graphics card

     After several leaks, Nvidia has officially unveiled (via VideoCardz) the GTX 1630, a new entry-level graphics card for light gaming and everyday computing tasks, with a $150 price tag (approximate price in China; global pricing will be unveiled later). The Nvidia GeForce GTX 1630 is the first x30-series model under the GTX branding. Still, do not expect much from this puny graphics card, even though it received the letter X in its name.

     The GTX 1630 features the Turing TU117-150 core with fewer CUDA cores than the GTX 1650. Nvidia has also cut its memory bus in half, from 128-bit to 64-bit, which might become a problem in modern games. Early benchmarks suggest the GTX 1630 will be slower than the GTX 1050 Ti that Nvidia introduced six years ago. In other aspects, the GTX 1630 is similar to the GTX 1650, the former entry-level GPU in Nvidia's current non-RTX lineup. The new graphics card has the same 4 GB of GDDR6 memory, and TDP is 75W, which means many variants will not require additional power.

                        GTX 1630          GTX 1650          Radeon RX 6400
     Chip               Turing TU117-150  Turing TU117-300  Navi 24
     Clocks             1785 MHz          1590 MHz          2039 MHz
     CUDA Cores         512               896               Not Applicable
     Memory             4 GB GDDR6        4 GB GDDR6        4 GB GDDR6
     Memory Bus         64-bit            128-bit           64-bit
     Memory Clock       12 Gbps           12 Gbps           16 Gbps
     Memory Bandwidth   96 GB/s           192 GB/s          128 GB/s
     TDP                75W               75W               53W

     Do you think $150 is a reasonable price for what the GTX 1630 offers?
  10. Miners flood market with GPUs they no longer need as cryptocurrencies crash

      As the cryptocurrency market goes through one of its worst nosedives in recent years, miners are trying to get rid of their mining hardware. Due to the crashing prices of popular crypto coins, numerous Chinese miners and e-cafes are flooding the market with graphics cards they no longer need. Miners, e-cafes, and scalpers are now trying to sell their hardware stock via streams and auctions. As a result, users can snag a second-hand GPU, such as the RTX 3060 Ti, for $350 or even less. Many popular graphics cards going for MSRP or below is quite a sight to behold after the astronomically high prices and scarce availability of the last two years.

      As tempting as it might be to snag a powerful Nvidia or AMD GPU for below MSRP, it is not the best idea to go after a graphics card that went through seven rings of mining hell. Potential buyers should be aware that mining GPUs are often not in their best condition after spending months in always-on, always-100% mode. With manufacturers increasing their supply and prices going down like never before, you may be better off spending a little more for a new graphics card with a warranty and peace of mind. As a bonus, you can enjoy the view of scalpers desperately trying to get at least some money for their stock.
  11. As cryptocurrency tumbles, prices for new and used GPUs continue to fall

      AMD's Radeon RX 6000 series GPUs, in particular, are easy to find below MSRP.

      Cryptocurrency has had a rough year. Bitcoin has fallen by more than 50 percent since the start of the year, from nearly $48,000 in January to just over $20,000 as of publication. Celsius, a major cryptocurrency "bank," suspended withdrawals earlier this week, and the Coinbase crypto exchange announced a round of layoffs this past Tuesday after pausing hiring last month. It may be small comfort to anyone who wanted to work at Coinbase or spent hard-earned money on an ugly picture of an ape because a celebrity told them to, but there's some good news for PC builders and gamers in all of this.

      As tracked by Tom's Hardware, prices for new and used graphics cards continue to fall, coming down from their peak prices in late 2021 and early 2022. For weeks, it has generally been possible to go to Amazon, Newegg, or Best Buy and buy current-generation GPUs for prices that would have seemed like bargains six months or a year ago, and pricing for used GPUs has fallen further. As Tom's Hardware reports, most mid-range Nvidia GeForce RTX 3000-series cards are still selling at or slightly over their manufacturer-suggested retail prices—the 3050, 3060, and 3070 series are all still in high demand. But top-end 3080 Ti, 3090, and 3090 Ti GPUs are all selling below their (admittedly astronomical) MSRPs right now, as are almost all of AMD's Radeon RX 6000 series cards.

      Used prices have fallen even more quickly. Between June 1 and June 15, eBay prices for used GPUs fell an average of 10 percent as at least some cryptocurrency miners sought to cut their losses and sell their hardware. This is happening even as mining software is beginning to find ways around Nvidia's hash-rate-limiting LHR protections—falling cryptocurrency prices and rising energy costs are still making the economics of mining tricky.

      That said, buyers of used GPUs should still proceed with caution. Aside from the scams and bait-and-switches that can come with any high-value eBay purchase, GPUs that have been mining cryptocurrency at full tilt for months or years may have problems that a new GPU (or a pre-owned GPU that was only used to play games) wouldn't have. The heat generated by constant use in a high-density mining farm can degrade performance (though GPU manufacturers have overstated this risk in the past), as can dust or dried-out thermal paste. If you buy a used GPU that looks dirty or runs hot, removing and cleaning the fan and heatsink and reapplying fresh thermal paste can help restore lost performance and extend the card's life span.

      If you enjoy struggling to buy a GPU, things might get more interesting for you soon. Nvidia's RTX 4000-series GPUs are reportedly nearing release, and manufacturing and supply chain issues could conspire to keep these new cards scarce.
  12. Intel releases Arc A380 desktop GPU and it's better than AMD and Nvidia's offerings

      Intel has kept its promise. Last month, the company stated that it would release the first Arc desktop GPUs in China, and it has kept its word by unveiling the new Arc A380. The Arc A380 is an entry-level desktop graphics card from Intel's Arc 3 lineup; in China, the card is priced at 1,030 yuan (with VAT), which is around $153. At this price, the card claims to offer around 25% better performance than the $160 RX 6400 from AMD. The card also packs 6GB of VRAM instead of 4GB, which makes it, on paper, a better alternative to Nvidia's GTX 1650 SUPER, although currently all 1650 variants are overpriced. The full specs for the Arc A380 are given below:

      GPU Specifications
        Xe-cores: 8
        Render Slices: 2
        Ray Tracing Units: 8
        Intel® Xe Matrix Extensions (Intel® XMX) Engines: 128
        Xe Vector Engines: 128
        Graphics Base Clock: 2000 MHz
        TDP: 75 W

      Memory Specifications
        Memory Size: 6 GB
        Memory Type: GDDR6
        Memory Interface: 96 bit
        Memory Bandwidth: 192 GB/s
        Memory Speed: 16 Gbps

      Intel has provided performance numbers for the Arc A380. If you can't read Chinese, that's alright: on its Performance Index page, Intel has explained what the graph shows. It reads:

        Claim: Intel® Arc™ A380 graphics delivers above 60 FPS performance at 1080p across popular games - Game workloads that support this claim are Naraka Bladepoint, PUBG, World of Warcraft: Shadowlands, Apex Legends, Fortnite, Overwatch, Counter-Strike: Global Offensive, League of Legends, NiZhan, Dota 2, all run with medium settings at 1080p resolution.

      The company is essentially driving home the point that the Arc A380 can offer playable 60+ fps in most popular titles at respectable settings. The company has also provided more performance-related figures with Intel's XeSS image upscaling technology, as well as AV1 encoding. Interestingly, the page says the tests were run using an Arc driver version that isn't optimized for the A380, so with the optimized 1736 driver version, performance may end up being a bit better.

      Source: VideoCardz via The Register

      Edit: More details about Intel's Arc A380 performance claims are added below:

        Claim: The Intel® Arc™ A380 GPU, with a recommended customer price of 1,030 yuan, delivers up to 25% better performance per yuan than available competitive offerings as measured by performance on a selection of popular games.

        GPU(s): Intel® Arc™ A380 6GB reference card; AMD Radeon RX 6400 4GB

        System Configuration:
        Intel® Arc™ A380 configuration: Graphics Driver:, Processor: Intel® Core™ i5-12600K, MSI PRO Z690-A WIFI DDR4, BIOS: 1.3, Memory: 32GB (2x16GB) DDR4 @ 3200MHz, Storage: MP600 PRO XT 4TB, OS: Windows 11 Version 10.0.22000.675
        AMD Radeon RX 6400 configuration: Graphics Driver: 30.0.15021.11005, Processor: Intel® Core™ i5-12600K, MSI PRO Z690-A WIFI DDR4, BIOS: 1.3, Memory: 32GB (2x16GB) DDR4 @ 3200MHz, Storage: MP600 PRO XT 4TB, OS: Windows 11 Version 10.0.22000.675

        Measurements: All FPS (frames per second) scores are either measured with PresentMon or an in-game benchmark. All gameplay has a documented workload running the same replay or game scenario across all configurations and test runs. Game workloads that support this claim are Naraka Bladepoint (27%), JX Remake (27%), F1 2021 (26%), Rust (24%), Total Saga: Troy (22%), The Witcher 3 (22%), Arcadegeddon (21%), Metro Exodus (18%), NiZhan (16%), Wolfenstein: Youngblood (15%), Destiny 2 (14%). All games tested at 1080p resolution and medium settings presets.

        Period: Pricing and testing as of June 12, 2022. Radeon RX 6400 pricing of 1,199 yuan on JD.com.

        Source: Intel
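      Intel's "performance per yuan" metric above is simply fps divided by price. A minimal sketch of how such a comparison works (the 60 fps figure is a made-up placeholder, not Intel's data; the prices are the ones quoted above):

```python
def perf_per_yuan(fps: float, price_yuan: float) -> float:
    """Frames per second delivered per yuan spent."""
    return fps / price_yuan

def advantage(fps_a: float, price_a: float, fps_b: float, price_b: float) -> float:
    """Relative perf-per-yuan advantage of card A over card B, as a fraction."""
    return perf_per_yuan(fps_a, price_a) / perf_per_yuan(fps_b, price_b) - 1

# Prices quoted in the article: A380 at 1,030 yuan, RX 6400 at 1,199 yuan.
# Even at identical fps, the A380's lower price alone yields a ~16% advantage:
print(round(advantage(60, 1030, 60, 1199), 3))  # 0.164
```

      This makes clear why a value metric flatters the cheaper card: part of Intel's quoted 14-27% per-game advantage comes from the price gap alone, with the rest from raw fps.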
  13. Specs of the upcoming Nvidia GTX 1630 leak with launch date set for May 31

      A leak from reliable sources recently spilled the beans on a new upcoming entry-level graphics card from Nvidia. The company is about to unveil the GTX 1630, its first x30-class graphics card since 2017. While we wait for Nvidia to announce its latest budget-friendly GPU, VideoCardz spoiled the surprise by publishing leaked specs of the GTX 1630.

      The GTX 1630 should replace the Pascal-based GT 1030, which Nvidia announced five years ago. The new model reportedly has the 12nm TU117-150 die based on the Turing architecture that powers the GTX 16 and RTX 20 lineups. It will rival the integrated graphics in Intel and AMD processors, plus the recently announced budget-friendly and somewhat lame AMD RX 6400. Compared to the GTX 1650, the GTX 1630 offers fewer CUDA cores, half the memory bus, and lower bandwidth. At the same time, it will feature much higher boost clocks, which might be why the TDP is unchanged at 75 W.

                         GTX 1630      GT 1030                 GTX 1650          Radeon RX 6400
      Architecture       Turing, 12nm  Pascal, 14nm            Turing, 12nm      RDNA 2, 7nm
      CUDA Cores         512           384                     896               not comparable
      Boost Clocks       1800 MHz      1468 MHz                1590 MHz          2321 MHz
      Memory             4 GB GDDR6    2 GB GDDR5 / 2 GB SDDR4 4 GB GDDR6        4 GB GDDR6
      Memory Bus         64-bit        64-bit                  128-bit           64-bit
      Memory Clock       12 Gbps       6 Gbps                  12 Gbps           16 Gbps
      Memory Bandwidth   96 GB/s       48 GB/s                 192 GB/s          128 GB/s
      TDP                75 W          30 W                    75 W              53 W
      MSRP               TBA           $79 (at launch)         $149 (at launch)  $160

      VideoCardz claims Nvidia is set to launch the GTX 1630 on May 31. The graphics card will be the first x30-series GPU to leave the GT lineup and move under the GTX brand.

      Source: VideoCardz
  14. Nvidia is reportedly developing a new entry-level graphics card in the form of the GTX 1630, though X30 cards from Nvidia have generally carried the GT moniker instead of GTX. The report comes via VideoCardz, and the site adds that Nvidia is looking to replace the GTX 1050 Ti with this card, which leads the site to speculate that the GTX 1630 will be a sub-$200 graphics card. In terms of performance, the report expects the 1630 to be slower than the three-year-old GTX 1650, though precise performance details aren't available at the moment. This means that Nvidia will be launching a last-gen Turing-based GPU in 2022.

That said, it should be enough to compete with the AMD Radeon RX 6400, a Navi 24 GPU that is hamstrung by its lack of PCIe lanes, though to a lesser degree than the RX 6500 XT. Alongside the RX 6500 XT and 6400, AMD is also allegedly preparing a third Navi 24 variant in the form of the 2GB RX 6300. The $159 RX 6400 on a PCIe 3.0 system keeps up with the GDDR5 variant of the GTX 1650, which means a GTX 1630 with GDDR6 on board may be able to keep up with the RX 6400, provided it isn't massively cut down from the 1650.

Source: VideoCardz

Report: Nvidia prepping GTX 1630 based on last-gen tech to take on AMD's budget cards
  15. At the end of March earlier this year, Intel took the wraps off its highly anticipated Arc discrete graphics cards. Initially, mobile GPUs will be arriving (image above), followed by desktop parts later in the year. But perhaps to the disappointment of many people worldwide who were looking forward to Intel's discrete GPUs, the firm has confirmed that it is only in certain Asian markets that the mobile and desktop Alchemist (the first-gen Arc codename) SKUs will be available, and only via OEMs and system integrators.

Still, if you are looking forward to getting an Arc desktop GPU later in the year when it's finally available, you can already plan ahead from today and decide which SKU you would prefer to buy. That's because the entire Arc Alchemist A-series lineup has leaked courtesy of the latest Intel beta driver. As you can see in the image below, a list of unreleased Arc models has been listed by Intel, alongside the couple of already-available models. Intel Arc, like the Core series of CPUs, will be available in three flavors. The entry-level parts will comprise the Arc 3 (A3) lineup, and similarly, the higher SKUs will be under Arc 5 (A5) and Arc 7 (A7).

This isn't the first time Intel's drivers have provided insight into future unannounced products. Information on the alleged future Intel Arc Elasti architecture also leaked out via a driver. Elasti might be the successor to Intel's fourth-gen Arc design, codenamed Druid.

Via: momomo_us (Twitter)

Entire Arc Alchemist GPU lineup has leaked thanks to Intel's own driver
  16. As the launch date nears for the upcoming AMD RDNA 2 refresh, more and more leaks spill out for the new Radeon RX 6000 cards. First, we got alleged official benchmarks for the flagship RX 6950 XT in the form of synthetic tests. Afterwards, alleged gaming numbers, including ray tracing, also leaked. Today, vBIOS (VGA BIOS) details for the 6950 XT were spilled inadvertently by TechPowerUp, which maintains a database of vBIOSes for various Radeon and GeForce graphics cards. While the link has since been pulled from TechPowerUp's site, a Chiphell forum member was quick to spot it, and VideoCardz managed to take screenshots of two of the 6950 XT's aftermarket AIB variants.

The leaked vBIOS data reveals two interesting details about the upcoming Radeon RX 6950 XT. First, the new GPU is not based on the Navi 21 XTXH chip which AMD has previously used for its top-end enthusiast 6900 XT variants. Instead, this is a new chip apparently dubbed "Navi 21 KXTX". The memory subsystem on the new KXTX chip is seemingly more versatile than that of the previous 6900 XT GPU, as it apparently supports both Samsung and Hynix GDDR6 memory, as opposed to just Samsung GDDR6. Outside of minor performance differences, sourcing the memory from two vendors should help AMD maintain decent stock of the Radeon RX 6950 XT, which in turn should help keep the price of the GPU low.

Source: TechPowerUp via ljy1414 (Chiphell forum) | Image via VideoCardz

Leaked AMD RX 6950 XT vBIOS suggests the card is more versatile than 6900 XT
  17. Intel has already announced its Arc Alchemist GPU lineup for laptops and notebooks. The lineup for both laptops and desktops will be segmented into three parts, similar to how Intel markets its Core i-series processors. Hence, there will be Arc 3 for entry-level, Arc 5 for mainstream, and Arc 7 for high-end parts. Intel says it will release Arc Alchemist desktop GPUs in Q2 of this year, and if you are someone who is expecting Intel to completely disrupt the GPU market like Zen did when it launched back in 2017, you may be in for some disappointment.

According to a Wccftech report, Intel is planning to position some of its Arc SKUs in similar price brackets as comparably performing Nvidia GeForce or AMD Radeon GPU models. In fact, some of the MSRPs may even be higher if we take Nvidia's MSRPs as reference. The alleged launch prices for the Intel Alchemist GPUs are given in the table below:

Arc SKU   Comparable GeForce   Comparable Radeon    Alleged MSRP   Alleged release
A750      RTX 3060             RX 6600 / XT         $350           May-June
A580      RTX 3050             RX 6600 / 6500 XT    $280           May-June
A380      GTX 1650             RX 6400              $150           July

While these prices seem decent at first glance when compared to current GPU prices, it must be kept in mind that graphics cards are gradually getting cheaper, and by the time the Arc desktop lineup is out, it may be too late, as AMD and Nvidia may already be starting to launch their next-gen RDNA 3 RX 7000 series and Ada Lovelace RTX 4000 series GPUs. In case you are wondering what the specifications of Alchemist will look like, SiSoftware had already leaked some of that in its early review of Arc, where it noted that the performance of Intel Arc was "nothing special".

Source: Wccftech

Intel's upcoming Arc Alchemist GPUs may be more expensive than AMD and Nvidia's
  18. AMD Radeon fans could be in for a massive surprise in a good way, as the company may be working on some insanely powerful next-gen RDNA 3 GPUs. According to the latest report by Twitter leakster Greymon55, the top-of-the-line RDNA 3 (Navi 31) graphics card could come in with close to 92 TFLOPs of single-precision (FP32) compute power. This number is nearly three times more than that of AMD's current best offering, the RX 6900 XT, which is capable of delivering 23.04 TFLOPs of FP32.

The 92 TFLOP number is reached using a 3GHz boost clock for the next-gen flagship Navi 31 GPU, which might be the Radeon RX 7900 XT. Navi 31 is rumored to feature a total of 120 Workgroup Processors (WGPs) or 240 Compute Units (CUs) for a total of 15,360 Stream Processors. This is how the rumored specifications compare with the RX 6900 XT:

                             RX 6900 XT   RX 7900 XT
Workgroup Processors (WGP)   40           120
Compute Units (CUs)          80           240
Stream Processors            5,120        15,360
Boost Clock                  2,250 MHz    3,000 MHz (?)
FP32 TFLOPs                  23.04        ~92

Additionally, earlier today, a driver leak suggested that AMD's RDNA 3 architecture is moving to a four-SIMD32 design per CU, up from two SIMD32 units. Hence, the total final performance of the 7900 XT may even end up being more than four times that of the RX 6900 XT.

Source: Greymon55 (Twitter)

AMD's RX 7900 XT could be more than four times as fast as the current flagship RX 6900 XT

EDIT: Title changed from "...more than four times faster..." to "...four times as fast..." to correct this mathematically challenged journo. (92 - 23 = 69, i.e. three times faster)
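The TFLOPs figures in the table can be reproduced with the usual shader-throughput formula: each stream processor executes one fused multiply-add (two FP32 operations) per clock. A quick check in Python:

```python
def fp32_tflops(stream_processors, boost_clock_ghz):
    # 2 FLOPs per stream processor per clock (FMA), result in TFLOPs
    return stream_processors * 2 * boost_clock_ghz / 1000

rx_6900_xt = fp32_tflops(5120, 2.25)  # 23.04 TFLOPs
navi_31 = fp32_tflops(15360, 3.0)     # 92.16 TFLOPs, i.e. "close to 92"
```

This also makes the ratio explicit: 92.16 / 23.04 = 4, so the rumored Navi 31 is four times as fast, which is three times faster.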
  19. An intriguing if weak option for low-profile mini PCs The Sapphire Pulse RX 6400 is one of the first low-profile RDNA2 GPUs. If you’re looking for a serious gaming graphics card, look elsewhere — reviewers absolutely dragged the $199 AMD Radeon RX 6500 XT in January, and today’s GPU entry is even weaker. But if you absolutely, positively need to fit a miniature graphics card in a very small PC, the new RX 6400 might be worth a look. Today, AMD has quietly launched the Radeon RX 6400 with an array of partners including ASRock, Biostar, Gigabyte, MSI, PowerColor, Sapphire, and XFX, and we’re actually seeing a few of them retail for that price or close to it, GPU shortage be damned. There’s a $159.99 ASRock Challenger and a $169.99 XFX Speedster SWFT105 in stock at Newegg right now — with a $159.99 Sapphire Pulse card also on the way. Intriguingly, every one of these cards appears to be a miniature model, and many of them are single-slot, low-profile GPUs that can fit in much narrower cases. (If you’re not familiar with low-profile GPUs, check out the pics above and below — they typically come with a shorter, swappable metal PCI-Express bracket like the one you see on the Sapphire card at the top of this post; the XFX card below is showing off its longer bracket for scale, but it should come with both.) This XFX card looks short, and it’ll be shorter if you attach its low-profile PCIe bracket. Mind you, there’s a reason these cards don’t need to be big! They’re only rated at 53W of power, less than half the power of even the lackluster RX 6500 XT, with only 12 compute units (down from 16), lower clocks, slower RAM, only 128GB per second of bandwidth, and just two display outputs. (Some of this is less surprising when you consider that its Navi24 GPU was originally designed for laptops.) On the plus side, you don’t need an extra power connector: the card can draw all its power from the PCI-Express slot. 
According to one early video review, it looks like it will struggle with games like Elden Ring and Cyberpunk 2077 even at 1080p and on low settings but should be fine for the likes of Fortnite. Know that while all RX 6400 models we’ve seen so far are miniature (big hat tips to VideoCardz and Wccftech for rounding up most of them), not all of them are low-profile. Many are two slots wide, some have two fans, and Gigabyte even has a model (below) that’s all of the above. This Gigabyte model is squat but seemingly long and wide. AMD quietly launches new space-saving RX 6400 graphics cards for $159
  20. Best cheap graphics cards 2020: the top graphics cards on a budget

Get the best cheap graphics cards for your dollar (Image credit: Future)

Ah, the graphics card. Whether you're a dedicated esports enthusiast or a creative working with ultra-high-definition visuals, the graphics card is possibly the most important component in your computer. Unfortunately, it's also priced as such, making the search for the best cheap graphics card a difficult one. Still, there are options out there for those without a whole lot of money to spend. Between Nvidia and AMD, there are several quality GeForce and Radeon graphics cards out there that technically count as affordable if you know where to look. Fortunately, we do, and we're here to help you find the best cheap graphics card to fit your needs – and your budget.

Best cheap graphics card at a glance
AMD Radeon RX 5700
Nvidia GeForce GTX 1660 Super
AMD Radeon RX 5600 XT
Gigabyte GeForce GTX 1660 OC 6G
AMD Radeon RX 5500 XT

(Image credit: AMD)

1. AMD Radeon RX 5700
AMD's best cheap graphics card all around
Stream Processors: 2,304 | Core base clock: 1,465 MHz | Core boost clock: 1,725 MHz | Memory: 8 GB GDDR6 | Memory clock: 14 Gbps | Power connectors: 1x 8-pin and 1x 6-pin | Outputs: 1 x DisplayPort 1.4 DSC, 1 x HDMI with 4K60 support
+ Excellent performance
+ 1440p gaming at Ultra/Max settings
- No ray tracing
- Just barely "affordable"

When it comes to "budget" AMD graphics cards, the Radeon RX 5700 is about as good as it gets. Capable of delivering 1440p gaming on Ultra or Max settings, you're going to be hard-pressed to find a better mid-range graphics card at this price point. Some sacrifices had to be made for affordability, however – and it is just barely affordable. The Radeon RX 5700 doesn't come with ray tracing, so the budget-minded will have to wait a little while longer before that feature makes it into even the best cheap graphics cards on the market. Maybe next year.
Read the full review: AMD Radeon RX 5700

(Image credit: Nvidia)

2. Nvidia GeForce GTX 1660 Super
A powerhouse for 1080p gaming
Stream Processors: 1,408 | Core base clock: 1,520 MHz | Core boost clock: 1,785 MHz | Memory: 6 GB GDDR6 | Memory clock: 14 Gbps | Power connectors: 1x 6-pin | Outputs: 1 x DisplayPort 1.4a, 1 x HDMI 2.0b, DL-DVI
+ Excellent 1080p performance
+ Affordable
- No RT cores
- Limited ports

Replacing the Nvidia GeForce GTX 1660, the Nvidia GeForce GTX 1660 Super is absolutely one of the best cheap graphics cards on the market right now. It is able to crank out nearly 80 FPS in Middle-earth: Shadow of War on Ultra graphics settings at 1080p and even manages a decent 54 FPS at 1440p. This is incredible considering it comes in at under $250 (£200, AU$400). Still, the Nvidia GeForce GTX 1660 Super does have its drawbacks. It still goes with a DVI port in lieu of a second HDMI port (or even a USB-C port), and while it does have a DisplayPort, you won't be running several displays with this card. It also lacks ray-tracing cores, but that's not surprising given that this is a budget graphics card. Still, you're going to be hard-pressed to find another graphics card that's as good as the Nvidia GeForce GTX 1660 Super for the price.

Read our full review: Nvidia GeForce GTX 1660 Super

(Image credit: AMD)

3. AMD Radeon RX 5600 XT
Raw performance at a budget price
Stream Processors: 2,304 | Core base clock: 1,355 MHz | Core boost clock: 1,560 MHz | Memory: 6 GB GDDR6 | Memory clock: 14 Gbps | Power connectors: 1x 8-pin | Outputs: 1 x DisplayPort 1.4, 1 x HDMI 2.0
+ Incredible performance
+ Competitive against more expensive cards
- No ray tracing
- Limited ports

With all the cheap 1080p graphics cards out there, it's especially hard for gamers to find the right card for their systems and budget.
In a market flooded with Nvidia offerings, the AMD Radeon RX 5600 XT thankfully stands out as one of the best cheap graphics cards in its class, and can even bloody the noses of some of the more expensive mid-range cards in terms of raw performance. Still, it is an AMD card, so it doesn't have ray tracing, and for a card that costs nearly $300 (about £250, AU$470), it's worth asking whether it's worth paying just a bit more for the Nvidia GeForce RTX 2060, which does have ray-tracing capabilities. If you can get by without the ray tracing, though, the AMD Radeon RX 5600 XT is possibly the best cheap graphics card you're going to find.

Read our full review: AMD Radeon RX 5600 XT

(Image credit: Gigabyte)

4. Gigabyte GeForce GTX 1660 OC 6G
High performance, low price
Stream Processors: 1,408 | Core base clock: 1,530 MHz | Memory: 6 GB GDDR5 | Memory clock: 8 Gbps | Power connectors: 1x 8-pin | Outputs: 3 x DisplayPort 1.4, 1 x HDMI 2.0
+ Turing architecture
+ Excellent performance
- Only GDDR5

While Nvidia GeForce graphics cards tend to shine brightest on the high end, they don't always pack the same performance-to-price value as a solid AMD offering. Fortunately, not every GeForce card is out of reach, and the Gigabyte GeForce GTX 1660 OC 6G brings the latest Nvidia Turing architecture to the budget-minded consumer. With excellent 1080p gaming performance and even some solid 1440p gaming with the right settings, the GeForce GTX 1660 OC is one of the best cheap graphics card options for gamers who want a little bit more from their graphics card without paying a lofty premium.

Read our full review: Gigabyte GeForce GTX 1660 OC 6G

(Image credit: AMD)

5.
AMD Radeon RX 5500 XT
AMD Navi at a budget price point
Stream Processors: 1,408 | Core base clock: 1,717 MHz | Core boost clock: 1,845 MHz | Memory: 8 GB GDDR6 | Memory clock: 14 Gbps | Power connectors: 1x 8-pin | Outputs: 1 x DisplayPort 1.4 with DSC, 1 x HDMI with 4K60
+ Very affordable
+ Solid 1080p gaming performance
- Struggles with the most demanding 1080p games

If you're looking for solid 1080p gaming without spending a ton, you can't go wrong with the AMD Radeon RX 5500 XT. It definitely doesn't aim beyond its reach, sticking to turning out quality 1080p performance and edging out the rival GeForce GTX 1660 when factoring in the price. Capable of pumping out 60 FPS in most AAA titles, the AMD Radeon RX 5500 XT will struggle with Metro Exodus on high settings, much less ultra. But for most games, it'll be tough to find a better graphics card at this price point.

Source: Best cheap graphics cards 2020: the top graphics cards on a budget (TechRadar)
  21. AMD has sold more GPUs than Nvidia, according to this analyst report

Team Green is falling behind, though not by much (Image credit: Future)

Team Red is on fire. It seems like AMD's winning streak won't end anytime soon. After leaked figures from Mindfactory revealed that AMD's Ryzen CPU sales are destroying Intel's, the latest report from Jon Peddie Research is now showing that the Santa Clara company is winning in the GPU market as well. According to JPR's Market Watch Q4 2019 report, AMD saw a 22.6% increase in overall GPU shipments in Q4 2019. This means that AMD now has a 19% share of the GPU market, which is a 3% increase from Q3, while rivals Nvidia and Intel saw 0.97% and 2% drops respectively. That leaves Nvidia with only an 18% share, putting AMD in the lead between the two.

That said, Intel still dominates the market with its integrated and discrete GPUs, taking 63% of the market share in Q4. And Nvidia is still king of the discrete GPU game, taking 73% of discrete GPU shipments in 2019 over AMD's 27%. However, the fact that AMD's GPU sales are steadily going up is still great news for the company. AMD's shipments of discrete graphics in particular progressed to 27% of the market total, up from 26% in 2018 and 24% in Q3 2019. With the highly anticipated "Nvidia killer" Radeon RX 5950 XT just around the corner, those numbers are likely to go higher in 2020. Of course, it's also entirely possible that Intel's promising Xe discrete graphics will only perpetuate Team Blue's dominance, especially in the laptop market.

Good news for the GPU market in general

It's not just AMD that's enjoying the fruits of its labor, however. According to Market Watch, overall GPU shipments increased 3.4% from Q3 2019. The overall attach rate of GPUs to PCs was up by 1.8%, and the number of desktop graphics add-in boards (AIBs) that use discrete GPUs also saw a 12.17% increase in Q4.
Considering that GPU shipments have historically been flat in the fourth quarter, this is excellent news for the graphics card industry. JPR President Jon Peddie even notes that this is "the third consecutive quarter of increased GPU shipments."

It's not all good news, though. With the coronavirus epidemic crippling many of China's factories and thus interrupting the supply chain, Q1 2020 "may show an unusual dip," says Peddie. However, with "Intel's entry into the discrete GPU market and a possible fourth entry by an IP company," 2020 is still going to be an exciting year in the graphics card game.

Source: AMD has sold more GPUs than Nvidia, according to this analyst report (TechRadar)
  22. Intel Xe DG1 graphics card 3DMark leak again suggests AMD and Nvidia won't be troubled

But at least this rumor is more promising than the last leak we saw for DG1 (Image credit: Shutterstock)

Intel's Xe DG1 graphics card has been spotted in a 3DMark benchmark, or at least the rumor mill believes that result is for Intel's first crack at a discrete GPU. As ever, we shouldn't read too much into this given that it is just speculation that this is DG1, although the source is a reliable one, the ever-present TUM_APISAK. But even if the leak is on the money, remember that this is an early sample GPU, and won't reflect the exact performance Intel may achieve with the final product.

At any rate, the purported DG1 graphics card scored 5,538 in 3DMark's Fire Strike test (paired with an Intel Core i9-9900K processor) and hit a graphics score of 5,960. That's not a massively impressive result, but as we've already mentioned, it must be treated with caution. It's in the ballpark of a graphics card as old as the GeForce GTX 750 Ti, albeit a bit faster than that veteran GPU (which scored 5,402 for graphics in a 3DMark result highlighted on Twitter). As Wccftech, which spotted this, observes, it's a fair way behind the GTX 1050, to pick out another example from Nvidia's line-up – that previous-gen budget card is around 500 to 800 points better than the DG1 depending on which 3DMark result you look at.

No cause for concern?

Anyhow, you get the idea – and as with a previous Geekbench result, which showed that the DG1 wasn't much better than Nvidia's low-end MX250, the overall vibe thus far is that Intel's initial product is not going to be causing either AMD or Nvidia any sleepless nights. That said, at least this new 3DMark leak shows the Intel GPU comfortably outdoing the likes of the MX350 – by around a third in terms of that graphics score, in fact.
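The comparisons above are just ratios of Fire Strike graphics scores; for instance, the gap to the GTX 750 Ti works out like this (both scores taken from the leak above):

```python
def pct_faster(score_a, score_b):
    # Fractional advantage of score_a over score_b
    return score_a / score_b - 1

dg1_vs_750ti = pct_faster(5960, 5402)
print(f"{dg1_vs_750ti:.1%}")  # ~10.3%, i.e. "a bit faster" than the 750 Ti
```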
Further, remember that Intel's first GPU is likely to be a testing-the-waters affair, and as we've previously heard via the rumor mill, it's going to be a mobile part – in other words, a graphics card for laptops, not a GPU for a desktop PC. With further development, perhaps it could start to worry Intel's rivals at least in the notebook arena – particularly when combined with the potential of Xe integrated graphics in Intel's Tiger Lake mobile processors.

Intel Xe DG1 graphics card 3DMark leak again suggests AMD and Nvidia won't be troubled
  23. Pixel 5 sees dramatically improved GPU performance after April patch It's not an amazing improvement; the Pixel 5 GPU was just really bad to begin with. Enlarge / Our Pixel 5 came in this funky green version. Ron Amadeo Nearly six months after the release of the Pixel 5, Google has revamped the way the GPU works, bringing the phone up to the level you'd expect from a Snapdragon 765G phone. The April 2021 security update for the Pixel 5 and Pixel 4a (5G) came with a note saying that it includes "performance optimizations for certain graphics-intensive apps & games." Hands-on testing reveals the update apparently dramatically improves the Pixel 5 GPU, reportedly showing performance gains of up to 50 percent in some apps and benchmarks. We don't have a Pixel 5 on hand to test, but Andreas Proschofsky from Der Standard tipped off the Internet that he's seeing "30-50 percent better" performance in 3DMark after the update. Andrei Frumusanu from Anandtech confirmed "performance has been essentially doubled from the scores published [in Anandtech's review], and in line or better than other 765G phones," adding that "the fact it took 6 months is sad, though." Hmm. Yes. It might seem impossible to add 50 percent better performance from a mere software update, but Google is just fixing the terrible state of the launch phone. There was simply that much room for improvement relative to other phones. When we reviewed the device, we called it "the world's slowest Snapdragon 765G phone," noting that other Snapdragon 765G smartphones like the OnePlus Nord could wipe the floor with the device in head-to-head GPU benchmarks. It wasn't a great look for the Pixel 5, which was already facing a switch to mid-range hardware that meant it would be slower than the Pixel 4. Benchmarks allow us to put exact numbers on the changes, but this isn't a trick of benchmarking; the numbers reflected real-world performance when it came to 3D gaming, which was terrible on the Pixel 5. 
Google says the April 2021 security update also comes with camera quality improvements for third-party apps, a fix for an issue that would cause freezing on startup, and a fix for some missing home screen settings. The update should be rolling out now to Pixel phones—just mash that update button. Pixel 5 sees dramatically improved GPU performance after April patch
  24. AMD announces $479 Radeon RX 6700 XT, says it will have ‘significantly more GPUs available’ ‘We know it’s crazy out there, we’re doing everything we can’ AMD has heard you loud and clear: you can’t buy its excellent RX 6800 and 6800 XT graphics cards at anything close to their retail prices. Today, the company’s announcing a new GPU that might (but probably won’t?) change that: the Radeon RX 6700 XT. “With the AMD Radeon RX 6700 XT launch, we are on track to have significantly more GPUs available for sale at launch,” AMD tells The Verge. Even better: AMD claims it’ll begin refreshing stock of RX 6000 GPUs and Ryzen 5000 CPUs every week on its own website, where it’ll sell them at their retail prices. We’ve been waiting for that for nearly two months. The new RX 6700 XT will arrive on March 18th for a suggested retail price of $479. In a normal, sane year, that would slot it between Nvidia’s $500 RTX 3070, which we called “the 1440p sweet spot,” and Nvidia’s bang-for-the-buck $400 RTX 3060 Ti, where you might have to dial down the settings here and there. It’s also a full $100 less than AMD’s $579 RX 6800, which we found had enough oomph for entry-level 4K gaming. This isn’t a sane period for GPU buyers, though. In December, the actual street prices of the $400 3060 Ti, the $500 3070, and the $579 RX 6800 were $675, $819, and $841, respectively — and that was before Trump’s tariffs pushed Nvidia and AMD’s board partners to raise their retail prices. “We know it’s crazy out there, we’re doing everything we can,” says AMD’s Nish Neelalojanan, a director of product management. That not only includes more stock at AMD.com but also additional supply for board partners and manufacturers that’ll sell gaming PCs later on. AMD wouldn’t say how much of that stock is being allocated toward GPUs that’ll be sold at AMD.com, though. It says it doesn’t set its partners’ retail prices either. 
Assuming for a moment that AMD pulls it off, managing substantially greater availability than Nvidia’s recent debut, the RX 6700 XT sounds like it could be a compelling pick. With 230 watts of power, 12GB of video memory, and 40 compute units (compared to 60 for the RX 6800 and 80 for the RX 6800 XT), AMD’s promising you’ll be able to play all of the latest games at maximum settings at 1440p resolution. With a fast enough CPU, AMD suggests you should be able to hit 212 fps in Overwatch, 272 fps in League of Legends, and 360 fps in Rainbow Six Siege, enough for esports gamers to justify some of the fastest monitors on the market. AMD says it should be fast enough for ray tracing at 1440p as well. The company’s early benchmarks (see above) show it pulling ahead of Nvidia’s 3070 and 3060 Ti, though not in all games. It’s worth noting these numbers were generated using the frame rate boost of AMD’s Smart Access Memory (generically known as Resizable BAR), something that’s only just starting to roll out to Nvidia’s graphics cards and generally requires newer CPUs to work. That said, AMD also just announced that it will begin rolling out Resizable BAR to its Ryzen 3000-series processors, not just the newer Ryzen 5000 ones. The new card will require two power connectors, an 8-pin and a 6-pin, and the GPU should be clocked somewhat faster than in earlier RX 6000-series cards at up to 2424MHz. It’s got a 192-bit memory bus, down from 256-bit for the company’s other 6000-series cards. Clearly, we’ll have to test the RX 6700 XT’s performance ourselves, but nothing matters more than availability — and where that availability will leave the card’s actual price by the time you can buy one. AMD claims cards will be available on March 18th from all of the usual board partners, 40 different system builders, and AMD.com, with prebuilt systems including the HP Omen 25L and 30L desktops coming later this spring and beyond. 
While AMD’s own version is a dual-fan card with the same basic reference design as the RX 6800 and 6800 XT, it appears many of AMD’s partners are opting for three-fan designs. Those generally carry a premium price as well. And in case you’re wondering, AMD has no plans to nerf the crypto mining performance of the RX 6700 XT the way Nvidia did for Ethereum with the RTX 3060. “We have no plans to limit the product in that way,” AMD told journalists this week. You can watch the company’s 20-minute presentation, including a tease of Resident Evil Village with ray tracing, in the video below. Update, 11:36AM ET: Added that the HP Omen and other prebuilts will arrive later this spring. AMD announces $479 Radeon RX 6700 XT, says it will have ‘significantly more GPUs available’
  25. Last quarter, we reported that the prices of graphics cards were coming down, as the trend showed an apparent downward movement in retail GPU prices. Sadly, however, it seems things haven't quite materialized the way we thought they would. The latest pricing data from 3DCenter, gathered from the same major German retail outlets, is now showing a slow but steady rise in the asking prices of graphics cards from both the Green (Nvidia) and Red (AMD) camps.

In fact, after the last quarterly report, there was an even steeper decline in prices during late June and early July. This trend, however, did not continue, as prices settled around this level for the next month. Since August, though, retail GPU prices have started to creep back up slowly. Currently, the average pricing for AMD Radeon and Nvidia GeForce cards is 74% and 70% above their respective official MSRPs.

Multiple factors could be affecting GPU prices. First, there is the existing global chip shortage crisis, which has also started to adversely influence GDDR6 memory chip prices, in turn affecting the prices of this generation's graphics cards. There are also reports of fresh COVID outbreaks in China that are seemingly affecting factory production output levels for GPUs.

Source and image: 3DCenter

Report: GPU prices creeping back up after a promising dip in cost last quarter