Intel’s Arc A770 and A750 seemed doomed. Intel has been working toward discrete gaming graphics cards for decades, but the Arc A770 and A750 are the first to make it out of the prototype stage. And despite reports of cancellation, repeated delays, and seemingly interminable issues, the Arc A770 and A750 GPUs have arrived. The surprise is that they’re really good.
With GPU prices on the rise, the Arc A770 and A750 offer a welcome alternative at a lower price. They’ll undoubtedly be surpassed by new Nvidia and AMD GPUs in the same price range when the time comes, but for the next year at least, Intel is a real competitor in the $250 to $350 range.
A note on Intel Arc compatibility
Before I get into the juicy performance testing, a note about compatibility is in order.
According to Intel, the Arc A770 and A750 require a 10th-generation Intel CPU or an AMD Ryzen 3000 CPU or newer. This is because Arc Alchemist cards greatly benefit from Resizable BAR, which is only available on recent processor generations. The cards will function with older CPUs, but performance drops significantly when ReBAR is disabled.
I did not retest my full suite because the differences were obvious after just a few games. In Horizon Zero Dawn, my frame rate dropped by 24% with the A770 at 1080p. In Metro Exodus, the decline was around 19%. Keep in mind that the only difference between those runs was that ReBAR was turned off in the BIOS; everything else was unchanged.
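For clarity, that percentage drop is just the frame-rate difference divided by the ReBAR-enabled baseline. A minimal sketch with illustrative numbers (not my actual measured frame rates):

```python
def percent_drop(fps_with_rebar: float, fps_without_rebar: float) -> float:
    """Frame-rate loss, as a percentage, from disabling ReBAR."""
    return (fps_with_rebar - fps_without_rebar) / fps_with_rebar * 100

# Illustrative values only: a 100 fps -> 76 fps drop works out
# to the ~24% loss observed in Horizon Zero Dawn.
print(percent_drop(100, 76))  # 24.0
```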
That performance gap is large enough to make ReBAR an essential feature. Keeping ReBAR enabled is recommended for optimal performance on modern AMD and Nvidia architectures too, but those cards don’t require it to perform properly.
If you want to use an Arc GPU on an older system that lacks ReBAR compatibility, you will have a far worse experience.
Let’s check Intel Arc A770 and A750 specs
The spec sheets for the Arc A770 and A750 aren’t particularly surprising. The most notable detail is that Intel uses dedicated ray tracing cores, unlike AMD’s RX 6000 graphics cards. As I’ll discuss later in this article, this gives Arc a significant advantage in ray tracing.
Although we still don’t know the price or exact release date for the A580 and A380 cards, I’ve listed their specifications here for reference. The cards we have are Intel’s Limited Edition variants, sold at the prices shown above. These are comparable to Nvidia’s Founders Edition cards; they aren’t genuinely limited edition. Board partners will release their own models, which will be significantly more expensive than Intel’s list prices.
The Arc A770 is the only outlier in the range, since Intel offers it in 8GB and 16GB variants. Apart from memory, the two are identical, and Intel says it will only sell the 16GB Limited Edition model itself. It’s hard to predict how many 8GB cards we’ll see, or whether 16GB models from board partners will command significantly higher prices.
Synthetic and rendering
Let’s start with some synthetic benchmarks before moving on to actual games. The Arc A770 and A750 outperform all rival AMD and Nvidia GPUs in 3DMark’s Time Spy test. That sounds clear-cut; however, Arc has specific 3DMark optimizations, which let the Arc GPUs score better in Time Spy than their game performance would suggest.
In the Port Royal ray tracing benchmark, the Arc A770 performs comparably to the RTX 3060 Ti, while the Arc A750 falls short. AMD’s GPUs lag far behind in Port Royal, which is understandable given the current ray tracing limitations of AMD’s cards.
In rasterized games, AMD’s GPUs perform just as well as Intel’s and can even look like a better deal. Ray tracing turns that narrative on its head. Because they compete on rasterized performance and pull ahead once ray tracing is factored in, the Arc A750 and especially the A770 are legitimate RTX 3060 competitors.
What is Intel Arc XeSS?
In addition to ray tracing, Intel also offers Xe Super Sampling (XeSS) to compete with Nvidia’s Deep Learning Super Sampling (DLSS). XeSS operates much like DLSS: in supported games, the image is rendered at a lower resolution and upscaled using specialized AI hardware built into the GPU, aiming for the closest possible match to native resolution. That’s the concept, but Intel offers a twist: XeSS doesn’t require an Intel graphics card, whereas DLSS requires an Nvidia RTX GPU.
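The render-then-upscale idea is simple arithmetic: for a given scale factor, the GPU renders at the output resolution divided by that factor along each axis. A quick sketch using typical upscaler ratios (these are common industry figures, not Intel’s confirmed XeSS presets):

```python
def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution actually rendered before upscaling to (out_w, out_h)."""
    return round(out_w / scale), round(out_h / scale)

# Assumed preset ratios for illustration, not official XeSS figures.
presets = {"quality": 1.5, "balanced": 1.7, "performance": 2.0}

for name, scale in presets.items():
    w, h = render_resolution(3840, 2160, scale)
    print(f"{name}: renders {w}x{h}, upscales to 3840x2160")
```

At a 2.0x “performance” ratio, a 4K output is rendered internally at just 1920x1080, which is where the frame-rate gains come from.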
How is Intel accomplishing this? XeSS comes in two versions, and you automatically get whichever one your GPU supports. The primary version targets Arc GPUs and uses Arc’s dedicated XMX AI cores to accelerate a more advanced upscaling model. The alternative uses DP4a instructions and is based on a simpler upscaling model. Both versions use AI, but DP4a hardware can’t handle the complexity the XMX cores can, so AMD and Nvidia GPUs run the simpler model.
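For the curious, a DP4a operation computes a dot product of four pairs of 8-bit integers and adds the result to a 32-bit accumulator — the basic building block for running a quantized AI model without dedicated matrix hardware. A pure-Python sketch of those semantics (illustrative only, not Intel’s implementation):

```python
def dp4a(a: list[int], b: list[int], acc: int) -> int:
    """Emulate a DP4a op: a 4-way int8 dot product accumulated into int32."""
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

# One tiny step of what a quantized upscaling model does, many times over:
print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 0))  # 70
```

The XMX cores instead perform full matrix-multiply operations per cycle, which is why they can afford a heavier upscaling model than the DP4a path.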
XeSS isn’t a DLSS slayer yet. Even on Intel’s GPUs, the image still needs some cleanup work after upscaling, and the DP4a version needs a major overhaul to compete with what temporal supersampling techniques can already deliver.
I have to question whether Intel’s strategy is the right one. In terms of performance and image quality, AMD’s FidelityFX Super Resolution (FSR) 2.0 is comparable to DLSS and works on any vendor’s GPU without AI hardware. With enough optimization, XeSS could eventually rival DLSS, but a general-purpose temporal super-resolution feature like FSR 2.0 or the one in Unreal Engine 5 would have delivered broader benefits right away.
So, should you buy the Arc A770 or A750?
It’s difficult to justify the RTX 3060 at the going rate. You’ll pay a $50 to $100 premium over Intel’s options for marginally better ray tracing and DLSS. XeSS looks like a promising DLSS alternative, even though Intel still has to tune it to reach its full potential. Performance-wise, the Arc A770 easily surpasses the RTX 3060, while the A750 lags slightly behind.
One thing is certain, though: Intel’s entry into the GPU market is guaranteed to make an impression, and hopefully, Team Blue will be a third rival for years to come. The final decision will come down to whichever GPU you can find and at what price.