What does the GeForce GTX 2080's board look like before NVIDIA's powerful core is mounted on it? The answer may now be in front of you
The days go by slowly, and NVIDIA's long-awaited August 20 launch is already on the horizon - which is also reflected in a growing number of interesting leaks that purport to spoil the surprises the green chip developer is preparing for us. Truth or disinformation? Read on and judge for yourself.
A pair of photographs distributed on Reddit ostensibly documents a bare printed circuit board of the GeForce GTX 2080 (or perhaps GTX 1180 - the naming is still not settled online), with a mounting area for the processing core itself that does not look especially large, and what appears to be room for eight GDDR6 chips on a 256-bit interface, probably totaling 8GB - or possibly 16GB in the future, once double-capacity chips mature for mass commercial use.
The NVIDIA logo printed prominently on the board suggests that this is probably the company's reference design, featuring a more massive array of voltage regulators than we have been used to seeing on "normal" video cards (those not designed for extreme overclocking) in the past, 8-pin and 6-pin power connectors that may indicate power consumption higher than that of the GeForce GTX 1080 models - and "mysterious" SLI-like connectors that could point to a new generation based on the innovative NVLink technology.
The photographs of the printed circuit board are joined by documentation from the Chinese card manufacturers Manli and Maxsun - which allegedly reveals the very existence and intended appearance of GeForce GTX 2080 and GeForce GTX 2070 models.
Manli listed the new model names on the EEC standards body's website (though it later claimed the listing was not genuine), while Maxsun posted a photo of its "next generation" card, massively equipped with three fans and apparently a single 8-pin power connector. These pieces of information shouldn't be trusted with your eyes closed, but there is no doubt that we seem to be at the start of a very busy era in the world of graphics processing - expect more updates soon!