The memory that aims to steal the performance crown from HBM2 is almost ready to take center stage
The coming year will almost certainly see the arrival of the first graphics cards carrying new-generation GDDR6 chips, and the company hoping to enable this first is Micron, which announced that it has completed development and begun producing samples of a pair of chips meeting the advanced JEDEC standard - with the clear goal of starting mass production (which will give us the coveted video cards) in the first half of 2018.
The company's first GDDR6 chips will come in the same capacities already available on GDDR5 and GDDR5X chips (although the GDDR6 standard is designed to support even larger capacities, of up to 4 gigabytes per chip) and comparable power consumption, but with increased data rates of 12 gigabits per second and 14 gigabits per second per pin - compared to a maximum of 11 gigabits per second in NVIDIA's GDDR5X chips, and speeds of up to 9 gigabits per second in GDDR5 memories.
In practice, this means we may see graphics cards equipped with "simple" GDDR6 memories that approach the massive bandwidths currently possible only on flagship cards with HBM2 memories integrated on the same package as the processing cores. A card with a 256-bit memory controller interface, equipped with 14-gigabit-per-second chips, can deliver an effective bandwidth of approximately 450 gigabytes per second; cards with 384-bit memory interfaces can reach around 670 gigabytes per second; and top-end cards equipped with a 512-bit interface can come within touching distance of 900 gigabytes per second. Most impressive - and important for ensuring that a new generation of processing cores does not run into unexpected bottlenecks.
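The figures above follow from simple arithmetic: the bus width in bits, divided by 8 to convert to bytes, multiplied by the per-pin data rate. A minimal sketch of that calculation (the function name is our own, for illustration):

```python
def gddr6_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in gigabytes per second.

    bus_width_bits: width of the memory interface (e.g. 256, 384, 512)
    pin_rate_gbps:  per-pin data rate in gigabits per second (e.g. 14)
    """
    # Divide by 8 to convert bits to bytes, then scale by the data rate.
    return bus_width_bits / 8 * pin_rate_gbps

for width in (256, 384, 512):
    print(f"{width}-bit @ 14 Gb/s per pin: {gddr6_bandwidth_gb_s(width, 14):.0f} GB/s")
```

At 14 Gb/s per pin this yields 448, 672, and 896 GB/s respectively, matching the approximate 450, 670, and 900 GB/s quoted above.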
Micron faces competition from manufacturers such as SK Hynix and Samsung (which has already announced preparations to produce GDDR6 chips at 16 gigabits per second) - all of which encourages us to believe that 2018 will be a fruitful, and perhaps even revolutionary, year for the flourishing GPU market.