The next generation of AMD and NVIDIA graphics chips is delayed: HWzone


The reason for the delay is a shortage of capacity on the advanced 20 and 16 nanometer processes, which is being allocated to other, more "urgent" purposes, such as making chips for companies like Apple and Qualcomm

AMD and NVIDIA, in the fight for new chips in the market

Wanted a stronger, more economical smartphone? You'll have to wait longer for your next video card. According to reports from WCCFTech, both AMD and NVIDIA are delaying the launch of their next-generation cards, which contain chips on the new 20 and 16 nanometer lithography. The standard solution for these fabless companies has been to launch their flagship cards on the relatively old 28 nanometer lithography, with additional chips continuing to arrive on that same lithography, so that the flagship cards can be the first to benefit from the new technology.

The main reason for the delay of the new cards on the new lithography is the smartphone market. In light of the sky-high popularity of smartphones all over the world, even now that 20 and 16 nanometer technology is available, most of the chip capacity is taken up by the likes of Qualcomm, Apple and Samsung for manufacturing smartphone chips that are newer, stronger and more efficient than before.

According to the report, this is also the reason that cards based on NVIDIA's GM204 core (the GTX 980 and GTX 970) are produced on the 28 nanometer process, as are AMD's refreshed cores, codenamed "Tonga", which are also expected to arrive on the current lithography. We do not expect to see new chips on 20 or 16 nanometer lithography from NVIDIA until 2016, with the "Pascal" core expected to skip 20 nanometer lithography and go straight to 16. From AMD you can expect the R9 3xx cards (based on the "Caribbean Islands" cores) on 20 nanometer lithography at the end of the first half of next year, about two months behind the original plan.

As a plan for dealing with the shortage of chips on the new lithography, the companies announced that they will continue to use the 28 nanometer production process for their mid-range and low-end series, with high-end cards being the first to benefit from the new lithography once availability allows.



15 comments


  1. Smartphones have ruined our health too... At this rate, Star Citizen will be running on a smartphone. I'm still waiting for 20 nm; hopefully when proper cards come out, we'll be able to afford some 4K screen.
  2. For 2, you will only see it on VOLTA… sorry, PASCAL (2016).
    When 4K is available worldwide at household prices, only then will you see the long-awaited card… (It's all planned and orchestrated… all of these engineering steps exist solely to extract money from the market… and of course technology.)

  3. They're making a mistake. There are big profits in video cards; in the end, I and other people will move to consoles, and then they'll cry about why we did that. I've already switched to a PS4.

  4. ^ So what are you actually saying, that the current-generation consoles are better than today's PC hardware?
    Really not, not at all!

  5. A routine reaction from Pine and Pine 1911, which, like the rest, is worth ignoring.

    You don't have to be an analyst to know that NVIDIA is currently enjoying its video card sales. Soon, mid-market products will bring another hit, so why hurry? Plus, there's plenty of room to relax before a product stronger than the 980 on 28 nm brings yet another hit. Slowly.

  6. djelectric
    Say, do you want to move forward in life, or do you want to be stuck on the same technology?
    I didn't say it's better, I said it's more worthwhile. Why should you have to buy a video card for 1800-2000 every year just to run everything on the highest settings?
    Once upon a time, a video card that cost 2000 would easily last 2-3 years, which let you buy a card like a human being, not every year.
    This whole forum is missing the main point. Last year I was recommended this computer, which was supposed to run every game at maximum for two years:
    i5 4670 + GTX 770 2GB + 16GB. In the end, it turned out to be wrong.

  7. I hope you see the contradiction in your message, which is the source of your frustration.
    You want to move forward, which is exactly what happens from year to year. So why buy a card for 1800 every year to run at maximum? Because every year the quality ceiling rises.
    Very basic logic, the chicken and the egg. Think about it a little, and only then respond.

  8. It's amazing that Intel is already manufacturing at 14 nanometers,
    but AMD and NVIDIA are stuck at 28.

  9. Pine and Pine 1911, let me simplify what you're saying.
    You don't want to buy a video card every year to run the highest graphics settings.
    So instead you bought a console that, for the next ten years, will consistently run all games on the worst graphics (!)?
    It's like a person saying he's scared of dying in a car accident, so he commits suicide before it happens, God forbid. You prevented your problem by making sure it was at its worst level from the very first second. Well done.

    What's more, your main point, that you have to buy a video card every year to enjoy the best graphics, is complete nonsense. Besides, a 770 today can run everything at max (you should probably check what problem your computer has if you can't), and you need to understand that the game-development market advances in leaps that are not tied to the existing hardware. The improvements between video card generations are large and consistent, and you're wrong to blame NVIDIA and AMD for delaying technology. On the contrary, they are at the forefront: they implement technologies in their cards that game developers avoid up to a certain point. Let me give you an example. Suppose NVIDIA developed a technology, call it "Itzik Filtering", that exists only in its newest series of video cards, the 9XX series. Game developers will not invest in adding an "Itzik Filtering" option to their games in the meantime, because only a small percentage of the market has the 9XX series. Next year the technology will be available in two series, the year after in three, and so on.
    At some point the technology becomes the standard, there is a drastic jump in the quality of games, and it is more noticeable than the jumps of previous years.
    This is one example of something that leads to a technological leap in gaming. A second and (for now) very significant one is the consoles. As long as developers had to make sure a game would run on the PS3 or XBOX360, they built it around that less demanding baseline. As soon as new consoles came out that could support newer technologies, the leap in the gaming market became more massive. This means that a video card from a series released immediately after new consoles launch will be the card that lasts the longest relative to its purchase date. A video card bought a year before a new console generation will have to cope with a bigger technological leap as time goes on, as will your 770, since the new consoles came out a year later.
    In any case, this has nothing to do with the video card companies, which, as you mention, also produce the video cards of the consoles, so getting back at them by buying a product that also contains their technology is not the smartest move.

  10. Intel is having a lot of trouble with 14 nm, so much so that it seems TSMC, GloFo and Samsung will close the gap in the coming years. There is a big difference between working with an existing production process and creating one.
    I don't understand why people here are so focused on transistor size. If the GTX 900 series has shown anything, it's that you can deliver a great improvement even while staying on the existing manufacturing process. These cards run at relatively low power, and there is no reason a larger, more powerful core could not approach the previous generation's power consumption.
    As for AMD, some places say they are moving production to GloFo because its production process is preferable. I can't say more than that, because I don't know whether that process is actually better, and I don't know whether AMD will really move production elsewhere.
    I don't understand why AMD and NVIDIA are being blamed. It's not that they are behind by choice; they have no manufacturing plants, so they use what they can get. Currently TSMC cannot offer them anything else while Apple and Qualcomm occupy the production lines, and each of those two is several sizes bigger than AMD and NVIDIA combined…

    P.S.
    Regarding the consoles: some time ago there was talk of SONY and AMD working on changes that would allow a PS4 SLIM to be released. Among those changes, beyond other things, was a more advanced production process, probably for 2016/7. It is likely that MS will be on AMD's case for the same thing, and it appears Nintendo is in the same boat. In short, you bought a machine based on AMD components that are much weaker than even the company's aging FX processors and its latest video cards.

  11. HuGeMouTh

    Intel having trouble manufacturing at 14 nm? Tell me, where do you read this nonsense? Are you an Intel employee? Have you been to Ireland? I'm asking you to stop spreading nonsense.

    First of all, are you aware that next year (2016) physical production at 10 nm will begin in Kiryat Gat? Israel will be the first to produce on a 10 nanometer manufacturing process, and with this shrink Intel will stay ahead for at least 10-15 years.
    "Lol", having a hard time with 14… The former Micron factory is currently cleared out and ready to receive the 10 nm machine infrastructure, my friend… take the tinfoil hat off your head.

    Stop spreading your nonsense on the net.

  12. Cute, you :kiss:
    This information comes from Intel directly; at no point did they hide the fact that this transition was particularly difficult for them. They also released graphs and plenty of information to investors (and of course that information became available to everyone very quickly). Now everything seems to be working properly, but until a few months ago the defect rate in production was much higher than in the previous production process, which means lower profits. In fact, they only reached the figures they wanted in the last quarter, and as a result many products have been slightly delayed. It is likely that the next transition will be postponed as well, so don't rush to declare production in 2016.
    By comparison, TSMC has started producing RISC processors at 16 nm (they have no plan for 14; from there the transition will be to 10), and 2015 will see the first products, with most of them arriving at the end of the year. The list includes many companies, NVIDIA among them. If some time ago people spoke of Intel having a technological advantage of several years over its competitors, it now seems that advantage is much smaller.

  13. HuGeMouTh
    "Intel having trouble with the 14 nm production process? Tell me, where do you read this nonsense? Are you an Intel employee? Have you been to Ireland?"

    Why, do you work at Intel? Have you been to Ireland?
    Maybe if you had been, you would also know that they are having plenty of trouble with the production process, which is why production on future technology has been postponed time after time.
