
Just before the reveal, here are all the details about AMD's new generation of graphics cards

A lot of name refreshes and one big Fury that sits above them all: everything you need to know about the new graphics cards is already on the net, a moment before the actual launch

We already know that next Tuesday, June 16, AMD's new and much-talked-about graphics processing products will be unveiled. The network, however, is not really waiting for the red manufacturer's schedule, and is already full of photos, bits of information and reports about everything that awaits us around the corner. We are here to bring you up to speed, for the benefit of anyone who simply cannot wait until next week.

In the numbered Radeon 300 series, it appears that the early reports were wrong and the most recent ones are accurate: the entire series, including the R9 390 and R9 390X models, will actually be refreshes (not to say rebrands of the exact same products) of previous generations. The Hawaii core of the leading models will be called Grenada and will offer slightly faster clock frequencies alongside 8GB of GDDR5 memory as standard, the Antigua core will replace the Tonga core, the old Pitcairn will be called Trinidad, and Bonaire will become Tobago. It is disappointing to see only one new core in this AMD generation, but we can try to console ourselves with reports that the Radeon 300 cards will be offered at very competitive prices.


Source: guru3d.com

The crowning glory of the generation, contrary to what was estimated until a few weeks ago, will be a new-old brand named Fury, offered in three different configurations: the advanced Fury X with the Fiji XT core, available in a water-cooled version and in an air-cooled version with slightly lower clock frequencies, and a "standard" Fury with the Fiji Pro core, containing slightly fewer processing units and expected to be the answer to the standard GeForce GTX 980, while the X versions will take on the new GTX 980 Ti.

According to a recent photograph, the Fury models will indeed be very compact, despite the impressive power inside
Source: wccftech.com

For dessert, official (as it were) performance tests of the Fury X, the most advanced model, show results very similar to those of the GTX 980 Ti and the Titan X, including in dual-card arrays - at least in the FireStrike test. The results in real-world applications may vary significantly depending on the game or the specific software being tested, but in general the competition for the performance crown in the current generation looks closer than ever, and each consumer's choice may ultimately come down to his preferred brand, or to whichever is available at the time of purchase.


With a jump of 35 percent in performance over its best single-core card of the previous generation, the R9 290X, there is no doubt that AMD has plenty of reasons to be satisfied with the technological achievement that is the Radeon Fury / Fury X. Will it be enough to stop NVIDIA's gradual takeover of the market? That is a different question altogether.

AMD's great hope


117 comments

  1. What's very strange is that the refreshed cards (390/390X) come with 8GB of memory as standard, while the truly new one, which is supposed to be AMD's Titan, comes with only 4GB..

  2. No, it's just that the 980 Ti changed the market
    Both AMD and NVIDIA are doing the same thing
    The Fury will be as strong as the Ti and will cost the same, with water cooling, and it will be quiet and cool
    So stop with the fanboy stuff
    Failure? Maybe you're afraid of AMD?
    Great cards from both companies
    There is nothing to separate them except price

  3. I do not understand the excitement over the 980 Ti card; at least in Israel it is not worth the money and the investment.
    Besides, its performance doesn't reach a peak at 4K; better to just keep playing at 1080p until the Pascal generation…

  4. What does the 640GB/s bandwidth AMD is offering this generation actually give?
    Why is it good?
    How do I know it's for me?

  5. "No, it's just that the 980 Ti changed the market; both AMD and NVIDIA are doing the same thing. The Fury will be as strong as the Ti and will cost the same, with water cooling, and it will be quiet and cool."

    How exactly is it "the same" with 4GB on a flagship card meant for high resolutions??
    For me, if it came with at least 6GB, I would consider it and probably buy it (the dimensions and styling don't do anything for me).
    Now - probably not..

  6. It just ended.
    All the details, on the front page, really soon.
    There was a small earthquake around one tiny product that was launched and poses a threat.

  7. I watched the launch live; it was supposed to be particularly cool.
    And in truth, there was nothing new in it.

  8. Seeing a palm-sized video card that beats the R9 290X at half its power consumption and saying there is nothing new is absurd.

  9. Well, Lior, I have been waiting for your article for some time now..
    Are you ready to upload the article instead of feeding it to us word by word?

  10. Yes, it's nice and everything, but let's be realistic for a moment.
    There are not many answers here, in my opinion; the most interesting part is the Fury cards, and on that front there was nothing new.

  11. "Seeing a palm-sized video card that beats the R9 290X at half its power consumption and saying there is nothing new is absurd."

    I also think it's the most interesting product in this launch; I'm dying to see physically small cards that do not consume much electricity and give high performance.
    The question is:
    Will there ever be a flagship card with these properties, or will it always be a third-tier or even much lower card (750 Ti)?
    Or is the market headed there? I mean that in the future we will no longer see 30cm cards and there will be no cards above, say, 150W; whoever needs big cooling anyway means the PCB size doesn't matter, since the cooler will determine the size - at least on single-core gaming cards?

    I know there are no prophecies, but gut feelings would be interesting to hear.

  12. The market is headed there, plain and simple. Look at the Pascal generation and HBM2. The printed circuit boards are becoming radically shorter as a result of moving the memory onto the graphics core's package.

  13. This is what I hoped for... all we can do is wait patiently (not easy). It will be interesting to see it happen.

  14. Will there be HDMI 2.0?
    I could not find any detail about what will be on the card.

    How does nobody know?

    Will the data bandwidth be 18Gbps?

    Is there support for 12-bit 4K color?
  15. All the rumors say AMD will not bring HDMI 2.0
    and will not bring DP 1.3.

    NVIDIA also did not bring HDMI 2.0, at least according to the 980 Ti spec page,
    after declaring their cards in the past as HDMI 2.0; the network found out that the data bandwidth is that of HDMI 1.4.
    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980-ti/specifications

    For those who have a 290X: stay with it. A pity about this period without at least an update to HDMI 2.0, a standard from 2013 - out of date.

    This means the data bandwidth remains 10.2Gbps.

  16. A very bad decision, in my humble opinion, is to market the 290X and 280 with high memory amounts and high prices (on the other hand, it is a rebrand of the existing cards at the same price; example: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202144&cm_re=r9_290x-_-14-202-144-_-Product).
    The most puzzling and disappointing part of all is that although the 3XX generation carries the 8GB label for the 4K era, specifically on the flagship cards (especially the X that competes with the 980 Ti and its 6GB) they no longer splurged; I guess it has to do with the different memory system.

    Overall, I enjoyed it very much and my next card might be the X. In addition, there is a good chance (depending on pricing) that the Nano is going to be the sweeping recommendation for a new computer (instead of the 970) - which is a revolution, "rediscovering the red".
    I was really excited.
    Certainly by the Nano, which, if priced reasonably, is a huge hit and shows pretty good progress.
    Also by the different versions of the flagship card (an impressive and solid design - similar to the general line of the closed-loop cooler manufacturers, basically not covering everything in elegant, shiny surfaces; and the size, the size is simply fantastic).
    I was impressed.

    "Yes, it's nice and everything, but let's be realistic for a moment."

    If the pricing comes in below the 970 (and it is likely to) - then for everyone! See the previous paragraph.

  17. Each HDMI standard has a defined data bandwidth:
    https://en.wikipedia.org/wiki/HDMI#Version_2.0
    HDMI 2.0 = 18 Gbit/s
    HDMI 1.4 = 10.2 Gbit/s
    That is, to get real 4K you need the bandwidth to transfer all the data.
    It is not enough to replace the HDMI connector on the card; the whole controller and data processing of the card must be replaced to get HDMI 2.0 at 18 Gbit/s.

    AMD got into trouble with this, having previously promised that this generation would come out with true HDMI 2.0 -
    not like NVIDIA, which pulled a trick that branded the connection as HDMI 2.0 but left the data bandwidth at 10.2 Gbit/s (according to the network).
    All my questions to NVIDIA about the card's data transfer bandwidth went unanswered.
    Even the spec page of the new NVIDIA card does not list any HDMI standard! (Because they understood this trick):
    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980-ti/specifications
    "Dual Link DVI-I, HDMI, 3x DisplayPort 1.2"

    AMD's site and all of AMD's lecture yesterday said nothing about the HDMI standard.
    What AMD will deliver over HDMI remains a mystery.
    DP 1.3, with its 32 Gbit/s bandwidth, will not happen, because they do not have a processor fast enough to drive that bandwidth.
    You have to understand that driving HDMI 2.0 at 18 Gbit/s needs much more processing power than the 1.4 standard.

    Another spicy detail: some 80% of all the TVs from 2014 are stuck at 10.2 Gbit/s, and from 2013 I do not know of a single screen with real HDMI 2.0.
    After all, the screen has a data processing chip that displays the image,
    and it is not relevant whether the screen has an HDMI 2.0 connector; the actual processor in the screen determines the data bandwidth.
    The only one that shipped a suitable processor was Samsung, in some models.
    The screen manufacturers pulled a trick: they offered "HDMI 2.0" with some of the features of HDMI 2.0 but with the data width of HDMI 1.4 = 10.2 Gbit/s.

    Now you ask how one can tell. There is one big manufacturer of the processors found in almost every screen, named Silicon Image:
    http://www.siliconimage.com/technologies/mhl/
    Each screen processor has a number and a production date. Such a processor has to be designed into the screen's motherboard and tested, and it takes about a year for the screen to reach the market. Only in 2014 did a chip supporting full HDMI 2.0 come out (so only now are we seeing some screens with full HDMI 2.0):
    http://www.siliconimage.com/solutions/tv/
    The latest chip, the SiI9779, already brings a standard we will only see in screens in 2016, with High Dynamic Range / BT.2020 / Deep Color 10, 12, 16-bit.
    That is, you must check whether the screen carries at least the SiI9777.

    Also, when a screen offers 3 or 4 HDMI connectors, often only 1 is HDMI 2.0 and the rest are outdated HDMI 1.4;
    sometimes the only connector is a sort of HDMI 2.0, but the screen's processor is HDMI 1.4 = 10.2 Gbit/s,
    i.e. "4K 60Hz" within a bandwidth of 10.2 Gbit/s.
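    To make those numbers concrete, here is a minimal sketch of the arithmetic behind this argument. It assumes the standard CTA-861 total timing of 4400x2250 (blanking included) for 3840x2160 at 60Hz, and HDMI's 8b/10b TMDS encoding, under which only 80% of the raw link rate carries pixel data:

        # Why 4K60 4:4:4 needs HDMI 2.0-class bandwidth.
        # Assumes 4400x2250 total timing (blanking included) for 3840x2160@60
        # and 8b/10b TMDS encoding (payload = 80% of the raw link rate).

        H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60
        pixel_clock = H_TOTAL * V_TOTAL * FPS            # 594 MHz

        links = {"HDMI 1.4": 10.2e9 * 0.8,               # 8.16 Gbit/s payload
                 "HDMI 2.0": 18.0e9 * 0.8}               # 14.4 Gbit/s payload

        modes = {"4:4:4 8-bit": 24,                      # bits per pixel on the wire
                 "4:4:4 10-bit": 30,
                 "4:2:0 8-bit": 12}                      # chroma shared by 4 pixels

        for mode, bpp in modes.items():
            need = pixel_clock * bpp                     # required payload, bit/s
            fits = [name for name, cap in links.items() if need <= cap] or ["neither"]
            print(f"{mode}: needs {need / 1e9:.2f} Gbit/s -> fits: {', '.join(fits)}")

    Run it and 4:4:4 8-bit lands at about 14.26 Gbit/s, beyond HDMI 1.4's payload but inside HDMI 2.0's; the 4:2:0 row, at about 7.13 Gbit/s, is exactly the trick described above - halving the chroma data squeezes a "4K 60Hz" signal into HDMI 1.4 at the cost of color resolution.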
    
    				
  18. It is not clear how you decided that the Nano card would be equal to the 290X.

    True, their wording is in the style of the comparisons made at the time between the greens' 900 series and the 600 series.
    Basically, it could also be half the 290X's performance at a quarter of its power consumption, and that would still be 2x the energy efficiency. In short, we will wait and see what they are cooking there..

  19. "I also think it's the most interesting product in this launch; I'm dying to see physically small cards that do not consume much electricity and give high performance. The question is: will there ever be a flagship card with these properties, or will it always be a third-tier or even much lower card (750 Ti)? Or is the market headed there? I mean that in the future we will no longer see 30cm cards and there will be no cards above, say, 150W; whoever needs big cooling anyway means the PCB size doesn't matter, since the cooler will determine the size - at least on single-core gaming cards? I know there are no prophecies, but gut feelings would be interesting to hear."

    There are such cards - the 970 ITX - and it's simply a business decision not to make 980 models like it.
    The Nano sounded interesting; it's just a pity it won't be out soon - they are talking only about the end of the summer. That leaves a lot of time for shelf-clearing / price drops on the 970 without any real competition.

  20. I know the little 970 and the little 670, but it seems there is a limit to how small the cooling can get while the card still keeps its head above water (or above the flames), and it seems that limit sits around 150W. I'm not sure the 980, despite its thermal efficiency, would manage with such a small cooler (but I haven't dug into it).

    But yeah, it will probably take time..

  21. Roeyrden, thanks for the explanation. Here I see HDMI 2.0 listed in the specs:

    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

    As I mentioned above, NVIDIA previously listed HDMI 2.0, but forums abroad found that it is not true HDMI 2.0: it gets all the features of HDMI 2.0 except the most important one, the 18Gbps bandwidth. (As far as I figured, a card that needs to process nearly twice the data needs more processing power; maybe in the next generation at 16nm.) So why, on a video card 30% stronger (I brought you the link), is there no real HDMI 2.0? I contacted NVIDIA several times and did not receive an answer as to whether the card has 18Gbps bandwidth. I also contacted AMD customer service and have not received a response yet. Note that the important new 2015 screens support the DCI-P3 color gamut,
    which is wider than Rec. 709. Which video card supports it? This color gamut can greatly enhance screen/game/movie color. Unfortunately, the video card companies only offer more FPS that you can no longer see (300FPS is worth nothing on a 60 or 120Hz screen). Unfortunately, it seems the video card companies are simply holding back the worth of buying 4K screens/TVs.
  22. "It is not clear how you decided that the Nano card would be equal to the 290X."

    I don't know who you mean by "you decided", but Lisa Su said it is (and I quote) "significantly" stronger than the R9 290X, which even makes it stronger than the R9 390X and practically the entire R9 300 series.

  23. I looked at the segment again.
    She did say so. But bottom line, it smells a little fishy.
    If this card is significantly stronger than the 290X, it should sit somewhere between the 980 and the 980 Ti. And above it there is another tier of cards that need to be significantly stronger than the Nano.
    That means the water-cooled Fury X should simply leave the 980 Ti in the dust. It's hard for me to believe that this is the situation, given the pricing, and given that they let NVIDIA quietly sell the Ti without leaking reviews and numbers to make people wait.

  24. "That means the water-cooled Fury X should simply leave the 980 Ti in the dust. It's hard for me to believe that this is the situation..."

    Now you've really made me curious. I'll be on pins and needles to see what comes out of this card.
  25. So, after the dust has settled (and there was a lot of it, in the positive sense):
    what do you think of the whole AMD event, in the end?
    It seems to me they certainly succeeded in surprising and whetting the appetite, and they managed to surprise in both directions on pricing, but on the other hand there is no more concrete information here.
    So anyone who has been debating until now whether to buy a 980 Ti or wait for the Fury is left hanging.

    Why they had to postpone and stretch the event out like that, I'm not sure it does them any good.

  26. http://www.iqpc.co.il/Home/showItemMenu.asp?id=12967

    http://www.iqpc.co.il/Home/showItemMenu.asp?id=12966
  27. I went through the graphs a bit and realized that the 7950 and 7970 are still the best-value cards up to NIS 1000.
    Come on, AMD, it's time to close up shop.

  28. In short, disappointing... Waiting for Pascal; at the moment there is no real answer for playing at 4K.
    Keep in mind that the Fury X cards went through a massive overclock for these benchmarks.

  29. That's at 4K. It would be interesting to see how it does at lower (and more widespread) resolutions.

    This is only a hypothesis, but it seems to me the difference will be less significant there.
    Based on what exists today, the benefit of large memory bandwidth shrinks as resolution and AA decrease.
    So for now we can say that both companies offer optimal cards for 1440p.
    By the way, for a fair comparison you need to see what the air-cooled version can do, or alternatively put a 980 Ti under water cooling.

  30. Quote: "Lacking the available bandwidth to fully support 4K@60Hz until the arrival of HDMI 2.0, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4's 8.16Gbps bandwidth limit."
    http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of

    Here is more proof that NVIDIA does not have a true HDMI 2.0 connection. NVIDIA, Sony and Samsung did an exercise in how to get "HDMI 2.0" within HDMI 1.4's 8.16Gbps bandwidth.

    Here is another way to verify that NVIDIA's HDMI 2.0 is not real: a card does not have HDMI 2.0 without the appropriate chip, and they do not have the SiI9777 chip.
    http://www.siliconimage.com/Company/...with_HDCP_2_2/
    For a video card to talk to a screen or a Blu-ray player there needs to be a chip in each device, and the chip needs the HDMI 2.0 protocol. Only Silicon Image makes that chip for the HDMI organization.
    The NVIDIA 960/970/980 do not have the SiI9777 chip:
    no HDCP 2.2
    no HDR
    no 18Gbps bandwidth
    no DCI-P3 color
    Is this 980 Ti card carrying the SiI9777 chip?
    http://www.pcper.com/files/imagecach...VBA9_1_4_l.jpg

    Quote: "This is very easy to prove with a simple detailed image for 4:4:4 and only a blind man can not tell the difference between 30 and 60hz on a PC."

    Yes, this is very easy to prove: ask for written confirmation from NVIDIA that their video cards have
    the SiI9777 chip,
    HDCP 2.2,
    HDR,
    18Gbps bandwidth,
    DCI-P3 color.
    I tried 3 times with NVIDIA and got no reply!
    What do they have to hide?

    You are bringing an article about Sony's XBR 55X900A; this TV was made in 2013, when there was only HDMI 1.4 without the chip, and over HDMI 1.4 it can play 4K at 30Hz only. So how can it do HDMI 2.0? How can it have HDCP 2.2? How can it have 18Gbps? Can you connect this screen to Netflix?
    http://www.anandtech.com/show/8191/n...upport-kind-of
    This TV was manufactured before the SiI9777 chip was introduced in 2014:
    http://www.siliconimage.com/Company/...with_HDCP_2_2/
  31. Roy, you made a whole salad here. You gave a link to an article about old cards with the Kepler core.

  32. "Roy, you made a whole salad here. You gave a link to an article about old cards with the Kepler core."

    I'm glad you read it.

    I did not make any salad.
    NVIDIA's HDMI 2.0 invention continues in this generation as well
    (just as they lied in the previous generation, they continue uninterrupted in this one).

    What 4K on 8.16Gbps bandwidth? It's an NVIDIA invention.
    It doesn't support anything; even Netflix 4K won't connect.

    Take your anger out on NVIDIA, not on me.

  33. NVIDIA never lied; it never said in its publications that the previous Kepler generation has HDMI 2.0. Where did you find that? It added back-support in a driver - what's wrong with that?
    If you rely so much on the AnandTech story, here is another of their stories that explains that there is HDMI 2.0 support in the new generation:
    http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/5

    Factually, right now, only NVIDIA cards output 4K video at 60p, which is something AMD simply does not have.

  34. "By the way, for a fair comparison you need to see what the air-cooled version can do, or alternatively put a 980 Ti under water cooling."

    Not true.
    The comparison is between finished products. If I buy card A, it is stronger than B. By your logic it is impossible to compare a 730 to a 780 unless you first remove the fan.

  35. "NVIDIA never lied; it never said in its publications that the previous Kepler generation has HDMI 2.0. Where did you find that?"

    NVIDIA's 750 Ti was said to have HDMI 2.0 in publications in the past, echoed by many outlets, and it was later discovered to be incorrect. NVIDIA is lying in this generation too, and I explained exactly why and how. Whether or not you believe me, contact NVIDIA with questions, like many others did; I even brought you the topics for the questions, translated into English.

    If you can't play UHD Blu-ray or HDR or Netflix with an NVIDIA 960/970/980,
    what HDCP 2.2 is it?
    What HDMI 2.0 is it?
    You can't have DCI-P3 colors.
    NVIDIA's data bandwidth is 8.16Gbps.

    In the link you brought about the new generation, people can't connect to Netflix 4K, which requires HDCP 2.2. That alone says there is no HDCP 2.2.

    4K 4:2:2 12-bit the new card does not support; that is a problem, because the HDMI 2.0 standard guarantees it. HDMI 2.0 also guarantees two new color systems, DCI-P3 and Rec. 2020; the new NVIDIA card gives nothing beyond the old Rec. 709 era. I can go on and on about things that are missing from the new NVIDIA card when testing it against new screens. The little things I mentioned are what give the quality and the experience on 2015 screens; NVIDIA's data bandwidth is not HDMI 2.0's, and NVIDIA has no HDR either - it's not to standard. For a video card to drive HDMI 2.0 4K TVs / Blu-ray players it must have the appropriate chip, and NVIDIA refuses to provide information. Why, and what needs to be hidden? NVIDIA plays with compression tricks and a lot of information is lost because of bandwidth restrictions, just like in the previous generation. NVIDIA plays with inaccurate promises. Everyone who contacted NVIDIA - sites, people - receives no answers. You too are welcome to contact them.
    :)
  36. Show me one NVIDIA publication that says the 750 Ti has an HDMI 2.0 connection.
    I don't care about Netflix; at the end of the day NVIDIA cards output 4K at 60p and AMD's don't.

  37. "Show me one NVIDIA publication that says the 750 Ti has an HDMI 2.0 connection. I don't care about Netflix; at the end of the day NVIDIA cards output 4K at 60p and AMD's don't."

    Sorry, I'm not going to start running around searching; when I wanted to buy it, it was listed.

    And yes, Netflix is interesting - watching 4K at 60Hz - and even if it doesn't interest you, the essence is that it requires HDCP 2.2.
    And this year Ultra Blu-ray is coming out; without HDCP 2.2 you can't watch 4K.
    All the companies that stream 4K online are bound by broadcasting rights to HDCP 2.2 only.
    So where is this capability?
    What, pay NIS 3000 for a card without it?

    4K 60p, but within HDMI 1.4 bandwidth - every bit over it is dropped.

    About 4K 60p and AMD:
    I connected HDMI to a 65-inch UH9000 screen; you can get some games at 50 FPS and 4K resolution,
    but I did not notice whether it was 30Hz.
    I realized that Samsung's screen does upscaling.
    Samsung really is a league above everyone; this year there are 8 cores in their processor.
    The image is very impressive on the 2014 model.

    :)

  38. I have never seen a GTX 750 Ti publication with HDMI 2.0, and even if you scour the Internet you will not find one. I think you got confused.

  39. "I have never seen a GTX 750 Ti publication with HDMI 2.0, and even if you scour the Internet you will not find one. I think you got confused."

    I didn't get confused!
    When the card came out I wanted to buy it, and I checked very thoroughly.
    And still, it's irrelevant either way.
    Today NVIDIA does not have real HDMI 2.0, and I attached all the material explaining why.
    Also in the link I brought: 8.16Gbps bandwidth.

    Attached is confirmation from AMD's worldwide publicity, here on the site:

    http://oi57.tinypic.com/qy7gcm.jpg

    AMD will issue an adapter from DP 1.2 to get HDMI 2.0 4K 60Hz.
    What it will be and what quality, only God knows, or doesn't.

    Note that it says 1.2a; that indicates NVIDIA has only 1.2.

  40. You must be confused; you claimed NVIDIA lied in the GTX 750 publications. If that were true, you would see articles about it on professional sites around the world, and there is no such thing anywhere.
    Are you the only one in the world who saw such advertising?
    In the link I brought, it states that the 900 series has full HDMI 2.0:

    "It should be noted that this is full HDMI 2.0 support, and as a result it notably differs from the earlier support that NVIDIA patched into Kepler and Maxwell 1 through drivers. Whereas NVIDIA's earlier update was to allow these products to operate a 4K@60Hz display using 4:2:0 subsampling to stay within the bandwidth limitations of HDMI 1.4, Maxwell 2 implements the bandwidth improvements necessary to support 4K@60Hz with full resolution 4:4:4 and RGB color spaces."

  41. "You must be confused; you claimed NVIDIA lied in the GTX 750 publications. If that were true, you would see articles about it on professional sites around the world, and there is no such thing anywhere."

    Again and again you are not telling the truth. It's a pity that NVIDIA's fanboys, instead of getting confirmation from NVIDIA, keep insisting on things that do not exist, about a standard NVIDIA doesn't have. Don't you think you should simply contact NVIDIA for an explanation? Ask whether their HDMI 2.0 meets the full spec? Instead you keep writing to me and chasing me. If your goal were to find out the truth, you would write to NVIDIA. You prove that English and its grammar are not exactly your language; read what you brought:

    "using 4:2:0 subsampling to stay within the bandwidth limitations of HDMI 1.4"

    If you knew the numbers - HDMI 1.4 = 10.2Gbps data bandwidth vs HDMI 2.0 = 18Gbps data bandwidth - then where do you see an 18Gbps card? If it were HDMI 2.0, why is everything in the standard missing:

    HDCP 2.2?
    DCI-P3 colors?
    SiI9777 chip?
    HDR?
    18Gbps bandwidth? And more... You should first learn the standard, and then:

    ASK NVIDIA IF THEY HAVE:
    HDCP 2.2? DCI-P3 colors? SiI9777 chip? HDR? 18Gbps bandwidth? Websites around the world do not have the tools to test bandwidth and standards on video cards; they rely on what they are given in writing and on what is listed on the site. You're dreaming if you think they can test protocols or standards. Most sites in the world are amateur boys uploading video card reviews. Something else: you claim no one in the world wrote that the 750 had HDMI 2.0. Wrong again; it is written:

    "However, this will leave HTPC users in a pickle if they want HDMI 2.0 support; with the GM107 based GTX 750 series having launched only 7 months ago"

    That again proves that NVIDIA told the public, 7 months earlier, that it had HDMI 2.0 - so what is stopping them from continuing to lie today? Even in the link you provided, things are not proven; the answers to your questions are written there clearly enough to make you wonder. Direct these questions to NVIDIA - of course, if you are not shy.

    :)
  42. The one who doesn't know English here is you.

    The quote says:

    "It should be noted that this is full HDMI 2.0 support, and as a result it notably differs from the earlier support that NVIDIA patched into Kepler and Maxwell 1 through drivers. Whereas NVIDIA's earlier update was to allow these products to operate a 4K@60Hz display using 4:2:0 subsampling to stay within the bandwidth limitations of HDMI 1.4, Maxwell 2 implements the bandwidth improvements necessary to support 4K@60Hz with full resolution 4:4:4 and RGB color spaces"

    In plain translation: a driver update brought the Kepler cards 4K/60p within HDMI 1.4 bandwidth;
    in contrast, the new Maxwell cards offer full HDMI 2.0.

    With that level of understanding, I retire from the argument; it's like talking to a wall.

  43. "The one who doesn't know English here is you. In plain translation: the Kepler card drivers brought 4K/60p with HDMI 1.4 bandwidth; in contrast, the new Maxwell cards offer full HDMI 2.0. With that level of understanding, I retire from the argument; it's like talking to a wall."

    Aren't you embarrassing yourself, sticking like a little boy to the nonsense you bring? In the link you provided it is written clearly; I think you are running it through Google Translate and distorting everything written. Good for us if we rely on Google translation:

    "within the bandwidth limitations of HDMI 1.4"

    The enhancement is made within HDMI 1.4 bandwidth. The new model does also bring an improvement to 4:4:4, but within the HDMI 1.4 bandwidth limit, reaching at most the 8-bit level - is there anything new in that? Nowhere does NVIDIA write that it approaches 18Gbps!

    "to support 4K@60Hz with full resolution 4:4:4 and RGB color spaces"

    NVIDIA is making fun of us all. The worst part is that people pay a fortune, thousands of shekels, for an "HDMI 2.0" that is completely castrated. What you brought takes the NVIDIA joke to a new record: that same article stated that the 750 Ti "contains HDMI 2.0" but only at 4:2:0, and the 980 adds 4:4:4? So the 750 Ti was a joke, and the 980 brings to the equation maybe 4:4:4 - this too within HDMI 1.4 bandwidth. Again, it is not HDMI 2.0; it is a very insulting joke, extracting thousands of shekels from people.
  44. Roy, sorry to tell you, but your reading comprehension in English is very poor.

    Or maybe your understanding is better than mine? If you are so successful, what do you want from me?
    You can manage on your own.

    If I don't understand, why are you addressing me?
    There are 200 websites that will tell you every day how successful you are and how much your parents should be thanked for bringing you into the world.
    Maybe they'll even declare a holiday because you exist.

    Good luck
    :)

  45. "Royerden" is deserted ... There is enough intriguing material here, who wants to read and decide alone.
    There is no point in going down to personal tracks.
    Keep giving us interesting material to read ...

  46. "It should be noted that this is full HDMI 2.0 support, and as a result it notably differs from the earlier support that NVIDIA patched into Kepler and Maxwell 1 through drivers. Whereas NVIDIA's earlier update was to allow these products to operate a 4K@60Hz display using 4:2:0 subsampling to stay within the bandwidth limitations of HDMI 1.4, Maxwell 2 implements the bandwidth improvements necessary to support 4K@60Hz with full resolution 4:4:4 and RGB color spaces."

    Translation: it is worth noting that this is full HDMI 2.0 support, so it differs noticeably from the support NVIDIA added to Kepler and Maxwell 1 through drivers. While NVIDIA's previous update was meant to allow those products to drive a 4K@60Hz display using 4:2:0 subsampling to stay within HDMI 1.4's bandwidth limits, Maxwell 2 implements the bandwidth improvements necessary to support 4K@60Hz with (not sure how to translate this) full resolution 4:4:4 and RGB color spaces.

    And I thank my parents every day. Thanks :)
  47. Yes, it's a tremendous site. It was very useful even back when it was hard to get a PS4. Amazon currently has the PNY version available in stock, but it's a reference cooler.

  48. http://oi57.tinypic.com/27wsoxt.jpg

    MSI says the 380 and 390 will have HDMI 2.0 in them.
    At least AMD admits that for now they have "HDMI 2.0" at HDMI 1.4 bandwidth.
  49. If you say the word HDMI a few more times, there's a chance your body will start an evolutionary process and turn one of your fingers into an HDMI 2.0 port, thus solving the problem for you.

  50. A new report just ran online. Here's a quote:
    "Police provided protection for nvidia ceo after receiving a threat letter that said: No real HDMI 2.0!! you are lying!! Now
    I kill you"

  51. Can you explain how you order in a way that all costs are already calculated in advance, without any surprises afterwards?
    I tried to order a card at a really good sale price, sold by Amazon with the "Global International Shipping" option; the order page has the option of gift-wrapping and hiding the invoice (is that worthwhile, by the way?), then there is the address page, and that's it.
    That's where credit card information is requested.
    Where is the tax calculation? Where is the shipping cost? I did not understand how it works..

  52. Strange. Doesn't it make more sense to present the final breakdown of price, taxes and shipping, so I know how much I have to pay before I bother entering credit details..
    Never mind; but once I have agreed to the final price, is there any chance the shipping company will still make me a salad here with customs etc.
    and demand more surprises?

    I ask because I have very bad experience with all these forwarding companies and I prefer not to use them unless there is no choice.

  53. Regarding the price: it appears after narrowing the search options down to shipping to Israel and sold by Amazon. VAT, customs and shipping costs -
    of course you have to calculate each one individually, and I would prefer to see it itemized before entering credit information; but entering your
    credit details is not the last step in the order, so it is not critical anyway.

    I might be confused, so I want to understand something - I just have no experience with Amazon: the moment I order from Amazon, do I get the price breakdown including the taxes here in Israel?
    And what, do they transfer the money I paid in advance for taxes to the authorities here through the shipping company?
  54. The price at Amazon includes all the taxes to Israel. With slow delivery they ship via I-Parcel/Aramex, and it eventually reaches you by courier mail.
    With fast delivery it's UPS/DHL. Sometimes they ship fast delivery even though you paid for slow.

    Never pay in shekels; their conversion fee is high. Cancel their conversion option and pay in dollars.
    I don't know which card you checked, as they currently have nothing in stock.
    But for example this card:

    http://www.amazon.com/gp/product/B00YDAYOF0/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
    Costs $1,000 to your door, everything included. That's NIS 796.
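    For anyone who wants to estimate such a landed cost before reaching the checkout page, here is a rough sketch. The VAT rate, duty rate, exchange rate and shipping figure are illustrative placeholders, not Amazon's actual calculation, so check current rates and exemption thresholds before relying on it:

        # Rough landed-cost estimator for an import to Israel; all rates are
        # illustrative placeholders (Israeli VAT was 18% at the time of writing).

        def landed_cost_nis(item_usd, shipping_usd, usd_to_nis=3.8,
                            vat_rate=0.18, duty_rate=0.0):
            base = (item_usd + shipping_usd) * usd_to_nis  # CIF value in NIS
            duty = base * duty_rate                        # customs duty, if any
            vat = (base + duty) * vat_rate                 # VAT applies on top of duty
            return base + duty + vat

        # Example: a $330 card with $30 shipping comes to roughly NIS 1614
        print(f"~NIS {landed_cost_nis(330, 30):.0f}")

    With handling and courier fees added on top, that lines up with the ballpark figures quoted in the next comment.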
  55. Anyway, the GTX 970 and R9 390 prices are more or less the same. I did a test order with the 970.
    $330 ended up at about NIS 1750 including taxes; even with the conversion-fee difference saved,
    I ultimately saved 200~250 versus the price in Israel. I think I would still prefer to buy here, because of the warranty issue
    and so on, and because I can split it into a couple of payments.
    Obviously, if it were a NIS 3000 purchase like the example you gave, where importing saves close to a thousand shekels, the difference is already
    significantly larger and much more worthwhile.

  56. For the 970, prices in Israel are relatively stable. The differences are about 10-12% compared to the cheap stores.

  57. Attaching a video of a senior AMD manager who says that for now the video cards' bandwidth will remain at the 1.4 generation,
    and promises an adapter converting DP 1.2 to HDMI 2.0 in the winter.

    https://www.youtube.com/watch?t=138&v=BWbRSHtHI6c
  58. My prediction - the resulting hype will not justify itself and people will only be disappointed with the card.
    Over time it will improve, and bring back healthy, well-deserved competition to this field - and that will be its truly great success.

  59. COLOR = "silver"] - - - COMMENTED COMMENT: - - - [/ COLOR]

    [= Roeerden; 5061209] Attaches a video from a senior AMD manager who says that now the video card bandwidth will be a1.4 generation
    And in winter promises and converts to HDMI 2.0 adapter from DP1.2

    [U
    My prediction - the resulting hip will not justify itself and only people will be disappointed with the ticket.
    Over time, he will praise and return to a healthy and well-deserved competition in this field and thus his truly great success.

    - - - Unified response: - - -
    
    
    Attaches a video from a senior AMD manager who says that the video card bandwidth will now be in a1.4 generation and promises winter HDMI 2.0 converts and converts to DP1.2 
    
    
    https://www.youtube.com/watch?t=138&v=BWbRSHtHI6c
    
    
    Perhaps it is enough to flood with this here, there is a focused discussion on this exact issue.
  60. Well, all the reviews are out. Plus-minus at the GTX 980 Ti's level. There's a chance it's a slightly better buy than the 980 Ti for 4K.

    Only if it drops in price.
    When comparing a reference blower to water cooling, the latter clearly ought to have much more OC potential. Add the extra 2GB, and at a similar price (+/-10%) a custom 980 Ti takes it. Not to mention its hybrid versions.

    Two Fury X cards also means two internal radiators to find room for; I don't understand where..

  61. 1. The Fury X reference is water-cooled, so the comparison is just fine.
    2. The average customer doesn't care about that. He looks at performance.
    3. OC potential is not determined by the type of cooling. Having better cooling does not make a card a better overclocker. That's a logical fallacy :)

  62. Completely right about the OC, and this illustrates it exceptionally well.
    Like you said, without getting into the rest of the things that don't interest the simple consumer - only results in the field.
    The air-cooled one beats the water-cooled one both as a boxed product and at maximum OC.
    A good example of this:

  63. COLOR = "silver"] - - - COMMENTED COMMENT: - - - [/ COLOR]

    It will be very interesting to see how each company (MSI, ASUS & Co.) will differentiate itself when reference is such a tailored and closed water cooler.
    Hope this does not cause boredom between models as is the case in TX or 295X2
    Completely right about OC, and this is exceptionally exemplified (I didn't talk about a general case but specifically 980TI VS FURY-X, I probably wasn't clear about that).
    Like you said, without getting into the rest of the stuff that doesn't interest the simple consumer, only from FPS results in the field.
    Air cooling takes water cooling both in-product directly out of the box and with maximum OC.
    A good example of this:

    - - - Unified response: - - -

    It will be very interesting to see how each company (MSI, ASUS & Co.) will differentiate itself when reference is such a tailored and closed water cooler.
    Hope this does not cause boredom between models as is the case in TX or 295X2

  64. On the other hand, the custom 980 Ti versions also cost more than the Fury X.
    But there is no doubt the OC potential looks better on the 980.

  65. My main disappointment is that the OC headroom is too small.
    On paper it has all the right conditions - water cooling and a very low core temperature - but something there doesn't allow it.
    Apparently, as usual, AMD pushed the card to its maximum.
    Really, from a water-cooled single-core card I expected much more.
    Drivers remain their only hope now.

  66. If only it also gave 25% more along with its operating temperature, there would be justification for the expensive water cooling.
    After such a long time of building expectations and an endlessly drawn-out launch, now go wait for software updates and DX12/WIN10 drivers to make the difference. Well, it no longer catches on; it loses momentum, makes a bad first impression, and overall sadly resembles the AMD launches of old.
    The ultimate indicator will be 980 Ti prices over the next two months.

  67. Those are just like the excuses from the other side (drivers, bad OC, etc.).
    The picture is inconclusive, and the card is far from a failure.

  68. URL = "http://tpucdn.com/reviews/EVGA/GTX_980_Ti_SC_Plus/images/witcher3_2560_1440.gif"] in Witcher 3 [/ URL] (don't say niche game, which is the year's game of its kind) and the launch differences in drivers.

    There are things about him, the situation is not very bright.
    At the same time, the FURY X is a great performance product, simply coming and daring to put it in the 980 TI price tag is a wrong decision. Do you know what NVIDIA did shortly after the launch of the FURY X? Lowered 980 TI prices in Europe, a decision that directly affects us.
    I don't think there is one serious site that called FURY X a failure. The problem you have here is that there is a competing product of a company that has a quarter of the market power of its major competitor, which is similarly priced and does not generally perform better.

    Poor OC capabilities are no excuse. In the PC GAMING SHOW one of the engineers said that the FURY X could be ripped off in the overclock. What happened after the launch? The memory frequency is completely locked, and even under core water cooling does not exceed 8% frequency addition. This is far from light years from the phenomenal overclocking capabilities of the GTX 980 TI, in which many (and inexpensive) models succeed in adding north to the 20% net performance improvement. And that is without having to play with tensions.

    Drivers are no excuse either. As of today the situation is not in the sky at all. To date, long after launch, CFX is not supported

    In Witcher 3

    (Don't say niche game, facing the game of the year of its kind) and the launch differences in drivers are very worrying.

    There are things about him, the situation is not very bright.
    At the same time, the FURY X is a great performance product, simply coming and daring to put it in the 980 TI price tag is a wrong decision. Do you know what NVIDIA did shortly after the launch of the FURY X? Lowered 980 TI prices in Europe, a decision that directly affects us.

  69. I had hoped that the non-X Fury would be a major blow to NVIDIA's upper market, but with the R9 390X being a re-priced 8GB R9 290X at over $400, it's hard for me to see how it will cost less than $500 as a direct competitor to a very strong product in its own right, the GTX 980.

    At first I also hoped the lower market would get a hit with a card like the 380, including an X variant that would devour the competition, but the 970 doesn't even fall behind the 390 in FHD
    while costing no less than NIS 2000. We will probably return to playing with the green team after quite a few years with the reds.

  70. In 4K, 4-way CF/SLI:

    Great experiment!! It shows that the 384-bit memory bus restricts the Titan X in the 4-way SLI array (according to this bench), while the core restricts the Fury X as a single card (according to the other benches).
    And now ask - which of the cards is more balanced?

  71. I looked at the minimum FPS, which shows a 48%(!) difference in favor of the FX - an amazing difference! From that I derived that its core is its Achilles' heel when it stands alone, and that is very unequivocal in my opinion.
    The average-FPS conclusion in the same experiment was very embarrassing for AMD against the 980 Ti: only in the 4-card array, and at the same price, did the FX manage to outperform its competitor - and even that by a negligible 3%. The choice of the Titan over the 980 Ti works out well for AMD, if AMD is the one who made it.
    It makes you wonder what a TX with HBM would do... NV won't do it, for obvious reasons.

    Whatever; I really want to see the two systems in terms of chassis arrangement from the inside.

  72. 1 - In the array, scaling is seen to be significantly better with the FX in the min-FPS data. In my understanding, the minimum reflects the heaviest, most demanding processing, which is why we see each card's strengths most dramatically there. The FX's memory is faster (which is well known and clear even without this bench).

    2 - As a single card, the TX takes it in every bench I've seen across the various sites (except maybe a single game). So despite the FX's faster memory, the whole card is slower => a single FX core is slower than a single TX core.

    If the FX takes it as a card array but loses as a single card, then the FX is less balanced than the TX: the first has a core that cannot keep up with its memory, and the other exactly the opposite. If you need 4 FX cores to saturate its memory, it is clear which is less balanced.
    Each core handles a fraction of the frame, so the same processing power produces more FPS and therefore needs faster feeding from memory.
    If the memory is fast enough, the scaling is more successful; the contribution of memory speed is in realizing the processing potential rather than throttling it.

    I guessed at what would happen with a 980 Ti instead of the TX. We would see a similar result (the average difference was slightly more than 3% and the minimum difference slightly more than 48% - maybe not exact), but the price difference would be $0 rather than $1400. So you need 4 FX cards to get a negligible 5-6% advantage over the 980 Ti. You are invited to speculate on the peripheral cost of building such an array versus a single-card system; in my opinion, it is much more than $1400.
  73. Truthfully, I wouldn't rush to draw conclusions from the minimum FPS either, without seeing an FPS graph over a timeline. A single "accidental" stutter (for whatever reason) is enough to determine that figure in the picture.
  74. "Each core handles a fraction of the frame"

    Not necessarily. To my knowledge, the most common method is AFR, i.e. alternate frame rendering.
    Hence the continuation of your argument is also inaccurate.

  75. True, I forgot to address that. He's describing SFR, which was recently used under Mantle in the game Civilization: Beyond Earth. Maybe SFR will make a comeback with DX12, but today 99.9% of setups use AFR.
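    For readers who haven't met the two terms, here is a toy sketch of the difference between the schemes; the function names are invented for illustration, and real drivers handle this internally:

        # Toy illustration of AFR vs SFR multi-GPU work splitting.

        NUM_GPUS = 2

        def afr_schedule(frame_ids):
            """Alternate Frame Rendering: each GPU renders whole frames in turn."""
            return {frame: frame % NUM_GPUS for frame in frame_ids}

        def sfr_schedule(frame_height):
            """Split Frame Rendering: each GPU renders a slice of every frame."""
            slice_h = frame_height // NUM_GPUS
            return {gpu: (gpu * slice_h, (gpu + 1) * slice_h) for gpu in range(NUM_GPUS)}

        print(afr_schedule(range(4)))  # {0: 0, 1: 1, 2: 0, 3: 1} - whole frames alternate
        print(sfr_schedule(2160))      # {0: (0, 1080), 1: (1080, 2160)} - rows split per frame

    Under AFR each GPU still shades complete frames, so a single slow core caps smoothness even when average FPS scales well; SFR is the scheme that the earlier "each core handles a fraction of the frame" description actually matches.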

  76. AFR, you say?
    Well, that really takes the air out of the conclusion that the core is the limiter:
    it may still be true, but it is not the correct explanation.
    It might be possible to check this on Fiji's weaker models.

    By the way, if they tested the cards with different APIs, the comparison between the cards is compromised:
    either compare cards or compare APIs.

    By the way 2 - did you feel the earthquake just now?

  77. Let's see: overall, at "max" settings in 4K the Fury X CF leads by about 5%, where:
    - turning off AA in The Witcher 3 completely changes the scaling.
    - the greens' array drops to base frequency due to high temperatures with the stock cooler (in this situation the reds draw 56% more power than the greens).
    - there is stuttering in some games, probably due to a lack of memory.

    Moving to more moderate settings, the red array's average lead in 4K grows to 11%.
    But keep in mind that the Fury X CF runs at its "maximum" versus base frequency for the 980 Ti SLI.

  78. That's not a fair fight for anyone who has learned to overclock in 2 minutes in software. Two 980 Tis with an extra 20% performance, every day, for the same price - that's simply not a fair fight at all.

  79. This card is confused.
    If you need 2 units of it for it to be worthwhile, while in the same breath it is limited to 4GB, that just highlights the compromises they made with it.
    Anyone, and especially anyone spending more than $1300, would prefer to give up a little average FPS in exchange for smooth, clean playability without memory shortages.
    And every time it is compared to a reference Titan X, another nail goes into the Fury's coffin.
    No one in his right mind would buy 2 third-party units and not push them to the edge when 4K is the goal.

    They should have called it the 390X and not tried to market it with names whose expectations it does not meet.

  80. "That's not a fair fight for anyone who has learned to overclock in 2 minutes in software. Two 980 Tis with an extra 20% performance, every day, for the same price - that's simply not a fair fight at all."

    The Fury is still voltage-locked, isn't it? So compare stock to stock and OC to OC.
    I still maintain that until they open up the possibility of changing the voltage, the comparative story is not fully told.

    And omertgm, you keep repeating the 4GB mantra. From what the graphs show, the card has no memory problem with its 4GB.

  81. The memory frequency is completely locked, and the core frequency barely climbs to anything meaningful.

  82. Opening up the core voltage is not really something that exists for these cores either.
    You get a maximum TDP increase, similar to the R9 290/390 series, and then you can climb in frequency. The problem is that the climb is a small hill.
    Adding voltage to these huge cores would send everything up in flames.

  83. "And omertgm, you keep repeating the 4GB mantra. From what the graphs show, the card has no memory problem with its 4GB."

    Let it be said that the review in question actually mentioned the subject, at least with reference to 2 games (translated via Google):

    "Evolve are the other two games in which the R9 Fury X falls behind, and this time it seems clear that it is their amount of memory, limited to 4GB, which is the problem, not just in performance"

    This refers to the stuttering reported in the following paragraph - "fluidity". Granted, 2 games (in a memory context) out of 13 is not that bad, and you can compromise on settings, but it is still 4K.

    Regarding voltages and OC, we will have to wait. Truthfully, it surprised me a bit that both LinusTechTips and DudeRandom84 received cards with artifacts at stock. Hopefully just a coincidence.
    In another suspicious move, XFX is lowering its warranty to only two years instead of lifetime. I also understood that on the 300 series XFX still gives a lifetime warranty, which makes it the best buy.
    According to an XFX support representative, over the past six months too many cards have started reaching RMA following the Bitcoin mining frenzy. Truthfully, the argument sounds credible.

  84. http://uk.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overclocking-results
    They took advantage of a bug in AMD's own software: adding 100MHz/20% to the memory and 95MHz/9% to the core.
    It is unfortunate that they tested the change only in synthetic software (a 20% gain in Fire Strike).

    I wouldn't hold off buying just to see performance after a core voltage unlock, based on what this area has already delivered: no temperature problem at all, but very large variance between the various reviews. If a 980 Ti shipped with a 1350MHz base clock, we would see the same phenomenon of large variance.
    I think the Fury X may indeed have been "a dream for OC", but it reached most of the potential it had back at the factory, for obvious reasons.
    The improvement from extra voltage will, I think, be small on average and suffer from the same large variance it has now, becoming a matter of luck. I hope to be wrong and eat my hat on this.

    By the way, this is an enlarged core of a chip that has been around for a year, with the same voltage controller (third from the end) as the 290X - http://www.techpowerup.com/reviews/AMD/R9_Fury_X/5.html - and therefore software long since tailored to it.
    I don't understand what the big technical limitation is in getting a suitable tool; maybe it's AMD's decision not to allow core voltage changes..
    And here's the memory OC (at least on paper; they didn't show its relative contribution alone).

  85. An 8% OC to the Fury X's memory adds 1-2% performance.
    They say that extra voltage is just a matter of time.
