The Radeon Fury in Reviews: Life and Death in the Hand of Haste • HWzone


The second video card based on the Fiji core is already here, with performance that surpasses the GeForce GTX 980 - but at a price that complicates matters



As promised, AMD lifted its information embargo earlier than expected - and reviews of the Radeon Fury began flowing onto the network. Is this $550 card (or higher) the ideal middle point for high-quality gaming? The details are before you.

In contrast to the Radeon Fury X, which is offered only in a reference design that AMD's partners were not allowed to change, the Radeon Fury actually launched without any reference design at all, based on custom designs from only two manufacturers - Sapphire and ASUS. The former offers two Tri-X models with massive cooling mounted on a compact reference-style circuit board (like that of the Fury X), while the latter offers a single STRIX model based on a full-size printed circuit board, with support for 0dB technology that turns off the card's fans when the environment is cool enough to allow it.


We are big proponents of non-standard video cards with advanced technologies and cooling systems, but the "trouble" with the absence of a reference design is that it is much easier for manufacturers to deliver a dedicated branded card and aim for a fatter margin - so while the basic card is priced at the recommended $550, Sapphire's OC model runs $570, and the ASUS STRIX will be offered at $580, with each addition naturally hurting the Fury's overall cost/benefit ratio.

Similar, but different in price. Source: anandtech.com


Let's move on to the performance itself: here, the Radeon Fury has certainly succeeded in its main mission - it is faster than its closest competitor, the GeForce GTX 980. The differences are not large, about 6 to 7 percent on average at 1440p, but the card demonstrates its potential in the move to Ultra HD, where the Fury shines and the gap climbs to a double-digit average of about 12 to 14 percent. At Full HD and below the two advanced graphics cards are roughly equivalent, and the trend even reverses at times - but at such resolutions it might be better to settle for a slightly simpler card, saving the difference for the day a quality, competitively priced 4K screen can be purchased.

Source: hardwarecanucks.com

Despite the head start in performance, not everything is rosy for the new AMD model. Unfortunately, the biggest drawback of the Fury X migrates to its little brother: despite the massive and effective air coolers Sapphire and ASUS have attached to it, the card's overclocking capability is limited in software (and perhaps in firmware) to only about 10 percent on the core, and apparently the same 10 percent holds for the new HBM memory. This is particularly jarring because the GTX 980 it faces is known as an excellent overclocker - most samples can run without problems at core frequencies 15 or even 20 percent higher, which translates into an actual performance gain of 10 to 13 percent and greatly reduces, possibly eliminates, the relative performance advantage the Fury holds at stock.


According to techpowerup, the Fury has a performance lead (at resolutions above Full HD, as mentioned), but when it comes to power consumption and performance per watt the situation is quite different - although even here a big improvement can be seen in Fiji compared to the Hawaii generation.

We very much hope that AMD will allow proper overclocking that fully utilizes the Fiji core's potential sometime down the road, but until then it will be very difficult to recommend purchasing one of these advanced, non-standard cards, as almost all of their advantages become irrelevant as long as the core frequency cannot cross the 1,110 MHz mark.


According to techreport, the Fury cannot surpass the GeForce GTX 980 Ti - not in performance, not in relative performance consistency, nor in cost, as we already know.

At the end of the day, once again there is no clear-cut verdict. Anyone looking for a better-performing card straight out of the box - especially with an eye to the future and the 4K resolutions it brings - can choose to invest a few dozen more dollars (or a few hundred shekels, translated to the Israeli market) in the Radeon Fury; but anyone who is not afraid of overclocking, and who is not planning a leap into 4K territory, can get the same final product, and maybe even a better one, from the competing GeForce GTX 980, at a seemingly reduced price.

Another good card from AMD, which doesn't really exhaust its potential due to somewhat puzzling decisions by the manufacturer
Source: sweclockers.com

The main question mark we are left with now concerns the R9 Nano: when will it reach us, how will it fit into AMD's pricing hierarchy, what performance will it provide, and can it really be twice as efficient as the Hawaii cards? The clock is running and the countdown has begun again.



74 comments

  1. Really? You don't recommend a video card that is more powerful and only slightly more expensive (in dollars, and version-dependent) because its overclocking is limited to 10%? I'm guessing you didn't compare overclocked against overclocked, because according to what you're saying here, if an overclocked 980 merely equals or slightly passes this card at default, then if you overclock the Fury it would still win... Again, the article starts well and ends weakly... please learn not to talk nonsense; check the data instead of saying "maybe faster" or "maybe when overclocked", because you really have no idea (since you haven't checked).

  2. Maybe instead of running your mouth on the site you'd check for yourself. Here's an example:
    On the TPU site, after overclocking, the ASUS STRIX finished 6.4 percent faster:
    http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/34.html
    This means that where the relevant gap between the FURY and the GTX 980 is 7.5% at 1440p, the gap grows to 14.4%.

    Here:
    http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_980_G1_Gaming/30.html
    you can see that, compared to default, Gigabyte's G1 improves by some 21% in net performance after overclocking (without even adding voltage).

    Take that number, go back to the FURY comparison, and add 21% to the GTX 980's 93%:
    http://tpucdn.com/reviews/ASUS/R9_Fury_Strix/images/perfrel_2560.gif
    What you will see in that chart, if you overclock both, is the GTX 980 sitting at about 112.5%, and the FURY sitting at 106.4%.

    Not complicated.
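The arithmetic in this comment can be sketched in a few lines; a minimal illustration, where the 100%/93% baselines and the 6.4%/21% OC gains are simply the figures quoted above from TechPowerUp's 1440p chart:

```python
# Relative-performance arithmetic from the comment above.
# Baselines (TechPowerUp 1440p chart): FURY = 100%, GTX 980 = 93%.
# Quoted OC gains: +6.4% (ASUS STRIX Fury), +21% (Gigabyte G1 980).

def with_oc(relative_perf: float, oc_gain: float) -> float:
    """Scale a stock relative-performance score by an overclocking gain."""
    return relative_perf * (1 + oc_gain)

fury_oc = with_oc(100.0, 0.064)    # -> 106.4
gtx980_oc = with_oc(93.0, 0.21)    # -> ~112.5

print(f"FURY with OC:    {fury_oc:.1f}%")
print(f"GTX 980 with OC: {gtx980_oc:.1f}%")
```

Which is how the comment arrives at an overclocked 980 slightly ahead of an overclocked Fury, despite the Fury's stock lead.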

  3. As long as no one has tested the FURY against a G1 / HOF / any other monster at the same resolution, with OC, and in the same games, you can't claim what you wrote. You're basing it on one game tested at different resolutions; you can't say it represents anything.

  4. Battlefield 3 is the best-case scenario; the average improvement across games will usually be slightly lower than that. That doesn't help the Fury - if anything it widens the gap between the two.
    I have not seen a FURY review in which it gained more than this tiny amount from OC, which means it will sit more or less behind advanced 980 versions, possibly in line with or even behind a reference 980.

  5. The most unusual part of this "fine" card is the fact that only 2 manufacturers offer models of it. The large majority, even among the leaders - EVGA, MSI, Gigabyte - chose not to invest in it. Unprecedented, to the best of my memory, for a top-end card.

    And without manual OC, when you look at out-of-the-box models, the performance difference shrinks to negligible. So once again the FURY, like the Fury X, becomes the least worthwhile option for most of the public.
    AMD is missing out on the little things again and almost tripping themselves up.

  6. No one is neglecting anything; the cards are simply not available until the 16th.

    The main problem with them is the price.

  7. There may be inventory issues, and other companies are currently sitting aside because it doesn't pay off. SAPPHIRE likely got higher priority, with market share next in line.

    I think there are too many variables here to base conclusions on one game; as long as you haven't checked what you're claiming in depth, you can't say anything unequivocal.

  8. There are some people here who like to discredit AMD, and it has already snowballed -
    whether it's hyped-up drivers or features that they present as crushing advantages!
    The Fury is up to ~25% stronger than the 980 and even reaches a 30% difference in 4K! It's doubtful whether even a maxed-out 980 equals it at stock.
    And this is still with the first driver, which is likely to improve significantly later on.
    If the price in Israel is similar, or off by a small margin, there is no hesitation at all.

  9. No card from any company gives a gaming experience worthy of 4K.
    Bottom line: you can keep fantasizing about 4K at 60FPS, no matter whether you belong to the red or the green camp.

  10. No single card, but there is certainly an improvement with multi-card arrays - and obviously selling two cards serves both companies' economic interest.

  11. AMD has become a limited edition here :D
    I keep telling my kids that I'm special... and if I'm wrong, I'm just a limited edition :-)

  12. So I'm getting a card like the 980, only without Nvidia's features.
    So why should I buy it?
    I'll buy Nvidia, with the features and the drivers and the image quality.

    Another generation where AMD is merely closing the gap instead of breaking through.

  13. Image quality? And features like GameWorks, which hurt performance on Nvidia's own cards only a little less than on everyone else's? All the problems AMD has with certain games come down to studios refusing to cooperate with them because of Nvidia's draconian agreements.

    And we'll see how hard you laugh when your 980 gets drivers that reduce its performance to make the new cards look even better in due time.

    Meanwhile Nvidia is laughing with your money all the way to the bank.

  14. FreeSync is slowly entering the market and more and more screens support it.
    Mantle may not have caught on, but it showed the world the problem with DX11 and paved the way for DX12 by demonstrating the need for a low-level API; on weak CPUs it actually delivered a performance improvement, because the processor bottlenecked performance less.
    About the drivers: it seems Nvidia simply gave up on optimizing anything that is not Maxwell, while all the GCN cards keep getting every improvement. At launch the GTX 760 was almost ten percent ahead of the 270X; today, with updated drivers, there are loads of games where the 270X passes it. Cards that were weaker than their green competitors are now much closer and even surpass them. And it's not that AMD flew forward - it's that Nvidia neglected those cards instead.

  15. It's all well and good that it's slowly coming in, that AMD showed the world the wonders of low-level APIs, that they were first with HBM and so on and so forth. But if you invest without getting any return, your strategy is bankrupt. The problem is that this has been a recurring trend over time, and with statements and declarations that don't stand the test of reality, they are starting to lose the trust of the public (and of shareholders).

    Drivers:
    You are talking about a lack of optimization for new games on cards from previous architectures, not about degrading performance via a driver.
    That is, NV stopped improving performance for new game engines on previous architectures (Kepler, Fermi), while AMD continued to support and improve performance on the same architecture (GCN).
    Between that and claiming they ship drivers that degrade cards that perform well, there is a huge difference.

    So NV prefers to focus on improving the current generation (Maxwell) over previous ones - not ideal, even smelly, but there is some resource-allocation logic to it. AMD's driver division also allocates resources and is not entirely without problems (poor or missing CrossFire support in some attractive games).

  16. AMD pushed the "low-level API" to work around their draw-call overhead problem in DX11, where Nvidia leads significantly (on the driver side).
    To me, Mantle served as a lifeline for AMD until DX12 came out - perhaps even a hedge in case DX12 did not fully meet their needs.

    The story with FreeSync is not fundamentally different: they made sure to push the Adaptive-Sync standard to fill their gap in the market against Nvidia's G-Sync.

    As for drivers, about a month and a half ago a driver was officially released that handled Kepler optimizations and fixes. That came after claims spread around the web that Nvidia had sabotaged Kepler's performance to promote Maxwell.
    What is certain is that you will not hear from those same people about the driver that improved performance for the R9 3xx series at launch (while the 2xx series, with the same architecture, was left out).

    In the end, every company looks after its own backside first and foremost, and each adopts the strategy that suits it. There are no bad guys here. There is competition, and all the competitors advance the technology in one way or another.

  17. So much gossip and nonsense talk here...
    Yes, yes - just like certain slackers who don't work and smoke all day, and then reinvent themselves as stay-at-home parents (treating the children like keychains or other decorations).
    AMD SUCKS in terms of quality.
    Since Nvidia has no competition, it can do what it wants (its right), and it definitely has quality!
    Regarding the drivers (do yourself a favor - don't talk if you haven't seen with your own eyes or tested with your own hands!):
    AMD has been treating drivers for over two years like a blind date!
    Nvidia, on the other hand, invests in drivers!
    There is full support from Fermi onward! (True, some users have problems, but those mostly come from the combination of the OS, its codecs and software, and the driver.)
    About performance degradation because of drivers: that's true, but it concerns one specific card!
    When the GTX 980 came out... the greens sabotaged the GTX 780 Ti's drivers and thus gave the 980 the desired EDGE (after all, Maxwell brought no change in transistor count! There is an improvement thanks to the architecture... but nothing that dramatic...) (less efficient utilization of the cores). (When we get to the new FinFET nodes, we will see a significant change.)

  18. none77, you have received a warning for language and disrespect.
    And odedyac, or cowgirl, or whatever you want to call yourself (I'm in favor of user freedom) - you could be a little less vocal and use punctuation instead of random characters on the keyboard.

    Please try to hold a normal discussion about two companies that offer us, the customers, products - instead of being tempted to become their attorneys. A little objectivity, please.

  19. AMD is trying to repeat the successful exercise of the 290/X here, only this time the price difference is $100 instead of $150.
    Another interesting thing: they disabled 14% of the processing units for a 6-7% decrease in performance (http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/31.html). With the 290/X, disabling 9% of the units cost them the same 6-7% (at 1080/1440; 4K is irrelevant). The core-clock differences are similar in both cases, and the relative difference is negligible.
    A. Whoever goes with this AMD card almost completely abandons the FX. With the strange price mix in this country, to the extent that it drops below NIS 3,000 it may well be a worthy competitor to the 980.
    B. The FX does not seem to be using all the power at its disposal.
    C. They still have a chance with the Nano - disable a few more cores for a small performance hit, knock off another $100, and in a compact package they have a strong card. They just shouldn't take their time. I have no doubt that NV has at least one more intermediate model to offer.
  20. Against the 390 there is the 970, where the extra NIS 100-150 is justified by the additional performance it delivers after an easy OC. The power supply is also a consideration.
    If you're going purely by money-to-performance ratio, then the 290 - but it is no longer in the top segment, and not at the performance level we're discussing here, 980 and north.

  21. "Against the 390 there is the 970, where the extra NIS 100-150 is justified by the additional performance it delivers after an easy OC. The power supply is also a consideration. If you're going purely by money-to-performance ratio, then the 290 - but it is no longer in the top segment, and not at the performance level we're discussing here, 980 and north."

    I would like to address the 3 claims raised, each separately:

    970 vs. 390
    The 390, to my taste, is in principle preferable - even decisively so - over the 970, for a number of reasons: 1. The 970's memory-size problem, effectively only 3.5GB, which is already starting to affect top-end titles and will matter far more later on; hence its longevity, due to the limited memory, is not long. 2. The 970 has probably already reached the ceiling of driver optimization possible for it, while the 390 series (probably) has greater scope for improvement still to come. 3. Nowadays most new titles come out first on the game consoles and are therefore written, in principle, for AMD hardware - so these titles can later be expected to benefit from running on AMD-based PC hardware as well. 4. A fourth reason, not technical but commercial: as consumers in the duopoly that is today's graphics-acceleration market, if we contribute further to AMD being pushed out of the gaming arena entirely by continuing to support the front runner, Nvidia, the underdog, AMD, disappears and we remain with a single contender - a de facto monopoly - and then everyone will cry and beat their chests over the sin that brought this situation about. Because then we will pay much more for every accelerator, and we will get technological upgrades at a turtle's pace. Not that a duopoly is delightful, but it is still infinitely better than a monopoly.

    Power supply
    Regarding the power supply, I don't understand what the problem is: any proper power supply of about 400-500 watts and above will run either card, unless it's some obscure Chinese supply that came in a ninety-nine-shekel bundle computer - in which case you should fear for the entire computer even before connecting a video card to it. Since anyone purchasing a graphics accelerator for almost NIS 2,000 most likely also has a reasonable, worthy computer with a decent supply from a known brand (even if a modest one), no problem is expected from this direction.

    Value for money
    In principle, you can keep dropping down to the next segment, and buy second-hand 7900 / 680 / 770 leftovers for a few hundred shekels, with even better performance per shekel. For NIS 600-700 you buy a 7900 or a 680/770, and for that you get around 50% of what the Fury X currently gives - which, let's remember for a moment, costs NIS 3,500 new. So there you have it: for a fifth of the price, half the performance. If the goal is to chase "maximum performance per shekel", then one has to run away from the NIS 3-4K top end, and even from the NIS 1,700-1,900 middle of the table, and go for used gear from three years ago (which was then the top of the top).
  22. You can always go down and down and down, but that's not where the market lives. With games running at 2x and 4x the resolutions they once did, the processing power sometimes simply has to be huge - so you open your pocket and take not 75% or 50% but 100%, and preferably 120% via overclocking software, something Nvidia's latest cards do great.

    The reason I won't address point 2 at all is that AMD takes slaps left and right with new titles and is usually left behind with a small crowd of angry customers - see Assassin's Creed: Unity, Project Cars, Witcher 3 in CFX, Watch Dogs and more, where support is missing or performance is ugly. I don't know whether to blame AMD's limited software staff or the carelessness of the game makers - only that these are the facts on the ground, and the customers suffer.

    I definitely agree that at around NIS 1,700 the R9 390 is great value, and the 8GB is a plus. I do not agree that NIS 1,700 is a winning price; when it comes to an underdog with a fifth of the market share, a consumer price around NIS 1,500 would be right in this case.

  23. Everything you listed is possible, but what links it all together is "maybe, sometime in the future, the 390 will be preferable". Right now it has no advantage. You put it quite rightly - in about a year all the cards will be shuffled again. So maybe AMD will manage to improve the drivers, and maybe DX12 will show them a bigger improvement than NV, and maybe the Fiji cards will get a Vcore option that lets them fly under OC - and maybe, maybe. Personally, in light of the events surrounding the company lately, I neither count on nor trust them enough to rely on these "maybes".
    I also hesitate to buy a product "out of pity" for a particular company when a preferable option exists. If something is stuck over there, why postpone the end and push off the substantive change that would bring them back to the lead?

    For a person upgrading an existing system with a years-old 500W supply, I would not recommend the 390, whereas a 970 can get by (about a 77W difference). The 970 is some 40-50% better in performance per watt. Not an essential point - just something that makes it more likely a 970 will fit into an existing system.

    I agree with you, and I said we are talking about the high-end market, 980 and above, so the 390/970 option is irrelevant to this discussion. I brought it up because you said it was "super efficient in performance per money", which is wrong once you consider the 970.

  24. https://www.youtube.com/watch?v=-BTpXQkFJMY
    The GTX 980 came close but did not beat the FURY, even after OC. Drivers that improve performance may yet come out, so it can still get better. And if you want to add a second card, we saw that CFX brings a bigger improvement, so AMD's solution is probably preferable (for 4K).
  26. Following djelectric:

    "I would be happy to see a 390/X with 4GB of memory in exchange for a NIS 200-300 price drop (a 390X closer to NIS 2,000 would be a hit). Who came down from the tree to market these as 4K cards? In the current market situation there is no point in that. Because of an incorrect marketing decision - 'these are 4K cards' - accompanied by another technical mistake ('therefore 8GB is an advantage'), they are losing the opportunity to put up a real fight in the mid-high market, which has a lot of meat on it. They have the potential, but they are not realizing it. They made their choice, which came in response to a given situation, and now we will see how the market responds. I am still waiting for a card of theirs that I can recommend and buy myself wholeheartedly."

    The reason they installed memory chips totaling 8GB is not the apparent desire to suit higher resolutions (which is what you'd think at first),
    but a purely technical one: these memory chips are built on a more advanced lithography and can therefore run at a higher frequency under lower current draw.
    They wanted to go from 5Gbps to 6Gbps per pin, and the only chips that met the required voltage (i.e., low current consumption) were new-generation chips on modern lithography -
    and those, as a by-product, are also double the capacity.
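A quick sketch of the memory math implied here. The 512-bit bus and 16-chip layout of the 290/390 series are assumptions on my part (they are not stated in the comment itself); the 5 Gbps to 6 Gbps move and the doubled chip density are what the comment describes:

```python
# Peak GDDR5 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# Assumed: 512-bit bus and 16 memory chips, as on the 290/390 series.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(512, 5.0))  # 320.0 GB/s (5 Gbps chips)
print(bandwidth_gb_s(512, 6.0))  # 384.0 GB/s (6 Gbps chips)

# Doubling per-chip density on the same 16-chip layout doubles capacity:
chips = 16
print(chips * 2 / 8, "GB ->", chips * 4 / 8, "GB")  # 2 Gb chips -> 4 Gb chips
```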

  27. An interesting point - I looked for a mention of something like this and couldn't find one. Do you have a link?

  28. I don't understand why you'd save a few hundred shekels and get stuck with a card that has no support for the newest games - as mentioned here, Watch Dogs and more.
    Besides, everyone knows Nvidia's image quality is better than AMD's.

    Better to increase the budget, buy Nvidia, and be done with it.
    That way you know you'll have no problems with drivers or heat, and you won't have to be upset about a missing feature or degraded quality.

  29. People really like to forget the driver that burned video cards. The 480's cooler is probably one of the reasons the glaciers are thawing. I even remember one site getting a developer sample of the card that was supposed to replace the 480 at the top end, with a fully enabled core - and God help us, it was so hot and consumed so much electricity that it simply never launched. Better image quality? In video, precisely, people say AMD's image processing is better. Besides, the cards only run warm with shitty reference coolers - and it was the same with Nvidia. I have a 270X with advanced cooling that doesn't pass 60 degrees, so Nvidia's reference cooling being relatively good is one thing; but when every normal card comes with a manufacturer cooler that can hold a Fury or a Titan at 75 degrees without a problem, this whole section is irrelevant.

  30. "I don't understand why you'd save a few hundred shekels and get stuck with a card that has no support for the newest games... everyone knows the image quality on Nvidia is better than AMD's... Better to increase the budget, buy Nvidia, and be done with it."

    Popular slogans that are not backed by anything. Too bad.

  31. You spared me writing the comprehensive answer that our populist predecessor needed thrown in his face :xyxthumbs:
  32. "...and there is plenty more to fill the page with and more examples to bring, but to my taste what you've brought is enough to understand the point."

    That's the thing:
    all the latest AMD cards are like the 480 -
    warm,
    with bad drivers,
    and poor picture quality.

  33. What's the point of bringing up cards from the past? It's like arguing that the Pentium 4 ran hot (anyone remember Prescott?) and that therefore an AMD processor is better today than an i5/i7.

    AMD is on the way down. It loses market share every year. What are the reasons? Again:

    picture quality
    drivers
    heat
    lack of features

    The only reason they have not yet collapsed in front of the giant Nvidia is the low price.
    At the level of principle, they admit they are less good -
    so they sell cheaply.

    That's it. There's nothing to be confused about.

  34. http://www.plonter.co.il/detail.tmpl?sku=STRIX-R9FURY-DC3-4G-GAMING
  35. Wait a year or two - and it's not certain the market is headed there, for the simple reason that if 4K video cards come out and 4K becomes the PC gaming standard, a big gap will open between consoles and PCs,
    something we all see that no one wants... Hardware advancement does not depend only on knowledge or the ability to produce.

  36. God, how many times does it have to be said: there is no difference in image quality; the differences in heat vary from generation to generation (the TITAN X and 980 Ti, for example, sometimes have problems with their stock cooling); and it is simply not true that AMD lacks features - some they developed themselves, and for some there is the ideal situation of vendor-independent alternatives (third-party / open source). Drivers are not as big a difference as claimed either, especially in release frequency (for some unclear reason people keep counting only drivers with a Microsoft signature, even though that nonsense doesn't matter).
    Just now there was a comparison of the GTX 980 Ti and the FURY X in an ITX enclosure (important to note, as many people are switching to small computers): the FURY X barely reached 70 degrees, while the 980 Ti hit its maximum temperature with no possibility of boost.

  37. It doesn't matter - this GTX card is better.
    And here's a difference in image quality:

    https://www.youtube.com/watch?v=trq6B4anzjM
  38. http://www.plonter.co.il/detail.tmpl?sku=STRIX-R9FURY-DC3-4G-GAMING

    NIS 3,335 - a pretty entertaining price. Judging by their price for the FX (NIS 3,975) and the $100 MSRP difference between the two cards, you might think a dollar is worth NIS 6.4. The FX currently goes for NIS 3,490 at KSP, so by that conversion rate this FURY should cost only NIS 2,850 - an excellent price that would break the 980 :cog: It seems to me this is the primary thing preventing AMD from recovering here (and not the somewhat unnecessary statements by Yoram).

    :funn: The above should be read in a cynical tone of despair at the absurd prices in this country, in a moment of jokes and general good spirits :funn:
  39. AMD is on the way down. It loses market share every year. The reasons? Again: image quality, heat, drivers, lack of features. The only reason they haven't yet fallen flat against the Nvidia giant is the low price. In principle they admit they are less good, so they sell cheap, that's all. There is nothing too confusing here.

    Compute performance is better on AMD.
    First to support three screens from one card, first with Eyefinity.
    Hardware image upscaling without losing performance.
    They moved to a CrossFire that doesn't require a bridge, unlike SLI.
    They worked to integrate frame-synchronization technology over DisplayPort, and are working on adding it to HDMI as well, which could bring support to millions of screens versus G-Sync.
    AMD was also first to launch a video card with GDDR5 and, if I'm not mistaken, also the first DX11 video cards.

    Now go look for your green trolls elsewhere.

  40. Compute performance is better. First to support three screens from one card, first with Eyefinity. Hardware image upscaling without losing performance. They moved to a CrossFire that doesn't require a bridge, unlike SLI. They are working on integrating frame-synchronization technology over DisplayPort and adding it to HDMI as well, which could bring support to millions of screens versus G-Sync. AMD was also first to launch a video card with GDDR5 and, if I'm not mistaken, also the first DX11 video cards. Now go look for your green trolls elsewhere.

    Compute performance is poor. CUDA beats it dozens of times over; the same goes for everything else.

    Eyefinity? You're an AMD fanboy. Who actually uses it? 1% of gamers. The majority just buy a 46-inch screen.

    Upscaling doesn't help them make a better GTX. This is Nvidia, don't forget.

    They moved to a CrossFire that requires no bridge versus SLI. Again, it interests no one: the image quality is poor, there's no CUDA, no PhysX, no hair effects and no voxels.

    "Working on integrating frame-synchronization technology through DP and adding it to HDMI as well, which could bring support for millions of screens vs G-Sync." Again, it interests no one: poor image quality, no CUDA, no PhysX, no hair effects and no voxels.

    First or not first, forget it. Poor performance. Poor image quality. Poor driver quality.

    Now go look for your red trolls.

  41. In terms of performance it is more accurate to compare the 980 to the 390X.

    Depends on the resolution. At stock frequencies it is comparable to the Fury at 1080p; at 1440p it sits between them, though closer to the 390X.
    But considering its overclocking capabilities, especially in versions with advanced cooling (a performance improvement of about 20% above stock), it already climbs toward a stock 980 Ti.

  42. Suppose someone is going to buy an AMD card now, because he has a FreeSync-capable screen.
    Given: the Fury is stronger than the 390X by 7% / 10% at 1080p / 1440p respectively.
    The price difference here is currently NIS 1,000; let's say it falls and settles at a difference of 800.
    That's similar to the 10% performance difference between the 970 and 980 plus the 1,000 gap at the time, when the recommendation was unequivocally the 970.
    Their OC capabilities are similar, so that is not a factor; power consumption is similar; and memory capacity, if anything, favors the 390X.
    So… why buy a Fury anyway??
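    The value question above can be put in numbers. A minimal sketch, using only the commenter's own figures (the 7% / 10% performance deltas and the NIS 800-1,000 price gap are claims from this thread, not measurements):

    ```python
    # Back-of-the-envelope value comparison between the Fury and the 390X,
    # using only the figures quoted in the comment above (commenter claims,
    # not measured data): Fury ~7% / ~10% faster at 1080p / 1440p, with a
    # price gap of roughly NIS 800-1000.

    def extra_cost_per_percent(price_gap_nis, perf_gain_pct):
        """NIS paid for each percent of extra performance."""
        return price_gap_nis / perf_gain_pct

    # Optimistic case: NIS 800 gap, 10% gain at 1440p.
    print(extra_cost_per_percent(800, 10))    # 80.0 NIS per percent

    # Current case: NIS 1000 gap, 7% gain at 1080p.
    print(extra_cost_per_percent(1000, 7))    # ~142.9 NIS per percent
    ```

    Either way, each percent of extra performance costs a three-digit-shekel premium, which is the commenter's point.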

  43. That's the thing: all the latest AMD cards are like the hot 480, the driver is no good and the picture quality is poor.

    Once again, popular slogans that not only are backed by nothing but, worse, are the opposite of reality.

    For that matter, I will save myself the search across the net and leave it to you as homework.

    A good place to start learning is to read all the latest reviews of AMD's new generation of cards,
    including the Fury and the 390, and discover that not only are they cooler than their counterparts from the Nvidia camp, but also quieter.

    Go and learn.

  44. Once again, popular slogans that are not backed up and, worse, are the opposite of reality. For that matter, I will save myself the search across the net and leave it to you as homework; my experience has shown that this is the only way to learn seriously. A good place to start your study would be to read the latest reviews of AMD's new generation of cards, including the Fury and the 390, and find out that not only are they (on average) cooler than their counterparts from the Nvidia camp, but also quieter. ** I recommend starting with anandtech's website and moving on to Tom's Hardware as well. Regarding "bad drivers", I can only conclude that you (probably) do not own any AMD card, and so prefer to come out with baseless sayings (which you probably read somewhere on the net, in amateur and non-serious forums), without understanding the subject and without sufficient personal experience with the components in question. As for poor image quality, let's say this: on the day you can analyze and process image quality yourself, present a proper comparison where the image quality from the two cards is displayed side by side, and explain to us why the quality of one image is better than the other, what parameters demonstrate it, how it is expressed and what causes it, you are assured a proper return on your investment from the readers. But until then, we would kindly ask that forum members not fill our reading space (in our professional forum) with slogans thrown into the air that are neither professional nor serious. I'm sorry if I was a little impatient with you in my aggressive remarks.

    I gave an example of Nvidia's excellent image quality over the cheap AMD card, even without the PhysX.
    On Nvidia the quality is better; ask whoever you want, it's already well known.

    AMD fanboy.

  45. Yoram, must you grind our brains in this amateurish way and with these slogans? You can push your PhysX wherever the sun doesn't shine; there are a few titles that support it, and even then it's not exciting (and I have two GTX 980s, so spare me). At the time I even had a separate PhysX card, before the company was bought by Nvidia, and even then it was endearing and not much beyond that. As for image quality, for most of their history AMD/ATI's image quality exceeded Nvidia's, which only proves how much of a child you are and how little time you have been active in the field.
    Unlike you, I zigzag between AMD and Nvidia according to

    whoever gives better value for the price.

    You, young guy, are a 100% fanboy.

  46. Yoram, must you grind our brains in this amateurish way and with these slogans? You can push your PhysX wherever the sun doesn't shine. There are a few titles that support it, and even then it's not that exciting (and I have two GTX 980s, so spare me). At the time I even had a separate PhysX card, before the company was bought by Nvidia, and even then it was endearing and not much beyond that. As for image quality, for most of their history AMD/ATI's image quality exceeded Nvidia's, which only proves how much of a child you are and how little time you have been in the field. Unlike you, I (and others) zigzag between AMD and Nvidia according to whoever gives better value for the price. You, young guy, are a 100% fanboy.

    The facts show that the opposite is true.
    Image quality is higher on Nvidia, and that is well known.

  47. I have a feeling it's a troll run by one of the moderators; I've seen this movie once or twice before...

  48. I gave an example of Nvidia's excellent image quality over the cheap AMD card without the PhysX. On Nvidia the quality is better; ask whoever you want, it's already well known. AMD fanboy.

    Your examples are irrelevant; AMD has a parallel open technology called TressFX.
    "Nothing parallels PhysX"? That works on kids.

    Regarding "AMD fanboy": given your unfamiliarity, I forgive you.

    In a nutshell:
    I am an independent science and technology developer, working in defense industries that use a lot of graphical acceleration.
    The variety of accelerators and graphics processors I have gone through in the past 25 years includes 3dfx, Matrox, Silicon Graphics,
    Nvidia, ATI....

    I have no commercial brand loyalty or admiration for either of the two remaining manufacturers currently on the market;
    both are respected professionals with an incredible reputation and history, to my perception and in my best experience.

    The product is made by people, not by a sticker or a branding symbol, and as a rule people on both sides move from one manufacturer to the other
    from time to time, so there is a great deal of symbiosis between the manufacturers, far beyond what one might think.

    If you believe you are being attacked for your childish, brand-driven support of said Nvidia only because the attacker
    belongs to the opposite camp, that is a wrong conclusion. If you were advocating in the same unprofessional way for AMD,
    you would have received from me, as well as from others, the same rebuke for your misconduct on our forum. If you have any doubt about it,
    check us out.... check me out.

    There was one in the past who spouted nonsense in favor of AMD, about two years ago, and we pushed back on him just the same (yours truly included).

  49. Your examples are irrelevant; AMD has a parallel open technology called TressFX. "Nothing parallels PhysX"? That works on kids. Regarding "AMD fanboy": given your unfamiliarity, I forgive you. In a nutshell: I am an independent science and technology developer, working in defense industries that use a lot of graphical acceleration. The variety of accelerators and graphics processors I have gone through in the past 25 years includes 3dfx, Matrox, Silicon Graphics, Nvidia, ATI.... I have no commercial brand loyalty or admiration for either of the two remaining manufacturers on the market today; both are respected, highly reputable professionals, to the best of my experience. The product is made by the people who work at these companies, not by a symbol, sticker or branding. As a rule, people on both sides move from one manufacturer to the other from time to time, so there is considerable symbiosis between the manufacturers, far beyond what it might seem. If you are of the opinion that you are being attacked here for your childish, brand-driven support of one manufacturer only because the attacker belongs to the opposite camp (only you could think so), that is a completely wrong conclusion. If you advocated in the same unprofessional way for AMD, you would receive (from me as well as from others) the same rebuke for your misconduct on our forum. Note that before the current AMD card I have at home I had an Nvidia card, before that an AMD, and before that again an Nvidia. At work, our products run dozens and hundreds of cards from both manufacturers. By the way, there was already one in the past who spouted nonsense in favor of AMD (about two years ago), and we pushed back on him just the same (yours truly included). If in doubt, check us out.... check me out. 
We always try and work to keep this place a professional and proper benchmark, so that there really is something to learn here and quality knowledge to exchange with one another. And also a pleasant place, as far as possible.

    You are a developer. You have no eye for color and such matters of quality.
    You, unlike me, open the same game on the 2 cards and don't see the difference.

    NVIDIA:
    - Better white balance as a result of a better driver. The textures all look better. The GPUs translate the game's color space more correctly. It always looks better. Also, the way the driver handles the pixels before presenting them gives better output. And you see it in the game. One small difference plus another small difference, in the end, make a difference. When I see someone playing on Nvidia, honestly, it looks better to me.

    - The features allow much richer dynamics, with PhysX as the main player.
    3D capabilities are embedded in all the cards. Today a GeForce, which is a gaming card, is dozens of times better than AMD's dedicated FirePro card, mainly because of CUDA. And Nvidia's driver, for some reason, talks better with all the software, except Maya, where AMD works well.

    - And the most important reason: every so often the card changes down to the core. And with it come completely new things. The voxels and the hair effects are just the tip of the iceberg.

    So do me a favor, I don't praise them for nothing.
    All the advantages presented here for AMD end with "they invented Eyefinity" or things nobody uses. Against Eyefinity I can bring up Nvidia's 3D Vision glasses.

    "They issued GDDR5 first." Who cares. It says nothing about image quality.

    Like I said, I had ATI for many years.
    And I admit that:

    The 7500 was superior to the competition because of a really good price, and Nvidia hadn't yet come up with an answer, so it was worth it.
    The 9800 Pro was superior to the competition.
    The 4870 was better than the competition.

    But today??
    Today the GTX has opened a gap over AMD. It is simply a more successful product. Even if it cost a few hundred shekels more and gave a few frames less (which doesn't happen), it would still be the better product to use.

  50. Guys, just a question:
    I have two R9 290 cards in an out-of-the-box OC version, with water cooling (which makes the idea of upgrading a bit problematic, because my whole system is water-cooled and replacing cards is a whole story).
    But the question is... is it worth switching to an R9 3XX when I have 2x R9 290?
    I play on three screens, mostly simulators and BF4 for fun.
    I've always said that until there is a single card capable of beating 2x R9 290 I have no reason to upgrade, but I'm no longer sure that's still the case, so I'm asking here. Besides, since I haven't dug into the dry data of the new cards, I haven't formed an opinion.

    By the way, for the guy who claimed something about white balance on the green cards: I don't pretend to know who displays better image quality, but when it comes to color it is a totally subjective matter. Every screen, even from the same manufacturer and the same model, shows a different white balance. You need a professional screen calibrator to get the white balance right, and that is not about the video card but about the screen itself, the panel, and the panel's technology.

  51. Compute performance is better. First to support three screens from one card, first with Eyefinity. Hardware image upscaling without losing performance. They moved to a CrossFire that doesn't require a bridge, unlike SLI. They are working on integrating frame-synchronization technology over DisplayPort and adding it to HDMI as well, which could bring support to millions of screens versus G-Sync. AMD was also first to launch a video card with GDDR5 and, if I'm not mistaken, also the first DX11 video cards. Now go look for your green trolls elsewhere.

    It's not clear to me what the advantage is of dropping the bridge that SLI uses.
    After all, without a bridge you still need communication between the 2 cards. How? Over the interface itself (PCIe), and that takes bandwidth and, down the line, performance.

  52. PCIe 3.0 x16 has enough bandwidth. The problem was in the bridge, which did not have enough bandwidth for high-resolution formats, and because of that they moved to XDMA.
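    A rough calculation shows why the PCIe link has plenty of headroom for the bridgeless (XDMA) frame traffic. This is a back-of-the-envelope sketch: the frame sizes and link rate below are standard arithmetic, not vendor-published CrossFire numbers.

    ```python
    # Why PCIe 3.0 x16 comfortably carries bridgeless CrossFire traffic.
    # All figures are rough estimates, not vendor specifications.

    def frame_bytes(width, height, bytes_per_pixel=4):
        """Size of one uncompressed RGBA frame in bytes."""
        return width * height * bytes_per_pixel

    # In alternate-frame rendering the secondary card ships every other
    # frame to the primary card, so at 60 fps it transfers 30 frames/s.
    transfer_rate = frame_bytes(3840, 2160) * 30          # bytes per second
    print(transfer_rate / 1e9)                            # ~1.0 GB/s

    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 16 lanes, one direction.
    pcie3_x16 = 8e9 * (128 / 130) * 16 / 8                # bytes per second
    print(pcie3_x16 / 1e9)                                # ~15.75 GB/s

    # The 4K frame traffic uses well under a tenth of the link.
    print(transfer_rate / pcie3_x16)                      # ~0.06
    ```

    So a 4K/60 alternate-frame workload needs roughly 1 GB/s, a small fraction of the roughly 15.75 GB/s a PCIe 3.0 x16 slot provides in each direction, which is the point comment 52 is making.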

  53. You are aware that CF users have significantly less game compatibility than SLI users, and that is a fact.

  54. And what does that have to do with what I said? Besides, they can't always be blamed: sometimes it's poor game design and developers' reluctance to cooperate with them because of their agreements with NVIDIA. True, they make mistakes, but you can't put all the blame on them.

    What does this have to do with NV and the developers? Either you know how to write a driver or you don't...

  55. Because it is not enough to tick a checkbox in the driver and have it work. It goes through testing, and if repairs are needed it's not always that simple, especially when you don't know what the hell is going on in the game's code that won't let CFX work.

  56. I would run a double-blind test on you and you would come out absolutely sorry, I promise you.

    The gentleman known as Yoram has long since lost the readers' attention here.... Too bad for us all.
    Ignoring him is the most correct approach.

  57. :xyxthumbs: The gentleman known as Yoram has long since lost the readers' attention here... too bad for us all. Ignoring him is the most correct approach.

    Don't worry, I will keep recommending NVIDIA to people and saving them from buying a cheap AMD product.

  58. You recommend people buy AMD, and second-hand at that.
    You ruin people's purchases.

    The whole fun is opening a new video card box with everything brand new. That's why there is a concept called UNBOXING.
    What do you understand, anyway.

  59. Fanboys, fanboys everywhere!
    I mean Mr. Yoram, of course.

    And on a serious note, my only problem right now with AMD, and I stress, the only one, is not performance, size, heat, power, etc...
    Just one thing: the price!
    Their products are fine, but not at the price they demand, and that is the problem with the current generation.
    For the avoidance of doubt, I have a 980 SLI that will be replaced by a 980 Ti...

  60. NIS 3,335: judging by their price for the Fury X (NIS 3,975, with a $100 MSRP difference between the two cards) one might think a dollar is worth NIS 6.4. The Fury X currently goes for NIS 3,490 at KSP, so by that conversion rate the Fury should cost only NIS 2,850 there, a price that would break the 980 :cog:
    
    http://ksp.co.il/?select=&list=1&sort=3&glist=1&uin=27604&txt_search=fury&buy=&minprice=0&maxprice=0
    At the time I was laughing, but it turns out NIS 2,850 really is the price, 
    against NIS 2,775 for a good custom 980.
    There is real competition here (at least at KSP). I think a 980 can be had for 2,500 as well, but 350 is still a reasonable difference that could create a new situation in the high-to-mid range.
  61. $550 for the regular Fury (not the X), versus the $500 a 980 costs, is a price that does not justify itself, and this was already written a few pages earlier in this thread.

    The thing is that after AMD lowers the price toward the 980's, Nvidia will not stay a sucker either and will lower its price in turn (it already cut $20 the day before).

    There will be price competition and a downward crawl, that's pretty clear, and consumers benefit.

    Though again, it is a mistake in principle to spend close to three thousand shekels on a graphics accelerator which, in a year, when the next generation launches at 20 nm, will do well to retain about a third of its value.

  62. For a moment, forget what is happening abroad: here in this country there is a completely different price mix that creates a realistic possibility of competition here and now (provided, of course, it is in stock).
    No need to wait for the effects of price cuts abroad, which may or may not reach Israel.

  63. From what I have noticed, prices in this country (in shekels) run at a ratio of about 5.1-5.3 times the US retail price in dollars, which is fairly logical since beyond shipping there is also VAT.

    This ratio generally holds for both AMD and Nvidia products.

    That is, as long as we're not talking about the top end like the 980 Ti which, because of its inherently high price (as a flagship product), is brought to Israel in very small quantities.
    To justify importing those small quantities, the pricing is a little less attractive, coming down to a ratio of about 5.8.

  64. A ratio of 5.8? You mean NIS 5.8 to the dollar.
    Either way, this is not America, and the price is relatively reasonable for those willing to make the investment. The price of the GTX 980 Ti in Israel has fallen and is already approaching NIS 3,600, a much better situation than the 4,000 we saw here just two weeks ago.

  65. Indeed, what I meant was the ratio of the shekel price to the US retail dollar price.

    And if the 980 Ti has already dropped to 3,600 (I remembered 3,800, but I don't check every day...),
    then that's excellent, because it means the ratio has already fallen to around NIS 5.5 to the dollar.
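    The ratio arithmetic in the last few comments checks out. A small sketch, assuming a US retail price of about $650 for the 980 Ti (its launch MSRP; the shekel prices are the ones quoted in the thread):

    ```python
    # Effective shekel-per-dollar ratio of a local price vs. US retail,
    # using the 980 Ti prices quoted in the thread and an assumed $650
    # US retail price (the card's launch MSRP).

    def nis_per_usd(price_nis, price_usd):
        """How many NIS the buyer pays per US-retail dollar."""
        return price_nis / price_usd

    print(round(nis_per_usd(4000, 650), 2))   # 6.15, two weeks earlier
    print(round(nis_per_usd(3600, 650), 2))   # 5.54, now, near the usual 5.1-5.3 range
    ```

    At NIS 3,600 the ratio is about 5.5, matching the figure in comment 65.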

  66. At the end of the day, all the cards work with DirectX effects; if one company started supporting a certain effect a little earlier than the other, it doesn't mean its cards are better.
    At the end of the day everyone gets exactly the same picture when the time comes.
    And about that "image quality": stop confusing people's brains with white balance as if this were a camera. Instagram also knows how to do white balance.
