
RX 6800 XT and RX 6800 graphics cards in review: Is Radeon defeating GeForce?


djelectric

Recommended Posts

You are bringing a video from two months ago, from before the release of the new version of MSI Afterburner that allows testing utilized versus assigned memory. He even points out there that there is not really a way to test it other than in specific games that have a built-in tool for it.

 

You have examples here that people from the forum ran, showing the actual memory consumption. Or do they also get money from Nvidia?
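As a side note on measurement: the board-wide figure that overlay tools report comes from NVML, and it counts assigned (allocated) memory, not what the game actually touches. A minimal sketch of reading it in Python, assuming the pynvml bindings (pip install nvidia-ml-py) and an NVIDIA card:

```python
# Minimal sketch: read board-wide VRAM figures via NVML.
# Note: NVML's "used" counter reports *allocated* (assigned) framebuffer
# memory, not how much of it is actively being touched - the very
# utilized-vs-assigned distinction discussed above.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    gib = 1024 ** 3
    print(f"total:     {mem.total / gib:.2f} GiB")
    print(f"allocated: {mem.used / gib:.2f} GiB")  # "assigned", not "utilized"
    print(f"free:      {mem.free / gib:.2f} GiB")
finally:
    nvmlShutdown()
```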


The videos that NEC brought are excellent.

In the first video he explains that the developers of the game choose the size of the textures, and that is what determines the memory consumption. That is, they need to decide what the minimum amount of memory is that they want to support. For example, if they determine that 8 GB is the new standard, and they manage to keep memory consumption low with QHD textures alone, then there is no point for them in shipping larger textures.

The fuss around the Godfall game looks to me like a marketing act by AMD, a blow below the belt.

According to what he says in the video, a 4K texture is 4096 by 4096. If I understand correctly, that is already more than normal 4K, which is 3840 x 2160.
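For a sense of the scale involved, here is a back-of-the-envelope sketch (my own numbers, not from the video) of how texture resolution drives the footprint: an uncompressed 4096x4096 RGBA8 texture with a full mip chain is about 85 MiB, and even with block compression a few hundred unique 4K-class textures add up to several GiB:

```python
# Back-of-the-envelope VRAM footprint of square textures.
def texture_mib(size, bytes_per_texel, mips=True):
    """Footprint of one size x size texture, in MiB."""
    texels = size * size
    if mips:
        texels = texels * 4 // 3  # a full mip chain adds ~1/3
    return texels * bytes_per_texel / 2 ** 20

for size in (2048, 4096):
    raw = texture_mib(size, 4)  # RGBA8: 4 bytes per texel
    bc7 = texture_mib(size, 1)  # BC7 block compression: 1 byte per texel
    print(f"{size}x{size}: {raw:6.1f} MiB raw, {bc7:5.1f} MiB BC7")

# Even compressed, ~300 unique 4096x4096 materials already need:
print(f"300 BC7 textures: {300 * texture_mib(4096, 1) / 1024:.2f} GiB")
```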


Quote of Jabberwock


 

By the same argument, it can be said that developers will decide to ship ray tracing at a level that AMD has no answer to.

Or to embed DLSS, which in my opinion is a real game changer, and then this whole discussion takes a sharp turn.
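To put numbers on the "game changer" claim: shading cost roughly tracks the number of pixels shaded, and DLSS 2.x renders internally below the output resolution (the published factors are about 0.667 per axis for Quality mode and 0.5 for Performance; treat these as approximate). A quick sketch of the arithmetic:

```python
# Why DLSS shifts the performance math: shading cost scales roughly
# with the number of pixels shaded per frame.
target_w, target_h = 3840, 2160  # 4K output resolution
native = target_w * target_h

for mode, scale in (("Native 4K", 1.0),
                    ("DLSS Quality", 2 / 3),      # ~0.667 per axis
                    ("DLSS Performance", 0.5)):   # 0.5 per axis
    w, h = round(target_w * scale), round(target_h * scale)
    shaded = w * h
    print(f"{mode:16s} renders {w}x{h}: {shaded / 1e6:5.2f} MPix, "
          f"{native / shaded:.2f}x fewer pixels than native")
```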


Quote of captaincaveman

Images with DLSS look better than those without ...

 

This is one of the side effects, and it surprised them too, but sometimes you get a less good result.

I personally think it will be interesting to follow the development of DLSS, and to see whether this approach of using deep learning to approximate the final rendered result will reach more areas of graphics, and maybe finally bring photorealism.

Edited By k534d

Quote of nec_000

 

This video spurred me to make another effort. So we are talking about Godfall at maximum settings.

1440P

[screenshot attachment]

2880P

[screenshot attachment]

 

I managed to get it to render at that resolution. You can see that the FPS did collapse, and we still have not reached the memory limit according to the actual memory use.

I do not think their CEO is a liar; I just think they have not yet released the update with those super-high textures, because there are not enough AMD cards in the market to give this marketing move any meaning.

Edited By KobyC92
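A quick sanity check on that result (assuming 2880P here means 5120x2880, i.e. a 200% scale of 2560x1440): the card shades four times as many pixels per frame, so a collapse in FPS with VRAM headroom to spare is exactly what one would expect:

```python
# Pixel counts at the two tested resolutions (2880P assumed to be
# 5120x2880, a 2x-per-axis scale of 2560x1440).
qhd = 2560 * 1440
p2880 = 5120 * 2880
print(f"1440P: {qhd / 1e6:.2f} MPix,  2880P: {p2880 / 1e6:.2f} MPix")
print(f"2880P shades {p2880 / qhd:.1f}x more pixels per frame")
```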

Two comments.

One: you ran this with the finest card of this round, the 3080 with its 10GB, when the main problem we are talking about today is the poorer card, the 3070, the one that has only 8GB. If you run the same experiments on a 3070 you will find real difficulty today, not in the future.

Moreover, even now, even though you have a 3080 with 10GB, you are already at the edge. As they say, it is not that the card you bought is 3 years old; it is barely 3 weeks old. You bought the most expensive there is, Nvidia's flagship, and you have already found a consumption level that brings it to full utilization of the resource at its disposal. There are no reserves for the road ahead, no preparation for the coming years. Since when do you buy a product that utilizes 100% of its resources on the very day it is born, never mind the future?

 

Second, and this time I will be brief because I have no patience: whoever does not understand the difference between allocated and used, that is his problem; let him go study the subject in depth. The nec classroom is closed at this hour.

 

Still, a short summary, as they say:

The memory consumption that matters is allocated: this is the address space to which the graphics accelerator writes dynamically, according to what the software code and algorithm require at that moment. Whatever overflows from the allocated address space beyond the physical memory that exists on the card is actually mapped to system RAM, and as is well known, write/read speed to system RAM over PCIe is correspondingly slow compared to the card's own memory (tens of GB/s over the bus versus hundreds of GB/s of local GDDR6 bandwidth). Used is not the correct measure; it is misleading, and whoever introduced it is ignorant. Allocated is the actual address space to which data is written and read, and it is managed dynamically by the driver.

 

 

Edited By nec_000
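The assigned-versus-used split is easy to see from user code. A minimal sketch, using PyTorch purely as a convenient way to grab VRAM (it assumes a CUDA build of PyTorch; its allocator reports reserved, i.e. assigned from the driver, separately from allocated, i.e. bytes currently backing live data):

```python
# Minimal sketch of the allocated-vs-assigned split, via PyTorch's
# CUDA caching allocator (requires a CUDA build of PyTorch).
#   torch.cuda.memory_reserved()  -> address space assigned from the driver
#   torch.cuda.memory_allocated() -> bytes currently backing live data
import torch

mib = 1024 ** 2
x = torch.empty(256 * mib, dtype=torch.uint8, device="cuda")  # grab 256 MiB
print(f"reserved:  {torch.cuda.memory_reserved() / mib:.0f} MiB")
print(f"allocated: {torch.cuda.memory_allocated() / mib:.0f} MiB")

del x  # the data is gone, but the cached reservation stays "assigned"
print(f"after del: reserved {torch.cuda.memory_reserved() / mib:.0f} MiB, "
      f"allocated {torch.cuda.memory_allocated() / mib:.0f} MiB")
```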

I opened the Steam survey. As of this moment 2.3% hold a 4K screen, one percent hold a 1440P ultrawide, and 7 percent hold a standard 1440P screen. 65% still own an FHD screen. A significant percentage hold an HD screen, which is almost hard to believe. I do not think anyone at Nvidia is particularly bothered by memory falling short at a resolution that almost no one uses. Anyway, the 5900X got moving; somehow I jumped from 173rd in line to "package shipped". Now it remains to be seen whether to pay 100 pounds over MSRP for a 3080 up front; 830 pounds for the Asus TUF seems a bit excessive to me? (I am in the one percent who hold 1440P 21:9 at 144Hz.)


Quote of DrShawarma

I opened the Steam survey. As of this moment 2.3% hold a 4K screen, one percent hold a 1440P ultrawide, and 7 percent hold a standard 1440P screen. 65% still own an FHD screen.

Without going into the depths of the question of the viability of the 3070/80, the survey brought from Steam reflects reality in a partial and slightly misleading way as far as this discussion is concerned. I will explain:

Most of the games on the Steam platform are cheap to free ($0-25 for that matter); some were once great games that have passed through the hourglass of time, and some are fairly contemporary games with graphics/content/gameplay at the level of the Hop/Luli kids' channels (completely legitimate, that too has a target audience).

The AAA games from the last two years are priced at around $30-60 (from those that have been about two years on the market up to the present).

When discussing whether 8/10GB of VRAM will suffice for existing titles, one has to assume that those whose gaming revolves around COD titles from 2004, or "Luli"-style games of some kind, are not part of the discussion here.

 

Ideally we would examine the distribution of resolutions among those who purchase AAA games, as noted at $30-60, across FHD/QHD/UHD; it is clear to me that the percentages would shift drastically toward the higher resolutions once we leave the retro and the kids' games behind.

Edited By No-One

It turns out that the reason for the shortages of the launched cards is the fact that AMD manufactures all of its 7nm products (processors, video cards, and the chips for the PS5 / Xbox Series X) at the same chipmaker (i.e. TSMC). Because that manufacturer has limited production capacity, AMD has to decide what percentage of this capacity to allocate to each segment, and unsurprisingly the bulk goes to the production of chips for consoles:

 

 

https://www.youtube.com/watch?v=OiZHYCN-vw4

