You need a video card with 1 GB of memory to run future games in full detail.


Well, almost. Take Doom 3: the game needed a video card with 512 MB to run in full detail, since a typical level can hover around 500 MB of texture data. Then Unreal Engine 3 arrived and was shown off on Nvidia's GeForce 6800 Ultra. That card supported Pixel Shader 3.0 and HDR (High Dynamic Range) at 100% detail, and since ATI's X800 series didn't support this technology, the answer for Epic Games was Nvidia. Now Tim Sweeney, a programmer at Epic Games, has said that to enjoy the game at full detail you need a video card with 1 GB of memory. Such a setup already exists: Nvidia and its SLI technology. You can buy the new NV48 GeForce 6800 Ultra with 512 MB, buy another one, and you have two SLI-ready video cards with 512 MB each, 1 GB between them. Then you can run future games at maximum settings. But today those two cards cost at least 1200 euro.

Five manufacturers will soon launch these SLI-ready PCI-Express video cards: Gainward, Asus, Gigabyte, BFG and Leadtek.


The main programmer at Epic, Tim Sweeney, has been very lavish with info on the latest Unreal engine, so the gaming community has had the opportunity to learn many interesting details about it. Sweeney explained that this advanced engine is intended for what Epic thinks will be mainstream hardware in 2006 and for next-gen consoles, with DirectX 9.0 as the minimum specification, and that a 1 GB video card will let you enjoy the game at full detail. If you cannot afford a 1 GB video card, Microsoft is rumored to be preparing a program where gamers can trade in their organs in exchange for computer part upgrades. Need more RAM? Simply donate a kilo of bone marrow. Donating one of your corneas will get you a new CPU, and that 1 GB video card I just mentioned can be yours for a kidney. Keep in mind that most humans can survive with only one kidney filtering the deadly toxins and bodily waste from their bloodstream, which is good, because NVIDIA's GeForce 6800 is the first video card that can run this new engine properly.

Read more
http://www.cq-ef.net/index.php

[ Edited by Chrise on 2005/3/26 14:26 ]

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Ahh, one more thing. Read this: future games will also support dual-core CPUs, and Unreal Engine 3 supports dual-core systems.

http://anandtech.com/cpuchipsets/showdoc.aspx?i=2377&p=3

AnandTech: The new Unreal Engine 3 is designed for multi-threading, and will make good use of dual core CPUs available when games on the new engine come out. What parts of the game will benefit/be improved, thanks to multiprocessing? What will be the parts that will benefit the most?

Tim Sweeney: For multithreading optimizations, we're focusing on physics, animation updates, the renderer's scene traversal loop, sound updates, and content streaming. We are not attempting to multithread systems that are highly sequential and object-oriented, such as the gameplay.

Implementing a multithreaded system requires two to three times the development and testing effort of implementing a comparable non-multithreaded system, so it's vital that developers focus on self-contained systems that offer the highest effort-to-reward ratio.
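To picture what Sweeney describes, here is a minimal C++ sketch (my own illustration, not Epic's code): self-contained systems such as physics, animation, sound and content streaming are kicked off to worker threads each frame, while the sequential, object-oriented gameplay update stays on the main thread. The function names and workloads are placeholders.

```cpp
// Minimal sketch of per-frame task parallelism (not Epic's code).
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

static void UpdatePhysics()   { std::puts("physics step"); }
static void UpdateAnimation() { std::puts("animation update"); }
static void UpdateAudio()     { std::puts("sound update"); }
static void StreamContent()   { std::puts("content streaming"); }

// Gameplay is highly sequential and object-oriented, so it is not threaded.
static void UpdateGameplay()  { std::puts("gameplay tick (main thread)"); }

int main() {
    const std::vector<std::function<void()>> parallelSystems = {
        UpdatePhysics, UpdateAnimation, UpdateAudio, StreamContent
    };

    for (int frame = 0; frame < 3; ++frame) {
        // Hand the self-contained systems to worker threads...
        std::vector<std::thread> workers;
        for (const auto& system : parallelSystems)
            workers.emplace_back(system);

        // ...while gameplay runs serially on the main thread.
        UpdateGameplay();

        // Join before the frame ends so the renderer sees consistent state.
        for (auto& worker : workers) worker.join();
    }
    return 0;
}
```

A real engine would use a persistent thread pool and finer-grained synchronisation rather than spawning threads every frame; the join at the end of the frame is just the simplest way to show the pattern.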

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Too much info :)

[ Edited by Blade2t3DS on 2005/3/26 14:32 ]

Well, one thing is for sure :) I am going dual-core and multi-GPU :)

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

ATI has announced a technology superior to SLI. SLI is limited to two graphics cards because it splits the screen into two sectors, top and bottom. The new ATI technology splits the screen into a "checkerboard", letting each graphics card deal with its own sectors of the screen. This allows you to have as many graphics cards as you have PCI-E slots.
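To make the scaling argument concrete, here is a small C++ sketch (my own illustration, not ATI's or NVIDIA's actual algorithm) that deals screen tiles out round-robin to an arbitrary number of GPUs, checkerboard-style, and contrasts that with a top/bottom split, which only ever yields two regions. The tile size and GPU count are made-up assumptions.

```cpp
// Toy comparison of checkerboard tiling vs. a two-way split-frame scheme.
#include <cstdio>

int main() {
    const int screenW = 1280, screenH = 1024;
    const int tile = 256;       // assumed tile size in pixels
    const int numGpus = 4;      // hypothetically one card per PCI-E slot

    int tilesPerGpu[8] = {0};
    int tileIndex = 0;
    for (int y = 0; y < screenH; y += tile)
        for (int x = 0; x < screenW; x += tile)
            tilesPerGpu[tileIndex++ % numGpus]++;   // round-robin assignment

    for (int gpu = 0; gpu < numGpus; ++gpu)
        std::printf("GPU %d renders %d tiles\n", gpu, tilesPerGpu[gpu]);

    // A top/bottom split-frame scheme produces exactly two regions,
    // so it can never keep more than two GPUs busy.
    std::printf("Split-frame: GPU 0 -> rows 0-%d, GPU 1 -> rows %d-%d\n",
                screenH / 2 - 1, screenH / 2, screenH - 1);
    return 0;
}
```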

Maybe I'm missing something, but... if SLI splits the screen into two, doesn't that mean that one model is frequently split between the two cards? That must cause some inefficiency, with the extra job of stitching those two halves back together...

How do you find out how many gigs of memory a graphics card has? I want to know.

King of the 'League of Kings'||My deviantART|| My Photography

SLI makes the two cards act as one; much like dual-core processing, it shares the work between them on a millisecond timescale, pooling the processing power. The overhead is only marginal, and the large amount of extra processing power from the second graphics card more than makes up for it. If you think about it, models would end up split between two cards even if the screen weren't split in two.

Hope that made sense.
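On the question above about models being split across the boundary, here is a toy C++ model (my own sketch, not how SLI is actually implemented) that counts how many triangles of a made-up scene sit entirely in one half of the screen and how many straddle the split, and so have to be handled by both cards. The triangle extents are invented purely for illustration.

```cpp
// Toy model: triangles that cross a horizontal split are processed twice.
#include <cstdio>

struct Tri { float yMin, yMax; };   // vertical screen extent of a triangle

int main() {
    const float splitY = 512.0f;    // boundary of a 1024-pixel-tall frame
    const Tri scene[] = {
        {10, 200}, {300, 620}, {480, 530}, {700, 900}, {500, 515}
    };

    int topOnly = 0, bottomOnly = 0, both = 0;
    for (const Tri& t : scene) {
        const bool touchesTop    = t.yMin < splitY;
        const bool touchesBottom = t.yMax >= splitY;
        if (touchesTop && touchesBottom) ++both;    // duplicated work
        else if (touchesTop)             ++topOnly;
        else                             ++bottomOnly;
    }
    std::printf("top GPU only: %d, bottom GPU only: %d, duplicated: %d\n",
                topOnly, bottomOnly, both);
    return 0;
}
```

The duplicated triangles are the "marginal" extra work mentioned above; each card still only shades the pixels inside its own half of the frame.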

Bearing in mind that 256 MB graphics cards are hardly at mainstream prices yet, I think we are talking further into the future than anticipated here.

See no Wiivil
Hear no Wiivil
Speak no Wiivil

Yeah, well, depending on where you get them from :)

That's the pain in the ass with PC gaming... you just buy the latest graphics card out, for...

Screw PC gaming.

I don't care if it's got better graphics... I just can't be screwed with it.

I'm sorry but I just have to correct most of this.

First, you don't require a graphics card with 512 MB to run Doom 3 at high quality. It could be run at high quality from day one, and there weren't even any 512 MB graphics cards out then.

The amount of memory on a graphics card is pointless beyond 256 MB, as even over PCI-Express the throughput of the connection will never push past the 256 MB limit of most graphics cards.

SLI technology DOES NOT double the memory, nor does it double the performance of existing cards; at maximum settings it provides a 75% increase over a single card. If you decrease the resolution from 1600x1200 to 1280x1024 (the standard resolution for most monitors, CRT and TFT), the performance increase is only 25%.

Also, SLI is not an industry standard and games have to have SLI support coded into them, so only a few games will use SLI, and those that don't may even see a slowdown.
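The resolution-dependent scaling quoted here fits a simple back-of-the-envelope model: only the GPU-bound pixel work is split across the two cards, while the CPU and per-frame overhead stay fixed, so the lower the resolution, the smaller the gain. The sketch below is my own rough model with invented millisecond figures, not benchmark data; the point is only the trend, not the exact percentages.

```cpp
// Rough model of SLI scaling: fixed per-frame cost plus divisible GPU work.
#include <cstdio>

static double FrameTime(double fixedMs, double gpuMs, int numGpus) {
    return fixedMs + gpuMs / numGpus;   // only the GPU work is split
}

int main() {
    const double fixedMs  = 10.0;   // assumed CPU / per-frame overhead
    const double gpuHiRes = 30.0;   // assumed pixel work at 1600x1200
    const double gpuLoRes = 15.0;   // assumed (smaller) pixel work at 1280x1024

    const double gainHi = FrameTime(fixedMs, gpuHiRes, 1) / FrameTime(fixedMs, gpuHiRes, 2);
    const double gainLo = FrameTime(fixedMs, gpuLoRes, 1) / FrameTime(fixedMs, gpuLoRes, 2);

    std::printf("speed-up at 1600x1200: %.0f%%\n", (gainHi - 1.0) * 100.0);
    std::printf("speed-up at 1280x1024: %.0f%%\n", (gainLo - 1.0) * 100.0);
    return 0;
}
```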

Matthew Evans [ Writer :: Moderator :: King of Impartiality :: Lord of the 15min Thread ] As the wind blows the sand to cover the camel's tracks so does time move to cover the Lord's.
Rejoice for the Lord will taketh his quarter and give much back to his followers.



Gastrian wrote:
I'm sorry but I just have to correct most of this.

First, you don't require a graphics card with 512 MB to run Doom 3 at high quality. It could be run at high quality from day one, and there weren't even any 512 MB graphics cards out then.

The amount of memory on a graphics card is pointless beyond 256 MB, as even over PCI-Express the throughput of the connection will never push past the 256 MB limit of most graphics cards.

SLI technology DOES NOT double the memory, nor does it double the performance of existing cards; at maximum settings it provides a 75% increase over a single card. If you decrease the resolution from 1600x1200 to 1280x1024 (the standard resolution for most monitors, CRT and TFT), the performance increase is only 25%.

Also, SLI is not an industry standard and games have to have SLI support coded into them, so only a few games will use SLI, and those that don't may even see a slowdown.

Heh, are you sure that Doom 3 doesn't require a 512 MB video card? Why do you think that? First, in Ultra Quality the game loads each texture; diffuse, specular, normal map at full resolution with no compression. In a typical DOOM 3 level, this can hover around a whopping 500MB of texture data. This will run on current hardware but obviously we cannot fit 500MB of texture data onto a 256MB card and the amount of texture data referenced in a given scene per frame ( 60 times a second ) can easily be 50MB+. This can cause some choppiness as a lot of memory bandwidth is being consumed. It does however look fantastic and it is certainly playable on high end systems but due to the hitching that can occur we chose to require a 512MB Video card before setting this automatically. So yes, you can set it manually, but you aren't getting "full detail" performance.
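For a sense of where a figure like "500 MB of texture data" can come from, here is a rough, hypothetical C++ calculation: three uncompressed 32-bit maps (diffuse, specular, normal) per material, multiplied over an assumed number of materials in a level. The texture sizes and counts are my assumptions for illustration, not id Software's actual asset figures.

```cpp
// Back-of-the-envelope texture memory estimate (all figures assumed).
#include <cstdio>

int main() {
    const int texWidth = 512, texHeight = 512;   // assumed texture resolution
    const int bytesPerPixel = 4;                 // 32-bit uncompressed RGBA
    const int mapsPerMaterial = 3;               // diffuse + specular + normal
    const int materialsInLevel = 160;            // assumed material count

    const double bytesPerMaterial =
        double(texWidth) * texHeight * bytesPerPixel * mapsPerMaterial;
    const double totalMB =
        bytesPerMaterial * materialsInLevel / (1024.0 * 1024.0);

    std::printf("%.0f MB per material, %.0f MB for the level\n",
                bytesPerMaterial / (1024.0 * 1024.0), totalMB);
    // With DXT-style compression the same data would shrink by roughly a
    // factor of four to eight, which is why the lower quality settings fit
    // comfortably on a 256 MB card.
    return 0;
}
```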

Then, about SLI: yes, in many games it doubles your performance compared to a single card, and with the new updated drivers it does so even at a normal resolution of 1280x1024. In the beginning, SLI only gave a performance boost at high resolutions.

[ Edited by Chrise on 2005/3/27 15:45 ]

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

I've always thought 256 MB was fine to run Doom 3...

Me = Thick

How about showing us some benchmarks as opposed to reiterations of 10-month-old press releases?

Matthew Evans [ Writer :: Moderator :: King of Impartiality :: Lord of the 15min Thread ] As the wind blows the sand to cover the camel's tracks so does time move to cover the Lord's.
Rejoice for the Lord will taketh his quarter and give much back to his followers.


Gastrian wrote:
How about showing us some benchmarks as opposed to reiterations of 10-month-old press releases?

How about telling me why you think Doom 3 doesn't require a 512 MB video card? :) From my side, I said that Doom 3 needs a 512 MB video card, and I explained why.

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Because you've copied your text from a press release dated July 2004, in which they based their assumptions on nVidia's NV10/20 chipset architecture, otherwise known as the GeForce3 and GeForce4 MX.

We are now using systems with GeForce 6xxx and ATI X--- chipsets. Going by Doom 3 benchmarks, the GeForce Ti4200 *x AGP can only muster 8.6 fps at medium quality, while the new modern cards can easily handle 40+ fps.

Your explanations are based on outdated, irrelevant news articles about obsolete hardware.

Matthew Evans [ Writer :: Moderator :: King of Impartiality :: Lord of the 15min Thread ] As the wind blows the sand to cover the camel's tracks so does time move to cover the Lord's.
Rejoice for the Lord will taketh his quarter and give much back to his followers.

And this is why I don't do PC gaming. I've also yet to see it looking any better than consoles; I'm sorry if I'm thick/going blind/fanboyish/whatever, but I have yet to see a PC game that looks better than the best GameCube graphics.

Which one? About Doom 3? That piece about Doom 3 wasn't a press release, it was written when they tested Doom 3, so what do you mean? That they were lying? Every hard-core gamer knows about Doom 3 and how much data it pushes every second. So believe me, whether you like it or not, you need a damn 512 MB video card to run Doom 3 in full detail.

About your press release: you're thinking of something else.

[ Edited by Chrise on 2005/3/27 17:20 ]

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Is the GeForce FX 6800GT Ultra a 512 MB card? I thought it was 256 MB.

[ Edited by f | j | D on 2005/3/27 17:43 ]

Here is the article you quoted for Doom 3:

http://www.tomshardware.com/hardnews/20040727_052008.html

It's a press release from July 27, 2004, but seeing as you are a "hardcore" gamer, you would already have known that.

Chrise said

Heh, are you sure that Doom 3 doesn't require a 512 MB video card? Why do you think that? First, in Ultra Quality the game loads each texture; diffuse, specular, normal map at full resolution with no compression. In a typical DOOM 3 level, this can hover around a whopping 500MB of texture data. This will run on current hardware but obviously we cannot fit 500MB of texture data onto a 256MB card and the amount of texture data referenced in a given scene per frame ( 60 times a second ) can easily be 50MB+. This can cause some choppiness as a lot of memory bandwidth is being consumed. It does however look fantastic and it is certainly playable on high end systems but due to the hitching that can occur we chose to require a 512MB Video card before setting this automatically. So yes, you can set it manually, but you aren't getting "full detail" performance.

Robert Duffy, id Software's programmer, said on 27 July 2004:

In Ultra quality, we load each texture; diffuse, specular, normal map at full resolution with no compression. In a typical DOOM 3 level, this can hover around a whopping 500MB of texture data. This will run on current hardware but obviously we cannot fit 500MB of texture data onto a 256MB card and the amount of texture data referenced in a given scene per frame ( 60 times a second ) can easily be 50MB+. This can cause some choppiness as a lot of memory bandwidth is being consumed. It does however look fantastic :) and it is certainly playable on high end systems but due to the hitching that can occur we chose to require a 512MB Video card before setting this automatically.

At the time that press article was written, 512 MB cards weren't even on the drawing board, and he was referring to THEORETICAL cards in a HYPOTHETICAL situation. Those theoretical cards were never created and that hypothetical situation never occurred.



[ Edited by Gastrian on 2005/3/27 17:46 ]

Matthew Evans [ Writer :: Moderator :: King of Impartiality :: Lord of the 15min Thread ] As the wind blows the sand to cover the camel's tracks so does time move to cover the Lord's.
Rejoice for the Lord will taketh his quarter and give much back to his followers.


f | j | D wrote:
Is the GeForce FX 6800GT Ultra a 512 MB card? I thought it was 256 MB.

[ Edited by f | j | D on 2005/3/27 17:43 ]

There are now 6800 Ultras with both 256 MB and 512 MB.

The new BFG 6800 Ultra 512 OC, for example. Gainward has an upcoming solution too: a single 6800 Ultra, and a dual 6800 Ultra 256 on one card. Asus, Leadtek and Gigabyte are also on board.

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Here is the article you quoted for Doom 3:

http://www.tomshardware.com/hardnews/20040727_052008.html

Well yes, this article makes the same point: you need a 512 MB video card to play Doom 3 at Ultra Quality. And that was my point.

"A not yet released 512 MB card"? Well, you need it now that it's out.

Or did I misunderstand something?

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??

Also, how can you not be talking out of your arse when the card isn't released until April and the only benchmark is of an XFX card playing NFS:U2, with no comparison to other cards?

Show us some comparative benchmarks from a reputable site.

For someone who claims to be a "hardcore" gamer, you show a complete inability to grasp graphics card technology.

Graphics card memory means little to nothing; it's all down to the GPU. id Software were basing their 512 MB claims on the architecture of the GeForce3 and GeForce4 MX, which even with 512 MB would not be able to touch the current GPUs on the market.

512 MB was needed then; it's not needed now. Try doing some research first.

[ Edited by Gastrian on 2005/3/27 18:02 ]

Matthew Evans [ Writer :: Moderator :: King of Impartiality :: Lord of the 15min Thread ] As the wind blows the sand to cover the camel's tracks so does time move to cover the Lord's.
Rejoice for the Lord will taketh his quarter and give much back to his followers.


Gastrian wrote:
Also, how can you not be talking out of your arse when the card isn't released until April and the only benchmark is of an XFX card playing NFS:U2, with no comparison to other cards?

Show us some comparative benchmarks from a reputable site.

For someone who claims to be a "hardcore" gamer, you show a complete inability to grasp graphics card technology.

Graphics card memory means little to nothing; it's all down to the GPU. id Software were basing their 512 MB claims on the architecture of the GeForce3 and GeForce4 MX, which even with 512 MB would not be able to touch the current GPUs on the market.

512 MB was needed then; it's not needed now. Try doing some research first.

[ Edited by Gastrian on 2005/3/27 18:02 ]

I wasn't saying they used a 512 MB card when playing Doom 3. Doom 3 was tested and they made the same point as the article. And it's the same point the article makes:

Ultra Quality (max quality): for 512 MB video cards. Each texture (diffuse, specular, normal map) at full resolution with no compression.

That was my point.

-Wii rules the world.
-But sir?, Wii dont rules it.
-I dont mean Wii you stupid, i mean Wii.
-What? You sad Wii dont and then Wii does? WTF??
