09-12-2008, 09:39 PM   #9
shillingworth
A Predatory Creeper
Join Date: Dec 2002
Server: Bertoxxulous
Posts: 251
Interface Author
Those surface formats you're talking about differ per computer. Most devices default to A8R8G8B8, but if that isn't supported, the device moves down a list of formats until it finds a screen format and a back buffer format that match.
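Here's a rough sketch of that fallback idea. This is purely illustrative, not DirectX code: the real check goes through IDirect3D9::CheckDeviceType / CheckDeviceFormat, and the isSupported predicate and format strings here are stand-ins I made up.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustrative only: walk a preference list and take the first format
// the (hypothetical) support check accepts, much like a device falling
// back from A8R8G8B8 to lesser formats.
std::string pickBackBufferFormat(const std::vector<std::string>& preferred,
                                 bool (*isSupported)(const std::string&)) {
    for (const auto& fmt : preferred)
        if (isSupported(fmt))
            return fmt;  // first supported format wins
    return "NONE";       // nothing on the list was supported
}
```

So a machine that can't do A8R8G8B8 would quietly end up on X8R8G8B8 or R5G6B5 without the game ever complaining.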

The artifacts you're seeing aren't related to surface formats; they're due to the texture coordinate system. EQ has to convert the X and Y location values to floating-point numbers between 0 and 1, and given the complexity of the scene, it's most likely using the HLSL half (16-bit) type instead of the float (32-bit) type for those texture coordinates. The resulting image is also rasterized, which can sometimes stack as many as 12 pixels on top of each other.
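To show why half-precision texture coordinates cause this, here's a small sketch. It's my own illustration, not EQ's code: I'm assuming the standard IEEE-style half layout, which keeps 11 significant bits (1 implicit + 10 stored mantissa bits), and quantizing a float down to that precision.

```cpp
#include <cassert>
#include <cmath>

// Sketch: round a float to the precision of an HLSL half
// (11 significant bits), to show texture coordinates snapping together.
float toHalfPrecision(float v) {
    if (v == 0.0f) return 0.0f;
    int e;
    float m = std::frexp(v, &e);                 // v = m * 2^e, m in [0.5, 1)
    float r = std::nearbyint(std::ldexp(m, 11)); // keep 11 significant bits
    return std::ldexp(r, e - 11);
}
```

In the range [0.5, 1) a half can only step by 2^-11, about 0.000488, so on a texture wider than roughly 2048 texels, adjacent texel centers converted to coordinates can round to the exact same half value. Two different source pixels then sample the same texel, which is exactly the kind of smearing/stacking artifact described above.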

Compression artifacts, by contrast, show up as a sort of blotchy effect across an entire surface.

While DDS files are nice because you gain the advantage of a floating-point format for pixel values, you also take on a huge amount of memory usage with them, because DirectX will load every detail level from the specified size all the way down to 1x1. The flags in the DX LoadTextureFromFile function only specify which detail levels are sent to the graphics card, so system memory still gets eaten up.
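To put a number on that memory cost, here's a quick helper. It's a hypothetical function of mine, not a D3DX call, and it assumes a square power-of-two texture with a fixed bytes-per-pixel format:

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical helper: total bytes for a square power-of-two texture
// once every mip level down to 1x1 is loaded, as described above.
std::size_t mipChainBytes(std::size_t edge, std::size_t bytesPerPixel) {
    std::size_t total = 0;
    while (edge >= 1) {
        total += edge * edge * bytesPerPixel;  // this level's pixels
        if (edge == 1) break;                  // 1x1 is the last level
        edge /= 2;                             // next mip is half-size
    }
    return total;
}
```

For a 256x256 A8R8G8B8 texture (4 bytes per pixel), the base level alone is 262,144 bytes, but the full chain down to 1x1 is 349,524 bytes, roughly a third more. Multiply that across every texture in a zone and the extra system memory adds up fast.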
__________________

"Computers are like Air Conditioners, they stop working properly when you open Windows."