This dialog allows you to select a video mode, check whether the video card supports page flipping, and preview how a given font (or bitmap) will look. The video mode selected here is also used by all other TimeDX tests that require one. To actually test the selected mode, click OK and TimeDX will switch to that mode and test it; clicking "Just Select It" will simply select the mode for other TimeDX tests when you just want to change the video mode. Do not worry if the Hz field of a video mode is displayed as 0Hz; Windows 98 video card DirectX drivers do not supply this information. If you're using XP and have video drivers that enumerate all the different refresh rates, they will be listed after the default 0Hz video modes -- but be warned, some video cards will still ignore this field unless the Forced Refresh Rate field in the DirectDraw tab of the DirectX Control Panel is set to Default.
In later versions of TimeDX (3.3.0 onwards) there is an extra button labeled "Select Desktop Video Mode" (yes, I'm too cheap to go grab another screen shot). When clicked this selects whatever video mode corresponds to what the Windows desktop is currently set at. In addition, the newer versions also pick the desktop video mode as the default video mode instead of the archaic 640x480 8 bit video mode TimeDX used to start with. There are a couple of reasons for this. First, later video drivers appear to be really bad at handling 8 bit video modes (and have been so for a number of years; it's 2007 as this paragraph is written), so they are highly discouraged and having TimeDX automatically pick one of them is stupid. Second, while the desktop video mode may be a stretch for some machines to have multiple back buffers of, the predominance of TFT displays these days means that a lot of the time the only video mode worth using is the desktop one (or one that's the same except for color depth), which is hopefully set to the native resolution of a TFT display if one is used. The reason for all the concern with TFT displays (or LCD flat panels) is that when video is sent to the display that doesn't match its native resolution (a TFT display consisting of an array of pixels of definite fixed dimensions) the display itself has to render that video data onto its array of pixels -- and this takes time (sometimes multiple retraces' worth of it) and thus wildly interferes with any tachistoscopic display (there's more discussion of this in the Refresh Rate test, where you can check how well the video mode you've selected is being handled by the display). In addition, using a display resolution other than the native resolution is usually blurry. OK, there are three reasons: having a button here to get the desktop video mode also makes use of DMDX's <vm desktop> much easier, as you don't have to go probing through the desktop's properties to find out what the display is set to when you've got a TFT monitor.
In order for DMDX to use a video mode it must be able to buffer two full screens in video memory. For instance, a 640x480 256 color screen takes 300k of memory to map (written in 1998, so take these figures with huge grains of salt), so for DMDX to use it there must be at least 600k of memory on the video card -- not a problem, as 1Meg seems to be the minimum for a PCI card. If however you wanted to use 65k colors, each screen now takes 600k, so a 2Meg video card is required. Some really cheap video cards (like the STB Horizon 64 PCI) do not support page flipping; you will be warned of this if you are unlucky enough to have such a card, as its use is not recommended -- there is no way to synchronize the raster with what is to be displayed, or at least no way that I am prepared to code, so there always exists the possibility that the first portion of the retrace shows the previous frame and the later portion of the retrace the new frame.
The number of scan lines to re-program the display to is for adjusting the refresh rate if you aren't using Windows 2000 or XP. For our purposes any single given DirectX video mode can be a vast number of different video modes depending on whether the number of scan lines is adjusted down from the usual value -- assuming you have an OS capable of doing that, and given that such OSes are a real rarity these days (2007), almost everyone can just plain ignore any mention of subtracting lines from a display mode. There are several things to note here. Firstly, lowering the number of scan lines does not lower the memory requirements; as far as the rest of DirectX is concerned there are still the unmodified number of scan lines on the screen. Secondly, a number of tests assume that there are at least 480 scan lines on the screen, so not all diagnostic information may be visible. Lastly, the code that re-programs the scan lines does not know about displays with more than 1023 scan lines, nor will it know about displays that are not SVGA compatible or displays other than the primary display -- if it screws up, don't use it.
Additionally, if you are intending to use a Multimon system, only the Primary Display can be re-programmed; using this when a secondary display has been chosen will still re-program the Primary Display (oops).
The font and text selection is to give you some idea of what a given font will look like using a given video mode. The format of this string is not arbitrary; if you enter a new font you must produce a string that looks like the List Box entries:
After the weight (999wt) and before the hyphen the letters I, S and U can occur, signifying Italic, StrikeOut and Underlined respectively. Changing the name of the font is not likely to produce anything, as these are all the available fonts; changing its weight, its attributes (I, S or U) or its dimensions will result in synthesized fonts if they don't exist already. The field in angle brackets indicates the font type and is not part of the typeface name; it is provided primarily to show which fonts are TrueType fonts, as these will scale nicely without pixelization.
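As a rough illustration of the attribute letters described above, the sketch below scans the characters between the weight's "wt" marker and the following hyphen for I, S and U. This is an assumption-laden example based only on the description here (the sample entry string is made up), not TimeDX's actual parser:

```c
#include <string.h>

struct FontAttrs { int italic, strikeout, underlined; };

/* Scan the letters between the "wt" weight marker and the next
 * hyphen for I (Italic), S (StrikeOut) and U (Underlined). The
 * surrounding string layout is assumed from the description above. */
static struct FontAttrs parseAttrs(const char *entry)
{
    struct FontAttrs a = {0, 0, 0};
    const char *p = strstr(entry, "wt");
    if (p == NULL)
        return a;
    for (p += 2; *p != '\0' && *p != '-'; ++p) {
        if (*p == 'I') a.italic = 1;
        else if (*p == 'S') a.strikeout = 1;
        else if (*p == 'U') a.underlined = 1;
    }
    return a;
}
```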
I have also added the ability to use a bitmap instead of a font, useful for getting an idea how long individual bitmaps take to load and how they will be distorted by varying video modes. As a general note, if you are going to present photographic images I sincerely recommend using 16 bit, 65535 color modes, even if you have successfully managed to scan them into 256 colors. The reason is that the GDI palette manager is going to get its hands on those 256 colors and squeeze them into either 254 or, worse, 236 colors. If you really need the speed of 256 color mode (65535 colors is not all that much slower on most video cards and is in fact faster on newer hardware), or more to the point your graphics card does not have enough memory to handle 65535 colors (in which case I recommend you go and buy one that can), and you must have photographic images, then to avoid the GDI palette manager's distortion you will have to find a way to make the image leave out (or include) the 20 colors that Windows usually reserves for its use -- maybe PhotoShop can do this, I don't know; I doubt whether the Graphics Workshop can do it.
Once the test display appears, a mouse click or a return key press will finish the test, and the next TimeDX test will use that video mode. A results dialog box is also displayed:
The times reported are the millisecond times to fill a memory screen with a single color; the times to Blit them to the screen buffers (this gives you an idea of the sustained changing frame rate DMDX will have); the time to draw the texts into memory (if a bitmap is displayed then this is the time taken to move the bitmap from memory into the screen buffer); the time to set up the font (if a bitmap is displayed this is the time taken to read the bitmap off disk, or out of the disk cache if this is the second usage in a row of the same bitmap); the time to draw the text (or transfer the bitmap); and the time to flip the display from the back buffer to the front buffer (which shouldn't be more than the refresh interval). The wait times are the amount of time spent waiting for an operation to finish -- unless you have a hardware accelerator that has its own separate Blitter these times should be less than one millisecond. The code will not wait more than 1000 milliseconds for anything, so values of 1000 or more indicate something wrong. Do not be alarmed by the high Blit time; the first blit of any DirectX session is always way slow.