Digital Video Notes
DMDX has the ability to display a streaming Digital Video multi-media file, be it AVI, MPEG, QuickTime or whatever else DirectShow supports (see TimeDX's Digital Video help on Codecs), or it can just use the Streaming Audio abilities (see below). It's probably not a good idea to use the freesync video mode option with digital video playback.
As with the <animation> switch, all frame buffers (including the visible primary buffer) are cleared at the end of an item when the legacy DirectDraw rendering path is active. Originally a blank display was also rendered after the video finished when the Direct3D renderer was in use, but by the time I got the Direct3D renderer (VMR9) working properly this behavior had been removed. So all Digital Video items using the legacy DirectDraw renderer should end with a blank frame, for example:
10 <dv> "movie" / ;
The primary reason DMDX attempts to clear the screen after video when using the legacy DirectDraw renderer is that the video can run on for an indeterminate number of frames. Should you wish to display normal DMDX frames after the video, a bit of extra attention will be needed to deal with either DMDX clearing the display or the run on; for example the following item's second "black.bmp" after the video may display on some machines and may not on others:
+1 <ms% 500> "+" / <dv>
"../laura/neutral/608-80_l.mpg" / <ms% 500> <bmp> "black.bmp" / * <bmp> "response.bmp";
Instead that following frame will need to be repeated, maybe more than once, in order to work on a large range of machines, for example:
+1 <ms% 500> "+" / <dv>
"../laura/neutral/608-80_l.mpg" / <ms% 50> <bmp> "black.bmp" / <ms% 50> <bmp> "black.bmp" / <ms% 400> <bmp> "black.bmp"/ * <bmp> "response.bmp";
Similarly, if you wish to display the video against a static background image, the frame to display it might need to be issued a number of times. Here one uses the <dv% 0> modifier to cancel the digital video's frame duration immediately after the video begins and then renders the static background. The trouble with that of course is that if you want to start gathering responses after the video is finished you'll have to have the duration of that video encoded in your items; here we're assuming the video is 1.5 seconds long:
+1 <ms% 500> "+" / <dv>
"../laura/neutral/608-80_l.mpg" <dv% 0> / <ms% 50> <bmp> "black.bmp"/ <ms% 50> <bmp> "black.bmp"
/ <ms% 1400> <bmp> "black.bmp" / * <bmp> "response.bmp";
If you want that static background image to precede the video you can display it and then the video with a no erase switch, using the <dv% 0> modifier to cancel the digital video's frame duration immediately after the video begins and then rendering the static background again afterwards -- assuming you're using DMDX 6.3.3.0 or later and the Direct3D renderer. Trying to render the background before the video began prior to that was fraught with difficulty and would at best wind up with a flash of the background color prior to the video commencing (! or | or <NoErase> were specifically disallowed in a <dv> frame due to the weird unexpected things they did as the playback rectangle was adjusted by the display routines to erase what was previously on the screen, which is decidedly not what is desired). That trailing frame re-rendering the static image is critical as it's that frame that the code renders the video against; without it you're likely to just get a blank display when using a no erase switch:
0 <bmp 0,0,0,0> "golftp" / ! <ml 0> <dv
-1,.1,.56,.75> "gro20017-0-005_l.mpg" <dv% 0> / <bmp 0,0,0,0> "golftp" ;
Note that support of No Erase in a Digital Video <dv> frame is limited. Normally in DMDX a No Erase frame is copying bitmap data from one frame to another; none of that is happening in a <dv> frame. Instead, because you're using the Direct3D renderer, DMDX re-renders the previous frame's textures that are still available when it's time to render a new video frame, and if that frame doesn't exist ! in a <dv> frame can't function. So while you can use ! at the start of an item in a normal frame to merge the previous item's display with the next, you can't do that in a <dv> frame. What you can do, however, is use ! in both a normal frame and a <dv> frame, so constructs like the following work:
0 <bmp 0,0,0,0> "golftp" ;
0 ! / ! <ml 0>
<dv -1,.1,.56,.75> "gro20017-0-005_l.mpg" <dv% 0> / ! ;
When DMDX processes an item with a Digital Video file in it using the DirectDraw renderer it uses a completely different method than it otherwise would to load images into the display memory. Instead of buffering any number of frames in the video card's memory ahead of time and flipping between them at the appropriate time, DMDX only moves a frame into video memory immediately prior to its being displayed, as DirectShow doesn't provide more than one frame at a time. Using the Direct3D renderer things aren't that different than they normally are: the frames' displays are loaded into 3D textures prior to the item beginning and then rendered when their time is due, with <dv> frames having additional code checking for new video frames to be rendered on top of whatever normal DMDX display is happening. Additionally, regardless of renderer, digital video files have enormous initial latency (kind of like pre-roll time on tape machines), on the order of hundreds of milliseconds on a lot of machines; DMDX automatically re-schedules the rest of the display queue once a digital video frame has actually started producing images to be displayed.
Multiple digital video files can be played at once (assuming you have the CPU horsepower) and the frames can be combined with other frames in normal DMDX ways. The DMDX RT clock can be turned on at a particular digital video file frame number using a new clock on switch, <DVClockOn>, and the duration of the DMDX frame can be set to expire when a particular digital video frame is displayed with the <DVFrameDuration> switch.
When playing back a file with the <DigitalVideo [N,N[,N,N]]> switch the first optional pair of Ns specifies the top left corner location of playback and the second the bottom right corner location (or, if a centered playback region has been requested, these can be the size of the playback region). DMDX switches <X> and <Y> are overridden by these -- indeed it would appear that any attempt to use them (or <BMPMultipliers>) will result in a Lost Graphic message, meaning the only way to position digital video is using the parameters in the <dv> keyword. In the same way that the <X> and <Y> switches can be real and express a fraction of the screen dimensions, these values can be too, with the one slight difference that if the value 1 is used here it's the full relative dimension of the screen, whereas in <xy> it's the pixel value 1. If neither pair of coordinates is present the images are displayed centered on the screen in the size they are stored in the file in; if only the first pair is present the top left corners of the images are moved to that location; if both coordinate pairs are specified the images are stretched to fit (overriding any <BMPMultipliers>). If all four values are zero the image is displayed full screen. If a -1 is used for a field the default value for that field is used; if -1 is used for either or both of the first coordinate pair the image is centered in that axis and, if the second pair of coordinates has been provided, they become the size of the playback region; if -1 is used for either or both of the second pair the original image size in that dimension is used. For example:
<dv 64,48>
This would display the images near the top left corner of the screen in the original file size.
<dv 64,48,164,148>
This would display the images near the top left corner of the screen as 100x100 pixel images.
<dv 0,0,0,0>
<dv 0,0,1,1>
This would display the images full screen.
<dv -1,48>
This would display the images near the top center of the screen in the original file size.
<dv -1,48,-1,0.5>
This would display the images near the top center of the screen in the original file size width but stretched to half the screen height.
<dv -1,-1,0.5,0.5>
This would display the images in the center of the screen in one quarter of the screen area.
0 <bmp -1,-1,.66,.7> "border" %1 / ! <ml
0> <dv -1,-1,.6,.6> "video.mpg" <dv% 0> / ! ;
This item renders a video with a border around it: first the border, which is centered and 70% of the screen in height, for a tick before the video starts, and then the video itself scaled a bit smaller at 60% of the screen height. The border X dimension is 66% in a crude attempt to get it the same width around the video; people can probably do some actual math to figure out what those values should actually be...
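As noted above, more than one digital video can be going at once. A hedged, untested sketch of two clips side by side might look something like the following (the file names and coordinates are placeholders): the first <dv> frame's duration is cancelled with <dv% 0> so the second clip starts straight away, a no erase ! keeps the first clip's display from being wiped when the second frame goes up, and a blank frame finishes the item off:
0 <dv .05,.25,.45,.75> "left.mpg" <dv% 0> / ! <ml 0> <dv .55,.25,.95,.75> "right.mpg" / <ms% 500> <bmp> "black.bmp";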
When turning the clock on for responses at a given digital video frame with the <DVClockOn> switch the normal DMDX clock on switch should not be used as well. This switch must occur in the frame that defines the digital video file name regardless of whether another DMDX frame will be displayed by the time that digital video frame is displayed. The way this functions is that as the code is displaying video frames it takes note of the number of that video frame in the video sequence; when it sees that the video frame number is greater than or equal to the clock on frame number (and it hasn't already turned the clock on) it turns the clock on. In order to determine which video frame contains the relevant stimuli TimeDX can be used to seek around the digital video file to find a particular video frame and its number can be noted and then entered into the item file (sorry, cursors don't exist in any generic fashion for digital video files).
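For instance, assuming <DVClockOn> takes the video frame number as its parameter, a hedged sketch (the file name and frame number are placeholders) where the clock is turned on once video frame 45 has been displayed, with no normal clock on switch anywhere in the item, would be:
+10 <dv> "movie" <DVClockOn 45> / ;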
If synchronization between a digital video sequence and the rest of DMDX is needed the <DVFrameDuration> switch can be used. In the same fashion as the <DVClockOn> switch, when the specified video frame has been displayed DMDX moves to the next element in the display queue and re-schedules everything accordingly.
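Again assuming the video frame number is the parameter, a hedged sketch (file name and frame number are placeholders) where the <dv> frame ends once video frame 30 has been displayed and a response display then goes up would be:
+11 <dv> "movie" <DVFrameDuration 30> / * <bmp> "response.bmp";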
If display of the digital video needs to be abortable <AbortItemKeyName> can be used; if the display needs to be contingent upon a subject's response (or some other condition I can't currently imagine) <AbortItemExpression> can be used (see the jobstatus .eq. 7 example).
If you are doing something extreme like playing back 60 Hz HD video on older hardware and are experiencing stuttering or outright freezing of the playback you may have success by forcing DMDX to track the raster in a relaxed fashion with -auto -ez on the command line (or tripping it with test mode 18) instead of relentlessly hammering the raster status functions, or by using a video mode with a slower refresh, say 24 Hz, should you be able to do so. If that's not enough you could use the freesync video mode modifier that stops DMDX tracking the raster outright (the fact that you don't necessarily have a FreeSync or GSync display is irrelevant); YMMV however, as when freesync is active DMDX is processing the display queue every millisecond and this extra load might outweigh any savings from not using relaxed raster tracking.
Note that if you are reusing the same video over and over (or even just twice in a row) certain combinations of things (like using Direct3D and VMR9 and some particular video formats) can give VMR9 indigestion, and DMDX's attempts to reuse video resources from the previous item can fail either in freaky ways (the video hitches or only plays partially) or spectacularly (DMDX crashes). To mitigate this DMDX 6.3.1.10 has extended the Media Life keyword to handle a media life of zero that flags to DMDX not to reuse the resources used to render this file.
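Assuming <ml> is the abbreviated form of the Media Life keyword (as in the no erase examples above), a hedged sketch of the same hypothetical "movie" file used in back to back items without reusing its resources would be:
0 <dv> "movie" <ml 0> / ;
0 <dv> "movie" <ml 0> / ;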
Streaming Audio
To facilitate the use of large audio instruction sets a subset of the Digital Video abilities has been implemented with the <StreamingAudio> keyword. The problem with large multi-minute (and therefore multi-megabyte) audio files is that they must be read into main memory before DMDX plays them if the <wav> keyword is used; this can take appreciable amounts of time on some systems and can also cause portions of memory to be swapped out to the paging file if really large audio files are used that are a significant percentage of (or larger than) the RAM installed in a machine, slowing things right down. Another problem is that the file must be a plain vanilla .WAV file, no compression. Using the <StreamingAudio> keyword the audio is streamed from disk (only the next fraction of a second is read at any one time) and DMDX also uses the codecs installed on the machine to decode the file.
To use compressed audio files (say .mpg or .au files) a suitable codec must be found, see TimeDX's Digital Video help on Codecs.
The downsides are that any significant degree of precision or synchronization is lost as DirectShow is now doing all the work. The use of .WAV cursors is no longer possible; instead the millisecond accuracy of the <PlaySAFrom> and <PlaySATo> keywords is available (these are in fact synonyms for <PlayDVFrom> and <PlayDVTo>, the frame duration for a streaming audio file being one millisecond). <StreamingAudio> does not support any of the <Sound> (<wav>) keyword's <SetVisualProbe> functionality: the visual probe is always at the start of the audio file (the audio file always begins playing at the frame's scheduled time) and the duration of the frame is either the duration of the audio file or the explicit % frame duration. Nor does it support the <Volume>, <Pan>, <DVFrameDuration>, or <DVClockOn> keywords.
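As a hedged sketch (the file name is a placeholder and a codec for it must be installed on the machine), a compressed instruction file might be presented with only its first five seconds played via <PlaySAFrom> and <PlaySATo> and a response gathered afterwards:
+1 <StreamingAudio> "instructions.mp3" <PlaySAFrom 0> <PlaySATo 5000> / * "Press the space bar to continue";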
It is currently not possible to have the same instance of a streaming audio file playing twice (the old <wav> keyword limitation in a slightly different incarnation) as the <StreamingAudio> keyword really is designed for simple instructional purposes -- should you really need to do so then use slightly different parts of the file, say <PlaySAFrom 0> in one frame and <PlaySAFrom 1> in the other.