BitsPlusImagingPipelineTest([whichScreen=max][, plotdiffs=0][, forcesuccess=0][, winrect=][, bitsSharpPortName][, acceptableerror=1])
Tests correct function of Mono++ and Color++ mode with imaging pipeline…
This test script needs to be run once after each graphics card or
graphics driver or Psychtoolbox upgrade.
This test verifies that the Psychtoolbox image processing pipeline is
capable of correctly converting a high dynamic range image for the
Cambridge Research Systems Bits++ / Bits# box in Mono++ and Color++
mode. The script can also be used to verify proper operation with VPixx
Inc. devices like the DataPixx, ViewPixx, and ProPixx, as they expect
the same type of color encoding of pixel data as the devices from CRS.
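To give a rough idea of what this encoding means, the following sketch shows how Mono++ mode is commonly described: a 16 bit intensity value is split into a high byte and a low byte, carried by the red and green channels of one framebuffer pixel. This is only an illustration of the principle, not the actual GPU shader code of the pipeline; the variable names are made up:

```matlab
% Illustrative sketch only -- the real conversion is done by GPU shaders
% set up by the imaging pipeline. Mono++ packs one 16 bit intensity
% value into the red (high byte) and green (low byte) channels.
v = 46085;                  % example 16 bit intensity, range 0 - 65535
redByte   = floor(v / 256); % high byte -> red channel
greenByte = mod(v, 256);    % low byte  -> green channel
% Decoding recovers the original value for verification:
vDecoded = redByte * 256 + greenByte;   % == 46085 again
```

Color++ mode works on the same byte-splitting principle, but applies it to each of the three color channels, trading spatial resolution for the extra precision.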
It does so by generating a test stimulus and converting it into a Bits++
image twice: once via the Matlab BitsPlusPlus toolbox routines, and once
via the imaging pipeline. It then reads back and compares both
conversion results to verify that the imaging pipeline produces exactly
the same output as the Matlab routines.
If the results match, it writes an info file to the filesystem to record
that this test ran successfully.
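The comparison step can be pictured like the following simplified sketch. This is not the script's actual code; the call sequence and variable names are made up for illustration, although Screen('GetImage') and the BitsPlusPackMonoImage() helper are part of Psychtoolbox:

```matlab
% Simplified sketch of the comparison idea -- not the actual test code.
% Read back what the GPU imaging pipeline wrote into the framebuffer:
gpuImage = Screen('GetImage', win, [], 'backBuffer');
% Compute the reference conversion on the CPU via the Matlab toolbox:
matlabImage = BitsPlusPackMonoImage(stim * 65535);
% Differences no larger than 'acceptableerror' count as a pass:
maxdiff = max(abs(double(gpuImage(:)) - double(matlabImage(:))));
if maxdiff <= acceptableerror
    fprintf('PASS: maximum difference of %f units.\n', maxdiff);
end
```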
whichScreen = Screen id of the display to test on. Defaults to the
              secondary display if none is provided.
plotdiffs = If set to 1, plot diagnostic difference images if any
            differences are detected. By default, no such images are
            plotted; none are plotted if no differences exist.
forcesuccess = Set this to 1 if you want to force the test to succeed
               despite detected errors, i.e., if you want the GPU
               conversion to be used regardless. Only use this if you
               really know what you are doing!
winrect = Optional placement rectangle for window. Defaults to fullscreen.
bitsSharpPortName = Optional name of the serial port to which a Bits# device
is connected. If omitted, the portName will be fetched
from the global Bits# config file, or auto-detected.
acceptableerror = Maximum acceptable error in absolute output intensity
                  difference. Differences smaller than this value are
                  considered a successful "pass" of the test by the GPU.
                  The default is 1, i.e., 1 unit out of 65535. On CRS
                  Bits+ and Bits# devices this causes no measurable
                  error, as these devices only use 14 bits of the 16 bit
                  range, so an error in the least significant bit of
                  intensity won't show up anyway. On VPixx
                  Data/View/ProPixx devices, the error is so small that
                  it simply disappears in the noise caused by other
                  error sources, e.g., inaccurate display calibration or
                  gamma correction, lighting conditions, or variation in
                  display hardware due to operating temperature, voltage
                  fluctuations etc. The default setting of 1 should be
                  perfectly fine; even higher values (maybe up to 10)
                  would be ok in practice.
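As a usage example, the following hypothetical call (parameter values chosen for illustration) would run the test on screen 1, plot difference images if any are found, and accept differences of up to 5 units out of 65535:

```matlab
% whichScreen=1, plotdiffs=1, forcesuccess=0, default winrect (fullscreen),
% default Bits# port, acceptableerror=5:
BitsPlusImagingPipelineTest(1, 1, 0, [], [], 5);
```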
Please note that this test script can only verify that Psychtoolbox
generates the correct output in your system's framebuffer. It cannot
detect whether the Bits++ box itself processes this data correctly. Only
visual inspection and a photometer/colorimeter test can really tell you
whether the whole system is working correctly!