Staging: comedi: drivers: adl_pci9111: Fix AI commands in TRIG_FOLLOW case
I received a report that AI streaming acquisitions do not work properly for the adl_pci9111 driver when convert_src is TRIG_TIMER and scan_begin_src is TRIG_FOLLOW (and scan_begin_arg is therefore 0).

This seems to be down to the incorrect setting of dev_private->scan_delay in pci9111_ai_do_cmd(). Under the previously stated conditions, dev_private->scan_delay ends up set to (unsigned int)-1, but it ought to be set to 0. The function sets it to 0 initially, and it only makes sense to change it if both convert_src and scan_begin_src are set to TRIG_TIMER.

Note: 'scan_delay' is the number of unwanted scans to discard after each valid scan. The hardware does not support 'scan' timing as such, just a regularly paced conversion timer (with automatic channel switching between conversions). The driver simulates a scan period that is some (>1) multiple of the conversion period times the scan length (chanlist_len samples) by reading chanlist_len samples and discarding the next scan_delay times chanlist_len samples.

Signed-off-by: Ian Abbott <abbotti@mev.co.uk>
Signed-off-by: Greg Kroah-Hartman <gregkh@suse.de>
commit 6c2fd30804
parent 44176d9f82
@@ -824,9 +824,12 @@ static int pci9111_ai_do_cmd(struct comedi_device *dev,
 	plx9050_interrupt_control(dev_private->lcr_io_base, true, true,
 				  false, true, true);
 
-	dev_private->scan_delay =
-		(async_cmd->scan_begin_arg / (async_cmd->convert_arg *
-			async_cmd->chanlist_len)) - 1;
+	if (async_cmd->scan_begin_src == TRIG_TIMER) {
+		dev_private->scan_delay =
+			(async_cmd->scan_begin_arg /
+			 (async_cmd->convert_arg *
+			  async_cmd->chanlist_len)) - 1;
+	}
 
 	break;
 
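For illustration, here is a minimal, self-contained user-space sketch of the scan_delay arithmetic described in the commit message. The struct fake_cmd, the helpers scan_delay_old()/scan_delay_new(), the TRIG_* constant values and the example numbers are stand-ins invented for this sketch, not the real comedi definitions; it only demonstrates why the unconditional calculation underflows to (unsigned int)-1 when scan_begin_arg is 0, and why gating it on scan_begin_src == TRIG_TIMER leaves the initial value of 0 in place.

/*
 * Stand-alone sketch of the scan_delay calculation; not driver code.
 * Constant values and field names are placeholders.
 */
#include <stdio.h>

#define TRIG_TIMER  1
#define TRIG_FOLLOW 2

struct fake_cmd {
	unsigned int scan_begin_src;
	unsigned int scan_begin_arg;	/* 0 when scan_begin_src == TRIG_FOLLOW */
	unsigned int convert_arg;	/* conversion period in ns */
	unsigned int chanlist_len;	/* samples per scan */
};

/* Old behaviour: computed unconditionally. */
static unsigned int scan_delay_old(const struct fake_cmd *cmd)
{
	return (cmd->scan_begin_arg /
		(cmd->convert_arg * cmd->chanlist_len)) - 1;
}

/* Fixed behaviour: only computed when the scan timer is actually in use. */
static unsigned int scan_delay_new(const struct fake_cmd *cmd)
{
	unsigned int scan_delay = 0;

	if (cmd->scan_begin_src == TRIG_TIMER)
		scan_delay = (cmd->scan_begin_arg /
			      (cmd->convert_arg * cmd->chanlist_len)) - 1;
	return scan_delay;
}

int main(void)
{
	/* TRIG_FOLLOW case from the report: scan_begin_arg is 0. */
	struct fake_cmd cmd = {
		.scan_begin_src = TRIG_FOLLOW,
		.scan_begin_arg = 0,
		.convert_arg = 10000,
		.chanlist_len = 4,
	};

	/* 0 / N == 0, so the old code computes 0u - 1 == UINT_MAX. */
	printf("old: %u\n", scan_delay_old(&cmd));	/* 4294967295 */
	printf("new: %u\n", scan_delay_new(&cmd));	/* 0 */
	return 0;
}

Built with any C compiler, the old calculation prints 4294967295 (i.e. (unsigned int)-1, so every valid scan would be followed by billions of discarded scans), while the fixed one prints 0, matching the behaviour the patch restores.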