
MDM Questions... (Sys: ARM STM32F205RCT6)

PostPosted: Sat Feb 01, 2020 4:47 pm
by jorgo
Hi Dmitri, some questions:

1. In the description you indicate that
> MinScan:
Measured/shown in 1/10th of millisecond. Can be set between 1 and 100 (0.1 - 10ms).
...but in "MDM" the smallest settable value is 10 (1 ms), not 1 (0.1 ms) - a bug?

2. It would also be nice to be able to set the "Retrigger Mask" parameter in 1/10ths of a millisecond.
I often have the problem that the optimal value might (not sure) lie between 1 and 2 (like 1.2) or between 2 and 3 (like 2.7).
So, if this is possible, a smallest settable value of 1/10th of a millisecond would be nice...

3. What changes when "Alt Sampling Alg" is activated?
I use "Alt Sampling Alg" (maybe I'm the only one ;-) but get better
results (from my novice perspective).
Now I would like to know what's behind it.

Thank you in advance and greetings from Hamburg

Re: MDM Questions... (Sys: ARM STM32F205RCT6)

PostPosted: Sun Feb 02, 2020 8:41 pm
by dmitri
1. It's not a bug but an error in the description. It should say between 10 and 100, which means 1-10 ms.
2. To be honest, double triggering is better controlled with DynTime and DynLevel; a Retrigger Mask below 10 ms really does not make much sense (see the first sketch below).
3. With the AltSampling algorithm, sampling is done on multiple inputs in parallel. It's potentially better because it causes fewer MUX switches and, as a result, less switching noise (see the second sketch below).
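
Roughly, the difference between the two approaches to double triggering looks like this (a minimal sketch; the names and numbers are made up for illustration and are not the actual firmware):

```c
#include <stdbool.h>
#include <stdint.h>

#define RETRIGGER_MASK_MS 10 /* hard mask: drop everything this long */
#define DYN_LEVEL 40         /* extra threshold added right after a hit */
#define DYN_TIME_MS 60       /* time for that extra threshold to decay */

/* Hard Retrigger Mask: any peak inside the window is dropped,
 * no matter how loud it is. */
static bool masked(uint32_t now_ms, uint32_t last_hit_ms) {
    return (now_ms - last_hit_ms) < RETRIGGER_MASK_MS;
}

/* DynTime/DynLevel style: the threshold is raised after a hit and
 * decays over DYN_TIME_MS, so a genuinely loud second hit still gets
 * through while ringing and decay noise are rejected. */
static uint16_t dynamic_threshold(uint16_t base, uint32_t now_ms,
                                  uint32_t last_hit_ms) {
    uint32_t dt = now_ms - last_hit_ms;
    if (dt >= DYN_TIME_MS)
        return base;
    return base + (uint16_t)(DYN_LEVEL * (DYN_TIME_MS - dt) / DYN_TIME_MS);
}
```

This is why a finer-grained mask buys you little: the decaying threshold already discriminates by level, not only by time.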
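
And a sketch of what "parallel sampling" can mean here (again, the names and the number of MUXes are assumptions for illustration only, not the real MegaDrum code). Several external analog MUXes share the same address lines and feed separate ADC channels:

```c
#include <stdint.h>

#define NUM_MUXES 4    /* assumed number of parallel analog MUXes */
#define MUX_CHANNELS 8 /* 8:1 MUXes sharing their select lines */

extern void mux_select(uint8_t addr);          /* set shared select lines */
extern uint16_t adc_read(uint8_t adc_channel); /* one ADC conversion */
extern void settle_delay(void);                /* wait for MUX to settle */

/* Sequential style: switch the MUX address for every single input.
 * 4 x 8 = 32 inputs -> 32 address changes per scan. */
void scan_sequential(uint16_t out[NUM_MUXES][MUX_CHANNELS]) {
    for (uint8_t m = 0; m < NUM_MUXES; m++)
        for (uint8_t a = 0; a < MUX_CHANNELS; a++) {
            mux_select(a);
            settle_delay();
            out[m][a] = adc_read(m);
        }
}

/* Parallel style: set one address, then read all MUX outputs on their
 * own ADC channels. 32 inputs -> only 8 address changes per scan,
 * so less switching noise is coupled into the signals. */
void scan_parallel(uint16_t out[NUM_MUXES][MUX_CHANNELS]) {
    for (uint8_t a = 0; a < MUX_CHANNELS; a++) {
        mux_select(a);
        settle_delay();
        for (uint8_t m = 0; m < NUM_MUXES; m++)
            out[m][a] = adc_read(m);
    }
}
```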

Re: MDM Questions... (Sys: ARM STM32F205RCT6)

PostPosted: Tue Feb 04, 2020 4:13 pm
by jorgo
Hi Dmitri,
next question...

- Description: MinScan
...
"Lowering it will improve Latency (MegaDrum latency is between Latency and Latency+MinScan) and may worsen level accuracy. Raising it will worsen Latency and may improve level accuracy."

Do you mean the total latency of the module (i.e. one input with a higher MinScan setting affects all inputs),
or does the increased latency refer only to the input with the higher MinScan value?

Thx a Lot...
HangLoose...

Re: MDM Questions... (Sys: ARM STM32F205RCT6)

PostPosted: Sat Feb 08, 2020 10:39 pm
by dmitri
Max latency of each input is equal to Latency (global) + MinScan (per input), so a higher MinScan on one input increases the worst-case latency of that input only.
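
A minimal worked example (the shared 1/10 ms unit for both settings is an assumption):

```c
#include <stdint.h>

/* Max latency of one input = global Latency + that input's MinScan,
 * assuming both are given in 1/10ths of a millisecond. */
static uint16_t max_latency_tenths(uint16_t latency_global,
                                   uint16_t minscan_input) {
    return latency_global + minscan_input;
}

/* Example: global Latency = 20 (2.0 ms).
 * Snare with MinScan 10 -> max_latency_tenths(20, 10) = 30 (3.0 ms).
 * Tom with MinScan 30   -> max_latency_tenths(20, 30) = 50 (5.0 ms).
 * Raising the tom's MinScan does not touch the snare's latency. */
```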