Re: [AH] DIN Sync Splitter and DIN Sync timing gotchas

From Colin f
Sent Fri, Nov 28th 2003, 15:18

> The primary situation my concerns relate to is at the
> start of the song, but I am also concerned that a sequencer could put
> out two Clock bytes close together.   The latter could happen for
> reasons we cannot anticipate - which are unlikely, but possible.

Start of the song I'll come back to...
The second point, about two clock bytes occurring together, is probably
possible, but it is still sloppy programming IMHO.
System realtime bytes can be inserted at any time in the midi stream, even
in the middle of a note message for example.
So in the code I write, whenever a clock event occurs, the midi clock byte
is sent as soon as possible. If a byte transmit is already in progress, the
clock byte goes out immediately after it completes. Maximum timing error for a clock pulse is
therefore no more than about 0.4ms, including software latency in the
interrupt routine that triggers the clock event.
Midi clock bytes have no need to go into a buffer along with other bytes. I
imagine clock bytes coming out of a heavily loaded buffer would be one way
you might see them transmitted contiguously.
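To put numbers on that, here is a minimal sketch (the queue depth is an arbitrary assumption; the byte framing is just standard MIDI at 31250 baud, 10 bits per byte on the wire):

```python
# Why a clock byte should jump the transmit queue rather than sit in a
# buffer behind other pending bytes.
# At 31250 baud a MIDI byte is 10 bits on the wire (start + 8 data + stop).

BYTE_TIME_MS = 10 / 31250 * 1000  # 0.32 ms per byte on the wire

def clock_delay_immediate():
    # Clock byte jumps the queue: in the worst case it waits only for the
    # one byte already in progress, then goes straight out.
    return BYTE_TIME_MS

def clock_delay_buffered(bytes_ahead):
    # Clock byte queued behind pending bytes in a loaded TX buffer.
    return bytes_ahead * BYTE_TIME_MS

print(round(clock_delay_immediate(), 2))   # 0.32
print(round(clock_delay_buffered(8), 2))   # 2.56
```

Adding a little interrupt latency on top of that 0.32ms worst case gives you the roughly 0.4ms figure above, while a clock byte stuck behind even a handful of buffered bytes is already milliseconds late.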

If you look at the 303, 606 or 808 schematics around the sync oscillator,
you'll see that pressing the run/stop switch resets the oscillator.
I believe this is just to ensure that the first clock pulse seen after the
device enters the running state is the full length. If the clock pulse was
already 1.5ms old by the time RUN went high, then the device would very
likely miss the remaining 0.5ms.
But I think it's fair to say that even the 303, 606 or 808 would respond
to a clock pulse with a duration of 4ms even if it occurred 0.33ms after the
RUN line went high. In the worst case, only a 4ms delay would be required
between run going high and the start of the pulse, as this is equivalent to
what happens at an x0x sync output running with an 8ms clock period.
I think a bit of experimentation is called for.
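For reference, the arithmetic behind those figures (a sketch, assuming the usual 24 pulses per quarter note and roughly a 50% duty cycle, as on the x0x sync outputs):

```python
# DIN sync clock period as a function of tempo, at 24 pulses per
# quarter note. With ~50% duty, each half of the cycle is period/2.

def din_sync_period_ms(bpm, ppqn=24):
    return 60_000 / (bpm * ppqn)

# An 8 ms clock period corresponds to 60000/(24*8) = 312.5 BPM, i.e. the
# fast end of the usable range; the high phase is then 4 ms, which is the
# worst-case figure quoted above.
period = din_sync_period_ms(312.5)
print(period, period / 2)   # 8.0 4.0

# At a more ordinary 120 BPM the period is ~20.8 ms, so the margins are
# far more comfortable.
print(round(din_sync_period_ms(120), 1))   # 20.8
```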

> Since the MIDI sequencer is physically capable of sending a Clock byte
> every 0.33 ms, since we have no way of anticipating the timing of these
> bytes with every sequencer in creation, operating in every computer
> (with its operating systems and other software vying for CPU time),
> output interface and musical situation possible, I think that it is
> vital to have a general capacity to buffer received Clock bytes and make
> sure they all generate Clock cycles, within the timing limits.  There's
> probably no point in trying to count up more than 8 or so such Clock
> bytes - because any MIDI sequencer which puts out 8 or more so densely
> that we need to buffer them is probably not functioning properly anyway.

But if you buffer clock bytes, and interpolate with a moving average of
their period, you are going to introduce timing errors even in a rock stable
midi clock where there is a sudden change in tempo. This may even be
audible.
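A quick illustration of that lag (a minimal sketch; the 8-tick window and the tempo figures are arbitrary assumptions, not anyone's actual implementation):

```python
# A moving average of recent clock periods lags behind a step tempo
# change, so regenerated pulses drift relative to a perfectly stable
# source until the averaging window flushes.

from collections import deque

def smoothed_periods(periods, window=8):
    buf = deque(maxlen=window)
    out = []
    for p in periods:
        buf.append(p)
        out.append(sum(buf) / len(buf))
    return out

# Rock-stable clock at 120 BPM (20.833 ms per 24ppqn tick), then a sudden
# jump to 140 BPM (17.857 ms per tick).
ticks = [20.833] * 16 + [17.857] * 16
est = smoothed_periods(ticks)

# On the first tick after the jump the estimate is still near 20.5 ms --
# an error of roughly 2.6 ms per tick, decaying over the next 8 ticks.
print(round(est[16] - 17.857, 3))
```

At 24 pulses per quarter note that per-tick error accumulates quickly enough over a window to be audible, which is the point above.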
I prefer to stick with a known accurate and precise clock source.
Maybe this is why some people still avoid PCs running a non-realtime OS for
realtime applications ;-)

Cheers,
Colin f


