Re: [AH] DIN Sync Splitter and DIN Sync timing gotchas

From Robin Whittle
Sent Fri, Nov 28th 2003, 03:06

Hi Colin,

I agree with all you wrote about timing of the high and low phases of
the Clock signal needing to be seen by a CPU which runs its firmware to
look at the Clock (and everything else) once every 1.8 ms.   I think
the 4 ms high and 4 ms low arrangement sounds good.
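For reference, the 4 ms high / 4 ms low arrangement (8 ms per Clock
cycle) puts a ceiling on the tempo the Sync output can convey.  A
minimal sketch of the arithmetic, assuming the standard 24 pulses per
quarter note:

```c
/* DIN Sync runs at 24 Clock pulses per quarter note.  With a minimum
   cycle of 4 ms high + 4 ms low = 8 ms, a quarter note needs at least
   24 * 8 = 192 ms, capping the tempo at 60000 / 192 = 312.5 BPM. */
double max_sync_bpm(double min_cycle_ms)
{
    double quarter_ms = 24.0 * min_cycle_ms;  /* ms per quarter note  */
    return 60000.0 / quarter_ms;              /* 60000 ms per minute  */
}
```

So the 8 ms cycle only becomes a constraint above roughly 312 BPM,
well past any musical tempo.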

You wrote:

> If the midi clock bytes arrive closer together than DIN sync can
> handle, then the instantaneous tempo is higher than the slaved device
> can possibly handle. I'd say that's a problem with the source, rather
> than the destination.
> Personally I think if a midi device is introducing that much error in
> it's clock output, it's not a stable enough clock.

What I wrote about the need to send out Clock pulses slower than they
may arrive via MIDI is not related to the MIDI sequencer running at such
a high tempo that it violates the Clock timing restrictions, such as the
8 ms cycle time.  The primary situation my concerns relate to is at the
start of the song, but I am also concerned that a sequencer could put
out two Clock bytes close together.   The latter could happen for
reasons we cannot anticipate - which are unlikely, but possible.  A MIDI
slave device would stay in time no matter how quickly the Clock bytes
arrived - and the task is to make the MIDI to Sync converter keep the
slave devices, such as TB-303s, TR-808s etc. in time too.

Let's say the MIDI sequencer puts out a Start byte followed immediately
by a Clock byte.  That's a perfectly reasonable thing to send via MIDI,
and a recipient MIDI device would make the initial sounds for the first
beat of the song as soon as it could.   The trouble arises if the
recipient device is a MIDI to Roland ("DIN") Sync converter.  If it
simply follows the MIDI timing, it will probably raise the Run/Stop
signal and 1/3 ms later create the positive edge of the first Clock
cycle.   While some devices receiving this Roland Sync timing may
respond correctly, many will not, because they are still scratching
their heads about starting the song when the first Clock cycle arrives.
 Consequently, they miss it, and remain forever 1/24 of a quarter
note behind.

So I think that in addition to the Clock cycle timing you suggest, there
also needs to be some similar timing restriction on not starting the
first Clock cycle until X ms after the Run/Stop signal goes high.
Testing this with a TB-303 seems a reasonable way of estimating how
long this should be - and a safety margin should be added.
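The hold-off could be a small state machine in the converter's
firmware.  This is a minimal sketch, not Colin's design: GUARD_MS = 10
is an assumed figure standing in for the value found by testing a
TB-303 plus a safety margin, and the function names are hypothetical.

```c
#include <stdbool.h>

/* GUARD_MS is an ASSUMED guard time; determine the real value by
   testing a TB-303 and adding a safety margin. */
#define GUARD_MS 10

typedef struct {
    bool running;        /* state of the Run/Stop output          */
    unsigned ms_run;     /* ms elapsed since Run/Stop went high   */
    bool clock_pending;  /* a Clock byte arrived during the guard */
} start_state;

/* Call when a MIDI Start byte (0xFA) arrives. */
void on_midi_start(start_state *s)
{
    s->running = true;   /* raise Run/Stop immediately...        */
    s->ms_run = 0;       /* ...but start the guard timer         */
    s->clock_pending = false;
}

/* Call when the first MIDI Clock byte (0xF8) arrives after Start. */
void on_first_clock(start_state *s)
{
    s->clock_pending = true;
}

/* Call once per millisecond from the firmware's main loop; returns
   true when the first Clock cycle may begin on the Sync output. */
bool start_tick_1ms(start_state *s)
{
    if (!s->running)
        return false;
    if (s->ms_run < GUARD_MS) {
        s->ms_run++;
        return false;    /* still inside the guard time          */
    }
    if (s->clock_pending) {
        s->clock_pending = false;
        return true;     /* safe to send the first Clock now     */
    }
    return false;
}
```

Even if the sequencer sends the Clock byte 0.33 ms after Start, the
slave devices don't see the first Clock edge until the guard expires.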

Even if we add a 10 ms delay between the rising Run/Stop and the start
of the first Clock cycle, it's not going to be audible, since only the
first notes or drum sounds will be delayed by this barely perceptible
amount.

Since a MIDI sequencer is physically capable of sending a Clock byte
every 0.33 ms, and since we have no way of anticipating the timing of
these bytes across every sequencer in creation, operating on every
possible computer (with its operating system and other software vying
for CPU time), output interface and musical situation, I think it is
vital to have a general capacity to buffer received Clock bytes and
make sure they all generate Clock cycles within the timing limits.
There's probably no point in trying to count up more than 8 or so such
Clock bytes - because any MIDI sequencer which puts out 8 or more so
densely that we need to buffer them is probably not functioning
properly anyway.
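The buffering amounts to a small pending-pulse counter that is bumped
for each Clock byte received and drained at no faster than the minimum
cycle time.  A sketch under the assumptions above (8 ms minimum cycle,
a cap of 8 queued Clocks; names are illustrative):

```c
/* ASSUMED values: MAX_PENDING caps the queue (more than ~8 queued
   Clocks suggests a broken sequencer); MIN_CYCLE_MS is the 4 ms high
   + 4 ms low minimum Clock cycle. */
#define MAX_PENDING   8
#define MIN_CYCLE_MS  8

typedef struct {
    unsigned pending;        /* Clock bytes received but not yet output */
    unsigned ms_since_edge;  /* ms since the last Clock cycle started   */
} clock_buf;

/* Call when a MIDI Clock byte (0xF8) arrives. */
void on_midi_clock(clock_buf *b)
{
    if (b->pending < MAX_PENDING)
        b->pending++;        /* queue it rather than drop it */
}

/* Call once per millisecond; returns 1 when a new Clock cycle should
   start on the Sync output, pacing output to the minimum cycle time
   no matter how densely the bytes arrived. */
int clock_tick_1ms(clock_buf *b)
{
    b->ms_since_edge++;
    if (b->pending > 0 && b->ms_since_edge >= MIN_CYCLE_MS) {
        b->pending--;
        b->ms_since_edge = 0;
        return 1;
    }
    return 0;
}
```

Two Clock bytes arriving back to back then come out as two Clock
cycles spaced 8 ms apart, so none is lost and the slave drifts back
into step within a few milliseconds.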

 - Robin

  Devil Fish mods for the TB-303