[r-t] 147 TDMM

Richard Smith richard at ex-parrot.com
Mon Oct 4 13:49:19 UTC 2010


Mark Davies wrote:

> I've studied Richard's description of the new search 
> algorithm with interest. There seem to be no completely 
> new innovations here, but the key breakthrough appears to 
> be the separation of the tree search into two parts: 
> first, the partitioning of the space into sets of 
> mutually-true leads, and second, the linkage search, 
> seeking to join the members of each set. Is this a fair 
> summary?

That's certainly one of the key elements of the search 
algorithm.  As you note later, this isn't in itself 
particularly innovative.  However, I think there are a few 
aspects of the first part of the search that I've not used 
before and that are perhaps worth noting.  Whether they're 
completely new innovations, I'll leave for others to 
decide.

Johnson's 10-part is a good example of where I've seen this 
sort of two-stage search done.  The first stage finds 
mutually true blocks with D_5 as a part end group, and the 
second stage looks for Q-sets to join them up.  I think all 
the successful two-stage searches I've seen have had a first 
stage that produced round blocks.  Arguably this is no 
exception: if we stick a 2nds place lead end after every 
lead, we partition the extent into mutually true blocks (even 
for something like Norwich which doesn't have a 2nds place 
l.e. variant).  My second stage doesn't involve Q-sets, but 
does a brute-force search of all possible lead ends.  This 
means that something like a Relfe splice (that doesn't have 
a description in terms of Q-sets) is still found.  I don't 
claim that to be especially innovative, though.
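
In outline, the linkage stage is the sort of thing sketched 
below in Python.  The successor table is invented purely for 
illustration; in the real search it would be derived from 
the lead heads and lead ends of the leads in the plan.

    from itertools import product

    N_LEADS = 6

    # successors[lead][lead_end] = the lead we arrive at if that
    # lead end is chosen.  Invented numbers, purely to show the
    # structure.
    successors = [
        {'2nds': (i + 1) % N_LEADS, '6ths': (i + 2) % N_LEADS}
        for i in range(N_LEADS)
    ]

    def single_round_block(choice):
        """True if following the chosen lead ends from lead 0 visits
        every lead exactly once before returning to the start."""
        seen, lead = set(), 0
        while lead not in seen:
            seen.add(lead)
            lead = successors[lead][choice[lead]]
        return lead == 0 and len(seen) == N_LEADS

    for choice in product(('2nds', '6ths'), repeat=N_LEADS):
        if single_round_block(choice):
            print(choice)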

The interesting bits are in the details of the first stage. 
The critical step is that I don't look at the lead ends in 
order.  If at some stage I have some lead ends with methods 
assigned to them, and some unassigned, the next lead end I 
choose is the unassigned one with the fewest viable methods. 
This is key to getting the search 
to run in a sensible length of time.  If I search the leads 
in lexicographic order, a search of plans of the 41 surprise 
minor takes 89h; by choosing the one with the fewest viable 
methods, this reduces to 51s.
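
To make that concrete, here is a rough Python sketch of the 
idea on toy data.  The compatibility table is random and 
merely stands in for real falseness tables; none of this is 
lifted from my actual search program.

    import random

    random.seed(1)
    N_LEADS, N_METHODS = 8, 5

    # Toy stand-in for a falseness table: compatible[a, i, b, j]
    # says whether method i at lead a and method j at lead b are
    # mutually true.
    compatible = {}
    for a in range(N_LEADS):
        for b in range(a + 1, N_LEADS):
            for i in range(N_METHODS):
                for j in range(N_METHODS):
                    c = random.random() > 0.3
                    compatible[a, i, b, j] = c
                    compatible[b, j, a, i] = c

    def viable(assignment, lead):
        """Methods still usable at `lead`, given the partial
        assignment."""
        return [m for m in range(N_METHODS)
                if all(compatible[lead, m, other, assignment[other]]
                       for other in assignment)]

    def search(assignment=None):
        if assignment is None:
            assignment = {}
        if len(assignment) == N_LEADS:
            yield dict(assignment)
            return
        unassigned = [l for l in range(N_LEADS)
                      if l not in assignment]
        # The key step: branch on the lead with the fewest viable
        # methods, not on the next lead in lexicographic order.
        lead = min(unassigned,
                   key=lambda l: len(viable(assignment, l)))
        for m in viable(assignment, lead):
            assignment[lead] = m
            yield from search(assignment)
            del assignment[lead]

    print(sum(1 for _ in search()), "complete plans on toy data")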

Another fairly obvious speed improvement comes from a poor 
man's version of rotational pruning.  Once the first method 
has been chosen, we can require all subsequent methods not 
to come before the first method in the list of methods that 
we're studying.  This only eliminates about half of the 
rotations and reflections -- I trap the remaining ones by 
filtering the result of the search.  But the speed-up is 
significant.  The 51s required for the 41 was reduced to 
9.5s.
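
In terms of the toy sketch above, the pruning is a single 
extra test: remember the index of the first method placed, 
and never let a later lead take a method that comes earlier 
in the list.  (On made-up data the test obviously doesn't 
correspond to a real symmetry; the point is only where it 
sits in the search.)

    def search_pruned(assignment=None, first=None):
        """As search() above, but once the first method has been
        chosen, later leads may only take methods at or after it
        in the method list."""
        if assignment is None:
            assignment = {}
        if len(assignment) == N_LEADS:
            yield dict(assignment)
            return
        unassigned = [l for l in range(N_LEADS)
                      if l not in assignment]
        lead = min(unassigned,
                   key=lambda l: len(viable(assignment, l)))
        for m in viable(assignment, lead):
            if first is not None and m < first:
                continue        # the symmetry-breaking constraint
            assignment[lead] = m
            yield from search_pruned(
                assignment, first if first is not None else m)
            del assignment[lead]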

A final speed-up came from the order in which methods were 
considered.  With rotational pruning enabled, if you put 
methods with lots of inter-method falseness earlier in the 
search, you get a speed improvement.  Unfortunately it's not 
always clear which methods are the worst -- and something 
like Warkworth, which one might expect to have lots of 
inter-method falseness, actually turns out not to have too 
much -- Bourne and Ipswich are both considerably worse.  But 
with some experimentation, reordering the list of methods 
allowed me to reduce the search to 3.4s.
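
Again in terms of the toy sketch, the reordering is just a 
relabelling of the method list before the search starts. 
The naive falseness count below is only a starting point -- 
as I say, the order that actually worked best was found by 
experiment.

    # Score each method by how many incompatible pairs it appears
    # in, then renumber the methods so the worst offenders come
    # first, which lets the pruning above cut branches earlier.
    score = [sum(1 for (a, i, b, j), ok in compatible.items()
                 if i == m and not ok)
             for m in range(N_METHODS)]
    order = sorted(range(N_METHODS), key=lambda m: -score[m])
    new_index = {old: new for new, old in enumerate(order)}
    compatible = {(a, new_index[i], b, new_index[j]): ok
                  for (a, i, b, j), ok in compatible.items()}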


> I think this approach has been used before, in custom searches (Stedman 
> Triples springs to mind), but I wonder if it could be more widely 
> applied. It is obviously most useful when working on complete extents, 
> but may have applications to non-extent searches too. I'm imagining a 
> music-driven search in TD Major, for example, where the "plans" would be 
> driven from the forced inclusion of certain musical nodes, and exclusion 
> of others, with the majority of less-musical plans omitted. The 
> difficulty would be establishing, in each case, whether the time taken to 
> create N plans, and to run N linkage searches on each one, is less than 
> the time taken to run a single conventional search.

An interesting idea.  I hadn't thought about using it to 
look for musical touches.  However, what do you propose the 
nodes in the first stage should be?  If you intend to take 
the 5040 possible l.h.s / l.e.s, I can see a serious problem. 
With minor, the search is viable because the plans are 
already implicitly in the form of round blocks.  This means 
that almost all (95.7%) plans can be linked up.

If you only intend your major plan to have around 316
assigned l.h.s / l.e.s, the chances of linking these are so 
slim that we can assume it will never happen by chance.  So 
you need a strategy for ensuring it is possible (or at least 
likely to be possible) to join them up.  I'm not sure what 
that would be.


An application where I see this potentially being of use is 
looking for interesting extents of spliced surprise major. 
We can easily impose a part-end group on the search.  A 
simple attempt might use S_5 (on 23456), which would only 
leave 42 l.h.s and l.e.s, which can often be joined into a 
672, e.g. by 3 bobs V.  (The 21 parts can always be joined.) 
That would probably be a small enough collection of l.h.s 
and l.e.s that we could use a long list of methods -- 
maybe several hundred.
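
As a quick sanity check on the figure of 42, the toy Python 
fragment below counts the classes that the 5040 treble-lead 
rows fall into under that part-end group.  (I've taken the 
group as permuting whatever lies in 2-6's places; the count 
is the same whichever side the group acts on.)

    from itertools import permutations

    # The 5040 rows of major with the treble leading: the bells
    # in 2nds place through to 8ths place.
    rows = set(permutations(range(2, 9)))

    def canon(row):
        # Identify rows related by a part end from S_5 on 23456,
        # taken here as permuting whatever lies in 2-6's places.
        head, tail = row[:5], row[5:]
        return min(p + tail for p in permutations(head))

    print(len({canon(r) for r in rows}))    # prints 42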

RAS



