I'm sorry, I should have explained better - what I suggested has nothing to do with Blockhead. I only mentioned Blockhead as an example of how much a little code injection (an OBSE plugin) can change the rules.
I'll divide my explanation into three separate problems, which build on top of each other.
Problem 1: Measurement
I'm not experienced with modelling, but I always imagined that meshes get "rigged" to skeletons: the ends of bones are tied to certain vertices of the mesh (probably identified by ID or something like that - or, more likely, the other way around: each mesh vertex references a bone node by bone ID). If that's the case, then ingame at *runtime* a BBB bone might indeed be used as a measuring rod for breast diameter, because ingame the bone stretches to connect to its target vertex on the mesh. Well, okay, not necessarily... I can imagine some ways to make breasts bounce without actually putting a bone *through* the breast. So yeah, it depends on how BBB is achieved.
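For what it's worth, real skinned meshes usually work like the second guess: each vertex carries a small list of bone indices plus weights. A rough sketch of that data layout (names are illustrative, not the actual NIF structures, which would be something like NiSkinInstance/NiSkinData):

```cpp
#include <cstdint>

// Illustrative sketch only, not the real NIF structures: each vertex is
// weighted to one or more bones by index, rather than bones pointing at verts.
struct SkinWeight {
    uint16_t boneIndex;   // index into the skeleton's bone list
    float    weight;      // influence of that bone; weights sum to 1.0
};

struct SkinnedVertex {
    float      pos[3];      // bind-pose position
    SkinWeight weights[4];  // up to four bone influences per vertex is typical
};
```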
Anyway, assuming there are bones going through the breasts and connecting to one of their surface vertices, code injected at runtime might determine the current location of the bone node - and thus, indirectly, the approximate position/extent of the breast surface mesh.
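As a very rough sketch of what such injected code might look like - type and member names here are placeholders modelled loosely on the Gamebryo scene graph, not the exact OBSE plugin headers:

```cpp
// Hedged sketch: resolve a named bone on an actor and read its world-space
// position. All names are placeholders; the real layout may differ.
struct NiPoint3 { float x, y, z; };

struct NiAVObject {
    NiPoint3 worldTranslate;  // world-space position, updated each frame
    virtual NiAVObject* GetObjectByName(const char* name) = 0;
};

// Hypothetical helper: get the actor's root scene node from its reference.
NiAVObject* GetActorRootNode(void* actorRef);

bool GetBoneWorldPos(void* actorRef, const char* boneName, NiPoint3* out) {
    NiAVObject* root = GetActorRootNode(actorRef);
    if (!root) return false;
    // "Bip01 R Hand" is a standard Oblivion bone name; the BBB breast bone
    // names depend on which skeleton mod is installed.
    NiAVObject* bone = root->GetObjectByName(boneName);
    if (!bone) return false;
    *out = bone->worldTranslate;
    return true;
}
```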
Next, for things like hands there almost certainly are bones going right through them - perhaps even one for each individual finger, though I'm not certain about that (maybe it's just one bone for the thumb and another for the rest). In any case, locating the hands of another actor via bone node location is a no-brainer, once you can read bone node locations at all.
If you've followed this far, you'll realize that we now have the position and orientation of the hand of actorA, and the approximate surface location and orientation of the breast of actorB. So we now know where stuff is. What's still missing at this point is a way to move things.
(Small sidenote: all of the above work might not actually be necessary if we're willing to do a bit of hardcoding. The number of popular body mods and cup sizes is small enough to hardcode offsets into a table - so if we can figure out the current body variant of the actors, we can simply read the offsets from the table, without having to measure. An illustrative sketch follows below.)
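Purely for illustration, such a table might look like this (HGEC and DMRA are real popular body variants, but the offset numbers are made up):

```cpp
#include <cstring>

// Made-up numbers purely for illustration: breast surface offsets relative
// to the chest bone, indexed by body variant and cup size.
struct BreastOffset { float forward, up, side; };

struct BodyEntry {
    const char*  bodyMod;   // e.g. "HGEC", "DMRA"
    char         cup;       // 'A'..'E'
    BreastOffset offset;
};

static const BodyEntry kOffsetTable[] = {
    { "HGEC", 'C', {  9.5f,  1.0f, 4.0f } },  // fictional values
    { "HGEC", 'E', { 12.0f,  0.5f, 4.5f } },
    { "DMRA", 'E', { 14.0f, -0.5f, 5.0f } },
};

const BreastOffset* LookupOffset(const char* bodyMod, char cup) {
    for (const BodyEntry& e : kOffsetTable)
        if (std::strcmp(e.bodyMod, bodyMod) == 0 && e.cup == cup)
            return &e.offset;
    return nullptr;  // unknown body variant: fall back to measuring
}
```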
Problem 2: Movement
Unfortunately, I suspect movement will actually be the hardest - or perhaps impossible - part of the problem. The chain of problems goes like this:
1. Can injected code (a DLL) modify the position of bone nodes at runtime?
2. Even if we manage to do that, what about the other connected bones? After all, we want them to follow suit, ragdoll-style (inverse kinematics): if we pull the hand to a position, we want the arm to follow. Doing this manually is a mess and way too much work. So the question is: will the Oblivion engine do this automatically for us? (A sketch of what point 1 might look like follows after this list.)
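For point 1, the naive attempt would be something like the following sketch - placeholder names once more, and whether the engine accepts the write, or whether the animation system just stomps over it on the next frame, is exactly the open question:

```cpp
// Sketch for point 1 only: overwrite a bone's local translation and ask the
// scene graph to recompute world transforms. Point 2 (making the rest of the
// arm follow) is NOT solved here.
struct NiPoint3 { float x, y, z; };

struct NiAVObject {
    NiPoint3 localTranslate;
    virtual NiAVObject* GetObjectByName(const char* name) = 0;
    virtual void Update(float timeDelta) = 0;  // propagate to child nodes
};

bool NudgeBone(NiAVObject* actorRoot, const char* boneName, NiPoint3 target) {
    NiAVObject* bone = actorRoot->GetObjectByName(boneName);
    if (!bone) return false;
    bone->localTranslate = target;  // point 1: will the engine accept this,
    bone->Update(0.0f);             // or stomp over it on the next anim tick?
    return true;
}
```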
Problem 3: Control
Even given 1. and 2., we of course need a way to send instructions about how we want stuff to move - a way to send those movement commands. Doing it in the LPK framework itself is not only way too much work, as you pointed out - it's also a mess, because it would mean that animation descriptions would be scattered all over the place: some in the animation file, others in ini files, other aspects again somewhere else... bleh.
A cleaner way would be to embed this extra data into the animation (.kf) file itself as metadata, so that it's ignored by Oblivion but interpreted by the injected code. All that LAPF might need to do is tell the injected code that a given animation should be manipulated, along with the refs of the affected actors. So LAPF wouldn't have to handle the manipulation itself - all it would have to do is inform the injected code when the animations of two actors should be manipulated. The actual movement instructions, the injected code would then read from the .kf metadata.
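Since .kf files are just NIF files, they can carry extra-data blocks (NiStringExtraData, for example) that the engine doesn't act on. As a pure illustration of the idea - the "LPK:" tag format below is invented - the injected code could scan those strings for its own movement commands:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Invented metadata convention for illustration: any extra-data string in
// the .kf starting with "LPK:" is treated as a movement command, e.g.
//   "LPK: 1.25 hand=R_Hand target=L_Breast"
// meaning: at 1.25s into the anim, steer actorA's "R_Hand" bone toward
// actorB's "L_Breast" bone (bone names here are placeholders too).
struct MoveCommand {
    float       time;    // seconds into the animation
    std::string bone;    // bone on the acting actor
    std::string target;  // bone on the other actor
};

// 'extraStrings' stands in for the NiStringExtraData values pulled out of
// the .kf by whatever NIF-reading code the plugin uses.
std::vector<MoveCommand> ParseLpkMetadata(const std::vector<std::string>& extraStrings) {
    std::vector<MoveCommand> cmds;
    for (const std::string& s : extraStrings) {
        if (s.compare(0, 4, "LPK:") != 0) continue;  // not ours, skip
        MoveCommand c;
        char bone[64] = {0}, target[64] = {0};
        if (std::sscanf(s.c_str() + 4, " %f hand=%63s target=%63s",
                        &c.time, bone, target) == 3) {
            c.bone = bone;
            c.target = target;
            cmds.push_back(c);
        }
    }
    return cmds;
}
```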