Technically, a Voodoo 1/2 should work even without a 2D card, but naturally using such a setup might prove difficult.
From what I've heard about the mainstream models of the V2, it should be practically IMPOSSIBLE due to the complete absence of 2D-capable hardware on it. That is, assuming I'm not mixing this up with earlier 3dfx cards, which I KNOW were definitely daughterboard designs.
I doubt that, because it doesn't need to do that. Just start a 3D application and the Voodoo will switch the inputs automatically (via any API, like Direct3D or Glide, _but_ the application has to select the secondary device, of course, if a primary is present). I suspect it knows which card to target via the DD_GUID or Options values. FF8 uses a similar method, although I can't recall the exact registry value (but it does have a dedicated "use secondary 3D device" setting).
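Just to make that concrete, here's roughly how a DX5-era game can pick the secondary device (a C++ sketch from memory, _not_ FF7's or FF8's actual code, so treat the details as guesses):

    // Rough sketch: picking the secondary (3D-only) device under DX5.
    // Assumption: the Voodoo shows up from DirectDrawEnumerate as a device
    // with a non-NULL GUID; error handling is mostly left out.
    #include <windows.h>
    #include <ddraw.h>

    static GUID g_secondaryGuid;
    static BOOL g_haveSecondary = FALSE;

    // Called once per DirectDraw device; lpGUID == NULL is the primary display.
    static BOOL WINAPI EnumCallback(GUID FAR *lpGUID, LPSTR lpDesc,
                                    LPSTR lpName, LPVOID lpContext)
    {
        if (lpGUID != NULL)            // anything that isn't the primary device
        {
            g_secondaryGuid = *lpGUID; // e.g. the Voodoo's GUID -- what I suspect
            g_haveSecondary = TRUE;    // ends up stored as the DD_GUID value
        }
        return DDENUMRET_OK;           // keep enumerating
    }

    LPDIRECTDRAW CreateDDrawOnPreferredDevice(BOOL wantSecondary)
    {
        DirectDrawEnumerate(EnumCallback, NULL);

        LPDIRECTDRAW pDD = NULL;
        GUID *pGuid = (wantSecondary && g_haveSecondary) ? &g_secondaryGuid : NULL;
        // NULL GUID = primary display driver; a real GUID = the secondary card.
        if (FAILED(DirectDrawCreate(pGuid, &pDD, NULL)))
            return NULL;
        return pDD;                    // D3D is then queried off this object
    }

Passing a NULL GUID to DirectDrawCreate gets you the primary display driver; passing the Voodoo's GUID gets you the secondary-only device, which is basically the choice I think DD_GUID is storing.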
True, but did it work like that back when DirectX 5.0 was out? For all _I_ know, DX5 may have used more primitive, arcane initialization methods that didn't involve the DD_GUID (which, frankly, I never even knew existed... maybe I need to start poking around the registry more).
I suppose the whole idea behind my explanation is that the "driverpath" key must correspond to the "Display" entry in the config program, and that if you select "Primary Display Driver", the key is empty.
Oh, and I retract my statement about the V2 being the only card that gets a different "Display" entry in the config program; I'm pretty sure that if you have two 3D cards in your machine, you can also get a different "Display" entry (as you may or may not have already pointed out) when you select your second 3D device. Still, I believe the driverpath key is only used when you have a second entry.
Therefore, I have a new, revised theory:
Primary Display Driver selected -> "Driverpath" key is empty; the game attempts to use the default D3D5 DLLs, which by their own defaults would then attempt to interface with the primary card's 3D drivers, if any are present.
Secondary card selected -> "Driverpath" key lists the path to the 3D-accelerated drivers that card uses; the game then feeds that path to the D3D5 DLLs and tells them to use it, similar to how we "mask" the path to the default sgt files in FF8 using our .ini files.
Or: the game feeds that path to the DD_GUID registry key (possibly getting around your conundrum of "why not use DD_GUID?"), considering that, for all we know, "Driverpath" could hold a registry path rather than a logical drive path.
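To make the theory a little more concrete, here's roughly how I picture the game's startup check. This is a sketch of the theory only; the registry key location is a guess and "Driverpath" handling is assumed, not anything I've confirmed in the game's code:

    // Sketch of the revised theory -- the key path and the way "Driverpath"
    // gets used are assumptions, not verified behavior.
    #include <windows.h>
    #include <string>

    // Hypothetical helper: returns whatever is stored in "Driverpath",
    // or an empty string if the value is empty or missing.
    std::string ReadDriverPath()
    {
        HKEY hKey;
        char buf[MAX_PATH] = "";
        DWORD size = sizeof(buf);

        // Placeholder key path: wherever the config program writes its settings.
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                          "SOFTWARE\\Square Soft, Inc.\\Final Fantasy VII",
                          0, KEY_READ, &hKey) == ERROR_SUCCESS)
        {
            RegQueryValueExA(hKey, "Driverpath", NULL, NULL, (LPBYTE)buf, &size);
            RegCloseKey(hKey);
        }
        return std::string(buf);
    }

    void PickRenderer()
    {
        std::string driverPath = ReadDriverPath();
        if (driverPath.empty())
        {
            // Primary Display Driver selected: fall through to the default
            // D3D5 DLLs, which talk to whatever 3D driver the primary card has.
        }
        else
        {
            // Secondary card selected: hand this path (or the GUID/driver it
            // points at) to the D3D5 setup so the secondary device gets used.
        }
    }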
The passthrough cable is just what the name implies: it is connected from the 2D card to the Voodoo, and the regular monitor cable is connected to the Voodoo (so it's not a Y-cable; that would be a technical impossibility anyway due to impedance and other nice electrical thingies). So the signal passes through the Voodoo, hence the name for the cable. When 3D acceleration was turned on, the Voodoo just flipped a relay, cutting off the signal from the 2D card, and started sending its own signal. I can send you an image of such a cable, if you want.
An image of the cable? Hey, why not? I may need it for future reference if I ever, for some reason, need to install a V2 in something, LOL.

But seriously, I've been wondering what that thing looked like.

Still, quite decent detective work, Mister Holmes. 
Thank you for the compliment......*hopes that Jari wasn't being sarcastic for some reason there...*
Wait, I just thought of something. The FF7XP patch doesn't use Direct3D, because when I use it with FF7 on my Voodoo3 and run in hardware rendering, it runs in 32-bit color. The Voodoo3 can't run Direct3D in 32-bit; it has to be 16-bit for Direct3D to work.
Mofokubik, as a fellow Voodoo3 user, I must sadly inform you that the V3 is definitely incapable of handling 32-bit rendering. At all. Not for textures, Z-buffering, stencil buffering, or any other 32-bit rendering operation.
At best, you can get what some people call a "22bpp post-filtered output" from the card, but at the end of the day it's still considered 16-bit rendering.
If the fact that you can initialize D3D and OpenGL games even though your desktop's colors are set to 32-bit was leading you to the conclusion that the V3 was rendering at 32-bit, then you were being tricked by some safeguards built into DirectDraw/D3D initialization. As soon as you start the program, as far as I know, DirectDraw switches the resolution/color depth to what the application wants, and then hands control of the card over to D3D. Otherwise, you'd get an API that gives out "failure to initialize" messages because it tried to do something the card can't do.
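Here's roughly what I mean, in DX5-ish C++ (from memory, so take it with a grain of salt; the resolution numbers are just an example):

    // Sketch: why a 32-bit desktop doesn't mean 32-bit rendering. The
    // fullscreen app asks DirectDraw for the mode *it* wants; on a Voodoo3
    // that request would be 16bpp, and a 32bpp request would simply fail.
    #include <windows.h>
    #include <ddraw.h>

    bool EnterGameMode(LPDIRECTDRAW pDD, HWND hwnd)
    {
        // Take exclusive fullscreen control away from the 32-bit desktop...
        if (FAILED(pDD->SetCooperativeLevel(hwnd, DDSCL_EXCLUSIVE | DDSCL_FULLSCREEN)))
            return false;

        // ...then switch to the mode the game wants. 640x480 at 16bpp works
        // on a Voodoo3; asking for 32bpp here is where you'd get the
        // "failed to initialize" style errors instead of silent 32-bit rendering.
        if (FAILED(pDD->SetDisplayMode(640, 480, 16)))
            return false;

        return true; // D3D then renders into surfaces created at this depth
    }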
With OpenGL, I believe it tries to initialize the app by defaulting to rendering at the same resolution/color settings that the Windows desktop uses, and if it can't get that to work, it falls back to a software-rendered "SafeGL" mode, which is not a very good thing to use, as even the D3D software renderer is typically FASTER than how OpenGL tries to do things without hardware acceleration.
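And for the OpenGL side, here's a rough sketch of how you can tell whether you actually got hardware acceleration or Microsoft's generic software fallback (standard Win32 pixel-format calls; "SafeGL" is just my name for that fallback, not an official term):

    // Sketch: check whether the chosen pixel format is a hardware (ICD)
    // format or the generic software implementation.
    #include <windows.h>

    bool PixelFormatIsHardware(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        ZeroMemory(&pfd, sizeof(pfd));
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 16;           // match what the card can actually do

        int fmt = ChoosePixelFormat(hdc, &pfd);
        if (fmt == 0)
            return false;

        // Read back what we actually got; PFD_GENERIC_FORMAT without
        // PFD_GENERIC_ACCELERATED means the software renderer.
        DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
        bool generic = (pfd.dwFlags & PFD_GENERIC_FORMAT) &&
                       !(pfd.dwFlags & PFD_GENERIC_ACCELERATED);
        return !generic;
    }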