marco.swe wrote: - appropriate perspective of direct sound to reverb: I guess this has to do with "critical distance", right?
Yes... but do you actually understand what the Critical Distance is, or is it just a phrase you've heard?
- appropriate stereo image: I guess this has to do with the distance between the mics, the setup and the angles?
These -- along with the mic polar patterns -- all affect the stereo image. But you need to know what you want the image to sound like -- how much of the space between the speakers is filled by the sound sources -- before you can choose and optimise all those parameters.
- violin tonality: I know high frequencies tend to move upwards, so I thought the "vertical" direction would affect the tonality the most. You write that all directions affect tonality. Is there some sort of "general rule" regarding tonality?
Every instrument is slightly different, so the only 'general rule' is to move the mic around while listening carefully until you find the location that sounds best for your requirements. Up, down, in, out, left, right... moving the mic just a few inches can often make a significant difference! And beware the swaying violinist -- a performer who sways and rotates their body and instrument as they play. That can produce changes in the recorded tone which, on a sound-only recording, seem unexplained and are very distracting. This problem is obviously much worse with close miking!
Here's a rough guide to the way a violin (left) and cello (right) typically emit different frequencies in different directions:
This is derived from Jürgen Meyer's book, Acoustics and the Performance of Music
https://www.amazon.co.uk/Acoustics-Performance-Music-Acousticians-Architects/dp/0387095160 It's a fascinating read if you're interested in the science of acoustic instrumental recording, but it is quite deep...
Thanks for the 3:1 rule clarification. Is this just relevant to the spill problem, then?
Yes. It's a guide to help minimise spill -- unwanted sound from other nearby sources -- and thus maintain control of the mix. It's pointless trying to balance two mics if they both contain much the same sound, so the 3:1 rule is a way of ensuring that each mic captures a useful amount of its own wanted sound, and sufficiently little of the unwanted sound.
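To see why 3:1 is a sensible ratio, here's a minimal sketch of the level arithmetic, assuming simple free-field inverse-square spreading (real rooms add reflections, so treat the figure as a rough guide):

```python
import math

def level_drop_db(near_m: float, far_m: float) -> float:
    """Level difference (dB) of one source heard at two distances,
    assuming free-field inverse-square spreading (a simplification)."""
    return 20 * math.log10(far_m / near_m)

# Mic A sits 1 m from its own source; following the 3:1 rule, mic B is
# at least 3 m from that same source. The spill into mic B arrives
# roughly 9.5 dB down, which is usually enough to avoid audible
# comb-filtering problems when the two mics are mixed together.
print(round(level_drop_db(1.0, 3.0), 1))
```

The 1 m / 3 m figures are just illustrative; the rule is about the ratio, not the absolute distances.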
I thought two mics aiming at the same source, but placed at different distances from it, could also generate phase cancellation. Is this not relevant?
Yes, they can... although when we're talking about stereo arrays the spacing is relatively small and it rarely results in true phase cancellation. Instead we get what we call comb-filtering, which colours the sound a bit like when you talk with your hands cupped in front of your mouth.
And this comb-filtering effect only occurs when the outputs of the two mics are combined together -- so for a stereo array that typically means when summed to mono. That's why you have to be careful when choosing a spaced microphone array configuration.
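If you want to see where the comb-filter notches fall, here's a small sketch. It assumes two identical copies of a signal offset by a path difference and summed (the mono case described above); the notches land at odd multiples of c / (2 × path difference). The 50 cm spacing is a made-up example, not a recommendation:

```python
SPEED_OF_SOUND = 343.0  # m/s, at roughly 20 degrees C

def comb_notch_freqs(path_diff_m: float, n: int = 4) -> list[float]:
    """First n notch frequencies (Hz) when two copies of a signal,
    offset by path_diff_m metres, are summed together (e.g. to mono).
    Notches occur where the delayed copy arrives half a cycle late."""
    delay_s = path_diff_m / SPEED_OF_SOUND
    return [(2 * k + 1) / (2 * delay_s) for k in range(n)]

# Hypothetical spaced pair with a 0.5 m path difference, summed to mono:
# notches near 343 Hz, 1029 Hz, 1715 Hz, 2401 Hz...
print([round(f) for f in comb_notch_freqs(0.5)])
```

Notice the notches are evenly spaced in frequency, which is why the effect is so recognisable and why wider spacings push the first notch down into the more audible midrange.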
How do you practically apply the delay between the spot and the room mics in the DAW during mixing?
The mathematical way is to measure the distance from the source to the close mic and then to the distant mic, and subtract one from the other to give the path difference between the two. Sound travels around 1 foot per millisecond, or 1 metre in about 2.9ms, so you delay the close mic by the path difference times one of those values (depending on whether your measurements are imperial or metric).

The other way is to listen and dial up the delay until it sounds right... In practice I do both, calculating the nominal figure and then adjusting it by ear. When delaying close (accent) mics to blend with a main stereo pair I usually end up with a slightly longer delay (by a few milliseconds) than the maths would suggest.
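The calculation above can be sketched in a few lines. The distances and the spot/main mic names here are invented for illustration; the speed-of-sound figure assumes roughly room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C

def spot_mic_delay_ms(source_to_spot_m: float, source_to_main_m: float) -> float:
    """Nominal delay (ms) to apply to a close (spot) mic so its signal
    lines up with the more distant main pair: path difference divided
    by the speed of sound, converted to milliseconds."""
    return (source_to_main_m - source_to_spot_m) / SPEED_OF_SOUND * 1000.0

# Hypothetical setup: spot mic 0.5 m from the violin, main pair 4 m away.
print(round(spot_mic_delay_ms(0.5, 4.0), 1))  # nominal figure, about 10.2 ms
```

As noted above, that number is only a starting point: fine-tune by ear, and expect to settle a few milliseconds longer than the calculation suggests.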