R/E/P Community


Author Topic: dual mic recording phase issues?  (Read 16319 times)

maccool

  • Full Member
  • Posts: 234
Re: dual mic recording phase issues?
« Reply #30 on: October 19, 2006, 07:17:18 PM »

From a point source in an anechoic space a sound might be recorded with two non-coincident mics.  Those two recordings could be aligned in a DAW to be pretty much phase-coincident.  1/2-1 inch diameter transducers just a short distance in front of a 12-inch speaker cone don't even come close to that model; in that case the non-coincident mics would be recording two entirely different, complex, and phase-incoherent sound waves, waves that have their phase origins spread across that 12" cone, never mind the complex local reflections.  The only mechanism I can think of that is sufficiently complex and discriminating to sort out that mess of information is the human ear.

"Live sound will always be different."  Paul Frindle

J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #31 on: October 20, 2006, 12:12:10 AM »

jimmyjazz wrote on Thu, 19 October 2006 10:40

J.J. Blair wrote on Thu, 19 October 2006 13:09

Here's the problem with this logic: A DAW rendering of a sound file is a crude representation of signal amplitude and time.  You are not actually looking at a waveform as you would see it on an oscilloscope, which is showing you waveshape, frequency and amplitude.


What?  A 'scope would show you EXACTLY amplitude and time!  Only when you transform the signal into the frequency domain (via a Fourier transform) do you get amplitude and phase.


Wait.  Reread what I said.  Nothing you just said contradicts it.  I am saying that on an oscilloscope, you ARE seeing waveshape, frequency and amplitude.  I am saying that you do not see those things on a DAW.




Quote:

Quote:

There is nothing on a DAW to represent frequency.  This is the point you keep missing.  The only thing you are aligning is time.  There is NO indication of the phase of the frequencies.  You could visually time-align something and still be 180° out of phase.


I'm sorry, J.J., but I completely disagree.  If you take two sine waves of identical frequency and slip one such that the zero crossings coincide, then they're either in phase (0 degrees) or out of phase (180 degrees).  It's easy to determine which is which -- if both signals have the same "sign", they're in phase.  If one is the inverse of the other, they're out of phase.


OK, that's in a perfect world where you are only recording sine waves.  We are not recording sine waves.  We are recording complex waveforms that do not repeat themselves like a pure sine wave.  And again, a DAW does not render a waveform in a way that lets you properly see the phase.

Quote:

Quote:

Maybe if I have a spare hour over the weekend, I'll do a post with sound files and pics to illustrate the concept.


I think that's a great idea!  This is a topic that rears its ugly head quite often, so maybe we need some sounds and pictures to get some clarity.  I'll agree to capitulate if you're right, as long as you agree to post the information anyway if you're wrong.  Deal?


Sure, but realize again, I am not talking about generated sine waves.  I am talking about soundwaves with varying harmonics and noise, and nonlinear amplitudes.  
studio info

They say the heart of Rock & Roll is still beating, which is amazing if you consider all the blow it's done over the years.

"The Internet enables pompous blowhards to interact with other pompous blowhards in a big circle jerk of pomposity." - Bill Maher

"The negative aspects of this business, not only will continue to prevail, but will continue to accelerate in madness. Conditions aren't going to get better, because the economics of rock and roll are getting closer and closer to the economics of Big Business America." - Bill Graham

J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #32 on: October 20, 2006, 12:25:40 AM »

iCombs wrote on Thu, 19 October 2006 12:00

Not necessarily.  Here's something that is lost in that explanation... that 100 Hz portion of the sound HAS to have a longer period than the 1 kHz portion because it is a lower frequency, so the difference in phase vs. distance (and in this equation, since velocity remains constant, the change in distance = the change in time) over that longer period will equal fewer degrees of shift.


Yes.  That's exactly the point I was making!

Quote:

All phase is, by definition, the amplitude of a waveform at a given point in time, given in degrees.  So by definition, a waveform that shows amplitude and time shows phase.


Not exactly.  Where on a DAW waveform is the amplitude of that 100 Hz and the amplitude of that 1 kHz?  It's not there.  It doesn't distinguish between the two.  And as I said, your 100 Hz could be in positive phase while your 1 kHz is in negative phase, and the DAW would never indicate that.

Quote:

Also we're not talking about a distance difference of 9 inches... we're dealing in distances that are well within a centimeter... more likely less than an eighth of an inch, which also means that the difference in the sound the mics are capturing in terms of direct vs. reflected signal won't be significantly different.


Actually, that 1/8" could mean a significant difference in phase at higher frequencies, even though the lower frequencies will be unaffected.  Changing whatever time differential exists between the tracks might bring those higher frequencies into positive phase interference while putting the slightly lower ones into negative interference.
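The numbers behind this point are easy to check. A minimal sketch (assuming a round figure of ~1125 ft/s for the speed of sound; the exact value varies with temperature): a fixed path-length difference is a fixed time offset, so the phase shift it causes grows in direct proportion to frequency.

```python
# Phase shift, in degrees, produced at a given frequency by a fixed
# path-length difference between two mics.
SPEED_OF_SOUND_IN_PER_S = 1125.0 * 12.0  # ~speed of sound, inches per second

def phase_shift_deg(path_diff_inches, freq_hz):
    delay_s = path_diff_inches / SPEED_OF_SOUND_IN_PER_S
    return 360.0 * freq_hz * delay_s

# A 1/8" offset barely moves 100 Hz but shifts 10 kHz a hundred times more.
for f in (100, 1_000, 10_000):
    print(f"{f:>6} Hz: {phase_shift_deg(0.125, f):5.2f} deg")
# 100 Hz ~0.33 deg, 1 kHz ~3.33 deg, 10 kHz ~33.33 deg
```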


Quote:

Putting waves up on a scope can show COMBINED phase relationships, but as far as one track interacting with another track is concerned, assuming similar source material (i.e., the 2-mic guitar amp setup as described), visually aligning tracks by amplitude and time IS aligning them in phase.

Look at it like this:
http://www.kwantlen.ca/science/physics/faculty/mcoombes/P2421_Notes/Phasors/doublesine.gif

http://en.wikipedia.org/wiki/Phase_%28waves%29


Again, this only works if you are using simple waveforms without harmonics.  You have a single waveshape with a single frequency for both waveforms shown.  In the real world, with something like a guitar signal, if you lined up whatever frequency you have illustrated here in perfect phase, you just changed the phase relationship of every other frequency in that signal.

jimmyjazz

  • Hero Member
  • Posts: 1885
Re: dual mic recording phase issues?
« Reply #33 on: October 20, 2006, 12:31:43 AM »

J.J. Blair wrote on Fri, 20 October 2006 00:12


Wait.  Reread what I said.  Nothing you just said contradicts it.  I am saying that on an oscilloscope, you ARE seeing waveshape, frequency and amplitude.  I am saying that you do not see those things on a DAW.


And I'm disagreeing.  On both instruments, you're seeing amplitude versus time.  Or are you using a different 'scope than I'm using?


Quote:

OK, that's in a perfect world where you are only recording sine waves.  We are not recording sine waves.  We are recording complex waveforms that do not repeat themselves like a pure sine wave.  And again, a DAW does not render a waveform in a way that lets you properly see the phase.


Again, I disagree.  If the phase isn't there implicitly, then you're not sampling at a high enough rate.  Nyquist and Shannon and the other big boys tell us that a properly sampled signal preserves amplitude and phase PERFECTLY (for perfect converters working above twice the bandwidth).  What am I missing?


Quote:

Sure, but realize again, I am not talking about generated sine waves.  I am talking about soundwaves with varying harmonics and noise, and nonlinear amplitudes.


Oh, absolutely.  Just do something we'd typically see in a studio situation.  I will be very grateful, and I'm sure many others will, too.

iCombs

  • Hero Member
  • Posts: 537
Re: dual mic recording phase issues?
« Reply #34 on: October 20, 2006, 12:37:11 AM »

J.J. Blair wrote on Thu, 19 October 2006 23:25

iCombs wrote on Thu, 19 October 2006 12:00

Not necessarily.  Here's something that is lost in that explanation... that 100 Hz portion of the sound HAS to have a longer period than the 1 kHz portion because it is a lower frequency, so the difference in phase vs. distance (and in this equation, since velocity remains constant, the change in distance = the change in time) over that longer period will equal fewer degrees of shift.


Yes.  That's exactly the point I was making!



You aren't understanding what I'm saying.  Because of the difference in period, yes, they are going to catch different frequencies at different phases.  That is plain to see, if you've looked at even basic wave mechanics and periodic functions.  What I'm saying is this: the ratio of the differences in phase between frequencies will not change over distance (again, not APPRECIABLY, as per the example we are using)... so aligning phase for one frequency IS aligning phase for all frequencies.
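The core claim here (a pure time offset is the same time for every frequency, so one slip realigns everything) can be sanity-checked in a few lines. A sketch with numpy, using a synthetic multi-harmonic tone as a stand-in for the guitar signal and an integer-sample delay as the "farther mic" with no room sound:

```python
import numpy as np

fs = 8_000
t = np.arange(2_000) / fs
# Complex tone: fundamental plus two harmonics, crudely guitar-like
sig = (np.sin(2*np.pi*110*t)
       + 0.5*np.sin(2*np.pi*330*t)
       + 0.25*np.sin(2*np.pi*770*t))

delay = 37  # samples of pure time offset (direct sound only, no reflections)
delayed = np.concatenate([np.zeros(delay), sig])[:len(sig)]

# Cross-correlation finds the offset; slipping back by that amount
# realigns every frequency at once, because the delay is identical
# in time for all of them.
lag = np.argmax(np.correlate(delayed, sig, mode="full")) - (len(sig) - 1)
realigned = delayed[lag:]
print(lag)                                        # 37
print(bool(np.allclose(realigned, sig[:len(realigned)])))  # True
```

Add a second, different delay path (a reflection) and the match stops being exact; that is where the disagreement in this thread actually lives.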

I'll also beg to differ that you can't suss out frequency differences by looking at a waveform... lower frequencies have larger (i.e. longer-period) waveforms.  Whether they are simple sine waves or complex additive harmonic-laden waves, this is plain to see on a graph of amplitude vs. time.  You can see the high-frequency "click" of a hard beater striking a kick drum head in the waveform, and you can see the low frequencies of the drum bloom and subsequently decay.  Those are all different frequencies and they are all present on the graph of amplitude vs. time.

Quote:


Actually, that 1/8" could mean a significant difference in phase at higher frequencies, even though the lower frequencies will be unaffected.  Changing whatever time differential exists between the tracks might bring those higher frequencies into positive phase interference while putting the slightly lower ones into negative interference.


It would make a difference in relative phase...yes...but assuming you were using a matched pair of mics, you'd be unable to distinguish much of a difference from each mic in solo, which is the point I was making.
Ian Combs
Producer/Engineer
Lightspeed Group, Inc.
----------------------
"Mista apareeatah... can I have maar beass at all frequencies?"

jimmyjazz

  • Hero Member
  • Posts: 1885
Re: dual mic recording phase issues?
« Reply #35 on: October 20, 2006, 12:46:12 AM »

J.J. Blair wrote on Fri, 20 October 2006 00:25

Again, this only works if you are using simple waveforms without harmonics.  You have a single waveshape with a single frequency for both waveforms shown.  In the real world, with something like a guitar signal, if you lined up whatever frequency you have illustrated here in perfect phase, you just changed the phase relationship of every other frequency in that signal.


Let's imagine a highly transient musical signal -- a rimshot.  We'll approximate it as an "N" wave.  Imagine we mic the snare up close (2") and back a ways (24").  That's a separation of 22", which corresponds to a time delay of ~ 1.6 milliseconds.  In other words, that "N" wave is going to hit the 2nd mic 1.6 milliseconds after it hits the first mic.  BUT (big BUT) the two signals are going to look substantially the same, even though the signal from the 2nd mic might be inverted (initial peak is negative instead of positive) and/or attenuated.  Still, it's trivially easy to flip its phase/polarity and slide it in time to where the peaks and zero crossings line up, right?
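The distance-to-delay arithmetic above checks out. A one-line sketch (assuming the usual round figure of ~1125 ft/s for the speed of sound):

```python
# Time-of-arrival difference between the 2" and 24" snare mics.
SPEED_OF_SOUND_FT_S = 1125.0  # roughly, at room temperature

def delay_ms(separation_inches):
    return separation_inches / 12.0 / SPEED_OF_SOUND_FT_S * 1000.0

print(round(delay_ms(24 - 2), 2))  # 1.63, i.e. the ~1.6 ms quoted above
```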

So where's the beef?

Sure, there are all kinds of frequencies represented by that signal, but it's the SIGNAL that matters.  The waveform.  If the signal from the 2nd mic isn't substantially the same as that from the 1st mic, just delayed in time and perhaps inverted, then other factors are coming into play besides this "nebulous" (heh) quantity called "phase".  Different mic, different local acoustic, different "view factor" to the source, etc., and yes, those things all matter, and are likely the difference between theory and practice.  (I'm not denying the difficulty of doing this in practice.  I just want to know the reasons why it is difficult, and "you can't do that" just doesn't cut it with me.)

J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #36 on: October 20, 2006, 12:47:36 AM »

iCombs wrote on Thu, 19 October 2006 21:37


I'll also beg to differ that you can't suss out frequency differences by looking at a waveform... lower frequencies have larger (i.e. longer-period) waveforms.  Whether they are simple sine waves or complex additive harmonic-laden waves, this is plain to see on a graph of amplitude vs. time.  You can see the high-frequency "click" of a hard beater striking a kick drum head in the waveform, and you can see the low frequencies of the drum bloom and subsequently decay.  Those are all different frequencies and they are all present on the graph of amplitude vs. time.


OK, just because you record a bass and the waveshape seems to represent wider waves than when you record a violin or something, it's still not showing you the phase of any of the frequencies.  Where are all the other frequencies?  Where are the harmonics?  They're not there.

jimmyjazz

  • Hero Member
  • Posts: 1885
Re: dual mic recording phase issues?
« Reply #37 on: October 20, 2006, 12:51:04 AM »

J.J. Blair wrote on Fri, 20 October 2006 00:47

OK, just because you record a bass and the waveshape seems to represent wider waves than when you record a violin or something, it's still not showing you the phase of any of the frequencies.  Where are all the other frequencies?  Where are the harmonics?  They're not there.


Of course they are.  If it doesn't look like a sine wave, then it's laden with harmonics.  In a band-limited signal, they're usually apparent.  (At least the higher-amplitude, lower-frequency harmonics are.  They look like ripple on the fundamental "carrier frequency".  Only infinite-bandwidth systems can crank out triangle waves and sawtooth waves and square waves, where it's impossible to discern any harmonic contribution without understanding what it is that makes them different from pure sine waves.)
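The "ripple on the fundamental" idea is easy to demonstrate synthetically. A sketch (the 0.2 amplitude and 5th harmonic are arbitrary illustrative choices): the harmonic is visible as deviation from the pure sine in the time domain, and sits in its own bin in the frequency domain.

```python
import numpy as np

N = 1000
t = np.arange(N) / N                                # one analysis window
fundamental = np.sin(2 * np.pi * 2 * t)             # 2 cycles: the "carrier"
signal = fundamental + 0.2 * np.sin(2 * np.pi * 10 * t)  # add a 5th harmonic

# Time domain: the harmonic rides on the fundamental as visible ripple.
print(round(float(np.max(np.abs(signal - fundamental))), 2))   # 0.2

# Frequency domain: the two components occupy separate FFT bins.
spectrum = np.abs(np.fft.rfft(signal))
print(sorted(int(i) for i in np.argsort(spectrum)[-2:]))       # [2, 10]
```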

J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #38 on: October 20, 2006, 12:53:25 AM »

jimmyjazz wrote on Thu, 19 October 2006 21:31

J.J. Blair wrote on Fri, 20 October 2006 00:12


Wait.  Reread what I said.  Nothing you just said contradicts it.  I am saying that on an oscilloscope, you ARE seeing waveshape, frequency and amplitude.  I am saying that you do not see those things on a DAW.


And I'm disagreeing.  On both instruments, you're seeing amplitude versus time.  Or are you using a different 'scope than I'm using?


I still fail to see how we're not saying the same thing.  Frequency is the same as time.  You said time and amplitude.  I said frequency and amplitude, and waveshape, which I'm sure you would agree on.  Right?

Quote:

OK, that's in a perfect world where you are only recording sine waves.  We are not recording sine waves.  We are recording complex waveforms that do not repeat themselves like a pure sine wave.  And again, a DAW does not render a waveform in a way that lets you properly see the phase.


Again, I disagree.  If the phase isn't there implicitly, then you're not sampling at a high enough rate.  Nyquist and Shannon and the other big boys tell us that a properly sampled signal preserves amplitude and phase PERFECTLY (for perfect converters working above twice the bandwidth).  What am I missing?

I thought you said that you don't use a DAW.  I'm talking about the VISUAL rendering of the waveform, not the way it's recorded.  I'm saying that the visual representation in the edit window does not tell you enough information to be able to judge the phase of any given signal.


J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #39 on: October 20, 2006, 12:58:47 AM »

jimmyjazz wrote on Thu, 19 October 2006 21:51

J.J. Blair wrote on Fri, 20 October 2006 00:47

OK, just because you record a bass and the waveshape seems to represent wider waves than when you record a violin or something, it's still not showing you the phase of any of the frequencies.  Where are all the other frequencies?  Where are the harmonics?  They're not there.


Of course they are.  If it doesn't look like a sine wave, then it's laden with harmonics.  In a band-limited signal, they're usually apparent.  (At least the higher-amplitude, lower-frequency harmonics are.  They look like ripple on the fundamental "carrier frequency".  Only infinite-bandwidth systems can crank out triangle waves and sawtooth waves and square waves, where it's impossible to discern any harmonic contribution without understanding what it is that makes them different from pure sine waves.)



Jimmy, you keep disagreeing about things we agree on.  You keep missing my point that a DAW's visual rendering of a waveform is not showing you what an oscilloscope would show you.  I'm talking about the information that a DAW, or in my case, ProTools, is capable of showing you about a waveform.

jimmyjazz

  • Hero Member
  • Posts: 1885
Re: dual mic recording phase issues?
« Reply #40 on: October 20, 2006, 01:01:53 AM »

J.J. Blair wrote on Fri, 20 October 2006 00:53


I thought you said that you don't use a DAW.  I'm talking about the VISUAL rendering of the waveform, not the way it's recorded.  I'm saying that the visual representation in the edit window does not tell you enough information to be able to judge the phase of any given signal.



OK, maybe I haven't understood your point.  Are you saying PT (or whatever your DAW flavor is) doesn't show you a detailed amplitude versus time chart of a given track?  You can't zoom in and see all of the complexities of the signal?

Andy Peters

  • Hero Member
  • Posts: 1124
Re: dual mic recording phase issues?
« Reply #41 on: October 20, 2006, 03:34:16 AM »

J.J. Blair wrote on Thu, 19 October 2006 21:58

Jimmy, you keep disagreeing about things we agree on.  You keep missing my point that a DAW's visual rendering of a waveform is not showing you what an oscilloscope would show you.  I'm talking about the information that a DAW, or in my case, ProTools, is capable of showing you about a waveform.


JJ, I'm with Jimmy here, but I think I see where you're coming from.  This is a bit long, and it'll either clear things up or stir the mud some more ...

A DAW's rendering of the waveform should show the same thing one would see on an oscilloscope.   Zoom in on the DAW so you can see individual samples and it's obvious.

(Aside:  I have a .WAV file of the closest we can get to the Kronecker delta function.  The delta function is infinite at time zero and zero at every other time.  Since we can't have an infinite pulse height, the sample at time zero is 0x7FFFFF, the most-positive value we can represent with 24 bits.  If I view this file in CoolEdit, and zoom in on the samples around time zero, you see what you expect: the sample at full scale, followed by the samples at zero.  Furthermore, you see what you expect if you're familiar with sampling theory: rather than straight lines doing connect-the-dots, the samples are connected with a sinc (that is, sin(x)/x) waveform.  That's right: CoolEdit gets it right.  And if I wasn't lazy, I'd see how ProTools, SAWPro and GarageBand display this file.

And if you play this waveform and look at the result on the 'scope, you see a sinc response.  Life is good when theory and practice agree.)
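The sinc-interpolation aside can be reproduced in a few lines. A sketch of ideal band-limited reconstruction (a short 17-sample window stands in for a real file; an actual editor like CoolEdit would use many more taps):

```python
import numpy as np

# A digital impulse: one full-scale sample in a field of zeros.
n = np.arange(-8, 9)                 # sample indices around "time zero"
x = (n == 0).astype(float)

# Ideal reconstruction: every sample contributes a shifted sinc
# (np.sinc is the normalized sin(pi t)/(pi t)).
def reconstruct(t):
    return float(np.sum(x * np.sinc(t - n)))

print(round(reconstruct(0.0), 3))    # 1.0   - the full-scale sample
print(round(reconstruct(1.0), 3))    # 0.0   - zero at every other sample
print(round(reconstruct(0.5), 3))    # 0.637 - the sinc curve between samples
```

Between the sample instants the curve is nonzero, which is exactly the "connect the dots with a sinc, not straight lines" behaviour described above.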

There's another rub, though: as you zoom out on the DAW, you get to a point where you have more samples than pixels in the waveform window.  The DAW authors have to decide how to handle this: simply drop samples, or average samples, or show only the peak sample within the interval to display, or show the envelope, or whatever.   So the crux of this biscuit is that when you zoom out, you can't display the waveform exactly.   And an oscilloscope, whether digital or analog, has to handle the same situation.  If the DAW doesn't handle this scenario in the same way that your 'scope does, then of course there will be a difference in the display!

Try playing with a 'scope, varying the horizontal scale (time per division) to match the DAW's time display.  See if they're similar or different as you zoom out.

Basically, if you zoom in enough so that you can see the actual samples, you can time-align your tracks.  It's real easy to see a snare hit or click and line things up to the peak.

It's worth noting that neither the DAW nor the 'scope show phase information, because without a reference, phase information is meaningless.  What's useful is when you compare two signals to get the relative phase between them, which Jimmy mentioned.

Now, a thought about the original topic, which is: can one mic (say, one that's close up to an amp's speaker) be delayed to match the time-of-arrival at a room mic, and will they be time-aligned?  The answer is yes, of course, and in fact that's exactly what we do when we're using a transfer-function measurement tool (like Smaart) to measure the response of a sound system.  You have the reference signal, a direct feed of the test signal, and you have the measurement signal from the mic in the room, and you do an impulse response to learn the delay between the two.  You then insert that delay on the reference input, and the result is that the two signals line up in time.
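A toy version of that transfer-function measurement, as a sketch: deconvolve the mic signal by the reference and read the delay off the impulse response. (Real tools like Smaart add averaging and coherence weighting; the 57-sample delay here is an arbitrary illustrative value.)

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024
reference = rng.standard_normal(N)      # direct feed of the test signal
delay = 57                              # arrival offset we want to discover
measured = np.roll(reference, delay)    # "room mic": same signal, later
                                        # (np.roll wraps; fine for a toy)

# Divide the spectra to get the transfer function, invert it to an
# impulse response; for a pure delay that's a single spike at the delay.
H = np.fft.rfft(measured) / np.fft.rfft(reference)
h = np.fft.irfft(H, n=N)
print(int(np.argmax(np.abs(h))))        # 57
```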

So, yeah, you could measure the distance between the two mics, insert a delay on the close mic, and they'd be time-aligned.  But (and this is the preaching-to-the-choir part of our show) the two mics will still not sound the same, and the reason is obvious and doesn't need advanced math to understand: the two mics aren't picking up the same signal source!    It should be obvious that the close mic picks up pretty much only the direct signal (high S/N) and the distant mic picks up the direct signal plus room reflections.  When you delay the close-up signal, you just move that signal back to line up with the strongest signal at the distant mic (which should be the direct signal), but you haven't added any of the multipath that gives the distant mic its character (and it's this multipath that is the cause of comb filtering). Yeah, you've reinforced the direct signal, which may or may not be what you want.
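The comb-filtering mechanism mentioned there is worth making concrete. A sketch with one direct path plus a single reflection arriving 1 ms late (the delay is an arbitrary illustrative value): the sum boosts some frequencies and notches others.

```python
import numpy as np

fs = 48_000
d = 48                                   # 1 ms reflection, in samples
# Magnitude response of y[n] = x[n] + x[n - d] is |1 + e^{-jwd}|.
for f in (250.0, 500.0, 1000.0, 1500.0):
    w = 2 * np.pi * f / fs
    gain = abs(1 + np.exp(-1j * w * d))
    print(f"{f:6.0f} Hz: gain {gain:.3f}")
# Notches (gain 0) at 500 Hz, 1500 Hz, ... (odd multiples of 1/(2 x 1 ms));
# peaks (gain 2) at 0 Hz, 1 kHz, 2 kHz, ... hence the "comb".
```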

If you were in an anechoic chamber, where the reflections are nil, then delaying the close mic doesn't buy you anything.  But there's no point in putting a room mic in an anechoic chamber ...

-a
"On the Internet, nobody can hear you mix a band."

ssltech

  • Hero Member
  • Posts: 4780
Re: dual mic recording phase issues?
« Reply #42 on: October 20, 2006, 10:29:32 AM »

Well, I've been staying out of this for a while, because it's been difficult coming up with an approach which corrects some of the inaccuracies and misconceptions here without coming across as nasty...

Please take the following as intended solely as constructive or helpful; if I fail, please don't take offence, there's none intended.

What's been gnawing at me while reading through this is the amount of misunderstanding or incomplete understanding as soon as the word "phase" is introduced.

Also, there are a few instances during the course of some explanations thus far, where some part-truths have been presented as absolute fact, without any reference to the limits of their accuracy...

Firstly, I don't recall there being an absolute description of this being a SINGLE POINT SOUND SOURCE.  MOST of what I'm going to address is specific to a single point source, such as a small single-speaker amplifier (not a perfect example, but decent for the purpose of illustration).

Now, most of us appear to have assumed (from the responses already posted) that we're dealing with a single source (speaker) and multiple (two) microphones, one closer than the other. Given this situation, with one mic closer, the sound spreading out from a single point will arrive at the nearer microphone first, then at the second one later.

JJ Blair quite excellently gave an example of different frequencies (which have specifically different wavelengths in air at any given air pressure/temperature). I didn't check the math in the example, but if you want to round things off for easy calculation, let's say that the speed of sound is 1,000 feet/second, so sound would travel 1 foot in 1 ms, and a 1 kHz signal would have a 1-foot wavelength...

However, I think that JJ Blair might have missed the fact that the phase cancellations between two mics (ignoring acoustic/room ambience additions etc., i.e. looking specifically at the difference in the DIRECT signal pickup) ARE in fact caused EXCLUSIVELY by the difference in arrival times. Certainly, if you time-slide them, they will not cancel in any way satisfactorily, but this is due to differences in the acoustic characters at the mics' locations... these are discrete and specifically different from direct-signal phase interferences. (I say 'interferences' because we're not, of course, dealing with just cancellations; some frequencies will be reinforced, others diminished.)

As to the waveform in a DAW containing no phase information, it depends on where you look. Constant waveforms can be in-phase, out-of-phase, or anywhere in between, and those which gradually fade in and out are indeed difficult or, let's face it, impossible to align correctly. You shift the frequency and the peaks and troughs which you aligned are no longer in alignment. JJ is spot on with this. However, it's too much of a stretch to say that it contains NO information; there are times when it definitely does... percussive instances, impulses etc. most definitely do. If you look at the start of a plucked guitar note, you CAN slide them around and make things line up. Once you do that, you'll even find that a sine wave oscillator plugged into the amp and swept up and down WILL in fact remain in phase at basically all frequencies, once the timing is aligned. It's an easy test to do if you have a sine wave oscillator or some other way of generating sine waves at a selection of different frequencies.

Oh yeah, someone also mentioned time/amplitude possibly being the same as frequency/amplitude... I'd disagree... a time/amplitude representation is what a DAW displays to show you a waveform; a frequency/amplitude representation would be something like a Fourier transform, or the display on a spectrum analyser... definitely not the same thing in my terminology, though I'd have to re-read to see if both people did indeed mean the same thing...

I'd say that JJ Blair actually had the technical part of the explanation reversed when he said that shifting the phase corrects the problem better than shifting the delay or slipping the track... Of course "better" can mean 'sounds better' or 'works better in the track' and I can't take issue with that, but if the inference was that it is the most accurate way of aligning phase error at all frequencies, then I have to differ... the sine wave oscillator demonstration described above should illustrate the differences in a direct comparison... I should state that when I say "comparison", I'm thinking of comparing it to a "phase shift without overall delay" solution, such as the Little Labs IBP box.
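The "phase shift without overall delay" idea can be sketched with a first-order all-pass filter. (This is a digital analogue of the behaviour, not a claim about the IBP's actual circuit: the level is untouched at every frequency while the phase varies across the band, which is exactly what distinguishes this family of tools from a plain track slip.)

```python
import numpy as np

# First-order all-pass: H(z) = (a + z^-1) / (1 + a z^-1), |a| < 1.
# Magnitude is exactly 1 everywhere; phase varies smoothly with frequency.
a = 0.5
w = np.linspace(0.1, 3.0, 5)             # frequencies, radians/sample
z_inv = np.exp(-1j * w)
H = (a + z_inv) / (1 + a * z_inv)

print(np.round(np.abs(H), 6))            # all 1.0: no level change anywhere
print(bool(np.ptp(np.angle(H)) > 1.0))   # True: phase spread across the band
```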

Now, of course I'm not denying the usefulness of IBP boxes and their like: nor am I suggesting that you should ALWAYS use the track-slip/delay approach. IBPs and other approaches, such as having a headphoned assistant shift the mic around until the resulting combination sounds pleasing, do of course produce different results from track-slip, and I'll never deny that sometimes THAT approach works better in the track, no matter what technical accuracy says.

To the comment made early on regarding track slipping, where it was suggested that an engineer should use his ears instead of his eyes: I think that's ridiculous. Just because the ears should be the FINAL judge doesn't mean that you have to fumble blindly until you stumble upon the right track-slip amount. However, equally true is that just because it LOOKS like it should sound good is never proof of good sound. I suppose that's an example of how something with a lot of truth at its core can be sort of 'wrong', in that a difference in the way it's expressed suddenly changes the inference to the point that I disagree entirely.

One small semantic (but important) matter: I think Jimmyjazz said that if two waveforms line up at their zero-crossing points then they're either in-phase or 180° out of phase...
MDM (maxdimario) wrote on Fri, 16 November 2007 21:36

I have the feeling that I have more experience in my little finger than you do in your whole body about audio electronics..

jimmyjazz

  • Hero Member
  • Posts: 1885
Re: dual mic recording phase issues?
« Reply #43 on: October 20, 2006, 10:46:13 AM »

Andy & Keith, thanks for chiming in.  I think you've both helped clarify things immensely.

Keith, I'm glad you brought up the Little Labs box.  I've never used one, and I'm curious, since you mentioned that it isn't a delay . . . what is it?  How does it work?

J.J. Blair

  • Hero Member
  • Posts: 12809
Re: dual mic recording phase issues?
« Reply #44 on: October 20, 2006, 11:51:56 AM »

Quote:

To the comment made early on regarding track slipping, where it was suggested that an engineer should use his ears instead of his eyes: I think that's ridiculous. Just because the ears should be the FINAL judge doesn't mean that you have to fumble blindly until you stumble upon the right track-slip amount. However, equally true is that just because it LOOKS like it should sound good is never proof of good sound. I suppose that's an example of how something with a lot of truth at its core can be sort of 'wrong', in that a difference in the way it's expressed suddenly changes the inference to the point that I disagree entirely.


Keith, I was not suggesting that you never use your eyes.  It was more a statement regarding the attitude of fixing audio things visually.

But it is my experience that a DAW's visual rendering of a waveform really only tells you transient response (somewhat) accurately.  My experience also differs from what Andy is saying.  I have recorded signals in PT that do not at all resemble what I see on my oscilloscope.  I noticed this when experimenting with the effect of compression on a square-wave signal.  I have not found PT to be an accurate indication of compression or rarefaction, or even waveshape, which has been the underlying basis of my whole argument.