So far I've been looking at signal strength, but there's another fundamental quantity that an interferometer measures: phase. Amplitude tells you how bright (loud) something is, but phase tells you the direction it's coming from. How well does the prototype detector measure phase?
Consider a pair of microphones and an animal making a noise at a distance. Unless the animal lies in a direction perpendicular to the line connecting the microphones (i.e. the baseline), the sound waves will arrive at the mics at different times. This difference is measured as a phase, where one complete rotation (360 degrees) corresponds to a path difference of one wavelength. By measuring phase, you measure the time delay and hence the direction angle with respect to the baseline.
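The geometry here is simple enough to sketch in a few lines of code. This assumes a 343 m/s speed of sound and the prototype's 15 mm baseline; the function names are just mine for illustration:

```python
import numpy as np

C = 343.0   # approximate speed of sound in air, m/s
D = 0.015   # baseline length, m (15 mm, as for the prototype)

def delay_from_angle(theta_deg, d=D, c=C):
    """Arrival-time difference between the two mics for a source at
    angle theta (degrees) from the perpendicular to the baseline."""
    return d * np.sin(np.radians(theta_deg)) / c

def phase_from_delay(tau, f):
    """Interferometer phase (degrees) at frequency f for delay tau."""
    return 360.0 * f * tau

tau = delay_from_angle(30.0)         # source 30 deg off the perpendicular
print(phase_from_delay(tau, 50e3))   # phase in degrees at 50 kHz
```

Note that for this 30-degree example the phase at 50 kHz already exceeds 360 degrees, which hints at the ambiguity problem below.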
There is a problem, however. Say, for example, that you measure a phase of zero. This corresponds to a signal perpendicular to the baseline, but you will also get zero phase at a full rotation, twice that, and so on. If the microphones are placed less than a wavelength apart, this problem goes away (because the maximum delay can never exceed a complete phase rotation), but at ultrasonic frequencies that spacing is very difficult to achieve. The wavelength of a 50 kHz signal from a bat is only 6.9 mm and our microphones are 15 mm apart! Besides, the shorter the baseline, the less accurate the angle measurement, so you don't necessarily want them that close anyway.
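The ambiguity can be made concrete by enumerating every delay consistent with a wrapped phase measurement, keeping only those within the physically possible range of plus or minus the sound travel time along the baseline. A sketch, with the same assumed constants as before:

```python
import numpy as np

C, D = 343.0, 0.015
F = 50e3    # 50 kHz: wavelength ~6.9 mm, shorter than the 15 mm baseline

def candidate_delays(phase_deg, f, d=D, c=C):
    """All time delays consistent with a wrapped phase measurement,
    limited to the physically possible range +/- d/c."""
    tau_max = d / c                      # max travel time along the baseline
    base = (phase_deg / 360.0) / f       # delay implied by the raw phase
    n_max = int(np.ceil(tau_max * f)) + 1
    taus = [base + n / f for n in range(-n_max, n_max + 1)]
    return [t for t in taus if abs(t) <= tau_max]

print(candidate_delays(0.0, F))   # zero phase: five distinct delays fit
```

With the 15 mm baseline at 50 kHz, a single zero-phase measurement is consistent with five different delays, hence five different directions.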
Fortunately though, the interferometer phase for an off-axis signal changes as a function of sound frequency, and no animal 'broadcasts' on a single frequency, so looking at phase vs frequency helps. Also, to get a complete direction solution (rather than just an angle with respect to a baseline) we need multiple baselines with different orientations and lengths. If we know that the signal is coming from a single location, we can use the multiple baseline solutions to find the one that's consistent between all of them. A 4-microphone array produces six baselines, which should be plenty.
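One way to see why phase vs frequency helps: the slope of the phase across the band is 360&times;&tau; degrees per Hz, and the slope doesn't suffer the wrap ambiguity once the phase is unwrapped across the band. A sketch with simulated broadband data (the constants and function name are assumptions, as before):

```python
import numpy as np

C, D = 343.0, 0.015

def delay_from_phase_slope(freqs, phases_deg):
    """Estimate the delay from phase vs frequency: d(phase)/d(f) = 360*tau.
    Unwrapping across the band removes the 360-degree wrap ambiguity,
    provided the phase step between frequency bins is under half a cycle."""
    unwrapped = np.degrees(np.unwrap(np.radians(phases_deg)))
    slope = np.polyfit(freqs, unwrapped, 1)[0]   # degrees per Hz
    return slope / 360.0

# simulate a broadband call arriving 30 deg off the perpendicular
tau_true = D * np.sin(np.radians(30.0)) / C
freqs = np.linspace(20e3, 60e3, 41)
phases = (360.0 * freqs * tau_true) % 360.0      # wrapped measurements
print(delay_from_phase_slope(freqs, phases) * 1e6)   # ~21.9 microseconds
```

This recovers the true delay from wrapped phases alone, which is essentially what the delay panels in the plots below are doing by eye.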
Anyway, let's not get ahead of ourselves. The first thing to do is look at how well the prototype can measure phase.
If the microphones were perfect, the interferometer would show zero phase at all frequencies for a signal located perpendicular to the baseline. However, there's no such thing as a perfect microphone, and there are probably significant systematic effects producing signal delays that need to be measured so they can be calibrated out.
So for a start, let's look at the detector response to an artificial signal that covers as much of the frequency range as possible and is located perpendicular to the baseline. How much does measured phase change with frequency?
To do this, I generated a sound file and played it to the detector, along with a 40 kHz signal from an ultrasonic range-finder, in an otherwise quiet room. The speaker was about 3 m away from the microphones, which are 15 mm apart. I also recorded with the baseline at approximately 45 degrees to the sound direction, and with no sound playing to give background levels.
So here's a graph showing data for the perpendicular case over 0.03 s, covering a pulse-on period of the ultrasonic range-finder.
The top panel shows correlated amplitude for the signal-on and quiet cases. There's decent signal below about 20 kHz and also around 40 kHz. The Phase panel shows the corresponding phase for the signal-on case. There's a clear, roughly linear drift with frequency, which is probably a combination of instrumental effects and the baseline not being perfectly perpendicular to the sound direction. The phase can be converted to a delay, and that's what is shown in the third panel. Also shown here are the other possible delays that the phase corresponds to, i.e. any delay within the sound travel time between the microphones. There's clearly a shift of one cycle of phase at 41 kHz, and this needs to be taken into account. The delay looks reasonably constant across the frequency range, though it does tend to increase a little with frequency, presumably an instrumental effect. Lastly, the bottom panel shows the possible directions to the signal (where zero is perpendicular to the baseline). The only direction consistent across the band is at -10 deg, which is encouraging. The systematics probably aren't too bad.
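The delay-to-direction step in the bottom panel is just the inversion of the baseline geometry, &tau; = (d/c) sin &theta;. A minimal sketch, again assuming the 15 mm baseline and 343 m/s sound speed:

```python
import numpy as np

C, D = 343.0, 0.015

def directions_from_delays(taus, d=D, c=C):
    """Map each candidate delay to a source angle (degrees) relative to
    the perpendicular to the baseline, by inverting tau = (d/c)*sin(theta).
    Clipping guards against |tau| marginally exceeding d/c due to noise."""
    s = np.clip(np.asarray(taus) * c / d, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# e.g. a few candidate delays from one wrapped-phase measurement
print(directions_from_delays([-20e-6, 0.0, 20e-6]))
```

Each candidate delay maps to one candidate direction, which is why the bottom panel shows a family of direction tracks rather than a single line.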
I next took the direction measurements as a function of frequency, smoothed the high SNR data (red line) and used this as my model to calibrate the phase. This next plot shows what happens to phase when this solution is applied to some perpendicular data taken a little later on. It looks OK and shows that the phase can be calibrated pretty well.
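The calibration idea can be sketched as fitting a smooth model to the high-SNR phase data and subtracting it from later measurements. This is only my guess at the shape of the procedure: the SNR threshold, the polynomial smoothing, and the function names are all assumptions, not the actual smoothing behind the red line in the plot:

```python
import numpy as np

def build_phase_calibration(freqs, phases_deg, snr, snr_min=10.0, npoly=3):
    """Fit a smooth polynomial to the high-SNR phase measurements and
    use it as the instrumental phase model (hypothetical threshold/order)."""
    good = snr > snr_min
    return np.poly1d(np.polyfit(freqs[good], phases_deg[good], npoly))

def apply_calibration(model, freqs, phases_deg):
    """Subtract the instrumental model and re-wrap to +/-180 degrees."""
    resid = np.asarray(phases_deg) - model(np.asarray(freqs))
    return (resid + 180.0) % 360.0 - 180.0
```

Applied to later data from the same geometry, the residual phase should sit near zero if the instrument is stable, which is what the plot suggests.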
45 degree data
The next thing to do is look at the data taken with the baseline at about 45 deg to the signals, apply the calibration from the perpendicular data and see how it works. Here's a plot of the 45 deg data:
The Delay and Direction panels show a change at around 18 kHz, where the delay increases from about 10 to 25 &micro;s and the direction goes from about -20 to -40 degrees. Note that a shift of one cycle in phase gives a reasonable-looking solution in the 40 kHz data at zero delay and direction, but the lower-frequency solutions favour the ~-40 deg solution. This change in the solution with frequency is a worry, though. It may be a temporal instability, or it could be that the systematic error response of the interferometer is angle-dependent.
Anyway, at the ultrasonic end it looks promising. I'm getting the sort of solution I expected.
Some more testing is required here and I still need to look at how stable things are with time. Can the interferometer be calibrated and then taken into the field and still produce reliable results?
Next: real data
In the next edition, I'll start to look at the bird and bat data and see how directions can be determined for them...