
How do you determine axis tilt at crossover??



Robh3606
09-03-2003, 09:47 AM
I was looking at the "Improvements in Monitor Systems" article and was wondering how you determine how much tilt you have in the acoustic axis at crossover. They include a formula for finding where the nulls are in a non-coincident vertical array, so that's easy to determine through the crossover range depending on the slopes. In the 4430/4435 they set the drivers and crossover up to tilt the axis up. But what it doesn't tell you is how to determine how much tilt there is when the drivers are not aligned. I would think the amount would be connected somehow with the ratio of wavelength to offset, but I can't seem to put it together. Obviously the crossover has a role too. Anyone have any ideas??

My impression is that as you change the distance the axis will begin to tilt, reach a maximum, go back to the norm, and then repeat the process for each multiple of a wavelength of offset. So you get the same repeating wobble, plus an increased time delay, with each increase in offset. Does this make any sense, or am I all wet on this???
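Here is a rough sketch of the geometry I have in mind, assuming the two drivers act as point sources on a vertical line: the summed lobe should point where the path-length difference d*sin(theta) cancels whatever phase difference exists between the drivers at the crossover frequency, and the nulls should sit where the arrivals end up 180 degrees apart. The spacing, crossover frequency, and depth offset below are made-up numbers for illustration, not the actual 4430/4435 values, and the network itself is assumed to be in phase at crossover.

import math

C = 343.0   # speed of sound in m/s (approx., room temperature)

def wrap_phase(deg):
    # Fold a phase difference into -180..+180 degrees.
    return (deg + 180.0) % 360.0 - 180.0

def lobe_tilt_deg(spacing_m, xover_hz, phase_diff_deg):
    # Direction of the summed main lobe: the angle where the geometric
    # path difference d*sin(theta) cancels the inter-driver phase
    # difference at the crossover frequency.
    wavelength = C / xover_hz
    s = (wrap_phase(phase_diff_deg) / 360.0) * wavelength / spacing_m
    return math.degrees(math.asin(s)) if abs(s) <= 1.0 else None

def null_angles_deg(spacing_m, xover_hz, phase_diff_deg):
    # Directions where the two arrivals land 180 degrees apart (nulls).
    wavelength = C / xover_hz
    out = []
    for k in (-180.0, 180.0):
        s = ((wrap_phase(phase_diff_deg) + k) / 360.0) * wavelength / spacing_m
        if abs(s) <= 1.0:
            out.append(math.degrees(math.asin(s)))
    return out

# Made-up example: 30 cm center-to-center spacing, 1 kHz crossover,
# low-frequency acoustic center 5 cm deeper than the tweeter's.
d, f, depth_offset = 0.30, 1000.0, 0.05
phase = 360.0 * depth_offset / (C / f)   # phase lag from the depth offset alone
print("lobe tilt (deg):", lobe_tilt_deg(d, f, phase))
print("null angles (deg):", null_angles_deg(d, f, phase))

Since the phase difference wraps around every full wavelength of offset, the computed tilt repeats with each additional wavelength of offset while the actual time delay keeps growing, which is the repeating wobble I was guessing at above.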

Thanks, Rob :)