Propagation delay is the time required for a signal to travel from a transmitter to a receiver. In satellite systems, this delay depends on the signal path length and the speed of light. For signals in free space, the delay is given by the following equation:

τ = d / c

where τ is the one-way delay in seconds, d is the path length in meters, and c ≈ 3 × 10⁸ m/s is the speed of light in vacuum.
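As a quick numerical check, the relation above can be evaluated directly. The path lengths below are illustrative round numbers, not values from the article:

```python
# Minimal sketch of the free-space delay relation tau = d / c.
C = 299_792_458.0  # speed of light in vacuum, m/s

def propagation_delay(path_length_m: float) -> float:
    """One-way free-space propagation delay in seconds."""
    return path_length_m / C

# Illustrative one-way path lengths (assumed, not from the article):
print(f"LEO slant path (~2,000 km): {propagation_delay(2.0e6) * 1e3:.2f} ms")
print(f"GEO altitude (~35,786 km):  {propagation_delay(35_786e3) * 1e3:.2f} ms")
```

The two orders of magnitude between a short LEO slant path (a few milliseconds) and a GEO hop (over a hundred milliseconds) is why the two cases are treated so differently in link design.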
Why Propagation Delay Changes During a Satellite Pass
When the distance between endpoints remains approximately constant, as in fixed terrestrial links or geostationary (GEO) systems, the propagation delay is effectively constant as well.
However, in low Earth orbit (LEO) and medium Earth orbit (MEO) systems, the slant range between the satellite and the ground station changes continuously during a pass. As this separation varies, so does the propagation delay, making it a time-varying quantity.
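The geometry behind this variation can be sketched with the standard slant-range formula for a spherical Earth, d = √((Rₑ + h)² − (Rₑ cos ε)²) − Rₑ sin ε, where ε is the elevation angle. The orbit altitude below is an assumed example value:

```python
import math

R_E = 6_371e3       # mean Earth radius, m (spherical-Earth assumption)
C = 299_792_458.0   # speed of light in vacuum, m/s

def slant_range(altitude_m: float, elevation_rad: float) -> float:
    """Ground-station-to-satellite slant range for a given elevation angle."""
    r = R_E + altitude_m
    return (math.sqrt(r**2 - (R_E * math.cos(elevation_rad))**2)
            - R_E * math.sin(elevation_rad))

# Delay from horizon to zenith for an assumed 550 km LEO altitude:
for el_deg in (0, 30, 90):
    d = slant_range(550e3, math.radians(el_deg))
    print(f"elevation {el_deg:2d} deg: range {d / 1e3:7.1f} km, "
          f"delay {d / C * 1e3:.2f} ms")
```

At zenith the slant range equals the orbit altitude, while near the horizon it is several times larger, so the delay swings by a factor of four or more over a single pass.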
To see how this delay evolves during a pass, the visualization below plots propagation delay over time as the satellite approaches and then moves away from the ground station.
Visualizing Propagation Delay during a LEO Satellite Pass
This visualization shows how propagation delay varies during a LEO satellite pass. Move the slider to observe how delay decreases as the satellite approaches, reaches a minimum at closest approach, and increases again as the satellite moves away.
Why Accurate Propagation Delay Emulation Matters
Accurately reproducing time-varying propagation delay is critical for realistic system testing. Delay cannot be faithfully emulated by simply inserting or dropping whole samples, since each step introduces an abrupt phase discontinuity. Instead, the delay must be updated smoothly and continuously, typically using fractional resampling techniques that preserve phase integrity and maintain the natural coupling between delay rate and Doppler shift.
The Maury Microwave ACE9000 Advanced Channel Emulator replicates these dynamic conditions with high fidelity, enabling engineers to model real satellite motion in a controlled laboratory environment.
