Rain fade refers primarily to the absorption of a microwave radio frequency (RF) signal by atmospheric rain, snow, or ice; these losses are especially prevalent at frequencies above 11 GHz. It also refers to the degradation of a signal caused by the electromagnetic interference of the leading edge of a storm front. Rain fade can be caused by precipitation at the uplink or downlink location. It does not need to be raining at a location for it to be affected by rain fade, as the signal may pass through precipitation many miles away, especially if the satellite dish has a low look angle. From 5% to 20% of rain fade or satellite signal attenuation may also be caused by rain, snow, or ice on the uplink or downlink antenna reflector, radome, or feed horn. Rain fade is not limited to satellite uplinks or downlinks; it can also affect terrestrial point-to-point microwave links (those on the Earth's surface).
Rain fade is usually estimated experimentally, and it can also be calculated theoretically using the scattering theory of raindrops. The raindrop size distribution (DSD) is an important consideration for studying rain fade characteristics.[1] Various mathematical forms, such as the gamma, lognormal, or exponential distributions, are usually used to model the DSD. Mie or Rayleigh scattering theory with a point-matching or T-matrix approach is used to calculate the scattering cross section and the specific rain attenuation. Since rain is a non-homogeneous process in both time and space, the specific attenuation varies with location, time, and rain type.
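Scattering calculations over an assumed DSD are commonly reduced to a power-law relation between specific attenuation and rain rate, γ = k·R^α, the form used in ITU-R P.838. The sketch below illustrates that relation only; the k and α values are assumed placeholders for illustration, not the tabulated coefficients, which depend on frequency and polarization.

```python
# A minimal sketch of the power-law specific-attenuation relation gamma = k * R**alpha.
# The k and alpha values below are illustrative placeholders, not ITU-R P.838 table
# values; real coefficients depend on frequency and polarization.

def specific_attenuation(rain_rate_mm_per_h: float, k: float, alpha: float) -> float:
    """Specific rain attenuation in dB/km for a given rain rate in mm/h."""
    return k * rain_rate_mm_per_h ** alpha

# Example with hypothetical coefficients for a Ku-band-like frequency.
k_assumed, alpha_assumed = 0.03, 1.1          # assumed values, illustration only
gamma = specific_attenuation(25.0, k_assumed, alpha_assumed)   # 25 mm/h rain rate
print(f"specific attenuation ~ {gamma:.2f} dB/km")
```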
Total rain attenuation also depends upon the spatial structure of the rain field. The horizontal and vertical extent of rain again varies with rain type and location. The vertical limit of the rain region is usually assumed to coincide with the 0 °C isotherm and is called the rain height. The melting layer height is also used as the limit of the rain region and can be estimated from the bright band signature of radar reflectivity.[2] The horizontal rain structure is assumed to have a cellular form, called a rain cell. Rain cell sizes can vary from a few hundred metres to several kilometres, depending upon the rain type and location. The existence of very small rain cells has recently been observed in tropical rain.[3]
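Once a rain height is assumed, the length of the slant path through the rain region follows from simple geometry. The sketch below uses the flat-Earth approximation L_s = (h_rain − h_station)/sin(θ), which is reasonable for elevation angles above roughly 5°; the rain height and station altitude are assumed example values.

```python
import math

# A simple sketch of the slant-path length through the rain region, assuming a
# flat-Earth approximation valid for elevation angles above roughly 5 degrees.

def slant_path_length_km(rain_height_km: float, station_height_km: float,
                         elevation_deg: float) -> float:
    """Length of the satellite path below the assumed rain height, in km."""
    return (rain_height_km - station_height_km) / math.sin(math.radians(elevation_deg))

# Example: a 4 km rain height, a station at 0.2 km altitude, a 30 degree look angle.
print(f"slant path ~ {slant_path_length_km(4.0, 0.2, 30.0):.1f} km through rain")
```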
Rain attenuation on satellite communication links can be predicted using rain attenuation prediction models, which in turn lead to a suitable selection of a Fade Mitigation Technique (FMT).[4] These prediction models require rainfall rate data, which can be obtained either from predicted rainfall maps, which may yield inaccurate attenuation predictions, or from actual measured rainfall data, which gives a more accurate prediction and hence a more appropriate selection of FMT. The altitude of the earth station above sea level is also an essential factor affecting the rain attenuation performance.[5] Satellite system designers and channel providers should account for rain impairments when setting up their channels.
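A prediction model of this kind typically multiplies the specific attenuation for a given exceedance rain rate by an effective path length through rain. The sketch below is a highly simplified illustration loosely following the structure of ITU-R P.618-style models; the path reduction factor is an assumed simple form rather than the official procedure, and all numeric inputs are example values.

```python
import math

# A simplified sketch of a rain attenuation prediction: specific attenuation (dB/km)
# times an effective path length through rain. The reduction factor below is an
# assumed illustrative form, not the ITU-R recommendation.

def predict_rain_attenuation_db(rain_rate_mm_per_h: float, k: float, alpha: float,
                                rain_height_km: float, station_height_km: float,
                                elevation_deg: float) -> float:
    gamma = k * rain_rate_mm_per_h ** alpha                                   # dB/km
    slant_km = (rain_height_km - station_height_km) / math.sin(math.radians(elevation_deg))
    horizontal_km = slant_km * math.cos(math.radians(elevation_deg))
    reduction = 1.0 / (1.0 + horizontal_km / 35.0)    # assumed simple path reduction factor
    return gamma * slant_km * reduction

# Example: a 42 mm/h rain rate with illustrative coefficients and geometry.
a_rain = predict_rain_attenuation_db(42.0, 0.03, 1.1, 4.0, 0.2, 30.0)
print(f"predicted rain attenuation ~ {a_rain:.1f} dB")
```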
Possible ways to overcome the effects of rain fade are site diversity, uplink power control, variable rate encoding, and receiving antennas larger than the size required for normal weather conditions.
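As a rough illustration of one of these techniques, uplink power control raises the transmit power to offset the estimated rain attenuation, up to a hardware limit. The sketch below is a minimal, assumed example; the clear-sky power and amplifier limit are not values from any particular system.

```python
# A minimal sketch of uplink power control: transmit power is increased to compensate
# for the estimated rain fade, clipped at the amplifier's maximum output.

def uplink_power_dbw(clear_sky_power_dbw: float, estimated_fade_db: float,
                     max_power_dbw: float) -> float:
    """Return the commanded transmit power after fade compensation."""
    return min(clear_sky_power_dbw + estimated_fade_db, max_power_dbw)

# Example: 10 dBW clear-sky power, a 6 dB estimated fade, a 14 dBW amplifier limit.
print(f"commanded uplink power: {uplink_power_dbw(10.0, 6.0, 14.0):.1f} dBW")
# The remaining 2 dB shortfall would have to be absorbed by link margin or by
# another technique, such as variable rate encoding.
```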