Jitter is a measure of variation in latency. It’s also known as packet delay variation. Jitter is important to time-sensitive UDP applications like real-time voice and video streaming. Streaming services like YouTube and Netflix use TCP with large playback buffers, which makes them largely insensitive to jitter.
Jitter is important to real-time streaming media applications because each packet they produce contains a tiny sample of the source media. To accurately recreate the media at the receiver end, those packets must arrive at a constant rate (and in the correct order!). Otherwise, audio may be garbled, or video may be fuzzy or freeze. All networks introduce some jitter because each packet in a single data stream can experience different network conditions. This could be because it took a different path, or because it experienced a longer queuing delay. Severe jitter is almost always caused by network congestion, lack of QoS, or misconfigured QoS.
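One common way to quantify this is the interarrival jitter estimator from RFC 3550 (the RTP specification): a running average of how much the spacing between consecutive packets at the receiver differs from their spacing at the sender. A minimal sketch, with illustrative timestamps in milliseconds:

```python
def interarrival_jitter(send_times, recv_times):
    """Estimate jitter per RFC 3550: a smoothed average of the change
    in transit time between consecutive packets (the gain of 1/16
    comes from the RFC's recommended smoothing formula)."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        # D: how much this packet's transit time differs from the previous one's
        d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        jitter += (abs(d) - jitter) / 16.0
    return jitter

# Example: packets sent every 20 ms, but arrivals vary slightly.
send = [0, 20, 40, 60, 80]
recv = [5, 26, 44, 67, 85]
print(round(interarrival_jitter(send, recv), 4))  # a fraction of a millisecond
```

If the network delayed every packet by exactly the same amount, `d` would always be zero and the estimate would stay at zero: jitter measures variation in delay, not delay itself.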
For LAN or dedicated WAN links, make sure you use traffic shaping techniques to prioritize traffic, and verify that your traffic shaping config has the intended effect. Use delivery monitoring to identify low-capacity links, and flow analysis to try to free up capacity on the network.
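On a Linux router, one way to implement this kind of prioritization is with the `tc` traffic control tool. The sketch below is illustrative only: the interface name (`eth0`) and rates are assumptions you would replace with your own, and it assumes voice traffic is already marked with DSCP EF (ToS byte 0xB8).

```shell
# Root HTB qdisc; unclassified traffic falls into class 1:20
tc qdisc add dev eth0 root handle 1: htb default 20

# Parent class capped at the link rate (assumed 100 Mbit here)
tc class add dev eth0 parent 1: classid 1:1 htb rate 100mbit

# High-priority class for voice, low-priority class for everything else
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 20mbit ceil 100mbit prio 0
tc class add dev eth0 parent 1:1 classid 1:20 htb rate 80mbit ceil 100mbit prio 1

# Steer DSCP EF-marked packets (ToS byte 0xB8) into the voice class
tc filter add dev eth0 parent 1: protocol ip prio 1 u32 \
    match ip tos 0xb8 0xff flowid 1:10
```

After applying something like this, confirm the intended effect with `tc -s class show dev eth0` and check that packet counters on the voice class actually increase during a call.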
For paths that traverse the public Internet, you should focus on compensating for jitter rather than reducing it. Playback devices have a jitter buffer to counter variations in packet delay and packet reordering. Measured in milliseconds, it meters packets on ingress so that they’re spaced evenly before processing. Problems can arise if the jitter buffer is too small or too big compared to the actual jitter. Modern playback devices have adaptive jitter buffers, which automatically correct for jitter, but it might still be worth checking the config on your handset or playback device.
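The mechanics of a jitter buffer can be sketched in a few lines. This is a simplified illustration, not any real device's implementation: packets arrive out of order, the buffer holds a fixed depth before playout begins, and each playout tick releases the next packet in sequence (or reports a gap the decoder must conceal).

```python
import heapq

class JitterBuffer:
    """Minimal fixed-depth playout buffer sketch (illustrative names,
    not a real API). Absorbs reordering and delay variation by holding
    packets briefly, then releasing them in sequence-number order."""

    def __init__(self, depth):
        self.depth = depth      # packets to accumulate before playout starts
        self.heap = []          # min-heap keyed on sequence number
        self.next_seq = None    # next sequence number due for playout

    def push(self, seq, payload):
        """Called on packet arrival, in whatever order the network delivers."""
        heapq.heappush(self.heap, (seq, payload))

    def pop(self):
        """Called once per playout interval (e.g. every 20 ms).
        Returns the next payload, or None for a gap."""
        if self.next_seq is None:
            if len(self.heap) < self.depth:
                return None                     # still filling the buffer
            self.next_seq = self.heap[0][0]     # playout starts here
        # Drop packets that arrived too late to be played
        while self.heap and self.heap[0][0] < self.next_seq:
            heapq.heappop(self.heap)
        if self.heap and self.heap[0][0] == self.next_seq:
            self.next_seq += 1
            return heapq.heappop(self.heap)[1]
        self.next_seq += 1
        return None                             # missing packet: gap

# Packets arrive out of order; playout still comes out in order.
buf = JitterBuffer(depth=3)
for seq, data in [(2, "b"), (1, "a"), (3, "c")]:
    buf.push(seq, data)
print([buf.pop() for _ in range(3)])  # ['a', 'b', 'c']
```

The trade-off the paragraph above describes shows up directly in `depth`: too small and late packets become gaps; too large and every packet picks up avoidable end-to-end delay. Adaptive jitter buffers tune this depth on the fly from measured jitter.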