TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) are two widely used communication protocols on the Internet. They are both data transport protocols, but they differ in their approach to transmission. In this article, we’ll look at the differences between TCP and UDP and when it’s best to use one or the other.
TCP
TCP is a connection-oriented protocol, which means that a connection must be established (via a three-way handshake) before any data can be transmitted. Once the connection is established, data is transmitted reliably: the receiver acknowledges the data it gets, and if a packet is lost in transit, the sender retransmits it. Guaranteed, in-order delivery is therefore one of the main advantages of TCP.
However, all of this bookkeeping can lead to higher latency than UDP: acknowledgments and retransmissions add overhead, and when a packet is lost, later data must wait for it to be resent before it can be delivered in order.
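The connect-then-exchange pattern described above can be sketched with Python's standard socket module. This is a minimal illustration, not production code; the addresses are loopback and the "ACK:" reply is just an application-level echo to make the round trip visible (TCP's own acknowledgments happen invisibly in the kernel):

```python
import socket
import threading

# Set up the listener first so the client cannot connect too early.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))         # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve():
    conn, _ = srv.accept()         # blocks until the handshake completes
    data = conn.recv(1024)         # TCP guarantees delivery and ordering
    conn.sendall(b"ACK:" + data)   # reply flows back over the same connection
    conn.close()

threading.Thread(target=serve, daemon=True).start()

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", port))   # connection established before any data
cli.sendall(b"frame-1")
reply = cli.recv(1024)
print(reply)                       # prints b'ACK:frame-1'
cli.close()
srv.close()
```

Note that the client cannot send anything until `connect()` succeeds; that setup cost is part of the latency TCP trades for reliability.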
UDP
UDP, on the other hand, is a connectionless protocol. This means that data can be transmitted without having to establish a connection first. Packets are sent with no guarantee of delivery or order. UDP is generally faster and requires fewer resources than TCP because it does not need to manage reliability. This can be useful in applications that require minimal latency, such as video streaming or voice over IP.
However, UDP has no automatic mechanism for retransmitting lost packets: if a packet is lost en route, it will not be resent. In audio or video transmission, this typically shows up as a brief glitch, cut, or corruption in the stream.
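The contrast with TCP is visible in code: a datagram socket needs no handshake, and the sender simply fires a packet at an address. A minimal sketch, again using Python's standard socket module on loopback (where delivery happens to work; across a real network the datagram could silently vanish):

```python
import socket

# Receiver: bind a datagram socket; no listen/accept, no connection state.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
port = rx.getsockname()[1]

# Sender: transmit a datagram with no connection setup at all.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"video-chunk", ("127.0.0.1", port))

# On loopback this arrives, but UDP itself makes no such promise:
# a lost datagram would simply never show up, with no retransmission.
data, addr = rx.recvfrom(1024)
print(data)                        # prints b'video-chunk'
tx.close()
rx.close()
```

The absence of setup and acknowledgment traffic is exactly why UDP is attractive for latency-sensitive streams, and also why loss goes uncorrected.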
Which one to choose?
In video surveillance, it is generally safer to use the TCP protocol. This avoids having cuts or corruption in the video.
On the other hand, if delays are noticeable in the live image, switching to UDP may reduce the latency. Using UDP can also make PTZ (pan-tilt-zoom) control more responsive, which matters when rapidly moving the camera to track a moving target.