Ethernet latency can be defined as the time it takes for a network packet to reach its destination, or to return from it. It therefore determines how long an application must wait for data to arrive. This is as important as download speed, because on a high-latency (slow) network information takes longer to move around, and web pages load more slowly as each request for the next picture, script or piece of text incurs a significant delay. Latency in a packet-switched network is stated as either one-way latency or Round-Trip Time (RTT). One-way latency is the time required to send a packet from the source to the destination; RTT is the one-way latency from source to destination plus the one-way latency from destination back to source, so when the path is roughly symmetric, one-way latency can be approximated as RTT/2. Latency also refers to any of several kinds of delays typically incurred in the processing of network data. Systems with low latency must not only get a message from A to B as quickly as possible, but also do so for millions of messages per second.
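The RTT measurement and the RTT/2 approximation described above can be sketched in Python. This is a minimal illustration, not a production tool: it times a round trip against a throwaway local echo server, and the echo helper and payload are assumptions made only so the example is self-contained.

```python
import socket
import threading
import time

def echo_server(sock):
    """Hypothetical helper: accept one connection and echo the data back."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# Start a throwaway echo server on localhost so the example is self-contained.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# Measure RTT: send a small payload and time how long it takes to come back.
client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
client.sendall(b"ping")
client.recv(1024)
rtt = time.perf_counter() - start
client.close()
server.close()

# Estimate one-way latency as RTT/2 -- valid only if the path is symmetric.
one_way_estimate = rtt / 2
print(f"RTT: {rtt * 1000:.3f} ms, estimated one-way: {one_way_estimate * 1000:.3f} ms")
```

On a real network the forward and return paths may differ, so RTT/2 is only an estimate of the true one-way latency.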
End-to-end latency is the cumulative effect of the individual latencies along the end-to-end network path. Of all the devices on that path, network routers introduce the most latency, and packet queuing due to link congestion is most often the reason for large delays through a router. Because latency is cumulative, the more links and router hops there are between the sender and the receiver, the larger the end-to-end latency...
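The cumulative effect can be made concrete with a small sketch. The per-hop figures below are hypothetical, chosen only to show how propagation and queuing delays at each link and router add up to the end-to-end latency.

```python
# Hypothetical per-hop delays in milliseconds (propagation + queuing).
hops = [
    {"link": "host -> router1",    "propagation_ms": 0.5, "queuing_ms": 0.1},
    {"link": "router1 -> router2", "propagation_ms": 5.0, "queuing_ms": 2.5},  # congested link
    {"link": "router2 -> server",  "propagation_ms": 1.0, "queuing_ms": 0.2},
]

# End-to-end latency is simply the sum of the per-hop delays.
end_to_end_ms = sum(h["propagation_ms"] + h["queuing_ms"] for h in hops)
print(f"End-to-end one-way latency: {end_to_end_ms:.1f} ms")  # -> 9.3 ms
```

Note that the congested middle link contributes most of the total, matching the observation that queuing at routers dominates end-to-end delay.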
... middle of paper ...
...it for data to arrive at its destination, and is normally expressed in milliseconds (ms). Although latency and bandwidth together define the speed and capacity of a network, a 25 Mbps (megabits per second) connection does not make any single bit of data travel the distance faster. A large-bandwidth connection only allows you to send or receive more data in parallel, not faster: the data still needs to travel the distance and experience the normal delay.
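The distinction between latency and bandwidth can be worked through numerically. In this sketch, total transfer time is modelled as one-way latency plus serialization time (payload size divided by bandwidth); the 50 ms latency and the payload sizes are assumed values for illustration.

```python
def transfer_time_ms(size_bits, bandwidth_bps, one_way_latency_ms):
    """Time for a payload to fully arrive: latency plus time on the wire."""
    return one_way_latency_ms + (size_bits / bandwidth_bps) * 1000

LATENCY_MS = 50  # assumed one-way latency of the path

for mbps in (25, 100):
    bw = mbps * 1_000_000
    small = transfer_time_ms(8_000, bw, LATENCY_MS)        # 1 KB request
    large = transfer_time_ms(80_000_000, bw, LATENCY_MS)   # 10 MB download
    print(f"{mbps} Mbps: 1 KB in {small:.2f} ms, 10 MB in {large:.0f} ms")
```

Quadrupling the bandwidth barely changes the 1 KB case (50.32 ms vs 50.08 ms) because the fixed 50 ms delay dominates, while the 10 MB download drops from 3250 ms to 850 ms: extra bandwidth helps large parallel transfers but cannot remove the underlying latency.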
IV. THE IMPACT OF LATENCY
Applications with programming models that are susceptible to performance degradation due to latency include the following:
• Applications that depend on the frequent delivery of one-at-a-time transactions, as opposed to the transfer of large quantities of data.
• Applications that track or process real-time data, i.e. the so-called “low latency” applications.
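For the first class of application, the impact is easy to quantify: a client that issues one transaction at a time and waits for each reply can never exceed one transaction per round trip, no matter how much bandwidth is available. The RTT values below are assumed for illustration.

```python
def max_transactions_per_sec(rtt_ms):
    """Upper bound on serial request-reply transactions: one per round trip."""
    return 1000 / rtt_ms

# Assumed RTTs: LAN-like, metro-like, and intercontinental-like delays.
for rtt in (1, 10, 100):
    rate = max_transactions_per_sec(rtt)
    print(f"RTT {rtt:>3} ms -> at most {rate:.0f} serial transactions/s")
```

A 100 ms path caps a one-at-a-time workload at 10 transactions per second, which is why such applications are far more sensitive to latency than to raw bandwidth.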