Sometimes you need to know how much data can be transferred per second on your network (LAN or WAN), and sometimes you need to find out how long it will take for a given amount of data to be fully transferred.

For example, you may want to know how long it would take to seed data from a production site to a DR site or to the cloud.

The examples below show how to find the theoretical numbers. I say theoretical because the result is not exact: it can change based on network conditions, jitter, and so on, but it still gives you a good indication of what you are dealing with.

Finding how much data you can theoretically transfer per second:

If you have a 1Gbps network link.

How much data can you theoretically transfer per second?

# Network metric conversion uses decimal math

1 Gbps = 1000 Mbps = 1,000,000 Kbps = 1,000,000,000 bps

1 second = 1,000 milliseconds

So it will be:
1,000 Mbps / 8

= 125 MB can be transferred per second over a 1Gbps network link.
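The conversion above can be sketched in a few lines of Python (the function name `mbytes_per_second` is mine, just for illustration):

```python
def mbytes_per_second(link_mbps: float) -> float:
    """Convert a link speed in megabits per second (decimal units)
    to megabytes per second: 1 byte = 8 bits."""
    return link_mbps / 8

# A 1 Gbps link = 1,000 Mbps
print(mbytes_per_second(1000))  # 125.0
```

The same function works for any link speed, e.g. `mbytes_per_second(150)` gives 18.75 MB per second for a 150Mbps link.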

If you want to estimate how long, theoretically, it will take to transfer X amount of data, you can do the following:


20TB needs to be transferred over a 150Mbps link.

How many days will it take to transfer that amount of data over the network?

# Data conversion uses base 2

20 TB = 20 x (2^10) GB = 20 x (2^20) MB = 20 x (2^30) KB = 20 x (2^40) Bytes

So it will be:
( (20x(2^20)) x 8 ) / (150) = 1,118,481.0667 seconds

1,118,481.0667 / 3600 = 310.69 hours

310.69 / 24

= 12.95 days to transfer 20TB over a 150Mbps network link.
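The whole calculation can be wrapped up as a small Python sketch, assuming the same conventions as above (base-2 for data sizes, decimal for link speed; `transfer_days` is a name I made up):

```python
def transfer_days(data_tb: float, link_mbps: float) -> float:
    """Estimate the days needed to transfer data_tb terabytes
    over a link_mbps link. Data sizes use base 2 (1 TB = 2^20 MB);
    link speed uses decimal megabits per second."""
    data_mb = data_tb * (2 ** 20)         # TB -> MB (binary)
    data_megabits = data_mb * 8           # MB -> megabits
    seconds = data_megabits / link_mbps   # time at the given link speed
    return seconds / 3600 / 24            # seconds -> hours -> days

print(round(transfer_days(20, 150), 2))  # 12.95
```

Swap in your own numbers to estimate other scenarios; remember the result is still a theoretical floor, before overhead and real-world network conditions.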