You have two very fast servers and need to transfer a 10 Gigabyte file from one server to the other. You have a 10/100 switch to connect these servers. Which of the following transfer time estimates is the most accurate?
- 15 Seconds
- 3 Minutes
- 3 Seconds
- 15 Minutes
EXPLANATION
The maximum transfer rate of the switch is 100 megabits per second (Mbps), not 100 megabytes per second. Odds are the switch will be the bottleneck between these two servers. Many people mistakenly think a gigabit switch can transfer one gigabyte per second. Because there are 8 bits in a byte, a 100 Mbps switch will transfer approximately 12.5 megabytes (MB) of information per second.
10 GB * 1024 MB/GB = 10240 MB
10240 MB / 12.5 MB/second = 819.2 seconds
819.2 seconds / 60 = 13.65 minutes
Round up, and it'll take about 15 minutes.
Another way to do the math:
10 GB * 1024 MB/GB = 10240 MB
10240 MB * 8 bits/byte = 81920 megabits
81920 megabits / 100 megabits/second = 819.2 seconds
819.2 seconds / 60 seconds/minute = 13.65 minutes
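The steps above can be sketched as a short Python calculation (the variable names are just for illustration):

```python
# Estimate transfer time for a 10 GB file over a 100 Mbps link.
file_gb = 10
file_mb = file_gb * 1024             # 10240 MB
file_megabits = file_mb * 8          # 81920 megabits (8 bits per byte)
link_mbps = 100                      # the switch's 100 Mbps port is the bottleneck

seconds = file_megabits / link_mbps  # 819.2 seconds
minutes = seconds / 60               # about 13.65 minutes

print(f"{seconds} seconds = {minutes:.2f} minutes")
```

Note this is a best-case estimate: protocol overhead (Ethernet, IP, TCP headers) means the real transfer will take somewhat longer, which makes 15 minutes the closest answer.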