A comparison of client-server and P2P file distribution delays
In this problem, you'll compare the time needed to distribute a file that is initially located at a server to clients via either client-server download or peer-to-peer download. Before beginning, you might want to first review Section 2.5 and the discussion surrounding Figure 2.22 in the text.
The problem is to distribute a file of size F = 9 Gbits from the server to each of 9 peers. Suppose the server has an upload rate of u_s = 74 Mbps.
The 9 peers have upload rates of: u1 = 14 Mbps, u2 = 19 Mbps, u3 = 20 Mbps, u4 = 11 Mbps, u5 = 11 Mbps, u6 = 27 Mbps, u7 = 30 Mbps, u8 = 24 Mbps, and u9 = 26 Mbps
The 9 peers have download rates of: d1 = 35 Mbps, d2 = 35 Mbps, d3 = 26 Mbps, d4 = 38 Mbps, d5 = 17 Mbps, d6 = 15 Mbps, d7 = 20 Mbps, d8 = 31 Mbps, and d9 = 20 Mbps
Question List
1. What is the minimum time needed to distribute this file from the central server to the 9 peers using the client-server model?
2. For the previous question, what is the root cause of this specific minimum time? Answer 's' for the server, or 'ci', where i is the client's number.
3. What is the minimum time needed to distribute this file using peer-to-peer download?
4. For question 3, what is the root cause of this specific minimum time: the server (s), a client (c), or the combined upload capacity of the clients and the server (cu)?
Solution
1. The minimum time needed to distribute the file = max(N·F/u_s, F/d_min) = max(9·9000/74, 9000/15) = max(1094.59, 600) = 1094.59 seconds.
2. The root cause of the minimum time was s: the server's upload link is the bottleneck, since N·F/u_s = 1094.59 s exceeds F/d_min = 600 s.
3. The minimum time needed to distribute the file = max(F/u_s, F/d_min, N·F/(u_s + Σu_i)) = max(9000/74, 9000/15, 81000/(74 + 182)) = max(121.62, 600, 316.41) = 600 seconds.
4. The root cause of the minimum time was c: the slowest client download rate, d_min = 15 Mbps (client 6), sets the 600-second floor, since F/d_min exceeds both of the other terms.
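The two formulas above can be checked numerically. This is a small sketch (variable names are my own, not from the text) that plugs the problem's rates into the client-server and P2P bounds from Section 2.5:

```python
# Sketch verifying the distribution-time formulas (names are illustrative).
F = 9000          # file size in Mbits (9 Gbits)
u_s = 74          # server upload rate, Mbps
N = 9             # number of peers
u = [14, 19, 20, 11, 11, 27, 30, 24, 26]   # peer upload rates, Mbps
d = [35, 35, 26, 38, 17, 15, 20, 31, 20]   # peer download rates, Mbps

# Client-server: the server must upload N full copies, and each peer
# is limited by its own download rate.
D_cs = max(N * F / u_s, F / min(d))

# P2P: bounded by the server uploading one copy, the slowest download,
# and the aggregate upload capacity of server plus all peers.
D_p2p = max(F / u_s, F / min(d), N * F / (u_s + sum(u)))

print(round(D_cs, 2))   # 1094.59
print(round(D_p2p, 2))  # 600.0
```

Note that in the P2P case the aggregate-upload term, 81000/256 ≈ 316.41 s, is well below F/d_min = 600 s, which is why the slowest downloader (client 6) is the binding constraint.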