Coded caching can significantly reduce the communication load during the network's peak hours. The caching gain is maximized in a centralized setting, where the cache contents of the users are designed opportunistically. In the absence of centralized placement, users' caches are filled with randomly selected packets of the files, which incurs a loss in the caching gain, especially for small cache sizes. This work introduces a novel placement scheme based on (within-file) precoding of the files at the server, followed by random cache placement. It is shown that the proposed technique improves the caching gain compared to uncoded placement. Surprisingly, for small cache sizes, the performance of the proposed decentralized placement matches that of the centralized placement.
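The decentralized placement that the abstract contrasts against can be sketched in a few lines: each user independently caches a uniformly random subset of every file's packets, and coded-delivery gains come from the resulting overlaps across users. The sketch below (all parameter values are illustrative, not taken from the paper) simulates this random placement and counts, for each ordered user pair, the packets one user caches and the other does not, which is where multicast opportunities arise.

```python
import random

# Illustrative decentralized random placement (Maddah-Ali/Niesen style):
# a library of N files, each split into F packets; each of K users has a
# cache of M files' worth of space, so it stores M*F/N random packets of
# every file. Parameter values here are hypothetical, chosen for the demo.
K, N, F = 3, 4, 20          # users, files, packets per file
M = 1                       # cache size, in units of files
per_file = int(M * F / N)   # packets cached per file, per user

random.seed(0)
caches = [
    {f: set(random.sample(range(F), per_file)) for f in range(N)}
    for _ in range(K)
]

# A packet of file f cached by user i but not by user j can be XORed into
# a multicast message serving both; counting such packets over all ordered
# pairs gives a rough measure of the coded-delivery opportunities that
# random placement creates.
pair_opps = sum(
    len(caches[i][f] - caches[j][f])
    for f in range(N)
    for i in range(K)
    for j in range(K)
    if i != j
)
```

With small caches, these random overlaps are sparse, which is the loss the abstract refers to; the paper's within-file precoding at the server is aimed at recovering that loss before the random placement step.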
Original language: English (US)
Title of host publication: 2018 IEEE International Symposium on Information Theory, ISIT 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
State: Published - Aug 15 2018
Event: 2018 IEEE International Symposium on Information Theory, ISIT 2018 - Vail, United States
Duration: Jun 17 2018 → Jun 22 2018
Series: IEEE International Symposium on Information Theory - Proceedings
Bibliographical note (Funding Information):
The work of H. Reisizadeh and S. Mohajer was supported in part by the National Science Foundation under Grant CCF-1749981.