Abstract
Coded caching can significantly decrease the communication load during peak hours of the network. The caching gain is maximized in a centralized setting, where the cache contents of the users are designed opportunistically. In the absence of centralized placement, users' caches are filled with randomly selected packets of the files. This leads to a loss in the caching gain, especially for small cache sizes. In this work, a novel placement scheme is introduced that is based on (within-file) precoding of the files at the server, followed by random cache placement. It is shown that the proposed technique improves the caching gain compared to uncoded placement. Surprisingly, the performance of the proposed decentralized placement matches that of the centralized placement for small cache sizes.
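The paper's precoding step is not detailed in the abstract; what the abstract does describe is the baseline decentralized placement in which each user's cache is filled with randomly selected packets of the files. A minimal sketch of that baseline (function name, packet granularity, and parameters are illustrative assumptions, with `cache_fraction` playing the role of the normalized cache size M/N):

```python
import random


def decentralized_placement(num_users, num_files, packets_per_file,
                            cache_fraction, seed=0):
    """Baseline decentralized placement (illustrative sketch):
    each user independently caches each packet of every file with
    probability cache_fraction (the normalized cache size M/N).
    Returns, per user, the set of (file, packet) indices it stores."""
    rng = random.Random(seed)
    caches = []
    for _ in range(num_users):
        cache = {
            (f, p)
            for f in range(num_files)
            for p in range(packets_per_file)
            if rng.random() < cache_fraction
        }
        caches.append(cache)
    return caches


if __name__ == "__main__":
    caches = decentralized_placement(num_users=3, num_files=4,
                                     packets_per_file=1000,
                                     cache_fraction=0.25)
    for c in caches:
        # Each user ends up caching roughly cache_fraction of all packets.
        print(len(c) / (4 * 1000))
```

Because each user draws its cached packets independently, no coordination with the server is needed at placement time; the loss relative to centralized placement for small caches, noted in the abstract, is what the proposed precoded placement is designed to recover.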
Original language | English (US) |
---|---|
Title of host publication | 2018 IEEE International Symposium on Information Theory, ISIT 2018 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 1715-1719 |
Number of pages | 5 |
ISBN (Print) | 9781538647806 |
DOIs | |
State | Published - Aug 15 2018 |
Event | 2018 IEEE International Symposium on Information Theory, ISIT 2018 - Vail, United States |
Duration | Jun 17 2018 → Jun 22 2018 |
Publication series
Name | IEEE International Symposium on Information Theory - Proceedings |
---|---|
Volume | 2018-June |
ISSN (Print) | 2157-8095 |
Other
Other | 2018 IEEE International Symposium on Information Theory, ISIT 2018 |
---|---|
Country/Territory | United States |
City | Vail |
Period | 6/17/18 → 6/22/18 |
Bibliographical note
Publisher Copyright: © 2018 IEEE.