The deluge of networked big data motivates the development of computation- and communication-efficient network information processing algorithms. In this paper, we propose two data-adaptive censoring strategies that significantly reduce the computation and communication costs of the distributed recursive least-squares (D-RLS) algorithm. By introducing a cost function that downweights observations with small innovations, we develop the first censoring strategy based on the alternating minimization algorithm and the stochastic Newton method; it saves computation whenever a datum is censored. The second censoring strategy further reduces both computation and communication costs by preventing a node from updating its local estimate and transmitting it to neighbors when the node's current innovation falls below a threshold. For both strategies, we give a simple criterion for selecting the innovation threshold so as to achieve a target data-reduction ratio. The proposed censored D-RLS algorithms are guaranteed to converge to the optimal argument in the mean-square deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms.