In this paper, we consider the problem of solving large-scale systems of linear equations over a network of interconnected computing nodes. We assume that the communication links among the nodes switch randomly over time and model the intermittent communication with a Bernoulli process. We propose an iterative algorithm to solve the problem, based on a distributed optimization framework and the idea of using, at each node, the last received information. We provide a convergence analysis of the algorithm and derive computational as well as analytical convergence conditions based on an intrinsic system decomposition and the application of singular perturbation and stochastic control theory. A numerical example verifies that the algorithm is robust to both stochastic link switches and additive uncertainties.
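The setting can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's algorithm: it uses a standard projected-consensus iteration in which node i knows only one row (a_i, b_i) of A x = b, every link delivers independently with Bernoulli probability p each round, and a node falls back on the last estimate it received from a neighbor whose link is currently down. The network topology (complete graph), step rule, and parameter values are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned test system
x_true = rng.standard_normal(n)
b = A @ x_true

p = 0.7                                # Bernoulli link-availability probability
X = np.zeros((n, n))                   # X[i]: node i's current estimate
last = np.zeros((n, n, n))             # last[i, j]: last estimate of j seen by i

for _ in range(2000):
    up = rng.random((n, n)) < p        # up[i, j]: link j -> i delivers this round
    np.fill_diagonal(up, True)         # a node always has its own estimate
    for i in range(n):
        last[i, up[i]] = X[up[i]]      # refresh only over links that are up
    mixed = last.mean(axis=1)          # mixed[i]: average of stored estimates
    # Node i projects its mixed estimate onto its own hyperplane a_i^T x = b_i.
    resid = (A * mixed).sum(axis=1) - b            # a_i^T mixed_i - b_i, per node
    X = mixed - (resid / (A * A).sum(axis=1))[:, None] * A

err = max(np.linalg.norm(X[i] - x_true) for i in range(n))
```

Despite the stale information injected by failed links, every node's estimate contracts toward the solution of A x = b, which is the qualitative behavior the abstract's robustness claim refers to.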