Extended range electric vehicles (EREVs) are a potential solution for mitigating fossil fuel use and reducing on-road emissions. EREVs can yield significant fuel economy improvements when proper energy management strategies (EMSs) are employed. However, many in-use EREVs achieve only moderate fuel reduction compared to conventional vehicles because their EMSs are far from optimal. This paper focuses on in-use rule-based EMSs to improve the fuel efficiency of EREV last-mile delivery vehicles equipped with two-way Vehicle-to-Cloud (V2C) connectivity. The method uses historical vehicle data collected on actual delivery routes, together with machine learning, to improve the fuel economy of future routes. The paper first introduces the main challenges of the project, such as the inherent uncertainty in human driver behavior and in the roadway environment. Then, the framework of our practical physics-model-guided, data-driven approach is introduced. For vehicles with small amounts of prior data, a Bayesian method is used to adjust a control parameter in the EMS offline for each vehicle, using prior information derived from a large number of trips by other vehicles in the fleet. For vehicles with many delivery trips, a reinforcement learning algorithm is used to optimize the parameter in real time without requiring future trip information. Although our data-driven framework cannot achieve a globally optimal solution with respect to fuel efficiency, it provides a systematic and immediate solution for in-use EREVs used for package delivery, at very low computational cost and with no changes to vehicle hardware. Moreover, the framework can be extended for further fuel economy improvements as additional information becomes available from advanced transportation infrastructure such as Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) connectivity.
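The offline Bayesian step described above can be sketched as a simple conjugate Gaussian update, in which a fleet-level prior over the EMS control parameter is combined with a vehicle's own few per-trip estimates. The Gaussian model, the interpretation of the parameter as an engine-on state-of-charge threshold, and all numeric values below are illustrative assumptions, not details from the paper:

```python
import statistics

def posterior_gaussian(prior_mean, prior_var, observations, obs_var):
    """Conjugate Gaussian update for a scalar EMS control parameter.

    prior_mean, prior_var: fleet-level prior built from many trips
                           by other vehicles in the fleet.
    observations:          this vehicle's own per-trip estimates of
                           the best parameter value (few data points).
    obs_var:               assumed per-trip observation noise variance.
    Returns the posterior mean and variance for this vehicle.
    """
    n = len(observations)
    obs_mean = statistics.fmean(observations)
    # Precision-weighted combination of prior and vehicle data.
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

# Hypothetical numbers: fleet prior centered at a 0.30 engine-on
# threshold; three early trips from one vehicle suggest a lower value.
mean, var = posterior_gaussian(0.30, 0.02**2, [0.27, 0.26, 0.28], 0.03**2)
```

With only a few trips, the posterior stays close to the fleet prior; as a vehicle accumulates data, its own observations dominate, which mirrors the paper's transition from the Bayesian method to per-vehicle online learning.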