Context-aware recommendation has become increasingly important in recent years, as users immersed in an enormous volume of music content have difficulty making choices. User emotion, one of the most informative contexts, has the potential to improve music recommendation, but it has not yet been fully explored because emotion is difficult to acquire. This article extracts users' emotions from their microblogs at different granularity levels and over different time windows. The approach then correlates three elements: the user, the music, and the user's emotion while listening to a given piece. Based on associations extracted from a data set crawled from a Chinese Twitter-style service, we develop several emotion-aware music recommendation methods. A series of experiments shows that incorporating users' emotional context indeed improves recommendation performance in terms of hit rate, precision, recall, and F1 score. Copyright © 2015 Elsevier Ltd. All rights reserved.
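To make the evaluation metrics named in the abstract concrete, the following is a minimal sketch of the standard definitions of hit rate, precision, recall, and F1 for a single user's top-N recommendation list. The item identifiers and list contents are purely illustrative and not taken from the paper's data set.

```python
def topn_metrics(recommended, relevant):
    """Hit rate, precision, recall, and F1 for one user's top-N list,
    using the standard top-N recommendation definitions."""
    rec, rel = set(recommended), set(relevant)
    hits = len(rec & rel)                        # recommended items the user actually liked
    hit_rate = 1.0 if hits > 0 else 0.0          # 1 if at least one recommendation hit
    precision = hits / len(rec) if rec else 0.0  # fraction of recommendations that are relevant
    recall = hits / len(rel) if rel else 0.0     # fraction of relevant items recovered
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall > 0 else 0.0)    # harmonic mean of precision and recall
    return hit_rate, precision, recall, f1

# Hypothetical top-5 list and ground-truth listening history for one user.
hr, p, r, f1 = topn_metrics(["s1", "s2", "s3", "s4", "s5"], ["s2", "s5", "s9"])
# → hit rate 1.0, precision 0.4, recall 2/3, F1 0.5
```

In an experiment these per-user values would be averaged over all test users; the paper reports such aggregate scores when comparing emotion-aware methods against baselines.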