Some shows are not very good when they begin, but improve noticeably from season 2 onward. Which series got better after the first season, and what major addition to the show made the difference? I think Breaking Bad and The Sopranos both improved with later seasons.