I've seen five seasons' worth, and I've yet to work up the effort to watch six or seven. I'm a horror/gore buff and have probably seen more zombie films than I've had hot meals, but given its level of popularity, I consider 'The Walking Dead' to be overrated.
For me, the show's simply not engaging enough. With very few exceptions, I can't bring myself to care about the characters or what happens to them: most are so bland, annoying, or cliched that I don't care whether they survive, while the ones who are engaging and likable - Daryl or Michonne, for example - are practically guaranteed to survive because they're fan favorites, which eliminates any tension. The series has also had a lot of excruciatingly slow patches where nothing really happens, along with plot threads that don't go anywhere.
On top of that, it doesn't offer anything subversive in the way of the social commentary or satire that zombie fiction is typically known for, and it's utterly bereft of the kind of humor that might save it from becoming the overwrought melodrama it almost always is. I hate making comparisons because I prefer to judge any piece of media on its own merits, but 'The Walking Dead' simply pales next to the films that inspired it, especially the Romero trilogy.
I guess I don't actively dislike it, since I've managed to watch five seasons of it, but I have no emotional attachment to it whatsoever...it's merely a passable way to spend an hour, and there's often some good special effects work to watch, but I certainly don't obsess over it.