1080p is all hype, people (at least right now). No one broadcasts in 1080p, period. That means all your TV is doing is deinterlacing the signal for you, which pretty much every flat-panel fixed-pixel display has to do anyway to show a progressive image (because they are PROGRESSIVE displays!). The exception to that last point is if you own a CRT display that can render 1080i natively; then it won't need to deinterlace. But if you own an LCD, plasma, DLP, or rear-projection LCD, then every freakin' signal you ever feed the TV will have to be deinterlaced, because the monitor is progressive scan. Since there is pretty much no such thing as a 1080p source, the only thing you might be getting is a better chip in your TV to scale 720p up to 1080p (scaling only) or to deinterlace 1080i to 1080p. Other than that, what's the benefit?
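If you're curious what "deinterlacing" actually does, here's a toy sketch in Python/numpy of "bob" deinterlacing, about the simplest trick those chips pull. The function name, shapes, and line-averaging here are my own illustrative assumptions, not any real chip's algorithm:

    import numpy as np

    def bob_deinterlace(field, top=True):
        # 'field' holds half the scan lines (e.g. 540 lines of a 1080i frame).
        # Bob doubles it to a full progressive frame by interpolating the
        # missing lines from the neighbors above and below.
        h, w = field.shape
        frame = np.empty((2 * h, w), dtype=np.float32)
        frame[(0 if top else 1)::2] = field            # the real scan lines
        pad = np.pad(field.astype(np.float32), ((1, 1), (0, 0)), mode='edge')
        if top:
            frame[1::2] = (pad[1:-1] + pad[2:]) / 2    # average surrounding lines
        else:
            frame[0::2] = (pad[:-2] + pad[1:-1]) / 2
        return frame

    # 1080i delivers two 540-line fields; the panel has to rebuild 1080 lines
    top_field = np.random.rand(540, 1920).astype(np.float32)   # fake luma field
    progressive = bob_deinterlace(top_field)                   # shape (1080, 1920)

Better chips do motion-adaptive tricks instead of dumb line averaging, which is exactly where the quality differences between deinterlacers come from.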
Also, the macro blocking thing can happen anywhere in the chain, and it is most often a BANDWIDTH issue. You can bet your ass those $100K pro HD cameras don't macro block (they may show interlace tearing/combing when running at 1080i, but that is very different from macro blocking). Usually your cable provider is compressing the image to squeeze more channels into the same bandwidth; you can see this when the same program shows macro blocking on Comcast but not on DirecTV. It all depends, but one thing is for sure: that macro blocking is coming down the pipe already baked in, and your TV can't do anything about it. Now, when it comes to deinterlacers and scalers, the chips can and do make a huge difference, but that doesn't manifest itself as macro blocking; it shows up as horizontal combing/tearing (the deinterlacer) and edge sharpness/fuzziness (the scaler). Maybe I'm wrong, but I know a fair amount about video and home theaters, and this is my opinion.
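For anyone who wants to see why bit-starving a block-based codec makes those blocks, here's a toy Python/scipy sketch of the 8x8 DCT quantization that MPEG-2-style codecs are built on. The function, the ramp image, and the absurdly coarse quantizer step q are illustrative assumptions to make the effect obvious, not a real encoder:

    import numpy as np
    from scipy.fft import dctn, idctn

    def crush_block(block, q):
        # Quantize an 8x8 block's DCT coefficients with step q. A bit-starved
        # encoder (big q) zeroes the detail inside each block independently,
        # so neighboring blocks stop matching at their shared edges.
        coeffs = dctn(block, norm='ortho')
        return idctn(np.round(coeffs / q) * q, norm='ortho')

    ramp = np.add.outer(np.arange(16.0), np.arange(16.0)) * 8  # smooth diagonal ramp
    crushed = np.block([[crush_block(ramp[y:y+8, x:x+8], q=300)
                         for x in (0, 8)] for y in (0, 8)])
    # 'ramp' was smooth; 'crushed' is now four flat tiles with visible
    # steps at x=8 and y=8 -- macro blocking in miniature

The point being: those block edges are created where the compression happens, upstream of you, so no amount of fancy TV circuitry puts the detail back.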
Finally, remember another thing, folks: your fixed-pixel display has a NATIVE RESOLUTION of its own. Do you actually know what it is? I bet some of you do, but I bet others don't. Because if you think your 1080i/p display actually has 1920x1080 pixels, you might be surprised: it most likely doesn't. Just because it can accept and display a 1080i signal doesn't mean it has that many pixels. Take my LCD, for example: its native resolution is exactly 720p (it's pretty rare for a panel to land spot-on an HD format, but in my case it does). This means that even though it can play 1080i, in the background it's going to deinterlace (remember, we're using progressive-scan monitors for the most part, not CRTs) and downscale (from 1080 lines to 720) just to display the image. Add to the equation that in many cases the format changes several times, each conversion making things worse: the show might be shot in 720p, the cable company may send it to you as 1080i, and then your TV has to scale it back down to its native resolution. It's worth finding out the native resolution of your TV; it might be an enlightening piece of info.
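To put some numbers on that, here's a quick Python sketch of the pixel math and a crude stand-in for the scaler. The nearest-neighbor resample is my simplification; a real scaler chip filters much more carefully (which is exactly why scaler quality matters):

    import numpy as np

    src_h, src_w = 1080, 1920      # what the cable box hands you: ~2.07M pixels
    panel_h, panel_w = 720, 1280   # a true 720p panel: ~0.92M physical pixels

    def downscale(img, out_h, out_w):
        # Crude nearest-neighbor resample: pick the closest source pixel for
        # each panel pixel. Real scalers filter; this is just the idea.
        ys = np.arange(out_h) * img.shape[0] // out_h
        xs = np.arange(out_w) * img.shape[1] // out_w
        return img[np.ix_(ys, xs)]

    frame = np.random.rand(src_h, src_w).astype(np.float32)  # stand-in 1080 frame
    on_panel = downscale(frame, panel_h, panel_w)            # shape (720, 1280)

So more than half the pixels in that "1080" signal get thrown away before they ever hit the glass of a 720p-native panel.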