Seeing the Pixels on Hi-def
Odd... Ever since I saw hi-def TVs in the store, I noticed how bright they were, but oddly enough the picture quality seemed degraded to me because I could see all the pixels... I just assumed maybe I was being picky, that maybe most people don't mind or something.
But I recently discovered that this is not the case. Other people only see hi-def as a bright, vibrant image, and I PREFER less vibrant colors because then graphics don't look so pixelated. I think what's happening is that NT brains automatically smooth out the pixels that my brain sees, and the increase in color gives them just as much room for that compensation, while for me the higher contrast makes the individual blocks stand out more.
Anyone else notice this? It was honestly baffling for me to ask my friend if he could see the obvious pixels on the screen and for him to tell me no. It's not just that I noticed them more. That's what I thought before. He doesn't see them at all. It would be like an NT being baffled at an aspie being unable to see something that is just common sense in the NT world, I guess.
Problem is that a lot of the stuff they show on those TVs is 480p-720p and not 1080p, so it doesn't look its fullest, and the bigger the TV gets, the bigger the pixels get, hence 1080p. For those that don't know, 1080p in computer terms is 1920x1080 pixels, and the "p" means progressive scan, which draws the whole frame at once, as opposed to "i" for interlaced, which alternates half-frames. (Rough numbers below.) And I'm pretty sure everyone sees it like you're explaining it; it's just that the TV looks cool. Technically the picture is better; it's just that they're displaying the wrong things. If you ever see a hi-def/normal split video you can see the difference: the point they focused on looks the same on Blu-ray or DVD, it's just that everything around the focus point is more in focus on Blu-ray, hence the better quality.
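For anyone who wants the raw numbers behind those formats, here's a rough sketch (mine, not from the post; the 42-inch 16:9 screen is an assumed size) of how many pixels each format carries and roughly how wide each pixel ends up on screen:

```python
import math

# Pixel counts per frame for common formats, and the physical width of
# one video pixel on an assumed 42" 16:9 screen. Bigger screen with
# fewer pixels means bigger, more visible blocks.
formats = {"480p": (720, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
diagonal_in = 42.0
width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal

for name, (w, h) in formats.items():
    pitch_mm = width_in / w * 25.4  # width of one video pixel in mm
    print(f"{name}: {w}x{h} = {w*h:,} pixels, ~{pitch_mm:.2f} mm per pixel")
```

Running it shows 480p carries about a sixth as many pixels as 1080p, so each one is stretched much wider on the same screen.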
I think it depends.
Blu-ray is a good combo for HDTV because it is higher resolution than DVD. If you play a DVD on a laptop, you can often see the pixels because a laptop display is much higher resolution than even HDTV. Standard TV resolution is too low for the flaws in DVD media to be visible; HDTV is high enough to make them more noticeable.
The encoding process for DVD is similar to converting a BMP file to JPEG or GIF format. BMP has a ton of image data and a lot can be done with it. JPEG and GIF strip away all the excess info and save just what's needed to maintain the base image.
Enlarge a BMP file and you retain a lot of detail for a while. Enlarge a JPEG or GIF conversion of that file and it quickly pixelates on you.
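A tiny sketch of why that enlargement blocks up (my own toy example; real scalers interpolate, but the principle of having nothing but the source pixels to work from is the same):

```python
# Nearest-neighbour scaling can only repeat pixels, so a 3x upscale
# turns every source pixel into a solid 3x3 block.
def upscale_nearest(image, factor):
    """Repeat each pixel `factor` times horizontally and vertically."""
    out = []
    for row in image:
        stretched = [px for px in row for _ in range(factor)]
        out.extend(list(stretched) for _ in range(factor))
    return out

tiny = [[0, 255], [255, 0]]        # a 2x2 checkerboard "image"
for row in upscale_nearest(tiny, 3):
    print(row)                     # 6x6 grid of obvious 3x3 blocks
```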
I've not noticed the same problem with DVDs on HDTV, but my folks got a 32" widescreen model, and unless you stand right up at the screen, you don't see any real loss. I don't doubt the 52"+ models force you to get Blu-ray to compensate when watching videos.
The problem is the video it's displaying, not the TV.
You see, a video is encoded to DVD in a way that fits the disc and still appears to be great quality. With traditional animation, every frame was drawn in full. With modern animation, redrawing the same thing for frame after frame is thought of as wasted time, so other techniques were developed to overcome this: layers were used so the same background could be reused again and again.
The same is true with DVDs: ONLY the parts of the frame that have changed in the next frame are updated. With high-quality encoding this is done with little give; on low quality, a whole range of values gets declared "the same". On a colour scale of 1 to 10, high quality might register nine of the ten steps as real changes, where low quality lumps most of them together and reuses the old pixel. (A toy sketch of this follows.)
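Here's a toy version of that update-only-what-changed idea (my own sketch; real MPEG encoding works on motion-compensated blocks and DCT coefficients, not single pixels, but the threshold effect is the point):

```python
# Only pixels that changed "enough" between frames get updated.
# A bigger threshold = lower quality = more stale pixels left behind.
def delta_update(prev_frame, next_frame, threshold):
    """Return the displayed frame: keep old pixels whose change is small."""
    shown = []
    for old, new in zip(prev_frame, next_frame):
        shown.append(new if abs(new - old) > threshold else old)
    return shown

prev  = [10, 10, 10, 10]
next_ = [11, 13, 15, 20]

print(delta_update(prev, next_, threshold=1))  # high quality: most pixels update
print(delta_update(prev, next_, threshold=4))  # low quality: old pixels linger
```

With the bigger threshold, parts of the previous frame survive into the next one, which is exactly the leftover-frame artifact described below.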
This technique reads as low quality to me, because sometimes I see parts of one frame left over in the next. Blacks are annoying, and a gradient looks more like steps of colour. And people say DVDs are high quality. They are not. MP3s use a similar type of encoding to save space: parts of the sound are cut from the file, especially anything outside the hearing range of the human ear. I mean, why keep everything a 44 kHz sample rate can capture when cutting down to an 11 kHz range may not be heard as very different, or not sound different at all? (A crude sketch of the idea follows.)
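A crude illustration of that frequency-cutting idea (my own sketch; real MP3 encoding is a perceptual codec and far more involved than a simple low-pass, so treat this as the general concept only):

```python
import numpy as np

rate = 44100                     # samples per second (CD-style rate)
t = np.arange(rate) / rate       # one second of audio
# a 440 Hz tone plus an 18 kHz tone that most adults barely hear
signal = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 18000 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / rate)
spectrum[freqs > 11000] = 0      # throw away everything above 11 kHz
filtered = np.fft.irfft(spectrum)
# `filtered` carries less information to encode; the 440 Hz tone is intact
```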
If you want to test a TV, use a video cassette recorder and a video tape as that would offer the best quality (apart from tape fade)
Digital TV, to me, is the worst thing ever. For digital TV to improve, there needs to be smarter error correction (ECC). I hate having to wait two seconds for the sound to resume every time the signal is interrupted or corrupted.
A single pixel is just one group of RGB elements on a screen. A TV with a resolution of 1024 by 768 playing a video of 512 by 384 will use a 2x2 block of physical pixels for every video pixel (four times as many in total) for the full-screen effect; otherwise the actual video would be smaller and harder to see. This stretching effect also increases the chance of seeing "pixels" or "pixellation". Quick arithmetic below.
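The arithmetic for that example, using the 1024x768 screen and 512x384 video from the post:

```python
# How many physical pixels each video pixel occupies when a 512x384
# video is stretched to fill a 1024x768 screen.
screen_w, screen_h = 1024, 768
video_w, video_h = 512, 384

stretch_x = screen_w / video_w   # 2.0
stretch_y = screen_h / video_h   # 2.0
per_pixel = stretch_x * stretch_y
print(f"each video pixel is drawn as {stretch_x:.0f}x{stretch_y:.0f} "
      f"= {per_pixel:.0f} physical pixels")
```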
Interesting stuff, though I don't really have much context for understanding all the different terms and numbers, so maybe I'm thinking primarily of a certain type of TV, DVD, pixel count, etc. But all I know is that I've been in the store and in people's homes, and every time there's been a big ol' flat screen on the wall playing hi-def cable news (I know the news channels did switch to hi-def), or DVDs, or Xbox 360 games, the pixels are super noticeable to me, more so than on boring old analog TV, to the extent that I actually PREFER the analog, whether it be DVDs, TV, video games, etc. I also know that when I asked my friend to look at the TV and tell me if he saw pixels, he said he saw none whatsoever. Then he switched over to non-hi-def showing the same images to show me the difference, and I could see that the colors were way less bright, but also way smoother.
Maybe it's true that there is some hi-def stuff out there that is better quality than what I'm thinking of, but I'm pretty sure I just have annoyingly high visual acuity. I've been in conversations where people start talking about how awesome hi-def is and I say, "yes, it has some nice colors, but the annoying part is that you can see all the pixels," and they're like, "what are you talking about?"
And then I realize that it's not just that I notice them more and other people less, but that other people don't see them at all, or if they did, they would have to really look for them intentionally. I know that I have never had to think about it to see them; it's the first thing I notice. I personally think it has something to do with the same reason we aspies are not fooled as easily by optical illusions: our brains see what is literally there instead of compensating. And when the colors are brighter, the pixels that are literally there pop out more than the image they make up.
Supposedly, people on the autistic spectrum have higher visual acuity than neurotypicals (something like our eyesight being as acute as a bird of prey's). I feel the same way about CGI. Other people are impressed and I see right through it... all the seams, holes, places where it falls apart.
I still think Brad Pitt in Benjamin Button looked like a puppet.
I feel this way about plasma TVs. I don't get why people are so impressed.
Keith's answer is like what I have heard [but there's a lot more to it].
I have wondered this as well, because on some HDTVs especially, the pixels look blocky.
My sister's boyfriend said it's to do with the analogue or digital signal to the TV, as it's having to be stretched [or something like that? can't remember exactly], so it can look blocky.
I have an HD LCD TV and get a pure, sharp picture when it's in full HD mode; TV channels just look a bit blocky on it, but then they're not an HD signal, and I'm not paying extra to get the one HD channel I would actually be able to watch with subtitles.
He also said it makes a difference what cables you're using: make sure they're good quality. Is it gold-plated? Some particular type... can't remember which. Stupid memory.
What a fun topic!
I don't keep up on how DVDs work. I do know that to turn video into something that will fit onto a DVD (or even Blu-ray), they have to compress the image into something that takes as little space as possible. This always compromises image quality, but given the display technology of the time, nobody really catches the flaws. I suppose if video technology jumps forward again, we'll notice similar flaws in Blu-ray.
I got the MP3 reference, but there is an important thing to remember: MP3 has always been a trade-off.
If you want true audio fidelity, you can keep the bit rate so high that you barely reduce the size of the audio file. If you want to save space, you lower the bitrate. My MP3 collection is done at 128 kbps. Some would say that's bad. If I compared any file against the actual CD on good speakers, I'd agree, but for going about town in the pickup, or on the motorcycle, or at the gym, I really don't notice the degradation that much. Like HDTV, unless you are listening on a quality set of speakers (or headphones), you really don't notice what you're missing. (Quick numbers below.)
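Some back-of-envelope sizes for that trade-off (my own figures; the 4-minute track length is an assumption, and container overhead is ignored):

```python
# MP3 file size in MB for a 4-minute track at various bitrates.
# size = bitrate (bits/s) / 8 bits per byte * duration in seconds
duration_s = 4 * 60
for kbps in (64, 128, 192, 320):
    size_mb = kbps * 1000 / 8 * duration_s / 1_000_000
    print(f"{kbps:3d} kbps -> {size_mb:4.1f} MB")
```

At 128 kbps a 4-minute track lands around 3.8 MB, versus roughly 9.6 MB at 320 kbps, which is the whole space-versus-fidelity bargain in two numbers.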
It's a trade-off. The first "DVD" concept was the Laserdisc, and those suckers were as big as an LP album. It never caught on, but when they thought of making the media as small as a CD and building a player that could read it, DVDs were born.