CCleaner Community Forums

The Great "MYTH" built around LCD & LEDs Screens!



Hello Folks!

 

Going by the textbook information, LCD and LED screens are touted to offer a vivid, lively, and better picture quality!

 

Reality turned out to be in quite stark contrast when, a few years ago, we made the transition from our much-revered Sony CRT monitors/TVs to the "in-trend" branded LCD and LED TV sets!

 

Using a digital signal input of only standard-definition quality, the latter category renders only a pixelated, fuzzy, and smudged picture when viewed from a close distance. Secondly, the "Sharpness" slider appears to make no impact whatsoever on the picture output!

 

The sound output almost always carries an indispensable requirement to route the audio through attached external speakers, as the built-in TV speakers are too meek to produce the desired audible levels!

 

To this date we really do regret our decision to do away with the impeccable quality standards of CRT, in terms of both audio and video, in favor of the hype built around modern-day screens!

 

If I were asked to give a ranking, CRT in my opinion holds a +10, while these "marketing hype" based modern-day screens would fetch only sub-zero marks!

 

Do any of you share the same experience? I would be keen to hear your thoughts!

Thanks.

  • Moderators

What I've noticed is that the screen quality looks excellent if watching actual HD content (HD cable TV/satellite TV, Blu-ray, HD games, etc). However, it looks very bad when watching SD content, for instance an old VHS tape or an old non-HD video game system, where the same content would still look acceptable and good on an old SD TV set.

 

One of my favourite old video games is visually unpleasing on an HDTV: the lighting effects smear with slight ghosting, and the picture looks washed out and fuzzy. However, that same exact game on an old CRT SD TV looks good. While they concentrated on making the HD capabilities look really good, I feel they could have put in some microchip, etc., that could've enhanced old SD content to look as good on an HDTV as it used to on an SDTV, instead of making it look horribly worse. Edit: Oh yeah, the interlacing lines my HDTV shows from the DVD recorder really suck, making it unwatchable.

 

The speakers in every off-the-shelf HDTV I've heard sound horrible and absolutely need external speakers (I'm not talking about the gigantic floor-standing HDTVs, which have the capacity/room for decent speakers inside the casing). This is just a case of HDTVs that can sit on a tabletop, etc., not having a thick enough casing to house decent speakers; loudspeakers need some room in the casing to sound good. There's a mini-jack plugin, so even some inexpensive computer speakers (I got mine for under $60 USD) will make a noticeable improvement in the audio quality, and a good thump can be had if buying a 2.1 system with a sub-woofer. My mother didn't believe I could make her TV shows sound better with external speakers; then she looked stunned at the huge difference in the sound those little computer speakers made.

Edited by Andavari
  • Moderators

I have found flat panel monitors (be it LCD, LED, or plasma) to be much better than CRT.

Like any new monitor, they need tweaking.

It sounds like your resolution may need cranking up.

Don't use standard def; use 1920x1080 resolution or above to get into the high-def range.

Other factors to consider are your video card and monitor size.

 

You talk about sound output; are you using a multifunction monitor, that is, one with a built-in TV tuner?

Backup now & backup often.
It's your digital life - protect it with a backup.
Three things are certain; Birth, Death and loss of data. You control the last.

  • Moderators
Other factors to consider are your video card ...

 

On an HDTV that's used strictly as a TV, and not connected to a computer as a monitor, there won't be a video card whose settings you can play with.


My shared experience is not based on the performance of a single, one-off LCD or LED screen, but on the collective experience I had with three differently branded screens: Samsung and Panasonic LCDs and a Sony LED.

 

Did any of you ever notice the "Sharpness" slider making any sort of impact on the image quality? On the contrary, CRT monitors were responsive to everything from slight to major adjustments of the sharpness setting!

 

I literally mean that so-called standard definition looked perfectly good during the golden days of CRT, and the "planned" handicap of SD only started appearing after the transition to these modern-day screens.

 

As far as the volume parameter is concerned, it has to be turned up to within a range of 80-90 to be fairly audible.

 

Nevertheless, are there any end-user checks to ensure whether the installed dish has really been calibrated to receive the optimum signal quality?

 

What should be the ideal settings/values/ratio for signal strength and signal level for a DTH subscriber to avoid picture pixelation, blurriness, or a smudged or smeared picture?

 

I would also like you to ponder whether these branded companies follow different standards of product quality across varying regions of the world.

 

PS: HD content on National Geographic, Nat Geo Wild, and History TV18 is transmitted only in English. For a completely immersive experience, the world makes more sense to you when it speaks your language :-) Hence this dissuasion from HD! My dear friends, cost factors also need to be considered by people residing in third-world countries! :-)

Edited by saurabhdua

Old games (to use Andavari's example) have TERRIBLE resolutions; the Nintendo 64 (for example) natively supported 256x224, 320x240, and 640x480 (interlaced). When you scale that up to 1920x1080, you're scaling the image up by anywhere from 3x2.25 to 7.5x4.8 larger, depending on the original resolution. If you want to watch SD content, you may have to configure the screen to display the content at a lower resolution to avoid upscaling ruining the look.
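Those scale factors are just the target dimensions divided by the source dimensions; a quick sketch checks them (the 1920x1080 target and the console resolutions come from the discussion above; the helper name is my own):

```python
# Upscale factors when stretching a native console resolution
# onto a 1920x1080 panel.
TARGET_W, TARGET_H = 1920, 1080

def scale_factors(width, height):
    """Return (horizontal, vertical) upscale factors for the target panel."""
    return TARGET_W / width, TARGET_H / height

for w, h in [(256, 224), (320, 240), (640, 480)]:
    sx, sy = scale_factors(w, h)
    print(f"{w}x{h} -> {sx:.2f}x wide, {sy:.2f}x tall")
```

Running this shows 640x480 needs only a 3x2.25 stretch, while 256x224 gets blown up 7.5x horizontally, which is why the blur is so much worse on the oldest content.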

 

One thing I'm not a fan of in modern TVs is frame interpolation; a lot of TVs don't seem to have an option to turn it off, and it looks bad to me. (When things are shot at 30 fps and interpolated up to 60 fps it just looks cheap! And sometimes the interpolated frames aren't accurate, so there are small graphical errors or artifacts because of it.) I just want to see things played back at their native frame rate!
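For what it's worth, the crudest form of frame interpolation is simply blending each pair of adjacent frames to manufacture the in-between frame. Here is a toy sketch of that idea (frames modeled as flat lists of pixel intensities; real TVs use motion estimation, which is exactly where the inaccurate-frame artifacts come from):

```python
def double_frame_rate(frames):
    """Insert a midpoint-blended frame between each pair of source frames,
    turning e.g. 30 fps footage into 60 fps."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        # Naive interpolation: average each pixel of the two frames.
        out.append([(a + b) / 2 for a, b in zip(cur, nxt)])
    out.append(frames[-1])
    return out

clip = [[0, 0, 0], [10, 20, 30]]
print(double_frame_rate(clip))  # middle frame is the blend [5.0, 10.0, 15.0]
```

Even in this toy version you can see why it looks "cheap": the inserted frame is a guess, not something the camera ever captured.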

  • Moderators

The old analogue TV picture was actually better in my opinion than a Standard Definition picture displayed on a modern Hi Def TV.

 

Speaking for the UK by the way where I think our Std-Def TV quality can be pretty poor.

 

However, Hi-Def (1080i or p) is something else, and I'm still as knocked out by it now as I was when I first got the set.


I remember that even in the days of the analog signal, our Sony CRT's picture rendering used to be quite crisp and crystal clear! With the advent of the digital signal it only went on to attain the status of "perfect"!

 

Nowadays, regardless of whatever they mention on the front panel, the final video and audio output of these modern-day "size zero" screens appears very much lull and dull in comparison!

 

I find this situation very similar to the one involving people calling the light of incandescent bulbs far superior and warmer in comparison to prevalent energy-efficient solutions such as CFLs and LEDs!

 

I mean... the performance of the "standard definition" category shouldn't be as bad as the television and broadcasting industry has made it out to be!

 

Did they want to up the ante for the sake of Higher Profits? 

  • Moderators

What should be the 'Ideal' settings/values/ratio in context to Signal Strength & Signal Levels for a DTH Subscriber to avoid the impact of Picture Pixelation, Blurriness, Smudged or Smeared Picture Quality? 

 

Mine by Dish Network automatically adjusts itself for the strongest signal, for instance if it rains long enough to weaken the signal it will literally reboot itself and readjust.

 

And I fully agree with DennisD: old standard-definition content (movies, games, etc.) looks much better on an "old" analog TV ("standard-definition TV"). I know on some gaming forums from years past people used to say the PS2 looked awful graphically; if only they could see those same "awful looking" games on an HDTV set into the proper 480i picture mode, they'd fully witness the true definition of awful looking.

  • 2 weeks later...

It's all about the pixels, folks. More pixels = better image quality. For example, years ago I bought a Sony Cybershot DSC-71 digital camera that has a 3.2-megapixel CCD. At the time, it was considered an above-average camera, since most of the others on the market had 2-2.4-megapixel CCDs. Nowadays, entry-level "point and shoot" digital cameras are 10-12 megapixels, and the high-end DSLRs are 24-36 megapixels.
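A megapixel count is just the sensor's width times its height, in millions of pixels. A quick sketch (the sample resolutions below are my own illustrative picks for typical 3.2 MP and 12 MP sensors, not the actual cameras' specs):

```python
def megapixels(width, height):
    """Pixel count of a sensor frame, in millions."""
    return width * height / 1_000_000

# 2048x1536 is a typical frame size for a ~3.2 MP sensor;
# 4000x3000 works out to exactly 12 MP.
print(f"{megapixels(2048, 1536):.1f} MP")
print(f"{megapixels(4000, 3000):.1f} MP")
```

Note that quadrupling the megapixel count only doubles the linear resolution in each direction, which is why the jump from 3 MP to 12 MP looks smaller in practice than the numbers suggest.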

Start every day with a smile and get it over with. - W.C. Fields

  • Moderators

Nowadays, entry level "point and shoot" digital cameras are 10 - 12 megapixels.

 

I remember Dell gave us a 1.2-megapixel camera with our purchase of a Dell PC back in 2003. I took one picture with it, saw how horrible the quality was, and put it back into the box never to use it again; that was back when film cameras were still considered much higher in image quality.

