Seiki 50" 4k LED TV at TigerDirect 3840x2160 - $1299.99 shipped


Juddog

Diamond Member
Dec 11, 2006
7,851
6
81
To all those saying 1080p is great, etc.: I have old 21" Sony GDM-F monitors that have almost no dot pitch, true black contrast, true color, incredible scan rates, and resolution to the sky, far sharper than ANYTHING built in the last, oh, what, 12 years... which makes everything else look like pasty cardboard cutouts in comparison.

Anyway, OLED seemed like the only thing close to that picture quality, if not the resolution. These supposed HDTVs they have been selling were originally supposed to be in this range when the bandwidth was discussed.

I went to the Sony and Tesla store (rare earth i wonderful) a few months ago and checked out the 4K, and it looked great (at least in detail)... not old-film IMAX (vs. digital IMAX crap) great, but great for the size and for the LCD/LED TV world we live in.

Eww, let's watch a movie on my phone... this MP3 player sounds so good... are you kidding me? I thought we would have better-quality sound and displays by now: realistic, full-spectrum electromagnetic reproduction, not lower quality just shrunk down or blown up.

As people are fond of saying here... where's my holodeck!

I just hope these come out and I can afford one before

A. I go blind, or B. I die.

In summation: it's about time. 1080p sucks on anything bigger than a 15-inch monitor.

Hopefully the thing will come with other ports besides HDMI (I don't care if I've gotta hook up multiple cables if it works, people), but I guess it's an anti-piracy thing, so we the paying consumers will be stuck again.

Also, for those who didn't read it, the box says 120Hz, for what it's worth.

Cheerios, Anandites!

I don't agree at all.

I too have owned an old-school high-end CRT designed for CAD. It used to do 120 Hz at 1600x1200 with a 4:3 ratio (I forget its max resolution offhand).

Major drawbacks versus an LCD / any newer flatscreen:
* It was heavy as f*ck. The thing literally weighed something like 85 lbs. It was also bulky; I couldn't trust it on a flimsy desk. Fortunately, at the time I had a several-hundred-pound oak desk that could handle it. The length of it was also ridiculous.
* While its contrast was great, etc., the image started to wear on my eyes after prolonged use. There is something about LCDs where I can stare at them for hours and not get the eye strain I did from a CRT.
* It drew a ton of power.

So yes, while an old-school CRT may be great in some ways, I have been converted to LCDs for a while now, mostly because of the bulk, power draw, and eye strain issues.

Was the contrast better? Yes
Was the color better? Yes

They had some of these at work that they were literally trying to give away, and nobody wanted them. They ended up going to the recycling company. The other thing is that old-school CRTs used either VGA or BNC connectors, which are a pain to work with compared with the simplicity of DisplayPort (DP).

In regards to the sound quality mention, sound quality nowadays is fine if you:
A) Listen to lossless, or something that approaches the limit of what humans can tell the difference on (e.g., 320 kbps VBR MP3s)
B) Have a good headset / earbuds. If you're listening to tunes on your iPhone and you think the quality sucks, then spring a few bucks for good earbuds instead of the crap Apple ones.

In regards to the bandwidth issues, that's simply a limit of HDMI 1.4 right now. I'm betting the later 4K displays will have either a new version of HDMI or a DisplayPort input (which does have enough bandwidth). I'm betting future versions will have a Thunderbolt input as well, especially once Apple makes its own 4K desktop monitor.
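For the curious, the arithmetic behind that HDMI 1.4 limit is easy to check. Here's a rough Python sketch; the pixel clocks are the standard CEA-861 4K timings, and the link-rate figures are the commonly quoted limits for HDMI 1.4 (340 MHz TMDS) and DisplayPort 1.2 (17.28 Gbit/s effective):

```python
# Why HDMI 1.4 tops out at 4K/30: the pixel clock required for 4K/60
# exceeds its 340 MHz TMDS ceiling, while DisplayPort 1.2 has headroom.

def required_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Raw video bandwidth in Gbit/s for a given pixel clock."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

HDMI_1_4_MAX_TMDS_MHZ = 340    # HDMI 1.4 max TMDS clock
DP_1_2_EFFECTIVE_GBPS = 17.28  # DP 1.2 HBR2, 4 lanes, after 8b/10b coding

# Standard CEA-861 pixel clocks: 297 MHz for 4K/30, 594 MHz for 4K/60
for name, clock_mhz in [("4K @ 30 Hz", 297), ("4K @ 60 Hz", 594)]:
    hdmi_ok = clock_mhz <= HDMI_1_4_MAX_TMDS_MHZ
    dp_ok = required_gbps(clock_mhz) <= DP_1_2_EFFECTIVE_GBPS
    print(f"{name}: {clock_mhz} MHz, {required_gbps(clock_mhz):.2f} Gbit/s"
          f" -> HDMI 1.4 {'ok' if hdmi_ok else 'NO'},"
          f" DP 1.2 {'ok' if dp_ok else 'NO'}")
```

4K/30 needs about 7.1 Gbit/s and squeaks under HDMI 1.4's ceiling; 4K/60 needs roughly double that, which is why it has to wait for a newer HDMI revision or a DisplayPort input.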
 

Sonikku

Lifer
Jun 23, 2005
15,886
4,886
136
The only time a CRT like the Sony FW900 I had wore on my eyes after long viewing was when it somehow got switched to 60 Hz on a PC restart. At 85 Hz it was majestic.
 

mikeford

Diamond Member
Jan 27, 2001
5,671
160
106
Sounds like, bottom line, this is a transitional display: some but not all features of next gen. Maybe that's why it's cheap?
 

houkouonchi

Junior Member
Jan 9, 2008
15
0
0
I don't agree at all.

I too have owned an old-school high-end CRT designed for CAD. It used to do 120 Hz at 1600x1200 with a 4:3 ratio (I forget its max resolution offhand).

I am unaware of any CRTs that could do 120 Hz @ 1600x1200. I had one of the highest-resolution/refresh-rate CRTs ever made (ViewSonic P225f), which could do 2560x1920 @ 63 Hz and 1600x1200 @ 105 Hz. AFAIK no CRT did >110 Hz @ 1600x1200. You sure on that?

Also, I have been using my Seiki as a monitor for quite a while now. I am running @ 24 Hz using an active DP -> HDMI adapter due to broken Linux drivers. Hoping Nvidia will fix their shit soon so I can get the full 30 Hz.
 

Juddog

Diamond Member
Dec 11, 2006
7,851
6
81
I am unaware of any CRTs that could do 120 Hz @ 1600x1200. I had one of the highest-resolution/refresh-rate CRTs ever made (ViewSonic P225f), which could do 2560x1920 @ 63 Hz and 1600x1200 @ 105 Hz. AFAIK no CRT did >110 Hz @ 1600x1200. You sure on that?

Also, I have been using my Seiki as a monitor for quite a while now. I am running @ 24 Hz using an active DP -> HDMI adapter due to broken Linux drivers. Hoping Nvidia will fix their shit soon so I can get the full 30 Hz.

What are your impressions of using it as a monitor?

In terms of CAD CRTs, there were a few able to hit 1600x1200 @ 120 Hz; for example, this one uses it in stereo mode:
http://h18000.www1.hp.com/products/quickspecs/11864_na/11864_na.HTML

Technically, any of the following should be capable of handling 1600x1200 @ 120 Hz in stereo mode:

- Iiyama Vision Master Pro 514 / HM204DT (H-sync: 142 kHz)
- DELL/HP/COMPAQ p1230 (H-sync: 140 kHz)
- LaCie Electron 22blue IV (H-sync: 140 kHz)
- NEC FP2141SB-BK (H-sync: 140 kHz)
- NEC-Mitsubishi Diamond Pro 2070SB-BK (H-sync: 140 kHz)
- NEC-Mitsubishi RDF225WG (H-sync: 140 kHz)
- Sun X7149A (H-sync: 140 kHz)

Grabbed from here:
http://pymol.sourceforge.net/stereo3d.html

One thing to note is that people using the Nvidia drivers generally had issues getting this working, but it seems to work fine on any of the workstation-class ATI (now AMD) cards running stereo output.
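A quick way to sanity-check scan-rate claims like these: a CRT's horizontal scan rate must equal the vertical refresh times the total scanlines per frame (active lines plus vertical blanking). A minimal Python sketch, assuming a typical ~4% blanking overhead (real modelines vary, which is exactly where the disagreement lies):

```python
# CRT timing rule of thumb: H-sync (kHz) = refresh (Hz) x total lines.
# Total lines = active lines + vertical blanking; ~4% overhead assumed.

def required_hsync_khz(refresh_hz, active_lines, blanking=1.04):
    """Horizontal scan rate needed for a given refresh and mode."""
    return refresh_hz * active_lines * blanking / 1000

def max_refresh_hz(hsync_khz, active_lines, blanking=1.04):
    """Highest refresh a tube's H-sync ceiling allows at a given mode."""
    return hsync_khz * 1000 / (active_lines * blanking)

print(f"1600x1200 @ 120 Hz wants ~{required_hsync_khz(120, 1200):.0f} kHz")
print(f"140 kHz tube: ~{max_refresh_hz(140, 1200):.0f} Hz max at 1200 lines")
print(f"127 kHz tube (P225f): ~{max_refresh_hz(127, 1200):.0f} Hz max")
```

By that math a 140 kHz tube lands around 112 Hz at 1200 active lines, so a true 120 Hz there means trimming the blanking interval to almost nothing; presumably that is what the stereo modes relied on.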
 

houkouonchi

Junior Member
Jan 9, 2008
15
0
0
What are your impressions of using it as a monitor?

In terms of CAD CRTs, there were a few able to hit 1600x1200 @ 120 Hz; for example, this one uses it in stereo mode:
http://h18000.www1.hp.com/products/quickspecs/11864_na/11864_na.HTML

Technically, any of the following should be capable of handling 1600x1200 @ 120 Hz in stereo mode:

- Iiyama Vision Master Pro 514 / HM204DT (H-sync: 142 kHz)
- DELL/HP/COMPAQ p1230 (H-sync: 140 kHz)
- LaCie Electron 22blue IV (H-sync: 140 kHz)
- NEC FP2141SB-BK (H-sync: 140 kHz)
- NEC-Mitsubishi Diamond Pro 2070SB-BK (H-sync: 140 kHz)
- NEC-Mitsubishi RDF225WG (H-sync: 140 kHz)
- Sun X7149A (H-sync: 140 kHz)

Grabbed from here:
http://pymol.sourceforge.net/stereo3d.html

One thing to note is that people using the Nvidia drivers generally had issues getting this working, but it seems to work fine on any of the workstation-class ATI (now AMD) cards running stereo output.


I am sure it would be easy to do if you roll your own modelines/custom resolutions, although if it's a >400 MHz pixel clock, that could indeed be a problem. OK, interesting; I wonder if those came out later than the P225f, as I thought that was one of the highest-refresh-rate CRTs in existence when it came out. But alas, it's 127 kHz horizontal vs. the 140 kHz these are, so obviously they're significantly higher bandwidth.


So far I am *really* liking the display. I finally got 30 Hz working in Linux, although Nvidia's implementation of the HDMI 1.4 stuff is braindead, as it has to physically read the EDID off the display to do it (even using a dumped EDID doesn't work), and you can't do any custom modes >165 MHz (this is not a problem on Windows, though).
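For reference, the mode Nvidia insists on reading from the EDID here is just the standard CEA-861 4K timing, and the refresh rate falls straight out of the modeline numbers; a minimal sketch:

```python
# Standard CEA-861 4K HDMI timings: refresh = pixel clock / totals.
# (h_active, h_total, v_active, v_total, pixel clock in MHz)
CEA_4K30 = (3840, 4400, 2160, 2250, 297.0)
CEA_4K24 = (3840, 5500, 2160, 2250, 297.0)

def refresh_hz(mode):
    _, h_tot, _, v_tot, clk_mhz = mode
    return clk_mhz * 1e6 / (h_tot * v_tot)

print(f"4K30 mode: {refresh_hz(CEA_4K30):.1f} Hz")  # -> 30.0 Hz
print(f"4K24 mode: {refresh_hz(CEA_4K24):.1f} Hz")  # -> 24.0 Hz
```

Both modes run a 297 MHz pixel clock, which also shows why the driver's 165 MHz cap on custom modes rules out hand-rolled 4K timings at these rates.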

You get used to a 50-inch display quite fast, to be honest. It would be awesome if it were 60 Hz or higher, but 30 Hz is fine for desktop/video/etc., and I really enjoy the extra desktop real estate. It's a nice display. I figure it's probably going to be a year (or more) before another affordable 4K display comes out that can do 60 Hz (via dual-link DVI, dual dual-link DVI, or DisplayPort). It was worth spending the money now and not having to wait a year.

The Sharp 32-inch that is 3840x2160 and can do 60 Hz is nice, but it's just too expensive ($4,500).
 

mikeford

Diamond Member
Jan 27, 2001
5,671
160
106
A 16-character seven-segment LED display not good enough for ya? *rings bell on walker*
 

AmdEmAll

Diamond Member
Aug 27, 2000
6,699
9
81
I am unaware of any CRTs that could do 120 Hz @ 1600x1200. I had one of the highest-resolution/refresh-rate CRTs ever made (ViewSonic P225f), which could do 2560x1920 @ 63 Hz and 1600x1200 @ 105 Hz. AFAIK no CRT did >110 Hz @ 1600x1200. You sure on that?

Also, I have been using my Seiki as a monitor for quite a while now. I am running @ 24 Hz using an active DP -> HDMI adapter due to broken Linux drivers. Hoping Nvidia will fix their shit soon so I can get the full 30 Hz.

I had a Samsung 900NF that could do 1600x1200 @ 100 Hz... such a great monitor.

Never mind... it seems it only did 87 Hz, which I remember was really nice at the time and hard to find. Holy crap, 120 Hz though; those must have been $$$.
 

wirednuts

Diamond Member
Jan 26, 2007
7,121
4
0
I think a lot of people are missing the idea that the TV accepts a 30 Hz signal at full resolution, max...

The box says 120Hz because it fake-upconverts the image to 120 Hz, just like normal LCD TVs take what's usually a 60 Hz signal and display it at 120 frames per second.

The reason it only takes 30 Hz is, as mentioned earlier, that's all HDMI can handle right now (hello, DisplayPort?).

Almost all movies are recorded at 24 fps, so no matter what, you are always going from a 24 fps image to 120 Hz or whatever the set is ultimately rated for, so what refresh rate the TV accepts shouldn't be that great a concern.
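A quick illustration of the cadence point, in Python: 120 Hz divides evenly into all the common source rates, which is what makes the panel's internal rate mostly a non-issue:

```python
# A 120 Hz panel shows each source frame a whole number of times,
# unlike a 60 Hz panel showing 24 fps film (3:2 pulldown judder).

PANEL_HZ = 120
for source_fps in (24, 30, 60):
    repeats = PANEL_HZ / source_fps
    cadence = "even cadence" if repeats.is_integer() else "uneven pulldown"
    print(f"{source_fps} fps source: each frame shown {repeats:g}x ({cadence})")

print(f"60 Hz panel, 24 fps film: {60 / 24:g}x -> 3:2 pulldown")
```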
 

BenJeremy

Senior member
Oct 31, 2004
718
87
91
It is very cheap, but personally I'd rather have a high-quality 1080p plasma set from a known maker (like a Panasonic ST60) than an off-brand 4K LCD set. To each his own, though . . .

Why are people hung up on the idea of this HDTV as a living room set?

4K screens make great monitors. Beyond that, it's rather pointless... most people CANNOT tell the difference in video quality with full-motion movies or TV shows.

Pixel Fallacy #1
Pixel Fallacy #2

Past 720p, you get diminishing returns. I could tell the difference between 1080p Samsara and 720p Samsara on my 58" plasma, but it would be difficult, and entering the room without being told which one was playing... I probably couldn't tell you either. Considering that Samsara is a movie DESIGNED to highlight detail and visual fidelity, that should say something. An action movie? You'd probably NEVER tell the difference between a 720p version and a 1080p version on that screen from more than 3 ft away.

The jump to 2160p is even less distinctive, particularly when you consider the content you'd be watching.

So, as I said, where these would really matter is for people using them as computer monitors, where they seem to work great, judging from the reviews and videos I've seen. One of these basically replaces four 25" 1080p monitors. Doubling the vertical resolution would be great.

For living room duty, these screens are really only going to be gobbled up by bleeding-edge early adopters who absolutely, positively must have the latest and greatest tech, money no object. Most consumers have no reason to touch these until the cost differential over a 1080p set is less than 20%, and only when they NEED to buy a new set because the old one is broken or no longer suits their needs.

The fidelity offered by 2160p is real, but for all practical purposes useless for movies and TV.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
What's odd is that in your first pixel image I can clearly see banding in the top and bottom bars; the middle one is the smoothest (still some banding) and the bottom one is obviously the worst.

I can't figure out why the middle one looks better. I can only guess they oops'd and swapped the 320 and 640 bars. There also appear to be a few other artifacts in the bars (horizontal lines/bars).

Methinks you have a terrible example here. 720 vs. 1080 is a much smaller resolution difference, so it's harder to identify (no problem for me on a big screen, though). I'd say 40" is the limit for 720.

I have no doubt I'd instantly and easily be able to identify a native 4K image at 50"+ vs. a stretched 1080 image at 50"+.
 

Apex

Diamond Member
Oct 11, 1999
6,511
1
71
www.gotapex.com
Methinks you have a terrible example here. 720 vs. 1080 is a much smaller resolution difference, so it's harder to identify (no problem for me on a big screen, though). I'd say 40" is the limit for 720.

I have no doubt I'd instantly and easily be able to identify a native 4K image at 50"+ vs. a stretched 1080 image at 50"+.

The issue is not so much size related as angle related (size in relation to viewing distance). There's really no one-size-fits-all for every viewing condition and preference.

http://s3.carltonbale.com/resolution_chart.html

Given a 50" screen, the benefits of 4k should start to become visible somewhere around 5' away, given 20/20 vision. That's actually reasonable. I'm at about 12' away from my 100" projector, and that's right at the cusp of starting to reap the benefits of higher than 1080p resolution.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
The issue is not so much size related as angle related (size in relation to viewing distance). There's really no one-size-fits-all for every viewing condition and preference.

http://s3.carltonbale.com/resolution_chart.html

Given a 50" screen, the benefits of 4k should start to become visible somewhere around 5' away, given 20/20 vision. That's actually reasonable. I'm at about 12' away from my 100" projector, and that's right at the cusp of starting to reap the benefits of higher than 1080p resolution.

Yeah, I would probably agree with this... but even then, I think this chart can vary depending on the individual.

Here's an interesting image I found on google:

[image: comparison of 1080p Full HD vs. 720p HD Ready for an HDTV]


For me, many larger (40"+) TVs actually look like the 720 image there (even at 1080p), like I'm seeing the large pixels. But most places I've been also don't have huge rooms to sit crazy far back in. And why would I want to? I prefer to have the TV fill as much of my vision as possible.

I normally sit about 2 feet away from a 1920x1440 screen (4:3 CRT). My friend has a nice 32" 1080p IPS I picked out for him, and even after I calibrated it, my CRT is just sharper and better (side by side). The bigger and bigger screens just look worse and worse to me. And many have horrid colors, contrast, backlight bleed, and other issues that just make watching them far less pleasant.

Yes, old TVs had low resolution... that's no excuse to stall progress. 1080p has been around far too long, and has even taken over (and in my opinion destroyed) the desktop monitor market, even as screen sizes have grown massively since the old CRT TVs.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
The problem with using anything as a TV is that compression dictates the quality far more than resolution does. Compression artifacts are everywhere.

You can see the issue in the images that BenJeremy linked to... They're trying to show issues with different pixel densities, but what I noticed first in the color spectrum pictures is that ALL THREE show severe JPEG artifacts around the text and right where the color bands border the white area, because they saved it as a compressed image... DERP.

Broadcast/streaming size isn't what matters to people viewing primarily from local media, like Blu-ray Discs and such; that is usually relatively low compression and high quality... but I'd consider those people the minority.

What matters most to broadcast and streaming companies is total compressed size, since that translates into the cost of maintaining the storage farm. With that considered, overall quality for a given streaming bandwidth is going to be better with 1080p, or possibly even 720p, than with 4K.

Until we get home bandwidth up to levels that can easily support 4K streaming at LOW compression, there's no point in migrating. We aren't even at the point where we can consistently rely on 1080p with low compression artifacts. I see compression artifacts quite regularly in broadcast HD sports; hockey is borderline unwatchable on some HD implementations, and I notice them sometimes even in basketball. Sometimes it makes me wish we still had analog SD. No compression artifacts means you can actually enjoy fast-action sports.
Until we get home bandwidth up to levels that can easily support 4k streaming at LOW compression, there's no point in migration. We aren't even at the point where we can consistently rely on having 1080p at low compression artifacts. I see compression artifacts quite regularly in broadcast HD sports. Hockey is borderline unwatchable on some HD implementations and I notice them sometimes even in basketball. Sometimes it makes me wish we still had analog SD. No compression artifacts mean you can actually enjoy fast action sports.