We built a couple of Haswell Pentium G3220 systems to drive 4K TV monitors for an HD surveillance system we installed in a large processing plant: 60 5MP cameras (2560x1440). It was a shot in the dark, and I missed badly. The client software, which displays anywhere from 1 to 16 tiled cameras, renders everything in software, so the dual video cards are only useful for outputting a 4K signal to the TVs.
The pictures look amazing, but with anything over 4 cameras it starts dropping frames. When I looked at CPU usage: 1 camera uses 25%, 4 cameras 60%, 5 cameras 80%, 7 cameras 95%, and anything more pegs it at 100%. They want to see 24 cameras at a time, 12 on each TV, so the little G3220 is maxed out all the time. I heard her crying in the server closet.
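For a rough sense of scale, here's my back-of-envelope extrapolation from those CPU figures (a sketch assuming the per-tile load stays roughly constant as tiles are added, which those numbers only loosely support, not a benchmark):

```python
# Rough extrapolation from the CPU figures above -- a sketch, not a benchmark.
observed = [(1, 25), (4, 60), (5, 80), (7, 95)]  # (cameras, total CPU %) on the G3220

(c0, u0), (c1, u1) = observed[0], observed[-1]
pct_per_camera = (u1 - u0) / (c1 - c0)   # ~11.7% of the whole CPU per extra tile
core_equiv = pct_per_camera / 100 * 2    # G3220 = 2 cores, no Hyper-Threading

cameras = 24
total_pct = u0 + pct_per_camera * (cameras - 1)
print(f"~{pct_per_camera:.1f}% CPU per extra camera (~{core_equiv:.2f} cores each)")
print(f"{cameras} cameras -> ~{total_pct:.0f}% of a G3220, "
      f"about {total_pct / 100 * 2:.1f} G3220 cores' worth of decode")
```

By that crude math, 24 tiles want roughly six G3220-class cores' worth of software decode, so both per-core speed and core count are going to matter.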
The software engineer at the company that makes the recording and client software says they have never tested anything at these resolutions, but the software will use one core/thread per camera tile until it has used all available cores, and then it starts sharing them.
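For what it's worth, here's how I picture the model he described, one decode thread per visible tile (a hypothetical sketch of the scheme, not the vendor's actual code):

```python
# Hypothetical sketch of the "one thread per camera tile" model the
# engineer described -- not the vendor's actual code.
import threading
import queue

def decode(encoded: bytes) -> None:
    # Stand-in for the real CPU-bound software decoder. A native
    # decoder releases Python's GIL, so each thread can genuinely
    # occupy its own core.
    pass

def decode_loop(camera_id: int, frames: "queue.Queue[bytes | None]") -> None:
    """Pull encoded frames for one tile and software-decode them."""
    while True:
        encoded = frames.get()
        if encoded is None:   # shutdown sentinel
            break
        decode(encoded)       # at 2560x1440 this wants a core of its own

def start_tiles(num_cameras: int) -> list["queue.Queue[bytes | None]"]:
    """Spawn one decode thread per visible tile. Once tiles outnumber
    cores, the OS time-slices ("shares") cores and frames get dropped."""
    inputs = []
    for cam in range(num_cameras):
        q: queue.Queue = queue.Queue(maxsize=4)
        threading.Thread(target=decode_loop, args=(cam, q), daemon=True).start()
        inputs.append(q)
    return inputs

queues = start_tiles(24)   # 24 tiles -> 24 decode threads
```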
Question: will I be better off with a Core i7-4770 (4 cores/8 threads) or an AMD FX-8350 (8 discrete cores)?