
Today marks the first time I've broken the 1GB/s barrier

http://i.imgur.com/CSLRu.png

Using a pair of Angelbird Wings cards (32GB + Lite) and eight 60GB SandForce SF-1200 SATA-II SSDs. With the OS on the striped RAID I get about 1GB/s; if I put the OS on the onboard 32GB SSD and use the RAID as a data drive, I get about 1470MB/s read (tested empty). The cards are basically Marvell SATA 3Gb/s controller cards with an optional embedded SSD (around 90MB/s read) for holding the OS or install files; you can link them across the PCIe bus via software RAID and add as many cards as your board can handle. I'm using OS X, which has extremely reliable software RAID...the results are staggering. Quite a jump from my previous 250MB/s single SSD, lol.
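For anyone checking the math, here's a rough sketch of why eight striped SATA-II drives land in that neighborhood. The per-drive read speed is my assumption, inferred from the totals above, not a spec-sheet number:

```python
# Rough sketch: why eight striped SATA-II SSDs land near 1.5GB/s.
# PER_DRIVE_MBPS is an assumed figure inferred from the totals above.
DRIVES = 8
PER_DRIVE_MBPS = 185        # assumed SF-1200 sequential read, MB/s
SATA2_LIMIT_MBPS = 300      # SATA 3Gb/s per-port ceiling after 8b/10b encoding

per_drive = min(PER_DRIVE_MBPS, SATA2_LIMIT_MBPS)
print(f"Ideal striped read: {DRIVES * per_drive} MB/s")  # 1480 MB/s vs ~1470 measured
```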

Pretty great day in nerdland for me, just wanted to share :biggrin:
 
Nice, let us know how the Sandforce drives hold up. 🙂

I'm using the Angelbird-brand SSDs for this setup. First time using them. So far, so good!

I've had the best luck with OWC-brand (MacSales.com) SSDs. Had a few DOAs, but their customer service is extremely good and I got replacements within a week. I've installed around 20 so far without any issues other than the DOAs (fingers crossed).

I've had the worst luck with OCZ-brand SSDs. Not just DOAs, but drives dying randomly. Just had one die about a week ago - two months old, in a senior manager's laptop, in the middle of an executive meeting. Completely unrecoverable. Maddening.

G.Skill Phoenix has been another good brand for me, although they are not making SATA-III drives yet. Really enjoying the SATA-III speeds on the new rigs - the OWC drives I get clock in at about 490 MB/s actual tested speed, which is pretty dang zippy.
 
Congratulations.
Sorry, but I have to say:
random 4K read/write performance >>>>>>>>>>> sequential speed 😉
...except for video editing.
 

This system was built primarily for video editing 😉

I'm happy to run any test you'd like & post the results, though - testing various install setups & then putting it into production on Monday :biggrin:
 
AVCHD at the highest consumer bitrate is only 24Mbps, or about 3MB/s, perhaps 3.5MB/s with audio. Any hard drive can handle 5 or 10 streams of that.
Even fully uncompressed video is doable with conventional hard drives, and it's unlikely you are editing 4:4:4 uncompressed 1080p video.

The only situation I could see requiring mega bandwidth video-wise would be editing uncompressed 4:4:4 4K video streams.

Bottom line: don't worry about your storage subsystem for video editing. Your CPU (or your GPU, if your NLE is accelerated) will be the bottleneck of your system.

Hard drives haven't been an issue for video editing for about 10 years now.
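The arithmetic behind those claims is easy to verify. A quick sketch - the frame sizes and rates are standard figures, not from any particular project:

```python
# Checking the bandwidth claims above; all figures are standard video math.
def uncompressed_mbs(width, height, fps, bytes_per_pixel=3):
    """MB/s for uncompressed 4:4:4 video at 8 bits per channel."""
    return width * height * bytes_per_pixel * fps / 1e6

print(24 / 8, "MB/s")                            # AVCHD 24Mbps ~= 3 MB/s
print(uncompressed_mbs(1920, 1080, 30), "MB/s")  # 1080p30 4:4:4 ~= 187 MB/s
print(uncompressed_mbs(4096, 2160, 30), "MB/s")  # 4K30 4:4:4 ~= 796 MB/s
```

Even a few streams of that 4K figure is the only case that pushes past what a conventional drive array can sustain.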
 
While some of that is true... I see very dramatic gains in my usage for graphics and video edits from a 6-drive SSD array versus my 6-drive HDD array.

Does the actual encoding process itself change much? Not really... but everything leading up to it and after it is far beyond an HDD's capabilities.
 

Especially for round-tripping...Final Cut Pro to Resolve to After Effects to Nuke and back. Fast boot, fast application opening, fast loading & saving of small & large files...lovin' it :biggrin:
 
Application loading can be handled by a single SSD boot drive.
Everything else is CPU- or GPU-bound when video editing - unless you include copying one large video file from one drive to another for whatever reason.

But hey, if the big SSD array makes you feel good, then go for it!
 

Application loading isn't the only thing, though. In video editing, I send files from the encoder to the editor to the color-grading app to the special-effects app and back again. Open, edit, save, close, rinse, repeat. It's nice having a speedy scratch drive for projects 🙂
 
http://imgur.com/Dm7Fc

Bleh..... RAMdrives, as you said in your review

😀...

You probably have >300GB of SSD space, right? Get 8x8GB = 64GB of RAM, take away 16GB for the system, and use 48GB for your workspace... 8 x $200 per 8GB DIMM = $1,600.

Just a thought if you need even faster speed and can live with a smaller scratch drive.

Wish I had the money for 8x60GB SSDs

As for the dramatic gains over the HDD array: possibly because some other application is using the HDD array at the same time, and HDDs don't do well bouncing the drive heads between two different file reads (compared to SSDs).
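A toy model of that head-thrashing effect - the seek and transfer numbers are assumed round figures, not measurements:

```python
# Toy model of head thrashing: reading two files in alternating chunks
# forces a seek between every chunk on an HDD. Numbers are assumed
# round figures for illustration, not measurements.
def effective_mbs(chunk_mb, seek_ms, transfer_mbs):
    transfer_s = chunk_mb / transfer_mbs
    return chunk_mb / (seek_ms / 1000 + transfer_s)

# 1MB chunks, alternating between two files:
print(effective_mbs(1, seek_ms=10, transfer_mbs=120))   # HDD: ~55 MB/s (vs 120 sequential)
print(effective_mbs(1, seek_ms=0.1, transfer_mbs=250))  # SSD: ~244 MB/s, barely affected
```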
 
lol.. I have figured out how to test hardware performance by now, through establishing proper baselines. I guess seeing is believing for some folks.

And if application loading were the only thing I did during a work session, then I wouldn't have an 8-drive HDD array matched up to my 6-drive SSD array; single drives would be good enough. However, after anyone uses this setup for a few minutes, they quickly get spoiled, and even the ones with SSDs in their own systems feel shortchanged. It's not just about the benchmarks, you know.
 
That is true... a great deal of it is the "feeling" of the system being responsive...
 

Yeah, I was originally looking at doing an eVGA SR-2 system - unofficially, it supports 96GB of RAM. So say a 60GB RAMdisk with 36GB left over for the system. But 60GB is only an hour of ProRes footage at 1GB/min, unfortunately. The SSD setup I'm testing now has 480GB, which is 3-4 hours of footage after formatting. The 480GB 1GB/s array cost about $2,000, while the 96GB RAM kit is around $1,600 plus the cost of Xeon chips and a $600 motherboard, and it requires a bit more management (volatile storage, backing up to an image, and all that) with less space.

Although the new Westmere-EX Supermicro server boards support 2TB of RAM...a little out of my price range tho :biggrin:
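For what it's worth, here's the cost-per-usable-GB comparison using the figures above (ignoring the Xeon and motherboard costs the RAM option would also need):

```python
# Cost per usable GB of scratch space, using the figures quoted above.
options = {
    "96GB RAM kit (60GB RAMdisk)": (1600, 60),    # price in $, usable GB
    "8x60GB SSD array (formatted)": (2000, 480),
}
for name, (price, usable_gb) in options.items():
    print(f"{name}: ${price / usable_gb:.2f}/GB")
# RAMdisk: ~$26.67/GB; SSD array: ~$4.17/GB (and it isn't volatile)
```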
 