
Recommendations for serving static content and dynamically resized images?

fuzzybabybunny

Moderator, Digital & Video Cameras
I've got a site that's heavy on the images. I only know the old-school way of serving this kind of content: take photos and resize/crop them down to the exact dimensions I want and choose a good compression ratio, then put them in /assets/images on the server and link them on the site.

There have got to be better ways to do this.

Hosting from Amazon S3? What about having the Amazon server dynamically resize/crop the images so that all I have to do is obtain images of one size instead of manually resizing them myself? I've seen websites that seem to pass a parameter or something in the image file name, and you can change the numbers to get back an image of a different size.
 
First I think you'd want to determine your caching policy.

Should a resized image be saved on the server or regenerated on every request? If saved, for how long? Saved across multiple servers, or should each server have its own cache?

There are lots of ways of doing this. You can use a standard web app that spits back images for certain requests, or even override the standard image handlers so that you can still use a .jpg/.gif extension but with querystring params. Personally I think this is a pain in the ass (you have to tinker with server settings, MIME types, etc.) and would only do it if I were trying to make something very image-specific like Imgur.
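A minimal sketch of the querystring approach described above, in plain Node-style JavaScript. The allowed widths and the request shape are assumptions, not anything from a real site; the point is that you whitelist sizes up front, so a client can't request arbitrary dimensions and burn CPU or blow up your cache:

```javascript
// Hypothetical sketch: parse an image request like "/images/photo.jpg?w=500"
// and snap the requested width to a whitelist. Anything the resizer is asked
// for comes from this small, fixed set of sizes.
const ALLOWED_WIDTHS = [200, 500, 1024]; // assumption: the sizes you support

function parseImageRequest(rawUrl) {
  const url = new URL(rawUrl, "http://example.test"); // base only for parsing
  const requested = parseInt(url.searchParams.get("w"), 10) || ALLOWED_WIDTHS[0];
  // Snap to the smallest allowed width that covers the request;
  // oversized requests fall back to the largest preset.
  const width = ALLOWED_WIDTHS.find((w) => w >= requested)
    || ALLOWED_WIDTHS[ALLOWED_WIDTHS.length - 1];
  return { path: url.pathname, width };
}
```

A real handler would hand `path` and `width` to the resizer and set the response's content type; the whitelist is what keeps the scheme from being a denial-of-service foot-gun.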

I've seen various apps do this on cloud services like S3 and Azure, usually offloading the actual resizing process to worker roles. Then other worker roles that do housekeeping on your cache.
 
For Amazon, note that S3 buckets are just static storage, so you can't run script code there. You'd need EC2 server instance(s) running your scripts, and you'd ask them for the resized images, not S3.

Or only support some pre-set sizes and have some job that takes image X.png and spits out X_640_480.png, X_1024_768.png, etc. as separate static files in S3.
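The pre-set-size job above mostly comes down to generating predictable keys. A small sketch, with made-up sizes, of what that naming step might look like:

```javascript
// Hypothetical sketch: given an original key and a list of pre-set sizes,
// produce the S3 keys a batch job would write ("X_640_480.png" style).
const PRESET_SIZES = [[640, 480], [1024, 768]]; // assumption: your presets

function variantKeys(originalKey, sizes = PRESET_SIZES) {
  const dot = originalKey.lastIndexOf(".");
  const base = originalKey.slice(0, dot);
  const ext = originalKey.slice(dot); // includes the "."
  return sizes.map(([w, h]) => `${base}_${w}_${h}${ext}`);
}
```

Because the names are deterministic, the web page can build them without asking a server anything; the job just has to guarantee every variant exists.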
 
Or only support some pre-set sizes and have some job that takes image X.png and spits out X_640_480.png, X_1024_768.png, etc. as separate static files in S3.
Nice things about pre-set sizes:

- You don't have to work on caching. Just make 'em and save 'em.
- You can take more time to compress them better and save bandwidth.
 
If you're using Rails there are two gems that might interest you.

paperclip and carrierwave

My experience has all been with paperclip but it sort of does most of what you want.

- Allows you to process images on your servers using ImageMagick after they're uploaded through a web form
- Saves all processed + original images to whatever storage system you choose (S3, local filesystem, etc.)
- Easy helper methods to create various URLs to retrieve the different versions of your images

For low-traffic sites the default setup should work pretty OK serving directly from S3 buckets, but if you want to speed things up and reduce costs a bit you should look into setting up a CDN to cache your S3 buckets. Obviously CloudFront works the easiest out of the box with S3, but you can certainly use other providers with a more manual setup involved.
 
Thanks. I'm doing MeteorJS exclusively, so no RoR. But there is a package I found that's based on ImageMagick that'll work. I'll see if they offer similar features to Paperclip and Carrierwave.

I'm extremely fuzzy on how CDNs and caching work. I've never used CDNs beyond pulling in jQuery and stuff. Can anyone point me to some good tutorials?

The way I figure, there are a few approaches I could take if I only provide one image at, say, 4000 x 3000 and I want it at different resolutions and sizes:

Option 1:

When a user makes a request for /assets/images/my-image.jpg?w=500 the server reads that parameter and resizes / compresses the image down to that particular width via a script.

I don't know much about cost and speed for web hosting, but I would imagine that this would slow down website loading plus increase the cost of hosting since more CPU is being used? So this would be the worst way to go about this?
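The CPU cost of Option 1 is usually tamed by putting a cache in front of the resizer, so the expensive work runs once per size, not once per request. A sketch of that idea, with `resizeImage` as a stand-in for a real ImageMagick call (everything here is illustrative, not a real API):

```javascript
// Hypothetical sketch of Option 1 with a cache in front: the expensive
// resize runs only on a cache miss; repeat requests cost a map lookup.
const cache = new Map();
let resizeCount = 0; // tracks how often the expensive path actually runs

function resizeImage(name, width) {
  resizeCount += 1; // pretend this shells out to ImageMagick
  return `<${name} resized to ${width}px>`; // placeholder for image bytes
}

function getImage(name, width) {
  const key = `${name}?w=${width}`;
  if (!cache.has(key)) cache.set(key, resizeImage(name, width));
  return cache.get(key);
}
```

This is what the earlier "caching policy" question is really about: with the cache, the first request for each size pays the CPU cost and everyone after gets it nearly free, which is why Option 1 isn't necessarily the worst choice.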

Option 2:

I pre-determine the sizes I need, and I upload my-image.jpg to the server via a web form and the script automatically resizes and compresses the image into different versions and saves them on my server. And the script / package gives me helper methods for retrieving those different images.

Option 3:

MongoDB allows storage of large files in its database via the GridFS filesystem. It breaks large files down into chunks and there are helper methods for retrieving the file back.

I could pre-determine the sizes I need and when I upload my-image.jpg via the web form, a script automatically resizes and compresses the image into different versions and saves them to MongoDB's GridFS. And the script / package gives me helper methods for retrieving those different images.

I know nothing about the performance of GridFS. Is it generally a very bad idea to use a NoSQL database for storage and retrieval of media items? And why?
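For context on what GridFS actually does: it splits a file into fixed-size chunks (255 KB by default in current MongoDB drivers) stored as separate documents, plus a metadata document. A toy illustration of that chunking step; real code would use a driver's GridFSBucket rather than doing this by hand:

```javascript
// Illustration only: split a buffer into GridFS-style fixed-size chunks.
// 255 KB is the default chunk size in current MongoDB drivers.
const CHUNK_SIZE = 255 * 1024;

function toChunks(buf, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let n = 0; n * chunkSize < buf.length; n++) {
    chunks.push({ n, data: buf.slice(n * chunkSize, (n + 1) * chunkSize) });
  }
  return chunks;
}
```

The performance question mostly comes down to this: every image read is reassembled from chunk documents through the database, instead of being a straight static-file read that a web server or CDN can serve and cache directly.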

Option 4:

Use a CDN to cache S3 buckets. I have no idea how this works.
 
You can think of an Amazon S3 bucket as FTP storage space that can also be fetched directly via HTTP/S. If you install the free Cloudberry Explorer you can also treat it like a remote hard drive; it has sort of a 2-pane Windows Explorer view where you "copy" files. It's pure storage, not a server.

To include an image file stored in S3 on your web page, it's just the normal static src="http:// ---amazon domain-- /bucket-folder/bucket-subfolder/image.png"

So for resizing files, you'd either need a separate server that makes the images and then sends them to S3 storage using Amazon API libraries, or do it on your PC and then upload with Cloudberry, CrossFTP, etc.

---

CloudFlare can be added later. To set it up, you just map an S3 bucket to a CloudFlare subdomain. You can use both at once for the same storage:

Fetched directly from S3 bucket:

img src="http:// ---amazon domain-- /bucket-folder/bucket-subfolder/image1.png"

Fetched via the CDN instead:

img src="http:// ---cloudflare domain-- /bucket-folder/bucket-subfolder/image1.png"
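Since only the host differs between the two URLs above, switching between direct-S3 and CDN fetches is a one-line rewrite. A sketch with made-up domain names (both hosts here are assumptions, not real endpoints):

```javascript
// Hypothetical sketch of the S3 -> CDN mapping: the path stays the same,
// only the host changes. Domain names are made up for illustration.
const S3_HOST = "mybucket.s3.amazonaws.com"; // assumption
const CDN_HOST = "images.example-cdn.com";   // assumption

function toCdnUrl(s3Url) {
  const url = new URL(s3Url);
  if (url.host === S3_HOST) url.host = CDN_HOST;
  return url.toString();
}
```

This also means you can roll the CDN out gradually, or fall back to direct S3 URLs, just by choosing which host you emit in your templates.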

One thing I haven't needed to deal with at work is leeching, since I just use it for application patches. For images, I'm not sure what options S3 and CloudFlare have for checking the Referer header.
 