Prefetching algorithms?

stuckinasquare3

Senior member
Feb 8, 2008
Can someone point me to a good resource?

Let's say I have an app and a user could scroll through my app very quickly. As the user scrolls I want to bring in content and display it as quickly as possible. To improve speed, I'd want to prefetch and cache some upcoming content. I'd like to read up on some good algorithms for handling this.
 

nForce2

Senior member
Aug 15, 2013
Can someone point me to a good resource?

Let's say I have an app and a user could scroll through my app very quickly. As the user scrolls I want to bring in content and display it as quickly as possible. To improve speed, I'd want to prefetch and cache some upcoming content. I'd like to read up on some good algorithms for handling this.

I think you will find that the algorithms used will vary substantially based on the type of content you are displaying and the system constraints you are working within...


For example, if your application is an image viewer, the image files are local, and you have a 3GHz quad-core processor with 8GB of RAM... then you might have a few threads reading ahead (and behind) a few frames, but no need to do anything beyond that because new images can be loaded so quickly that there is no need to pre-fetch beyond 2 or 3 images out.
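A rough sketch of that read-ahead idea (Python, with a made-up load_image() standing in for whatever decodes one frame from disk — names and window sizes are illustrative, not prescriptive):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def load_image(index):
    # Hypothetical loader; replace with real decoding.
    return f"decoded-image-{index}"

class ReadAheadCache:
    """Keeps a small window of decoded images around the current position."""

    def __init__(self, loader, ahead=3, behind=2):
        self.loader = loader
        self.ahead = ahead
        self.behind = behind
        self.cache = {}
        self.lock = threading.Lock()
        self.pool = ThreadPoolExecutor(max_workers=2)

    def _fetch(self, index):
        image = self.loader(index)
        with self.lock:
            self.cache[index] = image

    def move_to(self, current):
        wanted = range(max(0, current - self.behind), current + self.ahead + 1)
        with self.lock:
            # Purge anything outside the window, then note what's missing.
            for k in [k for k in self.cache if k not in wanted]:
                del self.cache[k]
            missing = [i for i in wanted if i not in self.cache]
        for i in missing:
            self.pool.submit(self._fetch, i)

    def get(self, index):
        with self.lock:
            if index in self.cache:
                return self.cache[index]
        # Cache miss (prefetch hasn't caught up): load synchronously.
        return self.loader(index)
```

On fast local storage the window stays tiny; get() simply falls back to a synchronous load on a miss, so a prefetch that hasn't landed yet never blocks the user.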

Change the constraints a touch and lower the CPU to a 1GHz single-core... and you might pre-fetch significantly further out so that the user can flip through them quickly, and you can cache as much as you want because you have tons of RAM.... but at the same time, your pre-fetch also must throttle itself because it shares that single CPU core with the main thread and you can't afford to slow the rest of the app down for the current image just to read ahead.
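The throttling part can be as simple as the prefetch loop checking a flag between items and yielding the core — a minimal sketch, assuming the main thread sets/clears a "UI idle" event around its own work:

```python
import threading
import time

def prefetch_worker(indices, loader, cache, ui_idle, stop):
    """Loads items one at a time, pausing whenever the UI is busy."""
    for i in indices:
        if stop.is_set():
            break               # app shut the prefetcher down
        ui_idle.wait()          # block while the main thread needs the core
        cache[i] = loader(i)
        time.sleep(0)           # yield between items so we never hog the CPU
```

The main thread clears ui_idle before heavy work and sets it again afterward; on a single core that's often enough to keep the foreground responsive.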

Change the constraints again so that the images are on a network - let's say a cellular network. In this case, you don't want to fetch too far ahead, even if you can (and even if it would help the user experience), because that would incur high network usage and the associated bandwidth costs for your users - even for images the user never looks at, precisely because you're fetching ahead. Mobile users wouldn't like that. ;)
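That usually means the prefetch depth itself becomes a function of the connection — the numbers here are made up for illustration, not recommendations:

```python
def prefetch_budget(network_type):
    """How many items ahead to prefetch, by link type (illustrative values)."""
    budgets = {
        "wifi": 10,      # bandwidth is effectively free: fetch well ahead
        "cellular": 2,   # data costs the user money: stay conservative
        "metered": 0,    # user opted out of background data: fetch nothing
    }
    return budgets.get(network_type, 1)  # unknown link: be cautious
```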

Change the constraints so that the images are on a slow network, and the solution changes yet again. You have to start pre-fetching earlier to offset the network slowness, which likely leads to parallel fetching... but at the same time you have to be ready to terminate the fetching immediately if the user requests something else to view... and you have to be ready to dynamically balance the parallel fetches so that if a user scrolls to one that is only partially retrieved, the other items actively being retrieved can be throttled back to prioritize the active image.
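One common way to get that "terminate or throttle mid-transfer" behavior is to download in chunks and check flags between chunks — a sketch, assuming `chunks` is an iterable of byte blocks from whatever HTTP client you use:

```python
import threading

def fetch_chunks(chunks, deliver, cancel, go):
    """Downloads chunk by chunk so the transfer can stop or pause mid-flight.

    cancel: set when the user navigates away - abandon the transfer.
    go:     cleared to pause this fetch while a higher-priority one runs,
            set again to resume at full speed.
    """
    for chunk in chunks:
        if cancel.is_set():
            return False        # abandoned: caller can discard partial data
        go.wait()               # paused while the active image gets priority
        deliver(chunk)
    return True                 # transfer completed
```

A scheduler that owns several of these can clear the `go` event on background fetches the moment the user lands on a partially retrieved item, and set `cancel` on everything when the user jumps elsewhere.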

Now let's say the images are local again, but memory is scarce. Say an older device or something embedded, with 128 MB of RAM. Even if you want to cache ahead to make things faster, you might not be able to because you simply run out of RAM. In this case, you might even have to get aggressive with purging items, and start paying attention to the user's actions, inputs, and predicted intent... to make more intelligent decisions about how big the buffers should be for forward vs backward browsing.
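The usual shape for that is an LRU cache budgeted in bytes rather than item count, plus a buffer split that favors the direction the user is scrolling — a rough sketch, with the 3/4 split as a made-up heuristic:

```python
from collections import OrderedDict

class BoundedCache:
    """LRU cache that evicts by total byte size, for memory-scarce devices."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.items = OrderedDict()   # key -> (data, size), oldest first

    def put(self, key, data, size):
        if key in self.items:
            self.used -= self.items.pop(key)[1]
        self.items[key] = (data, size)
        self.used += size
        while self.used > self.max_bytes:
            _, (_, s) = self.items.popitem(last=False)  # evict least recent
            self.used -= s

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # mark as recently used
        return self.items[key][0]

def window(current, direction, budget_items):
    """Give ~3/4 of the buffer to the direction the user is scrolling."""
    fwd = budget_items * 3 // 4 if direction >= 0 else budget_items // 4
    back = budget_items - fwd
    return range(current - back, current + fwd + 1)
```

With 128 MB total you might only afford a handful of decoded images, so the eviction order and the forward/backward split end up mattering far more than raw prefetch distance.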


Sooo.... long story short, if you search for them, I'm sure you will find some fantastic predictive caching algorithms. But you'll have to really narrow down your use cases first, as there definitely won't be a one-size-fits-all answer. ;)