I'm developing a web app. One frequently used page needs to run a complex query involving many joins, aggregations, and several intermediate temporary tables. The problem is that this can potentially be long-running.
Fortunately, it isn't strictly required for the data used for this query to be absolutely up to date, so I've been wondering about caching some pre-processed data and doing the final aggregations on demand.
Some experiments have run into problems though:
A - temporary tables: these are dropped when the connection closes, and since I don't maintain persistent DB connections, this is no use to me
B - permanent tables: these work fine (a sketch of what I'm doing is below), but the transaction log grows alarmingly quickly
C - permanent tables in tempdb: this requires setting up permissions on tempdb every time the server restarts, since by default only administrators have CREATE TABLE access to tempdb
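For what it's worth, here's roughly what I mean by option B: a minimal sketch of the cache-refresh step, with placeholder table and column names (SalesSummaryCache, Orders, etc. are not my real schema):

    -- Refresh a permanent cache table with the pre-processed data.
    -- All object names here are placeholders, not my real schema.
    TRUNCATE TABLE dbo.SalesSummaryCache;

    INSERT INTO dbo.SalesSummaryCache (CustomerId, OrderMonth, OrderTotal)
    SELECT   o.CustomerId,
             DATEADD(MONTH, DATEDIFF(MONTH, 0, o.OrderDate), 0) AS OrderMonth,
             SUM(o.Amount) AS OrderTotal
    FROM     dbo.Orders AS o
    JOIN     dbo.Customers AS c ON c.CustomerId = o.CustomerId
    GROUP BY o.CustomerId,
             DATEADD(MONTH, DATEDIFF(MONTH, 0, o.OrderDate), 0);

    -- The page then does its final aggregation against dbo.SalesSummaryCache;
    -- the fully logged INSERT above seems to be where the log growth comes from.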
Any ideas? I'm getting very tempted to give up on the idea of caching and simply accept that this page might get a bit slow.