I have a caching server that forwards queries to the root servers and then keeps the answers cached for a while. The only problem is that it caches strictly in memory and only for a short time. So for frequently accessed sites like Google it's fine, but anything not accessed within the last hour or so has to go out and query the root servers again.
So is there a way to make it cache records to disk, up to a certain quota, like 1GB? Any domain I resolve would then create a record locally until the TTL expires (or for some fixed value). So let's say I go to example.com today and it resolves to 123.123.123.123; if I go there tomorrow, it should still have the record locally instead of having to go out and resolve it again. I'm guessing this is how a lot of companies handle this.
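To be concrete about the behavior I'm after, here's a rough sketch in plain Python with sqlite. It's not tied to any particular DNS server, and all the names (`store_record`, `lookup_record`, `dns_cache.db`) are made up for illustration; it just shows the idea of a disk-backed record store with TTL expiry and a size quota.

```python
# Sketch only: a local, disk-backed record cache with TTLs and a rough size quota.
# The function and file names here are hypothetical, not any real resolver's API.
import sqlite3
import time

DB_PATH = "dns_cache.db"          # hypothetical on-disk cache file
QUOTA_BYTES = 1 * 1024**3         # ~1GB cap on cached data

def open_cache(path=DB_PATH):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS records (
                      name TEXT PRIMARY KEY,
                      address TEXT NOT NULL,
                      expires_at REAL NOT NULL)""")
    return db

def store_record(db, name, address, ttl):
    # Keep the answer until its TTL runs out (or a fixed value could be substituted).
    db.execute("INSERT OR REPLACE INTO records VALUES (?, ?, ?)",
               (name, address, time.time() + ttl))
    db.commit()
    enforce_quota(db)

def lookup_record(db, name):
    row = db.execute("SELECT address, expires_at FROM records WHERE name = ?",
                     (name,)).fetchone()
    if row and row[1] > time.time():
        return row[0]             # still valid: answered from disk, no upstream query
    return None                   # missing or expired: have to resolve upstream again

def enforce_quota(db):
    # Crude quota check: drop the soonest-to-expire rows when the file grows too big.
    (page_count,) = db.execute("PRAGMA page_count").fetchone()
    (page_size,) = db.execute("PRAGMA page_size").fetchone()
    if page_count * page_size > QUOTA_BYTES:
        db.execute("""DELETE FROM records WHERE name IN (
                          SELECT name FROM records ORDER BY expires_at LIMIT 1000)""")
        db.commit()

if __name__ == "__main__":
    db = open_cache()
    store_record(db, "example.com", "123.123.123.123", ttl=86400)
    print(lookup_record(db, "example.com"))   # -> 123.123.123.123, served locally
```

Basically I'm hoping an existing caching server can already do something like this for me, rather than me rolling my own.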
