How to keep an old php site up while keeping it secure?

Red Squirrel

No Lifer
May 24, 2003
67,329
12,096
126
www.anyf.ca
I have this super old phpBB forum I keep up, mostly as a relic, and because I have the domain I may as well have it go to the site. Problem is, since it's not really used anymore and I don't have time to maintain it, it's bound to be hacked eventually, as there are probably a lot of known vulnerabilities for that software.

Is there some way of keeping such a site running while also protecting the rest of the server from it? I'm thinking I could convert it all to plain HTML, but I imagine that would take up a HUGE amount of space. Or perhaps I could put it on a separate VM set up so it just reloads the entire site/DB at reboot, and have it reboot once a night or something. But that seems like overkill.

Essentially, I don't want a vulnerability on that site to let someone modify my other sites too, so I want it separate from everything else. I'm leaning towards the HTML or VM route, but I'm open to other suggestions.
 

KB

Diamond Member
Nov 8, 1999
5,396
383
126
You could try making the database read-only and giving the user running the phpBB code only read-only access to the filesystem. That eliminates a lot of the attack surface, though a sophisticated attack might still manage to escalate permissions.
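A minimal sketch of the filesystem half of that, assuming a Linux host; the paths and the `phpbb` MySQL user in the comments are placeholders, and the SQL lines only illustrate what a read-only grant could look like:

```shell
# Sketch: strip write permission from a (placeholder) copy of the docroot so
# the web-server user cannot modify any file. All paths/names are examples.
umask 022                                  # make the demo deterministic
mkdir -p /tmp/phpbb_demo
echo '<?php // demo file' > /tmp/phpbb_demo/index.php

chmod -R a-w /tmp/phpbb_demo               # remove write bits for everyone

stat -c '%a' /tmp/phpbb_demo/index.php     # → 444

# The database half would look something like this (MySQL, not executed here):
#   REVOKE ALL PRIVILEGES ON phpbb.* FROM 'phpbb'@'localhost';
#   GRANT SELECT ON phpbb.* TO 'phpbb'@'localhost';
```

Note that root (and anything that can become root) can still write regardless of the mode bits, which is why this narrows the attack surface rather than closing it.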
 

Red Squirrel

No Lifer
May 24, 2003
67,329
12,096
126
www.anyf.ca
Problem is that the PHP user is apache, so I still need that user to have write access to the active sites. I recall there was something called phpsuexec that would let you run code as the owner of the home directory, but everything I find says it's deprecated. Unless there's another way of doing that?
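For what it's worth, the usual modern replacement for phpsuexec-style setups is PHP-FPM with a separate pool per site, each pool running as its own Unix user. A sketch, with the pool name, user, and socket path all placeholders:

```ini
; /etc/php-fpm.d/oldforum.conf  (path and names are examples)
[oldforum]
user = oldforum                  ; dedicated low-privilege account for this site
group = oldforum
listen = /run/php-fpm/oldforum.sock
listen.owner = apache            ; the web server must be able to reach the socket
listen.group = apache
pm = ondemand                    ; only spawn workers when the relic gets a hit
pm.max_children = 2
```

The vhost for the old forum then proxies its `.php` requests to that socket (e.g. via mod_proxy_fcgi), so a compromise of the forum code runs as `oldforum`, not as the apache user that can write to your other sites.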

The HTML route is the easiest, but it seems kind of inefficient. I'm not sure how smart wget will be about reusing image files, or whether it will duplicate them for each post/page; that would add up fast in disk space. I might give it a go and see, though.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
I'm not sure it will take as much space as you think (are there lots of images?). Compress it with zopfli (HTML is highly compressible) and use something like nginx's precompress feature to serve the files.
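The nginx side of that is the `gzip_static` directive (from `ngx_http_gzip_static_module`): precompress each file once, e.g. `zopfli index.html` emits `index.html.gz`, and nginx serves the `.gz` copy directly instead of compressing on every request. A sketch with placeholder names and paths:

```nginx
server {
    listen 80;
    server_name forum.example.com;     # placeholder
    root /var/www/phpbb-archive;       # placeholder: the wget mirror

    location / {
        gzip_static on;   # if index.html.gz exists, serve it instead of index.html
    }
}
```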
 

Red Squirrel

No Lifer
May 24, 2003
67,329
12,096
126
www.anyf.ca
I tried it out with wget -krv and it sort of worked. Problem is, it's not rewriting URLs that contain question marks, and it actually saved the files with the question marks in their names too. When you request such a page, the web server doesn't know to take the question mark literally. I suppose I could script some kind of mass replace, but is there maybe a wget flag I missed? It turned out to be only about 173MB, so I think this is very doable; I was expecting it to end up being several hundred GB.
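The flag usually suggested for this is wget's `--restrict-file-names=windows`: it replaces the `?` in saved filenames with `@`, and `-k` rewrites the links to match, so something like `wget -r -k -E -p --restrict-file-names=windows https://example.com/` (URL is a placeholder). For a mirror that's already on disk, the mass replace can be sketched like this, with toy filenames:

```shell
# Sketch: fix up an already-mirrored tree whose filenames contain '?'.
# The directory and filenames below are toy examples.
mkdir -p mirror
printf '<a href="viewtopic.php?t=1">post</a>\n' > 'mirror/index.html'
printf '<html>post body</html>\n' > 'mirror/viewtopic.php?t=1'

# 1) rename: replace every '?' in a filename with '@'
find mirror -name '*\?*' | while read -r f; do
    mv "$f" "${f//\?/@}"
done

# 2) rewrite links inside the HTML to match the new names
find mirror -type f -exec sed -i 's/\.php?/.php@/g' {} +
```

After this, `mirror/index.html` links to `viewtopic.php@t=1`, which the web server serves as a plain static file.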