The idea of limiting per-vhost resource usage on Apache led me to give suPHP a try. suPHP runs each PHP process as the vhost owner itself, not as “nobody” or the apache user, which makes it possible to limit resources per vhost.
After setting up suPHP with RLimit rules per vhost, I can see that RLimit really works. Apache kills any PHP execution that hits the RLimit. So, basically, we get a container that confines each user’s PHP execution, preventing users from overloading the server with buggy or high-load PHP scripts.
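To give an idea of the setup, here is a rough sketch of what such a vhost can look like. The account name, paths, handler names, and limit values are illustrative, not my exact configuration, and your mod_suphp handler setup may differ:

```apache
# hypothetical vhost for account "user1" -- values are illustrative
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /home/user1/public_html

    # hand PHP requests to suPHP so they run as the account owner
    suPHP_Engine on
    suPHP_AddHandler application/x-httpd-suphp
    AddHandler application/x-httpd-suphp .php

    # per-vhost resource limits for forked processes: soft limit, then max
    RLimitCPU   30 60               # CPU seconds per process
    RLimitMEM   67108864 134217728  # address space in bytes (64 MB / 128 MB)
    RLimitNPROC 10 20               # number of processes per user
</VirtualHost>
```

The RLimit directives only affect processes forked by Apache, which is exactly why suPHP is needed here: with PHP running as a separate process per vhost owner, the limits bite per account instead of hitting the shared apache user.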
But a new problem arose. When a PHP execution is killed, it generates a coredump file. Coredump files are very useful for tracing back crashes that occur during PHP execution, but I got them all over users’ directories, especially in the directories of users running high-load PHP scripts. The size varies from 1 MB to 40 MB (on my system), and they eat up users’ disk space every time a resource-greedy PHP execution is killed.
I tried a few tips I found on Google to stop coredump generation (a sketch of these attempts follows the list):
- set “ulimit -c 0” for users in /etc/profile
- set “/proc/sys/fs/suid_dumpable” to 0 (it is already zero by default)
- set a 0 limit for the core parameter in /etc/security/limits.conf
- set CoreDumpDirectory to a specific directory (I think suPHP makes Apache ignore this directive)
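For reference, this is roughly what those attempts look like on a typical Linux box; the paths are standard, but the exact values and file locations may differ between distributions:

```sh
# 1) disable core files for login shells (line appended to /etc/profile)
ulimit -c 0

# 2) make sure setuid/setgid processes never dump core (already 0 by default)
echo 0 > /proc/sys/fs/suid_dumpable

# 3) hard-limit core size to 0 in /etc/security/limits.conf
#    (format: <domain> <type> <item> <value>)
#    *    hard    core    0

# 4) collect Apache's core dumps in one directory (httpd.conf)
#    CoreDumpDirectory /tmp/apache-cores
```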
Finally, I found a very simple solution. I looked at /etc/init.d/httpd and noticed it has a ulimit setting in the top lines of the file. I added “ulimit -c 0” there and wohooo!! It works!
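In other words, the change is just one extra line near the top of the init script; the surrounding contents vary by distribution, so treat this as a sketch:

```sh
#!/bin/bash
#
# excerpt from the top of /etc/init.d/httpd

# ... the distribution's existing ulimit line(s) go here ...

# added: no core files for httpd and everything it spawns (suPHP processes included)
ulimit -c 0
```

Because the init script starts httpd itself, the limit is inherited by every child process, which is why this works where the per-user /etc/profile approach did not.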
No more coredump files – thanks to my coredumb head 😛