This may sound like a weird thing to do, but I realised that many crawlers and bots are somehow still able to get past my Anubis. I presume they have gotten smarter and are capable of using JavaScript.
To counter this, I want to link my Anubis to an iocaine setup such that:
Internet > nginx reverse proxy > Anubis > iocaine > my site/app
My hope is that two different filtering mechanisms (one of which actively poisons and wastes the bot's resources) will protect my system better.
I thought I’d ask before actually trying out something like this.
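For what it's worth, the chain above could be wired up roughly like this. This is only a minimal sketch, not a tested setup: the ports (8923 for Anubis, 42069 for iocaine, 3000 for the app) and the server name are placeholders, and the Anubis/iocaine sides would need their own configuration pointing each hop at the next one.

```nginx
# nginx terminates TLS and hands every request to Anubis first.
# Anubis would then be configured to forward passing clients to
# iocaine, and iocaine's backend would point at the real app.
server {
    listen 443 ssl;
    server_name example.com;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:8923;  # assumed Anubis port
        # Preserve the original client details for the hops behind us.
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

The key point is that nginx only ever talks to the first hop; each subsequent service decides on its own whether to pass the request along.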


iocaine expects you to know how to detect the bots. If they can get past Anubis, do you have another detection process?
I’m very late to the party, but: no, iocaine does not expect you to detect the bots. It used to, but it has done its own detection for quite a while now (you can replace the detection mechanism, though).
What do you mean? Where does it do its own detection?
Around here. In the default configuration, it is using the built-in handler. The script can be replaced with something like Nam-Shub of Enki (used by pretty much everything I host, and by Codeberg too, for example).
Ah that wasn’t there when I deployed it.