It also adds even more deployment complexity. From memory, to run Mastodon you need:
any number of Rails web servers (horizontally scalable)
any number of Sidekiq worker processes (horizontally scalable)
a PostgreSQL database for persistent storage (vertically scalable modulo sharding)
a Redis server for caching and Sidekiq (vertically scalable modulo sharding)
an Elasticsearch server for full text search (vertically scalable modulo sharding)
So this is at least 5 different server processes to manage. In reality, for almost all deployments, Redis and Elasticsearch are unnecessary: the database can handle both the job queue and full text search. Further, that database could even be SQLite for all but the largest instances.
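To make the point concrete, here's a sketch of what that five-process stack looks like as a docker-compose file. This is purely illustrative (service names, images, and commands are my assumptions, not Mastodon's official compose file), but it shows how much machinery a small instance has to stand up:

```yaml
# Illustrative sketch only -- NOT Mastodon's official docker-compose.yml.
services:
  web:        # Rails web server; horizontally scalable (--scale web=N)
    image: tootsuite/mastodon
    command: bundle exec puma -C config/puma.rb
    depends_on: [db, redis]
  sidekiq:    # background job workers; also horizontally scalable
    image: tootsuite/mastodon
    command: bundle exec sidekiq
    depends_on: [db, redis]
  db:         # persistent storage; vertical scaling only, short of sharding
    image: postgres:15
  redis:      # cache + Sidekiq queue
    image: redis:7
  es:         # full text search
    image: elasticsearch:7.17.10
```

Three of those five services are stateful and have to be backed up, upgraded, and capacity-planned separately, which is where most of the operational pain comes from.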
The deployment story for Mastodon is a nightmare; a substitute like Pleroma, or better yet something in Rust, is needed.
That’s usually from credential stuffing, which I guess you could consider botting, but what I was referring to was automatically creating accounts. Sorry for the miscommunication.
I don’t really think so. Google accounts are pretty hard to bot. I think they’re just idiots and children, and with Poe’s law you can’t really tell the difference.