Fedify: an ActivityPub server framework

@fedify@hollo.social · Reply to silverpill's post

@silverpill The bottleneck happens because for each recipient, we need to:

  1. Serialize the activity data
  2. Create a queue message with metadata
  3. Write to queue storage

When you have thousands of followers, these operations add up quickly and block the HTTP response. With fan-out, we only do this once during the request.
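The difference between doing those three steps per recipient and doing them once can be sketched with a toy in-memory queue. This is illustrative only, not Fedify's actual `MessageQueue` API; the type and function names here are made up, and the point is where the per-recipient work happens, not the exact shapes.

```typescript
// Hypothetical in-memory queue; counters track the per-request cost.
type QueueMessage = { body: string; recipients: string[] };

const queue: QueueMessage[] = [];
let writes = 0;
let serializations = 0;

function serialize(activity: object): string {
  serializations++;
  return JSON.stringify(activity);
}

function write(message: QueueMessage): void {
  writes++;
  queue.push(message);
}

// Without fan-out: serialize + build metadata + write once per recipient,
// all inside the HTTP request handler -- O(followers) work before responding.
function enqueuePerRecipient(activity: object, followers: string[]): void {
  for (const follower of followers) {
    write({ body: serialize(activity), recipients: [follower] });
  }
}

// With fan-out: one message carrying the whole recipient list is written
// during the request; a background worker expands it into deliveries later.
function enqueueFanOut(activity: object, followers: string[]): void {
  write({ body: serialize(activity), recipients: followers });
}

const followers = Array.from(
  { length: 5000 },
  (_, i) => `https://example.com/users/${i}`,
);

enqueuePerRecipient({ type: "Create" }, followers);
console.log(writes, serializations); // 5000 5000

writes = serializations = 0;
queue.length = 0;

enqueueFanOut({ type: "Create" }, followers);
console.log(writes, serializations); // 1 1
```

With 5000 followers the naive path does 5000 serializations and 5000 storage writes before the HTTP response can return; the fan-out path does one of each and defers the expansion to a worker.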

What issues are you having with your current fan-out implementation? We're always looking to improve ours.

silverpill

@silverpill@mitra.social · Reply to Fedify: an ActivityPub server framework's post

@fedify

>When you have thousands of followers, these operations add up quickly and block the HTTP response

Is that because Fedify implements some storage abstraction layer, so every storage write (3.) must be an independent operation, and writing thousands of items at once is not possible?

>What issues are you having with your current fan-out implementation?

Each delivery task looks like [activity, recipients].
These tasks are executed sequentially, and although HTTP requests to recipients within a task are parallelized, all of them must complete before the next task can start. Dead instances delay the completion of their task and slow down the whole queue.

I think this slowdown can be remedied by running multiple delivery tasks in parallel, but I am wondering if breaking them down into smaller [activity, single recipient] tasks would make the system simpler and more flexible.
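The two granularities being compared might look like this. A hypothetical sketch; the type names and `send` callback are mine, not the actual queue schema of either implementation.

```typescript
// Batch shape: one task fans its HTTP requests out in parallel, but
// Promise.all means a dead instance (a send that hangs until timeout)
// gates completion of the whole task -- and, with sequential task
// execution, everything queued behind it.
type BatchTask = { activity: string; recipients: string[] };
type SingleTask = { activity: string; recipient: string };

async function runBatchTask(
  task: BatchTask,
  send: (recipient: string) => Promise<void>,
): Promise<void> {
  await Promise.all(task.recipients.map((r) => send(r)));
}

// Single-recipient shape: the same work split into independent queue items,
// each of which can be scheduled, retried, and rate-limited on its own,
// so a dead instance only stalls its own task.
function split(task: BatchTask): SingleTask[] {
  return task.recipients.map((recipient) => ({
    activity: task.activity,
    recipient,
  }));
}

const tasks = split({
  activity: "Create/123",
  recipients: ["https://alive.example/inbox", "https://dead.example/inbox"],
});
console.log(tasks.length); // 2
```

The trade-off is more queue messages (one per recipient instead of one per activity) in exchange for independent retry and no head-of-line blocking.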