We will implement a new Fediverse Auxiliary Service Provider (FASP) that will allow sharing storage and media processing between servers.
This is pretty big too, as the cost and legal risks of hosting this user content are high. They’ve clearly thought about the media moderation problems too:
We will build a reference implementation of an Automated Content Detection service, again as a new Fediverse Auxiliary Service Provider with an open protocol.
This will allow server owners to opt in to external tools that scan content for spam, illegal materials, etc. in order to help them fight bad actors; they could self-host these tools if they choose to do so, or share the infrastructure with other servers for better efficiency.
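The key design point in that item is that the FASP only advises: the server queries an external detection service, then applies its own policy to the result. As a minimal sketch of that division of responsibility, here is some hypothetical Python. The `ScanResult` shape, the labels, and the threshold are all my assumptions, not part of any published FASP specification.

```python
# Hypothetical sketch: a server interpreting advice from an external
# Automated Content Detection FASP. Field names, labels, and the
# threshold are assumptions for illustration, not a published spec.
from dataclasses import dataclass


@dataclass
class ScanResult:
    label: str         # e.g. "ok", "spam" -- assumed label set
    confidence: float  # 0.0 .. 1.0


def moderation_action(result: ScanResult, threshold: float = 0.9) -> str:
    """Map a scan result to a local moderation decision.

    The instance stays in control: a confident hit is rejected,
    a low-confidence hit goes to the local moderation queue.
    """
    if result.label == "ok":
        return "accept"
    if result.confidence >= threshold:
        return "reject"
    return "queue_for_review"
```

For example, `moderation_action(ScanResult("spam", 0.95))` returns `"reject"`, while `moderation_action(ScanResult("spam", 0.5))` returns `"queue_for_review"` so a human moderator makes the final call.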