Decentralised Media and the Question of Responsibility

Aali Jaiswal

As users of digital platforms, we value both the freedom of unrestricted expression and the right to feel protected. The Internet has continuously evolved in response to the tension between freedom of expression and the need for order. The first wave of digital platforms, such as Facebook, Instagram, and Twitter (now X), was largely centralised: each sat under the control of a specific corporation. These corporations had their own regulations and community guidelines governing what users could and could not post, and they had the power to remove posts and ban profiles that violated those guidelines. This framework provided a mechanism for regulating and removing offensive, unlawful, or harmful content.

Now, we are entering the era of decentralised media, where the rules of responsibility are blurred. We cannot pinpoint who exactly is responsible for a particular post. This raises a pressing question: when harmful or illegal content is posted on decentralised media, who should be held liable?

How Is Decentralised Media Different?

On platforms like Facebook, Instagram, or YouTube, a single corporate entity controls the content. That entity owns the servers, sets the community standards, monitors posts, and is subject to national laws and regulations. If a harmful post is published, accountability can be assigned either to the user or to the company that owns the platform. But on platforms like Mastodon, Bluesky, PeerTube, or Pixelfed, power is distributed across users and servers. Instead of one company managing everything, different communities or individuals host servers, called “instances,” each with its own rules. Some platforms, like Mastodon, PeerTube, and Pixelfed, use federated models in which multiple servers interconnect. Others, like Bluesky with its AT Protocol, allow users to keep their data portable through independent hosting. In blockchain-based decentralised apps, like Steemit or Minds, a public ledger records content, making takedowns nearly impossible.

On federated platforms like Mastodon, Pixelfed, and PeerTube, users join different servers managed by administrators. Each admin sets rules, moderates content, and can “federate” or “defederate” with other servers. On blockchain-based media like Steemit and Minds, posts are stored permanently on the blockchain, preventing deletion or centralised censorship. And on protocol-based media like Bluesky, users can select their own hosting provider (a Personal Data Server, or PDS, host); moderation tools are layered, allowing choice but limiting universal enforcement. In short, there is no single responsible authority, which makes accountability a grey area.
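
To make the federated model concrete, here is a minimal sketch, assuming hypothetical domain names and a deliberately simplified data model (this is not the Mastodon API), of why an admin's power stops at the instance boundary: each server applies only its own rules and block list, so a post refused on one instance remains live at its origin.

```python
# A minimal sketch (hypothetical names; not real Mastodon code) of why
# moderation power in a federated network stops at the instance boundary.

from dataclasses import dataclass, field

@dataclass
class Instance:
    domain: str
    banned_categories: set = field(default_factory=set)  # this server's own rules
    defederated: set = field(default_factory=set)        # domains this admin blocks

    def shows(self, post_category: str, origin_domain: str) -> bool:
        """An admin can hide a post locally but cannot delete it at its origin."""
        if origin_domain in self.defederated:
            return False  # defederation: refuse everything from that server
        return post_category not in self.banned_categories

# Two independently run servers with different policies.
strict = Instance("mastodon.example", banned_categories={"hate_speech"})
lax = Instance("other.example")

# The same post is hidden on one instance yet stays visible on the other:
print(strict.shows("hate_speech", "other.example"))  # False: hidden locally
print(lax.shows("hate_speech", "other.example"))     # True: still live at origin
```

That asymmetry is the legal problem in miniature: the admin of one server can protect their own users but cannot take the post down for anyone else.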

Freedom of Expression v. Public Safety

One of the most significant dilemmas of decentralised media is balancing freedom of expression with public safety. Decentralised media emerged partly in response to criticism that centralised platforms censored users excessively: we have watched private corporations remove politically charged content and keep their algorithms from promoting controversial opinions. By enabling communities or individuals to set their own rules, these platforms promise more autonomy and less corporate control, and users view this as a safeguard against arbitrary takedowns or political censorship.

However, at the same time, unmoderated freedom creates a fertile ground for abuse, hate speech, child sexual abuse material, deepfakes, terrorist propaganda, financial scams, and defamation. On blockchain platforms, once harmful content is published, it cannot be erased.
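
That permanence follows from how such ledgers are constructed. Below is a minimal, hypothetical sketch (not any real chain's code): each block commits to the hash of the previous block, so altering or deleting a post breaks every later link and is immediately detectable by every node.

```python
# A hypothetical hash-chained ledger showing why on-chain posts are
# effectively permanent: any edit invalidates every subsequent block.

import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny ledger of three posts.
chain, prev = [], "0" * 64
for content in ["harmless post", "harmful post", "another post"]:
    block = {"content": content, "prev": prev}
    chain.append(block)
    prev = block_hash(block)

def verify(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False  # a link in the chain no longer matches
        prev = block_hash(block)
    return True

print(verify(chain))               # True: the ledger is intact
chain[1]["content"] = "[removed]"  # an attempted takedown of the harmful post
print(verify(chain))               # False: every node can detect the tampering
```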

Philosophically, John Stuart Mill’s “harm principle” states that liberty ends where harm to others begins. If decentralised platforms allow content that incites violence, defames reputations, or harms vulnerable groups, states have an obligation to regulate them.

Thus, freedom without accountability risks turning into lawlessness.

Filling the Legal Void

Indian law does not yet explicitly address decentralised media. The Constitution guarantees the freedom of speech and expression (Article 19) as well as the rights to reputation and privacy (Article 21). For conventional social media, Indian law holds intermediaries and/or uploaders liable for the content posted. But this is where decentralised media gets tricky.

The table below maps out why holding someone liable on decentralised media is not straightforward, and what the ideal position could be if India were to legislate on the issue, using popular platforms as examples.

| Platform | Who sets/enforces rules? | Why can they not simply be held liable? | Who should ideally be liable? |
| --- | --- | --- | --- |
| Mastodon | Individual instance administrators (each server has its own rules). | Admins only manage their own server, not the whole network. They cannot track every harmful post, and blanket liability would discourage people from hosting instances. | The uploader/creator of the content, primarily; admins only if they knowingly refuse to act after receiving complaints. |
| Pixelfed | Instance administrators (server-based rules). | Same as Mastodon: limited scope, limited resources, no central control. | Uploader first; admins should have “safe harbour” protection if they remove content after notice. |
| PeerTube | Instance administrators, but content is shared peer-to-peer across instances. | Videos can be mirrored beyond any one admin’s control; holding admins liable for content they did not host would be unfair. | Uploader primarily; those who mirror or share the content should bear secondary liability. |
| Bluesky | PDS hosts (store data), Bluesky’s community guidelines, plus AT Protocol governance. | PDS hosts act more like ISPs or cloud providers: they provide infrastructure, not editorial control. | Uploader first; a PDS host only if it refuses to comply with valid takedown orders. |
| Blockchain-based media (Steemit, Lens, etc.) | Smart contracts and DAO governance (rules set by token holders/community). | On-chain data is immutable; no single admin can delete it, and punishing all token holders or developers is impractical. | Uploader first; DAO/community governance should create compensation funds or arbitration mechanisms for victims of harmful content. |

Interpretation in the Context of the Information Technology Act, 2000

Indian law must evolve to protect users without stifling innovation. Currently, the Information Technology Act, 2000 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 assume that “intermediaries” like Meta or X can remove flagged content. For decentralised platforms, the definition of “intermediary” could be extended to cover admins of federated servers (e.g., Mastodon, PeerTube, Pixelfed), PDS hosts (e.g., Bluesky), and developers/maintainers of protocols where they profit commercially. Each should be obliged to respond to takedown requests, with liability attaching where they fail to do so.

Another idea is to introduce a three-tier liability framework (sketched in code after this list) wherein there can be:

  • Primary Liability: on the user who uploaded the harmful content;
  • Secondary Liability: on the admin/PDS host who knowingly fails to remove flagged illegal content; and
  • Residual Liability: on the protocol developers, but only if they intentionally design features that encourage illegality.

This balances accountability without unfairly punishing developers who built a neutral tool.
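
As a rough illustration, the three tiers can be read as a decision procedure. The sketch below uses hypothetical actors and flags; it is a thought experiment, not a statement of current Indian law.

```python
# A thought-experiment sketch of the three-tier framework above. The flags
# are illustrative assumptions, not legal definitions.

from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    PRIMARY = "uploader"
    SECONDARY = "admin/PDS host"
    RESIDUAL = "protocol developer"

@dataclass
class Incident:
    uploader_identified: bool      # an identifiable user posted the content
    host_notified: bool            # the admin/PDS host received a valid notice
    host_acted: bool               # the host removed or masked the content
    design_encourages_abuse: bool  # developers intentionally built for illegality

def liable_parties(i: Incident) -> list:
    tiers = []
    if i.uploader_identified:
        tiers.append(Tier.PRIMARY)        # tier 1: the uploader
    if i.host_notified and not i.host_acted:
        tiers.append(Tier.SECONDARY)      # tier 2: notice was ignored
    if i.design_encourages_abuse:
        tiers.append(Tier.RESIDUAL)       # tier 3: intentional design for abuse
    return tiers

# Uploader posted; the host complied after notice; the protocol is neutral:
print(liable_parties(Incident(True, True, True, False)))  # only Tier.PRIMARY
```

The design choice mirrors the text: liability attaches to conduct (uploading, ignoring notice, building for abuse), never to mere participation in the network.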

For blockchain-based media, where deletion is impossible, laws can mandate content masking that prevents harmful material from being indexed or accessed via Indian ISPs. Warning overlays can also be introduced, requiring third-party moderation layers to blur or block objectionable content. India can further adopt “traceability mandates” requiring servers hosting Indian users to retain minimal identifying metadata for investigation purposes.
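
As a rough sketch of the masking-and-overlay idea (every identifier below is made up): the post stays wherever it is hosted, but a third-party moderation layer labels it, and clients or providers serving Indian users blur or block it at display time.

```python
# A hypothetical moderation-layer overlay: the underlying content is never
# deleted, only masked for a given jurisdiction at display time.

LABELS = {
    # post_id -> ruling from a hypothetical third-party moderation layer
    "post123": {"label": "objectionable", "action": "blur"},
}

def render(post_id: str, body: str, jurisdiction: str) -> str:
    rule = LABELS.get(post_id)
    if rule and jurisdiction == "IN":
        if rule["action"] == "blur":
            return f"[hidden behind a warning overlay: {rule['label']}]"
        if rule["action"] == "block":
            return "[not accessible via this provider]"
    return body  # untouched elsewhere, and for unlabelled posts

print(render("post123", "harmful material", "IN"))  # masked for Indian users
print(render("post123", "harmful material", "US"))  # unchanged elsewhere
```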

Conclusion

Decentralised media empowers free speech, resists monopolistic control, and gives communities autonomy. But it also creates legal black holes where no single entity can be held responsible for harmful or illegal content.

India cannot afford to ignore this reality. While users must remain primarily accountable for what they post, platform actors, such as administrators, hosts, and developers, should bear tiered responsibilities to ensure some level of moderation. At the same time, Indian laws, such as the IT Act, must be amended to bring decentralised platforms into the legal framework, with innovative solutions for blockchain-based permanence. Ultimately, the goal is balance: preserving free expression while ensuring safety, accountability, and justice in the digital sphere. Only then can decentralised media evolve into a responsible yet liberating space for global communication.