Bear's Feed

Thoughts on a distributed Indieweb chat

The other night while talking to some of the Indieweb community I had one of those aha! moments you think only happen in movies :)

I was describing how I use XMPP to route many different streams of data to a terminal client that I pretty much use as a dashboard: different bots and agents listen to Twitter feeds, poll the few Atom feeds I still follow, and watch other sources of notifications, and the messages are then sent, based on filters, to different Multi-User Chat (MUC) rooms on my XMPP server (Prosody).
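The filter-to-room routing above can be sketched in a few lines. This is a hypothetical illustration, not my actual bot code: the room names, filter rules, and item fields are all made up, and an ordered first-match-wins list stands in for whatever filtering a real setup would use.

```python
# Hypothetical filter-based router: each incoming item is checked against
# (room, predicate) pairs in order; the first match decides the MUC room.
FILTERS = [
    ("indieweb", lambda item: "indieweb" in item["text"].lower()),
    ("alerts",   lambda item: item["source"] == "nagios"),
]
DEFAULT_ROOM = "firehose"

def route(item):
    """Return the MUC room an incoming item should be delivered to."""
    for room, matches in FILTERS:
        if matches(item):
            return room
    return DEFAULT_ROOM

print(route({"source": "twitter", "text": "New IndieWeb wiki page"}))  # indieweb
print(route({"source": "nagios", "text": "disk full"}))                # alerts
print(route({"source": "atom", "text": "weekly digest"}))              # firehose
```

In a real deployment each routed item would then be posted to the matching MUC room by the XMPP client.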

This type of setup is something I've been working on for decades, and I've always had a few different iterations going as I learn about new tech or even new languages: the first version was in Perl and now most of it is in Python. I have tried to envision what this would require before, but always got bogged down in the translations required to move messages from one service (silo) to another.

Then it struck me that we could now use some of the core Indieweb protocols to enable a distributed chat environment that doesn't (necessarily) require a central server or hub.

Now I'm wondering if we couldn't use simple text with some HTML+MF2 for the required metadata. This would allow a lot of what the Indieweb community has been working on to handle the publication (Micropub, syndicate-to), notification (Webmention, PuSH) and consumption (h-feed) sides!
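To make the idea concrete, here is one guess at what a chat message carrying its own MF2 metadata could look like. The class names (h-entry, p-author, h-card, dt-published, e-content) are standard microformats2; everything else here, including the helper function and the surrounding structure, is just an illustration.

```python
# Render a chat message as a minimal MF2 h-entry snippet (a sketch, not a
# finalized format): author, timestamp, and body each carry an mf2 class.
import datetime

def chat_message_html(author_url, author_name, text, published=None):
    """Return an h-entry HTML fragment for a single chat message."""
    if published is None:
        published = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return (
        '<article class="h-entry">'
        f'<a class="p-author h-card" href="{author_url}">{author_name}</a> '
        f'<time class="dt-published" datetime="{published}">{published}</time> '
        f'<span class="e-content">{text}</span>'
        '</article>'
    )

print(chat_message_html("https://example.com", "bear", "hello, room!"))
```

A consumer could then parse these fragments out of an h-feed and display them in timestamp order, which is most of what a chat room needs.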

Encouraged by Tantek, I captured most of the brainstorming from that night on the Indieweb Wiki, and I started working on getting my own site to syndicate notes to a prototype distribution hub. That will let me see if it's realistic to have many people syndicate their own chat messages to a topic and then enable consumption of that topic in a way that feels like a chat room.

Pretty cool stuff!

Kaku MVP is complete (I hope!)

I've always wanted to get the tools I use to work with my personal site cleaned up enough to make them useful to others who may also want to run a Python-based static site ... and I think I reached that point tonight!

I just finished a large refactoring and code simplification to enable delete and undelete of posts, and during that work I was able to remove a lot of code that just wasn't required. Afterwards I realized that the two tools I was using shared a lot of similar code, so I started merging them. That also reduced the code size and smell quite a bit.

The tool I used to generate the site, Hakkan, is now deprecated in favor of the tool that handles the dynamic events: Kaku.

Kaku now supports Micropub, Webmention and Token endpoints and generates events that are sent via Redis Pub/Sub to the event handler, which does the work of making sure the metadata files are created and/or updated and then generates the static files as needed.
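The event flow above can be sketched roughly as follows. In Kaku the events travel over Redis Pub/Sub; in this sketch a plain function call stands in for the subscriber so the shape of the handler is visible. The event fields, handler names, and return values here are illustrative guesses, not Kaku's actual API.

```python
# Simplified event dispatcher: decode a JSON event (as it might arrive from
# a Redis Pub/Sub channel) and route it to a handler by event type.
import json

def handle_event(raw):
    """Decode one event and dispatch it to the matching handler."""
    event = json.loads(raw)
    handlers = {
        "micropub":   lambda e: f"update metadata for {e['url']}",
        "webmention": lambda e: f"record mention of {e['url']}",
    }
    handler = handlers.get(event.get("type"))
    if handler is None:
        return "ignored"
    result = handler(event)
    # ...after the metadata work, the affected static files get regenerated
    return result

print(handle_event(json.dumps({"type": "micropub", "url": "/note/123"})))
```

The real handler would follow each dispatch with the metadata file updates and static regeneration described above.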

Most of the configurable items are defined in the config files, but the remaining items that are tightly bound to my site are the use of Markdown for posts and the URL path style for articles.

Other items that remain are to go beyond the basic handling of Micropub notes and to handle the fancier Webmentions like u-like-of.

maybe next weekend :)

Kaku refactoring to enable webmention and post deletes

In order to implement the Webmention.rocks delete tests, I had to find a way to flag my posts as deleted: a flag file that would be noticed during the static generation of the site. This also required that, when a deleted post was detected, the generator produce tombstone HTML with a <meta http-equiv="Status" content="410 GONE" /> entry.

I also took advantage of this large refactoring to make Kaku a more modern Flask implementation so that it runs in the latest nginx and uwsgi environments.

Once Webmention.rocks Issue #7 is fixed I'll be able to mark that task as done!