Bear's Feed

Kaku MVP is complete (I hope!)

I've always wanted to get the tools I use to work with my personal site cleaned up enough to make them useful to others who may also want to run a Python-based static site ... and I think I have reached that point tonight!

I just finished a large refactoring and code simplification to enable delete and undelete of posts, and during that work I was able to remove a lot of code that just wasn't required. After that I realized that the two tools I was using shared a lot of similar code, so I started merging them. That also reduced the code size and smell quite a bit.

The tool I used to generate the site, Hakkan, is now deprecated in favor of the tool that handles the dynamic events - Kaku.

Kaku now supports Micropub, Webmention and Token endpoints and generates events that are sent via Redis Pub/Sub to the event handler, which does the work of making sure the metadata files are created and/or updated and then generates the static files as needed.
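
A minimal sketch of that flow, assuming redis-py on both sides - the channel name and payload fields here are placeholders for illustration, not Kaku's actual names:

import json
import redis

EVENT_CHANNEL = 'kaku-events'  # placeholder channel name

r = redis.StrictRedis(host='127.0.0.1', port=6379, db=0)

def publish_event(action, slug):
    # Called by the Micropub/Webmention endpoints once a request is accepted.
    r.publish(EVENT_CHANNEL, json.dumps({'action': action, 'slug': slug}))

def run_event_handler():
    # Long-running worker: update the metadata files, then regenerate static files.
    pubsub = r.pubsub()
    pubsub.subscribe(EVENT_CHANNEL)
    for message in pubsub.listen():
        if message['type'] != 'message':
            continue
        event = json.loads(message['data'])
        print('handling', event['action'], 'for', event['slug'])

The nice part of using Pub/Sub here is that the web endpoints stay fast - they just queue the event and the handler does the slower file work.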

Most of the configurable items are defined in the config files, but the remaining items that are tightly bound to my site are the use of Markdown for posts and the URL path style for articles.
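
Those two pieces boil down to something like this sketch - the path layout shown is only an illustration of the idea, not necessarily the exact scheme my site uses:

import markdown  # Python-Markdown

def render_post(text):
    # Posts are written in Markdown and rendered to HTML at generation time.
    return markdown.markdown(text)

def article_path(slug, published):
    # Hypothetical URL path style for articles, e.g. /2016/02/kaku-mvp
    return '/{0}/{1:02d}/{2}'.format(published.year, published.month, slug)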

Other items that remain are to go beyond the basic handling of Micropub notes and to handle the fancier Webmentions like u-like-of.

maybe next weekend :)

Kaku refactoring to enable webmention and post deletes

In order to implement the Webmention.rocks delete tests I had to find a way to flag my posts as deleted - a flag file that gets noticed during the static generation of the site. This also required that, when a deleted post is detected, the generated HTML is a tombstone entry with a <meta http-equiv="Status" content="410 GONE" /> element.
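
A rough sketch of the approach - the flag file naming and helper functions are illustrative, not Kaku's actual layout:

import os

TOMBSTONE = """<!DOCTYPE html>
<html>
  <head>
    <meta http-equiv="Status" content="410 GONE" />
    <title>Gone</title>
  </head>
  <body><p>This post has been deleted.</p></body>
</html>
"""

def is_deleted(slug, content_dir='content'):
    # A ".deleted" flag file next to the post marks it as removed (assumed naming).
    return os.path.exists(os.path.join(content_dir, slug + '.deleted'))

def write_post(slug, html, output_dir='site'):
    # During static generation, a deleted post gets the tombstone instead of its HTML.
    target = os.path.join(output_dir, slug, 'index.html')
    os.makedirs(os.path.dirname(target), exist_ok=True)
    with open(target, 'w') as f:
        f.write(TOMBSTONE if is_deleted(slug) else html)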

I also took advantage of this large refactoring to make Kaku a more modern Flask implementation so that it runs in the latest nginx and uwsgi environments.
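
Nothing exotic is needed for that - the app module just exposes the Flask object for uWSGI to load. A bare-bones sketch, with module and route names that are assumptions rather than Kaku's real ones:

from flask import Flask, request

app = Flask(__name__)

@app.route('/webmention', methods=['POST'])
def webmention():
    # Accept the mention and hand it off to the event handler rather than processing inline.
    source = request.form.get('source')
    target = request.form.get('target')
    # publish_event('webmention', ...) would go here
    return 'Accepted', 202

# uWSGI then loads the app with something like:  uwsgi --ini kaku.ini --module kaku:app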

Once Webmention.rocks Issue #7 is fixed I'll be able to mark that task as done!

Testing Webmention.rocks

Python Testing with Flask

On Monday, 15th February 2016, I gave a talk for Developer Week 2016 in San Francisco about best practices for testing Python. For the talk I created a functional Python Flask app so that the talk would be grounded in reality.
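
To give a flavor of the example app, here's a minimal sketch of the style of test the talk covers - the endpoint and fixture names are illustrative, not lifted from the example project:

import json

import pytest
from flask import Flask, jsonify

def create_app():
    app = Flask(__name__)

    @app.route('/ping')
    def ping():
        return jsonify(status='ok')

    return app

@pytest.fixture
def client():
    # The Flask test client lets the tests exercise routes without a running server.
    app = create_app()
    app.config['TESTING'] = True
    return app.test_client()

def test_ping(client):
    response = client.get('/ping')
    assert response.status_code == 200
    assert json.loads(response.data.decode('utf-8'))['status'] == 'ok'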

A lot of research went into this talk so I wanted to make sure that all of the resources were available - the full talk, talk notes and the example project are all on GitHub:

This research is still ongoing! Just today I learned about an even better way to use uWSGI to run the Flask app that uses the PyEnv virtual environment without a lot of tricks -- I'm working that up into another article and I'll be updating Tenki to reflect this new method.

IndieWeb Public Timeline Project

I was working on a talk I'm going to be giving next week (2016-02-16) at Developer Week 2016 about Python Testing when I noticed the conversation happening in the IndieWeb IRC Channel:

# 08:35 tantek showed that stream of notes somewhere on the home page
# 08:35 tantek just the notes
# 08:35 aaronpk ah
# 08:35 tantek so it had that old school twitter feel
# 08:36 aaronpk that'd be fun
# 08:52 benwerd tantek aaronpk: I really love that idea
# 08:52 benwerd A hub for people posting notes to their indieweb homepages

This caught my attention because I've had this on my personal setup for years as a stream of all of the items/feeds that I have interest in - my own personal stream of events. The code I use for that is extremely ugly and very tied to my personal data, so I couldn't just pop it onto a public site, but it is something that I could use as a framework though!

So I spent the weekend setting up a Python Flask application and started pulling over the various parts that made sense (to me) - so far it does the following:

  • Displays an h-feed of items that have been submitted to it
  • Uses Server-Sent Events (SSE) to send new data to the page (see the sketch after this list). Big thanks to Aaron for reminding me about the Nginx Stream Push Module!
  • Uses the Bridgy Publish method of having a hidden /publish URL pointing to the site to trigger getting the post into the stream
  • Has the beginnings of an API to pull JSON data for the stream based on domain and date range
  • Will soon have an IndieAuth login page for each domain to allow people to opt out of being included, in case indirect mentions cause their posts to be displayed.
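
For the SSE piece mentioned above, the Flask side of the idea looks roughly like this sketch - the route name and item source are placeholders, and in production the Nginx push module handles the long-lived connections rather than Flask itself:

import json
import time

from flask import Flask, Response

app = Flask(__name__)

def new_items():
    # Placeholder source - the real app pushes items as they are submitted.
    while True:
        yield {'url': 'https://example.com/notes/1', 'published': time.time()}
        time.sleep(5)

@app.route('/stream')
def stream():
    def event_stream():
        for item in new_items():
            # SSE wire format: a "data:" line followed by a blank line.
            yield 'data: {0}\n\n'.format(json.dumps(item))
    return Response(event_stream(), mimetype='text/event-stream')

On the page, an EventSource pointed at that route receives each new item and prepends it to the h-feed.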

I need to finish some deploy work, like getting an SSL certificate in place and standing up the code somewhere other than my personal server, and then I will start sharing the URL :)

Working with Nginx logs from bash

Earlier today on the IndieWeb IRC Channel, Aaron was talking about writing some code to parse his Nginx logs to get a list of 404s, and I commented that it could be done easily enough by setting Nginx to emit JSON-formatted logs.

Now that is definitely an option, but then I remembered that I had written a quick-n-dirty parser to spit out a JSON blob per log line, and that with the amazingly handy jq tool the JSON could quickly be filtered or acted on!

So I took a break from work to refactor the quick-n-dirty parser to use apache-log-parser to increase its flexibility (the old version was hardcoded for a specific log line format) and added some helpful comments.
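
A stripped-down sketch of what the script does, assuming Nginx is writing its default 'combined' log format - the format string and the exact output field names would need to match your own setup:

#!/usr/bin/env python
# Read an Nginx access log from stdin and emit one JSON blob per line.
import json
import sys

import apache_log_parser

# Nginx's default "combined" format matches Apache's combined format string.
LOG_FORMAT = '%h %l %u %t "%r" %>s %b "%{Referer}i" "%{User-Agent}i"'
parse_line = apache_log_parser.make_parser(LOG_FORMAT)

for line in sys.stdin:
    try:
        entry = parse_line(line.strip())
    except apache_log_parser.LineDoesntMatchException:
        continue  # skip lines that don't match the expected format
    # A few of the parsed fields; the real logs.py may emit more (or different) keys.
    print(json.dumps({
        'method': entry.get('request_method'),
        'status': entry.get('status'),
        'request_first_line': entry.get('request_first_line'),
    }))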

logs.py will generate via stdout a stream of JSON for each line of the log file:

./logs.py < /var/log/nginx/bear.im.log | jq '. | { method, status, request_first_line }'