
# digester

the purpose of this program is to time- and location-shift all the online stuff you peruse, according to your predilections. it consumes a collection of sources, eg. rss, activitypub, and gemfeeds, and redirects the posts found therein to one or more destinations, eg. an epub, a read-it-later service, etc.

in order to use it, one defines some sources, some destinations, and a set of rules which route posts from the defined sources to the defined destinations. it currently only runs in batch mode, ie. it takes all of these things that we normally interact with in realtime and delivers them once an hour, or day, or week, or month, or whatever. in a context that makes them most useful, and least distracting.
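as a rough illustration, a sources/destinations/rules setup might be shaped something like the sketch below. this is purely hypothetical: the field names, types, and structure are assumptions for the sake of example, not digester's actual configuration format.

```javascript
// illustrative only: digester's real config format may differ.
const config = {
  sources: [
    { name: 'blogs', type: 'rss', url: 'https://example.com/feed.xml' },
    { name: 'fedi', type: 'activitypub', account: '@someone@example.social' },
  ],
  destinations: [
    { name: 'reader', type: 'epub', path: './digest.epub' },
  ],
  rules: [
    // route posts from each named source to a named destination
    { from: 'blogs', to: 'reader' },
    { from: 'fedi', to: 'reader' },
  ],
};
```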

## an example

I have a collection of gemfeeds, rss feeds, and fediverse accounts. many of the posts are text that I'd like to read. some of them are photography. others are music I'd like to listen to. and some of them only make sense in a realtime context (eg. livestream announcements); they aren't useful unless you see them when they're posted.

digester aggregates posts from all of these sources, and sorts them via a set of rules which:

- generate an epub with all the text posts (it finds its way to an ereader by way of some external mechanisms)
- generate an epub with all the photographic posts (it stays on the computer with a large screen, for comfortable viewing)
- send all the music posts off to an rss feed generator (it produces a feed that is subscribed to on the device I use to listen to music)
- discard posts that are only useful in the moment they're posted, or that I'd rather not see for whatever reason

the 'discarded' posts are saved for later occasional inspection, to see if I'm missing anything that I'd rather not be. all of this happens once a day via a cron job on my laptop.
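the sorting described above can be sketched as a predicate-to-destination mapping, with unmatched posts falling through to a discard pile. this is an illustrative sketch, not digester's actual rule engine; the `kind` field and destination names are assumptions.

```javascript
// hypothetical rules: each pairs a predicate over a post with a destination name.
const rules = [
  { match: post => post.kind === 'text', destination: 'reading-epub' },
  { match: post => post.kind === 'photo', destination: 'photos-epub' },
  { match: post => post.kind === 'music', destination: 'music-feed' },
];

// posts matching no rule land in 'discarded', kept for occasional inspection.
function route(posts) {
  const routed = { discarded: [] };
  for (const post of posts) {
    const rule = rules.find(r => r.match(post));
    const dest = rule ? rule.destination : 'discarded';
    (routed[dest] ||= []).push(post);
  }
  return routed;
}

const out = route([
  { kind: 'text', title: 'an essay' },
  { kind: 'music', title: 'a new track' },
  { kind: 'stream', title: 'going live now' },
]);
// out['reading-epub'] and out['music-feed'] each hold one post;
// the livestream announcement ends up in out.discarded.
```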

configuration is very malleable, in order to support a variety of use cases.

## installation

### globally

```shell
sudo npm install -g git+https://code.paiges.net/rburns/digester.git#v0.1.0
```

### in current directory

```shell
npm install git+https://code.paiges.net/rburns/digester.git#v0.1.0
```

## documentation