Watching anime and YouTube the Unix way

Since streaming movies and series became mainstream, a plethora of sites have appeared that offer content to users. However, these sites require an app or a website to function, which is not very Unix friendly. The same goes for YouTube. Moreover, these sites are designed to take as much of your attention as possible, and that can hurt your productivity if you are not careful.

A while ago, Luke Smith published a video showing newsboat, a curses-based RSS reader. I have set up that reader along with a few other tools so I can watch content directly from the terminal, without visiting these websites and sinking hours into them.

The tools needed for this setup are:

- newsboat, a curses-based RSS reader
- mpv, a video player we can launch from the terminal
- youtube-dl, which mpv uses to stream YouTube videos
- peerflix, to stream video straight from torrents

Newsboat

For newsboat we basically need a bunch of RSS feeds. Just go to the blog or site you want and grab its feed link. All the feeds are stored in the file ~/.config/newsboat/urls, one per line, following this format:

https://xkcd.com/rss.xml webcomics xkcd academia science humor "~XKCD"

It’s kinda self explanatory, but the main gist is: first you put the URL, then a bunch of tags you can use for searching, and the tag starting with ~ is the display name of the feed. Luke explains in more detail how the tool works in his video and I suggest you go and watch it.

YouTube has an RSS feed for each channel, but the feature is a little bit hidden: each channel’s feed lives at https://www.youtube.com/feeds/videos.xml?channel_id=<channel-id>. Since I don’t want to go channel by channel through my subscriptions, I do the following: go to YouTube’s subscription manager page and scroll to the bottom. Click on Export subscriptions and an XML (OPML) file will download. Take this Python script:

#!/usr/bin/env python3
# Convert a YouTube subscriptions export (OPML/XML) into newsboat's urls format.

import sys
import xml.etree.ElementTree

root = xml.etree.ElementTree.parse(sys.argv[1]).getroot()
for outline in root.iter('outline'):
    # Only the actual feed entries carry type="rss".
    if outline.attrib.get("type") == "rss":
        url = outline.attrib['xmlUrl']
        name = outline.attrib['title']
        print('%s youtube "~%s"' % (url, name))

This script will print each channel you are subscribed to, already in the newsboat format. Append that output to your urls file and you are done.
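For example, assuming the script above is saved as yt2newsboat.py and YouTube named the export subscription_manager.xml (both file names are just placeholders, yours may differ):

```shell
python3 yt2newsboat.py subscription_manager.xml >> ~/.config/newsboat/urls
```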

For the anime series I use fansubs that publish their new releases to nyaa. Nyaa has an RSS feed of each user’s uploads; just pick the username of the fansub you want. For example:

https://nyaa.si/?page=rss&user=puyasubs

Put that in newsboat and you are good to go.

This is an excerpt of my urls file to give a visual example of how everything is set up:

# Torrents
https://nyaa.si/?page=rss&user=puyasubs anime nyaa puya "~Puyasubs"

# Youtube channels
https://www.youtube.com/feeds/videos.xml?channel_id=<channel-id> youtube linux "~Luke Smith"


MPV

You probably know what mpv is, but in case you don’t: mpv is a video player that we can launch from the terminal. It will also play YouTube videos if you have youtube-dl installed. We can make newsboat play a YouTube video straight from the feed list by setting up a macro. These macros are configured in ~/.config/newsboat/config:

macro v set browser "mpv --fs %u" ; open-in-browser ; set browser "open %u"

This macro is triggered by typing ,v inside newsboat while the video’s entry is selected. As written it works on macOS; on Linux, change open to xdg-open and you should be good to go.

MPV will now take the URL of the video, pass it to youtube-dl and play it in full screen.


Peerflix

Playing video from a torrent is not as straightforward as pulling the video over HTTP. But there’s peerflix. This tool opens a torrent and starts downloading it: it picks the largest file in the torrent and fetches its pieces in order, so the beginning of the video is available earlier than the end. It then starts a web server that serves the video over HTTP. With the command line flags I have in my macro it will also launch mpv with the server’s address as the video input:

macro p set browser "peerflix %u --mpv -- --fs" ; open-in-browser ; set browser "open %u"
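The same combination works directly from the shell, without newsboat (the magnet link below is a placeholder):

```shell
peerflix "magnet:?xt=urn:btih:<info-hash>" --mpv -- --fs
```

Everything after the bare -- is handed to mpv, so --fs gives full screen just like in the macro.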

Finishing touches

The last thing you may want to set up is some automation. Newsboat exposes two useful commands on the command line:

newsboat -x reload         # Reload all the feeds and fetch the latest items
newsboat -x print-unread   # Print the number of unread items

You can use these two commands to build a system that notifies you when there’s new stuff you may want to check out. I have a cron job that updates the feeds every 15 minutes.
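As a sketch, a small script along these lines can glue the two commands together (the script name and the use of notify-send are my assumptions, not part of the setup above):

```shell
#!/bin/sh
# Hypothetical helper, saved e.g. as ~/.local/bin/feed-check.
# Refresh all feeds, then pop a desktop notification if anything is unread.
newsboat -x reload
# print-unread emits something like "5 unread articles"; keep just the number.
unread=$(newsboat -x print-unread | awk '{ print $1 }')
if [ "$unread" -gt 0 ] 2>/dev/null; then
    notify-send "newsboat" "$unread unread articles"
fi
```

A crontab entry such as */15 * * * * ~/.local/bin/feed-check would run it every 15 minutes, matching the schedule above.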