Episode Transcript
0:03
I scratched somebody else's itch with a...
0:05
well, that will all become clear
0:08
in a moment. But first, a
0:10
short story. It's a tale as
0:12
old as time. Computers are
0:14
very good at producing output,
0:16
and that output is partly
0:18
human readable.
0:20
And humans are quite good
0:22
at looking at patterns and seeing
0:24
things in computer output. But
0:26
really if that computer output is
0:28
intended to go into another
0:30
program or another computer system, then it's not
0:32
presented in a way that a human
0:35
with silly eyeballs can
0:37
ingest it. You know,
0:39
whether that's a CSV file
0:41
or a spreadsheet or some
0:43
other kind of easily
0:45
machine-readable thing, it's often better
0:47
to present the data differently.
0:50
And that was the challenge I
0:52
wanted to solve.
0:54
I had some data that came out
0:56
of a program. I'm being
0:58
very vague, I know. I'll be more specific
1:00
very shortly. I had some data that came
1:02
out of a program that was intended to go
1:04
into another program. But I wanted to
1:06
look at it before it passed down the
1:08
line to the next program, to check it and make
1:10
sure it was as I hoped. And
1:13
the problem is... The
1:15
lingua franca of sending data
1:17
from one program to another
1:19
these days is JSON.
1:21
And that is problematic for
1:23
humans, especially when it's very
1:25
large volumes of JSON. If
1:27
you've got an enormous blob of
1:29
data with loads of indentation,
1:31
it's vast, right? And
1:33
the software that I use
1:35
at work uses it a
1:37
lot. And the
1:39
standards that people use for
1:41
formatting vulnerability scans,
1:43
it's pretty much all JSON.
1:45
There's the odd exception, but it's
1:47
mostly JSON. And I find
1:50
it a little bit difficult because
1:52
the tool du jour that everyone seems
1:54
to use is jq. And
1:56
I know Martin's going to suggest
1:58
some replacement for jq that's better.
2:00
Is there? Not really. Good, correct answer.
2:02
There's yq, which is the YAML equivalent,
2:04
but lo and behold it uses jq
2:07
under the covers. So what I wanted
2:09
to do was take the output of
2:11
a command, specifically Grype, which produces a
2:13
giant thing of JSON, and I wanted
2:15
to read it. And what the JSON
2:17
contains is a list of vulnerabilities that
2:20
might be in a container. So I've
2:22
scanned a container and then Grype scans
2:24
the SBOM, which is also JSON, and
2:26
outputs more JSON. You can print it
2:28
like a text table, but there's not
2:30
enough room to fit everything on the
2:33
screen that you need, and so it's
2:35
better to have this machine-readable JSON
2:37
file. So I've got this JSON file
2:39
and I want to be able to
2:41
read it and I've run this past
2:43
my good friend and nerd-sniping target, Stuart
2:45
Langridge, and he didn't accept this one.
2:48
So I had to do it myself.
2:50
What I wanted was basically a
2:52
JSON viewer. Now what I don't mean
2:54
is something that just gives me the
2:56
raw JSON with little twiddly triangles that
2:58
let me open up each of the
3:01
things. That's not what I want. What
3:03
I wanted was something that aggregates things
3:05
together. So show me all of the
3:07
critical vulnerabilities, show me all the high
3:09
vulnerabilities, and then if I drill down
3:11
into those... I can look at them
3:14
by package or something like that. And
3:16
so I needed to read the JSON
3:18
in and then interpret it and slice
3:20
it in one particular way to show
3:22
me the critical vulnerabilities and then slice
3:24
the exact same data, but slightly differently
3:26
to maybe show me from a package
3:29
point of view. So show me all
3:31
the packages that are in the system
3:33
and then I drill down on those
3:35
and see the severity and so on.
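The two "slices" described here can be sketched in a few lines of Python. The field names used below (`matches`, `vulnerability.severity`, `artifact.name`) are my assumption about the shape of Grype's JSON report and may not match every Grype version exactly:

```python
import json
from collections import defaultdict

def slice_matches(report: dict):
    """Group a Grype-style report's matches two ways: by severity
    (Critical, High, ...) and by package name."""
    by_severity = defaultdict(list)
    by_package = defaultdict(list)
    for match in report.get("matches", []):
        vuln = match.get("vulnerability", {})   # assumed field name
        artifact = match.get("artifact", {})    # assumed field name
        vuln_id = vuln.get("id", "unknown")
        by_severity[vuln.get("severity", "Unknown")].append(vuln_id)
        by_package[artifact.get("name", "unknown")].append(vuln_id)
    return by_severity, by_package

# Load the report and show the severity slice:
# report = json.load(open("report.json"))
# for severity, ids in sorted(slice_matches(report)[0].items()):
#     print(f"{severity}: {len(ids)} vulnerabilities")
```

The point is that the same `matches` list feeds both views, which is the "slice the exact same data, but slightly differently" idea.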
3:37
And initially I thought this could be
3:39
a generic JSON viewer that I could
3:42
just throw any JSON at
3:44
it, but you
3:46
can't really, because the formats of
3:48
them are all different, and I needed to
3:50
make it bespoke, and so I did.
3:52
Yeah it's a bit like saying I'm
3:55
going to make a generic database for
3:57
you and, you know, the schema is
3:59
always different. Yeah, I mean it's a
4:01
document, but it's essentially a
4:03
database, a database of vulnerabilities in my
4:05
case, for the JSON file that I
4:07
want to view. And so I thought
4:10
how am I going to do this?
4:12
And the answer is Python, of course.
4:14
And I wanted to make it pretty.
4:16
So I've said that I don't want
4:18
it to just be one giant tree
4:20
that I open up. I want it
4:23
to be something a bit prettier than
4:25
that. And so I went for a
4:27
kind of split screen view, where I
4:29
have a tree on the left hand
4:31
side, that I can open up the
4:33
little leaves of the tree. And then
4:36
when I get like two or three
4:38
levels deep, I then click on a
4:40
node at the bottom of the tree
4:42
and the right hand side of the
4:44
screen is filled with detail, and that works.
4:46
As a user interface, you know the
4:48
typical file manager, I drill down through
4:51
a tree structure and click on something.
4:53
I thought that already works, so that
4:55
feels like a structure I could live
4:57
with. And is this, you're talking about
4:59
a two-pane view, is this in the
5:01
terminal or is this a GUI app?
5:04
It is a graphical application in the
5:06
terminal, yes. Yeah it's fully in a
5:08
terminal. So the reason I did it
5:10
in a terminal is because yeah there
5:12
are tools to do this in the
5:14
web and there are like fully fledged
5:17
desktop applications to view these kind of
5:19
things but I didn't want to make
5:21
something big and fat I just wanted
5:23
something that I could basically load a
5:25
JSON file in the terminal because I've
5:27
just run the command that generated the
5:29
JSON. I'm already sat in the terminal
5:32
so I'll just pipe it to another
5:34
command. And I used a
5:36
Python library called Textual, I think it's
5:38
called, and I learned of this through
5:40
someone at work who has written some
5:42
interesting software using it and I saw
5:45
him do a demo of it and
5:47
I thought wow that's really cool what
5:49
did you use and he told me
5:51
and that was it I was sold
5:53
and the really nice thing is it's
5:55
quite pretty it adjusts to the terminal
5:58
size automatically I don't have to do
6:00
anything, and it's got all the user interface
6:02
for the tree structure, and it works
6:04
with the mouse as well as the
6:06
keyboard. So I can tab around and
6:08
press buttons but if I want to
6:10
click on something I can double click
6:13
on something and open it in another
6:15
pane and it is super wonderful to
6:17
work with and it makes things look
6:19
pretty, even for someone like me
6:21
who's a bit of a novice with
6:23
Python. And this is the first thing
6:26
I've ever done with Textual. I just
6:28
followed some of their documentation and there
6:30
are some examples in there. It's super
6:32
easy to use. I was well impressed.
6:34
So I made the basic thing that
6:36
slices and dices by severity, by vulnerability,
6:39
by package and so on and I
6:41
thought I'll show this off and so
6:43
during our coffee time on a Friday
6:45
we have a coffee chat at work
6:47
I said hey can I show you
6:49
something that I've made and I shared
6:51
my screen and told them what it
6:54
was and what it does and someone
6:56
else on the call who is a
6:58
massive fan of jq and uses jq
7:00
a lot, and who works on the
7:02
team, unmuted his microphone, and I thought
7:04
I'm going to get abuse now because
7:07
I said something disparaging about jq, and
7:09
he actually unmuted his microphone to say
7:11
I need this I need this right
7:13
now because this will help me do
7:15
my job because what he was doing
7:17
was looking at JSON files from an
7:20
upstream security vendor, so Red Hat, for
7:22
example. Red Hat have changed the format
7:24
of, or they're using a different format
7:26
for, the security feed, and he wanted
7:28
to be able to examine it in
7:30
a way that was nice and easy
7:32
and straightforward and this tool would have
7:35
done it for him. And so I
7:37
was like, oh, oh, this isn't just
7:39
me that needs this thing then. And
7:41
so I put it on GitHub on
7:43
the Friday night and then sent them
7:45
a link and said, there you go.
7:48
You can, it's got instructions. I made
7:50
sure the instructions were right and you
7:52
could just, like, use a virtual environment to
7:54
run it and it works. And he's
7:56
tried it and it works for him
7:58
as well. So it works on at
8:01
least two computers. So you were saying
8:03
that you can't just make something like
8:05
this for JSON files in general because
8:07
they are too general. The Red Hat
8:09
thing you just spoke about, does that
8:11
happen to use the same format that
8:13
Syft and Grype use, or is it something
8:16
else but it's similar enough that this
8:18
can be made to work with it?
8:20
No, there's a different standard but what
8:22
he was doing was ingesting it and
8:24
then seeing what came out of Grype
8:26
the other end. So mine is effectively
8:29
a Grype viewer. Yes. So I think
8:31
what he wanted to do was pull
8:33
the feed in and then use this
8:35
as a way to view whether Grype
8:37
was getting the right data based on
8:39
the feed that was coming into it,
8:42
that kind of thing. A lot of
8:44
your colleagues are in the US, aren't
8:46
they? Yes. Is rummage a word that
8:48
translates across the pond? Not so much,
8:50
no. And that's why I quite like
8:52
it. So yeah, I sent them the
8:54
link and they've tested it out and
8:57
it works fine for them. And I
8:59
had another play with it today and
9:01
I've made a little list of to-do
9:03
items, basically issues on the project. And
9:05
so if anyone else wants to help
9:07
me out in some way it's quite
9:10
niche you know but I could see
9:12
this as a base for viewing other
9:14
types of files like this could be
9:16
reworked for the Red Hat OVAL file,
9:18
and OVAL is a standard format, right,
9:20
and so I've used it very specifically
9:23
for Grype output, but it could be
9:25
used for SPDX, which is a standardized
9:27
SBOM format, or basically any other JSON.
9:29
It's not actually
9:31
very long, I think it's like a
9:33
hundred lines or something, it's not huge,
9:35
and all you do is just run
9:38
it with one parameter, and the parameter
9:40
is the JSON file that it's going
9:42
to read. There is one embellishment that
9:44
I did, which I'm quite pleased with,
9:46
which is: Grype has a thing called
9:48
explain, where you can say, explain that
9:51
issue, that vulnerability, to me. And Grype
9:53
then interrogates the vulnerability database and produces
9:55
a giant output of all the links
9:57
to the CVE databases, all the other
9:59
metadata about the security issue and what
10:01
version you need to upgrade to and
10:03
what files are affected and all that
10:06
kind of stuff. And the nice thing
10:08
is that giant blob of text fits
10:10
in the right hand pane, but it's
10:12
also scrollable. And I didn't have to
10:14
do anything really to make that a
10:16
scrollable window inside the terminal with my
10:19
mouse. It's great. I'm really, really pleased
10:21
with it. And it was only a
10:23
couple of hours of noodling around with
10:25
a bit of Python
10:27
and a library. This
10:32
episode is sponsored by Tailscale
10:34
but please don't skip. I want
10:36
to tell you how I use
10:38
Tailscale's free plan to securely
10:40
share remote access to web applications
10:43
I'm prototyping on my laptop. Sometimes
10:45
I want to test those web
10:47
applications from other platforms such as
10:49
mobile devices. With the Tailscale
10:51
serve command I can share a
10:53
link to my creations on any
10:55
other device on my tailnet.
10:57
I have Tailscale installed on
10:59
all my devices so I can
11:01
access my prototype app from any
11:03
device securely and without fuss.
11:05
And if I want to share
11:07
my creation with friends, colleagues and
11:10
customers, I can use the Tailscale
11:12
Funnel Command, which gives my application
11:14
a public URL for as long
11:16
as I need until I close
11:18
the funnel. It made authenticating my
11:20
app with GitHub easy because I
11:22
could use the public Tailscale link
11:24
as the callback URL. No need
11:26
for router port forwarding, and SSL
11:28
is automatically enabled by default. Both
11:30
Tailscale Serve and Tailscale Funnel are
11:32
handy for developers like me. Many
11:34
of the Late Night Linux family
11:36
hosts are using Tailscale on
11:39
their devices to secure cross device
11:41
networking and you could too because
11:43
Tailscale's personal plan will always
11:45
be free. To support the show
11:47
and become a web developer like
11:49
me, try Tailscale for free
11:51
today. You'll get up to a
11:53
hundred devices and three users for
11:55
free with no credit card required
11:57
at tailscale.com/linuxmatters. I've
12:02
been going social down in the
12:04
Fediverse. Oh, not Acapulco. No,
12:07
I'm not going to do the
12:09
old song. Following OggCamp, where I
12:11
was inspired by a number of
12:14
the talks there about ActivityPub
12:16
and the Fediverse in general.
12:18
I decided that coming out
12:20
of OggCamp, I wasn't happy
12:23
just having accounts in the Fediverse.
12:25
I wanted to be more deeply
12:28
involved in that movement, that community,
12:30
and that technology. So I
12:32
talked about creating a server, which
12:34
was going to be the place
12:37
I was going to host and
12:39
experiment with ActivityPub-related things. And
12:42
I have now got my own
12:44
personal ActivityPub instance on that
12:46
there Fediverse and it is not
12:49
Mastodon. It's not Mastodon then. It
12:51
is not Mastodon and that
12:53
was surprising in itself because I
12:56
thought going into this I was
12:58
inevitably going to end up installing
13:00
Mastodon and because I'd gone in
13:03
with that let's say misconception
13:05
I may have over-provisioned my
13:07
server. And I might have to
13:10
downsize at some future date. But
13:12
you know, for now, we'll just
13:14
go with what I have.
13:16
But I did a sort of
13:19
a look-see around. There's a few
13:21
awesome-ActivityPub-type lists on GitHub
13:24
and GitLab and what have
13:26
you, and Codeberg, which list
13:28
all sorts of interesting software categorized
13:30
by functionality and I was surprised
13:33
at, like, how many Mastodon-like platforms
13:35
there are available. So I had
13:38
a little look through those
13:40
and had a bit of a
13:42
tinker and long story short I
13:45
have decided to go with a
13:47
bit of software called GoToSocial.
13:49
I'm somewhat familiar with
13:51
GoToSocial. Oh, really? I spoke
13:54
in an early episode or two
13:56
about writing my own ActivityPub
13:59
integration for a static blog and
14:01
GoToSocial was how
14:03
I tested interacting with an actual
14:05
ActivityPub server by running that
14:08
locally and trying to send things
14:10
back and forth with it because
14:13
that way I had something that
14:15
I could dev on, which wasn't vast
14:17
and resource hungry and complex like
14:20
Mastodon. Well that's interesting because one
14:22
of the reasons why I
14:24
settled on GoToSocial is, apart
14:27
from having my like own actual
14:29
account I want some test accounts
14:31
where I can experiment with the
14:34
API and prototype things and
14:36
I didn't want to invest a
14:38
lot of time and effort in
14:41
n-tier architecture, which is basically what
14:43
Mastodon is. Also you don't want
14:45
to be using botsin.space
14:47
or some other instance
14:50
that's just going to disappear from
14:52
under you. Right. You want to be
14:55
in control of it, right? Indeed.
14:57
So GoToSocial is
14:59
rather fantastic. For those of you
15:01
that have not encountered it yet,
15:04
we'll have links in the show
15:06
notes, but it is a single
15:09
go binary. And it's quite
15:11
clever in that, yes, you could
15:13
have it talk to a PostgreSQL
15:15
database. I've decided not to do
15:18
that, maybe in a couple of
15:20
years that might come back
15:22
to haunt me, we'll see, but
15:25
it actually has, among other things,
15:27
SQLite and FFmpeg and
15:29
ffprobe embedded within it as
15:32
WASM executables. So when I
15:34
say it's a single binary, it
15:36
is literally a single binary with
15:39
all of its external dependencies. bundled
15:41
inside it, which I think is
15:43
kind of fabulous and like
15:45
next-gen internet technology sort of
15:48
being showcased there. So
15:50
whilst I thought I would use
15:53
a PostgreSQL database to sort of
15:55
back this, I've decided the project
15:57
itself seems to be quite happy
16:00
with SQLite. And when you
16:02
go looking at like what modern
16:05
SQLite is capable of,
16:07
I'm quite comfortable with the fact
16:09
that SQLite can handle my
16:11
two or three account single person
16:14
instance on that there Fediverse. So
16:16
that's what I've gone with.
16:18
And so far it seems to
16:21
be holding up. I've actually, quite
16:23
a few of the services that
16:25
I run in containers on my
16:28
home server, do a similar
16:30
thing. They have the option of
16:32
PostgreSQL or SQLite, because they're pretty much
16:35
API compatible for most, like the
16:37
stuff that you'd use for a
16:39
web application. And yeah, for
16:41
each of them I have a
16:44
database file which I back up
16:46
with everything else. And it works
16:49
for the scale that I'm running
16:51
at. It's really good. No,
16:53
no, data wasn't lost so much
16:55
as I just had to throw
16:58
data away. So I thought I
17:00
had read a good amount of
17:03
the GoToSocial documentation going
17:05
into creating my first production instance.
17:07
It turns out I had not
17:09
read the memo about "these are
17:12
your deployment requirements and understandings". So
17:14
I had no idea that
17:16
when you deploy something onto the
17:19
Fediverse, when you choose like what
17:21
that platform is, you're kind of
17:23
stuck with it. Like it's a
17:26
one-way street for two reasons.
17:28
One because of the protocols compatibility,
17:30
but also because of the keys
17:33
that it generates and once it
17:35
starts interacting with other nodes on
17:38
the Fediverse, those keys are kind
17:40
of the keys and there is
17:42
no mechanism within the ActivityPub
17:45
protocol for you to signal to
17:47
instances that you're federated with
17:49
that your keys have changed and
17:52
need to be re-synched. So if
17:54
you mess it up, like I messed it
17:56
up, then those keys... it means
17:59
that the new instance you
18:01
deploy on that domain can no
18:03
longer interact with those instances that
18:06
it has previously come into contact
18:08
with because the keys have changed.
18:10
Without you contacting a thousand
18:12
admins of the other instances and
18:15
begging that they, you know, rotate
18:17
the keys for you. Well the
18:20
good news is all you have
18:22
to do is post one
18:24
status update and then look at
18:26
the web server logs and you'll
18:29
get the names of every single
18:31
instance that's federated with you. Yes.
18:34
And then just @ admin
18:36
at that, and then you're done.
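That log-mining trick can be sketched in a few lines of Python. Everything here is an assumption about the setup: that the reverse proxy writes the User-Agent into its access log, and that federating servers include their own URL in that User-Agent (many do, in the form `+https://example.social/`):

```python
import re

# Fediverse servers typically announce themselves in the User-Agent of
# the requests they make to your inbox, e.g.
#   "http.rb/5.1.1 (Mastodon/4.2.8; +https://example.social/)"
# The log format here is an assumption (a typical combined log line);
# adjust the parsing to whatever your reverse proxy actually writes.
UA_HOST = re.compile(r"\+https?://([A-Za-z0-9.-]+)")

def federated_hosts(log_lines):
    """Collect the unique hostnames of instances seen in access-log
    User-Agent strings."""
    hosts = set()
    for line in log_lines:
        match = UA_HOST.search(line)
        if match:
            hosts.add(match.group(1))
    return sorted(hosts)
```

Feed it the access log for the day after you post something, and the result is a rough list of instances that know about you.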
18:38
You could script that and that
18:40
could be your new bot is
18:43
a thing that tells all these
18:45
people. Please defederate from me
18:47
and forget I ever existed. So
18:50
you could do that or you
18:52
could do what I did. Register
18:54
a new domain? 24 hours later
18:57
realised my mistake, registered another
18:59
domain. Of course, that's the solution
19:01
to everything. Buy new domain. Yeah,
19:04
so I am now the proud
19:06
owner of Wimpy's World dot social. Review
19:08
2. Social. Indeed. Indeed. So
19:10
yeah, having realized that blunder and
19:13
the fact that actually... The database
19:15
that backs your activity pub instance
19:18
is very important in terms of
19:20
the keys that are sat inside
19:22
it, not just for the instance
19:25
itself, but the users you've created
19:27
on that instance. I realised I
19:30
needed a solid sort of
19:32
backup solution for GoToSocial. Now,
19:34
one of the things that I
19:36
love about GoToSocial is their
19:39
documentation is absolutely fantastic. In fact,
19:41
as a community project, there's
19:43
nothing to fault in the way
19:46
the project has been set out,
19:48
documented, their sort of milestones and
19:50
roadmaps have been communicated. It's really
19:53
very accessible. So there was
19:55
lots of good information about how
19:57
to back up a GoToSocial instance,
20:00
but there wasn't a tool
20:02
to do it, but there
20:04
is now, because I
20:08
have written GoToSocial
20:10
Backup. So what that does
20:10
is does a backup of
20:13
the SQLite database, given that
20:15
that is the common sort
20:17
of database that's being used,
20:19
and there are better tools.
20:21
for backing up PostgreSQL. But
20:23
what it also does is
20:25
it uses the GoToSocial
20:27
tooling to build a list
20:29
of what your local media
20:31
is, so your custom emojis
20:33
and any images that you've
20:35
uploaded to your own instance,
20:37
it will extract and it
20:39
builds like a hard link
20:41
archive of your stuff and
20:43
deposits that in a location
20:45
so you can rclone
20:47
it off to like an
20:49
S3 backup. So I've written
20:51
this tool, it does a
20:53
comprehensive backup of GoToSocial
20:55
and I'm now using that
20:57
to make sure I don't
20:59
get defederated from the instances,
21:01
the 1,157 instances that this
21:03
thing is now talking to.
21:05
What's the backup written in,
21:07
is it go or is
21:09
it bash? Yes, it's gold-plated
21:11
bash. Really, because it really
21:14
only does a few things.
21:16
It does a SQLite backup
21:18
using VACUUM. So that's like
21:20
a transactional backup. It
21:22
creates this hard link hierarchy of
21:24
the local media and something else
21:27
I forget. Oh there's an export
21:29
feature so there's like an
21:31
export of like the basic configuration
21:33
so it's a belt and
21:35
braces. So it's really only three
21:38
simple things that it does
21:40
and then it automatically handles log
21:42
rotation and retention of the backups,
21:45
a 28-day retention.
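The three steps just described (snapshot the SQLite database, hard-link the local media, expire old backups) can be sketched against the Python standard library. This is not the actual tool from the episode, just an illustration of the technique; it uses sqlite3's online backup API rather than a shell `VACUUM`, and the paths and 28-day window are parameters:

```python
import os
import sqlite3
import time
from pathlib import Path

RETENTION_DAYS = 28  # matches the retention window mentioned above

def backup_database(db_path: Path, dest_path: Path) -> None:
    """Snapshot a live SQLite database safely, a stdlib analogue of
    the VACUUM-based backup described in the episode."""
    src = sqlite3.connect(db_path)
    dest = sqlite3.connect(dest_path)
    try:
        src.backup(dest)  # consistent page-by-page copy, safe while live
    finally:
        src.close()
        dest.close()

def hard_link_media(media_root: Path, archive_root: Path) -> int:
    """Mirror a media tree into an archive using hard links, so the
    archive costs almost no extra disk space. Returns files linked."""
    linked = 0
    for src in media_root.rglob("*"):
        if src.is_file():
            dst = archive_root / src.relative_to(media_root)
            dst.parent.mkdir(parents=True, exist_ok=True)
            if not dst.exists():
                os.link(src, dst)
                linked += 1
    return linked

def prune_old_backups(backup_dir: Path, days: int = RETENTION_DAYS) -> list:
    """Delete backup files older than the retention window and return
    the names of what was removed."""
    cutoff = time.time() - days * 86400
    removed = []
    for path in backup_dir.iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed
```

The hard-link archive is then a normal directory tree that rclone can sync off to S3 or similar, exactly as described.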
21:47
So at OggCamp you were incentivised
21:49
by the other talks, and
21:51
now you're a contributor to the
21:53
Fediverse. Yes, I'd like to think
21:56
so. In some small way at
21:58
least. If you are looking to host
22:01
your own instance, seriously take
22:03
a look at GoToSocial,
22:05
the project is currently labelled
22:07
as beta. It's recently exited Alpha.
22:09
I would say it's a
22:11
good drop-in replacement for Mastodon. It's
22:14
got an implementation of the Mastodon
22:16
API. So most of the
22:18
tooling that you use in terms
22:21
of clients work with it.
22:23
The clients that I've used with
22:25
it is a thing called
22:27
Feditext, which is a continuation
22:29
of the old Metatext, and Tusky
22:32
on Android. And what's interesting
22:34
about go-to social is it doesn't
22:36
have a client built into
22:38
it. So it's just a back-end
22:41
implementation. So it's a bring-your-own
22:43
client. So there's some web clients
22:46
called Pinafore and Semaphore, and I've
22:48
been using those to log
22:50
in over the web. So if
22:52
you're looking for an instance
22:54
to play around with, I highly
22:57
recommend it and it can
22:59
even run on SBCs and is
23:01
advertised as the way to get
23:04
on that there Fediverse with
23:06
low-powered devices. This
23:10
episode is sponsored by 1Password.
23:12
Imagine your company's security like the
23:14
quad of a college campus. There
23:17
are nice, brick paths between the
23:19
buildings. Those are the company-owned devices,
23:21
IT-approved apps, and managed employee identities.
23:24
And then there are the paths
23:26
people actually use. The shortcuts worn
23:29
through the grass that are the
23:31
straightest line from point A to
23:33
B. Those are unmanaged devices, shadow
23:36
IT apps, and non-employer identities like
23:38
contractors. Most security tools only work
23:41
on those happy brick paths, but
23:43
a lot of security problems take
23:45
place on the shortcuts. 1Password
23:48
Extended Access Management is the first
23:50
security solution that brings all these
23:52
unmanaged devices, apps and identities under
23:55
your control. It ensures that every
23:57
user credential is strong and protected,
24:00
every device is known and healthy
24:02
and every app is visible. 1Password
24:04
Extended Access Management solves the
24:07
problems traditional IAM and MDM can't
24:09
touch. It's security for the way
24:11
we work today and it's now
24:14
generally available to companies with Okta
24:16
and Microsoft Entra and in beta
24:19
for Google Workspace customers. So support
24:21
the show and check it out
24:23
at 1password.com/linuxmatters. Linux
24:28
Matters is part of the
24:30
Late Night Linux family. If
24:32
you enjoy this show then
24:34
please consider supporting us and
24:37
the rest of the Late
24:39
Night Linux team using the
24:41
PayPal or patron links you
24:43
can find at Linuxmatters.sh slash
24:45
support. For $5 a month
24:47
on Patreon you can enjoy
24:49
an ad-free feed of our
24:51
show or for $10 get
24:53
access to all the late
24:55
night Linux shows ad-free. We
24:57
love to get your feedback
24:59
via email show at Linuxmatters.sh
25:01
or you can chat with
25:03
us and other listeners in
25:05
our telegram group. All the
25:07
details are available at Linuxmatters.sh
25:09
slash contact. Previously
25:13
on Linux Matters I spoke about hosting
25:15
my own audiobook server with Audiobookshelf.
25:17
That's easy for you to say. Everything's
25:19
been going well with it, but I
25:22
did mention last time I spoke about
25:24
this that I was having an issue
25:26
listening to certain episodes of a certain
25:28
audio book that are very long to
25:30
the tune of several hours in length
25:32
each. And the problem there is that
25:34
when it's trying to stream them, it
25:36
can't seek them and doesn't seem to
25:38
remember position very easily. It makes it
25:40
very hard to actually listen to it
25:42
successfully. What book has three-hour chapters? So
25:44
I say book, but in this case
25:46
it's actually an archive of a podcast.
25:49
So this is the Critical Role podcast,
25:51
but it's from several seasons ago, not actually what's
25:53
currently on their feed. So I've downloaded
25:55
them as an archive and I've uploaded
25:57
them to my server as an audio
25:59
book in my library and so I
26:01
listen to them that way. I was
26:03
going to ask why you didn't use
26:05
a podcast reader app, if they're
26:07
not in their feed. I guess it
26:09
makes it a little bit more problematic
26:11
to do that. Yes, but also I
26:13
listen to them differently because with podcasts
26:16
they're coming in new and I'm listening
26:18
to them as they come out whereas
26:20
this is like something more at a
26:22
leisurely pace; I listen, you know,
26:24
when I've got time, so it doesn't
26:26
really fit with my workflow for having
26:28
them in a podcast client either I
26:30
prefer to have
26:30
them separately. So I did some testing
26:32
and reported a bug on this, and it
26:36
turns out I'm not the only person
26:38
who's had this issue, it is a
26:40
known phenomenon with very large audiobooks
26:43
on Audiobookshelf, but I did
26:45
find a solution someone mentioned in the
26:47
bug. They said something similar, like, if
26:49
it's a podcast, why aren't you listening
26:51
to it as a podcast? Because
26:53
Audiobookshelf does have a podcast library feature
26:55
where you can stick in an RSS
26:57
feed and it will download the episodes
26:59
and then you play them from your
27:01
audio book server, which I don't use
27:03
because I don't actually want to archive
27:05
all of the podcasts I listen to
27:07
as they come out. But what it
27:10
does do with podcasts that it doesn't
27:12
do with audio books is it lets
27:14
you download one episode at a time.
27:16
So with audio books you have the
27:18
option to either stream an episode or
27:20
download the whole book to your device.
27:22
But with podcasts you can download one
27:24
chapter or one episode at a time,
27:26
which means I don't have to download
27:28
400 hours' worth of Critical Role in
27:30
one go. I can just download the
27:32
couple of episodes I'm listening to now,
27:34
delete them when I'm done, and because
27:36
it's all playing locally, it doesn't have
27:39
any of this issue with streaming. Oh,
27:41
nice. Yes. So I'm now happily listening
27:43
to this. I've got two libraries in
27:43
Audiobookshelf, one for podcasts and one
27:47
for audio books, which has all of
27:49
my other stuff, and I can flip
27:51
between quite easily. It remembers my positions
27:53
of all of the books and podcasts
27:55
I'm listening to at once,
27:57
and everything's working, and it's
27:59
all open,
28:01
and it's
28:03
all brilliant.