2016/04/04

Diagnosing A Problem: OddMuse

I work in a lab at a large, research-centered university. We use a wiki as our lab notebook, where we keep notes about the samples that go through. We're also a Perl shop, so we went with a Perl-based wiki named OddMuse (a fork of UseMod). This has been our platform of choice for nearly a decade.

Today, it was reported that a few pages would hang during loading. They gave up after 300 seconds, as we have a 5-minute timeout in our Apache config. I shame myself by saying that I went to the OddMuse IRC channel before I looked at /var/log/httpd/error_log, but that is what happened. The error log reported:

    [Mon Apr 04 13:11:54 2016] [error] [client 142.68.31.21] (70007)The timeout specified has expired: ap_content_length_filter: apr_bucket_read() failed

I'm not strong in my Apache Fu, but I'm pretty sure this means we hit the timeout; it doesn't really say why we hit it.

Which brings up a weirdness. Imagine example.com/wiki/SandBox is the page in question. You can get the page in its full glory, before it's turned into HTML and spat out, at example.com/wiki/raw/SandBox, and that page always loads fast.
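If I wanted to quantify that difference instead of eyeballing a spinning tab, something like the sketch below would do it. It assumes LWP::UserAgent and Time::HiRes are installed (they are on our boxes), and it uses the example.com URLs above as stand-ins for our real wiki.

    #!/usr/bin/env perl
    # Quick sketch: time the raw page against the rendered page.
    # The example.com URLs are the stand-ins from above, not our real wiki.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Time::HiRes qw(gettimeofday tv_interval);

    my @urls = (
        'http://example.com/wiki/raw/SandBox',    # raw text, always fast
        'http://example.com/wiki/SandBox',        # rendered HTML, the one that hangs
    );

    # A little longer than Apache's 5-minute timeout, so we see the server
    # give up rather than timing out on our own end first.
    my $ua = LWP::UserAgent->new( timeout => 330 );

    for my $url (@urls) {
        my $t0  = [gettimeofday];
        my $res = $ua->get($url);
        printf "%-40s %8.2fs  %s\n", $url, tv_interval($t0), $res->status_line;
    }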

I "solved" the issue by editing and saving the file. I still don't know what's going on so that OddMuse can handle the data in raw form but cannot convert it to HTML. I am currently going down two roads of thought. The first is that there was a filesystem issue. We're working on a filesystem that is amazing in it's redundancy, size and the sheer number of connected nodes, but on occasion, we hit points where it falls down, giving us a several minute wait for commands such as ls or clear to run.

The other thought relates to how we actually use OddMuse. We wanted a front-end that behaved nearly like Word, so we use CKEditor and save HTML instead of wiki markup. At first, we saved samples in groups of up to 10, writing them to an unordered HTML list, but now we're pushing 400. The stub of each wiki page is created programmatically, and I am thinking that the higher numbers might be more than the rendering step can take.
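For concreteness, here's a stripped-down, hypothetical sketch of the shape of one of those stubs: the page body is really just the sample names joined into an HTML unordered list. The helper name and the sample names here are invented for illustration, not lifted from our generator.

    # Hypothetical sketch of a generated stub: the page body is an HTML
    # unordered list with one <li> per sample.
    use strict;
    use warnings;

    sub sample_list_html {
        my @samples = @_;
        return "<ul>\n"
             . join( '', map { "  <li>$_</li>\n" } @samples )
             . "</ul>\n";
    }

    # Roughly the size we're pushing now: 400 items.
    print sample_list_html( map { "SAMPLE-$_" } 1 .. 400 );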

This, I guess, gives me a thing I can test: write a page that starts with, say, 100 elements in a list, then build it up until the page doesn't render. I can do that. And I will do that tomorrow, because it's after 5pm today.
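Sketching it out now so tomorrow-me doesn't have to think: generate page bodies at increasing list sizes and write each to a file, then save each one into a scratch wiki page and see where rendering falls over. I'm deliberately not guessing at OddMuse's save parameters here, so this only produces the bodies; the file names and the step size are arbitrary.

    #!/usr/bin/env perl
    # Sketch of tomorrow's test: write list bodies of increasing size to
    # files, to be saved into a scratch page one at a time and timed.
    use strict;
    use warnings;

    for ( my $count = 100 ; $count <= 1000 ; $count += 100 ) {
        my $body = "<ul>\n"
                 . join( '', map { "  <li>TEST-SAMPLE-$_</li>\n" } 1 .. $count )
                 . "</ul>\n";
        my $file = sprintf 'list_test_%04d.html', $count;
        open my $fh, '>', $file or die "Can't write $file: $!";
        print {$fh} $body;
        close $fh;
        print "Wrote $file ($count items)\n";
    }

If the list size really is the problem, the hang should show up somewhere in that ramp, and then I can narrow the step.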
