So, there's this instrument. We have a bunch of samples going into the instrument, and we pull data out which tells us whether it's worth running the samples on the next, more expensive instrument. So it's for quality assurance.
I do several things with this thing. I have an interface that collects all the information about a group of samples (what goes in which lane, etc.), which we import after the samples are run to connect names, ID numbers, and all sorts of metadata with each sample. Right now there's a comment section, and I put together a CSV table of the metadata in there. The database table I created holds only the data fields that the instrument software takes, which means I create the comment on Submit. And it means I don't save all the metadata independently.
This, I am finding out, is stupid.
Because I also write the code that handles the output. Before, I simply used Sikuli to automate the export of the XML, but now I'm handling exported graphs and putting them into our lab notebook, and I'm looking at having to deconstruct the comments in order to get the information I need.
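A minimal sketch of what that deconstruction looks like, assuming the comment is a simple two-row CSV blob (header row, then values). The field names and sample values here are hypothetical, not the actual schema:

```python
import csv
import io

def parse_comment(comment: str) -> dict:
    """Parse a CSV metadata blob stored in a sample's comment field.

    Assumes row 1 holds the field names and row 2 the values.
    """
    rows = list(csv.reader(io.StringIO(comment)))
    header, values = rows[0], rows[1]
    return dict(zip(header, values))

# Hypothetical comment blob, as built on Submit
comment = "sample_name,lane,run_id\nS01,3,R2024-07"
meta = parse_comment(comment)
print(meta["lane"])  # → "3"
```

Which works, but every consumer of the output now has to know the comment's internal layout, and everything comes back as a string. Storing each metadata field in its own database column would make this parsing step unnecessary.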
Knuth was right. Premature optimization is the root of all evil.