Germany's Institut für Dokumentologie und Editorik is now publishing RIDE, a review of digital editions (issue one is out and issue two is coming up).
The checklist that reviewers use to assess accessibility and interoperability is an interesting one. There's a chart of the technical issues noted so far, and here are some of their criteria:
- Which licenses are used to clarify the copyright status of the material published by the projects?
- Do projects adhere to a standard data model like TEI?
- Is the raw data accessible, either for the individual parts of the edition or as a whole?
- Which interfaces do the projects support to allow reuse of their data?
- Is search with wildcards available?
- Are the methods employed in the project explicitly documented?
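The wildcard-search criterion is the easiest of these to picture concretely. As a minimal sketch (the record names are invented for illustration), Python's standard library already handles shell-style patterns:

```python
import fnmatch

# A toy index of transcribed items; the names are invented for illustration.
records = [
    "letter_1842_draft",
    "letter_1843_fair_copy",
    "diary_1850_fragment",
]

def wildcard_search(pattern, items):
    """Return items matching a shell-style wildcard pattern (* and ?)."""
    return [item for item in items if fnmatch.fnmatch(item, pattern)]

print(wildcard_search("letter_184?_*", records))
# → ['letter_1842_draft', 'letter_1843_fair_copy']
```

Real editions would of course search transcription text rather than record names, but the principle is the same.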
Like most digital publishers, I am keen to put material in the public domain, but I am cautious about the form of declaration. I'm sceptical about TEI. I do want to put the raw data out there, but we are all aware that the rate of uptake is going to be minute: at most, three or four researchers per century are likely to want to use my MS Excel tables. The interfaces question refers to things like an API or Representational State Transfer (REST) principles, which are only vaguely on the radar of independent researchers running single-person projects.
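That said, making raw data accessible needn't involve an API at all. A minimal sketch, assuming tabular data like my Excel sheets (the column names and rows here are invented), is simply to export it as plain CSV that anyone can open:

```python
import csv
import io

# Invented sample rows standing in for an Excel table.
rows = [
    {"id": "MS-001", "folio": "12r", "date": "1843"},
    {"id": "MS-002", "folio": "14v", "date": "1844"},
]

def rows_to_csv(rows):
    """Serialise a list of dicts to CSV text, a format any researcher can reuse."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(rows_to_csv(rows))
```

A static CSV download on the project site satisfies the "raw data accessible" criterion without any of the REST machinery a single-person project can rarely sustain.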