Update Tokenization API
When doing either a `GET /api/tokens` or `GET /api/bags/{id}/tokens` request, it's currently possible to filter on filenames. When a filename contains commas or other special characters that aren't encoded properly, incorrect results can be returned, which can in turn lead to other errant behavior. One straightforward fix would be to pass the parameters in a JSON body, such as:
```json
{
  "page": 1,
  "page_size": 3,
  "bag_id": 12,
  "algorithm": "sha256",
  "filenames": ["manifest-sha256.txt", "bagit.txt", "data/some-file.txt"]
}
```
The larger issue is the HTTP verb for the API. Since GET requests aren't meant to carry a request body, we'd want to switch from GET to POST, which also means we'd need to update the endpoints... possibly `POST /api/tokens/query` and `POST /api/bags/{id}/tokens/query`. This way we don't conflict with the POST endpoints that already exist.
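A minimal sketch of what a client call against the proposed endpoint might look like, assuming the `/api/tokens/query` path and the field names from the example above (neither exists yet):

```python
import json

# Hypothetical payload for the proposed POST /api/tokens/query endpoint;
# fields mirror the JSON example above.
payload = {
    "page": 1,
    "page_size": 3,
    "bag_id": 12,
    "algorithm": "sha256",
    "filenames": ["manifest-sha256.txt", "bagit.txt", "data/some-file.txt"],
}
body = json.dumps(payload)

# A client would then send something like:
#   POST /api/tokens/query
#   Content-Type: application/json
#   <body>
# The body round-trips cleanly regardless of commas in filenames.
assert json.loads(body) == payload
```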