We switched to JSON output from Bro when we started feeding logs into an ELK cluster. I can still grep the raw logs, but it's a bit ugly. The ever-brilliant Justin Azoff recommended jq to me, and I played around with it a bit today. The current release version of jq is 1.4, which doesn't seem to have gmtime(); I checked out the git version and that worked, or you can grab 1.5rc(whatever) from the releases page. By the way, those of you who know me know that I generally despise the "cat file | something" convention, but I'm using it here. Sorry, not sorry.

If you just want to get the timestamps and convert them to human-readable (assuming you don't think in epoch):

$ cat test | jq -c '.ts | gmtime'

I used the -c flag so it wouldn't break each timestamp up into one line per array element. Still boring, though - who wants nothing but slightly-more-readable-than-epoch timestamps? Here's the timestamp, source and destination IPs, and the requested URI:
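For reference, this is roughly what a "broken down time" looks like for a given epoch value. Here's a quick Python cross-check (the timestamp is a made-up example; jq's gmtime emits a plain array rather than a named struct, and orders the fields differently):

```python
import time

# A made-up Bro-style epoch timestamp (fractional seconds are typical in Bro logs).
ts = 1431200000.123456

# time.gmtime breaks epoch seconds down into UTC calendar fields,
# analogous to what jq's gmtime filter does.
bdt = time.gmtime(ts)
print(bdt.tm_year, bdt.tm_mon, bdt.tm_mday, bdt.tm_hour, bdt.tm_min, bdt.tm_sec)
```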

$ cat test | jq -c '[(.ts | gmtime), ."id.orig_h", ."id.resp_h", .uri]'

(I was checking the Beer Store hours for this holiday weekend.) The square brackets around the fields collect everything into a single array. The quotation marks around the field names tell jq to interpret those keys as literals - otherwise it gets angry about the dots in them. The parentheses allow piping the timestamp to the gmtime function.
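To see why the quoting matters: those field names genuinely contain dots - they're flat keys, not nested objects. A quick Python sketch makes that concrete (the log line here is fabricated for illustration, using documentation IP addresses, not real traffic):

```python
import json

# A made-up Bro-style http.log entry; "id.orig_h" is one flat key with a dot in it.
line = ('{"ts":1431200000.123456,"id.orig_h":"192.168.1.10",'
        '"id.resp_h":"203.0.113.5","uri":"/index.html"}')

record = json.loads(line)
# record["id.orig_h"] works; record["id"]["orig_h"] would fail --
# which is exactly why jq needs ."id.orig_h" with the quotes.
fields = [record["ts"], record["id.orig_h"], record["id.resp_h"], record["uri"]]
print(fields)
```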

gmtime output is pretty ugly though.

$ cat test | jq -c '[(.ts | todateiso8601), ."id.orig_h", ."id.resp_h", .uri]'

Note that the timestamps actually are in UTC. Depending on your use case, you may want to omit the -c flag - compact output that includes any of the longer fields isn't much easier to read than the raw JSON logs.
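If you want to sanity-check what that last filter is doing with the epoch values, the equivalent conversion in Python looks like this (same made-up timestamp as above; the jq filter handles the formatting for you):

```python
from datetime import datetime, timezone

ts = 1431200000.123456  # made-up epoch timestamp

# Convert epoch seconds to an ISO 8601 UTC string, the same shape of
# output jq's todateiso8601 filter produces.
iso = datetime.fromtimestamp(int(ts), tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(iso)  # 2015-05-09T19:33:20Z
```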