Your leaking thatched hut during the restoration of a pre-Enlightenment state.


Hello, my name is Judas Gutenberg and this is my blaag (pronounced as you would the vomit noise "hyroop-bleuach").



data pipeline sleuth

Thursday, April 28 2016

As a task for my new job, I'd been asked to figure out why people in Brazil were receiving emails in English instead of Portuguese. Assuming the problem lay somewhere in the frontend, I dug into the code and reproduced what I could on my local machine(s), eventually establishing that the problem lay somewhere beyond the API call sending data to some backend server. At that point I assumed my task was done, since I'd never heard of the machine servicing the API calls and assumed I wouldn't (yet) be trusted anywhere near it. But no, according to a rather icy post in the task manager, Meerkat indicated that I was not done. The mysterious server handling the API calls had a name which was just an alias for the www server. "How was I supposed to know that?" I asked Meerkat in our Google hangout. He then referred me to a document in the document repository that hadn't existed a few days ago. That's the kind of workplace I'm working in, but aside from having to deal with an abrasive personality, I rather like the actual work.
Following the data pipeline, I discovered the data was ultimately being logged as a line of JSON in a text file, and that was the end of its journey, at least synchronously. I had to know: did the API really just log data in a text file, and that was the end of it? So I asked Meerkat, and he confirmed that this was true. I got the feeling that he liked that I had figured this out but was feeling a little defensive about implementing it this way. So he asked if I wanted an explanation for why it was done this way. I did indeed. Meerkat then laid out a great case for simply logging JSON data and then processing it later by cron job "when there is time." The system had been developed as a band-aid during a time when the database the data normally went to was being unsuccessfully transitioned to a third-party provider, and it proved such an improvement that Meerkat decided to keep doing things that way even after the transition to the third party was abandoned.
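The pattern Meerkat described — append each event as one line of JSON, then let a scheduled job drain the file later — can be sketched roughly like this. This is a guess at the shape of the system, not their actual code; the file name and field names are my own invention:

```python
import json
import os

LOG_PATH = "events.log"  # hypothetical log file name

def log_event(event: dict, path: str = LOG_PATH) -> None:
    """The fast, synchronous part: append one event as a single line of JSON."""
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

def drain_log(path: str = LOG_PATH) -> list:
    """The deferred part, run by cron 'when there is time':
    parse every logged event and truncate the file."""
    if not os.path.exists(path):
        return []
    with open(path, "r+") as f:
        events = [json.loads(line) for line in f if line.strip()]
        f.seek(0)
        f.truncate()
    return events

# Example: the kind of locale data that prompted the investigation
log_event({"user": "123", "locale": "pt_BR"})
log_event({"user": "456", "locale": "en_US"})
batch = drain_log()
```

The appeal of the design is that the API request only pays for a single append to a local file; parsing, validation, and the database write all happen out-of-band, so a slow or broken database never blocks the request.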


For linking purposes this article's URL is:
http://asecular.com/blog.php?160428
