For your entertainment. A vignette from my pre-pivotal days.
I wake up, bleary-eyed, and roll out of bed. Squinting, I look at the time: 11:27. Perfect. I slide into a pair of jeans and make a pot of coffee. I drink it black; it pairs well with a nutritious breakfast of two heavily-toasted frosted strawberry Pop-Tarts.
I clamber uphill to my office on the 6th floor of Campbell Hall, eager to start my day. There will be much code to write.
Yesterday was rough. My advisor and I had to face the fact, yet again, that the results of our analysis were still off. She was correct of course: we still weren’t encabulating the diagraphical errors with sufficient bisectional amplitude.
Sigh. Another one of those “will I ever graduate” moments. Time is running out, I remind myself: I have to finish in two years if I want Arnold’s signature on my PhD.
But I have an idea. And I race uphill, eager to write the triphase meta-gaussian process code that might... just might!... encabulate my analysis errors with enough amplitude.
My office-mate hasn’t made it in yet so I’m all alone in my office-cave. Perfect. Headphones on, I nestle into my mouldering chair and lean back into my near-horizontal ergonomic position of choice (less bending of the wrists, you see).
The iMac flickers to life as I bring up a terminal:
$> cd ~/code2
$> mate analysis.py
I skim through the familiar file, deciding where to put the new encabulation method. I settle on line 3742, between the definitions of mq7_take2(data) and EE_Medium(data3, data).
But what to name this function? With barely a second thought:
def EE_3P_MG(data, metadata):
A quick copy/paste of EE_4Q_ML and it’s off to the races. I slip into a blessed state of flow, sliding globs of terse code around as I fly up and down analysis.py. I’m at home here. The variables are old friends (p2 and xj3 are particularly beloved – we’ve been through a lot together) and it’s always a fun challenge to remember how all the helper functions work.
I reach a commit point but keep going.
Now I just need to use the new EE_3P_MG code in the analysis. I skip past The Scary Bit, pretty sure it won’t depend on the new code, and run a quick find-and-replace across the rest of the file.
Almost done! Time to test the thing. Pointing it at the small dataset, I step away for an hour and a half to get some coffee and futz about on my phone.
I return to discover that the code bombed out 15 seconds into the analysis. “Oh right,” I say to myself, “the small dataset has insufficient permambulatory significance for a triphase encabulation.” Duh. Thankful for this short feedback loop, I decide to run against the full dataset.
But first – shuddering as I remember The Incident from last year – I put the code into my SCM of choice with a helpful commit message:
From: firstname.lastname@example.org
To: email@example.com
Subject: EE_3P_MG moar better

analysis.py

Onsi

<Attachment: analysis.py 689KB>
<Attachment: runner.py 3KB>
Just to be sure (again – The Incident) I check my quota on the mail server:
Used: 987 MB of 1 GB
Should be good for a while now that I’ve unsubscribed from all those cat video feeds.
With that out of the way:
$> mkdir run_1837
Wait. No, crap.
$> rmdir run_1837
$> mkdir run_1836
And now, victoriously:
$> ./runner.py -ds=full -tk=427 -out=./run_1836
Analyzing...
============
Using dataset: full
Will output to: /Users/onsi/code2/run_1836/out.pickle
Reticulating splines...
Retailoring dark matter halo trees:
21/182739 - 2s
46/182739 - 4s
…looks like it’ll take 5 hours or so. As this process usually burns my computer into the ground I decide it’s time for more coffee, some lunch, and a chance to fall asleep reading some papers.
I return, 5 hours later:
182720/182739 - 17912s
182742/182739 - 17914s
Trees retailored
Enhancing merger rates... done
Encabulating errors...
Traceback (most recent call last):
  File "/Users/onsi/code2/analysis.py", line 3920, in EE_3P_MG
    xj3[k] = p2*a[i:j-17]*a2[i+1:j-15]
IndexError: list assignment index out of range
Hmmph. It’s going to be another long night.