2013-06-10

Programming Katas with Anki

Last year I was confronted with the fact that some of my technical skills had grown stale - once they were good enough to get the job done, all progress seemed to stop.   I hadn't stopped learning - just stopped learning about certain technologies and products.   There were benefits to being in this rut, like having extra time to spend on the needs of my organization and on opportunities within my industry.   But all justifications aside, the bottom line was that my productivity was being whittled away by small, continuous inefficiencies in my development methods.   I was using vim like a glorified notepad, I was using the mouse rather than keyboard shortcuts in Firefox and Terminator, my use of Python was stuck in the 2.4 days, and I was consulting help on the find command far too often.

So, I settled on a strategy that I used many years ago - to learn one thing every day.   Which mostly worked.   Where it failed was in the specifics of what 'learn' means:  I was forgetting some things almost as quickly as I was learning them.   Certain items really required practice and repetition.

But why stop with just learning something well enough to do it slowly and painfully?   Ideally, I would know my tools so well that most common tasks don't require conscious thought at all.  This would eliminate unnecessary distractions and free me up to think about the harder problems at hand.   I want more of my development time to be in a mental state of flow.

2013-03-19

Gristle Slicer - of Architects, Chairs and Unix Utilities


There's an old story about two senior architects who were friends in college, and met again thirty years later.   After a few minutes they started talking about their favorite achievements.   The first described office towers, airports, and universities he was quite proud of.   The second didn't have any monuments to talk about, but shared that he thought he might have designed the perfect chair.   Clearly trumped, his friend congratulated him and asked to hear more - since the perfect chair is far more significant than yet another monument.

Sometimes I feel that small Unix utilities are to a programmer what a chair is to an architect:  they continue to be essential, are typically small and spare, do just a single thing, and can clearly show elegance.

I've written quite a number of them, and have recently started packaging those related to data analysis into a project called DataGristle.   My favorite utility of the set is gristle_slicer - a tool similar to the Unix program cut.   While cut allows the user to select columns out of a file, gristle_slicer selects columns and rows - and uses the more functional Python string slicing syntax to do it.  
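
To get a feel for the slicing syntax, here's a rough Python sketch of the same idea applied to rows and columns of a CSV file - just an illustration of the slicing semantics the tool borrows, not its actual code or command-line interface, and the file name is made up:

    # Illustration only: Python slice syntax applied to rows and columns,
    # the same semantics gristle_slicer borrows (not the tool's real code).
    import csv

    with open('input.csv', newline='') as f:      # hypothetical input file
        rows = list(csv.reader(f))

    # "rows 10 through 19, the first three columns plus the last one",
    # written the way a Python programmer would slice a list of lists:
    selected = [row[0:3] + row[-1:] for row in rows[10:20]]

    for row in selected:
        print(','.join(row))

The same selection with cut alone would need a second tool just to pick the rows - which is exactly the gap this utility fills.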

It's no perfect chair but it might be a good utility.  

2013-02-08

The Hidden Value of Crappy Experience

A friend was recently sharing his experience in helping his company fix a botched RAID recovery.  He was exhausted by the work and questioning the value of what he was learning from his employer.

But in our discussion we agreed that bad experience might be as valuable as good experience in some ways.    It might not be as fun, and you can certainly hit diminishing returns on it.   But the intense memories of those bad experiences are the basis for many of our most valuable instincts and judgements.

Well, at the time, it felt kinda nice to be able to help him put a happy face on his obnoxious job.

And then just this week I had the tables turned on me:  a server I depended on failed when a battery swelled, warped the motherboard, and took it out.   Even better, the backups only appeared to be working, but really weren't.   And that's not all: there was some code on that server that wasn't yet in version control because of chaos on the team at the time - after existing in limbo for two years it was slated to be added this week.   We were down for days replacing that server.   And I suppose we were lucky - it could have been worse.

All this got me thinking about my glib advice to my friend about the value of crappy experience.   Was there a silver lining for me in all this?   Did this week add to my mental catalogue of situations to avoid?   Well, just as an exercise I decided to write down what the catalogue might look like.   At least just the part that deals with backups & recoveries.   Here it is:

2013-01-22

Programmers and Practice vs Training

Lately I've been trying to learn one new thing every day.   And one of the sad discoveries that I've made is that I'm capable of forgetting things almost as quickly as I learn them.

So, two months after learning how to write vim macros, I've already forgotten the specific keys used to define and run them.   Now, I can re-learn this very quickly - I've got good notes, the memories are just slightly hidden, and I haven't forgotten any concepts, just simple keystrokes.   But this will slow down my use enough that I probably won't pull this tool out when I need it.

This got me thinking about how I needed to complement training with some repetition, some practice.   Just learning something isn't good enough.   This is exactly what martial artists do - they would call these katas.   It's also what musicians do - they would call these scales.


2012-10-10

Rediscovering the Benefits of Simple Design

Recently, I met a coworker from almost twenty years ago whose clearest memory of our time together was our discussions about design - and how I got us all to make a field trip to the break room to take a look at the microwave oven there.

It had a dial and no buttons.  Pull the handle and it shut off the element automatically.   I loved the simplicity of this.   I loved how it made no demands of the user, and anybody could immediately put it to use.   There was no training, no documentation, no "insufficiently skilled users".

At the time we were rolling Microstrategy out to hundreds of users.   Microstrategy is a ROLAP (Relational On-Line Analytical Processing) tool: once you provided data in a relational database within a star schema and described it to Microstrategy as metadata, any user could use it easily - they could quickly create new reports by dragging and dropping element names, and it would generate the SQL for them.   It was a very powerful tool that in the right hands could achieve amazing results.   Prior to our roll-out of this tool the backlog on reports for our organization was ten months.   After we rolled it out I signed onto and delivered an 8-hour average SLA for the creation of new reports.


2012-09-20

In Praise of the Embarrassingly Simple

Recently, while I was having lunch with some former members of my project, the conversation drifted to some of the old code that's still around.   These guys are incredibly good programmers, and so many of their contributions are still running today - four to six years after they left the project.

One of the items that we discussed was our "batch broker" - a process responsible for handing out batch ids that uniquely identify processes, end up in logs and audit tables, and sometimes get tagged to rows in the database.

We laughed about how embarrassingly simple this process was: just a few dozen lines of Python code that
  • open up and lock a file
  • increment the number within
  • close & unlock the file
  • log the requester & new batch_id
  • return the batch_id
Our myriad batch programs (transforms, loads, publishes, etc.) then simply call a bash or Python function on their local system, which calls this program remotely over ssh to get a new batch_id.   The total amount of code is maybe 50 lines across all libraries.
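
As a rough idea of what a few dozen lines like that might look like, here's a minimal Python sketch of such a file-backed broker - the locking approach, file paths, and function names are my own illustrative assumptions, not the project's actual code:

    #!/usr/bin/env python
    # Minimal sketch of a file-backed batch-id broker (illustrative only).
    import fcntl
    import logging
    import sys

    COUNTER_FILE = '/var/lib/batch_broker/next_batch_id'   # hypothetical path
    LOG_FILE = '/var/log/batch_broker.log'                  # hypothetical path

    logging.basicConfig(filename=LOG_FILE, level=logging.INFO,
                        format='%(asctime)s %(message)s')

    def get_next_batch_id(requester):
        """Lock the counter file, increment it, log the grant, return the id."""
        with open(COUNTER_FILE, 'r+') as f:
            fcntl.flock(f, fcntl.LOCK_EX)        # one caller at a time
            try:
                batch_id = int(f.read().strip() or '0') + 1
                f.seek(0)
                f.write(str(batch_id))
                f.truncate()
                logging.info('granted batch_id %s to %s', batch_id, requester)
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)    # also released on close
        return batch_id

    if __name__ == '__main__':
        # Callers run this remotely (e.g. over ssh) and read the id from stdout.
        print(get_next_batch_id(sys.argv[1] if len(sys.argv) > 1 else 'unknown'))

A caller's wrapper can then be a one-liner along the lines of ssh broker-host batch_broker.py my_load_job, capturing the printed value as the new batch_id.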

2012-09-10

Learning 1 Thing Every Day

When I was 18 and a programmer in the USMC I decided the best way for me to become skilled was to learn one thing every day about programming in addition to my daily duties.   I recruited a colleague and each of us committed to learning and sharing our discoveries.

By learning I don't mean just reading about some feature or method, but instead studying it to the degree necessary to be able to easily apply it later.   Fitting this extra work into our schedules meant that most of these discoveries were fairly small.    But they accumulated and built upon one another very quickly.    Perhaps more importantly this strategy positively affected our daily outlook by helping us frame our day within an optimistic, learning context.

Decades later I'm a mid-career technologist who tends to neglect my technical skills while focusing on organization, communication, and resource issues necessary to get projects successfully deployed.  So, I've decided to resurrect this strategy to resharpen my skills and inject some more fun into my day.   I'm going to use this blog to help me track these items and summarize the impacts.

2011-03-18

Data Warehouse ETL for Data Scientists

At Strata I attended a discussion panel in which a number of speakers described the various types of work involved in data science:  data scrubbing, data analysis, presentation, etc.  The general consensus was that data scrubbing was the most time-consuming task in data science.   I've also found this to be true on data warehousing and data mining projects, so no surprise here.

The most interesting part of the discussion was when an audience member asked if the panel could recommend any tools to help with the data scrubbing.  The answer was "no".

I spoke with the panel members afterwards and found that they were completely unfamiliar with data warehousing and, of course, ETL.   So, it appears, is the author of "Data Analysis with Open Source Tools".   So is just about everyone I've met working in this field.

Of course this is mostly just because data warehousing came from the database community and the new interest in data analysis has come from the programmer community.   There's certainly no problem in having a different community re-explore this space and possibly find new and better solutions.   The problem is that the more likely scenario is a vast number of projects that fail because of the performance, data quality, or maintenance costs associated with solving this problem poorly.

2011-02-14

Analysis and the 'So What' Question

While at Strata I had an opportunity to participate in quite a few sessions that demonstrated how to take raw data and analyze it with various tools.  The output was usually a set of graphs, charts, etc, though sometimes just simple tables.   All of this was useful to get a sense of how the tools work, but what was missing was the final step in the analysis - a powerful insight or understanding that one could use to make an intelligent change to a process.   Generally, the presentation technique was fine, the tools were great, but the demonstrated impact of the tools was trivial.

One reason for this is that some of the presenters may have to hold back on their most significant discoveries until the right time - and this just wasn't that time, or this wasn't the right audience.   I can understand this - since most of my best analysis can't really be shown without getting NDAs and other agreements in place first.   Another reason is that the presenters might have wanted to focus on the tool and not the data or business being studied, which is just serving as a necessary example to work on.   But this is misguided, since delivering insights is the bottom line - not delivering pretty pictures.   The last reason I can imagine is that delivering powerful insights is hard, and while these presenters are working on it they may not yet have a suitable example.   And I think that this is the most likely answer.

My concern is that people spend a lot of time building gorgeous but empty-headed analytical solutions that just don't have much to say.    This is pretty similar to the chart junk problem that Edward Tufte complains about.   To make this a little more clear I've included a few examples below.

2011-02-10

Breadth of Data vs Depth of Analysis

One of the things that I felt was missing from O'Reilly's Strata Conference was a nuanced sense of the trade-offs between complex analysis and vast volumes of data.   Because there is a trade-off, and I've seen it play out consistently.   It works like this: where do you spend your investment?
  • deep analysis - with unpredictable costs and benefits
  • broad sets of data - with predictable (high) costs and benefits