At Strata I attended a discussion panel in which a number of speakers described the various types of work involved in data science: data scrubbing, data analysis, presentation, etc. The general consensus was that data scrubbing was the most time-consuming task in data science. I've also found this to be true on data warehousing and data mining projects, so no surprise here.
The most interesting part of the discussion was when an audience member asked if the panel could recommend any tools to help with the data scrubbing. The answer was "no".
I spoke with the panel members afterwards and found that they were completely unfamiliar with data warehousing and, of course, ETL. So, it appears, is the author of "Data Analysis with Open Source Tools". So is just about everyone I've met working in this field.
Of course, this is mostly because data warehousing came from the database community and the new interest in data analysis has come from the programmer community. There's certainly no problem in having a different community re-explore this space and possibly find new and better solutions. The problem is that the more likely scenario is a vast number of projects that fail because of the performance, data quality, or maintenance costs that come from solving this problem poorly.
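To make the scrubbing work a little more concrete, here's a minimal Python sketch of the kind of field-level cleanup that eats so much project time. The file name, field names, and rules here are purely illustrative - not from any real project - and a mature ETL tool would express rules like these declaratively, but the substance is the same.

    import csv
    import re

    def scrub_row(row):
        """Apply simple, illustrative cleanup rules to one record."""
        cleaned = {}
        # Trim stray whitespace and collapse internal runs of spaces.
        for key, value in row.items():
            cleaned[key] = re.sub(r"\s+", " ", (value or "").strip())
        # Normalize phone numbers to digits only; blank out anything that isn't 10 digits.
        digits = re.sub(r"\D", "", cleaned.get("phone", ""))
        cleaned["phone"] = digits if len(digits) == 10 else ""
        # Map the usual unknown-value tokens to a single convention.
        if cleaned.get("state", "").lower() in ("", "n/a", "none", "unknown"):
            cleaned["state"] = "UNK"
        return cleaned

    # Hypothetical input: a customer extract with inconsistent formatting.
    with open("customers.csv", newline="") as src:
        rows = [scrub_row(r) for r in csv.DictReader(src)]

The point isn't the code - it's that every source system needs dozens of little rules like these, and working them out is a large part of why scrubbing is so time-consuming.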
Data increasingly defines our experiences, realities and opportunities. But data doesn't do anything on its own - it takes people and methods to refine it and make it valuable. This blog is my effort to work on these ideas - mostly for myself, but also for anyone else interested.
2011-03-18
2011-02-14
Analysis and the 'So What' Question
While at Strata I had an opportunity to participate in quite a few sessions that demonstrated how to take raw data and analyze it with various tools. The output was usually a set of graphs, charts, etc., though sometimes just simple tables. All of this was useful for getting a sense of how the tools work, but what was missing was the final step in the analysis - a powerful insight or understanding that one could use to make an intelligent change to a process. Generally, the presentation technique was fine and the tools were great, but the demonstrated impact of the tools was trivial.
One reason for this is that some of the presenters may have to hold back their most significant discoveries until the right time - and this just wasn't that time, or this wasn't the right audience. I can understand this - most of my own best analysis can't really be shown without getting NDAs and other agreements in place first. Another reason is that the presenters might have wanted to focus on the tool rather than the data or business being studied, which was just serving as a necessary example to work on. But this is misguided, since delivering insights is the bottom line - not delivering pretty pictures. The last reason I can imagine is that delivering powerful insights is hard, and while these presenters are working on it they may not yet have a suitable example. I think this last one is the most likely answer.
My concern is that people spend a lot of time building gorgeous but empty-headed analytical solutions that just don't have much to say. This is very similar to the chartjunk problem that Edward Tufte complains about. To make this a little clearer I've included a few examples below.
2011-02-10
Breadth of Data vs Depth of Analysis
One of the things that I felt was missing from O'Reilly's Strata Conference was a nuanced sense of the trade-offs between complex analysis and vast volumes of data. There is a trade-off, and I've seen it play out consistently. It comes down to where you spend your investment:
- deep analysis - with unpredictable costs and benefits
- broad sets of data - with predictable (high) costs and benefits
2011-01-28
Buy, Reuse or Build ETL Software?
While talking with someone today, he mentioned a concern about my team's "homegrown" software: that it would nickel-and-dime us to death compared to "more robust commercial software". I respected this guy - he was very bright and had a lot of successes under his belt. But I also felt that he was both echoing a common corporate perception and quite wrong.
I've run into this notion so often that I now plan for it: in the minds of many, commercial software has more credibility than open source software, which in turn has more credibility than custom-built software. And since these perceptions are often held by those who control my budget - perceptions matter.