Part 2: Centralize and conquer
In Part 1 of this series, I argued that a centrally-hosted analytics system built on d3.js is superior to one based on Tableau (or some other proprietary tool — I use Tableau in the example because it's probably the most prominent in the gaming industry) because it allows for better analytics consistency: when metrics are defined at the system level, they're calculated the same way throughout the organization.
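To make the consistency point concrete, here is a minimal sketch of what "metrics defined at the system level" might look like in JavaScript. The metric names and figures are invented for illustration; the idea is simply that every dashboard imports one shared definition rather than re-deriving the formula in its own worksheet.

```javascript
// Hypothetical sketch: metrics defined once, at the system level.
// Every report imports these definitions, so a metric like ARPDAU
// is calculated identically everywhere in the organization.
const metrics = {
  // Average revenue per daily active user
  arpdau: ({ revenue, dau }) => (dau === 0 ? 0 : revenue / dau),
  // Day-1 retention: share of an install cohort active the next day
  d1Retention: ({ installs, returnedDay1 }) =>
    installs === 0 ? 0 : returnedDay1 / installs,
};

// Any dashboard in the organization computes the metric the same way:
const todaysArpdau = metrics.arpdau({ revenue: 1250, dau: 5000 });
console.log(todaysArpdau); // 0.25
```

With a worksheet-based system, each analyst would instead encode these formulas by hand, and definitions inevitably drift apart.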
The subject of this post is the second reason I think a hosted system is better than a “worksheets”-based system: analytics access.
When I say “worksheets”-based analytics, I’m talking about any analytics system wherein data is procured by an employee and then manipulated using some tool (in this example, Tableau, but I could also be talking about Excel). The problem with this system is access: limiting access to specific data segments in a Tableau-based analytics system is cumbersome. The only way to effectively do it is to create specific datasources for specific employee groups, which isn’t optimal for two reasons: 1) it could result in an inordinate amount of storage redundancy, and 2) an organization might need to define hundreds of “employee groups” to grant appropriate access levels to every employee.
And what happens when someone leaves the organization? The credentials for the datasources must be reset, which could disrupt the functionality of an unknowable number of worksheets because usage can't be monitored. I've experienced both of these problems first-hand, and they are incredibly frustrating: access requests in a large corporation can take days to resolve, and datasource password changes are obnoxiously monotonous.
But at least access to data can be denied in Tableau: if an organization is running its analytics from Excel, the data in those worksheets persists forever. Zynga understands very well how easily an employee can abscond with a USB stick full of strategy documents when an analytics system is overly reliant on Excel.
Now imagine the opposite: an analytics system built around d3.js, where intranet credentials grant access to a CMS in which each pre-defined report set is restricted based on user privileges. Data can be stored in one place, and it doesn't need to be replicated for different "user groups" because access is handled by the CMS rather than at the datasource level. When an employee leaves, their intranet credentials are deleted and they no longer have access to the analytics system. And even if the data is downloadable, the metrics are pre-defined in the report: strategic information can't be looted. The best part about d3.js is that, as a JavaScript library, it is completely platform-agnostic: my d3.js dashboard looks exactly the same on my iPhone as it does on my MacBook.
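The CMS-side access check described above can be sketched in a few lines. This is an illustrative assumption, not a prescribed implementation: report-set names, privilege labels, and the shape of the user object are all invented, and in practice the privilege list would come from the intranet credential system.

```javascript
// Hypothetical sketch of the CMS access check: each pre-defined report
// set declares which privilege levels may view it, and a user's intranet
// credentials map to a list of privileges.
const reportSets = [
  { id: 'daily-kpis',       allowed: ['analyst', 'executive'] },
  { id: 'revenue-by-title', allowed: ['executive'] },
  { id: 'ab-test-results',  allowed: ['analyst'] },
];

function visibleReports(user, sets) {
  // A departed employee (credentials deleted, so no privileges) sees
  // nothing -- revoking intranet access revokes analytics access in one step.
  if (!user || !user.privileges) return [];
  return sets
    .filter((s) => s.allowed.some((p) => user.privileges.includes(p)))
    .map((s) => s.id);
}

console.log(visibleReports({ privileges: ['analyst'] }, reportSets));
// ['daily-kpis', 'ab-test-results']
console.log(visibleReports(null, reportSets)); // []
```

Because the check lives in one place, there is no need to clone datasources per employee group, and offboarding is a single credential deletion rather than a round of password resets.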
Of the commercial analytics packages available, I do think Tableau is the best. But I think analytics is moving in a direction that favors consistency and centralization; given this tectonic shift, I think d3.js is the best toolkit for developing a high-impact, highly-visible analytics platform. Closed solutions are hard to manage, and desktop software is less flexible than d3.js for producing visualization-heavy reports. Once expense is taken into account, I think the d3.js-based analytics platform handily trumps Tableau.