This week’s guest blogger is Larry Keller. Larry has been involved in data visualization and analysis for over 10 years and worked on the team that created Impact, the first data visualization solution. He worked for Business Objects, SAS, and Tableau before founding VIA 5 years ago. He has been a guest speaker at industry events, and VIA’s Tableau documentation is now available in 45 countries and in 46 of the 50 states. Contact us if you would like to be one of our guest bloggers.
Over the years, Tableau Server has emerged as a very powerful platform through which clients can share visual analyses on an enterprise scale. There are now Tableau Server implementations with thousands of users, or Interactors (web clients), as they are called in the Tableau lexicon.
Years ago, a CIO or EVP of IT was not typically involved in the purchase of Tableau Server and Desktop, as it was often perceived as a departmental purchase with the cost posted to the requesting user or department. Further, IT welcomed the addition of Tableau because it often lessened the burden on IT resources for the care and feeding of reporting and yet another technical environment. Tableau Desktop was, and is, considered a discovery tool with myriad data connections, but over the years the game has changed. Desktop users, or authors, can no longer “fire and forget” (author and publish) when creating a workbook that will be consumed not only by thousands of internal users but also by Interactors, clients enabled to see reports via the server on a scheduled basis. Here is how the game of fire and forget is changing.
The acquisition of Tableau Server now involves the CIO or EVP of IT more frequently. Tableau has a sales and consulting team recruited specifically to make Tableau a standard rather than a one-off choice for enterprise implementations.
The reasons are simple: Tableau has proven itself best of breed in the field of data visualization and analysis, but more importantly it is being implemented as the enterprise solution by F-2000 companies and those below that benchmark. Both Tableau and QlikView are forecast to outpace not only Excel but also the legacy business intelligence solutions, which I will not name here, that are becoming passé. Read more here.
The entry into enterprise system solutions includes the consideration of trusted sources: those systems of record (SORs) that a CIO knows to be vetted and thus trusted for wide distribution. Today, Tableau Server deployments include test, development, and production environments. If I am the CIO of a financial institution and want to be 100% sure that no “rogue” workbooks are being published to the server, it is necessary to invoke manual procedures. Example: assume that Oracle is a trusted SOR and a desktop author combines Oracle with an Excel file from an unknown source, using data blending techniques and some calculations. To ensure that the production version of the workbook has been vetted, the author typically has to publish it to the development environment so another employee can validate the source of the Excel file, thus slowing the publication process. In essence, the dev environment becomes a quarantine until the source(s) are vetted.
With the myriad data sources to which Tableau Desktop can connect, what solutions might Tableau consider to attenuate this constraint? When data sources that are not yet considered trusted are used repetitively, would it be possible to recognize and approve them systemically after vetting? Example: assume that the author connects to Oracle and then to a corporate data warehouse in the same workbook. If that data warehouse and its associated marts have passed previous quality checks, why delay the publication? I envision an approach where the Oracle and data warehouse workbook is published to the dev environment and then automatically promoted to the production server. The author can be notified and proceed with scheduling. The suggestions above would apply to extracts as well. Data Source Validation might be the moniker for this feature.
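The promotion logic imagined above could be sketched roughly as follows. To be clear, this is a hypothetical illustration, not an existing Tableau feature: the trusted-source list, function name, and workbook names are all invented for the example.

```python
# Hypothetical sketch of "Data Source Validation": route a workbook to
# production automatically when every data source is a vetted SOR, and
# quarantine it in development otherwise. All names are illustrative.

TRUSTED_SOURCES = {"oracle_finance", "corporate_dw"}  # vetted systems of record

def route_workbook(name, data_sources):
    """Return the target environment plus any sources still needing vetting."""
    untrusted = [s for s in data_sources if s not in TRUSTED_SOURCES]
    if untrusted:
        # Quarantine in dev until an administrator vets the unknown sources.
        return ("development", untrusted)
    # All sources trusted: promote automatically and notify the author.
    return ("production", [])

# An Oracle + data warehouse workbook is promoted straight through,
# while a blend with an unknown Excel file is held in development.
print(route_workbook("q3_revenue", ["oracle_finance", "corporate_dw"]))
print(route_workbook("q3_blend", ["oracle_finance", "regional.xlsx"]))
```

Once a previously unknown source is vetted, adding it to the trusted list would let future workbooks that use it flow to production without the manual detour described above.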
To complement Data Source Validation, server stakeholders could be enabled to “Tableau Tableau”. This is not a typo but an instructive approach to better understanding how to live with Tableau, and to do so with an understanding of the server database. Like any database, Tableau Server has views that server administrators can use. Envision a Tableau Packaged Workbook with preconfigured connections to the more critical views in the server. More importantly, envision leveraging the server investment by “seeing” who is publishing what and when. Are these authors abiding by the standards set by management, in a manner one might consider IT governance? This notion is not intended to quash Tableau’s tagline of “discover and collaborate”. Rather, it is intended to ensure corporate information can be distributed on a timely, enterprise-wide basis, knowing that data sources have gone through an approval process.
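To illustrate the kind of governance question such a packaged workbook would answer, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for the server's repository. The table and column names are invented for illustration; they are not Tableau's actual repository schema, which varies by version.

```python
import sqlite3

# Stand-in for a publish-audit view that a server repository might expose.
# Table and column names here are illustrative, not Tableau's real schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE publish_events (author TEXT, workbook TEXT, published_at TEXT)"
)
conn.executemany(
    "INSERT INTO publish_events VALUES (?, ?, ?)",
    [
        ("alice", "q3_revenue", "2014-01-06"),
        ("bob", "adhoc_scratch", "2014-01-07"),
        ("alice", "q3_revenue", "2014-01-08"),
    ],
)

# The governance question from the text: who is publishing what, how often?
rows = conn.execute(
    "SELECT author, COUNT(*) AS publishes FROM publish_events "
    "GROUP BY author ORDER BY publishes DESC"
).fetchall()
for author, publishes in rows:
    print(author, publishes)
```

A packaged workbook pointed at the real repository views would let administrators run exactly this sort of aggregation visually, without writing SQL.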
One last thought: it would be a huge help to Tableau enterprise users if there were a repository for standardized calculations. While this is not a systemic collaboration issue, it does address a common problem: multiple departments creating the same calculation in three different ways. Yes, this is an issue even when there is only one financial model.
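A minimal sketch of what such a shared calculation repository might look like, assuming a simple mapping from metric name to a single approved definition; the metric name and formula below are purely illustrative.

```python
# Illustrative sketch of a shared calculation repository: one vetted
# definition per business metric, reused rather than re-derived per team.

CALC_REPOSITORY = {
    # metric name -> (formula text for documentation, implementation)
    "gross_margin": (
        "(revenue - cogs) / revenue",
        lambda revenue, cogs: (revenue - cogs) / revenue,
    ),
}

def get_calc(name):
    """Look up the single approved implementation of a named calculation."""
    formula, fn = CALC_REPOSITORY[name]
    return fn

# Every department that asks for "gross_margin" gets the same answer.
margin = get_calc("gross_margin")(revenue=200.0, cogs=150.0)
print(margin)  # 0.25
```

The point is not the mechanism but the single source of truth: when finance, sales, and marketing all pull "gross_margin" from one registry, the three-ways problem disappears.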