Welcoming New Changes with Technology
By Carl Hoburg, SVP, CIO, Progrexion
One important way my role as CIO has changed in the past few years is that I now occasionally help educate my fellow executives. Useful new technology continuously becomes available, so I allocate roughly 10 percent of my time each month to learning about the most promising new tools and, when appropriate, introducing them to my colleagues.
A good recent example at my current firm is the introduction of advanced analytical tools and the new database technologies that enable them. In addition to the research I do on my own, we’ve formed a small advanced analytics team that has evaluated the most promising tools. With the introduction of several new technologies, analytical exercises that once took days or weeks can now be completed in a few hours. This has significantly improved the quality of our decision making.
An important part of the success of our advanced analytics initiatives has been the education of our top executives and several of their key direct reports. We started about 19 months ago with an initial overview for the entire executive team and have continued with frequent ad hoc meetings that walk through the problems we are addressing and the results of our analysis. About half of the executive team understood the approach and the benefit immediately; by now, everyone on the team understands the value and has embraced the tools and techniques we’ve recently implemented. In addition to facilitating this learning, my strong finance background has enabled me to contribute to overall corporate strategy and planning. I have found this especially useful across multiple industries, including health care, consumer services and, of course, banking.
One last critical skill worth mentioning is vendor management. CIOs must be able to negotiate long-term contracts effectively and put in place the staff and processes to manage complex vendor relationships.
The Importance of Data Quality
Data quality is an area in which we’ve invested significantly over the past year, and we expect to continue to grow this investment. Several of the top engineers on our team have been involved in a recent exercise to significantly improve data quality, including the heads of our software development, advanced analytics and DBA teams. During the initial three months of this effort, I spent five to 10 percent of my time per week on the initiative.
Improving data quality is key to realizing the full value of our recent investments in advanced analytics capabilities. As a result of dramatic improvements in the quality of our client data, we are gaining a deeper understanding of how clients interact with us across multiple channels. This understanding serves as the basis for improving our product set as well as our ability to match prospective clients to the appropriate product during the sales process.
Improved data quality combined with our advanced analytics capabilities is also enabling more efficient marketing spend for both on-line and off-line media such as radio and television.
Improved governance has played a key role in improving our data quality. It is a lightweight process, which has allowed us to remain nimble while eliminating the root causes of poor quality early in the development cycle.
Leveraging Big Data Tools
The technology we use that falls under the heading of “Big Data” includes newer database technologies such as Hadoop and columnar data stores. Additionally, we are supplementing more traditional business intelligence tools such as Tableau with machine learning tools such as Weka and Knime. The quality and speed of our analysis have improved tremendously.
“Improved governance has played a key role in improving our data quality”
Two examples are worth noting here. First, our finance colleagues make heavy use of Tableau for ad hoc analysis as well as defined weekly, monthly and quarterly reports. We are in the process of moving our largest data repository from a relational DBMS to a columnar data store. One set of analysis that regularly ran in four hours in the relational database takes less than five minutes in the columnar database.
Second, our advanced analytics team is able to run multiple scenarios using a combination of Knime, Weka and R in less than 10 percent of the time that similar analysis took in the past.
We are now able to do useful analysis on unstructured data in our Hadoop cluster, which we were not able to do earlier. We will soon be doing real-time analysis on unstructured data from our clients, such as emails, so that we can better predict when they are likely to leave; this will allow our customer service staff to contact a client before they cancel their service, to see if there might be a problem we can address.
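To make the idea concrete, a churn signal drawn from client emails can be sketched in a few lines. The keywords, weights and threshold below are hypothetical illustrations, not our production model, which uses far richer machine learning techniques:

```python
# Illustrative churn-risk scorer over client email text.
# The signal words, weights and threshold are made-up examples.

CHURN_SIGNALS = {
    "cancel": 0.5,
    "frustrated": 0.3,
    "competitor": 0.3,
    "refund": 0.2,
    "disappointed": 0.2,
}

def churn_risk(email_text: str) -> float:
    """Return a 0-1 risk score based on weighted keyword hits."""
    text = email_text.lower()
    score = sum(w for word, w in CHURN_SIGNALS.items() if word in text)
    return min(score, 1.0)

def flag_for_outreach(email_text: str, threshold: float = 0.4) -> bool:
    """Flag a client for a proactive customer-service call."""
    return churn_risk(email_text) >= threshold

# This message trips the threshold, so customer service would reach out.
msg = "I'm frustrated and considering whether to cancel my plan."
print(flag_for_outreach(msg))  # True
```

In production, the same scoring step would run continuously against the email stream in the Hadoop cluster, with the flagged clients routed to the customer service queue.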
We are in the process of integrating our machine learning algorithms into real-time client segmentation tools for our sales and customer service colleagues. We expect the results to be higher client satisfaction as well as increased profitability.
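As a rough sketch of the kind of segmentation involved, a simple k-means clustering over two client features is shown below. The features, data points and starting centroids are invented for illustration; a real segmentation would draw on much richer inputs such as channel activity, tenure and product mix:

```python
# Minimal k-means sketch for client segmentation (illustrative only).

def kmeans(points, centroids, iterations=10):
    """Cluster 2-D points, e.g. (normalized spend, normalized support calls)."""
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            dists = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# Hypothetical clients: (normalized monthly spend, normalized support calls).
clients = [(0.9, 0.1), (0.8, 0.0), (0.2, 0.5), (0.1, 0.6), (0.85, 0.2), (0.15, 0.4)]

# Fixed starting centroids keep the example deterministic.
centroids, segments = kmeans(clients, centroids=[(0.1, 0.6), (0.9, 0.1)])
# Segment 0 gathers the low-spend, high-contact clients; segment 1 the
# high-spend, low-contact ones — candidates for different outreach.
```

Wiring output like this into the sales and customer service tools is what lets a representative see, in real time, which segment a client falls into before the conversation starts.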