Analytics and big data are top strategic priorities for many CIOs, and rightfully so. Most organizations are sitting on a goldmine of data, but few have begun to mine it for its real transformative value. Unfortunately, many IT leaders remain on the sidelines because they believe investing in analytics would be too costly.
But analytics do not have to be expensive. Technology and service vendors would have you believe that best-in-class analytics requires millions of dollars and years of waiting before its value materializes. That is not necessarily the case. There are good inexpensive or free tools, as well as insightful books full of examples from many industries to learn from and draw inspiration from. For example, Eric Siegel’s book “Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die” is a good starting place. As long as you have the data and the desire to learn from it, you can start to see results and value quickly and cost effectively.
We have done exactly that at Crawford & Company. Our analytics operation is not very large, but it is highly effective because we focus on the areas with the biggest potential, and we stay agile, using open source and inexpensive tools whenever it makes sense. Our results with Crawford iQ™ Analytics are among the very best in the industry. Again, do not be scared off by the price tag; a robust and sophisticated effort does not have to be expensive.
You may be skeptical, thinking, “Yeah, right. Every time we try it, it costs us a lot of money and we never finish.” But just look at the new generation of start-ups in the technology industry. A majority of them build their value on analytics, and they challenge the dominance of established companies across a wide range of industries by delivering data-driven smart processes, products, and services at relatively low cost.
The availability of inexpensive but advanced analytics tools, combined with the government releasing treasure troves of data and the avalanche of “user exhaust” data generated in social networks, enables these start-ups to bring innovative products and services to market with little funding. They do not need millions of dollars and years of development work to achieve significant value, or to become a disruptor in their industry along the way. It does not take a team of Ph.D.s to get there either; everything is much more accessible these days.
Consider internally and externally available data and open source options
When deciding where to start, look to your proprietary data, the data nobody outside your organization has. It does not have to be perfectly clean to be valuable. But do not stop with what is immediately available. Augment your data with whatever is offered by public sources such as data.gov and by commercial sources catering to your industry. Do not ignore topics that might seem far afield: knowing how demographics affect response rates, how ambient light affects crime, how TV programming affects dining out, or how precipitation affects seed germination can help you improve your product or service. There is data available to find such patterns and help you make better decisions.
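To make the augmentation idea concrete, here is a minimal sketch of joining internal records with a public dataset on a shared key. The field names, values, and the ZIP-code key are all hypothetical, chosen only for illustration; real work would pull the public side from a source like data.gov.

```python
# Illustrative only: all field names and values below are hypothetical.

# Internal proprietary records, keyed by ZIP code.
claims = [
    {"zip": "30301", "claim_count": 120},
    {"zip": "30305", "claim_count": 45},
]

# Public demographic data (e.g. downloaded from data.gov), same key.
demographics = {
    "30301": {"median_income": 48000},
    "30305": {"median_income": 92000},
}

# Augment each internal record with the matching public attributes;
# records with no public match are kept unchanged.
enriched = [{**row, **demographics.get(row["zip"], {})} for row in claims]

for row in enriched:
    print(row)
```

The enriched records now carry both the proprietary measure and the public context, which is exactly the kind of combined view that pattern-finding tools need.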
When it comes to tools, excellent open source options are ready and available. For statistical analysis, the R programming language is by far the most popular. For predictive modeling, RapidMiner, KNIME, and a few others are very effective. For geographic mapping, Google Earth, QGIS, and other free or inexpensive options can handle the majority of needs. For managing the data itself, there are open source databases such as MySQL and PostgreSQL, and most notably the Hadoop stack. You may not even need an open source database, because you can likely use whatever database technology your organization has already chosen.
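To show how little machinery a first predictive model really needs, here is a sketch of an ordinary least-squares line fit written in plain Python with made-up data points. This is not a substitute for R, RapidMiner, or KNIME; it only illustrates that the core idea of fitting and predicting is approachable.

```python
# Hypothetical observations: x might be, say, policy age; y, claim cost.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least squares: slope = covariance(x, y) / variance(x).
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    / sum((x - mean_x) ** 2 for x in xs)
)
intercept = mean_y - slope * mean_x

def predict(x):
    """Predict y for a new x using the fitted line."""
    return intercept + slope * x

print(f"y = {intercept:.2f} + {slope:.2f} * x")
print(f"prediction at x=6: {predict(6.0):.2f}")
```

Dedicated tools add cross-validation, richer model families, and diagnostics, but the fit-then-predict loop above is the pattern they all share.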
At Crawford we set up a Hadoop cluster prototype on a group of desktop PCs more than a year ago. We ran typical queries against a typical data set and achieved an amazing performance gain of two orders of magnitude compared with the same computing capacity on a traditional commercial database, all without any significant investment in the system. These open source tools can be used very effectively to shorten time to value.
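For readers unfamiliar with how Hadoop gets that kind of speedup, here is a toy, single-machine illustration of the map/shuffle/reduce pattern that Hadoop runs in parallel across a cluster. The record fields are hypothetical; the point is the shape of the computation, not the data.

```python
# A toy map/reduce aggregation: total amount per state.
# Hadoop executes this same pattern in parallel across many nodes.
from collections import defaultdict
from functools import reduce

records = [
    {"state": "GA", "amount": 1200},
    {"state": "GA", "amount": 800},
    {"state": "TX", "amount": 500},
]

# Map phase: emit a (key, value) pair for each record.
mapped = [(r["state"], r["amount"]) for r in records]

# Shuffle phase: group the emitted values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: combine each key's values into a single result.
totals = {
    key: reduce(lambda a, b: a + b, values)
    for key, values in grouped.items()
}

print(totals)
```

Because the map and reduce steps touch each record independently, the cluster can split the work across cheap commodity machines, which is where the large performance gains come from.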
Some IT executives are uncomfortable with, or even scared of, using open source tools, but in many cases their fears are, quite frankly, not justifiable. Obviously, there are excellent commercial tools out there, at a price. But if money is an issue, it is much better to start with open source tools than not to start at all. Be resourceful and see how far you can go; you might just be very surprised.
Even though everyone is talking about big data and advanced analytics these days, many organizations are just getting started, especially traditional companies. By traditional I mean brick-and-mortar businesses, not technology-driven organizations. The problem is that until you reach a certain speed of analyzing data and investigating results, you may not be able to show value. But reaching that speed can carry a high cost if you go down the route of expensive commercial software and professional services. So executives have a pretty tough time justifying millions of dollars of investment before value can be demonstrated. Again, though, it does not have to cost as much if you start with open source.
Get ahead or get left behind
If you are not trying to be more agile and smart about using data to improve operations, quality of service, products, client relationships, and human capital, then your organization will be left behind. There is no doubt about it. It comes down to the market economy and relentless competition. Competition is always a given; today it is driven by the incredible leap in information technology, the amount of available data, and computing capacity. A hundred years ago it was driven by industrialization, and before that by other breakthroughs. Innovation and competition have always existed, and they will continue to significantly shape our society.
Somebody is always eager to take your place. But they likely do not have your data at their fingertips. You have the power to turn that data into actionable insights that will improve your bottom line. Do it now, before somebody else beats you to it.
Sergo Grigalashvili leads global efforts in enterprise architecture, analytics, and systems road mapping at Crawford & Company. His responsibilities include: enterprise technology architecture methodology, best practices, standardization, and improvement; development of business intelligence technology and statistical, data mining, and predictive analytic models for claim operations and client stewardship; adoption of business process management methodology and platform; and contribution to global project portfolio management leveraging technology improvement and innovation opportunities. Sergo has 20 years of technology industry experience, with proven success in increasing the maturity of enterprise technology architecture and in the broad adoption of advanced analytics and business intelligence at organizations of various sizes. He has earned a master’s degree in management science and a bachelor’s degree in applied mathematics and computer science.