CMOs are under increased pressure to leverage big data for product marketing and brand recognition needs. Is Hadoop really the best data management approach for your business?

If you’re a CMO today, I expect you are champing at the bit to get in on this whole “big data” craze: taking the data sets you have already collected and mining them for insights and trends you can then use to market your products, connect with your target audience, and enhance your brand recognition.

To mine your data for hidden gems, you are probably considering Hadoop, an open-source framework for storing and processing large data sets across clusters of machines – in other words, for answering your big data queries.

But that raises the question: how do you implement Hadoop? It would be nice if you could get IT’s attention and support, but they didn’t even have the bandwidth to help you with your marketing automation software. The truth is, leveraging big data may be neither quick nor easy – especially if you’re in a situation where IT isn’t chartered to support you.

Chances are that your IT department, with its limited staff, resources, and budget, is busy keeping the company’s “crown jewel” applications protected and running. The result? Though you’re a CMO – and already a creative genius, revenue strategist, and data scientist – you have to be something of a technology expert as well. When it comes to big data, you have to ask not only the marketing questions but also the infrastructure and architecture questions: how will we run and manage Hadoop so we can take advantage of the benefits of big data?

At the most basic level, you have two options: you can choose to manage Hadoop in-house as a marketing IT function, or you can outsource big data as a service. Let’s look at each option in turn.

DIYing Hadoop

I once knew a marketer who spun up her own Hadoop cluster in her cubicle because she wasn’t able to get IT to support her. So it IS possible – but here are three big questions you should take the time to answer before you sit down to do-it-yourself:

1. Will I be able to add compute and storage as my data grows?
Sure, you can start in your cubicle with your current hardware … but you probably won’t be able to stay there. Big data eats up compute and storage space at an increasing rate over time. How much are you prepared to put into additional compute and storage equipment this year? What about next year? And the years following?
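To make that growth concrete, here is a back-of-envelope sketch. The starting volume and growth rate below are hypothetical assumptions for illustration; the tripling reflects HDFS’s default replication factor of 3, which multiplies your raw disk needs.

```python
# Back-of-envelope sketch of how Hadoop storage needs compound as data grows.
# The 10 TB starting point and 60% annual growth are illustrative assumptions.

def projected_storage_tb(initial_tb, annual_growth_rate, years):
    """Logical data volume after compounding growth each year."""
    return initial_tb * (1 + annual_growth_rate) ** years

REPLICATION = 3  # HDFS default replication factor

for year in range(4):
    logical = projected_storage_tb(10, 0.60, year)
    raw = logical * REPLICATION
    print(f"Year {year}: ~{logical:.1f} TB of data -> ~{raw:.1f} TB of raw disk")
```

Run with these assumptions, the 10 TB you start with becomes roughly 41 TB of data – and over 120 TB of raw disk – by year three. Plug in your own numbers before you buy hardware.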

2. Am I willing to spend the time necessary to learn how to write productive queries?
Developing queries is a highly iterative process. You may discover that you don’t have the right inputs or fields to gather the information you want. You may find that it takes 5, 10, 20 or more attempts to identify the query that will give you the correct information in the right context. You may struggle with identifying formats that are actually useful and usable. How much time are you prepared to put into the learning curve? What if full proficiency requires not months, but years to attain?
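To give a feel for the coding that even a simple “query” can involve, here is a minimal word-count job written in the style of a Hadoop Streaming mapper and reducer – a sketch run locally on sample lines, not a production job (on a real cluster, the mapper and reducer would be wired up via the hadoop-streaming JAR, or you might use a higher-level tool such as Hive instead).

```python
# Minimal Hadoop Streaming-style word count, run locally for illustration.
from collections import defaultdict

def mapper(lines):
    """Emit (word, 1) pairs, as a streaming mapper would from stdin."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reducer(pairs):
    """Sum the counts per word, as the reduce phase would after the shuffle."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Local dry run on a couple of sample "log lines":
result = reducer(mapper(["big data big insights", "big budget"]))
print(result)  # {'big': 3, 'data': 1, 'insights': 1, 'budget': 1}
```

Getting from a toy like this to queries that return the right fields, in the right context, in a usable format is exactly the iterative grind described above.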

3. How much time and money am I willing to put into big data to find out whether or not I will reap an ROI?
Even if you have all the right queries, you could discover that you don’t get any usable data – data that can truly support your marketing efforts. What if, after spending all that time and money and effort, your business does not find value in big data? How much are you willing to invest for an unknown ROI?

Outsourcing Hadoop

Outsourcing Hadoop can lower the entrance barrier to testing the big data waters:

1. A provider will allow you to scale compute separately from storage, which keeps your costs down. So, rather than purchasing a physical server every time you need more storage (which is what you would need to do if you were DIYing), a provider will simply give you access to more disk space. Then, when you actually do need more compute power, the provider can give that to you as well – all virtualized and instantly scalable for your needs.

2. If you need help mining the data, a big-data-as-a-service provider can put their full expertise to work for you. They know how to write effective queries against your data and how to do all the coding required. They also understand which formats are best for different purposes. All you need to do is state what you’re looking for. The provider does all the rest.

3. Outsourcing can reduce the “I” in “ROI.” You still don’t know what the outcomes will be, but with a service provider, the investment in all areas – time, effort, and money – is substantially less. This allows you to run a trial “mining operation” to see if there are any nuggets of gold there to warrant further “panning.”

So, DIY or Outsource?

Here’s my advice: if CAPEX, space, power, time, and human capital are unlimited, then DIY makes sense. If that is not the case (and, quite frankly, it rarely is), then you would likely benefit from outsourcing Hadoop. After all, most marketers today want projects to start and finish in the shortest time possible, with little to no CAPEX spend. With a big-data-as-a-service partner, you can leave the infrastructure, architecture, and technicalities to someone else – and get on with the job of using the big data you have collected!


This blog was previously published on Sungard Availability Services’ Forbes Brand Voice.

Related Business Solution: Application Management