Best practices

How to curb shadow IT by letting go of the last mile

According to a 2019 survey of 1,000 US-based IT professionals, The Upside of Shadow IT, 97 percent of respondents said employees are more productive when allowed to use their preferred technologies. And 80 percent said they believe their companies need to be more agile when it comes to deploying technologies suggested by employees.

Shadow IT (IT activities and applications managed without direct involvement from the IT department) has long been viewed as a negative phenomenon because the lack of centralized control could compromise security and data quality. The rise of SaaS tools heightened those concerns. In the 2019 IT-Business Alignment Study, 78 percent of the 1,100 IT professionals surveyed said shadow IT activities had increased over the previous five years. But attitudes are changing.

While there will always be legitimate concerns about security and data quality, there is also a growing appreciation of the benefits of giving line-of-business users more control: employee engagement, loyalty, and competitive advantage for the business. There are also new analytics tools that allow IT to let go of the last mile of delivering data analytics and let business users build their own dashboards and reports. That removes one area of concern about shadow IT, freeing IT professionals to redirect those energies to higher-value work.

No point in waiting

Data analytics encompasses a broad spectrum of activities. Most enterprises have historically had the IT department manage that whole spectrum, all the way from data acquisition through to the end reporting tool. That means getting the data out of different source systems, putting it into a data warehouse, doing a massive amount of transformation and modeling, and then building dashboards and reports on top of it. In this setup, if someone needs to bring in a new data source, or a source system adds a new table, IT has to update the data warehouse. That takes so long that more often than not, by the time the data is ready to be consumed, it's too late.
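
To make that concrete, here is a minimal sketch of that centralized flow, with hypothetical table and column names, and with SQLite standing in for both the source system and the warehouse so the sketch is runnable:

```python
# A sketch of the traditional, IT-owned extract-transform-load flow.
# All names are hypothetical; SQLite stands in for real systems.
import sqlite3
import pandas as pd

source = sqlite3.connect(":memory:")     # stand-in for a source system
warehouse = sqlite3.connect(":memory:")  # stand-in for the data warehouse

# Seed the source system with a few sample rows (illustration only).
source.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, ordered_at TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "2019-05-01"), (2, 80.0, "2019-05-01"), (3, 45.0, "2019-05-02")],
)

# Extract, then transform: the hand-maintained modeling step.
orders = pd.read_sql("SELECT * FROM orders", source)
daily = orders.groupby("ordered_at")["amount"].sum().reset_index(name="revenue")

# Load into the warehouse. A new source system, or a new table in an
# existing one, means IT writing and scheduling another script like this.
daily.to_sql("fact_daily_revenue", warehouse, index=False)
```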

Unfortunately, this situation is still prevalent today, and with the huge amount of data now being collected, and the growing appetite for data analytics, IT and the business both report backlogs of unmet requests. Meanwhile, the proliferation of self-service reporting tools and departmental SaaS applications has made it easier than ever for business users to build their own dashboards and reports. People just don't see the point of waiting for that traditional IT process to happen.

No single source of truth

The problem is that this do-it-yourself approach leaves the company with no single source of truth. But the answer is not to crack down on analytics tool choice. One need only look at the number of analytics and BI tools within any given enterprise to know that is unrealistic. People want choices.

If we did this right, there would be a single, clearly defined, broadly accessible data set, and everyone could handle last-mile access to that data using any number of tools. IT should not be managing every single acquisition of those tools, although it might want to negotiate pricing with a list of preferred vendors.

Data acquisition, creating data pipelines, and the data semantic layer should still be handled by IT. Security and compliance should also be controlled centrally. For example, in Europe, GDPR requires that when an employee leaves the company, you be able to wipe out all the data for that employee. That cannot be left to departmental people. Nor can SOC 2 compliance for applications in the cloud.
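
As a sketch of why erasure has to be centralized, imagine one routine that knows every table where employee data lands. The table names here are hypothetical, and a real erasure job would also have to cover backups, logs, and downstream extracts:

```python
# A sketch of centralized GDPR erasure. Hypothetical schema throughout.
import sqlite3

# The central inventory: every place employee data is stored.
TABLES_WITH_EMPLOYEE_DATA = [
    "hr_records", "payroll", "expense_reports", "analytics_events",
]

def erase_employee(conn: sqlite3.Connection, employee_id: int) -> None:
    """Remove every row for one employee, in a single transaction."""
    with conn:  # commits on success, rolls back on error
        for table in TABLES_WITH_EMPLOYEE_DATA:
            conn.execute(f"DELETE FROM {table} WHERE employee_id = ?", (employee_id,))

# Tiny runnable demo with stand-in data.
conn = sqlite3.connect(":memory:")
for table in TABLES_WITH_EMPLOYEE_DATA:
    conn.execute(f"CREATE TABLE {table} (employee_id INTEGER, payload TEXT)")
    conn.execute(f"INSERT INTO {table} VALUES (42, 'sample')")
erase_employee(conn, 42)
```

A department that runs its own copy of the data sits outside that inventory, which is exactly why this control cannot be devolved.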

IT must also make sure that only authorized users can see certain data. For example, there are rules around who can see which employee's salary. As a manager, I can see all my employees' salaries, but I cannot see the salary of employees in another department.
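
Here is a minimal sketch of that salary rule as a row-level filter, with made-up data; in practice this logic belongs in the semantic layer or the database's row-level security policies, not in each individual report:

```python
# A sketch of row-level access: a manager sees only their own department.
import pandas as pd

salaries = pd.DataFrame({
    "employee": ["Ana", "Ben", "Cara", "Dev"],
    "department": ["sales", "sales", "finance", "finance"],
    "salary": [90000, 85000, 95000, 88000],
})

# Hypothetical mapping of each manager to the department they run.
MANAGER_DEPARTMENT = {"sales_mgr": "sales", "finance_mgr": "finance"}

def salaries_visible_to(user: str) -> pd.DataFrame:
    """Apply the row-level rule before anyone sees the data."""
    return salaries[salaries["department"] == MANAGER_DEPARTMENT[user]]

print(salaries_visible_to("sales_mgr"))  # Ana and Ben only
```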

Get your fresh, certified data here

But IT's involvement should end once it has published certified data sets where governance and security have been addressed. That's the ideal state.

The business takes this certified data set and consumes it in their analytics tool of choice, whether that's Tableau, Power BI, Incorta, or something else. They've got all the curated data, and they can blend in any additional data they want. Say, for example, they have an Excel sheet of departmental budgets. They can blend that with the IT-published data and say, "Here's my budget versus actual."
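
A quick sketch of that blend, with made-up numbers and a hypothetical file name, shows how little the business needs from IT at this point:

```python
# A sketch of blending a departmental spreadsheet with certified data.
import pandas as pd

# Certified, IT-published actuals (stand-in data for illustration).
actuals = pd.DataFrame({
    "department": ["sales", "finance", "marketing"],
    "actual_spend": [410000, 150000, 230000],
})

# The analyst's own budget numbers. With a real spreadsheet this would be:
# budgets = pd.read_excel("dept_budgets.xlsx")  # hypothetical path
budgets = pd.DataFrame({
    "department": ["sales", "finance", "marketing"],
    "budget": [400000, 160000, 250000],
})

# Budget versus actual, computed entirely on the analyst's side.
report = actuals.merge(budgets, on="department")
report["variance"] = report["budget"] - report["actual_spend"]
print(report)
```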

The unique advantage of Incorta is that IT can manage data acquisition and the building of the data sets, while the analyst consumes those data sets, brings their own additional data, and blends it all together, all within the same platform. But you don't have to stay within that platform; you can run Tableau or Power BI or some other tool on top of your data in Incorta, for example. That's the way the world works today. It's all about flexibility and choice.

Leaving it to the business

IT should remain in the data acquisition, preparation, and governance business. But instead of building dashboards and insights based on what a business user asks for, they should be solving upstream problems such as how to move data from on-premises to the cloud, or vice versa. Bringing different source systems together, unifying the data, and providing curated data sets for the business is another big value-add. You could be getting data from 10 cloud systems, and you've got to manage the entire architecture for those flows. Lastly, IT should manage and catalog the metadata that comes with all this data acquisition.

But IT can leave last-mile consumption to the business. Business users know their requirements well, and they know which tools they want to use to consume the data. Every user knows exactly how to run their business and exactly what kind of analytics they need, so they can manage that last mile on their own. IT has bigger problems to solve, and there are plenty of them to go pick up.