My last article was about the changing nature of jobs over the last two decades, and concluded by identifying three important phenomena that have influenced everyone in this game. Here, we discuss the first phenomenon – the factors influencing businesses in making technology decisions, the expectations of the people involved, and an evaluation of situations commonly seen across the market today.
Over the past two decades, technology consumers have consistently adopted technology solutions to compete and improve productivity. Technology has been an opportunity to differentiate themselves from competitors, or at least to stay on par with them. Besides improving staff productivity and increasing capacity, technology also offers the opportunity to translate complex tasks done by people into repeatable automated programs. Hence, there is a tendency to spend with the expectation of improving the top line, the bottom line, or both, for higher profits. However, most initiatives do not yield the expected returns, because of cost overruns during implementation and maintenance and support costs that exceed estimates afterwards. In most cases, an architectural audit will reveal inefficiencies in implementation that lead to significant cost escalation continuing over years. One observation supporting this is the emergence of a new breed of professionals dedicated to data quality and integrity: inefficient technology planning and implementation strategies directly result in poor data quality and data integrity. Yet most technology consumers have robust procedures, well-placed controls, and excellent project management. So what could be going wrong, especially when most non-IT companies face similar issues?
A peek at the IT processes in most non-IT companies – retail, banks, manufacturing, utilities, and so on – will show that they are similar to those followed by core IT companies. Most teams, too, tend to operate well within their scope of work and responsibility. Tech initiatives are often driven by groups or business units within an organization. However, it is very rare for code used by one group within a company to be reused by another, or for anyone to attempt to generalize a solution. So when diverse projects are taken up, effort gets duplicated. Moreover, once a project starts, it incurs a recurring annual maintenance cost. So even if each team works efficiently within its own sphere of influence, the organization as a whole spends much more on operations. The reason: a lack of enterprise architecture and of coherent execution of the implementation strategy at the code level.
Let us take a specific example. Information exchange, or data sourcing, is an important requirement in almost every company. Companies often rely on data staging methods built around popular ETL tools or utilities such as SQL*Loader. In many situations, data from a provider needs transformation, and such transformations are mostly accommodated by the data staging teams. This works because it is standard practice, and plenty of skilled resources are available at reasonable cost. It is, of course, extremely important to consider the implications of a technology choice in terms of how it will be managed and maintained, given the pool of skilled professionals available in the market. But is there a more intelligent way to use the same resources and yet improve operating efficiency? The answer is an emphatic 'yes'. Even though the solution concept is generic, however, the implementation will be specific to each company, depending on the degree of sophistication desired.
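To make the problem concrete, here is a minimal sketch of what a staging job looks like when transformation logic lives inside it. All names here (the feed format, field names, function name) are hypothetical, invented for illustration; the point is that the staging team owns provider-specific rules, so every new feed needs a new bespoke job.

```python
import csv
import io

def stage_customer_feed(raw_csv: str) -> list:
    """Hypothetical staging job that both loads AND transforms a provider's feed.
    Because the transformation (field renaming, date reformatting) is baked in,
    this code cannot be reused for any other feed."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            "customer_id": rec["CustID"].strip(),
            # provider sends DD/MM/YYYY; the target expects YYYY-MM-DD
            "joined": "-".join(reversed(rec["JoinDate"].split("/"))),
        })
    return rows

feed = "CustID,JoinDate\n 42,01/02/2020\n"
print(stage_customer_feed(feed))  # [{'customer_id': '42', 'joined': '2020-02-01'}]
```

Multiply this pattern across dozens of feeds and business units, and the duplicated effort described above becomes visible in the codebase itself.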
For the same situation, let us see how to develop a simple framework using the same resources. Assume a central planning team proposes that "information exchange will never involve any data transformation; it will just work like a mailman," i.e., it will carry any dataset that two parties want to exchange. This mailman component carries information in a specific pattern that sender and receiver can decipher consistently. With this implementation, the same data stage can handle every type of data exchange, while the sending and receiving teams take responsibility for managing the information the mailman carries. Using this simple model, the company frees up the many staff who were only ensuring data delivery from sender to receiver. A large pool of talent can now deliver much more at the same operating cost, or the company can gain higher profits by running a lean team.
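The mailman idea above can be sketched in a few lines. This is only an illustrative model, not a production design: the envelope format, function names, and JSON encoding are all assumptions. What matters is the separation of duties – the carrier never inspects or transforms the payload, so one generic component serves every sender/receiver pair.

```python
import json

def wrap(sender: str, receiver: str, payload: dict) -> str:
    """Sender wraps its dataset in the agreed envelope pattern."""
    return json.dumps({"from": sender, "to": receiver, "payload": payload})

def deliver(envelope: str) -> str:
    """The mailman: moves the envelope without reading the payload.
    In a real system this might copy a file or publish a message;
    the key point is that it never transforms the data it carries."""
    return envelope

def unwrap(envelope: str) -> dict:
    """Receiver opens the envelope and applies its own transformations."""
    return json.loads(envelope)["payload"]

msg = wrap("sales", "finance", {"order_id": 7, "amount": 120.5})
print(unwrap(deliver(msg)))  # the payload arrives untouched
```

Any provider-specific transformation now lives with the sending or receiving team, while the staging team operates a single, reusable delivery component.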
The point I want to make is that most IT processes today already provide the required controls and do not need modification. To improve operations, a technology consumer really needs to percolate the business strategy down to code-level implementation on existing infrastructure. This change can give a company a significant boost in profitability, higher ROI, streamlined implementation, and improved efficiency. But it will be a multi-year effort and will require significant operational planning. The success of technology lies in how effectively people use it, and in how management adopts changes to operations once the platform is in use. We will see this in the next article, when we discuss the people part of post-implementation operations as perceived by businesses.