Historically, data integration (DI) as a practice has been housed alongside related disciplines such as data warehousing and database administration. These days, however, DI is recognised as an autonomous discipline, worthy of its own staff, budgeting and attention. As such, IT departments need to rethink the way they address data integration projects with regard to the tools they use and the coordination of the project from start to finish.
Today, the multitude of options available to enterprises aiming to take on an integration project has reached a critical mass. Vendors and their products are now well-established in the market, users often have dedicated DI staff to implement such products, and DI has reached a point where it is recognised as a discipline separate from related functions like data storage and database administration.
As DI technology and practices have grown at such a rapid pace, it is understandable that even the savviest IT departments may not be on the cutting edge of the latest DI developments. IT professionals may have out-of-date mindsets or misconceptions about DI based on previous incarnations of DI technology. Moreover, in a rush to adopt the latest and greatest in DI techniques, even specialists in the discipline may fail to pay heed to DI’s best practices.
CIOs also need to take time to discern which products and vendors are right for their business. “While enterprises spend time in evaluating all the features and trends in the data integration space, they need to keep a tight vigil on which of those new ‘features’ would best fit their enterprise requirements,” says Stephen Fernandes, Assistant Vice President and Head of Middle East Operations, Cognizant.
Of course, the most basic of best practices when taking on a DI project is to maintain clean files. To ensure conformity among files, there must first be agreement on the file-format specifications from all sources and stakeholders. Further, to maintain a clean database, Anirban Bhattacharya, Principal Architect, Tech Mahindra, recommends agreeing upon a process for file-level and field-level checks to avoid, among other things, “Repeat files, zero-byte files, non-conformant files, incomplete file-transfers, issues related to file-delay and latency, and files with corrupt data.”
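To make the idea concrete, a minimal sketch of such file-level checks might look like the following. This is an illustrative example only, not from any vendor quoted here; the function names and the allowed-format list are hypothetical, and a production pipeline would add field-level validation, latency checks and transfer-completeness verification on top of it.

```python
import hashlib
from pathlib import Path


def file_digest(path: Path) -> str:
    """Hash file contents so repeat (duplicate) deliveries can be spotted."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def validate_file(path: Path, seen_hashes: set,
                  allowed_suffixes=frozenset({".csv"})) -> list:
    """Run basic file-level checks; return a list of problems found.

    Covers three of the issues named in the text: zero-byte files,
    non-conformant file formats, and repeat files.
    """
    problems = []
    if path.stat().st_size == 0:
        problems.append("zero-byte file")
    if path.suffix.lower() not in allowed_suffixes:
        problems.append("non-conformant file format")
    digest = file_digest(path)
    if digest in seen_hashes:
        problems.append("repeat file")
    else:
        seen_hashes.add(digest)
    return problems
```

A clean file yields an empty problem list; anything else is quarantined for review rather than loaded into the target system.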
Data integration projects can seem like daunting tasks and often take a great deal of time and dedication. However, most delays in implementation can be traced to poor planning on the part of the IT team. Adherence to best practices means preparing for the project in advance. Fernandes points to a few issues that can cause IT teams to struggle when implementing a DI project. “Lack of clarity on the scope and boundary of the DI tool, lack of understanding and firm commitment on the roadmap of functionality, lack of understanding or anticipation of the full effort and investment in setting up and sustaining a successful enterprise-wide DI environment and finally challenges with skilled staff.”
Enterprises can avoid sluggish rollouts with solid research and planning. “Proper study and documentation of current DI capabilities and future wish-list provide a great foundation for the successful implementation of a DI tool,” says Fernandes.
Integration into the cloud presents a host of other issues that must be considered when taking on a DI project. Bhattacharya recommends keeping a few things in mind when integrating into the cloud. “Choose the appropriate Cloud Deployment Model – Private, Public, Hybrid or Community Cloud,” he says. “IT teams also need to ensure that the cloud platform they have chosen has the flexibility to address multiple platforms, including legacy and strategic environments.” If taking on a large cloud integration, Bhattacharya continues, IT teams must put in place a mechanism capable of managing multiple platforms from a single control point, as well as automated processes such as application lifecycle management.
Enterprises should also take steps to avoid the costly application-management problems that can come with migrating to a SaaS environment. “The key to taking advantage of a cloud-based solution is consuming only what is needed when it is needed – and remaining diligent about turning off at other times,” says Fernandes.
In addition, sticking to the design of the SaaS tool is key. Customisations of SaaS tools can lead to unnecessary costs. “The more you can use the SaaS solution as it was designed, the lower your costs will be,” says Fernandes.
Though cloud integration does require careful planning, the benefits are worth the time investment. According to Bhattacharya, implementing a cloud solution can help reduce costs and internal IT needs. In addition, setup and implementation can be relatively quick and resource pooling allows the cloud service provider to distribute the costs to all of its clients. Fernandes agrees, “The benefits of cloud integration include no system administration, lower utilisation price, uniform cost across the time period, pay-as-you-use facility, shorter time-to-market, shared risk, flexibility to configure OS and software, lower system/hardware maintenance cost, no investment in data centres, and ease of space expansion and scalability.”
A DI project can streamline a business’ workflow and clean up useless data that is bogging down systems; however, such projects should be undertaken with planning and care. “Data integration projects require a clear understanding of the requirements and a detailed study of the data elements. The ability to understand data flow, data manipulation, consolidation and usage requires expert inputs, and therefore users should be involved in the early stages of DI projects,” says Faisal Husain, CEO, Synechron. With a solid blueprint and the knowledge that best practices make for clean data, a DI project is a worthwhile undertaking.