The challenge is to find:
- the most efficient way to use these tools
- standard practices and techniques to deliver the business use cases
Our Data Ingestion Framework provides the structure to build a competency that enables delivery of the data ingestion capability for an enterprise.
01. Subscribe
The initial stage of the Data Ingestion Framework is to subscribe to the data. Technical details of the source system platform, the security implications and the data feed should be understood before the data is ingested and consumed.
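To make this concrete, the sketch below records the kind of source and feed details captured at the subscribe stage as a simple Python structure. The `SourceSubscription` class and its fields are illustrative assumptions for this example, not part of any specific product.

```python
# A minimal sketch of capturing "subscription" details before any data is ingested.
# All field names and example values are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SourceSubscription:
    """Technical and security details agreed with the source system owner."""
    source_name: str                 # e.g. "crm_orders"
    platform: str                    # e.g. "oracle", "sftp", "kafka"
    feed_format: str                 # e.g. "csv", "json", "avro"
    delivery_mode: str               # "batch" or "streaming"
    schedule: Optional[str] = None   # cron expression for batch feeds
    auth_method: str = "service_account"
    pii_fields: List[str] = field(default_factory=list)


# Example: register a nightly CSV feed from a hypothetical CRM source.
crm_feed = SourceSubscription(
    source_name="crm_orders",
    platform="sftp",
    feed_format="csv",
    delivery_mode="batch",
    schedule="0 2 * * *",
    pii_fields=["customer_email"],
)
```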
02. Hydrate
The hydrate phase of the process concerns importing data into a data store, such as a database, a file system, a big data platform or a data lake. Data can be hydrated through various methods, in real time or in batch, and is validated before it enters the rest of the data pipeline.
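As an illustration, the following sketch shows one possible hydrate step for a batch CSV feed, with a basic validation check before the data is persisted to a raw landing zone. The paths, column names and `REQUIRED_COLUMNS` set are assumptions made for the example, and pandas is used purely for brevity.

```python
# A minimal sketch of the hydrate step, assuming a batch CSV feed that is
# validated and written to a parquet "raw" zone. Paths and columns are illustrative.
import pandas as pd

REQUIRED_COLUMNS = {"order_id", "customer_id", "order_date", "amount"}


def hydrate_batch(source_path: str, raw_zone_path: str) -> pd.DataFrame:
    """Load a source file, apply basic validation, and persist it to the raw zone."""
    df = pd.read_csv(source_path)

    # Validate before the data enters the rest of the pipeline.
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Feed rejected: missing columns {sorted(missing)}")
    if df.empty:
        raise ValueError("Feed rejected: no rows received")

    # Persist the untouched data so the raw feed can always be replayed.
    df.to_parquet(raw_zone_path, index=False)
    return df
```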
03. Process
The data is processed to ensure it is in a format and structure suitable for the consumer. Depending on requirements, this stage may involve standardising, validating, matching or cleansing. Data may be transformed through automated batch processing or self-service tooling, or passed through the data pipeline unprocessed in its raw form.
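The sketch below illustrates what this stage might look like for the same hypothetical order feed: standardising types, cleansing duplicates and applying a simple validation rule. The column names and business rules are assumptions for illustration, not requirements of the framework.

```python
# A minimal sketch of the process step: standardise, cleanse and validate the
# hydrated data before it is published. Columns and rules are assumptions.
import pandas as pd


def process_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Return a cleansed, consumer-ready copy of the raw order data."""
    df = raw.copy()

    # Standardise: consistent types and formats.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["customer_id"] = df["customer_id"].astype(str).str.strip().str.upper()

    # Cleanse: drop rows that failed standardisation and duplicate records.
    df = df.dropna(subset=["order_date"]).drop_duplicates(subset=["order_id"])

    # Validate: business rule check before handing over to the publish stage.
    if (df["amount"] < 0).any():
        raise ValueError("Validation failed: negative order amounts found")

    return df
```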
04. Publish
Following the transformation of data, it must be made available to the consumer. The target data will be loaded or staged appropriately, leveraging available resources so that any batch or real-time extract or feed of data can be achieved accurately, consistently and efficiently.
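As a simple illustration, the sketch below stages the processed data in a consumer-facing warehouse table. The connection URL, table name and the choice of a SQL target are assumptions for this example; a file-based extract or real-time feed would be equally valid.

```python
# A minimal sketch of the publish step, assuming a SQL warehouse target.
# The connection string and table name are placeholders.
import pandas as pd
from sqlalchemy import create_engine


def publish_orders(df: pd.DataFrame, connection_url: str) -> None:
    """Stage the curated data in a consumer-facing warehouse table."""
    engine = create_engine(connection_url)

    # Replace the staged table so downstream extracts see a consistent snapshot.
    df.to_sql("curated_orders", engine, if_exists="replace", index=False)


# Example usage with a hypothetical warehouse:
# publish_orders(processed_df, "postgresql://user:password@warehouse:5432/analytics")
```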
- 01. Lowers risk and TCO as the data ingestion approach has been developed and proven through engagements with our clients and partners
- 02. Combines different data delivery styles to incorporate a "portfolio-based" approach to data integration strategy, e.g. ETL, virtualisation, real-time data flows and event-driven integration.
- 03. Brings methodology and accelerators to define a framework optimised for each client’s landscape and operating model.
- 04. Ability to ingest and govern data at speed, supporting both innovation and industrialised ingestion models for an agile and cost-effective solution.
- 05. Facilitates the collection and integration of data from different types of data sources and supports different data transport protocols.
- 06. A clear security model, with detailed analysis of identity and access protection, data protection and network security, ensures consistent access across the different platforms.