iFIX Core Fiscal Event Post-Processor

Overview

Fiscal Event Post-Processor is a streaming pipeline for validated fiscal event data. Apache Kafka is used to stream the validated fiscal event data; the service dereferences, unbundles, and flattens it, and finally pushes the details to the Druid data store.

Version

Current version: 2.0.0

Prerequisites

Before you proceed with the configuration, make sure the following prerequisites are met:

  1. Java 8

  2. Apache Kafka and Kafka-Connect server should be up and running

  3. Druid DB should be up and running

  4. The following dependent services are required: iFIX Core Master Data service and iFIX Core Fiscal Event service

Features

The Fiscal Event Post-Processor consumes validated fiscal event data from the Kafka topic “fiscal-event-request-validated” and processes it in the steps below (a simplified code sketch follows the list):

  1. The validated fiscal event data is dereferenced. For dereferencing, ids such as the COA id and the Tenant id are passed to the corresponding service (the master data service) and the matching objects are fetched. Once the fiscal event data is dereferenced, it is pushed to the dereferenced topic.

  2. Unbundle consumers pick up the dereferenced fiscal event data from the dereferenced topic. The dereferenced fiscal event data is unbundled and then flattened. Once flattening is complete, the data is pushed to the Druid sink topic.

  3. Flattened fiscal event data is pushed to the Druid DB from the topic named fiscal-event-druid-sink.
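
The actual implementation lives in the fiscal-event-post-processor service; the snippet below is only a minimal sketch, assuming plain Apache Kafka clients, of how the first stage (consume validated events, dereference, publish to the dereferenced topic) could be wired. The class name, the dereference helper, and the bootstrap address are illustrative placeholders, not the real service code.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

/**
 * Illustrative first stage of the post-processor: read validated fiscal events,
 * dereference ids (COA id, Tenant id, ...) via the master data service, and
 * publish the enriched payload to the dereferenced topic.
 */
public class FiscalEventDereferenceStage {

    private static final String VALIDATED_TOPIC    = "fiscal-event-request-validated";
    private static final String DEREFERENCED_TOPIC = "fiscal-event-request-dereferenced";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // environment specific
        props.put("group.id", "fiscal-event-post-processor");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {

            consumer.subscribe(Collections.singletonList(VALIDATED_TOPIC));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    String dereferenced = dereference(record.value());
                    producer.send(new ProducerRecord<>(DEREFERENCED_TOPIC, record.key(), dereferenced));
                }
            }
        }
    }

    // Placeholder: in the real service this calls the iFIX Core Master Data service
    // to resolve ids into full objects and embeds them in the event payload.
    private static String dereference(String validatedEvent) {
        return validatedEvent;
    }
}
```

The unbundle/flatten stage follows the same consume-transform-produce pattern, reading from the dereferenced topic and writing flattened line items to the flattened and Druid sink topics.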

Kafka to Data Store Sink

MongoDB Sink

Kafka Connect is used to push data from a Kafka topic to MongoDB. Follow the steps below to start the connector (an illustrative registration call is sketched after the list).

  1. Connect (port-forward) to the Kafka-Connect server.

  2. Create a new connector with a POST API call to localhost:8083/connectors. The request body for that API call is written in the fiscal-event-mongodb-sink file.

  3. Within that file, replace every ${---} placeholder with the actual value for your environment. Get ${mongo-db-authenticated-uri} from the configured secrets of the environment. (Optional) Verify and adjust the topic names.

  4. The connector is now ready. You can check it with a GET API call to localhost:8083/connectors.
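
For illustration only, a connector registration call could look like the sketch below. The actual request body is the one shipped in the fiscal-event-mongodb-sink file; the connector class, topic, database, and collection values here are assumptions based on the stock MongoDB Kafka sink connector and must be replaced with your environment's values.

```sh
# Port-forward the Kafka-Connect service first (service name is environment specific), e.g.:
# kubectl port-forward svc/kafka-connect 8083:8083

# Register the connector; the real body comes from the fiscal-event-mongodb-sink file
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "fiscal-event-mongodb-sink",
        "config": {
          "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
          "topics": "fiscal-event-request-validated",
          "connection.uri": "${mongo-db-authenticated-uri}",
          "database": "<ifix-db>",
          "collection": "<fiscal-event-collection>"
        }
      }'

# Verify that the connector is registered
curl http://localhost:8083/connectors
```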

Druid Sink

The Druid console is used to start ingesting data from a Kafka topic into the Druid data store. Follow the steps below to start the Druid supervisor (a trimmed example spec is sketched after the list).

  1. Open the Druid console.

  2. Go to the Load Data section.

  3. Select Other.

  4. Click on Submit Supervisor.

  5. Copy and paste the JSON from the druid-ingestion-config.json file into the available text box.

  6. Verify the Kafka topic name and the Kafka bootstrap server address before submitting the config.

  7. Submit the config; data ingestion should start into the fiscal-event data source.
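
The actual supervisor spec is supplied in the druid-ingestion-config.json file. As an illustration of the fields worth double-checking (topic name and bootstrap servers), a heavily trimmed Kafka ingestion spec might look like the sketch below; the timestamp column and broker address are assumptions, not the real config.

```json
{
  "type": "kafka",
  "spec": {
    "dataSchema": {
      "dataSource": "fiscal-event",
      "timestampSpec": { "column": "eventTime", "format": "auto" },
      "dimensionsSpec": { "dimensions": [] }
    },
    "ioConfig": {
      "topic": "fiscal-event-druid-sink",
      "inputFormat": { "type": "json" },
      "consumerProperties": { "bootstrap.servers": "<kafka-broker>:9092" }
    },
    "tuningConfig": { "type": "kafka" }
  }
}
```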

Interaction Diagram

Environment

Note: The Kafka topics need to be configured with respect to the environment.

| Key | Value | Description | Remarks |
| --- | --- | --- | --- |
| fiscal-event-kafka-push-topic | fiscal-event-request-validated | The Fiscal Event Post-Processor consumes data from this topic. | The Kafka topic should be the same as the one configured in the Fiscal Event service. |
| fiscal-event-kafka-dereferenced-topic | fiscal-event-request-dereferenced | Dereferenced fiscal event data is pushed to this topic. | NA |
| fiscal-event-kafka-flattened-topic | fiscal-event-line-item-flattened | NA | NA |
| fiscal-event-processor-kafka-druid-topic | fiscal-event-druid-sink | Flattened fiscal event data is pushed to this topic. | While setting up the Druid ingestion for fiscal events, make sure it uses the same topic as mentioned here. |

Configurations and Setup

Update the DB, Kafka producer and consumer, and URI configurations in the dev.yaml, qa.yaml, and prod.yaml files.
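
As a hedged sketch only (the nesting and the non-topic key names here are assumptions; the four topic keys are the ones listed in the Environment table above), an environment file entry might look like:

```yaml
# Illustrative values for one environment; match the structure of the actual
# dev.yaml / qa.yaml / prod.yaml files of your deployment.
fiscal-event-post-processor:
  fiscal-event-kafka-push-topic: fiscal-event-request-validated
  fiscal-event-kafka-dereferenced-topic: fiscal-event-request-dereferenced
  fiscal-event-kafka-flattened-topic: fiscal-event-line-item-flattened
  fiscal-event-processor-kafka-druid-topic: fiscal-event-druid-sink
  # DB and dependent-service URIs (placeholder key names and hosts)
  mongo-db-authenticated-uri: ${mongo-db-authenticated-uri}   # from environment secrets
  ifix-master-data-service-host: http://ifix-master-data:8080
  ifix-fiscal-event-service-host: http://fiscal-event-service:8080
```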

References and Notes

| Title | Link |
| --- | --- |
| Swagger Yaml | iFix-Dev/fiscal-event-post-processor-2.0.0.yaml at develop · misdwss/iFix-Dev |
All content on this page by eGov Foundation is licensed under a Creative Commons Attribution 4.0 International License.