

Low Level Designs – Jira Stories

Delivery Milestone 01

DESCRIPTION

This is a component within “Delivery Milestone 1” which is defined as “UI Design, Blueprint of the deliverable and wireframes during the project implementation, Custom Specifications, Workflow models, Low Level Designs of Jira Stories.”

15/06/2023

Version 1.0

Table of Contents

Document Purpose

The purpose of this document is to provide guidance on the overall landscape of the Payments Building Block. By leveraging the different components within it, the Payments Building Block covers functionality including payments to different financial addresses such as IBANs and Mobile Money Accounts. The document also outlines the steps and requirements necessary to achieve these objectives.

...

To aid these audiences, this document provides the low-level designs for the Jira IDs that were opened by Gov-Stack. Some stories had flow charts attached, which are included in the Annexure of this document.


What are Low-Level Designs?

Low-level design (LLD) is the process of specifying and defining the detailed design of a system. The design focuses on the implementation details of a system, in our case the system required by Gov-Stack. The LLD focuses on how the system will be built and how it will function at a detailed level.

This is an important phase in our project, as the design is translated into more concrete details such as the data structures and algorithms used to implement the system.

Key Steps involved in the low-level design

The key steps involved in the low-level design are as follows:

  1. Understanding requirements: The first step in low-level design is understanding the system's requirements. This can be done by reviewing the requirements and speaking with all the stakeholders involved, which provides a clearer picture of the requirements.
     

  2. Defining the architecture: Once the requirements are understood, the next step is to outline the system's architecture. This involves identifying the various components that make up the system and defining how they will interact with each other.
     

  3. Designing the components: After defining the architecture, the next step is to design each component in detail. This includes specifying the data structures and algorithms to be used, and defining the interfaces between components.
     

  4. Defining protocols: In this step, the protocols that will be used for communication between components are defined. This includes specifying the data formats, message structures, and communication protocols that will be used.
     

  5. Creating UML diagrams: To help communicate the design, UML (Unified Modeling Language) diagrams may be created to illustrate the architecture and the interactions between components.
     

  6. Reviewing the design: Once the design is completed, it must be reviewed to ensure that it meets the requirements and is technically sound. This may include obtaining feedback from stakeholders and conducting peer reviews.
     

  7. Refining the design: If necessary, the design can be refined based on feedback received during the review process. This may involve changing the data structures, algorithms, or protocols used in the design.
     

  8. Documenting the design: Once the design has been finalized, it is important to document it in a way that is clear, concise, and easy to understand. This may involve creating detailed design specifications or creating diagrams and flowcharts to illustrate the design.

Benefits of Low-Level Design

The benefits of having low-level design (LLD) are as follows:

...

Increased Reusability: Because the LLD defines the interfaces and protocols used for communication between the various components, it makes it easier to reuse components and systems in the future. This reduces the overall cost and time of future development and improves the efficiency of implementing the system.

Low-Level Design – Jira Stories

#

Jira ID

Requirement

Low Level Design

1

GOV-417

Non-Functional Cross Cutting Requirement: APIs MUST be Idempotent.

API idempotency was implemented using the Correlation ID approach. In the system, functional IDs are allotted to every customer, so that system failures or missing responses do not lead to duplicate transactions or duplicate requests for the same customer.
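
As an illustration of the Correlation ID approach, the minimal Java sketch below returns the stored response for a correlation ID that has already been processed instead of executing the request a second time. The class name, method names and in-memory store are assumptions for illustration only; the actual service would persist processed IDs durably.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative sketch only: a real deployment would persist seen correlation IDs
    // (e.g. in a database or distributed cache) rather than in process memory.
    public class IdempotencyGuard {

        // correlation ID -> previously returned response (hypothetical store)
        private final Map<String, String> processed = new ConcurrentHashMap<>();

        public String handle(String correlationId, String requestBody) {
            // A retry with the same correlation ID (after a timeout or failure)
            // gets the stored response instead of triggering a second transaction.
            return processed.computeIfAbsent(correlationId, id -> execute(requestBody));
        }

        private String execute(String requestBody) {
            return "ACCEPTED"; // placeholder for the actual payment processing
        }
    }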

2

GOV-71

If an API Response will Take Longer than 5 Seconds, you SHOULD Return a Ticket with a Suggested Callback Time that is Resolved by Polling (Audit)

An API audit was done to identify any suspect APIs.

3

GOV-332

ID Mapper stores Functional ID of Agents

Account mapper is the host that stores the beneficiary modality details. Here is the GitHub link

4

GOV-350

Merchant Onboarding

Merchant Onboarding is performed via an API. Here is the GitHub link

5

GOV-349

Agent Onboarding

Agent Onboarding is performed via an API. Here is the GitHub link

6

GOV-330

ID Mapper stores Functional ID of Beneficiary

Account mapper is the host that stores the beneficiary functional ID via an API. Here is the GitHub link

7

GOV-331

ID Mapper stores beneficiary modality details

Account mapper is the host that stores the beneficiary modality details via an API. Here is the GitHub link

8

GOV-347

Add Payment Modality

Account mapper is the host that adds the payment modality details via an API. Here is the GitHub link

9

GOV-348

Update Payment Modality

Account mapper is the host that updates the beneficiary modality details. Here is the GitHub link

10

GOV-346

Beneficiary Onboarding

Beneficiary onboarding is done through an API. Here is the GitHub link

11

GOV-289

It provides the necessary APIs for encrypting the data in transit and when stored as well as the digital signatures used for API authentication. The digital signatures are based on public key cryptography using X.509 digital certificates.

This ticket was decomposed into four stories (GOV-445 to GOV-448) and encrypted digital signatures were added.

Below are these tickets with their respective LLDs.

12

GOV-445

289 Signature, Encrypting signature

Every field is encrypted at rest and in transit, service calls use TLS, and public keys are configured per tenant.

13

GOV-446

289.1 A security library for single source of truth for generating keys and verifying the signatures (Take help from SLCB implementation)

The key is decoded from the X.509 certificate, asymmetric keys are generated, and a Spring interceptor is used to validate the signature.
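
A minimal Java sketch of the two pieces named above, decoding the public key from an X.509 certificate and verifying a request signature, is shown below. In the actual service this check runs inside a Spring interceptor; the SHA256withRSA algorithm and the Base64 signature encoding are assumptions for illustration.

    import java.io.InputStream;
    import java.nio.charset.StandardCharsets;
    import java.security.PublicKey;
    import java.security.Signature;
    import java.security.cert.CertificateFactory;
    import java.security.cert.X509Certificate;
    import java.util.Base64;

    public class SignatureVerifier {

        // Decode the public key from an X.509 certificate stream (PEM or DER).
        public static PublicKey publicKeyFrom(InputStream certStream) throws Exception {
            CertificateFactory factory = CertificateFactory.getInstance("X.509");
            X509Certificate certificate = (X509Certificate) factory.generateCertificate(certStream);
            return certificate.getPublicKey();
        }

        // Verify a Base64-encoded signature over the raw request payload.
        public static boolean verify(PublicKey key, String payload, String base64Signature) throws Exception {
            Signature signature = Signature.getInstance("SHA256withRSA"); // assumed algorithm
            signature.initVerify(key);
            signature.update(payload.getBytes(StandardCharsets.UTF_8));
            return signature.verify(Base64.getDecoder().decode(base64Signature));
        }
    }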

14

GOV-447

289.2 Implementation of data integrity in bulk processor

connector-common was updated in the batch transaction endpoints of the bulk API, with no dependency resolution errors.

15

GOV-448

289.3 Integration test for testing +ve as well as -ve scenarios

For positive scenarios, testing was done to ensure API integrity holds true for the batch transaction endpoints. For negative scenarios, the payload/header/params are edited after generating the signatures.
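
The sketch below illustrates the idea of those two scenarios with plain JDK cryptography: an unmodified payload verifies successfully, while a payload edited after signing fails verification. It is a standalone illustration under assumed key and payload values, not the project's actual integration test.

    import java.nio.charset.StandardCharsets;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.Signature;

    public class SignatureScenarioSketch {

        public static void main(String[] args) throws Exception {
            KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
            generator.initialize(2048);
            KeyPair pair = generator.generateKeyPair();

            byte[] payload = "{\"batchId\":\"123\"}".getBytes(StandardCharsets.UTF_8);
            Signature signer = Signature.getInstance("SHA256withRSA");
            signer.initSign(pair.getPrivate());
            signer.update(payload);
            byte[] signature = signer.sign();

            // Positive scenario: the untouched payload verifies successfully.
            System.out.println("unmodified payload valid: " + verifies(pair, payload, signature));

            // Negative scenario: a payload edited after signing must fail verification.
            byte[] tampered = "{\"batchId\":\"999\"}".getBytes(StandardCharsets.UTF_8);
            System.out.println("tampered payload valid: " + verifies(pair, tampered, signature));
        }

        private static boolean verifies(KeyPair pair, byte[] payload, byte[] signature) throws Exception {
            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(pair.getPublic());
            verifier.update(payload);
            return verifier.verify(signature);
        }
    }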

16

GOV-450

57.1 Zeebe Upgrade Pt.1

In this story we fixed the exporter jar loading issue in Zeebe 8, wrote an integration test that starts a dummy workflow, ran a Docker image integration test, enabled parallel feature file execution, and upgraded the Zeebe and Zeebe gateway versions by moving the Zeebe 1.1.0 configurations to the version 8 Camunda chart.

17

GOV-464

Elastic search certificates/secrets pipeline

Created makefiles and example.mk for ES secrets. Here is the GitHub link.

18

GOV-458

Helm Chart Gold Standard Upgrade: Service Account

A serviceaccount.yaml was added for all microservices. For more detail, here is the GitHub link

19

GOV-451

GOV-440.0 Separate deployment, service and ingress files into separate yaml for all services

Deployment, service and ingress definitions were separated into individual YAML files for all services. Here is the GitHub link.

20

GOV-460

Mock Payer fund transfer bpmn derived from ML (payer is mocked)

A flow diagram was constructed which shows a mock of how payer fund transfer is carried out. Attached in the Annexure of this document as Fig.3.3A&B.

21

GOV-461

Mock fund transfer bpmn (everything is mocked)

A flow diagram was constructed which shows how fund transfer happens and is attached in the Annexure of this document as Fig.3.3A&B.

22

GOV-462

Tenant based use case implementation for bpmn support

Default BPMNs are not overridable but tenant-based ones are overridable from application-tenants.properties in env-labs.

23

GOV-463

Update integ test

Integration testing was updated. For more detail here is the GitHub link

24

GOV-452

GOV440.1 Deployment

Two GitHub links:

Deployment of the Helm gold standard was done in this step. Here is the GitHub link

Deployment of the YAML gold standard was done. Here is the GitHub link

25

GOV-160

MUST have account lookup to resolve accounts

An API to fetch by functional ID and payment modality was implemented, and Ehcache with LRU eviction was implemented for the GET API.
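
The service uses Ehcache for this; the sketch below only illustrates the LRU idea behind the GET-API cache with a plain access-ordered LinkedHashMap. The key shape (functional ID plus payment modality) follows the requirement, while the capacity and value type are assumptions.

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class AccountLookupCache {

        private static final int MAX_ENTRIES = 1000; // illustrative capacity

        // access-order map: the least-recently-used entry is evicted when the cache is full
        private final Map<String, String> cache =
                new LinkedHashMap<String, String>(16, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                        return size() > MAX_ENTRIES;
                    }
                };

        public synchronized String lookup(String functionalId, String paymentModality) {
            String key = functionalId + "|" + paymentModality;
            return cache.computeIfAbsent(key, k -> fetchFromAccountMapper(functionalId, paymentModality));
        }

        private String fetchFromAccountMapper(String functionalId, String paymentModality) {
            return "payee-details"; // placeholder for the real call to the account mapper
        }
    }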

26

GOV-262

Application or event logs will capture events that each component performs and should contain at least the following information: event related information (message or code)

Logger debug statements for each worker in each bulk-processor-related service log the code of the worker. The channel, bulk processor, Mifos connector, Mojaloop connector and GSMA connector are also shown.

27

GOV-265

The components should also generate transaction logs which capture at least the following information: transaction source

Each source BB can be configured as a separate tenant, which we can then log. The Bulk Processor and Channel Connector APIs can have filters to log this tenant for each request. Other components can access it through orchestrator variables and can additionally log it on the first log line of each worker related to GovStack use cases.
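
A minimal sketch of such a per-request filter is shown below. The "Platform-TenantId" header name, the MDC key and the servlet (javax) API level are assumptions for illustration; the real Bulk Processor and Channel Connector filters may differ.

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.slf4j.MDC;

    // Assumes Servlet 4.0+, where Filter.init/destroy have default implementations.
    public class TenantLoggingFilter implements Filter {

        private static final Logger log = LoggerFactory.getLogger(TenantLoggingFilter.class);

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            String tenant = ((HttpServletRequest) request).getHeader("Platform-TenantId"); // assumed header
            MDC.put("tenant", tenant);   // makes the tenant available to every subsequent log line
            log.info("Incoming request for tenant {}", tenant);
            try {
                chain.doFilter(request, response);
            } finally {
                MDC.remove("tenant");
            }
        }
    }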

28

GOV-266

The components should also generate transaction logs which capture at least the following information: transaction destination

Transaction destination details such as the Payee FSP name and the Payee Interop Identifier will be captured. The orchestrator already captures this information in most scenarios; the remaining gaps in the Mojaloop, GSMA and mock schema flows are being validated and filled.

29

GOV-268

The components should also generate transaction logs which capture at least the following information: transaction status (success, failed, in progress)

This LLD was largely a validation exercise: the development team went through the code, logging the different failure messages; internal error messages were validated and external messages were logged carefully.

30

GOV-320

Target account identifier or lookup of payment address

An integration test was written to verify that the number of GET callbacks equals the number of requests.

31

GOV-449

57.0 Zeebe Upgrade Pt 2

All issues found after the Zeebe configuration update were reported. The Zeebe record was compared, the Zeebe client was upgraded in the core services, the images were published with a test tag, that tag was used in the G2P sandbox deployment pipeline, and the business process modelling notation (BPMN) was uploaded for integration testing.

32

GOV-150

Bulk payments have to be routed through the account holding the funds. (This could be either at the government Treasury or a commercial bank).

The payer leg is handled using Fineract workers, and the flow is initiated via bulk transfer.

33

GOV-466

New bpmn for mock payment transfer with debit

A flow diagram was constructed which visualizes the flow of payment transfer with debit and is attached in the Annexure of this document as Fig.3.1.

34

GOV-241

Batch files go through a final check to be clean of defects and inconsistencies, to check with external systems as necessary: Low level validation of data types and data completeness.

Validations were added on the uploaded files: data types were checked for completeness and malicious file detection was performed. In case of errors the file is rejected and needs to be re-uploaded. Integration testing was also added.

35

GOV-467

Sync response for file not uploaded

Files are checked to confirm they were uploaded, and column validation checks the structure of the CSV. Size validation is based on a configurable value, and file extension validation uses Apache Tika.
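
A minimal sketch of these synchronous checks is shown below. The size limit and accepted MIME types are illustrative values rather than the service's actual configuration; only the use of Apache Tika for content-based type detection follows the design.

    import java.io.File;
    import java.io.IOException;
    import org.apache.tika.Tika;

    public class UploadValidator {

        private static final long MAX_SIZE_BYTES = 10L * 1024 * 1024; // configurable in the real service
        private static final Tika TIKA = new Tika();

        public static void validate(File upload) throws IOException {
            if (!upload.exists() || upload.length() == 0) {
                throw new IllegalArgumentException("file not uploaded or empty");
            }
            if (upload.length() > MAX_SIZE_BYTES) {
                throw new IllegalArgumentException("file exceeds the configured size limit");
            }
            // Tika inspects the content, not just the file name, so a renamed binary is caught.
            String detectedType = TIKA.detect(upload);
            if (!"text/csv".equals(detectedType) && !"text/plain".equals(detectedType)) {
                throw new IllegalArgumentException("unexpected content type: " + detectedType);
            }
            // Column/structure validation of the CSV header would follow here.
        }
    }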

36

GOV-468

Async handling of file validations

Row validation, in the form of checks for missing data and data types, was carried out asynchronously.

37

GOV-267

The components should also generate transaction logs which capture at least the following information: supplementary data

The custom data type was changed to an array and the publishing of custom data as a Zeebe variable was carried out.
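
The sketch below shows how a worker could publish such supplementary data as a Zeebe variable holding an array, using the Zeebe Java client. The worker, variable name and sample content are assumptions for illustration, not the project's actual worker.

    import io.camunda.zeebe.client.api.response.ActivatedJob;
    import io.camunda.zeebe.client.api.worker.JobClient;
    import io.camunda.zeebe.client.api.worker.JobHandler;
    import java.util.List;
    import java.util.Map;

    public class CustomDataWorker implements JobHandler {

        @Override
        public void handle(JobClient client, ActivatedJob job) {
            // Supplementary key/value pairs carried as a list (the array type mentioned above).
            List<Map<String, String>> customData = List.of(
                    Map.of("key", "programme", "value", "cash-transfer"));

            Map<String, Object> variables = Map.of("customData", customData);
            client.newCompleteCommand(job.getKey())
                  .variables(variables)
                  .send()
                  .join();
        }
    }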

38

GOV-261

Application or event logs will capture events that each component performs and should contain at least the following information: terminal identity (name and / or IP address)

Terminal identity was captured; the detailed steps of this LLD are in the link below:
Steps to log client IP - Google Docs

39

GOV-210

The account lookup directory service provides a directory that maps the beneficiary's unique identifier to the transaction account

Integration test logic was added to check the callback response body, and Helm charts and a Dockerfile were created for deployment.

40

GOV-126

The payment BB SHOULD get confirmation of the PAYEE on the details submitted or looked up via external id-account mapping

Two flow diagrams were constructed: the first shows the account lookup service with an AMS connector, and the second shows the account lookup service with Mojaloop. The diagrams are in the Annexure as Fig.3.2A&B.

41

GOV-264

The components should also generate transaction logs which capture at least the following information: transaction date and time

This will be displayed in the UI using the Operations App time zone configuration.

42

GOV-293

At the transport layer: Non-repudiation of transactions by parties involved.

JWS (JSON Web Signature) will be used in the response and in the callback at the Bulk Processor level, for both completion and phased callbacks.
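
The sketch below shows the JWS idea end to end using the Nimbus JOSE + JWT library, which is one common Java option; the library choice, RS256 algorithm and callback payload are assumptions for illustration, not a statement of the project's exact implementation.

    import com.nimbusds.jose.JWSAlgorithm;
    import com.nimbusds.jose.JWSHeader;
    import com.nimbusds.jose.JWSObject;
    import com.nimbusds.jose.Payload;
    import com.nimbusds.jose.crypto.RSASSASigner;
    import com.nimbusds.jose.crypto.RSASSAVerifier;
    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.interfaces.RSAPrivateKey;
    import java.security.interfaces.RSAPublicKey;

    public class JwsExample {

        public static void main(String[] args) throws Exception {
            KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
            generator.initialize(2048);
            KeyPair keys = generator.generateKeyPair();

            // Sender: wrap the callback payload in a JWS signed with the private key.
            JWSObject jws = new JWSObject(new JWSHeader(JWSAlgorithm.RS256),
                    new Payload("{\"batchId\":\"123\",\"status\":\"COMPLETED\"}"));
            jws.sign(new RSASSASigner((RSAPrivateKey) keys.getPrivate()));
            String compact = jws.serialize();

            // Receiver: parse and verify with the sender's public key (non-repudiation check).
            JWSObject received = JWSObject.parse(compact);
            boolean valid = received.verify(new RSASSAVerifier((RSAPublicKey) keys.getPublic()));
            System.out.println("signature valid: " + valid + ", payload: " + received.getPayload());
        }
    }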

43

GOV-454

GOV440.3 - Each service has its own helm chart (file based helm dependency)

Each service has its own Chart.yaml, so each service has its own Helm chart. Here is the link

44

GOV-250

Detects batch failure rates

Failures per batch are published, the percentage of failed cases per batch is calculated, and the batch summary API was updated.
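
As a trivial illustration of the published metric, the percentage of failed cases for a batch is simply the failed count over the total count, as in the sketch below (the class and parameter names are hypothetical).

    public class BatchFailureRate {

        // Percentage of failed transactions in a batch, e.g. 37 failures out of 1,000 -> 3.7 %
        public static double failedPercentage(long totalTransactions, long failedTransactions) {
            if (totalTransactions == 0) {
                return 0.0; // avoid division by zero for an empty batch
            }
            return (failedTransactions * 100.0) / totalTransactions;
        }
    }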

45

GOV-290

At the transport layer: All communication between building blocks must be TLS-secured using client authentication

Transport layer security was enabled for the Tomcat server. Here is the link

46

GOV-327

The batch file for bulk payments should contain the beneficiary ID token, amount to be paid.

This was resolved by the verification and validation component of the bulk payment service, which invokes the Account Lookup Directory Service (ALDS/ALS).

47

GOV-465

Run integration tests as GitHub Checks on each PR

CircleCI configuration and testing were done. A Gov tag was added to each of the Gov-Stack related tests, and an integration test job that runs the Helm test suite was added.

48

GOV-439

Helm Chart Gold Standard Upgrade: - Role Binding

Cluster role and cluster role binding YAML files were added. For more info, here is the link

A role and role binding were added to each microservice and tested with the respective service account for each individual microservice. For more info, here is the link.

49

GOV-259

Application or event logs will capture events that each component performs and should contain at least the following information: application / user ID

Authentication in the channel connector was enabled; Keycloak was set up and linked with an API key or Kong consumer, which helps capture events.

50

GOV-453

GOV440.2. - Ingress

Ingress YAML files were updated and ingress values were configured in each microservice's values file.

For more details here is the GitHub link

51

GOV-438

Helm Chart Gold Standard Upgrade: Pod Security Policy

The cluster role and cluster role binding must have security controls, along with a network policy, to run in multiple namespaces. This scope also helps Payment Hub run in multiple namespaces and improves developer productivity. For more detail, here is the GitHub link

52

GOV-518

Separate G2p sandbox chart and create as a new chart

The G2P sandbox base chart was created in env-template and is referred to as the client/customer base chart.
For more details click on this link

53

GOV-292

At the transport layer: Confidentiality of personal information related to the transaction

Implemented the masking service and integrated it using AES encryption on sensitive data.
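
A minimal sketch of AES-based protection of a sensitive field is shown below. The GCM mode, key handling and the sample IBAN are assumptions for illustration and do not describe the masking service's exact configuration.

    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;
    import java.util.Base64;
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;

    public class FieldMasker {

        public static String encrypt(SecretKey key, String sensitiveValue) throws Exception {
            byte[] iv = new byte[12];
            new SecureRandom().nextBytes(iv);                 // fresh IV for every value
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
            byte[] cipherText = cipher.doFinal(sensitiveValue.getBytes(StandardCharsets.UTF_8));
            // Prepend the IV so the value can be decrypted later.
            byte[] out = new byte[iv.length + cipherText.length];
            System.arraycopy(iv, 0, out, 0, iv.length);
            System.arraycopy(cipherText, 0, out, iv.length, cipherText.length);
            return Base64.getEncoder().encodeToString(out);
        }

        public static void main(String[] args) throws Exception {
            KeyGenerator generator = KeyGenerator.getInstance("AES");
            generator.init(256);
            System.out.println(encrypt(generator.generateKey(), "GB29NWBK60161331926819")); // sample IBAN
        }
    }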

54

GOV-275

The audit trail shall comply with the following requirements: The audit trail must be retained as long as the electronic record is legally required to be stored.

Snapshot registration and ES/Kibana tiered storage backup were implemented, complying with the electronic record storage requirement.

55

GOV-302

Pre-processing validation of data (well-formed data and completeness checks) prior to disbursement.

A Google document describes the solution process. Here is the link

56

GOV-325

Batch files containing duplicate payments will not be processed and an error will be generated.

Duplicate transactions will be removed based on payee identifier type, payee identifier, amount and currency. If ordering is enabled and all transactions are in order, adjacent transactions are compared in order; if ordering is disabled, a set is used to find duplicates.
For further information click on this link
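
The sketch below illustrates the set-based variant (ordering disabled): duplicates are detected on the combination of payee identifier type, payee identifier, amount and currency. The Transaction record shape is an assumption for illustration.

    import java.util.ArrayList;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    public class DuplicateFilter {

        // Assumed transaction shape, limited to the fields used for duplicate detection.
        record Transaction(String payeeIdentifierType, String payeeIdentifier,
                           String amount, String currency) { }

        public static List<Transaction> removeDuplicates(List<Transaction> batch) {
            Set<String> seen = new HashSet<>();
            List<Transaction> unique = new ArrayList<>();
            for (Transaction txn : batch) {
                String key = String.join("|", txn.payeeIdentifierType(), txn.payeeIdentifier(),
                        txn.amount(), txn.currency());
                if (seen.add(key)) {          // add() returns false when the key was already seen
                    unique.add(txn);
                }
            }
            return unique;
        }
    }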

57

GOV-338

Payer Bank Simulator provides authorization of payment transaction to be sent via Payer Bank

Two flow diagrams were constructed and are attached in the Annexure of this document as Fig.3.3A&B.

The first shows the payer mock fund transfer.

The second shows the mock fund transfer where everything is mocked.

58

GOV-75

Ability for the government to make a payment to an individual with a Bank Account (inc Bulk)

Three steps were carried out by the development team:

A sub-batch configuration was initiated for bulk payments consuming the Payment Hub bulk connector.

Bulk payment configuration for the use cases was done.

Bulk connector mock workers were derived.

The tickets below are all Low-level designs of GOV-75

1

GOV-501

Gov75.1 Created closed loop bulk connector seed setup

Low level design of GOV-75

2

GOV-502

Gov75.2 Debulking dfspd configuration

Low level design of GOV-75

3

GOV-503

Gov75.3 Configuration reading in init sub batch

Low level design of GOV-75

4

GOV-504

Gov75.4 Setting tenant header in the closed loop bulk connector api calls

Low level design of GOV-75

5

GOV-505

Gov75.5 Batch transfer worker

Low level design of GOV-75

6

GOV-506

Gov-75.6 summary worker

Low level design of GOV-75

7

GOV-507

Gov75.7 Batch details worker

Low level design of GOV-75

8

GOV-508

GOV75.8 Importer rdbms bulk parser configuration

Low level design of GOV-75

9

GOV-509

Gov75.9 New service deployment file setup

Low level design of GOV-75

10

GOV-510

GOV75.10 Integration Test

Low level design of GOV-75

Annexure

Figure 3.1 - GOV-466


Figure 3.2A&B - GOV-126

...

Figure 3.3A&B - GOV-338
