Low Level Designs
Low Level Designs – Jira Stories
Delivery Milestone 01
DESCRIPTION: This is a component within “Delivery Milestone 1”, which is defined as “UI Design, Blueprint of the deliverable and wireframes during the project implementation, Custom Specifications, Workflow models, Low Level Designs of Jira Stories.”
15/06/2023 Version 1.0
Document Purpose
The purpose of this document is to provide guidance on the overall landscape of the Payments Building Block. By leveraging its different components, the Payments Building Block covers functionality such as payments to different financial addresses, including IBANs and Mobile Money Accounts. The document also outlines the steps and requirements necessary to achieve these objectives.
This document is intended for the following audiences so that they may achieve their respective objectives:
Mifos Development and QA Teams.
Software developers and engineers at Mifos who will be responsible for implementing the requirements and building the software system will need to understand the detailed functionality and API specifications in order to develop the system as specified.
The QA teams at Mifos and GovStack will ensure that the software system meets the requirements and functions as defined in this document or in the identified stories from the backlog referenced in this document.
Project Management.
Project managers at Mifos and GovStack will be responsible for overseeing the development process and ensuring that the project is completed on time and within budget. They would need to understand the specifications in order to monitor progress and ensure that the development team is on track.
GovStack Business Stakeholders.
Individuals or groups who defined the software system requirements will use this document to ensure that the system meets their needs and requirements.
To aid these audiences, this document provides the low-level designs for the Jira stories that were opened by GovStack. Some stories had flow charts attached, which are included in the Annexure of this document.
What are Low-Level Designs?
Low-level design (LLD) is the process of specifying and defining the detailed design of a system. The design focuses on the implementation details of a system, in our case the system required by GovStack. The LLD focuses on how the system will be built and how it will function at a detailed level.
This is an important phase in our project, as the design is translated into more concrete details such as the data structures and algorithms used to implement the system.
Key Steps involved in the low-level design
The key steps involved in the low-level design are as follows:
Understanding requirements: The first step in low-level design is understanding the system's requirements. This can be done by reviewing the requirements and speaking with all the stakeholders involved, which provides a clearer picture of the requirements.
Defining the architecture: Once the requirements are understood, the next step is to outline the system's architecture. This involves identifying the various components that make up the system and defining how they will interact with each other.
Designing the components: After defining the architecture, the next step is to design each component in detail. This includes specifying the data structures and algorithms to be used, and defining the interfaces between components.
Defining protocols: In this step, the protocols that will be used for communication between components are defined. This includes specifying the data formats, message structures, and communication protocols that will be used.
Creating UML diagrams: To help communicate the design, UML (Unified Modeling Language) diagrams may be created to illustrate the architecture and the interactions between components.
Reviewing the design: Once the design is completed, it must be reviewed to ensure that it meets the requirements and is technically sound. This may include obtaining feedback from stakeholders and conducting peer reviews.
Refining the design: If necessary, the design can be refined based on feedback received during the review process. This may involve changing the data structures, algorithms, or protocols used in the design.
Documenting the design: Once the design has been finalized, it is important to document it in a way that is clear, concise, and easy to understand. This may involve creating detailed design specifications or creating diagrams and flowcharts to illustrate the design.
Benefits of Low-Level Design
The benefits of having low-level design (LLD) are as follows:
Better Quality: When the designs are specified, the LLD ensures that the system is built according to the requirements and that the designs are well documented. This enhances the system’s quality and reduces the chances of bugs.
Better Understanding: When LLDs are documented, it is easier for developers to understand the system and implement it correctly. This reduces delays and improves the overall efficiency of the development process.
Increased Reusability: When the interfaces and protocols used for communication between components are clearly defined, the LLD makes it easier to reuse components and systems in the future. This reduces the overall cost and time of future development and improves the efficiency of implementing the system.
Low-Level Design – Jira Stories
# | Jira ID | Requirement | Low Level Design |
1 | GOV-417 | Non-Functional Cross Cutting Requirement: APIs MUST be Idempotent. | Idempotency of the APIs was implemented with the Correlation ID approach: functional IDs are allotted to every customer so that duplicate transactions and requests from the same customer, caused by system failures or missing responses, are avoided (a sketch follows this table). |
2 | GOV-71 | If an API Response will Take Longer than 5 Seconds, you SHOULD Return a Ticket with a Suggested Callback Time that is Resolved by Polling (Audit) | An API audit was done to identify any suspect APIs. |
3 | GOV-332 | ID Mapper stores Functional ID of Agents | Account mapper is the host that stores the beneficiary modality details. Here is the GitHub link |
4 | GOV-350 | Merchant Onboarding | Merchant Onboarding is performed via an API. Here is the GitHub link |
5 | GOV-349 | Agent Onboarding | Agent Onboarding is performed via an API. Here is the GitHub link |
6 | GOV-330 | ID Mapper stores Functional ID of Beneficiary | Account mapper is the host that stores the beneficiary functional ID via an API. Here is the GitHub link |
7 | GOV-331 | ID Mapper stores beneficiary modality details | Account mapper is the host that stores the beneficiary modality details via an API. Here is the GitHub link |
8 | GOV-347 | Add Payment Modality | Account mapper is the host that adds the payment modality details via an API. Here is the GitHub link |
9 | GOV-348 | Update Payment Modality | Account mapper is the host that updates the beneficiary modality details. Here is the GitHub link |
10 | GOV-346 | Beneficiary Onboarding | Beneficiary onboarding is done through an API. Here is the GitHub link |
11 | GOV-289 | It provides the necessary APIs for encrypting the data in transit and when stored as well as the digital signatures used for API authentication. The digital signatures are based on public key cryptography using X.509 digital certificates. | This ticket was decomposed into four stories (GOV-445 to GOV-448) and encrypted digital signatures were added. Below are these tickets with their respective LLDs. |
12 | GOV-445 | 289 Signature, Encrypting signature | Every field is encrypted at rest and in transit, service calls use TLS, and public keys are configured per tenant (a sketch of field encryption follows this table). |
13 | GOV-446 | 289.1 A security library for single source of truth for generating keys and verifying the signatures (Take help from SLCB implementation) | Decoding of the key from the X.509 certificate was implemented, asymmetric keys are generated, and a Spring interceptor is used for validation of the signature (a sketch follows this table). |
14 | GOV-447 | 289.2 Implementation of data integrity in bulk processor | connector-common was updated in the batch transaction endpoints of the bulk API, with no dependency resolution errors. |
15 | GOV-448 | 289.3 Integration test for testing +ve as well as -ve scenarios | For positive scenarios, testing was done to ensure API integrity holds true for the batch transaction endpoints. For negative scenarios, the payload/header/params are edited after generating the signatures. |
16 | GOV-450 | 57.1 Zeebe Upgrade Pt.1 | In this story the exporter JAR loading issue in Zeebe 8 was fixed, an integration test was written to start a dummy workflow, a Docker image integration test was done, parallel feature file execution was added, and the Zeebe and Zeebe gateway versions were upgraded by moving the Zeebe 1.1.0 configurations to the version 8 Camunda chart. |
17 | GOV-464 | Elastic search certificates/secrets pipeline | Created makefiles and example.mk for ES secrets. Here is the GitHub link. |
18 | GOV-458 | Helm Chart Gold Standard Upgrade: Service Account | A service account.yaml was added for all microservices. For more detail here is the GitHub link |
19 | GOV-451 | GOV-440.0 Separate deployment, service and ingress files into separate yaml for all services | Deployment, service, and ingress definitions were separated into individual YAML files for all services. Here is the GitHub link. |
20 | GOV-460 | Mock Payer fund transfer bpmn derived from ML (payer is mocked) | |
21 | GOV-461 | Mock fund transfer bpmn (everything is mocked) | |
22 | GOV-462 | Tenant based use case implementation for bpmn support | Default BPMNs are not overridable but tenant-based ones are overridable from application-tenants.properties in env-labs. |
23 | GOV-463 | Update integ test | Integration testing was updated. For more detail here is the GitHub link |
24 | GOV-452 | GOV440.1 Deployment | Two GitHub links: deployment of the Helm gold standard was done in this step (here is the GitHub link), and deployment of the YAML gold standard was done (here is the GitHub link). |
25 | GOV-160 | MUST have account lookup to resolve accounts | API fetching by functional ID and payment modality was implemented, and Ehcache with an LRU policy was added for the GET API (a sketch follows this table). |
26 | GOV-262 | Application or event logs will capture events that each component performs and should contain at least the following information: event related information (message or code) | Logger debug statements for each worker in each bulk-processor-related service log the code of the worker. The channel, bulk processor, Mifos connector, Mojaloop connector, and GSMA connector are also shown. |
27 | GOV-265 | The components should also generate transaction logs which capture at least the following information: transaction source | Each source BB can be configured as a separate tenant, and that tenant is logged. The Bulk Processor and Channel Connector APIs can have filters to log this tenant for each request. Other components can access it through orchestrator variables and can additionally log it in the first line of the log for each worker related to GovStack use cases. |
28 | GOV-266 | The components should also generate transaction logs which capture at least the following information: transaction destination | Transaction destination details such as the Payee FSP name and the Payee interop identifier are captured. The orchestrator already captures this information in most scenarios; the remaining gaps in the Mojaloop, GSMA, and mock schema flows were validated and filled. |
29 | GOV-268 | The components should also generate transaction logs which capture at least the following information: transaction status (success, failed, in progress) | This LLD was more of a validation exercise: the development team went through the code, logged the different failure messages, validated the internal error messages, and made sure the external messages were logged carefully. |
30 | GOV-320 | Target account identifier or lookup of payment address | An integration test was written to verify that the number of GET callbacks equals the number of requests. |
31 | GOV-449 | 57.0 Zeebe Upgrade Pt 2 | |
32 | GOV-150 | Bulk payments have to be routed through the account holding the funds. (This could be either at the government Treasury or a commercial bank). | The payer side is handled using Fineract workers, and the flow is initiated via a bulk transfer. |
33 | GOV-466 | New bpmn for mock payment transfer with debit | |
34 | GOV-241 | Batch files go through a final check to be clean of defects and inconsistencies, to check with external systems as necessary: Low level validation of data types and data completeness. | Validations were added on the uploaded files: data types were checked for data completeness and malicious file detection was done. In case of errors the file is rejected and must be re-uploaded. Integration testing was added. |
35 | GOV-467 | Sync response for file not uploaded | Files are checked for whether they were uploaded, and column validation checks the structure of the CSV. Size validation is based on a configurable value, and file extension validation uses Apache Tika (a sketch follows this table). |
36 | GOV-468 | Async handling of file validations | Row validation, in the form of checks for missing data and data types, is carried out asynchronously. |
37 | GOV-267 | The components should also generate transaction logs which capture at least the following information: supplementary data | The custom data type was changed to an array, and publishing of the custom data as a Zeebe variable was implemented. |
38 | GOV-261 | Application or event logs will capture events that each component performs and should contain at least the following information: terminal identity (name and / or IP address) | Terminal identity was captured; the detailed steps of this LLD are in the link below: |
39 | GOV-210 | The account lookup directory service provides a directory that maps the beneficiary's unique identifier to the transaction account | Integration test logic was written to check the callback response body, and Helm charts and a Dockerfile were created for deployment. |
40 | GOV-126 | The payment BB SHOULD get confirmation of the PAYEE on the details submitted or looked up via external id-account mapping | Two flow diagrams were constructed: the first shows the account lookup service with an AMS connector, and the second shows the account lookup service with Mojaloop. The diagrams are in the Annexure as Fig. 3.2 A & B. |
41 | GOV-264 | The components should also generate transaction logs which capture at least the following information: transaction date and time | The transaction date and time are displayed in the UI, in the time zone configured in the Operations App. |
42 | GOV-293 | At the transport layer: Non-repudiation of transactions by parties involved. | JWS (JSON Web Signature) is used in the response and in the callback at the Bulk Processor level, for both completion and phased callbacks (a sketch follows this table). |
43 | GOV-454 | GOV440.3 - Each service has its own helm chart (file based helm dependency) | Each service has its own Chart.yaml, so each service has its own Helm chart. Here is the link. |
44 | GOV-250 | Detects batch failure rates | Failures per batch are published, along with the percentage of failed cases per batch, and the batch summary API was updated. |
45 | GOV-290 | At the transport layer: All communication between building blocks must be TLS-secured using client authentication | Transport layer security was enabled for the Tomcat server. Here is the link. |
46 | GOV-327 | The batch file for bulk payments should contain the beneficiary ID token, amount to be paid. | This was resolved by the verification and validation component of the bulk payment service, which invokes the Account Lookup Directory Service (ALDS/ALS). |
47 | GOV-465 | Run integration tests as GitHub Checks on each PR | CircleCI configuration and testing were done. A Gov tag was also added to each of the GovStack-related tests, and a job that runs the integration tests using the Helm test suite was added. |
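Illustrative Code Sketches
The sketches below illustrate some of the low-level designs referenced in the table above. They are simplified, assumption-laden examples written for this document, not extracts from the actual Payment Hub code; class names, header names, and configuration values are hypothetical unless stated otherwise.

For GOV-417 (row 1), the Correlation ID approach can be sketched as a small store that executes a transfer only once per correlation ID and replays the stored result on retries, so a client retry after a timeout does not debit the customer twice. The class name, the in-memory map, and the example IDs are assumptions.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

/**
 * Minimal sketch of correlation-ID based idempotency: a client sends the same
 * correlation ID on every retry of a request, and the service replays the
 * stored result instead of re-executing the transfer.
 */
public class IdempotencyStore {

    // correlationId -> previously computed response (in-memory for illustration;
    // a real deployment would use a shared store such as a database or Redis)
    private final Map<String, String> processed = new ConcurrentHashMap<>();

    /** Executes the transfer only if this correlation ID has not been seen before. */
    public String executeOnce(String correlationId, Supplier<String> transfer) {
        return processed.computeIfAbsent(correlationId, id -> transfer.get());
    }

    public static void main(String[] args) {
        IdempotencyStore store = new IdempotencyStore();
        // First call executes the transfer.
        String first = store.executeOnce("corr-123", () -> "transferId=abc");
        // A retry with the same correlation ID returns the same stored result.
        String retry = store.executeOnce("corr-123", () -> "transferId=def");
        System.out.println(first.equals(retry)); // true
    }
}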
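For GOV-445 (row 12), the following is a minimal sketch of field-level encryption at rest using AES-GCM from the JDK. The algorithm choice, key handling, and the convention of storing the IV next to the ciphertext are assumptions; in the real system keys are managed per tenant.

import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

/** Minimal sketch of encrypting a single field before it is stored. */
public class FieldEncryptionExample {

    public static void main(String[] args) throws Exception {
        // Throwaway key for the example; real keys would come from tenant configuration.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        String field = "IBAN DE89370400440532013000";

        // Encrypt with a random 12-byte IV and store the IV alongside the ciphertext.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(field.getBytes(StandardCharsets.UTF_8));
        String stored = Base64.getEncoder().encodeToString(iv)
                + ":" + Base64.getEncoder().encodeToString(ciphertext);

        // Decrypt using the stored IV.
        String[] parts = stored.split(":");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(128, Base64.getDecoder().decode(parts[0])));
        byte[] plaintext = cipher.doFinal(Base64.getDecoder().decode(parts[1]));
        System.out.println(new String(plaintext, StandardCharsets.UTF_8));
    }
}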
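For GOV-446 (row 13), the following is a minimal sketch of decoding the public key from an X.509 certificate and verifying a request signature with the JDK's java.security API. In the services this logic sits behind a Spring interceptor; the certificate path, the SHA256withRSA algorithm, and the Base64-encoded signature format are assumptions.

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.PublicKey;
import java.security.Signature;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;
import java.util.Base64;

/** Minimal sketch of X.509 key decoding and signature verification. */
public class SignatureVerifier {

    /** Extracts the public key from a DER/PEM encoded X.509 certificate. */
    public static PublicKey publicKeyFromCertificate(Path certificatePath) throws Exception {
        CertificateFactory factory = CertificateFactory.getInstance("X.509");
        try (InputStream in = Files.newInputStream(certificatePath)) {
            X509Certificate certificate = (X509Certificate) factory.generateCertificate(in);
            return certificate.getPublicKey();
        }
    }

    /** Verifies a Base64-encoded RSA signature over the raw request payload. */
    public static boolean verify(PublicKey publicKey, byte[] payload, String base64Signature) throws Exception {
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(publicKey);
        verifier.update(payload);
        return verifier.verify(Base64.getDecoder().decode(base64Signature));
    }
}

A Spring HandlerInterceptor would call verify() in preHandle, rejecting the request before it reaches the controller when the signature does not match.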
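For GOV-160 (row 25), the LRU caching behaviour behind the account-lookup GET API can be sketched with a plain LinkedHashMap in access order. The real implementation uses Ehcache; the class name, cache size, and value type here are assumptions that only illustrate the eviction behaviour.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

/** Minimal sketch of an LRU cache in front of the account-lookup GET. */
public class LruLookupCache {

    private final int maxEntries;
    private final Map<String, String> cache;

    public LruLookupCache(int maxEntries) {
        this.maxEntries = maxEntries;
        // accessOrder = true keeps entries in least-recently-used order.
        this.cache = new LinkedHashMap<String, String>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > LruLookupCache.this.maxEntries;
            }
        };
    }

    /** Returns the cached payment modality for a functional ID, loading it on a miss. */
    public synchronized String getPaymentModality(String functionalId, Function<String, String> loader) {
        String cached = cache.get(functionalId);   // get() refreshes the access order
        if (cached == null) {
            cached = loader.apply(functionalId);   // cache miss: load from the backing store
            cache.put(functionalId, cached);       // put() may evict the least-recently-used entry
        }
        return cached;
    }
}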
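For GOV-467 (row 35), the following is a minimal sketch of the synchronous file checks: size validation against a configurable limit and content-type detection with Apache Tika, so that a file merely carrying a .csv extension but containing something else is rejected. The size limit and the accepted MIME types are assumptions.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import org.apache.tika.Tika;

/** Minimal sketch of size and content-type validation for an uploaded bulk file. */
public class BulkFileValidator {

    private final Tika tika = new Tika();
    private final long maxSizeBytes;

    public BulkFileValidator(long maxSizeBytes) {
        this.maxSizeBytes = maxSizeBytes;
    }

    /** Returns true when the uploaded file is within the size limit and really contains CSV/plain text. */
    public boolean isValid(Path uploadedFile) throws IOException {
        if (Files.size(uploadedFile) > maxSizeBytes) {
            return false; // size validation based on a configurable value
        }
        // Tika inspects the actual bytes, not just the ".csv" extension.
        String detectedType = tika.detect(uploadedFile.toFile());
        return detectedType.equals("text/csv") || detectedType.equals("text/plain");
    }
}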
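For GOV-293 (row 42), the following is a minimal sketch of signing a bulk-processor callback body as a JWS using the Nimbus JOSE+JWT library. The throwaway key pair, the RS256 algorithm, and the payload are assumptions; in the real system the per-tenant keys tied to the X.509 certificates would be used.

import com.nimbusds.jose.JWSAlgorithm;
import com.nimbusds.jose.JWSHeader;
import com.nimbusds.jose.JWSObject;
import com.nimbusds.jose.Payload;
import com.nimbusds.jose.crypto.RSASSASigner;
import com.nimbusds.jose.crypto.RSASSAVerifier;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.interfaces.RSAPublicKey;

/** Minimal sketch of JWS signing and verification of a callback body. */
public class JwsNonRepudiationExample {

    public static void main(String[] args) throws Exception {
        // Throwaway key pair for the example; real keys come from tenant configuration.
        KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
        generator.initialize(2048);
        KeyPair keyPair = generator.generateKeyPair();

        String callbackBody = "{\"batchId\":\"b-1\",\"status\":\"COMPLETED\"}";

        // Sender signs the callback body so it cannot later repudiate it.
        JWSObject jws = new JWSObject(new JWSHeader(JWSAlgorithm.RS256), new Payload(callbackBody));
        jws.sign(new RSASSASigner(keyPair.getPrivate()));
        String compactJws = jws.serialize();

        // Receiver verifies the signature with the sender's public key.
        JWSObject received = JWSObject.parse(compactJws);
        boolean valid = received.verify(new RSASSAVerifier((RSAPublicKey) keyPair.getPublic()));
        System.out.println("signature valid: " + valid);
        System.out.println("payload: " + received.getPayload());
    }
}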