Full-time
On-site
Job Description
We are looking for **DevOps Engineers** with strong experience in data architecture, containerization, and cloud-native deployments. The ideal candidates will have hands-on expertise in Kafka-based data pipelines, Kubernetes orchestration, CI/CD automation, and data integration within regulated environments (preferably banking/financial services). This role involves building scalable, secure, and high-performance data platforms across multiple environments.
**Key Responsibilities:**
* Implement and configure Helm charts across multiple environments.
* Design and manage data ingestion pipelines using Apache Kafka, ensuring high availability and low-latency data delivery.
* Translate functional and technical requirements into scalable technical solutions and workflows.
* Manage data modelling, schema registry configurations, and serialization strategies.
* Monitor, troubleshoot, and resolve issues in data pipelines, schema alignment, and data transformation processes.
* Ensure data integrity, compliance, and adherence to regulatory standards.
* Provide technical training, documentation, and ongoing support to business users and developers.
* Deploy and orchestrate containerized components using Docker and Kubernetes (including AKS and OpenShift).
* Support CI/CD pipelines, automated deployments, and infrastructure-as-code practices.
* Integrate data sources, analytics, and reporting modules to deliver actionable insights.
**Requirements:**
* Strong understanding of data architecture, data flows, and distributed systems.
* Proficiency in Apache Kafka (Schema Registry, topic management, event filtering).
* Extensive experience with SQL Server or similar relational databases.
* Hands-on expertise in data modelling, schema versioning, and data integration.
* Strong command of Docker for building and managing containerized applications.
* Experience deploying and managing applications on Kubernetes platforms, including AKS or OpenShift.
* Experience troubleshooting serialization issues, ingestion failures, and performing log analysis.
* Proficiency in Python, Bash, or similar scripting languages for automation.
* Comfortable working in Linux-based environments, including command-line log and file management.
* Strong analytical and problem\-solving skills, preferably with exposure to financial data contexts.
* Experience with DevOps tools such as Jenkins, Helm, and Git for CI/CD automation.
* Knowledge of cloud-native architecture and microservices within the banking ecosystem.
* Familiarity with data security practices, encryption standards, and GDPR compliance.
* Understanding of real-time analytics, fraud detection, or regulatory reporting use cases.