Accepting Applications
Full-time
On-site
Posted 1 day, 23 hours ago
Job Description
**Data Engineer – Cloud Platforms (Azure | AWS)**

We are seeking an experienced **Data Engineer** with over **5 years of expertise** in designing and implementing modern, cloud-based data solutions, primarily on **Microsoft Azure**, with additional exposure to **AWS**. The ideal candidate is skilled in building scalable data platforms, orchestrating efficient data pipelines, and enabling analytics through **Microsoft Fabric** and related technologies.
**Technical Skills**
**Cloud Platform Expertise:** Strong experience across **Microsoft Azure** and **AWS**, with an emphasis on the Azure ecosystem.

* **Azure Focus:** Hands-on with **Azure Data Factory (ADF)**, **Azure Databricks**, **Azure Synapse Analytics**, **Azure Data Lake Storage (ADLS)**, and **Azure SQL Database**.
* **Microsoft Fabric:** Familiarity with integrating **Data Pipelines**, **Dataflows Gen2**, **Lakehouses**, and **Notebooks** to streamline ingestion, transformation, and reporting.
* **AWS Exposure:** Understanding of **AWS Glue**, **S3**, **Lambda**, and **Redshift** for data processing and analytics.
**Data Engineering Excellence:** Proven ability to design, build, and maintain **ETL/ELT pipelines**, **data models**, and **data warehouses** using modern Azure-based architectures and Fabric workflows.
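The ELT pattern behind these pipelines can be sketched without any cloud services: land raw data first, then transform inside the warehouse with SQL. A minimal illustration using Python's built-in `sqlite3` as a stand-in warehouse (table and column names are invented for the example, not taken from this posting):

```python
import sqlite3

def run_elt(rows):
    """Minimal ELT sketch: load raw rows first, then transform in SQL.

    sqlite3 stands in for a cloud warehouse (e.g. Synapse); the
    schema here is purely illustrative.
    """
    con = sqlite3.connect(":memory:")
    # Extract + Load: land raw data untransformed (the "EL" of ELT).
    con.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, country TEXT)")
    con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)
    # Transform: build a modeled table inside the warehouse (the "T").
    con.execute("""
        CREATE TABLE orders_by_country AS
        SELECT country, COUNT(*) AS n_orders, SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY country
    """)
    return con.execute(
        "SELECT country, n_orders, revenue FROM orders_by_country ORDER BY country"
    ).fetchall()
```

The same load-then-transform split is what ADF or Fabric Data Pipelines orchestrate at scale, with the transform step running in Databricks, Synapse, or a Lakehouse instead of SQLite.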
**Analytics & Visualization:** Strong experience with **Power BI**, especially within the **Fabric environment**, to deliver dynamic data models, semantic layers, and interactive dashboards.
**Programming & Automation:** Proficient in **SQL** and **Python**, developing reusable, scalable scripts for data transformation, validation, and orchestration.
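A reusable validation script of the kind described here typically separates good records from rejects before loading. A small sketch with hypothetical rules (required non-null fields and a non-negative amount, both invented for illustration):

```python
def validate_rows(rows, required=("id", "amount")):
    """Split records into (valid, rejects) before loading.

    The rules are hypothetical examples: required fields must be
    present and non-null, and "amount" must be non-negative.
    Each reject carries a reason for downstream triage.
    """
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) is None]
        if missing:
            rejects.append((row, f"missing fields: {missing}"))
        elif row["amount"] < 0:
            rejects.append((row, "negative amount"))
        else:
            valid.append(row)
    return valid, rejects
```

Keeping rejects with a reason string (rather than silently dropping them) makes the pipeline auditable and simplifies reprocessing.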
**Database Systems:** Experienced with **Azure SQL**, **PostgreSQL**, **MySQL**, and **Microsoft SQL Server**, ensuring data accuracy, security, and optimal performance.
**Big Data & Advanced Processing:** Hands-on experience with **Azure Databricks (Spark)** and **Delta Lake**, leveraging distributed computing for large-scale data processing and analytics.
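A core Delta Lake operation is `MERGE` (upsert): update rows that match on a key, insert those that do not. The semantics can be sketched in plain Python; this is a stand-in for the idea only, not the actual Spark or Delta API:

```python
def merge_upsert(target, updates, key="id"):
    """Sketch of Delta-style MERGE semantics: rows in `updates`
    overwrite matching rows in `target` by key; unmatched rows
    are inserted. Plain-Python illustration, not a Spark API.
    """
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        # Matched: merge fields over the existing row; unmatched: insert.
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return sorted(by_key.values(), key=lambda r: r[key])
```

In Databricks this same logic runs distributed over Delta tables, with the added guarantees of ACID transactions and time travel.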
**Version Control & DevOps:** Proficient in **Git**, with exposure to **CI/CD practices** for deploying and maintaining data pipelines and Fabric artifacts in multi-environment setups.
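A multi-environment CI/CD setup for data pipelines often looks like the following Azure Pipelines sketch; the stage names, scripts, and paths are illustrative assumptions, not taken from this posting:

```yaml
# Illustrative azure-pipelines.yml: test on every push to main,
# then deploy pipeline artifacts to the dev environment.
trigger:
  branches:
    include: [main]

stages:
  - stage: Test
    jobs:
      - job: UnitTests
        pool:
          vmImage: ubuntu-latest
        steps:
          - script: pip install -r requirements.txt && pytest tests/
            displayName: Run pipeline unit tests

  - stage: DeployDev
    dependsOn: Test
    jobs:
      - deployment: Dev
        environment: dev
        strategy:
          runOnce:
            deploy:
              steps:
                # Placeholder for the actual ADF/Fabric deployment step.
                - script: echo "Deploy ADF/Fabric artifacts to dev"
```

Promotion to test and production environments is typically gated on approvals attached to each pipeline environment.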
**Core Attributes**
* Analytical thinker who translates complex business needs into data\-driven solutions.
* Excellent communication and stakeholder management skills across business and technical teams.
* Collaborative, proactive, and detail\-oriented professional who thrives in fast\-paced environments.