Techversant
Data Engineer
Thiruvananthapuram / Trivandrum · ₹5–9 LPA · Posted 16 Feb 2026
FULL TIME
Snowflake
Redshift
PySpark
Data Warehousing
SQL
Job Description
Key Responsibilities:
- Design, build, and maintain end-to-end ETL/ELT pipelines using both on-premises and cloud-based technologies.
- Architect and operate data storage and streaming solutions leveraging cloud-based services on AWS, Azure, or GCP.
- Design and implement data ingestion and transformation workflows using Airflow, AWS Glue, or Azure Data Factory.
- Develop and optimize data pipelines using Python and PySpark for large-scale distributed data processing.
- Build data models — normalized, denormalized, and dimensional (Star/Snowflake) — for analytics and warehousing solutions.
- Implement data quality, lineage, and governance using metadata management and monitoring tools.
- Collaborate with cross-functional teams to deliver clean, reliable, and timely data for analytics and machine learning use cases.
- Integrate CI/CD pipelines for data infrastructure deployment using GitHub Actions, Jenkins, or Azure DevOps.
- Automate infrastructure provisioning using Infrastructure as Code (IaC) tools such as AWS CloudFormation or Terraform.
- Monitor and optimize data processing performance for scalability, reliability, and cost-efficiency.
- Enforce data security policies and ensure compliance with standards such as GDPR and HIPAA.
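To make the first responsibility concrete: setting aside PySpark and cloud storage, the extract-transform-load flow the role centres on can be sketched in plain Python. All names and data here are hypothetical; a production pipeline would operate on PySpark DataFrames against a warehouse such as Snowflake or Redshift.

```python
# Minimal ETL sketch with an embedded data-quality rule.
# Illustrative only: sources, rules, and sinks are made up for this example.

def extract(rows):
    """Simulate reading raw records from a source system."""
    return list(rows)

def transform(rows):
    """Clean and standardise: reject rows missing a key, normalise amounts."""
    out = []
    for r in rows:
        if r.get("id") is None:
            continue  # data-quality rule: drop records without a primary key
        out.append({"id": r["id"], "amount": round(float(r["amount"]), 2)})
    return out

def load(rows, sink):
    """Append transformed rows to a target store (here, just a list)."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"id": 1, "amount": "19.991"},
    {"id": None, "amount": "5"},   # fails the quality rule
    {"id": 2, "amount": "3.5"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)        # 2 rows survive the quality check
print(warehouse[0])  # {'id': 1, 'amount': 19.99}
```

In PySpark the same shape appears as a chain of `filter` and `withColumn` calls on a DataFrame, with the load step writing to cloud object storage or a warehouse table.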
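The dimensional-modelling bullet (Star/Snowflake schemas) can likewise be sketched without any warehouse: split flat records into a dimension table with surrogate keys and a fact table that references it. The record layout below is invented for illustration.

```python
# Hypothetical star-schema build from flat sales records.

flat = [
    {"order": 101, "product": "widget", "price": 9.99},
    {"order": 102, "product": "gadget", "price": 24.50},
    {"order": 103, "product": "widget", "price": 9.99},
]

# Dimension table: one row per distinct product, keyed by a surrogate id.
dim_product = {}
for r in flat:
    if r["product"] not in dim_product:
        dim_product[r["product"]] = len(dim_product) + 1

# Fact table: one row per order, carrying the surrogate key, not the name.
fact_sales = [
    {"order": r["order"],
     "product_key": dim_product[r["product"]],
     "price": r["price"]}
    for r in flat
]

print(len(dim_product))  # 2 distinct products
print(fact_sales[2])     # {'order': 103, 'product_key': 1, 'price': 9.99}
```

A Snowflake schema would go one step further and normalise the dimension itself (e.g. product → category) into additional linked tables.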