The IT Mind Services
GCP Data Engineer
Bangalore ₹5-12 LPA Posted 27 Jun 2025
FULL TIME
ETL
PySpark
Airflow
Job Description
- Experience in ETL and Data Warehousing
- Excellent leadership and communication skills
- Strong hands-on experience with Data Lakehouse architecture
- Proficiency in GCP BigQuery, Cloud Storage, Airflow, Dataflow, Cloud Functions, Pub/Sub, and Cloud Run
- Experience building solution automations using various ETL tools
- Experience delivering at least 2 GCP cloud data warehousing projects
- Experience working on at least 2 Agile/SAFe methodology-based projects
- Experience with PySpark and Teradata
- Skilled in DevOps tools such as GitHub, Jenkins, and cloud-native tooling
- Experienced in handling semi-structured data formats such as JSON, Parquet, and XML
- Experience writing complex SQL queries for data analysis and extraction
- Deep understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality, and Data Mapping
- Experience with a global delivery model (teams of 15+ members)
- Experience collaborating with product/project managers, developers, DBAs, and data governance teams on requirements, design, and deployment
Responsibilities:
- Design and implement data pipelines using GCP services
- Manage deployments and ensure efficient orchestration of services
- Implement CI/CD pipelines using Jenkins or native tools
- Guide a team of data engineers in building scalable data pipelines
- Develop ETL/ELT pipelines using Python, Beam, and SQL
- Continuously monitor and optimize data workflows
- Integrate data from various sources using GCP services and orchestrate with Cloud Composer (Airflow)
- Set up monitoring and alerting using Cloud Monitoring, Datadog, etc.
- Mentor junior developers and data engineers
- Collaborate with developers, architects, and stakeholders on robust data solutions
- Lead data migration from legacy systems (Oracle, Teradata, SQL Server) to GCP
- Facilitate Agile ceremonies (sprint planning, scrums, backlog grooming)
- Engage with clients on analytics programs, ensuring governance and communication with program leadership