GlobalFoundries Engineering Private Limited
Senior Lead Data Engineer - DevOps
Bangalore ₹3-5 LPA Posted 20 Jun 2025
FULL TIME
AWS Cloud
AWS Redshift
ETL Development
Big Data
Python
Job Description
Your Job:
- Understand the business case and translate it into a holistic solution involving Ab Initio (cloud/on-prem), Python, data ingestion, and the cloud databases Redshift/Postgres on AWS.
- PL/SQL development for high-volume data sets.
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target mapping).
- DB query monitoring for tuning and optimization opportunities.
- Proven experience with large, complex database projects in environments producing high-volume data.
- Demonstrated problem-solving skills; familiarity with various root cause analysis methods; experience in documenting identified problems and their resolutions.
- Make recommendations regarding enhancements and/or improvements.
- Provide appropriate consulting, interfacing, and standards relating to database management, and monitor transaction activity and utilization.
- Performance issue analysis and tuning.
- Data Warehouse design and development, including logical and physical schema design.
Other Responsibilities:
- Perform all activities in a safe and responsible manner and support all Environmental, Health, Safety & Security requirements and programs
- Customer/stakeholder focus. Ability to build strong relationships with application teams, cross-functional IT, and global/local IT teams.
Required Qualifications:
- Bachelor's or Master's degree in Information Technology, Electrical Engineering, or a similar relevant field.
- Proven experience (3 years minimum) with ETL development, design, performance tuning, and optimization.
- Very good knowledge of data warehouse architecture approaches and trends, and a strong interest in applying and further developing that knowledge, including an understanding of dimensional modelling and ERD design approaches.
- Working experience in Kubernetes and Docker administration is an added advantage.
- Must have good experience with AWS services, Ab Initio, Big Data, Python, and the cloud database Redshift.
- Experience in PySpark is good to have.
- Proven experience with large, complex database projects in environments producing high-volume data.
- Proficiency in SQL and PL/SQL.
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target mapping).
- Excellent conceptual abilities paired with very good technical documentation skills, e.g. the ability to understand and document complex data flows as part of business/production processes and infrastructure.
- Familiarity with SDLC concepts and processes.