Senior Specialist - Data Engineering

LTM
Hyderabad Posted 29 Mar 2026
FULL TIME
Apache Spark
Google Cloud Platform
Cloud Storage
SQL
DataFlow
+6 more

Job Description


Role description

A GCP BigQuery professional designs, develops, and optimizes large-scale data warehouses, ELT/ETL pipelines, and SQL scripts within Google Cloud.


Responsibilities include managing data ingestion (Pub/Sub, Dataflow), optimizing performance and cost, and implementing security and governance.


Key skills include SQL, Python, Apache Airflow, and data modelling.


Key Responsibilities


·       Data Architecture & Modelling: Design and optimize BigQuery data models, schemas, and storage for performance and cost efficiency.


·       Pipeline Development (ETL/ELT): Build scalable data pipelines and workflows using tools like Cloud Composer (Airflow), Dataflow, and Dataproc.


·       Query Optimization: Write complex SQL scripts and tune performance to minimize BigQuery slot usage.


·       Data Integration: Ingest, transform, and manage data from diverse sources into BigQuery.


·       Governance & Security: Implement data security, access controls, and best practices.
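As a brief illustration of the modelling and slot-optimization responsibilities above, a BigQuery table can be partitioned and clustered so that filtered queries scan less data and consume fewer slots. This is a minimal sketch; the dataset, table, and column names (`my_dataset.events`, `customer_id`, `event_ts`) are hypothetical:

```sql
-- Partition by ingestion date and cluster by a frequently filtered
-- column to reduce bytes scanned (and therefore slot usage and cost).
CREATE TABLE my_dataset.events
(
  event_id    STRING NOT NULL,
  customer_id STRING,
  event_ts    TIMESTAMP,
  payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
OPTIONS (partition_expiration_days = 90);

-- A query that filters on the partition column prunes partitions,
-- scanning only the requested days instead of the full table.
SELECT customer_id, COUNT(*) AS event_count
FROM my_dataset.events
WHERE DATE(event_ts) BETWEEN '2026-03-01' AND '2026-03-07'
GROUP BY customer_id;
```

Clustering on `customer_id` additionally sorts data within each partition, so selective filters on that column read fewer blocks.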


Required Skills & Qualifications


·       Core Technical: Proficiency in GoogleSQL, Python, and Google Cloud Platform (GCP) services (BigQuery, Cloud Storage, Dataflow).


·       Data Processing: Experience with Apache Spark, Beam, or Hadoop.


·       Database Knowledge: Strong understanding of relational (RDBMS) and NoSQL databases.


 
