
Big Data & GCP - Senior Software Engineer

Pradeepit Consulting Services
Bangalore · 5-11 LPA · Posted 22 Jul 2025
Full Time
PySpark
Dataproc
Dataflow

Job Description

Responsibilities:

  • Design, develop, and maintain the data architecture, data models, and standards for Data Integration & Data Warehousing projects on GCP, combined with other technologies
  • Ensure the use of BigQuery SQL, Java/Python/Scala, and Spark reduces delivery lead time and aligns with the overall group strategy, so that cross-functional development remains usable
  • Own technical solutions from a design and architecture perspective; set the right direction and propose resolutions to potential data pipeline problems
  • Expand and optimize our data and data pipeline architecture, and improve data flow and collection for cross-functional teams
  • Provide technical guidance and support to a vibrant engineering team, coaching and teaching teammates how to do great data engineering
  • Bring a deep understanding of data architecture principles and data warehouse methodologies, specifically Kimball or Data Vault

Requirements

  • An expert in GCP, with 5-7 years of delivery experience across Dataproc, Dataflow, BigQuery, Compute Engine, Pub/Sub, and Cloud Storage
  • Highly knowledgeable in industry best practices for ETL design, principles, and concepts
  • At least 3 years of programming experience in Python
  • A DevOps and Agile engineering practitioner with experience in test-driven development
  • Experienced with the following technologies: Google Cloud Platform, Dataproc, Dataflow, Spark SQL, BigQuery SQL, PySpark, and Python/Scala
  • Experienced with Big Data technologies such as Spark, Hadoop, and Kafka

Technologies

  • Big Data
  • Spark
  • Python/Scala
  • GCP
