Senior Data Engineer - Big Data & Cloud

Orange Business Services
Gurgaon | 4-6 LPA | Posted 4 Jun 2025
FULL TIME
Spark
DevOps Tools
Agile Method
Airflow
Hadoop

Job Description

As part of the Big Data B2B program, OBS has set up a shared Big Data platform and a Data Lake for use-case exploration and industrialization.

We are looking for a Senior Data Engineer with 4-6 years of experience building data pipelines on-premises and in the cloud, with the following KRAs:

  • Automate and industrialize build and development tasks.
  • Lead discussion sessions with stakeholders.
  • Participate in all areas of the data engineering life cycle and lead the team in requirements gathering and data mapping, systems design, data ingestion development, data-mapping documentation, testing and deployment, and post-implementation support and monitoring.
  • Troubleshoot and resolve complex problems and issues.
  • Provide innovative solutions to complex business problems.
  • Report to and work closely with project teams and the Business Analysis team on project delivery status.
  • Prepare progress updates and status reports.
  • Provide operational support, ongoing maintenance, and enhancements after implementation as part of Run Management activities.
  • Implement data-integration best practices.
  • Good understanding of the Big Data ecosystem and frameworks such as Hadoop and Spark.
  • Good experience handling large data volumes, both structured and unstructured, in streaming and batch modes.
  • High coding proficiency in at least one modern programming language: Python, Java, or Scala.
  • Hands-on experience with NiFi, Hive, SQL/HQL, Spark SQL, Spark Streaming, Oozie, and Airflow.
  • Good understanding of data-integration patterns.
  • Good understanding of Kafka, RabbitMQ, and Airflow.
  • Good understanding of API concepts (REST) and microservices architecture.
  • Experience with DevOps tooling: Jenkins, Maven, GitLab, SonarQube, Docker.
  • Good understanding of DevOps concepts and container technologies such as Kubernetes and Docker.
  • Good understanding of the ELK stack.
  • Good understanding of monitoring tools such as Prometheus and Grafana.
  • Good understanding of cloud architecture is a must.
  • Professional certification in any of the hyperscalers, especially GCP, is a plus; full understanding of GCP compute, network, and storage services.
  • Proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Bigtable.
  • Good understanding of Linux and shell scripting.
  • Good experience with Agile methods (Scrum, Kanban).
  • Good understanding of JIRA.
  • Understanding of data modelling is a value addition.
  • Understanding of Open Digital Architecture and TMF principles is preferable.
  • Understanding of tools like DSS and Jupyter Notebook is preferable.