Orange Business Services
Data Engineer
Gurgaon ₹3-8 LPA Posted 4 Jun 2025
FULL TIME
DevOps
Data Integration
Hortonworks
Java
Agile
+1 more
Job Description
- Your role is to assist and support the users of the Big Data platform, such as the development teams and the data analysts, and to liaise with the technical teams.
- The key points are as follows:
- Understand and master the EDH environment, more specifically the Hadoop environment, the different components (Hive, Ranger, Nifi, Kafka, ...), the home-made development framework and the datalabs solutions (Dataiku, Jupyter)
- Within a collaborative and agile team, your objective is to support the users of the platform, the Data Analyst teams during their data exploration phase, and to liaise with the technical teams.
- Guarantee the Data As A Service offer, whose follow-up is tracked through JIRA/Confluence
- Contribute to the improvement of the offer to promote the autonomy of the end user, while respecting legal and security constraints
- Contribute to the design and modeling of new proposals that strengthen commitments around this offer
- Write documentation, including creation of notebooks and templates
- Assist/coach all users in the use of the EDH platform
- Address new needs and change requests, for example by developing scripts, writing user stories, or requesting platform upgrades
- You will interact with:
- The Scrum Master, the Tech lead and the other support engineers
- The different stakeholders: Product Owner, Security team, Data Management team, Architecture and Engineering team
- The users of the platform: data analysts, data scientists, use-case developers, and the platform operators
- TECHNICAL EXPERTISE AND REQUIRED SKILLS:
- Knowledge of the offerings and features of Big Data technologies (Hadoop, Hortonworks or other distributions)
- Knowledge of Java development (ability to read and interpret stack traces)
- Good understanding of the Python programming language
- Knowledge of Linux environments
- Good understanding of at least one public cloud, especially Azure
- Good understanding of tools such as Dataiku and Jupyter Notebook
- Advanced Dataiku certification is a plus
- Good understanding of data visualization tools such as QlikView, Power BI, and Qlik Sense
- Understanding of data modeling is a plus
- Familiarity with DevOps practices and Ansible
- Familiarity with project management and monitoring; Agile culture
- Good communication and interpersonal skills to interact with use-case teams (for relationships with projects)
- Experience in setting up and deploying projects in production
- Your profile
- Good job performance requires:
- A real appetite for technical positions
- Teamwork
- Good interpersonal skills.
- You adhere to the principles conveyed by agile methodologies (agile manifesto, customer need at the center, prioritization of the greatest added value, etc.). You adapt to new situations.
- Knowledge of the decision-support field in general (data integration, modeling, reporting) is an additional asset.