Teamware Solutions
Hadoop Developer
Chennai · ₹4-6 LPA · Posted 16 Jul 2025
FULL TIME
Hive
Pig
Sqoop
Scala
Java
Job Description
Key Responsibilities:
- Develop, test, and deploy Hadoop-based data processing workflows using tools like MapReduce, Hive, Pig, and Spark.
- Design and implement ETL/ELT pipelines to ingest and process large volumes of structured and unstructured data.
- Write efficient Hive queries, optimize MapReduce jobs, and develop Spark applications using Scala, Java, or Python.
- Work with HDFS for storage management and data ingestion strategies.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Monitor and troubleshoot Hadoop jobs and cluster performance issues.
- Ensure data quality, data governance, and security compliance in big data solutions.
- Maintain documentation for code, processes, and workflows.
- Participate in code reviews, testing, and deployment activities.
Qualifications and Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience as a Hadoop Developer or Big Data Engineer.
- Strong experience with Hadoop ecosystem components such as HDFS, MapReduce, Hive, Pig, HBase, Oozie, Sqoop, and Flume.
- Proficient in programming languages such as Java, Scala, or Python for developing big data applications.
- Experience with Apache Spark for batch and stream processing is highly desirable.
- Familiarity with data modeling, schema design, and query optimization techniques in big data environments.
- Knowledge of Linux/Unix systems and shell scripting.
- Experience working with cloud-based big data platforms (AWS EMR, Azure HDInsight, Google Dataproc) is a plus.
- Strong problem-solving skills and the ability to work in a collaborative Agile environment.
Desirable Skills:
- Experience with real-time data streaming tools like Kafka or Storm.
- Knowledge of NoSQL databases such as HBase, Cassandra, or MongoDB.
- Familiarity with DevOps and CI/CD pipelines for big data workflows.
- Understanding of data security and privacy best practices in big data environments.
- Excellent communication and teamwork skills.