Big Data, Hadoop, HDFS, Hive, SQL, UNIX

Fusion Plus Solutions
Hyderabad | 4–6 LPA | Posted 21 Jul 2025
FULL TIME
Airflow
Linux
Unix
Hadoop
Oozie

Job Description

Key Responsibilities:

Data Engineering & Development:

  • Design, build, and maintain ETL/ELT pipelines using Hadoop ecosystem tools
  • Write complex Hive queries for data transformation and analysis
  • Work with HDFS for storage and efficient data access
  • Develop UNIX shell scripts for automation of jobs and workflows
  • Optimize SQL queries for performance and scalability
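The shell-automation responsibility above can be illustrated with a minimal sketch of a daily batch wrapper. All table names, columns, and the partition scheme here are hypothetical, and the Hive submission itself is left as a comment; the script only derives the target partition date and assembles the query.

```shell
#!/bin/sh
# Hedged sketch: wrapper that targets a daily Hive partition.
# Table/column names (sales_daily, raw_sales, dt) are hypothetical.

# Partition date: first argument, or yesterday (GNU date; adjust for BSD/macOS).
RUN_DATE="${1:-$(date -d 'yesterday' +%Y-%m-%d)}"

# Assemble the HiveQL for this partition.
HQL="INSERT OVERWRITE TABLE sales_daily PARTITION (dt='${RUN_DATE}')
SELECT order_id, amount FROM raw_sales WHERE dt='${RUN_DATE}';"

echo "Submitting load for partition dt=${RUN_DATE}"
# In a real pipeline this would be submitted, e.g.: hive -e "$HQL"
echo "$HQL"
```

A scheduler (Oozie or Airflow, as listed below) would typically invoke such a wrapper once per day, passing the logical run date explicitly so reruns are reproducible.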

Data Processing & Integration:

  • Process large volumes of structured and semi-structured data
  • Integrate data from various sources into Hadoop-based data lakes
  • Work with cross-functional teams to understand data requirements and deliver solutions

Monitoring, Maintenance & Quality:

  • Monitor and troubleshoot production data pipelines
  • Ensure data quality, integrity, and consistency across systems
  • Support data ingestion, batch processing, and job scheduling (e.g., Oozie, Airflow)
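Monitoring and supporting production pipelines often comes down to small reliability wrappers. Below is a hedged, self-contained sketch of a retry helper; the job command it wraps (e.g. a `hive -f` invocation) is an assumption, not part of the posting.

```shell
#!/bin/sh
# Hedged sketch: retry a batch job command a fixed number of times.
# The wrapped command is whatever is passed as arguments.

run_with_retry() {
    max_tries=3
    try=1
    while [ "$try" -le "$max_tries" ]; do
        if "$@"; then
            echo "job succeeded on attempt $try"
            return 0
        fi
        echo "attempt $try failed; retrying" >&2
        try=$((try + 1))
    done
    echo "job failed after $max_tries attempts" >&2
    return 1
}

# Hypothetical usage: run_with_retry hive -f daily_load.hql
```

In practice the retry count and a sleep/backoff between attempts would be tuned per job, and a final failure would raise an alert rather than fail silently.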

Required Skills and Qualifications:

  • Bachelor's degree in Computer Science, IT, or related field
  • 3–7 years of hands-on experience with Big Data tools
  • Strong expertise in:
      • Hadoop Distributed File System (HDFS)
      • Hive (querying, optimization, partitioning)
      • SQL (advanced queries, joins, aggregations)
      • UNIX/Linux shell scripting
  • Good understanding of data warehousing concepts and large-scale data processing
