
Data Engineer / Lead Data Engineer

Amk Technology
Bangalore · 5-12 LPA · Posted 17 Jun 2025
Full Time
Spark
MySQL
Python
RDBMS

Job Description

We're seeking a skilled and experienced Data Engineer or Lead Data Engineer to join our team. You'll be instrumental in designing and implementing scalable data lake solutions, building robust data pipelines for financial data, and optimizing existing processes. If you're an expert in SQL, PySpark, Python, and AWS services, with a passion for high-volume data applications and a knack for troubleshooting, we encourage you to apply!

Key Responsibilities

  • Design and implement scalable data lake solutions specifically for handling customer and market data.
  • Build and continuously improve pipelines for ingesting and enriching financial data.
  • Identify and implement optimizations for existing data pipelines to significantly enhance their scalability, throughput, and data accuracy.
  • Analyze and map source data fields to expand platform functionalities and capabilities.
  • Gather and meticulously document system enhancement requirements through close collaboration with business and core systems teams.
  • Work closely with the production team to troubleshoot and swiftly resolve any data pipeline issues.

Technical Skills

Must Have:

  • Expert Proficiency: SQL, PySpark, Python, Shell Scripting, Airflow.
  • AWS Services: Extensive experience with AWS services including EC2, RDS, S3, EMR, and Athena, plus related tools such as Spark, Presto, and Hudi.
  • Significant Experience: RDBMS (relational database management systems), plus JIRA or Jenkins (for project management / continuous integration).

Good to Have:

  • MySQL Database Management System.
  • Dremio (data virtualization tool).
  • Starburst (data virtualization tool).

Qualifications

  • Bachelor's degree in Computer Science or a related field (BE/BTech).
  • 5-10 years of work experience (with 8+ years for Senior and Lead Engineer positions) with Python, Spark, and SQL.
  • Experience with high-volume data applications in a Linux environment using MySQL or S3 storage is preferred.
  • Minimum of 4 years of experience with RDBMS, specifically working on high-volume daily processing solutions in the financial industry.
  • Strong knowledge of AWS cloud services for data processing, storage, and computation.
  • Basic understanding of financial concepts (instruments, trades, positions, corporate actions).

Additional Preferences:

  • Knowledge of data virtualization/analytics tools like Dremio and Starburst.
  • Experience working with Airflow.
  • Advanced knowledge of financial concepts.
