
Data Engineer

Nr Consulting
Bangalore · 8-13 LPA · Posted 30 Jun 2025
FULL TIME
Snowflake
SQL
dbt
Airflow
Shell Scripting

Job Description

Consultant Data Engineer

Tools & Technology: Snowflake, SnowSQL, AWS, DBT, Snowpark, Airflow, DWH, Unix, SQL, Shell Scripting, PySpark, Git, Visual Studio, ServiceNow.

Duties and Responsibilities

  • Act as Consultant Data Engineer
  • Understand business requirements and design, develop & maintain scalable automated data pipelines & ETL processes to ensure efficient data processing and storage
  • Create a robust, extensible architecture to meet the client/business requirements
  • Develop Snowflake objects integrated with AWS services and DBT
  • Build different types of data ingestion pipelines as per requirements
  • Develop data transformations in DBT (Data Build Tool) as per requirements
  • Integrate multiple AWS services with Snowflake
  • Work with integration of structured & semi-structured data sets
  • Work on Performance Tuning and cost optimization
  • Work on implementing CDC or SCD type 2
  • Design and build solutions for near real-time stream as well as batch processing
  • Implement best practices for data management, data quality, and data governance
  • Responsible for data collection, data cleaning & pre-processing using Snowflake and DBT
  • Investigate production issues and fine-tune our data pipelines
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery
  • Coordinate and support software developers, database architects, data analysts and data scientists on data initiatives
  • Orchestrate the pipeline using Airflow
  • Suggest improvements to processes, products and services
  • Interact with users, management, and technical personnel to clarify business issues, identify problems, and suggest changes/solutions to business and developers
  • Create technical documentation on Confluence to aid knowledge sharing
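The SCD type 2 duty listed above can be sketched in plain Python. In practice this would typically be a DBT snapshot or a Snowflake MERGE; the record layout here (`valid_from`, `valid_to`, `is_current` on a single business key) is illustrative, not taken from the posting:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key="id", today=None):
    """Apply Slowly Changing Dimension type 2 updates to a dimension.

    dim_rows: existing dimension rows (dicts with key, attributes,
              valid_from, valid_to, is_current).
    incoming: latest source rows (dicts with key + current attributes).
    """
    today = today or date.today().isoformat()
    result = list(dim_rows)
    current = {r[key]: r for r in result if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        attrs = {k: v for k, v in row.items() if k != key}
        if existing is None:
            # Brand-new key: open a fresh current record.
            result.append({key: row[key], **attrs,
                           "valid_from": today, "valid_to": None,
                           "is_current": True})
        elif any(existing.get(k) != v for k, v in attrs.items()):
            # Attribute change: close the old version, open a new one.
            existing["valid_to"] = today
            existing["is_current"] = False
            result.append({key: row[key], **attrs,
                           "valid_from": today, "valid_to": None,
                           "is_current": True})
    return result
```

The same close-old-row/insert-new-row pattern is what a DBT `timestamp` or `check` snapshot strategy generates under the hood.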

Associate Data Engineer

Tools & Technology: Snowflake, DBT, AWS, Airflow, ETL, Datawarehouse, Shell Scripting, SQL, Git, Confluence, Python

Duties and Responsibilities

  • Act as offshore Data Engineer for development, enhancement & testing
  • Design and build solutions for near real-time stream processing as well as batch processing
  • Develop Snowflake objects, implementing their unique features
  • Implement data integration and transformation workflows using DBT
  • Integrate AWS services with Snowflake
  • Participate in implementation plan, respond to production issues
  • Responsible for data collection, data cleaning & pre-processing
  • Develop UDFs, Snowflake procedures, Streams, and Tasks
  • Troubleshoot customer data issues: manually load any missed data, check for and handle data duplication, and perform RCA
  • Investigate production job failures and perform RCA
  • Development of ETL processes and data integration solutions
  • Understand the client's business needs and provide technical solutions
  • Monitor the overall functioning of processes, identify improvement areas, and implement improvements with scripting
  • Handling major outages effectively along with effective communication to business, users & development partners
  • Define and create Run Book entries and knowledge articles based on incidents experienced in production
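The duplicate-checking and data-cleaning duties above can be sketched in plain Python. The field names (`order_id`, `ts`) and the keep-latest-record policy are illustrative assumptions, not details from the posting:

```python
from collections import Counter

def find_duplicates(records, key_fields):
    """Return the business keys that appear more than once in a batch."""
    keys = [tuple(r[f] for f in key_fields) for r in records]
    counts = Counter(keys)
    return {k: n for k, n in counts.items() if n > 1}

def dedupe_latest(records, key_fields, ts_field):
    """Keep only the most recent record per business key, by timestamp."""
    best = {}
    for r in records:
        k = tuple(r[f] for f in key_fields)
        if k not in best or r[ts_field] > best[k][ts_field]:
            best[k] = r
    return list(best.values())
```

In a warehouse, the equivalent would usually be a `QUALIFY ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1` filter; the Python version just makes the RCA-friendly duplicate report explicit.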

Associate Engineer

Tools and Technology: UNIX, Oracle, Shell Scripting, ETL, Hadoop, Spark, Sqoop, Hive, Control-M, Tectia, SQL, Jira, HDFS, Snowflake, DBT, AWS

Duties and Responsibilities

  • Worked as a Senior Production/Application Support Engineer
  • Worked as a production support member for the loading, processing and reporting of files and generation of reports
  • Monitoring multiple batches, jobs and processes; analyzing issues related to job failures; and handling FTP and connectivity issues behind batch/job failures
  • Performing data analysis on files and generating/sending files to destination server depending on functionality of job
  • Creating shell scripts for automating daily tasks or as requested by service owner
  • Involved in tuning jobs to improve performance and performing daily checks
  • Coordinating with Middleware, DWH, CRM and other teams in case of any CRQ issues
  • Monitoring overall functioning of processes, identifying improvement areas and implementing with scripting
  • Raising PBI after approval from service owner
  • Involved in performance improvement automation activities to decrease manual workload
  • Ingesting data from RDBMS systems to HDFS/Hive through Sqoop
  • Understanding customer problems and providing appropriate technical solutions
  • Handling major outages effectively with proper communication to business, users & development partners
  • Coordinating with client, on-site personnel and joining bridge calls for any issues
  • Handling daily issues based on application and job performance
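The batch-monitoring and job-failure analysis duties above can be sketched in plain Python. The log format (`JOB=... STATUS=... REASON=...`) is a hypothetical one for illustration; real Control-M or application logs would need their own pattern:

```python
import re

# Hypothetical log line: "2025-06-30 02:14:05 JOB=daily_load STATUS=FAILED REASON=FTP timeout"
LOG_PATTERN = re.compile(
    r"JOB=(?P<job>\S+)\s+STATUS=(?P<status>\S+)(?:\s+REASON=(?P<reason>.*))?"
)

def summarize_failures(log_lines):
    """Collect failed jobs and their failure reasons from batch log lines."""
    failures = {}
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m and m.group("status") == "FAILED":
            failures[m.group("job")] = (m.group("reason") or "unknown").strip()
    return failures
```

A summary like this is the kind of output that feeds the RCA write-ups and Run Book entries the roles above describe.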
