
Sr. Eng. I, Distrib Tech

Invesco
Hyderabad · 5-6 LPA · Posted 30 Jun 2025
FULL TIME
Snowflake
SQL
dbt
Airflow
PostgreSQL

Job Description

Job Summary

  • This Senior Data Engineer role sits within the Product Engagement part of our Digital, Distribution, and Enterprise Engineering (D2E2) organization
  • You will maintain good relationships with internal partners and collaborate with peers across the Distribution business domain
  • You will develop a strong understanding of investments and distribution data and business processes, including how that data is produced and consumed across all Distribution channels, and how to apply Invesco's data strategy and practices to drive Distribution's data capabilities forward
  • The focus of this position is to design, develop, and manage our data infrastructure and pipelines
  • In this role, you will leverage modern data tools including Snowflake, PostgreSQL, DBT, Airflow, and Airbyte to ensure seamless data integration, transformation, and availability for analytics and business intelligence
  • You will play a key role in building scalable and efficient data systems to support our growing data needs

You will be responsible for:

  • Data Pipeline Development: Build and maintain end-to-end data pipelines using Airbyte for data ingestion and Airflow for workflow orchestration, ensuring reliable data flow from source systems to Snowflake.
  • Data Transformation: Implement and optimize data transformation workflows using DBT to create clean, structured, and analytics-ready datasets in Snowflake.
  • Database Management: Design, manage, and optimize PostgreSQL databases for operational data storage and integrate them with Snowflake for analytics use cases.
  • Snowflake Administration: Configure and maintain Snowflake as the primary data warehouse, including schema design, performance tuning, and cost management.
  • Workflow Automation: Use Airflow to schedule, monitor, and troubleshoot data pipeline jobs, ensuring timely and accurate data delivery.
  • Data Integration: Leverage Airbyte to connect and sync data from various sources (e.g., APIs, SaaS platforms, databases) into Snowflake and PostgreSQL.
  • Data Quality: Implement testing and validation processes within DBT and Airflow to ensure data accuracy, consistency, and reliability.
  • Collaboration: Partner with data analysts, data scientists, and business stakeholders to understand requirements and deliver tailored data solutions.
  • Performance Optimization: Monitor and optimize pipeline performance, query execution in Snowflake, and resource usage across all tools.
  • Documentation: Maintain detailed documentation of pipelines, transformations, and database schemas for team reference and compliance.
  • Problem-Solving: Apply strong problem-solving skills to diagnose and resolve pipeline and data issues.
  • Meetings: Attend regular team meetings and other required meetings on a daily or weekly basis.
  • Communication: Ensure clear and accurate communication and respond to data, business, and technology partners and peers in a timely manner.
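As a rough illustration of the ingest → transform → validate flow the responsibilities above describe, here is a minimal Python sketch. It uses the standard-library sqlite3 module as a stand-in for Snowflake/PostgreSQL, and the table and column names are illustrative assumptions only; in practice the transformation would live in a DBT model and the checks in DBT tests or Airflow tasks, not inline Python.

```python
import sqlite3

# sqlite3 stands in for Snowflake/PostgreSQL; the schema below is an
# illustrative assumption, not Invesco's actual data model.
conn = sqlite3.connect(":memory:")

# Ingest: raw rows, as an Airbyte sync might land them.
conn.execute("CREATE TABLE raw_trades (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_trades VALUES (?, ?)",
    [("A1", 100.0), ("A1", -25.0), ("A2", 40.0)],
)

# Transform: a DBT-style model materialized as a summary table.
conn.execute(
    """
    CREATE TABLE trades_summary AS
    SELECT account, SUM(amount) AS total_amount, COUNT(*) AS n_trades
    FROM raw_trades
    GROUP BY account
    """
)

# Validate: the kind of not-null and completeness checks that DBT
# tests or an Airflow task would enforce before downstream use.
nulls = conn.execute(
    "SELECT COUNT(*) FROM trades_summary WHERE account IS NULL"
).fetchone()[0]
assert nulls == 0, "account must never be null"

raw_n = conn.execute("SELECT COUNT(*) FROM raw_trades").fetchone()[0]
summed = conn.execute("SELECT SUM(n_trades) FROM trades_summary").fetchone()[0]
assert raw_n == summed, "no rows lost in transformation"

rows = conn.execute(
    "SELECT account, total_amount FROM trades_summary ORDER BY account"
).fetchall()
print(rows)  # [('A1', 75.0), ('A2', 40.0)]
```

In a production pipeline each stage would be a separate, independently retryable Airflow task, so a failed validation blocks downstream consumers without re-running the ingest.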

This individual:

  • Must be comfortable working in a scaled agile development environment.
  • Must be able to handle multiple requests from varied stakeholders while maintaining clear priorities and adjusting as needed.
  • Must be an effective translator of business needs into technical requirements.
  • Must demonstrate an ability to build relationships, collaborate, and influence internal and external teams.
  • Must be able to work under the guidance of senior team members, take initiative, and complete projects on time with great attention to detail.

The experience you bring:

  • Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
  • Experience: Minimum 5-6 years of experience with AWS cloud platforms where Snowflake, Airflow, and Airbyte are deployed.
  • Relational Databases and Data Warehouse: Minimum 4-7 years of experience in relational databases and data warehousing.
  • Programming: Strong skills in SQL and Python, as you will be responsible for writing queries and other data transformation scripts.
  • Data Knowledge: Solid knowledge of data lifecycle, data governance, data risk, master data management concepts, data modeling, business intelligence, and analytics concepts.
  • Teamwork: Ability to work in a team/group setting.
  • Problem-Solving: Ability to interpret complex or vague instructions and successfully produce results. Proactively identifies, analyzes, and resolves procedural fail points.
  • Judgment and Initiative: Demonstrates superior judgment, reasoning, and follow-up skills. Shows a high level of initiative, assertiveness, and self-confidence.
  • Agile Methodology: Strong working knowledge of Agile methodology.
  • Communication: Shares information efficiently between team members. Promotes teamwork. Excellent listening, interpersonal, written, and oral communication skills.
  • Time Management: Effectively manages multiple responsibilities and meets deadlines.
  • Analytical Skills: Excellent analytical, mathematical, and creative problem-solving skills.
  • Attention to Detail: Logical and efficient, with attention to detail.
  • Data Modeling: An understanding of the concepts related to data modeling, coding, and process flow design.
