Data Engineer
Job Description
Architect of Solutions: Lead the design, development, and enhancement of scalable ETL pipelines and Data Products as part of a Data Mesh-inspired strategy.
Technical Expertise: Apply your expertise in ELT solutions (DBT, Snowflake), Python, and the AWS ecosystem to deliver exceptional results.
Collaborative Spirit: Work hand-in-hand with global and diverse Agile teams, from data to design, to overcome technical data challenges.
Innovate & Inspire: Stay ahead of the curve by integrating the latest industry trends and innovations, such as GenAI, into your work.
Tools:
ETL Tools: Master tools like Python and explore preferred technologies such as DBT and Snowflake.
Databases: Utilize your SQL prowess across platforms such as Snowflake and Postgres.
AWS Cloud Expertise: Bring your AWS knowledge to life with services such as Lambda, S3, Athena, Step Functions, and more.
Essential Skills/Experience
- 3 to 6 years of experience.
- A proactive mindset and enthusiasm for Agile environments.
- Strong hands-on experience with cloud providers and services.
- Experience in performance tuning SQL and ETL pipelines.
- Extensive experience troubleshooting data issues, analyzing end-to-end data pipelines, and working with users to resolve issues.
- Masterful debugging and testing skills to ensure excellence in execution.
- Inspiring communication abilities that elevate team collaboration.
- Experience with structured, semi-structured (XML, JSON), and unstructured data handling, including extraction and ingestion via web scraping and FTP/SFTP.
- Production experience delivering CI/CD pipelines (GitHub, Jenkins, DataOps.Live).
- Excellent cloud DevOps skills: able to develop, test, and maintain CI/CD pipelines using Terraform and CloudFormation.
- Stay up to date with the latest technologies, such as GenAI/AI platforms and FAIR scoring, to improve outcomes.