SOSourced Group
Senior Python Developer
Pune ₹6-8 LPA Posted 20 Jun 2025
FULL TIME
PySpark
APIs
Graph Databases
AWS
Python
Job Description
- We are looking for a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and in leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS.
- Join us on an exciting journey where you'll work with cutting-edge technologies including Generative AI, Agentic AI, and modern cloud-native architectures while continuously learning and growing alongside a passionate team.
- Thrive in a fast-paced, ever-evolving environment with shifting priorities.
- Demonstrated ability to quickly learn and integrate new technologies and frameworks.
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively.
- Core Responsibilities
- Design, develop, test, and maintain robust applications and data pipelines using Python/PySpark.
- Define and implement smart data pipelines from RDBMS to graph databases.
- Build and expose APIs using AWS Lambda and ECS-based microservices.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and scalable code following best practices.
- Troubleshoot, debug, and optimise applications for performance and reliability.
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows if required.
- Ensure security, compliance, and observability across all development activities.
- Required Skills
- Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming.
- Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune).
- Hands-on experience with cloud platforms; AWS and/or Azure is a must.
- Proficiency in PySpark or similar data ingestion and processing frameworks.
- Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git.
- Strong understanding of CI/CD, version control, and agile development practices.
- Excellent communication and collaboration skills.
- Desirable Skills
- Experience with Agentic AI, machine learning, or LLM-based systems.
- Familiarity with Apache Iceberg or similar modern data lakehouse formats.
- Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Understanding of microservices architecture and distributed systems.
- Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack).
- Experience working in Agile/Scrum environments.
- Minimum Qualifications
- 6 to 8 years of hands-on experience in Python development and data engineering.
- Demonstrated success in delivering production-grade software and scalable data solutions.