ZS
Business Technology Solutions Associate - Health Plan & Provider
Pune · ₹6-10 LPA · Posted 18 Jun 2025
FULL TIME
Azure Databricks
PySpark
Azure Data Factory
SQL
Python
Job Description
ZS's Technology group focuses on scalable strategies, assets, and accelerators that deliver enterprise-wide transformation to our clients via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, Digital and Technology advisory, Product and Platform development, and Data, Analytics and AI implementation.
What you'll do
- Take complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements;
- Apply appropriate development methodologies (e.g. Agile, Waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments;
- Collaborate with other team members to leverage expertise and ensure seamless transitions;
- Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management;
- Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management;
- Drive assigned tasks to completion transparently and report accurate status;
- Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams;
- Assist senior team members and delivery leads with project management responsibilities.
What you'll bring
- Big Data Technologies: Proficiency in working with big data technologies, particularly in the context of Azure Databricks, which may include Apache Spark for distributed data processing.
- Azure Databricks: In-depth knowledge of Azure Databricks for data engineering tasks, including data transformations, ETL processes, and job scheduling.
- SQL and Query Optimization: Strong SQL skills for data manipulation and retrieval, along with the ability to optimize queries for performance in Snowflake.
- ETL (Extract, Transform, Load): Expertise in designing and implementing ETL processes to move and transform data between systems, utilizing tools and frameworks available in Azure Databricks.
- Data Integration: Experience with integrating diverse data sources into a cohesive and usable format, ensuring data quality and integrity.
- Python/PySpark: Knowledge of programming languages like Python and PySpark for scripting and extending the functionality of Azure Databricks notebooks.
- Version Control: Familiarity with version control systems, such as Git, for managing code and configurations in a collaborative environment.
- Monitoring and Optimization: Ability to monitor data pipelines, identify bottlenecks, and optimize performance in both Azure Data Factory and Azure Databricks.
- Security and Compliance: Understanding of security best practices and compliance considerations when working with sensitive data in Azure and Snowflake environments.
- Snowflake Data Warehouse: Experience in designing, implementing, and optimizing data warehouses using Snowflake, including schema design, performance tuning, and query optimization.
- Healthcare Domain Knowledge: Familiarity with US health plan terminologies and datasets is essential.
- Programming/Scripting Languages: Proficiency in Python, SQL, and PySpark is required.
- Cloud Platforms: Experience with AWS or Azure, specifically in building data pipelines, is needed.
- Cloud-Based Data Platforms: Working knowledge of Snowflake and Databricks is preferred.
- Data Pipeline Orchestration: Experience with Azure Data Factory and AWS Glue for orchestrating data pipelines is necessary.
- Relational Databases: Competency with relational databases such as PostgreSQL and MySQL is required, while experience with NoSQL databases is a plus.
- BI Tools: Knowledge of BI tools such as Tableau and PowerBI is expected.
- Version Control: Proficiency with Git, including branching, merging, and pull requests, is required.
- CI/CD for Data Pipelines: Experience in implementing continuous integration and delivery for data workflows using tools like Azure DevOps is essential.
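The ETL and pipeline skills listed above can be illustrated with a toy sketch. This is a hypothetical, self-contained example in plain Python (using an in-memory SQLite table in place of Snowflake/Databricks, so no Spark dependency is needed); the function names (`extract`, `transform`, `load`, `run_pipeline`) and the sample health-plan records are purely illustrative, not part of any ZS system.

```python
import sqlite3

# Toy "source" standing in for an upstream system; in a real pipeline
# this would be a Spark read from cloud storage or a Data Factory feed.
SOURCE_ROWS = [
    {"member_id": "M1", "plan": "HMO", "claim_amount": 120.0},
    {"member_id": "M2", "plan": "PPO", "claim_amount": 340.5},
    {"member_id": "M3", "plan": "HMO", "claim_amount": 0.0},
]

def extract():
    # Extract: pull raw records from the source.
    return list(SOURCE_ROWS)

def transform(rows):
    # Transform: drop zero-amount claims and add a derived flag,
    # a stand-in for the data-quality rules the role describes.
    return [
        {**r, "high_cost": r["claim_amount"] > 200}
        for r in rows
        if r["claim_amount"] > 0
    ]

def load(rows, conn):
    # Load: write the cleaned records into a warehouse-style table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS claims "
        "(member_id TEXT, plan TEXT, claim_amount REAL, high_cost INTEGER)"
    )
    conn.executemany(
        "INSERT INTO claims VALUES (?, ?, ?, ?)",
        [(r["member_id"], r["plan"], r["claim_amount"], int(r["high_cost"]))
         for r in rows],
    )
    conn.commit()

def run_pipeline(conn):
    # Orchestration step, analogous to a Data Factory / Databricks job.
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0])  # 2 rows survive the filter
```

In production the same extract/transform/load shape would be expressed as PySpark DataFrame operations scheduled by Azure Data Factory, with the load targeting Snowflake or Delta tables rather than SQLite.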
Additional Skills
- Experience with front-end technologies such as JavaScript, HTML, CSS, and Angular is advantageous.
- Familiarity with web development frameworks like Flask, Django, and FastAPI is beneficial.
- Basic knowledge of AWS CI/CD practices is a plus.
- Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams;
- Proven ability to work creatively and analytically in a problem-solving environment;
- Willingness to travel to other global offices as needed to work with client or other internal project teams.