Job Description
About Kazam
- We are an agnostic EV charging software platform building India's largest smart and affordable EV charging network
- Through our partnerships with fleets, CPOs, RWAs, and OEMs, we have been able to create a robust charging network with over 11,000 devices on our platform
- Kazam enables fleet companies, charge point operators, and OEMs by providing an affordable, complete software stack: white-label template apps (both Android and iOS), API integration, a load management solution, and a charger monitoring dashboard, so you can run your business hassle-free without worrying about technology
- (Please note that you can use both Kazam chargers and OCPP-enabled charging points via our platform)
- Not only that, we are able to drive utilisation to your charging stations by leveraging Kazam's network of 50,000+ EV drivers
Key Responsibilities
- Work with analytics teams to ensure data is clean, structured, and accessible for analysis and reporting
- Implement data quality and governance frameworks to ensure data integrity across the organization
- Contribute to data exploration and analysis projects by delivering robust, reusable data pipelines that support deep data analysis
- Design, implement, and optimize scalable data architectures, including data lakes, data warehouses, and real-time streaming solutions
- Develop and maintain ETL/ELT pipelines to ensure efficient data flow from multiple sources
- Leverage automation to streamline data ingestion, processing, and integration tasks
- Develop and maintain scripts for data automation and orchestration, ensuring timely and accurate delivery of data products (see the orchestration sketch after this list)
- Work closely with DevOps and Cloud teams to ensure data infrastructure is secure, reliable, and scalable
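As a rough illustration of the pipeline and orchestration responsibilities above, the snippet below sketches a minimal Apache Airflow DAG (assuming Airflow 2.4 or later for the `schedule` argument); the DAG id, task names, schedule, and the extract/load helpers are hypothetical placeholders, not part of Kazam's actual stack.

```python
# Illustrative only: a minimal Airflow DAG sketching an hourly ELT flow.
# The DAG id, task ids, and the extract/load helpers below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_charging_sessions(**context):
    # Hypothetical extract step: pull raw charging-session events
    # from a source system into a staging area.
    ...


def load_to_warehouse(**context):
    # Hypothetical load step: write the cleaned records into the
    # analytics warehouse so BI/analytics teams can query them.
    ...


with DAG(
    dag_id="charging_sessions_elt",     # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",                 # keep data fresh for reporting
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_charging_sessions",
        python_callable=extract_charging_sessions,
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )

    extract >> load  # simple linear dependency: extract, then load
```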
Qualifications & Skills
Technical Skills:
- ETL/ELT: Proficient in building and maintaining ETL/ELT processes using tools such as Apache Airflow, dbt, Talend, or custom scripts in Python and SQL, working with both SQL and NoSQL data stores
- Analytics: Strong understanding of data analytics concepts, with experience in creating data models and working closely with BI/Analytics teams
- Automation: Hands-on experience with data automation tools (e.g., Apache Airflow, Prefect) and scripting (Python, Shell, etc.) to automate data workflows
- Data Architecture: Experience in designing and maintaining data lakes, warehouses, and real-time streaming architectures using technologies like AWS/GCP/Azure, Hadoop, Spark, Kafka, etc (a brief streaming sketch follows this list)
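To make the streaming requirement above concrete, here is a minimal sketch of a Kafka consumer using the kafka-python client; the topic name, broker address, and message fields are assumptions for illustration only, not Kazam's actual schema.

```python
# Illustrative only: consume hypothetical charger telemetry from Kafka
# and hand each event to a downstream step (warehouse load, dashboard, etc.).
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "charger-telemetry",                 # hypothetical topic name
    bootstrap_servers="localhost:9092",  # placeholder broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this is where a cleaned event would be written
    # to the lake/warehouse or pushed to a monitoring dashboard.
    print(event.get("charger_id"), event.get("status"))
```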
Soft Skills:
- Excellent problem-solving skills and ability to work independently and as part of a team
- Ability to collaborate cross-functionally with analytics, business intelligence, and product teams
- Strong communication skills with the ability to translate complex technical concepts for non-technical stakeholders
- Attention to detail and commitment to data quality and governance