Acme Services
Data Architect
Pune | ₹10–12 LPA | Posted 21 Jul 2025
Full Time
Data Modeling
ETL Design
Data Warehouse Management
PySpark
SQL
Job Description
Responsibilities:
- Architectural Leadership: Serve as the Data Architect, leading the design and implementation of robust, scalable, high-performance data solutions.
- ETL Design & Development: Design, develop, and optimize complex ETL (Extract, Transform, Load) processes to integrate data from various sources into data warehouses and other data platforms.
- Data Modeling: Create and maintain logical and physical data models for transactional systems, data warehouses, and data lakes, ensuring data integrity, consistency, and optimal performance.
- Data Warehouse Management: Oversee the design, development, and maintenance of data warehousing solutions, ensuring efficient storage, retrieval, and analysis of large datasets.
- SQL Expertise: Utilize advanced SQL skills for data querying, manipulation, and optimization across various database systems.
- PySpark Development: Apply strong hands-on PySpark experience to big data processing, transformation, and analysis, particularly in cloud environments.
- Cloud Data Solutions: Design and implement data solutions on cloud platforms (e.g., AWS, Azure, GCP), including their cloud-native data services.
- Technical Strategy & Roadmapping: Contribute to the overall data strategy and roadmap, identifying opportunities for data innovation, governance, and security.
- Performance Optimization: Proactively identify and resolve data performance bottlenecks, ensuring efficient data processing and retrieval.
- Collaboration & Mentorship: Collaborate with data engineers, data scientists, and business stakeholders. Provide technical guidance and mentorship to junior team members.
Required Skills:
- Prior experience as a Data Architect is mandatory.
- Strong expertise in ETL processes and tools.
- Proficiency in data modeling techniques.
- Extensive experience with data warehouse design and implementation.
- Strong command of SQL.
- Hands-on experience with PySpark.
- Demonstrated experience with Cloud data platforms and services.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills for effective collaboration with diverse teams and stakeholders.