STStack Digital
Data Engineer
Bangalore ₹6-11 LPA Posted 19 Jun 2025
FULL TIME
Kubernetes
PySpark
REST APIs
AWS
Job Description
- Developing ETL pipelines involving big data.
- Developing data processing/analytics applications primarily using PySpark.
- Experience developing applications on the cloud (AWS), mostly using services for storage, compute, ETL, DWH, analytics, and streaming.
- Clear understanding of, and ability to implement, distributed storage, processing, and scalable applications.
- Experience working with SQL and NoSQL databases.
- Ability to write and analyze SQL, HQL, and other query languages for NoSQL databases.
- Proficiency in writing distributed, scalable data processing code using PySpark, Python, and related libraries.
Data Engineer AEP Competency
- Experience developing applications that consume services exposed as REST APIs.
- Special consideration given for experience working with container-orchestration systems like Kubernetes.
- Experience working with enterprise-grade ETL tools.
- Experience/knowledge of Adobe Experience Cloud solutions.
- Experience/knowledge of Web Analytics or Digital Marketing.
- Experience/knowledge of Google Cloud Platform.
- Experience/knowledge of Data Science, ML/AI, R, or Jupyter.