Java Cloud

Ifintalent Global Private Limited
Pune · 3-7 LPA · Posted 10 Apr 2025
FULL TIME
Spark
Java
AWS Glue
Hadoop

Job Description

Responsibilities

  • Lead efforts to design, architect, and write software components.
  • Independently handle build and deployment activities.
  • Create design documentation for new software development and subsequent versions.
  • Identify opportunities to improve and optimize applications.
  • Diagnose complex development & operational problems and recommend upgrades & improvements at the component level.
  • Collaborate with global stakeholders and business partners for product delivery.
  • Follow company software development processes and standards.
  • Work on POCs and guide team members.
  • Unblock team members on technical and solution-design questions.

Knowledge and Experience

  • 3-7 years of experience building enterprise software products.
  • Experience in object-oriented design and development with languages such as Java, Node.js, and/or Scala.
  • Experience leading a team of developers, with a proven record of end-to-end design and development.
  • Experience building REST-based microservices in a distributed architecture, along with cloud technologies (AWS preferred).
  • Knowledge of Java/J2EE frameworks such as Spring Boot, JPA, JDBC, and related frameworks.
  • Experience building high-throughput real-time and batch data-processing pipelines using Spark and Kafka in an AWS environment, with services such as S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift.
  • Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems, e.g., MySQL and Postgres.
  • Proven ability to deliver working solutions on time
  • Strong analytical thinking to tackle challenging engineering problems.
  • Great energy and enthusiasm, a positive and collaborative working style, and clear communication and writing skills.
  • Experience working in a DevOps environment – 'you build it, you run it'.
  • Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc.
  • Experience with handling large data sets using technologies like HDFS, S3, Avro and Parquet
  • Experience working on distributed architectures such as SOA/microservices.
  • Experience working with agile methodologies, including BDD, TDD, and Scrum.
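To illustrate the kind of REST-style service endpoint the role calls for, here is a minimal sketch in Java. It deliberately uses only the JDK's built-in `com.sun.net.httpserver` package rather than Spring Boot, so it runs with no external dependencies; the `/health` endpoint name, `HealthService` class, and JSON payload are illustrative assumptions, not part of the posting.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Illustrative sketch only: a tiny REST-style endpoint using the JDK's
// built-in HTTP server (a stand-in for a Spring Boot microservice).
public class HealthService {

    // Start the server on the given port (0 picks an ephemeral port).
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", HealthService::handleHealth);
        server.start();
        return server;
    }

    // Respond to GET /health with a small JSON status document.
    private static void handleHealth(HttpExchange exchange) throws IOException {
        byte[] body = "{\"status\":\"UP\"}".getBytes(StandardCharsets.UTF_8);
        exchange.getResponseHeaders().set("Content-Type", "application/json");
        exchange.sendResponseHeaders(200, body.length);
        try (OutputStream out = exchange.getResponseBody()) {
            out.write(body);
        }
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = start(8080);
        System.out.println("Listening on " + server.getAddress());
    }
}
```

In a real Spring Boot service the same endpoint would be a `@RestController` method, with JPA/JDBC behind it for persistence; the sketch above only shows the request/response shape.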
