Tata Consultancy Services Limited
Scala
Pune ₹5-9 LPA Posted 20 Mar 2025
FULL TIME
HDFS
Scala
SQL
Spark SQL
Unix
Job Description
Role: Scala
Job Description:
- Minimum 5+ years of experience in Spark/Scala development
- Experience in designing and developing Big Data solutions using Hadoop ecosystem components such as HDFS, Spark, Hive, the Parquet file format, YARN, MapReduce, and Sqoop
- Good experience in writing and optimizing Spark jobs and Spark SQL; should have worked on both batch and streaming data processing
- Experience in writing and optimizing complex Hive and SQL queries to process large data volumes; good with UDFs, tables, joins, views, etc.
- Experience in debugging Spark code
- Working knowledge of basic UNIX commands and shell scripting
- Experience with Autosys and Gradle