Skills And Experience
The Spark Developer should possess the following:
- 2–3 years developing applications using Apache Spark
- Professional experience using one or more big data platforms such as Databricks, HDInsight, or Hadoop
- Experience developing custom queries over large data sets using Apache Spark with Scala, Python, SQL, or R
- Experience with notebook technologies such as Jupyter or Databricks
- Knowledge of security best practices and data-handling requirements for PII data
- Experience manipulating and extracting data in Parquet, XML, CSV, and JSON formats
- Knowledge of workload distribution and cluster setup to optimize cost
- Experience working in an Agile development methodology
- Experience working with relational and dimensional data models
- Strong team player with excellent verbal and written communication skills
Position: Spark Developer
Location: Chennai/Bangalore/Pune/Kolkata, India
Salary: 7–10 LPA
Duration: Full-time, permanent
Skills: Spark, Hadoop, Scala, Python, SQL, Agile