Big Data Developers / Leads / Architects
Location : Gurugram / Hyderabad / Pune
Skills: Spark, Scala, Big Data, GCP, Sqoop, Hive, Kafka, Python, AWS
Role type: Full Time
Experience: 3-12 years
Skills and Experience:
- 3+ years of work experience in data warehousing or data engineering projects.
- Experience in GCP would be an advantage.
- Should be very strong in Sqoop, Hive, Spark, Kafka, BigQuery, and a workflow scheduler such as Oozie, Azkaban, or Airflow; experience with Airflow or Azkaban is preferred.
- Must have experience with at least one of the following programming languages: Python, Java, or Scala.
- Should be very strong in performance tuning techniques for processing data with Spark and storing it in BigQuery or another data warehouse database.
- Good knowledge of agile DevOps methodology and the test-driven development approach.
- Familiarity with the associated toolsets (GitHub/Gerrit, Jenkins, Tonomi) is preferred.
- Must apply creative and innovative thinking to end-to-end business solution activities: creating, reviewing, controlling, and improving processes and reusable or unique solutions.
- Must have strong written and verbal communication skills and be able to communicate with colleagues at all levels.
- Must take end-to-end ownership of, and accountability for, project delivery.