ETL/BIG DATA ENGINEER
Location: Bengaluru, Karnataka, India
Role:
- Hands-on involvement in core development work
- End-to-end data engineering project exposure: whether building a new project or taking an existing one to the next level, you'll work on all aspects of development (requirements, analysis, design, development, testing) whenever the situation calls for it
- Sound technical depth, with 3+ years of experience in the Big Data engineering space and strong ETL experience
- Exceptional communication skills for engaging effectively with teams, clients, and business leaders within the company
- Handling multiple tasks and projects simultaneously in an organized and timely manner.
- Planning, prioritizing, and meeting deadlines in a fast-paced environment
- Taking initiative on improvements and on reviewing testing results
Qualifications/Skill Set:
- Expertise in solutions built on Big Data platforms including Hadoop, Spark, and MapReduce; 3+ years of experience is a must
- Expertise in at least one major ETL tool (Talend + TAC, SSIS, Informatica, Pentaho), plus exposure to another ETL tool, is a must
- Must have orchestrated at least 2 projects on any of the cloud platforms (GCP, Azure, AWS, etc.)
- Must have worked on at least 3-4 different data warehousing projects involving SCD1/SCD2 dimensions and transactional/aggregate/summarized/degenerate facts in Kimball's model
- Experience with any of the object-oriented/functional scripting languages (Python, Java, Scala, Shell, .NET scripting, etc.) is a must
- Sound knowledge of data quality frameworks, master data management, and tools such as Talend Data Catalog
- A knack for identifying tasks and development work that can be automated using scripting
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
- Good presentation and documentation skills are a must