Design, develop, and tune data products, applications, and integrations on large-scale data platforms (Hadoop, Kafka streaming, HANA, SQL Server, etc.) with an emphasis on performance, reliability, scalability, and above all quality
Analyze business needs, profile large data sets, and build custom data models and applications that drive Adobe's business decision making and customer experience
Develop and extend design patterns, processes, standards, frameworks, and reusable components across data engineering functions
Collaborate with key stakeholders, including business teams, engineering leads, architects, BSAs, and program managers
The ideal candidate will have:
MS in Computer Science or a related technical field, or BS with 2+ years of experience in enterprise data warehousing / big data implementations
Strong SQL, ETL, scripting, and/or programming skills, preferably with Python, Java, Scala, or shell scripting
Demonstrated ability to clearly form and communicate ideas to both technical and non-technical audiences
Strong problem-solving skills, with the ability to isolate, deconstruct, and resolve complex data and engineering challenges
Results-driven with attention to detail, a strong sense of ownership, and a commitment to fun, teamwork, and innovation!
Nice to have:
Familiarity with streaming applications
Experience with development methodologies such as Agile / Scrum
Adobe is changing the world through digital experiences. We give everyone — from emerging artists to global brands — everything they need to design and deliver exceptional digital experiences.
Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, ethnicity, sexual orientation, gender identity or expression, disability, or veteran status.