What We'll Bring
At TransUnion, we have a welcoming and energetic environment that encourages collaboration and innovation; we're consistently exploring new technologies and tools to stay agile. This environment gives our people the opportunity to hone current skills and build new capabilities, while discovering their genius.
Come be a part of our team – you’ll work with great people, pioneering products and cutting-edge technology.
What You'll Bring
Protecting the health and wellness of our associates and candidates considering a career at TransUnion is our highest priority. In supporting this vision, our recruitment and new hire experience for this role is fully virtual for the time being.
Candidates interviewing will get to know our team over phone and video, and this role will operate virtually upon hire until we return to the office. Even though we are not physically together right now, our goal is to provide you with a supportive candidate and new hire experience that will immerse you in our culture and set you up for success at TransUnion.
Bachelor’s degree in computer science or a related discipline.
Highly skilled in Hadoop cluster setup and Hadoop administration.
Responsible for the implementation and ongoing administration of Hadoop infrastructure.
5+ years' experience with Hadoop and its ecosystem components: HDFS, MapReduce, YARN, Sqoop, Flume, Pig, Oozie, Storm, Ranger, Kerberos, Hive, HBase, and ZooKeeper.
Experience working in a 24x7 production support environment with an on-call rotation.
Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
Screen Hadoop cluster job performance and perform capacity planning.
Monitor Hadoop cluster connectivity and security.
Manage and review Hadoop log files.
File system management and monitoring.
HDFS support and maintenance.
Experience in designing and building scalable infrastructure and platforms to collect and process very large amounts of structured and unstructured data.
Experience in adding and removing nodes, monitoring critical alerts, configuring high availability, configuring data backups and data purging.
Troubleshooting, diagnosing, performance tuning, and resolving Hadoop issues.
Configuring and maintaining HDFS namespace and space quotas for users and the file system.
Knowledge of configuring Kerberos for authentication of users and Hadoop daemons.
General Linux administration: installation, user and group creation, permissions, and package management using YUM and RPM.