Job Responsibilities
- Lead efforts to establish a mature solution engineering practice supporting on-premise and cloud installations of Apache Hadoop and Hortonworks Data Platform (HDP).
- Migrate legacy Apache Hadoop (1.x) environments to modern, supportable versions, with a focus on integrating fault tolerance into the platform.
- Develop and maintain financial investment guidance models for performance-specific investments to ensure effective use of capital.
- Engineer and manage the installation, configuration, and tuning of Hadoop platforms.
- Analyze Hadoop environments and take the actions necessary to prevent deficiencies and service interruptions.
- Design a framework to manage Role-Based Access Control (RBAC) using the principle of least privilege.
Skills Required
- Technical expertise in managing production Hadoop environments
- Strong background installing, configuring, and managing Hortonworks HDP
- Experience installing and configuring Apache projects such as HDFS, Hive, Solr, ZooKeeper, HBase, and Kafka
- Professional scripting experience with languages such as PowerShell, Python, and Bash
- Bachelor’s degree in Technology or equivalent experience
- Strong technical, leadership, communication, and financial skills
- Experience working with and troubleshooting network configurations spanning multiple subnets and VLANs
- Experience with version control systems (Git)
- Vendor certifications from Microsoft, VMware, Red Hat, and Cisco
- Experience with IBM Netezza is a plus
- Automation experience leveraging tools such as Salt, Puppet, Ansible, or SCCM
- Technical expertise in Linux systems management – Red Hat Enterprise Linux, CentOS, and Ubuntu