Desired Skills and Experience
- Bachelor’s degree in Computer Science or equivalent is required.
- 5+ years of working experience with Big Data technologies (Hadoop, Spark, Kafka, and NoSQL databases; MapR a plus)
- Proven track record of architecting, designing and implementing data pipelines and related operations
- Big Data operations and support experience in Hadoop, Spark, Kafka, and NoSQL databases
- Hands-on experience with DevOps tools such as Ansible, SaltStack, Puppet, or Chef for automating engineering and operational tasks
- Highly knowledgeable and experienced with scripting languages such as Python
- Experience with source control management tools such as Git
- Expertise in troubleshooting complex OS, database, file system, network configuration, and application & web server issues
- Development experience in object-oriented programming languages such as Java and Scala is strongly preferred
- Must have extensive experience evangelizing DevOps best practices
- Strong verbal and written communication skills
- Data analysis experience is a big plus
- Willingness to participate in a 24x7 on-call rotation for escalations
- Highly collaborative self-starter and technical contributor with outstanding communication practices
Apply