Desired Skills and Experience
- Conception, implementation and maintenance of the internal Hadoop cluster
- Preparing and supporting operational concepts for uninterrupted operation of the cluster
- Implementation of security functions in the cluster
- Linking of the cluster to AWS for further scaling and developing a stand-alone cloud-based processing chain
- Optimization of the Hadoop infrastructure
- Administration and support of the various components of the Hadoop stack (Pig, Spark, Impala, HBase, Hive, etc.)
- Close collaboration with the Data Analytics team; acting as the point of contact for standard IT
- Supporting the preparation and presentation of analytical data; performing data extraction and aggregation as well as qualitative examination of data from different sources
- Supporting testing and commissioning
- An interesting, diversified and challenging working environment in a highly motivated and international team
- Exciting projects for prestigious customers
- Personal and professional development opportunities from the start
- Experience with Linux systems such as Debian or CentOS, as well as with Linux shell scripting (Bash)
- Ability to analyze and optimize a complex system
- Experience with Apache Hadoop platforms (Cloudera, Hortonworks, Amazon EMR) and tools (Hive, HBase or Pig)
- Experience with standard database systems
- Knowledge of scripting languages for automation (Perl, Python, PowerShell, VB) desirable
- Experience with HP server operation and management tools such as iLO desirable
- Interest in Big Data and related leading-edge technologies
- Analytical mindset and the ability to work in a team
- Solution-oriented way of working
- Ability to familiarize yourself with complex tasks independently and quickly
- Good command of English, German is an advantage