Hadoop Specialist
Position
• You will be part of our Hadoop team, responsible for building and operating Hadoop clusters
• You will work on real-life projects for leading multinational clients: building Hadoop clusters based on customer requirements, troubleshooting and solving problems, and helping customers with their use cases. We will help you grow into more advanced positions, and the speed of that growth will be fully in your hands.
• The job will offer you opportunities to participate in local and international projects.
• We are looking for people who share our belief that working with technology, data and people is fun, and a “mission” rather than just a “job”.
Your primary duties will be:
• Preparing solution architectures and designs, including daily contact with the customer’s teams
• Prototyping and testing solutions, with daily contact with developers and occasional hands-on development
• Configuring on-premises and cloud Hadoop solutions, and assisting business analysts and users
• Troubleshooting and problem solving
• Place of work: Remote work
Contract Type
B2B / Full Time Job
Level
Medior/Senior
Location
Remote
Salary
from 200 EUR per man-day (MD)
Industry
Consulting
Requirements
Must have skills
• Hadoop platform (Cloudera, HDP), Hadoop ecosystem components, Python, PySpark
• Experience with and knowledge of operating systems (Unix, Linux), basic knowledge of networking and security, and Bash scripting skills
• Good knowledge of the architecture and operation of Hive and HBase
• Good knowledge of the architecture and operation of Spark and Solr
• Good troubleshooting and problem-solving skills
Nice to have skills
• Programming skills
• Knowledge of DevOps processes and automation tools such as Ansible
• Experience working in highly available big data environments and knowledge of operations support models
• Experience administering Hadoop platforms (HDP, Cloudera), including components such as Ambari, Ranger and Kerberos