Hadoop Consultant
Teradata
Cairo, Al Qahirah EG
19 days ago

Position Overview

Teradata is hiring a Consultant with expertise in big data and Apache Hadoop to join its dynamic Big Data team.

The ideal candidate is a highly energetic self-starter who can perform complex, hands-on architecture and development work on the Hadoop ecosystem.

The consultant will also be tapped to perform proofs of concept (PoCs) for our customers during pre-sales activities.

Job Specification

Successful candidates will:

  • Take a role in consulting engagements and manage day-to-day client relationships and results.
  • Ensure that engagements adhere to client strategies.
  • Ensure engagement quality.
  • Ensure engagement adherence to budget objectives and scope.
  • Engage with Teradata account teams and prospective customers to analyse and understand customer requirements;
  • Shape and influence customer requirements so that they are deployed in an optimum Hadoop architecture;
  • Assist in qualifying requirements and provide guidance within the Big Data CoE to determine whether Hadoop is a good fit for the problem the customer is trying to solve;
  • Design, plan, and execute on-site / off-site customer proofs of concept;
  • Configure and use the Hortonworks Hadoop distribution and associated products, typically Hive, Pig, HCatalog, and MapReduce;
  • Partner with the Hadoop administrator to secure and configure the Hadoop cluster for optimal performance, and help administer the Hadoop environment;
  • After PoC execution, document and disseminate the results and lessons learned to all stakeholders;
  • Challenge standard approaches.
Qualifications

  • Have hands-on experience in the design, development, or support of Hadoop in an implementation environment at a leading technology vendor or end-user computing organization;
  • 1+ years' experience implementing ETL / ELT processes with MapReduce, Pig, and Hive;
  • Years of hands-on experience with HDFS and NoSQL databases such as HBase and Cassandra on large data sets;
  • Hands-on experience with NoSQL stores (e.g. key-value store, graph DB, document DB);
  • 2+ years' experience in performance tuning and in programming languages such as Shell, C, C++, C#, Java, Python, Perl, and R;
  • Demonstrate a keen interest in, and solid understanding of, big data technology and the business trends that are driving the adoption of this technology;
  • Demonstrate analytical and problem-solving skills, particularly those that apply to a Big Data environment;
  • Maintain a good level of understanding about the Hadoop technology marketplace;
  • Strong understanding of data structures, modeling and Data Warehousing.
  • Team-oriented individual with excellent interpersonal, planning, coordination, and problem-solving skills.
  • High degree of initiative and the ability to work independently and follow-through on assignments.
  • Excellent oral and written communication skills.
  • BS or MS degree in Computer Science or a relevant field.