Contract Type : Fixed Term Contract
Our purpose at Vodafone is to connect for a better future. As a Global Communications Technology company, we put the customer at the heart of everything we do.
We are forever challenging, pushing boundaries and discovering innovative ways to connect our customers with their digital societies.
We connect people, businesses, and communities across the globe to create the future. We earn customer loyalty, experiment, learn fast and get it done, together.
Join our journey as we connect for a better future. Ready?
Role purpose:
The Big Data Engineer delivers, through self and others, to:
1. Integrate data from multiple sources into the Big Data Programme for analysis and for Technology actions
2. Build applications that use large volumes of data and produce outputs that enable commercial actions generating incremental value
3. Ensure data structure, data cleansing and data integrity
4. Manage data security & privacy processes
Key accountabilities and decision ownership:
Designing and producing high-performing, stable end-to-end applications that perform complex processing of massive volumes of batch and streaming data on a multi-tenancy big data platform in the cloud (on-premises Hadoop will also be considered), and that output insights back to business systems according to their requirements
Ingesting the necessary data from local and group sources onto the GCP platform
Automating data transfer jobs using tools such as Jenkins
Implementing data ingestion processes using NiFi
Building real-time data processing applications which are integrated with business systems to enable value from analytic models to drive rapid decision making
Working with the Group Big Data Delivery team to define the strategy for evolving the Big Data capability, including solution architectural decisions aligned with the platform architecture
Investigating new technologies to identify where they can bring benefits
Ensuring data structure, data cleansing and data integrity
Managing data security & privacy processes
Core competencies, knowledge and experience:
Expert-level experience in designing, building and managing applications to process large amounts of data in a Hadoop ecosystem or other big data frameworks
Extensive experience with performance tuning of applications on Hadoop or other big data frameworks, and with configuring systems to maximise performance
Experience building systems to perform real-time data processing using Spark Streaming, Flink, Storm or Heron data processing frameworks, and Kafka, Beam, Dataflow, Kinesis or similar data streaming frameworks
Experience with the common SDLC (Software Development Life Cycle), including SCM (Source Code Management) using Git, build tools, unit, integration, functional and performance testing from an automation perspective, TDD/BDD, CI and continuous delivery, under Agile practices
Experience working in large-scale multi-tenancy big data environments
Must-have technical/professional qualifications:
Expert-level experience with the GCP ecosystem (Pub/Sub, GCS, BigQuery, Dataproc, Dataflow, Composer, GKE); experience with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN) and similar cloud provider solutions (AWS, Azure) also considered
Strong software development experience in the Scala and Python programming languages; Java and functional languages desirable
Experience with Unix-based systems, including bash scripting
Experience with columnar data formats
Experience with other distributed technologies such as Cassandra, Splunk, Solr/Elasticsearch, Flink, Heron and Beam would also be desirable
Vodafone is committed to attracting, developing and retaining the very best people by offering a motivating and inclusive workplace in which talent is truly recognised and rewarded.
We are committed to promoting Inclusion for All with the belief that diversity plays an important role in the success of our business.
We actively encourage everyone to consider becoming a part of our journey.