Atos

SRQ126037 - Data Governance Specialist / Business Analist Generic Onsite (369152)

Work location:
Utrecht
Starting date:
11.03.2019
Ending date:
31.12.2019
Hours per week:
36

Deadline: 27/02/2019, 08:00
 
Language requirements: Strong preference for Dutch; English may be accepted.
Two days of onboarding at zero rate required.
No ZZP (self-employed contractors).



Data governance specialist / business analyst Generic Onsite

"The Rabobank has a large IT organization for support and services we provide to our clients. The Rabobank’s ambitions for digitalization and optimization of the provided services are high. A lot of changes are needed, along the lines of client processes. Within IT Systems the domain Distribution is responsible for the support and development of these client processes. The CDI department is responsible for CRM, Data and Integration of client processes. This concerns all storage, mutations and distribution of client data and status information. The clients are Wholesale and Retail clients. 
Within CDI, several teams are responsible for data storage, data processing, data flows and data provisioning. These teams of specialists are united in the Data Lake and Distribution teams. 
Specialists work on data modeling, data logistics, data quality, the data lake and data warehousing in DevOps teams. These teams are responsible for the maintenance and adjustment of data flows and for providing data services to systems and departments within Rabobank. 

The team:
At the moment, three Data Lake teams share the responsibility for the set-up, development and maintenance of the Data Lake and Data Factory for Rabobank (Distribution). This Data Lake is built on Hadoop technology. Within the team, where everyone works closely together on the realization of new functionality, each team member has his or her own expertise and background. In addition to creating new functionality, the DevOps teams are also responsible for maintenance, configuration, security, processes and procedures. Alignment between the different Data Lake teams is organized in the horizontal chapters: Business Analysis, Development and Test. We strive for high-performing, self-steering teams whose members take responsibility and are willing to learn. 
Strong data governance is key to an efficient, structured and reliable enterprise data lake. To strengthen and expand our data governance processes and to implement our data governance tooling, we are currently looking for a data governance specialist / business analyst. 

As a data governance specialist you support and steer the data lake teams on topics such as data retention, data privacy (privacy by design), data definitions and data lineage. Together with the specialists in the teams, you work on the refinement of the data governance processes. You are also responsible for implementing the tooling that supports these processes. 
You recognize yourself in the following profile: 

The specialist is responsible for: 
• Contributing to the refinement and improvement of the data governance process 
• Contributing to the implementation of the data governance tooling 
• Propagating the data governance strategy to the data lake teams and supporting them in aligning with this strategy 

Competences: 
• Data (governance) awareness 
• Strong communicator 
• Great collaborator 
• The drive to get things implemented 
• Open-minded, flexible, self-reflective and sincere 

What do we expect? 
• Strong knowledge of data management processes.
• Experience in data-intensive projects, e.g. data warehousing and/or data lakes.
• An enthusiastic advocate of the importance of data governance.
• Motivated to learn and improve. 

What do we offer? 
A challenging environment for talent in data. The teams are open and honest; all team members are willing to learn and are specialists in specific areas of the data domain. The teams all work DevOps, on both continuity and new developments. The environment is dynamic and we work with new technologies. 

Way of working and what we are working with: 

• Platform: 
   o Cloudera Hadoop ecosystem 
   o Hortonworks Data Flow cluster (HDF) 
   o Informatica Big Data Management 
   o Informatica Enterprise Data Catalog 
   o Data provisioning: 
      § Pivotal Cloud Foundry 
      § Azure Blob Storage 
   o Databases: 
      § Oracle 
      § HBase 
      § Hive 

• Data formats: 
   o XML 
   o JSON 
   o Avro 
   o Parquet 

• Programming languages: 
   o Scala (Spark) 
   o SQL 
   o Java 
   o JavaScript 
   o Ruby 
   o Python 

• Test tooling: 
   o FitNesse 
   o Mocha / Chai 
   o JMeter 

• Methods and facilitating tooling: 
   o Scrum 
   o JIRA 

• Languages: 
   o English 

• Specifications: 
   o YAML 

• Version control: 
   o Git 

Knowledge required: 
• Relevant higher education 
• Experience in data-intensive projects 
• Experience in data governance and data management 
• Expert in Business Analysis and communication"
