Job Overview
- Date Posted: December 5, 2023
Job Description
How about a role at a large Finance company in Amsterdam?
General information
- Duration: First contract is for 6 months
- No. of working hours: 40 hours per week
- Location: 1 day per week in office in Amsterdam
- Contract type: Payroll / One-man company
Deadline: 8th of December
*Increase your hiring chances and apply before the deadline. On some occasions we are able to introduce candidates after the deadline, so you are welcome to reach out if it has expired.
What is the project about?
We are looking for a Senior DevOps Data Engineer (Kafka) to help us build the Strategic Data Exchange between Lending core systems and surrounding systems. The primary focus of the team is on data delivery to Lending internal applications, to the Wholesale Bank Data Lake, and for regulatory reporting.
Responsibilities
Data is becoming more important every day. Your contribution to the Strategic Data Integration will be critical to realizing our envisioned Lending data platform, with high-quality and timely data availability as we move from batch to real-time processing. This will enable excellent data consumption possibilities to meet our ever-increasing client and regulatory demands on data.
We need your help in designing and building this new exchange, and in building bridges to other teams to realize end-to-end delivery across Lending and beyond. We value Agile, self-organization, and craftsmanship. We are driven professionals who enjoy shaping the future of this place.
Job requirements
We are seeking a candidate with a collaborative and proactive mindset, adept at open and honest communication and team collaboration, and capable of coaching other developers. You must have a passion for streaming data processing, expertise in Java development, and at least 9 years of relevant experience in the data engineering field. Experience with the following technologies and skills is also necessary:
- Agile / Scrum.
- A track record of building large corporate systems.
- Kafka, preferably the Confluent platform.
- Kafka Connect and Schema Registry.
- KSQL (Kafka SQL) and the Kafka Streams API.
- Data Integration techniques.
- Java backend development (Java 8 or 11).
- Oracle RDBMS 11g or higher.
- Oracle SQL 11g or higher.
- Data modelling.
- Linux (bash) scripting capabilities.
Some other technologies you might work with / Nice to have:
- CI / CD tooling: Azure DevOps, Maven, CheckMarx, Git, Ansible.
- Database Change Data Capture.
- Visualization with Grafana, Elastic, Kibana.
- Oracle Data Integrator 12c.
- Experience in a complex, corporate environment.
- Experience with lending or financial systems.
- Issue trackers like JIRA, ServiceNow.
- Collaboration tooling like Confluence.
What we offer to you
- Work on something that has great significance to the bank.
- Being part of the squad that shapes the future way of development.
- An enthusiastic team in an informal, dynamic environment.
About the client:
The client is a globally known bank with millions of customers. With thousands of professionals from all over the world, they strive to provide customers with the best banking services. Furthermore, the company is globally recognized as one of the best places to work in the world. The working atmosphere is described as very open: they use a flat organizational structure to stimulate cooperation between colleagues while giving everyone enough space to participate, grow, and innovate!
Does this role spark your interest? Then please send me your most recent resume and contact details, so that we can discuss this vacancy in more detail by phone!
You can check other job opportunities on our website: https://www.magno-it.nl/Vacancies.aspx