
Data Engineer Responsible Investments
The role
On behalf of the Asset Management Responsible Investments team at APG, we are looking for a Data Engineer to join our team. In this role, your focus will be on developing data pipelines for newly developed Investment and Research Platforms as well as streamlining data flows for existing platforms.
What you would do
As a developer you are responsible for, among other things:
- Analyzing and translating business needs into systems and processes, and developing and implementing feasible solutions to satisfy those needs.
- Setting up and maintaining Azure DevOps repositories to facilitate version control and collaborative development.
- Maintaining a Databricks workspace and managing all its dependencies and connections.
- Setting up and maintaining a SQL database.
- Developing and maintaining data pipelines that implement business methodologies, with clear documentation to support each pipeline.
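To give a flavour of the pipeline work described above, here is a minimal sketch of a pipeline step that applies a business rule and loads the result into a SQL table. All names, the table schema, and the screening threshold are illustrative assumptions, not APG's actual methodology; a production pipeline would run on the Databricks/PySpark stack mentioned below.

```python
import sqlite3

def run_esg_screen(records, conn):
    """Hypothetical pipeline step: keep only holdings whose ESG score
    meets an (assumed) threshold, then load them into a SQL table.

    `records` is an iterable of (isin, esg_score) tuples.
    Returns the number of holdings that pass the screen.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS screened_holdings (isin TEXT, esg_score REAL)"
    )
    # Illustrative business rule: retain holdings scoring 50 or higher.
    kept = [(isin, score) for isin, score in records if score >= 50]
    conn.executemany("INSERT INTO screened_holdings VALUES (?, ?)", kept)
    conn.commit()
    return len(kept)

# Usage: an in-memory database and two sample holdings (fictional ISINs).
conn = sqlite3.connect(":memory:")
n_kept = run_esg_screen([("NL0000000001", 72.5), ("US0000000002", 31.0)], conn)
print(n_kept)  # 1 holding passes the screen
```

The same shape, a documented transformation feeding a queryable table, scales up naturally to PySpark DataFrames and Parquet sources in a Databricks workspace.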
Ours is a multi-disciplinary team divided into multiple agile teams, each cross-functionally diverse and responsible for the entire delivery of a solution. The Responsible Investments team comprises international members who collectively have the skills to get the job done every iteration. We work towards flexible, high-performing agile teams composed of members with T-shaped skills.
What you bring
As a Data Engineer, you are a specialist in finding solutions. You think in terms of opportunities and are service-minded. Taking ownership and accepting responsibility come naturally to you; you have a pragmatic attitude and are motivated to deliver quality. You are willing to develop or enhance your T-shaped skills.
Additionally, we ask for:
- At least five years of demonstrated work experience with data modeling and data querying.
- Proficiency in Python.
- Knowledge of data engineering, DevOps, and the Azure Data Platform.
- A background or degree in Computer Science or Engineering.
- Knowledge of the PySpark framework and hands-on experience with Parquet.