By applying to this position, you may join one of the most respected and fastest-growing institutional investors in the world. Our client is a professional investment management organization that invests funds globally to ensure long-term sustainability. The company invests in all major asset classes, including public equity, private equity, real estate, infrastructure, and fixed-income instruments, and attracts and selects high-calibre individuals from top-tier institutions around the globe.

Join our team and look forward to:
- Diverse and inspiring colleagues and approachable leaders
- Stimulating work in a fast-paced, intellectually challenging environment
- Accelerated exposure and responsibility
- Global career development opportunities
- Being motivated every day by an important social purpose and unshakable principles
- A deeply rooted culture of Integrity, Partnership and High Performance

If you share a passion for performance, value a collegial and collaborative culture, and approach everything with the highest integrity, here’s your opportunity.
Responsibilities:
- Manage timelines and deliverables within the team toward the successful delivery of projects.
- Design software solutions by interacting with portfolio managers, traders, operations staff, and peers to understand requirements.
- Develop solutions that are in line with the client’s technology preferences, deliver efficiency and scalability, and enable new trading activities.
- Provide knowledge transfer to team members and support staff through application demos, walkthroughs, and documentation.
Mandatory Skills Description:
- Proven experience as a Data Architect (7+ years)
- Experience designing the vision and blueprint of an organization’s data framework
- Concepts (Data Mesh, Data Product) – Basic
- Data Pipelines (PySpark, Hudi, Iceberg, Airflow, EMR) – Expert
- Data Operations (operational excellence, monitoring, troubleshooting) – Expert
- Data Quality (Great Expectations, AWS Glue Data Quality, Deequ) – Expert
- Entitlements (AWS Lake Formation) – Expert
- Infrastructure (AWS data engineering services: Glue, EMR, Athena, etc.; AWS compute services: EC2, Lambda; AWS storage: S3; cross-account IAM, RAM, MWAA, Step Functions, etc.) – Knowledge
- DevOps (Terraform, GitHub Actions) – Expert
- Metadata (Neptune, OpenLineage, Angular, DataZone) – Knowledge
- Consumption (AWS Athena, Trino) – Expert
- Other Tools (JupyterLab, Databricks) – Good to have