
Data Architect

 

Position: Data Architect
Location:
Salary: Negotiable
Ref: 188
Type: Permanent


Are you a skilled Data Architect who is looking for a role where you can influence design decisions? Would you like to work for a global insurance firm with a long history of innovation and excellence? Then this role could be for you.

 

Our client is building a centralised data repository (CDR) and is looking for a Data Architect to join the team. Depending on the maturity of the project lifecycle when you join, you will either be involved in building the CDR or in designing and building future enhancements to it, along with the architecture of additional data solutions.

Day to day:

  • As a Data Architect, you will design, build, and maintain prototype data solutions to meet the needs of the enterprise.
  • You will understand the business meaning and importance of data: its definition, lineage, and consumption in reporting, analysis, or predictive modelling.
  • Understand functional and business requirements and provide solutions.
  • Translate business requirements into effective, structured, and consumable functional requirements.
  • Work closely with senior stakeholders and support the delivery of key data management initiatives alongside the business team.
  • Liaise with internal departments to build a thorough understanding of business and system processes and the data they generate.
  • Engage with source-system SMEs and business users to better understand systems and requirements as needed.
  • Apply in-depth knowledge of processes and products, and of which analyses, methodologies, and approaches best support the assessment of performance, risk, or valuation.

Experience required:

  • 7+ years’ experience as a Senior Data Analyst, with experience in solutions architecture or data architecture.
  • A comprehensive understanding of the principles and best practices of data engineering, and of supporting technologies such as RDBMS, NoSQL, and cache/in-memory stores.
  • Proven experience in architecting and implementing business intelligence and data warehouse platforms, master data management, data integration, and OLTP database solutions.
  • Knowledge of SAS, Python, R, or similar analytical software.
  • In-depth knowledge of, and the ability to consult on, a range of technologies, with strong knowledge of industry best practices around data architecture in both cloud-based and on-premise solutions. Strong analytical and numerical skills are essential, enabling straightforward interpretation and analysis of large volumes of data.
  • Excellent attention to detail and accuracy, and a logical thinker able to break large, complex sets of activities into a plan of digestible chunks.
  • The ability to communicate complex results clearly to technical and non-technical audiences, strong presentation skills, a confident and methodical approach, and the ability to work within a team.
  • Experience in architecting data solutions across hybrid (cloud and on-premise) data platforms.
  • A comprehensive understanding of data warehousing and data transformation (extract, transform, and load) processes, and of supporting technologies such as AWS Glue, EMR, Azure Data Factory, Data Lake, and similar products.
  • Excellent problem-solving and data modelling skills (logical, physical, semantic, and integration models), including normalisation, OLAP/OLTP principles, and entity-relationship analysis.
  • Experience in mapping key enterprise data entities to business capabilities and applications.
  • Strong knowledge of horizontal data lineage from source to output.
  • Extraordinary quantitative and data analytical skills, including the ability to model, interpret, and present data in an insightful and impactful way.
  • Experience in driving reporting improvements and technology implementations in support of business analytics.
  • Experience in reconciling large volumes of data within complex data sets.