Overview:
We're seeking a skilled DevOps Data Engineer for a fully remote contract role. You'll support the development of an automated data platform enabling real-time ingestion, validation, and reporting. The ideal candidate will have strong experience with Azure-based data engineering tools, Python scripting, and scalable pipeline design.
Key Responsibilities:
Develop and maintain ETL/ELT pipelines for ingesting benchmark data (e.g., CSVs from Qualtrics).
Implement automated data quality checks across pipeline stages.
Build event-driven workflows using Azure Data Factory and Databricks.
Support automated reporting integrations (Power BI and PowerPoint).
Optimize storage and processing within Azure Data Lake and SQL-based systems.
Collaborate on data modelling (star/snowflake schemas) with architects and analysts.
Monitor and troubleshoot data platform components using Azure Monitor.
Contribute to CI/CD practices and documentation for long-term maintainability.
Essential Skills:
Advanced Python scripting and data manipulation.
Strong SQL for querying and transformation.
Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure SQL.
Understanding of data modelling techniques and governance.
Experience with Azure Monitor, Key Vault, and managed identities.
Desirable:
Familiarity with AI/ML data patterns (e.g. vector databases, RAG).
Automated Power BI or PowerPoint reporting experience.
Exposure to DevOps tools (CI/CD, Git, infrastructure as code).
Environment:
Agile, delivery-focused culture with rapid feedback loops.
Strong focus on quality, automation, and cross-functional collaboration.
High-impact data platform supporting analytics and automation initiatives.
GCS is acting as an Employment Business in relation to this vacancy.