Data Engineer - AI, Azure
We're looking for a Data Engineer with solid experience working in Azure environments, and a strong focus on enabling AI and machine learning through accessible, well-structured enterprise data.
You'll help design and maintain the data infrastructure that powers AI across the organisation, ensuring data is available, clean, secure, and aligned with enterprise standards.
Key skills
- Design, build and maintain data pipelines on Azure to support AI/ML workloads
- Prepare and transform enterprise data to meet the needs of AI models and analytics
- Integrate data from various sources (e.g. ERP, CRM, internal systems) using Azure tools
- Work closely with data scientists and ML engineers to support feature engineering and model deployment
- Implement and manage data storage and processing using services such as Azure Data Factory, Azure Synapse, Azure Data Lake, and Azure Databricks
- Proven track record of enabling AI/ML projects through well-structured data pipelines
- Optimise data pipelines for performance, scalability, and cost-efficiency
- Familiarity with designing data infrastructure that supports AI scalability

Please send across an updated CV if this position is of interest.
GCS is acting as an Employment Business in relation to this vacancy.
Overview:
We're seeking a skilled DevOps Data Engineer for a fully remote contract role. You'll support the development of an automated data platform enabling real-time ingestion, validation, and reporting. The ideal candidate will have strong experience with Azure-based data engineering tools, Python scripting, and scalable pipeline design.
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines for ingesting benchmark data (e.g. CSVs from Qualtrics).
- Implement automated data quality checks across pipeline stages.
- Build event-driven workflows using Azure Data Factory and Databricks.
- Support automated reporting integrations (Power BI and PowerPoint).
- Optimize storage and processing within Azure Data Lake and SQL-based systems.
- Collaborate on data modelling (star/snowflake schemas) with architects and analysts.
- Monitor and troubleshoot data platform components using Azure Monitor.
- Contribute to CI/CD practices and documentation for long-term maintainability.

Essential Skills:
- Advanced Python scripting and data manipulation.
- Strong SQL for querying and transformation.
- Hands-on with Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure SQL.
- Understanding of data modelling techniques and governance.
- Experience with Azure Monitor, Key Vault, and managed identities.

Desirable:
- Familiarity with AI/ML data patterns (e.g. vector databases, RAG).
- Automated Power BI or PowerPoint reporting experience.
- Exposure to DevOps tools (CI/CD, Git, infrastructure as code).
Environment:
- Agile, delivery-focused culture with rapid feedback loops.
- Strong focus on quality, automation, and cross-functional collaboration.
- High-impact data platform supporting analytics and automation initiatives.
GCS is acting as an Employment Business in relation to this vacancy.
Data Engineer - Contract Role
We're seeking an experienced Data Engineer to deliver high-quality, scalable data solutions that support business goals. This contract role offers the chance to work on impactful projects in a fast-paced, data-driven environment.
Key Responsibilities & Requirements:
Data Solution Delivery: Design, build, and maintain secure, efficient data pipelines and analytics tools across the full development lifecycle.
Technical Skills: 5+ years' experience with SQL, strong Python or R skills, and hands-on knowledge of ETL tools (e.g. Informatica, SSIS) and BI platforms (e.g. Tableau, Power BI). Experience with Snowflake, AWS, or Dataiku is a plus.
Collaboration & Communication: Work closely with technical and business teams to drive data-informed strategies and mentor junior engineers.
Governance & Security: Apply best practices in data architecture, governance, and protection.
Agility & Growth: Strong problem-solving mindset with a flexible, team-oriented approach and a drive to continuously learn and improve.
This is a great opportunity to bring your data engineering skills to a forward-thinking team on a high-impact contract assignment.
GCS is acting as an Employment Business in relation to this vacancy.