Job Title: Contract Data Engineer - HealthTech Platform (Outside IR35)
Location: Hybrid (1-2 days/week in London, remainder remote)
Rate: £450-£550/day (Outside IR35)
Start Date: ASAP
Contract Length: 12 months (initial) - likely extension
IR35 Status: Outside IR35
About the Client:
A rapidly scaling UK HealthTech company is building a next-gen clinical data platform used by hospitals and pharmaceutical partners to unlock value from complex healthcare datasets. Their mission is to enable AI-driven insights across EHRs, genomics, medical imaging, and real-world data to improve patient outcomes.
Key Responsibilities:
- Build and maintain scalable data pipelines across structured and unstructured health data (EHR, FHIR, HL7)
- Enable real-time and batch data processing for AI/ML use cases
- Collaborate closely with Data Scientists and ML Engineers on model deployment
- Support and manage cloud infrastructure (AWS) with IaC tools
- Ensure compliance with NHS data governance standards (GDPR, DSPT, ISO 27001)
Tech Stack:
- Languages/Tools: Python, SQL, Airflow, Spark
- Cloud & DevOps: AWS (S3, Redshift, Glue), Terraform, Docker
- Data Types: EHR, HL7, FHIR, genomics, RWD
You'll Need:
- 3+ years of Data Engineering experience (ideally within the health, life sciences, or pharma sectors)
- Proven track record working with sensitive or regulated datasets
- Strong communication and collaboration skills across tech and non-tech teams
Why Contract Here?
- Join a mission-driven HealthTech company with real-world impact
- Play a pivotal role in shaping scalable, AI-ready healthcare data platforms
- Work alongside clinicians, researchers, and leading UK health institutions
- Long-term potential: significant product roadmap and investment already in place
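To give a flavour of the pipeline work described above, here is a minimal sketch of flattening a FHIR Patient resource into a tabular row, the kind of step an EHR ingestion pipeline might perform. The resource content and the field selection are illustrative only; real extracts are far richer and subject to the governance standards listed above.

```python
import json

# A minimal, hypothetical FHIR Patient resource (illustrative only).
raw = '''
{
  "resourceType": "Patient",
  "id": "example-001",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1980-04-12"
}
'''

def flatten_patient(resource: dict) -> dict:
    """Flatten a FHIR Patient resource into a single tabular row."""
    # Patient.name is a list; take the first entry defensively.
    name = (resource.get("name") or [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "birth_date": resource.get("birthDate"),
    }

row = flatten_patient(json.loads(raw))
print(row)
```

In practice a step like this would sit inside an Airflow task writing to S3/Redshift, with schema validation rather than ad-hoc `.get()` calls.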
GCS is acting as an Employment Business in relation to this vacancy.
Data Engineer - AI, Azure
We're looking for a Data Engineer with solid experience working in Azure environments, and a strong focus on enabling AI and machine learning through accessible, well-structured enterprise data.
You'll help design and maintain the data infrastructure that powers AI across the organisation, ensuring data is available, clean, secure, and aligned with enterprise standards.
Key skills
- Design, build, and maintain data pipelines on Azure to support AI/ML workloads
- Prepare and transform enterprise data to meet the needs of AI models and analytics
- Integrate data from various sources (e.g. ERP, CRM, internal systems) using Azure tools
- Work closely with data scientists and ML engineers to support feature engineering and model deployment
- Implement and manage data storage and processing using services such as Azure Data Factory, Azure Synapse, Azure Data Lake, and Azure Databricks
- Proven track record of enabling AI/ML projects through well-structured data pipelines
- Optimise data pipelines for performance, scalability, and cost-efficiency
- Familiarity with designing data infrastructure that supports AI scalability

Please send across an updated CV if this position is of interest.
GCS is acting as an Employment Business in relation to this vacancy.
About the Role:
We are hiring on behalf of a leading energy company in the UAE that is accelerating its digital transformation journey. They are looking for a Senior Data Engineer to join their high-performing analytics and engineering team in Abu Dhabi.
This is a client-facing role, ideal for a Data Engineering professional who can lead the design and development of scalable, production-grade data pipelines and infrastructure to support real-time analytics and advanced AI/ML initiatives.
What's on Offer:
Key Responsibilities:
- Design, develop, and optimise robust and scalable data pipelines (ETL/ELT)
- Build and maintain cloud-native data platforms (e.g. AWS, Azure, GCP)
- Collaborate with data scientists, analysts, and business stakeholders to deliver reliable and timely data
- Ensure data quality, governance, and compliance across pipelines and storage
- Develop and manage batch and streaming data solutions for high-volume environments
- Support the deployment and monitoring of machine learning models in production
- Document and communicate architecture, workflows, and design decisions clearly
Who We're Looking For:
- Proven experience as a Senior Data Engineer or in a similar role
- Strong programming skills (Python, SQL) and experience with data orchestration tools (e.g. Airflow, Spark, dbt)
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP
- Familiarity with data warehousing solutions (e.g. Redshift, BigQuery, Snowflake)
- Solid understanding of data modelling, version control, CI/CD, and containerisation (e.g. Docker)
- Excellent communication skills and the ability to engage with technical and non-technical stakeholders
- Experience supporting machine learning pipelines is a strong plus
- Prior work in the energy, utilities, or industrial sectors is advantageous

GCS is acting as an Employment Business in relation to this vacancy.
Overview:
We're seeking a skilled DevOps Data Engineer for a fully remote contract role. You'll support the development of an automated data platform enabling real-time ingestion, validation, and reporting. The ideal candidate will have strong experience with Azure-based data engineering tools, Python scripting, and scalable pipeline design.
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines for ingesting benchmark data (e.g. CSVs from Qualtrics).
- Implement automated data quality checks across pipeline stages.
- Build event-driven workflows using Azure Data Factory and Databricks.
- Support automated reporting integrations (Power BI and PowerPoint).
- Optimize storage and processing within Azure Data Lake and SQL-based systems.
- Collaborate on data modelling (star/snowflake schemas) with architects and analysts.
- Monitor and troubleshoot data platform components using Azure Monitor.
- Contribute to CI/CD practices and documentation for long-term maintainability.
Essential Skills:
- Advanced Python scripting and data manipulation.
- Strong SQL for querying and transformation.
- Hands-on with Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure SQL.
- Understanding of data modelling techniques and governance.
- Experience with Azure Monitor, Key Vault, and managed identities.
Desirable:
- Familiarity with AI/ML data patterns (e.g. vector databases, RAG).
- Automated Power BI or PowerPoint reporting experience.
- Exposure to DevOps tools (CI/CD, Git, infrastructure as code).
Environment:
- Agile, delivery-focused culture with rapid feedback loops.
- Strong focus on quality, automation, and cross-functional collaboration.
- High-impact data platform supporting analytics and automation initiatives.
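The automated data quality checks mentioned above can be as simple as validating a CSV extract before it enters the pipeline. Here is a minimal sketch; the column names are invented for illustration and are not a real Qualtrics schema.

```python
import csv
import io

# Hypothetical benchmark extract (illustrative column names only).
raw = """response_id,score,completed_at
R001,4,2024-01-15
R002,,2024-01-16
R003,5,
"""

REQUIRED = ["response_id", "score", "completed_at"]

def quality_report(text: str) -> dict:
    """Count rows, missing required columns, and empty values per column."""
    rows = list(csv.DictReader(io.StringIO(text)))
    missing_cols = [c for c in REQUIRED if rows and c not in rows[0]]
    null_counts = {c: sum(1 for r in rows if not r.get(c)) for c in REQUIRED}
    return {"rows": len(rows), "missing_columns": missing_cols,
            "null_counts": null_counts}

report = quality_report(raw)
print(report)
```

A production version of a check like this would typically run as a pipeline stage (e.g. a Databricks task), fail the run on threshold breaches, and emit metrics to Azure Monitor.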
GCS is acting as an Employment Business in relation to this vacancy.
Data Engineer (Mid-Level) Belgium, 6-12 Months Rolling
Location: Ghent (Hybrid/On-site)
Contract Length: 6-12 months (rolling thereafter)
Start Date: As soon as possible
An established organisation in Belgium is seeking a Mid-Level Data Engineer to support a critical re-architecture programme. This is a contract position for an initial 6-12 months, with the strong potential for extension on a rolling basis.
Key Responsibilities:
- Support the re-architecture of legacy data platforms, primarily within Qlik Sense / QlikView and Excel-based reporting systems.
- Work closely with internal stakeholders to understand existing architecture and propose scalable, efficient solutions.
- Assist in data modelling, data flow design, and optimising on-prem data infrastructure.
- Collaborate on migration strategies and help transition BI solutions from Qlik to Power BI and Databricks where applicable.
Must-Have Skills:
- Proven Data Engineer or Data Architect experience, ideally within mid-sized or enterprise environments.
- Strong knowledge and hands-on experience with Qlik Sense / QlikView, including system design and custom development.
- Solid understanding of data re-architecture and redesign within on-prem environments.
- Comfortable working with Excel-based data solutions and enhancing their scalability.
Nice-to-Have:
- Experience migrating from Qlik to Power BI.
- Familiarity with Databricks and modern cloud-based data engineering practices.
What's on Offer:
- Flexible rolling contract with a long-term roadmap.
- Opportunity to play a key role in a high-impact transformation programme.
- Hybrid work model with autonomy and ownership of deliverables.

If you're a results-driven data engineer with architecture experience and a passion for modernising legacy systems, we'd love to hear from you.
📩 Apply now with your CV or reach out for a confidential discussion.
GCS is acting as an Employment Business in relation to this vacancy.
I am searching for a Data Engineer for a freelance role with a client of mine.
Candidates must have experience with dbt and Snowflake.
This is a long-term freelance role based in the Netherlands.
For further information, please reach out to me on [email protected] or just apply direct.
GCS is acting as an Employment Business in relation to this vacancy.
Data Engineer - Contract Role
We're seeking an experienced Data Engineer to deliver high-quality, scalable data solutions that support business goals. This contract role offers the chance to work on impactful projects in a fast-paced, data-driven environment.
Key Responsibilities & Requirements:
Data Solution Delivery: Design, build, and maintain secure, efficient data pipelines and analytics tools across the full development lifecycle.
Technical Skills: 5+ years' experience with SQL, strong Python or R skills, and hands-on knowledge of ETL tools (e.g. Informatica, SSIS) and BI platforms (e.g. Tableau, Power BI). Experience with Snowflake, AWS, or Dataiku is a plus.
Collaboration & Communication: Work closely with technical and business teams to drive data-informed strategies and mentor junior engineers.
Governance & Security: Apply best practices in data architecture, governance, and protection.
Agility & Growth: Strong problem-solving mindset with a flexible, team-oriented approach and a drive to continuously learn and improve.
This is a great opportunity to bring your data engineering skills to a forward-thinking team on a high-impact contract assignment.
GCS is acting as an Employment Business in relation to this vacancy.
I am searching for a Data Engineer for a client for a payroll position. Candidates MUST be Netherlands based. They are looking for both Senior and Junior profiles.
If the candidate is Senior;
- Software engineering experience with Java or Python.
- Dashboard development experience with Power BI or Splunk.
- DevOps experience (Azure, Kafka, Kubernetes, Databricks).
- Docker experience.
- Pyspark or Spark SQL experience and Testing experience.
If the candidate is Junior;
- Preference for candidates out of college.
- Degree in software engineering.
- Knowledge of dashboarding tools, Power BI or Splunk.
- Ability to explain their personal projects/internships.
For further information, please reach out to me on LinkedIn or [email protected]
GCS is acting as an Employment Agency in relation to this vacancy.
Our client, a major energy company in the UAE, is seeking an experienced Senior Data Engineer to join their team for a 12-month rolling contract.
This position offers a fantastic opportunity for candidates looking to work in a dynamic, fast-paced environment while benefiting from tax-free income, comprehensive relocation support, visas, and healthcare.
If you have a passion for building scalable data infrastructure and have client-facing experience, this could be the perfect opportunity for you.
Key Responsibilities:
Data Infrastructure & Architecture: Design, develop, and maintain scalable data pipelines, databases, and data warehouses.
Client-Facing: Collaborate directly with clients to understand their business needs and provide data-driven solutions.
Data Integration: Integrate data from multiple sources, ensuring high performance, reliability, and accuracy.
Data Modeling: Create and maintain data models to ensure easy accessibility and usability of data for analytics and reporting purposes.
Automation & Optimization: Identify opportunities to automate repetitive tasks and optimize data workflows.
Performance Tuning: Monitor and tune system performance to ensure high availability and efficient resource utilization.
Collaborative Development: Work alongside data scientists, analysts, and other engineering teams to build data solutions that support business objectives.
Documentation & Reporting: Document data pipelines, processes, and solutions to ensure transparency and maintainability.
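The data modelling responsibility above typically means dimensional models: fact tables keyed to shared dimensions so analytics queries stay simple. A toy sketch of the idea, with invented table and column names (not from the client's actual platform):

```python
# Toy star schema: a fact table of energy readings keyed to a date dimension.
# All names and figures are invented for illustration.
dim_date = {
    1: {"date": "2024-01-01", "month": "2024-01"},
    2: {"date": "2024-01-02", "month": "2024-01"},
}
fact_output = [
    {"date_key": 1, "site": "A", "mwh": 120.0},
    {"date_key": 2, "site": "A", "mwh": 95.5},
]

# Aggregate fact rows by a dimension attribute (month), as a BI query would.
monthly = {}
for row in fact_output:
    month = dim_date[row["date_key"]]["month"]
    monthly[month] = monthly.get(month, 0.0) + row["mwh"]
print(monthly)
```

In a warehouse such as Redshift or BigQuery, the same shape would be a `JOIN` from the fact table to the date dimension followed by a `GROUP BY` on the month attribute.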
Key Requirements:
Experience:
3-5 years of experience in data engineering or related fields.
Proven experience working in client-facing roles.
Solid understanding of data infrastructure and experience with modern data platforms (e.g., AWS, Azure, GCP).
Hands-on experience with ETL frameworks and data integration tools (e.g., Apache Airflow, Talend, Informatica).
Expertise in data modeling, database management, and query optimization.
Technical Skills:
Proficient in SQL, Python, or Scala.
Strong experience with cloud technologies and platforms (AWS, GCP, or Azure).
Familiarity with big data tools (e.g., Hadoop, Spark, Kafka).
Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
Knowledge of DevOps and CI/CD pipelines is desirable.
Communication & Leadership:
Excellent communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
Certifications (Preferred):
AWS Certified Data Analytics - Specialty or equivalent certifications.
Google Cloud Professional Data Engineer or equivalent certifications.
Benefits & Perks:
Tax-Free Income: Competitive salary package with tax-free income.
Relocation Support: Assistance with relocation costs and logistics.
Visa & Work Permits: Full visa sponsorship and work permits provided.
Healthcare: Comprehensive healthcare coverage for the duration of the contract.
12-Month Rolling Contract: Opportunity to extend or transition into a permanent role depending on performance and business needs.
GCS is acting as an Employment Business in relation to this vacancy.