Overview:
We're seeking a skilled DevOps Data Engineer for a fully remote contract role. You'll support the development of an automated data platform enabling real-time ingestion, validation, and reporting. The ideal candidate will have strong experience with Azure-based data engineering tools, Python scripting, and scalable pipeline design.
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines for ingesting benchmark data (e.g., CSVs from Qualtrics).
- Implement automated data quality checks across pipeline stages.
- Build event-driven workflows using Azure Data Factory and Databricks.
- Support automated reporting integrations (Power BI and PowerPoint).
- Optimize storage and processing within Azure Data Lake and SQL-based systems.
- Collaborate on data modelling (star/snowflake schemas) with architects and analysts.
- Monitor and troubleshoot data platform components using Azure Monitor.
- Contribute to CI/CD practices and documentation for long-term maintainability.
Essential Skills:
- Advanced Python scripting and data manipulation.
- Strong SQL for querying and transformation.
- Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure SQL.
- Understanding of data modelling techniques and governance.
- Experience with Azure Monitor, Key Vault, and managed identities.
Desirable:
- Familiarity with AI/ML data patterns (e.g., vector databases, RAG).
- Experience automating Power BI or PowerPoint reporting.
- Exposure to DevOps tools (CI/CD, Git, infrastructure as code).
Environment:
- Agile, delivery-focused culture with rapid feedback loops.
- Strong focus on quality, automation, and cross-functional collaboration.
- High-impact data platform supporting analytics and automation initiatives.
GCS is acting as an Employment Business in relation to this vacancy.
Senior DevOps Engineer - JFrog Artifactory SME - 6 month contract - Fully remote
6 month contract - Fully remote
About the Role
We're looking for a Senior DevOps Engineer with deep expertise in JFrog Artifactory to join our growing DevOps team. This role is ideal for someone who thrives in high-availability environments and has hands-on experience architecting and managing active-active HA clusters, federated repositories, and edge nodes in a single-domain setup.
You'll be our go-to SME for everything JFrog, working closely with engineering, security, and release teams to ensure efficient artifact management, developer enablement, and policy enforcement across our CI/CD pipelines.
Key Responsibilities
- Act as the subject matter expert for JFrog Artifactory, with a deep understanding of:
  - Active-active HA clusters
  - Federated repositories and edge nodes
  - A single-domain deployment model
- Own the architecture, scaling, and maintenance of Artifactory infrastructure to ensure performance, reliability, and availability.
- Lead the implementation of artifact lifecycle management, from development to deprecation.
- Educate developers on how to consume artifacts effectively and safely through repository best practices.
- Design and enforce governance and compliance policies on artifact retention, immutability, access control, promotion workflows, and clean-up.
- Collaborate with teams to optimize CI/CD pipelines with Artifactory integrations (e.g., Jenkins, GitHub Actions, GitLab CI).
- Drive continuous improvement in repository hygiene, storage utilization, and security posture.
You Bring
- 8+ years in DevOps, SRE, or Infrastructure Engineering roles.
- Proven SME-level experience with JFrog Artifactory in complex, enterprise-scale environments.
- Expertise in HA deployments, federation setups, edge node configuration, and repository architecture.
- In-depth understanding of artifact lifecycle management and the ability to articulate it clearly to both technical and non-technical audiences.
- Strong experience enforcing artifact governance policies and working with InfoSec/Compliance.
- Proficiency with infrastructure as code (Terraform, Ansible), container orchestration (Kubernetes, Helm), and scripting (Bash, Python).
- Familiarity with DevOps tooling ecosystems: Jenkins, GitHub, GitLab, Nexus (nice to have), and monitoring/logging tools.
- Excellent communication and documentation skills.
Technical Delivery Lead - Postgres, OpenShift, Python and AlloyDB Migration (GCP)
Contract- Fully remote
My London-based client is seeking a Technical Delivery Lead with a strong background in Postgres, OpenShift, Python, and AlloyDB. This is an exciting opportunity to lead a critical migration project from Postgres to AlloyDB on Google Cloud Platform (GCP).
Key Responsibilities:
- Lead the migration of databases from Postgres to AlloyDB within the GCP environment.
- Oversee the technical delivery, ensuring smooth integration of AlloyDB, including performance tuning and optimization.
- Manage and mentor a team of developers and engineers in implementing the migration strategy.
- Collaborate with cross-functional teams to ensure business and technical requirements are met.
- Implement and manage containerized environments with OpenShift and Python for automation and deployment.
Please send across an updated CV if this position is of interest.
Azure DevOps Platform Engineer with strong Terraform experience
My client has an urgent requirement for an Azure DevOps Engineer with a strong background in Terraform.
As an Azure DevOps Platform Engineer, you will be expected to:
- Design, implement, and manage Azure cloud environments.
- Utilize Terraform for infrastructure automation and provisioning.
- Develop and maintain robust CI/CD pipelines and deployment strategies.
Required skills and experience:
- Strong experience with Azure cloud services (Azure IaaS, PaaS, and SaaS).
- Proficiency in Terraform for infrastructure as code.
- Solid understanding of DevOps practices, automation, and CI/CD pipelines.
- Experience with Azure DevOps and ARM Templates.
- Familiarity with cloud networking, storage, and security principles.
- Strong problem-solving and troubleshooting skills.
Please send across an updated CV if this position is of interest.