
Magentic – Data Engineer (Visa Sponsorship)

Backed by global technology investors such as Sequoia Capital, Magentic brings together world-class AI engineering (alumni of OpenAI, Meta, and AWS) with deep procurement expertise (former McKinsey & Company and AB InBev practitioners).

Job Title: Data Engineer

Job Type: Full-time

Salary: £110,000-£120,000

Location: London, England

Roles & Responsibilities 

  • Design and operate performant, scalable ingestion pipelines processing high-volume data from global supply chain and procurement systems.
  • Define, evolve, and manage data schemas and catalogues—from raw staging to high-quality analytics and feature stores—ensuring consistency and discoverability.
  • Build end-to-end monitoring and observability for your pipelines: owning data quality, latency, completeness, and lineage at every stage.
  • Champion secure, governed data practices: access controls, secrets management, encrypted data-in-transit/at-rest, and compliance with frameworks like GDPR.
  • Collaborate closely with AI, Platform, and Product teams, provisioning data sets, feature tables, and contracts for analytics and machine learning at scale.
  • Continuously improve efficiency and reliability via testing, CI/CD automation, cost/performance tuning, and incident/root-cause reviews.

Eligibility Requirements

You may be a fit if you have:

  • Experience working at startups, scaleups, or companies with a strong focus on data quality, DataOps, and data management at scale.
  • Expertise in Cloud-Native Data Engineering: 5+ years building and running data warehouses and pipelines in AWS or Azure, including managed data services (e.g., Kinesis, EMR/Databricks, Redshift, Glue, Azure Data Lake).
  • Programming Mastery: Advanced skills in Python or another major language; writing clean, testable, production-grade ETL code at scale.
  • Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster.
  • Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing.
  • API & Integration Fluency: Building data ingestion from REST/gRPC APIs, file drops, message queues (SQS, Kafka), and 3rd party SaaS integrations, with idempotency and error handling.
  • Storage & Query Engines: Strong with RDBMS (PostgreSQL, MySQL), NoSQL (DynamoDB, Cassandra), data lakes (Parquet, ORC), and warehouse paradigms.
  • Observability & Quality: Deep familiarity with metrics, logging, tracing, and data quality tools (e.g., Great Expectations, Monte Carlo, custom validation/test suites).
  • Security & Governance: Data encryption, secrets management, RBAC/ABAC, and compliance awareness (GDPR, CCPA).
  • CI/CD for Data Systems: Comfort with automation, infrastructure as code (Terraform), version control, and release workflows.
  • Collaborative Spirit: Experience working closely with platform, ML, and analytics teams in a fast-paced, mission-driven environment.

Benefits

At Magentic, we recognise and reward the talent that drives our success. We offer:

  • Competitive Equity: play a real part in Magentic’s upside.
  • A salary of £110,000-£120,000
  • Visa sponsorship available (note: we are only accepting candidates who are currently based in the UK).
  • Hybrid London HQ (3-4 days in the office)
  • Annual team retreat—a fully-funded off-site to recharge, bond, and build.


Application Process

For more information and to apply, follow the application link.
