
Advertise Your Data Jobs

Source high-quality candidates from our extensive 
database and fill vacancies fast!

About the Job In Detail

GCP Data Engineer

Posted on 10-06-2022
City of London, England
£50,000 - £75,000 per annum plus bonus & benefits
Permanent

Job Description


We are seeking an experienced Senior Data Engineer to join the Data team.
Our client's objective is to develop a robust platform and pipelines to collect and disseminate data to and from a range of internal and external sources.
The role will use a combination of batch and real-time streaming technologies to deliver cutting-edge analytics, real-time decision making, and robust reporting. We believe that data should be the foundation of the business.
We have no tolerance for lag, approximation, or poor data quality! This is a fantastic opportunity for anyone with a passion for data, software architecture, streaming technologies, and distributed systems who wants to build production systems with a world-class team.
Key Responsibilities:
Managing and administering our BigQuery data warehouse to ensure high, continuous data availability.
You'll mostly work with Google Cloud Platform (Cloud Storage, BigQuery, Composer, Dataflow), Confluent Kafka, and Snowplow.
The role also involves working alongside Machine Learning researchers and data scientists on cutting-edge ML models, as well as understanding the needs of the extended team and turning those needs into outstanding data products.
Designing and constructing data collection pipelines from multiple data sources.
Designing and building Dataflow streaming pipelines that process events against ML models and publish the results.
Supporting data modeling with dbt on BigQuery (GCP).
Working with backend, frontend, DevOps, and QA engineers to ensure that data pipeline event services are well-designed and correctly connected.
Helping us establish a data-driven mentality across the organization.
Requirements:
At least five years of experience as a data engineer working with real-time and batch data in a production setting.
A solid understanding of data warehousing and data lake concepts (including indexing, query graphs, basic administration, relational models, etc.).
Experience with streaming technologies such as Kafka, Pub/Sub, or Kinesis.
Extensive experience with at least one programming language; we are particularly interested in Python, Go, and SQL, especially in relation to Airflow and dbt.
Experience designing and building production systems on Google Cloud and/or Amazon Web Services infrastructure.
Production experience with Beam (GCP Dataflow) or Spark on distributed systems at scale.
A solid understanding of data quality principles.
Knowledge of database technologies (best practices, performance optimization, defect identification).
Experience working within an SDLC (testing, Git, design documentation, and Agile/Scrum delivery).
Experience with Terraform and Kubernetes (K8s) is valuable.
Have supported and mentored team members.
Experience with both batch and real-time machine learning model deployment and maintenance.
Understand distributed systems and scalable architecture.
Understand the security and InfoSec concerns specific to data engineering.
Apply if you have Fintech, Finance, Payments, or Retail Banking experience.

Testimonials

Our satisfied clients’ views

Logan Robertson
I was having a hard time finding a job in my niche, so it was a blessing to find a site like this with so many vacancies lined up, specifically for people working in data.
Terry Obrien
I highly recommend this website as it is very useful for finding data analyst jobs. User-friendly and offers relevant information on each job listing.
Liam Williamson
The process of applying for a job on Data Jobs was really smooth. They have real companies advertising real vacancies with great wages, which is absolutely amazing.

Find the Right Data Job for You

Build Your Career in Data

Data roles with top employers
Have questions about what we do?
Contact us now!
We are the premier data jobs site in the UK, offering a high-performance platform with a range of job posting services and job search tools.
© 2023 Data Jobs