Become a Professional Data Engineer on GCP. Design data processing systems, build end-to-end data pipelines, analyze data and carry out machine learning. This course is part of Google’s Data Engineering track that leads to the Professional Data Engineer certificate.
This 4-day training combines presentations, demos, and hands-on labs.
What You'll Learn
Google Cloud Dataproc Overview
- Creating and managing clusters
- Leveraging custom machine types and preemptible worker nodes
- Scaling and deleting clusters
- Lab: Creating Hadoop Clusters with Google Cloud Dataproc
Running Dataproc Jobs
- Running Pig and Hive jobs
- Separation of storage and compute
- Lab: Running Hadoop and Spark Jobs with Dataproc
- Lab: Submit and monitor jobs
Integrating Dataproc with Google Cloud Platform
- Customize clusters with initialization actions
- BigQuery Support
- Lab: Leveraging Google Cloud Platform Services
Making Sense of Unstructured Data with Google’s Machine Learning APIs
- Google’s Machine Learning APIs
- Common ML Use Cases
- Invoking ML APIs
- Lab: Adding Machine Learning Capabilities to Big Data Analysis
Serverless Data Analysis With BigQuery
- What is BigQuery?
- Queries and Functions
- Lab: Writing queries in BigQuery
- Loading data into BigQuery
- Exporting data from BigQuery
- Lab: Loading and exporting data
- Nested and repeated fields
- Querying multiple tables
- Lab: Complex queries
- Performance and pricing
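The nested and repeated fields covered above let BigQuery store one-to-many relationships inside a single row, which a query then flattens with `UNNEST`. The effect of that flattening can be sketched in plain Python (the order data and field names here are hypothetical, not from the course material):

```python
# Hypothetical rows with a repeated "items" field, the way BigQuery
# would nest an order's line items inside the order row itself.
orders = [
    {"order_id": 1, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]},
    {"order_id": 2, "items": [{"sku": "A", "qty": 5}]},
]

def unnest(rows, field):
    """Flatten a repeated field: one output row per nested element,
    analogous to `SELECT order_id, item.* FROM t, UNNEST(items) AS item`."""
    for row in rows:
        for element in row[field]:
            flat = {k: v for k, v in row.items() if k != field}
            flat.update(element)
            yield flat

flat_rows = list(unnest(orders, "items"))
# Three flattened rows: (1, A, 2), (1, B, 1), (2, A, 5)
```

Keeping the data nested until query time is what lets BigQuery avoid the join you would otherwise need between an orders table and a line-items table.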
Serverless, Autoscaling Data Pipelines With Dataflow
- The Beam programming model
- Data pipelines in Beam Python
- Data pipelines in Beam Java
- Lab: Writing a Dataflow pipeline
- Scalable Big Data processing using Beam
- Lab: MapReduce in Dataflow
- Incorporating additional data
- Lab: Side inputs
- Handling stream data
- GCP Reference architecture
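The "MapReduce in Dataflow" lab reimplements the classic map-shuffle-reduce pattern with Beam transforms (ParDo, GroupByKey, CombinePerKey). Stripped of the Beam API, the pattern itself fits in a few lines of Python — a conceptual sketch of the shape of the pipeline, not the lab code:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (key, value) pair per word, like a Beam ParDo.
    for line in lines:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group values by key, like Beam's GroupByKey.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each group, like Beam's CombinePerKey(sum).
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["to be or not to be"])))
# counts == {"to": 2, "be": 2, "or": 1, "not": 1}
```

What Beam adds on top of this skeleton is exactly what the section title promises: the runner parallelizes and autoscales each phase without changes to the pipeline code.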
Getting Started With Machine Learning
- What is machine learning (ML)?
- Effective ML: concepts, types
- ML datasets: generalization
- Lab: Explore and create ML datasets
Building ML Models With TensorFlow
- Getting started with TensorFlow
- Lab: Using tf.learn
- TensorFlow graphs and loops + lab
- Lab: Using low-level TensorFlow + early stopping
- Monitoring ML training
- Lab: Charts and graphs of TensorFlow training
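The low-level TensorFlow lab builds the training loop by hand: compute a loss, take gradient steps, and stop early once validation loss stops improving. The control flow can be sketched in plain Python on a one-parameter least-squares problem — just the loop structure, not TensorFlow itself:

```python
# Fit y = w * x by gradient descent with early stopping.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]   # true w is 2.0

def loss(w):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(w):
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w, best_loss, patience = 0.0, float("inf"), 0
for step in range(1000):
    w -= 0.05 * grad(w)          # one gradient step
    current = loss(w)
    if current < best_loss - 1e-9:
        best_loss, patience = current, 0
    else:
        patience += 1
        if patience >= 5:        # early stopping: no improvement for 5 steps
            break
```

The same three ingredients (loss, gradient step, stopping criterion) are what the lab wires together with TensorFlow ops instead of hand-written functions.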
Scaling ML Models With Cloud ML
- Why Cloud ML?
- Packaging up a TensorFlow model
- End-to-end training
- Lab: Run an ML model locally and on cloud
Feature Engineering
- Creating good features
- Transforming inputs
- Synthetic features
- Preprocessing with Cloud ML
- Lab: Feature engineering
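A synthetic feature combines raw inputs into something more directly predictive than either input alone; a ratio of two columns is the classic case. A tiny illustration in plain Python (the housing-style column names are hypothetical):

```python
rows = [
    {"total_rooms": 5000.0, "population": 1000.0},
    {"total_rooms": 1200.0, "population": 800.0},
]

for row in rows:
    # Synthetic feature: rooms per person can be more predictive than
    # either raw count, because both raw counts scale with district size.
    row["rooms_per_person"] = row["total_rooms"] / row["population"]
```

In the course's stack this kind of transformation would live in the preprocessing step so that training and serving apply it identically.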
Architecture of Streaming Analytics Pipelines
- Stream data processing: Challenges
- Handling variable data volumes
- Dealing with unordered/late data
- Lab: Designing a streaming pipeline
Ingesting Variable Volumes
- What is Cloud Pub/Sub?
- How it works: Topics and Subscriptions
- Lab: Simulator
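Pub/Sub decouples publishers from consumers: a message published to a topic is delivered to every subscription on that topic, and each subscription tracks its own backlog independently. That delivery model (not the real Pub/Sub API) can be simulated in a few lines:

```python
from collections import deque

class Topic:
    def __init__(self):
        self.subscriptions = {}

    def create_subscription(self, name):
        # Each subscription gets its own queue, i.e. its own backlog.
        self.subscriptions[name] = deque()

    def publish(self, message):
        # Fan-out: every subscription receives its own copy.
        for queue in self.subscriptions.values():
            queue.append(message)

    def pull(self, name):
        return self.subscriptions[name].popleft()

topic = Topic()
topic.create_subscription("dataflow-pipeline")
topic.create_subscription("archiver")
topic.publish("sensor-reading-1")
# Both subscribers can pull the message independently.
```

The practical consequence is that adding a new consumer (say, an archiver next to the pipeline) never requires touching the publisher.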
Implementing Streaming Pipelines
- Challenges in stream processing
- Handle late data: watermarks, triggers, accumulation
- Lab: Stream data processing pipeline for live traffic data
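Watermarks, triggers, and accumulation all answer one question: when is a window "done" if events can arrive out of order? A toy event-time simulation in plain Python shows the core bookkeeping — a deliberately simplified sketch; real Beam tracks watermarks per source and supports far richer triggering:

```python
# Events: (event_time, value). They arrive out of order.
events = [(1, "a"), (2, "b"), (9, "c"), (3, "late!")]

WINDOW = 5            # 5-second fixed windows
ALLOWED_LATENESS = 2  # how far behind the watermark we still accept data

windows, watermark, dropped = {}, 0, []
for event_time, value in events:
    # The watermark advances with the latest event time seen so far.
    watermark = max(watermark, event_time)
    window_start = (event_time // WINDOW) * WINDOW
    if event_time >= watermark - ALLOWED_LATENESS:
        windows.setdefault(window_start, []).append(value)
    else:
        dropped.append(value)  # beyond allowed lateness: discarded

# windows == {0: ["a", "b"], 5: ["c"]}; "late!" at t=3 is dropped
# because the watermark had already advanced to 9.
```

The lab's live-traffic pipeline faces exactly this trade-off: a larger allowed lateness means more complete windows but later, and more frequently revised, results.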
Streaming Analytics and Dashboards
- Streaming analytics: from data to decisions
- Querying streaming data with BigQuery
- What is Google Data Studio?
- Lab: Build a real-time dashboard to visualize processed data
High Throughput and Low-Latency With Bigtable
- What is Cloud Bigtable?
- Designing Bigtable schema
- Ingesting into Bigtable
- Lab: Streaming into Bigtable
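Bigtable stores rows sorted lexicographically by a single row key, so schema design is mostly row-key design: queries become cheap range scans when the key leads with what you filter on. A plain-Python sketch of the common reverse-timestamp pattern (the key layout and sensor names are illustrative, not prescribed by the course):

```python
MAX_TS = 10**10  # large sentinel so newer events sort first

def row_key(sensor_id, timestamp):
    # Prefix by sensor, then reverse the timestamp so a scan from the
    # start of a sensor's key range returns the newest readings first.
    return f"{sensor_id}#{MAX_TS - timestamp:010d}"

table = {}
table[row_key("sensor-7", 1000)] = {"temp": 20.1}
table[row_key("sensor-7", 2000)] = {"temp": 21.5}

# A lexicographic scan over the "sensor-7#" prefix, newest first.
newest_first = [table[k] for k in sorted(table) if k.startswith("sensor-7#")]
# newest_first[0] == {"temp": 21.5}
```

Getting this right up front matters because, unlike in BigQuery, there is no secondary index to fall back on: the row key is the access path.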
Become a GCP Data Engineer
Are you a developer responsible for managing Big Data transformations? Do you want to become a certified professional on the Google Cloud Platform? Then it's time to gain the knowledge and skills to prepare for your Google certification. This training prepares you to design and build data processing systems on the Google Cloud Platform. You will learn how to analyze data and carry out machine learning. We cover structured, unstructured, and streaming data.
Data Engineering on GCP is perfect for
experienced developers responsible for extracting, loading, transforming, cleaning, and validating data, and designing pipelines and architectures for data processing. If you are responsible for creating and maintaining Machine Learning and statistical models, querying datasets, visualizing query results, and creating reports, you’re more than welcome to join too.
Before enrolling, you need to complete the Google Cloud Fundamentals: Big Data & Machine Learning course, or have basic proficiency with:
- A common query language like SQL
- Data modeling, extracting, transforming, and loading activities
- Developing applications using a programming language like Python
- Machine Learning and/or statistics
Professional Data Engineer
A Data Engineer should also be able to leverage, deploy, and continuously train pre-existing machine learning models.
Abilities Validated by the Certification
- Design data processing systems
- Build and operationalize data processing systems
- Operationalize machine learning models
- Ensure solution quality
Recommended Knowledge and Experience
- 3+ years of industry experience including 1+ years designing and managing solutions using GCP.
Instructor: Martijn van de Grift
Martijn is a cloud consultant at Binx.io, where he specializes in creating solutions using GCP and AWS. He holds the most relevant technical certifications for both clouds. He has a great passion for IT and likes to work with the latest technologies. He loves to share this passion during trainings and webinars, where he brings experience from assignments at companies including Booking.com, Weeronline, and ZorgDomein.
The Right Format For Your Preferred Learning Style
At Binx we offer four distinct training modalities:
- In-Classroom Training
- Online, Instructor-Led Training
- Hybrid and Blended Learning
- Self-Paced Training
Learn more about our training modalities