Comprehensive learning path to become skilled in Data Engineering for AI. Covers essential tools, frameworks, and best practices.
Basic statistics is helpful but will be taught
Some coding experience; Python or R preferred
Follow these courses in order to complete the learning path. Click on any course to enroll.
A deep dive into data quality assessment and improvement practices for AI solutions. The course covers how data is used in various AI use cases, the insufficiency of traditional data quality methods for large-scale AI models, and the economics of building training datasets.
This course focuses on the intricacies of data contracts and serialization in Kafka. It explores how serialization enhances Kafka's architecture and examines different serialization formats such as Avro, Protobuf, and Thrift to understand their schema compatibility and applications. The course is designed for software developers and data engineers with basic knowledge of Java and Kafka.
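The schema-compatibility idea this course covers can be sketched in a few lines. The snippet below is an illustrative simplification, not course material: schemas are plain dicts, and the rule shown mirrors the spirit of Avro-style backward compatibility, where a reader's new schema can still decode old data as long as any newly added field carries a default value.

```python
# Illustrative sketch (not from the course): a simplified backward-compatibility
# check in the spirit of Avro's rules. The schema shape and helper name are
# hypothetical simplifications.

def is_backward_compatible(old_schema, new_schema):
    """A reader using new_schema can decode data written with old_schema
    if every field it expects either existed before or has a default."""
    old_fields = {f["name"] for f in old_schema["fields"]}
    for field in new_schema["fields"]:
        if field["name"] not in old_fields and "default" not in field:
            return False  # a new required field breaks old data
    return True

v1 = {"fields": [{"name": "id", "type": "long"}]}
v2_ok = {"fields": [{"name": "id", "type": "long"},
                    {"name": "email", "type": "string", "default": ""}]}
v2_bad = {"fields": [{"name": "id", "type": "long"},
                     {"name": "email", "type": "string"}]}

print(is_backward_compatible(v1, v2_ok))   # True: added field has a default
print(is_backward_compatible(v1, v2_bad))  # False: added field is required
```

Real schema registries apply richer rules (forward and full compatibility, type promotion), but the core trade-off is the one shown: defaults are what make schema evolution safe for existing data.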
A one-day immersive workshop led by Andrew Jones, the creator of data contracts. The course delves into the transformative world of Data Contracts and how to use them to implement a Data Mesh. It is tailored for software, platform, and data engineers looking for practical guidance on implementing data contracts and data mesh in their organizations.
This comprehensive guide, in the form of an eBook, explores the benefits of using data contracts to improve data quality in modern data platforms. It provides practical tips and best practices for implementing data contracts in an organization to build a truly data-driven culture with a focus on accountability and governance.
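To make the idea of a data contract concrete, here is a minimal sketch (the dataset name, field names, and contract shape are invented for illustration, not taken from the eBook): a contract is a declarative spec of the schema and constraints a producer commits to, which records can be validated against before being published.

```python
# Illustrative sketch: a data contract as a declarative spec. All names
# (dataset, owner, fields) are hypothetical examples.

contract = {
    "dataset": "orders",
    "owner": "checkout-team",
    "fields": {
        "order_id": {"type": str, "required": True},
        "amount":   {"type": float, "required": True},
        "coupon":   {"type": str, "required": False},
    },
}

def validate(record, contract):
    """Return a list of violations; an empty list means the record conforms."""
    violations = []
    for name, spec in contract["fields"].items():
        if name not in record:
            if spec["required"]:
                violations.append(f"missing required field: {name}")
        elif not isinstance(record[name], spec["type"]):
            violations.append(f"wrong type for {name}")
    return violations

print(validate({"order_id": "A1", "amount": 9.99}, contract))  # [] -> conforms
print(validate({"order_id": "A2"}, contract))  # missing required 'amount'
```

Checking records against the contract at the producer side, rather than cleaning up downstream, is what shifts accountability for data quality to the team that owns the data.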