D2C242: Data Engineering and its Streams, Rivers, and Lakes


Keith Gregory teaches Day Two Cloud about data engineering in a way DevOps folks (and hydrologists) can understand. He explains that a data engineer's role is to build pipelines that carry data out of metaphorical rivers and make it usable for data analysts. Keith walks us through the testing process, the difference between streaming pipelines and polling pipelines, and the difference between data lakes and data warehouses. Plus, he explains terms that network engineers and developers might bump into on big projects without knowing exactly what they mean: ELT, OLTP, columnar storage, and more.