Speaking at 24 Hours of PASS: Past Learnings and Future Visions

PASS turns 20 this April! To celebrate 20 years of educating data professionals, they are organizing a 24 Hours of PASS event with the theme Past Learnings and Future Visions. From April 3-4, you can watch 24 free webinars back-to-back. If you can’t attend all 24 webinars live (who can!?), don’t worry: they will all be recorded. Woohoo!

I’m presenting a brand new session called Pipelines and Packages: Introduction to Azure Data Factory. If Azure Data Factory is not really your thing, don’t worry! You can choose from 23 other topics, including SQL Server, Power BI, AI, Containers, and even Professional Development.

Past Learnings and Future Visions

I’m very happy and honored to speak at this edition of 24 Hours of PASS. The theme Past Learnings and Future Visions ties in perfectly with my job these days :)

I have spent the last 10 years working with Data Integration and have focused mainly on SQL Server Integration Services (SSIS) and Biml. However, now I’m shifting more towards Azure Data Factory (ADF).

This shift has brought up many questions for me that others may relate to. How do I move from SSIS to ADF? Are all my SSIS skills now irrelevant? What about the SSIS solution I have spent years building? Should I stop using SSIS entirely? And what about Biml? Is it still useful?

My main focus in this session is to show the capabilities of Azure Data Factory. But because of the Past Learnings and Future Visions theme, I will also try to answer some of the questions above.

Is it possible to take what I have learned in the past about SSIS and apply it to future ADF projects?

Pipelines and Packages: Introduction to Azure Data Factory

Level: 200 (Introduction)
Time: Thursday, April 4th, 13:00 UTC

As Data Engineers and ETL Developers, our main responsibility is to move, transform, and integrate data for end users as efficiently as possible. With the ever-increasing volume and variety of data, this can feel like a daunting task. Azure Data Factory (ADF) is a hybrid data integration service that lets you build complex and scalable data pipelines – without writing any code.

But wait! Have you already invested years and millions in a comprehensive SSIS solution? No problem! You can lift and shift existing packages into Azure Data Factory to modernize your solution while retaining the investments you have already made.

In this session, we will go through the fundamentals of Azure Data Factory and see how easy it is to build new data pipelines or migrate existing SSIS packages. Then, we will explore some major improvements in Azure Data Factory v2, including the new Mapping Data Flows. Finally, we will look at development best practices to speed up productivity and keep costs down.
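If you are curious about what a pipeline looks like behind the visual designer before the session: every ADF pipeline is stored as a JSON definition, and you can also create pipelines programmatically. Below is a minimal sketch using the Python management SDK (azure-mgmt-datafactory) that defines a pipeline with a single copy activity. The subscription, resource group, factory, and dataset names are placeholders I made up for illustration, the two blob datasets are assumed to already exist in the factory, and exact class names and signatures can vary between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink,
    BlobSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

# Placeholder values for illustration only
subscription_id = "<subscription-id>"
resource_group = "my-resource-group"
factory_name = "my-data-factory"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# One copy activity that moves data from an existing source blob dataset
# to an existing sink blob dataset (both assumed to be defined in the factory)
copy_activity = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(reference_name="SourceBlobDataset")],
    outputs=[DatasetReference(reference_name="SinkBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# A pipeline is simply a named collection of activities
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(
    resource_group, factory_name, "CopyBlobPipeline", pipeline
)
```

In the session itself, everything will be done in the visual authoring experience in the browser, which generates the same kind of pipeline definitions without any of this code.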

Tune in on April 3-4!

I hope you will join us for one or more webinars. If you’re on Twitter, make sure to follow @SQLPass for the latest news, and use the #SQLPass and #24HOP hashtags to join the discussion. Check out the great schedule and register today :)

About the Author

Cathrine Wilhelmsen is a Microsoft Data Platform MVP, BimlHero Certified Expert, Microsoft Certified Solutions Expert, international speaker, author, blogger, and chronic volunteer who loves teaching and sharing knowledge. She works as a Senior Business Intelligence Consultant at Inmeta, focusing on Azure Data and the Microsoft Data Platform. She loves sci-fi, chocolate, coffee, craft beers, ciders, cat gifs and smilies :)
