As of July 2018, there is no built-in Biml support for OData. To work with OData in Biml, you have to create a custom source and connection manager. This requires more Biml code than built-in functions like OleDbSource and may look a little overwhelming at first. But don’t worry! You don’t have to start from scratch.
In this blog post, we will first look at the properties of the OData Connection Manager and how to script it in Biml. Then, we will do the same for the OData Source. Finally, we will tie it all together and create a complete SSIS package that you can use as a starting point for your own projects.
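To give you an idea of what the end result looks like, here is a minimal sketch of a custom OData connection manager in Biml. The connection name, URL, and ObjectData payload are illustrative placeholders; the easiest way to get the exact ObjectData for your environment is to convert an existing package, as described below.

```xml
<Connections>
  <!-- OData has no built-in Biml element, so it is declared as a custom
       SSIS connection. CreationName tells SSIS which connection manager
       to instantiate; ObjectData holds the escaped connection manager XML. -->
  <CustomSsisConnection Name="OData_MySource"
                        CreationName="ODATA"
                        ObjectData="&lt;ODataConnectionManager ConnectionString=&quot;https://example.com/odata/&quot; /&gt;" />
</Connections>
```

Note that the XML inside ObjectData must be escaped (`&lt;`, `&quot;`), because it is stored as a single attribute value.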
The Quick and Easy Solution
But before we dig into any code, let’s skip to the quick, easy, and timesaving solution. That’s what we all really want, right? :)
1. Install or upgrade to BimlExpress 2018
2. Create an example SSIS package using an OData Source and Connection Manager
3. Convert the SSIS package to Biml
As promised: quick, easy, and timesaving! The new Convert SSIS Packages to Biml feature was released in BimlExpress 2018, and it really is a lifesaver. After converting to Biml, you can simply copy and paste the code into your projects.
You may run into some bugs when you convert your SSIS packages to Biml. I ran into two issues while writing this blog post. The first was that I had to add UsesDispositions="true" to the Source component. The second was that the data types in the Source component were prefixed with System. I have fixed both of these issues in my examples below. In addition to these issues, the converted Biml also contained some unnecessary code. Unnecessary code does not break anything, but it can make your code harder to read and maintain. Personally, I prefer my code to be as clean and simple as possible.
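Both fixes can be seen in this sketch of the custom source component. The component type name is abbreviated and the column names are illustrative; only the UsesDispositions attribute and the un-prefixed data types are the point here.

```xml
<!-- Fix 1: add UsesDispositions="true" to the custom source component. -->
<CustomComponent Name="OData Source"
                 ComponentTypeName="Microsoft.SqlServer.IntegrationServices.OData.ODataSource"
                 UsesDispositions="true">
  <OutputPaths>
    <OutputPath Name="ODataOutput">
      <OutputColumns>
        <!-- Fix 2: data types must be e.g. "Int32" and "String",
             not "System.Int32" and "System.String". -->
        <OutputColumn Name="Id" DataType="Int32" />
        <OutputColumn Name="Name" DataType="String" Length="255" />
      </OutputColumns>
    </OutputPath>
  </OutputPaths>
</CustomComponent>
```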
T-SQL Tuesday #101 is hosted by Jens Vestergaard (@vestergaardj), and the topic is The Essential SQL Server Tools in my stack. There are several tools that I use every single day, such as Notepad++, Redgate SQL Prompt, and BimlExpress. In fact, there are so many amazing tools out there for data professionals that I created an entire session focusing on Tools and Tips For Data Warehouse Developers! Since I have already covered my favorite tools in other blog posts, I want to take a slightly different approach this time and share some of my favorite online tools.
T-SQL Tuesday #99 is hosted by Aaron Bertrand (@AaronBertrand) and the topic is Dealer’s Choice. What does that mean? Aaron wanted us to choose between two topics: write about something we are passionate about outside the SQL Server community, or write about T-SQL bad habits and best practices. I’m too afraid to argue with Aaron, so I chose the first option ;)
But what did I want to write about? Truth be told, I’m pretty boring. I don’t have any real hobbies outside of tech, just a few interests. I first considered writing about hiking. I need to disconnect completely once in a while, and my favorite way of doing that is to get sweaty while walking for hours up a mountain or waterfall. My Instagram (@cathrinesqueee) is my highlight reel, full of pictures and happy memories from my trips.
However, instead of embedding all the pictures that are already on Instagram, I decided to write about something completely different. Something obscure. Something fun and geeky from way before I got involved in the SQL Server community. Something from my good old teenage fangirling days…
Biml objects have many built-in attributes. For example, all Tables have SchemaName and all Packages have ProtectionLevel. When your Biml solution starts to grow, you will quickly see the need for adding additional metadata that can be used in other Biml files. A common use case in Data Warehouse Staging projects is to store the source schema and source table name on your staging table objects. This allows you to use the source metadata in a higher tier Biml file that generates the SSIS packages to load the tables. To store and use this additional metadata, you can use Biml Annotations or ObjectTags.
Biml Annotations and ObjectTags are both Key/Value pairs. Annotations are String/String pairs intended for storing simple text metadata, while ObjectTags are String/Object pairs that can also store more complex metadata in .NET objects.
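As a small sketch of both approaches, using the staging scenario above (table and tag names are illustrative): Annotations are declared in the Biml itself and read back with GetTag, while ObjectTags are set and read from BimlScript code.

```xml
<Tables>
  <Table Name="stg_Customer" SchemaName="Staging.stg">
    <Annotations>
      <!-- String/String pairs stored directly in the Biml -->
      <Annotation Tag="SourceSchema">Sales</Annotation>
      <Annotation Tag="SourceTable">Customer</Annotation>
    </Annotations>
  </Table>
</Tables>

<#
var table = RootNode.Tables["stg_Customer"];

// Read an Annotation value by its Tag
string sourceSchema = table.GetTag("SourceSchema");

// Store a .NET object (here a DateTime) as an ObjectTag
table.ObjectTag["LastGenerated"] = DateTime.Now;
#>
```

The ObjectTag dictionary only exists at compile time, which is why it can hold full .NET objects rather than just strings.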
Many Biml solutions start off very simple, with just a single Biml file that generates a few SSIS packages. Most developers quickly see the need for a more complex solution once they start working with multiple sources. One way to reuse code and apply the Don’t Repeat Yourself software engineering principle in Biml is to use Tiered Biml Files.
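A minimal sketch of two tiered files, with illustrative names and a placeholder connection string: the tier 1 file defines connections, and the tier 2 file (compiled afterwards) reads them from RootNode to generate packages.

```xml
<!-- Environment.biml -->
<#@ template tier="1" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="Source" ConnectionString="Provider=SQLNCLI11;..." />
  </Connections>
</Biml>

<!-- Packages.biml (tier 2 is compiled after tier 1 and sees its output) -->
<#@ template tier="2" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (var connection in RootNode.Connections) { #>
    <Package Name="Load_<#=connection.Name#>" ConstraintMode="Linear" />
    <# } #>
  </Packages>
</Biml>
```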
In addition to using Tiered Biml Files, there are four other main ways you can avoid repeating your Biml code:
I have written posts in the past about how to deal with technical failures as a speaker: Be prepared. Have backups of your files. Take screenshots or videos of your demos. Rehearse presenting your session with and without your slide deck. I have made plenty of I-wasn’t-prepared-for-that mistakes in the past, and I learned from them.
Last weekend was different. I was supposed to do a precon and a session at SQLSaturday Nashville. I didn’t experience any technical failures. I experienced what felt very much like personal failures.
On Friday, our precon got canceled due to weather.
On Saturday, no one showed up to my session.
I had not prepared for either of those things to happen, and I definitely had not prepared for both of those things to happen on the same weekend. I went through a whole range of emotions in a short amount of time, from embarrassment to disappointment to amusement.
Since then, I have debated with myself about whether or not to write this blog post. I have written and re-written it several times. I did not want to write a blog post full of whining and complaining, but I did want to share my experience as a reminder to both myself and others that:
These things happen. It’s ok to feel disappointed, but don’t let it stop you from trying again.
There is always something positive to take away from a negative experience.
Life is full of ups and downs. Social media (especially my own) often focuses on the ups, but life is not amazing all the time.
If you only needed a few reminders, feel free to stop reading here :) The rest of the post is quite long and a little more like a journal entry, really.
One of the sessions I was most looking forward to at Microsoft Ignite 2017 was New capabilities for data integration in the cloud with Mike Flasko. In that session, he talks about Azure Data Factory (ADF) v2 and its new first-class SSIS support.
After the session, I convinced Mike Flasko and Sanjay Krishnamurthi to have a chat with me :) We talked about what’s new in Azure Data Factory v2, including the updated pipeline application model with a new visual design canvas, new Software Development Kits (SDKs) for working with Azure Data Factory, the new Integration Runtime, and the ability to run SSIS packages inside Azure Data Factory v2.
Azure Data Factory v2 with Mike Flasko
Follow Mike Flasko on Twitter @mflasko, and keep an eye out for more news about ADF and SSIS! I may or may not have convinced him to do another interview with me in a couple of months :)
At Microsoft Ignite 2017, I had planned an interview with Sunil Agarwal, and was very excited about it. Then Sunil asked if he could bring Kevin Farlee. Of course! Then he asked if he could also bring their customer, Aaron Gerdeman from FIS. Even better! :)
In this interview, we chat about SQL Server 2017, Resumable Index Builds, Adaptive Query Processing, Columnstore Indexes, High Availability, Real-time Analytics, Real-time Dashboards and the SQL Tiger Team.
High Performance Analytics with Sunil Agarwal, Kevin Farlee, Aaron Gerdeman
I know Bob had a very busy schedule at Microsoft Ignite, so I’m very thankful he was able to spend a few minutes with me! I hope you find this as interesting as I did :) If you want to learn even more, you can watch his session Experience Microsoft SQL Server 2017: The fast and the furious from Microsoft Ignite 2017:
I got to interview Buck Woody about Data Science at Microsoft Ignite 2017! :)
In this interview, we chat about Microsoft Business Analytics and AI (formerly known as Cortana Intelligence Suite), Artificial Intelligence in Excel, intent-based programming, Predictive Analytics, DevOps for Data Scientists and lifelong learning.
Data Science with Buck Woody – Microsoft Ignite 2017
I’m so thankful that Buck took some time out of his busy schedule to chat with me, and I hope all of you enjoy this interview. If you want to learn more, watch his full session DevOps for data science from Microsoft Ignite 2017: