
Business Intelligence in Azure – SQLBits 2018 Precon

What can you expect from my SQLBits pre-conference training day in February 2018 at the London Olympia?

Well my friends, in short, we are going to take a whirlwind tour of the entire business intelligence stack of services in Azure. No stone will be left unturned. No service will be left without scalability. We’ll cover them all, and we certainly aren’t going to check with the Azure bill payer before turning up the compute on our data transforms.



What will we actually cover?

With new cloud services and advancements in locally hosted platforms, developing a lambda architecture is becoming the new normal. In this full day of high-level training we’ll learn how to architect hybrid business intelligence solutions using Microsoft Azure offerings. We’ll explore the roles of these cloud data services and how to make them work for you in this complete overview of business intelligence on the Microsoft cloud data platform.

Here’s how we’ll break that down during the day…

Module 1 – Getting Started with Azure

Using platform-as-a-service products is great, but let’s take a step back. To kick off, we’ll cover the basics of deploying and managing your Azure services. Navigating the Azure portal and building dashboards isn’t always as intuitive as we’d like. What’s a resource group? And why is it important to understand your Azure Active Directory tenant?

Module 2 – An Overview of BI in Azure

What’s available for the business intelligence architect in the cloud, and how might these services relate to traditional on-premises ETL and cube data flows? Is ETL enough for our big unstructured data sources, or do we need to mix things up and add some more letters to the acronym in the cloud?

Module 3 – Databases in Azure (SQL DB, SQL DW, Cosmos DB, SQL MI)

It’s SQL Server, Jim, but not as we know it. Check out the PaaS flavours of our long-term on-premises friends. Can we trade the agent and an operating system for that sliding bar of scalable compute? DTUs and DWUs are here to stay, with new SLAs relating to throughput. Who’s on ACID, and as BI people, do we care?

Module 4 – The Azure Machines are here to Learn

Data scientist or developer? Azure Machine Learning was designed for applied machine learning. Use best-in-class algorithms in a simple drag-and-drop interface. We’ll go from idea to deployment in a matter of clicks. Without a terminator in sight!

Module 5 – Swimming in the Data Lake with U-SQL

Let’s understand the role of this hyper-scale, two-tier big data technology and how to harness its power with U-SQL, the offspring of T-SQL and C#. We’ll cover everything you need to know to get started developing solutions with Azure Data Lake.
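
As a taster of that T-SQL and C# pairing, here’s a minimal hypothetical U-SQL sketch (the file paths and columns are invented for illustration):

// Extract a rowset from a file in the data lake store
@searchlog =
    EXTRACT UserId int,
            Region string
    FROM "/input/searchlog.tsv"
    USING Extractors.Tsv();

// Apply an inline C# string method within a T-SQL style projection
@result =
    SELECT UserId,
           Region.ToUpper() AS RegionUpper
    FROM @searchlog;

OUTPUT @result
TO "/output/result.csv"
USING Outputters.Csv();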

Module 6 – IoT, Event Hubs and Azure Stream Analytics

Real-time data is everywhere. We need to unlock it as a rich source of information that can be channelled to react to events, produce alerts from sensor values, or support 9000 other scenarios. In this module, we’ll learn how, using Azure Event Hubs and Azure Stream Analytics.

Module 7 – Power BI, our Semantic Layer, is it All Things to All People?

Power BI combines all our data sources in one place with rich visuals and a flexible data modelling tool. It takes it all: small data, big data, streaming data, website content and more. But we really need a Venn diagram to decide when and where it’s needed.

Module 8 – Data Integration with Azure Data Factory and SSIS

The new integration runtime is here. But how do we unlock the scale-out potential of our control flows and data flows? Let’s learn to create the perfect dependency-driven pipeline for our data flows. Plus, how to work with the Azure Batch service should you need that extensibility.

 

Finally, we’ll wrap up the day by playing the Azure icon game, which by then you’ll all be familiar with and able to complete with a perfect score, having attended this training day 🙂

Many thanks for reading and I hope to see you in February. It’s going to be magic 😉

Register now: https://www.regonline.com/registration/Checkin.aspx?EventID=2023328

All training day content is subject to change, dependent on timings and the will of the demo gods!


 

Storing U-SQL Assemblies in Azure Blob Storage

I’m hoping the title of this post is fairly self-explanatory. You’re here because, like me, you found that the MSDN language reference page for creating U-SQL assemblies states that it’s possible to store the DLLs in Azure Blob Storage, but it doesn’t actually tell you how. Well, read on, my friends, and I’ll show you how.

The offending article: https://msdn.microsoft.com/en-us/library/azure/mt763293.aspx

The offending text snippet:

“Assembly_Source
Specifies the assembly DLL either in form of a binary literal or as a string literal or static string expression/string variable. The binary literal represents the actual .NET assembly DLL, while the string values represent a URI or file path to a .NET assembly DLL file in either an accessible Azure Data Lake Storage or Windows Azure Blob Storage. If the provided source is a valid .NET assembly, the assembly will be copied and registered, otherwise an error is raised.”

Before going any further, this post isn’t a dig at the usual lack of Microsoft documentation, mainly because when I posted this problem as a question on Stack Overflow the missing information was provided from the horse’s mouth, Mr Michael Rys (@MikeDoesBigData). Therefore, all is forgiven and I’m more than happy to write this post on Microsoft’s behalf and for my fellow developers. #SQLFamily

http://stackoverflow.com/questions/40842170/create-usql-assembly-from-dll-in-azure-blob-storage

Thanks again Mike. Moving on…

Assumptions

Within your Azure subscription, you have the following services already deployed and running:

  • Azure Data Lake Analytics (ADLa)
  • Azure Data Lake Store (ADLs)
  • Azure Storage, with a suitable blob container.

You are also comfortable with referencing assemblies in your U-SQL scripts, and so far have done so either by inlining the compiled assembly in the U-SQL file, or by storing the DLL in ADLs with a simple file path reference relative to the ADLs root directory.
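
For reference, the simple ADLs flavour of the statement looks something like this minimal sketch (the schema, assembly name and file path are placeholders):

// DLL previously uploaded to the ADLs account, path relative to the root directory
CREATE ASSEMBLY IF NOT EXISTS [YourSchema].[YourAssemblyName]
FROM "/Assemblies/YourAssemblyName.dll";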

Granting Access

The most important thing you’ll need to do to get this working, as Mike mentions in the SO answer, is to allow your ADLa service to access the blob storage account. This only requires a few clicks in the Azure portal.

From the ADLa blade choose Data Sources and click Add Data Source.

Populate the drop-down menus on the resulting blade with your preferred choices and click Add. You should then see the storage account listed as an ADLa data source.

Note: the Azure Storage account doesn’t need to be in the same data centre as the ADLa service, unlike ADLa and ADLs, which do need to be co-located.

Create Assembly

Next, the U-SQL.

To reference a DLL in the blob storage account container, we need to create the assembly using a wasb:// URL, like this:

wasb://YourBlobContainerName@YourBlobStorageAccountName.blob.core.windows.net/YourAssembly.dll

Here’s the complete CREATE ASSEMBLY syntax:

CREATE ASSEMBLY IF NOT EXISTS [YourSchema].[PurpleFrog.Pauls.DataLakeHelperFunctions]
FROM "wasb://AllSupportingFiles@MiscBlobsAccount.blob.core.windows.net/PurpleFrog.Pauls.DataLakeHelperFunctions.dll";
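
Once registered, consuming the assembly is no different from the ADLs case. Here’s a minimal sketch; the Helpers.Clean method, input file and column are hypothetical stand-ins for whatever your DLL actually exposes:

// Reference the assembly registered above
REFERENCE ASSEMBLY [YourSchema].[PurpleFrog.Pauls.DataLakeHelperFunctions];

@raw =
    EXTRACT SomeText string
    FROM "/input/raw.csv"
    USING Extractors.Csv();

// Helpers.Clean is a hypothetical method exposed by the assembly
@cleaned =
    SELECT PurpleFrog.Pauls.DataLakeHelperFunctions.Helpers.Clean(SomeText) AS CleanedText
    FROM @raw;

OUTPUT @cleaned
TO "/output/cleaned.csv"
USING Outputters.Csv();

Note that the namespace and class used in the SELECT come from the DLL’s source code, not from the name you register the assembly under.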

Why Do This?

Hopefully pre-empting some comments on this post: given that we can inline the assembly or store it in ADLs, why would you want to put the DLLs in a separate storage account?

Well, this is really just for operational convenience. In a recent project I was working on, we had created a lot of custom code, not just for Azure Data Lake, but also for Azure Data Factory. We therefore used a blob storage account as a support bucket for all compiled code and parent object files. This gave us a centralised place to deploy to, regardless of which service was consuming the libraries. Again, just for convenience: all DLLs in one place for all services.

I hope you found this post helpful.

Many thanks for reading.


Paul’s Frog Blog

Paul is a Microsoft Data Platform MVP with 10+ years’ experience working with the complete on-premises SQL Server stack in a variety of roles and industries. Now, as a Business Intelligence Consultant at Purple Frog Systems, he has turned his keyboard to big data solutions in the Microsoft cloud, specialising in Azure Data Lake Analytics, Azure Data Factory, Azure Stream Analytics, Event Hubs and IoT. Paul is also a STEM Ambassador for the networking education in schools programme, PASS chapter leader for the Microsoft Data Platform Group – Birmingham, and a SQLBits, SQL Relay and SQL Saturday speaker and helper. He is currently the top Stack Overflow user for Azure Data Factory, as well as a very active member of the technical community.
Thanks for visiting.
@mrpaulandrew