Talks

Would you like me to speak at your community event? If so, please get in touch. As a Microsoft MVP and STEM Ambassador, I love sharing my knowledge. Below are the sessions I currently have available, focusing on Microsoft Azure Data Platform services.

 

Azure & Data Platform

These are talks for the technical community at established events for industry professionals.


 Title  Building an End to End IoT Solution Using Raspberry Pi Sensors & Azure
 Duration  1 hour
 Abstract  The Internet of Things is the new kid on the block, offering a wealth of possibilities for data streaming and rich analytics. Using a Raspberry Pi 3 we will take an end to end look at how to interact with the physical world, collecting sensor values and feeding that data in real time into cloud services for manipulation and consumption. This will be a heavily demonstrated session looking at how such an environment can be set up using Microsoft offerings, including: Windows 10 IoT Core, a C# Universal Windows Platform application, an Azure IoT Event Hub, Azure Stream Analytics, Azure SQL DB and Power BI. This is an overview of what’s possible, but one that shows exactly how to build such a simplified solution, in a session that will be 90% demonstrations. It will hopefully add that level of excitement to real-time data, with plenty of hardware out there showing what it can do when set up with Microsoft software.
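
For a flavour of the device-side code involved, here is a minimal sketch of how a UWP application might push a sensor reading up to the hub. It assumes the Microsoft.Azure.Devices.Client SDK; the connection string, device name and reading below are placeholders rather than anything taken from the actual demos.

```csharp
// Minimal sketch: send a simulated sensor reading from a UWP app to an Azure IoT Hub.
// Assumes the Microsoft.Azure.Devices.Client package; the connection string and the
// temperature value are placeholders, not values from the session demos.
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

public static class TelemetrySender
{
    // Hypothetical device connection string, copied from the IoT Hub portal blade.
    private const string DeviceConnectionString = "HostName=...;DeviceId=...;SharedAccessKey=...";

    public static async Task SendReadingAsync(double temperature)
    {
        DeviceClient client = DeviceClient.CreateFromConnectionString(DeviceConnectionString);

        // Serialise the reading as JSON so Stream Analytics can query it downstream.
        string json = $"{{\"deviceId\":\"myPi\",\"temperature\":{temperature},\"readingTime\":\"{DateTime.UtcNow:o}\"}}";

        using (var message = new Message(Encoding.UTF8.GetBytes(json)))
        {
            await client.SendEventAsync(message);
        }
    }
}
```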

 Title Azure Data Lake – The Services. The SQL. The Sharp.
 Duration  1 hour
 Abstract How do we implement Azure Data Lake?
How does a lake fit into our data platform architecture? Is Data Lake going to run in isolation or be part of a larger pipeline?
How do we use and work with U-SQL?
Does size matter?! The answers to all these questions and more in this session as we immerse ourselves in the lake that’s in a cloud. We’ll take an end to end look at the components and understand why the compute and storage are separate services. For the developers, what tools should we be using and where should we deploy our U-SQL scripts? Also, what options are available for handling our C# code behind and supporting assemblies?

We’ll cover everything you need to know to get started developing data solutions with Azure Data Lake.

Finally, let’s extend the U-SQL capabilities with the Microsoft Cognitive Services!
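
To illustrate what the C# code behind looks like in practice, here is a minimal sketch of the kind of helper class a U-SQL script could call from a SELECT clause. The namespace, class and method names are illustrative only, not taken from the session demos.

```csharp
// Minimal sketch of a U-SQL code-behind helper (for example, Script.usql.cs in Visual Studio).
// A U-SQL script in the same project could call this as UsqlHelpers.Cleaning.TrimPostcode(postcode)
// within a SELECT clause. All names here are illustrative.
using System;

namespace UsqlHelpers
{
    public static class Cleaning
    {
        // Normalise a free-text postcode value before it lands in the lake's curated zone.
        public static string TrimPostcode(string value)
        {
            if (string.IsNullOrWhiteSpace(value))
            {
                return null;
            }

            return value.Trim().ToUpperInvariant().Replace("  ", " ");
        }
    }
}
```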


 Title Break Out the Cognitive Abilities for Azure Data Lake
 Duration 5 or 10 minutes
 Abstract Microsoft’s Cognitive Services are basically the best thing since sliced bread, especially for anybody working with data. Artificial intelligence just got packaged and made available for the masses to download. In this short talk, I’ll take you on a whirlwind tour of how to use these massively powerful libraries directly in Azure Data Lake with that offspring of T-SQL and C#: U-SQL. How do you get hold of the DLLs and how can you wire them up for yourself? Everything will be revealed, as well as the chance to see what the machines make of the audience!
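
The talk wires the Cognitive Services assemblies directly into U-SQL. Purely as a flavour of what sits underneath those assemblies, here is a rough C# sketch calling the Computer Vision REST endpoint instead; the region, API version, subscription key and image URL are all placeholders.

```csharp
// Rough sketch only: calling a Computer Vision 'analyze' REST endpoint from C#.
// The session itself uses the Cognitive Services assemblies inside U-SQL; the region,
// API version, key and image URL below are placeholders.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class VisionDemo
{
    public static async Task<string> AnalyseImageAsync(string imageUrl)
    {
        using (var http = new HttpClient())
        {
            // Hypothetical subscription key from the Azure portal.
            http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-key>");

            var body = new StringContent($"{{\"url\":\"{imageUrl}\"}}", Encoding.UTF8, "application/json");

            HttpResponseMessage response = await http.PostAsync(
                "https://westeurope.api.cognitive.microsoft.com/vision/v1.0/analyze?visualFeatures=Tags,Description",
                body);

            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync(); // raw JSON with tags and a caption
        }
    }
}
```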

Title Creating Azure Data Factory Custom Activities
Duration 1 hour
Abstract In this session we’ll go beyond the Azure Data Factory copy activity normally presented using the limited portal wizard. Extract and load are never the hard parts of the pipeline. It is the ability to transform, manipulate and clean data that normally requires more effort. Sadly, this task doesn’t come so naturally to Azure Data Factory as an orchestration tool, so we need to rely on its custom activities to break out the C# or VB to perform such tasks. Using Visual Studio, we’ll look at how to do exactly that and see what’s involved in Azure to utilise this pipeline extensibility feature. What handles the compute for the compiled .NET code, and how does this get deployed by ADF? With real world use cases we’ll learn how to fight back against those poorly formed CSV files and what we can do if Excel files are our only data source. Plus, lots of useful tips and tricks along the way when working with this emerging technology.
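
As a taster, here is a minimal sketch of the shape such a custom activity might take, assuming the ADF (v1) IDotNetActivity interface from the Microsoft.Azure.Management.DataFactories packages; the class name and logged messages are illustrative rather than from the session demos.

```csharp
// Minimal sketch of an ADF (v1) .NET custom activity, assuming the IDotNetActivity
// interface from the Microsoft.Azure.Management.DataFactories packages. The compiled
// class is zipped, uploaded to blob storage and referenced from the pipeline JSON;
// an Azure Batch pool supplies the compute. Names below are illustrative only.
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactories.Models;
using Microsoft.Azure.Management.DataFactories.Runtime;

public class CleanCsvActivity : IDotNetActivity
{
    public IDictionary<string, string> Execute(
        IEnumerable<LinkedService> linkedServices,
        IEnumerable<Dataset> datasets,
        Activity activity,
        IActivityLogger logger)
    {
        // A real activity would read the input blob dataset here, repair the poorly
        // formed CSV rows, and write the cleaned rows out to the output dataset.
        logger.Write("Custom activity started.");

        // Anything returned here can be consumed by downstream activities.
        return new Dictionary<string, string>();
    }
}
```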

Title Working with Azure Data Factory & Creating Custom Activities
Duration 2 hours
Abstract Part 1: Azure Data Factory. This is not SSIS in Azure, but it’s a start for our control flows. Let’s update our terminology and understand how to invoke our Azure data services with this new controller/conductor who wants to understand our structured datasets. Learn to create the perfect dependency driven pipeline with Azure Data Factory and allow your data to flow. What’s an activity and how do we work with time slices? Is a pipeline a pipeline? Who is this JSON person? All the answers to these questions and more in this introduction to working with Azure Data Factory. Plus, insights from a real-world case study where ADF has been used in production for a big data business intelligence solution handling log files for 1.5 billion users.

Part 2: Having covered the basics in part 1, we’ll now go beyond the Azure Data Factory basic activity types and Azure Portal wizard. Extract and load are never the hard parts of the pipeline. It is the ability to transform, manipulate and clean our data that normally requires more effort. Sadly, this task doesn’t come so naturally to Azure Data Factory as an orchestration tool, so we need to rely on its custom activities to break out the C# to perform such tasks. Using Visual Studio, we’ll look at how to do exactly that and see what’s involved in Azure to utilise this pipeline extensibility feature. What handles the compute for the compiled .NET code, and how does this get deployed by ADF? Let’s learn how to fight back against those poorly formed CSV files and what we can do if Excel files are our only data source.


Title Be My Azure DBA (DSA)
Duration 1 hour
Abstract There seems to be a common misconception that once you move from on premises SQL Server to Azure PaaS offerings a DBA is no longer required. This perception is wrong, and in this session I’ll show you why. As a business intelligence consultant, I develop data platform solutions in Azure that, once productionised, need administration. As the title suggests, be my Azure DBA. Maybe not DB for database. Maybe in Azure I need a DSA, a Data Services Administrator. Specifically, we’ll cover a real business intelligence solution in Azure that uses Data Factory, Data Lake, Batch Service, Blob Storage and Azure AD. Help me administer this next generation data solution.

Title Business Intelligence in Azure – The Lambda Architecture
Duration Full Day
Abstract The lambda architecture is here! In this full day of training, learn how to architect hybrid business intelligence solutions using Microsoft Azure offerings. Understand the roles of these cloud services and how to make them work for you in this complete overview of BI developed on the Microsoft data platform.

Module 1 – An Overview of BI in Azure

What’s available for the business intelligence architect in the cloud, and how might these services relate to traditional on premises ETL and cube data flows? Is ETL enough for our big unstructured data sources or do we need to mix things up and add some more letters to the acronym in the cloud?

Module 2 – Azure SQL DB and DW

It’s SQL Server Jim, but not as we know it. Check out the PaaS flavours of our long term on premises friends. Can we trade the agent and an operating system for that sliding bar of scalable compute? DTU and DWU are here to stay.

Module 3 – The Azure Machines are here to Learn

Data scientist or developer? Azure Machine Learning was designed for applied machine learning. Use best-in-class algorithms and a simple drag-and-drop interface. We’ll go from idea to deployment in a matter of clicks. Without a terminator in sight!

Module 4 – Diving into the Azure Data Lake with U-SQL

Let’s understand the role of this hyper-scale two tier big data technology and how to harness its power with U-SQL, the offspring of T-SQL and C#. We’ll cover everything you need to know to get started developing solutions with Azure Data Lake.

Module 5 – IoT, Event Hubs and Azure Stream Analytics

Real-time data is everywhere. We need to use it and unlock it as a rich source of information that can be channelled to react to events, produce alerts from sensor values or in 9000 other scenarios. In this module we’ll learn how, using Azure messaging hubs and Azure Stream Analytics.

Module 6 – Bringing it Together for Consumption in Power BI

Combining all our data sources in one place with rich visuals and a flexible data modelling tool. Power BI takes it all: small data, big data, streaming data and website content, and it even supports gateways back to our office. Here we’ll consume everything created in the previous modules and more.

Module 7 – Orchestrate Your Services with Azure Data Factory

Create the perfect dependency driven pipeline with Azure Data Factory. What’s an activity and how do you work with time slices? We’ll invoke what’s been created in previous modules to bring the end to end solution together in a controllable way. Plus, how to work with the Azure Batch Service should you need that extensibility.


Title Learn to Build Your Own End to End IoT Solution
Duration Full Day
Abstract Real-time data is everywhere. We need to use it and unlock it as a rich source of information that can be channelled to react to events, produce alerts from sensor values or in 9000 other scenarios. Your mind is the only limit here. This Internet of Things, or IoT as it’s commonly referred to, needs to be harnessed, and that is what you will learn to do throughout the day.

You will be provided with a Raspberry Pi 3 and all necessary components (not to keep, I’m afraid), which you’ll learn to configure with Azure services before streaming data into Power BI, SQL DB or blob storage for consumption. You will build this simplified IoT solution end to end for yourself and learn how to get real-time values from the physical world, all using the latest Microsoft IoT offerings.

Ready your internal parser for PowerShell, C# (UWP), T-SQL and JSON as we learn how you could take real-time data to your organisation. Don’t miss this full day of technical training on how you could build IoT solutions.

Components and software you’ll use: Azure IoT Event Hub, Azure Stream Analytics, Azure SQL DB, Raspberry Pi 3, Fez Hat expansion board, Windows 10 IoT Core operating system, C# within the Universal Windows Platform (UWP) framework, Power BI, PowerShell, Device Explorer.

 

STEM

These are talks delivered in schools to students as part of the STEM learning programme. All abstracts here are for guidance. We appreciate that the syllabus is often changing, and talks can be adapted to meet your requirements.


Title An Introduction to T-SQL and Basic Database Concepts
Duration 2 hours
Abstract Structured Query Language is just another set of syntax out there, but one that can lead to a rich career if used correctly. In this session I will teach the basics of the language and bring real world scenarios from the industry into the classroom. Data manipulation code including SELECT, UPDATE, INSERT & DELETE will be covered and demonstrated to students in an informal and interactive manner. Other relational database concepts will be covered to support the understanding of why SQL is needed in the outside world. This talk will be delivered using enterprise grade software (not Microsoft Office) to reinforce students’ realisation that they can keep up with technology if they apply themselves. Students can also be provided with virtual learning environments during the session using the latest Microsoft cloud platform services. Please request this ahead of time.
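
For anyone wanting a feel for those statements before the session, here is a small sketch of a couple of them being issued from C# with ADO.NET, the kind of enterprise grade tooling mentioned above; the connection string, table and column names are entirely made up.

```csharp
// Minimal sketch: issuing some of the data manipulation statements mentioned above
// (INSERT and SELECT) from C# with ADO.NET. The connection string, table and column
// names are entirely illustrative.
using System;
using System.Data.SqlClient;

public static class FirstQueries
{
    public static void Run(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Parameterised INSERT against a hypothetical Students table.
            using (var insert = new SqlCommand(
                "INSERT INTO dbo.Students (FirstName, YearGroup) VALUES (@name, @year);", connection))
            {
                insert.Parameters.AddWithValue("@name", "Alex");
                insert.Parameters.AddWithValue("@year", 12);
                insert.ExecuteNonQuery();
            }

            // SELECT the rows back out and print them.
            using (var select = new SqlCommand(
                "SELECT FirstName, YearGroup FROM dbo.Students WHERE YearGroup = @year;", connection))
            {
                select.Parameters.AddWithValue("@year", 12);
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine($"{reader.GetString(0)} is in year {reader.GetInt32(1)}");
                    }
                }
            }
        }
    }
}
```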

Title Business Intelligence – Is this a career for you?
Duration As required.
Abstract If you are running a careers fair event at your school or college, I’d be happy to present on a stand as Purple Frog Systems and speak with students about how industry is using business intelligence in the real world, where they can get started, and what they need to know to work in the data industry.

 

Paul’s Frog Blog

Paul is a Microsoft Data Platform MVP with 10+ years’ experience working with the complete on premises SQL Server stack in a variety of roles and industries. Now, as the Business Intelligence Consultant at Purple Frog Systems, he has turned his keyboard to big data solutions in the Microsoft cloud, specialising in Azure Data Lake Analytics, Azure Data Factory, Azure Stream Analytics, Event Hubs and IoT. Paul is also a STEM Ambassador for the networking education in schools programme, PASS chapter leader for the Microsoft Data Platform Group – Birmingham, and a speaker and helper at SQL Bits, SQL Relay and SQL Saturday events. He is currently the top Stack Overflow user for Azure Data Factory and a very active member of the technical community.
Thanks for visiting.
@mrpaulandrew