0:00
Hi good afternoon everybody today we are going to see the importance of Azure functions and most
0:11
important I would like to do some kind of practical demonstration of how Azure functions can be used
0:17
for various kind of activities one would want and definitely the benefits of it so yes first what
0:25
exactly are these cloud native applications let's go back in the history we have got applications
0:33
which are hosted in on-premise infrastructure there is a development team who would develop
0:40
the application hand over the applications to the infra team the infra team is supposed to do
0:45
the assessment of the requirement probably the load assessment the software assessment
0:53
the configuration of the machines which are required and so on and on and accordingly provide
0:59
the infrastructure up front yes they have to also take care of monitoring and there is always a
1:06
possibility of kind of surge in the traffic and in those kind of cases if they don't have
1:11
sufficient hardware to back up with the additional ram and the processor definitely the applications
1:17
would crash. So that is when people started migrating now to cloud. Times have changed
1:24
Cloud has become popular. Everything became service. Before I continue, I would just want to
1:30
use this sentence. Anything I buy or anything I sell, I always see either as a product or a service
1:40
For me, everything is a product or everything is a service. And the world is now moving from
1:45
product based to service based we don't want to have a dedicated ownership
1:51
because if I have the ownership yes there is no doubt the ownership benefit
1:55
the privacy benefit but we lose from lot of additional advantages what nowadays
2:02
cloud also provides to the organization so yes hardware is nowadays seen as a
2:09
service we procure hardware in the cloud we create virtual networks we create
2:15
virtual machines and on those virtual machines we would like to deploy our application
2:21
So all our existing on-premise applications when we want to do lift and shift especially
2:28
the legacy applications where probably the developers are not even there in the organization
2:33
today if you want to do lift and shift yes the virtual machine would be the best option anybody
2:39
would opt for. The kind of separate setup we have in on-premise the same kind of setup we can
2:45
create in cloud and we will be up and running through virtual machines we can create web servers
2:51
we can create database servers establish connection between web server and database server yes all
2:58
these facilities are available. You may have a dotnet application, a java application, a python application, any kind of
3:03
application which you have probably hosted in on-premise with zero changes almost no code
3:09
changes we can move it to vms but vms themselves have problem they need maintenance i need to
3:18
ensure that i always have the vm windows update linux updates applied all the security patches
3:26
have to be taken care of yes definitely i have to also take care of the scaling aspect
3:32
when the load increases we will have to probably bring in load balancer into the picture
3:37
we will have to add more virtual machines when the load decreases to reduce the cost
3:42
we will have to decrease the virtual machines and so on so this is the infrastructure as a service
3:50
aspect of the cloud you can use virtual machines today in azure you can use in aws you can use it
3:59
in GCP, wherever you like you can actually have the virtual machines, lift and shift, no code changes,
4:06
deploy. Yes, the advantage which VMs also give is you can go into the virtual machine, remote
4:13
desktop you can do and some kind of tweaks if you want to make at the operating system level
4:17
you would be able to do it. But too much of work for the operations team to maintain it
4:24
So that is when the industry gave us the PaaS services, the PaaS equivalent of hosting our
4:31
applications and websites, in the form of, let us say, Azure App Service if you are using Azure,
4:38
maybe Elastic Beanstalk if you are using AWS. These are the ready-made PaaS services we can
4:43
make use of for deploying our application for developers it is as good as just right click
4:49
and deploy, finish; there is nothing more they'll have to do. Do we need to make code changes? Very
4:57
bare minimum, if any code changes are required at all. So yes, we moved away from virtual machines
5:04
and today lot of applications are using app services but then the development also is changing
5:11
now people earlier were doing a lot of monolithic application development, where we would have
5:18
a single project and in that single project we would program all our modules; we would probably
5:25
have one single database and so on but times have changed we don't want to anymore do monolithic
5:32
application development because it has a lot of drawbacks over a period of time it becomes too
5:38
large to maintain to debug and if it is a single unit anywhere the problem is the whole application
5:46
would be down the complete website would go down many drawbacks are there with monolithic architecture
5:52
So now everybody is moving towards microservices based architecture. So what we do now is a very large project
6:02
We break it into pieces. And these pieces, we develop them individually
6:08
There would be smaller teams now developing smaller components of the application
6:13
And eventually all of them can get together in making a bigger, larger application
6:19
So microservices applications are taking over now. And definitely there is a need for reprogramming the whole solution
6:30
You have to modify your existing code. You have to look into the services which are provided by the cloud for hosting these kinds of applications
6:40
For example, you might use orchestration services. The most popular orchestration tool today is Kubernetes
6:47
You might probably go for on-premise implementation of Kubernetes, but that would be too much of a hassle again. So why not use something like Azure AKS
6:58
Azure Kubernetes Service or similarly in Amazon also we have got Elastic Kubernetes Services
7:05
So people now started rebuilding the applications. They started making the required changes to the
7:13
application so that they are suitable for the orchestration frameworks. But whatever it is
7:20
whether it is a virtual machine or whether it is app services or whether it is AKS, we have to plan our compute resources in advance: so much compute, so much memory, so much processing speed, so much scaling factor, all this has to be planned properly
7:43
but what if there is no load on the application will the cost stop definitely not there should be
7:50
at least one instance of the application always running and that cost is going to be always there
7:56
and yes, there are certain features where we have got an auto scaling feature, but not
8:02
always would we be able to use those things, so there has to be regular monitoring
8:08
based on which one will have to scale up and we'll have to scale down to reduce the cost or let us
8:15
say to keep the cost in control and at the same time provide the best user experience with almost
8:21
zero downtime. Going forward, the applications are now going to be completely rewritten from scratch
8:32
These applications will not probably have any history let us say before cloud. They would
8:40
probably take birth in cloud itself using the services of the cloud only they will be architected
8:48
so yes if new software development is happening today keeping all the new enterprise requirements
8:55
in mind i would like to leverage all the features of the cloud especially the serverless compute
9:04
aspect of it. Now what is this serverless compute? The serverless compute is a facility which, when
9:11
used by developers in building their applications, would not create any worries for the IT
9:19
team at all an application which is developed and deployed into a serverless environment
9:25
will ensure that it is automatically going to scale as per the requirement of the user. That means your
9:32
compute resources the memory and the processor are going to be used only when there is a need for it
9:40
that means when there is traffic automatically these resources will be borrowed from
9:45
the infrastructure used and returned definitely when there is no traffic the resources usage would
9:53
come down to zero and the billing would stop so basically complete infrastructure management
10:00
for the applications which we are developing today would literally go down to zero as we are
10:08
delegating all that to the cloud service provider so these type of applications which take their
10:15
birth itself in cloud is what we call it as serverless cloud native applications and going
10:23
forward this is becoming a big boom. Lot of people are going to even write micro services
10:31
as serverless compute Azure functions or AWS Lambda. That's what is the future all about
10:41
So legacy applications, we did containerization and deployed them into either as app service or
10:50
maybe Kubernetes services; monolithic goes to app service, microservices to Kubernetes. Keep all this aside: build Azure functions directly, provide the required functionality and let the
11:06
complete infrastructure be handled on its own. So what is an Azure function
11:12
Azure function is a serverless compute service. It's a service actually, which can execute your functionality, which can be any business functionality, maybe order processing, which we want to do whenever a new order is posted
11:30
It can be any kind of processing which an individual can write, probably as a simple function
11:37
If you know how to write a simple function, you can deploy that as an Azure function.
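For reference, the simplest kind of Azure function can be sketched roughly like this; it is a from-memory reconstruction of the default C# script (.csx) HTTP-trigger template that the portal generates, so details may differ slightly by runtime version:

```csharp
#r "Newtonsoft.Json"

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Take "name" from the query string, or from the JSON request body if it is not in the query.
    string name = req.Query["name"];
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
```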
11:42
of course the function can make call to many other objects invoke their
11:48
functionalities for implementation and so on but point is it is a serverless
11:54
compute implies the code is going to be given the resources on demand you don't
12:01
have to set up any initial infrastructure for your function you don't have to choose any particular specific number of processors or so much
12:09
amount of RAM, you don't do that at all. So Azure functions are going to run as
12:15
a script or a piece of code in response to a variety of events now what are
12:21
these variety of events? Good, we are going to write a function and let's say we
12:25
deploy it as an Azure function in the Azure cloud, all fine, but how are those
12:29
functions going to be executed these functions are going to integrate themselves with various other as your services maybe I'll post a message into
12:41
a queue and in response to that message in the queue we want an Azure function
12:46
to execute maybe a file is uploaded into the blob and the file would probably
12:54
need to be processed further and I would execute an Azure function. Some event
12:59
has been captured by an event grid or a stream of messages posted into the event
13:06
hub a notification has been raised through a notification hub or maybe you
13:12
might have used service bus enterprise based topics and queues any of these
13:18
things which you are seeing here can become a trigger point for an Azure
13:23
function. Definitely, Azure functions can also be invoked using an HTTP trigger, that means when you
13:33
want you can use a http url and trigger an azure function in short an azure function is some
13:40
ready-made functionality which can be executed on demand because of a trigger which would be
13:49
occurring based on an action on so many other cloud services. One use case which I can give you here is a simple diagram actually though it is not very
14:03
sophisticated something which I actually demonstrate when I'm teaching a full-fledged course in my curriculum. Let us say we have a web application onto which the user is uploading all
14:16
information about customers the basic details about the customer is uploaded and one of the
14:27
information would be the photograph of the person the customer photograph or person photograph that
14:35
is also being uploaded here we want to save that photograph so here we are going to write code
14:41
for pushing that photograph into a blob container. In a storage blob container I would upload the photograph
14:50
And at the same time with the information about the blob container and also the ID of a record saved in the database table i create an object convert it into json format and store it as a message in the queue
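A hedged sketch of that producer side, the web application pushing the photograph into a blob container and a JSON message into a queue, using the Azure.Storage.Blobs and Azure.Storage.Queues SDKs; the connection string, the container name "customer-photos" and the queue name "photo-jobs" are placeholders, not names from this talk:

```csharp
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Queues;

public static class CustomerPhotoUploader
{
    // Called by the web application after the customer record is saved to the database.
    public static async Task SavePhotoAndEnqueueAsync(string connectionString, int customerId, string fileName, Stream photo)
    {
        // 1. Push the photograph into a blob container.
        var container = new BlobContainerClient(connectionString, "customer-photos");
        await container.CreateIfNotExistsAsync();
        var blob = container.GetBlobClient(fileName);
        await blob.UploadAsync(photo, overwrite: true);

        // 2. Post a small JSON message (record id + blob info) into a queue,
        //    which an Azure Function can later pick up and process in the background.
        var queue = new QueueClient(connectionString, "photo-jobs");
        await queue.CreateIfNotExistsAsync();
        string message = JsonSerializer.Serialize(new { id = customerId, blobName = fileName, blobUrl = blob.Uri.ToString() });
        await queue.SendMessageAsync(message); // depending on the trigger's settings, Base64 encoding may be required
    }
}
```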
15:10
what is the advantage as soon as a message comes into the queue i would want an azure function to
15:15
get triggered and this azure function when triggered would retrieve that message and
15:22
of course the message is going to have the ID of the person from the database
15:27
it will extract other details if needed the trigger is going to have the URL of
15:32
the blob which it will fetch now based on the URL the azure function can read
15:37
the complete blob from the container, and probably from that image create a
15:42
thumbnail or as you might have seen probably write code to extract text from
15:48
that image, if at all you are using the AI features which Ankit has demonstrated
15:55
to you in the previous session and then with that data it might save it into the
16:01
database or it might again create one more image let's say it created a
16:05
thumbnail image or probably on that image one watermark is added and the
16:11
watermark image is put back into the blob container so users are uploading
16:17
images, the site is creating watermark on those images and that task we can
16:23
simply run in background rather than doing it in the main application itself
16:29
The user experience here is going to be fantastic because the user doesn't have
16:34
to wait for all this activity of reading the image, reading the data about the
16:41
user, putting probably that data as a watermark on the image, saving it
16:46
back onto the blob container all that doesn't have to be done and azure function can do all
16:52
these things in background and most important azure function also integrates with a web jobs
16:58
sdk there is a web jobs sdk api with which if you build azure function lot of coding efforts can be
17:06
reduced we can do some kind of binding binding of azure functions to the queue will not require us
17:13
to write code for reading the message from the queue binding would not require
17:17
us to write code for probably putting the image back into the queue all that
17:22
can be taken care of through basic attribute programming.
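To make that "attribute programming" point concrete, a compiled (class library) function for this use case could be declared roughly like the sketch below; the function name, queue name, container paths and the CustomerMessage class are all made up for illustration:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class CustomerMessage
{
    public int Id { get; set; }
    public string BlobName { get; set; }
}

public static class ProcessCustomerPhoto
{
    [FunctionName("ProcessCustomerPhoto")]
    public static void Run(
        [QueueTrigger("customer-photo-jobs")] CustomerMessage msg,            // runs when a message arrives
        [Blob("photos/{BlobName}", FileAccess.Read)] Stream originalPhoto,    // input blob, path resolved from the message
        [Blob("thumbnails/{BlobName}", FileAccess.Write)] Stream thumbnail,   // output blob for the processed image
        ILogger log)
    {
        log.LogInformation($"Processing photo {msg.BlobName} for customer {msg.Id}");
        // ... create the thumbnail / add the watermark here and write it to 'thumbnail' ...
        // No code is needed to read the queue message or open the blobs; the bindings do that.
    }
}
```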
17:28
So this is one proper use case one can visualize. Of course we don't have sufficient time to
17:34
do the complete use case here but I would like to definitely show to you
17:38
people how the azure functions can be hosted in cloud before that azure functions have three types
17:46
of pricing tiers: the app service plan, the consumption plan and the premium plan. There are three plans
17:53
which are supported by azure function now what is the basic difference between all the three
17:59
so if you take app service plan most of you people if you have ever worked with azure app services
18:04
you understand what is an app service plan a fixed amount of resource
18:09
allocated to your app service, or now in this context your Azure function, so you know
18:15
definitely at the end of the month this is the amount of bill I'm going to get
18:20
from Microsoft there will not be any additional cost but everything has to be
18:25
accommodated in that particular infrastructure as per the plan setup. So yes, we can set up Azure functions with an app service plan, but the limitation of this
18:47
is the function can only scale to the extent of the amount of resources given to the app
18:53
service plan it would not be able to scale beyond it but advantage fixed monthly pricing
19:00
fixed monthly bill you know predicted how much is the amount you are going to pay
19:05
second plan is premium plan very similar to the app service plan only but you get better
19:12
performance in premium plan you get more additional resources if needed you can go up to four core
19:19
instances definitely here also predictable pricing but larger sizes unlimited execution duration
19:28
vNet connectivity for security these kind of additional benefits we are going
19:33
to get in premium plan over and above app service plan but the most beautiful
19:39
part of Azure Functions is the consumption plan, the serverless compute. When an Azure
19:46
function is deployed on the consumption plan it can be treated as serverless compute
19:51
yes you don't have to allocate or premeditate this is the amount of memory
19:57
needed this is the amount of processor needed you don't have to worry about it at all you just choose
20:03
consumption plan and leave it to azure to allocate resources to your azure functions as and when
20:09
there is a load on it when it receives the request from different users let's say huge
20:19
amount of messages are posted into the queue to handle all those messages automatically azure
20:25
function is going to scale up and when there are no messages in the queue automatically the azure
20:30
function is going to scale down and it can even go to the extent of zero memory zero processor
20:37
to stop billing completely so this is the kind of benefit which azure functions provide
20:44
in building modern cloud native application no projection on cost you only pay for what
20:52
people use your services. About the different versions of the Azure Functions runtime, yes, there is definitely a
20:59
limitation on the consumption pricing tier. Right now it's 3.x, no more in preview, it's finally
21:06
released. Maximum, you can have one azure function running for 10 minutes, the 10 is in minutes;
21:14
within 10 minutes your azure function is supposed to complete the task
21:19
with an exception if you are using HTTP trigger in that case your function should end in 230 seconds
21:26
and the reason behind that is if it is HTTP trigger it comes via load balancer the request
21:32
to Azure function would come by a load balancer and load balancer would time out after 230 seconds
21:38
and the response would not be sent to the client because the connection would be broken
21:43
so by default an Azure function in consumption pricing tier is five minutes which we can extend
21:50
up to 10 minutes. 10 minutes of dedicated execution for one task, that is too much time, actually speaking.
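As a side note, this timeout is controlled by the functionTimeout setting in host.json; a minimal sketch, with the value set to the consumption-plan maximum of ten minutes:

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```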
21:57
Yes, if time beyond this is needed, don't use an Azure function; there is another service
22:03
called as Azure batch for large time taking tasks Azure batch is an appropriate service
22:10
for long, time-taking tasks; there is no way you should do it in an Azure
22:15
function. Yes, if you go for the app service plan, the default is 30 minutes; you can make it
22:21
unlimited, but as I told you, you are limited by the amount of memory and processor you are allocated; here you are not limited, it can scale to any factor. This is how you would probably change it. So let us very quickly see how we can create an azure
22:36
function there are multiple ways of creating an azure function you can create an azure function
22:41
in portal you can create an azure function through a studio also to keep it simple here
22:48
i will go for creating an azure function for which first we'll have to create is a function app
22:53
start with creating a function app and give some resource group name
23:05
give some name you decide now into the azure function you want to host a container
23:20
or you would like to put direct code you can definitely put docker container but i want to
23:28
go with code and you can see there are different options not just one language multiple language
23:33
i'm going with dot net you can choose the version of dot net which you want right now only 3.1 is
23:41
supported very soon probably they are going to also support five but for azure function at this
23:48
point of time only 3.1 is supported. You may have to create a storage account in background
24:08
This is where actually the definition of your Azure function is going to be stored
24:13
You can choose between Linux or Windows, it's your choice. And the most important the pricing tier
24:21
The consumption serverless pricing tier or the function with premium pricing tier or
24:27
app service plan. Yes, if you go with premium pricing tier, you get lot of options
24:33
Elastic pricing tier. Based on the location, it is supporting Elastic premium
24:41
Lot of 7 GB memory, 14 GB memory, 840 ACU, pretty expensive also
24:55
Whether your function have load or they don't have load, this much cost you are going to incur
25:02
So I don't apply this, I simply go for the consumption pricing tier and that's it, nothing
25:08
to be chosen now. You may integrate with your App Insight. Because I selected Windows platform, it has built in capability to do integration with
25:20
App Insight. I am not interested in that. If you would have chosen Linux, to integrate with App Insight, developer will have to write code
25:31
Through SDK, they would be able to capture the telemetry data and log it into App Insight
25:39
So this is going to create a function app. Function app is just a host for Azure function
25:47
In one function app, we can have multiple Azure functions added. And whenever the scaling has to happen
25:54
it's always the function app instance which is going to scale. So first time, a function app will be loaded into memory
26:02
And in that the functions are executing one request, two request, three request, four request
26:07
But at a point when it finds that the function app is not able to handle all the requests
26:14
automatically a new function app instance will be created. And in that new function app
26:20
the new functions are going to execute parallel. All this is self-managed by Azure infrastructure
26:28
in background, we don't have to do anything. That's the beauty. So we will wait for the Azure function app to get created
26:36
And in that function app for beginning, I would probably write a very small function
26:42
Let me also at the same time create one storage account. I have something which I'll use it
26:52
And into this storage account, I would like to create a queue. Posting a message in this queue, I would like to trigger a function
27:02
so let's call it as some demo queue only yep my azure function is also ready queue is also ready let us begin with first writing an azure
27:21
function which probably gets triggered per a http call a http trigger function
27:35
select that click on add and i have chosen here develop in portal so in portal itself
27:43
the function can be edited you can write the complete function in portal itself
27:48
yeah for the sake of demo I don't want to make any changes I just want to show
27:55
you that this function is in C sharp language with the extension .csx, as you
28:01
can see the extension is .csx. When you write on top #r, it basically
28:08
means that you are trying to refer to a nuget package so automatically this
28:12
nuget package will be downloaded in the infrastructure where your Azure function has to execute, and the rest is all straightforward C sharp. At a high
28:22
level this particular function is receiving an HTTP request object because it is an HTTP trigger, and from the request object we are extracting the name, the name of
28:33
something which is added as query string or the data might be provided in the
28:40
form of request body so name is either taken from the query string or from
28:45
the request body, and it is appended to hello along with some additional message, but if the name is
28:52
not provided some other message is printed i'm not making any of these changes but this is where
28:57
you can write code to probably read an image from the blob and do some kind of image processing
29:07
like i told you generation of thumbnail or adding of watermark anything you can do and
29:15
here only you would write code to probably save the image back into the
29:19
blob storage as well as update the database record we want to execute this particular function to invoke this function we can use this get function
29:29
url so this is the url which i'll give it to the developers of
29:38
wherever we want this function to be invoked there we can provide a link to that url
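For context, such a function URL typically has roughly this shape; the app name, function name and key below are placeholders:

https://&lt;function-app-name&gt;.azurewebsites.net/api/&lt;function-name&gt;?code=&lt;function-key&gt;&amp;name=Sandeep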
29:45
So if I show you that URL, it's a very basic one. This is the endpoint of the Azure function
29:53
API, the name of the Azure function app, sorry, the name of the Azure function and this
29:59
the key this is the security key only the users who have access to this key
30:05
will be able to invoke this particular function so where is this key actually
30:12
here you can see there is a functions key concept so the key has come from
30:17
there so now I'm going to copy this whole URL you can see by default it's
30:26
hidden I'm going to append to it let's say my name copy the whole URL put it in
30:38
the browser directly and the name has come don't give the name it will give
30:49
you the other output yeah the functionality of the Azure function in demo is very simple nothing great but you would understand here very clearly
30:57
right now this method is being invoked using HTTP get you can definitely
31:02
invoke this azure function using HTTP post and when you are invoking using
31:07
HTTP post from request body you can get all the data you can specify what
31:13
methods can be used for invoking this Azure function, you can restrict those
31:17
methods rather than allowing all the methods you may configure and restrict it go to monitor
31:23
sorry integration and there you can see this particular method i am supporting only on
31:30
get and post http request click on that get and post are the only two methods i am supporting
31:40
in case I want to support other HTTP methods I can do that as well so this
31:47
information about binding is all there in another file along with the .csx file,
31:57
that is a JSON file, function.json; the complete binding information is available
32:06
here how the data will be given from the request to the azure function is controlled by this by this
32:16
authentication level the parameter the first parameter name is supposed to be request
32:23
the type of trigger is http and so on not really returning anything concrete just http
32:30
return type with a string, that's it. So this is a very simple HTTP triggered function.
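A typical function.json for such an HTTP-triggered function looks roughly like this (reconstructed from memory, so field details may vary slightly):

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [ "get", "post" ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}
```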
32:37
Now I would like to take something more interesting, which is a queue triggered function with Azure Storage.
32:50
we already have one storage account we can directly connect to that storage account itself
33:00
you can see that this particular function right now is only taking the
33:05
message from the queue and putting it in the log. So here we can see the logs. Now the data type has changed. So now I would like to
33:25
go to portal in one more time here there I'll post a message into the queue
33:32
ideally speaking a program has to be written, a C sharp or Java program has to be
33:36
written which is supposed to post a message into the queue but for this demo
33:41
I'm directly manually posting the message into the queue, in the Azure function
33:48
app storage account. How do I know this is the queue? You can actually check here: if you
33:56
go to integration, the Azure Queue Storage trigger is linked to a queue; where is the queue?
34:11
We have created demo queue. Do that. Demo queue, as per the connection string managed by AzureWebJobsStorage. Where is this AzureWebJobsStorage? Save this. You actually find this AzureWebJobsStorage
34:29
as an app setting in your Azure function: go to app settings of
34:37
your Azure function configuration app settings and this is where the web jobs storage is and this web jobs storage is by default referring to
34:56
dss function app storage so that is the reason i'm going to this particular storage and in that
35:04
storage i'm posting the message into a queue which is also configured here itself where
35:13
whatever I have done in the integration tab all that as I said will be visible
35:17
to us in function.json: demo queue. So let's see, I go back to run.csx, open
35:31
the log post a message into the demo queue I didn't use this you can
35:49
definitely use that by changing the connection string into the demo queue I
35:54
would like to post a message so let's see this in the background
36:01
add a message this thing okay and you'll see the testing has come here
36:16
so keep posting the messages here whatever message you post here you will see that the function is executing in background
36:28
and point is how much code did we write hardly anything not much of coding we
36:36
have done we did not write at least code for retrieving the message from the queue
36:40
all that is happening because of the data binding which is supported here
36:58
It's all because of this built in data binding feature. So yes we have seen how a function is executing when a message is posted here
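For reference, the queue-triggered function at this stage is essentially the default template plus its binding; a rough reconstruction from memory (the queue name "demo-queue" is assumed, since actual queue names cannot contain spaces):

```csharp
using System;
using Microsoft.Extensions.Logging;

public static void Run(string myQueueItem, ILogger log)
{
    // The queue message is handed to us already read; no queue-reading code is needed.
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}
```

and its function.json:

```json
{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "demo-queue",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```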
37:14
But can this be a filename which is probably uploaded into a blob container
37:22
Yes, I can give the file name here from which it should read the content and let us see
37:35
I am going to add now to this Azure function under integration an input parameter
37:52
And here the input parameter in container in this particular blob, we would like to post the content
38:07
That's the point. So let us say I give here a file with extension .txt
38:15
But what is that name? The name has to be something. so what is that name we have to define now so that from that particular file the content will
38:26
be read and given to my parameter fileContent. I am going to write now a parameter to my function called fileContent. A small change I am supposed to make now:
38:58
add here string, and that's it. We'll have to make a change now, because name is a parameter;
39:20
we will have to write here a class ideally a public class in which we are
39:30
going to have public string name with get and set and of course you may want to put one more parameter if you
39:43
want, with get and set. This class, let us say, is, well, some name you can give it, and
39:55
here I'll have a variable which is of type Demo. The point is this class has a
40:06
property called Name, and from that name.txt file the content is going to be
40:13
assigned to this let's see if that's working so file content that's it let's
40:32
save this so I am going to do now is upload a file by name let's change a bit
40:44
so that it's more meaningful imagine this is person and this also should be
40:50
person I'm going to put an object which is of type person into the queue here I
40:57
I'm printing the name of the person along with let's say the ID of the person name and
41:10
ID and I'm also printing the content of the file which will be based on the person name
41:17
which is there in the blob storage, and coding is zero: no coding to read from the blob, no coding to read from the queue, nothing; all that is going to work because
41:29
of the data binding feature save this yes some errors fantastic delimiters have to be closed
41:41
compilation is successful cool so first i'm going to do is create one file
42:06
and let's say I'll save this as sandeep.txt, and this file I am going to first upload into a container
42:25
so here i should create a container now what should be the name of container
42:29
whatever name we have given here in the container path of the binding; create that container
42:41
Let's assume that the file is already there in it. So I'm going to upload the file here
42:58
And now it's time to post a message into the queue
43:14
Into the demo queue, I would like to post a message and this is supposed to be a JSON
43:18
object which will have name and the name is supposed to be Sandeep
43:29
ID you can give any ID so equivalent of the object which is here of course right now I'm
43:49
manually posting it but ideally speaking this is supposed to be done through another application
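The message being posted into the demo queue is, roughly, this small JSON object (the id value is arbitrary, and depending on how the message is posted the body may additionally need to be Base64 encoded):

```json
{ "name": "Sandeep", "id": 1 }
```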
43:53
an application is supposed to post a message which is a JSON equivalent of
43:58
this and that object data will get assigned to this from that name property
44:06
will be picked up, the name.txt content will be assigned to this from the input
44:12
container, and we will see that all that data gets printed here. Let's see, cool,
44:19
the message is posted yep let's see all that it's fast over actually
44:28
Sandeep, 1, and the content of the file is "this is a demo file".
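Putting the whole demo together, the final run.csx and function.json look roughly like this; it is a from-memory reconstruction, and the queue name "demo-queue" and container name "incontainer" are assumptions:

```csharp
using System;
using Microsoft.Extensions.Logging;

public class Person
{
    public string Name { get; set; }
    public int Id { get; set; }
}

public static void Run(Person person, string fileContent, ILogger log)
{
    // 'person' is deserialized from the JSON queue message by the queue trigger binding;
    // 'fileContent' is filled by the blob input binding from <name>.txt in the container.
    log.LogInformation($"Name: {person.Name}, Id: {person.Id}");
    log.LogInformation($"File content: {fileContent}");
}
```

and its function.json:

```json
{
  "bindings": [
    {
      "name": "person",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "demo-queue",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "fileContent",
      "type": "blob",
      "direction": "in",
      "path": "incontainer/{name}.txt",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```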
44:35
So this is the beauty of Azure Functions: with almost bare minimum coding we are able to
44:43
achieve what we want; that is why these are getting popular day by day. Yes, we
44:54
can see that for azure functions there will be one host.json file where we
45:00
are going to do all the configuration required, globally applicable to all the
45:05
Azure functions, and for every function there will be a JSON file where binding
45:11
information is present and run.csx where you can actually write the code
45:17
Azure functions otherwise use the same infrastructure that we have for app
45:22
services. You can write timer trigger function specify the timer in cron
45:30
expression format so that at a particular point of time this time this
45:35
minute this second the function should execute like that we can control it
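A timer-triggered function is equally small; a rough sketch, where the NCRONTAB expression "0 */5 * * * *" means every five minutes:

```csharp
using System;
using Microsoft.Extensions.Logging;

public static void Run(TimerInfo myTimer, ILogger log)
{
    log.LogInformation($"Timer trigger fired at: {DateTime.Now}");
}
```

and its function.json:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ]
}
```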
45:40
we can have key value pairs added I showed you already to the application settings and
45:48
read that application settings if you want so here you see I have taken the similar example
45:55
employee ID name employee class is a parameter here this is binded to the file which is based
46:04
on probably the name of the employee object which is posted here as parameter into the
46:11
queue an item will be posted in employees container based on the ID the file name is
46:16
present, and so on. You can also write Azure functions in Visual Studio
46:26
You have a template directly available for writing an Azure function in Visual Studio.
46:31
Once you write it in Visual Studio, you can simply right click and do a deploy
46:36
So yes, with this, I come to the end of a small introduction to what exactly is an Azure
46:44
function and how it can be used for writing microservices based functions, which can be
46:51
invoked using URL or probably functions functionality, which can get invoked because of some kind
47:00
of asynchronous communication where via messages into the queue or probably the messages in the
47:06
service bus event hub all this infrastructure we can use for executing the functionality
47:13
in azure function thank you