Change styles.css to styles.scss under build styles.
Add inlineStyleLanguage: scss under test options.
Change styles.css to styles.scss under test styles.
Now our angular.json is updated and uses SCSS instead of CSS.
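For reference, the relevant parts of angular.json should end up looking roughly like the following. This is a trimmed sketch assuming the default project layout; your project name and the other options will differ.

"architect": {
  "build": {
    "options": {
      "inlineStyleLanguage": "scss",
      "styles": ["src/styles.scss"]
    }
  },
  "test": {
    "options": {
      "inlineStyleLanguage": "scss",
      "styles": ["src/styles.scss"]
    }
  }
}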
Update CSS files for SCSS extension
Now we need to change all CSS files to use the .scss extension.
This is pretty simple: just rename all .css file extensions to .scss.
For example:
styles.css to styles.scss
and
app.component.css to app.component.scss
Similarly, change all extensions under all the components.
Update components to use SCSS
Update all components to use .scss files instead of .css in their styleUrls, as in the sketch below.
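For example, a component decorator should now reference the renamed file in styleUrls. This is a minimal sketch using the default app component:

import { Component } from '@angular/core';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  // point to the renamed SCSS file instead of the old CSS file
  styleUrls: ['./app.component.scss'],
})
export class AppComponent {}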
And that's it: our Angular application is now ready to use SCSS styles instead of CSS styles.
Next, we will update the app to use Bootstrap SCSS instead of Bootstrap CSS.
Update to use Bootstrap SCSS instead of Bootstrap CSS
It's possible you are using some third-party styles such as Bootstrap, so let's update Bootstrap to use SCSS. A similar approach can be used for other style packages.
Assuming you have installed bootstrap via npm using
npm i bootstrap
You will have a bootstrap/scss folder under node_modules.
Under that, you will find the file bootstrap.scss.
Add a reference to this file in the angular.json file as follows.
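A sketch of the styles array under the build options, assuming Bootstrap was installed into node_modules as above:

"styles": [
  "node_modules/bootstrap/scss/bootstrap.scss",
  "src/styles.scss"
]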
The above is for the build section; update the test section too.
Congratulations!
You have successfully updated the Angular 13 app to use SCSS instead of CSS styles.
Select the Microsoft Dataverse connector and choose the trigger
“When a row is added, modified or deleted”,
because this flow will trigger on deactivation, which is an update of the account.
Update Trigger properties
Set the trigger properties as follows:
First of all, rename the step to Account, or anything else meaningful, to easily identify this step.
Then we need to fill in the step properties.
Change type: Modified
This defines the change type on which this flow will trigger.
The other options are as follows:
Table name: Accounts
This option defines the table on which we want the flow to run. Any table can be selected.
Scope: Organization
This defines the scope of the flow, i.e., for which users this flow will trigger.
Select Organization if this flow should trigger for all users.
Select columns: statecode
Specify a comma-separated list of columns. The flow will trigger if any of them are modified.
Filter rows: statecode eq 1
We are specifying the flow to run when statecode is 1, which is the inactive status for an account. Here you specify an OData-style filter to determine the eligible rows.
Run as: Modifying user
Specify under which user context the flow will run.
So far, it should look like the following.
Retrieve child opportunities
Next, add the Dataverse List rows action as follows.
Rename the step to List opportunities, to easily identify the step.
Update the List opportunities step properties as follows:
Table name: opportunities
Specifies which table's records we want to retrieve; opportunity in this case.
Select columns: name
Specify the columns we want to retrieve; it's a good idea to retrieve only the required columns. In this case we are retrieving the name column only.
Filter rows: _parentaccountid_value eq [Account from dynamic content]
Specify an OData-style filter to filter the rows.
Here we have specified to retrieve only the opportunity rows whose parent account id matches the triggering account record id.
We can specify more properties as needed, but for this example we are leaving the rest as is.
Analyse value, body and body/value – item from the List rows step
The List rows step returns the following dynamic content.
value: returns the array of records from the specified table in JSON format.
body: returns the same array of records along with some other properties.
body/value – item: contains a single instance of the array item, which is a single opportunity record in this case.
Let's analyse each of them.
Add three Compose steps, one for each of them.
Notice that when we add body/value – item, an Apply to each step will be added automatically, and the Compose step will be nested inside it with the current item dynamic content.
Add one more Compose step inside Apply to each to extract the opportunity id.
Set the expression as follows to get the opportunityid from the opportunity.
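As an illustration, the expression typically looks like the following, assuming the loop step keeps its default name Apply to each (adjust the step name in the expression if you renamed it):

items('Apply_to_each')?['opportunityid']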
So far, this should appear like this.
Update each opportunity record
Next, suppose we want to update the current opportunity record, i.e., the current item in the loop. We can do that as follows.
Add a Dataverse – Update a row step inside Apply to each,
and set the Table name and Row ID properties as follows.
The Row ID value here is the Outputs of the opportunity id Compose step, selected from the dynamic content window.
Here we are specifying:
Table name: opportunity
We want to update the opportunity record.
Row ID: the id of the opportunity record we want to update; the current item's id in this case.
We can set any other properties we want to update on this record, and the same will be updated in Dataverse.
It should appear like this by now.
Save this flow.
Analyse flow input / output
To test the flow and analyse the input/output, deactivate an Account record having a few opportunities.
Account step
On clicking Show raw inputs,
we can see the parameters passed to Dataverse.
On clicking Show raw outputs:
List opportunities step
On clicking Show raw inputs,
we can see the filter applied while retrieving the list of opportunities.
On clicking to download the JSON, we can analyse all the data returned from Dataverse.
The value dynamic content contains all the returned rows.
Show raw outputs
Compose Step – Body
Show raw inputs
The body dynamic content contains all the returned rows along with some other body properties.
Show raw outputs
Apply to each step
Show raw inputs
We can see a single opportunity record inside Apply to each.
Apply to each – opportunityid
Apply to each – Update a row
Show raw inputs
We can see which parameters were passed in the update request to Dataverse.
Show raw outputs
Conclusion
In this post we learned how we can loop through a list of child records. At each step we analysed the inputs and outputs to visualize what is going in and out.
Generate production www web app folder from the Ionic app
Use the following command in an existing Ionic project to generate the production web app www folder.
ionic capacitor build browser --prod
This will generate a www folder, which we need to publish to Firebase Hosting.
Prepare Firebase Hosting project to publish the web app
Install Firebase CLI
Install the Firebase CLI globally, if it is not already installed.
npm install -g firebase-tools
Login to Firebase through the Firebase CLI
Log in to Firebase through the Firebase CLI using the following command. This will open a login screen in a browser window. Log in there.
firebase login
Create Firebase hosting project
Create a new directory for the Firebase hosting project.
mkdir hosting
Switch to the directory.
cd hosting
Initialize Firebase hosting project
Enter the following to initialize the Firebase hosting project.
firebase init
Choose Yes to initialize the Firebase project.
Choose the following Hosting feature from the listed options:
Hosting: Configure files for Firebase Hosting and (optionally) set up GitHub Action deploys
Next, select Use an existing project; in my case I already had an existing project.
Then select your project from the list.
The next question is:
What do you want to use as your public directory? (public)
Type www.
Next question
Configure as a single-page app (rewrite all urls to /index.html)? (y/N)
Choose Y
Next question
Set up automatic builds and deploys with GitHub? No
Press Enter, and the Firebase initialization will complete.
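After initialization, the generated firebase.json should look roughly like the following (a sketch; the exact contents may vary with the CLI version):

{
  "hosting": {
    "public": "www",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"],
    "rewrites": [
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}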
Copy www folder to Firebase hosting project
In the hosting folder, a www folder will have been created.
Replace this folder with the www folder generated above from the Ionic production web app build.
Publish the web app
Finally, enter the following command to publish the web app.
firebase deploy
The web app will be published to Firebase Hosting and you will be presented with the published web app URL.
Congrats! We have successfully published the web app from the Ionic project to Firebase Hosting.
Conclusion
In this post we learned how to publish an Ionic web project to Firebase Hosting.
First we generated the www folder from the Ionic app. Then we created and prepared a new project for publishing to Firebase Hosting. In the Firebase Hosting project we copied in the www folder from the Ionic app and then finally published it using firebase deploy.
Often we need to process an array in batches rather than processing the whole array at once. For that purpose we may need to split the array into chunks of a specified size.
Let's see how we can split an array into chunks of a specified chunk size.
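Here is a minimal sketch of a generic chunk helper in TypeScript (the function name chunkArray is just illustrative):

// Split an array into chunks of at most chunkSize elements.
function chunkArray<T>(array: T[], chunkSize: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    // slice copies the elements without mutating the original array
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: 7 items in chunks of 3 -> [[1, 2, 3], [4, 5, 6], [7]]
console.log(chunkArray([1, 2, 3, 4, 5, 6, 7], 3));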
The Firebase CLI provides a Cloud Functions emulator which can be used to run Firebase Cloud Functions locally before deploying to production. The following are the types of functions which can be emulated:
HTTP functions
Callable functions
Background functions triggered from Authentication, Realtime Database, Cloud Firestore, and Pub/Sub.
We will test a simple HTTP function which we created in the last post.
The Firebase emulator is included in the Firebase CLI, so we need to install it, or update it to the latest version.
npm install -g firebase-tools
Setting Up Admin Credentials for Emulated Functions
If your cloud function requires interaction with Google APIs or Firebase APIs via the Firebase Admin SDK, then you may need to set up the admin credentials.
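A typical setup, assuming you have downloaded a service account key file from the Google Cloud Platform console (the file path below is illustrative), is to point the GOOGLE_APPLICATION_CREDENTIALS environment variable at that key before starting the emulator:

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/serviceAccountKey.json"

On Windows, use set instead of export.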
The Firebase emulator, which is included in the Firebase CLI, lets us test the function locally before deploying to the cloud.
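Typically the functions emulator is started with a command like the following; it prints the local URL for each emulated HTTP function (by default something like http://localhost:5001/<project-id>/us-central1/helloWorld):

firebase emulators:start --only functions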
In this post we started with installing the Firebase CLI and then we set up the Google account credentials. The credentials require a JSON file with a private key, which was generated in the Google Cloud Platform. Then we started the Firebase emulator, browsed the local function URL, and received a response from the local function.
Hope you liked this; please share your feedback or any queries.
Firebase Cloud Functions let you run a piece of code in the cloud without managing the servers. This is quite helpful when you just want to manage your code and not worry about the servers executing it. This pattern is also known as serverless architecture.
These cloud functions can be triggered by multiple events, such as an HTTP request, a scheduled time, or changes in the Realtime Database, and perform the intended job.
In this post we will see how we can create a simple Firebase Cloud Function using the Firebase CLI and then deploy it to Firebase. We will use TypeScript to write the function.
Let’s create and deploy a Firebase cloud function.
Installing Firebase CLI
Firebase Cloud Functions are created and deployed using the Firebase CLI, so let's install the Firebase CLI globally. Type the following command at the command prompt.
npm install -g firebase-tools
Login to Firebase CLI
We need to authenticate to Firebase to create and deploy cloud functions. Authenticate to Firebase using the following command.
firebase login
This will open a browser window to authenticate.
If you are logged in with another account, then you can log out first using the following.
firebase logout
Choose the account to login.
Click Allow to grant permissions to the Firebase CLI.
On clicking Allow, the following success screen will be presented,
and in the command prompt a message like the following will be logged.
Creating a Firebase Cloud Function Project
Create a directory for the project.
mkdir firebase-function-demo
Change to the project directory.
cd firebase-function-demo
Open the directory with Visual Studio Code or any other editor.
code .
Initialize Firebase functions project
firebase init functions
Accept the confirmation.
Choose the appropriate option for you. In my case I chose “Use an existing project“, because I had already created the Firebase project.
Next I chose the project from the presented list.
For this example we are going to use TypeScript, so choose TypeScript.
Choose Y if you want to use ESLint.
Select Y to install the dependencies.
Your project structure should appear like this so far.
Creating a Cloud Function
We will use the sample helloWorld cloud function created by the Firebase CLI for this example.
Open the generated functions/src/index.ts.
index.ts contains a commented sample function.
Uncomment the code, and save the file.
The file contains one sample HTTP request based cloud function, which will log the following.
"Hello logs!", {structuredData: true}
and return the following response.
"Hello from Firebase!"
In the same file, more functions can be added. For example, we can add another scheduled function like the following.
export const scheduledJob = functions.pubsub.schedule("every 5 minutes")
  .onRun(() => {
    console.log("This will be run every 5 minutes.");
    return null;
  });
This scheduled function will run every five minutes.
A cron expression can also be used, like the following, to define the trigger time for scheduled functions.
export const scheduledFunctionCrontab = functions.pubsub.schedule('5 11 * * *')
  .timeZone('America/New_York') // Users can choose timezone - default is America/Los_Angeles
  .onRun((context) => {
    console.log('This will be run every day at 11:05 AM Eastern!');
    return null;
  });
Deploying cloud function
Let's deploy the sample code in index.ts to the cloud.
Execute the following command at the command prompt.
firebase deploy
On successful deployment of the function, the function URL will be provided.
Paste the URL in the browser,
and you will get a response like below from the cloud function.
Great! Our cloud function is successfully deployed and responding to HTTP requests.
Navigate to the Firebase project in the Firebase console and select Functions;
you should be able to see your cloud function there.
You can switch to the Logs tab to see the logs of the cloud function.
Congratulations! We have just deployed our first simple cloud function.
Conclusion
In this post we learned how we can create a Firebase Cloud Function and deploy it to Firebase. We started with how to install the Firebase CLI globally and how to authenticate to it. Then we created a template project for the cloud function using the Firebase CLI, and then deployed and tested the cloud function.