Showing posts from 2019

Azure event hub namespace connection string or event hub connection string?

It depends. If you have a requirement to write to, say, different event hubs in the namespace, then use the event hub namespace connection string. The problem is that if that connection string is compromised, the application can potentially send to every event hub in the namespace. It is always better to have finer control. :)

So I would use the event hub scoped connection string.

terraform creating rule for auto scaling for service plan

It seems like terraform just doesn't like creating the rule. But if you go and create it manually in the portal first, giving it a name, then the terraform auto scaling for the service plan works.

Here is my terraform :-

Some points to note - I had to use the "azurerm_app_service_plan" tags here as opposed to manually copying and pasting the resource id. And remember to create a rule called "devspapptest-Autoscale-9680" so terraform is able to find it.

So strange ......

To see what metrics are available, go here. Also don't forget to go to App Service Plan -> Scale out -> Json to match or copy the operator values and statistics used. You can practically copy and paste them straight into your terraform code.

Terraform secret rule of thumb - keyvault has to depend on policy :)

Did you know that when you create a new secret, you need to add "depends_on" to associate it with a key vault access policy? That means Vault -> Policy -> Secret (you need to specifically add "depends_on" in your secret resource provisioning section).

Understanding TLS and its cipher suite - Part 1

Key exchange algorithms protect the information required to create shared keys. These algorithms are asymmetric (public key algorithms) and perform well for relatively small amounts of data.

Bulk encryption algorithms encrypt messages exchanged between clients and servers. These algorithms are symmetric and perform well for large amounts of data.

Message authentication algorithms generate message hashes and signatures that ensure the integrity of a message. Schemes used here include HMAC.

MAC - Common message authentication schemes are HMAC, OMAC, CBC-MAC and PMAC. Newer and better options are the authenticated encryption modes AES-GCM and ChaCha20-Poly1305.
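To see concretely how a cipher suite pairs a key exchange, bulk encryption and MAC algorithm, openssl can list suites matching a filter; a quick illustration (works with any reasonably recent openssl build):

```shell
# List cipher suites using ECDHE key exchange with AES-GCM bulk encryption.
# The -v output shows the key exchange (Kx), authentication (Au),
# encryption (Enc) and MAC parts of each suite.
openssl ciphers -v 'ECDHE+AESGCM'
```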

Setup azure function app to a service plan

Yes, sometimes you do need a lot of testing to make sure you get this right.
Let's say you have already set up an ASE (isolated environment) and you would like to associate that single service plan (resource group A) with a function app in resource group B.

How do you do that?

With Az Powershell?

azure service plan - scaling out and scaling up

Scaling up means increasing your computing resources: instead of running your app on a 4 GB machine, you say "I want to run it on a 16 GB machine".

Scaling out means increasing the number of VMs running your existing application. You may have 1 VM running your app right now; you increase this to, say, 2 or 4, up to a limit of 100 if you're on an isolated plan.

How does this relate to a service plan? Well, the service plan controls the scaling of your resources.
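Both directions can be driven from az cli against the service plan; a hedged sketch (resource group, plan name and SKU are placeholders, not from the original post):

```shell
# Scale up: move the plan to a larger SKU (more CPU/RAM per instance)
az appservice plan update -g myrg -n myplan --sku P2V2

# Scale out: run more instances of the same SKU
az appservice plan update -g myrg -n myplan --number-of-workers 4
```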

nextjs optimization

Lazy loading a module is achieved through:

Lazy loading components

terraform azurerm_app_service_plan

This is for creating an application web service and a service plan. Not to be confused with an App Service Environment.

Some useful port in Azure

Some ports that you will often work with in Azure.

Use : Ports
HTTP/HTTPS : 80, 443
FTP/FTPS : 21, 990, 10001-10020
Visual Studio remote debugging : 4020, 4022, 4024
Web Deploy service : 8172

ASE app service plan - 1 instance per service plan

Did you know that in an ASE, one service plan typically means you are running at least 1 VM?
Well, you know now.... that's going to cost.

Probably best to merge everything into a single service plan.. :)

Also, turning on diagnostic logging is expensive

react : functional component vs class component

I was playing around with Nextjs and the code that keeps popping up uses functional components (as shown in the code below).

The difference between functional and class component.

Functional component 

Class component 

nextjs - pushing client side javascript

When using SSR, you probably need to push some client side code. This requires some manual configuration.

First you need a folder called static (it has to be named static, otherwise it won't work). Place your javascript code in it and then reference it from your index.js or index.tsx.

And oh, you need to reload the page (F5).

Microsoft Seal - Setting up your docker on Ubuntu

If you're thinking of working with Microsoft Seal library, then get your hands dirty with a Linux build.

To setup your dev environment

docker run -it ubuntu /bin/bash

Then start to install

apt-get update

 apt-get install software-properties-common
 apt-get install cmake
 apt-get install git
 apt-get install g++

git clone https://github.com/microsoft/SEAL.git
Build the goodies

cd native/src
cmake .
make
sudo make install
cd ../..

Don't forget the samples..... good fun :)

cd native/examples
cmake .
make
cd ../..

warning: Error disabling address space randomization: Operation not permitted

When trying to run gdb in a docker container, I got this nice error :-

This is a solution, which I will try later .. :)

docker run --cap-add=SYS_PTRACE --security-opt seccomp=unconfined


Solution : apt-get install g++

PALISADE - Library Compile for Linux Ubuntu

Get the tools 

apt-get install software-properties-common
sudo add-apt-repository ppa:george-edison55/cmake-3.x
sudo apt-get update
sudo apt-get install cmake
sudo apt-get install g++
sudo apt-get install git
Clone the repo

git clone

Configure and build 

Go into your cloned directory

./configure
make

Azure - how do you upload your react assets (actually for any assets) into static storage web enabled account

This is the script that I used to deploy my react assets from azure devops into an azure storage account :
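The script itself didn't survive the copy; a minimal sketch of what such a step can look like, assuming static website hosting is enabled on the account (the account name and the ./build folder are placeholders, not from the original):

```shell
# Upload the compiled react assets into the $web container
# of a static-website-enabled storage account
az storage blob upload-batch \
  --account-name "mystorageaccount" \
  --destination '$web' \
  --source ./build
```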

Webdeploy to ASE environment breaks after forcing TLS / SSL upgrade

If your deployment suddenly stops working after someone enables TLS 1.2/1.3 or prevents TLS 1.0 from being used, then devops code deployment will keep complaining that it was cut off from the tcp stream.

Enabling debug for Azure Dev ops

Set the variable name System.Debug  to true

Enable c# with c++ dll debugging options in visual studio

In your c# project, right click -> Properties -> Debug -> Enable Native Code debugging. You're all set.

az cli - setting variable on a windows machine build

Did you know that you need to use the following command to set a variable if you're using Azure DevOps Az Cli running on a Windows build machine?

For /f %%i in ('az keyvault secret show --vault-name "Your-KeyVault-Name" --name "Your-Secret-Name" --query "value"') do set "password=%%i"
Don't ask me why..just weird

Azure WebDeploy and Kudu

Regardless of whether it is Azure or not, when you use WebDeploy, you're using port 8172 to do your deployment. Unlike zip deployment, webdeploy does not use the Kudu service.

That also means any service like VSTS that uses webdeploy does not use Kudu.

Why is this important? When the security team starts knocking on your door asking for everything to be locked down, you need to know which ports matter.

Kudu service uses port 80 / 443.

How does Kudu deploy

Function app ASE

ASE (App Service Environment) is an isolated environment for you to run your code on.

If you use Terraform 

Unfortunately, if you're using terraform, you get an error message when trying to provision a function app tied to a service plan in an ASE.

Status code nil, nil - not a very helpful message.

Issue is tracked here.

If you still want to use terraform, get it to create the service plan and stop. Don't provision your function app. The service plan you just created is tied to an environment id.

Then use Az Cli to create a function app and manually tie it to the service plan created earlier.

You also need to set up a system identity and add it to resources like keyvault and all the goodies.
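A sketch of those Az Cli steps, with every name and id a placeholder:

```shell
# Create the function app in resource group B against the ASE plan
# terraform created in resource group A (referenced by full resource id)
az functionapp create \
  -g resource-group-B \
  -n my-function-app \
  --plan "/subscriptions/<sub-id>/resourceGroups/resource-group-A/providers/Microsoft.Web/serverfarms/my-ase-plan" \
  --storage-account mystorageacct

# Enable the system-assigned identity so it can be granted keyvault access
az functionapp identity assign -g resource-group-B -n my-function-app
```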

If you use Az Cli

Creating a service plan tied to an ASE is not supported. Look up the service plan command; you cannot create an Isolated plan using az cli.

Microsoft has just moved this into their backlog.

Powershell AZ 2.4

The only option is to use Powershell AZ.

My case 

Since i am using Terraf…

Azure devops - debugging pipeline using System.Debug

One cool feature that you can turn on whenever you try to troubleshoot build issues in Azure Devops is "System.Debug". Create a new variable called "System.Debug" and set it to true.

Run your pipeline and you will see a bunch of messages.

git apply patch done properly

Totally agree with the way this has been done.

To see what changes a patch makes: git apply --stat 0001-file.patch
Initiate a dry run to detect errors: git apply --check 0001-file.patch
Finally, use git am to apply your patch as a commit: it allows you to sign off an applied patch, which can be useful for later reference: git am --signoff < 0001-file.patch
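The three steps can be exercised end to end in a throwaway repo; a self-contained demo (assumes git is installed; all file and commit names are made up):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git config user.email demo@example.com && git config user.name demo

# Two commits; the second becomes our patch
echo one > file.txt && git add . && git commit -qm "add file"
echo two >> file.txt && git add . && git commit -qm "update file"
git format-patch -1 -o ../patches >/dev/null

# Rewind, then inspect / dry-run / apply the patch as a signed-off commit
git reset -q --hard HEAD~1
git apply --stat  ../patches/0001-*.patch
git apply --check ../patches/0001-*.patch
git am --signoff  ../patches/0001-*.patch
```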

Terraform - configuring function app to use existing ASE

It is somewhat hard to get the settings right in terraform if you don't run it multiple times.
In this example, I had so many errors, and I found out that if you're referencing an existing ASE plan, you'd better make sure it matches in terms of TIER and SIZE. Otherwise your service plan is as good as no service plan.

resource "azurerm_app_service_plan" "ase-service-plan" {
  name                = "${var.environment}${var.service_plan_name}"
  resource_group_name = "${var.resource_group_name}" # variable name assumed; the value was missing in the original
  location            = "${var.location}"
  kind                = "FunctionApp"
  app_service_environment_id = "/subscriptions/your-subscription-id/resourceGroups/yourResourceGroup/providers/Microsoft.Web/hostingEnvironments/your-ASE-name"
  maximum_elastic_worker_count = 1 ## required - Isolated ASEV2

  ## Best to match this
  sku {
    tier     = "Isolated"
    size     = "I1"
    capacity = 1
  }
}

Nodejs - Loading node_modules from a parent directory

Interestingly, node_modules libraries are loaded from the child directory first; Node then works its way up through the parent directories until the root.

That's not really what the docs say.

To resolve this, either add the package dependencies to the parent node_modules or remove that folder.
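A quick way to see the upward lookup in action (assumes node is installed; the package and paths are made up for the demo):

```shell
# A fake package lives only in the PARENT's node_modules;
# a script in the child directory still resolves it.
tmp=$(mktemp -d)
mkdir -p "$tmp/node_modules/mylib" "$tmp/child"
echo "module.exports = 'from parent';" > "$tmp/node_modules/mylib/index.js"
echo "console.log(require('mylib'));"  > "$tmp/child/app.js"
node "$tmp/child/app.js"   # prints "from parent"
```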


npm audit error - package vulnerabilities

Ran into this error during my npm build. Awesome treat of the day, it seems.
Time to get cracking on resolving inflight npm library issues.

When I encountered this issue, none of my builds worked. They just showed the following error and exited.

Good thing it asked me to use

npm audit fix

to fix stuff, and it worked. Obey the npm cli :)

npm react script error - The react script package provided by Create React App requires a dependency

Bumped into this error today:

The react-scripts package provided by Create React App requires a dependency:

"jest" : "24.7.1"

To resolve this, apply the changes suggested here.

You also need to run this:

npx npm-force-resolutions
npm install

Funny thing is, this didn't work for me.

Az cli - setting diagnostic logs for event hub

This is a script that allows you to set up diagnostic logging for keyvault and event hub. You can easily use it for other resources as well.

First of all, you start off with something simple like this to enable diagnostic logging for a vault called "myvault". Unless you pass a full resource id, you need to provide resource group info. (Please note - the resource group is the one the vault resides in.)

az monitor diagnostic-settings create -n "lalala" --resource "myvault" -g "devrgpmtengine" --resource-type "Microsoft.KeyVault/vaults" --workspace "mydevworkspace" --metrics '[{"category": "AllMetrics","enabled": true,"retentionPolicy": {"enabled": false, "days": 0 }}]'

When it comes to --workspace, it is best to provide the full resource path to your workspace. It looks like the figure b…

Powershell - Passing json string into az cli for execution

When you're trying to work with az cli, you tend to pass in a bunch of JSON strings. The thing about powershell is that you need to escape double quotes, otherwise you will get a whole bunch of errors :-

Expecting property name enclosed in double quotes: line 1 column 1
Expecting property name enclosed in double quotes: line 1 column 3 (char 2)

blah blah blah

To solve this, look at this example :-

This is an example of how you can use the "az monitor diagnostic-settings" command line.

hope this helps!!!

Azure key - using rsa (private and public key)

The idea is pretty simple: create an RSA key in Azure. Use the public key to encrypt, then use the private key to decrypt. The private key never leaves the vault.

Here is the code for doing that.

If you're getting a bad request, please check that you have added your MSI to the keyvault access policy.

I got the following error message many, many times:

Unhandled Exception: Microsoft.Azure.KeyVault.Models.KeyVaultErrorException: Operation returned an invalid status code 'BadRequest'

and the problem was due to specifying fOAEP as false instead of true.

var encryptedText = rsa.Encrypt(byteData, fOAEP: true); // used to be false :(

Azure diagnostic settings - quick way to look at all the setups

Go to Azure portal -> Monitor -> diagnostic settings 

Please change the subscription filter as required.

Writing to a variable for other tasks in the pipelines

You can create/ update a pipeline variable using the following command :-

Write-Host "##vso[task.setvariable variable=myvariableName;]$myVariableWithValue"
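The same logging command works from a plain bash step too, since the agent just scans stdout for the ##vso prefix; a tiny sketch (variable names are made up):

```shell
# Emit the Azure DevOps setvariable logging command from a bash step;
# the agent picks it up and makes myVariableName available to later tasks.
myVariableWithValue="hello"
echo "##vso[task.setvariable variable=myVariableName]$myVariableWithValue"
```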

Terraform - setup AzureRM as a backend storage

Say you're trying to set up a terraform backend - to save the state file into Azure. You might get a prompt asking for the container name; provide the configuration listed in figure 1.1.

Before that, you need to set up your ARM authentication (yes, all of it).


If you get an error message saying

"Error inspecting states in the "azurerm" backend" - please provide the settings in figure 1.1.

If you encounter the error below, remember to delete your state folder (.terraform) whenever any of your tests results in failure. This is important.

Error inspecting states in the "azurerm" backend:
storage: service returned error: StatusCode=403, ErrorCode=AuthenticationFailed, ErrorMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

If you want to pass DYNAMIC values in, you can use :-

terraform init -backen…
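The truncated command above is presumably the -backend-config form; a hedged sketch of passing values dynamically (all values are placeholders):

```shell
# Pass backend settings at init time instead of hard-coding them
terraform init \
  -backend-config="storage_account_name=mystorageacct" \
  -backend-config="container_name=tfstate" \
  -backend-config="key=prod.terraform.tfstate" \
  -backend-config="access_key=$ARM_ACCESS_KEY"
```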

Wireshark - Master grand list of fields

Deep dive into the fields you can specify in the filter section of wireshark. Here is a complete reference :-

Sans SIFT workstation - a forensic VM

Mitre - Threat framework

This is awesome. It has a huge list, and I'm not sure how one can possibly ensure all of it is carried out.

terraform spitting out 403 access error when creating keyvault secret in azure

I bump into this issue a lot: I create a keyvault, set up some policy around it, and then when I add a secret / key into it, bang! this happens -

The solution that worked for me: add depends_on to EVERY "azurerm_key_vault_secret" and key that is about to be written into the keyvault. This problem happens when you're creating the key vault and its policy in the same run. If you're just adding a secret (with the keyvault already existing), then you're fine.

Yes, add it to every secret or key that has a dependency, even if you have it wrapped in a module - without depends_on it won't work.

resource "azurerm_key_vault" "kvpaymentengine" {
  name                = "${var.environment}${var.keyvault_name}"
  location            = "${var.location}"
  resource_group_name = "${module.pmt-rg.rg_name}"
  tenant_id           = "${var.tenant_id}"
}

Terraform Error: spawn terraform ENOENT

I know this sounds strange as hell, right.

When this happened to me, Terraform was actually complaining about an incorrect path given in my variables / settings.

Azure devops - Az cli extension can access pipeline variables definition

The Az Cli extension can access pipeline variable definitions without any environment setup. You should be able to do something like this, where $ENVIRONMENT is a pipeline variable definition :-

az functionapp cors remove -g $ENVIRONMENT$RESOURCE_GROUP_NAME -n $ENVIRONMENT$FUNCTIONAPP_NAME --allowed-origins "*"

az cli function app source code is under appservice

If you're looking for the az cli functionapp source code, it is available under the appservice folder as shown in the diagram below :

How to disable remote debugging from your portal

I had been looking around, but managed to find it myself.

Go to your function app, then click on "General settings" - most of your configuration is available here.

Using git subtree

Updating repository 

A git subtree is a copy of a cloned repository. You need to issue a separate command to update your branch, for example,

git subtree pull --prefix=storedprocedures https://mycopied_git_repository/ 

Running pull alone will not update the 'copied' branch. This also means you need to pull and then push for changes to be reflected.

For example, say you have your local git repo called 'my_local_git_repo', with mycopied_git_repository mounted in a folder called externalibrary. If you run git pull, you only update 'my_local_git_repo' and not mycopied_git_repository. Run the git subtree command above to update it.

After that, you need to push your changes to 'my_local_git_repo' so that build tools like Azure DevOps are able to see and use the update.
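So a typical update cycle looks roughly like this (the branch names are assumptions; the prefix and URL are from the post):

```shell
# 1. Pull the upstream copy into the subtree folder
git subtree pull --prefix=storedprocedures https://mycopied_git_repository/ master

# 2. Push the resulting commit so CI (e.g. Azure DevOps) sees it
git push origin master
```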

git subtree add gives you "Working tree has modifications. Cannot add"

While it seems confusing, I think it is pretty clear: it means what it says.
If you have any changes, please commit or stash them - make sure your branch is clean and neat :)

Or sometimes, if you don't have any modifications, just run "git reset".

And re-run the command.

Using powershell's Connect-AzAccount to connect in Azure DevOps

Here are the inline scripts that allow you to connect / authenticate using the powershell extension in azure devops.

az cli - az cosmosdb keys vs az cosmosdb list-keys

Not really sure why,

az cosmosdb keys doesn't work, although it is in the documentation.

So instead i tried

az cosmosdb list-keys

And that works. Feature or bug?

Understanding Powershell extension for DevOps

In Azure devops, there is an extension called PowerShell for Mac, Windows and Linux. This is not the same as "Azure Powershell", which is targeted to run on a Windows build host / agent.

Most of the time, I prefer to work with script files instead of inline scripts, for tracking reasons.

The Az cli extension will automatically log in for you so you can start running various scripts; this PowerShell extension, however, requires a manual login approach.

To install a module, just do something like this

Install-Module -Name CosmosDB -Force

Some faq here

Can I use az cli based commands here?

Yes you can. It can be as simple as follows :-

az login --service-principal --username $env:CLIENTID --password $env:CLIENTSECRET --tenant $env:TENANT_ID
az account set --subscription $env:SUBSCRIPTION

Or use powershell directly (you need "ConvertTo-SecureString"). Pretty sure the following works:

Write-Host "init credential"
$passwd = ConvertTo-SecureString $env:CLIENTSECRET -AsPlainText -Force
$pscredential = N…

A weird case of special character in Cosmodb stored procedure

For some reason I was getting special unicode characters that just messed up my stored procedure deployment to cosmos via the "cosmosdb" powershell gallery module.

All it needs is an editor: Notepad++. Turn on the option "Show all characters" and edit away! :)

Publishing powershell module

First, create a folder to place all these files in, and make sure you have powershell installed.

Create your powershell module with a psm1 extension

You can copy and paste the code from here, just make sure you have the right extension.

Setup your module manifest

New-ModuleManifest -path ./firstmodule.psd1
Edit firstmodule.psd1 and add the following

Description = 'demo module'

This basically gives it the minimum requirements :)

Publish-Module -path /Users/jeremy/tmp/work/powershell/firstmodule -nugetapikey

Getting started with spark with databricks cloud

When you sign up for a community edition, you tend to create notebook with it.

If you want to make use of the existing database, you can load it using the following command :

To load a table called airlineflight

use databricks;

select month from airlineflight where year == 2002

To convert to a data frame you can use the following command :-


df = sqlContext.sql("Select * from airlineflight")

And from this point onwards, you can manipulate using filter, select. Please refer to documentation here.

Some quick examples,



Documentation for Databricks Cloud / Community. I finally managed to zoom in on which spark library I am working with, to progress further.

Having my notebook in Python allows me to work with, and subsequently use, the functions defined here.

Azure Devops artefact and release pipelines

I guess the concept is pretty easy, but we just get lost in piles of documentation.

The key concept is: when you build, make sure you publish your output as an artefact. How do you do that? It depends on whether you're building a .netcore or javascript based project.
If you're using dotnet core, you just use "publish and zip" artifact.

If you're using javascript, then you need to publish your artifact to a drop folder.

For example, we have PublishBuildArtifacts here generating dlls into a drop folder

steps:
- script: dotnet build src/AzureFunctionApp.sln --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
- script: dotnet publish src/AzureFunctionApp.sln --configuration $(buildConfiguration)
  displayName: 'dotnet publish src/AzureFunctionApp.sln $(buildConfiguration)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Contai…

Azure Blob copy to $web

When you try to copy files to $web in your static site (Azure storage account), you need to escape $web, otherwise you won't be able to copy files across. This matters especially when you're on a Mac or Linux based system.

So typically you do something like

az storage blob upload-batch -s sourcefolder -d \$web
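A quick shell-only illustration of why the escape matters (no Azure needed; `web` here stands in for whatever the shell would expand):

```shell
# Unescaped, the shell expands $web as a (normally empty) variable,
# so the destination silently disappears; escaped, az sees the literal name.
web=""
echo upload to $web    # expands to: upload to
echo upload to \$web   # stays literal: upload to $web
```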

vscode enable application insights

Install nuget

dotnet add package Microsoft.ApplicationInsights.AspNetCore --version 2.7.0

Setup code in startup.cs

public void ConfigureServices(IServiceCollection services)
{
    // The following line enables Application Insights telemetry collection.
    services.AddApplicationInsightsTelemetry();

    // code adding other services for your application
    services.AddMvc();
}


"ApplicationInsights": {
  "InstrumentationKey": "putinstrumentationkeyhere"
},

In the "putinstrumentationkeyhere" section, place the guid created for app insights in your portal.

Getting spark to read kafka data

Setup kafka

bin/ config/

bin/ config/server
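The two truncated lines above look like the standard Kafka startup pair; with a stock Kafka download they would typically read as follows (paths are from the standard distribution layout, not preserved in the original):

```shell
# Start zookeeper first, then the kafka broker
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties
```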