Showing posts from February, 2020

azure naming convention

It just seems so clear to me that using a "-" dash in a naming convention is bad and causes more problems than it solves. Azure Container Registry won't accept it, storage account names won't accept it, and probably a few other services too. But you never know up front which ones ......... so it can cause more pain than anything else :(

react basics

Set state the correct way. setState is a merge operation.
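The original screenshots are gone, so here is a small sketch of what "merge" means: setState only replaces the top-level keys you pass in, leaving the rest of the state intact. This is plain node javascript simulating the merge with Object.assign; the state shape and key names are made up for illustration.

```javascript
// Simulate React's setState shallow-merge semantics with Object.assign.
// The state shape here is hypothetical.
let state = { name: 'Bob', age: 30, likes: ['react'] };

function setState(partial) {
  // React shallow-merges the partial update into the existing state
  state = Object.assign({}, state, partial);
}

setState({ age: 31 }); // only 'age' is replaced; 'name' and 'likes' survive
console.log(state);
```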

React dumb component

Better way to work with events

Another const example,

Check out this cool useEffect()

installing debian package on ubuntu

If you're trying to install a Debian package on Ubuntu, all you need is the following command :-

sudo dpkg -i DEB_PACKAGE

Linux distro name - when uname doesn't tell you much

Try running the following command and it will display your Linux distro name.
This works for Red Hat. :)

cat /proc/version
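Another option worth trying (present on most modern distros, though not guaranteed everywhere) is the os-release file, which gives a friendlier answer:

```shell
# Pretty distro name and version, e.g. on Ubuntu or Red Hat
cat /etc/os-release
```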

docker image running dotnet - A fatal error occurred, the folder [/usr/share/dotnet/host/fxr] does not contain any version-numbered child folders

This error seems to be Linux distro specific. When I ran it on Red Hat it gave me this issue. Running it on Ubuntu Bionic, the problem doesn't show up.

Azure Devops pipeline - Getting dotnet tools (custom command) to run

You often have problems when there is more than one feed configured in your nuget.config. For example, with your private feed present, a dotnet custom tool command like "dotnet tool install" just won't work.

To resolve this, specify a --configfile and provide a single-feed config file called dotnettool.config which has the content below :-

- task: DotNetCoreCLI@2
  displayName: Install ReportGenerator tool
  inputs:
    command: custom
    custom: tool
    arguments: install --tool-path . dotnet-reportgenerator-globaltool --configfile dotnettool.config
    feedsToUse: 'select'
dotnettool.config content

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear/>
    <add key="" value=""/>
  </packageSources>
</configuration>

salesforce - working with records

This setup uses a controller that contains the following code. Please note that this code lives in /default/classes/bearController.cls


Notice how different the import path is :-

import getAllBears from '@salesforce/apex/BearController.getAllBears';

And loadBears runs this method to execute the query, assigning the result to a bears variable decorated with @track

loadBears() {
    getAllBears()
        .then(result => {
            this.bears = result;
        })
        .catch(error => {
            this.error = error;
        });
}

salesforce - getting all the objects

To query all salesforce object names, you can use the following soql -
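The original query did not survive the copy, but a query along these lines, using the EntityDefinition object (the field choice is my assumption), lists object names:

```sql
SELECT QualifiedApiName, Label FROM EntityDefinition
```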

salesforce parent child event interactions

In the lightning-button-icon, we have an onclick event which can be used to fire off events -

And code below fires off an event called 'bearview'

On the parent's side of things, we need to tell the parent component to handle the event,

On the parent's javascript :-
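The screenshots are missing here, so below is a plain-javascript sketch of the dispatch/handle pattern, with EventTarget standing in for the components. In a real LWC the child would call this.dispatchEvent(new CustomEvent('bearview', { detail: ... })) and the parent's template would wire up onbearview={handleBearView}; the bear id used is made up.

```javascript
// Sketch of the child -> parent event flow using a plain EventTarget.
const child = new EventTarget();
let selectedBearId = null;

// Child side: fire a 'bearview' event carrying the bear id as payload.
function fireBearView(bearId) {
  const event = new Event('bearview');
  event.detail = bearId; // mimic CustomEvent's detail property
  child.dispatchEvent(event);
}

// Parent side: handle the 'bearview' event.
child.addEventListener('bearview', (event) => {
  selectedBearId = event.detail;
});

fireBearView('bear-001'); // hypothetical id
console.log(selectedBearId);
```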

salesforce - getting started - common terms

@track - is a decorator that makes a variable reactive: the UI automatically refreshes when its value changes

Handling UI events

On the javascript side of things

Bear tile component

This section allows us to create a new custom component that is used in a parent component.

Let's say we are creating a component called bearTile. Here is the javascript side of things :-

The html :-

Looking at the parent html :-

Also notice that it becomes c-bear-tile - where c is the default, predefined namespace for components, and each upper-case letter in the component name turns into a hyphen followed by the lower-case letter (camelCase to kebab-case).

Loading style :-

REST API Http versioning

Common ways to go about it would be :-

1. URL versioning

An example would be :-
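The concrete example did not survive the copy; a hypothetical sketch of a versioned URL (made-up host and resource) would be :-

```http
GET https://example.com/v2/customers/3 HTTP/1.1
```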

2. Query string versioning
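For query string versioning, a sketch with a made-up host and resource might be :-

```http
GET https://example.com/customers/3?version=2 HTTP/1.1
```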

3. Header versioning

GET HTTP/1.1
Custom-Header: api-version=1

4. Media type versioning - this is more for the client negotiating the media type of the data content. For example, the data content supported by the client should be pdf version 2.

GET HTTP/1.1
Accept: application/vnd.adventure-works.v2+pdf

Recap in HTTP design

Some of the common stuff when it comes to responding to REST based API are as follows :-

1. GET - 200 for OK, 404 when not found

2. POST - 201 for created and 204 for no update. Returns 400 if it is a bad request. Returns 409 when conflict.

For long running or async processes, an API could just return 202 (Accepted) with a link to poll for status info. The agent can poll the status, and once the work is completed the status endpoint returns 303.
The results can be seen here :-

HTTP/1.1 202 Accepted
Location: /api/status/12345

HTTP/1.1 303 See Other
Location: /api/orders/12345

3. PUT - 201 for created and 204 (No Content) for an update. Returns 409 when conflict.

4. PATCH - this is mainly used to update a resource. It will return 415 for unsupported media type, 400 for bad request and 409 for conflict.

5. HEAD - returns the same headers as GET but no body; handy for checking the size of large content before downloading it

If you have a long running task or want to poll the status of a resource, then you can do something like this :-

POST to a resource and return 202 (Accepted) with a Location header pointing at a status resource URL.

Issue GET on the new location r…

salesforce lightning web component library

This is a link to salesforce component library that is essential for testing purposes.

Web lightning component documentation site

Using Powershell for Azure Devops to refer to predefined variables.

Using Powershell on an Azure Devops Linux agent to refer to predefined variables can be hard to get right. Essentially, just use $(build.SourceBranchName) - build.SourceBranchName is what's listed in the documentation.

You can see one example given below :-

- task: PowerShell@2
  displayName: 'PowerShell construct VersionSuffix if not Master'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Setting up version info"
      $VersionSuffix = 'prerelease' + '.$(build.SourceBranchName).' + $(build.buildid)
      write-host "##vso[task.setvariable variable=VersionSuffix]$VersionSuffix"
      Write-Output "##vso[build.updatebuildnumber]$($env:VersionPrefix)-$VersionSuffix"
  failOnStderr: true
  condition: and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/master'))

using terraform expression

Terraform expressions consist of conditional operators like ==, != and functions that you can leverage to work with input values. For example, let's say you want to concatenate two variables together - you can do that with a terraform expression.
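As a quick sketch (the variable names are made up), concatenation can be done with string interpolation or with a function like join:

```hcl
# hypothetical input variables
variable "prefix" {
  default = "payment"
}

variable "env" {
  default = "dev"
}

locals {
  # either style works for concatenating two variables
  name_interp = "${var.prefix}-${var.env}"
  name_join   = join("-", [var.prefix, var.env])
}
```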

There are a whole lot of functions available which are categorized under string, numeric and collection functions.

Let's have a look at where you can use these functions. Functions are typically classified as expressions and you can use them within your terraform code.


Let's have a look at the example below :-

In this example, I am chaining a couple of functions together to produce a list as the final output.

Dissecting the code further, we have the substr function, which takes a subsection of a string - here, everything except the last character.

substr(data.azurerm_storage_account.paymentmonitorui.primary_web_endpoint, 0, length(data.azurerm_storage_account.paymentmonitorui.primary_web_endpoint) - 1)

Then we have c…

Salesforce codepen / playground

This lets you get a feel for the codepen / playground for salesforce

mulesoft java invoke

To create a flow using java invoke, please have a look at the flow below :-

As you can see, we need a java new component, followed by a java invoke component.

Here is our java code :-

public class Greeter {

    public String Hello(String name) {
        return "Hello " + name;
    }
}
Java new component requires the following configuration :-

And the java invoke component. As you can see we reference our variable using #[vars.mygreeter] :-

Note the class and method configuration below is very important.

mulesoft media type supported by dataweave

Ever wondered what media types are supported by dataweave?

application/avro          Avro
application/csv           CSV
application/dw            DataWeave (weave) (for testing a DataWeave expression)
application/flatfile      Flat File, Cobol Copybook, Fixed Width
application/java          Java, Enum Custom Type (for Java)
application/json          JSON
application/octet-stream  Octet Stream (for binaries)
application/yaml          YAML
application/xml           XML, CData Custom Type (for XML)
application/x-ndjson      Newline Delimited JSON

terraform refresh - what in the world ...

Terraform refresh syncs the state file with the real world. It does not change resources, but it will update the state file.

setting up telegraf as a linux daemon

Installing telegraf on target server
curl --output telegraf.tar.gz -k
Placing telegraf executable in a specific folder.
tar xf telegraf.tar.gz

After extracting our telegraf tarball, you can move the binary to the /usr/local/bin folder and the configuration file to /usr/local/etc/telegraf/telegraf.conf
Assuming we are in the expanded tar directory, /telegraf/usr/bin
sudo cp telegraf /usr/local/bin
Assuming we are in the expanded tar directory, /telegraf/etc/telegraf
sudo mkdir /usr/local/etc/telegraf
sudo cp telegraf.conf /usr/local/etc/telegraf
Next we need to specify what metrics we would like telegraf to send over. The example below shows the key configuration for the DEV environment that we are interested in. The first is the target database and server dns. Here we set up our hostname (urls = "") and the database is called "dev_docker_telegraf".
# Configurati…
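The configuration got cut off above; the influxdb output section of telegraf.conf would look roughly like this (a sketch; the urls value is left blank, as in the original):

```toml
# Telegraf output: send metrics to the central influxdb (ServerDatastore)
[[outputs.influxdb]]
  urls = [""]                       # influxdb server dns goes here
  database = "dev_docker_telegraf"  # DEV metrics database
```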

Mulesoft setting up foreach with random file generation

First of all, create a flow with the following components.

In the file write component, configure the following fx or dw code :-

Grafana UI component code

You can find the component code for Grafana UI elements like Graph, Alert and the color picker at the following url.

Delete App on Mulesoft Anypoint platform

To delete an app that you created on the Mulesoft Anypoint platform, go to Design center and you will see a list of the applications / api specifications that are available. Click on the yellow box area and then click on the 'red' colored box to remove it.

Installing and setting up telegraf to monitor docker as a service on Linux

Telegraf is a data collection and aggregation agent that needs to be installed on the target server. Let's say you would like to gather metric data, for example cpu or memory usage, for Server A: install Telegraf on Server A. All this data will be sent to a central influxdb datastore; let's call our influxdb ServerDatastore.
Grafana and influxdb can live on the same or separate servers.
In this setup, we are going to set up telegraf to monitor a docker application and send all the metrics of interest to ServerDatastore.
Installing telegraf on target server
curl https:// --output telegraf.deb
Placing telegraf executable in a specific folder.
After extracting our telegraf.deb, you can move the binary to the /usr/local/bin folder and the configuration file to /usr/local/etc/telegraf/telegraf.conf
Next we need to specify what metrics we would like telegraf to send over. The example below shows the key configuration for the DEV environment …
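The post is truncated here; for monitoring docker specifically, the relevant input plugin section in telegraf.conf would be along these lines (a sketch; the socket path shown is telegraf's default):

```toml
# Telegraf input: collect container metrics from the local docker daemon
[[inputs.docker]]
  endpoint = "unix:///var/run/docker.sock"
```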