Posts

Showing posts from February, 2020

azure naming convention

It just seems so clear to me that using a dash ("-") in a naming convention is bad and causes more problems than it solves. Azure Container Registry won't accept it, storage accounts won't accept it, and probably a few other services too. But you never know which ones until you hit them, so it can cause more pain than anything else :(
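As an illustration (not an official Azure SDK check), a pre-flight validation for storage account names can catch the dash problem before deployment. Storage account names must be 3-24 characters of lowercase letters and digits only; the function name and sample names below are made up for this sketch:

```javascript
// Hedged sketch: validate a candidate Azure storage account name locally.
// Rule: 3-24 chars, lowercase letters and digits only (no dashes allowed).
function isValidStorageAccountName(name) {
  return /^[a-z0-9]{3,24}$/.test(name);
}

console.log(isValidStorageAccountName('myteam-storage')); // false - the dash is rejected
console.log(isValidStorageAccountName('myteamstorage'));  // true
```

Running a check like this across all your planned resource names is a cheap way to find the services that dislike dashes before Terraform or ARM does.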

react basics

Set state the correct way: setState is a merge operation. Also covered: React dumb components, a better way to work with events, another const example, and a look at the useEffect() hook.
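The key point above - setState merges rather than replaces - can be sketched outside React. This is a minimal stand-in for React's shallow merge, not React itself; the state fields are made-up examples:

```javascript
// Minimal sketch of setState's merge semantics: React shallow-merges the
// partial update into the previous state instead of replacing it wholesale.
function setStateLike(prevState, partialUpdate) {
  return { ...prevState, ...partialUpdate };
}

const state = { count: 1, user: 'sam' };
const next = setStateLike(state, { count: 2 });
console.log(next); // { count: 2, user: 'sam' } - 'user' survives the update
```

This is why calling setState({ count: 2 }) does not wipe out the other keys in your component state.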

installing debian package on ubuntu

If you're trying to install a Debian package on Ubuntu, all you need is the following command :- sudo dpkg -i DEB_PACKAGE

Linux distro name - when uname doesn't tell you much

Try running the following command to display your Linux distro name. This works on Red Hat. :) cat /proc/version

docker image running dotnet - A fatal error occurred, the folder [/usr/share/dotnet/host/fxr] does not contain any version-numbered child folders

This error seems to be distro specific. When I ran it on Red Hat it gave me this issue; running it on Ubuntu Bionic, the problem doesn't show up. https://github.com/dotnet/dotnet-docker/issues/1537

Azure Devops pipeline - Getting dotnet tools (custom command) to run

You often have problems when there is more than one feed configured in your nuget.config - for example, when you have a private feed, a dotnet custom tool command like "dotnet tool install" just won't work. To resolve this, specify a --configfile and provide a single-feed config file called dotnettool.config with the content below :-

- task: DotNetCoreCLI@2
  displayName: Install ReportGenerator tool
  inputs:
    command: custom
    custom: tool
    arguments: install --tool-path . dotnet-reportgenerator-globaltool --configfile dotnettool.config
    feedsToUse: 'select'

dotnettool.config content:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>

salesforce - working with records

This setup uses a controller; please note that this code is available in /default/classes/bearController.cls. Notice how different the import path is :-

import getAllBears from '@salesforce/apex/BearController.getAllBears';

And loadBears runs this method to execute the query and assign the result to a bears variable with a @track decorator:

loadBears() {
  getAllBears()
    .then(result => { this.bears = result; })
    .catch(error => { this.error = error; });
}

The funny thing about this code is that the Apex method mostly returns a string containing a SOQL query. I'm still not really sure how it works under the hood. This is an imperative query. However, you can greatly simplify the code using @wire, as shown below :-

import { LightningElement, wire } from 'lwc';
import ursusResources from '@salesforce/resourceUrl/ursus_park';
/** BearController.getAllBears() Apex method */

salesforce - getting all the objects

To query all Salesforce object names, you can use the following SOQL -

salesforce parent child event interactions

In the lightning-button-icon we have an onclick handler which can be used to fire off events - the code below fires an event called 'bearview'. On the parent's side of things, we need to tell the parent component to handle the event, in the parent's JavaScript :-
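The child-to-parent flow above can be modeled with a tiny event emitter instead of the real LWC runtime. This is a hedged sketch only - TinyEmitter, handleOpenRecordClick and the 'record-1' id are made up for illustration; in actual LWC you would call this.dispatchEvent(new CustomEvent('bearview', ...)) and bind onbearview in the parent markup:

```javascript
// Toy emitter standing in for the LWC component/event machinery.
class TinyEmitter {
  constructor() { this.listeners = {}; }
  addEventListener(type, fn) {
    (this.listeners[type] = this.listeners[type] || []).push(fn);
  }
  dispatchEvent(evt) {
    (this.listeners[evt.type] || []).forEach(fn => fn(evt));
  }
}

// Child side: the button's onclick handler fires 'bearview' with a payload.
const bearTile = new TinyEmitter();
function handleOpenRecordClick(bearId) {
  bearTile.dispatchEvent({ type: 'bearview', detail: bearId });
}

// Parent side: register a handler for the event (the onbearview binding in LWC markup).
const seen = [];
bearTile.addEventListener('bearview', evt => seen.push(evt.detail));

handleOpenRecordClick('record-1');
console.log(seen); // ['record-1']
```

The shape is the same either way: the child dispatches a named event carrying a detail payload, and the parent opts in by attaching a handler for that name.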

salesforce - getting started - common terms

@track - a decorator that makes a variable automatically refresh the UI when its value changes.

Handling UI events - on the JavaScript side of things.

Bear tile component - this section lets us create a new custom component that is used in a parent component. Let's say we are creating a component called bearTile. Here is the JavaScript side of things :- The html :- Looking at the parent html :- Also notice that in the parent markup it becomes c-bear-tile - where c is the default, somewhat predefined namespace for custom components, and then wherever the case changes to upper case, a dash is introduced. Loading style :-
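The naming rule above (PascalCase component name becomes a namespaced, kebab-case tag) can be sketched as a small conversion function. This is an illustration of the rule only, not Salesforce code; toMarkupTag is a made-up name:

```javascript
// Illustration: map a component's PascalCase class name to the tag used in
// parent markup - namespace prefix, then a dash before each case change.
function toMarkupTag(namespace, className) {
  const kebab = className
    .replace(/([a-z0-9])([A-Z])/g, '$1-$2') // insert a dash at each case boundary
    .toLowerCase();
  return `${namespace}-${kebab}`;
}

console.log(toMarkupTag('c', 'BearTile')); // 'c-bear-tile'
```

So a component class BearTile in the default c namespace is referenced in the parent's HTML as c-bear-tile.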

REST API Http versioning

Common ways to go about it would be :-

1. URI versioning. An example would be: https://adventure-works.com/v2/customers/3

2. Query string versioning: https://adventure-works.com/customers/3?version=2

3. Header versioning:
GET https://adventure-works.com/customers/3 HTTP/1.1
Custom-Header: api-version=1

4. Media type versioning - this is for a client declaring the media type and version it can digest. For example, if the content supported by the client should be PDF version 2:
GET https://adventure-works.com/customers/3 HTTP/1.1
Accept: application/vnd.adventure-works.v2+pdf
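For media type versioning, the server has to pull the version and format back out of the Accept header. A hedged sketch of that parsing step (the vendor-tree pattern follows the example above; parseAcceptVersion is a made-up helper name):

```javascript
// Extract version and format from a vendor media type such as
// 'application/vnd.adventure-works.v2+pdf'.
function parseAcceptVersion(accept) {
  const match = accept.match(/^application\/vnd\.[\w-]+\.v(\d+)\+(\w+)$/);
  return match ? { version: Number(match[1]), format: match[2] } : null;
}

console.log(parseAcceptVersion('application/vnd.adventure-works.v2+pdf'));
// { version: 2, format: 'pdf' }
```

A real server would fall back to a default version (or return 406 Not Acceptable) when the function returns null.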

Recap in HTTP design

Some of the common status codes when responding in a REST-based API are as follows :-

1. GET - 200 for OK, 404 when not found.

2. POST - 201 when created and 204 when there is nothing to return. Returns 400 for a bad request and 409 on conflict. For long-running or async processes, an API can POST to a resource and just return 202 (Accepted) with a link to poll for status info. The agent polls the status endpoint and, once the work completes, it returns 303. The exchange looks like this :-

HTTP/1.1 202 Accepted
Location: /api/status/12345

HTTP/1.1 303 See Other
Location: /api/orders/12345

3. PUT - 201 when created and 204 when updated with nothing to return. Returns 409 on conflict.

4. PATCH - mainly used for partial updates to a resource. It can return 415 for unsupported media type, 400 for a bad request and 409 for conflict.

5. HEAD - for handling large content: fetch the headers without downloading the body.
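The 202-then-303 polling pattern above can be sketched as a small client loop. This is a hedged sketch: fetchStatus is a stand-in for a real HTTP client call to the status endpoint, and the fake endpoint below is invented so the loop can run anywhere:

```javascript
// Poll a status endpoint until it answers 303, then return the final location.
async function pollUntilDone(fetchStatus, delayMs = 100) {
  for (;;) {
    const res = await fetchStatus();
    if (res.status === 303) return res.headers.Location; // done: follow See Other
    // 202 Accepted: work still running, wait and poll again
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
}

// Fake status endpoint: answers 202 twice, then 303 pointing at the result.
let calls = 0;
const fakeStatus = async () =>
  ++calls < 3
    ? { status: 202, headers: { Location: '/api/status/12345' } }
    : { status: 303, headers: { Location: '/api/orders/12345' } };

pollUntilDone(fakeStatus, 1).then(loc => console.log(loc)); // '/api/orders/12345'
```

A production version would also cap the number of attempts and honor any Retry-After header the server sends.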

salesforce lightning web component library

This is a link to the Salesforce component library, which is essential for testing purposes. https://developer.salesforce.com/docs/component-library/bundle/lightning-accordion/example

Web lightning component documentation site

https://developer.salesforce.com/docs/component-library/documentation/lwc/lwc.apex

Using Powershell for Azure Devops to refer to predefined variables.

Using PowerShell on an Azure DevOps Linux agent to refer to predefined variables can be hard to get right. Essentially, just use $(build.SourceBranchName) - build.SourceBranchName is what's listed in the documentation. You can see one example below :-

- task: PowerShell@2
  displayName: 'PowerShell construct VersionSuffix if not Master'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Setting up version info"
      $VersionSuffix = 'prerelease' + '.$(build.SourceBranchName).' + $(build.buildid)
      write-host "##vso[task.setvariable variable=VersionSuffix]$VersionSuffix"
      Write-Output "##vso[build.updatebuildnumber]$($env:VersionPrefix)-$VersionSuffix"
    failOnStderr: true
  condition: and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/master'))

using terraform expression

Terraform expressions consist of conditional operators like == and != and functions that you can leverage to work with input values. For example, let's say you want to concatenate two variables together - you can do that with an expression. There are a whole lot of functions available, categorized under string, numeric and collection functions. Let's have a look at where you can use these functions. Functions are typically used as part of an expression, anywhere in your Terraform code.

Example

Let's have a look at the example below. Here I am combining a couple of functions together (chaining) to produce a list as the final output. Dissecting the code further, we have the substr function, which grabs a subsection of a string - everything except the last character:

substr(data.azurerm_storage_account.paymentmonitorui.primary_web_endpoint, 0, length(data.azurerm_storage_account.paymentm
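For illustration outside HCL, the same "everything except the last character" chaining of substr and length looks like this in JavaScript - here used to strip the trailing slash that a primary_web_endpoint URL ends with (the URL below is a made-up example):

```javascript
// Same trick as the Terraform substr(..., 0, length(...) - 1) chain:
// keep the substring from 0 up to (but not including) the last character.
function dropLastChar(s) {
  return s.substring(0, s.length - 1);
}

console.log(dropLastChar('https://example.z8.web.core.windows.net/'));
// 'https://example.z8.web.core.windows.net'
```

In both languages the point is the chaining: the output of length feeds straight into substr as the end index.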

Salesforce codepen / playground

This lets you get a feel for the codepen / playground for Salesforce. https://developer.salesforce.com/docs/component-library/tools/playground

mulesoft java invoke

To create a flow using Java invoke, please have a look at the flow below. As you can see, we need a Java New component, followed by a Java Invoke component. Here is our Java code :-

public class Greeter {
    public String Hello(String name) {
        return "Hello " + name;
    }
}

The Java New component requires the following configuration. And in the Java Invoke component, as you can see, we reference our variable using #[vars.mygreeter] :- Note that the class and method configuration below is very important.

mulesoft media type supported by dataweave

Ever wonder what media types are supported by DataWeave?

application/avro - Avro
application/csv - CSV
application/dw - DataWeave (weave) (for testing a DataWeave expression)
application/flatfile - Flat File, COBOL Copybook, Fixed Width
application/java - Java, Enum Custom Type (for Java)
application/json - JSON
application/octet-stream - Octet Stream (for binaries)
application/yaml - YAML
application/xml - XML, CData Custom Type (for XML)
application/x-ndjson - Newline Delimited JSON
application/xlsx - Excel
application/x-www-form-urlencoded - URL Encoding
multipart/* - Multipart (Form-Data)
text/plain - Text Plain (for plain text)
text/x-java-properties - Text Java Properties (Properties)

Links are provided here. https://docs.mulesoft.com/mule-runtime/4.2/dataweave-formats

terraform refresh - what in the world ...

Terraform refresh syncs the state file with the real world. It does not change any resources; it only updates the state file.

setting up telegraf as a linux daemon

Installing telegraf on the target server:

curl https://dl.influxdata.com/telegraf/releases/telegraf-1.13.3_linux_amd64.tar.gz --output telegraf.tar.gz -k

Extract the archive:

tar xf telegraf.tar.gz

After extracting, you can move the binary to the /usr/local/bin folder and the configuration file to /usr/local/etc/telegraf/telegraf.conf.

Assuming we are in the expanded tar directory, under telegraf/usr/bin:

sudo cp telegraf /usr/local/bin

Assuming we are in the expanded tar directory, under telegraf/etc/telegraf:

sudo mkdir /usr/local/etc/telegraf
sudo cp telegraf.conf /usr/local/etc/telegraf

Next we need to specify what metrics we would like telegraf to send over. The example below shows the key configuration for the DEV environment that we are interested in. The first is the target database and server DNS. Here we set up our hostname, urls = "http://WebAppPaymentsInfluxDB.asbbank.co.nz:8086", and the database

Mulesoft setting up foreach with random file generation

First of all, create a flow with the following components. In the File Write component, configure the following fx / DataWeave code :-

Grafana UI component code

You can find the component code for Grafana UI components like Graph, Alert and the color picker at the following URL. https://github.com/grafana/grafana/tree/master/packages/grafana-ui/src/components

Delete App on Mulesoft Anypoint platform

To delete an app that you created on the MuleSoft Anypoint Platform, go to Design Center and you will see a list of the applications / API specifications available. Click on the yellow box area and then click on the red box to remove it.

Installing and setting up telegraf to monitor docker as a service on Linux

Telegraf is a data collection and aggregation agent that needs to be installed on the target server. Let's say you would like to gather metric data, for example CPU or memory usage, for Server A: install Telegraf on Server A. All this data will be sent to a central InfluxDB datastore; let's call our InfluxDB ServerDatastore. Grafana and InfluxDB can live on the same or separate servers. In this setup, we are going to set up Telegraf to monitor a Docker application and send all the metrics of interest to ServerDatastore.

Installing telegraf on the target server:

curl https://dl.influxdata.com/telegraf/releases/telegraf_1.13.2-1_amd64.deb --output telegraf.deb

Placing the telegraf executable in a specific folder: after extracting telegraf.deb, you can move the binary to the /usr/local/bin folder and the configuration file to /usr/local/etc/telegraf/telegraf.conf. Next we need to specify what metrics we would like telegraf to send over. The example below shows the