Showing posts from October, 2019

AppInsights ASP.Net - non-HTTP apps / background apps / any newer version of console app

Regardless of whether you're running a web, console, or service app, you typically call the same piece of code, the telemetry client (TrackMetric() for example), and pass in the required parameters.

The only difference is how you create your startup (entry point) code and set up your telemetry client.

For web, you need only a single package

For function app v2, you need

And configure your FunctionsStartup using the following code :-

For console app, the package required is here :-

You need the following startup code (which you can easily create using dotnet new worker)

Assets file project.assets.json not found. Run a NuGet package restore

Bumped into this error and, thanks to Stack Overflow, was able to resolve it by going to
Tools -> NuGet Package Manager -> Package Manager Console
And typing
"dotnet restore"
Or you can go into the command prompt of that project and type
"dotnet restore"
Whichever works faster for you, i guess.

Sending custom metrics data to Azure

Sometimes you might want to send custom metrics data to Azure. A use case would be you trying to push metric information, or a signal that something has been processed, which then gets displayed in the Azure Monitor dashboard.

The following code snippet works using TrackMetric, which is part of the AppInsights library. And yes, the metrics (whatever metrics you created) are visible in App Insights. Sometimes it is like a witch hunt trying to figure out where the metrics will come out.

For me, it took about a day before it was visible.

Azure AppInsights core

ASP.Net Core

<ItemGroup>
  <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.8.0" />
</ItemGroup>


public void ConfigureServices(IServiceCollection services)
{
    // The following line enables Application Insights telemetry collection.
    services.AddApplicationInsightsTelemetry();

    // This code adds other services for your application.
    services.AddMvc();
}

Understanding python import

I think this is the best article on importing a sub-directory as a module,

The only thing I wanted to say is, if you use

__all__ = ['mymodule1', 'mymodule2']

that's to accommodate syntax such as this

from subdirectory import *

See the magical asterisk there...... all this is for this little baby. :)
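To make the __all__ / star-import relationship concrete, here is a minimal runnable sketch. The package name (mypkg) and module names are made up for illustration; it builds a tiny package on disk and shows which names the star import actually pulls in:

```python
# Sketch: build a throwaway package to show how __all__ gates "import *".
# The package and module names here are hypothetical.
import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()
pkg_dir = os.path.join(pkg_root, "mypkg")
os.makedirs(pkg_dir)

# __init__.py: only names listed in __all__ get pulled in by "from mypkg import *"
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("__all__ = ['mymodule1']\n")

for name in ("mymodule1", "mymodule2"):
    with open(os.path.join(pkg_dir, name + ".py"), "w") as f:
        f.write("value = %r\n" % name)

sys.path.insert(0, pkg_root)
ns = {}
exec("from mypkg import *", ns)

print("mymodule1" in ns)  # True: listed in __all__
print("mymodule2" in ns)  # False: not listed, so the star import skips it
```

So __all__ is really an explicit whitelist: without it, a bare star import on a package doesn't pull in submodules at all.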

C#8 Getting around optional interface task

Perhaps not the best of ways to work with optional interfaces, but here is something I came up with to stop errors from coughing up around optional interfaces in C# 8.

Python using stream io

You should make use of Python's io module, as defined here, to work with stream data. There are basically three main types of stream I/O: text I/O, binary I/O, and raw I/O.

Instead of reading everything into memory, you are able to read this line by line as a stream....

import subprocess

## assuming r.csv is a large file
self.streamedOutput = subprocess.Popen(['cat', 'r.csv'], stdout=subprocess.PIPE)
self.sendOutput(self.streamedOutput)

while True:
    executionResult = streamedOutput.stdout.readline()
    if executionResult:
        print(executionResult)
    else:
        break
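The text/binary split can also be seen without any files or subprocesses at all; here is a minimal sketch using in-memory streams from the io module (StringIO for text, BytesIO for binary):

```python
import io

# Text I/O: works with str; encoding and newline handling are done for you.
text_stream = io.StringIO("line1\nline2\n")
first = text_stream.readline()
print(first)  # 'line1\n'

# Binary I/O: works with raw bytes; nothing is decoded.
binary_stream = io.BytesIO(b"\x00\x01\x02")
chunk = binary_stream.read(2)
print(chunk)  # b'\x00\x01'

# Either kind can be consumed line by line without loading
# the whole payload into memory first.
for line in io.StringIO("a\nb\n"):
    print(line.strip())
```

The subprocess pipe above behaves like a binary stream, which is why readline() hands you bytes rather than str.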

Linux getting port already in use

Was having an issue trying to bind to a specific port on my machine.
Then this command came about, :)

sudo netstat -nlp | grep :80

Getting gunicorn to run on a different port

As simple as passing an address:port to --bind (the port 8000 below is just an example) :-

gunicorn --bind 0.0.0.0:8000 FalconRestService:api

vscode - enabling python library / module debugging

You can start using vscode to step into Python libraries / imports. All you need is the following configuration.

Once you have it in your launch.json, keep on pressing F11 :)

{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit:
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Current File (Integrated Terminal)",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal",
            "justMyCode": false
        }
    ]
}

Kafka setup the scalable way ...

Great link on setting up Kafka the way it should be: scalable and reliable.

However, the images no longer exist, and the command requires some tuning :-

sudo docker service create --network kafka-net --name broker \
     --hostname="{{.Service.Name}}.{{.Task.Slot}}.{{.Task.ID}}" \
     -e KAFKA_BROKER_ID={{.Task.Slot}} -e ZK_SERVERS=tasks.zookeeper \
     qnib/plain-kafka:2019-01-28_2.1.0

Python serializing and deserializing to json

Here are some quick sample code snippets to work with JSON objects in Python.

Deserializing to object

commands = json.loads(commandstrings)
commandResultList = []

for command in commands:
    o = CommandType(**command)
    commandResultList.append(o)

Serializing to Json

import json

cmd1 = CommandType("list", "ls", "al")
cmd2 = CommandType("list", "pwd", "")
cmd3 = CommandType("list", "ls", "")

cmds = [cmd1, cmd2, cmd3]
a = json.dumps(cmds, default=lambda o: o.__dict__)
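Putting both directions together, here is a self-contained round trip. The post never shows CommandType, so the class below (and its three string fields) is an assumption made purely for illustration:

```python
import json

class CommandType:
    # Hypothetical shape: the original class isn't shown,
    # so these field names are made up for illustration.
    def __init__(self, category, command, args):
        self.category = category
        self.command = command
        self.args = args

cmds = [CommandType("list", "ls", "al"), CommandType("list", "pwd", "")]

# Serialize: default=lambda o: o.__dict__ turns each object into a plain dict.
payload = json.dumps(cmds, default=lambda o: o.__dict__)

# Deserialize: json.loads gives back dicts; ** unpacks each one
# straight into the constructor.
restored = [CommandType(**d) for d in json.loads(payload)]
print(restored[0].command)  # ls
```

The lambda trick works for any object whose attributes are themselves JSON-serializable; nested objects would need a smarter default function.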

git squash - interactive rebasing

To squash last 3 commits

git rebase -i HEAD~3

Then you get something like this. Keep one commit as "pick" and change the rest to "s" (squash). In this case, I keep "c8659b4" as the pick :-

pick c8659b4 Merged PR 1914: ensure strongly type response messages are factor in.
s 986cad8 Updated azure-pipelines.yml
s bdb2086 1 2 3

As long as you have at least one "pick" statement, it will be good. You should be able to rebase (squash) your commits.

python celery first steps

If you follow Python Celery's first steps from the official site, you're probably gonna get a heart attack trying to get it to work.

Please use the following docker command :

docker run -d -p 5672:5672 rabbitmq

First you need to tell Celery which method you would like to register. This allows Celery to handle task registration and execution.

This is an example of tasks.py :-

from time import sleep
from celery import Celery

app = Celery('tasks', broker='amqp://guest:guest@localhost:5672')

@app.task
def add(x, y):
    sleep(4)
    print("executing stuff")
    return x + y

Now that you have it registered, next is to run it. This queues the "add" task (defined earlier) and puts it into motion.

from tasks import add
add.delay(4, 4)

Azure diagnostic logging - Getting better understanding of what log type means

When I go into Azure Monitor log diagnostics and set up which logs I would like to include in my workspace, I have difficulty figuring out what information I will be getting. To make matters worse, how do I know what types of activities I should be querying for, or expect to appear in my workspace?

The thing is, it is hard. You can try to have a go at this site here.....