Posts

Showing posts from June, 2019

git subtree add gives you "Working tree has modifications. Cannot add"

While it seems confusing, I think it means exactly what it says.
If you have any changes, commit or stash them - make sure your branch is clean and neat :)

Or sometimes, if you don't have any modifications, just run "git reset".

And re-run the command.
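For example, assuming a hypothetical prefix and repository URL, the whole sequence might look like this:

git stash
git reset
git subtree add --prefix=vendor/library https://github.com/example/library.git master --squash

(--squash is optional; it collapses the subtree's history into a single commit.)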

Using powershell's Connect-AzAccount to connect in Azure DevOps

Here is an inline script that lets you connect / authenticate using the PowerShell extension in Azure DevOps.
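A minimal sketch, assuming the service principal details are exposed to the task as the environment variables CLIENTID, CLIENTSECRET and TENANT_ID (the same names used in the posts below):

Write-Host("init credential")
$passwd = ConvertTo-SecureString $env:CLIENTSECRET -AsPlainText -Force
$pscredential = New-Object System.Management.Automation.PSCredential($env:CLIENTID, $passwd)
Connect-AzAccount -ServicePrincipal -Credential $pscredential -Tenant $env:TENANT_ID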


az cli - cosmosdb keys vs az cosmosdb list-keys

Not really sure why, but

az cosmosdb keys doesn't work, although it is listed in the documentation.

So instead I tried

az cosmosdb list-keys
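With the account name and resource group filled in (both are placeholders here), the full command looks something like:

az cosmosdb list-keys --name mycosmosaccount --resource-group myresourcegroup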

And that works. Feature or bug?


Understanding Powershell extension for DevOps

In Azure DevOps, there is an extension called PowerShell that runs on Mac, Windows and Linux. This is not the same as "Azure PowerShell", which is targeted to run on Windows build hosts / agents.

Most of the time, I prefer to work with script files instead of inline scripts, for tracking reasons.



The Azure CLI extension will automatically log in for you, so you can start running various scripts straight away. This extension, however, requires a manual login approach.

To install a module, just do something like this:


Install-Module -Name CosmosDB -Force
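Once the module is installed on the agent, you can import it in the same step and start calling its cmdlets, for example:

Import-Module CosmosDB
Get-Command -Module CosmosDB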

Some FAQs:

Can I use az cli based commands here?

Yes you can. It can be as simple as the following:


az login --service-principal --username $env:CLIENTID --password $env:CLIENTSECRET --tenant $env:TENANT_ID
az account set --subscription $env:SUBSCRIPTION

Or using PowerShell (you need to use "ConvertTo-SecureString" to build the credential), something like the following:


Write-Host("init credential")
$passwd = ConvertTo-SecureString $env:CLIENTSECRET -AsPlainText -Force
$pscredential = N…

A weird case of special characters in a CosmosDB stored procedure

For some reason I am getting Specials Unicode characters (https://en.wikipedia.org/wiki/Specials_(Unicode_block)) that just mess up my stored procedure deployment to Cosmos DB via the "CosmosDB" module from the PowerShell Gallery.

All it needs is an editor like Notepad++: click the option "Show All Characters" and edit away! :)
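If you prefer to hunt them down from the command line instead, a quick PowerShell check like this also works (the file name is just an example):

# scan a stored procedure file for characters in the Unicode Specials block (U+FFF0 - U+FFFF)
Select-String -Path ./storedProcedure.js -Pattern '[\uFFF0-\uFFFF]' -AllMatches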




Publishing powershell module

First, create a folder to place all these files in, and make sure you have PowerShell installed.

Create your powershell module with a psm1 extension

You can copy and paste the code from here; just make sure the file has the right extension.
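If you don't have one handy, a minimal placeholder module could look like this (the function is just an example):

# firstmodule.psm1
function Get-Greeting {
    param([string]$Name = 'world')
    "Hello, $Name"
}
Export-ModuleMember -Function Get-Greeting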

Set up your module manifest:

New-ModuleManifest -Path ./firstmodule.psd1
Edit firstmodule.psd1 and add the following


Description = 'demo module'

This basically gives it the minimum requirements :)
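If the generated manifest doesn't already point at your script module, you may also want to set RootModule so your functions actually get exported (the file name assumes the module created earlier):

RootModule = 'firstmodule.psm1'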


Publish-Module -Path /Users/jeremy/tmp/work/powershell/firstmodule -NuGetApiKey <your-nuget-api-key>

Getting started with spark with databricks cloud

When you sign up for the Community Edition, you tend to create a notebook with it.

If you want to make use of an existing database, you can query it using the following commands.

To load a table called airlineflight:


use databricks;

select month from airlineflight where year == 2002


To convert it to a DataFrame you can use the following command:

%python 

df = sqlContext.sql("Select * from airlineflight")


And from this point onwards, you can manipulate it using filter, select, etc. Please refer to the documentation here.


Some quick examples,

%python

df.select("FlightNum").collect()


df.filter(df.DepTime>24).count()


Documentation for Databricks Cloud / Community. I finally managed to zoom in on which Spark library I am working with, so I can progress further.

Having my notebook in Python allows me to work with, and subsequently use, the functions defined here:

http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.functions.concat




Azure Devops artefact and release pipelines

I guess the concept is pretty easy, but we just get lost in piles of documentation.

The key concept is: when you build, make sure you publish your output as an artefact. How do you do that? It depends on whether you're building a .NET Core or JavaScript based project.
If you're using .NET Core, you just use the "publish and zip" approach for the artifact.

If you're using JavaScript, then you need to publish your artifact to a drop folder.

For example, here we have PublishBuildArtifacts generating DLLs into a drop folder:



steps:
- script: dotnet build src/AzureFunctionApp.sln --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
- script: dotnet publish src/AzureFunctionApp.sln --configuration $(buildConfiguration)
  displayName: 'dotnet publish src/AzureFunctionApp.sln $(buildConfiguration)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'

Azure Blob copy to $web

When you try to copy files to $web in your static site (Azure storage account), you need to escape $web otherwise you won't be able to copy files across.

So typically you do something like

az storage blob upload-batch -s sourcefolder -d \$web

Especially when you're on a Mac or Linux based system, where the shell would otherwise try to expand $web as a variable.
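Single quotes work too, since the shell doesn't expand variables inside them (the storage account name here is a placeholder):

az storage blob upload-batch -s sourcefolder -d '$web' --account-name mystorageaccount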



vscode enable application insights

Install the NuGet package:

dotnet add package Microsoft.ApplicationInsights.AspNetCore --version 2.7.0


Set up the code in Startup.cs:


public void ConfigureServices(IServiceCollection services)
{
    // The following line enables Application Insights telemetry collection.
    services.AddApplicationInsightsTelemetry();

    // code adding other services for your application
    services.AddMvc();
}


appsettings.json

"ApplicationInsights": { "InstrumentationKey": "putinstrumentationkeyhere" },



In the section above, replace "putinstrumentationkeyhere" with the GUID created for your Application Insights resource in the portal.