Showing posts from September, 2019

spark - pyspark reading from excel files

A common mistake is loading the wrong jar file when reading excel files. Yes, you have to use the Scala 2.11 build and not 2.12. :)

You can try using the following command line

pyspark --packages com.crealytics:spark-excel_2.11:0.11.1

And use the following code to load an excel file in a data folder. If you have not created this folder, please create it and place an excel file in it.

## using spark-submit with option to execute the script from the command line :-
## spark-submit --packages com.crealytics:spark-excel_2.11:0.11.1
## or interactively with pyspark :-
## pyspark --packages com.crealytics:spark-excel_2.11:0.11.1
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("excel-email-pipeline").getOrCreate()
df = spark.read.format("com.crealytics.spark.excel") \
    .option("useHeader", "true") \
    .option("inferSchema", "true") \
    .load("data/excel.xlsx")
df.show()

Az cli passing the correct data type with some examples

When you are working with az cli, you often need to pass in the correct data type. Figuring out the correct syntax and what the data type looks like can be pretty time consuming.

For demo purposes, I assume we are working with an event hub.

Update resource 

When you update, you need to use the following syntax. Notice the equal sign, setting the value 'Deny' on a nested property called properties.defaultAction.

az resource update -n "myeventhubnamespace/networkrulesets/default" -g "myfakeresourcegroup" --resource-type "Microsoft.EventHub/namespaces" --set properties.defaultAction=Deny

Adding resources 

When you add, you use the following syntax. Notice there is no more equal sign. In this scenario, we wanted to add entries to an array / list of key value pairs. The syntax looks something like this :-

az resource update -n "myeventhubnamespace/networkrulesets/default" -g "myfakeresourcegroup" --resource-type "Microsoft.Ev…
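The command above got cut off in this archive view. A hypothetical sketch of the --add form might look like this, with placeholder values, assuming you are appending an entry to the event hub's ipRules list :-

```shell
# Hypothetical sketch: appending an entry to the ipRules array (placeholder values)
az resource update -n "myeventhubnamespace/networkrulesets/default" \
  -g "myfakeresourcegroup" \
  --resource-type "Microsoft.EventHub/namespaces" \
  --add properties.ipRules "{\"ipMask\": \"10.0.0.0/24\", \"action\": \"Allow\"}"
```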

spark - connecting to mongodb using mongodb connector

One way of connecting to a mongodb database using the mongodb spark connector (not the usual mongodb driver) is with the following code.

It starts off with the command line (to download the connector from the maven repository), and then you run the code to connect and show the data.

# Start pyspark with the following :-
# pyspark --conf "spark.mongodb.input.uri=mongodb://" \
#   --conf "spark.mongodb.output.uri=mongodb://" \
#   --packages org.mongodb.spark:mongo-spark-connector_2.11:2.4.1

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("mongo-email-pipeline") \
    .config("spark.mongodb.input.uri", "mongodb://") \
    .config("spark.mongodb.output.uri", "mongodb://") \
    .getOrCreate()
df = spark.read.format("mongo").load()
df.show()

python import

Looking at the different ways you can import python modules.

Generally it begins with the path to a module. For example, if you have a custom module, you can do something like this :-

from email.emailer import *

That's what the 'from' keyword is for. The asterisk tells python to import everything the module exports.

Don't forget the __init__.py file.
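To make the variations concrete, here is a quick sketch using only standard library modules :-

```python
# Import everything a module exports (this is what the asterisk does)
from math import *
print(sqrt(16))  # 4.0

# Import the module itself, keeping its namespace
import json
print(json.dumps({"a": 1}))  # {"a": 1}

# Import one name and rename it locally
from os.path import join as path_join
print(path_join("data", "excel.xlsx"))
```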

openssl - where you can find some examples code

Did some digging and found out that openssl does have some example code, which you can find here :- (this is a forked repo; you can find more recent code in the master repo)

Have fun

openssl client example - compiling and running

I assume you have built and installed openssl. So the next step would be to compile the client....

To compile it, linking against the openssl shared objects :-

gcc -o client client.c -lssl -lcrypto -L /opt/openssl/lib -I /opt/openssl/include

Setting up your library path :-

export LD_LIBRARY_PATH=/opt/openssl/lib

And finally, running it.
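The run step itself got lost in this archive view, but presumably (assuming the binary compiled above with -o client) it is just :-

```shell
export LD_LIBRARY_PATH=/opt/openssl/lib
./client
```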


openssl - compiling in the docker ubuntu

To set up openssl in the docker ubuntu container, run the following commands :-

1. docker run -it ubuntu /bin/bash

2. apt-get update

3. apt-get install git

4. apt install g++

5. apt install build-essential

6. git clone

7. cd openssl

8. ./config --prefix=/opt/openssl --openssldir=/opt/openssl enable-ec_nistp_64_gcc_128

9. make depend

10. make

javascript ; different ways of writing return types

I definitely wouldn't have thought about how many ways there are to write the same statement..

Ripped off from stackoverflow :-
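The snippet itself didn't survive this archive view, but the usual variations look something like this sketch :-

```javascript
// A classic function with an explicit return
function double(x) {
  return x * 2;
}

// An arrow function with a body and an explicit return
const double2 = (x) => {
  return x * 2;
};

// An arrow function with an implicit return
const double3 = x => x * 2;

// Implicitly returning an object literal needs parentheses
const wrap = x => ({ value: x });

console.log(double(2), double2(2), double3(2), wrap(2).value); // 4 4 4 2
```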

react with custom hooks

As you can see from the code below, this control uses react hooks with a custom flavour. Pretty easy to understand.

useState gives you the value and a method for setting it.

This control exposes onChange, which just updates the state.

And finally it also exposes value, which you can use to show / render in the control what info has been typed by a user.

What is the spread operator doing in there?

It just does a mapping and renders something that looks like this
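The control's code didn't make it into this archive view. Here is a minimal sketch of the pattern, with a tiny stand-in for React's useState so it runs outside React; the hook name useInput is made up :-

```javascript
// A tiny stand-in for React's useState so this sketch runs outside React.
// Real React keeps one state slot per hook call between re-renders; one slot is enough here.
let _state;
function useState(initial) {
  if (_state === undefined) _state = initial;
  const setState = next => { _state = next; };
  return [_state, setState];
}

// The custom hook (hypothetical name): exposes value and onChange
function useInput(initialValue = '') {
  const [value, setValue] = useState(initialValue);
  return { value, onChange: e => setValue(e.target.value) };
}

// Simulate a user typing, then a "re-render" picking up the new value
let input = useInput();
input.onChange({ target: { value: 'hello' } });
input = useInput();
console.log(input.value); // hello

// In the real component you spread the hook's result onto the control:
// <input {...useInput()} /> — that spread is the mapping of value and onChange onto props
```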

async function - different ways of writing

Interestingly, there are many ways of writing an async function, which I do forget ... :)

Just to make sure I have documented the right ways....
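For the record, a few of those ways, sketched out :-

```javascript
// 1. async function declaration
async function getDataA() { return 'a'; }

// 2. async function expression
const getDataB = async function () { return 'b'; };

// 3. async arrow function
const getDataC = async () => 'c';

// 4. async method on an object (works in classes too)
const api = { async getDataD() { return 'd'; } };

// Every variant returns a Promise
getDataA().then(v => console.log(v)); // a
```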

The specified initialization vector (IV) does not match the block size for this algorithm

If you're getting an exception when trying to assign an IV / key to a symmetric algorithm, with the error message below :-

The specified initialization vector (IV) does not match the block size for this algorithm

And your code probably looks something like below :-

Then you need to go into debug mode and start looking into the supported size for IV as shown in diagram below :-

As you can see, IV is 16 bytes, so you need to provide 16 bytes. The same goes for Key field too, if you're adding anything to it.

As long as you provide a valid IV and key size, you're good and your code will be ready.

npm install package using proxy

Sometimes you might not have direct internet access, and when that happens you can be limited.
To get around this, you can authenticate yourself against the proxy. Use the following commands,
where the URL is your proxy.

npm config set proxy http://"dev.username:password"
npm config set https-proxy http://"dev.username:password"

I also noticed that having the following registry setting results in a 418 or possibly a 404.

npm config set registry ""

So if you have this, try to remove it using npm config edit

Also, if you have some issues with certificates, you can probably set

npm set strict-ssl false

apollo server mutation sample

Sample mutation query for testing on an apollo server :-
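The sample got lost in this archive view. A hypothetical mutation (the schema and field names here are made up) would look something like this :-

```graphql
mutation {
  addEmail(input: { to: "someone@example.com", subject: "hello" }) {
    id
    subject
  }
}
```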

Azure key vault - enabling protection from delete and purging

Soft delete means your item is marked for deletion, but your key is not removed from the system.

Purging is like 'emptying' your recycle bin. Purge everything and you won't see it again. If you have a soft deleted key, you can still purge it, and then your key goes missing for good.

That's why purge protection is important too.

Here are some considerations when working with soft delete and purging on a vault :-

1. You cannot undo it. You cannot change purge protection back to false, and you cannot change soft delete back to false, once you have enabled them.

2. You need to use the cli to recover a soft deleted item.

3. If you purge your vault, you still get charged for it until it is really removed.
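For point 2, a hypothetical cli sketch (the vault and key names are made up) for recovering soft deleted items might look like :-

```shell
# Recover a soft deleted vault (hypothetical name)
az keyvault recover --name myvault

# Recover a soft deleted key inside a vault
az keyvault key recover --vault-name myvault --name mykey
```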

React Test Util - Find control by Id

The following code takes advantage of the react test utils to find a control by id :-

react-test-renderer - Cannot read property 'current' of undefined

You need to have the same version of react and react-test-renderer for this to work.
As you can see from my package.json file :-

"react": "16.8.6",
"react-test-renderer": "16.8.6"

a react library called create scripts...

react-scripts is a package that provides a zero-configuration way of setting up and running your react project.

It provides a dev server that supports hot reloading of your react app.
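For reference, these are the scripts that create-react-app wires into your package.json, all backed by react-scripts :-

```json
{
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test"
  }
}
```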

deploying static website (react) to storage account using az cli

The following script allows you to deploy compiled web assets into a storage account in Azure.
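The script itself got lost in this archive view. A minimal az cli sketch (hypothetical account name, assuming static website hosting is enabled on the account so content is served from the $web container) would be :-

```shell
# Hypothetical account name; uploads the compiled assets in ./build
# into the $web container used for static website hosting
az storage blob upload-batch --account-name mystorageaccount -s ./build -d '$web'
```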

Live sharing with visual studio

To fire off your live share, just click on the icon at the far right of your visual studio.

It will generate a link which you can copy and share with your team members.

Your team member will then copy and paste that into a browser which fires up a visual studio. And then you can see who is changing what....... if you open the same file. :)

Sweet! that's it

az cli number of workers used for a webapp

I use the following script to configure the number of workers used for a webapp :-
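The script didn't survive this archive view. A hypothetical sketch (plan and resource group names are made up) scaling the underlying plan would be :-

```shell
# Hypothetical names; scale the plan backing the webapp to 2 workers
az appservice plan update --name myserviceplan --resource-group myresourcegroup --number-of-workers 2
```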

terraform - specifying the size of instance for a service plan

You can specify the number of instances you want to create for a service plan, as shown in the code below :-

The magic number is set / configured in a field named "capacity".
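The code didn't make it into this archive view. A sketch for the 2019-era azurerm provider (all names here are hypothetical) :-

```hcl
resource "azurerm_app_service_plan" "example" {
  name                = "example-plan"
  location            = "westeurope"
  resource_group_name = "example-rg"

  sku {
    tier     = "Standard"
    size     = "S1"
    capacity = 2   # the number of workers / instances
  }
}
```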

Redux - is there any way to explain it easier

First you need a store. This store is the central repository of data and you can work with data here.

#1 Code

import { createStore } from 'redux'
const reduxstore = createStore(appreducers)

What is a reducer? It sets the value of our data. It could be a single record update or appending to a list.
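To make the reducer idea concrete, here is a tiny runnable sketch with a stand-in for redux's createStore (assumption: the real redux package isn't installed here) :-

```javascript
// A tiny stand-in for redux's createStore, just to show what a reducer is for:
// it takes (state, action) and returns the next state.
function createStore(reducer) {
  let state = reducer(undefined, { type: '@@INIT' });
  const listeners = [];
  return {
    getState: () => state,
    dispatch: action => {
      state = reducer(state, action);
      listeners.forEach(l => l());
    },
    subscribe: l => listeners.push(l),
  };
}

// A reducer appending to a list, as described above (action type is made up)
function appreducers(state = [], action) {
  switch (action.type) {
    case 'ADD_EMAIL':
      return [...state, action.payload];
    default:
      return state;
  }
}

const reduxstore = createStore(appreducers);
reduxstore.dispatch({ type: 'ADD_EMAIL', payload: 'hi@example.com' });
console.log(reduxstore.getState()); // [ 'hi@example.com' ]
```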

<Provider store={reduxstore}> </Provider>

When you set this up, you're making the store available to the components underneath it.

#2 - When you have your store, what do you do next?

You want your component to be able to work with it, meaning it can send data and get the data it needs.

First you need to use "connect" from react-redux, and you wire it to your component.

Importing the connect function

import { connect } from 'react-redux'

Next you define a simple component like you normally would.
And finally you make your store accessible to your component using the code below.
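That wiring might look something like this hypothetical sketch (the component and the state shape are made up) :-

```javascript
import { connect } from 'react-redux'

// Hypothetical component that receives emails from the store as a prop
const EmailList = ({ emails }) => emails.join(', ')

// Map the slice of store state this component cares about onto its props
const mapStateToProps = state => ({ emails: state.emails })

// connect returns a wrapped component that reads from the <Provider> store
export default connect(mapStateToProps)(EmailList)
```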


Azure linking up different resource groups into a service plan

In your deployment, you might want to link function app from different resource group with a specific service plan. Working with az cli blindly (especially with dodgy documentation) could be a challenge.

Here is a tested az cli script to tie a function app to a specific service plan (that you might have created somewhere else) that might help.

 az functionapp create --name $env:environment$env:functionapp_name --resource-group $env:environment$env:resource_group_name --storage-account $env:environment$env:storage_accountname -p /subscriptions/$env:Subscription/resourceGroups/$($env:environment)$env:serviceplan_resource_group/providers/Microsoft.Web/serverFarms/$($env:environment)$env:serviceplan_name --app-insights-key $appInsightId --runtime dotnet --os-type Windows

$xxx are powershell variables, so you can tell that the script is written in powershell.

git pull and push with username password

Just use the command below and provide your username and password :-

git pull https://yourUsername:yourPassword@github.com/repoInfo 

When you're trying to push upstream and getting some errors :-

git push --set-upstream origin feature/stored-proc-refactor

And then it prompts you for a password for an invalid username, you can do something like this

git remote set-url origin 
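To see set-url in action, here is a runnable sketch in a throwaway repo (all the URLs here are made up) :-

```shell
set -e
# Demo inside a throwaway repo
git init -q demo-repo
git -C demo-repo remote add origin https://wrongUser@github.com/you/repo.git
# Point origin at a URL that embeds the username you actually want
git -C demo-repo remote set-url origin https://yourUsername@github.com/you/repo.git
git -C demo-repo remote get-url origin
```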

javascript prototype

__proto__ is an internal property of an object, pointing to its prototype. It is like python's way of using double underscores to refer to internal object attributes.

The prototype in javascript is used to reference the parent. You can find a very clear and good description of this here.
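A quick runnable sketch of the relationship :-

```javascript
// A constructor function with a method on its prototype
function Animal(name) { this.name = name; }
Animal.prototype.speak = function () { return this.name + ' speaks'; };

const cat = new Animal('cat');

// The instance's internal __proto__ points at the constructor's prototype
console.log(Object.getPrototypeOf(cat) === Animal.prototype); // true

// Property lookup walks up that prototype chain to find speak
console.log(cat.speak()); // cat speaks
```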