It's one of the basic terms you will come across in machine learning. This is how the model gets optimized and directs itself toward the correct solution. When the expected output differs from the actual output, you have an error, and you can program the model training to adjust itself to reduce this error in the next iteration. This is where the loss function comes into play. Different loss functions perform better on different problems, so it is important to pick the right kind of loss function. For regression problems, mean squared error is a good fit, whereas for classification problems cross entropy (log loss) is a better fit.
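As a quick illustration, here is a minimal numpy sketch (not tied to any particular framework) that computes both losses for a small batch of predictions:
import numpy as np
# regression: mean squared error between expected and predicted values
y_true = np.array([3.0, 5.0, 2.5])
y_pred = np.array([2.8, 5.4, 2.1])
mse = np.mean((y_true - y_pred) ** 2)
print("mean squared error:", mse)
# binary classification: cross entropy (log loss) between true labels and predicted probabilities
labels = np.array([1, 0, 1])        # true classes
probs = np.array([0.9, 0.2, 0.7])   # predicted probability of class 1
ce = -np.mean(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
print("cross entropy:", ce)
The smaller the loss value, the closer the predictions are to the expected output, which is exactly what the training loop tries to drive down on each iteration.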
what is a tensor
You will come across this term all the time and often wonder what a tensor is in the context of machine learning.
A tensor is essentially an n-dimensional array, a multi-dimensional array where n can range from 0 upward.
When you learn ML in MATLAB, you will be dealing with arrays, which are vectors, and when you create a two-dimensional array represented as rows and columns, you have a matrix, so a lot of matrix operations are very relevant when it comes to ML.
So the next question is how to express an n-dimensional array. It is often easy to visualize a three-dimensional structure, but once you go past three dimensions it is no longer possible to visualize, so it becomes easier to express this in terms of tensors.
A vector is a one-dimensional tensor, a matrix is a two-dimensional tensor, and so on. Hope this helps.
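A quick numpy sketch (assuming numpy is installed) makes the progression concrete:
import numpy as np
scalar = np.array(5)                    # 0-dimensional tensor
vector = np.array([1, 2, 3])            # 1-dimensional tensor
matrix = np.array([[1, 2], [3, 4]])     # 2-dimensional tensor
cube = np.zeros((2, 3, 4))              # 3-dimensional tensor
# ndim reports the number of dimensions of each tensor
print(scalar.ndim, vector.ndim, matrix.ndim, cube.ndim)   # prints 0 1 2 3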
Homomorphic encryption
One of the challenges many face in using compute available from cloud providers for machine learning is that the training data has to be uploaded to the cloud. A lot of organizations are not comfortable uploading sensitive data to the cloud.
Homomorphic encryption can help overcome this challenge. This form of encryption allows computation to be performed on encrypted data. The final result can be decrypted with the private key and it will match the result you would get if the model had been built on unencrypted data. This opens up the potential for an organization to encrypt the training data on-prem. The encrypted data can then be uploaded to the cloud, and a machine learning model can be trained and built there. The model can then predict and output the result in encrypted form, which can be decrypted on-prem with the private key. This ensures that only encrypted data is pushed to the cloud, significantly reducing the risk, while still allowing the organization to leverage the vast computing power available in the cloud.
Here is a simple example of homomorphic encryption. The first step is to install the phe package:
pip install phe
The next step is to write a simple Python program to demonstrate the addition of two numbers:
import phe as paillier
print("generating paillier keypair")
pubkey, prikey = paillier.generate_paillier_keypair(n_length=64)
a = pubkey.encrypt(10)
b = pubkey.encrypt(20)
c = a + b  # adding two encrypted values - these are two EncryptedNumber objects
print("adding the encrypted values, the output would be another encrypted object")
print(c)
print("decrypt with private key")
print(prikey.decrypt(c))
The output of this program is as follows:
generating paillier keypair
adding the encrypted values, the output would be another encrypted object
<phe.paillier.EncryptedNumber object at 0x0000016D8252DCD0>
decrypt with private key
30
The output of adding 10 and 20 is 30, even though the sum was done on encrypted objects.
This is a very simplistic example of homomorphic encryption. The next step is to use this to build an actual model on encrypted data.
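As a hedged sketch of that next step, here is a toy example (the weights, features, and variable names are made up for illustration) that scores a simple linear model on encrypted features. Paillier supports adding encrypted numbers together and multiplying an encrypted number by a plaintext scalar, which is enough to compute a linear prediction without ever seeing the raw data:
import phe as paillier
pubkey, prikey = paillier.generate_paillier_keypair(n_length=128)
# plaintext model, assumed to have been trained elsewhere
weights = [0.5, -1.2, 2.0]
bias = 0.3
# the data owner encrypts the features before sharing them
features = [4.0, 1.5, 2.2]
enc_features = [pubkey.encrypt(x) for x in features]
# the untrusted party computes the score on encrypted data only:
# encrypted + encrypted and encrypted * plaintext are both supported
enc_score = pubkey.encrypt(bias)
for w, ex in zip(weights, enc_features):
    enc_score = enc_score + ex * w
# only the holder of the private key can read the prediction
print(prikey.decrypt(enc_score))  # 0.3 + 0.5*4.0 - 1.2*1.5 + 2.0*2.2 = 4.9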
recurrent neural network
RNNs, or recurrent neural networks, are a class of neural network where all of the previous inputs play a part in defining the next step, and this forward loop continues until the last step. This makes them very useful for time-series use cases or natural language processing.
RNNs can model the sequential data they see; recurrent, as in recurring, indicates something happening again, and that lends itself well to NLP.
This link has a good cheat sheet on the architecture of recurrent neural networks:
https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks#overview
Basically, there are one-to-one, one-to-many (music note generation), many-to-one (sentiment analysis), and many-to-many (language translation) types of RNN architecture. Many-to-many comes in two kinds: one where the output length matches the input length, and one where the two differ, for example in translation, where the target language can have a different number of words than the source language.
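To make the recurrence concrete, here is a minimal numpy sketch of a single RNN cell (my own toy example, not taken from the cheat sheet), where each hidden state depends on the current input and the previous hidden state:
import numpy as np
# toy sizes: 3-dimensional inputs, 4-dimensional hidden state
input_size, hidden_size = 3, 4
rng = np.random.default_rng(0)
Wx = 0.1 * rng.normal(size=(hidden_size, input_size))
Wh = 0.1 * rng.normal(size=(hidden_size, hidden_size))
b = np.zeros(hidden_size)
def rnn_step(x_t, h_prev):
    # the new hidden state mixes the current input with the previous state
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)
# run the recurrence over a short sequence of 5 time steps
h = np.zeros(hidden_size)
for x_t in rng.normal(size=(5, input_size)):
    h = rnn_step(x_t, h)
print(h)  # the final hidden state summarizes the whole sequence (a many-to-one setup)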
Setting up a local azure ML env
Assuming you have conda set up for your environments, start out by listing all of the conda environments:
conda info --envs
Let's assume you want to create a new environment with a specific version of Python:
conda create -n azuremlenv python=3.7.7
This will download the required packages and install them.
Collecting package metadata (current_repodata.json): done …
The next step is to activate the new environment:
conda activate azuremlenv
Replace azuremlenv with your environment name in the above statement.
Install the notebook and ipykernel packages:
conda install notebook ipykernel
The next command installs the azureml-sdk along with the notebooks and automl extras:
pip install azureml-sdk[notebooks,automl]
The last step is to install a kernel so that the environment shows up in the Jupyter notebook:
python -m ipykernel install --user --name azuremlenv --display-name "azuremlenv"
or
conda install nb_conda_kernels
The nb_conda_kernels package allows conda environments to be recognized as kernels in Jupyter.
Once you bring up the Jupyter notebook, you will see the dropdown with the corresponding ML environment that you can use as the kernel for the notebook. The ipykernel install command above registers the kernel by creating a new kernel.json file that looks like the one below.
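The exact paths vary by machine (the python path below is just a placeholder), but the generated kernel.json typically looks roughly like this:
{
  "argv": [
    "C:\\Users\\<user>\\Anaconda3\\envs\\azuremlenv\\python.exe",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "azuremlenv",
  "language": "python"
}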
To bring up the notebook, you just need to type jupyter notebook on the command line.
If the kernel list does not show the conda env that you just installed, then type:
python -m ipykernel install --user --name nameofcondaenv --display-name "nameofcondaenv"
Restart the notebook and the new kernel should appear in the dropdown under Kernel -> Change kernel.
Happy coding!