{ "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "IyuBmjJqKxBc", "slideshow": { "slide_type": "slide" } }, "source": [ "# Automatic Differentiation\n", "\n", "Parts of this is inspired by [this tutorial](https://github.com/DataScienceUB/DeepLearningMaster2019/blob/master/2.%20Automatic%20Differentiation.ipynb)\n", "\n", "In many applications it is crucial to compute derivatives such as machine learning, optimization problems, and so on. Often the most basis idea is to approximate the gradient of a function by computing a finite difference approximation\n", "$$\n", "\\frac{f(x_1+h,x_2,\\ldots,x_n)-f(x_1,x_2,\\ldots,x_n)}{h}\n", "$$\n", "or numerically a little better using a centered difference\n", "$$\n", "\\frac{f(x_1+h,x_2,\\ldots,x_n)-f(x_1-h,x_2,\\ldots,x_n)}{2h}\n", "$$\n", "\n" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib.pylab as plt\n", "\n", "def diff_f(x,h):\n", " fvec = np.exp(x)\n", " y = np.abs((np.exp(x+h)-np.exp(x-h))/(2*h)-fvec)/np.abs(fvec)\n", " return y\n", "\n", "def diff2_f(x,h):\n", " fvec = np.exp(x)\n", " y = np.abs((np.exp(x+h)-np.exp(x))/(h)-fvec)/np.abs(fvec)\n", " return y" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYEAAAD8CAYAAACRkhiPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO2dd3hURdfAf5OE0KuACATpXRApiliwoNhAEbDrKyiioq/lU8Gu2HtDEQXBBiKiIKDoqzTpTQNI76EGAoFAIMnu+f44STYJSUjZzaac3/PcZ3Pn3jtzZhfm3Dlz5hwnIhiGYRglk5BgC2AYhmEED1MChmEYJRhTAoZhGCUYUwKGYRglGFMChmEYJRhTAoZhGCWYsGALkBuqV68u9evXD7YYhmEYRYqlS5fuE5EamV0rUkqgfv36LFmyJNhiGIZhFCmcc1uzumbmIMMwjBKMKQHDMIwSjCkBwzCMEowpAcMwjBKMKQHDMIwSjCkBwzCMEowpAcMwjELOhAmwfXtg6jYlYBiGUYhZsAD69IFnnglM/UFTAs65Fs654c65Cc65e4Mlh2EYRmHm44/187TTAlO/X5WAc26Uc26vc25lhvLuzrm1zrkNzrnBACKyWkQGAn2BDv6UwzAMo7gQHg4VKsCzzwamfn/PBEYD3dMWOOdCgWHAFUBL4CbnXMvkaz2Av4A//CyHYRhGkcfrhd9+g8uabqbs1jUBacOvSkBEZgMxGYo7ARtEZJOIJADjgJ7J908WkXOBW/wph2EYRnFgzBjYvl3ouOxj+OijgLRREAHk6gBp17WjgLOdc12BXkBpYFpWDzvnBgADAOrVqxc4KQ3DMAoZH3wA4LiGqdDqgYC0URBKwGVSJiIyE5h5sodFZAQwAqBDhw7iV8kMwzAKMatXQyViacVq6NcvIG0UhHdQFBCR5rwusLMA2jUMwyiyxMbC8eNCNWKgfn0oXTog7RSEElgMNHHONXDOhQM3ApMLoF3DMIwiy9ChAI42/A1PPhmwdvztIjoWmA80c85FOef6i0gSMAiYDqwGxovIKn+2axiGUdyYNQscXkZyF3TqFLB2/LomICI3ZVE+jWwWfw3DMIz0rFkDZ7GM6u4AtG0bsHYsbIRhGEYhIzIS4uKE7vwCVasGtC1TAoZhGIWMe+4BcFzLJLjoooC2VaQSzRuGYZQEIiOhA4s5i6Xw/OiAtmUzAcMwjELEX3/B0aMwkesJCQ2F1q0D2p4pAcMwjELEY49BVfZTl+3QrFnA2zMlYBiGUUjwemHhQijv4jXUwgsvBLxNWxMwDMMoJPz9N4jAp6EDIaw09O4d8DZtJmAYhlFI+PpraFZqI1d6pgZ0b0BaTAkYhmEUAuLi4P334azw5Jxc771XIO2aEjAMwygE3HSTrgmce+Q3KF8eOncukHZNCRiGYQQZjwemToVSJDCQ4XDGGQXWtikBwzCMINOvny4IX15pHmF44YsvCqxtUwKGYRhB5ptvAIQxpQbAlVdC8+YF1rYpAcMwjCDy4YdqDupTaw7V9q+HK64o0PadSNHJ2NihQwdZsmRJsMUwDMPwGzVqwP79QhwVKOeOQXw8hIf7tQ3n3FIR6ZDZNZsJGIZhBIknn4R9++D22r9TTo5C+/Z+VwAnw5SAYRhGEDh2DF5/Xf9+Z/ct+sfkgs+8G1Ql4Jxr6Jwb6ZybEEw5DMMwCpo+fXRfwBNtf6GaZx80bQq1ahW4HH5XAs65Uc65vc65lRnKuzvn1jrnNjjnBgOIyCYR6e9vGQzDMAozv/0GU6ZAzZrw6rrk+EA//BAUWQIxExgNdE9b4JwLBYYBVwAtgZuccy0D0LZhGEahxuuFHj3076mPz8TFH9WQ0QHOG5AVflcCIjIbiMlQ3AnYkPzmnwCMA3rmpD7n3ADn3BLn3JLo6Gg/S2sYhlGwfPstHD8OnTpBh9f7auGE4FnEC2pNoA6wPc15FFDHOXeKc2440M45NySzB0VkhIh0EJEONWrUKAhZDcMwAsLvv8Pdd8P558PcLo9DdDRcemnQZgFQcPkEXCZlIiL7gYEFJINhGEbQWLoULrsMSpeG8Z8dJKz1u1CqFIwfH1S5CkoJRAERac7rAjsLq
G3DMIygEh/vCwr68MNQ65ZLISkJXn0VqlYNqmwFZQ5aDDRxzjVwzoUDNwIF7xBrGIZRwHg8ULcuJCZChw7w6mUzdFpQtiwMHhxs8QLiIjoWmA80c85FOef6i0gSMAiYDqwGxovIKn+3bRiGUZjweiEiAmJi4NRTYcE8L9x6q14cMya4wiXjd3OQiNyURfk0YJq/2zMMwyisdO0Ku3ZB9eqwYweEPvM07NwJQ4bobrFCgIWNMAzD8DMej+aFmTNHtwDs2gWh77+jawAdOsDLLwdbxFQKamHYMAyjRBAXB7Vrw+HDcPrpEBkJYTu3waOP6g3jxoHLzGEyONhMwDAMw09s3aprAIcPQ6NGsHEjhId6oGVygIS779YLhQhTAoZhGH7g9dehYUM4eBAGDoQNGyA0FGjVCo4c0cF/xIhgi3kCpgQMwzDyyciRutbr9cI778AnnyRfuPVWWLtW3UHXrg2qjFlhSsAwDCOPREZCly5w110aCmLBAt0MBsC0aZo8ODQU/v03eVpQ+DAlYBiGkQfuvhvatoV58+CJJ+DPP+Hss5Mv/vmnhgp1Tl2E6tcPpqjZYt5BhmEYuWD1ajjvPN0ABvDss/DCC2lu2LZNk8V7PPDVV754EYUUUwKGYRg55KGH4P339e/q1WH5cg0JkUpMDLRpAwkJ8NJLvt3BhRgzBxmGYZyEqVPhtNNUAYSFwdNPaxTodAogLg7q1IHYWHUPeuqpoMmbG2wmYBiGkQUxMZr8ZeNGPb/iCs3/Uq5chhu9XmjXTrPHt26dxj2o8GMzAcMwjAx4PHD//ZoDeONGHfR//lkdfjJVAHfcoRsDrrgCVqwIisx5xWYChmEYaZg+Ha66ShVB2bIa7ue//83iZo9HgwNt3AgPPuhbMChCmBIwDMMAJk6EL7+ESZM04dcFF8Bvv+kaQKZ4PLpQEB0NjRvDu+8WqLz+wpSAYRglmilT1IknNlZTPw4dqhu+ypfP5iGPR6PERUerm9CaNRBSNK3rpgQMwyiRLFgAPXvC3r16Xr06TJ6cA7f+FAWwd68+tHt3od0NnBOCqrqccw2dcyOdcxOCKYdhGCWHrVt1s1fnzjqOV6youd6jo3OgALxeaN5cHzzllCKvACAfSsA5N8o5t9c5tzJDeXfn3Frn3AbnXLYJNEVkk4j0z6sMhmEYOWX1amjSRCM4zJ2rL/EffgiHDuUwyVdSkm4E27ABmjaFPXuKvAKA/JmDRgMfAV+mFDjnQoFhQDcgCljsnJsMhAKvZni+n4jszUf7hmEYJyUqCu68E/73Pz2vWhWGD4e+fXNRSVISNGiglXXqBPPnF9k1gIzkWQmIyGznXP0MxZ2ADSKyCcA5Nw7oKSKvAlfnpR3n3ABgAEC9evXyKq5hGCWMrVvhkkt8G71OOUVdPZ95JpcVHTumOQGionQjWDFSAOD/NYE6wPY051HJZZninDvFOTccaOecG5LZPSIyQkQ6iEiHGjVq+FdawzCKHdHR8OKLmuN340b18hkzBvbty4MCiI1Vu9GmTXDppboRrBgpAPC/d1BmiTMlq5tFZD8w0M8yGIZRAlm6FK67DrYnv4b26KFx/q+5Jo8VHj6siwhHjuhawO+/+03WwoS/lUAUEJHmvC6w089tGIZhpDJ7Ntx0E+xMHmnKlNEIzr1756PSmBhdNIiOVs0ycaJfZC2M+Htesxho4pxr4JwLB24EJvu5DcMwDJYt0xf0Cy9UBVC2LLz9NsTH51MBrF4Np58Of/wBo0YVawUA+XMRHQvMB5o556Kcc/1FJAkYBEwHVgPjRWSVf0Q1DMOA77/XiM3t26uJ/pRTNH/70aPwyCP5rHzpUl38jYuDQYPUraiYkx/voJuyKJ8GTMuzRIZhGJkwbRo88ICu0YK66n/6KXTt6qcGZs6Eiy8GEZ1KfPihnyou3BSvZW7DMIodw4dDhQoa2XPTJmjUSOP9rF3rRwUweTJcdJEqgLvu0ulGCcFiBxmGUSiZPh0++EBnAKAun2PHqsu+X/nzT7j2Wv370Ufhrbf83EDhxmYChmEUGjwe6NdPwzd37w6LFmmq3vXrITIyAApg8mTo1k3/fumlEqcAwGYChmEUAhIS1CPz5581RhvAmWeq+2fFigFq9Ouv4fbbwTn48UcNKVoCsZmAYRhB4+hRuPtuTdk4aZIqgJTonsuXB1ABfPIJ3HabKoCvvy6xCgBsJmAYRhDYsQOuvlpNPF6vJnPp1k3XYytUCHDj/fur/39YGPz1F5x9doAbLNzYTMAwjAJj2za19detC3//rWF4nnlGZwS//FIACuCpp1QBOKe2phKuAMBmAoZhFAB790KXLhqKH3SDV//+8MorBRiS/7PP4LXXoHJlnQG0bl1ADRduTAkYhhEwJk2CqVM1ls+xYxraYcQIzelboFx2mQaAO+cc/Qz4lKPoYErAMAy/88kn8MQTGojTOY2+8NhjmpmxwGnRQhPBV65cQDanooUpAcMw/MZzz8Hrr8Px43pevbqa4PMczjk/eDyaSzIqSt2Ptm5VRWCkw5SAYRj5wuvVPVavvQYHDmhZRIQmbz/nnCAJlZCgUeb27dOBf9cutUUZJ2BKwDCMPJGUpHH8J05URRASovHXPvlEg7sFjWPHNBnMvn1w6qnqj1oMEsIHCnMRNQwjV8TGqmdP+fIwYYIqgG7dYP9+DcEfVAVw6BA0bqwmoA4dTAHkAFMChmHkiKNH1c2+ShW184eEwMCBuvj7229aHlT27dMNCDt2aEC4xYtNAeQAMwcZhpEtkZHw4IPw77+abbF8eejTB0aOLEQ516OiNLrc4cMac/rHH4MtUZEhqD+hc66Fc264c26Cc+7eYMpiGEZ6/vwTateGtm1h1ix175wzR5NuffFFIVIACxZoOshDh1RbTZkSbImKFPlJLznKObfXObcyQ3l359xa59wG59zg7OoQkdUiMhDoC3TIqyyGYfiPn36CatXgkkvUqaZcOXX7nD0bzjsv2NJlYPVqTTLs9cLNN8P77wdboiJHfnT5aKB72gLnXCgwDLgCaAnc5Jxr6Zw7wzk3JcNRM/mZHsBfwB/5kMUwjHzy9dfqVn/dderqWaUKjBkDR47A448HW7pMWLQIrrxSs4G9+SZ8802wJSqS5CfH8GznXP0MxZ2ADSKyCcA5Nw7oKSKvAldnUc9kYLJzbirwbcbrzrkBwACAevXq5VVcwzCy4P33dZBPSNDzDh207NxzgytXtnz6Kdx3H5QqpTYqCwSXZ/y9MFwH2J7mPArI8tdxznUFegGlySI5vYiMAEYAdOjQQfwlqGGUZDwe+O9/NabPoUNa1rChpnJs1iy4sp2UV1+FJ5/Uv7/5xhRAPvG3EnCZlGU5cIvITGCmn2UwDCMLPB717R84EA4e1MXdXr10g1fNmsGWLgc8/riafkBXp6+/PrjyFAP8rQSigIg053WBnX5uwzCMXBIdrUlc1q9Xe3+DBur1M3GiLgIXCVKSwYAKft11wZWnmOBvJbAYaOKcawDsAG4EbvZzG4Zh5JB16zRz4po1el66NPzwg+6lKjQunjnhiSd8
yWD+/BO6dg22RMWG/LiIjgXmA82cc1HOuf4ikgQMAqYDq4HxIrLKP6IahpFTVq9W236zZqoAwsI0l++RI2r+KVIK4J574I03dJfasmWmAPxMfryDbsqifBpZLPIahhFYfvxR0zWuSn71Kl1a4/gPHRpcufJMnz66iFGlimq2WrWCLVGxw8JGGEYxYMoUuOsu2LNHzxs1gvfe03WAIsvll2tQourVYcUKUwABwpSAYRRhnnsOhg3TCJ6g4+WIEUV8zdTrhXr1NBBcvXo6AyhXLthSFVtMCRhGEUNE10ZfeUU/QcM5jBihmRSLNF6v+v3v2KEmoDVrLBlMgClKy0OGUaJJSFCvnlKl4NJL9QX57rth7VrdNFvkFcCxYzqFWbIEOnfW0NCmAAJOiZkJ7I7cS6XaFShX3aaVRtEiJkYH/zlzfGV33KGRE0qXDp5cfiU6WrOBxcbC4ME6zXGZ7T01/E2JmAnsWRlN87aleaXnwmCLYhg5Zu9efTE+5RRVAM5B9+6a3GX06GKkADZv1nzAsbFwwQUaFsIUQIFRIpTAqa1rcE3Dlbwxrwtrp24ItjiGkS3Llmns/lq1NKxzmTJw662QmAi//FLMLCSRkToDSEzUhY1Zs4ItUYmjRCgBgLcmN6Oci+e+W2IRjzfY4hjGCSxbphE827dXO3/VqvDxxxAfr4Heil2mxLlz4cwzNaDRVVelt3cZBUaJUQKntqrOKzev4s/Y9oy9d3awxTGMVKZM0QG/fXtYulRnAG+/rW6f9xbXfHtLlujOXxGd5lg2sKBRYpQAwD1fnEOHCqt55PMWxG6OCbY4Rglnxgy48Ua45hqN6BkRoWW7dsEjjwRbugDyyy/q/ZOUpAvAX30VbIlKNCVKCYSWCuGTz0uxR07l7VuWBVscowSSEsc/PBwuvhimTtXYaCtWwLZtJSAszqRJavpJStK0ZUOGBFuiEk+JUgIAHW5oTN/TF/LO/HPY88/uYItjlBASEuCGG3Tw/+ADXQdt0kRt/6+9Bq1bB1vCAuCLL9TdyTlVALffHmyJDEqgEgAYOrI2xyjDK7etTi376+NI6oTu4sqaixl911/sXx/D/BErePq8mZxXKZKpzy8OosRGUeXYMc2CWKECjB+vG2LbtYOoKA3zXLt2sCUsIB59FPr10/Clv/9uCqAQUWI2i6Wl6SURvH/jX5x7jS8gVe2WVWhaaTerD9TizpERMBKgGqEkUZZ4nn0jiiufFVyI+S8bJ2f3bg3etny5Dvzh4brLd8IEqFw52NIVMCNHwjvv6Azgp5/UDmYUGkrkTADg/rHn0e7mFqz/fQveJC8Nu9ZjxoF2bDpel4WjVvHCRTMZ+8A8ojfF8caNy1kfX4ftC31J0pKOJXF+5X946dKZweuEUejYtk03dNWurZ4+zukG2CNH9AW4xCmAl17SfAC1a+vCR5EOa1pMEZEic7Rv3178ye7IPQIiQzrPyPa+o/uPyqEdh9KVfXLTLAGRCyov96tMRtFk926RVq1E1OdR5JRTRJ58UiQpKdiSBZELL9Qvo149kQMHgi1NiQZYIlmMq0E1BznnugJDgVXAONHE8wVGmcqlefq8mQx8t1m295Wtpls0xSskHk0kPiaeZ8dptK61h0uKUdfIjO+/V4eXH35Q+3/Zshra+c47gy1ZkGnbVncDh4fD4sUaEdQolORZCTjnRgFXA3tFpHWa8u7A+0Ao8LmIvJZNNQLEAWXQJPUFSuV6lRk6p2uO7j267yjnRETRt8tO4uIgWrrSN2Ie47efS+y2WCrXK2nz/JLNe+/B00+rmQc0B/rjj0PTpsGVK+h4PJrTcuNGDW60ZQvUrBlsqYxsyM9MYDTwEfBlSoFzLhQYBnRDB/XFzrnJqEJ4NcPz/YA5IjLLOXcq8A5wSz7kCSjlqpfj4ua7KFMGhv7Rmdsb/kW/BysQ8s48juxrmKkSSDqWRFiZErn2XmwZPFgVwPHjel6zpq57mqkbVQAREbrbrXx5XR2vUCHYUhknIys7UU4OoD6wMs15Z2B6mvMhwJAc1BMOTMji2gBgCbCkXr16gTKZ5Zh5n0bKGWXWStTindned3T/UWlSapN8cvOsApLMCBRJSSIvvSRSubLP5l+/vsjChcGWrBCRmCjSooV+OVWrihw/HmyJjDRQgGsCdYDtac6jgLOzutk51wu4HKiCzipOQERGACMAOnToIH6TNI90HnAG/9yV3lU04fBxwitqXN/4mHjan7aTB3rtoHNECM07VgqWqEY+SUjQsA4//aRDf2goXHklfPQRNGgQbOkKEbGx0KaNukZ16gTz5hXDaHfFF38rgcyc6LMcuEVkIjDRzzIEnLQKoFP5VTQ4JZbvtp0LwIpJm1id0IpTI6IZM/ac1PvEm15xeBI8HNh8kOrNTik4wY0cER0NDz2ki76JiVp22WW6+GvWjQwcPKhrAHv3QrdumhjeKFL4e59AFBCR5rwusDOLe4sFt1+5jx5X+/Tc8v9pxu92V9cBIGbjAVqV3sCnt6YPk/vaVXOo1bwyj3eaydF9RwtOYCNL4uJg6FC183/7LYSFqTI4ehSmTzcFcAKbNumUaO9eTX1mCqBokpWdKCcHJ64JhAGbgAaonf8foFV+2kh7+HufQCAY2HKmVOGAeD1eERHxerxSP2yb9Ky1IPUer8crjUttlmpuv4BIw7At8r83lgZL5BLP/PkibdqIVKni8/G/+24RjyfYkhVi1q0TCQnRL+zuu4MtjXESyGZNIM8zAefcWGA+0Mw5F+Wc6y8iScAgYDqwGhgvIqvyrqL8z9q1GsNq0yb/1OdJ8LB59vbUt/nlW0/hzMqbU00/LsRxedPN/LG7JQlxCXrP2DVsSKzP67etYsa7fxPqvHR//Ax2LNnlH6GMHPHzz/rW37mzurQ3awYLFmh+8xEjNMyNkQl792r2G69XZwAjRgRbIiMf5PmfuYjcJCKniUgpEakrIiOTy6eJSFMRaSQiL/tPVP/wzJAkfvoJrr7CQ2xs/uub/VEkDS+MYN6oNXiOJxF5pCHtGqavuHuP0sRRkfkj/wVg9dwYKhNLr2da0fWhM5n4XRJJlOLXjyz1ZUGQEr+nRw+1/1eoAB9+qArg7CzdGAwAFi5UBXD8uOYC+PHHYEtk5JMS9a6zfj1M+CmUK5jG+nVe+nY/RFJS/upseoEGoVu75DBrp28hnnKc2SH9evvF97cgjER+/e4gALd83IW9h8tSrXE1AFr1bEyDsG1s2+zJnzBGtgwbpvnM+/SBQ4c0gfuECXD4MAwaFGzpigCjR+u0KSoKpk2zXADFhBKlBN54A8JDkvii/AMMrzqE3xZU4qHLV5/8wWyofVYtyhPHunWw/Nc9ALS7PP0OyUp1K3Fu5VVM//tUko6p1gmvEJ563YU41h6qzQuzumbahjfJciLnFa8X3nxToxcMGgQ7d0KXLmr+2bcPrr8+2BIWEd55R2NhiMC771ok0GJEiVECO3ZoHot+VSZy6nlN6L/mcR6NGM+wP1vwYZ+85xx2IY6mZaNYG1We5YsSKc0xml9xohN597MPsjy+BXc
0X8jZFVbhSUj/1l+qrM4exOvzNJry7CJmvvc3Z1TYxPrft+RZxpJIQgLcdhtUqqThHBIT1ea/fj389ReccUawJSxCPPOM5gMAzXz/3/8GVx7Dr5QYJfDOO+D1Co/FPKkbWmrW5PX1vehRayEPTejCry8tyXPdzWrEsDb2VJo2D6Ffy4WUKlfqhHsuv0PNRnsOleGiM6IJDU+/mcaT4OHiqst4/qJZAGxfuJMeQzvw1SeHKR92nLj9x/MsX0kiMVGV/Wmnwddfa1C3m2+GAwdgzRpo3DjYEhYxBg3ScNAA48YV48z3JZis3IYK45FXF9F9+0TKlxe5pdsedWn7+efUa4d3HZYzy66WisTKionrTnjW6/HK9kU7Jel41jGBn7twhjg8En8gPst7PIkeqen2ys2n/5XlPQOaz5LhyWEmnu+qdW6esz3V3dTImi1bRNq2FSlXTn/ili1FrrpK5PDhYEtWhHn+ef0ynRP59ddgS2PkAwprKOmCYswYjfb4RJtf4HegY8fUaxVqVeDnWZXpdM5Rru5TllvOnklIiMbC+mdDeRbsa0SMnEbH8qv45sfyNOlW/4T6m7YOR2aFsGrKZtrf2iJTGULCQnjjrrXUaVwuSzk/XX0BoLOCUXMac+kpy6l/XntAw1EMvmgh53UrS5+3zIUlheXLoXdvn8tvqVLq+nnVVZrQxcgjd96pC8Fly8KSJdCyZbAlMgJFVtqhMB55nQkkJorMmCEiN9wgcvrpmd6zeMwqqR2yU8JIkBCSxOGRFuEbpF+T2fJytxlS1cVIeQ7LyP/MPuHNfMlX/6YGFls+bk2eZEwhPuaojO4/W0Bk/MPzUssP7Tgk51b4Rxweeaf3XPGmEeH4cZGYfR45FJOYr7aLEsuXizRqJKnfe3i4yCOPBFuqYsLtt+uXWq6cyNatwZbG8ANkMxMI+sCemyPfO4YbNBDp0ydPj25ftFMuqrIs00xkh3Yc0lkzHonbE5dn8bwerzQM2yIgUt1Fy7HYY+muH42Ok+tr/SUgcm/7BfLKdQula/VIKcVxAZEbmyxOvTfhaPFUCGPGiDRs6Bv8y5UTefPNYEtVjOjWTb/Y6tU1XZpRLDAlICKyd68IyMdth8viMatEROT9XjPlP03nynOPH5UPPxT54QeRY8eyrsKT6JF+TWaLwyMz3k2fVvLTW2fLP9+vzbt8ydzRaI6AyCPtZ2Quw7EEebjlL6mDYLtSkfLYGb/Iu5f/Ir+8uEhERKYNmS1NwrfIhnl78i1PYWHcOJGaNX2Df4sWIv/7X7ClKkZ4PCJNm+qXW7OmSGxssCUy/IgpARGRqVNlBa0klETp33S2iIg81WWG1GBP6sACIl26ZB8zJm5PnDQptUnqhW6Xg1sP5l2eLPj5mYVSmnhZPW1T1jd5vfLPp/Nlz+w1ks4ulMyC9xfIlTUWSdx+1Wie40V3VvDooyIVK/p+nzp1RH77LdhSFTO8XpHevSU1F4Ctphc7TAmIiPeZZ+UCZko1t1+i1+zzXRg+XBIJld0PvSrffScyatTJ61rw+QoJJVFuazgnz/JkR35MSifUtSVaGodslCfb/Cz7FmejWAoR+/eLTJmiCjll8L/kEpHNm4MtWTHk+HHfF3311bqAZhQ7TAmIyJcRTwqIfHb77PQXvF6Rfv30q5g0KbVo3DgdiLLiuQtnCIj0rLVAulVbIm3LrJE3rpyRZ/kCRdSiHdLr9MXi8EjvGjODLU627Ngh0q6db+CvV09k8GAb/APGwYPqOw0i112X6azSKB6UeCVwYPMBqcluOSdssXgSM7H1xMeLnHuuyMiRIqLmoBo1RAYOzLrOhCMJcn2dedIgbKucXX6F1A3dIQ3CTvSkePZZkQsv1BSFwaT/GQukKh5f1oIAACAASURBVPvFc/hIcAXJhHXrRFq39g3+ISEi99wjkpAQbMmKMVFR6lKVsqnCKNZkpwRKxI7hsof28CAf8PHtCwgJy6TLZcrAnDnQrx8AId4k2rbVgIlZUapcKSZEdWZTYj0WxLXm4avWsTmpHrv+3pPuvrGjjzFrFnz+uT97lHvO71aWA1Tj32//Dq4gyYjA77/DWWdB06awcqVmJOzXTwNUDh+uPv9GAFi3DurX19gaHTvCqkIV7d0oYEqEEii9+m+e4hXaDeqS9U0pweOnTIG2bTm71WEiIzWrVGasWQNt22omKoAuPaoDMPcrX6KCXbtg/bYylCKBpx8/zsGD/uhN3jj/To2XMGdCeiUVGwt79mT2hH+RNElGBw7UPUiXXaabvWrUgMGDdUwaOVIzehkBYvlyaNECkpLg0kth0aJgS2QEmRKhBDj9dBgwAFq3Pvm9NWrA1q2c8+NgPB5YuvTEW/7805eI5NdftazdDU0pQzxzZySk3jfrF9UgH7kH2X+oFEMfOeCP3uSJBq3KUTs8mjlLyqYr73luNBF1PHgzBCrdvt3Xt/zg9WqkzsaNYf58zUf+6af6tt+smU7A9u6FV1+1JC4BJzJS/+F6vbrN+vffgy2RURjIyk5UEAdwPjAc+ByYd7L7Cyy95J9/yl5qCIgMG5b+0o8/ioSFibRs6ZXvvxdZtsx37fxKf0un8itSz+/tvlHKcEQe7LZKbg0fJ2EkyNpl/vP8yS03tF4pddkm3mj1jtq7LV5A5Fa+FFns22i2a5dIg9OTJDTU6xdvwSlTfKkbQaRxY5Fffsl/vUYumDhR/+GCyBtvBFsao4AhEAvDwChgL2lyDCeXdwfWAhuAwTms61rgnpPdV6A5hs8/X/a07Jq+bP9+2fHpz3LHrUlycMYydVp/5plUt7rB58yQMBLkSLQuvraoskPKcFRApPYp8VKWOLmq9tITwk5MeGSuNA9bJ9+3fl68jzwq8v33AXHV++jRjQIim4dNFRGRj2/R3ceRpTuIp3MXWbZUB/327b2pA/Yfn23MU1teryrQq66S1BhkbdpoPl+jgPn1V18+4E8/DbY0RhAIlBK4ADiL9InmQ4GNQEN8ieZbAmcAUzIcNdM8Nx6odLI2C1QJvPuufj3btvnKPvlEy5YtE4mMlNjufeVLbpX159wqsmeP/PzMQgGRme8tl717dCB9usk4+eEHkWbNfG/C3wyam1rlvnX7pbqLTg39cF3Ij7KTWuoruXChX7sUuSxR2lffLItHRYp4vXJBuUXSsswG8Y74TIbylISHJcmqVSK33CLy9aD5AiIvNRmdqzYSE9UjKsXzsEwZ1ZM7d/q1K0ZOeecdnxYeNy7Y0hhBIiBKQOulfgYl0BmYnuZ8CDDkJHXUAz7L5voAYAmwpF69eoH8ntKzf78snbZbevdWbzoRkRfqfS5z6t+a6k+9Y4d+gx+GPSRSp44c/H66gMjLl82Q797bKSAy79EJIiJy9KjIgw94pEporJzi9snuFXtl/nyR/k1nSxgJsuy7dfL66yJlynilbHiidCy1TG5jjLx5zoTUmYU/2f6Vhqp+sddykaQkiX7yHfnsnUPi3Rudek/LU6PlSqaITJ9+0vqOHBHp0UOkVCmfsgsJEZk50++iGzklrQKYNi3Y0hhBpCCVQG
/g8zTntwEfnaSOF4Bzc9Jegc4ERGTBAv2GJkwQ2Txne+oAn4LXK1K5ssh9ffdqVLOXXpLm4RvkyhqLpEvDHeJIkn2L05tTVk7aIKWJl+41l0poiM4WnjjbV+fatSIPPihyaddEiagYIyDSuUKk7Fu33z+d8nolccVqeTviXQGRdSuP+679+6/GaPjqKxER6f+fJKkWEiOeM9qeuNHhhx9EXnhBYmJEXntNpFIl3+Cfcrz9tn9ENvLAb7+pFg4N1X/ARommIJVAn0yUwIf5aSPtUdBK4NiifyQ8JEH+755D8nK3GWpPn7M93T3nnCPStatowC2vV/o3nS2nESVhLlGqhRzItN7XumtdYSSIwyN//e9oljL88Nh8KU28tAjfIFvnReW7T9+8ESXliJPTy+6R9g32pb8YFycyaFCq7WbkSP0XsppmGk/j0CG9R0R2XnefrCnTVm4IGS9NWCtVquji78KFGuZ53Yn5eYyC4qmndErWtKnI338HWxqjEFCozUG5OQpaCcjq1XI28+X8RjukRYWtcl6lE/9D9eunQRdT+OG6r+RFnhIQ6dci8yxiifGJ0qn8Co3YW+m4RESI7Mkm4OfM95ZLZQ5InZCdsn1R/ozry5d55ao2WwVE3nor+3v//VckIsIrfza/V98oQeYO/FIaNhQpTbxUY58IyO83j8re6SS7iHyGf0kJBV2xoqbUMwwpWCUQBmwCGqRZGG6VnzbSHgWuBETkwapfppo3hr99onvnW2/ptZT/b+smRkpDNkhHFsicF7KOdbxz+W6Z9OQCWbJEF08vuCD7sXLZt6sFRF5JY47KK99+qy+Kade8MyMllEziskhZ03WAvFrpFWnJSgGRWrWSYytVqiTfXv6F1KiRxZizYoXIqaeKfPBBvuU2TkKnTvqPMTRUZP36YEtjFCIC5R00FtgFJAJRQP/k8iuBdcleQk/ltf7MjmAogfG9vhUQKV3aK/szMcvv26fWk5QBc9NGtfO/zcOS8N6wEx/IrI3xIk8+mWppyZL25VZJ5wqR2d80cmSOvIpyGi5+4UJ17QSRVq1EOnQQGT5cg0+KiMadufZa33nGRlJi1JcqJbJoUc4aNXJP8+b6PYeH+zwZDCOZgM0ECvoIhhKQpUtFQLw335Kj2w8dErm49r8yN+w8zXyeWWTGG28UufVW3XmWC1KSz+9ZuTfzG+bPl3U0lqNhFdVJPwdRIY8ezTq43QsvSOos6OOPM69uV9cbxdu+w4kXvF6RXr30rXTSJA0J2rChJSvxNx6PfrcgUrasSExMsCUyCiGmBPKD1ysyZIgqgyx4912Rb77JUPjZZ/r1prz9/vqrHl6vyB13iJxyinpvJL/+x8bqTt3sWPq15jIedefsTK8fu7yHlOewXHfqXG37jjt0lM/A3r0q8+WXi5QurWmXN2zQa6tWiVx6qUhEhO8F/oILMk81m5QkUr9CtNxV9qsTL8bFiVx2mc9F6K+/VCHcfLOFLPYXiYm+aVqVKpn+1oYhYkog4Jx1lg6ocXEi772X7Fxz5IjPLBMbK1K7ts4MUl67x43Tr/+ff0REpG5dze+dHV6PV+qE7JTrTstk2+2SJTKfs1Pf3H+8Yaz+MWDACbd27aqXmjcXeeAB1Ud166oiWLZMw2inXM9OMSUkiIy4ZrL8ziWZ5+VMSko/4A8dqhV//XX2HTVOzvHjumMdRNq3t8V3I1tMCQSYW27RN+dJk/QbPSH37f3364adtLb6JUv05h9+EBF9++7Y8eRtDWw5S8pzWOIPxKe/cO218m6ZwQIijRrp+BB72/2aiT2NCebQIX0hf+IJ36PLlqkpuXp1kWuuUbFefDH9+J2lo0mKH+mmNFnLtm4VmTNHTlgoSEoSuesunRUYeefwYdXaoIs0pgCMk5CdErC4jX6gZUuNujl2LFSsCOefn3xBBM48E4YNgwcegE6dfA81aQIXXwzlywMa3Xf16vQhl9m6Fbp2hVmzUouu6VuWI1Rg5kcrffetWAE//cSCRjdTrx58+y3s3AlPH3taY2GPHZt665w54PFAt26+x2fO1DDO8fHw88/w7rvwzDPgnF4fPBgaNCBdKOwRI+CDD0Dq1NWC7dt9F7//Xr+EuLj0X1RoKHz2GXTJJqR3Cum+CCOVrVshIgKioqBnT1i82MKvGvkjK+1QGI/COhOYOFFSzTC9e2e42LevJu8+dCjbOj7+WE4IVTT/v2PlNy6VqHJNZNpYTWoffyBeyhEn97aa5bvxhhtEKlSQ0yOSpG9fLXrgARHnvLKg0c36tpjMo4/qW3+K+fjff3Vd4Jpr1KvwhFmMiMybp2sDnTrppGLfPjVBd++eXAGkXxTZvVvXP7Ji/36RRx7JPKDQrFk6lSlbNvOFiJLM9u2+bGB9+gRbGqMIgZmDAsuaNT4lMHp0hovHj0u28ZiTbS4zZ+rzacP0XF9nvpweuk3uuHibhIbqpl0RzWscERol3jVrNTk4yM4HXxXQcDEiqnNq1RK5osVmrXj5chHR9YsL28aIvPGGeH6cJJ3bHZVq1bwnXZT+6SeNRNyli1p0QkJEVq4UXQuYOVMy9Z/Nig0bdJD/8ssTrw0f7vsyR4zIeZ3FnX37RBo00O/lmmuCLY1RxDAlEGA8Hh27wsKy3/l7Ao8/LtKkiYiIHDigs4GUmYDX45VaIbvl5tP/ksOH1dEGRPpGzJX3Lp2k43rIWbK23Jny9lV/yIiPEzVg3Txf9ZdfrrOBHeH1Re67T2Ji9Px595wIyA9cJyAyptZj+kZ/Er7/PnXjcGbrzT6GDUtd8M6S6OisryUm6qLG9defVKYSwYIFOvUqXdoUo5EnTAkUELl2gX/xRf0JMnHt27T6mIDIsLv1Df74cZGnu86R8hyWEJJkMC/J9jKN5fpGy1J38IaFicQnrxenNVG9fua3IpUry4+PzRUQmdXqXpGoKPHMWyCTB00Xb42aGgFu6tSTijxunM4Gdu9OU/jLL9qgiGqz3CQuWbrUtwKddiG5f3+NzheAvApFim++8f2QKVNBw8glpgQKK99+qz/BypUiog42KaGXv/pKL2WM/7Vn5V55qtNvciCkqiwJ7Sgrh06UEbfMFFAvVBEdoKtX1+ebNBFpXi9OvCAP8p6UdfFybM9BOZA2tt3WrSJnnqkeTO+9l/t+dO+udiYRX+jVSZNO/tysWekHt379dFOC16vbqEFk7tzs6yjOpOSvAJHnnw+2NEYRxpRAYWXRIv0JfvpJRETuvlsHbxGR+2+LlUqVvFnu5p0z25tqNvfccafcxDfinFfmzFGTcenSIqse/kym/fdX3bPW+CY5o8xaufTC4zJ9ur74p4suERcncu21Kk9ukwDs3OmbBo0erXWsXXvy5zwekfPO040KO3fqAvptt+m1mBhdeHj22dzJUlx47TWfAnj33WBLYxRxTAkUVmJiJG04z5QcIHv3iiS0bifrLuif5aNer+49a9NGZOsTH8khKkjDiOPy3Xf6Qv3OK/EiIFuJkLAwkT7XJaT6/zdrpjOE+AxbDSQuTkM7NGiQ/WJ2dgwZonaphISc3b9yZ
UrSZu385Mm+a+ecI3L22XmToyjz/PNiJiDDn2SnBMzBOJhUrQoDB+omAVI/WL3oMKVW/U2TiyOyfNQ5GDQIIiPh82XtqUgcfw2ZRt++MHIkPNRO9xb8WH0ASUnw/Y+lAPX1X7sW3nkHypTJUGn58jB6NGzZAk88kfN+rF6tmwl27dLKGzWCUqVy9myrVvD44/Dvv1CpElx2me/a5ZfDokWwf3/O6kpKyrnMhZVnnoHnn1ff/wkT4M47gy2RUdzJSjsUxqPYzQQysHWrvvz9p9t2eZD3ZM/47M0yR46oBaUsRySJEPE+9bTv4pNPioSGys71cak5xsuXV8vLxReLppHMyi/04Ycl063PR46IPPaYvrWn3dDw++8+M1LLliI9e+au40ePirRoITJwYPryefO03pPlxj16VOTll0VOO039db1ezf1Z1OjVS/tbrpwGcTIMP4HNBAoxIhAdDehG0PLlYfTvdRnG/ZQ776xsHy1XDvr3h3jKsa7qObjEBN9FrxcuuYTTGpblwk5HCQuDpk3hwAHh7ZYjcS1bwP33Z17xyy/rzb17w733wm+/we+/Q5s28OabsG4dPPaY7/6I5BnLli2wYQM0a5a776BsWfjnH91ZnZaOHfXNuE2b7J9/80146im9LzERHnwQzj4bjh1Lf19MTO7kKkgGDoSJE/VH3bBBt6EbRkGQlXYojEexnAm8+KJ65SQHYJsxQ6RJ2e1yVtmT++2LiGzcqCb1LLOEPfGEfFbq3tQX+w1d++sbc0owt6zcQv/9V7c/ly/vs083bizy558izz2n5zNm6L1xcXp+113BsWNfeGH6wEuzZ+umi7Qup2+/rbKNGVOwsp0Mj0ekXTtJjQR6sl17hpEHsIXhQkyKL+jq1SKibvHlyybJoB45D5mwYUOGWG1pI7/99pvEUEXKlU6UEa8ne9w8+aQ+0Ly5LgJnF4L46FF19/z4Y999R49q/OkzzvD58VetKlK/vvjdrfPYMTU3ZTU4Hj+uqdkefjjrOr7/3mdmKVcu9bsuFHTurLJVrCjp/XYNw39kpwSCZg5yzrV0zo13zn3inOsdLDmCTuPG+rl+PQA//ghH4kM5s0e9HFfRqBGEb98IZ50Fv/wCr7wCZ5wBx4/DhRdStZKXPTc+xN2JH6uZqF8/CA+Hjz+GzZvh1VezrrxsWejRQ81CZcv6yt55RwPXffKJlkVE6CLvli3QvDncc4/WnV+2bdNodz/9lPn15cvV7JMxKJ0IDB8ODz8Mt90G556r8pYrp8H8gk18vC6Iz58Pp52mi+pVqgRbKqMkkpV2yO4ARgF7SZNfOLm8O7AW2AAMPkkdjwLnJ/89OSftFsuZQHS0pA36M/c/I6RyhcTcr2sePqxmpeee02Tjbdr4rt1wg0jNmvqmftFF6Z+76SbNdJZbvF6Nf12lisYNuuoqNWuIaL4AEPnPf3Jfb2btTJ+etctqSpLnzGYKl1ziM2OlhKmYNy/zwHUFyYwZIhUqqGy33ppzd1rDyCP42xwEXACcRfok86FoXuGG+JLMtwTOAKZkOGomH8OAN4G5OWm3WCoBr1fDI9x3n8jPP0uqM39eaNFCd++WLy8yaJCvPCX0QMWKuks5LbmNRZ+UpNuYPR79BN1lfM89+vfYsalB7aRUqey9dKKi8p8N69prdZDPjOXLVSGuW3fiNa83OJm4UkxTYFnWjALD70pA66R+BiXQGZie5nwIMCQH9YQCk3LSZrFUAiIacG3iRM1M07LliclYcsqtt/oGmPHjfeUHD4pMmZL9oLdlS84GpMce0/pPP11387Ztq2sDKQvNvXvr4N+rl64/DBmSeT2JiZrCrE2bk0cg3b1bN1CtWZO+3OvVOu644+RyZ3yuQwddyC5IYmJ0FR9Ezj/fksEYBUZ2SsCfawJ1gDSZRYhKLssU51x959wI4Mvk2UBW9w1wzi1xzi2JTnalLHbcd58mjomK0qQr4eF5q+esNC6lqZltgMqV4aqrfDb9jPzxh2aN+fPP7OufNw/eekvratYMhg6FCy5QW7vHo/e0a6dumkOGwHXXqV0+Y3IZgJUr1TU2MlI3iMXGZt1uUpJuoJoyJX15bCy0bg2XXJK93BlxTt1fu3bN3XP5YfduXTdJSlLX19mzLRmMUTjISjuc7ODEmUAf4PM057cBH+a1/syOYjsT2LRJTh6fOQcsWqQ2+muvzd1z8fEatOi66/Tc49E3/rZt0wcYWrhQzSspCXK2bdO3+DJlNPDRsWN6vVEjfdtO2ez1wQcntvnRR3rtww915tC5c/aJd5o1E7niitz1q7AQHa3BmkC9qGwGYBQwFNBMIApIG+egLrDTj/UXX/buhauvhjfeyF89HTvCgQPqYpQbypSBu+6CSZPUGwc0P+WWLep18+ab6lXUqZNuHKtYUe+JiNA4FJddBuPH69vuH3/AjTfq23bnznq8955vppDCvHnqFXP//TBunIaHuOWWrNNKXnKJvj0nJvrK8hsmYs8ezbfpD5KS9HfMuEFt+3adlR05oueffGIzAKNwkZV2ONnBiTOBMGAT0ADfwnCrvNaf2VFsZwKFgS1bJNVbRUQXgGNifKEM2rXTsBEZadxYN2ul2LlBJDLSd33CBC1LyTeQwumnp8/FmRI97/PPM5cvpZ60SepbtMh+f8DJ6NNH5NRTJctQrVmRlCQyf76uiXTq5HvLz+gRNXOmvvlXrKj7MVq0yH1bhuEHCIB30FhgF5CIzgD6J5dfCaxDvYSeykvd2R2mBAJMz57quph2oPJ6NW3a9ddn7qZ5zTUirVpp9FHQhe20C8xJSZrxplcvX1lUlKR1ixURNZF07artb9p0Yjv796sLbMqg7/GIPPOMyHff5b2/48apHLNmZX592bLUtJzi8Whbt92mAZhAF747d9aEzi+8IPLmmz5X1C+/VHmdE3n/fb0/o2eWYRQQflcCwTpMCQSY2FiNQ5EbnnhCbfop4Y9T3Fv37FG30cOH1V21TBmfzT8lYUy6hAais5GKFTWxTEa7ucfjC2Fx/vkacjq/b9WHDqlcDzyg57Nnp193uPhikXPP1b+9Xn2br1ZNZ0tjx2re38xIyakA6hKalKQeTDYLMIKEKQEjcIwZI6mhIvr08e0L+OEHfVM+80xfrstvvtFrDz2kg29mrrBffKH3vv56+vLPPtPyMmX007n0uQfyyrXXakq2l15SeVu39imgyEiRJUt8927efPKB/IMPfLOEjFFRDSNImBIwAkdKdrS0Nv8RIzSI3LRpat6pU0e9j3r00OsdO+rbfmZ4vWp6ck7fokXUxFKtms4AvF6RX39V81Pp0roHIj+k7G4G3Tmd60TRaZgwQQf/8HBdMzCMQoIpASNwHDmig1/azPOtWolceaX+/c8/vsByoaEasiEsTGTw4Ozr7NxZB9OZM3VTV2ioyIoVvnvmz9c6R4/On/yHDuku608/zd/u3Z9+Unm7dCmauQyMYo0pAaPgOHhQ3+LThr44csQXZrpfP/38+ef0zyUmal7dlMXn/fvVmyYlxs6jj6a/3+tV5dK9e2D7kxNSdks3
aKAeVYZRyMhOCZjDspF/IiN1nwDA4sVqXDnnHN/1cuVgxAioXx9GjdKy6dN178DcubBjByxZAk8+CbffrnsSqlXTeypXhjp14Lnn0rfpnO5H+P132LevQLqZKU8/rYlvwsN1R3PVqsGTxTDygCkBI/988QX85z86eC9YoGUdO6a/xzno21f/rlIFPvoILr5Yw0QPGqRK4+23daPb88/rfRERqmCWLPFtUEvLjTeqIvnhB1/ZhAnQtq3mLA4kIprB7OWXISwM/vrLsoEZRRJTAkb+adlS4+Nv26a7hlu3zjw2/g036GevXprLYN48TYbw6ada/t//ar7MoUPh+++1rFo1qFUr83bbtNHcBWPH6vn27fp8ZKQqF3/kM8iKV1+FDz/UGcD//nei0jOMIoIpASP/tGihn//+q2/4y5Zlfl+7dmo6GTRIk9SsWaNJVWrW1OvOqXJo0yZnITRSTEKzZ2vwvX79dGbw00+qlC69VMNfHD2qpqX33jsxrENuEdHZygsvqNyRkXDhhfmr0zCCSFiwBTCKASlKYPVquPJKKFUq8/ucgxdf9J03anTiPeHhun5QpkzO2r7xRjUf9eql6xHDh0PPnvDrrxpv6Mwz4dAhzbIGaroZNCjHXUvH4sX67L//ar1Tp0L16nmryzAKCU4XjosGHTp0kCVLlgRbDCMzatWChATo0AG+/tr3dl8QnHWWppns3h2mTVNlAxqe+5VXNO3lZZepsti3D9auhdDQ3LXx5psaHrtCBQ1hvXix9tUwigDOuaUikuk/WDMHGf5hxgzNRTxvntrx84PXq4pk5szs7zt4UHMSDBgAtWvD55/7FAComWb6dM2H3L07PPoobNwIP/+cO3kWLtR8wNdco0rgwgtNARjFBlMChn9o0ULNJB07qsklP4SEwODBmmAnO1JmHC+9pAPzyWa1110Hp58O776bc1m8XvUCOu00TaazYwc89ljOnzeMQo4pAcM/REaqiSQzO39e6NRJ38Cz4+KL1a30oos0F0KbNvDdd1nfHxamA/rs2bB0ac7kiI9XD6TXXoNhw1TZXXFFzvthGIUcUwKGf9i0ST/95Sp59tlquslsI9ibb+pbfePG8Mgj8NVX8Pff0LSp7hMQUTPRW2+pq2naGUL//rrnIKvZwIoVmmwmhfLlYcwYNTf9/Tf83/9ZUhijWGH/mg3/0LOn2vAHDPBPfWefrZ+LF6cv93jUDfXQofS5mJs00Q1bo0bpusDmzWq26dtXF41TqFxZs6h99526laZl/nzdaFa7tpqXunXzPfvWW7r4fcst/umfYRQSTAkY/sE5HTjTLszmh/btta7IyPTlv/yim9LuvffEZ8LCfDuLO3bUZPagO47T8uCDWvdLL6Uv37dPZxePPQb79+smsA8/1E1ov/0G99wDpUv7p3+GUUgosH0CzrmGwFNAZRHpnVWZYQA6mO/aBaeemr784491kbZnz+yfd053Mlepkn4mABrD6PbbdT/Cgw/6wj2IwPr16gK6cqUqm1q1dJOZCNx2m9+6ZxiFhRzNBJxzo5xze51zKzOUd3fOrXXObXDODc6uDhHZJCL9T1ZmGKlkVACbN+smsLvvznpDWlqc013Kme1gbtpUB/aLL1Yz0ssva70AX36p7qf16qnJ6euvoXNn/y16G0YhIqfmoNFA97QFzrlQYBhwBdASuMk519I5d4ZzbkqGowB3DhnFhhUr4KabfDGAKlbUaKIpg3VOeOstXdhNy5Qp0LWr+v3v2QPnn6/ximJidE3h6FHfM5GRKoetBRjFlByZg0RktnOufobiTsAGEdkE4JwbB/QUkVeBq/0ppFFC8Xph3DhdoE1K0sXfjCGlT8ZZZ51Y9uCDOkP45ht924+L07ASL78Md96pZqJhw+CBB/SesDBfBFTDKGbkZ2G4DrA9zXlUclmmOOdOcc4NB9o554ZkVZbJcwOcc0ucc0uio6PzIa5R5GjVSnMR3HuvvrkfOZL7Oo4f101n8+fr+a5dOrM491ydWbz0kiqYTp10VzDo4L9+ve42/uYbuPxyqFHDb90yjMJEfpRAZm4gWW7ZFJH9IjJQRBolzxYyLcvkuREi0kFEOtSw/4gli7AwdRUV0U1h5cvnrY6HH/aFm05RBl266Ofdd2t8oe++8+107t1bQsnNigAABNtJREFU1yMGDtQdwrfemv++GEYhJT/eQVFARJrzusDO/IljGBkYPVpt9M2b5+350FCN+JniITR3rrp5tmun5+HhGhguLeHhut9h6FCNFdSjR57FN4zCTn5mAouBJs65Bs65cOBGYLJ/xDKMZOrVy7sCSCElyqjXC4sWafC3k/n733OPzgyuv15NUoZRTMnRTMA5NxboClR3zkUBz4nISOfcIGA6EAqMEpFVAZPUMPLKWWfppq/16zUn8d69J3+mTh2dNZhbqFHMyal30E1ZlE8DpvlVIsPwNykeQqtWQbNmOrvICZ06BU4mwygkWNgIo/jTqpWGgTh+XD2APJ5gS2QYhQZTAkbxJzRUE918953mH85tVjHDKMaYEjBKBlOnas6Bc88NtiSGUagwJWCUDP76Sz/rZLmf0TBKJAUWRdQwgsp992n+4/vvD7YkhlGoMCVglAwiImDWrGBLYRiFDjMHGYZhlGBMCRiGYZRgTAkYhmGUYEwJGIZhlGBMCRiGYZRgTAkYhmGUYEwJGIZhlGBMCRiGYZRgnEiWGSELHc65aGBrsOXIIdWBfcEWIkAU575B8e6f9a3okp/+nS4imebnLVJKoCjhnFsiIh2CLUcgKM59g+LdP+tb0SVQ/TNzkGEYRgnGlIBhGEYJxpRA4BgRbAECSHHuGxTv/lnfii4B6Z+tCRiGYZRgbCZgGIZRgjElYBiGUYIxJWAYhlGCMSUQBJxzLZ1z451znzjnegdbHn/inDvfOTfcOfe5c25esOXxJ865rs65Ocn96xpsefyNc65Fct8mOOfuDbY8/sQ519A5N9I5NyHYsvgDf/bHlEAucc6Ncs7tdc6tzFDe3Tm31jm3wTk3+CTVXAF8KCL3ArcHTNhc4o++icgcERkITAHGBFLe3OCn302AOKAMEBUoWfOCn3671cm/XV+g0Gy68lPfNolI/8BKmj9y00+/9kdE7MjFAVwAnAWsTFMWCmwEGgLhwD9AS+AMdDBMe9RMPoYBbwJzg90nf/YtzXPjgUrB7pOff7eQ5OdOBb4Jdp8C8dsBPYB5wM3B7lOA/l1OCHZ//NFPf/bHEs3nEhGZ7Zyrn6G4E7BBRDYBOOfGAT1F5FXg6iyqut85FwpMDJSsucVffXPO1QNiReRQAMXNFX783QAOAKUDIWde8Vf/RGQyMNk5NxX4NnAS5xw//3aFltz0E/jXX+2aOcg/1AG2pzmPSi7LFOdcfefcCOBLdDZQmMlV35LpD3wRMIn8R25/t17OuU+Br4CPAiybP8ht/7o65z5I7uO0QAuXT3Lbt1Occ8OBds65IYEWzo9k2k9/9sdmAv7BZVKW5S48EdkCDAiYNP4lV30DEJHnAiSLv8nt7zaRQjRzywG57d9MYGaghPE
zue3bfmBg4MQJGJn205/9sZmAf4gCItKc1wV2BkkWf2N9K7oU5/4V576lJeD9NCXgHxYDTZxzDZxz4cCNwOQgy+QvrG9Fl+Lcv+Lct7QEvJ+mBHKJc24sMB9o5pyLcs71F5EkYBAwHVgNjBeRVcGUMy9Y34pm36B496849y0tweqnBZAzDMMowdhMwDAMowRjSsAwDKMEY0rAMAyjBGNKwDAMowRjSsAwDKMEY0rAMAyjBGNKwDAMowRjSsAwDKMEY0rAMAyjBPP/cfiJcj9q0pEAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "x1 = np.logspace(-10, 1, 100, endpoint=True)# range of h parameter\n", "y = diff_f(5,x1)\n", "y2 = diff2_f(5,x1)\n", "plt.loglog(x1, y,color='red')\n", "plt.loglog(x1, y2,color='blue')\n", "y = diff_f(2,x1)\n", "y2 = diff2_f(2,x1)\n", "plt.loglog(x1, y,color='red',linestyle='--')\n", "plt.loglog(x1, y2,color='blue',linestyle='--')\n", "y = diff_f(150,x1)\n", "y2 = diff2_f(150,x1)\n", "plt.loglog(x1, y,color='red',linestyle='-.')\n", "plt.loglog(x1, y2,color='blue',linestyle='-.')\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "This method can be quite slow and is prone to rounding errors. The following Figure illustrates the main idea behind back propagation which is based on automatic differentiation.\n", "![](https://miro.medium.com/max/771/1*DcLWqOojI1b9jzQaLibUkQ.png)\n", "\n", "> The **backpropagation** algorithm was originally introduced in the 1970s, but its importance wasn't fully appreciated until a famous 1986 paper by David Rumelhart, Geoffrey Hinton, and Ronald Williams." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "> **Backpropagation** is the key algorithm that makes training deep models computationally tractable. For modern neural networks, it can make training with gradient descent as much as ten million times faster, relative to a naive implementation. That’s the difference between a model taking a week to train and taking 200,000 years. (Christopher Olah, 2016)" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "We have seen that in order to optimize our models we need to compute the derivative of the loss function with respect to all model paramaters. \n", "\n", "The computation of derivatives in computer models is addressed by four main methods: \n", "\n", "+ Manually working out derivatives and coding the result (as in the original paper describing backpropagation); \n", "\n", "![alt text](https://github.com/DataScienceUB/DeepLearningMaster2019/blob/master/images/back.png?raw=1)" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "+ Numerical differentiation (using finite difference approximations); \n", "+ Symbolic differentiation (using expression manipulation in software, such as `Sympy`); \n", "+ and Automatic differentiation (AD)." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "**Automatic differentiation** (AD) works by systematically applying the **chain rule** of differential calculus at the elementary operator level.\n", "\n", "Let $ y = f(g(x)) $ be our target function. In its basic form, the chain rule states:\n", "\n", "$$ \\frac{\\partial y}{\\partial x} = \\frac{\\partial f}{\\partial g} \\frac{\\partial g}{\\partial x} $$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "or, if there are more than one variable $g_i$ in-between $y$ and $x$ (f.e. 
if $f$ is a function of two variables such as $f(g_1(x), g_2(x))$), then:\n", "\n", "$$ \\frac{\\partial f}{\\partial x} = \\sum_i \\frac{\\partial f}{\\partial g_i} \\frac{\\partial g_i}{\\partial x} $$\n", "\n", "> See http://tutorial.math.lamar.edu/Classes/CalcIII/ChainRule.aspx" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "Now, let's see how AD allows the accurate evaluation of derivatives at machine precision, with only a small constant factor of overhead.\n", "\n", "In its most basic description, AD relies on the fact that all numerical computations\n", "are ultimately compositions of a finite set of elementary operations for which derivatives are known." ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "HRNAJqxYKxBf", "slideshow": { "slide_type": "slide" } }, "source": [ "For example, let's consider the computation of the derivative of the sigmoid\n", "\n", "$$\n", " f(x) = \\frac{1}{1 + e^{- ({w}^T \\cdot x + b)}} \n", "$$\n", "\n", "\n", "First, let's write how to evaluate $f(x)$ via a sequence of primitive operations (here for scalar $w$ and $x$):\n", "\n", "\n", "```python\n", "x = ?\n", "f1 = w * x\n", "f2 = f1 + b\n", "f3 = -f2\n", "f4 = 2.718281828459 ** f3\n", "f5 = 1.0 + f4\n", "f = 1.0/f5\n", "```" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "HRNAJqxYKxBf", "slideshow": { "slide_type": "slide" } }, "source": [ "The question mark indicates that $x$ is a value that must be provided. \n", "\n", "Given such a value, this *program* can compute the value of $f(x)$ and also **populate the program variables**, i.e. assign a value to each intermediate variable $f_1,\\ldots,f_5$. \n", "\n", "We can evaluate $\\frac{\\partial f}{\\partial x}$ at some $x$ by using the chain rule. This is called **forward-mode differentiation**. 
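" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "Before differentiating, here is a quick sanity check that the primitive sequence reproduces the closed-form sigmoid. This is a minimal sketch; the values chosen for $w$, $b$ and $x$ are arbitrary and only for illustration:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "import numpy as np\n", "\n", "# arbitrary example values, chosen only for illustration\n", "w, b, x = 2.0, 1.0, 3.0\n", "\n", "# evaluate the primitive sequence step by step\n", "f1 = w * x\n", "f2 = f1 + b\n", "f3 = -f2\n", "f4 = 2.718281828459 ** f3\n", "f5 = 1.0 + f4\n", "f = 1.0/f5\n", "\n", "# compare against the closed-form sigmoid\n", "print(f, 1.0/(1.0 + np.exp(-(w*x + b))))" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "Next, we differentiate this program line by line.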
" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "HRNAJqxYKxBf", "slideshow": { "slide_type": "slide" } }, "source": [ "In principle we follow\n", "$$\n", "\\frac{\\partial y}{\\partial x}=\\frac{\\partial y}{\\partial w_{n-1}}\\frac{\\partial w_{n-1}}{\\partial x}=\\frac{\\partial y}{\\partial w_{n-1}}\\left(\\frac{\\partial w_{n-1}}{\\partial w_{n-2}}\\frac{\\partial w_{n-2}}{\\partial x}\\right)=\\ldots\n", "$$\n", "\n", "In our case:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 69 }, "colab_type": "code", "id": "xuWlp5RzKxBg", "outputId": "7be084e4-7745-46b8-8ac4-0f3a2768a0b0", "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "def f(x,w,b):\n", " f1 = w * x\n", " f2 = f1 + b\n", " f3 = -f2\n", " f4 = 2.718281828459 ** f3\n", " f5 = 1.0 + f4\n", " return 1.0/f5" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [], "source": [ "def dfdx_forward(x, w, b):\n", " f1 = w * x\n", " p1 = w # p1 = df1/dx\n", " f2 = f1 + b\n", " p2 = 1.0*p1 # p2 = df2/df1*p1 \n", " f3 = -f2\n", " p3 = -1.0*p2 # p3 = df3/df2*p2\n", " f4 = 2.718281828459 ** f3\n", " p4 = 2.718281828459 ** f3 *p3 # p4 = df4/df3*p3\n", " f5 = 1.0 + f4\n", " p5 = 1.0*p4 # p5 = df5/df4*p4\n", " f6 = 1.0/f5\n", " dfx = -1.0 / f5 ** 2.0*p5 # df/dx = df6/df5*p5\n", " return f6, dfx" ] }, { "cell_type": "code", "execution_count": 31, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 69 }, "colab_type": "code", "id": "xuWlp5RzKxBg", "outputId": "7be084e4-7745-46b8-8ac4-0f3a2768a0b0", "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Value of the function at (3, 2, 1): 0.9990889488055992\n", "df/dx Derivative (fin diff) at (3, 2, 1): 0.0018204406870836465\n", "df/dx Derivative (aut diff) at (3, 2, 1): 0.0018204423602438654\n", "The difference between both is given as: 1.6731602188804762e-09\n" ] } ], "source": [ "h = 0.000001;\n", "der = (f(3+h, 2, 1) - f(3, 2, 1))/h\n", "\n", "print(\"Value of the function at (3, 2, 1): \",f(3, 2, 1))\n", "print(\"df/dx Derivative (fin diff) at (3, 2, 1): \",der)\n", "print(\"df/dx Derivative (aut diff) at (3, 2, 1): \",dfdx_forward(3, 2, 1)[1])\n", "print(\"The difference between both is given as:\",dfdx_forward(3, 2, 1)[1]-der)" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "We consider the function \n", "$$\n", "g(x_1,x_2)=\\sin(x_1)+x_1x_2\n", "$$" ] }, { "cell_type": "code", "execution_count": 32, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "import math\n", "\n", "def g(x1,x2):\n", " f1 = x1\n", " f2 = x2\n", " f3 = f1*f2\n", " f4 = math.sin(f1)\n", " f5 = f4+f3\n", " \n", " return f5" ] }, { "cell_type": "code", "execution_count": 33, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "def dgdx1_forward(x1,x2):\n", " f1 = x1\n", " p1 = 1 # p1=df1/dx1\n", " f2 = x2\n", " p2 = 0 # p2=df2/dx1\n", " f3 = f1*f2\n", " p3 = p1*f2+p2*f1 # p3 = f1'f2+f1f2'\n", " f4 = math.sin(f1)\n", " p4 = math.cos(f1)*p1 # d\n", " f5 = f4+f3\n", " p5 = p4+p3;\n", " return p5\n", " " ] }, { "cell_type": "code", "execution_count": 34, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "def dgdx1_forward2(x1,x2):\n", " f1 = x1\n", " p1 = 0 # p1=df1/dx1\n", " f2 = x2\n", " p2 = 1 # p2=df2/dx1\n", " f3 = f1*f2\n", " p3 = p1*f2+p2*f1 # p3 
"    f4 = math.sin(f1)\n", "    p4 = math.cos(f1)*p1  # p4 = df4/dx2 = cos(f1)*p1\n", "    f5 = f4+f3\n", "    p5 = p4+p3            # p5 = df5/dx2\n", "    return p5" ] }, { "cell_type": "code", "execution_count": 35, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Value of the function at (3, 1): 3.1411200080598674\n", "dg/dx1 Derivative (fin diff) at (3, 1): 0.010007497053265979\n", "dg/dx1 Derivative (aut diff) at (3, 1): 0.010007503399554585\n", "The difference between both is given as: 6.346288605740824e-09\n" ] } ], "source": [ "h = 0.0000001\n", "der = (g(3+h, 1) - g(3,1))/h\n", "print(\"Value of the function at (3, 1): \",g(3, 1))\n", "print(\"dg/dx1 Derivative (fin diff) at (3, 1): \",der)\n", "print(\"dg/dx1 Derivative (aut diff) at (3, 1): \",dgdx1_forward(3, 1))\n", "print(\"The difference between both is given as:\",dgdx1_forward(3, 1)-der)" ] }, { "cell_type": "code", "execution_count": 36, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Value of the function at (3, 1): 3.1411200080598674\n", "dg/dx2 Derivative (fin diff) at (3, 1): 2.9999999995311555\n", "dg/dx2 Derivative (aut diff) at (3, 1): 3.0\n", "The difference between both is given as: 4.688445187639445e-10\n" ] } ], "source": [ "h = 0.0000001\n", "der = (g(3, 1+h) - g(3,1))/h\n", "print(\"Value of the function at (3, 1): \",g(3, 1))\n", "print(\"dg/dx2 Derivative (fin diff) at (3, 1): \",der)\n", "print(\"dg/dx2 Derivative (aut diff) at (3, 1): \",dgdx2_forward(3, 1))\n", "print(\"The difference between both is given as:\",dgdx2_forward(3, 1)-der)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "2cdOSRU0KxBn", "slideshow": { "slide_type": "slide" } }, "source": [ "It is interesting to note that this *program* can be automatically derived if we have access to **subroutines implementing the derivatives of primitive functions** (such as $\\exp{(x)}$ or $1/x$) and all intermediate variables are computed in the right order. \n", "\n", "Note also that AD allows the accurate evaluation of derivatives at **machine precision**, with only a small constant factor of overhead.\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "D-2EJLmpKxBt", "slideshow": { "slide_type": "slide" } }, "source": [ "The above can be viewed as differentiation of each equation with respect to some yet-to-be-given variable $x$ via the chain rule, schematically given as\n", "$$\n", "\\frac{\\partial y}{\\partial x}=\\frac{\\partial y}{\\partial w_{n-1}}\\frac{\\partial w_{n-1}}{\\partial x}=\\frac{\\partial y}{\\partial w_{n-1}}\\left(\\frac{\\partial w_{n-1}}{\\partial w_{n-2}}\\frac{\\partial w_{n-2}}{\\partial x}\\right)=\\ldots\n", "$$\n", "where we can substitute $x=x_1$ or $x=x_2$ to obtain the desired derivative. This forward differentiation is efficient for functions $f : \\mathbb{R}^n \\rightarrow \\mathbb{R}^m$ with $n \\ll m$, since one forward sweep per input variable, i.e. $O(n)$ sweeps, is needed to obtain all derivatives. What if we want to avoid this?"
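 ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "To make the cost concrete: assembling the full gradient of $g$ with forward mode requires one sweep per input variable. A minimal sketch, reusing the two forward sweeps defined above:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "# one forward sweep per input variable: O(n) sweeps for n inputs\n", "grad_g = (dgdx1_forward(3, 1), dgdx2_forward(3, 1))\n", "print(\"Gradient of g at (3, 1):\", grad_g)"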
] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "D-2EJLmpKxBt", "slideshow": { "slide_type": "slide" } }, "source": [ "Alternatively, we can consider a backward version of the differentiation by proceeding as follows\n", "\\begin{align}\n", "\\frac{\\partial y}{\\partial x}=\\frac{\\partial y}{\\partial w_1}\\frac{\\partial w_1}{\\partial x}=\\left(\\frac{\\partial y}{\\partial w_2}\\frac{\\partial w_2}{\\partial w_1}\\right)\\frac{\\partial w_1}{\\partial x}=\\ldots\n", "\\end{align}\n", "using the function\n", "\n", "```python\n", " w1 = x1\n", " w2 = x2\n", " w3 = w1*w2\n", " w4 = math.sin(w1)\n", " w5 = w4+w3\n", "```" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "Let’s take a second look at the chain rule we used to derive forward-mode AD for a generic variable $t$\n", "\n", "$$\n", "\\frac{\\partial w}{\\partial t}\n", "=\n", "\\sum_i\n", "(\n", "\\frac{\\partial y}{\\partial w_i}\n", "\\frac{\\partial w_i}{\\partial t}\n", ")\n", "=\n", "\\frac{\\partial y}{\\partial w_1}\n", "\\frac{\\partial w_1}{\\partial t}\n", "+\n", "\\frac{\\partial y}{\\partial w_2}\n", "\\frac{\\partial w_2}{\\partial t}\n", "+\\ldots\n", "$$\n", "\n", "To calculate the gradient using forward-mode AD, we had to perform two substitutions: one with $t=x_1$\n", "and another with $t=x_2$.\n", "\n", "This means we have to **run the code twice.**" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "However, the chain rule is symmetric: it doesn’t care what’s in the “numerator” or the “denominator”. So let’s rewrite the chain rule but turn the derivatives upside down:\n", "\n", "$$\n", "\\frac{\\partial z}{\\partial w_j}\n", "=\n", "\\sum_i\n", "(\n", "\\frac{\\partial w_i}{\\partial w_j}\n", "\\frac{\\partial z}{\\partial w_i}\n", ")\n", "=\n", "\\frac{\\partial w_1}{\\partial w_j}\n", "\\frac{\\partial z}{\\partial w_1}\n", "+\n", "\\frac{\\partial w_2}{\\partial w_j}\n", "\\frac{\\partial z}{\\partial w_2}\n", "+\\ldots\n", "$$\n", "\n", "In this case we obtain the derivative with respect to a variable $w_j$ for some $j$ where $z$ is the output variable\n", "$$\n", "z=h(x_1,x_2)=x_1x_2+\\sin(x_1)\n", "$$\n", "and here we assume this is only one." ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "D-2EJLmpKxBt", "slideshow": { "slide_type": "slide" } }, "source": [ "Reverse mode automatic differentiation proceeds by computing for $z=w_5$ that\n", "$$\n", "\\frac{\\partial z}{\\partial z}=\\frac{\\partial z}{\\partial w_5}=1.\n", "$$\n", "Now we know that $w_5=w_4+w_3$ and hence we get\n", "$$\n", "\\frac{\\partial w_5}{\\partial w_4}=1,\\quad\\frac{\\partial w_5}{\\partial w_3}=1.\n", "$$\n", "Now using the chain rule we get\n", "\\begin{align}\n", "\\frac{\\partial z}{\\partial w_3}&=\\frac{\\partial z}{\\partial w_5}\\frac{\\partial w_5}{\\partial w_3}=1\\\\\n", "\\frac{\\partial z}{\\partial w_4}&=\\frac{\\partial z}{\\partial w_5}\\frac{\\partial w_5}{\\partial w_4}=1.\\\\\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "D-2EJLmpKxBt", "slideshow": { "slide_type": "slide" } }, "source": [ "Now, we remember that $w_3=w_2w_1$ and get $\\frac{\\partial w_3}{\\partial w_2}=w_1$ and $\\frac{\\partial w_3}{\\partial w_1}=w_2$. 
This can be combined to obtain the following derivative\n", "$$\n", "\\frac{\\partial z}{\\partial w_2}=\\frac{\\partial z}{\\partial w_3}\\frac{\\partial w_3}{\\partial w_2}=1\\cdot w_1=x_1.\n", "$$\n", "Also, $w_1$ contributes to $z$ both through $w_3$ and through $w_4$, and we get\n", "\\begin{align}\n", "\\frac{\\partial z}{\\partial w_1}&=\\frac{\\partial z}{\\partial w_3}\\frac{\\partial w_3}{\\partial w_1}+\\frac{\\partial z}{\\partial w_4}\\frac{\\partial w_4}{\\partial w_1}\\\\\n", "&=w_2+\\cos(w_1)\\\\\n", "&=x_2+\\cos(x_1).\\\\\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "D-2EJLmpKxBt", "slideshow": { "slide_type": "slide" } }, "source": [ "This is called *reverse-mode differentiation* or *backward propagation*. The reverse pass starts at the end (i.e. $\\frac{\\partial z}{\\partial z} = 1$) and propagates backward to all dependencies." ] }, { "cell_type": "code", "execution_count": 39, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "colab_type": "code", "id": "af4RMYaIKxBu", "outputId": "bd858897-e072-4c76-d0b4-55aa98e63715", "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "def dgdx_backward(x1, x2):\n", "    # forward pass: populate the intermediate variables\n", "    w1 = x1\n", "    w2 = x2\n", "    w3 = w1*w2\n", "    w4 = math.sin(w1)\n", "    w5 = w4+w3\n", "    z = w5\n", "\n", "    # reverse pass: propagate dz/dz = 1 back to the inputs\n", "    dz = 1                      # dz/dz\n", "    d5d4 = dz*1.0               # dz/dw4 = dz/dz * dw5/dw4\n", "    d5d3 = dz*1.0               # dz/dw3 = dz/dz * dw5/dw3\n", "    d3d2 = w1                   # dw3/dw2\n", "    d3d1 = w2                   # dw3/dw1\n", "    d4d1 = math.cos(w1)         # dw4/dw1\n", "    dzd1 = d5d3*d3d1+d5d4*d4d1  # dz/dw1 = dz/dw3*dw3/dw1 + dz/dw4*dw4/dw1\n", "    dzd2 = d5d3*d3d2            # dz/dw2 = dz/dw3*dw3/dw2\n", "    return dzd1, dzd2" ] }, { "cell_type": "code", "execution_count": 40, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "colab_type": "code", "id": "af4RMYaIKxBu", "outputId": "bd858897-e072-4c76-d0b4-55aa98e63715", "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Finite difference approximation: 2.540302261877514 0.999999993922529\n", "Gradient of g (aut diff) at (1, 2): (2.5403023058681398, 1.0)\n" ] } ], "source": [ "h = 0.0000001\n", "der1 = (g(1+h, 2) - g(1,2))/h\n", "der2 = (g(1, 2+h) - g(1,2))/h\n", "\n", "print(\"Finite difference approximation:\",der1,der2)\n", "print(\"Gradient of g (aut diff) at (1, 2): \",\n", "      dgdx_backward(1, 2))" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "f0I-yLHWKxB6", "slideshow": { "slide_type": "slide" } }, "source": [ "In practice, reverse-mode differentiation is a two-stage process. In the first stage the original function code is run forward, populating the $w_i$ variables. In the second stage, derivatives are calculated by propagating in reverse, from the outputs to the inputs.\n", "\n", "The most important property of reverse-mode differentiation is that it is **cheaper than forward-mode differentiation for functions with a high number of input variables**. For a function of the form $f : \\mathbb{R}^n \\rightarrow \\mathbb{R}$, only one application of the reverse mode is sufficient to compute the full gradient of the function $\\nabla f = \\big( \\frac{\\partial y}{\\partial x_1}, \\dots ,\\frac{\\partial y}{\\partial x_n} \\big)$. This is the case in deep learning, where the number of input variables is very high. 
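" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "As a toy illustration of this scaling (a minimal sketch using a made-up function $f(x)=(x_1+\\ldots+x_n)^2$, not an example from above), a single hand-coded reverse sweep returns all $n$ partial derivatives at once:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "def f_and_grad(x):\n", "    # forward pass: populate the intermediate variables\n", "    s = sum(x)   # w1 = x1 + ... + xn\n", "    z = s**2     # z = w1**2\n", "    # reverse pass: one sweep yields all n partial derivatives\n", "    dz_ds = 2*s  # dz/dw1\n", "    grad = [dz_ds*1.0 for _ in x]  # dw1/dxi = 1 for every input xi\n", "    return z, grad\n", "\n", "print(f_and_grad([1.0, 2.0, 3.0]))" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "However many inputs there are, the reverse pass runs only once.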
" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "f0I-yLHWKxB6", "slideshow": { "slide_type": "slide" } }, "source": [ "> As we have seen, AD relies on the fact that all numerical computations\n", "are ultimately compositions of a finite set of elementary operations for which derivatives are known. \n" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "0jadWz_IKxB7", "slideshow": { "slide_type": "slide" } }, "source": [ "## Autograd\n", "\n", "Autograd is a Python module (with only one function) that implements automatic differentiation.\n", "\n", "Autograd can automatically differentiate Python and Numpy code:\n", "\n", "+ It can handle most of Python’s features, including loops, if statements, recursion and closures.\n", "+ Autograd allows you to compute gradients of many types of data structures (Any nested combination of lists, tuples, arrays, or dicts).\n", "+ It can also compute higher-order derivatives.\n", "+ Uses reverse-mode differentiation (backpropagation) so it can efficiently take gradients of scalar-valued functions with respect to array-valued or vector-valued arguments.\n", "+ You can easily implement your custom gradients (good for speed, numerical stability, non-compliant code, etc)." ] }, { "cell_type": "code", "execution_count": 41, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 52 }, "colab_type": "code", "id": "Eao5AQjiKxB8", "outputId": "8a36acaa-f380-45f6-b586-4490c4dbf8db", "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "(5.50,2.28)\n", "(2.20,5.42)\n" ] } ], "source": [ "import autograd.numpy as np\n", "from autograd import grad\n", "\n", "x1 = np.array([2, 5], dtype=float)\n", "x2 = np.array([5, 2], dtype=float)\n", "\n", "def test(x):\n", " if x[0]>3:\n", " return np.log(x[0]) + x[0]*x[1] - np.sin(x[1])\n", " else:\n", " return np.log(x[0]) + x[0]*x[1] + np.sin(x[1])\n", " \n", "grad_test = grad(test)\n", "\n", "\n", "print(\"({:.2f},{:.2f})\".format(grad_test(x1)[0],grad_test(x1)[1]))\n", "print(\"({:.2f},{:.2f})\".format(grad_test(x2)[0],grad_test(x2)[1]))" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "zhLa7xHMKxCA", "slideshow": { "slide_type": "slide" } }, "source": [ "The ``grad`` function:\n", "\n", "````\n", "grad(fun, argnum=0, *nary_op_args, **nary_op_kwargs)\n", "\n", "Returns a function which computes the gradient of `fun` with respect to positional argument number `argnum`. The returned function takes the same arguments as `fun`, but returns the gradient instead. The function `fun` should be scalar-valued. 
The gradient has the same type as the argument.\n", "```" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "QvIOvBsiKxCC", "slideshow": { "slide_type": "slide" } }, "source": [ "Then, a simple logistic regression model (without a bias term) for $n$-dimensional data,\n", "\n", "$$ f(x) = \\frac{1}{1 + e^{-\\mathbf{w}^T \\mathbf{x}}} $$\n", "\n", "can be implemented in this way:" ] }, { "cell_type": "code", "execution_count": 42, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [], "source": [ "import autograd.numpy as np\n", "from autograd import grad\n", "\n", "def sigmoid(x):\n", "    return 1 / (1 + np.exp(-x))\n", "\n", "def logistic_predictions(weights, inputs):\n", "    return sigmoid(np.dot(inputs, weights))\n", "\n", "def training_loss(weights, inputs, targets):\n", "    preds = logistic_predictions(weights, inputs)\n", "    # probability the model assigns to the correct label of each example\n", "    loss = preds * targets + (1 - preds) * (1 - targets)\n", "    # negative log-likelihood\n", "    return -np.sum(np.log(loss))\n", "\n", "def optimize(inputs, targets, training_loss):\n", "    # Optimize weights using gradient descent.\n", "    gradient_loss = grad(training_loss)\n", "    weights = np.zeros(inputs.shape[1])\n", "    print(\"Initial loss:\", training_loss(weights, inputs, targets))\n", "    for i in range(100):\n", "        weights -= gradient_loss(weights, inputs, targets) * 0.01\n", "    print(\"Final loss:\", training_loss(weights, inputs, targets))\n", "    return weights" ] }, { "cell_type": "code", "execution_count": 44, "metadata": { "slideshow": { "slide_type": "slide" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Initial loss: 2.772588722239781\n", "Final loss: 1.0672706757870165\n", "Weights: [ 0.48307366 -0.37057217 1.06937395]\n" ] } ], "source": [ "# Build a toy dataset with 3d data\n", "inputs = np.array([[0.52, 1.12, 0.77],\n", "                   [0.88, -1.08, 0.15],\n", "                   [0.52, 0.06, -1.30],\n", "                   [0.74, -2.49, 1.39]])\n", "targets = np.array([True, True, False, True])\n", "\n", "weights = optimize(inputs, targets, training_loss)\n", "print(\"Weights:\", weights)" ] } ], "metadata": { "accelerator": "GPU", "celltoolbar": "Slideshow", "colab": { "include_colab_link": true, "name": "2. Automatic Differentiation.ipynb", "provenance": [], "version": "0.3.2" }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.2" } }, "nbformat": 4, "nbformat_minor": 2 }