diff --git a/.jenkins/build.sh b/.jenkins/build.sh
index 67a42cea923..cf971447220 100755
--- a/.jenkins/build.sh
+++ b/.jenkins/build.sh
@@ -76,6 +76,16 @@ if [[ "${JOB_BASE_NAME}" == *worker_* ]]; then
       FILES_TO_RUN+=($(basename $filename .py))
     fi
     count=$((count+1))
+  done
+  for filename in $(find recipes_source/ -name '*.py' -not -path '*/data/*'); do
+    if [ $(($count % $NUM_WORKERS)) != $WORKER_ID ]; then
+      echo "Removing runnable code from "$filename
+      python $DIR/remove_runnable_code.py $filename $filename
+    else
+      echo "Keeping "$filename
+      FILES_TO_RUN+=($(basename $filename .py))
+    fi
+    count=$((count+1))
   done
   echo "FILES_TO_RUN: " ${FILES_TO_RUN[@]}
diff --git a/beginner_source/blitz/autograd_tutorial.py b/beginner_source/blitz/autograd_tutorial.py
index 98e70a251d6..6669befbdc6 100644
--- a/beginner_source/blitz/autograd_tutorial.py
+++ b/beginner_source/blitz/autograd_tutorial.py
@@ -3,26 +3,45 @@
 Autograd: Automatic Differentiation
 ===================================
 
-Central to all neural networks in PyTorch is the ``autograd`` package.
-Let’s first briefly visit this, and we will then go to training our
-first neural network.
+Deep learning uses artificial neural networks, which are computing systems
+composed of many layers of interconnected units. By passing data through these
+interconnected units, a neural network, or model, is able to learn how to
+approximate the computations required to transform that data input into some
+output. We will learn how to fully construct neural networks in the next section
+of this tutorial.
+Before we do so, it is important that we introduce some concepts.
 
-The ``autograd`` package provides automatic differentiation for all operations
-on Tensors. It is a define-by-run framework, which means that your backprop is
-defined by how your code is run, and that every single iteration can be
-different.
+In the training phase, models are able to increase their accuracy through gradient descent.
+In short, gradient descent is the process of minimizing our loss (or error) by tweaking the
+weights and biases in our model.
 
-Let us see this in more simple terms with some examples.
+This process introduces the concept of "automatic differentiation". Automatic differentiation
+helps us calculate the derivatives of our loss function to know how much we should
+adjust those weights and biases to decrease loss. There are various methods of
+performing automatic differentiation, one of the most popular being backpropagation
+(working backward from the loss to compute how much each weight and bias contributed to it).
+
+In the ``autograd`` package in PyTorch, we have access to automatic differentiation for
+all operations on tensors. It is a define-by-run framework, which means, for example,
+that your backpropagation is defined by how your code is run and that every iteration can be different.
+
+
+Getting Started
+---------------
+
+This entire process is simplified using PyTorch. In this section of the tutorial, we will
+see examples of gradient descent, automatic differentiation, and backpropagation in PyTorch.
+This will prepare us to train our first neural network in the following section.
 
 Tensor
--------
+^^^^^^^
 
-``torch.Tensor`` is the central class of the package. If you set its attribute
-``.requires_grad`` as ``True``, it starts to track all operations on it. When
-you finish your computation you can call ``.backward()`` and have all the
-gradients computed automatically. The gradient for this tensor will be
-accumulated into ``.grad`` attribute.
+``torch.Tensor`` is the central class of PyTorch. When you create a tensor,
+if you set its attribute ``.requires_grad`` as ``True``, the package tracks
+all operations on it. When your computation is finished, you can then call
+``.backward()`` on the tensor, and have all the gradients computed automatically.
+The gradient for this tensor will be accumulated into the ``.grad`` attribute.
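The gradient-descent loop described above can be sketched in a few lines of PyTorch. The toy model (``y = w * x``), the data, and the learning rate below are illustrative assumptions, not part of the tutorial text:

```python
import torch

# A minimal sketch of gradient descent on a single weight, assuming a
# toy model y = w * x and a squared-error loss.
w = torch.tensor(0.0, requires_grad=True)  # the weight to learn
x_data = torch.tensor([1.0, 2.0, 3.0])
y_data = torch.tensor([2.0, 4.0, 6.0])     # targets consistent with w = 2

lr = 0.05  # learning rate
for _ in range(100):
    loss = ((w * x_data - y_data) ** 2).mean()  # loss (error) to minimize
    loss.backward()                             # autograd computes dloss/dw
    with torch.no_grad():
        w -= lr * w.grad                        # step opposite the gradient
        w.grad.zero_()                          # clear the accumulated gradient

print(w.item())  # close to 2.0
```

Each iteration tweaks the weight in the direction that decreases the loss, which is exactly the process the paragraph above describes.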
To stop a tensor from tracking history, you can call ``.detach()`` to detach
it from the computation history, and to prevent future computation from being
@@ -30,51 +49,68 @@
 To prevent tracking history (and using memory), you can also wrap the code block
 in ``with torch.no_grad():``. This can be particularly helpful when evaluating a
-model because the model may have trainable parameters with
-``requires_grad=True``, but for which we don't need the gradients.
-
-There’s one more class which is very important for autograd
-implementation - a ``Function``.
-
-``Tensor`` and ``Function`` are interconnected and build up an acyclic
-graph, that encodes a complete history of computation. Each tensor has
-a ``.grad_fn`` attribute that references a ``Function`` that has created
-the ``Tensor`` (except for Tensors created by the user - their
-``grad_fn is None``).
-
-If you want to compute the derivatives, you can call ``.backward()`` on
-a ``Tensor``. If ``Tensor`` is a scalar (i.e. it holds a one element
-data), you don’t need to specify any arguments to ``backward()``,
-however if it has more elements, you need to specify a ``gradient``
-argument that is a tensor of matching shape.
+model, where it may have trainable parameters with
+``requires_grad=True``, but whose gradients are not necessary.
+
+Before we dive into code, there’s one more very important class when implementing
+``autograd`` - a ``Function``.
+
+``Tensor`` and ``Function`` are interconnected. Together, they build an acyclic
+graph that encodes a complete history of computation. Programmatically, this means
+that each tensor has a ``.grad_fn`` attribute, referencing a ``Function`` that created
+the tensor (except for when tensors are explicitly created by the developer - their
+``grad_fn`` is ``None``).
+
+Let's see this in action. As always, we will first need to import PyTorch into our program.
+
 """
 
 import torch
 
 ###############################################################
-# Create a tensor and set ``requires_grad=True`` to track computation with it
+# Create a tensor and set ``requires_grad=True``. This will track all operations on the tensor.
+
 x = torch.ones(2, 2, requires_grad=True)
 print(x)
 
 ###############################################################
-# Do a tensor operation:
+# Now, perform a simple tensor operation:
+
 y = x + 2
 print(y)
 
 ###############################################################
-# ``y`` was created as a result of an operation, so it has a ``grad_fn``.
+# ``grad_fn`` references the operation that created the tensor.
+# In this case it is an addition function, so its value is ``AddBackward0``.
+
 print(y.grad_fn)
 
 ###############################################################
-# Do more operations on ``y``
+# Perform more operations on ``y``:
+
 z = y * y * 3
 out = z.mean()
 
 print(z, out)
 
 ################################################################
-# ``.requires_grad_( ... )`` changes an existing Tensor's ``requires_grad``
-# flag in-place. The input flag defaults to ``False`` if not given.
+# Notice how ``z`` has a gradient function of ``MulBackward0`` and
+# ``out`` has ``MeanBackward0``. These outputs give us insight into
+# the history of the tensor: ``MulBackward`` indicates that a multiplication
+# operation was performed, and ``MeanBackward`` indicates the mean was calculated
+# previously on this tensor.
+#
+# Now let's perform one more operation on ``z``, this time taking the median:
+
+print(z.median())
+
+################################################################
+# You can see that the resulting tensor's ``grad_fn`` is ``MedianBackward``.
+#
+# You can change the ``requires_grad`` flag in place using the
+# ``.requires_grad_( ... )`` function.
+# The input flag defaults to ``False`` if not manually specified.
+
 a = torch.randn(2, 2)
 a = ((a * 3) / (a - 1))
 print(a.requires_grad)
@@ -86,20 +122,31 @@
 ###############################################################
 # Gradients
 # ---------
-# Let's backprop now.
-# Because ``out`` contains a single scalar, ``out.backward()`` is
-# equivalent to ``out.backward(torch.tensor(1.))``.
+# Now that we understand how these operations work with tensors, let's practice with
+# backpropagation in PyTorch, using ``.backward()``.
+#
+# To compute the derivative, you can call ``.backward()`` on
+# a ``tensor``. If ``tensor`` is a scalar (i.e. it holds a single
+# element of data), you don’t need to specify any arguments to ``backward()``. If it has more elements,
+# you need to specify a ``gradient`` argument, which will be a tensor of matching shape.
+#
+# ``loss.backward()`` computes the derivative of the loss (``dloss/dx``) for every
+# parameter ``x`` where ``requires_grad = True``. These are then accumulated into ``x.grad``.
+#
+# In our examples, ``out`` is the loss for ``x``. Because ``out`` contains a single scalar,
+# ``out.backward()`` is equivalent to ``out.backward(torch.tensor(1.))``.
 
+print(x)
+print(out)
 
 out.backward()
 
-###############################################################
-# Print gradients d(out)/dx
-#
-
+# Print the gradients d(out)/dx
 print(x.grad)
 
 ###############################################################
-# You should have got a matrix of ``4.5``. Let’s call the ``out``
+# Let's break this down mathematically for further understanding.
+#
+# You should have got a matrix of ``4.5``. Let’s call ``out``
 # *Tensor* “:math:`o`”.
 # We have that :math:`o = \frac{1}{4}\sum_i z_i`,
 # :math:`z_i = 3(x_i+2)^2` and :math:`z_i\bigr\rvert_{x_i=1} = 27`.
@@ -108,7 +155,7 @@
 # :math:`\frac{\partial o}{\partial x_i}\bigr\rvert_{x_i=1} = \frac{9}{2} = 4.5`.
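The ``4.5`` result derived above can be re-checked end to end in a few lines; this sketch simply repeats the tutorial's own computation in one place:

```python
import torch

# Re-check the arithmetic: with x a 2x2 matrix of ones,
# o = mean(3 * (x + 2)^2), so do/dx_i = (3/2) * (x_i + 2) = 4.5 at x_i = 1.
x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()
out.backward()
print(x.grad)  # every entry is 4.5
```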
 ###############################################################
-# Mathematically, if you have a vector valued function :math:`\vec{y}=f(\vec{x})`,
+# If you have a vector valued function :math:`\vec{y}=f(\vec{x})`,
 # then the gradient of :math:`\vec{y}` with respect to :math:`\vec{x}`
 # is a Jacobian matrix:
 #
@@ -152,7 +199,7 @@
 # non-scalar output.
 
 ###############################################################
-# Now let's take a look at an example of vector-Jacobian product:
+# Check out this example to see the vector-Jacobian product in practice:
 
 x = torch.randn(3, requires_grad=True)
@@ -167,15 +214,17 @@
 # could not compute the full Jacobian directly, but if we just
 # want the vector-Jacobian product, simply pass the vector to
 # ``backward`` as argument:
+
 v = torch.tensor([0.1, 1.0, 0.0001], dtype=torch.float)
 y.backward(v)
 
 print(x.grad)
 
 ###############################################################
-# You can also stop autograd from tracking history on Tensors
-# with ``.requires_grad=True`` either by wrapping the code block in
+# As mentioned previously, you can stop ``autograd`` from tracking history on tensors
+# (via ``.requires_grad=True``) either by wrapping the code block in
 # ``with torch.no_grad():``
+
 print(x.requires_grad)
 print((x ** 2).requires_grad)
 
@@ -183,16 +232,19 @@ print((x ** 2).requires_grad)
 
 ###############################################################
-# Or by using ``.detach()`` to get a new Tensor with the same
-# content but that does not require gradients:
+# Or by using ``.detach()``, which yields a new tensor with the same
+# content that does not require gradients:
+
 print(x.requires_grad)
 y = x.detach()
 print(y.requires_grad)
 print(x.eq(y).all())
 
-
 ###############################################################
+# With this understanding of how ``autograd`` works in PyTorch, let's move on to
+# the next section to construct our neural networks.
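An aside on the Jacobian discussion above: when the full Jacobian is needed, rather than a vector-Jacobian product obtained via ``backward``, newer PyTorch versions expose a helper in ``torch.autograd.functional``. This is an addition beyond the tutorial text, sketched here for completeness:

```python
import torch
from torch.autograd.functional import jacobian

# For the elementwise function f(x) = 3 * x, the Jacobian is 3 * I.
# ``jacobian`` builds the full matrix via repeated vector-Jacobian products.
def f(x):
    return 3 * x

x = torch.randn(3)
J = jacobian(f, x)
print(J)  # a 3x3 matrix with 3.0 on the diagonal and 0 elsewhere
```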
+#
 # **Read Later:**
 #
-# Document about ``autograd.Function`` is at
+# For more information about ``autograd.Function``, check out our documentation:
 # https://pytorch.org/docs/stable/autograd.html#function
diff --git a/beginner_source/blitz/tensor_tutorial.py b/beginner_source/blitz/tensor_tutorial.py
index 7b339ee225f..3ce10f3ca3f 100644
--- a/beginner_source/blitz/tensor_tutorial.py
+++ b/beginner_source/blitz/tensor_tutorial.py
@@ -3,101 +3,147 @@
 What is PyTorch?
 ================
 
-It’s a Python-based scientific computing package targeted at two sets of
+It is an open source machine learning framework that accelerates the
+path from research prototyping to production deployment.
+
+PyTorch is built as a Python-based scientific computing package targeted at two sets of
 audiences:
 
-- A replacement for NumPy to use the power of GPUs
-- a deep learning research platform that provides maximum flexibility
-  and speed
+- Those who are looking for a replacement for NumPy to use the power of GPUs.
+- Researchers who want to build with a deep learning platform that provides maximum flexibility
+  and speed.
 
 Getting Started
 ---------------
 
+In this section of the tutorial, we will introduce the concept of a tensor in PyTorch, and its operations.
+
 Tensors
 ^^^^^^^
 
-Tensors are similar to NumPy’s ndarrays, with the addition being that
-Tensors can also be used on a GPU to accelerate computing.
+A tensor is a generic n-dimensional array. Tensors in PyTorch are similar to NumPy’s ndarrays,
+with the addition being that tensors can also be used on a GPU to accelerate computing.
+
+To see the behavior of tensors, we will first need to import PyTorch into our program.
 """
 
 from __future__ import print_function
 import torch
 
-###############################################################
-# .. note::
-#     An uninitialized matrix is declared,
-#     but does not contain definite known
-#     values before it is used.
When an -# uninitialized matrix is created, -# whatever values were in the allocated -# memory at the time will appear as the initial values. +""" +We import ``future`` here to help port our code from Python 2 to Python 3. +For more details, see the `Python-Future technical documentation `_. + +Let's take a look at how we can create tensors. +""" ############################################################### -# Construct a 5x3 matrix, uninitialized: +# First, construct a 5x3 empty matrix: x = torch.empty(5, 3) print(x) + +""" +``torch.empty`` creates an uninitialized matrix of type tensor. +When an empty tensor is declared, it does not contain definite known values +before you populate it. The values in the empty tensor are those that were in +the allocated memory at the time of initialization. +""" ############################################################### -# Construct a randomly initialized matrix: +# Now, construct a randomly initialized matrix: x = torch.rand(5, 3) print(x) +""" +``torch.rand`` creates an initialized matrix of type tensor with a random +sampling of values. +""" + ############################################################### # Construct a matrix filled zeros and of dtype long: x = torch.zeros(5, 3, dtype=torch.long) print(x) +""" +``torch.zeros`` creates an initialized matrix of type tensor with every +index having a value of zero. +""" + ############################################################### -# Construct a tensor directly from data: +# Let's construct a tensor with data that we define ourselves: x = torch.tensor([5.5, 3]) print(x) +""" +Our tensor can represent all types of data. This data can be an audio waveform, the +pixels of an image, even entities of a language. + +PyTorch has packages that support these specific data types. 
For additional learning, see:
+
+- `torchvision `_
+- `torchtext `_
+- `torchaudio `_
+"""
+
 ###############################################################
-# or create a tensor based on an existing tensor. These methods
-# will reuse properties of the input tensor, e.g. dtype, unless
-# new values are provided by user
+# You can create a tensor based on an existing tensor. These methods
+# reuse the properties of the input tensor, e.g. ``dtype``, unless
+# new values are provided by the user.
+#
-x = x.new_ones(5, 3, dtype=torch.double)      # new_* methods take in sizes
+x = x.new_ones(5, 3, dtype=torch.double)
 print(x)
 
 x = torch.randn_like(x, dtype=torch.float)    # override dtype!
 print(x)                                      # result has the same size
 
+"""
+``tensor.new_ones`` takes in the size of the desired tensor and a ``dtype``,
+returning a tensor filled with ones.
+
+In this example, ``torch.randn_like`` creates a new tensor based upon the
+input tensor, and overrides the ``dtype`` to be a float. The output of
+this method is a tensor of the same size and a different ``dtype``.
+"""
+
 ###############################################################
-# Get its size:
+# We can get the size of a tensor as a tuple:
 
 print(x.size())
 
 ###############################################################
 # .. note::
-#   ``torch.Size`` is in fact a tuple, so it supports all tuple operations.
+#   Since ``torch.Size`` is a tuple, it supports all tuple operations.
 #
 # Operations
 # ^^^^^^^^^^
-# There are multiple syntaxes for operations. In the following
-# example, we will take a look at the addition operation.
+# There are multiple syntaxes for operations that can be performed on tensors.
+# In the following example, we will take a look at the addition operation.
 #
-# Addition: syntax 1
+# First, let's try using the ``+`` operator.
+
 y = torch.rand(5, 3)
 print(x + y)
 
 ###############################################################
-# Addition: syntax 2
+# Using the ``+`` operator should have the same output as using the
+# ``add()`` method.
 
 print(torch.add(x, y))
 
 ###############################################################
-# Addition: providing an output tensor as argument
+# You can also provide a tensor as an argument to the ``add()``
+# method; this tensor will contain the output of the operation.
+
 result = torch.empty(5, 3)
 torch.add(x, y, out=result)
 print(result)
 
 ###############################################################
-# Addition: in-place
+# Finally, you can perform this operation in-place.
 
 # adds x to y
 y.add_(x)
@@ -107,21 +153,29 @@
 # .. note::
 #   Any operation that mutates a tensor in-place is post-fixed with an ``_``.
 #   For example: ``x.copy_(y)``, ``x.t_()``, will change ``x``.
-#
-# You can use standard NumPy-like indexing with all bells and whistles!
+
+###############################################################
+# Similar to NumPy, tensors can be indexed using the standard
+# Python ``x[i]`` syntax, where ``x`` is the array and ``i`` is the selection.
+#
+# That said, you can use NumPy-like indexing with all its bells and whistles!
 
 print(x[:, 1])
 
 ###############################################################
-# Resizing: If you want to resize/reshape tensor, you can use ``torch.view``:
+# Resizing your tensors might be necessary for your data.
+# If you want to resize or reshape a tensor, you can use ``torch.view``:
+
 x = torch.randn(4, 4)
 y = x.view(16)
 z = x.view(-1, 8)  # the size -1 is inferred from other dimensions
 print(x.size(), y.size(), z.size())
 
 ###############################################################
-# If you have a one element tensor, use ``.item()`` to get the value as a
-# Python number
+# You can access the Python number value of a one-element tensor using ``.item()``.
+# If you have a multidimensional tensor, see the
+# `tolist() `_ method.
+
 x = torch.randn(1)
 print(x)
 print(x.item())
@@ -130,43 +184,55 @@
 # **Read later:**
 #
 #
-# 100+ Tensor operations, including transposing, indexing, slicing,
-# mathematical operations, linear algebra, random numbers, etc.,
-# are described
-# `here `_.
+# This was just a sample of the 100+ Tensor operations you have
+# access to in PyTorch. There are many others, including transposing,
+# indexing, slicing, mathematical operations, linear algebra,
+# random numbers, and more. Read and explore more about them in our
+# `technical documentation `_.
 #
 # NumPy Bridge
 # ------------
 #
-# Converting a Torch Tensor to a NumPy array and vice versa is a breeze.
+# As mentioned earlier, one of the benefits of using PyTorch is that it
+# is built to provide a seamless transition from NumPy.
+#
+# For example, converting a Torch Tensor to a NumPy array (and vice versa)
+# is a breeze.
 #
 # The Torch Tensor and NumPy array will share their underlying memory
-# locations (if the Torch Tensor is on CPU), and changing one will change
+# locations (if the Torch Tensor is on CPU). That means changing one will change
 # the other.
 #
+# Let's see this in action.
+#
 # Converting a Torch Tensor to a NumPy Array
 # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
+# First, construct a 1-dimensional tensor populated with ones.
 
 a = torch.ones(5)
 print(a)
 
 ###############################################################
-#
+# Now, let's construct a NumPy array based on that tensor.
 
 b = a.numpy()
 print(b)
 
 ###############################################################
-# See how the numpy array changed in value.
+# Let's see how they share their memory locations. Add ``1`` to the torch tensor.
 
 a.add_(1)
 print(a)
 print(b)
 
+###############################################################
+# Note how the NumPy array also changed in value.
+
 ###############################################################
 # Converting NumPy Array to Torch Tensor
 # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-# See how changing the np array changed the Torch Tensor automatically
+# Now, try the same thing going from a NumPy array to a Torch Tensor.
+# See how changing the NumPy array changed the Torch Tensor automatically as well.
 
 import numpy as np
 a = np.ones(5)
@@ -176,15 +242,17 @@
 print(b)
 
 ###############################################################
-# All the Tensors on the CPU except a CharTensor support converting to
+# All the Tensors on the CPU (except a CharTensor) support converting to
 # NumPy and back.
 #
 # CUDA Tensors
 # ------------
 #
 # Tensors can be moved onto any device using the ``.to`` method.
+# The following code block can be run by changing the runtime in
+# your notebook to "GPU".
 
-# let us run this cell only if CUDA is available
+# This cell will run only if CUDA is available
 # We will use ``torch.device`` objects to move tensors in and out of GPU
 if torch.cuda.is_available():
     device = torch.device("cuda")          # a CUDA device object
@@ -193,3 +261,7 @@
     z = x + y
     print(z)
     print(z.to("cpu", torch.double))       # ``.to`` can also change dtype together!
+
+###############################################################
+# Now that you have had time to experiment with Tensors in PyTorch, let's take
+# a look at Automatic Differentiation.
diff --git a/conf.py b/conf.py index 06ba0f2c7ff..553fb91e010 100644 --- a/conf.py +++ b/conf.py @@ -61,8 +61,8 @@ sphinx_gallery_conf = { 'examples_dirs': ['beginner_source', 'intermediate_source', - 'advanced_source'], - 'gallery_dirs': ['beginner', 'intermediate', 'advanced'], + 'advanced_source', 'recipes_source'], + 'gallery_dirs': ['beginner', 'intermediate', 'advanced', 'recipes'], 'filename_pattern': os.environ.get('GALLERY_PATTERN', r'tutorial.py'), 'backreferences_dir': False } diff --git a/index.rst b/index.rst index 8dcb2b04e5d..3f70c074628 100644 --- a/index.rst +++ b/index.rst @@ -25,6 +25,18 @@ Some considerations: * Finally, here's a link to the `PyTorch Release Notes `_ +Recipes +------------------ +.. customgalleryitem:: + :figure: /_static/img/thumbnails/pytorch-logo-flat.png + :tooltip: Bite-sized tutorials + :description: :doc:`/recipes/recipes_index` + +.. raw:: html + +
+ + Learning PyTorch ------------------ @@ -188,6 +200,14 @@ Additional APIs .. ----------------------------------------- .. Page TOC .. ----------------------------------------- +.. toctree:: + :maxdepth: 2 + :hidden: + :includehidden: + :caption: Recipes + + recipes/recipes_index + .. toctree:: :maxdepth: 2 :hidden: diff --git a/recipes_source/README.txt b/recipes_source/README.txt new file mode 100644 index 00000000000..26ea2023051 --- /dev/null +++ b/recipes_source/README.txt @@ -0,0 +1,5 @@ +Recipes +------------------ +1. recipes/* and recipes_index.rst + PyTorch Recipes + https://pytorch.org/tutorials/recipes/recipes_index.html diff --git a/recipes_source/recipes/Intro_to_TorchScript_tutorial.py b/recipes_source/recipes/Intro_to_TorchScript_tutorial.py new file mode 100644 index 00000000000..d9b0023a6ab --- /dev/null +++ b/recipes_source/recipes/Intro_to_TorchScript_tutorial.py @@ -0,0 +1,393 @@ +""" +Introduction to TorchScript +=========================== + +*James Reed (jamesreed@fb.com), Michael Suo (suo@fb.com)*, rev2 + +This tutorial is an introduction to TorchScript, an intermediate +representation of a PyTorch model (subclass of ``nn.Module``) that +can then be run in a high-performance environment such as C++. + +In this tutorial we will cover: + +1. The basics of model authoring in PyTorch, including: + +- Modules +- Defining ``forward`` functions +- Composing modules into a hierarchy of modules + +2. Specific methods for converting PyTorch modules to TorchScript, our + high-performance deployment runtime + +- Tracing an existing module +- Using scripting to directly compile a module +- How to compose both approaches +- Saving and loading TorchScript modules + +We hope that after you complete this tutorial, you will proceed to go through +`the follow-on tutorial `_ +which will walk you through an example of actually calling a TorchScript +model from C++. + +""" + +import torch # This is all you need to use both PyTorch and TorchScript! 
+print(torch.__version__)
+
+
+######################################################################
+# Basics of PyTorch Model Authoring
+# ---------------------------------
+#
+# Let’s start out by defining a simple ``Module``. A ``Module`` is the
+# basic unit of composition in PyTorch. It contains:
+#
+# 1. A constructor, which prepares the module for invocation
+# 2. A set of ``Parameters`` and sub-\ ``Modules``. These are initialized
+#    by the constructor and can be used by the module during invocation.
+# 3. A ``forward`` function. This is the code that is run when the module
+#    is invoked.
+#
+# Let’s examine a small example:
+#
+
+class MyCell(torch.nn.Module):
+    def __init__(self):
+        super(MyCell, self).__init__()
+
+    def forward(self, x, h):
+        new_h = torch.tanh(x + h)
+        return new_h, new_h
+
+my_cell = MyCell()
+x = torch.rand(3, 4)
+h = torch.rand(3, 4)
+print(my_cell(x, h))
+
+
+######################################################################
+# So we’ve:
+#
+# 1. Created a class that subclasses ``torch.nn.Module``.
+# 2. Defined a constructor. The constructor doesn’t do much, just calls
+#    the constructor for ``super``.
+# 3. Defined a ``forward`` function, which takes two inputs and returns
+#    two outputs. The actual contents of the ``forward`` function are not
+#    really important, but it’s sort of a fake `RNN
+#    cell `__; that
+#    is, it’s a function that is applied in a loop.
+#
+# We instantiated the module, and made ``x`` and ``h``, which are just 3x4
+# matrices of random values. Then we invoked the cell with
+# ``my_cell(x, h)``. This in turn calls our ``forward`` function.
+#
+# Let’s do something a little more interesting:
+#
+
+class MyCell(torch.nn.Module):
+    def __init__(self):
+        super(MyCell, self).__init__()
+        self.linear = torch.nn.Linear(4, 4)
+
+    def forward(self, x, h):
+        new_h = torch.tanh(self.linear(x) + h)
+        return new_h, new_h
+
+my_cell = MyCell()
+print(my_cell)
+print(my_cell(x, h))
+
+
+######################################################################
+# We’ve redefined our module ``MyCell``, but this time we’ve added a
+# ``self.linear`` attribute, and we invoke ``self.linear`` in the forward
+# function.
+#
+# What exactly is happening here? ``torch.nn.Linear`` is a ``Module`` from
+# the PyTorch standard library. Just like ``MyCell``, it can be invoked
+# using the call syntax. We are building a hierarchy of ``Module``\ s.
+#
+# ``print`` on a ``Module`` will give a visual representation of the
+# ``Module``\ ’s subclass hierarchy. In our example, we can see our
+# ``Linear`` subclass and its parameters.
+#
+# By composing ``Module``\ s in this way, we can succinctly and readably
+# author models with reusable components.
+#
+# You may have noticed ``grad_fn`` on the outputs. This is a detail of
+# PyTorch’s method of automatic differentiation, called
+# `autograd `__.
+# In short, this system allows us to compute derivatives through
+# potentially complex programs. The design allows for a massive amount of
+# flexibility in model authoring.
+# +# Now let’s examine said flexibility: +# + +class MyDecisionGate(torch.nn.Module): + def forward(self, x): + if x.sum() > 0: + return x + else: + return -x + +class MyCell(torch.nn.Module): + def __init__(self): + super(MyCell, self).__init__() + self.dg = MyDecisionGate() + self.linear = torch.nn.Linear(4, 4) + + def forward(self, x, h): + new_h = torch.tanh(self.dg(self.linear(x)) + h) + return new_h, new_h + +my_cell = MyCell() +print(my_cell) +print(my_cell(x, h)) + + +###################################################################### +# We’ve once again redefined our MyCell class, but here we’ve defined +# ``MyDecisionGate``. This module utilizes **control flow**. Control flow +# consists of things like loops and ``if``-statements. +# +# Many frameworks take the approach of computing symbolic derivatives +# given a full program representation. However, in PyTorch, we use a +# gradient tape. We record operations as they occur, and replay them +# backwards in computing derivatives. In this way, the framework does not +# have to explicitly define derivatives for all constructs in the +# language. +# +# .. figure:: https://github.com/pytorch/pytorch/raw/master/docs/source/_static/img/dynamic_graph.gif +# :alt: How autograd works +# +# How autograd works +# + + +###################################################################### +# Basics of TorchScript +# --------------------- +# +# Now let’s take our running example and see how we can apply TorchScript. +# +# In short, TorchScript provides tools to capture the definition of your +# model, even in light of the flexible and dynamic nature of PyTorch. +# Let’s begin by examining what we call **tracing**. 
+#
+# Tracing ``Modules``
+# ~~~~~~~~~~~~~~~~~~~
+#
+
+class MyCell(torch.nn.Module):
+    def __init__(self):
+        super(MyCell, self).__init__()
+        self.linear = torch.nn.Linear(4, 4)
+
+    def forward(self, x, h):
+        new_h = torch.tanh(self.linear(x) + h)
+        return new_h, new_h
+
+my_cell = MyCell()
+x, h = torch.rand(3, 4), torch.rand(3, 4)
+traced_cell = torch.jit.trace(my_cell, (x, h))
+print(traced_cell)
+traced_cell(x, h)
+
+
+######################################################################
+# We’ve rewound a bit and taken the second version of our ``MyCell``
+# class. As before, we’ve instantiated it, but this time, we’ve called
+# ``torch.jit.trace``, passed in the ``Module``, and passed in *example
+# inputs* the network might see.
+#
+# What exactly has this done? It has invoked the ``Module``, recorded the
+# operations that occurred when the ``Module`` was run, and created an
+# instance of ``torch.jit.ScriptModule`` (of which ``TracedModule`` is an
+# instance).
+#
+# TorchScript records its definitions in an Intermediate Representation
+# (or IR), commonly referred to in deep learning as a *graph*. We can
+# examine the graph with the ``.graph`` property:
+#
+
+print(traced_cell.graph)
+
+
+######################################################################
+# However, this is a very low-level representation and most of the
+# information contained in the graph is not useful for end users. Instead,
+# we can use the ``.code`` property to give a Python-syntax interpretation
+# of the code:
+#
+
+print(traced_cell.code)
+
+
+######################################################################
+# So **why** did we do all this? There are several reasons:
+#
+# 1. TorchScript code can be invoked in its own interpreter, which is
+#    basically a restricted Python interpreter. This interpreter does not
+#    acquire the Global Interpreter Lock, and so many requests can be
+#    processed on the same instance simultaneously.
+# 2.
This format allows us to save the whole model to disk and load it +# into another environment, such as in a server written in a language +# other than Python +# 3. TorchScript gives us a representation in which we can do compiler +# optimizations on the code to provide more efficient execution +# 4. TorchScript allows us to interface with many backend/device runtimes +# that require a broader view of the program than individual operators. +# +# We can see that invoking ``traced_cell`` produces the same results as +# the Python module: +# + +print(my_cell(x, h)) +print(traced_cell(x, h)) + + +###################################################################### +# Using Scripting to Convert Modules +# ---------------------------------- +# +# There’s a reason we used version two of our module, and not the one with +# the control-flow-laden submodule. Let’s examine that now: +# + +class MyDecisionGate(torch.nn.Module): + def forward(self, x): + if x.sum() > 0: + return x + else: + return -x + +class MyCell(torch.nn.Module): + def __init__(self, dg): + super(MyCell, self).__init__() + self.dg = dg + self.linear = torch.nn.Linear(4, 4) + + def forward(self, x, h): + new_h = torch.tanh(self.dg(self.linear(x)) + h) + return new_h, new_h + +my_cell = MyCell(MyDecisionGate()) +traced_cell = torch.jit.trace(my_cell, (x, h)) +print(traced_cell.code) + + +###################################################################### +# Looking at the ``.code`` output, we can see that the ``if-else`` branch +# is nowhere to be found! Why? Tracing does exactly what we said it would: +# run the code, record the operations *that happen* and construct a +# ScriptModule that does exactly that. Unfortunately, things like control +# flow are erased. +# +# How can we faithfully represent this module in TorchScript? We provide a +# **script compiler**, which does direct analysis of your Python source +# code to transform it into TorchScript. 
Let’s convert ``MyDecisionGate`` +# using the script compiler: +# + +scripted_gate = torch.jit.script(MyDecisionGate()) + +my_cell = MyCell(scripted_gate) +traced_cell = torch.jit.script(my_cell) +print(traced_cell.code) + + +###################################################################### +# Hooray! We’ve now faithfully captured the behavior of our program in +# TorchScript. Let’s now try running the program: +# + +# New inputs +x, h = torch.rand(3, 4), torch.rand(3, 4) +traced_cell(x, h) + + +###################################################################### +# Mixing Scripting and Tracing +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# Some situations call for using tracing rather than scripting (e.g. a +# module has many architectural decisions that are made based on constant +# Python values that we would like to not appear in TorchScript). In this +# case, scripting can be composed with tracing: ``torch.jit.script`` will +# inline the code for a traced module, and tracing will inline the code +# for a scripted module. 
+# +# An example of the first case: +# + +class MyRNNLoop(torch.nn.Module): + def __init__(self): + super(MyRNNLoop, self).__init__() + self.cell = torch.jit.trace(MyCell(scripted_gate), (x, h)) + + def forward(self, xs): + h, y = torch.zeros(3, 4), torch.zeros(3, 4) + for i in range(xs.size(0)): + y, h = self.cell(xs[i], h) + return y, h + +rnn_loop = torch.jit.script(MyRNNLoop()) +print(rnn_loop.code) + + + +###################################################################### +# And an example of the second case: +# + +class WrapRNN(torch.nn.Module): + def __init__(self): + super(WrapRNN, self).__init__() + self.loop = torch.jit.script(MyRNNLoop()) + + def forward(self, xs): + y, h = self.loop(xs) + return torch.relu(y) + +traced = torch.jit.trace(WrapRNN(), (torch.rand(10, 3, 4))) +print(traced.code) + + +###################################################################### +# In this way, scripting and tracing can each be used when the situation +# calls for one of them, or the two can be combined in a single program. +# +# Saving and Loading Models +# ------------------------- +# +# We provide APIs to save and load TorchScript modules to/from disk in an +# archive format. This format includes code, parameters, attributes, and +# debug information, meaning that the archive is a freestanding +# representation of the model that can be loaded in an entirely separate +# process. Let’s save and load our wrapped RNN module: +# + +traced.save('wrapped_rnn.zip') + +loaded = torch.jit.load('wrapped_rnn.zip') + +print(loaded) +print(loaded.code) + + +###################################################################### +# As you can see, serialization preserves the module hierarchy and the +# code we’ve been examining throughout. The model can also be loaded, for +# example, `into +# C++ `__ for +# Python-free execution. +# +# Further Reading +# ~~~~~~~~~~~~~~~ +# +# We’ve completed our tutorial! 
For a more involved demonstration, check +# out the NeurIPS demo for converting machine translation models using +# TorchScript: +# https://colab.research.google.com/drive/1HiICg6jRkBnr5hvK2-VnMi88Vi9pUzEJ +# diff --git a/recipes_source/recipes/README.txt b/recipes_source/recipes/README.txt new file mode 100644 index 00000000000..a7cead8432a --- /dev/null +++ b/recipes_source/recipes/README.txt @@ -0,0 +1,71 @@ +PyTorch Recipes +--------------------------------------------- +1. aws_distributed_training_tutorial.py + (advanced) PyTorch 1.0 Distributed Trainer with Amazon AWS + https://pytorch.org/tutorials/recipes/recipes/aws_distributed_training_tutorial.html + +2. cpp_export.rst + Loading a TorchScript Model in C++ + https://pytorch.org/tutorials/recipes/recipes/cpp_export.html + +3. cpp_extension.py + Custom C++ and CUDA Extensions + https://pytorch.org/tutorials/recipes/recipes/cpp_extension.html + +4. custom_dataset.ipynb + Writing Custom Data Transformations + https://pytorch.org/tutorials/recipes/recipes/custom_dataset.html + +5. data_loading_tutorial.py + Writing Custom Datasets, DataLoaders and Transforms + https://pytorch.org/tutorials/recipes/recipes/data_loading_tutorial.html + +6. ddp_tutorial.rst + Getting Started with Distributed Data Parallel + https://pytorch.org/tutorials/recipes/recipes/ddp_tutorial.html + +7. disto_tuto.rst + Writing Distributed Applications with PyTorch + https://pytorch.org/tutorials/recipes/recipes/disto_tuto.html + +8. example_recipe.py + Example Recipe + https://pytorch.org/tutorials/recipes/recipes/example_recipe.html + +9. flask_rest_api_tutorial.py + Deploying PyTorch in Python via a REST API with Flask + https://pytorch.org/tutorials/recipes/recipes/flask_rest_api_tutorial.html + +10. Intro_to_TorchScript_tutorial.py + Introduction to TorchScript + https://pytorch.org/tutorials/recipes/recipes/Intro_to_TorchScript_tutorial.html + +11. 
model_parallel_tutorial.py + Model Parallel Best Practices + https://pytorch.org/tutorials/recipes/recipes/model_parallel_tutorial.html + +12. numpy_extensions_tutorial.py + Creating Extensions Using numpy and scipy + https://pytorch.org/tutorials/recipes/recipes/numpy_extensions_tutorial.html + +13. rpc_tutorial.rst + Getting Started with Distributed RPC Framework + https://pytorch.org/tutorials/recipes/recipes/rpc_tutorial.html + +14. super_resolution_with_onnxruntime.rst + (optional) Exporting a Model from PyTorch to ONNX and Running it using ONNX Runtime + https://pytorch.org/tutorials/recipes/recipes/super_resolution_with_onnxruntime.html + +15. torch_script_custom_classes.rst + Extending TorchScript with Custom C++ Classes + https://pytorch.org/tutorials/recipes/recipes/torch_script_custom_classes.html + +16. torch_script_custom_ops.rst + Extending TorchScript with Custom C++ Operators + https://pytorch.org/tutorials/recipes/recipes/torch_script_custom_ops.html + + + + + + diff --git a/recipes_source/recipes/autograd_tutorial.py b/recipes_source/recipes/autograd_tutorial.py new file mode 100644 index 00000000000..98e70a251d6 --- /dev/null +++ b/recipes_source/recipes/autograd_tutorial.py @@ -0,0 +1,198 @@ +# -*- coding: utf-8 -*- +""" +Autograd: Automatic Differentiation +=================================== + +Central to all neural networks in PyTorch is the ``autograd`` package. +Let’s first briefly visit this, and we will then go on to training our +first neural network. + + +The ``autograd`` package provides automatic differentiation for all operations +on Tensors. It is a define-by-run framework, which means that your backprop is +defined by how your code is run, and that every single iteration can be +different. + +Let us see this in simpler terms with some examples. + +Tensor +-------- + +``torch.Tensor`` is the central class of the package. If you set its attribute +``.requires_grad`` as ``True``, it starts to track all operations on it. 
When +you finish your computation, you can call ``.backward()`` and have all the +gradients computed automatically. The gradient for this tensor will be +accumulated into the ``.grad`` attribute. + +To stop a tensor from tracking history, you can call ``.detach()`` to detach +it from the computation history, and to prevent future computation from being +tracked. + +To prevent tracking history (and using memory), you can also wrap the code block +in ``with torch.no_grad():``. This can be particularly helpful when evaluating a +model because the model may have trainable parameters with +``requires_grad=True``, but for which we don't need the gradients. + +There’s one more class which is very important for autograd +implementation - a ``Function``. + +``Tensor`` and ``Function`` are interconnected and build up an acyclic +graph that encodes a complete history of computation. Each tensor has +a ``.grad_fn`` attribute that references a ``Function`` that has created +the ``Tensor`` (except for Tensors created by the user - their +``grad_fn is None``). + +If you want to compute the derivatives, you can call ``.backward()`` on +a ``Tensor``. If ``Tensor`` is a scalar (i.e. it holds a single element +of data), you don’t need to specify any arguments to ``backward()``; +however, if it has more elements, you need to specify a ``gradient`` +argument that is a tensor of matching shape. +""" + +import torch + +############################################################### +# Create a tensor and set ``requires_grad=True`` to track computation with it +x = torch.ones(2, 2, requires_grad=True) +print(x) + +############################################################### +# Do a tensor operation: +y = x + 2 +print(y) + +############################################################### +# ``y`` was created as a result of an operation, so it has a ``grad_fn``. 
+print(y.grad_fn) + +############################################################### +# Do more operations on ``y`` +z = y * y * 3 +out = z.mean() + +print(z, out) + +################################################################ +# ``.requires_grad_( ... )`` changes an existing Tensor's ``requires_grad`` +# flag in-place. The input flag defaults to ``False`` if not given. +a = torch.randn(2, 2) +a = ((a * 3) / (a - 1)) +print(a.requires_grad) +a.requires_grad_(True) +print(a.requires_grad) +b = (a * a).sum() +print(b.grad_fn) + +############################################################### +# Gradients +# --------- +# Let's backprop now. +# Because ``out`` contains a single scalar, ``out.backward()`` is +# equivalent to ``out.backward(torch.tensor(1.))``. + +out.backward() + +############################################################### +# Print gradients d(out)/dx +# + +print(x.grad) + +############################################################### +# You should have got a matrix of ``4.5``. Let’s call the ``out`` +# *Tensor* “:math:`o`”. +# We have that :math:`o = \frac{1}{4}\sum_i z_i`, +# :math:`z_i = 3(x_i+2)^2` and :math:`z_i\bigr\rvert_{x_i=1} = 27`. +# Therefore, +# :math:`\frac{\partial o}{\partial x_i} = \frac{3}{2}(x_i+2)`, hence +# :math:`\frac{\partial o}{\partial x_i}\bigr\rvert_{x_i=1} = \frac{9}{2} = 4.5`. + +############################################################### +# Mathematically, if you have a vector valued function :math:`\vec{y}=f(\vec{x})`, +# then the gradient of :math:`\vec{y}` with respect to :math:`\vec{x}` +# is a Jacobian matrix: +# +# .. math:: +# J=\left(\begin{array}{ccc} +# \frac{\partial y_{1}}{\partial x_{1}} & \cdots & \frac{\partial y_{1}}{\partial x_{n}}\\ +# \vdots & \ddots & \vdots\\ +# \frac{\partial y_{m}}{\partial x_{1}} & \cdots & \frac{\partial y_{m}}{\partial x_{n}} +# \end{array}\right) +# +# Generally speaking, ``torch.autograd`` is an engine for computing +# vector-Jacobian product. 
That is, given any vector +# :math:`v=\left(\begin{array}{cccc} v_{1} & v_{2} & \cdots & v_{m}\end{array}\right)^{T}`, +# compute the product :math:`v^{T}\cdot J`. If :math:`v` happens to be +# the gradient of a scalar function :math:`l=g\left(\vec{y}\right)`, +# that is, +# :math:`v=\left(\begin{array}{ccc}\frac{\partial l}{\partial y_{1}} & \cdots & \frac{\partial l}{\partial y_{m}}\end{array}\right)^{T}`, +# then by the chain rule, the vector-Jacobian product would be the +# gradient of :math:`l` with respect to :math:`\vec{x}`: +# +# .. math:: +# J^{T}\cdot v=\left(\begin{array}{ccc} +# \frac{\partial y_{1}}{\partial x_{1}} & \cdots & \frac{\partial y_{m}}{\partial x_{1}}\\ +# \vdots & \ddots & \vdots\\ +# \frac{\partial y_{1}}{\partial x_{n}} & \cdots & \frac{\partial y_{m}}{\partial x_{n}} +# \end{array}\right)\left(\begin{array}{c} +# \frac{\partial l}{\partial y_{1}}\\ +# \vdots\\ +# \frac{\partial l}{\partial y_{m}} +# \end{array}\right)=\left(\begin{array}{c} +# \frac{\partial l}{\partial x_{1}}\\ +# \vdots\\ +# \frac{\partial l}{\partial x_{n}} +# \end{array}\right) +# +# (Note that :math:`v^{T}\cdot J` gives a row vector which can be +# treated as a column vector by taking :math:`J^{T}\cdot v`.) +# +# This characteristic of vector-Jacobian product makes it very +# convenient to feed external gradients into a model that has +# non-scalar output. + +############################################################### +# Now let's take a look at an example of vector-Jacobian product: + +x = torch.randn(3, requires_grad=True) + +y = x * 2 +while y.data.norm() < 1000: + y = y * 2 + +print(y) + +############################################################### +# Now in this case ``y`` is no longer a scalar. 
``torch.autograd`` +# could not compute the full Jacobian directly, but if we just +# want the vector-Jacobian product, simply pass the vector to +# ``backward`` as argument: +v = torch.tensor([0.1, 1.0, 0.0001], dtype=torch.float) +y.backward(v) + +print(x.grad) + +############################################################### +# You can also stop autograd from tracking history on Tensors +# with ``.requires_grad=True`` either by wrapping the code block in +# ``with torch.no_grad():`` +print(x.requires_grad) +print((x ** 2).requires_grad) + +with torch.no_grad(): + print((x ** 2).requires_grad) + +############################################################### +# Or by using ``.detach()`` to get a new Tensor with the same +# content but that does not require gradients: +print(x.requires_grad) +y = x.detach() +print(y.requires_grad) +print(x.eq(y).all()) + + +############################################################### +# **Read Later:** +# +# Document about ``autograd.Function`` is at +# https://pytorch.org/docs/stable/autograd.html#function diff --git a/recipes_source/recipes/aws_distributed_training_tutorial.py b/recipes_source/recipes/aws_distributed_training_tutorial.py new file mode 100644 index 00000000000..1789516c2b0 --- /dev/null +++ b/recipes_source/recipes/aws_distributed_training_tutorial.py @@ -0,0 +1,691 @@ +""" +(advanced) PyTorch 1.0 Distributed Trainer with Amazon AWS +============================================================= + +**Author**: `Nathan Inkawhich `_ + +**Edited by**: `Teng Li `_ + +""" + + +###################################################################### +# In this tutorial we will show how to setup, code, and run a PyTorch 1.0 +# distributed trainer across two multi-gpu Amazon AWS nodes. We will start +# with describing the AWS setup, then the PyTorch environment +# configuration, and finally the code for the distributed trainer. 
+# Hopefully you will find that there is actually very little code change +# required to extend your current training code to a distributed +# application, and most of the work is in the one-time environment setup. +# + + +###################################################################### +# Amazon AWS Setup +# ---------------- +# +# In this tutorial we will run distributed training across two multi-gpu +# nodes. In this section we will first cover how to create the nodes, then +# how to set up the security group so the nodes can communicate with +# each other. +# +# Creating the Nodes +# ~~~~~~~~~~~~~~~~~~ +# +# In Amazon AWS, there are seven steps to creating an instance. To get +# started, log in and select **Launch Instance**. +# +# **Step 1: Choose an Amazon Machine Image (AMI)** - Here we will select +# the ``Deep Learning AMI (Ubuntu) Version 14.0``. As described, this +# instance comes with many of the most popular deep learning frameworks +# installed and is preconfigured with CUDA, cuDNN, and NCCL. It is a very +# good starting point for this tutorial. +# +# **Step 2: Choose an Instance Type** - Now, select the GPU compute unit +# called ``p2.8xlarge``. Notice, each of these instances has a different +# cost, but this instance provides 8 NVIDIA Tesla K80 GPUs per node, and +# provides a good architecture for multi-gpu distributed training. +# +# **Step 3: Configure Instance Details** - The only setting to change here +# is increasing the *Number of instances* to 2. All other configurations +# may be left at default. +# +# **Step 4: Add Storage** - Notice, by default these nodes do not come +# with a lot of storage (only 75 GB). For this tutorial, since we are only +# using the STL-10 dataset, this is plenty of storage. But, if you want to +# train on a larger dataset such as ImageNet, you will have to add much +# more storage just to fit the dataset and any trained models you wish to +# save. 
+# +# **Step 5: Add Tags** - Nothing to be done here, just move on. +# +# **Step 6: Configure Security Group** - This is a critical step in the +# configuration process. By default two nodes in the same security group +# would not be able to communicate in the distributed training setting. +# Here, we want to create a **new** security group for the two nodes to be +# in. However, we cannot finish configuring in this step. For now, just +# remember your new security group name (e.g. launch-wizard-12) then move +# on to Step 7. +# +# **Step 7: Review Instance Launch** - Here, review the instance then +# launch it. By default, this will automatically start initializing the +# two instances. You can monitor the initialization progress from the +# dashboard. +# +# Configure Security Group +# ~~~~~~~~~~~~~~~~~~~~~~~~ +# +# Recall that we were not able to properly configure the security group +# when creating the instances. Once you have launched the instance, select +# the *Network & Security > Security Groups* tab in the EC2 dashboard. +# This will bring up a list of security groups you have access to. Select +# the new security group you created in Step 6 (i.e. launch-wizard-12), +# which will bring up tabs called *Description, Inbound, Outbound, and +# Tags*. First, select the *Inbound* tab and *Edit* to add a rule to allow +# "All Traffic" from "Sources" in the launch-wizard-12 security group. +# Then select the *Outbound* tab and do the exact same thing. Now, we have +# effectively allowed all Inbound and Outbound traffic of all types +# between nodes in the launch-wizard-12 security group. +# +# Necessary Information +# ~~~~~~~~~~~~~~~~~~~~~ +# +# Before continuing, we must find and remember the IP addresses of both +# nodes. In the EC2 dashboard find your running instances. For both +# instances, write down the *IPv4 Public IP* and the *Private IPs*. 
For +# the remainder of the document, we will refer to these as the +# **node0-publicIP**, **node0-privateIP**, **node1-publicIP**, and +# **node1-privateIP**. The public IPs are the addresses we will use to SSH +# in, and the private IPs will be used for inter-node communication. +# + + +###################################################################### +# Environment Setup +# ----------------- +# +# The next critical step is the setup of each node. Unfortunately, we +# cannot configure both nodes at the same time, so this process must be +# done on each node separately. However, this is a one-time setup, so once +# you have the nodes configured properly you will not have to reconfigure +# for future distributed training projects. +# +# The first step, once logged onto the node, is to create a new conda +# environment with Python 3.6 and numpy. Once created, activate the +# environment. +# +# :: +# +# $ conda create -n nightly_pt python=3.6 numpy +# $ source activate nightly_pt +# +# Next, we will install a nightly build of CUDA 9.0-enabled PyTorch with +# pip in the conda environment. +# +# :: +# +# $ pip install torch_nightly -f https://download.pytorch.org/whl/nightly/cu90/torch_nightly.html +# +# We must also install torchvision so we can use the torchvision model and +# dataset. At this time, we must build torchvision from source as the pip +# installation will by default install an old version of PyTorch on top of +# the nightly build we just installed. +# +# :: +# +# $ cd +# $ git clone https://github.com/pytorch/vision.git +# $ cd vision +# $ python setup.py install +# +# And finally, a **VERY IMPORTANT** step is to set the network interface +# name for the NCCL socket. This is set with the environment variable +# ``NCCL_SOCKET_IFNAME``. To get the correct name, run the ``ifconfig`` +# command on the node and look at the interface name that corresponds to +# the node's *privateIP* (e.g. ens3). 
Then set the environment variable as +# +# :: +# +# $ export NCCL_SOCKET_IFNAME=ens3 +# +# Remember, do this on both nodes. You may also consider adding the +# NCCL\_SOCKET\_IFNAME setting to your *.bashrc*. An important observation +# is that we did not set up a shared filesystem between the nodes. +# Therefore, each node will have to have a copy of the code and a copy of +# the datasets. For more information about setting up a shared network +# filesystem between nodes, see +# `here `__. +# + + +###################################################################### +# Distributed Training Code +# ------------------------- +# +# With the instances running and the environments set up, we can now get +# into the training code. Most of the code here has been taken from the +# `PyTorch ImageNet +# Example `__ +# which also supports distributed training. This code provides a good +# starting point for a custom trainer as it has much of the boilerplate +# training loop, validation loop, and accuracy tracking functionality. +# However, you will notice that the argument parsing and other +# non-essential functions have been stripped out for simplicity. +# +# In this example we will use the +# `torchvision.models.resnet18 `__ +# model and will train it on the +# `torchvision.datasets.STL10 `__ +# dataset. To accommodate the dimensionality mismatch of STL-10 with +# Resnet18, we will resize each image to 224x224 with a transform. Notice, +# the choice of model and dataset are orthogonal to the distributed +# training code; you may use any dataset and model you wish and the +# process is the same. Let's get started by first handling the imports and +# talking about some helper functions. Then we will define the train and +# test functions, which have been largely taken from the ImageNet Example. +# At the end, we will build the main part of the code which handles the +# distributed training setup. And finally, we will discuss how to actually +# run the code. 
+# + + +###################################################################### +# Imports +# ~~~~~~~ +# +# The important distributed training specific imports here are +# `torch.nn.parallel `__, +# `torch.distributed `__, +# `torch.utils.data.distributed `__, +# and +# `torch.multiprocessing `__. +# It is also important to set the multiprocessing start method to *spawn* +# or *forkserver* (only supported in Python 3), +# as the default is *fork* which may cause deadlocks when using multiple +# worker processes for dataloading. +# + +import time +import sys +import torch + +import torch.nn as nn +import torch.nn.parallel +import torch.distributed as dist +import torch.optim +import torch.utils.data +import torch.utils.data.distributed +import torchvision.transforms as transforms +import torchvision.datasets as datasets +import torchvision.models as models + +from torch.multiprocessing import Pool, Process + + +###################################################################### +# Helper Functions +# ~~~~~~~~~~~~~~~~ +# +# We must also define some helper functions and classes that will make +# training easier. The ``AverageMeter`` class tracks training statistics +# like accuracy and iteration count. The ``accuracy`` function computes +# and returns the top-k accuracy of the model so we can track learning +# progress. Both are provided for training convenience but neither are +# distributed training specific. 
+# + +class AverageMeter(object): + """Computes and stores the average and current value""" + def __init__(self): + self.reset() + + def reset(self): + self.val = 0 + self.avg = 0 + self.sum = 0 + self.count = 0 + + def update(self, val, n=1): + self.val = val + self.sum += val * n + self.count += n + self.avg = self.sum / self.count + +def accuracy(output, target, topk=(1,)): + """Computes the precision@k for the specified values of k""" + with torch.no_grad(): + maxk = max(topk) + batch_size = target.size(0) + + _, pred = output.topk(maxk, 1, True, True) + pred = pred.t() + correct = pred.eq(target.view(1, -1).expand_as(pred)) + + res = [] + for k in topk: + correct_k = correct[:k].view(-1).float().sum(0, keepdim=True) + res.append(correct_k.mul_(100.0 / batch_size)) + return res + + +###################################################################### +# Train Functions +# ~~~~~~~~~~~~~~~ +# +# To simplify the main loop, it is best to separate a training epoch step +# into a function called ``train``. This function trains the input model +# for one epoch of the *train\_loader*. The only distributed training +# artifact in this function is setting the +# `non\_blocking `__ +# attributes of the data and label tensors to ``True`` before the forward +# pass. This allows asynchronous GPU copies of the data meaning transfers +# can be overlapped with computation. This function also outputs training +# statistics along the way so we can track progress throughout the epoch. +# +# The other function to define here is ``adjust_learning_rate``, which +# decays the initial learning rate at a fixed schedule. This is another +# boilerplate trainer function that is useful to train accurate models. 
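As a quick sanity check, the running-average bookkeeping in ``AverageMeter`` can be exercised on its own, without a GPU or even a torch import; the class is repeated in this sketch (with the same definition as above) so the snippet is self-contained, and the simulated loss values are made up for illustration:

```python
class AverageMeter(object):
    """Computes and stores the average and current value (same as above)."""
    def __init__(self):
        self.reset()

    def reset(self):
        self.val = 0
        self.avg = 0
        self.sum = 0
        self.count = 0

    def update(self, val, n=1):
        # n is the number of samples the value was averaged over, so the
        # running average is weighted by batch size.
        self.val = val
        self.sum += val * n
        self.count += n
        self.avg = self.sum / self.count

# Simulate recording a per-batch mean loss with varying batch sizes:
losses = AverageMeter()
losses.update(2.0, n=32)   # batch of 32, mean loss 2.0
losses.update(1.0, n=16)   # batch of 16, mean loss 1.0
print(losses.val)          # most recent value: 1.0
print(losses.avg)          # weighted average: (2.0*32 + 1.0*16) / 48
```

The weighting by ``n`` is why the trainer passes ``input.size(0)`` to ``update``: batches at the end of an epoch may be smaller, and an unweighted mean would skew the statistics.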
+# + +def train(train_loader, model, criterion, optimizer, epoch): + + batch_time = AverageMeter() + data_time = AverageMeter() + losses = AverageMeter() + top1 = AverageMeter() + top5 = AverageMeter() + + # switch to train mode + model.train() + + end = time.time() + for i, (input, target) in enumerate(train_loader): + + # measure data loading time + data_time.update(time.time() - end) + + # Create non_blocking tensors for distributed training + input = input.cuda(non_blocking=True) + target = target.cuda(non_blocking=True) + + # compute output + output = model(input) + loss = criterion(output, target) + + # measure accuracy and record loss + prec1, prec5 = accuracy(output, target, topk=(1, 5)) + losses.update(loss.item(), input.size(0)) + top1.update(prec1[0], input.size(0)) + top5.update(prec5[0], input.size(0)) + + # compute gradients in a backward pass + optimizer.zero_grad() + loss.backward() + + # Call step of optimizer to update model params + optimizer.step() + + # measure elapsed time + batch_time.update(time.time() - end) + end = time.time() + + if i % 10 == 0: + print('Epoch: [{0}][{1}/{2}]\t' + 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' + 'Data {data_time.val:.3f} ({data_time.avg:.3f})\t' + 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' + 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\t' + 'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format( + epoch, i, len(train_loader), batch_time=batch_time, + data_time=data_time, loss=losses, top1=top1, top5=top5)) + +def adjust_learning_rate(initial_lr, optimizer, epoch): + """Sets the learning rate to the initial LR decayed by 10 every 30 epochs""" + lr = initial_lr * (0.1 ** (epoch // 30)) + for param_group in optimizer.param_groups: + param_group['lr'] = lr + + + +###################################################################### +# Validation Function +# ~~~~~~~~~~~~~~~~~~~ +# +# To track generalization performance and simplify the main loop further +# we can also extract the validation step into a function 
called +# ``validate``. This function runs a full validation step of the input +# model on the input validation dataloader and returns the top-1 accuracy +# of the model on the validation set. Again, you will notice the only +# distributed training feature here is setting ``non_blocking=True`` for +# the training data and labels before they are passed to the model. +# + +def validate(val_loader, model, criterion): + + batch_time = AverageMeter() + losses = AverageMeter() + top1 = AverageMeter() + top5 = AverageMeter() + + # switch to evaluate mode + model.eval() + + with torch.no_grad(): + end = time.time() + for i, (input, target) in enumerate(val_loader): + + input = input.cuda(non_blocking=True) + target = target.cuda(non_blocking=True) + + # compute output + output = model(input) + loss = criterion(output, target) + + # measure accuracy and record loss + prec1, prec5 = accuracy(output, target, topk=(1, 5)) + losses.update(loss.item(), input.size(0)) + top1.update(prec1[0], input.size(0)) + top5.update(prec5[0], input.size(0)) + + # measure elapsed time + batch_time.update(time.time() - end) + end = time.time() + + if i % 100 == 0: + print('Test: [{0}/{1}]\t' + 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' + 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' + 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\t' + 'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format( + i, len(val_loader), batch_time=batch_time, loss=losses, + top1=top1, top5=top5)) + + print(' * Prec@1 {top1.avg:.3f} Prec@5 {top5.avg:.3f}' + .format(top1=top1, top5=top5)) + + return top1.avg + + +###################################################################### +# Inputs +# ~~~~~~ +# +# With the helper functions out of the way, now we have reached the +# interesting part. Here is where we will define the inputs for the run. +# Some of the inputs are standard model training inputs such as batch size +# and number of training epochs, and some are specific to our distributed +# training task. 
The required inputs are: +# +# - **batch\_size** - batch size for *each* process in the distributed +# training group. Total batch size across the distributed model is +# batch\_size\*world\_size +# +# - **workers** - number of worker processes used with the dataloaders in +# each process +# +# - **num\_epochs** - total number of epochs to train for +# +# - **starting\_lr** - starting learning rate for training +# +# - **world\_size** - number of processes in the distributed training +# environment +# +# - **dist\_backend** - backend to use for distributed training +# communication (e.g. NCCL, Gloo, MPI). In this tutorial, since +# we are using several multi-gpu nodes, NCCL is suggested. +# +# - **dist\_url** - URL to specify the initialization method of the +# process group. This may contain the IP address and port of the rank0 +# process or be a non-existent file on a shared file system. Here, +# since we do not have a shared file system, this will incorporate the +# **node0-privateIP** and the port on node0 to use. +# + +print("Collect Inputs...") + +# Batch Size for training and testing +batch_size = 32 + +# Number of additional worker processes for dataloading +workers = 2 + +# Number of epochs to train for +num_epochs = 2 + +# Starting Learning Rate +starting_lr = 0.1 + +# Number of distributed processes +world_size = 4 + +# Distributed backend type +dist_backend = 'nccl' + +# URL used to set up distributed training +dist_url = "tcp://172.31.22.234:23456" + + +###################################################################### +# Initialize process group +# ~~~~~~~~~~~~~~~~~~~~~~~~ +# +# One of the most important parts of distributed training in PyTorch is to +# properly set up the process group, which is the **first** step in +# initializing the ``torch.distributed`` package. To do this, we will use +# the ``torch.distributed.init_process_group`` function which takes +# several inputs. First, a *backend* input which specifies the backend to +# use (i.e. 
NCCL, Gloo, MPI, etc.). An *init\_method* input which is +# either a URL containing the address and port of the rank0 machine or a +# path to a non-existent file on the shared file system. Note, to use the +# file init\_method, all machines must have access to the file; similarly, +# for the URL method, all machines must be able to communicate on the +# network, so make sure to configure any firewalls and network settings to +# accommodate. The *init\_process\_group* function also takes *rank* and +# *world\_size* arguments which specify the rank of this process when run +# and the number of processes in the collective, respectively. +# The *init\_method* input can also be "env://". In this case, the address +# and port of the rank0 machine will be read from the following two +# environment variables respectively: MASTER_ADDR, MASTER_PORT. If the *rank* +# and *world\_size* arguments are not specified in the *init\_process\_group* +# function, they can both be read from the following two environment +# variables respectively as well: RANK, WORLD_SIZE. +# +# Another important step, especially when each node has multiple GPUs, is +# to set the *local\_rank* of this process. For example, if you have two +# nodes, each with 8 GPUs, and you wish to train with all of them, then +# :math:`world\_size=16` and each node will have a process with local rank +# 0-7. This local\_rank is used to set the device (i.e. which GPU to use) +# for the process and later used to set the device when creating a +# distributed data parallel model. It is also recommended to use the NCCL +# backend in this hypothetical environment as NCCL is preferred for +# multi-gpu nodes. 
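To make the ``env://`` variant and the local-rank arithmetic concrete, here is a small self-contained sketch. The helper ``read_dist_env`` is hypothetical (it is not part of ``torch.distributed``); it only mirrors the variables that ``init_process_group(init_method="env://")`` would read, and the 8-GPUs-per-node figure matches the two-node example above:

```python
def read_dist_env(env):
    # Hypothetical helper mirroring what init_process_group(init_method="env://")
    # reads: MASTER_ADDR/MASTER_PORT locate the rank0 process, while
    # RANK/WORLD_SIZE identify this process within the collective.
    rank = int(env["RANK"])
    world_size = int(env["WORLD_SIZE"])
    master = (env["MASTER_ADDR"], int(env["MASTER_PORT"]))
    return rank, world_size, master

def local_rank_of(rank, gpus_per_node=8):
    # With two 8-GPU nodes and world_size=16, ranks 0-7 live on node0 and
    # ranks 8-15 on node1; each maps to GPU (rank % gpus_per_node).
    return rank % gpus_per_node

# Example environment as a plain dict (in a real launch these would be
# exported shell variables on each node):
env = {"RANK": "11", "WORLD_SIZE": "16",
       "MASTER_ADDR": "172.31.22.234", "MASTER_PORT": "23456"}
rank, world_size, master = read_dist_env(env)
print(rank, world_size, local_rank_of(rank))

# The real call, once the variables are set, would simply be:
# dist.init_process_group(backend="nccl", init_method="env://")
```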
+# + +print("Initialize Process Group...") +# Initialize Process Group +# v1 - init with url +dist.init_process_group(backend=dist_backend, init_method=dist_url, rank=int(sys.argv[1]), world_size=world_size) +# v2 - init with file +# dist.init_process_group(backend="nccl", init_method="file:///home/ubuntu/pt-distributed-tutorial/trainfile", rank=int(sys.argv[1]), world_size=world_size) +# v3 - init with environment variables +# dist.init_process_group(backend="nccl", init_method="env://", rank=int(sys.argv[1]), world_size=world_size) + + +# Establish Local Rank and set device on this node +local_rank = int(sys.argv[2]) +dp_device_ids = [local_rank] +torch.cuda.set_device(local_rank) + + +###################################################################### +# Initialize Model +# ~~~~~~~~~~~~~~~~ +# +# The next major step is to initialize the model to be trained. Here, we +# will use a resnet18 model from ``torchvision.models`` but any model may +# be used. First, we initialize the model and place it in GPU memory. +# Next, we make the model ``DistributedDataParallel``, which handles the +# distribution of the data to and from the model and is critical for +# distributed training. The ``DistributedDataParallel`` module also +# handles the averaging of gradients across the world, so we do not have +# to explicitly average the gradients in the training step. +# +# It is important to note that this is a blocking function, meaning +# program execution will wait at this function until *world\_size* +# processes have joined the process group. Also, notice we pass our device +# ids list as a parameter which contains the local rank (i.e. GPU) we are +# using. Finally, we specify the loss function and optimizer to train the +# model with. 
+# + +print("Initialize Model...") +# Construct Model +model = models.resnet18(pretrained=False).cuda() +# Make model DistributedDataParallel +model = torch.nn.parallel.DistributedDataParallel(model, device_ids=dp_device_ids, output_device=local_rank) + +# define loss function (criterion) and optimizer +criterion = nn.CrossEntropyLoss().cuda() +optimizer = torch.optim.SGD(model.parameters(), starting_lr, momentum=0.9, weight_decay=1e-4) + + +###################################################################### +# Initialize Dataloaders +# ~~~~~~~~~~~~~~~~~~~~~~ +# +# The last step in preparation for the training is to specify which +# dataset to use. Here we use the `STL-10 +# dataset `__ from +# `torchvision.datasets.STL10 `__. +# The STL10 dataset is a 10 class dataset of 96x96px color images. For use +# with our model, we resize the images to 224x224px in the transform. One +# distributed training specific item in this section is the use of the +# ``DistributedSampler`` for the training set, which is designed to be +# used in conjunction with ``DistributedDataParallel`` models. This object +# handles the partitioning of the dataset across the distributed +# environment so that not all models are training on the same subset of +# data, which would be counterproductive. Finally, we create the +# ``DataLoader``'s which are responsible for feeding the data to the +# processes. +# +# The STL-10 dataset will automatically download on the nodes if they are +# not present. If you wish to use your own dataset you should download the +# data, write your own dataset handler, and construct a dataloader for +# your dataset here. +# + +print("Initialize Dataloaders...") +# Define the transform for the data. Notice, we must resize to 224x224 with this dataset and model. +transform = transforms.Compose( + [transforms.Resize(224), + transforms.ToTensor(), + transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]) + +# Initialize Datasets. 
STL10 will automatically download if not present +trainset = datasets.STL10(root='./data', split='train', download=True, transform=transform) +valset = datasets.STL10(root='./data', split='test', download=True, transform=transform) + +# Create DistributedSampler to handle distributing the dataset across nodes when training +# This can only be called after torch.distributed.init_process_group is called +train_sampler = torch.utils.data.distributed.DistributedSampler(trainset) + +# Create the Dataloaders to feed data to the training and validation steps +train_loader = torch.utils.data.DataLoader(trainset, batch_size=batch_size, shuffle=(train_sampler is None), num_workers=workers, pin_memory=False, sampler=train_sampler) +val_loader = torch.utils.data.DataLoader(valset, batch_size=batch_size, shuffle=False, num_workers=workers, pin_memory=False) + + +###################################################################### +# Training Loop +# ~~~~~~~~~~~~~ +# +# The last step is to define the training loop. We have already done most +# of the work for setting up the distributed training so this is not +# distributed training specific. The only detail is setting the current +# epoch count in the ``DistributedSampler``, as the sampler shuffles the +# data going to each process deterministically based on epoch. After +# updating the sampler, the loop runs a full training epoch, runs a full +# validation step then prints the performance of the current model against +# the best performing model so far. After training for num\_epochs, the +# loop exits and the tutorial is complete. Notice, since this is an +# exercise we are not saving models but one may wish to keep track of the +# best performing model then save it at the end of training (see +# `here `__). 
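The epoch-seeded shuffling and per-rank partitioning that ``set_epoch`` controls can be approximated in plain Python. This is a rough sketch of the behavior, not ``DistributedSampler``'s actual implementation:

```python
import random

def shard_indices(num_samples, rank, world_size, epoch):
    # Every rank shuffles with the same epoch-derived seed, so all ranks
    # agree on the order; each rank then takes a disjoint round-robin slice.
    indices = list(range(num_samples))
    random.Random(epoch).shuffle(indices)
    return indices[rank::world_size]

shards = [shard_indices(8, r, 4, epoch=0) for r in range(4)]
# Together the 4 ranks cover every sample exactly once per epoch.
assert sorted(i for s in shards for i in s) == list(range(8))
assert all(len(s) == 2 for s in shards)
```

Calling ``set_epoch`` each iteration of the outer loop changes the seed, so every epoch produces a fresh shuffle while the shards stay disjoint.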
+# + +best_prec1 = 0 + +for epoch in range(num_epochs): + # Set epoch count for DistributedSampler + train_sampler.set_epoch(epoch) + + # Adjust learning rate according to schedule + adjust_learning_rate(starting_lr, optimizer, epoch) + + # train for one epoch + print("\nBegin Training Epoch {}".format(epoch+1)) + train(train_loader, model, criterion, optimizer, epoch) + + # evaluate on validation set + print("Begin Validation @ Epoch {}".format(epoch+1)) + prec1 = validate(val_loader, model, criterion) + + # remember best prec@1 and save checkpoint if desired + # is_best = prec1 > best_prec1 + best_prec1 = max(prec1, best_prec1) + + print("Epoch Summary: ") + print("\tEpoch Accuracy: {}".format(prec1)) + print("\tBest Accuracy: {}".format(best_prec1)) + + +###################################################################### +# Running the Code +# ---------------- +# +# Unlike most of the other PyTorch tutorials, this code may not be run +# directly out of this notebook. To run, download the .py version of this +# file (or convert it using +# `this `__) +# and upload a copy to both nodes. The astute reader would have noticed +# that we hardcoded the **node0-privateIP** and :math:`world\_size=4` but +# input the *rank* and *local\_rank* inputs as arg[1] and arg[2] command +# line arguments, respectively. Once uploaded, open two ssh terminals into +# each node. +# +# - On the first terminal for node0, run ``$ python main.py 0 0`` +# +# - On the second terminal for node0 run ``$ python main.py 1 1`` +# +# - On the first terminal for node1, run ``$ python main.py 2 0`` +# +# - On the second terminal for node1 run ``$ python main.py 3 1`` +# +# The programs will start and wait after printing "Initialize Model..." +# for all four processes to join the process group. Notice the first +# argument is not repeated as this is the unique global rank of the +# process. The second argument is repeated as that is the local rank of +# the process running on the node. 
If you run ``nvidia-smi`` on each node, +# you will see two processes on each node, one running on GPU0 and one on +# GPU1. +# +# We have now completed the distributed training example! Hopefully you +# can see how you would use this tutorial to help train your own models on +# your own datasets, even if you are not using the exact same distributed +# environment. If you are using AWS, don't forget to **SHUT DOWN YOUR +# NODES** if you are not using them, or you may find an uncomfortably large +# bill at the end of the month. +# +# **Where to go next** +# +# - Check out the `launcher +# utility `__ +# for a different way of kicking off the run +# +# - Check out the `torch.multiprocessing.spawn +# utility `__ +# for another easy way of kicking off multiple distributed processes. +# `PyTorch ImageNet Example `__ +# has it implemented and can demonstrate how to use it. +# +# - If possible, set up an NFS so you only need one copy of the dataset +# diff --git a/recipes_source/recipes/cpp_export.rst b/recipes_source/recipes/cpp_export.rst new file mode 100644 index 00000000000..1a78548d967 --- /dev/null +++ b/recipes_source/recipes/cpp_export.rst @@ -0,0 +1,387 @@ +Loading a TorchScript Model in C++ +===================================== + +As its name suggests, the primary interface to PyTorch is the Python +programming language. While Python is a suitable and preferred language for +many scenarios requiring dynamism and ease of iteration, there are equally many +situations where precisely these properties of Python are unfavorable. One +environment in which the latter often applies is *production* -- the land of +low latencies and strict deployment requirements. For production scenarios, C++ +is very often the language of choice, even if only to bind it into another +language like Java, Rust or Go.
The following paragraphs will outline the path +PyTorch provides to go from an existing Python model to a serialized +representation that can be *loaded* and *executed* purely from C++, with no +dependency on Python. + +Step 1: Converting Your PyTorch Model to Torch Script +----------------------------------------------------- + +A PyTorch model's journey from Python to C++ is enabled by `Torch Script +`_, a representation of a PyTorch +model that can be understood, compiled and serialized by the Torch Script +compiler. If you are starting out from an existing PyTorch model written in the +vanilla "eager" API, you must first convert your model to Torch Script. In the +most common cases, discussed below, this requires only little effort. If you +already have a Torch Script module, you can skip to the next section of this +tutorial. + +There exist two ways of converting a PyTorch model to Torch Script. The first +is known as *tracing*, a mechanism in which the structure of the model is +captured by evaluating it once using example inputs, and recording the flow of +those inputs through the model. This is suitable for models that make limited +use of control flow. The second approach is to add explicit annotations to your +model that inform the Torch Script compiler that it may directly parse and +compile your model code, subject to the constraints imposed by the Torch Script +language. + +.. tip:: + + You can find the complete documentation for both of these methods, as well as + further guidance on which to use, in the official `Torch Script + reference `_. + +Converting to Torch Script via Tracing +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +To convert a PyTorch model to Torch Script via tracing, you must pass an +instance of your model along with an example input to the ``torch.jit.trace`` +function. 
This will produce a ``torch.jit.ScriptModule`` object with the trace +of your model evaluation embedded in the module's ``forward`` method:: + + import torch + import torchvision + + # An instance of your model. + model = torchvision.models.resnet18() + + # An example input you would normally provide to your model's forward() method. + example = torch.rand(1, 3, 224, 224) + + # Use torch.jit.trace to generate a torch.jit.ScriptModule via tracing. + traced_script_module = torch.jit.trace(model, example) + +The traced ``ScriptModule`` can now be evaluated identically to a regular +PyTorch module:: + + In[1]: output = traced_script_module(torch.ones(1, 3, 224, 224)) + In[2]: output[0, :5] + Out[2]: tensor([-0.2698, -0.0381, 0.4023, -0.3010, -0.0448], grad_fn=<SliceBackward>) + +Converting to Torch Script via Annotation +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Under certain circumstances, such as if your model employs particular forms of +control flow, you may want to write your model in Torch Script directly and +annotate your model accordingly. For example, say you have the following +vanilla PyTorch model:: + + import torch + + class MyModule(torch.nn.Module): + def __init__(self, N, M): + super(MyModule, self).__init__() + self.weight = torch.nn.Parameter(torch.rand(N, M)) + + def forward(self, input): + if input.sum() > 0: + output = self.weight.mv(input) + else: + output = self.weight + input + return output + + +Because the ``forward`` method of this module uses control flow that is +dependent on the input, it is not suitable for tracing. Instead, we can convert +it to a ``ScriptModule``.
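To make the tracing limitation concrete, here is a tiny PyTorch-free analogy of what a trace records. This is purely illustrative; ``torch.jit.trace`` is of course far more sophisticated than this toy ``toy_trace`` helper:

```python
def forward(x):
    # Data-dependent control flow, like MyModule.forward above.
    return x * 2 if x > 0 else x - 1

def toy_trace(fn, example):
    # A trace evaluates the function once on the example input and
    # freezes whichever branch that input happened to take.
    if example > 0:
        return lambda x: x * 2   # only this branch gets recorded
    return lambda x: x - 1

traced = toy_trace(forward, example=1.0)  # traced with a positive input
assert forward(-3.0) == -4.0              # eager code takes the else-branch
assert traced(-3.0) == -6.0               # the trace replays the frozen branch
```

Because the recorded "program" has no branch left in it, the traced function silently gives the wrong answer for inputs that would have taken the other path, which is exactly why annotation via ``torch.jit.script`` exists.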
+In order to convert the module to the ``ScriptModule``, one needs to +compile the module with ``torch.jit.script`` as follows:: + + class MyModule(torch.nn.Module): + def __init__(self, N, M): + super(MyModule, self).__init__() + self.weight = torch.nn.Parameter(torch.rand(N, M)) + + def forward(self, input): + if input.sum() > 0: + output = self.weight.mv(input) + else: + output = self.weight + input + return output + + my_module = MyModule(10,20) + sm = torch.jit.script(my_module) + +If you need to exclude some methods in your ``nn.Module`` +because they use Python features that TorchScript doesn't support yet, +you can annotate those with ``@torch.jit.ignore``. + +``my_module`` is an instance of +``ScriptModule`` that is ready for serialization. + +Step 2: Serializing Your Script Module to a File +------------------------------------------------- + +Once you have a ``ScriptModule`` in your hands, either from tracing or +annotating a PyTorch model, you are ready to serialize it to a file. Later on, +you'll be able to load the module from this file in C++ and execute it without +any dependency on Python. Say we want to serialize the ``ResNet18`` model shown +earlier in the tracing example. To perform this serialization, simply call +`save `_ +on the module and pass it a filename:: + + traced_script_module.save("traced_resnet_model.pt") + +This will produce a ``traced_resnet_model.pt`` file in your working directory. +If you also would like to serialize ``my_module``, call ``my_module.save("my_module_model.pt")``. +We have now officially left the realm of Python and are ready to cross over to the sphere +of C++. + +Step 3: Loading Your Script Module in C++ +------------------------------------------ + +To load your serialized PyTorch model in C++, your application must depend on +the PyTorch C++ API -- also known as *LibTorch*. The LibTorch distribution +encompasses a collection of shared libraries, header files and CMake build +configuration files.
While CMake is not a requirement for depending on +LibTorch, it is the recommended approach and will be well supported into the +future. For this tutorial, we will be building a minimal C++ application using +CMake and LibTorch that simply loads and executes a serialized PyTorch model. + +A Minimal C++ Application +^^^^^^^^^^^^^^^^^^^^^^^^^ + +Let's begin by discussing the code to load a module. The following will already +do: + +.. code-block:: cpp + + #include <torch/script.h> // One-stop header. + + #include <iostream> + #include <memory> + + int main(int argc, const char* argv[]) { + if (argc != 2) { + std::cerr << "usage: example-app <path-to-exported-script-module>\n"; + return -1; + } + + + torch::jit::script::Module module; + try { + // Deserialize the ScriptModule from a file using torch::jit::load(). + module = torch::jit::load(argv[1]); + } + catch (const c10::Error& e) { + std::cerr << "error loading the model\n"; + return -1; + } + + std::cout << "ok\n"; + } + + +The ``<torch/script.h>`` header encompasses all relevant includes from the +LibTorch library necessary to run the example. Our application accepts the file +path to a serialized PyTorch ``ScriptModule`` as its only command line argument +and then proceeds to deserialize the module using the ``torch::jit::load()`` +function, which takes this file path as input. In return we receive a ``torch::jit::script::Module`` +object. We will examine how to execute it in a moment. + +Depending on LibTorch and Building the Application +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Assume we stored the above code into a file called ``example-app.cpp``. A +minimal ``CMakeLists.txt`` to build it could look as simple as: + +.. 
code-block:: cmake + + cmake_minimum_required(VERSION 3.0 FATAL_ERROR) + project(custom_ops) + + find_package(Torch REQUIRED) + + add_executable(example-app example-app.cpp) + target_link_libraries(example-app "${TORCH_LIBRARIES}") + set_property(TARGET example-app PROPERTY CXX_STANDARD 14) + +The last thing we need to build the example application is the LibTorch +distribution. You can always grab the latest stable release from the `download +page `_ on the PyTorch website. If you download and unzip +the latest archive, you should receive a folder with the following directory +structure: + +.. code-block:: sh + + libtorch/ + bin/ + include/ + lib/ + share/ + +- The ``lib/`` folder contains the shared libraries you must link against, +- The ``include/`` folder contains header files your program will need to include, +- The ``share/`` folder contains the necessary CMake configuration to enable the simple ``find_package(Torch)`` command above. + +.. tip:: + On Windows, debug and release builds are not ABI-compatible. If you plan to + build your project in debug mode, please try the debug version of LibTorch. + Also, make sure you specify the correct configuration in the ``cmake --build .`` + line below. + +The last step is building the application. For this, assume our example +directory is laid out like this: + +.. code-block:: sh + + example-app/ + CMakeLists.txt + example-app.cpp + +We can now run the following commands to build the application from within the +``example-app/`` folder: + +.. code-block:: sh + + mkdir build + cd build + cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch .. + cmake --build . --config Release + +where ``/path/to/libtorch`` should be the full path to the unzipped LibTorch +distribution. If all goes well, it will look something like this: + +.. code-block:: sh + + root@4b5a67132e81:/example-app# mkdir build + root@4b5a67132e81:/example-app# cd build + root@4b5a67132e81:/example-app/build# cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch .. 
+ -- The C compiler identification is GNU 5.4.0 + -- The CXX compiler identification is GNU 5.4.0 + -- Check for working C compiler: /usr/bin/cc + -- Check for working C compiler: /usr/bin/cc -- works + -- Detecting C compiler ABI info + -- Detecting C compiler ABI info - done + -- Detecting C compile features + -- Detecting C compile features - done + -- Check for working CXX compiler: /usr/bin/c++ + -- Check for working CXX compiler: /usr/bin/c++ -- works + -- Detecting CXX compiler ABI info + -- Detecting CXX compiler ABI info - done + -- Detecting CXX compile features + -- Detecting CXX compile features - done + -- Looking for pthread.h + -- Looking for pthread.h - found + -- Looking for pthread_create + -- Looking for pthread_create - not found + -- Looking for pthread_create in pthreads + -- Looking for pthread_create in pthreads - not found + -- Looking for pthread_create in pthread + -- Looking for pthread_create in pthread - found + -- Found Threads: TRUE + -- Configuring done + -- Generating done + -- Build files have been written to: /example-app/build + root@4b5a67132e81:/example-app/build# make + Scanning dependencies of target example-app + [ 50%] Building CXX object CMakeFiles/example-app.dir/example-app.cpp.o + [100%] Linking CXX executable example-app + [100%] Built target example-app + +If we supply the path to the traced ``ResNet18`` model ``traced_resnet_model.pt`` we created earlier +to the resulting ``example-app`` binary, we should be rewarded with a friendly +"ok". Please note, if you try to run this example with ``my_module_model.pt`` you will get an error saying that +your input is of an incompatible shape. ``my_module_model.pt`` expects a 1D input instead of a 4D one. + +.. 
code-block:: sh + + root@4b5a67132e81:/example-app/build# ./example-app /traced_resnet_model.pt + ok + +Step 4: Executing the Script Module in C++ +------------------------------------------ + +Having successfully loaded our serialized ``ResNet18`` in C++, we are now just a +couple lines of code away from executing it! Let's add those lines to our C++ +application's ``main()`` function: + +.. code-block:: cpp + + // Create a vector of inputs. + std::vector<torch::jit::IValue> inputs; + inputs.push_back(torch::ones({1, 3, 224, 224})); + + // Execute the model and turn its output into a tensor. + at::Tensor output = module.forward(inputs).toTensor(); + std::cout << output.slice(/*dim=*/1, /*start=*/0, /*end=*/5) << '\n'; + +The first two lines set up the inputs to our model. We create a vector of +``torch::jit::IValue`` (a type-erased value type ``script::Module`` methods +accept and return) and add a single input. To create the input tensor, we use +``torch::ones()``, the equivalent to ``torch.ones`` in the C++ API. We then +run the ``script::Module``'s ``forward`` method, passing it the input vector we +created. In return we get a new ``IValue``, which we convert to a tensor by +calling ``toTensor()``. + +.. tip:: + + To learn more about functions like ``torch::ones`` and the PyTorch C++ API in + general, refer to its documentation at https://pytorch.org/cppdocs. The + PyTorch C++ API provides near feature parity with the Python API, allowing + you to further manipulate and process tensors just like in Python. + +In the last line, we print the first five entries of the output. Since we +supplied the same input to our model in Python earlier in this tutorial, we +should ideally see the same output. Let's try it out by re-compiling our +application and running it with the same serialized model: + +.. 
code-block:: sh + + root@4b5a67132e81:/example-app/build# make + Scanning dependencies of target example-app + [ 50%] Building CXX object CMakeFiles/example-app.dir/example-app.cpp.o + [100%] Linking CXX executable example-app + [100%] Built target example-app + root@4b5a67132e81:/example-app/build# ./example-app traced_resnet_model.pt + -0.2698 -0.0381 0.4023 -0.3010 -0.0448 + [ Variable[CPUFloatType]{1,5} ] + + +For reference, the output in Python previously was:: + + tensor([-0.2698, -0.0381, 0.4023, -0.3010, -0.0448], grad_fn=<SliceBackward>) + +Looks like a good match! + +.. tip:: + + To move your model to GPU memory, you can write ``model.to(at::kCUDA);``. + Make sure the inputs to a model are also living in CUDA memory + by calling ``tensor.to(at::kCUDA)``, which will return a new tensor in CUDA + memory. + +Step 5: Getting Help and Exploring the API +------------------------------------------ + +This tutorial has hopefully equipped you with a general understanding of a +PyTorch model's path from Python to C++. With the concepts described in this +tutorial, you should be able to go from a vanilla, "eager" PyTorch model, to a +compiled ``ScriptModule`` in Python, to a serialized file on disk and -- to +close the loop -- to an executable ``script::Module`` in C++. + +Of course, there are many concepts we did not cover. For example, you may find +yourself wanting to extend your ``ScriptModule`` with a custom operator +implemented in C++ or CUDA, and executing this custom operator inside your +``ScriptModule`` loaded in your pure C++ production environment. The good news +is: this is possible, and well supported! For now, you can explore `this +`_ folder +for examples, and we will follow up with a tutorial shortly.
In the meantime, +the following links may be generally helpful: + +- The Torch Script reference: https://pytorch.org/docs/master/jit.html +- The PyTorch C++ API documentation: https://pytorch.org/cppdocs/ +- The PyTorch Python API documentation: https://pytorch.org/docs/ + +As always, if you run into any problems or have questions, you can use our +`forum `_ or `GitHub issues +`_ to get in touch. diff --git a/recipes_source/recipes/cpp_extension.rst b/recipes_source/recipes/cpp_extension.rst new file mode 100644 index 00000000000..56b02dd1818 --- /dev/null +++ b/recipes_source/recipes/cpp_extension.rst @@ -0,0 +1,1184 @@ +Custom C++ and CUDA Extensions +============================== +**Author**: `Peter Goldsborough `_ + + +PyTorch provides a plethora of operations related to neural networks, arbitrary +tensor algebra, data wrangling and other purposes. However, you may still find +yourself in need of a more customized operation. For example, you might want to +use a novel activation function you found in a paper, or implement an operation +you developed as part of your research. + +The easiest way of integrating such a custom operation in PyTorch is to write it +in Python by extending :class:`Function` and :class:`Module` as outlined `here +`_. This gives you the full +power of automatic differentiation (spares you from writing derivative +functions) as well as the usual expressiveness of Python. However, there may be +times when your operation is better implemented in C++. For example, your code +may need to be *really* fast because it is called very frequently in your model +or is very expensive even for a few calls. Another plausible reason is that it +depends on or interacts with other C or C++ libraries. To address such cases, +PyTorch provides a very easy way of writing custom *C++ extensions*. + +C++ extensions are a mechanism we have developed to allow users (you) to create +PyTorch operators defined *out-of-source*, i.e.
separate from the PyTorch +backend. This approach is *different* from the way native PyTorch operations are +implemented. C++ extensions are intended to spare you much of the boilerplate +associated with integrating an operation with PyTorch's backend while providing +you with a high degree of flexibility for your PyTorch-based projects. +Nevertheless, once you have defined your operation as a C++ extension, turning +it into a native PyTorch function is largely a matter of code organization, +which you can tackle after the fact if you decide to contribute your operation +upstream. + +Motivation and Example +---------------------- + +The rest of this note will walk through a practical example of writing and using +a C++ (and CUDA) extension. If you are being chased or someone will fire you if +you don't get that op done by the end of the day, you can skip this section and +head straight to the implementation details in the next section. + +Let's say you've come up with a new kind of recurrent unit that you found to +have superior properties compared to the state of the art. This recurrent unit +is similar to an LSTM, but differs in that it lacks a *forget gate* and uses an +*Exponential Linear Unit* (ELU) as its internal activation function. Because +this unit never forgets, we'll call it *LLTM*, or *Long-Long-Term-Memory* unit. + +The two ways in which LLTMs differ from vanilla LSTMs are significant enough +that we can't configure PyTorch's ``LSTMCell`` for our purposes, so we'll have to +create a custom cell. The first and easiest approach for this -- and likely in +all cases a good first step -- is to implement our desired functionality in +plain PyTorch with Python. For this, we need to subclass +:class:`torch.nn.Module` and implement the forward pass of the LLTM. 
This would +look something like this:: + + class LLTM(torch.nn.Module): + def __init__(self, input_features, state_size): + super(LLTM, self).__init__() + self.input_features = input_features + self.state_size = state_size + # 3 * state_size for input gate, output gate and candidate cell gate. + # input_features + state_size because we will multiply with [input, h]. + self.weights = torch.nn.Parameter( + torch.empty(3 * state_size, input_features + state_size)) + self.bias = torch.nn.Parameter(torch.empty(3 * state_size)) + self.reset_parameters() + + def reset_parameters(self): + stdv = 1.0 / math.sqrt(self.state_size) + for weight in self.parameters(): + weight.data.uniform_(-stdv, +stdv) + + def forward(self, input, state): + old_h, old_cell = state + X = torch.cat([old_h, input], dim=1) + + # Compute the input, output and candidate cell gates with one MM. + gate_weights = F.linear(X, self.weights, self.bias) + # Split the combined gate weight matrix into its components. + gates = gate_weights.chunk(3, dim=1) + + input_gate = torch.sigmoid(gates[0]) + output_gate = torch.sigmoid(gates[1]) + # Here we use an ELU instead of the usual tanh. + candidate_cell = F.elu(gates[2]) + + # Compute the new cell state. + new_cell = old_cell + candidate_cell * input_gate + # Compute the new hidden state and output. + new_h = torch.tanh(new_cell) * output_gate + + return new_h, new_cell + +which we could then use as expected:: + + import torch + + X = torch.randn(batch_size, input_features) + h = torch.randn(batch_size, state_size) + C = torch.randn(batch_size, state_size) + + rnn = LLTM(input_features, state_size) + + new_h, new_C = rnn(X, (h, C)) + +Naturally, if at all possible and plausible, you should use this approach to +extend PyTorch. Since PyTorch has highly optimized implementations of its +operations for CPU *and* GPU, powered by libraries such as `NVIDIA cuDNN +`_, `Intel MKL +`_ or `NNPACK +`_, PyTorch code like above will often be +fast enough. 
However, we can also see why, under certain circumstances, there is +room for further performance improvements. The most obvious reason is that +PyTorch has no knowledge of the *algorithm* you are implementing. It knows only +of the individual operations you use to compose your algorithm. As such, PyTorch +must execute your operations individually, one after the other. Since each +individual call to the implementation (or *kernel*) of an operation, which may +involve the launch of a CUDA kernel, has a certain amount of overhead, this overhead +may become significant across many function calls. Furthermore, the Python +interpreter that is running our code can itself slow down our program. + +A definite method of speeding things up is therefore to rewrite parts in C++ (or +CUDA) and *fuse* particular groups of operations. Fusing means combining the +implementations of many functions into a single function, which profits from +fewer kernel launches as well as other optimizations we can perform with +increased visibility of the global flow of data. + +Let's see how we can use C++ extensions to implement a *fused* version of the +LLTM. We'll begin by writing it in plain C++, using the `ATen +`_ library that powers much of PyTorch's +backend, and see how easily it lets us translate our Python code. We'll then +speed things up even more by moving parts of the model to CUDA kernels to benefit +from the massive parallelism GPUs provide. + +Writing a C++ Extension +----------------------- + +C++ extensions come in two flavors: They can be built "ahead of time" with +:mod:`setuptools`, or "just in time" via +:func:`torch.utils.cpp_extension.load`. We'll begin with the first approach and +discuss the latter later. + +Building with :mod:`setuptools` +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +For the "ahead of time" flavor, we build our C++ extension by writing a +``setup.py`` script that uses setuptools to compile our C++ code.
For the LLTM, it +looks as simple as this:: + + from setuptools import setup, Extension + from torch.utils import cpp_extension + + setup(name='lltm_cpp', + ext_modules=[cpp_extension.CppExtension('lltm_cpp', ['lltm.cpp'])], + cmdclass={'build_ext': cpp_extension.BuildExtension}) + +In this code, :class:`CppExtension` is a convenience wrapper around +:class:`setuptools.Extension` that passes the correct include paths and sets +the language of the extension to C++. The equivalent vanilla :mod:`setuptools` +code would simply be:: + + Extension( + name='lltm_cpp', + sources=['lltm.cpp'], + include_dirs=cpp_extension.include_paths(), + language='c++') + +:class:`BuildExtension` performs a number of required configuration steps and +checks and also manages mixed compilation in the case of mixed C++/CUDA +extensions. And that's all we really need to know about building C++ extensions +for now! Let's now take a look at the implementation of our C++ extension, +which goes into ``lltm.cpp``. + +Writing the C++ Op +^^^^^^^^^^^^^^^^^^ + +Let's start implementing the LLTM in C++! One function we'll need for the +backward pass is the derivative of the sigmoid. This is a small enough piece of +code to discuss the overall environment that is available to us when writing C++ +extensions: + +.. code-block:: cpp + + #include <torch/extension.h> + + #include <iostream> + + torch::Tensor d_sigmoid(torch::Tensor z) { + auto s = torch::sigmoid(z); + return (1 - s) * s; + } + +``<torch/extension.h>`` is the one-stop header to include all the necessary PyTorch +bits to write C++ extensions. It includes: + +- The ATen library, which is our primary API for tensor computation, +- `pybind11 `_, which is how we create Python bindings for our C++ code, +- Headers that manage the details of interaction between ATen and pybind11. + +The implementation of :func:`d_sigmoid` shows how to use the ATen API.
PyTorch's tensor and variable interface is generated automatically from the
ATen library, so we can more or less translate our Python implementation 1:1
into C++. Our primary datatype for all computations will be
:class:`torch::Tensor`. Its full API can be inspected in the PyTorch C++ API
documentation. Notice also that we can include ``<iostream>`` or *any other C
or C++ header* -- we have the full power of C++11 at our disposal.

Forward Pass
************

Next we can port our entire forward pass to C++:

.. code-block:: cpp

  #include <vector>

  std::vector<torch::Tensor> lltm_forward(
      torch::Tensor input,
      torch::Tensor weights,
      torch::Tensor bias,
      torch::Tensor old_h,
      torch::Tensor old_cell) {
    auto X = torch::cat({old_h, input}, /*dim=*/1);

    auto gate_weights = torch::addmm(bias, X, weights.transpose(0, 1));
    auto gates = gate_weights.chunk(3, /*dim=*/1);

    auto input_gate = torch::sigmoid(gates[0]);
    auto output_gate = torch::sigmoid(gates[1]);
    auto candidate_cell = torch::elu(gates[2], /*alpha=*/1.0);

    auto new_cell = old_cell + candidate_cell * input_gate;
    auto new_h = torch::tanh(new_cell) * output_gate;

    return {new_h,
            new_cell,
            input_gate,
            output_gate,
            candidate_cell,
            X,
            gate_weights};
  }

Backward Pass
*************

The C++ extension API currently does not provide a way of automatically
generating a backward function for us. As such, we also have to implement the
backward pass of our LLTM, which computes the derivative of the loss with
respect to each input of the forward pass. Ultimately, we will plop both the
forward and backward functions into a :class:`torch.autograd.Function` to
create a nice Python binding. The backward function is slightly more involved,
so we'll not dig deeper into the code (if you are interested, Alex Graves'
thesis is a good read for more information on this):

.. code-block:: cpp

  // tanh'(z) = 1 - tanh^2(z)
  torch::Tensor d_tanh(torch::Tensor z) {
    return 1 - z.tanh().pow(2);
  }

  // elu'(z) = relu'(z) + { alpha * exp(z) if (alpha * (exp(z) - 1)) < 0, else 0}
  torch::Tensor d_elu(torch::Tensor z, torch::Scalar alpha = 1.0) {
    auto e = z.exp();
    auto mask = (alpha * (e - 1)) < 0;
    return (z > 0).type_as(z) + mask.type_as(z) * (alpha * e);
  }

  std::vector<torch::Tensor> lltm_backward(
      torch::Tensor grad_h,
      torch::Tensor grad_cell,
      torch::Tensor new_cell,
      torch::Tensor input_gate,
      torch::Tensor output_gate,
      torch::Tensor candidate_cell,
      torch::Tensor X,
      torch::Tensor gate_weights,
      torch::Tensor weights) {
    auto d_output_gate = torch::tanh(new_cell) * grad_h;
    auto d_tanh_new_cell = output_gate * grad_h;
    auto d_new_cell = d_tanh(new_cell) * d_tanh_new_cell + grad_cell;

    auto d_old_cell = d_new_cell;
    auto d_candidate_cell = input_gate * d_new_cell;
    auto d_input_gate = candidate_cell * d_new_cell;

    auto gates = gate_weights.chunk(3, /*dim=*/1);
    d_input_gate *= d_sigmoid(gates[0]);
    d_output_gate *= d_sigmoid(gates[1]);
    d_candidate_cell *= d_elu(gates[2]);

    auto d_gates =
        torch::cat({d_input_gate, d_output_gate, d_candidate_cell}, /*dim=*/1);

    auto d_weights = d_gates.t().mm(X);
    auto d_bias = d_gates.sum(/*dim=*/0, /*keepdim=*/true);

    auto d_X = d_gates.mm(weights);
    const auto state_size = grad_h.size(1);
    auto d_old_h = d_X.slice(/*dim=*/1, 0, state_size);
    auto d_input = d_X.slice(/*dim=*/1, state_size);

    return {d_old_h, d_input, d_weights, d_bias, d_old_cell};
  }

Binding to Python
^^^^^^^^^^^^^^^^^

Once you have your operation written in C++ and ATen, you can use pybind11 to
bind your C++ functions or classes into Python in a very simple manner.
Questions or issues you have about this part of PyTorch C++ extensions will
largely be addressed by the `pybind11 documentation
<https://pybind11.readthedocs.io/>`_.

For our extension, the necessary binding code spans only four lines:

.. code-block:: cpp

  PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
    m.def("forward", &lltm_forward, "LLTM forward");
    m.def("backward", &lltm_backward, "LLTM backward");
  }

One bit to note here is the macro ``TORCH_EXTENSION_NAME``. The torch extension
build will define it as the name you give your extension in the ``setup.py``
script. In this case, the value of ``TORCH_EXTENSION_NAME`` would be
"lltm_cpp". This is to avoid having to maintain the name of the extension in
two places (the build script and your C++ code), as a mismatch between the two
can lead to nasty and hard-to-track issues.

Using Your Extension
^^^^^^^^^^^^^^^^^^^^

We are now set to import our extension in PyTorch. At this point, your
directory structure could look something like this::

  pytorch/
    lltm-extension/
      lltm.cpp
      setup.py

Now, run ``python setup.py install`` to build and install your extension. This
should look something like this::

  running install
  running bdist_egg
  running egg_info
  creating lltm_cpp.egg-info
  writing lltm_cpp.egg-info/PKG-INFO
  writing dependency_links to lltm_cpp.egg-info/dependency_links.txt
  writing top-level names to lltm_cpp.egg-info/top_level.txt
  writing manifest file 'lltm_cpp.egg-info/SOURCES.txt'
  reading manifest file 'lltm_cpp.egg-info/SOURCES.txt'
  writing manifest file 'lltm_cpp.egg-info/SOURCES.txt'
  installing library code to build/bdist.linux-x86_64/egg
  running install_lib
  running build_ext
  building 'lltm_cpp' extension
  creating build
  creating build/temp.linux-x86_64-3.7
  gcc -pthread -B ~/local/miniconda/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I~/local/miniconda/lib/python3.7/site-packages/torch/include -I~/local/miniconda/lib/python3.7/site-packages/torch/include/torch/csrc/api/include -I~/local/miniconda/lib/python3.7/site-packages/torch/include/TH -I~/local/miniconda/lib/python3.7/site-packages/torch/include/THC -I~/local/miniconda/include/python3.7m -c lltm.cpp -o build/temp.linux-x86_64-3.7/lltm.o -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=lltm_cpp -D_GLIBCXX_USE_CXX11_ABI=1 -std=c++11
  cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
  creating build/lib.linux-x86_64-3.7
  g++ -pthread -shared -B ~/local/miniconda/compiler_compat -L~/local/miniconda/lib -Wl,-rpath=~/local/miniconda/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-3.7/lltm.o -o build/lib.linux-x86_64-3.7/lltm_cpp.cpython-37m-x86_64-linux-gnu.so
  creating build/bdist.linux-x86_64
  creating build/bdist.linux-x86_64/egg
  copying build/lib.linux-x86_64-3.7/lltm_cpp.cpython-37m-x86_64-linux-gnu.so -> build/bdist.linux-x86_64/egg
  creating stub loader for lltm_cpp.cpython-37m-x86_64-linux-gnu.so
  byte-compiling build/bdist.linux-x86_64/egg/lltm_cpp.py to lltm_cpp.cpython-37.pyc
  creating build/bdist.linux-x86_64/egg/EGG-INFO
  copying lltm_cpp.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
  copying lltm_cpp.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
  copying lltm_cpp.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
  copying lltm_cpp.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
  writing build/bdist.linux-x86_64/egg/EGG-INFO/native_libs.txt
  zip_safe flag not set; analyzing archive contents...
  __pycache__.lltm_cpp.cpython-37: module references __file__
  creating 'dist/lltm_cpp-0.0.0-py3.7-linux-x86_64.egg' and adding 'build/bdist.linux-x86_64/egg' to it
  removing 'build/bdist.linux-x86_64/egg' (and everything under it)
  Processing lltm_cpp-0.0.0-py3.7-linux-x86_64.egg
  removing '~/local/miniconda/lib/python3.7/site-packages/lltm_cpp-0.0.0-py3.7-linux-x86_64.egg' (and everything under it)
  creating ~/local/miniconda/lib/python3.7/site-packages/lltm_cpp-0.0.0-py3.7-linux-x86_64.egg
  Extracting lltm_cpp-0.0.0-py3.7-linux-x86_64.egg to ~/local/miniconda/lib/python3.7/site-packages
  lltm-cpp 0.0.0 is already the active version in easy-install.pth

  Installed ~/local/miniconda/lib/python3.7/site-packages/lltm_cpp-0.0.0-py3.7-linux-x86_64.egg
  Processing dependencies for lltm-cpp==0.0.0
  Finished processing dependencies for lltm-cpp==0.0.0

A small note on compilers: due to ABI versioning issues, the compiler you use
to build your C++ extension must be *ABI-compatible* with the compiler PyTorch
was built with. In practice, this means that you must use GCC version 4.9 or
above on Linux. For Ubuntu 16.04 and other more recent Linux distributions,
this should be the default compiler already. On macOS, you must use clang
(which does not have any ABI versioning issues). In the worst case, you can
build PyTorch from source with your compiler and then build the extension with
that same compiler.

Once your extension is built, you can simply import it in Python, using the
name you specified in your ``setup.py`` script. Just be sure to ``import
torch`` first, as this will resolve some symbols that the dynamic linker must
see::

  In [1]: import torch
  In [2]: import lltm_cpp
  In [3]: lltm_cpp.forward
  Out[3]: <built-in method forward of PyCapsule object at 0x...>

If we call ``help()`` on the function or module, we can see that its signature
matches our C++ code::

  In [4]: help(lltm_cpp.forward)
  forward(...) method of builtins.PyCapsule instance
      forward(arg0: torch::Tensor, arg1: torch::Tensor, arg2: torch::Tensor, arg3: torch::Tensor, arg4: torch::Tensor) -> List[torch::Tensor]

      LLTM forward

Since we are now able to call our C++ functions from Python, we can wrap them
with :class:`torch.autograd.Function` and :class:`torch.nn.Module` to make
them first-class citizens of PyTorch::

  import math
  import torch

  # Our module!
  import lltm_cpp

  class LLTMFunction(torch.autograd.Function):
      @staticmethod
      def forward(ctx, input, weights, bias, old_h, old_cell):
          outputs = lltm_cpp.forward(input, weights, bias, old_h, old_cell)
          new_h, new_cell = outputs[:2]
          variables = outputs[1:] + [weights]
          ctx.save_for_backward(*variables)

          return new_h, new_cell

      @staticmethod
      def backward(ctx, grad_h, grad_cell):
          outputs = lltm_cpp.backward(
              grad_h.contiguous(), grad_cell.contiguous(), *ctx.saved_tensors)
          d_old_h, d_input, d_weights, d_bias, d_old_cell = outputs
          return d_input, d_weights, d_bias, d_old_h, d_old_cell


  class LLTM(torch.nn.Module):
      def __init__(self, input_features, state_size):
          super(LLTM, self).__init__()
          self.input_features = input_features
          self.state_size = state_size
          self.weights = torch.nn.Parameter(
              torch.empty(3 * state_size, input_features + state_size))
          self.bias = torch.nn.Parameter(torch.empty(3 * state_size))
          self.reset_parameters()

      def reset_parameters(self):
          stdv = 1.0 / math.sqrt(self.state_size)
          for weight in self.parameters():
              weight.data.uniform_(-stdv, +stdv)

      def forward(self, input, state):
          return LLTMFunction.apply(input, self.weights, self.bias, *state)

Performance Comparison
**********************

Now that we are able to use and call our C++ code from PyTorch, we can run a
small benchmark to see how much performance we gained from rewriting our op in
C++.
We'll run the LLTM forwards and backwards a few times and measure the
duration::

  import time

  import torch

  batch_size = 16
  input_features = 32
  state_size = 128

  X = torch.randn(batch_size, input_features)
  h = torch.randn(batch_size, state_size)
  C = torch.randn(batch_size, state_size)

  rnn = LLTM(input_features, state_size)

  forward = 0
  backward = 0
  for _ in range(100000):
      start = time.time()
      new_h, new_C = rnn(X, (h, C))
      forward += time.time() - start

      start = time.time()
      (new_h.sum() + new_C.sum()).backward()
      backward += time.time() - start

  print('Forward: {:.3f} us | Backward {:.3f} us'.format(forward * 1e6/1e5, backward * 1e6/1e5))

If we run this code with the original LLTM we wrote in pure Python at the start
of this post, we get the following numbers (on my machine)::

  Forward: 506.480 us | Backward 444.694 us

and with our new C++ version::

  Forward: 349.335 us | Backward 443.523 us

We can already see a significant speedup for the forward function (more than
30%). For the backward function, a speedup is visible, albeit not a major one.
The backward pass I wrote above was not particularly optimized and could
definitely be improved. Also, PyTorch's automatic differentiation engine can
automatically parallelize computation graphs, may use a more efficient flow of
operations overall, and is also implemented in C++, so it's expected to be
fast. Nevertheless, this is a good start.

Performance on GPU Devices
**************************

A wonderful fact about PyTorch's *ATen* backend is that it abstracts the
computing device you are running on. This means the same code we wrote for CPU
can *also* run on GPU, and individual operations will correspondingly dispatch
to GPU-optimized implementations. For certain operations like matrix multiplies
(such as ``mm`` or ``addmm``), this is a big win. Let's take a look at how much
performance we gain from running our C++ code with CUDA tensors.
No changes to our implementation are required. We simply need to put our
tensors in GPU memory from Python, either by adding the ``device=cuda_device``
argument at creation time or by using ``.to(cuda_device)`` after creation::

  import time

  import torch

  assert torch.cuda.is_available()
  cuda_device = torch.device("cuda")  # device object representing GPU

  batch_size = 16
  input_features = 32
  state_size = 128

  # Note the device=cuda_device arguments here
  X = torch.randn(batch_size, input_features, device=cuda_device)
  h = torch.randn(batch_size, state_size, device=cuda_device)
  C = torch.randn(batch_size, state_size, device=cuda_device)

  rnn = LLTM(input_features, state_size).to(cuda_device)

  forward = 0
  backward = 0
  for _ in range(100000):
      start = time.time()
      new_h, new_C = rnn(X, (h, C))
      torch.cuda.synchronize()
      forward += time.time() - start

      start = time.time()
      (new_h.sum() + new_C.sum()).backward()
      torch.cuda.synchronize()
      backward += time.time() - start

  print('Forward: {:.3f} us | Backward {:.3f} us'.format(forward * 1e6/1e5, backward * 1e6/1e5))

Once more comparing our plain PyTorch code with our C++ version, now both
running on CUDA devices, we again see performance gains. For Python/PyTorch::

  Forward: 187.719 us | Backward 410.815 us

And C++/ATen::

  Forward: 149.802 us | Backward 393.458 us

That's a great overall speedup compared to non-CUDA code. However, we can pull
even more performance out of our C++ code by writing custom CUDA kernels, which
we'll dive into soon. Before that, let's discuss another way of building your
C++ extensions.

JIT Compiling Extensions
^^^^^^^^^^^^^^^^^^^^^^^^

Previously, I mentioned there were two ways of building C++ extensions: using
:mod:`setuptools` or just in time (JIT). Having covered the former, let's
elaborate on the latter.
The JIT compilation mechanism provides you with a way of compiling and loading
your extensions on the fly by calling a simple function in PyTorch's API called
:func:`torch.utils.cpp_extension.load`. For the LLTM, this would look as simple
as this::

  from torch.utils.cpp_extension import load

  lltm_cpp = load(name="lltm_cpp", sources=["lltm.cpp"])

Here, we provide the function with the same information as for
:mod:`setuptools`. In the background, this will do the following:

1. Create a temporary directory ``/tmp/torch_extensions/lltm_cpp``,
2. Emit a `Ninja <https://ninja-build.org>`_ build file into that temporary directory,
3. Compile your source files into a shared library,
4. Import this shared library as a Python module.

In fact, if you pass ``verbose=True`` to :func:`cpp_extension.load`, you will
be informed about the process::

  Using /tmp/torch_extensions as PyTorch extensions root...
  Emitting ninja build file /tmp/torch_extensions/lltm_cpp/build.ninja...
  Building extension module lltm_cpp...
  Loading extension module lltm_cpp...

The resulting Python module will be exactly the same as the one produced by
setuptools, but it removes the requirement of maintaining a separate
``setup.py`` build file. If your setup is more complicated and you do need the
full power of :mod:`setuptools`, you *can* write your own ``setup.py`` -- but
in many cases this JIT technique will do just fine. The first time you run
through this line, it will take some time, as the extension is compiling in the
background. Since we use the Ninja build system to build your sources,
re-compilation is incremental and thus re-loading the extension when you run
your Python module a second time is fast and has low overhead if you didn't
change the extension's source files.

Writing a Mixed C++/CUDA extension
----------------------------------

To really take our implementation to the next level, we can hand-write parts of
our forward and backward passes with custom CUDA kernels.
For the LLTM, this has the prospect of being particularly effective, as there
are a large number of pointwise operations in sequence that can all be fused
and parallelized in a single CUDA kernel. Let's see how we could write such a
CUDA kernel and integrate it with PyTorch using this extension mechanism.

The general strategy for writing a CUDA extension is to first write a C++ file
which defines the functions that will be called from Python, and binds those
functions to Python with pybind11. Furthermore, this file will also *declare*
functions that are defined in CUDA (``.cu``) files. The C++ functions will then
do some checks and ultimately forward their calls to the CUDA functions. In the
CUDA files, we write our actual CUDA kernels. The :mod:`cpp_extension` package
will then take care of compiling the C++ sources with a C++ compiler like
``gcc`` and the CUDA sources with NVIDIA's ``nvcc`` compiler. This ensures that
each compiler takes care of the files it knows best how to compile. Ultimately,
they will be linked into one shared library that is available to us from Python
code.

We'll start with the C++ file, which we'll call ``lltm_cuda.cpp``, for example:

.. code-block:: cpp

  #include <torch/extension.h>

  #include <vector>

  // CUDA forward declarations

  std::vector<torch::Tensor> lltm_cuda_forward(
      torch::Tensor input,
      torch::Tensor weights,
      torch::Tensor bias,
      torch::Tensor old_h,
      torch::Tensor old_cell);

  std::vector<torch::Tensor> lltm_cuda_backward(
      torch::Tensor grad_h,
      torch::Tensor grad_cell,
      torch::Tensor new_cell,
      torch::Tensor input_gate,
      torch::Tensor output_gate,
      torch::Tensor candidate_cell,
      torch::Tensor X,
      torch::Tensor gate_weights,
      torch::Tensor weights);

  // C++ interface

  #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
  #define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be contiguous")
  #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)

  std::vector<torch::Tensor> lltm_forward(
      torch::Tensor input,
      torch::Tensor weights,
      torch::Tensor bias,
      torch::Tensor old_h,
      torch::Tensor old_cell) {
    CHECK_INPUT(input);
    CHECK_INPUT(weights);
    CHECK_INPUT(bias);
    CHECK_INPUT(old_h);
    CHECK_INPUT(old_cell);

    return lltm_cuda_forward(input, weights, bias, old_h, old_cell);
  }

  std::vector<torch::Tensor> lltm_backward(
      torch::Tensor grad_h,
      torch::Tensor grad_cell,
      torch::Tensor new_cell,
      torch::Tensor input_gate,
      torch::Tensor output_gate,
      torch::Tensor candidate_cell,
      torch::Tensor X,
      torch::Tensor gate_weights,
      torch::Tensor weights) {
    CHECK_INPUT(grad_h);
    CHECK_INPUT(grad_cell);
    CHECK_INPUT(new_cell);
    CHECK_INPUT(input_gate);
    CHECK_INPUT(output_gate);
    CHECK_INPUT(candidate_cell);
    CHECK_INPUT(X);
    CHECK_INPUT(gate_weights);
    CHECK_INPUT(weights);

    return lltm_cuda_backward(
        grad_h,
        grad_cell,
        new_cell,
        input_gate,
        output_gate,
        candidate_cell,
        X,
        gate_weights,
        weights);
  }

  PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
    m.def("forward", &lltm_forward, "LLTM forward (CUDA)");
    m.def("backward", &lltm_backward, "LLTM backward (CUDA)");
  }

As you can see, it is largely boilerplate, checks and forwarding to functions
that we'll define in the CUDA file. We'll name this file
``lltm_cuda_kernel.cu`` (note the ``.cu`` extension!). NVCC can reasonably
compile C++11, so we still have ATen and the C++ standard library available to
us (but not ``torch.h``). Note that :mod:`setuptools` cannot handle files with
the same name but different extensions, so if you use the ``setup.py`` method
instead of the JIT method, you must give your CUDA file a different name than
your C++ file (for the JIT method, ``lltm.cpp`` and ``lltm.cu`` would work
fine). Let's take a small peek at what this file will look like:

.. code-block:: cpp

  #include <torch/extension.h>

  #include <cuda.h>
  #include <cuda_runtime.h>

  #include <vector>

  template <typename scalar_t>
  __device__ __forceinline__ scalar_t sigmoid(scalar_t z) {
    return 1.0 / (1.0 + exp(-z));
  }

Here we see the headers I just described, as well as the fact that we are using
CUDA-specific declarations like ``__device__`` and ``__forceinline__`` and
functions like ``exp``. Let's continue with a few more helper functions that
we'll need:

.. code-block:: cpp

  template <typename scalar_t>
  __device__ __forceinline__ scalar_t d_sigmoid(scalar_t z) {
    const auto s = sigmoid(z);
    return (1.0 - s) * s;
  }

  template <typename scalar_t>
  __device__ __forceinline__ scalar_t d_tanh(scalar_t z) {
    const auto t = tanh(z);
    return 1 - (t * t);
  }

  template <typename scalar_t>
  __device__ __forceinline__ scalar_t elu(scalar_t z, scalar_t alpha = 1.0) {
    return fmax(0.0, z) + fmin(0.0, alpha * (exp(z) - 1.0));
  }

  template <typename scalar_t>
  __device__ __forceinline__ scalar_t d_elu(scalar_t z, scalar_t alpha = 1.0) {
    const auto e = exp(z);
    const auto d_relu = z < 0.0 ? 0.0 : 1.0;
    return d_relu + (((alpha * (e - 1.0)) < 0.0) ? (alpha * e) : 0.0);
  }

To now actually implement a function, we'll again need two things: one function
that performs operations we don't wish to write explicitly by hand and calls
into CUDA kernels, and then the actual CUDA kernel for the parts we want to
speed up. For the forward pass, the first function should look like this:

.. code-block:: cpp

  std::vector<torch::Tensor> lltm_cuda_forward(
      torch::Tensor input,
      torch::Tensor weights,
      torch::Tensor bias,
      torch::Tensor old_h,
      torch::Tensor old_cell) {
    auto X = torch::cat({old_h, input}, /*dim=*/1);
    auto gates = torch::addmm(bias, X, weights.transpose(0, 1));

    const auto batch_size = old_cell.size(0);
    const auto state_size = old_cell.size(1);

    auto new_h = torch::zeros_like(old_cell);
    auto new_cell = torch::zeros_like(old_cell);
    auto input_gate = torch::zeros_like(old_cell);
    auto output_gate = torch::zeros_like(old_cell);
    auto candidate_cell = torch::zeros_like(old_cell);

    const int threads = 1024;
    const dim3 blocks((state_size + threads - 1) / threads, batch_size);

    AT_DISPATCH_FLOATING_TYPES(gates.type(), "lltm_forward_cuda", ([&] {
      lltm_cuda_forward_kernel<scalar_t><<<blocks, threads>>>(
          gates.data<scalar_t>(),
          old_cell.data<scalar_t>(),
          new_h.data<scalar_t>(),
          new_cell.data<scalar_t>(),
          input_gate.data<scalar_t>(),
          output_gate.data<scalar_t>(),
          candidate_cell.data<scalar_t>(),
          state_size);
    }));

    return {new_h, new_cell, input_gate, output_gate, candidate_cell, X, gates};
  }

The main point of interest here is the ``AT_DISPATCH_FLOATING_TYPES`` macro and
the kernel launch (indicated by the ``<<<...>>>``). While ATen abstracts away
the device and datatype of the tensors we deal with, a tensor will, at runtime,
still be backed by memory of a concrete type on a concrete device. As such, we
need a way of determining at runtime what type a tensor is and then selectively
call functions with the corresponding correct type signature. Done manually,
this would (conceptually) look something like this:

.. code-block:: cpp

  switch (tensor.type().scalarType()) {
    case torch::ScalarType::Double:
      return function<double>(tensor.data<double>());
    case torch::ScalarType::Float:
      return function<float>(tensor.data<float>());
    ...
  }

The purpose of ``AT_DISPATCH_FLOATING_TYPES`` is to take care of this dispatch
for us.
It takes a type (``gates.type()`` in our case), a name (for error messages) and
a lambda function. Inside this lambda function, the type alias ``scalar_t`` is
available and is defined as the type that the tensor actually is at runtime in
that context. As such, if we have a template function (which our CUDA kernel
will be), we can instantiate it with this ``scalar_t`` alias, and the correct
function will be called. In this case, we also want to retrieve the data
pointers of the tensors as pointers of that ``scalar_t`` type. If you wanted to
dispatch over all types and not just floating point types (``Float`` and
``Double``), you can use ``AT_DISPATCH_ALL_TYPES``.

Note that we perform some operations with plain ATen. These operations will
still run on the GPU, but using ATen's default implementations. This makes
sense, because ATen will use highly optimized routines for things like matrix
multiplies (e.g. ``addmm``) or convolutions, which would be much harder to
implement and improve ourselves.

As for the kernel launch itself, we are here specifying that each CUDA block
will have 1024 threads, and that the entire GPU grid is split into as many
blocks of ``1 x 1024`` threads as are required to fill our matrices with one
thread per component. For example, if our state size were 2048 and our batch
size 4, we'd launch a total of ``4 x 2 = 8`` blocks, each with 1024 threads. If
you've never heard of CUDA "blocks" or "grids" before, an introductory read
about CUDA may help.

The actual CUDA kernel is fairly simple (if you've ever programmed GPUs
before):

.. code-block:: cpp

  template <typename scalar_t>
  __global__ void lltm_cuda_forward_kernel(
      const scalar_t* __restrict__ gates,
      const scalar_t* __restrict__ old_cell,
      scalar_t* __restrict__ new_h,
      scalar_t* __restrict__ new_cell,
      scalar_t* __restrict__ input_gate,
      scalar_t* __restrict__ output_gate,
      scalar_t* __restrict__ candidate_cell,
      size_t state_size) {
    const int column = blockIdx.x * blockDim.x + threadIdx.x;
    const int index = blockIdx.y * state_size + column;
    const int gates_row = blockIdx.y * (state_size * 3);
    if (column < state_size) {
      input_gate[index] = sigmoid(gates[gates_row + column]);
      output_gate[index] = sigmoid(gates[gates_row + state_size + column]);
      candidate_cell[index] = elu(gates[gates_row + 2 * state_size + column]);
      new_cell[index] =
          old_cell[index] + candidate_cell[index] * input_gate[index];
      new_h[index] = tanh(new_cell[index]) * output_gate[index];
    }
  }

What's primarily interesting here is that we are able to compute all of these
pointwise operations entirely in parallel for each individual component in our
gate matrices. If you imagine having to do this with a giant ``for`` loop over
a million elements in serial, you can see why this would be much faster.

Using accessors
^^^^^^^^^^^^^^^

You can see in the CUDA kernel that we work directly on pointers of the right
type. Indeed, working directly with high-level type-agnostic tensors inside
CUDA kernels would be very inefficient.

However, this comes at the cost of ease of use and readability, especially for
highly dimensional data. In our example, we know that the contiguous ``gates``
tensor has 3 dimensions:

1. batch, size of ``batch_size`` and stride of ``3*state_size``
2. row, size of ``3`` and stride of ``state_size``
3. index, size of ``state_size`` and stride of ``1``

How can we access the element ``gates[n][row][column]`` inside the kernel then?
+It turns out that you need the strides to access your element with some simple +arithmetic. + +.. code-block:: cpp + + gates.data()[n*3*state_size + row*state_size + column] + + +In addition to being verbose, this expression needs stride to be explicitely +known, and thus passed to the kernel function within its arguments. You can see +that in the case of kernel functions accepting multiple tensors with different +sizes you will end up with a very long list of arguments. + +Fortunately for us, ATen provides accessors that are created with a single +dynamic check that a Tensor is the type and number of dimensions. +Accessors then expose an API for accessing the Tensor elements efficiently +without having to convert to a single pointer: + +.. code-block:: cpp + + torch::Tensor foo = torch::rand({12, 12}); + + // assert foo is 2-dimensional and holds floats. + auto foo_a = foo.accessor(); + float trace = 0; + + for(int i = 0; i < foo_a.size(0); i++) { + // use the accessor foo_a to get tensor data. + trace += foo_a[i][i]; + } + +Accessor objects have a relatively high level interface, with ``.size()`` and +``.stride()`` methods and multi-dimensional indexing. The ``.accessor<>`` +interface is designed to access data efficiently on cpu tensor. The equivalent +for cuda tensors are ``packed_accessor64<>`` and ``packed_accessor32<>``, which +produce Packed Accessors with either 64-bit or 32-bit integer indexing. + +The fundamental difference with Accessor is that a Packed Accessor copies size +and stride data inside of its structure instead of pointing to it. It allows us +to pass it to a CUDA kernel function and use its interface inside it. + +We can design a function that takes Packed Accessors instead of pointers. + +.. 
code-block:: cpp + + __global__ void lltm_cuda_forward_kernel( + const torch::PackedTensorAccessor32 gates, + const torch::PackedTensorAccessor32 old_cell, + torch::PackedTensorAccessor32 new_h, + torch::PackedTensorAccessor32 new_cell, + torch::PackedTensorAccessor32 input_gate, + torch::PackedTensorAccessor32 output_gate, + torch::PackedTensorAccessor32 candidate_cell) + +Let's decompose the template used here. the first two arguments ``scalar_t`` and +``2`` are the same as regular Accessor. The argument +``torch::RestrictPtrTraits`` indicates that the ``__restrict__`` keyword must be +used. Note also that we've used the ``PackedAccessor32`` variant which store the +sizes and strides in an ``int32_t``. This is important as using the 64-bit +variant (``PackedAccessor64``) can make the kernel slower. + +The function declaration becomes + +.. code-block:: cpp + + template + __global__ void lltm_cuda_forward_kernel( + const torch::PackedTensorAccessor32 gates, + const torch::PackedTensorAccessor32 old_cell, + torch::PackedTensorAccessor32 new_h, + torch::PackedTensorAccessor32 new_cell, + torch::PackedTensorAccessor32 input_gate, + torch::PackedTensorAccessor32 output_gate, + torch::PackedTensorAccessor32 candidate_cell) { + //batch index + const int n = blockIdx.y; + // column index + const int c = blockIdx.x * blockDim.x + threadIdx.x; + if (c < gates.size(2)){ + input_gate[n][c] = sigmoid(gates[n][0][c]); + output_gate[n][c] = sigmoid(gates[n][1][c]); + candidate_cell[n][c] = elu(gates[n][2][c]); + new_cell[n][c] = + old_cell[n][c] + candidate_cell[n][c] * input_gate[n][c]; + new_h[n][c] = tanh(new_cell[n][c]) * output_gate[n][c]; + } + } + +The implementation is much more readable! This function is then called by +creating Packed Accessors with the ``.packed_accessor32<>`` method within the +host function. + +.. 
code-block:: cpp + + std::vector lltm_cuda_forward( + torch::Tensor input, + torch::Tensor weights, + torch::Tensor bias, + torch::Tensor old_h, + torch::Tensor old_cell) { + auto X = torch::cat({old_h, input}, /*dim=*/1); + auto gate_weights = torch::addmm(bias, X, weights.transpose(0, 1)); + + const auto batch_size = old_cell.size(0); + const auto state_size = old_cell.size(1); + + auto gates = gate_weights.reshape({batch_size, 3, state_size}); + auto new_h = torch::zeros_like(old_cell); + auto new_cell = torch::zeros_like(old_cell); + auto input_gate = torch::zeros_like(old_cell); + auto output_gate = torch::zeros_like(old_cell); + auto candidate_cell = torch::zeros_like(old_cell); + + const int threads = 1024; + const dim3 blocks((state_size + threads - 1) / threads, batch_size); + + AT_DISPATCH_FLOATING_TYPES(gates.type(), "lltm_forward_cuda", ([&] { + lltm_cuda_forward_kernel<<>>( + gates.packed_accessor32(), + old_cell.packed_accessor32(), + new_h.packed_accessor32(), + new_cell.packed_accessor32(), + input_gate.packed_accessor32(), + output_gate.packed_accessor32(), + candidate_cell.packed_accessor32()); + })); + + return {new_h, new_cell, input_gate, output_gate, candidate_cell, X, gates}; + } + +The backwards pass follows much the same pattern and I won't elaborate further +on it: + +.. 
code-block:: cpp + + template <typename scalar_t> + __global__ void lltm_cuda_backward_kernel( + torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> d_old_cell, + torch::PackedTensorAccessor32<scalar_t,3,torch::RestrictPtrTraits> d_gates, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> grad_h, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> grad_cell, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> new_cell, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> input_gate, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> output_gate, + const torch::PackedTensorAccessor32<scalar_t,2,torch::RestrictPtrTraits> candidate_cell, + const torch::PackedTensorAccessor32<scalar_t,3,torch::RestrictPtrTraits> gate_weights) { + //batch index + const int n = blockIdx.y; + // column index + const int c = blockIdx.x * blockDim.x + threadIdx.x; + if (c < d_gates.size(2)){ + const auto d_output_gate = tanh(new_cell[n][c]) * grad_h[n][c]; + const auto d_tanh_new_cell = output_gate[n][c] * grad_h[n][c]; + const auto d_new_cell = + d_tanh(new_cell[n][c]) * d_tanh_new_cell + grad_cell[n][c]; + + + d_old_cell[n][c] = d_new_cell; + const auto d_candidate_cell = input_gate[n][c] * d_new_cell; + const auto d_input_gate = candidate_cell[n][c] * d_new_cell; + + d_gates[n][0][c] = + d_input_gate * d_sigmoid(gate_weights[n][0][c]); + d_gates[n][1][c] = + d_output_gate * d_sigmoid(gate_weights[n][1][c]); + d_gates[n][2][c] = + d_candidate_cell * d_elu(gate_weights[n][2][c]); + } + } + + std::vector<torch::Tensor> lltm_cuda_backward( + torch::Tensor grad_h, + torch::Tensor grad_cell, + torch::Tensor new_cell, + torch::Tensor input_gate, + torch::Tensor output_gate, + torch::Tensor candidate_cell, + torch::Tensor X, + torch::Tensor gates, + torch::Tensor weights) { + auto d_old_cell = torch::zeros_like(new_cell); + auto d_gates = torch::zeros_like(gates); + + const auto batch_size = new_cell.size(0); + const auto state_size = new_cell.size(1); + + const int threads = 1024; + const dim3 blocks((state_size + threads - 1) / threads, batch_size); + + AT_DISPATCH_FLOATING_TYPES(X.type(), "lltm_forward_cuda", ([&] { + lltm_cuda_backward_kernel<scalar_t><<<blocks, threads>>>( + d_old_cell.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + d_gates.packed_accessor32<scalar_t,3,torch::RestrictPtrTraits>(), + 
grad_h.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + grad_cell.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + new_cell.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + input_gate.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + output_gate.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + candidate_cell.packed_accessor32<scalar_t,2,torch::RestrictPtrTraits>(), + gates.packed_accessor32<scalar_t,3,torch::RestrictPtrTraits>()); + })); + + auto d_gate_weights = d_gates.reshape({batch_size, 3*state_size}); + auto d_weights = d_gate_weights.t().mm(X); + auto d_bias = d_gate_weights.sum(/*dim=*/0, /*keepdim=*/true); + + auto d_X = d_gate_weights.mm(weights); + auto d_old_h = d_X.slice(/*dim=*/1, 0, state_size); + auto d_input = d_X.slice(/*dim=*/1, state_size); + + return {d_old_h, d_input, d_weights, d_bias, d_old_cell, d_gates}; + } + + +Integrating a C++/CUDA Operation with PyTorch +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Integration of our CUDA-enabled op with PyTorch is again very straightforward. +If you want to write a ``setup.py`` script, it could look like this:: + + from setuptools import setup + from torch.utils.cpp_extension import BuildExtension, CUDAExtension + + setup( + name='lltm', + ext_modules=[ + CUDAExtension('lltm_cuda', [ + 'lltm_cuda.cpp', + 'lltm_cuda_kernel.cu', + ]) + ], + cmdclass={ + 'build_ext': BuildExtension + }) + +Instead of :func:`CppExtension`, we now use :func:`CUDAExtension`. We can just +specify the ``.cu`` file along with the ``.cpp`` files -- the library takes +care of all the hassle this entails for you. The JIT mechanism is even +simpler:: + + from torch.utils.cpp_extension import load + + lltm = load(name='lltm', sources=['lltm_cuda.cpp', 'lltm_cuda_kernel.cu']) + +Performance Comparison +********************** + +Our hope was that parallelizing and fusing the pointwise operations of our code +with CUDA would improve the performance of our LLTM. Let's see if that holds +true. We can run the benchmark code listed earlier. 
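For reference, a timing loop in the spirit of that benchmark can be sketched as follows. This is only a sketch: the ``LLTM`` class below is a hypothetical pure-Python stand-in (so the snippet is self-contained), where the real benchmark would import the compiled extension's module instead, and the sizes ``16``/``32``/``128`` are illustrative.

```python
import time

import torch
import torch.nn.functional as F


class LLTM(torch.nn.Module):
    """Hypothetical pure-Python stand-in for the compiled LLTM extension."""

    def __init__(self, input_features, state_size):
        super().__init__()
        # One fused linear layer producing all three gate pre-activations.
        self.linear = torch.nn.Linear(input_features + state_size, 3 * state_size)

    def forward(self, input, state):
        old_h, old_cell = state
        gates = self.linear(torch.cat([old_h, input], dim=1))
        in_gate, out_gate, candidate = gates.chunk(3, dim=1)
        input_gate = torch.sigmoid(in_gate)
        output_gate = torch.sigmoid(out_gate)
        candidate_cell = F.elu(candidate)
        new_cell = old_cell + candidate_cell * input_gate
        new_h = torch.tanh(new_cell) * output_gate
        return new_h, new_cell


batch_size, input_features, state_size = 16, 32, 128
X = torch.randn(batch_size, input_features)
h = torch.randn(batch_size, state_size, requires_grad=True)
C = torch.randn(batch_size, state_size, requires_grad=True)
rnn = LLTM(input_features, state_size)

forward_time = backward_time = 0.0
runs = 100
for _ in range(runs):
    # On GPU, you would also call torch.cuda.synchronize() before each
    # time.time() so that asynchronous kernels are actually finished.
    start = time.time()
    new_h, new_C = rnn(X, (h, C))
    forward_time += time.time() - start

    start = time.time()
    (new_h.sum() + new_C.sum()).backward()
    backward_time += time.time() - start

print('Forward: {:.3f} us | Backward {:.3f} us'.format(
    forward_time * 1e6 / runs, backward_time * 1e6 / runs))
```

Averaging over many runs, as above, smooths out per-iteration jitter; the absolute numbers you get will of course depend on your hardware.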
Our fastest +version earlier was the CUDA-based C++ code:: + + Forward: 149.802 us | Backward 393.458 us + + +And now with our custom CUDA kernel:: + + Forward: 129.431 us | Backward 304.641 us + +More performance increases! + +Conclusion +---------- + +You should now be equipped with a good overview of PyTorch's C++ extension +mechanism as well as a motivation for using it. You can find the code +examples displayed in this note `here +`_. If you have questions, please use +`the forums `_. Also be sure to check our `FAQ +`_ in case you run into any issues. diff --git a/recipes_source/recipes/custom_dataset_transforms_loader.ipynb b/recipes_source/recipes/custom_dataset_transforms_loader.ipynb new file mode 100644 index 00000000000..c574ec6b115 --- /dev/null +++ b/recipes_source/recipes/custom_dataset_transforms_loader.ipynb @@ -0,0 +1,876 @@ +{ + "nbformat": 4, + "nbformat_minor": 0, + "metadata": { + "colab": { + "name": "custom_dataset_transforms_loader.ipynb", + "provenance": [] + }, + "kernelspec": { + "name": "python3", + "display_name": "Python 3" + } + }, + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "DCgx3NYWvMfb", + "colab_type": "text" + }, + "source": [ + "# Developing Custom PyTorch Dataloaders" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "ViYQRS93vH39", + "colab_type": "text" + }, + "source": [ + "A significant amount of the effort applied to developing machine learning algorithms is related to data preparation. PyTorch provides many tools to make data loading easy and, hopefully, your code more readable. In this recipe, you will learn how to: \n", + "\n", + "1. Create a custom dataset leveraging the PyTorch dataset APIs;\n", + "2. Create callable custom transforms that can be composed; and \n", + "3. Put these components together to create a custom dataloader. 
\n", + "\n", + "Please note, to run this tutorial, ensure the following packages are\n", + "installed:\n", + "- ``scikit-image``: For image io and transforms\n", + "- ``pandas``: For easier csv parsing\n", + "\n", + "As a point of attribution, this recipe is based on the original tutorial from [Sasank Chilamkurthy](https://chsasank.github.io) and was later edited by [Joe Spisak](https://github.com/jspisak)." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "qOyML7wg1BMB", + "colab_type": "text" + }, + "source": [ + "### First let's import all of the needed libraries for this recipe" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "2Rc27Y3fugUl", + "colab_type": "code", + "colab": {} + }, + "source": [ + "from __future__ import print_function, division\n", + "import os\n", + "import torch\n", + "import pandas as pd\n", + "from skimage import io, transform\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "from torch.utils.data import Dataset, DataLoader\n", + "from torchvision import transforms, utils\n", + "\n", + "# Ignore warnings\n", + "import warnings\n", + "warnings.filterwarnings(\"ignore\")\n", + "\n", + "plt.ion() # interactive mode" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "f2rHNlke17xe", + "colab_type": "text" + }, + "source": [ + "## Part 1: The Dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "akRD-2aTKSXR", + "colab_type": "text" + }, + "source": [ + "The dataset we are going to deal with is that of facial pose. Overall, 68 different landmark points are annotated for each face.\n", + "\n", + "As a next step, please download the dataset from `here ` so that the images are in a directory named 'data/faces/'.\n", + " \n", + " \n", + "**Note:** This dataset was actually generated by applying `dlib's pose estimation ` on images from the imagenet dataset containing the 'face' tag. 
\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "Dz_FGQmeIU8v", + "colab_type": "code", + "outputId": "44ed3e5f-ae72-4c1d-fba0-e7968ae909f7", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 221 + } + }, + "source": [ + "!wget https://download.pytorch.org/tutorial/faces.zip\n", + "# !mkdir data/faces/\n", + "import zipfile\n", + "with zipfile.ZipFile(\"faces.zip\",\"r\") as zip_ref:\n", + " zip_ref.extractall(\"/data/faces/\")\n", + "\n", + "%cd /data/faces/\n" + ], + "execution_count": 11, + "outputs": [ + { + "output_type": "stream", + "text": [ + "--2020-03-31 19:10:25-- https://download.pytorch.org/tutorial/faces.zip\n", + "Resolving download.pytorch.org (download.pytorch.org)... 13.249.87.32, 13.249.87.127, 13.249.87.81, ...\n", + "Connecting to download.pytorch.org (download.pytorch.org)|13.249.87.32|:443... connected.\n", + "HTTP request sent, awaiting response... 200 OK\n", + "Length: 5780252 (5.5M) [application/zip]\n", + "Saving to: ‘faces.zip’\n", + "\n", + "\rfaces.zip 0%[ ] 0 --.-KB/s \rfaces.zip 100%[===================>] 5.51M 30.9MB/s in 0.2s \n", + "\n", + "2020-03-31 19:10:25 (30.9 MB/s) - ‘faces.zip’ saved [5780252/5780252]\n", + "\n", + "/data/faces\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "rMoZIwXS6pn2", + "colab_type": "text" + }, + "source": [ + "The dataset comes with a csv file with annotations which looks like this:\n", + "\n", + " image_name,part_0_x,part_0_y,part_1_x,part_1_y,part_2_x, ... ,part_67_x,part_67_y\n", + " 0805personali01.jpg,27,83,27,98, ... 84,134\n", + " 1084239450_e76e00b7e7.jpg,70,236,71,257, ... ,128,312\n", + "\n", + "Let's quickly read the CSV and get the annotations in an (N, 2) array where N is the number of landmarks." 
+ ] + }, + { + "cell_type": "code", + "metadata": { + "id": "Ar1FD1QIz97S", + "colab_type": "code", + "outputId": "91fcc1c3-a2e8-4e05-d819-72695de48829", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 119 + } + }, + "source": [ + "landmarks_frame = pd.read_csv('faces/face_landmarks.csv')\n", + "\n", + "n = 65\n", + "img_name = landmarks_frame.iloc[n, 0]\n", + "landmarks = landmarks_frame.iloc[n, 1:]\n", + "landmarks = np.asarray(landmarks)\n", + "landmarks = landmarks.astype('float').reshape(-1, 2)\n", + "\n", + "print('Image name: {}'.format(img_name))\n", + "print('Landmarks shape: {}'.format(landmarks.shape))\n", + "print('First 4 Landmarks: {}'.format(landmarks[:4]))\n", + "\n" + ], + "execution_count": 12, + "outputs": [ + { + "output_type": "stream", + "text": [ + "Image name: person-7.jpg\n", + "Landmarks shape: (68, 2)\n", + "First 4 Landmarks: [[32. 65.]\n", + " [33. 76.]\n", + " [34. 86.]\n", + " [34. 97.]]\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "v9GLPpde0BhV", + "colab_type": "text" + }, + "source": [ + "### Next let's write a simple helper function to show an image, its landmarks and use it to show a sample." 
+ ] + }, + { + "cell_type": "code", + "metadata": { + "id": "5hpERgmA0Egv", + "colab_type": "code", + "outputId": "6ab3417d-98f9-4ceb-ff88-89f757d5483e", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 269 + } + }, + "source": [ + "def show_landmarks(image, landmarks):\n", + " \"\"\"Show image with landmarks\"\"\"\n", + " plt.imshow(image)\n", + " plt.scatter(landmarks[:, 0], landmarks[:, 1], s=10, marker='.', c='r')\n", + " plt.pause(0.001) # pause a bit so that plots are updated\n", + "\n", + "plt.figure()\n", + "show_landmarks(io.imread(os.path.join('faces/', img_name)),\n", + " landmarks)\n", + "plt.show()" + ], + "execution_count": 13, + "outputs": [ + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQEAAAD8CAYAAAB3lxGOAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy9S4xuWZbf9duP8/heEffezJtZWa/u\nMi5czasxIGPEANrIqGeeWYBkgYTsEQMkhLAYMvIIiWlLIIGEBEg8bCOjdtsSAwZutwCr7K7q6qqu\nyqrMvHnzPiJuxBff953H3nsxWHufc77IrKxylYu+rb5beTMivsd57LP3evzXf61lRIQ34814M/74\nDvuHfQFvxpvxZvzhjjdC4M14M/6YjzdC4M14M/6YjzdC4M14M/6YjzdC4M14M/6YjzdC4M14M/6Y\nj1+YEDDG/Lox5jvGmO8ZY/7qL+o8b8ab8Wb8fMP8IngCxhgH/D7w54EPgd8B/l0R+dY/8ZO9GW/G\nm/FzjV+UJfBngO+JyPdFZAD+B+Av/ILO9Wa8GW/GzzH8L+i4XwI+WPz9IfCv/rgP7zatvP3WA6x1\nVFWFcQ4wMP1ffzEYPmW3yPz+4tP6lgjGmPyWARHk/AvTMQSw1iApIQhmeWIMIiAJur7jeNjT9z1g\n8JVnvV5TVTXWWkz+4n5/xzgOhDASY5qOZUw+lkj+mea/Yfo+2UJz3s9XYQzWOb22/J2UUr5W/V6M\nkWLdWWsRSXmS9NwpJay1VFVNyteVJBFDxHqn85X/jeNAjAmRhHMuz1+5NqPzJZBSIokeyxhDVVWE\nECnTnpLk53f+XEC/u3yoxiw/uHgN9NkJ0/VhyrGWz7zMFRhTnsfiqU/rSLDGTp/zzmGswVmHMZbj\n8TifczGMAWsdKSXW6xV1XefX7PSvXJ/+bYgxgpjpczHF8kj0GvK6S2VehM8899l15Hv48cN86pXf\n/+73X4jI4/uv/6KEwE8cxpi/AvwVgEcPtvwX/+lfYrO74PG7X6TdPgQMgtEHieCswRlDsO78QGLL\n8Tg3bIQQRrz3eePphlmcf/qZkjAMCe8N3hswiePxSNu2fPzxcz740ROeP3vFx0+ecTzuGfo79vs9\nh8MB5z1vP36H9957j9VmS1XXGGP49re+zeGwp+tP3N7eYbAY67DGAfpTxBBjwuDow8gQRoZhIIQA\ngPeeh
w8fEmPEGINzjvV6TQojKUbGcUREpnuMMXJ7ezt9frvd0vd7rNWNaY3jcDhR1w1vv/2Y47Gj\n8hUpCXd3dwQD7aqlaRqstXz85EO6riOlxG6zou/7SYiMQ6SqKkSErusIKSKArzzvvPsup9NpmutT\n1zGOIzHGvOmF9XqN954UZqGVUmIcx+l5ldeXP733KpAMJInT552zVF7XhjMG7yxN0+CcI4kwxhEj\n4KyukSSR7XpD0zS0bcPDywvqusa5mhSFH/7wh/l69fouLi70/iXinG70hw8vaNqKYej51V/9VbxX\n4dA0NV/72i+z2Wyo64r1ek0Y8r0bmQSlMdB1J302VoWGc45xHFVw3rv3++tWBc2sMO+v7fvjz/4b\nf+GHn/X6L0oIfAR8ZfH3l/Nr0xCR3wB+A+CXvvxYRLI6pkj9ezeiavLeabJG+AypB7pplgKgLOD7\nk6STqZuoaHQRePLkKR8/ecaTJx/zu//oOzz56CkGYbttcc4zjolTd+LUfcR+f2S12dA0LVVVs98f\nGEMgJV3MSQQjiWTAGoOrKpxxeG8wxuGkogoBa+0kBJxz06LIc6abaRyIIRBjnO6xCIGlYPDeM44W\n5x1VVRNDmuajH3pEEsYavHM0bUt/vCOEoJszJURkWpg6J7OFYUw6F6rWqhUlMlkj5fPOOYZhIMZI\nCAGXj2UWz6hYKUtLppwrhDCda3qmCCnJJPAkJSxg8mYqFoqIkABJ2cor15giow9Y63E2cjr1jGME\n6QhRrTObhU2MEessAsSYGMNIXde8ePkiW3LC8+cvqOs6L1QVGjEmNps1bbvS67aWlAJVVeG9P7vP\n8qyNMXhfTfP3WZjdZAkBzlX3toX59Db5CeMXJQR+B/i6MeZr6Ob/d4B/78d9WEQYQyCkpA9MDJLN\nvWIGG2sxxqoJa1DNaopJdG4/lnkrErsIgbKoy2shFKluqaqKcUzZ/DVUvuF//5v/B9vtBQ8fPOL9\n998nBGEcRp49f8GjR49Uc/iKMSU+fPIJddOyWq159OgR7WoH/ZFwigiqjSRGRCLGJDAWXWMWY4Sq\nrqmbJmsFOwmtomlTUpM8pUS92Dh1XVPX9bSBQgg45/De0zQNMTVUladpVuz3e7AWMTCGUTeAd1jr\nqFNN2EfsOJ6ZtsXKgHnxOedoGtW66o6oq5VSIg5qjRyPx2m+15vNmSA2kLVdQmKarrcInGEYpk1f\n1/UkGGKM03WBCp4YQ54nSLrbIQkpJlJS7Wtdha9WIDFfhxCCcDoNjGOiO/Uc7w4qmKzDOt0WDx5c\nkCRxOp3YHw70fT/d0/F0ghRo25qv/tJXef7sOe+8+w6PHj3k5csXfPDBh8QY2W43XF9f89ajt9mt\n19S153Q6TZbRarVitVrl69LrU2Ey743748dp+p91/EKEgIgEY8x/BPwm4ID/RkR+98d93hiL9zXe\n1VjjdbMLpJgIAiIRV0wfm31iEsYI6gJkbSDz5Cw3eUoymVcxqqSOMTIMQ158Duc8XTdwdXXF1dUV\nT548YdVu+PjJJ/zDb36L9XrL4e5Avd3y9voL7PcHTLTUdc1u09I2UX0+5+gGqCpHCDCOCXXL1T+P\nMZFSQGTEmpivMxBFyJfIarWaNlzTNFxeXk4Pvu97fOWxxhCy5TCO6kbc3d3R9z1t206CLkbBOXWr\nxnE+3/F4xFpHiIO+NwQQmTT2ZqOmct/3DMOAEYP3ZbmotnHOqb+bfdu2bbHOTYJMRLCTeZsmbVfn\n9xEhZBdhubBTSmcukTGGYRjoum7S/LM57DEmTf62rieyee0xRhD03okpbypVHCEIIQwAOHS+Hr71\nkN1ux/X1NTc3e2IMDGOPMVBVNU1Ts1qpa2SINLXHYLi6uuZ0OvHsk2f0Q49zFf/oH32Lv/f3fptf\n+7VfY9WsQYTDIVHX9fSMYlQrr+AHOleRz7Zu57X9Wb//rOMXhgmIyN8
C/tZP+VliTIQYGWPExUBM\nEBMYESQFIgZrE4lxsWDMvX9LYA2SqFZcmlZFqyxNT2MiMGCM4eXLl3z85CnWeT7++BkvXlyxv90T\nsiltrSOJ4hXjGEkykMQomJkMVgQbAtvdDmuSaiaxk+k6joEwKmhWrmEcR2LxhmDSOGWhV1WFW2wu\nn8Eu51QbFwsgxjgtsOIPW+PwvqKuGzabzaSB9DujugQY1UKA5A25dJ3K8ylugbUWWVpYSTCuAGLz\nwiz3Z/OCB13sRdPpOeK06Zcm//KZFeFTrlsy+peyQLPWYo09c1kK2AoqoLCelAKQUFDTZyVSPpNx\nE+sYY2QYA/04ZLNS2GzWGGsIMbI/3JFixEgijCPOvWK3u6BtWwA++eQDvFPhlSQxDCPf/d732O22\nfOUrX2K1WmXrLlLXFU3TTHOl83IOdn7evlmCwtPCp1jDP52A+EMDBu+PkIVA+ZnEknEUkIIkC/FM\n8ik6PeMCWUPkSVHEel6YRQhAmcAZH0gp0nUDTz/+hA8++JB3332Pp0+fc7e/IyWZNhXGq9mJIaaE\nBIMxkaaqsdbpgsOAsVjncb7G+0hMEcFgo+jiE13EMaPC07VnVLlsiPt+oXOOFMPk24YQGIZh0rbF\nPajrOm9Yj7WKD7TtSjVzNrfHMDIG1UTGCCI+z+kMos7++ow/WGvB2NmkF8GZc3ymWAkhBOrFPajP\n6yfhoXM/32txDYogKua39566ridLhXlZYIydvhdjnNVCETzOYa0nJpmsRe89kgIFfXLW0rZNtjpG\nxhCIMUwWj8vXHGJg6HuSJBzgLIQw8tWv7jDGEcJId+q5ubnFe09Vea6urrm7veXBgwveeuvhZEkU\nIVDufYmx/LTjvmtQlkr20H6q8ZoIAUX2Basgma1w1gPZ/5OIxSiya82kneabLJt/jhRoOG3eWEvN\nUnzPeQJV2v/1v/43+e3f/vt8//s/om3WHA8ddVPx6NFD/tyf+zW+973v8fL6hpt9B07DSnXdsN3u\nuLi8ZL3eYJ3j7nDHx08/JgXdrIfDYRJAMY7TYrf5fnytGsj7itVqNZn/ZePc3NxwPB4ndD0Og2rw\n/JmyMRSZbiazOIQAzpMwdEMgCsSU1PUA+rGfvmuM0LYPqKp6AiN1k2S/fJTJpK+qCmQWAt57qqzN\nAKqqoq5rjsfj5KIAZ5bFOI6Mw8g4DGfvWauofpmvYRim95qmYRiG6bX1ZpdxkYqq9tS+0vnNAKCu\nBcnCuEKcnTa2954YFuBlVgghjIRR8n3UpKSC7Pr6hqapJ817tz+w26ywxnJ9fcPx+PvEGPDe8Y1v\nfIOmaXj16pqnT5/yW3/77/Av/elfJYTAN7/5Tb72ta/x+PFjdrsd3jv2+z3ez8KvWDGf5/vPimG2\nZrIqOQullt8/L5z4WggB5xyXlw9YbS/YrLe0my0Yh2BwxkKKE0PA2KXfDzOfYLYEpmHmjb/cVDBL\n0OPxyOFwx8uXz/md3/n7vHhxhcXRnQac8xgcd/sjv/mbf5vNZoVYh/UGa1HE2Ahdf+Ll+1fUTUPb\ntGy2G8QYjHUYMUjSzaea71yAacy+om3XOUQ1+9Cg+MB6vZ4AshgjR0nEENBY/jjdT7EAShjNe8/a\ne0RgHOO0qcrG9d4zDH0GK+1klpf5stm39t7hTD1tUrWQZUK4QwiMGVA0BQvI99W0bV7kM/BX3Jhy\nzUVoOOfOIiPW2gmTWIKU3nsEOJ1OGunoVeO2Ta3nNqIRGKdhWEmJGAesIYOyoG5BwvssjFMkxJ6Q\nhCEkhmFgvV7n6/VYC7vdBXVdZWwnapg2RTbrLZvNFuct1sLz5y8IQYXNer2lqipOpxNdt+LywQU3\nNzcZl7ngK1/5soYQQyAEjRwUC2gJyM5r/hwPMFYtmSIGlptdFsLh81yD10IIABNyHPJC1z2tZA5y\nbNnAJOF05E2/kHRnQIkpZJrZGii
bq5iq+/2eTz55yne+8y2efvwJp+MAOUzjfT2FgDabDat1Qz8G\nxlOnBI8sha2z+MozjgNjGIkSM5IvjEFxgJQiKR9LgR9BZGnGBmLMWnZxfSUiMJnhqKYtYbayaZbv\nl7nUCVFTXk1oYRzVzHXOZjJQmIRCCTOW0GNTV9NcFzyiXJuzGv8u1znEMJnshXTUtm0WNMMZql+u\n1zmLs256vSDjSxS+hAiXGIliCoYxzFgISBZaFpMFSLknYy3GGRyGQTIGEfXedeMW0lNUsDAKxlpC\nnLGbqnJ0Q0+UhHOGd77wLuOpY+h6fe7jyHp9wWq94pNPnrJaFRxEOJ06njz5mBhH1psVKaWJjPTo\n0YHVajXNyf2Nv3xtOWb3quwEJte4YCYz2erzXYPXQgiIJPqhw/Y1w9jhhlajADjEOhBlWGW5AGY2\n/xXcWbCssn+NUdJPSgVNnhdwsQ6urq559uwT3n//ff7B//tNrq9uMMZTVRrqq6uGEEZA+NKXvsQw\n9ow3t/RDjzF2AmbUHK3Z7/d0Xcf+LnH54MFk9obsXxZkOmUWn7VpAr6GYUCECXsoG7yY0kvSSFVV\nJGuJBbyzFlJxD4p/qZ8vWnrMxKlhGAlxxDkFIAuzr5jcRQgsmW8KQnrqupoArKrytKsVkhJ93zMe\n7qjrmtV6BSggVlyu/d3dNOdL8E+1tc+aOpLyvRXhV8KRKrgSGgVStqMxBkwgxTiHJ2PEGpOtlyqb\n77oWjAUrhhgCKQXCGLKA95TFo+C0EJNQ1V5xgRSRJLiq5e5wwFp1P/7kn/yTdHcH7m73XF+95NR1\nXF5e0jYtIFxePiClqISxruPJkwMxjjx+5zHb7W4S3tfX11kROJxbEt/yOufcEph2tDEYETRKxvS9\naf3fC5d/nmvxWggBJDKeXlC5EScXuNQQoyFi8M0q32T25yWbksUikALu5d+LTSBCMiHH0DUUpOFB\n/WwMgf/tf/0bfP/73+f66pbuAHFo2e3W7C42PHzrAdvtFldVGGO5O5z40Y8+pB8Dxuviqrynrhra\nZqOhL9/ohhtHbq9eMfQdIYw4nw21vBEaX51tsgJSFkCwqlTLNk1D13VnPn8RBiFGuq5T4kkzm9eb\niwva1WpC4O9uP2G/v+V0OrFerRnGYTb3XUtl9HOCRVLEGsHXnovdhmEYAMUqBGjXW2UtbiObVcPz\n58/p+4HLBzu6sc/4DWAtbd0yjiN3d3ccshAoFsrFbsd6vc6mfA9WhXiIgafPPmG72SAGhhDAWKq6\nmRZ3XSmPwCC0tSWNohaFs3iv/9rWs1qpmxFjVB0yCCEGPA5ja4YscMMYOHX9NLeBBA7adn3Gvmya\nhuPxyPHY54iEx4qFJFjfst00PH/5kk8+ecq//K/8i7z3hXe5uXnFBx/8iLZxjEMiBssffPeH/OD7\nP+LP/tk/w4PLh/zwhz9ks1mz223wVU1KgZTnUWbNNj1fUJd42tTpHoOWIickg70yKZ8fN14PIQCc\nhTky+DLj/MuPyWQDFRKhmPLtBVhoyJotTYuvrhtEDF3X8fLFFR999BGvXr2i63qsbWhay8WDLQ8f\nPmB7sabrezwJ7xtEYLvdUYVIFNWip9Np+ld8Wu/9RIkdx4EYxkm7DcNA3/caY85arm1bLi8v8b7K\nvucM6k0hLjtrCJE5tr6MHJTjKYEmTsLjTIAgc/x+YZrr95XTXmb8PBZ/7nqUayr+6zIiU57bRAwy\nlt1ud0YbLnhE0d6FYlwwi7FpFCvwSnIqkYJxVMtIMZOARee6YCjFBVDexMyX8NZTOeWDiAg2AeJp\n24ZhgL7XMF5KCVt7qhxZKdjFkjhVLJm+75XxaV0GY/VncJYPP3yCpMh6s+JP/Ik/watXr7i+vuV4\nOPDhhx/y5a98iQ8//Ii+79hdqEs1jGPOWRGw9U8N75d5/yzuwLkV8LpbAiwvcQ7ziSgcaJZhsoWp\
nNJn9WbvL2YHUBShx76qqpzePxxMff/x0MsGNgZRGtrsVVW0RExFJmZsOMUI/jFjnqYzDZjMf1F8c\nMsJdwnLLSMTyn162OTN1l2G3stgLd6CY8vHepp80eV6YZZGW4xWA7+7u7iyMV86/xA+WEROJM/ZQ\nBNoy9FY2tr93/eV5FJ7A9ASK319IRXB2rTYLrCLwlqQhyQ/GOYt1LjMAF2HTpNyFAk5qkpMKF03a\nkgnbEATJnP2Ukv7uDK6yuORwyeKCxQhYPz+H5XyV659dlBFX6zwkiRijYUQhcXt7y3rdElNkvV4p\nPdpa7CI56dknnxBTYLP9ZV1HMRGtWq5ilqHvcx0+befF8xQp4GBeZ8WKEKVmf87+B14jITAvIqXx\nGgxGVM0vEc9zUFBj8nNkQN+Z5AWWJJroUlU1x2PH0I+8fHHFD37w/iS9UxKG7sTb7zwmxkDX31HX\nFgzqI/aB4zEAem3OcrYxzkk1cbIS+r4jjMO0CTSPYGbO2ezXd11HCGlyBwrYtRQGy43s8sava2Wv\nlWOVBVrYdfv9HmNnghSch+KKwCr3IMIUAqwX2rBYMpPAyJjFUltOc08BqUp8vzq79gLWle+llAij\n+t4Adb1ckgkhIWLPBKlzbgKJrdGQYoka3BeqipkYEipgomSANgsC4w1OHJVU+X1IwpnwLG7LEs8I\nIWAag3WWru9o24zlJMfxcOTm5pbjUTGEy4vLiWvSNA3XVy/pu44QR7785S+qssq5EFVVE+J5SO8M\n7F4u8DxHRQBkQOje5+UncoZeHyFwxvrLQmCSYJb5TpbgyX0zZ/5dNfSJ7faCEAKHwyuOx46/81t/\nlx/96AOur1/x7NkLrLW8/fYj/oVf/dd4+fIlq9Wapm0QLB988IT1es1qtePJRy8JQRNvjv1xyvZb\naueiLWKMHA4HSgy7XW1yymnJWQiLhJpEbwLDOGsaJeMYTSM2Bp8jBmrRnCeXlA26/FdwCYAQZpZd\n2ezlWEtmnveeFFPmPtSTuV/+ea+c9wLanVKYNHdJBy4a2rg5IaZtW/b7G+XzJ2UndqcTFvWzd9ud\nYgDTZjtpolUMBITB9ECfGZcJZw277YbKezarlr7vWa9bfOW5u7tjGEbaVtmRp9MJ5zxQ2KHpLC/h\n5ct+ErLT/Pga5xtdQ/pICDHRne4U98mMzPIMU8wRHEmsVi1+uybEyN3d/PwfPXobBs1ILJGaJHA6\n9XznO9/h61//p2jbh1S+5uXLl+wuHkyuy2eNM9OfpWL82cZrIgTMFOKQJWmqoNPFZi+vTT/t9H1Z\nRgjQEE9dtwoiGYtzFR988D1++KMfcXuzZ7XaUFU3PHz4iEePHqi/aw1DGIknTSK63e/xp4HjceTu\neMzho3AWb79v5sMcolNzN6lGzu+XxJ5Che26LvP47+ehzyG1+3/DzDArfv9SCBQLIqU5a25pZpff\nZ1NaQ3NDP4fyymYv97bM5osx4oyc3Sv3jq9JRkpcKi5ZuccwBkKlpm+drY4p1Gdkdn9SyhGEQqby\n1LUmWXnnqFy5JohhEXWwDi1uNeNCXX+iRDpsdlGGccQ4i7cV3nnGMGam6uwKFKS9EKXKOWKMOK/W\n4uF04J133s5ApArhzWaNpLJeIIZICJGUArvdjhBHDocjF92W0+nE3Z3Sh1erte6GEuWTvDtEJuLP\nmXY3S9LcBKflz5///HHjNRECAHN4S/+a3QORc/+sfJ7l73J+LChmmy6ovu958tETTkdNQkFgvd6w\nXm9wznN7e8BX9ZT77p3m2XfdQN8nYtJknPug2jL0VZDkJRBHmCV10RhTCDAGxntEG2vMVNzCOot3\ns8+PyVTlvPGK5bF0Gcr55011DvIttUbBD0DxjDDMGYRl0S/j1UtXxTK7QeW+IQvgBV6gxUgWzzNH\nAgqrzxj1l4Mof8KgadfWaAhRLRMFTfV4Fb7yyh4VoRT4EBG88
1T1nLxUIgoiTPgBxeW0jpSG7FJk\n3CMlUlwCavN6Wq67cjzrwFqlp3dDf6YUUirREOi7YXrOMY6s1yusNTjvGAYVBk2jtPPNxs0bndm1\nypO7AP3OKUBmUo5MMmIKFH7KRTgfr4UQ0AnWsMg8kUYXglXW1zzs8ulwVkhkvuvyC03TcHu75+nT\np3z3u99lvV4hAq9eveLx43fRQhx3DFcDv/Irv8KL5+/z4uVzdhcXbLcX7O9O3B2OrFeXWJuPm7XV\nUusuXYOyOTEG6zQjT4tIqDtwdzyco+WVn2nEzJq3WA3FBSiaP8WYNctsxi4fchEQemyLMX467tLS\nKNZIIeAcOZwBgUvyUfnclHiVAlXGJpZCAOac/2LFGRHN9TcGn9N0SaIFRUJEYmLsB4ahxzrw1lDX\nnrZt2e0uNTvROiTB4XCgH4e8UZUPMOTaCJeXl+wutnTdif1+PwltMDhb0dTtPGcRhl5JRs4ZYjSE\nMRGTmUzsJbquefuaCDYMI30/ECJUlWW9XvOD999ns16z22ypm5aXV9c4q+9d395iJTIOQ3ZRHF/4\nwru0bc2rmyuur16x3W65uPC5SMq51QefRv9nMHAGB++Pc1D6NRcCxR1IUkzBORZqjFFxW1yCJW24\nYACiYcLyUwBJQn/quL3dc3uz59mz5zmuPbJeb/jKV7/C9777fSV4tGvaZs0Pvv8R/RCp6x3jAKf+\nCGJYrzYaysqprCKKAMNslodM4y1+42q1AqMLaul7g6GumwkwKxtIQS37KRO8CIEiAMZxpPYVxptJ\nCJWFvVw4VVXlegIzrbhEEsp1FsFkjLL7iiVRhMCSRVi+Y60ljCMxaOpxed/XzaT5SyitEJ0ePXo0\n4Qmy+E5JgW4a9cGbpgYTJxCu6zrGUXPum6albVa0bTuh869e3fLo0aOpSs9ms8G7ihgPcwq00cIb\n69WOy8tL9vs9fT8QU6SqWrW8jM3EswokkGSOOpRnDEzWVhG6TdPQrCoshvV6jbGWMUXee+c9qqoh\nxUjlK/65f/af53vf+TaI4Z133uHu7pZ33nmHL3zhHYaxY7PdTKBm267OQNDPG0LZDrJ4pWz+8vdP\nPs5rIQTOzZbiB+mwVhF+lqEOU8x/c/b3sn5gTInr61fEmLg7HHj58iUxnofZ3n777bzpLOOgZp3g\nsBZCGDBOrQ4RwxgGoGy0knY6RwVOp9OkfcdRuQFFCCw18zKaUP4VTb/UqBN5JYRpk0xRA+smm7UI\nmCUduiygqqoQ5tDcEjkvfy99+GEcaGJzZl0UAbdM002yXGx63ZWZcYviThTcpPxdrq0ItSIIZpAy\nYSzEWCwUi8gsFNQCmd2YIgBLqnUIga7r6E7dFOmw1hGjMlI3u0vGkBhCsZJyXUjMhC0USGoiGhWt\na2baecE7qkyrLjyHMlfKhagYBYZxnAqM2PzZuq45nY7s93suLncL7kPGV7KymMz7aeHLtK8nvGCK\niJ0/k6UQ+Um44WshBMqNFkzATBJs1p6K0RgSdo59ipkekMksKck+pVJuRwxwt7/j+fMXuaiGIUah\n7wfWmw1DPzCMka4TogSc00kLUfAu8w9EzsxjOC8HJSLThpoy/WKc7mIC1fJ7xmoGosukGWtMTp/O\nGyYmYpoFStG4KeUQUFLascnz5qzNllSawpAlcHoWKitaL//tcnIRhlwebM5ZEJivIYwTAQgRSCm/\nr38r6p8LeUxCrORNjFRosVIkFy21HqXsBmIYcAaiAWc15Jb8zD0IIRHjqMVQs5U4c/2FcRywTsOA\nYzfS9V2mBOeirFZTr/thpB8GxhgnslVdgEsRzfg0JoOK90uUxYVLpLUHmrrGZ8HVDyO1r3VJJpl4\nI8YYYhKePXtGW6ty6Yeey4sd+7s7jIG6qagXjM+YYlnhs7IzMoXBZwkwOQPzz2IBLBTkTxM4eE2E\ngAVbIWIwkvBGdOKjEMYeY
zUV1LqKFDVHHkAr9ASc14cRgpp/x+6OsR94/Ogdnjx5wpOPnvJ7v/c9\nnj+74vLhQxgC/dUrME6ZdWMkJs1YKyGxuq5z9dcimOxUxqvP4b2u684kcPGzl0lAKSXdNJnj7rO5\nvMrU3qqqePnyJZAr0ZJrBzsKr24AACAASURBVFoFBUMIMMbJUmqsx1qTwTOl2qrJa3C46VoAjBG8\nmQG6MdchMEbrHLTthtGNOb3Yst5sGceRm9usoQY1m2MYGbpe6c4pUZG4DREkYZRwjDeJyhqcNZoA\n5Gr6ZDkOkXa4o3bga73eygpVjvqm7J4RRkYCbd0gccgCHWxl83UEEgHrHV2udNRWK7r+wN1xTxKo\n6grnPREhCljjGRKIURft5uYmz8ssuHUdlQrMmgPived4PBJKibIYOJ1ObNcrqsrrfFuQZEnJ4GhB\nHFXlqbzj6uoVdV1RVxXbtSYHdfGEQVg1NVjh1fUVN6+uuLl5wa//+r/N5eUFVVsRiRiTS7plDMZg\nZqqwzNU3JZVNP2t9m2XFMhr0k1yL10MIiDKmjClFJAxSinMAklNnk4DzK2ZOdE4asYohaBqtQ0iE\nPvIHf/AHXF9fczp1bDZa+69qGjUzMVR1jQjUjaVdben7/iwUVzR7Cb0Vc23M/v/Sp16a4WcVcOBM\nMCxZfiKSWYvnaaPOucwwm7MGl+BPSGGhmcxi0y81vpn8+2LG1lVFU69ykQ07uTQxa8e6rolhZBx6\nbm50Y1ijKcVkC806g2ka3DBM6H3bVLRNBUZIcSSkSAqBOEalW0rEGqXtGhT7KAtaBCyGyjqMA1LU\nUl/F2jMGWzVKDXdOy5cZN6Vlq7U0TNmndVPn0nTZRQo6f+vNZqrrUJ5DKcq6FAig0Zau6xQ38RW+\n1ZBu21TYPN+r1QrrtFDJkuI9U6G1+IjPx223mlx1sd0g48B7X/wil7stm1XN1dUVq82KZtVQtRVV\nLjRair4uLczlOplyNX7O8TMLAWPMV4D/DngXvZTfEJH/yhjzCPgfgV8G3gf+oohc/6TjFTNeTV5m\nX2gp0VLCpgjZ/PXOYnB5UZER3oHueOJ0OpJS4tWrm0wi0XqCxhZjS9lsbasCYcl+K5tm+W96feF/\nf5a0XT6kxVzNaa0LgG3pZixDakUolAV3nwxyn5+wPM99ToHLsfQC9LlKqasxm+rFL9fU1nkhj0NP\n5SvFRaxBYmBEBbRBEXyD4IygVdojWAEsFiUpqfUTpuvQe0pUOdW4uBLWaM0CDY9KLoQyr+9YwN5s\nHjsMKQuI4kJKSoQwTvUMtETXzLMogrzM/X0AtoChS0FujAE/sztTEoR5TZjJa8nZlxmZ1ve1JF7M\n54pJc0n2+wTjyMOLS7xzU5Whw+HA8Xjknd1jZTiGGT8q5yjW5pLJaPmMXhz/mOPnsQQC8J+IyP9j\njNkB/7cx5reA/wD4uyLy14z2IPyrwH/2uUdagBsiGl8vufplFHM4xoBDE2WcK3zsvBmNcDzlCT0c\ncc5xe3ubKZxHvKs1Pm0s5NJjdbYGhnC+6Zc176ZryIvtPsd9CfgB0wYv3y9ad8kXgBlIKgk1fhFy\nK9PiCkghcnauzxI6y/j8vHhk9tuNptkyYSbqP3d9z83NDau6+OqJNEYqq0UyjBhSDAwx5vi2pak0\n10BFqCBpxKI+uDOGFEdSUEtAhYbFOvXjK68WhqRstiNan8E4rS/ls7DN+qAs9CRClIQt5q01pJjD\nj2g9gBjGSZkYU/ITAuM4E6DKPM9zYicXwCzA3ml5FpwjR4aKdWisFr4pz1GsQXL0KoTMecgYTAyB\nLgUO40BlDF3fEZNWghrjSHfqONzd4b/4hYkPsFQAy3yL8zXnzq53CWL+tONnFgIi8jHwcf59b4z5\nNtp56C8A/2b+2H8L/J/8BCFgc424JMWc0qovgsE7MNaB0fJjIAoUJY3txhimCdKH6/GuBiz
Pnr3g\n2bPn3N7c0p0GqgrC6aS15hIcTp2GnHxFkllbLGm99+55EktlIcHC989jKRSW/PtyjKW7UcJz9jMW\nH3CmMQtTUSkVsxBYxuWX7oBzjhgGvLP4VUPT1jhjiRLBJMIw5vTYI3d3e0LjiEF9YAd4EZwINm/U\nSEJQTOZysyYMPZKi+scxICkixiImMZwO9KMCeStnsCScyem+U2Q3Rx/GPrsKWbBM8l8QNIkIDEly\nhMTBmBSQA3UFnXOYVAq7qlCwDtq2zu4frHKFJmO0IlPf9/q31QIip6771BxaM4dUvfpF07NVNypN\noU5ntemJ8i/m5+hyAlQMgTAO7C4vqZynrWreffddnDOMKXA6nnj27BlvPXx7EkxLmvN9fodkvOD+\nmvks6/Hzxj8RTMAY88vAnwZ+G3g3CwiAp6i78BO+Py92UPRVNBwwPQwlRQgGfcjWWPVPxU6+qxFh\nf3vL3X7P1cuXfPOb3+Tq6ioXAdGy1a6qWbV1LrpZIaK5+cfuOPlhxadfbuxClDHGTD71EjO4LwRK\nBtuyIk+R4EvEvhCCynnLZwo6XFUVzlhsPvc4jmeW01JDLLGJshDGcaCuKlbrzRRGk6TsOmOVtRhC\n0GiFs1gjYDykoCZ+EPXHRUVwAZ4mXoT31JVm7mkh2ESSgAGq7H611lBXFVWlyVfOoYItRvqkJrS1\nFouhTGMyhphzxI2x0zWkEkkyjuAWrhBG8/DJx056/DjmSEfU6lBLy2lZ0HS52URk4iOUQh861/Pf\nwzBQt6spr0JzNWZrrcTqjdW/x9NAVVkutjtIieurKypnefedt1hvV9ntgg8/+IBVs2G32005CssM\nxuXPKCVqMK+H++OnsQh+biFgjNkC/zPwH4vI7VICiYgYYz7zKsyiDdnjtx6cv2dtfvDnNFcRyYsU\nQOv5a+WfPCkx8snTp9y8uuH58xfc3txq5RzR0FhKIzbNXIRhyFV/0qIf4MK/X8bGJwrt4prKZi8b\n7z4gt0y1XS6wfP9nG7i8Vza2WXzGMJ/XWq1ruJjHSUPcjxVD0RZQ+blaUUzqCsVgGYeBMZfWjilo\nhSJJ2CQ4b6YqwrX3JO3jpcLXGizFj7e5W48mEBFLJFHP3dQNlbNUzuKcwTujZrWBFHLYL2td580U\netRImJn4FkKu6xgT1kJlHSGWylGipSYyscyQGMZcVj07DFoSLof5nMNljCeGkePxlBuy6PMoNR0L\ngS3GCJnZqY1qxtxHcO4/uHTZpFx/+b5EKm9VGOYIys2rV9zc3NC0NXXl8bXHOLi9vUFE2G6387Hz\nuM/ktM79WGvgpx0/lxAwxlSoAPjvReR/yS9/Yox5T0Q+Nsa8Bzz7rO/Kog3Z17/25QwBFOKKBety\nkuT0eZII3hUzEkRCrt+ujUZOpxPv/+CH3NzcsL/dkxI463FOcAsUtzTf6HrNtrPOs9ntzrT5/clf\nao2Y5nZb99OJly5AeW0JLpZ7KedY8tyXYUYWn5GYzr4TZW7AseT3n23+hUVREP5+VOQ/xgRWk5v6\n7sTQdxhnSRnNt1GosdR20YHIwWhUE4vN1pAx6q+bpBvCGWVxp0hMacqMbHYbfA4fVs7inTaXsSRw\nFqWC5ZwJXzHEgMlcBIzG2oXsNs6TCMYSUkAoQkNRfodyGPphYBxzteFKzX7JqLPzWmdg6DpOpxO3\n+1tlaNZmEgLH45EYEmI5e86bzUaTvvJ9lOeif3y6Sew4jhg7m/SbzZqb62v2+z03Nze8/fgR3itI\nePFgxydPn3M4HHn48GEuODP301xaqCklKucXJLmfbfw80QED/NfAt0Xkv1y89TeAfx/4a/nnX/9p\njpckTUw0g5mAITs1iJiio5pmmrOyTscDla/x1rNdb3h4+YDL3QW3t3t+//d+kJNo1Kzebi4YgjaW\nGAZl9e12F2AsYwyTBi8m+jJaUIC9LiO5MAN8JVwIi4Y
bC0S6HGPJ2iv/gKnoZKEJr9drYk44CSHQ\nnU5nTUFKks4SSJwQ64UgSilplSPnlBTV9+qbxkgcBoxbsOJCBEZM0kq9m7blrYcPsMYwhpHj2DGO\nPaNEbd9RqZWgIONAVTmqutYmHwjJOEJOm7VGiGPAiMUZr3UiY5y4E+rhZRM6Z3KWeTXWITZ37/WO\n2uUqTeNIf+xI49yDAdTaUMAOQmlumjGBZfq3MebM9avrmt1uR+UrKu/Z7Xa5y5DkCsMyFYotlGwl\nII0T22+i84hk10rd1HEckTiCRK4R3npwyXq1wlutDfnhBx+SPhRc7fmlr32VRw/fpqpqxnGc3JJS\np2JZ3aisuTJX/zjafzl+HkvgXwf+EvAPjTH/IL/2n6Ob/38yxvyHwA+Bv/gTjzSRGzIAlnShaVce\ni/eKkBuBYThNKHdde1Ja4VxFisIxAyvjGDhl0O94PDKMmRAThJTDSs55mlx+agnkFR9+s9kATBx0\nLfwxC4oilZcgzJLKWqit95H8Kb67GFNMOp+7CJzCvx+GXotdFoYi5808lynEJX23fN9WlhBGbagx\njsrCi5FhCLhqvq4YtYS2EwGjrcLWq7VGQ7JvnUIkGUGcpR9G9flRZiAYYhpJJnA4jQxDYMzpvU3T\nIES8t9SVZgXGaEjB4kwkasq/btzi7mTWm6ANWrDgRKMVYtTsL+XDClOynC9lS6TQnROJSDorBANM\n2ZvLsOE4jgyZNNZ1nZJ+ai1lHoa5sMh9DslyzM9b/w4hUNUlLGm4efWKVdNweaH5DE+fP6VqanbV\nFrK7ejwe6fuet99+e+rDULCKvu+nEObSCjw/9/8PwKCI/F/AjzvTv/WPezxjLcTcq0/U78+3N/0z\nRmZrYYHixhAYh8jxeOL6WmsGDsOoLoWolRYTWiTUaZwcq1oGQIyansW/s9Zm70SmLkFTyq8xulE4\nr1y8FCZTmDFG4lQ5154JCzMbNqQYlG6bAU9J2qiztB8vFW+FlNmRahkZEa1zZxxYwYrRKL3kUGYM\npOgZQ6TvtcedcaUE+YgYreCrCilhsQoAGkPtHc4JGMHbHJdJoivbOcRASMoajCKq1WMkJmEYCzNR\nOwU7ByLqDjhrMxCsIUOPRXLSDqJ9JgrkYbMlkFLJiZ81LSI4A02V+wAkAbSCcBJtSDqtnvJMQpis\nD0FDftYoVdlZn+clEIOyQWF22ZJIJit5yKnIcwXrpC6slDWjBo5QQG3FpKxR/73repq6nupXaF0D\nkztNBw6HvVY6DpEkWohE13xhN5Z25LLYHct4BBMmMS2yzxmvB2OwmH1EhqxVtNe8JSSQTE81CK6p\nci8/vbGu6zjcdXTdQHfquLq6zto/cByELgrW19S15dSPOEr9QujDmM1+bT6BtWAtCbi9u5u1uDFY\n76msZeh77m5uPmXWF60Cc8HMIZuJ282WOiebxCwY1PDJXXlRAozD4I3ldHeYKgSdTge0+F0kSCRF\nS+08DgMpUtmGuvLg83nDQBhGggSiBCQ2E71U+yiWCjcjodeQmHdgiBi7gTjiTVISVTpgRah9YNV6\nDkOAaHGuoWnNlDRlrdWWZpl85JyjsplIZA2SBpZ5BX0/m/tTIZcYMVhWvsEnBQO99VTNio5e288Z\nB2IUIwkBM440dQNGawGQEsdeMY8ouUZCStq8JkZkHBX4BO38ZBVQtAaapuL29naqLdB13dT0JURh\nvz/w4MED2vUO6xusF6pac0xS2lNVfrJAkkBIaNa7c9iqUmhSLIgjCnRh5NXdnvFHA5ttS4iBw+GO\n/njkxcsXNG3Lw4ePaJqKFOfCMIfDITc0qbX5SpaYS4al4klpCrP+0RAClBj8DJDpTX/2Z6tKE19K\nJeG+77m9uc1JQhqqEtGkotVqNVkM3aBmm7ocM5LrctOMEioqLLqpOIjIVLLLmtwObXHdJdGn/F78\nTKWatosw4Ry6IoN
ZkuPMuoZlCmVK3qwSx/xcIxIDeE8wiqGMYaRjQFLmEohaC5iozDaTGIY0rQHN\nKNRFUVUNMY45SSgCVrW/NzQWvDOEYUBiVJp0UgvIpkjse5Kz7FZtLgIa2d/d4SoPlZ96EVTOs2lb\n1qvV1BGoqirNBQhB+QqClu3Owa7Qawdgl9tyIWrWj1FDfY1ZUXmv5CPvp27OBZdpTW5KkoQQE8Y5\nnHUkOzdUTSLKeMhRKGByHbbbLZv1dnLRRLS+oHOO9XpN27Zn6dtLkxxml+98HSuugkSMNXTXd7Sr\nL9A2K7abLXd3r/DeaVNbLCFEdnXDW2+9haTcLFaUAXlxcZG7GSXadvWpUDYyVyCaLenPH6+HECjh\nuXQeoitm7/kwU0pwyQZ88eIFT58+48lHTxZdd7Wqy9IHn0JsMAmclBJp1AaUy1EAwjKmwiE5fHU/\nHHefW1DMyLIgyoPUTMLSNDO7NrmppYYztXmpmsIa2vPOkqJhVHlPSpGRlOVIwuQCqLoW8xyiyS+W\nstCFcRinlFbvHbaE8TIYaCVSOUNbOWrvMGXxAd5bKm8V7AsDabD4uqb2Hrynu7vLmZB6rLaqWLUt\nm82aw+2tovbeUeUGKKULsrpBKUcwDFagqTXBqaprXKXI+BgDIUXEqHtWNr2Mg7qIkvMyrCM5cgVg\ntKU9fqo2XFyJUvC19LaMUQvSWmPPNm8ZWvpLk77KWijRrGV4tqwzkURKM+1b+QWSU6fBu1wwpm7o\nXmpz0hACQwgcDifatqPvhqlOYllPy8Ys49hTyqhNW6kIAeayY2bC3D57vB5CgHsx+s+zBKZ4vt7c\nOGpp7evra168eIExy7bWLMC0pUDIcepF2mwS+RRItATwJn55CIR7lWjvX/8ySQgKR0DvqxSbdFkE\nlE2WkgoAXVsJ4ywWi7OWpvLKn49BgTIhp9aW8zkl8mRrQ5KWS08xYKyn1GuOY8h0Vw0Zuhxbd3kD\nOYlUxlN7R2UtRCX9OJQU1FRBS3+HBEFNbI8WKW0rT5+0cKqz+vl1U7OqKkJ/p00yxCvhJ3djUlbf\nTHO2peuU0e/7qsJVFdYrH2AIgX4ckUxGcs5hg8WQlCeQrTQnQrQJk5UAOf9hirFLIsYcBs7cgVK3\nYUknXiLuJePTOTcBc0tmYRkz7nNOJzdppvU470mSKxQNAzHB0KsFe7s/5MpFI6dTn4uumDk8nc9b\nEqCWiU9ne4R5jyyv67PGaykE0rS5zi0BlQ8pt6HKrcsz/79tW6pKm4sUkmvFefHPueqNhsQKop7y\nOUtYsAiB8hAnsghzBGApAJbssyL5lwlDIqV2XW6LlgoFV92BlBLeebyvqCqHSMSIYiBGRENJhlyF\nN2K9AnNRRYKiJSaTaiQRU1BqdSbKGKPJPnEcSCFgrUEsGO9oMtHHe4clsvKG1lu86JGd0UQtV1fI\n2tL4kdb37E+3jKc9nsDmwQMeP7zQ0l85E1PvP5H6I+88eHDWkPT+vJXfdaHazAvIEX2rlZiiCGNM\n+L7LHZNV+3vvwUZskgwCG5yov5+sU/K50fJiSxp4eU7FMgTV9pWfi5qWz5fNXly+EiViIfTLKMcM\nYTg7T0gRby2btmWzXfHi5RXPnj/nYrvhq7/0FY72jtPxyHd//w9470tfxBjHKRdHGcfAOKql+uDB\ng8nabNuGsOisvNhN5b+JdPZHQgiUa1xq8fvXLSIcDge22w0pwd3dgR/84Id0nWa8bTYbnjx5qiis\ndYjYswdfrAJtKjJOr1XO4auKtm0niW+tnboFlZ6A4zhO5JvlAikLeRl3LoUlkNKGvOQu6v9cZgKK\n1VwIbU1tqSpHGALD2BOHAQesa0dtwVYV1I7kKkYROhvoxqgHtYrkGwTvLJXRugNDrwVBnHW0mxVI\nwhnluDe1m2LZGNjWfmL1ISqAjKh2bX1Fu60R1giGyFvc3NwyDD3D/oYYE2YcaY3w4
46ZfMp76VqE1EoL4l9gbUzpkxJxkmVDyhtt++AJiKpJWswk\nZSaPUi62QgdPw3QjUhi/836n8rNppI1a5NL5qaazaFseM88AeVlw/v7z7OxsY51jWa+5ev0qi+WC\np556KrpheY6XC17rcVcEgcRLH6e/IBfs1q1bWCumGVVVsbW1xcbGJtaKB16WZUynUzatk55qL6Ig\nRKAQkNQqug8Nda5M/spG4RVqw8kptkfO1TCBiIK37u/zo5/9LJ9/+pvw73lnxBwELEoTesHJrL4i\nbjidXj31qU9vzrRZ0/fGKeO4/zxOG8e/F69mXGzETahOPe+dYWPMjtRaY7Rn9yc/TPn8ZXb/zS9w\n8F//FwTPQKpyEYn3Y9NNQ+hr4AHXiIesUGWV4A0+Mjy1VwQlbU6pjTWk8eY+CLiIo6SpxegVMDrF\nx2umT/PjfUrf83FTOzc8X/z0w/tM75mACy4Gi6Gbkq6dlCTRlSq2DyE6GulY6wOEpFCRTv/T10ZH\nlmDCP9JnkMzARPl6UM7GdSw6GOQ5eVEync05c+YMm3PDzs4WRVHw/PPPR0B0Rdt2KKU5OVlgjKbr\n7nKykIoKLUnNNQUB5xz7+/uAYmNjA2MMOzs75HlB21omlZwos9kMneWsVjU3D26ziJRT2wWp00lT\nXIneqfv/lhOjY71enZ65jjdFmztq27jwf/DyZc6vVuS/+wd85j3vkgUS03IfiRlBycQd0X9eZOxS\nGZIWixzKIWYEp5SK0vUZBYDhe9FWnFQuxNpWfiroutYon2TFTz/GgS69XhHdfm//8D9k+3//MAff\n//cQtmEQhZ0ufsW5h/Q6MhDFqWB5mj8/QsFjEPBO/i1KhD6FSauAQW8v+LjpSc8l9uTJb3EcLH0Y\nNnpiW54KDG7ER4i1ez8DMr5CMSA4H92K+k2r+iAegtjSZ8agMwNBkRndrzWVbizDn+NgLu97NC/A\ncC8AwaKiNF1wRkRtY6AOKHEkrqZsbm1x37kdiiJjuVzx/POXqSoBEbPM0HWOl156uZ+Afa3HXREE\nXK/LQAAAIABJREFUpHfre7XWo6Mj6rphYwN2d3d54P6LfXrXti1lOaFtO/G6K0u0yQh1w2q14vj4\nhMViKSwtXYj5xghUMyYny3PyvAAdB47qNaxXp8CsLAaE3rI6RmIbPE7Bvzp/nn984wZ/9va3Rutx\nBT7qHAQH+UDMGT/SwjXjWj2B/iNAbdxjHv+7+OuDYibxJIunfx9clIplyWkgajwV6e3w3MYYuqND\nvHUsnno9+//dT+C7jnq1ZlWvWaxr6nbNqqmpnSXLilOfzWH77MRaUXbS2mOM7+v7PmOJ2IOznNp/\nCRUH+mCWNoiUBjaOJSeJudjNiUBLXwL40JcwXdfhwpjO67De95iO877vhBhjpBNg5NWdd5GrKANO\nKlqXKK8oskx8HmM2pvMM28lAV1/WEYllSvOKTA7JBBLmMmAK4RRInWXRNyLPo/2eZ39/n2vXrvH5\nL3yBp9/zbqazCavVkue/8hKTySRiHpa/fu4Fbh3cZGdnmze84Q2vuf++Whuy/xT4j+Kt/ByiMXge\n+BlgF/Ei+KEgxiT/lufRfSZQ13Xv65d6yGOW3e3btwFF1wld8+bNmyyWa24e3Oav//p5Ohfoulas\nzidz1LqWDMOJxztKHHuTaWld17RxTj0Fiizq6vcoLy6qFPso4JDze9OSlx66xKNn9/japiEv4gYm\noeygTWohCiNPgbSVElc87mut49TeHUHAj0679JA2oSxIMcYYUPOESHtGbaiUEvvhpPRpbqKzp15P\nO4+3MsG3WixokkvSYsHB0SG1bWnalnVT4/2QrdypautHp5oEUCclkoIQ4rBM7Jf5ADqmyio4hlFe\njwrRDtmLwaiK/gHyFXCOHt3vrw30WELXdaK9rxCNgITgj66Jj609xpsUcMnYVMmpr0zWOzQpL36B\nw
SjJ4JSQy/D0mUbqaCRi0p2lnVKj9mICjlUKZEnQVrKCznp8aIcMMWZXxhg+9alPU5YSkJeLhrax\nwqDd2uPSpYu8fPVltIJm/XegMaiUug/4MeDJEMJaKfWzwD8Evg34n0IIP6OU+l+BHwX+5d/0fMkx\nqCePIDd4MpkIYhs3RRFNIU0meMDJ0TG3Dg64efMWx8fHaCMUUoXcEHlucZVJkVest2RTJ2CpB/76\nOlefWsxCFMoJRqPzXAJIU7Nar2ialsk079P81FLrwT/GpxunTr8xOJhq9zGoNy4N0jWQMmB43hDu\n/Bqf/HekxXaEotvhxFFKiYCJdXRNy3qxZLFYsFqtWCyXHC1OaJ2l847WduQq76/POHMhDAKaeZ73\nijy9og+J1hqDV/pKdf8ImE1z8KkUICS6c/on4yxn/O/CK77SeySWAul6vuKGpHcUBPuRGl36/GLO\nIr+hlcIFBOtAWoDOaLTXvb9ECsTyXMO9lLbgeEGcxnoENwn9kFvXdfg4sCbve1AlcnnO0eEJTdMQ\ngsJ2PpZtnrrucNbTOsfBwdFr7r2vthzIgIlSqgOmwFXgfcAPxJ//JPBf8TcEgUDoRRCEKun6hTSb\nzVit1nSNTEft7OwgSkRiOnL1+j4nx8csFotYy/l+MyyXS0BqYyEiETkIFm8HRVdSqm/twBuPm0M2\ns0TksiwJRuO1JjQr8RFYrajrBmuruJkiPBbCqY095obFc+I0SNcvAj36c9SsCsPJciq1Hv18HLRI\nNXQ47a3YRZfctm1x1vWTfVprfC3XuKlrFscnHB0dsVyvWa5XHC0XWDwOwTqyrIh99NNlBoBtW5ro\nSiSOxkIA6vfM8KH7zZzec6JiDwBt+rxDkEuPcSfn9Ncrr08C1yKUES957CS8xueQ+xdbqLHsEGqA\njJIHJaWEieWDkIgUou0pWaRPfH59ejo1RO0GkJJTqXEmRXQltigkkzVak8XSo4tMwvV6zWMPPcL1\na9e5des2VVXivaOpLQf2Nuv1isVC9kBZ/h1gAiGEK0qp/wF4AVgDv4mk/4chhJR7vATc9zc/m7DV\nuq6lXa/o1gvyrQ0mhaFdr1AqEyVfk+PoyHPpn4ZWceXqNUxecf78A9xz9gFefOFlulIWXhfBoBRh\n+0EPL5F0vV5FQpAhhJYiN4AhoLl9smA+maKCx9mWZnHE5nwDgqLzljzLUEWGLzJao1CqRFkBH7XJ\n8JkRSWqg0EJRlRahR2d5/JmPeoqaoAsI8trO657w4kinSMICAiiPx9HaCHYltNx5bOelDakURokX\noSYQnGN54wa3rl1j/9p1Ll9+gb969ln2LtzPxUce5R3f+A0cXr9CV9ci2aUKrFd4lRPMhGWz4GCx\nook8+Gpi0QQKo9koKyoNIfbsvbW0NlDVHW0X2NmaUeU5eVZE4kucfgqIC5D1BNdhUfgQW5zG9Ket\n0YlWHKhXLQQj5YF32LYD5zEo8fTLRHrOukBrLcQMr+laHGCR5xSLculQSKs08hw8GKeZlxu8/fZt\nfuz4iP95Z4NfM4pJIToNRZ5RZRkmk65H23Y062MIYFA4ApkOdF6mErOiBARbyrKsl9FLcmubszl5\nkaE01Oua6bTqS2MdYFLkktk6sePTKic5Hv/FZz5JURRsblacPbvH0dFt2q5BKcfFixdo2w3Onj3L\nk294kk//xP/4qrvvqykHdoAPAQ8Bh8BHgA/8O/z7kQ3ZPX2IP3UaMqRlxFZUlpto4tH1brg2zk97\nL/PXTWuRKXpIvoUprUq1fd/fJUpkdR2BEW99nD4iqr0heFznadqaYlJinaZrW1aLJW77TBxmkS6A\nNRZlBJ333gt7LDCIWsZFCAhFNMvRWXHKQyEN5YwzhnRqAa+sbf1w2gk4JRmODw4bAuv1muOTY46O\njzg6Pubw6IgGQxugnM85tz0TD0Ln0XFQZ7VccvPwiCtXrnC0WrH
uWlovxhtlnjGrKs7tnmF3cxZJ\nUTEjCVHAI4J0wwmdPkfsH3pFguZAJgfTZ3OIGItMYA73q0fYT+El8vdTNmuxs+BekSEN5cOdpVnw\nybLM82PHC57oLD9+eMJvnt0RLcMso8wLyizHBysy7lrF0idgA6jkLKzlc44nY/vWamw1hBCwrsN4\nRaYlG1ZGMZ1O2dvb49b+Ps5ZULED4VycopVyamte9eh/VZVsbNxHNSnZ2Jhx8eIDVFXJdDZje3vr\nNffiV1MO/AfA8yGEG/EC/gLwbmBbKZXFbOB+4Mqr/eMwtiF74rEUAfqaOKWPfW3G4PYzTAKq6Ews\ndVTXRj/4zkqjKQpJ3Cn2ME6bZcHG76WaLS4I510EpIKc/PHnfTsnRvLlciklRjz5E2tNexng8SGc\nvtAptY9/VUhq2JMJ068pqQPjlk/F6um6fzQZ2Qt8xHozXb/gk5iFjd4GXUTGA+v1ioPbt3nhhRfZ\nmz0qgJt1GOWieEvD4uSE4+Mj1m1H3bUSCKylKnL8fMrWtMLPKpKlt/cej+3ZmkOLLr3X1CKNNGhB\nMiXQm3599ME7/f3ViFIC7NEfFONRcdlgCeEnXcVXlFPytKN2Y+QI/IutOT92tOBf7GxGJ+O4PpXC\nZAbfRbuwEPp2rR61IFEq2qGF1OeI62xI/UMsgYzVPdjnERGUSVVxzz1nRR2rbbG2Yz6bsbOzQ1kW\nONsxn5Q0TY1zlqqquHDhHGfO7LBzZpudnS12drbJ8ux0qXjH46sJAi8A71BKTZFy4GngU8DHge9B\nOgQ/zN/ChixdDDg9Q0CIm1QLmSQQ66GgyLKCSZVx8+YtprNNlDIcHR2zsTnn9uERq9WarCj7EySR\ngcYnCqRZ7xAzhvhe4s/apkXhyTTMJpO+fixDyaQsKfMMvOPo6AhnO0KWEdJNHwUx2ZfD7H4KYAlZ\nVkphOws2YKw5Na/OSANgqPv9MAloR/TX9FAKIv+8qVf4LirhKiGi6CynqCpmGxt4LboGx0dHElzS\ngJSHxnY0bUPd1ARnmVQFJjeoGlxnZZDIWbzterk1QjTWjOKxJtNYe4aus+LOpDTeZ5EXH2fzI/6R\nTsb02VNwG4Nm/XVIQaLv4MjDe5nhcAmfcHHM1+hTG+9U7Z8ICzHr80EIZJ+YTvjdWSWb2XZiRa7A\naoXzIhZrnaVzDm0UKMGQuq6jtS52HXQkFPX9IbROGgUAXizP4+epqoKua+mamq6ued/73sfBwS2u\nXb3K888/z+tf9xgPPfQQe3t7QqlfHfPsM8/w8ssvs7W1wVNPPcW5c/eysTlnsTjm7D17aK37EuTV\nHl8NJvAnSqmfA/4csMCnkZP9V4GfUUr9N/F7/+pv+5wKeo20xKQaP7zzXL9+naKo0MbQdpLif+X5\n51iu1qxXokaktLgEi87demgVEdVtEwBjNC6mokZrQmy7qdilaF0rc4dK2Io2Bgytc2aTKVUmv7eu\nV6famc66SCGWlDzTSvQBEnOO6J6MLAYfdOS0K+HVq0Leg6yh2M3wp1p7feofRm0nM1iypUeRl9gQ\n8LZjNp/zteuOh3/v07zwXe/n+vd+L5aMoEUVuQodx0dHHB4ecnR0wnK1pG4bPI6yLJjM5wRgXVds\nzObMphPe03b8wJev8Ft5xWe256xXa9brFdVkg42NOWfO7PalmKT/Uorl8VoVRYYpRFtRoXrr77jG\nhtQferBYmH/SIkzBZhwk++zR3DnbPwJYx4tuBNqiUjfJUCqh3PoQUB5c52hd6ANeZjR5XtC5JrIm\nhem4rltWdYPSGUVVUZUTOSTCqHOBj7FadIWcs9S1OCfXzUpwL6P4kz/+I+FHBNjd3eHGjetcu/Zy\nFNTpuHj/+UizD2xtbXFwcJOTxaFkNyrw0pUXmc2mbG7+3ViTE0L4Z8A/u+PbzwFf9//pCfsbPkTs\nlJ4nBthstkHbdiyXK44ign1
0fCRIqjKAqP1mmaHKZZoqsRDHVFuAPJd+q6SpXcQdBJTKsgxrDDpI\nPd8vxhBEKozQR3SlFCbT6BgUOiuttEwrTGQR9lhAOu36E0/+T1RvlaSNUY4cVMQrEk8/tvY612MK\nSeQjjauqtIEYuhxKazmR8pxHfv3jzK5c46Ff+226b/5G5p/8C85/5Je4/A++i+Ubn6ApavK8ELKN\ndyijmUynbFnLO9cdH/rKFX7+4jn+8MwZJlXJ93/6C1w4WfH0F57lz77+DQRvyUxGWRYUZUFe5IQg\nst3eeaySGj+dfForrLakcWbJoof7nzbxuJST8g7SOLaPmUF6zy5SdhNu4EOcjhyJVvWOxLF8SjhC\nCBDUIDNnglClQ5bHDGHAI1LHQJylcro4ICXloBDCdORyOD82HnXxMJKsNzepPR1FWwsRwsF7mqZm\nYz5jOqmkXY6nXtc0BGzX8vDDD1NVIp4j/gMTRGHIk+WGw8PbOGfvfsYgMGqRSQbwihZYXAB5XrG/\nf4ujoyNuHx1xdHRE27Rok7G5ucnh7UNMJgYfk+kc7xxNXEgmjg2nmj+l6tYGOuv7DZQWh9YqcsSH\nFpCOKL/Rg0S1MbpXSlYRxbbOCgc8McN8VPgNuq/t+wM7gX8q9GKT49Ovs11UiZG6OnQxjdVCfz41\nu6/iuE1IIGWIElqKrCy59v0f4sLP/QoHP/Q9FGXFhZ//ZSYvvMSln/+/+fKb30iW5WRRF1EbTVkV\nImKi4e9/+fNcWK743iv7PPOuS+SZ5mNPPs4Hv/jXfOzxi2gluhBVmTOdTSmjGKZCQVB9IA8h2X5F\nxx3V9TgQsUQYn+p9fT9Wh4pdhFTjp/HnRO0eTtwxX6DvDQJRWjx1XeLveYTUNEz9aVSwqCyjdYOC\nUMIg0FrERo2hcx7rJWiIRXsmOgJwCosKweGdFoEYI8Y4ICShEDxFnkdBGekgzWdTNjfnTCZT2rZm\nOqnwTgaMLl68yM7ONtPphNlsism0lBO2ZT6f0baN3JMo1vpqj7smCIy5E0PkH1Bh5xx13XBw65Df\n+PXf4PD4iOlsxv7+Pucv3MeFC/fzyCOP8vGPfzy6484wOmPVtRA8RkFm0iZPGn+ezrteaqSvua3D\nBTGJFOsu0RsooqglRcZsMiXHk2eGsiz6WjTSArHBiTiHF4ttRcAHMUrRKin8Rra8lzFn5RXOd9KL\njwGwJ4uMAU0vtFOjNEYJo20Mpsp7kaDh2ga8QxtxtnFPv5sXn/6GKPhp2f9H/4B7f/oXuPkD3y+u\n0MZw6bnLfPOv/xZ/+PY38dyD9+MJnBwd8/tvCbzns3/FH3/Nkzzy4CVs13F99wz/2yMXcbbmrDEU\necFkY47HEBgCuwBqOqbNQYaSiGStVlJ6rTXlpOgxkbGG/1jQczixVSrl+yDRZws+SFbgBkdppcwp\nooHcg5QZpudNfSXpy3skWOtM6NAuBJwH6w123aGVoawmBIfYp9c1HhUlvkryoqBuuzjlmL7iWgsy\nwahU1peOYlfmyfIinvwwmZZkmWG9XmLbjieefIILF84zm824fesWzllxyc4NITgODw+4cfMGTzzx\nOh588FJvlfdaj7smCDBq+Vg3DHzIwlZY5zg5OeYTn/hElH2Co6MTPvShD7GupSZTiujWA13Xcrg4\nYb1e9/bjIAsrWWqN2XhZluGDgHada7HeoYFMKcgUqxWUW1tMZzNmZc7mbMKTzz3Pd33xWf7oHW/B\nuZbOZgSTDfWmD4P+nlJo79He9OlkUHJSAYgar2Y85RYgTs2lMd44CRYyYY9F/a7gA0GH2FYbtb+A\nPCvw3mGjiq9P7VetKHTB+hvexXPvfAf1qqG9fUDXOd70m7/D5v5N3v2pz3Hy9W9BK8XmbMrh7hn+\nrzc+IQlLY8EHMm3IqgmErEfNM2NwJM7EICmm42QcRCs4Ly7AmiHodU4Y5onhN26Xpk3eawYEF
1WF\nvWhGOJvoB30PPtX0KfMat1hhyDz7EiSuRWnx6VjCaTEtDRKwfReVodFo5WT02AVWdUNdt0J8Kwp0\n1KhwCTgMcuJLhyAQghPANFKoi0Ksx7tuJT6EGxtY23Hjxg22t7Z45OGHKYuCKy+9yIsvXObBSxeZ\nTqfs71/n8OiAe+65h0cffZjJpGJre4vpdEpVlYQQ+vH8V3vcNUHgVCbQb4QRyu4Ftd7fv0FZVuzu\n7bKzuyuqqy9fI6jA3t6upFbxhOiHdWK3ob8hiQOQXo/Ye4bILZcFlGuDVwFrA3XtsbMZSlWioW8M\n3/lXz3Lf8Qnf9Gef44+/5ZuijFhsNfabe2grpiaN95FIFIgz66IzTDhNVYYhE0jCJt57ynyKNqNp\nvb7tpkaDRKcpx73EVd+18DJ4E+EJF0sQheKZDzzNYx/9bb70re9hvrEhvoZZRrtaE6xs/oZWBEQJ\n5DqggpxmxA0fkEAzYBankfneEak/zeXE79Zt71GQMrbxTEI/B+E91jOQpZw/zQeIrxXu+PsQAEK/\n4sbgYzqM+la1oLeyjrRo9nmpbshNhlamb6V2cSArz8VyLgTueE8JyB3ej3MOGw+voijY3t6myLcF\ndIzq1nVTk2WG4+Mj7r//Pi5dukSeZWxszEV4xKiowRk4PDzEGPHx0FpHf81XdsXGj7smCAx3awDM\nknl023UsV2tOThZordjdPcODDz3E6598gs999nOs1kuqasK5c+dEW83LRi+Kgq7rei7+abVh+kAz\nfk1AFqX3oI2QTZxn1bU0MS3PlGjt/coTj/BdX3yOT7/77Yh6XiTqxA3eP3tc+KkOlvQ/pu6xBeiQ\nsiP1+lV8P23bsF6tsanf7j25LqINWDQx9Qrt4uuOAqDMpCcPQINRBSHp5fsoYxJCD6wpo9GZ4eBr\nnuJPv+7NODwzY6TkyXKaLBNRTudZhxxnpYWaa8DHkV9CFLIAZaRjoYm26AnAc64PAklFmAi6tm1H\n24r4iwQAfereOJ/UgZKOYMoG3KnADsQhpDt7TDCEhogojJSZEmiYghdBoYz8vjEGgyC8SmuKopSS\ns17RWpn7R4kVXFAqflY/dAVInYUQOSkpEHjyXJHlOfP5BntnZoTgsZ3IgzWNyIcdHBxw7tw9PPn6\nJ9jd3aVtW5arJWVZkBcZzlpu3TpgOq3YiN2Atm3RRv97gAlEqqYJAZCJPFOUYGY0fs4nP/1pPveF\nz3Hl5Rf5z3/8RzBaJKnzoiAzlu3NKdVkRmYC9957lrqucQ4yQy9Q2nUdRinmU2FYlbmhWdXo4NB4\naifyYHlekOsMpVyP7IYQWK4th41nwzoqbTFNx7Pn9/iXj9zPhQfOMw8dKp9BWaDrQBEyFI5Oy2hp\npqOsGQHXDpyFLJPUvnZrrHdkQQuN1nps23Hj+j4nyyWTjTk7Z/copjMMBpQnGCv4gXPgBdkoigJj\n5BRtrcVUkx4nwIlhqiGQRUEV68TuLCsKTvwSU02Y6Vm/gTIt6X1XVJAXtLYjWEueH5E5Ay6ADYTO\nxawtUDcN2WyCNzLROF11fdpuvcPGDM8oTaHEMzCLfAUHnCxW/RitUoMNnMlK2k66NUYFKoQ/b4OU\nQTrPcK24TdnWggtkHlGPMdGuzHcS8LSoPfnYjRF2oacymlxFX0ECZIosL2naVlqaIMKlaAqVYUxJ\nNslZLGtaK2zKXHlQrXBDdMDWK7TOUWnCFIVS0gqfTEvhBDhYrlpevHKN1ark3Ll7uO+B+zm7e4Zn\nnnkGpeCes3s888wz3Lp1iwcvXeK9730PV652bG5usrm5SQiOtm3FgbsoQCmKSmZx1vVdrjEoIh+D\nFJe1Is1cFAVVWfKpT32Sw5Nj9s7u0TQNRS4nh3WOra0tThZrtMni5ndY67Au4L3q9e6m0yltXYvk\nkve8/PLLPY+7cw6VZbi2Bdp+Tt07F2tIeS/r9Yrl0lBvV
pSTEtcpurbl6PCYe+6/QJHlAtblCuM0\nQbnhBCANFA2fe8yIy7Wo3DZ1g+4c6+Wa5WLJ0fEx27tnmG9vMduY09guClMy4gr0cEqsNyPByItE\ndnqY/sUFdSZETT9t0IWmagsBqoLCuVHqrBWZycmNBAyPpzI53iu8cgTEgkxKMY1yoJXpT2o/kgqz\nztHYaCWPIteaSVGSZ4O9+rgEms/nvf5jcv1NmV1nhQbuUgYWEtlM1KKFiCRcfjfKAHthF6NjqxKi\nXTMm4Rh5Lq5QIYjkWCrHgsdoGboKIdC0LaumZl3XMsuhpB2okq9QCBR5gbWSgY05EC62DJPHRZ7n\nAMxnU+bTKZOyRKO478J5losFR7cPMVoxn02AwGc/8xnO338for1Zs1otKcsycgxMBEXlHt71LUI5\nqQTQAgH1uq6jaxqO7CGHh7eZbcx5/LFHo9Nsap2Jzn9eFHgv9VCeZXRZJiCYH9BlhXAHNjc3gcDJ\nyTGr5YlE/CAp7CB6SV9PiemppIJN07BaZyxWK/a2NsF2EKRvW+ZFrw1nEFQ5rczTFNWhNoZBpVfU\nQQMZipPFguXJkvW6Ji8LZlublJOJ6ARE5iTeSzvQR4Q5pI2QrmN0y00MmJiKp2Q4dbjSl1CbB7HT\nNM7qlfTXtZGWV+YDzngKYyQtdwM7T+g+igyFcghg6UZmZbGWThlI8AGntGyXEHGKTD5fytA7a3uM\nJ7UCnU0y47FlGEsQH+jHfnVUNEZrVKQryyMpGqm+O6NP8UCGe6P6Am58/8btWINzgbYTgZtUZLg4\nSgz0LVq5T6MuThjKl/T6qcORaY23HcuTYwqj2ZzPyZRieXxMVRYoAs16xfWm5r5LF1+huTFWnFI6\n0e1Hqll3PO6OIBCE4502S9PU1OsVJ0eH3Lx9DQ1cuniRt7/ta5mUUTHGWtrIBMxMRuM6rl27xmQ6\njSj/GufigFFcLPNpJWy1PGdvb5ejw9sCFLYta2v79DPdaJnUanHWU5WldCiWcHCU8cTDD5OHgO2a\nfnaeIDRVQ4ZzAaV8VMs9/ZUeY2widBYVlZIObt5mtVqBUpy7/342trfwwLpt0UVBsMSTNQlmRrZa\nes6I/islCjhEyfS+blaRYxAZTDLNaDFR8EOChIhZBh9EjDPL0FlOhhLz1iyjsx6LjxwAJ7+HmGco\npdHOo5wfrTKZzUcrXJcCiCPTUToc3/MyTCyfVvUa07W9F2LoOpo2WpjFzyXBUSoTHenf2sShLX/6\nWksMju8jBWSV3KUVXXQzEg7/aTxBLNBU72oVTIZTMaDFwbMQwHbJwl7o3bmRFnLajEopvAuM3R3S\ne2yaBkJgtViwXi5ROLY3L5HNZxzPpmjEmOSobfvMob82IeltDKrGUpLo/ndf7XFXBAGlxHNdqUYc\nfmYzqrJguTzhl37xl3j4da/n/L1nWC+P2ShmPXCkVcatWwfMNzaxzvPMM3/N/Q9cYrlcyykaJZkm\n00rahnVDWSbL7sBb3/oWnnnmGfZv3sCt1ywWa9Ep8KJvoALkRkQfQUwhV+s1L15bMd9+msmZXYJr\n6Zo1mcpwIaCJ6kSFkQ2mBnT4FQSoUTZA6/B1x9H+ba5deZkze3vce+E8m3u7UtpoRVaVrJuGrXKG\nUcJM7DsfDJ0EFU8DpRSdsqfQb6ciEBiHX1AKlRuKMmf/2r5swDwnL3JMUYHWBKMhz8msoP7aCo01\ny2TcN8tzQuq6+ICyLVpDFklCTd3iYv1tXSzVgsi+Bx/ErcdD6WFrZwvvfcQ2DJcvX5Zx281NLlzY\nQquMPC/pmpZ6vUSaq5KNdD5QRpRcackonI/DXEmoZVB7feU61Jp3HJ/wAy/t89MPXOCTu9uxRFXR\ni1JKAGW08BsUrDvPYlULS1FFYDYxC6ObU6ZNn5a/1mPMhSjynDzTdG3D4a0DvnB8wpkzOzz26MPU\ndc2f/emfYruOb3n/+
/nK88+zWCzY29sV67FIKEt7KiDPeXJy8pqvfVcEAQh9P9d7kVjOM0PTWPav\nvcwHvu2D3Hv+XgqjWa3W8bTThKB6rUHvA02UCVut1rRdxyOPPsIzzzxDkVdMZ1MObuxjrWUyqdje\n2WK1WOK9ZzqZsnnmDJcvX8aYLqZzahRshBOutdByg1HMNjfZnU7I8XTtCmddzwBTKunna8KoBny1\nSa7+hLIO27YsFgtMllNNp0znG5gsp21FNUaHjELnMqKL72tKpfxAR2bQ8xczDRVLkyAml7EdLr7C\nAAAgAElEQVQWUEpksnyQlN1ai47GnyHThCJDJX8DpdBZGuoKOG0IvfJSwAdH5x0E2XB16MhCFmWx\nI4subtRE60ULOKY98jpa4ULg+PgY55wE78mE+XzOarXCe8/x8XGfofngpXWnpOwQoVEnC1+HXn8R\nJIXPTNSTDD4akxCdoOSReAw/+NINHlzV/MBL1/jzc3ukiCGg8YCRhBBYrBpO1g3rthHZdKWlZCEI\nuCxiBT0vAxju2eg1NcPgXJ7n3D48YmM2YTqp2N7Z5YXLl1mcLMhMxtbWFrP5nK7rODo+5uWbN6Mh\niefChXNxTYUeJBRHLNdjT6/2uCuCwNCoiydlcFLTuw7vLNtbG8xnU1A+WitJ2t5FH8GU7mttenHJ\n4APb21sYo5nOppw5s8PtmzckzY5A3Wq1YjqdUE0mZFXJy1euEEJAq8GkImkbpBFTrSCYnKA0ZVUx\nyTS+y6nbhrySNLHfa6MT+BUYQPrsqUywlrbtqJuG6XxGOZ30m5LonOS8JdNZv+iHKTspPXpy1ejQ\nCWrAAgixBRYG/gKx3eatkzFtYrmsZR4iNisxIemZRL+GyDHwQQBKF2RqzgfBEZySzeDipnBeAn1v\nuR5rc21UH1xTqzBpOXjvmU5nUSzWs1gumc/m8Zqo/kRP04SJlDU818BDSFZkPkaAhJ/0tXsMnD/7\n4AW+7/JVPvLQAxL844sIDpDen5KRdbtiVdf/L3VvGqvZkd73/arqLO9299sb2d1s7ttwVs5oZI8j\n2bKWiTSSrRiyFCuOowBGAidIviS24G8BjMgIYCBAvtqxDdhxbEdWZMuLRFujyENxPBs5HC5DNsnm\n0uzu23d/13NOLfnwVJ333NvN4XjsD1QBt+/a73tOnaqnnuX//P/SY6E1CfoV7y4+AdUmQVO4lpKX\n8mEgntxpzGaL2Dym8QGquqFpLPsHhzgXyMs+Ji+4vbtHVVVMJhOOj484d+5MW/b23sfyoBxgJ9uw\nT46PhBFo20QRhdnZbEpmerhGcWZ7C9tUeFdTlDmNV/R6fZz3zBeiyLKYL6jqhtWVNaqqaWPlNOGr\nqyucP3+OV196iZ2dW2QRQukay333XaYoS27v70dUQoyXAYLUdI0Wt7Va1IQigyxn7+CIjX6fMpYc\nD6eHmFyjiwI8EmOz7Fg8XcM+3RhTLSoW8wW1tZy7eC/lcEDjhBS1zAoWVcV0OmV9da1FzC1x8Z6A\nbk8ZlG4JRl2HvTcLQuqpkc+h8ejG4qyFuiEPAuUNPpATyOOJF5wndw2mcdhFhZsuyFHMvXR2Ns63\neYOk8aDTprAgDUREKG/yYCQGDzHhqJUmzzPW1lfbCoFzjl4vZx7vfTafsLK6JlTfTtM0VUeVWIx/\nKt/pSAmWksK04J/Uw0Gyi+3zMMbwrTObfH17C4gnatzS3vu2qUhHAZDGWRZNTVAmGpWl1oC1olAU\nnBha8mVzV8Kw5LlUBeqF8AHYCNiqmoL6aMzewRFaZ5iiR24M03nNjVtvcOnSRfr9nDfefJuHH38E\npRTHx2PqmCdIicaqqjCZ9CZ85HMCQAu0SUmOjY0NRoMNvvTTX+TcmS2yTONd3WY+09MrioKqtmhl\nWFtb4/buHt4L/8DOzk02NtY5Ojrk1s0bmExzPD5mc3OT++67j+lYtAp3d2/z3devMp1
OI6JLHtL2\n9iaJ1PP4+Ji6rqmbmoPjMf/2ua9SPfoIF7e3WB0UBGNRIURFYU0WDNJhfTIESCdON1ForTAhLWzN\n+7du8vDHn6TsD7DOsbe/T783IDiPsp7J7X2sUYLMy0wEgQiZByQXc2l4VnSOUUoWZNUwPzpmOp4w\nOThksnvA5ivf5dGvfZVXP/U0sz/+OVSRYQoJxwrvWUxmjA8OuXl7H3c0pRnPmB9P2Lx+jc+88SZf\nfeQBrl2+h2w0oBgNML2SxnuKshTAVWOZzGbiBcRejcbGWj1SkmtyQWBmlWG+mLG1tdXO0Xw+xxhD\nv98XWvQI/hKy2az9O+ccTZThRqnW+KZehYTMFLdgufvFQGgBBYZYwYhsUEu+QxXZhgIoR6hFEWs8\nnlLVDdpo4S2IvRGiFyBrVKmUhNQtEvH0EMHQpqUUK3JYXV1hZXWN2wcTjo4OGPR7XLp4idHaBoOV\ndeqq4o233uOxpx5nPB6zv7/PffddirycAwaDgeS19PL1P2h8ZIxA6tpDBWZzobk2qqQsC0bDAS44\nZvMFea8UBVorocB8PscjblNygYgL7PDwkKZp6Pf7rG9ssHt7h7Is6fV6rK+vceP6e+zv73M0HnNw\ndMyFCxfahFQIQg4xm81bHYM8zyW55T3vXr/O/dtbrOQ52vcYrvXxMVuttHTwaeXpPvNuM00ayRD4\nCHjxCurGQtbggkeZjOlkim8sNI5RbxDpPjWaJaQ2uZpKKTl54rWE4yluUVHNF8wOj7CTBaFqoG5g\nWvHoc8+xdnTAI//uOZ558B50mZGXBVQ12y+/xgP/4hleeOoJ3it6ZHMLiwa/qPjMd19nez7nsy+/\nzvN5RlMYdL+HLnMsgU8cTfnCt1/l2U8+ye2L98TuxnjPSlibQmwv1gG8yfDGo5BSbK8nxK3z+byN\nl4uiYG9vj83NTcqyaGPqGLGcyLmE7ld3qcwoJCegOllCpZDSZljiAiSsSxoAsi5q2whIyVl5DWOk\nvVyr2D0oGBNi2GW0wqtlOJBg09bKPXUFd0IITGdzGuuYzhbce88FslxAUtdv3KQscgIG5yzHsxk3\nb96KnYJ5q8DVrQ7kmaFLD3+38dExAvFzCKI5P5vNyEwfYzKKomRRzajrit5wjfl80dZiBTySR3e9\nQusMrTOsrZlORH9tNBjSK0spJ2YZ1jZU1YL5fM7R0SHzRcVwOOL8+QvC36YUs9mc8XgsScZIyTUc\nDHFIH/f+0RGH0xmTqqI/KBlog0dANjpEiLBe5qe0allG7nr/IpXthPXYOVQjRgAC86kYAePBZyXe\nxOA+GjsV8yjKWnxdS/bdOar5An08o55MmY4nHNzeRdtAjqanM3poXn/kCR5+5UWeP3eJw/feJ+sX\nFGWJm0z5+D/9V6zuH/DYc9/g9y/fR99rMgfaev5g4ww/5G/x5Y1NFgdjJsERygwyQ6MCn//uO5yZ\nzvihb7zINzY30HkmCbUIChNYhBBxqhhyhCwj04hwTKzFt+IrWrr6Dg8PGY1GFEXexv5t/B3Syb6c\nV3mvEGUO0mYXJF/8JuIuFEmbMUCbh0hNUCiNzop4zZZF00irtkqai74Fg7X9IvH3ksT2nTwE2Hhl\nStHC2UOsYswXDYuqYb6o2T5zBlDUTc1kOqEsC7Jc6O2DgsODA7JM0++VrIyG2Eiy4lwj7MjBRJ6K\nj3pOIP7rncJZTVmMsNawWATW1yRDO53PODg8ZH3jLOPjYwKKfm/AymAF5xWT6Yz93QMuX3mQGzdu\nsTPZoammPPTQQ0ynE669/jpGweT4iFePDnjttVd4/PHH+aE/8lnW1rdY3zjPq6++GiXMpty4eZu9\nvT0p2ziADOfl1NDeUKHYQ7NZ9FnfvodZDqXuUaoMHSzB1TgtXPR5nqGzLO7b0OL2lSL2jWcc1zXN\nfEEvz+lpDV7owaeHhyxmU0EiZgUHx/uYXklWFJK
cyxu0smjn0dM505s7TG7vMXr+JZ584UW+e/k+\nDje3abzj+OiA4eoKFDlOa8xgxP7ZDb6y+kPcuP4+09ffY7S6gusVvL77IlWe8aNFwZdHI6bHc2zR\nR3mNawK3VcE3z12WLH8Nel4TMo/XEqv/en/IzzaWf7a6yq233qO3OiLvl5h+r4UTu6BwTUOzaMi1\nJjcZweWtazwajSh6PWazGXmR0x8MOBgfcy54gtFENTIUmkxn9LOcQsfOPa0wRioOPkBwOjJGgcKB\n7noHGqMzqlmFVRJqLayl1++RlyVZXjCtLKEs8QGsh+MmYIOOG01Js5kTaLZVS3CSDw7yHNvULRhu\nNBq1NO8tX0TEyfSKnHouna5Gl7xx9RplrjGZQmvoDQyeml5RcPHSWQqt2Vxb5dy5bVw1QxGJvp0i\nK/ooK+jFUn3UEYNhqfvX7/clWVI3BD8lz/pUVcVgMOTipUsYYxiOhlFgwfL++9dZ1OIer62tMZ/P\nWCxmWCstnQcHBwxXhjz6+OOcP3+O55//FlVds7a+xuXLFzk+PuaNq29yc+cbAMznc+bzOePxNMKQ\npXd9OBxS13VrrbU2XH7+Bf6HZ7/Cm3/uz1H+pz9BNT7EL6ZkZU6WKUHbRey7NKPIKeBjWr2F0lpL\nUZaMVlfJeyUq6tolXkSlNItFxdH8iMV0zmBjXejVqopqOoFFTahr1LzGH03x8wW/+Py32ZoveOTa\nW3zDKBa25mg+JbMzAauEQC8r0M4TrMPVNQuluTU/EHZi3/COt/zb9R5K1egAh8cTVNDieakqZunl\nZJ1WFRiD14qJa3gm0/z25pDgF6jrN9G7OeRGSo/9grLfI8tzUfdxQpNeZhm9UsQ1pR9f2nHLsqQo\nC3o96eCs61qktoOPuIOTvAEyZ7KZVczRELRAnIMioEETZc1l7WVGgMOKQKYC/UJBcOAcKg8MhwPm\ndcN4MuX23h5VbbFBRRq5O7Ujuwg9YwyZKdu1nnIbqamtyynZNA39wUDCjnqBUoHcZPjGY22NtTVl\nUWLW19jc2GI4HLb9ImVZYiOQLI3URv29xocaAaXU3wJ+BtgJIXws/mwT+L+BK8A14BdCCAdKAtT/\nHVEhmgF/IYTwzQ97j4jjjLxxLpJHulZNdjabkZUGlFCKm0wyndY2bXZcGk0Mi2qODw5jFLYWrHd/\nMGBjY4O6btja3mb/4IBbt26RFRn9Xp8sy9vSYlLdWSyW/dcpRoz3HifX89/v73J/07D+W7/Fyz/9\nk5I9ThJqsWe8RaWdfhAxjk0/N5mh7JUicJmUbo2cXkWvJy5kY7n01rs8/Y/+Gb/71CNcPbtFdXgM\n8wrTWHpeUdoAdcPvr23wBbvHl1dG7B4fUdkGG+HFKOGwX+iKR4+n/NR0xu+tr/HW1jp1XfHg8YQf\nH8/5V6slLxa5VFuUx4Wajy0sX6oqfmulz4tFLk1GylDG8qANARXxAjaWEzMbRL+h0TQV2IUmm+at\ny2/Q5MYw6vU4t7XeGoAsE6xCMgJJq7Kua0mgFVmrcHQakRkiNLjtUNTqxGZIcmcJY4KWa0dBRqDM\nNIvG4rWGUDDo91jU4qGMp1NpSvLhjs0PLFmS4tBaMxwM0Vq1rex1FIFJnAmpPyKttZQHCXEuM60Z\nDAY0jWhuroyGnD9/ARYTnPM0zZKNKkTrLKpNERn6PcKBD24yXo6/zZ16An8F+NchhIeBfx2/B/gi\n8HD8+It8H/JjsEwKJmVZ30KIBUY5m81avcC9vb22kUTiRUVR5C3P2mIxE7epVwgnfGyjHA6H7O/v\nS4kpy3jv+nWuvvEmi7omj8pH3nsWiwWz2ZyqqpalyyCkDCmzDxI3/o31Dd5aWeX1X/ylKAMV2jJS\nqmV3M/VJ3SjFsd0kYSs82Ssl/jRGjJ0Sjr/BaMRwNOLz3/oO5w6O+MI3X+Lo8Ijx/hHj3UOmu0e4\n8YzcBXIPb/W
H/M3Nbb6T5VR1jW0sZVaS6wyD9MXjAj85mXKPdfyJ8YTN0ZCN4YCfnCw4X1u+OK7Z\nGvRYLzJWMs2K0fxsteBeZ/mZ6ZxhZhhlGat5wUZespblrOiMoTb0s4zcRBrtzvu5RsBcR0dj9g+P\n2Ds45Gg8ZjydMY+del0jkGVCjSWNMUIGU1UV88XiJI1Yqv0jnpbvGIQuniDF5FK3lw+HuPhNxIYY\n5Skzg/KJ0MWRZ5kA0uqayWwRAU4nhUbT57ROuh+DwYDRaES/3z9xf11quHS/TVPj/RIN6r1gYIbD\nFZRScf07zp87Fz2GpiUNkdtbGgP5+OBcFHwfnkAI4f9TSl059eOfA340fv13gC8Dfzn+/O8Gmf3n\nlFLrSqkLIYQbH/ImcjHR9cvynGANTWO5dettHlt9lGE2YLQy5Omnn+bfff3reA/bW2d4/8Z1Hnzo\nYc6cOUtZ9rj27jW2trYYXjjLW285irJsT/jz91zg+vXrmCzn0cee4K23rvHG1WuAYm//gLNnzwJQ\nVQuJNZWgBq11rK+vM51O0cbQ6/fp93o8R+CvPvkkf+knfoLpzXepp0dkvqFYGxIa6ePW2dItVCTc\ngJKKUdAQM/zGG4zyhEaMTZ6XKJNj9/ZZW11lbXUNtXWWd7/0U5S/82Umf+Zn+c8//hTFb/8uV377\n3/DSE0+wd/Y8Q50JPVpjaao5K7pouweHKyOstVSRo2Dt5nXWdw+Z6sDOQx/jniwj6JKDjU22bu1w\nuL7Gk6vbUUREoTcLjtX7nL29w+76Oh/fPE+GZlD0sI1nETzz4NizFfN+xhzH1DYcTxVWQ40nD5Yf\nmc/48wdH/P2L53n+7DafPxrzC2+/x2+trDGJ1ZtUpdFxY4iISmjr30opVlaGbQkubUjPScMgOgQC\noHIhiJipEto3go64AoVFSZK3ECWjfq/EoagjEvXmzi2u37zNrb0DDg6PKHujE/sqlSmXCcGlgffe\nc3Bw0CYB19fXGQ6HjEYjqqpqKxEpCVqWkhc5Ojqg7OUErViEIG3QPrTJ77qxTGdTjFYM6r4AteJC\nW/ZGqFgF+eDxg+YEznU29k3gXPz6XuDdzt8lGbLvaQS6XVvee+azGc5qZtOG73znVc7de5asMATl\nmE0tZ7bPcGtnh29+8+t89rOfobGO2WzCbDblY089HuP3hldeeYWimFHXA2rbMNufCxbcZCidUTeO\n8UTw58459vf3MUbwBkdH0mG4stJDa8N0KsAk7z3T6VQs8nTKa6+/zj//l/8C08y5Z3udM+srTGcz\nRr2sRb5141Rp6FgiypLrWPRLQgmqFhcYI5TjAlEVAJDJDfPPP81rX/xxjDbk8wUP/d6/Zbi/z8de\neYWvXLqP6WSK9sKnuHbmDN46gpKmF93vk3kvFYRBj0e/9RzDumKyts7thx5gczZBOcu941cwIXDP\n8ZSdrIfKY43blFyaTslC4P7pjJ3RCBMUZVYy9XN85VjUDbaaU1lNhYhhzhvQRY7KDGVR8hdu3OTK\nouLP39zl3cce4he/8xoXJzP+1Jvv8P/80CeX1G8sORd8kM2Y4mjnhIHZt6d9XOwqyXzQbgaFQItJ\n2o5ewhOFyKEHrbCNY288YVhogi9Z6Q0ZrA4wjaOZLbhxa4fdvUNmswVF0YuNOpGdOjapKaWW1YxT\nJ694ir5tEkr3JdUqe4JLssgzTFD4oBiNBhRZHj3KWFFAMxlPef21N2mmM9bXVuiVfbQ2ZCYHlYhS\nDcEnSZv/sHDge46QcIr/nkMp9ReVUl9XSn394HCpmCrWO8gi79CFO2epqprd3V1u7+4yHo8JBEYr\nQ5QW/nnrGlZXh5IV9g2DgQBMhB1IOg+nsznT2ZzFokLgoDnGZO3mXiwWNI1dxqsd/HwSplSxzdg6\ny+HhAd/81rd46ZWXOTw6OgENTad+e88nJu7EXGAyQ14U9PoDJJ4DlKLs9aWyo
GIHW1litGDtMYb3\nv/TTTM+f470f/RGKlRGhzLGZolaBube4YQHrQ/TmCnptSLG5Sv/MBsPz27z/w59nvr3NrT/2BfTm\nCqOtbXqr67z/6ONMhyPeu3IFkxeirYjCBs+7DzzAbHWVN++7LC62Fr3EyjsW3jH3jpmzzL1jETy1\nAp0ZsjKnHJT0h31+/f6LvDMa8k8fvZ8iM/zOU49xY32Nr3zmEyd0ExRL95qYNGvr6dCujdAxBO1M\np5yAjlwVpyu0yW1Wmo/v7PO/fvUFPrV7wN7RmJ2DY24fjBlPK8azmvF0wd7+IdP5HOtExDU16SQ3\nftmGfudH65FEjEDKachaa04cEknU1hhNEanHuwZFRGUti0XF8fGE+aJCaUPZ74unk05/ydjG/gjV\nhgh3Gz+oJ3AruflKqQvATvz5deBS5+++TxmyR8PpRMqgN6DfM6yvr7cU1E3TsLu7x7W3r5EXBffe\ney/eO6wVEs08F8bV8XjMdDrm7PmzQuRotCQRURwcHDKdzrHOo3XGYDCK7to+i8WiPW36/WGUMJPG\nFGnVDJjMoDNDNV+gjGY6m/HCCy+wUsCDly7AlYuxriwPYbmc5bNqY1NiGLSsNhiTk5U9Zou5CJUq\nxcrKKnlELXoPeV7K6Sec28x++LNc/dTHcbM5/f1DmmBZRHDTZHLE1ta95KMhutcDk5EXJaVSlN7R\nnNvkxk/8CSHKuHWLfm2oxscclQWTBx6gno5x9YK6qlhE/obqzBazJ5/gvWuv410FQaGV58BWjH3D\nMZYjFVhoRW00Vit6WUkx6KPKHIqMV9ZH/LUrQoZRAG9cvpe/88jD9MuCQXzWMl3LONsH3xLGpmx6\nIh4RcVHa2vkdyToNmCAszX4JUU+G4Oevvs3l6Yz/drHgn1QFewpGE8vK6gIXYDKfs3twiPUKZbKY\ngK7IjEaYgu8EgXV7RkJEkqZmsqqqWrKbrgFJxCLWLsi0wZgihhk25pED8/mCzBiMkQqaq+aYPGMw\nGtJYi2417DqfvlcswA9uBH4TkRj7NU5Kjf0m8N8ppf4B8EPA0YfmA6BFdKUSYb/Xo9froVXJ5cuX\n2dvbwyvLmXNnePrpp3nq4x+XLG6Wsbe3x3T2EsfHx1TVnB//yT/JweEht2/vcng4Z3d3l6PxmJ29\nXeazGuekWpDnBZPJjFs3b0siJsjiOnPmDEUhv5tMJmRZTp4XbYeiNMjAeDZFeUkgrQ2HECquX7/O\nW+srnPvkU9jY5pv62+FkbiZ0sAISy0Z0mdE4hbisWjMcjfBWmG41UnDASOLRZ+B6GbVvaKyhHhb0\nL56jWFTUsxmHO7vM37uNDbeZaA2NZ9jrUxaiLTCrFvRGA7yC2eEhN7bPsmhEVmuoNMVgRK0NMw+T\nxkJRQNmjMIbb8ynWSUdeHWBvvqBWisoojkpDXRh8ZlB5xspwgAMaGpq6Quc5o/6IQa/PqNdnkJf0\nTEYes+c2cjsYvXRUU6NYm3xVmqqusN6dwP8n3cVkHLpJ2EDSAZTaPl4qUf/gyiV+/uo1/hqOsVVo\nlXG4PyM/riNPgcernKLM2sSwszXzhZCe5HkeCWikTNllsm77+nPTdnfKkl+KpqZQIs9zNjY2GA4K\n6loS4lprNtbX0Uoxm83Y2dlh0BdYsLUNTW0ZDlc4s30Waz1SiZb9VBQaaei+O1w5je+nRPh/IUnA\nbaXUe4ji0K8B/1Ap9V8DbwO/EP/8nyPlwatIifC/+rDXj28iExOz5y5BHouc8+fP8/7OdfqLHiEE\nbt26RdnrtSKLq6urnD13BpNJ2WUyHXPjxnV2d/c5f+E+rl+/zmQ84Xg8ZjatKHt9mrrh6HAsYg8o\nsqKg3+szmUza2Kzf78eTxrUxnLRmKkyWMRwOyZRCBU9VV/R7cjqlkub6qHdH/bgd7Y+WLEapuURC\nCSMwaEReO1gvXH9ZTrOo2gRQCB6T55ghh
ExThwZqy/k33uDyb/8ub/zw59i/eC+uarBVzfx4jPU1\nhQ/0lGaCpQpWwo31EYugaILGBs3RdM7WsIcpepQ+MKstWX+ARXFzdw/vpDOwCYGFd1QEfJGji4JC\nGT4zm/Fzb7/Hb9x/ie8MRaHZaAUmIytzisKQZ8IVqLyV0ALhBEhxddfFhuUJn9SKfAgnDQC0JKo2\nbq5UqRGQj0d7UXoSAyPu+de21vmt8mFefO0q3hq8KqLnkRoARB49RHg6wdPvlTHhGFrDlK4tVTHS\n83ROFKvS33Yl64wxeO/p9Xqsra3xyCOPMBqWzGZT9vb2eOutt7BNw3A45Ny5c0zGE6y1EcOySln2\nWF9fZ2trO75nIAm/uqibodRynu42vp/qwC99wK9+7C5/G4C/9GGv+QHv07rJQgvm0Maztj5kUq3Q\n7/eFt85aijbzG+j3eoyGI2Yxzp/PFjS1kISurIxaQc06qrpSKZraMZ3NUYgApNaQZ57VlaHEaXVN\nWUYpMudxtiHPe5JYUgrQZMZQGE1wjqquwAxYWOkxb6yLsmKnkiXtiiXG+F36aY/HSaNQ6ODdPQTr\n8UaIRbyXlt1U/zbaoLMAwWNKkcq6/MyXGd3c4YFnv8rBr/wXqEWNn2tcU7HwnkZ5Gt9QBYezNQbp\n77fVAhtPNFvXVGUurcMBgtIxNLJMF3OOApDlOAVV8DRKE8qcUORok/Gnrr7JpdmMP/32e7x45pFW\n0chkEusWxohoChG7ELP4aULS5sk7HYVps4nBFN3CZSOQNH0RImuPT1gAqZc7L4bFR3k3ozMxtEpo\nyQQXoiIlezKyMZpWRAMQlZNCQBtBCHbj+a5WQjrxWwPm7RITYoyQq2YZeS604isrI7Y217h48TyK\nQL9XopXite++hm0sCsWgNyDPclwjZXRN4PLlezlzZovBQAhEEmZGqUirrzofHzA+EohBopUUCKgR\nSxYWoHNGqyM+eeFjWA+19aysrUerKwkQSez1cFZx88Yujzxs2do4y/bmedY2NqiairqRzGxRZuzt\n7mKti+SOOWtr6ygcx4c3uHLpCu+8c52d432a2pEVJcZYrEPIQwtF46BxoJoGiiwyCWW4fMTOcc1o\nb4LHQFOhVYnKSqTNIbRJWoVsCLlvLyKqyqKUiz0RSEcitISfzjUsmgqtMunt19LQokLk1gPKIud4\nMeM7P/4FHvtXv8cLP/JZprko3NisROsV6pm0XU/rGcZ6mnEjoYYP0hIbRDPPhpqDmfQg2LoWY7e/\nj3PiZl81muHKiKJXErSimk1pvPD9qVzx/165wM9de5/feOAezKCPCsLbnxnDwPQoVU4WDCYIx2Ci\nRzNGxFFra3HzOZv9PvPFgkW1kMx5r0QboT6rZ030ICSRtrG6jgoGnEV5TZFF4Q1bYZua0mRUrsah\nyApF7R0hK/EE9vePwRtpwaYSTyGjk9hzUUUocvi7+Gw46Qm0CkgRFNTmC0yIoQrRC3OJHgEAACAA\nSURBVMgwOqNX9lhb7bG6OmJ7e43zZ1d4//oOeWbY3tzCVjV1ZWkWDruwGBSF0eSZpsgCP/Ynf5Rz\n585iMkUIBmtlPRoTjaTykUvhTkKbND4iRqDrCdwJv8yyDCHHlckdDocopalraSs9f/48Wmtu377N\n/v4+6+vr9HoFN95/n/svXyHTOe+99x77ewfYyBYUgsN7xXwyJi8Mq6trsd1SRSkoObF8cNSNVCWG\nK0PKLKexGTpkEfWnMDqniVDW+WweqbUB59AJy33qPlOWexn2Jtbbk38LdP5Wgw4ifKqS+IhrYaJF\nUbCyMmL++c/w1U8/RVUvMFYJpt1mqCwj65W4usEuKjZffZ1Pf/sl/uChB3lzc5PbhcUGz6M3bvMT\nV9/hmYcv88q5TYLxGBS2r3hsZ8yXrt2gd+kir/d7AnF2DpUZCi3lTVPkfHc05K/fdw8oxaBfCIeB\nEiNQZ
rloNyRNAa1aIlOvVNv2moAxi2rBbD5nMp0yHo/bnIBKp3Mn3m0FSuKctI058b2UVtFbQLLn\nMX9QNTW1s/igOpqUJ+PobmjSzfy3HYcd9F+3siTPUjAPRV4yGq0i5EoSFhB0bFabMJ1NmM9qplPp\nXhU1bMvR0SGL2YyqWqBwnD17jj/9838q0vJ9cLz/vX6XxkfGCNwxYsIsyzJMlqG8AiXcc1VVRSqm\n1E/uyDLDhQsXWFtbo65rqqpmffsMu+WeiHB0cNplWTIajbi9s4tSPYo8YzDocXQ4lqytNqAN06nQ\nWhmdUZaCF0BpMqNQXpO4aJVRNM6xWNTM5nNm8wWuV6JdILA8EQiJ5SfeopIGmFT9OAF7hU5otFzo\nWqfXSQYgVmhj3JquM7ma1UIy6F7HbjoDoTCEXPPpl19l+3jM5998k9cfuJdeobF4fuoPXuDCZM5P\nvPU+bzx+PyGSZCrn+OI3XuXCZM4Xr9/g7YfuF9fZKjLlwWjhN8wMSYMpKDBKt0ApIdpIbdOiymSU\nbGYdJ6XbA1A1NXXTxGdaRV79DKMNRme44E+UFduOvEgI4qI6ESSjugw3VKSoc94LRZeACe9Ipn+Q\nMejyQnRHFzjUGoKgUEjHofdB5OsFtIBRDucbwGFdTVMHmkZK4vP5PIY1Ae8szlvObG1w4cJ5Njc3\nyLKTEOWukfpeqkPd8dEyAqdmP3kCKQ4MWiY1ZVMFnCG8cXmesbW1SVn2aOKiWcwX7ambsP9ra6sR\nkVayvr7esq70ypIbsx1cZJrVWc5kNm8BK3letBeo1clrVlrjfOCPHh7yP11/h6sXtvE/9jnJ+Hfo\nxJbAVhkhLD2BFlaclmGyAq2uYDydSB5AMgLE5GLqZc+BJcegQxMaTdAKp0DlRjrLMs3X/+hnePrZ\nb/DcZ57CbIxYiZvqDz77Cf7o11/k2c9+jOFoRd5LK5xr+L3PPs6PfvO7/P4Tj5L3CmykN8t0AUbJ\nDemoLxhr9Wnzq3jT3kfp8FTLJkQ68tBuoPQ3dV23ZC51U0ts79JzMK0XJIZWeiJaqvmYJAshRB6R\nZd3MRdJRD1gXqJ1tod53XZofsNHuNtJ6SyGCYFWi7JwXxaHUCt80MA8W52t5njiUyslMznA4ZO/2\nHq70aGexjQPl2dza4Nz5s/T6BUaHO67nw67v9PjIGIFuKQdo90qid9Zo8AHrakYrI0KAalEzHo/Z\n2JDNb63l8FCYg6q65td/4ze5cuUKk8mU8VgQf5/+9KeYTqe8+uqr/PTP/DQvv/wyh4f7mCxSUSnI\n8pzhaI3JdMYinT555ORPzRjWSqwVQTxGZ/yPR/s81NSMnvk9Xv+pPyatoh2ZcbnRLqQ0xNBErLy3\nUTJMLS148B7lg+ytIELnja3bBBcQSSTUklRDiaCFUFP121O0Rdx5j7INNz/1MX7jqccIITBSivKo\nwXvH8YMP888eeBAPrCvhD3QhUHvPOxfv4f+89wKaTNqXnSPTOZpMuiaTtxOrHkbriNATbENQtCIc\nEiublgoMpdpKTDKHs9lMYM4LSe7KXUfJt0jfo3VqyRZaNx83E0RPyi/lyJRfalIqneGsiJlaF4Q2\nvMXdL+G/p9fp6ZD1NDAohQeJODQ1B8nzacjz6B0idn42m+GDxRgYmT7raxucO3eBjfVNbly/IY1S\nzjGZjFnfGHDvxQtcvnwv4ERGPr7vaaxC1xO4a5Uqjo+EERAo7RKC2UI/kzuFPEAdobOLxUJO5yJj\nZWUlutVeVIbqhqapsI3lE099kq88+xVu3LjB8fExm5sb1NWc4aDH/Vcu8+Uv/xvyTJqPRqNVfuEX\n/izvvHOda2+/x+tvXMN70DojU+LeymlrAE3tPSr2rje2Bq34tf6Iv8KYX9/e5Mm9AzbWVsg7CaMQ\nM81JDUYWWmyDta5VHVYR9QWS2fZtQioy4askbiEeQjdDDZA6SUOATGm
UylFaEnZCbiK1737Rb0VA\nlFLkpXRvuphNt8HLZlORJDQ4XBAVXr+wEaUp169MJAcNniYqN6VFOD0eCy4i2k8TufVabyUEqqYh\nIKxSidglhMDxeMx8MRdDUFWdeSMmWVULv5V79+2GOMHuHMOxNLchgEOxqBpmC8EbqCxvW7zTuNvm\nSiO5/Ok+7gYaSs+lLHst81S1qKWqo8VwGW0Y9PusrPS59+I9rIxW6fcHeO+jV1uRZxkbm2t86Wd/\ninsunGV9Y4TJuENdu/uey7m605h1x0fCCIjxTQs/5ts6F9+9ve4DIhZzlJamj8FAav3T6RRrPY8+\n+hjf+tbzrK6sUhYlK6vDttsvy6RJxBhDXVe88cYbPPrIE5J/UIr5fE4IUYRSKcBjvUXh5XRTAW1i\nL4AHUxT8rl7n2dU1LuaGxyJU82SX28kFlbDkEgrEBF9Q7d8lL0E631QMDXykYev8Ls1bazyWBKSZ\nydFGEzIAqa+rGLHrINx4KU+RaTk5k1CJso3QlKvIgYBIkFnvcaZmUS0ifFovjYCPbDmJpSeIgKvX\nyw2VQrxkJKyzraBKK0kfRV2bumK+EAr5bhjYrhi9PECSEYqxlMynChG+raUMqZZGxHlRE54tFi0u\nQ6WS5d2W6V28gOW6XK7Xrup18gzyPN2veI3GGEw0Apr0f2AxXzCbLsjzgqQdkbxGCJw9s43JFFW9\noGkqBr3hCU/gg673I+8JtBhOukmxEBd8KqXFLe88piNFFoJsyizLIwjDtJwEm5siXZ5qp2vrK4TU\noOQt99xzgbquuHXrFlevXmVn5zbHx2PmSfPO5GhtZBMFR+MbyeTh2vpxekhZUYjkVWiYzhe4oGMM\n6O9YFLBcNC0vYFhiAwjLzIGP5cWQqgHex8SbagEhyYDKtZjo7vqYc5C+gyxAIBJwqsh7r0zbhKMA\nU6rolcSOtnoOCKYhqFQvDzjvaIxBaSnN2aYBI7x23jlU8FgiYMUHiizHetd6Al3ZrKRP6KLsWTsn\nSEhQ1RWL2HvfzpWK8Gt83FA6GgEXcyVLPQQV8xLp9ZY0W1KanS/mzOZzIiyI1G7z/YxuCNBNEHaT\ncskIZEZJe7gxLezX6A7DYYCmcRwdHXN0NBFOxVyalISZ2JBpz3DUx9oFVTWnqnqotTvBVN3r6661\nDxofDSNA3AOdb0Iie4CYMBN9vSwqzAiAREIAk2mct0ynM4bDAaPRkP39Q/723/xbzOdzcaU21vml\nX/qz/O7v/muuXr3KrZ2b/Cdf+LPU9YI331zhjavv8Mwzz9A0HhcUZ8+eZzyZUtWCMyiKnLVVqTxM\njyfgRDrao1hUAW8qlHdobznyU27v7dMzsDoo2wWfElPLZKVbwotJHusyT9AFUC1PGihMr1PGAvAn\nElpJHFRrj9ViSKxWOC3eiSj8ZvTMEreuQdSGoqGx1rKYzeLrKbyt203bNA1OWea9eYvfB1pasLRh\nU728apZgn5QvSPMgyj2SXNVKsuZ15MnzwTOPOZn0/1XMM6TNlzwKpRRNSzLTkUBPS6pD5JlAWFVd\nMZ5OGU8np1K2dx//PqW4bnKu9faioW98g20cWmm0UeRG8qlJ9s5Eryq1TD/wwP2cPbPFcFCyujpC\n6wHeN9TNvGVVTpTod4SGf1jCgZBct47FShtGKXWKtPsknDTBeb2TU6Lf70cSEtGve+edd9jc2qDf\n7/Pyyy9z7tx5NjbWmc4m3Lp1g9u3b3N0NOahhx7ipZdeQamMojdgOByKnLOCPBRY20jDhvOx/CZ8\ndd4LIYXSUZJLQx5KGisNN4lRFmI2nCVmPH0s70fjvY1pL9rNL9wGPlZDHNZZNEuYaojqnQKTFdfY\nGEOeFdRZDKEyjbIxUaQ0OhM5Lx29ghACE0OrzgMFtY7IRwX1QvIF1jqa0MBijm2EcEOnioUTTyA4\nT24MymSEPAA1Vgltd6IGFy9BPDa
tJZxAKQlPmmXHXZWUh+P3SQ04ROepu9i98yc2XJpbQiC4JYbg\nY7f2+NKb1/k7F87yaiX4e6VSg09o9SPvtubaNRtP+LT5ujDhlCdI8OA8EoN6L1UfKX1HeTsFzjco\np1DaY21izW5a1uVPfeqTPPLIQ6ytDkDVrK70KAqD8w25ybHW3XG93Y3/h8IIdMfJeDnGyWq5KRRZ\na7K1Si6oj6eCuIVaiUDmZHrIymqfldUBea7Y27/N1uYmo5URw9GQF1/8NkdHR0xnc+q65sknH2c8\nmbOoarT2aOXQKsaYwaGCwiCCGcG3mSmMEfotraQU5YLh2s6eMMD2+wSiTDkBhce6SIseOewSXZq8\noo44c/mJLCoA6SL0XmEbiwkhymiLEQiRM7/rX4YYV8iJI2y9KQmotYmNSKqNjxPIJp2aXVe1PdlC\nNFZOXlsFWchKCYrR6AxvEgtvp3oRllp70EY/ceMGdIxqAkmv0OKskwahEJYJv7hGjNZkWjaTUkLo\n6ZwIqTjvI7eevL6kNaI8OZovvXWdy9MZv/juDf7m6gpeGXJjaAJtPuC0NkQK67ohwMn8zsl8QPq6\n/Vmax/i3Pgg7sQjbKIRlxgg83c/JNOgy4+GHHuT8uS02N0asrg6ZzydkWRkFW/KWPDWF0Msh5WWF\nGPE7j9Ll+IgYgU5SUCmW+mkW5yzGFJLUiTFd19p6uwTTGKNjn7fIch2Pb3P//fczHA7JC6iqGfNF\nSZYb+v0BOzu3sdYymc64ees2v/Irv8Jbb13jzbfe5tatHVSoIclaB0umcoKGoBS186Bk+owOZEpc\n2EBgERTfuPoug96Ai9vbeAwOB6KQIL0I2qBVQAeR+GpLa0hcn2TFusm/EBBcRAPBZ5hSs5RJk3lU\nsdVQpL8seCXCHlmGUlncVBIyBKVFwTlEFaHGYYJvuR7ztFNDQHlhg9ZeY8hYxBg7vZcAlFL+YEmu\nkZ5NUpGWDsEituFqbONomiilppUwf0dDUEeWpXZdxFPXxBO2zDVlkYMmKvg0kbk3eZYCnTUqkGmp\niDiT8Y8fup8/ffUav5bljBtHMAWFyfBRsyK9X/K8UrIynejpvlrdg873XYLRE5WCJCmPtAYTeQ2N\nykEblDJonVOUI7ALcmXoD4b8iT/+R9jeGqCo8N5QFgXeKSoXIBj6fSNycV63LceS/SH2FxCTyB95\nI3DnUDH5k77WSkWWlDu7y9KDSt1VkgfI+eVf/mWeeeYZyrLk4sVLPPDAgxwfH6O1oShKzpw5wze+\n8Q12bt8mYHj22Wc5Ph5zeHTMeDyOD9i1MNa6rkga80kcI/ESduPgxWLBzZs3ea1vGKmGJx57lNGw\nIM+kk26xWGByTWYyBqOC+WzJZ5gy52l0iTOW8Z5uY03VWKEeVwqhKlt2I6Y8SwjJUzKtIVUqAniU\n1OoBclPGU8/hIw22d8sTtmkstqmxTd2+h7WW2qXTfXlqpuv23lPHunm6D+csAXUiGQhyatduSbeV\nXO00L0v5rpw8i9qA0AnmU1ovLBOqIURaMQkwvYevbW7wO59Y4duvvo6ron6AliqQDx+cYEsjvW5b\nzu78fddjSEZCKRUboWRtyH0pCMJIZOuaCnm96XTKsHD0y5yVlSFPP/1p8XS8ExZoa9Hak2U5/cEQ\n708Sq6QKkRKcPSnB/ofACJyKwdTyi+UDENe77c/uuFvdbGxZlu1iaawXObNI8GiM4ejoSGjLhyt8\n7Wtfo2kaVlfX0CZnbW2Nvb0DJpMJ6+vr7Ny63fYnnDlzhswUzGYzDg4O2lOhu0DSQhXmGc/hZML7\n+/s8nme4ANQW6xtyncVQJtA4YQHOoqE7fZqcTBBG7EQq13kP2OhKxlbdTvLROU+/P4gbScqDGVE+\n+1SpjajaK+GAI0QDkGJ8b53AVuOp2MVznKbUSp9T7iMZ0bZK0tk4ygjRR7rmrjFN/6d72qYYW0K/\
n0Bo5733LpedDaHMIPqSSn8djaLxjUTsmkzlVU8fqiMIpG6s9rTNwYk3eLSeQ7vV0haBrtNoyaEwC\ny1wErE24hUCmDXlexHsz2OqY/sYqFy7cw/HxMYNhr+VWkANoyZCtOt2BSrVQp9Ywdg3zB43/YHqx\n/ygjnVjpm/TjUwma1hx8wMNIvdxlWdKPxAtbW1vkec54fMzh4WF7ivd6PQ4PD1lZWWE7at+NRiNx\nSb1nZWUlXo1Y6MQSm07/u11fNxkEMFlU7B4fM2ssVWOpG4F+GmL3n/c01oFJJ/mpaTl1IrX3Hl30\n4JcJNELC2KXMnnws56rjSUDMOyTwEXGRLf/fMpsdryPmCtKGI5w0TK23curZJUPQPqPOvbZ3F3MB\niYLr9DNNhjGVFuVnqqNeLNwG8Y477yvekBgFUT1qnPAgTmezqC+YINie02uvO+d38wbu9ry6huCk\nMUws2oKj8E4YsZx1radrok7CaDjk7JkzXLp4kcV8HjkIbWQjEnh06OTCTqyN7nWG0P30geMj4QkE\niKWpmBA7tcC6D+fDmiKW7rSUwy5dusRbb73Fa6+9zvHxhKeeeort7TOxjVhIHLwPvPPscwwGA8qi\nJM8L1tZEBMM6T1AqUqE3rSAm3GkEYLlwlVJMFgtuHB7z3u4+96wOGWaGvkJUi2OiyjpL0RtB49om\np3ZewrIRJd17svbBe6nds2wqiiteriPI30teZZnI0tEr0FqSldpkGJWRGUPIDcGJp2DloRA1OKFr\nYMJSCrzrzqfEXff7FBooJQtcBY3yiRY8lhEjKrJrLFK5qxVuOfX8E8rUWdE38B1D5P3SCCxjeLAO\n6toyWyw4mgigTNh5Vdtc1r32dC3p8+l6/Onn/0H/d9nU5GMuJCJBXYgIVPDeYhtP8IZLj1zm8cce\n54nHH6euamzd4JTobl64cA+L+QLngwjVimMV976Kh8HyQE375w9BdeC0y3vSqnofYnOJTOgHWeZu\nqOB9YFHVPPjgg2xtbXHjxg1eeOFFrl69yu///u/zta99g0984hNcuHCB/YNDgFZ2LITAK6+8IiIk\nVU0d2YJGw5XWqjZNc8c1pIed8hNj55g1C47/8W/wX+4e8MoXf4zwhc9zeLxPnkXSUmNoFhXaOXRI\nfQD+xD12DYsxRvgRIkeigVasRCC/qeZuyJTqwJYFcJRq7C3kVmtMrimKjOBksXm1TNWqxMyrYukx\n6gkeH01aLyQJgMQniYtYAmuFEdhGjcWUi/As56i2lrpO+QMHqrPR1bItt2sIgShuIuKg3jlc9AZs\nbM5xLmBD0gcU1aHKe2a1ZTKvmEynIgdnhBVaSpap7fvkc71bKJByMilvcXp03W/n3InjWGswKsNk\nhn6vL+CqeB/9fh+CqA8dHO7z7Ref52Mfe4z19TVSu/xotMKg3yfPs7Z6knIhnYvsVHWERemDxkfE\nCMRxYu5Pwm2XGfKYROpM8mkLrZQizzN6/UErGtLvD1hfX+fy5cuS+c5zPvlJobeuG+kFH0ecunOW\nPC8Fk+59ZBMSKbKmsRFSfLIWnBZFGlprGu9pKssvv/sO5xsL//Lf8OWnHiN3FhMZZawVAgxCuOPU\n+UBDlz4QOvG2+1BFDr2AsOuarD1dpXdBtQYmBQbOezLnRMXYSwdeNxmYqKqCd1Jyip5AivPT8+mG\nSF23PiUkW++pfbqcOL3b1+ngJrqJtTvzDRLOWJtgzmnz+7bSknqsghK50MY6GuexTpqGkkyZUggY\nLQSIykTpGZ6e/7t93yIZT1yfP7lWOyEcwYCWE7qpK7yzGKMZDgY88MAV/sjnPsHG+gpZbnjyicc5\nd+YM/UFPQgE0tq5ZeMhNftewpDuWCfYP9qA/NCeglPpbSqkdpdR3Oj/735RSryqlvq2U+idKqfXO\n735VKXVVKfVdpdRPftjrn3gv7szEAicMXPf3d2bNl0lCk5lWV
SjlAYbDIaurq6yvb7C9vc1WzAUE\nn5qP6nZxC3GJNNykxWBMdtdQoIuF7y5c0bX3/C+Z5mq/5B89fB/Xd3aY1TU2SNbaNpZgXcuv+GGx\npve+RfQE704YBd9+drEXIXW0NTEmlQ9rZYNba7FNg20amrpua/GhzTe49iPFtEk9OXk9/tS1dQ1A\nt5SbvI9kMFI3YzeXEDi5ebrP9LQ7LizTFusS8tJFgyBQ5Ujj2uYEbIDGxVAhiEgp7QYBSHiQO8O7\n04m/0zF/uqbu162X1LmfdN0S7sqzcvHZFHnO2toqDz7wAPdfucLG+jr4wH2XLrG2usqg16NflihE\njs7WTTQod4bM8cLvOnd3G9+PJ/C3gf8D+Ludn/0O8KshBKuU+uvArwJ/WSn1BPCLwJPAPcAzSqlH\ngvR8fo8hG9d3NpD8tL2f+I2iLMs2Vk7Ak9Mtk0oJY8zR0RGj0YjBYBDFIjTvvvsuVVXx8MMP89pr\nr3Ht2jUmkylPPvlkLOXk9Hp9zp07x3vvvsdsLvTkIQQODg9aNph08nUXZneBLBcB/PO84Cv9jHuO\nD3no5Zf49OOPUvRy+ibDzivKfh90EHHr+H9Pg1XuZhxkgUm5TWswJp5oSuO9RvuouhPnyJgclRZw\na9gkySazu0TWt6e+c63hqCopDzY2SV7Fstupkz95bmkOVKaXG985amsjmlKSXUto0knjejdPqFtq\nlMSq7aAvJQTwfukF+BBwQaTAa2txBOnx0FFmLASW/HsfvFHSfKe57BqGdhV3Nls3bJXDhKiHKJUr\nzbL3ZDjoc/bsGa5cucJTTz3F3t4e89kxTbNga3s1GnEBRB0eHLOyskrW6xHwJ5iVUmJQBaLYyve8\nnXb8QDJkIYTf7nz7HPBn4tc/B/yDEEIFvKWUugp8DviD7/UeihTHapYtxXQyna0VWPabhyU4I8Vk\nXRUXbUwLIU549gceeIAXXniB3d09ZrM5W1tbbG9vs7a+wf7BEc8++yxZXjAcjjg+PsaYjOFgCFqL\nJsGiaj2NbhmrC4yBbkwrtWxdemrXcDSbcWP3Nt99y1DNxmwMhoxMQX08IR+U5L2iJU3pYuK7rrFw\nFPhWTTcE4SIMRlqdiQZQst0OKQ0KeYYxKSzwhEjMYeNi1VqLNxCRmq6xbZnQ2YamqmjqhbQP182y\n5u99CwOWDSVnUvKqhAw0a2v/1lrm1bLHQPoEBMGYHvfdTtmuZ9H1QpJ6z8du7vFzb1znH1+5wPPb\nGyf+rw/EFmgHyqAjyadysVU6JqSNEfRe2td3u4bTIVvrnZ0aKURIazTEMEvKsBa/xGeTZ4b5XOjx\nX3rpJR574F62trZAeYbDEVr7iIhU3HPPPXEvaKq6jgYtHZ7C3hQksQLB3/XwOD3+Y+QEfgVRKAaR\nHHuu87skQ/Y9x7JUJAkcbfK4GBFUW7zpEGJ5JJa1xNWLtR9ie6zRrUX0CoJWbYPK9pltLl68F2M0\nt27t0NQL1tZWGOkMfCAzmiKTZp693T0xIouaRdVQliJtlhZVHasEkmWXlZ9SYzayExGkRq11Rq4N\nJu/jdY/X377F8XHF2c1NLp47z8pwyKAxFC6gWbC6AhgrTT9osrwgR5ErhGEmBAEIGYUKS6pt6y3G\nZB3PSajK27jQRX5FYvI15jvSUvdN3FzO0dS1JM6ii+xdTOY10fV2jTR5ddp/0+vKzyLmQIFSWuJw\n66L73gED6UjmEicweIn3JdepsJGGy0cXuj2BYw7DRdjxz71xncvTOf/ZtRs8v70Zr1tCMhUElWmU\nlAiDjazO0M5fUni62/H5QRUg0WgEjRCfehdJaTIjqYUQhJodh470bjb2SyQ5VAhoHcgyRVFoykKz\nu7/D9tYG586exTpHbjIxktqTFQXeJY1FJ95bDN8CCt85OINSLXz6e9mC/yAjoJT6q4AF/t4P8H//\nIqJczPlzZwnttCiUznG2x
noQXLw8oMBSSyqdOj5IMql1vTrxpwselXjvgNX1Ve6//wplWeC94/Dg\ngLLI6PX6aKXZ3twkBOmF39nZZXPrDFXtsc2Cfn9FmjuCwJIb66SJSEXeuODjdUl/vIqAbnG/Bb9Q\nlCN0MeLtd26xezBl53hBUww5X/ZZ94Gh9xhqenlUvtZQq5w8aHwuuaSMBhekmVnrJWEJ3oFb0ozp\nOEeZXrLleGeXpcQ4byfCjdq3TTh1VQmnQDzJXKS3ctbJZrbJIHbi3xgGtPLyUewjpKScdW3JMIVy\nWbZMXkJMd3hJbIYAzp6MrbsnsHOpOgG/fuVefv7adX79yr3LW/S0BCsaT6YVeEuwkodJ8yJVi3QF\nd5b8Omv2VIyt2jxWcEvMhUI6HV2kOrPekRVaWJXi8zJagQYVPCZTlL2MwaCkKBTX33+Pssx44MEH\nWFRzchWZlyLeU/6fJngbS4IS+3h/Mv4XzsRkaj54/MBGQCn1F4CfAX4sLGfrB5Ihe/yxR4LUv2Nc\neAL0QryFky7iB33def1IOKHbU9lay3A05OLFi6yurvLKyy+3C2pzc5OnP/0Zbt7a4eatHaqqYjKZ\nYoy0Ic8XsuiNNuSF6TTChFadqBsrNk2DyXIyUzCfTShycUP7/SFFr2T3cJ+9wwNu3LrFmbNbPHHv\n/VzaOsPmKOf6rVvoTJEVOcVojXnjKY3BGk1Ng89L8e0799res1qWEwFqV0dDmltSzQAAIABJREFU\ndTJ7nb7uutrT43H7uybFy166/pqmWSroOoftbEzn3Ak1YBdPPNm8gbpxzCPoxXvfvs6p9QR0SoCd\njHvraXTCrRNuOYFvn93k+e11XEgJuNA5/TprKeUs7rImT49uvifNVXfOtdat43A3jEH375vaAoFM\nZ3gPvbIgz0W3YmW0yvrqBv2yx3defImdm+9S5BlPPfUUKysr8bV9K4Qj6FdNVS2rZqfH9xMGpPED\nGQGl1E8B/zPwIyGEWedXvwn8faXU30ASgw8D/+7DX/DU9yF03NbQZjrFXIughIru4tKKdwxInIC0\nMbXRKDJRbWkasixjfV0KGloLXHZn5yZPPPk4jzz6CPuHR6yub/D2O9dprCcgDMNlOcR7R90saa7S\nyeaca+W004LRWqEyzWhtlSLPqJzl5u5tDqYTMBIDT+ua+uYO86M5O2fP8IXPfYIHr97gh7/6As9+\n7gmuPfowuSnxZUneLzDe4bDgE0IxMRBB01hCSP0HEqokUE7r8nc2fbruFuPvGpnPmHNZbmTfci12\n8yApw++cdEj6uLm8FxCUtD07pgvb8hsmo5EWaTIG3bxHN/5P89xNwp3GUXRr5CHmCuQ67yzx3W1r\ntBumkws4XZa8W0VAlt2dxrX9fffvwxLnYesaNdD0e0NWV4bYuubmzVvs3t4FHEUmB8vu7i7nL5zr\nGEHX5r2c0zRNjdYFWp9MpJ6oJHWu5YPGDypD9qtACfxOfPHnQgj/TQjhJaXUPwReRsKEv/ThlYHu\nuDtYSC4kfepi3u/yoBI4JMaQ6WetTLgXDyG5ob1eD6U09aKmLAtW1/oUZcn62ir6/pyDg2MOjo7j\nNUmM2SZ7wrJE+P+396ZBlx3nfd/v6T7LXd9tFsyKAWawcAH3RbJIOZJsSSStiJalVEmVSuQkSr7Y\nVbHjVEqJvviLU+UkViqppOyKy6qyHNlyVJYiJlZKdOxUpEiiGBHiAhAEsRLgYAbAzLvf5SzdnQ/d\nfc65973vDAASg3E4z9Q7995zzz2nTy9PP+v/WTYaeVHZ2yuUKIy1FGVJVZVUxpImvm5BbS22Mty0\nh/QnfdSgzye/9HW2bu7xiT/5Bs898jB1UeJqA1VJhUEGDpXaRWy9qONTe6gsB0kiKImWikXXVfSu\ndJmAiYvLdAx+QbQvq9L71l0nNTj45uvABLpuPm838H/RWLtsOO0u8g/e3ONnXrrOb1++wFd
Pbx3Z\ndbv9Go+t8h50Yw+8eO7F9uXFfzvD+SrJaZncwpVbodsF1VAkRHMKwZDrs1uTJMEZR1VWzGdzbFUx\ntzXOGRKteM97LjMYDNjf3+e+M6cDg5RgE1uUTlb1xSqP0nfEBNzqMmT/4Bbn/y3gb93uure8Z2QG\n9uiDLA5OxwjilkRdfyHvKZB2hzFRX7K+yu1gMPDxBHXNfD7zWWpJQr+XceHiJV5++RXq+kWm2heE\nAEeaps3C6Yqmy5FldQi6ERGsS6lrhRCMcToN5/gaClNTc2gNqt/j+Z/5NPp//TxP/fkfIM1yDven\nVLOCwlmGiSNzkPZ6pKmvKdiKzd5Hrq0jCRKU1q7xJUcmEF1qsYhm19LftD248SLeYBHhvYIV2gbb\nSW19GnLXM+CvbRpcAG8/aMe3Ky2FOcNPv3Sd+6dz/uILV/nKqda6HyWEZm50LrRslV858UVaO8gt\nKX5//I5+1HVp2zoS0mECxEhFGverAWK1qcF4iLWO6WRKMZuQp4mPGLSGYb/HQ1ceYnNznYODfaCV\n+CS4FOPc8xvZojqw7Enptv84ursiBsPOLkTRrcP1iZ2/LOKFXy6Ihr7z8yTxwT/hHKWUz8e2PqUz\nFqzs9XqMxyNefvkl/uAPXkJE8TP/1s/yT3/jN1lb3+SHf/iH+N3f/bzfVbVChziBLvBlHJyu2GiD\nJTzNsoDyK9TGuw4dPl59Vhp07Shtjdrd4UtPPsF9n/w4ux/7IKlozm8f8np6k8n+HsXBPv0s4XAy\nhVnh6yX0eg0se9sWg1I1WlfM3Kzpu2UbQNdKb62lmLYiex186g2DcT4QSDmPilBbi4k7r4vW6s61\nja9uXJl6gUEvSwNxMv/WA+f4S9+6xueuXFh4lu7Ydne4BfUgeA8WZlLHUBxh6sIM60iSixT42PGz\nc4WKsPx9e63F3bnf90VhdIBHz7OUfi9jY33M3s42ptb0ez1+8Ac/wQMPXMKYmtls2vSP3+hsM27O\nWe+FeKPBALegu4MJHGO+jGrBMh0dgJaLLzO8Bl4qJJVIkC6UUqH4iAd8HA37nDx5gr29fW7e3OZ3\nfuefs33zBvt7+1x75Trz+Yx+39cVLAOQSdxVoYWuivd0zqERRCfkSUo/C3XxHNQqQfATXUmKc6C0\nYJ3w6vXXEat8bLkoelmP0XANU9YU0wnToqQOrixRBbN5EZKeUtIs9dWTpO2WVKmQnRZ8ybRqQfTV\n1wFctK7aCDRjLBLz9aUFBCXq3EHlsLR/Xl2i8V44J+B8ym9r0T9acVhrzVPnTvNfnr/Ph2GvMF7C\nIhNYDE5q58oqg/EqGSBuNvHPMyrXnLxKqli2FaCkSdjqqjjG1M2VffsBfNi1sb4kmqksk3rOwd4N\nTmxucPmBy1y4cJ6TJ3zqe5Z5STGqeyI+YrWqKl8mD+cL7N5mCd9OCuC2V7iDdIzJZvHF0UxiOq/L\nTEGCCNjdAZrzXVsv7uTJk01hiDRNGY2GbG5sMJ3OeOlb32J//9CXnyamb/riIF3MgEjLeQONGiI+\nVkAI4cXKA/lF0d05wsIVnIXDgwmpzkhUgjhIdcJgMKSqSopigp0d+iAe6+0N0QoeIbUefOElPvL/\n/Alf+uTHeOmhByDR2E7yUeyHGG0X0X6MMTjT9qk3hwaNN9oQYlAUUTJwTemuR6++xo8++Qyff88V\nvnbmlJ+1yscxKKcWEqNiH0VmGXMvVunhqyztyxtDswWIrNjdl3YYac/1HyMCcStWr7rnsZ4p1RqB\n/W/ANFmX8YYO8KpUohWjYZ9Bv0eiFFU548yZU1y6dIH777+fQT9nMOjT7/c72IQ22KLivTzDMcYs\nlGBbtS7eCN01TGAVdV2Gnsvf3uLZcHAh5JN73PnurqCVD928cOEC+/v73jaQZwx6PU6dPomxjmvX\nfXFTYyFJMwbDNXZ3dxGd0h+uNUa5WIhSBb8w0GAOiHj
ffjkvMUlKPhgyGI7o5X22t3cwdYGtLb1B\nz8OZG0c5KxnkfTKdYYrSl6Ma5cHd6TjcEdRs3ixcWIxY/NDvf5ETN3b48O9/kRcevIgRbzCMkyme\nF20iUSUwxoDtLD5p+9JYSxHsByI++MozHdsw7x998hnO7R3wY19/jifOnfaSB3FpSYg1aAFJukUz\nbpUevrz7rxr3WGKsywSiZBANovF3UR1oznXxP6G7FcX7dd2+y8lMXkRvjcPxdwv4CSrCzVvSRLM2\nHnL27BnOnj3N+tqYLNGMhjmnT53kxNZWUyfTl8rLmM4mDfOMtSf9WJpQl7F3hAms6qNVEnWku4sJ\nSGvMWbnAoTECHMcAupSkiXdNORpoZjpivFKKF198EWcNH/7AY5RlyaX77+eBBx7kvjNn+dV/9Gvs\n7B7gnOP+ixfJ85x5WWPRAbik39xzORd+b2+Pal5gyopxGFSlNa42mMrQy3K00uR5n9lsFmLcDZO9\nfaqipNYptraIpAFJWZHmPXqjIUmSeaQfaxf+qsrw+x96jE8+/gRf/L4PoXXK4eF+kxgEiwk6y7r1\nAy98mx/8ylP8/gfew3P3n/MZiqG/ZsXc92mSkHSMU3Gh/YvHHuZHn3iG333vQ0AQj0U8YEplGmNq\nZABRzI1MtOs27I7nKkt39MY0fR52c79AwkSK0YvWhojAKGV0pYDgPDhmfXQZT1zkXUnGdFB+uzaX\nyNS60o5SNUo70lTxrnc9xLlzZ9lYWyPPFKdObpFnHnfRWB/LgoOimDdANs5ZqqpsvFrWugZF6zul\nu4QJOG90E2/2S3VC7QqMsR4UztbNaDmixVdwokIxjQ4yTyxBjYCBBO2r6DTSmY8+tBbyXp+HH3mU\nw8MDnn/525w4cYobu3ts7+zy5BPfwImm1x+gVNrAmvd7OYPxmGuv3KCX9XEWprNDxuMeIhatFaPR\nkL2bu2B9CKnSGisKfNoIZVVjQiWhQS+lmh9SiQ+4mVcGKymSeiZhrUWo0Ykjz1KMznE+fhhxDluG\nfAat0GnCC5cv8vyDF1BKM7IeyTi6+XxRkTbJx4dU2GbX/MEvP8V9u/t88stf56lzp5s+7xrdoi0h\nItkSYtK+cfYUT505iQOUC+HSBHNPqkF8zcAk1cFgHzzHElUl1XqF4sJy/urEV7riva8AZWo/b+KC\nth1sQRFIEg/waSyYqgiMw5H3Mkzta0cQxsLbBWiMtssLrLvxxChKhwTgkwQJBVX9QvUuP2cdWZaR\n6BRxlv3dA+aTKdV0gu1lDMcb5GlKv9cjyVKKssRUpZdYCH0YczOMdz0q73agrkyD9bDCzNk8T2j9\nsavvLmECrYFGEK8jOx+K6QfU4DzMrxcEAogGTnmseOdDc/2FOgMXjDBx0vsJ5H/rByvlvvvOkPd6\n3HjmBmf7fa6/foMXX3qZbz77LGmWYp1QV5b9/X2KoqSvvYV3cniI9H1o8PRwwrAvWFtitML1B7ja\nI73qxINWIBIAQRXG+QFSAqlWvjaueKSgqgaHRlTQk00B1qC0ZyhpknlgzGY3rLHig4ESpahrX6jD\nOh+yqgLKkg3Zhs0Sck0XNTvev3r/o/zwV5/mXz72sK8YFGhZH1+I9gsLPuIMxL/WNikEfFjPIKLu\nTNxBCYvYv/HSRbhnI963DKBFMvJ6uFOKFhpMuk+ISKc8mmvFf88cNNoG2554FdE/l7/Xsvt3+dUz\nCN9mHYBJiFkuSnlmay0uVGlSonAGKlP6smxVjSlLxPqNTgSU1qRZhq3K5jpCp0qztTgbDbwqoD+1\n/XgkjqJ53qMG8y7dNUxgWW/rHm/nYHxAiJ6AZoJGtmfbajzOLk7e7vuoI4v4giVXrjzMiRMneOaZ\n53jttdfY2dnm0qUH2d7eZWf7Nba3t8mynIPJHq9tv8rOzj7FbI4Sv/D29ioODnao6oqd9X2oE7Je\nTpZn1MaBCkhCyhc
jKUsLnYAdCXDdMSfe67htf8RFk2e+BHjcnRWuieQTWKhSa61FSwvQsWwYjP0Q\n++bp8/fx1NlTXgRegjlbNnx2+/TWBqlFWfuIPr/kcuu2aXnMuraBJn5AA1YadCMJZdG79oN4lTzP\nqaSmCIxEK0WapAFyTSjLOT4wXx/bxuPI38/RZO4tqTbz+ZxEK9bX17l06RJZ4l1+u7u7ZL0cp4Se\nc+T9Pqbq2GM647BsnFxGNFpW87q/+dfAJtAOcMt9W67v/3xWia+24xp8Pf9z15p2gujvcL6ceaBl\nD0K3wxKtObG5yWvXrvPUk09y8/XX+Rt//a/zR3/4BdZHYy6eO09VGw72DyjrgspWKKuZHBZYYxgO\nMj78kceoqxmHBwdcvfoqZ05fIElzrIMbN3c5nM2piwJr4fDwEGsqFI481fQHPWa2avLrjTVNZKOI\n+LLp1iEaksEAqQqsc02x1CSZ+xLoQJrqTmmw9pmjLh716e6CamIDOmHCyxNqOQaiKw0sB9MsjKxz\nARpu9eK+FR3HMBY/B2bJUeCP5v7Wp1YnqSK1vmiJsxYlQt7LIDBKYyqvThCxFnRzz6YM2pI9JH5W\nTepulF4X+2U4HKLwUtTW1haTg12qqmRzY426LnGuj4hQFHOSznxdNoR3+2CZGd/KTnYrBnZ3MIGO\ndRcWuZe10XdNh7uq4zWc1tC7QMsTdUFsCrttkiRsbW6ys73LH/3RH7KxucX6+jpFUXF4eIhgmc6F\nWenY2BhTzEushfHakM2tNbADBv2M+bxkOOozn1ccHEzY3d1hWlYgOlQ99s9rnfW5CdbSRRZywZgF\nfqBTSZGQgpaLpnJex9dak2UpIiFPAqhrj0Ogla82ZGrTdIrSIXDGhR3L4kXuMIG1tEE6XU/Cqp1+\neRdaXuDdV63azwv+/Q4zWI7+WzVpu9b27v0Wr7XEBGLshqgm5TZCqFml29oL0nHmrWCCx7Vpse2u\nMTwvP0dde9uDVjCdTVlfXyNLFVmqGY1GJFpT1xWitY/KpJ3/XczF5XHo0irD6nHMuUt3BxOAZtGK\nSFOWursDRRFLRHXMuovehK7oFy96dOBaK3I7MR1gQpGSC0wmU77y5a/wU3/pp8myjNlkxmvKuyh7\nRcqsyKnHmsODCVVVszYekabaG+oGPba2NrDGUhRzJpMJB4f7FJVDpykZORHJxlrrd19jCIm4NMEl\njU7smRMOXO1IdIKuEhATdvdQFcd5V2hZBulBaUwn07F5lZjE4wj1uZp+UZ1JdqtdOC7G43aehQUt\nNAU2lxlB9zfd11vdt7sbrmQoS5eIyDtKC66MeRNtbkG0RSkJOvgSA1jVrm4bFu/fZQJxN/LnVHWF\ns4ZEC7u7u5w9fZmtjTWKYspg0Ke2ltpUvoZjGM9u/xwnba0aq2VmfLt4gbuGCTgIi7yTUtx94EZd\naBd9TJGVxoq7PHHjlWXhmDcdREbgd+KqtuRZxrsefYQrly/z6R//1ILom6Zen0vzFEk0O9sTsjTn\n5o2brI8HfOuF5ygKn1C5tXGKJ594luFwg9OnTzMrSnb2Jxjrmgq7JlilUx1KdYdHTdNFI46IZwLi\nBFf7SZulKbUNrqc08fX4iLzRqwl1WoegpDa6sRsq3M3pX/a/HyeG3m5xHBHD4/d6Ua9ufeeLu9Xy\n4rodRbG7e263rgGAE49S5cex9ECmIvR6PV9TUidNyXBjKuZFjVQtinA3KrTbH8vtMMZ4e/UKf2N3\nE7PWcfXqtzlzapPxqM94PGY2nYJWqDQlSVQTy9Blct17rur3Ve16o3R3MIEgjkVLpqejImN7+tHo\nsmYHbU5auPwCRX0tdq5OU9IkaUpBiwAKqqoIATWGy1cu8XooV7Y2XufJJ57hYH+f/f19DvZ2ePDK\nWbY2N8mzDGcVl+5/gMPpnN29HQ4PD/zit65xP0UXmziLCm4znWiygBBsjCFR2qMHB
elIaYWrDGmW\nkpB6q3Eo442LmP+pVyHSFIUv120jAlBt0KYmMW01nG4WobgWG2+1qB1vteQhWPpu4b3gATCWrtPF\nUOxSjCxcvv8qxuNhu3xkpgufNapZuBF5WCT4RYwlUYosTejlKbXSJGmGThLSRFNMJ9g0SA1hbsRE\nqy5+xHLFpSiye2iPtq2qwzyttayNRly6/wKTySFpljAcDXG2ZDgacjCZcLi/5z0oWY8kMKZlqWm5\nD7rteKt0dzABWrH+DRmNwubeMAC6u1NrEPDGwxaFaPkise+s9UAXscqNn+QeBVZrRZL4YI0k8Tq4\n1inbN7fRKmHQH1CVc05unaTXS6mqipe//QplAbPZjIODA4piHiakNP5upXxIsaJ1BcVNJEbyZYm0\nQTDEfHvXoGILNGAeYUtskomcc97PLm1egxeLVRNuGqP44h/WLUzsOAG7UsGq8TluzKJ6tkr0756z\n8ndLi77LtBeDcTycWaui6GZDiLDkvhBL8JCIQofHccagMiFLfWmzCFdmQuTkYlLW4sazvDO7wIRX\nGauc89LZcDTkzNkzDHsqhATT1DxQysPkmxgT0OAGLnpNlvtnuW1vhe4KJiDSmTDhWFfMk2C16epe\n0lnY7W9c81uCU83F3y/N0yZAVHxF2/m0YHNzkzRNAxMoMaam3+/T6+Vcv36dNE3I0gzn4ObrN+ll\nPbLNnLKcceH8RebzQ65dv8bT3/gmw8EW06LgYDqhLEuceH99HLBEK7Q4FD7ASKwJbi7aAqdZ4n3K\nnag17YQyFMnwuQit2Ii1TZFO55yvIUgHkLTjGbDWV9mNLsq6rnEBG7FJuw5SQtciHq/ddTfeUqeH\nJvIwjuXtGP1KlYLV1vCoPsZxbhlEx50YchiSJME6hTL+2ara0hvQZGP6UNyKyizea/l+zZxkkcEe\nETkDWWvJexnD0ZCTJ7cYDzKykM2aJprZfIZOEgZ5ymw+x1hD4nTzzLcLlV9Ft/OsdOmuYALQDmRL\nbXyAf4COrui6wT94qy80u04zbyKTuM2kS3VKPs6ZT2dtHL1z1FXJYV0xORSqcs7Ozg5ZPmQw3OCb\nTz/LqZNnOHnqFJcuXeTEiVN84xvXufrtq8znc/Z3r2GkDcdP09SDptoYyhrKkNUldeWoAhMsa8Nk\nMqEoCob9HIhAnEKiEzItzA4KjLPkeU6WZu0CDUwgTv5iNmd9fX1BJI394ZyHQItSR1VVYF2TWRhT\nraOUEG0ZXWYQr7UQQrzMFASMcKQN3d8vv18wBsehjHaghWy9FiOxWYDNQnWN2uIEVKJC9GBKWtsg\n9cB4PObU6dNsbGyws7vNK9df53DnYCExJ94nqkrxeZdxDmIq0qrnm89mHB4e+HqYbtj0pxKhP+jh\nRDEvS1544QUeeeAhUr16ad7KNtA9Z7nvbkV3BRNwzheHQMTXatcJSico5wIwYwDQjAPs/E5ujWnc\nKdGYokMIsZ8Gi1DgsFjiqp1A3oqs0wSdJsFQmDAc+clfFAWTacnBwQw1LZkcTtBqzsZmwvpmipWS\nq6+/zlPPvcwzL16jTgdMijk2hHn2BmP2DvzEGgyGFIVP/3XOMreW2lpcqCqc5cLBwTaTyYjxMCXr\n9yjnJUmiyDJBS02/32syEI0FQYWSZmEyBoCLtK9BJUQNSSTEHhCkJVWDylB1jegSW9WAxqHRxqKU\npq4NIp5BSGA4VV0Hl5bHGzSuXogEtMHlGa2V1prGoOtE+fBl59Uf6cwBoEGAiseWxfEuI1FKkWZg\nbZukowKyr0oVpoJaYjZnQpoKzjiUWLSyiLH0spTRcMhoOOTsfeeZTiv29yfUVUnS61HXHnl5PBoz\nm81xzsfsV2WF1cZLsTicNYj24K9lWeJBYIP6J0KeOJSpmR7sc2J9hIjGicYAdWkQ5QvGjAcDkk58\nQpyr3dfG5qAC0nUTJxnmeVxLnWNyi33wrmACE
KX1rn7T9RL4E2LwTPMDYeG9dK4RDx5nWFxFzXHl\nEYS1TnCupCwrDg4Oubm9gwjkWUqeJ5w+fYKNrQ0OJnNefe11bu7uczApQDRWCaARvNEqTVOfOKMV\nNtHoRIeCEiCifIBKeMZ5MaUsZwFtJkCv2+DJEC/GK9sm30SEZRFpqtIQF3xAJHYhzLZrR9FIqMTj\ncxosCudA25gEE8ZBfE5HtCFAm+cONL54P1Z+cVsJx6S5Suhjj7LTmeIrx2EVA+jSoqTQ2ndUFAiU\neACPIKWLZ5UEpAPAB/YoJR5Jqt9nY2ODQX/gkYLBl2EzbQkx731RJDqhdKVfWNLaG2LxVGt9ARiC\nXSbRGi0GcYa6LEm0Zj737uO1yBBChuBoNFxpND1e7+9K0NK8tL+NzOB4eiMYg7+CRxV+zTn32NJ3\nfwP4b4BTzrkb4lv53wGfAabAX3bOPX67ezQjuCBOumMb3jAJWRS/XOcaUYNY1XGrGEMEHG2NYS08\n9nw+5+bNG1y9ehWcYzgYsL6+wZUrDzMYjvjTr36Na9dvUMzmaJ1wcDghy3pepEQxnU3Z3NhorM29\nXh/w0FzW+PiEynhXJWKpgogedXW/y3rRNtEWnajGLRpF1K6xrCsmH+9JcQu6fey/WBm5KWahlMdD\nzDKgZTzLWXWrRPjY18eJsN22HRcstCwFxGs452shdIuhdiP8CG1MksQzUH+TZn40KkyIHBwOh2ys\nrzMY9EkSjbUJ0+kEEUWWZsznfmyTJG37TGJlJ320bL2pcXiAlyzrIbZqnqfX67G9fZPalozXHqHX\nyynLAuqKtbU16Kgcx21Yjdp7tBbqm6a3WoYMEbkI/BjwUufwp/EIww8D3wf83fD6hsi5GOIZjXsL\n366eQLIoHUTW4ZzH7l91jyMdKx7yqa4tbc429Ho5KsBBTadnqOuKsqgQp3EkbGycIM1ysqzHzZs7\nFPMSrRS9rM/BwQFKEl/6SxSHB1NfeyDLAMfenof3Xl/b9J4JcaRJwon1EYN+zvp4yNbGGrUAaKzz\nOH+Z0qQqSBgdprlsPe4u0tZy3jEaukV4cq01BNdir5d7WPWyatxjtq6a9mdZ1kCzRUPiAmBpx81o\nnUPc0WSk5Z3Nq3lhMUfxVmRBigDBOIcK80NEmuIhiJAqTaq9d8fiP5NmVHXAQYwSk+owlbBTb4zX\nsKXl4YceQqUJX/3K1zzTTTR5nlEUvjpTVZWUZeltPHhIdWurBbwEESFNQkUl8RmyG+tjHn30IT71\nqU9x6vQJnnrq6+xs7/g0cusZvdKavb191gYefCaWS7/FiokdcYtzbk9vqQxZoP8WDzv+251jnwV+\n1fmR/oKIbIjIWefctTfSmMawF818tzDoNbt9oOgW6l7NLXsYbnGtaCBbFjO11gwGA86ePUuv12N3\nZ5+93UPe/8EPMxyO2N3fZz6d08tzhoOBN3CaKdMglyoRX3tO60bXq6qatbU1dHADVVVFlueMeikn\ntsaYKga1KLAOq2IwjwIVk4dW77BxEoq0iMqrdMqubh3PryIoSpLQ6/UwoVxXWZbUwsKE9PPbuzjF\nOSrAigRfObG8p49PIDmSe7Cq7V01oLlH51wXI/w61+jG9HdzI+Jn/5y1L/wqXs1JlCZLUmalacT4\niA+xsbHO1sEmVV3S7/cbpjcY9JnPPQMoy8obYIlz1j+Hl9wqjKlIlI9uVVrRzzNOntzk9OmTbGyu\nebdzqslsymAwQGtBa594berapx13xmrZFnA7eqPnRXqrdQc+C1x1zn1laXGdB17ufI5lyN4QE2go\nrP/IFBa+OmbC+GPQ2gF8llY859iJtmRsakXP9rdpmrKxsUGv16PXG5Jme1y6dInZbO6BJazjgfsv\n8dLLLzOfzUmUdwE6205m1eimDi3eN61CfL8W4eTmJie3xpzdGGAqX/hnFrHMAAAgAElEQVTTi6vg\nQmSkVtrvr
HYRpmuVZT7u7kefa7X7LVLjm048vFnsG2fqNm4fn3rrwoKL9+/q+hLuoZxQs8igliMG\nu9RlAHFQWylhkYHoznVUeF5/zEsPWmmstihlF9qllSIJDC3RmizNyPOcPM8ZDAY+2UcJaRaiMSVi\nBlTh/qHdylsafDMVzgXkJAEtoUisEvI05eTJE5w4uUWe+/iAPM9QSkL+R8gncZbBYIgv3HS8LeRW\ntOr72/3mTTMBERkA/wVeFXjLJEtlyCAMfOjkxWdvswRb10wrCUj7pjHeOGcXILRvZQwUEXTqgzXi\nBPVhtVNi/bs8z3xdgrVNLl7U7O/vk6Yp/X7OcNjnJ37iJ/mf//E/4bnnnvN6X6qpKh90NKtK8jxH\nsoxMK/qDPnt7e9R17d18ieZ9j72Xdz38IENKvv3UV5jNJhTzKSQDUD6aME0Vtp4H8dE1xVi7jKzr\nBlxmALccDyDv9UL5MktVF9Qd8VZ3FryxtglyioxmObKta3dIVHssvh5nQ2hSw5bbLD5d2KdGN9PC\ne4PiMzoCqnMozJkk4BxG+zljrGvM5HGc8zxnNBzSz3uU/ZLetEe/32dzc9MXoS0KnCuILkLwOn1Z\nlqgk8wk/1iBKo7UFpxlkA9IkQYlDC+Sp4tKlC5w7fwaH9XUtNjYoioKynLO+for5fE5Z1gwGQ18M\ndsXcPdK/sRvewsLv0luRBK4ADwJRCrgAPC4iH+ctlyF72GsQ0lahEWkzBdvn6YpIrnutzoRpJ9Gt\naOE3nQUTX6N/vOuOibtAmiWsrY8ZDoeIgvHagFdeeZkPf+gxHrz/HK9ef5Wnn36amzv7lLXlfe97\nP6+8cpV+f8DW1hZ//Md/TJ5r+v0MEeFd73oXP/bnf4gPPPYuZjev86+uv0Avy9BaobOMCo1SSQh2\nMdS1BStHADoj6k83pDeCVS7Tqt24rip0NzFoyeYQf6e1DtKI8kZwZbFiGtekSKyM5BrJpRt9t2CY\nW7ZnLHkLum1fziBcjumP12ra2PQLmMxRzufUVY0pfV6FrWvGoxEnT55kOBxSVYZhMWdzc4OPfOTD\nPPXUU1jrGA5HXP32K40Uk6ZZwHDw89AjQAWY9qoi6+X085RBv8d4NOTBBy5x8cI5Tp7cYjDo4Zxl\nbW1EWWYcHh76DSL0TVEUZDrBLQUJvZ30ppmAc+5rwOn4WUReBD4avAOfA/6qiPw63iC490bsAdEX\nIITF1nV5LXC/oBf7T7E9tA5GWh7Qtm9Rr1zBIUUC5Le1zSTUWpFliZ/EtCKsdYaqmpOnGc4Zer2M\ny5cvMRhkflBT4eBgh9Gox3Q2pZ7M0cpy5fIlX3769WsINUokFEWxDPop6+Mha+MR9Z7P/quqkqos\nSXqCEu1tDdaFnbddLMtgH5FZrQr7XTbEdRmBtZaD/f1GB/ZiatYYvCoWDZCuqnFq0SqvRLwxr7PQ\nhUU7wHKAzbK9ossElplXdz7E51tlOOsyAW+odKSpRebzALVmCPJ38PSs+6IzIYcky1K2tjbp9/tU\nVRWyM4sghXrpK00TSmsxBnRTRdvnKijl1b31tTFnz9zHIw9dYX1jjFIwm005PDyk3+8zGPQxxjKb\nzYILOaWqalLVSl3dvjsyjs75GJDvkN5SGTLn3HEViH4H7x58Fu8i/PfeTGNEfGxAt4Cmp0UxqPvq\nFcBuTMDixOr+HauHEpmAa9yKWikSmywY0Yz1oaa1hUG/x8HBHvP5nOGwz2S6z2x2wLyYMp0eMBzm\nZHsau18xOdzn/CMPc+Om5Vsv7nJia71JARYRLpw/w3g8RItQFQXz2ZzZdMp8Pqe/5nVYBIw1KB3g\n1zpMID5P9FN3+26V6+24PpnNI5io9vcJ11NKwRLgiOvq4+oopHgzRu5onHu8X9dQ2HzfGZpVEsxy\n29sybIuIQ12vRZI40rSDQWBdkzbd7w8Yj8ah/oRqDIWj0YjxeMTh4cQXUql
LIIKP+OAnZ/yccC5p\nCq9opRj0+x6+fnODM/ed5ty5c/TyFGNrptOKg4M9er08eIqEw8PDUFJeYW29MOduh8SMO+od+K4b\nBt3qMmTd7x/ovHfAX3lTLeiQSCc9eEknXDaUdO0A/pnD4nfBSSiL0YHxdRkwQkQW4LWX7xur8c7n\nc2azGQ4hzX2hkj/8wh/w7DPPcHCwz4vPv8DDDz3ExYsX+cAHH2N/f5+bO7tMZwc89/w30QlsbW3x\nA5/4fi5dutTkKBhj+IVf+AXm0zk3rl/jtVdf5bXXXkMDGxtbjE+cIcsGWCxF5UFM8iRvsBMjtFhc\niNE3Ho1Z0eMRqbtAIkW33kbYEbXWlNWcqiwb/ttdvNb4IBtx7Q61irGsGt9uO7qRfw3dYgJ3xzHa\nQyITiKpQtw+cC9WhdRvIE38fPw8GA8bjka+xEOI2rPUh2ZcvX+aVV67x0ksvN+5REYWpbUCA8s0t\nC48D2ctTNtY3eO973sXJzTXWR0PWxyMmh74AbZp7F+yp0yd9tqnzkt14PA7xB8EmRZu3AYtIx292\ngb8RumsiBpUkgM/i8gMkGOtFpdF4HYdpQmVFFCIRe6BjP3BAg2JLwyHtkrsxWhuaSYvDxqQRp7DG\nu7aUJPTyhDwfMBpa6rqiKEom0zlPfu1Jzp4+w8Vz5xiPRtRlSTGfMp1OuHHjBoNU0Us162tr/NzP\n/Sy7uzfI84ytrRNcvHCaqqpRSvvCImWJm0zQ0znaauY24frOhPSV13ngEcFWUxKt6aWa+WyGcYIN\n6bPRyyDBYBaRhETAGl+/fiGlNVQbAtqFpDVpkjCTWEnI+9mdaBwWcdaHc1sQcSitUEFq6qbaikiT\nlFRVPjgm0ZpcUu9F6UTXGwGlEnSmsLZTHDUOWzR8uVZVJFw/ulrFeJeaCUFAmU7IUr+YNILF0UtS\nEMecAiMlvV4CSlEWhnHeZ20wYDToU4tjZmZg5gxtTZYo0pOnKKcFV799DaUSjAMljiQTamMZivfu\nlMWEvJdw3+YGp09v4uoJWTZitJZx4vQ6Gxtj1sYD8twHGqVJ2qi8xIrRpsY6E9qtkCQl0b5o7YJv\no2MQJBStaWnZhtD9fDzzuEuYQBTt2yAQfOxjk2wTT4jW+lbr7z6ca6/XCRxacadFUXOF4dGDlEqj\n62oVYLt0SpLmlGVJvrFOP88ZDgeYsmB7+wam9jpkr99jNBgyHo5ZG6+R5xpoK9X4Ip01WqVY46jL\nkqoofLk0B/OyYjKbN9KNdy2qYL0PVhTnOrH2HZReF7EFFvXKrjuvK/U0O2RwWcZsxtgP8ZpNN7kW\nT6BrQO1eb8EO4JQPI14xVC3zDnxa+aPN6Eo06LeBQyIedlvExyYoabMIG2mwY8tRTpAAdz6dVZja\nkKYJ58+fZ2N9nTzPwBlfVSkGqxlHnvtKQHne84xP+/Rvv3BrcN5BmCVCL0sYDXtsba5z8eI5Tp8+\nwfr6mLX1Ib1+Sp77QrdNX4d+XOj/TseIHHGRdfymtyO39HprukuYwPHUWKc7u8iC+L9w8sq3t70+\nHHXDLLtj4mve6zEaZ00egA7VYg9nU0woCJIkCaOh9wQcTA3b2zvcd+Yk1tZMpxNmsxn7+wdYA3WJ\nhyKbl75Mdcc7UZVlwBpox1/rRe5/q6y87vddJtA9d8G1F/AH29iAgH/oYgRgiJ0IasiyATLep2ug\nFBHEtmrDsmFvwfvgWnBYcdFQSHAPh1ySyJg8l1vIuY/PtGwfUFahRcizjN3dCUVpyPIRV979bk6e\nOkWaZhjjAV5rBwZFaQz9fkavP2A8HnsdPtWIOFxdkUiGMyAosiwlyzRr62NOnNzisfe9h/F4SJbq\nEAi06Mpd9mIsG0vfLLW/cUvH/rVjAsdb73Ht5FxhK/quUdf9tIq8uuD1TF/CXOOsoSjmlGVJFXIP\ntNb0BwM+/OGPcOnyjF/+5b/Dxz7+MR5
99GGuXLnMq9dvcv36a5w6dZrv+/gH2NudcrC7x+HhYaMn\nFnNvh/BGuVhFOOy2Ibmn62qL1LX6x9yH2PZVu3Q31NfRGhqtqbGhUIfgVShR2sN5kwKz5rqNdyAs\n/ghaEvuxwStYNsgu2X5EfCxA83141aoD7uGAwAQWystFRsIiY9dak0pKT3LSZOYrQaUwGG/wk5/9\nKTZPncIpzezwkN2dHYraovpjsrQmH61x+pzmfUnCxuaIrfURG+tDTm6uo12NpHlTYv6Zbz7F1tYG\nZ8+e4fSpTcbjEdZ5L8/m2hhTE0KMj3pqbjfv3hi99d/eFUygce2tXP9t+esukMitqJkM8fpLxr5V\n58bzuhNo2YDY2g9CJWLXGrbyPCfPMqbap5LOZjPG62ucOnWKH/mRH+GZZ7/JK69c5Qtf+AKz2Zxi\nXvJn/swn+Pl/5z/k4GBCVdZUdRVcczllUSz4+621wSCnSLQ0CURdUT9O+u4u61GRVrvbFhJeOqqF\ns9ZH+VkHtk1G8qXPaoypGQwGDRPR2sOcx/OMMb78m7VN4k43iqNbpzA+m1MKrC+82W3jEalCWglB\ndQRogcZdFlUGrbRXoRCoHWVRkWYZdWnY3ttjfXMDlSaUVRVUATBOUCohHw4oa1/BaevUSfb3bpCn\nmkGecnprHeoCm2QYhOl0ytmzpxmPR6yvDdnaWmc6OUApod9LQ5XiZOFZun3VHUevBqye17eau7c6\ndju6K5hApOXmOzrqgHO36JsIJCLNRY7rimWxf/n9Ee/DMdeIIb3RD57lvjx43AnLsqSqKvJU8a53\nvwtja3Z2YsKIY3PzBKdOnWG8vsnu7uFCsc4YuWhdBLCM5dVtw4i6to2u1XvZRah1uyMve0aAxfM7\n3hWnNCiDdR76TFzbz92iLqv6dVk09wbcdqJHPL6uGtCtlwg0KpC3x/jYkQU05O4YOdeoDN1nioFP\nChDrbSppoqmsAlNBonEimGg9iotQJyRZTh0Qf0b5iEG/T64tmdbkqSbJckoUpXVkuWY8HrK+NmI8\n9iHHWimUlhCeHFzPHQ/XKiluuR/fDEUVuXPkDV/n7mACjRVoKYw06qbHhJj6U+LODQuGwWPoVp3S\n3fG73HrxPn5C+lJfLUZgkmWkwb0W3VXT6QQrOZcvP8gjjzzM3t4u169f48brOzzyyLt57/s+SDoY\nUpW1r7vo2lyF6BbyjCRr7AAiPmjIYo/skF0RP7Y5uiLjhFuedMsgK9ZaUF6HdiEiEBsMjaJwYrHQ\ngLLCUejvKGXEtvi6ix1cwM7voioSvQNa6074uDSAm11m17gWRS0wtub6y9KDA+1g2OtjK0uuYGMw\nZl4bstSBUh7URWuU9eXAdJKgEk2WJvTzhP5gQEZFkmjqqmS8PqAuPGL0aDQg0bCxMWZzc52qKOj3\n+95+gPOuVI4a+o5zqTovuh07T7/bdHcwgSVa3oVtNAap1ioO7aL9btxvWRpYtbPFXTjm3FsTkWx9\nObCoz2dZRq/XY319jfHmJoilKErW1oecPPUeevmIBy5d5sSJ09TTQ6rKJ6boJKGYF4ioRg/e39/n\n1ImTfkcJ4BXWeZUkPv+yzz8uqK64Gc+NfvVVYqkNrjaWdFcvvnsrvFGqwTWMFDMVjwvjtbVppBhw\niA4iupcx/LXDe92JE2mAQePClljhd1FlWLWQuhJdohP6eQ+jcia7e5y+7xQf/6EfYW1jndo5itmM\n7d0dEp0y0AmS5GAN40EfnXhjoFYw6A+oTcHTzzzDxz/yAebzGUVdMzpxgvsvPOyZuIXNE5tMDg9R\naF8Ze14gqp1PyxWsj8y1O7f+gbuICQjBq9kV0+mKsHHh34JJduZB3E2ar24xSVYdj7/pWpohJKio\nAOktEdyixoWBVeE7rRQ6EdJMSNMe47WBd4MFT9l4Y43xxhpaNGVVhDDljIk7BFoYsIODA7Y2NkkT\njVJ
xd3XgWGjXsqFvlbFpFcZAN7Q45uI78VV6nA2wWc7iEB9IE+IMuqCl3Zj+bj+uAuuMfdRtRzwn\npvl2pRqttA+v9qZJH9jVGevIoLqqBdbhMD65x/kko0xn2PmMNMvY2Nrk8pUroBXlvGA2n3EQYvgr\nY6ltiaBIkz5a+yg+lKK2loO9fZ795tOcP3eW3rDP2toam5s+xNjUNbauOdg/9FJsiHtRSgeP9+KG\nsix1vh30Rq591zCBSAsmPde+vk19dEuD4YL7Kp4r7QTXKrjNrPGVgiWoBsGNp5SgA8Bl3svAgTGO\nurakqW6Qh4ytybRGguju7+fFx+l0irERPSiKt6vLZMVdJn72cf9HC4ssP3/7PAFlOOjcLhgjsR4Q\n0wQm4Kz1NRCWGORyf3XVg2XvxDLjaLIVO3pzI9Y3MQJer27MPy5Une4gCxvj0ZmsC9GnKqgIOkEB\n4/EaWydOsrm1xX5VU5masq6ZF/PgTrQorFdDPL8HhCRNQSzzsuTGzW0m84LR5gaj0Zg0zTxjUxqL\npa5MB8vA17rsYkAsb0DHqbpvhqSxrtPOn27MwS3ormMCnpZ24/jvdh215GFoTA3+Qp0vjnbKqo46\nwgDwRrHaViCCljg9aXzWWvvqw9ZaREGWahDDZHJAluX0egOyTFGXBeV8Sj5Y8yXK0wRnfVYhjbTj\nPBOoo+jfGrwIoaXCaiNnFDVjfo1zLsCnmSBNtYusCb81dkFP94u+jkivC9eOaonvk+hWtE20YFVV\nzbXyLMPYNiU5MoC4cKMaBSwwgWWjZ3esltviABc8FMvhtknQ8fNexflLD3Lu0gOIVswOC2rj0YGq\nqkKMQ+HItGY46lM7i1KQpCmj4RhxFVneJ+v1Ga6tsba+wXA4ZDqd0UvzcF9hMBiFIKsw7yyAAWlt\nFnFMYvHZ6ONo1IY4nRfU1CNTdMkCHvpk+fBt6C5hAs6nzZJTVyBkYcJbjPPwS872cDa95VX8RFkU\nMcObpdsFvUu8NdwXBq2b5JG4W3V97Gnq722cUONLWltncKYK0OEV1lRYW2KqCcO1EbmyJKZEdEaa\nDlE6J9Ej7r/8EHmaUxqYbN9EZ0JVCkYlZOsb2CRD0hylNTvbu7z26jWoT7C5sUEvz6lda+ATvLir\nlGdIuvO4CmFazZooOB/nYJrfpOIwdemxDq0lSTIv2UQMhV7Pu2WtawKJolXeuQqch02vpCTVCXWS\nUFcpVZqBQFGUFPO5z0UwZbh2QAJ2bbx/ExUq0rhBfaZeumDYjBiQ0WCoBMqqBJ3gRKgBl2rPuDD0\nxhnOVhinqcnQa5s8+vEf4NyVR9CDNaobM6ppST31jM4lHhcw2i4wFqU9CtFoNGYymZINT3L20mP8\nD3//f6GXCBfPneEv/MRnyPM1er2MNBNIoCimiDiSVGHrGiTHoTC4INmFRa1BVEDTct5LYdBEA7SL\nz6zEB6Y1G5PzlZLq1cv9zZgV7hImEKhhddLGDoQ4+BgxpkKgzHE795tigcHAIOL18W7dvm4SyoJE\ngEebxXoALVEBiDPxVWzyENtu4o5YVugswRsE/KaQZT3yvIcxlmLuA41S65qB94ujxlYVs9msDcsN\nXaJVQAe2FuWabgpGy059AZFOf4TdJu7AEDLg/HUj0k7jsnPeAIp1vohJbRom4E/0uQFxIUdJoLFH\ntDJSa6fAFwWJ91lwUyJB9DZN38fXrsrRJEuJRxOuTSgPZn0eAbFeLYpyPvcIT4BTmvvOX2C4toZK\nEqraeFWsbmHlXVKTKQ8FV5QFETUoJhoVZYXWCWmWc3A45bAuqcuK3/u9PyBJEs6fP+NLjZsqsGND\nbbxqgZNmDntVxnu9rPNQ0o30owQTDc4SEKLBv3c0aNIi+HDsW074N8YK7i4m4Dqb9pLhpPsXmUDL\nCDo2A7coEr8RUiK+3kD4XZxoXYmi2XmV8rDWJoh3AqI1RikfRqw1W
iWN7l/XFknwMfHWYQImQJLl\nSNgFy7L0begswKqqqOqaaT5t3JHRXRpBO0TEFx5g0Yi57K6Ly7H7PMYYTMcSn+iEuuP/90wgeCFq\nQ1VWIZgnnlN7Y65tC3Y2KoRtAVqWmejCYNOJU4haFbLABJbVgoVjQbx2CLG2Y8xAUkA1L9G9nlfs\nlebipUv0h0Pf+pDwZExNWRSekdkanWqs+DHJ0qzZePr9PvsHk4U+rKqane09vvT4lzl79qwPEjtz\nml5PszYeICisrUAnjZswtr3JU7AWCanwIiEyM9TLEOdzRlxEQ3KEeg0On1Txhqb3bemuYQLLuo+K\nxiqtG104vnYlgYWFHszG8VJvlAlEL0Q32i5mwXX90L4AZoVzVeDuDuMszlQU8xnz6Zz5rMAYS57k\naJ0hOifJ+6BzrFMUpUFJijiPbjydFZRF7SdgyNlXWmONoZhO2baGw8NDJsMhvSTxEyPrRYsVwIJr\nbtm7obSCUPVI4RNuAJyxVK5q9PRUJ01wjgvSjXMGY8GESdi6Hi1K2hDubr5D9PnH82MuRdTPVbAJ\naKGJUIxAMs5Zynq+wIC7hs4j4ykKnWRYaxAFm2tjH4xlTAjkUqQqwyY5bjDkgx/9KGYwpHCKelah\nBObzGdPJIc4YdOpz+qP9JEtbxB8PKe6lNA855nBOUTvH4WTOP/2N3wJnGA37fPozP8Zf+MyPs7Y2\noqoKH+5ta4S66YcYbRrzCvy8jQzVzylvjvFMX6sEVIISwZro1q0hOT4a9o3SXcMEoGtNVgsGFBF8\nVV4lzWf/+tYfvutCc85SVmWj90dXGyxh2Yffeaw7HWrOWxTCoL+Gs8JsVlLVgsNS1FCjcJKiVU5d\nW6rK0huOqIqS2WzOfD739oV5ia1rkiRhY22N4nACxlBVJfPpzIOM5Dnj0SjGt63otxZqPDIvFZTP\n+AuFgCiMBDE/iPEKobZt9SBTm1blESFLUzKlG911XrS7YnQ1RrdhjJqMcQnd3AQV+72jqXhVwucp\n5J3Iy64BMVIcm6qumc6mZDoJMROWfJAxMx46TCtNrz+iKGtUmjLePEl/Y5O9wqeD15WvBjQ7PKQs\nCm8LSjwTiM/URWkqyxLnfLHY69evh6CqBESD1kwnBwwGPQyaf/ab/ztf+OKf8LGPfZQ/9yM/zKkT\nW9TVxKtyOiFJhLpWGFf7fo6ZseINvv3cw9IbYzEVxGzFGEvhbTvebmGOpA+/ebprmECwdRwr/keX\nmT93tc8/hLUdsaIed37LCFj43D1v2f/eLLQ4aFG8C9VsLQk67YMoKgP7hzO+fX2bvYMZVW3J8j4/\n8Il/AxXsF+JUgxTkCEkxITBHoKkXWIUaAMQ2Nu1anXzS+p+jq6+TILQc9hv0f4m+eLwtIBYptdbX\nOWycdeF5uwaY5azA5Ug+8O0wwRMgWi3s+DF9eTnibxUDaBicisVFfKUgUwf8BKXRSUrtBKtS1jdP\ncP+Vh0n6A1x14CVp7SHD4m+SJPGeAqVAswBCAjRFYOq6Znt7O3itBOPAVQad5BgrzOYV02nBa69t\n88QT36A2jtOnTnF6c8TG2pjNzU22TmyidIaohMoVmE41aERjat+3vqy6DqX2Ih6EC/YFvzE63qQd\nbAXdHUwgcoCou3bKZUVLaGQFfiHqYNh3DZNoF+/tpYPlxQ6t22k5mKM7qWN2nDcYgtJJg0q0d3DA\nwcGEg8M5hfFeB7M/Y+eg4Kmnn+WFl74NornvvnNMJjMGeQ4olPMqj9MaZxTibKjx57P46qqmKivK\nyuci+F27ZWTLUtERRkZnpw0WeaLhLi6yoP/rJKTkhMVqah8YZK1FnFuI5jNmMSuwuzibnd8tRjR2\nXbXed6+bxV6ZOjCi48XbeO3oiktTD8bqEo1xhrL0ocwqTVBpxqwy9IfrnDx3gYfe+xikGVYpn4Go\nhLKYY02FiGcC82KGShSStCCuM
R6hGxW5s7Pj2+G0r1tpTQAqLZgXBXmvj6iM5194ia9/45tsbW7y\nwfc+yqWL5zl/4TxWhF7eQ0IbjDUkOvGI0lmCK32sgyABbUgHZ0Uo7IJXjTW6u2zeMt0dTID2WZrd\nvzOpXfgy+tCjOHx0ssibd5IS7A86aeCp4p8XAT0TyDKPTW+t8yKmSkjzAVVVc/PmNv/bP/88zz/3\nIq+/frNJEopuyMo4st6Qhx56hPc+9gG2TpxgPp0xn5dNyTJdV9TWUMym7O7ssLe7y/TgADufUxVl\nUw0oqg8qDc+5tFPG17j48twj/RTaFxEp50WzQBWCCSm+ta7oD0feqBZ32NRS41GInXVe/XEOrCXR\nSeNdWGX4a+L7YzJPDFiINog0RQePioiQBpE+lZYxdCWBeI9oePS8S3F4sEOmhSzRFLYiH/WonaJw\njkMH3/eJT3D/I+9m4/RZbmzvcTidUpQltirZ29lmcnhAXVWkgxytyyP37j5XWZRMJhMODg68/YgY\nPZozK+Y+BgOhqi03dva88C5eGvy9P/wiSvC4ghJdo4YsS7hy5SGuXLnMgw8+yJUrVzh73xYK57Mb\nK0cIOAQruLaABfa7oArAXcQEgGahqwYurCNWOn/CrRhfc3zFAN6Komut67LqiqTRQOkxBhUGzZ8+\n/iWeffZ5Xn/9Jnt7B7xy9dUgr2gkHTKbTMD5Crl5b0ia9bjvzDne9/4PUZYeAbc2jnlZsbk2pCyn\nXr3IctIk9X9pSjGft+g9xmKqGqrKGwjFR9j5R17slS4j8zUU6vZZo6FTKVSwgyilmEwmTbx+fO4k\nhEA746MHnfUhxWifIrtsSIW2cGm8RhML0FnIrqxIgv0gwsk5p1pppRnKRQ9NV7rQkjIN1Zy1r6JG\nZQ2HVcnUCPnmaU5cup/1M2dQWQ9Xl6gsR1nDfFJiTOVDvoM9I896DUaBtRZ0KwXUdc21a9d49dVX\nfYh1mgI+8QiFZ4rGS0jGQhpqN+Is86JCh7wPV3u3siXFoplXjudfeoXrN3b50689xXAwRFnjc0/W\nxjz6yBUefPABtrY2PBit0lhXY2xNXVXkSdZxxr41uiuYQCMFhJ3cj3so8xR3GsLxVkhYcR3XGEze\n8L1djBVYVAG6O1lcIFevXmXvYMreZM5T33ia555/gZ3tPYqiZi6vcjsAAAbbSURBVF7WZGkfrVMP\nQCkZUINyJGmOKMVobY2L99+PNY4kgEp6i7sPNTXGkCZJU+ikLlLm0V0YJIGqLBFrvU0h9l1noXQL\nrnRzA9r0Wu3DbO3RXcRX4PW8qIEx66hHWqkGUrx2xYLYHxf9qh00/n5xQZuwaEwrenvZe4EJrMo/\niG5Pay2J0g3kl3MOg8EqQac5Z+6/yHBrE93vY0VCFiTUxjCfz3HONqqoc24hXdmrnb6DYxTk6zde\n5+bNm00bfIiPLyvvD3qPByH4yU/jwFCcxtlg93EQGYjDMJmWTGclsr3nmWFpyLOM8XjIbFZwY3uX\njY2xT1de93/DYZ/BoNfO4e+A7gom0JDE/3zvL4qarnPScpxAIBf/e3OcUcLgdVNVuxPRWstrr73G\n448/zlPffI6nnvsWaZpSFgZQJGmPrJdR10IxrygrQ78/REuNEkuaZtTWMRwMOXfhPEADQmKqmvms\nYDad4sqC3trYY9L3B5iiZM+5Bum4mBcURUG6woAJre7fXfTQuqEEmqAfYxcLgAghIi0YJF0IpY3B\nKTGLMPrN55O5R9w1bfXmyDC7n6GDPtQxtFnroItPqJT3ero27iG+LruDY05EMZ+TpSm40u+MxlBr\nIcn7DDa2ePcH3k9vYw2rPUKwE2/YK4o5hweHwcjq54x1FosL2IURpCUg/4aAolevv8qNGzca9aaq\nPS6hqBDQFOYSWtrMUK29Ac8JxgkmgrVGb5fEIC2aceklfYxxbO8csPP413j8y19GKUuSClceup8
P\nvv8xHnnkCo8++hB29p2rBPKdcpHvBonI68AEuPFOtwU4yb12dOleOxbpX+d2XHLOnVo+eFcwAQAR\n+RPn3EfvteNeO+6148624/jyJvfoHt2j7wm6xwTu0T36Hqe7iQn8T+90AwLda8ci3WvHIv3/rh13\njU3gHt2je/TO0N0kCdyje3SP3gF6x5mAiHxKRJ4WkWdF5Bfv4H0visj/JSJfF5EnReQ/Dsf/pohc\nFZEvh7/P3IG2vCgiXwv3+5NwbEtE/oWIPBNeN9/mNjzaeeYvi8i+iPy1O9EfIvIrIvKaiDzRObby\n+cXTfx/my1dF5MNvczv+axH5RrjXb4nIRjj+gIjMOv3y997mdhw7DiLyn4f+eFpEfvxN37AbkHOn\n/wANPAdcBjLgK8B77tC9zwIfDu/HwDeB9wB/E/hP73A/vAicXDr2XwG/GN7/IvC37/C4XAcu3Yn+\nAP4s8GHgids9P/AZ4P/AR4R9P/DHb3M7fgxIwvu/3WnHA93z7kB/rByHMGe/AuTAg2E96Tdzv3da\nEvg48Kxz7nnnXAn8OvDZO3Fj59w159zj4f0B8BRw/k7c+w3SZ4F/GN7/Q+Av3sF7/zngOefct+7E\nzZxzvwdsLx0+7vk/C/yq8/QFYENEzr5d7XDOfd45F+vBfQG48N2415ttxy3os8CvO+cK59wLwLP4\ndfWG6Z1mAueBlzufv807sBBF5AHgQ8Afh0N/NYh/v/J2i+GBHPB5EfmSiPxH4dh9zrlr4f114L47\n0I5IPwv8k87nO90fcPzzv5Nz5t/HSyGRHhSRPxWR/1tEfvAO3H/VOHzH/fFOM4F3nERkBPwz4K85\n5/aBvwtcAT4IXAP+zh1oxiedcx8GPg38FRH5s90vnZf77ogbR0Qy4CeB3wiH3on+WKA7+fzHkYj8\nElADvxYOXQPud859CPhPgH8sImtvYxPetnF4p5nAVeBi5/OFcOyOkIikeAbwa8653wRwzr3qnDPO\nOQv8fd6kaPVWyDl3Nby+BvxWuOerUcwNr6+93e0I9Gngcefcq6FNd7w/Ah33/Hd8zojIXwZ+Avi3\nA0MiiN83w/sv4XXxR96uNtxiHL7j/ninmcD/CzwsIg+GHehngc/diRuLT0n7B8BTzrlf7hzv6pc/\nBTyx/NvvcjuGIjKO7/GGqCfw/fDz4bSfB3777WxHh36Ojipwp/ujQ8c9/+eAfzd4Cb4f2OuoDd91\nEpFPAf8Z8JPOuWnn+CkRnwIoIpeBh4Hn38Z2HDcOnwN+VkRyEXkwtOOLb+rib4d1801aQj+Dt8w/\nB/zSHbzvJ/Ei5leBL4e/zwD/CPhaOP454Ozb3I7LeOvuV4AnYx8AJ4B/CTwD/J/A1h3okyFwE1jv\nHHvb+wPPdK4BFV6n/Q+Oe368V+B/DPPla8BH3+Z2PIvXueMc+Xvh3J8O4/Vl4HHg33yb23HsOAC/\nFPrjaeDTb/Z+9yIG79E9+h6nd1oduEf36B69w3SPCdyje/Q9TveYwD26R9/jdI8J3KN79D1O95jA\nPbpH3+N0jwnco3v0PU73mMA9ukff43SPCdyje/Q9Tv8f1kffAvG/GgAAAAAASUVORK5CYII=\n", + "text/plain": [ + "
"
+ ]
+ },
+ "metadata": {
+ "tags": []
+ }
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "ALvS7mzE7WPw",
+ "colab_type": "text"
+ },
+ "source": [
+ "### Now let's talk about the PyTorch ``Dataset`` class"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "pISn2IKj0G6y",
+ "colab_type": "text"
+ },
+ "source": [
+ "``torch.utils.data.Dataset`` is an abstract class representing a dataset.\n",
+ "Your custom dataset should inherit ``Dataset`` and override the following methods:\n",
+ "\n",
+ "- ``__len__`` so that ``len(dataset)`` returns the size of the dataset.\n",
+ "- ``__getitem__`` to support indexing such that ``dataset[i]`` can be used to get the ``i``-th sample.\n",
+ "\n",
+ "Let's create a dataset class for our face landmarks dataset. We will read the CSV file in ``__init__`` but leave the reading of images to ``__getitem__``. This is memory-efficient because not all of the images are stored in memory at once; instead, they are read as required.\n",
+ "\n",
+ "Each sample of our dataset will be a dict of the form ``{'image': image, 'landmarks': landmarks}``. Our dataset will take an optional argument ``transform`` so that any required processing can be applied on the sample. 
We will see the usefulness of ``transform`` in another recipe.\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "sJ76LdsU0K_c",
+ "colab_type": "code",
+ "colab": {}
+ },
+ "source": [
+ "class FaceLandmarksDataset(Dataset):\n",
+ "    \"\"\"Face Landmarks dataset.\"\"\"\n",
+ "\n",
+ "    def __init__(self, csv_file, root_dir, transform=None):\n",
+ "        \"\"\"\n",
+ "        Args:\n",
+ "            csv_file (string): Path to the csv file with annotations.\n",
+ "            root_dir (string): Directory with all the images.\n",
+ "            transform (callable, optional): Optional transform to be applied\n",
+ "                on a sample.\n",
+ "        \"\"\"\n",
+ "        self.landmarks_frame = pd.read_csv(csv_file)\n",
+ "        self.root_dir = root_dir\n",
+ "        self.transform = transform\n",
+ "\n",
+ "    def __len__(self):\n",
+ "        return len(self.landmarks_frame)\n",
+ "\n",
+ "    def __getitem__(self, idx):\n",
+ "        if torch.is_tensor(idx):\n",
+ "            idx = idx.tolist()\n",
+ "\n",
+ "        img_name = os.path.join(self.root_dir,\n",
+ "                                self.landmarks_frame.iloc[idx, 0])\n",
+ "        image = io.imread(img_name)\n",
+ "        landmarks = self.landmarks_frame.iloc[idx, 1:]\n",
+ "        landmarks = np.array([landmarks])\n",
+ "        landmarks = landmarks.astype('float').reshape(-1, 2)\n",
+ "        sample = {'image': image, 'landmarks': landmarks}\n",
+ "\n",
+ "        if self.transform:\n",
+ "            sample = self.transform(sample)\n",
+ "\n",
+ "        return sample"
+ ],
+ "execution_count": 0,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MtKUWMoG7_VT",
+ "colab_type": "text"
+ },
+ "source": [
+ "### Iterating through data samples"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "53V7WwaW0OYV",
+ "colab_type": "text"
+ },
+ "source": [
+ "Next, let's instantiate this class and iterate through the data samples. We will print the sizes of the first 4 samples and show their landmarks."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "metadata": {
+ "id": "0Sh9OETx0R8N",
+ "colab_type": "code",
+ "outputId": "3d2a0ba8-63b3-4105-99c0-32572b853988",
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 634
+ }
+ },
+ "source": [
+ "face_dataset = FaceLandmarksDataset(csv_file='faces/face_landmarks.csv',\n",
+ "                                    root_dir='faces/')\n",
+ "\n",
+ "fig = plt.figure()\n",
+ "\n",
+ "for i in range(len(face_dataset)):\n",
+ "    sample = face_dataset[i]\n",
+ "\n",
+ "    print(i, sample['image'].shape, sample['landmarks'].shape)\n",
+ "\n",
+ "    ax = plt.subplot(1, 4, i + 1)\n",
+ "    plt.tight_layout()\n",
+ "    ax.set_title('Sample #{}'.format(i))\n",
+ "    ax.axis('off')\n",
+ "    show_landmarks(**sample)\n",
+ "\n",
+ "    if i == 3:\n",
+ "        plt.show()\n",
+ "        break"
+ ],
+ "execution_count": 15,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "text": [
+ "0 (324, 215, 3) (68, 2)\n"
+ ],
+ "name": "stdout"
+ },
+ {
+ "output_type": "display_data",
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGEAAACbCAYAAAByD/XcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy9ebRlR3kf+vuqag/nnHu7W91qtdST\nJCQhBJKQxChGG0wcz8Y2fomNg4EXO7HJMys2eV7JSl5I4qwkz3FI7MQ2NjG2wZjYgYAtDBaDQSAQ\nGgDNQi3UUkvdUqu773SmvXcN74+vqnbtc2/fe7uv8pC9VFpXt+85Z++zq776pt83FDnn8Oz49g7x\n
f\nd8496Zx7HMBNAG5xzn3NOTcF8FE/+XS82zk3cs7dBeD3APzdNR7tzQA+4Zz7hHPOOuduBHAbmChr\nzeP3nHM7ANwO4OUArgZwN4BtzrkdzrmHAcwBWJ65dAnA/AbLtOHYsmJ2zt3nnPtp59x+MDvvBfAe\nAPCi5Y+J6HEiWgbwAQDnztziyeTfkzX+npv5/JHk34/475sdFwJ4kxdFi0S0CN4sF8x+0CvbRSJa\nAvAKAH8F4AEAlwNYIKJ3+o8OwZyfjm1gbtnSeFpNVOfc/QDeDyYGAPxbMJdc5ZzbBt6htMWvOZD8\n+yCAo2t85giAP/S7OPwMnHP/bo1nPuW54GcB/K7/9ycB/IC/7j3+o98EoIjosuTyFwLYklIGtkgE\nInoeEf0iEe33fx8Ai4ev+I/Mg3fQEhHtA/CurXyfH/+ciPpE9AKwLP/wGp/5AIAfIKLvJiJJRCUR\nfUd4ztOMF6FVxNeCRVMczrkRgI8A+FdENCCiVwL4IQB/uNUJbZUTVgC8DMAtRDQCL/7dAH7Rv/9u\nANeBZecN4ElsdXwewCEAnwHwq865v5z9gHPuCHiB/ilYeR8Bb4D15vsiAHcQ0S4Axjm3sMZnfg5A\nD2xAfAjAP9yqeQoA9NclqENEFwF4GEDmnNPf3qd5esffGNjir/N4lgjPgPHXRhz9TR7PcsIzYDxL\nhGfAWBdFff2rXrpKVhERiNjfcs7B8YsAmKJEBCEIUkr0igK9XoF+rwelJIy10FpjOq0xrWrUdQ1n\nHaQUKIoCea4ghEDTaEwmU4wmY9R1g7rRsM7BwQGOvyOTEkVRYNDvYW7Qw9xggMFggH5ZIssVpCQQ\nAEcO2lhMJzWWl4dYWlrGcDxBVdcwxsI4BzgXPUgHwDoX57jW/MPcw3DOwVq76rV0fPGWr53WSV2X\nCLMPstbflLxOxD9SChR5hjzPUBQFlFIQgmD9w2qj0TQNtNZ+Ufl+QggopWCthZASQrSMaq2F898h\nQHDOQWuNyXQKaw3qusG0qjDp99ArS5RlgTLPoJREoSRkT8BZ/n7rF6mGhjOG75vOa71FmRlhsdO1\nOVM9e0ZEON17gTsE8UJmmULpd2mvV0IpCecAYy0cXFwMYy3IAYAEwNcyITIopSGlBJGIEwtTc8Tf\naQFoY4Faw1iHRmtUVY2qX6NflzD9Pvr9EkWeoSgyCEEAAdZZWGPh3HTVLiasnnf6nn+hM/e1CBHf\nX2+B/dh0UOe07AmCAPFvIZBlEmWRo98vMRj0UeQZBBEaYyCIP0cUno5g4cB7nCcihECmHDIloSSL\nJxCBpUZXJCnPKUQAnIXWBhM7hTEWTaNhjAUASCHQK3OosgSIYK2D0QbGaBir4Vx3bjI+DXjBZ4mS\nvj+zPi4RZZvlqLPihA4XwHOBEJBSIMsy9HpMgEG/hzxTLIaqGtKLGCFE5B7eiQ7GWFhrATgIIaGU\nglKSr+FVhrEOzloQWThHUTcQUUswANpoTCsXRaNSEplSKEuFflnCGoemrlE3NRqt4ayGsTYuYmcB\nZ3Z6XOQZkZO+l35my5wg/A5kBbc2MQL78g5WKIscvV4Pg16JXllACoI2JhJAShl/iAwAlvfGaBhj\nYK2FFIKVu5LIMglZ87VkDIs05/waOEgv/kCs4JlzePNqYzCtKuRT1k9KZciURL9XoJrre+OAOcaB\nNwN5DnVriJzw2loKeC09sBaxzpgIfCP/G6tZMBKCACkJSkmURYF+r0SvLJHnKj6gFOS5wBPDXxdG\nUNrWWpBgeSWJCVvkueeShj9rmXjGOVRNAxJ8LyE853juEX7x6rrBZFohzxSkLCCVRK9Xot/vYTKd\noqor1k9kwQyx9sLxW4lumuWG9HcgXLqIpxnrE2HWGkq+PIgT1gVskuZ5jqIoUHpzU0oZFzWKIHQt\nCimFFxksTpxDVJRSCuRZDoDFSp430I2G9hyjLSvY6bSCMQZ
aG/R7JYqyhFQquadD02hUVcWiKcuQ\nZxn6vRLjssBkmkMbC+csePOuQQS/q9fb1w4AhWvD7w0IAJyhTkgXrzVLyS+mRJYp5LlCplrzMlVY\nbBEZv6uBTEkQFLJMoShy5FkGKVktOimQIYMQAnmuUJYG1jpozdc3TYO64R1e1Q0m0xp1o1E3GgNj\nMN/vI8tUNHWdc9DGQGvtRSN5q6lAkeeo6wbWWIBcNGFnJr9qLVLLaJUuOc0abokIHa3PPBatnKiU\nFe9+8juad7WDMbx4WmtobeAcPMEyKKWQZRmyTEF6bnHOQToHl3lLaGZq1lkYbTCeTjEcjTEcjrEy\nnrDv0Wg0WgPOQSmFsmQ/RQoBOMAYNgKk90nKokBR5JhOKyaws5Fbg/9wOrm/FsdsRvycFRFSAgCp\n7KMoalply7vOGBtFi9a8MMYYAI5lsypYLCh2qMKDz3qeIihpKSGlgpIifm5SVRiNxlhaXkG+tIzF\npRVMJlOMxxMQHDLFllqvBJR/PgBw1gL+mfM8Q5HnyDKFumF/wwWz9DQe8FrE2MyOP93YlJ+Qsh2/\n0JqmrRkoQII8AQwIvIDGmEgAIkJZFMgUc0GWZXFRjbFojIExXZNQSYk8V8izLHJMFC9aYzI/h/m5\nOQz6PZR5juMnF7EyHGE0niLPhhj0S8wN+pCBS4nYzndsCbGOUF50CVAIFwX5v8HufzpQ6PWJ0JoB\nM8Kuu1OlFJEDtNaoKsBmGYSQsJZNQCKBsiyQZ5lX3FlU3FprlskWsMQ7MTh/UspItKJg7pFSxefq\n9zXmBgPMDXooiwJCSDhnMRyOMRyNsTwcYdu2bZgbWAih0CIhnghSIs8UsiyHkhXqwP1IvPQgmlJu\nwOm540whjPXFUfKFwexqHZBgIbXioW6aRAFbKJXx5wSh3yuR5xl6vR6KPI8EaPw1QhgAppW/ntBs\nfpI3bVkkBZ8DRMhQouxZvm9Rgkig0Q2aRqOua+aK0Rg7ts1BiCJeS0IAzkW8KssUlFfYgcCdBZy1\nejYxNsslZ5WLmiosFj8OdV2z4s2Ut+N5QbJMoshz9MpeXCihJJx1aBq+xj/x2oouzr2T+QYAoOAX\nCIky7yHLClhrMK2mmEymOHFyEZPJFCujEaq6BjAHpVRnpwohoKRCrrJIiIbMqp0+qx9s8ozrIQtb\n5oR0rCUfAYI1FrWrUWuCAFs9bOMrFEWGssgxGAww6A9QlD0olcE6B93U0epxzkVnLSx0dP9x2hRM\nr5cklMrY38gUjNUYjcdYWlrBynCMqmkwmUwxmVYw1kWHLjqGZD2sIZFnGXPDzJq6YOp1FjjxGaJL\nsBrm3rKJmt6o66a0ys0aA1MbPyGBIs8hhECv14OSEv1eibnBPHr9AVSWgYjQNA0AxoFcXHgL6xfc\nOp+f6Vo5HMWcV/BEBBLOe8oCMsugMoWBmceO7Tuwbds8yt4Cph7ink6m0I1h0hFbXM61yC0rZ76H\nkALUoENw5xxk1BcMm6QQOPGH4M7CSjprFDU8oNYGddOgaZoIE5RFDgKQZRnKkj3YLM9BJGCdjQvK\nP2F386LH3e6na9fggrVYXBAvZl6UGPT72DY/h0GvxPLKEFWtMZ5OUTd119+hFHyUUIqBQyklQ94z\n5rJNRVIQNUS8Jb3VhWDensHY0E+IrO9SY8l5wMui0RrTqkLdNBHMq/s9WDgPHeRQGXu+DoAzFtYY\nWGM9QUyy8MkiB1My4cDAMfH5koWMIGKWoSgZvyqLnGW8aaN5zEkASICs7RBBqqCcQwwDkVOJCAYz\n6Kr/kCXqvI4z9Bs2AeAlhEheD+LIeCigrhvAAVIq6EbDaAtnA8O2KKQ1BrCGFzNs/9mbd74lEQs2\nfL7FZVIi8EscFCqLHL2igJICTW0Y0qgbdhidA3nvPAwhCEp6saQCLM5Qx1oyvsXOZp5jZs02M87Y\nOpq1jUMsoPFxYFlJTKs
aVV37ECaHMcPuYvGT3jGYhKteWfW9ISoXlTe6nBAxLJUjzwvkBVs8E/8s\ndeOJMHNNyw0KmYe7lRTRAgwWXDDHZ68lcqteC5/bzDg7jxmJ7I6EYICtrmtMpmwiTqsKdV3B6BpG\ncPwNznaVfCKKVlOBkj8chyW9HuFX2udLF0cqxcCc984JHNNoPBGCN++8Ug7QSPCe8zxHkeeYyCkq\nBJidOZAIHQ4QHjUIhOwQlmjrKOpGhOgSw8IYi7oBJtMK4+kU48kU06rGoG6gBOsFBEvIO3TGtYsb\n7+8AiHYygdDWGljXmpdrKWohBFSWIc9z9MoCRZ5BEsFYFi0hzk1sViVEEMg8EcoIx+eYVDWcZVgl\nsHC60LbDTfDWWvveZghxRuJo1g5OFXSAmZ3jQEtV1aimFeqqRtPUbPo5CWt5R9Z17X8aVFWDumHd\nEkzQgA+x8hYdrgumamthJYF6amMbZVmi9LEF5xgeMVqzruosXgKRZBmKIudsjaLAaDIFUQ0Xv6dd\n01U7P/wtWk4QIuXmrRLBh9YStdh5z4EhZljAaMOwQdP4vKEGsq59TlGDajrFeDrFhfc8gNfffBv+\n/Nor8fV958e0FiHaSVljYYXPzDAAkYGUIhoEomkgRIDPORDknIOSQSTlPtvDw+meG1TigLWEoOi0\nMRE4xhEsO2vbRAM4zh5pgcwWUe4QgcTWOIGolb3ed2Vi2NkJSEghfTYFv8fmq0FV15hOpyAiXHzf\nN/HyT9+Ev3zpNbh1z3l4y023YO/iMr771q/jC9tfjWuPPoE33XsI/+vq5+Heiw96+irUJJARZ2sY\no8F6MlhbDayuoZuCPWclcf4dd+I1H/kzfPr66/Cwt3aCUm58UMh5goc58KIBUrBOaQmRIcskqlpA\nk44BKWeDZEjMUcMbSARuAEXkdr1xBmmQM4ENv1OVUsh9gCb3wByIvdFrHj+Gd3744zh49/0YjkZ4\n2Y2fx57jJ/DaL34Vjx09hvfu3YND/R5+Z+95OH7iFN541wM4uLSMH/zGfbj80GG863/egOceOhyD\nQcYY1ju1xmRSYTRcwfLSEnZ++av42+/+D5j/wk04+dRxvOB/fhy7jj6B137hFlx37Cm87xv34lWL\nS1GMmcTs5AXqWjZKUoz2sV5hblpTHHccT8sGgNFoGh3FLiMEWyWC3/ipUxW2kZLSP2zps+0yn6IC\n/Pj9D2H/wiK+40u3YmU4xp9fdzWO7NiOD116MZZWRvhUnuPHLtqHPwNhYXEZv33+bhwa9PAHF+3H\nG+++HwcWl/E9t98ZLa/w02iNg/c8gLe894PYffs38NJP34TznnwK137iM3jo4cP4ixdfjSd278KN\nL7sOb7z7flwynuCdjzyO/3TzbbjiW49wwMm2itbvqs6/pBDIM4bPe2WA0Ffv6tXePOtHk+hJrc3T\nQISZYb3CBHzgRQWwLvMZDRzB+qNLLsSj27fhY1c/H+PxBLeffy7+2Rteha/sOgcAkOc58jxnL7uu\n8elegbdccRk+Pz/AH1x8AIe3zeOjV13ucSnyKS3MZa+/+TbsPbmA7/rKHfjwcy/B4W3z+KNLL8LC\n0gq+tvd8/OZP/SgeuOQgPnrl5Tg06AFEeM5wjO+9/a4IyEXch4LyFC1dSEAqRoCLIkeZZ8ik5ES3\ntfJ/AkEs2JexHpR0bJWtN87IOur6BjbmdQaLRCnF8Vu/Y24571zcc9F+KKWAlRXUtcbyygpOnFrA\ni544jn90chG/tm0On5vrI8sygAiN1siyDF89dxfuObgfc/N9zHkbPvdioWk0Pn39i/C6m2/DR59/\nGW4791zcet5OlEWBHXMDzA36KIoM08kUt+3ZjT+98nl4yZNP4eefOoVPXfMCnwK52tmL8/S/peCA\nT5nnKHw8XErNPoNzqw2UNdYK2Nhp25AIfKNupCglQrCWWq+Tf4Rs3f66aTCtaiwtDfHkU
ydxamkJ\nv7u4guc7h3ecXMSJk4v4VwS855zt+Pre8wEfMpVSxIXolQUGgwHyTKHRBo9ddQX+4949OHlqCc+9\n70H8xEOH8Vvn70avKPHWo0/ghuuuwiMH92FhaRnj8QSf6fdxz4v34+IDe3HA2Y4kWkUID6ewA+f1\nnQ+vVlXNYVyzNglSgHCz0MWG4ihaAMn9jHPQ3tFyAF65sIjfu/t+vGZ5yJEvpXzIk0EubQym0xqj\nyQSvWVnBTcMxvrRjO+4SAv8CXOJ5tQPeubCEyWQC5YG44Ln2ihJXPfo43vzf3o/LH34cc/PbUZYF\nnLE4/NhRvOn+Q3jutMLfO/wYfvJbh3HxyhCvu/lW3PTl23D4yFEsrwzR1HU0MWeBwHae7fz4NRE5\nPM88nKHaqN5aazXrvW+ZCO1NqONvzFoHbz9yDJeNp/gHTxznoL8UUEJwhoNPj+yVBXbu2I5fnkxx\nRaPxQ6MRAFbsn5ACEwCflBKvWR7ivV+7By8/uYCyKHyNQ4lXfu5m7Dr6BK674UYM5ubQ6w9w6YOH\n8d477sJn+yXuzRTeLYD/tH0bvjU/wH/esQ3XHjuOPz50GK88tRCTg4HW15ndqd3YcLurQ3Ja+JFC\ndhb7dPfYLCE2JAIRrUX0jrx734G9eLDfw+/s3cNIZHhgyT9FlmPQ72PH9jn8yRXPxbfmBgARrrIW\nvyIFflhI9AD8gBD4peEYzxmN8X/cfwi9skSvV6AsS3z9B78biwf349BPvgn9/gBz89vwxnvux/Pq\nBt/daLxv/178W5VhbtDDR17wXPzf4yn+Q1XjSmPxT8bTKCJfeOQYfua//zEuvvebnXnEecXXUj+I\nr03j02JmsTdaw/XGppO/KPk7vSU5hy/v3IGbd2yDNgYCYE6QEtLb24IImXOwNsM9F+7DL+/bg6sf\nfRx/98HD+P2956PRGj977Em8/8AF2DYY4KcePoJPvuQari3wGI6SHJVTWY6i7GFubhtu/57Xw95w\nI/7kwv14x0OHcXHd4F2jMeRDj2LPaIwnewW+qSR+fcc25B7M+8Hb78SehSW8/DM34dGrrli1WCJZ\nNOYE5z1fQAp4R8xjRBSgm9Xc8L8Hyl6D7VrALXXgyEeoOJbbSbpyDtpoEBHuPrgf77rg/JiT9AsX\n7UOv18P2+Tn8v9ddhUG/j7nSE0ApvPDjn8T2xx7HJR/8MBaufxmKssTCq1+B3z+wF6dOncSndp2D\nH/ja3fjSa69HluW4/jM34UOXXYRPSImllRF6PnXmhuuuwg/fdR+++l2vjs+UcvVacr1N5xeMmAKr\n4KDZhd+sPgDOyERtYdzuqy2XyAQEC8m45C2mMGGQQvBQhdcbRID0Tl+/V6JX5CiLzIs0VvB3/8j3\n44Uf/yQefcubIZVClhcYzG3H/n37Mej3MDr/fHz4Ndd73Engm5ddhKPHjuOcJ45H+Z4piXv3X4BT\nL7sWc3PzMWYcFo3FTltDwQnC/Npsqk3AjGatrDPhgE0RIV1w4ZIEqEQsEYBXLCzibUeO4f0H9+HW\nPbughPQ1BqKDOMYHVA4OCiQQrQ3hPVQmYBZjvUIwFx295mqMX/+d6PUGjFWpDFmeYzA3DxIC08kI\nznofBYQsy7DznArj8Ri1Nt7/CPmuYs3EXSEEGxJSwhkLsg6aeKNkPjcpFK20C9414c+GEJvymGfl\nZWRZb4K+7cgxXDqe4K1HjiJTq937VeztJ5ZneRQ3SilOyOIPxkWBn9Ter30D1//CP8HOL3MDmTSl\nvigK9PpzKPsDZEWJrGBlPj83h/n5OcZ/fE7RlY8+jrf9zgdx8X0ProKhu0UsgQM8d8dUzDZxOdWR\nZ2uebpoI6Y1XKS9r8b595+PQoIc/vPggpJJdIriuBRLg3cApAYNnl99GkM05zo4LTuGVH70B2w4/\niot+/4+6u41Y/LBfUaIoesiLHrK8iHHmosiRZRmEF
Pjbt34d5x0/gZfd+PnV8xKtBSQCN/tyq5A4\nXPiK0Di/jlm72uTdsnUUbjSruASBo1UAtLX4/PZ53LJ7JwdRhPSOWqAB4ycUPEkigJwvNQxxYxtT\nW4QmaK1bIhgD3dS464e+F9fc8Ck8+pY3R0KFIBBbMfydgaNIACrntHjlw5dEhBtfdh2+5/Zv4Na/\n9dpVO5eIuFxLSkjJyQjKCkA4FCpDUeYocs6nncqar3HdjJCzGZtKeemabBQnxNkWvqpGCBR+4YK3\nDDDYJ9awQEKukfOJAsZowBOmyPP4WWMt6knDIJ7P2bNW+9zXGk1dxcSzIEqElHDOxNix3w0gIjxw\nyYU48dJr0OsPomJOd2uLgbXFLFIycQsfdeNc2qnnVNPqyDX0wZadtbVu0iKOABxDtowP+TIm18Z+\nY/AkIcJsXDgkCYTqzRAi1dpyBf7iMpaWh7jyf92A+cOP4OD7/5BjzdbAasbsTdPAJpl5rTlJPuLX\nwhStOE0sokQfpHqhfb1NGi5yn7MarL+wQeHOmiM2rRO69jTTISy01hrXnzyF37ztTlzz+BOo6zao\nYTSzdZrYFQnSec1X4FiLum5wamEBR594EqcWF7G8MsRNr70ep/ZdgDu+7w0YLi+jqSqYRsNpn0gW\nyrACyOUcrLFJSNM/e0wgEB2irVbMMnJSS4iMU2KytryXvAXItHCtybhxaPnMidAlBtqURculS//g\nyRO4dDzBTzz4MEbjMcaTCaqq9s5YK/PjfZJdE8WBA4wxGE0mOHDPA/jFP/1zXHT/g5hMJrjzwPn4\n3bf/JG7MCE8+cQzj4RBVVXkkNwTu23szAdr2DdZavPCxY/j5D3wEz3ng0AyMnRJAzZT5tvFnFery\nMo6bhFpr0bEGXXToNmsfnXnyV5im9xuCOPnPO7bhnYvL+O3zzsXKaIw85yJypVQbWKfUgmB0Nuw0\ngOAcK8a6rvGm+x7ExaMx3nLn/aC7HoBSCvdd/XxcfP+DOPKC5+HSQw/jju/9W3j86hewo5g4UsEz\nD+FF3TARfviu+3HB4jKu/8wX8dHrrukoZIL0DifL/1Bg6JyBMQ5CuJihV+QZikwhzySsVX5DdcVt\nmxi5sYg6s6AO2pyhIO/D+HRZ4vMH5pDlGTK/Q6WQ0IXxWXME6wLruZZjSYDD0qGGmN/40+dfhh+/\n/yH0tcHu0RgA8NKv3A6lNXZ/6atQWuNFf3EjnnzxtWwoSAkSEoIEnAWs1qhqruwMqYwfu/oKvOm+\nQ7jtDa8BSIBblXj94eW7gATZ1lcwhjPsYnKY4uSwvMihphmEtlDOAgEmb7dpNCSeNiI4l4TrtI3K\ntN3ZnjtsN/CdKmrnHCwFHgieJl+f5vAIJ3D3hQfwwCUX4yXHn8KPfvkOCCHw8EuvxcUPPISjV78A\n++68F3e/8fsgg8ctvcNHAtZpGMO5p1VdQ2u2lO46uA+LL38Rdu08B7kQEF4vCCFA0aJzHmppRVIQ\nwTGxwfsMeaZ8mZfXKdZGMG8tb/qsiTAb0gyKrtamrXT0i/i68QT/16kn8Fvnn4tbzt0JAEidL08r\nltkhZcV2i0OA1mq59ugT+MG7H8AnvvOVuP85B3Bg/37c9+M/Aikl7vmxH+LvDYo1eNhgvVLXPvms\nrqGNxiueWsJbvvo1fPl1r8Kpc3f5axIiCMawnGMz1XnIpLM5BEH5yqMiD/F0rmVI9VqYQ9DVWyRC\newfr0xyDsmuaBo1uoi0thMA/OrWIy6saP3f8BG47j7sxW2O9H2B8Lk64X6s82zoFG60W4wy+7467\ncMHyEN/3uS/hDV8p8dBlz8EVjzyGu3/k+/H4NVfBWddiUkQsKh3QNBWm04k3DhoYbfGThw7jwHCE\n/KZb8LFXvCQSgEJ8wOsT4eCTyWRU1Gw++0JDwaVVAWIP/gKRjZAMJ0E4wNGmjNb1G4wI6qSatGnw\nzOpNo2MMWgiB/
7Z7F95x4hTet+8C7336KnqjfWY2J0WxFcSEaXzxd6M1rDF40bGn8OaHH8V79+7B\nf929C3+/bjCvNS44cQq7Ti0itxbP+5OP4f5LLuIEAr9z8zyPOmo6nWA8HmM8nmBaVdDG4AOXXoi3\nPvI4vvLal7dwiWgJASAmCgfxJqWCEJqJQhaAhRRtTlKRZ8iUQpX4C61VtDkCbEgEgNkz5NE02qCK\n6Y11zI4OlskXd2zHrXt2I1eMfBrHOUJNo5EpEycYPG3tM/SqukZV1ZjWNX7iocO4dFrhbUeO4nO9\nAgenFR5TkhtMWYuFXoGv7NyO7//Xv4ovfecr8fgLr0Se5zD9PozJ4ZzDaDjEysoKVsZjTKsa1hrc\nuvtcPPHCK3HwwAXYBUQOIDJG7MYAAB7tSURBVEERMAwebxBVMo2meXEE33mmyHPuBpBnmEwluPnJ\njOdM2JRq3tBPcHAwzsS2CDGrzOuGFsoQsdVNmKBzzqdCNtGjDhlw1ppYu6C1xkufOon/8fV7cK42\nOJpn+MDFB/G2pSFKAJdojtj1wUUoLzuxgH2nFvGqv7oZp06dxMrKMsbDFUxGQ4yWl7C0tITFpRUM\nhyNMq5qfM4QrERy1JJk3mS/5iA0JYmsrOmxdpy1P+mKE0tvQnSb8BOtoIzKsS4QoTjwBQoJv4xcu\ntEloIQD+uf7UIt73jXvxilMLcbdPqynquonXBWzGeUzn7Y89gb21xk6tsavRKMoCH7voAGoh8MA5\n25NcIIHjL7oGWik8ftUVmJ+fh3MOVVVhZWUFpxYWcOLkAk4uLmFlOMKLnzyB/37X/XjJk09FiyyY\npEKIGXt+Jq6clOjy3PjfSoWy4AK9MkdehJ4eM1C/J+pG8NG6RDDGtmKj4d3cJHmhwXSbdf3ffuQo\nLh1P8H8+9gSMMZhWNSbTCpPplPtHRAIwywoh8AcXHcCxPEdDQOEc/s4DD+HXd+/ED7/2evzidVfh\nXzz/uTi8bQ5/8rJrcd7tX4fSGtd88RZc/tBhSClR10yEkwuLOHHqFBYWl7AymuBnjx3HZZMKb/7W\nozypBIFeC1wLnB0wpjS0GYBLKUMuVIl+r4deUbRxhqT+drMxhXV1QqhuqZsGTR0gANNBGAP+E76U\niPD+C/fjpx99DL+7dw/v+rRK03KDkfCwkgQggdv27MZbd+/CS4+fwN87fAQ3zc/hfXfehy/MDfDa\n0Rhf2rEN1lqc/9gxZJMptBDItMErPvslPHDJRaiqCksrIywsruDUwhIWl4cYjcd4z455vHMJ+GOf\n5d312LuL377uWsuJ2s3FaC0bLJnKOCGt38egP8F4Mo3tepxN4ZmN9cK6ROD2BEEH6IQAM4m0IYTp\nf766exe+cu5OTKsKdlp5565KgLqwc7wsdQQLCxBw655duHnndvzWHXfj0mmFg1WN0jkcrCoU1mH/\nNx9CbiwWd2xDU5a4+XWvRF3XGI6mWFoeYmFxCQtLQywvDzGZTvGXZY4vn3sh9u7ehQs9dA7XitAO\nnjUbqCdK6pIDcfxmUxwTn+uXGM/1MRpPMJ023jFk799FKmyBCHXdoGo0K+GOLE+C/qu1WnxkPzO8\nenEZ7zh5Cr9x7k7ccu5OGGsx6JXIs7YFjhACAg7WQwYfvPRC/NS3juD2887FS08u4J4L9+Gao0/i\n0CUX4XmHj+CWN7wGh6+4DNNphenyMlZGIyytrGBxeYhrHz+Gnzt+Av++V+Kvtg06qK11rRcfFnut\nmPAsd3cJwaVRWZah1+9hUNUY9MfMDU3jRa2JonqjsS4Rpt4XYEvIO1Wuq8A6xnA39SA+wDtOnsIV\ndYOff+okPtsvYazh6/rs8XIgRkBS8NAV7jqwF798cD+KPMdn5vo4Z/t23LV9Dv1eD/dmOe/Saoqq\nrjEaTzAcjrC8MsLyyhA/d/wknq8N3jWa4LNz/YhqRuR3jZ9ZvC1GTxNCBI866EA
lHMqyxFy/xtyg\nj9Fogsm04g27KvX+9GNdxdw0LIJsEnSJ9i/58qAW/on0CKnnYfyXc7bj3kzh1+bnMJ3WGI+5Y9d4\nUqHWbLrCcSsFSdzvNE0sDmCaUgpZnvvuAITGaIwnUywPuaXOysoIK8MRfqXIcKcg/EqetZuFqI1f\nzMAkAGJpbqjUjEZHCkPEgBF5j5rxql6Puwj0eyWKIo+W0mbHupwQQDjjq+9DQm3rWVJnxwcCzMrW\nz83P4ZNZxhaVhym+YzjCPz50GO+/cB++vu98OMcJt0QiQt4k2r55JprKBs5x0eFwOMHS8hDLyytY\nWh76ZlMTfNwRPpplKHKF+RDAmXk+a23EmqK4SsrAPE7fyTgPjmbgCvhyqCLPMRj0MDfXx3A0xnQ6\nhdYi9m7aEhFadmV7PuDkwXFhR2fGyoBv+I2Elf2nXBIO/YWFJVyuDX760cfxMzt38GInfSUcXOzL\nqqYVJvkUue8g4wBMJhMsLC7j5MICLn/oEfybhx/Fv++V+Ihmzz6AymtlPHQlKFtsXGsX5hjfjEQJ\nZmq0nHx4kRPYBHoldxjr93ux0box3DRlI/xi88XkITyYTgxc1RjEUyAAkokEezsqbf/+v+uV+OVJ\nhV/fMY/ReMJEyNucpbDjGs2tMkEErQ2y4QjGWozHEywsLWNhcRnvfvgILq9q/GOt8SGVcTNBYlGS\n+q/h62cXheBas9KLpEig1G8QMv4d5gYAQoaOlTkG/QIrwwKTKXeld3qmeP5siNDiJqLDloGVA1um\ngzMjkOyi7m50zuFTmcKnywJFlqE/mQIAXnFqEX//6JN434ELcPM5O/y1bCZXjcZwOAYR4YWPH8Pf\neeBb+O3zd+NTeYZfne/jF3SDd0sZOwiHTREXa43nDJo2SIy2HtoXvySKO8w9lG6lc+FkNu+89Xvc\nb3UyRdPoeM26a7wREcIOWBUEp5D60i2o7mRVIFFo1GY4hPdDTwwG8Rq8/THO5Hvbo0dR1exlX3v0\nCfzGV7+Oyw89giePn8DRJ57Cm+59EJeMxnjrkaNckEiEV/Z7+Dh8ZgXaoMpq1yyx7tB9nih6w2ei\nfls7abidW8gozNDv9dDv91AWOes4sXHy17pE6KR8+AxrLvzwesETgrPp0AmshIkFD1OI1Q/vEHpO\ncL+639i1E98sC/zmnl0eLNQedpjibY8dxanFZZxcWMJ/3DaPezOFX50boPJtfprGO5HRQ3JRVKYo\nGhOIYem48HHTGA43xdfZPOL1t5iVY21UjoP9WZahlxcY9PjIAm62KNr0ztOMDeMJbVjPd+6dsXw6\nrQWSlEYksjT2rxYCQsykudgQKJK4sSzwVxcd4Kp8zZbZf9m5A+84uYBfm5/DeDqBs8CnMoW/OHcn\nR9CiHxMqJAMnJrZ+aufb0KXYtTHyWZ8Bq52smGUSN1erF8Jm5bzYnPtw93oYT6aomgbO1mdPhNAr\njlsVhBoDdIgRMtyC4ovyUnAzERZfNmYwtBNBtNdjiwTi6FhocmudxafLEp88/zyW16YNhYaOAY3H\nt7qmMSUcKOK5Dc4zirOOTxChtlltCLVydK87l1YHuKgTQrokiKLJzonDAVPqYTQeM3Sj1z+bb8M0\nyKDUhCQQqbgAaczYp3ZFGSuEiLuMW+vDR9hU10FqLIyv9w198SwUhLXRPJ3t6ALXNhjRnovSWHd8\nKG8Yh7ZsYRGd5XMYjDEQ/ugANsG9FXa6NYAAED7vIiEgRGfeXFxfeG4o2SGt16/o36A5bWBNxPTd\n1EoIuyI+GNjjJcG9hELsVxDBGu+sWRcnkyYBGGPiV0rpVumW6GR5ImjrfPtnEz3csGjgr030lYgB\nl9gzKSw6tZhSusCpBZSiqOEzwdkLgEGHCDkXSvbLEmUxQVWtv8wbtu5vbewuchq+tH2lfZCQJhII\naGJI03Rg3jCR2DaHAKsdFByUF38pEWL/CGu
hjU0IEPb+zLMnMHS7WRBT8K0zXkzNpGemeg/hPtKj\no6tza1P90RKCW/30yhLT6eTsiRB2QxtBCwsr4JzfGT4oQ77djZQEPl6rZVMr+bpGa39OAa+aNRx3\n1tbAOAL5AySctoDoTi4QwfgfrXWszE83iHOOK/oI/qAlYu70FIjizWjY0/SqSJ1Ff5YV31IYNj4S\nognP7WG9ELvEsPPW6xWYTsuzJ0LASsLvVvyIzg6Q3nVPCz+kD5YHCwsENDqPKS7MGb6oUGuwyQif\nhWegHXXKVMM12nLcutG6m9+a+CX8d7qb/D1cS8RUnIZFX80JMwo64EeJWHJAp4d28IVC86peWWLa\n2yIR0p0SrQaE5rT+Jr7DbvSqKZyxEw6QYOWdZw0apTwIJzl7rglmLRLLhABiAIxAQJLFx2JId/TA\nKpAs+TNgQ9byqSNGGxifpCy9OJwN7MzOOTVCaNVrblVD2iCWc8XN2XvlFogQk2z9jWMZU5inY5uo\nLR4XLIQc4s6JgJxzaLIMlVJQinexiP0rCDpRjtaKeJYaUcv6Rhs0xnYIMLt4wUEEfG8sy140t4Iw\nqI2O8LmyNooSQauDOxGcpJgzuAoFcs51uhiHwUEf5RMCtkCELBRNEKH2LGmt7bawJ4op4kpxQm58\nOHjLQvIpHplSUJmC0m03GCGFT67iSp12g3UtL3bqbMSGwuuz0Dm/gc77wTEzxvCZPEmZVbDg1rwP\nKHJp8BGCKOqIsISYQUUQEZQSyHPuIHbWRABakaRSbzJ9WMEHWMT2xq5dtBT4kkpCZb4UVUpoKSGN\nWWW92GSSYbAMX73oqxdt9Qh4UDhlkPGqNv+JEjMzvSYQIGBD1q62Ebufb/GqsG5CiJitt95YP9vC\nN5WNNrAQsH5RAuVjja//MdZAa1ayQQEqsALOvNVQZxJSsxhSiRInsq2ucS7uwOC0rV6odUZcEL9A\nljes9kpdNz5jJLFuwohBGwrGCYHIMjFg1rCoVkfq+D4SSpqYkXi6sWFkTWu9SlFRoidinW8MyLDy\n5OsDR7DH3basUciUhtZthlv8CucQjmFsveTkvY0iJMlwLvy0Tp61BiZJXLDWMoo546CF+bFH7EC2\nzTtKlTfPL7GWOnqhbca13lj33dmGfbM7IBAhizpBgTRgFMPUIZrGnnZbfJdlGVStoUTTZrhFXdKd\nzGZEzlojIqOeAGmNdKjiSfOnUscznR9na4OhlBSgRKsXTDyjrd08/g4RwFxvbMwJId/Uiw0pBBz5\nPhZEUELGUwGF/zIOb5o22O13c+h9lCuF2v9I3xMpnBrIFmmXAGv9nkVzu6vHZGCrKMTGPTzh51Qb\nzibMjAbAuoxEu2COBHNB0qNVkGz9ijAvhJIxm8D14bnW3rxnSASHOhZ2E5xSbCdLmQT7ww7PfNXM\napPPBfs66UetMglVy8RCSmPWMx7wWYzgd7RiKDF1Q1Vn6EjMD8ylVv5hU4h6FWS/hu/E9w01eF2Y\ne0ucoC3LdwcHhdljvtLz1UQ8TjGgnyngRfGSVkYqpXwLhja9JSC26eg4Rmc4gmUULaQAWVjrm9xq\nj6YSABu5scWLRJTnwZJKi1LIuU6o13ljwrnEsZ3B2NYaG2RbWDjLh/xAcIscEGMxUhJ305XdoA0v\nsu04dwGpDEQIZxRkST+kFvc/OwLMerpgdRpFhYlOHlttxjAIyP6KBZyBcLOBGgnhjxMjoSOnhqoe\nuBAvkRGI5Gfh70/ncdZE0LrNO3VoY8RpM6nQ5y7gPLzTWlt5lr2df3CV3EOmbZ7XeeDNEKUVabwa\nsZ5Z63juZwQCk6bnfG8OYYYd301oCGn0rbiN3vYMcjzLzlskgo71vDFQkoJ0MuyWNk2FrY+u1ZHC\n3MHrZnHE7W9CZ5i2hU2LS53J4s+OYGJXTe3rKkz0XUK5lrWMY3Xh6+79Ah4WTx9x3bhDWs1zNqJz\nXWGVIp7
h3gQPEUegLgmcREK0lfRa65jbEJ0+79gFbshk265Nii56Cqy2jmbH2pADH3Za61BK23iz\n1Ismn9ppjEvEh4euo3nT3iuUVjlqdU34UGhukkI2sz/rjQ1N1O6NvELyYcNW+3sLwLURsTQKxZxE\nyec9hOF7RAQYvGOuuvYeZzrCpC28h9w03BHGJwUUISYRc2wDTBEI4MWi6yZ/gdqNxsrXIS0e4UJL\ndCD2LRMhBalCI+6ujO92zAqKKgByp9sRgRtiw0IloZSI6SGtfD69rzD7nLMECCNE4Bp/VrPWDYwP\ntfKPXnXN7P1SDIy884YoGdqNZYRPlUxPzPIm8lkTIbVvA34TYAh280W0IsJpskHpainjqSB8fffE\nDylkJERacBjEXCtfTwebtYu01gJGbrA2ckMTucHXWuiZbPNwzwQ9Tb3jtHQqmL4QgCDOKJF+ztZ1\nk+C2TIRWH3jK2rZjCxG8GGk7PoaFzaya2VGsCMOuYVyFe2F3i/O4AEMQw+TOJ9SmmM167J0uGsDA\nXQDt6oZjCY1PEAj10x1YO7GAnDf64z1FgN0pETkOgkJrB2+kJIu+ZXEkJYFjyQlVPWUDHtRaSRQJ\npzI+XxNBrDhfCZqgsql+iGVTib6R0iFUu+AsrQ7nHBxxX+/0eJmmaQkQ4t7d3bpalEQu9dxgkgV2\ngH/m9lzPcO2WYYuQMYeYDo52Z7gWJZQy+AnswCkp4DK+tbXeLPQFdUS6Ne1E++AyTjK1ogARxKDh\nYwJmwbPZsRpmDqXAad5rjbxWqLPMp1DqNlgUZXkQJcYDeDNOnGjTfiA4ESJ8f+JCbOgtb0gEpiqH\n9jg3P0l/R/dAbPhCDOnFDBy3PNCaH89oTj8E2gbgIb1SyVZEEQkuQAyOknOwTqAh7vIV1n09c7VD\nIOKFjJWoVY1pXvvGUTUaf/CSDmftIJx8awG0WdguYGKeCEDrF6XEZ4upBe82Q4hNdYMUoT1xFxWI\n4oioZU3OtJBwvnkTEcE1gCYTH9gYGYkQcCeVSSipogOoXNu4iQMpynPDxmKpSyCW8xE91Z4bmhxl\no7nbgC+QP11OkfVBLLaE2l5I1lJH7Ky1dpsZG2BHvNrhXqyHmGXZaQtQhfM5RDI6Ls4RQMYrMYum\n5s+mfZJSQC+TbUdgKQXItva5tfwdQnBu0lmohxje1DNHj9VePDV1zeZqIIS1QAJFOGt9Ox6C9M1H\nOkXxCP5BZwWfDmctuQklnRUBz2/t56y10YFTwTmzDOJJ00LcrNz5wdv2/lmCIwmfEJwGSRI8aGZs\nCBV4VDOct+qci5G12reKqKoKVVX7EieNkJcaCM/P0HJG2zs7rWhFpxNaywVPg8ccrQbLR6Jby3Z7\nsBSCY2UtwUkXlatC6LDuYFUXX0qdH+W7poREgeAMpZ/VllNWVh+eembDobV4tDHQDafWc9uHqW/V\nFuDu1jLqwPKJZRd+nOEK1/AtswTYEhFCp16OtRJXM7pu2RQQrIiuR8w7WfjLjO8O37r8ADNSFEeZ\n9O2PObIWMCdrLRrj/Cm3ayzsGVIlEIKhbO8/1HxW87SqOmFPcm1WYECAo28QfQYdCRuhnZmxJWct\nTVsJCCqQwBUUaoO5t1G7IG2FDpGMYdHQ2C+MYKZmSiJXGTJv6hpr0fheplyH4BHPZME3i1jGJ/Ii\nFT67PJTlNlEkMSHquvbOm4FzvlrUsQnqXFt2yxP0Jb/gCv4UL+PfT4Oz1jW/CNKDVdI38wZWg3zB\nkw7PCY+tpCHM8KDc8gYxPMqHUfPiBrwn6qWwpD7ytRkCUPC4+THa4QK3GQ9psJVUVVw7p32eq0ih\njESMhsnN5kt1YZq1uWKtsSERiAK6OBMz9aabMRZWWk4ASHZAIADg/YkE7k6hgHhoUKag/Kl+8d4B\np3JYezE3NVaDe0HMhRPVMyl93ZxvjqX1KiUbrg3QS7hz2KChRR3QVvLMA
penG+uLIxeQFAdp2VRV\n3kYmcEMqoQ2UNBGunsVfwiSCWy9k0huOHJQgGNUmkCmpOukvfvkwm+GwHnLaeT3uzPZeQLdWrs5C\nH6cm6SisQc4C6NZJhGv5Tg6O4CFu4UVY2iu87Xa53thUMXmYTEhnFELGEiciPj53I4q3wJ3kRq7e\n1whJwyGzO6CqwQpJF5KZq5v3M7tAazx9x/kKgX5LiHB2aCHU6XTZaGSZAckuEDmbTh+ibuH+Wreo\nbPp7vbGpvqjhi8IZ9kIQrLFoLHu1OstWib90oYjI5/RwFYtBG62jhEAsktqzazbCh9L3TvvZBBHl\nRRfQRFBQMA4+40LHOHR6cqwxBtLaNvXdO6ucPU4drgwiKBAzpNnAbay/Nt1WQUh/fLriQ0Bbr1d0\n2u/w59sliNcnCr2VzUxgJYBMiZirmoWO7J0Oi6ePHYRFXmu0WBKgtYWWnJjmHOCUgjAaWtvYtbKq\nG0xrbjuqdY3MdykLLSS4B4kFnN9ctlvhGapKjUcRtqwT2gVsMZ6glAIG3+byzOx8avVCqE2LQZGk\njwSQcIJqz63JpICQ1IEC1uKAzYyoVC0vtnMS0nHFARGhVh7CCMfWVwxjNE2DPDeQgg9hmr1nyu0p\nRJ86hGzhbfEU2mDvyyir27Kj9tx6F3M/w0OlRGiTxKSPPnWTB7jYLuSpssgLqTSGzi7OPDt4YeA7\nwCD+EBpuEFJnqKrGtw+tPcTNSlqpnE8dIXZY4Vqo26/SqjmGnKYg5tYb68eYwTJVCuEbQfmms9wk\nHQHads7Bchc7WOcgk6B5NEWFQK4kGimhhY4Lw3GJ1nPm471C3utMQMT5/wX7f5MckuJf4aDqcNeG\nLKjWyKYVpr2aux97OCNkixhjIJS32nzBC7dgSMRqjJsrqIyPoQchBsHOngjUPfyzm+7XPgCjjl1s\np2PdIBE5UkGIZtWCsdJWXu9wTtIq5UwA24RrL/ZayrlLxMCx7RyMMWgIqGuB6dQjqk3jW0uzktWm\ngTAKpFoOjz6Af5oUES4so7WNzzbcqAnYJojQgmvAahzEOdepiEzPrjkdIUJA3MXN7AsNZTipQ/mu\nu+0pVWc7OkSh1lpqn9EnPjd83kKLH4Uzfmo0dbsp0rPmUlEc5qekgFUSRZ5xMQ0Au5X6hGC/K+WP\nZwRj+209M5tp1oRzbbrnKfACshfpfCOocDiq8/ZpG6t2bVKYytpkYSFgbVvPFvkg4bjNKmpnHWfB\nO8CFOzmAYNt2/3WFWjc+M4+Tw5qmgaprSAq5qQQuI9YdIjjnYjKAUgpl7iA3cWLOhjHmsDAiofxs\nnVlqJa2urGxjtsHUlZKdPaKUldvM7nASeEiRDM1tN2KK9QgSnjuYrJyNDZbxAGAsm6cxrtByvDEG\nVVUxB+QlmJ8SbuAvCE8RJUiWuZiFst5YXxwJ6sjlVBRFhwrte/y7RRPTTAPnty+fm8nn7aS9pQHu\nBhASwkLmthINbCY5Rm3as9bSxU0JsBnOCJBC3TgoGzx5oGka7rM6rfyxBPx5ay2apombT8msRWWI\nGGdaQxeFVnIbjfU5IaayuFix03KB81ZNmoNqoY3zELSLULZzFtbokFIMJSSc8BnQ8MQh/r5QiNie\nBM7Xk2RKmsgRqxd7I8eoSzD+u/EHF0k/Tz7GfopJVaFpDMqSr+HaPR+0z5PjZ9DSI2y0YM0G0HNL\nzlpokcmtaVj+hYXhWEAbsGdMx2c8awObdRV48A2CcpNC+MyGdhOFtjhZdNpYJFltYAWf/OdgYH0P\njLUWeDOj+9k2kYu0QdXUmEymGI8nmFQV+v0SSnFcgVvmMOemxYBBT4VFD76VW8eSS8e6RMiVAuDQ\nNFzZ4m0cuJy/OmRSh/T3CIgZDWNUmy7iTUFrbbSOhCdCOlpZyocHhcztxvfTDp5pCshtdQTFSh6K\naRqNyXSKleEQo9EI84Ne/M5ULAGrU
1kCxuZEiwI718ZlTjfWJYKSktMGG80nioOigg6sGTKyGVfh\nk0PYYzTIXWshOVhoa+A4INU5X8c5F9soBL8k96aq9NA2wcbd5YRDOq/N4vbp59N/h3iJsUCjLSbT\nCsPRBMvDEebn5zjOIdhJ4+ZY/KxZlq3yZUhI7ipD1luRqxOOz4gIxjk0DbOo1TYuOhdhJ32uY5TN\nRW4wPpkq2vreHOVMi7b7S9hhQYEJYrg86IW0lMqhjfWaTQbR1yLAag+7dTK1NphWFVaG3Hd7+7Z5\n9MoSvbIAefRYa42qqgBw0/JoXFDYWGxyb8QBYaxLBMZSajS1jtEwY3zvUXQtgFh2BF/hE3NyWmAr\nwL+z+L61BsSaN5axSsUtCaRqE8wYuWwXM/w+Ew5IrZW1UFhtNKpaYziaYHFpiO3zQwwGfc+VMpq3\n6XemhAg5uOSc54gt5qJOphWapj3ACBG4Y2e9LRdNUwL5TOXQoTHLVMxaI2rgIDhi58I5zSGxy0tm\nL1elTwKLhYXeoWNr8HQNQTY31goKtV6wQ1XXGE8mWFoZYmllBdu2zWHQK+Ni135jciczA0IJleUg\noeLxws47srOm61pj/db902ncPSFoHv4DWj8iiBU+NYTzTXOjvVmbxJi9pRWI4GVUWAYAIYDU7cUa\nd1Ps7L42ILYZqHuWG9Yaxouk0WiEpeUhztk+xvZt8xj0+1CKDY6qmkDXOja/6vcJmQ/hBrM1iM+N\nxgbiqIrYEU8AScZZm/Zi0RYMhknqJhRhmGjGCkExjSXUI7Qn93axeemdOiHljC9iYtu1WYW8Gcco\nfNdpP0tcZtVog0lVYTgaY3lliJ2T7TDzcyiKAs4xt0zrGo2uYAxz6KDvkOV5JzKY5medFRFqrVGE\nhw1WjmcxIrQlr34han/4nSDhzdS2RJXhZ2q9bicgwCkv3TBliGEI39atVfzOcYr77LHCZztmieGc\nT0Im8icXGm8pjTEccSf4QR/IiwJ9a1DXDUbjCZq6iUkCvV6JvCiiYZHCO6cbG6S8WDiVWhCnx2WM\nT7Z1Pu3RWhd3fbg2zTxwFiD/7a3ycglKmbbUTLL3kjykiNacBdS6HjcE3NH6HT+ZTjEOjWatQZ7n\nKMsC/V4ZTzKpVzh5rD8tMej30O+XKIoSeV50ys7W/L6nY0c9O7Y2zvhk8mfH0z+eJcIzYDxLhGfA\neJYIz4DxLBGeAeNZIjwDxv8HRamcX2XL6uQAAAAASUVORK5CYII=\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + }, + { + "output_type": "stream", + "text": [ + "1 (500, 333, 3) (68, 2)\n" + ], + "name": "stdout" + }, + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFsAAACSCAYAAAAq0bblAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy8eaxlx3ng96uqs979vv318rrZG5s7\nKVEkRe2yZY8Vyx5b8tgziIEkgCdIMn8ksAeYBAkiA5MgAYzEwCRAgEwwduyJHUsjOZYly5IoSyIp\nUhL3ney93+t+693v2WvJH/dJpmh2S2oJtJ3wA967555T91Sd3/nOV199X9URzjnelrdG5N92A/7/\nJG/DfgvlbdhvobwN+y2Ut2G/hfI27LdQ/t7DFkJ8Ugjxh3/b7fhh5IZhCyHeK4T4phBiJIToCyEe\nFUK86yfZuLdShBDfFkKcEkIcE0I89YZj/0wI8YQQohBC/N6N1uHdYMNawJ8D/wnwJ0AAvA8obrQh\nf5sihPCBI8AZ4BPAU28ochX4l8DPAvGN1nOjmn0KwDn3R84545zLnHNfcs49ByCEOC6E+KoQoieE\n2BNC/FshROe7PxZCXBRC/HMhxHNCiEQI8X8IIZaFEH8hhJgIIb4ihOjulz0qhHBCiH8qhLgqhNgU\nQvzWtRomhHhg/4kbCiGeFUJ88Ie4ntuBl9xsOH0vb4DtnPuMc+5Pgd6PCur7xDn3I/8Brf2Kfx/4\nOaD7huMngI8AIbAIfAP43dcdvwg8DiwDB4Gd/Qu8B4iArwL/7X7Zo4AD/gioA3cAu8BP7x//JPCH\n+9sH99v1UWaK9JH974vXuI7/EBgCKZDvb2tgsr990xvK/0vg926EmXPuxjTbOTcG3rsP4X8HdoUQ\nfyaEWN4/ftY592XnXOGc2wX+J+ADbzjNv3LObTvnrgAPA99yzj3tnMuBz+6Df738tnMucc49D/wb\n4B+/SdP+feALzrkvOOesc+7LwBP78N/sOv6Nc64DPAk8ANwJvAC0nHMd59yFHxHNdeWGO0jn3MvO\nuf/AOXeI2WN4APhdgH2T8MdCiCtCiDHwh8DCG06x/brt7E2+N95Qfv1125f263ujHAF+Zd+EDIUQ\nQ2ZKsfrGgkKIuf0yI+BB4GvAq8DNwEAI8Z9f5/JvSH4irp9z7hXg95hBB/jvmWn9Hc65FjONEz9m\nNYdft73GrNN6o6wDf7Cvld/9qzvn/oc3aXN/X6v/Y+Bf729/EfjY/u9+98ds79+QG4IthDgthPhN\nIcSh/e+HmT3Wj+8XaQJTYCSEOAj8859AW/8bIURNCHEbM1v7f79JmT8EPiaE+FkhhBJCREKID363\nndeQd/LXHeI9zEzK94kQwhNCRIACvnveH9mTu1HNngD3A98SQiTMIL8A/Ob+8d8G3gGMgM8Dn7nB\nel4vXwfOAg8Bv+Oc+9IbCzjn1oFfBP4rZp3oOrMbfb3rfCfwlBBiHjDOucGblPmvmZm2f8HsKc32\n9/1IIv6uJw+EEEeBC4DvnNN/u6358eTv/XD975O8DfstlL/zZuT/S/K2Zr+F8jbst1B+kK/4to25\nMXnTAdzbmv0Wytuw30J5G/ZbKG/DfgvlbdhvobwN+y2UG0r4viXiwO17nmL2FSH+2qN6/cj39fv/\nLsvfWdjudS7+bHsG9O9zeOG6sP/GhQkQ1024uO/7mBX9m+W/e97vHRHf3f/957KuIit7
hP4cngy+\nd9y6Cus0xuSEfoc35f8D2/rWy3UDUc4555wFBA6HFPK6j+y1zvXG31y7ztl+YyuSbJdpdhUhQyTQ\nqh0iDFpIqUjzIbuDM+TllOOH3oun/O/WxOvv9N+ieXnTin+AGXFURiOERAp5zSyimxnY2bYzgNvX\nLG9/35vBtVhX4ZxASR8QVDpFyZBJskN/ZwMZ+FRVgqscZbxJe86gZAAodFUwHe8yHF4mCjsopQiC\nBna/ft+L//oJuhHo7q8N2U/qpv0AMwKe8hA/SEvc7F+px2T5NtqUWOdRjxcJvQZKBd9X2FrN1b1v\nk+YJzvgcWLyTOGpTlhpnC9LJBGcjdApe0KSyKbq0lHmJNQmmKgjoUvM0toDxdJcwbmEikErhBxHW\nWqS8cWfLMVMcIdSPd9NeJ9eFLYT4G3bvWiagrMZM002sdkTxCoUecX7jIZr1ReJojmb9CGIftEBy\nYf1p4nCRpe4d6NJQigJdlhR5iiksToN0AmE8pAywxpKnKbV6A9+rUeQTmrUFnAZfhgipwAqE52Eq\nQ5Gm1JpNpFRv2uYfBpw2JZ4KEeIn4yFfX7NxCMBYjZTe68A7Kp0ghMDYAmNLesMzDPobFGlKa+4o\nQk6Yjs8zHpyn07mLODhEoYeMp1cIvBquCjBO4CpLQUI+naCkwFgJepbGBlBOggzAGYSxpJMhCMjT\nKWEc4oTBUxGeCvCCCCEFpqpIej0EEDdbwL7i/BCAX39jZqAFYPlJDEmu20H2Jpedr0LyYkwQtKiH\nHTzlY60hLfZwFibJVabpAG0KkvEWo8kAjaXViIjDLlVlybIpQdggLwek04TF7s3EQZtARniqBgjQ\nFik9VODhtCNNRyjh4/kBWlc02l2wBmM1XujjjEMpCUIQ+RHSD1B+DSEFtipJBj1EENBaWPre9Ugp\n8Xz/+6Bfy3e3VgMOKT2srbDO4Hu1H5rrm+28rmZbYxlMriCIGE+uEq+0sHLWsV3ceJqav8D8whpS\nNhiMLiFFHc/LcFWJMB71eIWos8i5S0/SH59Hp5b51lFCUacZr2DKHIVCa4vWBiUF1liU8sAptK6w\nZYE1hlL5BH5A6HtgBFIorDHIwMdWBbbSaN/MrtNqpuMhTgqCMAYpKLKM6WhEZ2mZZruFH4TfAyyE\n+BumxrmKXv8KtXqbWtTBU9EPC/qacl3Y9WgeRYRUAXmeUJQ5xmqMLRgM+7y6/SJra7cwHl5FqAnd\n5hHCsM7Kwm2sLt3Mqxcfpje8RJJepN1q0OzW8FyEKxVFlhH4EUWSIJVHkSQABH6IwuKqCqzBmhJX\napLpGFev44IIjMZTHlprUB4Gh/F9nJA4z8eWJb3tTbRzGBTNZpPtzatsbayzOj3KsZtvRUiFUhIh\n5PeBds5grabSBVVVMhxtUYvn+PEndP3ADlJRq7XwZECr3sE5ixCCSZqyOH+CvBDozDAej4hiRfPA\nKnExh/QCMt0jy/dIix5+4OOJBiFdygI8ryAZ9jFRE+sEnq3wpEeVpZiqRGHwygxlNEaXZEmOsRKv\nzPEaTZRQSGXwAWMM1hiKwYCs1GSmwljDYG+XtKwYDofcdPPtXHjlJapKU2u2mIxGKD8gCAOUEvv2\nfOZ9CaGwriLNphRVTpbvMN89TBx1rofqx4ddlgnOWaKwRagihJh1W8YZ4qhFu9FA2BDf67B26BYW\nFk+ALdnce5mzl14gKS5SaxqwLRQBOjUEso4TkCUpeZIRx3Uq55DOIKsSnY7wBPgSrDEoAUprnJO4\nqsRWFX6oAIdEoIRESYcMIzwKzDhjY+sqw0GPvdGIzmTKZDRid/MKtWaLcNDk6sY6SEVcr9Nqt1BK\nYuzMP5fCw/cC5toHSdMpV7cnGOsA972R6o26gNc3I3H3e9vftW0ArdoiwnkM+lvs9q9SFhWd1jLj\n6RU2t19mce4kaZqSFQWhq5MnmoqERhCiAFcZEBZT
Wao8pREECF1QpWNElaM8ha98Km1QnkcQ+HgO\nFAppHcJanAUhJQiHNJYAkJ4HUUAZ18kGfao04+rFc0glEcqjqCyokCCuY6xjcXUFz1P4gcdwfJ5c\nJxxYvIOyygj8GkJYVhaPgXBoU6Gk+p7C/cRhl1VFGPhIOavAYcGBkgolBbVaTNd0WDtwJ55XZ2Pj\nNfYGIxq1KUKWKOmweZO5ehtnAiIvRhmFKTROG3SpaYcRc5FH0u+jpMOPQqRzKGORnsI6RxxGFMbi\newpXllgcnh+CswgpENLDGYePohbGrCwtEIQ+tbjOxSuXycqCUht0WTIejdDnzpJMJiTJhHG/T7Pd\nROsxu5MLTNMJsd8lyzJee+1xlNfiytYGreYCBw4cYmnuMM7uKyAOKdQPrenXhV25ArTBunI2bHYV\n4OF7EXk1oNGJOXDg3QSqQVkWHDl8O7nu8+qFb2B1SRgsEzJHTc2jZIDRmqoqScdTiqzk0PIyS7UQ\nk46IJeBLhANTVkilkE5itEUGAbas8JRCCYl0DukcGIND4iQIP0AIQyB8vHqbwI+p+SGR73G1t0tv\nPKYCJuMRo9EIa/QszGAcZVmhi5QoXuK1l14Cz2d56SCjUc5Nx25B7U9Y9WWNnb2rdDvz6Kpib7TJ\n2srNP/QI87qwN7ZeQVc5WTbEuoLl+aOEUYOy3GE0uYzWU+xSSCCnzM8dRHoWSUKV9olrC3QaB4nl\nAp7zyZOEZJQyGQ5RUnJ48QBHum10NkJUBc5WSCeQQhD4AdoZBAJP+ggt8JEoHErKmUYLh7AGITys\nnY10rZRIJ0BJlHM0W01uCo8QxTXKS+e5sjfECEHYnqPUlv5ej7jWxAsinJGMxyPWlu9mc7hBvzdB\nqAaLy6u063PU4iZnz7/AeLrH4bWjdJqrDMZDluZTIv+H87+vC/uV1x4j8ELieI52s8327iXCsEmr\nscx0mmErzTPrX8I6uPOe93N1+zV2+2fw/BAlAnRhyd0EnwhTOKosw+QlR9aOcXJ5ET3p41mNEgIt\nFTiHEArhBJ6FstIIJ1DGEng+AhBag5QIY3DCwwqHsA6rDW4/pCucQEqF9H0iIVjodFidLjJMMrb6\nQ0QYM1EK4Qck4zFRXCeI6rRrq9QbTW5qNjm78RJx3OHFF57i+ImTZOmQ4aSHRLJ5dZ1NsYlUknPr\nT3HLTe9G7sdQxCwC96Zy3THocNjHWkUjXqTVXKXUlvUr59nprZMlOcJ2OXbsvaigxtXNy/R7KXPN\ne5lMK7Z3d5kkU6xQ+CoimUyZTlNuOnSU04dWsdMB2AppNJ7v4UmBdKAcOGuwFnzlozyFkiCFQyoF\nUs6gOocBrLUYDNZUOMBahzFmlm6QHjhJEAZ04xq+Megip7+zxai3x3gwpNCGaZJgjEYIKEtDqAK6\njXn6w03CWszq4lFWFo5xeOU077r7H9BudLm0/jxnzn+LF157gkvbL2FsBXx/0uNH0uzDq/ewNH8Q\n6flgJaHXphZUDPqb7Gxexq+dx2s4jhw5yWB8kc3ec+SJpBaHdNodOo052kGH/tYO48EuqwsHueWm\nNcxwF6dLlJtBQRsUEs9TVLPwOUIqpJ6tshKewDpwxoJwWAvWOKwQICWzGJ3E2QprZubHCjGL7AiB\nRFKPQubrEf16na3hiOn2FosHDtPv7VGr1UmShFarte9teHQbXWpBnZdfforhcIvbTt7HieO30Kg1\n2d3zCbw5Lq1f5nRzno2NSzSiLgvtA9cNWl0X9u2n30NeTjh7/hnqtQ4QsLx0grLIELrGID3Hs89/\nhaJIqbcsi3MHyWuGWrBGsx7QVsuMtjbZubqJr0JuP3kc8im2TBGmAgRCCpwTSDkDKLQFAc46rAKB\nwklASYSddYjWWbACIzQChZUCaQ1Wa6SUOKlmGq81CIGQEt8PWF5cZqQdU+3Y3Nvj7LkznDhxCik9\nzp99jXa3S+xA
m4pWq8uBxSNcuHKGIFR0ux1eePnrKOmRFxmnjt3BxtY2V7au0Gm1sU6+Lnl3A7D7\n4232elfZ3dvjYvYKzfoKi/NdJsMd6rV57r/745y7+Di9/lW8eMg0GVELllhdPISvFdOdHntbO+Rp\nzr3vegexsFRFirR2NkDwJFiLQOwPHMDiQIp9+wdOuO95DdZZLJbZGMPOPBFrcBpKY3G6QEY1kD5O\nejjnsHoWUPKCgFa9RV3t0Gk1KSrNsN+jrEqyPEMpnyvrl1leXqLVXWQ0nnD40AnuyDZ45fx3mIz7\n3Hbzg+zt7nFp41VuOd3g5NGTLC6ucvDACp16CzGLgl8T+XVhry4cZr69zPG1O3jmpa+SpwVXr1wg\nSS5w7MQxtOwStfaYDzIckOSSWtCAFIa7A/a2thn2eiwvLLPabWLKAoyePd6O/QCuwlqNBYQF60Aa\nA06AVBg9C0QJMTMNFvZvhpwlhJSH8DyUELNQrCexzlFlU4ybJRAsDotDKkezFmF2e7TabaQnqaYJ\nyTTBYenv7THXnQNr8P2QIIw5euQU6zsv0u2scN87PsR4NGB96yJH1o5Rj2pUNmVz6wzPv/w1Dqwe\nR7ucB+74hR8ddrfdBqAyhg+/55cwxnLp0otcvFqRFQnPv/xFnEixtiSdGrb3phxeWmBxSTEe9Nnb\n28FUlltPHEdag6kycBqBxSqH1QVYizWGWT8uZjZTOKwxVFmKsxrpqdngQXkgHM6CNRUyiHBqFims\nqlmKzeZTZBihohCTJBR5hvB9rJ3Z9SgMaXiCsbAYbUAIyjzBIVBRjaIoSNKMOLb4mWI0GiGVj/J8\nwGGMplGb49tPf40kG7Gzvc7y/EGiqMULr3yeUoxvDDZ8d5QEnudRVmOiuMZNa/cyGJ9haydna2sH\nJzK0kUg9x3J8gvFen2F/RDaZsjS3xPL8PLbKsWUGZTHTTufAgJMKqeQMhnM4a/husN4PQhw+WutZ\nLtQanFNo63BWY8uSqiopkgykIohihFTYstiPbwdI36cqS4xzCCURzlKPawzGKUIIKq0ZDHocOXEa\nKRRZUVDXFaNRQa0W44sGV67s4nvbZFnC+uZZjh45ztPPPsYrF5/j8OpJ2p0VlIDpdEzU9K/J8rqw\nJ+kI3wuZpHuzKFg2pSxSwrrHJLmIkBFHD7+TncELFLnH4fgUDb9G6UriZp0Ft8jNR48hdE6VjpHO\n4pSaxTaMxTqHMA6EwBmLQ+CExVmLLioqrbG6oioLVKAQlpm7h4czBcrzCTyfeHERFUQgFVZ4GOfI\nxiOyJCGo13BAlWdURuOHAVEcEVWWUih0WVBWAdPJmMXlA4S+R1UW1BsdyrKk3Vhgae4mBuM+/+4L\nv8fLrz2PEyVWWzxf0mzU2O1dIEmHWKG55cT9NwZbV1OqKmMy6bE3uIguK8IgpL91meGgIs1yuu06\ni3On2Nk7T2EKZBgidEmz2SRWsLqyiC2yWVZFz0C6/WSBLsuZ92BnZkRXFclkDFWJMYKo2cCLavhx\nhAIscpZKk+AKgQp8hArRwqPICwwO6YcYB36tjpM+2XSMF4RI5SGtww9m2u8pD+U5pskULWA46DG/\ntEIY1aiKEtmaeUlJMsSKlLTa4vzlPYajIUhNsxnRaXtouY5TkqRKKEzG5c2Xbgx2pSs2rjxDp7vG\nZLLFYHgV3xdMpiOcmaMqJtQbc5R6l6qsqIc+TkGj2SSUglJYmlGInuyBLjFZNotJGIupNMZYhJ3F\nyK0TZMkU4yTdhSU8P8A69sOvMxMjrIB9T8Y5SzEZU1UO54XIKMI6h9YGh8IKUMrDCwOqskAqD+sM\nFouTPnnew4+aVOMx1WiE5wdMRgNq9SZRvY4QDs/30Man0lOU00SNmFrqMRznDMcZlfNJC4NUksW5\nFkY4Nq6sX5PndWGPkhGDcZ+yUijVxorz7PR6KFmj2wlRqsbmznP0BgPmGqcQwl
[base64 PNG payload of the first figure output omitted]",
      "text/plain": [
       "[figure text representation omitted]"
      ]
     },
     "metadata": {
      "tags": []
     }
    },
    {
     "output_type": "stream",
     "text": [
      "2 (250, 258, 3) (68, 2)\n"
     ],
     "name": "stdout"
    },
    {
     "output_type": "display_data",
     "data": {
      "image/png": "[base64 PNG payload of the second figure output omitted]
Evnr1EFeF+w3v7WJd/OUww5l2gfbJx+uUimHlCur9FfX2T03\nxeHjM+RBmF8cYL0ZlbMhKBVxtpdy0dc5M9Dce/NR9ubC5EdXUY+t4F1BVIkRM6qwUhq8BvsTbyF8\no1xCQ7lnjKAVfs/ECNwtkL8BvvRz4o9+HW1LfNnH+8DjB5qsVBRrb9uPqTUoS0b5fkqwtqBwJdXx\nGsGA0ZZyuInrb5K1O4TSY7YefCGwkhUMo4hCQekdWilU6al5aDhPLQRi7zDKo/Q2V3be61L0SiKb\nkHe7pINFNEPStSVWXjlH01RoTdQZayXEjQnaqaBMhDeByARiLeRJxOmra2T9AXsjz1s3C8b7wGcW\nkKwHBJSJ8RIIFKOPQD94K/5XfpxwaCf81NuI1oeI86iFTex0jaAEb6IR6JHG72wRPvQA1qXgFdFT\nyxx/fpnLd+3Ef9du2sMhQQyuKOmt9RhsDtAEvM9QusDbgnRzk1eeeY7Va5uEPEJZj6AotaIdAv9P\ne2cSK9l53fff+aY71PSq3tjNfj2x2SRNUVIs2qQlR6IlRRFheIpgI0CycpJNEK+ThfdZJIusDC+8\n8MYbJx4QxwYiRIADeJItyIEiihLJZovsZr/ufnNNd/qGLO6TsiIN9IIrnruuQtW5937fOec/fCur\naFQiiGBEyJIwCMIYYQDkAiZF8vIJ0XXfZBR6gm9hXPaM1ljPuffOHcpkGO7toguHqEjlU0/7VZEg\nsefQBI92GfN1y9jA8d13ufu0o32r451teO7dB2w9u4HXDuV0/xlJaGXwX/4M8asvY3xC3Xia9O9/\nG2k9Zt727fTuhDjMaf/Na5iffhFdHZDmR4hYzB/+XzbqyPgHp7xxfsxgMkSUoesa6nlFbAJaIkKL\n785wZgeVIjpBu054b7B0GANahAqotaIURVK91sZqIY+JsWhyoJVEZhQhfjBv5MO5fsYSpMINhLZu\nKa2hWp0w2DYU5RaD554mGo1aHpPKOcvUEZRBkYha00gip6O2sI6JzQ1HeCbyHxt4aeH51H9+ncUv\nrHBffZrkNgn2KbQoJHS4EEFpos0Ir30Ocov8l/9KevXTqD//Dunf/gL846ewC4M0K0K7gLajWi35\ni6nhq3rEu58zFMOSjeEWMbZE7TEmkg8h+AKTlcxPDyk78NU5VbWkmOY0C08yiSy1ZLofMZ/5yHaE\nlSRsp1jYlpFObIjtzQUilHVAjT/YgPhDk21zi0uKtl6xU07wyzmhA2MsG5szfFcjesxg8yrTjUMm\nQ+HUW5DeGS2lSAwJLYoYAnvTAeWtkptvPORL9yPjBs7/5C7n2ydsPLdPmwUytUeiQNA9g0osSmrS\nq88TP/+baLHEf/fPev+Pak2sG2iOiU1H+vtvYX7nLYr9GQ9/c4/D94/YyYaIsdgso601+WCTxdGS\nk6Njtq5fJSscxw9+QHNW8f67if2rAw7bNfN5R1dHvAgLnThtPV1UaBsZJuE0QGVhqAVRka5QVKIw\n8YNX5g9fRlILYsmNYvn4Uc9H9oGyGGGNJoS6p+cSufXMjFs3HvPtO6vepSwqJKkfbwoTlZhsO/JL\nOc8/dcwfH3f88gF8YyKobydePDrg9uM14enb2Mv7RKUQa0nBAwFJGkk9RhjbCuUjqenAn0F3RuOP\n4HffZjpv+ezDE95cOE7OPft5TogtTdXSrtekkHN8XNHULSfH76E7YXVcMXY5V/cyzmpNkUW6JAzF\nMe88IcCxwOsm8LxRIAmbDAcEVkZYOE1bGOYibOsPTumHs1glsFquac97tlGGYEYFShnapiYb5YiD\nxXzOW6/fY37YYJKmCg1ohRZN8IEYIk8NFG7PYjZ3uHX7jL976PlPI08KjvF5ZPuvjnnmvx+x/NlT\nBr/cYZ+GZARcTkIgZki0PZmza9B1C2FNrO8yfzDnr759j3fqwK85If7SHptbL3B69C5OC/XiGJdp\n
2sUpNkYyEwnecPDmilQ4NAPUoKUsE0OjYBXQSpCQSAjrBPfx6Fz4bCv8jW24j1BpOJ4GBgg3twqs\nNWxtbXxgPj/ckEtldHXFyf1TihpwlnIoZDonLDtazqH2PD5NfP3rj3nwOKOKkagdAUVsG7IY2Xaa\nV/ZL3LjAFrvsPa944ftLvvtwDbHDSMtPPkoMK/D/+4T31n/Lzbf/GvlXP0X48nP4fICxJUY5UlcT\nl+eE0yX1yT2av3wT+YMH3PGB/5Nb9E/N+Ee7wkbbQPA8uvc+2zevUK9PsfQefUfvrTg504ga84Vf\n/9eYkeGdb/4PioHh2Zdvc/0rn+ZsveTv/ui/cXx+RF01uLMKv+o4kMSjUpGemvCTNy5z/blr7Ozv\nMN4foi8VNGH6ZMnWTcfi8AypPGUxJs8HjPUmVtUsuyPODzse3VF86/WW1SLilKdMnulQc20/49a1\nbS5nlktuTUmDZDmqHJM/O+GlV09w37rD+XyFiPC6B/N25G8vK1751hyzhuq3/oKD7BHje7DxZ/ep\nfvUZwiemxL/8IYM/PUC/dpnR/zzGrQK/ZqG6nHjh+gTbedrFAfPH9yj1AHVrA3TNoLTc/eGCN+7A\n9MUXee1XvsbV174C5Ygrn/+nSFdjJjvEbEbyDT/3tZfw976HPzqkevCYwzcPOD1c8LVbV7n0iV2y\nDUvSluQMamAIpqH2T1iNNPUBWi9xw0BkRTkrac19lq3i4VnirXc9d364Jq3hE5czNmZjbnz6Gldv\n7jGeZShq4tmKcHaPswcLhmNFaz16MmTnpef5pK45ef9+T5q5DG+/UjINQ47fOaP8zprlF2dIaBj/\n0T3M4470e68z/43n2fizA/TDNfobh7z5yh47/+suZy85PrOhWT6+T5mPcOWIXG0wHu9RFjO0XqMW\n5yRt+fn/8Bs888WvUg7GpKxEtIHN64QoRBFSij0HZbJJOowofUY2bbn5MzOcvQ5ZTigVrbFoM8U4\nTZIaRYuWJ/T1M6HDFZZYKwblmBUttel4+HbGG988w9rEy7cn3L55id0rE7LdCWxvEYuNnjOXPG13\nCKtzTHGEMQkVItFDNtlkcvMS67jg/OBx77U0dIxnE9wLE85+VZONR2zmBelfTvG//wPqX3wOffk6\n3T8fIX/4Xep/8Wn++o07pBcSX/qc44o3LNcF3hQcVSuKDaFJZ/h0TtMeE5ZHfOrlf4J86ivE4ZV+\nXh4NJmqSKYjK9G13bNBaSHFErBrUyWOcG6BHA7w4VDGEvECbDMSCBm0sKIvhCZua+WFF1miMdyyW\nwulB4r231kyHa37u82N2r4zZ2NlCFRPibIc4mCDSu6JJWuN1h7syoNOGFBwxtyQxaDMAIsVszGg6\nZH10SFt5xCSKjR62akmIM9h8QPXqJumnb+BwaGVoXrlK+MJNzlZH/PBPDtk1GWjHMCuRYsKj8xUm\nd5wfHzLRnqZdU1cNUgcWy4bxYEouUyoBVD/ZE1oMDaIsKfUiqs6OsLYXvdpiiNIlQfdLB8miQkSp\nut/ANYAmPWnpt1olTk8s9SE06yN2dgq+8NkZs0s5+cYIPdyDYhvGYyTLSMogQUPTYjpHFwRlBJVv\nke1CmkyJzpJSLz4yo102ryxpF0domePrBrWssENHNArTdOi4JpqIvWiTu9AiLkNruP+9e5xUAdGR\n+RlMy8j7p6dc/eQLPHj3LuQ5hRJ8bFBSUA5KutF1nJmCUjgETSQpBYQLjaciXbB0ozKo4YxOWzSe\n2FbgIioaggeUQrQCNL71qCzSH2fzBMk+fgDVvGN3a8itFwfs39ig2LyELobEfEh0JSYvUZlDJY9W\niaQF30WUUThTkLqG6C1aj0nRQOcR36K1I9opdusyWzdOkHif84cNvqrQNhGTpgugg5BsQxRDqywp\nzzGFgqrhwd0jOoTzOnJ8WLGxLVy5do15V3Pr1m3evn8XmzRaIl
1dUc2Frcu3UckSdD9VVERSBNEG\n+fHVy6VVitjJNl0+ROMJGFIIRN+hM0OMEUy/bKSLy+kn9Ii6tneJyy9v4MaabNTryGW0h+QFyRqi\n0Xjbo9HJK+gihBajLUkZUuyQpNGu6KEtL/jlCmUdFA5cgdJTyq2n+lren2JNT0Gw2iDakJTqzRnF\nEJRBnMPmhtXRCe/cO0UhtAEOTzw3J8JsMmHddizP5sxGU1JXgFeE02MG7grD2TZJVC8PIvacaBT8\n+EirniyUercVpJyRihlh8RCVF6TY09O6ekVSCmUE0Q6jda/E4Ak9oq6+sMNgUpKPJwQmhOhQqkPT\nK7NUvNDlWUNIglbmQkWkIPUy6uBrxCna1qKTguWKuuvIdyypKJBsSMqnuOkmpff4dY+yG61QmUNM\nhk4ZJI3VOdoOofG89859js4btMoISjhcQdUk1senDKYz7v79G+xe34ViQtcl/HqF3dpFBmOS0v3E\nMAUiAbCkCzhPGyApFAqdapKbIJM90vIhoW4Ra1BaXYwgwK8j2hWI1ehMWHdPKGCK04KuyFm2LZ0/\nxboFURm8GFAFooZIdNC2EBJRKcgsOIh0RNWA6dCZQg9LolbYmDBVSzg+h3oFCVQxRo2nqMmIpBNI\nomubi3UzsYoneDVH6xUS51SnDzl8/5guAcmDThxXgXWAGAJnjx7THi9pz0+IKWLynKZrafIpMRuA\n1kiiV8BJ70KplEKpH6VDkCiID3hdkF3a76nS0tPqUuh6tzMipuuQuoGqQeoOxROCBzIcUrUNqamx\nqQNfk4JHoifEBq8TwZkeaJWaFJdICr3c2VqUGSIyhOBwtDgjiB0QvEC1IJ2fQy0kGWPGW2TbU0bb\nW5AJ4jzoFqM8ZaYQNyK5TSQ5VNdy9PiMzhtaYxHXq4HvHDbAkNG0pNwfUc81cTygOXyAyjLMIKdn\nMEKk7QWpXkG0F6rD0DOx0EQBSQtaLMyuwCCD2CCp137q4EjBIhc+Uil2hK5/6D4oPnQZGWxcRhUN\nUq1JsaP1HkknKMkgWYxTiJQk0YiJ/U6eEl0I2EwhUUE0aKWIjSd2HjEabQziPWG5Qhc1FBnaFKjB\nFGax39V9ADKCFzQFyuREIDRLzlcL3jttEQ1j07LlIiOB++8u6fg+s9sDRGnayqOdEE7PIPqLhPYw\nLvR4pvqRuXhPgSKl0JOFpAXVQ2OqmNEVW7B4v1c960g0iZQiEj2i+9sXu0jHE3aQXkqyYkhIGc50\nmLCia5b4tkW86UX9MYDO0EZd/PzUAwEGUptAS2/7owd0qzVGJUy0hDognYemBmeJkiFqQixblETC\nqsKkjM5rgs+wuSGlNcRzVvWC4ypQ2sj+TLi1PcB0JXffXtBUDdbMODo44ubuTYrSsj5q8K1H2xFy\n0eH1MvjUD5z4EYFdiDGgVUeiI4lGp4RSQ9TsFvHwAAmpx2Kj753RYiSEQJSItvrJvViz3NI1kWRy\nkiiMiSibo0JLWy3x9YrUelw+RJJgrOsH/iqDKEiIvcBJaVAamytUbAgEkAydAqFeQ1GAzVHWIcqC\ntyQ5IXXglAXlSAS6do7yS9rVOZtlYrqhePbahGuX9xi5Ldrmu5T7m2zfuMR0uMn6qMWNRjQup+6A\nfAswpAtnhz7lgHhAI6IIsem7SNEEcegUevRn6wbN3W+iG9/3L74DFDF6QvJ9sRgj8iEZ/QdEpy0u\ny9Emw/qK2HiCKrDOk2tLjL1pQGorfNuBMZhihDgDyaG8JUlfd9OBRAFnEYmIdaj1iuB7sxhMjhIN\nkiEGlPbEtCZ0EZNnmAyapsHEiGornt1RXN6acfP2PoONpwhNYrIj6HHi0dkBNgzY2NmFrMBkBeKG\n5JtXSKL7TexCFi7CRfnXP9m97DAiZERxGBJJhDjahPEIOZ2T6gZlNMGD6H7OriQRWk+j6idLdqs3\nyK1F2o4gHqUKXLXEtx1iNC
obAQYlBokB7xti25GaMxQ5OiuIaLQb9X544kipQ/QY6QyoBehIz/kH\nlMZno95QxudoUyHKkEYzYn0fdW4wuiCzip+4vs3etaexGztEN4Lccrou2I8WEx2D6Tamzagf36do\nl7hyTHblEhJjb5X/I8sMJZAyUoogoXd8oNd3GnzPFcei7Yh2eBlzdEDwEKNF67YnI0kkKU9QLbF5\nQk1NUpCsxhiNrxrE5CRlCN0CTUsKNaJyPBkBjSkn/VeGQEqBgAajSarpKxM9JnYdYlQP+WuNdgpV\nDEjW4H0gtQ5tSxiNSHmFUoJyhrbNUINtlBFmT1vsOicbTGmSw+oJQQU8a6pakVzEzYThZMjB+j2c\nSmzeeB47nPZ9gYT/X6LFvtZW0sN4ShQx9Dck/Vga3b8JerJDdIaUOmJs0VTE1BB9BG3wSvMhJ6f8\nA7ORRUa5d4mEgjgBEsEolJtDfYKKba9pL8veosKOSIzQ2P5PIXS+IsYGY4bEoHspRohEFbCZppMI\nKUcl17++yRLEkFROUBatA0JCZxMYlSRjyPQIsxBCMjhdolRBF85wRWK8MaLODeV4gHGKmSk4/O57\n7H/mkyRlIXpibPsNMPU1iehEAET1FUaK/bkGIhfdpHg8HRRj6tBC6vraPLX4GEgJQpdReUu1+mB+\n9sen5n2E8fFBbh9hfJzsjzA+TvZHGB8n+yOMj5P9EcbHyf4I4/8BT8DgwhlLseUAAAAASUVORK5C\nYII=\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + }, + { + "output_type": "stream", + "text": [ + "3 (434, 290, 3) (68, 2)\n" + ], + "name": "stdout" + }, + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFkAAACPCAYAAACRbHedAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy8eZglV3nm+TtLrHfLe3OtzFqyqlRS\nqbQgJEDskgwyDW02Y2xsoG2m26Y97bF7Ho+nPe7Nbve0p7ttt5/pGW8YG7MZbLwAtqBBGIRBAoSQ\nhHap9szKfb1brOec+SNuldR6UKlVnsF2DyefWxk3IyrixBvf+db3C+Gc4zvj/9sh/6Yn8P+H8R2Q\nvw3jOyB/G8Z3QP42jO+A/G0Y3wH52zD+ToMshPh5IcQH/6bn8WzjkkAWQrxcCHGnEGJXCLElhPiy\nEOKF/29P7ts1hBBfE0JcLoQ4JIT4xtP2fVAIsSyE6AohHhdC/KPnev7nDLIQogn8OfCfgQ4wB/wC\nkD3Xc/1tGEIIDzgAPAHcAHzjaYf8EjDvnGsCbwD+rRDihudyjUuR5MsBnHN/4JwzzrnEOfcZ59w3\nR5M+LIT4SyHEphBiQwjxISHE2FNu6rQQ4meEEN8UQgyEEO8VQkwLIT4lhOgJIW4XQrRHx84LIZwQ\n4seEEEsjifpfnmliQogXj1bYjhDifiHEzf8N93M18LCrQt8X8DSQnXMPOefOC5AbfQ7/t8NVneQ5\nfYAmsAn8PvBaoP20/ZcBtwIBMAl8Efi1p+w/DXwFmKZaBWujG3s+EAJ/Cfzr0bHzo5v6A6AGXAOs\nA68e7f954IOj7bnRvF5HJTy3jr5PPsN9vAvYAYZAOtougd5o++BTjv310XFuNNf6c8LsuYI8uuiV\nwPuAxdHEPgFMP8OxbwLufRrIb3/K9z8GfuMp3/8n4M+eBvLRp+z/D8B7vwXI/wz4wNOu/V+AH36W\ne/kr4DpgP3AfIJ7hOAW8HPgXgPdc8Lokw+ece8Q59yPOub1Uy20W+DWA0dL/iBDinBCiC3wQmHja\nKVafsp18i+/1px2/8JTtM6PrPX0cAN46UhU7QogdKlD2PP1AIURndMwu8FLgC8BjwBXAthDin36L\nezbOuS8Be4Ef/xbXf8bx13bhnHOPUkn11aM//Tsq6btmZCzeAYi/5mX2PWV7P7D0LY5ZoJLksad8\nas65/+NbzHnLOTcGvBv4ndH2p4HXj/7fr11kLprnqJMvxbs4KoT4aSHE3tH3fcAPUulZgAbQB3aF\nEHPAzzzXa3yL8S+FELEQ4ioqXfrRb3HMB4HXCyFeI4RQQohQCHHz+Xk+w3iqN/F84J6n7hRCTAkh\n3iaEqI/O+Rqqe/3cc5n8pUhyD7gR+KoQYkAF7oPAT4/2/wJwPbAL/AXwJ5dwjaePO4DjVDf3y865\nzzz9AOfcAvBG4OeojOMC1QO+2D3eAHxDCDEOGOfc9tNPS6UaFoFt4JeBf+qc+8Rzmbz425y0F0LM\nA6eoDE35NzubSx9/p8PqvyvjOyB/G8bfanXx38v4jiR/G8Z3QP42DH2xnW9/x1vdoYMHMMYgpMQY\ng8PhbKVinLOYsiRPE/rba6wuLpJlKUVeUhQZ3V6PLHdIodBK4XmKei3E9zTCGkyRY5yjKA1ZkpMk\nGUVuyQuHVILLDs9wYO8sfhAgneHcZo9zyytEUURRGqTUWGOwriBJMwI/otNp43keYVRH
OIvvKQor\nyNOEZqMGzhEFHr6nqdVq+MoR1uv0nU9mNVZInAMhBNVdCsDhAGstQlTfQYAQOBzCCRYXF/n4n3z8\nWwZdFwX50MF5jh27nKIsEUIgBBgLzloEUJqSMkvpba1yqrfMzESTIvOwZUm3Jwi1oChysA4lFVEY\nMtaM6bRaSAl5lpFlBXmeU5Y56TAjLwzGgHMG3+U0ApiZHqMReuybnuCRSLPd3UGpEOsEQiqctaRp\nSp6X1ANFGGra7RpxfQwFhIEmzUuSQZ92u4WwObW4RhwGNOKAOPZQcYtdYlKnsQgqU1UBiatAPv/b\nOVc9hNE+nsWsXRRkACFkBTAghcRiEFLgrKPMM0yRsrpwmjwtacQ1dM2n3++TJCmezKqJCEfoezSi\ngHoY0og8Ak8TTXQoi5I0TcnSDOEEUkmMcWRFTppl5MOULEmIAp/Q97ji4AEePm4YpjlKCqTyMMIS\nxR5a5wyTFK19et0+SnmMjXUAQyMOMGXJoN+j05mgKDKMp3EO8twSih6tusY6j8w4hFRPkdwKWARI\nBE6AdefxdyOgnxnpi4IsZaWyPaXIiwIh5IWTmbJACsnu5iplMqAR+zQixcrSGhtbfXr9AWWR40uP\nMA5p1Zs0azFjzRpjjRq1KCAKQ7QncWWJyQokgqIssQ7SPCfPS5K8wBmLsA6lJI3I59qjR3ngsRMM\n0gHO5sRhnUGSUK/XMaag3+8TBD79Xp8oCGh3OrgiY2ayw7nldYaDLo1aDWMsSZKiVYQ1lprLGdOa\nHRFSWAmj2xVCIp3FCRDnf4RDIHBCghIVNpckyc4hhQChkNLgTImSCmMNWnuYPKG/vY6vFMJZdrd3\n6PaGpGmOM4LAi2g1ajRrMdOdNlMTYzSikNgPkFKglMLzFViLaioElYQUaYJxlQoYpBl5aSnLHKEC\nhIBmLeSGa6/i3gceJckTtHQ06y0Gw4QojFHCY9DtMTk1w8bWNo1GTBzEKGHYOzvD0vIyjbrD2Eo1\nWWNQyqccDmi0PHKh6AqJFBJnXaV3pbiginEV2E9Z7xf1IJ5VXeCqC0ghsEIgpMATitKUDHfWGPZ7\nSCco85TV1XXWt3qsrXXxPI+p8SadeoO905PM75+j1ajhK4XneYDAYVFSooQczd7irMVEIUWeEwY+\nvpeQZAVZWZCUOUKHYEvajQZXXXGQx08tkKRDxsfbKK0pi5xaPSYZOvIyR2uPpXPLHDywF+X7NELN\nnplp1leX2TMzDjiss+RZQthq4fKU8TjEFIJ+CVJJwOEs2ApduKAmKpXxbOOiIDsqlVHpSYOUlX52\nzpL1enQ3VsFY8jJn8cwCG9t9NjZ7aKnotGL2zUxy9eWHOTi/l7FWi8APLzx5qPSYswYJOFPirMNY\ni5EeUnkIqaqlKTPc0GCdozA5mSmRUjI31QEkZ5aWKcucyXaTrd0eWMPYWIvBcEjg++RFwckzCxw7\ncggBdJohWVKnt9tjemqcLM2IwjplnhC32iiR0QpiUuuwjgpJUfm7roKaCz6GuKBTLg1kKQVCSXAV\n2EpKyrJAOEva22Z7Y50sTVk8u8DK6i557vC0R6secfnBeV50/TUcnt9PLY7QSiOlHukuiy1LnLE4\nW6khi8IKC8IAEsSTlQXPOryiwGDod3dx2iN1ljgK2D87Tl4aNrZ3yLIhU50x1je3CTyF12iSFQWB\nr8lyw8lTp7j62OUoqZke77C+sUWR59RrMSbPUGGDMh+ihU9D9xh4dXq5wFqLlAKHwBhBpTmedPCE\nlBeV6IsGIxJRWVNnUSNrK6WkLDK21pYYDhLWVlc5u7hFt5+S5QWe1Fx+cB83vfQFHDt6hLFWizAM\nCTwfT2k85aGlRikPqfRIWhVCVjpQCInS1QPRuvr4WuN7HloKtNLYIidNErq7OwSeZP/cFJ2xMRAK\nZ3L2zEySZymdZkzoazwliXyP7Z0+p0+fJgoEoS+Y
np6s1N1ITQlZ6VatFIEwdGKJpyqAjbUX/Ocn\njV+lPs+DfUkgV9JcqQglBVorAJL+Dlurq+zubnP85BLrOxndoUV7AVddfoCXv/D5HJrfTxxGFwBT\n2kdpD6EkQiqk0kihwDqcUiAVTsrREhQj10kghUQrjacVvu8RhQEakDiyZMj2+grNSDEzXqfdbLG9\n28VXjk5nnCIbMtFqIIRByBI/DFjd2GVjfZ16PaZR82iMddjt7qJ9Tb+3ixf4SOGIQo+WZ4m0AGdH\nkuqelNiR2jy/fTGYLwryeXUklRrpZ0FZpOysL9Hd3WV9bYPljYR+VqI8jwNzHa66fJ6jRw5RrzXw\n/RA9AhepKoCFREgNUl9w6oVzCGvAWrBgrcNRBRpCKoTSeF6A9jw8LQl9H00VgQ37fShTZqfadOo+\nM5MTrK6t0R6rozyfeqxoxjUCrWnUIoaZ4bFTC5T5EE8YZvdMoqXGmpIortHd2kQiUFLSCCSTNQ8t\nn/QmhKgCFCGe8rcL/1wCyFoppBBoVRk/ay02T9heWyMZ9Fld79LLHMbBWOxxeO8M1xy9jGa9ju9H\nKO2hlK6kVnugfYTSlW85kgAhfaTU4CRYEEIBCkYPwwkxUiPVbykkYRASBgGeUqRpyebaKhrDzGSL\nibGIVqPJxsYaU5MTFFnKZLtBFIaYfECnXWd1fZPjJ07gTIavHHP75rBFji8tpbEUtqy8KlswUROM\nRbqyT87hnL2gMqSUI2zP0zEuAWTP8/D9ACklnq50WzbosbW5ye5uj41uQeEcvpRMNOocO3KQmalp\nfD+sDKXyUDpAeQFSV96CUxoh5chia4QDnEBoPQK98pe18pBKoZSHw2EBoRWe1mglqwdvDVpKdrZ7\nZEmfyHO0ahGH9s6Q9lOKtEuzUcOTKVPtCN/3mR5v4nkxJxY2WVpdQQlD7EtqzTFKU9BoNsjzojJ0\nrkoNTDdDfCWreTsHzj4J6khViEtVF5WqqLwKpRTGFGysLpH0++zu9ummBgdEWjI32+HAvjn8wEdK\nVbl+2kPp8wZOVk8f96QkCAEXAJeI0XHSrx6MlrqSXiTGWJxxCKGQ2kMrHyU1WimMcaytrqEV1CNN\n6DkuO7ifswtL+BryLKEVe+ybmWTQ77J/7yRlWbK+nbKztYH2FK3WGEIo8nSA1D5JMqi8CGOYbAaV\nNIuRyzaSXGvNk9Beurp40sOTQjDsbrOzsU6WpfT6KcZWJ2jEHvN7p+k0G+jzQEk1mpSrdO352P6C\nqqDKgQiBUxqEAqVxUoIUOOEuJGCcpcq4WaoHNvrRSqEExGFAd2fIzuY2vjD4CsZqmv379rOxvkm9\nVkNQMt2pE2hN5EvG23V2drc5fmoZZzIa9ZBGawxjHZ40CCQWKPIUmfXY067hq8ounY9Iqu2RqrhI\n7uLi3oWo9LJSGmMKdtZXqpA3L0hzUy1hoN2MufLwQeIoRkp1IakEjKKi8wkmkCNLLS6ETjxpYUfH\nOwvOVv6pMbbK/KFwiGqfcUghCfyAOIyJwpBaFNPdHoBzBNLha8me8QaZlUjn8JWjHnnsmWiTDYc0\najFKSgZ5yZkzZ3EmI45j6vU6WZIilaRIByg/pigzJsZqNOPwScG5EJI8PcR+jiAba1FaIQQk/V3S\nQY9k0KcoC/Kicl+khJmJDnMz0yitEUpdSCwJoS6oDjFa+k7IkTSLUZbwfOTnwFV/s6XFlAZTGIwD\nKwTWmFEe4Umvw/NCgiAiCmJqcQ0lNIPugH0nlnjtb36cA0+c5eDcFKtbPer1MWr1GrMzE8RRzHi7\nQbNeQ0nL8taQ7e2dKi/SnkQqjTmvmnAkaQFZj32TTXAOJZ8CqvivfIznDnIVgDiMKRh2d0kGPcqy\noCgKrKkkVAvB3ukpGvW48iREJclS6SfBRlYXEhKpqpAZIaoAwFElqUchrLUW6xzGVb/PGxaLwqFG\nHodE+xH7T63w
uvd9kj3HFxgUOZm1rG32OPLpr9Fa3uKav7yXmYkW2vPZ3t1GSEngK6Ymxgh9n1ar\nwcR4h35Scm5lg3Q4JAhjouY4eTpABTFFkRHU2uxubTJe92lEPg6JEKparc4+Wzr52YIRhzUFJi8o\ni5RsOAAnyPMSax0Ch68Uhw/uJfD9ClglKr0wWvcXXBznRsvK4UwVVltjAFGF1qPw2pQlJi8xRYkV\nlYW3oyKBc1CUDiM1qbVc+fmv0V7b5tjn7+Hhx4/zzUcf5aHjJ/nUFTN0Z8e5/5VXIhBce+wI9a89\nwvU//7tMffMJ9u/fT+EkQRghsMxOtDhxZo2trU0khrHWGEFYo8gzlBfgaYlFIk3ObKcx8jBKLtwa\n9tJ1srWWPC+qqsWgx3CQAKOM9cj4NOOQfXMzaC+oLK8Y+Y5PkcIqR2GrJW8s1hRVwOHAlCUOQVmW\nIxVRYozBWkdZGvKsoDCO8Qcf59bf+iiNex/izPIyDz72OH8w1+ZMLeD9kzE7vQFKaCSSOwT8/hue\nT/+GK0nTPqEnuPn+czSXNrnsU18n8iWh51GLa9gSapHAoXjk5BKD3jbK96k1GpiiQGuFNQVjE9P0\nd7aY6dQJfK8SkPMRn/3rSLKrEuWmyMiHfcqyHDkHEkcVajdqEWPNOkp5CKW5oJvO693RYnLWVqrA\nmhGIBlMUlEVJkeUUWU6epxRFVYoq8pS0PyBNBnR7uxy9/S4mNnZ4wZ338sSJM5w5t8JnRMlPHJrg\n4alxxlsdxtsTTE1MMdGeYHMrpygNRWFI04TVH3gV6xNNHn/djRhrGZ+cwDjLMCsoDVx5ZI4njp9l\n4dwy5bBPWB8jiGukgx5C+mBLSgSxB9PtRuXzV7WoCyH3pUmyq6StzIakyQBjHdbaC5xbKcDTkkB7\noyxdZZAQ8kmvZsSBtqP/a8qyqg2acgRoQV7klTHNMvIsJUuHDHtdurvbrG5scuLUWT62r8OZKOD3\n2hHra2sMun2UFYzVm9TrTTztgxAYYyjLHAvc/+BJWp0J8sKwc/3lfPGfvZPH5pqApdNuob2Qiakp\nSgSNesTc7Bz3Pfg42zubgCOMapUw6WqFhnGT3vYmc60ANbI352t+F9PLF8/CCYFzjjJLSIcJjsp3\nVlogRz65UpWHobSukkFCjqq4TwHXGKyz2DKnSFOKLKPIMkpjKIqcIs9Gf8vJspx+r8dOd5fN7i6n\nl85xdnmZzyvHTxwa587Yq/IKUUyr2aTebBFGIUorsjyj1+vR6w/p9QacPbPAN+6+ByE9BsOE+f1z\n9IclZZ6CLYnjmGarwe7ugNJaDu2fYGV1F+64l+kf/lnqX3sAKSVJmuBMgecpBoMBkSxpRh7O2Soa\nvXhU/SwJolHiJk8TiizDlmVl54VCy4p6XhQG60SlLka+r3Oy8hDKkiJLydOMfJiQDRPyLKXIUoo8\noyxy8jyjMCVlUUl0mib0+wNa9z7C937oNuYeP0tvkJCXJVIIAs8n8AP8KCKo1ZCeR1Yatro91ja2\nWNvaYaPbY2V1lZ3dPnff8xC9QZ+8KLBFweEjVzAYVJLeHmtTFIYwqrPdTWiPtzlw4AAHPvaXRGdW\naL/3Y/hxE+cEZenIh0P8oEba22S2HVeup6lC7IsxsZ411elMTpEnpFlOWeRIoQkC/4I0p3nFf5BS\nV2pCSIIv383kP/xp1B1fZjjoIb/wJfb+5D9Hff6L9He2GPa7pIM+ybBPkeeY0lBagykNeZ7THwx4\n0d0PsWd3wOtPLl9w9WxpcNYidYATgl5vSDexpC6klDWs1yIxPsvrfU4ubHJurcva5pDPfPavaMYB\n/d4OE+0GRinCMCIKJb4STI6Ps7G+jXVwxdHDfPLofrZnxll8yy0ooVCiKoukwy6eJ0mHCZMx1AKN\nc9WcLlknCyEwZUGeFZRFMQpyKv3jKUXoawoDWocgZFUELXNq73k/wckz1N7zQR
57+H7aH/xj4oVl\nJv7wzzl95iT6C3dy/b/7DVr3PUxuSowzCCVxWIrSkOY5tx3Zy9lawEdmWuR5RpJm5MZhEHS7fbZ7\nhqCzl/G9Byi15ci5BX7x7vt53s4WLlS80sHvbSRcvZHx4MOLfO0bD+ArwXB3g30HDoMtCL2ARqNG\nECmQgkFm8H2P/o3X8r633cQT+zqYvE9cb2LLAh3WGeys48VjmHTA3olWVbF+FkrAxXWyBGcNZZFi\nTFlVmMWoFKUcSjiUVnhhiPdXX6HzIz/F5kf+iLtecAVr400+c8Vejp86y+euOsBau8Fnjx7gxOlF\njnz2LlorG1z+2S/jrEEohfI8tK+RCkpbcv9Eg39x3UHubsZkRUleGvIiZ2enj4zbNCfHOfDIA/z4\nh/+Q5LY7eP3DpzmUZLzt7CYL53Z411bKUQs/UTqsEdx510Ps7OyQJUNqgaQsbWVHBMRxgyCMWV1Z\noSgK9u2dxhrJ+u4um+uraF9XmUWlydMSUwzo72wz3ZA0ouCCcb8kkE1ZYK2lLEqcMVhTojUYU1KL\nw8ovlhrPD4jf8yHis0sc/OQdPDw3zXvf8Aoe2zuDsfDo7BS/fusLeWTPOKYs+fyxg6yPt7j/5deh\nfU1cqxHVakRxTBhogsDD8/0q4gTysiTLUpIkwQtDhBmwubbCTyxvcUVR8h9Ly3+UgrPAmBC8QTp+\nM1I8rgUfnoyZPzBJo9Xk+OklfN+nu7VBoz0N1uBpRX9nk2ZrjO5Wl52tbXrdPpNTU3S7fVbW1iiG\nPTwvwBYZYb2OyYbosI7pbTBW81FKXEyQn8Xw2SriS5NkxAUzOARSjoDQoHyfne1VHrn1RfT2zrD0\n/a/nqquex/OvvY6rrjjCzPgYka8R1pINE5LBkHubPr996w2cPTRLGIbUmg3qrQaN9hiNRoNWo0kU\nhggpcUJgEBSl4WVJyXvOrHHdwgYH9+3DH5XDosAnvfEaTBQwayw/mTuWju7lF284zMrR/eyfm2HP\nRIfuTsLOTpc8TZBULmQY+KSDPnHks7Xdozcc0usPqMU+YdhiYydhc215lMNRBPUO1lX5nP72BpOx\nQDhz6epCKU1pTEXHKgs8XeVwpYTA96gFmluHA675579CqzPN4m/8e7j1lTTqEVHoEcURcRwThAHG\nObKiJCsKkqxgtzdgfWuHvMzxo4Cph09w3S/8Zw4ubtAege35MZ7voz0f5Xu8e2g4Yhw/JRUvfd4x\n7r7lRtbaTW675hAToeJTl82yWI/43LF55menODA7zYGZGeYmxpls1AmF5MzJRWyZs7u2iKXKMPqe\nz3B7nXa7Q7fbp7SGsa8/wPe99xMcPrXO8eNPUKQDvKiOKxN0VMMWQ6QXobItWqHPxTzli5NbhBjl\nhR2SqmIrqFJ91jpqUcj/uLBNszB4f/xpTrz0BWTDAemwR9LvsbG5ydLyKgvLa6xv9OgNhhRFjicF\n4+MthNLs2dll37xm8kN/SnB6kflPfp7F//lHmGgPGQ5LnHOkhUXg+MP9Df7hdsJD3/1KarHP5lVH\nuO3QXpIs4fI0ZWdmnPccmcOZkj3NJhMT44SBjytzsjyn3+/T3+7T2+0xpgM8X5H0chqNOv1hn3oM\nW7sVV+Oaz9xNa7PL9Z//Gu8ZeylzJx9hev4q8mSIQ5ANh2g/wBU5c+PjF012Pjvh0DmssRRFjhyV\nfQLtkWZ9AP7PWsi/jWI23/L3GPZ26O/ssrW5zpkzC5zbHNKcOwzjPpcfv4e3L6zwW42Ar9c8zPoG\nnU6HKAoRwtF919tove+P2PqhN9GoN5mdtoReQBT41Gs1esmQswjes3eW/TNT1JzFCYeWDg9DaUtq\nnsfU2BjT05M0Gw2kFBRFVqmpdAgju7J0eoFaI6K7toAM6oSBRjqHFBZbFgyTIZ+57jJe+8BJzr3t\n75N2t3jgwQdodqbwohbD3Q1KC7IYYowjCp
MLqus5g2xsRT6xVJyEokixroEf+JRFSeng00ow8z23\ncPPRebprK6yvrPDwI4+zmYe8Arjpg3/Ar4aat273ucLBu7uGByb28IKNXd59x71s7d+Hd/V1lC9/\nMVvfdQtZt0u8tQkGPKkIPJ9Os05aVg85DH186VBCoIII3/NptzvEcUitFuMpWfnUxlacPanRwkNr\nr6qEG0tv2GV7bYNGp40v5IiT5yGsI80GpEXO3c2Y4697ES+4bIb5nTEeuu9rHDx9nANXPJ+yLLFA\n2u+iwzou2aYde5cGspIV4Q5bUbTSLEcrTRQqtKco05KkcNx536McOzDBxsoSp0+d4/jSDrLV5iVf\nuY/5vOSfFJpfjRT/MjG0peLtjRavW+8zNRww/qkvs/G9b0ZJifelr9D5rd9n7QfehLvqcpTnEQQh\nYeCRpEOMKQlDn1qtThzH+H6A73ujqrpECjHKdVukdkgLXigIcYR5hvYjpNKUZc7ZJ85w1fUN0v4u\n2g8rG1OLKIoS4yAKJb2k4NzZJQ5dcSWPPjLGo489wVhniqDepsgSShQCRzLoEstnfhPFsxZSEaMM\nWlkihUN7Eq0hDFSVClTw4IkznD63RL/fY5CmXN/t80tfvY8Hx1ts1iImfI/rr7kK1WwwlZe87uQi\nj736JfT37mH7h9+KJzUCSf0978c/eZrJj/wpQa1BVG/QGOswOTPL1NQexjtTeDpAKx9PhwR+hO/F\naC9G6hBUgPZjtBfhhXXi5hiNziTNyRk6e/YxM385s/OXMzW9B1dYNtbXSLqbDLo7hIGPwFALI7q7\nCVIJtjd3eeChx/C04IpjV3F6cYulhZNIDFIqSlfFEaY0yLJ/aZJcFJXhkUqCM+AUeW5QUlOLfdat\npREqTq73eWJhhWvnx6k1In54o8verCQepJS+x/R2lzc/fpq7rj/KS+9/jEdveTHcejPL73oHNT9A\nj4qo/X/0D6i95/fZfufbqiyXH+BFEaGpE9UaZHlGmgwo8yr3UJWvZFWEHQmDs1TBjdL4cYzyfXQQ\non0fJwT1yUmCRp182Gdl+TSzB+exeUItilFCEGiJkgJjYWt7ABLuvec+Xvzyl3Hv3Q0eP36a1liH\nsen9ZMNdSlsxU02WXhrISkmcNSihqHLDEmchCiIaUYDvKWzheCOCn7n9Gyx9701c+epXkR64nP4f\nf4bFW14AQhF+9i4ef+m1JIf3c9ctL6YzOcP07Bz1OEZacEIhEeSveDHJy19CkSbILEWj0IGPCiNM\nntPb3cZaQ5HlpHmG1hpRFiN6b0W+ccYAYLMRad2C0j5CSLw4Imw0COIaxbDL1pfXGA4GBKEmHQ7w\nfA+lIS9S0iShVospsoylhUVskXDs6mN8/a4vMDtzhnqrVZXbRqxfqS5RJwshkaLiJ1eEQAsCAl9R\nC308ZTBO8nOm5FAJE1+4j5W3vQVx4BCLr7wR3asSQfc9/yhSBcz7AY1Wm/pYiyCMkLJqgjmfN5Uo\nnHBIZ1FCILXCDwK8OEY06sgoZH1tjaWFMwS+wk1MUa83cZ4/qsoI7KjCgnNV/hsQ2quIKVoS1OrU\nxyfYc/Qa5jeWeOLhe9hzcIVAKx0AACAASURBVA/WlEigHgc4K8jSnBftDnnT8XN8eG6cU897hL17\n53kw7nD8+GnG2y0mZg4y3F0hjOrY8hIl+XziQ0mFsyXWliTDIePTY4QB+J6iLAy/7Gl+zjqKN91K\n0w/RShMGEWEQYzvVk1ZSorWH1n711EfkaivcqIhS1QSDL95J53c+wO4730r28hsRArTn49UbGCE4\ne/IU933jm9RizZXHrmR60hCHtcp7UKryhsqqAlOUBusMCIlTEhVF2NKglKY+PsXEzF7uveuvSNKc\nwDMoFQCSKPApy5I3HV/iwCDjB89t8t4nnuDmfYc4ePgAD3zjbpaXzlFvjSOEpsgzPC+6NJABlOfj\n+RqweN
ojy3KEEPhaEYc+eZZxV83nzX7I795wDR0p8XTFbwv8cNSz4ioumdTgzrOGqACwBjviYUil\nqP/uh/BOnqb1gT9k7RUvrsryykOHNeQwQaiYQjc4efY0yg9wzjHZGSf0Q6SqKuWutOR5PmqLyxGB\nj/A0eqCrwEpWzKio3kbpgHNnl5mf34PyBL4ngRIhNR+cHuMdqzv82eEZzi1tsHjmBAf27mFtdR/H\nz6wwMTlDZ2KGIulirHlGDJ8ld1GVu9Wohw8kpS0qR1xqanFA4CsiUWKNBVfxmSvySrUClNJoL0Cq\nAOFVH5CVH1uWlEWGy6s2NOH5JD/+P1AcPkj3nd+PsBaldKWunMVTiiuuvZYDBw+RI3jw0Sd46ImT\nnDl3js2tDQb9AVmWkeUpSTKk2+uytbHB+unH6a8vkQ/65MMh+aCHyVOQVbFheXmL3nBIliVIUZWi\nkiTnC7HPuy+b5Yn5WQa9goVTJzCmz+Ejl7G6nnDq7FmytD8qq+WXJsmOEZHDWaQQGCxaVtSowNcE\nvqQW+3R3c14yGHLsX/0q5Y/+IPaVL60KgFr9V4SWynBWTZJFkZB2tzFFQTTWQWof6Szlq29h+2U3\nko8orNUQuKIkiOtcdvVVLJw8xaMPd3jikQewUtIfDtm/Z4Y9k1M0ohpSQJoMGSYZzlqSUOM3xvB2\nN9GhjywVNnVY56qqytaQze1dpj0fLQVhoMAJrCnZ2R7QqvvEUY0szzl76jTzR65kz+wsZxfW6LTq\nzM7MkqfJpUny+ZyHHbHQzzcPViB7+J6HcBCGkp9MC5pLa9Tf/8dV6d/ZC4zN88wgU5YUaULa69Jd\nXmL19An63V6VjzUlZZ5h85wiHZKnwwuJKWdKrDMI4QjjgHojIs8yCmvpDwds9fssbm5yevkca9sb\n7Pa77HR79AcDtra32drtsrmxRnd3mzTpV2nTQZ+iLCmsIEtLdnYTsjTB832sU6Akee4YZCW73QGF\nrbpy19e3EK7g8isvIy1gdXmNPE8qVXgpkixkxWrUSo2qsw7jHMiq387TCodhrBXznzYLfiEIkD/4\nRnACxah9VlYSbMuCMsvIBj0Gm+tsrJxlZXGRif2CmfnDGGsQeUZZFMjPfp49H/gj1t/yWoa3vgoE\nFIOq7aC3s8PD997N0vISXhxirKXfHxBHEUk9JqPAZEWV7UsyksGA3nCAlBDWIsKdTYwA4SxZllNa\nhymgP0zp9ga0dIxSPs46hILCCYaDnG23S7ftUa/XWVw4y+yeOdoTE2z1d9jY2Kx6ay4JZFGxd5RW\naKVxLkGgMbZEAJ4nUdKBkny9FfCjM3v55euOMTbq/XO6YtPbUUU6H/QZ7G6ysbzA2bNnWV1bh7BJ\naQyUJTYvKI1h7gN/RLSwRPsP/ozTz7uSMk3I0iFfv+fr3PX1+5h+4iy/udnj99oxd/maJCvY6g6Q\nSlKPQzrNBmQ5WZGT5BnWOOTWLkF9hbjVRMcNHIYsSysSo3R0dxKy8QxT5gQ+SGl5ZZLz9p0uv0PM\nfUpSFAW1WsjiwiKH5w+yb34/D9+3xfZOl8JeonchhByFIBUh3JgBUmhMWZXCBQLPU1gniT3FqcUV\nVtc3abY6WGeRpcE5hxmRVZJel521FZbOnePswhL9PGfaWbpb6wigzEuyIqd/0w3sv63PPVfMEawt\n0cszHv2N3+OWR05zerLNO7YGHMpL3rWTcPf4GFIK0rxgdXOLXnebw/v306zXsRKMlBhT0B0M2NjY\npl5bpNYcQ4cRxhiU9kel/pLuoKTezFEYAk/xrrUuh43lx3aH/ON2WFW8TUFeOlY315idnOR+41jf\nGWBE8Yw4Xjx3AdiyqMjOQozcHzEisFShrO9XkU4UCPpJwtYnb2fyH/+v6C99hbIsMOWIwJIMGXS3\n2V5fY2Vtg51+n6DeYPbAfrQXoL54F/M/+0v4v/UB9v7FHXzu8CypFVz7
K+/jnv/069z88CkOZiXv\nWN/hQ3vanIoCPrpvmnqjQRjHKK0pLGx2+9z/8MNs97sILUBLrNYYqUAISutIBj1smRNEEVEco5VP\nmRmyoiQZDirVaOEjcx0eUZL/ux7ia0mSJEgMoR+wvrpK4EkuP3KEcyu7bO50L1GSASkqEnbge4Ad\nMYgsWWEYpBlR5JOkQ8rCEPuK593xVfwkg/d9hOGLno91lmI4JO3tMuiODNJwiJGC6dlZ6u0xdBSx\n7/avEKxvc/XWLtpYXvZogTGWsWHGG/oBf3pwhu9d2OATB2Y5PjnOvzk0j1Kam9c2eeOJs3x03zRf\njhTDzDLMU7a7u+yf20tWFEilq3yvFBhjsUJijcH3Q8Y6baSW5IVjMLQMBgl+6KN9yd2dmD/Jqi6q\n/YGiLPKq5mkMaW7Z3t3m0MG9PPHECZY2dy9Nkq0pEYJRm5jA2arWVpQFSZaTF2XVwmtBa4h9+Hlj\n2Z6ZZOMH30heZOR5TlYUpHlOf9BjMEworaPWaDE5M4UfRaAU2//g+8gO7WfjDa+mOzvFI6+6ka9e\nfznLzRpfvOoy1o4c5jMH9/OWMyu8aHWLplW0kLz5xFn294e85cQij55c5p4zW1yxOuAXvvAw7/7I\n57hqdYu4FlJr1EFqsjzjPL3YCzwarTEQVfW6O0jJc0PgaXwlkQhCXxN6ClPkRFGAKQ2e59PrdsmS\nDKkVlx06QH94iWE1WMosrYILUdH3rbEoIavWAqkrrq60FV3LGm73ND929DA/d+wQQZpAaUmTAUky\nICsLrBB4YUCtPUa9UatWh4Dhy15IdtOLQUgGWc6h/3IHk1/7OEtXXsHrHjuOP7S8/MQCc2nKmxeW\n+e3Zqgj7iT1T/L3Ti/xKadjcTej4mp/FMpFWwcF3PXiKc1dehqcUWgiCsGo91kGA9nyk52OMZZiU\nyH7BMC2ZCyouhRIKZwxB6GNMWXFDnCWOFds7FmsLjBUcnN/HgRMLz4jis74lwFo36n6qSIVaydHL\nPARKC/I8RyuJ71dMG88avvCNB3ntNx/i6qOXIUzV9Z8kQ4wA6UniRo16I8bZkjQZ4IUBSilcKbB5\nRr6yzoEP/ynx6jqHV9ZR1nJzXvLx8Q5vWVsjMpb51SXu6gVshILbjh3k8YdOsddXHNg3zSe04J0n\nVxACvvC8I2gtCYIAT0qCKEQphdYeyvPo7uxSFgWFdQz6BUVWgFTY0iFU1VOtlI8QhiLPqfJMkiDw\nQCgKayp1su+ZX6R4UZCttVSMtyqB72mFc5bC2KrfzlWciDD2SfOKVutJR7coOPPRP+f7spLTr7uJ\nnaPzFKbACfDCkEYYoJRk0N/FSYsKAppfuZ+JD/wJJ176IpbabZKrj3EovZcT7TEOL63glQUqEKSe\nYm6Y8erFVb5+84sR5xYgK5mf6DA31aY+O8kJ7fgPxw7ja42nJWOyovn6WhPFMWrUcpEmQ5aXV3Cu\nQPkS6yzXbA5428fu4aN7O3yt06r0dZrQHNejZtESITOU9nG2rHr7ZFC9veBSQD7PinHOjp5o1Qud\nZSlgUEpRmgSlNZ4n0doRaEVkBG85vUwTmP/UHSxeNocFlPbwogAPhzGGna0tjj/2OJOz6xx5/18Q\nb2wz8+nPcfvLruNIf5d9zrC4r8Pc5gYzvSF/f6vLF2+8lld95ZvExvG8xTWs0+yfO0A0r5i+4gA9\nbVnf3LjAdHKmIApC6rWYQClqcQ0/rPoMN1bXWV9bo1UPWekJnCh545kuE0PDG/OSezpttJDkrkAJ\nfxQrGJSqVnaaDGjbnHqjTpotXhrIalQPc8aAA+15Vbozzapsl1CVnkYQBB6+Jwg98ErLvwb+d63Y\nuPmFRM1mVapKU4SsXmZQFBnLi+vEX/kmrzvzRXCwJOBL1x2hc/QAr/joZ2l1B3zPXQ/y0IuuJn7g\nceoIJiemsUHA5E6XHzi1wHX/5CdJ
+l2m186x/2N/zuOvu4lH9s5ihaMoCkxRUA9DarUagaeIazU8\nz6MoMhbPnGHY7zE90eaRswv4Efzpvpgf2rT85RVz9JMc3/dxRmNMjlQNtA4AS6BDhPTIi4IoUlzs\nLcnP3sA+aqCRoqqUSCkZDpOqDcHZqonSVvkg35P42hJKx6eV5Brj+O2NHYTn0RyfYHxqismZGcba\nLeIoRAnBWzcT9jhoA36zwT1+RNiZ4vhrb8FohSpL5k8tYXyf+k6XF93xVbqvfCXDiXFWXvYy6O/Q\nnhlj/s6vEJ05x2V/8QXa7RaddpuxRp25PXuYmpmhNT5Ovdkkrtfx/IDN1TXOnDpFkQ0Za7fIspQw\nENzb8fnY91/L6UNTWGdoNgLiqOLoaQFaVemCIPSwtqL7miKj0WhcGsimrJpPrK3YllpJ8iKDEfNG\nAIEXUBQWayGOvQpsBb6suug/def9PHbyDEhB2GjS7IzTao8z1hlnot3k+GteTDY+RjE5Tvn27+Nd\nr3oN80OBmr+Mlbd9H/ncLMObbyJ7zWvJZ2fZ/q7vYn1mjru/5/Ws7tmDbIX4+ybZ/OHvJz24j8Xv\nfQ2B5yGBOIwY64wzPjVDu9OhNTFJGMeUpeHc4jm2t7YJtMMApTX4vqIoHVhH6PujjgJLvRbgeR5C\nODyv6h/M8wIhHM5a8jwlTS+xkCqEwNkCY0qMqZoclVTkZdX9o6TEYkkGQ8J2TBh4RH5GklkaWjIw\nlp1BwskP/wU/ZP6c9R96A8MXX4cvYsAxPjFOvn8fx9/2ZuKojkwFE7ffyaGP30bvu7+b7KZXs/Ha\nN9B74jGMcSzvmWV3dYPtpWWmN1a45sQJlt58K8kVeymffyVrV/9vJIMBDPtoz6dWq1FrNKjV6yit\nYdSv0tvd4OzpsySDHu2xOsur62gFjZrGGEuWZQRhg9jXKE9Wq9ZavFHmMYobnNvcpR41yLIMg77Q\n1/icQT5fSGXU1Ohc5bIVRuBcxZMDMLYkzUoiT1OrewzSDDl0REozLA0/dHaFBiA/8kmeeOHVFWXV\nD2lOzeD5QfWiECzWlzQ++Sn8pWVqt93GYHubyS/fSfd5z8MWJYfvu5d75udZmu7wikceobHbY+YT\nt3PfdVcSFgVCCMqyACGIo4D62Bi1uEZYq0A2eUpvd5dzC+fY2trGFEOuSgQ3PbTIb4WwGWr6gwJj\nChqRTxhXNAGLQus6vueBtURhgDM5Y+1xyrJgkJQEwTN7F8/KuxDOorWmzHMEDq00eZ4ThiHGVpQB\nP/Dp9lIsUItHBlCDEo49zYhflJIHpeTzN1xJmiYVoRzww5GlFwpB9YqwjXe+kWTvHpZe80rG7/wy\n0cYGs/d8nbn7vkGz2+XYqRMs+zm3X3WQtXaTu64/ytbGOr2dLcqiwBQZQRgRxTXiKCaq1QmiGC8I\ncEIxGA45ffI0w0GPIPS55eFzHMwsPzp0hJ4mCD3K0uJpgbCVh+X7ijjyqUURWim0rspo1kEU1yhK\nM2oFvgRJdtZS5hlFUVCM+u2EPN/mWuU0MpMRRz79btXSpaUkjjVR3+Blls1hxpcaMS9Jcvbf8wj/\n6oar2Ds7g+9p1OhVC8ColC8YvuQ6dq6/imyYkEaauY/fzlevu4z+MOHlD5zkS9cepj3eZnv/LLfd\neC2+HxADSkicsQRBjK89wqhGEMZ4fogXhFhrKIuchTNn2dhYR2JI8pL3tyRvt5YPT4TVG2qEwlpQ\nGHzfr3pVlMT3Nb6uXg7oe5JWPcbYknptnJmZgFP3PHhpIFtbUroqAjrfySTc6O2Gtqp4mNLiKY0V\njrwE7VlqkSaKSpq5YDc1rO0OcDgeWVjl//rDT/NTP/AapsY7xLU6blRtEedfjePE6A1esHP9Uc4c\nmKI/7COlx12vfjm+kFytq+5XhEN6Ab7nEwQ+fhghPYUOAuJ6E88PUJ6HFJIiH5L0+5x+/LEREcWx\n
vbPGSiT4av3/ae9MYizN0rP8nPEf7hhTDpVV1VU9VGO7y2BszGCMLWRbiB2jxJYViC0rhg0sWMAC\ndoCEWSAhgxE2bmNbZjCmhRu72z1XdY1ZOVRmZETGcOMO/3QmFufPbDbVJcKoV32lWNSmMvPEueec\n7/ve93lrqtryamlohrxZSmvY26s5Od1QT2bsdluO9hdYYxEicbicEboWrTWTiYbw0aDy734mj/qy\nJETWMCQIKRGiY9d0IAI+JIxKGKMYfKS0Cq0SdSUodpGFEZz36dne54tv3OWnfv6X+du7jg//0s/g\n/vSPo41BFUWuLWNCGBCxQiRBYStuRjDaYMoKqTXJB7xzecqtM2NOWYPS6jnUzxYZp6NNQYwD/W7D\nO9/8BqunZwTfcHF+nDFpCJRMWK1yp1EMIAxSJGZ1yYWy+OBQ2hBjoigs3gfKqqBt21ywxTzVv9Yi\npxRyY4SxtxwCUmQYiAvhuZXMx4gSCTd4BqOQQGk01nimXrDT0PpR2RMTf+XuI5YA//G/8q0f/UGU\n0ehgUaZAyZQhJlJSFFWGQpGHt3YyI8SIbzsG5/K3QMmRbUSWGvgBJfLgV5u88O1qx+mjh7z/7Tfp\nuh0Xlyc415OiJPiEtuBTyPwKqQjjwPdg74BHj07pmgGlJM57tClBaLrtmqpU9O0aW+0R0zWLEe+G\nfDT4LEOVKnPdlFK5G6fyaD9GEDLhh0A3ZEltWSrmde57LLSkUCO4j8TfA74O/LP5hAePjtmNQ82Y\nAlJbdDmhqCdUkyn1ZJ4vr9kCaUYTDCN5MEaeQQezWSijIFA2k2IS+HbL+fFDvvbF/8XNt+7yN37r\nG/zQ6XbknCRCyN+yroN+yKpRFyJh6Llx84B6UoEw7FqHEBphDCjL1a7DB+i2G+rSUtiP3q8f0092\nuBhGBZDJf7GYMvQquNyJShHns+gupkiIEh8iQiZmE0NZShSJiRSYsWL8z0LwI0Lwj+4+4p//h//G\nt995n6vVisEN+JQy/qaoUbpAjnAohCIMPX4YRr2Gy27WvmdoGqLPAm8gN4CAFB2Xxw/46u/8Dg/v\nP+QnvnGXl1rPXz4dskVZaUKCNFqVh8FjtWUIiaYdmFWag4MDjBQ0u5YQE2U9Y+haklCcXVyyWl3S\n7jaU9prHRRh6RCXHiYAjxszXFKN32toSRML1YbyIBtzgQMus0pGJRRHZtWClZJoi2yTwo2c7pcTX\n3rzLv+gcf+ez7/H6197h6q//VcKf/SnQ42U4yqxi9ASf/YR919K3LT4MWVBjLd4PlJNJbgwZhUiB\nq5NHfPm3/wfvvvU2Dx7d5xePLH/xJPDvDgxKalSMwJC9eCITYkoJjfPEJJiUir3ljPsPI8oItk3P\nerOlH7IXvN0NhCRwQ0dZXrcLN9pNgg+klNEgQ/DElFBKjGcnDMGjjUKrjLUJEbxPGK2ZVAG78Znf\nEySFhvUA7pmmI0XevvuAT7//gEmIpJ//9xz/+I9QVJPMM0oi6/B8YOg6+t2Odrel71qGoWcYepSU\nVJMaEfeo6gqS4ezxA37/C1/grbff5vHJMb3r+PIEfu+VKg+BU/7XpSgBhUiCpvMcHpY8fbyhdeCG\nnsNFgS0sw+BoOsf52Tnz2QznAr0HOSSuVmuG/qPFLd/9CTd6LHIVpYjk+Z4UMhO7Y8pvySHkS1Ep\nuiGgGCFPOlAWinkVuNpBYSSNj0w07BwkKdEpkQT8XR/5B8Av1RU/9u57fOKlOxRFjdEWkWBoG7px\ngdtmR9u17Da7PPgEbt6+wWw+xXvF1b//JQ5/8Tdo7sz50HrcSCZISSJkxt4kQnYpC4HAkESPj4n9\n/SUfHF9xtm5oN5cspksWkxmg6Jzj0fFTyjK/qX2IrLct211DjNcsRoIbkLXlGbVTK0HfJ6RUGcEO\nGGORcsC5/LXL9LFESophcJRWsTcRrJp8dlsJPgqsSLjgMUbR+M
SvCsmvkkjv3ONT/+Rf8bf+4s/y\noz/4GSaTKVIqhral2zU0uy27XXYwbbc7fN8zqyzi9iGri3Pu3X2Pn/mFX+Vw3fKzXcfv/MDhyEKy\n+e8l8r0ihEIrgSkSHp/RZzGhtWU+m3J+1bJZb9iv9zjanzGfzri82CCRnJ5dUZYFpAYfEo9OL3Hi\nmscFCHzwoyTgO5RVazT94CiKgqIoSKnLR4jMhOzgE6aE4CT9EChKyUGdON3Aug8YJbASrMoFiE+J\n2mpcjPTO896HT/j7//IX+ekf/gx/82DGn/z9t3jrp/4Yj1++RbPd0fYd3a7j5fuP+Ol3PuSbf+p1\n3reaJyfHrFaXNHfm/Hkf+PxLe0htUcojo3oO7kpJEFIiJY+tEl3XE5Km7XtOTp5w+2jGO++vODlf\nc3TUsL8343C5YLXecHa2oi4ti+UcXViiC2yblk23u94ix5TwIRCDG/vGARcCxigSnsEFClsALqve\niUiZn0JDD1JlaXeSguWCnMIQBX1MCGBaSJwDI6BzfpywmMye73p+7Utv8I8VLH3kpV//bX7lJ1/P\nf8YIffhrb93jcN3y2S98lV//iddycSEV3761xzeO5mOFGpFJjILH/DIKIZBCwHkByrDetcghMp1W\nXK03fPZgQWE0Jxdr/pDLGr2bR3MePy1xPnL/4QV/aDoj+ETXdlRlydXmoxf5Y7hwgujDyM+EENNo\nZNdoZbIbyphRziXHhc7MtsAYNJDyzpZCsphq5lZgEBRWsnORLkTmRrFX6iyGCQEz8ohjivxDEu8X\nkn9zs+bx9pKn3ZbzYcel7/j8K0c8mhb8+qdujoWJBiVIUpOUAVugigpdTBHakJQlCMXOw8PLga/c\n33C+C0wqzf68oDSGftTFHezNuGp6Lq+umFcWoyX7swl1odltGj64/4iyLinKkpPT1R+EaQ/ee5zP\nl5z3IYtDYszK+LHnUNcVV+uGECRSCWQcsbdSjxSsgA9QlJJFLfA+X3ptSjQxEl1gv7ZM9qecbgaS\nD6NkV/DLSL55UPCJmWYWIsZYlLUYbXj3lZp/+uqtEdyVSYpJGqTJF6oPgRAjg/NctZHTyx1PVlvO\nNh2SyK2J5c7+BKur7CgQGfh0sXHszSq2refJ+Y5PvzyBFJlOJ4TkWexVdO3Ag4en3L51hK4dpx+e\nXm+RY/T4MIzUwjRORQRaayRZu6bqinoype0cQz+gdQ5YCQh8CFRVRQo9IkWCF8ynhhA9TzaR2hqc\nyhXl2iduVoob04InVzuIGW/mYuS9046YBC/fzNxPXdisCBIq8zcShJAI5AVt+o7LTcvTyy1PNw1t\nOxCiZ1FZbs5rPvfCkr1JidEZ9xMIuKEnAX0/0HvwUTKtCo4v1hwuaw735zw62yCEYjGbcrnreHzZ\noKY9hzf3ebq57hPOR6JMmZ88soSEzNMDJTWD6yBNKIuaSd0ytB1aC4IWhD6X237waKMpC9iGHoHk\naM+wS4mzzYBC0KYILuL7nmVlEYuK+xc70ghODSFy96Thw8uOxaRkb1ZTVyUkcCEydANtP5BEIrqA\n9wGtBcu65LUbM24sJ+zNp1RFmY+hkNuefmTTKW2AjPlJQNsNdLMpi8Wci9WKy03PzcM587rk7HKd\nsThCIauay06gB40p6msuMoxOIp6j1oVKOWNESdreEWKGhtaTORfnF0hZgIgoneOGvMstybKuYDcQ\nUr4c7+wbWpfoQqRxgUCibfPcrJCSO8sJj69aSIlSwM2JZD4r8EBsdzjfMSkti9IwXUwoiiVVYZhW\nBdN6Ql1nQPazQXAauR3ee4YBRFTPMWrqGd1WS0KA4BPbXU9VzDCm4nzrqKsNL9065GK94/j0EltN\n2AySnU+UvcCLa1rMhr4hyEzCSiLrJuh7gsj9Ae/751gcqTTaFoQY0caQEqjoEWMVGFOirg1tEwgx\nYJVjv4LHa8F+VRBDYAgBhk
…[base64-encoded PNG output elided]…\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "j1I6-iX38Xpr", + "colab_type": "text" + }, + "source": [ + "## Part 2: Data Tranformations" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "AX8VE7yS8g5I", + "colab_type": "text" + }, + "source": [ + "Now that we have a dataset to work with and have done some level of customization, we can move to creating custom transformations. In computer vision, these come in handy to help generalize algorithms and improve accuracy. A suite of transformations used at training time is typically referred to as data augmentation and is a common practice for modern model development.\n", + "\n", + "One issue common in handling datasets is that the samples may not all be the same size. Most neural networks expect the images of a fixed size.\n", + "Therefore, we will need to write some prepocessing code.\n", + "Let's create three transforms:\n", + "\n", + "- ``Rescale``: to scale the image\n", + "- ``RandomCrop``: to crop from image randomly. This is data\n", + " augmentation.\n", + "- ``ToTensor``: to convert the numpy images to torch images (we need to\n", + " swap axes).\n", + "\n", + "We will write them as callable classes instead of simple functions so\n", + "that parameters of the transform need not be passed everytime it's\n", + "called. For this, we just need to implement ``__call__`` method and\n", + "if required, ``__init__`` method. 
We can then use a transform like this:\n", + "\n", + "::\n", + "\n", + " tsfm = Transform(params)\n", + " transformed_sample = tsfm(sample)\n", + "\n", + "Observe below how these transforms had to be applied both on the image and\n", + "landmarks.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "XgJRPN2-9fJm", + "colab_type": "text" + }, + "source": [ + "### Let's start with creating callable classes for each transform" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "tESuP5h18hhU", + "colab_type": "code", + "colab": {} + }, + "source": [ + "class Rescale(object):\n", + " \"\"\"Rescale the image in a sample to a given size.\n", + "\n", + " Args:\n", + " output_size (tuple or int): Desired output size. If tuple, output is\n", + " matched to output_size. If int, smaller of image edges is matched\n", + " to output_size keeping aspect ratio the same.\n", + " \"\"\"\n", + "\n", + " def __init__(self, output_size):\n", + " assert isinstance(output_size, (int, tuple))\n", + " self.output_size = output_size\n", + "\n", + " def __call__(self, sample):\n", + " image, landmarks = sample['image'], sample['landmarks']\n", + "\n", + " h, w = image.shape[:2]\n", + " if isinstance(self.output_size, int):\n", + " if h > w:\n", + " new_h, new_w = self.output_size * h / w, self.output_size\n", + " else:\n", + " new_h, new_w = self.output_size, self.output_size * w / h\n", + " else:\n", + " new_h, new_w = self.output_size\n", + "\n", + " new_h, new_w = int(new_h), int(new_w)\n", + "\n", + " img = transform.resize(image, (new_h, new_w))\n", + "\n", + " # h and w are swapped for landmarks because for images,\n", + " # x and y axes are axis 1 and 0 respectively\n", + " landmarks = landmarks * [new_w / w, new_h / h]\n", + "\n", + " return {'image': img, 'landmarks': landmarks}\n", + "\n", + "\n", + "class RandomCrop(object):\n", + " \"\"\"Crop randomly the image in a sample.\n", + "\n", + " Args:\n", + " output_size (tuple or int): Desired output size. 
If int, square crop\n", + "            is made.\n", + "    \"\"\"\n", + "\n", + "    def __init__(self, output_size):\n", + "        assert isinstance(output_size, (int, tuple))\n", + "        if isinstance(output_size, int):\n", + "            self.output_size = (output_size, output_size)\n", + "        else:\n", + "            assert len(output_size) == 2\n", + "            self.output_size = output_size\n", + "\n", + "    def __call__(self, sample):\n", + "        image, landmarks = sample['image'], sample['landmarks']\n", + "\n", + "        h, w = image.shape[:2]\n", + "        new_h, new_w = self.output_size\n", + "\n", + "        # randint's upper bound is exclusive; the + 1 permits a crop\n", + "        # exactly the size of the image instead of raising ValueError\n", + "        top = np.random.randint(0, h - new_h + 1)\n", + "        left = np.random.randint(0, w - new_w + 1)\n", + "\n", + "        image = image[top: top + new_h,\n", + "                      left: left + new_w]\n", + "\n", + "        landmarks = landmarks - [left, top]\n", + "\n", + "        return {'image': image, 'landmarks': landmarks}\n", + "\n", + "\n", + "class ToTensor(object):\n", + "    \"\"\"Convert ndarrays in sample to Tensors.\"\"\"\n", + "\n", + "    def __call__(self, sample):\n", + "        image, landmarks = sample['image'], sample['landmarks']\n", + "\n", + "        # swap color axis because\n", + "        # numpy image: H x W x C\n", + "        # torch image: C X H X W\n", + "        image = image.transpose((2, 0, 1))\n", + "        return {'image': torch.from_numpy(image),\n", + "                'landmarks': torch.from_numpy(landmarks)}" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "H6ZPTGhp8hLz", + "colab_type": "text" + }, + "source": [ + "### Next let's compose these transforms and apply to a sample\n", + "\n", + "Let's say we want to rescale the shorter side of the image to 256 and\n", + "then randomly crop a square of size 224 from it. I.e., we want to compose\n", + "``Rescale`` and ``RandomCrop`` transforms.\n", + "``torchvision.transforms.Compose`` is a simple callable class which allows us\n", + "to do this."
+ ] + }, + { + "cell_type": "code", + "metadata": { + "id": "ItggWW7H8hnA", + "colab_type": "code", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 501 + }, + "outputId": "24a30e67-24d8-464b-f415-4f302d57cad4" + }, + "source": [ + "scale = Rescale(256)\n", + "crop = RandomCrop(128)\n", + "composed = transforms.Compose([Rescale(256),\n", + " RandomCrop(224)])\n", + "\n", + "# Apply each of the above transforms on sample.\n", + "fig = plt.figure()\n", + "sample = face_dataset[65]\n", + "for i, tsfrm in enumerate([scale, crop, composed]):\n", + " transformed_sample = tsfrm(sample)\n", + "\n", + " ax = plt.subplot(1, 3, i + 1)\n", + " plt.tight_layout()\n", + " ax.set_title(type(tsfrm).__name__)\n", + " show_landmarks(**transformed_sample)\n", + "\n", + "plt.show()" + ], + "execution_count": 17, + "outputs": [ + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAJwAAACkCAYAAABikUeNAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy9efBt2VXf91l77zPc4Te/eexZYzei\nJSERNLQkJIGEaNlYgHAwkMQqu0IZJ5XExOUqVyp2Kq6KK07+CBVwKBAUCANJwIBBoKFBI5pB9Dy8\n7n6v+w2/+Q5n2FP+2Ofc31PT3WpitfSw36q69bu/O5x7hrXX8F3ftY7EGLku1+WbJepbvQPX5T8u\nua5w1+WbKtcV7rp8U+W6wl2Xb6pcV7jr8k2V6wp3Xb6pcl3hrmERkR8XkU98q/fjGynXFe5ZRETO\niUglIlMRuSgivyAi42/1fv2HINcV7rnlPTHGMfAq4NuB//5bvD//Qch1hfs6EmO8CPwBSfEQkdeL\nyKdEZFdEviIid/Wf7VzgoyIyEZHHRORvX/Xe3xWR+7r37hWRO7vXf1pEHrnq9b/xXPsiIi8VkT8U\nkW0ReUBEfvBFO/AXS2KM1x/PeADngO/unp8C/hz434CTwBbwLtJifXv3/2FgBOwDL+m+dxx4Rff8\nfcAF4LWAALcAZ69670S3vR8CZsDx7r0fBz7RPR8BTwI/ARiS1d0EXv6tPl9/pXP7rd6Ba/HRKdwU\nmAAR+AiwCvwj4Jee8dk/AH6sU4hd4AeAwbN85qde4G9/Gbi7e361wv0Q8CfP+Oz/CfzTb/X5+qs8\nrrvU55b3xhiXgLuAlwKHgLPA+zp3uisiu8AbSBZpRlKKvwc8LSK/KyIv7bZ1Gnjk2X5ERP6OiHz5\nqu29svutZ8pZ4HXP+O2/DRz7hh3xN0HMt3oHrnWJMd4jIr8A/C/AZ0kW7u8+x2f/APgDERkA/wz4\nOeCNJFd48zM/LyJnu8+8Dfh0jNGLyJdJbveZ8iRwT4zx7f/+R/Wtk+sW7oXJvyLFa58C3iMi7xQR\nLSKliNw
lIqdE5KiI3C0iI6AhueTQff9fA/+NiLxaktzSKduI5LKvAIjIT5As3LPJ7wC3iciPikjW\nPV4rIi978Q77Gy/XFe4FSIzxCvBB4B8AdwP/mKQkTwL/Lek8KuC/Bp4CtoE3A3+/+/6vA/8c+BVS\nXPj/AusxxnuBfwl8GrgE3A588jn2YQK8A/jh7jcuAv8CKL7Rx/tiinTB53W5Lt8UuW7hrss3VV4U\nhROR7+mAyYdF5KdfjN+4Ln895RvuUkVEAw+SguzzwOeA93fxynX5j1xeDAv3HcDDMcZHY4wt8CFS\noH1drsuLonAnSdlbL+e7167LdfnWAb8i8gHgAwBFnr369MmjmCwHkYR6yjOxz2fDQiHBWJL+AEgq\noTjnaRtLXdc0TY1SinIwwBiDtZZ5VeGd67YrxK7MFw7KRmitERGUUggQQkAEQojdcwFit6uC1oYY\nI94HlE5r2dp2secxgogQQiDGiDbp85FIDLHbVjqWuDi2g0PsztvXHLcAogQl6RiQ9FGlFMYYjNZ4\nH7DOdmdREAFjNFmekWU5WiuyLMMYfXAe6bcl3a/Fr33vWS/JwQsPPPjIZozx8DM/8WIo3AVSKaeX\nU91rXyMxxp8FfhbghtNH4//6P/1DDh8/i6gMJYLWiiCCoBBRJKXoL3DsLhoo1Z1oBO89IoFHHn2U\ne//iIT77mS+yt7tHnht8CJSDISura4jS7O1tsb+/TwigVE6elYgYXIzUdQ0khXPOURYFxAAxkmUZ\n+/v7tG3L8vIYEU+MgnOe4XBMlhXM5hWeiBJhZ2cbrSLOWgRFCJG6bXDeMxqP8J1yz+dznLWMhkOM\nNiilsNbSti0i/bEJRVngvMPaFqM1mdEUmUEbDSjoFsx4OGB5aUxRFFjrcc4SYgAi3ju0hvGo5OwN\nZ1leXuLEiWPcdNONrK9tkGcGUdJ9TmGMXizIq2P+tBj1VYvgQO78znc9/mzK8WIo3OeAW0XkRpKi\n/TDwI1//a4oQheADhIBWClSvbEnh0iNZghhDdyH6E2Oo64Zz585x77338+D9D7O7s4cP4KpAXpY0\nVrG9M2e0tASSAwbvPbZtcTZZMBcjSmmMSRe9LMukaCadKuccVVUl62EyRKVt+NZS1XOcd7TW0jpP\nURRkmcHZFmMMIooYoLUWYwTnPXVdo7tt53my8E3TYLrXvPdkWUae51hrIYJSOj1EESN4H7rFqBeW\ncjKpqWYNg8GAcjjAeYv3FmM0eZ4Tg8OYnM0rW0wnU/b3J2xubvGaV7+ao0cOU2QFxmiM0d3iDjx7\nBNZfjxemHN9whYsxOhH5SRJDQgM/H2P8i+f/DrQ2YJ0nikICxBCIEoFkQUQUIpJcgup9TLJ2IooQ\nApcuXuKTn/gsF5++zJUrWywvL1O3num8pbWRcVmSFyURjdIlJhviQ4PzluAsznmiCEVh0m+JoLUm\nxoh1lhgCVVUBMBwOk8JpoSgVEWE2mzKdTYkIonJCSFYphoDKkoI2dZtcmjYorZMVtRZEyPMcpRSu\ntVhrMcYkhQeyLCN21leUQhtDkedAREuvhDnBO5QSiAGtBGMMddNQ1RXGaLQxTOdzonc427K+vsZ8\nXrG3t8/u7h5aNC992a2cPHmSwSAp3V/2pYtrfVXkExevPZ+8KDFcjPH3gN97oZ9XSlGUQ4pyhDI5\nKoIiguqPRvXRSjLfEhexkNZpBW5uXuHTn/40X/jCV7BtYDAYUBQlUTnm1mGyDBcDhkhtG6rpnOmk\norUtIfgU0xQ5w+GYwWCwiN3qusa2LfV8hveepmnIsowsy9DaEESwzuF8IMTAdD5BKUNRaJqmRSsh\nL4p08bXBm7TPKEWeZWit2d3bgxhRKi0cpfqYKlnZtm1xzqGUSm4tgArpvGgFWqWFoY0ieE06RYJW\ngvMtVevSooxpcQ7KAbPphKa27O7sU3aKNb+8xWg4ZGl5hIhw5sxp8twDJ
Osc4192n5IsXAo8Ywqn\nn0fprhG2SMT7Fu8tURSIIsaQ1EwUIrELpJPpjhIQhBAi3nvOn7/A5/70c3zuT79E9LCyvIrJDCbL\n8E0DIiitMEaxP9lDmwzbOpqmxdqW/oQlaxYW8VMv2miyPCc0DUolKyFKgUhSSNdSzec0bYP3Aa1V\nShYEjM7Ii4K8s1BFWRKJBKAcDoCksFVVEUmWzGiTlGYRuymctymGK0qc9zhraQQGZUFRlBiTYilH\nwLYtWqcwpGk8vjt3EqBqGsqy4NDhI0x391FKU+QlRZExn8948oknWVkeURQFZVlw6tSJ5IKBlEWk\npCMlQP0Zki6uTkr3LCHdQq4JhYvB45o9opsT8GidJcUiXVREI6guMYuLg4tR2Nub8vv/7sN85ct/\nQTVzHD56iNW1FUxesL8/p24Cgsa7iCo04+EY5xwWz3hpQNMonHNorRkMBgy6TFZEqKqKtm3x3tM6\njwsRXRTkgwH5YEhjLVs7W4TgU4YbBG1KIFmZIs9QSqFNxmA0JjOa6SRZlNZ6jM6TNQgR7xwxBIZL\nSzRNQ9M01PWMsizJswKT5VjrqOsKo8DkijwT8kyhdUQpUEEQpRGTISLUVY0LHjMssNZ2iuvZ3tnF\nu4BRhkCkbhpOnz5O04zY3dnj0UeeQEQzHA5YX18hyxVRNBEhpIABUV1C1xk2AJGUlPw1sHB9xiNE\nJOlVnyOkpX6wwkirVeuMqmp4+KFHuHjxIjE61jaGjMY5okJnbaAsR/hqTl3XhBAoy3LhLnUHXegu\nliqKkizLcM6li962OGsXriQvCvI8J89zmqZhd3eHpqnROiUZPYwigFYpOO9f995T5MkVt9Z2F4fF\ndgvbkuc5RVHgvWcymSzgkywzGJMTY5X2BWFQDtFGobWmbS15DtpkQERn3etW0LEPRzrYJyS32rYW\nXShMltG0LU+ev8DqyjJ5XjCdTnnssXOsrI658cYzeB8wWdY7TrqNdZdDECIhhi5heW4AC64hhUtm\nWV+VlfKM/xPAFEJEotC0LV/60pf5xJ98ip2dPU6ePsHGxjImy9nfn4FVRBIeZ60lhLCIhdq2JYTA\ncDggy0uyHKy1TGdz5lXC7KTzG6aLpXqlc85R1/XX/O0/38dcSilMp3BFUaCUoqoqamGhRL1LKoqS\npiloqoroA9EHxsMRdVXRNjXRO9qmJniPVsL62hpFZjo4yAOxgy0CrW1wznVYoMc6R4ggoccOPSKO\nwWBIiBHrLeNsiNaK3b0p1nqWx2NEFLPpjPNPPsV0OmM0HqFUQGmNdPveY4f9eUpmLv19vrTh2lA4\nkbQ6YgpOlajObGvo4rj+mEQpILK7u839993PZDJjaWmFPCsJUbO5tUPTemYzR2sVvrvAvoMgIEEb\nolQXM1m0TmBw0zQLy2eMIc8TrBFDxDm3UNZ+e72yXa1weZ6jevC2U1IRoW1b6mqeAnnrUlypFEVR\nYLTBKI1C8NZR5DmalGmCYLRiNBygVYJWiBFrLZnJyYuMuq6JgHUhvS/gWte5YY/3aTt0cW+M4LwD\n8czmc4zWOB+oW4epGtqmYTQa0LaWra1t1tfXKIvywH12FyPGSOy8kPTwCJEYwrNeZrhGFE6A4JO5\nz5RGqW631AHgG7uF5L1jOpnw2KOP8dRTTycXaAN15Wldw2zeJkuIEKLH+6QoIQS894QQUmCeZQnp\nB6xz+BDI8pwsO3CNRZms03w+JwRP27ZdLCTEGDvcLHYZ64ELjTHibVJGSFgaQF3XKIGIkOUlWhuc\nc+QdxBFjiuWC82QmIxYF3juIgRgC2qRMsbVuEdSLwP7+focZLkGMVFWFUkVKrHzoQpQDt9o0LVmu\nUVoxmU1YXlohywcUedlhkBkrKyucPXsWYgJ3Y0zWuZcFCCwsrNsBMHzNx3DSZThhkeWI0kRJFi52\nqyfgqeqa3Z1dHn74Ufb2pgyGI7TJm
NeWdtIiSkDFxQkSJYsyT+wsA6T3QttinUOJUBTFAmLpy1jB\nebLMEJxLeymgO+iirzqI6mNA02WLJlkzW3XxVZswvE7ZXfBopchMsopV0ykxAZGAxIi3NWWuMTrH\nurRwptM9KjMHBIXGZJq6aVJ4EAWi0LTJcroIeTlMllx7MpWyfmsTlqgEyrLE+ZYYBO8CeVnQWkfw\njiwTrLNU84rkdhIuiqhF9qwWcTVAgmrSOZJnrTz0cs0oXIwJwCR6fPBolaGkIBKI0SEKJEaapmZr\nc4cLFy7iXKSuLahAawMhKlRUqJiC9NjVPHvLBgl26DE8EcEoRZ7nlGWZLp7ziE7YoEQQn6yuCw4h\nZYNa64Ur7d1plpeU5RBtDM5H2tbiY4JIonMYERQRHzxFbsgUONtQ24ira3IDQsDoiI41mTJ4Uego\nBMD6COLxIQKB3b1tIinoDyGioixKaP35TC67h1E8WmsyrcgzQ5HlEBU+Oqx1eDehyHJirpL7FLBt\nTYyBQADd1atFCAdpafobE1RyUDO+5i1ccjtX76gPHsSmi6vTymrqiv29CU899TST/UnCzRDyrCAv\ndKpW2ITS9zGWdW6RlfbS/04PrvbVBEgrXynVWVpBiUpurlPQFONpsixBHqmspsnzLMWBbcNkfw8f\nHO2sYlDkxBDwRIyKGCMUGlRIiYtrLBIcZWFS7KiF6EGJhiBkgI/gTbrQFo/vsDtrW5xPiUFmMqLJ\noIslAYwxNE2TLFuMFEVOcLZz8bJIgqInhQyqYbhUMp1Yhkazsb5OXVVUsznLoyX8VUB77z5TUtc/\nf778NMk1oXAiPVsjWSHVgaoxWoL3RJJVsY1le2ubxx49x/7+BOcjJgvEqEBpmjbBGX1g3ytR7077\nDC6EgDFmoXB9TNZnlAs2R2flnHNE4kLZesumdUpwyrIkEPGuZTKdMZ3uI9Gjgkf7tD0vkWJQkmPQ\nCqJvCcHimpZCFIXR5LlBkbJHIXSokMILoDQuJJdb+YARRdRxYbmlQ/iLokjH6D0hRuZdKa7PoMUY\ntJKF5betJWoBAkHAthaFwlmHFsV4OGJne5vxaMx4vPQ1TJoYIxJZWLwXQua9JhSur8unemByUYGA\nJhKDp20ttnU88vAjPHj/g1y5vEmM0p1soa7bxLqQBOL2cURfCgqdsvWMi17Zri6P9RfEWosS6S6M\nJnqP0gclp/5Ehy4WMkqhldBUNQGhms/w3qKiJ0coREGmsCqitGB0hpGYsufGE3ygHOQMi4yy0Aie\nJkZiEERrnPIEUhIkKsVNXgIhtOgoGKUTWaBpyEvdAbKC847J/j7OecajEVopfAetjMdjQHA6LM5B\nipEDbdOwPFrGO8dTFy5w6PAag3HJ1vYWIorRaLT4zmJRdgv2hci1oXCwSBautkwhpIJ6U7d4F5nP\n5ngXsNYly1LkWB8XJaRAZDAYkOf5IjNsrV3AF1rrbrshubOuwtBbud71jMZjBJjP5rR1nVZyVwjv\n48H+O0pIZa2mSbFb09C6Fi2O9fEK68tLzOo5jatoWxKdIXhMZrBBqF2yUDpGsg69D1qlclSHO/YB\nUmZyJETsPGW7WgkmM4SQiA8heOZVtTi2ECODQZkSBOcSAyckFktrXTreDjtTXd26h4fWxiOC95w/\nf57a1Rw/eZL9/X1Go9HXuE8RRcS/4Mt8jSic4ENHXIxhgT+F4FEq4eSz2ZwL55/i8pVNXITWB6J3\niDJdrS9hQcYkFkbrHE1d47xfZJ3AwrVa59BKL2ISYqCtEy1IdzXMpqlo2hrdAblGEhHTRyAEXPT4\nHlD2jsa2CB5FQEVhVOaMSk3TRIKNRBMJvk3urm5orCcGT54JWabIjIaoIIcWi4/p91TvviTVLEuj\nyYaptNY4n5g1MWCbJlU1QiISFEVODAkm0VpT5AUhKJTOEJdAcZAEDgNRBIkKbwNNa4lK2N7Zphhk\nD
IcDYggcOXIYkYOsv08kYjq5CzzuueQaUbgUB0QS7JAyLBCjcS4wmU45d+5xLl68xN7ehKgMxcAw\nr5tkEa1dlF7qplm4z7YrTamrMCjv/aIgno9yQGibBoJP9UwfmOzt41yLbWuILSnGNtQBJGpC8Hhr\nCdFB0IToCcFB9AiW3GhKk1FkENyMTAV0EEKAojA4BdW8QkXPyAjDXMiNgpDiRdfto0IolMZkBZVt\ncd6jRSUXryJChg9pkaEUwacLHqUH0AGdsk6lFNZ5hsMhOisQF1BKA54QPYSYWMpRaFvHvG6YNy3S\nOqbTKdn2Nmsb612cfUB6PVCvjqP410bhOjd3UKOjw9Iie7v7PPDAg+zu7uK7WEpEYbTDx+QGWmuR\nDm+7mtpjlcJ1pS3nkhtJuFwqBznnCd4hMZV+EkSQaEWlURAzGttgbUvlLUql7DTEluAdmgzbNnhn\nyXONDmA0jDPNwGjwjsJoRplm3tYMBuN0wZuaIIrBYEAWI66qiVp3kJDC5BlaMkRrTFEyiAOqtqG2\nFqVyGmsxWijyHB9bVFDELtaMwad4L0bKolhw6QBGo1E6194vMvcUk6VQQSvpLGTEukDb1ly+so0y\nOfNZQ123FEXBQZFeFteqp8sr9dzZ6jWkcKkkEhf/h4VCTCZTlNJ4nw5QdWUpbQy+7dJ8SSd/MBgA\nUFUVtovfekgjyzKgL2J72qZBSVqnQgKey8JgtMI2Tao2ZAYVFFqEefRE6Yih3iPeEYMnI1JooTSa\nvMzIjWJc5BSS+g2Gec6wGFLblrqtMQJ6kFi84/GYQWYosyxlvV2sGUIkIiCJah+VIs81edOmGNZo\nlHUEARcMLkRiVDRt17vQUbx67l4fUvQL21+V3aZkyXVVGI2oyGQ645FHz3H86GEef+JpQlScOXsT\nTWMXipplXamtk76K+tcA+CWh2c94wftAVTVsb+9QzWu8SwroOSBj2a6EZLqss2mahevMsow8y3Ad\nNtdnmt77rlwUES0YrVPJyUNmEqkxSqDQhrXxEBhQ+cjOvGLaWJq6QdqaUoRBkTMsS4Z5xvKwZFQW\nFJmhyA3BeyQmenpmMopBSessbWtp2+aAyk5KANQC35NkaUO6hB6IWqM0xGBxNpIbTYgR7T2ZSdl0\njAatPIH0XgghueEOkyzLchFq9OyZFBcmpUmK2LFJmnTOdssZbdPyyMOPc+zYCW688WbKArRWOGfR\npkcEhBj6FoDnlmtC4a5GqZOknfY+BeREcM6jtSFH4UVT182C9aGNYT6fo41hOBwuTmzvOoQELPdZ\nqFIKFRPOZZRmUBbkJpEes0Lh25ZSw1JZMh6UaGDgwftIYx2tdwy14ejKmCOHV1keDVkaDDi6tprw\ntEwjOuGCsUswNJFyUFKUJaISQdPaRLwMISJ9yYyutyLzBCIuBFrnqKxDgkdiOiYXHKDIjMG6gPIB\nT7JWuVbYDjv0zuGUW4DhfW9ENZ8f0Kk6hQkhVWxiVCgRQsc3XFpaosgNT124xHQyZWlpiHWW4B2I\nxuguzFB/yWr8JbkmFG4hMfadCgkLi4GmmmNty3BY0raWxkbmjcO5Oc55VAd1KEmWalCWXaeSxVlL\nZVuaDpFf1AA7pdMCWa7IM02R6YV1qx04EbKsIPhE/NydN2ztTambOQWRG48d5YajRzh+bIPDh9YY\nDUqWxyNUly8jmqa1XajgIQRMptFGdz0HaR+auqJuWnwA38WYzif3Nm9aIoKxFh8qnHWID7ja0viI\n14YgBqULtNcpI+/iOG3MosQ3r+apGSamurTzHus9PgaM1hitUmbfNAQfUtdcZhYY5v7+fiIoGMPn\nP/8lXv+dr2Hj0DJ50TOTY8p0lYIQE2v7OeQaUbiDuCxlkKarhaZ4w2jF8soyEcXm1l5H9EtocUx9\nfhiT2gtDR7/uv5s6mlInVmayBYSgJKb4jECmhSLXyU1136mtYxY
DO1XDfDbHeiAGVvKcM0c2ePUr\nXs6Np06wvr7KaDRMPRFZajhJVCvBh1TPtc4m5F6nNg3V0bQJATdIbXzWugUbxVqLdY68GCTKUTUH\nSSyaprG41tGEgAtgCeT5EGUUipZAXHS99QCttbYDnVO7pHUOhJSAdVT0q8Fb2zpiiKmnVQdMF4Zs\nbm7x2T/9HCdPH+PQkTsAlY41cSwQFFFFpDMCzybXiMIdJA1KKwIJO7pw/jxb2ztsbu8wr1qaJjWr\ntK1dAMQhRjSSeGvdauyrCX1NUWtNZgzBp2BZJNGzyzwjeNunw2iVEVFY65nNKmZ1Q9U2IJHVvOTQ\n0pAbjh3lO+78Nm666Syj8ZhBOepKcR2XP/lQiOEgjurjRyLBNqi+CuI9Ic8IPlGGjDEL3l3TNLQu\noJzDxdRN5rynLAt0rvGVw0ePF1CZIlMKGywS6LDLdD6Tx0iYWwjtVY06KUtPcZs/KLwjuBAINjFk\nTEi0K62SJ5jNZzz80CPccONZDm2soTsN6l1z7z2eS64JhUsKoomQAmrfcu9999E2lv39KVvbu+zt\nz6iqFpFskXkCi4Oz1gHxaxKDHkk3xhwonDEUeYYOjtwIKjcE2xLaljZ6ah8R5zmzusYwz6mbmsF4\nwKkzx7nl9FluPnWKo0cOoUuDZBllsZRa/XyyqiKJsyf4RX24xwgNAdfoBX+sbQKEBOYmmCbrGfWp\n6tK0KQ4r8mSRYmA5jDhy5BB+e4edukaJwtk6YeXRQ/AoDgijfR0aEjWrJyf0JcAFkbKvHkiiOoXg\naa1HKYfWlkGZM14aYW3DV796L8dPHOdNb3zDYtpAfy36Gu1zyTWhcMDC/Afvuf++B3nooYcYj1bY\n2d3n8uVNnBd8AEgFeCXqa2qlMUa0VouUvMfdELr+T4NWcVEQN1qlHgOVOp0yLTRNGguxOih4yfET\nHFlaQqnIxrHDnLr1FtbX1sm6OCwflIjSaIHgWryzqZFFKQI+MXiVQlQqmdnWIggmS30H3rmElUmK\nfbLsAJyGdPEjiYfWdDXbPDMMyoLDa6tY77HBU7mIrSuI6fOhi1VD13eKUnhSJ1wIvfVxXd+r7vpO\nWeCbESHElK26kCjsMUQykxCA8XjE/v4en/rUZ7nh7I3ccsuZr8Hd+jrzc8k1oXC9Kfbe88STT3L/\n/fdhXeDQDUeYzmqKcoj2gnWetkmmPsszXEeo7OvGCwXslM11F1W6Fd0nDME7lAbXNASJlJlGkUa6\nm+g4efgoN5w6wpH1VYqlkrUjh1gdr6OKAj0oKAYlWhRGBILDJcSMtq3prhHBR1AqJQmRrhcgLIrr\nKaPUiEr7RQgoBcYoRAwiyUp4HzBtS0sqlI+GQ2JoqMdjWufZ3JukpAS6LLE7GV1hXZlsEcemLNQu\nWhn7Ml+eGYw22NB38HcVK6XwPhCy5Hm8dywvL3Pq1CnOn3+Sj33sHpaW3smxY0cW1u1g5sqzyzWh\ncAlzE6b7cy5fvMz62hpHjxxhfXWV/e0ddlfXubS1iwe8pNJN7GFGpRaxClxVK+1A33Iw6JKFFFtp\nVzHIhPXBGKIwq2cYKShiZG0wwIwL7rj1Zo4cPsKhI4coxyNGS0sYMSijGZSJhu6dT0G1MiiJiFG0\nwaaEJIKEgA89NqXIdNdvm3YSJBE9sT6RG1VM5SWliG2LltSF5UMkMxnGebxP9eZBLmysDLv+3MBW\n3KdqWsR7tASsT32vXjQxKmJwiybopm2IAUzXg2tt4hxqpRGdE2wLCKZLsABcAB+F4XDMiRNHqeYT\nVsYFl84/zsXzT7G2tEw5HCa29dcZyHVNKFyybvDEk+fZ3dvntpe8hNFoyP7OTjdQxtO2Dp0bQoe9\n9XCP0qongy1ikas5W8SIdx6Px+DJxbMyGrGxNOx6DApGg5K1csihpRVWlgYcPnSItbU11tYPYYqS\nrMhTv6cSIrq78F07swiIQWs
hyyMxenCO4BqiNqmGEVMAXxSpJNVPbdJKgyRIQikhihC9gxCQIASf\nrL82umOqRGJoyTNDbh0rg0E6NuuQEIguEEKkbR0+CD4oVEdld60ly1PtOBBTmKE6xotKQLH03WrE\nRb9F0zT4ECiMJkRwruXUyWMcXl9le3OHP//KV1hfW+PMDWeJHezyfBbumpjxmxpDLBcvXWJ5ZYXR\n0hI+gimGzKqWvf1JanaxLrFvvUWIFEVGZlIpqnfJPU/rgA8XcC65gxgCCsizjOGg4MjGGkdXV1kf\nL7G8NGJtY5XDR4+ycegwRzQsZwQAACAASURBVI4d59Cf3cstP/XTLH32i6isoBguofMSFxVBGVAZ\nwYFg0Con0wVGFQga23VflWVJluUJrnEWiQlszoyBCJlOM0diTK42dp1raZAoHLvvYd75M7/MDY88\nASQX660nOk+GMNCasTGMMkOukps3SiEx4lxncVWH/ZHmlwwG5aLxp6dZAYvyX8/568mmkEgF86pi\nd28CohmNlzBZRmtbtrc3ca5FVCJd9ETaZ5NrQ+EQ6sZispwTJ08xm1VMZzVPXdri6cubTKYVs+mc\n4D2DPCfTKeZaWV5Cq37ozEEhetFXSneA0ScqU4eBKSIr4xEnjx7hppMnufnUSU4eP8qxk8c4fuIE\ny8ureB9Y/+CHKB49x+rP/3KqNfqAFk1mcpQoQkfVVt3sE+nqc6klsVqU36SDNOazec81xTtHW9cJ\nO3SB8pOf4yX/3T9n8OkvMJvNmU6nzGZzbvrdj7By8Qp3fPwzi21Z61EdtTtTijLTjMuMgdEUSlOY\nBOaKpB4K53rCQp9RykLR+qzyaiZNH++FEBaN3yGC62Clza0tEEVZDllfX+fy5UvUdUWMIWXoz8M0\nvyYUjhix3jEYjdje2WU6m7G7t8/jT15kazutKBAkkkBWrcgyw6DI8c5Szab4q3pE+0769L8kJohP\n9CGt0giG8XDA6mjIxnjM7U9f5n2/+JucfOAhdvf32drZYWdvn0fufheT0ye5/11v5/FHHuaRB+7j\n4oUnUB/+MMd+9L8g/9jHE78ttHjXEoNLDiv4BUUqtSB2OFfCSWibhqZucNbiraOuao7/+m8zPv8U\nJ3/z97hw/gKXLl3m8uXLfOa1t3NlY5VP3PlyJpMJ8/mcWVNz9vEn+cDv/BGvuHSZojCURcYwyym1\nQQN60ceRLFXZVWB88LS2pZ8MBRyUAUOPUx6QTRfvAXVjmVcNs3kNolLJra7Y3dthe2eLEBMPz3n7\nnJf6mojhrHdcvHgZUYq9/Rk33nQzVzZ3mNUt1ieaDAiDssRby3w+x2QZly5doq6qrv/hIH5buNXF\nL6RqRbKGkUwrjBJKk5Hnwis/9kmWn77Mbb/7Eeavex1L4xWWVlZwp0/z4JvfxP7+Hn5/l6efvMRD\nTcP7PvgbFJeusPp//SLn7rwdnxmUFmIMaC04W+O6pMWIxnaMFm8d8zDFto62aairGu88Wzu7XH7V\nK3h92/LZ19xOBAZFYurOX3cnH73zDlrnGc/n7E+mzGdz3viFr3Jsb8Lb/vwBPn/Xq0EChdaUWUbl\nPBkeZR1t21Dkya3brt1RdZOQym6sxHw+X2Bn6iqrrPVBf0cUxbyumEymBFdz5NAhJpMJJ45vcPjw\nOlVXgjSZWSTKzybXhMKBsL8/o8gKtq5MQIZcuHSZ3b2KurE0rkUBrW2ZTiY0radxER8afJfZZVmB\nEhbAr23bjtSZ8KSBKSiUYaA9QwNFptGZxobIfe94C6/86CfY+/Ef4eTZU5hiQDEc07YeyQxZJjTD\nIdYHHnnoIf7oZbfwFue4/O53YKsZ2Iylz36eQ7/ym2z+nR+geeWt5EbIVWpEdk2bIJh6zvIXv8rx\n3/ow97/9DTx2+gRbu3tsb+3hl8Y8+DffzdrKMutry6wsLWF0jrOeeV3ho6cclAxHQ7TJ+MOX3
cx3\n/8VDfPT221K/g85QJiLKpuQmJMpV8GEx8DCEhA8WRUnWudSyLBcjyBLE5IDkGkNwC5pYlILGBiaz\nGgkZe9vbjAcZG+vrlOUwNa93/bH6eSoN14RLTbNoUyWgmtecP3+B7Z3dFKvoVKO0tmF7a4u6rshy\nRVlmqawigaLI0v8d+4GrkgYkDZbRBAa54dDqCqsrSxR5ToiRYjgg+963c/5n/iXNG/6ThI2Jwjmw\nLvVzxpDm1W2srXLqxHE273gZv/w33sl9R9eYT/aY7W2z9kv/hvLx82x88DeIIWBMhvMulY+8o6kb\n9vf3OfJ//x6j809z4+9+lK2dbZ66eJHJ/j6DomA8HDIapZ4M3dV1ezaxdy0SE0C9trLCzp138DPv\nuovPr45pmjadxzzDZCbhbyTsD8IiKWs7rlxPgeoTg961Lrq9OjBRKVk0PutuxMRkf8ru7h5XNjfT\nZ7Ti3Llzi31I4PJz11KvCYVLqb/pJgcF9vf22d/fI+JT8AtonQbsDcqc9bUljh3dYDjIGA0ztAnM\n51Om0ylV18fQN+y2rSXYltIojm+scnRjjTsubXP3v/41bnj0CY4fO8bS0tIiG9Mqg6iYTGY0lcW2\ngdlkzny2z3Syx2hUcvrUCU4cP4oQqWYTJnu7PPzONzI/dZzNH34P4/GYfDCiaVuatsK2FW01Z3d3\nj8+97g62Dq/zqdfenmKhCOPRkEGZo7VQlCUuRibziqptcTGgsy7A7yjwIQQGZcna2hoxQts03HFx\ni//h01/mNVs7JNi2G5Ma+2bwg661Hgrps9R+REVPUE1IwIGLTfFnKpuFbjtb27u4wKKu3XuWgy6w\nZ5drwqWKwOrqKk+ee5Ld3V2y4QjbWhpdY6GrAgg+BlaWVzh56jjOpdjDes9kOktjCkKamJSZRDFP\nSufIEdZGJUfXxhw7tMqb/+gTrF3e4mV/eA+Pvft78N5jTEaIkdp6msrTJI4m1ayirlpsMyf4NAgR\nbRivrOC9YzavyJVi91Wv5P43vZ58WDIwJVkxoPUNRIvgsU3LdDbj3JFVvvTet5IVAzIXGI+XyU1G\nVg4IyjCznlwZinyAdZb5bI6SiBFNVDlEB/jFhV5eXsLHwPc/9Becmc55/5MX+chtNyRCp3RDRGMf\ndvRegEU56uqatNY6EVYJyf1qjc6yRAODRGkvckQCm1vbbO4cYnVtjzNnznDo0MZCma/5WqqIYm1t\nlXOPnEPkYNVUsymj4ZCyKDBaIGQQItP9GVlRsDReZn8yJbiAVqlby1qbKDoqgZAxJjbtIFPkeJYH\nBfd/71u44yOf5PIPvveqljpwoaVRjmBytvembG/tUs8m7Gxeopps421LmWUc3ljn6OHD6KwgRp9w\nrjzHi8GJAZ0zHK+g2n3m04rgLdE7pvOKyXxOlhdE54lBKMsBJssJ2jBpHHubeyyvb6Daiq3NywTv\nGBYluVaoCJnuGHeShvDECFme829vO8v3P/Q4v3H6OJnRacK5NrQ+9cBqOYBCeuvmnFswaiCFNsEq\ncmPS4o2R6AOZ1khMXXJtXRFNQgnmVcOFCxcYDgpe8rKXpu11EwmeS64JhVNKsby8xNGjR3BeaKNQ\nO0v0QlkUjEdDptNA8NBax+VLm4jS+Bipm5a29kQV0LlerNQ0zDkV9AeF5uihdU4cOcTG2hrNzTdz\n7zveluqhMZWLqnrO/nzOdmO5uL3H5s6EtnUYBVeePo+rZ7i2YWk4ZF631NZz9PBhjFHkWU5QOVHl\nmKxMQ3i6GNJ7i7NpouW8qqmtpfVQN4HhYIxoYWdasXH4GEePHGF/NqeNMJvVPHzhItE7yiwjV4pB\nnjMqC4Ya2jbdZ8I7CzFw/+mjfGljlWlVQ9UmEmUHE/UWx5i+FyFZurquk5J1i25xr4aQatS6a0QK\n3ncjyBTeeQaDgrX1tQ6LK9nY2PgaEPmar6UKoFTL0lKJ1
iVtVdO0luXREnlR0Dqf6oMhgZ70RfMA\nVesI/X0cfJoLTEgrM4b0KPMhy6MRq0tL6T4IWYF0J9O5gI+RSdXw1OYOD597nBP3Psjfv+8BfvWW\nW/jj8RDbzDFYvmt3wg8/dp5/9/LbeCpGdIgsDQfodY0yGXkMmOAxocV8/KOs/9wH4X3vZnLzGWZt\ny7xpaCuHdS1og40GpGXjzEtZPXUD0/mMK5MtptN9Hn74Ieq2oW0bYnCMioLcKFaXl1gtBmk+SWyQ\naBFcGhaoFT72teaOowZodXCms7wgisJkqeGoaRqARVO4CyG5W62SP46J3hW6aVRKIoPBiKOHDnUN\n6AcJRy/PF8N93aRBRH5eRC6LyFevem1dRP5QRB7q/q51r4uI/O+S7iL4ZyJy5wvVOKMVg0HJovdH\nEmnQe8/u7m7KsrpHVVXdQOhUl9Rd4NuvXGDRvwCQ5xllWaZ+h0HJYDggL/LFVPIrV65w3/33c+99\n97E7b/meL/8Zx69c4W/ddz/Hz97KaPUI68dv4EeevMSZyYzvu/8R2qpitr9HXXdNxmVJUQ5QWuOc\nZennfon80cc58qHfxrs0ZiyNrLBUdY21yaKcPn2GwxurPPzQfVy4eJ7bn7rAP/i1X+PNLmBGy3zX\ntOJn//x+7ri4yeWdfR5+4gIPXXianaqh9lC3HgkaLYZBWZLnWRfoS0cf112pKfWMJneX5hfHRUKR\nLOB8Pj+Ya3xVBQIO5od4H3Ddvp84fpzl5eWu5touzv3zyQvJUn8B+J5nvPbTwEdijLcCH+n+B/he\n4Nbu8QHgZ17A9oHUTleUOePxCN2BqCiNj0JWlJSDIbFboYnw6Bbc/T5QvXqEQxpckzK1PM8YjYaM\nx8Nu2vcBlaltLZevXOGRRx7h0cce49z5i/zRd93Fk+uH+Njb3sVLXvEqYjZkvH6Mj73xTVw5epQv\nveVNrC6N0RJRJmN5ZZXhaEyaKZa6r3Z+/IdpbjzD0z/wvVTzOa61iX0bEwN4dXWVUydPMhgMePr8\nY/h2yktuvZk3f/zjnN7e5v0PPMBk3vKfPf4Et8znfODSZcbLa8yqlgtb2zx5ZZO9qiEN08wwqkhu\nVBLbV5HoTlqlBpkizzFGY9uG6G1X2+3n3h0E+94nir2gMDrdGimEhOeFmGb5Nk3N+fPnuXDhPHVd\nU5YlWqfx/leXFZ9Nvq5LjTH+sYjc8IyX7wbu6p7/IvBx4B91r38wpmXxGRFZFZHjMcanv86PEGNa\nTZPpPhDJ8gxRmqqqmUwmaKPTqFIiuUl1QVHJ1Lcdkt8X7fsV631IpawiZzDI0+pX0unFwUze/b09\nZrMZ3/b0ZX5s80Hu/Vvv5+de9VNs7e1x8Z572N25wolDKzxw4w3U33EnhW/5ts9/gdd/7os8/H3f\ni37tMibP0sQkkyxB/frX8vjtL2Gys0V94SlC17aotWG8tMTS0piqrtna2WO8ssrLXnELp2+5jU+9\n4528+nd+hw+94tu5fOECv/LSl/OjDz3Ab99xB99dV9x97gn+j8OH+HymWc41xWiYWCYkYFcrSUmD\n0ZgoZDrQj8OJMdDUFVoli9zDJP0gbaUUPWc8EQgSoJ5YIB4fHJnJFp5hb2+fl7/slkXfaz8i7epE\n5K+scM8hR69SoovA0e75c91J8PkVTgRtFD4khRqPR8zaGa5N9xWICEoZ8sKQZ4ZMQ9O21I3FWr9g\n/PYnsHcV1lq8dMVrnciQCRcOC2tYVXPyvOD48RP82Ge+xKm9fVZ/80N8m8n48Ou/iwdvvYXJ7pC7\nti/xhns+weff/AaeuPEU3/GZL7B6ZZOX/sFHOffe9yJaEX1MIx8kUn7qMxz7xV/liXe/Fbexkkaw\nipDnGegENezt7XH7pU3e9m9/n8ff/yNUp25iXA4QYJgpXvGSW7l87Ai/uzzmvV/4PGVdc7iq+MnN\nLf7h8cPoGBAViRJof
U0M6S406UYhCu1jWlw+EpxDGYVWqTtLKZW6/LtRsX2/ajqHqRG7qppEkwrd\nUMIuwfI+UpYlJ0+eZLw0ZmV1ZUEO+Hpu9d8b+I1fb6jrc4iIfEBEPi8in9/b20+D/0Iacnzx0hUy\nky/cYp6nLvUeQ/Le03YxXIhhQbpsbbuIP/omYKUVWdcC2GUnybop0kgEEY4dO8qrXvUq/vTNb+Cp\ntVXatuXo5Uu88zOf4Pj6CmvDgtd/5GMcunyFOz/6caLz3PeOtzI9dZILP/gDKc6J6S4GIaQBN+u/\n9GuUj5/n5G9/GHfVxKXBYECZFxhjOHXyFG/9wp+xcv4CZ37lV3ng3vu5/f/5TY5fucTf/OLneO3t\nL+PI2hLf/6UvcnpnhxADjw6H/PrZ0xzbWGN1acRoULK0NCQvUod9ZgxFlqU6cWYSs0YE27bMZjOa\nNnV29dDFfJ5uKdArntI6VXe6+37V3e0Dmrbp4ruGPM+54YYbOHPmDBvrG2xsHOru/JMvKGHPJf9/\nLdyl3lWKyHHgcvf6C7qTIEC86m6CL73tpqiUJmrBKY2LQmwl3SUmRpzt6oIhUFcWRepy8onyS2st\nKTk9uOlbr5xvmc35J3/yWR7fWIIbz+J1QQwe31bYeo7ScOLkSUw5ZOvuu/nkd7+djS9+iW/7/Q/z\nwN3v4fiRI4zLgq9+z9vJP3IPD77tLZw5dSPNy1/JV7//PeRFSSZAbCE2RFKb49Z/+j42PvhrXHjP\nWxerPjOG2BE2jRgynfHA972Ll//+h3nih36QY0fWuO897+Ylv/Vb/NG338Glxx6kdQ2/fvMZ7m5q\nfuO2m3jw5hsYAKu5QcWAih586u5qUXgf0SgKpaiJGMCgiWgcKYPMlcbk+SIO7u/rVRQFLqTJmjiL\nzkwa9BMjMVoGo5yjhw5x8tgR1tbWuOmmmzhydA0fY4fdpTlz+kWYLfLbwI8B/3P397euev0nReRD\nwOuAva8bv3WSAn2LtTXORprGL8bZW+uoupH3RI9RoLVBKU1rHf2d7haMhw57UlrzX13e5kzTsPyH\nn+T+t9+1UEalhNF4zGg5Ix+M8WgObayztrJEvfEWvvqOtzIaL3FaFHZjBXt0lS+/5S5ypdkoCqJR\nFGXBYDhAYpq81GdyAsxe/1qm3/4KpnvbmCcuJIvS8ee00Wl8fltz6RUvpXnLmygGQ5Z8wL7tLr7y\nhu8k293l1U1FVc2ZnT7Nr3/X62mt4ywR39YoIoOiIHSDd7z3TOq6K6BXWB8ogqe2HmMTS8ZZi1GK\nqpqzNEq3N2qa5qqJUtINvgnpeDq+jTYGJUJmsgTTxMDm5ib33HMPb3jj67npphtSVttNeH8+PtzX\nVTgR+VVSgnBIRM4D/7RTtH8jIv858Djwg93Hfw94F/AwMAd+4oUoWwwpSTi0scHpk8fZ2poxmc/Y\n3U9Tu02eEbvbSyYGRLqNTwpOA0h/9yoWnLieqvSv1jf4J/MpT77jjeAD4j0q1+RFCVmyNiH5acpM\nYWPEi8f6gG8SDWqQGfLBAKt1GgGrUyIyHJTkuSK4kCpOkY6E2d2Pip4Q2sWVMeBjIHaESFvPcc0M\ncQ0qZEiMiIpkRUYY5GBgHj15cDRaqEhVAzUapA5/ERzpXl7aKJhMaKuatqrRHdRklHDXfMrf29ri\nfywK7hmMFpT3vBxQliVVVS0UTmnITBo+HbzHu7jg1jVtQ24Ujzz6KJeHA1aWl3jVt78yVT2+pgvs\n32NcV4zx/c/x1tue5bMR+C+/roY9Q0Qp8iJjdXWJ173mTs6fv8zm9hb3PfRYuoNLVbGyvIyQUHLv\nLNPZbBH89/2oBw1LcRHE/nGZ88/e8Bre9crbOCbSDWIQoqT5HyJCdAEJkSzXlOMRy0tjGuu6SQAx\nDS8UCEaBFgbDIUvDUddsbDvipUeUQnUtjOPPfYnVX/gVLtz9dp46vEYkpu56n2bnZkb
jbUU13WW6\nU+DbZsHIVUox0IK1Ad9U2Pm0SyR019GViI5taxmPhuRFSd033khiiWgg14oiM/zkzi63Wcc/DpGP\nLzuCj0ynk670d8AhTAs40t8XIs2XkzQ7z/tF/IxEmqZBZJmt/4+6N43RLDvv+37nnLu8W721d1Uv\n1dt0z8aeIWfhDMckJVmUIotSTMtyLEeKHAmOnSBAEsMGEiFBPho2kiCwnQ82AthBJMuSIFFytNha\naJPiUBqR4pAczsIZ9iy977W+213Okg/Pubeqe2ZIOqGB1gVqemp5q+5773PPeZb/srkJSK+zuQ/G\n3OejrchVJwRPVZV4V7Fx9BCuLiliY3bj2AZVVaJ0wt5kJhe8dhRVifJERIluRzWNr1aWiE1kI1xN\nQKA2aJRJyNKULEMM1pzDK41JU0x0b3bBgYqabZ0eaTcXngKaajLBU7WyFBqh/hltWPr5XyG7cJkj\nv/G7vP3f/izdTofdosTVHpTH2QpbG6oiYTbeReH3TXiRFoerKhIDw2FfGspOSC3dLCdNUoraYl3A\no6jKCm8tBki0jLWMF1Gaf7q8xN+6s8k/yFOsrdAYsWU/wFmA6NCjHIQoiBhhSV55SWGM2KxnRjE3\nN6Df7/PG66/zxBMfjJaXcUFw9n3v9X0RcN4L5izLOgznEq5cvsKgv8BHn3ua6azgpa9/nSxLBRvW\nOKPEADHaQKIJweKizIH4uBsWFxbwVnLA4GVr04lCA95KTykxWhj5KohcQyQDZ5kwtfJoyZjiUKmB\nxAhEx9ZRcajx2JNDIzdl9LM/zeCf/9/c+sufJEtTOr2cbKKZltJEtXWNSxO8swRfkyhPauKWjEjQ\nalK6vR4mjXNKrYQYU0ZlT10zmUlSn2iFr22cHyckxmKcI1GKF+aH/H6vw53RnqQl2kSvsBlpmsWt\n1OCdp64dRgumrUnFGjZXlqTi1KMgSVK2t3cwZp7Lly+zsXGUpaXF+Pfvc20RQTZ0GI/HmESzfngZ\n52oClv6gx/GTJ3nrwmW8TtjdGbG3O4rFAmSZCCSr2OVvWgOutmzduUNqNLZaIiDwIxUs2JREB5Sr\ncFUgaCPVY2riWMiTmEQCMHiUjtDr4FFOKKQ2BLFy9AotGqcRii1DpNFHnmHn8XPM9rYxO7fodlLS\nBFRwEZiZ0Hi+emvFElxDlncwsZ+ldY6INkvjVoLEE6wRIetZgY/ASm0tSeM/r8ArSS0Spcm0gSjX\nWuOjBwaEspB2jhb7dwksE5WmgoxSo/qDBvCiqeeAohSt4jzLeemrXyNYS6/TjdL/3/3G73f9aHKu\nlZVDBA83btxga2uLLO+xtr6GQ3Ptxm2uX79FUZSig9uoIUcJhXuRCt57CluLvZCNM8NaJPmdtvEq\nKlQCSicRqbovznfwXxoZCS+vCUo+DhYHKsJ+lFGkX/hjlv6vX+TGj32SvVPHyHWKUfJ9Z6O0a6xa\nfazwnLMk0SlQKdUy0ryXggPvCM5howJ5WQrIobYSxDb6ijV9v2blCqpue4VElr4yjQ+YBR/dGoNu\nx37eORweFbdRH3zLiWgYcEcOryMytSU3bt2ksiUnjm/QH/Tf9z7fFwHXjJlAcgAfi4Fjxw6jdELe\nG1DWgaoO3Lhxm6KoSZwn2IgcEUhqizatvW/JIN+zt8vPPf8F3prrYn/woxgTCNZjqUGFiA2L9MFY\nVDR1/b5yuKhEipxBo/1h0Krxn2p5/5J8K8Xyz/8S+YVLrH36t7n0t/+mqHfWHh+tw70P0cZI/naI\nKkuiwi4MfR8qUCryCxzW1Thbt+4yjfh0XdetqqW19q4i0Xsn4oE+0vdCwFuRxzBpEs3mZHrgraQk\nTcAf1GnRCtEbKQoBZiYpi8vLnDy+RpYZbty8wY1bN+kPBhT1fc7aCvFGNX6ma4cOsbK8Il6ilUWZ\njF63y3Q6ZTQaU1sHSpNmhjTNUVoznUyo6zLKFoj
Mva0t/932FqeqisFnn+fVT/w5gg3x6ScSTfaD\nTaFED+QAfQ6AJuCMwWMbVf8GTitjNWtx1kk+5QN3fvLHWfr5X+HSj/wge+MJO+MplXXCdk/F4ca5\ngLVR+TtyZ72tcRo0KSoqm+OE/O1tha/Lts/lrKUqK6rorWBri42rncjKRpGgIIqZWkM3k3ZLlmbo\nNEHFXqb3VqBd8WiKCR9EsV1pAWZqo0lS0RS+efM2q8sLjG/ucf78eTqdjK3tPRYXFt73Xt8XAado\nxPBE521hfoG6tkynM5RS3Lh5g2+88RbXrt8Q2as0w/lKnjgn21FRlrE/p8RfPq4e/8fqOn936zZf\nfvwDLJUVadRFMyRRZRP5ID7V0QDkIO0wtDIIGoKOJJNouRlnkGVRUFcVWmmy2jJ98AHO/53/is3b\nd7hx9Qrb0zFlVQkhJybh1jqcjWoB3kV0i8N7RQgaFaSYwltw8cOLe420MsK+Zm9Z4LwQYBpObCC6\n1yh5D1mSYEyGDrLVYowUSUFaNo0kvveya8iKKPeobgsS2Xb9eEJZFEz2RhTFjNl0yqG1VZQq2dt5\n/17/fRFwIIWDxksvyYgj32A4IIxnbG3v4J1UbQpDwAswsxIzNbF/FEHARndEEcgTzYtry/wXh5d5\nenWJ760d/VwktIJWoLS0Y1QjjuNRZHhl4mc+fl3yn+Bs3NYisTlyNmejEdcvXeLipct05oacOn2a\nejJhvDPi1p0trt3Z5vZ4T5TIjaGfZZgQCM6h0AznevR7XTEmDgrlA8E6KlehkG1RtLvERyF4sdYk\nzpH3Ze6dADDjNQ1eHKa/Z2/G37i9wz9cmOMP51KyKLPqYz/PqIAKEsiioWIEoqQCeWR4aS2ToOAs\nyhu81hTFjEzL+XQ6ObPpmDRVLCzMv/99/g8dSN/J0UBrVPAtmiDJMnxwVNYynRWYJKec1TgrAdni\n3uo6MozE1SXoFOs8JjhMsISywqWGWVVSljW+46i1R3lROWpvDoIBM0mKMQK30YYDq1zdJuveSsUn\nSb1mtLPD26+/zhe//BVc1uGZ5yasDufAwd5owsXL17h85w6TuqLfzXjg6BGWBj1safHBMItz4RCr\nVmWFva5NiKhkWVFFdFBHW9lYXbLvrGO9F2Ru8DQesyEE/ss725ytLX97Z8Tn5/p08ow8MVS2pHbi\ncmi0wrqmVymVkPcerz151iHPM7I8o5hOsHVNv9+jk6VkiTDp1tcPcfToOseOH2VpcZFf/e3Pvee9\nvi8CDvar1GZFCQGKQqSltja32dzcZTzZoyxnpJ1O3FZ8CzdqkB9Ecgl4enkH7yp6nYw6oh38/LxU\nfPG1otbtwQUSrbFV0bKcdGwheOtwrr7LxlIZYYjV1rL88uv8Z//6eQ6dPcFX11ZFe0QbxuMJe+MR\ns+mEPDGUpefP7U74r6++wh88fJIvz/Ukh6pqZkUpOLVUFDqFHyDw90atUmaesl0226pzAs+yzrUK\nUs57CA0zK/BPlxf4nPsV8AAAIABJREFUm5tb/OPhgBBn1okREUKUwTlLUVYQ1cgD4i0WHJRVidag\ndKDbyThx4jibd25z7MhhVldX0DhGe3s8+ugjbBw/Rr/fuf+rVHj3DNTawGRccOPmLa5evcLtO9vY\nOpCmOYnRjGdTFIEsjZaPtYy4EpO0W2qWpjhEAEcrAV6a1ODiVmkUaCWtAh/AGw1e4Qgok9CYh9RV\nja9ldJWkCSZNQcuWZXTCxr/+LP072/wYir949Q6XTjzA5NptTvz27/M7D2yw7jx/6eptfnXjMP/J\nlRtsTGb80Dcv8uJTDzMYDMRFuqyYBrBpQvBdOjmtP2krJlhVlKUoslsvRsbWuTYlcG06IT01EHXQ\nFxbmeH7Yp64txtbS41QZWd4RYerxjLKqyXu5jPNsFc3jRKSnLAvSVDGZ7JFnhvVDK3hXceniOzzy\n0IOceeA064c
PoVRgb7QXcY3vfdw3AUdsaTTJ/nQy5eWXX+Ubr7/BbFZw+vQpNje3sS6CLoKPQEMx\n9XCulgLCRX9VHUApBr0eHx+N+Gsvvcqbc3P4o4fxPmCdFokpo9AqEVqhk0rNWYtFRj1VVUnOFjRZ\nnotEgtbUIUK0Q+DOT/8VzC/9Onoyo3fpCid+43eE17C5zSeRrWltMuOnbm7y2ac+yPe99CpfOPcA\np06cIOn2Y0tIWiV1bZmGKWVRkuUpeSdvV7KyLLG2xvqAi32x2kvfzUZ4eKM6YIzCOsnRUJAnieSN\nXsR1yqoiTTOch7KqSbOcJEmp6lIKlyA8GhPJNJ1OKqtXL2dhfo5et8v6+hqrq8scPXqY4XDA3t4O\n8wurdDrZ+97m+ybgFLTbQ/CBGzdv8ub5txgO5zlydANtEurasrm1y3g8bbc3necyDIqtjbIqRWI0\nURSpYe3wGj/x9Vc5ujui99k/4qsf/whaJbGBKv2iBLm63osTYPAhVm51a9yRJz1IhQ3WuIvKSpIw\n/eizXPieZ+n/8VdY/flf5fKnfpTZZMTJ3/k93voL34uvLdnv/yEvPfMkt44d4dcfeQCtPMM0JZiM\nLNoe+RDwtYtFSWBWKubCoM1ZG0n9shZVpipColwEfQoYtbmeUVUcEW1MtcH4gFU1laupncW5wHgm\n+ixZnu/DqyIvorY1eM3y8hKHVhfp9/LI4tIsLS/Q7Yqey2i0x2DQZWlpkW6vw7egpd4nARdAISOk\noHLevnSbN958h4889ySHDx/h1ddeB5WwurpEWdVYiwRLFFoui5kMnPGxh6awPrA7LfHe8XvnHuKT\nr73B6z/wUepE0/EGrwI2WPDga4920mcrQ4myHldatre2CVrRnRsQlMX6ClWD8gadit03SYbRBh88\n4498hJ2nPsxsPKYsx7z03JOyQoTAl555kr3JmPnpDGVF5bLygZAlKA/BOUpn2/ZFZhKsrdndG7dS\nDD6IsLZ3TrR6Y//SWydWCUpjgzS0QXLbREEvTdDK4JWmcjVdB1nawWPYGU/xKqB1wDX25IFWtFuq\nXsNgbsCJ40dRITAe7VGVBRsbjzIcztEf9PDBkXc60qsM9zvzHplTiloPvPT11/DKcfz4EZIkYzgc\nMJmW0RNemrJJkrC6ssxg0OPCO2/TuuEZhdYBWztmdWB3b8QbG+u8c3yNhz9wlrUkQTtNCA4XHEaJ\nHaX0oAIqBMrplO3bW1jrWFpfw+SZdON9jXZaWjBaxwfFYIJGB43H41UQt2nfieJ8GpMabB/q4FHB\n42YKgyZxnpoATqYGRV3JVqcNc50eSaopouRpY7JrnaeOXl2NZVLT6C6dgPKMFrRz7VyEOom9ZRGd\ncVKTkGrDqLQyGgsBrWrh83qPikTuRuzGWrG37HVy5odz3MZTzCZcvXKJ4aOPkqQJEGHpBJL7HZ4E\nDUrXcunCO0zGOzz1zFNkWc5sJhzO0WiM94JcMMaQ5UIpXFlZoioLKmdxk2arDVRlhVKa21ubdB4+\nwzA35Gne5ogHsWBAqxwUipqt23eYFSWHjh4l6XWxQcxHmp8XNCwYLWa+Ichc1+MJWqpsgSEZvFZo\nk5Ll0CkLVFJSJ+IVH5wHG6idxwYZdVXWUvsaHRQ93SVJc+HilnvMD+fxlaV2sQ0CMnuNq3orNauE\nv6G15pmdEf/plVv84sZh/ng4F1EwmhAD0IZo0V6JqUgIYhZstJZxGLS+Wndu3iDBs7q8yPaWZ7S9\nxXQ6YWFpobXI1OZbwH25XwJOSZ6RZwlXr95geX7A0rAfhfwCxaxgOhXys1aaldVlvBcXvSxLmV8Y\nsl6tEW7dYmd3N6phAiEws5a55RUePLJGmqlIKFft6MbFVUACzlPvTRntjFhcP0Rn0Kf2slWBjMKk\nmx+tJVONbRrDKrRMf+c9lXekvY5g6xKDUQqrE7wxuFRTO0+lPA7pNVa2pnQC6
zZaY6M4tIhOJ8xm\nZSQbJxHdEaidI8tzKXhU1AaxxJ6evM+fvHqLk9OCn7pygxefXEWbBB8Cd/bGbI9HVM7jI+5NWkSN\nEpJMXEzUBp5OZ9R1nxs3brK0sECaZAQC71y4gE4MR44cJoRAivmzUKVKL8zWJYNuztxcjzzVLUvc\nRW8tW0tu0O3mJInh9o0baKNITcLRo4fZ3N4SV8AgOmVlWVFaza2tXR46fFhmgt1MqHOxR9X01Zqk\nfDQZk3Q79IfDODZzqAClUujEtNMJgSqJEa62nlDUdL/wAmu/9htc+uEfonzqHGlIMQ7c7h5hd8r8\nF7/I45/9Q77+0ae5/OhZUSlCAqeoLIWtcUFsM4PSJFlO3ukAUJYV4+mM4XAOZQw2DuU7vR46iRpv\nQaG9yOs3vcRfO3mMv3LhKr984ijELVIDs6JgWlXiVh/nyMEJ+kbArIZet4ONTLitnRFaawb9HkXl\nWZgfCkhUG27evMXy8hJ5nlFV7luucvdJwEkAOO9ZWztEr78o1WJ0wxP0qWklBXwYYYxhPBlz+PA6\ns+mEyxevkiQJR44cwVrL9vYOdV1TVBVf+spXOdzJOXZkmd5cl4hxlL97AO3hnGNWVzilIEkYjcbY\naUmqDCEzdPt9jJGWwfCLX2H9lz/N9R/9Ya5tHKfc2uOZX/hlelubrP/qb/B6P+P07Tuc/Mwf8uKZ\nB7iUDfjRL77A/GzKg3/weT5fFpzd2eX7Xn+bz3/wUb6xcYQqSDFQoKjTCu9tVAoQ/TxbVVRVjREl\nxtYhsbEsak3vlADpjdF8fX2FLy4tCoLDWpRJmJUFu6ORpAFxlBicALNUlGO1dU1lNN4KEuXOtido\nQ1F7dkZTaq+YTScsVzPm5gYyKVIquhbe50UDCkLQ2BrKwrG41GFWz1AuUBUVuEAxLcnTHq7awwA6\nEVvIi5feYXV1hTNnz1DWga2tba5du850WuG8AgyPXL3BT/zSr/LNv/QfY9fXwQkXPUmb7akh4AgC\nI81zbFkwHe1RjKYkypD2uuA9fjTC7ow4889+gd7mFsu/8Mv8u8fPCfT62FEeL0v+ZGGR6196iY++\n9BLD6YxHdvZ4/vBp/qA3z/dZy+9257j12gX++vVrHKpqnv3i13g+SdC9Dk5DsI6SIqKSE/qDgZi7\necekmIkfK4oEQxJHbEEFCLIrpFr6dASNtyI5q7Qm6XRxyjArKqa1R6tIeLHyEIFYhuICOs0ji64i\nhEBRWDY395hNC4x2OF8xN9dDBcfCXJdOAtgCrTto7nPEL6HJkxSzWcmsKEi7KUUhY5ytrW2q0hJC\nxIY5OHryBGvrh7h89TJJmjErSq5eu8Xt23cYjcZRXlQS6f9ma4ujtmbuM5/lq9//UTLpbYjiECAo\ncpk+9Ad9sk6nrVpXX32Dhz/zPJ//0KN8ZWkedkd0ZjWbc3M8NxrzuW7OhUsXCEaTTmacrQp2ZyMW\n3t7FlCWbWvE7vZz1zSt8fDbjN/s9XlGecjzil7OEH/eOf5Ea3nn9LepOgupKX65jUjrp4ZYvihIg\n+97eHoFMJgzRgUdpJRg6BzoK19SVR6ZdgdxAaYVQPnOK29s7lNYR0O0K2difdzt5KyZdRht2ZRQK\nmenOZjOyTLGzu8PRI2ssDIdkaRq9MgQscNCW/N7j/gi4mDeEEBhPxmxtbrF27BBVXTGZTvDesbQ8\nz+bWLotLQ6qqotPpUFvJqS5cvEyn22c2m7G7uxv922V70VrzD5eW+J/rims/9imKqqKT0QpQS9NY\nqt/EyGqi05Q075DmJR/6wp8y2Nzm+15+g8GHzvHkC1/lpeOnuHXsJL+3vE4IgXN5hvWeT734JYbW\n8iPbuwQCc84zSgxhfZW/dv4iQ+f4ickU1o8xrS23fMn/kgT2Kk+K5kM7E/7zyzf40nDAc5MZf9Lp\nYh9+iE6nQ5KmlLWw500Sjd9AenERIRJi4
QLgg6YKIonRyw0BRV2VbI8Kbm/t4rl7lNjeitgFEEOT\ntIU62YhQNlqjVU5dW7Ksg4u5siyQ+iD28z2P+yPggrxRW1tu396mtJb5lXmsdWxu3mZlZRHrAkeO\nHWI8nnDp4lWmxUxsyZOc8aRkd6+RLNB4b8gymUz44Pm9TpfXTp7kw6Ndnt68Q29lnjw30f1Yi7pm\nlCTNh10cCp2lJFnOtb/6KY792m+x/dd+nI/9y1+jNxrz4atXeP7MWbrRaabf60nD9pWXwFpMmqG8\nh9qSGMODC6ukyVVwjtQknJ5f5PZoymzmmeEpkoDOc/7GnducKGs2NnfJvKf/2nk++/0fb8X+6kq2\nuMZenMhwDxEM6qWvQQiBD93e4UfPX+KfLAx449gh8sGQrZ0xF67cEKHFVDzD9pWmpICaTCZoBcPh\nkCzL2olLo2jV6eRSzIXAzZt3CEt9TJJGiL5CYQRG9T7H/RFwSsYpaZaytLSE9YHpdMJsZlleXqYs\nK65dv8GR9RVMolmrDoli5WjK3mgCGJy1TKdTIY4k4hWgtcYHKMqCdy68w2z7FicPDTmxOh+X/QZt\nIlxMrTVZ2sEGadL25+YYP/M033z2GUJiwDrWf+XXufDcs6zu7vDQH77IxQ8/Q/HQQ3SznOuf+ATr\nL7zAhaeehlnJxldf5MLpU4y05pWzZ3jowkW+dmiFK+WYm7ZgebbHT+3u8huHD/Hq0jK/mW7wly5e\n45tH13ns5iZfee7ptrcWEOQGqFY/pdlmQwgoo1HGo5xCWcWn3rzMxmTGXy9KfqSw9AYjxkXBzmiC\n1gbn6tbUuP0bYV8LbjweR7VzLXooeYKP6kmKwGw6ZTSaMD/I6c8NcSHgxZz7Ll7Jvcf9EXBxlGKM\nYXl5yO2tbay1HFo7RKfbZTKdMJmOGQx6kY/qefW1N5hMSqpKjN+sc/R6XUJQ1JVlVpStsIw2BuUF\ntnRncwtrT0L27reuddyqlBLugYrMYi/SXzvPPc2dJ84x29ri6X/wjxjs7HLyxRf5w4VFMqUxgyFv\n/8iPYJWhLCxvzS/SxVPt7XJhfcCrcwP2RjvsVhN2U81P7u2xUVb85dubvHp6jVeGy1x85BRLvQGv\npR2SLGXB+4jK9RGjZ6iti+6ACudkiO+cxwUXIVuaT5/e4EfPX+DvobkzgzDdFSaWTlFKwJ9VVcX3\nvW+L5CNBpllV0zTh5MmTbBw7zPXr19ja3GRxYYFrV6/hnWc4N+TQoTVARxXNhG8xaLhPAg6BS3tv\n6cx5zq6cQKc9Or0eaZoxZ1LStMt0UrK0uMruqKCuC8bjPabTkizJGfZ7GBO4dv2OsNPFqADQaOdJ\ntKFSOde3puIBHzJJmqViQSSqoA6VjIx8wLuADTJRSJKOkIuzlKqf88YPfy9n/s3nOP+xj2DX5pmN\nJ6ydf4vHX36Vrz38MMp7nnz9db565jS3hvMcuXSJj1++wu+f3ODy6jJTW/ObD2zwFy9e47fObpBH\n8exBkpMEJQwrVMtX8ISIhAnUpYhg93o9cAoVEhkNRkl9hebF1WX+Rek4f/WmTAyiyLaNfccQ2WIN\nf6PRXHFKCDOdTodet8fifJ+jhxfp9zK6eU41qwi9QKo0q8vzPPuRpxjOzxG5a0BA6ftcNh8a5G3A\nRFZ30OKWkqYpaapYW1unKAq8c8wN5siSlLos6XUynPPkWY6zFUZpKie4sBACaZoBFhU8H98d8T/9\n3me4tdDFfvzDaJMIhzOSasS0Nk4S4pA8BHnaFTLcDiHQ6/UoPvYsX3ryMcppSZgVhH7Gud/6Nyzu\n7fGBN17HE1gZj3nsnbf5wg89x1986Trrs4IfvHqdl08cIQs5b64s8L89sIFJDD2jyUwSIeU+qiwp\nEfGJWm57oz3Go3FkwRuxH7eOEBUqm6soSpUwjZ4VKs5Wm+bjQTbWwe20Cbo8y8hSQYaUZcGb589j\nTMLe7
oTZrOD27VukqebJJz7IsWNHonbKvqoo36J0uG8CrnX/izplwWTCWsJjrWN1dZnt7V1290Zc\nvXKVvb0Rq6sr9Hsig1CVM6ZTTa/Xox6Lh1X0UozYbPg7u9ucshXD3/4Mb3zsaeF5QnvBlJJVoikB\npcQXc7i6LuJFldWwk/dIkoxpVqKyKbrX5ZUf/Djn/u0f8bXnngbrSL74Ii8+80GWV1Z54dnH+eif\nvsbzjz1Md65PiheycjSfU3Fw7lF4I4Tqqq4IlnaqMJnNqGxNHhusWhM18sSwAxcE/eID1gdmdY2P\n53/vdW6vdwySRjo1MQprPdY4kWWtZnR6htXVVdbX53j7zbcpqxmnTp3mxMmjMpv9Fjnbvcf9EXAR\ny9Y8gVobQrQtQglTqN/vMZtNKYqM2aRgtDvi2MYc8wsD9nb3MEbxyMOP8Obbl9l67Txap6CaFoHk\nNf/rcIH/frTDVx57mONNYBEwZt+H3lnbkp29F2EaghCWZdU08Tw1RmUM8pREEHVUH/8YLzzzYbQL\nUDm+8OyHqb3lEJ7pyiq/8+SHmE1mLMaVsnI1zgfqoqQuRR83zQSeXlQVtRP5Ul8WTKYTZmXRchWS\nVAyDfRAtEJnuarQK1MCsrJhVtRC2vaBgDgbGwVaI8CaiZnKWAwpnPSrR6MSwML/IcG6esqgJeIbD\nIR9+5gkGc3msXg/g6A4UH+913B8B1x5hf0mO/zrnSBPRxu12u+zujtFK89CDD7Jx/CjXrl3h1MkN\n3nnnIq+99g1Gk0rcjiPJxDkr7CYUn+kP+Xx/wF+Yn2etsnTztP2b3gtfQEchnBBXCq8MibMkeQYR\nUSszXi3tiESRCtyRfs+0W5OPKp1VJUESvKOsCybdsRCZnRVqYW2plKFKUkAauDZq5TrvcEF86vfG\nY2orbRalIUkMIdICXRDApkLzyI3bfPLNS/yTpXk+V5Y4lCB94xW+V53g4Fba2FemSSpKUMqgVGA6\nnTGZXKauLEtLizz44CnW1pcJ2Dgl2g+45ne+33F/BNw9HNAQBIHR+DwpJRCZXq9LWRZcv3ZV4M0r\nyywuzLG1dYfEGGazGXXt6fd6+OkEH2R8o5OEENrhKTuTGaPpjE6eghK5AxflXU3chkUkOQCeqlJg\ntEhBxJzPmASTJtRakC4KyLOc1CTUwVPgSKxFlV0R1CkKtLfkaSYFTFmiXaCX5pSZWCjNypKqkj4b\nWgJvOptFPeNCWj5RYkxr1c6ZW26r8/zIm5fYGE/5mbLmH2dpqwxw7yrU5G8NZ6IRI9RaiieFx2sd\nq2CHCiL8/fTTT/ADn/gehsOcblecsIWme8Au9FsE3H1h7gbN0xFwrgYc4IWjGS8O3uNdTZLA2vo8\nS8t9kkSxvLTE1tYOV65e4cTJDVZX51GUKF+hXE2ikKakMSTK4kLNa5dv8MbFG0wrhyNQ2xKNw8TK\nVCnRuZUH1uNsja2q/W1JK4ISwzTjhc2epxnKpDht8EGROEXiIEWReIV2oJ3s8kKGTjBJKu/Uh8id\nsBAkhyqL6ElRFrH/RuwTpnQ7OVoraluJgqWVSYD3FZ8+vcFbvS5/L+9gdEbiI/w+qoI2WinN15rV\nbd/U2EToU41HeK7OibDhw2eP870ff5qlpT5ZmqNVF2PE+VDpCBoIHsV9b0Ee06nmETmQC7TW3shS\nvb62xgMPnGJxcYH5+SE+BG7duiXWQtMpOzs77Rx0fmG+FaMWXRC54Dt7I1556y22pzPqyoIN0r+L\nYyKp1PL2tU1jWMY8FnVgG060ij5WJhqqKdIkIdGaRIk5ubcWW5XRGtJTlWX09xLSTIPqDdD+28wy\nGy+FJEnE1bnTiSa4caQVxGbTOU/tAl9amuenTh7nNyOLS92z5R1c7Q7aVjbXKIBMM+qKqiqFUGQF\nSXx84xgrK4uilG50vDZy47TSrYJC40T4Xsd9EXBAS1wJ3rdL8sGcQCn
RbBsOhxw/foLZrGBrawvv\nHYPBgI2NDfI4eG7Qp81rD3bTm4t6+PXzfO/f/8f0v/gSzsqNt4EWgHjQ96FVE7I1wTlCcC2vtdHD\nFVtJh1YiWWoar9EYrCHmiM1sshExdJGfgIoabTFQ6qiC1Gx1eb7/AIj9t4ssNR/VkwKVD4yKks2d\nXdELiflrcx0OmnYcvC4HlaLqqiR44bj6qHqpCBw9cpiTJ06wvbXFdDzCO4uPgahi3hPaHPz9j/sk\nh9uHfDduJ6qVUz3wY7EQWF1dpaoqbty4xVe/+lVOnjzJjZu3uH7jFlVVtfLuu7u7GJ20DikHjc5+\n9vIN1usa89uf4Y8efoBuJwUVn8wkadWc5CaBtTUgakve1mgtGidEtjsAUbXJaBn/NIFG1CauqlIU\njmJhYqMOiJekiQBYJ570ZV3TyJdKL3Lf1qmylqoSXbgm8HyAKsC4qBnPZrJKEuLfbsCs+9fxoK/F\nQScfeeClIZwYDd4zHPR56OxZsjRhOhnT7+VUZUGaZLKqNSVJ2N+q3++4PwJOyWirrZ4iXMkY0wZI\n8z3JNTSLi0uMRhNWVlb2Vcsj4iPPc7zfw1kbe3n7nfTmYv/9fp+fG4/4V2tLdG9c5+ShVTKlUFkS\n8fz7KwMIPdD6GjCYAF7JdARVoXWC9rLyEG+ycxZXS+5XVyVVUbQcV+eF0d8EnJCqaWW3qhhsLgZc\nszI1g3TnHGcvXeOTr1/k06eO8OXFIS5A4QKVFRqj6HA1zeD9a9DqxrVNWnmPjRONBoE9JYbgLHmn\nxwc+8ChnzzxAnnoOr61iEvF98LmNWiRRwSAqGN7/8CSk0tMRyRqUNGJ92L84SsvcsIGjm8TQ7XWZ\nTieEAIuLC7z11jsiuBICed6R5DvaIonS0H5n/TPdAZ/JM454x9mXX2N0fINjK6sMu126nZpOJwUd\nJx1a4VWURrCi2aZ03EJ1ijLgvEwlbFVSAK6qqcoSV9dUZcF0Om2DrnL7s8+qqqSqrSumkbtR1YLl\nU0reb/BidLevSmD55OsX2RhP+bG3r/Llp+bb7Vmq+wiAjCicdgG6J7XwURKide9xPiKvHT7oaKwC\n1pZMZxOWl1ZjnxDyvCtkIWcxSooGrTRBf2s3we9ENn8D+HnE3igA/2cI4R8ppZaAXwFOAheAvxpC\n2FayFP0jRD5/CvxMCOEr30HIoVSCi1JVASPTOS2ICBWIqkcQ0KR5xslTJwnecfnyZTp5l7OnT3H+\nrXeYTPZIk1yWemepsS3Run1faJROMfmQb1y6zTvXNnn4zAN8YO0YT966zFMvfJlXfuBj7DxxjoEB\nZ4Q652xNUCKcqJTBBAO2llwuMv9tVWOLirKqRW3TOYpZIa4udaPe6cWEw1kqD5NpIYrtRdFu4wSR\nBKtrh/f1XfyLf3XqKJ96+wqfPnEE74Q8boIHJyqZqgW1vuc9RTWkmUaHRDWGeTS0IJLU0O91IFjO\nn3+dEyfWMWneIkKUEhEi72ghST7c1SB51/GdrHAW+LshhK8opeaAF5VSfwD8DOIo+A+UUj+HOAr+\nD9ztKPgs4ij47Lf+EzJlCDF/C81FQcfAUHEsHLe4iGYgBI4ePcre7i62djz22DmOHN3ghS++yO3N\nXaazmtqJh5e1tnVc0Vqj4oo6q2tGs4LNcsb2eMzOxjY/+8Z5Fu9s8/AfPM/vHD+Gyw262yFJoie8\nEyhOmkBQUkX6qGBZ1TV1UeHKupVjKEtBMdeupqxrais4PesdZS3C0NNZIaqWcWt7YmuXH790k998\n4BgvH1puk/tmdXppdYkXl+aprLt7RQlN05w4rms+DlT+TX+zGf3FQysddxWDwjA3GOJc4K3zb7F2\naJ69vT2WV5b3PR202u9vsr+CfqvjO/FpuE40ZwshjJRS30AM276rjoJt0SCPSLxApr04UnrHqRM+\nQqr3b8Di4oDptGB+2Oexc49y4dI
13rl4hcl0hs/zNpFtLnxlawyGvcmU0jrQCTuzire3NvmTT3yU\njz//p7z4555ic2dEoRyd4YBur0+aStBV1pNnnlRLVd0EdFVVQm+MkKmqrinrKspYeCrr2oLAekcR\ng7G5iQ1M6Mcv3uD4tOBTb1/h5UPLd+WgjSeFP3CDRSwx3ux2TBgiiGNfQnWfdIMUNKp5jcckCuVk\n5UuNoS4rfFUQujmnTn0obvW0XYBwoALeP6+7R2j3Hv9eOZxS6iTwBPBFvquOguGuXltooS6qfWDl\nKdVRt0K1LCGllKBu+z3GoxFvvXWe0bjC+5pEQ56lrdlsM74BxAhEGYxO0DrF1paqdmyNp7ywvEj/\n7/0c5W5B5/JlJjt3GG/ukO5N6fW6PHDxKh/6/Jd4+c8/x+1HH5TpBEhwFYX4gFWiboRWWOciShes\nDzxw+TqfeOU8v/vwKb66PB8BBLo9R6UUv3lmg0+9fZXfPnO8rZibVo1vKuPm5rY3WN0dXA0G557c\nrTVgi2RpCRQfWzCGNE3o5AlZplleWOCDj5/j0UcfYXl5sd2OG7tRDt63b7O6wb9HwCmlBsCngb8d\nQti7Zx4XlFLf/q/d/fv+FmLiy/raKge3/bZHFFGtzdHkG075yD315HnOysoKxXTCoZUldnfXOP/8\nn3Bnaw+T5igF/b6sTFVV0ZiGJGWFRjHX7dNJcra2dzAq4OtAPbOkKmPQS1ldO8ydUDMbj6V6nAQe\n+9wLLN3Z5tzhZokwAAAWbElEQVS/+yN+6/iRdvWsI3bt2PkLPPfll/njpx7j7ZPHKKuK2tbCJw2e\nH3jlPId3R/zQ62/zte95mjRu1Qe9Rl8/us4bxw7LtYotjLtmlqHB8SHXJCq6N8x7GX3JNw9ux83r\nRY4/xE4A+y0S7Zkfdnno7GkOr69w+NAKD5w+yaFDqwwGfbFvisYf0hnYvz/fCWrkOwo4pVSKBNsv\nhhB+PX75/5ejYDjgJvjIw2eDAjH58LF/FQRJ0aiHS2Mxsoyi7KiKK2Ced9jd3aGTZyyvrtIbDNC7\nE4bDIZUN1DVkSY7qO7LMUJaO3a1tkiQlTTLq2tPrdpkRsHVBMasAg048nV6HuflFEpPGloTlj5/+\nIM/96dd48SNPMZnO2mZuE8zP/ulLrG7v8ZE/fYmX15apY+VskG3sD86d5QdeOc/vP3KaNIpgKyBN\nkwiLbwAEB5vWjcBg3A6JAAPVaBDHNmLwmESj6n2TtSbgmvZKS55RIZrkSTvHaE2WGvLUcPrkBo8+\nfIYjh9foDXqkRhTPkySJ0xYJbKPEaGU/F9xf9f4/BVysOv8Z8I0Qwv9+4FvfVUdBaZganK0I0Z0F\n5UCJ9SLKSDspaHRQ+BDlQZVibjgPWjGejHnlG69T1jVpmrK7s4NOEnZ3Z+A13S5UhUWHnF63S1Aa\ntDCatFIkylPjKa1HJRnKFyQJ9Ls90YSrSsqi4OojZ/iXpzfI0hQt8ygJAOR5+dxjD/O9X3+df3vu\nDFVdtblN8/H6+grfWJNCIEFsEqQil0AQKQrzrtf5NjXQkmqoZt4cSLRBx++laUJmHUqLW3aTkjTH\n/vZqYiUr9qAyJjYkOmXQ6RLqCrwjSdNoCiJtl+AE6m6UbhKf5hfHmHn/+/ydrHAfBX4aeFkp9bX4\ntf+R76qjYGgrKd/onPkgli+qga0FUfcRgtu7RjT9Xp/xeMx0MuGhs2cY7U3Y2dlle29HqsRxyYkT\np1CqZrRbolTOzTs7zMqJzDLrKj6YoWUxGZOIjVJfPA2c7TBLU7ytKVJxu8GLWpKOToPOOS6eOck/\nP3UM5xzJgWT6vWaYIQScCm0BcLDiPLhF3Y34kBaRiiiNZtZrtIpSF54sTUmzDO8nYl4XX9+sxE2u\n6HzURo4xo7WQmRYWF0hTUQGt64rMmEi9fDeI82AD+dsd30mV+gXef438rjkKElcZ2ZakAtKYCE+K\
njUpom5j7Abq/AswNBjzy0IMoZRg8OMfNGzeZlFNu3trhrfPvMDfXIU1y8JrNrSmj0S6FlYrP1aXY\nIh040jQB1RFVcy0YtyxL8c6ST7N2CN94vde19OgSLXL1Bzv77zWma3Or2Gs8ONN8r4H7fsM2CucA\n6sAaoxVCggGSRNo+VZXgvN1/mA/wGBonxkYNHSQPnU7G2Lpife0ESWRrqUTY/s3WfG+QfTvgZXPc\nN5OGZgDso5la04trqqq7f3b/jUW2H0opet0ujz12Trr5VcVwrovXipdfPs+ldy5Tzmb0FgdkSRoZ\nXSKmLKQZS5Y0ZrQ6BoqIOzvj6Xa70vT1KcFZUmPw/R5V7J2J/m7Zjq9sXb/rpjQ3/ODXpK+o20f6\n3lnkwXZDc2il8JFPq41C1RYXuRhaKbrdDqlOSPNMZsA6tnIOFE1NgISg7oYTKVhYXGBWzsg7ootX\nlAWZSVvX5/3Xhnetct/uuC8CrnlKiXNLlOyjIQZek4SEaDvUMPiAmPwEvBUYeZomFMWUJFEURU1w\nCTev3WJpYZlTJ49y58513nrzHfYmAmcKCO4tSUxkGwWKsqAsS/KuEFVMmlLaOqqnO/AJeZ7hrCWP\n22ijvlQUpcxOy0rGWTH424brgZvVfC2o/QH4vZXewWBrtsHgAz44lDEkSjwV6qpGK02eZ6jRjDQx\nLC0usrW9TVnue7EeRPw2slwHi4skMRw5epj5+WFUmBI/2bIsyNO5u1bb9v7dc87ftT7cf6gjgBil\n6TgAj4WOQLN9VNMFjY6RJtvufr8pNkUb68YkZbS3x+1bdxiPp9y5fZXDRzYgSXjj7Svc2N6lKB1B\naUl8tdAFy0oYUrPZmMlkj/6gizGKPEtw8SInkf8QvMelnm6UmQfRPZnNCunFzWYkSUpZlpSlkHoa\nyXuRovBt/0tUPqSp3cxMCYg6eQy2psJUShj2Jk5ctIbEQOUtOknJ0wStPN1eh+MbG4zHUza3vilA\nT+ekSIgoZ6IADkrhAzgv2L1iPJZV2svWbK0IJMK78zfiA9Ocb3M/3++4LwIO4vytEf6DfeDlXdV2\nbA+/1xMUu+eyWqVUdc31Gze5c/sOiwsDjm0c5rU33uLG7W1m1gsKWCmyTHp1VVWDltLf2YqqjLbc\nIC7Q3S51bdsL7XBoFSIyWAJBO0cgNpOR4bcxCXmWEbxvV7tS7/vMCy4uVn/xwTKxR9Y8iE0FK9ch\nRI+w5gFQZImmTmK1DRAcRgkaeuPYBu9cuMRkOpOgBRKTUNuKJNFkWSYPgbUkCeRpQq8j57u7N2J5\neSHC4pMWkdNuo+29UPsb1Lc57puAa6rT5tjfYpotd3+LVfc8QyHIuNh7QeQao5ifH3Ls2FG0Spib\nW6Dbn2M6mZHoBOUF5pNlucxqA6QmjcYciuGgR6+TCUwpgAuhrdKaHKzJLQ/2tRobbw0i6Z9oiogS\nUQqyLKMsCrRS7dzUatVSAZvf21wPQhD36qZAClF5U2tUEE+KNElRPuDSjFnl2vNRSrEwnOf0qYTz\nb7/NhQsXo06wbP0qFh/i6FNjVCDRcOjQMs8++zQLi/PMZlNA5FSFfujvSgsIYX83OhBt3yqfu28C\nrjmaNK7N4dqqtBl1+Ra2dFc+EYKU9GlCCJ5ut8PZs2dYXz9GknS4du06j33gA+zu7FFMp4x9Sao1\nWfRBdU4ayM88/ThHe6I05J3DZBlBhbukWZuufzMVOJiXQcM9yKT6c4IOdpFxRVztFE3vUcu+aGVO\n2eSD6q7VPKKHvQdtMARClGdIklSQMNZTVjM0ogiapxnzw3nyTo+VlSVu3bqFisWQFv8TlBY4mCaQ\npYa5bsaDZ09x5Ni6sMKCiFl7LwGnOLCVtuf2nQcb3CcBJ1tG82TG1a19aprNpnl/qs3fDiavMpa5\nG72QpglLy4tAgvM1WZag+DBffynn0uUbrB0+AsDVq1cpC8vpk
yf5yZ/4cbbeehVVC0rYmAythTOQ\nRhPgEIQ4LLPH9K5AA0Htcg9ZBWPkayYhNYkYBBupKp0SdG7zs1VVxfer77rBDeasQZSo2Erq5B2C\nD8yKGjedEpxn0O+zsrzMaDrlgTOn2dreZjKesLm5TZqmog8XtYK1UiwMBzxw8gTnHn2QwaALRIPj\nohCxxcS0k4/2fIioYt4deO933BcBB8Sba9rtU7G/NeyvHncn0HdDdqRdopUiyxN8cJRVQW1nZFmH\na9cvc/vmTeqqZGV5wK3bijRxDIdDhnOnSNOUJ598kqOHD3Ht5S8xGY9YPXKqbTw36OOmD9WI9tUR\nQXEwt6nrmmBtmyNphJiCFzUTFyFSJhjis3VXUdAQZ5rvwd3J+j4pyEfBGTGjy7OKMB6jlGJubo6F\n+XnK2rK0tMTRI4f55vk3pbqNO4WzFqVheX7IqRPHefwDj7C6usRkukeW5szNzce/EV2pI7igOdrh\n/V1b6p+BFQ5kYW8MNrx3MioKXvpESiEWevFnmzfYNB0hckVi1YAWle0gaNWtrU2MgpWlBapiRqIc\n167d4OQJYSFtbGyQ5x0++PgT9Dxsb4/YvHmDE2cKup2cYB0hMsy1WCgLuhVh6wPYEA3hjCHPc1yS\nYOsKFwJE18EG2XGweeq9F3+wIPIOKkCeZNSNSceBBN0YI1ONEHBWquwsEa6Dz1J0FkhSTT/LWVkQ\nVxjnSgaJ4eSxDS5cuEyaSZWd1A5va+bmupw5dZSHHjrJ0eNrDPpdBr0eeS6eZioxUVVKughSxOxX\nqHJ+4Z6P9z/um4CTVlv0f2/NamOvql3d/LvQpO1nsYqVH1UkSUqSpPS6A/q9AccOH6aYjrl04W2C\nqzl98hSdrMNgMKCuK5wTb4XJaIy1gZ09Ud7UTZnS9MPajrpAshuURbvyNeMq7wXZG12iQdoSdVW1\nr2nwbQlgsYKOic1bjcIfWO1R+87VBoUy+8WSCpAmCVmeAJ61tTUef/xx8k5GZYWHsbC4zOHDhzEJ\ndDJFsDUKQ7+f89QTH+CRR84yHHbp9/qkJmlzSBMH+02x0vRIBUfXbKXNDdjHyL3fcV8EXFuFxv/c\nmyu034O7phDfrsstK4qm0+2igqOMP5+mKY888iif/fwXuHjpElopHnv8CXrdeW5dvY7WukV4NBAc\n2UL3t/AkSfB+f5pw0A2xmcU2Hzo+L1rvUxgbr3ofCTdNIdEgR8KBqYRWQiE0Rirq5vNmS1MoUpOS\n6oQ8y1lcX+eRc+eY2pppWWN68/S14rHHH2Oyu8aZE+t08pTCeiaTEQ+ePcXqyiJJosiSrM2D74I1\nHWgO33Wz4icHU54/E1VqO+pBCMYy87snLwjvv2Df2yC9q3LyHlSIurUZRVGwNJfy/d//57l27Sp7\nu2POPfYEWd6ntpYsFgcNWVl5qZYbdnq7EjuHUtm+EMw938d7lBH7Q38gBdgHPYYW4ycNaDlv6wWR\n0VayJokAAUH1qrjiNr8rMSJBkSpDr9unNxxClmHrCqUNSaeD0rC6doiMgrluxmCuy8x5VlfmWF1Z\npJM3kqmCIoH9UZzWmns2lve89vf+/3sd903AQczF4nbUtEQOzhxjg+Bd5fh7jVn2V4e42MdVM0kk\nx5ob9ujPDTm2sUa/N8/jj3+Y8e40isVINTidzkhWTRSgllZIsw02TDDYX90OijSrgEhDxHNKjMHF\n1x4sMLTWEG0wjZLACTrKvyJKUkliWi7GwfGej5jAplod5D2K+YwPP/ccJs8Y3Z5KYzcIibygYpIm\nvPXOBc6ePclwaZHVlRWyJGm3cWk9hbvOMwRBs7xnw/297+L7fve+CjiIvTff9N+aYGv1k+9+0g5c\nhIMIhrtRESpWv1BWzehLINS9fkaapPT7c3R7OaPdUXR+kd85Ho+AaHnuDhCz2ybvAbWkA01RueSe\nOoQIYZJ+nFbSOyMEqvg7m
tcJ8ViYYW1lHl9jovigQsXxkWzxjkikVpCZnE7e4dj6MhunTlFay2gy\nFtCkgSzThJChkpQrN25x/NQJ+v05srQDwaOCYOxUnK02ztChscRshj5xpHbwNjTvOKbR+zOu9zju\nD6kHFdAqQ5ECDh9sPOmmFUB7I9pxVyTbNN3ug0iMgygTjyIoQwjSCtDaY7QlwaJ9wJguK2sbOA+V\nrQhpiso7eKXZirIGBlolJ4OsfklM3r2zhCCAUeWtfATfJs9Jkopkq0nJ04w8SUlNQidJ6WY5vTwn\nN0mrQ3IQAiSrW9QUiUQgYwypke3XK4VTAUyAxJMNhjzy4ecYLB+hmDmqWYkNFYlRGBXo93r0BouE\ndJ5f+38+w+c+9yW2d6aYNCeYAIklaB/noxBUwOMI2hNUI24d4qTHo5SPpY3IhYmI2bcOqvtmhZOt\nUrfoBaXMXdCk/anDPayg+DVjpJ48iN130QurCV2xeRRwQF07QtC4oOl059AmFbkvK+KD0+mU7a1t\nitkMPzeHbmQN4nbj3T2jOC/MfFmp4rA/yNfrqhZ/V28h8lQbbZH9icr+NmSMwSQJwbpWdhYO9PqC\nJ1SVcDBQYB0hgeXjG2w8cIagE+qyZjqZULkSg6hB9bpdOt0eHsObb13m1o3b7OxN+ODjD/PQQ6fo\n9bLW5C0QRWlUAyxwQEATlZF9iBLKDfK6OX9BIL/fcd8EXKMHp81+oN0Lc9kfb787f3POt+v8waDz\nTrry3tYQFM4ZrDXUPqG0mumoxAdNUdZUZS3W32mKVpq93V12d3ZYnJ8nNQ0ndX9eiY+rgJNcylkn\nkq3pfg6pQgNf11juYV7FwiO+Idme48hKnj8j5xLThYZ1Vls5h26aoKxF6xyluxw+dYbFQ2vc3h4z\nHY+wtpbeWbgblrSzs4P1sLU741//7mf5ky99mSc+dI4f+o8+werSPP1uRpaJB2qTVxsTCEHg6poU\n4hbcVqf7bYQ/A2iReIZay9PV9BObN9Mm2EodeJPxpW11K8e9rRIVg7dWCXe2d7h8fZvbdybc2nuH\nK9e/BCbn6IkHWV9eRqukxekH75lNp0zGE2xVk/Vj4qL2zxWEEO2sk669j2I8Tb4V+27e2vg7g+i4\nhf1V7iCg0TlHnqWYRCwmdaAdpx2sarVJ0QRcVaMTQ0hyFo+e4OS5x3FJQm1LZuM9Em0ovMdkovvW\nIJNv3bpFbT0hiC/qzdu7PP/HL/LKa+d56IGTPHbu4SgWLf29hcV5+v1MKmaVSDbjmuY1dyd03+a4\nPwIOACUkEqCxLTrY22mG980T965XK9H+aAbqzc8UZc3Fyxc5/+Y7vPrq61y9cpPxZIZzNUnW5Qd/\n6IeZn1+mKGuUkqG7rS3FrCDgKYuCYjbD9Mo2cW/OJ00zlAWnrWzFcRutQwVKEv00SXA+RA9XTZIm\n7Sonbsv7joDNh5joKpSPueOBFMHoCLh0FSSKUinqbs4HPvYxBivrbE73xNditBe16tL4egno7e1t\ndnfFmgkjLtbOw3haMpnd5tadLb78tVcwiTxQKyvLnDp1ktWlBZYW5zl96gSH11ZJUw2hbvPO7/S4\nbwJOxS665DNxO0K9R772Pq/nbjQtwMWLF/mTF7/GN775NpcvX2dWeEJIcV7yuSzJOHHqAeaGC0y2\nN6kr0Tzrdjr0ul2q8YjZdEoxK+g48Xc0Wgv4kn2USJok1MZEvbRGbt+3OZ82Gq1SQu1x1X7D967z\nb7mtljRzQmuk0Wjbvw7aiENgojRl8FgNKyc2+H/bO5/WJoI4DD9v1XqoVUwVmkNNFIJQ9BJ6qCDe\nBMkn0IsXP4DXFj+BHgUPevDq2ZOIiueiB61ViLUgRPAPnhQXYgrrYaawXbUmwc5vQuaBJZPZPbzL\nvsyfncmbmXqNfN9+NoEs++Gy4dj+g51ut0un0yHLMpfYJLf0zlZ+C6L7M6e32XNj6Ymcb9l
H3n/4\nxF4mOHhgimNzs8xWKzQaNU6fOsmhqel/rWZtv89+96LvJpK+A21rHUNwBPhqLWJAQmmu5Xl+tFwZ\nSwvXzvN8wVrEoEh6Pmq6rTXH8R4uMTYkwyWCEovh7lgLGJJR1G2qOYpJQ2J8iKWFS4wJ5oaTdEFS\nW9I7uejWaJB0V9IXSWuFuoqkR5LW/edhXy9JN/19rEpqGmmek/RU0htJryVdjUp3cWdq6APYA2wA\nJ4BJ4CUwb6mppO8c0ATWCnU3gCVfXgKu+3ILeIBb6FkEVow0V4GmL08Db4H5WHRbP9AzwMPC92Vg\n2dpoJY31kuHaQLXwcNu+fBu49KfrjPXfB87Hotu6S/1bHnDMDJptbIakOsNnMu8K1oYbaXLXJEQ5\nzVcpk7l4zlK3teH6ygOOjM9ymcZoiGzjEGiHTGZ/3ky3teGeAQ1JxyVNAhdxGcExs5VtDL9nG1/2\ns75F+sw2/t/IbSvZKZMZLHVHMKht4WZSG8A1az0lbfdw/y/Rw41trgAzwBNgHXgMVPy1Am75+3gF\nLBhpPovrLleBF/5oxaI7rTQkgmLdpSbGjGS4RFCS4RJBSYZLBCUZLhGUZLhEUJLhEkFJhksE5RfI\nXbndrpOi9gAAAABJRU5ErkJggg==\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + }, + { + "output_type": "display_data", + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAJoAAAChCAYAAAA/QqZ5AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy9ebAleXXn9zm/JZe7vK2qXq1dXb3S\nCzsMmzCbQAi0IQ0YDVqMPJ7FETMORzhsyxEejz0RjpEjRgpPOGzHaBhZo0EaaUaMJKwN1CBA0oDU\nDYNYGhqappfau6pevXe3zPxt/uOX99ZrDdUIJKCE+0S8uPe+mzdv3szzO8v3fM9JSSnxjDwj32hR\n3+oDeEb+/yHPKNoz8k2RZxTtGfmmyDOK9ox8U+QZRXtGvinyjKI9I98UeUbRvk4Rkf9ZRN79rT6O\nvyrybadoIvKoiCxEZCoi50Xk50Vk9K0+rq9FRKToFfmLIjLrf9PPicipb/Wxfb3ybadovXxfSmkE\nPB94AfA/fIuP52uVXwW+H3gHsA48D/g48J1/dkPJcsNfxxv+AP8iklI6D7yPrHCIyE+KyJdEZCIi\nD4rIDy63FZF3isgfisg/EZEdEfmyiLxp3/u3iMiH+8/+HnBw/3eJyPeLyGdF5KqIfEhE7t733qMi\n8t+KyKd6C/UvROSwiPxOv7/7RGSz3/b1wBuAH0gp3Z9S8iml3ZTS/5lS+hf9Nh8Skf9VRP4ImAO3\nisgrROR+EdntH1+x7/s/JCL/WET+RET2ROQ3RGTrG3DKry8ppW+rP+BR4PX98xPAp4F/2r9+G3CM\nvMDeDsyAo/177wQc8LcADfyXwFlA+vc/CvwMUAKvAibAu/v37uz39QbAAv8d8DBQ7DumjwGHgePA\nReATZGtbAR8E/mG/7U8BH/4qv/FDwOPAvYDp97sD/Fj/+m/0rw/s2/4M8GxgCLxneezftOvyrVaM\nb5CiTXtFSMAHgI3rbPtJsuVYKtrD+94b9J8/ApwEPDDc9/4v7VO0fwD8m33vqf7CvmbfMf3Ivvff\nA/zf+17/feDX++f/HPjlP4ei/aN9r38M+JM/s81HgXfu2/6n9r13D9AB+pt1Xb5dXedbUkpj4DXA\nXfRuTkR+XEQ+2bu3q+QVvt8Fnl8+SSnN+6cjshXcSSnN9m372L7nx/a/TilF4Amy9VrKhX3PF1/h\n9TJhuQwc/XP8xieu9/37jm//9z/xZ96z/Bn3/42Ub1dFAyCl9GHg54F/IiI3k63F3yO7lA3gM4D8\nOXZ1DtgUkeG+/53c9/wscPPyhYgIcBPZqn2tch/wEhE58VW220+7ecr37zu+/d9/0595zwGXvo7j\n+7rk21rRevnfybHTBvniPAkgIj9BtmhfVVJKjwEPAP9LDz28Evi+fZv8G+B7ROQ7RcQC/w3QAv/+\naz3YlNJ9wO8BvyYiLxIRIyJjEfm7IvKfX+djvw3cKSLv6Ld/O9k9/ua+bX5URO4RkQHwj4BfTSmF\nr/X4vl75tle0lNKTwC8A/xPw0+TY5QLwHOCPvoZdvQN4KXAF+If9Ppff8RDwo8D/QbYS30eGWLqv\n87DfSlaeXwF2yZb3xWRr9x9JSuky8L1kBb9MTka+N6W032L9K7J1P09OQP6rr/PYvi5ZZlTPyLex\niMiHyInLu75Vx/Btb9GekRtDnlG0Z+SbIt8Q1yki3w38UzLw+a6U0k/9pX/JM/JXSv7SFU1ENPAF\ncqZ3Grgf+BsppQf/Ur/oGfkrJeYbsM+XkBH2RwBE5JeBHwCuq2hFUaSqLBERYkqkmECEGOMK5Aox\nrl6Luubx9y8UpR
QiQkpp9Zj/L0C/v+X/l6h13gu5Lp0/l2EwEFEolV93nUNrDSmhtCbFCECMkZji\n8mAQpVCi8jYpkWIkpUSMAUFIgNaq/2wCVmg9iYT00czyOLXWaK3z9/T7EhFCjFhj8CGsvnd5PCkl\nVP8bEonUV4CUUmitSCmuvtNaiyCEkI8v/14wxlCWBdpo6rrC6F5V5NpDghWaJwJnz11k5+ruV8Ql\nvxGKdpynotCnybDAU0RE/jbwtwGKouD5z3sxSilCyNBOiJHJZELTNFkBY0QphTFmdTK990A+WcPh\nkKIouHLlClrrldKREoWxsFRIDV3XrT67FKXUav8+BLS2jNfXqOsBIUZ2Ll1iNp0wXh+DCN61hM5R\niGK6mNN6j5bE+niNejAkiMVhmOzuUoUZiEdJRCmQJAwGYyaTGUVlmTcLEoILgZASyljarqU2NdpY\n2s5TViWIIoZeqfYtFucc4/EY7z3ee7TWOO/puo4YPYv5nI21EVprBsMhzkecc1SlRQBrDcO6JpEI\n0rE2rCmVoq4s3/HKl/GcFzyXoiooVPGUa7F/MZPgHT9xfcTkG6Fofy5JKf0s8LMA62vrqaoqQgh4\n73HO4bwnpURR5B+3VIwQAiEEtNaUvRU0xqCUom3bftVmK7BSNGtXJ6RxDcDKai2VS0Sw1tA0LVVZ\nUg+GaG0wxuAWC7RWxBi4urODtRarhRQCWMFqRYowqCrqQpOCI6SAC0B0KAFjLEoFjFbEGDEi1EUB\nREplSCIUyhBFEKUoRJGSJoZA1y6IMaKNoTekrI3HLBYLIFsf59zqeYyRtm1JKTEcDLDGYK0GoCwr\nlM7nMIZISpGua+iaBm006wdHaKVZW1vjlpPHmc/m7F69yuFjh7HW0jQNRVGsrsPyHKqvUmD5Rija\nGZ5a7jjBVynFxJSYTCYrSxVjzKZc5ClWTu1zmUopyrJcWaKlMlZVhbU273fpegVSTDjviP2VWn7G\nGLNSVFKEFDFGIwqaZoH3nis7O6ToCMETg6dUglKazrXE0FFYgxGNlUjyDUpbNAa3aIguoAtFVRiU\n1hgl+M4jKVAVFokBa4WQIEl+DDGSEnjJVkuJELwjxEBKWWmbtsWHQF3XCHkB2qLIC9X7p/xOAN8v\nXO8dIWQr6EloJYTgiD5QqYr5dMbs6lUqrSjKkrqq2Lm8w/bhQyTz1DBFKbW6XqinV7VvhKLdD9wh\nIreQFeyHyaj6dSXGyGKxWJ2clHKMZoxZvb+0QEulK4oCY8wq9hARtNbUdb1abcv3lNKkmAg+rAAd\nY0yOuXrRWtMuFqyvjfMFCwElWdma+RwkIEQMERMD4hIqRRCFtdliWCWkFAi+IxIIrkWjKLSmsgaj\nQSSB9xA9RllSTBgk/18pjCic94g2OB9QgDWGmBIxQQS0FpxzlGWJtZau6wgxomOkaZqVZV9aahGh\nKGx//rIShxAgBaQ/h4ms1G3TMqoKXOcIzjEejogh8OUvf5ln3XH36vxeO7f53H/TLVpKyYvI3yMT\nDjXwcymlz36VzwDXVp9SalUxds7Rdd3q/8sTuAyQrwW52ZUaY+i6Lp8AsgIpEVK/TSCslFJrnV1I\nv5/gPYO6wnlP2ywQpWkWC5zrUCYi0WNiopAcNOuqIChBacEog9FCigoVE10XCTFhVGJQltSlpTAK\nCKgYCT6hlUK0xoWATpAQApB0thRVWdC6LgfoCbTSdF1H2zSItlSDmgSYwoLAZDZlNp2ilWY8Huf4\nbRm31SWDwQAfYt6X1gQf+4WbFa3rOoyF4WBIs1hw+oknOHBgg83tLaqy5NFHH+Xo0aNUVfWUBCXG\niDKGp+MnfENitJTSb5NrdX8uERHG4/FK0YwxzOdzOucoiiLHRH2ctfxbrqpVLEY+eU3ToJRiMBig\nROi6jmbRrLI/pM+0JFuFlNIqrhnUA4L3tE2D6zpCzCvcuw5SQIXAqBxwcHODzjlm
bkEXOroOxAht\n69EairIiOo+PEPokgRDRVrIFW6ZsMZJMvuBKKRDQSmGqEu08btGQvCeliBKd41Wgcw5UzhSX1ss5\nR9u2aGMYDoaMx2Mmk0nOTHUOPwaDAfNFw2y+WIUVMUaMyZly1+XQwjnHuK5wnePixYtc3rvCyVMn\nWRtvsLe3x3CYSSxPCUO+BTHa1yxKKTY2NnDOsbu7y2QywfTK5ZxbWR/n3Mp1LmOOZYYJ2TJOp9NV\nBqq1pm1b5vM5qrdiMcXVauy6bvX5GCN1ZdjZ2cF5B5IVJ/bW1juHSYm6LNnc2GDnyhX2Zg6fPD5m\nS5a8Qymh9Q27s5a281itKMsCa6AqC1IMaFF4E0kBGiASASGkBEmwRpAoVFWVoYwQ6HwO3L13xBBw\nPqK0WZ0H5xzW2pw1e8dsNmM+n7M2PpQVsV3kLLH/3TFGVH/OlrGw956iNDTzBWtVxaFDB9m5coVq\nWNE1DbImnDlzhq2trZUBWMZoy8V+PbkhFA1gZ28vWykRdFEwnU6JITwl/lqu3EXToEStVlbbNIgk\nog9oEhpF7ByNmzObTkgS6GKHFk1pLCpEalMzHFZ0LrsinxzJFTiXENE439F1M5CENQHLAHzLsNZE\nf5W6cFSFZTYDO6hxviHEQPSRdrKHUorKJGprUDiMLnAuW2HRhq6ZEXygKkb4KNkKiuBCRCVF8AHV\nNIyKkmA009iw6DLUo7RGGkdqWyRGUIImYbQgRELwtG2b3SeKq7sTtre30bbGlAmtG0Q8MUV8DEhI\nKKNR1qKTJkZh7joeO3eG8doAP3PMJ3s8ePlJjh07jrUqJ0YxUlUDIODj0wP/N4SixRiZzWZ476+5\nA7IrXLrJrutW8IdSinpY9y4vELyHFIg+4H1gIbPs+kKH812OfLwHEp0kvA/M4pyYIKUIEogSaBsH\nCDF5IGJtiffZbVqrKG1NYTXdYkHbduiUUDESm4bxuEIpxd7eHqYsVhZzbTBm0GfCKSW6riXEBDHH\naKHrMMZgtaLxnq5rsZIoiyIvrBhJEcqyJCmFuIBPoNCrmNX3VnoZVqyvrWNMDv7btsUYw/r6Or5P\nkq4B0qyseY6ThbbrQBJVbRkMxjSLOc4IKQlt23Fg6wAx5mSiLKscvyrTg+LXlxtC0ZDsJpbpsnOO\nrm3x7hocsR+MNMb2sURHipEYHHppuiUyn02x1qC1UBhFVRQs5pGYEq33eO8gGYyxiKJX1g4r2VW1\n84YQPHVVoGJEkmCjZ1iXDAoDrsOIYlAp6sbRuAYTDKN6RDCa2azBKM14NGRjULN75fIqvgoxK4UV\nQRuDwlJrTVnXjJSicR2LtqVxjrIsSE1LIGG0xsSECxEdwZTl6twE31ct+ostKoOqS+UbjUYMh0N2\nd3cJvZeAjPlm1+xX2KROfXCvNSA0jWM63eXKTXvsTSdcvbrHxsaMJe65xPCUenpVuiEUTYlQluUq\noF0q1f40uuhXOOSSUooB7xIpBSRGgpDLQjEyGFQYnbEnvMOWBi/gY6A0lhgdMQWS0ohAih68x8UG\nrQQVPSpFbIpYazBKUWhhVFlsSkQy1jYoKoxYZs0c5xuUNxwcD7ApQw91XWOUsLV+sMfqcinMGN2X\noxKgM2whINpQlhZrNWXXEX3KoHQfo4UY8UYjIaK1pWla4Fp5KfUWyhqLMRbv/QonnM/nGfTuY85s\n1TIe6Fx7TTFFKMqSq7sTPv2ZBzl16iQxaj75yc9w+x23M593eB9ommb13dYaYvyrYNF62Z8uL+uK\ny+dLa7esHKg+Q9OqhxqUEAWUJIwBLeB8R3IdxaiiGFQoJTRJUAth7gIxekLwiHfYlEixozSW9bUh\nRkFtDXVZoJViWNocpCdICow2lEXF1thgisPM5lOaRYMPnnI8ZG1tndFwCCQU2U0qrbHGrGqlTZ8x\nIoooiaQUBkjJoIjMvKOqLCZaaDp8SpQA4rG2
oOsciWt10dgH9svgfD+A3TQN3nvaPgTZXxmJMfWW\nTtDW0HRdtpIKdnenNIuOM2fO0XWBw4eP0Swca+slAN47RBKZwX59uSEUbQnYZmCxyIi/CMF7mqZZ\nZUlAr4ARRcrFX62wJrtJYiKlSFFZVIzoqkCVhvXxkOQ9RVEQ0VTFgiuLBZP5gkW7QHcdldEM6oLx\nqKIqCgZFweZ4xNqgpjCGsjQggnPZKmiVF4I1mqquMfZwn6n2FtnleEsbA+mplQjorVBKFGVJ6z0u\nBJLKiycmjXeR4DP8kiQDvUoJSgRJ14r/qj9nIQZSzAvTB4/2OSHYr3hN07BYLDArSEh6eIhrteMU\n0UqTEGKCS5d3WBuN2draZj5v+cQnPsmLXvRiutZRVbmE1nYNZaGvf4G5QRRNlGJtbW0VQEPOJOO+\neGIJa0CGBYXMMtAqwweltWgliAJbaprZHEPMClSWdCkxLAtms5bkPLFz+K5DxcT6cMjRjQ2OHNng\n4IEDGIFhWXD4wBYWIESKuiAhzBY5mNeiiL7DKqGoSoqy7K1Mi4gikZhOJoQYCRHoYYoYI8F7RCmM\nzu7OGINPkVnTsGgWOWj3PuOArgNRaG0o0TnG2xfAa60pTAZy6a2Udx6h61knHc45hsPhtfpk/7mU\nZKWwOemKpOhR5TXygus8u3tT1tfWMKZgOpnxyCOP8Kxn3Y73Aa0TznVYUz/tNb4hFE2JUFibFQv6\n4FJou2zRUoi5nMIS90koSZAiZWWxGrSKaBEKq0m+QxNxKWFtRdMmdChwC8VuLDl/9QoLtyC6lqFW\n3HJ4m6Prm4xqxYmtTdbXxxTWMhoOsEb3tBsNSRh3XX8MiuAcIiBaECVYrSmronfpitGgom1bnI+r\n7Nk5tyIIDErL3rxBWY2LAe9zhklyuKRwswWt91CUdKkDbUHVaGVxyaOtBSV0fWap+1Cjcx3OZ1wt\nSY4Fk0DTdbjg8QFMMhiTE4iiLGmbBkgYJQSfENMnCyrTii5cvEBZlBw+ss0Hf/8jVHXFxuaY7cNb\nlFWZ48+nCdNuCEVbUl7y0/SUwrBSCm0MRpucYcYIRAqrUSSIgcJWiEQ2N9ZYLBYoVTCft7Q+EKyh\n0YrpfMLk0gQXNSl5Kq2wuuLk9kGed/fd3HzsCAcOrLO2tkZhLUiisAak53gF+uJ1Bk4TGTqIIWCM\n6l2bYlU8i4miMH12ln/Tkp60P9nZtDUueJq2JVZgrEWaNsd7PtC6DmMtXUiEmCjsANEaYkQbvQop\nVq5UKTrv8CFgrKGwhti7VACtDSH4p8R2S7qR9L+1bTtiNGiVqUtlX/Iz1nLhwkWu7Fzi/gce4C1v\n+V7KsqJp5ih5+s69G0LRfAicPXv2Kdnl/nqk6ol5roc7lCTE5GK2axu0UjSLBYUxTJ1HVzUhJSbz\nOXttRxccjetIwOFqTF0obDLcdHibv/b853L77adYXx8zHKxll0Ls4ynV40uJFMNTMKcQArotSH2w\nn7zLtVWj0D3JMnrTF7ATTdNgjFm5r67raLsOqwy6zb8rZMoGhQ9YYxANrg0QA/T136KykAriIhLd\ntbh1GdwvM1DfY46gVlZ0eQzLxGqZAFz7LCSl8CHHhoU1aB2JIqtielkWzOZ7nD59mieeOE1dV31N\nNz7tNb4hFG0pyxW2op7AStFyTc5graUsDJUWIGIkEYNDEfFtx6Cu6CK4xmF8ZluIC+AD5aBExQWH\nNra47dhxXvTc53DP3c+iGJQEBVYP0MaiRIjR90rVs3J7Xhl9nKWVwpLjK0kB15BBWJOD69h1iM41\nwBhDj/+ZFcgK2S0t2g5lhKK0RDLWVxSG8WhAPR4wj54mdDgETIGPHZI0QkQRiSlgFJCuMVbYl312\nvasvioK6rvtSk1spvDHXSKI5adAsIZcQU09ZSsR+4Y/GAza3xpw9e44zZ85x+PBhtrcP9dzh68uN\noWjpGp15
[two base64-encoded "image/png" payloads omitted — matplotlib (v3.2.1) figure outputs from the notebook's "display_data" cells]
Z3NONI0GBT4F\nXLaoksWGrOj6SE5yHq1z7O3OuHhxk2PHjrK6unrJdb1CFEyhtbgWgRIO+spFEpRo244HHnyQc+fO\noZSiqRv6zrO765nN9wgxEJNi8B6UKjtZPmMYhmUAr40GpVHG4GOkGyTO0VpjnEUrhbGGummKSx4I\n3jP0EvjHGJcZndLiUhYu0Fq7dKNd22KtXbqhBcTRtR1GK0FADlg9BThrUcVlJR/I1kESa6iNpqkc\n4/GIpmqIMRGDJyXZoNPpBOsMFy9eBMCaGmMMfvCkkCBbctakNCy3hEAZiWEIQKIeWfZmM/zgaeqa\nvu8ZjZRYPx9ZXV3n6NGjtG1X4uFLr+wVoWALLEaCrYJVIQFqlhSAvZ1dLl64yMmHHmZra5vpdJUQ\nEilrdrY9MSlCglySGh8erwy6BLPGGHEHShG8J6ZUFGOhJIIDpZyX7w0h/DUcbKGsTdMACWvF0jnn\nqGtZ2PnebPkbF8cCEGMkxUzOUJe4MKXEEAJNXTMLoVg4TwyOpmmwVn5TjIG9nR3CKGC0LbQdzWQy\nKdBFTwhBFFZZrBvRdgFta7TWElPm/c0XY1yGEE1T4ZxsKB8Da8061tYY41Ap0DQNdV1z5MhRmmYE\ny3V7YrkiFExCkYy2jqwcSkFKEWUgBs9QUv+LF7a49977mc8HUIJ5tW1H77MohjUYowkxYY1ZApdp\nqUR26arImVAW3NYCbQzDQIiJVJQ8+UB2gRwTgr1mtAJnzYKPJRiXFkhAa0NVNzTNiFQWvm076lFN\nHnqBBWIkp4iPEaMVTgtsOoTEfN6hfI/VEaMzWgUImdoonDYYbUgp46Nn1u6gtCHHTOVqYgooI5so\nZk1Gk2IkzltCTDgnMaKKiaYZkxGlknOUsUZTVxVkTYoBpy3tvMNaS9/3oCJNben7jlOPPMyRI2sY\nZ4hL1vhXlytCwUBQbVJG2xKLkZYA6DBI7HX69Bn6bsC5ipRAK4WtKkxllnhVSqnEK5HgPUDJmPRy\nty0sU1VOelVVEtgvXKjW6IKcWy1xUUhhkdyW15hl5rj4jroZUbkKV9V0XYc2mvlsDx8GjFEQE05r\njBLLPK4czkCKAz55vB+oUqRxBm0SRoFWCYUlJQkfkgKrNUlKDgQSOUd2ZzsopTGuIsWMjqIw3vul\ni05JLG3Kma7rltkzOWGNpnIVaIPXEVC0bYdWGWcd9cRhnQC6h9bWCN7Tdz0rq5cm0F4hCgagRMmi\nx1iDNUrcRITZ3ozd3V1OP3qG2WxOVpocApWtqGvH4AU7WmBWCxe2cH9LduWBlNwUSGCheCEEnHM4\n5wCWMMPCjSxcibgehbV2GV+F4DFGY60pSP8eW1tbRN8T/AApYCsnLk+BsxqnNJXO6JLpxiGSBo+x\n0FQOY6UjJ8cspSutcChilpAhogrMIci8ypkQxeXnDMY6dJDfvshgtTGQM0Pfl00mvyMG2ZAxRbTS\nJZkQa51SINqIcpnNzQ6XM5PxGGsMuzs7HD6ycclVvUIUTKyGQBKBHCMpi39PMdLOWh568CSPPnqa\ntu0Lyt2gBo/3ib4AiAsFg6IgxS0ulA723eUiEAeWSuack8UuLtQYs6wxLlziws1CsWQKKmtoRsUt\n5kQ775jPZqACiohJGR0VOiWyoVg5i9WQcySERBgCpESlK0aVxVogJYbk0SRUBq1Ao8jFgukscZyA\ntcWtkyXrLmFGXTfL2IuC73VdtyxZLeq01jjJ2FMixkSKQT6zbE7fDdjGCOzhPZPRmL7rOPvY2Uuu\n7BWhYAvXk3LC6ExO4H3A+8jFCxf50j1f4sEHHqJtO7EoSTLLtu2JSXbmQrEE7FTLoD7l/VjjoOUx\nJUZbFJ8XRdxFnU+VzyImtNGorDDaPK4mmFJCaYUzGqOg7zoS0LYtPvRoHdEx4lxFow3RaoLOKKOx\nxmCVlMJiyKTkIcO4cowqR+U0KQVMyuSs0NYQ
YhKlUoqYAWXJKpKKMhktLn3wgWHosa6m1lLc1tbQ\nti17e3sYrVkZrZBzIoSIVpnxeCznNsu5y1E2oyITYiAPsL66Ts6ZR089yrHjhzkyOYIfhkuu7RWh\nYMDSQqQUiCHSdwM5UbCmWEojEWsdThl8LPU6V5HIUiAej5eKBdD1/dJiLazOwhotFNI5t1SyBVY2\nHo8B6NuWvsQqmbzc8Yu45uCta1v6fiDEyND3Ev8pT6M169MpI1ex28/oo0AEKlZ0KWCtIWSNj5ng\nA0YrVE6ChWXIRhGDIitIJDFNSqxurSvy4BnaTgBWLTXXnDN4qWn2fb/8vYvNMyrZYN/3WKtIMVBV\nFaDoZvPl+RNgWZNTJqRA3/WsTeTcnHvsLPN+zrXXH+Q0/HW5IhQss6j4I9mQUlij6bxna3OLU4+c\nousHYsoM3pcypaGqarJS5JSonKOqKgEz+57hwM462OUSQigFZQooWWgyOUuRt24kA43FnfStlFS0\nxmiFxpByFMpMsZB9SPRFmfthkAoAiZTBKs10XFNpzaxLJC/vyWogxQiDx0dx81YpKmeorJFMVQu+\n5VUgoSGIW86lpqaMlHtGVYUzht4HQopCGcqJWIDWlBIhCujrajlHbSu9wq4ZEQCl5VxIvJmW1YxF\nGUsrLbXMfqCqK7Z2tokE5huHLrm2V4SCKSFcobVCW02K0A89Z86c59Qjp3js7Dn25gNYh9WO3gsu\nlUtgjtaElOj29pZ4k/eeeMDSAEsFizFSNyOMscQQCH6QuC9EorbMd/fwfqDvZqQsAbBTjjCA1TUG\nQwye3g9S7E2i5CkHUgoS1JuE0TXjymDUACnhNKhoRDGMlML6rpPPJzGpa2onMEiKWW4pEWLAKItD\nY21FALoQIGUMYDQ4bVHAfBgw2pCtQoWELpvHKNC6GEBjMNbgrMOHiHUVrh7jvRfMq/DIUk7EJMqX\nU8b7xKzr2ZnPaZJBW8X21uYl1/aKULCDpRayIpdA+cEHH+S+++5nKCwD54R94KNYhyV4KdjBMsZS\nxaqRhUSX0oJ5ILCFtQ69hDQCEnBkYgzMZnsMfYvWYBQ4Zxh8JEbP4APkHq0sqEzKg7AnsiOEgRA8\nzmpUypisaLRi2lQYEJ6V0VQ64YeOSTPFmIrtocenjHaG9fGI0Hf4nEpdMIICp5RAMk1F1TREpTDD\ngI8RbYVI2YeAs5YaSNmjssYaKbXFlFhUSXLONKMao/crDNPpyjKTXmTJ4lAUMUWsFjUZCjDtXM28\nnTEMA2vr65dc2StEwURyIcZpbeh7wVn6foCSUeays6y1+CAxRSrp94IpsYAUFlZrEU/ssx0KUp8i\nwSdyiqicoNycdRiTIUVxWdaAz3gUIUdiFuKeUpkYPLnAFykMEDxGO2pjsFrTOM20Mlj5AVQjh1aW\nWdeiQs+4GqEnNcMgFYFRXbFaO6rKLflwxpYSmjJSm9UajGU0qpn3PX7wxJjJWkFIxJwJNkFKqEJs\nRIGmkAhUqXtq+7gkZ5EI7ZeQxP3GGLCFIWydJBAPPnSSw4cPMRt6Hjp56WkKV5yCiVXKnD17jq4f\nRNm6gaQsiSiEYaWIQYL0qqqo6nqJVC8UbEHOW4CriyBfYjAvGZrWGA1aK8gapTLOlYyw78kaxqMV\nmsmIgGJ38MyHQMwBYkTHgE4JTWbaWJyuaCrLqHY4oxlXjspqVE5oHFVVc3jVoaxhbyZueKwq9KRh\nbXVNuGOKZbZqrVQIQgjiqpQArVlrHBqlEp3KDD5hnEX1gZghAoSIUoUSlPdroQtZZNEL9u6CnhRD\nKJtTXPgCU3S1IaTEMJsxn8+oqpqubTl//uIl1/SKUrBFgdsPnpQy21vbBB9LUVgtMakYIz54jJGM\nqeu6fXjhAJeKnPHDQN/3S+aC0cJb10g84ozBOYsiQ7a42oiLayoaZ1ibTsTCaUvTD1yct+zM5/Tt\nnCoGmsqx
Mq2YjkZM6pr1lQmr4xG1s9SVk5pmEFBTIZBGMxqh7TXknJbH5r3UH1XZQIvS1mLTpZyJ\nIKyI4jJj1PRdIoVA1hqtMsYoTCxcFC3QilIaHwOxuN7gA1oJHRxYhhDz+XwJLovVLJWPlOiHiDNC\nE0IpLm7uMBmNpEJxCfm6FEwp9SAyayECIef8rUqpDWQGw43IQLQfzDlfOhI8ICkJxdf7AWslM2ya\nBlzNvO3xQ88weJTSaG1ouxZtDNPpdAkv+OI6+7Y9AH/sg6WyiFK/dM4xaiQL00ZhnaZv56Aqpk1N\nUzl0SsQkZYU0eOIwYIEja2tcc2SDq04cZmUyZlJXHD20Rq01KmdsXRFiohu8NGyEgNVQ1RVV3Ujd\nNARQiq6VTRLSIpsWTtjifx+DxJ1ZAu2+70leXHSMUSy6tjhb4WNGx0Qqn2WsERJAYd364Ev9N9H1\nHXWul0q2BJiXjSQFaE4BrYTCRIah9yg0k3LOn0i+ERbsRTnn8wf+/mngPTnnny8d3T8N/MtLf4Ts\nAq1k50U/0Hct05URs/mMFT2lG0Ah8QZZ4oO0pBsbnLVUBYlXC/wHCUxDCEvwVEpGklUZDXWlcQYq\nJ9BIzhFNxueMtrV8b4BZTJzf7Zl1HTpGjqxMuPO667nq8CGOHl3jyJENRk3NeNTgjCn9A5INLkh5\nsSygLlnk0poqTVwJZWNFKWiH/QaTlBLtEIhJCvRSaFDklNEx088GotZ4lYjagpYgPqaAtkZgE6Vw\nxSIuzolggMIU9iEQokAcFqF6q9I84r1Hq0zwpXxGJhvBAy9c/LvPIl+PTJ8B+HVk9ujXUDApdqeQ\nCD4sWQ6TyZTJpEObnlk7KxZIqNQHGRJKQfCerm2XOy6Wk6W0wlUOrXQpgwhy3dRW3EjOGK0ZNTU5\nCQkvhMR88Gg/EHxktjcjRWlImVjD+uo6d95yE8+84zauOnaUtbXVQtHZJ/zlnIXikykVhYQvpSyj\nFYq0LDWpDDF6Rr4hxrwsey0yX+89rlb4EGn7nphBl6K1VjN88ARjCNrik6euK4zT6JBQWUvAv4Br\nFCgELzPZYpXUNH2xpAuq+uIGBUdENmtKCWcNUUeMc0ss8Ynk61WwDLxTKZWBf19a0Y7nnE+X588g\n05v/RtK2LScfeoi9vRmPPXaOeTcwn3X4IF00obBHxWQnWJZuFH3fL2tsC3QepM62UC4BGzXOaGpn\nUTmhciLHIIG41oQEfgj0846+H+h8T8yZqXOsj8ccWz3CM267lec86xkcObpBMxpRVeOSTKTC8M7F\nWorlWgTKxlcA6JyIvi/dSYKUpyCsjhCEs3aQmj0MA0MQcmNC2LgZaKqKphE2SR8iIUNSgk9pY/DJ\nS/2yhEnSA5qXxygJVVoG/AJRsET+lTLLbBwlmbcP+7TsXKzipeTrVbC7c86nlFLHgHcppb548Mmc\ncy7K99fkYOPtiWNHmM/mfOGeL9K3PSnDzu4eW9t77OzOAU1MB+p/B7LDg0pXPnfpWigxljWWFCOm\nUHMsCavEveYUIXq6+QxjLK2P6JS4/tAGk7qm6zuMMxy76gg3XnstN19zLdeeOM7q2grZKYyraeop\nWhti9GI1VYn50gJTkoU1rhSXyYReQ5LF9t6To2LRNGtK4lHOYkHvh2JVHKkoWCKxvr7GsaQ4u73F\nTsm6ve8gaXIKAsMUixrKecnL0pko2aI2K6Wy5eotj3tBRBDANeK9dHhp4w+8/qvL16VgOefFmMez\nSqk/QGZSPKaUuirnfFopdRUyCfmrvXfZePu022/OX/ziF7n33q8wmazSznvOX9hkZ3dOzqaQ90RP\ntdbozPKxlHIh/ellEL8sbrOg1hhyoSSrkoFV1lI7S44BkoYYGPqOYejZmDTccuwqjk6nqByZrE25\n9rZbOHbiBCsTKQqbyuKaGqUMWmVSHIjBg8olu01L6rVWQgEa1CCNvoB10h
SbF5idBqIE5OLOFrBC\n3q9EJCk8a63ETSXHdNxwDM0QAyHvMkTwXSeAtVKlna7Ql4KX8EObfVJlysvvE6C1gNWa0n1e6Nul\nuysmCClKJ3je7yJ/IvlbK5hSaoJMRt4t91+OjP5+GzJS++fL///xa33WMAx84Z4v4IfAtdfexDCc\nI6bMeLKC94kQk2QtSixSzF6aIvIiaN8HVRdxS0oJVWjBRhv0gdeoQv7rQ4cz4IwwQJVPmOQ5tn6E\na48f4vjGIZppzcrhNY5sXEU9moCzmKaibhqM0hLHpUjICbTw0nzsgYzKGpUBLZtAZSBmqWWWgnWI\nacnLTzlj0YRSNrNWo5RBQlIrhfTgpUEZqEqQHoLi0HhMCJHNvTkpCnKPMagscaAqDbxWazCGYUkB\nZ1mbXZxHIQKoghPqsln3FU4YLbkkBQdnF/91+Xos2HHgD8pBWeA3c87vUEp9FPgdpdQ/RKYl/+DX\n+qBh8ExGI9auOsTVxw+TvGfr2AkeO7dJ1kLdyctYQBUMUJELJnOQsbmIW1CKUV2JG8ylKzz2WJWZ\n1A21VnQ+QAanHTpmGmepRqs849abufrYMQ4dWqeejhitTandSE547RgVK0bKVNYRo0Tq2ioS8v1G\nF75WioVaI5bJmdIHiSQXIURpeFFWXKApBMmU0M6VrnbQxpKynIveeEIsC2s0K+OKyCrKCP16O88Y\nfBACIYGQMzFlqYQsAVRfzouS7iMohMRUPIDCNg3aViQvrh9UweZkpkZG6D2Xkr+1guWc7wee/VUe\nvwC85L/ks1JK3HLLLRze2KDd3aF2Fmcsfe/BWJTRUoxOubRZSQFWs08rOdi1AxzoOwQ/eIyK2Dgw\nHVmOTGtUSmgGlIbVcc2qW2W1aViZNlx19AiTlSnrh4/QTKbYqsKWmMhog0oUnlTEo8hKk0qx2bhM\nWrTgeS8B/NJ6ZjQZ56wwH2IixlzqghFUwtgKpaW+uEhmVKk0GBNlDIK1VAn6YUDnTO00k8qSJiMp\nXaXIzt6MFAPJLDLqTMKQimUiZ4lLbSVF7ZipKrOkJMlwFMQ1akMKXlz+giAQIzHmJWnzieSKQPKN\nMZw4cRV91zPETNd7dmfzkoZLkDwMw3LqjXNWXAv7AOri/4P8+xgCfhmFCpth1NSsTicYoK4dwQ+M\nq4r11VWOHjrE2uqU9fVDHD52jOnqKsZVaGPIRoiK2kg8F1Mia0VWhhSk7iesUg2IJYilSF810rQ6\nxEDyEbCFPSoLJAZZ46xDG1u63A2KXIa8aGJ4PGHSBwFNFyxUYqIxjnFVsVJVRNeTEBB2KBYxhEQq\ng1KWQ2QQhm1y4IwtrBa9zCoXzcZ1VRcrBsZa+mEoVZT+kmt7RSiYc47dPcnibD1mrz3L1vYebduj\nfJTAubA/jbFSB+sHaVXzeTkw5K9Pa5RdKr2Q4l3JYLVifTplPU8hRUajhlHl2DiywfrqKuvrh5hM\nV1j56Cc58pbf4eKP/yh7d0xxW0UAACAASURBVN+NrUbLlD6nINhdFphBGkWslKAQynfwAo6u1LW0\n4aXAbL5bLIc02iZg6Hpyks4fIVYKCr8gWko/ABz93Je46e3v4bMveiH3Xn+NgKAhEkNCZ1HtWmsa\nq5lUDp8CUSeSFerNolyUyhAZU3o5dYm1bFGwBRlzsWkX53UBXyz6GYS18v+HriKlmLcdwxB45NRj\nnHz4NLO2I8RETgPT6YjaTpjPZozqivF0Qtd3BD8I66DEYYvFX7hJaW4V96CMlc5nrZhOxhzbOIRV\nGlJgNB4xWpmwvr7OuBqTMmxubnPDr/0mzclHWP+Vt3D6mXfhh56mbhiNRzTO0nUd0UujhCobQMbY\nZ3IW8DKEWKjJqZD/RCEmriLFiB9k9oTKoJx08kw+8nFu/I/v4J6Xfxfnbr95WS569tvexdrZ89z5\nrg/wuTd/37Ko70MErcnJS/uZNcTa0g
eL9xGvJKPVGohJXFwIZbNWhBjJKYmVWnDQSvVj6R3SPnlg\nUdeVGWKXlitCwbyXoLX3nt1Zx2w+oHUF2e/366XEbGG2Y8T3A13bYspkGtivnS04TbqAkpJuJilu\nV5am8N4bY9G54uaHT3HbO9/P515+N6dufxrTyQrTtVUeftMbuOZ3/4B7XvESTj3wAGQBJTc2Nrj2\nS1/hxG/+Hhd//M20d38HZEOOpTMqRxRx2XUdgii5gJcaVKEw9wN+GPC9zMOIMbK1tc0dv/92Vh47\nx81/9G4+/V+/Qfj71vKx73gOz/3zj/Ohb3sms/mMYfD03hOy4umPnuHFn/4i73n2HXzi0AohGUa+\nIoSMj9BHL13qKaC0Wo64UtpK6ajMsGjq6nHEzGVsG6OMUiiUqIU10/ryIvnfEJFBIj0PnnyEzZ05\nvQ+0fU+IkfFkBDmzs7ND23UkFCFu0Q/Cplj8wIOBPkiQn8p0QAEShd5ilRAJrdbUlUPlzK3v+DNW\nT5/l2e/5EJPXvJbRZMpoPMFffwOPvOKlqL09Du9us3nhPA8/dJL77/0Sd771P9Gcu8ChX/0NNr/l\nLpy1GCPxojKKFCN912JcTQyBjJJ2sRCIfiAOAzEm/OCZzWb0g6dtOx49c5b5s+7guz4R+OgLnstk\nPJFRAc2IcPgof/6CbyXExIlhYG9vj82dXc5tbvHdn/g8J3b2ePGnv8hHX/Q8QpY5Zs4YXGkyMUks\naNe2jMcTFn0IKKjL/LNF0L4oUQHFnZoCqgrkEqNMc9TqMmWR30jxQ+CBB09z+uxFtneFRBdSIKtE\nP3QMQ2Y2a4kJ5u1ASBKjaA11Uy8L327RhVzKIakg/JVy1NoysZFppRlVBlc5AsKtuv91r+Rp7/gz\ntv/Bm5lurGDrMW40hZDweYfRqMLqdYxx7OzOePSRR3jX02/jpZ/PnP3eVzB0c5RxjP/yExz+rd/n\n/JvfyNa3PROSZ1xPcVrTD4HkPb7vyV3Lyic/y7Vvezdfevl38dD1V3Nxd5dz5y4yzD0Xjxzh0R95\nIysrIw6trklSoitSTOR2jtIRZTTGaqq6wijDO++8hZd94V7e+8zbyVljbI2pPCpKFroYkamVUF+q\nqiIV92eUpqnrJT1o0dK3SJr6vkerVABbUVIBcMGHy4eDfcMkJ2mF2p7NSdlhjAwQCX5gb2+XnBLG\nKOraYpxj3vaEkGgaR13bwl/fN+f7qL7Cao0hMaosG2sT1tdWGTUy00ppzdGjR6nvuouTr38t1klR\nXKGlV9EHwhCFUR08ldUcP3oEFQPnK8tv3nEz1111mON7OwSlueEtv8PokdMcecvvc/7Zt2NdVbAr\nj7SnBYaup93Z4Wn/7zsYnznHjf/pPXzkza/n/OYmFy5eYKWesjKdMB7VjEcjmfllpA4oTbWJGAZx\n+SlRWcP62hoXn/tMfuGGa9idz/BdR4bCMjFlao8SoFfQaaGel5m2MnRPxi4slOpx0xZZgK/pcXM+\nDlYZnkiuiCt9pJzY29tjd3eXmALOSYq+JARaw3hUcfTIOkcOrzFqDOOxw7jEvN1jb7bHfD4vCyng\na1g2fgyMnObExhrHNtZ59mMX+f5f/X2u/8oDHD68wZGjR/eL4xmslfkMe7tz2rknRZjttXTtnO2t\ni4Sh49ChVa677hqOHj0MOdLNZ8x2t7n35Xczu/YEZ3/oNYxHDdOVNVCGrm8ZfMfQtwxdy/bODh99\nwbO4eHSDv/i2ZzKbtzLNpmqYTkdFKTT1qCGkxO58LvToFDFOFCYnUdgURekm4zHr69K32A8Dzzpz\ngX/155/kW85dLJRwiUNzXiQb+wOL9+NWvZwStOhyX8AiizhssZF1iXcvW6noGymLVNx7jxmkwydH\n6RH0XoLL48ePs76+xt5sxmwkGNmsjLhMsbAByJJ6m8UOMzTGsLEy4vihFY5trPLid3+QjXMXecZ7\nP8
iXX/WyJc3aOivMTR9ph4E+gFaGdtbR957Q9/gQBAeLYJxjZW2VlJMM7lWai896Ovfc/XzqcUOt\nNFW9go6RmD2kiMoR3/XM5i0Pn9jg09/3ErR1WG0ZNWPqaixUm6oioJgPAWMdtWuIpa0up4jViqwt\nymZUiKQkHU5KKdZWV8kKXveVz3P93pwfPHma9915s1QWSleRXrSiPa7DXVTl4KipfWatgeQIXtyj\ntRZT16Jwl7FU9A0UAexyTrSzPZkI6By2TEU2WgbWDv2ANZb11XV29/aIw57UGUtzQypkPin0KbJP\nWKupNdjsmdSOz7/8u3jO+z7CmTe9rkx79sUdeIaY8SGTbcXeMLCzc5G9nR362Q47Fx9j6ObCHasr\njm5sMBlNyNHLgCll0JUjaYvPCothUk+oNMy7bYa2gywWYNbO2Z7NRHmUQeck962jahoyit0+0OaW\n1UMjuiGzvXWRbi7fP2lqaYMryq610JG896jCGHnb7dfz+q+c5A9uvKbMiTWSjRpNZH8AjF0W16XO\nm9L+uKv9OR6WVAgCKRWnFyV2c5eZD/YNEWMs4/GIPgb8EHHGsDKdSO9hJztub3fObK/FuoqYMvN5\nL11BKJTbhyaWQGgWlHzUOI4eWufooVU21lZob7yBz77sJdTOYZPADv0wEELHTtvS25rz23s8cvos\nXdfTOMe506foZtv08xlWa1anE3ZnLVedOMHa6gpKSSucspasHaYaga1IylBVFjUovO8JQWCJtu3o\nQ8CiiLHDuZq6GRETbO62rB/aYDyZsjObE+YDKcODJ08Tfc+4aRg5Rw6BtekUQ6JSgrmllJYThb50\n3VX862OHmXcDtJ2UebRMT/QHcK7FNGxjDFUlE6UXIOoCcJVGaCvTfEpTyCIOu6x0nW+UCN/IyXTl\n0YTxZEJV13Qpy4gA8rKLKLY9KSvabsBHaek3KkgZCZb1xxiClIaqmkNrqxxZP8TKdIW6GWHLgLgF\ne3WIkc5HLmzv8pWHPs/m7pwX7nV8+zvfye897Q5OH1olDTOed/osP3LyUd7+9Nt5UGtyzPiuZ9xU\n6NVVlDGomBjlTE2GPOD+7INc9+9/lUff9Ep2b7mB+TAw7z1D52mTJ6OZTDU+GWZD4Kpb7mLj+An6\noUPFi8yHjvvuu5edvV28Hwi+Z1zXGDKHD63TWMPUOlLsUdmjCeQ0LEc9CXqfH+f2QFrvFufeukqO\nXVvGkwl918m5Li6wqiqyVmhlhB2rtTRJFwbIJdf2MunMf5GklPE+ktnncp0/f4HzFy5IFhYivfcM\nBU9KQCxuxTkZ7S2lmjJ0LQhLwmgjZaBarlLR1DWulvldsQS6Xdextb3Dw6ce5TOf/wIXtnY5v7nD\nt/zxn3Dt5iav/9wX2J51tCHzwydPc/3ujO+9515msxntbJfZzjbtbF76D2WxrDEkP+B9x/SXf536\ngZOc+J23E0Ni3ve0fUcYZNZD2/bszecMMXLi6mtY2zjC6XPnOL+1w8bHP8aP/eL/zU333sfmbM7t\nDzzE//mXn+CW+x7k9IUL3Pfww9x/6jSnN7dpQ6IPkcEHNGJxmrrGlRm0iwF51hhRrpxKDBUL69cw\n+P3OrEXCtJgr2w+CS+ZCb48x7Vu2S8gVoWAytMMQkyySZDOOphmhtMGH/UEebdfSdZ1MYV7U0cpJ\nOTh7AihzuwzNqGY6nS4vheIqAV77vmd3d4/773+Az3z2M1zY3OLc1pyBmrfe9RweOXyUT3zfD3Dj\n7Xdx7PrbeN93fw+njxzh4y95ERtr6+QQ6OdzQgyMJ1Mm0xVcKY7rMt9i87/5IfqbruPM97+aoe8Z\nhn7Ztd11PZnMxqFD3HTjjRw9ssG50w/y8EP3ok3mtZ/6JNdvb/EPH32U1cMn+MfnLnBb2/ETZ89T\njVbY2m05de4C9595jAuzlnlI9D6Rk0ElI4NiSrMLZVKO0bqAo8Lq
cNawYOFLX0O3dI+LOKxtWyEi\nDDIfJBf2YSqzQi4lV4SLlDkiwpdHG4ayE4XYVgJoo3HIlTacdRgrlu+vYjUHC7S2skwmIyaTMU1T\nl2v0xJJFCf+p61rOnz/Po48+yta8Z3LsRp71vGdx5vqb+T9e+CJGTc32fV8i+I4v3v40Hrz9dm48\ntsFV5x+j392iriom0xVW19YYTSak4Au9WGKe+Quex+Zdt7KzeZHu9FlyXNQFZW9Pp1NOnDhBzpnH\nzpxhdzbnrjtv56Y7ns4HXvJyXvjOd/CBV30vj9zzJX7lplv4iZMP8ofPuIsTV53A+8jW9ibnd/aw\nRnH1xiojbdDayQyL2KM0JYOUOmzQCmukW9w5CdDb+RxThsn44B9XboNFzdGilJSKnLFlunWQASuX\nkCtCwWQYiVycQGuZBbG9s7vPVI0Ja6CuHGV2ETnKLImDFf6Dhe4QPMlJ17arbAEslXDL8oL9INOT\nvfd829YOb7j3IT589U3YW2/mni/fx7nNLe576D7OnT3NbTddy8QqXMpYJW1dTzu/ybd+6GPc97rX\nwLOeI1MFtUKbTMwBnTLERAqRMHiGvifFuGSirqyssL62zubmJjfc+yCv/+Rnue9NbyTd8TRG64f4\n2K238eET1/LQo2eY7e7xl1cfZ+c772br4nnGleWlXcsPPfQwv3z8GB9rLd3cUU9GxJzQORJjkP5M\no8vNYBJYowU/i9J1JZOv2+XsisVUxGWvJJTwRWYr5pix2qLQ+7HvE8jXdJFKqV9RSp1VSn3uwGMb\nSql3KaW+Uv4/VB5XSql/p+Qqt59RSn3L30TBlFI0Tc10MiFn6LoOpTR1Gai7urbG+tr6cv7EotPm\nr7JYFwomdbSwbMIwRgt8kfeHoYC43Bgjx44d44dOnubG2ZwX/8UHaTcf484vfYb/8a2/xg9X8IJn\nP52XbF/gH/36r3HXA/exc+Ecvu94zn/+CCuPnuG2P/5TnKuW1y/KZRpVjJHmL/6Sm/75z7H2yc/L\nMZZkpWlk3IH30sr2kk9+hmPnL/CMt7+Dbj6wdf4iz3v4Yf7pr/0S39vucPfzv427nnYHzz35AD/3\np3/CM+67lzfffz+3ti0/dfYch6cTGqupK4OtNEP05FyG0hmNKbfSvyFofvCkKOzbGAO+tMotRr67\nMhJLSGdlvocWSEgucSNdSV+XggG/hlzd/qAsmmtvA95T/gZ4FXBbuf0k8It/EwULMXL6zDm0cXKl\niRRxywshGGIMDF5GP7Zdt7+rivVazmc90IyQShJQVbZw7hW5sGON3ke1x+MRN99yM/e/8XU8duQw\nf3LXnXzuU5/gO9/7bq46f4673/tuJpXj7g98kCOPneNb3/ef8fOWUT3iK695BbPrruWRH3yTlHFS\nLHPBAjEFck5svOV3GZ08xQ1vf+/SfTvnaGopBVWu4tChDb782tewdc3VfO7Vr+HihYvc9+Uv89w/\n+kNOnH2MV3zw/dxy9XHWxxUv//CHuG5zk9d+5jP8xrVXc/94xB/eeiNHDq0zHY9o6orJeMSoqSTh\nsE6ub2QdlbVU1mJLj4L3nr15W3ot85IfFoJcCW4BZVRVjdJaZqQ56UDqvaftOwb/dbrInPMHlFI3\n/pWHn6i59vXA/5MlKPqwUmp90WF0yS9RikRF35csMoqFSSngXEXfd1LuIDP0AwlQSvAraekqzRyp\nxA5G87Ldlp++7wwfOLrKqH4mGEuyNTFGKt/j25aUPNPVFdYPH8O/6mo+9vwXMBk8d3U9nw2R5r1/\nxn1vfCM33XgT97zm1dTvejdf/J7v5vprrme8MmV+x9P43KtfiXM1Tity9qg8kHO52hpw/kffxOH/\n8LucfOV/hTbSRe0K0VApC8pgleHMM+5i9sLvpJqscFRr1lbHfP41r+LOt/0R73nuszl78iv0Q8fv\n33oDr+1b3nrL9Txw0w38m2fd
Sa1gxWh0TmiiuD4v58lHoStVWuMUGJWwgMWSssy5GFLGKo2tKqIf\nHrdxjTE0TUPMMg8EP2CskUF6JXH4uhTsCeSJmmuf6Eq3l1SwRQG3733pL8ylO6gD5hIzUayakzFG\nPkQkMlDL4H45Ax/457tb3DEMTD99D596w8tK5iSccm3kKh71aIIyNa5pcBGOHTksdJa+h6tfyr0/\n8H04Y7lxGAhHXsGnXnQ3+MjxusY4R7bCQmhGI0HpQyyDgOW4lIL2O76VB55zBzsXL2IfPlVQdznG\nRbYZgyf6HqOgdgZjLZVV+Je+iE/e/ULM9jbP7aUeOr/xRt76gm8nhMAtyMjR1HdYo2nqihg8RilS\nCswGmR+h6YgJ6pjoY8CajNZRWtW8x7iavmvLhMdKiAZhP74FqCpHJkv9E5m/Zgoedin5uoP8SzXX\nXkrU4654W7MyrumGLJMMlcI6t6TbKIAgQ+CcKVesiDL3KrM//2uRVqeU+N9XD/Ev97b58+c+g2Ml\nTlMx7GNnpkIpA8aVVixonEyc0UnS8dhptKtorCXVFUEVCjYy2KKpa0ajmqoSXr5fsocVZLXsvVdl\n+JvWWsKZLO1fOXqsUsShI3Qzkm9RaYJOWYr9RuFGFal34MBbzdwZ+roS6GDoiclgqolkpkox5ExV\nOVzlsG1L8oE4BHo1HAj2NS+a7/CPL1zk55qa9zXjwu2XOuN4PF4SDvevlCKXC8zaFX6/FLwvV6no\niZprv+aVbhdysPH22muuys991tM5c/YMX77vJL4Emk4bqlGD955mukrXtsKmLEwCSKRCFFjgYIvx\nTe9fW+OTVx3npddfywmll9CEKmwL7SrpUEpynR6tDaO6ZlSvsDqd0vsBkIZTGcKmUM7S5UhWMJ40\nTEZjoSFnCagX8yYUGlWs5eQjH2H913+bh17zYh45vEZMkSEEfIyAFPRT7Bn6Pdq9LZq6IlnL4hpC\nWmsmtSX1kb5vmW9tlmODkS2ES5UK/z+Wq9c2co0A1aKRYXqkKA0exlDZzH+3ucUdIfAzbeZ9a5Gc\nkGsoLeaSHWgFtNaSxW4JNVxrrMnLqZCXQ8GeqLn2bcA/UUr9NvB8YPtrxl/Ijh43jptvuIa+ben6\nnosXLrKyusLRo0fZ292jHo85d+6CXKgpCBc9Legn7E+RXuw4aUyo5OJSZcAICLqPMmRlMFYKvHUl\nqHbp25f6obWluVSG6cYyfHiyOqVqaiajMSbB0LUymzUGYUwgLIwFneXQb/wu9UOPcN3b3sWXf+pH\nZXDx4PExoCkuLg74fs58ZxOtNHVT41xFCDKmKqVI8gOKwHRSM1ENKMqAlIFJ05BHMPgobX7a0nYz\nclxcrkZjFzcjDR6/eOQwP3X+PD9fCe9Ooem6FqUVdV0/jh0sxyHnSIbHqNLh9dUabR4vX1PBlFK/\nhQT0R5RSjwD/qijWV2uu/WPg1cC9wBz4B1/r8+U7YHPzPNdddxXPe85dhBj4ylfuBeDQ+go59tSj\nhq3SmxhLO7xcZk8Rwj5JzlrLynTKysoKaWiXE26EMSCWLMUk6XvJsEL2KOQKGUkZstJo49BWo2PG\nuhqXg4xdaiqZ+1Qug6cVhBzJObK4gLsuJRRtHTs//iOs/up/4LHXv5y6rmmaCtuaZUdO8ANeK3Ll\nShYaMKpCEUhhKLMrAsF7tNGsrE6XQ3xzllFXiyuAtP3A7P9r782DLLvu+77POecub+nXy8x093TP\nisEOEuAGcgiCIrXY1OKIWmhFtlRR7KiilGK74orKiiT/48RSJbYTOZWKS2UnrnJZpbVCKWJISpYI\nSRRIASAAAsS+DDAbMHuvb733niV//M69/QbLACQBYsiaU9XF5utGz3v3/u45v+W7TCKJVmlsKQx4\npXZY2kYJZfZLM13+rNPi4tZmU7WDj65zsRGtxaDM2gqFFfXO2CBW8capbzYHCyH83df50avItb
F6\n/Adv9Ddf9SYSw9LSHFlmWFpaAgwez6mXzrA1GlMGzWi7LwyZqgIUrVZH5ISCJU1ErFc61gnFZIyr\nSrp5gi0D3ieRHe3RPqCtx5hKNC7iwNw3w19QeLLoHut9IKAxyuyo/zWSTEKp0x5A41WUofKBYCAE\nzeSuu9l697sY9jdINy7S7mZkQ1Bjh7Vlw+S2VkgXwVYYFcgTQ9brSCGgFD4YtDbRlAtQsrNUxQQ/\nKkSPwnmSECjLCTrmdwqNNgaTpeBE7DhThkwbxlVFYhKcdVROiMBCULGiNq01LniUNtjKCkQn5mNS\nBb+Je/v1BsPbsbTSzPS6TCZjzp07R94SdGZ/VHDh4jrKpKxfOM9oOIqUeU1QoLUhz5MGbpLEr2Ii\n/o8Tb6X7rEzUHyvxKkUiy+JtJIso6fHsOO2CtEl2AHUhGiAEv3NhG0mWmLhD3Cm1jvJJQoh1sbA0\nSpNrgS/rutnpXNPEVIqm+Qmyu5oY6Ma00NpEXLzDByd+QkBQiqoqGY/GlJVtNCNEy1/yJJHh9Bgd\nfaC0VO/TMSK6ZRF+Hql+9UhNG0PwIqfpfWjGT+YNdrCrYtidZRlzs7PRQUN2h9nZWQ4dOsTy8rJo\nqOooHxRE3cXVEpFhx63Deb+Di7IVR9fX+e8//ycsPPK48BOtJ1iPr6wQS+qeW7z5Ru9wLKfZ4jWN\nPkQX3p3vo7Gp2kGD1keR0gqValr3P8ChX/gVel99DBUUiU7RccSilMFaJ2BJLxY6+ICrxO2WyNwG\npDHsnRB+veR8wYlv0KSYRP00QZJUlY3D6YLK7niOS6qA7Ni+zl+lCAi2kjlqlCvw3uGdbXwEqHOt\nsMPgUuqNumBXSYCFIASEVqvF6uoqy0tLzPR67Nm9m/mF+aiELA61aZaTZjlZ3iJvtdFmx+zKxeCS\nuWTgH126xIHtbW750z8X8dqgCNbjKrkRIQZXbd2nUE2BMO2/LWgN0YStnXmJv1t3uHcCUwoNa6Xy\n3PObv0vr5GlW/vDzkWQhoErvd3KcpnWGHNHBO1xVRZl00WH1rsJbwYO5ssBVBc6KZJQwwGWXqqK6\nUFmVcZwjgsrxSkue6T2VLam1zLRSKO9RkVwjhYDsZsFLkDXgQ10bl9Xggren0foWL8WRI0ficSQg\nwCpqm45HYzY3N6MJQCBErFOat0ApxqMRxaRsEJla9n6UUvzr3bv5pe0tnrnrKG0fCM7LjqW8aILF\n4NJT5AWlL69Id55UUQ2slW8AgqoDJDQB7qyLLCiDto5z//knWfytT/PCJz7O+nafzcGAIhJWtRLy\nhgAfIwOdEHeqegepycRmR4vDxV2sKkWdUe3ILlVlRVlVjXp1Hez1zo6SIPF4nJc2STvPUKkiMSkm\nS1DGxHsgAVh/AXG47SMzyV3mO/5a66oIMKUVs3Oz2MoyHI5QWrO5vs7xE6d58cWTjEZjTJKSBE0I\nqvEscs4LGTdevCx6PXrvMUnCX+/Zw88f2s/3LO7mfUVJok3s50iZrVHNMVRXmUTY9bRSjzCVamET\nkTyqxeGc84S4axSTCWUhx0meZehxweb1h3n6v/17XFq7xIUzZ+gXI4EiRVezWpq9HhwL39BRG0N4\nbyFoOcYg9rQsxLZI8C4qDkUgYLzpRVk0BYf3bmq3kVyytlwQwnCKDgqjDN7EaYeSUVDpfZSbkoox\nKJr8UQRUvg12MIX0p6zyzMzOUJSW4akR40lJnnfQaoz3JUobstxQFGVjHCDaqgiIzwexcrGObp6i\njKAE+pMxpfV0vMB1iEl93WxvnMyQvCgoI1MEFWUko6qWi0m/oDSiul+QY6cYDDh/5gynT52mcJ7r\nb76FXMN4e8jW5jZrG9ucW99kYzySo16LUUOCigEqtn9zs21yJUcyAZQPBOuoXCGaXEFyNYVu3rxv\ncskQr4O0XCrrROwkXuQaCBC853uGE37u0ib/+3yPv+wZUp
2QaAUGfLBI7ezBO3kktUEFJQ8UAldK\n0wSrvx3ET+JKsoyAoxpPGI3HoDRGp4xHJWBIEt0gVoUTKPO04ITpbL1HewXeRdntCtfOGZUFZWnx\nucfiMN5BMKgABpqGoVbRKTbJ5G94CEraGASH89KPspWVY1BpjJGqcbi9zaljx3j44Uc4v9Xn3PoW\nN113AB1FhTc2+5w8dZZzW5uMXUU7Szm4d4ml+VlCZSkKS7vdobK+KSZC7LVZXzX5Xw1PkpmmtCFq\ntzitRCTYxkmB9eLp5KIyIfFB8S7w82vb3FRZ/vFmny/OdMhbKe00w3kRHIaAUVItRnRTM/7yIUgv\nMTdvqA92VST5sFMsVZUcCYPBiI31DdbX1yhL0U4FIvGzru7qkVH8G17QFiKim4o5VJpSjMeMx2Nx\nywi+caatocM1Nh2kcemrUgIsBlawVpLuosSVVePtXYMXK1sx97Un+dSn/5S7vWb34qKgPlygqiqG\nkzFb21uMRgNRmB5PuPPSBr/60JPc9vIFnBWN/LIUb6ZiUjAZF5RFKbmZ8yJkFwGAdV5V49+aahdp\ne/ggVXQtFhyC/Pe1i0gInt9YmOWZNOH/mBMJK++stD4anTVDZR2V803qAKHBk1VVSVFMcO7bATJd\nY8KtZ9AfMxpPWFtb44UXjzEeiw6YSXK8qyiKiTC+E4NCZJFUTMC11rjKkiSZsKNVoNNqxWPOY1JD\nrc8san2x7xPzHO80O8s01gAAIABJREFU3iKGoEkifpBRBMSWVZPwJ0kiLmdKEYJCq4Qjn/tzupc2\n+FSrzfs/+YMc+swfc2Jxmcd3z3FxbQOH5+PW8RPn1vnt5d381Pk1Do0KfvD5kzz0nhtI8w6dbke8\nHCeCfLVZKuahWSZQb3O5kqPg4GTsJJ7bXgR6VRQAdvWDiKReSnZFFNw3P8u9s11Ka1HWUlUlaSpK\nRUFJbjkpSoqyot3JgSBensQcTCvKsnjDALs6drBYpleV5dLFdR588GFOnDiB0ZrlpSXarYy5uZ7M\nFZFucp6JF1BidNMPkwpJKrvEGFpJwl0bm/zTLz7AwWMvRgN1R+WtSHxbkfmuE2p8hBqXBeV4RDEa\nMBn2mQwHovdQWRKtBbRnJFjr8v3CT/0448P7Wf+Zn+D6z/8pvZfOcOTz93DdyZf5+5/9Ah8pLT91\nYYNDown/5fo2X3zvuzk71+OeW69jz+5dLC8v0m635MhzAVs5ikIUd4bDEePJ+DKEQxFNVxsfoxBi\nN95RGz9MV8GCRjUEhJGljSZPU9qJuMNBlLpCCS4szmGTVCQ9rRMTMQkoIYzo2Gy90roqdrD6eFxb\nW+PBBx/izNlzLC/vpdXuAJrRZExVlgwHg8YNLMlzaWgiuHtrJT8yQKE9c50Wuxdm+cl772Pfdp/O\nn3+JR7/rwzFnkQal1gqDx5AQtMfF3UtyGrBOHF6ts+iQYFLdDM2njyelNOOPfpiT332X5Ck6Zc9v\n/j6nf+hv8q7/97PMrm3yiWde4KHv/QjZvQ/y8J3v4fz+VX775iNoLHvzDJ22pFJOTCPsZmPDdQKY\nRONndmQtGycQ6+TmeyHfuiCGE9YJukJKKDkea0eGQEBpSJURHyYLI1sRCkVVeVCa/mCE0oY8azUV\npwRTvZMpsjRlbrbH6bOvbwx/VQQYwHBUcO78RYzRfOQjdxGU4uzZ88zOzXP6pdNMJr6RerTONoaj\nQMTBCxgOpRiNKsbtNq1sgT+54xZ++MnneO5vfBRHwNR9qyDzt1DFkl0jPaxQRpHCwGQ8ZjQeE4BW\n1o2ARguVIhgdm5QadIJJpRINIdD/8Ie4ePvtTLa2KGzFdZ//U07/yPfDLTfw5Q++l8GoYG48wQTx\nqgwKnDYiq6liXhfnhsbIgLqsoo5XnsdkP8QpQCXN26pqIOPex1EPtZhO2JmAIb2/1BhSbTBBIOtJ\n8GgtvcTtwZCyqkT4WC
tctaMHK5MWeX8z3Rn2rqzw2NPPv+59vUoCTPPgI89w/NSLfOK7j7Jr126O\nnzjFoL/FzEybudkZtLaMx9I8XJjpkmUpW5sbBF9J9agNWaYJHiqn2BwXBFvx3N5F/t0N+7nu5utZ\nTA2pTwTJiUOhsSGgqkijN57KO0JpKQdjNtY20FnK7OJuockpi3MirmtCKn5BAEmG9gnKe4IK2GDR\nJoF2yvC7jvLE3R8kTRJmvUelGa41QI0TQmHBBWwIjdk7paVylomrIEAepLpTOmEwnDApKvK8FQfg\nxCPVCik59qWC8yRonFZiDoGwqFwsjHIjvuBBKazS6KogNwmpSdE6QzGh8g6MRkesm48NYa2jk673\nlDbQHwyveGevigBzznP8xRdZPbBCt9sVwqzW5HnOcDAgSVKCF6x4miTs27dKlqWUxZjhZBQNN8Xd\nrKwKvIfBYMB2v8/K7DK2ctGMIWrUx0oyxK2/hgaHOC4ZbQ9Yu3CJJMuY37ML0rQZAWljMFp2CbyW\nhmnU3KpHh1ohOY5K8cGJmJ5XGJOR5R3a1qKriqqUHl4IitJ7yQ2DxwVPGYksE23wrTZplsaBfQAt\nlXJlXazmSmEnBdEEM0lCcF7QGiFEQ3sTBegURzf7/PSZi/zWgRW+NNPF+0ASK+/COWGeO09AKnQl\n7TBxLYk0t3pu+0bqOldFkj+ZjMgSxZHDBwSfZR3DKGO+trYe2dKaVrtFp9um1cqZm5vl4MEDzM3O\nkiYaW1VUZRGT4IqqKrm0tc3CnmWWlvbSztux+4wo77ximO2i7muxOeTS2fOEAAuLi+gsx0HUro9t\nDe93HNGcxQaLi4FUeYvDg4mcS+si4SRBZeITmUUlHaH2p4IzQzwepTUQk3bvKcqK0aTA+UCSZBiT\nsr6+ibWeJMtBmSbvct41zCAfq55aQ0LpOBLTmp8+c5HrRhN++vQ5EYhpt2i1WgRgUlVMKktQIpPp\nrBdI01SxAAIDd87hqm+DUZGzlv0ri8zNtKMPNxRFia0so9GY8aRkYdcCSZqRZ0kEG4pM0fLyEiZL\nOX/+AlVpY5M1yggozcLiMqtzXdBOYDL1BVc7gVU/ib5yTPpDiqJkz8oKeadDVffLIo7eOY8KQu5Q\nxuDRWGI3W0UAjxPEhke8tNO2yAkk2mCcxylD0AqrApXylMpR4bB4yspSOEvpHCiNTsSrvIxD61Yr\nJ0kzhsMxnU67GbQ777EhkLdSAUQiuqrBhZ12BeKS9vuHVvnJU+f4vcP7yPJcZqzA1mDE+vY2k6rE\nK3FTS6LxRFDyYNUmriYGW/UGs8hvlHj7z5RSLyulHo1fPzT1s1+OxNtnlVLf/2YCLEkMu+d7pImK\nHWsRRKsqS5rm0W5YNU/a+sYa6+vrKK1YXVlpBD3SNCHLU9LEYG3FoKg4c/6SyGGWlWAgdD283RFa\nq3ezsizpD4fk3S7t2R6ltfjKUw0KynHR+D3W6A3vBUasnMdUFjWc4C5u0vrcF7jxH/0TFu77iuQ3\nwaPHY+z5i5Qnz9D7wr188H/+N8zf/wiTwRBbicaY4PVF6KUoK8rK7Qz404zOzAwmzWi12kyiO26S\npqC17J62itgtjU5MgwqpycBScRseXd7NL37g3Ty4az4iKqL9n5WmsPUBG4Tv4CP6pC71jTFCgE6F\nAD0Yjq58b9/E/f8PwP8J/MdXvP6vQwj/6/QLSqnbgL8DvAtYBb6glLophHDFgVWWZexfXRH/RkyE\nh8igVikxgN/e2sQkCZOJYzQacejgQYaDPseOHWM8HrO8vAyIzvza2jrGaNa3tnjwkUdY0LC0NEun\n127QoFPvuQk25x2jqiA1LVSSMBqNKUcFiQuQJSStDJMk1NiLzpcfYOW3/x/OfvI/4+yhQxQbW7it\nER/6vU/T3dxg9Q8+w4nFeeZfOsu+P/4CX73+ek63evytr9zH3GjETffcy18VE1S3xa1r
m9z92LP8\n5Xtu45mDKxSx71SgqVKB7uiYl/oQyPOc0WhErzcTh/NSDRI/T9NUjYFVP1RGiTpRZS2lrQAtfa6y\nYmt7m9I6XE3x05pgA6bJXWUyURoxORXExjfZyX8d4u3rrR8BfjeEUADHlVLHEIu/+678b0CW5wzG\nfZJMgoQgT48tHa2sy8bGGrNzPaqipCwmnDj5InNzs+w/sI/9Jmc4HHLhwkVG4y0mhQUl7mRHTr7E\np55+lmc++YNUy8vig62kq6+NkSFwnZtF3l9LK2wxYTToM94ekGIwrZwWgRSFqSzFxhbX/9+/SfvS\nGrv/42/zxQ+8H6007TTjmeuu58ZnnuL+Xbs5c/+jfPCBrzA3HHHr1oD7913PPZ15PmYdfzYzy6Vn\nT1Glmp859TJLRcHdD36NB3tdQpZgg2dSFZSqFK8grZifn8d5j8lS1i9s0p7pECI0OlGaVElbI6hA\n8AqUwSQeFSIKxCtsiFAcbTB5i6ASqsmEQekALa0SH6hNLGIlAi6QJjnOCVjT+Sl47+usbyYH+4dK\nqZ8BHgJ+IYjx+z7g/qnfqYm3V1zBB8qyIm/lIosdoL/dZ3NjE60zgZtYYeEcOniQ3bt3sbZ+iZle\nD+sC585fZGNjg7W1DUaj0RSAT/Nz586xUpV07/kiD3/vx0hVhcah4lEieCwXq1Dodrt0u90G0Wmt\nZearj/Dev36IBz74Hh7bsxszHKO3R6zPznG0P+BLM11OHn+BkAhAUfVHHLYV29sbtB9fw0wmrGnF\nZzs5uy+9zN2jIf/fTIcnCEz6fcYq8Jt5wt9xlt/JM1545hi+nZK0WyTGkCuNUbuYn5uVYy9JSNKU\noGA8mVDZCJGudyujGj2w+joEL4hXgpiIpSZgg1D8yqBZ2+4zKGqUiFxApZUct1qTZUmjClQ27mzT\nEPPXXt9ogP0G8M+Rk/mfA/8b8F99PX9gmni7a36Wra0tegszDAYjKuspi4I8T0ElKB1EYyJLmZ2b\nQyeGM+fPceLkaRZ27WEwGLC2tka/P6RmetcVz68vLPDPnOX8T/xtQUwYHfmLNM3aHSuahG5vhqzV\nEl2HLGduIeXOBx5l/tIGH/niA3xAa+5dXubSrj28PDPL7xzMKIOjlxjQioPr6/zoufOkwN/Y6pNZ\ny6zzbBrN+kKP//rkWRa852/3B6zNL1BZz2aoeL6V8Iu9Ns4mFMMJd6yt87ODEfd123x0XPJX77kF\ne/0RWq0WOknkqFaKfr8vXX4vrRdfN1+UiianUk2GoLCeaM9j6aSaYSnYssoq1jY2RU0yXot4jxpE\n70x3ppHbnG5P1MqTr7e+oQALIZyfCpT/C/hs/L/fEPF2dXlPuHjxIrO7enQ6HU6ePI02MLcwSwiK\n9kzOmZcFUFdUwrqe6c1x6vRptrZHFEVZ/01qo/hanvsvez3+yXvfw9G5HivrayRzbVppBJBBFGSL\nsOhck+gMjCFttTDDEZ1uh3M/8SOkf/THtEdjFtbW+US6wZdufz/GOjKjSbWh3Wrjg+fWz/8RKeC0\n5uUbb+am554F68hNwvv27KV15hIUBVmS8L7FVYaTkrPjIee1ZS04tqzodv3c+gbXlZaDdkDmA53n\nTvCF7/u4yArkeeQlOMrS0u62G9hpQOA0QUmw+YjlJyjeu7bNJ587xW/uXeDYdau0uy3Wh2NOnV9n\nY6tPwDSaYbVaUX1dt7a30EoxNzdHr9drPNLfiFn0DQXYKwRNfgyoK8zPAL+tlPp1JMm/EfjKm/mb\nk/GE7e1t1jf6bPe3Wdg1x9b2gDRLmWnNsL3dYVJYJpMxG1t9tvtDvJOLOhgMmEyK6KWTxQLBY71n\nOBrz0Fe/yoXTx/nUD3wPy7P7BGx4GV1Bvk8SgWJXQXCvrU4XoxNGH/4Qz3/8o/Tuf4iV3/0Dznzk\nI5hej2Krj0sT8j27SDsdsiRl7fu+l+TLX+b00aO4
3cuc7nTZ9+QTnDi4H5WkPH/jjRw5cYKnV/ZS\nKU/fVWzYkk3tGCeatJOR9zp8JjvEj596madXl7njwhoP33XnZULH9fDbR6tDGXcJ4lSG/gYSh6r9\nuVD82LHTHByN+YlTJX8/SejMzHFpu8/Z85cIQZhGaUQFTwsAg+xUjsBgMIiSTvKzJL1yCH2jxNvv\nVkq9N96ZE8B/AxBCeFIp9fvAU4AF/sEbVZAAaZJSWYHFrK6usrpvH1VV8fQzT9Nqt1haXgaVcPbs\nBS6urzEYjAlB4YNiNBRxlE6njfdQFIXAqp2LHfGSgS05HyrOnTvPLQdXLrtwQINOVUaJGG4E1OXt\nNsormV+i2Dp6JxvvvZ3J+gar9z3I4S/+JU/f/m62koTJ2hadNGOwsMT5H/0Uhbc4b7h48DBnVvdh\nRwO2h32GqyscW9zNxtoFhsMN1ivLRq44Mi758XPrfObgKo+2ujw+u8CJWw6x0OnxVEtkmXq+xrBN\n+ZJrRWmrBpLjvW+0VGsUq1Ia5QK/f90BPvnccf4n4NjZbZJkTOkqrBfKWmkrikI13gF1QMv3MsOc\ndlprt9ssLy1y5szrk/e/UeLtv7/C7/8a8Gtv9Henl0mEF1kD29qdDjPGsLS0zHA0ZDKZMDvX49Tp\nlwSAWDjG40oE3lo5nbZhPJ6IFJHWOOsEAq205CzOEpRhIzZRZ9sZ03bmKrKKQEXWuID0vBNJKKWF\nV4k2mHYLMzfDdX/1ZXqX1rjtyae4//bb6G9s0S+GJJUkvjpLsVZTliJenJgUZ3ImhWVj2GeoNKMU\ntrXCtXM+9dIZDowLfuTlczxxeEnkCfKU3IAJkQDipMEbYsNX5KDEzQ2IbrQi0OtdaPp1MevkgV3z\n/If9ezl5bgPrgKqW2BQNDjG52pFeqJnyxpjI1SRqm7XI85R9+1a47bZbue/+B1733l4VoyKt4YYb\nDjE/vxDFetvkeZteb4HxqKKVdZntzZK3cgieQX+L8WibVAdmuxkLc21So6KpQooPFpUg2PsIJS5J\nObs+pKiET6hD9O4JQmcjki+sLXFVgapKKCtRvqkKDIpMaRIfyFLDyR/9frZXl3jyEx+lnO8Qlucp\n5zvMnDnNRz73OdInn+DU1jmSU89x9It/Rvnyi5xUE9JzJ/nxp55m2VpGe3ZRLPTwM20+e+NBTvc6\n/PGtR+h0uvRaXVo6RYdINtGayjmGoxGj8ZjReCQm8UWJLSzBKVpZh+AUOhg0RualClyQqWLlAv3B\nOJJyHSLIKL5FLgR8UA0kG2gSemMMJtHkrYx2u0O71WJ+rkc714xHG1e8t1fFqEgpRd5qEVSKjorR\nxhjm5+fYu3evwE+sZnZmNqolw8LcHGVVkaWGEIT0oJWmqKrIYRRdd2WlB/SxrW1+5d4vcXpxlup7\nj5Ig23+IAC+liGZQPvaAaiPOAKZG3IqplVaG4u6jPHHnHYzHE8JoQsg0tFLu+PTT7Nre5j3HjvHw\nrQf58JdOsDQYcvTkKR69/aN89NHHWBmN+f7TZ3j68D6y1KDShOfmZvj1I/tJEkOuFEaJjqrFErIM\nUI20pfOO/qDPYDAQ6p1JSCM2rqanOWcjVofYcNVUzjMpy3hs7lz76f/dMbqiOR6TqOKTJmINba1j\nPBzx0rjPpbWLV7y3V0WAoWS+FVQKidgfF4XM2hYW5imKkuMnTnPm5TMM+kP2798nxuhpxmg0wDvH\nqFfBsGQ4LgTGomqOoVSTv7C9yfW2pPcnf8Fz3/PhqYvI1BAbfGXj8al2iLmAs2UkTsg8UylNlrUx\nJiNJW7KjtC1P/cD3cNuffpGvfdcHWdm9l8c/8iGSBx7mq0ffy67ZXdx/9D3c/eAT3HvHLbR6HZLg\nI4Mp3mQniXnA46MtTlBQViXBgq4M47KgPxhQWivYLi1IDVTA+ejs5p3QF6PsuAeKylI6L3zOEK5w\nO1QTXPX8UfIv
hzEWozTDUUGawkyvc8Vbe3UEGJJA+jipD8FHe7mEmZkuRTFhYX6Bhy49gq0qsjSh\n1+uyubmFUrBv/34Wdu3lKw89ugPJwUcspwTFv5pd4Bf7G/zF9Ye4ubLNQNwYgbJ4H/UbbDV1ZDaZ\ns9CPpMGE/B8FQbrenSxBBxkYV3d/hAfufD/ee5bLQLF7kb/60J04HIt4RnsW+dz73sN4NGa+KuW/\niYhU7zzFaIzHN0RipTSTsqSyFpOI8XhRFvSHg+i9baIQXNL09gI7/T3R8Yh8h9GIyvvIWPJNINWj\nMrn2NXpVN697H8jyTJr5NqATmWnOzHSYn1u44n29agIMoC6FpKsu5gytVkaeZzg3YHZmlj2372LX\nrnnOnjvD4UP7OX/+As88/Rza5DgX0EZw+2iRLyLIBb1nZpa/mpnhzm6XQ0VFK5bXNVdQutxeINOO\n5sIqpbG2IkvzpkqDWkIAQKGMIg1SJhhjmOn05CdBFHCKyUTwZogeRFkVDNsDJoW8XkSOZ1VVpEFj\nnWs66LXVsXOOBI/znn4sfAJxeK8jpDqKokju5dHCi+L28xf4W8+f4l+0cx7xnoARCnGo04OdIKuP\nyvr76Z0sMWnTN9RawAgXLnw7HJE1IVSFRpvKGIO1Iqbb681w5qXH2Nrc5Kabb2D/vn3M9roU5YTE\nGMqyYLu/RbfTwXovZgIavBKAoA8KF6QdNJqUbA+G9Np5Y8DpXMTzm6iEiIuMaB+DXaFTjTZJfH9e\n8FkmRRlDFQmWKkCWZmSp9IkmNVKvKPHRStmWBWboaKUZiTbCDfDQMgk+bzNOiqjebCPezIFW0TZQ\nLAXl5+ISp0gasbsa6eGiOk5wIhLzyRdOc2A05h8WBf8+Si4QIUiv3MVqz4F61Y1rlMb5yMjSGpyn\nqt5wFHl1VJEECIijqgoiQBm8GAVIeuRAlywu9Wi1NT5U7F1ZZn19nbNnz7Kyupebbj5CmnoMFQRh\nfCdG8PLaGFJtUdrz0mafp06eY1x6bCQwOFugg0M7T/BE5ZsaziMAxtpEQUerG6VinhQEntzNW4KX\nTzKCSahQBBfABqHlB4UJGu0UyoMKKsoHJCRpBso0nfiyFFkEYVo5iolAoidVyWg8ipJPYguTpgl5\nKwPlqapo92JdJAdXVN7xe0cO8Wye8atZi8xk6BAuC8ppVGoDKKyrRyMIWWnVBKy3wj1QEDAYnV7x\n1l41O9gOjCY2BlEN9co6x2233cKTTz7F3r1LzM7O45xne7vP+QvnQRucg7KciHialiFuMSkwadpg\nvbz3XNosefz5Y1y3b5EDe/eQqshiNgnWSgBJ/0c1u2iDG4vVmVJJ0xsLLpAqjUa651IAqMbG2CFa\nGlbJzmLLEh+hSAKujH2t+B6L+D59ZGOHEPAuULoyegHIDpKmKXmek+c5SVQ79GGn0eqcBL8Nmntn\ne/zGnj1c2OwLfCfU1/rVa3r3AhpHFK0UpbVyrbSKvBgvOhlXWFfFDrZTz9RSRjsQ3VoSc3l5L3v3\n7qXf7zOZTEiSlMFgwMrKKt3uDM57Op0OIQTa7TZzs7OvyidqAN659Q06Dz7KB//HX2fuwSdRXi6e\nndrvp4+O+EdwUS+rkcuMAm5aq0jdj0dGEPsWpcJlnEtpltqonuN3jrP4mWtZhFoWqmYNuLCje1YH\nV5Zl0Wtbji/raqETabw656l8oLCeze0+g2is4L2Np+Pl+Vd9nepV72r1jiaOJJUQTIoJIHr/rTy7\n4r29Onaw6aCiTvQv9zhMk4QjR47w/PPPs7a2xsWLz7K8vEwIcPb8BZSC8XiMtY7J1ha2cq/a/uu/\nd6k/4EN/8WXmhyOu/9w9PPCuG1GJoA+qcLmxvFI7TrDeVqhEEbwhKNnJjCj5RukU+SwqUWIxiAcn\nqtQqfjZNENx7Ee3xQohUf5FYstE/iBCkkWqFluaiu1wjUxW/DwhCoox2h9Y5ge
[... base64-encoded PNG output omitted ...]\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "uzUWmaHR95LA", + "colab_type": "text" + }, + "source": [ + "## Next we will iterate through the dataset\n", + "\n", + "Let's put this all together to create a dataset with composed\n", + "transforms.\n", + "To summarize, every time this dataset is sampled:\n", + "\n", + "- An image is read from the file on the fly\n", + "- Transforms are applied on the read image\n", + "- Since one of the transforms is random, data is augmentated on\n", + " sampling\n", + "\n", + "We can iterate over the created dataset with a ``for i in range``\n", + "loop as before.\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "YMJTEVFr-Al_", + "colab_type": "code", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 85 + }, + "outputId": "77055c6c-58a9-4681-b85b-d34771b708aa" + }, + "source": [ + "transformed_dataset = FaceLandmarksDataset(csv_file='faces/face_landmarks.csv',\n", + " root_dir='faces/',\n", + " transform=transforms.Compose([\n", + " Rescale(256),\n", + " RandomCrop(224),\n", + " ToTensor()\n", + " ]))\n", + "\n", + "for i in range(len(transformed_dataset)):\n", + " sample = transformed_dataset[i]\n", + "\n", + " print(i, sample['image'].size(), sample['landmarks'].size())\n", + "\n", + " if i == 3:\n", + " break" + ], + "execution_count": 18, + "outputs": [ + { + "output_type": "stream", + "text": [ + "0 torch.Size([3, 224, 224]) torch.Size([68, 2])\n", + "1 torch.Size([3, 224, 224]) torch.Size([68, 2])\n", + "2 torch.Size([3, 224, 224]) torch.Size([68, 2])\n", + "3 torch.Size([3, 224, 224]) torch.Size([68, 2])\n" + ], + "name": "stdout" + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "bs1tezbj-V5T", + "colab_type": "text" + }, + "source": [ + "## Part 3: The Dataloader" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "afwB59js-g5h", + "colab_type": "text" + }, + "source": [ + "By operating on the 
dataset directly with a simple ``for`` loop, we lose out on a number of\n", + "features. In particular, we are missing out on:\n", + "\n", + "- Batching the data\n", + "- Shuffling the data\n", + "- Loading the data in parallel using ``multiprocessing`` workers\n", + "\n", + "``torch.utils.data.DataLoader`` is an iterator that provides all of these\n", + "features. The parameters used below should be self-explanatory. One\n", + "parameter of interest is ``collate_fn``, which lets you specify exactly how\n", + "the samples are batched. However, the default collate works fine for most\n", + "use cases.\n", + "\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "BOoe2F5s-1L5", + "colab_type": "code", + "colab": { + "base_uri": "https://localhost:8080/", + "height": 199 + }, + "outputId": "60b24803-8271-4c08-d1d6-193e88cb5c96" + }, + "source": [ + "dataloader = DataLoader(transformed_dataset, batch_size=4,\n", + "                        shuffle=True, num_workers=4)\n", + "\n", + "\n", + "# Helper function to show a batch\n", + "def show_landmarks_batch(sample_batched):\n", + "    \"\"\"Show image with landmarks for a batch of samples.\"\"\"\n", + "    images_batch, landmarks_batch = \\\n", + "            sample_batched['image'], sample_batched['landmarks']\n", + "    batch_size = len(images_batch)\n", + "    im_size = images_batch.size(2)\n", + "\n", + "    grid = utils.make_grid(images_batch)\n", + "    plt.imshow(grid.numpy().transpose((1, 2, 0)))\n", + "\n", + "    for i in range(batch_size):\n", + "        plt.scatter(landmarks_batch[i, :, 0].numpy() + i * im_size,\n", + "                    landmarks_batch[i, :, 1].numpy(),\n", + "                    s=10, marker='.', c='r')\n", + "\n", + "    plt.title('Batch from dataloader')\n", + "\n", + "for i_batch, sample_batched in enumerate(dataloader):\n", + "    print(i_batch, sample_batched['image'].size(),\n", + "          sample_batched['landmarks'].size())\n", + "\n", + "    # observe 4th batch and stop.\n", + "    if i_batch == 3:\n", + "        plt.figure()\n", + "        
show_landmarks_batch(sample_batched)\n", + "        plt.axis('off')\n", + "        plt.ioff()\n", + "        plt.show()\n", + "        break" + ], + "execution_count": 19, + "outputs": [ + { + "output_type": "stream", + "text": [ + "0 torch.Size([4, 3, 224, 224]) torch.Size([4, 68, 2])\n", + "1 torch.Size([4, 3, 224, 224]) torch.Size([4, 68, 2])\n", + "2 torch.Size([4, 3, 224, 224]) torch.Size([4, 68, 2])\n", + "3 torch.Size([4, 3, 224, 224]) torch.Size([4, 68, 2])\n" + ], + "name": "stdout" + }, + { + "output_type": "display_data", + "data": { + "image/png": "[... base64-encoded PNG output omitted ...]
JHqBITF85xGz2jY3mpTKhfZ2tzmxRNrvOF4mbfeJ2ht\n7bB8vs9sucJMKaDgCaYmK6Tk1MYDikGJ5eU2F88t0WspxusuhSBPckpGcGR2grXtdXS3x+ziHlrl\nPsqP2H1wPyKIEFLR31gllIaezomLFV72inkqosjaxWtsLN1g4fAi65uKcr3IxPg0t9/1OpaXuzTX\nL3JgzySl+iI7/WU8NYZJl/E8Rdbvc+PZU9x5rER9OqafZ5hihCwXKJTrNJtP0tlqUGnHkHjOgWst\nSoXIIAQZYYxweEdlsXGGKWrqB2fZfds+0uXnkdKxRj3fIwwCWr0uc7OLLC9dprGzARaKfolSbQLh\nKbQUtIwmsBlb9/8l1c8/wkq5xKKSPFIPaPb6/Fgf1koRv/70dU69dpLwzj3kUZnmj7yX8P6/hg+/\nj7mKz/TEOC2hENryfWevM97sMttLkVnO9/U1NioTmiaUijRrVV64+3b2To6DFJQr48SeIvYCysUK\n+z/9IMGVa7zJGP76roMEgcQL3fMSFSN0EEC3QbG+i02bkgiDkRahhSOqYRADgI9hYEvm2xbcOI6p\nVWv4UYSVsLGzyXprhzxNB/Q06Ha6PPDAA9x7770sLCzgeR69Xo8DBw4MFkMuPWJ4XU+SZCSPGi6W\nhhbc4Wx4qFyI45jLly9z+fLlUSGz1jI+Pk4URfwbrfn9IKCSpuw5eZLt6WkWPvUpF7rIdzY6qefx\n4osvcvC3fot4dZXNiQn+Wggqp06xZ8+ekfZ3CBUfGxsDGB0WQ+3x9vY2vV6P6elpzp07x8bGBu12\ne2TkKJVK5HnO5OTkyJVmreXixYusrq5y6dIl93obw+zsLBMTE6PXwH1t4pQFuPGZ7yukDV0CQ5oP\nYDEaTwSDBWlCnmd8+qb9fPCFi3z51bdRm5qkXKziqwBrOg5IniWMP32GVzz2Ig8fnafRS/jAi8vc\nP1Nka6aK6vZ4yuasvGYfExN1vFYXz1fUp3fjxxV0blg+PMeJ6p2srq7T3NzEDmzGgRIUQ4W20Gr3\nafcSVAaVQkRxLKI6XsBYQS+J6fZSTk6EfKs65wYEmcFqOLDZ4h0XNvmbI9OcnimR6hyjtdP1Cre/\nskKTZoZUK1JrsXjOpYa7uSrpIrvCwAernU3/uy26Fmg0dhCecw+RmUFCgMIMpCqPP/4tdjZWOXJr\nhdaVa+TUCETOba+oEgiLbltKYzP0iz5NvUNpUlHXhu112FzpkTUE+3fHyDgglIZ6pUB97iCUS+hS\ngcL4BKpaJ6zWESrAaEOhPYc0GcU8I1UexdoExyo1VuoldrqbfOvERXj+IuNzJfYc3ktpbDdfeuSb\nzBYT9u4ewxYy1raadLpw8y17qRY3OPPsMjR6jGcRNSG4sL5OOLeI9T08IqQssLG0SjwfYlvKbStD\n5UIhDaB88NwVzBqL9rvIQoztdjl4/BYqT63S72cjr7uQmlavS61c5x1vehsXz5/h1OnniKIQrRSt\nfoqUBhMHhOU6ux56Gn+jQa3VQ2jD96sqaRgR2XXu3O6jtKb29DmufeB97EjJ5SP7yP/ktxmvlKhe\nO8fcYpeNNKe1sc7pe27nFY+dpHf7TVSfPUvz+++jn2aMf+5LnH/Dq7i2d4FEKGoCkAFTMxNsrVyj\nXqkShCErH3wXC3/1JcS//gVel2wjN65A1qff7xH5Ci+OsMKZl6VSKO10w9JTWKvJrEUa4xb7gNEa\nMQCDD/kEajD7NtLZZ/u9PonRg8QOCAKfhYUFVldXmZiY+Hba8WAptWfPHjY3N0eOsqHDTGvN9vb2\nyP576NCh0Xz44sWLnDx5klOnTo1MCy4RIRhpXVdXV/m6EPxbIZhbX+fNjz7Kf7jpJtI4JhooJ17a\n6UyeP8/uEyewjQZb1SqfPHSIiYkJisXiCIIzRDs2m81R1x6GIQsLC+R5ThRFI6avtZbjx4+P3GdD\n6dgwY
mjI0K3Vamxvb9Nut9nY2CAIglEHPzxoGo0Gzz//PGtraxyYmEQ4kTdYZ4AKAqcYEiLH6JQ0\nzQczYt9dp3XOMwszIOAHvvEMjxULbL1iF0prxp94it2fe5CvHJjkppPXmOqk3HniMkjB7m7Ke5YM\nH6lEjHkBFS9gc6OBtXDzdo/XnV3n6jtKtCcX0SbBCIHGkOqcRqtDZiXS8xFGk8Y+xzY6/Oj5Tf52\nscz56XEOrna474nLfOnINC8u1p181rMUC4o4xzllU40Rkred22BXO+X7zqzx/FTRjbq8gftuYJl3\n1mAXjGmtG0dgc3zpkoyldM2EzoyzLP9TbMDCQmvdJc162pILg/AEPhIvA9U2nH/iCq97+TgL4zlJ\nXuDvn9qm17dUqxn7pgJUpUCCJUlahF5Od2uTkrUsTkbUvZipyXHmd9epTNRQUeAcW/U6qj6O9iMI\nYigUnTRrYOsTqorOHF4uCmNy4aMmY+bvDInGQy4+cZnzL1zA7GS0Vxp88xsPsdlqEvoBTzx7Bu/M\nC+w/cohjB/fT7W2ztd1wXdWO40fESrFrukheiRFaYrKU6q4JlpcuM91oEexE2LiGLQoSZVBC42kH\nILeehFw4gbWnkNUCRauYP7qLM802/SQjimKEhVB57NmzyH13HaWV3MvHP3k/p06d5OMf+xjFUomx\nqRrvfN978QoC8S9/AvPRP0ff/XLko0/TfPfrae7swF//A2sHFpi9vELzA2+jMlZne3ub6IkTTP/t\nHyF+4SOM3bKH/f0+a089S95ucW6myvqPvJWZ+VlmfuDtGCPodHqsHTlImmkq2pBJQSfPEGGBPOug\ne04A7ylL485bmfmpn8avT/CW5jaPf+7jXHn2LIGIiMOA6sQ4plCjJy2B7yMsznGkPIxx3nYnJnNY\nx34/QUmF8jyE0LRabYRQVOpjxIUYT3rOx++lzmEnIIoijh07BjBK8Q3DcMTRrVQqLC4u0mg0OH36\nNNvb26Ptf5Ikoxnwo48+ypkzZ1heXv5/GBOGeWVDkM1Qi2qM4ffGx/m1VotymnLnygqP/PAP86aP\nfQz1EkeSBbaOHuX47/8+Kk25XK1yYvduSoOEiGGcz1CNMDQydLtdDh8+/B1W4SGrNooiCoUCe/fu\nHR0M5XJ5pFUe6naHwZQHDx6kVCqxtbXF5OQk6+vrFAoFarXaSyLtHX8D424cxmo0jmPgyGmCVGms\nzQYz0wENRLibygdOnWeh0SZ4+HH+5o7bsDpjz+e/TH2jyd3tLv9lPOInPJ+TbzjOZrdD+K0X+OLh\nWd7QavOB00v8z111zlV9NtYbvPLUKhM9Tfjlb/Hc8VeRpZocgRcG+HGMpkWrkzoVhdH0211+9vwO\n8z3N913v8BuLM7zp2g3mOylvPLPOX+aKwBNEvmSyVqBakA7N2OvS7Sf8j8UK77nS5G8OjjvnoxjY\ntbVxv6txzjOMGI0epBJ4CEqRouD7lIouRRljkUiub3b+76X0/3vRxUDecR8YdI70PGehyywBlqLw\nmS2X+d7X7sbLrnPi2RucvJCgvICZiR6xTalPReReQq+3w/baKtUwZLpQZ36xjjxWIZyewB8vYQOD\nVAVkLpCFEiauYIMIERSQQjk5iQDleRjj4N5WgggifBVBAHgeUwcPMDG1h33H9rN6+QIvnDxLsr7F\npOdhMrhytUGtALvGtuhVC2x2dlhdW6HV7UFiWN6SHMuLRGWPLPQJCwV0L2fPLft4bv0qrbUW45Uq\nYizFCzxsIQQVIHKJ1UM2aM4gkgnrKyhJSjNlkk3nfDFG4HsBk2N1ji2Ms/b1hzCPvcD/8c2TfPmm\nvbC4m5tefpSWZ+mvLmEChX75EfhvH8XzFI3/5b201jfR/Yz1N76eKIhoBwGe7xNMjlGyOUf+9h/w\nX7yA/fcfw/vkbzFTr7E4Oc6S0Og0o58ltDsdmnHB0bliDxkpvNRgjE/
P5JClpFbTXNvE9zwynC03\n1Ra8ECsExUqVSLsFabfTodHrUd27gC5OYvoST4oBqlOhtcFXLiUAcKF/1qVDZFbjeQ4SnqUdOq0u\nje0G9bExemmfPNPozBUUjaM6DWVf3W535LYa5n8No8bBFejZ2VmazeZ3RKI/88wzbG1tuQZj4Dga\nFl3P80Yox+HsudvtMjs7SxzH/JXW/Is8Z3JlhVs/9zn+88/8DN7P/Ayv/sxn8FstRLeLrlQobG2h\n0pS+EPxmocDp06e5/fbb2b17N5OTkyNNsNZ6lGFmjKFUKiGlpNPpjDrcoRRsa2trBL+JoojNzU1W\nV1cpl8uMjY2NZHILCwtorUf6XCEEu3btGplWhgGdURQNgIWua0MPtLhiqFSwgwMgAeF4BNoYtyIV\nlk/etMgPvnCZZ153pwPY+ILHX3mQm7/2NF86vItDUuOd2yDqNiklGSbLUP0eP3R5nYVE8/7lBr8Y\n1FAK/nZ3nXet9Vl5x30gQxAeQnkEUUQYR0jPp5N0uHW7w4ebCf9zrswXZsu8c6XDA7snmNl7M4/E\nC7zma9/gs7fs59j1a/zUcpM/qRU5MVZksugxPl6mEAeoQHNquswLc1XCwMcDsjRzAZoD1rVUits2\n+rzj/A5/MlbiszZAG42vLIVQUYsV42VJIYwwmWFro4F6CUDn/3/RFRDHMZ4vaTR2sLnFZg5irqTg\n8JEF3vnO11ArXefvH+xxdSmiVHE37asr21SiOby6j+e5N3GyOk3FK1EOS4TlGmqyhpgbRxegK1Ni\n6/LITBRBFGCDAKMkUgRYa9Amdaes8pF+7BYzWmAszp7oh4jqGHmUUK9HjO2bYHJhjF0nTrN2Y4O1\npqbTdw9N3snoNNtkOqFY9RibidjY7tLNUhIrUEGJoFpHFF30c1yOCIMq559dpVydpDCXI3ODpxQm\nDDBBiNQCX6dYI0gHibXKWDIpaOcJmY5pdzrUyxUqlSJxHFLwPDqXrrD4xUfwt1u8tp+w8ZM/xFSp\nyESguHjpPEGxDFiKYwv4xSKFuk9YKCOyDN1LMWmGCkNEuUQrSSj4PunP/694f/xnmJ//ZwhtyJOE\n0JcUPcmOtmQS2p0OxWKJIAoR0kdaEL5GJ9Z1bJ0uSbPlBOvCAXSsEejcInznMNOeizyJZcB2dwMt\nwYvrZKUKpt8ZbI9djpQDgFt8XAEexqII4WQ4LpPLdVK5NrT7XTYbO6R5RjdJSLN8hIUcktfiOMb3\n/RGzYNiRhmGIUmrEWBhaZsfGxjh79uxo7juUVA1Te4fur2Ge2XAmLISLN4rjmDiO2dzc5Fd8n1/0\nfR4DPvCbv8mZAwdIkwRbLEKpRLyyQrNc5moU8a/TlCeEIG63eeSRR3jqqac4ePAg99xzD7OzsyPV\nxBBSMzExMZrZDrkP4Lr6IfRmyFoYdsOtVgvP8yiXy6RpyqVLlwiCYGSIuHDhgsMvDhx2Qyh6q9XC\nkxJhJcKKwdjHOtqW0eTGkGQJvaSHlRlRGDuTRA4YzVO7pjk5v8DdNx+iZhLCqED3TW/kS8cOMjk5\nyxt+5+NUGz3ueupFsJZ6q8cPXN3iibv2ET55mYfnq0xNlAkDxWoc8/B7XsbU7iPOkSZDPL+MHyV4\n/jahL/Ak/FgzYV+qefdym8/PFBFAwRcUw4BiXEJYd4v6ma0ee5Ocn9pp86OVgOsbKeuNHlO1iOl6\njKcUyvdAQD9NkAgX4YMLs5TS410XGyx0M37CtPnMWMWlAytLpCDyIPIUlShGRVCNBE8stf8JRXdQ\neN3yzGJyEEYgjCII4MihGSbmI9avNXn2dJONzYAczcteMYlOck6cWePGTocDh8cpeAG1QpXZ8jy5\nMuSFAOtLRJ6B9vFliLS+E2OnBqkMQhg3yJaggtCBv3WKUNIZCazntqm4GBwjFEKGGGWQniaol5m9\naT/VepGt1TVOPXOFa5c3KAUC09f
0231KYwWmpmt0OymNKxk9Y8mlT1weQ1aq9HWP1LRIWjtkScaN\nKwmz53bYt3c3tHIoZoggR/guzll3W5i0hzY5Ki4gbUAgLGG9gt1wjpbMaGeqEM7JElSqLL32dsoX\nVkk+/CEmdu8m8p1ba867ic3VNcZKVcJ6FUoRsifwhHKQeaEhlogwpuMHZEnO5OQMwc13YKpV5L/9\nbcyH301vuuqsttYyXigzXh93agGhyLuJ87BbS56mNDa3MdqQZRqbO3G6ltKZQQaLG/mlr6J+53cx\nv/RvoBDQBZK8j5CSUm2chl9FRhZ/kORrjGPzSmnR2o0RrMlHUT0D3DBGahKjyXROlmus7ZOmOXlm\nBvM0gdGSJHFUsPX1ddI0ZWdnh+np6RFjV0o5klcNU3NrtRobGxs8++yzFIvFUUf7UjgMMNK6Dgvd\nsEABbG1t8f73v59HHnmEWz/0IR4qFHjXr/4qC+028889RzjoirdrNbYmJvjzffv4i2aTs2fPMguj\n7rzX63Hy5EnW1ta47777mJqaolgsUigURkwE3/eZnp4eHTRDu+8QNzn8+W/cuDEqzJVKZaTPzfOc\nUqnE2NgYm5uboxnw0EAxfN0ajQZjM3MudcS6+Jzhwae1Jkky+mnfOQrJUVK5PLLcPfPWggwdt8H3\nQwK/iF+fI/QigjDk2rvfjPrCV7n8xrswWZ8DX3uS03cfZGOmyH8rCnaaDjiOlNTGxxibnEYK4VJL\nUCgZDFyJgkIUUIoDPjVV5AfWu/zV7gned2OLXd2M7724wX86fI5Xffk0M80WHzp5ns/sm+R9F9b4\nwq4y88WAnVafXpqyuZMThz6lkk+eJkhfEQWOa51n2vEVBkm/X9g3xtsubPFfZ6uoRKPwCAOLrwRR\n4EYRW5vbBEowv2vMLdS+26IrhKDf6yOkQXoCg4dJPYzWVCKP3dMlzj/2ZS5e3uHKco8kyakVLes3\ntjh0bDe7pidIsz4XLi9RCQWFuQBbS7CVEONrDD1iE2BbOaEXYqVGSqcMkJ4BcoT0sYFzrQsJWHcl\ndZ9SF1nj2JwMNLM+oc2RwjgaUblCFPuUqj4HpcfERJVes4kQGl8qfKGweU6v0x4kekpUECJ83+nw\n0j5BYPGqEak0XN/oYr9+CVmtsL++C1nREHXIuw26Ox18QqI4xPOLYDTSdwaFII6xpj/qrpJul0D6\ndDEUxqfYvDMm/08/TmFtCaUFlEpIT1Crj+NPz7N6bYlup0W9IIkEmDwntRoCSS4EfZuhjKJWiPDj\nInge8tc+ijh9DvmxT2F/6Z9htMZmOZ71sElGfbJOfdzF0WAzl3yR+JQ9n9xCmuQ0WyndLKGfawyW\nSCpCJN5v/TbihRfw/u2vI//Fj2IKMamQ1Ks1olIJTzvRucntYCGlBnlliizVmFR/mxIshGMgG0dt\nM9a5u/JsECiYf1vnOxTmCyE4derUaEnW7/dJkoRGozGaY3Y6nVEygzGGer3Oww8/PCqwQ7bCsJt9\nacpv/hJ4zbDTHWayHT9+nLe85S3s2bOH+fl5ir/7u+iPfpRLhw9Te/BBisUiX7rvPu4fyNcWqlXO\nnj07YjrAtw0Zy8vLfPOb3+Rtb3vbqEMfdrHDBVuWZaOvH4ZnFotF4jim0WgwOzvL2NgY1loqlQpX\nrlxhbW0NpRQTExN0Oh263e4ocaLRaNDv90evnft5hu/BEMbt8vaGc+UsywZ5ehqTZwjpoSQYI0GA\ntjn93OBHFQQBUoaEQREpFJu3HmPr5v3kSZs8abF6aJp+r4VtNwiKESXp4xsoFQpMzcwTF2uOEDaE\n0liLyVyacKgUcejz7Bg8XlEEHqjZMu9dbvL4kV1Ugognb1vk+NOXee62fTBb5TNHd6HTlIOdNuuR\noJGEJHmONi4kM/JdCrX7ffVoPiuswRo4PVfhiXrMeg+8tRbaGILQRyoIQkUU+nRaHbppSj9P/mk8\
nXWutw7QJizaWvjUEBkINxcjD9rahu0V3MyA1Oda3FAOfsqdYXb3OVL1OFIf4SUgYexRrEY3ODUIT\nUqhXUX6A7UFuA6QvMYUQFQTIIAI1mIzoHJv1AYNSA8ezUs6f7bAp7kqkLcq67lcoDyEF1ijSvkZ5\nMbW5PcRhRCG0NJYlvVaHaqVC3/TpNLoknQxpwfcM7VaTgp3GNwbfOqpQ2yRs9xK2cwkrmucevcDU\n0Zsp10v0ehs0t5eJ4hqysguB7zpxPwQl0VnmEI1AoDwC321/AXraMj4/T1zYxCxdoPmtExQP3YJf\nq7lEVOVRCguEx+o0VpdZOX8Zv98jSDKkJ9A2xypBGBcpOJMtqhY6oMi/+jnkv/v36B99N5E3iKxJ\nMrLEcSImJiYIzHTJAAAgAElEQVQoBjFRtQJeMHIcmX6PvN+n2+myubXDTscFaRptUAZCz0f/wi8g\nPvpRzC/9MjHbtLoJU7sXmZ2dpd/uIqvlEYbxOyOpLblO0Tp34wprncNPgqec1VLqdCBWl9/W8orB\nWn3IpjJ2VEyGDq5h97a1tUWSJKNrdhRFaK3Z2dlhY2ODQqEwurZLKUf4w2FhHRblYfEdjhmklAPF\nzuPccccdNJtNGo0Ga8ePEz/xBBOdDj/1Uz/FxMQEFy9e5MknnwRgampqFOUzLJ7D72Wt5erVq2xs\nbFCpVEY/z5DtO4zVGf55aReeZdnIJDFk/Q6daJ7nYYyh0WiMFoNjY2Ojg2j4d6iWkJ7nEktMPii8\nDjLuTBB9Z4WVcjD5tSjh3IRaK3Jh0Dajl2YIL3bvpadQUeC6dKvIUhe3lOmEJBngKyWExQhV8Jmp\njFEpVaiP78IQgHE5fFmSkKUZOtOQW0Lfo1aKsQhi49FLNd8oe5yemWN2rMq4grXDc3z+4DwYTSHr\nITxF5rtbcGo0wsvJcoEiQeaGKIgdRRCnRFK+jzaaLDMORzkYNRid4ytJliZ4fkAYKoQw9EyKLfh4\nVpKHwkVrfbdFF8cuwQBaCwROxeBZnyxP2GmuINuK61carhjHEqMNjUaKyDXCunmcNR5py+eqvc5s\nxVKQ46ie7zoXJZBB4PK+fJcFZnwgEAPaDwiTI5UrwA4eIkEPkm1xTa8c+qDRg8A84Sx7wsfmAqst\nwfgskypC+Ddon7tEs9snVxlYS6+TY01EMTSUCwLPjxBeDc/2SDY3Sde2mZ2c4oXxHpurKeLSOgef\nOsWRYp+d/hLhrnHCcoQyDo9olcDYFKkFea+LzRN8IxDG6VKDyM0cUw26PstYaQy+9gQTv/dx0o/8\nGOzZg9uFRggR4/keEwv7maiMkW2vQrOFbTSwJnGQEgkiVASVsusgdY6573uwr38NLF8huH7BzW49\nRbvZIc8NG4UiteoYUb2OVygipCM9qSzFSIlUEn8Q9y5x4Grpec75c9+98L73gVKor/4lW1s7HD56\nkFptjDRNSHTmFmWDuS0wYhf0+9/e1nuDD/uw6OmhbtcarBn+u0abdPD/uMIrlWR6epowDFleXh6h\nF4ed4nC+OTMzMyKOPfnkk6NZ7/D7DYvT8N+GsrXhzzwEiw+bECklS0tLXLp0iWq1yvb2Ni+++CJZ\nlo10vw888ABpmjI5Ocl73vMevvrVr46i2V86Ix664tI0ZXNzc9StttvtUWRQGIYAo3ny8OcAN+oY\ndq3Dw+ncuXNsb2/TarWQUjI3Nzf6ffI8HxkjpqenR8jIMAyRMgCpMDIB68w9WebARWk6gMwLgxTe\nIGLHDj5/htuurfDBF67w1VcbsoMvQ2FREnJpyZLUfdng6wW45aqnMITEKiYq1ihWxijEZWbOXmf+\nC5/g6tvv5caB/aR57mA3AyVB4PvUa2V3COLRSFKa3Q5aCYRNsJnCpBqhQhcH5A3SeYVH1hMIJQl9\nQbkQEfpu1yAEWOPm11iJVAPKmJQoJRCeIDA+QuREgU+v50DpQ
VAiiH1ymzmspB+QRw6A9d0XXcBq\nl+vuS4lnc/w8Q2iJCDTNpMHy6R5XVlokffCBvCDwCgVa7Q7tXkJ1IqDbzuhmmrDTI55VVCsVCkGE\n8MvgVZDKR4QhQoKxg1QGY5AiRGiB8YSLpxmkL5C5JFqhcyfp0MaBjKWHi7rVCGEwOkcKJ0XCGLIM\nZLFOfSFAeDHr68ucvXidJLW0Gzl51ycOJJVqiArKWFHCphmRiel0BatXl9lO+nSkwiSWk998hoJe\nYuLYHOHu3UgZoYwhS7puJpb18LWhu71Nt7VDLmpOz6dzEA7gkmrDSqrYP71I8Kf/EnljFfnx++GH\nvh/JQDdpcpQRWGGwgYdXLrmxRawg7WOyFN3pY7oJpiYgUIhiGfm3X0P+6r/DfviD5PvHyQRYJQnC\nkDzLubG0RH18gvrkJF6YgeccSGYQKZPlKdpkGOPo+zkO4myVwKBHI54cmJmYJHCp1ZR2zXBpszvQ\nfbrnh4HyxPMMYRi5a6MQKCWJIrecMlh2Wk2yLB/Ft7h8OYE3mLsKXEeSZzmVSmVECQNot9ssLS2N\nRgOtVouZmZkRFP3ixYuUSqVR0Rt2ri8FfY+e+8FC7qVZWsMuVGvN6dOnCcOQo0eP0u12nYlowPjV\nWnPPPffwIz/yI6Nuebj0Gn7voWZ2WEA3Nja47bbbRp25EIJ6vT5ylCmlKJVKI17EsBAPY+E9zxuR\n0VwEknsNoigaGSWG89+NjQ0WFxdHwJ21tTXmD9xKEE2S2jUy3SDRfZI0c0Qtm4M0LujRWHzpO9YL\njvnywecvs6fZ4c2PPc1X3v5OlJB4VpDglqdZ0qPfbWGTDub/Iu3Nw+066/vezzustfZ8Zh3NsmXL\ng2TLxpYNtjFgPGAIQxhuBkjakKRpQtLmPrlD2+c+bULa5j63ZGgSAqUhaSZKuJckECAMhlAwxmDL\nlmdLtmTN0pnP3mePa613uH+8a20du/RyH7L96JElnbOHc/b5rt/7/X2HkQluTamJogq1SpOkPkGl\nMUFSabDrb/+c+rkL7P7cA1z8lwdQNiftB+AVKrjilPDU6xEYhYok1Uqo4tkYZcg4wWtB7D1K2RDH\n4DzZKMdaR6VWpRW3woUDPw6uD8tZQSQU1vpQHOAESIcQFuOLXYZ3VKOIeqKoN2KiehTKuLXAVATD\nROL+IZOuCFu0sTZRiSAyls6x/8qdbJuf4on2UfpC0veBZ7S5YrjYY8f2SaxIubjUBTR157mYGpoR\nXHnjBLLWxOsY6z1OCCKlQROAxWZhxhVFz5fQeB+mVRF+IgII21IvKDG5QcWySNLz4cIqwmRW9ndG\nwoG1yKpketsMqqroI3nsu89gBuDzASppstQeMrtVIrIReb+Dzi3HTy1w8twanX5OThNPxNPn+8jn\nFrn36j00ByOsXiKNOvhIEtcS8s4658+eRVpPutEFMUmWpWxsdFFK4L3FYnDGkm3fRvZz70X+1n/G\n/dN/FDJNnUAKh89HCEyQ7vlwUSEb4TOD66ek3Q3SfpfKxByiWoPeEN9dQv3qv0ccO4786H9h7V/9\nHBcurmBTSy2pohtB6H/2zCmmJibYqiQ6ThDW4o3FmhBz1x9lpFnOyOSkOOquQm5SgsTcAhJdadJq\nNYLELM1BJXQ7S0BEnluMcSgVU6s2iCODVqMCwEBpRbM1QSWu0jcpeWeDNDNYW3D2QhQLOAUUUwsW\npS8F3JTB3HmeMzU1NW5kKAsZNzY2ePbZZ1/WxlsCa/l7qfMFxtP5ZrAtQXIcbu0chw8fZmFhgd27\nd1OpVDh16hRHjx5l9+7d/NIv/RJzc3M89NBDYyC+pNYQ4+dbAvuJEyfGNuUS3Ov1+hho0zQdO+PK\nVLPyYqKUGrvlShdbGZAzOTlJp9MZP2ZZX1RW/ZSg02juZPvWQ3R751ldP0V/8BLddD3QPDo0mBjj\nkQTDgJLB/ip1xP9z/RX8T
8+8xGeuuwrfXsFv3YYUHi0F2SBFZEMqwkOkyYwDUUdGCRWdECW1wjpf\nQYiIc+9+Ozv/+vOcedub0Toml4o8y8itwclQFOCxYap0Qe890azjEZh0SG4Mg8EQKg6lIU+zMC0j\nieOE3PuQU630OPazvPCG76vA2s0XR4qlYYidTCKJNymTrYR6TVGrR+RGkgqPSwRehyHkBwbdS4WI\nYYIRBiwOL3ImqgmLZxdZ6WX0nMAlikwFzjKqRJy60KFSh9wHiRHGMVkVTE61iJsTeBVi/YSSSC2x\n3iDiJl7JIoINhKiAiJBCBGK/sDA554J60AEybDVtUMXii9/xvsi2FAVYgfeyuB+D8yleChaX14nk\nNHt3VEnjRabmWujGDMZLGLTpri/iR46Bi+jZmJGzGCUxMkE4wxOnh+x5cZUbJ1u4wQZ5PaJSb9G+\n0Gfh7BmSWGKlRusYacPCZ319HfDUsyreeWQs6Ha7tN71I/RvuJnq3AQiHYFQRcBIIedSEkzobSPN\nscMBaadN2mtjTEol2gYmZ3D2FN2VNlPvuh/9F5/mwl2HOHrsGCcuLKG8YfvUFBNRmA5OXTzPTOGQ\nqtXqKKnGAv2NbpeNfpdBOgrTZqQDLzfso//uS6jf/Qju13+VeKYROjCdwnjB8vIqaZoHJ5PnZdIs\nYLxM0pEYg0a/32dxfZX19fVLoeSBFQ5pTt6DD6n8qghmqVarY6lYq9UamxlmZ2fHxgOlFKPRiGPH\njo0BuVQDlABbbvo3Tz0lD7yZDgDGITTln8+fP8/CwgJCiHGL7913343WmoceeoiTJ0+yuLi4aWF1\naaIuJ93SJff0009z6NAhBoMBR48eHZsYBoMBg8Fg/HmlmyyKInbs2DF2zkVRxL59+172HDc2Nmi1\nWuPnftlll40XgkKIsXys2Zhm5/ZrGI2205rYhdDTZPYYG+uLGDZA+hCEAwHsjCFSEUIpHt25hUfm\np6jFFS5bX2aYDkmKoPKq0rg4JvMxqc2JKglxVAUVKtrjKEbIKKiO0Ky/6hCdm2/D5RnSBMrIeRNC\nZYTACYHQMfgMJwxSSeJYh5qdShwke97hnCcSiqRSReSGzIRY0SjWKCUxJpwISnoJeNmFdUwzeXCh\nBg28o1atoF3KRC1hslUnqse0ByOGzqDiCLTEfR9L2veddD1BYysgeOeDooSVc0v4fpd+LghDicd4\nR2doybylUo3ZGBmGqSOSngRNLAStJHQeIUNPFtKH6bke4x8+jvjoA/hfvA/ecB1eaayIgpzJ+qCf\ns2G6csaFdKAiiCdMuBaTmVBa58LzFypQEtY7jJNolZCZNfK0h0aSdvs8/t3j1EeeN756jr1XzxNP\nThFVKmTDLpF0pDommphnbXiCXIogazIWrxLS/pAHHz1DpGBii2D7/l2sLl7gse88y6G7b2Nm1xy6\n0aB17DDpkmGUG8QwRchuAAUp0c4xSnsIPU/zppuxK+dwwxFSJohII/CFtA6wOeQZfpSRd7sM+j1c\nllGfbCF9zuDpI3zuC5/j4WOnOHTVNbg33cRouM7ysXOYx5/jHy/1eWDvVrLbb2ZkLXk65OyFC0y0\nJpifnw8RiWlKd9BjY9Cjl6YMjEHqCO8dEkHa76J//88Qx15E/tq/RX7sd6jqCpEEV0lYXl0hNwHc\nosL1FEX6ZROktZa8iJvsDYZsdHustduM8mwMaOEFhyNsJATeC5yTGCPQWo1r08vm3jRN2djYGAP5\n5OQkU1NTPPnkk+PjdgmgmwFvc8JYSTVsBsjN9uTNS8Fy4gXG02cZM/n1r3+dlZUVnnjiifFjbF6i\nlc+lvL88z1lfXx+rC1ZXV6lWq+NW4LJQskwFq9VqY/qh3W7TaDTGfy7LLUtp2q5du8ahOaXpY35+\nfmxtrtVqJJGg1axRrdSI4hmiaI5m83IWF59nZekYvc4yJh8G8JGBG5UqBORoHRWvxZKlKcZ
alBQh\n09kbnBegggEnUhWciEBp8ITMXBfSvZRKirqr0LSND+E7lUqCcQkJHjvqY5xFRhFxsVvVUpLEYUGd\nxxHWGgJbK4v0tUYRVONwEvI8QyUxxuQ464gK4JVCMMiz0MztPQjB/sUB9x9f45M76ixUNApHoxKT\nKEUcBX278SmZMUROo8T3STD/fqAbovWKJlMvEFbhVZGwk1n6bUdnECHynCgh1HlHghTDcODAK6wD\nn3tU7Kl4TysWxFriHzmN+psXcO+9iez2y8mXN6j/7heQp9bwv/cFxGgD9ccPI3/5zYg3HAKpET44\nYCTFYyERzgZQFg6hQsq88KKYbEPLpxDFoklUkTIKPJvL2FhqM1hap4IilobLdtXRlRwTaVKTIt0Q\nJS3TW7bxzImTtEc5IooRzuDJSEWToXUcPdPD9p7nzfddQb7c59mHjzEaSRbOb1C/fC9umDHs9rG+\nEiaFQgpUSeJgc4wceW5QX/8a6kN/gPhffxF/261BMuMdUli8LVp/ncGPhrhBH1MUJka1SWrzu+Db\njyL/4x9zbVNxz8UOS33J7tMXeG6qxuUX19CZYS73VM6u8vH1FaR1JJFmvb3OxYsXiaOIKI4ZpqMA\nuP0evX6fzHmUCqlT0kvwnvQDP0f143+C/eCvEVcSkqRBWmtxci1l3aRIrwpgklSrFVRRU715qrQu\nx2eOQZqyttFllKavmCw9SiuSKKYWh2Ah6wTDQdi4p2lKvV4fb+hL2VgZQJ7nOb1ej1OnTo3jG0vg\nK9/fZSV6KRcrQ81LUN48lVIMIOXfbV66lVU63nsefPDBMZdbZkK8Uh1RgnnZuyal5MyZMxw+fJhO\npzNu9Y3jeBx4U1bylGE4ZaiPUor5+Xk6nQ6dTmd8khgVZZGl7bmUfvV6PSYmJsbB7P1+PwCc00gl\nkFVJLOdpVBrMTG1nZcs+FpaOc/78CQbtJYQLyhMVRQgncMqH1DHnMS5kznob5F6WEIkpZINIhIxn\n6T1WSIyIQBh83kVHQDKNlwkqH6GUIEtTnB1hsXihkCoCVDjNCkGsFdIVjdyJRitNksTjizD4kG4n\nPNVIFpm4BiFMEbAU8hysy5FChaU/BeWnwOF40/FVdvQy3nPW8cAV8yhrmG5KahXQSjC0nsxmVKtB\nIiml5vuFL3zfRZq7dKrHyaJ2RHpy7xEmRjqYSCJ0NaMyEaHrDS4sbZDnHuUEqArCZLS8Y/dsjZmd\nW/DWIv/qecS5Hvl/+Q7fGS5Sm6yw4+55Zv4u5cyhKXb+9peoLw4Rv/N5/B1XIXU1LPWsR6hwzBHe\nAMGq54psS6mSMCDJcJxACjI8uYdYF1SHjlEuI+su0Rv0wMOOrRG1lsQJh46qCKVJswHYlIULFzl7\nZpmRAaMUVhgcHi3D9J1bwVLbs77u4amzXDje49X33MfaepcjD36XHXu2hHBowHobfjkZclKjiFol\nYZg65H/4MOK5o8jf/DD5Z/8EaYNWEgkYi7cGXI6zOcO0T3/QI81SoslZxNQs6o8/TXRhiVdFGpkb\n5p96EZHlvHqti7KWbr3KUqXCg1dsRzlJlCRoJRiMhiyvrtJqNKjWqmTG0u332Oj3GOYGZFhEShkW\nY9Zahq+9hfj970dUW9SOHSHZso2XNgb0XGgydjaoDLSOUCqkqiVJTJql5CbD2BxrTbAjD0ZFI2vx\n/RWM0/e1UsQqYqLVYmp6FqRiMBhiTc65c+eoVCovc1klSUK/3x8rADqdDt1udzwVb554y6VY+feb\nOdcSXL8XHbD5GFp+TMmlljxtCbab77N83M1LtM33tba2RrVa5cCBA+Ns3fX19XEC2iUNrX8ZsG6W\nipXGibLRuNVqjfnfsmCz5C/LjjXvPabIiYVA6Ogoot6YQCctGtUZplp72DJ1He3lE1y8eIx+dzGE\n3sgIHSlwKtjLswHDrEOmKlivUKqJIA/t1FJjvcR6EZp
3vUGaFGdyrG8y9/gT7PjUp7n4rneyev11\nOJsFAw0CCNkPQjryNBtTCPUkOBKVVCip0UqhVYSSWahi90VriRSFoNKHLGwVGh+0kninx69dFD1n\nvtgffO6KKd56Yp3/e1czqEAlVLSkGkcI5+i0O9RwNCYm8NWYgXKbTmo/AOiWJykhwi+jHEp4pPBk\nOqLXGRF7SxLH1Cc0temYlY0+3oIz4I3BxzEN4IbdE+zbU6e2ZFEff5ze3gpqmLLyuss5eMNrqW9p\nwZsr+J9qsbWTc1Z/ii1feJZzB+rs+LPPM/3ZF/C/8Cb8nTfiZag7lliEMIBHCnXJkfaV76D+4O/w\n//Re/OsOED10lOSjX8b983vJ77gaFVVA5Ax6i/QzmJhw3Pm6K6huaeKihIqq44Uidxn1WJN3LMvL\nbaxXWK2Ca80JIttnppZw1c69HH/xBN997DwHd9YYdSzf/drDtE2Hg3fsZMvlO0JgBmC8QTiBysPE\n7lywJbfWN+j9s1+g9Qf/CfcvfzlcVGxQ2nif4Y0BkyOcwaU5eTZilGdUazWaW7YiqhX8//Lz8Fsf\nw991B/7r38LdcSv+a99kcPBa6k8fY+me23h6ok5nfZ16ITPSsQiW12qFYZ7hUkjTnOFoxCjLMUKi\nPGHKdYyBJk0HQYGAQ9erDOpV0vaAWi3BWj3mZa3x5JmlkgiE9DhnSNNh0JtaQz/PyHJz6XQCIdO1\nWHoJBEiFsQ6fW5QS6ALMLl68yFVFatfx48dZX19namqKJEnGTqsTJ06MnVsl6G3maMsjdgm+r+Rw\nS6DbnMGweSouP6es/in51HKyfSWVUC79Ni/xSprDGMMjjzxCq9UKteg7dtBqtdi9e/f48UsuOk1T\njhw5Mp5cq9UqExMT7N27l2q1OqYZSoqipHRKR1x5X+Xz9t6R5eG4jVRhWvUOrR21akIcbWV6Ygvp\n1p1smd3GyZNP0F67gMlGCByvurDIux8/xt/edA0bB/azdWIeIeNgMpBhKkQEB6n1EoXDpynpoItF\nU0mm2PGp36N2+gzbPv1XLF57JVKEPr2ggHEgglIpBPspRsMhWMFMkhArWTgKZbFglGQ+YBW+/Lo5\nvHN4G6jJSOnwb86HRVmxOPPIYjDyPLN1gsMzVbqjHNEx1OOIyENVaYSxxAimG5MkUZU+nszaUsn6\ng4EuhIkjCJw9qtggW29ZWhkgRxa8YJhnNHTCWnvEYAQms0gvMB6kyWkmgokJwfzOGeSfvIi8MECn\nKWu/fDPbL78BmlOhW0zXcM5Ra1q2v/MQD9f6HDtylvd/4xxi3SA/8mXMnfsRPkI4iRUKqSS4HPG1\nJ5AfeQDzM3ejPvZlxMll8t/5DOdlm+0f+m/oixvw218kueUqvPOYSo31XsaElOzcHrNtR0J16yxD\n0UDWW0gzINGC/shycXGd8xdWglzK2xAtKCOwKZdfvo2DB67l5OlTnFsccNNVu7nh1i2cPHecq3Y0\nufb6ffRTz9AVVy8u/RBaa8nynH5/wGAwZPng9SSf+SSqVkUMu4ABG6yWOAM2TLnWGOKkwsT0FFEU\no2sJXgnEva/D33Vn4MR+5Rfw3TU23vZGrNBUt2whOnWCyRMnaecZIxHeaCqJies1rNIMbWEPHY1I\nsyywYqGs7GVX79JGSjE1IGPWh6MQMCPkGMC8C1kKaZoiVUjXN4UcrJwGcxOUBEqLsCn3nuJthtIJ\nSgaDyTAzrK6thdfmL6V+raysvOzYrZRiy5Yt4xbcixcvjhdJm+mCze6z0iiwmVoob6/MXyj/brPc\na7Oud7ORYrPsbPOyZvPXsZyqy89fWFjgwQcfHNt6hRBjtUGtVqOMqNyxYwff/e53EUKMs3Rrtdo4\n6rEs6iwXguW/bd++fWwomZiYGEdK1hJFvZrQHWSYgv8Mb1Vb7B5ECJhPZqhVXkWrPs3p049z8tRT\njAZ93n3kBfa0e7z
jyDE+f/c9uAmJEA4hw0AUsjs00gvIc0TWxwy7OCeIJ7YQV6ZY/NEfZf4vP8m5\nd74dfIiTtNbgjcMaj3cSZxX12gxCKdpmhV53g0RLJiOJUNWiJj28gcLp0gWHWSnxdg4lFc5YhBIv\n4+2FCFy1NyJYoJ0njlTQr/sUiUV5STWuFhngnulWi2qzQYbDpCM8JtADPyjohvdBsUF24LzACUWK\nZb0zZFIrnBVEsaBRr9BbHZH2PNjwOVJomlqwa0azbUeT+kyN9E27iL5wBvvGbczUE/ywg48jjNJE\nHY1OFIgO9cmIA/v3UMkd7Zku1aeH8L6bcPkAvEa7BCcVQjqwGeIPvoQ4vsTgQ5/m6A0T7O/ELL92\nF7kRrN6/n6kvPstL17XY8mcPMPulZ1E/eR2pTahkXbZt0YhkRM/n6MkZosYkfmWVer3BWrfHSqfL\nRn+IRyO8RViJFholUrbMNzl+6gVy78gMPPbsad7+jqu4+8Behj5Fa0seJdSnJokXBggvAk0iHLmw\nwTVnHUZKVoc97NkOrThianISGelSf4704Wvtchf0rhNT+GwY3C9KIMusPVVUj3iDEAqtI+JqDZFE\ntOp15qam6OYjcg3pMBy/RrkhzTOE8CRa47K0kAUJlBJYX8qrAijOPfMCOz76CeQHfxX/7nfS6Xbp\nj8Kk8HLvmccX9SXDwShkObhgrfQuBN1oqcgLV2EURePJMRpPpgZrYGh8oKWEJM9Dg8Ltt98+bnso\nTQ/9fp9Op8Pk5GRRQ6PH09zmW0kzxHHMaDT67/6tzGEogXTz0mxz1OPmW9mjVk6Rr3S6AWMw3gzi\n5a3ku1dXV7nxxhvRWo9db2fOnBmHnCdJQrVaHdfwlAu2559/nn6/P+a3N1MJSqmxjKzkgbXWnDlz\nhuXlZeo1zdx0jSjWdLpD8qHBGFFkMbhgOhI5kVI0Gi0SfQX1WgWtJcdeeJS/vGE3P/bkaT570wEG\nwx6pMNQFSBwoCV7inUN4gzApIu2Ehdb0XnR9C0hN59AttG+4DpOO0CbFCIW3Bk+gR4QXxImiVp0g\niiMUjuV0QLffoxprqlENHQWzjZASSMBZnMjJM0OeZ+OiXucopKdBDhf6U0OwkxcCB0gdIbVA2QDU\nUufk3iGkJMstsQ725TzLGGQjsjzDaIk1/wDQDcoFV3BsgPOhbhyJrsRoCX6Usm26Qk3nVIVgKCOI\nFZnLkEQ0vWFrHSamqpAk8KorSK/YCnaDrNcmG0kunl9ldOQUV33rHBfv24ulz76H1mj+8LVcf9sB\nKv0+ox9SqG1zqDxDKhc2qB6kc/hsiPnJW7Ef+288tTfBX7WV7jvewM6rr0VOz+LuVfh/bpg+/iLV\nD/whYnlI9OdPod40QUvBzu3TxJNNWnv2MLQNRoMBymbopEJjZorF1bPkRT2JFx6FQllJqxGxa88U\nX3/gMYbGIkTE6ZURTz55mjdvuxyvYTDsMLfnSoRaCglGBImbFwJTpPEPRiNW221qCwkXpSHOU/bt\nzLj8il3gi/oPGfgmpQQUDQxx1GSQp8GimRvQKkjuHvgm6jc/gv25n0Bfuw8RJYi//zbND/8hU295\nHfUdM2BHm0gAACAASURBVCRCMgJMnoceMmepJFXy3CDzlDiOqcYRTgqGmX0ZeOz78jeILy7hf+M/\nkP/Q/awuLYVjVaGYH099BFOLKCb8LM1etojSWuOVwBsbApU2caeiqGpyzo4BbSgKYHKWLDdcuHCB\niYmJ8fG+2WyOaYV6vc7CwgKVSgVgDDzl8by8z5KPHQ6HIS94fNz24+dSAu0rnWqdTieYOgr+NEmS\n8WKsXHSV0q0ypax83HK62nwr73vzdFo+7tzcHPV6nbNnz3LNNdcwPT09vr96vc62bdt47LHHcM6x\nfft2gPEir7xgTExM0Ol0xgWYnU5nXA2ktaRR0wVYgfEj+gNDZv34gmspLnyxpFavU
a1eTq1aoVqp\n8Gyk+d93zVGNGmzttkmdoRklYIMO33tC4L9NsVkXn7aRcZ3m1BQOiZLBlGPyFFEYciQeJTyRgkSH\nC1KzElGt6tChV63QrVYwox7pMMU2M0QSB6zyga81LrReSCmQArI8BxmGA+88ztgC48InWRMyQbwX\nKKVDHsXQEkWaai1B65KWcijjcFmOVCKk8jlYXOwzGOY/OOiGN2uYWgWgFCFjU2q6w5xqDJVEsnPK\nklnDllaFQd8yGFgMFtIRkXLMJFWmZieg0sCOWlBPMKmn3W6z/K3n2PHoMrH3TGSe3heOEseKeDXD\n/80L9G/ZiWrWyfspWXfERMsiE7CRRDmByFPcqE9+4xyDf/cW9qcxlUSTJzE0aohGPUzs+Yi53bsY\nvP/1DP/8QfR7b8UtnGW61WJm21aoTwUnjctRUgYuM6oRN2ZYXOkzyilSMi0Yh8thcluDK/Zt51vf\nehypJbnTpHhefLHLNZdnTO6IqUxEIDaw2QZSVohkEOKZgpVWzpM5w8LiAvlgQK1a5coXjrL1y1/F\n/+tfwd//evA+hDZrCdLhM4MwoY1BW4kYjsgcqO8+gf7dP4T2BmJhGf797xHXqgzf8zbkpz5Dcn6B\n6qc+j7v5Gt5y+Cjfuelajm7fQi/P8ULQq2VE0iFNSlyrUKnWyXKPcMMA+ggQcOItd3HdNw4j/rdf\nweNYXV7DOxm2yv7Shl8IgZIyuAfxgdM1FutM4XazeAHKuaLoM8h8tNRBcUJxxPXFcstYjA/C9eFw\nwJEjR9izZw+VSmWcLmaMYWVlhWeeeYbhcEiz2RwfIUvgfeWtnEzjOB7znJuzdEtg3rzIGgwGY6As\nP6a8r5K7LXnkmZkZRqMRKysrL6MrXmm42Mwj33jjjczOzo5jG0sb8MLCAvV6HSnlOC93dnaWSqXC\nzMwM/X5/XAvfbDbHrRLl69mcNwGMq+elkGglqCUK26iQGx+s9KkvXJQCaz0ZDiFytIqpV+s0Kvto\n1Vu0Gg2eePJBVpYWGaZ1MutwIqgh0AIyhxRBc+tsj2zUJSeibnokUQaiSmaHeNMPFTjC4YVBYdGA\nxhELi5QC7QYIo1AYYi2RcYIjDBD4wsQlCPGOItSlOxMWt9Z7IqlxePI8C+8tGYaFoOYInHFQjMng\nThMyREDGnolqRCWOQ8iPCXRZI9ZYoRmmnpdOrDMa2v/uPfb/G3SlEGhVcB4uyK8cAuUd9cizc6ZO\nv2+ZakiGuaXddTjrEEgODTw/MfI8PqvZs22CpBojZIyu1nGtBmKkWVlss+fwEvOpZ1iLyZ3l6X07\nmL//Tqb+6iEGB3ZS/1cPkP3YQZLbrqK/3sb2Bsi6hW8+hfzI5+H9byA/2MJbR73RRFWqZN96gsbn\nnib7qdfCvbehhELaAbnyVN9yE8Nhj/hPv8v03glODjKgQlybYLi+TqVVw5rwg7O6sgTEdNuDEMOo\nijwHA9I7avU626+6gltfex2nz38naFhzz9qG58kjy1zLNNdeGWP7S0g7JIoaaFksZERIZ/Lek+U5\nwjraxtPvDXjPZ/+O5tIS9jc/Cve/Hgobs7cOrxRSB67P21Br3l5ZRscJUx/6KPKl0wzjCC0Fdq1N\nZWUN/9E/4dEtE7xaCr6dCK5/8Aizw5zbHnycm5OYp+cm2b+0zkO3HOD03nliYZnRGhVpXBaO4aKw\n83ovWLnhOlb/yQeY2rqDfJTR3dhASYW15mUTnBACHV1aOskiqxWCK8gXH6tlMMSESVoihcQKj8MW\ny9wAaL4IYgnOND+OKYzjmPPnz4/1rNu2beOFF14Yg1kJciWlsBkkS/naKz9ms7LBez+OiSxpgSzL\nxhKxzbGQm2VppRSsbHgoOeTNITrfi4ZYXl7mqaee4q677uLIkSN0u92xlbh0oVWrVfbt20e1Wh1P\nvXNzcxw9ehSAgwcPjquKykjHLMuYnp5mOBwyO
TlJnud0Oh2Wl5cJFGKO9IJYCxq1GC8EWhuyXJBn\nkOcW5wXWhwxrhKBaq9Bs7qLVeiNzM9t4/PGH6a4u0h9l+JZGinCaweZ4k+GylMbhp9nzlUc59Za7\n6bR2MHBDtr64wGWf/lvOv+1u2tddEXIa/AhchnAWbxxSyEIeapDSk8gQrxjXakjrcTbHu2BPVzJ8\nTXPvcDYPiX8yNLoEzb6jYBhCzboLSYVBxCDQOkapKOxwtEaaEHrTqiVMNJo47+kPBzgTBgzjE86e\nWWfxbAf3D6EXnPXYrBCoI/A+JpYRlzUN779jN+2zSzzbFvQGFY6eWqFnQEZ1VJ7zvhHssTDRsbhd\ndVxV4dIUPZljpyeJXI3tI8O5WxapH+tT1THqwhp3XRwg//o7iP/jA0z+xh8gzm/g//IpuHM/iVKk\nvS7RTI76yBcQL17Af/yr+N9+N0lrinRoWDi+yMynDqNXR4x+54v0P/wAp27fyY53HaS5ZQKdTNL4\n3FHkmTWu3hjy6IwnWxnh11dJoiqm7jBe4zod1haP0z5/mnRpiEJjZYb3ikhJEDknz65y8uIyd77l\n9Tz34jJPHjnByFtWR4JnT63RmBJcvb9FTWtMr08UbSdJKmRZCsbjZFgeBS20Ddt5qfjyrbdw/+HD\n8NPvZcYR3rSinBijMP2NHMZkpF/5BtWPf5JT22eIFhfJY0Vkc6qOYnKEeprxqotrRM7zmvUhf3TZ\nFO96aZmazdm+kTPd7ZN4uOPRZ3hp2zT1qTqtWj0swkzIT0UInHV4Y9GVWgA+Y+gOuxibIaRHCYXP\n/RjUyumtDFsZc6DFFKHUpe2+kgqpFH6cFhJSpkLmUtCBehNMOrIAqcnJyXEwTJ7nDIfDsX12M+CW\nk/fmlK8SwDaDcJZlLzMMbAbnkm8t/7+UnJX3vxm4y4tO+Xp7vd7LALycrL+XnKz8c7lEKxuAK5XK\nmBoA6HQ6HDhwgMOHD4+D2suLz549ezh9+jRnz54dP145mY9GI7Zt28ba2hrtdptTp04VPHTY23hC\nO0gcSeo+IpKS3DhMrklzS+aCpE9pWZx6LVEcMTuznWZtgrnpbTz71HexzmGsR8dRUENIh3cWkw/Z\n88Bhmott9n7l2zx7xxsY9drMfeqvqF5YYvvffoW1a38aYzKczYslmGb7qSX2P/QsX758hovX7GDn\njnmkFERKESfVIK/0BegSVAtOhLA/pUKtuslN4XQNDldnHSFHVuGLgdJ5X0y+jmvOrfCOF5f5zJXT\nPDZdIYkUSkiSKEYJ8HlGimNjlHPsfJsnnzlHNvr+xZTq137t1/6H//h//voHf21WiTDxImi5iLo2\nvOf1V3LrjgpnnztLu6fp+ZTVTk5qoNfPSVPLqoPLBaxc22Lra7ZjE/BOIaYnERNbIZqgNbWdyb27\nOb9Xw/YqlfUcleXo0xfxzz7P4GfuR710kdH7XovetxOZOfLMEDUm8Lu2Il5aIPuJ21GXzyIfPwcf\n/CzD6RbHdm9BnllEIpju5KjljD9d32BWCaYqEW62gTzX4Zl9NTayDbZf1mDi6p04mohkDuscafsM\nG901Hn90geNnMoySWGHwHpxTWARpnrK2ssQNNx7E+4RHH38+KDZ8MSHmI/bunKGSxHzlqQv0qpcx\nOz2Ld4TMiYA+Y7OrKbi+5YlJnrnlZiYPXsPWbXNhyi6/KaXe0BhMb0Dywd+hfmGJLSsbVNOMbHqa\nSCriNMPW66H51IYjmvAem8Q8eeh6Ht3RYrESMT/IObxzjqqxfHXfTsy2ObZvmaeaVBgNR2TDwIl5\nIXDeoSqa7Tv20pyaQUcx3WGf80tLFLQf9nscmb33ZKnBmtIMUHK3agzAQqiCVyvemEIRFRNIkPpY\nvAsuxtw5jr5wnJmZGYQQzM/Pj4X/Wms6nc7YtVUu2EpZ2CsBDhjzriWFUDq3Xmlm2CwDK8FwM4CW\ndEF5gdk8v
ZZcbwnGrzRJbOaLAW699VZ27NjBZZddxvz8PLt27WLHjh3Mz8/TaDQ4ePAg27dvHz/n\ndruN96EVeP/+/dTrdbpFUaaUclxpVPK5zWaTCxcujPns668/yM2HXo1xITzcl5ZjpalUokD5xRKt\nJUoWeRhKQWHJbnztS8x94GeoXXEV8pqbcE5SVzkq0uEi6XN8tkHWXWFQj2ms9rnwjrdTWVzhqj/6\nr5y9fCuJcZx76130tzTJ8wxjLLl1DFLDoc98k9m1Lq3VLo/vmSWKBd1hBgiqcVCvOBO+j0mlUsQB\nOIzNcd4V+vjCxGEd3hOs9cFkTuYcwyzHeI8VjtRl/PwTF9nVzdjaz/jq9gZaaGzqGA2GtCpVpNSs\np5Yjx5d49PkF2kOB0Jrc5vzrf/NvPvg/wtXvy+kKBMqDdJ5IZtx5+9W88Z79DA4/iBeSHEG3bXBO\nhGBjIPKW5yuSP2oI3nF5HVSGyVOq1RYuASFTRFzDRDHJ9l1cOV3DmVV462txj59GfOI72J9/A+L2\nazB33kxDpNiNFXxShyxEzsWi7G7KQmnhxx8kObPOzr9/kfin3476+lE2ds0x9cRL1G46wPseeoLn\nL6yw+5804KZt8Jo9DD/zZa6cqOIrGUuLq8ztuoyoNoUYLCGn6szYbQxtH5k4nBcImSERGOfwkULm\niheePsOHf+tPWV0bkmaOKNL4SsRw6Fhq5xx9chntNFXVoPS6OxecMd4aQoatHP/wSgIfNcoMR559\nkQNXXUW97tm86PbOIRBoqRj95LtRn/gbxF2347/0DVrO4e5/E+6Rw7if/DGy1KA/+SnM9dejnnqC\n3htv583tDbYefopvHbiMT791P15Kni1kNtW4gpKhmWE4HJLnHpTEI9h54jQ3PfwE6/9YY3buKTg0\n0Com0gJrQ+5rqTktWxmCpbJcIJXH8VKdIF/mVBu/xnLDT7jgg8ALwHmsC9psIcS4DSJJEmZnZ9nY\n2GB9fZ1qtcq9997LU089NW5S2EwZbDZBvOzxhBhzseHC8HJTQ/l6NpsjNqsTyls59W6+/81fm833\nDZeoiPLXCy+8QLfbZdu2bZw7d45ut8v09DR79uzhscceo9lscs8995BlGRMTE9x3331sbGwEGWIW\nrNRlLsPGxsb4JFAqPUqJXRRFRZAOYylf+T2KtCCS4eKYW0kaMqbw3hY8b2gHt3nOzt/4INHRZ2n9\n9odI/uJvEfE6zgyQUqO9xYx6ZP110jzF33ITz9x5HxbNDb/+IRrnF9luDA/8wruJpaA+GpKnQ2af\nPcWVf/84Dx/cxZcub3FXlvLl3dOsbbRp99dotKaYm5lFRlEYXuKYoTFUrUUrHfTBkcbYvKCpBM4L\njCvkr1JishQrwiScW4vzHqk0cSL57L5pfvj4On+zdwaHYJjmdLsDhrlhoZOS+5gXLqxxerlH6hRa\naXRkGab/35j6/SVjnhDrKODgzVfyYz91L1VzjkGtxpoTrJBhUouMYpwNaesaR+QgTgX1KljbRTmN\nMinOpUg28EozpE481SARTXS8G28S2HUA9557kORUMofIh9gsTDHKe6JKBV2tIj/2RcTxBSr/9RHs\nG27E//z92P/0FYY/8Xrm/vBvkBfWmFntItKc+kNHaKQ51dOOjUFOTWoqSjPZiJnbtpvKFXP4yjT1\nqTmsyxitn2fUWSXdMHRSSepkCBSXMlwdpcMaR5RDTWjOPneOXEhiJ9HWkUtDrhwdC8+f6KCswEcJ\nND1SepQOaURS6YKn2jQVOo/VkqRaZcvTz5P86Sfw//PPId58FwDOGYTNAYvQkuieO7FvuwfVasBD\nh5EvvASPPop7348Qf+IvcT/5o6T/8Tdw33gY9cQRhv0h2771CBPrG7z6mZc4sWceKT0ylugoIkk0\nWZZircHkBuPAiuDgufFbjzGxvEb8qb/iwpveiM4UcbWBjiK08YhMjDfzY32
qDT/MZZFkOJaHPFx8\n2GqXgBh68AzWWJwLk7/F44UNx8QCeGMRvm4liDabzXGpY8nzrq2t4b3nfe97H4888ggnTpwYA15J\nI3wvp9hmfWwJrpsn5M0xjSXgbp5cy9sr1QklBbFZelaCM1xSV7RaLW677TYgZONqrVlYWBhbd9fW\n1sYLwvPnz/PFL36Rer3Ovffey7e//W2892zdunUcwlOr1caBQFu2bKHdbmOMYWlpibm5ObrdbpjY\nKYwpIsgES/+WFAGEjXGkqWOUWXIX2hyECAYK5zxn/tm/YOfv/19c+MV/QSaqyEoTRhuhATrNsOkw\npHdVauSqgvOOUd7l6Xtv59ovfZPn3nQbW188yYEHHuXZO6/j/I4mt371USZWNjj0eMrH77mSj2xt\nkeU2UBde4gSkNicWxfOPI0xqGQxHTDRb4b1UTLJeUMy+QQJJedIIK15yZ5EqQhdLbpPnPDlX57lt\n02wMM/I8C40nxpPlihfPr9DJBGsDEV6vBq0duqYQw3+gDVgKiJDMTDX5kR+/l1rd0DvXZX3D0TOK\nrkjRFioywqQjhLfEEl7t4J1dT3auTffqPnFWoVppwXCIr0yBtiSRQcscL6rkcR0fi0CUe40wI7zr\no0SKEBlx7PFZIMjVI8ehP8Jvm8K9/26Ej7Gv2Q+vuwHR7ZHbAfFfPIy543rEQ0+xceMWKt85Q/+u\nq2nuuZJkZgL51SfY96lzDN6yE/3qGcRGQpYP8aZNIjtYqeis9OhlOZkDU0xZwgftqbLQcAnTVBhY\nSw9PhETmOf3YkUWenhGca4+IXlhluRnj512R3BaOyU6Erjet9PiHzpjQuqwjzZ3ffJDK0jL29z8O\nd70mbGRd6BbDBr2gjmOIo5CD+4v/CPnhP8X9+LsQf/JJxJnz+A//Z2QcIQZDdKfH5Nce5OvXXcbt\nz5zkmzfsJVNhQRF5gzUOIyEtAGc4yhA6CZpG53jqDbdw07efZO3HfxhnMrzJUJsSxMo2gxJMtNbj\nTbAv8nF98V/QSYriB9ePl0zSF7ZzJ3DWYExRNlnAgBQyGCmKiW5hYQHvPevr6zSbTWZmZsiyjMFg\nwNTUFLt27WLPnj2cPHmSI0eOcO7cOXq93hh4N0/Xm8HzlQC52X1WUgvl55S/l98/uGQZLpdwm/Md\nVldXx7GN5QW32Wxy8OBBXvOa1zA7O8sTTzzBysoK27dvHzcEb926lePHj+O9Z3Z2lueeew5jDHv2\n7KHf7zMajcYXoMFgQLVaZWVlhaWlJbZs2YKUkosXL46Xh2WpZXieAXQD3SUKqijY/XNjGY4cw8yR\n2/AellKipESJ0BLRvfetPP66N2GMxeQeJ2Ks14WzLWjRcwQmriOVxroRuTcs3LCf1VtuQvgBd/zb\njzK51ObabzzJd+7Zx9f2beENzvH4q3dQrUYIGdEbpWRWYKzHS0lmDJk14cSVaLxU9IZDKnE1VIy5\nkL/gfFG0WbzW3IaAGyFloFM8gSrBhRqiLChqRqOUNLMY47FpjjKhxEEJhycPTHDRgqIqEipx0CX/\nwKDrBQpBvZbw3ve+mwPX7GFj7Rg4xa7dV1M5vIokBRmyQTGWiYqipuCta4ZZC9mzGSt3OCJr8a6P\n6w2wtRidxMQCJCO8FEgnIM6C+yNTWKfwMkIoh/ISN0jD8TWqI/7oq4jzK7irdiLuugk/HAXNrtbI\neg371kOM3nonwlRQH7iPWt7Bp5J5VyVTBmoK+clHqK5myK8tsHTfFUzYCGFHmKwNaRc7yllZ6LLe\nGYJIsNYRRQqJRSqPcIF6j9FolaBcyOxMjWfkLV5BriW9zLHUzbk4GFIrpCxxHDGlYgZ5xsiG2hKl\nJAIdVCLW0uv1+LtbDvGWRx8l/ZF3MtfbIIkCf+YDCoRKOB3cPsKBv/PVuFtfhRmE41n0ib/GdbrU\nVtu0awnrzSoPXDnPsbkGh++7iTiJSKT
0mU\nEL56tXsg8qoVk74q8LbSvp0R0iCkQWHobnzIhQ9eZzTw0nfF7MP3eEiCOCSoRUBAszbN9MwRkCEe\nv25J84wrfcvf/dPzXL70Nhvrmz7JSf99U4h/aFVjYnyGyfFlmvWDSOkQ0mKFweFtg6SSBKJGbhLS\nNCHPbrIrV7l8+S2G3ZvUahOefVRsq7XW1GoxaZpjcu9e4W29JWgv32mRILw/mzOZB9ALf4GnSQrC\n7LWbUoEwCLy8YxiG1UhBKcWpU6d4+umnK1Ga/a9zWcWGYcjp06fZ3d3l7NmzNJtNFhcXK9nD8udK\npa6Pyh6WGN+S+VaKyJTqYocPH8Y5x3vvvcfCwgJf+9rX2NzcJAxDnn766Uog5+zZs8zNzbGzs1PN\nbst57xtvvFFhi8v5bpqmPP/887z99tusrKxU+r5CCO64447KEaLX67G9vc3u7m71upULuStXPoYp\nZW2RmPYOqj2qr3eI8Oe7hAJra513dbbSV8GepeYfV1kAAcZZ34niizVTqNs5UmKUfgAAAqtJREFU\nKz20xQly5/+Gc54dCQrrvOKcFqpq3x0O56300LKoih1ea0QWBBr/ZiG9N7x/jPKdsHQUJ0L5OyXG\ngilGhghf7QooNCYcQqkSU1McEqK6d/G3eKU/XBVEHxPi4+YPE2OR++TR6QowXTJLSg1Y/xsoElXB\nQLG+PXfGA+n9T/gXB6EQOoDQQ6RwBmdSTJp4NbPceNktIYrkoxFKl0AAgL1WAIfLUna2+2x3vf8X\nzt8YzRjG6yE5EE+02O3s4jJD1AjBKXRcw5LghgPCxjTh7CKoaH+TBKKUrttrk26ByTh8RYwfs5jd\nTWQ+wsssArJYMhTb3avrPa5ugw6iaoYly/5eQCE15i8I4XDWoKVk0Nsmy/YbB5ZPQYCQHraldbHf\nUIRhreqgyoThnMPZlEFv2x8o+34FRSuoZEgQhh6+JrR/36S45eYp324/eysqvzzB5H4ho1REGDWL\nm8Puq/bZd80Uhw7++Qvv7+Jv9HKgI70AtrOF/U1xQ5TwI2st2aDLWLNBGIaMj49X4t8f3cx/NMqZ\nb4kasNZWVeJHJRL3Y3z32/Lsvx6UUjQaDZzzZpFzc3Ps7u6ytbXF5OQk4+PjlV17KZBTSmtqrTHG\nVOI1JUQsyzIuXrxYWaYPBgOiKKpIICXMrXx+pbpYuXS11v7ecu2PxfLynczNL+z7yv4qt5zpUq4z\nfBWKl3Ksxg6U98yt44ZbJgyurCVFOTkov1wl+TL7CSEqy6CqWNlfM1ePhd/78t4TuOXz31sO/guv\nx7/09T9UxYqPfP3M+++ws7P9L2bej026t+N23I7bcTv++8YfR/LejttxO27H7fjvFreT7u24Hbfj\ndvwJ43bSvR2343bcjj9h3E66t+N23I7b8SeM20n3dtyO23E7/oRxO+nejttxO27HnzD+P03/614k\njobtAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
" + ] + }, + "metadata": { + "tags": [] + } + } + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "mcgv13T_0VDz", + "colab_type": "text" + }, + "source": [ + "Now that you've learned how to create a custom dataloader with PyTorch, we recommend diving deeper into the docs and customizing your workflow even further. You can learn more in the ```torch.utils.data``` docs [here](https://pytorch.org/docs/stable/data.html)." + ] + } + ] +} \ No newline at end of file diff --git a/recipes_source/recipes/data_loading_tutorial.py b/recipes_source/recipes/data_loading_tutorial.py new file mode 100644 index 00000000000..f0cc99ce081 --- /dev/null +++ b/recipes_source/recipes/data_loading_tutorial.py @@ -0,0 +1,453 @@ +# -*- coding: utf-8 -*- +""" +Writing Custom Datasets, DataLoaders and Transforms +=================================================== +**Author**: `Sasank Chilamkurthy `_ + +A lot of effort in solving any machine learning problem goes in to +preparing the data. PyTorch provides many tools to make data loading +easy and hopefully, to make your code more readable. In this tutorial, +we will see how to load and preprocess/augment data from a non trivial +dataset. + +To run this tutorial, please make sure the following packages are +installed: + +- ``scikit-image``: For image io and transforms +- ``pandas``: For easier csv parsing + +""" + +from __future__ import print_function, division +import os +import torch +import pandas as pd +from skimage import io, transform +import numpy as np +import matplotlib.pyplot as plt +from torch.utils.data import Dataset, DataLoader +from torchvision import transforms, utils + +# Ignore warnings +import warnings +warnings.filterwarnings("ignore") + +plt.ion() # interactive mode + +###################################################################### +# The dataset we are going to deal with is that of facial pose. +# This means that a face is annotated like this: +# +# .. 
+# .. figure:: /_static/img/landmarked_face2.png
+#    :width: 400
+#
+# Overall, 68 different landmark points are annotated for each face.
+#
+# .. note::
+#     Download the dataset from `here `_
+#     so that the images are in a directory named 'data/faces/'.
+#     This dataset was actually
+#     generated by applying dlib's excellent `pose
+#     estimation `__
+#     on a few images from ImageNet tagged as 'face'.
+#
+# The dataset comes with a CSV file of annotations that looks like this:
+#
+# ::
+#
+#     image_name,part_0_x,part_0_y,part_1_x,part_1_y,part_2_x, ... ,part_67_x,part_67_y
+#     0805personali01.jpg,27,83,27,98, ... 84,134
+#     1084239450_e76e00b7e7.jpg,70,236,71,257, ... ,128,312
+#
+# Let's quickly read the CSV and get the annotations in an (N, 2) array, where N
+# is the number of landmarks.
+#
+
+landmarks_frame = pd.read_csv('data/faces/face_landmarks.csv')
+
+n = 65
+img_name = landmarks_frame.iloc[n, 0]
+landmarks = landmarks_frame.iloc[n, 1:]
+landmarks = np.asarray(landmarks)
+landmarks = landmarks.astype('float').reshape(-1, 2)
+
+print('Image name: {}'.format(img_name))
+print('Landmarks shape: {}'.format(landmarks.shape))
+print('First 4 Landmarks: {}'.format(landmarks[:4]))
+
+
+######################################################################
+# Let's write a simple helper function to show an image and its landmarks,
+# and use it to show a sample.
+#
+
+def show_landmarks(image, landmarks):
+    """Show image with landmarks"""
+    plt.imshow(image)
+    plt.scatter(landmarks[:, 0], landmarks[:, 1], s=10, marker='.', c='r')
+    plt.pause(0.001)  # pause a bit so that plots are updated
+
+plt.figure()
+show_landmarks(io.imread(os.path.join('data/faces/', img_name)),
+               landmarks)
+plt.show()
+
+
+######################################################################
+# Dataset class
+# -------------
+#
+# ``torch.utils.data.Dataset`` is an abstract class representing a
+# dataset.
+# Your custom dataset should inherit ``Dataset`` and override the following
+# methods:
+#
+# -  ``__len__`` so that ``len(dataset)`` returns the size of the dataset.
+# -  ``__getitem__`` to support indexing so that ``dataset[i]`` can
+#    be used to get the :math:`i`\ th sample
+#
+# Let's create a dataset class for our face landmarks dataset. We will
+# read the CSV in ``__init__`` but leave the reading of images to
+# ``__getitem__``. This is memory efficient because the images are not
+# all stored in memory at once but are read as required.
+#
+# A sample of our dataset will be a dict
+# ``{'image': image, 'landmarks': landmarks}``. Our dataset will take an
+# optional argument ``transform`` so that any required processing can be
+# applied to the sample. We will see the usefulness of ``transform`` in the
+# next section.
+#
+
+class FaceLandmarksDataset(Dataset):
+    """Face Landmarks dataset."""
+
+    def __init__(self, csv_file, root_dir, transform=None):
+        """
+        Args:
+            csv_file (string): Path to the csv file with annotations.
+            root_dir (string): Directory with all the images.
+            transform (callable, optional): Optional transform to be applied
+                on a sample.
+        """
+        self.landmarks_frame = pd.read_csv(csv_file)
+        self.root_dir = root_dir
+        self.transform = transform
+
+    def __len__(self):
+        return len(self.landmarks_frame)
+
+    def __getitem__(self, idx):
+        if torch.is_tensor(idx):
+            idx = idx.tolist()
+
+        img_name = os.path.join(self.root_dir,
+                                self.landmarks_frame.iloc[idx, 0])
+        image = io.imread(img_name)
+        landmarks = self.landmarks_frame.iloc[idx, 1:]
+        landmarks = np.array([landmarks])
+        landmarks = landmarks.astype('float').reshape(-1, 2)
+        sample = {'image': image, 'landmarks': landmarks}
+
+        if self.transform:
+            sample = self.transform(sample)
+
+        return sample
+
+
+######################################################################
+# Let's instantiate this class and iterate through the data samples.
+# We will print the sizes of the first 4 samples and show their landmarks.
+#
+
+face_dataset = FaceLandmarksDataset(csv_file='data/faces/face_landmarks.csv',
+                                    root_dir='data/faces/')
+
+fig = plt.figure()
+
+for i in range(len(face_dataset)):
+    sample = face_dataset[i]
+
+    print(i, sample['image'].shape, sample['landmarks'].shape)
+
+    ax = plt.subplot(1, 4, i + 1)
+    plt.tight_layout()
+    ax.set_title('Sample #{}'.format(i))
+    ax.axis('off')
+    show_landmarks(**sample)
+
+    if i == 3:
+        plt.show()
+        break
+
+
+######################################################################
+# Transforms
+# ----------
+#
+# One issue we can see from the above is that the samples are not of the
+# same size. Most neural networks expect images of a fixed size.
+# Therefore, we will need to write some preprocessing code.
+# Let's create three transforms:
+#
+# -  ``Rescale``: to scale the image
+# -  ``RandomCrop``: to crop the image randomly. This is data
+#    augmentation.
+# -  ``ToTensor``: to convert the numpy images to torch images (we need to
+#    swap axes).
+#
+# We will write them as callable classes instead of simple functions so
+# that the parameters of the transform need not be passed every time it's
+# called. For this, we just need to implement the ``__call__`` method and,
+# if required, the ``__init__`` method. We can then use a transform like this:
+#
+# ::
+#
+#     tsfm = Transform(params)
+#     transformed_sample = tsfm(sample)
+#
+# Observe below how these transforms are applied to both the image and
+# the landmarks.
+#
+
+class Rescale(object):
+    """Rescale the image in a sample to a given size.
+
+    Args:
+        output_size (tuple or int): Desired output size. If tuple, output is
+            matched to output_size. If int, smaller of image edges is matched
+            to output_size keeping aspect ratio the same.
+ """ + + def __init__(self, output_size): + assert isinstance(output_size, (int, tuple)) + self.output_size = output_size + + def __call__(self, sample): + image, landmarks = sample['image'], sample['landmarks'] + + h, w = image.shape[:2] + if isinstance(self.output_size, int): + if h > w: + new_h, new_w = self.output_size * h / w, self.output_size + else: + new_h, new_w = self.output_size, self.output_size * w / h + else: + new_h, new_w = self.output_size + + new_h, new_w = int(new_h), int(new_w) + + img = transform.resize(image, (new_h, new_w)) + + # h and w are swapped for landmarks because for images, + # x and y axes are axis 1 and 0 respectively + landmarks = landmarks * [new_w / w, new_h / h] + + return {'image': img, 'landmarks': landmarks} + + +class RandomCrop(object): + """Crop randomly the image in a sample. + + Args: + output_size (tuple or int): Desired output size. If int, square crop + is made. + """ + + def __init__(self, output_size): + assert isinstance(output_size, (int, tuple)) + if isinstance(output_size, int): + self.output_size = (output_size, output_size) + else: + assert len(output_size) == 2 + self.output_size = output_size + + def __call__(self, sample): + image, landmarks = sample['image'], sample['landmarks'] + + h, w = image.shape[:2] + new_h, new_w = self.output_size + + top = np.random.randint(0, h - new_h) + left = np.random.randint(0, w - new_w) + + image = image[top: top + new_h, + left: left + new_w] + + landmarks = landmarks - [left, top] + + return {'image': image, 'landmarks': landmarks} + + +class ToTensor(object): + """Convert ndarrays in sample to Tensors.""" + + def __call__(self, sample): + image, landmarks = sample['image'], sample['landmarks'] + + # swap color axis because + # numpy image: H x W x C + # torch image: C X H X W + image = image.transpose((2, 0, 1)) + return {'image': torch.from_numpy(image), + 'landmarks': torch.from_numpy(landmarks)} + + 
+######################################################################
+# Compose transforms
+# ~~~~~~~~~~~~~~~~~~
+#
+# Now, we apply the transforms to a sample.
+#
+# Let's say we want to rescale the shorter side of the image to 256 and
+# then randomly crop a square of size 224 from it, i.e., we want to compose
+# the ``Rescale`` and ``RandomCrop`` transforms.
+# ``torchvision.transforms.Compose`` is a simple callable class which allows us
+# to do this.
+#
+
+scale = Rescale(256)
+crop = RandomCrop(128)
+composed = transforms.Compose([Rescale(256),
+                               RandomCrop(224)])
+
+# Apply each of the above transforms on the sample.
+fig = plt.figure()
+sample = face_dataset[65]
+for i, tsfrm in enumerate([scale, crop, composed]):
+    transformed_sample = tsfrm(sample)
+
+    ax = plt.subplot(1, 3, i + 1)
+    plt.tight_layout()
+    ax.set_title(type(tsfrm).__name__)
+    show_landmarks(**transformed_sample)
+
+plt.show()
+
+
+######################################################################
+# Iterating through the dataset
+# -----------------------------
+#
+# Let's put this all together to create a dataset with composed
+# transforms.
+# To summarize, every time this dataset is sampled:
+#
+# -  An image is read from the file on the fly
+# -  Transforms are applied to the read image
+# -  Since one of the transforms is random, data is augmented on
+#    sampling
+#
+# We can iterate over the created dataset with a ``for i in range``
+# loop as before.
+# + +transformed_dataset = FaceLandmarksDataset(csv_file='data/faces/face_landmarks.csv', + root_dir='data/faces/', + transform=transforms.Compose([ + Rescale(256), + RandomCrop(224), + ToTensor() + ])) + +for i in range(len(transformed_dataset)): + sample = transformed_dataset[i] + + print(i, sample['image'].size(), sample['landmarks'].size()) + + if i == 3: + break + + +###################################################################### +# However, we are losing a lot of features by using a simple ``for`` loop to +# iterate over the data. In particular, we are missing out on: +# +# - Batching the data +# - Shuffling the data +# - Load the data in parallel using ``multiprocessing`` workers. +# +# ``torch.utils.data.DataLoader`` is an iterator which provides all these +# features. Parameters used below should be clear. One parameter of +# interest is ``collate_fn``. You can specify how exactly the samples need +# to be batched using ``collate_fn``. However, default collate should work +# fine for most use cases. +# + +dataloader = DataLoader(transformed_dataset, batch_size=4, + shuffle=True, num_workers=4) + + +# Helper function to show a batch +def show_landmarks_batch(sample_batched): + """Show image with landmarks for a batch of samples.""" + images_batch, landmarks_batch = \ + sample_batched['image'], sample_batched['landmarks'] + batch_size = len(images_batch) + im_size = images_batch.size(2) + grid_border_size = 2 + + grid = utils.make_grid(images_batch) + plt.imshow(grid.numpy().transpose((1, 2, 0))) + + for i in range(batch_size): + plt.scatter(landmarks_batch[i, :, 0].numpy() + i * im_size + (i + 1) * grid_border_size, + landmarks_batch[i, :, 1].numpy() + grid_border_size, + s=10, marker='.', c='r') + + plt.title('Batch from dataloader') + +for i_batch, sample_batched in enumerate(dataloader): + print(i_batch, sample_batched['image'].size(), + sample_batched['landmarks'].size()) + + # observe 4th batch and stop. 
+ if i_batch == 3: + plt.figure() + show_landmarks_batch(sample_batched) + plt.axis('off') + plt.ioff() + plt.show() + break + +###################################################################### +# Afterword: torchvision +# ---------------------- +# +# In this tutorial, we have seen how to write and use datasets, transforms +# and dataloader. ``torchvision`` package provides some common datasets and +# transforms. You might not even have to write custom classes. One of the +# more generic datasets available in torchvision is ``ImageFolder``. +# It assumes that images are organized in the following way: :: +# +# root/ants/xxx.png +# root/ants/xxy.jpeg +# root/ants/xxz.png +# . +# . +# . +# root/bees/123.jpg +# root/bees/nsdf3.png +# root/bees/asd932_.png +# +# where 'ants', 'bees' etc. are class labels. Similarly generic transforms +# which operate on ``PIL.Image`` like ``RandomHorizontalFlip``, ``Scale``, +# are also available. You can use these to write a dataloader like this: :: +# +# import torch +# from torchvision import transforms, datasets +# +# data_transform = transforms.Compose([ +# transforms.RandomSizedCrop(224), +# transforms.RandomHorizontalFlip(), +# transforms.ToTensor(), +# transforms.Normalize(mean=[0.485, 0.456, 0.406], +# std=[0.229, 0.224, 0.225]) +# ]) +# hymenoptera_dataset = datasets.ImageFolder(root='hymenoptera_data/train', +# transform=data_transform) +# dataset_loader = torch.utils.data.DataLoader(hymenoptera_dataset, +# batch_size=4, shuffle=True, +# num_workers=4) +# +# For an example with training code, please see +# :doc:`transfer_learning_tutorial`. 
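As a footnote to the ``DataLoader`` discussion above: the ``collate_fn`` parameter accepts any callable that merges a list of individual samples into a single batch. Below is a minimal, framework-free sketch; the function name and the toy samples are illustrative only, and a real implementation would stack the images into a tensor (e.g., with ``torch.stack``) rather than keeping plain lists:

```python
def collate_landmarks(batch):
    """Merge a list of {'image': ..., 'landmarks': ...} samples into one batch.

    A DataLoader-style collate function receives the raw samples for one
    batch and decides how they are combined.
    """
    return {
        'image': [sample['image'] for sample in batch],
        'landmarks': [sample['landmarks'] for sample in batch],
    }


# Toy stand-ins for two dataset samples
samples = [{'image': 'img0', 'landmarks': [0.0, 1.0]},
           {'image': 'img1', 'landmarks': [2.0, 3.0]}]
batch = collate_landmarks(samples)
print(batch['landmarks'])  # [[0.0, 1.0], [2.0, 3.0]]
```

Such a function would then be wired in as ``DataLoader(dataset, batch_size=4, collate_fn=collate_landmarks)``; for dict-of-tensor samples like the ones in this tutorial, the default collate already does the equivalent stacking for you.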
diff --git a/recipes_source/recipes/ddp_tutorial.rst b/recipes_source/recipes/ddp_tutorial.rst new file mode 100644 index 00000000000..515a4cb5ce4 --- /dev/null +++ b/recipes_source/recipes/ddp_tutorial.rst @@ -0,0 +1,278 @@ +Getting Started with Distributed Data Parallel +================================================= +**Author**: `Shen Li `_ + +`DistributedDataParallel `__ +(DDP) implements data parallelism at the module level. It uses communication +collectives in the `torch.distributed `__ +package to synchronize gradients, parameters, and buffers. Parallelism is +available both within a process and across processes. Within a process, DDP +replicates the input module to devices specified in ``device_ids``, scatters +inputs along the batch dimension accordingly, and gathers outputs to the +``output_device``, which is similar to +`DataParallel `__. +Across processes, DDP inserts necessary parameter synchronizations in forward +passes and gradient synchronizations in backward passes. It is up to users to +map processes to available resources, as long as processes do not share GPU +devices. The recommended (usually fastest) approach is to create a process for +every module replica, i.e., no module replication within a process. The code in +this tutorial runs on an 8-GPU server, but it can be easily generalized to +other environments. + +Comparison between ``DataParallel`` and ``DistributedDataParallel`` +------------------------------------------------------------------- + +Before we dive in, let's clarify why, despite the added complexity, you would +consider using ``DistributedDataParallel`` over ``DataParallel``: + +- First, recall from the + `prior tutorial `__ + that if your model is too large to fit on a single GPU, you must use **model parallel** + to split it across multiple GPUs. ``DistributedDataParallel`` works with + **model parallel**; ``DataParallel`` does not at this time. 
+- ``DataParallel`` is single-process, multi-thread, and only works on a single + machine, while ``DistributedDataParallel`` is multi-process and works for both + single- and multi- machine training. Thus, even for single machine training, + where your **data** is small enough to fit on a single machine, ``DistributedDataParallel`` + is expected to be faster than ``DataParallel``. ``DistributedDataParallel`` + also replicates models upfront instead of on each iteration and gets Global + Interpreter Lock out of the way. +- If both your data is too large to fit on one machine **and** your + model is too large to fit on a single GPU, you can combine model parallel + (splitting a single model across multiple GPUs) with ``DistributedDataParallel``. + Under this regime, each ``DistributedDataParallel`` process could use model parallel, + and all processes collectively would use data parallel. + +Basic Use Case +-------------- + +To create DDP modules, first set up process groups properly. More details can +be found in +`Writing Distributed Applications with PyTorch `__. + +.. code:: python + + import os + import tempfile + import torch + import torch.distributed as dist + import torch.nn as nn + import torch.optim as optim + import torch.multiprocessing as mp + + from torch.nn.parallel import DistributedDataParallel as DDP + + + def setup(rank, world_size): + os.environ['MASTER_ADDR'] = 'localhost' + os.environ['MASTER_PORT'] = '12355' + + # initialize the process group + dist.init_process_group("gloo", rank=rank, world_size=world_size) + + # Explicitly setting seed to make sure that models created in two processes + # start from same random weights and biases. + torch.manual_seed(42) + + + def cleanup(): + dist.destroy_process_group() + +Now, let's create a toy module, wrap it with DDP, and feed it with some dummy +input data. Please note, if training starts from random parameters, you might +want to make sure that all DDP processes use the same initial values. 
+Otherwise, global gradient synchronization will not make sense.
+
+.. code:: python
+
+    class ToyModel(nn.Module):
+        def __init__(self):
+            super(ToyModel, self).__init__()
+            self.net1 = nn.Linear(10, 10)
+            self.relu = nn.ReLU()
+            self.net2 = nn.Linear(10, 5)
+
+        def forward(self, x):
+            return self.net2(self.relu(self.net1(x)))
+
+
+    def demo_basic(rank, world_size):
+        setup(rank, world_size)
+
+        # setup devices for this process, rank 1 uses GPUs [0, 1, 2, 3] and
+        # rank 2 uses GPUs [4, 5, 6, 7].
+        n = torch.cuda.device_count() // world_size
+        device_ids = list(range(rank * n, (rank + 1) * n))
+
+        # create model and move it to device_ids[0]
+        model = ToyModel().to(device_ids[0])
+        # output_device defaults to device_ids[0]
+        ddp_model = DDP(model, device_ids=device_ids)
+
+        loss_fn = nn.MSELoss()
+        optimizer = optim.SGD(ddp_model.parameters(), lr=0.001)
+
+        optimizer.zero_grad()
+        outputs = ddp_model(torch.randn(20, 10))
+        labels = torch.randn(20, 5).to(device_ids[0])
+        loss_fn(outputs, labels).backward()
+        optimizer.step()
+
+        cleanup()
+
+
+    def run_demo(demo_fn, world_size):
+        mp.spawn(demo_fn,
+                 args=(world_size,),
+                 nprocs=world_size,
+                 join=True)
+
+As you can see, DDP wraps lower-level distributed communication details and
+provides a clean API, as if it were a local model. For basic use cases, DDP
+only requires a few more lines of code to set up the process group. When
+applying DDP to more advanced use cases, there are some caveats that require
+caution.
+
+Skewed Processing Speeds
+------------------------
+
+In DDP, the constructor, the forward method, and differentiation of the
+outputs are distributed synchronization points. Different processes are
+expected to reach synchronization points in the same order and enter each
+synchronization point at roughly the same time. Otherwise, fast processes
+might arrive early and time out waiting for stragglers. Hence, users are
+responsible for balancing workload distribution across processes.
+Sometimes, skewed processing speeds are inevitable due to, e.g., network
+delays, resource contention, or unpredictable workload spikes. To avoid
+timeouts in these situations, make sure that you pass a sufficiently large
+``timeout`` value when calling
+`init_process_group `__.
+
+Save and Load Checkpoints
+-------------------------
+
+It's common to use ``torch.save`` and ``torch.load`` to checkpoint modules
+during training and recover from checkpoints. See
+`Saving and Loading Models `__
+for more details. When using DDP, one optimization is to save the model in
+only one process and then load it in all processes, reducing write overhead.
+This is correct because all processes start from the same parameters and
+gradients are synchronized in backward passes, so optimizers should keep
+setting parameters to the same values. If you use this optimization, make
+sure no process starts loading before the saving is finished. In addition,
+when loading the module, you need to provide an appropriate ``map_location``
+argument to prevent a process from stepping into others' devices. If
+``map_location`` is missing, ``torch.load`` will first load the module to CPU
+and then copy each parameter to where it was saved, which would result in all
+processes on the same machine using the same set of devices.
+
+.. code:: python
+
+    def demo_checkpoint(rank, world_size):
+        setup(rank, world_size)
+
+        # setup devices for this process, rank 1 uses GPUs [0, 1, 2, 3] and
+        # rank 2 uses GPUs [4, 5, 6, 7].
+ n = torch.cuda.device_count() // world_size + device_ids = list(range(rank * n, (rank + 1) * n)) + + model = ToyModel().to(device_ids[0]) + # output_device defaults to device_ids[0] + ddp_model = DDP(model, device_ids=device_ids) + + loss_fn = nn.MSELoss() + optimizer = optim.SGD(ddp_model.parameters(), lr=0.001) + + CHECKPOINT_PATH = tempfile.gettempdir() + "/model.checkpoint" + if rank == 0: + # All processes should see same parameters as they all start from same + # random parameters and gradients are synchronized in backward passes. + # Therefore, saving it in one process is sufficient. + torch.save(ddp_model.state_dict(), CHECKPOINT_PATH) + + # Use a barrier() to make sure that process 1 loads the model after process + # 0 saves it. + dist.barrier() + # configure map_location properly + rank0_devices = [x - rank * len(device_ids) for x in device_ids] + device_pairs = zip(rank0_devices, device_ids) + map_location = {'cuda:%d' % x: 'cuda:%d' % y for x, y in device_pairs} + ddp_model.load_state_dict( + torch.load(CHECKPOINT_PATH, map_location=map_location)) + + optimizer.zero_grad() + outputs = ddp_model(torch.randn(20, 10)) + labels = torch.randn(20, 5).to(device_ids[0]) + loss_fn = nn.MSELoss() + loss_fn(outputs, labels).backward() + optimizer.step() + + # Use a barrier() to make sure that all processes have finished reading the + # checkpoint + dist.barrier() + + if rank == 0: + os.remove(CHECKPOINT_PATH) + + cleanup() + +Combine DDP with Model Parallelism +---------------------------------- + +DDP also works with multi-GPU models, but replications within a process are not +supported. You need to create one process per module replica, which usually +leads to better performance compared to multiple replicas per process. DDP +wrapping multi-GPU models is especially helpful when training large models with +a huge amount of data. 
When using this feature, the multi-GPU model needs to be +carefully implemented to avoid hard-coded devices, because different model +replicas will be placed to different devices. + +.. code:: python + + class ToyMpModel(nn.Module): + def __init__(self, dev0, dev1): + super(ToyMpModel, self).__init__() + self.dev0 = dev0 + self.dev1 = dev1 + self.net1 = torch.nn.Linear(10, 10).to(dev0) + self.relu = torch.nn.ReLU() + self.net2 = torch.nn.Linear(10, 5).to(dev1) + + def forward(self, x): + x = x.to(self.dev0) + x = self.relu(self.net1(x)) + x = x.to(self.dev1) + return self.net2(x) + +When passing a multi-GPU model to DDP, ``device_ids`` and ``output_device`` +must NOT be set. Input and output data will be placed in proper devices by +either the application or the model ``forward()`` method. + +.. code:: python + + def demo_model_parallel(rank, world_size): + setup(rank, world_size) + + # setup mp_model and devices for this process + dev0 = rank * 2 + dev1 = rank * 2 + 1 + mp_model = ToyMpModel(dev0, dev1) + ddp_mp_model = DDP(mp_model) + + loss_fn = nn.MSELoss() + optimizer = optim.SGD(ddp_mp_model.parameters(), lr=0.001) + + optimizer.zero_grad() + # outputs will be on dev1 + outputs = ddp_mp_model(torch.randn(20, 10)) + labels = torch.randn(20, 5).to(dev1) + loss_fn(outputs, labels).backward() + optimizer.step() + + cleanup() + + + if __name__ == "__main__": + run_demo(demo_basic, 2) + run_demo(demo_checkpoint, 2) + + if torch.cuda.device_count() >= 8: + run_demo(demo_model_parallel, 4) diff --git a/recipes_source/recipes/defining_a_neural_network.py b/recipes_source/recipes/defining_a_neural_network.py new file mode 100644 index 00000000000..2e8ec19b292 --- /dev/null +++ b/recipes_source/recipes/defining_a_neural_network.py @@ -0,0 +1,183 @@ +""" +Defining a Neural Network in PyTorch +==================================== +Deep learning uses artificial neural networks (models), which are +computing systems that are composed of many layers of interconnected +units. 
By passing data through these interconnected
+units, a neural network is able to learn how to approximate the
+computations required to transform inputs into outputs. In PyTorch,
+neural networks can be constructed using the ``torch.nn`` package.
+
+Introduction
+------------
+PyTorch provides elegantly designed modules and classes, including
+``torch.nn``, to help you create and train neural networks. An
+``nn.Module`` contains layers, and a method ``forward(input)`` that
+returns the ``output``.
+
+In this recipe, we will use ``torch.nn`` to define a neural network
+intended for the `MNIST
+dataset `__.
+
+Setup
+-----
+Before we begin, we need to install ``torch`` if it isn’t already
+available.
+
+::
+
+   pip install torch
+
+
+"""
+
+
+######################################################################
+# Steps
+# -----
+#
+# 1. Import all necessary libraries for loading our data
+# 2. Define and initialize the neural network
+# 3. Specify how data will pass through your model
+# 4. [Optional] Pass data through your model to test
+#
+# 1. Import necessary libraries for loading our data
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn``
+# and ``torch.nn.functional``.
+#
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+
+
+######################################################################
+# 2. Define and initialize the neural network
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Our network will recognize images. We will use a process built into
+# PyTorch called convolution. Convolution adds each element of an image to
+# its local neighbors, weighted by a kernel, or a small matrix, that
+# helps us extract certain features (like edge detection, sharpness,
+# blurriness, etc.) from the input image.
+#
+# There are two requirements for defining the ``Net`` class of your model.
+# The first is writing an ``__init__`` function that references
+# ``nn.Module``. This function is where you define the layers of
+# your neural network.
+#
+# Using convolution, we will define our model to take 1 input image
+# channel and output 10 labels representing the digits 0 through 9.
+# This architecture is yours to design; here, we follow a standard
+# MNIST architecture.
+#
+
+class Net(nn.Module):
+    def __init__(self):
+        super(Net, self).__init__()
+
+        # First 2D convolutional layer, taking in 1 input channel (image),
+        # outputting 32 convolutional features, with a square kernel size of 3
+        self.conv1 = nn.Conv2d(1, 32, 3, 1)
+        # Second 2D convolutional layer, taking in the 32 input layers,
+        # outputting 64 convolutional features, with a square kernel size of 3
+        self.conv2 = nn.Conv2d(32, 64, 3, 1)
+
+        # Designed to ensure that adjacent pixels are either all 0s or all active
+        # with an input probability
+        self.dropout1 = nn.Dropout2d(0.25)
+        self.dropout2 = nn.Dropout2d(0.5)
+
+        # First fully connected layer
+        self.fc1 = nn.Linear(9216, 128)
+        # Second fully connected layer that outputs our 10 labels
+        self.fc2 = nn.Linear(128, 10)
+
+my_nn = Net()
+print(my_nn)
+
+
+######################################################################
+# We have finished defining our neural network; now we have to define how
+# our data will pass through it.
+#
+# 3. Specify how data will pass through your model
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# When you use PyTorch to build a model, you just have to define the
+# ``forward`` function, which will pass the data into the computation graph
+# (i.e., our neural network). This will represent our feed-forward
+# algorithm.
+#
+# You can use any of the Tensor operations in the ``forward`` function.
+# + +class Net(nn.Module): + def __init__(self): + super(Net, self).__init__() + self.conv1 = nn.Conv2d(1, 32, 3, 1) + self.conv2 = nn.Conv2d(32, 64, 3, 1) + self.dropout1 = nn.Dropout2d(0.25) + self.dropout2 = nn.Dropout2d(0.5) + self.fc1 = nn.Linear(9216, 128) + self.fc2 = nn.Linear(128, 10) + + # x represents our data + def forward(self, x): + # Pass data through conv1 + x = self.conv1(x) + # Use the rectified-linear activation function over x + x = F.relu(x) + + x = self.conv2(x) + x = F.relu(x) + + # Run max pooling over x + x = F.max_pool2d(x, 2) + # Pass data through dropout1 + x = self.dropout1(x) + # Flatten x with start_dim=1 + x = torch.flatten(x, 1) + # Pass data through fc1 + x = self.fc1(x) + x = F.relu(x) + x = self.dropout2(x) + x = self.fc2(x) + + # Apply softmax to x + output = F.log_softmax(x, dim=1) + return output + + +###################################################################### +# 4. [Optional] Pass data through your model to test +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# To ensure we receive our desired output, let’s test our model by passing +# some random data through it. +# + +# Equates to one random 28x28 image +random_data = torch.rand((1, 1, 28, 28)) + +my_nn = Net() +result = my_nn(random_data) +print(result) + + +###################################################################### +# Each number in this resulting tensor corresponds to the model's prediction for +# the label that the random tensor is associated with. +# +# Congratulations! You have successfully defined a neural network in +# PyTorch.
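Because ``forward`` ends with ``F.log_softmax``, every entry of the resulting tensor is a log-probability. As a quick sketch (using a random tensor as a stand-in for the network's pre-softmax output, so it runs independently of the ``Net`` class above), here is how you might recover the predicted label and the underlying probabilities:

```python
import torch
import torch.nn.functional as F

# Stand-in for the network's raw output on a batch of one image
logits = torch.rand(1, 10)
log_probs = F.log_softmax(logits, dim=1)  # same form as what forward() returns

pred = log_probs.argmax(dim=1)   # index of the most likely label (0-9)
probs = log_probs.exp()          # back to probabilities; each row sums to 1
print(pred.item())
```

Applying the same two lines to ``result`` from the snippet above gives the label the (still untrained) network currently favors for the random image.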
+# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes/deployment_with_flask.rst b/recipes_source/recipes/deployment_with_flask.rst new file mode 100644 index 00000000000..0d1291c8d89 --- /dev/null +++ b/recipes_source/recipes/deployment_with_flask.rst @@ -0,0 +1,284 @@ +Deploying with Flask +==================== + +In this recipe, you will learn: + +- How to wrap your trained PyTorch model in a Flask container to expose + it via a web API +- How to translate incoming web requests into PyTorch tensors for your + model +- How to package your model’s output for an HTTP response + +Requirements +------------ + +You will need a Python 3 environment with the following packages (and +their dependencies) installed: + +- PyTorch 1.5 +- TorchVision 0.6.0 +- Flask 1.1 + +Optionally, to get some of the supporting files, you'll need git. + +The instructions for installing PyTorch and TorchVision are available at +`pytorch.org`_. Instructions for installing Flask are available on `the +Flask site`_. + +What is Flask? +-------------- + +Flask is a lightweight web server written in Python. It provides a +convenient way for you to quickly set up a web API for predictions from +your trained PyTorch model, either for direct use, or as a web service +within a larger system. + +Setup and Supporting Files +-------------------------- + +We're going to create a web service that takes in images, and maps them +to one of the 1000 classes of the ImageNet dataset. To do this, you'll +need an image file for testing. Optionally, you can also get a file that +will map the class index output by the model to a human-readable class +name. + +Option 1: To Get Both Files Quickly +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +You can pull both of the supporting files quickly by checking out the +TorchServe repository and copying them to your working folder. 
*(NB: +There is no dependency on TorchServe for this tutorial - it's just a +quick way to get the files.)* Issue the following commands from your +shell prompt: + +:: + + git clone https://github.com/pytorch/serve + cp serve/examples/image_classifier/kitten.jpg . + cp serve/examples/image_classifier/index_to_name.json . + +And you've got them! + +Option 2: Bring Your Own Image +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The ``index_to_name.json`` file is optional in the Flask service below. +You can test your service with your own image - just make sure it's a +3-color JPEG. + +Building Your Flask Service +--------------------------- + +The full Python script for the Flask service is shown at the end of this +recipe; you can copy and paste that into your own ``app.py`` file. Below +we'll look at individual sections to make their functions clear. + +Imports +~~~~~~~ + +:: + + import torchvision.models as models + import torchvision.transforms as transforms + from PIL import Image + from flask import Flask, jsonify, request + +In order: + +- We'll be using a pre-trained DenseNet model from + ``torchvision.models`` +- ``torchvision.transforms`` contains tools for manipulating your image + data +- Pillow (``PIL``) is what we'll use to load the image file initially +- And of course we'll need classes from ``flask`` + +Pre-Processing +~~~~~~~~~~~~~~ + +:: + + def transform_image(infile): + input_transforms = [transforms.Resize(255), + transforms.CenterCrop(224), + transforms.ToTensor(), + transforms.Normalize([0.485, 0.456, 0.406], + [0.229, 0.224, 0.225])] + my_transforms = transforms.Compose(input_transforms) + image = Image.open(infile) + timg = my_transforms(image) + timg.unsqueeze_(0) + return timg + +The web request gave us an image file, but our model expects a PyTorch +tensor of shape (N, 3, 224, 224) where *N* is the number of items in the +input batch. (We will just have a batch size of 1.) 
The first thing we +do is compose a set of TorchVision transforms that resize and crop the +image, convert it to a tensor, then normalize the values in the tensor. +(For more information on this normalization, see the documentation for +`torchvision.models`_.) + +After that, we open the file and apply the transforms. The transforms +return a tensor of shape (3, 224, 224) - the 3 color channels of a +224x224 image. Because we need to make this single image a batch, we use +the ``unsqueeze_(0)`` call to modify the tensor in place by adding a new +first dimension. The tensor contains the same data, but now has shape +(1, 3, 224, 224). + +In general, even if you're not working with image data, you will need to +transform the input from your HTTP request into a tensor that PyTorch +can consume. + +Inference +~~~~~~~~~ + +:: + + def get_prediction(input_tensor): + outputs = model.forward(input_tensor) + _, y_hat = outputs.max(1) + prediction = y_hat.item() + return prediction + +The inference itself is the simplest part: when we pass the input tensor +to the model, we get back a tensor of values that represent the model's +estimated likelihood that the image belongs to a particular class. The +``max()`` call finds the class with the maximum likelihood value, and +returns that value along with the ImageNet class index. Finally, we extract +that class index from the tensor containing it with the ``item()`` call, and +return it. + +Post-Processing +~~~~~~~~~~~~~~~ + +:: + + def render_prediction(prediction_idx): + stridx = str(prediction_idx) + class_name = 'Unknown' + if img_class_map is not None: + if stridx in img_class_map: + class_name = img_class_map[stridx][1] + + return prediction_idx, class_name + +The ``render_prediction()`` method maps the predicted class index to a +human-readable class label.
It's typical, after getting the prediction +from your model, to perform post-processing to make the prediction ready +for either human consumption, or for another piece of software. + +Running The Full Flask App +-------------------------- + +Paste the following into a file called ``app.py``: + +:: + + import io + import json + import os + + import torchvision.models as models + import torchvision.transforms as transforms + from PIL import Image + from flask import Flask, jsonify, request + + + app = Flask(__name__) + model = models.densenet121(pretrained=True) # Trained on 1000 classes from ImageNet + model.eval() # Set the model to evaluation mode (e.g., disables dropout) + + + + img_class_map = None + mapping_file_path = 'index_to_name.json' # Human-readable names for Imagenet classes + if os.path.isfile(mapping_file_path): + with open(mapping_file_path) as f: + img_class_map = json.load(f) + + + + # Transform input into the form our model expects + def transform_image(infile): + input_transforms = [transforms.Resize(255), # We use multiple TorchVision transforms to ready the image + transforms.CenterCrop(224), + transforms.ToTensor(), + transforms.Normalize([0.485, 0.456, 0.406], # Standard normalization for ImageNet model input + [0.229, 0.224, 0.225])] + my_transforms = transforms.Compose(input_transforms) + image = Image.open(infile) # Open the image file + timg = my_transforms(image) # Transform PIL image to appropriately-shaped PyTorch tensor + timg.unsqueeze_(0) # PyTorch models expect batched input; create a batch of 1 + return timg + + + # Get a prediction + def get_prediction(input_tensor): + outputs = model.forward(input_tensor) # Get likelihoods for all ImageNet classes + _, y_hat = outputs.max(1) # Extract the most likely class + prediction = y_hat.item() # Extract the int value from the PyTorch tensor + return prediction + + # Make the prediction human-readable + def render_prediction(prediction_idx): + stridx = str(prediction_idx) + class_name = 'Unknown' + if img_class_map is not
None: + if stridx in img_class_map: + class_name = img_class_map[stridx][1] + + return prediction_idx, class_name + + + @app.route('/', methods=['GET']) + def root(): + return jsonify({'msg' : 'Try POSTing to the /predict endpoint with an RGB image attachment'}) + + + @app.route('/predict', methods=['POST']) + def predict(): + if request.method == 'POST': + file = request.files['file'] + if file is not None: + input_tensor = transform_image(file) + prediction_idx = get_prediction(input_tensor) + class_id, class_name = render_prediction(prediction_idx) + return jsonify({'class_id': class_id, 'class_name': class_name}) + + + if __name__ == '__main__': + app.run() + +To start the server from your shell prompt, issue the following command: + +:: + + FLASK_APP=app.py flask run + +By default, your Flask server is listening on port 5000. Once the server +is running, open another terminal window, and test your new inference +server: + +:: + + curl -X POST -H "Content-Type: multipart/form-data" http://localhost:5000/predict -F "file=@kitten.jpg" + +If everything is set up correctly, you should receive a response similar +to the following: + +:: + + {"class_id":285,"class_name":"Egyptian_cat"} + +Important Resources +------------------- + +- `pytorch.org`_ for installation instructions, and more documentation + and tutorials +- The `Flask site`_ has a `Quick Start guide`_ that goes into more + detail on setting up a simple Flask service + +.. _pytorch.org: https://pytorch.org +.. _Flask site: https://flask.palletsprojects.com/en/1.1.x/ +.. _Quick Start guide: https://flask.palletsprojects.com/en/1.1.x/quickstart/ +.. _torchvision.models: https://pytorch.org/docs/stable/torchvision/models.html +.. 
_the Flask site: https://flask.palletsprojects.com/en/1.1.x/installation/ diff --git a/recipes_source/recipes/dist_tuto.rst b/recipes_source/recipes/dist_tuto.rst new file mode 100644 index 00000000000..3a76bb1dd3c --- /dev/null +++ b/recipes_source/recipes/dist_tuto.rst @@ -0,0 +1,629 @@ +Writing Distributed Applications with PyTorch +============================================= +**Author**: `Séb Arnold `_ + +In this short tutorial, we will be going over the distributed package +of PyTorch. We'll see how to set up the distributed setting, use the +different communication strategies, and go over some of the internals of +the package. + +Setup +----- + +.. raw:: html + + + +The distributed package included in PyTorch (i.e., +``torch.distributed``) enables researchers and practitioners to easily +parallelize their computations across processes and clusters of +machines. To do so, it leverages message passing semantics, +allowing each process to communicate data to any of the other processes. +As opposed to the multiprocessing (``torch.multiprocessing``) package, +processes can use different communication backends and are not +restricted to being executed on the same machine. + +In order to get started we need the ability to run multiple processes +simultaneously. If you have access to a compute cluster you should check +with your local sysadmin or use your favorite coordination tool (e.g., +`pdsh `__, +`clustershell `__, or +`others `__). For the purpose of this +tutorial, we will use a single machine and fork multiple processes using +the following template. + +.. code:: python + + """run.py:""" + #!/usr/bin/env python + import os + import torch + import torch.distributed as dist + from torch.multiprocessing import Process + + def run(rank, size): + """ Distributed function to be implemented later. """ + pass + + def init_process(rank, size, fn, backend='gloo'): + """ Initialize the distributed environment. 
""" + os.environ['MASTER_ADDR'] = '127.0.0.1' + os.environ['MASTER_PORT'] = '29500' + dist.init_process_group(backend, rank=rank, world_size=size) + fn(rank, size) + + + if __name__ == "__main__": + size = 2 + processes = [] + for rank in range(size): + p = Process(target=init_process, args=(rank, size, run)) + p.start() + processes.append(p) + + for p in processes: + p.join() + +The above script spawns two processes that will each set up the +distributed environment, initialize the process group +(``dist.init_process_group``), and finally execute the given ``run`` +function. + +Let's have a look at the ``init_process`` function. It ensures that +every process will be able to coordinate through a master, using the +same IP address and port. Note that we used the ``gloo`` backend but +other backends are available. (c.f. +`Section 5.1 <#communication-backends>`__) We will go over the magic +happening in ``dist.init_process_group`` at the end of this tutorial, +but it essentially allows processes to communicate with each other by +sharing their locations. + +Point-to-Point Communication +---------------------------- + +.. figure:: /_static/img/distributed/send_recv.png + :width: 100% + :align: center + :alt: Send and Recv + + Send and Recv + + +A transfer of data from one process to another is called a +point-to-point communication. These are achieved through the ``send`` +and ``recv`` functions or their *immediate* counterparts, ``isend`` and +``irecv``. + +.. code:: python + + """Blocking point-to-point communication.""" + + def run(rank, size): + tensor = torch.zeros(1) + if rank == 0: + tensor += 1 + # Send the tensor to process 1 + dist.send(tensor=tensor, dst=1) + else: + # Receive tensor from process 0 + dist.recv(tensor=tensor, src=0) + print('Rank ', rank, ' has data ', tensor[0]) + +In the above example, both processes start with a zero tensor, then +process 0 increments the tensor and sends it to process 1 so that they +both end up with 1.0.
Notice that process 1 needs to allocate memory in +order to store the data it will receive. + +Also notice that ``send``/``recv`` are **blocking**: both processes stop +until the communication is completed. On the other hand, immediates are +**non-blocking**; the script continues its execution and the methods +return a ``Work`` object upon which we can choose to +``wait()``. + +.. code:: python + + """Non-blocking point-to-point communication.""" + + def run(rank, size): + tensor = torch.zeros(1) + req = None + if rank == 0: + tensor += 1 + # Send the tensor to process 1 + req = dist.isend(tensor=tensor, dst=1) + print('Rank 0 started sending') + else: + # Receive tensor from process 0 + req = dist.irecv(tensor=tensor, src=0) + print('Rank 1 started receiving') + req.wait() + print('Rank ', rank, ' has data ', tensor[0]) + +When using immediates we have to be careful with our usage of the sent and received tensors. +Since we do not know when the data will be communicated to the other process, +we should not modify the sent tensor nor access the received tensor before ``req.wait()`` has completed. +In other words, + +- writing to ``tensor`` after ``dist.isend()`` will result in undefined behaviour. +- reading from ``tensor`` after ``dist.irecv()`` will result in undefined behaviour. + +However, after ``req.wait()`` +has been executed we are guaranteed that the communication took place, +and that the value stored in ``tensor[0]`` is 1.0. + +Point-to-point communication is useful when we want fine-grained +control over the communication of our processes. It can be used to +implement fancy algorithms, such as the one used in `Baidu's +DeepSpeech `__ or +`Facebook's large-scale +experiments `__. (c.f. +`Section 4.1 <#our-own-ring-allreduce>`__) + +Collective Communication +------------------------ + ++----------------------------------------------------+-----------------------------------------------------+ +| .. 
figure:: /_static/img/distributed/scatter.png | .. figure:: /_static/img/distributed/gather.png | +| :alt: Scatter | :alt: Gather | +| :width: 100% | :width: 100% | +| :align: center | :align: center | +| | | +| Scatter | Gather | ++----------------------------------------------------+-----------------------------------------------------+ +| .. figure:: /_static/img/distributed/reduce.png | .. figure:: /_static/img/distributed/all_reduce.png | +| :alt: Reduce | :alt: All-Reduce | +| :width: 100% | :width: 100% | +| :align: center | :align: center | +| | | +| Reduce | All-Reduce | ++----------------------------------------------------+-----------------------------------------------------+ +| .. figure:: /_static/img/distributed/broadcast.png | .. figure:: /_static/img/distributed/all_gather.png | +| :alt: Broadcast | :alt: All-Gather | +| :width: 100% | :width: 100% | +| :align: center | :align: center | +| | | +| Broadcast | All-Gather | ++----------------------------------------------------+-----------------------------------------------------+ + + + +As opposed to point-to-point communication, collectives allow for +communication patterns across all processes in a **group**. A group is a +subset of all our processes. To create a group, we can pass a list of +ranks to ``dist.new_group(group)``. By default, collectives are executed +on all processes, also known as the **world**. For example, in order +to obtain the sum of all tensors on all processes, we can use the +``dist.all_reduce(tensor, op, group)`` collective. + +.. code:: python + + """ All-Reduce example.""" + def run(rank, size): + """ Simple collective communication. """ + group = dist.new_group([0, 1]) + tensor = torch.ones(1) + dist.all_reduce(tensor, op=dist.reduce_op.SUM, group=group) + print('Rank ', rank, ' has data ', tensor[0]) + +Since we want the sum of all tensors in the group, we use +``dist.reduce_op.SUM`` as the reduce operator.
Generally speaking, any +commutative mathematical operation can be used as an operator. +Out-of-the-box, PyTorch comes with 4 such operators, all working at the +element-wise level: + +- ``dist.reduce_op.SUM``, +- ``dist.reduce_op.PRODUCT``, +- ``dist.reduce_op.MAX``, +- ``dist.reduce_op.MIN``. + +In addition to ``dist.all_reduce(tensor, op, group)``, there are a total +of 6 collectives currently implemented in PyTorch. + +- ``dist.broadcast(tensor, src, group)``: Copies ``tensor`` from + ``src`` to all other processes. +- ``dist.reduce(tensor, dst, op, group)``: Applies ``op`` to all + ``tensor`` and stores the result in ``dst``. +- ``dist.all_reduce(tensor, op, group)``: Same as reduce, but the + result is stored in all processes. +- ``dist.scatter(tensor, src, scatter_list, group)``: Copies the + :math:`i^{\text{th}}` tensor ``scatter_list[i]`` to the + :math:`i^{\text{th}}` process. +- ``dist.gather(tensor, dst, gather_list, group)``: Copies ``tensor`` + from all processes in ``dst``. +- ``dist.all_gather(tensor_list, tensor, group)``: Copies ``tensor`` + from all processes to ``tensor_list``, on all processes. +- ``dist.barrier(group)``: block all processes in `group` until each one has entered this function. + +Distributed Training +-------------------- + +.. raw:: html + + + +**Note:** You can find the example script of this section in `this +GitHub repository `__. + +Now that we understand how the distributed module works, let us write +something useful with it. Our goal will be to replicate the +functionality of +`DistributedDataParallel `__. +Of course, this will be a didactic example and in a real-world +situation you should use the official, well-tested and well-optimized +version linked above. + +Quite simply we want to implement a distributed version of stochastic +gradient descent. Our script will let all processes compute the +gradients of their model on their batch of data and then average their +gradients. 
In order to ensure similar convergence results when changing +the number of processes, we will first have to partition our dataset. +(You could also use +`tnt.dataset.SplitDataset `__, +instead of the snippet below.) + +.. code:: python + + """ Dataset partitioning helper """ + from random import Random + + class Partition(object): + + def __init__(self, data, index): + self.data = data + self.index = index + + def __len__(self): + return len(self.index) + + def __getitem__(self, index): + data_idx = self.index[index] + return self.data[data_idx] + + + class DataPartitioner(object): + + def __init__(self, data, sizes=[0.7, 0.2, 0.1], seed=1234): + self.data = data + self.partitions = [] + rng = Random() + rng.seed(seed) + data_len = len(data) + indexes = [x for x in range(0, data_len)] + rng.shuffle(indexes) + + for frac in sizes: + part_len = int(frac * data_len) + self.partitions.append(indexes[0:part_len]) + indexes = indexes[part_len:] + + def use(self, partition): + return Partition(self.data, self.partitions[partition]) + +With the above snippet, we can now simply partition any dataset using +the following few lines: + +.. code:: python + + """ Partitioning MNIST """ + def partition_dataset(): + dataset = datasets.MNIST('./data', train=True, download=True, + transform=transforms.Compose([ + transforms.ToTensor(), + transforms.Normalize((0.1307,), (0.3081,)) + ])) + size = dist.get_world_size() + bsz = 128 // size # DataLoader expects an integer batch size + partition_sizes = [1.0 / size for _ in range(size)] + partition = DataPartitioner(dataset, partition_sizes) + partition = partition.use(dist.get_rank()) + train_set = torch.utils.data.DataLoader(partition, + batch_size=bsz, + shuffle=True) + return train_set, bsz + +Assuming we have 2 replicas, then each process will have a ``train_set`` +of 60000 / 2 = 30000 samples. We also divide the batch size by the +number of replicas in order to maintain the *overall* batch size of 128.
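To sanity-check the arithmetic above without touching MNIST or ``torch.distributed``, here is a small dependency-free copy of ``DataPartitioner``'s shuffle-and-split logic (an illustrative sketch; ``partition_indices`` is our own name, not part of the tutorial's API):

```python
from random import Random

def partition_indices(data_len, sizes, seed=1234):
    # Deterministically shuffle the indices, then split them by the given
    # fractions -- the same scheme DataPartitioner uses above
    rng = Random()
    rng.seed(seed)
    indexes = list(range(data_len))
    rng.shuffle(indexes)
    partitions = []
    for frac in sizes:
        part_len = int(frac * data_len)
        partitions.append(indexes[:part_len])
        indexes = indexes[part_len:]
    return partitions

# Two replicas over the 60000 MNIST training samples
parts = partition_indices(60000, [0.5, 0.5])
print(len(parts[0]), len(parts[1]))  # 30000 30000
```

Each replica sees a disjoint half of the data, and with a per-replica batch size of 128 // 2 = 64 the *overall* batch size stays at 128.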
+ +We can now write our usual forward-backward-optimize training code, and +add a function call to average the gradients of our models. (The +following is largely inspired from the official `PyTorch MNIST +example `__.) + +.. code:: python + + """ Distributed Synchronous SGD Example """ + def run(rank, size): + torch.manual_seed(1234) + train_set, bsz = partition_dataset() + model = Net() + optimizer = optim.SGD(model.parameters(), + lr=0.01, momentum=0.5) + + num_batches = ceil(len(train_set.dataset) / float(bsz)) + for epoch in range(10): + epoch_loss = 0.0 + for data, target in train_set: + optimizer.zero_grad() + output = model(data) + loss = F.nll_loss(output, target) + epoch_loss += loss.item() + loss.backward() + average_gradients(model) + optimizer.step() + print('Rank ', dist.get_rank(), ', epoch ', + epoch, ': ', epoch_loss / num_batches) + +It remains to implement the ``average_gradients(model)`` function, which +simply takes in a model and averages its gradients across the whole +world. + +.. code:: python + + """ Gradient averaging. """ + def average_gradients(model): + size = float(dist.get_world_size()) + for param in model.parameters(): + dist.all_reduce(param.grad.data, op=dist.reduce_op.SUM) + param.grad.data /= size + +*Et voilà*! We successfully implemented distributed synchronous SGD and +could train any model on a large computer cluster. + +**Note:** While the last sentence is *technically* true, there are `a +lot more tricks `__ required to +implement a production-level implementation of synchronous SGD. Again, +use what `has been tested and +optimized `__. + +Our Own Ring-Allreduce +~~~~~~~~~~~~~~~~~~~~~~ + +As an additional challenge, imagine that we wanted to implement +DeepSpeech's efficient ring allreduce. This is fairly easily implemented +using point-to-point collectives. + +.. code:: python + + """ Implementation of a ring-reduce with addition. 
""" + def allreduce(send, recv): + rank = dist.get_rank() + size = dist.get_world_size() + send_buff = send.clone() + recv_buff = send.clone() + accum = send.clone() + + left = ((rank - 1) + size) % size + right = (rank + 1) % size + + for i in range(size - 1): + if i % 2 == 0: + # Send send_buff + send_req = dist.isend(send_buff, right) + dist.recv(recv_buff, left) + accum[:] += recv_buff[:] + else: + # Send recv_buff + send_req = dist.isend(recv_buff, right) + dist.recv(send_buff, left) + accum[:] += send_buff[:] + send_req.wait() + recv[:] = accum[:] + +In the above script, the ``allreduce(send, recv)`` function has a +slightly different signature than the ones in PyTorch. It takes a +``recv`` tensor and will store the sum of all ``send`` tensors in it. As +an exercise left to the reader, there is still one difference between +our version and the one in DeepSpeech: their implementation divides the +gradient tensor into *chunks*, so as to optimally utilize the +communication bandwidth. (Hint: +`torch.chunk `__) + +Advanced Topics +--------------- + +We are now ready to discover some of the more advanced functionalities +of ``torch.distributed``. Since there is a lot to cover, this section is +divided into two subsections: + +1. Communication Backends: where we learn how to use MPI and Gloo for + GPU-GPU communication. +2. Initialization Methods: where we understand how to best set up the + initial coordination phase in ``dist.init_process_group()``. + +Communication Backends +~~~~~~~~~~~~~~~~~~~~~~ + +One of the most elegant aspects of ``torch.distributed`` is its ability +to abstract and build on top of different backends. As mentioned before, +there are currently three backends implemented in PyTorch: Gloo, NCCL, and +MPI. They each have different specifications and tradeoffs, depending +on the desired use case. A comparative table of supported functions can +be found +`here `__.
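The rule of thumb encoded in that table can be sketched as a tiny selection helper (a hypothetical function of our own, with availability flags passed in explicitly so the logic is easy to test; it is not part of ``torch.distributed``):

```python
def pick_backend(cuda_available, nccl_available, mpi_available):
    # Prefer NCCL for CUDA tensors, fall back to MPI on clusters that
    # provide it, and default to Gloo, which ships in the pre-built binaries
    if cuda_available and nccl_available:
        return 'nccl'
    if mpi_available:
        return 'mpi'
    return 'gloo'

print(pick_backend(True, True, False))    # nccl
print(pick_backend(False, False, False))  # gloo
```

The sections below look at each of these three backends in turn.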
+ +**Gloo Backend** + +So far we have made extensive usage of the `Gloo backend `__. +It is quite handy as a development platform, as it is included in +the pre-compiled PyTorch binaries and works on both Linux (since 0.2) +and macOS (since 1.3). It supports all point-to-point and collective +operations on CPU, and all collective operations on GPU. The +implementation of the collective operations for CUDA tensors is not as +optimized as the ones provided by the NCCL backend. + +As you have surely noticed, our +distributed SGD example does not work if you put ``model`` on the GPU. +In order to use multiple GPUs, let us also make the following +modifications: + +1. Use ``device = torch.device("cuda:{}".format(rank))`` +2. ``model = Net()`` :math:`\rightarrow` ``model = Net().to(device)`` +3. Use ``data, target = data.to(device), target.to(device)`` + +With the above modifications, our model is now training on two GPUs and +you can monitor their utilization with ``watch nvidia-smi``. + +**MPI Backend** + +The Message Passing Interface (MPI) is a standardized tool from the +field of high-performance computing. It allows you to perform point-to-point and +collective communication and was the main inspiration for the API of +``torch.distributed``. Several implementations of MPI exist (e.g. +`Open-MPI `__, +`MVAPICH2 `__, `Intel +MPI `__), each +optimized for different purposes. The advantage of using the MPI backend +lies in MPI's wide availability - and high level of optimization - on +large computer clusters. `Some `__ +`recent `__ +`implementations `__ are also able to take +advantage of CUDA IPC and GPU Direct technologies in order to avoid +memory copies through the CPU. + +Unfortunately, PyTorch's binaries cannot include an MPI implementation +and we'll have to recompile it by hand. Fortunately, this process is +fairly simple given that upon compilation, PyTorch will look *by itself* +for an available MPI implementation.
The following steps install the MPI +backend, by installing PyTorch `from +source `__. + +1. Create and activate your Anaconda environment, install all the + pre-requisites following `the + guide `__, but do + **not** run ``python setup.py install`` yet. +2. Choose and install your favorite MPI implementation. Note that + enabling CUDA-aware MPI might require some additional steps. In our + case, we'll stick to Open-MPI *without* GPU support: + ``conda install -c conda-forge openmpi`` +3. Now, go to your cloned PyTorch repo and execute + ``python setup.py install``. + +In order to test our newly installed backend, a few modifications are +required. + +1. Replace the content under ``if __name__ == '__main__':`` with + ``init_process(0, 0, run, backend='mpi')``. +2. Run ``mpirun -n 4 python myscript.py``. + +The reason for these changes is that MPI needs to create its own +environment before spawning the processes. MPI will also spawn its own +processes and perform the handshake described in `Initialization +Methods <#initialization-methods>`__, making the ``rank``\ and ``size`` +arguments of ``init_process_group`` superfluous. This is actually quite +powerful as you can pass additional arguments to ``mpirun`` in order to +tailor computational resources for each process. (Things like number of +cores per process, hand-assigning machines to specific ranks, and `some +more `__) +Doing so, you should obtain the same familiar output as with the other +communication backends. + +**NCCL Backend** + +The `NCCL backend `__ provides an +optimized implementation of collective operations against CUDA +tensors. If you only use CUDA tensors for your collective operations, +consider using this backend for the best in class performance. The +NCCL backend is included in the pre-built binaries with CUDA support. 
+ +Initialization Methods +~~~~~~~~~~~~~~~~~~~~~~ + +To finish this tutorial, let's talk about the very first function we +called: ``dist.init_process_group(backend, init_method)``. In +particular, we will go over the different initialization methods which +are responsible for the initial coordination step between each process. +Those methods allow you to define how this coordination is done. +Depending on your hardware setup, one of these methods should be +naturally more suitable than the others. In addition to the following +sections, you should also have a look at the `official +documentation `__. + +**Environment Variable** + +We have been using the environment variable initialization method +throughout this tutorial. By setting the following four environment +variables on all machines, all processes will be able to properly +connect to the master, obtain information about the other processes, and +finally handshake with them. + +- ``MASTER_PORT``: A free port on the machine that will host the + process with rank 0. +- ``MASTER_ADDR``: IP address of the machine that will host the process + with rank 0. +- ``WORLD_SIZE``: The total number of processes, so that the master + knows how many workers to wait for. +- ``RANK``: Rank of each process, so each process will know whether it is the + master or a worker. + +**Shared File System** + +The shared filesystem method requires all processes to have access to a shared +file system, and coordinates them through a shared file. This means +that each process will open the file, write its information, and wait +until everybody has done so. After that, all required information will be +readily available to all processes. In order to avoid race conditions, +the file system must support locking through +`fcntl `__. + +.. 
code:: python + + dist.init_process_group( + init_method='file:///mnt/nfs/sharedfile', + rank=args.rank, + world_size=4) + +**TCP** + +Initializing via TCP can be achieved by providing the IP address of the process with rank 0 and a reachable port number. +Here, all workers will be able to connect to the process +with rank 0 and exchange information on how to reach each other. + +.. code:: python + + dist.init_process_group( + init_method='tcp://10.1.1.20:23456', + rank=args.rank, + world_size=4) + +.. raw:: html + + + +.. raw:: html + +
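As a minimal sketch of the environment-variable method described above (the address and port values here are hypothetical), each process exports the four variables before calling ``dist.init_process_group``; the call itself is commented out because it blocks until all peers have joined:

```python
import os

# Hypothetical values -- use the address/port of your own rank-0 machine.
os.environ['MASTER_ADDR'] = '10.1.1.20'  # IP of the machine hosting rank 0
os.environ['MASTER_PORT'] = '23456'      # a free port on that machine
os.environ['WORLD_SIZE'] = '4'           # total number of processes
os.environ['RANK'] = '0'                 # this process's rank

# With the four variables set, each process would then call:
# import torch.distributed as dist
# dist.init_process_group(backend='gloo', init_method='env://')

print(sorted(k for k in ('MASTER_ADDR', 'MASTER_PORT', 'WORLD_SIZE', 'RANK')
             if k in os.environ))
```

Every process in the group sets the same ``MASTER_ADDR``/``MASTER_PORT``/``WORLD_SIZE`` and only ``RANK`` differs per process.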
+ +**Acknowledgements** + +.. raw:: html + +
+ +I'd like to thank the PyTorch developers for doing such a good job on +their implementation, documentation, and tests. When the code was +unclear, I could always count on the +`docs `__ or the +`tests `__ +to find an answer. In particular, I'd like to thank Soumith Chintala, +Adam Paszke, and Natalia Gimelshein for providing insightful comments +and answering questions on early drafts. diff --git a/recipes_source/recipes/example_recipe.py b/recipes_source/recipes/example_recipe.py new file mode 100644 index 00000000000..9dd3bb199b9 --- /dev/null +++ b/recipes_source/recipes/example_recipe.py @@ -0,0 +1,65 @@ +""" +TODO: Add Recipe Title +======================= + +TODO: + * Include 1-2 sentences summing up what the user can expect from the recipe. + * For example - “This sample demonstrates how to...” + +Introduction +-------------- +TODO: + * Explain why this topic is important. + * Ex: Provide a summary of how Integrated Gradients works and how you will teach users to implement it using Captum in this tutorial + +Setup +---------------------- +TODO: + * Call out any required setup or data downloads + + +TODO: List Steps +----------------- +TODO: + * Use the steps you introduced in the Learning Objectives + * Break down the steps as well as add prose for context + * Add comments in the code to help clarify for readers what each section is doing + * Link back to relevant pytorch documentation + * Think of it akin to creating a really practical Medium post + +TIPS: + * To denote a word or phrase as code, enclose it in double backticks (``). ``torch.Tensor`` + * You can **bold** or *italicize* text for emphasis. + * Add python code directly in the file. The output will render and build on the site in a separate code block. + Below is an example of python code with comments. 
+ You can build this python file to see the resulting html by following the README.md at github.com/pytorch/tutorials +""" + +import torch + +############################################################### +# Because of the line of pound sign delimiters above, this comment will show up as plain text between the code. +x = torch.ones(2, 2, requires_grad=True) +# Since this is a single line comment, it will show up as a comment in the code block +print(x) + + + +############################################################### +# .. Note:: +# +# You can add Notes using this syntax + + + + +######################################################################## +# Learn More +# ---------------------------- +# TODO: +# * Link to any additional resources (e.g. Docs, other Tutorials, external resources) if readers want to learn more +# * There are different ways to add hyperlinks - +# * For example, pasting the url works: Read more about the ``autograd.Function`` at https://pytorch.org/docs/stable/autograd.html#function. +# * or link to other files in this repository by their titles such as :doc:`data_parallel_tutorial`. +# * There are also ways to add internal and external links. Check out this resource for more tips: https://thomas-cokelaer.info/tutorials/sphinx/rest_syntax.html#id4 +# diff --git a/recipes_source/recipes/flask_rest_api_tutorial.py b/recipes_source/recipes/flask_rest_api_tutorial.py new file mode 100644 index 00000000000..e904e55d303 --- /dev/null +++ b/recipes_source/recipes/flask_rest_api_tutorial.py @@ -0,0 +1,370 @@ +# -*- coding: utf-8 -*- +""" +Deploying PyTorch in Python via a REST API with Flask +======================================================== +**Author**: `Avinash Sajjanshetty `_ + +In this tutorial, we will deploy a PyTorch model using Flask and expose a +REST API for model inference. In particular, we will deploy a pretrained +DenseNet 121 model that classifies images. + +.. 
tip:: All the code used here is released under the MIT license and is available on `Github `_. + +This represents the first in a series of tutorials on deploying PyTorch models +in production. Using Flask in this way is by far the easiest way to start +serving your PyTorch models, but it will not work for a use case +with high performance requirements. For that: + + - If you're already familiar with TorchScript, you can jump straight into our + `Loading a TorchScript Model in C++ `_ tutorial. + + - If you first need a refresher on TorchScript, check out our + `Intro to TorchScript `_ tutorial. +""" + + +###################################################################### +# API Definition +# -------------- +# +# We will first define our API endpoints, the request and response types. Our +# API endpoint will be at ``/predict`` which takes HTTP POST requests with a +# ``file`` parameter which contains the image. The response will be a JSON +# response containing the prediction: +# +# :: +# +# {"class_id": "n02124075", "class_name": "Egyptian_cat"} +# +# + +###################################################################### +# Dependencies +# ------------ +# +# Install the required dependencies by running the following command: +# +# :: +# +# $ pip install Flask==1.0.3 torchvision==0.3.0 + + +###################################################################### +# Simple Web Server +# ----------------- +# +# Following is a simple web server, taken from Flask's documentation. + + +from flask import Flask +app = Flask(__name__) + + +@app.route('/') +def hello(): + return 'Hello World!' 
+ +############################################################################### +# Save the above snippet in a file called ``app.py`` and you can now run a +# Flask development server by typing: +# +# :: +# +# $ FLASK_ENV=development FLASK_APP=app.py flask run + +############################################################################### +# When you visit ``http://localhost:5000/`` in your web browser, you will be +# greeted with ``Hello World!`` text + +############################################################################### +# We will make slight changes to the above snippet, so that it suits our API +# definition. First, we will rename the method to ``predict``. We will update +# the endpoint path to ``/predict``. Since the image files will be sent via +# HTTP POST requests, we will update it so that it accepts only POST +# requests: + + +@app.route('/predict', methods=['POST']) +def predict(): + return 'Hello World!' + +############################################################################### +# We will also change the response type, so that it returns a JSON response +# containing the ImageNet class id and name. The updated ``app.py`` file will +# now be: + +from flask import Flask, jsonify +app = Flask(__name__) + +@app.route('/predict', methods=['POST']) +def predict(): + return jsonify({'class_id': 'IMAGE_NET_XXX', 'class_name': 'Cat'}) + + +###################################################################### +# Inference +# ----------------- +# +# In the next sections we will focus on writing the inference code. This will +# involve two parts, one where we prepare the image so that it can be fed +# to DenseNet and next, we will write the code to get the actual prediction +# from the model. +# +# Preparing the image +# ~~~~~~~~~~~~~~~~~~~ +# +# The DenseNet model requires the image to be a 3-channel RGB image of size +# 224 x 224. We will also normalize the image tensor with the required mean +# and standard deviation values. 
You can read more about it +# `here `_. +# +# We will use ``transforms`` from the ``torchvision`` library and build a +# transform pipeline, which transforms our images as required. You +# can read more about transforms `here `_. + +import io + +import torchvision.transforms as transforms +from PIL import Image + +def transform_image(image_bytes): + my_transforms = transforms.Compose([transforms.Resize(255), + transforms.CenterCrop(224), + transforms.ToTensor(), + transforms.Normalize( + [0.485, 0.456, 0.406], + [0.229, 0.224, 0.225])]) + image = Image.open(io.BytesIO(image_bytes)) + return my_transforms(image).unsqueeze(0) + + +###################################################################### +# The above method takes image data in bytes, applies the series of transforms +# and returns a tensor. To test the above method, read an image file in +# bytes mode (first replacing `../_static/img/sample_file.jpeg` with the actual +# path to the file on your computer) and see if you get a tensor back: + +with open("../_static/img/sample_file.jpeg", 'rb') as f: + image_bytes = f.read() + tensor = transform_image(image_bytes=image_bytes) + print(tensor) + +###################################################################### +# Prediction +# ~~~~~~~~~~~~~~~~~~~ +# +# Now we will use a pretrained DenseNet 121 model to predict the image class. We +# will use one from the ``torchvision`` library, load the model and get an +# inference. While we'll be using a pretrained model in this example, you can +# use this same approach for your own models. See more about loading your +# models in this :doc:`tutorial `. 
+ +from torchvision import models + +# Make sure to pass `pretrained` as `True` to use the pretrained weights: +model = models.densenet121(pretrained=True) +# Since we are using our model only for inference, switch to `eval` mode: +model.eval() + + +def get_prediction(image_bytes): + tensor = transform_image(image_bytes=image_bytes) + outputs = model.forward(tensor) + _, y_hat = outputs.max(1) + return y_hat + + +###################################################################### +# The tensor ``y_hat`` will contain the index of the predicted class id. +# However, we need a human-readable class name. For that we need a class id +# to name mapping. Download +# `this file `_ +# as ``imagenet_class_index.json`` and remember where you saved it (or, if you +# are following the exact steps in this tutorial, save it in +# `tutorials/_static`). This file contains the mapping of ImageNet class id to +# ImageNet class name. We will load this JSON file and get the class name of +# the predicted index. + +import json + +imagenet_class_index = json.load(open('../_static/imagenet_class_index.json')) + +def get_prediction(image_bytes): + tensor = transform_image(image_bytes=image_bytes) + outputs = model.forward(tensor) + _, y_hat = outputs.max(1) + predicted_idx = str(y_hat.item()) + return imagenet_class_index[predicted_idx] + + +###################################################################### +# Before using the ``imagenet_class_index`` dictionary, we first convert the +# tensor value to a string value, since the keys in the +# ``imagenet_class_index`` dictionary are strings. 
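The reason for the string conversion can be seen with a tiny, hypothetical slice of ``imagenet_class_index.json`` (the real file has 1000 entries): JSON object keys are always strings, so the integer index from ``y_hat`` must be converted before the lookup:

```python
import json

# A hypothetical two-entry slice of imagenet_class_index.json.
class_index = json.loads(
    '{"0": ["n01440764", "tench"], "285": ["n02124075", "Egyptian_cat"]}')

def lookup(predicted_idx):
    # json.load/loads always produce string keys, so convert the int index first.
    return class_index[str(predicted_idx)]

class_id, class_name = lookup(285)
print(class_id, class_name)  # n02124075 Egyptian_cat
```

Indexing ``class_index[285]`` with the raw integer would raise a ``KeyError``, which is exactly why ``get_prediction`` calls ``str(y_hat.item())``.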
+ +# We will test our above method: + + +with open("../_static/img/sample_file.jpeg", 'rb') as f: + image_bytes = f.read() + print(get_prediction(image_bytes=image_bytes)) + +###################################################################### +# You should get a response like this: + +['n02124075', 'Egyptian_cat'] + +###################################################################### +# The first item in the array is the ImageNet class id and the second item is +# the human-readable name. +# +# .. Note :: +# Did you notice that the ``model`` variable is not part of the ``get_prediction`` +# method? Or why is the model a global variable? Loading a model can be an +# expensive operation in terms of memory and compute. If we loaded the model in the +# ``get_prediction`` method, then it would get unnecessarily loaded every +# time the method is called. Since we are building a web server that could +# receive thousands of requests per second, we should not waste time +# redundantly loading the model for every inference. So, we keep the model +# loaded in memory just once. In +# production systems, it's necessary to be efficient about your use of +# compute to be able to serve requests at scale, so you should generally +# load your model before serving requests. + +###################################################################### +# Integrating the model in our API Server +# --------------------------------------- +# +# In this final part we will add our model to our Flask API server. Since +# our API server is supposed to take an image file, we will update our ``predict`` +# method to read files from the requests: +# +# .. 
code-block:: python +# +# from flask import request +# +# @app.route('/predict', methods=['POST']) +# def predict(): +# if request.method == 'POST': +# # we will get the file from the request +# file = request.files['file'] +# # convert that to bytes +# img_bytes = file.read() +# class_id, class_name = get_prediction(image_bytes=img_bytes) +# return jsonify({'class_id': class_id, 'class_name': class_name}) + +###################################################################### +# The ``app.py`` file is now complete. Following is the full version; replace +# the paths with the paths where you saved your files and it should run: +# +# .. code-block:: python +# +# import io +# import json +# +# from torchvision import models +# import torchvision.transforms as transforms +# from PIL import Image +# from flask import Flask, jsonify, request +# +# +# app = Flask(__name__) +# imagenet_class_index = json.load(open('/imagenet_class_index.json')) +# model = models.densenet121(pretrained=True) +# model.eval() +# +# +# def transform_image(image_bytes): +# my_transforms = transforms.Compose([transforms.Resize(255), +# transforms.CenterCrop(224), +# transforms.ToTensor(), +# transforms.Normalize( +# [0.485, 0.456, 0.406], +# [0.229, 0.224, 0.225])]) +# image = Image.open(io.BytesIO(image_bytes)) +# return my_transforms(image).unsqueeze(0) +# +# +# def get_prediction(image_bytes): +# tensor = transform_image(image_bytes=image_bytes) +# outputs = model.forward(tensor) +# _, y_hat = outputs.max(1) +# predicted_idx = str(y_hat.item()) +# return imagenet_class_index[predicted_idx] +# +# +# @app.route('/predict', methods=['POST']) +# def predict(): +# if request.method == 'POST': +# file = request.files['file'] +# img_bytes = file.read() +# class_id, class_name = get_prediction(image_bytes=img_bytes) +# return jsonify({'class_id': class_id, 'class_name': class_name}) +# +# +# if __name__ == '__main__': +# app.run() + 
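As a quick sanity check of the response format before running the server (the values below are made up), the payload returned by ``/predict`` is plain JSON that any client can decode; a minimal stdlib sketch:

```python
import json

# Hypothetical prediction result, mirroring the jsonify(...) response above.
payload = json.dumps({'class_id': 'n02124075', 'class_name': 'Egyptian_cat'})
print(payload)

# A client would decode it back into a dictionary:
decoded = json.loads(payload)
print(decoded['class_name'])  # Egyptian_cat
```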
+ +###################################################################### +# Let's test our web server! Run: +# +# :: +# +# $ FLASK_ENV=development FLASK_APP=app.py flask run + +####################################################################### +# We can use the +# `requests `_ +# library to send a POST request to our app: +# +# .. code-block:: python +# +# import requests +# +# resp = requests.post("http://localhost:5000/predict", +# files={"file": open('/cat.jpg','rb')}) + +####################################################################### +# Printing `resp.json()` will now show the following: +# +# :: +# +# {"class_id": "n02124075", "class_name": "Egyptian_cat"} +# + +###################################################################### +# Next steps +# -------------- +# +# The server we wrote is quite trivial and may not do everything +# you need for your production application. So, here are some things you +# can do to make it better: +# +# - The endpoint ``/predict`` assumes that there will always be an image file +# in the request. This may not hold true for all requests. Our user may +# send the image with a different parameter or send no images at all. +# +# - The user may send non-image type files too. Since we are not handling +# errors, this will break our server. Adding an explicit error handling +# path that will throw an exception would allow us to better handle +# the bad inputs. +# +# - Even though the model can recognize a large number of classes of images, +# it may not be able to recognize all images. Enhance the implementation +# to handle cases when the model does not recognize anything in the image. +# +# - We run the Flask server in the development mode, which is not suitable for +# deploying in production. You can check out `this tutorial `_ +# for deploying a Flask server in production. +# +# - You can also add a UI by creating a page with a form which takes the image and +# displays the prediction. 
Check out the `demo `_ +# of a similar project and its `source code `_. +# +# - In this tutorial, we only showed how to build a service that could return predictions for +# a single image at a time. We could modify our service to be able to return predictions for +# multiple images at once. In addition, the `service-streamer `_ +# library automatically queues requests to your service and samples them into mini-batches +# that can be fed into your model. You can check out `this tutorial `_. +# +# - Finally, we encourage you to check out our other tutorials on deploying PyTorch models +# linked to at the top of the page. diff --git a/recipes_source/recipes/model_parallel_tutorial.py b/recipes_source/recipes/model_parallel_tutorial.py new file mode 100644 index 00000000000..515b689301a --- /dev/null +++ b/recipes_source/recipes/model_parallel_tutorial.py @@ -0,0 +1,360 @@ +# -*- coding: utf-8 -*- +""" +Single-Machine Model Parallel Best Practices +============================================ +**Author**: `Shen Li `_ + +Model parallelism is widely used in distributed training. Previous posts +have explained how to use +`DataParallel `_ +to train a neural network on multiple GPUs; this feature replicates the +same model to all GPUs, where each GPU consumes a different partition of the +input data. Although it can significantly accelerate the training process, it +does not work for some use cases where the model is too large to fit into a +single GPU. This post shows how to solve that problem by using **model parallelism**, +which, in contrast to ``DataParallel``, splits a single model onto different GPUs, +rather than replicating the entire model on each GPU (to be concrete, say a model +``m`` contains 10 layers: when using ``DataParallel``, each GPU will have a +replica of each of these 10 layers, whereas when using model parallelism on two GPUs, +each GPU could host 5 layers). 
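The 10-layer example above can be sketched device-free in plain Python: ``DataParallel`` gives every device a full replica, while model parallelism gives each device a disjoint slice. The ``layers`` and ``devices`` names below are illustrative only:

```python
layers = ['layer%d' % i for i in range(10)]  # a hypothetical 10-layer model m
devices = ['cuda:0', 'cuda:1']

# DataParallel: every device holds a replica of all 10 layers.
data_parallel = {dev: list(layers) for dev in devices}

# Model parallelism: each device hosts its own disjoint 5-layer slice.
per_device = len(layers) // len(devices)
model_parallel = {dev: layers[i * per_device:(i + 1) * per_device]
                  for i, dev in enumerate(devices)}

print(model_parallel['cuda:1'])  # ['layer5', 'layer6', 'layer7', 'layer8', 'layer9']
```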
+ +The high-level idea of model parallelism is to place different sub-networks of a +model onto different devices, and implement the ``forward`` method accordingly +to move intermediate outputs across devices. As only part of a model operates +on any individual device, a set of devices can collectively serve a larger +model. In this post, we will not try to construct huge models and squeeze them +into a limited number of GPUs. Instead, this post focuses on showing the idea +of model parallelism. It is up to the readers to apply the ideas to real-world +applications. + +.. note:: + + For distributed model parallel training where a model spans multiple + servers, please refer to + `Getting Started With Distributed RPC Framework `__ + for examples and details. + +Basic Usage +----------- +""" + +###################################################################### +# Let us start with a toy model that contains two linear layers. To run this +# model on two GPUs, simply put each linear layer on a different GPU, and move +# inputs and intermediate outputs to match the layer devices accordingly. +# + +import torch +import torch.nn as nn +import torch.optim as optim + + +class ToyModel(nn.Module): + def __init__(self): + super(ToyModel, self).__init__() + self.net1 = torch.nn.Linear(10, 10).to('cuda:0') + self.relu = torch.nn.ReLU() + self.net2 = torch.nn.Linear(10, 5).to('cuda:1') + + def forward(self, x): + x = self.relu(self.net1(x.to('cuda:0'))) + return self.net2(x.to('cuda:1')) + +###################################################################### +# Note that the above ``ToyModel`` looks very similar to how one would +# implement it on a single GPU, except for the ``to(device)`` calls which +# place linear layers and tensors on proper devices. That is the only place in +# the model that requires changes. The ``backward()`` and ``torch.optim`` will +# automatically take care of gradients as if the model is on one GPU. 
You only +# need to make sure that the labels are on the same device as the outputs when +# calling the loss function. + + +model = ToyModel() +loss_fn = nn.MSELoss() +optimizer = optim.SGD(model.parameters(), lr=0.001) + +optimizer.zero_grad() +outputs = model(torch.randn(20, 10)) +labels = torch.randn(20, 5).to('cuda:1') +loss_fn(outputs, labels).backward() +optimizer.step() + +###################################################################### +# Apply Model Parallel to Existing Modules +# ---------------------------------------- +# +# It is also possible to run an existing single-GPU module on multiple GPUs +# with just a few lines of changes. The code below shows how to decompose +# ``torchvision.models.resnet50()`` onto two GPUs. The idea is to inherit from +# the existing ``ResNet`` module, and split the layers across two GPUs during +# construction. Then, override the ``forward`` method to stitch the two +# sub-networks together by moving the intermediate outputs accordingly. + + +from torchvision.models.resnet import ResNet, Bottleneck + +num_classes = 1000 + + +class ModelParallelResNet50(ResNet): + def __init__(self, *args, **kwargs): + super(ModelParallelResNet50, self).__init__( + Bottleneck, [3, 4, 6, 3], num_classes=num_classes, *args, **kwargs) + + self.seq1 = nn.Sequential( + self.conv1, + self.bn1, + self.relu, + self.maxpool, + + self.layer1, + self.layer2 + ).to('cuda:0') + + self.seq2 = nn.Sequential( + self.layer3, + self.layer4, + self.avgpool, + ).to('cuda:1') + + self.fc.to('cuda:1') + + def forward(self, x): + x = self.seq2(self.seq1(x).to('cuda:1')) + return self.fc(x.view(x.size(0), -1)) + + +###################################################################### +# The above implementation solves the problem for cases where the model is too +# large to fit into a single GPU. However, you might have already noticed that +# it will be slower than running it on a single GPU if your model fits. 
This is +# because, at any point in time, only one of the two GPUs is working, while +# the other one is sitting there doing nothing. The performance further +# deteriorates as the intermediate outputs need to be copied from ``cuda:0`` to +# ``cuda:1`` between ``layer2`` and ``layer3``. +# +# Let us run an experiment to get a more quantitative view of the execution +# time. In this experiment, we train ``ModelParallelResNet50`` and the existing +# ``torchvision.models.resnet50()`` by running random inputs and labels through +# them. After the training, the models will not produce any useful predictions, +# but we can get a reasonable understanding of the execution times. + + +import torchvision.models as models + +num_batches = 3 +batch_size = 120 +image_w = 128 +image_h = 128 + + +def train(model): + model.train(True) + loss_fn = nn.MSELoss() + optimizer = optim.SGD(model.parameters(), lr=0.001) + + one_hot_indices = torch.LongTensor(batch_size) \ + .random_(0, num_classes) \ + .view(batch_size, 1) + + for _ in range(num_batches): + # generate random inputs and labels + inputs = torch.randn(batch_size, 3, image_w, image_h) + labels = torch.zeros(batch_size, num_classes) \ + .scatter_(1, one_hot_indices, 1) + + # run forward pass + optimizer.zero_grad() + outputs = model(inputs.to('cuda:0')) + + # run backward pass + labels = labels.to(outputs.device) + loss_fn(outputs, labels).backward() + optimizer.step() + + +###################################################################### +# The ``train(model)`` method above uses ``nn.MSELoss`` as the loss function, +# and ``optim.SGD`` as the optimizer. It mimics training on ``128 X 128`` +# images which are organized into 3 batches where each batch contains 120 +# images. Then, we use ``timeit`` to run the ``train(model)`` method 10 times +# and plot the execution times with standard deviations. 
+ + +import matplotlib.pyplot as plt +plt.switch_backend('Agg') +import numpy as np +import timeit + +num_repeat = 10 + +stmt = "train(model)" + +setup = "model = ModelParallelResNet50()" +# globals arg is only available in Python 3. In Python 2, use the following +# import __builtin__ +# __builtin__.__dict__.update(locals()) +mp_run_times = timeit.repeat( + stmt, setup, number=1, repeat=num_repeat, globals=globals()) +mp_mean, mp_std = np.mean(mp_run_times), np.std(mp_run_times) + +setup = "import torchvision.models as models;" + \ + "model = models.resnet50(num_classes=num_classes).to('cuda:0')" +rn_run_times = timeit.repeat( + stmt, setup, number=1, repeat=num_repeat, globals=globals()) +rn_mean, rn_std = np.mean(rn_run_times), np.std(rn_run_times) + + +def plot(means, stds, labels, fig_name): + fig, ax = plt.subplots() + ax.bar(np.arange(len(means)), means, yerr=stds, + align='center', alpha=0.5, ecolor='red', capsize=10, width=0.6) + ax.set_ylabel('ResNet50 Execution Time (Second)') + ax.set_xticks(np.arange(len(means))) + ax.set_xticklabels(labels) + ax.yaxis.grid(True) + plt.tight_layout() + plt.savefig(fig_name) + plt.close(fig) + + +plot([mp_mean, rn_mean], + [mp_std, rn_std], + ['Model Parallel', 'Single GPU'], + 'mp_vs_rn.png') + + +###################################################################### +# +# .. figure:: /_static/img/model-parallel-images/mp_vs_rn.png +# :alt: +# +# The result shows that the execution time of the model parallel implementation is +# ``4.02/3.75-1=7%`` longer than the existing single-GPU implementation. So we +# can conclude there is roughly 7% overhead in copying tensors back and forth +# across the GPUs. There is room for improvement, as we know one of the two +# GPUs is sitting idle throughout the execution. One option is to further +# divide each batch into a pipeline of splits, such that when one split reaches +# the second sub-network, the following split can be fed into the first +# sub-network. 
In this way, two consecutive splits can run concurrently on two +# GPUs. + +###################################################################### +# Speed Up by Pipelining Inputs +# ----------------------------- +# +# In the following experiments, we further divide each 120-image batch into +# 20-image splits. As PyTorch launches CUDA operations asynchronously, the +# implementation does not need to spawn multiple threads to achieve +# concurrency. + + +class PipelineParallelResNet50(ModelParallelResNet50): + def __init__(self, split_size=20, *args, **kwargs): + super(PipelineParallelResNet50, self).__init__(*args, **kwargs) + self.split_size = split_size + + def forward(self, x): + splits = iter(x.split(self.split_size, dim=0)) + s_next = next(splits) + s_prev = self.seq1(s_next).to('cuda:1') + ret = [] + + for s_next in splits: + # A. s_prev runs on cuda:1 + s_prev = self.seq2(s_prev) + ret.append(self.fc(s_prev.view(s_prev.size(0), -1))) + + # B. s_next runs on cuda:0, which can run concurrently with A + s_prev = self.seq1(s_next).to('cuda:1') + + s_prev = self.seq2(s_prev) + ret.append(self.fc(s_prev.view(s_prev.size(0), -1))) + + return torch.cat(ret) + + +setup = "model = PipelineParallelResNet50()" +pp_run_times = timeit.repeat( + stmt, setup, number=1, repeat=num_repeat, globals=globals()) +pp_mean, pp_std = np.mean(pp_run_times), np.std(pp_run_times) + +plot([mp_mean, rn_mean, pp_mean], + [mp_std, rn_std, pp_std], + ['Model Parallel', 'Single GPU', 'Pipelining Model Parallel'], + 'mp_vs_rn_vs_pp.png') + +###################################################################### +# Please note, device-to-device tensor copy operations are synchronized on +# the current streams on the source and the destination devices. If you create +# multiple streams, you have to make sure that copy operations are properly +# synchronized. Writing the source tensor or reading/writing the destination +# tensor before finishing the copy operation can lead to undefined behavior. 
+ +# The above implementation only uses default streams on both source and +# destination devices, hence it is not necessary to enforce additional +# synchronizations. +# +# .. figure:: /_static/img/model-parallel-images/mp_vs_rn_vs_pp.png +# :alt: +# +# The experiment result shows that pipelining inputs to the model parallel +# ResNet50 speeds up the training process by roughly ``3.75/2.51-1=49%``. It is +# still quite far away from the ideal 100% speedup. As we have introduced a new +# parameter ``split_size`` in our pipeline parallel implementation, it is +# unclear how the new parameter affects the overall training time. Intuitively +# speaking, using a small ``split_size`` leads to many tiny CUDA kernel launches, +# while using a large ``split_size`` results in relatively long idle times during +# the first and last splits. Neither is optimal. There might be an optimal +# ``split_size`` configuration for this specific experiment. Let us try to find +# it by running experiments using several different ``split_size`` values. + + +means = [] +stds = [] +split_sizes = [1, 3, 5, 8, 10, 12, 20, 40, 60] + +for split_size in split_sizes: + setup = "model = PipelineParallelResNet50(split_size=%d)" % split_size + pp_run_times = timeit.repeat( + stmt, setup, number=1, repeat=num_repeat, globals=globals()) + means.append(np.mean(pp_run_times)) + stds.append(np.std(pp_run_times)) + +fig, ax = plt.subplots() +ax.plot(split_sizes, means) +ax.errorbar(split_sizes, means, yerr=stds, ecolor='red', fmt='ro') +ax.set_ylabel('ResNet50 Execution Time (Second)') +ax.set_xlabel('Pipeline Split Size') +ax.set_xticks(split_sizes) +ax.yaxis.grid(True) +plt.tight_layout() +plt.savefig("split_size_tradeoff.png") +plt.close(fig) + +###################################################################### +# +# .. 
figure:: /_static/img/model-parallel-images/split_size_tradeoff.png +# :alt: +# +# The result shows that setting ``split_size`` to 12 achieves the fastest +# training speed, which leads to a ``3.75/2.43-1=54%`` speedup. There are +# still opportunities to further accelerate the training process. For example, +# all operations on ``cuda:0`` are placed on its default stream. This means that +# computations on the next split cannot overlap with the copy operation of the +# previous split. However, as the previous and next splits are different tensors, there is +# no problem overlapping one's computation with the other one's copy. The +# implementation needs to use multiple streams on both GPUs, and different +# sub-network structures require different stream management strategies. As no +# general multi-stream solution works for all model parallel use cases, we will +# not discuss it in this tutorial. +# +# **Note:** +# +# This post shows several performance measurements. You might see different +# numbers when running the same code on your own machine, because the result +# depends on the underlying hardware and software. To get the best performance +# for your environment, a proper approach is to first generate the curve to +# figure out the best split size, and then use that split size to pipeline +# inputs. 
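The scheduling in ``PipelineParallelResNet50.forward`` can be mimicked without GPUs. The toy ``seq1``/``seq2`` functions below are stand-ins for the two sub-networks; the point is the loop structure, in which one split finishes stage two while the next split (on real hardware, concurrently) enters stage one:

```python
def seq1(x):          # stand-in for the cuda:0 sub-network
    return x + 1

def seq2(x):          # stand-in for the cuda:1 sub-network
    return x * 2

def pipelined_forward(batch, split_size):
    # Split the batch like x.split(self.split_size, dim=0).
    splits = iter([batch[i:i + split_size]
                   for i in range(0, len(batch), split_size)])
    s_next = next(splits)
    s_prev = [seq1(v) for v in s_next]   # prime the pipeline
    out = []
    for s_next in splits:
        # A. s_prev finishes on the second stage ...
        out.extend(seq2(v) for v in s_prev)
        # B. ... while s_next enters the first stage.
        s_prev = [seq1(v) for v in s_next]
    out.extend(seq2(v) for v in s_prev)  # drain the last split
    return out

print(pipelined_forward(list(range(6)), split_size=2))  # [2, 4, 6, 8, 10, 12]
```

Each element goes through both stages exactly once ((v + 1) * 2 here), so the pipelined result matches a plain sequential pass; only the interleaving changes.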
+ +# diff --git a/recipes_source/recipes/numpy_extensions_recipe.ipynb b/recipes_source/recipes/numpy_extensions_recipe.ipynb new file mode 100644 index 00000000000..1d34c5fb931 --- /dev/null +++ b/recipes_source/recipes/numpy_extensions_recipe.ipynb @@ -0,0 +1,318 @@ +{ + "nbformat": 4, + "nbformat_minor": 0, + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.6" + }, + "colab": { + "name": "numpy_extensions_recipe.ipynb", + "provenance": [] + } + }, + "cells": [ + { + "cell_type": "code", + "metadata": { + "id": "XjhSh-86CvG9", + "colab_type": "code", + "colab": {} + }, + "source": [ + "%matplotlib inline" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "YL94U-qNCvHC", + "colab_type": "text" + }, + "source": [ + "\n", + "# Creating Extensions Using numpy and scipy\n", + "\n", + "Numpy and scipy are some of the most popular open source libraries for scientific computing. Because of PyTorch's pythonic nature, users can easily leverage the broader ecosystem of python libraries. In this recipe, we will show how you can leverage these libraries along with PyTorch to:\n", + "\n", + "1. Create a neural network layer with no parameters using **numpy**; and \n", + "\n", + "2. 
Create a neural network layer that has learnable weights using **SciPy**\n", + "\n", + "**Original Author**: [Adam Paszke](https://github.com/apaszke)\n", + "\n", + "**Updated by**: [Adam Dziedzic](https://github.com/adam-dziedzic) and [Joe Spisak](https://github.com/jspisak)\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "ZiX2OYUpEpZg", + "colab_type": "text" + }, + "source": [ + "### Let's first import the basics" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "FM8ujQylCvHD", + "colab_type": "code", + "colab": {} + }, + "source": [ + "import torch\n", + "from torch.autograd import Function" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "f6e0IP_PCvHG", + "colab_type": "text" + }, + "source": [ + "## Now let's create a parameter-less example of a layer using numpy\n", + "\n", + "This layer doesn’t particularly do anything useful or mathematically\n", + "correct.\n", + "\n", + "It is aptly named BadFFTFunction\n", + "\n", + "**Layer Implementation**\n", + "\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "yqSAH1UkCvHH", + "colab_type": "code", + "colab": {} + }, + "source": [ + "from numpy.fft import rfft2, irfft2\n", + "\n", + "\n", + "class BadFFTFunction(Function):\n", + "\n", + " @staticmethod\n", + " def forward(ctx, input):\n", + " numpy_input = input.detach().numpy()\n", + " result = abs(rfft2(numpy_input))\n", + " return input.new(result)\n", + "\n", + " @staticmethod\n", + " def backward(ctx, grad_output):\n", + " numpy_go = grad_output.numpy()\n", + " result = irfft2(numpy_go)\n", + " return grad_output.new(result)\n", + "\n", + "# since this layer does not have any parameters, we can\n", + "# simply declare this as a function, rather than as an nn.Module class\n", + "\n", + "\n", + "def incorrect_fft(input):\n", + " return BadFFTFunction.apply(input)" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "MNvw2q1ACvHK", + "colab_type": 
"text" + }, + "source": [ + "**Example usage of the created layer:**\n", + "\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "wo1CvWIaCvHK", + "colab_type": "code", + "colab": {} + }, + "source": [ + "input = torch.randn(8, 8, requires_grad=True)\n", + "result = incorrect_fft(input)\n", + "print(result)\n", + "result.backward(torch.randn(result.size()))\n", + "print(input)" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Yj2-qAJCCvHN", + "colab_type": "text" + }, + "source": [ + "## Next let's create a parametrized example using scipy\n", + "\n", + "In the deep learning literature, this layer is confusingly referred\n", + "to as convolution while the actual operation is cross-correlation\n", + "(the only difference is that the filter is flipped for convolution, which is not the case for cross-correlation).\n", + "\n", + "Below is the implementation of a layer with learnable weights, where cross-correlation has a filter (kernel) that represents the weights.\n", + "\n", + "The backward pass computes the gradient with respect to the input and the gradient with respect to the filter."
+ ] + }, + { + "cell_type": "code", + "metadata": { + "id": "YW6YZ26HCvHN", + "colab_type": "code", + "colab": {} + }, + "source": [ + "from numpy import flip\n", + "import numpy as np\n", + "from scipy.signal import convolve2d, correlate2d\n", + "from torch.nn.modules.module import Module\n", + "from torch.nn.parameter import Parameter\n", + "\n", + "\n", + "class ScipyConv2dFunction(Function):\n", + " @staticmethod\n", + " def forward(ctx, input, filter, bias):\n", + " # detach so we can cast to NumPy\n", + " input, filter, bias = input.detach(), filter.detach(), bias.detach()\n", + " result = correlate2d(input.numpy(), filter.numpy(), mode='valid')\n", + " result += bias.numpy()\n", + " ctx.save_for_backward(input, filter, bias)\n", + " return torch.as_tensor(result, dtype=input.dtype)\n", + "\n", + " @staticmethod\n", + " def backward(ctx, grad_output):\n", + " grad_output = grad_output.detach()\n", + " input, filter, bias = ctx.saved_tensors\n", + " grad_output = grad_output.numpy()\n", + " grad_bias = np.sum(grad_output, keepdims=True)\n", + " grad_input = convolve2d(grad_output, filter.numpy(), mode='full')\n", + " # the previous line can be expressed equivalently as:\n", + " # grad_input = correlate2d(grad_output, flip(flip(filter.numpy(), axis=0), axis=1), mode='full')\n", + " grad_filter = correlate2d(input.numpy(), grad_output, mode='valid')\n", + " return torch.from_numpy(grad_input), torch.from_numpy(grad_filter).to(torch.float), torch.from_numpy(grad_bias).to(torch.float)\n", + "\n", + "\n", + "class ScipyConv2d(Module):\n", + " def __init__(self, filter_width, filter_height):\n", + " super(ScipyConv2d, self).__init__()\n", + " self.filter = Parameter(torch.randn(filter_width, filter_height))\n", + " self.bias = Parameter(torch.randn(1, 1))\n", + "\n", + " def forward(self, input):\n", + " return ScipyConv2dFunction.apply(input, self.filter, self.bias)" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + 
"id": "094F0pWVCvHQ", + "colab_type": "text" + }, + "source": [ + "**Example usage:**\n", + "\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "ZzP0famSCvHR", + "colab_type": "code", + "colab": {} + }, + "source": [ + "module = ScipyConv2d(3, 3)\n", + "print(\"Filter and bias: \", list(module.parameters()))\n", + "input = torch.randn(10, 10, requires_grad=True)\n", + "output = module(input)\n", + "print(\"Output from the convolution: \", output)\n", + "output.backward(torch.randn(8, 8))\n", + "print(\"Gradient for the input map: \", input.grad)" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "LDcCnAiLCvHU", + "colab_type": "text" + }, + "source": [ + "**Check the gradients:**\n", + "\n" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "vWBxY_MHCvHU", + "colab_type": "code", + "colab": {} + }, + "source": [ + "from torch.autograd.gradcheck import gradcheck\n", + "\n", + "moduleConv = ScipyConv2d(3, 3)\n", + "\n", + "input = [torch.randn(20, 20, dtype=torch.double, requires_grad=True)]\n", + "test = gradcheck(moduleConv, input, eps=1e-6, atol=1e-4)\n", + "print(\"Are the gradients correct: \", test)" + ], + "execution_count": 0, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "BWgn4uh1FyGx", + "colab_type": "text" + }, + "source": [ + "Congrats! You've learned how to create extensions to PyTorch using numpy and scipy. We recommend trying to generalize these approaches to new areas and problems. 
" + ] + }, + { + "cell_type": "code", + "metadata": { + "id": "agyyA9_yHVGS", + "colab_type": "code", + "colab": {} + }, + "source": [ + "" + ], + "execution_count": 0, + "outputs": [] + } + ] +} \ No newline at end of file diff --git a/recipes_source/recipes/numpy_extensions_tutorial.py b/recipes_source/recipes/numpy_extensions_tutorial.py new file mode 100644 index 00000000000..afc9a118c30 --- /dev/null +++ b/recipes_source/recipes/numpy_extensions_tutorial.py @@ -0,0 +1,140 @@ +# -*- coding: utf-8 -*- +""" +Creating Extensions Using numpy and scipy +========================================= +**Author**: `Adam Paszke `_ + +**Updated by**: `Adam Dziedzic `_ + +In this tutorial, we shall go through two tasks: + +1. Create a neural network layer with no parameters. + + - This calls into **numpy** as part of its implementation + +2. Create a neural network layer that has learnable weights + + - This calls into **SciPy** as part of its implementation +""" + +import torch +from torch.autograd import Function + +############################################################### +# Parameter-less example +# ---------------------- +# +# This layer doesn’t particularly do anything useful or mathematically +# correct. 
+# +# It is aptly named BadFFTFunction +# +# **Layer Implementation** + +from numpy.fft import rfft2, irfft2 + + +class BadFFTFunction(Function): + @staticmethod + def forward(ctx, input): + numpy_input = input.detach().numpy() + result = abs(rfft2(numpy_input)) + return input.new(result) + + @staticmethod + def backward(ctx, grad_output): + numpy_go = grad_output.numpy() + result = irfft2(numpy_go) + return grad_output.new(result) + +# since this layer does not have any parameters, we can +# simply declare this as a function, rather than as an nn.Module class + + +def incorrect_fft(input): + return BadFFTFunction.apply(input) + +############################################################### +# **Example usage of the created layer:** + +input = torch.randn(8, 8, requires_grad=True) +result = incorrect_fft(input) +print(result) +result.backward(torch.randn(result.size())) +print(input) + +############################################################### +# Parametrized example +# -------------------- +# +# In the deep learning literature, this layer is confusingly referred +# to as convolution while the actual operation is cross-correlation +# (the only difference is that the filter is flipped for convolution, +# which is not the case for cross-correlation). +# +# Implementation of a layer with learnable weights, where cross-correlation +# has a filter (kernel) that represents the weights. +# +# The backward pass computes the gradient with respect to the input and the gradient with respect to the filter.
+ +from numpy import flip +import numpy as np +from scipy.signal import convolve2d, correlate2d +from torch.nn.modules.module import Module +from torch.nn.parameter import Parameter + + +class ScipyConv2dFunction(Function): + @staticmethod + def forward(ctx, input, filter, bias): + # detach so we can cast to NumPy + input, filter, bias = input.detach(), filter.detach(), bias.detach() + result = correlate2d(input.numpy(), filter.numpy(), mode='valid') + result += bias.numpy() + ctx.save_for_backward(input, filter, bias) + return torch.as_tensor(result, dtype=input.dtype) + + @staticmethod + def backward(ctx, grad_output): + grad_output = grad_output.detach() + input, filter, bias = ctx.saved_tensors + grad_output = grad_output.numpy() + grad_bias = np.sum(grad_output, keepdims=True) + grad_input = convolve2d(grad_output, filter.numpy(), mode='full') + # the previous line can be expressed equivalently as: + # grad_input = correlate2d(grad_output, flip(flip(filter.numpy(), axis=0), axis=1), mode='full') + grad_filter = correlate2d(input.numpy(), grad_output, mode='valid') + return torch.from_numpy(grad_input), torch.from_numpy(grad_filter).to(torch.float), torch.from_numpy(grad_bias).to(torch.float) + + +class ScipyConv2d(Module): + def __init__(self, filter_width, filter_height): + super(ScipyConv2d, self).__init__() + self.filter = Parameter(torch.randn(filter_width, filter_height)) + self.bias = Parameter(torch.randn(1, 1)) + + def forward(self, input): + return ScipyConv2dFunction.apply(input, self.filter, self.bias) + + +############################################################### +# **Example usage:** + +module = ScipyConv2d(3, 3) +print("Filter and bias: ", list(module.parameters())) +input = torch.randn(10, 10, requires_grad=True) +output = module(input) +print("Output from the convolution: ", output) +output.backward(torch.randn(8, 8)) +print("Gradient for the input map: ", input.grad) + +############################################################### +# 
**Check the gradients:** + +from torch.autograd.gradcheck import gradcheck + +moduleConv = ScipyConv2d(3, 3) + +input = [torch.randn(20, 20, dtype=torch.double, requires_grad=True)] +test = gradcheck(moduleConv, input, eps=1e-6, atol=1e-4) +print("Are the gradients correct: ", test) diff --git a/recipes_source/recipes/rpc_tutorial.rst b/recipes_source/recipes/rpc_tutorial.rst new file mode 100644 index 00000000000..cd883930030 --- /dev/null +++ b/recipes_source/recipes/rpc_tutorial.rst @@ -0,0 +1,641 @@ +Getting Started with Distributed RPC Framework +================================================= +**Author**: `Shen Li `_ + + +.. warning:: + The `torch.distributed.rpc `__ package + is experimental and subject to change. It also requires PyTorch 1.4.0+ to run, as that is the first version to support RPC. + + +This tutorial uses two simple examples to demonstrate how to build distributed +training with the `torch.distributed.rpc `__ +package, which was first introduced as an experimental feature in PyTorch v1.4. +Source code of the two examples can be found in +`PyTorch examples `__. + +Previous tutorials, +`Getting Started With Distributed Data Parallel `__ +and `Writing Distributed Applications With PyTorch `__, +described `DistributedDataParallel `__, +which supports a specific training paradigm where the model is replicated across +multiple processes and each process handles a split of the input data. +Sometimes, you might run into scenarios that require different training +paradigms. For example: + +1) In reinforcement learning, it might be relatively expensive to acquire + training data from environments while the model itself can be quite small. In + this case, it might be useful to spawn multiple observers running in parallel + and share a single agent. Here, the agent takes care of the training + locally, but the application would still need libraries to send and receive + data between observers and the trainer.
+2) Your model might be too large to fit in GPUs on a single machine, and hence + would need a library to help split the model onto multiple machines. Or you + might be implementing a `parameter server `__ + training framework, where model parameters and trainers live on different + machines. + + +The `torch.distributed.rpc `__ package +can help with the above scenarios. In case 1, `RPC `__ +and `RRef `__ allow sending data +from one worker to another while easily referencing remote data objects. In +case 2, `distributed autograd `__ +and `distributed optimizer `__ +make executing the backward pass and optimizer step much like local training. In +the next two sections, we will demonstrate the APIs of +`torch.distributed.rpc `__ using a +reinforcement learning example and a language model example. Please note that this +tutorial does not aim at building the most accurate or efficient models to +solve the given problems; instead, the main goal here is to show how to use the +`torch.distributed.rpc `__ package to +build distributed training applications. + + + +Distributed Reinforcement Learning using RPC and RRef +----------------------------------------------------- + +This section describes steps to build a toy distributed reinforcement learning +model using RPC to solve CartPole-v1 from `OpenAI Gym `__. +The policy code is mostly borrowed from the existing single-thread +`example `__ +as shown below. We will skip details of the ``Policy`` design, and focus on RPC +usage. + +.. 
code:: python + + import torch.nn as nn + import torch.nn.functional as F + + class Policy(nn.Module): + + def __init__(self): + super(Policy, self).__init__() + self.affine1 = nn.Linear(4, 128) + self.dropout = nn.Dropout(p=0.6) + self.affine2 = nn.Linear(128, 2) + + self.saved_log_probs = [] + self.rewards = [] + + def forward(self, x): + x = self.affine1(x) + x = self.dropout(x) + x = F.relu(x) + action_scores = self.affine2(x) + return F.softmax(action_scores, dim=1) + +Let's first prepare a helper to run functions remotely on the owner worker of an +``RRef``. You will find this function being used in several places in this +tutorial's examples. Ideally, the `torch.distributed.rpc` package should provide +these helper functions out of the box. For example, it would be easier if +applications could directly call ``RRef.some_func(*arg)``, which would then +translate to an RPC to the ``RRef`` owner. The progress on this API is tracked in +`pytorch/pytorch#31743 `__. + +.. code:: python + + from torch.distributed.rpc import rpc_sync + + def _call_method(method, rref, *args, **kwargs): + return method(rref.local_value(), *args, **kwargs) + + + def _remote_method(method, rref, *args, **kwargs): + args = [method, rref] + list(args) + return rpc_sync(rref.owner(), _call_method, args=args, kwargs=kwargs) + + # to call a function on an rref, we could do the following + # _remote_method(some_func, rref, *args) + + +We are ready to present the observer. In this example, each observer creates its +own environment, and waits for the agent's command to run an episode. In each +episode, one observer loops for at most ``n_steps`` iterations, and in each +iteration, it uses RPC to pass its environment state to the agent and gets an +action back. Then it applies that action to its environment, and gets the reward +and the next state from the environment. After that, the observer uses another +RPC to report the reward to the agent.
Again, please note that this is +obviously not the most efficient observer implementation. For example, one +simple optimization could be packing the current state and the last reward in one RPC to +reduce the communication overhead. However, the goal is to demonstrate the RPC API +instead of building the best solver for CartPole. So, let's keep the logic +simple and the two steps explicit in this example. + +.. code:: python + + import argparse + import gym + import torch.distributed.rpc as rpc + + parser = argparse.ArgumentParser( + description="RPC Reinforcement Learning Example", + formatter_class=argparse.ArgumentDefaultsHelpFormatter, + ) + + parser.add_argument('--world_size', default=2, type=int, help='Number of workers') + parser.add_argument('--log_interval', default=1, type=int, help='Log every log_interval episodes') + parser.add_argument('--gamma', default=0.1, type=float, help='how much to value future rewards') + parser.add_argument('--seed', default=1, type=int, help='random seed for reproducibility') + args = parser.parse_args() + + class Observer: + + def __init__(self): + self.id = rpc.get_worker_info().id + self.env = gym.make('CartPole-v1') + self.env.seed(args.seed) + + def run_episode(self, agent_rref, n_steps): + state, ep_reward = self.env.reset(), 0 + for step in range(n_steps): + # send the state to the agent to get an action + action = _remote_method(Agent.select_action, agent_rref, self.id, state) + + # apply the action to the environment, and get the reward + state, reward, done, _ = self.env.step(action) + + # report the reward to the agent for training purposes + _remote_method(Agent.report_reward, agent_rref, self.id, reward) + + if done: + break + + +The code for the agent is a little more complex, and we will break it into multiple +pieces. 
In this example, the agent serves as both the trainer and the master, +such that it sends commands to multiple distributed observers to run episodes, +and it also records all actions and rewards locally, which will be used during +the training phase after each episode. The code below shows the ``Agent`` +constructor, where most lines are initializing various components. The loop at +the end initializes observers remotely on other workers, and holds ``RRefs`` to +those observers locally. The agent will use those observer ``RRefs`` later to +send commands. Applications don't need to worry about the lifetime of ``RRefs``. +The owner of each ``RRef`` maintains a reference counting map to track its +lifetime, and guarantees the remote data object will not be deleted as long as +there is any live user of that ``RRef``. Please refer to the ``RRef`` +`design doc `__ for details. + + +.. code:: python + + import gym + import numpy as np + + import torch + import torch.distributed.rpc as rpc + import torch.optim as optim + from torch.distributed.rpc import RRef, rpc_async, remote + from torch.distributions import Categorical + + class Agent: + def __init__(self, world_size): + self.ob_rrefs = [] + self.agent_rref = RRef(self) + self.rewards = {} + self.saved_log_probs = {} + self.policy = Policy() + self.optimizer = optim.Adam(self.policy.parameters(), lr=1e-2) + self.eps = np.finfo(np.float32).eps.item() + self.running_reward = 0 + self.reward_threshold = gym.make('CartPole-v1').spec.reward_threshold + for ob_rank in range(1, world_size): + ob_info = rpc.get_worker_info(OBSERVER_NAME.format(ob_rank)) + self.ob_rrefs.append(remote(ob_info, Observer)) + self.rewards[ob_info.id] = [] + self.saved_log_probs[ob_info.id] = [] + + +Next, the agent exposes two APIs to observers for selecting actions and +reporting rewards. Those functions only run locally on the agent, but will +be triggered by observers through RPC. + + +.. code:: python + + class Agent: + ... 
+ def select_action(self, ob_id, state): + state = torch.from_numpy(state).float().unsqueeze(0) + probs = self.policy(state) + m = Categorical(probs) + action = m.sample() + self.saved_log_probs[ob_id].append(m.log_prob(action)) + return action.item() + + def report_reward(self, ob_id, reward): + self.rewards[ob_id].append(reward) + + +Let's add a ``run_episode`` function on the agent, which tells all observers +to execute an episode. In this function, it first creates a list to collect +futures from asynchronous RPCs, and then loops over all observer ``RRefs`` to +make asynchronous RPCs. In these RPCs, the agent also passes an ``RRef`` of +itself to the observer, so that the observer can call functions on the agent as +well. As shown above, each observer will make RPCs back to the agent, which are +nested RPCs. After each episode, the ``saved_log_probs`` and ``rewards`` will +contain the recorded action probs and rewards. + + +.. code:: python + + class Agent: + ... + def run_episode(self, n_steps=0): + futs = [] + for ob_rref in self.ob_rrefs: + # make async RPC to kick off an episode on all observers + futs.append( + rpc_async( + ob_rref.owner(), + _call_method, + args=(Observer.run_episode, ob_rref, self.agent_rref, n_steps) + ) + ) + + # wait until all observers have finished this episode + for fut in futs: + fut.wait() + + +Finally, after one episode, the agent needs to train the model, which +is implemented in the ``finish_episode`` function below. There are no RPCs in +this function, and it is mostly borrowed from the single-thread +`example `__. +Hence, we skip describing its contents. + + + +.. code:: python + + class Agent: + ... 
+ def finish_episode(self): + # joins probs and rewards from different observers into lists + R, probs, rewards = 0, [], [] + for ob_id in self.rewards: + probs.extend(self.saved_log_probs[ob_id]) + rewards.extend(self.rewards[ob_id]) + + # use the minimum observer reward to calculate the running reward + min_reward = min([sum(self.rewards[ob_id]) for ob_id in self.rewards]) + self.running_reward = 0.05 * min_reward + (1 - 0.05) * self.running_reward + + # clear saved probs and rewards + for ob_id in self.rewards: + self.rewards[ob_id] = [] + self.saved_log_probs[ob_id] = [] + + policy_loss, returns = [], [] + for r in rewards[::-1]: + R = r + args.gamma * R + returns.insert(0, R) + returns = torch.tensor(returns) + returns = (returns - returns.mean()) / (returns.std() + self.eps) + for log_prob, R in zip(probs, returns): + policy_loss.append(-log_prob * R) + self.optimizer.zero_grad() + policy_loss = torch.cat(policy_loss).sum() + policy_loss.backward() + self.optimizer.step() + return min_reward + + +With the ``Policy``, ``Observer``, and ``Agent`` classes, we are ready to launch +multiple processes to perform the distributed training. In this example, all +processes run the same ``run_worker`` function, and they use the rank to +distinguish their role. Rank 0 is always the agent, and all other ranks are +observers. The agent serves as master by repeatedly calling ``run_episode`` and +``finish_episode`` until the running reward surpasses the reward threshold +specified by the environment. All observers passively wait for commands +from the agent. The code is wrapped by +`rpc.init_rpc `__ and +`rpc.shutdown `__, +which initialize and terminate RPC instances, respectively. More details are +available in the `API page `__. + + +.. 
code:: python + + import os + from itertools import count + + import torch.multiprocessing as mp + + AGENT_NAME = "agent" + OBSERVER_NAME="obs" + TOTAL_EPISODE_STEP = 100 + + def run_worker(rank, world_size): + os.environ['MASTER_ADDR'] = 'localhost' + os.environ['MASTER_PORT'] = '29500' + if rank == 0: + # rank0 is the agent + rpc.init_rpc(AGENT_NAME, rank=rank, world_size=world_size) + + agent = Agent(world_size) + for i_episode in count(1): + n_steps = int(TOTAL_EPISODE_STEP / (args.world_size - 1)) + agent.run_episode(n_steps=n_steps) + last_reward = agent.finish_episode() + + if i_episode % args.log_interval == 0: + print('Episode {}\tLast reward: {:.2f}\tAverage reward: {:.2f}'.format( + i_episode, last_reward, agent.running_reward)) + + if agent.running_reward > agent.reward_threshold: + print("Solved! Running reward is now {}!".format(agent.running_reward)) + break + else: + # other ranks are the observer + rpc.init_rpc(OBSERVER_NAME.format(rank), rank=rank, world_size=world_size) + # observers passively waiting for instructions from the agent + + # block until all rpcs finish, and shutdown the RPC instance + rpc.shutdown() + + + mp.spawn( + run_worker, + args=(args.world_size, ), + nprocs=args.world_size, + join=True + ) + +Below are some sample outputs when training with `world_size=2`. 
+ +:: + + Episode 10 Last reward: 26.00 Average reward: 10.01 + Episode 20 Last reward: 16.00 Average reward: 11.27 + Episode 30 Last reward: 49.00 Average reward: 18.62 + Episode 40 Last reward: 45.00 Average reward: 26.09 + Episode 50 Last reward: 44.00 Average reward: 30.03 + Episode 60 Last reward: 111.00 Average reward: 42.23 + Episode 70 Last reward: 131.00 Average reward: 70.11 + Episode 80 Last reward: 87.00 Average reward: 76.51 + Episode 90 Last reward: 86.00 Average reward: 95.93 + Episode 100 Last reward: 13.00 Average reward: 123.93 + Episode 110 Last reward: 33.00 Average reward: 91.39 + Episode 120 Last reward: 73.00 Average reward: 76.38 + Episode 130 Last reward: 137.00 Average reward: 88.08 + Episode 140 Last reward: 89.00 Average reward: 104.96 + Episode 150 Last reward: 97.00 Average reward: 98.74 + Episode 160 Last reward: 150.00 Average reward: 100.87 + Episode 170 Last reward: 126.00 Average reward: 104.38 + Episode 180 Last reward: 500.00 Average reward: 213.74 + Episode 190 Last reward: 322.00 Average reward: 300.22 + Episode 200 Last reward: 165.00 Average reward: 272.71 + Episode 210 Last reward: 168.00 Average reward: 233.11 + Episode 220 Last reward: 184.00 Average reward: 195.02 + Episode 230 Last reward: 284.00 Average reward: 208.32 + Episode 240 Last reward: 395.00 Average reward: 247.37 + Episode 250 Last reward: 500.00 Average reward: 335.42 + Episode 260 Last reward: 500.00 Average reward: 386.30 + Episode 270 Last reward: 500.00 Average reward: 405.29 + Episode 280 Last reward: 500.00 Average reward: 443.29 + Episode 290 Last reward: 500.00 Average reward: 464.65 + Solved! Running reward is now 475.3163778435275! + + +In this example, we show how to use RPC as the communication vehicle to pass +data across workers, and how to use RRef to reference remote objects. It is true +that you could build the entire structure directly on top of ``ProcessGroup`` +``send`` and ``recv`` APIs or use other communication/RPC libraries. 
However, +by using `torch.distributed.rpc`, you get native support and +continuously optimized performance under the hood. + +Next, we will show how to combine RPC and RRef with distributed autograd and +distributed optimizer to perform distributed model parallel training. + + + +Distributed RNN using Distributed Autograd and Distributed Optimizer +-------------------------------------------------------------------- + +In this section, we use an RNN model to show how to build distributed model +parallel training with the RPC API. The example RNN model is very small and +can easily fit into a single GPU, but we still divide its layers onto two +different workers to demonstrate the idea. Developers can apply similar +techniques to distribute much larger models across multiple devices and +machines. + +The RNN model design is borrowed from the word language model in the PyTorch +`example `__ +repository, which contains three main components: an embedding table, an +``LSTM`` layer, and a decoder. The code below wraps the embedding table and the +decoder into sub-modules, so that their constructors can be passed to the RPC +API. In the ``EmbeddingTable`` sub-module, we intentionally put the +``Embedding`` layer on a GPU to cover the use case. In v1.4, RPC always creates +CPU tensor arguments or return values on the destination worker. If the function +takes a GPU tensor, you need to move it to the proper device explicitly. + + +.. 
code:: python + + class EmbeddingTable(nn.Module): + r""" + Encoding layers of the RNNModel + """ + def __init__(self, ntoken, ninp, dropout): + super(EmbeddingTable, self).__init__() + self.drop = nn.Dropout(dropout) + self.encoder = nn.Embedding(ntoken, ninp).cuda() + self.encoder.weight.data.uniform_(-0.1, 0.1) + + def forward(self, input): + return self.drop(self.encoder(input.cuda())).cpu() + + + class Decoder(nn.Module): + def __init__(self, ntoken, nhid, dropout): + super(Decoder, self).__init__() + self.drop = nn.Dropout(dropout) + self.decoder = nn.Linear(nhid, ntoken) + self.decoder.bias.data.zero_() + self.decoder.weight.data.uniform_(-0.1, 0.1) + + def forward(self, output): + return self.decoder(self.drop(output)) + + +With the above sub-modules, we can now piece them together using RPC to +create an RNN model. In the code below ``ps`` represents a parameter server, +which hosts parameters of the embedding table and the decoder. The constructor +uses the `remote `__ +API to create an ``EmbeddingTable`` object and a ``Decoder`` object on the +parameter server, and locally creates the ``LSTM`` sub-module. During the +forward pass, the trainer uses the ``EmbeddingTable`` ``RRef`` to find the +remote sub-module and passes the input data to the ``EmbeddingTable`` using RPC +and fetches the lookup results. Then, it runs the embedding through the local +``LSTM`` layer, and finally uses another RPC to send the output to the +``Decoder`` sub-module. In general, to implement distributed model parallel +training, developers can divide the model into sub-modules, invoke RPC to create +sub-module instances remotely, and use ``RRefs`` to find them when necessary. +As you can see in the code below, it looks very similar to single-machine model +parallel training. The main difference is replacing ``Tensor.to(device)`` with +RPC functions. + + +.. 
code:: python + + class RNNModel(nn.Module): + def __init__(self, ps, ntoken, ninp, nhid, nlayers, dropout=0.5): + super(RNNModel, self).__init__() + + # setup embedding table remotely + self.emb_table_rref = rpc.remote(ps, EmbeddingTable, args=(ntoken, ninp, dropout)) + # setup LSTM locally + self.rnn = nn.LSTM(ninp, nhid, nlayers, dropout=dropout) + # setup decoder remotely + self.decoder_rref = rpc.remote(ps, Decoder, args=(ntoken, nhid, dropout)) + + def forward(self, input, hidden): + # pass input to the remote embedding table and fetch emb tensor back + emb = _remote_method(EmbeddingTable.forward, self.emb_table_rref, input) + output, hidden = self.rnn(emb, hidden) + # pass output to the remote decoder and get the decoded output back + decoded = _remote_method(Decoder.forward, self.decoder_rref, output) + return decoded, hidden + +Before introducing the distributed optimizer, let's add a helper function to +generate a list of RRefs of model parameters, which will be consumed by the +distributed optimizer. In local training, applications could call +``Module.parameters()`` to grab references to all parameter tensors, and pass them +to the local optimizer for subsequent updates. However, the same API does not +work in distributed training scenarios as some parameters live on remote +machines. Therefore, instead of taking a list of parameter ``Tensors``, the +distributed optimizer takes a list of ``RRefs``, one ``RRef`` per model +parameter for both local and remote model parameters. The helper function is +pretty simple: it just calls ``Module.parameters()`` and creates a local ``RRef`` for +each of the parameters. + + +.. code:: python + + def _parameter_rrefs(module): + param_rrefs = [] + for param in module.parameters(): + param_rrefs.append(RRef(param)) + return param_rrefs + + +Then, as the ``RNNModel`` contains three sub-modules, we need to call +``_parameter_rrefs`` three times, and wrap that into another helper function. + + +.. 
code:: python
+
+    class RNNModel(nn.Module):
+        ...
+        def parameter_rrefs(self):
+            remote_params = []
+            # get RRefs of embedding table
+            remote_params.extend(_remote_method(_parameter_rrefs, self.emb_table_rref))
+            # create RRefs for local parameters
+            remote_params.extend(_parameter_rrefs(self.rnn))
+            # get RRefs of decoder
+            remote_params.extend(_remote_method(_parameter_rrefs, self.decoder_rref))
+            return remote_params
+
+
+Now, we are ready to implement the training loop. After initializing model
+arguments, we create the ``RNNModel`` and the ``DistributedOptimizer``. The
+distributed optimizer will take a list of parameter ``RRefs``, find all distinct
+owner workers, and create the given local optimizer (i.e., ``SGD`` in this case,
+you can use other local optimizers as well) on each of the owner workers using
+the given arguments (i.e., ``lr=0.05``).
+
+In the training loop, it first creates a distributed autograd context, which
+will help the distributed autograd engine to find gradients and involved RPC
+send/recv functions. The design details of the distributed autograd engine can
+be found in its `design note `__.
+Then, it kicks off the forward pass as if it were a local
+model, and runs the distributed backward pass. For the distributed backward
+pass, you only need to specify a list of roots, in this case the loss
+``Tensor``. The distributed autograd engine will traverse the distributed graph
+automatically and write gradients properly. Next, it runs the ``step``
+function on the distributed optimizer, which will reach out to all involved
+local optimizers to update model parameters. Compared to local training, one
+minor difference is that you don't need to run ``zero_grad()`` because each
+autograd context has dedicated space to store gradients, and as we create a
+context per iteration, those gradients from different iterations will not
+accumulate into the same set of ``Tensors``.
+
+
+..
code:: python + + def run_trainer(): + batch = 5 + ntoken = 10 + ninp = 2 + + nhid = 3 + nindices = 3 + nlayers = 4 + hidden = ( + torch.randn(nlayers, nindices, nhid), + torch.randn(nlayers, nindices, nhid) + ) + + model = rnn.RNNModel('ps', ntoken, ninp, nhid, nlayers) + + # setup distributed optimizer + opt = DistributedOptimizer( + optim.SGD, + model.parameter_rrefs(), + lr=0.05, + ) + + criterion = torch.nn.CrossEntropyLoss() + + def get_next_batch(): + for _ in range(5): + data = torch.LongTensor(batch, nindices) % ntoken + target = torch.LongTensor(batch, ntoken) % nindices + yield data, target + + # train for 10 iterations + for epoch in range(10): + for data, target in get_next_batch(): + # create distributed autograd context + with dist_autograd.context() as context_id: + hidden[0].detach_() + hidden[1].detach_() + output, hidden = model(data, hidden) + loss = criterion(output, target) + # run distributed backward pass + dist_autograd.backward(context_id, [loss]) + # run distributed optimizer + opt.step(context_id) + # not necessary to zero grads since they are + # accumulated into the distributed autograd context + # which is reset every iteration. + print("Training epoch {}".format(epoch)) + + +Finally, let's add some glue code to launch the parameter server and the trainer +processes. + + +.. 
code:: python
+
+    def run_worker(rank, world_size):
+        os.environ['MASTER_ADDR'] = 'localhost'
+        os.environ['MASTER_PORT'] = '29500'
+        if rank == 1:
+            rpc.init_rpc("trainer", rank=rank, world_size=world_size)
+            run_trainer()
+        else:
+            rpc.init_rpc("ps", rank=rank, world_size=world_size)
+            # the parameter server does nothing
+            pass
+
+        # block until all rpcs finish
+        rpc.shutdown()
+
+
+    if __name__=="__main__":
+        world_size = 2
+        mp.spawn(run_worker, args=(world_size, ), nprocs=world_size, join=True)
diff --git a/recipes_source/recipes/save_load_across_devices.py b/recipes_source/recipes/save_load_across_devices.py
new file mode 100644
index 00000000000..c2d86fbab50
--- /dev/null
+++ b/recipes_source/recipes/save_load_across_devices.py
@@ -0,0 +1,190 @@
+"""
+Saving and loading models across devices in PyTorch
+===================================================
+
+There may be instances where you want to save and load your neural
+networks across different devices.
+
+Introduction
+------------
+
+Saving and loading models across devices is relatively straightforward
+using PyTorch. In this recipe, we will experiment with saving and
+loading models across CPUs and GPUs.
+
+Setup
+-----
+
+In order for every code block to run properly in this recipe, you must
+first change the runtime to “GPU” or higher. Once you do, we need to
+install ``torch`` if it isn’t already available.
+
+::
+
+   pip install torch
+
+"""
+
+
+######################################################################
+# Steps
+# -----
+#
+# 1. Import all necessary libraries for loading our data
+# 2. Define and initialize the neural network
+# 3. Save on a GPU, load on a CPU
+# 4. Save on a GPU, load on a GPU
+# 5. Save on a CPU, load on a GPU
+# 6. Saving and loading ``DataParallel`` models
+#
+# 1.
Import necessary libraries for loading our data
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn``
+# and ``torch.optim``, as well as ``torch.nn.functional`` for the
+# activation functions used in the network's forward pass.
+#
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+
+######################################################################
+# 2. Define and initialize the neural network
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For the sake of example, we will create a neural network for training
+# images. To learn more see the Defining a Neural Network recipe.
+#
+
+class Net(nn.Module):
+    def __init__(self):
+        super(Net, self).__init__()
+        self.conv1 = nn.Conv2d(3, 6, 5)
+        self.pool = nn.MaxPool2d(2, 2)
+        self.conv2 = nn.Conv2d(6, 16, 5)
+        self.fc1 = nn.Linear(16 * 5 * 5, 120)
+        self.fc2 = nn.Linear(120, 84)
+        self.fc3 = nn.Linear(84, 10)
+
+    def forward(self, x):
+        x = self.pool(F.relu(self.conv1(x)))
+        x = self.pool(F.relu(self.conv2(x)))
+        x = x.view(-1, 16 * 5 * 5)
+        x = F.relu(self.fc1(x))
+        x = F.relu(self.fc2(x))
+        x = self.fc3(x)
+        return x
+
+net = Net()
+print(net)
+
+
+######################################################################
+# 3. Save on GPU, Load on CPU
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# When loading a model on a CPU that was trained with a GPU, pass
+# ``torch.device('cpu')`` to the ``map_location`` argument in the
+# ``torch.load()`` function.
+#
+
+# Specify a path to save to
+PATH = "model.pt"
+
+# Save
+torch.save(net.state_dict(), PATH)
+
+# Load
+device = torch.device('cpu')
+model = Net()
+model.load_state_dict(torch.load(PATH, map_location=device))
+
+
+######################################################################
+# In this case, the storages underlying the tensors are dynamically
+# remapped to the CPU device using the ``map_location`` argument.
+#
+# 4.
Save on GPU, Load on GPU +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# When loading a model on a GPU that was trained and saved on GPU, simply +# convert the initialized model to a CUDA optimized model using +# ``model.to(torch.device('cuda'))``. +# +# Be sure to use the ``.to(torch.device('cuda'))`` function on all model +# inputs to prepare the data for the model. +# + +# Save +torch.save(net.state_dict(), PATH) + +# Load +device = torch.device("cuda") +model = Net() +model.load_state_dict(torch.load(PATH)) +model.to(device) + + +###################################################################### +# Note that calling ``my_tensor.to(device)`` returns a new copy of +# ``my_tensor`` on GPU. It does NOT overwrite ``my_tensor``. Therefore, +# remember to manually overwrite tensors: +# ``my_tensor = my_tensor.to(torch.device('cuda'))``. +# +# 5. Save on CPU, Load on GPU +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# When loading a model on a GPU that was trained and saved on CPU, set the +# ``map_location`` argument in the ``torch.load()`` function to +# ``cuda:device_id``. This loads the model to a given GPU device. +# +# Be sure to call ``model.to(torch.device('cuda'))`` to convert the +# model’s parameter tensors to CUDA tensors. +# +# Finally, also be sure to use the ``.to(torch.device('cuda'))`` function +# on all model inputs to prepare the data for the CUDA optimized model. +# + +# Save +torch.save(net.state_dict(), PATH) + +# Load +device = torch.device("cuda") +model = Net() +# Choose whatever GPU device number you want +model.load_state_dict(torch.load(PATH, map_location="cuda:0")) +# Make sure to call input = input.to(device) on any input tensors that you feed to the model +model.to(device) + + +###################################################################### +# 6. Saving ``torch.nn.DataParallel`` Models +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# ``torch.nn.DataParallel`` is a model wrapper that enables parallel GPU +# utilization. 
+#
+# To save a ``DataParallel`` model generically, save the
+# ``model.module.state_dict()``. This way, you have the flexibility to
+# load the model any way you want to any device you want.
+#
+
+# Wrap the model in ``DataParallel`` first so that ``net.module`` exists
+net = nn.DataParallel(net)
+
+# Save
+torch.save(net.module.state_dict(), PATH)
+
+# Load to whatever device you want
+
+
+######################################################################
+# Congratulations! You have successfully saved and loaded models across
+# devices in PyTorch.
+#
+# Learn More
+# ----------
+#
+# Take a look at these other recipes to continue your learning:
+#
+# - TBD
+# - TBD
+#
diff --git a/recipes_source/recipes/saving_and_loading_a_general_checkpoint.py b/recipes_source/recipes/saving_and_loading_a_general_checkpoint.py
new file mode 100644
index 00000000000..6e0c490ec2a
--- /dev/null
+++ b/recipes_source/recipes/saving_and_loading_a_general_checkpoint.py
@@ -0,0 +1,162 @@
+"""
+Saving and loading a general checkpoint in PyTorch
+==================================================
+Saving and loading a general checkpoint model for inference or
+resuming training can be helpful for picking up where you last left off.
+When saving a general checkpoint, you must save more than just the
+model’s state_dict. It is important to also save the optimizer’s
+state_dict, as this contains buffers and parameters that are updated as
+the model trains. Other items that you may want to save are the epoch
+you left off on, the latest recorded training loss, external
+``torch.nn.Embedding`` layers, and more, based on your own algorithm.
+
+Introduction
+------------
+To save multiple checkpoints, you must organize them in a dictionary and
+use ``torch.save()`` to serialize the dictionary. A common PyTorch
+convention is to save these checkpoints using the ``.tar`` file
+extension. To load the items, first initialize the model and optimizer,
+then load the dictionary locally using ``torch.load()``.
From here, you can
+easily access the saved items by simply querying the dictionary as you
+would expect.
+
+In this recipe, we will explore how to save and load multiple
+checkpoints.
+
+Setup
+-----
+Before we begin, we need to install ``torch`` if it isn’t already
+available.
+
+::
+
+   pip install torch
+
+
+"""
+
+
+
+######################################################################
+# Steps
+# -----
+#
+# 1. Import all necessary libraries for loading our data
+# 2. Define and initialize the neural network
+# 3. Initialize the optimizer
+# 4. Save the general checkpoint
+# 5. Load the general checkpoint
+#
+# 1. Import necessary libraries for loading our data
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn``
+# and ``torch.optim``, as well as ``torch.nn.functional`` for the
+# activation functions used in the network's forward pass.
+#
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+
+######################################################################
+# 2. Define and initialize the neural network
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For the sake of example, we will create a neural network for training
+# images. To learn more see the Defining a Neural Network recipe.
+#
+
+class Net(nn.Module):
+    def __init__(self):
+        super(Net, self).__init__()
+        self.conv1 = nn.Conv2d(3, 6, 5)
+        self.pool = nn.MaxPool2d(2, 2)
+        self.conv2 = nn.Conv2d(6, 16, 5)
+        self.fc1 = nn.Linear(16 * 5 * 5, 120)
+        self.fc2 = nn.Linear(120, 84)
+        self.fc3 = nn.Linear(84, 10)
+
+    def forward(self, x):
+        x = self.pool(F.relu(self.conv1(x)))
+        x = self.pool(F.relu(self.conv2(x)))
+        x = x.view(-1, 16 * 5 * 5)
+        x = F.relu(self.fc1(x))
+        x = F.relu(self.fc2(x))
+        x = self.fc3(x)
+        return x
+
+net = Net()
+print(net)
+
+
+######################################################################
+# 3. Initialize the optimizer
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# We will use SGD with momentum.
+#
+
+optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
+
+
+######################################################################
+# 4. Save the general checkpoint
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Collect all relevant information and build your dictionary.
+#
+
+# Additional information
+EPOCH = 5
+PATH = "model.pt"
+LOSS = 0.4
+
+torch.save({
+            'epoch': EPOCH,
+            'model_state_dict': net.state_dict(),
+            'optimizer_state_dict': optimizer.state_dict(),
+            'loss': LOSS,
+            }, PATH)
+
+
+######################################################################
+# 5. Load the general checkpoint
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Remember to first initialize the model and optimizer, then load the
+# dictionary locally.
+#
+
+model = Net()
+optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
+
+checkpoint = torch.load(PATH)
+model.load_state_dict(checkpoint['model_state_dict'])
+optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
+epoch = checkpoint['epoch']
+loss = checkpoint['loss']
+
+model.eval()
+# - or -
+model.train()
+
+
+######################################################################
+# You must call ``model.eval()`` to set dropout and batch normalization
+# layers to evaluation mode before running inference. Failing to do this
+# will yield inconsistent inference results.
+#
+# If you wish to resume training, call ``model.train()`` to ensure these
+# layers are in training mode.
+#
+# Congratulations! You have successfully saved and loaded a general
+# checkpoint for inference and/or resuming training in PyTorch.
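Once the checkpoint dictionary is restored, resuming simply means continuing the loop from the saved epoch. A minimal, self-contained sketch — the ``nn.Linear`` model and random data here are stand-ins for the ``Net`` and real dataset above:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Stand-in model and optimizer; in the recipe these would be Net() and
# the SGD optimizer defined earlier.
model = nn.Linear(4, 2)
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

# Save a general checkpoint, as in step 4.
torch.save({
    'epoch': 5,
    'model_state_dict': model.state_dict(),
    'optimizer_state_dict': optimizer.state_dict(),
    'loss': 0.4,
}, "checkpoint.tar")

# Restore everything, then continue training from the next epoch.
checkpoint = torch.load("checkpoint.tar")
model.load_state_dict(checkpoint['model_state_dict'])
optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
start_epoch = checkpoint['epoch'] + 1

model.train()  # put dropout/batch-norm layers back into training mode
criterion = nn.MSELoss()
for epoch in range(start_epoch, start_epoch + 2):
    optimizer.zero_grad()
    loss = criterion(model(torch.randn(8, 4)), torch.randn(8, 2))
    loss.backward()
    optimizer.step()
```

Because the optimizer's ``state_dict`` carries the momentum buffers, the resumed updates behave as if training had never stopped.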
+# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes/saving_and_loading_models_for_inference.py b/recipes_source/recipes/saving_and_loading_models_for_inference.py new file mode 100644 index 00000000000..a1bf52821f0 --- /dev/null +++ b/recipes_source/recipes/saving_and_loading_models_for_inference.py @@ -0,0 +1,168 @@ +""" +Saving and loading models for inference in PyTorch +================================================== +There are two approaches for saving and loading models for inference in +PyTorch. The first is saving and loading the ``state_dict``, and the +second is saving and loading the entire model. + +Introduction +------------ +Saving the model’s ``state_dict`` with the ``torch.save()`` function +will give you the most flexibility for restoring the model later. This +is the recommended method for saving models, because it is only really +necessary to save the trained model’s learned parameters. +When saving and loading an entire model, you save the entire module +using Python’s +`pickle `__ module. Using +this approach yields the most intuitive syntax and involves the least +amount of code. The disadvantage of this approach is that the serialized +data is bound to the specific classes and the exact directory structure +used when the model is saved. The reason for this is because pickle does +not save the model class itself. Rather, it saves a path to the file +containing the class, which is used during load time. Because of this, +your code can break in various ways when used in other projects or after +refactors. +In this recipe, we will explore both ways on how to save and load models +for inference. + +Setup +----- +Before we begin, we need to install ``torch`` if it isn’t already +available. + + +:: + + pip install torch + + +""" + + +###################################################################### +# Steps +# ----- +# +# 1. 
Import all necessary libraries for loading our data
+# 2. Define and initialize the neural network
+# 3. Initialize the optimizer
+# 4. Save and load the model via ``state_dict``
+# 5. Save and load the entire model
+#
+# 1. Import necessary libraries for loading our data
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn``
+# and ``torch.optim``, as well as ``torch.nn.functional`` for the
+# activation functions used in the network's forward pass.
+#
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+
+######################################################################
+# 2. Define and initialize the neural network
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For the sake of example, we will create a neural network for training
+# images. To learn more see the Defining a Neural Network recipe.
+#
+
+class Net(nn.Module):
+    def __init__(self):
+        super(Net, self).__init__()
+        self.conv1 = nn.Conv2d(3, 6, 5)
+        self.pool = nn.MaxPool2d(2, 2)
+        self.conv2 = nn.Conv2d(6, 16, 5)
+        self.fc1 = nn.Linear(16 * 5 * 5, 120)
+        self.fc2 = nn.Linear(120, 84)
+        self.fc3 = nn.Linear(84, 10)
+
+    def forward(self, x):
+        x = self.pool(F.relu(self.conv1(x)))
+        x = self.pool(F.relu(self.conv2(x)))
+        x = x.view(-1, 16 * 5 * 5)
+        x = F.relu(self.fc1(x))
+        x = F.relu(self.fc2(x))
+        x = self.fc3(x)
+        return x
+
+net = Net()
+print(net)
+
+
+######################################################################
+# 3. Initialize the optimizer
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# We will use SGD with momentum.
+#
+
+optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
+
+
+######################################################################
+# 4. Save and load the model via ``state_dict``
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Let’s save and load our model using just ``state_dict``.
+#
+
+# Specify a path
+PATH = "state_dict_model.pt"
+
+# Save
+torch.save(net.state_dict(), PATH)
+
+# Load
+model = Net()
+model.load_state_dict(torch.load(PATH))
+model.eval()
+
+
+######################################################################
+# A common PyTorch convention is to save models using either a ``.pt`` or
+# ``.pth`` file extension.
+#
+# Notice that the ``load_state_dict()`` function takes a dictionary
+# object, NOT a path to a saved object. This means that you must
+# deserialize the saved ``state_dict`` before you pass it to the
+# ``load_state_dict()`` function. For example, you CANNOT load using
+# ``model.load_state_dict(PATH)``.
+#
+# Remember too, that you must call ``model.eval()`` to set dropout and
+# batch normalization layers to evaluation mode before running inference.
+# Failing to do this will yield inconsistent inference results.
+#
+# 5. Save and load the entire model
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Now let’s try the same thing with the entire model.
+#
+
+# Specify a path
+PATH = "entire_model.pt"
+
+# Save
+torch.save(net, PATH)
+
+# Load
+model = torch.load(PATH)
+model.eval()
+
+
+######################################################################
+# Again here, remember that you must call ``model.eval()`` to set dropout
+# and batch normalization layers to evaluation mode before running
+# inference.
+#
+# Congratulations! You have successfully saved and loaded models for
+# inference in PyTorch.
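As a quick sanity check of the ``state_dict`` round trip, you can compare the original and restored models on the same input — a small sketch with a stand-in ``nn.Linear`` in place of the ``Net`` above:

```python
import torch
import torch.nn as nn

# Stand-in for the Net defined in the recipe.
net = nn.Linear(3, 2)
PATH = "state_dict_model.pt"
torch.save(net.state_dict(), PATH)

# Re-create the architecture, then load the saved parameters into it.
model = nn.Linear(3, 2)
model.load_state_dict(torch.load(PATH))
model.eval()

# With identical weights, the two models must produce identical outputs.
x = torch.randn(1, 3)
same = torch.equal(net(x), model(x))
```

If ``same`` is ``False``, the architecture you instantiated does not match the one whose parameters you saved.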
+# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes/saving_multiple_models_in_one_file.py b/recipes_source/recipes/saving_multiple_models_in_one_file.py new file mode 100644 index 00000000000..b2f38247b4f --- /dev/null +++ b/recipes_source/recipes/saving_multiple_models_in_one_file.py @@ -0,0 +1,162 @@ +""" +Saving and loading multiple models in one file using PyTorch +============================================================ +Saving and loading multiple models can be helpful for reusing models +that you have previously trained. + +Introduction +------------ +When saving a model comprised of multiple ``torch.nn.Modules``, such as +a GAN, a sequence-to-sequence model, or an ensemble of models, you must +save a dictionary of each model’s state_dict and corresponding +optimizer. You can also save any other items that may aid you in +resuming training by simply appending them to the dictionary. +To load the models, first initialize the models and optimizers, then +load the dictionary locally using ``torch.load()``. From here, you can +easily access the saved items by simply querying the dictionary as you +would expect. +In this recipe, we will demonstrate how to save multiple models to one +file using PyTorch. + +Setup +----- +Before we begin, we need to install ``torch`` if it isn’t already +available. + +:: + + pip install torch + +""" + + + +###################################################################### +# Steps +# ----- +# +# 1. Import all necessary libraries for loading our data +# 2. Define and intialize the neural network +# 3. Initialize the optimizer +# 4. Save multiple models +# 5. Load multiple models +# +# 1. Import necessary libraries for loading our data +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn`` +# and ``torch.optim``. 
+#
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+
+######################################################################
+# 2. Define and initialize the neural network
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# For the sake of example, we will create a neural network for training
+# images. To learn more see the Defining a Neural Network recipe. Build
+# two variables for the models to eventually save.
+#
+
+class Net(nn.Module):
+    def __init__(self):
+        super(Net, self).__init__()
+        self.conv1 = nn.Conv2d(3, 6, 5)
+        self.pool = nn.MaxPool2d(2, 2)
+        self.conv2 = nn.Conv2d(6, 16, 5)
+        self.fc1 = nn.Linear(16 * 5 * 5, 120)
+        self.fc2 = nn.Linear(120, 84)
+        self.fc3 = nn.Linear(84, 10)
+
+    def forward(self, x):
+        x = self.pool(F.relu(self.conv1(x)))
+        x = self.pool(F.relu(self.conv2(x)))
+        x = x.view(-1, 16 * 5 * 5)
+        x = F.relu(self.fc1(x))
+        x = F.relu(self.fc2(x))
+        x = self.fc3(x)
+        return x
+
+netA = Net()
+netB = Net()
+
+
+######################################################################
+# 3. Initialize the optimizer
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# We will use SGD with momentum to build an optimizer for each model we
+# created.
+#
+
+optimizerA = optim.SGD(netA.parameters(), lr=0.001, momentum=0.9)
+optimizerB = optim.SGD(netB.parameters(), lr=0.001, momentum=0.9)
+
+
+######################################################################
+# 4. Save multiple models
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Collect all relevant information and build your dictionary.
+#
+
+# Specify a path to save to
+PATH = "model.pt"
+
+torch.save({
+            'modelA_state_dict': netA.state_dict(),
+            'modelB_state_dict': netB.state_dict(),
+            'optimizerA_state_dict': optimizerA.state_dict(),
+            'optimizerB_state_dict': optimizerB.state_dict(),
+            }, PATH)
+
+
+######################################################################
+# 5.
Load multiple models
+# ~~~~~~~~~~~~~~~~~~~~~~~~~~~
+#
+# Remember to first initialize the models and optimizers, then load the
+# dictionary locally.
+#
+
+modelA = Net()
+modelB = Net()
+optimModelA = optim.SGD(modelA.parameters(), lr=0.001, momentum=0.9)
+optimModelB = optim.SGD(modelB.parameters(), lr=0.001, momentum=0.9)
+
+checkpoint = torch.load(PATH)
+modelA.load_state_dict(checkpoint['modelA_state_dict'])
+modelB.load_state_dict(checkpoint['modelB_state_dict'])
+optimModelA.load_state_dict(checkpoint['optimizerA_state_dict'])
+optimModelB.load_state_dict(checkpoint['optimizerB_state_dict'])
+
+modelA.eval()
+modelB.eval()
+# - or -
+modelA.train()
+modelB.train()
+
+
+######################################################################
+# You must call ``model.eval()`` to set dropout and batch normalization
+# layers to evaluation mode before running inference. Failing to do this
+# will yield inconsistent inference results.
+#
+# If you wish to resume training, call ``model.train()`` to ensure these
+# layers are in training mode.
+#
+# Congratulations! You have successfully saved and loaded multiple models
+# in PyTorch.
+#
+# Learn More
+# ----------
+#
+# Take a look at these other recipes to continue your learning:
+#
+# - TBD
+# - TBD
+#
diff --git a/recipes_source/recipes/super_resolution_with_onnxruntime.py b/recipes_source/recipes/super_resolution_with_onnxruntime.py
new file mode 100644
index 00000000000..2a3a3dadd36
--- /dev/null
+++ b/recipes_source/recipes/super_resolution_with_onnxruntime.py
@@ -0,0 +1,312 @@
+"""
+(optional) Exporting a Model from PyTorch to ONNX and Running it using ONNX Runtime
+===================================================================================
+
+In this tutorial, we describe how to convert a model defined
+in PyTorch into the ONNX format and then run it with ONNX Runtime.
+ +ONNX Runtime is a performance-focused engine for ONNX models, +which inferences efficiently across multiple platforms and hardware +(Windows, Linux, and Mac and on both CPUs and GPUs). +ONNX Runtime has proved to considerably increase performance over +multiple models as explained `here +`__ + +For this tutorial, you will need to install `ONNX `__ +and `ONNX Runtime `__. +You can get binary builds of ONNX and ONNX Runtime with +``pip install onnx onnxruntime``. +Note that ONNX Runtime is compatible with Python versions 3.5 to 3.7. + +``NOTE``: This tutorial needs PyTorch master branch which can be installed by following +the instructions `here `__ + +""" + +# Some standard imports +import io +import numpy as np + +from torch import nn +import torch.utils.model_zoo as model_zoo +import torch.onnx + + +###################################################################### +# Super-resolution is a way of increasing the resolution of images, videos +# and is widely used in image processing or video editing. For this +# tutorial, we will use a small super-resolution model. +# +# First, let's create a SuperResolution model in PyTorch. +# This model uses the efficient sub-pixel convolution layer described in +# `"Real-Time Single Image and Video Super-Resolution Using an Efficient +# Sub-Pixel Convolutional Neural Network" - Shi et al `__ +# for increasing the resolution of an image by an upscale factor. +# The model expects the Y component of the YCbCr of an image as an input, and +# outputs the upscaled Y component in super resolution. 
+# +# `The +# model `__ +# comes directly from PyTorch's examples without modification: +# + +# Super Resolution model definition in PyTorch +import torch.nn as nn +import torch.nn.init as init + + +class SuperResolutionNet(nn.Module): + def __init__(self, upscale_factor, inplace=False): + super(SuperResolutionNet, self).__init__() + + self.relu = nn.ReLU(inplace=inplace) + self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2)) + self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1)) + self.conv3 = nn.Conv2d(64, 32, (3, 3), (1, 1), (1, 1)) + self.conv4 = nn.Conv2d(32, upscale_factor ** 2, (3, 3), (1, 1), (1, 1)) + self.pixel_shuffle = nn.PixelShuffle(upscale_factor) + + self._initialize_weights() + + def forward(self, x): + x = self.relu(self.conv1(x)) + x = self.relu(self.conv2(x)) + x = self.relu(self.conv3(x)) + x = self.pixel_shuffle(self.conv4(x)) + return x + + def _initialize_weights(self): + init.orthogonal_(self.conv1.weight, init.calculate_gain('relu')) + init.orthogonal_(self.conv2.weight, init.calculate_gain('relu')) + init.orthogonal_(self.conv3.weight, init.calculate_gain('relu')) + init.orthogonal_(self.conv4.weight) + +# Create the super-resolution model by using the above model definition. +torch_model = SuperResolutionNet(upscale_factor=3) + + +###################################################################### +# Ordinarily, you would now train this model; however, for this tutorial, +# we will instead download some pre-trained weights. Note that this model +# was not trained fully for good accuracy and is used here for +# demonstration purposes only. +# +# It is important to call ``torch_model.eval()`` or ``torch_model.train(False)`` +# before exporting the model, to turn the model to inference mode. +# This is required since operators like dropout or batchnorm behave +# differently in inference and training mode. 
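Before moving on, the upscaling arithmetic of the sub-pixel layer can be checked in isolation: ``nn.PixelShuffle(r)`` rearranges a ``(N, C*r**2, H, W)`` tensor into ``(N, C, H*r, W*r)``, which is exactly how ``conv4``'s ``upscale_factor ** 2`` output channels become one channel at triple the resolution. A quick sketch:

```python
import torch
import torch.nn as nn

# PixelShuffle rearranges (N, C*r^2, H, W) into (N, C, H*r, W*r); with
# upscale_factor=3 and one output channel, 9 input channels become a
# single channel at 3x the spatial resolution.
shuffle = nn.PixelShuffle(3)
x = torch.randn(1, 9, 224, 224)
y = shuffle(x)
# y.shape == (1, 1, 672, 672)
```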
+# + +# Load pretrained model weights +model_url = 'https://s3.amazonaws.com/pytorch/test_data/export/superres_epoch100-44c6958e.pth' +batch_size = 1 # just a random number + +# Initialize model with the pretrained weights +map_location = lambda storage, loc: storage +if torch.cuda.is_available(): + map_location = None +torch_model.load_state_dict(model_zoo.load_url(model_url, map_location=map_location)) + +# set the model to inference mode +torch_model.eval() + + +###################################################################### +# Exporting a model in PyTorch works via tracing or scripting. This +# tutorial will use as an example a model exported by tracing. +# To export a model, we call the ``torch.onnx.export()`` function. +# This will execute the model, recording a trace of what operators +# are used to compute the outputs. +# Because ``export`` runs the model, we need to provide an input +# tensor ``x``. The values in this can be random as long as it is the +# right type and size. +# Note that the input size will be fixed in the exported ONNX graph for +# all the input's dimensions, unless specified as a dynamic axes. +# In this example we export the model with an input of batch_size 1, +# but then specify the first dimension as dynamic in the ``dynamic_axes`` +# parameter in ``torch.onnx.export()``. +# The exported model will thus accept inputs of size [batch_size, 1, 224, 224] +# where batch_size can be variable. +# +# To learn more details about PyTorch's export interface, check out the +# `torch.onnx documentation `__. 
+#
+
+# Input to the model
+x = torch.randn(batch_size, 1, 224, 224, requires_grad=True)
+torch_out = torch_model(x)
+
+# Export the model
+torch.onnx.export(torch_model,               # model being run
+                  x,                         # model input (or a tuple for multiple inputs)
+                  "super_resolution.onnx",   # where to save the model (can be a file or file-like object)
+                  export_params=True,        # store the trained parameter weights inside the model file
+                  opset_version=10,          # the ONNX version to export the model to
+                  do_constant_folding=True,  # whether to execute constant folding for optimization
+                  input_names = ['input'],   # the model's input names
+                  output_names = ['output'], # the model's output names
+                  dynamic_axes={'input' : {0 : 'batch_size'},    # variable length axes
+                                'output' : {0 : 'batch_size'}})
+
+######################################################################
+# We also computed ``torch_out``, the output of the model,
+# which we will use to verify that the model we exported computes
+# the same values when run in ONNX Runtime.
+#
+# But before verifying the model's output with ONNX Runtime, we will check
+# the ONNX model with ONNX's API.
+# First, ``onnx.load("super_resolution.onnx")`` will load the saved model and
+# will output an onnx.ModelProto structure (a top-level file/container format
+# for bundling a ML model. For more information, see the
+# `onnx.proto documentation `__.).
+# Then, ``onnx.checker.check_model(onnx_model)`` will verify the model's structure
+# and confirm that the model has a valid schema.
+# The validity of the ONNX graph is verified by checking the model's
+# version, the graph's structure, as well as the nodes and their inputs
+# and outputs.
+#
+
+import onnx
+
+onnx_model = onnx.load("super_resolution.onnx")
+onnx.checker.check_model(onnx_model)
+# This part can normally be done in a separate process or on another +# machine, but we will continue in the same process so that we can +# verify that ONNX Runtime and PyTorch are computing the same value +# for the network. +# +# In order to run the model with ONNX Runtime, we need to create an +# inference session for the model with the chosen configuration +# parameters (here we use the default config). +# Once the session is created, we evaluate the model using the run() api. +# The output of this call is a list containing the outputs of the model +# computed by ONNX Runtime. +# + +import onnxruntime + +ort_session = onnxruntime.InferenceSession("super_resolution.onnx") + +def to_numpy(tensor): + return tensor.detach().cpu().numpy() if tensor.requires_grad else tensor.cpu().numpy() + +# compute ONNX Runtime output prediction +ort_inputs = {ort_session.get_inputs()[0].name: to_numpy(x)} +ort_outs = ort_session.run(None, ort_inputs) + +# compare ONNX Runtime and PyTorch results +np.testing.assert_allclose(to_numpy(torch_out), ort_outs[0], rtol=1e-03, atol=1e-05) + +print("Exported model has been tested with ONNXRuntime, and the result looks good!") + + +###################################################################### +# We should see that the output of PyTorch and ONNX Runtime runs match +# numerically with the given precision (rtol=1e-03 and atol=1e-05). +# As a side-note, if they do not match then there is an issue in the +# ONNX exporter, so please contact us in that case. +# + + +###################################################################### +# Running the model on an image using ONNX Runtime +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# + + +###################################################################### +# So far we have exported a model from PyTorch and shown how to load it +# and run it in ONNX Runtime with a dummy tensor as an input. 
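The tolerance check performed by ``np.testing.assert_allclose`` above follows the element-wise rule ``|actual - expected| <= atol + rtol * |expected|``. As a plain-Python sketch of that rule (an illustration for intuition, not NumPy's actual implementation):

```python
def allclose(actual, expected, rtol=1e-03, atol=1e-05):
    # Element-wise tolerance test: |a - b| <= atol + rtol * |b|,
    # mirroring the rule numpy.testing.assert_allclose applies.
    return all(abs(a - b) <= atol + rtol * abs(b)
               for a, b in zip(actual, expected))

# Hypothetical backend outputs: tiny floating-point drift passes...
assert allclose([0.123456, -1.000000], [0.123455, -1.000500])
# ...while a real discrepancy fails.
assert not allclose([1.0], [1.1])
```

With ``rtol=1e-03`` and ``atol=1e-05``, harmless numerical drift between the PyTorch and ONNX Runtime backends is tolerated while genuine exporter bugs are flagged.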
+ +###################################################################### +# For this tutorial, we will use a famous, widely used cat image, which +# looks like this: +# +# .. figure:: /_static/img/cat_224x224.jpg +# :alt: cat +# + +###################################################################### +# First, let's load the image and pre-process it using the standard PIL +# Python library. Note that this preprocessing is the standard practice of +# processing data for training/testing neural networks. +# +# We first resize the image to fit the size of the model's input (224x224). +# Then we split the image into its Y, Cb, and Cr components. +# These components represent a greyscale image (Y), and +# the blue-difference (Cb) and red-difference (Cr) chroma components. +# Because the human eye is more sensitive to the Y component, we are +# interested in this component, which we will be transforming. +# After extracting the Y component, we convert it to a tensor which +# will be the input of our model. +# + +from PIL import Image +import torchvision.transforms as transforms + +img = Image.open("./_static/img/cat.jpg") + +resize = transforms.Resize([224, 224]) +img = resize(img) + +img_ycbcr = img.convert('YCbCr') +img_y, img_cb, img_cr = img_ycbcr.split() + +to_tensor = transforms.ToTensor() +img_y = to_tensor(img_y) +img_y.unsqueeze_(0) + + +###################################################################### +# Now, as a next step, let's take the tensor representing the +# greyscale resized cat image and run the super-resolution model in +# ONNX Runtime as explained previously. +# + +ort_inputs = {ort_session.get_inputs()[0].name: to_numpy(img_y)} +ort_outs = ort_session.run(None, ort_inputs) +img_out_y = ort_outs[0] + + +###################################################################### +# At this point, the output of the model is a tensor.
+# Now, we'll process the output of the model to reconstruct the +# final output image from the output tensor, and save the image. +# The post-processing steps have been adopted from the PyTorch +# implementation of the super-resolution model +# `here `__. +# + +img_out_y = Image.fromarray(np.uint8((img_out_y[0] * 255.0).clip(0, 255)[0]), mode='L') + +# get the output image following the post-processing steps from the PyTorch implementation +final_img = Image.merge( + "YCbCr", [ + img_out_y, + img_cb.resize(img_out_y.size, Image.BICUBIC), + img_cr.resize(img_out_y.size, Image.BICUBIC), + ]).convert("RGB") + +# Save the image; we will compare it with the output image from the mobile device +final_img.save("./_static/img/cat_superres_with_ort.jpg") + + +###################################################################### +# .. figure:: /_static/img/cat_superres_with_ort.jpg +# :alt: output\_cat +# +# +# ONNX Runtime is a cross-platform engine: you can run it across +# multiple platforms and on both CPUs and GPUs. +# +# ONNX Runtime can also be deployed to the cloud for model inferencing +# using Azure Machine Learning Services. More information is available `here `__. +# +# More information about ONNX Runtime's performance is available `here `__. +# +# +# For more information about ONNX Runtime, see `here `__. +# diff --git a/recipes_source/recipes/torch_script_custom_classes.rst b/recipes_source/recipes/torch_script_custom_classes.rst new file mode 100644 index 00000000000..031e6c3f696 --- /dev/null +++ b/recipes_source/recipes/torch_script_custom_classes.rst @@ -0,0 +1,609 @@ +Extending TorchScript with Custom C++ Classes +=============================================== + +This tutorial is a follow-on to the +`custom operator `_ +tutorial, and introduces the API we've built for binding C++ classes into TorchScript +and Python simultaneously. The API is very similar to +`pybind11 `_, and most of the concepts will transfer +over if you're familiar with that system.
+ +Implementing and Binding the Class in C++ +----------------------------------------- + +For this tutorial, we are going to define a simple C++ class that maintains persistent +state in a member variable. + +.. code-block:: cpp + + // This header is all you need to do the C++ portions of this + // tutorial + #include <torch/script.h> + // This header is what defines the custom class registration + // behavior specifically. script.h already includes this, but + // we include it here so you know it exists in case you want + // to look at the API or implementation. + #include <torch/custom_class.h> + + #include <string> + #include <vector> + + template <class T> + struct MyStackClass : torch::CustomClassHolder { + std::vector<T> stack_; + MyStackClass(std::vector<T> init) : stack_(init.begin(), init.end()) {} + + void push(T x) { + stack_.push_back(x); + } + T pop() { + auto val = stack_.back(); + stack_.pop_back(); + return val; + } + + c10::intrusive_ptr<MyStackClass> clone() const { + return c10::make_intrusive<MyStackClass>(stack_); + } + + void merge(const c10::intrusive_ptr<MyStackClass>& c) { + for (auto& elem : c->stack_) { + push(elem); + } + } + }; + +There are several things to note: + +- ``torch/custom_class.h`` is the header you need to include to extend TorchScript + with your custom class. +- Notice that whenever we are working with instances of the custom + class, we do it via instances of ``c10::intrusive_ptr<>``. Think of ``intrusive_ptr`` + as a smart pointer like ``std::shared_ptr``. The reason for using this smart pointer + is to ensure consistent lifetime management of the object instances between languages + (C++, Python and TorchScript). +- The second thing to notice is that the user-defined class must inherit from + ``torch::CustomClassHolder``. This ensures that everything is set up to handle + the lifetime management system previously mentioned. + +Now let's take a look at how we will make this class visible to TorchScript, a process called +*binding* the class: + +..
code-block:: cpp + + // Notice a few things: + // - We pass the class to be registered as a template parameter to + // `torch::class_`. In this instance, we've passed the + // specialization of the MyStackClass class `MyStackClass<std::string>`. + // In general, you cannot register a non-specialized template + // class. For non-templated classes, you can just pass the + // class name directly as the template parameter. + // - The arguments passed to the constructor make up the "qualified name" + // of the class. In this case, the registered class will appear in + // Python and C++ as `torch.classes.my_classes.MyStackClass`. We call + // the first argument the "namespace" and the second argument the + // actual class name. + static auto testStack = + torch::class_<MyStackClass<std::string>>("my_classes", "MyStackClass") + // The following line registers the constructor of our MyStackClass + // class that takes a single `std::vector<std::string>` argument, + // i.e. it exposes the C++ method `MyStackClass(std::vector<std::string> init)`. + // Currently, we do not support registering overloaded + // constructors, so for now you can only `def()` one instance of + // `torch::init`. + .def(torch::init<std::vector<std::string>>()) + // The next line registers a stateless (i.e. no captures) C++ lambda + // function as a method. Note that a lambda function must take a + // `c10::intrusive_ptr<YourClass>` (or some const/ref version of that) + // as the first argument. Other arguments can be whatever you want. + .def("top", [](const c10::intrusive_ptr<MyStackClass<std::string>>& self) { + return self->stack_.back(); + }) + // The following four lines expose methods of the MyStackClass + // class as-is. `torch::class_` will automatically examine the + // argument and return types of the passed-in method pointers and + // expose these to Python and TorchScript accordingly. Finally, notice + // that we must take the *address* of the fully-qualified method name, + // i.e. use the unary `&` operator, due to C++ typing rules.
+ .def("push", &MyStackClass<std::string>::push) + .def("pop", &MyStackClass<std::string>::pop) + .def("clone", &MyStackClass<std::string>::clone) + .def("merge", &MyStackClass<std::string>::merge); + + + +Building the Example as a C++ Project With CMake +------------------------------------------------ + +Now, we're going to build the above C++ code with the `CMake +`_ build system. First, take all the C++ code +we've covered so far and place it in a file called ``class.cpp``. +Then, write a simple ``CMakeLists.txt`` file and place it in the +same directory. Here is what ``CMakeLists.txt`` should look like: + +.. code-block:: cmake + + cmake_minimum_required(VERSION 3.1 FATAL_ERROR) + project(custom_class) + + find_package(Torch REQUIRED) + + # Define our library target + add_library(custom_class SHARED class.cpp) + set(CMAKE_CXX_STANDARD 14) + # Link against LibTorch + target_link_libraries(custom_class "${TORCH_LIBRARIES}") + +Also, create a ``build`` directory. Your file tree should look like this:: + + custom_class_project/ + class.cpp + CMakeLists.txt + build/ + +Now, to build the project, go ahead and download the appropriate libtorch +binary from the `PyTorch website `_. Extract the +zip archive somewhere (within the project directory might be convenient) +and note the path you've extracted it to. Next, go ahead and invoke cmake and +then make to build the project: + +.. code-block:: shell + + $ cd build + $ cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch ..
+ -- The C compiler identification is GNU 7.3.1 + -- The CXX compiler identification is GNU 7.3.1 + -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc + -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc -- works + -- Detecting C compiler ABI info + -- Detecting C compiler ABI info - done + -- Detecting C compile features + -- Detecting C compile features - done + -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ + -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ -- works + -- Detecting CXX compiler ABI info + -- Detecting CXX compiler ABI info - done + -- Detecting CXX compile features + -- Detecting CXX compile features - done + -- Looking for pthread.h + -- Looking for pthread.h - found + -- Looking for pthread_create + -- Looking for pthread_create - not found + -- Looking for pthread_create in pthreads + -- Looking for pthread_create in pthreads - not found + -- Looking for pthread_create in pthread + -- Looking for pthread_create in pthread - found + -- Found Threads: TRUE + -- Found torch: /torchbind_tutorial/libtorch/lib/libtorch.so + -- Configuring done + -- Generating done + -- Build files have been written to: /torchbind_tutorial/build + $ make -j + Scanning dependencies of target custom_class + [ 50%] Building CXX object CMakeFiles/custom_class.dir/class.cpp.o + [100%] Linking CXX shared library libcustom_class.so + [100%] Built target custom_class + +What you'll find is there is now (among other things) a dynamic library +file present in the build directory. On Linux, this is probably named +``libcustom_class.so``. So the file tree should look like:: + + custom_class_project/ + class.cpp + CMakeLists.txt + build/ + libcustom_class.so + +Using the C++ Class from Python and TorchScript +----------------------------------------------- + +Now that we have our class and its registration compiled into an ``.so`` file, +we can load that `.so` into Python and try it out. 
Here's a script that +demonstrates that: + +.. code-block:: python + + import torch + + # `torch.classes.load_library()` allows you to pass the path to your .so file + # to load it in and make the custom C++ classes available to both Python and + # TorchScript + torch.classes.load_library("libcustom_class.so") + # You can query the loaded libraries like this: + print(torch.classes.loaded_libraries) + # prints {'/custom_class_project/build/libcustom_class.so'} + + # We can find and instantiate our custom C++ class in python by using the + # `torch.classes` namespace: + # + # This instantiation will invoke the MyStackClass(std::vector init) constructor + # we registered earlier + s = torch.classes.my_classes.MyStackClass(["foo", "bar"]) + + # We can call methods in Python + s.push("pushed") + assert s.pop() == "pushed" + + # Returning and passing instances of custom classes works as you'd expect + s2 = s.clone() + s.merge(s2) + for expected in ["bar", "foo", "bar", "foo"]: + assert s.pop() == expected + + # We can also use the class in TorchScript + # For now, we need to assign the class's type to a local in order to + # annotate the type on the TorchScript function. This may change + # in the future. 
+ MyStackClass = torch.classes.my_classes.MyStackClass + + @torch.jit.script + def do_stacks(s : MyStackClass): # We can pass a custom class instance to TorchScript + s2 = torch.classes.my_classes.MyStackClass(["hi", "mom"]) # We can instantiate the class + s2.merge(s) # We can call a method on the class + return s2.clone(), s2.top() # We can also return instances of the class + # from TorchScript function/methods + + stack, top = do_stacks(torch.classes.my_classes.MyStackClass(["wow"])) + assert top == "wow" + for expected in ["wow", "mom", "hi"]: + assert stack.pop() == expected + +Saving, Loading, and Running TorchScript Code Using Custom Classes +------------------------------------------------------------------ + +We can also use custom-registered C++ classes in a C++ process using +libtorch. As an example, let's define a simple ``nn.Module`` that +instantiates and calls a method on our MyStackClass class: + +.. code-block:: python + + import torch + + torch.classes.load_library('libcustom_class.so') + + class Foo(torch.nn.Module): + def __init__(self): + super().__init__() + + def forward(self, s : str) -> str: + stack = torch.classes.my_classes.MyStackClass(["hi", "mom"]) + return stack.pop() + s + + scripted_foo = torch.jit.script(Foo()) + print(scripted_foo.graph) + + scripted_foo.save('foo.pt') + +``foo.pt`` in our filesystem now contains the serialized TorchScript +program we've just defined. + +Now, we're going to define a new CMake project to show how you can load +this model and its required .so file. For a full treatment of how to do this, +please have a look at the `Loading a TorchScript Model in C++ Tutorial `_. 
+ +Similarly to before, let's create a file structure containing the following:: + + cpp_inference_example/ + infer.cpp + CMakeLists.txt + foo.pt + build/ + custom_class_project/ + class.cpp + CMakeLists.txt + build/ + +Notice we've copied over the serialized ``foo.pt`` file, as well as the source +tree from the ``custom_class_project`` above. We will be adding the +``custom_class_project`` as a dependency to this C++ project so that we can +build the custom class into the binary. + +Let's populate ``infer.cpp`` with the following: + +.. code-block:: cpp + + #include <torch/script.h> + + #include <iostream> + #include <memory> + + int main(int argc, const char* argv[]) { + torch::script::Module module; + try { + // Deserialize the ScriptModule from a file using torch::jit::load(). + module = torch::jit::load("foo.pt"); + } + catch (const c10::Error& e) { + std::cerr << "error loading the model\n"; + return -1; + } + + std::vector<c10::IValue> inputs = {"foobarbaz"}; + auto output = module.forward(inputs).toString(); + std::cout << output->string() << std::endl; + } + +And similarly let's define our CMakeLists.txt file: + +.. code-block:: cmake + + cmake_minimum_required(VERSION 3.1 FATAL_ERROR) + project(infer) + + find_package(Torch REQUIRED) + + add_subdirectory(custom_class_project) + + # Define our library target + add_executable(infer infer.cpp) + set(CMAKE_CXX_STANDARD 14) + # Link against LibTorch + target_link_libraries(infer "${TORCH_LIBRARIES}") + # This is where we link in our libcustom_class code, making our + # custom class available in our binary. + target_link_libraries(infer -Wl,--no-as-needed custom_class) + +You know the drill: ``cd build``, ``cmake``, and ``make``: + +.. code-block:: shell + + $ cd build + $ cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch ..
+ -- The C compiler identification is GNU 7.3.1 + -- The CXX compiler identification is GNU 7.3.1 + -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc + -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc -- works + -- Detecting C compiler ABI info + -- Detecting C compiler ABI info - done + -- Detecting C compile features + -- Detecting C compile features - done + -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ + -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ -- works + -- Detecting CXX compiler ABI info + -- Detecting CXX compiler ABI info - done + -- Detecting CXX compile features + -- Detecting CXX compile features - done + -- Looking for pthread.h + -- Looking for pthread.h - found + -- Looking for pthread_create + -- Looking for pthread_create - not found + -- Looking for pthread_create in pthreads + -- Looking for pthread_create in pthreads - not found + -- Looking for pthread_create in pthread + -- Looking for pthread_create in pthread - found + -- Found Threads: TRUE + -- Found torch: /local/miniconda3/lib/python3.7/site-packages/torch/lib/libtorch.so + -- Configuring done + -- Generating done + -- Build files have been written to: /cpp_inference_example/build + $ make -j + Scanning dependencies of target custom_class + [ 25%] Building CXX object custom_class_project/CMakeFiles/custom_class.dir/class.cpp.o + [ 50%] Linking CXX shared library libcustom_class.so + [ 50%] Built target custom_class + Scanning dependencies of target infer + [ 75%] Building CXX object CMakeFiles/infer.dir/infer.cpp.o + [100%] Linking CXX executable infer + [100%] Built target infer + +And now we can run our exciting C++ binary: + +.. code-block:: shell + + $ ./infer + momfoobarbaz + +Incredible! 
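To keep in mind the semantics of what we just ran, here is a plain-Python mirror of the C++ ``MyStackClass`` above. This is purely a mental model with no TorchScript or C++ involved; the class name ``PyStack`` is our own invention:

```python
class PyStack:
    """Plain-Python mirror of the C++ MyStackClass, for intuition only."""
    def __init__(self, init):
        self.stack_ = list(init)

    def push(self, x):
        self.stack_.append(x)

    def pop(self):
        return self.stack_.pop()

    def clone(self):
        return PyStack(self.stack_)

    def merge(self, other):
        for elem in other.stack_:
            self.push(elem)

# Mirrors what our C++ binary just computed: Foo's forward() builds a
# stack ["hi", "mom"], pops "mom", and prepends it to the input string.
s = PyStack(["hi", "mom"])
assert s.pop() + "foobarbaz" == "momfoobarbaz"
```

Tracing this sketch explains the ``momfoobarbaz`` output: ``pop()`` returns the last-pushed element, ``"mom"``.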
+ +Moving Custom Classes To/From IValues +------------------------------------- + +It's also possible that you may need to move custom classes into or out of +``IValue``s, such as when you take or return ``IValue``s from TorchScript methods +or you want to instantiate a custom class attribute in C++. For creating an +``IValue`` from a custom C++ class instance: + +- ``torch::make_custom_class<T>()`` provides an API similar to ``c10::intrusive_ptr<T>`` + in that it will take whatever set of arguments you provide to it, call the constructor + of T that matches that set of arguments, and wrap that instance up and return it. + However, instead of returning just a pointer to a custom class object, it returns + an ``IValue`` wrapping the object. You can then pass this ``IValue`` directly to + TorchScript. +- In the event that you already have an ``intrusive_ptr`` pointing to your class, you + can directly construct an IValue from it using the constructor ``IValue(intrusive_ptr<T>)``. + +For converting ``IValue``s back to custom classes: + +- ``IValue::toCustomClass<T>()`` will return an ``intrusive_ptr<T>`` pointing to the + custom class that the ``IValue`` contains. Internally, this function is checking + that ``T`` is registered as a custom class and that the ``IValue`` does in fact contain + a custom class. You can check whether the ``IValue`` contains a custom class manually by + calling ``isCustomClass()``. + +Defining Serialization/Deserialization Methods for Custom C++ Classes +--------------------------------------------------------------------- + +If you try to save a ``ScriptModule`` with a custom-bound C++ class as +an attribute, you'll get the following error: + +..
code-block:: python + + # export_attr.py + import torch + + torch.classes.load_library('libcustom_class.so') + + class Foo(torch.nn.Module): + def __init__(self): + super().__init__() + self.stack = torch.classes.my_classes.MyStackClass(["just", "testing"]) + + def forward(self, s : str) -> str: + return self.stack.pop() + s + + scripted_foo = torch.jit.script(Foo()) + + scripted_foo.save('foo.pt') + +.. code-block:: shell + + $ python export_attr.py + RuntimeError: Cannot serialize custom bound C++ class __torch__.torch.classes.my_classes.MyStackClass. Please define serialization methods via def_pickle for this class. (pushIValueImpl at ../torch/csrc/jit/pickler.cpp:128) + +This is because TorchScript cannot automatically figure out what information +to save from your C++ class. You must specify that manually. The way to do that +is to define ``__getstate__`` and ``__setstate__`` methods on the class using +the special ``def_pickle`` method on ``class_``. + +.. note:: + The semantics of ``__getstate__`` and ``__setstate__`` in TorchScript are + equivalent to that of the Python pickle module. You can + `read more `_ + about how we use these methods. + +Here is an example of how we can update the registration code for our +``MyStackClass`` class to include serialization methods: + +.. code-block:: cpp + + static auto testStack = + torch::class_<MyStackClass<std::string>>("my_classes", "MyStackClass") + .def(torch::init<std::vector<std::string>>()) + .def("top", [](const c10::intrusive_ptr<MyStackClass<std::string>>& self) { + return self->stack_.back(); + }) + .def("push", &MyStackClass<std::string>::push) + .def("pop", &MyStackClass<std::string>::pop) + .def("clone", &MyStackClass<std::string>::clone) + .def("merge", &MyStackClass<std::string>::merge) + // class_<>::def_pickle allows you to define the serialization + // and deserialization methods for your C++ class.
+ // Currently, we only support passing stateless lambda functions + // as arguments to def_pickle + .def_pickle( + // __getstate__ + // This function defines what data structure should be produced + // when we serialize an instance of this class. The function + // must take a single `self` argument, which is an intrusive_ptr + // to the instance of the object. The function can return + // any type that is supported as a return value of the TorchScript + // custom operator API. In this instance, we've chosen to return + // a std::vector<std::string> as the salient data to preserve + // from the class. + [](const c10::intrusive_ptr<MyStackClass<std::string>>& self) + -> std::vector<std::string> { + return self->stack_; + }, + // __setstate__ + // This function defines how to create a new instance of the C++ + // class when we are deserializing. The function must take a + // single argument of the same type as the return value of + // `__getstate__`. The function must return an intrusive_ptr + // to a new instance of the C++ class, initialized however + // you would like given the serialized state. + [](std::vector<std::string> state) + -> c10::intrusive_ptr<MyStackClass<std::string>> { + // A convenient way to instantiate an object and get an + // intrusive_ptr to it is via `make_intrusive`. We use + // that here to allocate an instance of MyStackClass<std::string> + // and call the single-argument std::vector<std::string> + // constructor with the serialized state. + return c10::make_intrusive<MyStackClass<std::string>>(std::move(state)); + }); + +.. note:: + We take a different approach from pybind11 in the pickle API. Whereas pybind11 + has a special function ``pybind11::pickle()`` which you pass into ``class_::def()``, + we have a separate method ``def_pickle`` for this purpose. This is because the + name ``torch::jit::pickle`` was already taken, and we didn't want to cause confusion. + +Once we have defined the (de)serialization behavior in this way, our script can +now run successfully: + +..
code-block:: python + + import torch + + torch.classes.load_library('libcustom_class.so') + + class Foo(torch.nn.Module): + def __init__(self): + super().__init__() + self.stack = torch.classes.my_classes.MyStackClass(["just", "testing"]) + + def forward(self, s : str) -> str: + return self.stack.pop() + s + + scripted_foo = torch.jit.script(Foo()) + + scripted_foo.save('foo.pt') + loaded = torch.jit.load('foo.pt') + + print(loaded.stack.pop()) + +.. code-block:: shell + + $ python ../export_attr.py + testing + +Defining Custom Operators that Take or Return Bound C++ Classes +--------------------------------------------------------------- + +Once you've defined a custom C++ class, you can also use that class +as an argument to or return value from a custom operator (i.e. free functions). Here's an +example of how to do that: + +.. code-block:: cpp + + c10::intrusive_ptr<MyStackClass<std::string>> manipulate_instance(const c10::intrusive_ptr<MyStackClass<std::string>>& instance) { + instance->pop(); + return instance; + } + + static auto instance_registry = torch::RegisterOperators().op( + torch::RegisterOperators::options() + .schema( + "foo::manipulate_instance(__torch__.torch.classes.my_classes.MyStackClass x) -> __torch__.torch.classes.my_classes.MyStackClass Y") + .catchAllKernel<decltype(manipulate_instance), &manipulate_instance>()); + +Refer to the `custom op tutorial `_ +for more details on the registration API. + +Once this is done, you can use the op like the following example: + +.. code-block:: python + + class TryCustomOp(torch.nn.Module): + def __init__(self): + super(TryCustomOp, self).__init__() + self.f = torch.classes.my_classes.MyStackClass(["foo", "bar"]) + + def forward(self): + return torch.ops.foo.manipulate_instance(self.f) + +.. note:: + + Registration of an operator that takes a C++ class as an argument requires that + the custom class has already been registered.
This is fine if your op is + registered after your class in a single compilation unit, however, if your + class is registered in a separate compilation unit from the op you will need + to enforce that dependency. One way to do this is to wrap the class registration + in a `Meyer's singleton `_, which can be + called from the compilation unit that does the operator registration. + +Conclusion +---------- + +This tutorial walked you through how to expose a C++ class to TorchScript +(and by extension Python), how to register its methods, how to use that +class from Python and TorchScript, and how to save and load code using +the class and run that code in a standalone C++ process. You are now ready +to extend your TorchScript models with C++ classes that interface with +third party C++ libraries or implement any other use case that requires the +lines between Python, TorchScript and C++ to blend smoothly. + +As always, if you run into any problems or have questions, you can use our +`forum `_ or `GitHub issues +`_ to get in touch. Also, our +`frequently asked questions (FAQ) page +`_ may have helpful information. diff --git a/recipes_source/recipes/torch_script_custom_ops.rst b/recipes_source/recipes/torch_script_custom_ops.rst new file mode 100644 index 00000000000..9127855878d --- /dev/null +++ b/recipes_source/recipes/torch_script_custom_ops.rst @@ -0,0 +1,1087 @@ +Extending TorchScript with Custom C++ Operators +=============================================== + +The PyTorch 1.0 release introduced a new programming model to PyTorch called +`TorchScript `_. TorchScript is a +subset of the Python programming language which can be parsed, compiled and +optimized by the TorchScript compiler. Further, compiled TorchScript models have +the option of being serialized into an on-disk file format, which you can +subsequently load and run from pure C++ (as well as Python) for inference. 
+ +TorchScript supports a large subset of operations provided by the ``torch`` +package, allowing you to express many kinds of complex models purely as a series +of tensor operations from PyTorch's "standard library". Nevertheless, there may +be times where you find yourself in need of extending TorchScript with a custom +C++ or CUDA function. While we recommend that you only resort to this option if +your idea cannot be expressed (efficiently enough) as a simple Python function, +we do provide a very friendly and simple interface for defining custom C++ and +CUDA kernels using `ATen `_, PyTorch's high +performance C++ tensor library. Once bound into TorchScript, you can embed these +custom kernels (or "ops") into your TorchScript model and execute them both in +Python and in their serialized form directly in C++. + +The following paragraphs give an example of writing a TorchScript custom op to +call into `OpenCV `_, a computer vision library written +in C++. We will discuss how to work with tensors in C++, how to efficiently +convert them to third party tensor formats (in this case, OpenCV ``Mat``s), how +to register your operator with the TorchScript runtime and finally how to +compile the operator and use it in Python and C++. + +Implementing the Custom Operator in C++ +--------------------------------------- + +For this tutorial, we'll be exposing the `warpPerspective +`_ +function, which applies a perspective transformation to an image, from OpenCV to +TorchScript as a custom operator. The first step is to write the implementation +of our custom operator in C++. Let's call the file for this implementation +``op.cpp`` and make it look like this: + +.. 
code-block:: cpp + + #include <opencv2/opencv.hpp> + #include <torch/script.h> + + torch::Tensor warp_perspective(torch::Tensor image, torch::Tensor warp) { + cv::Mat image_mat(/*rows=*/image.size(0), + /*cols=*/image.size(1), + /*type=*/CV_32FC1, + /*data=*/image.data_ptr<float>()); + cv::Mat warp_mat(/*rows=*/warp.size(0), + /*cols=*/warp.size(1), + /*type=*/CV_32FC1, + /*data=*/warp.data_ptr<float>()); + + cv::Mat output_mat; + cv::warpPerspective(image_mat, output_mat, warp_mat, /*dsize=*/{8, 8}); + + torch::Tensor output = torch::from_blob(output_mat.ptr<float>(), /*sizes=*/{8, 8}); + return output.clone(); + } + +The code for this operator is quite short. At the top of the file, we include +the OpenCV header file, ``opencv2/opencv.hpp``, alongside the ``torch/script.h`` +header which exposes all the necessary goodies from PyTorch's C++ API that we +need to write custom TorchScript operators. Our function ``warp_perspective`` +takes two arguments: an input ``image`` and the ``warp`` transformation matrix +we wish to apply to the image. The type of these inputs is ``torch::Tensor``, +PyTorch's tensor type in C++ (which is also the underlying type of all tensors +in Python). The return type of our ``warp_perspective`` function will also be a +``torch::Tensor``. + +.. tip:: + + See `this note `_ for + more information about ATen, the library that provides the ``Tensor`` class to + PyTorch. Further, `this tutorial + `_ describes how to + allocate and initialize new tensor objects in C++ (not required for this + operator). + +.. attention:: + + The TorchScript compiler understands a fixed number of types. Only these types + can be used as arguments to your custom operator. Currently these types are: + ``torch::Tensor``, ``torch::Scalar``, ``double``, ``int64_t`` and + ``std::vector``\ s of these types. Note that *only* ``double`` and *not* + ``float``, and *only* ``int64_t`` and *not* other integral types such as + ``int``, ``short`` or ``long`` are supported.
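A detail worth internalizing from the listing above is that the ``cv::Mat`` constructors and ``torch::from_blob`` wrap existing memory rather than copying it. That aliasing behavior, and why ``.clone()`` is needed before returning, can be sketched in plain Python with a ``bytearray`` and ``memoryview`` standing in for the tensor storage and the ``Mat`` (an analogy only, not the OpenCV/ATen mechanism):

```python
# The bytearray plays the role of the tensor's storage; the memoryview
# plays the role of a cv::Mat constructed over that storage.
storage = bytearray(b"\x00\x00\x00\x00")
view = memoryview(storage)   # aliases storage; no copy is made

view[0] = 255                # an "in-place op" through the view...
assert storage[0] == 255     # ...is visible in the original buffer

copy = bytes(view)           # the analogue of .clone(): an owning copy
view[0] = 7
assert copy[0] == 255        # the copy is unaffected by later writes
```

The owning ``copy`` is why the operator returns ``output.clone()``: once the ``cv::Mat`` goes out of scope, only a tensor that owns its own copy of the data remains valid.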
+
+Inside of our function, the first thing we need to do is convert our PyTorch
+tensors to OpenCV matrices, as OpenCV's ``warpPerspective`` expects ``cv::Mat``
+objects as inputs. Fortunately, there is a way to do this **without copying
+any** data. In the first few lines,
+
+.. code-block:: cpp
+
+  cv::Mat image_mat(/*rows=*/image.size(0),
+                    /*cols=*/image.size(1),
+                    /*type=*/CV_32FC1,
+                    /*data=*/image.data<float>());
+
+we are calling `this constructor
+`_
+of the OpenCV ``Mat`` class to convert our tensor to a ``Mat`` object. We pass
+it the number of rows and columns of the original ``image`` tensor, the datatype
+(which we'll fix as ``float32`` for this example), and finally a raw pointer to
+the underlying data -- a ``float*``. What is special about this constructor of
+the ``Mat`` class is that it does not copy the input data. Instead, it will
+simply reference this memory for all operations performed on the ``Mat``. If an
+in-place operation is performed on the ``image_mat``, this will be reflected in
+the original ``image`` tensor (and vice-versa). This allows us to call
+subsequent OpenCV routines with the library's native matrix type, even though
+we're actually storing the data in a PyTorch tensor. We repeat this procedure to
+convert the ``warp`` PyTorch tensor to the ``warp_mat`` OpenCV matrix:
+
+.. code-block:: cpp
+
+  cv::Mat warp_mat(/*rows=*/warp.size(0),
+                   /*cols=*/warp.size(1),
+                   /*type=*/CV_32FC1,
+                   /*data=*/warp.data<float>());
+
+Next, we are ready to call the OpenCV function we were so eager to use in
+TorchScript: ``warpPerspective``. For this, we pass the OpenCV function the
+``image_mat`` and ``warp_mat`` matrices, as well as an empty output matrix
+called ``output_mat``. We also specify the size ``dsize`` we want the output
+matrix (image) to be. It is hardcoded to ``8 x 8`` for this example:
+
+.. 
code-block:: cpp
+
+  cv::Mat output_mat;
+  cv::warpPerspective(image_mat, output_mat, warp_mat, /*dsize=*/{8, 8});
+
+The final step in our custom operator implementation is to convert the
+``output_mat`` back into a PyTorch tensor, so that we can further use it in
+PyTorch. This is strikingly similar to what we did earlier to convert in the
+other direction. In this case, PyTorch provides a ``torch::from_blob`` method. A
+*blob* in this case is intended to mean some opaque, flat pointer to memory that
+we want to interpret as a PyTorch tensor. The call to ``torch::from_blob`` looks
+like this:
+
+.. code-block:: cpp
+
+  torch::from_blob(output_mat.ptr<float>(), /*sizes=*/{8, 8})
+
+We use the ``.ptr<float>()`` method on the OpenCV ``Mat`` class to get a raw
+pointer to the underlying data (just like ``.data<float>()`` for the PyTorch
+tensor earlier). We also specify the output shape of the tensor, which we
+hardcoded as ``8 x 8``. The output of ``torch::from_blob`` is then a
+``torch::Tensor``, pointing to the memory owned by the OpenCV matrix.
+
+Before returning this tensor from our operator implementation, we must call
+``.clone()`` on the tensor to perform a memory copy of the underlying data. The
+reason for this is that ``torch::from_blob`` returns a tensor that does not own
+its data. At that point, the data is still owned by the OpenCV matrix. However,
+this OpenCV matrix will go out of scope and be deallocated at the end of the
+function. If we returned the ``output`` tensor as-is, it would point to invalid
+memory by the time we use it outside the function. Calling ``.clone()`` returns
+a new tensor with a copy of the original data that the new tensor owns itself.
+It is thus safe to return to the outside world.
+
+Registering the Custom Operator with TorchScript
+------------------------------------------------
+
+Now that we have implemented our custom operator in C++, we need to *register*
+it with the TorchScript runtime and compiler. 
This will allow the TorchScript +compiler to resolve references to our custom operator in TorchScript code. +Registration is very simple. For our case, we need to write: + +.. code-block:: cpp + + static auto registry = + torch::RegisterOperators("my_ops::warp_perspective", &warp_perspective); + +somewhere in the global scope of our ``op.cpp`` file. This creates a global +variable ``registry``, which will register our operator with TorchScript in its +constructor (i.e. exactly once per program). We specify the name of the +operator, and a pointer to its implementation (the function we wrote earlier). +The name consists of two parts: a *namespace* (``my_ops``) and a name for the +particular operator we are registering (``warp_perspective``). The namespace and +operator name are separated by two colons (``::``). + +.. tip:: + + If you want to register more than one operator, you can chain calls to + ``.op()`` after the constructor: + + .. code-block:: cpp + + static auto registry = + torch::RegisterOperators("my_ops::warp_perspective", &warp_perspective) + .op("my_ops::another_op", &another_op) + .op("my_ops::and_another_op", &and_another_op); + +Behind the scenes, ``RegisterOperators`` will perform a number of fairly +complicated C++ template metaprogramming magic tricks to infer the argument and +return value types of the function pointer we pass it (``&warp_perspective``). +This information is used to form a *function schema* for our operator. A +function schema is a structured representation of an operator -- a kind of +"signature" or "prototype" -- used by the TorchScript compiler to verify +correctness in TorchScript programs. + +Building the Custom Operator +---------------------------- + +Now that we have implemented our custom operator in C++ and written its +registration code, it is time to build the operator into a (shared) library that +we can load into Python for research and experimentation, or into C++ for +inference in a no-Python environment. 
There exist multiple ways to build our
+operator, using either pure CMake, or Python alternatives like ``setuptools``.
+For brevity, the paragraphs below only discuss the CMake approach. The appendix
+of this tutorial dives into the Python based alternatives.
+
+Building with CMake
+*******************
+
+To build our custom operator into a shared library using the `CMake
+`_ build system, we need to write a short ``CMakeLists.txt``
+file and place it with our previous ``op.cpp`` file. For this, let's agree on a
+directory structure that looks like this::
+
+  warp-perspective/
+    op.cpp
+    CMakeLists.txt
+
+Also, make sure to grab the latest version of the LibTorch distribution, which
+packages PyTorch's C++ libraries and CMake build files, from `pytorch.org
+`_. Place the unzipped distribution
+somewhere accessible in your file system. The following paragraphs will refer to
+that location as ``/path/to/libtorch``. The contents of our ``CMakeLists.txt``
+file should then be the following:
+
+.. code-block:: cmake
+
+  cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
+  project(warp_perspective)
+
+  find_package(Torch REQUIRED)
+  find_package(OpenCV REQUIRED)
+
+  # Define our library target
+  add_library(warp_perspective SHARED op.cpp)
+  # Enable C++11
+  target_compile_features(warp_perspective PRIVATE cxx_range_for)
+  # Link against LibTorch
+  target_link_libraries(warp_perspective "${TORCH_LIBRARIES}")
+  # Link against OpenCV
+  target_link_libraries(warp_perspective opencv_core opencv_imgproc)
+
+.. warning::
+
+  This setup makes some assumptions about the build environment, particularly
+  as pertains to the installation of OpenCV. The above ``CMakeLists.txt`` file
+  was tested inside a Docker container running Ubuntu Xenial with
+  ``libopencv-dev`` installed via ``apt``. 
If it does not work for you and you + feel stuck, please use the ``Dockerfile`` in the `accompanying tutorial + repository `_ to + build an isolated, reproducible environment in which to play around with the + code from this tutorial. If you run into further troubles, please file an + issue in the tutorial repository or post a question in `our forum + `_. + +To now build our operator, we can run the following commands from our +``warp_perspective`` folder: + +.. code-block:: shell + + $ mkdir build + $ cd build + $ cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch .. + -- The C compiler identification is GNU 5.4.0 + -- The CXX compiler identification is GNU 5.4.0 + -- Check for working C compiler: /usr/bin/cc + -- Check for working C compiler: /usr/bin/cc -- works + -- Detecting C compiler ABI info + -- Detecting C compiler ABI info - done + -- Detecting C compile features + -- Detecting C compile features - done + -- Check for working CXX compiler: /usr/bin/c++ + -- Check for working CXX compiler: /usr/bin/c++ -- works + -- Detecting CXX compiler ABI info + -- Detecting CXX compiler ABI info - done + -- Detecting CXX compile features + -- Detecting CXX compile features - done + -- Looking for pthread.h + -- Looking for pthread.h - found + -- Looking for pthread_create + -- Looking for pthread_create - not found + -- Looking for pthread_create in pthreads + -- Looking for pthread_create in pthreads - not found + -- Looking for pthread_create in pthread + -- Looking for pthread_create in pthread - found + -- Found Threads: TRUE + -- Found torch: /libtorch/lib/libtorch.so + -- Configuring done + -- Generating done + -- Build files have been written to: /warp_perspective/build + $ make -j + Scanning dependencies of target warp_perspective + [ 50%] Building CXX object CMakeFiles/warp_perspective.dir/op.cpp.o + [100%] Linking CXX shared library libwarp_perspective.so + [100%] Built target warp_perspective + +which will place a ``libwarp_perspective.so`` shared library file in 
the
+``build`` folder. In the ``cmake`` command above, you should replace
+``/path/to/libtorch`` with the path to your unzipped LibTorch distribution.
+
+We will explore how to use and call our operator in detail further below, but to
+get an early sensation of success, we can try running the following code in
+Python:
+
+.. code-block:: python
+
+  >>> import torch
+  >>> torch.ops.load_library("/path/to/libwarp_perspective.so")
+  >>> print(torch.ops.my_ops.warp_perspective)
+
+Here, ``/path/to/libwarp_perspective.so`` should be a relative or absolute path
+to the ``libwarp_perspective.so`` shared library we just built. If all goes
+well, this should print something like
+
+.. code-block:: python
+
+  <built-in method warp_perspective of PyCapsule object at 0x7f618fc6fa50>
+
+which is the Python function we will later use to invoke our custom operator.
+
+Using the TorchScript Custom Operator in Python
+-----------------------------------------------
+
+Once our custom operator is built into a shared library we are ready to use
+this operator in our TorchScript models in Python. There are two parts to this:
+first loading the operator into Python, and second using the operator in
+TorchScript code.
+
+You already saw how to import your operator into Python:
+``torch.ops.load_library()``. This function takes the path to a shared library
+containing custom operators, and loads it into the current process. Loading the
+shared library will also execute the constructor of the global
+``RegisterOperators`` object we placed into our custom operator implementation
+file. This will register our custom operator with the TorchScript compiler and
+allow us to use that operator in TorchScript code.
+
+You can refer to your loaded operator as ``torch.ops.<namespace>.<function>``,
+where ``<namespace>`` is the namespace part of your operator name, and
+``<function>`` the function name of your operator. For the operator we wrote
+above, the namespace was ``my_ops`` and the function name ``warp_perspective``,
+which means our operator is available as ``torch.ops.my_ops.warp_perspective``. 
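The ``torch.ops.<namespace>.<function>`` access pattern can be pictured as attribute lookup over a registry of ``namespace::name`` qualified strings. Below is a minimal, purely illustrative stdlib-Python sketch; the ``_Ops`` classes and ``registry`` dict are invented for this example and are not PyTorch internals:

```python
# Toy model of attribute-based operator lookup, loosely mimicking how
# torch.ops.<namespace>.<function> resolves a "namespace::name" string.
class _OpNamespace:
    def __init__(self, name, registry):
        self._name = name
        self._registry = registry

    def __getattr__(self, op_name):
        # Build the schema-style qualified name, e.g. "my_ops::warp_perspective".
        qualified = f"{self._name}::{op_name}"
        try:
            return self._registry[qualified]
        except KeyError:
            raise AttributeError(f"no operator {qualified!r} registered")

class _Ops:
    def __init__(self, registry):
        self._registry = registry

    def __getattr__(self, namespace):
        return _OpNamespace(namespace, self._registry)

# Registration step (cf. torch::RegisterOperators in C++):
registry = {"my_ops::warp_perspective": lambda image, warp: "warped!"}
ops = _Ops(registry)

print(ops.my_ops.warp_perspective("img", "mat"))  # -> warped!
```

The real mechanism is richer (lazy C++-side lookup, schema checking), but the two-colon qualified name and the namespace-then-function attribute chain work just like this.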
+While this function can be used in scripted or traced TorchScript modules, we +can also just use it in vanilla eager PyTorch and pass it regular PyTorch +tensors: + +.. code-block:: python + + >>> import torch + >>> torch.ops.load_library("libwarp_perspective.so") + >>> torch.ops.my_ops.warp_perspective(torch.randn(32, 32), torch.rand(3, 3)) + tensor([[0.0000, 0.3218, 0.4611, ..., 0.4636, 0.4636, 0.4636], + [0.3746, 0.0978, 0.5005, ..., 0.4636, 0.4636, 0.4636], + [0.3245, 0.0169, 0.0000, ..., 0.4458, 0.4458, 0.4458], + ..., + [0.1862, 0.1862, 0.1692, ..., 0.0000, 0.0000, 0.0000], + [0.1862, 0.1862, 0.1692, ..., 0.0000, 0.0000, 0.0000], + [0.1862, 0.1862, 0.1692, ..., 0.0000, 0.0000, 0.0000]]) + + +.. note:: + + What happens behind the scenes is that the first time you access + ``torch.ops.namespace.function`` in Python, the TorchScript compiler (in C++ + land) will see if a function ``namespace::function`` has been registered, and + if so, return a Python handle to this function that we can subsequently use to + call into our C++ operator implementation from Python. This is one noteworthy + difference between TorchScript custom operators and C++ extensions: C++ + extensions are bound manually using pybind11, while TorchScript custom ops are + bound on the fly by PyTorch itself. Pybind11 gives you more flexibility with + regards to what types and classes you can bind into Python and is thus + recommended for purely eager code, but it is not supported for TorchScript + ops. + +From here on, you can use your custom operator in scripted or traced code just +as you would other functions from the ``torch`` package. In fact, "standard +library" functions like ``torch.matmul`` go through largely the same +registration path as custom operators, which makes custom operators really +first-class citizens when it comes to how and where they can be used in +TorchScript. 
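Before moving on, it may help to see in miniature what "recording a trace" means, and why the scripting section below makes a point of control flow. The following is a toy, stdlib-only sketch; the ``naive_trace`` helper is invented for illustration and is not how ``torch.jit.trace`` is implemented:

```python
# Toy illustration (plain Python, NOT torch.jit) of tracing: run the
# function once on example inputs, and replay the recorded operations.
def compute(x, flag):
    # Data-dependent control flow, like the bool(x[0][0] == 42) check
    # used in the scripting example later in this tutorial.
    z = 5 if flag else 10
    return [xi + z for xi in x]      # stand-in for x.matmul(y) + z

def naive_trace(fn, example_x, example_flag):
    # The branch is decided *now*, using the example input...
    frozen_z = 5 if example_flag else 10
    def traced(x, flag):             # flag is ignored: the trace only
        return [xi + frozen_z for xi in x]  # replays recorded operations
    return traced

traced = naive_trace(compute, [1, 2], True)  # example took the z=5 branch
print(compute([1, 2], False))  # eager: z=10 branch -> [11, 12]
print(traced([1, 2], False))   # traced: frozen z=5 branch -> [6, 7]
```

This is exactly the limitation tracing has and scripting does not: a trace is a frozen sequence of operations, so branches taken by the example inputs are baked in.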
+
+Using the Custom Operator with Tracing
+**************************************
+
+Let's start by embedding our operator in a traced function. Recall that for
+tracing, we start with some vanilla PyTorch code:
+
+.. code-block:: python
+
+  def compute(x, y, z):
+      return x.matmul(y) + torch.relu(z)
+
+and then call ``torch.jit.trace`` on it. We further pass ``torch.jit.trace``
+some example inputs, which it will forward to our implementation to record the
+sequence of operations that occur as the inputs flow through it. The result of
+this is effectively a "frozen" version of the eager PyTorch program, which the
+TorchScript compiler can further analyze, optimize and serialize:
+
+.. code-block:: python
+
+  >>> inputs = [torch.randn(4, 8), torch.randn(8, 5), torch.randn(4, 5)]
+  >>> trace = torch.jit.trace(compute, inputs)
+  >>> print(trace.graph)
+  graph(%x : Float(4, 8)
+        %y : Float(8, 5)
+        %z : Float(4, 5)) {
+    %3 : Float(4, 5) = aten::matmul(%x, %y)
+    %4 : Float(4, 5) = aten::relu(%z)
+    %5 : int = prim::Constant[value=1]()
+    %6 : Float(4, 5) = aten::add(%3, %4, %5)
+    return (%6);
+  }
+
+Now, the exciting revelation is that we can simply drop our custom operator into
+our PyTorch trace as if it were ``torch.relu`` or any other ``torch`` function:
+
+.. code-block:: python
+
+  torch.ops.load_library("libwarp_perspective.so")
+
+  def compute(x, y, z):
+      x = torch.ops.my_ops.warp_perspective(x, torch.eye(3))
+      return x.matmul(y) + torch.relu(z)
+
+and then trace it as before:
+
+.. 
code-block:: python + + >>> inputs = [torch.randn(4, 8), torch.randn(8, 5), torch.randn(8, 5)] + >>> trace = torch.jit.trace(compute, inputs) + >>> print(trace.graph) + graph(%x.1 : Float(4, 8) + %y : Float(8, 5) + %z : Float(8, 5)) { + %3 : int = prim::Constant[value=3]() + %4 : int = prim::Constant[value=6]() + %5 : int = prim::Constant[value=0]() + %6 : int[] = prim::Constant[value=[0, -1]]() + %7 : Float(3, 3) = aten::eye(%3, %4, %5, %6) + %x : Float(8, 8) = my_ops::warp_perspective(%x.1, %7) + %11 : Float(8, 5) = aten::matmul(%x, %y) + %12 : Float(8, 5) = aten::relu(%z) + %13 : int = prim::Constant[value=1]() + %14 : Float(8, 5) = aten::add(%11, %12, %13) + return (%14); + } + +Integrating TorchScript custom ops into traced PyTorch code is as easy as this! + +Using the Custom Operator with Script +************************************* + +Besides tracing, another way to arrive at a TorchScript representation of a +PyTorch program is to directly write your code *in* TorchScript. TorchScript is +largely a subset of the Python language, with some restrictions that make it +easier for the TorchScript compiler to reason about programs. You turn your +regular PyTorch code into TorchScript by annotating it with +``@torch.jit.script`` for free functions and ``@torch.jit.script_method`` for +methods in a class (which must also derive from ``torch.jit.ScriptModule``). See +`here `_ for more details on +TorchScript annotations. + +One particular reason to use TorchScript instead of tracing is that tracing is +unable to capture control flow in PyTorch code. As such, let us consider this +function which does use control flow: + +.. code-block:: python + + def compute(x, y): + if bool(x[0][0] == 42): + z = 5 + else: + z = 10 + return x.matmul(y) + z + +To convert this function from vanilla PyTorch to TorchScript, we annotate it +with ``@torch.jit.script``: + +.. 
code-block:: python
+
+  @torch.jit.script
+  def compute(x, y):
+      if bool(x[0][0] == 42):
+          z = 5
+      else:
+          z = 10
+      return x.matmul(y) + z
+
+This will just-in-time compile the ``compute`` function into a graph
+representation, which we can inspect in the ``compute.graph`` property:
+
+.. code-block:: python
+
+  >>> compute.graph
+  graph(%x : Dynamic
+        %y : Dynamic) {
+    %14 : int = prim::Constant[value=1]()
+    %2 : int = prim::Constant[value=0]()
+    %7 : int = prim::Constant[value=42]()
+    %z.1 : int = prim::Constant[value=5]()
+    %z.2 : int = prim::Constant[value=10]()
+    %4 : Dynamic = aten::select(%x, %2, %2)
+    %6 : Dynamic = aten::select(%4, %2, %2)
+    %8 : Dynamic = aten::eq(%6, %7)
+    %9 : bool = prim::TensorToBool(%8)
+    %z : int = prim::If(%9)
+      block0() {
+        -> (%z.1)
+      }
+      block1() {
+        -> (%z.2)
+      }
+    %13 : Dynamic = aten::matmul(%x, %y)
+    %15 : Dynamic = aten::add(%13, %z, %14)
+    return (%15);
+  }
+
+And now, just like before, we can use our custom operator like any other
+function inside of our script code:
+
+.. code-block:: python
+
+  torch.ops.load_library("libwarp_perspective.so")
+
+  @torch.jit.script
+  def compute(x, y):
+      if bool(x[0][0] == 42):
+          z = 5
+      else:
+          z = 10
+      x = torch.ops.my_ops.warp_perspective(x, torch.eye(3))
+      return x.matmul(y) + z
+
+When the TorchScript compiler sees the reference to
+``torch.ops.my_ops.warp_perspective``, it will find the implementation we
+registered via the ``RegisterOperators`` object in C++, and compile it into its
+graph representation:
+
+.. 
code-block:: python + + >>> compute.graph + graph(%x.1 : Dynamic + %y : Dynamic) { + %20 : int = prim::Constant[value=1]() + %16 : int[] = prim::Constant[value=[0, -1]]() + %14 : int = prim::Constant[value=6]() + %2 : int = prim::Constant[value=0]() + %7 : int = prim::Constant[value=42]() + %z.1 : int = prim::Constant[value=5]() + %z.2 : int = prim::Constant[value=10]() + %13 : int = prim::Constant[value=3]() + %4 : Dynamic = aten::select(%x.1, %2, %2) + %6 : Dynamic = aten::select(%4, %2, %2) + %8 : Dynamic = aten::eq(%6, %7) + %9 : bool = prim::TensorToBool(%8) + %z : int = prim::If(%9) + block0() { + -> (%z.1) + } + block1() { + -> (%z.2) + } + %17 : Dynamic = aten::eye(%13, %14, %2, %16) + %x : Dynamic = my_ops::warp_perspective(%x.1, %17) + %19 : Dynamic = aten::matmul(%x, %y) + %21 : Dynamic = aten::add(%19, %z, %20) + return (%21); + } + +Notice in particular the reference to ``my_ops::warp_perspective`` at the end of +the graph. + +.. attention:: + + The TorchScript graph representation is still subject to change. Do not rely + on it looking like this. + +And that's really it when it comes to using our custom operator in Python. In +short, you import the library containing your operator(s) using +``torch.ops.load_library``, and call your custom op like any other ``torch`` +operator from your traced or scripted TorchScript code. + +Using the TorchScript Custom Operator in C++ +-------------------------------------------- + +One useful feature of TorchScript is the ability to serialize a model into an +on-disk file. This file can be sent over the wire, stored in a file system or, +more importantly, be dynamically deserialized and executed without needing to +keep the original source code around. This is possible in Python, but also in +C++. For this, PyTorch provides `a pure C++ API `_ +for deserializing as well as executing TorchScript models. 
If you haven't yet,
+please read `the tutorial on loading and running serialized TorchScript models
+in C++ `_, on which the
+next few paragraphs will build.
+
+In short, custom operators can be executed just like regular ``torch`` operators
+even when deserialized from a file and run in C++. The only requirement for this
+is to link the custom operator shared library we built earlier with the C++
+application in which we execute the model. In Python, this worked by simply
+calling ``torch.ops.load_library``. In C++, you need to link the shared library
+with your main application in whatever build system you are using. The following
+example will showcase this using CMake.
+
+.. note::
+
+  Technically, you can also dynamically load the shared library into your C++
+  application at runtime in much the same way we did it in Python. On Linux,
+  `you can do this with dlopen
+  `_. There exist
+  equivalents on other platforms.
+
+Building on the C++ execution tutorial linked above, let's start with a minimal
+C++ application in one file, ``main.cpp`` in a different folder from our
+custom operator, that loads and executes a serialized TorchScript model:
+
+.. code-block:: cpp
+
+  #include <torch/script.h> // One-stop header.
+
+  #include <iostream>
+  #include <memory>
+
+
+  int main(int argc, const char* argv[]) {
+    if (argc != 2) {
+      std::cerr << "usage: example-app <path-to-exported-script-module>\n";
+      return -1;
+    }
+
+    // Deserialize the ScriptModule from a file using torch::jit::load().
+    std::shared_ptr<torch::jit::script::Module> module = torch::jit::load(argv[1]);
+
+    std::vector<torch::jit::IValue> inputs;
+    inputs.push_back(torch::randn({4, 8}));
+    inputs.push_back(torch::randn({8, 5}));
+
+    torch::Tensor output = module->forward(std::move(inputs)).toTensor();
+
+    std::cout << output << std::endl;
+  }
+
+Along with a small ``CMakeLists.txt`` file:
+
+.. 
code-block:: cmake
+
+  cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
+  project(example_app)
+
+  find_package(Torch REQUIRED)
+
+  add_executable(example_app main.cpp)
+  target_link_libraries(example_app "${TORCH_LIBRARIES}")
+  target_compile_features(example_app PRIVATE cxx_range_for)
+
+At this point, we should be able to build the application:
+
+.. code-block:: shell
+
+  $ mkdir build
+  $ cd build
+  $ cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch ..
+  -- The C compiler identification is GNU 5.4.0
+  -- The CXX compiler identification is GNU 5.4.0
+  -- Check for working C compiler: /usr/bin/cc
+  -- Check for working C compiler: /usr/bin/cc -- works
+  -- Detecting C compiler ABI info
+  -- Detecting C compiler ABI info - done
+  -- Detecting C compile features
+  -- Detecting C compile features - done
+  -- Check for working CXX compiler: /usr/bin/c++
+  -- Check for working CXX compiler: /usr/bin/c++ -- works
+  -- Detecting CXX compiler ABI info
+  -- Detecting CXX compiler ABI info - done
+  -- Detecting CXX compile features
+  -- Detecting CXX compile features - done
+  -- Looking for pthread.h
+  -- Looking for pthread.h - found
+  -- Looking for pthread_create
+  -- Looking for pthread_create - not found
+  -- Looking for pthread_create in pthreads
+  -- Looking for pthread_create in pthreads - not found
+  -- Looking for pthread_create in pthread
+  -- Looking for pthread_create in pthread - found
+  -- Found Threads: TRUE
+  -- Found torch: /libtorch/lib/libtorch.so
+  -- Configuring done
+  -- Generating done
+  -- Build files have been written to: /example_app/build
+  $ make -j
+  Scanning dependencies of target example_app
+  [ 50%] Building CXX object CMakeFiles/example_app.dir/main.cpp.o
+  [100%] Linking CXX executable example_app
+  [100%] Built target example_app
+
+And run it without passing a model just yet:
+
+.. code-block:: shell
+
+  $ ./example_app
+  usage: example-app <path-to-exported-script-module>
+
+Next, let's serialize the script function we wrote earlier that uses our custom
+operator:
+
+.. 
code-block:: python + + torch.ops.load_library("libwarp_perspective.so") + + @torch.jit.script + def compute(x, y): + if bool(x[0][0] == 42): + z = 5 + else: + z = 10 + x = torch.ops.my_ops.warp_perspective(x, torch.eye(3)) + return x.matmul(y) + z + + compute.save("example.pt") + +The last line will serialize the script function into a file called +"example.pt". If we then pass this serialized model to our C++ application, we +can run it straight away: + +.. code-block:: cpp + + $ ./example_app example.pt + terminate called after throwing an instance of 'torch::jit::script::ErrorReport' + what(): + Schema not found for node. File a bug report. + Node: %16 : Dynamic = my_ops::warp_perspective(%0, %19) + +Or maybe not. Maybe not just yet. Of course! We haven't linked the custom +operator library with our application yet. Let's do this right now, and to do it +properly let's update our file organization slightly, to look like this:: + + example_app/ + CMakeLists.txt + main.cpp + warp_perspective/ + CMakeLists.txt + op.cpp + +This will allow us to add the ``warp_perspective`` library CMake target as a +subdirectory of our application target. The top level ``CMakeLists.txt`` in the +``example_app`` folder should look like this: + +.. code-block:: cmake + + cmake_minimum_required(VERSION 3.1 FATAL_ERROR) + project(example_app) + + find_package(Torch REQUIRED) + + add_subdirectory(warp_perspective) + + add_executable(example_app main.cpp) + target_link_libraries(example_app "${TORCH_LIBRARIES}") + target_link_libraries(example_app -Wl,--no-as-needed warp_perspective) + target_compile_features(example_app PRIVATE cxx_range_for) + +This basic CMake configuration looks much like before, except that we add the +``warp_perspective`` CMake build as a subdirectory. Once its CMake code runs, we +link our ``example_app`` application with the ``warp_perspective`` shared +library. + +.. 
attention::
+
+  There is one crucial detail embedded in the above example: The
+  ``-Wl,--no-as-needed`` prefix to the ``warp_perspective`` link line. This is
+  required because we will not actually be calling any function from the
+  ``warp_perspective`` shared library in our application code. We only need the
+  global ``RegisterOperators`` object's constructor to run. Inconveniently, this
+  confuses the linker and makes it think it can just skip linking against the
+  library altogether. On Linux, the ``-Wl,--no-as-needed`` flag forces the link
+  to happen (NB: this flag is specific to Linux!). There are other workarounds
+  for this. The simplest is to define *some function* in the operator library
+  that you need to call from the main application. This could be as simple as a
+  function ``void init();`` declared in some header, which is then defined as
+  ``void init() { }`` in the operator library. Calling this ``init()`` function
+  in the main application will give the linker the impression that this is a
+  library worth linking against. Unfortunately, this is outside of our control,
+  and we would rather let you know the reason and the simple workaround for this
+  than handing you some opaque macro to plop in your code.
+
+Now, since we find the ``Torch`` package at the top level, the
+``CMakeLists.txt`` file in the ``warp_perspective`` subdirectory can be
+shortened a bit. It should look like this:
+
+.. code-block:: cmake
+
+  find_package(OpenCV REQUIRED)
+  add_library(warp_perspective SHARED op.cpp)
+  target_compile_features(warp_perspective PRIVATE cxx_range_for)
+  target_link_libraries(warp_perspective PRIVATE "${TORCH_LIBRARIES}")
+  target_link_libraries(warp_perspective PRIVATE opencv_core opencv_imgproc)
+
+Let's re-build our example app, which will also link with the custom operator
+library. In the top level ``example_app`` directory:
+
+.. code-block:: shell
+
+  $ mkdir build
+  $ cd build
+  $ cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch .. 
+ -- The C compiler identification is GNU 5.4.0 + -- The CXX compiler identification is GNU 5.4.0 + -- Check for working C compiler: /usr/bin/cc + -- Check for working C compiler: /usr/bin/cc -- works + -- Detecting C compiler ABI info + -- Detecting C compiler ABI info - done + -- Detecting C compile features + -- Detecting C compile features - done + -- Check for working CXX compiler: /usr/bin/c++ + -- Check for working CXX compiler: /usr/bin/c++ -- works + -- Detecting CXX compiler ABI info + -- Detecting CXX compiler ABI info - done + -- Detecting CXX compile features + -- Detecting CXX compile features - done + -- Looking for pthread.h + -- Looking for pthread.h - found + -- Looking for pthread_create + -- Looking for pthread_create - not found + -- Looking for pthread_create in pthreads + -- Looking for pthread_create in pthreads - not found + -- Looking for pthread_create in pthread + -- Looking for pthread_create in pthread - found + -- Found Threads: TRUE + -- Found torch: /libtorch/lib/libtorch.so + -- Configuring done + -- Generating done + -- Build files have been written to: /warp_perspective/example_app/build + $ make -j + Scanning dependencies of target warp_perspective + [ 25%] Building CXX object warp_perspective/CMakeFiles/warp_perspective.dir/op.cpp.o + [ 50%] Linking CXX shared library libwarp_perspective.so + [ 50%] Built target warp_perspective + Scanning dependencies of target example_app + [ 75%] Building CXX object CMakeFiles/example_app.dir/main.cpp.o + [100%] Linking CXX executable example_app + [100%] Built target example_app + +If we now run the ``example_app`` binary and hand it our serialized model, we +should arrive at a happy ending: + +.. 
code-block:: shell
+
+  $ ./example_app example.pt
+  11.4125  5.8262  9.5345  8.6111 12.3997
+   7.4683 13.5969  9.0850 11.0698  9.4008
+   7.4597 15.0926 12.5727  8.9319  9.0666
+   9.4834 11.1747  9.0162 10.9521  8.6269
+  10.0000 10.0000 10.0000 10.0000 10.0000
+  10.0000 10.0000 10.0000 10.0000 10.0000
+  10.0000 10.0000 10.0000 10.0000 10.0000
+  10.0000 10.0000 10.0000 10.0000 10.0000
+  [ Variable[CPUFloatType]{8,5} ]
+
+Success! You are now ready to run inference.
+
+Conclusion
+----------
+
+This tutorial walked you through how to implement a custom TorchScript operator
+in C++, how to build it into a shared library, how to use it in Python to define
+TorchScript models and lastly how to load it into a C++ application for
+inference workloads. You are now ready to extend your TorchScript models with
+C++ operators that interface with third party C++ libraries, write custom high
+performance CUDA kernels, or implement any other use case that requires the
+lines between Python, TorchScript and C++ to blend smoothly.
+
+As always, if you run into any problems or have questions, you can use our
+`forum `_ or `GitHub issues
+`_ to get in touch. Also, our
+`frequently asked questions (FAQ) page
+`_ may have helpful information.
+
+Appendix A: More Ways of Building Custom Operators
+--------------------------------------------------
+
+The section "Building the Custom Operator" explained how to build a custom
+operator into a shared library using CMake. This appendix outlines two further
+approaches for compilation. Both of them use Python as the "driver" or
+"interface" to the compilation process. Also, both re-use the `existing
+infrastructure `_ PyTorch
+provides for `*C++ extensions* `_, which are the
+vanilla (eager) PyTorch equivalent of TorchScript custom operators that rely on
+`pybind11 `_ for "explicit" binding of
+functions from C++ into Python. 
+
+The first approach uses C++ extensions' `convenient just-in-time (JIT)
+compilation interface
+`_
+to compile your code in the background of your PyTorch script the first time you
+run it. The second approach relies on the venerable ``setuptools`` package and
+involves writing a separate ``setup.py`` file. This allows more advanced
+configuration as well as integration with other ``setuptools``-based projects.
+We will explore both approaches in detail below.
+
+Building with JIT compilation
+*****************************
+
+The JIT compilation feature provided by the PyTorch C++ extension toolkit allows
+embedding the compilation of your custom operator directly into your Python
+code, e.g. at the top of your training script.
+
+.. note::
+
+  "JIT compilation" here has nothing to do with the JIT compilation taking place
+  in the TorchScript compiler to optimize your program. It simply means that
+  your custom operator C++ code will be compiled in a folder under your system's
+  ``/tmp`` directory the first time you import it, as if you had compiled it
+  yourself beforehand.
+
+This JIT compilation feature comes in two flavors. In the first, you still keep
+your operator implementation in a separate file (``op.cpp``), and then use
+``torch.utils.cpp_extension.load()`` to compile your extension. Usually, this
+function will return the Python module exposing your C++ extension. However,
+since we are not compiling our custom operator into its own Python module, we
+only want to compile a plain shared library. Fortunately,
+``torch.utils.cpp_extension.load()`` has an argument ``is_python_module`` which
+we can set to ``False`` to indicate that we are only interested in building a
+shared library and not a Python module. ``torch.utils.cpp_extension.load()``
+will then compile and also load the shared library into the current process,
+just like ``torch.ops.load_library`` did before:
+
+.. 
code-block:: python + + import torch.utils.cpp_extension + + torch.utils.cpp_extension.load( + name="warp_perspective", + sources=["op.cpp"], + extra_ldflags=["-lopencv_core", "-lopencv_imgproc"], + is_python_module=False, + verbose=True + ) + + print(torch.ops.my_ops.warp_perspective) + +This should approximately print: + +.. code-block:: python + + + +The second flavor of JIT compilation allows you to pass the source code for your +custom TorchScript operator as a string. For this, use +``torch.utils.cpp_extension.load_inline``: + +.. code-block:: python + + import torch + import torch.utils.cpp_extension + + op_source = """ + #include + #include + + torch::Tensor warp_perspective(torch::Tensor image, torch::Tensor warp) { + cv::Mat image_mat(/*rows=*/image.size(0), + /*cols=*/image.size(1), + /*type=*/CV_32FC1, + /*data=*/image.data()); + cv::Mat warp_mat(/*rows=*/warp.size(0), + /*cols=*/warp.size(1), + /*type=*/CV_32FC1, + /*data=*/warp.data()); + + cv::Mat output_mat; + cv::warpPerspective(image_mat, output_mat, warp_mat, /*dsize=*/{64, 64}); + + torch::Tensor output = + torch::from_blob(output_mat.ptr(), /*sizes=*/{64, 64}); + return output.clone(); + } + + static auto registry = + torch::RegisterOperators("my_ops::warp_perspective", &warp_perspective); + """ + + torch.utils.cpp_extension.load_inline( + name="warp_perspective", + cpp_sources=op_source, + extra_ldflags=["-lopencv_core", "-lopencv_imgproc"], + is_python_module=False, + verbose=True, + ) + + print(torch.ops.my_ops.warp_perspective) + +Naturally, it is best practice to only use +``torch.utils.cpp_extension.load_inline`` if your source code is reasonably +short. + +Note that if you're using this in a Jupyter Notebook, you should not execute +the cell with the registration multiple times because each execution registers +a new library and re-registers the custom operator. If you need to re-execute it, +please restart the Python kernel of your notebook beforehand. 
+ +Building with Setuptools +************************ + +The second approach to building our custom operator exclusively from Python is +to use ``setuptools``. This has the advantage that ``setuptools`` has a quite +powerful and extensive interface for building Python modules written in C++. +However, since ``setuptools`` is really intended for building Python modules and +not plain shared libraries (which do not have the necessary entry points Python +expects from a module), this route can be slightly quirky. That said, all you +need is a ``setup.py`` file in place of the ``CMakeLists.txt`` which looks like +this: + +.. code-block:: python + + from setuptools import setup + from torch.utils.cpp_extension import BuildExtension, CppExtension + + setup( + name="warp_perspective", + ext_modules=[ + CppExtension( + "warp_perspective", + ["example_app/warp_perspective/op.cpp"], + libraries=["opencv_core", "opencv_imgproc"], + ) + ], + cmdclass={"build_ext": BuildExtension.with_options(no_python_abi_suffix=True)}, + ) + + +Notice that we enabled the ``no_python_abi_suffix`` option in the +``BuildExtension`` at the bottom. This instructs ``setuptools`` to omit any +Python-3-specific ABI suffixes in the name of the produced shared library. +Otherwise, on Python 3.7 for example, the library may be called +``warp_perspective.cpython-37m-x86_64-linux-gnu.so`` where +``cpython-37m-x86_64-linux-gnu`` is the ABI tag, but we really just want it to +be called ``warp_perspective.so``. + +If we now run ``python setup.py build develop`` in a terminal from within the +folder in which ``setup.py`` is situated, we should see something like: + +..
code-block:: shell + + $ python setup.py build develop + running build + running build_ext + building 'warp_perspective' extension + creating build + creating build/temp.linux-x86_64-3.7 + gcc -pthread -B /root/local/miniconda/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/local/miniconda/lib/python3.7/site-packages/torch/lib/include -I/root/local/miniconda/lib/python3.7/site-packages/torch/lib/include/torch/csrc/api/include -I/root/local/miniconda/lib/python3.7/site-packages/torch/lib/include/TH -I/root/local/miniconda/lib/python3.7/site-packages/torch/lib/include/THC -I/root/local/miniconda/include/python3.7m -c op.cpp -o build/temp.linux-x86_64-3.7/op.o -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=warp_perspective -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++11 + cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ + creating build/lib.linux-x86_64-3.7 + g++ -pthread -shared -B /root/local/miniconda/compiler_compat -L/root/local/miniconda/lib -Wl,-rpath=/root/local/miniconda/lib -Wl,--no-as-needed -Wl,--sysroot=/ build/temp.linux-x86_64-3.7/op.o -lopencv_core -lopencv_imgproc -o build/lib.linux-x86_64-3.7/warp_perspective.so + running develop + running egg_info + creating warp_perspective.egg-info + writing warp_perspective.egg-info/PKG-INFO + writing dependency_links to warp_perspective.egg-info/dependency_links.txt + writing top-level names to warp_perspective.egg-info/top_level.txt + writing manifest file 'warp_perspective.egg-info/SOURCES.txt' + reading manifest file 'warp_perspective.egg-info/SOURCES.txt' + writing manifest file 'warp_perspective.egg-info/SOURCES.txt' + running build_ext + copying build/lib.linux-x86_64-3.7/warp_perspective.so -> + Creating /root/local/miniconda/lib/python3.7/site-packages/warp-perspective.egg-link (link to .) 
+ Adding warp-perspective 0.0.0 to easy-install.pth file + + Installed /warp_perspective + Processing dependencies for warp-perspective==0.0.0 + Finished processing dependencies for warp-perspective==0.0.0 + +This will produce a shared library called ``warp_perspective.so``, which we can +pass to ``torch.ops.load_library`` as we did earlier to make our operator +visible to TorchScript: + +.. code-block:: python + + >>> import torch + >>> torch.ops.load_library("warp_perspective.so") + >>> print(torch.ops.my_ops.warp_perspective) + diff --git a/recipes_source/recipes/torchscript_inference.rst b/recipes_source/recipes/torchscript_inference.rst new file mode 100644 index 00000000000..6491f992f4b --- /dev/null +++ b/recipes_source/recipes/torchscript_inference.rst @@ -0,0 +1,197 @@ +TorchScript for Deployment +========================== + +In this recipe, you will learn: + +- What TorchScript is +- How to export your trained model in TorchScript format +- How to load your TorchScript model in C++ and do inference + +Requirements +------------ + +- PyTorch 1.5 +- TorchVision 0.6.0 +- libtorch 1.5 +- C++ compiler + +The instructions for installing the three PyTorch components are +available at `pytorch.org`_. The C++ compiler will depend on your +platform. + +What is TorchScript? +-------------------- + +**TorchScript** is an intermediate representation of a PyTorch model +(subclass of ``nn.Module``) that can then be run in a high-performance +environment like C++. It’s a high-performance subset of Python that is +meant to be consumed by the **PyTorch JIT Compiler**, which performs +run-time optimization on your model’s computation. TorchScript is the +recommended model format for doing scaled inference with PyTorch models. +For more information, see the PyTorch `Introduction to TorchScript +tutorial`_, the `Loading A TorchScript Model in C++ tutorial`_, and the +`full TorchScript documentation`_, all of which are available on +`pytorch.org`_.
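To make the idea concrete before working with a full vision model, here is a minimal, self-contained sketch. The ``AddOne`` module is a made-up example (not part of this recipe) showing that a scripted module is an ordinary callable whose compiled graph can be inspected:

```python
import torch

# Hypothetical minimal module, used only to illustrate scripting.
class AddOne(torch.nn.Module):
    def forward(self, x):
        return x + 1.0

scripted = torch.jit.script(AddOne())   # compile to TorchScript
print(scripted(torch.zeros(3)))         # tensor([1., 1., 1.])
print(scripted.graph)                   # the TorchScript IR of forward()
```

The scripted module behaves like the original one in Python, but its graph can also be serialized and executed outside of Python entirely.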
+ +How to Export Your Model +------------------------ + +As an example, let’s take a pretrained vision model. All of the +pretrained models in TorchVision are compatible with TorchScript. + +Run the following Python 3 code, either in a script or from the REPL: + +.. code:: python3 + + import torch + import torch.nn.functional as F + import torchvision.models as models + + r18 = models.resnet18(pretrained=True) # We now have an instance of the pretrained model + r18_scripted = torch.jit.script(r18) # *** This is the TorchScript export + dummy_input = torch.rand(1, 3, 224, 224) # We should run a quick test + +Let’s do a sanity check on the equivalence of the two models: + +:: + + unscripted_output = r18(dummy_input) # Get the unscripted model's prediction... + scripted_output = r18_scripted(dummy_input) # ...and do the same for the scripted version + + unscripted_top5 = F.softmax(unscripted_output, dim=1).topk(5).indices + scripted_top5 = F.softmax(scripted_output, dim=1).topk(5).indices + + print('Python model top 5 results:\n {}'.format(unscripted_top5)) + print('TorchScript model top 5 results:\n {}'.format(scripted_top5)) + +You should see that both versions of the model give the same results: + +:: + + Python model top 5 results: + tensor([[463, 600, 731, 899, 898]]) + TorchScript model top 5 results: + tensor([[463, 600, 731, 899, 898]]) + +With that check confirmed, go ahead and save the model: + +:: + + r18_scripted.save('r18_scripted.pt') + +Loading TorchScript Models in C++ +--------------------------------- + +Create the following C++ file and name it ``ts-infer.cpp``: + +.. 
code:: cpp + + #include + #include + + + int main(int argc, const char* argv[]) { + if (argc != 2) { + std::cerr << "usage: ts-infer \n"; + return -1; + } + + std::cout << "Loading model...\n"; + + // deserialize ScriptModule + torch::jit::script::Module module; + try { + module = torch::jit::load(argv[1]); + } catch (const c10::Error& e) { + std::cerr << "Error loading model\n"; + std::cerr << e.msg_without_backtrace(); + return -1; + } + + std::cout << "Model loaded successfully\n"; + + torch::NoGradGuard no_grad; // ensures that autograd is off + module.eval(); // turn off dropout and other training-time layers/functions + + // create an input "image" + std::vector inputs; + inputs.push_back(torch::rand({1, 3, 224, 224})); + + // execute model and package output as tensor + at::Tensor output = module.forward(inputs).toTensor(); + + namespace F = torch::nn::functional; + at::Tensor output_sm = F::softmax(output, F::SoftmaxFuncOptions(1)); + std::tuple top5_tensor = output_sm.topk(5); + at::Tensor top5 = std::get<1>(top5_tensor); + + std::cout << top5[0] << "\n"; + + std::cout << "\nDONE\n"; + return 0; + } + +This program: + +- Loads the model you specify on the command line +- Creates a dummy “image” input tensor +- Performs inference on the input + +Also, notice that there is no dependency on TorchVision in this code. +The saved version of your TorchScript model has your learning weights +*and* your computation graph - nothing else is needed. 
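The same self-containedness can be checked from Python. Below is a hedged sketch (using a made-up ``TinyNet``, not the ResNet from this recipe): after saving a scripted module, it can be reloaded and run without the original class definition being in scope.

```python
import io
import torch

# Hypothetical tiny module, for illustration only.
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

# Save the scripted module to an in-memory buffer (a file path works too).
buffer = io.BytesIO()
torch.jit.save(torch.jit.script(TinyNet()), buffer)

buffer.seek(0)
loaded = torch.jit.load(buffer)          # no TinyNet definition needed here
print(loaded(torch.rand(1, 4)).shape)    # torch.Size([1, 2])
```

The loaded object carries both the learned weights and the computation graph, which is exactly why the C++ program above needs nothing beyond libtorch and the ``.pt`` file.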
+ +Building and Running Your C++ Inference Engine +---------------------------------------------- + +Create the following ``CMakeLists.txt`` file: + +:: + + cmake_minimum_required(VERSION 3.0 FATAL_ERROR) + project(custom_ops) + + find_package(Torch REQUIRED) + + add_executable(ts-infer ts-infer.cpp) + target_link_libraries(ts-infer "${TORCH_LIBRARIES}") + set_property(TARGET ts-infer PROPERTY CXX_STANDARD 11) + +Make the program: + +:: + + cmake -DCMAKE_PREFIX_PATH= + make + +Now, we can run inference in C++, and verify that we get a result: + +:: + + $ ./ts-infer r18_scripted.pt + Loading model... + Model loaded successfully + 418 + 845 + 111 + 892 + 644 + [ CPULongType{5} ] + + DONE + +Important Resources +------------------- + +- `pytorch.org`_ for installation instructions, and more documentation + and tutorials. +- `Introduction to TorchScript tutorial`_ for a deeper initial + exposition of TorchScript +- `Full TorchScript documentation`_ for complete TorchScript language + and API reference + +.. _pytorch.org: https://pytorch.org/ +.. _Introduction to TorchScript tutorial: https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html +.. _Full TorchScript documentation: https://pytorch.org/docs/stable/jit.html +.. _Loading A TorchScript Model in C++ tutorial: https://pytorch.org/tutorials/advanced/cpp_export.html +.. 
_full TorchScript documentation: https://pytorch.org/docs/stable/jit.html \ No newline at end of file diff --git a/recipes_source/recipes/warmstarting_model_using_parameters_from_a_different_model.py b/recipes_source/recipes/warmstarting_model_using_parameters_from_a_different_model.py new file mode 100644 index 00000000000..7fb88c501c1 --- /dev/null +++ b/recipes_source/recipes/warmstarting_model_using_parameters_from_a_different_model.py @@ -0,0 +1,142 @@ +""" +Warmstarting model using parameters from a different model in PyTorch +===================================================================== +Partially loading a model or loading a partial model are common +scenarios when transfer learning or training a new complex model. +Leveraging trained parameters, even if only a few are usable, will help +to warmstart the training process and hopefully help your model converge +much faster than training from scratch. + +Introduction +------------ +Whether you are loading from a partial ``state_dict``, which is missing +some keys, or loading a ``state_dict`` with more keys than the model +that you are loading into, you can set the ``strict`` argument to ``False`` +in the ``load_state_dict()`` function to ignore non-matching keys. +In this recipe, we will experiment with warmstarting a model using +parameters of a different model. + +Setup +----- +Before we begin, we need to install ``torch`` if it isn’t already +available. + +:: + + pip install torch + +""" + + + +###################################################################### +# Steps
# ----- +# +# 1. Import all necessary libraries for loading our data +# 2. Define and initialize the neural networks A and B +# 3. Save model A +# 4. Load into model B +# +# 1. Import necessary libraries for loading our data +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn`` +# and ``torch.optim``.
+# + +import torch +import torch.nn as nn +import torch.nn.functional as F +import torch.optim as optim + + +###################################################################### +# 2. Define and initialize the neural networks A and B +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For the sake of example, we will create a neural network for training +# images. To learn more see the Defining a Neural Network recipe. We will +# create two neural networks so that we can load the parameters of one +# into the other. +# + +class NetA(nn.Module): + def __init__(self): + super(NetA, self).__init__() + self.conv1 = nn.Conv2d(3, 6, 5) + self.pool = nn.MaxPool2d(2, 2) + self.conv2 = nn.Conv2d(6, 16, 5) + self.fc1 = nn.Linear(16 * 5 * 5, 120) + self.fc2 = nn.Linear(120, 84) + self.fc3 = nn.Linear(84, 10) + + def forward(self, x): + x = self.pool(F.relu(self.conv1(x))) + x = self.pool(F.relu(self.conv2(x))) + x = x.view(-1, 16 * 5 * 5) + x = F.relu(self.fc1(x)) + x = F.relu(self.fc2(x)) + x = self.fc3(x) + return x + +netA = NetA() + +class NetB(nn.Module): + def __init__(self): + super(NetB, self).__init__() + self.conv1 = nn.Conv2d(3, 6, 5) + self.pool = nn.MaxPool2d(2, 2) + self.conv2 = nn.Conv2d(6, 16, 5) + self.fc1 = nn.Linear(16 * 5 * 5, 120) + self.fc2 = nn.Linear(120, 84) + self.fc3 = nn.Linear(84, 10) + + def forward(self, x): + x = self.pool(F.relu(self.conv1(x))) + x = self.pool(F.relu(self.conv2(x))) + x = x.view(-1, 16 * 5 * 5) + x = F.relu(self.fc1(x)) + x = F.relu(self.fc2(x)) + x = self.fc3(x) + return x + +netB = NetB() + + +###################################################################### +# 3. Save model A +# ~~~~~~~~~~~~~~~~~~~ +# + +# Specify a path to save to +PATH = "model.pt" + +torch.save(netA.state_dict(), PATH) + + +###################################################################### +# 4.
Load into model B +# ~~~~~~~~~~~~~~~~~~~~~~~~ +# +# If you want to load parameters from one layer to another, but some keys +# do not match, simply change the name of the parameter keys in the +# state_dict that you are loading to match the keys in the model that you +# are loading into. +# + +netB.load_state_dict(torch.load(PATH), strict=False) + + +###################################################################### +# You can see that all keys matched successfully! +# +# Congratulations! You have successfully warmstarted a model using +# parameters from a different model in PyTorch. +# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes/what_is_state_dict.py b/recipes_source/recipes/what_is_state_dict.py new file mode 100644 index 00000000000..afd6e375bf5 --- /dev/null +++ b/recipes_source/recipes/what_is_state_dict.py @@ -0,0 +1,132 @@ +""" +What is a state_dict in PyTorch +=============================== +In PyTorch, the learnable parameters (i.e. weights and biases) of a +``torch.nn.Module`` model are contained in the model’s parameters +(accessed with ``model.parameters()``). A ``state_dict`` is simply a +Python dictionary object that maps each layer to its parameter tensor. + +Introduction +------------ +A ``state_dict`` is an integral entity if you are interested in saving +or loading models from PyTorch. +Because ``state_dict`` objects are Python dictionaries, they can be +easily saved, updated, altered, and restored, adding a great deal of +modularity to PyTorch models and optimizers. +Note that only layers with learnable parameters (convolutional layers, +linear layers, etc.) and registered buffers (batchnorm’s running_mean) +have entries in the model’s ``state_dict``. Optimizer objects +(``torch.optim``) also have a ``state_dict``, which contains information +about the optimizer’s state, as well as the hyperparameters used. 
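Because a ``state_dict`` is just a dictionary, even a single layer exposes one that can be inspected directly. A minimal sketch (using a standalone ``Linear`` layer for illustration, not this recipe's network):

```python
import torch

# Hedged sketch: a state_dict really is an ordinary Python dict that
# maps parameter names to tensors.
layer = torch.nn.Linear(2, 3)
sd = layer.state_dict()

print(sorted(sd.keys()))       # ['bias', 'weight']
print(sd["weight"].shape)      # torch.Size([3, 2])
print(sd["bias"].shape)        # torch.Size([3])
```

Note that the keys are the attribute names of the layer's parameters; in a nested module they are dotted paths such as ``conv1.weight``.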
+In this recipe, we will see how ``state_dict`` is used with a simple +model. + +Setup +----- +Before we begin, we need to install ``torch`` if it isn’t already +available. + +:: + + pip install torch + +""" + + + +###################################################################### +# Steps +# ----- +# +# 1. Import all necessary libraries for loading our data +# 2. Define and initialize the neural network +# 3. Initialize the optimizer +# 4. Access the model and optimizer ``state_dict`` +# +# 1. Import necessary libraries for loading our data +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For this recipe, we will use ``torch`` and its subsidiaries ``torch.nn`` +# and ``torch.optim``. +# + +import torch +import torch.nn as nn +import torch.nn.functional as F +import torch.optim as optim + + +###################################################################### +# 2. Define and initialize the neural network +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For the sake of example, we will create a neural network for training +# images. To learn more see the Defining a Neural Network recipe. +# + +class Net(nn.Module): + def __init__(self): + super(Net, self).__init__() + self.conv1 = nn.Conv2d(3, 6, 5) + self.pool = nn.MaxPool2d(2, 2) + self.conv2 = nn.Conv2d(6, 16, 5) + self.fc1 = nn.Linear(16 * 5 * 5, 120) + self.fc2 = nn.Linear(120, 84) + self.fc3 = nn.Linear(84, 10) + + def forward(self, x): + x = self.pool(F.relu(self.conv1(x))) + x = self.pool(F.relu(self.conv2(x))) + x = x.view(-1, 16 * 5 * 5) + x = F.relu(self.fc1(x)) + x = F.relu(self.fc2(x)) + x = self.fc3(x) + return x + +net = Net() +print(net) + + +###################################################################### +# 3. Initialize the optimizer +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# We will use SGD with momentum. +# + +optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9) + + +###################################################################### +# 4.
Access the model and optimizer ``state_dict`` +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# Now that we have constructed our model and optimizer, we can understand +# what is preserved in their respective ``state_dict`` properties. +# + +# Print model's state_dict +print("Model's state_dict:") +for param_tensor in net.state_dict(): + print(param_tensor, "\t", net.state_dict()[param_tensor].size()) + +print() + +# Print optimizer's state_dict +print("Optimizer's state_dict:") +for var_name in optimizer.state_dict(): + print(var_name, "\t", optimizer.state_dict()[var_name]) + + +###################################################################### +# This information is relevant for saving and loading the model and +# optimizers for future use. +# +# Congratulations! You have successfully used ``state_dict`` in PyTorch. +# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes/zeroing_out_gradients.py b/recipes_source/recipes/zeroing_out_gradients.py new file mode 100644 index 00000000000..aa05b1d2662 --- /dev/null +++ b/recipes_source/recipes/zeroing_out_gradients.py @@ -0,0 +1,193 @@ +""" +Zeroing out gradients in PyTorch +================================ +It is beneficial to zero out gradients when building a neural network. +This is because by default, gradients are accumulated in buffers (i.e., +not overwritten) whenever ``.backward()`` is called. + +Introduction +------------ +When training your neural network, models are able to increase their +accuracy through gradient descent. In short, gradient descent is the +process of minimizing our loss (or error) by tweaking the weights and +biases in our model. + +``torch.Tensor`` is the central class of PyTorch. When you create a +tensor, if you set its attribute ``.requires_grad`` as ``True``, the +package tracks all operations on it, and gradients for those operations +are computed on subsequent backward passes.
The gradient for this tensor will be accumulated into the ``.grad`` +attribute. The accumulation (or sum) of all the gradients is calculated +when ``.backward()`` is called on the loss tensor. + +There are cases where it may be necessary to zero out the gradients of a +tensor. For example: when you start your training loop, you should zero +out the gradients so that you can perform this tracking correctly. +In this recipe, we will learn how to zero out gradients using the +PyTorch library. We will demonstrate how to do this by training a neural +network on the ``CIFAR10`` dataset built into PyTorch. + +Setup +----- +Since we will be training a model on real data in this recipe, if you are +in a runnable notebook, it is best to switch the runtime to GPU or TPU. +Before we begin, we need to install ``torch`` and ``torchvision`` if +they aren’t already available. + +:: + + pip install torchvision + + +""" + + +###################################################################### +# Steps +# ----- +# +# Steps 1 through 4 set up our data and neural network for training. The +# process of zeroing out the gradients happens in step 5. If you already +# have your data and neural network built, skip to 5. +# +# 1. Import all necessary libraries for loading our data +# 2. Load and normalize the dataset +# 3. Build the neural network +# 4. Define the loss function +# 5. Zero the gradients while training the network +# +# 1. Import necessary libraries for loading our data +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# For this recipe, we will just be using ``torch`` and ``torchvision`` to +# access the dataset. +# + +import torch + +import torch.nn as nn +import torch.nn.functional as F + +import torch.optim as optim + +import torchvision +import torchvision.transforms as transforms + + +###################################################################### +# 2.
Load and normalize the dataset +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# PyTorch features various built-in datasets (see the Loading Data recipe +# for more information). +# + +transform = transforms.Compose( + [transforms.ToTensor(), + transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]) + +trainset = torchvision.datasets.CIFAR10(root='./data', train=True, + download=True, transform=transform) +trainloader = torch.utils.data.DataLoader(trainset, batch_size=4, + shuffle=True, num_workers=2) + +testset = torchvision.datasets.CIFAR10(root='./data', train=False, + download=True, transform=transform) +testloader = torch.utils.data.DataLoader(testset, batch_size=4, + shuffle=False, num_workers=2) + +classes = ('plane', 'car', 'bird', 'cat', + 'deer', 'dog', 'frog', 'horse', 'ship', 'truck') + + +###################################################################### +# 3. Build the neural network +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# We will use a convolutional neural network. To learn more see the +# Defining a Neural Network recipe. +# + +class Net(nn.Module): + def __init__(self): + super(Net, self).__init__() + self.conv1 = nn.Conv2d(3, 6, 5) + self.pool = nn.MaxPool2d(2, 2) + self.conv2 = nn.Conv2d(6, 16, 5) + self.fc1 = nn.Linear(16 * 5 * 5, 120) + self.fc2 = nn.Linear(120, 84) + self.fc3 = nn.Linear(84, 10) + + def forward(self, x): + x = self.pool(F.relu(self.conv1(x))) + x = self.pool(F.relu(self.conv2(x))) + x = x.view(-1, 16 * 5 * 5) + x = F.relu(self.fc1(x)) + x = F.relu(self.fc2(x)) + x = self.fc3(x) + return x + + +###################################################################### +# 4. Define a Loss function and optimizer +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# Let’s use a Classification Cross-Entropy loss and SGD with momentum. +# + +net = Net() +criterion = nn.CrossEntropyLoss() +optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9) + + +###################################################################### +# 5. 
Zero the gradients while training the network +# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +# +# This is when things start to get interesting. We simply have to loop +# over our data iterator, feed the inputs to the network, and optimize. +# +# Notice that for each batch of data, we zero out the gradients. This is +# to ensure that we aren’t tracking any unnecessary information when we +# train our neural network. +# + +for epoch in range(2):  # loop over the dataset multiple times + + running_loss = 0.0 + for i, data in enumerate(trainloader, 0): + # get the inputs; data is a list of [inputs, labels] + inputs, labels = data + + # zero the parameter gradients + optimizer.zero_grad() + + # forward + backward + optimize + outputs = net(inputs) + loss = criterion(outputs, labels) + loss.backward() + optimizer.step() + + # print statistics + running_loss += loss.item() + if i % 2000 == 1999:    # print every 2000 mini-batches + print('[%d, %5d] loss: %.3f' % + (epoch + 1, i + 1, running_loss / 2000)) + running_loss = 0.0 + +print('Finished Training') + + +###################################################################### +# You can also use ``model.zero_grad()``. This is the same as using +# ``optimizer.zero_grad()`` as long as all your model parameters are in +# that optimizer. Use your best judgement to decide which one to use. +# +# Congratulations! You have successfully zeroed out gradients in PyTorch. +# +# Learn More +# ---------- +# +# Take a look at these other recipes to continue your learning: +# +# - TBD +# - TBD diff --git a/recipes_source/recipes_index.rst new file mode 100644 index 00000000000..d1c009df61f --- /dev/null +++ b/recipes_source/recipes_index.rst @@ -0,0 +1,151 @@ +PyTorch Recipes +--------------------------------------------- +Recipes are bite-sized, actionable examples of how to use specific PyTorch features, different from our full-length tutorials. + +.. raw:: html + + + + +
+ + + +
+ +
+ +
+
+ +.. Add recipe cards below this line + +.. Getting Started + +.. customcarditem:: + :header: Writing Custom Datasets, DataLoaders and Transforms + :card_description: Learn how to load and preprocess/augment data from a non trivial dataset. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/data_loading_tutorial.html + :tags: Getting-Started + + +.. Production + +.. customcarditem:: + :header: Deploying PyTorch in Python via a REST API with Flask + :card_description: Deploy a PyTorch model using Flask and expose a REST API for model inference using the example of a pretrained DenseNet 121 model which detects the image. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/flask_rest_api_tutorial.html + :tags: Production + +.. customcarditem:: + :header: Introduction to TorchScript + :card_description: Introduction to TorchScript, an intermediate representation of a PyTorch model (subclass of nn.Module) that can then be run in a high-performance environment such as C++. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/Intro_to_TorchScript_tutorial.html + :tags: Production + +.. customcarditem:: + :header: Loading a TorchScript Model in C++ + :card_description: Learn how PyTorch provides to go from an existing Python model to a serialized representation that can be loaded and executed purely from C++, with no dependency on Python. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/cpp_export.html + :tags: Production + +.. customcarditem:: + :header: (optional) Exporting a Model from PyTorch to ONNX and Running it using ONNX Runtime + :card_description: Convert a model defined in PyTorch into the ONNX format and then run it with ONNX Runtime. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/super_resolution_with_onnxruntime.html + :tags: Production + +.. Parallel-and-Distributed-Training + +.. 
customcarditem:: + :header: Model Parallel Best Practices + :card_description: Learn how to implement model parallel, a distributed training technique which splits a single model onto different GPUs, rather than replicating the entire model on each GPU. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/model_parallel_tutorial.html + :tags: Parallel-and-Distributed-Training + +.. customcarditem:: + :header: Getting Started with Distributed Data Parallel + :card_description: Learn the basics of when to use distributed data parallel versus data parallel and work through an example to set it up. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/ddp_tutorial.html + :tags: Parallel-and-Distributed-Training + +.. customcarditem:: + :header: Writing Distributed Applications with PyTorch + :card_description: Set up the distributed package of PyTorch, use the different communication strategies, and go over some of the internals of the package. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/dist_tuto.html + :tags: Parallel-and-Distributed-Training + +.. customcarditem:: + :header: Getting Started with Distributed RPC Framework + :card_description: Learn how to build distributed training using the torch.distributed.rpc package. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/rpc_tutorial.html + :tags: Parallel-and-Distributed-Training + +.. customcarditem:: + :header: (advanced) PyTorch 1.0 Distributed Trainer with Amazon AWS + :card_description: Learn how to set up a PyTorch 1.0 distributed trainer across multiple nodes on Amazon AWS. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/aws_distributed_training_tutorial.html + :tags: Parallel-and-Distributed-Training + +.. Extending PyTorch + +..
customcarditem:: + :header: Extending TorchScript with Custom C++ Operators + :card_description: Implement a custom TorchScript operator in C++, build it into a shared library, use it in Python to define TorchScript models, and load it into a C++ application for inference workloads. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/torch_script_custom_ops.html + :tags: Extending-PyTorch, TorchScript + +.. customcarditem:: + :header: Extending TorchScript with Custom C++ Classes + :card_description: This is a continuation of the custom operator tutorial, and introduces the API we’ve built for binding C++ classes into TorchScript and Python simultaneously. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/torch_script_custom_classes.html + :tags: Extending-PyTorch, TorchScript + +.. customcarditem:: + :header: Creating Extensions Using numpy and scipy + :card_description: Create a neural network layer with no parameters using numpy. Then use scipy to create a neural network layer that has learnable weights. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/numpy_extensions_tutorial.html + :tags: Extending-PyTorch, numpy, scipy + +.. customcarditem:: + :header: Custom C++ and CUDA Extensions + :card_description: Learn how to implement a custom extension written in C++ and a custom CUDA kernel, and integrate them with PyTorch. + :image: _static/img/thumbnails/pytorch-logo-flat.png + :link: ../recipes/recipes/cpp_extension.html + :tags: Extending-PyTorch, C++, CUDA + + +.. End of recipe card section + +.. raw:: html + +
+ +
+ +
+ +
+ +.. .. galleryitem:: beginner/saving_loading_models.py diff --git a/src/pytorch-sphinx-theme b/src/pytorch-sphinx-theme index 19dbba563ff..135a5708ac5 160000 --- a/src/pytorch-sphinx-theme +++ b/src/pytorch-sphinx-theme @@ -1 +1 @@ -Subproject commit 19dbba563ffd86c4167b6e9ac571556521c25f13 +Subproject commit 135a5708ac5378c717d4bfd5c00a85b768223a65